# [Official] NVIDIA GTX 1070 Owner's Club



## TopicClocker

Reserved for benchmarks and other things.


----------



## Ecks9T

Will post some screenshots of the MSI card I got from Newegg tonight. Hopefully I can hit 2GHz easily.

Edit: http://www.techpowerup.com/gpuz/details/esbvz


----------



## Sea Otter

Have a day off today, so I'm trying my best to score this card in the next couple hours. Been refreshing this page for the past couple hours. Saw the EVGA one instock at Newegg, but it disappeared when I added it to my cart. Anyone know of a better way to track if any retailer has them in stock?


----------



## Porter_

i just ordered (backordered) 2 Gigabyte G1 Gaming 1070's from amazon


----------



## exzacklyright

IDK why, but I get a lot more artifacts in this scene than any other in the Heaven benchmark.

I was only able to get to a +190MHz offset with no artifacts.

As for memory... I'm not sure how to figure out how high to go.

One review said this: "Our GTX 1080's GDDR5X memory also took a final offset of +400MHz to achieve a 5400MHz final stable memory clock. We noticed that there is a memory hole with instability around a +500MHz offset, and that we could then push it even higher to +600MHz, but we lost some performance compared with the +400MHz offset."

I'm not sure how they found that it had a memory hole, or that they lost performance? I'm guessing by FPS?

HARDOCP got to +230 with no issues somehow: http://www.hardocp.com/article/2016/06/13/geforce_gtx_1070_founders_edition_overclocking_review#.V2C3HfkrIuU
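For anyone wondering how a "memory hole" actually shows up: GDDR5/GDDR5X error correction silently retries failed transfers, so an unstable memory offset can pass artifact tests while quietly losing FPS. A rough way to spot one is to benchmark at each offset and look for a dip followed by a recovery as the clock keeps rising. A minimal sketch of that check (the offsets and FPS numbers here are hypothetical, just to show the idea):

```python
def find_memory_hole(results):
    """Given {offset_mhz: avg_fps} from repeated benchmark runs,
    return the offsets where average FPS dropped versus the
    previous step even though the clock went up -- the classic
    sign that error correction is eating your performance."""
    holes = []
    offsets = sorted(results)
    for prev, cur in zip(offsets, offsets[1:]):
        if results[cur] < results[prev]:
            holes.append(cur)
    return holes

# Hypothetical Heaven averages at each memory offset (MHz -> FPS)
runs = {0: 95.0, 200: 97.5, 400: 99.0, 500: 96.0, 600: 98.0}

print(find_memory_hole(runs))   # the +500MHz step loses FPS
best = max(runs, key=runs.get)  # +400 gives the best average here
```

This mirrors what the review described: +500 was the hole, +600 "worked" again but still scored below +400, so +400 was the keeper.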


----------



## Bdonedge

Just got a Gigabyte G1 Gaming 1070 as it went live.

Will report back once it arrives.


----------



## Alonjar

Well, my 1070 FE from Nvidia arrived today... sadly I'm too exhausted from working late to do anything with it. Really nice packaging though!


----------



## Mudfrog

Quote:


> Originally Posted by *Porter_*
> 
> i just ordered (backordered) 2 Gigabyte G1 Gaming 1070's from amazon


Did they indicate how long it would be before it ships? I've had it sitting in my cart for a while.


----------



## bbdale

Also got the G1 1070 yesterday when it went live. Hopefully Newegg comes through. Will report back at some point.


----------



## Cakewalk_S

Definitely interested to see how this 1070 is going to pump out the frames. Seems like quite a hefty performance jump from a GTX 970 with little to no increase in power consumption...

Looks like this card will easily handle today's games up to 1440p with no problem at all. It's not quite there yet for 4K, but getting there; that's more the 1080's business.


----------



## Brimlock

My card comes in tomorrow. No idea which brand I'm getting. Hoping that whichever brand I get has a step-up program like EVGA's so I can eventually trade up to an AIB card.


----------



## sherlock

Quote:


> Originally Posted by *Brimlock*
> 
> My card comes in tomorrow. No idea which brand I'm getting. Hoping that whoever I get has a step-up program like EVGA so I can eventually trade up to a AIB card.


EVGA is the only OEM with a step-up program. And if you ordered from Nvidia directly, you are getting an Nvidia-branded FE card, not an EVGA/MSI/ASUS-branded card.


----------



## Brimlock

Quote:


> Originally Posted by *sherlock*
> 
> EVGA is the only OEM with a Step-up program. and if you ordered from Nvidia directly you are getting a Nvidia branded FE card, not a EVGA/MSI/ASUS branded card.


Huh, everyone else who ordered through Nvidia made it sound like they send you a random partner card. W/e not a big deal honestly.


----------



## sherlock

Quote:


> Originally Posted by *Brimlock*
> 
> Huh, everyone else who ordered through Nvidia made it sound like they send you a random partner card. W/e not a big deal honestly.


Not my photo, but see:

The two Nvidia GTX 1080s are in different boxes than the two EVGA GTX 1080s, even though they are all FE cards.


----------



## shilka

Quote:


> Originally Posted by *Cakewalk_S*
> 
> Definitely interested to see how this 1070 is going to pump out the frames. Seems like quite a hefty performance jump from a GTX970 with little to no increase in power consumption...
> 
> Looks like this card will easily handle today's games upto 1440p with no problem at all. Not quite there yet for 4k but getting there, that's more of the 1080's business.


A single GTX 1070 is not going to be enough if you have a high refresh rate 1440p monitor such as the Asus PG279Q. If you have a PG279Q like I do, you want to aim for 165 FPS, and to do that you are talking either GTX 1070 SLI or GTX 1080 SLI.

I am looking into getting new video cards, and I was thinking about two GTX 1070s to replace my old GTX 970s, but I don't really know. For games that don't work with SLI the FPS is going to be lower, so if it comes to that, GTX 1080 cards would be better.

But they also cost a ton more. Long story short: no, a GTX 1070 is not enough for 1440p; even at 60 FPS it struggles in some games.


----------



## Ecks9T

Quote:


> Originally Posted by *shilka*
> 
> A single GTX 1070 is not going to be enough if you have a high refresh rate 1440P monitor such as the Asus PG279Q.
> If you have a PG279Q like i do you want to aim for 165 FPS, and and to do that you are either talking GTX 1070 SLI or GTX 1080 SLI.
> 
> I am looking into getting new video cards and i was thinking about two GTX 1070 cards to replace my old GTX 970 cards but i dont really know.
> For games that dont work with SLI the FPS is going to be lower so if it comes to that GTX 1080 cards would be better.
> 
> But they also cost a ton more.
> Long story short no a GTX 1070 is not enough for 1440P even at 60 FPS it struggles in some games.


I agree with what you say, but wouldn't it be better in the long run to get one 1080, then SLI later? Also, as far as I can tell the 1070 is great, but it does stutter in some games at 1440p.

BTW, anyone else annoyed by the pesky white sticker on one of the four GPU screws? Because of it I can't get a waterblock now.


----------



## shilka

Quote:


> Originally Posted by *Ecks9T*
> 
> I agree with what you say, but wouldn't be better in the long run to get one 1080, then sli later. Also as far as i can tell, using the 1070 is great, but it does stutter with some games @ 1440p.
> BTW, anyone annoyed with the pesty white sticker on 1 of the 4 gpu screw, cause now I cannot get a waterblock


I am talking with a friend about him buying my two GTX 970 cards in August or September. I paid about 9,000 kr ($1,365 US) for them new, so I told my friend I wanted 2,500 kr ($378 US) for both. Is that ripping him off, or does that seem like a fair price?

There will be no new video card(s) in July, as I am upgrading my motherboard, CPU, and RAM. But yes, I am thinking about getting one GTX 1080 and another one later; I just need to wait for the new Gigabyte Extreme Gaming GTX 1080 card.


----------



## Ecks9T

Quote:


> Originally Posted by *shilka*
> 
> I am talking with a friend about him buying my two GTX 970 cards in august or september.
> I paid about 9000 kr ($1365 US) for them new so i told my friend i wanted 2500 kr ($378 US) for both, is that ripping him off or does that seem like a fair price?
> 
> There will be no new video card(s) in july as i am upgrading my motherboard CPU and RAM.
> But yes i am thinking about getting one GTX 1080 and another one later, i just need to wait for the new Gigabyte Extreme Gaming GTX 1080 card.


That's pretty fair for two 970s, since they are going for about $200 USD each. Also, the pricing of these new cards, even non-FE, has not been at MSRP. They have been either a bit cheaper than the FE or more expensive than the FE, which is a bummer.


----------



## shilka

Quote:


> Originally Posted by *Ecks9T*
> 
> That pretty fair for two 970s, since they are going about $200 USD each. Also, the pricing of these new cards, even non-FE, has not been at that MSRP. They have been a bit cheaper than the founders or more expensive than the FE which is a bummer.


The few retailers that have GTX 1070 and 1080 cards listed here in Denmark have ridiculous prices going on.

Despite the fact that we have 25% VAT in Denmark, an aftermarket GTX 1070 goes for around 4,000 kr ($600 US). The GTX 1080 aftermarket cards are around 6,000 kr ($900 US).

I refuse to pay those prices, so I am going to wait at least a month, even if I were not upgrading my motherboard, CPU, and RAM.


----------



## Ecks9T

Quote:


> Originally Posted by *shilka*
> 
> The few retailers that have GTX 1070 and 1080 cards listed here in Denmark have ridiculous prices going on
> 
> Despite the fact that we have 25% VAT in Denmark an aftermarket GTX 1070 goes for around 4000 kr ($600 US)
> The GTX 1080 aftermarket cards are around 6000 kr ($900 US)
> 
> I refuse to pay those prices so i am going to wait at least a month even if i was not upgrading my motherboard CPU and RAM.


Yeah, better off waiting until the hype dies down. I kind of regret getting the card so early because of the placement of the warranty sticker. But I wanted to upgrade, since I have been pushing a GTX 580 with 1.5GB at 1440p for a while.


----------



## Sea Otter

Got my 1070 FE in today. Testing it right now and it loads at 78C. Seems a bit too high. Probably gonna adjust the fan curve soon.
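For anyone else planning to override the default fan curve: tools like MSI Afterburner let you define temp/speed points and the card just interpolates linearly between them. A quick sketch of that interpolation, with a made-up curve that ramps harder than the FE default (the points are my assumption, not Nvidia's stock curve):

```python
# Hypothetical custom curve: (temp C, fan %) points, sorted by temp.
CURVE = [(30, 30), (50, 40), (65, 60), (75, 80), (84, 100)]

def fan_speed(temp_c, curve=CURVE):
    """Linearly interpolate fan % between curve points,
    clamping below the first point and above the last."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(78))  # already ramping toward 100% by the high 70s
```

With a curve like this a 78C load would push the fan past 85% instead of letting the card coast toward the 82C throttle point.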


----------



## Bdonedge

Quote:


> Originally Posted by *Sea Otter*
> 
> Got my 1070 FE in today. Testing it right now and it loads at 78C. Seems a bit too high. Probably gonna adjust the fan curve soon.


I feel like that sounds about right from what I've seen from reviews?


----------



## Ecks9T

Quote:


> Originally Posted by *Sea Otter*
> 
> Got my 1070 FE in today. Testing it right now and it loads at 78C. Seems a bit too high. Probably gonna adjust the fan curve soon.


That is much better than mine... I am hitting 86C while playing F1 2015.


----------



## Porter_

Quote:


> Originally Posted by *Mudfrog*
> 
> Did they indicate how long it would be before it ships? I've had it sitting in my cart for a while.


No, and they're still sitting in my account with the status "Not yet shipped". Not sure when they'll ship. To temper my impatience I'm anticipating sometime between next week and the end of the month. Someone on reddit posted a screenshot of Amazon support saying his card will ship on June 22nd, so maybe that's when this batch will go out. We'll see.


----------



## TimTheEnchanter

EVGA GTX 1070 Superclocked arriving Monday.


----------



## Brimlock

So I got my 1070 in today and got it into the mobo just fine, but I haven't been able to install the drivers. I removed the old drivers, uninstalled GeForce Experience, ran the installer as admin, and contacted Nvidia support; they had me update my BIOS, and I found out my Windows build has not progressed from the launch build since I upgraded to Windows 10. So now I'm at work, waiting 8 hours to go home just to manually install Windows updates and hopefully resolve this debacle.


----------



## the.hollow

Enjoying my 1070 FE card. Just wondering if there is a way to get past the 82C temp limit. Either way, loving the card and how silent it is.


----------



## Mudfrog

Quote:


> Originally Posted by *Porter_*
> 
> no and they're still sitting in my account with the status 'Not yet shipped'. not sure when they'll ship. to temper my impatience i'm anticipating sometime between next week and the end of the month. someone on reddit posted a screenshot of amazon support saying his card will ship on June 22nd. so maybe that's when this batch will go out. we'll see.


I missed my chance to order the G1, trying to decide if I want to wait it out or order a different brand.


----------



## sherlock

Look at that 1070 FE demand

http://instagr.am/p/BGu0pNtOISx%2F/


----------



## Brimlock

Quote:


> Originally Posted by *sherlock*
> 
> Look at that 1070 FE demand
> 
> http://instagr.am/p/BGu0pNtOISx%2F/


Can't, I'm at work.


----------



## mcbaes72

EDIT: Canceled the order. Amazon said 1-2 months for the MSI Armor, no thanks! I'm keeping the Titan Black (if ASUS repairs it under warranty).

EDIT #2: Got lucky, found one in stock at Newegg and bought the MSI 1070 Armor... it will look good with the white/black theme on my gaming rig.


----------



## killeraxemannic

Add me! Got my Gigabyte G1 GTX 1070 yesterday!


----------



## Brimlock

I had to wipe my OS to get my 1070 to work, because my Win10 install was broken in a way I wasn't even aware of until I got the 1070: it wouldn't upgrade past the launch build. I have done some testing and was able to get it above 1900MHz, and sometimes stable above 2000, but it would fall back down into the high 1900s. I haven't had much chance to see how it does in many games yet.

I've tried WoW and Overwatch. In WoW my FPS seems to not have moved at all, quite a discouraging first test actually, but I believe that has to do with the game, because when I jumped into Overwatch, live gameplay was easily double the FPS of my 780 FTW, which usually hung around 55 FPS and sometimes hit 60. The 1070 at stock hangs around 100+ FPS. Although I only had time for one match and didn't pay attention to the FPS during combat, I'm certain performance was stable.

I may jump into Warhammer later tonight and see what's up in that game now that I have more memory than the game can use.

Edit: Today I was able to get my card stable above 2000MHz, only dropping below 2000MHz during some of the transition scenes in the Heaven benchmark, so they were split-second drops. The core clock usually stuck around 2063MHz until the heat slowly crept up; if I can manage the heat, I believe I can keep it close to 2100MHz. I got it up to 2100MHz for a few seconds, but I have not had time to play with it to see if I can keep it stable. I have had the drivers crash on me a few times trying to push it higher, just not enough time to do what I want.

The fan on the FE is not loud; I can keep the card close to 70 degrees Celsius at only 60% fan speed, and even at that speed it's not very loud at all. Plus I have headphones on, so the sound is drowned out even more, but even someone nearby without headphones would not be bothered by the fan. If anything it's just as quiet as, if not quieter than, my 780 FTW. The card is running on air and I have no plans to get a waterblock. There's been no coil whine or any other annoying sound from the GPU, and it seems to stay relatively cool under load, but again, I need more time with the card to make any solid conclusions.

So far I'm impressed and see no reason to trade up to an AIB card. If you haven't gotten a 1070 but plan to get one, I would wait for them to become more readily available, so you can get a card at a cheaper price with supposedly better coolers. A waterblock on this card is probably overkill.

Will update when I get home and get more time on the card.
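If anyone wants to quantify "stable above 2000MHz" rather than eyeballing it, most monitoring tools (GPU-Z, Afterburner) can log periodic clock samples to a file, and a few lines of Python will tell you how long the card actually held the target. A sketch over hypothetical samples (the log values below are invented for illustration):

```python
def sustained_fraction(samples_mhz, target=2000):
    """Fraction of logged core-clock samples at or above target."""
    return sum(s >= target for s in samples_mhz) / len(samples_mhz)

# Hypothetical 1-second samples from a Heaven run: brief dips
# during scene transitions, otherwise parked around 2063MHz.
log = [2063, 2063, 2050, 1987, 2063, 2063, 2063, 1974, 2063, 2063]

print(sustained_fraction(log))  # 0.8 -- held 2000+ for 80% of the run
```

Split-second transition dips like the ones described above then show up as a fraction just under 1.0 instead of being mistaken for real instability.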


----------



## killeraxemannic

Here's a first Firestrike Ultra result. Looks like I have the highest score for a single 1070 and 4770k currently! Woot haha

http://www.3dmark.com/fs/8845710


----------



## Brimlock

Quote:


> Originally Posted by *killeraxemannic*
> 
> Here's a first Firestrike Ultra result. Looks like I have the highest score for a single 1070 and 4770k currently! Woot haha
> 
> http://www.3dmark.com/fs/8845710


Dunno if I wanna spend $25 for a benchmark tool. You are number 1 with a single card.


----------



## joll

Quote:


> Originally Posted by *killeraxemannic*
> 
> Here's a first Firestrike Ultra result. Looks like I have the highest score for a single 1070 and 4770k currently! Woot haha
> 
> http://www.3dmark.com/fs/8845710


Thanks for posting your score; I was wondering if my computer was performing OK.

I tried to match your settings and ended up with this: http://www.3dmark.com/3dm/12542626


----------



## killeraxemannic

Quote:


> Originally Posted by *joll*
> 
> Thanks for posting your score; I was wondering if my computer was performing OK.
> 
> I tried to match your settings and ended up with this: http://www.3dmark.com/3dm/12542626


Nice! That's why I downloaded and tried firestrike... Wanted to see how mine was doing compared to others. Gotta overclock now!


----------



## TimTheEnchanter

Got the card tonight, installed and running like a champ.


----------



## joll

Quote:


> Originally Posted by *killeraxemannic*
> 
> Nice! That's why I downloaded and tried firestrike... Wanted to see how mine was doing compared to others. Gotta overclock now!


That's a pretty nice result for the 1070 G1 just being stock. I had to add 100 mhz to my FE to match it.


----------



## killeraxemannic

Quote:


> Originally Posted by *joll*
> 
> That's a pretty nice result for the 1070 G1 just being stock. I had to add 100 mhz to my FE to match it.


Mine is the Gigabyte G1 at stock settings. Haven't changed anything on the card yet. My CPU is OC'd to 4.5 Ghz though.


----------



## killeraxemannic

Quote:


> Originally Posted by *Brimlock*
> 
> Dunno if I wanna spend 25$ for a benchmark tool. You are number 1 with a single card.


You should be able to get it for around $12 if you look around. I think I bought my copy on Amazon for $13 or $14 a while back. Not too bad considering what you get.


----------



## Blackcurrent

Ordered a 1070 Gaming X hopefully should get it by the end of this month.


----------



## pez

Is everyone having similar experiences with the 1070 as the 1080 for OC'ing? From what I gather, the platform as a whole OC's about the same on FE vs AIB so long as temps are good. I'm actually considering a FE for the build this will go in.


----------



## Dude970

Just ordered an MSI Gaming X off Amazon; hope it doesn't take too long. Looking forward to joining the club.


----------



## Essenbe

Good job. Hope the wait isn't too long.


----------



## Dude970

Quote:


> Originally Posted by *Essenbe*
> 
> Good job. Hope the wait isn't too long.


Thanks my friend


----------



## EGOvoruhk

Pre-ordered the ACX from Amazon. Hopefully they ship soon

Was looking for an SC, but went with the first one I could order. The vanilla ACX card should be flashable with the SC BIOS, correct? Since it's based on the same board and doesn't have the added power of the FTW.


----------



## Asus11

I did have one of these on release day but wasn't that impressed at 3440x1440.

I'm sure you guys will be over the moon with it, because it was epic in most games, but the game I play the most at the moment is BF4, and it wasn't performing as well as I'd have liked.

I'm sure it's 100% perfect for 2560x1440 and lower, though.


----------



## bob70932

Gents, some advice if possible. I currently have an MSI GTX 980. What do I do: I have the money to buy a 1070, or do I SLI the 980?

Thanks in advance.

PS: I have a 1440p 165Hz monitor for gaming, an i5 4690K CPU, and an EVGA G2 750W PSU.

Cheers


----------



## Bdonedge

I'd say go 1070 - can SLI that in the future


----------



## Pragmatist

Just ordered an MSI Armor 1070 and I should get it in two days. I was going to get the 1080 at first, but I rather wait for the 1080Ti and enjoy the 1070 in the meantime.


----------



## BulletSponge

Quote:


> Originally Posted by *bob70932*
> 
> gents some advice if poss. Currently have msi gtx 980, what do I do, I have the money to buy a 1070 or do I sli the 980?
> 
> Thanks in advance
> 
> PS I have 1440 p 165hz monitor for gaming and 4690k i5 cpu. PSu evga g2 750w.
> 
> Cheers


Quote:


> Originally Posted by *Bdonedge*
> 
> I'd say go 1070 - can SLI that in the future


^This, get a 1070.


----------



## WiLd FyeR

Quote:


> Originally Posted by *bob70932*
> 
> gents some advice if poss. Currently have msi gtx 980, what do I do, I have the money to buy a 1070 or do I sli the 980?
> 
> Thanks in advance
> 
> PS I have 1440 p 165hz monitor for gaming and 4690k i5 cpu. PSu evga g2 750w.
> 
> Cheers


Wait for the 1080 Ti. SLI profiles are usually available at game release but not fully optimized.


----------



## marik123

Quote:


> Originally Posted by *bob70932*
> 
> gents some advice if poss. Currently have msi gtx 980, what do I do, I have the money to buy a 1070 or do I sli the 980?
> 
> Thanks in advance
> 
> PS I have 1440 p 165hz monitor for gaming and 4690k i5 cpu. PSu evga g2 750w.
> 
> Cheers


Why not sell your 980 and then pay a small difference to upgrade to a 1070?


----------



## Rhadamanthis

I ordered an EVGA 1070 Founders... next I wait for a hybrid AIO, and then I'm going to OC.


----------



## Bdonedge

So I just built my PC with a 750W 80+ Platinum power supply. Think it's good for SLI 1070s in the future? I'm thinking about ordering a second G1 when I can.


----------



## Brimlock

Quote:


> Originally Posted by *Bdonedge*
> 
> So I just built my PC and used a 750W Platinum 80+ power supply - think its good for SLI 1070's in the future? I'm thinking about ordering a second G1 when I can


Probably, the power consumption of the 1070 isn't that high.


----------



## killeraxemannic

Quote:


> Originally Posted by *Bdonedge*
> 
> So I just built my PC and used a 750W Platinum 80+ power supply - think its good for SLI 1070's in the future? I'm thinking about ordering a second G1 when I can


You should be fine. I ran two cards with much higher power requirements, even doing some coin mining at one point, off my Gold 750W. As long as it's a good PSU, you are easily in the clear. The 1070s seem to pull a max of 165W at load.


----------



## BulletSponge

Quote:


> Originally Posted by *killeraxemannic*
> 
> You should be fine. I ran 2 cards with much higher power requirements even doing some coin mining at one point off of my Gold 750W. As long as its a good PSU I am sure you are easily in the clear. The 1070's at load seem to pull a max of 165W


Same here, I was mining on a pair of 280Xs 24/7 with a 750W PSU; 1070 SLI will be easy.


----------






## killeraxemannic

I was actually running two GTX 570s off my 750W at one point, and those can pull 220W per card at full load. 750W should be fine, no question, for 1070 or 1080 SLI.
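A rough power budget backs this up: the GTX 1070's official board power is 150W (with observed peaks around 165W), so even a pair of them plus a hot-clocked CPU sits well under a quality 750W unit. A back-of-envelope sketch (the non-GPU numbers are rough assumptions for a mainstream i5/i7 build, not measurements):

```python
# Worst-case component draw in watts. The GPU figure uses the
# ~165W observed peak; the rest are rough assumptions.
budget = {
    "gtx_1070_x2": 2 * 165,
    "cpu_overclocked": 140,
    "motherboard_ram": 50,
    "drives_fans_misc": 40,
}

total = sum(budget.values())
headroom = 750 - total
print(total, headroom)  # 560W total, 190W of headroom
```

Roughly 190W of headroom even under worst-case assumptions, which is why a good 750W PSU handles 1070 SLI comfortably.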


----------



## pez

Assuming I get the payment-charged email, I think I managed to successfully snag an ASUS 1070 FE from Newegg.

Looking forward to completing the build with this last piece.

Edit: WOOT, successfully charged!


----------



## Jimbags

My Gigabyte GTX 1070 FE was finally delivered yesterday. A 10-day wait! This thing is a beast. Haven't done any serious overclocking, as it boosted right up to 1930MHz by itself?? Plays Doom on Ultra at 2560x1440, awesome!


----------



## pez

That's some pretty awesome boost by default. What temps are you seeing? What are your ambient temps (if you know)?


----------



## Jimbags

Quote:


> Originally Posted by *pez*
> 
> That's some pretty awesome boost by default. What temps are you seeing? What are your ambient temps (if you know)?


Max temps are mid 70s. Ambient is sorta high though, approx 27°C. That's with the fan at 80%, but I game with headphones. Tossing up whether to buy a waterblock, as my last card was water cooled, so the pump, res, and everything are still in the case; just the CPU is using the loop for now. 2x 240mm rads, haha. The non-reference cards are going to fly!


----------



## skline00

Quote:


> Originally Posted by *Jimbags*
> 
> Max temps are mid 70's.Ambient is sorta high though 27°c approx.Thats with fan at 80% though but i game with headphones. Tossing up wether to buy a water block as my last card was water cooled, so pump res and everything are still in case just the cpu is using it for now. 2x240mm rads haha.The non reference cards are going to fly!


I added an EK GTX 1080 waterblock to my 1080. VERY nice, and it fits the 1070 too. I bought it from Performance PC. Temps are 29-30C idle and ~38C max.


----------



## pez

Nice results (to both of you). I'm definitely curious to hear what the fan of the reference cooler is like compared to my 1080 G1.


----------



## 303869

Could somebody let me know some temps and noise levels from the Founders Edition? I'm looking to get the 1070 but can't decide between the FE or a third-party cooler... I like the design of the FE, but my current blower-style card sounds like a vacuum under full load.

Edit: just seen the above post! So temps seem good; is it noisy, though? I'm planning on putting it in my GHTPC, so noise levels are important.


----------



## AlbertXuk

Hello guys, I just joined this website to share with you all my benchmarks for the MSI Founders Edition. I got two in SLI.

I have made 4 videos so far, testing them as much as possible: in 1080p, 2K, and 4K, with and without SLI, with and without an overclock.

It is a very mild overclock: +150 on the core clock, +450 on the memory.

Hope you guys like it.

The Witcher III

Crysis 3

Shadow of Mordor

Rise of the Tomb Raider

One more thing: I speak Spanish, so you won't understand what I say, but there are captions on the videos so you can get an idea of what is going on.

Cheers


----------



## Mr-Dark

My 1070 here



Stock boost clock is 1840MHz.


----------



## Brimlock

Quote:


> Originally Posted by *303869*
> 
> Could somebody let me know some temps and noise levels from the Founders Edition? I'm looking to get the 1070 but cant decide between FE or third party cooler... I like the design of the FE but my current blower style card sounds like a vacuum under full load...
> 
> Edit just seen the above post!
> so temps seem good is it noisy though? I'm planning on putting it in my GHTPC so noise levels are important.


It's quiet and cool. I keep it between 60-75 Celsius but haven't watched where it stabilizes. My fan curve takes care of everything and never goes above 70%, which is still extremely quiet compared with how some people describe their GPUs. With headphones on I don't even hear a slight hum from it. I cranked it up to 100% to hear how it was, and I still wouldn't say it was bad. Thoroughly impressed overall with the FE. I would suggest waiting for a cheaper AIB card with a possibly better cooler if you can, but if you can't wait, you shouldn't be disappointed.


----------



## Cakewalk_S

How come I haven't heard anyone mention anything about ASIC score?

When do the <$400 cards come out? I thought MSRP was supposed to be $379....


----------



## Brimlock

Quote:


> Originally Posted by *Cakewalk_S*
> 
> How come I haven't heard anyone mention anything about ASIC score?


I checked GPU-Z on my card, and it says the reading is not supported for this card.


----------



## phillyman36

Newegg has the EVGA FE 1070 in stock right now for those looking.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814487247


----------



## renji1337

I want the EVGA FTW 1070 :O


----------



## phillyman36

Quote:


> Originally Posted by *renji1337*
> 
> I want the EVGA FTW 1070 :O


I'm looking for the EVGA SC version myself but figured I would post it since they are in such short supply.


----------



## KrAzYtHeBoY

I'm buying the MSI GTX 1070 Gaming X.


----------



## marik123

Quote:


> Originally Posted by *Cakewalk_S*
> 
> How come I haven't heard anyone mention anything about ASIC score?
> 
> When do the <$400 cards come out? I thought MSRP was supposed to be $379....


I'm in the same boat, waiting for a $379 1070 to come out; then I can put my trusty old Arctic Cooling Mono S1 Plus on it.

Possibly the two below will be very close to the $379 mark, hopefully:
https://us.msi.com/Graphics-card/GeForce-GTX-1070-AERO-8G.html#hero-overview

http://www.gigabyte.com/products/product-page.aspx?pid=5922#kf


----------



## Porter_

Quote:


> Originally Posted by *Porter_*
> 
> i just ordered (backordered) 2 Gigabyte G1 Gaming 1070's from amazon


Woohoo, time to cancel my Amazon preorder. I ordered a Gigabyte G1 from Newegg last Friday, and I just ordered a second one when it popped up on Newegg again. These things are hard to find in stock right now. Nowinstock text alerts are great.


----------



## Sea Otter

Now that the Asus Strix is on Amazon for preorder, I really really regret getting the Founders Edition. Coil whine under load, 85C load temps, fan gets loud if I play any game.

Should've just waited the one week. Newegg doesn't allow a return on the card, so I'm basically stuck with it.

Hopefully this helps the people who are stuck deciding between the FE and another AIB card.


----------



## Brimlock

Quote:


> Originally Posted by *Sea Otter*
> 
> Now that the Asus Strix is on Amazon for preorder, I really really regret getting the Founders Edition. Coil whine under load, 85C load temps, fan gets loud if I play any game.
> 
> Should've just waited the one week. Newegg doesn't allow a return on the card, so I'm basically stuck with it.
> 
> Hopefully this helps the people who are stuck deciding between the FE and another AIB card.


Sounds like you got a bad egg. My card does none of that. Mine will maintain a core clock of about 2063Mhz and rarely passes 70 degrees Celsius, at 70% fan speed.


----------



## Ecks9T

Quote:


> Originally Posted by *Sea Otter*
> 
> Now that the Asus Strix is on Amazon for preorder, I really really regret getting the Founders Edition. Coil whine under load, 85C load temps, fan gets loud if I play any game.
> 
> Should've just waited the one week. Newegg doesn't allow a return on the card, so I'm basically stuck with it.
> 
> Hopefully this helps the people who are stuck deciding between the FE and another AIB card.


That isn't that bad in terms of temps. Mine hits 86C, but the fan only reaches about 75% at most, so I probably need to set up a custom profile. The home I'm in has no central air, though, so cooling my room on the second story is difficult.


----------



## prey1337

Finally was able to order a Gigabyte G1 1070 from Newegg today. I've been charged and it says "packaging", so everything seems to be looking good. It seemed like the best bang-for-the-buck card out there right now; the ASUS Strix was going to be priced way too high.

Upgrading from an EVGA 560 Ti 448-Core Classified, so this is going to be a solid jump, I think.

I'm still running a single 1080p monitor for games, so I doubt I'll need to overclock the G1 any more than it already is out of the box. However, I do want to know what the card is capable of; I wouldn't mind a solid 2000MHz.


----------



## pez

Yeah, the 1070 at 1080p will be a great card for some time to come.


----------



## Mad Pistol

Got a 1070 FE headed my way, direct from Nvidia. She should be here in a few days.

I'm not terribly worried about the noise (although coil whine will drive me up the wall if there is any). I highly doubt this card will be louder than my 780 with the reference cooler.


----------



## 303869

Pre-ordered an EVGA 1070 FE, should receive it in about a week.


----------



## KrAzYtHeBoY

My 1070 crashes when pushed past 2100MHz core clock.

skyn3t, we need the BIOS!


----------



## Jimbags

Hey guys, just a question. EK have released two reference waterblocks. One is the full length of the card; the other stops just before the PCIe power plug. So are there different versions of the FE?


----------



## Jimbags

This is what got me wondering: are there different FE PCBs?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Jimbags*
> 
> Hey guys just a question. EK have released 2 reference water blocks. One is full length of card other is short of just before pci power plug? So there is different versions of the FE Editions?


Both blocks are the same in terms of component coverage. One is see-through acrylic, the other is acetal; really, they are both full-coverage waterblocks.

I saw you posted a pic of the short-top block; it's the same block as the full-cover one, except the full-cover top extends all the way over the PCB and the other doesn't.


----------



## Jimbags

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Both blocks are the same component coverage wise. One is see through acrylic, other is acetal. Really they are both full coverage water blocks.
> 
> Seen you posted a pic of the short block, the same block as the full coverage. The full coverage goes all the way over the pcb, other doesn't.


Yeah, but why? There must be a difference at that end of the board... just wondering what? If you read EK's comments they talk of compatibility. Anyone know the difference?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Jimbags*
> 
> Yeah, but why? There must be a difference at that end of the board... just wondering what? If you read EK's comments they talk of compatibility. Anyone know the difference?


No, it's the same block as far as coverage goes; see the Titan X blocks for example:




There's only one style of PCB for the FE 1070.


----------



## Jimbags

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> No same block as in coverage, see the Titan X blocks for example:
> 
> 
> 
> 
> Only one style pcb FE 1070.


I get that the blocks are the same... that's obvious. EK themselves say it's a short top in the FB post I put up; they say it's to make it more compatible. Something has to be different for them to make a short top for the card. Please don't repeat yourself again; I get that the blocks are the same!


----------



## Mr-Dark

Hello

GTX 1070 FE and Kraken G10/H105...

The PCB with memory heatsink



G10 installed



The whole build



The fans are Thermaltake Riing 120 Red @900rpm. Max temp is 44C at a 2050MHz clock







With the stock cooler the temps were 60C idle and 82C under load.

The room temp is around 30C here... outside it's +38C easily


----------



## Subz80

I am a German expat living and working in China. I just ended up buying an MSI Gaming X 1070 to replace my Asus Turbo 970 (white-shroud blower-style fan GPU... a bit noisy!). The MSI already comes nicely overclocked out of the box, but I managed to push it further with a +100 offset on the core clock. After running intensive games for 30 mins or so, the final clock settles down to 2088MHz. The memory is easily overclocked to 9GHz. I attach a readout from Firestrike. What I find most amazing is the graphics score approaching 20,000, which is dangerously close to 1080 territory (the precise score is 19,652, a bit hard to read on the mobile camera snapshot). I think a nicely overclocked 970 can get a score of about 12,000, so I managed an improvement of about 60%... not bad! Also, in a well-ventilated case the temps on this GPU barely exceed 60C... but MSI cards have always been good in that department.
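A quick way to sanity-check the quoted improvement (the 12,000 baseline for an overclocked 970 is the poster's rough estimate, not a measured figure):

```python
def relative_gain(new_score: float, old_score: float) -> float:
    """Percentage improvement of new_score over old_score."""
    return (new_score / old_score - 1.0) * 100.0

# Firestrike graphics scores quoted in the post
gtx1070_score = 19652   # overclocked MSI Gaming X 1070
gtx970_score = 12000    # rough estimate for a well-overclocked 970

print(f"Improvement: {relative_gain(gtx1070_score, gtx970_score):.0f}%")  # prints "Improvement: 64%"
```

So "about 60%" is, if anything, slightly conservative.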


----------



## Pragmatist

Quote:


> Originally Posted by *Subz80*
> 
> I am a German expat living and working in China. I just ended up buying a MSI Gaming X 1070 to replace my Asus Turbo 970 (white shroud blower style fan GPU... a bit noisy!). The MSI already comes nicely overclocked out of the box, but managed to push it further by an offset on the core clock of +100. After running intensive games for 30mins or so the final clock settles down to 2088Mhz. The memory is easily overclocked to 9Ghz. I attach a readout from Firestrike. What I find most amazing is the graphics score approaching a score of 20.000 which is dangerously close to 1080 territory (the precise score is 19.652, a bit hard to read on the mobile camera snapshot). I think a nicely overclocked 970 can get a score of about 12.000. So I manage to get an improvement of about 60%... not bad! Also in a well ventilated case the temps on this GPU barely exceed 60C... but MSI cards have always been good in that department.


That's a nice score. I got my MSI Armor 1070 today, but I will hook it up tomorrow because I don't have time today. The CD case was shredded, and that annoys me since I'm a neat freak. The card looks good, though. I love that it's black and white, and I got my Phanteks Enthoo Evolv ATX tempered glass today too; it'll fit the black-and-white theme I have going. In any case, I hope mine performs as well as yours. I will update with screenshots.


----------



## Mad Pistol

Now the game begins. I just ordered an MSI GTX 1070 off Newegg and paid for 2-day shipping. It should be here Friday.

So basically what happens is the first card to get here (either from Nvidia or Newegg) is the one I keep, and the other one gets returned... or if there is any interest on this forum, I could keep one of them and sell it minus shipping charges... We shall see.


----------



## Cakewalk_S

Quote:


> Originally Posted by *Mad Pistol*
> 
> Now the game begins. I just ordered an MSI GTX 1070 off Newegg and paid for 2-day shipping. It should be here Friday.
> 
> So basically what happens is the first card to get here (either from Nvidia or Newegg) is the one I keep, and the other one gets returned... or if there is any interest on this forum, I could keep one of them and sell it minus shipping charges... We shall see.


Find out which card clocks the best then sell...bin that puppy!


----------



## Mad Pistol

Quote:


> Originally Posted by *Cakewalk_S*
> 
> Find out which card clocks the best then sell...bin that puppy!


Yeah, I just checked eBay. They're going for $500-550 brand new. I could easily get $400-450 used.

I think you're right... bin then sell is the plan.


----------



## Subz80

Quote:


> Originally Posted by *Subz80*
> 
> I am a German expat living and working in China. I just ended up buying a MSI Gaming X 1070 to replace my Asus Turbo 970 (white shroud blower style fan GPU... a bit noisy!). The MSI already comes nicely overclocked out of the box, but managed to push it further by an offset on the core clock of +100. After running intensive games for 30mins or so the final clock settles down to 2088Mhz. The memory is easily overclocked to 9Ghz. I attach a readout from Firestrike. What I find most amazing is the graphics score approaching a score of 20.000 which is dangerously close to 1080 territory (the precise score is 19.652, a bit hard to read on the mobile camera snapshot). I think a nicely overclocked 970 can get a score of about 12.000. So I manage to get an improvement of about 60%... not bad! Also in a well ventilated case the temps on this GPU barely exceed 60C... but MSI cards have always been good in that department.


... so just to explain in detail what I did with my new MSI Gaming X 1070. I downloaded the current beta version of MSI Afterburner. I also went to MSI's support website, then downloaded and flashed the card with the latest BIOS, which sets the card into OC mode by default (they released this BIOS over the review-sample clock-rates controversy). Then using Afterburner I applied:

Voltage: 100%
Power Limit: 126%
Core offset: +100
Mem offset: +500

When launching one of the Unigine benchmarks or a game, the card first boosts to exactly 2100MHz, stays there for almost 5 mins, but eventually settles down to 2088MHz and stays there. I have also tried a core offset of +125, but Firestrike then causes the GPU driver to crash halfway through the benchmark.

Best of luck!


----------



## dmasteR

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007709%20601204369&Tpk=GTX%201070&ignorear=1

Two GTX 1070's available at the moment. MSI Gaming, MSI Armor.


----------



## Sea Otter

Still available after 5 hours. A week ago they would've easily sold out immediately. Guess stocks are looking up.


----------



## skline00

All 1070s gone.


----------



## Cheatdeath

Got a backorder in on the Gigabyte G1 1070 through Newegg last night, hope it ships soon!


----------



## Ysbzqu6572

Guess I got lucky a week ago when a store in my city had the 1070 available for 489€.
Fully enjoying it at 1440p; it's a sweet card even though it's a Founders Edition.


----------



## Mad Pistol

Got confirmation that one of my 1070s will be here on Friday.

Still no word on the Nvidia shop one that I ordered.


----------



## BulletSponge

For anyone waiting on a back order through Amazon, if you email Amazon customer support in regards to your order and ask when they will be available they will give you free overnight shipping. It's not much but it will get it to you a tiny bit sooner.


----------



## Dude970

Quote:


> Originally Posted by *BulletSponge*
> 
> For anyone waiting on a back order through Amazon, if you email Amazon customer support in regards to your order and ask when they will be available they will give you free overnight shipping. It's not much but it will get it to you a tiny bit sooner.


thanks, just emailed them


----------



## Nightfallx

I have a Reference 1070 FE. I bought it from best buy last night, they got a shipment in of 1x 1070 and 1x 1080 and I bought the 1070.


----------



## Essenbe

Quote:


> Originally Posted by *Dude970*
> 
> thanks, just emailed them


Dude970 is going to have to change his user name to Dude1070


----------



## Brimlock

Had to undo my OC settings last night after a crazy spate of the drivers failing over and over for no clear reason. Not much of a loss though, since OCing didn't seem to make much of a difference.


----------



## Tasm

Got this beast yesterday:

http://i.imgur.com/8zqyI1M.jpg

Let's see what kind of OC it does.

Less than 2000MHz and it's going back.


----------



## Deviem

Hi, new guy here. Got my hands on an MSI 1070 Gaming X...
It boosts to 1936MHz by default, and I got it through Firestrike at 2075MHz / 9100MHz+


----------



## Dude970

Quote:


> Originally Posted by *Deviem*
> 
> Hi, new guy here. Got my hands on an MSI 1070 Gaming X...
> It boosts to 1936MHz by default, and I got it through Firestrike at 2075MHz / 9100MHz+


Nice, what was your Firestrike score?


----------



## KrAzYtHeBoY

*MSI GTX 1070 GAMING X*
Power Limit: +126
Temp limit: 96
Core Clock: +135
Mem Clock: +500

Max temperature: 61ºC


----------



## Subz80

Quote:


> Originally Posted by *KrAzYtHeBoY*
> 
> *MSI GTX 1070 GAMING X*
> Power Limit: +126
> Temp limit: 96
> Core Clock: +135
> Mem Clock: +500
> 
> Max temperature: 61ºC


Nice overclock! Is the +135 offset on the core applied on top of the Gaming or the OC profile? Remember that without the latest BIOS update the GPU always starts in the Gaming profile by default, not the OC one! What are your other system specs; in particular, how much is your CPU overclocked? I have a Z97 Haswell system with a Xeon E3-1231 v3 overclocked to a mild 4GHz... but after overclocking, the 1070 can only get an average of 100FPS in Unigine Heaven... so you are beating me by a 5% margin. Just want to figure out what can explain it. Also, for some strange reason my version of MSI Afterburner can raise the max temp only to 92, not 96. But I am not sure if this will make any difference...


----------



## KrAzYtHeBoY

Quote:


> Originally Posted by *Subz80*
> 
> Nice overclock! Is the +135 offset on the core applied on top of the Gaming or the OC profile? Remember that without the latest BIOS update the GPU always starts in the Gaming profile by default, not the OC one! What are your other system specs; in particular, how much is your CPU overclocked? I have a Z97 Haswell system with a Xeon E3-1231 v3 overclocked to a mild 4GHz... but after overclocking, the 1070 can only get an average of 100FPS in Unigine Heaven... so you are beating me by a 5% margin. Just want to figure out what can explain it. Also, for some strange reason my version of MSI Afterburner can raise the max temp only to 92, not 96. But I am not sure if this will make any difference...


Overclock values set in MSI Afterburner v4.2.0

CPU: 4770K @ 4.2 at 1.225v
16GB G.Skill
MSI Z97 Gaming 5


----------



## Subz80

Quote:


> Originally Posted by *KrAzYtHeBoY*
> 
> Overclock values set on msi afterburner v4.2.0
> 
> CPU 4770K @ 4.2 at 1.225v
> 16GB GSKILL
> Msi z97 gaming 5


Nice, I am using some beta version of MSI's Afterburner which I thought was the latest one. Never mind! My rig is now also stable at:

Power Limit: +126
Temp limit: 92
Core Clock: +120
Mem Clock: +500

but at least on one occasion my GPU driver crashed when pushing the core to +125, during a Firestrike run. Also, I noticed that in the OC range of +100 to +125 the dynamic boost always eventually throttles back (for example in Unigine Valley) to 2060-2090MHz, which squares with earlier reports that these GPUs tend to max boost dynamically into the 2000-2100MHz range no matter what.


----------



## Deviem

Quote:


> Originally Posted by *Dude970*
> 
> Nice, what was your Firestrike score?


Here you go: http://www.3dmark.com/3dm/12654708

I tweaked the clocks a bit. Seems like the 2050-2063MHz area is pretty stable on the core. I got the memory running @ 9216MHz.
Temps never go above 75C with a mild fan profile. Seems like I need a volt-mod BIOS next.


----------



## Ecks9T

After headaches with Win10 and not being able to run 3DMark on Windows 7, I finally got some results with my card overclocked.

http://www.3dmark.com/3dm/12655389



Using msi afterburner 4.3 beta 4

Core Voltage: +10
Power Limit: 112
Temp Limit: 92
Core Clock: +200
Memory Clock: +500
Fan: Auto


----------



## Subz80

I have continued to experiment with MSI Afterburner 4.3 beta 4 and only recently figured out that by pressing Ctrl+F you can enter an advanced core-overclocking mode in which you are allowed to modify Nvidia's Boost 3.0 curve. This has been very useful to know about. In my experience the best way forward is to drag the right-most node UP with Ctrl+left mouse button so as to make the curve somewhat steeper. I think my right-most node now has an offset of about +155. The nodes at 1.081V and 1.093V are +109 and +117, respectively.

This means that in Firestrike my core clock now never falls below 2100MHz. I still cannot break the 20,000 graphics-score barrier but get as far as 19,663. But then again my CPU is not overclocked to 4.6GHz, and I think the graphics score cannot be completely independent of CPU performance. I have never seen my GPU display voltages higher than 1.1V, I think... in most cases in-benchmark my voltage settles down to 1.081, so that is where my calibrated offset matters the most. I really wonder how badly Nvidia has decided to voltage-cap these Pascal cards; this really seems to hold back the overclocking performance.

In any case, with fan speeds cranked up only a little bit from standard settings, these partner-board 1070 GPUs (especially with amazing coolers like the MSI Gaming X) can be made to run stable at no less than 2100MHz for sure...
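Conceptually, the curve editor described above is just a table of per-voltage-point frequency offsets. A rough illustrative sketch of the idea in Python (the node voltages, stock clocks, and offsets below are made-up examples for illustration, not values read from Afterburner):

```python
# Illustrative model of a Boost 3.0 style voltage/frequency curve.
# Each entry maps a voltage point (V) to a stock boost clock (MHz).
stock_curve = {1.050: 1923, 1.081: 1936, 1.093: 1949, 1.100: 1961}

# Per-node offsets (MHz), steeper toward the top, mirroring the approach
# of dragging the right-most nodes up in the curve editor.
offsets = {1.050: 100, 1.081: 109, 1.093: 117, 1.100: 155}

# The tuned curve is simply stock clock + offset at each voltage point.
tuned_curve = {v: clk + offsets[v] for v, clk in stock_curve.items()}

for v, clk in sorted(tuned_curve.items()):
    print(f"{v:.3f} V -> {clk} MHz")
```

The card still walks this curve dynamically; the edits only change which clock corresponds to each voltage point, which is why the observed clock depends on where the voltage settles under load.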


----------



## Peet1

Hey everyone, I am posting here because I hope for a new power-limit BIOS down the line as this thread progresses, or maybe some smart people who are able to edit one will decide there are enough people out there in need of one.

That being said, I got a fine-ass custom 1070 Jetstream in the mail yesterday and have had some time now to play a little bit with it.





Look at that temperature. BTW it's 32°C ambient today.

The power limit is at 112-ish. As you can see, the BIOS already gives a bit more headroom.
My guess is that with a higher power limit and maxed-out voltage I should be able to just barely hold 2100MHz.

The cooler is beefy enough; I mean, look at these temps. The card, even at 80% fan speed, is barely audible. Thumbs up, Palit!

Here's the obligatory Heaven bench with the GPU clock jumping around in steps of 2037/2050/2075MHz and 4404MHz on the memory.
Still playing around with the GPU voltage/memory clock a bit.


----------



## Ysbzqu6572

Hey guys, so today I switched from the FE to the G1 from Gigabyte... what a beauty.
However, when I try to get MSI Afterburner working with all the stuff like power limit and so on, I can't; all I get is core clock/shader clock/memory clock and that's it, the power stuff is missing.
What to do?


----------



## Peet1

You need the latest beta of Afterburner, version 4.3.0 Beta 4:

http://www.guru3d.com/files-details/msi-afterburner-beta-download.html

and you need to check the voltage-control option (at your own risk) in the settings tab.


----------



## Ysbzqu6572

Yeah, I have this version and also enabled voltage control, but power limit control is still missing; just core voltage % becomes active.


----------



## Peet1

*shrugs*
Maybe driver trash.
Did you use DDU to uninstall the driver for a fresh driver install?


----------



## Ysbzqu6572

Sorry, my bad... it was a skin v2 problem. Skin v3 shows everything. Thanks for the help!


----------



## Dude970

Quote:


> Originally Posted by *Deviem*
> 
> Here you go: http://www.3dmark.com/3dm/12654708
> 
> I tweaked the clocks a bit. Seems like 2063-2050MHz area is pretty stable on core. The memory I got running @ 9216MHz.
> Temps never go above 75C with a mild fan profile. Seems like I need a volt mod bios next.


Thanks, looks like your card is performing great. I'm really looking forward to getting mine.


----------



## Nightfallx

Quote:


> Originally Posted by *Dude970*
> 
> Originally Posted by Deviem View Post
> 
> Here you go: http://www.3dmark.com/3dm/12654708
> 
> I tweaked the clocks a bit. Seems like 2063-2050MHz area is pretty stable on core. The memory I got running @ 9216MHz.
> Temps never go above 75C with a mild fan profile. Seems like I need a volt mod bios next.


That score seems a bit low; mine is almost 14,000 stock.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Nightfallx*
> 
> That score seems a bit low, mine is almost 14,000 stock.


Compare the GPU score, not the overall score, in 3DMark.


----------



## Nightfallx

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Compare the gpu score, not overall in 3dmark.


His: Graphics Score 18,737
Mine: Graphics Score 17,299

That is a huge overclock for a tiny gain.


----------



## Mr-Dark

Quote:


> Originally Posted by *Nightfallx*
> 
> His: Graphics Score 18,737
> Mine: Graphics Score 17,299
> 
> That is a huge overclock for a tiny gain.


My 1070 FE hit a 20k graphics score at 2050/9000

http://www.3dmark.com/3dm/12650024


----------



## Nightfallx

Quote:


> Originally Posted by *Mr-Dark*
> 
> My 1070 FE hit 20k graphic score at 2050/9000
> 
> http://www.3dmark.com/3dm/12650024


That's why I said that one guy's is really low for the overclock he has, lol. I haven't attempted to overclock mine yet.


----------



## MrTOOSHORT

His memory might be pushed too far and it's error correcting.

@Deviem

Lower your vram clock and see if your score improves.
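For anyone wanting to test this systematically: GDDR5 error correction means a too-high memory offset can still "pass" a benchmark while silently retrying transfers and losing throughput, so the score, not stability, is the thing to watch. A hedged sketch of the sweep-and-compare idea (`run_benchmark` is a hypothetical stand-in for launching an actual Firestrike run and reading back the graphics score):

```python
def find_best_offset(run_benchmark, offsets):
    """Return (best_offset, all_results) for the offset with the highest score.

    run_benchmark: callable taking a memory offset (MHz) and returning a score.
    offsets: iterable of candidate offsets to test, e.g. range(0, 700, 100).
    """
    results = {off: run_benchmark(off) for off in offsets}
    best = max(results, key=results.get)
    return best, results

# Example with fake scores shaped like the error-correction falloff:
fake_scores = {0: 18000, 200: 18400, 400: 18700, 600: 18500}
best, _ = find_best_offset(lambda off: fake_scores[off], fake_scores)
print(best)  # prints 400: the higher offset scores worse despite "passing"
```

The takeaway matches the advice above: back the offset off and keep whichever setting scores highest, not whichever merely completes the run.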


----------



## KrAzYtHeBoY

Flash the new BIOS and select "OC MODE":
https://www.msi.com/Graphics-card/support/GEFORCE-GTX-1070-GAMING-X-8G.html#down-bios

and then set new values in Afterburner.


----------



## Subz80

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> His memory might be pushed too far and it's error correcting.
> 
> @Deviem
> 
> Lower your vram clock and see if your score improves.


@Deviem

I doubt this advice will work. The graphics score is not completely CPU-independent. If you have a heavily overclocked CPU on a brand-new Skylake platform with some amazing DDR4 memory sticks, the graphics score may be pushed up that way. This guy's CPU is not overclocked much, but it has 6 physical cores and 12 threads... also he's got 65GB of RAM, for crying out loud! I had my memory offset dialed in at +500 (leading to more than 9.2GHz), and I just tried running Firestrike with the memory offset halved to +250... and the graphics score definitely went down...


----------



## criminal

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> His memory might be pushed too far and it's error correcting.
> 
> @Deviem
> 
> Lower your vram clock and see if your score improves.


Probably what's happening.

http://www.3dmark.com/fs/8908433

If I take my vram any higher than that (9316), my scores tank.


----------



## Nightfallx

Quote:


> Originally Posted by *KrAzYtHeBoY*
> 
> Flashing new bios and select "OC MODE"
> https://www.msi.com/Graphics-card/support/GEFORCE-GTX-1070-GAMING-X-8G.html#down-bios
> 
> and new values for Afterburner.


Where did you get a new BIOS? I didn't know you could edit these already.


----------



## Tasm

Quote:


> Originally Posted by *KrAzYtHeBoY*
> 
> Flashing new bios and select "OC MODE"
> https://www.msi.com/Graphics-card/support/GEFORCE-GTX-1070-GAMING-X-8G.html#down-bios
> 
> and new values for Afterburner.


How did you flash the BIOS?

Nice score with stock voltage.


----------



## Pragmatist

Got 3DMark on Steam (summer sale, btw). I keep crashing nonstop.









I can run the 4K benchmark halfway... that's the furthest I have gotten. Will play around with the settings and whatnot after the migration to the Phanteks Enthoo Evolv tempered glass edition. Highest stable core clock is 2063 without increasing the voltage.


----------



## KrAzYtHeBoY

Quote:


> Originally Posted by *Nightfallx*
> 
> where did you get a new bios? I didn't know you could edit these already.


Quote:


> Originally Posted by *Tasm*
> 
> How did you flash the BIOS?
> 
> Nice score with stock voltage.


Download it from the MSI official support page or Zippyshare:
http://www20.zippyshare.com/v/7UTkg1MH/file.html

Run the .bat and press O ---> for OC mode default


----------



## Nightfallx

Quote:


> Originally Posted by *Pragmatist*
> 
> Got 3DMark on Steam, summer sale btw. I keep crashing nonstop.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can run the 4K benchmark halfway.... that's the furthest I have gotten. Will play around with the settings and whatnot after the migration to the Phanteks enthoo evolv tempered glass edition. Highest stable core clock is 2063 with out increasing the voltage.


Thanks for the heads up, I just bought 3DMark also. Great price.


----------



## TheKoala

So I just got my card (MSI Gaming X) this morning, and it's been rock stable running Fire Strike for the last couple of hours.

I've been Team Red for a while and it is nice to change it up.

Also, is there any certain settings or configurations that should be used while running Fire Strike?

My score below:

http://www.3dmark.com/3dm/12666542


----------



## criminal

Quote:


> Originally Posted by *TheKoala*
> 
> So I just got my card(MSI Gaming X) this morning and been rock stable running Fire Strike for the last couple of hours.
> 
> I've been Team Red for a while and it is nice to change it up.
> 
> Also, is there any certain settings or configurations that should be used while running Fire Strike?
> 
> My score below:
> 
> http://www.3dmark.com/3dm/12666542


Nice graphics score. You can go into Nvidia control panel under 3d settings and change some settings specific to Firestrike. I can't remember right off the top of my head what they are, but if you go to one of the Firestrike threads I think they mention it there.


----------



## Pragmatist

Quote:


> Originally Posted by *Nightfallx*
> 
> Thanks for the heads up, I just bought 3DMark also. Great price.


You're welcome.









Too bad I can't run it for some odd reason, but I guess the issue is on my end, or so it seems. I will get to the bottom of it.


----------



## AuraNova

So... not that this matters to anyone here, but I guess I can say that I am an owner of a 1070 for now. Sometime earlier this week I made an impulse buy on a 1070 when I saw a particular one available on Newegg. I was gonna sell it off, but I just have this inclination to keep it. Somehow, I am waiting for Vega's release more than I want an RX 480 at this point. Besides, I haven't run an NVIDIA card in ages.

I guess I can add mine when I get it, which should be tomorrow.


----------



## Luciferxy

Quote:


> Originally Posted by *Subz80*
> 
> I have continued to experiment around with MSI Afterburner 4.3 beta 4 and only recently figured out that by pressing Ctrl+F you can enter an advanced core overclocking mode in which you are allowed to modify Nvidia's Boost 3.0 curve. This has been very useful to know about. In my experience the best way forward is to drag the right-most node with Ctrl+left mouse button UP so as to make the curve somewhat steeper. I think my right-most node now has an offset of about +155. The nodes as 1.081V and 1.093V are +109 and +117, respectively.
> 
> This means that in Firestrike my core clock now never falls below 2100Mhz. Still cannot break the 20.000 graphics score barrier but get as far as 19.663. But then again my CPU is not overclocked to 4.6Ghz and I think the graphics score can not be completely independent of the CPU performance. I have never seen my GPU display voltages higher than 1.1V I think... in most cases in-benchmark my voltage settles down to 1.081, so that is where my calibrated offset matters the most. I really wonder how badly NVidia has decided to voltage-cap these Pascal cards, this really seems to hold back the overclocking performance.
> 
> In any case with fan speeds cranked up only a little bit from standard settings these partner-board 1070 GPUs (especially with amazing coolers like the MSI gaming X) can be made to run stable at no less than 2100Mhz for sure...


can you make a flat curve using that advance option (manual tweaking of those nodes) ?

that would be the same effect as having boost disable.


----------



## Subz80

Quote:


> Originally Posted by *Luciferxy*
> 
> can you make a flat curve using that advance option (manual tweaking of those nodes) ?
> 
> that would be the same effect as having boost disable.


To be honest, I have not tried that option, but I suspect it may not work that way. I've been following the recommendations I have seen in some YouTube videos. Also, you probably do not want a completely flat curve, as this would potentially kill the power-saving features of the card. You can, however, try to give the curve a somewhat flatter segment in the range of clocks where the card tends to oscillate under intense loads. This can perhaps stabilize the clocks a bit. Anyways, with my current curve profile I can always keep my MSI Gaming X above 2100MHz and will not try to push any further, as this is a good enough stable OC result for me... and I still think that Firestrike graphics scores of 20,000 and above are also the result of a modern, overclocked Skylake platform and do not exclusively depend on how well you managed to overclock your Pascal GPU!


----------



## Subz80

...for what it's worth, here are my current results (I also got 3DMark in the Steam summer sale!)


----------



## HyeVltg3

Quote:


> Originally Posted by *AuraNova*
> 
> So...Not that this matters to anyone here, but I guess I can say that I am an owner of a 1070 for now. Sometime earlier this week, I made an impulse buy on a 1070 when I saw a particular one available on Newegg. I was gonna sell it off, but I just have this inclination of keeping it. Somehow, I am really waiting for Vega's release more than I would want a RX 480 at this point. Besides, I haven't run an NVIDIA card in ages.
> 
> I guess I can add mine when I get it, which should be tomorrow.


Ha!... I did exactly this.
Just got the Newegg notification a few mins ago, caved, added the MSI 1070 to the cart and completed the order.
My initial plan was to grab a 480 and ride it till Vega, but I may just ride the 1070 till Vega.

Mainly I came here to ask how much people paid for their 1070s. I am Canadian, but the currency conversion may work out just right.

(Americanized; In USD)
Subtotal: $438.44
Handling: $0.00
Shipping: $9.23
GST/HST: $22.38 [tax]

Total Amount: $470.05

Are we Canadians paying more, less, or just right?
It feels a bit overpriced for the GTX x70 range.
The last NV card I had was the GTX 670, and I paid $415 USD for it (x2).


----------



## Subz80

Quote:


> Originally Posted by *HyeVltg3*
> 
> Ha!... did exactly this.
> Just got Newegg notified a few mins ago, caved and added the MSI 1070 to cart and completed.
> My initial plan was to grab 480 and ride it till Vega, but I may just ride the 1070 till Vega.
> 
> Mainly came here to ask how much people paid for their 1070s, I am Canadian but the currency conversion may just be right.
> 
> (Americanized; In USD)
> Subtotal: $438.44
> Handling: $0.00
> Shipping: $9.23
> GST/HST: $22.38 [tax]
> 
> Total Amount: $470.05
> 
> We Canadians paying more, less, just right?
> I feel a bit overpriced for the GTX x70 range.
> Last NV card I had was the GTX 670 and I paid $415usd for it (x2)


Hi there,

I paid 3600RMB tax-inclusive in a high-street store in Ningbo, China. At current exchange rates this is:

US$540 (ouch...)
CAD$700 (double ouch...)
€488 (great...) --> In Germany my card currently costs €499, so I got a slightly better deal in China. Also, I am glad that my card overclocks really well and stably, and that there is practically no coil whine...


----------



## HyeVltg3

Quote:


> Originally Posted by *Subz80*
> 
> Hi there,
> 
> I paid 3600RMB tax inclusive in a high street store in Ningbo, China. At current exchange rates this is:
> 
> *US$540 (ouch...)
> CAD$700 (double ouch...)*
> EU488 (great...) --> In Germany my card currently costs EU499, so I got a slightly better deal in China. Also I am glad that my card overclocks stably really well and that there is practically no coil whine...


Wow, damn, that's expensive. But if that's normal for you, then grats.

I still need to figure out how to OC and get the 2100 I just heard about. Sounds like a fantastic number.


----------



## Subz80

Quote:


> Originally Posted by *HyeVltg3*
> 
> Wow dam thats expensive. but if thats normal for you then grats.
> 
> I still need to figure how to OC and get the 2100 I just heard about. sounds like a fantastic number.


From my experience, in terms of the effective extra FPS you can actually see in benchmarks and games, there is practically no difference between running the GPU at, say, 2050-2075MHz or >2100MHz. You are gaining 1-2FPS here and there at best, probably at the cost of having to increase fan speeds a bit and worry that your rig may crash on you.

So if you can get >2000MHz without any dynamic throttling below this threshold, that is already good enough! Also, I harbor the suspicion that with Pascal GPUs the extra performance you get from increasing clock speeds is not linear... which means at the high end you see diminishing returns in terms of extra FPS...
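The diminishing-returns point is easy to sanity-check: even with perfectly linear scaling, going from 2050 to 2100MHz is only a ~2.4% clock increase. A quick illustrative sketch (the 100FPS baseline is made up):

```python
def ideal_fps_gain(base_clock: float, oc_clock: float, base_fps: float) -> float:
    """FPS at oc_clock assuming perfectly linear scaling with core clock."""
    return base_fps * oc_clock / base_clock

base_fps = 100.0  # e.g. a Heaven average at 2050 MHz
print(ideal_fps_gain(2050, 2100, base_fps))  # ~102.4 FPS even in the best case
```

In practice the real gain is smaller still, since FPS never scales perfectly with core clock, which is why those last 50MHz are barely visible in games.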

...by the way, the new MSI Gaming X 1080 costs 5800RMB in my city, Ningbo = CAD$1150 = US$880... I almost considered buying it... but then thought, nahhh, that's crazy! For a meagre extra 20-25% performance increase, it's a much better option to try to overclock the hell out of the €500 1070 and leave it at that!


----------



## pez

1070 got here today. Can't wait to get the system up and running now.


----------



## Eorzean

Quote:


> Originally Posted by *HyeVltg3*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Ha!... did exactly this.
> Just got Newegg notified a few mins ago, caved and added the MSI 1070 to cart and completed.
> My initial plan was to grab 480 and ride it till Vega, but I may just ride the 1070 till Vega.
> 
> Mainly came here to ask how much people paid for their 1070s, I am Canadian but the currency conversion may just be right.
> 
> (Americanized; In USD)
> Subtotal: $438.44
> Handling: $0.00
> Shipping: $9.23
> GST/HST: $22.38 [tax]
> 
> Total Amount: $470.05
> 
> We Canadians paying more, less, just right?
> I feel a bit overpriced for the GTX x70 range.
> Last NV card I had was the GTX 670 and I paid $415usd for it (x2)


Where'd you get yours? Mine was about $633 after taxes and shipping on Newegg. Got the MSI Gaming X version. I had an order for the Gaming originally, but figured I'd pay the extra ~$15 to help its resale value if I end up deciding to sell it.

I was also thinking about getting the RX 480 at first and "roughing" it out till the bigger cards came out, but I knew I'd probably regret not going for the extra oomph. I may just end up keeping it when the bigger cards come out, since I won't be playing any demanding shooters on my 1440p monitor anyway!

Looking forward to the upgrade, as I was using a GTX 780 for quite some time.


----------



## HyeVltg3

Quote:


> Originally Posted by *Eorzean*
> 
> Where'd you get yours? Mine was about $633 after taxes and shipping on NewEgg. Got the MSI Gaming X version. Had an order for the Gaming originally, but figured I'd pay the extra ~$15 to help it's resale value if I end up deciding to sell it.
> 
> I was also thinking about getting the RX 480 at first and "roughing" it out till the bigger cards came out, but I knew I'd probably regret not going for the extra oomph. I may just end up keeping it when the bigger cards come out since I won't be playing any demanding shooters on my 1440p monitor anyways!
> 
> Looking forward to the upgrade as I was using a GTX 780 for quite some time.


Actually, I just came here to ask about the X and non-X variants.
I didn't notice there were two different ones; I got the non-X at the same place, Newegg, so that's probably why our totals are off.

*Main Q*: are the MSI Gaming X cards just better binned, or are you just paying extra for the factory overclock? I'd hate to find out I grabbed the lesser-binned card to save $15 =/ (well, mainly I didn't know there were non-X versions, haha)


----------



## Peet1

Quote:


> Originally Posted by *HyeVltg3*
> 
> Actually just came here to ask about the X and Non-X variants.
> I didnt notice there were two different ones, got the non-X and same place Newegg, thats probably why our Totals are off.
> 
> *Main Q*, are the MSI Gaming X just better binned cards or are you just paying the extra for the factory overclock? would hate find out I grabbed the lesser binned card to save $15 =/ (well mainly I didnt know there were Non-X versions, haha)


Wow, this is interesting. The non-X must be an Asian and/or North American thing, because they aren't on sale in Europe.

On a side note, I happened to notice that my GTX 1070 Jetstream has a dual BIOS switch.
That might sound stupid, but how do I use it? Can I keep the PC on, or do I need to turn the PC off, then flip the switch and start the PC?


----------



## Subz80

Try copying the following OC profile into your Profiles folder inside the MSI Afterburner installation folder... this profile STARTS with an initial clock speed of 1126MHz, but should settle down to a sustainable 1101MHz in Heaven and Valley in the long run:

VEN_10DEDEV_1B81SUBSYS_33061462REV_A1BUS_1DEV_0FN_0.txt 13k .txt file


... in order to upload the file I had to change the file extension to .txt, but you need to rename it to .cfg before copying it into the correct folder. I saved this as my SLOT1 profile in the latest beta version of MSI Afterburner (4.3.08352 beta). I've edited this post and uploaded a more conservative profile, because the first one gave me a small amount of artefacts.
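The rename-and-copy step above can be sketched in Python. The Afterburner install path shown in the comment is an assumption for a default install, and `install_profile` is a hypothetical helper name, not part of any tool:

```python
# Sketch of the manual step described above: take the forum-attached .txt,
# rename it back to .cfg, and drop it into Afterburner's Profiles folder.
import shutil
from pathlib import Path

def install_profile(downloaded: str, profiles_dir: str) -> Path:
    """Copy a downloaded .txt profile into the given folder with a .cfg suffix."""
    src = Path(downloaded)
    dest = Path(profiles_dir) / src.with_suffix(".cfg").name
    shutil.copyfile(src, dest)
    return dest

# Example (hypothetical paths for a default install; adjust to your system):
# install_profile(r"C:\Downloads\VEN_10DE...txt",
#                 r"C:\Program Files (x86)\MSI Afterburner\Profiles")
```

Afterburner matches the profile to the card by the long `VEN_..._BUS_...` filename, so keep the original name and change only the suffix.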


----------



## Deviem

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> His memory might be pushed too far and it's error correcting.
> 
> @Deviem
> 
> Lower your vram clock and see if your score improves.


Yeah, I'll try lowering the memory clocks later today when I get home. I was actually wondering about the lowish graphics score. There was no visible artifacting during the run, though. Thanks for the tip, sir!


----------



## Tasm

So far, so good.



Do you guys think that increasing the voltage to 100% can damage the card?

I mean, it's pretty much impossible, since those increases are so small... even at +100% the voltage will only be +0.0290V over stock, and that's nothing at all.


----------



## crun

From my experience, increasing core voltage does literally nothing. Can someone correct me?

Overall, I'm quite happy with my Gigabyte GTX 1070. I have a curve of between +190 and +200 core clock. Memory is set to +600. This translates to a core clock between 1970 and 2100 (max) depending on the game, e.g. Overwatch sits at 2050 most of the time, W3 (B&W) is 1974-2020. Memory clock is 2300 (9200 effective). The fan curve is modified so it maxes out at 65%/83C. This way the card is pretty silent, about the same as my case fans set to 12V (inaudible with an open headset). More than 70% was a bit noisy; 80%+ was approaching reference R9 290 levels.

I only hope a custom BIOS will be available at some point, so we can disable GPU Boost. I'm pretty sure I would be able to run something like a constant 2080/70%/83C then.

When were custom BIOSes released in the case of the Maxwell cards?
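The memory numbers above (a +600 offset reading back as ~2300 actual / 9200 effective) are consistent with Afterburner applying its offset in the doubled (DDR) clock domain. A hedged sketch of that arithmetic, assuming the GTX 1070's stock GDDR5 figure in that domain is 4004MHz:

```python
# Hedged sketch of the GDDR5 clock math implied above. Assumptions:
# Afterburner's memory offset applies to the doubled (DDR) clock domain,
# and the GTX 1070's stock figure there is 4004MHz.
STOCK_DDR_MHZ = 4004  # doubled clock as Afterburner reports it (assumed)

def memory_clocks(offset_mhz):
    """Return (command clock, effective transfer rate) for a given offset."""
    ddr = STOCK_DDR_MHZ + offset_mhz
    command = ddr / 2   # actual command clock, e.g. what GPU-Z reads
    effective = ddr * 2 # quad-pumped GDDR5 transfer rate
    return command, effective

cmd, eff = memory_clocks(600)
print(cmd, eff)  # 2302.0 and 9208 -> matching the ~2300 (9200) figures above
```

The same function at offset 0 gives the stock 2002/8008, which is why "+600" sounds large but is really a ~7.5% memory overclock.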


----------



## KrAzYtHeBoY

Vcore = 100% (graphics card not stable)

*Overclock based on BIOS OC DEFAULT MODE.*

Core voltage: 0%
Power Limit: 126%
Temp Limit: 92
Core clock: +105
Mem Clock: +505

GPU Clock: 1712MHz @ 1902MHz boost
Mem Clock: 2280MHz

*NEW SCORE:*


----------



## Mad Pistol

Shameless plug, but I am selling my second GTX 1070 that came in. The listing is here on the OCN marketplace. I am selling it @ MSRP (no markup, no BS)

http://www.overclock.net/t/1604056/for-sale-nvidia-gtx-1070-founders-edition-sealed-unopened/0_30


----------



## Deviem

OK, now we're on the right track.

I had the card running @ PCI-E x2 for some reason, as you can see from my first GPU-Z post a few pages earlier.
I couldn't change the value no matter what I did, but swapping the card to a different PCI-E slot did the trick. Now it runs @ x16 3.0.


----------



## Dude970

Well done, nice improvement


----------



## AuraNova

Well, I finally got my card sometime around 5pm ET today.


Unfortunately, I can't put it in my current rig because it won't fit.







So I will hold off on entering the club for now, I guess.
But this card wasn't intended for my current build.


----------



## Oloc

Quote:


> Originally Posted by *HyeVltg3*
> 
> Actually just came here to ask about the X and Non-X variants.
> I didnt notice there were two different ones, got the non-X and same place Newegg, thats probably why our Totals are off.
> 
> *Main Q*, are the MSI Gaming X just better binned cards or are you just paying the extra for the factory overclock? would hate find out I grabbed the lesser binned card to save $15 =/ (well mainly I didnt know there were Non-X versions, haha)


From what I've read, the non-X is just the renamed LE version, and the only difference is MSI doesn't factory overclock it. I believe the upcoming Gaming Z card is the new name for their Lightning card.


----------



## Ecks9T

Quote:


> Originally Posted by *Oloc*
> 
> From what I've read. The non-x is just the renamed LE version. And the only difference is MSI doesn't pre overclock it. I believe the upcoming Gaming Z card is the new name for their Lightning card.


At some point, once we start flashing BIOSes, I don't think it will matter which card we have.


----------



## HyeVltg3

Quote:


> Originally Posted by *Oloc*
> 
> From what I've read. The non-x is just the renamed LE version. And the only difference is MSI doesn't pre overclock it. I believe the upcoming Gaming Z card is the new name for their Lightning card.


Gaming Z? Awww, that sounds awesome, especially when your name starts with the letter Z haha
Edit: *LE? Do you mean FE?*

Quote:


> Originally Posted by *Ecks9T*
> 
> At some point when we start flashing bios' I don't think it will mattrr which card we have.


Really hoping this ends up like the 980 Ti, with so many variants but most moddable to run like the others (hugely paraphrasing).

Well, my MSI 1070 non-X is in the packaging stage. I assume that's Newegg lingo for "we're on it... when it gets here," but it could also just be because it's the weekend and they really do have it.
Do people still advise against SLI?
Wondering if I should grab another one or wait for AMD's Vega or the 1080 Ti.

I've always loved the x70 range; never understood the x80 when there's a Titan, and then they introduced the "Ti", even more reason not to grab an x80.


----------



## Ecks9T

Yeah, I felt the x80 cards are a bit much. The last one I bought at retail was a 480; then I started buying second-hand cards until now. But SLI would be nice if you are running anything higher than 1440p, from what I have read. And looking at SLI reviews shows that support will still be limited per game title.


----------



## Hunched

Just ordered myself a Gigabyte 1070 for $400 USD! Finally a reasonably priced 1070; it's like $20 cheaper than the next cheapest 1070, and it's better too.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814125875

It should be like a 1070 G1 minus a fan and some RGB gimmicks. It can potentially overclock just as well, with the same power delivery as the G1, and thermals aren't an issue with Pascal.

I wasn't planning on getting a 1070, but the RX 480 won't deliver this level of performance, who knows when Vega will be released, and I'm not getting a 980 Ti for the exact same price.
At $400, only $20 over MSRP, this card is the first 1070 that isn't kinda bull**** price-wise, and it only became available to purchase like 24 hours ago.
So ya got me, Nvidia.

The only way I'll regret this card is IF people find a way to actually take advantage of more than a single 8-pin on Pascal with a custom BIOS; at this point the extra connector is literally pointless from what I've seen.
1x 8-pin 1070's and 1080's OC just as well as 8+6-pin 1070's and 1080's, and nothing OC's any better than a Founders Edition. It's ******ed.
Pascal OCs are 100% determined by the silicon lottery, so just get whatever is cheapest is what I've gathered; every card has the same odds at the lottery so far.


----------



## Mad Pistol

Quote:


> Originally Posted by *Hunched*
> 
> Just ordered myself a Gigabyte 1070 for $400 USD! Finally a reasonably priced 1070, it's like $20 cheaper than the next cheapest 1070 and it's better too.
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125875
> 
> Should be like a 1070 G1 but minus a fan and some RGB gimmicks. Can potentially overclock just as well with same power delivery as G1 and thermals aren't an issue with Pascal.
> 
> I wasn't planning on getting a 1070 but RX480 won't deliver this level of performance, who knows when Vega will be released, and I'm not getting a 980 Ti for the exact same price.
> At $400, only $20 over MSRP, this card is the first 1070 that isn't kinda bull**** price-wise and it's been available to purchase for the first time only like 24 hours ago.
> So ya got me Nvidia.
> 
> The only way I'll regret this card is IF people find a way to actually take advantage of more than a single 8-pin on Pascal with custom BIOS, at this point it's literally pointless from what I've seen.
> 1x 8-pin 1070's and 1080's OC just as well as 8+6 pin 1070's and 1080's and nothing OC's any better than a Founders Edition. It's ******ed.
> Pascal OC's are 100% determined on the silicon lottery and just get whatever is cheapest is what I've gathered, every card has the same odds at the lottery so far.


Good going. That's definitely a killer price. Let us know how it runs when you get it.


----------



## HyeVltg3

Quote:


> Originally Posted by *Hunched*
> 
> *The only way I'll regret this card is IF people find a way to actually take advantage of more than a single 8-pin on Pascal with custom BIOS*, at this point it's literally pointless from what I've seen.
> 1x 8-pin 1070's and 1080's OC just as well as 8+6 pin 1070's and 1080's and *nothing OC's any better than a Founders Edition. It's ******ed.*
> Pascal OC's are 100% determined on the silicon lottery and just get whatever is cheapest is what I've gathered, every card has the same odds at the lottery so far.


I'm not sure what you mean, isn't that a good thing? A modded BIOS letting us OC better...!?

Is that true?! FEs OC better than the other AIBs?


----------



## Porter_

Quote:


> Originally Posted by *Hunched*
> 
> Just ordered myself a Gigabyte 1070 for $400 USD! Finally a reasonably priced 1070, it's like $20 cheaper than the next cheapest 1070 and it's better too.
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125875
> 
> Should be like a 1070 G1 but minus a fan and some RGB gimmicks. Can potentially overclock just as well with same power delivery as G1 and thermals aren't an issue with Pascal.
> 
> I wasn't planning on getting a 1070 but RX480 won't deliver this level of performance, who knows when Vega will be released, and I'm not getting a 980 Ti for the exact same price.
> At $400, only $20 over MSRP, this card is the first 1070 that isn't kinda bull**** price-wise and it's been available to purchase for the first time only like 24 hours ago.
> So ya got me Nvidia.
> 
> The only way I'll regret this card is IF people find a way to actually take advantage of more than a single 8-pin on Pascal with custom BIOS, at this point it's literally pointless from what I've seen.
> 1x 8-pin 1070's and 1080's OC just as well as 8+6 pin 1070's and 1080's and nothing OC's any better than a Founders Edition. It's ******ed.
> Pascal OC's are 100% determined on the silicon lottery and just get whatever is cheapest is what I've gathered, every card has the same odds at the lottery so far.


I think you'll have more than enough cooling capacity with that card, particularly if you use a custom fan profile. I have the Gigabyte G1, and with the fan profile I set, it levels off at 60C after hours of playing. It's more cooling capacity than I need. I'll see how it does when my second card arrives on Monday, but I'm not expecting it to be a problem. I think you made the smart move going with the two-fan $400 version.


----------



## Hunched

Quote:


> Originally Posted by *Mad Pistol*
> 
> Good going. That's definitely a killer price. Let us know how it runs when you get it.


I'm curious to find out how GPU Boost 3.0 and throttling work this time around; it's hard to find information.
Maxwell throttled back clocks and voltages at a mere 68C.

Also, things that previously required a BIOS edit are now possible in OC utilities.
You are able to configure GPU Boost 3.0's voltage/frequency curve in EVGA Precision XOC, for example. It seems neat, and literally zero people are talking about it for some reason.
http://www.evga.com/precisionxoc/
I can't find anything on whether GPU Boost 3.0 is an improvement or more restrictive, whether throttling is enforced early or customizable, or anything.
Nobody is talking about it, yet everyone has to deal with it when overclocking, so I don't understand.
Guess I'll find out...
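From what has been published about Precision XOC, the headline change in GPU Boost 3.0 is per-voltage-point frequency offsets instead of the single global core offset of Boost 2.0. A conceptual sketch of that difference (every voltage and frequency value below is invented purely for illustration):

```python
# Conceptual sketch of a GPU Boost 3.0 style voltage/frequency curve.
# All point values here are made-up illustrations, not real 1070 numbers.
base_curve = {          # voltage (V) -> stock boost frequency (MHz)
    0.80: 1650,
    0.90: 1800,
    1.00: 1900,
    1.06: 1950,
}

def apply_offsets(curve, offsets):
    """Boost 3.0 style: each voltage point gets its own offset (default +0)."""
    return {v: f + offsets.get(v, 0) for v, f in curve.items()}

def apply_global_offset(curve, offset):
    """Boost 2.0 style: one offset shifts every point equally."""
    return {v: f + offset for v, f in curve.items()}

# e.g. push only the top of the curve and leave low-voltage points stock,
# something the old single-slider model could not express:
tuned = apply_offsets(base_curve, {1.00: 150, 1.06: 175})
```

The practical upside is that a chip which is stable at +175 only near its top voltage no longer forces you to back the whole curve down to whatever the weakest point tolerates.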
Quote:


> Originally Posted by *HyeVltg3*
> 
> I'm not sure what you mean, isnt that a good thing? modded BIOS letting us OC better...!?
> 
> Is there true?! FE's OC better than the other AIBs?


If people get a modded BIOS working to OC better, what I'm saying is that I didn't get a card that will benefit greatly from it, since it only has a single 8-pin.
It would be more beneficial to have an 8+6-pin, which a BIOS mod could use to actually take advantage of the extra power; I won't have that.

I didn't mean the FEs OC better than all the AIBs; I mean they all have equal overclocking capability. Custom AIB cards with super coolers and more power pins do not overclock any better, they are equal.
Which isn't how things should be, but it's how it is.
The MSI GTX 1080 Gaming X, with its 8+6 and superior cooler, overclocks no better than an FE.


----------



## Ysbzqu6572

Guys, I am disappointed with the 1070 G1 Gaming from Gigabyte.
It overclocks about the same as my previous Founders Edition from ASUS, but it is way louder.


----------



## Mad Pistol

Quote:


> Originally Posted by *H4wk*
> 
> Guys, I am disappointed with 1070 G1 gaming from gigabyte..
> It overclocks about the same as my previous founders edition from ASUS but it is way louder


That's another reason I went ahead and got the Founder's Edition. The 1070 I have is actually quieter than the GTX 780 w/ reference cooler, and it's much faster. Not everyone likes it, but I think the Nvidia reference cooler on these Pascal cards is actually pretty good.


----------



## Hunched

Quote:


> Originally Posted by *H4wk*
> 
> Guys, I am disappointed with 1070 G1 gaming from gigabyte..
> It overclocks about the same as my previous founders edition from ASUS but it is way louder


How is that even possible? The fans must be better than the reference fan, and even if they were worse, you have three instead of one, so you could run much lower RPM.
Your fan curve must be super aggressive.


----------



## Ysbzqu6572

I don't know, they are so loud I can hear all three of them spinning; the air noise is simply crazy, they are like small turbines spinning up in games.
The FE was much quieter overall, TBH.
Maybe I will play with the fan curve or something, but I think it is stupid that out of the box it is louder than reference, isn't it?


----------



## blahtibla

I was tired of waiting for the custom 1080s, so I cancelled my order and picked up an MSI 1070 Armor yesterday.

Seems I got a pretty decent chip. In Valley it's stable at 2100MHz on the core and +450 on the memory.

For gaming (Overwatch) I have to dial back to 2062MHz or it will crash the driver. Here are some benchmarks at those clocks:

Firestrike Ultra
http://www.3dmark.com/3dm/12706575

Firestrike Extreme
http://www.3dmark.com/3dm/12706922

Valley (1440p ultra + 8x MSAA)
Score 2584





The cooler on this card looks puny, but it is very good, as usual with MSI. Very effective and very quiet.
I feel this card is going to fit my needs nicely for 1440p gaming until the big chips are out.


----------



## Peet1

Quote:


> Originally Posted by *Mad Pistol*
> 
> That's another reason I went ahead and got the Founder's Edition. The 1070 I have is actually quieter than the GTX 780 w/ reference cooler, and it's much faster. Not everyone likes it, but I think the Nvidia reference cooler on these Pascal cards is actually pretty good.


The problem with the 1070 Founders Edition cards is not the cooler per se, but the combination of a slightly undersized cooler (lacking the vapor chamber of the 1080), a low number of power phases, and insufficient power delivery, which results in an inconsistent, constantly bouncing clock.
Quote:


> Originally Posted by *blahtibla*
> 
> The cooler on this card looks puny, but it is very good as usual with MSI. Very effective and very silent.


Hmmm, I don't know, I would have thought that the Armor cooler would be better; 71°C is not particularly good.

My Palit Jetstream 1070, for example (28°C ambient)


----------



## Pragmatist

MSI ARMOR.


----------



## Peet1

Quote:


> Originally Posted by *Pragmatist*
> 
> MSI ARMOR.


With an air conditioner standing next to it, lol?


----------



## Yetyhunter

Is it really worth going for a custom AIB version like Gigabyte or ASUS, OC- and cooling-wise? I can't find anything in stock here, only FE cards. I really want to get this card this week. Awaiting recommendations.


----------



## Peet1

Quote:


> Originally Posted by *Yetyhunter*
> 
> Is it really worth going for a custom AIB version like Gigabyte or ASUS ? OC and cooling wise. I can't find anything in stock here only FE editions. I really want to get this card this week. Awaiting recomandations.


Get a Palit card, best cooling.


----------



## Pragmatist

Quote:


> Originally Posted by *Peet1*
> 
> With a air condition standing next to it lol?


?

If you're wondering if it's air cooled, then yes it is. In any case, there are too many factors for the temp to be the same for every user. I've never used Palit, barely heard of them TBH.

Edit: I got the joke now.... lol. No, though. I have it in a closed case (Phanteks Enthoo Evolv ATX tempered glass).


----------



## Yetyhunter

Quote:


> Originally Posted by *Peet1*
> 
> Get a palit card, best cooling.


Nowhere to be found. Only ASUS, Gigabyte, MSI and Zotac, all out of stock, not even pre-order. Should I get the FE, which is in stock?


----------



## Whiskas

Hello guys, my MSI 1070 Gaming X arrived yesterday; here are my results in 3DMark and Unigine:

Firestrike - http://www.3dmark.com/3dm/12715985
Firestrike Ultra - http://www.3dmark.com/3dm/12715894



OC Profile (standard BIOS)

Core voltage +100
Power limit +110
Core clock +105
Memory clock +400
Custom fan curve

Temp is around 63-65° during tests; after 10 minutes of the Firestrike Ultra stress test it's 67°.
I was really disappointed that there is no way to check ASIC quality, but I see that everyone here has really comparable results, so maybe it doesn't really matter?


----------



## Dude970

Good temps and good scores. If it showed ASIC yours would be good, congrats!


----------



## Ecks9T

Quote:


> Originally Posted by *Whiskas*
> 
> Hello guys, my Msi 1070 Gaming X arrived yesterday, here is my results in 3DMark and Unigine:
> 
> Firestrike - http://www.3dmark.com/3dm/12715985
> Firestrike Ultra - http://www.3dmark.com/3dm/12715894
> 
> 
> 
> OC Profile (Standart Bios)
> 
> Core voltage +100
> Power limit +110
> Core clock +105
> Memory clock +400
> Custom fan curve
> 
> Temp is around 63-65° during tests, after 10 minutes of Firestrike Ultra stress test it's 67°.
> Was really disapointed that there is no way to check ASIC quality, but I see that everyone here have really comparable results, so maybe that doesn't really matter?


I don't think you need the +100 core voltage. I am only using +10 to get 112/200/500.


----------



## Ranguvar

6700K @ 4.7GHz + EVGA 1070 SC @ 2.1GHz/9.34GHz!

Firestrike (using new driver, I don't mind that it won't validate): 17,347
http://www.3dmark.com/3dm/12717643

+145 to core, +685 to memory. I tried taking off the "+100%" to voltage and kept the same overclock.

Amusingly, with fan at 100% (which I wouldn't mind), the GPU barely kissed 50C.
That's absurd.


----------



## Whiskas

Quote:


> Originally Posted by *Dude970*
> 
> Good temps and good scores. If it showed ASIC yours would be good, congrats!


Thanks, hopefully the GPU-Z dev will add ASIC reading eventually.

Quote:


> Originally Posted by *Ecks9T*
> 
> I don't think you need the +100 core voltage. i am only using +10 to get 112/200/500


I used the Guru3D review as an OC example, and they used +100. Anyway, I'll test +10 too, thank you.


----------



## Mad Pistol

What are most people getting for benchable levels on their 1070 Founders Editions?

I just completed a run of Heaven 4.0 at +175/+500 on my 1070, and I'm trying to get a baseline comparison.


----------



## pez

Quote:


> Originally Posted by *H4wk*
> 
> I dont know, they are so loud I can hear them spinning all three of them, that air noise is simply crazy they are like small turbines spinning up in games
> FE was much quieter overall TBH.
> Maybe I will play with fan curve or something but I think it is stupid that out of the box it is louder than reference isn't it ?


What is your cooling setup like, case-wise? My 1080 G1 never gets audible in any game (i.e. <50% fan while gaming). Also, there was a firmware update for the 1080 G1 that had something to do with the fans. Be sure there isn't a similar one for your 1070 G1.


----------



## Ecks9T

Quote:


> Originally Posted by *Mad Pistol*
> 
> What most people getting for benchable levels on their 1070 Founders Editions?
> 
> I just completed a run of Heaven 4.0 at +175/+500 on my 1070, and I'm trying to get a baseline comparison.


I can run Heaven 4.0 with my FE card tonight when I get home from work so we can compare.


----------



## Ysbzqu6572

Quote:


> Originally Posted by *pez*
> 
> Quote:
> 
> 
> 
> Originally Posted by *H4wk*
> 
> I dont know, they are so loud I can hear them spinning all three of them, that air noise is simply crazy they are like small turbines spinning up in games
> FE was much quieter overall TBH.
> Maybe I will play with fan curve or something but I think it is stupid that out of the box it is louder than reference isn't it ?
> 
> 
> 
> What is your cooling setup like, case-wise? My 1080 G1 never gets audible for any game (I.e. <50% fans while gaming). Also, there was a firmware update for the 1080 G1 that had something to do with the fans. Be sure there isn't a similar one for your 1070 G1.

SilentiumPC Gladius M35 with intake and exhaust fans.


----------



## criznit

@Mad Pistol

Here are my scores with +100 cv, +200 core and +500 mem


edit: Quoted wrong person


----------



## criznit

And here is my 1070 FE running at +175/+500


----------



## Ranguvar

Then here's my 1070 SC at +125 / +685.


----------



## Mad Pistol

Quote:


> Originally Posted by *criznit*
> 
> And here is my 1070 FE running at +175/+500


Hmmmmm

I bet it's my second screen and monitoring software that is bringing the score down slightly. I will have to revise this.

Still, that gives me an idea of where I'm at. Specifically, what are the max benchable levels you can hit on your FEs?


----------



## criznit

I'm still toying with it, but I can do +210 on core and +550 on memory. It really didn't raise my scores much, but I'm able to run a pass (doubt it would survive multiple passes) with those settings.


----------



## Ecks9T

Quote:


> Originally Posted by *criznit*
> 
> I'm still toying with it but I can do +210 on core and +550 on memory. It really didn't up my scores too much but I'm able to run a pass (doubt it will run multiple passes) with those settings


From what I've tried benching with 3DMark, anything over +200 for me will cause the drivers to crash.


----------



## Eorzean

Quote:


> Originally Posted by *HyeVltg3*
> 
> Actually just came here to ask about the X and Non-X variants.
> I didnt notice there were two different ones, got the non-X and same place Newegg, thats probably why our Totals are off.
> 
> *Main Q*, are the MSI Gaming X just better binned cards or are you just paying the extra for the factory overclock? would hate find out I grabbed the lesser binned card to save $15 =/ (well mainly I didnt know there were Non-X versions, haha)


I've been reading reviews and saw mentions of silver IO plates on the non-X variants. Too lazy to check the X reviews right now to see if anyone has received a silver, non-finished IO plate for the price premium, though (which is possible). I just know that the X on the box will help with selling it down the road... I highly doubt they're higher binned, just because clock speeds aren't any higher than you'd get on an FE... It was just $15 more before tax and the standard wasn't in stock at the time, so I just said eff it and made the purchase.

I'm just having doubts performance-wise (especially at 1440p)... but knowing the next Titan/1080 Ti will probably slam the 1080, I most likely made the best decision ATM because the 1070 will be easier to sell than the 1080 here... Ottawa is cheapskate central.

This is basically my stop-gap 'til something more capable of running ultrawide 1440p comes out.


----------



## HyeVltg3

Quote:


> Originally Posted by *Eorzean*
> 
> I've been reading reviews and saw mentions with silver IO plates for the non-X variants. Too lazy to check X reviews right now to see if anyone new has received a silver, non-finished IO plate for the price premium, though (which is possible). I just know that the X on the box will help with selling it down the road... I highly doubt they're higher binned just because clockspeeds aren't any higher than you'd get on an FE... It was just $15 more before tax and the standard wasn't in stock at the time, so I just said eff it and made the purchase.
> 
> I'm just having doubts performance wise (especially at 1440p)... but knowing the next Titan/1080 Ti will probably slam the 1080, I most likely made the best decision ATM because the 1070 will be easier to sell than the 1080 here... Ottawa is cheapskate central.
> 
> This is basically my stop-gap 'til something more capable of running ultrawide 1440p comes out.


Yes, that was my main reason to jump on the 1070 instead of the 1080: with the pattern Nvidia has been following, a Ti will always trump an x80, and the x70s are just great bang/buck cards. So it leaves me open to jump on either a 1080 Ti or Vega in the future, instead of going from a 1080 to something only 15-20% faster, or worse, less.
Let's hope the 1070 does great at 1440p; I just recently upgraded from the 1080p lifestyle to 1440p, so I'm excited.

Ordering to Montreal. I hope the 1070 resells well, haha; not seeing any listings on ebay.ca, which makes you wonder what's up.

A silver IO plate... isn't that normal? I don't think I would mind; I'd love a silver/metal backplate. Getting bored of the same ol' same ol' black or gun-metal-ish colours. I don't really care about the design, because my thin case and Hyper 212 block the backplate of any card I've stuck in there.

Also planning to grab an RX 480 to test and stick in my other semi-HTPC rig. It would have been expensive if I had started with a 1080; the lowest I could find was $921 total.

EDIT:
AHHHHHHHHHHHHHHHH WHAAATT

https://www.reddit.com/r/4pwgqf/mods_can_we_get_a_sticky_for_canadians_about/

Canada Post's going on strike? Right when I order the cards, c'mon!


----------



## pez

Quote:


> Originally Posted by *H4wk*
> 
> SilentiumPC Gladius M35 with intake and exhaust fans.


Check for that BIOS update for sure, but if that doesn't resolve it and you're seeing temps that cause the fans to spin up that much, I'd wager you're not getting enough airflow through your case to expel the heat.


----------



## Ysbzqu6572

I saw temps around 70 Celsius, so I do not think the temps are an issue... the fans are just loud, to be honest. I am going to try a custom fan curve.


----------



## Ecks9T

This is my "base run" at +100/+400, but I fear I have a lemon. I tried going to my max of +200/+500 and the score did not change at all.











The only time the score "dropped" was when I ran the benchmark at 1440p, scoring 1451.
*Edit: also tried maxing my voltage at +100 instead of +10; it didn't do anything*


----------



## pez

Quote:


> Originally Posted by *H4wk*
> 
> I saw temps around 70 celsius so I do not think the temps are an issue.. fans are loud to be honest.. I am going to try custom fan curve.


So the question still remains: what BIOS are you running on the card?


----------



## Ysbzqu6572

Quote:


> Originally Posted by *pez*
> 
> So the questions still remains; what BIOS are you running on the card?


http://i.imgur.com/nY2fGiC.gif
86.04.1E.00.68

I found no update on Gigabyte site for BIOS


----------



## pez

Quote:


> Originally Posted by *H4wk*
> 
> http://i.imgur.com/nY2fGiC.gif
> 86.04.1E.00.68
> 
> I found no update on Gigabyte site for BIOS


Yep, I double-checked and am only seeing BIOS for the 1080.


----------



## Ysbzqu6572

I downloaded their XTREME app and set the fan profile to SILENT; that put it at around 40-50% fan speed at load.
By default the fan speed under load was 70%!!!!!!!! That's why it is loud as hell.

I set my custom fan profile in MSI Afterburner as follows: http://i.imgur.com/TQcmUxo.png
Now it's much better.
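A custom curve like the one linked is just a set of temperature-to-fan-percent control points with interpolation between them. A minimal sketch of that behavior, with assumed control points rather than the ones in the screenshot:

```python
# Sketch of how a custom fan curve behaves: control points (temp C -> fan %)
# with linear interpolation between them. Points are illustrative only.
CURVE = [(30, 20), (50, 35), (65, 50), (80, 80)]

def fan_percent(temp_c):
    """Interpolate the fan % for a temperature along the control points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]       # below the curve: floor speed
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]      # above the curve: ceiling speed
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(70))  # 60.0 -> quieter at load than the stock 70% default
```

The win over the stock profile is exactly what's described above: load temps around 70C land mid-curve instead of pinning the fans at 70%.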


----------



## pez

Quote:


> Originally Posted by *H4wk*
> 
> I downloaded their XTREME app and set fan profile to SILENT, it was set to around 40-50% fan speed at load
> By default fan speed during load was 70% !!!!!!!! That's why it is so loud as hell.
> 
> I set my custom fan profile in MSI afterburner as follows http://i.imgur.com/TQcmUxo.png
> now its much better


Interesting. That's good info to know. Quite strange, too, considering the 1070 is the cooler-running card.


----------



## boldenc

Has anyone tried the Zotac GTX 1070 AMP? I would like to know what temps this card hits under load.


----------



## Ysbzqu6572

Quote:


> Originally Posted by *Pragmatist*
> 
> MSI ARMOR.


Weird... my 1070 produces higher framerates and I do not even have an unlocked CPU.
The GPU is boosting at 2025-2038MHz.

http://www.3dmark.com/3dm/12733431


----------



## Pragmatist

It seems like your card is better than mine, since you have a higher graphics score. I can't go further than +125 on the core; if I do, it just crashes. I wonder if I can use the Gaming X 1070 BIOS on the Armor? https://www.msi.com/Graphics-card/support/GeForce-GTX-1070-GAMING-X-8G.html#down-bios


----------



## 303869

Hi all, I have just received my GTX 1070 Founders Edition, but I received the MSI one instead of the EVGA one which I bought. Are they identical? Should I return it or keep it?


----------



## Ysbzqu6572

I think EVGA has much better customer support and warranty, so it is up to you, but EVGA should be better covered.
Out of curiosity, how did it happen that they sent you the MSI instead of the EVGA one?


----------



## 303869

Quote:


> Originally Posted by *H4wk*
> 
> I think EVGA has much better customer support and warranty.. therefore it is up to you I think, but EVGA should be better covered.
> Out of curiosity.. how did that happen that they gave you MSI instead of EVGA one


Yeah, that's what I've thought in the past, but both cards do have identical warranties. I think the MSI was in stock and the EVGA one wasn't, so they just sent me the MSI without informing or asking me first. I've just sent the company (it's Scan.co.uk, btw) an email demanding some compensation if I keep the card.


----------



## BulletSponge

Quote:


> Originally Posted by *H4wk*
> 
> I think EVGA has much better customer support and warranty.. therefore it is up to you I think, but EVGA should be better covered.
> Out of curiosity.. how did that happen that they gave you MSI instead of EVGA one


I'm not sure if this is still the case but with previous MSI cards the warranty period began the moment the card came off the production line. EVGA's warranty begins at the time of purchase.


----------



## KrAzYtHeBoY




----------



## 303869

Quote:


> Originally Posted by *BulletSponge*
> 
> I'm not sure if this is still the case but with previous MSI cards the warranty period began the moment the card came off the production line. EVGA's warranty begins at the time of purchase.


Ah ok, I have arranged with the company for a return and I will wait for the EVGA card to come back into stock.


----------



## crun

Quote:


> Originally Posted by *criznit*
> 
> And here is my 1070 FE running at +175/+500


I am currently running the same clocks on my FE and have exactly the same score (unless I bench from idle temps; then I can achieve 103.1). I have a modified fan curve, so the card heats up to 81°C at 65% fan.

+200 was crashing from time to time even in benches; +190 crashed once in BF4, plus I saw some strange artifacts in W3 and BF4. Hopefully +175 will remain rock-stable. I was hoping for a 2000+ stable clock, but it does dip to ~1950 in some games.

The best score in Heaven I could achieve was 105.0 FPS with +200/+900, but it was accompanied by disco-flashing artifacts and 90% fan speed.


----------



## killeraxemannic

What are you guys getting for stock boost clocks on your cards? I have the Gigabyte G1 Gaming GTX 1070 and mine boosts up to 2012 MHz in OC mode and 1974 MHz in Gaming mode in the Gigabyte control panel. Is that normal, or does my card just clock well? Does anyone know what determines how high it will clock itself?


----------



## Ubeermench

Enjoying it more than my 970's


----------



## Porter_

Quote:


> Originally Posted by *killeraxemannic*
> 
> What are you guys getting for stock boost clocks on your cards? I have the Gigabyte G1 gaming GTX 1070 and mine is boosting up to 2012 Mhz in OC mode and 1974 on gaming mode on the gigabyte control panel. Is that normal or does my card just clock well? Does anyone know what determines how high it will clock itself?


my single Gigabyte G1 boosts to the upper 1900s but settles pretty quickly at 1924 MHz and holds there for hours of gaming. i'm on the default gaming mode but with the power limit increased to 111% using Afterburner. i'll be installing my second card tonight and that's when i'll start playing with overclocks.

edit: i should note i'm using a custom fan curve that's pretty aggressive; it settles in at 75% fan speed. Corsair 600Q case and using headphones makes noise a non-issue for me.


----------



## Ysbzqu6572

Without any Gigabyte app, mine boosts by default to around 1950, but after some gaming it stays at ~1880.


----------



## Tasm

Quote:


> Originally Posted by *killeraxemannic*
> 
> What are you guys getting for stock boost clocks on your cards? I have the Gigabyte G1 gaming GTX 1070 and mine is boosting up to 2012 Mhz in OC mode and 1974 on gaming mode on the gigabyte control panel. Is that normal or does my card just clock well? Does anyone know what determines how high it will clock itself?



Mine boosts up to 1962 MHz with the Gaming profile.

So yes, it's normal for them to boost well above the rated boost clock.


----------



## Yetyhunter

Guys, please help me decide! Get the FE RIGHT NOW, or wait for a custom design? The price should be the same.


----------



## HyeVltg3

Quote:


> Originally Posted by *Yetyhunter*
> 
> Guys please help me decide ! Get the FE edition RIGHT NOW or wait for a custom design. Price should be the same.


Does Romania not have the AIB variants of the GTX 1070 yet?


----------



## killeraxemannic

Quote:


> Originally Posted by *Tasm*
> 
> Mine Boosts up to 1962 MHz with the Gaming profile.
> 
> So yes, its normal they boost way more than the original boost.


Have you ever seen it go higher than that? That's why I'm wondering about mine: I haven't found anyone who says theirs boosts as high as mine does regularly. I'm trying to figure out what determines the boost, as it seems everyone's card is different.


----------



## criminal

I may have missed it earlier in this thread, so I'm sorry if this is already known. I started having crashes this weekend after updating to the newest beta Afterburner. This was happening in games and benchmarks that were previously stable. It turned out the issue was Afterburner. The event logs never showed Afterburner as the culprit, but on a hunch I figured it was worth a shot to uninstall it. This may have been an isolated issue with my machine, but I wanted to give a heads-up to anyone having an issue and not sure what it could be.


----------



## brettjv

Quote:


> Originally Posted by *Hunched*
> 
> The only way I'll regret this card is IF people find a way to actually take advantage of more than a single 8-pin on Pascal with custom BIOS, at this point it's literally pointless from what I've seen.
> 1x 8-pin 1070's and 1080's OC just as well as 8+6 pin 1070's and 1080's and nothing OC's any better than a Founders Edition. It's ******ed.
> Pascal OC's are 100% determined on the silicon lottery and just get whatever is cheapest is what I've gathered, every card has the same odds at the lottery so far.


I just want to point out something about power plugs ... there's basically zero practical advantage to having more 'pins' or power cables on any of these cards. The limiting factor on how much power these cards will pull is the TDP limiter (Power Limit) coded into the BIOS. If anyone manages to hack a higher limit into a custom BIOS (which I expect they will), everyone will be in the same boat in terms of available 'juice' for overvolting; your pin count won't matter.

IOW, some seem to be under the mistaken impression that there actually is a 'limit' to how much power will flow through an 8-pin cable, a 6-pin cable, etc. For all practical purposes in this application (powering a GPU, esp. a Pascal), there really isn't.

Well ... I should say, there IS a limit, but it's enforced by the PSU itself ... it's called overcurrent protection ... but every decent power supply out there has a good bit more headroom available on the 'rail' (and even more if you have a single +12V rail) that's powering the 8-pin cable(s).

Your PSU, IOW, will almost certainly allow way more than just 150W to flow through an 8-pin cable before it shuts off due to OCP. In fact, I doubt a single one of you is using a PSU that wouldn't allow 200W through a single 8-pin connection, and everyone with a beefy single rail can pull WAY more than that.

Remember also that the TDP on the 1070 is only 180W, so even if you had a power limit at 125% that only works out to 225W ... exactly what an 8-pin (150W) plus the PCI-E slot (75W) is rated for.

And the wires on an 8-pin wouldn't start getting worrisomely hot until you hit at LEAST 250-300W flowing through them, most likely. There's a HUGE safety margin built into the 150W 'rating' on an 8-pin.

As such, I don't recommend anyone 'hold out' for an 8+6-pin card unless you KNOW it also allows a higher Power Limit in its BIOS ... and if you're enough of a tinkerer to swap BIOSes, I wouldn't even worry about it, period; just get the card with the other features and price you want. NOBODY is going to be 'overclocking higher' just by virtue of an extra 6-pin on one of these cards ... but if you're not a 'BIOS swapper' you may want to look for cards with a higher built-in power limit value regardless of pin count.
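The arithmetic in that last bit can be sanity-checked in a few lines: the 'rated' budget is the 75 W PCIe slot plus the nominal rating of each aux connector, compared against TDP times the power-limit slider. A rough sketch using the nominal ATX figures quoted above (these are spec ratings, not hard limits, per the post):

```python
# Nominal power ratings (W) -- conservative spec figures, not hard limits.
PCIE_SLOT = 75
CONNECTOR = {"6-pin": 75, "8-pin": 150}

def board_budget(connectors):
    """Rated power available to the card: slot plus each aux connector."""
    return PCIE_SLOT + sum(CONNECTOR[c] for c in connectors)

def max_draw(tdp_w, power_limit_pct):
    """Maximum draw the BIOS power limiter will allow."""
    return tdp_w * power_limit_pct / 100

# GTX 1070: 180 W TDP, single 8-pin, hypothetical 125% power limit
print(board_budget(["8-pin"]))  # 225
print(max_draw(180, 125))       # 225.0 -- exactly the rated budget
```

Which is the post's point in numbers: even a generous power limit stays inside what a single 8-pin plus the slot is already rated for.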


----------



## Yetyhunter

Quote:


> Originally Posted by *HyeVltg3*
> 
> Does Romania not have the AIB variants of the GTX 1070 yet?


Sadly, no. We barely have the FE in stock.


----------



## Hunched

Quote:


> Originally Posted by *brettjv*
> 
> snip


Whoa, thanks for all the info!
I thought I was compromising on the power delivery; now I feel better about my choice.

It should arrive by Thursday hopefully!


----------



## Eorzean

Quote:


> Originally Posted by *HyeVltg3*
> 
> AHHHHHHHHHHHHHHHH WHAAATT
> 
> __
> https://www.reddit.com/r/4pwgqf/mods_can_we_get_a_sticky_for_canadians_about/
> 
> Canada Post's going on strike? right when order the cards, cmon!


Mine was shipped via Purolator, and despite some saying it's actually Canada Post, my tracking number doesn't work on CP's website and appears to be a legit Purolator number. Guess we'll know in a few days. Fingers crossed they don't get stuck in transit.


----------



## pez

Quote:


> Originally Posted by *killeraxemannic*
> 
> Have you ever seen it go higher than that? That's why I am wondering about mine. I haven't found anyone who has said that theirs boosts to what mine does regularly. I am trying to figure out what determines the boost as it seems everyone's card is different.


I notice quite a bit that people who can keep the card cooler (i.e. good case airflow), with a little silicon-lottery luck, usually get the higher boost out of the gate. It seems that 2.1GHz is the ceiling on both the 1070 and 1080. My 1080 G1 currently will boost all day to 1911+ depending on the title with the stock 'OC'.
Quote:


> Originally Posted by *Yetyhunter*
> 
> Sadly no. We barely have the fe in stock.


If it's the same price as the AIBs coming and you don't mind the extra noise, I say go for it. I have a 1080 G1 and a 1070 FE and the FE has a weight to it that makes it feel really premium. People will complain about the plastic backplate, but the card really is something special to see and hold.


----------



## Yetyhunter

Quote:


> Originally Posted by *pez*
> 
> I notice quite a bit that people who can keep the card cooler (i.e. good case airflow) and a little silicon lottery luck, they usually get the higher boost out of the gate. It seems that 2.1GHz is the ceiling on both the 1070 and 1080. My 1080 G1 currently will boost all day to 1911+ depending on the title with the stock 'OC'.
> If it's the same price as the AIBs coming and you don't mind the extra noise, I say go for it. I have a 1080 G1 and a 1070 FE and the FE has a weight to it that makes it feel really premium. People will complain about the plastic backplate, but the card really is something special to see and hold.


And how well does it overclock? I know all cards are different, but do you have about the same chance of getting a 2000 MHz boost clock as with any other AIB version?


----------



## pez

Quote:


> Originally Posted by *Yetyhunter*
> 
> And how well does it overclock ? I know that all cards are different but do you have about the same chances of getting 2000 mhz boost clock as any other AIB versions ?


Obviously I can't answer for all owners as some get north of 2.1GHz and some only hit mid-to-high 1900s. So far I can deduce that with good case airflow/card temps, and sometimes an aggressive fan curve, you can pretty much hit that. I haven't decided to do much OC'ing yet as the gains past 1900 seem to bring diminishing returns. I don't benchmark, rather I play games, so I have no reason to really OC much if there's not a gain happening that makes it worthwhile.


----------



## Whiskas

Quote:


> Originally Posted by *killeraxemannic*
> 
> Have you ever seen it go higher than that? That's why I am wondering about mine. I haven't found anyone who has said that theirs boosts to what mine does regularly. I am trying to figure out what determines the boost as it seems everyone's card is different.


Isn't that one of the things determined by ASIC quality? I can't find the link now, but I've read a test where a higher ASIC led to a higher default boost.


----------



## Airrick10

Still no word on when the Asus cards will be on Newegg? I'm curious how well these Strix cards will overclock.


----------



## brettjv

@Hunched
NP mate ... yeah, a common misunderstanding is that there's actually a 'hard limit' of 150W on the 8-pin and 75W on the 6-pin ... these are all just 'ratings' that are extremely conservative with regard to safety ... and the GPU makers will 'adhere' to these ratings when designing cards and attach plugs accordingly based on TDP, to pass UL regs and ATX specs ... but in reality all power supplies will allow somewhere between 'a lot' and 'a ton' more juice through them before kicking off due to OCP ... if the device they're powering is simply 'requesting' it.

So, if you've got an unrestricted BIOS on your card that's saying 'screw rated TDP, I want more juice', there's absolutely nothing stopping your single 8-pin cable from providing the requested juice unless you have a ridiculously sensitive OCP circuit on your PSU.

And there's no actual 'danger' unless you start drawing WAY, WAY above the 150W rated current, that danger being your wires get red hot and melt the shielding, etc.

Think for example about your vacuum or your iron ... running on a cable that is not even as big as your 8-pin cable. You think your GPU is pulling the kind of power that your iron is, at full heat? No way. You could probably run any single-GPU card ever created (and most dual-GPU ones) on a single 8-pin (at least on a beefy, single-rail PSU) ... it's just that your wires would start getting warm.


----------



## brettjv

Quote:


> Originally Posted by *killeraxemannic*
> 
> Have you ever seen it go higher than that? That's why I am wondering about mine. I haven't found anyone who has said that theirs boosts to what mine does regularly. I am trying to figure out what determines the boost as it seems everyone's card is different.


It'll be interesting to see if there's any common thread.

I remember with Kepler there was what I called 'the Kepler Boost', which was a fixed value that a given card would always clock up to (above and beyond the 'boost clock', per GPU-Z), UNLESS it was being limited by thermals or TDP.

EVERY card had some set amount of Kepler Boost when it came from the factory, and it was always a multiple of 13MHz on the core. Some people had a 26MHz boost, some 104MHz, some 130MHz ... one person I think even had a 152MHz Kepler Boost ... and he was also able to OC the HAY-ELL out of his card in the end. Mine I think has a 130MHz boost, and it turned out to clock exceptionally well at the top end.

Again, this was based on measuring what your card would run at ABOVE the specified 'boost clock'. Every card was different back with Kepler too ... well, I mean, apart from the Kepler Boost always being a multiple of 13MHz.

Although it seems that Maxwell and Pascal are a LOT more sensitive to downclocking, I would GUESS that a similar phenomenon is still going on with both these generations ... everybody's card has a 'set amount' it can auto-boost above its rated 'boost clock' ... provided there's no other limiting factor.

And the card YOU got looks to have a very, very high 'Pascal Boost' rating, so I'm betting you'll find your top end is VERY high ... if the same logic holds true with Pascal as it did with Kepler ...
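If the 13 MHz-bin idea carries over to Pascal, you could estimate a card's 'factory boost' by snapping the delta between its observed clock and its rated boost clock to the nearest bin. A toy sketch under that (unverified) assumption; 1683 MHz is the GTX 1070 reference boost clock, and 2012 MHz is just an example observed clock:

```python
BIN_MHZ = 13  # Kepler-era boost granularity from the post above; NOT verified for Pascal

def boost_bins(rated_boost_mhz: int, observed_mhz: int) -> int:
    """Whole 13 MHz bins the card auto-boosts above its rated boost clock."""
    return round((observed_mhz - rated_boost_mhz) / BIN_MHZ)

# GTX 1070 reference boost clock is 1683 MHz; 2012 MHz observed in-game:
bins = boost_bins(1683, 2012)
print(bins, bins * BIN_MHZ)  # 25 bins, i.e. roughly +325 MHz of factory boost
```

Comparing that bin count across cards would be one way to look for the 'common thread' mentioned above.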


----------



## killeraxemannic

Quote:


> Originally Posted by *brettjv*
> 
> It'll be interesting to see if there's any common thread.
> 
> I remember with Kepler there was what I called 'the Kepler Boost', which was a fixed value that a given card would always clock up to (above and beyond the 'boost' clock', per GPU-Z), UNLESS it was being limited by thermals or TDP.
> 
> EVERY card had some set amount of Kepler Boost when it came from the factory, and it was always a multiple of 13MHz core. Some people had a 26MHz boost, some a 104MHz, some a 130MHz ... one person I think even had a 152MHz Kepler Boost ... and he also was able to OC the HAY-ELL out his card in the end. Mine I think has a 130MHz boost, and it turned out to clock exceptionally at the top end.
> 
> Again, this was based on measuring what your card would run at ABOVE the specified 'boost clock'. Every card was different back with Kepler too ... well, I mean, apart from the Kepler boost always being a multiple of 13MHz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ALthough it seems that Maxwell and Pascal are a LOT more sensitive to downclocking, I would GUESS that a similar phenomenon is still going on with both these generations ... everybody's cards has a 'set amount' they can auto-boost above their rated 'boost clock' ... provided there's no other limiting factor.
> 
> And the card YOU got ... looks to have a very very high 'Pascal Boost' rating, so I'm betting you'll find your top-end is VERY high ... if the same logic holds true with Pascal as did with Kepler ...


Is there any way to check the boost rating on my card? I was doing some experimenting last night ... it is definitely temperature-related. If I kick the GPU fans up to turbo in the Gigabyte utility and bump my case fans up a tad off idle, it will boost over 2000 all day. If I set the case fans at idle and leave the GPU fans on auto, it stays around 1974. 65°C seems to be the turning point: once it reaches 65°C it starts to cut the boost back. I did have one game lock up while it was boosting above 2000, but I'm not sure if that was related to the game or the GPU clocks. It seems to have no problem running and completing benchmarks while auto-boosting above 2000. Here's a Fire Strike Ultra run at 2012: http://www.3dmark.com/fs/8991302 I'm sure the score would have been higher if I hadn't turned off my CPU OC the other day.
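The pattern described here (the card shedding boost once it crosses ~65 °C) is consistent with GPU Boost dropping the clock in small fixed steps at temperature thresholds. A rough illustrative model; the threshold temperatures and the 13 MHz step size are guesses for illustration, not NVIDIA's actual tables:

```python
STEP_MHZ = 13                    # assumed bin size, borrowed from the Kepler discussion
THRESHOLDS_C = [65, 72, 78, 83]  # illustrative temps where one bin is shed

def boosted_clock(max_boost_mhz: int, temp_c: float) -> int:
    """Drop one boost bin for each temperature threshold crossed."""
    bins_lost = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return max_boost_mhz - bins_lost * STEP_MHZ

print(boosted_clock(2012, 60))  # 2012 -- full boost below 65 °C
print(boosted_clock(2012, 66))  # 1999 -- one bin shed past the first threshold
```

Under a model like this, turbo fans keep the core under the first threshold and the card holds its full boost, which matches the behavior observed above.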


----------



## Ysbzqu6572

Anybody here with a G-Sync monitor?
Can somebody confirm whether the new 1070/1080 drivers are causing G-Sync to glitch?
For example, in this video at Dragon's Dogma startup ... it happens no matter whether G-Sync is set to fullscreen or windowed/fullscreen; when G-Sync is disabled it does not do this.


----------



## Pragmatist

Quote:


> Originally Posted by *H4wk*
> 
> Anybody here with G-Sync monitor ?
> Can somebody confirm that possibly new 1070/1080 drivers are causing G-Sync to glitch ?
> As example in this video at Dragon's Dogma startup.. no matter whether g-sync is set to fullscreen or windowed/fullscreen.. when G-Sync is disabled it does not do this.


I have G-sync activated (PG279Q) and I don't have any flickering issues at all. Does it only happen at start up, or do you experience it whilst gaming as well?


----------



## Ysbzqu6572

Nope, not in games, just in 2D stuff like menus, and after it happens in a menu it also happens on the desktop.

EDIT:

Seems this is related to MSI Afterburner ... with it turned off it hasn't done it so far.


----------



## Pragmatist

Quote:


> Originally Posted by *criminal*
> 
> I may have missed it earlier in this thread, so I am sorry if this is known already. I started having crashes this weekend after updating to the newest beta Afterburner. This was happening in games and benchmarks that previously had stability. Come to find out the issue had to do with Afterburner. Event logs never showed Afterburner as the culprit, but on a hunch I figured it was worth a shot to uninstall. This may have been an isolated issue with my machine, but wanted to give a heads up to anyone having an issue and not sure what it could be.


This actually worked for me as well, so thanks for the heads up.


----------



## StarGazerLeon

I am quite impressed with the 1070 Gaming X so far. Without touching anything in MSI Afterburner, my card boosts to 1980-odd MHz. I now run it at +100 MHz, 126% power limit, and a fixed fan speed of 50% (I literally cannot hear it; I have to open up my case and put my ear against it, and even then it's unreal how quiet it is). My max boost is now 2076 MHz, and it settles back down to 2032 MHz after hours at max load, 1440p gameplay, at stock voltage. The absolute maximum temperature the card has reached is precisely 70°C in a hot room. I can't imagine how cool it will run when winter gets here, haha.

What core numbers are those of you with MSI 1070 Gaming Xs seeing? Just curious.


----------



## criminal

Quote:


> Originally Posted by *Pragmatist*
> 
> This actually worked for me as well, so thanks for the heads up.


Awesome. Glad the info helped someone.


----------



## killeraxemannic

Quote:


> Originally Posted by *H4wk*
> 
> Anybody here with G-Sync monitor ?
> Can somebody confirm that possibly new 1070/1080 drivers are causing G-Sync to glitch ?
> As example in this video at Dragon's Dogma startup.. no matter whether g-sync is set to fullscreen or windowed/fullscreen.. when G-Sync is disabled it does not do this.


I have the same monitor and do not have any issues, using the DP cable that came with the monitor. Just saw your edit. MSI AB has always caused issues for me as far back as I can remember; I think it's crap software, just like MSI's products. I wouldn't use it.


----------



## Ysbzqu6572

Thanks, yeah, it seems it is related to Afterburner. I will test more tomorrow, but after I quit Afterburner I didn't get the issue while launching Dragon's Dogma.

Do you guys have any alternative for fan profiles, OC, and so on? Is the Gigabyte Xtreme app good?


----------



## killeraxemannic

Quote:


> Originally Posted by *H4wk*
> 
> Thanks, yeah It seems that it is related to afterburner.. I will test more tomorrow but after I quit afterburner I didn't got such issue while launching Dragon's Dogma
> 
> Do you guys have any alternative for fan profiles.. OC and so on ? Is Gigabyte Xtreme app good ?


I am currently using Xtreme app version 1.02 and it has been perfect so far. The original release had an OSD built in, but it was causing issues, so they dropped it in the next release.


----------



## Tasm

Quote:


> Originally Posted by *StarGazerLeon*
> 
> I am quite impressed with the 1070 Gaming X so far. Without touching anything in MSI Afterburner, my card boosts to 1980 odd Mhz. I now run it at +100Mhz, 126%, and a fixed fan speed of 50% (I literally cannot hear it; I have to open up my case and put my ear up against it, and even then it's soooooo quiet it's unreal.) My max boost is now 2076Mhz, and it settles back down to 2032Mhz after hours at max load, 1440p gameplay, at stock voltage. The absolute max temperature the card has reached is precisely 70'C in a hot room. I can't imagine how cool it will be when the winter gets here, haha.
> 
> What core numbers are those of you with MSI 1070 Gaming Xs seeing? Just curious.


Mine is struggling to do +50 MHz at the moment ... I am thinking of sending it back.


----------



## AuraNova

Alright, so I added my card to the list. I don't have much other info right now, not even pics. It's in my current system, which was moved to another, older case. It's kind of all over the place because it was the only case I had that would even fit the card. Thank goodness for removable drive bays. Anyway, carry on.


----------



## Sea Otter

As some might know, Newegg had a strict "replacement only" return policy with the 1070s they were selling. I originally bought the Gigabyte Founders Edition, and deeply regretted it after getting 85C load temps. Decided to take a chance and talk to Customer Care. They decided to refund my entire purchase, and I ended up buying the EVGA 1070 SC (AIB) card. All in all turned out to be $20 cheaper than the FE, and I'll be working with drastically lower temps. Pretty happy with Newegg right now!


----------



## jopy

Just got mine.

Anyone have any idea how to make the RGB LED work? I can't tweak it with the Gigabyte software.

Thanks!


----------



## prey1337

Quote:


> Originally Posted by *jopy*
> 
> just gotten mine,
> 
> 
> 
> 
> 
> 
> 
> 
> anyone have any idea how to make the rbg led to works, cant tweak it with the gigabyte software.
> 
> thks thks


Should work just fine using their Xtreme Engine utility.

The G1 was too long for my full-size (7-year-old) case, so I sold it to my dad and installed it for him. The RGB function worked easily.

My EVGA 1070 SC should be here at the end of this week. I know that one will fit for sure.


----------



## buttface420

Just got this in the mail today; gonna open it up and install it now.


----------



## jopy

Quote:


> Originally Posted by *prey1337*
> 
> Should work just fine using their Xtreme Engine utility.
> 
> The G1 was too long for my full size (7 year old) case.
> So I sold it to my dad and installed it for him. RGB function worked easily.
> 
> My EVGA 1070 SC should be here at the end of this week. I know that one will fit for sure.


Did you install the utility first or the NVIDIA driver first?

Maybe I'll try DDU to purge the NVIDIA driver, then install the Gigabyte engine first.


----------



## AuraNova

Quote:


> Originally Posted by *jopy*
> 
> just gotten mine,
> 
> 
> 
> 
> 
> 
> 
> 
> anyone have any idea how to make the rbg led to works, cant tweak it with the gigabyte software.
> 
> thks thks


Quote:


> Originally Posted by *buttface420*
> 
> just got this in the mail today gonna open and install now


Congrats to both of you on getting your cards. Happy overclocking.


----------



## prey1337

Quote:


> Originally Posted by *jopy*
> 
> did u install the utility first or nvidia driver first?
> 
> maybe ill try DDU and purge the nvdia driver and install the gigabyte engine first.


Driver, then utility.

That's when I adjusted the LEDs.


----------



## jopy

Quote:


> Originally Posted by *AuraNova*
> 
> Congrats to both of you on getting your cards. Happy overclocking.


Only for crunching numbers.

Won't really need the extra power till I get a 1440p display.
Quote:


> Originally Posted by *prey1337*
> 
> Driver, then Utility.
> 
> That's when I adjusted the LED's.


Thanks, will try reinstalling everything later.


----------



## buttface420

I just installed it and went to check the ASIC quality in GPU-Z, and it just says "ASIC quality: not supported on this card."

What's up with that lol?


----------



## Mad Pistol

Quote:


> Originally Posted by *Sea Otter*
> 
> As some might know, Newegg had a strict "replacement only" return policy with the 1070s they were selling. I originally bought the Gigabyte Founders Edition, and deeply regretted it after getting 85C load temps. Decided to take a chance and talk to Customer Care. They decided to refund my entire purchase, and I ended up buying the EVGA 1070 SC (AIB) card. All in all turned out to be $20 cheaper than the FE, and I'll be working with drastically lower temps. Pretty happy with Newegg right now!


That's really good to know. I've been doing business strictly with Amazon over the past few years simply because of their no-hassle return policy, amazing shipping (through prime), and customer care team. I may start giving Newegg a closer look again.


----------



## pez

Quote:


> Originally Posted by *jopy*
> 
> just gotten mine,
> 
> 
> 
> 
> 
> 
> 
> 
> anyone have any idea how to make the rbg led to works, cant tweak it with the gigabyte software.
> 
> thks thks


I've only had this happen once and an uninstall of the utility didn't work, but reseating the card did. Haven't had the same issue since *knock on wood*.


----------



## jopy

Quote:


> Originally Posted by *pez*
> 
> I've only had this happen once and an uninstall of the utility didn't work, but reseating the card did. Haven't had the same issue since *knock on wood*.


Seems to have been some conflict with my cooler's USB header; I swapped to another internal USB header and the Gigabyte engine is working fine now.


----------



## pez

Well, that's interesting. What made you try that? I wouldn't have thought to mess with it unless it was making weird contact.


----------



## Ysbzqu6572

Quote:


> Originally Posted by *H4wk*
> 
> Thanks, yeah It seems that it is related to afterburner.. I will test more tomorrow but after I quit afterburner I didn't got such issue while launching Dragon's Dogma
> 
> Do you guys have any alternative for fan profiles.. OC and so on ? Is Gigabyte Xtreme app good ?


It is not related to Afterburner, unfortunately ... once I enable G-Sync it starts doing it in the Dragon's Dogma menu again, with or without Afterburner.
It actually seems to be related to the *drivers* ... more users are reporting the same issue: https://forums.geforce.com/default/topic/939358/geforce-1000-series/gtx-1080-flickering-issue/

Edit: The hotfix driver mentioned in the thread seems to have helped.


----------



## jopy

Quote:


> Originally Posted by *pez*
> 
> Well that's interesting. What made you try that out? I wouldn't have thought to mess with it unless it was making weird contact.


Because that Antec 920 cooler is the most problematic junk I've ever used in my life LOL


----------



## pez

Quote:


> Originally Posted by *jopy*
> 
> Because that antec 920 cooler is the most problematic junk ive ever used in my life LOL


LOL. It's ok, us air-cooled boys will gladly welcome you with open arms and silent PCs.


----------



## jopy

Quote:


> Originally Posted by *pez*
> 
> LOL. It's ok, us air-cooled boys will gladly welcome you with open arms and silent PCs
> 
> 
> 
> 
> 
> 
> 
> .


I'm seriously considering swapping it for a Cryorig H5, just for some light OCing.
I've had enough of bad software.


----------



## pez

Quote:


> Originally Posted by *jopy*
> 
> im srsly considering swapping it with Cryorig H5 just for some light ocing.
> have enough of bad software


Those header cables always seemed rather annoying. I've never personally used one, but when I plan my builds I always try to think of where every cable is going to go, and those CLC header cables always seemed to be the one I could never figure out how to make aesthetically acceptable. Maybe it's time to spring for a custom loop, though.


----------



## jopy

Quote:


> Originally Posted by *pez*
> 
> Those header cables always seemed rather annoying. I've never personally used one, but when I plan my builds, I always try to think of where every cable is going to go, and those CLC header cables always seem to be one that I could never figure out how to make it aesthetically acceptable. Maybe it's time to spring for a custom loop, though
> 
> 
> 
> 
> 
> 
> 
> .


A custom loop is overkill for my usage lol.


----------



## prey1337

And I'm here still running a Hyper 212 Plus; it seems to be working just fine with my i7 920 OC'ed to 3.8GHz.

I did some heatsink cleaning and fan relocating last night; I'm having to make room for the EVGA 1070 that's on the way.


----------



## moustang

Quote:


> Originally Posted by *pez*
> 
> Those header cables always seemed rather annoying. I've never personally used one, but when I plan my builds, I always try to think of where every cable is going to go, and those CLC header cables always seem to be the one that I could never figure out how to make aesthetically acceptable. Maybe it's time to spring for a custom loop, though.


Header cables on a CLC were never a problem for me. The hoses are impossible to hide, but the header cables....



Not a problem. The one from the CPU is the only one that is visible at all, and if I had rotated the pump 90 degrees it would be almost totally hidden as well.

EDIT:

In case you're wondering, the wire lying across the power supply goes to the 230mm fan that is mounted in the side panel that's removed.


----------



## Airrick10

Pulled the trigger on an MSI GTX 1070 Gaming X and it should be here tomorrow! Finally I'll be able to retire my MSI 660 Tis in SLI. Since Newegg had low stock on all the 1070s, I ended up getting mine from superbiiz.com. I was hesitant at first, but they had good customer ratings, so I said why not. They have the same prices as Newegg and had a 10% off code for the 4th of July, so that helped with the shipping fees.


----------



## buttface420

Quote:


> Originally Posted by *Airrick10*
> 
> Pulled the trigger on an MSI GTX 1070 Gaming X and should be here tomorrow! Finally I'll be able to retire my MSI 660Ti's in Sli. Since newegg had low stock on all the 1070's, I ended up getting mine from superbiiz.com. I was hesitant at first but they had good customer ratings so said why not. They have the same prices as newegg and had a 10% off code for 4th of July so that helped with the shipping fees.


Congrats dude, you're gonna love that card. I just got one yesterday. I didn't even know the MSI logo is RGB-lit; it didn't say anything about it on the box. At stock it got a graphics score of 19216 in Fire Strike, which completely destroys my 390's 12000 and beats my 980 Ti SC's 17800.


----------



## deegzor

Got mine 2 weeks ago, it was working perfectly, then I got a black screen (the PC otherwise works; I checked with integrated graphics and it's fine). Opened up the card and found this -> https://s31.postimg.org/iiqy99bqj/IMG_20160628_001125.jpg Any chance for an RMA, or any way to fix it?

----------



## pez

Quote:


> Originally Posted by *jopy*
> 
> custom loop overkill for my usage lol.


Maybe so, but it's always a cool project to take on if you ever feel like you can't upgrade anymore, but want to.
Quote:


> Originally Posted by *moustang*
> 
> Header cables on a CLC was never a problem for me. The hoses are impossible to hide, but the header cables....
> 
> 
> 
> Not a problem. The one from the CPU is the only one that is visible at all, and if I had rotated the pump 90 degrees it would be almost totally hidden as well.
> 
> EDIT:
> 
> In case you're wondering, the wire lying across the power supply goes to the 230mm fan that is mounted in the side panel that's removed.


I'm a tad more picky in my cable management, so from a very attention-to-detail standpoint, it still bothers me. Hell, I wish more motherboards used one of those adapters so I could connect those tiny header cables away from the motherboard.


----------



## buttface420

Quote:


> Originally Posted by *deegzor*
> 
> Got mine 2 weeks ago, it was working perfectly, then I got a black screen (the PC otherwise works; I checked with integrated graphics and it's fine). Opened up the card and found this -> https://s31.postimg.org/iiqy99bqj/IMG_20160628_001125.jpg Any chance for an RMA, or any way to fix it?


I would put the card back together, make it look like I never opened it, and then RMA it.


----------



## Airrick10

Quote:


> Originally Posted by *buttface420*
> 
> congrats dude you're gonna love that card i just got one yesterday, i didnt even know the msi logo is RGB lighted, it didnt even say anything about it on the box. at stock it got a graphic score of 19216 in firestrike which completely destroys my 390 which was 12000 and beat my 980 ti sc which was 17800.


Thanks Man!!! Yeah looking forward to this 4th of July weekend to benchmark it and see how it does!


----------



## mickeykool

I just picked up an MSI GeForce GTX 1070 ARMOR 8G OC from Micro Center.

I'm seeing max clocks of 1808 out of the box (didn't do any overclocking yet).

Fire Strike graphics @ 18276


----------



## deegzor




----------



## buttface420

Disregard my 1070 Fire Strike score of over 19000; apparently I didn't see the "Time measuring inaccurate. Results are invalid." note on the score results. Guess I have some figuring out to do.


----------



## criminal

Quote:


> Originally Posted by *buttface420*
> 
> disregard my 1070 firestrike score of over 19000 apparently i didnt see the "Time measuring inaccurate. Results are invalid." on the score results. guess i have some figuring out to do.


Check vram speeds if you are overclocking.
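For anyone double-checking their VRAM speeds, the arithmetic between the offset you dial in and the "effective" rate spec sheets quote can be sketched like this (assumptions: Afterburner-style tools report the GTX 1070's stock GDDR5 at roughly 4004 MHz DDR, and the marketing "effective" figure is double that; the numbers are illustrative, not measurements from this thread's cards):

```python
# Hedged sketch: relating a memory offset to the "effective" GDDR5 data rate.
# Assumption: the OC tool reports the DDR command rate (~4004 MHz stock on a
# GTX 1070), while spec sheets quote the doubled "effective" 8008 MHz figure.
STOCK_REPORTED_MHZ = 4004

def effective_mem_clock(offset_mhz: int) -> int:
    """Effective (marketing) data rate after applying an offset, in MHz."""
    return (STOCK_REPORTED_MHZ + offset_mhz) * 2

print(effective_mem_clock(0))    # 8008 -> stock
print(effective_mem_clock(250))  # 8508 -> roughly the "effective 8500 MHz" OCs seen in this thread
```

If a benchmark score drops while the reported clock keeps climbing, GDDR5 error correction is likely kicking in, which is one way to spot the "memory hole" people mention.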


----------



## Hunched

So Newegg screwed everything up.
The estimated delivery of today - incorrect.
The shipping courier - incorrect.
The information saying it was in transit on Monday - incorrect.

I ordered it when it was in stock and it said it shipped via Purolator on Monday.
I contacted Purolator today as it hasn't arrived, and the tracking number doesn't work on their site despite showing info on Newegg. They said it is a Canada Post tracking number.
I contacted Canada Post, which does have tracking information about a label being submitted but no transit info; they say they have nothing and to contact the sender.
I contacted Newegg to ask *** is happening and they say they have no idea, and that they're contacting a warehouse to see why this information exists on their site and why it's not with Purolator.

I paid $12 for this shipping when it should have been free, but for whatever reason it isn't covered by Premier.
They've always used Purolator in the past, but it looks like they were going to send it with Canada Post for the first time, DAYS BEFORE THEY GO ON STRIKE FOR WEEKS!

I should have had it today according to all the information. I may have it next week if they get their **** together.

Also, Gigabyte lied to customers about the card I've purchased by showing images of it with a backplate, then stealthily removing the images after tons of people bought it; it doesn't actually include a backplate. They still haven't fixed the false-advertising images on their official website.



Spoiler: Warning: Spoiler!








Also, I just had an EVGA 650 P2 PSU die after 20 days of use, which EVGA thankfully helped replace, and it's fine now.
But jesus *** is my luck with everything PC-related lately... so many problems with my setup and shipping and everything.
Can't things just work properly, please?








What am I going to have to deal with next? DOA 1070?


----------



## totalownership

Hi guys.
I'm seriously looking at the 1070. Was thinking 1080 but really don't see the need to go there with the benchmarks I'm seeing. I'm not 4K gaming and do not have any plans to do so. No real desire at all at this point. I'm coming from an EVGA Classified 780 (non-TI). My requirements will be triple screen racing at 1080p per screen. I'm looking at games like Project Cars, rFactor2 and the like. Any advice?


----------



## Mad Pistol

Quote:


> Originally Posted by *totalownership*
> 
> Hi guys.
> I'm seriously looking at the 1070. Was thinking 1080 but really don't see the need to go there with the benchmarks I'm seeing. I'm not 4K gaming and do not have any plans to do so. No real desire at all at this point. I'm coming from an EVGA Classified 780 (non-TI). My requirements will be triple screen racing at 1080p per screen. I'm looking at games like Project Cars, rFactor2 and the like. Any advice?


Go for it. I ran into a VRAM limitation with my GTX 780 that I had... no such thing with my 1070. It chews up and spits out everything I throw at it @ 3440x1440. The GTX 1070 is an excellent card.


----------



## jopy

Quote:


> Originally Posted by *Hunched*
> 
> So Newegg screwed everything up.
> The estimated delivery of today - incorrect
> The shipping courier - incorrect
> The information saying it was in transit on Monday - incorrect
> 
> I ordered it when it was in stock and it said it shipped via Purolator on Monday.
> I contact Purolator today as its not arrived and the tracking number doesn't work on their site despite it giving info on Newegg. They said it is a Canada Post tracking number.
> I contact Canada Post which does have information in tracking about a label being submitted, but no transit info, they say they have nothing, and to contact the sender.
> I contact Newegg to ask *** is happening and they say they have no idea and that they're contacting a warehouse to see why this information exists on their site, and why it's not with Purolator.
> 
> I paid $12 for this shipping when it should have been free but for whatever reason isn't covered by Premier.
> They've always used Purolator in the past, but it looks like they were going to send it to Canada Post for the first time this time, DAYS BEFORE THEY GO ON STRIKE FOR WEEKS!
> 
> I should have had it today according to all the information. I may have it next week if they get their **** together.
> 
> Also Gigabyte lied to their customers about the card I've purchased by showing images of it with a backplate, and then stealthily removing the images after tons of people bought it to not include the backplate. They still haven't fixed the false advertising images on their official website.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Also, I just had an EVGA 650 P2 PSU die after 20 days of use, which EVGA thankfully helped replace and now is fine.
> But jesus *** is my luck with everything PC related lately... So many problems with my setup and shipping and everything.
> Can't things just work properly please?
> 
> 
> 
> 
> 
> 
> 
> 
> What am I going to have to deal with next? DOA 1070?


Mine has the backplate, not sure what's happening over there.
Did someone steal those backplates from the warehouse?


----------



## Airrick10

Quote:


> Originally Posted by *jopy*
> 
> mine have the backplate, not sure whats happening over there.
> someone stolen those backplates from the warehouse ?


I believe Hunched is talking about the other Gigabyte card. There are two versions out there: the G1 Gaming, and one that doesn't have the RGB or backplate and has a 2-fan cooler instead of the G1 Gaming's 3 fans. But yeah, it sucks that he, along with another guy who left a review on Newegg, was misled by some "side" pictures of the card that gave the impression it had a backplate when it didn't. I think Newegg fixed the pictures, because now they don't show the side picture that leads people to believe it has a backplate.


----------



## jopy

Quote:


> Originally Posted by *Airrick10*
> 
> I believe Hunched is talking abut the other Gigabyte card. There are two versions out there which are the Gaming G1 and the one that doesn't have the RGB, back plate and has a 2 fan cooler instead of 3 fans like the Gaming G1. But yeah it sucks that he along with another guy that left a review on newegg were mislead by some "side" pictures of the card that did give the impression that it had a back-plate when it didn't. I think newegg fixed the pictures because now they don't show the side picture that leads people to belive that it has a back plate.


I see, I couldn't tell the model from the picture he posted.

I looked at Gigabyte's official site, and the side profile does seem to suggest it comes with the backplate.
Blatant misleading advertising :/ tsk tsk
http://www.gigabyte.com/products/product-page.aspx?pid=5922#kf


----------



## Airrick10

Quote:


> Originally Posted by *jopy*
> 
> i see, couldnt tell the between the model from the picture he posted.
> 
> i looked into gigabyte official site, it seem to suggest it comes with the backplate from the side profile.
> blatant misleading advertising :/ tsk tsk
> http://www.gigabyte.com/products/product-page.aspx?pid=5922#kf


I know...it sucks. Hopefully he can get at least a $20 refund of some sort.


----------



## prey1337

Dang, that is messed up, straight from their website and everything.

I would be pretty annoyed as well.


----------



## BulletSponge

Hey Gigabyte...


----------



## Tasm

Did I get the worst GTX 1070 out there?

Because mine won't even do a 70 MHz core overclock, not even with 100% voltage... ***...

Fortunately, it automatically reaches a 2000 MHz boost with just a 20 MHz increase.


----------



## Hunched

Newegg appears to have lost my 1070 in transit and doesn't care.
I contacted them a second time just for some more information, and this second support representative gave me the worst customer support I have ever received, the woman earlier in the day was great.
I guess it's because I used the peasant web chat support instead of the Premier Hotline from my free trial the second time.

They basically told me to piss off and wait until JULY 7th before contacting them again despite it supposed to arrive JUNE 29th.
This would be alright if they could guarantee anything, such as the location of my package or whether or not it is on its way and will be delivered by that date.
They say that isn't a delay and that it is in transit, while the people transporting it guarantee they never got it...

So I decided to contact Canada Post a second time as well, to tell them I did contact the sender (Newegg) to notify them Canada Post never received anything from Newegg and ask if they could contact Newegg or something since apparently Newegg doesn't believe me or care. Unsurprisingly there's nothing Canada Post can do.
Canada Post suggested I request a refund from Newegg for their negligence and I agree.

I've told Newegg that Canada Post doesn't even have my 1070, they ignore that and say it's in transit.
No Newegg, you need to tell me you're going to get my 1070 to Canada Post and fix this, and stop telling me they're already in the process of delivering it to me when they don't even have it.
Better yet, deliver it to me with Purolator as you have with everything I have ever bought, and let's not go with Canada Post THE DAY BEFORE THEY GO ON STRIKE.
I paid for Purolator shipping, not for delivery 2 weeks after the Canada Post strike ends. I knew Canada Post was going to strike a week before I ordered it.

This is so stupid.
They gave me a Purolator tracking number that doesn't even work with Purolator, it's a Canada Post tracking number. They don't even understand what courier they're using to deliver my product.
Then supposedly they know more about the delivery of my 1070 than the company delivering it (when they can't even provide the right tracking number for the right courier), and refuse to listen to anything I say or back up anything they say with any kind of proof.
Prove to me Canada Post has it, since they're telling me they never got it. Don't tell me to sit on my ass until July 7th without telling me you're going to do anything or look into anything.

Nowhere else in Canada even has it in stock so they just get to be dicks and do whatever they want I suppose. Super cool.

I'll let you all know how the Gigabyte GeForce GTX 1070 WindForce OC is, after everyone else on the planet has had theirs for weeks before me.








Should have arrived today, not even in Canada, not even in the process of making it to Canada, Friday is a Canadian Holiday. MAYBE it will somehow get here next week but I doubt it at this negligent rate.

If you're Canadian, go with NCIX if you can if they have stuff in stock, get them to price match or something. They're actually competent and what I've mainly used for everything.


----------



## Sea Otter

Got my second 1070 in today. Ran Fire Strike, here's the results - http://www.3dmark.com/3dm/12804425

Looking pretty awesome, if I do say so myself.


----------



## pez

Yikes, Xtreme Gaming 1080s with QC issues and a misleading Windforce OC 1070 is not a great outlook for Gigabyte. The G1 seems to be the safe bet from Gigabyte so far.


----------



## hemon

Hi,

Do you hit any power limit / throttling when overclocking custom reference cards? I'd like to know if the single 8-pin connector is enough...


----------



## vmanuelgm

Hi, could someone post a Heaven 1080p maxed run with 1070s in SLI at their highest frequencies, to see if they scale better than Maxwell and whether it's possible to reach 200 fps?

Thanks in advance.

Would also like to see the same bench with 2025 on the core and 4250 on the memory in SLI. A powerful processor would also be great (6700K, 5960X, 6900K)!


----------



## Aloc

Hi guys! Could this result be possible at 2025 MHz / 4235 MHz with an i5 6600K?

I saw this result on the n3d Spanish forum and I don't really believe that score of 193 fps average at those frequencies, 2025 MHz (with throttling to 2000 MHz or below) and 4253 MHz memory.

Would anybody here be so kind as to run the same test at these clocks? I would really appreciate the help.

Thanks and salutes from Spain.


----------



## Ysbzqu6572

Why not... with 2 cards.
Mine at 2000 gets over 100 fps with an i5-6500.
2 cards in SLI with a theoretical 180+% scaling could be 190 fps or more.
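The back-of-envelope math behind an estimate like that is just single-card fps times an assumed scaling factor (both numbers below are the illustrative ones quoted in this thread, not measurements):

```python
# Rough SLI scaling estimate: single-card fps times an assumed scaling factor.
single_fps = 105   # "over 100 fps" on one 1070 at ~2000 MHz (illustrative)
scaling = 1.8      # optimistic 180% SLI scaling

print(round(single_fps * scaling))  # 189 -> in the "190 fps or more" ballpark
```

Real scaling varies per game and can land well below 180%, which is why people keep asking for actual SLI runs instead of speculating.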


----------



## Aloc

Then could you share your results in SLI? I don't like to speculate.


----------



## SlackerITGuy

So seeing as the RX 480 did not meet my somewhat optimistic expectations (GTX 980 performance at stock with close to Fury X performance when *heavily* OCed), I have now decided to spend more and go with a GTX 1070.

But I wanted to ask you guys first: after browsing the GeForce.com forums, I've noticed several threads discussing high DPC latency and stuttering with these GP104 cards.

Are you guys experiencing any of this? What do you think?


----------



## Ysbzqu6572

I don't know, I don't notice anything... running Windows 10.


----------



## criminal

Quote:


> Originally Posted by *SlackerITGuy*
> 
> So seeing as the RX 480 did not meet my somewhat optimistic expectations (GTX 980 performance at stock with close to Fury X performance when *heavily* OCed), I have now decided to spend more and go with a GTX 1070.
> 
> But I wanted to ask you guys first: after browsing the GeForce.com forums, I've noticed several threads discussing high DPC latency and stuttering with these GP104 cards.
> 
> Are you guys experiencing any of this? what do you think?


Link? I have noticed some strange latency sometimes, but everything I checked looks good.


----------



## mickeykool

Quote:


> Originally Posted by *criminal*
> 
> Link? I have noticed some strange latency sometimes, but everything I checked looks good.


I think he's referring to this.. https://forums.geforce.com/default/topic/941579/geforce-1000-series/gtx-1080-high-dpc-latency-and-stuttering/


----------



## prey1337

Just read through that whole thread, interesting stuff.

I wonder if this is something solved with future drivers.

Seems like it happens to a small number of people though (or most people don't notice), so I wonder what the real culprit is.

Anyone here with those latency issues?


----------



## mickeykool

I have a 1070 as well; the only game where I get low fps, like 10-15 at startup and throughout the game, is Fallout 4. I haven't really gotten around to figuring out the issue. I will try disabling G-Sync and reinstalling drivers.

Every other game works fine.


----------



## SlackerITGuy

Quote:


> Originally Posted by *mickeykool*
> 
> I think he's referring to this.. https://forums.geforce.com/default/topic/941579/geforce-1000-series/gtx-1080-high-dpc-latency-and-stuttering/


That's the one I was talking about.

There's a few other buried a few pages back as well, but that's the main one as far as I can tell.

*This is worrisome*, last time I had an NVIDIA card (GTX 470 and GTX 670), I had to deal with stuttering and CPU usage issues on some pretty big titles, the biggest ones being Battlefield Bad Company 2 and Battlefield 3. I certainly don't want a repeat of that scenario.


----------



## Mad Pistol

Quote:


> Originally Posted by *SlackerITGuy*
> 
> That's the one I was talking about.
> 
> There's a few other buried a few pages back as well, but that's the main one as far as I can tell.
> 
> *This is worrisome*, last time I had an NVIDIA card (GTX 470 and GTX 670), I had to deal with stuttering and CPU usage issues on some pretty big titles, the biggest ones being Battlefield Bad Company 2 and Battlefield 3. I certainly don't want a repeat of that scenario.


I remember the stuttering issue with BF3 when I had my 660 Ti. I think it may have been a Windows 7 issue, because so far I have found absolutely no performance issues with my 1070 and Windows 10. It was also the same story for my previous GTX 780; no issues.


----------



## criminal

Quote:


> Originally Posted by *Mad Pistol*
> 
> I remember the stuttering issue with BF3 when I had my 660 Ti. I think it may have been a Windows 7 issue, because so far I have found absolutely no performance issues with my 1070 and Windows 10. It was also the same story for my previous GTX 780; no issues.


I am on Windows 10 with my 1070 and didn't notice this issue with my 980. I don't know if what I'm seeing is the same thing that thread is about, but I am/was having the issue in Fallout 4 and Far Cry 4. I've messed around with settings and think I have the issue taken care of in those games, but I still have some strange "mouse lag" when navigating Windows that I can't explain. Not a gigantic issue right now, but still annoying.


----------



## Ecks9T

I have a general question about the card and Windows 10. When I was using Windows 10 after a fresh install, the OS had trouble recognizing the card. Even when I tried to install the drivers using the NVIDIA installer, I received a message that the device is incompatible.


----------



## Wille114

Quote:


> Originally Posted by *Sea Otter*
> 
> Got my second 1070 in today. Ran Fire Strike, here's the results - http://www.3dmark.com/3dm/12804425
> 
> Looking pretty awesome if I say so myself -


Do you have a second ribbon bridge for SLI? Two ribbon bridges will work better than a single one (like an HB SLI bridge).


----------



## n64ADL

Anybody know where to get a good 1070 water block?


----------



## Hunched

My last post about the Newegg situation.
Finally the package has been discovered or sent, Canada Post finally received it today.
I asked Newegg not to send it with Canada Post since there's about a 90% chance they go on strike and my package gets stuck in limbo starting July 2nd, they said they were aware and emailed me about a switch to Purolator.
Yet they've gone ahead and given it to Canada Post anyway, it's now in transit with Canada Post.

So if Canada Post strikes, I'm getting a full refund from Newegg and never using them to buy anything ever again.
I'll stick with NCIX and all the other Canadian retailers that are smart enough to be avoiding using Canada Post because of the upcoming strike.

My Gigabyte 1070 WindForce OC will now arrive next Wednesday on the slim chance Canada Post doesn't strike.
Remember though, that's not a delay according to Newegg. It's only a whole week after the estimated delivery date. Definitely not delayed, definitely on time.


----------



## criminal

Quote:


> Originally Posted by *Hunched*
> 
> My last post about the Newegg situation.
> Finally the package has been discovered or sent, Canada Post finally received it today.
> I asked Newegg not to send it with Canada Post since there's about a 90% chance they go on strike and my package gets stuck in limbo starting July 2nd, they said they were aware and emailed me about a switch to Purolator.
> Yet they've gone ahead and given it to Canada Post anyway, it's now in transit with Canada Post.
> 
> So if Canada Post strikes, I'm getting a full refund from Newegg and never using them to buy anything ever again.
> I'll stick with NCIX and all the other Canadian retailers that are smart enough to be avoiding using Canada Post because of the upcoming strike.
> 
> My Gigabyte 1070 WindForce OC will now arrive next Wednesday on the slim chance Canada Post doesn't strike.
> Remember though, that's not a delay according to Newegg. It's only a whole week after the estimated delivery date. Definitely not delayed, definitely on time.


That sucks. I hardly ever use Newegg anymore. Got burned a couple of times and only buy cheap (under $50) stuff if I can't find it anywhere else.


----------



## Hunched

Quote:


> Originally Posted by *criminal*
> 
> That sucks. I hardly ever use Newegg anymore. Got burned a couple of times and only buy cheap (under $50) stuff if I can't find it anywhere else.


I ordered my Gigabyte 970 G1 from them at launch, along with a handful of other things, and they were fine; I don't know what happened.
It sucks that they're screwing up the most expensive item I have ever purchased for my PC...

They've lied about it being in transit, ignored proof that it wasn't, said they'd do as I requested and go with Purolator as it says on their tracking, then ignored that too and went with Canada Post despite knowing about the strike.
It's like they just couldn't care less; they don't listen to anything you have to say.
I'm ending up taking out my frustration here because I'm remaining civil with them, but it gets nowhere. They're so bad.

I just want my 1070 on time and in one piece, and they aren't doing a thing to make that possible and are screwing everything up.
They say they'll do something, or that something is happening, meanwhile the exact opposite happens and they don't do anything to change it once you're off the phone.

Newegg, your #1 source of lies and unfulfilled requests.
It's a $550 CAD order; you'd think they might give a **** at least slightly.


----------



## Jimbags

I've been really impressed with the acoustics and cooling capability of my GTX 1070 FE. Looks sweet as, too. I had the aircon set to 26°C the other night and played Doom on Ultra at 2560x1440. The highest temp I got was 74°C with the fan at 70%; the clock speed was bouncing between 2000-2100 MHz. Memory was only OC'd to an effective 8500 MHz. Awesome for a stock card IMHO.


----------



## Martyr82

How are GTX 1070 owners connecting to their CATLEAP / Korean monitors, which have a sole DVI-D (dual link) input?

Can I just buy an HDMI > DVI-D converter?


----------



## Jimbags

Quote:


> Originally Posted by *Martyr82*
> 
> How are GTX1070 owners connecting to their CATLEAP / Korean monitors which have a sole DVI-D (Dual Link) input?
> 
> Can I just buy a HDMI > DVI-D converter?


Mine has a DVI-D dual link connector on it? Don't they all? Works fine at 2560x1440 on a Dell UltraSharp.


----------



## Blackcurrent

Just received my MSI 1070 Gaming X. With a little overclocking this is the result I got.


----------



## Pragmatist

Quote:


> Originally Posted by *Blackcurrent*
> 
> Just received my MSI 1070 Gaming X. With a little overclocking this is the result I got.


Going to try with AA and extreme later and edit the post. However, the gaming X is supposedly better than my MSI Armor variant.

1080p Ultra, no AA









1440p High, no AA


----------



## Tasm

For some reason, the stock Gaming BIOS was limiting my OC.

After updating to the OC BIOS, I can get a max boost of 2088 MHz.

Not that bad.



http://www.3dmark.com/3dm/12846451?


----------



## Frutek

Quote:


> Originally Posted by *Tasm*
> 
> For some reason, the stock Gaming Bios was limiting my OC.
> 
> After updating for the OC Bios, i can get a max boost of 2088 Mhz.
> 
> Not that bad.
> 
> 
> 
> http://www.3dmark.com/3dm/12846451?


What clocks did you get before updating bios?

Here are my game-stable scores:

http://www.3dmark.com/compare/fs/9003599/fs/8742652/fs/8716850#


----------



## pewpewlazer

Quote:


> Originally Posted by *vmanuelgm*
> 
> Hi, could someone post a heaven 1080p maxed with 1070's SLI at highest frequencies, to see if they scale better than Maxwell and if it is possible to reach 200 fps???
> 
> Thanks in advance.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Would also like to see the same bench with 2025 in core and 4250 in memos in SLI. A powerful processor would also be great (6700k, 5960x, 6900k)!!!


Quote:


> Originally Posted by *Aloc*
> 
> Hi guys! could be possible @ 2025mhz/4235mhz with i5 6600k this result?:
> 
> 
> 
> I saw this result in n3d spanish forum and i don´t really believe that score of 193fps avg @ that frequencies 2025mhz( with trottling to 2000ghz or below) and 4253mhz memos.
> 
> Anybody here could be so kind to do same test at this mhz, i would appreciate so much the help.
> 
> Thanks and salutes from spain.


I'm guessing you're both asking because of this same result? Haha.

For whatever reason, my print screen turns out black when I run 1080p full screen (desktop @ 4K). In windowed mode my results are 20 fps lower. So you'll have to take my word for it...

5820K @ 4.5 GHz / 3.3 GHz (HT on)
DDR4-2400 15-15-15-36 1T
EVGA GTX 1070 FE SLI @ 1975-2000 MHz core, 4400 MHz memory

*177 fps* in Heaven at the above settings

16 fps lower... I don't really have any experience with Heaven, and you didn't state what speed this guy has his 6600K at, so I won't say that result is impossible. Hell, maybe running my desktop res at 4K hurts, seeing as windowed mode alone is a 20 fps penalty.


----------



## PCPanamaCrew

Picked up mine today (I imported it). I'm thinking of reselling it in my country to grab a 1080. It's an MSI FE. First impression was good... it smells like lemon (maybe an electronics cleaner in the packing phase?) plus that new-electronics odor lol.

I immediately OC'd it to 2000-2100; GTA V kicks me out when I pass 2030. I also OC'd the memory to 8600.

I can run GTA at 1080p + FRAME SCALING MODE 2.500 (4K textures) + MSAA x8. Obviously I exceed the VRAM by 200-300 MB, but it's enough to render machinimas or take photos at the game's maximum possible quality. It's not playable though, max 15 fps at these settings. (So this means you'd need 1070 SLI with the cards running at 2000 MHz to run 4K MSAA x8 at 30 fps...)

Also, The Witcher 3 runs at 60 fps all the time at 2560x1080, my primary resolution.

I'm still thinking about a GTX 1080 at 2.1 or 2.2 GHz to fulfill my needs (mostly playing with MSAA x8...)
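For a sense of why x2.5 frame scaling is so heavy, here's the pixel math (assumption: the frame-scaling slider multiplies each axis, so x2.5 at 1080p renders internally at 4800x2700; the figures are illustrative):

```python
# Pixel-count comparison: 1080p with x2.5 frame scaling vs native 4K.
base_w, base_h, scale = 1920, 1080, 2.5
scaled_pixels = (base_w * scale) * (base_h * scale)  # 4800 x 2700 internal render
native_4k_pixels = 3840 * 2160

print(scaled_pixels / native_4k_pixels)  # 1.5625 -> ~56% more pixels than native 4K
```

Stacking MSAA x8 on top of that render target explains both the VRAM overflow and the ~15 fps result.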










Spoiler: Warning: Spoiler!


----------



## rv8000

If someone has the time, could you please get a 1070 to boost to ~1962 while leaving the memory at stock clocks in FS; I'm checking some scaling between the 1070 and 1080.

http://www.3dmark.com/fs/9073334


----------



## Blackcurrent

A quick Firestrike bench with the 1070 Gaming X I just received. Can probably do a little better.

http://www.3dmark.com/3dm/12842753


----------



## Mad Pistol

I'm surprised we haven't seen an influx of new 1070 owners since the RX 480 was released.


----------



## mcbaes72

Quote:


> Originally Posted by *Mad Pistol*
> 
> I'm surprised we haven't seen an influx of new 1070 owners since the RX 480 was released.


My 1070 Armor arrived earlier this week (Tuesday). Although I hope to see AMD succeed, I had no intention of buying their newest GPU.


----------



## mickeykool

Quote:


> Originally Posted by *mcbaes72*
> 
> My 1070 Armor arrived earlier this week (Tuesday). Although I hope to see AMD succeed, I had no intention of buying their newest GPU.


I'm curious what you're getting in Fire Strike graphics? I have the exact same card and am averaging 16,000 at stock.


----------



## Airrick10

Quote:


> Originally Posted by *Blackcurrent*
> 
> A quick Firestrike bench with the 1070 Gaming X I just received. Can probably do a little better.
> 
> http://www.3dmark.com/3dm/12842753


I just got mine yesterday...doing a firestrike benchmark right now. I'll post results in a while. Running stock clocks.


----------



## MrPlankton

Bit the bullet on an MSI GTX 1070 Gaming (non-X version) to replace my MSI GTX 670 Power Edition SLI setup. Can't wait to see how it overclocks (and not having a 2 GB VRAM limit anymore).


----------



## buttface420

I'm not much of a GPU overclocker, but here's a tiny overclock on the Gaming X 1070: http://www.3dmark.com/3dm/12863137

So happy to be in the 20,000+ club on a single GPU


----------



## Bdonedge

So HWMonitor says my 1070 is running at 1987 MHz under load. This is without an OC. That can't be right?

I don't have any OC software installed either. Is there a reason for that?


----------



## Blackcurrent

Quote:


> Originally Posted by *Bdonedge*
> 
> So in HWmonitor it says my 1070 is running at 1987mhz under load. This is without a OC. That can't be right?
> 
> I don't have any OC software on either. Is there a reason for that?


Yes, read up on GPU Boost 3.0
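For the curious, GPU Boost's behavior can be sketched as a toy model. This is an illustration only, not Nvidia's actual algorithm; the bin size, limits, and headroom numbers below are assumed for the example:

```python
# Toy model of GPU Boost (illustrative only): the card raises the core clock
# in small bins above the advertised boost clock until it runs into the
# temperature or power limit.
BIN_MHZ = 13  # assumed bin size

def boosted_clock(advertised_boost, headroom_bins, temp_c, power_pct,
                  temp_limit=83, power_limit=100):
    """Effective clock: advertised boost plus as many bins as thermal and
    power headroom allow (none once a limit is hit)."""
    if temp_c >= temp_limit or power_pct >= power_limit:
        return advertised_boost
    return advertised_boost + headroom_bins * BIN_MHZ

# A 1070 FE advertises a 1683 MHz boost clock; a cool sample with power
# headroom can sit far above it, which is why monitoring tools report ~1987 MHz.
print(boosted_clock(1683, 23, temp_c=65, power_pct=80))  # 1982
```

The takeaway: the advertised boost clock is a floor for typical loads, not a ceiling, so seeing ~1987 MHz without any manual OC is expected behavior.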


----------



## stoker

Quote:


> Originally Posted by *Mad Pistol*
> 
> I'm surprised we haven't seen an influx of new 1070 owners since the RX 480 was released.


Funny you say that. I was thinking of maybe getting two, but my plans derailed on launch day, so I ordered an EVGA SC 1070 last night.


----------



## Bdonedge

Quote:


> Originally Posted by *Blackcurrent*
> 
> Yes, read up GPU Boost 3.0


I've read the 1080 stats on GPU Boost 3.0, but that seemed to top out around 1850 MHz... I didn't think 1900+ would be standard for a 1070?


----------



## 9colai

Just received mine yesterday (MSI 1070 X Gaming) and I'm really satisfied. First upgrade in 4 years; came from a GTX 680 DirectCU II TOP.
This card has awesome performance and really good temps while it stays silent!


----------



## Yetyhunter

How are your games running ? GTA V, witcher 3, Battlefield ... I will be upgrading from the same card and I want to know what to expect.


----------



## 9colai

Quote:


> Originally Posted by *Yetyhunter*
> 
> How are your games running ? GTA V, witcher 3, Battlefield ... I will be upgrading from the same card and I want to know what to expect.


I've played Far Cry 4 most of the time since I got the card, and it runs between approximately 70 and 100 fps on ultra at 1080p. It uses 5.5 GB VRAM though? Kinda strange at 1080p.

I've tested BF4 briefly and it ran a solid 144 fps.


----------



## Sea Otter

Anyone know if it's possible to flash an MSI Gaming X BIOS onto the non-X model? I didn't want to pay the extra $20 for the X model, and the regular model was the only card in stock at my time of purchase.

Also, can someone confirm whether the only difference between the two cards is the factory overclock?


----------



## Frutek

Quote:


> Originally Posted by *rv8000*
> 
> If someone has the time, could you please get a 1070 to boost to ~1962 while leaving the memory at stock clocks in FS; I'm checking some scaling between the 1070 and 1080.
> 
> http://www.3dmark.com/fs/9073334


There you go

http://www.3dmark.com/compare/fs/9073334/fs/9080692#


----------



## StarGazerLeon

Quote:


> Originally Posted by *Tasm*
> 
> For some reason, the stock Gaming Bios was limiting my OC.
> 
> After updating to the OC Bios, i can get a max boost of 2088 Mhz.
> 
> Not that bad.
> 
> 
> 
> http://www.3dmark.com/3dm/12846451?


Which version did you flash to? I downloaded the 3 MB file from the MSI 1070 support page and extracted it, but I have two versions: one ends in 120 and the other in 170 (rough guesses, as I can't remember the exact numbers). Which one is the fastest version included on MSI's site?


----------



## Blackcurrent

Quote:


> Originally Posted by *StarGazerLeon*
> 
> Which version did you flash to? I download the 3MB file from the MSI 1070 support page and extracted it, but I have 2 versions; one ends in 120 and the other 170 (These are rough guesses as I cannot remember tbe exact numbers). Which one is the fastest version included on MSI's site?


Read the pdf file


----------



## StarGazerLeon

Quote:


> Originally Posted by *Blackcurrent*
> 
> Read the pdf file


Derp. My mistake. All is clear now, thanks.


----------



## devilz

Just bought the MSI Gaming Edition today on the day of the Australian election lol. Price was $769AUD (~$567US).


----------



## Pragmatist

Quote:


> Originally Posted by *Frutek*
> 
> There you go
> 
> http://www.3dmark.com/compare/fs/9073334/fs/9080692#


That's an amazing score dude; there are just 1571 points between them. Did you notice better performance with the hotfix driver as well?


----------



## Frutek

Quote:


> Originally Posted by *Pragmatist*
> 
> That's an amazing score dude, it's just 1571 in between. Did you notice better performance as well with the hotfix driver?


The scores are that close because the GTX 1080 in the comparison has its memory downclocked to 8 GHz. I did notice better scores with the 1080 I had. Can't say much about the 1070, because the first driver I installed was the hotfix one.


----------



## Pragmatist

Quote:


> Originally Posted by *Frutek*
> 
> It's such close score because gtx 1080 has downclocked memory to 8GHz. I did notice better scores with 1080 I had. Can't say much about 1070 because first driver I installed was hotfix one.


Yeah, I've seen 1080s score much higher; it's just that it didn't do much better with "almost" the same values. The 1070 is a great card tbh, and I'm having a lot of fun gaming with it. It's especially noticeable if you upgrade from a 7xx series card or older, because gaming got so much more fluid at 1440p.

I got fewer stutters and better performance with the hotfix driver, but it's just meant to fix the flickering issue some have with high refresh rate panels. It could be coincidence as well, since I haven't reproduced it more than once, nor have I tested reinstalling the drivers more than once.


----------



## Partogi

Is it necessary to have more than one power connector for better OC capability (i.e. 8-pin + 6-pin, or two 8-pins)?

I'm comparing the Palit 1070 Super Jetstream vs. the MSI 1070 Gaming X. The Palit only has one 8-pin.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Partogi*
> 
> Is it necessary to have more than one power connectors for better OC capability (i.e. 8-pin & 6-pin or two 8-pin)?
> 
> I'm comparing Palit 1070 Super Jetstream vs. MSI 1070 Gaming X. Palit only has one 8-pin.


Not on these low-powered cards. An 8-pin is more than enough.
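The arithmetic behind that, using the PCIe spec's nominal limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin):

```python
# Total board power available from the slot plus auxiliary connectors,
# per the nominal PCIe limits.
def power_budget(six_pins=0, eight_pins=0, slot_w=75):
    return slot_w + six_pins * 75 + eight_pins * 150

# A single-8-pin card like the Palit Super Jetstream can draw up to 225 W,
# well above the GTX 1070's 150 W TDP even with overclocking headroom.
print(power_budget(eight_pins=1))              # 225
print(power_budget(six_pins=1, eight_pins=1))  # 300
```

An extra connector mostly matters when vendors also raise the power-limit slider ceiling, not for the stock draw itself.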


----------



## CaptainZombie

Quote:


> Originally Posted by *mcbaes72*
> 
> My 1070 Armor arrived earlier this week (Tuesday). Although I hope to see AMD succeed, I had no intention of buying their newest GPU.


How is the Armor edition? Where does it fall with the rest of the MSI 1070s?


----------



## rv8000

Quote:


> Originally Posted by *Frutek*
> 
> There you go
> 
> http://www.3dmark.com/compare/fs/9073334/fs/9080692#


Pretty much what I was expecting, only an 8% performance gap for a 25% shader cut at the same clocks. I guess it's good news for the people looking towards the 1060 at least.
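The 25%/8% figures can be checked from the CUDA core counts (1920 on the 1070 vs 2560 on the 1080) and the gap in the linked comparison:

```python
# How much of the 1080's theoretical shader advantage actually shows up
# in the Fire Strike comparison at matched clocks.
cores_1080, cores_1070 = 2560, 1920
shader_cut = 1 - cores_1070 / cores_1080   # 0.25: the 1070 has 25% fewer shaders
perf_gap = 0.08                            # ~8% graphics-score gap observed above

realized = perf_gap / shader_cut           # fraction of the cut that costs performance
print(f"{shader_cut:.0%} shader cut, {perf_gap:.0%} gap -> {realized:.0%} realized")
```

In other words, only about a third of the shader deficit translates into lost performance here, which does bode well for the further cut-down 1060.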


----------



## mcbaes72

Quote:


> Originally Posted by *mickeykool*
> 
> I'm curious what are u getting in fire strike graphics? I have the exact same card and am getting avg 16,000 at stock.


Just ran it, see attached screen capture. But this is OC'ed with...

Power = 108%
Core = 150
Memory = 456

Additional info...

Max Temp = 71 C
GPU Usage = 99%
Max Core Clock = 2101
Max Memory Clock = 4464
(MSI Afterburner, version 4.2.0)
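A note on those memory numbers, assuming the usual GDDR5 bookkeeping (Afterburner adds the offset to the DDR clock it reports, and the effective transfer rate is twice that again); this is how a +456 offset becomes the 4464 MHz reading:

```python
# GDDR5 clock bookkeeping (a sketch): the Afterburner offset is added to the
# reported DDR clock, and the effective transfer rate is double that.
def effective_mts(stock_ddr_mhz, offset_mhz):
    return 2 * (stock_ddr_mhz + offset_mhz)

# GTX 1070 stock: ~4008 MHz reported ("8 GHz effective"). The +456 offset
# gives the 4464 MHz reading above, i.e. roughly 8928 MT/s effective.
print(effective_mts(4008, 456))  # 8928
print(effective_mts(4008, 0))    # 8016
```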


----------



## mcbaes72

Quote:


> Originally Posted by *CaptainZombie*
> 
> How is the Armor edition? Where does it fall with the rest of the MSI 1070s?


IMO, it's in the middle of the pack. I think GPUs such as the MSI Gaming X, ASUS Strix, and Zotac Amp Extreme would be rated above the Armor in terms of core clock speed and should bench higher. I will say this: with temps maxed at 71 C (the highest I've seen so far) and fans at 60%, it's still pretty quiet compared to my case fans when temps ramp up. Also, you gotta love the classy black/white color combo. I mainly bought it to match my gaming rig's color scheme. Yeah, looks were more important than having the highest core clocks... haha!


----------



## LiquidHaus

Hey guys, figured I'd go here and ask for a legitimate opinion.

Looking to snag a 1070 this month. I've got my eye on two mainly but with a third coming up fairly quick in terms of curiosity.

SO, what would you choose?

Asus Strix 1070
Zotac Amp Extreme 1070
Gainward GLH 1070


----------



## mcbaes72

Quote:


> Originally Posted by *lifeisshort117*
> 
> Hey guys, figured I'd go here and ask for a legitimate opinion.
> 
> Looking to snag a 1070 this month. I've got my eye on two mainly but with a third coming up fairly quick in terms of curiosity.
> 
> SO, what would you choose?
> 
> Asus Strix 1070
> Zotac Amp Extreme 1070
> Gainward GLH 1070


From those three, I'd choose the Zotac Amp Extreme! Zotac tends to have some of the highest core/boost clocks, and I love the carbon/gunmetal look with triple fans. The only downside is the yellow on the backplate; why didn't they stick with the same color scheme as the front, or at least basic black? Also, I've been going through an ASUS RMA for almost three weeks on an older GPU repair. Although they're helpful on the phone, the repair/replacement process is pretty slow. I Googled others' opinions on their RMA process and read many complaints. So, keep that in mind when deciding between these choices.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *lifeisshort117*
> 
> Hey guys, figured I'd go here and ask for a legitimate opinion.
> 
> Looking to snag a 1070 this month. I've got my eye on two mainly but with a third coming up fairly quick in terms of curiosity.
> 
> SO, what would you choose?
> 
> Asus Strix 1070
> Zotac Amp Extreme 1070
> Gainward GLH 1070


The following contributed largely to me getting hooked on the ASUS Strix 1070:

http://www.kitguru.net/wp-content/uploads/2016/06/xtemps5.png.pagespeed.ic.vqk8rw69YT.jpg

http://www.kitguru.net/components/graphic-cards/zardon/asus-republic-of-gamers-strix-gtx-1070-aura-rgb-oc/28/


----------



## Bugses

Which 1070 is the best to buy? I would prefer something silent, if that's possible without hurting performance too much.


----------



## PCPanamaCrew

Quote:


> Originally Posted by *Yetyhunter*
> 
> How are your games running ? GTA V, witcher 3, Battlefield ... I will be upgrading from the same card and I want to know what to expect.


At 1080p it's totally playable at max settings (MSAA x8), 60 fps with sync on. You can also run with frame scaling mode (2.5 looks like 4K textures, MSAA off) at 40-60 fps. It can handle 1080p MSAA x8 plus frame scaling 2.5, but only at 9-15 fps (it uses 8200 MB of VRAM)... lol, you need SLI to run GTA V at 4K MSAA x8 at 30 fps. The Witcher 3 runs at 60 fps with GameWorks on and sync.

Why run GTA V at the maximum settings possible? To take photos or render videos in the Rockstar Editor.

Here's a photo I took at 1080p with max frame scaling and MSAA x8; look how much VRAM it uses.


Spoiler: Warning: Spoiler!


----------



## Airrick10

OK, so here are my MSI GTX 1070 Gaming X results for Heaven and Firestrike. Great card to replace my MSI 660 Tis in SLI. It's not much of an overclocker, but overall I'm very satisfied with it. The card also runs pretty cool; I never saw it go above 65C while doing multiple Heaven and Firestrike benchmarks. It's pretty quiet as well with my custom fan curve. I'm still running Windows 7, so I'm not sure if my scores will go up once I dive into Windows 10, but I'll find out later this month when I upgrade before the deadline.

Heaven Benchmark
Overclock Settings
+105 Core
+775 Memory


Firestrike Benchmark
Overclock Settings
+100 Core
+500 Memory
70% Power limit (any higher and +100 core would crash)


----------



## Bdonedge

My Gigabyte G1 without an OC hasn't gone over 60C under full load in gaming and benchmark sessions - just FYI for anyone deciding which one to get. I'm about to snag a second one


----------



## jopy

I've actually set that G1 to Eco mode; after 30 mins of Shadow of Mordor at 1080p max settings, max load temp is only 50C, ambient 25-27C.


----------



## Yungbenny911

I have three 1070s on the way to my house (as I can't make up my mind) lol: one MSI Gaming X, one MSI Gaming (non-X), and one Gigabyte G1. I'll be selling the Gigabyte G1 and flashing the non-X with an X BIOS.


----------



## bdc604

Quick question: what are the dimensions of the MSI 1070 Gaming X without the mounting bracket? Trying to figure out which 1070 I can squeeze into an NCASE M1 with some creative zip cutting to the case frame.


----------



## Airrick10

Quote:


> Originally Posted by *bdc604*
> 
> quick question: what are the dimensions of the msi 1070 gaming x without the mounting bracket? trying to figure out which 1070 i can squeeze into an ncase m1 with some creative zip cutting to the case frame.


The length of the card is a little less than 11 inches... around 10 3/4 inches, more or less (not including the mounting bracket where the ports are). Hope this helps.


----------



## bdc604

Quick question #2: anyone want to share their experiences with the EVGA 1070 SC? I can only find a couple of posts on Google mentioning it's nominally better than the FE, whereas the MSI Gaming X seems to be mostly whisper quiet.


----------



## Eorzean

Can you guys run LatencyMon for a few minutes and then post your results here? I'm trying to determine if my card's faulty or not as it has extremely high DPC latency.


----------



## Blackcurrent

It's not just you; I have it as well, but only when idling. When testing under load, DPC latency is in the green and mostly under 50.


----------



## Anth0789

Just ordered my Asus GTX 1070! Can't wait to get it.


----------



## HAL900

Would any 1070 owner record the same video on the latest driver?


----------



## Eorzean

Quote:


> Originally Posted by *Blackcurrent*
> 
> It's not just you I have it as well, but only when idling. When testing under load dpc latency is in the green and under 50 mostly./


Posted an update in the other thread, but can you do a full run of Firestrike with LatencyMon and then see if there's any spikes at all? Heaven ran with low latency, whereas I was getting spikes in Firestrike. I thought maybe it was just happening in-between benchmarks during the loading screens, but it was spiking mid way through as well. I haven't had a chance to game yet but will fire up Overwatch as soon as it's done downloading and see if there's any stuttering.

Edit: Fixed! Well sort of. I turned off speedstep and now I stay in the green with occasional spikes of 450-500.


----------



## LiquidHaus

Quote:


> Originally Posted by *Anth0789*
> 
> Just ordered my Asus GTX 1070! Can't wait to get it.


the strix or founders?


----------



## Anth0789

Quote:


> Originally Posted by *lifeisshort117*
> 
> the strix or founders?


Strix!


----------



## LiquidHaus

Quote:


> Originally Posted by *Anth0789*
> 
> Strix!


Sweeeeet, I think I'll be doing the same next week. Did you order it from Newegg? When you get it, you should post some close-ups of the card on here.


----------



## Anth0789

Quote:


> Originally Posted by *lifeisshort117*
> 
> sweeeeet i think i'll be doing the same next week. order it from newegg? when you get it you should post some close ups of the card on here.


I ordered it from Canada Computers for pick up, who knows how long it will take before I get it but I will post pics once it arrives.


----------



## HAL900

AIDA64 Extreme Edition 5.75.3900
Could someone post this GPU's results?


----------



## xCamoLegend

Quote:


> Originally Posted by *HAL900*
> 
> AIDA64 Extreme Edition 5.75.3900
> someone could throw the result of this gpu ??


----------



## Bdonedge

When you guys update the drivers, do you just let the GeForce Experience app update them, or do you manually uninstall the old drivers and then install the new ones like in the old days?


----------



## Blackcurrent

I first uninstall the Nvidia drivers by rebooting into safe mode and running DDU. Then I simply let it run, restart, and install the new drivers. I don't install any of the other Nvidia crap except for PhysX.


----------



## Bdonedge

Quote:


> Originally Posted by *Blackcurrent*
> 
> I first uninstall the Nvidia drivers and reboot in safe mode with DDU. Then I simply let it run, restart and install the new drivers. I don't install any of the other Nvidia crap except for Physx.


DDU?


----------



## Arizonian

Quote:


> Originally Posted by *Bdonedge*
> 
> DDU?


https://forums.geforce.com/default/topic/550192/display-driver-uninstaller-ddu-v7-1/


----------



## 303869

Can't wait for my 1070! My order has been open 2 weeks now and hasn't even dispatched due to low stock


----------



## Samurai707

I walked in and grabbed my MSI 1070 Gaming X from Central Computers (SF location) during my lunch break last Wednesday. They still have a bunch of FEs left. Go figure


----------



## Mad Pistol

Quote:


> Originally Posted by *Samurai707*
> 
> I walked in and grabbed my MSI 1070 Gaming X from a Central Computer (SF Location) during my lunch break last Wednesday. They got a bunch of FEs left. Go figure


Yea, I think the early adopters have all but gotten their share of the 1070 Founders Editions; they appear to be in stock at several locations.

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007709%20601201888%20601204369%208000

Honestly, I love my 1070 FE. It's a great card that looks awesome! Will AIB cards outmatch it in performance? Yep. Do I care? Not really.

I actually thought about returning my MSI 1070 FE and keeping the Nvidia 1070 FE. However, I looked at the MSI 1070 FE, and there is absolutely no MSI branding on it at all. I have a feeling all Founders Edition cards are like this; manufacturer branding isn't allowed on them. In other words, no one would know I bought a card sold by MSI unless I told them.


----------



## wywywywy

Is there a way to lower the minimum fan speed on the FE to below 1000rpm yet? Or do we have to wait for custom BIOSes?


----------



## Ragsters

Hey guys! Is this score OK? Nothing is overclocked.


----------



## Dude970

I would say that score is really good


----------



## Ragsters

Quote:


> Originally Posted by *Dude970*
> 
> I would say that score is really good


Thanks! I just want to be in the ballpark of similar systems. This is a brand new rig and I haven't figured everything out yet.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *Anth0789*
> 
> I ordered it from Canada Computers for pick up, who knows how long it will take before I get it but I will post pics once it arrives.


16 more days until my (2) ASUS Strix cards arrive from Amazon. (Originally a 24-day wait from the June 26 purchase date... they promised three to four weeks, but said it could be sooner than that. So maybe as little as 10 more days.) Brings back all the Christmas-morning waiting feelings from childhood.

The really good part of waiting is that my CC hasn't been charged yet, so I was able to afford all the great games from Steam for playing on GTX 1070 SLI at 1440p with G-Sync... minus Tom Clancy's The Division, because the sale price was only -25%. It can wait.

I'll post when my (2) ASUS 1070 Strix arrive, and hopefully, since they're coming from Amazon, it will shed light on shipments nationwide and their timing. Man, I hope they turn up in 10 days so I can avoid entering therapy.


----------



## HAL900

Quote:


> Originally Posted by *xCamoLegend*


That seems rather low. Is GPU-Z's second tab showing 1797 MHz during the test?


----------



## Frutek

Quote:


> Originally Posted by *HAL900*
> 
> It is rather little. 1797 mhz 2 tab gpu-z in the test ?


It's just showing that way. I get the same stock default boost, but it goes up to 1985 MHz in games.


----------



## HAL900

1797 in AIDA and 1985 MHz in game, or 1985 in both?


----------



## Sea Otter

Quote:


> Originally Posted by *bdc604*
> 
> quick question #2: anyone want to share their experiences with evga 1070 sc? can only find a couple posts on google mentioning its nominally better than fe, whereas msi gaming x seems to be mostly whisper quiet.


I've tried both an FE and a 1070 SC. Have the 1070 SC in my system right now. Coil whine is pretty bad on it. It's a reference board with an ACX cooler, which cools marginally better than the FE. I was getting 82C load temps with my FE (1850MHz boost), and my EVGA one gets to around 75C.

Not sure why EVGA is marketing the ACX cooler so heavily when it's very marginally better than the stock cooler. Bad experience all around. Can't wait for my Strix to come from Newegg so I can finally return the SC.


----------



## HAL900

Why has Pascal cut double precision down even further, to 1:32 instead of 1:24? Pascal is just Maxwell; Nvidia is passing it off as something else. SP has only improved by 10% xD


----------



## HAL900

1455/8000 mhz


----------



## Frutek

Quote:


> Originally Posted by *HAL900*
> 
> 1797 in aida and 1985 mhz in game or both 1985 ?


1797 in GPU-Z, 1985 MHz in game, with the OC BIOS on an MSI GTX 1070


----------



## prey1337

Alright, just installed my GTX 1070 SC.

Firestrike Graphics score of 13985 at stock settings: http://www.3dmark.com/fs/9137780

Firestrike Graphics score of 14322 after my 1st round of overclocking: http://www.3dmark.com/fs/9138795

Settings below:


GPU Z:


2050MHz when boosting during Firestrike:


Now I feel like it's overclocking pretty well without much tinkering. I set the fan curve for aggressive which helped and it's still quiet compared to my Hyper 212+ CPU cooler.

Now for the questions.
My overall firestrike score is pretty dismal compared to everyone else's.
Partly because of my old i7 920 CPU OC'ed to 3.8GHz. That's dropping my physics and combined categories pretty bad.

But also my FPS drops pretty low in the 1st graphics test for some reason, which is impacting that score a lot.

I did not do a clean driver install, not sure if that will help.

Any suggestions folks?

Edit 1:
-Ran Doom on ultra, maxed out AA, motion blur low, averaging around 100fps. Usually sits much higher than that.
-Ran Ashes of the Singularity on Crazy, DX12, averaging 60fps or so, I think the CPU is to blame for that lower FPS.
-This is all with 1 monitor at 1080p.

Edit 2:
-Added Heaven score below, still looks low compared to everyone's scores. CPU to blame?
-Heaven was showing 2088MHz also, but Precision was showing closer to 2000.


----------



## Ysbzqu6572

Your CPU seems to be the bottleneck here.


----------



## Blackcurrent

Your i7 920 is ancient (a 2008 CPU); it's holding you back big time. Both your 3DMark score and your Heaven score are very poor. For comparison, I get a 20k+ Firestrike Graphics score at stock clocks, and around 100 fps in the first Firestrike test.


----------



## pez

If you were to upgrade to a 2K monitor, the CPU would be less relevant for you in actual gaming scenarios. However, if benchmarking is your thing, the CPU is definitely holding you back. A 2K monitor is usually cheaper than a platform upgrade.


----------



## prey1337

@blackcurrent
Yeah, I'm actually impressed it's held up this long. So I'm guessing the first graphics test uses some CPU? At this point it's not affecting gaming, just benchmarks, and the card is performing well clock-wise. But that's why I found it odd that my graphics score was so low.

@pez
Can you expand on the 2K monitor part? I would think the CPU has an easier time at 1080p.


----------



## Arizonian

Just popped in to say congrats to the 1070 owners I see gathering, good times.









Thank you TopicClocker for starting the club thread with owners list. It's now *[Official]*


----------



## versions

Quote:


> Originally Posted by *prey1337*
> 
> @blackcurrent
> Yea I'm actually impressed it's held up this long.
> So I'm guessing the first graphics test uses some cpu?
> At this point it's not effecting gaming, just benchmarks. And the card is performing well clock wise. But that's why I found it odd my graphics score was so low?
> 
> @pez
> Can you expand on the 2k monitor part?
> I would think the cpu has an easier time with 1080p.


Higher resolution adds GPU load, not CPU load. It's possible that it increases the CPU load slightly, but no more than that. So the higher the resolution, the less reliant on the CPU you are because you're going to hit the GPU bottleneck earlier. That said, increasing resolution is never going to increase your performance.

It's probably affecting gaming quite significantly depending on game, especially as you're playing at 1080P. Here you can compare Physics score with what I get with a modern 6700K at 4.8GHz with 3000MHz RAM.
http://www.3dmark.com/fs/9090498
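That relationship can be sketched with a toy frame-time model (the millisecond figures below are hypothetical, just to illustrate the bottleneck logic):

```python
# Toy bottleneck model: each frame needs CPU work (resolution-independent)
# and GPU work (scales with pixel count); the slower of the two sets the fps.
def fps(cpu_ms, gpu_ms_at_1080p, pixel_ratio):
    gpu_ms = gpu_ms_at_1080p * pixel_ratio
    return 1000 / max(cpu_ms, gpu_ms)

# Hypothetical: an old CPU needing 12 ms/frame, a GPU needing 8 ms at 1080p.
# At 1080p the CPU caps the frame rate; at 1440p (1.78x the pixels) the GPU
# becomes the limit, so the CPU matters less, but fps never goes up.
print(round(fps(12, 8, 1.0)))   # 83  (CPU-bound)
print(round(fps(12, 8, 1.78)))  # 70  (GPU-bound)
```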


----------



## pez

Quote:


> Originally Posted by *prey1337*
> 
> @blackcurrent
> Yea I'm actually impressed it's held up this long.
> So I'm guessing the first graphics test uses some cpu?
> At this point it's not effecting gaming, just benchmarks. And the card is performing well clock wise. But that's why I found it odd my graphics score was so low?
> 
> @pez
> Can you expand on the 2k monitor part?
> I would think the cpu has an easier time with 1080p.


Quote:


> Originally Posted by *versions*
> 
> Higher resolution adds GPU load, not CPU load. It's possible that it increases the CPU load slightly, but no more than that. So the higher the resolution, the less reliant on the CPU you are because you're going to hit the GPU bottleneck earlier. That said, increasing resolution is never going to increase your performance.
> 
> It's probably affecting gaming quite significantly depending on game, especially as you're playing at 1080P. Here you can compare Physics score with what I get with a modern 6700K at 4.8GHz with 3000MHz RAM.
> http://www.3dmark.com/fs/9090498


Essentially this.

Higher resolutions will rely quite a bit less on the CPU, depending on the game. However, I don't know of many CPU-bound titles that are still relevant. Some Source engine stuff and RTS games usually put a beating on the CPU, but most stuff now scales very well at higher resolutions. I think the GTX 1070 is quite a fine card for 2K. It will still want some anti-aliasing in some titles, but it's a high enough resolution that AA becomes less relevant.

Going to try and search to see if I can find some benchmarks done with your CPU with newer cards.

EDIT: Not finding much, but it seems that even something like a 2500K would be less of an actual bottleneck for the card. However, I would just do a lot of testing in your normal usage before making the plan to upgrade blindly. You should see a huge jump regardless, but I wouldn't say your CPU is irrelevant just yet.


----------



## Eric1285

You don't need a platform upgrade. Just grab a $50 Xeon X5650 off of eBay and overclock it. With a bit of luck you can do 4.2 ghz easy. Check out the super long X58 Xeon thread in the Intel General forum.

I spent $70 on a X5670 a few months ago and just got a 1070: http://www.3dmark.com/fs/8937271


----------



## BulletSponge

I ordered an MSI 1070 Gaming X on June 19th from Amazon and finally got an estimated delivery date of July 11th today.









With any luck (and one day shipping) it will be here this week. Glad I held out, I almost canceled my order yesterday to buy a 980Ti. Now whether Amazon is getting enough to actually have inventory or just fill all/some pre-orders, who knows?


----------



## Dude970

I ordered my msi card on the same day as you. I also received an email estimating 11 July


----------



## BulletSponge

Quote:


> Originally Posted by *Dude970*
> 
> I ordered my msi card on the same day as you. I also received an email estimating 11 July


----------



## prey1337

Quote:


> Originally Posted by *pez*
> 
> Essentially this.
> 
> Higher resolutions will rely quite a bit less on CPU depending on the game. However, I don't know of many CPU-bound titles that are still relevant. Some Source engine stuff and RTS games usually put a beating on the CPU, but most stuff now scales very well at higher resolutions. I think the GTX 1070 is quite a fine card for 2K. It will require some anti-anti-aliasing in some titles, but it's a high enough resolution that AA becomes less relevant.
> 
> Going to try and search to see if I can find some benchmarks done with your CPU with newer cards.
> 
> EDIT: Not finding much, but it seems that even something like a 2500K would be less of an actual bottleneck for the card. However, I would just do a lot of testing in your normal usage before making the plan to upgrade blindly. You should see a huge jump regardless, but I wouldn't say your CPU is irrelevant just yet.


I suppose I shouldn't be surprised, it has been nearly a decade of CPU tech progress.
I don't think any of the games I have are very cpu dependent, except Ashes which is why I included that one. It's still completely fluid gameplay and the CPU is only taxed ~30%.

Quote:


> Originally Posted by *Eric1285*
> 
> You don't need a platform upgrade. Just grab a $50 Xeon X5650 off of eBay and overclock it. With a bit of luck you can do 4.2 ghz easy. Check out the super long X58 Xeon thread in the Intel General forum.
> 
> I spent $70 on a X5670 a few months ago and just got a 1070: http://www.3dmark.com/fs/8937271


That's a huge jump!
What all did you have to do when you got the Xeon?
I'm eyeballing a few on eBay.

Edit: I have an Asus Sabertooth X58; it doesn't list the Xeons as supported, but they should still work, right?


----------



## Bdonedge

Quote:


> Originally Posted by *Blackcurrent*
> 
> Your i7 920 is ancient it is a 2008 CPU, it is holding you back big time. Both your 3DMark score and Heaven score are very poor. In comparison I get 20k+ for Firestrike Graphic score on stock clocks. Around 100 fps for the first test with Firestrike.


I'm confused - I have a 6700k and I'm not getting near 20k at stock clocks. What settings should
I be looking for - I'm running default in bios


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *BulletSponge*
> 
> I ordered an MSI 1070 Gaming X on June 19th from Amazon and finally got an estimated delivery date of July 11th today.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> With any luck (and one day shipping) it will be here this week. Glad I held out, I almost canceled my order yesterday to buy a 980Ti. Now whether Amazon is getting enough to actually have inventory or just fill all/some pre-orders, who knows?


Cool, only 6 more days. I wish they would speed up my delivery.

Hoping they give me a more precise estimate too; a window within 6 days of delivery like yours would be nice.

(I ordered (2) ASUS Strix on June 26 from Amazon, 7 days after you ordered. My current arrival estimate is 15 to 21 more days, but they always said it could be earlier.)

GL


----------



## Eric1285

Quote:


> Originally Posted by *prey1337*
> 
> I suppose I shouldn't be surprised, it has been nearly a decade of CPU tech progress.
> I don't think any of the games I have are very cpu dependent, except Ashes which is why I included that one. It's still completely fluid gameplay and the CPU is only taxed ~30%.
> That a huge jump!
> What all did you have to do when you got the Xeon?
> I'm eye-balling a few on ebay.
> 
> Edit: I have a Asus Sabertooth X58, doesn't show the Xeon's are supported, but they should still work right?


You'll have to check for your particular motherboard (pretty sure I remember the Sabertooth working) but usually you just need to make sure you have the latest BIOS (or new enough that they added support for the Xeon chips) and it should drop in and just work. Overclocking is a bit different than with your i7 since they're on different processes (45nm for your i7, 32nm for the Westmere Xeon chips). I think I saw you post in the huge Xeon thread so you've already found a great resource.


----------



## Eric1285

Edit - Nevermind, misread post.


----------



## prey1337

Quote:


> Originally Posted by *Eric1285*
> 
> You'll have to check for your particular motherboard (pretty sure I remember the Sabertooth working) but usually you just need to make sure you have the latest BIOS (or new enough that they added support for the Xeon chips) and it should drop in and just work. Overclocking is a bit different than with your i7 since they're on different processes (45nm for your i7, 32nm for the Westmere Xeon chips). I think I saw you post in the huge Xeon thread so you've already found a great resource.


Ya they say my Sabertooth is supported.
Definitely will need to check the bios. Should be updated, but it was a while back when I put in that board. Probably 2010-2011.

That thread is great, wealth of information, just a lot to sift through.

Just trying to find the chip I want at a decent price.

Need to back my SSD up too.

I'm guessing that will sort out my graphics score (still confused why test 1 is so low) and obviously my physics/combined scores.


----------



## Joossens

Quote:


> Originally Posted by *prey1337*
> 
> Alright, just installed my GTX 1070 SC.
> 
> Firestrike Graphics score of 13985 at stock settings: http://www.3dmark.com/fs/9137780
> 
> Firestrike Graphics score of 14322 after my 1st round of overclocking: http://www.3dmark.com/fs/9138795
> 
> Settings below:
> 
> 
> GPU Z:
> 
> 
> 2050MHz when boosting during Firestrike:
> 
> 
> Now I feel like it's overclocking pretty well without much tinkering. I set the fan curve for aggressive which helped and it's still quiet compared to my Hyper 212+ CPU cooler.
> 
> Now for the questions.
> My overall firestrike score is pretty dismal compared to everyone else's.
> Partly because of my old i7 920 CPU OC'ed to 3.8GHz. That's dropping my physics and combined categories pretty bad.
> 
> But also my FPS drops pretty low in the 1st graphics test for some reason, which is impacting that score a lot.
> 
> I did not do a clean driver install, not sure if that will help.
> 
> Any suggestions folks?
> 
> Edit 1:
> -Ran Doom on ultra, maxed out AA, motion blur low, averaging around 100fps. Usually sits much higher than that.
> -Ran Ashes of the Singularity on Crazy, DX12, averaging 60fps or so, I think the CPU is to blame for that lower FPS.
> -This is all with 1 monitor at 1080p.
> 
> Edit 1:
> -Added Heaven score below, still looks low compared to everyone's scores. CPU to blame?
> -Heaven was showing 2088MHz also, but Precision was showing closer to 2000.


I own the same CPU, but my mobo only supports PCIe 2.0, so maybe yours does too?
I read earlier in this thread about someone who had his card running in PCIe 2.0 mode and was getting low scores as well.


----------



## prey1337

Quote:


> Originally Posted by *Joossens*
> 
> I own the same CPU, but my mobo only supports PCIe 2.0, so maybe yours does too?
> I read earlier in this thread about someone who had his card running in PCIe 2.0 mode and was getting low scores as well.


I think that was x2 instead of x16; I wouldn't think there's that much of a difference between PCIe 2.0 and 3.0 (you can see mine is 2.0 in GPU-Z).


----------



## jlhawn

Ordered mine on July 1 and received it today!! YA-HOO!


----------



## jlhawn

Quote:


> Originally Posted by *prey1337*
> 
> Ya they say my Sabertooth is supported.
> Definitely will need to check the bios. Should be updated, but it was a while back when I put in that board. Probably 2010-2011.
> 
> That thread is great, wealth of information, just a lot to sift through.
> 
> Just trying to find the chip I want at a decent price.
> 
> Need to back my SSD up too.
> 
> I'm guessing that will sort out my graphics score (still confused why test 1 is so low) and obviously my physics/combined scores.


The new GTX 1070 and 1080 run fine in the Sabertooth X58. I just installed my MSI GTX 1070 Gaming X in my Sabertooth X58 with an i7 970, on a 2010 motherboard BIOS.


----------



## Airrick10

Quote:


> Originally Posted by *jlhawn*
> 
> Ordered mine on July 1 and received today!! YA-HOO!


Sweet!!! I hope yours is a better overclocker than mine


----------



## prey1337

Quote:


> Originally Posted by *jlhawn*
> 
> the new GTX 1070 and 1080 run fine in Sabertooth X58, I just installed my MSI GTX 1070 Gaming X in my Sabertooth X58 i7 970 with a 2010 mother board bios.


Are your scores similar to mine? I think that will validate whether or not the CPU really is tanking the benchmarks.

The card runs great, and so do games, but the benchmarks leave much to be desired.


----------



## jlhawn

Quote:


> Originally Posted by *Airrick10*
> 
> Sweet!!! I hope yours is a better overclocker than mine


So far, with just a quick +150 core clock adjustment in Afterburner, I got 2138MHz.


----------



## jlhawn

Quote:


> Originally Posted by *prey1337*
> 
> Are your scores similar to mine? I think that will validate whether or not the CPU really is tanking the benchmarks.
> 
> The card runs great, and so do games, but the benchmarks leave much to be desired.


Haven't benchmarked yet as I just installed the GPU 30 mins ago.
I know the GTX 970 and 980 I had benchmarked well in Heaven but just awful in FireStrike.


----------



## Airrick10

Quote:


> Originally Posted by *jlhawn*
> 
> so far with just a quick adjustment of +150 core clock in Afterburner I got 2138mhz


If you can keep that +150 stable, then it looks very promising!!!







I could never go past +100/105 on the core, but I could hit around +700 on the memory. Regardless, I still love my card!


----------



## prey1337

Quote:


> Originally Posted by *jlhawn*
> 
> Haven't benchmarked yet as I just installed the gpu 30 mins ago.
> I know my GTX 970 and 980 I had benchmarked good in Heaven but just awful in FireStrike.


I'm interested to see the results in both Firestrike and Heaven.

You and I have the same mobo, but you have 2 more cores and a larger cache than I do.

Are you at the stock 3.2GHz?


----------



## jlhawn

Quote:


> Originally Posted by *prey1337*
> 
> I'm interested to see the results in both Firestrike and Heaven.
> 
> You and I have the same mobo, but you have 2 more cores and a larger cache than I do.
> 
> Are you at the stock 3.2GHz?


I will run those this evening. Yes, my CPU is stock at 3.2GHz, but at 99% load it boosts to 3.4GHz.


----------



## jlhawn

Quote:


> Originally Posted by *Airrick10*
> 
> If you can keep that +150 stable, then it looks very promising!!!
> 
> 
> 
> 
> 
> 
> 
> I could never go past +100/105 on the core but I could hit around +700 on the memory though. Regardless, I still love my card!


it held up in my favorite game (so far) but I will do more testing with both core and memory.


----------



## Airrick10

Quote:


> it held up in my favorite game (so far) but I will do more testing with both core and memory.


Cool! Looking forward to your benchmarks!


----------



## criminal

Quote:


> Originally Posted by *jlhawn*
> 
> it held up in my favorite game (so far) but I will do more testing with both core and memory.


What game would that be, if you don't mind me asking? I have put mine through so many tests. I have an FE, but I am solid at +200 core / +550 memory in everything. In some things I am good all the way to +260 core and +670 memory.


----------



## luan87us

Just ordered my Asus GTX 1070 Strix from Newegg with 3-day shipping. Can't wait to test this beast out with my new Skylake build. The Strix is going to look very sexy with my Asus ROG Maximus VIII mobo.


----------



## Sea Otter

Quote:


> Originally Posted by *luan87us*
> 
> Just ordered my Asus GTX 1070 Strix from newegg with 3 days shipping. Can't wait to test this beast out with my new skylake build. The Strix is going to look very sexy with my Asus ROG Maximus VIII mobo.


Just got my Strix in the system, replacing my EVGA 1070 SC (which I returned). It's great. Overclocks very easily to 2050MHz, hits around 98FPS in Unigine Heaven. Stays under 75C. Great great card.


----------



## Ka0sX

Bought an ASUS ROG Strix GeForce GTX 1070 OC 8GB yesterday; will have it installed with drivers in a couple of hours.


----------



## Blackfyre

I pulled the trigger last Friday and bought the only remaining *MSI GTX 1070 Gaming X* left in local stock in Perth, Australia.



*Full-Quality:* http://i.imgur.com/ghbB8Iu.png

I upgraded from a *Gigabyte HD 7970 OC Edition* after over a decade on *team red*.

Although I had done a few builds for friends & family over the years with nVidia cards, I've personally been on red for a long time.

I am saddened to leave team red in a way, but I am happy with this card, very satisfied. It has been a massive upgrade over the HD 7970, and pushing myself to wait a few months was definitely the right decision. I almost pulled the trigger on a GTX 980 Ti for $1100 AUD not even 2 months ago; now I got a GTX 1070 for $779 AUD.


----------



## luan87us

Quote:


> Originally Posted by *Sea Otter*
> 
> Just got my Strix in the system, replacing my EVGA 1070 SC (which I returned). It's great. Overclocks very easily to 2050MHz, hits around 98FPS in Unigine Heaven. Stays under 75C. Great great card.


Very nice. I can't wait to see how it will pair with the i7 6700k. I'm upgrading from i5 3470 with 7850 so it's going to be a big jump.


----------



## Airrick10

Quote:


> Originally Posted by *Blackfyre*
> 
> I pulled the trigger last Friday and bought the only remaining *MSI GTX 1070 Gaming X* left in local stock in Perth, Australia.
> 
> 
> 
> *Full-Quality:* http://i.imgur.com/ghbB8Iu.png
> 
> I upgraded from a *Gigabyte HD 7970 OC Edition* after spending over a decade of being on *team red*.
> 
> Although I had built a few builds for friends & family over the years with nVidia cards, personally I have been a red for a long time.
> 
> I am saddened to leave team red in a way, but I am happy with this card, very satisfied. It has been a massive upgrade over the HD7970; and I can say that pushing myself to wait a few months was definitely the right decision to make. I almost pulled the trigger and bought a GTX 980 Ti for $1100 AUD not even 2 months ago, now I got a GTX 1070 for $779 AUD.


Great Score and nice overclock!!! Welcome to team Green, you won't regret it!


----------



## Blackfyre

Quote:


> Originally Posted by *Airrick10*
> 
> Great Score and nice overclock!!! Welcome to team Green, you won't regret it!


To be honest I can push the memory to +800MHz easily without any issues whatsoever, and I haven't even tested beyond that. It might go further, who knows. I just didn't want to risk it too much and stayed on the safe side. Also, the differences in score become minimal past +600MHz on memory.

It's too bad I can't overclock the core past +125MHz (_or effectively past 2100MHz; crashes and artifacts galore_); I'd have preferred that over any memory overclockability.


----------



## Hunched

I originally just posted this in the Gigabyte BIOS tweaking topic here (which I used for my 970) and figured I may as well copy/paste it here too. So here it is...

I've finished my extensive benchmarking of my Gigabyte 1070 WindForce OC and I'm pretty disappointed, at best it's "lower middle-class" if that. I came from nearly a top 1% GTX 970 which doesn't help.
Here's my journey.

1. Setup
I started off after a DDU uninstall and complete clean install of just the newest driver and PhysX without changing anything in NVCP.
I ran a Firestrike benchmark without touching a thing right after.
Stock Firestrike Graphics Score = *18731*

2. Core Overclocking
I installed MSI AB and maxed the power and temp limits (111%/92c) and increased the core clock by +10mhz each successful run.
I stopped at +110mhz where it usually crashed before a run could finish, and settled at +100mhz as there was no artifacting or any visual weirdness, just program crashes at +110mhz.
With only temp and power limit maxed and a core of +100mhz I got a Firestrike Graphics Score of *19348*.
With the exact same settings but the fan at 100% the whole run I got a graphics score of *19498*.
*19498 GPU-Z Results*


Spoiler: Warning: Spoiler!







3. Memory Overclocking
With an established weak +100MHz core increase that stays under 2000MHz 99% of the time under load, I moved on to memory.
Increasing in increments of 100MHz, it wasn't until +700MHz that visual artifacting began. It persisted at +650MHz, so I settled on +600MHz.
With this my Firestrike Graphics Score ended up at *20600*.
*20600 GPU-Z Results*


Spoiler: Warning: Spoiler!







4. The Frustration
I'm disappointed that my core is rarely ever over 2000mhz while others are getting 2100mhz+ with ease, and graphics scores of 21500+ while I'm stuck around 20500.
It isn't even 100% stable at 2000mhz either, it crashed once after hours of running. It looks like I'll probably have to lower things even further if I have issues while gaming, memory included.

What makes no sense is how the core clock causes crashes, there are no signs of instability visually or from GPU-Z like you would expect.
When the memory was reaching its OC limits, visual errors appeared before 3DMark or anything else crashed, you would expect this of the core as well.
When the core crashes, the driver never crashes, nothing spikes or drops prior to it happening in GPU-Z, there are no artifacts or visual issues. It's like something pulls the plug.

More frustration comes from the fact that *increasing core voltage does nothing to help*. It gives 0 overclocking headroom and in fact seems to add instability to your overclock.








Running the clocks that cause 3DMark to hard crash causes BF4 to completely lock up and freeze after a short while. Again, no artifacts or visual errors of any kind.
No display driver crashes, no visual instability, no GPU-Z drops or spikes prior to crashing; it's as if it just doesn't get enough power for a split second, though the voltage does not drop.
The way the core becomes unstable and crashes is sudden and weird, I feel there's something wrong with the BIOS possibly causing this limit.
There should be other signs of instability somewhere, this is abnormal and abrupt.

Here are GPU-Z results of a crash with the voltage maxed out. Besides the maxed core voltage the settings are identical from Part 2 Core Overclocking, the core is still only +100mhz but the increased voltage gave a higher max core. The max core was not what GPU-Z recorded right before the crash, it was at just 2012mhz.
Increasing core voltage from 1.0620 to 1.0930 makes it crash more, and doesn't allow for any more mhz on the core clock.








*If I increase my core voltage, I have to lower my core clock to maintain stability*.









Spoiler: Warning: Spoiler!







5. Conclusion
So I'm not too happy. My GTX 970 had a Firestrike Graphics Score of 14318 for what it's worth with this same system. Top tier 970 to bottom tier 1070.
I suppose at the end of the day, 14318 to 20600 is a 43.87% increase, and my stock 1070 score of 18731 to 20600 is a 9.97% increase.
This is assuming I don't have to lower my core clock to +90MHz or less, which I may have to, which is pathetic; hopefully it will hold up in games since it usually does in Firestrike.
Mem might have to go down to +575/550mhz if I ever see anything weird again but that's still pretty good, better luck than my core OC quite obviously.

If I could have achieved a score of 21750 like some people on 3DMark, that would be a pretty juicy 16% boost over my stock score.
I can't check my ASIC with GPU-Z to see just how bad it is since it's not yet supported, but I'm guessing 65% or less.
Lastly, my fans have been weirdly loud occasionally during RPM changes, they hit some real turbulence or something, vibrations. Hopefully that happens less.

Maybe I will be able to hit 2100MHz core when custom BIOSes are available, which should make increasing core voltage ACTUALLY HELP.
That could be useful: a voltage increase that adds OC headroom and stability instead of ruining it.
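The step-up approach in part 2 (raise the core offset by +10MHz per successful run, stop at the first failure, back off one step) can be sketched as a small search loop. This is just an illustration, not a real tuning tool: `is_stable` is a hypothetical stand-in for an actual full benchmark pass, and the offsets and limits are arbitrary.

```python
def find_max_stable_offset(is_stable, start=0, limit=300, step=10):
    """Walk the clock offset up in fixed steps until the stability
    test fails, then report the last offset that passed.

    is_stable: callable taking an offset (MHz) and returning True
               if a full benchmark run completes at that offset.
    Returns the highest passing offset, or None if even the start fails.
    """
    best = None
    offset = start
    while offset <= limit:
        if is_stable(offset):
            best = offset    # this offset survived a full run
            offset += step   # try the next increment
        else:
            break            # first failure ends the search
    return best

# Simulated card that, like the post describes, crashes at +110 and above.
best = find_max_stable_offset(lambda mhz: mhz < 110)
print(best)  # 100
```

In practice you would re-test the winner for hours (as the post notes, a card can pass short runs at an offset and still crash later), so treat the loop's answer as a starting point, not a verdict.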


----------



## prey1337

Did some more tweaking, used DDU to do clean uninstall of the drivers and then installed fresh.
Started using Afterburner instead of PrecisionX.

Power Limit: 112% (Temp Limit 92C)
Core Clock: +110 (Dropped to +105 for stability reasons)
Mem Clock: +600

Heaven has a much better score now:


I gained a decent bump in the graphics category in Firestrike as well: http://www.3dmark.com/fs/9158103
Went from 14322 to 15274.

Seeing around 2062MHz under full load. And had a good gain on the Mem clock:


We will see how much the Xeon X5675 helps me out when I get that upgrade taken care of.

Edit: I wanted to add that it's insane how the temps don't even hit 60C; my old 560 Ti was a space heater compared to this thing. And it's so quiet in comparison, with no coil whine either.
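For reference, the graphics-score jump above works out like this (trivial arithmetic, numbers taken straight from the post):

```python
def percent_gain(before, after):
    """Percentage improvement from one benchmark score to another."""
    return (after - before) / before * 100

# Firestrike graphics score before and after the overclock.
print(round(percent_gain(14322, 15274), 2))  # 6.65
```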


----------



## Blackcurrent

Quote:


> Originally Posted by *prey1337*
> 
> Did some more tweaking, used DDU to do clean uninstall of the drivers and then installed fresh.
> Started using Afterburner instead of PrecisionX.
> 
> Power Limit: 110% (Temp Limit 90C)
> Core Clock: +110
> Mem Clock: +600
> 
> Heaven has a much better score now:
> 
> 
> I gained a decent bump in the graphics category in Firestrike as well: http://www.3dmark.com/fs/9158103
> Went from 14322 to 15274.
> 
> Seeing around 2062MHz under full load. And had a good gain on the Mem clock:
> 
> 
> We will see how much the Xeon X5675 helps me out when I get that upgrade taken care of.


Very nice


----------



## Bdonedge

Okay, so with Gigabyte's "Gaming OC mode" in their program, my G1's 3DMark score is 15643.

That seems low compared to what a lot of y'all are getting.

Running a stock 6700k.

Any tips or ideas on what I'm doing wrong?


----------



## Pragmatist

Quote:


> Originally Posted by *Bdonedge*
> 
> Okay so - with gigabyte "gaming OC mode" on their program my G1 my 3d mark score is 15643
> 
> That seems low compared to what a lot of yall are getting.
> 
> Running a stock 6700k
> 
> Any tips or ideas on what I'm doing wrong


I take it that's your total score and not your graphics score? If so, that's what most of us are getting score-wise.

Here's mine for comparison.


----------



## Bdonedge

Quote:


> Originally Posted by *Pragmatist*
> 
> I take it that's your total score and not your graphics score? If so, that's what most of us are getting score wise.
> 
> Here's mine for comparison.


Yeah, total score. Graphics score I'm at 19827.

Sorry, I'm new to this; is overall score not the topic of conversation here?
Should I only be looking at the graphics score?


----------



## deegzor

Thought I'd share my MSI GTX 1070 FE OC for reference.



Full run can be found here -> http://www.3dmark.com/fs/9158268


----------



## versions

Quote:


> Originally Posted by *Bdonedge*
> 
> Yeah total score. Graphics score I'm at 19827
> 
> sorry new to this - is overall score not this topic of conversation?
> I should only be looking at graphics score?


When discussing graphics cards, graphics score is generally what people are talking about, yes. Overall score depends on what CPU you have, which makes for poor comparisons: one person could be running a 6600K and someone else a 6800K, in which case they'd get very different overall scores despite having similar graphics scores. Unless you have a very significant CPU bottleneck, the processor won't affect graphics score very much.
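To illustrate why the CPU drags on the overall number: 3DMark combines the sub-scores as a weighted harmonic mean. The weights below are the ones published in the 3DMark technical guide for Fire Strike (graphics 0.75, physics 0.15, combined 0.10); treat the exact values as an assumption, since they can differ between tests and versions.

```python
def fire_strike_overall(graphics, physics, combined,
                        weights=(0.75, 0.15, 0.10)):
    """Weighted harmonic mean of the three Fire Strike sub-scores.

    A weak physics (CPU) score pulls the overall score down even
    when the graphics score is high, which is why GPU comparisons
    use the graphics score alone.
    """
    wg, wp, wc = weights
    return 1 / (wg / graphics + wp / physics + wc / combined)

# Same GPU (same graphics score), two hypothetical CPUs: the
# overall scores diverge even though the card is identical.
print(round(fire_strike_overall(19800, 9000, 6500)))
print(round(fire_strike_overall(19800, 14000, 8500)))
```

A harmonic mean is dominated by its smallest term, so a 6700K and an old i7 920 paired with the same 1070 land far apart overall while posting nearly the same graphics score.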


----------



## Blackfyre

Quote:


> Originally Posted by *Bdonedge*
> 
> Okay so - with gigabyte "gaming OC mode" on their program my G1 my 3d mark score is 15643
> 
> That seems low compared to what a lot of yall are getting.
> 
> Running a stock 6700k
> 
> Any tips or ideas on what I'm doing wrong


That's normal; what's your graphics score?

When you see people posting their scores, it's usually their graphics scores, in the 19000 to 21000 range.

Comparing graphics scores is fairer between GPUs, as opposed to overall scores that combine both the graphics score and the physics score (_which is purely based on your CPU power_).


----------



## Hunched

Does anyone else have the Gigabyte 1070 WindForce OC? Curious about others' experiences.
Feels bad man, not even being stable at 2000MHz core...

Every single person I've seen who has overclocked their 1070, in every thread on every site, has done better than me... I've yet to find anyone struggling to get above 2000MHz; everyone else is struggling to hit 2100MHz.


----------



## Ysbzqu6572

I have the Gigabyte G1 version and it is stable at ~2060MHz in Overwatch without bumping up the voltage. Mine has trouble with memory; it can't get over +360 without artifacts.
But I am happy with it.


----------



## Pharao86

Hey guys,

Just joined the club with my brand new Gigabyte G1 1070. I've been busy overclocking, and so far this is what I've got.



This is the highest result I've gotten so far, and I've been gaming on these settings for 6 hours in a row; it seems stable.

What do you guys think?


----------



## Blackfyre

Quote:


> Originally Posted by *Ysbzqu6572*
> 
> I have the Gigabyte G1 version and it is stable at ~2060Mhz in overwatch without bumping up voltage, mine has trouble with memory, can't get over +360 without artifacts afterwards.
> But I am happy with it.


My *MSI GTX 1070 Gaming X* can't go past 2100MHz without crashes or artifacts, but yeah, it's always fluctuating between 2050MHz and 2100MHz stable. Memory I can push even to +750MHz using AB.

I had a choice between the Gigabyte G1 and the MSI Gaming X and went with the MSI because it looked better built and it needed more power (_8+6 pin_), which suggested better headroom. Exactly the same price here in Australia, too.


----------



## Pharao86

Here is my firestrike score.



Is this any good?


----------



## Eorzean

Quote:


> Originally Posted by *Blackfyre*
> 
> My *MSI GTX 1070 Gaming X* can't go past 2100Mhz without crashes or artifacts, but yeah it's always fluctuating between 2050Mhz and 2100Mhz stable. Memory I can push even to +750MHz using AB.
> 
> I had a choice between the Gigabyte G1 & MSI Gaming X and went with the MSI because it looked like a better built + it needed more power (_8+6_), which gave me an indication for better headroom. Same price exactly too here in Australia.


That's a good card. The most I can add to my GX is +100 on the core (bringing it to around 2025, I think) and +500 on the memory. Any higher on either and I crash and/or start seeing artifacts.

What are your power settings at? And default or OC bios? I found the OC bios gave me an extra 25Mhz on the core, bringing it to 2050. You might be able to squeeze a little more out of it changing to that bios, if you're not already on it.


----------



## Blackfyre

*I have a general question that I want to ask everyone, it's in BOLD at the bottom of this post:*
Quote:


> Originally Posted by *Eorzean*
> 
> That's a good card. The most I can add to my GX is +100 on the core (bringing it to around 2025 I think) and +500 on the memory. Any higher for either I crash and/or start seeing artifacts.
> 
> What are your power settings at? And default or OC bios? I found the OC bios gave me an extra 25Mhz on the core, bringing it to 2050. You might be able to squeeze a little more out of it changing to that bios, if you're not already on it.


I did more precise overclocking and thorough stress testing to determine stability:

*Core Voltage Percentage* = 0%
*Power Limit* = 126%
*Core* = +106 MHz
*Memory* = +666

That's what I am currently using. I found that if I increase *Core Voltage Percentage* to *100%*, the core clock sticks to 2063MHz continuously WITHOUT fluctuating at all, but when I have it at 0% it decreases to 2050MHz after a bit and then stabilises between 2025 and 2050MHz (_fluctuating between the two_). Out of safety I don't want to run Core Voltage Percentage at 100%, so I turned it off.

So yes, increasing Core Voltage Percentage does give it more stability at the MAX CLOCK, but it didn't let me break the 2063MHz barrier; it turns out 2088MHz, just like 2100MHz, crashes the driver or causes artifacts when stress testing.

So my actual barrier is 2063MHz on the CORE. Going past that causes crashes or artifacts.

No, I never updated my BIOS. I read that some people ended up with a worse experience, while others reported that their cards got 'nerfed' (_became worse_) after updating the BIOS. None of them provided evidence, though. I also think I remember a few saying their power limit decreased after the BIOS update. I wish it had a dual-BIOS option so I could test and switch to the other BIOS if things turned out badly.

I do have 2 questions, the first is for you EORZEAN, the second is a general question:

1. With your updated BIOS, what is the maximum POWER LIMIT percentage you can increase to using MSI AfterBurner?

*2. Is keeping Core Voltage Percentage in MSI Afterburner @ 100% safe? Or can driver updates and the such cause a spike that would mess up the card? I know I read that the GPU's are hardware voltage locked via regulators, is that true? So is it completely safe to keep it on 100% all the time in MSI AB and leave it there?*


----------



## GunnzAkimbo

OK!

Lowest price in AUS so far = $650 for the Gainward Phoenix.

http://www.netplus.com.au/product/VDGFGTX1070%2D14/Gainward_GeForce_GTX1070/

Keep in mind, a 1080 will cost $1100

http://www.diycomputers.com.au/product.asp?id=15793


----------



## Schneeder

Ordered a ASUS ROG Strix 1070 last night. Excited to get it!


----------



## Rebellion88

I'm looking to upgrade from an AMD 380X. Has anyone else made a similar upgrade and knows what sort of performance jump to expect? I'm looking to push into 1440p in single-card territory.


----------



## Blackfyre

Quote:


> Originally Posted by *Rebellion88*
> 
> I'm looking to upgrade from a AMD 380X anyone else upgraded similar to me and know what sort of performance jump to expect? I'm looking to push into 1440p in single card terrority.


I upgraded from an HD 7970, which was re-branded as the 280X, which was re-branded as the 380X (_maybe with a minimal performance increase_). The performance jump to the GTX 1070 was massive.

1440p gaming should be achievable in most if not ALL games (_especially after overclocking_).


----------



## Eorzean

Quote:


> Originally Posted by *Blackfyre*
> 
> I do have 2 questions, the first is for you EORZEAN, the second is a general question:
> 
> 1. With your updated BIOS, what is the maximum POWER LIMIT percentage you can increase to using MSI AfterBurner?


Wish I could answer that for you, but I pulled it from my case and put it back in its package. Decided to switch to the 1080 (going 1440p ultrawide and don't think this'll cut it). Hopefully someone else could answer that for you. If you wanted to check for yourself, the BIOS zip comes packaged with the Gaming (stock) BIOS as well as the OC one, and you can switch back and forth pretty easily (you just run the batch file and it asks which one you want flashed).

Link: https://ca.msi.com/Graphics-card/support/GEFORCE-GTX-1070-GAMING-X-8G.html#down-bios

To be honest, I didn't really feel the need to use it either. I ended up flashing back after a few Firestrike runs.


----------



## Blackfyre

Quote:


> Originally Posted by *Eorzean*
> 
> Wish I could answer that for you, but I pulled it from my case and put it back in its package. Decided to switch to the 1080 (going 1440p ultrawide and don't think this'll cut it).


Congratulations man, if it wasn't for the fact that the 1080 costs a kidney here in Australia, I'd have definitely got one instead of the GTX 1070.


----------



## Eorzean

Quote:


> Originally Posted by *Blackfyre*
> 
> Congratulations man, if it wasn't for the fact that the 1080 costs a kidney here in Australia, I'd have definitely got one instead of the GTX 1070.


Thanks. I think I'm more excited about the new monitor than the 1080 though, lol.


----------



## CaptainZombie

These prices are so all over the board it's not even funny. You have some retailers charging $459 for the Gaming X and others $479 to $499. I wish these prices would stabilize already. My local retailer now has the Gaming X up to $489. LOL!


----------



## criminal

Quote:


> Originally Posted by *Hunched*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I originally just posted this in the Gigabyte BIOS tweaking topic here (which I used for my 970) and figured I may as well copy/paste it here too. So here it is...
> 
> I've finished my extensive benchmarking of my Gigabyte 1070 WindForce OC and I'm pretty disappointed, at best it's "lower middle-class" if that. I came from nearly a top 1% GTX 970 which doesn't help.
> Here's my journey.
> 
> 1. Setup
> I started off after a DDU uninstall and complete clean install of just the newest driver and PhysX without changing anything in NVCP.
> I ran a Firestrike benchmark without touching a thing right after.
> Stock Firestrike Graphics Score = *18731*
> 
> 2. Core Overclocking
> I installed MSI AB and maxed the power and temp limits (111%/92c) and increased the core clock by +10mhz each successful run.
> I stopped at +110mhz where it usually crashed before a run could finish, and settled at +100mhz as there was no artifacting or any visual weirdness, just program crashes at +110mhz.
> With only temp and power limit maxed and a core of +100mhz I got a Firestrike Graphics Score of *19348*.
> With the exact same settings but the fan at 100% the whole run I got a graphics score of *19498*.
> *19498 GPU-Z Results*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 3. Memory Overclocking
> With an established weak +100mhz core increase that stays under 2000mhz 99% of the time under load... I moved onto memory.
> Increased in increments of 100mhz it wasn't until +700mhz that visual artifacting began. It persisted at +650mhz so I settled on +600mhz.
> With this my Firestrike Graphics Score ended up at *20600*.
> *20600 GPU-Z Results*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 4. The Frustration
> I'm disappointed that my core is rarely ever over 2000mhz while others are getting 2100mhz+ with ease, and graphics scores of 21500+ while I'm stuck around 20500.
> It isn't even 100% stable at 2000mhz either, it crashed once after hours of running. It looks like I'll probably have to lower things even further if I have issues while gaming, memory included.
> 
> What makes no sense is how the core clock causes crashes, there are no signs of instability visually or from GPU-Z like you would expect.
> When the memory was reaching its OC limits, visual errors appeared before 3DMark or anything else crashed, you would expect this of the core as well.
> When the core crashes, the driver never crashes, nothing spikes or drops prior to it happening in GPU-Z, there are no artifacts or visual issues. It's like something pulls the plug.
> 
> More frustration comes from the fact that *increasing core voltage does nothing to help*. It gives 0 overclocking headroom and in fact seems to add instability to your overclock.
> 
> 
> 
> 
> 
> 
> 
> 
> Running these clocks that cause 3DMark to hard crash in Battlefield 4 cause BF4 to completely lock up and freeze after a short while. Again, no artifacts or any visual errors of any kind.
> No display driver crashes, no visual instability, no GPU-Z drops or spikes prior to crashing, it's as if it just doesn't get enough power for a split second, though the voltage does not drop.
> The way the core becomes unstable and crashes is sudden and weird, I feel there's something wrong with the BIOS possibly causing this limit.
> There should be other signs of instability somewhere, this is abnormal and abrupt.
> 
> Here are GPU-Z results of a crash with the voltage maxed out. Besides the maxed core voltage the settings are identical from Part 2 Core Overclocking, the core is still only +100mhz but the increased voltage gave a higher max core. The max core was not what GPU-Z recorded right before the crash, it was at just 2012mhz.
> Increasing core voltage from 1.0620 to 1.0930 makes it crash more, and doesn't allow for any more mhz on the core clock.
> 
> 
> 
> 
> 
> 
> 
> 
> *If I increase my core voltage, I have to lower my core clock to maintain stability*.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 5. Conclusion
> So I'm not too happy. My GTX 970 had a Firestrike Graphics score of 14318 with this same system, for what it's worth. Top-tier 970 to bottom-tier 1070.
> I suppose at the end of the day, 14318 to 20600 is a 43.87% increase, and my stock 1070 score of 18731 to 20600 is a 9.97% increase.
> This is assuming I don't have to lower my core clock to +90mhz or less, which I may have to, which is pathetic. Hopefully it will hold up in games, since it usually does in Firestrike.
> Mem might have to go down to +575/550mhz if I ever see anything weird again, but that's still pretty good; better luck than my core OC, quite obviously.
> 
> If I could have achieved a score of 21750 like some people on 3DMark, that would be a pretty juicy 16% boost over my stock score.
> I can't check my ASIC with GPU-Z to see just how bad it is since it's not yet supported, but I'm guessing 65% or less.
> Lastly, my fans have been weirdly loud occasionally during RPM changes, they hit some real turbulence or something, vibrations. Hopefully that happens less.
> 
> Maybe I will be able to hit 2100mhz core when custom BIOS's are available, which will make it so increasing your core voltage ACTUALLY HELPS.
> That could be useful, having a voltage increase that increases OC headroom and stability instead of ruining it


Sorry you seem to have gotten burned by the silicon lottery. That's the way it goes sometimes, though. I upgraded from a 980 that would only game at 1425/7600, so I am rather happy with my 1070. Besides the limited overclocking and BIOS control we have at the moment, it's one of my favorite GPUs I have ever had. Regarding voltage, I don't think adding more voltage helps overclock scaling on Pascal; I have had much better luck with my clocks leaving the voltage at default. My issue is running into the power limit!
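For anyone wanting to double-check gains like the ones in the quoted post, the percentage math is simple. A minimal sketch using the Firestrike Graphics scores quoted above:

```python
def pct_gain(old_score, new_score):
    """Percentage increase going from old_score to new_score."""
    return (new_score - old_score) / old_score * 100

# Firestrike Graphics scores from the quoted post
print(pct_gain(14318, 20600))  # GTX 970 -> overclocked GTX 1070
print(pct_gain(18731, 20600))  # stock GTX 1070 -> overclocked GTX 1070
print(pct_gain(18731, 21750))  # stock GTX 1070 -> a top 3DMark result
```

The first two work out to roughly 44% and 10%, matching the figures in the post.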


----------



## Cakewalk_S

When are GTX 1070 mini's supposed to come out? I'm honestly surprised how big the current PCB's are for a card with such a low TDP...


----------



## Ysbzqu6572

Quote:


> Originally Posted by *Blackfyre*
> 
> My *MSI GTX 1070 Gaming X* can't go past 2100Mhz without crashes or artifacts, but yeah it's always fluctuating between 2050Mhz and 2100Mhz stable. Memory I can push even to +750MHz using AB.
> 
> I had a choice between the Gigabyte G1 & MSI Gaming X and went with the MSI because it looked better built + it needed more power (_8+6_), which gave me an indication of better headroom. Exact same price here in Australia, too.


I love MSI cards too, and I own an MSI motherboard, but the MSI 1070 was around 100 euros more than my current Gigabyte G1, so there was absolutely no way it was worth it.


----------



## LiquidHaus

Quote:


> Originally Posted by *Rebellion88*
> 
> I'm looking to upgrade from an AMD 380X. Has anyone else made a similar upgrade and knows what sort of performance jump to expect? I'm looking to push into 1440p in single-card territory.


Quote:


> Originally Posted by *Blackfyre*
> 
> I upgraded from a HD7970, which was re-branded as the 280X, which was re-branded as the 380X (_maybe with a minimal performance increase_). The performance jump was massive to the GTX 1070.
> 
> 1440p gaming should be achievable in most if not ALL games (_especially after overclocking_).


i'll be right there with you guys.

i'm currently waiting for my 1070 amp extreme to get here on friday, and i'll be ripping out my four 7970s for it.

there are so many games out there that aren't supporting even 3-way crossfire profiles. my 4th card is pretty much just for looks. can never use it.

but even with multiple cards, i am expecting huge gains in performance. and I'm currently playing at 3440x1440.


----------



## HAL900

only interests me how you clock during the test aida if you have a graph oc?
Quote:


> Originally Posted by *Blackfyre*
> 
> I pulled the trigger last Friday and bought the only remaining *MSI GTX 1070 Gaming X* left in local stock in Perth, Australia.
> 
> 
> 
> *Full-Quality:* http://i.imgur.com/ghbB8Iu.png
> 
> I upgraded from a *Gigabyte HD 7970 OC Edition* after spending over a decade of being on *team red*.
> 
> Although I had built a few builds for friends & family over the years with nVidia cards, personally I have been a red for a long time.
> 
> I am saddened to leave team red in a way, but I am happy with this card, very satisfied. It has been a massive upgrade over the HD7970; and I can say that pushing myself to wait a few months was definitely the right decision to make. I almost pulled the trigger and bought a GTX 980 Ti for $1100 AUD not even 2 months ago, now I got a GTX 1070 for $779 AUD.


Clock in gpu-z in aida single and dual precision?


----------



## Blackfyre

Quote:


> Originally Posted by *HAL900*
> 
> only interests me how you clock during the test aida if you have a graph oc?
> Clock in gpu-z in aida single and dual precision?


What?









I'm lost, what are you asking exactly? Please elaborate...


----------



## HAL900

You do not know what a clock and aida and single and dual precyzion and gpu-z ??


----------



## supermi

Quote:


> Originally Posted by *HAL900*
> 
> You do not know what a clock and aida and single and dual precyzion and gpu-z ??


He's asking for HAL9000's assistance. His name used to be Dave, and didn't you have another zero in your name?


----------



## HAL900

boring


----------



## Bdonedge

Quote:


> Originally Posted by *Hunched*
> 
> Does anyone else have the Gigabyte 1070 WindForce OC? Curious about others experiences.
> Feels bad man, not even being stable at 2000mhz core...
> 
> Every single person I've seen who has overclocked their 1070, in every thread on every site, has done better than me... I've yet to find anyone struggling to get above 2000mhz; everyone else is struggling to reach 2100mhz.


I have a G1 and can't get mine stable at 2k even with voltage increase


----------



## Hunched

Quote:


> Originally Posted by *criminal*
> 
> Sorry you seem to have gotten burned by the silicon lottery. That's the way it goes sometimes, though. I upgraded from a 980 that would only game at 1425/7600, so I am rather happy with my 1070. Besides the limited overclocking and BIOS control we have at the moment, it's one of my favorite GPUs I have ever had. Regarding voltage, I don't think adding more voltage helps overclock scaling on Pascal; I have had much better luck with my clocks leaving the voltage at default. My issue is running into the power limit!


My power limit is an issue too, especially if I increase the core voltage, but that doesn't help anyway.

You see in the 3 GPU-Z pics I posted, it's constant VRel and Pwr limit caps, blue and green perf caps.
It hits the limits more after overclocking both the core and memory, with GPU-Z giving readings such as 112.4% max power consumption. My card's limit is 111%.









Isn't the Founders Edition power limit 112% max? Why in the hell does my Gigabyte 1070 or any aftermarket card have less than that? I have 111%.
It's only 1% less but still... why is Gigabyte giving me a BIOS that is more limited in any way by any amount? It's supposed to be improved upon...
Even more of a reason for a custom BIOS
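To put those power-limit percentages in perspective, they're multipliers on the board's base power target. A rough sketch, assuming the 150 W reference GTX 1070 power target (AIB boards may use a different base):

```python
BASE_POWER_W = 150  # reference GTX 1070 power target; AIB boards may differ

def limit_to_watts(limit_pct):
    """Convert a GPU-Z power-limit percentage to an approximate wattage cap."""
    return BASE_POWER_W * limit_pct / 100

print(limit_to_watts(111))  # this card's 111% cap
print(limit_to_watts(112))  # the Founders Edition's 112% cap
```

So the 1% difference is only about 1.5 W, but the principle still stands.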








Quote:


> Originally Posted by *Bdonedge*
> 
> I have a G1 and can't get mine stable at 2k even with voltage increase


Well at least I'm not the only one then.







Felt like I was the only one in the world struggling with 2000mhz on the core from all the forums I've been reading about the 1070.
Maybe we're the only 2

Nvidia has done the impossible: increasing the voltage doesn't help, it's broken. It's best just to leave it at default to keep heat lower and stop it from power throttling so much.

I'm real damn sick of Nvidia chasing this thermal and power efficiency dream by crippling everything in the path to get it. Shouldn't performance be priority #1?
Limiting your card's power so it can't heat up as much, and power throttles all day, isn't power efficiency.
It's insufficient power delivery, so you can brag about how you're not allowing your GPUs to draw enough power and how cool they run because they can't get sufficient voltage?









I can't wait for Volta, where every card runs off PCI-E power only and power throttles out of the box.
But they will run so cool and use so little power!


----------



## BulletSponge

LOL, just about the time I think the 1070 Gaming X is the best MSI card on air, MSI DISHES OUT ITS THIRD FAMILY OF GRAPHICS CARDS BASED ON GTX 10 SERIES OF GPUS.


----------



## Dude970

Quote:


> Originally Posted by *BulletSponge*
> 
> LOL, just about the time I think the 1070 Gaming X is the best MSI card on air, MSI DISHES OUT ITS THIRD FAMILY OF GRAPHICS CARDS BASED ON GTX 10 SERIES OF GPUS.


I have been waiting for my Gaming X for weeks argggggg







I'm not going back in a wait queue, I will just OC the X


----------



## BulletSponge

Quote:


> Originally Posted by *Dude970*
> 
> I have been waiting for my Gaming X for weeks argggggg
> 
> 
> 
> 
> 
> 
> 
> I'm not going back in a wait queue, I will just OC the X


Same here, a Z BIOS will be available soon.


----------



## Dude970

Quote:


> Originally Posted by *BulletSponge*
> 
> Same here, a Z BIOS will be available soon.


Good plan


----------



## Blackfyre

Quote:


> Originally Posted by *HAL900*
> 
> You do not know what a clock and aida and single and dual precyzion and gpu-z ??


No I meant your question was grammatically incorrect and I couldn't understand what exactly you're asking of me? And I still don't understand. Sorry. Please rephrase your original question, what exactly do you want to know?
Quote:


> Originally Posted by *BulletSponge*
> 
> Same here, a Z BIOS will be available soon.


Looking forward to flashing a Z BIOS if it's not too complicated once it's out. I'll wait for many of you to test first though


----------



## Hunched

Anyone with a Gigabyte G1 or WindForce OC care to check whether or not their fans make an unusual sound when exiting 0rpm mode?
When my WindForce OC fans start spinning up from 0rpm they make a buzzing/grinding sound. It should be a smooth transition from complete silence to just the sound of the fans spinning, but it's not.
They hit turbulence and vibrate or something of the sort.

Performance and silence are the most important things to me, and since my 1070 has failed both I'm going to RMA on the basis of defective fans and get another shot at the lottery.
I'm curious if the fan issue is common and if I'll likely encounter it again. Maybe I should just go for an MSI 1070 as I believe they have the highest quality fans on a 1070.
Thanks


----------



## AuraNova

Does anyone else have a Zotac AMP card, or am I the only one so far? I feel kind of alone in this.









I just wanted to know out of curiosity, nothing much more than that.


----------



## Hunched

Here's a video of the Gigabyte 1070 WindForce OC fan issue I have which I mentioned 2 posts back.


----------



## chin0

Quote:


> Originally Posted by *Hunched*
> 
> Anyone with a Gigabyte G1 or WindForce OC care to check whether or not their fans make an unusual sound when exiting 0rpm mode?
> When my WindForce OC fans start spinning up from 0rpm they make a buzzing/grinding sound. It should be a smooth transition from complete silence to just the sound of the fans spinning, but it's not.
> They hit turbulence and vibrate or something of the sort.
> 
> Performance and silence are the most important things to me, and since my 1070 has failed both I'm going to RMA on the basis of defective fans and get another shot at the lottery.
> I'm curious if the fan issue is common and if I'll likely encounter it again. Maybe I should just go for an MSI 1070 as I believe they have the highest quality fans on a 1070.
> Thanks


Definitely RMA it. I just received my Gigabyte G1 1070 today and it is really quiet. Makes no noise and fans do not turn on until after 60C.


----------



## Blackfyre

Quote:


> Originally Posted by *Hunched*
> 
> Here's a video of the Gigabyte 1070 WindForce OC fan issue I have which I mentioned 2 posts back.


Huh, sounds like one of them is hitting a wire or something every time it's about to spin up; it's a weird noise. Definitely not normal. But look around all the fans and see if any of them are being touched by something poking up at them (_obviously don't unscrew or force anything away, you might void your warranty_).


----------



## JackCY

Quote:


> Originally Posted by *Tasm*
> 
> Mine is struggling to do + 50 MHz at the moment....i am thinking in sending it back


Yeah coz it's totally faulty when it doesn't OC well hey?...


----------



## Swiftes

ordering a 1070, question is I am unsure which to get!

I want a FE for the looks and potential SLI capabilities in the future (will fit nicer in my m-atx setup)

but do they run hot and will I get a better OC with a non ref card?

FE cards seem to be £30 cheaper than a non ref.

decisions...


----------



## Blackfyre

Quote:


> Originally Posted by *Swiftes*
> 
> ordering a 1070, question is I am unsure which to get!
> 
> I want a FE for the looks and potential SLI capabilities in the future (will fit nicer in my m-atx setup)
> 
> but do they run hot and will I get a better OC with a non ref card?
> 
> FE cards seem to be £30 cheaper than a non ref.
> 
> decisions...


*MSI GTX 1070 Gaming X* (_or if you're waiting then Gaming Z is coming out_), it's both the quietest and best performing GTX 1070 based on general consensus.

I purchased it and have been very impressed. I came from a card which had the Gigabyte triple fan system on it and this MSI card is so much quieter.


----------



## Blze001

Quote:


> Originally Posted by *Mad Pistol*
> 
> Honestly, I love my 1070 FE. It's a great card that looks awesome! Will AIB cards outmatch it in performance? Yep. Do I care? Not really.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I actually thought about returning my MSI 1070 FE and keeping the Nvidia 1070 FE. However, I looked on the MSI 1070 FE, and there is absolutely no MSI branding on it at all. I have a feeling that all Founders Edition cards are like this; manufacturer branding is not allowed on these cards. In other words, no one would know I bought a card sold by MSI unless I told them.


My EVGA FE doesn't have any markings either, I think they all look the same.

I got the Founder's Edition because I have a Mini-ITX case. Figures the only card I can use in my case is the one Nvidia artificially jacks the MSRP up on...


----------



## Swiftes

Quote:


> Originally Posted by *Blackfyre*
> 
> *MSI GTX 1070 Gaming X* (_or if you're waiting then Gaming Z is coming out_), it's both the quietest and best performing GTX 1070 based on general consensus.
> 
> I purchased it and have been very impressed. I came from a card which had the Gigabyte triple fan system on it and this MSI card is so much quieter.


Sadly the MSI card is £469.99 here which is ridiculous!

thinking the Zotac 1070 AMP for £419.99: https://www.overclockers.co.uk/zotac-geforce-gtx-1070-amp-edition-8192mb-gddr5-pci-express-graphics-card-gx-103-zt.html


----------



## 303869

Quote:


> Originally Posted by *Swiftes*
> 
> Sadly the MSI card is £469.99 here which is ridiculous!
> 
> thinking the Zotac 1070 AMP for £419.99: https://www.overclockers.co.uk/zotac-geforce-gtx-1070-amp-edition-8192mb-gddr5-pci-express-graphics-card-gx-103-zt.html


I ordered the EVGA FE but still out of stock at scan







Been over 2 weeks since I ordered it now. Looks like I should have bought from Overclockers!


----------



## Swiftes

Quote:


> Originally Posted by *RyReZar*
> 
> I ordered the EVGA FE but still out of stock at scan
> 
> 
> 
> 
> 
> 
> 
> Been over 2 weeks since I ordered it now. Looks like I should have bought from Overclockers!


Overclockers have got the 980Ti AMP Omega for £359.99 which is making things even more difficult...


----------



## 303869

Quote:


> Originally Posted by *Swiftes*
> 
> Overclockers have got the 980Ti AMP Omega for £359.99 which is making things even more difficult...


Yeah, that is very tempting, good price as well...


----------



## BulletSponge

Amazon again taking pre-orders for the MSI 1070 Gaming X for $449. Estimated to be in stock on July 11th.


----------



## Swiftes

bit the bullet and went for the Zotac 1070 FE, £409.99, 5 year warranty, delivery tomorrow.

The AMP would have dwarfed my case and taken up 3 slots; I need a 2-slot card for SLI use in a few months...


----------



## prey1337

Quote:


> Originally Posted by *Hunched*
> 
> Here's a video of the Gigabyte 1070 WindForce OC fan issue I have which I mentioned 2 posts back.


Yea that does not sound right at all.

As mentioned, it could be hitting something, or maybe the bearings aren't functioning properly.
I have an old case fan that requires I whack it a few times to get it to spin quietly, I have a feeling it's the bearings as well.

I would send it back if it's driving you crazy.


----------



## CaptainZombie

My plan is to pickup a 1070 today since they are finally trickling into stores. I am looking at the Gaming X and the EVGA SC ACX 3.0, overall which is the better card? I use mini-ITX cases so I do like the EVGA cards since they are smaller usually. It looks like the Gaming X and EVGA SC ACX 3.0 perform pretty closely.


----------



## prey1337

Quote:


> Originally Posted by *CaptainZombie*
> 
> My plan is to pickup a 1070 today since they are finally trickling into stores. I am looking at the Gaming X and the EVGA SC ACX 3.0, overall which is the better card? I use mini-ITX cases so I do like the EVGA cards since they are smaller usually. It looks like the Gaming X and EVGA SC ACX 3.0 perform pretty closely.


I don't have a history with MSI except for their software, so I can't speak to their card quality, but I had to go with a card that fit in my full tower (a 7-year-old case), and the EVGA cards were the only non-blower cards that fit.

Just take your measurements and assume nothing. I learned the hard way when I bought a G1 and it didn't fit by a good inch.
The MSI cards are also pretty tall/wide, so it would have been pressed against my side panel.


----------



## Phixit

My GIGABYTE GeForce GTX 1070 G1 Gaming is on the way (on vehicle for delivery)!

Any benchmarks you guys recommend? I'll run a few passes of 3DMark Fire Strike + Heaven Benchmark.


----------



## CaptainZombie

Quote:


> Originally Posted by *prey1337*
> 
> I don't have a history with MSI except for their software so I can't speak on their card quality, but I had to go with a card that fit in my full tower (7 year old case) and the EVGA cards were the only ones that fit that were not blower cooled.
> 
> Just take your measurements and assume nothing. I learned the hard way when I bought a G1 and it didn't fit by a good inch.
> The MSI cards are also pretty tall/wide, so it would have been pressed against my side panel.


I actually had an MSI 970 in my system at one point, which fit tightly in a 250D. I think this card is the same size as their 970.

I'm just also trying to determine if the Gaming X offers anything more over the EVGA SC ACX 3.0. I was looking at some specs and reviews and I'm not seeing much difference, but would love some owners' opinions.


----------



## prey1337

Quote:


> Originally Posted by *CaptainZombie*
> 
> I actually had an MSI 970 in my system at one point, which fit tightly in a 250D. I think this card is the same size as their 970.
> 
> I'm just also trying to determine if the Gaming X offers anything more over the EVGA SC ACX 3.0. I was looking at some specs and reviews and I'm not seeing much difference, but would love some owners' opinions.


Well my EVGA SC is great for my needs.

Boosts up to 2062MHz without hassle, haven't tinkered too much with that. Probably could squeeze out some more.

It's quiet, cools very well. Doesn't quite get to 60C under load.

I think the MSI's run cooler though due to their larger fans.


----------



## LiquidHaus

a lot of you guys are gonna be team Gigabyte and team MSI, with a few team EVGA. am I the only one that snagged the Zotac Amp Extreme? let's have an overclock competition!


----------



## Blackfyre

Quote:


> Originally Posted by *Phixit*
> 
> My GIGABYTE GeForce GTX 1070 G1 Gaming is on the way (On vehicle for delivery) !
> 
> Any benchmarks you guys recommend ? I'll run a few 3DMark Fire Strike + Heaven Benchmark.


Whatever benchmarks you want to include, but those two are a good start








Quote:


> Originally Posted by *lifeisshort117*
> 
> a lot of you guys are gonna be team Gigabyte and team MSI, with a few team EVGA. am I the only one that snagged the Zotac Amp Extreme? let's have an overclock competition!


If I was team Zotac looking at this, I'd make sure you get a really good card that overclocks well. It's just good advertisement







come on team Zotac, send him what he deserves.


----------



## Pragmatist

So close to reaching 20000 graphics score........

Also, overclocking the memory too much gave me a lower score as someone mentioned on this or the 1080 thread. In any case, I'll keep trying until I hit 20000 or higher.


----------



## boldenc

Should I go with the EVGA GTX 1070 SC or the Gigabyte GTX 1070 G1 ?
both are available for pre-order on Amazon.
I ordered both but I need to cancel one of them.


----------



## BulletSponge

Quote:


> Originally Posted by *boldenc*
> 
> Should I go with the EVGA GTX 1070 SC or the Gigabyte GTX 1070 G1 ?
> both are available for pre-order on Amazon.
> I ordered both but I need to cancel one of them.


I'd go with EVGA, but that is primarily because I've had such good experiences with their customer service.


----------



## LiquidHaus

Quote:


> Originally Posted by *Blackfyre*
> 
> If I was team Zotac looking at this, I'd make sure you get a really good card that overclocks well. It's just good advertisement
> 
> 
> 
> 
> 
> 
> 
> 
> come on team Zotac, send him what he deserves.


lol well, i was probably one of the first ones to order one from the stock that B&H Photo got, so maybe the first ones were binned a bit better!

my goal is 2300 core but we'll see about it haha


----------



## Akameta

I got mine yesterday :thumb:! MSI Gaming X. I overclocked it by 125 on the core and 700 on the memory, and now my boost clock is about 2075-2100. Fan speed at 60% is barely audible; in fact, I think the noise is coming from my Corsair SP120. Idle temps are about 35C and under load it reaches 70C. I chose to skip EVGA this time around (previously had a stock 470 and a 780 ACX SC) due to bad ACX cooler performance in the past; the card got pretty loud in order to keep itself under 80C. Also, I heard that EVGA's 1080 ACX 3.0 had bad coil whine. It may be just rumors, but it still made me a bit worried about getting the EVGA version of the card. Overall, pretty satisfied with the MSI's performance. Hope it goes the same for everyone!

P.S. Still using the EVGA Precision 1080 version because MSI Afterburner would not work properly...
P.P.S. Sorry for the bad English)


----------



## mickeykool

Quote:


> Originally Posted by *Pragmatist*
> 
> So close to reaching 20000 graphics score........
> 
> Also, overclocking the memory too much gave me a lower score as someone mentioned on this or the 1080 thread. In any case, I'll keep trying until I hit 20000 or higher.


Keep in mind, drivers can also impact scores. I was getting a little over 2000 with the previous driver; now I'm getting just under 2000 with the updated driver.


----------



## Pragmatist

Quote:


> Originally Posted by *mickeykool*
> 
> Keep in mind, drivers can also impact scores. I was getting a little over 2000 with the previous driver; now I'm getting just under 2000 with the updated driver.


That's interesting, I'll look into it later. Thanks for the heads up.


----------



## supermodjo

Add me to the club, guys. Gigabyte 1070 G1, boosts to 1962 out of the box, on water. Short stability tests at 2152 core / 9000 mem, 43 degrees Celsius max on water. Very happy with this card, coming from a GTX 970.


----------



## Blackfyre

Quote:


> Originally Posted by *supermodjo*
> 
> Add me to the club, guys. Gigabyte 1070 G1, boosts to 1962 out of the box, on water. Short stability tests at 2152 core / 9000 mem, 43 degrees Celsius max on water.


Damn, nice. Can you post some benchmark scores? Heaven, Valley, and Firestrike.

Quote:


> Originally Posted by *mickeykool*
> 
> Keep in mind, drivers can also impact scores. I was getting a little over 2000 with the previous driver; now I'm getting just under 2000 with the updated driver.


*Firestrike GRAPHICS Score:*

*20726* vs *20741*

Before and after the driver update.

Drivers had absolutely no impact on my benchmark results.


----------



## LiquidHaus

hahaha dang dude you didn't waste any time. awesome sauce.


----------



## supermodjo

Heaven run at 2152 core / 9000 mem


----------



## supermi

Quote:


> Originally Posted by *supermodjo*
> 
> 
> 
> 
> Add me to the club, guys. Gigabyte 1070 G1, boosts to 1962 out of the box, on water. Short stability tests at 2152 core / 9000 mem, 43 degrees Celsius max on water. Very happy with this card, coming from a GTX 970.


What kind of vrm water block is that, does it cover all the vrm???

Looks good BTW!


----------



## supermodjo

Quote:


> Originally Posted by *supermi*
> 
> What kind of vrm water block is that, does it cover all the vrm???


It cools the GPU core and the VRM. It's a universal block; I used it on my GTX 970 too. And yes, it cools the entire VRM.


----------



## iZeroFive

After seeing that the Gaming X doesn't make a huge OC difference with that extra 6-pin connector, I think NVIDIA is intentionally limiting these cards for some reason. I finally decided and went with the EVGA SC when it became available to order yesterday. In Guru3D's review the ACX 3.0 is super silent, just like the Gaming X, and the temperatures are not that different. Since the extra PCB width of the Gaming X doesn't make a huge difference, as a Manta user I think the EVGA SC is the better option for me, even if the Gaming X is the most popular AIB solution for Pascal right now.


----------



## LiquidHaus

Quote:


> Originally Posted by *supermi*
> 
> What kind of vrm water block is that, does it cover all the vrm???
> 
> Looks good BTW!


I know which one it is.

it's the Heatkiller Universal DIY GPU Block.

Link: http://shop.watercool.de/epages/WatercooleK.sf/en_GB/?ObjectPath=/Shops/WatercooleK/Products/17020

> EDIT: now that I've looked at his photo closely, the phases on his GPU aren't being cooled. When I watercool cards at work, the EK blocks are always designed to cool the phases as well as the VRMs.


----------



## CaptainZombie

Quote:


> Originally Posted by *prey1337*
> 
> Well my EVGA SC is great for my needs.
> 
> Boosts up to 2062MHz without hassle, haven't tinkered too much with that. Probably could squeeze out some more.
> 
> It's quiet, cools very well. Doesn't quite get to 60C under load.
> 
> I think the MSI's run cooler though due to their larger fans.


I ended up going with the EVGA SC, considering that you and many others are getting some pretty decent boosts.


----------



## Sea Otter

Quote:


> Originally Posted by *CaptainZombie*
> 
> I ended up going with the EVGA SC, considering that you and many others are getting some pretty decent boosts.


Interesting. In my experience with my 1070 SC, it wasn't overclocking much over stock speeds, and load temps were around 75C. I may have gotten a dud. But then again, it's a reference board, and I don't really expect much from reference designs.


----------



## prey1337

Quote:


> Originally Posted by *CaptainZombie*
> 
> I ended up going with the EVGA SC, considering that you and many others are getting some pretty decent boosts.


Glad to help!

Quote:


> Originally Posted by *Sea Otter*
> 
> Interesting. With my experiences with my 1070 SC, it wasn't overclocking much over stock speeds, and load temps were around 75C. I may have gotten a dud. But then again, it's a reference board, and I don't really expect much from reference designs.


Did you adjust the fan curve? With the stock curve I was seeing much higher temps.
I don't see why these aren't set to more aggressive speeds from the factory, when they're still very quiet.


----------



## Tasm

Quote:


> Originally Posted by *iZeroFive*
> 
> After seeing that the Gaming X doesn't make a huge OC difference with that extra 6-pin connector, I think NVIDIA is intentionally limiting these cards for some reason. I finally decided and went with the EVGA SC when it became available to order yesterday. In Guru3D's review the ACX 3.0 is super silent, just like the Gaming X, and the temperatures are not that different. Since the extra PCB width of the Gaming X doesn't make a huge difference, as a Manta user I think the EVGA SC is the better option for me, even if the Gaming X is the most popular AIB solution for Pascal right now.


Yes, it makes no difference at all.

They are BIOS locked at 1.09V max...


----------



## supermi

Quote:


> Originally Posted by *lifeisshort117*
> 
> I know which one it is.
> 
> it's the Heatkiller Universal DIY GPU Block.
> 
> Link: http://shop.watercool.de/epages/WatercooleK.sf/en_GB/?ObjectPath=/Shops/WatercooleK/Products/17020
> 
> EDIT: now that I looked at his photo closely, the phases on his gpu aren't being cooled. when I watercool cards at work, the EK blocks always are designed to cool the phases as well as the vrms


And which one, the 40 or the 60? I might get a few if I can cover most of the important VRM bits.


----------



## CaptainZombie

Quote:


> Originally Posted by *Sea Otter*
> 
> Interesting. With my experiences with my 1070 SC, it wasn't overclocking much over stock speeds, and load temps were around 75C. I may have gotten a dud. But then again, it's a reference board, and I don't really expect much from reference designs.


I was reading in another thread that you need to adjust the fan curve like prey mentioned.
Quote:


> Originally Posted by *prey1337*
> 
> Glad to help!
> Did you adjust the fan curve? With the stock curve I was seeing much higher temps.
> I don't see why these aren't set at more semi-aggressive speeds when they're very quiet still.


Thanks for the advice. I'll run the card through its paces and see if I end up liking it. If not I have 30 days to exchange towards the Gaming X.


----------



## prey1337

Did some more tweaking, I think I'm satisfied for now until the new processor comes in.

Personal best today in Firestrike: http://www.3dmark.com/fs/9199210

New stable settings, benchmarked in Ashes of the Singularity for that (seems to be the most sensitive):


Didn't see any gains going up to +600 on the memory clock so I dropped it to the +500's.


----------



## iZeroFive

Quote:


> Originally Posted by *prey1337*
> 
> Did some more tweaking, I think I'm satisfied for now until the new processor comes in.
> 
> Personal best today in Firestrike: http://www.3dmark.com/fs/9199210
> 
> New stable settings, benchmarked in Ashes of the Singularity for that (seems to be the most sensitive):
> 
> 
> Didn't see any gains going up to +600 on the memory clock so I dropped it to the +500's.


Can you show your fan curve settings?

Also, at what level do the ACX 3.0's fans become loud?


----------



## prey1337

Quote:


> Originally Posted by *iZeroFive*
> 
> Can you show your fan curve settings?
> 
> Also, at what level do the ACX 3.0's fans become loud?


Sure!


Not sure about the loudness, my CPU fan is the loudest thing in my system so I've never heard the card louder than it.
Plus the card doesn't quite get to 60C, so my fans stay just under 75%.
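For anyone curious what a custom curve actually is under the hood, it's just a piecewise-linear map from temperature to fan duty. An illustrative sketch; these breakpoints are made up for the example, not my actual settings:

```python
# Hypothetical (temperature C, fan duty %) breakpoints -- illustration only
CURVE = [(30, 30), (50, 60), (60, 75), (80, 100)]

def fan_duty(temp_c):
    """Linearly interpolate the fan duty for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin to max duty above the last breakpoint

print(fan_duty(55))  # halfway between the 50C and 60C breakpoints
```

Tools like Precision and Afterburner do essentially this interpolation between the points you drag on the curve.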


----------



## CaptainZombie

My ASIC is at 60.2%.


----------



## Ysbzqu6572

Quote:


> Originally Posted by *CaptainZombie*
> 
> My ASIC is at 60.2%.


How do you know?
The newest version of GPU-Z says that the card is not supported.


----------



## CaptainZombie

Quote:


> Originally Posted by *H4wk*
> 
> How do you know?
> The newest version of GPU-Z says that the card is not supported.


I might be on an older version then.


----------



## Offender_Mullet

Just ordered an EVGA 1070 FTW on Newegg. It's listed as 'back ordered', so hopefully my order will go through once they get stock in.

I've been without a dedicated GPU for about a month now. I couldn't wait any longer for the AIB RX 480 cards, so I decided to spend more. I haven't had an Nvidia GPU in quite some time.







Damn these jacked-up prices though.


----------



## Bdonedge

So it seems like my card is overclocking by default with the Gigabyte program. I have the G1, and Doom has been crashing randomly; HWMonitor shows the max core speed at 2050 MHz when this happens. I have to open the program and manually click "Eco" mode every single time, and I've noticed it doesn't crash when I do that. Any suggestions?


----------



## Hunched

Am I correct in saying the MSI Gaming X is the quietest 1070? I don't think anything else uses fans as big or as quiet.
Bigger fans = lower RPM = less noise.

I've seen that the Palit is supposed to be very quiet as well, but that's not available in NA.


----------



## Pereb

Quote:


> Originally Posted by *prey1337*
> 
> Did some more tweaking, I think I'm satisfied for now until the new processor comes in.
> 
> Personal best today in Firestrike: http://www.3dmark.com/fs/9199210
> 
> New stable settings, benchmarked in Ashes of the Singularity for that (seems to be the most sensitive):
> 
> 
> Didn't see any gains going up to +600 on the memory clock so I dropped it to the +500's.


Get Afterburner 4.3.0 Beta 4. You can increase the voltage to 1093 mV and get a few extra MHz with the curve editor. I couldn't get past 2037 on 4.2.0, but I'm stable at 2088 with the beta.


----------



## XenoRad

Hello guys,

I've been a long time nVidia user except for the last two rounds where I went with AMD. I'm now looking to replace my 290x for 1440p gaming and it seems this round nVidia has the better offering.

I've set my eyes on the Gigabyte GTX 1070 G1 Gaming. Can those of you who already have this card recommend it? Are there any issues with the cooler design, power delivery, noise, artifacts, etc?

My last card from Gigabyte was a GTX 460 1 Gb model and was very pleased with it.

Thanks.


----------



## Phixit

Quote:


> Originally Posted by *XenoRad*
> 
> Hello guys,
> 
> I've been a long time nVidia user except for the last two rounds where I went with AMD. I'm now looking to replace my 290x for 1440p gaming and it seems this round nVidia has the better offering.
> 
> I've set my eyes on the Gigabyte GTX 1070 G1 Gaming. Can those of you who already have this card recommend it? Are there any issues with the cooler design, power delivery, noise, artifacts, etc?
> 
> My last card from Gigabyte was a GTX 460 1 Gb model and was very pleased with it.
> 
> Thanks.


I got mine last night and managed to put +120 MHz on the core and +600 MHz on the memory. I'm not sure about the voltage % in the Gigabyte software; I've set it to 75% with a +11% power limit. I'll need to adjust the fan speed curve though, as it looks like the card is doing some thermal throttling.

I think the MSI Gaming X OC is better for overclocking with its 10+1 power phases and the extra 6-pin. I paid $569 CAD for the Gigabyte ($438 USD), and the cheapest MSI Gaming X OC was $634 CAD ($488). That's a premium price for a few more MHz.

I know Gigabyte announced a 1070 Extreme Gaming with an extra 6-pin connector and a higher OC out of the box, but I don't think suppliers have it yet.


----------



## prey1337

Quote:


> Originally Posted by *Pereb*
> 
> Get Afterburner 4.3.0 Beta 4. You can increase the voltage to 1093 mV and get a few extra MHz with the curve editor. I couldn't get past 2037 on 4.2.0, but I'm stable at 2088 with the beta.


Thanks for the tip, I didn't realize there was a beta out.


----------



## chiefkeif21

Quote:


> Originally Posted by *XenoRad*
> 
> Hello guys,
> 
> I've been a long time nVidia user except for the last two rounds where I went with AMD. I'm now looking to replace my 290x for 1440p gaming and it seems this round nVidia has the better offering.
> 
> I've set my eyes on the Gigabyte GTX 1070 G1 Gaming. Can those of you who already have this card recommend it? Are there any issues with the cooler design, power delivery, noise, artifacts, etc?
> 
> My last card from Gigabyte was a GTX 460 1 Gb model and was very pleased with it.
> 
> Thanks.


Just got mine last night; so far so good. I haven't tried overclocking it yet, but at 100% load in Overwatch it never breaks 65C at a boost of 1965 MHz, with the fans at a slight hum (around 60% fan speed, I think? Not sure). The build quality doesn't seem the best, but it's quiet and cool, so I can't complain.







No coil whine that I can hear, although I have seen other people having coil whine issues with this model of 1070.

Using it on a 1440p/144hz display and it's dishing out crazy frames, would recommend.


----------



## XenoRad

Thanks guys. I'll be picking up my G1 later today. Hopefully it will be a good card and serve me well.


----------



## criminal

Quote:


> Originally Posted by *CaptainZombie*
> 
> I might be on an older version then.


I wouldn't trust an older version. I think the ASIC on these cards is going to need to be read differently than before. That's why the newest version says it is not supported.


----------



## CaptainZombie

Quote:


> Originally Posted by *criminal*
> 
> I wouldn't trust an older version. I think the ASIC on these cards is going to need to be read differently than before. That's why the newest version says it is not supported.


Ok, thanks! I keep wondering if overall the MSI GX will be a better card over the EVGA SC due to the extra power it can take for OC.


----------



## jhatfie

I had $150 in Newegg gift cards and $300 sitting in my PayPal account, so I bit and got my MSI GTX 1070 Gaming (non-X) yesterday. I've been a fan of MSI's Gaming cooling solutions, as they've always been super quiet and cool well. I still have an MSI Gaming R9 290X and had the GTX 970 and 980 Ti versions as well. I haven't tried to squeeze out max clocks yet, but with upped voltage, +210 core and +550 memory it seems stable thus far: able to loop the Valley benchmark and play Witcher 3 for a couple hours (with v-sync though, so load is usually 90-95%). Under that scenario the core seems to settle in at 2101-2114 MHz with some expected occasional drops. Temps have peaked at 65C with a custom fan profile, which at that temp is about 60% fan. Very quiet as expected.

OC'd performance at 2100/9000 seems to be about the same as my GTX 980 Ti at 1500/8000 (within a few percent one way or the other). Granted, I don't have the 980 Ti to test with any more, so I only have some old results to compare against.


----------



## prey1337

Quote:


> Originally Posted by *jhatfie*
> 
> I had $150 in NewEgg gift cards and $300 sitting in my Paypal account, so i bit and got my MSI GTX 1070 Gaming (non X) yesterday. I have been a fan of the MSI Gaming cooling solutions as they have always been super quiet and cool well. Still have a MSI Gaming R9 290X and had a GTX 970 and 980ti version as well. Have not tried to squeeze out max clocks yet but with upped voltage, +210 core and + 550 memory it seems stable thus far. Able to loop Valley benchmark and play Witcher 3 for a couple hours (with v-sync though so load is 90-95% usually). Core seems to settle in at under that scenario at 2101-2114 with some expected occasional drops. Temps have peaked at 65C with custom fan profile, which at that temp is about 60% fan. Very quiet as expected.
> 
> OC'd Performance at 2100/9000 seems to be about the same as my GTX 980ti at 1500/8000 (within a few % one way or the other). Granted I do not have a 980ti to test with any more though, so I only have some old tests to compare against.


+210 core??


----------



## LiquidHaus

Uhhh, those overclocking numbers are much different than anyone else's. Care to screenshot?


----------



## jhatfie

Quote:


> Originally Posted by *prey1337*
> 
> +210 core??


Yeah, +210 core clock in afterburner. From what I understand the Gaming non X is clocked a bit lower than the X so it needs more to reach a similar OC.


----------



## LiquidHaus

Alright you guys, time to finally add me to the club! Picked up my Zotac Amp Extreme during my lunch break at work, and I've been testing it at work since.

Unfortunately it doesn't seem to be much of an over-the-top clocker. I'm banking on the card really shining once the BIOS is unlocked and voltage can be increased, based on the dual 8-pin and the cooler size.

Zotac's own overclocking utility is very much like MSI Afterburner, and I was able to apply the same settings across the two programs.

Here are some shots I took at work, with a screenshot to show clocks...


-

-

-

-

-


----------



## trelokomio58

Is there a custom BIOS for GTX 1070s?
I own an MSI Gaming X 1070 and would like to try a custom BIOS with unlocked voltage.


----------



## prey1337

Quote:


> Originally Posted by *jhatfie*
> 
> Yeah, +210 core clock in afterburner. From what I understand the Gaming non X is clocked a bit lower than the X so it needs more to reach a similar OC.


Interesting thanks!
Quote:


> Originally Posted by *lifeisshort117*
> 
> alright you guys! time to finally add me into the club. picked up my Zotac Amp Extreme during my lunch break at work, and i've been testing it at work since.
> 
> unfortunately it doesn't seem to be much of an over-the-top clocker. I'm banking on the fact that whenever the bios are unlocked and voltage can be increased, that's when the card will really shine, based on the dual 8-pin and cooler size.
> 
> Zotac's own overclocker utility is very much like MSI Afterburner, and I was able to get the same settings across the two programs.
> 
> here are some shots I took at work with a screenshot to show clocks...
> 
> 
> -
> 
> -
> 
> -
> 
> -
> 
> -


That thing looks intense! Congrats


----------



## supermodjo

Quote:


> Originally Posted by *prey1337*
> 
> Interesting thanks!
> That thing looks intense! Congrats


Quote:


> Originally Posted by *lifeisshort117*
> 
> alright you guys! time to finally add me into the club. picked up my Zotac Amp Extreme during my lunch break at work, and i've been testing it at work since.
> 
> unfortunately it doesn't seem to be much of an over-the-top clocker. I'm banking on the fact that whenever the bios are unlocked and voltage can be increased, that's when the card will really shine, based on the dual 8-pin and cooler size.
> 
> Zotac's own overclocker utility is very much like MSI Afterburner, and I was able to get the same settings across the two programs.
> 
> here are some shots I took at work with a screenshot to show clocks...
> 
> 
> -
> 
> -
> 
> -
> 
> -
> 
> -


2050 with 2x 8-pin? I'm at 2152 on a Gigabyte G1, on water, with only 1x 8-pin. You're either temp limited or it's just the silicon lottery. Like most people have said, one 8-pin is enough for overclocking; it's just the silicon lottery.


----------



## Hunched

Nobody answered, so I went with the MSI Gaming 1070 8G (non-X) after convincing Newegg to allow a refund of my Gigabyte WindForce OC instead of a replacement.
I don't believe anyone else (EVGA, Asus, Gigabyte, Zotac, etc.) has a 1070 as quiet as MSI's Twin Frozr VI.
I'm choosing this card specifically for its silence; it has large fans which can run at low RPM and still provide good cooling.

Also, from what I've heard, MSI has better customer service than everyone but EVGA, and they have service centers in Canada, which is great for me!
The extra 6-pin is nice if BIOS mods ever unlock the ability to take advantage of it, but for now it's basically a useless addition.
Also, AFAIK there is zero difference between the MSI Gaming 8G, Gaming X 8G, and Gaming Z 8G besides out-of-the-box clocks; that's all you pay for.
You're no more likely to get 2100 MHz with your $100+ more expensive Gaming Z 8G than I am with my Gaming 8G; our lottery odds are equal.

So hopefully MSI pulls through with their higher-quality, quieter fans. Please no insane coil whine. I'd be super unlucky if it manages to be a worse overclocker than my Gigabyte WindForce OC.
If everything works out, this should be a nice upgrade in every way. The only good thing about my Gigabyte WindForce OC was zero coil whine; everything else failed expectations.


----------



## LiquidHaus

Quote:


> Originally Posted by *supermodjo*
> 
> 2050 with 2x 8-pin? I'm at 2152 on a Gigabyte G1, on water, with only 1x 8-pin. You're either temp limited or it's just the silicon lottery. Like most people have said, one 8-pin is enough for overclocking; it's just the silicon lottery.


Temp limited at 63C? I don't think so, unless these cards throttle at 65C, which I know isn't the case. I'm waiting for a modded BIOS; with unlocked voltage my card should OC better. I refuse to believe my card can only hit 2050 MHz on the core.


----------



## mypickaxe

Officially joined the ranks. Sold off my single Titan X after much consternation. It was a tough call, but I figured this beats a second (used) TITAN X for SLI.

GTX 1070 Founders Edition, SLI, EK Full Cover water blocks (Nickel) installed, with nickel backplates.

2100 MHz OC boost clock, 9000 MHz GDDR5.


----------



## LiquidHaus

looks awesome man


----------



## mypickaxe

Quote:


> Originally Posted by *lifeisshort117*
> 
> looks awesome man


Thanks. I need to flush the thing completely and redo the pastel blue mix, but it's good for now!


----------



## Tasm

Anyone got the Zotac Amp Extreme?

https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1070-amp-extreme

I am trying to find if it has the voltage locked at 1.09v or not.

What a beast. I'm thinking of sending my MSI back and getting one.


----------



## LiquidHaus

Quote:


> Originally Posted by *Tasm*
> 
> Anyone got the Zotac Amp Extreme?
> 
> https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1070-amp-extreme
> 
> I am trying to find if it has the voltage locked at 1.09v or not.
> 
> What a beast. I am thinking in sending my MSI back and get one.


I do, got it today too!










annnnnd the voltage is indeed locked. pretty sure it's the same bios as all the other 1070 AIBs.

as soon as someone unlocks the bios, or mods it for voltage modification, I have no doubt the card will soar from there. either way, I am happy with my purchase.


----------



## Tasm

Quote:


> Originally Posted by *lifeisshort117*
> 
> I do, got it today too!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> annnnnd the voltage is indeed locked. pretty sure it's the same bios as all the other 1070 AIBs.
> 
> as soon as someone unlocks the bios, or mods it for voltage modification, I have no doubt the card will soar from there. either way, I am happy with my purchase.


Thank you for the answer.

Can you tell me more details about it?

Max boost without OC?

Have you tested OC yet?

Max temp at load?

Is it silent at load?

IMHO, it looks to be the best version out there.


----------



## LiquidHaus

Quote:


> Originally Posted by *Tasm*
> 
> Thank you for the answer.
> 
> Can you tell me more details about it?
> 
> Max boost without OC?
> 
> Have you tested OC yet?
> 
> Max temp at load?
> 
> Is it silent at load?
> 
> IMHO, it looks to be the best version out there.


Max boost without an OC was 1900-something, like 1920 I think.

I got 2050 MHz on mine, but I have a good feeling this one will do better once modded BIOSes come out, since the power delivery on this board is higher than on any other.

Max temps never got above 65C.

And yes, the fans stop spinning at idle, so it's completely silent.


----------



## Phixit

Finally reached the 20k graphics score in Firestrike!


----------



## Blackfyre

Quote:


> Originally Posted by *Phixit*
> 
> Finally reached the 20k graphic scores in Firestrike !


Nice









What's your overclock?


----------



## Phixit

Quote:


> Originally Posted by *Blackfyre*
> 
> Nice
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What's your overclock?


+120MHz on core
+800MHz Memory

I doubt I'll be able to go over +120 MHz on the core... I tried +125 and got display driver errors.

The GPU-Z log shows that it reached 2088-2100 MHz at a voltage of 1.0930 V.


----------



## Mad Pistol

Quote:


> Originally Posted by *Phixit*
> 
> Finally reached the 20k graphic scores in Firestrike !


Nice. I thought I would give it a crack, too, and....

+150/+500, busted 20k



What's tempting is that I have a 2nd GTX 1070 that I am waiting for Nvidia to approve a return on, but it's unopened. I am so tempted to bust it out for some SLI action.


----------



## Phixit

Quote:


> Originally Posted by *Mad Pistol*
> 
> Nice. I thought I would give it a crack, too, and....
> 
> +150/+500, busted 20k
> 
> 
> 
> What's tempting is that I have a 2nd GTX 1070 that I am waiting for Nvidia to approve a return on, but it's unopened. I am so tempted to bust it out for some SLI action.


Nice !

I need an i7. More physics!


----------



## Mad Pistol

Quote:


> Originally Posted by *Phixit*
> 
> Nice !
> 
> I need an i7. more physics !


Lol. The i7 is only good for benchmarks... it doesn't do a damn thing in games.


----------



## Blackfyre

Quote:


> Originally Posted by *Phixit*
> 
> +120MHz on core
> +800MHz Memory
> 
> I doubt I'll be able to go over +120MHz on core .. I tried +125 and got display driver errors.
> 
> GPU-Z log shows that it reached 2088MHz/2100MHz with a voltage of 1.0930.












I have Core @ +106 MHz
And Memory @ +666 MHz

And get around a 20700 graphics score. Tone them down to my settings and test (_I found the lower memory speed to be helpful; I can push 800 MHz too, but 666 gave me better performance_). You can keep Core @ +120 MHz if you're sure it's stable.

If I even go to +110 MHz, the instability begins.

Obviously Core Voltage & Power Limit are both maxed out.


----------



## Tasm

20k here too:



+ 100 Mhz Core.


----------



## mypickaxe

*Rise of the Tomb Raider (Steam copy) PC Patch 7 - DX12, V-Sync Toggle, Async Compute*

I've tested the latest Rise of the Tomb Raider patch (with single and SLI GTX 1070, see rig details in sig) which provides multi GPU support for DX12 as well as Async Compute. I ran the canned benchmark provided with the game each time. The only changes were to SLI enablement and DX11 to DX12.

To simplify the testing procedure, I tested in 1080p with maximum settings. I found that with everything cranked to the maximum in 1440p, DX12 would not run the game on an 8GB GPU. I would need to dial back the AA, but that would not provide a pure comparison between 1080p and 1440p, so I will leave that for another day.

To summarize my findings, moving from DX11 to DX12 with a single GTX 1070 (on my specific PC configuration) resulted in a *decrease* of 2% in overall performance.

*However*, moving from DX11 to DX12 *with multiple GTX 1070s* resulted in an increase of 9% in overall performance.

Also, multiple GPUs scale better with my configuration in DX12 compared to DX11. I see a net increase of 9% moving from DX11 SLI to DX12 multi-gpu.

You can gain roughly 25% in scaling performance when moving from DX11 to DX12 with multiple GPUs compared to a single GPU in DX11 and DX12, respectively.



*Calculated as percent change (increase or decrease)

*Mountain Peak*

DX11 SLI Diff (vs single GPU) 45%
DX12 Multi Adapter Increase (vs single GPU) 69%
DX11 to DX12 Single Adapter Diff -8%
DX11 to DX12 Multi Adapter Diff 8%

*Syria*

DX11 SLI Diff (vs single) 57%
DX12 Multi Adapter Diff (vs single) 68%
DX11 to DX12 Single Adapter Diff -2%
DX11 to DX12 Multi Adapter Diff 5%
*
Geothermal Valley*

DX11 SLI Diff (vs single) 45%
DX12 Multi Adapter Diff (vs single) 75%
DX11 to DX12 Single Adapter Difference -4%
DX11 to DX12 Multi Adapter Diff 16%

*Overall*

DX11 SLI Diff (vs single) 52%
DX12 Multi Adapter Diff (vs single) 70%
DX11 to DX12 Single Adapter Diff -2%
DX11 to DX12 Multi Adapter Diff 10%
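
The Diff figures above are plain percent-change arithmetic. A quick sketch with made-up FPS values (not my actual per-scene numbers, which I haven't pasted here):

```python
def pct_change(old, new):
    """Percent increase (+) or decrease (-) going from old to new."""
    return (new - old) / old * 100.0

# Hypothetical FPS values, just to show the calculation:
single_dx11 = 100.0
sli_dx11 = 152.0     # would report as a +52% "SLI Diff (vs single)"
multi_dx12 = 170.0   # would report as a +70% "Multi Adapter Diff (vs single)"

print(round(pct_change(single_dx11, sli_dx11)))    # 52
print(round(pct_change(single_dx11, multi_dx12)))  # 70
```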

*Takeaways:*

1) SLI / Multiple GPU support offers massive FPS increase in Rise of the Tomb Raider.
2) SLI scaling is very good.
3) Multi GPU support scales better in DX12 compared to DX11.
4) Do not bother with DX12 if you have one GPU at this time. DX11 offers better performance.

I tested by enabling and disabling SLI in the Nvidia control panel. ROTR picked up the change automatically in DX12, and performance was better across the board with SLI enabled. The only place I saw a dropoff in minimum framerates was in the Syria section with DX12 SLI.

AMD may show different results. I'm going to test my R9 Nano later on to see if there's a difference with AMD hardware on this patch.

Test setup:


----------



## Majentrix

Look what I found on the bus.


----------



## mypickaxe

Quote:


> Originally Posted by *Majentrix*
> 
> 
> 
> Look what I found on the bus.


I hope you're taking that to your nearest police station.


----------



## Majentrix

You're damn right I am. Cards this fast should be impounded.


----------



## Mad Pistol

20.5k on the graphics score seems to be all I'm going to get out of this 1070 FE. Any higher on the core, and it errors out. Any higher on the memory, and I get graphical artifacts.

+180/+700


----------



## averian

I managed to squeeze out 21k on Firestrike graphics after I added an EVGA hybrid water cooler to my 1070 FE: http://www.3dmark.com/3dm/13056080



Afterburner settings: +220 Core, +650 Memory.


----------



## mypickaxe

Quote:


> Originally Posted by *averian*
> 
> I managed to squeeze out 21k on Firestrike graphics after I added an EVGA hybrid water cooler to my 1070 FE: http://www.3dmark.com/3dm/13056080
> 
> Afterburner settings: +220 Core, +650 Memory.


Single (Graphics Score 20555)



SLI (Graphics Score: 39217)



OC:



Afterburner Settings +205 Core, +647 Memory.


----------



## jhatfie

Tried my hand at Firestrike. Upped core to +225 and memory to +700. 21140 Graphics Score.
http://www.3dmark.com/3dm/13057696


----------



## Blackfyre

Quote:


> Originally Posted by *mypickaxe*
> 
> *Rise of the Tomb Raider (Steam copy) PC Patch 7 - DX12, V-Sync Toggle, Async Compute*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I've tested the latest Rise of the Tomb Raider patch (with single and SLI GTX 1070, see rig details in sig) which provides multi GPU support for DX12 as well as Async Compute. I ran the canned benchmark provided with the game each time. The only changes were to SLI enablement and DX11 to DX12.
> 
> To simplify the testing procedure, I tested in 1080p with maximum settings. I found that with everything cranked to the maximum in 1440p, DX12 would not run the game on an 8GB GPU. I would need to dial back the AA, but that would not provide a pure comparison between 1080p and 1440p, so I will leave that for another day.
> 
> To summarize my findings, moving from DX11 to DX12 with a single GTX 1070 (on my PC specific configuration) resulted in a *decrease* of 2% overall performance.
> 
> *However*, moving from DX11 to DX12 *with multiple GTX 1070s* resulted in an increase of 9% in overall performance.
> 
> Also, multiple GPUs scale better with my configuration in DX12 compared to DX11. I see a net increase of 9% moving from DX11 SLI to DX12 multi-gpu.
> 
> You can gain roughly 25% in scaling performance when moving from DX11 to DX12 with multiple GPUs compared to a single GPU in DX11 and DX12, respectively.
> 
> 
> 
> *Calculated as percent change (increase or decrease)
> *Mountain Peak*
> 
> DX11 SLI Diff (vs single GPU) 45%
> DX12 Multi Adapter Increase (vs single GPU) 69%
> DX11 to DX12 Single Adapter Diff -8%
> DX11 to DX12 Multi Adapter Diff 8%
> 
> *Syria*
> 
> DX11 SLI Diff (vs single) 57%
> DX12 Multi Adapter Diff (vs single) 68%
> DX11 to DX12 Single Adapter Diff -2%
> DX11 to DX12 Multi Adapter Diff 5%
> *
> Geothermal Valley*
> 
> DX11 SLI Diff (vs single) 45%
> DX12 Multi Adapter Diff (vs single) 75%
> DX11 to DX12 Single Adapter Difference -4%
> DX11 to DX12 Multi Adapter Diff 16%
> 
> *Overall*
> 
> DX11 SLI Diff (vs single) 52%
> DX12 Multi Adapter Diff (vs single) 70%
> DX11 to DX12 Single Adapter Diff -2%
> DX11 to DX12 Multi Adapter Diff 10%
> 
> *Takeaways:*
> 
> 1) SLI / Multiple GPU support offers massive FPS increase in Rise of the Tomb Raider.
> 2) SLI scaling is very good.
> 3) Multi GPU support scales better in DX12 compared to DX11.
> 4) Do not bother with DX12 if you have one GPU at this time. DX11 offers better performance.
> 
> I tested by enabling and disabling SLI in the Nvidia control panel. ROTR picked up the change automatically in DX12, and performance was better across the board with SLI enabled. The only place I saw a dropoff in minimum framerates was in the Syria section with DX12 SLI.
> 
> AMD may show different results. I'm going to test my R9 Nano later on to see if there's a difference with AMD hardware on this patch.
> 
> Test setup:


Mate a lot of work was put into this post, so thank you for writing it up! Interesting results.
























Quote:


> Originally Posted by *averian*
> 
> I managed to squeeze out 21k on Firestrike graphics after I added an EVGA hybrid water cooler to my 1070 FE: http://www.3dmark.com/3dm/13056080
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Afterburner settings: +220 Core, +650 Memory.


Damn water cooling!








You're pretty much getting non-Overclocked GTX 1080 Performance out of your 1070 (_close anyway_). That's crazy.
Quote:


> Originally Posted by *jhatfie*
> 
> Tried my hand at Firestrike. Upped core to +225 and memory to +700. 21140 Graphics Score.
> http://www.3dmark.com/3dm/13057696
> 
> 
> Spoiler: Warning: Spoiler!


You're on a water cooling loop correct? There's no way this was achieved on air. If it is, that's a record.


----------



## jhatfie

Quote:


> Originally Posted by *Blackfyre*
> 
> You're on a water cooling loop correct? There's no way this was achieved on air. If it is, that's a record.


Nope, just my regular MSI GTX 1070 Gaming on air in a 72F/22C room.


----------



## hemon

Quote:


> Originally Posted by *jhatfie*
> 
> Nope, just my regular MSI GTX 1070 Gaming on air in a 72F/22C room.


I bought this card yesterday too: it's a beast!! It's nearly silent in my case (Fractal Define S) even at full speed! I tried OCing and reached +135 over the stock clock speed. Maybe it can go further, but I didn't try because I don't care about that right now, and 2035 MHz is just good for me!

AND: No throttling at all!


----------



## Bdonedge

I think I may not be understanding how core clock and memory clock offsets work in OC programs. I've been doing +150 on the core and +500 on the memory and can't get stable; I couldn't even get +50 on each to be stable in MSI Afterburner.

When you guys say +200 on the core clock, that IS what you're referring to, correct?


----------



## ssgtnubb

Got my G1 in today, running Samsung memory. Haven't messed with the OC yet, but it's boosting to 1969 MHz in Firestrike Ultra.


----------



## Majentrix

This is the best Firestrike score I've managed to get out of my Gainward Phoenix. Almost exactly 5000 points more than my old 390.


----------



## ssgtnubb

Just downloaded Gigabyte's software and switched to OC mode, which bumped me to 1999.5 MHz with no manual OC. I'll have to take some time and tinker with this.


----------



## Bdonedge

Yeah, I can't understand MSI Afterburner. I put in +30 on the core clock and HWMonitor says my speed peaked at 2025 MHz. Obviously it's not 1:1, or am I missing something here?


----------



## averian

Bdonedge - GPU Boost 3.0 makes things a bit confusing since nothing is static, but you have the right idea.

You'll want to use MSI Afterburner 4.3.0 Beta 4 and make sure the power limit is maxed out, then adjust the core and memory clocks to find your max OC. If you want to see in more detail what your card will attempt to boost to, hit CTRL-F in Afterburner. That will give you the voltage/frequency curve (click on the dots).
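
If you'd rather log where boost settles from a script instead of watching the Afterburner graph, you can poll nvidia-smi (it ships with the driver). A rough sketch; the query fields are standard nvidia-smi ones, but treat the exact CSV parsing as an assumption about the output format:

```python
import subprocess

# Query the current graphics clock (MHz) and GPU temperature (C).
QUERY = [
    "nvidia-smi",
    "--query-gpu=clocks.gr,temperature.gpu",
    "--format=csv,noheader,nounits",
]

def parse_smi_line(line: str):
    """Parse one 'clock, temp' CSV line from nvidia-smi into an int tuple."""
    clock, temp = (int(x) for x in line.strip().split(", "))
    return clock, temp

def gpu_clock_and_temp():
    out = subprocess.check_output(QUERY).decode()
    return parse_smi_line(out.splitlines()[0])

# Poll this once a second during a benchmark loop to see where
# GPU Boost 3.0 actually settles, e.g.:
#   while True: print(gpu_clock_and_temp()); time.sleep(1)
```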


----------



## Majentrix

Here's what the Phoenix looks like in my build. The triple slot cooler is an absolute monster.


----------



## Peet1

21218 on Firestrike







, with +750 on the memory, that's like 9500 MHz effective









The card is a Palit GTX 1070 Jetstream with a Palit GTX 1070 GameRock BIOS flashed onto it, and it now has a 114% power target.

http://www.3dmark.com/3dm/12948188
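
The math behind "like 9500 MHz effective", assuming the Afterburner offset adds to the command rate, i.e. half of the quoted effective (quad-pumped) figure, which is how it seems to behave on these cards:

```python
# Rough GDDR5 math behind "+750 -> ~9500 MHz effective".
# Assumption: the memory offset applies to the command rate,
# which is half of the quoted effective transfer rate.
stock_effective = 8008          # MT/s, GTX 1070 stock memory
offset = 750                    # MHz, Afterburner/flashed-BIOS memory offset
effective = (stock_effective / 2 + offset) * 2
print(effective)                # 9508.0, i.e. "like 9500 MHz effective"
```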


----------



## Matt26LFC

This is my best Firestrike score from my WC'ed MSI 1070 FE


----------



## Blackfyre

Quote:


> Originally Posted by *Peet1*
> 
> 21218 on Firestrike
> 
> 
> 
> 
> 
> 
> 
> , with +750 on the memory, thats like 9500mhz effective
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Card is a Palit GTX 1070 Jetstream with a Palit GTX 1070 GameRock BIOS flashed on it and has a 114% Powertarget now.
> 
> http://www.3dmark.com/3dm/12948188


Quote:


> Originally Posted by *jhatfie*
> 
> Nope, just my regular MSI GTX 1070 Gaming on air in a 72F/22C room.


Okay, since you're both on air-cooled GTX 1070s and have scores I haven't seen anyone here reach: can you please both run *3DMark* (_since you both have it_), go to *"STRESS TESTS"*, and choose *"FIRE STRIKE ULTRA STRESS TEST"* from the drop-down menu? Run it (the test takes approximately 20 minutes to finish), then post a picture with the percentage and the GPU-Z sensor screen below it.

Just take a screenshot and post it here or to imgur, then link us to it.

You guys have unbelievable cards if they're truly stable and don't show artifacts or crash. People on water cooling are barely reaching these results, if at all.

*jhatfie*, can you please also check your BIOS version in GPU-Z and report back on that?


----------



## Peet1

@Blackfyre can't do that, sorry brah! Only the 3DMark demo here.


----------



## wrathofbill

Hi, can someone enlighten me as to why my memory overclock is so low? If I go any higher than +200 I get artefacts in the Heaven and Valley benchmarks and a flickering screen in Firestrike.

MSI Founders, +190 core, +200 mem, 70C max temp.

The graphics score is just under 20,000 in Firestrike, so it's not too shabby.

Thanks

https://cdn.pcpartpicker.com/static/forever/images/userbuild/173288.1bb37272d54c951a93dc830c356cb128.378f5c35413f760368fa353301b60a07.1600.jpg


----------



## Peet1

Quote:


> Originally Posted by *wrathofbill*
> 
> Hi, can someone enlighten me as to why my Memory overclock is so low. If I go any higher than +200 I get artefacts on Heaven and Valley Benchmarks and a flickering screen with firestrike.
> 
> Msi Founders +190 core +200 Mem. 70C max temp
> 
> Graphics score is just under 20,000 on Firestrike so its not to shabby.
> 
> Thanks


Maybe they threw in some batches with poorly running GDDR5, who knows.

Edit: Wait, that thought is stupid; my memory is from Samsung.


----------



## Bdonedge

I uninstalled the Gigabyte program I had, and my stock card has gotten a lower score in 3DMark since then. Any ideas what would have caused that?


----------



## CaptainZombie

Quote:


> Originally Posted by *averian*
> 
> I managed to squeeze out 21k on Firestrike graphics after I added an EVGA hybrid water cooler to my 1070 FE: http://www.3dmark.com/3dm/13056080
> 
> 
> 
> Afterburner settings: +220 Core, +650 Memory.


Since the SC ACX 3.0 is a reference board, the EVGA hybrid water cooler should fit with no issues right? I wonder if it would be worth it even though the ACX cooler is not bad.


----------



## AwesomeName

Ok, I considered a 480, then a 1060, and now I'm set on a 1070. This looks like a great balance between performance and cost. To be honest, I just want something that can make Battlefield 1 look amazing. Plus I need something that will last several years.

My question is, does anyone know when free-game-with-purchase offers will start? I can't remember, or wasn't paying attention to, when the deals started with the 980/Ti. I know no one knows for sure, but historically, do they hit relatively close to the launch of the cards, or more along the lines of 6 months to a year later?


----------



## Mad Pistol

Quote:


> Originally Posted by *AwesomeName*
> 
> Ok, I considered a 480, then a 1060, and now I am set on a 1070. This looks like a great balance between performance and cost. To be honest, I just want something that can make Battlefield 1 look amazing. Plus I need something that will last several years.
> 
> My question is, does anyone know when free-game-with-purchase offers will start? I can't remember, or wasn't paying attention, when the deals started with the 980/Ti. I know no one knows for sure, but historically, do they hit relatively close to a card's launch, or more along the lines of 6 months to a year later?


This is a tough one to qualify, because at the moment, the GTX 1070 and 1080 are untouchable by AMD. Yes, I know the Fury X is close to 1070 performance (in some cases, the Fury X is slightly faster), but in terms of performance/efficiency, the 1070 and 1080 are the runaway kings of the moment.

Until AMD releases a product that can compete, Nvidia doesn't really have any incentive to include games with their products.

Also, hello McKinney friend!!! (I'm currently in Garland).


----------



## averian

Quote:


> Originally Posted by *CaptainZombie*
> 
> Since the SC ACX 3.0 is a reference board, the EVGA hybrid water cooler should fit with no issues right? I wonder if it would be worth it even though the ACX cooler is not bad.


Great question. So yes, but with a slight caveat. I used the EVGA hybrid cooler from the 980 Ti because I picked one up at a great price, but it only includes a GPU cooler and relies on the stock blower fan of any reference/Founders Edition card to cool the VRM (I don't use a shroud). The SC ACX 3.0 is a reference design, so no problems with fitment; however, it is missing the blower fan, which you really do need to implement in some fashion. You could probably get away with adding VRM heatsinks and a case fan blowing directly on the card, but a better option now is the just-released EVGA 10-series hybrid, which includes a complete new fan/shroud/GPU water cooler all in one. That option costs $120: http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1

My biggest goals were quiet operation and low temps, since Pascal seems to be very sensitive to temperature and will throttle when heat-soaked (a problem that isn't obvious in a short benchmark run). I only gained about 500 points on my Firestrike graphics score by switching to water, since I'm currently power-limited at 112% on the stock BIOS. However, the blower fan now runs at less than 30%, so I can't hear it, and my temps under full OC load have only been about 45°C. The temperature difference is remarkable, and it means no more bouncing around of boost clocks from hitting the temp wall after long gaming sessions.


----------



## averian

Since it seems like others might be interested in going the hybrid water cooled route, here are some pics of my 1070 FE tear down/build with an EVGA Hybrid 980 TI AIO watercooler installed:


http://imgur.com/Vwaw0


I was actually able to install the whole setup without removing any part of the backplate. This wasn't super easy as I had to carefully use pliers to remove the fan connector. The trickiest part was then getting the hybrid fan connector back on which took some subtle wire bending and quite a bit of patience. It probably makes the most sense to pull everything apart so you can route the wires on the opposite side of the card compared to how I did it. However, if you have some dexterity, there is barely enough room to get things done like I managed (just make sure all wires completely clear the squirrel cage fan).

Alternatively, one can pull everything apart and then install the new EVGA 10 series hybrid that includes a blower fan of its own and shroud. Not that it matters, but I think I actually prefer the looks of my frankenstein card, lol.


----------



## mypickaxe

Quote:


> Originally Posted by *Blackfyre*
> 
> Mate a lot of work was put into this post, so thank you for writing it up! Interesting results.


It wasn't very much work, just a couple of hours.


----------



## Yetyhunter

I finally received my GTX 1070 G1 Gaming, and I must say I am a bit disappointed by its overclocking ability, although it's a HUGE upgrade from my GTX 670.


I ran the Firestrike and Heaven benchmarks, and the highest overclock I could reach without artifacts is a mere +50 MHz on the core, hitting a max boost clock of 2037 MHz, though it stabilizes at 2000 and sometimes dips below that. I must say the artifacts look rather strange, like purple or green light flashes happening very fast, and then the driver crashes. I attached a screenshot showing very strange power fluctuations; I don't know if they are normal.
Memory overclock is +500 MHz; anything higher results in memory artifacts.


Another problem I have is huge DPC latency. Again, I have no clue what might be causing that.


----------



## mypickaxe

Quote:


> Originally Posted by *Yetyhunter*
> 
> I finally received my GTX 1070 G1 Gaming, and I must say I am a bit disappointed by its overclocking ability, although it's a HUGE upgrade from my GTX 670.
> 
> 
> I ran the Firestrike and Heaven benchmarks, and the highest overclock I could reach without artifacts is a mere +50 MHz on the core, hitting a max boost clock of 2037 MHz, though it stabilizes at 2000 and sometimes dips below that. I must say the artifacts look rather strange, like purple or green light flashes happening very fast, and then the driver crashes. I attached a screenshot showing very strange power fluctuations; I don't know if they are normal.
> Memory overclock is +500 MHz; anything higher results in memory artifacts.
> 
> 
> Another problem I have is huge DPC latency. Again, I have no clue what might be causing that.


I hate to say this, but I believe for the vast majority of the AIB partner boards, the only differentiators between those boards and the reference design is a better air cooling design and more stable power delivery.

The silicon is mostly the same from what I've seen across all of them. The VBIOS might be tweaked to give a little boost to the max TDP, but beyond that, until we see custom BIOS and a working editor, I think this is the range we are going to see.


----------



## Yetyhunter

So many people are getting over +100 MHz very easily. Should mine be considered a bad overclock?

My Heaven score is also low, but considering that the core clock stabilizes at about 1950 MHz, that might be normal. I noticed the temperatures are quite high; the card reaches almost 80°C. If I manage to lower these temps, will I get a higher core clock?


----------



## Pragmatist

I finally reached 20000+ on the graphics score.

I'm pretty happy about it considering I didn't have an easy time reaching the score, but the 1070 is just a card I'll have until the 1080 Ti gets released, so I bought the first one I could find available at the time, which is the MSI Armor OC, btw. I fancy the black-and-white theme, but the lack of a backplate annoys the heck out of me.

Also, the MSI Gaming X 8G seems to overclock better judging by the posts in this thread, although I paid the same price. It's the early-adopter tax, I guess.


----------



## Phixit

The G1 Gaming seems to be a poor overclocker versus the MSI Gaming.


----------



## Mad Pistol

Quote:


> Originally Posted by *Yetyhunter*
> 
> So many people are getting over +100 MHz very easily. Should mine be considered a bad overclock?
> 
> My Heaven score is also low, but considering that the core clock stabilizes at about 1950 MHz, that might be normal. I noticed the temperatures are quite high; the card reaches almost 80°C. If I manage to lower these temps, will I get a higher core clock?


If you haven't already, try exiting as many background programs as you can. If you have a 2nd monitor, disable it for benchmarks.

I was able to do a Heaven run @ +170/+700. This was my score.


----------



## Yetyhunter

The only things running were the benchmark and MSI Afterburner. What temps are you getting? My card is very hot; it shows 78°C, and I can't keep my hand on the backplate without getting burned. I was running +50/+500; any higher and I get artifacts.


----------



## Mad Pistol

Quote:


> Originally Posted by *Yetyhunter*
> 
> The only things running were the benchmark and MSI Afterburner. What temps are you getting? My card is very hot; it shows 78°C, and I can't keep my hand on the backplate without getting burned. I was running +50/+500; any higher and I get artifacts.


And that's a G1???

I had my fan set manually to 70%, and it hit about 70°C during the benchmark. I don't know why, but Pascal likes staying at or below 70-75°C in my experience. Above that, it starts throttling back pretty aggressively.

Looks like you may have just gotten a dud.
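If anyone wants to see that throttle point in numbers rather than eyeballing the OSD, you can capture a temp/clock log during a run and compare averages. Here's a rough Python sketch; the `nvidia-smi` logging command in the comment is just one way to get such a log, and the 72°C split point is illustrative, not an official threshold:

```python
import csv
import io

# Samples as produced by, for example:
#   nvidia-smi --query-gpu=temperature.gpu,clocks.gr --format=csv,noheader,nounits -l 1
# giving "temp_c, core_mhz" once per second during a benchmark run.

def avg_clock_by_temp(log_text, threshold_c=72):
    """Average core clock below vs. above a temperature threshold,
    to make thermal throttling visible in a log."""
    below, above = [], []
    for temp_s, clock_s in csv.reader(io.StringIO(log_text)):
        temp, clock = float(temp_s), float(clock_s)
        (below if temp < threshold_c else above).append(clock)
    avg = lambda xs: sum(xs) / len(xs) if xs else None
    return avg(below), avg(above)

# Made-up sample log for illustration.
sample = "65, 2025\n68, 2012\n74, 1949\n78, 1936\n"
cool, hot = avg_clock_by_temp(sample)
# On a throttling card, `hot` comes out noticeably lower than `cool`.
```

If the two averages are tens of MHz apart, the card is shaving boost bins as it heats up rather than holding a flat clock.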


----------



## pez

Benchmarks probably push the card more than games. What are your game temps?


----------



## averian

Quote:


> Originally Posted by *mypickaxe*
> 
> I hate to say this, but I believe for the vast majority of the AIB partner boards, the only differentiators between those boards and the reference design is a better air cooling design and more stable power delivery.
> 
> The silicon is mostly the same from what I've seen across all of them. The VBIOS might be tweaked to give a little boost to the max TDP, but beyond that, until we see custom BIOS and a working editor, I think this is the range we are going to see.


Agreed, I'm very interested to see where the real limits are once the BIOS can be successfully edited and flashed. Based on what I've been able to glean so far, additional power/voltage will allow for a small boost from where we're at now, but nothing massive. Hope I'm wrong about what to expect, haha.


----------



## Yetyhunter

Well, I just reached 80°C in Witcher 3, and I noticed I start seeing artifacts when the temperature goes over 75°C. What do you mean by a dud? The card runs fine at stock settings, though it still gets a bit warm.


----------



## Blackfyre

Quote:


> Originally Posted by *Yetyhunter*
> 
> Well, I just reached 80°C in Witcher 3, and I noticed I start seeing artifacts when the temperature goes over 75°C. What do you mean by a dud? The card runs fine at stock settings, though it still gets a bit warm.


Man, that's not normal at all. Are you sure you don't accidentally have the fan speed locked in MSI Afterburner at something low like 20% or 30%, or that you've enabled an old custom fan curve or something?

My temperature barely reached 60 degrees Celsius playing The Witcher 3 in the worst case. It's winter here in WA, Australia, though, and I am using a custom fan curve. It's still very quiet.

That's with +106 MHz on Core and +800 MHz on Memory.
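For anyone new to custom curves: Afterburner's fan curve is just a set of (temperature, fan %) points with linear interpolation between them. A toy sketch of the idea in Python (the points below are made up for illustration, not anyone's actual curve):

```python
def fan_percent(temp_c, curve):
    """Linearly interpolate a fan speed (%) from (temp_c, fan_pct)
    points sorted by temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # clamp below the first point
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # clamp above the last point
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

# Made-up points: quiet below 40 C, ramping hard toward 80 C.
curve = [(30, 20), (40, 20), (60, 50), (70, 70), (80, 100)]
print(fan_percent(65, curve))  # halfway between 50% and 70% -> 60.0
```

The practical takeaway is that steepening the segment just below your card's throttle point (around 70°C on Pascal, by the reports in this thread) buys you clock stability at the cost of some noise.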


----------



## pez

Yeah, 80°C is not normal on a G1 cooler. Either they botched the TIM application or you've got some seriously bad airflow (which I hope is not the case with the case in your sig).


----------



## Yetyhunter

I think I might have an idea. First of all, I must say I never had an issue with my GTX 670; it was heavily OC'd and temps never went over 70°C. Now I've replaced it with a GTX 1070 that has an obvious overheating problem, and I did not change anything else inside the case. The problem might be that I have the radiator on the bottom of the case blowing hot air directly onto the GPU. But why is there such a difference? The GTX 1070 has a lower TDP and should run cooler than my old card.

Where should I put the radiator? It is a 4-fan push/pull configuration.

The 80°C was reached with max core voltage, max power limit, +50/+500, and an open case.


----------



## Yetyhunter

Quote:


> Originally Posted by *pez*
> 
> Yeah, 80°C is not normal on a G1 cooler. Either they botched the TIM application or you've got some seriously bad airflow (which I hope is not the case with the case in your sig).


I should rethink the whole airflow configuration.
Here's a quick sketch of the config.


----------



## pez

What model was your GTX670?

The G1 is pulling air off of the card and pushing it down and around the card. If your bottom radiator is set to intake, then you're creating a dead zone of hot air there where the GPU is just trying to recycle the same hot air over and over.


----------



## Yetyhunter

Quote:


> Originally Posted by *pez*
> 
> What model was your GTX670?
> 
> The G1 is pulling air off of the card and pushing it down and around the card. If your bottom radiator is set to intake, then you're creating a dead zone of hot air there where the GPU is just trying to recycle the same hot air over and over.


It was a Windforce X3.


----------



## Foresight

So increasing voltage in MSI Afterburner won't help with overclocks?


----------



## mypickaxe

Quote:


> Originally Posted by *Foresight*
> 
> So increasing voltage in MSI Afterburner won't help with overclocks?


From what I've seen with the per voltage point dynamic boost clock, I'd say not yet. I've experimented with it in both Afterburner and PrecisionX OC (which is still terribly slow compared to Afterburner, even with the "OC" update) and saw little to no difference in sustained overclock.

Even bumping the voltage up 100 mV, I saw no difference in the top-end OC result.

My setup is GTX 1070 SLI (FE cards with EK blocks and a serial terminal connection between them.)


----------



## jhatfie

Quote:


> Originally Posted by *Blackfyre*
> 
> Okay since you're both on air cooled GTX 1070's and have scores I have not seen anyone here reach. Can you please both run *3DMark* (_since you both have that_), go to *"STRESS TESTS"*, and then choose *"FIRE STRIKE ULTRA STRESS TEST"* from the drop down MENU. Run it, the test will take approximately 20 minutes to finish, then post a picture with the percentage and GPU-Z Sensor screen below it.
> 
> Just take a screenshot and post to here or imgur, then link us here to it.
> 
> You guys have unbelievable cards if they're truly stable and don't show artifacts or crash. People on water-cooling are not or barely reaching these results.
> 
> *jhatfie* Can you please check your BIOS version in GPU-Z also and report back on that.


First test completed, but the passed percentage was too low, so I dropped my memory from +700 to +675. The BIOS version is showing as 86.04.1E.00.70.


----------



## Blackfyre

Quote:


> Originally Posted by *jhatfie*
> 
> First test completed but passed percentage was too low, so dropped my memory from +700 to +675. Bios version is showing as 86.04.1E.00.70.


That's interesting... Did it come with this BIOS or did you update it to the latest one that MSI released on their site?

I'm still on the older BIOS, a few have claimed now that the newer BIOS is slightly better at overclocking, but I didn't expect such a difference.

I don't want to update the BIOS in case MSI locked something down with it; when the BIOS modders finally figure out how to unlock voltage, I want to be on the safe side, running the OLDEST BIOS, just in case.


----------



## jhatfie

Quote:


> Originally Posted by *Blackfyre*
> 
> That's interesting... Did it come with this BIOS or did you update it to the latest one that MSI released on their site?
> 
> I'm still on the older BIOS, a few have claimed now that the newer BIOS is slightly better at overclocking, but I didn't expect such a difference.
> 
> I don't want to update the BIOS in case MSI locked something up with it, and when the BIOS modders finally figure out how to unlock voltage, I want to make sure I am on the safe side and using the OLDEST BIOS, just in case.


That is the BIOS that came with the card.


----------



## LiquidHaus

finally was able to get the card installed into my system today. had to re-route my acrylic lines to accommodate the massive size of this thing...



gonna firestrike it later today too. will post scores soon.

also, Zotac shipped their cards with the Chinese version of their overclocking utility. I have since installed a newer English version, and I was able to get the core up another 35mhz. I am okay with that.


----------



## FXformat

Got a 1070 today, but when I installed the newest driver it says it's not supported... has anyone else had this issue?


----------



## bigjdubb

I just installed the latest driver today without any issue. Have you tried using a driver cleaner?


----------



## mypickaxe

Quote:


> Originally Posted by *averian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mypickaxe*
> 
> I hate to say this, but I believe for the vast majority of the AIB partner boards, the only differentiators between those boards and the reference design is a better air cooling design and more stable power delivery.
> 
> The silicon is mostly the same from what I've seen across all of them. The VBIOS might be tweaked to give a little boost to the max TDP, but beyond that, until we see custom BIOS and a working editor, I think this is the range we are going to see.
> 
> 
> 
> Agreed, I'm very interested to see where the real limits are once the BIOS can be successfully edited and flashed. Based on what I've been able to glean so far, additional power/voltage will allow for a small boost from where we're at now, but nothing massive. Hope I'm wrong about what to expect, haha.
Click to expand...

I think they just took the overclock away from the end user and built it into the boost tables. My theory is that the new 16nm FinFET process is not nearly as tolerant of increased voltage; combine that with clocks already stretched at the factory and few architectural improvements beyond shrinking Maxwell, and this is what we're left with for now. Unless we all want to get into LN2.


----------



## FXformat

Quote:


> Originally Posted by *bigjdubb*
> 
> I just installed the latest driver today without any issue. Have you tried using a driver cleaner?


This is a fresh build: new mobo, new SSD, everything is new... running Windows 10 Enterprise... no other drivers were ever on it.


----------



## Forceman

Anyone else having a problem with their card not downclocking at idle? I picked up a G1, and on a fresh restart it'll downclock, but once I run something that kicks it up to 3D clocks, it gets stuck at 1595 MHz on the core and full memory speed. Reinstalling Afterburner seems to reset it, so maybe it's related to AB (I'm running Beta 4).
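In case anyone wants to check this without staring at a monitoring window, you can dump the core clock at idle for a few minutes and test whether it ever falls back to 2D levels. A sketch in Python; the `nvidia-smi` command in the comment is one assumed way to capture the log, and the ~300 MHz idle ceiling is a guess you should adjust for your card:

```python
import csv
import io

# Core-clock log captured at the desktop with something like:
#   nvidia-smi --query-gpu=clocks.gr --format=csv,noheader,nounits -l 5

def stuck_at_3d_clocks(log_text, idle_max_mhz=300):
    """True if the core clock never fell to idle (2D) levels in the log."""
    clocks = [float(row[0]) for row in csv.reader(io.StringIO(log_text))]
    return min(clocks) > idle_max_mhz

print(stuck_at_3d_clocks("1595\n1595\n1595\n"))  # True: stuck at 3D clocks
print(stuck_at_3d_clocks("1595\n215\n139\n"))    # False: downclocking fine
```

Running it on a log taken right after a reboot versus one taken after closing a game would show whether the card is genuinely stuck or just slow to drop.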


----------



## Phixit

Quote:


> Originally Posted by *Forceman*
> 
> Anyone else having a problem with their card not downclocking at idle? I picked up a G1 and on a fresh restart it'll downclock, but once I run something that kicks it up to 3D clocks it gets stuck at 1595 on the core and full memory speed. Reinstalling Afterburner seems to reset it, so maybe it is related to AB (running through Beta 4).


Didn't happen to me, the card is downclocking correctly at idle.


----------



## Forceman

Quote:


> Originally Posted by *Phixit*
> 
> Didn't happen to me, the card is downclocking correctly at idle.


Looks like it is downclocking for a while, and then suddenly jumps back up (even sitting idle) and then stays there from then on. Maybe something running in the background, but I have no idea what.


----------



## luan87us

Got my Asus Strix O8G a couple of days after ordering from Newegg. This thing is a beast together with my 6700K build. The Aura-powered ROG backplate complements my build amazingly, since I'm using a Maximus VIII Ranger. Here's a quick benchmark in Heaven at the Ultra setting, 1080p/144 Hz, Tessellation Normal (all 3 sliders maxed), AA 8x. Stock 6700K and Gaming mode for the GPU.


----------



## LiquidHaus

trying to get over 21k graphics score. but just got this. either way, I am happy.

though I will have to admit that my four 7970s did better in the 4k firestrike test. haha wahhhh.


----------



## FXformat

Fixed the issue; it turns out my Windows 10 was an older version, so I had to update Windows to the newer version.


----------



## Blackfyre

Quote:


> Originally Posted by *jhatfie*
> 
> That is the BIOS that came with the card.


Can you please run GPU-Z, then the Heaven and Valley benchmarks (_both on the Extreme presets, windowed, not full screen_), and 3DMark too, or some other game (also windowed)? Then click back on GPU-Z in the taskbar while everything is running in the background, and report back on the highest VDDC you see. That's the voltage; it's on the Sensors page in GPU-Z.

You have a golden chip, my friend, especially if it's running below 1.0930 V and you're getting such scores.

Quote:


> Originally Posted by *FXformat*
> 
> Fixed the issue, turns out my windows 10 was an older version, had to update the windows to the newer version.


Always check for Windows Update after you finish installing Windows, then restart and check again and again until there are no more updates.








Quote:


> Originally Posted by *lifeisshort117*
> 
> trying to get over 21k graphics score. but just got this. either way, I am happy.
> 
> though I will have to admit that my four 7970s did better in the 4k firestrike test. haha wahhhh.


I tried man, I tried, this is the highest I got using +106 Core & +800 Memory (_memory was stable, I didn't want to push it further though_).

http://www.3dmark.com/fs/9234814


----------



## mypickaxe

Quote:


> Originally Posted by *luan87us*
> 
> Got my Asus Strix O8G a couple of days after ordering from Newegg. This thing is a beast together with my 6700K build. The Aura-powered ROG backplate complements my build amazingly, since I'm using a Maximus VIII Ranger. Here's a quick benchmark in Heaven at the Ultra setting, 1080p/144 Hz, Tessellation Normal (all 3 sliders maxed), AA 8x. Stock 6700K and Gaming mode for the GPU.


1070 FE under water isn't so bad...(same settings as above)...


----------



## luan87us

Quote:


> Originally Posted by *mypickaxe*
> 
> 1070 FE under water isn't so bad...(same settings as above)...


I assume yours is overclocked? And how is everyone getting 17k+ in 3DMark Firestrike? I'm only getting 15,420.


----------



## viking21

For a Node 202 with two fans below the GPU, do you suggest an FE or a custom card?


----------



## PeterMac

I bought an MSI GTX 1070 Gaming X and it has coil whine. Do you also get some coil whine from your cards during gaming? For example, I start to hear it a little from 80 FPS upward.

Here it is during a game at 120-125 FPS


----------



## FXformat

Quote:


> Originally Posted by *PeterMac*
> 
> I bought an MSI GTX 1070 Gaming X and it has coil whine. Do you also get some coil whine from your cards during gaming? For example, I start to hear it a little from 80 FPS upward.
> 
> Here it is during a game at 120-125 FPS


I've owned dozens of cards throughout the years and not one was silent... they all made a whine. My 1070 whines too when it's loading the game screen; during gameplay, it's not so bad.


----------



## pez

Quote:


> Originally Posted by *Yetyhunter*
> 
> It was a windforce x3


Hmmm, those are very similar coolers, so something definitely seems wrong. Could you log or screenshot your fan curve? I know another member adjusted their fan curve a bit (specifically on a 1070 G1) and that fixed the issue for them.


----------



## Mad Pistol

I am now convinced that the GTX 1070/1080 are unbelievable beasts. I just played a full round of Star Wars Battlefront at Ultra settings @ 1440p, and the framerate never dipped below 60 FPS. Mind-blowing how good the 1070 and 1080 are.


----------



## mypickaxe

Quote:


> Originally Posted by *viking21*
> 
> for a node 202 with two fans below the gpu, do you suggest a FE o a custom?


Depends on your geographic location and how cool ambient temps are in the room. I have that case for my HTPC build and would not recommend anything other than a blower style cooler or pseudo blower such as the R9 Nano.

I tested a 380 with an open-air cooler (DirectCU II) and it was causing my APU to thermal throttle.

I also tested a TITAN X for kicks (not meant for that build) and it ran just fine.

I sold that and picked up the Nano for the HTPC and 1070s for my main rig.

Quote:


> Originally Posted by *luan87us*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mypickaxe*
> 
> 1070 FE under water isn't so bad...(same settings as above)...
> 
> 
> 
> I assume yours is overclocked? And how is everyone getting 17k+ in 3DMark Firestrike? I'm only getting 15,420.
Click to expand...

Yes. It runs around 2050-2088 MHz stable in benchmarks. I can get it up to 2100 in game, but it has crashed a couple of times, so I backed it down a bit. +197 core, +500 memory seems to be a good game-stable compromise for me. For benchmarks I can bump the VRAM up to +700 without issues, but I prefer games not to crash before a save.

As for your score, there's a bottleneck somewhere, but your CPU isn't it; it's faster than mine in Heaven, and I'm only overclocking all cores to 4 GHz. My card is in an x16 3.0 configuration, and I confirmed it is running at PCIe 3.0 x16 under load.

My DDR4 is in quad-channel mode running at 2400 MHz. It's rated for 2800 MHz, but I have a 950 Pro SSD and prefer that to run at full speed; fiddling with the BCLK tends to slow it down.


----------



## pez

Quote:


> Originally Posted by *Mad Pistol*
> 
> I am now convinced that the GTX 1070/1080 are unbelievable beasts. I just played a full round of Star Wars Battlefront, Ultra settings @ 1440p, and the framerate never dipped below 60 FPS. Mind Blowing how good the 1070 and 1080 are.


I had a similar experience with my 1080 @ 4K on BF4. Less demanding I think, but I was just surprised.


----------



## mypickaxe

Quote:


> Originally Posted by *pez*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mad Pistol*
> 
> I am now convinced that the GTX 1070/1080 are unbelievable beasts. I just played a full round of Star Wars Battlefront, Ultra settings @ 1440p, and the framerate never dipped below 60 FPS. Mind Blowing how good the 1070 and 1080 are.
> 
> 
> 
> I had a similar experience with my 1080 @ 4K on BF4. Less demanding I think, but I was just surprised.
Click to expand...

I tried DOOM last night at 4K. Even on medium it can't keep a steady 60 fps. Waiting for Vulkan and for proper SLI support.


----------



## pez

Quote:


> Originally Posted by *mypickaxe*
> 
> I tried DOOM last night at 4K. Even on medium it can't keep a steady 60 fps. Waiting for Vulkan and for proper SLI support.


Yeah, granted I'm on a 1080 for 4K. My second card is backordered, so I'm waiting for that. I was seeing some SLI support with my 970s and it was running pretty well, but that was at 2K. Are you using the official HB bridge for your SLI setup?


----------



## Yetyhunter

Quote:


> Originally Posted by *pez*
> 
> Hmmm, which are very similar coolers. Something definitely seems wrong. Could you log or screenshot the fan curve? I know another member adjusted their fan curve a bit (specifically on a 1070 G1) and that fixed the issues for them.


I fixed it now. I moved the radiator to the top as exhaust, and I have 2 side-panel fans for intake. The temperature never exceeds 71°C and the core clock stays stable above 2000 MHz.
Here is a Firestrike score. I think it's decent enough.

And the fan curve


----------



## pez

That sounds a lot more normal now. Glad you got this figured out.


----------



## Bdonedge

I was BSODing last night, and after I uninstalled all the third-party OC programs my card seems to be performing a lot better. Has anyone had any bad experiences with the Afterburner beta?
Should I use the previous version?


----------



## Mad Pistol

Quote:


> Originally Posted by *Yetyhunter*
> 
> I fixed it now. I moved the radiator to the top as exhaust, and I have 2 side-panel fans for intake. The temperature never exceeds 71°C and the core clock stays stable above 2000 MHz.
> Here is a Firestrike score. I think it's decent enough.
> 
> 
> Spoiler: Graphs
> 
> 
> 
> 
> And the fan curve


Much better. It's amazing what a little cool air will do for these GPUs performance-wise. Glad to hear it's fixed.


----------



## pez

I wasn't having any luck on what I think was AB Beta 2: everything from clocks getting stuck at stock to the OSD not working whatsoever. I'm actually using the Xtreme Gaming app from Gigabyte, which is working OK.


----------



## trelokomio58

This is my best Firestrike score with my 1070 Gaming X; anything above this crashes the driver.
Pretty disappointed by the 1070's overclocked performance.
A 980 Ti overclocked at 1450-1500 MHz easily beats a 1070 overclocked at 2050-2100 MHz.


----------



## jhatfie

Quote:


> Originally Posted by *Blackfyre*
> 
> Can you please run GPU-Z, then heaven benchmark & valley benchmark (_both on extreme pre-sets, not full screen_), and 3dmark too or some other game (not in full-screen too, just windowed). Then click back on GPU-Z in the taskbar while everything is running in the background. And can you report back on what the highest VDDC is? That's the voltage, it's on the sensors page in GPU-Z.


Running Valley, Heaven, and Witcher 3 at the same time, I see the voltage bounce back and forth between 1.0930 V (highest) and 1.0810 V.


----------



## Mad Pistol

Welp, I just said "screw it" and installed my 2nd GTX 1070. I have officially lost my mind.



Benchmarks to follow shortly.


----------



## mypickaxe

Quote:


> Originally Posted by *trelokomio58*
> 
> This is my best Firestrike score with my 1070 Gaming X; anything above this crashes the driver.
> Pretty disappointed by the 1070's overclocked performance.
> A 980 Ti overclocked at 1450-1500 MHz easily beats a 1070 overclocked at 2050-2100 MHz.


I can confirm that, but in real-world gaming you should not notice a difference.

Advantage goes to the 1070 for 8GB vs 6GB of VRAM. Also, SMP for VR. Last but not least, lower TDP (and less heat in the room).

I can say I'm pleased with the reduction in heat moving from a voltage-modded VBIOS on a TITAN X to 1070 SLI. I don't plan to stay with 1070s for even an entire year, but for now it's a good lateral move. I wanted higher FPS in Project CARS, but I didn't want a second (used) TITAN X.

Same rig as my sig: I had a TITAN X at 1484 MHz with the 5930K overclocked to 4.2 GHz. Graphics score = 20555: http://www.3dmark.com/fs/8656048
Now here's the score for a single GTX 1070 (again, same rig as sig) with the 5930K overclocked to 4.0 GHz. Graphics score = 21903: http://www.3dmark.com/fs/9219072

Quote:


> Originally Posted by *pez*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mypickaxe*
> 
> I tried DOOM last night at 4K. Even on medium it can't keep a steady 60 fps. Waiting for Vulkan and for proper SLI support.
> 
> 
> 
> Yeah, granted I'm on a 1080 for 4K. My second card is backordered, so I'm waiting for that. I was seeing some SLI support with my 970s and it was running pretty well, but that was at 2K. Are you using the official HB bridge for your SLI setup?
Click to expand...

No HB bridge. I have EK blocks on the GPUs and the official bridge isn't supported. I might try the EVGA LED HB bridge (removing the outer shield so it will fit) or just wait for EK to release their own.

But that doesn't matter as much as the fact that DOOM does not currently support SLI, since it is still just an OpenGL title. Setting a profile to force alternate frame rendering is not a good workaround; the game runs better in single-GPU mode.

Until Vulkan support is available, and provided they include full multi-gpu support in the Vulkan patch, I won't bother.

I was just monkeying around to see if a 4K monitor would be worth it for various titles. I'm happy with my ROG Swift G-Sync 1440p/144 Hz.


----------



## Mad Pistol

First benchmark, stock on both 1070s. The single bridge is definitely holding me back. I will be going to Micro Center and Fry's to see if I can pick up an HB SLI bridge. If not, I may just get an LED bridge.



And a pic of GPUZ. All the "yellow" in PerfCap means my SLI bridge is holding me back.



I can already hear my credit card screaming.


----------



## mypickaxe

Quote:


> Originally Posted by *Mad Pistol*
> 
> First benchmark, stock on both 1070s. The single bridge is definitely holding me back. I will be going to Micro Center and Fry's to see if I can pick up an HB SLI bridge. If not, I may just get an LED bridge.


The LED bridge should work just fine. I am using a single LED bridge, not high bandwidth. I tested with the triple SLI ribbon bridge that came with my Asus motherboard and see no difference in performance. You won't need a true HB bridge at 1080p or even 1440p, even at 144 Hz. From what I've seen, it really starts to make a difference at 5K.


----------



## trelokomio58

Quote:


> Originally Posted by *mypickaxe*
> 
> I can confirm that, but in real world gaming, you should not notice a difference.
> 
> Advantage goes to the 1070 for 8GB vs 6GB of VRAM. Also, SMP for VR. Last but not least, lower TDP (and less heat in the room.)
> 
> I can say I'm pleased with the reduction in heat coming from a voltage modded VBIOS on a TITAN X to 1070 SLI. I don't plan to stay with 1070s for even an entire year, but for now, it's a good lateral move. I wanted higher FPS in Project Cars, but I didn't want a second (used) TITAN X.
> 
> Same rig as sig, I had a TITAN X at 1484 MHz. 5930K overclocked to 4.2 GHz. Graphics score = 20555. Here is the link to the score: http://www.3dmark.com/fs/8656048
> Now here's the score for a single GTX 1070 (again, same rig as sig.) 5930K overclocked to 4.0 GHz. Graphics score = 21903 Here is the link to the score: http://www.3dmark.com/fs/9219072
> No HB bridge. I have EK blocks on the GPUs and the official bridge isn't supported. I might try the EVGA LED HB bridge (removing the outer shield so it will fit) or just wait for EK to release their own.


I'm disappointed, mate, because I sold my 980 Ti and grabbed a 1070 after reading in reviews that the 1070 is as powerful as a Titan X!
Nah... OK, out-of-the-box performance is good. It also has very low temps, barely passing 65C. But it has poor overclocking performance.
This was my score with my 980 Ti @ 1500 with the same setup (4790K) --> 22600 graphics score


The 1070 has good performance out of the box and low temps, but it is also a poor overclocker.


----------



## mypickaxe

Quote:


> Originally Posted by *trelokomio58*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mypickaxe*
> 
> I can confirm that, but in real world gaming, you should not notice a difference.
> 
> Advantage goes to the 1070 for 8GB vs 6GB of VRAM. Also, SMP for VR. Last but not least, lower TDP (and less heat in the room.)
> 
> I can say I'm pleased with the reduction in heat coming from a voltage modded VBIOS on a TITAN X to 1070 SLI. I don't plan to stay with 1070s for even an entire year, but for now, it's a good lateral move. I wanted higher FPS in Project Cars, but I didn't want a second (used) TITAN X.
> 
> Same rig as sig, I had a TITAN X at 1484 MHz. 5930K overclocked to 4.2 GHz. Graphics score = 20555. Here is the link to the score: http://www.3dmark.com/fs/8656048
> Now here's the score for a single GTX 1070 (again, same rig as sig.) 5930K overclocked to 4.0 GHz. Graphics score = 21903 Here is the link to the score: http://www.3dmark.com/fs/9219072
> No HB bridge. I have EK blocks on the GPUs and the official bridge isn't supported. I might try the EVGA LED HB bridge (removing the outer shield so it will fit) or just wait for EK to release their own.
> 
> 
> 
> I'm disappointed, mate, because I sold my 980 Ti and grabbed a 1070 after reading in reviews that the 1070 is as powerful as a Titan X!
> Nah... OK, out-of-the-box performance is very good. It also has very low temps, barely passing 65C. But it has poor overclocking performance.
> This was my score with my 980 Ti @ 1500 with the same setup (4790K) --> 22600 graphics score
> 
> 
> The 1070 has good performance out of the box and low temps, but it is also a poor overclocker.

I understand, but I am not worried too much about it since I am running SLI for the titles I care about. As far as a single GPU goes, yes, it would be a bit disappointing, but you play games (I hope) and aren't just in it for benchmarks. For games where 6GB is not enough, the 8GB is going to come in handy over the next year or two.


----------



## pez

Honestly, all of the games I can't currently max out at 4K (including AA, tess, HBAO, etc.) support SLI. I'm pretty excited for SLI this time around.


----------



## Yungbenny911

Saw some performance gains going from x8 x8 3.0 (single flexible bridge) to x16 x16 3.0 (dual SLI flexible bridge)


----------



## crun

http://i.imgur.com/OelKPVf.png - this is what I'm running recently.

The clock stabilizes at 1924 MHz / 0.9V. It might drop a step lower in 3DMark Ultra, but in 1080p gaming it seems rock stable. Temp maxes out at 74-75C with 60% fan speed.

With my max stable OC (+180; no curve overclock, as that blocks the power limit increase), the clock jumped between ~1970-2020 MHz (2088 max) depending on the game, and temperature peaked at 82C with 65% fan speed.


----------



## bigjdubb

What seems to be the average on memory clocks? I see quite a few people getting +700 MHz and more, but my MSI Gaming (non-X) only manages +575 before the driver crashes. I don't get any artifacts, just driver crashes.

Quote:


> Originally Posted by *trelokomio58*
> 
> I'm disappointed, mate, because I sold my 980 Ti and grabbed a 1070, *because I read in reviews that the 1070 is as powerful as a Titan X*!
> Nah... OK, out-of-the-box performance is good. It also has very low temps, barely passing 65C. But it has poor overclocking performance.
> This was my score with my 980 Ti @ 1500 with the same setup (4790K) --> 22600 graphics score
> 
> 
> The 1070 has good performance out of the box and low temps, but it is also a poor overclocker.


I guess you didn't realize that your 980ti @1500 was already more powerful than a Titan X.


----------



## trelokomio58

Quote:


> Originally Posted by *bigjdubb*
> 
> I guess you didn't realize that your 980ti @1500 was already more powerful than a Titan X.


An overclocked Titan X is a bit faster than an overclocked 980 Ti.
A 980 Ti can do 1450-1500 MHz easily with stock voltage.
The conclusion is that a 980 Ti at ~1500 MHz is faster than an overclocked 1070.
My mistake; I sold my 980 Ti to buy a slower graphics card.


----------



## Mad Pistol

So no bites on the SLI bridges in this area. I actually just snagged an HB SLI bridge from Nvidia's site, but now I'm in talks with someone about buying my second card. If we can come to an agreement, I guess I'm sending that back.

Man, this entire thing has been a whirlwind!


----------



## LiquidHaus

If you guys plan to watercool your cards with the new bridge, taking a Dremel to the corners of the bridge will not affect its performance at all; it'll work 100%. I confirmed this on a system at work last week, lol.


----------



## luan87us

Quote:


> Originally Posted by *mypickaxe*
> 
> Yes. It runs around 2050-2088 MHz stable in benchmarks. I can get it up to 2100 in game, but it has crashed a couple of times, so I backed it down a bit. +197 core, +500 memory seems to be a good game-stable compromise for me. For benchmarks I can bump the VRAM up to +700 without issues, but I prefer games not to crash before a save.
> 
> As far as your score, there's a bottleneck somewhere. Your CPU isn't the bottleneck. It's faster than mine in Heaven. I am only overclocking all cores to 4 GHz. It's in an x16 3.0 configuration. Confirmed it is running at PCIe 3.0 x16 during load.
> 
> My DDR4 is in quad channel mode, running at 2400 MHz. It is 2800 MHz rated, but I have a 950 Pro SSD and I prefer it to run at full speed. Fiddling with the BCLK tends to slow it down.


My card runs at 2050 MHz in benchmarks. My CPU is at stock clocks (4GHz-4.2GHz), with DDR4 dual-channel RAM at 3000 MHz (Ripjaws V rated at 3000 MHz), and I only have an 850 EVO SSD. I haven't messed with any overclocking yet, as I don't have enough fans for proper airflow in the case. But I see a lot of people getting 16-17k in Fire Strike on very similar setups, while I only get about 15.4k


----------



## mypickaxe

Quote:


> Originally Posted by *trelokomio58*
> 
> Quote:
> 
> 
> 
> I guess you didn't realize that your 980ti @1500 was already more powerful than a Titan X.

That's not true. Considering a TITAN X can also be overclocked to 1500 under the best conditions, there is no way for that to be possible.

Quote:


> Originally Posted by *luan87us*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mypickaxe*
> 
> Yes. It runs around 2050-2088 MHz stable in benchmarks. I can get it up to 2100 in game, but it has crashed a couple of times, so I backed it down a bit. +197 core, +500 memory seems to be a good game-stable compromise for me. For benchmarks I can bump the VRAM up to +700 without issues, but I prefer games not to crash before a save.
> 
> As far as your score, there's a bottleneck somewhere. Your CPU isn't the bottleneck. It's faster than mine in Heaven. I am only overclocking all cores to 4 GHz. It's in an x16 3.0 configuration. Confirmed it is running at PCIe 3.0 x16 during load.
> 
> My DDR4 is in quad channel mode, running at 2400 MHz. It is 2800 MHz rated, but I have a 950 Pro SSD and I prefer it to run at full speed. Fiddling with the BCLK tends to slow it down.
> 
> 
> 
> My card runs at 2050 MHz in benchmarks. My CPU is at stock clocks (4GHz-4.2GHz), with DDR4 dual-channel RAM at 3000 MHz (Ripjaws V rated at 3000 MHz), and I only have an 850 EVO SSD. I haven't messed with any overclocking yet, as I don't have enough fans for proper airflow in the case. But I see a lot of people getting 16-17k in Fire Strike on very similar setups, while I only get about 15.4k

Power management profiles (both Windows' and the Nvidia control panel's) could be part of it. Running the GPU at x8 instead of x16 could have a small impact as well. Not having proper airflow in the case could *definitely* be hurting you, considering the card will throttle its clocks down.
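For a rough sense of what x8 vs. x16 costs in raw bandwidth, here's a back-of-the-envelope sketch (a minimal illustration, not a tool; the function name is mine, and the figures come from the PCIe 3.0 spec: 8 GT/s per lane with 128b/130b encoding):

```python
def pcie3_bandwidth_gbps(lanes: int) -> float:
    """Theoretical one-way PCIe 3.0 bandwidth in GB/s for a given lane count."""
    # 8 GT/s per lane, 128b/130b encoding overhead, 8 bits per byte.
    return 8.0 * lanes * (128 / 130) / 8

print(pcie3_bandwidth_gbps(16))  # ~15.75 GB/s
print(pcie3_bandwidth_gbps(8))   # ~7.88 GB/s
```

In practice few games come close to saturating even x8, which is why the difference usually shows up as only a small hit.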

Quote:


> Originally Posted by *LiquidHaus*
> 
> If you guys plan to watercool your cards with the new bridge, taking a dremel to the corners of the bridge will not affect the performance of the bridge at all, and it'll work 100%. I confirmed this with this system last week at work lol


That's good to know, I guess. When the bridge is no longer a unicorn, I will possibly look into it. Still waiting to see what EK comes up with.


----------



## Mad Pistol

A single SLI bridge isn't nearly enough for dual 1070's. It is literally a laggy mess on GTA V, whereas a single 1070 is fine.

EDIT: Victory! I found a ribbon SLI bridge in my storage, so I now have two SLI bridges! I hooked both of them up, and the stuttering is all but gone!


----------



## bigjdubb

Quote:


> Originally Posted by *trelokomio58*
> 
> An overclocked Titan X is a bit faster than an overclocked 980 Ti.
> A 980 Ti can do 1450-1500 MHz easily with stock voltage.
> The conclusion is that a 980 Ti at ~1500 MHz is faster than an overclocked 1070.
> My mistake; I sold my 980 Ti to buy a slower graphics card.


Quote:


> Originally Posted by *mypickaxe*
> 
> That's not true, considering a TITAN X can also be overclocked to 1500 in the best conditions, there is no way for that to be possible.


Well, we were talking about Nvidia's comparison of the 1070 to a Titan X, which was against a stock Titan X. A 980 Ti @ 1500 MHz is absolutely more powerful than a stock Titan X.

I started over from scratch and overclocked the memory first; I was able to get +750 on the memory and +145 on the core. During the test it ran at 2037-2050 MHz.

I was able to break 21,000

Result: http://www.3dmark.com/fs/9253174

Stress Test: http://www.3dmark.com/fsst/59325
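For anyone translating these Afterburner offsets into an effective data rate, here's a rough sketch. It assumes the offset simply adds to the ~4004 MHz memory clock Afterburner reports for the 1070 at stock, with GDDR5 transferring twice per cycle on that figure (the function name is mine):

```python
def effective_gddr5_rate(offset_mhz: int, base_mhz: int = 4004) -> int:
    """Effective GDDR5 transfer rate (MT/s) from an Afterburner-style offset."""
    # GDDR5 is double data rate on the reported memory clock.
    return (base_mhz + offset_mhz) * 2

print(effective_gddr5_rate(0))    # 8008 MT/s at stock
print(effective_gddr5_rate(750))  # 9508 MT/s with a +750 offset
```

So the +750 above works out to roughly 9.5 Gbps effective, versus 8 Gbps stock, if that offset convention holds.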


----------



## trelokomio58

Quote:


> Originally Posted by *bigjdubb*
> 
> Well, we were talking about Nvidia's comparison of the 1070 to a Titan X, which was against a stock Titan X. A 980 Ti @ 1500 MHz is absolutely more powerful than a stock Titan X.
> 
> I started over from scratch and overclocked the memory first; I was able to get +750 on the memory and +145 on the core. During the test it ran at 2037-2050 MHz.
> 
> I was able to break 21,000
> 
> Result: http://www.3dmark.com/fs/9253174
> 
> Stress Test: http://www.3dmark.com/fsst/59325


Read my posts again.
I said the 1070 has good performance at stock clocks; out of the box, the 1070 is faster than a reference 980 Ti for sure.
I am disappointed with the overclocking performance, though. An overclocked 980 Ti or Titan X easily jumps ahead of an overclocked 1070.


----------



## bigjdubb

Quote:


> Originally Posted by *trelokomio58*
> 
> Read my posts again.
> I said the 1070 has good performance at stock clocks; out of the box, the 1070 is faster than a reference 980 Ti for sure.
> I am disappointed with the overclocking performance, though. An overclocked 980 Ti or Titan X easily jumps ahead of an overclocked 1070.


I understand that, but you said they claimed it was faster than a Titan X, so you got one. It was Nvidia that made that claim; users discovered very quickly (before anyone could buy a 1070) that OC'd Titan Xs and 980 Tis were as fast as or faster than OC'd 1070s. The only performance upgrade path for those users was the 1080.


----------



## trelokomio58

Quote:


> Originally Posted by *bigjdubb*
> 
> I understand that, but you said they claimed it was faster than a Titan X, so you got one. It was Nvidia that made that claim; users discovered very quickly (before anyone could buy a 1070) that OC'd Titan Xs and 980 Tis were as fast as or faster than OC'd 1070s. The only performance upgrade path for those users was the 1080.


That's why I said "my mistake".
I sold the 980 Ti and grabbed a 1070 because I thought it was faster, but... no.
With both overclocked, it's slower; I think I made a downgrade, not an upgrade.


----------



## mypickaxe

Quote:


> Originally Posted by *bigjdubb*
> 
> Quote:
> 
> 
> 
> Originally Posted by *trelokomio58*
> 
> An overclocked Titan X is a bit faster than an overclocked 980 Ti.
> A 980 Ti can do 1450-1500 MHz easily with stock voltage.
> The conclusion is that a 980 Ti at ~1500 MHz is faster than an overclocked 1070.
> My mistake; I sold my 980 Ti to buy a slower graphics card.
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *mypickaxe*
> 
> That's not true, considering a TITAN X can also be overclocked to 1500 in the best conditions, there is no way for that to be possible.
> 
> 
> Well we were talking about Nvidias comparison of a 1070 to a TitanX, which was a comparison against a stock Titan X. A 980ti @ 1500mhz is absolutely more powerful than a stock Titan X.

Now you're just changing the rules so you can win an argument. I'm done.


----------



## Mad Pistol

I think the 1070 will be faster in the end thanks to its advancements for VR, but that remains to be seen. However, OC vs. OC, you are right: an OC'd 980 Ti will beat an OC'd 1070 in most scenarios (but not all).


----------



## bigjdubb

Quote:


> Originally Posted by *mypickaxe*
> 
> Now you're just changing the rules so you can win an argument. I'm done.


I'm not even arguing, but thank you for being done.


----------



## bigjdubb

Quote:


> Originally Posted by *Mad Pistol*
> 
> A single SLI bridge isn't nearly enough for dual 1070's. It is literally a laggy mess on GTA V, whereas a single 1070 is fine.
> 
> EDIT: Victory! I found a ribbon SLI bridge in my storage, so I now have two SLI bridges! I hooked both of them up, and the stuttering is all but gone!


I am really curious to see how you feel about SLI 1070s with ultrawide 1440p; that's the next step I want to take with my display.


----------



## Mad Pistol

Quote:


> Originally Posted by *bigjdubb*
> 
> I am really curious to see how you feel about SLI 1070s with ultrawide 1440p; that's the next step I want to take with my display.


The limited experience I have had with it so far has been great. I just played a couple rounds of Star Wars Battlefront @ 3440x1440, ultra settings and 110% resolution scaling. The framerate didn't dip below 100 FPS, and the average was around 120 FPS. I was also able to max out GTA V @ 3440x1440, and I averaged over 60 FPS.

My conclusion: GTX 1070 SLI for ultrawide is overkill and amazing.

EDIT: I just realized how similar our rigs are (4790k, Gigabyte Z97 Gaming 7, GTX 1070, 850 EVO)


----------



## Airrick10

Quote:


> Originally Posted by *Mad Pistol*
> 
> The limited experience I have had with it so far has been great. I just played a couple rounds of Star Wars Battlefront @ 3440x1440, ultra settings and 110% resolution scaling. The framerate didn't dip below 100 FPS, and the average was around 120 FPS. I was also able to max out GTA V @ 3440x1440, and I averaged over 60 FPS.
> 
> My conclusion: GTX 1070 SLI for ultrawide is overkill and amazing.
> 
> EDIT: I just realized how similar our rigs are (4790k, Gigabyte Z97 Gaming 7, GTX 1070, 850 EVO)


I too have a similar rig to you guys: 4790K, Gigabyte Z97 Gaming 7, GTX 1070 (an 840 EVO though), and I'm from TEXAS!


----------



## Blackfyre

Quote:


> Originally Posted by *Airrick10*
> 
> I too have a similar rig to you guys: 4790K, Gigabyte Z97 Gaming 7, GTX 1070 (an 840 EVO though), and I'm from TEXAS!


I too have a similar rig, haha!

4790K @ 4.6Ghz @ 1.188v
Gigabyte Z97X-Gaming GT
MSI GTX 1070 Gaming X
16GB @ 2400MHz 11-13-13-30 (GeIL's EVO POTENZA).
EVGA SuperNOVA 1000W G2
Samsung 830 EVO


----------



## Mad Pistol

Quote:


> Originally Posted by *Airrick10*
> 
> I too have a similar rig to you guys: 4790K, Gigabyte Z97 Gaming 7, GTX 1070 (an 840 EVO though), and I'm from TEXAS!


Quote:


> Originally Posted by *Blackfyre*
> 
> I too have a similar rig, haha!
> 
> 4790K @ 4.6Ghz @ 1.188v
> Gigabyte Z97X-Gaming GT
> MSI GTX 1070 Gaming X
> 16GB @ 2400MHz 11-13-13-30 (GeIL's EVO POTENZA).
> EVGA SuperNOVA 1000W G2
> Samsung 830 EVO


Love it!

So I broke into the 99th percentile of Fire Strike scores!!!

http://www.3dmark.com/3dm/13105728?



I wonder how long it will stay up there....


----------



## FXformat

OK, I'm impressed with this card. I just played Project Cars in 4K (3840x2160), everything Ultra, thunderstorm climate, 35 cars... averaging around 60fps. When it's a cluster**** of us, it gets down to 40ish... all in all, very impressive.

When I'm playing a time trial by myself in clear weather, it's over 100FPS in 4K at Ultra!


----------



## Blackfyre

Quote:


> Originally Posted by *FXformat*
> 
> OK, I'm impressed with this card. I just played Project Cars in 4K (3840x2160), everything Ultra, thunderstorm climate, 35 cars... averaging around 60fps. When it's a cluster**** of us, it gets down to 40ish... all in all, very impressive.
> 
> When I'm playing a time trial by myself in clear weather, it's over 100FPS in 4K at Ultra!


You should try *Forza Motorsport 6: Apex Beta*.

It's DX12, it's on the Windows Store, and it's free. It looks and runs better than Project Cars. I still prefer Project Cars, though.


----------



## luan87us

Quote:


> Originally Posted by *mypickaxe*
> 
> Power management profiles (both Windows' and the Nvidia control panel's) could be part of it. Running the GPU at x8 instead of x16 could have a small impact as well. Not having proper airflow in the case could *definitely* be hurting you, considering the card will throttle its clocks down.


Good to know. Yeah atm I only have 2 exhaust fans running through the Corsair H100i radiator. More fans are on the way though. Can't wait to set up proper airflow and start messing with overclocking the CPU.


----------



## FXformat

Quote:


> Originally Posted by *Blackfyre*
> 
> You should try *Forza Motorsport 6: Apex Beta*.
> 
> It's DX12, it's on the Windows Store, and it's free. It looks and runs better than Project Cars. I still prefer Project Cars, though.


All my games are free, heh; I just can't play online, which I don't care about anymore. I have played Forza, but I prefer the physics of Project Cars. I play it with my steering wheel and pedals, and it's awesome; it feels so real.


----------



## FXformat

I have the 1070 from EVGA, the SC... when I game and have MSI Afterburner up, it says the clock speed is 1974 MHz. I haven't touched any overclocking; this seems high from the factory, no?


----------



## bigjdubb

Welcome to Boost 3.0. I have the MSI Gaming (non-X), which isn't the overclocked version, and it runs at almost 1900 MHz without touching anything.


----------



## deegzor

Anyone else having issues with less demanding games? For me, it doesn't use P-states correctly in low-end games; it only runs 1506 MHz on the core in, for example, CS:GO, Diablo 3, and Payday 2. Any workaround for this?


----------



## bigjdubb

Quote:


> Originally Posted by *deegzor*
> 
> Anyone else having issues with less demanding games? For me, it doesn't use P-states correctly in low-end games; it only runs 1506 MHz on the core in, for example, CS:GO, Diablo 3, and Payday 2. Any workaround for this?


I have issues with it not clocking up in less demanding areas of demanding games as well. In FO4 it sometimes doesn't want to clock up after being in the Pip-Boy screen or going from an interior to an exterior. I'm hoping it's something that will be worked out in future drivers.


----------



## FXformat

Does anybody know if the EK 1080 FE waterblock will fit on my EVGA 1070 SC?


----------



## mypickaxe

Quote:


> Originally Posted by *FXformat*
> 
> Does anybody know if the EK 1080 FE waterblock will fit on my EVGA 1070 SC?


Yes, they are interchangeable. SC is a reference card, 1080 FE and 1070 FE blocks are the same other than the name printed on them.


----------



## FXformat

Quote:


> Originally Posted by *mypickaxe*
> 
> Yes, they are interchangeable. SC is a reference card, 1080 FE and 1070 FE blocks are the same other than the name printed on them.


Thank you, microcenter has them in stock looks like i'll snatch one up tomorrow.

Repped for quick response!


----------



## CaptainZombie

Quote:


> Originally Posted by *averian*
> 
> Great question. So yes, but there is a slight caveat. I used the EVGA hybrid cooler from the 980 TI because I picked one up at a great price, but it only includes a GPU cooler and uses the stock fan on any "reference" or "founder's edition" to cool the VRM (I don't use a shroud). The SC ACX 3.0 is a reference design so no problems with fitment, however, it is missing the blower fan which you really do need to implement in some fashion. You could probably get away with adding VRM heat sinks and a case fan directly blowing on the card, but a better option now is the just released EVGA 10 series hybrid that includes a complete new fan/shroud/gpu water cooler all in one. The cost for that option is $120: http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1
> 
> My biggest goals were quiet operation and low temps, since Pascal seems to be very sensitive to temperature and will throttle when heat-soaked (a problem that isn't obvious in a short benchmark run). I only gained about 500 on my Fire Strike graphics score by switching to water, since I'm currently power-limited at 112% on the stock BIOS. However, the blower fan runs at less than 30% now, so I can't hear it, and my temps at full OC load have only been about 45 C. The temp difference is remarkable, and it means no more bouncing around of boost clocks when hitting the temp wall after long gaming sessions.


Thanks for your impressions and the link to the new 10-series Hybrid kit from EVGA. I'll probably give this a shot soon unless I decide to go custom water cooling. I think this card in general probably performs even better with some H2O keeping it cool. LOL!


----------



## iluvkfc

I got 1070 SLI, the Gigabyte Windforce OC model. I'll fill out the form once my max OC is found, but I pass Heaven and Fire Strike at 2100 MHz with ease (I did not try memory yet). Anyone know if the G1 block will work on it? The PCB appears very similar. Also, does anyone know where the Pascal BIOS Tweaker is? I want a higher power target and voltage!


----------



## Blackfyre

Quote:


> Originally Posted by *deegzor*
> 
> Anyone else having issues with less demanding games? For me, it doesn't use P-states correctly in low-end games; it only runs 1506 MHz on the core in, for example, CS:GO, Diablo 3, and Payday 2. Any workaround for this?


Quote:


> Originally Posted by *bigjdubb*
> 
> I have issues with it not clocking up in less demanding areas of demanding games as well. In FO4 it sometimes doesn't want to clock up after being in the Pip-Boy screen or going from an interior to an exterior. I'm hoping it's something that will be worked out in future drivers.


This is seriously stupid. It's probably the only thing I regret about leaving AMD.

Using MSI Afterburner's "*AMD Compatible Properties*" in the settings, I had my 7970 overclocked, running at max core and memory speeds with constant max voltage, for over 5 years without any issues. Temperatures were fine when I was not gaming anyway, because at 0% GPU usage they'd only spike during gameplay.

I hate this auto-throttling bullsh** where the GPU thinks "_oh, it's FINE, I don't need to use full power_", then a lag spike happens because I suddenly enter a very demanding area or something happens on screen; it stutters for a moment, jumps the speed up, and goes back to normal.

This is literally the ONLY issue I have with the card.

Although I noticed that if you go to the *nVidia Control Panel*, then *Manage 3D Settings*, choose the game you want, and change *Power Management Mode* to *Prefer Maximum Performance*, it won't throttle below BASE clock speeds. So at worst it stays in the 1500MHz region while gaming.
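If you want to confirm whether the card really does sit below base clock under load (as opposed to normal idle downclocking), a small sketch like this can scan a clock/utilization log. It assumes output in the shape produced by `nvidia-smi --query-gpu=clocks.gr,utilization.gpu --format=csv,noheader,nounits -l 1` (a real query); the sample log and the function itself are mine, for illustration only:

```python
# Sample log: "core clock MHz, GPU utilization %", one sample per second
# (made-up values for illustration).
SAMPLE_LOG = """1924, 98
1380, 95
607, 4
1936, 99"""

BASE_CLOCK_MHZ = 1506  # GTX 1070 FE base clock

def throttled_samples(log: str, busy_threshold: int = 50):
    """Return (clock, util) pairs where the core ran below base clock under load."""
    hits = []
    for line in log.strip().splitlines():
        clock, util = (int(x) for x in line.split(","))
        # Idle downclocking is normal; only flag low clocks while the GPU is busy.
        if util >= busy_threshold and clock < BASE_CLOCK_MHZ:
            hits.append((clock, util))
    return hits

print(throttled_samples(SAMPLE_LOG))  # [(1380, 95)]
```

The 607 MHz sample at 4% utilization is ignored on purpose; that's the card idling, which is exactly the behavior you want.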


----------



## Swiftes

getting this high DPC latency problem with my new 1070, gonna try DDU tonight and start afresh.


----------



## Yetyhunter

Quote:


> Originally Posted by *Swiftes*
> 
> getting this high DPC latency problem with my new 1070, gonna try DDU tonight and start afresh.


I have the same problem even after DDU on a fresh windows install.


----------



## Blackfyre

Quote:


> Originally Posted by *Swiftes*
> 
> getting this high DPC latency problem with my new 1070, gonna try DDU tonight and start afresh.


Quote:


> Originally Posted by *Yetyhunter*
> 
> I have the same problem even after DDU on a fresh windows install.


Try installing *Process Lasso* (FREE) and see if the high DPC latency problem is solved for you guys.


----------



## Swiftes

Quote:


> Originally Posted by *Blackfyre*
> 
> Try installing *Process Lasso* (FREE) and see if the high DPC latency problem is solved for you guys.


Nice one, man. I'll give this a whirl later and see what the crack is; the audio dropouts in JC3 last night were unbearable!


----------



## Blackfyre

Quote:


> Originally Posted by *Swiftes*
> 
> Nice one, man. I'll give this a whirl later and see what the crack is; the audio dropouts in JC3 last night were unbearable!


All good, hope it works.

Seriously though, if someone could just promise me DX12 and full driver support for Windows 7 for the next few years, I wouldn't hesitate for one second to go back.


----------



## Creatinas

Has anyone had problems with FPS dips in CS:GO?

I'm playing at 1280x1024 stretched with everything on minimum. I get 400-500fps regularly, but sometimes it dips to 90-130fps and the game stutters...
It's not normal, because with my GTX 970 this didn't happen at all.

In every other game it's fine; it must be a code bug in CS:GO or some kind of conflict with a new technology present in the GTX 1070.
I went to "Manage 3D Settings" in the Nvidia control panel, switched everything to "off" and high performance, and I think the problem is solved.



This is my rig.

i5 4690K @ 4.7GHz
MSI Z87-GD65 Gaming
G.Skill Ripjaws 1866MHz
Crucial 250GB SSD
MSI GTX 1070 Gaming X (in OC mode)


----------



## Blackfyre

Quote:


> Originally Posted by *Creatinas*
> 
> Has anyone had problems with FPS dips in CS:GO?
> 
> I'm playing at 1280x1024 stretched with everything on minimum. I get 400-500fps regularly, but sometimes it dips to 90-130fps and the game stutters...
> It's not normal, because with my GTX 970 this didn't happen at all.
> 
> In every other game it's fine; it must be a code bug in CS:GO or some kind of conflict with a new technology present in the GTX 1070.
> I went to "Manage 3D Settings" in the Nvidia control panel, switched everything to "off" and high performance, and I think the problem is solved.
> 
> 
> 
> This is my rig.
> 
> i5 4690K @ 4.7GHz
> MSI Z87-GD65 Gaming
> G.Skill Ripjaws 1866MHz
> Crucial 250GB SSD
> MSI GTX 1070 Gaming X (in OC mode)


You probably didn't need to switch everything OFF in the 3D settings. Just set *Power Management Mode* to "*Prefer Maximum Performance*".


----------



## deegzor

Quote:


> Originally Posted by *Blackfyre*
> 
> You probably didn't need to switch everything OFF in the 3D settings. Just set *Power Management Mode* to "*Prefer Maximum Performance*".


Also, disabling the shader cache can help. Here are my settings: https://www.dropbox.com/s/bujxgr798ufy223/nvset1.JPG?dl=0 , https://www.dropbox.com/s/d7ptbt0u9vim269/nvset2.JPG?dl=0 , https://www.dropbox.com/s/iyb0cvk80bs7kwu/nvset3.JPG?dl=0


----------



## luan87us

Quote:


> Originally Posted by *Creatinas*
> 
> Anyone had problems with FPS dips on CSGO?
> 
> I'm playing at 1280*1024 streached with everything on minimum. I have 400-500fps regulary but sometimes it dips to 90, 130fps and the game stutters...
> It's not normal because with my gtx 970 this didn't happen at all.
> 
> In every other game its fine, must be a code bug in csgo or somekind of conflit with some new technology present in the GTX1070.
> I've went to "manage 3d Settings" in the nvidia control painel and switched everything to "off" and high performance and I think the problem is solved.
> 
> This is my rig.


I've been playing a lot of CS:GO on my new Skylake rig, all at stock with the Asus Strix 1070, and haven't encountered this issue. I have all settings on High at 1920x1080 and usually get about 300-500 fps depending on the map. The fps fluctuates between ~300-500 but rarely drops below 300.


----------



## Creatinas

I'm getting a 1930-1970 MHz core clock on my MSI GTX 1070 Gaming X just from GPU Boost 3.0. This is average/normal, right? How much are you guys getting without an actual manual OC?


----------



## XenoRad

I got my Gigabyte GTX 1070 G1 and I'm pretty happy with it. Solid performance in games, though I was expecting slightly more in Crysis 3 / GTA 5. Mind you, I am playing with full details at 1440p and 4x MSAA, so...

@Creatinas - Seems about right, I'm also getting around 1930 to 1970 stock settings, though it mostly hovers around 1949 and below in more intense applications (Crysis 3 / GTA 5).

I haven't really checked whether it starts high and gradually decreases over time, or whether it goes to a certain frequency per game and stays there, only dipping when the GPU is not fully used. Temps are around 64 C with stock fan settings as well, so there's plenty of headroom.

On another note I kinda dislike Gigabyte's Xtreme Gaming Utility. So far it's the only way to control the RGB LEDs and its two frequency modes (OC/Gaming) but otherwise it's pretty lackluster when it comes to usability and monitoring. I've just used it to set the LED to red to match my other LED's in the case and then uninstalled it.

Also, I'm not seeing the card's rated frequency modes. It lists:

Boost: 1822 MHz/ Base: 1620 MHz in OC Mode
Boost: 1784 MHz/ Base: 1594 MHz in Gaming Mode

and yet in gaming it never drops below around 1930, so what gives? Which mode is my card running in, OC or Gaming?


----------



## BulletSponge

"Out for delivery"


----------



## Blackfyre

Quote:


> Originally Posted by *BulletSponge*
> 
> "Out for delivery"


Haha!!!

And that's why I drove 20KM and just picked up mine.


----------



## luan87us

Quote:


> Originally Posted by *Creatinas*
> 
> I'm getting 1930-1970mhz core clock in my msi gtx 1070 gaming x, just with the nv boost 3.0. This is average/normal right? How much are you guys getting without actual manual oc?


That's normal. The 1070/1080 are reported to run at higher boost clocks than the manufacturer specifies. My Strix is advertised at 1830MHz (Gaming mode) and 1850MHz (OC mode), yet it runs at about 2005MHz in OC mode and 19XXMHz in Gaming mode.


----------



## Blackfyre

Anyone else have this issue with FALLOUT 4? It won't let me change the settings to ULTRA or increase VIEW DISTANCE. Every time I go into custom settings and set everything to ULTRA, it resets after I press OK.


----------



## Blze001

Quote:


> Originally Posted by *Blackfyre*
> 
> 
> 
> Anyone else have this issue with FALLOUT 4? And it won't let me change the settings to ULTRA or increase VIEW DISTANCE. Every time I go custom settings, set everything to ULTRA, it resets after I press OK.


I'm not having that error, but then again I'm launching through F4SE so maybe it bypasses the launcher checks?


----------



## bigjdubb

Quote:


> Originally Posted by *Creatinas*
> 
> I'm getting 1930-1970mhz core clock in my msi gtx 1070 gaming x, just with the nv boost 3.0. This is average/normal right? How much are you guys getting without actual manual oc?


That seems like the normal range for all 1070s. If you adjust the power limit slider in Afterburner it will boost a little higher without you ever touching the clock speed slider.
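The reason the power limit slider alone buys extra clock can be pictured with a toy model: GPU Boost runs the highest clock bin whose estimated power draw fits under the cap, so raising the cap lets a higher bin through. Every number below is made up for illustration; it is not real GTX 1070 telemetry.

```python
# Toy model of GPU Boost-style behavior: the card picks the highest clock
# bin whose estimated board power fits under the power limit. All numbers
# here are illustrative, not real NVIDIA figures.

def sustained_clock(power_limit_pct, base_power_w=150.0, bins=None):
    """Return the highest clock (MHz) whose power estimate fits the cap."""
    if bins is None:
        # (clock in MHz, estimated board power in watts) - made-up values
        bins = [(1683, 140.0), (1848, 150.0), (1911, 160.0),
                (1961, 172.0), (2012, 186.0), (2050, 200.0)]
    cap = base_power_w * power_limit_pct / 100.0
    fitting = [clk for clk, pwr in bins if pwr <= cap]
    # if even the lowest bin doesn't fit, fall back to the lowest clock
    return max(fitting) if fitting else min(clk for clk, _ in bins)

print(sustained_clock(100))  # stock 150 W cap -> 1848
print(sustained_clock(112))  # 168 W cap -> 1911
```

Same mechanism in reverse explains the throttling people complain about: a hot or power-hungry workload shrinks the cap, and the card drops to a lower bin.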


----------



## Blackfyre

Quote:


> Originally Posted by *Blze001*
> 
> I'm not having that error, but then again I'm launching through F4SE so maybe it bypasses the launcher checks?


What's F4SE?

And it's the same issue as this:


https://www.reddit.com/r/3s7j11/unidentified_video_card_issue/


----------



## LiquidHaus

Quote:


> Originally Posted by *Blackfyre*
> 
> 
> 
> Anyone else have this issue with FALLOUT 4? And it won't let me change the settings to ULTRA or increase VIEW DISTANCE. Every time I go custom settings, set everything to ULTRA, it resets after I press OK.


I get that window popup, but it does let me change everything to max/ultra.

by the way, 3440x1440p with everything max, and the card is maxxing out my 60hz monitor lol


----------



## bigjdubb

Quote:


> Originally Posted by *Blackfyre*
> 
> 
> 
> Anyone else have this issue with FALLOUT 4? And it won't let me change the settings to ULTRA or increase VIEW DISTANCE. Every time I go custom settings, set everything to ULTRA, it resets after I press OK.


I want to say mine gave me that message initially but I was able to adjust the settings. Is there a config file that it saves somewhere? Maybe you can delete that file and start over.

EDIT: Now that I think about it, I ended up doing a clean install because of too many mods and the issues that brings. Since that install I have not had that message pop up.
Quote:


> Originally Posted by *Blze001*
> 
> I'm not having that error, but then again I'm launching through F4SE so maybe it bypasses the launcher checks?


I wish I could get F4SE to work


----------



## Ysbzqu6572

I don't get these DPC latency "issues". I have quite cheap components, honestly, and I don't experience any audio or fps issues; everything is butter smooth on my G-Sync monitor. How would this high DPC latency actually show up?
EDIT: I tried LatencyMon and it reported a highest latency of about 1200 from the NVIDIA driver and 42000 overall (with no process name), but I don't experience any issues. What the heck, then? Might it actually be placebo on your side, guys?


----------



## bigjdubb

Quote:


> Originally Posted by *H4wk*
> 
> I don't get this DPC latency "issues" coz I do have quite cheap components honestly and I do not experience any audio or fps issues everything is butter smooth on G-Sync monitor.. how could this High DPC latency be seen ?


They have tools to test it. I didn't have any issues until I read the post (1080 owners thread I think) and got the tool to test it. Once I installed the stupid tool I started to get the video streaming issues that everyone was talking about.


----------



## Bee Dee 3 Dee

Guru3D: Corsair Launches Magnetic Levitation Bearing based FANs.

Corsair: CORSAIR ML Series fans

^^ should be awesome with the ASUS 1070 (FanConnect provides dual 4-pin GPU-controlled PWM fan headers... I can hook up four of them to my SLI 1070 vid cards!)

(Only 9 more days until my ASUS 1070 SLI from Amazon, and now I don't have to ask what fans would be nice to connect to them.)


----------



## MrPlankton

I have an MSI GTX 1070 Gaming X inside a Thermaltake Core V21 case. The case is for mATX/Mini-ITX boards and the motherboard mounts horizontally, sitting flat. I have a Noctua NH-D14 installed along with the standard 200 mm intake fan in the front. The only thing is that the back of the graphics card is touching the CPU fan "handles" used to secure the fans to the heatsink. In other words, the back of the card is right up against the CPU heatsink. Will this be a problem in terms of temperature?


----------



## bigjdubb

Quote:


> Originally Posted by *MrPlankton*
> 
> I have an MSI GTX 1070 Gaming X inside a Thermaltake Core V21 case. The case is for mATX/Mini-ITX boards and the motherboard mounts horizontally, sitting flat. I have a Noctua NH-D14 installed along with the standard 200 mm intake fan in the front. The only thing is that the back of the graphics card is touching the CPU fan "handles" used to secure the fans to the heatsink. In other words, the back of the card is right up against the CPU heatsink. Will this be a problem in terms of temperature?


They will be heating each other up, but there isn't much that can be done about it other than running the fans at a bit higher RPM than you would need in an ATX build. You could put a thin piece of rubber between the backplate and the handles to prevent any direct metal-to-metal contact, or even just wrap that portion of the handle with black electrician's tape.


----------



## deegzor

Quote:


> Originally Posted by *Blackfyre*
> 
> This is seriously stupid. Probably the only thing I regret from leaving AMD.
> 
> Using MSI AfterBurner's "*AMD Compatible Properties*" in the settings; I had my 7970 overclocked, running at max core and max memory speeds, and constant max voltage for over 5 years without any issues. Temperatures were fine when I was not gaming anyway, because there's 0% GPU usage, and they'd only spike up during gameplay.
> 
> I hate this auto throttling bullsh** because the GPU thinks "_oh it's FINE, I don't need to use full power_", then a lag spike happens because suddenly I enter a very demanding area or something happens on screen and it stutters for one moment, jumps the speed up and goes back to normal.
> 
> This is literally the ONLY issue I have with the card.
> 
> Although I noticed if you go to *nVidia Control Panel*, then *Manage 3D Settings*, then choose the game you want, then change *Power Management Mode* to *Prefer Maximum Performance*, it won't throttle below BASE clock speeds. So at worst it stays in the 1500MHz region while gaming.


I found a workaround for this! -> http://forums.guru3d.com/showthread.php?t=395939 After installing the skin and turning on K-Boost, when Afterburner wants to restart just press No, then go to Windows Device Manager, disable your GPU, and re-enable it.


----------



## bigjdubb

I just saw the pictures of the EK full coverage (sort of) block for the MSI cards....



I really wish I had a reference board card, I want this block:



I may see if I can sell or trade my MSI Gaming for a reference gpu.


----------



## Dude970

Quote:


> Originally Posted by *BulletSponge*
> 
> "Out for delivery"


Mine was delayed a day, will arrive tomorrow


----------



## LiquidHaus

Can someone please get a modded bios already for these cards?

There's a STRIX 1080 bios out there with the core voltage bumped to 1.24v. Most people report their cards not doing so well with the increased voltage, but almost every card out there has only ONE 8-pin connector.

Well... the Zotac AMP Extreme (both 1070 and 1080) has TWO 8-pin connectors, as well as many more power phases, which promises cleaner power delivery. I am fully confident the Zotac cards would pull ahead of any other card with a voltage increase.

I just need someone to crack a bios to unlock voltage. I wish I knew how to do it. I would just take my stock Amp Extreme bios and unlock the voltage.

UGH this is driving me nuts.


----------



## Offender_Mullet

1070 FTW coming in a few days.

About time!


----------



## mypickaxe

Quote:


> Originally Posted by *lifeisshort117*
> 
> Can someone please get a modded bios already for these cards?
> 
> There's a STRIX 1080 bios out there with the core voltage bumped to 1.24v. Most people are talking about their cards not doing so well with the increased voltage, however almost every card out there has ONE 8 pin connector.
> 
> Well... the Zotac Amp Extreme - both 1070 and 1080 - have TWO 8 pin connectors, as well as many more power phases. This promises cleaner power delivery. I am fully confident the Zotac cards would excel past any other cards with voltage increases.
> 
> I just need someone to crack a bios to unlock voltage. I wish I knew how to do it. I would just take my stock Amp Extreme bios and unlock the voltage.
> 
> UGH this is driving me nuts.


I think they're trying to save us from ourselves; who wants to blow up a card? More likely, to protect their bottom line when the Ti / Titan P line gets here.

It's telling that we haven't seen much on the Classified / K1ngp1n / Lightning GPUs for the 1080 line.


----------



## BulletSponge

Well, after waiting all day FedEx finally dropped off my bundle of joy. I got it installed, plugged in a quick +150/+500 in Afterburner, and gave her a test drive in Valley.



After letting it loop for 30 minutes....



I'm looking forward to getting that Valley score over 100 my next day off. I think this card still has a lot of untapped potential.


----------



## Dude970

Looks like a good card


----------



## BulletSponge

Quote:


> Originally Posted by *Dude970*
> 
> Looks like a good card


You are NOT going to be disappointed.

Oh, I replaced my daughter's 760 with my 970. She won't be disappointed either when she gets home.


----------



## Dude970

Quote:


> Originally Posted by *BulletSponge*
> 
> You are NOT going to be disappointed.


Looks like you are right, and I have the same CPU too. I run it at 4.5, how about you? If you have time, throw in a Firestrike normal run.


----------



## BulletSponge

I am a paranoid noob at CPU overclocking. I just did the lazy man's route: set a 42 multiplier and save settings. Yes, I am ashamed.

I don't have full Firestrike, only the demo installed I think.

Edit: 64-man Canals TDM in BF4, 1440p, all settings ultra except MSAA x2 instead of x4, and I was hitting 186fps. DEFINITELY time for a new monitor.


----------



## bigjdubb

Quote:


> Originally Posted by *BulletSponge*
> 
> Well, after waiting all day FedEx finally dropped off my bundle of joy. Got it installed plugged in a quick +150/+500 in Afterburner and gave her a test drive in Valley.
> 
> 
> 
> After letting it loop for 30 minutes....
> 
> 
> 
> I'm looking forward to getting that Valley score over 100 my next day off. I think this card still has a lot of untapped potential.


Did you clock the core up and then the memory? I was at +155/+575 and then clocked memory up first and was able to get +140/+750. Memory seems to make more of an impact than core clock.... this thing needs GDDR5X!


----------



## Forceman

Quote:


> Originally Posted by *bigjdubb*
> 
> Did you clock the core up and then the memory? I was at +155/+575 and then clocked memory up first and was able to get +140/+750. Memory seems to make more of an impact than core clock.... this thing needs GDDR5X!


The memory clocks on these cards are amazing.


----------



## BulletSponge

Quote:


> Originally Posted by *bigjdubb*
> 
> Did you clock the core up and then the memory? I was at +155/+575 and then clocked memory up first and was able to get +140/+750. Memory seems to make more of an impact than core clock.... this thing needs GDDR5X!


+140/+750


----------



## whicker

Just got my 1070 strix today. Haven't messed around too much but man was I impressed while playing overwatch on epic and the card was only clocked at 983mhz, 51c, fans not even spinning. Never dropped below 60fps. Unreal...

I'm gonna dive into overclocking and validating tomorrow but was wondering what you guys are using to OC. I usually use MSI afterburner but it seems with an Asus card it just gives me a voltage slider 0-100% and no fancy freq/volt tool like precision.


----------



## LiquidHaus

Quote:


> Originally Posted by *whicker*
> 
> Just got my 1070 strix today. Haven't messed around too much but man was I impressed while playing overwatch on epic and the card was only clocked at 983mhz, 51c, fans not even spinning. Never dropped below 60fps. Unreal...
> 
> I'm gonna dive into overclocking and validating tomorrow but was wondering what you guys are using to OC. I usually use MSI afterburner but it seems with an Asus card it just gives me a voltage slider 0-100% and no fancy freq/volt tool like precision.


Can you please, please send me your bios file for that card? Apparently the Strix cards have higher voltage limits (1.25v), at least the 1080 does, and I'd assume the 1070 does as well.


----------



## mypickaxe

Quote:


> Originally Posted by *whicker*
> 
> Just got my 1070 strix today. Haven't messed around too much but man was I impressed while playing overwatch on epic and the card was only clocked at 983mhz, 51c, fans not even spinning. Never dropped below 60fps. Unreal...
> 
> I'm gonna dive into overclocking and validating tomorrow but was wondering what you guys are using to OC. I usually use MSI afterburner but it seems with an Asus card it just gives me a voltage slider 0-100% and no fancy freq/volt tool like precision.


Afterburner is fine, it's all I need. Precision is a steaming pile: it grabs the keyboard at startup and doesn't let go for longer than should reasonably be necessary, it hangs in its sluggish UI, onscreen monitoring doesn't work half the time, it doesn't show CPU temps, and on and on.

The voltage slider is an offset of up to +100mv, not percentage points, just an FYI on that.


----------



## pez

Quote:


> Originally Posted by *Blackfyre*
> 
> 
> 
> Anyone else have this issue with FALLOUT 4? And it won't let me change the settings to ULTRA or increase VIEW DISTANCE. Every time I go custom settings, set everything to ULTRA, it resets after I press OK.


I have a feeling I'm a bit late to the party, but ensure your Fallout4.ini and Fallout4Prefs.ini (I might be off on the names) are not set to Read Only. I had this issue and that was what happened to me. Fallout 4 was still under the impression I had my GTX 780 (and I've had SLI 970s in the meantime). You should be able to find the files in C:/.../Documents/My Games/Fallout4
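For anyone who would rather script the fix than click through Explorer, here's a minimal sketch. It assumes the usual Documents\My Games\Fallout4 location and the filenames above; adjust both if your install differs.

```python
# Clear the read-only attribute on Fallout 4's config files so the
# launcher can save settings again. The directory and filenames are the
# usual defaults - adjust them if your install differs.
import stat
from pathlib import Path

def make_writable(path):
    """Remove the read-only flag from a file; return False if it's missing."""
    p = Path(path)
    if not p.is_file():
        return False
    # OR in the owner-write bit (on Windows this clears the read-only flag)
    p.chmod(p.stat().st_mode | stat.S_IWRITE)
    return True

config_dir = Path.home() / "Documents" / "My Games" / "Fallout4"
for name in ("Fallout4.ini", "Fallout4Prefs.ini"):
    if make_writable(config_dir / name):
        print(f"cleared read-only on {name}")
```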


----------



## bigjdubb

Quote:


> Originally Posted by *whicker*
> 
> Just got my 1070 strix today. Haven't messed around too much but man was I impressed while playing overwatch on epic and the card was only clocked at 983mhz, 51c, fans not even spinning. Never dropped below 60fps. Unreal...
> 
> I'm gonna dive into overclocking and validating tomorrow but was wondering what you guys are using to OC. I usually use MSI afterburner but it seems with an Asus card it just gives me a voltage slider 0-100% and no fancy freq/volt tool like precision.


Hit Ctrl+F while in Afterburner to get the boost/voltage curve.


----------



## Blackfyre

Quote:


> Originally Posted by *pez*
> 
> I have a feeling I'm a bit late to the party, but ensure your Fallout4.ini and Fallout4Prefs.ini (I might be off on the names) are not set to Read Only. I had this issue and that was what happened to me. Fallout 4 was still under the impression I had my GTX 780 (and I've had SLI 970s in the meantime
> 
> 
> 
> 
> 
> 
> 
> ). You should be able to find the files in C:/.../Documents/My Games/Fallout4


Haha! Sure enough, the .ini files were all read-only. But as you predicted, you were a bit late. I got frustrated and uninstalled the game, so I'm not going to download and install it again just to test. Maybe after I buy a new SSD and format my PC completely.

I've already finished the game with my HD7970 (_aka the 280X_). But I wanted to test the game after bumping up the graphics to see how it looks and how it feels playing it smoothly.


----------



## pez

Quote:


> Originally Posted by *Blackfyre*
> 
> Haha! Sure enough the .ini files were all read-only. But as you predicted, you were a bit late. I got frustrated and uninstalled the game. So yeah I'm not going to download and install it again just to test. Maybe after I buy a new SSD and format my PC completely.
> 
> I've already finished the game with my HD7970 (_aka the 280X_). But I wanted to test the game after bumping up the graphics to see how it looks and how it would feel like playing it smoothly.t


Haha no worries. Glad I could somewhat contribute. The game definitely benefits from an SSD as load times are cut significantly.

The game runs great at 1440p on the 1080 at absolute max, so I'm sure the 1070 will run it quite well at that res with a few minor tweaks to Godrays and AA.


----------



## QxY

Got the Zotac 1070 AMP yesterday, really impressed so far. The card is dead silent; the fans don't even run until 60C. Coming from a 780 Ti, it's nice not to hear coil whine anymore.

Gotta love those RGB LEDs, glad I didn't get the Founders Edition.

Fire Strike score: the highest temperature was 73C, and games run a few degrees lower. 4770K running at stock speeds.


----------



## Majentrix

Did some testing on my Phoenix: as soon as the core temperature hits 60C the fans spin up to 700 RPM, and when it drops back below 60 they stop. This isn't a problem in intensive games where temps sit at 65-68 most of the time, but in light games the fans constantly turn on and off as the temperature hovers around 59-60. I imagine spinning up and down for hours at a time can't be good for the fans, so I'll have to work out a fan profile to fix this at some point.
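The on/off cycling described above is a textbook hysteresis problem: one threshold for spin-up and a lower one for spin-down stops the oscillation. A minimal sketch, with example thresholds rather than the card's actual firmware values:

```python
# Two-threshold (hysteresis) fan control: spin up at 60 C but don't stop
# until the core drops to 55 C, so a temperature hovering near the
# trigger point no longer cycles the fans on and off.
# All thresholds here are example values, not vendor firmware settings.

class FanController:
    def __init__(self, on_temp=60.0, off_temp=55.0, min_rpm=700):
        self.on_temp = on_temp    # spin-up threshold (C)
        self.off_temp = off_temp  # spin-down threshold (C)
        self.min_rpm = min_rpm
        self.running = False

    def update(self, temp_c):
        """Return the target RPM for the current core temperature."""
        if temp_c >= self.on_temp:
            self.running = True
        elif temp_c <= self.off_temp:
            self.running = False
        # between the thresholds the fan simply keeps its previous state
        return self.min_rpm if self.running else 0

fan = FanController()
temps = [58, 60, 59, 61, 59, 56, 55, 54]  # hovering around the trigger
print([fan.update(t) for t in temps])
# the fan starts at 60 and keeps spinning through the 59s,
# only stopping once the core falls back to 55
```

A custom fan profile in Afterburner achieves the same effect less directly, since a nonzero minimum speed removes the stop/start transitions entirely.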


----------



## Ysbzqu6572

Normal idle temps for a 1070 with the fans stopped should be around or below 50C, not your 60.
Maybe your card isn't downclocking or something.


----------



## Majentrix

I didn't say anything about idle temps. My card idles at 35c which is perfectly normal given a 13c ambient.


----------



## Ysbzqu6572

I see, my bad. Then you might need to change the fan curve profile.


----------



## pez

Since you've got pretty low ambient temps, it sounds like you could set the fans to cut on at around 50C and have them scale normally from there. At the same time, it's pretty cool to see that a card with the power of a 1070 can run almost passively in light gaming.


----------



## Jiehfeng

Just got the Zotac Founder's edition yesterday. What does the option 1 mean? The defaults?
Anyways, I'm gonna try OC'ing both the gpu and memory clocks.


----------



## Blackfyre

Quote:


> Originally Posted by *Jiehfeng*
> 
> Just got the Zotac Founder's edition yesterday. What does the option 1 mean? The defaults?
> Anyways, I'm gonna try OC'ing both the gpu and memory clocks.


Good luck, I recommend this method for overclocking the 1070:

Just get *MSI Afterburner* (_download the latest BETA from Guru3D that works with the GTX 1070_).

I started by increasing the Power Limit percentage all the way to its maximum, which is 126%.

After that, skip core overclocking, go down to memory clock, and start increasing it in +100 increments, clicking Apply each time. Get to around +500MHz on MEMORY (make sure you're not increasing core clock).

Then run Firestrike and see if you're having any issues (like artifacts or crashing)... If not, go back and increase the CORE clock (but don't expect to get much here).

Keep using Firestrike for quick stability tests... ONCE you're completely finished, do a lot of other stress testing and MONITOR your temperatures.

*So right now I have;*

Power Limit 126%
Core Clock +106 MHz
Memory +666MHz
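The step-and-test loop above can be sketched in code. The `is_stable` callback is a placeholder standing in for "apply the offset in Afterburner and run Firestrike"; it is not a real driver API.

```python
# Sketch of the step-and-test loop described above: raise the memory
# offset in +100 MHz increments, test stability after each step, and
# keep the last good offset on the first failure. is_stable() is a
# stand-in for "apply the offset and run Firestrike", not a driver call.

def find_max_offset(is_stable, step=100, limit=1000):
    """Return the highest passing offset in MHz, stepping by `step`."""
    best = 0
    offset = step
    while offset <= limit:
        if not is_stable(offset):
            break  # first failure: stop and keep the last good offset
        best = offset
        offset += step
    return best

# Example: pretend the card artifacts above +600 MHz.
print(find_max_offset(lambda mhz: mhz <= 600))  # -> 600
```

In practice you would also re-test the final offset with longer runs, since a clock that survives one Firestrike pass can still fail after an hour of gaming.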


----------



## Schneeder

Stoked to get my card today.

Work can't go by quick enough!


----------



## Jiehfeng

Quote:


> Originally Posted by *Blackfyre*
> 
> Good luck, I recommend this method for overclocking the 1070:
> 
> Just get *MSI Afterburner* (_download the latest BETA from Guru3D that will work with the GTX 1070_).
> 
> I started by increasing Power Limit percentage all the way to the maximum, which is 126%
> 
> After that, skip core overclocking and go down to memory clock, and start increasing it by +100 increments and clicking apply. Get to around + 500 Mhz on MEMORY (make sure you're not increasing core clock).
> 
> Then go run Firestrike and see if you're having any issues with it or not (like artifacts or crashing)... If not then go back and increase CORE clock (but don't expect to get much here).
> 
> Keep using Firestrike to test for quick stability... ONCE you're completely finished do a lot of other stress testing and MONITOR your temperatures.
> 
> *So right now I have;*
> 
> Power Limit 126%
> Core Clock +106 MHz
> Memory +666MHz


I do know how to overclock; I've done it with a few other GPUs. But I learnt a few new things from you, so thank you very much! Gonna try it now.


----------



## pez

Quote:


> Originally Posted by *Blackfyre*
> 
> Good luck, I recommend this method for overclocking the 1070:
> 
> Just get *MSI Afterburner* (_download the latest BETA from Guru3D that will work with the GTX 1070_).
> 
> I started by increasing Power Limit percentage all the way to the maximum, which is 126%
> 
> After that, skip core overclocking and go down to memory clock, and start increasing it by +100 increments and clicking apply. Get to around + 500 Mhz on MEMORY (make sure you're not increasing core clock).
> 
> Then go run Firestrike and see if you're having any issues with it or not (like artifacts or crashing)... If not then go back and increase CORE clock (but don't expect to get much here).
> 
> Keep using Firestrike to test for quick stability... ONCE you're completely finished do a lot of other stress testing and MONITOR your temperatures.
> 
> *So right now I have;*
> 
> Power Limit 126%
> Core Clock +106 MHz
> Memory +*666*MHz


----------



## Oj010

I'm running out of clubs to join


----------



## Blackfyre

Quote:


> Originally Posted by *pez*


Haha! Did you know there's a *DEVIL* achievement in 3DMark? I only discovered it by chance. It shows you the achievement below when you submit a score with 666 in it. It has nothing to do with the overclock; the number has to be in the score.

Luckily for me, setting *+666* MHz for the memory boost resulted in a *3DMark score* of *16669*.

*Achievement looks like this:*
Quote:


> *You Devil*
> 
> Submit a score that contains 666.


*Check it out:* (_Click on Show Result Details and look on the right_).

http://www.3dmark.com/3dm/13078330


----------



## Phixit

GTX 1080/1070 High DPC / Latency issues :

https://forums.geforce.com/default/topic/941579/geforce-1000-series/gtx-1080-high-dpc-latency-and-stuttering/1/


----------



## whicker

Quote:


> Originally Posted by *lifeisshort117*
> 
> can you please, please send me your bios file for that card? apparently the strix cards have higher voltage limits (1.25v) well at least the 1080 does. I would like to assume the 1070 does as well.


I will see if GPU-Z lets me extract it tonight. Although, when I was messing around with the voltage slider and power limit yesterday, the card did not go any higher than 1093mv. I'm pretty sure the BIOS you are after is not on retail Asus 1080 cards and was "leaked" by someone on the Asus overclocking team.
Quote:


> Originally Posted by *mypickaxe*
> 
> Afterburner is fine, all I need. Precision is a steaming pile. It grabs the keyboard at startup and doesn't let go for longer than should be reasonably necessary. It hangs in its sluggish UI. Onscreen monitoring doesn't work half the time, it doesn't show CPU temps, and on and on.
> 
> The voltage slider is an offset up to +100mv, not percentage points, just an FYI on that.


That makes sense, thanks. I'm used to the 12mv slider that my 770 had lol. I also prefer afterburner so I'm glad I'm not missing out by sticking with it.

Quote:


> Originally Posted by *bigjdubb*
> 
> Hit control F while in afterburner to get the boost/voltage curve.


I knew it was in there somewhere! Thanks!

Quote:


> Originally Posted by *Majentrix*
> 
> I didn't say anything about idle temps. My card idles at 35c which is perfectly normal given a 13c ambient.


Damn dude, I'm in Canada and my ambients don't get much lower than 18C in the winter lol.


----------



## Blackfyre

Quote:


> Originally Posted by *Phixit*
> 
> GTX 1080/1070 High DPC / Latency issues :
> 
> https://forums.geforce.com/default/topic/941579/geforce-1000-series/gtx-1080-high-dpc-latency-and-stuttering/1/


So any official word regarding this issue? Any fix promises for coming drivers? Any acknowledgement by nVidia of the issue?

Also has anyone with high DPC-Latency checked my suggestion from a few pages back and seen if it helps?

Using *Process Lasso* might help.

Quote:


> Originally Posted by *whicker*
> 
> I will see if GPUz lets me extract tonight. *Although when i was messing around with the voltage slider and power limit yesterday the card did not go any higher than 1093mv*. I'm pretty sure the Bios you are after is not on retail Asus 1080 cards and was "leaked" by someone on the Asus overclocking team.


*This might actually confirm my theory or at least add evidence to it, I was the first one (that I know of) to mention this across multiple forums:*

http://www.overclock.net/t/1605348/bios-hardware-voltage-lock-is-preventing-gtx-1070-from-reaching-1080-performance


----------



## XenoRad

From nVidia on the geforce forums (https://forums.geforce.com/default/topic/941579/geforce-1000-series/gtx-1080-high-dpc-latency-and-stuttering/16/):
Quote:


> Thank you. No need to submit further survey feedbacks on this issue. We believe we understand the root cause and will provide a fix through a future driver. At this time I do not have an ETA but as soon as I have further information, I will share with everyone.


----------



## bigjdubb

Quote:


> Originally Posted by *Blackfyre*
> 
> *This might actually confirm my theory or at least add evidence to it, I was the first one (that I know of) to mention this across multiple forums:*
> 
> http://www.overclock.net/t/1605348/bios-hardware-voltage-lock-is-preventing-gtx-1070-from-reaching-1080-performance


While the voltage does appear to be limited by the bios, I'm not sure if I would make the leap to assume it's because they want to limit performance. There seems to be a real voltage limit on this process/architecture and it could very well be that the chips that become 1070's couldn't cut the mustard even at 1.2 something.


----------



## Blackfyre

Quote:


> Originally Posted by *bigjdubb*
> 
> While the voltage does appear to be limited by the bios, I'm not sure if I would make the leap to assume it's because they want to limit performance. There seems to be a real voltage limit on this process/architecture and it could very well be that the chips that become 1070's couldn't cut the mustard even at 1.2 something.


Yeah maybe. But I bet there will always be a few chips that can cut the mustard that slipped in. Golden chips or lucky ones (_whatever we call them_). And giving just a bit of extra headroom would not hurt in that case. It wouldn't hurt in any case. Temperatures are very safe, especially in the case of the MSI Cards with their custom cooling. I think I summed it up well in my last comment on page 2.


----------



## bigjdubb

That will most likely be the case further down the road when there are plenty of chips to go around.


----------



## LiquidHaus

Quote:


> Originally Posted by *Blackfyre*
> 
> *This might actually confirm my theory or at least add evidence to it, I was the first one (that I know of) to mention this across multiple forums:*
> 


lol you weren't, man.


https://www.reddit.com/r/4lwysn/gtx_1080_hard_limit_at_125_volts/

except they found, more importantly, that the hardware is hard capped at 1.25v, meaning software/bios wise they are all limited to ~1.1v, just with varied values across stock bioses.

all this comes down to is requiring a modded bios with unlocked voltage. simply put.


----------



## whicker

Quote:


> Originally Posted by *Blackfyre*
> 
> Good luck, I recommend this method for overclocking the 1070:
> 
> Just get *MSI Afterburner* (_download the latest BETA from Guru3D that will work with the GTX 1070_).
> 
> I started by increasing Power Limit percentage all the way to the maximum, which is 126%
> 
> After that, skip core overclocking and go down to memory clock, and start increasing it by +100 increments and clicking apply. Get to around + 500 Mhz on MEMORY (make sure you're not increasing core clock).
> 
> Then go run Firestrike and see if you're having any issues with it or not (like artifacts or crashing)... If not then go back and increase CORE clock (but don't expect to get much here).
> 
> Keep using Firestrike to test for quick stability... ONCE you're completely finished do a lot of other stress testing and MONITOR your temperatures.
> 
> *So right now I have;*
> 
> Power Limit 126%
> Core Clock +106 MHz
> Memory +666MHz


The only thing that scares me about overclocking the ram so much is if the ram modules start getting hot when fans aren't spinning.


----------



## Blackfyre

Quote:


> Originally Posted by *whicker*
> 
> The only thing that scares me about overclocking the ram so much is if the ram modules start getting hot when fans aren't spinning.


*I use a custom fan curve that prevents the fans from ever dropping to 0 speed, keeping them at 40% minimum; they're so damn silent I can't even hear them until they go higher.*
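A curve like that boils down to a floor plus a linear ramp toward 100%. A minimal sketch with example breakpoints (not MSI's defaults):

```python
# A fan curve with a 40% floor like the one described: fans never stop,
# idle at the floor, then ramp linearly to 100% between two temperature
# breakpoints. The breakpoints are example values, not vendor defaults.

def fan_percent(temp_c, floor=40.0, ramp_start=50.0, ramp_end=80.0):
    """Map core temperature (C) to fan duty cycle (%)."""
    if temp_c <= ramp_start:
        return floor
    if temp_c >= ramp_end:
        return 100.0
    # linear interpolation between the floor and 100%
    frac = (temp_c - ramp_start) / (ramp_end - ramp_start)
    return floor + frac * (100.0 - floor)

for t in (30, 50, 65, 80):
    print(t, fan_percent(t))  # 30 and 50 sit at the 40% floor
```

Keeping a nonzero floor also sidesteps the fan start/stop cycling problem mentioned earlier in the thread, since the fans never cross an on/off threshold at all.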


----------



## bigjdubb

Quote:


> Originally Posted by *whicker*
> 
> The only thing that scares me about overclocking the ram so much is if the ram modules start getting hot when fans aren't spinning.


If Afterburner is to be believed, the ram clocks down when not under load, the same way the GPU does.


----------



## Matt26LFC

Quote:


> Originally Posted by *Blackfyre*
> 
> Good luck, I recommend this method for overclocking the 1070:
> 
> Just get *MSI Afterburner* (_download the latest BETA from Guru3D that will work with the GTX 1070_).
> 
> I started by increasing Power Limit percentage all the way to the maximum, which is 126%
> 
> After that, skip core overclocking and go down to memory clock, and start increasing it by +100 increments and clicking apply. Get to around + 500 Mhz on MEMORY (make sure you're not increasing core clock).
> 
> Then go run Firestrike and see if you're having any issues with it or not (like artifacts or crashing)... If not then go back and increase CORE clock (but don't expect to get much here).
> 
> Keep using Firestrike to test for quick stability... ONCE you're completely finished do a lot of other stress testing and MONITOR your temperatures.
> 
> *So right now I have;*
> 
> Power Limit 126%
> Core Clock +106 MHz
> Memory +666MHz


Hey dude, how do you get the power limit to 126%, or isn't that possible for the Founders Edition?


----------



## Blackfyre

Quote:


> Originally Posted by *Matt26LFC*
> 
> Hey dude, how do you get the power limit to 126%, or isn't that possible for the Founders Edition?


I believe everyone that updated their BIOS or bought a GPU from the second or third batch (_that was already updated_), got screwed with an updated BIOS that had lower power limits than the cards that came out with the original batch. What's yours? 112% max?

This is one of the reasons why I haven't updated my BIOS yet. I've stayed on the original BIOS.


----------



## Matt26LFC

Oh right, yeah mine's 112%. So I guess it's stuck there.

Shame really; hopefully things will get unlocked by a BIOS update at some point. I have plenty of thermal headroom.


----------



## nacherc

Waiting for my MSI GTX 1070 GAMING X!


----------



## deegzor

Quote:


> Originally Posted by *Blackfyre*
> 
> Good luck, I recommend this method for overclocking the 1070:
> 
> Just get *MSI Afterburner* (_download the latest BETA from Guru3D that will work with the GTX 1070_).
> 
> I started by increasing Power Limit percentage all the way to the maximum, which is 126%
> 
> After that, skip core overclocking and go down to memory clock, and start increasing it by +100 increments and clicking apply. Get to around + 500 Mhz on MEMORY (make sure you're not increasing core clock).
> 
> Then go run Firestrike and see if you're having any issues with it or not (like artifacts or crashing)... If not then go back and increase CORE clock (but don't expect to get much here).
> 
> Keep using Firestrike to test for quick stability... ONCE you're completely finished do a lot of other stress testing and MONITOR your temperatures.
> 
> *So right now I have;*
> 
> Power Limit 126%
> Core Clock +106 MHz
> Memory +666MHz


Can you provide some proof that memory is more important than core?


----------



## whicker

Quote:


> Originally Posted by *Blackfyre*
> 
> *I use a custom fan curve that prevents fans from going to 0 speed, keeping them at 40% minimum since they're so damn silent I can't even hear them until they go higher:*


Quote:


> Originally Posted by *bigjdubb*
> 
> If afterburner is to be believed, the ram clocking down when not under load the same way the GPU is.


Yeah, I was thinking of making a new profile so they would start spinning at 45C, since it idles at 35C. I still want them to shut off when I'm idle; it should make the fans last a lot longer if they're technically only running at 50% duty.

I'm pretty sure my 770 would downclock memory, so it might be alright. I will have to monitor the memory frequency when using Chrome or watching movies to make sure it isn't running full clocks with the fans off.

Quote:


> Originally Posted by *Matt26LFC*
> 
> Oh right, yeah mine's 112%. So I guess it's stuck there.
> 
> Shame really; hopefully things will get unlocked by a BIOS update at some point. I have plenty of thermal headroom.


My Strix caps out at 112% as well. I'm not sure what 126% would do though if I'm already hitting 1093mv with 112%. I hope there are bios updates in the future that will let us OC over 2150mhz. These cards run so cool it would be a shame not to see them at 2500mhz.


----------



## bigjdubb

Quote:


> Originally Posted by *deegzor*
> 
> Can you provide some proof that memory is more important than core?


My benchmark scores with +155/+575 are lower than my scores with +140/+750. I was getting within spitting distance of my +155/+575 score with +0/+750.

Just mess around with it and see how it goes with your card.

Quote:


> Originally Posted by *whicker*
> 
> My Strix caps out at 112% as well. I'm not sure what 126% would do though if I'm already hitting 1093mv with 112%. I hope there are bios updates in the future that will let us OC over 2150mhz. These cards run so cool it would be a shame not to see them at 2500mhz.


It won't make any difference unless we can get beyond 1.1v. I get zero difference in clock speeds between 100% and 126% on the power slider the same way I get no difference in voltage between 75% and 100% on the voltage slider.


----------



## deegzor

Quote:


> Originally Posted by *bigjdubb*
> 
> My benchmark scores with +155/+575 are lower than my scores with +140/+750. I was getting within spitting distance of my +155/+575 score with +0/+750.
> 
> Just mess around with it and see how it goes with your card.


For me it's quite the opposite. Here's my score for comparison, with a 2152 core -> http://www.3dmark.com/fs/9256696 I would actually start losing points when overclocking the memory over 9300MHz effective (+650 set in AB).
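
For anyone wondering where the "effective" figure comes from: assuming Afterburner's slider offsets the stock 4004 MHz data-rate number and reviews quote double that for GDDR5, the arithmetic is just:

```python
# Rough arithmetic behind "+650 in AB ≈ 9300 MHz effective" on a GTX 1070.
# Assumption: Afterburner's memory offset adds to the 4004 MHz data-rate
# figure it displays, and the "effective" clock quoted in reviews is double.

GDDR5_BASE_DATA_RATE = 4004  # MHz, stock GTX 1070 memory as shown in Afterburner

def effective_clock(ab_offset_mhz):
    """Effective GDDR5 clock (MHz) for a given Afterburner memory offset."""
    return 2 * (GDDR5_BASE_DATA_RATE + ab_offset_mhz)

print(effective_clock(650))  # → 9308, i.e. the ~9300 MHz effective mentioned
print(effective_clock(0))    # → 8008, stock effective clock
```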


----------



## Matt26LFC

Hey guys I've just run the Firestrike Extreme Stress Test and got 0% Passed twice in a row. Any idea what this means?


----------



## Hunched

Quote:


> Originally Posted by *Matt26LFC*
> 
> Hey guys I've just run the Firestrike Extreme Stress Test and got 0% Passed twice in a row. Any idea what this means?


It's crashing.
Lower your clocks


----------



## Matt26LFC

Quote:


> Originally Posted by *Hunched*
> 
> It's crashing.
> Lower your clocks


Nothing's overclocked; I'm running stock frequencies. I'm also sat here watching it run the whole way through, which takes about 10 mins or so, and it certainly isn't crashing.


----------



## Blackfyre

Quote:


> Originally Posted by *deegzor*
> 
> Can you provide some proof that memory is more important than core?


Nowhere did I say that memory is more important than core, nor is it more important in benchmarks and gaming. But core overclocking is limited; memory overclocking has bigger potential. This is why you start overclocking the memory first (_which is the advice I was giving_). If you get it to a stable +650MHz that's good, stop there; no need to go further, keep it on the safe side. In fact, push it to +700MHz, make sure it doesn't crash under stress testing, then back it down to +650MHz. Then start overclocking the core, because you won't get much out of it anyway (_we're all capped_).
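
The step-up-then-back-off logic above can be sketched like this. Here `passes` is a hypothetical stand-in for running Firestrike and watching for artifacts or crashes; real tuning is done by hand in Afterburner, so this is only the procedure, not a tool:

```python
# Sketch of "raise the offset, stress test, then back off for margin".
# `passes` is a hypothetical callback: True if a stress run at that offset
# completed with no artifacts or crashes. step/limit/margin are assumptions.

def find_stable_offset(passes, step=100, limit=900, margin=50):
    """Raise the memory offset in `step` increments until `passes` fails or
    the limit is reached, then return the last passing value minus a margin."""
    last_good = 0
    offset = step
    while offset <= limit:
        if not passes(offset):
            break                # first failing offset found, stop climbing
        last_good = offset       # passed: remember it and keep going
        offset += step
    return max(0, last_good - margin)

# Toy example: pretend everything up to +700 passes, as in the post above.
print(find_stable_offset(lambda off: off <= 700))  # → 650
```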


----------



## Blackfyre

Quote:


> Originally Posted by *Matt26LFC*
> 
> Hey guys I've just run the Firestrike Extreme Stress Test and got 0% Passed twice in a row. Any idea what this means?


Okay it only ever crashes for me if the OC is not stable. Otherwise I get 98%+

Are you running a legal version of 3DMark? Right-click it in Steam, go to Properties, and check file validity (or whatever it's called).

It'll check all your FILES and if there's an update it will force it to update.


At STOCK speeds you should never be crashing. That's not normal. Hopefully it's a software issue and not hardware.


----------



## Matt26LFC

Quote:


> Originally Posted by *Blackfyre*
> 
> Okay it only ever crashes for me if the OC is not stable. Otherwise I get 98%+
> 
> Are you running a legal version of 3DMark? Right click it on steam and go properties and check file validity or whatever it's called.
> 
> It'll check all your FILES and if there's an update it will force it to update.
> 
> At STOCK speeds you should never be crashing. That's not normal. Hopefully it's a software issue and not hardware.


Yeah, it's a legal version; I bought it through Steam last year during the summer sale.

Tried looking for the "check file validity" option; the only thing I could find like that was about the cache. Checked it, and it says it's fine.

The weird thing is it's not crashing: it runs through all the loops, but when it presents me with a score of 0% it also says only 1 loop completed, even though I've just sat here and watched it run through them all. I'm guessing this is a software issue, as everything is fine no matter what else I'm doing, benching or playing Fallout 4. Just one of those weird yet annoying things.


----------



## zeroibis

The MSI EK X are in stock: http://www.newegg.com/Product/Product.aspx?Item=N82E16814127956

Just picked one up and it is already being packed! I also grabbed a 1TB SSD for some more gaming space!


----------



## bigjdubb

Quote:


> Originally Posted by *deegzor*
> 
> For me it's quite the opposite. Here's my score for comparison, with a 2152 core -> http://www.3dmark.com/fs/9256696 I would actually start losing points when overclocking the memory over 9300MHz effective (+650 set in AB).


Well you are able to get quite a bit more on the clock than I was, and also able to get more on the memory with that clock.

When I clock to +155 (which equals 2075 until 48 degrees) I can only get +575 on the memory. If I back the clock off to +140 I can get +750 on the memory. The extra 175MHz on the memory does more for me than the additional 15MHz on the core clock.

I don't have a Silicon Lottery winning card.


----------



## wrathofbill

The best I can clock is +235 on the core and only +200 on memory; still happy with a total clock speed of around 2063MHz and memory at 4200MHz.


----------



## Blackfyre

Quote:


> Originally Posted by *Matt26LFC*
> 
> Yeah its a legal version, bought it through steam last year during the summer sale
> 
> Tried looking for the Check file validity, only thing I could find like that was about cache, checked it, says it was fine
> 
> Weird thing is its not crashing, its running through all the loops, but when it presents me with a score of 0% it also says only 1 loop completed even though I've just sat here and watched it run through them all, I'm guessing this is a software issue as its fine no matter what else i'm doing, benching or playing Fallout 4, just one of those weird yet annoying things


Sounds like a software issue. Either driver or 3DMark's issue.

That's weird. I hope it gets fixed or you find a solution


----------



## wrathofbill

Quote:


> Originally Posted by *Matt26LFC*
> 
> Hey dude, how do you get the power limit to 126% or isn;t that possible for the founders edition?


Quote:


> Originally Posted by *Blackfyre*
> 
> Nowhere did I say that memory is more important than core, nor is it more important in benchmarks and gaming. But Core overclocking is limited, memory overclocking has a bigger potential. This is why you start overclocking the Memory first (_which is the advice I was giving_). If you get it to a stable +650MHz that's good, stop there. No need to go further, keep it on the safe side. In fact, push it to +700Mhz, make sure it doesn't crash under stress testing, then back it down to +650Mhz. Then start overclocking the core because you won't be going much with it anyway (_we're all capped_).


Quote:


> Originally Posted by *bigjdubb*
> 
> Well you are able to get quite a bit more on the clock than I was, and also able to get more on the memory with that clock.
> 
> When I clock to +155 (which equals 2075 until 48 degrees) I can only get +575 on memory. If I back that clock off to +140 I can get +750 on the memory. The extra 175mhz on the memory does more for me than the additional 15mhz on the core clock.
> 
> I don't have a Silicon Lottery winning card.


+235 on the core and only +200 on memory; still happy with a total clock speed of around 2063MHz and memory at 4200MHz. Also tried memory first, but around +200 artefacts and crashing start to occur.

Am I doing something wrong or did I pull a short straw somewhere?


----------



## Blackfyre

Quote:


> Originally Posted by *wrathofbill*
> 
> +235 on the core and only +200 on memory; still happy with a total clock speed of around 2063MHz and memory at 4200MHz. Also tried memory first, but around +200 artefacts and crashing start to occur.
> 
> Am I doing something wrong or did I pull a short straw somewhere?


Wait, which *GTX 1070* are you using? Brand/model.

I know every card is different, and every card boosts differently, but how are you only getting 2063MHz with +235MHz on the core?

I get 2100MHz by adding +110MHz (_which is where I start seeing artifacts_). But at +106MHz it's perfectly stable (_stress testing, benchmarking, and gaming for hours_). That basically translates to 2088MHz.

As for memory, I've gone up to +800 and it's stable; I haven't pushed further though. I went back down to +666 and left it there.

*My current setup for the last few days has been this:*

*Core Voltage* = 100%.
*Power Limit* = 126%
*Core Clock* = +106 MHz
*Memory Clock* = +666 MHz

That and the *custom fan curve* I posted earlier give me brilliant results and a nice boost over stock on my *MSI GTX 1070 Gaming X*.


----------



## wrathofbill

Quote:


> Originally Posted by *Blackfyre*
> 
> Wait, which *GTX 1070* are you using? Brand/model.
> 
> I know every card is different, and every card boosts differently, but how are you only getting 2063MHz with +235MHz on the core?
> 
> I get 2100MHz by adding +110MHz (_which is where I start seeing artifacts_). But at +106MHz it's perfectly stable (_stress testing, benchmarking, and gaming for hours_). That basically translates to 2088MHz.
> 
> As for memory, I've gone up to +800 and it's stable; I haven't pushed further though. I went back down to +666 and left it there.
> 
> *My current setup for the last few days has been this:*
> 
> *Core Voltage* = 100%.
> *Power Limit* = 126%
> *Core Clock* = +106 MHz
> *Memory Clock* = +666 MHz
> 
> That and the *custom fan curve* I posted earlier give me brilliant results and a nice boost over stock on my *MSI GTX 1070 Gaming X*.


I am new to PC building and gaming, but learning. I have the MSI FE card; it boosts up to 2100 on startup but settles around 2063 and is stable.
Core voltage is locked
Power limit 112%
Core clock +235
Memory clock +200
With a custom fan curve it sits just under 70C when gaming.

I've tried to get the memory higher, but that's about it...

http://www.3dmark.com/fs/9281567


----------



## Dude970

I got my new GPU, really happy. I added +650 mem and +100 core with the power limit at 126; didn't touch my CPU. Broke 20K graphics and 15K overall.


----------



## marik123

I just got a 1070 Strix for $390.14 shipped; now waiting for them to ship it to me.


https://jet.com/product/ASUS-GeForce-GTX-1070-STRIX-GTX1070-8G-GAMING-8GB-256-Bit-GDDR5-PCI-Express-30-H/8ce8d8f42a9446eba44c80b4120395e8


----------



## Dude970

Great Price too!


----------



## marik123

Quote:


> Originally Posted by *Dude970*
> 
> Great Price too!


I figured paying $11.14 more than the $379 MSRP isn't that bad, given that it has a far better cooler than reference. I'd been waiting a month already for the price to go down and was planning to go for the Gigabyte Windforce at $399.99 + $4.99 shipping. I'm glad I waited.


----------



## prey1337

Downloaded the Afterburner Beta, seemed to help just a little like mentioned earlier in the thread.

2088 MHz looks like the sweet spot for now.


On some testing I did see it touch 2100 MHz sometimes, but that didn't prove to be very stable.


----------



## QxY

Quick question. My Zotac 1070 AMP's boost clock limit is supposed to be 1797MHz, but in gaming/benches it seems to go all the way up to 1949. Is it normal to exceed the boost limit like that?


----------



## Dude970

Yes, that is normal...GPU Boost 3.0


----------



## mypickaxe

Quote:


> Originally Posted by *QxY*
> 
> Quick question. My Zotac 1070 AMP's boost clock limit is supposed to be 1797MHz, but in gaming/benches it seems to go all the way up to 1949. Is it normal to exceed the boost limit like that?


As Dude970 indicated, it's not a hard limit; it's more of a "recommendation." You are seeing more due to GPU Boost 3.0, which gives you that gain because the boost table affords a higher clock based on power limit, voltage point (this is key), and thermals.

It's basically a free overclock.
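
Conceptually, GPU Boost picks the highest clock that all the limits allow. The toy model below invents its own numbers and linear shapes purely to show that idea; the real algorithm works from a factory-fused voltage/frequency table, not formulas like these:

```python
# Loose conceptual model of GPU Boost 3.0: the card opportunistically clocks
# above its rated boost when power and thermal headroom allow. All constants
# here (MHz-per-watt, temperature target) are invented for illustration.

def boost_clock(rated_boost, power_headroom_w, temp_c, temp_target=83):
    """Pick a clock at or above the rated boost given spare power and temps."""
    # each spare watt of power headroom is worth some extra MHz (assumption)
    power_bonus = max(0, power_headroom_w) * 2
    # shrink the bonus as the core approaches its temperature target
    thermal_scale = max(0.0, min(1.0, (temp_target - temp_c) / temp_target))
    return round(rated_boost + power_bonus * thermal_scale)

# e.g. a card rated for 1797 MHz boost, with spare power and a cool core,
# lands well above its rated boost "for free"
print(boost_clock(1797, 100, 55))
```

This matches what owners in the thread observe: the better the cooler and the more power headroom, the further past the rated boost the card settles.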


----------



## Mad Pistol

Quote:


> Originally Posted by *QxY*
> 
> Quick question. My Zotac 1070 AMP's boost clock limit is supposed to be 1797MHz, but in gaming/benches it seems to go all the way up to 1949. Is it normal to exceed the boost limit like that?


Absolutely normal. The card basically pushes itself as far as it can comfortably go without exceeding the power limits. The better the card, the higher the boost.


----------



## whicker

Quote:


> Originally Posted by *lifeisshort117*
> 
> can you please, please send me your bios file for that card? apparently the strix cards have higher voltage limits (1.25v) well at least the 1080 does. I would like to assume the 1070 does as well.


Here it is, although I doubt it is what you are looking for.

AsusROGStrixBios 148k .zip file


Make sure to extract it ofc.


----------



## LiquidHaus

Quote:


> Originally Posted by *whicker*
> 
> Here it is, although I doubt it is what you are looking for.
> 
> AsusROGStrixBios 148k .zip file
> 
> 
> Make sure to extract it ofc.


awesome. thanks man!


----------



## whicker

Ran some Firestrike runs today and was getting good results. Boosting to 2030 with +150/+500 (stock core freq is 1506MHz), and voltage didn't really go over 1000mV. I'm thinking I can push it quite a bit further.


----------



## Dreamliner

I sold my Gigabyte 970 G1 and I'm looking for a 1070. Any standouts?

I've got an Asus board, so I started looking at the 1070 Strix. There are 2 variants with different clocks; as far as I can tell, they're otherwise identical. Shouldn't I just get the cheaper one and OC it?

Should I get the Strix or is there a better one?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814126111
http://www.newegg.com/Product/Product.aspx?Item=N82E16814126109


----------



## Majentrix

You really can't go wrong with any 1070; they're all cool, quiet, and good overclockers. Get the one whose look you like best.
The Strix has the advantage of fan connectors you can plug your case fans into, tying their speed to GPU temperature.

And don't pay extra for a higher-clocked model; they OC about the same either way.


----------



## Dreamliner

I wonder how long it will be before these cards are bundled with a game or two...


----------



## LiquidHaus

It's not for everyone, but as each day passes, I am loving my AMP Extreme more and more.


----------



## whicker

Quote:


> Originally Posted by *Dreamliner*
> 
> I sold my Gigabyte 970 G1 and I'm looking for a 1070. Any standouts?
> 
> I've got an Asus board, so I started looking at the 1070 Strix. There are 2 variants with different clocks; as far as I can tell, they're otherwise identical. Shouldn't I just get the cheaper one and OC it?
> 
> Should I get the Strix or is there a better one?
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814126111
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814126109


Get the cheaper one and OC it yourself. Another thing the Strix has that the other cards don't is 2x HDMI. That's why I bought it, but some may need 3x DP.


----------



## pez

Quote:


> Originally Posted by *Blackfyre*
> 
> Haha! Did you know there was a *DEVIL* achievement on 3DMark. I only discovered it by chance. It shows you the achievement below when you submit a score with the number 666 in it. But it has nothing to do with the overclock. The number has to be in the score
> 
> Luckily for me though setting *+666* MHz for the memory boost resulted in a *3DMark score* of *16669*
> 
> *Achievement looks like this:*
> *Check it out:* (_Click on Show Result Details and look on the right_).
> 
> http://www.3dmark.com/3dm/13078330


Nice.


----------



## Dreamliner

Have any of these cards been confirmed NOT to whine? I've read a lot of them have a high-pitched whine...


----------



## Blackfyre

Quote:


> Originally Posted by *Dreamliner*
> 
> Have any of these cards been confirmed NOT to whine? I've read a lot of them have a high-pitched whine...


They only whine at high FPS; the higher the FPS, the louder the whine. So if you're the type of person who has V-Sync on in 95% of games, running @ 60FPS like me, the card never whines.

When I play CS:GO with V-Sync off and the FPS is through the roof, it whines, but speakers or headphones make it inaudible anyway.


----------



## xCamoLegend

CPU at 4.4ghz
Ram timings overclocked to 2133mhz 9-11-10-28-128-1
GPU at +50 Core Volts, 126% Power Limit, +130 Core and +665 Mem


----------



## bigjdubb

Quote:


> Originally Posted by *Dreamliner*
> 
> Have any of these cards been confirmed NOT to whine? I've read a lot of them have a high-pitched whine...


I haven't heard anything from my MSI but I don't run anything at high FPS. If I am getting more than 80fps on average I turn the settings up, if the settings can't go up I play it on my 4k screen.


----------



## Majentrix

You can also set a specific frame rate target in the NVIDIA Control Panel, and the card will clock itself down or up to try to meet that target. Set it to your monitor's refresh rate and the whine should go away.
Quote:


> Originally Posted by *bigjdubb*
> 
> I haven't heard anything from my MSI but I don't run anything at high FPS. If I am getting more than 80fps on average I turn the settings up, if the settings can't go up I play it on my 4k screen.


It usually happens in menus where the card is going full steam rendering thousands of useless frames a second.


----------



## Dreamliner

Quote:


> Originally Posted by *Blackfyre*
> 
> They only whine at High FPS. The higher the FPS the higher the whine. So if you're the type of person who has V-Sync on in 95% of the games and they're running @ 60FPS, like me for example, my card never whines.
> 
> When I play CS Go and FPS is through the roof, and V-Sync is off, it whines. But speaker or headphones make it inaudible anyway.


Quote:


> Originally Posted by *Majentrix*
> 
> You can also set a specific frame rate target in the nvidia control panel and the card will clock itself down or up to try and meet that target. Set it to the refresh rate of your monitor and the whine should go away.
> It usually happens in menus where the card is going full steam rendering thousands of useless frames a second.


I have a 4K monitor. One frustration I have is the 1070 isn't quite powerful enough to give me 60FPS @ 4K on new games. My backlog for old games is enormous, though.

I do wonder how electronic whine isn't resolved before shipping. I had a laptop with an Intel chip that whined...I didn't keep it long.


----------



## Jiehfeng

Alright @Blackfyre, I downloaded Firestrike today and did as you said. I started off bringing the power limit to the max, which was 112 for me. Then I started with +500MHz in MSI Afterburner (new beta) and ran Firestrike; no problems at all. Then I kept increasing it, and now I'm at +900MHz and it still runs fine in Firestrike. Kinda seems abnormal since yours was +600MHz; what gives? I've paused my OC'ing for now.


----------



## DrumAndBass

Hey mates, I bought a Palit 1070 Super Jetstream two days ago and I'm happy as F***. You bet! From a 560 Ti to a 1070, lol.
Such a nice and quiet card, and so cool: 61 degrees at 2100 core clock at 1500 RPM. Amazing cooling system!

I will copy my post here from guru3d, maybe someone can help me sort out this new MSI Afterburner curve stuff.

*Heaven 4 results*:

Screens: http://imgur.com/g8faK


So, the maximum voltage I could get was 1.093 with the new 'curve mode management'.

At 2150 core clock, 1.093v and 114 power limit, the Heaven bench crashed for me with a blue screen. The driver reset itself.

At 2120 core clock - same thing.

At 2101 core clock - all fine, passed. No artifacts. 100% GPU usage 100% of the time, temps good - 59-61 degrees. Again, not a single artifact, whereas they were happening at 1.075v.

_On very rare occasions_ the card goes one state backward on the curve, which I set to 2088/1.081v. You can see it on the top graphs: very short and rare occasions. Also note that whenever this step backward happens, it is at the exact moment the power limit hits max.

Considering the card was at 100% for the whole bench, those half-second backward steps can be neglected - _the 2101 core clock line was dead flat the whole bench_ <- which is perfect - zero throttling.

So, those are my results. Also, for reasons I could only divine, the card won't use any point on the curve after 1.093v. I tend to think this is due to a BIOS power limitation, so it doesn't matter where you set curve points after 1.093 - the card ignores them. Maybe someone could correct me on that or figure out this curve stuff.

Question - was anyone able to increase the 114 power limit?


----------



## Blackfyre

Quote:


> Originally Posted by *Jiehfeng*
> 
> Alright @Blackfyre, I downloaded firestrike today and did as you said. I started off bringing the power limit to the max which was 112 for me. Then I started with 500MHz+ with msi afterburner (new beta) and ran firestrike, no problems at all. Then I kept increasing it and now I'm at 900MHz+ and it still is running fine on firestrike. Kinda seems abnormal since yours was 600MHz+, what gives? I've paused my OC'ing for now.


The gains that you get from overclocking the memory past the 600MHz to 700MHz region are minimal. It's not worth going higher.

Also, the temperature we see in MSI Afterburner or any other program is the reading for the CORE, not the memory; basically you don't know what temperature the memory is running at. Increasing it to +900MHz could mean it's running really hot without you knowing, which could be damaging in the long term (_it's unknown_). So it's better to be safe and stay in the +600MHz to +700MHz region.

CORE overclocking on the other hand, every +25MHz results in bigger changes when it comes to benchmark scores or frame-rate. But we're all limited.

*For Anyone Interested:*

I finally hit 21K on the GRAPHICS score in 3DMark today by increasing the fan speed and creating a custom core/voltage curve.

*http://www.3dmark.com/fs/9295027*


----------



## Blze001

Quote:


> Originally Posted by *Dreamliner*
> 
> I do wonder how electronic whine isn't resolved before shipping. I had a laptop with an Intel chip that whined...I didn't keep it long.


Components have always whined, it's a byproduct of working chips to their max. Computers only got quiet enough for people to notice relatively recently. I guarantee my old Pentium III and GeForce 4 whined, but there was no way I was ever going to hear it over the din those old-school component fans made.


----------



## xCamoLegend

So what's the reason these cards don't get more stable when you apply more voltage? Anyone got any guesses?


----------



## prey1337

Quote:


> Originally Posted by *xCamoLegend*
> 
> So what's the reason these cards don't get more stable when you apply more voltage? anyone got any guesses?


They seem to be capped at 1.093v. It's a different GPU architecture, so maybe there's a reason the manufacturer limits them to these lower voltages.


----------



## Blackfyre

Quote:


> Originally Posted by *Matt26LFC*
> 
> Hey guys I've just run the Firestrike Extreme Stress Test and got 0% Passed twice in a row. Any idea what this means?


I was wrong. Mine is doing the same thing too now, even at stock clocks. There was a 5MB update to 3DMark around 2 days ago, so when I tested today it was giving me 0% even though it was finishing all 20 runs. Exactly the same issue as you.

Update must have made things conflict somewhere.


Overclock still very stable though.


----------



## mickeykool

Quote:


> Originally Posted by *Blackfyre*
> 
> I was wrong. Mine is doing the same thing too now. Even at stock clocks. There was a 5Mb update to 3DMark around 2 days ago, so when I tested today it was giving me 0% even though it was finishing all 20 runs. Same issue as you exactly.
> 
> Update must have made things conflict somewhere.
> 
> 
> Overclock still very stable though.


Ok, I was scratching my head last night as well. I know I ran this test a few days ago and it ran fine; then last night I got 2 fails in a row. Tested it at stock and got the same thing, so it seems to be a software issue.


----------



## Blackfyre

Quote:


> Originally Posted by *mickeykool*
> 
> Ok, I was scratching my head last night as well. I know I ran this test a few days ago and it ran fine; then last night I got 2 fails in a row. Tested it at stock and got the same thing, so it seems to be a software issue.


100% Software issue.

It's good I remembered there was a small 5MB update for it on Steam around 2 days ago. Otherwise I'd be sitting here panicking too. lol


----------



## Matt26LFC

Haha, good to hear guys! Was getting a little concerned; guess I can stop the test I literally just started.

Came home from work this afternoon and decided to forget it for a bit, so I watched Deadpool instead. Gotta say, great movie lol. The only problem with it is I can't ever see it for the first time again lol


----------



## joloxx9

Guys, how did you force your cards to use 1.093v? My card almost always stays at 1.075-1.081; I saw 1.093 only once. I've got an MSI Gaming X 1070 with the OC BIOS.


----------



## bigjdubb

Quote:


> Originally Posted by *joloxx9*
> 
> Guys, how did you force your cards to use 1.093v? My card almost always stays at 1.075-1.081; I saw 1.093 only once. I've got an MSI Gaming X 1070 with the OC BIOS.


Set the sliders (Voltage and Power Target) to max and play a game that pushes your card.


----------



## prey1337

Quote:


> Originally Posted by *joloxx9*
> 
> Guys, how did you force your cards to use 1.093v? My card almost always stays at 1.075-1.081; I saw 1.093 only once. I've got an MSI Gaming X 1070 with the OC BIOS.


I have everything slid to max and I only see 1.093V and 2088MHz in benchmarks.

Doom maxed out I sit between 1.080V-1.093V and 2075MHz-2088MHz.


----------



## Blackfyre

Quote:


> Originally Posted by *joloxx9*
> 
> Guys, how did you force your cards to use 1.093v? My card almost always stays at 1.075-1.081; I saw 1.093 only once. I've got an MSI Gaming X 1070 with the OC BIOS.


No, you shouldn't force constant voltage. It's not safe when the GPU clocks itself down and you're giving it 1.093v at 200MHz.

However, if you press CTRL + F while using MSI Afterburner you will be presented with a voltage/frequency curve editor.

First of all, don't play with this if you don't know what you're doing, so I recommend reading about it first.

But if you locate where 1093 is on the x axis and click on it, you can drag that point up to the clock you want to run at that voltage.

So for example, I drag that up to 2101MHz and exit the curve, and it forces the video card to use 2101MHz at 1.093v. But the problem you describe happens because of a really stupid safety measure in these GPUs that we found, which automatically clocks the card down the moment it goes beyond 54 degrees celsius, and again after 65 degrees celsius.

So right now we're all basically capped. There's no point trying to push over the limit, because you can't push much anyway; it's not worth the hassle. We're blocked by either BIOS or hardware locks, and it's frustrating.


----------



## krytikul

OC(O8G) or non-OC(8G) Strix?

Does anyone know if the cards are binned better for OC?

About to pull the trigger on the OC Strix for $419.


----------



## Eric1285

Quote:


> Originally Posted by *krytikul*
> 
> OC(O8G) or non-OC(8G) Strix?
> 
> Does anyone know if the cards are binned better for OC?
> 
> About to pull the trigger on the OC Strix for $419.


Dang, where are you finding it for that price?


----------



## krytikul

Quote:


> Originally Posted by *Eric1285*
> 
> Quote:
> 
> 
> 
> Originally Posted by *krytikul*
> 
> OC(O8G) or non-OC(8G) Strix?
> 
> Does anyone know if the cards are binned better for OC?
> 
> About to pull the trigger on the OC Strix for $419.
> 
> 
> 
> Dang, where are you finding it for that price?
Click to expand...

Here on Jet.com

Use SHOP10 to get ~$47 off and waive the return for another $10 off bringing it to exactly $419.09

If you pay by debit card it comes to around $412.


----------



## Eric1285

Quote:


> Originally Posted by *krytikul*
> 
> Here on Jet.com
> 
> Use SHOP10 to get ~$47 off and waive the return for another $10 off bringing it to exactly $419.09
> 
> If you pay by debit card it comes to around $412.


Oh cool, didn't know about the SHOP10 code.


----------



## joloxx9

Thanks for your reply guys.

I found a game that is not stable at 2075/2088: War Thunder. You can check it yourself, as the game is free. I can play Witcher 3, Doom, etc. without any issues, but War Thunder gives me red artifacts when I OC past 2065, so I will stick with that speed.


----------



## bigjdubb

Quote:


> Originally Posted by *joloxx9*
> 
> Thanks for your reply guys.
> 
> I found a game that is not stable at 2075/2088: War Thunder. You can check it yourself, as the game is free. I can play Witcher 3, Doom, etc. without any issues, but War Thunder gives me red artifacts when I OC past 2065, so I will stick with that speed.


Artifacts are usually a memory issue, do you have your memory overclocked?


----------



## joloxx9

It's always caused by the GPU: if I disable the memory OC I can still see them, and if I set the GPU clock to 2100 there are many more of them. At 2065 nothing, but in Witcher they appear even at 2088. Maybe I didn't win the silicon lottery.







Or it's because of my OC BIOS from MSI; I hope things will be much better when we get a BIOS tweaker.

I hope MSI will drop the Gaming Z soon so I can check the BIOS from them.


----------



## Blackfyre

Quote:


> Originally Posted by *joloxx9*
> 
> It's always caused by the GPU: if I disable the memory OC I can still see them, and if I set the GPU clock to 2100 there are many more of them. At 2065 nothing, but in Witcher they appear even at 2088. Maybe I didn't win the silicon lottery.
> 
> 
> 
> 
> 
> 
> 
> Or it's because of my OC BIOS from MSI; I hope things will be much better when we get a BIOS tweaker.
> 
> I hope MSI will drop the Gaming Z soon so I can check the BIOS from them.


You definitely have a bad chip here. At *2088 MHz* I have 0 issues, and with a custom fan curve I can maintain that for hours of gaming on the Witcher 3.


----------



## mickeykool

Quote:


> Originally Posted by *Blackfyre*
> 
> You definitely have a bad chip here. At *2088 MHz* I have 0 issues, and with a custom fan curve I can maintain that for hours of gaming on the Witcher 3.


Can you screenshot your custom fan profile?


----------



## joloxx9

Well, it's only 21 MHz


----------



## whicker

Quote:


> Originally Posted by *Blackfyre*
> 
> So for example I drag that up to 2101 MHz and exit the curve. And it forces the videocard to use 2101 MHz at 1.0930v but the problem you explain happens because there's this really stupid safety measure in the GPU's that we found which automatically clocks your videocard down the moment it goes beyond 54 degrees celsius. And again after 65 degrees celsius.


I did not know this! The 54C thing is crazy; the card shouldn't be clocking down till it hits its temp limit. I guess I should finish reading this thread so I know these kinds of things. I'm assuming the temp downclocking is from the BIOS?


----------



## joloxx9

Quote:


> Originally Posted by *whicker*
> 
> I did not know this! The 54C thing is crazy; the card shouldn't be clocking down till it hits its temp limit. I guess I should finish reading this thread so I know these kinds of things. I'm assuming the temp downclocking is from the BIOS?


It's from super ultra boost 3.0


----------



## LiquidHaus

I guess the solution here is to watercool the cards, otherwise we're all gonna throttle.


----------



## joloxx9

It will be hard to keep the card under 54C; a BIOS tweak is the answer.


----------



## bigjdubb

My first speed drop occurs at 48*c (2063 to 2050) then at 54*c it drops again (2050 to 2037). I'm not sure what happens beyond that because I never get above 56*. My fan curve looks like a cliff, 20% fan speed until 45* where it rises nearly straight up to 100% at 48*.

I will hopefully have mine under water this weekend and I shouldn't have any trouble keeping it under 48* with a 480mm rad and 8 fans.
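That cliff shape is just linear interpolation between a few (temperature, duty) points. A minimal sketch using the numbers above (the linear interpolation between points is an assumption about how the fan software fills the gaps):

```python
# "Cliff" fan curve: flat 20% duty up to 45C, then a near-vertical
# ramp to 100% at 48C. Points taken from the settings described above.

POINTS = ((0, 20), (45, 20), (48, 100))  # (temp C, fan duty %)

def fan_duty(temp_c, points=POINTS):
    """Fan duty % at a given GPU temperature, interpolating linearly
    between the user-placed curve points."""
    pts = sorted(points)
    if temp_c <= pts[0][0]:
        return float(pts[0][1])
    for (t0, d0), (t1, d1) in zip(pts, pts[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return float(pts[-1][1])  # past the last point: pin at max

# fan_duty(44) -> 20.0, fan_duty(46.5) -> 60.0, fan_duty(50) -> 100.0
```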


----------



## wrathofbill

Quote:


> Originally Posted by *Blackfyre*
> 
> Wait which *GTX 1070* are you using? Brand/Model.
> 
> I know every card is different, and every cards boosts differently, but how are you only getting 2063MHz with +235 MHz on the core?
> 
> I get 2100MHz by adding +110 MHz (_which is where I start seeing artifacts_). But by using +106MHz it's perfectly stable (_stress testing, benchmarking, and gaming for hours_). That basically translates to 2088 MHz.
> 
> As for memory, I've gone up to +800 and it's stable; I haven't pushed further though. I went back down to +666 and left it there.
> 
> *My current setup for the last few days has been this:*
> 
> *Core Voltage* = 100%.
> *Power Limit* = 126%
> *Core Clock* = +106 MHz
> *Memory Clock* = +666 MHz
> 
> That and the *custom fan curve* I posted earlier give me brilliant results and a nice boost over stock on my *MSI GTX 1070 Gaming X*.


Downloaded the Afterburner beta, and after an hour of tinkering I still can't get the memory more than +200 stable before artifacts come into play. The only difference with the beta is that, with the little voltage boost and keeping +235 on the core, it gets up to 2088MHz stable and sometimes flickers to 2100MHz. FE MSI GTX 1070


----------



## bigjdubb

Maybe the cooling and power delivery systems on the aftermarket cards are helping out the memory. It's tough to compare a reference card with a custom-PCB card; there are quite a few variables.


----------



## Blackfyre

Quote:


> Originally Posted by *whicker*
> 
> I did not know this! The 54C thing is crazy; the card shouldn't be clocking down till it hits its temp limit. I guess I should finish reading this thread so I know these kinds of things. I'm assuming the temp downclocking is from the BIOS?


I said we, but I meant me









Nah I said we because I assume others have definitely noticed it too. Because it's really frustrating. I can't believe these stupid and moronic implementations are made. I can guarantee it's intentional to nerf our cards.

Hopefully unlocked and modified BIOSES in the future can fix these stupid implementations.

Quote:


> Originally Posted by *wrathofbill*
> 
> FE MSI GTX 1070


I've heard this several times now from those who own the FE. However, the positive with the FE is that the core can clock higher; getting them water cooled almost guarantees 2150MHz on the core.


----------



## bigjdubb

Quote:


> Originally Posted by *Blackfyre*
> 
> I can guarantee it's intentional to nerf our cards.


Multiple throttle points are not something new to Boost 3.0 or the 10 series.



And it technically isn't throttling, just boosting less. Everything above the clock you set is "extra" MHz provided by the boost feature, which uses multiple data points to calculate how much boost to give.


----------



## Yetyhunter

How do you keep your cards so cool?? My card easily goes over 75*C even at default settings, reaching 80*C. I tried every possible fan combination to change the airflow with no decent result. I thought I had solved the problem, but I'm nowhere near. Could my card be faulty? The backplate gets so hot I can't even keep my hand on it.


----------



## bigjdubb

Quote:


> Originally Posted by *Yetyhunter*
> 
> How do you keep your cards so cool?? My card easily goes over 75*C even at default settings, reaching 80*C. I tried every possible fan combination to change the airflow with no decent result. I thought I had solved the problem, but I'm nowhere near. Could my card be faulty? The backplate gets so hot I can't even keep my hand on it.


My case is open (core P5) and ambient is around 21c. The cooler on the MSI manages to keep the card between 50*-56*c with 100% fan speed in those conditions. I have had a few long (6+ hours) gaming sessions and it still managed to keep that 56*c max.


----------



## whicker

Quote:


> Originally Posted by *Yetyhunter*
> 
> How do you keep your cards so cool?? My card easily goes over 75*C even at default settings, reaching 80*C. I tried every possible fan combination to change the airflow with no decent result. I thought I had solved the problem, but I'm nowhere near. Could my card be faulty? The backplate gets so hot I can't even keep my hand on it.


I cant say if there is an issue with your card but I can share my own temps for comparison.

Last night, while doing Firestrike Extreme and Ultra runs (fans set at 60%, boost clocks between 1980-2030MHz, RAM at 9000MHz), my temps never went over 65C with an ambient temperature of 28C.


----------



## LiquidHaus

do what you can to get dead air out of the case, and as much fresh air in as possible. not just related to the graphics card. if I ever ran my side panel, i'd have 4 120mm fans blowing fresh air into the case, with 3 120mm fans blowing that-now dead air out of the case. but then again I never run my side panel.


----------



## Teufel9000

I got an MSI Gaming X and was able to achieve a stable +100MHz OC on the core and +700 on the memory (it can benchmark at +750 all the way through but starts artifacting towards the end),

with a 5% power limit increase.

The highest I've seen the temps was 69*C load during Heaven when I was messing with the power limit and some voltage.

After 2050MHz my core didn't like it.

Tried raising voltage up to 70% and tried getting 2100 stable but couldn't. Oh well, I can live with this. It's still a solid OC, and apparently fantastic memory lol.

Picture of my OC.



validation -> https://www.techpowerup.com/gpuz/details/kmse9


----------



## LiquidHaus

Quote:


> Originally Posted by *Teufel9000*
> 
> I got an MSI Gaming X and was able to achieve a stable +100MHz OC on the core and +700 on the memory (it can benchmark at +750 all the way through but starts artifacting towards the end),
> 
> with a 5% power limit increase.
> 
> The highest I've seen the temps was 69*C load during Heaven when I was messing with the power limit and some voltage.
> 
> After 2050MHz my core didn't like it.
> 
> Tried raising voltage up to 70% and tried getting 2100 stable but couldn't. Oh well, I can live with this. It's still a solid OC, and apparently fantastic memory lol.
> 
> Picture of my OC.


try this: take your mem to +200mhz instead of +700mhz and then take your core back to 2100mhz from the 2050mhz. see if you can get it stable with those parameters.


----------



## Teufel9000

Quote:


> Originally Posted by *lifeisshort117*
> 
> try this: take your mem to +200mhz instead of +700mhz and then take your core back to 2100mhz from the 2050mhz. see if you can get it stable with those parameters.


I always mess with core clocks first; I didn't touch the memory clock until I found where the core stopped being stable. I might get it stable if I keep the GPU at 100% power load all the time so it doesn't downclock.

The core just doesn't wanna budge lol. And I'm not sure if I wanna go to 100% voltage... but I mean I'm NOWHERE near bad temperatures, and that was when I was playing with the power limit too.


----------



## Yetyhunter

Quote:


> Originally Posted by *bigjdubb*
> 
> My case is open (core P5) and ambient is around 21c. The cooler on the MSI manages to keep the card between 50*-56*c with 100% fan speed in those conditions. I have had a few long (6+ hours) gaming sessions and it still managed to keep that 56*c max.


That should explain it then. I have 26*C ambient temps with an open case and fans running at 60% max, any higher it's unbearably loud.
Quote:


> Originally Posted by *whicker*
> 
> I cant say if there is an issue with your card but I can share my own temps for comparison.
> 
> Last night while doing firestrike extreme and ultra runs, fans set at 60%, boost clocks between 1980-2030mhz, ram 9000mhz my temps never went over 65c with an ambient temperature of 28c.


You must have very good airflow; there can't be such a huge difference between ASUS and Gigabyte. I just can't explain these very high temps. I don't think it can be PSU related, even if it's 7 years old now.


----------



## whicker

Quote:


> Originally Posted by *Yetyhunter*
> 
> That should explain it then. I have 26*C ambient temps with an open case and fans running at 60% max, any higher it's unbearably loud.
> You must have very good airflow; there can't be such a huge difference between ASUS and Gigabyte. I just can't explain these very high temps. I don't think it can be PSU related, even if it's 7 years old now.


The case is an Antec 900: 2x120mm intake, 120mm and 200mm out. Side panels are on. The front intakes are pretty much blowing "cold" air right onto the GPU since it barely fits in the case lol.


----------



## prey1337

Quote:


> Originally Posted by *Yetyhunter*
> 
> That should explain it then. I have 26*C ambient temps with an open case and fans running at 60% max, any higher it's unbearably loud.
> You must have very good airflow; there can't be such a huge difference between ASUS and Gigabyte. I just can't explain these very high temps. I don't think it can be PSU related, even if it's 7 years old now.


That's pretty high with that card I think.

It has more fans than my EVGA SC and I'm 30C at idle (40% fan speed) in a closed tower, and I don't even get to 60C (under 75% fan speed) while gaming, even less during benchmarking.


----------



## bigjdubb

Quote:


> Originally Posted by *Yetyhunter*
> 
> That should explain it then. I have 26*C ambient temps with an open case and fans running at 60% max, any higher it's unbearably loud.


Luckily the Twin Frozr VI cooler is pretty quiet even at 100%. I either have the speakers turned up or my headset on when I play games so I just let it fly at 100% to keep the boost up as much as possible.


----------



## Forceman

Quote:


> Originally Posted by *Teufel9000*
> 
> The core just doesn't wanna budge lol. And I'm not sure if I wanna go to 100% voltage... but I mean I'm NOWHERE near bad temperatures, and that was when I was playing with the power limit too.


I don't think the voltage slider is doing anything for you. At 70% (whatever the % is supposed to be) you are still only at 1.062V. I'd bet money that if you went back to 0% you'd still be at the same 1.062V.

It's the same for me - I get 1.062V max no matter what I change the slider to. Haven't tried the Gigabyte utility yet, but I think it is still based on Rivatuner so I doubt it is any different.


----------



## prey1337

Quote:


> Originally Posted by *bigjdubb*
> 
> Luckily the Twin Frozr VI cooler is pretty quiet even at 100%. I either have the speakers turned up or my headset on when I play games so I just let it fly at 100% to keep the boost up as much as possible.


The 12" sub I'm using drowns out every noise in a 50ft radius, I might consider running the fans closer to 100% to keep the operating temps even lower.


----------



## 348299

Hi guys, I am reading a lot about throttling on this card and I am wondering if I should get an open-air cooler or a blower in an Elite 130 case; if I get the open-cooler one I will get a closed-loop water cooler too.
I wonder if the temps on open coolers are that much lower compared to blower ones to justify buying a water cooler for my 3570K.


----------



## Dreamliner

Even cards with aftermarket coolers are throttling?

What the heck.

I just want an air 1070 that doesn't whine and won't throttle. What the heck do I buy?


----------



## Hunched

Quote:


> Originally Posted by *Dreamliner*
> 
> Even cards with aftermarket coolers are throttling?
> 
> What the heck.
> 
> I just want an air 1070 that doesn't whine and won't throttle. What the heck do I buy?


900 Series throttled too on all cards.
Nvidia values power and thermal efficiency over performance.
It's GPU Boost, aka GPU Limit/Throttle.

You need a custom BIOS. That's the only way to eliminate throttling.
Nvidia doesn't allow the disabling of power saving features such as GPU Boost by normal means.


----------



## Dreamliner

Quote:


> Originally Posted by *Hunched*
> 
> 900 Series throttled too on all cards.
> Nvidia values power and thermal efficiency over performance.
> It's GPU Boost, aka GPU Limit/Throttle.
> 
> You need a custom BIOS. That's the only way to eliminate throttling.
> Nvidia doesn't allow the disabling of power saving features such as GPU Boost by normal means.


I had my 970 G1 on an Afterburner preset of an immediate 80% fan for gaming. Think I was throttling too?


----------



## Hunched

Quote:


> Originally Posted by *Dreamliner*
> 
> I had my 970 G1 on an Afterburner preset of an immediate 80% fan for gaming. Think I was throttling too?


You were.
My 970 G1 throttled down clocks and voltage as early as 68c and was only fixed with a custom BIOS.
Mr-Dark has a very popular thread on here that specifies what they do, removing GPU Boost throttling is in the list.

GPU Boost throttles every card it's on, and can only be disabled with a custom BIOS edited to do so.


----------



## BulletSponge

What version of Afterburner are you all using? 4.2 has no power target











Of course I could just be an idiot and have overlooked it. I don't remember updating, going back to 4.1.


----------



## Hunched

Quote:


> Originally Posted by *BulletSponge*
> 
> What version of Afterburner are you all using? 4.2 has no power target
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Of course I could just be an idiot and have overlooked it. I don't remember updating, going back to 4.1.


Use the 4.3.0 Beta


----------



## Dude970

Quote:


> Originally Posted by *Hunched*
> 
> Use the 4.3.0 Beta


I'm using the Beta one too


----------



## modapcboy

Anyone using a 1070, particularly a Zotac AMP 1070, on a 1440p monitor? Gaming-wise, is it good?


----------



## BulletSponge

Quote:


> Originally Posted by *Hunched*
> 
> Use the 4.3.0 Beta


I'll look it up, in the meantime finally cracked 21k in FS but I think I've hit the wall for now.







Bring on the BIOS mods.


----------



## Dude970

Quote:


> Originally Posted by *BulletSponge*
> 
> I'll look it up, in the meantime finally cracked 21k in FS but I think I've hit the wall for now.
> 
> 
> 
> 
> 
> 
> 
> Bring on the BIOS mods.


Crank that i5 up to 4.8 to 5 and you will get 15K, great job on the graphics


----------



## pewpewlazer

Quote:


> Originally Posted by *leandronb*
> 
> Hi guys, I am reading a lot about throttling on this card and I am wondering if I should get an open-air cooler or a blower in an Elite 130 case; if I get the open-cooler one I will get a closed-loop water cooler too.
> I wonder if the temps on open coolers are that much lower compared to blower ones to justify buying a water cooler for my 3570K.


Where are you reading about these cards "throttling"? If anything we're seeing nearly all of them boost higher than the manufacturer spec boost clocks right out of the box.


----------



## LiquidHaus

Quote:


> Originally Posted by *Dreamliner*
> 
> Even cards with aftermarket coolers are throttling?
> 
> What the heck.
> 
> I just want an air 1070 that doesn't whine and won't throttle. What the heck do I buy?


Zotac Amp Extreme. mine doesn't whine, and it doesn't throttle.

Quote:


> Originally Posted by *modapcboy*
> 
> Anyone using a 1070, particularly a Zotac AMP 1070, on a 1440p monitor? Gaming-wise, is it good?


I have the Amp Extreme. I run at 3440x1440. Every game I play I have since maxed out the settings, and it caps my monitor at 60fps, since it only does 60Hz anyway. Couldn't be happier.


----------



## Dreamliner

Quote:


> Originally Posted by *lifeisshort117*
> 
> Zotac Amp Extreme. mine doesn't whine, and it doesn't throttle.


Whoa. That thing is intense.

I know nothing about Zotac, quality?

It seems to be one of the most expensive 1070's...is it inflated, will it decrease in the next month or two?

Any idea when games will be bundled with the 1070's?


----------



## LiquidHaus

so close to 21k gpu score...


----------



## LiquidHaus

Quote:


> Originally Posted by *Dreamliner*
> 
> Whoa. That thing is intense.
> 
> I know nothing about Zotac, quality?
> 
> It seems to be one of the most expensive 1070's...is it inflated, will it decrease in the next month or two?
> 
> Any idea when games will be bundled with the 1070's?


quality is top notch. 5 year warranty too.

im not sure if it'll decrease in the next month or two, but it definitely will within three. no idea about game bundles though.


----------



## Dreamliner

Quote:


> Originally Posted by *lifeisshort117*
> 
> quality is top notch. 5 year warranty too.
> 
> im not sure if it'll decrease in the next month or two, but it definitely will within three. no idea about game bundles though.


I sure hope so. After selling off the bundled games, I paid just barely over $300 for my 970 G1. Seems nuts to spend $475 now.


----------



## anthonyl

Count me in.. 2 x Gigabyte G1 GTX 1070's..


----------



## LiquidHaus

broke 21k! +800mhz memory did the trick.


----------



## Majentrix

Nice! Welcome to the 21k club


----------



## Dreamliner

The Zotac won't fit in my case, so it's between the G1 and the Strix. I'm leaning towards the Strix because I have an Asus board. I only have 2 DP monitors, but the loss of a 3rd DP on the Strix has me a smidge nervous about future options.

If the display supports it, will the 1070 output 4K 60Hz over HDMI?


----------



## pez

Quote:


> Originally Posted by *Yetyhunter*
> 
> How do you keep your cards so cool?? My card easily goes over 75*C even at default settings, reaching 80*C. I tried every possible fan combination to change the airflow with no decent result. I thought I had solved the problem, but I'm nowhere near. Could my card be faulty? The backplate gets so hot I can't even keep my hand on it.


I'm convinced there is either a weird stock fan profile for the 1070 G1 or that airflow is a big issue; I've seen 2 or 3 people on here now report 80C+ temps for theirs. The 1080 is a hotter-running card, and at 50% fan speed I'm not hitting above 72C max. If your ambient temps aren't something crazy, and your airflow is adequate, then you may have a bad card? I've seen most users report back that a better fan curve or sorting out their airflow was the solution.
Quote:


> Originally Posted by *pewpewlazer*
> 
> Where are you reading about these cards "throttling"? If anything we're seeing nearly all of them boost higher than the manufacturer spec boost clocks right out of the box.


Since GPU Boost came about, people have confused 'throttling' and 'variable clocks' with each other, and the two have become almost synonymous on the forums.


----------



## Blackfyre

*Apparently there are new drivers. I haven't tested them yet, so I don't know if they fix the high DPC latency issues:*

http://forums.guru3d.com/showthread.php?t=408708

*Edit:* make sure you read the comments too; if you want to install them you're going to need the modded .inf file.

I'm not free to test right now.


----------



## DrumAndBass

*Blackfyre,* it seems they are useful for mobile GPUs; also, Manuel G. said that a new driver will appear later this week. To be fair, I haven't experienced any problems besides DPC latency with 368.39 so far (Overwatch, Diablo 3, CS).


----------



## whicker

Did some more benchmarking last night. Managed to get to +175 (2067-2088 max boost) and +750 on the memory and got this score: http://www.3dmark.com/fs/9309399

I still think that with a BIOS mod or better knowledge of the freq/voltage curve I could do better. For some reason the card rarely goes over 1000mv, and anything over +175 on the core gives me artifacting. It's still a good OC though, so I can't really complain.

I was running the fans at 100%, and they are very audible over 60-70%, but the good news is that with an ambient of 29-30C (Canadian heat wave) the card never went over 55C during the runs. During the winter that would be around 45-50C max temps, which is insane. I have crazy temperature headroom with this card but can't do anything about it because of the stupid voltage limit.


----------



## Yetyhunter

Quote:


> Originally Posted by *pez*
> 
> I'm convinced there is either a weird stock fan profile for the 1070 G1 or that airflow is a big issue; I've seen 2 or 3 people on here now report 80C+ temps for theirs. The 1080 is a hotter-running card, and at 50% fan speed I'm not hitting above 72C max. If your ambient temps aren't something crazy, and your airflow is adequate, then you may have a bad card? I've seen most users report back that a better fan curve or sorting out their airflow was the solution.
> Since GPU Boost came about, people have confused 'throttling' and 'variable clocks' with each other, and the two have become almost synonymous on the forums.


My fan curve is oriented towards silence. The max is 60% and it climbs very slowly. Will increasing the fan speed make a difference? My ambient temps are also quite high, 26-27°C.
The curious thing is that the GTX 670 Windforce X3 I replaced never went over 65*C in exactly the same circumstances.
Does the 1070 use so much more power?


----------



## bigjdubb

Quote:


> Originally Posted by *Yetyhunter*
> 
> My fan curve is oriented towards silence. The max is 60% and it climbs very slowly. Will increasing the fan speed make a difference? My ambient temps are also quite high, 26-27°C.
> The curious thing is that the GTX 670 Windforce X3 I replaced never went over 65*C in exactly the same circumstances.
> Does the 1070 use so much more power?


I believe the 1070 has a lower TDP than the 670, if I remember my 670 correctly. It is possible that there is an issue with your particular card, maybe the machine that squirts the TIM ran out on your card.

Have you considered removing the heatsink to check things out?

Quote:


> Originally Posted by *pez*
> 
> Since GPU Boost came about, people have confused 'throttling' and 'variable clocks' with each other, and the two have become almost synonymous on the forums.


Very much this; a lot of people misunderstand how it works. Throttling would be a clock speed lower than the rated boost speed; on an FE card, that would be anything below 1683MHz.


----------



## 348299

Two pages back was a guy saying that the card throttles at 45c.


----------



## bigjdubb

Quote:


> Originally Posted by *leandronb*
> 
> Two pages back was a guy saying that the card throttles at 45c.


It's not throttling in the traditional meaning. GPU boost varies based on how much headroom there is. As your temps rise the headroom shrinks and so does the amount of boost.

It appears for Pascal this first "step" is 48*c, then 54*c... so on and so forth

It has become so common to use the word throttling but unfortunately it confuses new Nvidia owners. Those of us that have been around this block a few times understand the difference, but for someone coming from an AMD card "throttling" has a whole different meaning.
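The step behaviour can be sketched numerically. Note that everything below is reverse-engineered from observations in this thread (2063 dropping one ~13MHz bin at 48*c and another at 54*c), not from any official NVIDIA documentation:

```python
# Temperature step-downs of GPU Boost 3.0 as observed in this thread:
# one ~13 MHz clock bin lost at each of 48C, 54C (and reportedly 65C).
# Thresholds and bin size are posters' observations, not an NVIDIA
# spec; 1506 MHz is the advertised base clock of an FE GTX 1070.

BIN_MHZ = 13
TEMP_STEPS_C = (48, 54, 65)

def boosted_clock(max_boost_mhz, temp_c, base_mhz=1506):
    """Effective boost clock after temperature steps; the card never
    drops below its advertised base clock."""
    steps = sum(1 for t in TEMP_STEPS_C if temp_c >= t)
    return max(base_mhz, max_boost_mhz - steps * BIN_MHZ)

# boosted_clock(2063, 40) -> 2063
# boosted_clock(2063, 48) -> 2050
# boosted_clock(2063, 54) -> 2037
```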


----------



## 348299

Quote:


> Originally Posted by *bigjdubb*
> 
> It's not throttling in the traditional meaning. GPU boost varies based on how much headroom there is. As your temps rise the headroom shrinks and so does the amount of boost.
> 
> It appears for Pascal this first "step" is 48*c, then 54*c... so on and so forth
> 
> It has become so common to use the word throttling but unfortunately it confuses new Nvidia owners. Those of us that have been around this block a few times understand the difference, but for someone coming from an AMD card "throttling" has a whole different meaning.


So I should get at least the minimum rated speed with a 1070, right?
I have a 660 Ti now and it easily reaches 80C; I am worried that if I buy a 1070 and it has the same temps, I will have a 1070 that's throttled down all the time.


----------



## Blackfyre

*Okay the new drivers are out:*

http://www.nvidia.com/download/driverResults.aspx/105037/en-us


----------



## Blackfyre

Quote:


> Originally Posted by *leandronb*
> 
> So I should get at least the minimum rated speed with a 1070, right?
> I have a 660 Ti now and it easily reaches 80C; I am worried that if I buy a 1070 and it has the same temps, I will have a 1070 that's throttled down all the time.


No, you misunderstood that post. The throttling that happens is only to the BOOST speeds, not to the base speeds. The minimum you will get is the base speed advertised on the website of each manufacturer. That's regardless of temperature, I believe. Trust me, I went crazy and tested: I put the fan speed at 0% and got my card to around 90 degrees celsius. It won't throttle below the advertised base speeds; in fact it maintained speeds around 1900MHz (_which is boost_). But if you overclock past that, then it's different. It throttles at certain junctures. Which is still very stupid, because the card is more than capable of holding those speeds at decent temperatures. Which is why we were annoyed a few pages back.


----------



## LiquidHaus

Quote:


> Originally Posted by *bigjdubb*
> 
> It's not throttling in the traditional meaning. GPU boost varies based on how much headroom there is. As your temps rise the headroom shrinks and so does the amount of boost.
> 
> It appears for Pascal this first "step" is 48*c, then 54*c... so on and so forth
> 
> It has become so common to use the word throttling but unfortunately it confuses new Nvidia owners. Those of us that have been around this block a few times understand the difference, but for someone coming from an AMD card "throttling" has a whole different meaning.


When I was doing another overclock session last night, my card would start throttling the boost as soon as it hit 60C. At 45C it was hitting the frequency I had set it to, but as soon as 60 came up, it bumped it down a bit.


----------



## 348299

Quote:


> Originally Posted by *bigjdubb*
> 
> It's not throttling in the traditional meaning. GPU boost varies based on how much headroom there is. As your temps rise the headroom shrinks and so does the amount of boost.
> 
> It appears for Pascal this first "step" is 48*c, then 54*c... so on and so forth
> 
> It has become so common to use the word throttling but unfortunately it confuses new Nvidia owners. Those of us that have been around this block a few times understand the difference, but for someone coming from an AMD card "throttling" has a whole different meaning.


Quote:


> Originally Posted by *Blackfyre*
> 
> No, you misunderstood that post. The throttling that happens is only to the BOOST speeds, not to the base speeds. The minimum you will get is the base speed advertised on the website of each manufacturer. That's regardless of temperature, I believe. Trust me, I went crazy and tested: I put the fan speed at 0% and got my card to around 90 degrees celsius. It won't throttle below the advertised base speeds; in fact it maintained speeds around 1900MHz (_which is boost_). But if you overclock past that, then it's different. It throttles at certain junctures. Which is still very stupid, because the card is more than capable of holding those speeds at decent temperatures. Which is why we were annoyed a few pages back.


Thanks for the clarification.
I am less worried about this now.
The main problem for me is that I have an ITX case (Elite 130), and I am still deciding if I should stick with the Intel stock cooler and replace my blower 660 Ti with a blower 1070, or if I should watercool the CPU and get an open-air-cooled 1070. I don't know if an open-air card will make a big difference in case temperature if the CPU is watercooled, maybe to the motherboard and RAM. What do you guys think?


----------



## Joenc

So why would these companies make cards where the boost starts dropping around 50°C and up? Every game will make a card run hot,

so you need liquid cooling or a PC in a cold room...

It doesn't make sense to limit speed at such low temps...


----------



## LiquidHaus

Don't feel so bad, guys. The 1080s are in the same boat right now. I built these two systems at work this week, and one system can only get 2113 MHz and the other 2088 MHz, AND they're watercooled. Both identical systems with SLI 1080s.


----------



## Blackfyre

Quote:


> Originally Posted by *lifeisshort117*
> 
> don't feel so bad guys. the 1080s are in the same boat right now. i built these two systems at work this week, and one system can only get 2113mhz, and the other 2088mhz. AND they're watercooled. both identical systems with SLI 1080s.
> 
> 
> Spoiler: Warning: Spoiler!


Just wanted to say, those look beautiful. Nice builds. Wish my computer looked half as decent, haha.


----------



## bigjdubb

Quote:


> Originally Posted by *Joenc*
> 
> so why would these companies make cards that the boost starts getting lower around
> 50c and higher??? every game will make cards run hot,
> 
> *so you need liquid cooling or have pc in cold room..*
> 
> Doesn't make sense to limit speed at so low temps...


Using watercooling and keeping ambient temps as low as possible have been key ingredients of maximum overclocks for many years. The lower the temps, the better your chance of higher clock speeds; if this weren't true, LN2 and phase change would be a waste of time.


----------



## Dreamliner

The Zotac won't fit in my case. It's between the G1 and the Strix. I'm leaning towards the Strix because I have an Asus board. I only have 2 DP monitors, but the loss of a 3rd DP on the Strix has me a smidge nervous about future options.

If the display supports it, will the 1070 output 4K 60Hz over HDMI?


----------



## LiquidHaus

Quote:


> Originally Posted by *Blackfyre*
> 
> Just wanted to say, those look beautiful. Nice builds. Wish my computer looked half as decent, haha.


thanks man! I try to get these things looking as good as possible.

you should work on the aesthetics of yours soon!


----------



## pez

Quote:


> Originally Posted by *Yetyhunter*
> 
> My fan curve is oriented towards silence. The max is 60% and it climbs very slowly. Will increasing the fan speed make a difference? My ambient temps are also quite high, 26-27°C.
> The curious thing is that the GTX 670 Windforce X3 I replaced never went over 65°C in exactly the same circumstances.
> Does the 1070 use so much more power ?


My ambient temps are a constant 25C, so I'm not too far off. However, I will say if you're pushing the card in something like Firestrike constantly, 80C may be a normal temp. I'd say for gaming that seems high. Any way to measure your ambient case temps?
Quote:


> Originally Posted by *bigjdubb*
> 
> I believe the 1070 has a lower TDP than the 670, if I remember my 670 correctly. It is possible that there is an issue with your particular card, maybe the machine that squirts the TIM ran out on your card.
> 
> Have you considered removing the heatsink to check things out?


Very much this. A lot of people misunderstand how it works: throttling would be a clock speed lower than the rated boost speed, which on an FE card is anything below 1683 MHz.


I'd actually go as far as to say that as long as you're not dropping below your stock clocks, you're not being throttled; based on the variables that boost (of any kind) depends on, that fits. For you to fall to base clocks on any of the 10-series cards, though, you've gotta have some crazy variables or a bad card.
Quote:


> Originally Posted by *lifeisshort117*
> 
> don't feel so bad guys. the 1080s are in the same boat right now. i built these two systems at work this week, and one system can only get 2113mhz, and the other 2088mhz. AND they're watercooled. both identical systems with SLI 1080s.
> 
> 
> Spoiler: Warning: Spoiler!


My my. Where do you work? If you don't mind me asking. Always felt I could live a happy life building systems like that.


----------



## DrumAndBass

I too want to be in the 21k club... but that is the max memory clock without artifacts.


----------



## Anth0789

Been waiting two weeks for my card to come; finally it's here. I just have to go pick it up later today. Will post pics once it's here.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *Anth0789*
> 
> Been waiting two weeks for my card to come, finally its here just got to go pick it up later today, will post pics once its here.


Cool. Only 6 more days for my 2 Strix.


----------



## bigjdubb

Quote:


> Originally Posted by *DrumAndBass*
> 
> I too want to be in the 21k club... but that is the max memory clock without artifacts.


You're so close! Maybe bump your CPU and RAM as much as possible if your card is topped out. Turn the A/C down as low as possible, open up the case and shove a room fan into the opening... there's always something left to try!

Also, run the test plenty of times. My results will vary a few hundred points in back to back runs.


----------



## DrumAndBass

*bigjdubb* thanks for the advice; I'll play with RAM timings a bit, and maybe set the CPU to 4.6, but my last experience with it at 4.6 was at 1.37 V and I'm not a fan of such voltage...


----------



## LiquidHaus

Quote:


> Originally Posted by *pez*
> 
> *snip*
> My my. Where do you work? If you don't mind me asking. Always felt I could live a happy life building systems like that.


Yeah it's definitely nice to be doing this sorta stuff everyday lol. And I am the watercooling manager at Xidax PCs! I handle all the building and QC with our water cooled systems.

People get pretty salty about the pricing we have on our systems, and sometimes I can't blame them. However our lifetime warranty could be worth quite a bit of money to a lot of people who already know the hassle and headache of getting something returned or refunded because it didn't work correctly or just died of old age.

But anyway, this site coupled with the opportunity for hands on experience with the newer hardware is very nice to have.


----------



## bigjdubb

Quote:


> Originally Posted by *DrumAndBass*
> 
> *bigjdubb* thanks for advice, will play with RAM timings a bit, and maybe set CPU to 4,6 but my last experience with it at 4,6 was @ 1.37v and i'm not a fan of such voltage....


It's benchmarking, it only has to work for a couple of minutes. It just needs to boot and complete the test so that you can record the score and get a screenshot!


----------



## whicker

Quote:


> Originally Posted by *DrumAndBass*
> 
> I too want to be in 21k club
> 
> 
> 
> 
> 
> 
> 
> ( But that is the max memory clock without artifacts (


Man, I wish my core freq and voltage looked as stable as yours. During Firestrike runs the frequency bounces between 1967-2088 MHz and the voltage between 950 mV and 1093 mV. What does your voltage/frequency curve look like?


----------



## DrumAndBass

Quote:


> Originally Posted by *whicker*
> 
> Man I wish my core freq and voltage looked as stable as yours. During firestrike runs freq is bouncing between 1967-2088 and voltage 950mv-1093mv. What does your voltage/freq curve look like?


Nothing special... almost like a regular one.


----------



## LiquidHaus

Quote:


> Originally Posted by *whicker*
> 
> Man I wish my core freq and voltage looked as stable as yours. During firestrike runs freq is bouncing between 1967-2088 and voltage 950mv-1093mv. What does your voltage/freq curve look like?


Your boost should notttt be bouncing around that much, especially dropping all the way down to 1967 MHz. What are your temps? Mine would occasionally bounce, but only down to 2075 MHz.


----------



## whicker

Quote:


> Originally Posted by *lifeisshort117*
> 
> your boost should notttt be bouncing around that much, especially dropping all the way down to 1967mhz. what are your temps? mine would occasionally bounce, but only down to 2075mhz.


Temps are under 55°C, so something is definitely wrong then :/. I could try another DDU uninstall and the new driver from today. What else could it be? I'll take a screenshot like his when I get home from work.
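
If you want hard numbers on the bouncing rather than eyeballing the Afterburner graph, one option is to poll `nvidia-smi` once a second and log its CSV output. The query fields below are standard `nvidia-smi` ones; the parser is demonstrated on a canned sample line (not captured from a real run) so the snippet runs without a GPU.

```python
import subprocess

# Poll temperature, SM clock and power draw once per second.
CMD = ["nvidia-smi",
       "--query-gpu=temperature.gpu,clocks.sm,power.draw",
       "--format=csv,noheader,nounits", "-l", "1"]

def parse_row(line):
    """Turn one CSV line of nvidia-smi output into typed values."""
    temp, clock, power = (field.strip() for field in line.split(","))
    return int(temp), int(clock), float(power)

# What one polled line looks like (sample values, for illustration):
print(parse_row("54, 2050, 148.32"))

# To actually log on a machine with an NVIDIA GPU, uncomment:
# for line in subprocess.Popen(CMD, stdout=subprocess.PIPE, text=True).stdout:
#     print(parse_row(line))
```

A minute of that while looping Firestrike will show exactly how far and how often the clock drops.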


----------



## pez

Quote:


> Originally Posted by *lifeisshort117*
> 
> Yeah it's definitely nice to be doing this sorta stuff everyday lol. And I am the watercooling manager at Xidax PCs! I handle all the building and QC with our water cooled systems.
> 
> People get pretty salty about the pricing we have on our systems, and sometimes I can't blame them. However our lifetime warranty could be worth quite a bit of money to a lot of people who already know the hassle and headache of getting something returned or refunded because it didn't work correctly or just died of old age.
> 
> But anyway, this site coupled with the opportunity for hands on experience with the newer hardware is very nice to have.


I like the site and I like what you guys have going on. I think that's awesome. And you guys are a business and have to make money. I ain't gonna sweat you on that.


----------



## DrumAndBass

*whicker* DDU is never bad; also make sure the voltage limit and power limit are at max (+100 and 114/120 respectively).

Also, I'm using the 368.39 driver and not the last two, so that might affect voltage too... who knows.

This is in Heaven under 100% GPU usage: dead-flat core clock/voltage graphs.


----------



## LiquidHaus

Quote:


> Originally Posted by *whicker*
> 
> Temps are under 55c. Something is definitely wrong then :/. Could try another DDU uninstall and try the new driver from today. What else could it be? I'll take a screenshot like his when I get home from work.


what overclocking utility are you using? make sure you're using the 4.3.0 beta 4 of Afterburner... OR you could try the Zotac utility, Firestorm: head over to Zotac's site and download the 2.0006E version. I guess you could also try EVGA's Precision, but I've heard that utility is having problems with Pascal.

Quote:


> Originally Posted by *pez*
> 
> I like the site and I like what you guys have going on. I think that's awesome. And you guys are a business and have to make money. I ain't gonna sweat you on that.


Thanks man! I appreciate the kind words. We're trying our best over here lol


----------



## whicker

Quote:


> Originally Posted by *DrumAndBass*
> 
> *whicker* DDU is never bad, also make sure voltage limit and power limit are at max (+100 and 114/120 respectively).
> 
> Also i'm using 368.39 driver and not last two, so it might affect voltage too....who knows...


Yeah, I use DDU almost every time. My Afterburner looks the same as yours except I only have a 112% power limit and I'm not using the curve. The driver might be the culprit. I'll post a screenshot tonight and try the newer and older drivers to see how it is. Worst case, it might be the BIOS?


----------



## whicker

Quote:


> Originally Posted by *lifeisshort117*
> 
> what overclocking utility are you using? make sure you're using the 4.3.0 beta 4 on afterburner....OR you could try the Zotac utility; Firestorm. head over to Zotac's site and download the 2.0006E version. I guess you could also try EVGA's Precision, but I've heard that utility is having problems with Pascal.


Using the beta 4 MSIAF.


----------



## LiquidHaus

Quote:


> Originally Posted by *whicker*
> 
> Using the beta 4 MSIAF.


reinstall it, or try another utility. sometimes with the systems I build, AF doesn't even register frequencies correctly - and that's with the beta 4 as well. so I'd say it's worth a try.


----------



## whicker

Quote:


> Originally Posted by *lifeisshort117*
> 
> reinstall it, or try another utility. sometimes with the systems I build, AF doesn't even register frequencies correctly - and that's with the beta 4 as well. so I'd say it's worth a try.


I'll try that too. I will say that I don't think the 100% voltage slider was even working so you might be on to something.


----------



## DrumAndBass

Quote:


> Originally Posted by *whicker*
> 
> Yeah I use DDU almost every time, my afterburner looks the same as yours except I only have 112% power limit and not using curve. The driver might be the culprit. I'll post a screenshot tonight and try the newer and older drivers to see how it is. Worst case it might be Bios?


Go for curve mode. When I tried to OC this card I started with old-style slider changes, and the voltage behaved strangely; for example, I could not even get past 1.075 V with the sliders.
From my post at the guru3d forums: "I struggle to understand right now what makes the GPU determine which point on the curve to choose as the max voltage value, but all I get for now is that through the curve you can achieve 1.093, whereas with slider-style I capped at 1.075v." So go for curve mode.
I also would like to try Zotac Firestorm; it seems worth a shot.
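
To picture why curve mode can reach a higher point than the slider, here's a sketch, assuming the card simply runs the highest-frequency curve point whose voltage fits under the cap. The curve point values are made up for illustration, not measured from a real card.

```python
# Voltage/frequency curve as (voltage_V, clock_MHz) points, like the
# ones in Afterburner's curve editor. Values are illustrative only.
curve = [(0.800, 1700), (0.900, 1850), (1.000, 1975),
         (1.062, 2050), (1.093, 2101)]

def active_point(curve, vcap):
    """Highest-clock point allowed under the voltage cap (assumed rule)."""
    return max(((v, f) for v, f in curve if v <= vcap),
               key=lambda p: p[1])

print(active_point(curve, 1.075))  # slider-style cap stops at (1.062, 2050)
print(active_point(curve, 1.093))  # curve mode reaches (1.093, 2101)
```

Under that assumption, a cap of 1.075 V can never touch the 1.093 V point, which would explain why the slider tops out lower than the curve editor does.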


----------



## marik123

Finally! My GTX 1070 shipped out yesterday and will be here next Monday. :thumb:


----------



## DrumAndBass

Quote:


> Originally Posted by *marik123*
> 
> Finally my GTX1070 is shipped out yesterday, will be here on next Monday. :thumb:


Gratz mate! Don't forget to run those benches after you configure everything.


----------



## ProHitZ

My Gaming X keeps hitting the voltage limit... Both GPU-Z and Afterburner report that as the clock limiter. I gained 40 MHz just by maxing out the voltage slider, but it still wants more. I thought most Pascal cards didn't even scale that well with voltage; it might end up nicely with a custom BIOS later, since it runs really cool and far away from the power limit.

Not the best score, and I haven't really tuned the overclock yet. I can probably push out a bit more still; that was with a max clock of 2065 plus memory at 8800.


----------



## Blackfyre

*First DX12 Results from 3DMark:*

http://www.3dmark.com/3dm/13197090

*LINK to higher quality of IMAGE Below:*

http://i.imgur.com/NL5Bl22.png


----------



## whicker

At home now. Uninstalled msiab, ran ddu, installed 368.39, installed msiab beta4. Gonna run some benchmarks and follow up.

I still find it's bouncing around a lot, but I'm getting okay scores. It seems like the more I boost the clock, the lower the voltage it wants to use. At stock it was almost always at 1093 mV.


----------



## Hunched

Just got my MSI Gaming 1070 8G and I'm pretty damn happy. SO much better than the Gigabyte WindForce OC.
The fans are so silent, no grinding issues like with the Gigabyte. Thankfully I'm 2 for 2 with no coil whine!

I flashed the MSI Gaming 1070 *X* 8G BIOS to my card, which I had to obtain through a cached version of the page since it's been removed for the 1070 for some reason... the 1080 BIOS is still up on their site.
This made it boost 100 MHz higher, to 1999.5 MHz.

With a custom fan curve to hit 50% at 50c and 100% at 80c the card hit 57c/61%/1514rpm max in a run of Unigine Valley Extreme HD.
The Gigabyte fans were obnoxious around 60%, these fans you can barely hear.

Also, the extra 6-pin helps a lot.
With a single 8-pin the Gigabyte WindForce OC was smashing against its 111% TDP and giving pwr perfcaps. With 8+6 pin the MSI it hit 72.4% in Unigine Valley, and limits at 126% in MSI AB.

It's only VRel perfcaps present now which is for 100% of Pascal, and is possible to be fixed when BIOS modding becomes available.

Now to begin overclocking!
I'll also be flashing to the MSI Gaming 1070 Z 8G BIOS whenever that becomes available.
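
The custom fan curve above (50% at 50°C, 100% at 80°C) works out to a simple linear ramp between the two points. Assuming straight-line interpolation between curve points (an assumption for illustration, not necessarily exactly what Afterburner does), the reported 57°C lines up with roughly the 61% fan that was observed:

```python
def fan_percent(temp_c, points=((50, 50.0), (80, 100.0))):
    """Linear fan curve between two (temp_C, fan_%) points, clamped."""
    (t0, f0), (t1, f1) = points
    if temp_c <= t0:
        return f0
    if temp_c >= t1:
        return f1
    return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(round(fan_percent(57), 1))  # ~61.7%, close to the 61% observed above
```

Handy if you want to predict noise at a given load temperature before committing to a curve.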


----------



## Dude970

Just tried a run on Time Spy

http://www.3dmark.com/3dm/13207589?


----------



## Phixit

Quote:


> Originally Posted by *Hunched*
> 
> Just got my MSI Gaming 1070 8G and I'm pretty damn happy. SO much better than the Gigabyte WindForce OC.
> The fans are so silent, no grinding issues like with the Gigabyte. Thankfully I'm 2 for 2 with no coil whine!
> 
> I flashed the MSI Gaming 1070 *X* 8G BIOS to my card which I had to obtain through a cached version of the page since its been removed for the 1070 for some reason... 1080 BIOS is still up on their site.
> This made it boost 100mhz higher, to 1999.5mhz.
> 
> With a custom fan curve to hit 50% at 50c and 100% at 80c the card hit 57c/61%/1514rpm max in a run of Unigine Valley Extreme HD.
> The Gigabyte fans were obnoxious around 60%, these fans you can barely hear.
> 
> Also, the extra 6-pin helps a lot.
> With a single 8-pin the Gigabyte WindForce OC was smashing against its 111% TDP and giving pwr perfcaps. With 8+6 pin the MSI it hit 72.4% in Unigine Valley, and limits at 126% in MSI AB.
> 
> It's only VRel perfcaps present now which is for 100% of Pascal, and is possible to be fixed when BIOS modding becomes available.
> 
> Now to begin overclocking!
> I'll also be flashing to the MSI Gaming 1070 Z 8G BIOS whenever that becomes available.


Did you get your MSI on Newegg.ca at $620 CAD ?


----------



## Dreamliner

Quote:


> Originally Posted by *Hunched*
> 
> Just got my MSI Gaming 1070 8G and I'm pretty damn happy. SO much better than the Gigabyte WindForce OC.
> The fans are so silent, no grinding issues like with the Gigabyte. Thankfully I'm 2 for 2 with no coil whine!
> 
> I flashed the MSI Gaming 1070 *X* 8G BIOS to my card which I had to obtain through a cached version of the page since its been removed for the 1070 for some reason... 1080 BIOS is still up on their site.
> This made it boost 100mhz higher, to 1999.5mhz.
> 
> With a custom fan curve to hit 50% at 50c and 100% at 80c the card hit 57c/61%/1514rpm max in a run of Unigine Valley Extreme HD.
> The Gigabyte fans were obnoxious around 60%, these fans you can barely hear.
> 
> Also, the extra 6-pin helps a lot.
> With a single 8-pin the Gigabyte WindForce OC was smashing against its 111% TDP and giving pwr perfcaps. With 8+6 pin the MSI it hit 72.4% in Unigine Valley, and limits at 126% in MSI AB.
> 
> It's only VRel perfcaps present now which is for 100% of Pascal, and is possible to be fixed when BIOS modding becomes available.
> 
> Now to begin overclocking!
> I'll also be flashing to the MSI Gaming 1070 Z 8G BIOS whenever that becomes available.


Bad times with the Gigabyte?

I've been hearing bad things. I'm considering the Strix to go with my Asus board. I looked at the MSI but they made some pretty shaky X99 boards......


----------



## pez

Not a 1070, but I mentioned I'd post some quick SLI temps for someone in either the 1070 or 1080 thread. The top card is getting a bit toasty. This is after about an hour of Crysis 3, all settings completely maxed but AA at FXAA. I haven't tried the in-between setting, but 4K @ 27" isn't giving me too many jaggies.


----------



## LiquidHaus

Quote:


> Originally Posted by *Dreamliner*
> 
> Bad times with the Gigabyte?
> 
> I've been hearing bad things. I'm considering the Strix to go with my Asus board. I looked at the MSI but they made some pretty shaky X99 boards......


Well, I'll give you some insight involving Asus, MSI and Gigabyte; I deal with their boards and cards literally every day of my life. The MSI motherboards sure as heck are less finicky than Asus boards, and the Gigabyte boards are pure meh in my experience. I'd pick an X99A X-Power motherboard over a Rampage V any day.

Asus support is just terrible as well. MSI isn't too shabby, and Gigabyte isn't too bad at all. But that Zotac though..... a 5-year warranty is hard to beat. That being said, I did have my eye on the 1070 Strix for a while; it was either that or the Zotac for me. EVGA has amazing support, and most of the time their GPUs are great. I just couldn't shake their need to put their logo over EVERYTHING.


----------



## Mad Pistol

Look what the UPS man dropped off for me today? Something even rarer than the GTX 1070 and 1080 are at the moment...


----------



## Anth0789

Here she is:


----------



## Hunched

Quote:


> Originally Posted by *Phixit*
> 
> Did you get your MSI on Newegg.ca at $620 CAD ?


No, I asked if I could get it for $570, which is $440 USD, since that's what it costs on their US site and since it was $570 like a week ago.
I would have been paying $570 if I didn't have to RMA the Gigabyte garbage; prices have gone up on everything since, including what I originally bought.


----------



## Hunched

Quote:


> Originally Posted by *Dreamliner*
> 
> Bad times with the Gigabyte?
> 
> I've been hearing bad things. I'm considering the Strix to go with my Asus board. I looked at the MSI but they made some pretty shaky X99 boards......


The Gigabyte WindForce OC is pure garbage. The fans are pure garbage, the shroud is pure garbage, if it had a backplate it would be garbage.
I'd get something with more than a single 8-pin. Since the MSI has 8+6 I'm no longer throttling against a 111% TDP limit.


----------



## Dreamliner

Quote:


> Originally Posted by *lifeisshort117*
> 
> well i'll give you some insight involving Asus, MSI and Gigabyte. I deal with their boards and cards literally every day of my life. the MSI motherboards sure as heck are less finicky as Asus boards. and the Gigabyte boards are pure meh in my experience. I'd pick an X99A X-Power motherboard over a Rampage V any day.
> 
> Asus support is just terrible as well. MSI isn't too shabby, and Gigabyte isn't too bad at all. But that Zotac though.....5 year warranty. Hard to beat. that being said, I did have my eye on the 1070 Strix for a while. It was either that or the Zotac for me. EVGA has amazing support, and most of the time their GPUs are great. I just couldn't shake their need to put their logo over EVERYTHING.


The Zotac won't fit in my case, so I think I'm going Strix.

I had an MSI X99A board ordered for $150 but got cold feet and switched to the Asus Sabertooth X99 for $300.


----------



## bigjdubb

Time Spy

MSI GTX 1070 Gaming (non X)

Stock: http://www.3dmark.com/3dm/13210863?

Overclocked: http://www.3dmark.com/3dm/13211041?


----------



## BulletSponge

Quote:


> Originally Posted by *bigjdubb*
> 
> Time Spy
> 
> MSI GTX 1070 Gaming (non X)
> 
> Stock: http://www.3dmark.com/3dm/13210863?
> 
> Overclocked: http://www.3dmark.com/3dm/13211041?


Very nice indeed.


----------



## whicker

Quote:


> Originally Posted by *Anth0789*
> 
> Here she is:


Could you do me a huge favor and run some benchmarks with MSI AB running and an OC and show me your voltage and freq graph?


----------



## Mad Pistol

I Time Spy some SLI...

http://www.3dmark.com/3dm/13212917?


----------



## BulletSponge

Quote:


> Originally Posted by *Mad Pistol*
> 
> I Time Spy some SLI...
> 
> http://www.3dmark.com/3dm/13212917?


With my envious little eye......


----------



## Dreamliner

I'd love to see a 1070 roundup review.

Some say there are no custom PCBs, but if that's true, what is the second power plug doing on the MSI and Strix?


----------



## Majentrix

There are custom PCBs, but they only differ from reference boards in power delivery and, in the Strix's case, display output. It's for this reason that you can flash any 1070 BIOS to any card.


----------



## whicker

Quote:


> Originally Posted by *Dreamliner*
> 
> I'd love to see a 1070 roundup review.
> 
> Some say there are no custom PCB's but if that's true what is the second power plug doing on the MSI and Strix?


The Strix only has the one 8-pin plug.


----------



## nacherc

MSI GTX 1070 GAMING X

CORE +110
MEM +900
POWER USAGE 85%
FAN SPEED 60%
TEMP 60c


----------



## mypickaxe

Quote:


> Originally Posted by *Mad Pistol*
> 
> I Time Spy some SLI...
> 
> http://www.3dmark.com/3dm/13212917?


Nice, I will test my 1070 SLI build later. For now, here's a run on my HTPC:

http://www.3dmark.com/3dm/13191282


----------



## Dreamliner

Quote:


> Originally Posted by *Majentrix*
> 
> There are custom PCBs but they only differ from reference boards in power delivery and in the Strix' case display output. It's for this reason you can flash any 1070 BIOS to any card.


Doesn't that also mean that all 1070s are pretty much identical?

I thought about getting the NZXT water cooler, but I don't see a way to mount a backplate or cool the memory chips, so I thought I'd be better off with the Strix.


----------



## KaRtA82

Picking up a 1070 tomorrow; just wondering how modded BIOSes are going with the Pascal cards.

All I can find is an XOC one for the Strix, but I'll be on a reference 1070.


----------



## ondoy

Quote:


> Originally Posted by *Mad Pistol*
> 
> I Time Spy some SLI...
> 
> http://www.3dmark.com/3dm/13212917?


Here's a 980 Ti SLI Time Spy with a 2.3 GHz CPU...


----------



## bigjdubb

Have you run it without SLI? The tests I have seen with the 10 series aren't showing much async benefit in SLI.


----------



## Majentrix

Quote:


> Originally Posted by *Dreamliner*
> 
> Doesn't that also mean that all 1070's are pretty much identical?


Yes. The only difference is power delivery, PCB and cooler. Some have superficial features like fan connectors and BIOS switches but they're all essentially the same under the hood.


----------



## nacherc




----------



## Blackfyre

*My highest score so far:*



*LINK:*

http://www.3dmark.com/spy/23949


----------



## LiquidHaus

I feel like reminding those interested in the XOC Strix BIOS that the card that BIOS came from was a 1080, not a 1070. Pretty sure flashing a BIOS that supports GDDR5X memory wouldn't be good for normal GDDR5.


----------



## bigjdubb

Is anyone able to crack open the BIOS and modify it yet, or are people just flashing other cards' BIOSes onto their cards?


----------



## Majentrix

So close to 6000!


----------



## ogow89

I have the MSI GTX 1070, and I've been trying to overclock it for hours now with no luck. I am using Afterburner 4.3 beta 4. I can hit +700 MHz on the memory, but the core won't go beyond +50 MHz. The power limit is set to the max of 126% and I also tried core voltage at 100%. Anything above +50 MHz just locks up and crashes. I also tried overclocking the core without the memory, and it still won't budge.

Any solutions? I can't be the only one with the ****tiest chip of all.

Oh, and if I set core voltage to 100%, PL to 126% and the core clock to +50 MHz, the GPU also crashes. Is there something I should configure before overclocking this card?

I already used the latest DDU and checked the PCI-E connectors.

PSU: Corsair RM750x Gold edition. It powered an R9 290 PCS+ OC, so a 1070 shouldn't be a problem for it.


----------



## Blackfyre

Quote:


> Originally Posted by *ogow89*
> 
> I have the Msi gtx 1070, and i've been trying to overclock it for hours now, and just no luck. I am using Afterburner 4.3 beta 4. I can hit 700+mhz on the memory but the core won't go beyond 50+mhz. Powerlimit is set to max 126% and i also tried the corevoltage at 100%. Anything above 50mhz just locks up and crashes. I tried also overclocking the core withouth the memory, and yet won't budge.
> 
> Any solutions? I can't be the only one with the ****tiest chip of all.
> 
> Oh and if i set the corevoltage to 100% and PL to 126% and core clock to +50mhz, the gpu also crashes. Is there something i should configure before overclocking this card?
> 
> I already used the latest DDU and checked the pci-e connectors.
> 
> PSU: Cosair RM750X gold edition. It powered an r9 290 pcs+ oc, so a 1070 shouldn't be a problem for it.


*100% it's not your Power Supply.*

Really sounds like you've got a bad chip. Just horrible luck really.


----------



## ogow89

Quote:


> Originally Posted by *Blackfyre*
> 
> *100% it's not your Power Supply.*
> 
> Really sounds like you've got a bad chip. Just horrible luck really.


The PSU couldn't be the issue, but I had to mention it so people wouldn't give me the usual suggestions: check the PSU, uninstall/reinstall drivers, use DDU, and so on.

But it really can't be possible that I would be the only guy, out of everyone on every forum overclocking his/her GPU and able to stay beyond 2 GHz, to have a chip so awful it can't go beyond 1936 MHz.

Either MSI Afterburner, GPU-Z and every other tool out there are bugged and reporting false frequencies, or MSI themselves handpicked this chip for me. This is some Illuminati BS.


----------



## DrumAndBass

Quote:


> Originally Posted by *ogow89*
> 
> The PSU couldn't be the issue, but i had to mention, so people wouldn't give me the usual stuff, like check psu, uninstall/install drivers, use ddu, and so on.
> 
> But it really can't be possible that i would be the only guy, out of everyone that is on every forum overclocking his/her gpu and able to stay beyond 2ghz, to have an awful chip that can't go beyond 1936mhz.
> 
> Either MSI Afterburner/gpu-z and every other tool out there are bugged, and are reporting false frequencies, or Msi themselves handpicked this chip for me. This is some Illuminati BS.


What are your exact voltage numbers at 100% GPU usage? For a 2k+ core clock it should be somewhere near 1.060-1.070 V for stability.
If you use Afterburner 4.3 beta 4, try using curve mode.

Also, I find it funny that now that Time Spy is released, we can't actually compare DX11 to DX12 performance in one bench. I am more interested in a pure performance comparison between the two APIs than in separate record numbers for each.

Does Time Spy support DX11? If yes, could someone please run two benches back-to-back in both APIs and post the results?

*nacherc* what were your clocks during the bench in post 943? It seems you can squeeze a little more out of your card.


----------



## joloxx9

Do we have any 1070 Gaming Z owners here? If yes, could you be so kind as to share your BIOS? I tried to get one from MSI support, but they want me to give them a S/N, which obviously I don't have.


----------



## ogow89

Quote:


> Originally Posted by *DrumAndBass*
> 
> What's your exact voltage numbers at the time of 100% pgu usage? For 2k+ core clock it should be somehwere near 1.060-1.070v for stability.
> If you use afterburner 4.3 beta 4 try using curve mode


I've tried both curve mode and offset. It goes to 1.081 V at 2025 MHz and then crashes.


----------



## DrumAndBass

Quote:


> Originally Posted by *ogow89*
> 
> I've tried both curve mode and offset. It goes to 1.081v at 2025mhz before it then crashes.


Definitely something not right with your card, considering it's an MSI and *Blackfyre* with his MSI gets so much higher. It's either a bad silicon lottery for you, mate, or some BIOS problem. If it is the latter, then it can be fixed in an update later.

You should decide for yourself if a 5-15 average FPS difference is worth the struggle of changing your card. And it's not a warranty case either, so...

Maybe you can return the card to the retail shop and take one from another vendor like Zotac or Palit? I have done it in the past; when the card is in good shape, some retail stores allow you to exchange it.


----------



## ogow89

Quote:


> Originally Posted by *DrumAndBass*
> 
> Definately something not right with your card, considering it's msi and *Blackfyre* with his msi gets so much higher. It's either a bad silicon lottery for you mate, or some bios problem. If it is the latter than it can be updated later.
> 
> You should decide for yourself if 5-15 average fps difference is worth struggles with changing your card. And it's not a warranty case also, so...
> 
> Maybe you can return card back to retail shop and take from another vendor like Zotac or Palit? I have done it in the past, when card is in good shape, some retail stores allow you to change.


I have 14 days to return the card for either a replacement or money back without giving a reason. 5-15 more FPS would be really nice, since I want to go over 1440p with DSR. I might send it back, but the limited supply really wouldn't help my case.


----------



## ogow89

GPU-Z pic; the BIOS looks different than the one that was shown some time ago.


----------



## DrumAndBass

Quote:


> Originally Posted by *ogow89*
> 
> I have 14 days to return the card for either replacement or money back without giving a reason.


I would go for it. You might consider a Zotac for the new card, simply because on different forums their results seem to be the highest overall, on core clock as well as memory speeds.
My Palit does a freaking good job as well, so I can recommend the Super Jetstreams too. 2101 on 1.093v, stable as f***. Temps 58-61 degrees with fans at 50% @ 1250rpm. Also dead silent, even at 1500rpm. Power limit is 114% though. I hope it's a temporary thing....
So, you decide..... I would go for a replacement, if you have some card you could use in the meantime.
Quote:


> Originally Posted by *ogow89*
> 
> Gpu-z pic, bios looks different than the one that was shown sometime ago.


Ask *Blackfyre* to screenshot his BIOS in GPU-Z. I wonder if they are identical.


----------



## ogow89

Quote:


> Originally Posted by *DrumAndBass*
> 
> I would go for it. You might consider a Zotac for the new card, simply because on different forums their results seem to be the highest overall, on core clock as well as memory speeds.
> My Palit does a freaking good job as well, so I can recommend the Super Jetstreams too. 2101 on 1.093v, stable as f***. Temps 58-61 degrees with fans at 50% @ 1250rpm. Also dead silent, even at 1500rpm. Power limit is 114% though. I hope it's a temporary thing....
> So, you decide..... I would go for a replacement, if you have some card you could use in the meantime.


I will go for it if I don't find a resolution to the issue within the next few days. Quick question though: the TDP/power consumption reading in GPU-Z, how high should it go when overclocking? Mine shows 73% TDP; is that related to the power limit somehow? Maybe the power limit slider isn't working correctly for me. I would try the EVGA tool, I just hate going through the hassle of making an account at their site. So if someone has a link to download it directly, that would be great.

Oh, and I can do +50MHz on the core and +600MHz on the memory, 100% stable in everything, without touching either the power limit or voltage. That gives me a graphics score of 20121 in Firestrike.


----------



## Blackfyre

Quote:


> Originally Posted by *DrumAndBass*
> 
> Ask *Blackfyre* to screenshot his BIOS in GPU-Z. I wonder if they are identical.


----------



## ogow89

Quote:


> Originally Posted by *Blackfyre*


yup same

I would overclock the memory a tad more and close in on that 300GB/s
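For anyone wondering where the "300GB/s" figure comes from, it is just the effective memory transfer rate times the bus width. A minimal sketch, assuming the 1070's stock 8008MHz effective GDDR5 on a 256-bit bus (the function and its names are mine, for illustration only):

```python
# Memory bandwidth = effective transfer rate (MT/s) * bus width in bytes.
# The GTX 1070 runs 8008 MHz effective GDDR5 on a 256-bit (32-byte) bus.

def bandwidth_gbs(effective_mhz: float, bus_bits: int = 256) -> float:
    """Bandwidth in GB/s from effective memory clock and bus width."""
    return effective_mhz * 1e6 * (bus_bits // 8) / 1e9

print(bandwidth_gbs(8008))   # stock: ~256 GB/s
print(bandwidth_gbs(9400))   # ~301 GB/s -- past the 300 GB/s mark
```

So a memory overclock to roughly 9.4GHz effective is what it takes to cross 300GB/s.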


----------



## ogow89

@ *Blackfyre*

Could you please show the power consumption/TDP that GPU-Z is showing at max when your card is overclocked?


----------



## DrumAndBass

Quote:


> Originally Posted by *ogow89*
> 
> Quick question though: the TDP/power consumption reading in GPU-Z, how high should it go when overclocking? Mine shows 73% TDP; is that related to the power limit somehow?


They should be directly related: the lower the power limit you allow, the lower the reported TDP will be. I say 'should' because this is GPU Boost 3.0, which is still a poorly understood wild beast.

btw I like how we continue this conversation on both forums simultaneously ^)


----------



## ogow89

Quote:


> Originally Posted by *DrumAndBass*
> 
> They should be directly related: the lower the power limit you allow, the lower the reported TDP will be. I say 'should' because this is GPU Boost 3.0, which is still a poorly understood wild beast.


Well in that case, it would be helpful to see the numbers it gives when the GPU is overclocked. Mine doesn't go to 126%, in case they really are directly related. It doesn't even come close; the max I saw was maybe 80%.
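A quick sketch of how that GPU-Z reading relates to the slider, with made-up numbers (the 150W base TDP and the function names here are assumptions for illustration, not the card's actual BIOS values):

```python
# GPU-Z's "TDP %" is measured board power as a percentage of the BIOS base
# TDP. The power-limit slider only raises the cap; it does not change the
# reading. BASE_TDP_W below is a hypothetical figure for the example.

BASE_TDP_W = 150.0  # assumed board power the BIOS treats as 100% TDP

def tdp_percent(measured_power_w: float) -> float:
    """GPU-Z style reading: measured board power as a % of base TDP."""
    return measured_power_w / BASE_TDP_W * 100.0

def power_limited(measured_power_w: float, slider_percent: float) -> bool:
    """The card only throttles when it actually reaches the raised cap."""
    cap_w = BASE_TDP_W * slider_percent / 100.0
    return measured_power_w >= cap_w

# A card drawing 110 W with the slider at 126% reads ~73% TDP and is
# nowhere near its cap -- so seeing 73-80% doesn't mean the slider is broken.
print(tdp_percent(110))         # ~73.3
print(power_limited(110, 126))  # False
print(power_limited(192, 126))  # True (189 W cap reached)
```

In other words, a reading well below the slider value just means the workload isn't drawing enough power to hit the cap.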


----------



## Blackfyre

Quote:


> Originally Posted by *ogow89*
> 
> @ *Blackfyre*
> 
> Could you please show the power consumption/TDP that GPU-Z is showing at max when your card is overclocked?


You'll have to wait a few hours until I get back home, sorry mate


----------



## luan87us

So I've been getting some PMs asking for my ASUS Strix 1070 BIOS. What's so special about that BIOS? Or is everyone just testing out every BIOS out there? I'm not sure about the legality of this, so I actually haven't sent it to anyone yet. I also don't know what other info is contained in that BIOS.


----------



## ogow89

Quote:


> Originally Posted by *luan87us*
> 
> So I've been getting some PMs asking for my ASUS Strix 1070 BIOS. What's so special about that BIOS? Or is everyone just testing out every BIOS out there? I'm not sure about the legality of this, so I actually haven't sent it to anyone yet. I also don't know what other info is contained in that BIOS.


Credit card, bank info, social security number, and every password you typed on your computer before and after installing the card, alongside your family pictures. It is also very illegal; you could go to jail for improper use and unauthorized distribution of the manufacturer's software.


----------



## Mr-Dark

Quote:


> Originally Posted by *ogow89*
> 
> Credit card, bank info, social security number, and every password you typed on your computer before and after installing the card, alongside your family pictures. It is also very illegal; you could go to jail for improper use and unauthorized distribution of the manufacturer's software.


Lol


----------



## whicker

Quote:


> Originally Posted by *luan87us*
> 
> So I've been getting some PMs asking for my ASUS Strix 1070 BIOS. What's so special about that BIOS? Or is everyone just testing out every BIOS out there? I'm not sure about the legality of this, so I actually haven't sent it to anyone yet. I also don't know what other info is contained in that BIOS.


Nothing illegal haha. Just having issues with my non-OC card keeping a steady clock/voltage. If you'd rather do a 3DMark or Unigine run with your power target maxed out and show me your frequency and voltage graph, that would give me the info I need.

Sorry for bothering you btw. Didn't mean to spook you


----------



## Tasm

Not that bad:



But...Zotac Extreme is coming


----------



## Bee Dee 3 Dee

Update/confirm of the originally promised earliest delivery date, finally!

"Arriving Wednesday" (2016-07-*20*):

*2 of ASUS GeForce GTX 1070* 8GB ROG STRIX OC Edition Graphic Card STRIX-GTX1070-O8G-GAMING
(Amazon.com)
Originally they said as early as 2016-07-*20*, no later than the 26th.

Just finished installing and testing the current (*GTX760-4G-SLI*) setup yesterday with 3DMark...
Fire Strike 1.1
https://www.3dmark.com/fs/9319709

Fire Strike Extreme 1.1
https://www.3dmark.com/fs/9321130

Steam Summer Sale games I bought for GTX1070-SLI:

-The Witcher 3 Wild Hunt - 2016-07-03 - 24.99 (-50% Off $49.99 = 24.99)
-Grand Theft Auto V - 2016-07-03 - 35.99 (-40% Off 59.99 = 35.99)
-Far Cry 4 Gold - 2016-07-03 - 34.99 (-50% Off $69.99 = 34.99)
-DOOM - 2016-07-03 - 35.99 (-40% Off $59.99 = 35.99)
-Act of Aggression - Reboot Edition - 2016-07-03 - 17.99 (-60% Off 44.99 = 17.99)

Tried FO4 and GTAV (Winter Steam Sale) and they stunk on GTX760-SLI, so I returned them last December.
Never considered playing/trying new games after FO4 and GTAV on 760-SLI. So I shopped around last spring for GTX970-SLI and BANG! the GTX10xx was announced... just before that I had almost pulled the trigger on GTX970-SLI a couple of times. Soo glad I did not!

Last November, for only $579, I got an ASUS ROG SWIFT PG278Q 27-inch 120Hz G-Sync gaming monitor. But G-Sync on GTX760-SLI only made the most resource-intensive games made before 2015 awesome and fun.

Can't wait to play all the newest games maxed on 1070-SLI, with G-Sync too, @1440p!

Don't ask for copies of me 1070 Strix BIOS.. I hear there's no PCs in jail.


----------



## Frutek

Quote:


> Originally Posted by *Dreamliner*
> 
> Doesn't that also mean that all 1070's are pretty much identical?
> 
> I thought about getting the NZXT water cooler but I don't see a way to mount a backplate or cool the memory chips so I thought I'd be better off with the Strix.


The MSI Gaming X has both things you need

the memory chips and VRM are already covered with a plate.


----------



## flamin9_t00l

Hey all, long time lurker in this thread and thought I would share my 3DM DX12 result.

MSI Gaming X 1070 @ frequency curve OC (maxing @ 2088MHz core), +700 memory, on the OC BIOS (the highest I can reach without a lockup).



Not too shabby... I really like this card, it's super quiet (matches my build perfectly as well)









Have to say I'm not a fan of the GPU Boost 3.0 downclocking (come on NVIDIA, we all like a nice stable core freq), even more so when we know the card can do it.

Even if the freq is lower, I would take a solid figure over clock dancing any day.

I also have an EVGA Founders in my ITX rig, and it has a worse case of clock dancing than the Gaming X, probably something to do with the lower power limit.

Having said this, they are still the best GPUs I've ever owned, and the performance @ my native 1440p is just perfect.


----------



## whicker

Quote:


> Originally Posted by *flamin9_t00l*
> 
> Hey all, long time lurker in this thread and thought I would share my 3DM DX12 result.
> 
> MSI Gaming X 1070 @ frequency curve OC (maxing @ 2088c) +700 memory on OC Bios (highest I can reach without lockup).
> 
> Have to say I'm not a fan of the GPU boost 3.0 downclocking (come on nvidia we all like a nice stable core freq), even more so when we know the card can do it.
> 
> Even if the freq is less I would take a solid figure over clock dancing any day.
> 
> I also have an EVGA Founders in my ITX rig and it has a worse case of clock dancing than the gaming X probably something to do with less power limit.
> 
> Having said this they are still the best GPU's I've ever owned and the performance @ my native 1440p is just perfect.


What do you mean by clock dancing? My strix seems to jump from 1934-2066mhz during benchmarks. While others on here have very stable clocks only dropping by one step when power limited.


----------



## LiquidHaus

It's so funny that we all acknowledge how awesome our 1070s are, but complain about how they aren't doing what we want while we push them to their (as of right now) highest limits lol


----------



## Blackfyre

Quote:


> Originally Posted by *lifeisshort117*
> 
> It's so funny that we all acknowledge how awesome our 1070s are, but complain about how they aren't doing what we want while we push them to their (as of right now) highest limits lol


But they have more POTENTIAL!

Lol


----------



## LiquidHaus

Quote:


> Originally Posted by *Blackfyre*
> 
> But they have more POTENTIAL!
> 
> Lol


I couldn't agree with you more!

The hardware is hard-locked at 1.25v. Okay, fine. Anything between 1.088-1.093v and 1.25v should theoretically be possible. Someone please mod the BIOS. Please.

So many assumptions have been made that this architecture doesn't do well with higher voltages. Why is that? The cards being tested when those assumptions were made were the Founders Editions. The cards we're rocking have much cleaner power delivery. I absolutely refuse to believe that'll be the case with our AIB cards.

It's the same idea as overclocking a CPU on a lower-end motherboard versus a motherboard meant for overclocking. Of COURSE you'll get a nicer overclock with the motherboard that is meant for overclocking.

EDIT: we need this exact thread started, but for the 10xx series; http://www.overclock.net/t/1573308/nvidia-gtx-900-cards-custom-bios-upon-request


----------



## Cakewalk_S

Ah man! I'd love to get into this club by picking up a new GTX 1070, but I believe the first components that need upgrading are my CPU and motherboard... This 2500K is still kickin', but sort of lackluster for future gaming... My upgrades will definitely be a delidded 6700K, an ITX board, an M.2 SSD, and some darn fast DDR4... then I can get my GPU sorted out...


----------



## flamin9_t00l

Quote:


> Originally Posted by *whicker*
> 
> What do you mean by clock dancing?


Non-static, but yeah, they're epic cards, and I believe they would produce better numbers if it wasn't for that step-down.

We need custom BIOSes, I guess... hope the BIOS masters can work their magic sometime lol.

My MSI Gaming X manages to stay over 2GHz but still bounces around; no matter what I do, the EVGA FE always drops below 2000 eventually.

I'm not really complaining, just venting my views lol.


----------



## flamin9_t00l

Quote:


> Originally Posted by *Cakewalk_S*
> 
> Ah man! I'd love to get into this club by picking up a new GTX 1070, but I believe the first components that need upgrading are my CPU and motherboard... This 2500K is still kickin', but sort of lackluster for future gaming... My upgrades will definitely be a delidded 6700K, an ITX board, an M.2 SSD, and some darn fast DDR4... then I can get my GPU sorted out...


That would be a solid upgrade, although your 2500K could still rock with a 1070 for a bit yet. What GPU are you running now?

Never mind, just checked your sig. Strix 970


----------



## killeraxemannic

3DMark Time Spy with my GTX 1070 G1 in Gaming Mode via XGE, with no CPU OC

http://www.3dmark.com/spy/40942


----------



## criminal

Mine:

http://www.3dmark.com/spy/20492


----------



## Blackfyre

Quote:


> Originally Posted by *ogow89*
> 
> @ *Blackfyre*
> 
> Could you please show the powerconsumption/TDP that gpu z is showing at max, when your card is overclocked?


*LINK to higher quality version of image below:*

http://i.imgur.com/phmKP8F.jpg


----------



## nesham




----------



## ikjadoon

Quote:


> Originally Posted by *Frutek*
> 
> MSI Gaming X has both that you need
> 
> 
> 
> memory chips and VRM are covered with plate already.


That's what I'm eyeing.

If anyone puts an AIO on an MSI card, let us know how it goes. Can you fit the AIO while leaving the VRAM plate on? I'm thinking about dropping a 120mm rad onto one of these GTX 1070s.
Quote:


> Originally Posted by *lifeisshort117*
> 
> It's the same idea with overclocking a proc on a lower end motherboard and a motherboard meant for overclocking. of COURSE you'll get a nicer overclock with the motherboard that is meant for overclocking.


Err, hold on, haha. Any Z170/Z97/Z87 motherboard will overclock a CPU to just about the same level. This has been tested time and time again; Silicon Lottery (which has binned hundreds of CPUs, maybe thousands by now) has also reaffirmed this: the motherboard will not change your overclock in the slightest... unless you're going LN2.

GPUs, though, I think are a totally different story. Someone floated an interesting point: maybe one reason all these cards max out at 2050MHz more often than 2100MHz is that EVGA, MSI, et al. are binning those cards (and thus not selling them now) for their Lightning, Classified, HOF LE, etc. cards.


----------



## LiquidHaus

Quote:


> Originally Posted by *ikjadoon*
> 
> Err, hold on, haha. Any Z170/Z97/Z87 motherboard will overclock a CPU to just about the same level. This has been tested time and time again; Silicon Lottery (which has binned hundreds of CPUs, maybe thousands by now) has also reaffirmed this: the motherboard will not change your overclock in the slightest... unless you're going LN2.
> 
> GPUs, though, I think are a totally different story. Someone floated an interesting point: maybe one reason all these cards max out at 2050MHz more often than 2100MHz is that EVGA, MSI, et al. are binning those cards (and thus not selling them now) for their Lightning, Classified, HOF LE, etc. cards.


I meant Z170 compared to an H170 or Q170 - and when you do that comparison.... overclocks will definitely change.

Also, I do not believe those companies will dump resources into going that route, simply because they are waiting for the 1080 Ti to be released. Ever heard of an MSI 980 (non-Ti) Lightning? Me neither. That's because the Tis will be the big, big cards.

And I'm pretty sure binning 1070s isn't comparable to binning 1080s for the Tis. I don't think that's possible when the Ti is such a jump from a normal card. Nvidia will have to manufacture them before they could even be binned.

At which point - in my case, the Amp Extreme is Zotac's highest-end revision card. I fully expected them to release just the AMP version, but they went ahead and released the Amp Extreme. Which is exactly why I went for it. It was the only card that went toe to toe with the Kingpin back when the 980 Tis were the newest things out.


----------



## Frutek

Quote:


> Originally Posted by *ikjadoon*
> 
> That's what I'm eyeing.
> 
> If anyone puts an AIO on an MSI card, let us know how it goes. Can you fit the AIO while leaving the VRAM plate on? I'm thinking about dropping a 120mm rad onto one of these GTX 1070s.
> Err, hold on, haha. Any Z170/Z97/Z87 motherboard will overclock a CPU to just about the same level. This has been tested time and time again; Silicon Lottery (which bins hundreds of CPUs, maybe thousands by now) has also reaffirmed this: the motherboard will not change your overclock in the slightest....unless you were going on LN2.
> 
> GPUs, though, I think are a totally different story. Someone floated an interesting point: maybe one reason all these cards are maxing out at 2050MHz more often than 2100MHz: EVGA, MSI, et al. are binning those cards (and thus not selling them now) for their Lightning, Classified, HOF LE, etc. cards.


Some people already have done that - http://www.overclock.net/t/1487012/official-nzxt-kraken-g10-owners-club/5490_30#post_25277269


----------



## ikjadoon

Quote:


> Originally Posted by *lifeisshort117*
> 
> I meant Z170 compared to an H170 or Q170 - and when you do that comparison.... overclocks will definitely change.
> 
> Also, I do not believe those companies will dump resources into going that route, simply because they are waiting for the 1080 Ti to be released. Ever heard of an MSI 980 (non-Ti) Lightning? Me neither. That's because the Tis will be the big, big cards.
> 
> And I'm pretty sure binning 1070s isn't comparable to binning 1080s for the Tis. I don't think that's possible when the Ti is such a jump from a normal card. Nvidia will have to manufacture them before they could even be binned.
> 
> At which point - in my case, the Amp Extreme is Zotac's highest-end revision card. I fully expected them to release just the AMP version, but they went ahead and released the Amp Extreme. Which is exactly why I went for it. It was the only card that went toe to toe with the Kingpin back when the 980 Tis were the newest things out.


Ohhh, haha. I understand. Yes. In my mind, I didn't even think about H170 or Q170 as an option.

Err, no. Not the Ti. But, within the GTX 1070 lineup: the STRIX, your Amp Extreme, etc. Why would EVGA let out a 2100MHz GTX 1070 as just a SC edition? They'd lose $50-100 (depending on markup) if they didn't make it a FTW or FTW+ card.
Quote:


> Originally Posted by *Frutek*
> 
> Some people already have done that - http://www.overclock.net/t/1487012/official-nzxt-kraken-g10-owners-club/5490_30#post_25277269


Shucks, thought I quoted this. Thank you! I'll give these a nice long read.


----------



## LiquidHaus

Quote:


> Originally Posted by *ikjadoon*
> 
> Ohhh, haha. I understand. Yes. In my mind, I didn't even think about H170 or Q170 as an option.
> 
> Err, no. Not the Ti. But, within the GTX 1070 lineup: the STRIX, your Amp Extreme, etc. Why would EVGA let out a 2100MHz GTX 1070 as just a SC edition? They'd lose $50-100 (depending on markup) if they didn't make it a FTW or FTW+ card.
> Shucks, thought I quoted this. Thank you! I'll give these a nice long read.


Well, that should be the question of the year for any of these AIB releases. Every AIB card is clocking around the exact same frequencies, because they are all the reference design, albeit with different phase layouts.... and they're all charging different prices for different tiers of their AIB cards. EVGA is just doing the same as everyone else. Their cards just happen to normally be a bit more expensive than others - blame that on name recognition.


----------



## Dreamliner

Quote:


> Originally Posted by *Frutek*
> 
> Some people already have done that - http://www.overclock.net/t/1487012/official-nzxt-kraken-g10-owners-club/5490_30#post_25277269


Wow. With those heatsink plates, the MSI on an AIO looks fantastic.

Too bad it's not really helping performance. I see in those screenshots he's just barely maintaining 2GHz. That seems to be the limit on these cards no matter what.

It's a bit disappointing to see there's pretty much zero difference in performance capability regardless of manufacturer or cooling configuration.


----------



## bigjdubb

Quote:


> Originally Posted by *ikjadoon*
> 
> That's what I'm eyeing.
> 
> If anyone puts an AIO on an MSI card, let us know how it goes. Can you fit the AIO while leaving the VRAM plate on? I'm thinking about dropping a 120mm rad onto one of these GTX 1070s.
> Err, hold on, haha. Any Z170/Z97/Z87 motherboard will overclock a CPU to just about the same level. This has been tested time and time again; Silicon Lottery (which bins hundreds of CPUs, maybe thousands by now) has also reaffirmed this: the motherboard will not change your overclock in the slightest....unless you were going on LN2.
> 
> GPUs, though, I think are a totally different story. Someone floated an interesting point: maybe one reason all these cards are maxing out at 2050MHz more often than 2100MHz: EVGA, MSI, et al. are binning those cards (and thus not selling them now) for their Lightning, Classified, HOF LE, etc. cards.


I am putting my Alphacool HF-14 universal GPU block on my MSI Gaming 1070 this weekend (fingers crossed). I will let you know if I run into any backplate/frontplate/block mounting issues.


----------



## bigjdubb

Quote:


> Originally Posted by *Dreamliner*
> 
> Wow. With those heatsink plates, the MSI on an AIO looks fantastic.
> 
> Too bad it's not really helping performance. I see in those screenshots he's just barely maintaining 2GHz. That seems to be the limit on these cards no matter what.
> 
> *It's a bit disappointing to see there's pretty much zero difference in performance capability regardless of manufacturer or cooling configuration.*


Well, watercooling will help maintain maximum boost clocks, since the temps can be kept below any boost-reduction points.
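The "clock dancing" people keep mentioning can be sketched as a simple stepping model. The ~13MHz bin size matches what monitoring tools show on these cards; the temperature breakpoints below are purely hypothetical stand-ins, since NVIDIA does not publish them:

```python
# Rough model of GPU Boost 3.0 temperature stepping ("clock dancing").
# BIN_MHZ matches the step size monitoring tools report on Pascal;
# TEMP_STEPS_C are assumed breakpoints for illustration, not real values.

BIN_MHZ = 13
TEMP_STEPS_C = [38, 46, 54, 60, 66, 72]  # hypothetical breakpoints

def boosted_clock(max_boost_mhz: int, temp_c: float) -> int:
    """Drop one ~13 MHz bin for every breakpoint the core temp exceeds."""
    bins_lost = sum(1 for t in TEMP_STEPS_C if temp_c > t)
    return max_boost_mhz - bins_lost * BIN_MHZ

# A card that boosts to 2088 MHz when cool slowly steps down as it heats:
for temp in (35, 50, 65):
    print(temp, boosted_clock(2088, temp))
```

Under a model like this, keeping the core below the first breakpoint (which watercooling easily does) holds the full boost clock indefinitely.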


----------



## Frutek

Quote:


> Originally Posted by *Dreamliner*
> 
> Wow. With those heatsink plates, the MSI on an AIO looks fantastic.
> 
> Too bad it's not really helping performance. I see in those screenshots he's just barely maintaining 2GHz. That seems to be the limit on these cards no matter what.
> 
> It's a bit disappointing to see there's pretty much zero difference in performance capability regardless of manufacturer or cooling configuration.


Those screenshots are without any OC


----------



## Dreamliner

I'd still love to see a comprehensive 1070 roundup with various manufacturers and cooling solutions.


----------



## wrathofbill

The best I could get on TimeSpy

http://www.3dmark.com/spy/44184

And I finally broke the 20,000 graphics score barrier on Firestrike

http://www.3dmark.com/fs/9319994


----------



## Yungbenny911

Now the wait begins


----------



## criminal

Quote:


> Originally Posted by *Yungbenny911*
> 
> Now the wait begins


Very nice.


----------



## ikjadoon

Quote:


> Originally Posted by *Dreamliner*
> 
> Wow. With those heatsink plates, the MSI on an AIO looks fantastic.
> 
> Too bad it's not really helping performance. I see in those screenshots he's just barely maintaining 2GHz. That seems to be the limit on these cards no matter what.
> 
> It's a bit disappointing to see there's pretty much zero difference in performance capability regardless of manufacturer or cooling configuration.


True. I'm more interested in exhausting the heat directly out of the case (these cards probably dump a 200W+ load), running quieter (thinking about a fat 60mm rad in push/pull), and the more consistent temperatures/boosting.

And, sure, if we ever figure out BIOS tweaking, the headroom won't hurt.
Quote:


> Originally Posted by *bigjdubb*
> 
> I am putting my Alphacool HF-14 universal GPU block on my MSI Gaming 1070 this weekend (fingers crossed). I will let you know if I run into any backplate/frontplate/block mounting issues.


Nooiiiicceee. I'm crazy excited for you!!

And slightly jealous. But, definitely way more excited! hahaha
Quote:


> Originally Posted by *lifeisshort117*
> 
> Well, that should be the question of the year in regards to any of these AIB releases. Every AIB card is clocking around the exact same frequencies. Because they are all reference design albeit with different phase layouts....and they're all charging different prices for different levels of their AIB cards. EVGA is just doing the same as everyone else. Their cards just happen to normally be a bit more expensive than others - and blame that on the name recognition.


True. I likely won't be buying any GPUs until at least the BF1 beta drops and we can get more consistent performance numbers.... maybe we'll figure this out sooner rather than later.

Because... in a crazy way, haha... I wouldn't mind buying a cheaper card, as I won't be using the fancy air cooler anyway. It's just that those lower-end-cooler cards usually have lower-end bins, too. :-/


----------



## bigjdubb

Quote:


> Originally Posted by *Yungbenny911*
> 
> Now the wait begins


That's definitely the best way to get an EK block on an MSI card. They want $210 for the block and backplate, and the block somehow manages to be worse looking than the one they make exclusively for the MSI Seahawk.


----------



## Yungbenny911

Quote:


> Originally Posted by *bigjdubb*
> 
> That's definitely the best way to get an EK block on an MSI card. They want $210 for the block and backplate, and the block somehow manages to be worse looking than the one they make exclusively for the MSI Seahawk.


I know right...

I actually bought two Gaming 1070s, but I'm sending them back to save $200


----------



## Blackfyre

Unbelievable how amazing The Witcher 3 looks after graphically modding it, adding SUPREME and ULTIMATE graphics modes that go beyond ULTRA quality.

I have the game constantly running at 60FPS with v-sync, most of my settings between Supreme & Ultimate, with only shadow quality set to ULTRA (_the highest setting in the game by default_).

It truly looks incredible. Like a completely new experience playing it. I started the whole game again when I first got the GTX 1070, and I'm doing every little side-quest too.

I think it has surpassed the GTA and MGS series as my favourite narrative-driven game of all time. The story is incredible. I'm actually considering buying the books.

Oh, and how can I forget Telltale's Walking Dead series

But the Witcher is open world, and it's beautiful.


----------



## bigjdubb

I really wish I enjoyed that game, it is quite pretty.


----------



## flamin9_t00l

Quote:


> Originally Posted by *bigjdubb*
> 
> I really wish I enjoyed that game, it is quite pretty.


I second that... I tried The Witcher 2 and couldn't get into it, so I never bothered with the 3rd.

The Walking Dead series was great (haven't tried Michonne yet tho).

I've been rocking the Elite championship in DiRT Rally (god it's hard on manual gears), good fun tho.


----------



## Blackfyre

Quote:


> Originally Posted by *flamin9_t00l*
> 
> I second that... I tried The Witcher 2 and couldn't get into it, so I never bothered with the 3rd.


Well, now that you have a GTX 1070, I cannot recommend The Witcher 3 enough (_you don't need the previous two titles; there's enough lore around the world to understand everything_). Buy it or download it and push the settings to ULTRA.

@ 1080p it should be a breeze. Just keep HairWorks at LOW instead of HIGH.


----------



## flamin9_t00l

Speaking of DiRT Rally... I remember running it on my reference R9 290 and it was absolutely wringing its neck @ 1440p (not maxed settings).

Running it on the 1070 is a walk in the park... it's barely breaking a sweat with fully maxed settings and vsync. The performance (& noise) difference is night and day.


----------



## Schneeder

Current score.


Power Limit - 120%
Core Clock + 95
Memory Clock + 590
Never hit over 68℃.

6700k is stock currently.


----------



## flamin9_t00l

Quote:


> Originally Posted by *Blackfyre*
> 
> Well, now that you have a GTX 1070, I cannot recommend The Witcher 3 enough (_you don't need the previous two titles; there's enough lore around the world to understand everything_). Buy it or download it and push the settings to ULTRA.
> 
> @ 1080p it should be a breeze. Just keep HairWorks at LOW instead of HIGH.


I have considered buying it a couple of times, and it has been at decent prices in the sales. The graphics do look wicked and it's probably worth a look (maybe in the next sale). I was also going to try Shadow of Mordor as well.


----------



## Ranguvar

GTX 1070 FTW @ 2,278MHz / 9,455MHz, 45C



Benchmark link: http://www.3dmark.com/3dm/13273661
3DMark invalidated the run because the 1070 FTW is not recognized yet.

Test taken at 100% fan, which I'd happily run if I needed to. Still stable with a fan curve.
Max voltage 1.093V. Hopefully we can tap the next 0.16V soon.

Performance seems a bit low for the clock speed; any ideas (decreasing clock/memory does not help)?


----------



## ikjadoon

Quote:


> Originally Posted by *Ranguvar*
> 
> GTX 1070 FTW @ 2,278MHz / 9,455MHz, 45C
> 
> 
> 
> Benchmark link: http://www.3dmark.com/3dm/13273661
> 3DMark invalidated because 1070 FTW is not recognized yet.
> 
> Test taken at 100% fan, which I'd happily do if I needed. Still stable with fan curve.
> Max voltage 1.093V. Hopefully we can tap the next 0.16V soon.
> 
> Performance seems a bit low for the clock speed, any ideas (decreasing clock/memory does not help)?


This is on the stock ACX cooler?! Insane! That's the highest clock I've ever seen; you're nearly at 2.3GHz at 45C load?! Is it game stable, too?


----------



## Blackfyre

Quote:


> Originally Posted by *Ranguvar*
> 
> GTX 1070 FTW @ 2,278MHz / 9,455MHz, 45C
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> Benchmark link: http://www.3dmark.com/3dm/13273661
> 3DMark invalidated because 1070 FTW is not recognized yet.
> 
> Test taken at 100% fan, which I'd happily do if I needed. Still stable with fan curve.
> Max voltage 1.093V. Hopefully we can tap the next 0.16V soon.
> 
> Performance seems a bit low for the clock speed, any ideas (decreasing clock/memory does not help)?


WOW! What?! That's INSANE! I'm going to really regret not waiting for the EVGA FTW! I think this is a super golden chip, and not all of them, not even most of them, will hit such clocks. This is insane.

*EDIT:*

Your benchmark link shows this: http://www.3dmark.com/3dm/13273661

*Graphics Score = 20853*

I've hit 21K with lower clocks. Much lower clocks... Some BS is happening here...


----------



## Ranguvar

Quote:


> Originally Posted by *ikjadoon*
> 
> This is on the stock ACX cooler?! Insane! That's the highest clock I've ever seen; you're nearly at 2.3GHz at 45C load?! Is it game stable, too?


So far. It may need -25MHz, but it hasn't crashed yet.

This year's ACX seems very good, plus Pascal is positively frigid.
Not sure about throwing these under water; I doubt we'll be able to volt up enough to spend that headroom.

Quote:


> Originally Posted by *Blackfyre*
> 
> I've hit 21K with lower clocks. Much lower clocks... Some BS is happening here...


Exactly. I posted another thread directly asking what's up. It won't pass 100% TDP; there's that, at least.
I can hope that SystemInfo validating the 1070 will help, but I highly doubt it will.


----------



## LiquidHaus

Quote:


> Originally Posted by *Ranguvar*
> 
> GTX 1070 FTW @ 2,278MHz / 9,455MHz, 45C
> 
> 
> 
> Benchmark link: http://www.3dmark.com/3dm/13273661
> 3DMark invalidated because 1070 FTW is not recognized yet.
> 
> Test taken at 100% fan, which I'd happily do if I needed. Still stable with fan curve.
> Max voltage 1.093V. Hopefully we can tap the next 0.16V soon.
> 
> Performance seems a bit low for the clock speed, any ideas (decreasing clock/memory does not help)?


Quote:


> Originally Posted by *Blackfyre*
> 
> WOW! What?! That's INSANE! I'm going to really regret not waiting for the EVGA FTW! I think this is a super golden chip and not all of them, not even most of them, will hit such clocks. This is insane.
> 
> *EDIT:*
> 
> Your benchmark link shows this: http://www.3dmark.com/3dm/13273661
> 
> *Graphics Score = 20853*
> 
> I've hit 21K with lower clocks. Much lower clocks... Some BS is happening here...


yeah... 2278mhz is absolutely epic for this gpu. that's actually the highest overclock i've seen short of an LN2 1080. AND 45c? it just doesn't make sense. that being said, you DO have the first FTW card here, I believe.

and like Blackfyre said, I've personally gotten a higher score than that at 2088mhz core, with RAM just a smidge over the frequency you've shown. something definitely isn't right.


----------



## TheMiracle

What's the max you guys think is possible without touching the voltage?

I have a G1 Gaming

Made a test with:
111% power limit
+100 core
+400 memory

All fine.

I don't like changing the voltage since I'm afraid it can damage the card.


----------



## LiquidHaus

Quote:


> Originally Posted by *TheMiracle*
> 
> What is the Max you guys think it is possible without touching the voltage??
> 
> I have a G1 Gaming
> 
> Made a test with:
> 111% power limit
> +100 core
> +400 memory
> 
> All fine.
> 
> I dont like to change the voltage since I am afraid it can damage the card.


hey man - not to be rude, but if you'd read just the past couple pages, you'd know that these cards are currently voltage locked. you couldn't change it if you wanted to. so whatever max OC you get is what you get, until modded BIOSes come out. that's that. and everyone's cards are performing around the same anyway, so it doesn't matter which one you want. just go for looks.


----------



## ikjadoon

Quote:


> Originally Posted by *Ranguvar*
> 
> So far. May need -25MHz, but have not crashed yet.
> 
> This year's ACX seems very good, plus Pascal is positively frigid.


It's way above average for FTW, too. From the 4 EVGA FTW reviews I read:

HiTechLegion review: 2088MHz max sustained boost
Hexus.net review: 2062MHz max sustained boost
Bjorn3D review: 2007MHz max sustained boost
TweakTown review: 2050MHz max sustained boost

Doing some nearly useless stats, that's a 2052MHz average with a ~30MHz standard deviation. Your 2,278 is +226MHz higher...you're *nearly 8 standard deviations away*, haha...literally a speck on the right edge of the bell curve. Less than 0.000000000001% of chips are like that. _Literally one in a *trillion*_.

Odd that the results are low; the clocks could be misreported. But if not...brahh...you have the luckiest GTX 1070 in the world, bar none.
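If anyone wants to redo that back-of-the-envelope math, here's a quick sketch (clocks taken from the four reviews above; with a four-card sample the z-score is a joke, not real statistics):

```python
import statistics

# Max sustained boost clocks (MHz) from the four EVGA FTW reviews above
clocks = [2088, 2062, 2007, 2050]

mean = statistics.mean(clocks)    # ~2052 MHz
sd = statistics.pstdev(clocks)    # ~29 MHz (population SD of this tiny sample)
z = (2278 - mean) / sd            # how many SDs above the mean 2278 MHz sits

print(f"mean={mean:.0f}MHz sd={sd:.0f}MHz z={z:.1f}")
```

which lands right around the "nearly 8 standard deviations" figure.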


----------



## flamin9_t00l

Quote:


> Originally Posted by *Ranguvar*
> 
> GTX 1070 FTW @ 2,278MHz / 9,455MHz, 45C
> 
> 
> 
> Benchmark link: http://www.3dmark.com/3dm/13273661
> 3DMark invalidated because 1070 FTW is not recognized yet.
> 
> Test taken at 100% fan, which I'd happily do if I needed. Still stable with fan curve.
> Max voltage 1.093V. Hopefully we can tap the next 0.16V soon.
> 
> Performance seems a bit low for the clock speed, any ideas (decreasing clock/memory does not help)?


Would you mind filling in your rig details in your profile... I'm interested to see the specs and other components.

That clock is very impressive but ***'s wrong with the score... it's gotta be something to do with the unrecognized GPU. If you're on Skylake, it definitely shouldn't be those figures.

I'm thinking, for giggles, that we're all really running 1500MHz and Ranguvar's card was just programmed wrong lol hahaha... that should spark a few comments lol. It would be a lot worse than the 970 memory fiasco. Teehee.


----------



## flamin9_t00l

Quote:


> Originally Posted by *Ranguvar*
> 
> GTX 1070 FTW @ 2,278MHz / 9,455MHz, 45C


Hey mate, why don't you try a run @ say between 2050-2100 core clock and see what the score is? if you don't mind


----------



## Yungbenny911

wrong thread lol

BTW: isn't that a poor graphics score for that clock speed? I've seen 2100Mhz 1070's get 21k gpu score...


----------



## Swolern

Anyone here running 1070 SLI on a 3440x1440 monitor? How is it? I have 2 STRIXs incoming and looking to pick up the Acer X34 100hz monitor.


----------



## Forceman

Quote:


> Originally Posted by *TheMiracle*
> 
> What is the Max you guys think it is possible without touching the voltage??
> 
> I have a G1 Gaming
> 
> Made a test with:
> 111% power limit
> +100 core
> +400 memory
> 
> All fine.
> 
> I dont like to change the voltage since I am afraid it can damage the card.


What's your actual clockspeed at those settings? Just giving the offset doesn't help, since each card boosts to a different level at stock.


----------



## Mad Pistol

Quote:


> Originally Posted by *Swolern*
> 
> Anyone here running 1070 SLI on a 3440x1440 monitor? How is it? I have 2 STRIXs incoming and looking to pick up the Acer X34 100hz monitor.


There isn't anything that I cannot run @ max at 3440x1440. It was almost max on my single 1070, but dual 1070s in SLI is a dream. Make sure you get at least an LED bridge to go along with it, but I would recommend an HB bridge. Mine definitely made things seem smoother.


----------



## headd

Quote:


> Originally Posted by *Ranguvar*
> 
> Exactly. I posted another thread directly asking what is up. It won't pass 100% TDP, there's that at least.
> I can hope that SystemInfo validating the 1070 will help, but I highly doubt it?


Slow memory. Firestrike scales better with faster memory, and you only have 9454MHz.
I managed 21,100 points with 2100-2164/9520MHz.
http://www.3dmark.com/3dm/13241821

There are a few better results with memory at 9800MHz and the core at only 2100MHz.


----------



## pez

Quote:


> Originally Posted by *lifeisshort117*
> 
> I had meant Z170 compared to a H170 or Q170 - and when you do that comparison.... overclocks will definitely change.
> 
> and also, I do not believe those companies will dump the resources into going that route simply because they are waiting for the 1080Ti to be released. Ever head of an MSI 980 (non-ti) Lightning? Me neither. That's because the Ti's will be for the big big cards.
> 
> And i'm pretty sure at that point, binning 1070s isn't relate-able to them binning 1080s for the Ti's. I don't think that's possible when the Ti is such a jump from a normal card. Nvidia will have to manufacture them before they could even be binned.
> 
> At which point - in my case, the Amp Extreme is Zotac's highest end revision card. I fully expected them to release just the AMP version, but they went forward and released the Amp Extreme. Which is exactly why I went for it. It was the only card that went toe to toe with the Kingpin back when the 980ti's we're the newest things out.


Well a GTX 770 Lightning existed, so it's not a total improbability. However, I think with the Gaming X and Gaming Z, they've either replaced the lightning or are indeed going to wait for a Ti.


----------



## headd

My timespy results from yesterday
http://www.3dmark.com/spy/12297
I was world first like 8hours with single GTX1070







Now I'm 15th in the world. I only have average memory (max around 9500MHz),
but the GPU score is still pretty good. Most guys only beat me on CPU score and are actually slower in graphics score.


----------



## nacherc

Result with an i7 6700K downclocked to 3.8GHz because I have a ****ty CPU cooler. Next week I'm going to buy an ID-COOLING FROSTFLOW 240L and do some benches.


----------



## Swolern

Quote:


> Originally Posted by *Mad Pistol*
> 
> There isn't anything that I cannot run @ max at 3440x1440. It was almost max on my single 1070, but dual 1070's in SLI is a dream. Make sure you get at least an LED bridge to go along with it, but I would recommend an HB bridge. Mine definitely made things seems smoother.


Thanks for the reply. I believe your monitor runs at 60Hz, correct? Can you OC the refresh rate any? I'm looking at the X34, which runs at 100Hz, but I'm not sure I want to spend $1200 on it.

Have you tried two regular single ribbon SLI connectors? I saw some tests where they gave the same performance as the HB bridge. http://www.hardwareunboxed.com/nvidias-hb-sli-bridge-surprising-gains-gtx-1080-sli-testing-inside/


----------



## Jiehfeng

Finally got a stable OC, basically a 10+fps gain.
On MSI Afterburner with the power limit maxed out:
| +680 MHz [Memory Clock]
| +150 MHz [Core Clock]

Stress tested it with FurMark, max temps I got was 80C (custom fan curve).
Results:



Stock:


OC:


----------



## TheMiracle

Quote:


> Originally Posted by *lifeisshort117*
> 
> hey man - not to be rude, but if you've read just the past couple pages, you'd learn that these cards are currently voltage locked. you couldn't change it if you wanted to. so whatever your max oc that you get is, you get. until modded bios come out. that's that. and everyone's cards are performing around the same anyway so it doesn't matter which one you'll want. just go for looks.


Yes, sorry. I saw you guys talking about voltage and I didn't notice that it was because of the new core/voltage curve.
With +100 core my boost goes to 2050MHz with some drops to 2026MHz.


----------



## marduke83

So I picked up a Gainward Phoenix 1070 today (not the GLH unfortunately, none available here at the moment), but I'm pretty happy with it. Not overly impressed with the overclock or the memory clock, but I guess that's the lottery. These results are the best I've been able to get; anything more on the core or memory will cause the drivers to crash.







But it's a HUGE improvement over my old 780ti!


----------



## Mad Pistol

Quote:


> Originally Posted by *Swolern*
> 
> Thanks for the reply. I believe your monitor runs at 60hz correct? Can you OC the refresh any? I'm looking at the X34 which runs at 100hz, but not sure if I want to spend $1200 bucks on it.
> 
> Have you tried 2 regular single ribbon SLI connectors? I saw some test where they gave the same performance as the HD bridge. http://www.hardwareunboxed.com/nvidias-hb-sli-bridge-surprising-gains-gtx-1080-sli-testing-inside/


I tried overclocking my monitor, but to no avail. There were dropped frames when I overclocked it even just to 70hz.

I actually had 2 bridges before this, one was a solid bridge, the other was a ribbon bridge, both connected at the same time. It was still slightly stuttery, but not bad. After getting the HB SLI bridge, the stutters all but went away. Since you're planning on getting a high refresh 3440x1440 monitor, I really would recommend the HB SLI bridge. You will more than likely see the difference.


----------



## PhilWrir

I just ordered a Gigabyte 1070 G1 Gaming that gets here Tuesday.


I'm going to showcase some severe ignorance here, so feel free to tease me if needed.

How easy are these cards to overclock?
I could never get the hang of pushing my 780, and I assume the 1070 is a similar situation in regards to base clock OC, boost OC, thermal and voltage throttling, etc.?
Does anyone know of a simple, easy, stupid OC guide for these cards? They're a total overclocking mystery to me.
(the last time I successfully OC'd a card was a GTX 480)


----------



## Blackfyre

Quote:


> Originally Posted by *PhilWrir*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I just ordered a Gigabyte 1070 G1 Gaming that gets here tuesday.
> 
> Im going to showcase some severe ignorance here so feel free to tease me if needed.
>
> How easy are these cards to overclock?
> I could never get the hang of pushing my 780 and I assume the 1070 is a similar situation in regards to base clock OC, Boost OC, thermal and voltage throttling etc?
> 
> 
> *Does anyone know of a simple-easy-stupid OC guides for these cards* because they are a total overclocking mystery to me.
> (the last time I successfully OCd a card was a GTX480)


*I posted this a long time ago, should be a good starter:*

http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/740#post_25339752


----------



## Bee Dee 3 Dee

(2) ASUS GeForce GTX 1070 8GB ROG STRIX OC Edition
(STRIX-GTX1070-O8G-GAMING)


----------



## marduke83

Quote:


> Originally Posted by *PhilWrir*
> 
> I just ordered a Gigabyte 1070 G1 Gaming that gets here tuesday.
> 
> Im going to showcase some severe ignorance here so feel free to tease me if needed.
>
> How easy are these cards to overclock?
> I could never get the hang of pushing my 780 and I assume the 1070 is a similar situation in regards to base clock OC, Boost OC, thermal and voltage throttling etc?
> Does anyone know of a simple-easy-stupid OC guides for these cards because they are a total overclocking mystery to me.
> (the last time I successfully OCd a card was a GTX480)


Honestly it wasn't too bad. It's strange for me to see the actual boost go WAY above what GPU-Z says (I was used to it going to whatever it said the boost should be); mine says 1863MHz boost but it will go to 2050+ with the right conditions, and it usually sits around 2020ish. I maxed the power limit first without touching anything else, then tested the boost clock and it was already in the 1900s.


----------



## duganator

So I'm looking to upgrade from a 970, is there any particular 1070 I should be looking at? I like the look of the Asus strix, and it's a bit cheaper than some other cards.


----------



## Dude970




----------



## LiquidHaus

Quote:


> Originally Posted by *PhilWrir*
> 
> I just ordered a Gigabyte 1070 G1 Gaming that gets here tuesday.
> 
> Im going to showcase some severe ignorance here so feel free to tease me if needed.
>
> How easy are these cards to overclock?
> I could never get the hang of pushing my 780 and I assume the 1070 is a similar situation in regards to base clock OC, Boost OC, thermal and voltage throttling etc?
> Does anyone know of a simple-easy-stupid OC guides for these cards because they are a total overclocking mystery to me.
> (the last time I successfully OCd a card was a GTX480)


severe ignorance indeed!

they are quite easy to overclock....if all you want out of it is 2000mhz.

just snag Afterburner, do +80mhz on the core slider and +500mhz on the memory slider, and you'll be golden. if you feel like diving into the deep end that is the BIOS-locked voltage limit, crank the core slider up to +125 and see how you do. haha, we're all frustrated with this voltage lock. everyone's cards are essentially the same so we're all in the same boat.


----------



## Forceman

Quote:


> Originally Posted by *lifeisshort117*
> 
> severe ignorance indeed!
> 
> they are quite easy to overclock....if all you want out of it is 2000mhz.
> 
> just snag afterburner, do +80mhz on the core slider and +500mhz on the memory slider and you'll be golden. if you feel like diving in the deep end that is bios locked-voltage limits with us, crank the core slider up to +125 and see how you do. haha we're all frustrated on this voltage lock. everyone's cards are essentially the same so we're all in the same boat.


You can't just pop in someone else's offset numbers, because every card boosts differently at stock. Your +85 might be his +35, or +150. It'll work for memory since they all start at the same clock, but not for core.

If you're targeting a specific number (like 2000), the easiest way is to run GPU-Z and start the render test while watching the actual clock speed on the sensors tab. Take 2000 (or whatever you want to try) and subtract the actual clock speed from GPU-Z. Plug that in as your offset and, bingo, you have your desired speed. It may change a little in actual gaming as the boost moves around, but it's an easy way to start.
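The method boils down to a single subtraction; a trivial sketch (the 1911MHz reading is a made-up example, not anyone's actual card):

```python
def starting_offset(target_mhz: int, observed_boost_mhz: int) -> int:
    """Afterburner core offset that should land the card near target_mhz,
    given the actual boost clock observed on GPU-Z's sensors tab at stock."""
    return target_mhz - observed_boost_mhz

# A card that boosts to 1911 MHz at stock needs roughly +89 to reach 2000 MHz
print(starting_offset(2000, 1911))  # -> 89
```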


----------



## LiquidHaus

Quote:


> Originally Posted by *Forceman*
> 
> You can't just pop in someone else's offset numbers because every card boosts differently at stock. Your +85 might be his +35, or +150. It'll work for memory since they are all the same, but not core.
> 
> If you are targeting a specific number (like 2000) the easiest way is to run GPU-Z and start the render test while watching the actual clock speed on the sensors tab. Take 2000 (or whatever you want to try) and subtract the actual clock speed from GPU-Z. Plug that in as your offset, and bingo, you have you desired speed. It may change a little in actual gaming as the boost moves around, but it's an easy way to start.


Sure you can, because those offset assumptions are based on an AIB card. Founders Edition (reference) cards would just require higher offset settings. So if anything, he'd still have more headroom if he by chance didn't snag an AIB; which, by the way, I'd greatly recommend he does. Snag an AIB card.

I wasn't being ignorant in my simplified advice.

Anyway, I finally got around to running Time Spy!



I cranked my fans to 90% to get away from any boost throttle, and it kept the card at 57c. LOL


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *lifeisshort117*
> 
> severe ignorance indeed!
> 
> they are quite easy to overclock....if all you want out of it is 2000mhz.
> 
> just snag afterburner, do +80mhz on the core slider and +500mhz on the memory slider and you'll be golden. if you feel like diving in the deep end that is bios locked-voltage limits with us, crank the core slider up to +125 and see how you do. haha we're all frustrated on this voltage lock. everyone's cards are essentially the same so we're all in the same boat.


Quote:


> Originally Posted by *Forceman*
> 
> You can't just pop in someone else's offset numbers because every card boosts differently at stock. Your +85 might be his +35, or +150. It'll work for memory since they are all the same, but not core.
> 
> If you are targeting a specific number (like 2000) the easiest way is to run GPU-Z and start the render test while watching the actual clock speed on the sensors tab. Take 2000 (or whatever you want to try) and subtract the actual clock speed from GPU-Z. Plug that in as your offset, and bingo, you have you desired speed. It may change a little in actual gaming as the boost moves around, but it's an easy way to start.


both of those ^^ combined is awesome!

thanks


----------



## Dreamliner

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> (2) ASUS GeForce GTX 1070 8GB ROG STRIX OC Edition
> (STRIX-GTX1070-O8G-GAMING)


This post is underrated.


----------



## nyk20z3

I'm running an Asus Nano now but have interest in the Gigabyte Xtreme Gaming 1070; the problem is I don't know if EK will release a block for it. I told myself I wouldn't buy $600+ GPUs anymore, so no 1080 for me.


----------



## Ysbzqu6572

Sadly, Gigabyte still has not solved the problem with the overkill fan curve on the 1070 G1 with the latest BIOS update (link), so I'm sticking with my custom fan profile below. The card is very quiet under load with this setup and stays at or below 75C:


----------



## ProHitZ

Why do so many cards stop at 2088? Is it some kind of limitation? I've clearly entered higher numbers in Afterburner but it still stops at 2088, and I've seen a lot of other cards with the same clock. GPU-Z and Afterburner report everything as throttling the card (voltage, power limit, temperature, etc.), which seems impossible because the values are good. It's an MSI Gaming X, btw.


----------



## darkpower45

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> (2) ASUS GeForce GTX 1070 8GB ROG STRIX OC Edition
> (STRIX-GTX1070-O8G-GAMING)


You sir have patience.


----------



## ikjadoon

Quote:


> Originally Posted by *ProHitZ*
> 
> Why do so many cards stop at 2088? Is it some kind of limitation? I´ve clearly entered higher numbers in afterburner but it still stops at 2088, seen a lot of other cards aswell with the same clock. GPU-Z and afterburner reports everything to throttling the card (voltage, power limit, temperature etc) which seems impossible because the values are good. A MSI Gaming X btw.


IIRC, since Kepler, Nvidia has "GPU clock straps". The GPU die can't run at every frequency (2085, 2086, 2087, 2088,...); it only runs at pre-defined straps like 2065MHz and then 2088MHz and then 2094MHz (just examples). For whatever reason, GPU overclocking tools still offer us 1MHz increments, even though the GPU doesn't use those.

You may still be in the 2088MHz strap? Or have you already tried a bigger jump, like 2110MHz, which should definitely put you into the next strap?

Or, hell, with Pascal overclocking as it is, it could be something completely different, haha
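To make the strap idea concrete, here's a toy model. The ~13MHz step and the example clocks are assumptions for illustration only; real straps are tied to the card's own base clock and the exact step varies:

```python
STRAP_MHZ = 13.0  # assumed strap width, per the ~13MHz step described above

def effective_clock(requested_mhz: float) -> float:
    """Toy model: the GPU runs the highest strap boundary at or below the request."""
    return (requested_mhz // STRAP_MHZ) * STRAP_MHZ

# Asking for 2102 lands on the same strap as 2093; 2110 clears the next boundary
print(effective_clock(2102), effective_clock(2110))
```

This is why a 1MHz bump in the slider often does nothing, while a bigger jump suddenly sticks.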


--

About the throttling: everything is being throttled? That is funky. I'm not sure I follow, but I don't have a card yet. :-/


----------



## ProHitZ

Quote:


> Originally Posted by *ikjadoon*
> 
> IIRC, since Kepler, Nvidia has "GPU clock straps". The GPU die can't run at every frequency (2085, 2086, 2087, 2088,...); it only runs at pre-defined straps like 2065MHz and then 2088MHz and then 2094MHz (just examples). For whatever reason, GPU overclocking tools still offer us 1MHz increments, even though the GPU doesn't use those.
> 
> You may still be in the 2088Mhz strap? Or, you've already tried a bigger jump like 2110MHz, which definitely should put you into the next strap?
> 
> Or, hell, with Pascal overclocking as it is, it could be something completely different, haha
> 
> 
> 
> 
> 
> 
> 
> 
> 
> --
> 
> About the throttling: everything is being throttled? That is funky. I'm not sure I follow, but I don't have a card yet. :-/


Hmm, I see. Had no idea about that; I think I have 2102 right now. I can try a bit higher and see what happens.

EDIT: Worked out great, man! Apparently I was right below the next strap; running 2100.5 now.

No idea about the throttling reports either; it seems the programs are just freaking out at these speeds... Oh well. Aside from knowing my card wants more voltage, everything is good again!


----------



## Ranguvar

1070 FTW here.
So guys.... 2126MHz does indeed beat 2278MHz, for some reason.


2278MHz core / 2002MHz mem - 6116
2126MHz core / 2002MHz mem - 6272
2278MHz core / 2357MHz mem - 6438
2126MHz core / 2357MHz mem - 6625
Same in Firestrike.
No throttling, <50C.

*Why does the Manual curve with a +150MHz clock equal less performance?*

I bet anything that with 2GHz on the Manual curve vs. 2GHz from a plain +XXMHz offset, the latter will win.


----------



## Swolern

Quote:


> Originally Posted by *Mad Pistol*
> 
> I tried overclocking my monitor, but to no avail. There were dropped frames when I overclocked it even just to 70hz.
> 
> I actually had 2 bridges before this, one was a solid bridge, the other was a ribbon bridge, both connected at the same time. It was still slightly stuttery, but not bad. After getting the HB SLI bridge, the stutters all but went away. Since you're planning on getting a high refresh 3440x1440 monitor, I really would recommend the HB SLI bridge. You will more than likely see the difference.


Interesting. Looks like I need an expensive bridge after all.

Quote:


> Originally Posted by *Ranguvar*
> 
> 1070 FTW here.
> So guys.... 2126 does indeed beat 2278MHz. For some reason.
> 
> 
> 2278MHz/2002MHz - 6116
> 2126/2002MHz - 6272
> 2278MHz/2357MHz - 6438
> 2126/2357MHz - 6625
> Same in Firestrike.
> No throttling, <50C.
> 
> *Why does Manual curve, +150MHz clock, equal less performance?*
> 
> I bet anything 2GHz Manual vs 2GHz with just +XXMHz, the latter will win.


2278 is probably not fully stable. Try to drop the speeds down a tad and test again.


----------



## TheMiracle

How do I set the power limit to 126%? The max I can get is 111% with my G1 Gaming (core voltage 100%). I'm using MSI Afterburner 4.3.0 beta 4.
Is 111% the max for the G1 Gaming?

Edit: Ya, because the G1 has only one 8-pin connector!


----------



## Forceman

Quote:


> Originally Posted by *TheMiracle*
> 
> How do I set the Power Limit to 126%?? The max I can get is 111% with my G1 Gaming (core voltage 100%). I am using MSI Afterburner 4.3.0 beta 4.
> Is 111% the max for G1 Gaming??
> 
> Edit: Ya, because G1 has only one 8-pin connector!


Are you hitting the power limit? I don't think I've seen anything higher than about 92% on mine.


----------



## TheMiracle

Quote:


> Originally Posted by *Forceman*
> 
> Are you hitting the power limit? I don't think I've seen anything higher than about 92% on mine.


Yes. With +100 core and +500 memory it stays at 100-109%, but has some spikes to 111%. Checked with MSI Afterburner and GPU-Z.


----------



## Forceman

Quote:


> Originally Posted by *TheMiracle*
> 
> Yes, with +100 core, +500 memory, it stays at 100%-109%, but have some spikes to 111%. Checking with MSI Afterburner and GPU-Z.


What's the actual clock speed? Maybe I need to check mine again.


----------



## Mad Pistol

Quote:


> Originally Posted by *TheMiracle*
> 
> Yes, with +100 core, +500 memory, it stays at 100%-109%, but have some spikes to 111%. Checking with MSI Afterburner and GPU-Z.


The GTX 1070 and other high-end cards are very dependent on resolution for where the power limit sits. Back when I was using my GTX 780 @ 1920x1200, it was rare to see the card go over 100% power limit, even with the power limit maxed to 106%. However, as soon as I got my 3440x1440 monitor, it almost always stayed above 100%.

Same with my dual GTX 1070s; every game I play that maxes out these cards easily hits 100%+ on the power limit @ 3440x1440.


----------



## TheMiracle

Quote:


> Originally Posted by *Forceman*
> 
> What's the actual clock speed? Maybe I need to check mine again.


2050-2063MHz with some spikes to 2076MHz.
Temp is 57°C max with fans at 70%.


----------



## TheMiracle

Quote:


> Originally Posted by *Mad Pistol*
> 
> The GTX 1070 and other high-end cards are very dependent on what resolution you're using as to what the power limit hits. Back when I was using my GTX 780 @ 1920x1200, It was rare to see the card go over 100% power limit, even when I had the power limit maxed to 106%. However, as soon as I got my 3440x1440 monitor, it almost always stayed above 100%.
> 
> Same with my dual GTX 1070s; ever game that I play that will max out these cards hits 100%+ on the power limit easily @ 3440x1440.


Well, my monitor is also 1440p, but how can resolution influence the TDP? I thought it was only dependent on core clock/voltage vs. GPU usage.


----------



## Forceman

Quote:


> Originally Posted by *TheMiracle*
> 
> 2050-2063mhz with some spikes to 2076mhz.
> Temp is 57°c max and fans at 70%


Hmm. I'll have to check again. I'm at 2050/+500 and I don't think I've ever seen it above 100% in AB. Around 40C with a waterblock. My card doesn't seem to want to go above 1.062V, so maybe that's why.


----------



## TheMiracle

Quote:


> Originally Posted by *Forceman*
> 
> Hmm. I'll have to check again. I'm at 2050/+500 and I don't think I've ever seen it above 100% in AB. Around 40C with a waterblock. My card doesn't seem to want to go above 1.062V, so maybe that's why.


My card is hitting 1.082V.


----------



## Mad Pistol

Quote:


> Originally Posted by *TheMiracle*
> 
> Well, my monitor is also 1440p, but how can resolutions influence the TDP??? I thought it was only dependent on core/voltage vs GPU usage.


I'm unsure of the intricacies of GPU architecture, but as the resolution goes up, the GPU resources and pipeline get more highly saturated. It probably also has something to do with more VRAM usage, as well.


----------



## Ranguvar

Quote:


> Originally Posted by *lifeisshort117*
> 
> Anyway, I finally got around to running Time Spy!


Nice! Is that with the FTW bios?

I'm kinda surprised your CPU scored as high as it did - actually amazed, my 6700K at 4.7GHz scores ~1000 points less!!

http://www.3dmark.com/3dm/13326932


----------



## Ranguvar

Accidental doublepost.
Quote:


> Originally Posted by *Swolern*
> 
> 2278 is probably not fully stable. Try to drop the speeds down a tad and test again.


I lose score by switching to Manual curve and changing *nothing* vs. same 2126MHz clock in the normal tab with +125.

http://www.3dmark.com/3dm/13333860
vs.
http://www.3dmark.com/3dm/13326932

Unfortunate. 2278MHz on the Manual curve scores only slightly better than 2126MHz on the Manual curve, and plain-offset 2126MHz trounces both.

Hoping to figure this out soon. I've never seen higher clocks mean worse performance (as opposed to crashing/artifacts) from GeForce FX until now.


----------



## ikjadoon

Quote:


> Originally Posted by *ProHitZ*
> 
> Hmm I see. Had no idea about that, think I have 2102 right now. Can try a bit higher and see what happens.
> 
> EDIT: Worked out great man! Apparently i was right below the next strap, 2100.5 running now.
> 
> No idea about the throttling reports either, seems the programs are just freaking out at these speeds... Oh well, besides I know my card likes voltage everything is good again!


Nooiice. That lines up perfectly. Kepler was also ~13MHz.



Hmm, well, as long as it's stable & boost is consistent, I'd say you're good.








Quote:


> Originally Posted by *TheMiracle*
> 
> Well, my monitor is also 1440p, but how can resolutions influence the TDP??? I thought it was only dependent on core/voltage vs GPU usage.


Quote:


> Originally Posted by *Mad Pistol*
> 
> I'm unsure of the intricacies of GPU architecture, but as the resolution goes up, the GPU resources and pipeline get more highly saturated. It probably also has something to do with more VRAM usage, as well.


Yup. This is the same with CPUs. All of these tests register a "100% CPU load", but stress different / more parts of the CPU, so power / temperature vary a lot even with "100%".


----------



## showaccord97

First post despite being a member for over 9 years. I recently upgraded my rig from an Asus 660 Ti to an MSI GTX 1070 Gaming X, and also upgraded to an H110i Vi (from a TK 2.0 Pro). What better way to make my first post than getting into the 21k club; 21.5k, actually. Still rocking a 3570K @ 4.6GHz, but I may change that soon as I am a number chaser... thanks to everyone for all the great advice and information.


----------



## Airrick10

I finally got my MSI 1070 Gaming X to break into the Firestrike 20,000 graphics score!







Anything +100 on the core would crash so I had to go to the graph and mess with some voltages there. I pretty much left everything at +80 core but on the 1.093 voltage, I raised it to +130. I was able to get to 2114MHz core clock too!
*
MSI 1070 Gaming X OC settings:
*
+130 on the 1.093v everything else +80 core (Graph)
+700 Memory
Max Power Limit
Max Core Voltage
100% Fan
Unlock Power/Temp Limit

http://www.3dmark.com/fs/9368650


----------



## Forceman

Quote:


> Originally Posted by *TheMiracle*
> 
> My card is hiting 1.082v.


Are you using Afterburner? Can you post a screenshot with AB in it? I'm wondering if I'm missing something or what is going on. I just checked again and at 2050, in Heaven, it's running 1.050V and 92-95% power. Using the slider in AB doesn't change the voltage at all. Or are you using the curve?


----------



## mypickaxe

Quote:


> Originally Posted by *BulletSponge*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mad Pistol*
> 
> I Time Spy some SLI...
> 
> http://www.3dmark.com/3dm/13212917?
> 
> 
> 
> With my envious little eye......
Click to expand...

http://www.3dmark.com/spy/73718



Oh...wait...

I'm number one for all 1070s now. Come at me, bro.







(I KID!)


----------



## LiquidHaus

Quote:


> Originally Posted by *Ranguvar*
> 
> Nice! Is that with the FTW bios?
> 
> I'm kinda surprised your CPU scored as high as it did - actually amazed, my 6700K at 4.7GHz scores ~1000 points less!!
> 
> http://www.3dmark.com/3dm/13326932


Yeah I know lol. I was looking at everyone else's scores, and it seems I'm at the top of this thread for both scores in regards to a one-card setup. I'm waiting to be beat though. The CPU score surprised me the most too; I'm a good margin above most of the scores. I guess my 72-hour burn-in period for attaining a stable overclock is worth it haha.

Quote:


> Originally Posted by *mypickaxe*
> 
> http://www.3dmark.com/spy/73718
> 
> 
> 
> Oh...wait...
> 
> I'm number one for all 1070s now. Come at me, bro.
> 
> 
> 
> 
> 
> 
> 
> 
> (I KID!)


Oh yeah! nice man. And for me, it's nice seeing two Zotac cards near the top. hehe


----------



## mypickaxe

Quote:


> Originally Posted by *lifeisshort117*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ranguvar*
> 
> Nice! Is that with the FTW bios?
> 
> I'm kinda surprised your CPU scored as high as it did - actually amazed, my 6700K at 4.7GHz scores ~1000 points less!!
> 
> http://www.3dmark.com/3dm/13326932
> 
> 
> 
> Yeah I know lol I was looking at everyone else's scores, and it seems I am at the top for both scores on this thread in regards to a one card setup on this thread. I am waiting to be beat though. The CPU score surprised me the most too though. I'm a good margin above most of the scores. I guess that means my burn-in period for attaining a stable overclock that's 72 hours long is worth it haha.
> 
> Quote:
> 
> 
> 
> Originally Posted by *mypickaxe*
> 
> http://www.3dmark.com/spy/73718
> 
> 
> 
> Oh...wait...
> 
> I'm number one for all 1070s now. Come at me, bro.
> 
> 
> 
> 
> 
> 
> 
> (I KID!)
> 
> 
> 
> Click to expand...
> 
> Oh yeah! nice man. And for me, it's nice seeing two Zotac cards near the top. hehe
Click to expand...

Mine are plain Founders Editions. Nothing against Zotac. Oh, and those two Zotac entries were from the same user.


----------



## deegzor

New Fire Strike run after installing VRAM heat sinks. Spoiler: the graphics score is closer to 22k than 21k.







-> http://www.3dmark.com/3dm/13339446?

these cards are going to be absolute beasts when we get our hands on a BIOS tweaker


----------



## alex4069

I filled out the 1070 owner club form but did not have my validation until now.
Here it is:

http://www.techpowerup.com/gpuz/details/7vwfd/


----------



## alex4069

I was playing Forza 6 Apex with my old 7950 CF setup and had to run High textures at 1440p at around 40fps. When I got my 4670K and GTX 1070 AMP, I turned it up to 4K and ultra settings and I'm pegging 60fps, which is the max for my monitor.









I also have a question. Which SLI bridge do I need for two GTX 1070 AMP Edition cards on an MSI Z97S SLI Krait MB? Will I need the long or short bridge?


----------



## luan87us

Finally ran Time Spy. Here's my score with the Strix O8G using OC Mode from Asus GPU Tweak and a 6700K at 4.6GHz. Seems kinda low, doesn't it? I tend to get a lower graphics score than most other 1070 owners and don't really know why. The thing boosts to 2050MHz and doesn't throttle down at all. Am I experiencing that latency issue everyone is talking about?


----------



## joloxx9

My 3d timespy test

http://www.3dmark.com/spy/76757


----------



## alex4069

Quote:


> Originally Posted by *luan87us*
> 
> Finally ran Time Spy. Here's my score with the Strix O8G using OC Mode from Asus GPUTweak and 6700k at 4.6ghz. Seem kinda low doesn't it? I tend to get lower graphic score than most other 1070 owners. Don't really know why. The thing boost to 2050MHz and doesn't throttle down at all. Am I experiencing that latency issue everyone talking about?


Might be something going on; I hit 5483 with an i5 4690K and a GTX 1070 AMP. http://www.3dmark.com/spy/73338


----------



## Forceman

Quote:


> Originally Posted by *luan87us*
> 
> Finally ran Time Spy. Here's my score with the Strix O8G using OC Mode from Asus GPUTweak and 6700k at 4.6ghz. Seem kinda low doesn't it? I tend to get lower graphic score than most other 1070 owners. Don't really know why. The thing boost to 2050MHz and doesn't throttle down at all. Am I experiencing that latency issue everyone talking about?


Try pushing the memory clock up - it can really improve your scores. Seems like a lot of cards can do +500 at least.
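To put rough numbers on why the memory offset helps: bandwidth scales linearly with the effective data rate. A sketch assuming the 1070's 256-bit bus and 8Gbps stock rate (how an Afterburner "+500" maps onto the real clock varies by tool, so treat the rate here as the effective per-pin figure):

```python
def bandwidth_gbs(effective_rate_gbps, bus_width_bits=256):
    """Memory bandwidth in GB/s for a given per-pin data rate."""
    return effective_rate_gbps * bus_width_bits / 8

stock = bandwidth_gbs(8.0)  # 1070 stock: 8Gbps x 256-bit = 256GB/s
oced = bandwidth_gbs(9.0)   # if the OC lands at an effective 9Gbps
print(stock, oced, f"+{(oced / stock - 1) * 100:.1f}%")  # 256.0 288.0 +12.5%
```

A double-digit bandwidth bump for free is why memory offsets move benchmark scores so visibly on these cards.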


----------



## Mad Pistol

Quote:


> Originally Posted by *alex4069*
> 
> I was playing Forza 6 apex with my old setup with 7950 cf and had to run on High textures at 1440p and running 40fps. When got my 4670k and gtx 1070 amp I turn up to 4k and ultra settings and pegging 60fps which is max for my monitor.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I also have a question. Which SLI bridge do I need for two GTX 1070 AMP Edition cards on an MSI Z97S SLI Krait MB? Will I need the long or short bridge?


For 4K SLI, you will need the HB SLI bridge for the best experience. How many slots are there between your cards? (A picture will probably help more than explaining it.)


----------



## KrAzYtHeBoY

*MSI deleted the updated BIOS that set OC Mode as default.*
https://www.msi.com/Graphics-card/support/GEFORCE-GTX-1070-GAMING-X-8G.html#down-bios

*reason?*


----------



## Mad Pistol

I was really hoping this group would become like the GTX 970 owners club, but it doesn't seem that way. Is there just not a lot of interest in this card? The 1080 thread is far more active.

EDIT: also, it looks like the rush for 1070's is over. Newegg has them in abundance at the moment.

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007709%20601201888%20601204369%208000


----------



## jlhawn

I created a thread on OCN the other day about an issue with my new GTX 1070,
I also found a fix: it seems that after removing the MSI Gaming App the problem was gone. I reinstalled the Gaming App just to verify my findings and the problem came back; removing the app again made it go away. I'm posting this in case any of you are having similar problems.

Here is the copy of my thread from a week ago.

07/11/2016
I have had my MSI GTX 1070 Gaming X since July 5, and it has been running perfectly with a boost core clock of 1974MHz out of the box. I have not overclocked it yet as I don't see a need to. Today I was looking at the sensors with GPU-Z, and when I ran the GPU test the card's core clock only boosted to 1151MHz and the voltage only went to 0.68V, when up until now it would boost to 1974MHz at 1.06V. I tried 3 times with the same result, so I rebooted my system and it's back to normal. Also, if it matters, I do not have the DPC latency issue that others are having, as I tested for that. So what I am asking is: would this be a driver problem, or is my brand new GTX 1070 a bad graphics card (pile of poop)? I never had this problem with any of my other Nvidia GPUs even though I overclocked all of them: GTX 295, GTX 580 SLI, GTX 680, GTX 970.
Thank you.


----------



## mypickaxe

Quote:


> Originally Posted by *Mad Pistol*
> 
> I was really hoping this group would become like the GTX 970 owners club, but it doesn't seem that way. Is there just not a lot of interest in this card? The 1080 thread is far more active.
> 
> EDIT: also, it looks like the rush for 1070's is over. Newegg has them in abundance at the moment.
> 
> http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007709%20601201888%20601204369%208000


The price to performance isn't there yet. People are just looking at the 1070 like a less expensive, warmed-over 980 Ti (performance-wise). Problem is, you can find deals on a 980 Ti at about the same price as a lot of the overpriced 1070s out there today. Nobody should be spending more than $500 for a 1070 (just my opinion). I paid MSRP for my Founders Edition 1070s, only because they happened to be on the shelf, I was able to sell my TITAN X and waterblock to cover most of the cost of the two 1070s, and they didn't have the 1080 in stock.

The 1080 is so much faster than anything else that isn't overclocked to the gills, that's where all the activity is, at least for right now.

On another note, managed to improve my scores a little bit. From what I am seeing on the TITAN X forum, a 12,000 graphics score in Time Spy is respectable for 2-way SLI.

http://www.3dmark.com/spy/82064


----------



## joloxx9

Who knows, Ive got that rom on my HDD


----------



## Mad Pistol

Quote:


> Originally Posted by *mypickaxe*
> 
> The price to performance isn't there yet. People are just looking at the 1070 like a less expensive, warmed over 980 Ti (performance wise.) Problem is, you can find deals on 980 Ti at about the same price as a lot of the overpriced 1070s out there today. Nobody should be spending more than $500 for a 1070 (just my opinion.) I paid MSRP for my Founders Edition 1070s, only because they happened to be on the shelf, I was able to sell my TITAN X and waterblock to cover most of the cost of the two 1070s, and they didn't have the 1080 in stock.
> 
> The 1080 is so much faster than anything else that isn't overclocked to the gills, that's where all the activity is, at least for right now.


True, and I hate that. I think Nvidia could have done much better on the value front if they had lowered the MSRP for the 1070 even by $30. It is what it is, though.

Honestly, my GTX 1070 SLI setup is beastly. I will not require anything better for literally years.


----------



## GreedyMuffin

http://www.3dmark.com/fs/9379421

My 1500/1989 results from the 980 Ti. Wondering if I should get rid of the 980 Ti (AKA slap it into a folding rig for 24/7 folding along with the others) and get myself either a 1080 or a 1070. Can someone post benchmarks of the Firestrike Extreme test for me?

I would love it if they were overclocked pretty well: a 24/7-stable OC, not just bench-stable.









Cheers!


----------



## alex4069

Quote:


> Originally Posted by *Mad Pistol*
> 
> 4k SLI, you will need the HB SLI bridge for the best experience. How many slots are there between your cards? (a picture will probably help more than explaining it)


I will post pictures later. I have 1 PCIe 2.0 x1 slot in between.


----------



## Mad Pistol

Quote:


> Originally Posted by *alex4069*
> 
> I will post pictures later. I have 1 pcie 2 x1 slot in between.


You probably want the 3-slot adapter, then. I found this on Nvidia's website.


----------



## ikjadoon

Quote:


> Originally Posted by *Mad Pistol*
> 
> True, and I hate that. I think Nvidia could have done much better on the value front if they had lowered the MSRP for the 1070 even by $30. It is what it is, though.
> 
> Honestly, my GTX 1070 SLI setup is beastly. I will not require anything better for literally years.


Very true. With what consumers expect, it costs a tad too much for a hair too little performance. And prices are still relatively inflated. The cheapest GTX 1070 is still 15% over MSRP: maybe still high demand, maybe "FE" pricing, maybe price gouging, maybe anything.

GTX 970: $330 MSRP, on-par with GTX 780 Ti @ $700
GTX 1070: $380 MSRP, on-par with GTX 980 Ti @ $650

But, you can look at it the other way: you get a much cooler, 8GB, async-enabled GTX 980 Ti for about $430. Progress is good.


----------



## Blackfyre

Quote:


> Originally Posted by *KrAzYtHeBoY*
> 
> *MSI deleted the updated BIOS that set OC Mode as default.*
> https://www.msi.com/Graphics-card/support/GEFORCE-GTX-1070-GAMING-X-8G.html#down-bios
> 
> *reason?*












Why exactly? Where is the public statement? Why was it removed? I never downloaded mine, was waiting for the next BIOS update. But what if they pull this down and then won't post another one?


----------



## Forceman

Quote:


> Originally Posted by *GreedyMuffin*
> 
> http://www.3dmark.com/fs/9379421
> 
> My 1500/1989 results from 980Ti. Wondering if I should get rid of the 980Ti. (AKA slap it into an folding rig for 24/7 folding along with the others.) and get myself either a 1080 or 1070. Can someone post benchmarks to the firestrike Extreme test for me?
> 
> I would love if they're were overclocked pretty good. 24/7 stable OC and not benchstable.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers!


I wouldn't trade a 1500 MHz 980 Ti for a 1070, I don't think. For comparison, my GTX 1070 FSE score at 2050/+500 is 8819/9603.

http://www.3dmark.com/3dm/13362084?


----------



## TheMiracle

Quote:


> Originally Posted by *Forceman*
> 
> Are you using Afterburner? Can you post a screenshot with AB in it? I'm wondering if I'm missing something or what is going on. I just checked again and at 2050, in Heaven, it's running 1.050V and 92-95% power. Using the slider in AB doesn't change the voltage at all. Or are you using the curve?


Sure!
This was during Fire Strike bench.
I am not using the curve.


----------



## mypickaxe

Quote:


> Originally Posted by *GreedyMuffin*
> 
> http://www.3dmark.com/fs/9379421
> 
> My 1500/1989 results from 980Ti. Wondering if I should get rid of the 980Ti. (AKA slap it into an folding rig for 24/7 folding along with the others.) and get myself either a 1080 or 1070. Can someone post benchmarks to the firestrike Extreme test for me?
> 
> I would love if they're were overclocked pretty good. 24/7 stable OC and not benchstable.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers!


Since you already have an overclocked 980 Ti, don't get a 1070 unless you are running VR, plan to SLI, or you really need that extra 2 GB of VRAM right now.


----------



## Blackfyre

Quote:


> Originally Posted by *mypickaxe*
> 
> Since you already have an overclocked 980 Ti, don't get a 1070 unless you are running VR, plan to SLI, or you really need that extra 2 GB of VRAM right now.


+1

If you already have a 980 Ti that's overclocked, there's no point in wasting money on a GTX 1070, since overclocking is limited on it anyway thus far.

If you have the cash for the GTX 1080, that would be an upgrade. Otherwise there's no need for minimal FPS gains here and there.


----------



## Bee Dee 3 Dee

w00t!









Woke up 2day and...



11:00 AM delivered.... who'd have ever thunk it on a Sunday morning, aye!? Stranger than fiction....
Just after waking, breakfast, and... and... YES!!!!













i'll put her in case asap after prepping. never thought it would arrive so early in the day.









it's been three years now with GTX760-SLI (purchased (2013-08-21))... so let me see...

suggestions plz...

(Done) -make incremental image of C drive- 1st thing 2day- as always
(Done) -get latest nVidia driver (368.81-desktop-win8-win7-winvista-64bit-international-whql)- DLed yesterday

checklist (still need to do):
-DL latest beta AB (from Guru3d?)
-uninstall existing AB
-uninstall existing nVidia driver
-shut down
-physically remove GTX760-SLI cards
-slap in GTX1070-SLI cards SWEET!








-install vid drivers and AB
-Run 3DMark
-PLAY DOOM... The Witcher 3 Wild Hunt... GTAV... Far Cry 4... Arma3... Rise of the Tomb Raider.... MUAHAHAHAHA!!!!!


----------



## Forceman

Make sure you get the AB Beta 4 from the MSI website.


----------



## Dude970

Quote:


> Originally Posted by *Forceman*
> 
> Make sure you get the AB Beta 4 from the MSI website.


Is it different than the one from Guru3d?


----------



## Forceman

Quote:


> Originally Posted by *Dude970*
> 
> Is it different than the one from Guru3d?


Actually, I don't know, just make sure you get the Beta 4 wherever you get it. I've just always gotten it direct from MSI.


----------



## Dude970

Quote:


> Originally Posted by *Forceman*
> 
> Actually, I don't know, just make sure you get the Beta 4 wherever you get it


Okay, I think they are the same. When you said to make sure to get it from MSI, I wondered.


----------



## GreedyMuffin

Hi!

Thanks!

Might just get a 1080 then. I need another card for folding, as I sold 2x 980s a while ago (before the 1080 came out, to get the most money out of them).

Could wait for the 1080Ti perhaps?


----------



## Hunched

Quote:


> Originally Posted by *KrAzYtHeBoY*
> 
> *MSI deleted the updated BIOS that set OC Mode as default.*
> https://www.msi.com/Graphics-card/support/GEFORCE-GTX-1070-GAMING-X-8G.html#down-bios
> 
> *reason?*


No idea. I managed to snag it myself from a cached version of the page when it was still available; that's no longer possible.
They didn't remove the BIOS for the 1080 Gaming X, only the 1070...

It's going to suck if they aren't going to manually post updated BIOSes, since that means we'll have to use their update tool and let it do its automatic stuff.
Which means it will probably only give me a BIOS for the non-X version of the card, since that's what I own and use.

Hopefully somewhere reputable will collect the BIOSes from the update tool so we can download them anyway.
If you can flash the X BIOS to a non-X card, you can likely do the same with the Z BIOS.
I'd like updated versions of the X BIOS and may upgrade to the Z BIOS if I can ever find them...

Here's the .zip that was on MSI's site. You can also find just the .rom on Techpowerup.

GeForce_GTX_1070_GAMING_X_8G_602-V330-06S_vbios.zip 3071k .zip file


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *Forceman*
> 
> Actually, I don't know, just make sure you get the Beta 4 wherever you get it. I've just always gotten it direct from MSI.


AB Beta 4 from the Milky Way Galaxy.







Tanks!







that's why threads like this are awesome.


----------



## luan87us

Quote:


> Originally Posted by *Forceman*
> 
> Try pushing the memory clock up - it can really improve your scores. Seems like a lot of cards can do +500 at least.


Will I have to mess with voltage for that? I don't really want to mess with the voltage on this card, as it's more than enough for most games I play. Just wondering why my graphics score is always lower than most people's here.


----------



## nacherc

GTX 1070 GAMING X

CORE +100
MEM +800


----------



## Forceman

Quote:


> Originally Posted by *luan87us*
> 
> Will I have to mess with voltage for that? I don't really want to mess with the voltage for this card as it's more than enough for most game I play. Just wondering why my graphic score always lower than most people here.


No, shouldn't have to mess with voltage at all. The memory/controller on these cards is amazing.


----------



## bdc604

dropped an evga gtx1070 sc into my little devil. pretty happy with it overall, set fps to 60 to eliminate coil whine. silent, even during benchmarking. had to pop the side panel off and look inside to make sure the fans were running.


----------



## mypickaxe

Quote:


> Originally Posted by *bdc604*
> 
> dropped an evga gtx1070 sc into my little devil. pretty happy with it overall, set fps to 60 to eliminate coil whine. silent, even during benchmarking. had to pop the side panel off and look inside to make sure the fans were running.


Great. May your temps be high and your framerates be...wait...


----------



## oblivious

Just a curious question, guys, since this is the official club for the 1070... How does it handle dual 1440p monitors? Does it carry its weight?


----------



## alex4069

Quote:


> Originally Posted by *Mad Pistol*
> 
> 4k SLI, you will need the HB SLI bridge for the best experience. How many slots are there between your cards? (a picture will probably help more than explaining it)


Quote:


> Originally Posted by *Mad Pistol*
> 
> You probably want the 3-slot adapter, then. I found this on Nvidia's website.


Here is the pic:


That is a pcie x1 slot right in front of the card.


----------



## alex4069

Quote:


> Originally Posted by *bdc604*
> 
> dropped an evga gtx1070 sc into my little devil. pretty happy with it overall, set fps to 60 to eliminate coil whine. silent, even during benchmarking. had to pop the side panel off and look inside to make sure the fans were running.


I'm sorry to say this, but that is a sexy case. I plan on getting a case that's either black outside and white inside or vice versa.


----------



## alex4069

Could you remove my signup that I did at 8:57 last night? That one didn't have all the info in it.


----------



## whicker

I was having trouble with my non-OC 1070 Strix where the voltage and clock basically bounce around non-stop during benchmarks: clock from 1976-2075MHz and voltage from 0.993-1.093V. Anyway, I flashed the Strix OC BIOS to it and it finally kept a constant clock and voltage at the stock Strix OC boost (2035MHz, 1.093V). I thought all was good, but running Valley and 3DMark I was getting artifacting. So what this tells me is that Asus is definitely binning their cards. Another thing to note is the way they are binning: they are binning chips by TDP. Basically, the highest clock my non-OC Strix can hit without artifacting when over 100% TDP is 1999MHz. That's why, when I run benchmarks with the stock BIOS, it bounces from 1976-2075MHz. Anyway, I'm probably gonna try to return the card and get the OC version. That way I can feel comfortable that once new games come out that actually push it to 100% TDP, the clocks, and the resulting framerate, aren't bouncing everywhere.


----------



## FXformat

Just wanted to say i put my 1070 under water, and with graphics intensive gaming in 4K, it never broke 40 degrees...this card runs cool as hell..

And it looks sexy under water


----------



## Mad Pistol

Quote:


> Originally Posted by *alex4069*
> 
> 
> 
> That is a pcie x1 slot right in front of the card.


Yep. Get the 3-slot HB SLI bridge.


----------



## Dreamliner

Quote:


> Originally Posted by *whicker*
> 
> So what this tells me is Asus is definitely binning their cards. Another thing to note is the way they are binning. They are binning chips by TDP.


Interesting...


----------



## alex4069

Wanted to let you know what the noise is like on a Zotac 1070 AMP with fans at 100 percent, core clock at 2088MHz and memory clock at 2304MHz.

Here is the Youtube link:


----------



## chrcoluk

Palit GameRock Premium: a binned chip and the biggest cooler I've seen, a 2.5-slot cooler. The max I have been able to get this card to, temp-wise, is 62°C, and it's summer.


----------



## syl1979

I am in with Galax 1070 Gamer.


----------



## TheMiracle

Guys,
These OC's you are posting here, do you actually use for gaming or just for bench?


----------



## pez

Quote:


> Originally Posted by *Mad Pistol*
> 
> I was really hoping this group would become like the GTX 970 owners club, but it doesn't seem that way. Is there just not a lot of interest in this card? The 1080 thread is far more active.
> 
> EDIT: also, it looks like the rush for 1070's is over. Newegg has them in abundance at the moment.
> 
> http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007709%20601201888%20601204369%208000


It's been said, but it is very true. The 1070 is a very good card at its price, and I think at an MSRP of $380 it would've been killer. But not a lot of cards are hitting even close to that point, and we're looking at what people were getting good deals on GTX 980s for. I think most users looking forward to this card wanted it to be a $350-400 tops card, and then it would have been a true 970 successor. I loved my 970s.

It's also a strangely niche card. I got one for my GF's rig and, while I haven't been able to test it like I'd like just yet, just based on reviews I felt it a waste to use it at 1080p. However, it's not quite there for all games at 1440p with a high refresh rate. Thus I gave her my 1440p panel and went 4K on my rig. I just can't imagine enjoying SLI 1070s for 4K. I don't mind making compromises on AA with titles, but with the 1070s I'd have to compromise more than that to maintain 60 frames, and for me that's no bueno. UWQHD and QHD with high refresh rates are what I think 1070/1070 SLI fits into perfectly.


----------



## Blackfyre

Quote:


> Originally Posted by *TheMiracle*
> 
> Guys,
> These OC's you are posting here, do you actually use for gaming or just for bench?


I game and benchmark and do everything with my overclock. If it's stable, it's stable.

Others who push their fan speed to 100% and push memory to the limit probably just do it for the sake of benchmarking.


----------



## luan87us

Quote:


> Originally Posted by *Forceman*
> 
> No, shouldn't have to mess with voltage at all. The memory/controller on these cards is amazing.


Sweet I will try that out today.
Quote:


> Originally Posted by *FXformat*
> 
> Just wanted to say i put my 1070 under water, and with graphics intensive gaming in 4K, it never broke 40 degrees...this card runs cool as hell..
> 
> And it looks sexy under water


Hey I have that same case in black







What is your fan configuration? My case is having negative pressure despite the 3 intake fans at the bottom running at 1300rpm, 1 exhaust in the back at 750rpm, and 2 on the Corsair H100i v2 as exhaust.


----------



## TheMiracle

Quote:


> Originally Posted by *Blackfyre*
> 
> I game and benchmark and do everything with my overclock. If it's stable, it's stable.
> 
> Others who push their fan speed to 100% and push memory to the limit probably just do it for the sake of benchmarking.


I am keeping my OC at +100 core (2063MHz boost), +500 memory and 70% fan. With this my card runs at 109% TDP and 1.081V, and the temps don't go higher than 58°C.
I didn't have any problems yet. It passed the stress test from 3DMark.

I am just asking because benches are only 5 min long, but usually I game for more than 2h!!


----------



## Yetyhunter

Has anyone tried the new G1 BIOS? I'm not even sure if it's for my card.

http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios


----------



## bigjdubb

Just a little update. My Alphacool universal GPU block fits the MSI 1070 Gaming perfectly with the front and backplates on. Any block that fits within the mounting holes should fit without an issue, if the mounting system is the post/spring/nut style. If the block needs a special backplate then there might be an issue.

This is the block I am using:



Hopefully the picture will explain what I mean by post/spring/nut mounting. I ran the bolt through from the back, slipped on the plastic washer and used a nut on top of the washer to tighten the bolt to the plate and PCB. I then dropped on the block, put the springs on the posts and tightened down the thumb screw/nut. (The picture shows the bolt and thumb screw opposite of how I installed it.)

Fitment will be more test and see with AIO blocks but it looks like the EVGA kit will work fine with the MSI plates in place.


----------



## bigjdubb

Quote:


> Originally Posted by *TheMiracle*
> 
> I am keeping my OC at +100 core (2063mhz boost), +500 memory and 70% fan. With this my card works at 109% TDP and 1.081V, the temps dont go higher than 58ºC.
> I didnt have any problem yet. It passed the stress test from 3DMark.
> 
> I am just asking because benchs are only 5 min long, but usually I game for more then 2h!!


I usually have a minimum of 2 profiles set up in Afterburner: a benchmarking profile, which has the highest clocks I can get and still pass a benchmark, and gaming profiles that are constantly tweaked as I get more hours in games. I also make separate profiles for stubborn games that don't want to play nice with my clocks (looking at you, GTAV).

With this particular card my max clocks profile and the profile I use for gaming aren't much different, 25mhz less on the memory is the only change so far (+145/+725 for gaming).


----------



## ikjadoon

Quote:


> Originally Posted by *bigjdubb*
> 
> Just a little update. My Alhphacool universal gpu block fits the MSI 1070 Gaming perfectly with the front and backplate on. Any block that fits within the mounting holes should fit without an issue, if the mounting system is the post/spring/nut style. If the block needs a special backplate then there might be an issue.
> 
> This is the block I am using:
> 
> 
> 
> Hopefully the picture will explain when I mean by post/spring/nut mounting. I ran the bold through from the back, slipped on the plastic washer and used a nut on top of the washer to tighten the bolt to the plate and pcb. I then dropped on the block, put the springs on the posts and tightened down the thumb screw/nut. (the picture shows the bolt and thumb screw opposite of how I installed it)
> 
> Fitment will be more test and see with AIO blocks but it looks like the EVGA kit will work fine with the MSI plates in place.


Nice!! So, both the VRM plate AND the VRAM plate? You didn't hit/bump the VRAM plate?


----------



## bigjdubb

Nothing touches. I meant to get some measurements while I had everything apart, but I will be taking it back apart (it currently has the air cooler back on) in the next couple of days once I get the rest of the loop buttoned up, so I can get some clearance measurements. I think the extruded cold plate of the EVGA will provide the necessary clearance for the mounting plate (with all the water tubes etc.) and the pump/block combo to remain above the RAM plate.


----------



## ikjadoon

Quote:


> Originally Posted by *bigjdubb*
> 
> Nothing touches. I meant to get some measurements while I had everything apart but I will be taking back apart (currently has the air cooler back on it) in the next couple of days once I get the rest of the loop buttoned up so I can get some clearance measurements. I think the extruded cold plate of the EVGA will provide the necessary clearance for the mounting plate (with all the water tubes etc..) and the pump/block combo to remain above the ram plate.


That's great to hear. I'd love that, and I'm sure the peeps in The Mod thread would love to hear it too. OK, right, right, that makes sense. Awesome, thank you!


----------



## Teufel9000

Quote:


> Originally Posted by *TheMiracle*
> 
> Guys,
> These OC's you are posting here, do you actually use for gaming or just for bench?


Both, but I usually tone it down for gaming to make sure it's 100% stable and I don't need super volts lol.

Running +100/+600 @ 50% fan and stock volts atm for gaming.

Benchmark-wise I'd turn up all the voltage and fans and go like +100/+800, but I bet most people run stable OCs.


----------



## mypickaxe

Quote:


> Originally Posted by *Teufel9000*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheMiracle*
> 
> Guys,
> These OC's you are posting here, do you actually use for gaming or just for bench?
> 
> 
> 
> both. but i usually tone it down for Gaming to make sure its 100% stable and i dont need super volts lol.
> 
> Running +100/+600 @ 50% fan and stock volts atm. for gaming.
> 
> Benchmark wise id turn on all the voltage and fans and go like +100/+800. but i bet most people run stable OCs.
Click to expand...

I don't see the point in running just a stable OC for a benchmark, if the point is to see how far the card can be pushed, or to try to get onto the leader boards.

To me it's like the difference between... and I hate to use a car analogy, but it's appropriate:

Tuning a car for a quarter mile race vs. tuning for a 200 mile race.

You can use the same car for both, and it can be tuned to go fast in either one, but longevity for the 200 mile race is invariably going to require substantial differences in the tuning.


----------



## bigjdubb

To me, a stable overclock for benchmarks is one that allows you to finish the benchmark. It doesn't have to finish consistently; as long as I can finish the benchmark a few times I am satisfied. Outside of benchmarking, stability matters a lot more, but I don't use any synthetic suite to gauge it. I just play games, wait for a crash, and adjust accordingly.

I don't see the point in using Firestrike, Furmark, or looped Unigine to gauge stability. If you run all of those for 48 hours straight without a crash, it just means your clocks are stable for those benchmarks. It tells you zero about your game stability.


----------



## oblivious

Would a 650 watt psu suffice for a single 1070?


----------



## bigjdubb

Quote:


> Originally Posted by *oblivious*
> 
> Would a 650 watt psu suffice for a single 1070?


Yes, as long as it has the required PCI-E power cables. Depending on the rest of your system, you could probably run two of them on a 650.


----------



## criminal

Quote:


> Originally Posted by *bigjdubb*
> 
> To me, a stable overclock for benchmarks is one that allows you to finish the benchmark. It doesn't have to finish it consistently but as long as I can finish the benchmark a few times I am satisfied. Outside of benchmarking stability matters a lot more but I don't use any synthetic suite to gauge stability, I just play the games and wait for a crash and adjust accordingly.
> 
> I don't see the point in using Firestrike, Furmark, or Unigine looped to gauge stability. If you run all of those for 48hours straight without a crash it just means your clocks for those benchmarks are stable. It tells you zero about your game stability.


Exactly.


----------



## mypickaxe

Quote:


> Originally Posted by *bigjdubb*
> 
> To me, a stable overclock for benchmarks is one that allows you to finish the benchmark. It doesn't have to finish it consistently but as long as I can finish the benchmark a few times I am satisfied. Outside of benchmarking stability matters a lot more but I don't use any synthetic suite to gauge stability, I just play the games and wait for a crash and adjust accordingly.
> 
> I don't see the point in using Firestrike, Furmark, or Unigine looped to gauge stability. If you run all of those for 48hours straight without a crash it just means your clocks for those benchmarks are stable. It tells you zero about your game stability.


That's what I said in the previous post, in different words.


----------



## oblivious

Quote:


> Originally Posted by *bigjdubb*
> 
> Yes, as long as it has the required PCI-E power cables. Depending on the rest of your system, you could probably run two of them on a 650.


I should have the PCIE cables. My current 7950 runs on two 8 pin connectors and it's a very good quality PSU. I just did not want to buy this and then find out i needed to upgrade my PSU as well.


----------



## bigjdubb

Quote:


> Originally Posted by *oblivious*
> 
> I should have the PCIE cables. My current 7950 runs on two 8 pin connectors and it's a very good quality PSU. I just did not want to buy this and then find out i needed to upgrade my PSU as well.


You should be good to go. Each generation of cards is bringing down the PSU requirements further and further.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *bigjdubb*
> 
> You should be good to go. Each generation of cards is bringing down the PSU requirements further and further.


my new gtx1070-SLI gets about +150% FPS (versus my previous gtx760-SLI cards), but only uses about 10% more power. lol









i could have very comfortably upgraded to gtx1080-SLI.


----------



## Forceman

Quote:


> Originally Posted by *bigjdubb*
> 
> To me, a stable overclock for benchmarks is one that allows you to finish the benchmark. It doesn't have to finish it consistently but as long as I can finish the benchmark a few times I am satisfied. Outside of benchmarking stability matters a lot more but I don't use any synthetic suite to gauge stability, I just play the games and wait for a crash and adjust accordingly.
> 
> I don't see the point in using Firestrike, Furmark, or Unigine looped to gauge stability. If you run all of those for 48hours straight without a crash it just means your clocks for those benchmarks are stable. It tells you zero about your game stability.


The only problem with that is when people don't make it clear which standard they are using when they post their clocks/scores. It can lead to inflated expectations of performance/overclocks if people aren't upfront about it.


----------



## iZeroFive

Yesterday i finally received my EVGA GTX1070 SC. I used to have a GTX970, but i sold it too early, not considering that stock might be a problem, so i spent 3 weeks with a GTX960 at 1440p and it wasn't the best experience. Anyway, i'm really happy with the new card: build quality is top notch, the card looks sexy, and it's super silent. The only major problem is that i can't reach my targeted OC at the moment. Before getting the card i read almost every review, so i thought exactly 2GHz boost / 9GHz mem would be OK for 24/7 use, but the card is not super stable at those speeds, e.g. it can complete Valley but crashes in the middle of Firestrike Extreme, and even when it does run stable i rarely see that 2GHz; it mostly runs in the 1950/1980MHz area.

So how are SC users doing in terms of overclock? Is a 2000MHz boost too much for this card at default voltage but 112% power target?


----------



## Ranguvar

1070 FTW back, managed to crack 6700 graphics score in Time Spy.



Time Spy: 6703
Firestrike Extreme: 10125
Firestrike: 21376
(Invalid because x70 FTW is unknown still)

Still disappointed that a manual curve at 2278MHz gives less performance.
If K-Boost forced the card to 1.08/1.09V instead of 1.05V, I could run that frequency with a normal offset overclock.

Waiting for BIOS editors, I guess!


----------



## Dude970

Great score


----------



## mypickaxe

Quote:


> Originally Posted by *iZeroFive*
> 
> Yesterday i finally received my EVGA GTX1070 SC. I used to have a GTX970, but i sold it too early, not considering that stock might be a problem, so i spent 3 weeks with a GTX960 at 1440p and it wasn't the best experience. Anyway, i'm really happy with the new card: build quality is top notch, the card looks sexy, and it's super silent. The only major problem is that i can't reach my targeted OC at the moment. Before getting the card i read almost every review, so i thought exactly 2GHz boost / 9GHz mem would be OK for 24/7 use, but the card is not super stable at those speeds, e.g. it can complete Valley but crashes in the middle of Firestrike Extreme, and even when it does run stable i rarely see that 2GHz; it mostly runs in the 1950/1980MHz area.
> 
> So how are SC users doing in terms of overclock? Is a 2000MHz boost too much for this card at default voltage but 112% power target?


EDIT: Well, never mind. It looked initially like you had a rear mounted 120mm radiator. My monitor is set too dark and that's a poor exposure. My bad.

Just a suggestion, but you might want to consider rotating your radiator so the ports are below the pump. If you leave it the way you have it, there's a possibility you will end up with air in the radiator, which is harder on the pump; failure may occur sooner rather than later.


----------



## TheLAWNOOB

Anybody mining with the 1070?


----------



## oblivious

Quote:


> Originally Posted by *bigjdubb*
> 
> You should be good to go. Each generation of cards is bringing down the PSU requirements further and further.


Thanks guys.. I was 95% sure it would be enough from building experience and using online PSU Calculators but i wanted some feedback from guys that already own 1070's.

Just a random question to throw out: are there any brands of Nvidia cards that need to be avoided? I've never owned an Nvidia card. From what i can tell, Zotac, EVGA and Asus are the top sellers.


----------



## TheLAWNOOB

My Zotac 1070 Amp is doing alright. The fan is very quiet. According to reviews it runs a bit hotter than other cards, but for me it runs at 75C with 70% fan at 180W and the fan is pretty quiet. Ambient is 30C.


----------



## BulletSponge

Highest temp I've seen on my MSI Gaming X is 56C so far. I can't speak about fan noise or coil whine though, my hearing is shot from the flight deck.
Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Anybody mining with the 1070?


This crossed my mind as well.


----------



## alex4069

Quote:


> Originally Posted by *Forceman*
> 
> The only problem with that is when people don't make it clear which standard they are using when they post their clocks/scores. It can lead to inflated expectations of performance/overclocks if people aren't upfront about it.


I just posted my highest Time Spy benchmark and that is after playing gears of war ultimate at 4k and everything set to highest.


----------



## alex4069

Quote:


> Originally Posted by *BulletSponge*
> 
> Highest temp I've seen on my MSI Gaming X is 56C so far. I can't speak about fan noise or coil whine though, my hearing is shot from the flight deck.
> This crossed my mind as well.


My highest temp is 59C after gaming and running Time Spy. The game was Gears of War Ultimate at 4k with settings maxed, core overclocked to 2076, memory at 2276, fan at 100%.


----------



## optimus002

Quote:


> Originally Posted by *GreedyMuffin*
> 
> http://www.3dmark.com/fs/9379421
> 
> My 1500/1989 results from 980Ti. Wondering if I should get rid of the 980Ti. (AKA slap it into an folding rig for 24/7 folding along with the others.) and get myself either a 1080 or 1070. Can someone post benchmarks to the firestrike Extreme test for me?
> 
> I would love if they're were overclocked pretty good. 24/7 stable OC and not benchstable.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers!


Wait for 1080Ti bare minimum.

Your score is showing that something isn't stable. At 1500 you should be right around the 10k gpu score mark, not barely beating mine @1455. Here's mine:

*1455/8000*
http://www.3dmark.com/fs/9401308

*1506/8000*
http://www.3dmark.com/fs/9399702


----------



## alex4069

Quote:


> Originally Posted by *optimus002*
> 
> Wait for 1080Ti bare minimum.
> 
> Your score is showing that something isn't stable. At 1500 you should be right around the 10k gpu score mark, not barely beating mine @1455. Here's mine:
> 
> *1455/8000*
> http://www.3dmark.com/fs/9401308
> 
> *1506/8000*
> http://www.3dmark.com/fs/9399702


Here is mine on a GTX 1070 AMP with an i5 4690k: http://www.3dmark.com/3dm/13410105
GPU clock is 2063, memory clock 2276. My GPU score is a little higher than the 980ti's, but my physics score holds me down.


----------



## pez

Quote:


> Originally Posted by *oblivious*
> 
> Would a 650 watt psu suffice for a single 1070?


Quote:


> Originally Posted by *oblivious*
> 
> I should have the PCIE cables. My current 7950 runs on two 8 pin connectors and it's a very good quality PSU. I just did not want to buy this and then find out i needed to upgrade my PSU as well.


Quote:


> Originally Posted by *bigjdubb*
> 
> You should be good to go. Each generation of cards is bringing down the PSU requirements further and further.


I think my system with 2 x 1080s is pushing 550W at full tilt and that's a wall reading WITH my monitor, amp, and DAC included (i.e. about 70w or so between the 3). I bought the x1250 back when I planned on doing crazy things with my 780 to eventually go SLI, but that plan was cut short with some bad luck







. You've got a solid unit in your sig, so you shouldn't have an issue at all.

Quote:


> Originally Posted by *oblivious*
> 
> Thanks guys.. I was 95% sure it would be enough from building experience and using online PSU Calculators but i wanted some feedback from guys that already own 1070's.
> 
> Just a random question to throw out: are there any brands of Nvidia cards that need to be avoided? I've never owned an Nvidia card. From what i can tell, Zotac, EVGA and Asus are the top sellers.


I actually don't think there's any brand to really avoid. However, people generally loathe ASUS RMA support. Gigabyte seems fine; they're holding up a decent service level RMA'ing my mouse. EVGA has been good to me the couple of times I had to use them. That's about it as far as GPUs (thankfully).


----------



## Amph

is there a way to know how much wattage a gpu is pulling from the pcie slot of the MB?


----------



## luan87us

Quote:


> Originally Posted by *Amph*
> 
> is there a way to know how much wattage a gpu is pulling from the pcie slot of the MB?


HWMonitor


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *luan87us*
> 
> HWMonitor


HWMonitor?









i just got to ask a question....









i switched recently from *"CPUz's HWM"*... http://www.cpuid.com/softwares/hwmonitor.html

now i'm using *"Open Hardware Monitor"* and LUV it... http://openhardwaremonitor.org/

*But* the only prob with *"Open Hardware Monitor"* is that it lacks a PSU (watts) reading... i luv everything else- it's much better than "CPUz's HWM"... (see pic)

*anyone know what to use to complement "Open Hardware Monitor" so i can quickly see watts used by the PSU (Current/Min/Max)?*

Tanks!


----------



## Blze001

Quote:


> Originally Posted by *oblivious*
> 
> Would a 650 watt psu suffice for a single 1070?


I'm running mine with an overclocked i5 on a 450w PSU, so 650w is probably fine.
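If anyone wants a rough sanity check, here's a quick back-of-the-envelope sketch in Python. The per-component wattages are ballpark assumptions (not measurements of any real system), so treat the output as illustrative only:

```python
# Back-of-the-envelope PSU headroom check. All wattages below are rough
# assumptions: ~150 W board power for a GTX 1070, ~100 W for an overclocked i5,
# ~75 W for motherboard, RAM, drives and fans.
PSU_WATTS = 650

components = {
    "GTX 1070 (board power)": 150,
    "Overclocked i5": 100,
    "Motherboard, RAM, drives, fans": 75,
}

total = sum(components.values())
headroom = PSU_WATTS - total
print(f"Estimated draw: {total} W, headroom on a {PSU_WATTS} W unit: {headroom} W")
```

Even doubling the GPU line for SLI still leaves a healthy margin on a 650 W unit, which matches what people are reporting above.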


----------



## 42 degree angle

Hello 1070 owners! I think this thread is the best one to ask this in, but do guide me elsewhere if needed. So:

What is actually 'the best' aftermarket 1070 to buy at this time? I've never had a PC and I'm gathering up a build, this being the last part. I've been eyeballing the Gigabyte G1 because I'm going with a blackout theme for my build, but then I stumbled across reports of coil whine. Are they true? What are the downsides of each 1070, which should be avoided and which should be favoured?

To give an idea of pricing, the G1 stands at 489€, the MSI at 499€ and the Asus at 529€. I'd love to go with the MSI one, but I'd also love it to match my build, so how would one make the red part of the shroud black without voiding the warranty, i.e. a non-permanent solution?


----------



## oblivious

Thanks everyone for the replies... Now i must ask.. Is anyone running dual 1440p monitors on a single 1070? Right now i have a Crossover 27Q but i want to add another 1440p monitor to run vertically and an extended desktop.


----------



## Shaitan

Quote:


> Originally Posted by *oblivious*
> 
> Thanks everyone for the replies... Now i must ask.. Is anyone running dual 1440p monitors on a single 1070? Right now i have a Crossover 27Q but i want to add another 1440p monitor to run vertically and an extended desktop.


I'm running an Acer XB271HU and Dell U2715H off my MSI 1070 Gaming X. They are running in extended desktop with no issues.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *42 degree angle*
> 
> Hello 1070 owners! I think that this thread is the best one to ask this, but do guide me to somewhere else if needed. So;
> 
> What is actually 'the best' aftermarket 1070 to buy at this time and age? I've never had a PC and I'm gathering up a build, this being the last part. I've been eyeballing the Gigabyte G1 because I'm going to go with a blackout theme with my build, but then I stumbled across coil whine accusations. Is this true? What are the downsides of each 1070, which should be avoided and which should be favoured?
> 
> To give an image about pricing, the G1 stands at 489€, MSI stands at 499€ and Asus at 529€. Also I'd love to go with the MSI one, but I'd love it to match my build, so how would one make the red part of the shroud black without voiding warranty, a non-permanent solution?


All aftermarket 1070s are great. But features and cost do matter, and maybe one other thing should be considered...

Something i have wondered about recently: the possibility of BIOS flashing in the near future may be a big factor. Here's how; i'm no authority on custom-made BIOSes and flashing, but *if* custom BIOSes for the 1070 are eventually created, one particular aftermarket 1070 may turn out to be the best one to have (to flash to). Although, maybe there's no need to worry. In recent years a single factor, like one versus two six- or eight-pin PCI-e power plugs, could be the difference. But maybe power is not a factor here because they all have potential. Hopefully someone who is an authority on custom BIOSes can comment and clear up the things i mentioned here.

on the other hand, i could testify why the ASUS Strix OC 1070 is best. Only because i got two







and also because of the extras:
1. it has two four-pin fan headers, which makes it very unique... (connected fans react to the GPU temp)
2. the HSF on it ROCKS and is identical to the Strix 1080's HSF...
3. the manufacturing process is unique (100% robot produced; no flux is a cool thing to note.)

GL


----------



## ikjadoon

Quote:


> Originally Posted by *Forceman*
> 
> The only problem with that is when people don't make it clear that us the standard they are using when they post their clocks/scores. It can lead to inflated expectations of performance/overclocks if people aren't upfront about it.


This is the bane of my existence. I didn't know people posted benches/clocks that were not game stable. It completely threw me for a loop when I was overclocking my GTX 770.

This thread, sadly, should've been like the Skylake/Haswell OC'ing guides, where final clocks (not offsets) & stability tests were compiled and organized in the OP. Alas, maybe Darkwizze is a unicorn, haha.

BTW, the GTX 1060 looks just the same: ~2 to 2.1GHz maximum core clocks, ~9 to 9.4GHz maximum memory clocks.


----------



## Mudfrog

Well I just ordered a 1070. I opted for the MSI Gaming but I almost pulled the trigger on the Gigabyte G1. I was anticipating that the MSI's larger dual fans would be quieter and perhaps cool better than the Gigabyte's. I imagine there is not much difference though. My 670's are showing their age, especially since a lot of games no longer support SLI. Can't wait to try it out. Should be here next Wednesday.


----------



## luan87us

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> All aftermarket 1070s are great. But features and cost do matter and maybe one other thing should be considered...
> 
> You have to- i have wondered recently- of the possibility of BIOS flashing in the near future may/ could be a big factor. Here's how; i'm no authority on custom made BIOS and flashing, but *if* after some custom BIOS for 1070 are finally created; one single aftermarket 1070 may be the best one to have (flash to). Although, maybe there's no need to worry. In recent years a single factor like 1 versus 2, six and or eight pin PCI-e power plugs, could be a difference/ factor. Or, many single things might have mattered in the past. But maybe power is not a factor because they all have potential. Hopefully someone that is an all around authority on custom BIOS can comment and clear up things i mentioned here.
> 
> on other hand, i could testify why ASUS Strix OC 1070 is best. Only because i got two
> 
> 
> 
> 
> 
> 
> 
> and also because of the extras:
> 1. it has two four pin fan connects that makes it very unique... (fans connected react to the GPU temp)
> 2. the HSF on it ROCKS and it is identical to the Strix 1080's HSF...
> 3. the manufacturing process is unique (100% robot produced and no flux is a cool thing to note.)
> 
> GL


Nice to see another happy Strix 1070 owner. I have the same card and I love it. It rocked GTA5 on all Very High settings at 180FPS lol.


----------



## Blackfyre

Quote:


> Originally Posted by *Mudfrog*
> 
> Well I just ordered a 1070. I opted for the MSI Gaming but I almost pulled the trigger on the Gigabyte G1. I was anticipating that the MSI's dual larger fans should be quieter and perhaps cool better than the Gigabyte. I imagine there is not much difference though. My 670's are showing their age, especially since a lot of games no longer support SLI. Can't wait to try it out. Should be here next Wednesday.


I had a Windforce cooler on my previous card; the MSI Gaming X I got is definitely quieter, that's for sure. And my temperatures are brilliant too.

So I'm 90% sure it's both quieter and cooler.


----------



## oblivious

If a 1060 can OC to around 2ghz then why should I opt for a 1070 instead of just saving myself some money... I guess I'm really asking what the real differences are between the 1060 and the 1070 that warrant spending the extra money.


----------



## Blackfyre

Quote:


> Originally Posted by *oblivious*
> 
> If a 1060 can OC to around 2ghz then why should I opt for a 1070 instead of just saving myself some money... I guess I'm really asking what the real differences are between the 1060 and the 1070 that warrant spending the extra money.


*Well the 1070 clocks at similar speeds to the 1080 but doesn't perform the same. And here's the best way to explain it:
*
The *GTX 1080* has 2560 CUDA Cores
The *GTX 1070* has 1920 CUDA Cores
The *GTX 1060* has 1280 CUDA Cores

So even if all 3 GPUs are clocked at exactly *2000MHz Core Speed* with the exact same memory speed, there's a big gap in performance between all three. Well, big relatively speaking.

Oh and don't forget the *GTX 1080* has the advantage of not only having the most *CUDA Cores*, but also *GDDR5X* Memory instead of *GDDR5* like the *GTX 1070* & *1060*.
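To put rough numbers on the core-count point above, here's a small sketch. This is just shader math (cores x clock), ignoring memory bandwidth and boost behavior; the 2000 MHz figure is the hypothetical equal-clock scenario from the post, not a measured speed:

```python
# Theoretical shader throughput at an identical hypothetical 2000 MHz clock.
# CUDA core counts are the official ones quoted above; GFLOPS assumes
# 2 FLOPs/cycle/core (one fused multiply-add).
CARDS = {"GTX 1080": 2560, "GTX 1070": 1920, "GTX 1060": 1280}
CLOCK_MHZ = 2000

def relative_throughput(cores, baseline_cores=1280):
    """Throughput relative to the GTX 1060 at the same clock."""
    return cores / baseline_cores

for name, cores in CARDS.items():
    gflops = 2 * cores * CLOCK_MHZ / 1000
    print(f"{name}: {gflops:.0f} GFLOPS, {relative_throughput(cores):.2f}x a 1060")
```

So at equal clocks a 1070 has 1.5x the raw shader throughput of a 1060, and the 1080 has 2x; real-game gaps shift from those ratios depending on memory and resolution.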


----------



## Amph

Quote:


> Originally Posted by *luan87us*
> 
> HWMonitor


how? it just tells me the total wattage, not the PCIe alone; i mean the pcie slot on the MB, not the pcie connector


----------



## luan87us

Quote:


> Originally Posted by *Amph*
> 
> how? it just tells me the total wattage, not the PCIe alone; i mean the pcie slot on the MB, not the pcie connector


Sorry, I misread your question. I thought you meant how many watts your system is using in total lol.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *luan87us*
> 
> Nice to see another happy Strix 1070 owner. I have the same card and I love it. it rocked GTA5 on all Very High settings with 180FPS lol.


cool.










GTAV sucked on my old cards. can't wait to try on Strix 1070.

...posting initial FireStrikes results in a few min....


----------



## whicker

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> cool.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GTAV sucked on my old cards. can't wait to try on Strix 1070.
> 
> ...posting initial FireStrikes results in a few min....


Can you post your Freq graph and voltage graph as well? (from MSI AB or w.e)


----------



## Tasm

Quote:


> Originally Posted by *lifeisshort117*
> 
> I can't agree with you more!
> 
> The hardware is hard locked at 1.25v. Okay fine. Anything between 1.093v or 1.088v and 1.25v should theoretically be possible. Someone please mod the bios. Please.
> 
> So many assumptions have been made that this architecture doesn't do well with higher voltages. Why is that? The cards that we're being tested that made those assumptions we're the Founder's Editions. The cards we're rocking have much cleaner power delivery. I absolutely refuse to believe that'll be the case with our AIB cards.
> 
> It's the same idea with overclocking a proc on a lower end motherboard and a motherboard meant for overclocking. of COURSE you'll get a nicer overclock with the motherboard that is meant for overclocking.


I just wanted a little more voltage. If you can go up to 1.15V, that's far from the limit and could probably make some chips pass.
Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> HWMonitor?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i just got to ask a question....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i switched recently from *"CPUz's HWM"*... http://www.cpuid.com/softwares/hwmonitor.html
> 
> now i'm using *"Open Hardware Monitor"* and LUV it... http://openhardwaremonitor.org/
> 
> *But* only one prob with *"Open Hardware Monitor"* is, that it lacks a PSU (watts)... i luv everything else is much better than "CPUz's HWM"... (see pic)
> 
> *anyone know what to use to complement "Open Hardware Monitor" so i can quickly see watts used by PSU (Current/Min/Max)?*
> 
> Tanks!


Make sure to use HWiNFO.

It's the best you can use.


----------



## davidh93

My MSI 1070 Aero with an EK block. I was able to hit a stable max overclock of 2166 on the core and +500 on the mem, all under 45C. I downgraded from an MSI 1080 X.


----------



## Yetyhunter

Quote:


> Originally Posted by *Yetyhunter*
> 
> Has anyone tried the new G1 BIOS ? I'm not even sure if it's for my card .
> 
> http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios


----------



## Bee Dee 3 Dee

Got delayed installing Cards Yesterday... had to abort... Thunderstorms and Lightning all day from 4 AM to 4 PM... not good for upgrading hardware... lol

GOT 'er done just now...

everything done and working AOK and is SWEET!!! as can be.









____________________________________________
____________________________________________
____________________________________________
____________________________________________
____________________________________________

*Initial FireStrike and FireStrike Extreme Results and comparison to old cards...*

Old Cards: GTX 760(2x)
New Cards: GTX 1070(2x)

*Fire Strike Results:*
Old Cards: *GTX 760(2x)*
*Graphics Score
11,561*

http://www.3dmark.com/fs/9319709

New Cards: *GTX 1070(2x)*
*Graphics Score
27,285*

http://www.3dmark.com/fs/9410011

____________________________________________
____________________________________________
____________________________________________
____________________________________________
____________________________________________

*Fire Strike Extreme Results:*
Old Cards: *GTX 760(2x)*
*Graphics Score
5,606*

http://www.3dmark.com/fs/9321130

New Cards: *GTX 1070(2x)*
*Graphics Score
15,247*

http://www.3dmark.com/fs/9410175

____________________________________________
____________________________________________
____________________________________________
____________________________________________
____________________________________________

200% to 300% Increase on Graphics Score was what i was hoping for.
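(A quick check of the posted numbers, since "x% increase" and "x times" get mixed up a lot; this just recomputes the ratios from the scores above:)

```python
# Graphics-score scaling from the results above: GTX 760 SLI vs GTX 1070 SLI.
scores = {
    "Fire Strike":         (11561, 27285),  # (old score, new score)
    "Fire Strike Extreme": (5606, 15247),
}

for bench, (old, new) in scores.items():
    ratio = new / old
    print(f"{bench}: {ratio:.2f}x the old score ({(ratio - 1) * 100:.0f}% increase)")
```

That works out to roughly 2.4x and 2.7x the old graphics scores, so right in the 2x-to-3x range.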









*Question:
* What Graphics Scores have ppl gotten with new PCs on GTX1070-SLI?
(*sry to ask but i'm burned out after 4 hours non-stop working, missed lunch, and headed for food. Oh, and my google broke







... lol*) the four week wait for cards was worth it, too! soooo happy.









Tanks!









(Valley comparison next.)


----------



## marik123

Are all the 1070s voltage locked at 1.05v? I tried to increase the GPU voltage on my 1070 Strix, but it still shows 1.05v under load. The other thing is that when I'm running Heaven 4.0, boost will show 2100mhz, then 1 minute later 2082, then 2075, then all the way down to 2062mhz. Is this normal even though my GPU temperature never exceeds 65C, and I have the power target at 112%?


----------



## oblivious

Quote:


> Originally Posted by *Blackfyre*
> 
> *Well the 1070 clocks at similar speeds to the 1080 but doesn't perform the same. And here's the best way to explain it:
> *
> The *GTX 1080* has 2560 CUDA Cores
> The *GTX 1070* has 1920 CUDA Cores
> The *GTX 1060* has 1280 CUDA Cores
> 
> So even if all 3 GPUs are clocked at exactly *2000MHz Core Speed* with the exact same memory speed, there's a big gap in performance between all three. Well, big relatively speaking.
> 
> Oh and don't forget the *GTX 1080* has the advantage of not only having the most *CUDA Cores*, but also *GDDR5X* Memory instead of *GDDR5* like the *GTX 1070* & *1060*.


So between the 1070 and the 1060, the price difference is essentially the CUDA core difference.


----------



## ikjadoon

Quote:


> Originally Posted by *marik123*
> 
> Are all the 1070s voltage locked at 1.05v? I tried to increase the GPU voltage on my 1070 Strix, but it still shows 1.05v under load. The other thing is that when I'm running Heaven 4.0, boost will show 2100mhz, then 1 minute later 2082, then 2075, then all the way down to 2062mhz. Is this normal even though my GPU temperature never exceeds 65C, and I have the power target at 112%?


Correct me if I'm wrong, but I have heard that these GPUs reach their maximum core clocks at 50C and under. Someone said that above 50C they start decreasing by 10MHz or so per step. I've heard this in a few places, but I haven't seen it replicated or tested.

Anandtech said their full GTX 1070 review will be up tomorrow, so hopefully they test this. It might be planned by NVIDIA, as they mentioned that GPU Boost 3.0 has specific water-cooling enhancements.
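The behavior people are describing can be modeled as a simple step function. To be clear, this is NOT NVIDIA's actual GPU Boost 3.0 algorithm; the 50C threshold, 5C step and 13MHz bin below are guesses pieced together from forum reports, purely to show the shape of the curve:

```python
# Toy model of temperature-stepped boost clocks: drop one ~13 MHz bin for every
# ~5 C step above an assumed 50 C threshold. All constants are assumptions.
MAX_BOOST_MHZ = 2100
BIN_MHZ = 13
TEMP_THRESHOLD_C = 50
STEP_C = 5

def boosted_clock(temp_c):
    """Estimated boost clock (MHz) at a given GPU temperature (C)."""
    if temp_c <= TEMP_THRESHOLD_C:
        return MAX_BOOST_MHZ
    steps = (temp_c - TEMP_THRESHOLD_C + STEP_C - 1) // STEP_C  # round up
    return MAX_BOOST_MHZ - steps * BIN_MHZ

for t in (45, 55, 60, 65):
    print(f"{t} C -> ~{boosted_clock(t)} MHz")
```

That reproduces the 2100 -> ~2080 -> ~2060 staircase marik123 described, at least qualitatively.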


----------



## Swolern

Quote:


> Originally Posted by *Mad Pistol*
> 
> True, and I hate that. I think Nvidia could have done much better on the value front if they had lowered the MSRP for the 1070 even by $30. It is what it is, though.
> 
> Honestly, my GTX 1070 SLI setup is beastly. I will not require anything better for literally years.


Agreed! I just got my 1070 Strix SLI + Asus Predator 3440x1440 100hz Gsync monitor and it has been a gaming dream! It plays everything completely maxed out at near 100fps. In most games I have plenty of room to increase resolution scaling. Can't wait for Battlefield One!!!

Btw MadPistol, there was no HB bridge to be found quickly when I needed one, so I used 2 regular flex SLI bridges and it works perfectly; it looks to be the same performance as the HB bridge. Extremely butter smooth in all games! My buddy told me there would be an issue if you didn't use exactly matching single bridges.


----------



## Mr-Dark

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> Got delayed installing the cards yesterday... had to abort... thunderstorms and lightning all day from 4 AM to 4 PM... not good for upgrading hardware... lol
> 
> GOT 'er done just now...
> 
> Everything done and working AOK and is SWEET!!! as can be.
> 
> *Initial Fire Strike and Fire Strike Extreme results and comparison to old cards...*
> 
> Old cards: GTX 760 (2x)
> New cards: GTX 1070 (2x)
> 
> *Fire Strike results:*
> Old cards: *GTX 760 (2x)*
> *Graphics Score: 11,561*
> 
> http://www.3dmark.com/fs/9319709
> 
> New cards: *GTX 1070 (2x)*
> *Graphics Score: 27,285*
> 
> http://www.3dmark.com/fs/9410011
> 
> *Fire Strike Extreme results:*
> Old cards: *GTX 760 (2x)*
> *Graphics Score: 5,606*
> 
> http://www.3dmark.com/fs/9321130
> 
> New cards: *GTX 1070 (2x)*
> *Graphics Score: 15,247*
> 
> http://www.3dmark.com/fs/9410175
> 
> A 200% to 300% increase in Graphics Score was what I was hoping for.
> 
> *Question:* What Graphics Scores have people gotten with new PCs on GTX 1070 SLI?
> (Sorry to ask, but I'm burned out after 4 hours of non-stop work, missed lunch, and am headed for food. Oh, and my Google broke... lol.) The four-week wait for the cards was worth it, too! Soooo happy.
> 
> Thanks!
> 
> (Valley comparison next.)


Hello,

I think you should upgrade that CPU... it's holding you back.

27k graphics score is very low for 1070 SLI in Fire Strike. Here are my Ti's:

http://www.3dmark.com/3dm/13430436?


----------



## Malinkadink

Ordered a 1070 G1 for $435 and sold my 970 for $200, so it's a $235 upgrade for close to double the 970's performance. Should have it tomorrow


----------



## prey1337

Quote:


> Originally Posted by *Mr-Dark*
> 
> Hello
> 
> I think you should upgrade that CPU... it's holding you back.
> 
> 27k graphics score is very low for 1070 SLI in Fire Strike. Here are my Ti's:
> 
> http://www.3dmark.com/3dm/13430436?


It's interesting how much older CPUs tank overall Fire Strike scores.

I'm still shopping for a Xeon to replace my i7 to get my scores up.


----------



## BulletSponge

Quote:


> Originally Posted by *Malinkadink*
> 
> Ordered a 1070 G1 for $435. Sold my 970 for $200, $235 upgrade for close to double my 970s performance, should have it tomorrow


Very nice, I sold an old MacBook Pro and a GTX 760 Monday on Amazon for $432. Payday is tomorrow and the new monitor itch is strong. My QNIX at 96Hz just doesn't quite cut it now.


----------



## jlhawn

Quote:


> Originally Posted by *marik123*
> 
> Are all the 1070 voltage locked at 1.05v? I tried to increase my GPU voltage on my 1070 strix, still show as 1.05v during load. The other thing is when I'm running Heaven 4.0, boost will show 2100mhz, then 1 minute later, 2082, then 2075, then all the way down to 2062mhz. Is this normal even though my GPU temperature never exceed 65C, plus I have power target to 112%?


Mine reads 1.092 V in GPU-Z under full load, with a boost of 1987 MHz out of the box.
Did you overclock yours to get it over 2000 MHz, or is that stock out of the box? Either way, that's a nice core clock.


----------



## ogow89

Sent my bad-overclocking GTX 1070 Gaming X back and ordered a Palit Super Jetstream. Good move, right? The Super Jetstream comes with 25+ MHz more on the core than the OC mode of the MSI card. I am hoping for a better chip this time around. I wanted to get a Golden Sample Phoenix from Gainward, but those are out of stock and the RGB LED can't be modified.

Man, I liked the look of the MSI card; it was a beefy, solid build. Sucks for me that it had a terrible chip in it that wouldn't go over 1950MHz on the core, overclocked or not.


----------



## luan87us

Quote:


> Originally Posted by *Mr-Dark*
> 
> Hello
> 
> I think you should upgrade that CPU... it's holding you back.
> 
> 27k graphics score is very low for 1070 SLI in Fire Strike. Here are my Ti's:
> 
> http://www.3dmark.com/3dm/13430436?


Wow, yeah, his overall score definitely stinks. I am glad I built a whole new Skylake rig to pair with my 1070.


----------



## GreedyMuffin

Ended up ordering an ASUS 1080 Strix. Cooler and quieter compared to the MSI Gaming X.

Thanks for all of y'all's help!


----------



## blueballs

Hey guys,

I'm having a hard time deciding which GTX 1070 to grab.

I'd like to flash a custom BIOS and put that puppy under water in the near future and try to get the max performance out of it.

I don't really have a budget, but I want good bang for the buck, and right now I am between 2 GTX 1070 models:

EVGA FTW at $609.99 CAD
ASUS Turbo at $539.99 CAD

I know the ASUS won't really satisfy my "needs", but damn, it has a $70 CAD discount .-.


----------



## Bdonedge

If you're going under water why not just get the cheapest one? They all OC the same AFAIK


----------



## blueballs

Quote:


> Originally Posted by *Bdonedge*
> 
> If you're going under water why not just get the cheapest one? They all OC the same AFAIK


----------



## prey1337

Quote:


> Originally Posted by *blueballs*


What? He's pretty much right.

Definitely go with the cheaper one, so the Turbo. I doubt you'll be able to squeeze out that much more from the FTW to justify its initial cost.


----------



## 42 degree angle

Has anyone here gone with the Palit-brand 1070s? I've been looking into them but can't seem to find many reviews or benchmarks. Is Palit a reputable brand or should I forget it?

I have my eye on the Palit Super Jetstream, priced at 479€, which is the cheapest aftermarket card with a non-blower cooler. It looks super slick, has a gigantic cooler, and from what I've read in 980 Ti etc. reviews it seems pretty much on par with the "mainstream" brands. Anyone have anything to say?

E: Or the Gainward cards, which seem fairly similar? The Phoenix "Golden Sample" is also priced at 479€ and sports pretty much the same specs. Should I go with one of these, as they'd suit my aesthetic theme, or go with a 20€ more expensive card from the mainstream brands?


----------



## chrcoluk

Palit is actually the biggest brand in the world by volume; they ship and sell a lot more cards than ASUS and MSI. They mainly sell in Asia, which is why they're not so well known.

I have a Palit 1070 GameRock Premium, and it's a great card. The cooler is a beast, making it a 2.5-slot card, with very quiet fans.

They also bin their chips.


----------



## ogow89

Quote:


> Originally Posted by *prey1337*
> 
> What? He's pretty much right.
> 
> Definitely go with the cheaper one, so the Turbo. I doubt you'll be able to squeeze out that much more from the FTW to justify its initial cost.


The overclocking potential pretty much comes down to luck. I had the MSI GTX 1070 Gaming X and I can tell you that card wouldn't overclock at all on the core. As a matter of fact, aside from the OC mode the card came with, no additional overclocking was possible on the core, no matter how high the power limit or the vcore. Going with an even cheaper card could also land him one that can't even pass 1900MHz.
Quote:


> Originally Posted by *42 degree angle*
> 
> Anyone here went with the Palit brand 1070s? I've been looking into them but can't seem to find any reviews or benchmarks really. Is Palit a reputable brand or should I forget it?
> 
> I have my eyes on the Palit Super Jetstream, priced at 479€, being the cheapest aftermarket card with non-blower cooler. It looks super slick, has a gigantic cooler and what I read from 980ti etc. reviews it seems like pretty much the same as "mainstream" brands. Anyone have anything to say?


Palit is actually a bigger brand than MSI and ASUS, just less advertised. As for their GPUs, they are top notch. As a matter of fact, their Pascal cards are among the fastest, coolest and quietest of all the brands. Oh, and they come with a dual BIOS, so in case of a flash failure the card remains functional. An added safety net, and something that could come in handy for modders.

Here is a review of the Palit GameRock:
https://www.computerbase.de/2016-06/palit-geforce-gtx-1070-gamerock-test/

I just sent my MSI Gaming X back and am replacing it with the GPU you are looking to buy. If you can wait till Friday, I might be able to help you out with impressions. If you go for the MSI one, which in my opinion actually looks physically pleasing, make sure that the black metal next to the outputs isn't in your way to the screws; I had a hard time tightening them.


----------



## 42 degree angle

Quote:


> Originally Posted by *ogow89*
> 
> Palit is actually a bigger brand than msi and asus, just less advertised. As for their gpu, they are top notch. As a matter of fact, their pascal cards, are the fastest, coolest and quietest out of all brands. Oh and it comes with a dual bios, in case of a failure, the card would remain functional. Added security level
> 
> Something that could come in handy for modders.
> 
> Here is a review for the palit gamerock one,
> https://www.computerbase.de/2016-06/palit-geforce-gtx-1070-gamerock-test/
> 
> I just sent my msi gamer x back and am replacing it with that gpu you are looking to buy.


Okay, so what I'm gathering here is that Palit is actually a BIG manufacturer with a great reputation and products, but it gets less marketing because it's more geared toward Asian markets, yeah? If I had to guess I'd say it's not readily available in the US, leading to the small number of reviews.

So the Palit Super Jetstream 1070 should be as good as, if not better than, any other 1070 from, say, ASUS, MSI, Gigabyte and so on? I like the aesthetics and it'll fit my build the best. From the info I managed to gather it has a hefty out-of-the-box OC but is capable of 2GHz+ like the others, disregarding the silicon lottery ofc.

E: What are the differences between the GameRock and Super Jetstream besides colors? They seem like the same card spec-wise.


----------



## ogow89

Quote:


> Originally Posted by *42 degree angle*
> 
> Okay so what I'm gathering here is that Palit is actually a BIG manufacturer with great reputation and products, but it gets less marketing because it's more guided to Asian markets yeah? If I had to guess I'd say it's not readily available in the US leading to small amount of reviews.
> 
> So the Palit Super Jetstream 1070 should be as good, if not better, than any other 1070 from say Asus, MSI, Gigabyte and so on? I like the aesthetics and it'll fit my build the best. From the info I managed to gather it has a hefty out-of-the-box OC but is capable of +2MHz like the others, disregarding silicon lottery ofc.
> 
> E: What are the differences between GameRock and Super Jetstream besides colors? They seem like the same card all around spec wise


The Palit cards actually have the highest OC out of the box, maybe aside from the ASUS Strix. Even higher than the water-cooled MSI one.

The Super Jetstream is clocked higher than the GameRock, supposedly has 2 more VRM phases, and is designed for OCing. However, there is also the GameRock Premium, which has the highest out-of-the-box OC of any GTX 1070 from any brand. So get the Super Jetstream, unless you like the blue color the GameRock offers.


----------



## ikjadoon

So, the excitement begins. Anandtech is releasing their 30+ page GTX 1080/1070 review tomorrow. I poked Ryan Smith (Editor in Chief at Anandtech) in the comments of this hard drive review and he CONFIRMED OUR SUSPICIONS.

GPU Boost 3.0 actually starts reducing your maximum GPU boost *at possibly as low as 39C*! He was kind enough to tease an image from tomorrow's upcoming review.

So... all the more reason to water-cool? Haha, well, if you care about ~70MHz, that is.


----------



## 42 degree angle

Quote:


> Originally Posted by *ogow89*
> 
> The palit cards have the highest oc out of the box actually maybe aside of the asus strix. Even higher than the water cooled msi one.
> 
> The super jetstream is clocked higher than the gamerock, and it supposedly has 2 vrms more and is designed for oc's. However, you have also the premium gamerock, which is the highest out of the box oc'd gtx 1070 of all brands. So get the super jetstream, unless you like the blue color the gamerock offers.


Thanks mate for all the input, appreciated! I'll be grabbing the Super Jetstream then, prefer it aesthetically over the GameRock personally. If you end up going with the SJS yourself be sure to share your thoughts via PM or by quoting onto this thread!


----------



## 113802

Pretty sure my 6700K is throttling due to the heat in my room. Waiting for my window to get fixed so I can use the AC again.

GTX 1070

http://www.3dmark.com/3dm/13436128?

GTX 980 Ti

http://www.3dmark.com/fs/6803047


----------



## iZeroFive

I don't really understand this new GPU Boost 3.0 mumbo jumbo, and I'm getting frustrated because it is so damn unstable. Back in the day with my good old GTX 970, if you could hit 1500MHz without any errors, that was it; there were no sudden 50MHz drops and no "keep me at 50C or I'm not going to boost myself up" type of ****.

As an EVGA 1070 SC owner, I set a custom fan curve and then overclocked as follows:

+60 core voltage, +50MHz GPU, +500 memory, 112% power target.

Then I increased the GPU offset from +50 to +60MHz. GPU-Z told me I could reach a higher boost, as you'd expect, but I got a lower graphics score in Time Spy than I did with the +50MHz offset. In that test the card hit a max of 71C, but the highest boost clock GPU-Z read was 1987MHz and the highest VDDC was 1.0620V.

How are other 1070 SC users doing? Any suggestions for hitting at least a stable 2000MHz? I don't expect anything higher than that at the moment.


----------



## Bee Dee 3 Dee

Just started Doom... played 20 minutes...

Doom is awesome!

Can't believe the game feels like its predecessor (Doom 3) yet is improved.

The GTX 1070 ROCKS in it! Soooo glad I got two.


----------



## Yungbenny911

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> GOT 'er done just now...
> 
> Fire Strike Graphics Score: GTX 760 (2x): 11,561 / GTX 1070 (2x): 27,285
> Fire Strike Extreme Graphics Score: GTX 760 (2x): 5,606 / GTX 1070 (2x): 15,247
> 
> *Question:* What Graphics Scores have ppl gotten with new PCs on GTX1070-SLI?


You should definitely invest in a new processor. That's a horrible GPU score, just 200 points higher than my previous 970 SLI setup...

http://www.3dmark.com/compare/fs/9410011/fs/4659651


----------



## Dreamliner

whicker mentioned he wasn't able to achieve the OC'd Strix speeds on his regular Strix. Is it possible manufacturers are binning?

I've been seeing some blower models pop up under $400...very tempting.


----------



## Oj010

Quote:


> Originally Posted by *iZeroFive*
> 
> I don't really understand this new GPU Boost 3.0 mambo jambo and i'm getting frustrated because it is so god damn unstable.Back in day with my good old GTX970 if you can hit 1500mhz without any error that's it there was no sudden 50mhz drops or there was no "keep me at 50celcius or i'm not going to boost myself up" type of ****.
> 
> As an EVGA 1070 SC owner i set custom fan curve then i overclocked as this:
> 
> +60 core v , +50mhz gpu , +500 mem , %112 power target.
> 
> Then i increased gpu +50 to +60mhz,gpu-z told me that i can reach higher boost as you can expect but i got lower graphic score in Timespy then i got with +50mhz increase.At that test card hit max 71celcius but the highest boost clock that gpu-z read was 1987mhz and highest vddc 1.0620v.
> 
> How is other 1070 SC users doing any suggestion for hitting at least stable 2000mhz,i don't expect anything higher then that at the moment.


I think you just didn't notice it before; GPU Boost has always behaved that way when you're close to the limits. Hell, just last week I spent several hours closely watching the frequencies on a GTX 980 (non-Ti) bouncing around in GPUPI 1B. I did eventually manage a run through at 1,700 MHz, but not before about 200 runs of it bouncing down as low as 1,640.


----------



## Swolern

Cheapest GTX 1070 I have seen. http://slickdeals.net/f/8943303-asus-gtx-1070-turbo-385-s-h-newegg


----------



## chrcoluk

I see zero temperature throttling. I think people saying Pascal throttles before it hits the temp target are wrong; I think they misdiagnosed power (PWR) throttling as temp-related.


----------



## chrcoluk

Quote:


> Originally Posted by *iZeroFive*
> 
> I don't really understand this new GPU Boost 3.0 mambo jambo and i'm getting frustrated because it is so god damn unstable.Back in day with my good old GTX970 if you can hit 1500mhz without any error that's it there was no sudden 50mhz drops or there was no "keep me at 50celcius or i'm not going to boost myself up" type of ****.
> 
> As an EVGA 1070 SC owner i set custom fan curve then i overclocked as this:
> 
> +60 core v , +50mhz gpu , +500 mem , %112 power target.
> 
> Then i increased gpu +50 to +60mhz,gpu-z told me that i can reach higher boost as you can expect but i got lower graphic score in Timespy then i got with +50mhz increase.At that test card hit max 71celcius but the highest boost clock that gpu-z read was 1987mhz and highest vddc 1.0620v.
> 
> How is other 1070 SC users doing any suggestion for hitting at least stable 2000mhz,i don't expect anything higher then that at the moment.


I think all that really happened is you had a very nice 970 chip with lots of headroom.

Boost 3.0 is more aggressive in its automated overclocking, meaning manual overclocking on top of it will seem disappointing.

On my 970, Boost 2.0 only managed 110MHz over stock speeds; on my 1070 it's done nearly 500MHz over stock.


----------



## marik123

Quote:


> Originally Posted by *ikjadoon*
> 
> Correct me if I'm wrong, but I have heard that these GPUs reach their maximum core clocks at 50C and under. Someone said that at 50C, they start decreasing by 10MHz or something. I've heard this actually in a few places, but I haven't seen it replicated or tested.
> 
> Anandtech said their full GTX 1070 review will be up tomorrow, so hopefully they test this. It might be planned by NVIDIA, as they mentioned that GPU Boost 3.0 has specific water-cooling enhancements.


I just did more testing tonight and my GPU stays at 2100MHz boost if the temp is < 40C, then drops 12.5MHz per 3-5C increase. The lowest boost I've seen is 2037.5MHz, when the temperature reached 65C during Heaven 4.0.
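Those numbers sketch out a simple thermal-binning model. The code below is only an illustration of that relationship: the 40C knee, 12.5MHz step and 2037.5MHz-at-65C point come from the post above, while the fixed 5C bin width is an assumption chosen so the model reproduces them (real cards vary).

```python
# Toy model of GPU Boost 3.0 thermal binning, fitted to the numbers above:
# full 2100 MHz boost at or below 40C, then one 12.5 MHz bin lost per ~5C.
# The 5C bin width is an assumption picked to match the observed 2037.5 MHz at 65C.
def boost_clock(temp_c, max_boost=2100.0, knee_c=40, bin_mhz=12.5, bin_width_c=5):
    if temp_c <= knee_c:
        return max_boost
    # Count how many full-or-partial bins of warming have elapsed past the knee.
    bins_lost = -(-(temp_c - knee_c) // bin_width_c)  # ceiling division
    return max_boost - bins_lost * bin_mhz

for t in (35, 45, 55, 65):
    print(t, boost_clock(t))  # 2100.0, 2087.5, 2062.5, 2037.5
```

This is why keeping the card under 40C (i.e. water-cooling) holds the full boost bin.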


----------



## DStealth

Just bought the cheapest GB WF2 1070 card in my country. I tested 3 of them; all made 2100+/9500+ out of the box for benching, and I'm still testing them on air. Stock CPU while my PSU is out for RMA.

Here are some results... eagerly awaiting a BIOS to avoid the PT and Vrel limits kicking in constantly; the best core is set to 50% voltage in XOC for 2138/51. What a shame not to use their potential. All air-cooled (hot air, actually): summer heat here, 37+C outside and near 30C inside.

Actually amazed by these small cut-down dies...


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *Yungbenny911*
> 
> You should definitely invest in a new processor. That's a horrible GPU score just 200 points higher than my previous 970 SLI setup...
> 
> http://www.3dmark.com/compare/fs/9410011/fs/4659651


Cool.

Yes, I need to research a CPU, mobo and RAM. I should be able to find something that would raise all-around performance pretty cheaply.

What I need is the most popular "bang-for-your-buck" CPU, mobo and RAM from, say, 2014. Should that be enough?
(I want to re-use my existing PSU, case and even HSF if possible.)

Then I'd start shopping for the best bang-for-the-buck parts of 2015 and buy them mid to late 2017.

Suggestions for the best 2014 CPU, mobo and RAM?

Thanks!


----------



## Mudfrog

Quote:


> Originally Posted by *Malinkadink*
> 
> Ordered a 1070 G1 for $435. Sold my 970 for $200, $235 upgrade for close to double my 970s performance, should have it tomorrow


I wish I could sell my older cards, but they get passed down to the kids. One 670 will replace a 560 Ti and the other will replace a 6950. Then the 560 Ti will replace a 4850. The circle of life in my house. I have too many PCs.


----------



## whicker

Quote:


> Originally Posted by *Dreamliner*
> 
> whicker mentioned he wasn't able to achieve the OC'd Strix speeds on his regular Strix. Is it possible manufacturers are binning?
> 
> I've been seeing some blower models pop up under $400...very tempting.


While I can't say for sure about other brands such as MSI and Gigabyte, ASUS is DEFINITELY binning. Not being able to get these cards over 2000MHz, or having a card that can't hold a stable clock, is a kick in the nuts. Fortunately I am returning the regular Strix and have an OC Strix on order. I had to spend an extra $40, but at least I'm guaranteed 2050MHz stable.


----------



## 113802

My EVGA 1070 FTW can only do 2025MHz; anything over that artifacts. Memory does hit 9200MHz without an issue. It also has loud coil whine!


----------



## Blackfyre

Quote:


> Originally Posted by *WannaBeOCer*
> 
> My EVGA 1070 FTW can only do 2025mhz, anything over it artifacts. Memory does hit 9200mhz without an issue. Also has loud coil whine!


Loud coil whine only occurs at high frame rates, correct?

If you lock the frame rate to 60FPS there's no coil whine? At least that's how it is for all of us.

Don't worry too much; hopefully these video cards get an unlocked BIOS, which would make the EVGA 1070 FTW by far the best option. Not only does it have 8+8 pin power, it has a DUAL BIOS, which will let you flash one side without worrying too much about anything going wrong. And if anything does go wrong, you can always switch back to the default BIOS.

PS: How did you determine 2025MHz is the highest you can achieve?

Are you using MSI Afterburner (latest beta version)? What are your settings?


----------



## 113802

Quote:


> Originally Posted by *Blackfyre*
> 
> Loud coil whine only occurs at high frame-rates correct?
> 
> If you lock frame-rate to 60FPS there's no coil whine? At least that's how it is for all of us.
> 
> Don't worry too much, hopefully these videocards get an unlocked BIOS, which would make the EVGA 1070 FTW by far the best option. Not only does it have a powerphase of 8+8 Pin but it has DUAL BIOS. Which will allow you to flash on one side without worrying too much about anything going wrong. And if anything goes wrong you can always switch back to the default BIOS.
> 
> PS: How did you determine 2025MHz is the highest you can achieve?
> 
> Are you using MSI AfterBurner (latest BETA version)? What are your settings?


I was using EVGA Precision X OC, but I noticed it doesn't apply overclocks correctly. For example, if I add +60MHz on the core, it drops it down to +50MHz, and it still has that issue at boot where it slows down the keyboard. Installing Afterburner now, since I believe I can get a higher overclock with it.


----------



## Blackfyre

Quote:


> Originally Posted by *WannaBeOCer*
> 
> I was using EVGA Precision X OC but I noticed it doesn't overclock correctly. Say for example I add +60mhz on the core it drops it down to +50mhz and it still has that issue at boot where it slows down the keyboard. Installing Afterburner now since I believe I can get a higher overclock.


Make sure you install the LATEST beta version of Afterburner.

Go to settings and tick (_enable_) unlock voltage control, but make sure you DO NOT enable "Force Constant Voltage".

I have an MSI GTX 1070 Gaming X.

*My settings are:*

Core Voltage Percentage = +100%
Power Limit = 126% (_yours will be around 113% maximum_)
Core = +106 MHz
Memory = +666 MHz

However, for benchmarking purposes, to HIT the 21K Graphics Score mark, I used a manual CURVE for core/voltage and set memory to +850 MHz.

http://www.3dmark.com/3dm/13449623
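For anyone wondering what the "manual curve" does differently from a plain offset: the idea is to pin every voltage point at or above some chosen voltage to a single frequency, so Boost can't step down through intermediate bins. A rough sketch of that transformation as data (the (voltage, MHz) points here are invented for illustration, not taken from any real card, and this is not Afterburner's actual API):

```python
# Illustration of flattening a voltage/frequency curve, as done by hand in
# Afterburner's curve editor. The (voltage, MHz) points are example values only.
stock_curve = [(0.800, 1607), (0.900, 1800), (1.000, 1950), (1.043, 2012), (1.062, 2050)]

def flatten(curve, v_lock, f_lock):
    # Points below v_lock keep their stock frequency; points at or above it are
    # all pinned to f_lock, so the card holds one clock instead of bouncing.
    return [(v, f if v < v_lock else f_lock) for v, f in curve]

flat = flatten(stock_curve, v_lock=1.000, f_lock=2050)
print(flat)  # [(0.8, 1607), (0.9, 1800), (1.0, 2050), (1.043, 2050), (1.062, 2050)]
```

The flat section is why curve overclocks tend to hold a steadier clock than a global +MHz offset, at the cost of tuning each point yourself.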


----------



## 113802

Quote:


> Originally Posted by *Blackfyre*
> 
> Make sure you install the LATEST Beta version of AfterBurner.
> 
> Go to settings and tick (_enable_) unlock voltage control, but make sure you DO NOT enable "Force Constant Voltage".
> 
> I have an MSI GTX 1070 Gaming X
> 
> *My settings are:*
> 
> Core Voltage Percentage = +100%
> Power Limit = 126% (_yours will be around 113% maximum_).
> Core = + 106 MHz
> Memory = 666 MHz
> 
> However for benchmarking purposes, to HIT the 21K Graphics Score mark, I used a manual CURVE for CORE/Voltage, and set memory to +850 MHz
> 
> http://www.3dmark.com/3dm/13449623


Was able to complete Time Spy at 2100MHz, but with a few artifacts. MSI Afterburner definitely applies voltage and core frequency correctly, unlike EVGA Precision. I am able to run 2075MHz without any artifacts.

http://www.3dmark.com/3dm/13449964?


----------



## Gonzalez07

Hey guys, apologies if this is a dumb question, but does the 1070 work with Windows 7? I keep reading about people having to upgrade to Windows 10, so I'm a little confused.


----------



## whicker

Man, my Strix is ****. I have been filling out the RMA claim, so I ran Time Spy at stock clocks to show them. It's under 55C and the clocks still drop into the 1700s. When I was OCing it, the worst I saw the clock fluctuate was from 2062 to 1867MHz in Heaven. I think I have the worst 1070 chip ever made.



good card for comparison


----------



## Amph

Quote:


> Originally Posted by *luan87us*
> 
> Sorry I misread your question. I thought you mean how many watt your system is using total lol.


That part is easy; what I need is something that tells me the wattage the GPU draws from the motherboard's PCIe slot.


----------



## Amph

I noticed something strange with my 1070: when I open Chrome, the card runs in the P2 state only, even if I'm running a demanding application. Is this a bug or something?

I don't want to force the P0 state.


----------



## Mudfrog

So for the guys having a lot of clock fluctuations, is that during an OC or at stock? In theory the card "should" run fine, without downclocking, up to what the manufacturer states as the clock speed, correct? I won't have a need to overclock my card for a long time, so this is more of a curiosity question.

So the MSI Gaming X that I ordered should be able to run at a constant 1771 in gaming mode?

I'm trying to figure out if the complaining is mainly from people who expect the cards to run faster than advertised.


----------



## whicker

Quote:


> Originally Posted by *Mudfrog*
> 
> So the guys that are having a lot of clock fluctuations, is that during an OC or stock? In theory the card "should" run fine without under clocking up to what the manufacturer states as the clock speed, correct? I won't have a need to overclock my card for a long time, so more of a curiosity question.
> 
> So the MSI Gaming X that I ordered should be able to run at a constant 1771 in gaming mode?
> 
> Trying to figure out if the complaining is mainly people who expect it to run faster than advertised.


From what I understand, MSI guarantees the card will run at its 1771MHz boost up to 82C (thermal limit) and 100% (power limit) before it starts throttling. In the real world, though, the AIB cards run under 65C and boost approximately 200MHz higher than the stated max boost. So in-game, as long as you are under the thermal and power limits, your card should run at around 1958-1971MHz stable.

The issue I was having with my Strix is that it would throttle by 100MHz at 50C for no apparent reason. The clock was also unstable; it should have run at 1898-1911 but would drop into the 1700s.
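That description boils down to a floor-plus-headroom rule, which can be sketched like this (the 1771MHz rated boost is MSI's spec for the Gaming X; the ~200MHz of opportunistic headroom is an estimate from reports in this thread, not a guarantee):

```python
# Rough sketch of the boost behavior described above: the rated boost is the
# guaranteed floor once a limit is hit, and cool cards opportunistically run
# higher. The 200 MHz headroom figure is a forum estimate, not a spec.
def expected_boost(temp_c, power_pct, rated_boost=1771, headroom=200,
                   temp_limit=82, power_limit=100):
    if temp_c >= temp_limit or power_pct >= power_limit:
        return rated_boost           # throttled back toward the guaranteed boost
    return rated_boost + headroom    # typical real-world boost with limits to spare

print(expected_boost(63, 90))   # 1971
print(expected_boost(85, 90))   # 1771
```

A card like the misbehaving Strix above is one that throttles while both inputs are still well under their limits.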


----------



## ogow89

Quote:


> Originally Posted by *Mudfrog*
> 
> So the guys that are having a lot of clock fluctuations, is that during an OC or stock? In theory the card "should" run fine without under clocking up to what the manufacturer states as the clock speed, correct? I won't have a need to overclock my card for a long time, so more of a curiosity question.
> 
> So the MSI Gaming X that I ordered should be able to run at a constant 1771 in gaming mode?
> 
> Trying to figure out if the complaining is mainly people who expect it to run faster than advertised.


In gaming mode it will run at a constant 1882MHz on the core and 2002MHz on the memory. In OC mode, it would run around 1924MHz on the core and 2025MHz on the VRAM.

The complaining you are talking about is that most reviews show the card constantly running just short of 2GHz, around 1974MHz, without overclocking, and many on the forums are hitting over 2050MHz on the core with overclocking. My piece of **** chip couldn't even maintain MSI's OC mode; it maxed at 1902MHz, and 1924MHz with overclocking. Any time I added more than 50MHz to the core, the application would crash. For a 500-euro card, one of the most expensive GTX 1070s, it should offer better overclocking potential. Therein lies my dissatisfaction with the product. Nvidia showed these cards hitting over 2.1GHz at their press event 2 months ago; why should I settle for mine at 1900MHz when others enjoy 2GHz+ chips?


----------



## Mudfrog

Quote:


> Originally Posted by *ogow89*
> 
> At gaming mode it will run at constant 1882 mhz on the core, and 2002 mhz on the memory. With oc mode, it would run around 1924mhz on the core and 2025 mhz on the vram.
> 
> The complaining, you are talking about, is that most reviews show the card running short of 2ghz constantly with out overclocking around 1974mhz. And with many on the forums are hitting over 2050 mhz on the core with overclocking. My piece of **** chip, couldn't even maintain oc mode from msi, and it maxed at 1902mhz, and with overclocking 1924 mhz. Anytime i added over 50 mhz to the core, the application would crash. For a 500 euros card, being one of the most expensive gtx 1070 cards, it should offer better overclocking potential. Therefore lies my dissatisfaction with the product. These cards were advertised by nvidia during their press 2 months ago, to hit over 2.1 ghz, why should i settle for mine at 1900mhz, when others enjoy 2+ ghz chips?


Ok, makes sense. This wasn't directed at you; I've read several people complaining about the same thing. I just wasn't sure if the throttling was on the factory OC or an above-and-beyond OC.


----------



## zeroibis

Got my 1070 Sea Hawk EK X yesterday. It was hitting almost 2GHz out of the box without me making any changes to the settings. I can really tell how much lower the temps on this thing are: I am seeing around a 10C+ reduction in GPU core temps compared to a 480 on the same cooling system. Even at 100% load, temps remain under 40C, which is 5-8C better than other cards I have used on this same cooling setup.

This weekend I'll let it sit in FurMark for a few hours to finish up a long burn-in before I jump into actual overclocking.


----------



## izidour

Hi guys, how is everybody?

Well, I bought a GTX 1070 G1 Gaming and it works very well. I can overclock it to a 2025MHz GPU core clock (+85MHz) and 4504MHz memory (+500MHz) without extra voltage. When I add more voltage, I get heavy throttling and can't get more than 2038-2050MHz with +90MHz. The TDP is at its limit the whole time.

On this GTX 1070 G1 Gaming the max power limit is 111%. The temperature never goes above 49C without extra voltage, and with +100mV the max is 52C. So please, can anybody help me try to get 2100MHz, hehe? Is there any BIOS I can put on my card without bricking it?

Thanks to all of you guys.

PS: My memory is Samsung.


----------



## Pheesh

Quote:


> Originally Posted by *Gonzalez07*
> 
> Hey guys, apologies if this is a dumb question..but does the 1070 work with Windows 7? I keep reading people having to upgrade to Windows 10 so I'm a little confused.


Yes, it works with Windows 7; I installed mine last night and had no issues.


----------



## TheMiracle

Have you guys found any way to avoid the freq bouncing?
My card keeps changing from 2038mhz to 2063mhz when gaming. (+100 core). Can I use the Voltage Curve to avoid it, or is it only tied to gpu temps?


----------



## ikjadoon

Quote:


> Originally Posted by *Oj010*
> 
> I think you just didn't notice it, GPU Boost has always behaved that way when you're close to the limits. Hell, I spent several hours closely watching the frequencies on a GTX 980 non-Ti just last week bouncing around in GPUPI1B. I did eventually manage a run through at 1,700 MHz but not before about 200 runs of it bouncing down as low as 1,640.


Err, right, but isn't GPU Boost 3.0 far more temperature-dependent? From what Anandtech has shown and what a few people have posted, it starts adjusting max boost at around 40 to 50C. Was that true with Maxwell and GPU Boost 2.0? I don't know--I never had a Maxwell card.








Quote:


> Originally Posted by *Swolern*
> 
> Cheapest GTX 1070 I have seen. http://slickdeals.net/f/8943303-asus-gtx-1070-turbo-385-s-h-newegg


We're so close. That's still $5 over MSRP, though, a month after launch.







But this is better, much better than before.
Quote:


> Originally Posted by *chrcoluk*
> 
> I see zilch temperature throttling. I think people saying Pascal throttles before it hits the temp target are wrong; I think they misdiagnosed PWR throttling as temp related.


Hmm...it's definitely temperature related, as Anandtech, Oj010, and marik123 have shown. It's actually a very direct relationship. How long are you testing for? Anandtech writes,
Quote:


> To start, Pascal clockspeeds are much more temperature-dependent than on Maxwell 2 or Kepler. Kepler would drop a single bin at a specific temperature, and Maxwell 2 would sustain the same clockspeed throughout. However Pascal will drop its clockspeeds as the GPU warms up, *regardless of whether it still has formal thermal and TDP headroom to spare*. This happens by backing off both on the clockspeed at each individual voltage point, and backing off to lower voltage points altogether.


Quote:


> Originally Posted by *marik123*
> 
> I just did more testing tonight and my GPU stays at 2100mhz boost if temp < 40c, then drop 12.5mhz per 3-5c increase. The lowest I seen in boost is 2037.5mhz when temperature reached 65C during Heaven 4.0.


That lines up exactly right with what Anandtech showed. 12.5MHz is the GPU core frequency bin/strip size, so it drops one bin per 3-5C. Hmm...I wonder why some people see it and some don't? I wonder...are individuals not running tests long enough? Or are they not monitoring GPU frequency as the "test ramps up" and only look at the results at the very end?
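For anyone curious, the numbers above can be turned into a rough back-of-the-envelope model. To be clear, this is purely illustrative: the 40C knee, 5C step, and 12.5MHz bin size come from the forum observations quoted here, not from any NVIDIA spec, and real cards will vary.

```python
# Rough sketch of GPU Boost 3.0 temperature down-binning as described above.
# Assumptions (observed by forum posters, not official): max boost holds
# below ~40C, then one 12.5 MHz bin is dropped per ~5C of warming
# (posters report 3-5C per bin; 5C is used here).

BIN_MHZ = 12.5      # size of one boost frequency bin
KNEE_C = 40         # temperature where down-binning reportedly starts
STEP_C = 5          # degrees of warming per dropped bin

def boost_clock(max_boost_mhz: float, temp_c: float) -> float:
    """Estimate the sustained boost clock at a given GPU temperature."""
    bins_dropped = max(0, int((temp_c - KNEE_C) // STEP_C))
    return max_boost_mhz - bins_dropped * BIN_MHZ

# marik123's card: 2100 MHz below 40C, ~2037.5 MHz at 65C
print(boost_clock(2100, 38))   # 2100.0
print(boost_clock(2100, 65))   # 2037.5
```

With these parameters the model reproduces marik123's observation exactly (five bins, 62.5MHz, between 40C and 65C), which is why the Anandtech description and his log line up so well.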


----------



## Mudfrog

I remember during the press release it was said that these newer cards would fully support SLI even if the game did not, as it's hardware-driven. I'm assuming that's for DX12 games only; does anyone have more information on this?


----------



## marik123

Quote:


> Originally Posted by *ikjadoon*
> 
> So, the excitement begins. Anandtech is releasing their 30+ page GTX 1080/1070 review tomorrow. I poked Ryan Smith (Editor in Chief at Anandtech) in the comments of this hard drive review and he CONFIRMED OUR SUSPICIONS.
> 
> 
> 
> GPU Boost 3.0 actually starts reducing your maximum GPU boost *at possibly even 39C*! He was kind enough to tease this image from tomorrow's upcoming review.
> 
> 
> 
> So....all the more reason to water-cool? haha, well, if you care about ~70MHz, that is.


That totally explains why my boost clock drops all the way down to 2037.5MHz during Heaven 4.0 when the temperature reaches 65C, from a 2100MHz boost when the temp is <38C.


----------



## whicker

Quote:


> Originally Posted by *ikjadoon*
> 
> That lines up exactly right with what Anandtech showed. 12.5MHz is the GPU core frequency bin/strip size, so it drops one bin per 3-5C. Hmm...I wonder why some people see it and some don't? I wonder...are individuals not running tests long enough? Or are they not monitoring GPU frequency as the "test ramps up" and only look at the results at the very end?


Normally you would drop 1-2 bins if temps get high. My card unfortunately was dropping 10+ bins when running under 55c.
Quote:


> Originally Posted by *marik123*
> 
> That totally explains why my boost clock drop all the way down to 2037.5mhz during Heaven 4.0 when temperature reached 65c from 2100mhz boost when temp is <38c.


Just gotta make a custom fan curve. Most AIB fans don't even turn on till 60c. Change that to 40c and, with the fans around 60%, you won't clock down as much, if at all. Also make sure you increase the temp and power limits to max.
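The curve whicker describes (fans starting at 40C instead of the stock ~60C, reaching roughly 60% as the card warms up) can be sketched as a simple piecewise-linear interpolation. The breakpoints below are illustrative guesses in that spirit, not values from any card's BIOS; you'd tune them in Afterburner's fan curve editor.

```python
# Sketch of a custom fan curve in the spirit of the advice above:
# fans start at ~40C rather than the stock ~60C. Breakpoints are
# illustrative only; tune them for your own card and case.

CURVE = [(30, 0), (40, 40), (50, 60), (70, 80), (83, 100)]  # (temp C, fan %)

def fan_percent(temp_c: float) -> float:
    """Linearly interpolate fan speed from the breakpoints above."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin at 100% above the last breakpoint

print(fan_percent(40))  # 40.0
print(fan_percent(50))  # 60.0
print(fan_percent(90))  # 100
```

The point of the earlier knee is simply to keep the die under the ~40C threshold where GPU Boost 3.0 reportedly starts dropping bins, trading a little noise for a few sustained bins.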


----------



## Bugses

I just got my Gainward 1070 GLH. I ran a Firestrike and it says my card got these specs:

Memory 8.192 MB
Core clock 350 MHz
Memory bus clock 2.126 MHz
Driver version 10.18.13.6881

Does that seem accurate?


----------



## ogow89

Quote:


> Originally Posted by *Bugses*
> 
> I just got my Gainward 1070 GLH. I ran a Firestrike and it says my card got these specs:
> 
> Memory 8.192 MB
> Core clock 350 MHz
> Memory bus clock 2.126 MHz
> Driver version 10.18.13.6881
> 
> Does that seem accurate?


nope, try using DDU.


----------



## Bugses

Quote:


> Originally Posted by *ogow89*
> 
> nope, try using DDU.


That didn't change anything in GPU-Z. Is it because I'm checking it when my GPU is idle?
I just ran the Time Spy benchmark, and I got this result http://www.3dmark.com/spy/111424
I also ran the Firestrike benchmark and got this result http://www.3dmark.com/fs/9425201

Does that look more normal?


----------



## ogow89

Quote:


> Originally Posted by *Bugses*
> 
> That didnt change anything in GPU-z. Is it because I'm checking it when my GPU is idle?
> I just ran the Time Spy benchmark, and I got this result http://www.3dmark.com/spy/111424
> I also ran the Firestrike benchmark and got this result http://www.3dmark.com/fs/9425201
> 
> Does that look more normal?


Oh, I figured it out: your GPU-Z is soooooooo old, man. Version 1.9 is already out.

Btw, your physics score is too low; I suggest overclocking your CPU to get more out of your GPU.

Graphics score is good though.


----------



## Bugses

Quote:


> Originally Posted by *ogow89*
> 
> Oh i figured it out, your gpu-z is soooooooo old man, there is already version 1.9
> 
> Btw your physics score is too low, i suggest overclocking your cpu to get more out of your gpu.
> 
> Graphics score is good though.


Rofl, you're right. Nothing to see here, carry on!








Thanks though.

And yeah, I actually had my 3570K OC'ed, but I accidentally clicked that magic button on my motherboard that wipes my BIOS...
But I just ordered a 5820K, so that should help a bit


----------



## Bugses

On a side note, I've never really OC'ed a GPU before. How safe is that? It's nearly idiot-proof to OC my CPU, so I was just wondering if this is the same?


----------



## ogow89

Quote:


> Originally Posted by *Bugses*
> 
> On a side note, I've never really OC´ed a GPU before. How safe is that? Its nearly idiot proof to OC my CPU, so I was just wondering if this is the same?


Add 500MHz to your memory and test it for an hour for stability. Add another 100MHz if it proves stable. Then set your power limit to max, add maybe 50MHz to the core, and test again for stability. If it's stable and you see clocks holding over 2GHz, try adding another 50MHz and test again. If +100MHz on the core with +600MHz on the VRAM runs with no artifacts or crashing, you can then try +110MHz; that's where most GTX 1070s hit their limit. You can push the VRAM past +700MHz, but I wouldn't bother given how the cooling on these cards works, and there's no performance gain from it; pushing the core harder also makes the card throttle even harder. Aim for +110MHz on the core and +600MHz on the memory. The most you'll see from all this, however, is maybe a 5 FPS difference.
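The ladder above can be written out as a plain test plan of (core offset, memory offset) pairs. The values are the ones from the post, not universal limits, and the hour of stability testing between each step is on you.

```python
# The stepwise ladder from the post above, as (core_offset, mem_offset)
# pairs in MHz. Each step assumes the previous one passed ~1 hour of
# stability testing; back off one step on any artifact or crash.
# Offsets are the poster's suggestions, not guaranteed-safe values.

def oc_ladder():
    steps = []
    core, mem = 0, 0
    for mem in (500, 600):          # memory first: +500, then +600
        steps.append((core, mem))
    for core in (50, 100, 110):     # then core, with the power limit maxed
        steps.append((core, mem))
    return steps

print(oc_ladder())
# [(0, 500), (0, 600), (50, 600), (100, 600), (110, 600)]
```

Memory-first ordering matters here: it isolates VRAM instability (artifacts) from core instability (crashes), so when something goes wrong at a later step you know which offset to back off.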


----------



## Bugses

Quote:


> Originally Posted by *ogow89*
> 
> Add 500 mhz to your memory, and test it for 1 hr for stability. Add additional 100 if it proved stable. Afterwards, put your powerlimit to max, and add maybe 50 mhz to the core and test it again for stability. If stable, and you see stable clocks over 2 ghz, try maybe adding another 50 and test it again. If 100mhz+ on the core with 600mhz+ on vram ran with no artifacts or crashing, you can then try 110mhz and that is where all gtx 1070 hit their limit. You can push the vram to over 700 mhz+, but i wouldn't bother, due to how the cooling on these cards work. Also there is no performance gain out of it. Adding to the core also sees the card throttling even harder. Try to reach 110+ mhz on the core and 600+ mhz on the memory. Max you will see however is maybe a variance of 5 fps.


Thanks for your guide. How risky is it though? In terms of damaging the GPU?


----------



## whicker

Quote:


> Originally Posted by *Bugses*
> 
> On a side note, I've never really OC´ed a GPU before. How safe is that? Its nearly idiot proof to OC my CPU, so I was just wondering if this is the same?


Probably even easier. Grab MSI AB beta 4. In settings, enable voltage monitoring and adjustment. Max out the voltage, power limit, and temp limit. Add 500 to the memory and increase the core clock in 15MHz increments till you see artifacts or a crash in 3DMark. Once you do, back off 15MHz and you're done. You also might want to make a custom fan curve, since most cards keep the fans off till 60c.
Quote:


> Originally Posted by *Bugses*
> 
> Thanks for your guide. How risky is it though? In terms of damaging the GPU?


They are all limited to 1093mv so no risk at all as long as it doesn't get too hot.


----------



## ogow89

Quote:


> Originally Posted by *Bugses*
> 
> Thanks for your guide. How risky is it though? In terms of damaging the GPU?


Not at all, since you can't push the voltage beyond the cap anymore. The memory, however, is less certain; no one really knows. If you want to play it even safer, just add 500MHz to the memory. GPU Boost 3.0 already does the overclocking for you this time around; adding the 110MHz doesn't mean you will get +110 at all times, only when the power and temp limits allow it.


----------



## Bugses

Great







Thanks both of you.


----------



## Vaesauce

Got my Gigabyte 1070 Xtreme Gaming yesterday. So far it's been a huge upgrade over my 7970ghz lol.

Only bad thing is the micro stutter, which Nvidia has acknowledged and is fixing.

As for overclocks and stuff: I can't really add much core since, with OC/Boost, I'm already at 2075MHz. Cool thing is, though, that I haven't gone over 60C. As for memory, I've been able to go up to +700, with a few artifacts showing in some games but not others. I've lowered it to 500 since then, mostly because I don't care much about absolute maximum overclocking.

I would suppose some people would like the Xtreme Gaming 1070 Bios?


----------



## ikjadoon

Quote:


> Originally Posted by *whicker*
> 
> Normally you would drop 1-2 bins if temps get high. My card unfortunately was dropping 10+ bins when running under 55c.
> Just gotta make a custom fan curve. Most AIB fans dont even turn on till 60c. Change that to 40c and with fans around 60% you wont clock down as much if at all, also make sure you increase temp and power limit to max.


True, but "high" has changed, in NVIDIA's definition. And, oh, yeah, your card was crazy. 10 bins? Something was definitely wrong. Maybe a bad chip.

Quote:


> Originally Posted by *marik123*
> 
> That totally explains why my boost clock drop all the way down to 2037.5mhz during Heaven 4.0 when temperature reached 65c from 2100mhz boost when temp is <38c.












It is a little annoying; I wish NVIDIA had been a bit more upfront about this. However...it could have been that your card, under GPU Boost 2.0, might have only been stable at 2038MHz at any temperature. But, with GB 3.0, they let it ramp up a bit when the die is cold, so if you wanted to go cooler, you would see benefits that GB 2.0 might not have given you.


----------



## Vikhr

Anyone with a MSI Gaming X 1070 having issues with the card not boosting properly? It seems to happen after the computer has been left on for a prolonged period of time, afterwards the card will fail to boost to the usual speeds or at all.

Resetting the power limit appears to fix it temporarily but I feel that I shouldn't have to worry about this in the first place.


----------



## bigjdubb

Quote:


> Originally Posted by *Vikhr*
> 
> Anyone with a MSI Gaming X 1070 having issues with the card not boosting properly? It seems to happen after the computer has been left on for a prolonged period of time, afterwards the card will fail to boost to the usual speeds or at all.
> 
> Resetting the power limit appears to fix it temporarily but I feel that I shouldn't have to worry about this in the first place.


I have run into a few issues of my card not clocking up but it has been during benchmarking sessions with driver crashes and what not. I just assumed it was from driver crashes since rebooting fixed it. I will keep my eye out and see if I notice it again.


----------



## DStealth

Quote:


> Originally Posted by *Vaesauce*
> 
> Got my Gigabyte 1070 Xtreme Gaming yesterday. ...


Can you share your BIOS file, thanks in advance


----------



## Vaesauce

Quote:


> Originally Posted by *DStealth*
> 
> Can you share your BIOS file, thanks in advance


I just tried extracting it via GPU-Z and it won't let me. GPU-Z just freezes up and goes to " Not Responding "









I was able to send it to their website though apparently.


----------



## nacherc

MSI 1070 GAMING X

CORE +100
MEM +800

Tested in benchmarks and games like GTA V and BF4. Stable.


----------



## luan87us

Quote:


> Originally Posted by *Vaesauce*
> 
> Got my Gigabyte 1070 Xtreme Gaming yesterday. So far it's been a huge upgrade over my 7970ghz lol.
> 
> Only bad thing is the micro stutter which Nvidia has acknowledged and are fixing.
> 
> As for Overclocks and stuff. I can't really add much Core since with OC/Boost, I'm already at 2075mhz. Cool thing is though that I haven't gone over 60c. As for Memory, I've been able to go up to +700 with a few artifacts showing in some games and not. I've lowered it to 500 since then, mostly because I don't care much for actual maximum Overclocking.
> 
> I would suppose some people would like the Xtreme Gaming 1070 Bios?


Imagine how big of an upgrade it is for me from a 7850 lol.


----------



## 113802

Every time I think my overclock is stable I end up with different colors in Overwatch


----------



## bpmcleod

Any custom BIOS yet for MSI Gaming 1070?

EDIT: I want a custom BIOS unlocking voltages and power limits and what not. Will be dropping it on water here shortly so wanting to prepare for it... Here is the original BIOS

GP104.zip 149k .zip file


----------



## alex4069

Quote:


> Originally Posted by *42 degree angle*
> 
> Hello 1070 owners! I think that this thread is the best one to ask this, but do guide me to somewhere else if needed. So;
> 
> What is actually 'the best' aftermarket 1070 to buy at this time and age? I've never had a PC and I'm gathering up a build, this being the last part. I've been eyeballing the Gigabyte G1 because I'm going to go with a blackout theme with my build, but then I stumbled across coil whine accusations. Is this true? What are the downsides of each 1070, which should be avoided and which should be favoured?
> 
> To give an image about pricing, the G1 stands at 489€, MSI stands at 499€ and Asus at 529€. Also I'd love to go with the MSI one, but I'd love it to match my build, so how would one make the red part of the shroud black without voiding warranty, a non-permanent solution?


All I can speak on is the Zotac GTX 1070 AMP Edition. It has two fans that spin down to a stop when the GPU is not being used, to save on fan life, and it's easily overclockable. It's black with a carbon fiber ExoArmor shroud and a small yellow line on the backplate, and you can change the lighting effects on the card.


----------



## Vaesauce

Quote:


> Originally Posted by *WannaBeOCer*
> 
> Every time I think my overclock is stable I end up with different colors in Overwatch


Don't think that is an overclocking issue. That is an issue with Overwatch. It's been confirmed I believe.


----------



## Eric1285

Quote:


> Originally Posted by *Vaesauce*
> 
> Don't think that is an overclocking issue. That is an issue with Overwatch. It's been confirmed I believe.


I had the same issue with one tint / shade slowly spreading over the screen. Backed off my overclock bit by bit until it went away. Didn't take much...maybe -10 on the core and -25 or so on the memory.


----------



## Vaesauce

So, crazy... after some testing... my 1070 Xtreme Gaming Edition Specs are:

Core Clock
1695 MHz in OC mode
1670 MHz in Gaming mode

Boost Clock
1898 MHz in OC mode
1873 MHz in Gaming mode

Yet when I game/benchmark/put a load on the GPU without touching any OC tools, my core runs between 2000MHz and 2037MHz. With the Xtreme Engine in "OC" mode it sits between 2050-2100MHz, and in "Gaming" mode between 2037-2062MHz.

I guess there is really no need to add any core clock at all. Is this normal? lol


----------



## chrcoluk

Quote:


> Originally Posted by *ikjadoon*
> 
> Err, right, but isn't GPU Boost 3.0 far more temperature dependent? From what Anandtech has shown and what a few people have posted, it starts adjusting max boost at around 40 to 50C. Was that true with Maxwell and GPU Boost 2.0? I don't know--I never had a Maxwell card,
> 
> 
> 
> 
> 
> 
> 
> 
> We're so close. That's still $5 over MSRP, though, a month after launch.
> 
> 
> 
> 
> 
> 
> 
> But this is better, much better than before.
> Hmm...it's definitely temperature related, as Anandtech, Oj010, and marik123 have shown. It's actually a very direct relationship. How long are you testing for? Anandtech writes,
> 
> That lines up exactly right with what Anandtech showed. 12.5MHz is the GPU core frequency bin/strip size, so it drops one bin per 3-5C. Hmm...I wonder why some people see it and some don't? I wonder...are individuals not running tests long enough? Or are they not monitoring GPU frequency as the "test ramps up" and only look at the results at the very end?


Their data simply makes no sense; I'm not seeing my card behave in that manner.

It only throttles when the power % goes within 10% of the limit.

So at default it throttles at 90% TDP, or if I set the max of 114% it throttles at 104%. My card has never gone over 62C, so it does run cool, but that is still hot enough, according to these claims, to see temperature throttling, which I don't get.

The problem with all the people who say it's temperature is that they haven't put PWR usage on their graphs.

When jayz on youtube did a throttle video, his video clearly showed 2 things:

1 - it throttles at the PWR cap.
2 - it throttled at the target temp, which is 82C by default; he clearly did not throttle at any other point.

When he raised the temp target to over 90C he was able to game at circa 85C with no throttling.

He was using an FE card, which as we know has sucky cooling as well.

I suppose one theory is that some vendors are shipping a BIOS that behaves that way, but not all vendors.


----------



## Malinkadink

So I'm not too thrilled with the 1070 G1 in terms of noise/cooling. I'm able to set the fan to 40% and not hear it above my case fans, and that's 1700rpm for the GPU. Anything higher and I can finally make it out over the case fans. With my 970 G1 I was able to let it sit at 55% at 2k rpm, it was as quiet as the rest of the PC, and it never topped 70C @ 1500MHz. This 1070 sits around 75C under load @ 2000MHz.

Other than that the performance is a nice leap up, but the card really likes to throttle when it's in the 100-105% power range. I can't help but feel like Pascal is kind of fail sauce compared to what Maxwell was in 2014.


----------



## Yetyhunter

Quote:


> Originally Posted by *Malinkadink*
> 
> So i'm not too thrilled with the 1070 g1 in terms of noise/coolng. I'm able to set the fan to 40% and not hear it above my case fans and this is at 1700rpm for the gpu. Anything higher i can finally make it out over the case fans. My 970 g1 i was able to let it sit at 55% with 2k rpm and it was as quiet as the rest of the pc and never topped over 70C @ 1500mhz. This 1070 sits around 75C under load @ 2000mhz.
> 
> Other than that the performance is a nice leap up, but the card really likes to throttle when its in the 100-105% power range. I can't help but feel like Pascal is kind of fail sauce compared to what maxwell was in 2013.


It seems I am not the only one in this situation. The card is louder and a lot hotter than my GTX 670 WF3. What core clocks do you get during gaming?


----------



## brettjv

Quote:


> Originally Posted by *Yetyhunter*
> 
> It seems I am not the only one in this situation. The card is louder and allot hotter then my GTX670 WF3. What core clocks do you get during gaming ?


For Dog's sake, fellas, it's like fully 2.5-3 TIMES faster than a GTX670 (the card I'd LOVE to be able to replace with a 1070), right?

And how many times more VRAM does it have onboard again?

And yet still has similar TDP, yeah?

Don't you think it might make some sense that a card this much faster/better than a 670 ... might take a little more fan speed to keep cool? Heck, I have a pretty good case, a custom fan curve for my GTX 670, and a GB WF3 cooler on it ... it spins up PRETTY loud as it approaches temps where throttling can start (which was around 70C on Kepler, IIRC).

That all said, you might wanna try the old 'smoothing the heatsink surface with some fine-grit sandpaper' (the common name for which is eluding me, suddenly), and re-doing the thermal paste with some quality stuff.


----------



## pez

Quote:


> Originally Posted by *brettjv*
> 
> For Dog's sake, fellas, it's like fully 2.5-3 TIMES faster than a GTX670 (the card I'd LOVE to be able to replace with a 1070), right?
> 
> And how many times more VRAM does it have onboard again?
> 
> And yet still has similar TDP, yeah?
> 
> Don't you think it might make some sense that a card this much faster/better than a 670 ... might take a little bit more fan speed to keep cool? Heck, I have a pretty good case and a custom fan curve for my GTX670, and a GB WF3 cooler on it ... its spin up PRETTY loud as it approaches temps where throttling can start (which was around 70C on Kepler IIRC).
> 
> That all said, you might wanna try the old 'smoothing the heatsink surface with some fine-grit sandpaper' (the common name for which is eluding me, suddenly), and re-doing the thermal paste with some quality stuff.


Lapping?









I'm more than curious to try this on my first G1. I swapped the positioning of the two, and the second one I received runs cooler as the top card.


----------



## Sueramb6753

-snip-


----------



## ogow89

Quote:


> Originally Posted by *Symix*
> 
> My MSI gaming X makes a little grinding noise when fans spin, is this normal? it's pretty quiet, you almost couldn't hear it if I recorded it. Still I notice it when the fans spin up because other fans make absolutely no noise other than airflow.


Set the fan speed manually via Afterburner and try to isolate which fan is causing the noise. If the noise increases with fan speed, then you have what's called rattling, which is caused by the fan bearing -> meaning a faulty fan, meaning RMA.


----------



## Vaesauce

Quote:


> Originally Posted by *DStealth*
> 
> Can you share your BIOS file, thanks in advance


I used NVFlash to dump the BIOS instead of GPU-Z, so this worked.

Here ya go!

1070XG.zip 148k .zip file


Warning ahead of time... though the specs for the XG list the boost clock at 1898MHz, it actually jumps to 2050-2100MHz, so if you're looking to overclock the core... there is little room for it.


----------



## prey1337

Just a tidbit of information.

BF4 seems to be a lot more sensitive to my OC settings than Doom was.
Had to drop it down a few notches to keep it from crashing regularly.

With my fan curve now, the highest I get it 55C. Still can't hear it over my CPU cooler.

Also, I do not understand all of the disappointment in the card's OC capability.
These are extremely powerful cards out of the box, and getting up to 2000MHz seems to be a breeze for every card. That's just icing on the cake IMO.

As far as the "throttling" goes, there's no real perceivable difference; it's a couple of MHz here and there.
AND that's when you're overclocking the card above the guaranteed clocks, so it's not like you are losing performance when you're still +100MHz over the factory settings.


----------



## chrcoluk

Indeed, pascal clocks way better than maxwell.


----------



## Blze001

Elder Scrolls Online apparently hates my 1070. I'm struggling to break 30fps, and the card still won't get enough of a load from the game to clock up its speed. So the game is acting like the GPU isn't fast enough, but the GPU isn't getting enough of a load to speed up.

At least I have my ENB modded Skyrim to put it to work.


----------



## HAL900

Quote:


> Originally Posted by *chrcoluk*
> 
> Indeed, pascal clocks way better than maxwell.






Nope


----------



## Raikiri

Palit GameRock:











Running at 2088 boost, 9000 Mem, zero coil whine and maxes out at around 68c/40% fan speed in a warm room. Pretty happy with it.


----------



## Lineswithrobfor

For the 1070 is there thermal throttling when around 76C?

I find that my AMP! edition card already boosts to 2050, and adding more core clock in Afterburner doesn't do anything. Anyone else seeing this?


----------



## ikjadoon

Quote:


> Originally Posted by *Vaesauce*
> 
> So, crazy... after some testing... my 1070 Xtreme Gaming Edition Specs are:
> 
> Core Clock
> 1695 MHz in OC mode
> 1670 MHz in Gaming mode
> 
> Boost Clock
> 1898 MHz in OC mode
> 1873 MHz in Gaming mode
> 
> Yet.. when I game/benchmark/put load on GPU without touching anything OC tools, my Core runs at between 2000mhz and 2037mhz. With the Xtreme Engine and using the "OC" Mode. It sits between 2050-2100mhz and in "Gaming" mode, between 2037-2062mhz.
> 
> I guess there is really no need to add any Core Clock in general. Is this normal? lol


NVIDIA has just the most inane specifications. Your Core Clock is what it should never go below during gaming (i.e., the bare minimum). The Boost Clock is the "low average" of what you should get in-game. Every card, however, has an average in-game boost much higher than "Boost Clock" even *without* overclocking (as long as you have good cooling). So, from what I have seen...you can just turn the TDP limit up and the Temp limit up and the card will "auto OC" itself (by just boosting up very high, much higher than Boost Clock would suggest). However, like you noticed, NVIDIA doesn't boost much past 2050MHz on its own. If you want more, that's when you need to OC.
Quote:


> Originally Posted by *chrcoluk*
> 
> Their data simply makes no sense, not seeing my card behave in that manner.
> 
> It only throttles when power % goes within 10% of the limit.
> 
> So in default it throttles at 90% TDP, or if I set max 114% it throttles at 104%. My card has never gone over 62C so does run cool, but that is still hot enough according to these claims to have temperature throttles which I dont get.
> 
> The problem with all those people who said it's temperature is they havent put PWR usage on their graphs.
> 
> When jayz on youtube did a throttle video, his video clearly showed 2 things.
> 
> 1 - it throttles at PWR cap.
> 2 - it throttled at the target temp which is by default 82C, he clearly did not have throttles at any other point.
> 
> When he raised the temp target to over 90C he was able to game at circa 85C temps with no throttling.
> 
> He was using a FE card which as we know has sucky cooling as well.
> 
> I suppose one theory is that some vendors are shipping a bios that acts in that behaviour. but not all vendors.


So, wait. You've *directly* tested this? You've looked at your maximum boost clock at 45C versus 75C? It's identical? I think very few people have cards that are 1) at 100% load (to activate maximum boost) and 2) only at 45C. You need to test *that*.

What you (and JaysTwoCents) are talking about is traditional TDP / temperature throttling. That hasn't changed. Yes, at 100% TDP, you'll throttle. Yes, at 83C, you'll throttle.
Quote:


> Originally Posted by *prey1337*
> 
> Just a tidbit of information.
> 
> BF4 seems to be a lot more sensitive to my OC settings than Doom was.
> .....
> As far as the "throttling" goes, it has no real perceivable difference, it's a couple MHz here and there.
> 
> AND that's when you're overclocking the card, above the guaranteed clocks, so it's not like you are losing performance when you're still +100MHz over the factory settings.


Great points. BF4 is notorious for "body-checking" overclocks. I had to go down 50MHz on my GTX 770 to get it stable in BF4 versus other stability tests / games. And true. Even with GPU Boost 3.0's additional throttling, at best, it's 65MHz. That's 3%! At 60FPS, that's barely 2FPS. 60FPS to 62FPS. Whoop de doo. Hahaha. But, technically, this is OCN, "the pursuit of performance".
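The "whoop de doo" arithmetic above can be written out explicitly. This assumes FPS scales roughly linearly with core clock, which is only approximately true (memory bandwidth and CPU limits blunt it), so treat it as an upper bound on what the down-binning costs.

```python
# Back-of-envelope cost of GPU Boost 3.0 temperature down-binning,
# assuming FPS scales linearly with core clock (a rough approximation;
# real games scale less than linearly).

def fps_lost(base_fps: float, clock_mhz: float, drop_mhz: float) -> float:
    """Estimate FPS given up when the core drops by drop_mhz from clock_mhz."""
    return base_fps * drop_mhz / clock_mhz

# Worst case cited above: 65 MHz of dropped bins from a 2100 MHz boost
loss = fps_lost(60, 2100, 65)
print(round(loss, 1))   # 1.9
```

So even in the worst reported case, a 60FPS game loses under 2FPS to the thermal bins, which is the point being made: measurable on OCN, invisible in play.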








Quote:


> Originally Posted by *Lineswithrobfor*
> 
> For the 1070 is there thermal throttling when around 76C?
> 
> I find that my AMP! edition card already boost to 2050, adding more core clock in Afterburner doesn't do anything? Anyone else seeing this?


1. Your first question has many answers and I'm unsure which throttling you're talking about. What is the temp limit set at?

2. Are you adding more than 13MHz? Each frequency jump is always 13MHz apart. So, you need to try at least 2063MHz before it'll change, I think.


----------



## Sueramb6753

-snip-


----------



## DStealth

Quote:


> Originally Posted by *Vaesauce*
> 
> I used NVFlash to take the BIOs instead of GPU-Z so this worked.
> 
> Here ya go!
> 
> 1070XG.zip 148k .zip file
> 
> 
> Warning ahead of time... Though on the Specs for the XG, it says it's like Core is at 1898mhz, it actually jumps to 2050-2100, so if you're looking to overclock Core... there is little room for it.


Almost none







Mine boosts to 2076 in 3dmark while OC can hit 2138/51

Flashed your BIOS onto my GB WF2 card, which already had the G1 BIOS... despite the 114% vs 111% PT, the memory hit the 10GHz mark in AVP with my best result so far in this particular benchmark









Thanks for sharing XG BIOS:thumb:


----------



## DStealth

Edit: Can't edit my post right after posting it... strange


----------



## Vaesauce

Quote:


> Originally Posted by *DStealth*
> 
> Almost none
> 
> 
> 
> 
> 
> 
> 
> Mine boosts to 2076 in 3dmark while OC can hit 2138/51
> 
> Flashed your BIOS on my GB WF2 card witch was with G1 BIOS already...despite the 114 vs 111% PT the memory hit the 10GHz mark in AVP with best result obtained so far in this particular benchmark
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for sharing XG BIOS:thumb:










very glad to hear! and You're welcome!


----------



## ogow89

Quote:


> Originally Posted by *Symix*
> 
> Well I took a video anyway, you have to watch in 1080p or audio gets too compressed to hear the noise.
> 
> 
> 
> 
> Is it possible the sticker is off center and it's making the fan vibrate because of it?
> 
> Apparently these are double ball bearing fans, any trick to fixing it like maybe a drop of oil? I've also read that clicking like this is normal for these fans. I'd hate to get a replacement with the same thing, rather just get a different card.


Well, to be honest, I returned mine due to limited overclocking potential and opted for the Palit Super JetStream. In your case, this noise won't fix itself, and you can't fix double ball bearing fans that easily; you would have to drill the plastic and apply oil. I had a similar issue with the R9 290 PCS+. I would just RMA it and either look for a direct replacement, if you like the way the card looks, or get something else.


----------



## Sueramb6753

-snip-


----------



## ogow89

Quote:


> Originally Posted by *Symix*
> 
> I don't like how it looks at all. Don't care about OC either, I just want it to run quiet.
> 
> Maybe I'll go for the Palit GameRock. Any other recommendations?


Are you size limited?

You've got the Palit JetStream, Super JetStream, Gainward Phoenix, and Gainward Phoenix Golden Sample; these all sport the same cooler and are similar price-wise -> cheaper than MSI.

EVGA is also a great brand. I would rather pick a cooler with two fans than one with three small fans like the G1; two 100mm fans are usually quieter. Cooling-wise, any non-Founders Edition card will do.


----------



## Sueramb6753

-snip-


----------



## Raikiri

I can only recommend the GameRock, seems like a great card.

But then that's only my opinion; there's another thread on here where someone has one that barely clocks. Mine is very quiet though, under 55C the fans don't run at all.


----------



## boldenc

Just got my EVGA 1070 SC and so far the max stable OC (+125MHz) bounces between 2025 and 2050MHz. Is that OC fine?

http://www.3dmark.com/3dm/13486402?

https://www.techpowerup.com/gpuz/details/62u2c


----------



## SuperZan

Quote:


> Originally Posted by *Symix*
> 
> I don't like how it looks at all. Don't care about OC either, I just want it to run quiet.
> 
> Maybe I'll go for the Palit GameRock. Any other recommendations?


EVGA SC is very quiet, I'm loving it so far.


----------



## mypickaxe

EVGA HB bridge came in today. Fits my EK blocks with shroud removed. Nice to see the LED is at least working.


----------



## oblivious

Have any of you guys gone from 1440p back down to 1080p?


----------



## Dude970

Looking real good!


----------



## ogow89

Quote:


> Originally Posted by *Symix*
> 
> Nah, no size limits, no problem with 3-slot cards. What's the difference between the Palit GameRock and JetStream? They cost exactly the same at the same clocks.


Palit makes two series: the GameRock for gamers and the JetStream for overclockers. The JetStream comes with two additional VRM phases for better power delivery. Then you've got the GameRock and the GameRock Premium, which comes with a higher factory overclock and boosts higher. There is also the normal JetStream and the Super JetStream; the latter is likewise clocked higher and naturally costs about 20 euros more here. Performance-wise they are all the same, depending on your luck and how high your chip clocks, so just pick the one that suits your build color-wise. I personally find the JetStream's gray color more badass; not into the hip blue look the GameRock offers.

Oh, and don't forget about the Gainward cards. Gainward is Palit. I recommend the Golden Sample Phoenix; those cards are handpicked for overclocking, and I mean really handpicked. They test them for overclocking potential. Costs the same, btw. Well, all I can say is pick your poison









470 or so € ain't cheap.


----------



## Malinkadink

Quote:


> Originally Posted by *oblivious*
> 
> Have any of you guys went from 1440p back down to 1080p?


I went from 1080p to 1440p, back to 1080p, now @ 4K, and I'm going back to 1080p 144Hz tomorrow. This 4K monitor isn't mine; I'm just using it till I fix my 1080p one. Why do you ask?

More on topic though, I'm returning my 1070 G1 and have a Zotac AMP Extreme coming in tomorrow. It seems to stay a bit cooler and quieter, and most importantly the added 8-pin should let it hold higher boost clocks and generally be a bit more stable. I really loved my 970 G1 and it's been a good 2 years with it, but Gigabyte definitely cheaped out on the 1070 G1 a bit imho.


----------



## iZeroFive

Quote:


> Originally Posted by *boldenc*
> 
> Just got my Evga 1070 SC and so far the max stable OC (+125Mhz) bounces between 2025 ~ 2050, is that OC fine?
> 
> http://www.3dmark.com/3dm/13486402?
> 
> https://www.techpowerup.com/gpuz/details/62u2c


Wow, seriously? Mine crashes at anything above +50MHz.

I think I have some kind of power problem.

In TPU's GTX 1070 SC review they mentioned that the card sometimes draws even less power than the Founders Edition.

Quote from TPU: "I'm surprised EVGA has not adjusted the board's power limit, though, since it's still set at around 150 W, which is way too low for such a card and might cause Boost to reduce clocks earlier to keep power draw below the limit."

Because of this, no matter what I do the card hits the "Pwr, VRel" perfcap shown in GPU-Z, and this happens even with my power target already set to 112%.

I'm using a custom fan curve. In Valley I can only see 2GHz for the first 5-10 seconds, then it immediately drops to 1950/70MHz or even lower. I tried 100% fan speed, and voltage is already set to +100, but the card just can't reach 2GHz stably. I set +50MHz on the GPU; anything higher than that, even a +10MHz increment, and Valley crashes. At this point I know I don't have the best GTX 1070 chip around here, but I still think this is a power issue.

Is there any SC user here who can't go past +50MHz like me?

-

Mobo: Z97E-ITX/ac / PSU: Corsair RMx 750w
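The power-target slider is just a percentage of the board's base power limit, so the headroom here can be worked out directly. A minimal sketch (the 150 W base limit is the figure from the TPU quote above; the helper name is made up for illustration):

```python
# Map a power-target percentage to an absolute board power budget.
BASE_POWER_LIMIT_W = 150.0  # EVGA 1070 SC board limit, per the TPU quote

def power_budget(base_w: float, target_pct: float) -> float:
    """Absolute power budget in watts for a given power-target slider value."""
    return base_w * target_pct / 100.0

# Even maxed at 112%, the budget is only 168 W, well under what many
# other custom 1070s allow, which is why Boost keeps hitting the "Pwr" perfcap.
for pct in (100, 112):
    print(f"{pct}% target -> {power_budget(BASE_POWER_LIMIT_W, pct):.0f} W")
```

On this model, the difference between a 150 W board and a card with a roomier limit is the whole story: the chip may be fine, but the budget runs out first.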


----------



## oblivious

Quote:


> Originally Posted by *Malinkadink*
> 
> I went from 1080p to 1440p, back to 1080p, now @ 4k, going back to 1080p 144hz tomorrow. This 4k monitor isn't mine, just using it till i fix my 1080p one. Why do you ask?
> 
> More on topic though, i'm returning my 1070 g1, and have an zotac amp extreme coming in tomorrow. Seems it stays a bit cooler, quieter, and most importantly the added 8 pin should let it hold higher boost clocks and generally be a bit more stable. I really loved my 970 g1 and its been a good 2 years with it, but gigabyte definitely cheaped out on the 1070 g1 a bit imho.


The reason I'm asking is that I posted about it in the general GPU section and no one has responded yet.

I'm in a weird position. Right now I have a Crossover 27Q monitor that only has DVI. I want to add a second monitor, and I'd like the resolutions to match, so my second one would have to be 1440p. For that I was planning to buy a 1070. My problem is that none of the new-gen cards I've seen have two DVI ports, so I need to buy something IPS and 1440p that has a DisplayPort and/or HDMI.

http://www.ebay.com/itm/111364284557?_trksid=p2060353.m1438.l2649&ssPageName=STRK%3AMEBIDX%3AIT
http://www.ebay.com/itm/141481152682?_trksid=p2060353.m1438.l2649&ssPageName=STRK%3AMEBIDX%3AIT
http://www.ebay.com/itm/222090464588?_trksid=p2060353.m1438.l2649&ssPageName=STRK%3AMEBIDX%3AIT

Those are the monitors i have my eye on since they have something other than DVI for inputs.

The reason I was asking about 1080p is that I was throwing around the idea of downgrading from 1440p to 1080p. I can get something from Newegg (Asus, Acer, Samsung) that will probably be cheaper than the Korean IPS and PLS screens. I really do like the colors of my Crossover monitor, but I could probably sacrifice that for an easier setup at 1080p. Also, if I went down to 1080p I might be fine with a 1060 instead of a 1070.


----------



## Malinkadink

Quote:


> Originally Posted by *oblivious*
> 
> The reason i'm asking is because i posted about it in the general GPU section and no one has still responded.
> 
> I'm in a weird position. Right now i have a Crossover 27Q monitor that only has DVI. I want to add a second monitor and i'd like for the resolutions to match so my 2nd would have to be 1440p. For that i was planning to buy a 1070. My problem is that non of the new gen cards that i've seen has two DVI ports. From this i need to buy something IPS and 1440p that has a DisplayPort and/or HDMI.
> 
> http://www.ebay.com/itm/111364284557?_trksid=p2060353.m1438.l2649&ssPageName=STRK%3AMEBIDX%3AIT
> http://www.ebay.com/itm/141481152682?_trksid=p2060353.m1438.l2649&ssPageName=STRK%3AMEBIDX%3AIT
> http://www.ebay.com/itm/222090464588?_trksid=p2060353.m1438.l2649&ssPageName=STRK%3AMEBIDX%3AIT
> 
> Those are the monitors i have my eye on since they have something other than DVI for inputs.
> 
> The reason i was asking about 1080p is I was throwing around the idea of downgrading from 1440p to 1080p. I can get something from Newegg (Asus, Acer, Samsung) that will probably be cheaper than the Korean IPS and PLS screens. I really do like the colors of my Crossover monitor but I could probably sacrifice that for an easier setup at 1080p. Also if i went down to 1080p i might be fine with a 1060 instead of a 1070.


http://www.ebay.com/itm/Pixio-PX277-27-inch-2560x1440-144Hz-AMD-FreeSync-WQHD-Gaming-PC-Monitor-/262508966762?hash=item3d1ec0f36a:g:Q4QAAOSwuzRXesKp

Pay a little extra for that and enjoy 144Hz 1440p. It has FreeSync if you ever went AMD, but I think that would be the best thing to do while not spending too much. Make it your main display and set the 27Q to the side.


----------



## Swolern

Quote:


> Originally Posted by *Symix*
> 
> I don't like how it looks at all. Don't care about OC either, I just want it to run quiet.
> 
> Maybe I'll go for the Palit GameRock. Any other recommendations?


Get a Strix. They are extremely quiet.
Quote:


> Originally Posted by *Malinkadink*
> 
> http://www.ebay.com/itm/Pixio-PX277-27-inch-2560x1440-144Hz-AMD-FreeSync-WQHD-Gaming-PC-Monitor-/262508966762?hash=item3d1ec0f36a:g:Q4QAAOSwuzRXesKp
> 
> Pay a little extra for that an enjoy 144hz 1440p, it has freesync if you ever went AMD, but i think that would be the best thing to do while not spending too much. Make it your main display and set the 27Q to the side.


+1 on the 144Hz refresh rate. Once you get a high refresh rate monitor it's hard to go back. I would try to find a cheap used G-Sync monitor though. There are a few in the marketplace here for sale.


----------



## mypickaxe

Quote:


> Originally Posted by *oblivious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Malinkadink*
> 
> I went from 1080p to 1440p, back to 1080p, now @ 4k, going back to 1080p 144hz tomorrow. This 4k monitor isn't mine, just using it till i fix my 1080p one. Why do you ask?
> 
> More on topic though, i'm returning my 1070 g1, and have an zotac amp extreme coming in tomorrow. Seems it stays a bit cooler, quieter, and most importantly the added 8 pin should let it hold higher boost clocks and generally be a bit more stable. I really loved my 970 g1 and its been a good 2 years with it, but gigabyte definitely cheaped out on the 1070 g1 a bit imho.
> 
> 
> 
> The reason i'm asking is because i posted about it in the general GPU section and no one has still responded.
> 
> I'm in a weird position. Right now i have a Crossover 27Q monitor that only has DVI. I want to add a second monitor and i'd like for the resolutions to match so my 2nd would have to be 1440p. For that i was planning to buy a 1070. My problem is that non of the new gen cards that i've seen has two DVI ports. From this i need to buy something IPS and 1440p that has a DisplayPort and/or HDMI.
> 
> http://www.ebay.com/itm/111364284557?_trksid=p2060353.m1438.l2649&ssPageName=STRK%3AMEBIDX%3AIT
> http://www.ebay.com/itm/141481152682?_trksid=p2060353.m1438.l2649&ssPageName=STRK%3AMEBIDX%3AIT
> http://www.ebay.com/itm/222090464588?_trksid=p2060353.m1438.l2649&ssPageName=STRK%3AMEBIDX%3AIT
> 
> Those are the monitors i have my eye on since they have something other than DVI for inputs.
> 
> The reason i was asking about 1080p is I was throwing around the idea of downgrading from 1440p to 1080p. I can get something from Newegg (Asus, Acer, Samsung) that will probably be cheaper than the Korean IPS and PLS screens. I really do like the colors of my Crossover monitor but I could probably sacrifice that for an easier setup at 1080p. Also if i went down to 1080p i might be fine with a 1060 instead of a 1070.
Click to expand...

Barring HDMI-to-DVI cables, how big is the 1080p monitor you're considering? At normal desk distance I wouldn't consider a 1080p monitor over 24"; the pixels would bother me. At a distance of 6 to 8 ft., however, a 43" 1080p LCD is just fine for gaming. I wouldn't use it for productivity apps from that distance, though.


----------



## Vaesauce

The Xtreme Gaming fans are superbly quiet.

Even at 75% I can't hear it. I hear my CPU/Case fans over them.


----------



## oblivious

Quote:


> Originally Posted by *Swolern*
> 
> Get a Strix. They are extremely quiet.
> +1 on the 144hz refresh rate. Once you get a high refresh rate monitor its hard to go back. I would try to find a cheap used Gsync monitor though. There are a few in the marketplace here for sale.


Man, that's expensive lol. I was looking at around 300 to spend, and that adds another 100 to the price of just the monitor. Is 144Hz that much better than 60Hz?


----------



## BulletSponge

Quote:


> Originally Posted by *oblivious*
> 
> Man, that's expensive lol. I was looking at around 300 to spend, and that adds another 100 to the price of just the monitor. Is 144Hz that much better than 60Hz?


To me, 96 is noticeably better than 60. I'll have an opinion on 144 vs 96 this time tomorrow.


----------



## cmpxchg8b

Stepped up my EVGA 970 SSC ACX 2.0 to 1070 ACX 3.0!




Nothing fancy, simple mid-range gaming machine. Playing WoW and DOOM, just in time for Legion pre-patch roll-out.


----------



## jlhawn

Why would my GPU usage, recorded with Afterburner, be very smooth and steady at 99% one night of gaming, and then the next night constantly jump up and down between 99% and 25%?
It does this playing the same game with the same scenes and things going on; it's American Truck Simulator, if that helps answer my question.
It doesn't affect gameplay either; my fps is the same and there's no stuttering.


----------



## Fearnot

Hi guys, new owner here with an Asus GTX 1070 Turbo; I finally replaced my CrossFire 7950s, and it's crazy how long those lasted. I can't really find any info out there on the Turbo's PCB, but does anyone know if any waterblocks are compatible with it?


----------



## Swolern

Quote:


> Originally Posted by *oblivious*
> 
> Man, that's expensive lol. I was looking at around 300 to spend, and that adds another 100 to the price of just the monitor. Is 144Hz that much better than 60Hz?


Definitely a big visual difference in games between 60fps @ 60Hz and 144fps @ 144Hz. The smoothness and clarity of graphical motion brings it closer to life-like, non-blurred motion. Besides looking great, it gives you an advantage in multiplayer games: it's easier to see enemies during fast motion, and input lag decreases for quicker reactions.


----------



## chrcoluk

Quote:


> Originally Posted by *HAL900*
> 
> 
> 
> 
> 
> Nope


Either a hacked BIOS showing wrong values or some kind of complete freak event; let's be realistic.

My figures:

My 970 boosted to 1320MHz without throttling at the stock TDP limit.
It boosted to 1468MHz without throttling or artifacts with a 240-watt TDP and higher voltages (heavily modded BIOS).
The 970's base speed is 1215MHz, so a 105MHz OC at stock TDP.

My 1070 boosts to 2038MHz out of the box from a stock speed of 1671MHz, almost 400MHz higher.


----------



## chrcoluk

Quote:


> Originally Posted by *Symix*
> 
> Nah, no size limits, no problem with 3-slot cards. What's the difference between the Palit GameRock and JetStream? They cost exactly the same at the same clocks.


gamerock premium has binned chips.

not sure about the two you posted tho.


----------



## chrcoluk

Quote:


> Originally Posted by *ikjadoon*
> 
> NVIDIA has just the most inane specifications. Your Core Clock is what it should never go below during gaming (i.e., the bare minimum). The Boost Clock is the "low average" of what you should get in-game. Every card, however, has an average in-game boost much higher than "Boost Clock" even *without* overclocking (as long as you have good cooling). So, from what I have seen...you can just turn the TDP limit up and the Temp limit up and the card will "auto OC" itself (by just boosting up very high, much higher than Boost Clock would suggest). However, like you noticed, NVIDIA doesn't boost much past 2050MHz on its own. If you want more, that's when you need to OC.
> So, wait. You've *directly* tested this? You've looked at your maximum boost clock at 45C versus 75C? It's identical? I think very few people have cards that are 1) at 100% load (to activate maximum boost) and 2) only at 45C. You need to test *that*.
> 
> What you (and JaysTwoCents) are talking about is traditional TDP / temperature throttling. That hasn't changed. Yes, at 100% TDP, you'll throttle. Yes, at 83C, you'll throttle.
> Great points. BF4 is notorious for "body-checking" overclocks. I had to go down 50MHz on my GTX 770 to get it stable in BF4 versus other stability tests / games. And true. Even with GPU Boost 3.0's additional throttling, at best, it's 65MHz. That's 3%! At 60FPS, that's barely 2FPS. 60FPS to 62FPS. Whoop de doo. Hahaha. But, technically, this is OCN, "the pursuit of performance".
> 
> 
> 
> 
> 
> 
> 
> 
> 1. Your first question has many answers and I'm unsure which throttling you're talking about. What is the temp limit set at?
> 
> 2. Are you adding more than 13MHz? Each frequency jump is always 13MHz apart. So, you need to try at least 2063MHz before it'll change, I think.


Whenever I game I am always testing.

The OSD is on pretty much all the time and my eyes keep wandering over to see what's going on.

So to verify:

100% load.
No throttling unless close to the TDP limit.
The cooler is a beast and voltages are low on Pascal; combined, these two things mean my temps typically don't go over 60C even in the summer we're in now, and the highest I have seen is 62C. But I don't mind forcing the fans to a low speed to see what happens when it goes over 70C.

On Jayz's video, other than drops of one speed bin, he only had throttling when within 5C of the temp limit or within 10% or so of the TDP. The claims you posted are that the card throttles when it's 40C or so below the temp target, which makes no sense at all.
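Those one-bin drops are easy to picture numerically. A rough sketch, assuming the ~13MHz step size mentioned in the quoted post (the snap-down behavior here is an illustrative simplification, not a verified driver detail):

```python
# Pascal boost clocks move in discrete steps ("bins") of roughly 13 MHz,
# so a requested offset only takes effect at a step boundary.
BIN_MHZ = 13  # approximate step size, per the quoted post

def effective_offset(requested_mhz: int) -> int:
    """Round a requested clock offset down to the nearest boost bin."""
    return (requested_mhz // BIN_MHZ) * BIN_MHZ

# A one-bin throttle near the TDP/temp limit costs only ~13 MHz, which is
# why the FPS impact discussed above is so small.
for req in (50, 63, 65):
    print(f"+{req} MHz requested -> +{effective_offset(req)} MHz effective")
```

This also matches the earlier advice that you may need to add at least a full bin (e.g. try 2063MHz rather than 2055MHz) before the applied clock visibly changes.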


----------



## 1337LutZ

Just built my new rig with an MSI 1070 8G Armor. It easily did 2050 boost  Really happy with it


----------



## Powergate

Quote:


> Originally Posted by *chrcoluk*
> 
> gamerock premium has binned chips.
> 
> not sure about the two you posted tho.


Just got my (non-Premium) GameRock and flashed the Premium BIOS:
http://www.3dmark.com/fs/9451663

Test with default BIOS:
http://www.3dmark.com/fs/9451832

I would guess they are just differentiated by BIOS rather than differently binned GPUs.


----------



## jkteddy77

Pretty disappointed with my 1070 Strix at the moment. I got the lower-clocked version, thinking I didn't need to pay Asus an extra $40 (which is now only $20 more...) to ship me the same GPU with higher stock clocks out of the box.

Well, it seems the slimy bastards bin their chips... and they do it well.

Not very happy with my results: at 100% voltage and 112% power, I can only manage 1818MHz stable, boosting to 1987MHz max, and the poor memory only overclocks to 9040MHz before flickering, artifacts, and crashes set in. Not a single MHz more can be added to either clock; that's its max...
That's slower than the stock clocks of the higher-end Strix... yet most people here can manage 2050 core and 9300 mem without having to touch voltage or power limit...
I am only using Asus's GPU Tweak II software. Do you think Afterburner would give me better results, or is a clock rate a clock rate no matter the software?
Asus sure got me good. I'm gaming in 4K; I need every frame I can get. Look how far I am from the 4K level xD
I lost the silicon lottery pretty badly; if there was ever a first loser, I think I took the cake. This is a very premium card, so for it to be performing this badly is a shocker.

Not even close to passing 20k


http://www.3dmark.com/3dm/13509510
Something I'm afraid of doing is flashing the 08G version's BIOS onto mine, but seeing as it can't even clock to match that card's OC mode on its own, I'd be in a heap of trouble if the performance didn't improve along with it.

Nonetheless, it's a fantastic card as it is, and one of the coolest: it never gets over 62C or 42% fan speed at the stock OC profile, it's dead silent with the fans under 60%, and the RGB lighting, very nice backplate, and overall aesthetic are among the best. I think the only better-looking card is the MSI line, but I don't like red xP
Just make sure to buy the 08G model if you want a Strix and have any inkling of overclocking...


----------



## whicker

Quote:


> Originally Posted by *jkteddy77*
> 
> Pretty disappointed with my 1070 Strix at the moment. I got the slower clocked version thinking I didn't need to pay Asus an extra $40 (which is now only $20 more...) to ship me the same GPU with higher stocks out of the box.
> 
> Well, seems the slimy bastards bin their chips...
> 
> Not very happy with my results, at 100% voltage, 112% power, I can only manage 1818mhz stable, boosts to 1987mhz max, and the poor memory only overclocks to 9040mhz before flickering, artifacts, and crashes set in. Not a single mhz more can be added to either clock, that's its max...
> That's slower than the stock clocks of the higher end Strix... Yet here most people can manage 2050core and 9300mem without having to touch voltage or power limit...
> Asus sure got me good. I'm gaming in 4k, I need every frame I can get :/
> I lost the silicon lottery pretty badly by my understandings.
> 
> Not even close to passing 20k
> 
> 
> http://www.3dmark.com/3dm/13509510


Yeah, Asus is binning hard. My non-OC Strix was pretty bad too; not only did it OC to the same clocks as yours, but it would bounce around by 100+MHz, from 1999 down to 1847MHz. I have an OC one on order though. I would have liked to go with someone else after that, but I want the 2 HDMI ports. You can always try the OC BIOS, although mine couldn't run the OC Strix BIOS at stock clocks without artifacting; at least the clock stayed stable.


----------



## jkteddy77

Quote:


> Originally Posted by *Malinkadink*
> 
> I went from 1080p to 1440p, back to 1080p, now @ 4k, going back to 1080p 144hz tomorrow. This 4k monitor isn't mine, just using it till i fix my 1080p one.


I find your setup interesting, kinda how mine has gone this past month xD
1080p + 900p monitor
1080p + 1080p144hz+ 900p portrait Monitor
1080p144hz + 27"4k + 900p portrait Monitor

Finally later this week:
27" 4k + Pivoting portrait 1080p monitor

144Hz just didn't do anything for me. I tried it and tested it thoroughly, but it just paled in comparison to 4K IPS for me.
Now if only AMD would release a massive Vega 11 card already so I could take advantage of 4k Freesync


----------



## Swolern

Quote:


> Originally Posted by *jkteddy77*
> 
> Pretty disappointed with my 1070 Strix at the moment. I got the slower clocked version thinking I didn't need to pay Asus an extra $40 (which is now only $20 more...) to ship me the same GPU with higher stocks out of the box.
> 
> Well, seems the slimy bastards bin their chips... and they do it well.
> 
> Not very happy with my results, at 100% voltage, 112% power, I can only manage 1818mhz stable, boosts to 1987mhz max, and the poor memory only overclocks to 9040mhz before flickering, artifacts, and crashes set in. Not a single mhz more can be added to either clock, that's its max...
> That's slower than the stock clocks of the higher end Strix... Yet here most people can manage 2050core and 9300mem without having to touch voltage or power limit...
> I am only using Asus's GPUTweak II software, do you think Afterburner would give me better results, or is a clockrate a clockrate no matter the software?
> Asus sure got me good. I'm gaming in 4k, I need every frame I can get. Look how far I am from the 4K level xD
> I lost the silicon lottery pretty badly, if there was ever a first loser. I think I took the cake. This is a very premium card, for it to be performing this badly is a shocker.
> 
> Not even close to passing 20k
> 
> 
> http://www.3dmark.com/3dm/13509510
> Something I'm afraid of doing is flashing the 08G version's Bios onto mine, but seeing as it can't even clock to match that card's OC mode on its own, I'd be in a heap of trouble if the performance didn't improve along with it.
> 
> Nonetheless, it's a fantastic card as it is, one of the coolest, never gets over 62C or over 42% fan speed at stock OC profile, it's dead silent if the fans are under 60%, and the RGB lighting and very nice backplate and card aesthetic is among the best. I think thy only better looking card is the MSI line, but I don't like red xP
> Just make sure to buy the 08G model if you want a Strix and have any inkling of overclocking...


Quote:


> Originally Posted by *whicker*
> 
> Yeah Asus Is binning hard. My non OC strix was pretty bad too, not only did it OC to the same clocks as you but it would bounce around by 100+mhz from 1999-1847mhz. Have an OC one on order though, I would have liked to go with someone else after that but I want the 2 HDMI ports. You can always try the oc Bios, although mine couldn't run the OC strix bios at stock clocks without artifacting but at least the clock stayed stable.


No, Asus does not bin their cards. It's all about the luck of the draw in the silicon lottery. One of my normal Strix 1070s OCs to 2050MHz, same as the OC version reviewed here. http://www.pcgameware.co.uk/reviews/graphics-cards/asus-geforce-gtx-1070-strix-oc-graphics-card-review/


----------



## jkteddy77

How does your other one fare? Not as well? Are good chips really that few and far between?


----------



## whicker

Quote:


> Originally Posted by *Swolern*
> 
> No Asus does not bin their cards. It's all about the luck of the draw in the silicon lottery. One of my normal Strix 1070 OCs to 2050mhz. Same as the OC version reviewed here. http://www.pcgameware.co.uk/reviews/graphics-cards/asus-geforce-gtx-1070-strix-oc-graphics-card-review/


If they weren't binning, they would not be able to guarantee the OC Strix at 1860MHz boost. My card can't do that, and I doubt jkteddy77's will be able to either. Maybe if you bought it day one, before they had sufficient stock, they were not binning; as of July, they definitely have been. Also, I doubt your non-OC Strix can hold that 2050 clock during a Time Spy run. It will drop to the 1800s or artifact like mine does.


----------



## 1337LutZ

My MSI ARMOR (non-OC version) goes to 2050 without any hassle


----------



## jkteddy77

Mine bounces like that too, even with the temp limit, voltage, power limit, and GPU usage all maxed...
I'm seriously considering returning it for the higher model, because I really do love this card's aesthetic. At least I'd get a second chance...
It doesn't seem like a huge deal, but if I can get 2-3 more fps out of one at 4K, that's a HUGE difference. That's the difference between high and ultra for most games in 4K.
Sadly, it seems Newegg only allows RMAs/replacements, not returns. Maybe I'll call them and see if I can get the higher model for the difference in price instead.
Only thing is, they're now out of stock and priced $100 higher than Amazon...


----------



## alex4069

I just got through playing Doom @ 4K, everything maxed + Nightmare, and was getting between 45 and 60 fps on my overclock: 4690K @ 4.6, GTX 1070 core @ 2050 and memory @ 2256. Temps: highest CPU core 61C, highest GPU 60C. GPU core load 100%, memory controller load 57%, CPU load 75.7%.


----------



## whicker

Quote:


> Originally Posted by *jkteddy77*
> 
> mine bounces like that too, even with the temp limit to the max, voltage to the max, power limit to the max, GPU usage to the max...
> Seriously considering returning it for the higher model, because I really do love this card's aesthetic. At least get a second chance...
> Doesn't seem like a huge deal, but If I can get 2-3 more fps out of one in 4k, that's a HUGE difference. That's the difference between high and ultra for most games in 4k.
> Sadly, it seems Newegg only allows RMA/Replacements, not returns. Maybe I'll call them, see if I can get the higher model for the difference in price instead.
> Only thing is now they're out of stock and priced $100 higher than Amazon...


I would try the Strix OC BIOS first. Maybe you will get lucky and it will run it without artifacting.

Asus_1070_Strix_OC_GP104_86.04.1E.00.21.zip 149k .zip file


http://www.overclock.net/t/1523391/easy-nvflash-guide-with-pictures-for-gtx-970-980/0_30

Let me know how it goes.


----------



## alex4069

I plan on ordering another Zotac GTX 1070 AMP Edition in the next couple of days, and I was wondering which HB SLI bridge to get. I don't want to mess up my look or put in a branded bridge from a company I don't have any components from. So should I just buy an NVIDIA HB SLI bridge? And would the NVIDIA HB bridges, since they were made for the FE versions, work with my two cards?


----------



## TheMiracle

Quote:


> Originally Posted by *jkteddy77*
> 
> I find your setup interesting , kinda how mine is going this past month xD
> 1080p + 900p monitor
> 1080p + 1080p144hz+ 900p portrait Monitor
> 1080p144hz + 27"4k + 900p portrait Monitor
> 
> Finally later this week:
> 27" 4k + Pivoting portrait 1080p monitor
> 
> 144hz just didn't do anything for me. I tried it, I tested it thoroughly, but it just was paled in comparison to 4k IPs for me.
> Now if only AMD would release a massive Vega 11 card already so I could take advantage of 4k Freesync


When you guys try a 27", 1440p, 144Hz, G-Sync monitor, you will want to throw all your other monitors out the window!!
I have one, and it is the best investment I ever made for my PC.


----------



## mypickaxe

Quote:


> Originally Posted by *chrcoluk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ikjadoon*
> 
> ...[/B].
> 
> ....
> 
> 
> 
> ...The claims you posted are that the card throttles when its like 40C or so below the temp target which makes no sense at all.
Click to expand...

It's because they definitely do: GPU Boost 3.0.

I've seen this on my second GPU in SLI when it hits 50 degrees during a benchmark run. The first GPU might be at 45, and when the second hits 50 it might drop a tick, which eventually causes the first to drop too, due to linked clocks.


----------



## Powergate

My (non-Premium) GameRock clocks stably to 1720MHz GPU and 2350MHz memory, which is in line with the GameRock Premium reviews.
So the binning on those models is most likely identical.


----------



## Raikiri

Quote:


> Originally Posted by *chrcoluk*
> 
> gamerock premium has binned chips.
> 
> not sure about the two you posted tho.


Mine is non premium, running it at 2088 boost, 9000 mem. Haven't tried any higher though.


----------



## jkteddy77

If after a month I can't swap it out for the higher-end model, I'll try a BIOS flash, but I can't risk losing my warranty at the moment.
I'm almost certain Newegg would let me upgrade to the higher-clocked version if I pay the difference once it comes back in stock for a reasonable price, as long as I explain the instability I'm experiencing with the lower model.


----------



## ogow89

Quote:


> Originally Posted by *jkteddy77*
> 
> If after a month I can't swap it out for the higher end model, I'll try a bios flash, but can't risk losing my warranty at the moment.
> Am almost certain Newegg would allow me to upgrade to the higher clocked version if I pay the difference once it comes back in stock for a reasonable price again, as long as I explain the instability I experience with the lower model.


Sell the card if you can't return it. A lot of people can't get their hands on one due to limited availability. You might lose 10 or so bucks, which shouldn't be too bad if you want to get one with better OC potential.

I've got store credit now for my crappy MSI Gaming X. I've placed an order for a Super JetStream, but they are out of them, so I've got to wait now


----------



## bigjdubb

Has anyone been able to crack open the BIOS and modify it yet? I guess what I'm really asking is: have the BIOS tweaking tools been updated to work on Pascal?


----------



## jlhawn

Quote:


> Originally Posted by *jlhawn*
> 
> why would my GPU usage, recorded with Afterburner, be very smooth and steady at 99% one night of gaming, and then constantly jump between 99% and 25% the next?
> It does this playing the same game with the same scenes and events going on; it's American Truck Simulator, if that helps answer my question.
> And it doesn't affect gameplay: my fps are the same and there's no stuttering.


wow, thanks for all the replies and help or suggestions on the problem I asked about.

Not one person on here could reply?
Guess I'll ask my question somewhere with more helpful users.


----------



## ogow89

Quote:


> Originally Posted by *jlhawn*
> 
> wow, thanks for all the replies and or some help or suggestions on my problem I asked about.
> 
> 
> 
> 
> 
> 
> 
> 
> not one person on here could reply?
> guess Ill ask my question somewhere with more helpful users.


If the fps is steady, then the GPU isn't fully stressed, so the usage varies with the load. Does the game have a vsync option? Uncheck it and see if the GPU usage goes to 99% and stays there the whole time. Also make sure you are not forcing adaptive vsync via the driver.


----------



## chrcoluk

A note regarding voltages and power limit.

Don't bump voltages unless you have a reason to. Bumping voltage increases the power needed for a given clock speed, which means you hit the TDP limit sooner, and that means lower clocks.

Voltage should only be bumped if you have instability at a certain speed and want to make it stable.

Bumping voltage will also raise temps.


----------



## mypickaxe

Quote:


> Originally Posted by *chrcoluk*
> 
> note regarding voltages and power level.
> 
> Dont bump voltages unless you have a reason to, bumping voltages increases power needed for a specific clock speed and means you will hit the TDP limit easier meaning lower clocks.
> 
> Voltages should only be bumped if you have instability at a certian speed and want to make it stable.
> 
> Bumping voltages will also raise temps.


In other news, a waterblock takes care of that, and my best Time Spy score was with the voltage slider maxed out. So, YMMV...


----------



## oblivious

Quote:


> Originally Posted by *Swolern*
> 
> Definitely a big visual gaming difference between 60fps @ 60hz vs 144fps @144hz. The amount of smoothness and clarity with graphical motion brings it closer to life-like, non-blurred motion. Besides looking great it will give you the advantage in multiplayer games which makes it easier to see enemies during fast motion, and decreases lag input times for a quicker reaction.


I guess I should let you guys know I'm not a competitive PC gamer. I don't really play first person shooters on PC. Don't know if that would impact my decision.


----------



## jkteddy77

True, now would be the optimal time while they're still in high demand and low stock.

I had to wait forever for even mine to come in stock.


----------



## jkteddy77

I play FPS games and I still prefer 4K 60fps over 144Hz, but I keep my side monitor at 1080p 144Hz.

I would say that even for non-competitive games, a 1440p 144Hz monitor paired with a 1070 is still the best. That, or look into an ultrawide 2560x1080 monitor; those are cheap on eBay, and you'd only need a 1060 for 2560x1080.
Witcher 3 in ultrawide or at 1440p 144Hz would be gorgeous.


----------



## Lineswithrobfor

Anyone know: if your OC settings crash in 3DMark Fire Strike but are stable in Unigine Heaven,

does that mean they're stable or unstable?

Should I use a third benchmark to check?


----------



## mkmitch

I just ordered an EVGA SC; it and the FTW are in stock at Newegg. $349 for the card, which is less than I have paid for cards in the past.


----------



## prey1337

Quote:


> Originally Posted by *Lineswithrobfor*
> 
> Anyone know if your OC setting crash in 3dmark firestrike, but stable in Unigine Heaven
> 
> Does that mean it's stable or unstable?
> 
> Should i use a 3rd benchmark to check ?


That means it's unstable in Fire Strike and stable in Heaven, and most likely unstable in games, depending on the title.

As mentioned earlier in this thread, a benchmark run that passes is "stable" only for the purpose of scoring higher; it could fail one run and pass the next.

As far as stability for gaming goes, you just have to test each game that interests you and see what settings work.


----------



## Lineswithrobfor

Quote:


> Originally Posted by *prey1337*
> 
> That means it's unstable in Firestrike, and stable in Heaven. And most likely unstable in games, depending on the title.
> 
> As mentioned earlier in this thread, a benchmark run that passes is "stable" only for the purpose of scoring higher, it could fail one run but work the next.
> 
> As far as stable for gaming, you just have to test each game that interests you and see what settings work.


Ok ty


----------



## Bugses

Is there a fix for the stutter problem? I just had my FPS drop from 120-ish to around 10 in BF4.


----------



## Ranguvar

Quote:


> Originally Posted by *Bugses*
> 
> Is there a fix to the stutter problem? I just had my FPS drop from 120ish to around 10 in BF4.


Are you on 368.95?


----------



## Powergate

Finally figured out how this boost mechanism is working:

Code:

(GPU Clock) -> (Max. Boost) @ (Voltage) : (Temperature)

1737MHz -> 2062MHz @ 1050mV : 60°C

1732MHz -> 2050MHz @ 1043mV : 60°C

1727MHz -> 2050MHz @ 1050mV : 60°C

1722MHz -> 2037MHz @ 1043mV : 60°C

1717MHz -> 2037MHz @ 1050mV : 60°C

So the actual boost range is around ~320MHz and is stepped in 13MHz increments, each with two voltage levels.

The boost offset probably differs per card and is affected by temperature as well.

Edit:
Tested with the power limit as the limiter instead of VRel (reliability voltage):

Code:

(GPU Clock) -> (Max. Boost) @ (Voltage) : (Temperature) | (Power Limit)

1737MHz -> 1987MHz @ 993mV : 70°C | 101% TDP

1732MHz -> 1974MHz @ 993mV : 70°C | 102% TDP

1727MHz -> 1961MHz @ 981mV : 70°C | 101% TDP

1722MHz -> 1949MHz @ 981mV : 71°C | 101% TDP

1717MHz -> 1936MHz @ 975mV : 71°C | 101% TDP
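Assuming the stepping in the tables above holds generally, the bin behaviour can be sketched in a few lines of Python. The 1506MHz reference base clock is an assumption for illustration, not a measured value from these cards:

```python
# Sketch: snap a clock to GPU Boost 3.0's ~13 MHz bins, based on the
# stepping observed in the tables above. The 1506 MHz reference base
# clock is an assumption for illustration only.
BIN_MHZ = 13
BASE_MHZ = 1506

def snap_to_bin(clock_mhz: int) -> int:
    """Round a clock down to the nearest 13 MHz bin above the base clock."""
    bins = (clock_mhz - BASE_MHZ) // BIN_MHZ
    return BASE_MHZ + bins * BIN_MHZ

# Consecutive boost clocks from the first table land one bin apart:
print(snap_to_bin(2050) - snap_to_bin(2037))  # prints 13
```

This is why two adjacent offsets can land on the same final clock: anything inside the same 13MHz window snaps to one bin.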


----------



## Amph

I noticed that one of my G1 Gaming 1070s doesn't cycle its colors like the other one does, and it also stays at a lower OC (-50MHz). Is it defective?


----------



## SmackHisFace

Hey, I know this is the wrong place, but I just got a GTX 1060 and the voltage is not adjustable, nor can I monitor the voltages. Is this how GPU Boost 3.0 works, or is this card voltage-locked? Sorry again, but I don't see a GTX 1060 owners club. Got to about a 2050 boost clock so far and want to push this thing further.


----------



## chrcoluk

Quote:


> Originally Posted by *Powergate*
> 
> Finally figured out how this boost mechanism is working:
> 
> (GPU Clock) -> (Max. Boost) @ (Voltage) : (Temperature)
> 
> 1737MHz -> 2062MHz @ 1050mV : 60°C
> 
> 1732MHz -> 2050MHz @ 1043mV : 60°C
> 
> 1727MHz -> 2050MHz @ 1050mV : 60°C
> 
> 1722MHz -> 2037MHz @ 1043mV : 60°C
> 
> 1717MHz -> 2037MHz @ 1050mV : 60°C
> 
> So, the actual boost is around ~320MHz and is separated by 13MHz increments with two voltage levels.
> 
> The boost offset probably differs on each card and is also affected by temperature as well.


Where is the all-important power utilisation stat? And yes, on AIB cards it's way more important than temperature, given that practically no AIB card even gets close to its temp target but all of them hit the power cap.


----------



## chrcoluk

Quote:


> Originally Posted by *SmackHisFace*
> 
> Hey so I know this is the wrong place but I just got a GTX 1060 and the voltage is not adjustable, nor can i monitor the voltages. Is this how GPU boost 3.0 works or is this card voltage locked? Sorry again but I dont see a GTX 1060 owners club. Got to about 2050 boost clock so far and want to push this thing further.


Leave voltage alone unless you have stability issues; to max out your clock, push the power limit to max. Also try a +25 or +50 increment on the core clock.
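That incremental approach (max power limit, then step the core offset up) amounts to a simple search loop. Here's a sketch of the idea; `run_stress_test` is a hypothetical stand-in for an actual Heaven/Valley pass, and the +150 failure point is invented for illustration:

```python
# Sketch: step the core offset up in +25 MHz increments and back off
# at the first failed stress run. run_stress_test is a hypothetical
# placeholder for a real benchmark loop (Heaven, Valley, etc.).
def run_stress_test(offset_mhz: int) -> bool:
    """Placeholder: pretend instability starts past +150 MHz."""
    return offset_mhz <= 150

def find_max_offset(step: int = 25, ceiling: int = 300) -> int:
    best = 0
    for offset in range(step, ceiling + 1, step):
        if not run_stress_test(offset):
            break  # first failure: keep the last good offset
        best = offset
    return best

print(find_max_offset())  # prints 150 with the placeholder above
```

In practice you'd run a real benchmark at each step and drop back one increment (or add a little voltage) once artifacts or crashes appear.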


----------



## Bugses

Quote:


> Originally Posted by *Ranguvar*
> 
> Are you on 368.95?


No, 368.81. Shouldn't GeForce Experience warn me about new drivers?


----------



## SmackHisFace

Quote:


> Originally Posted by *chrcoluk*
> 
> leave voltage alone unless you have stability issues, to max out your clock push power limit to max. Also try a +25 or +50 increment on core clock.


Well, I want to go further, so I need to adjust voltages. I already have the power limit at max, but I'm not hitting the power limit. I need more voltage.

Are there any good Pascal OC tutorials? I'm using the EVGA scanner to adjust offsets at individual voltages, but sometimes the card sits at 1.030V instead of 1.07V+ even though the power limit is well under the max of 116%. I really have no idea how this scanner works. Here is a picture of where I'm at. Currently the card boosts to 2113 at 1.05V or greater, but for some reason it drops to 1.03V and downclocks to 2025 as a result. Does the little voltage (lightning bolt) symbol on the left do anything? I can't tell. And finally, why can't I hit "run" and let Precision X automatically find the best V/F curve?


----------



## StarGazerLeon

Quote:


> Originally Posted by *Bugses*
> 
> No 368.81. Shouldnt GeForce Experience warn me about new drivers?


Nvidia just released a hotfix driver (368.95) that addresses some of the stuttering caused by high DPC latency spikes that were present with the previous drivers, so it might be worth installing 368.95 to see if it remedies your issue. YMMV of course; there are many things that could be causing your framerate dips.


----------



## paulclift

Got the SeaHawk X being delivered tomorrow. Cannot wait. Upgrading from 970 SC.


----------



## Forceman

Quote:


> Originally Posted by *Amph*
> 
> i noticed that one of my g1 gaming 1070 does not cycle the color, like the other, and it also stay at a lower oc(-50MH) is it deflective?


Install the Gigabyte utility and check it. Sounds like one is in OC mode and the other isn't (that's the 50 MHz difference). You can also change the color settings there. There's a toggle for color cycle.


----------



## HAL900

Quote:


> Originally Posted by *StarGazerLeon*
> 
> Nvidia just released a hotfix driver (368.95) that addresses some of the stuttering caused by high DPC latency spikes that were present with the previous drivers, so it might be worth installing 368.95 to see if it remedies your issue. YMMV of course; there are many things that could be causing your framerate dips.


This driver is so slow


----------



## alex4069

Should have my second GTX 1070 AMP on Wednesday and should have my nvidia HB SLI bridge. I will post results then.


----------



## chrcoluk

Quote:


> Originally Posted by *SmackHisFace*
> 
> Well I want to go further so I need to adjust voltages. I already have the power limit to max but Im not hitting the power limit. I need more voltage.
> 
> Are there any good pascal OC tutorials? Im using the EVGA scanner to adjust offsets at individual voltages but sometimes the card is sitting at 1.030v instead of like 1.07+ even though power limit is well under the max of 116%. I really have no idea how this scanner works. Here is a picture of where Im at. Currently the card is boosting to 2113 at 1.05v or greater but for some reason the card is going down to 1.03v and downclocking to 2025 as a result. Does the little voltage (lightning bolt symbol) on the left do anything? I cant tell? And finally why cant I hit "run" and let pres X automatically find the best VF curve.


What is "well under"?

I have yet to see a Pascal card not TDP-throttle.

Bumping the voltage will make a speed bin more stable; it will not move you up to a higher speed bin.

It may actually lower your speed, though, since your power usage goes up and you get more TDP throttling.

I see you're on a 1060; care to post *all* your data with a game running? Make sure you post the PWR utilisation, please.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *StarGazerLeon*
> 
> Nvidia just released a hotfix driver (368.95) that addresses some of the stuttering caused by high DPC latency spikes that were present with the previous drivers, so it might be worth installing 368.95 to see if it remedies your issue. YMMV of course; there are many things that could be causing your framerate dips.


368.95 fixed Doom 2016 from lagging so badly it was unplayable... FPS has jumped from a mere 20-50 to a whopping 144+ FPS with G-Sync. TANKS!

Note: I updated over the existing 368.81, which had been installed cleanly. Funny thing is, for the first time I used the Express install instead of the Custom install. Didn't even have to restart; it kept all my settings and saved a bunch of time. (Maybe the Express install worked because both drivers were 368.x.)

(Repped you faster than my new FPS in Doom, lol.)

Tanks!


----------



## Marctraider

So I've finally decided to ditch my old GTX 670 in favor of a

Gigabyte GTX 1070 Xtreme.

There is a small (?) problem: the card seems to run at 2050+MHz under full load, as shown by GPU-Z,
yet the card's boost clock is advertised at around ~1800MHz.

Is this normal? I just removed and reinstalled the newest Nvidia driver after the card swap, and I have not yet attempted to overclock it.

Other than that, the performance of this card with my new XB321HK 4K screen exceeds all expectations!!

I'm rather new to Pascal, so please be kind.


----------



## Powergate

Quote:


> Originally Posted by *chrcoluk*
> 
> where is the all important power utilisation stat? and yes on AIB cards its waaaaay more important than temperature given that pretty much no AIB card even gets close to temp target but all of them hit power cap.


It was limited by VRel (reliability voltage), not the power limit.

Now tested with the power limit instead:

(GPU Clock) -> (Max. Boost) @ (Voltage) : (Temperature) | (Power Limit)

1737MHz -> 1987MHz @ 993mV : 70°C | 101% TDP

1732MHz -> 1974MHz @ 993mV : 70°C | 102% TDP

1727MHz -> 1961MHz @ 981mV : 70°C | 101% TDP

1722MHz -> 1949MHz @ 981mV : 71°C | 101% TDP

1717MHz -> 1936MHz @ 975mV : 71°C | 101% TDP


----------



## ogow89

Quote:


> Originally Posted by *Marctraider*
> 
> So I've finally decided to ditch my old 670 GTX in favor of a
> 
> Gigabyte 1070 GTX Xtreme.
> 
> There is a small ?problem? as the card seems to run at like 2050+mhz as shown by GPU-z on full load,
> yet the cards boost clock is advertised around 1800~mhz.
> 
> Is this normal? Ive just removed and reinstalled the newest Nvidia driver after card swap, and I have not yet attempted to overclock it.
> 
> Im rather new to Pascal so please be kind


Show-off.

"Oh, my card boosts to 2+GHz, but I thought it would only do 1.7GHz. How come?"


----------



## SmackHisFace

Quote:


> Originally Posted by *chrcoluk*
> 
> what is well under?
> 
> I have yet to see a pascal card not TDP throttle.
> 
> Bumping the voltage will allow a speed bin to be more stable, it will not make you go up to a higher speed bin.
> 
> It may lower your speed tho as it means your power usage goes up and more TDP throttling.
> 
> I see you on a 1060, care to post "all" your data with game running? Make sure you post PWR utilisation please.


It is well under TDP. This is a 1060, not a 1070, and the power limit is at 116%. According to GPU-Z the reason is VRel (voltage reliability); I'm not getting over, say, 105% TDP in the Valley benchmark.


----------



## Marctraider

Any suggestions?


----------



## BulletSponge

Quote:


> Originally Posted by *Marctraider*
> 
> 
> 
> Any suggestions?


Yes, enjoy your card.


----------



## Vaesauce

Quote:


> Originally Posted by *Marctraider*
> 
> So I've finally decided to ditch my old 670 GTX in favor of a
> 
> Gigabyte 1070 GTX Xtreme.
> 
> There is a small ?problem? as the card seems to run at like 2050+mhz as shown by GPU-z on full load,
> yet the cards boost clock is advertised around 1800~mhz.
> 
> Is this normal? Ive just removed and reinstalled the newest Nvidia driver after card swap, and I have not yet attempted to overclock it.
> 
> Other than that, The performance of this card with my new XB321HK 4K screen exceeds all expectations!!
> 
> Im rather new to Pascal so please be kind


It's the BIOS that comes with the card. I have an Xtreme Gaming one as well, and it's the same for me.

No need to OC the core unless you're aiming for 2100+, lol. Just OC the memory and call it a day.


----------



## Marctraider

Ah, OK, thanks! I will certainly enjoy this card ^-^

Memory overclocked a little, from 2041MHz to 2300MHz. Not bad.


----------



## Forceman

Quote:


> Originally Posted by *Marctraider*
> 
> 
> 
> Any suggestions?


Suggestions on what?
Quote:


> Originally Posted by *Marctraider*
> 
> Ah Ok thanks! I certainly will enjoy this card ^-^
> 
> Memory overclocked a little bit from 2041mhz to 2300mhz. Not bad.


You can probably push the memory farther - most seem to do about +500.


----------



## Powergate

Alright, it seems there is indeed some sort of binning, separated into three grades:

Grade 1: 203MHz boost (higher OC)
Grade 2: 190MHz boost (simple OC)
Grade 3: 177MHz boost (default)

Code:

Palit GeForce GTX 1070 Founders Edition:
1506MHz -> 1683MHz = 177MHz

Palit GeForce GTX 1070 Gamerock:
1556MHz -> 1746MHz = 190MHz

Palit GeForce GTX 1070 Gamerock Premium:
1670MHz -> 1873MHz = 203MHz

Palit GeForce GTX 1070 JetStream:
1506MHz -> 1683MHz = 177MHz

Palit GeForce GTX 1070 Super JetStream:
1632MHz -> 1835MHz = 203MHz

Gainward GeForce GTX 1070 Phoenix:
1506MHz -> 1683MHz = 177MHz

Gainward GeForce GTX 1070 Phoenix Golden Sample:
1632MHz -> 1835MHz = 203MHz

Gainward GeForce GTX 1070 Phoenix GLH:
1670MHz -> 1873MHz = 203MHz

Gigabyte GeForce GTX 1070 WindForce OC:
1582MHz -> 1771MHz = 190MHz

Gigabyte GeForce GTX 1070 G1 Gaming:
1620MHz -> 1822MHz = 203MHz

Gigabyte GeForce GTX 1070 Xtreme Gaming:
1695MHz -> 1898MHz = 203MHz

ASUS ROG Strix GeForce GTX 1070:
1506MHz -> 1683MHz = 177MHz

ASUS ROG Strix GeForce GTX 1070 OC:
1657MHz -> 1860MHz = 203MHz

MSI GeForce GTX 1070 Aero 8G:
1506MHz -> 1683MHz = 177MHz

MSI GeForce GTX 1070 Aero 8G OC:
1531MHz -> 1721MHz = 190MHz

MSI GeForce GTX 1070 Armor 8G:
1506MHz -> 1683MHz = 177MHz

MSI GeForce GTX 1070 Armor 8G OC:
1556MHz -> 1746MHz = 190MHz

MSI GeForce GTX 1070 Gaming 8G
1531MHz -> 1721MHz = 190MHz

MSI GeForce GTX 1070 Gaming X 8G:
1607MHz -> 1797MHz = 190MHz

MSI GeForce GTX 1070 Gaming Z 8G:
1657MHz -> 1860MHz = 203MHz
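The grades fall straight out of the base/boost pairs: the factory boost offset is just boost minus base. A quick sketch using a few figures copied from the list above (the grade labels follow the three tiers described in this post):

```python
# Sketch: derive each card's quality grade from its base/boost clock
# pair. Figures are copied from the list above; the grade mapping
# follows the three tiers described in this post.
CARDS = {
    "Palit GTX 1070 Founders Edition": (1506, 1683),
    "Palit GTX 1070 Gamerock":         (1556, 1746),
    "Gigabyte GTX 1070 Xtreme Gaming": (1695, 1898),
}

GRADES = {
    203: "grade 1 (higher OC)",
    190: "grade 2 (simple OC)",
    177: "grade 3 (default)",
}

for name, (base, boost) in CARDS.items():
    offset = boost - base
    print(f"{name}: +{offset} MHz -> {GRADES.get(offset, 'unknown')}")
```

Every card in the list lands on exactly one of the three offsets, which is what suggests deliberate binning rather than coincidence.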


----------



## Amph

Quote:


> Originally Posted by *Forceman*
> 
> Install the Gigabyte utility and check it. Sounds like one is in OC mode and the other isn't (that's the 50 MHz difference). You can also change the color settings there. There's a toggle for color cycle.


That's what I did already. I was able to fix the color, but the OC is impossible to fix; it always stays at -60MHz. Seems like a hard-coded modification in the BIOS or something else.


----------



## Majentrix

My 1070's fans don't even spin up when playing WoW - Amazing!

I remember the days when I could barely run the game with my fans screaming at 100%. We've come so far.


----------



## GTRtank

Just watched this on youtube, seems pretty legit!


----------



## chrcoluk

Quote:


> Originally Posted by *Powergate*
> 
> It was limited by VRel (reliability voltage) not power limit.
> 
> Now tested with power limit instead:
> 
> (GPU Clock) -> (Max. Boost) @ (Voltage) : (Temperature) | (Power Limit)
> 
> 1737MHz -> 1987MHz @ 993mV : 70°C | 101% TDP
> 
> 1732MHz -> 1974MHz @ 993mV : 70°C | 102% TDP
> 
> 1727MHz -> 1961MHz @ 981mV : 70°C | 101% TDP
> 
> 1722MHz -> 1949MHz @ 981mV : 71°C | 101% TDP
> 
> 1717MHz -> 1936MHz @ 975mV : 71°C | 101% TDP


100% TDP is close to the limit, so as expected you misrepresented the facts. It will throttle at that TDP usage. You can probably only be confident TDP is not throttling when it is below 85% (at a 114% limit) or below 70% (at a 100% limit). Remember the card calculates TDP usage far faster than the OSD refreshes the value; the OSD/GPU-Z will only report an average over its refresh interval, so if 101% is reported it probably spiked to the limit.


----------



## chrcoluk

Quote:


> Originally Posted by *SmackHisFace*
> 
> ]
> 
> It is well under TDP. This is a 1060 not a 1070 and the power limit is at 116%. According to GPUZ the reason is Vrel, voltage reliability, Im not getting over say 105% TDP in Valley benchmark.


105% is not well under. VRel is caused by insufficient power: as the card hits the TDP limit, it cannot reliably supply the voltage, so it throttles.

Actual power usage fluctuates rapidly, far more rapidly than you see in GPU-Z or an OSD, unlike temperature, which moves steadily in one direction. The 105% is just an average over the refresh interval, and a peak within that interval 10% higher is more than likely.

Check PCPer's investigation into the AMD power issues; you can see on their power-draw graphs how rapidly power fluctuates (yes, they tested Nvidia cards too).
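A toy simulation makes the averaging point concrete. All the numbers here are invented for illustration; the point is only that a slow averaged readout hides fast spikes that hit the cap:

```python
# Sketch: why an averaged OSD/GPU-Z readout can hide TDP spikes.
# We simulate a fast power trace that mostly sits below the limit but
# spikes briefly; one slow OSD refresh reports only the average.
# All numbers are invented for illustration.
import random

random.seed(0)
LIMIT = 100.0  # % TDP

# 1000 fast samples: ~99% baseline, clipped at a 112% excursion ceiling
trace = [min(LIMIT + 12, random.gauss(99, 4)) for _ in range(1000)]

osd_readout = sum(trace) / len(trace)  # what one OSD refresh shows
peak = max(trace)                      # what the card actually saw

print(f"OSD shows {osd_readout:.1f}% TDP, actual peak {peak:.1f}%")
```

The averaged figure sits near the baseline even though individual samples exceeded the limit, which is exactly why a reported "101%" usually means the card was bouncing off the cap.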


----------



## chrcoluk

Quote:


> Originally Posted by *Powergate*
> 
> Alright, seems like there is indeed some sort of binning which is separated in three grades:
> 
> 1. quality grade: 203MHz boost (higher OC)
> 2. quality grade: 190MHz boost (simple OC)
> 3. quality grade: 177MHz boost (default)
> 
> Code:
> 
> 
> 
> Code:
> 
> 
> Palit GeForce GTX 1070 Founders Edition:
> 1506MHz -> 1683MHz = 177MHz
> 
> Palit GeForce GTX 1070 Gamerock:
> 1556MHz -> 1746MHz = 190MHz
> 
> Palit GeForce GTX 1070 Gamerock Premium:
> 1670MHz -> 1873MHz = 203MHz
> 
> Palit GeForce GTX 1070 JetStream:
> 1506MHz -> 1683MHz = 177MHz
> 
> Palit GeForce GTX 1070 Super JetStream:
> 1632MHz -> 1835MHz = 203MHz


Did you manage to get my BIOS from the vgabios site, then?


----------



## chrcoluk

Quote:


> Originally Posted by *GTRtank*
> 
> Just watched this on youtube, seems pretty legit!


It is.

TDP is the main issue on these cards, not temps.

Of course, modding a BIOS is much safer than hard-modding.


----------



## chrcoluk

My BIOS is here:

https://www.techpowerup.com/vgabios/184789/184789


----------



## GTRtank

Quote:


> Originally Posted by *chrcoluk*
> 
> it is
> 
> 
> 
> 
> 
> 
> 
> TDP is the main issue on these cards not temps
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Of course modding a bios is much safer than hard modding


Yup, which is why this is perfect: it's not permanent and it raises the TDP. I look forward to seeing how many people do this! Of course it will be easier when the BIOS editor comes out, but for now, not a bad way to do it.


----------



## StarGazerLeon

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> 368.95 fixed Doom 2016 from lagging so bad it was unplayable... FPS has jumped from a mere 20-50 to whopping 144+FPS on G-sync. TANKS!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> note: updated the existing 368.81 that were installed cleanly. and funny thing is that for 1st time i used the Express install versus the Custom install. didn't even have to restart. it kept all settings and saved a bunch of time. (Maybe "Express install" worked because both drivers were "368.x".)
> 
> (repped u faster that my new FPS in Doom. lol
> 
> 
> 
> 
> 
> 
> 
> )
> 
> Tanks!


Not at all, my friend. Glad the new Doom is running better for you.


----------



## paulclift

Love it.

I get 43°C under load in The Division, with around 26°C ambient.

Shame the tubes are quite short. I'm going to have to put a hole in the side window to mount it properly.


----------



## Powergate

Quote:


> Originally Posted by *chrcoluk*
> 
> 100% TDP is close to the limit, so as expected you misrepresented the facts. It will throttle at that TDP usage. You can probably only be confident TDP is not throttling when it is below 85%. (at 114% limit) or below 70% (at 100% limit).


Quote:


> Originally Posted by *Powergate*
> 
> *It was limited by VRel (reliability voltage) not power limit.*


-> http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/1300_50#post_25372787









Quote:


> Originally Posted by *chrcoluk*
> 
> VREL is caused by insufficient power, as the card is hitting TDP limit, it cannot reliably supply the voltage so throttles.


VRel means it can't clock higher without raising the voltage; VRel != TDP limit.
Quote:


> "Pwr" means highest Power Target (TDP) has been reached, ie. Your GPU is now using designed "250W". It could go higher (temps and voltage allow it), but it would hog up over "250W".
> "VRel"/"VOp" are indicating that card could go higher (it's got power/TDP and temperature room for it), but voltage reached highest possible level and going higher would mean overvoltage on GPU Core (and Turbo Boost 2.0 technology can't do that, when GPU vCore limit is in place).


----------



## Noirgheos

I assume, since you guys are posting here and not in the DPC thread, that the majority of you are unaffected?


----------



## SuperZan

I haven't been affected by it (fortunately).


----------



## Noirgheos

I assume, since you guys are posting here and not in the DPC thread, that the majority of you are unaffected?
Quote:


> Originally Posted by *SuperZan*
> 
> I haven't been affected by it (fortunately).


Because I've been doing some testing at the store I work at...

A 1500MHz 980 Ti more often than not beats a 2100MHz 1070 by 5-10 FPS, but I know the 1070 is technically more future-proof. The only thing stopping me from grabbing a nice MSI or EVGA 1070 is this DPC stuttering issue.


----------



## mypickaxe

Quote:


> Originally Posted by *Noirgheos*
> 
> I assume since you guys are posting here and not in the DPC thread, that majority of you are unnaffected?
> Quote:
> 
> 
> 
> Originally Posted by *SuperZan*
> 
> I haven't been affected by it (fortunately).
> 
> 
> 
> Because I've been doing some testing at the store I work at...
> 
> A 1500MHz 980 Ti more often that not beats a 2100MHz 1070 by like 5-10FPS, but I know the 1070 is technically more future proof. The only thing stopping me from grabbing a nice MSI or EVGA 1070 is this DPC stuttering issue.

I tested for it, seems fine. It's higher than some report after getting a fix, but mine is still below the minimum threshold for problem detection. It could be the 5930K + overclock + memory bandwidth is keeping it at bay.

Plus, there's no fix yet for SLI users, so I can't even be bothered with the existing hotfix.


----------



## Ranguvar

Quote:


> Originally Posted by *Noirgheos*
> 
> I assume since you guys are posting here and not in the DPC thread, that majority of you are unnaffected?
> Because I've been doing some testing at the store I work at...
> 
> A 1500MHz 980 Ti more often that not beats a 2100MHz 1070 by like 5-10FPS, but I know the 1070 is technically more future proof. The only thing stopping me from grabbing a nice MSI or EVGA 1070 is this DPC stuttering issue.


Wish I'd bought the $360 MSI Golden that was out about a month before I got my 1070 FTW. But full HEVC fixed-function decode is nice, and the 1070 gets a decent gain under Vulkan and DX12. A driver update should improve that soon.

Also hoping 1070s will fly, or at least go a little faster, once we open the voltage gates.

My DPC issue was resolved by the hotfix; I tested before and after.


----------



## Noirgheos

I assume, since you guys are posting here and not in the DPC thread, that the majority of you are unaffected?
Quote:


> Originally Posted by *SuperZan*
> 
> I haven't been affected by it (fortunately).


Because I've been doing some testing at the store I work at...

A 1500MHz 980 Ti more often than not beats a 2100MHz 1070 by 5-10 FPS, but I know the 1070 is technically more future-proof. The only thing stopping me from grabbing a nice MSI or EVGA 1070
Quote:


> Originally Posted by *Ranguvar*
> 
> Wish I bought the $360 MSI Golden that was out a ~month before I got my 1070 FTW. But full HEVC fixed function decode is nice, and the 1070 gets a decent gain under Vulkan and DX12. A driver soon should improve that.
> 
> Also hoping 1070s will fly, or at least, go a little faster, once we open the voltage gates.
> 
> My DPC issue was resolved by the hotfix, I tested before and after.


We'll see. Sales are refreshing tomorrow here in Canada; I'll order whichever is cheaper, the 1070 or the 980 Ti.


----------



## ricko99

Planning to get a 1070 to replace my 970 for a 1440p 144Hz display. Which AIB card is the best in terms of cooling performance? I'm currently torn between the Zotac AMP Extreme for its beefy cooler and the ASUS Strix for its two-slot design. But I've had bad experiences with ASUS cards in the past (three different cards: HD 6870, GTX 680 DCII, R9 280X DCII, all died within a year without any OC), so I'm wondering how the overall quality is for ASUS this time around.


----------



## chrcoluk

Quote:


> Originally Posted by *Noirgheos*
> 
> I assume since you guys are posting here and not in the DPC thread, that majority of you are unaffected?


I had the idle DPC issue, as did most owners, but for me it didn't actually affect anything; I had no stutters in games, videos, etc.

Also, to Powergate: I think we will have to agree to disagree. While VRel should mean what you said, my past experience has shown it can also appear when the TDP limit is being hit. The reason is that TDP can prevent the voltage from being raised, which means VRel can show up indirectly as a result of TDP. I know this because when I modded my older 970 BIOS, I had set 1.200 VDDC for some lower speed bins but they couldn't be reached, with reason VRel; I then made a newer BIOS with raised TDP (nothing else changed) and the 1.200 started working.


----------



## ITAngel

Quote:


> Originally Posted by *ricko99*
> 
> Planning to get a 1070 to replace my 970 for 1440p 144hz display. Which AIB card is the best in terms of cooling performance? I'm currently torn between Zotac AMP Extreme for its beefy cooler and ASUS Strix for its 2 slot design. But I've had bad experiences in the past with ASUS cards ( 3 different cards, HD6870, GTX 680 DCII, R9 280x DCII all died within a year without any OC) so I'm wondering how is the overall quality for ASUS this time around.


I ordered the Zotac GTX 1070 AMP Extreme; it seems to be an amazing card. Took that over dual RX 480 AIB cards; I even took it over the 1080 GPUs. I can provide any information if you need it. It's coming today.


----------



## Killmassacre

Quote:


> Originally Posted by *ricko99*
> 
> Planning to get a 1070 to replace my 970 for 1440p 144hz display. Which AIB card is the best in terms of cooling performance? I'm currently torn between Zotac AMP Extreme for its beefy cooler and ASUS Strix for its 2 slot design. But I've had bad experiences in the past with ASUS cards ( 3 different cards, HD6870, GTX 680 DCII, R9 280x DCII all died within a year without any OC) so I'm wondering how is the overall quality for ASUS this time around.


Although I don't know anything about the Zotac AMP Extreme, I have tried the EVGA SC 1070, the ASUS Strix OC 1070, and now the MSI Gaming X 1070, which I currently own. In terms of cooling I found the MSI Gaming X to be the best, maxing out at 71C in my rig; the EVGA SC was second at 73C, and the ASUS Strix OC was the worst, reaching 76C.

In terms of noise the MSI Gaming X was also significantly quieter, whereas the EVGA SC and ASUS Strix were similar. The Gaming X and Strix OC both seemed to OC the same amount, with both reaching a stock boost of 1936MHz core during extended gaming and both OCing over 2050MHz (I didn't try going past this). The EVGA SC was not as good, however; it fluctuated between 1860-1920MHz and couldn't reach a 2050MHz core clock. In terms of build quality the EVGA SC seemed the best, with the ASUS and MSI tied for second.

I was a little disappointed with the ASUS Strix OC 1070, since I upgraded from an ASUS Strix 970 and assumed it would be equal in cooling, noise, and build quality. However, it was a slight downgrade on all fronts except performance and the RGB lighting. The MSI Gaming X 1070, though, is easily just as quiet, and its temps are nearly as good as my old ASUS Strix 970's (which maxed at 69C in my rig).


----------



## madmeatballs

Quote:


> Originally Posted by *ITAngel*
> 
> I order the zotac gtx 1070 amp extreme seems to be an amazing card. Took that over dual rx 480 AIB cards. I even took it over the 1080 gpus.


I got the same card too; pretty satisfied with it except for the sag. What did you do about it?


----------



## ITAngel

Quote:


> Originally Posted by *madmeatballs*
> 
> Me too got the same card, pretty satisfied with it except for the sag. What did you do to it?


I won't have to do anything to it. I own a Thermaltake Core X9 case, so the motherboard is horizontal rather than vertical. However, I did once own a Devil 13 290X2 GPU, and it came with an adjustable support pole. Maybe you can make something like that, or use some cable straps? I read some people did that. By the way, my card will be here today (Monday).


----------



## Majentrix

You can use fishing line to tie the video card to the top of the case. Or, if your case allows it, zip-tie the end of the card to a tie-on point. At one point PowerColor sold that jack on its own; have a look around and see if you can find one. Or you can fab one yourself if you're so inclined.


----------



## headd

Quote:


> Originally Posted by *Killmassacre*
> 
> Although I don't know anything about the Zotac AMP Extreme, I have tried out the EVGA SC 1070, the Asus strix OC 1070, and now the MSI gaming X 1070 which i currently own. In terms of cooling I found the MSI gaming X to be the best where it maxed out at 71C in my rig, the EVGA SC was second maxing at 73C, and the Asus strix OC was the worst reaching 76C.
> 
> Also in terms of noise levels the MSI gaming X was also significantly quieter , where as the EVGA SC and Asus strix were similar. The Gaming X and Strix OC both seemed to OC the same amount, with both reaching a stock boost of 1936MHz core clock during extended gaming and both OCing over 2050Mhz (I didn't try going past this). The EVGA SC was not as good however, as it fluctuated between 1860-1920MHz and couldn't reach 2050Mhz core clock. In terms of build quality the EVGA SC seemed the best, and the Asus and MSI were tied for second.
> 
> I was a little disappointed with the Asus strix OC 1070 since I upgraded from an Asus strix 970 and assumed it would be equal in terms of cooling, noise levels, and build quality. However it was a slight downgrade on all fronts except in performance and the fact it had RGB lighting. The MSI gaming X 1070 however is easily just as quiet and the temps are nearly just as good as my old Asus strix 970 (which maxed at 69C in my rig).


The best are the Zotac AMP Extreme and the Palit/Gainward; those have 3-slot coolers, the best temperatures, and are the quietest.
I have the Gainward, and after an OC to 2140/9400 the card only reaches 64-65°C at 1000-1200 RPM.
If I use the default fan curve, the card reaches 68°C at 800-900 RPM.


----------



## Trhuster

Quote:


> Originally Posted by *headd*
> 
> best are Zotac extreme and Palit/gainward.Those have 3slot cooler and have best temperatures and are most quieter.
> i have gainward and after OC to 2140/9400 card have only 64-65°C and 1000-1200RPM.
> If i use default fan curve card reach 68°C with 800-900RPM.


Nice video, what overlay do you use to monitor core MHz?


----------



## headd

Quote:


> Originally Posted by *Trhuster*
> 
> Nice video, what overlay do you use to monitor core MHz?


afterburner


----------



## paulclift

First time I've ever done this.

This is all stock and using the overclock mode in the MSI gaming app.

GTX 1070 Seahawk.


----------



## czarsvk

I am using the BIOS posted on pg. 132, which is the OC version for the Asus ROG Strix 1070. I too have an Asus ROG STRIX 1070, but it is not the OC version. I get the Certificate error shown at the end below.

This is my cmd:

NVIDIA Firmware Update Utility (Version 5.287.0)
Modified Version By Joe Dirt

Checking for matches between display adapter(s) and image(s)...

Adapter: Graphics Device (10DE,1B81,1043,8598) H:--:NRM S:00,B:01,D:00,F:00

WARNING: Firmware image PCI Subsystem ID (1043.8599)
does not match adapter PCI Subsystem ID (1043.8598).

Please press 'y' to confirm override of PCI Subsystem ID's:
Overriding PCI subsystem ID mismatch
Current - Version:86.04.1E.00.23 ID:10DE:1B81:1043:8598
GP104 Board (Normal Board)
Replace with - Version:86.04.1E.00.21 ID:10DE:1B81:1043:8599
GP104 Board (Normal Board)

Update display adapter firmware?
Press 'y' to confirm (any other key to abort):
The display may go *BLANK* on and off for up to 10 seconds or more during the update process depending on your display adapter and output device.

Identifying EEPROM...
EEPROM ID (EF,6013) : WBond W25Q40EW 1.65-1.95V 4096Kx1S, page

*BCRT Error: Certificate 2.0 verification failed

ERROR: BIOS Cert 2.0 Verification Error. Update failed*

FYI, I renamed my file to "ROM_NAME", and my command was:
nvflash -6 ROM_NAME.rom

Please help me with the correct OC ROM version (ROG STRIX GTX 1070 OC) if what I am using is wrong. Thanks in advance.
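For what it's worth, the WARNING in that log is just nvflash comparing the PCI subsystem ID embedded in the ROM image (1043.8599, the O8G BIOS) against the one reported by the adapter (1043.8598, the non-OC Strix). A rough Python sketch of that check (illustrative only; `subsystem_id_matches` is a hypothetical helper, not actual nvflash code):

```python
# Sketch of the PCI subsystem ID check nvflash performs before flashing
# (illustrative only -- not nvflash source code).

def subsystem_id_matches(adapter_id: str, rom_id: str) -> bool:
    """IDs are 'vendor.device' hex pairs, e.g. '1043.8598'."""
    return adapter_id.lower() == rom_id.lower()

adapter = "1043.8598"   # non-OC ROG Strix 1070 (from the log above)
rom = "1043.8599"       # O8G (OC) Strix BIOS image

if not subsystem_id_matches(adapter, rom):
    # This is the point where nvflash prints its WARNING and
    # asks you to confirm the '-6' override.
    print(f"WARNING: ROM ID {rom} does not match adapter ID {adapter}")
```

Judging by the log, the `-6` switch only forces past this mismatch prompt; the BIOS Cert 2.0 verification that actually failed afterwards is a separate check.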


----------



## DStealth

Wrong version you've got m8
From my experience, flashed almost every 1070 BIOS out there - best results obtained are with the stock one...
Just a quick Time Spy bench 2126/9600


----------



## czarsvk

Quote:


> Originally Posted by *DStealth*
> 
> Wrong version you've got m8
> From my experience, flashed almost every 1070 BIOS out there - best results obtained are with the stock one...
> Just a quick Time Spy bench 2126/9600


Can you please post Asus ROG STRIX GTX 1070 OC version BIOS please. I just want to try and see how it works. This is my first time. I am eager to learn this stuff.


----------



## DStealth

Quote:


> Originally Posted by *czarsvk*
> 
> Can you please post Asus ROG STRIX GTX 1070 OC version BIOS please. I just want to try and see how it works. This is my first time. I am eager to learn this stuff.


If so, please read this thread and you'll find answers to both your questions...


----------



## czarsvk

Quote:


> Originally Posted by *DStealth*
> 
> If so, please read this thread and you'll find answers to your both questions...


Thank you, but I downloaded the same version posted by "Whicker" in a different post below and got the error.
http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/1310#post_25370492

Hmm, interesting. Any thoughts?


----------



## DStealth

Quote:


> Originally Posted by *DStealth*
> 
> Wrong version you've got m8


As already mentioned, you have to download the correct nvflash version in order to flash Pascal... although you didn't read the whole thread... here it is:

nvflash_pascal.zip 1155k .zip file


----------



## czarsvk

Quote:


> Originally Posted by *DStealth*
> 
> As already mentioned you have to download the correct NVflash version in order to flash Pascal...although you didn't read the whole tread...here you are the ...
> 
> nvflash_pascal.zip 1155k .zip file


Thank you, I read the whole thing, but I thought you were referring to BIOS software. I didn't know what "m8" meant. I am new to graphics card / motherboard BIOS stuff; forgive my ignorance. Now I understand a little better. I will try this out in the evening and let you know. Thank you so much.


----------



## ITAngel

Got my card today. YAY! Zotac did great with this series.


----------



## ogow89

Quote:


> Originally Posted by *ITAngel*
> 
> Got my card today. YAY! Zotac did great with this series.


Looks ugly


----------



## chris21010

I got 2 of these bad boys! I just can't wait for a custom BIOS so I can permanently set custom fan curves.

Oh, and I had to remove the HDD bay as the GPUs were too long, so I screwed the SSD to the side mesh. Not too glamorous, but it works.


----------



## jrcbandit

Is there much point to water cooling these cards, especially if you get an after market card with a better/quieter cooler? Seems like the extra voltage doesn't really do much for the 1070s. Of course if I do that, I'll have a full custom loop with two radiators only for my CPU, while now it is cooling CPU and 2 video cards...

Is there anyone else like me potentially going from 970 SLI to a single 1070? I don't really want to spend what the 1080 costs, and selling my two 970s used will help cover the cost of the 1070. Then I won't have to deal with the ten billion SLI issues, such as no support at all, and if the game does support it, it typically has poor scaling and/or poor frametimes. Plus, 8GB of memory is nice for 1440p versus only 3.5GB of functional memory on the 970.

It will be interesting to see how SLI develops in future DX12 games since it has to be built into the engine. Might see much better scaling/frametimes, alternatively since it takes more effort could see 0 support.


----------



## chris21010

Quote:


> Originally Posted by *jrcbandit*
> 
> Is there much point to water cooling these cards, especially if you get an after market card with a better/quieter cooler?


Only if you have two. I say this because one of my two cards in the pic above thermal throttles in my case, as the two together get quite hot.


----------



## ogow89

Quote:


> Originally Posted by *chris21010*
> 
> 
> 
> i got 2 of these bad boys! i just can not wait for custom bios so i can pertinently set custom fan curves.
> 
> oh, and i had to remove the HDD bay as the GPU's were too long so i screwed the SSD to the side mesh, not too glamorous but it works.


How much system RAM do you have, i see 2 sticks, and what is your cpu?


----------



## chris21010

Quote:


> Originally Posted by *ogow89*
> 
> How much system RAM do you have, i see 2 sticks, and what is your cpu?


That rig isn't special; it's a Folding@home rig, not my main gaming rig. It has an i3-6100 with 8GB of DDR4 RAM. Also, everything minus the GPUs cost less than one of these 1070 FTWs, but it does push out ~1,350,000 points per day in Folding@home.


----------



## ogow89

Quote:


> Originally Posted by *chris21010*
> 
> that rig isnt special, its a [email protected] rig, and not my main gaming rig. it has an i3-6100 with 8GB DDR4 ram. also, everything minus the GPU's cost less than 1 of these 1070 FTW's. but it does push out ~1,350,000 points per day, folding at home.


you plan to leave those two gpus in there?


----------



## chris21010

Quote:


> Originally Posted by *ogow89*
> 
> you plan to leave those two gpus in there?


Until 1080 Tis or something better are released, yes.


----------



## czarsvk

Quote:


> Originally Posted by *DStealth*
> 
> As already mentioned you have to download the correct NVflash version in order to flash Pascal...although you didn't read the whole tread...here you are the ...
> 
> nvflash_pascal.zip 1155k .zip file


This worked like a charm. Thank you.


----------



## boldenc

Has anyone tried to flash the GTX 1070 FTW BIOS on their GTX 1070 SC?


----------



## Dasboogieman

Quote:


> Originally Posted by *boldenc*
> 
> anyone tried to flash the GTX 1070 FTW bios on his GTX 1070 SC ?


Might not be a good idea. The FTW is a completely different PCB with the upgraded VRM array.


----------



## SuperZan

Quote:


> Originally Posted by *Dasboogieman*
> 
> Might not be a good idea. The FTW is a completely different PCB with the upgraded VRM array.


Yeah, I'd be extremely hesitant to put the FTW BIOS on mine, knowing it's designed around a 10+2 phase VRM and the SC's rocking the standard 4+1.


----------



## alex4069

I will have my second 1070 AMP in and will run benches on a single SLI bridge. Then Wednesday I will have my NVIDIA HB SLI bridge in and run the same benchmarks. I'm thinking two will be overkill at 4K. I have played Doom and Rise of the Tomb Raider at 4K maxed: in Doom I was seeing high-40s fps maxed, and in the Tomb Raider DX12 benchmark at 4K I averaged 40 fps.


----------



## wickedout

Hello everyone. Tonight I pulled the trigger on the ZOTAC GeForce GTX 1070 AMP! I did a lot of research on this GPU; it had a ton of great reviews, and the build quality is amazing. I'll have it here in a few days. I was going to go 1080 but just couldn't pull the trigger. I'm keeping my 980 Ti as backup, or for another build down the road. Benchmarks on the way as soon as I get this GPU. Can't wait.


----------



## marduke83

Anyone else having problems with Afterburner changing the V/F curve? Mine did it even though I have a saved profile. I have it set to max at 2062MHz (any higher can cause crashes), but when I booted up my rig after work I checked it, and it had set itself to max out at 2175MHz at 1.03V. Even when selecting my saved profile it would load my curve, but when I hit apply it would revert to the higher curve... pretty frustrating.


----------



## Dasboogieman

Mmmm, my EVGA GTX 1070 FTW does 2050-2075 at 112% power and 100% on voltage; VRAM is at 9320MHz. Is this considered good?


----------



## DStealth

Depends... but that sounds sub-average for the core and average for the memory. Are these clocks 24/7 stable (monitored), or benchmark-stable only?


----------



## Mudfrog

Received my MSI Gaming X 1070 yesterday. Without touching it, the core will jump to 2004MHz, dropping to around 1975MHz and eventually holding at 1949MHz during long gaming sessions. The max temperature I saw the card reach was 61°C.

Last night I tried:

GTA 5 at max settings and 1440p: I held steady at 60 fps during the day and dropped as low as 45 fps at times during the night. The game is now gorgeous.
Rise of the Tomb Raider at 1080p max settings: held at 60 fps; I don't recall dropping under 60 fps, but it may have happened.
Rust on max settings at 1080p: I have been plagued by low fps in Rust with my 670, even on low settings. Last night I held a pretty constant 60 fps. I only recall the fps dipping once; it dropped to 14 fps for a couple of seconds when I destroyed a raid tower, which consisted of around 50 logs falling to the ground.

Overall I'm happy, I'm just not sure if I'm $450 happy









Next year I plan to upgrade the CPU / Memory which should help with the fps even more.


----------



## Dasboogieman

Quote:


> Originally Posted by *DStealth*
> 
> Depends...but sounds sub-average for the core and average for the memory. Are these clock 24/7 stable monitored(stable) or benchmarks only...


24/7 stable; I've run it through the FFXIV benchmark @ 4K for 12 hrs so far.

In benchmarks I can maybe eke out 2075-2090, but I think it needs more voltage, once the modded BIOS comes.


----------



## Dreamliner

Quote:


> Originally Posted by *headd*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Killmassacre*
> 
> Although I don't know anything about the Zotac AMP Extreme, I have tried out the EVGA SC 1070, the Asus strix OC 1070, and now the MSI gaming X 1070 which i currently own. In terms of cooling I found the MSI gaming X to be the best where it maxed out at 71C in my rig, the EVGA SC was second maxing at 73C, and the Asus strix OC was the worst reaching 76C.
> 
> Also in terms of noise levels the MSI gaming X was also significantly quieter , where as the EVGA SC and Asus strix were similar. The Gaming X and Strix OC both seemed to OC the same amount, with both reaching a stock boost of 1936MHz core clock during extended gaming and both OCing over 2050Mhz (I didn't try going past this). The EVGA SC was not as good however, as it fluctuated between 1860-1920MHz and couldn't reach 2050Mhz core clock. In terms of build quality the EVGA SC seemed the best, and the Asus and MSI were tied for second.
> 
> I was a little disappointed with the Asus strix OC 1070 since I upgraded from an Asus strix 970 and assumed it would be equal in terms of cooling, noise levels, and build quality. However it was a slight downgrade on all fronts except in performance and the fact it had RGB lighting. The MSI gaming X 1070 however is easily just as quiet and the temps are nearly just as good as my old Asus strix 970 (which maxed at 69C in my rig).
> 
> 
> 
> best are Zotac extreme and Palit/gainward.Those have 3slot cooler and have best temperatures and are most quieter.
> i have gainward and after OC to 2140/9400 card have only 64-65°C and 1000-1200RPM.
> If i use default fan curve card reach 68°C with 800-900RPM.
Click to expand...

It sounds like you're saying the MSI card is giving you the best performance at the lowest temperatures and quietest levels, is that correct?

I'm trying to decide what 1070 to get, I had a gigabyte 970 and I thought it was noisy. I was planning on getting a Strix but heard they might be binning and it sounds like a lot of people are liking the MSI anyway. (I have an Asus board and passed on an MSI board as I had quality concerns.)


----------



## bigjdubb

I think the MSI is a good card, but I bought mine because it was the one in stock. I have zero complaints about the air cooler on it; it performs quite well given its low noise output.

Quote:


> Originally Posted by *cloudliu*
> 
> request msi gtx 1070 gaming z bis


I haven't even seen the Z available for purchase.


----------



## mypickaxe

Quote:


> Originally Posted by *Mudfrog*
> 
> Received my MSI Gaming X 1070 yesterday. Without touching it the core will jump to 2004, dropping to around 1975 and eventually holding at 1949 during long gaming sessions. The max that I saw the card get to was 61c.
> 
> Last night I tried:
> 
> GTA 5 at max settings and 1440p, I held steady at 60 during the day and dropped as low as 45 at times during the night. The game is now gorgeous.
> Rise of the Tomb Raider at 1080p max settings, held at 60fps, don't recall dropping under 60fps but it may have happened.
> Rust on max settings 1080p, I have been plagued by low fps in rust with my 670, even on low settings. Last night I held a pretty constant 60 fps. I only recall the fps dipping once, it dropped to 14 fps for a couple of seconds when I destroyed a raid tower which consisted of around 50 logs falling to the ground.
> 
> Overall I'm happy, I'm just not sure if I'm $450 happy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Next year I plan to upgrade the CPU / Memory which should help with the fps even more.


One point to ponder: 1070 SLI is still an option (probably, depending on your motherboard and case) and it beats a single 1080. You can get 60 to 80 fps at 4K (!) on what are essentially max settings in GTA V with 1070 SLI. More memory makes this possible, where 980 Ti would choke on the same settings due to 6GB limit. So basically, 1070 SLI is roughly 5% better than a highly overclocked (and hot) TITAN X 2 way SLI configuration. (The original TITAN X, of course.)


----------



## Dreamliner

I ended up going with the Strix O8G. Seems like the best chance for high speeds. Matches my Asus Sabertooth X99. $450 after $25 off PayPal code on Newegg.


----------



## Ranguvar

Quote:


> Originally Posted by *Dasboogieman*
> 
> mmmm my EVGA GTX 1070 FTW does 2050-2075 at 112% power and 100%on voltage, VRAM is at 9320mhz. Is this considered good?


It'll do a 122% power target if you switch to the second BIOS.

It took a few reboots (and maybe clearing Precision X settings) for it to show up for me.


----------



## ondoy

joining this club.


----------



## dislikeyou

I have a Palit GeForce GTX 1070 8GB JetStream (NE51070015P2-1041J) now:



Happy with the card, except it seems affected by the DPC latency issue in CS:GO; I haven't tried any other games than CS and LoL. The card is very quiet; I can't hear it while gaming or at idle.


----------



## pojo1806

MSI Gaming X


----------



## marik123

After a couple days of testing, I finally settled on 2100MHz core and 9300MHz memory.


----------



## SuperZan

Quote:


> Originally Posted by *marik123*
> 
> After couple days of testing, I finally settle for 2100mhz core and 9300mhz memory.


Very nice. My EVGA SC is topping out around 2060-70MHz on the core but with great stability using the scaling OC in Precision. I could probably push a bit farther and feed a more consistent voltage but I'm keeping temps and fan speed nearly the same with the dynamic OC, which I am loving.

1070 is definitely the best value play from Pascal IMO.


----------



## ITAngel

I wonder how far I can push my Zotac GTX 1070 AMP Extreme. I have yet to try playing with it; all I did yesterday was update my games and test them all with the card, and I was very, very impressed by the performance.







Finally finished it last night.


----------



## Dreamliner

Quote:


> Originally Posted by *marik123*
> 
> After couple days of testing, I finally settle for 2100mhz core and 9300mhz memory.


Do you have the Strix? Is it the OC one? Settings & app used plz?


----------



## Phixit

I can't go higher than +120 on Core with my GTX 1070. Instant artifacts ..

-+120 on core
-+800 memory

Running at 100% fan speeds, 58 degrees in Unigine Heaven. Sounds like a jet engine.

I feel like I have the worst GTX 1070 on this forum. I'm almost tempted to put it on eBay, sell at a loss, and grab either an EVGA FTW / MSI Gaming X or Zotac Extreme. I can return it, but I'll have to pay for the shipping + a 15% restocking fee (thanks NCIX), so I'll lose even more money.

Don't know what to do.


----------



## luan87us

So I've been wanting to try out VR but the Vive is way too much for a novelty. Anyone know of a more affordable VR headset that will perform?


----------



## marik123

Quote:


> Originally Posted by *Dreamliner*
> 
> Do you have the Strix? Is it the OC one? Settings & app used plz?


I have the Strix OC, I think? I used Asus GPU Tweak II: core @ 1878MHz boost (it will show ~2100MHz in most apps), 9300MHz on memory, voltage +50mV, power target 112%, fan @ auto.


----------



## Dreamliner

Quote:


> Originally Posted by *marik123*
> 
> I have the Strix OC I think? I used Asus GPU Tweak - II, core @ 1878mhz boost (Will show 2100mhz in most apps), 9300mhz on memory, voltage +50mv, power target 112%, fan @ auto.


The OC one is the STRIX-GTX1070-*O8G*-GAMING vs the regular STRIX-GTX1070-*8G*-GAMING. It's also more expensive.

What stress tests/benchmarks were you using? (I need to get some.)

I'd like to know how you got either one for $390.


----------



## ITAngel

Quote:


> Originally Posted by *luan87us*
> 
> So I've been wanting to try out VR but the Vive is way too much for a novelty. Anyone know of a more affordable VR headset that will perform?


Oculus Rift $599 unless you want cheaper.


----------



## bigjdubb

Quote:


> Originally Posted by *luan87us*
> 
> So I've been wanting to try out VR but the Vive is way too much for a novelty. Anyone know of a more affordable VR headset that will perform?


Oculus is the next cheapest consumer device. There are other dev kits out there that cost less, OSVR is a decent option.


----------



## marik123

Quote:


> Originally Posted by *Dreamliner*
> 
> The OC one is the STRIX-GTX1070-*O8G*-GAMING vs the regular STRIX-GTX1070-*8G*-GAMING. Its also more expensive.
> 
> What stress test/benchmarks were you using? (I need to get some)
> 
> I'd like to know how you got either one for $390.


I got mine 2 weeks ago from Jet.com; they were selling it for $420.14. Paying with a debit card, waiving the free return for free shipping, and the new-customer 15% discount (up to $30 off max) got me to $390.14. I just looked at my Jet.com invoice and it's the STRIX-1070-8G. I use Heaven 4.0 to stress test my card and also played games on it to make sure it's stable.

Fulfilled by

Jet Trusted Partner.

ASUS

ASUS GeForce GTX 1070 STRIX-GTX1070-8G-GAMING 8GB 256-Bit GDDR5 PCI Express 3.0 HDCP Ready Video Card

Estimated Delivery: 2 - 5 business days

You paid: $390.14

Item Subtotal

$436.13

Extra Savings

-$45.99

Estimated Tax

$0.00

Item Total

$390.14

Charged On: 7/13/2016 9:02:49 PM
Hide Price Details

Quantity: 1

Opted out of free return


----------



## Swolern

Quote:


> Originally Posted by *alex4069*
> 
> I will have my second 1070 amp in and will run benches on single sli bridge. Then Wednesday I will have my nvidia HB sli bridge in and do same benchmarks. I'm thinking 2 will be overkill at 4k. I have played Doom and Return of the Tomb Raider at 4k maxed. Doom I was seeing high 40s fps maxed and Tomb Raider at 4k and dx12 on the benchmark I averaged 40 fps.


Definitely not overkill until you can hold 60fps. The 1070 SLI will actually beat the $1200 Titan XP in performance in good SLI games so it's a beast of a GPU solution for a great price comparatively.

The Titan XP is stated to be 60% faster than the previous Titan X. 1070 SLI is showing 70% faster @ 4K in some games like GTA V. http://www.guru3d.com/articles_pages/geforce_gtx_1070_2_way_sli_review,21.html
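Those two figures are both relative FPS uplift, so they can be compared directly; a quick sketch of the arithmetic (the FPS values below are made up for illustration, not the review's numbers):

```python
def uplift_pct(baseline_fps: float, faster_fps: float) -> float:
    """Percent by which `faster_fps` exceeds `baseline_fps`."""
    return (faster_fps / baseline_fps - 1.0) * 100.0

# e.g. if a single Titan X renders 40 fps and 1070 SLI renders 68 fps
# in the same scene, SLI is 70% faster:
print(round(uplift_pct(40.0, 68.0)))  # 70
```

The same function covers the "Titan XP is 60% faster" claim: a 40 fps baseline would put it at 64 fps, under the hypothetical SLI figure above.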


----------



## mypickaxe

Quote:


> Originally Posted by *Swolern*
> 
> Quote:
> 
> 
> 
> Originally Posted by *alex4069*
> 
> I will have my second 1070 amp in and will run benches on single sli bridge. Then Wednesday I will have my nvidia HB sli bridge in and do same benchmarks. I'm thinking 2 will be overkill at 4k. I have played Doom and Return of the Tomb Raider at 4k maxed. Doom I was seeing high 40s fps maxed and Tomb Raider at 4k and dx12 on the benchmark I averaged 40 fps.
> 
> 
> 
> Definitely not overkill until you can hold 60fps. The 1070 SLI will actually beat the $1200 Titan XP in performance in good SLI games so it's a beast of a GPU solution for a great price comparatively.
> 
> The Titan XP is stated to be 60% faster than the previous Titan X. 1070 SLI is showing 70% faster @ 4K in some games like GTA V. http://www.guru3d.com/articles_pages/geforce_gtx_1070_2_way_sli_review,21.html
Click to expand...

Here's what I'm finding with 1070 SLI and overclocking: If you're running 1440p, in a lot of games, don't even bother. There's not enough of a percentage increase (relative to frames per second increase) for it to be of any real benefit.

Once you get into 4K and 5K gaming, then the overclock percentage improvement jumps from around 4% in a game such as Shadow of Mordor at 1440p with "max settings" and high res texture pack, to *14-16% improvement at 4K and 5K,* respectively, with identical settings.

I chose Shadow of Mordor to test, as it has an in-game benchmark AND an in-game virtual resolution adjustment. Coupled with easy alt-tab to change OC in Afterburner, and it made for a super easy comparison on the same hardware.

As far as "1070 SLI beats TITAN X (2016)"...and I'm not calling it a TITAN XP...that remains to be seen. Could be the case for some games but not others. We don't have one to test, we don't know how soon there will be water blocks available for them (for a real apples to apples comparison in my case,) if voltage is going to be unlocked at all for Pascal, etc. etc. etc.

What I do know is that in some games, a single 1080 is just barely beaten by SLI 1070, while in others with excellent scaling, it's a decent margin of victory for the 1070 SLI config, but depending on resolution, one may not even notice that difference.

Where I see it being a big deal is at 4K for users trying to maintain a locked 60 fps. 1440p / 144 Hz gaming as well, but for me it becomes irrelevant on a G-Sync monitor over 100 Hz.
Quote:


> Originally Posted by *pojo1806*
> 
> MSI Gaming X


Remarkably similar to Paul's Hardware build posted today: https://pbs.twimg.com/media/CoUPsgkUEAA4eXQ.jpg:large


----------



## Dreamliner

Quote:


> Originally Posted by *marik123*
> 
> I got mine 2 weeks ago from Jet.com


I almost bought that too. I missed it and wavered on spending extra on the O8G model. I ended up going newegg as pricing now was ~$20 or so more without having to worry about trouble.

I wavered initially because this forum made it sound like the 1070s weren't binned and just had different BIOSes separating the 8G and O8G. Recent buyers have found their 8Gs unstable at O8G defaults. Sounds like you got lucky!


----------



## Swolern

Quote:


> Originally Posted by *mypickaxe*
> 
> Here's what I'm finding with 1070 SLI and overclocking: If you're running 1440p, in a lot of games, don't even bother. There's not enough of a percentage increase (relative to frames per second increase) for it to be of any real benefit.
> 
> Once you get into 4K and 5K gaming, then the overclock percentage improvement jumps from around 4% in a game such as Shadow of Mordor at 1440p with "max settings" and high res texture pack, to *14-16% improvement at 4K and 5K,* respectively, with identical settings.
> 
> I chose Shadow of Mordor to test, as it has an in-game benchmark AND an in-game virtual resolution adjustment. Coupled with easy alt-tab to change OC in Afterburner, and it made for a super easy comparison on the same hardware.
> Remarkably similar to Paul's Hardware build posted today: https://pbs.twimg.com/media/CoUPsgkUEAA4eXQ.jpg:large


Agreed. Resolutions 1440p and lower will be held back by CPU limitations. Wow, I just saw that 1070 SLI in Shadow of Mordor at 5K is 100% faster than the Titan X!! Damn!! That beats the stated 60% extra performance of the Titan XP. There you can really see the 1070 SLI stretch its legs!
http://www.guru3d.com/articles_pages/geforce_gtx_1070_2_way_sli_review,24.html


----------



## alex4069

Quote:


> Originally Posted by *Swolern*
> 
> Definitely not overkill until you can hold 60fps. The 1070 SLI will actually beat the $1200 Titan XP in performance in good SLI games so it's a beast of a GPU solution for a great price comparatively.
> 
> The Titan XP is stated to be 60% faster than the previous Titan X. 1070 SLI is showing 70% faster @ 4K in some games like GTA V. http://www.guru3d.com/articles_pages/geforce_gtx_1070_2_way_sli_review,21.html


OK, I've seen that before. Now I just need to look into replacing my 4K monitor with a 5K one. As for value: both of my 1070s cost just $864, plus $39 for the HB SLI bridge.


----------



## mypickaxe

Quote:


> Originally Posted by *Swolern*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mypickaxe*
> 
> Here's what I'm finding with 1070 SLI and overclocking: If you're running 1440p, in a lot of games, don't even bother. There's not enough of a percentage increase (relative to frames per second increase) for it to be of any real benefit.
> 
> Once you get into 4K and 5K gaming, then the overclock percentage improvement jumps from around 4% in a game such as Shadow of Mordor at 1440p with "max settings" and high res texture pack, to *14-16% improvement at 4K and 5K,* respectively, with identical settings.
> 
> I chose Shadow of Mordor to test, as it has an in-game benchmark AND an in-game virtual resolution adjustment. Coupled with easy alt-tab to change OC in Afterburner, and it made for a super easy comparison on the same hardware.
> Remarkably similar to Paul's Hardware build posted today: https://pbs.twimg.com/media/CoUPsgkUEAA4eXQ.jpg:large
> 
> 
> 
> Agreed. Resolutions 1440p and lower will be held back by CPU limitations. Wow I just saw 1070 SLI in Shadow of Mordor 5K is 100% faster than the Titan X!! Damn!! That actually kills the 60% said extra performance of the Titan XP. There you can really see the 1070 SLI stretch out its legs!
> http://www.guru3d.com/articles_pages/geforce_gtx_1070_2_way_sli_review,24.html
Click to expand...

Yes, in my testing with the EVGA High Bandwidth bridge at 5K resolution, I was seeing 96 to 98% scaling with SLI enabled. That's just nuts.
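SLI scaling percentage here is how much of the ideal doubling the second card actually delivers; a small sketch of that calculation (the FPS values are illustrative, not the actual benchmark figures):

```python
def sli_scaling_pct(single_fps: float, sli_fps: float, num_gpus: int = 2) -> float:
    """Efficiency of the added GPU(s): 100% means perfect N-way scaling."""
    extra = sli_fps - single_fps
    ideal_extra = single_fps * (num_gpus - 1)
    return extra / ideal_extra * 100.0

# 30 fps on one 1070 rising to 59.4 fps in SLI would be 98% scaling:
print(round(sli_scaling_pct(30.0, 59.4), 1))  # 98.0
```

By this measure, 96-98% scaling means the second card is contributing almost its entire theoretical share of frames.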


----------



## ikjadoon

Quote:


> Originally Posted by *ogow89*
> 
> You get instantaneous results in helping me, whilst Folding@home keeps running up your electric bill trying to simulate the molecular structure of some protein, taking years due to an inefficient algorithm.


[off-topic]

Just to clear up a few things: you say "inefficient" as if there's another, more efficient choice. There is no equivalent supercomputer available to crank out these numbers without massive investment, sadly (the network's combined power is around 100 petaFLOPS).

Few politicians have really been pushing for increased funding for scientific research, at least in the USA. Big-money grants are hard to come by, especially for "orphaned" diseases that aren't cancer or HIV.







And, if you don't think it's the government's responsibility to fund scientific research, then Folding@home is literally right up your alley (that is, if you care about scientific research enough).

It's like telling someone who crowd-funds one of their projects: "pssh, you're using inefficient manufacturing." So... uh, what should they do? Try to convince some giant corporation, with its very efficient manufacturing, to make this product it doesn't really care about? Unlikely. Sure, that's the "best" long-term situation, and if you have the means to do that, please do.

But in the meantime, Folding@home is pretty dope.







It's insane how much it's grown, with NVIDIA GPUs responsible for ~83% of the folding power.

[/off-topic]


----------



## marik123

Quote:


> Originally Posted by *Dreamliner*
> 
> I almost bought that too. I missed it and wavered on spending extra on the O8G model. I ended up going with Newegg, as the price was only ~$20 or so more and I didn't have to worry about trouble.
> 
> I wavered initially because this forum made it sound like the 1070 wasn't binned and just had different BIOS' separating the 8G and O8G. Recent buyers have found their 8G's unstable at O8G defaults. Sounds like you got lucky!


I wouldn't say lucky; I just got an average chip. My top Heaven 4.0 score was 2664, while other guys on forums here can hit 2727 in Heaven 4.0 at 2138/9600.
Quote:


> Originally Posted by *DStealth*
> 
> Yep, flashed the EVGA SC BIOS, but the fans were rotating slowly and there were no other benefits from it. I then flashed the .68 BIOS from the GB G1 (obtained from the TPU database), and after this updated from the Gigabyte site to the latest beta with their tool. Happy with it so far: it spins the fans higher, stock clocks are higher, and clocks seem more consistent. Here's a result with it... note CPU and RAM are stock...


----------



## Schneeder

Hit this tonight. Just playing around.

Time Spy


----------



## DStealth

Quote:


> Originally Posted by *marik123*
> 
> I wouldn't say lucky; I just got an average chip. My top Heaven 4.0 score was 2664, while other guys on forums here can hit 2727 in Heaven 4.0 at 2138/9600.


This was with CPU and RAM at stock speeds while awaiting my main PSU's RMA; here's a quick run with them OCed:







110 FPS is easy :thumb:


Quote:


> Originally Posted by *Schneeder*
> 
> Hit this tonight. Just playing around.
> 
> Time Spy


Keep pushing m8


----------



## jrcbandit

Quote:


> Originally Posted by *mypickaxe*
> 
> Yes, in my testing with the EVGA High Bandwidth bridge at 5K resolution, I was seeing 96 to 98% scaling with SLI enabled. That's just nuts.


Is the high-bandwidth bridge just for handling higher resolutions, or does it allow more efficient scaling with Pascal? After doing 970 SLI, I sincerely doubt I'll ever do SLI again, due to the vast majority of games having poor/unimpressive scaling or not even having SLI support, especially at launch... It doesn't help that DX12 games have to be specifically coded for SLI instead of enabling it at the driver level, i.e., there might be even fewer games supporting SLI in the future. But if Pascal SLI is that much better, I could eventually change my mind.


----------



## jrp0079

Looking to upgrade to a 1070, but I had a question: would there be any performance or bottleneck issues with my three-year-old build? I have a 3570K and a 660 Ti. Or do I have to upgrade my whole system?


----------



## SuperZan

Quote:


> Originally Posted by *jrp0079*
> 
> Looking to upgrade to a 1070, but i had a question. Would there be any performance or bottleneck issues with my 3 year old build. Have a 3570K and a 660TI. Or do i have to upgrade my whole system?


For the most part that would be fine; you certainly won't be outright handicapped in most instances. What resolution are you playing at?


----------



## jrcbandit

Quote:


> Originally Posted by *jrp0079*
> 
> Looking to upgrade to a 1070, but i had a question. Would there be any performance or bottleneck issues with my 3 year old build. Have a 3570K and a 660TI. Or do i have to upgrade my whole system?


Perfectly fine if you overclock your 3570K to 4.4+ GHz.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *jrp0079*
> 
> Looking to upgrade to a 1070, but i had a question. Would there be any performance or bottleneck issues with my 3 year old build. Have a 3570K and a 660TI. Or do i have to upgrade my whole system?


I have as old a system as anyone (LGA1366). I added 2x ASUS GTX1070-O8G-GAMING last week, and Doom 2016 ROCKS!!!









...thinking my system was too old at first because of glitchy play in Doom, I shopped for a new system when I first got my cards... but then a day or two later I realized that Doom doesn't support SLI, so I simply disabled SLI in the game's profile in the NVIDIA CP, and all of a sudden the game was as playable as could ever be imagined... with all settings maxed out, including the highest AA, I get as high as 144+ FPS on a G-Sync 1440p monitor, and average 99 FPS with a minimum of 89 FPS, with a single GTX1070 card on LGA1366. The GTX1070 is unbelievable. It was a better investment than my 8800GTX back in the day.









Yes, your benchmark results and numbers may suck on an older PC, but actual gameplay will still be awesome.

Next I'll be testing TR, Witcher 3, GTA V, and Arma III.


----------



## prey1337

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> i have as old a system as anyone (LGA1366) and Doom 2016 ROCKS!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ...thinking my system was too old at 1st because of glitchy play in Doom; i shopped for a new system when i 1st got my cards... but then a day or two later i realized that the game- Doom- doesn't support SLI and so i simply disabled SLI in the Game profile for it in the nVidia CP and all a sudden the game was as playable as could ever be imagined... with all Maxed out settings including highest AA and i get as high as 144+ FPS on a G-sync 1440P monitor. and average 99 FPS with minimum 89 FPS with a single GTX1070 card on LGA1366. the GTX1070 is unbelievable. it was a better investment than my 8800GTX back in the day.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes, ur Benchmark results and numbers may suk on an older PC, but actual game-play will still be awesome.
> 
> Next i'll be testing TR, Whitcher 3, GTA V and Arma III.


Agreed, I can attest to that.

Still running an 8-year-old i7 920, only overclocked to 3.8 GHz.
The only time I can see my CPU affecting performance is in benchmarking.


----------



## jlhawn

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> i have as old a system as anyone (LGA1366) and Doom 2016 ROCKS!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ...thinking my system was too old at 1st because of glitchy play in Doom; i shopped for a new system when i 1st got my cards... but then a day or two later i realized that the game- Doom- doesn't support SLI and so i simply disabled SLI in the Game profile for it in the nVidia CP and all a sudden the game was as playable as could ever be imagined... with all Maxed out settings including highest AA and i get as high as 144+ FPS on a G-sync 1440P monitor. and average 99 FPS with minimum 89 FPS with a single GTX1070 card on LGA1366. the GTX1070 is unbelievable. it was a better investment than my 8800GTX back in the day.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes, ur Benchmark results and numbers may suk on an older PC, but actual game-play will still be awesome.
> 
> Next i'll be testing TR, Whitcher 3, GTA V and Arma III.


Same here: my old X58 runs every game I have, including the new Doom, maxed out with my GTX 1070.
Games run great, with FPS well over 60 and no frame skipping, etc. I even crank up AA in the NVCP.
I don't pay attention to benchmarks either; smooth gameplay is why I buy a graphics card.
I should add that my CPU is at stock speed.


----------



## Swolern

Quote:


> Originally Posted by *jrcbandit*
> 
> Is the high bandwidth bridge just for handling higher resolutions or does it allow more efficient scaling with Pascal? After doing 970 SLI, I sincerely doubt I'll ever do SLI again due to the vast majority of games having poor/unimpressive scaling or not even having SLI support, especially when the game launches... It doesn't help that DX12 games have to be specifically coded for SLI instead of enabling it at the driver level, ie, might be even fewer games supporting SLI in the future. But if Pascal SLI is that much better, I could eventually change my mind.


The HB bridge improves both SLI scaling and the handling of higher resolutions. Higher resolutions see a bigger benefit from the HB bridge, though, as there is more of a bandwidth bottleneck there. http://www.overclock.net/t/1603864/hwunboxed-nvidia-s-hb-sli-bridge-surprising-gains-gtx-1080-sli-testing-inside


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *prey1337*
> 
> Agreed, I can attest to that.
> 
> Still running an 8 year old i7 920, only overclocked to 3.8GHz.
> The only time I can see my CPU effecting performance is in benchmarking.


Yep, at first I set my eyes on:

1. Gigabyte LGA2011-3 Intel X99 ATX Broadwell-E Motherboard GA-X99-Ultra Gaming
2. i7-5820K Haswell-E 6-Core 3.3GHz
3. Ballistix Sport LT 16GB Kit (4GBx4) DDR4 2400 MT/s (PC4-19200) DIMM 288-Pin - BLS4K4G4D240FSB

But lol, after configuring Doom for a single card, I was floored at the performance versus unsupported SLI. As a result I've never been happier computer-wise, because now I can take my time upgrading my system.

What Intel technology is around the corner next? (I haven't shopped for a PC in over five years.)

Edit: current system was purchased 2010-12-22 (LGA1366 mobo, CPU, RAM).


----------



## prey1337

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> yep at 1st i set my eyes on:
> 
> 1. Gigabyte LGA2011-3 Intel X99 ATX Broadwell-E Motherboard GA-X99-Ultra Gaming
> 2. i7-5820K Haswell-E 6-Core 3.3GHz
> 3. Ballistix Sport LT 16GB Kit (4GBx4) DDR4 2400 MT/s (PC4-19200) DIMM 288-Pin - BLS4K4G4D240FSB
> 
> but lol after configuring Doom for a single card and being floored at the performance versus unsupported SLI. and as a result i've never been happier computer-wise; because now i can take my time upgrading my system.
> 
> wat next technology (Intel) is around the corner? (i haven't shopped for a PC in over five years.)
> 
> Edit: current system was purchased, "2010-12-22- LGA1366 Mobo CPU RAM".


Exactly.
Skylake-X and Kaby Lake-X will be out by the second half of next year, so there's that.

I bought my system in 2009 and just upgraded a few things here and there. Honestly, the only worthwhile upgrades were switching to an SSD and adding this 1070.

Edit: Of course I have a build list on PCPartPicker right now for LGA1151; it seems to be a good bang-for-the-buck build, but I have no valid reason to go for it lol.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *jlhawn*
> 
> Same here, my old X58 runs every game I have including the new Doom maxed out with my GTX 1070.
> games run great with fps well over 60 with no frame skipping, etc. I even crank up AA in NVCP.
> I don't pay attention to benchmarks either, just smooth game play is why I buy a graphics card.
> I should add that my cpu is at stock speed.


yes.
Quote:


> Originally Posted by *jlhawn*
> 
> Same here, my old X58 runs every game I have including the new Doom maxed out with my GTX 1070.
> games run great with fps well over 60 with no frame skipping, etc. I even crank up AA in NVCP.
> I don't pay attention to benchmarks either, just smooth game play is why I buy a graphics card.
> I should add that my cpu is at stock speed.


Get a G-Sync monitor before upgrading your system; they are simply BLISS!
Quote:


> Originally Posted by *prey1337*
> 
> Exactly.
> Skylake X and Kaby Lake X will be out by the second half of next year, so there's that.
> 
> I bought my system in 2009 and just upgrade a few things here and there. Honestly the only worthwhile upgrades were switching to a SSD and adding this 1070.


A G-Sync monitor is a BLISSFUL upgrade that can't be beat.


----------



## prey1337

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> yes.
> get g-sync monitor before upgrading ur system, they are simply BLISSSS!
> g-sync monitor= BLISSFUL! upgrade that can't be beat.


I just upgraded to an IPS for the moment; even that was a huge step up from what I had.

A good 1440p G-Sync monitor is probably down the road somewhere for me.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *prey1337*
> 
> Just upgraded to an IPS at the moment, even that was a huge step up from what I had.
> 
> A good 1440p g-sync is probably down the road somewhere for me.


Return the IPS monitor if you can...

I never played without vertical sync until G-Sync last November, and the difference in FPS games is the definition of night and day. I can shoot out a pig's eye from 50 feet while running and jumping now. Before, with vertical sync, I was happy (felt fortunate) if the broad side of a barn was my objective while prone. lol


----------



## prey1337

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> return the IPS monitor if u can...
> 
> i never played without vertical-sync until G-Sync last last November. And the difference in FPS games is the definition of Night and Day. i can shoot out a pig's eye from 50 feet while running an jumping now. Before- with vertical-sync- i was happy- felt fortunate- if the broad-side of a barn was my objective while prone. lol


I mean, I'm sure it's great, but $160 vs $800 is quite a large price gap.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *prey1337*
> 
> I mean, I'm sure it's great, but $160 vs $800 is quite a large price gap.


Haha, sorry, I forgot about the drop in IPS prices.

I only paid $579 for my 1440p ASUS ROG SWIFT PG278Q 27-inch 120Hz G-Sync gaming monitor.

it is my all-time best investment.









No video tearing is as cool as being an all-around ACE shooter now, too.

$379 for a new 1080p (AOC brand, 3rd from the bottom):
http://www.newegg.com/Product/ProductList.aspx?Submit=Property&N=100160979%20600559797%208000&IsNodeId=1&bop=And&Order=PRICED&PageSize=90


----------



## jlhawn

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> haha sry i forgot the drop in IPS prices.
> 
> only paid $579 for my (1440p) ASUS ROG SWIFT PG278Q 27-inch 120Hz G-Sync gaming monitor.
> 
> it is my all-time best investment.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> no video-tarring is as cool as being an all around ACE shooter too now.
> 
> $379 for a new 1080p: (AOC brand 3rd from bottem)
> http://www.newegg.com/Product/ProductList.aspx?Submit=Property&N=100160979%20600559797%208000&IsNodeId=1&bop=And&Order=PRICED&PageSize=90


I will look into G-Sync monitors.
Funny, the date you purchased your current system: I built mine 12/12/2010, and I'm still running the 2010 BIOS on the motherboard with no problems.
Thanks for the info on the performance of G-Sync monitors.


----------



## brettjv

Quote:


> Originally Posted by *Phixit*
> 
> I can't go higher than +120 on Core with my GTX 1070. Instant artifacts ..
> 
> -+120 on core
> -+800 memory
> 
> Running at 100% fan speeds, 58 degrees in Unigine Heaven. Sounds like a jet engine.
> 
> I feel like I have the worst GTX 1070 of this forum. I'm almost tempted to put it on eBay, sell at loss and grab either a EVGA FTW / MSI Gaming X or Zotac Extreme. I can return it, but I'll have to pay for the shipping + 15% restocking fees (thanks NCIX) so I'll lose even more money.
> 
> Don't know what to do.


I got an easy answer for you ... DON'T. Just ignore it.

Seriously, there's not a person on this planet that could tell you the difference between the worst and best overclocking GTX1070 if you put them down in front of a computer and had 'em do 'blind test' gaming on one vs. the other. The absolute minimum delta FPS the human eye can detect is 10%, and most people would have a hard time detecting anything less than 20% FPS difference in a 'blind test'.

Look man, the absolute BEST air-cooled 1070 is no more than 5% faster than the worst OC'ing one. And yours is FAR from the worst.

So unless you have a ton of cash lying around and/or an obsessive need for e-peen ... I'd just let it go, live with what you got. Next gen, maybe you'll 'win' the silicon lottery.
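To put rough numbers on that argument (the ~5% best-vs-worst spread and the 10% perception threshold are the figures claimed above; the 100 FPS baseline is a hypothetical placeholder):

```python
# Claimed perception threshold and best-vs-worst OC spread from the post above.
DETECTION_THRESHOLD_PCT = 10.0
worst_fps = 100.0            # hypothetical baseline on the "worst" OC'ing card
best_fps = worst_fps * 1.05  # best air-cooled card, ~5% faster

delta_pct = (best_fps - worst_fps) / worst_fps * 100
print(f"delta: {delta_pct:.1f}%")            # delta: 5.0%
print(delta_pct >= DETECTION_THRESHOLD_PCT)  # False: below the claimed threshold
```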


----------



## mypickaxe

Quote:


> Originally Posted by *jrcbandit*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mypickaxe*
> 
> Yes, in my testing with the EVGA High Bandwidth bridge at 5K resolution, I was seeing 96 to 98% scaling with SLI enabled. That's just nuts.
> 
> 
> 
> Is the high bandwidth bridge just for handling higher resolutions or does it allow more efficient scaling with Pascal? After doing 970 SLI, I sincerely doubt I'll ever do SLI again due to the vast majority of games having poor/unimpressive scaling or not even having SLI support, especially when the game launches... It doesn't help that DX12 games have to be specifically coded for SLI instead of enabling it at the driver level, ie, might be even fewer games supporting SLI in the future. But if Pascal SLI is that much better, I could eventually change my mind.
Click to expand...

It's for high-bandwidth situations (sorry to have to repeat the name of the product; I don't mean that sarcastically). What I do mean is: yes, it's for 4K/5K, but also for over-120 Hz situations at 1440p and up.

An LED bridge is sufficient until you get up to 4K/60; after that you really need the HB bridge for quality scaling.

As far as SLI goes, I bought it for better frame rates in certain games I care about such as Project Cars, Witcher 3, GTA V, and for a while there, The Division. It scales so well in The Division, btw.

Long story short, I was in the market for a 1080, but found 2 1070 FEs on the shelf and couldn't pass them up. This was 3 or 4 weeks ago.


----------



## brettjv

Quote:


> Originally Posted by *jrp0079*
> 
> Looking to upgrade to a 1070, but i had a question. Would there be any performance or bottleneck issues with my 3 year old build. Have a 3570K and a 660TI. Or do i have to upgrade my whole system?


Define 'any bottlenecking issues'. What does that mean to you?

'Bottlenecking' is extremely dependent on 'what your definition is' ... probably more than anything else.

And beyond that, the next most important aspect is ... 'what game, and at what settings (res, AA level, etc) are you playing/testing at?'

One simply cannot look at a particular hardware config (unless we're talking EXTREME differences in age/quality of components) and universally declare 'yes, that will bottleneck' or 'no, it won't'.

The proper answer is always: 'It depends on what your expectations are, what you're running, and at what settings.'


----------



## Phixit

Quote:


> Originally Posted by *brettjv*
> 
> I got an easy answer for you ... DON'T. Just ignore it.
> 
> Seriously, there's not a person on this planet that could tell you the difference between the worst and best overclocking GTX1070 if you put them down in front of a computer and had 'em do 'blind test' gaming on one vs. the other. The absolute minimum delta FPS the human eye can detect is 10%, and most people would have a hard time detecting anything less than 20% FPS difference in a 'blind test'.
> 
> Look man, the absolute BEST air-cooled 1070 is no more than 5% faster than the worst OC'ing one. And yours is FAR from the worst.
> 
> So unless you have a ton of cash lying around and/or an obsessive need for e-peen ... I'd just let it go, live with what you got. Next gen, maybe you'll 'win' the silicon lottery.


You're right, I'll keep it.


----------



## Mudfrog

So is this a typo on MSI's site? 8192 MB GDDR5 / 8108 MHz Memory (OC Mode)

My memory clock reads barely over 4000 MHz. I tried OC mode and it set my card to a 1530 MHz core clock instead of the 1975 MHz I get normally.


----------



## Mudfrog

Quote:


> Originally Posted by *jrp0079*
> 
> Looking to upgrade to a 1070, but i had a question. Would there be any performance or bottleneck issues with my 3 year old build. Have a 3570K and a 660TI. Or do i have to upgrade my whole system?


I upgraded from a 2500K and 670 SLI. While I'm sure the 2500K still bottlenecks the card, it was a tremendous upgrade. Eventually when I replace the CPU / Mobo / RAM I'll have another little performance boost.


----------



## mickeykool

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> yes.
> get g-sync monitor before upgrading ur system, they are simply BLISSSS!
> g-sync monitor= BLISSFUL! upgrade that can't be beat.


Is your G-Sync broken with the hotfix update? Mine seems to be. With it on, games stutter, but without it they're smooth.


----------



## bigjdubb

Quote:


> Originally Posted by *Mudfrog*
> 
> So is this a typo on MSI's site? 8192 MB GDDR5 / 8108 MHz Memory (OC Mode)
> 
> My RAM is barely over 4000 MHz. I tried OC mode and it set my card at 1530 core clock instead of the 1975 that I get normally.


The 4000 MHz you are reading in Afterburner needs to be doubled to get the effective memory speed.
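For what it's worth, the arithmetic behind that (a rough sketch; exact readouts vary by tool, since GPU-Z reports the actual clock while Afterburner reports the double-data-rate figure):

```python
# GDDR5 is double-pumped, so the data rate Afterburner shows is doubled
# again to get the "effective" speed quoted on spec sheets.
afterburner_mhz = 4054              # roughly what Afterburner shows in OC mode
effective_mhz = afterburner_mhz * 2
print(effective_mhz)                # 8108, matching MSI's "8108 MHz (OC Mode)"
```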


----------



## Mudfrog

Quote:


> Originally Posted by *bigjdubb*
> 
> The 4000mhz you are reading in Afterburner needs to be doubled for the total speed.


Ok, I wasn't aware of that.


----------



## showaccord97

Quote:


> Originally Posted by *jrp0079*
> 
> Looking to upgrade to a 1070, but i had a question. Would there be any performance or bottleneck issues with my 3 year old build. Have a 3570K and a 660TI. Or do i have to upgrade my whole system?


I literally just upgraded from that exact setup to a 1070. My 3570K runs at 4650 MHz with a Corsair H100i and doesn't get above 68C, so, as others said, 4.4 or 4.5 GHz should be fine. I'm getting 5995 in Time Spy and can run just about every game at max settings without any trouble.

I've debated upgrading to a newer system as well, but I just can't justify the cost when I'm getting the performance I am now after upgrading to a 1070 Gaming X.


----------



## ogow89

The Golden Sample from Gainward isn't all that golden. The damn card comes at 2088 MHz out of the box, and with the power limit maxed it goes to 2101 MHz; I can't move the core clock at all on this thing. Memory is stable so far at +600 MHz.

Well, at least I get 21000 on the Firestrike graphics score.

Damn, I was hoping for something over 2150 MHz on the core.


----------



## DStealth

Quote:


> Originally Posted by *ogow89*
> 
> Well at least, i get 21000 on firestrike graphics score.


Wow how great..just beat first the cheapest on the market [email protected]*C ambient...just 500+p to go


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *mickeykool*
> 
> Is your gsync broken w/ the hotfix update? Mine seems to be. With it on game stutter but w/o its smooth..


No, no problems with my G-Sync and the hotfix; my PC is running perfectly. (With the exception of a couple of instances of not properly waking from sleep mode... hence I ran chkdsk, see the recommendations below, and it fixed free space that was improperly marked as in use.)

First, try doing an uninstall / clean install of the video driver.

Second, run Chkdsk (see "chkdsk /?", or to be precise, *"CHKDSK C: /F /R"*) on the partition your OS is installed on. In some situations (maybe not yours right now) people can even run Chkdsk on the drive their games are on if it's never been done before, with the normal disclaimers of course; but always run it on the OS drive first to see if that helps. All of this is advisable because of the relationship between software drivers and Chkdsk: if one needs running, then the other may not install correctly. Improper shutdowns and freeze-ups caused by bad driver installations (or, like I had recently, improper wake from sleep mode and countless other things) can create the need to run Chkdsk; it's a big circle, and well worth the time it takes to run Chkdsk occasionally, especially if anything weird or funky is going on. To see the results of Chkdsk after the PC boots back into Windows: open Event Viewer > Windows Logs > Application, find the "Wininit" entry in the right pane (in the Source column), highlight it, and read the Chkdsk results below.

Also, after a fresh install of the video drivers, be sure to check all settings in the NVIDIA CP.

GL


----------



## mickeykool

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> no. not any probs with my g-sync and hotfix. my PC is running perfect. (with exception the of a couple of instances of not properly waking from sleep mode... hence i ran "chkdsk"... see recommends below.... and it fixed free space improperly marked as in use...)
> 
> 1st try doon uninstall/ clean-install of the vid driver driver.
> 
> 2nd, also things like "chkdsk /?" or to be precise, *"CHKDSK C: /F /R"* and run it on any partition the drive ur OS is installed on.... maybe in some situations, just not urs now, ppl can even run Chkdsk on the drive games are on if never done before- with normal disclaimers of course; but always run it on the OS drive 1st to see if that helps... all said is advisable because of the relationship with software drivers and Chkdsk; meaning, if one needs running then the other may not install correctly. Improper shutdowns/ freeze-ups caused by bad driver installations (or like i had recently, with improper wake from sleep mode and countless other things) can cause the need to run Chkdsk; thus it is a big circle and well worth checking and the time that it takes to run Chkdsk occasionally, especially if anything weird or funky is going on. To see results of Chkdsk after PC boots into Windows: Open "Event Viewer">Windows Logs>Application>See "Wininit" entry in right pane (in the Source column)> highlight it and read below the results of Chkdsk.
> 
> Also, after a fresh install of vid drivers, be sure to check all settings in the nVidia CP.
> 
> GL


Thanks for the quick response; I'll try the steps later this evening when I get around to it.


----------



## Tuzic

Hey guys, first post on this website EVER, so what better way to start than posting some of my 1070 stats? So here we go.

EVGA GTX 1070 FE

Core clock: +215 MHz
Memory clock: +850 MHz

Voltage limit: 100%
Power limit: 112%

https://www.techpowerup.com/gpuz/details/beq22




100% STABLE, guys. Unbelievable... I think I struck gold. Ran FurMark for 15 minutes, Valley for 30, Heaven for 30: no artifacts.

Custom fan curve; tops out at 72 degrees with the fan reaching 67%. Surprisingly quiet inside my case! (Coming from a reference R9 290.)

I'll post some benchmark photos when I get back, on my way to look at a new apartment!

#FeelsGoodMan.


----------



## Stupidfastwagon

Gents,

My question to you: should I return my 1070 FE and exchange it for the Asus 1070 Strix model?


----------



## ogow89

Quote:


> Originally Posted by *DStealth*
> 
> Wow how great..just beat first the cheapest on the market [email protected]*C ambient...just 500+p to go


350 points to go, actually. I probably can; I just need to see how far I can push the VRAM and core once I have the time. So far I have 21170. Besides, your CPU and motherboard might be helping you a little bit, considering I have the cheapest Z87 board and an i5 4690K. Also, your Gigabyte costs as much as mine does here. And it isn't breezy where I live, either.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *Stupidfastwagon*
> 
> Gents,
> 
> My question to you is should I return my 1070 FE and exchange it for the Asus 1070 Strix model.


You can't ask me, because I love the two Strix cards I got installed 8 days ago. I'm killing the 2nd-biggest Doom boss now.









(Max temps after multiple 2+ hour non-stop sessions, with all default settings out of the box, are only a measly 69C in an HAF-X case.







Max fan speed: 51%. It just can't print money. But man, it's one of the best investments ever, although the GTX1070 part of it deserves 99.99% of the credit, not the brand. Strix is the icing on the cake, that's all, but is it ever goooood icing!?







)

The only advantage of the FE that comes to mind: isn't the FE better if you plan to add liquid cooling to it?

What other advantages, if any, does the FE have, people?


----------



## Stupidfastwagon

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> u can't ask me because i luv the two strix' i got installed 8 days ago. i'm killing 2nd biggest Doom boss now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (Max temps after multiple 2+ non-stop hours with all default settings out of the box is only a measly 69c; in an HAF-X case.
> 
> 
> 
> 
> 
> 
> 
> max fan speed 51%. it just can't print money. but man, it's one of best investments ever although the GTX1070 part of it deserves 99.99% of the credit- and not the brand. Strix is the icing on the cake- that's all- but is it ever gooood icing!?
> 
> 
> 
> 
> 
> 
> 
> )
> 
> the only advantage of the FE that comes to mind is, isn't FE better if u plan to add liquid cooling to it?
> 
> wat if any other advantages does FE have, ppl?


I don't plan on water cooling; however, it's nice to have that option available if I decide to take the plunge.


----------



## Blackfyre

Quote:


> Originally Posted by *Tuzic*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Hey guys, first post on this website EVER, so what better way to start with posting some of my 1070 stats? so here we go.
> 
> EVGA GTX 1070 FE
> 
> Core clock +215 Mhz
> Memory clock + 850 Mhz
> 
> Voltage limit - %100
> Power limit - %112
> 
> https://www.techpowerup.com/gpuz/details/beq22
> 
> 
> 
> 
> 100% STABLE guys, unbelievable... I think I struck gold. Ran furmark for 15 minutes, valley for 30, heaven for 30, no artifacts.
> 
> Custom fan curve, tops out at 72 degrees with fan reaching %67, surprisingly quiet inside my case! (coming from a reference r9 290)
> 
> I'll post some benchmark photos when I get back, on my way to look at a new apartment!
> 
> #FeelsGoodMan.


Quote:


> *EVGA GTX 1070 FE*
> 
> Core clock +215 Mhz
> Memory clock + 850 Mhz
> 
> Voltage limit - %100
> Power limit - %112


Those look very impressive.









What are your actual boost core clocks when you're gaming/benchmarking?

Do they hover between 2025 MHz and 2100 MHz depending on temperature, or run higher than that?


----------



## luan87us

Quote:


> Originally Posted by *ITAngel*
> 
> Oculus Rift $599 unless you want cheaper.


Quote:


> Originally Posted by *bigjdubb*
> 
> Oculus is the next cheapest consumer device. There are other dev kits out there that cost less, OSVR is a decent option.


Lol ok that's way too much for a novelty toy for me haha. I don't even have a 1440p monitor yet. Using a BenQ XL2720Z 144hz monitor as I mainly play CSGO. I'm moving in a few months and will probably add a 4K tv to the set up to play GTAV and other games.


----------



## sterik01

Count me in. Just installed. Had it in a box for two days.


----------



## Mad Pistol

I'm probably still in the honeymoon phase with this setup, but dear god, GTX 1070 SLI is a monster!!!

Right now, I have BF4, Battlefront, Overwatch, and The Witcher 3, all running maxed out @ 5160x2160 (DSR). I average 80 FPS or more on each of those games. It is BONKERS!!!

Titan XP, SCHMITAN EXHPEE!!! I've got all the performance I want and more with this setup.









EDIT: also, dual FEs + HB SLI bridge is sexy as hell.


----------



## LiquidHaus

Oh, how this group has grown, and yet not a single person has modded a BIOS for these cards.


----------



## ogow89

Quote:


> Originally Posted by *lifeisshort117*
> 
> oh how this group has grown and yet not a single person has modded a bios for these cards.


word on the street is, they are hardware locked, and no bios mod is gonna change that.


----------



## alex4069

New card in today.


----------



## alex4069

I purchased the new HB SLI bridge, picked the one for my spacing, and it's just a little too long. So unless a company makes an adjustable HB SLI bridge I will just have to stay with the flexible ones.


----------



## LiquidHaus

Quote:


> Originally Posted by *ogow89*
> 
> word on the street is, they are hardware locked, and no bios mod is gonna change that.


the *official* word is that they are locked hardware wise at 1.25v

everyone's BIOS is currently capping it at 1.093v, tops.

there is quite a bit of overclocking to be had between those two.


----------



## alex4069

Ok, after a little more messing with it I could fit the HB sli bridge in.



and I can tell a little difference.

example: http://www.3dmark.com/compare/spy/163743/spy/163346


----------



## danjal

another Zotac owner, and really liking its performance.


----------



## Dude970

Quote:


> Originally Posted by *lifeisshort117*
> 
> the *official* word is that they are locked hardware wise at 1.25v
> 
> everyone's bios is cutting it currently at 1.093v, tops.
> 
> there is quite a bit of overclocking to be had between those two.


Have patience, it is coming


----------



## wickedout

Quote:


> Originally Posted by *danjal*
> 
> another Zotac owner, and really liking its performance.


Mine will be here either tomorrow or Friday. Zotac 1070 AMP GPU is pretty sick. I'm impressed with the numbers it's putting out. Overclocks like a beast IMHO! Can't wait to get my hands on it.


----------



## ClashOfClans

Hi guys, I'm currently running a GTX 970 with an i5 3570 CPU (non-K)... do you think my CPU will bottleneck a 1070?


----------



## Swolern

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> u can't ask me because i luv the two strix' i got installed 8 days ago. i'm killing 2nd biggest Doom boss now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (Max temps after multiple 2+ non-stop hours with all default settings out of the box is only a measly 69c; in an HAF-X case.
> 
> 
> 
> 
> 
> 
> 
> max fan speed 51%. it just can't print money. but man, it's one of best investments ever although the GTX1070 part of it deserves 99.99% of the credit- and not the brand. Strix is the icing on the cake- that's all- but is it ever gooood icing!?
> 
> 
> 
> 
> 
> 
> 
> )
> 
> the only advantage of the FE that comes to mind is, isn't FE better if u plan to add liquid cooling to it?
> 
> wat if any other advantages does FE have, ppl?


There are no advantages to the FE; it's a reference card. Most of the AIB cards have way better coolers, run much quieter, and have better power delivery. But all 1070s OC about the same due to the locked voltage (+/- silicon luck), so power delivery doesn't matter as much.

Your STRIX SLI is at 69c? Man, that's pretty warm. My SLI maxes out around 58c fully OC'd. I did adjust the fan curve slightly, though, that might be it. I would do the same if I were you. Lower temps = less throttling.
Quote:


> Originally Posted by *Mad Pistol*
> 
> I probably am still on my honeymoon with this setup, but dear god, GTX 1070 SLI is a monster!!!
> 
> Right now, I have BF4, Battlefront, Overwatch, and The Witcher 3, all running maxed out @ 5160x2160 (DSR). I average 80 FPS or more on each of those games. It is BONKERS!!!
> 
> Titan XP, SCHMITAN EXHPEE!!! I've got all the performance I want and more with this setup.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: also, dual FEs + HB SLI bridge is sexy as hell.


OK, that is one advantage; I do love the look of the FEs with that bridge!







Looks like a transformer!!!

And yeah, 1070 SLI is going to trump a Titan XP. When scaling is good, I calculated 1070 SLI can deliver as much as 40% more performance than a $1200 Titan XP at high resolutions.
Quote:


> Originally Posted by *ClashOfClans*
> 
> Hi guys I am currently running a gtx970. I have a i5 3570 cpu(non-k)...do you think my cpu will bottleneck a 1070?


Depends on resolution & game, but mostly no, it will not bottleneck.
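The usual way people in this thread spot a CPU bottleneck is from utilisation logs: GPU usage well under ~90% while the CPU is pegged means the CPU can't feed the card. A quick sketch of that rule of thumb (the thresholds are judgement calls, not anything official):

```python
# Rule-of-thumb bottleneck check from logged utilisation samples
# (e.g. exported from MSI Afterburner's monitor). Thresholds are
# judgement calls, not anything official.

def looks_cpu_bound(gpu_usage: list[float], cpu_usage: list[float]) -> bool:
    """True if average GPU usage is low while CPU usage is near its ceiling."""
    avg_gpu = sum(gpu_usage) / len(gpu_usage)
    avg_cpu = sum(cpu_usage) / len(cpu_usage)
    return avg_gpu < 90 and avg_cpu > 90

# e.g. ~70% GPU with 95-100% CPU (the GTA V case below) reads as CPU-bound:
print(looks_cpu_bound([70, 72, 68], [95, 98, 100]))
```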


----------



## Peet1

So this is pretty tricky. I couldn't find any info about this whatsoever.

*Does anyone know, if the Palit gtx 1070 Jetstreams, or similar versions, have any thermal pads between the PCB and the Backplate?*

I have one, but I'm not bold enough to risk voiding the warranty by taking it apart.


----------



## ITAngel

Mine is not overclocked yet, but I figured I'd post what I'm getting at stock, from GPU-Z.


----------



## Skull Knight

Anyone else getting low GPU usage in Arkham Knight with a CPU around the i7-2600 (non-K) @ 1080p? It hardly hovers above 60% unless I'm doing a benchmark; most other games seem to utilize over 90% just fine.

I know it's a rubbish port, by the way.


----------



## Powergate

Replaced the fan cover of my Palit Super JetStream











Now it's even quieter, and cooling has improved.


----------



## madmeatballs

Quote:


> Originally Posted by *ITAngel*
> 
> Mine is not overclocked yet but figure I would post what I am getting stock from GPU-Z




I overclocked mine and this is how far it can get; any further results in a driver crash. Zotac GTX 1070 AMP! Extreme edition here too!


----------



## marduke83

Quote:


> Originally Posted by *ogow89*
> 
> The "Golden Sample" from Gainward isn't all that golden. The damn card comes with 2088MHz out of the box, and with the power limit maxed goes to 2101MHz; I can't move the core clock at all with that thing. Memory so far stable at +600MHz.
> 
> Well, at least I get 21000 on the Firestrike graphics score.
> 
> Damn, I was hoping for something over 2150MHz on the core.


It's definitely golden compared to my Phoenix (normal model), which can't do over 2075 in games and 2062 in Firestrike...
I like the card, but the size of the cooler is my biggest gripe (I'm looking at SLI in the future, and my other card would literally have 2-3mm of clearance). The cooling is amazing, though; I was playing CSGO and the fans didn't even spin up. But I'm thinking of selling it and getting the EVGA FTW, mainly due to its 2-slot size compared to 3 slots. If I weren't looking at SLI later on I would keep it.


----------



## Mudfrog

Hit my first CPU bottleneck last night. GTA 5 at 1440p max settings (aside from AA), 70% GPU usage and 95-100% CPU usage.

Those with MSI, when using the MSI OC software I have no screen overlay from it or afterburner. If I close the OC software (forget the name of it) then afterburner will show the screen overlay. Anyone run into this?

Also, MSI dragoneye says I must have MSI hardware in order to use it.


----------



## ogow89

Quote:


> Originally Posted by *marduke83*
> 
> It's definitely golden compare to my phoenix (normal model) which can't do over 2075 in games and 2062 in firestrike...
> I like the card, but the size of the cooler is my biggest gripe (as I'm looking at sli in the future, and my other card would literally have 2-3mm of clearance) But the cooling is amazing, was playing CSGO and the fans didn't even spin up.. But thinking of selling it and getting the evga ftw, mainly due to the 2 card slot size compare to 3 slot. If I wasn't looking at sli later on I would keep it.


Yeah, I might be able to get more out of it once I settle into my new place; 2050MHz is where I've left the card for now. Games are all maxed out, including Witcher 3 at 1440p and higher. I even turned HairWorks up to high, from the low setting I used on the MSI 1070 Gaming X. That extra 120MHz really made a 3-5 fps difference, so I just used a higher setting with similar performance.


----------



## ogow89

Quote:


> Originally Posted by *Mudfrog*
> 
> Hit my first CPU bottleneck last night. GTA 5 at 1440p max settings (aside from AA), 70% GPU usage and 95-100% CPU usage.
> 
> Those with MSI, when using the MSI OC software I have no screen overlay from it or afterburner. If I close the OC software (forget the name of it) then afterburner will show the screen overlay. Anyone run into this?
> 
> Also, MSI dragoneye says I must have MSI hardware in order to use it.


For Dragon Eye, you need to install it separately for it to work properly; I had the same issue. As for the MSI OC software, I didn't run into the same issue as you, but then again I didn't bother with it.


----------



## TheMiracle

Quote:


> Originally Posted by *Mudfrog*
> 
> Hit my first CPU bottleneck last night. GTA 5 at 1440p max settings (aside from AA), 70% GPU usage and 95-100% CPU usage.
> 
> Those with MSI, when using the MSI OC software I have no screen overlay from it or afterburner. If I close the OC software (forget the name of it) then afterburner will show the screen overlay. Anyone run into this?
> 
> Also, MSI dragoneye says I must have MSI hardware in order to use it.


I'm also getting a CPU bottleneck in GTA, only in some areas. But even with the bottleneck I get 80+ FPS.


----------



## Mudfrog

Quote:


> Originally Posted by *ogow89*
> 
> For dragoneye, you need to install separately for it to work properly. I had the same issue. As far as the MSI OC software, i didn't run into the same issue as you. Then again i didn't bother with MSI OC software.


It was installed from the disc, did you download the software then? I'll try that tonight.
Quote:


> Originally Posted by *TheMiracle*
> 
> I am also getting CPU bottleneck in GTA, only in some areas. But even with the bottleneck I get 80+FPS.


With my setup I'm usually around 60-65 fps, occasionally dipping as low as 45.


----------



## marduke83

So I just bit the bullet on an EVGA FTW. Within an hour the price at my local store here in Australia dropped by $15, and at that price, with EVGA's customer service etc., I couldn't pass it up.


----------



## FlatOUT

Hello fellows! Just got a MSI Gtx 1070 Gaming X. Pretty happy with it!



Just did some overclock, looks stable







Is it a good overclock? Or have I lost the chip lottery again?


----------



## chrcoluk

I think 2037 is reasonable. Personally I'm happy I get over 2GHz, and hopefully you are too.


----------



## ITAngel

Quote:


> Originally Posted by *Powergate*
> 
> Replaced the fan cover of my Palit Super JetStream
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now it's even more silent & cooling has improved.


That looks pretty nice.
Quote:


> Originally Posted by *madmeatballs*
> 
> 
> 
> I overclocked mine and this is how far it can get, any further results to driver crash. Zotac GTX 1070 AMP! Extreme edition here too!


Hmm.. I see, I will attempt to OC it later on to see how far I can take it for sure. Thanks!


----------



## ITAngel

Quote:


> Originally Posted by *luan87us*
> 
> Lol ok that's way too much for a novelty toy for me haha. I don't even have a 1440p monitor yet. Using a BenQ XL2720Z 144hz monitor as I mainly play CSGO. I'm moving in a few months and will probably add a 4K tv to the set up to play GTAV and other games.


You can look into this other unit for $399.00: the Razer OSVR HDK2 headset (pre-order), shipping starting from 12th August.
http://hexus.net/tech/news/peripherals/94924-razer-osvr-hdk2-headset-open-pre-orders/


----------



## luan87us

Quote:


> Originally Posted by *ITAngel*
> 
> You can look into this other unit for $399.00 Razer OSVR HDK2 headset (Pre-Order) shipped starting from 12th August.
> http://hexus.net/tech/news/peripherals/94924-razer-osvr-hdk2-headset-open-pre-orders/


I saw that, but yeah, $400 is definitely way over what I had in mind. I was thinking $100-$150 haha. I guess VR gaming can wait a few years.


----------



## dislikeyou

Sometimes when I boot into Windows 10 the font is blurry; I go to Display Settings and it shows that I must sign out to apply 150% scaling, but that should only be required the first time scaling is set.

Anyone else experiencing the same on a 4K monitor? I think it's the GPU, because I don't remember having this issue on previous GPUs.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *Swolern*
> 
> There are no advantages of the FE, its a reference card. Most of the AIB cards have way better coolers and much quieter, with better power delivery management. But all 1070s OC about the same due to locked voltage(+/- silicon luck) so power del doesnt matter as much.
> 
> Your STRIX SLI is at 69c? Man thats pretty warm. My SLI maxes out around 58c fully OCd. But i did adjust the fan curve slightly, that might be it. I would do the same if i were you. Lower temps = less throttling.
> Ok that is one advantage, i do love that look of the FEs with that bridge!
> 
> 
> 
> 
> 
> 
> 
> Looks like a transformer!!!


Can you post a screenshot of your custom fan curve so I can try it, please? Thanks.









Because yes, with all default settings, my max fan speed is a mere 51%!









The Strix's two 4-pin PWM "ASUS FanConnect" GPU-controlled headers RULE!
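For anyone tweaking curves in Afterburner: the curve is just piecewise-linear interpolation between (temperature, fan %) points, so it's easy to reason about what any setting will do. A sketch with made-up points - pick your own values:

```python
# A custom fan curve is piecewise-linear interpolation between
# (temperature C, fan %) points. The points below are illustrative only.

CURVE = [(30, 30), (50, 45), (65, 70), (75, 90), (85, 100)]

def fan_percent(temp_c: float) -> float:
    """Linearly interpolate fan speed for a given core temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # past the last point, pin at max

print(fan_percent(57.5))  # halfway between the 45% and 70% points
```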


----------



## ogow89

Quote:


> Originally Posted by *FlatOUT*
> 
> Hello fellows! Just got a MSI Gtx 1070 Gaming X. Pretty happy with it!
> 
> 
> 
> Just did some overclock, looks stable
> 
> 
> 
> 
> 
> 
> 
> Is it a good overclock? Or i`ve lost chip lottery again ? )


Your score is a little on the low side. I had 102 fps, with 100mhz less core clock. At 2030mhz, you should be getting around 104 fps.


----------



## LiquidHaus

Quote:


> Originally Posted by *Powergate*
> 
> Replaced the fan cover of my Palit Super JetStream
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now it's even more silent & cooling has improved.


very nice! do you happen to have before and after temps?

SO I contacted Alphacool to see if they'd make a custom block/cooler for the Amp Extreme...

Based on how they make their line of blocks - a universal GPU block plus a custom heatsink that covers the entire PCB - that design could also work with most stock air-cooler fans mounted over the heatsink. It'd be nice if that were a viable option: you'd have excellent GPU cooling, and the VRM cooling wouldn't be passive but active, with the stock fans mounted to the new heatsink they make.


----------



## yevonxxx

simple question:

evga FTW or MSI gaming X?

can't decide.


----------



## FlatOUT

delete


----------



## FlatOUT

Quote:


> Originally Posted by *ogow89*
> 
> Your score is a little on the low side. I had 102 fps, with 100mhz less core clock. At 2030mhz, you should be getting around 104 fps.


I'm on an i7-2600K.


----------



## benjamen50

I just got the EVGA 1070 FTW, here's a screenshot of my final overclock and unigine heaven benchmark score:





Specs in rig sig.


----------



## jlhawn

Quote:


> Originally Posted by *yevonxxx*
> 
> simple question:
> 
> evga FTW or MSI gaming X?
> 
> can't decide.


MSI Gaming X

Here's mine. It's great.


----------



## Mad Pistol

Quote:


> Originally Posted by *yevonxxx*
> 
> simple question:
> 
> evga FTW or MSI gaming X?
> 
> can't decide.


I like the way the Gaming X looks better, but they are both great. Get whichever one you can find cheaper.


----------



## ikjadoon

Quote:


> Originally Posted by *Mudfrog*
> 
> Hit my first CPU bottleneck last night. GTA 5 at 1440p max settings (aside from AA), 70% GPU usage and 95-100% CPU usage.
> 
> Those with MSI, when using the MSI OC software I have no screen overlay from it or afterburner. If I close the OC software (forget the name of it) then afterburner will show the screen overlay. Anyone run into this?
> 
> Also, MSI dragoneye says I must have MSI hardware in order to use it.


Quote:


> Originally Posted by *TheMiracle*
> 
> I am also getting CPU bottleneck in GTA, only in some areas. But even with the bottleneck I get 80+FPS.


In GTA:V, it just gobbles up CPU speeds. You get measurable gains even overclocking an i7-6700K to 4.6GHz vs 4GHz stock. However, I guess this only matters if your monitor is higher than 60Hz.


----------



## TheMiracle

Quote:


> Originally Posted by *ikjadoon*
> 
> In GTA:V, it just gobbles up CPU speeds. You get measurable gains even overclocking an i7-6700K to 4.6GHz vs 4GHz stock. However, I guess this only matters if your monitor is higher than 60Hz.


Yep, I have a 144Hz G-Sync monitor!


----------



## Mudfrog

Quote:


> Originally Posted by *ikjadoon*
> 
> In GTA:V, it just gobbles up CPU speeds. You get measurable gains even overclocking an i7-6700K to 4.6GHz vs 4GHz stock. However, I guess this only matters if your monitor is higher than 60Hz.


I'm at 1080p 60hz.


----------



## chrcoluk

Ok I did some testing after messing with my voltage
Quote:


> With vsync on (60fps) my tdp usage is similar to yours, I guess you got vsync on.
> 
> My final observation is this.
> 
> When power limit is slightly limited, gpuz reports VREL whilst card throttles.
> When power limit is heavily limited gpuz will report PWR and card will throttle much more.
> When voltage is maxed out gpuz will also report VREL and I do agree that the default shipped bios is limited to 1.09v (possibly also hardware side).
> 
> I turned my fans right down and managed to get my card up to 73C, the card I think is downclocking slightly at 70C but its not conclusive, I would have to do a lot more testing.
> 
> When I set the TDP limit to 60% in Afterburner with vsync off, clocks were down to 1600MHz. At 50% they were at Maxwell speeds, circa 1400-1500MHz.


With vsync on, so the card is nowhere near the TDP limit, I do get a solid extra 30-50MHz; but with vsync off and the card fully pushed, the voltage goes down to 1.06v, only slightly higher than the default 1.05v, due to the TDP limit.
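For context on those TDP-limit percentages: the slider is relative to the board's power target, which is 150W on the 1070 FE (AIB boards ship with different targets, so treat that number as an assumption for non-FE cards). The arithmetic is just:

```python
# Afterburner's power-limit slider is a percentage of the board power target.
# 150 W is the 1070 FE's rated TDP; AIB cards ship with different targets,
# so the default below is an assumption for anything non-FE.

def power_limit_watts(percent: float, board_target_w: float = 150.0) -> float:
    """Convert a power-limit slider percentage to a wattage cap."""
    return board_target_w * percent / 100.0

print(power_limit_watts(60))   # the heavy-throttle test above
print(power_limit_watts(112))  # a common maxed-out slider setting
```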


----------



## Raikiri

Oh yeah, my heaven score while we're at it:



2500k @ 4.5Ghz, 2088 core and 9020 mem.


----------



## bigjdubb

Quote:


> Originally Posted by *lifeisshort117*
> 
> SO I contacted Alphacool to see if they'd make a custom block/cooler for the Amp Extreme...
> 
> based on how they make their line of blocks - with their universal GPU block and then a custom heatsink that covers the entire PCB - that design could also work with most stock air cooler fans over their heatsink design. it'd be nice if that'd be a viable option. you'd have excellent GPU cooling and all the VRM cooling wouldn't be passive. It'd be active based on the stock fan design mounted to the new heatsink that they make.


From the way I understood it, if they didn't offer a heatsink for the GPX series for your card you could send them your card and they would make one.

I had them on for my 970's. I just sent them to my buddy and he decided to paint them white to match his new build, I think it looks pretty sharp.


----------



## chrcoluk

4670k @ 4.3ghz
2062 core (but throttles)
9316 mem

1080p, MSAA 8x, ultra, extreme tessellation


----------



## LiquidHaus

Quote:


> Originally Posted by *benjamen50*
> 
> I just got the EVGA 1070 FTW, here's a screenshot of my final overclock and unigine heaven benchmark score:
> 
> 
> 
> 
> 
> Specs in rig sig.


Very nice OC you got there! That's the second FTW card to get above-average overclocks compared to everyone else, if I'm not mistaken.

Quote:


> Originally Posted by *bigjdubb*
> 
> From the way I understood it, if they didn't offer a heatsink for the GPX series for your card you could send them your card and they would make one.
> 
> I had them on for my 970's. I just sent them to my buddy and he decided to paint them white to match his new build, I think it looks pretty sharp.


You are correct - I would have to send in my 1070 to them for a block to actually happen. That'd be a big sacrifice; however, I'd only do it at this point IF we get some voltage unlocks. Otherwise, an air-cooled card is perfectly fine with the current BIOS limits.


----------



## LiquidHaus

QUESTION for you guys: has anyone successfully flashed a BIOS from another manufacturer onto their card? I haven't been on this thread much since I went on vacation two weeks ago.

I have a EVGA FTW bios I would really like to try on my Zotac Amp Extreme.

and yes, double post. sorry about it


----------



## dislikeyou

Quote:


> Originally Posted by *lifeisshort117*
> 
> QUESTION for you guys: has anyone successfully flashed a bios from another manufacturer onto their card? haven't been on this thread much since i had went on vacation two weeks ago.
> 
> I have a EVGA FTW bios I would really like to try on my Zotac Amp Extreme.
> 
> and yes, double post. sorry about it


Some people have.

http://www.overclock.net/t/1601329/gtx-1070-1080-titan-x-2nd-gen-bios-who-has-it


----------



## pez

Quote:


> Originally Posted by *dislikeyou*
> 
> Sometimes when I boot into Windows 10, the font is blurry, I go to Display Settings and it shows me that I must sign out to apply 150% scaling but that should be required only first time scaling is set.
> 
> Anyone else experience the same on a 4K monitor? I think its the GPU cuz I don't remember having this issue on previous GPUs.


Is this happening every time you log out or restart? I set mine once and had to log out and back in. It never changed until I manually went back and changed. I believe if you play a game at a lower resolution, Windows will try to auto-detect scaling and set it, but changes won't apply until you log out and back in again. It also shouldn't trigger automatically without you at least opening up the Display Settings.


----------



## dislikeyou

Quote:


> Originally Posted by *pez*
> 
> Is this happening every time you log out or restart? I set mine once and had to log out and back in. It never changed until I manually went back and changed. I believe if you play a game at a lower resolution, Windows will try to auto-detect scaling and set it, but changes won't apply until you log out and back in again. It also shouldn't trigger automatically without you at least opening up the Display Settings.


Not every time but happened a few times. I open up Chrome and notice that text is blurry so I open up Display Settings and it shows that I must log out to apply. Maybe something is messed up in Windows.


----------



## bigjdubb

I will mess around with scaling tonight to see if I get anything weird. Since I don't run any scaling on my 4K screen (55") I haven't run into anything yet.


----------



## LiquidHaus

I am so sad right now.

Apparently the 1070 FTW has 10+2 phase layout

Amp Extreme is 8+2 phase layout.

that makes the FTW the highest and cleanest power delivery 1070 on the market.

I had thought this whole time that the Amp Extreme was a 10+3 phase layout.

NOOOOO

well, other than that, the cards appear to be damn close to each other. I would really like to flash the FTW bios to my Amp Extreme tonight. just super nervous.


----------



## TopicClocker

Hey everyone, I'm planning to add some benchmarks to the second page, does anyone have any suggestions for benchmarks to run on these awesome GPUs?

I'm thinking about the FFXIV ARR benchmark as well as 3DMark Firestrike, and maybe Unigine Valley and Heaven.


----------



## rv8000

Not sure if we have the proper tools to look into the Bios yet, but does anyone know if there are bios differences outside of clock speeds for the MSI Gaming, Gaming X, and Gaming Z?

My plan was to simply flash the regular Gaming model to either the X or the Z BIOS, but if there aren't any BIOS differences I'm pretty sure I just got the royal dud of a 1070; any memory OC above +250 artifacts like crazy. Anyone know if they did any binning for the models?


----------



## FlatOUT

Quote:


> Originally Posted by *Skull Knight*
> 
> Anyone else getting low GPU usage on Arkham Knight with a CPU around the i7-2600 (non K) @ 1080p? It hardly hovers above 60% unless I'm doing a benchmark, most other games seem to utilize over 90% just fine.
> 
> I know its a rubbish port by the way.


I'm on a 2600K and usage goes from 60 up to 98% (1440p).


----------



## FlatOUT

Quote:


> Originally Posted by *rv8000*
> 
> Not sure if we have the proper tools to look into the Bios yet, but does anyone know if there are bios differences outside of clock speeds for the MSI Gaming, Gaming X, and Gaming Z?
> 
> My plan was to simply flash the regular gaming model to either the X or the Z, but if there aren't any bios differences I'm pretty sure I just got the royal dud of a 1070; memory oc above +250 artifacts like crazy. Anyone know if they did any binning for the models?


I'm getting the most performance boost out of overclocking the memory... I'm on the X and +600 is fine.


----------



## Dude970

That sounds like some good benches. I have never tried the FFXIV ARR benchmark


----------



## rv8000

Quote:


> Originally Posted by *FlatOUT*
> 
> Im getting the most performance boost out of ovrecloking the memory.. im on X and +600 fine


I'm a little lazy atm so I'm not going to pull the card apart tonight, but I have a strange feeling they used different RAM ICs on the non-X/Z "Gaming" version.


----------



## Dude970

Quote:


> Originally Posted by *TopicClocker*
> 
> Hey everyone, I'm planning to add some benchmarks to the second page, does anyone have any suggestions for benchmarks to run on these awesome GPUs?
> 
> I'm thinking about having the FFXIV ARR benchmark as-well as 3DMark Firestrike and maybe Unigine Valley and Heaven, does anyone have any suggestions?


I would also suggest the new TimeSpy Bench


----------



## Vaesauce

Quote:


> Originally Posted by *lifeisshort117*
> 
> I am so sad right now.
> 
> Apparently the 1070 FTW has 10+2 phase layout
> 
> Amp Extreme is 8+2 phase layout.
> 
> that makes the FTW the highest and cleanest power delivery 1070 on the market.
> 
> I had thought this whole time that the Amp Extreme was a 10+3 phase layout.
> 
> NOOOOO
> 
> well, other than that, the cards appear to be damn close to each other. I would really like to flash the FTW bios to my Amp Extreme tonight. just super nervous.


The 1070XG has a 10+2 phase layout as well


----------



## ikjadoon

Quote:


> Originally Posted by *Vaesauce*
> 
> The 1070XG has a 10+2 phase layout as well


Haven't we known for years, though, that the _number of phases_ doesn't necessarily matter, but the quality of the phases does?

Isn't this akin to saying, "Man, this AMD FX-9350 has 8 cores. I'm so sad my Intel i7-6700K only has 4."


----------



## Vaesauce

Quote:


> Originally Posted by *ikjadoon*
> 
> Haven't we known for years, though, that the _number of phases_ don't necessarily matter, but the quality of the phases?
> 
> Isn't this akin to saying, "Man, this AMD FX-9350 has 8 cores. I'm so sad my Intel i7-6700K only has 4."


While true, I was only pointing out that the XG also has the same power phase layout.

I can't comment on whether its quality is good or bad until we have the ability to unlock the voltage and see what all the cards are capable of. But I do agree with your statement.


----------



## SuperZan

Quote:


> Originally Posted by *ikjadoon*
> 
> Haven't we known for years, though, that the _number of phases_ don't necessarily matter, but the quality of the phases?
> 
> Isn't this akin to saying, "Man, this AMD FX-9350 has 8 cores. I'm so sad my Intel i7-6700K only has 4."


Most definitely. That said, 10+2 high-quality phases are certainly desirable, all else being equal.


----------



## rv8000

So, a word of warning to anyone considering the MSI GTX 1070 "Gaming": they're using Micron GDDR5, and apparently (I haven't checked the ICs for the M# yet) the ICs seem to be running pretty close to their max at the stock 8GHz; I can't even get +175 stable on the memory OC, compared to the Samsung-based cards. A bit frustrated by this.










Anyone with the Gaming X/Z variant running anything but Samsung chips?
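To put those memory clocks in perspective: GDDR5 is quad-pumped, so GPU-Z's ~2002MHz stock reading on the 1070 corresponds to the advertised 8GHz effective rate, and bandwidth follows from the 256-bit bus. A quick sketch of that arithmetic:

```python
# GDDR5 is quad-pumped: effective transfer rate = memory clock (as GPU-Z
# reports it, ~2002 MHz stock on the 1070) x 4. Bandwidth then follows
# from the 1070's 256-bit bus. Stock numbers below match published specs.

BUS_WIDTH_BITS = 256

def effective_mts(gpuz_clock_mhz: float) -> float:
    """Effective transfer rate in MT/s from the GPU-Z memory clock."""
    return gpuz_clock_mhz * 4

def bandwidth_gbps(gpuz_clock_mhz: float) -> float:
    """Memory bandwidth in GB/s for the 1070's 256-bit bus."""
    return effective_mts(gpuz_clock_mhz) * BUS_WIDTH_BITS / 8 / 1000

print(effective_mts(2002))   # stock: ~8008 MT/s (the advertised "8 GHz")
print(bandwidth_gbps(2002))  # ~256 GB/s, matching the 1070 spec
print(bandwidth_gbps(2252))  # a ~9 GHz-effective overclock
```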


----------



## JackCY

Quote:


> Originally Posted by *rv8000*
> 
> So a word of warning to anyone considering the MSI GTX 1070 "Gaming", they're using micron GDDR5 and apparently (haven't checked the IC's for M# yet) the IC's seem to be running pretty close to max at the stock 8Ghz; can't even get +175 stable on the memory OC compared to the Samsung based cards. A bit frustrated by this
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone with the Gaming X/Z variant running anything but Samsung chips?


Just reball GDDR5x no?


----------



## rv8000

Quote:


> Originally Posted by *JackCY*
> 
> Just reball GDDR5x no?


I wish, but don't really need my 1080







Just wasn't expecting different memory on the Gaming card; I guess they had to cut something to save some $$$.


----------



## jlhawn

Quote:


> Originally Posted by *rv8000*
> 
> So a word of warning to anyone considering the MSI GTX 1070 "Gaming", they're using micron GDDR5 and apparently (haven't checked the IC's for M# yet) the IC's seem to be running pretty close to max at the stock 8Ghz; can't even get +175 stable on the memory OC compared to the Samsung based cards. A bit frustrated by this
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone with the Gaming X/Z variant running anything but Samsung chips?


Mine has Samsung; yours is the first I've heard of using Micron in a 1070, but that's just my research.


----------



## rv8000

Quote:


> Originally Posted by *jlhawn*
> 
> Mine has Samsung, yours is the first I heard of using Micron in the 1070 but, thats just my research.


It's probably reserved for the non X/Z models.


----------



## kpo6969

Quote:


> Originally Posted by *rv8000*
> 
> So a word of warning to anyone considering the MSI GTX 1070 "Gaming", they're using micron GDDR5 and apparently (haven't checked the IC's for M# yet) the IC's seem to be running pretty close to max at the stock 8Ghz; can't even get +175 stable on the memory OC compared to the Samsung based cards. A bit frustrated by this
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone with the Gaming X/Z variant running anything but Samsung chips?


My MSI GTX 1070 Gaming shows it's Samsung.


----------



## rv8000

Quote:


> Originally Posted by *kpo6969*
> 
> My MSI GTX 1070 Gaming shows it's Samsung.


RIP my luck









So far I can't even sustain 2038 on the core either; it feels like I hit the bottom of the barrel with this card.


----------



## marik123

This is the best Heaven 4.0 score I can achieve without any artifacts; my core topped out at 2100MHz @ 1.081v.


----------



## TheDeadCry

I've got the MSI Gaming X - My memory is manufactured by Micron as well. :| I can't attest to the quality, as I haven't done any OC'ing besides bumping the core to run at 2GHz under load.


----------



## Zer0CoolX

Quote:


> Originally Posted by *rv8000*
> 
> So a word of warning to anyone considering the MSI GTX 1070 "Gaming", they're using micron GDDR5 and apparently (haven't checked the IC's for M# yet) the IC's seem to be running pretty close to max at the stock 8Ghz; can't even get +175 stable on the memory OC compared to the Samsung based cards. A bit frustrated by this
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone with the Gaming X/Z variant running anything but Samsung chips?


Now you've got me worried; I just ordered the X. I'll check it out in about a week and report back which RAM it comes with.


----------



## DStealth

Quote:


> Originally Posted by *marik123*
> 
> This is the best Heaven 4.0 scores I can achieve with any artifact and my core topped out at 2100mhz @ 1.081v.


Not bad. Here's mine for comparison.


----------



## syl1979

I feel that, contrary to Maxwell, Nvidia really did maximize out-of-the-box frequencies on Pascal GPUs.


----------



## danjal

Quote:


> Originally Posted by *syl1979*
> 
> I feel that to the contrary of maxwell, nvidia really did maximize out of box frequencies on pascal gpuz.


Or they locked the 1070 down so it couldn't get near the 1080.


----------



## syl1979

By cutting 25% of the cores, of course they did. But on the custom cards it seems difficult to get more than a 15% overclock.


----------



## DStealth

Quote:


> Originally Posted by *syl1979*
> 
> I feel that to the contrary of maxwell, nvidia really did maximize out of box frequencies on pascal gpuz.


As for the custom models, yes...the OC headroom is quite restricted, although a 1-1.5GHz effective memory overclock is not too shabby. But comparing base boost clocks, my card for example holds a 22% higher 24/7 gaming-stable overclock, day and night, which can hardly be called maximized out of the box, let alone with the memory overclock on top....


----------



## Powergate

Quote:


> Originally Posted by *lifeisshort117*
> 
> very nice! do you happen to have before and after temps?


~72°C @ 1150rpm, now ~69°C @ 850rpm

Interesting side note: the original fan made some PWM noise that sounded like coil whine. It was easily reproducible, as only certain speeds like 33% made the PWM noise (at 34% it was silent).

Edit:
Was exactly like shown in this video:


----------



## chrcoluk

My current GPU-Z run; note I beat someone else's score with a lower core clock, as my higher VRAM clock compensated.


----------



## DStealth

Lol, that memory OC is not just average...that's high, GDDR5X territory


----------



## chrcoluk

Someone on OcUK is reporting that their Zotac card only hits 60% TDP at 1.09v, vs my Palit being TDP-throttled and dropping to 1.06v. It's possible Zotac have set a very generous TDP limit on their cards; I estimate around a 238-watt limit. I wonder if Zotac users can test? Note this is in Unigine benches with vsync off so the card is fully utilised; I can maintain 1.09v fine if vsync is on, capped at 60fps.
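A rough sanity check on that 238-watt figure (my own back-of-the-envelope reading of the numbers, not a measurement; both the ~150W draw and the ~63% reading are assumptions): if a card draws roughly 150W under this Unigine load while sitting at 100% of a reference-style limit, and the Zotac reports only ~63% TDP under the same load, the implied limit works out like so:

```python
# Assumed absolute draw under the Unigine load (hypothetical reference point)
observed_draw_w = 150
# TDP fraction the Zotac reportedly shows under that same load (assumed ~63%)
zotac_tdp_fraction = 0.63

# Implied power limit: the draw divided by the fraction of the limit it represents
implied_limit_w = observed_draw_w / zotac_tdp_fraction
print(round(implied_limit_w))  # ~238 W
```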


----------



## Dude970

Here is my Heaven run


----------



## FlatOUT

What's a safe overclock for our cards (mem clock/voltage/core clock)? I mean for gaming, not benchmarking. I've hit 2038 with +0mV, and 2100 with +100mV with the fan set to 80-90%. Is it OK that the GPU clocks down as temps go up?


----------



## GreedyMuffin

For the guys who want to flash the Zotac BIOS on their FTW (or vice versa): don't. It can mess up your card since the power delivery is completely different. If both had been FEs (or FE PCBs) it would be no problem.

Can anyone with a GTX 1070 run a vanilla Fire Strike for me? My buddy, who has a 6700K and a slightly overclocked 1070, gets this score: http://www.3dmark.com/3dm/13684626 (+124 offset on the GPU)

Seems kinda low when my 980Ti at 1500/1990 (mem was at 2000 in this test, same deal) gets this: http://www.3dmark.com/fs/6664726

Thanks!


----------



## Swolern

Quote:


> Originally Posted by *FlatOUT*
> 
> What's a safe overclock for our cards (mem clock/voltage/core clock)? I mean for gaming, not benchmarking. I've hit 2038 with +0mV, and 2100 with +100mV with the fan set to 80-90%. Is it OK that the GPU clocks down as temps go up?


There is a hard voltage lock on these cards. As long as your OC isn't producing artifacts or crashes, you're good. GPU clocks dropping due to temps is called throttling and it's normal. Get those temps down for a more stable OC.


----------



## GreedyMuffin

Quote:


> Originally Posted by *FlatOUT*
> 
> What's a safe overclock for our cards (mem clock/voltage/core clock)? I mean for gaming, not benchmarking. I've hit 2038 with +0mV, and 2100 with +100mV with the fan set to 80-90%. Is it OK that the GPU clocks down as temps go up?


Keep the temp as low as you can get them and there won't be any problem whatsoever.

Ramping up the voltage is no problemo, not in MSI Afterburner, Precision X, etc. The voltage bump is so small it won't harm the card, and neither will a high memory or core clock. Just keep it stable.









I'm running 2100 on my 1080 FE at stock volts in my 4770 rig until my EK block gets delivered. Will ramp up the voltage and hope for 2200 when the temps are around 40°C. Crossing my fingers!

EDIT: Goddammit, Swolern was quicker .


----------



## kaudiyo

A clubber from Spain here...with MSI Gaming X SLI, nice cards







.

I've been in the club since June 20th but couldn't post earlier because I was on holiday until this past Sunday...

I finally got my HB SLI bridge (holy cow!). I think I'm one of the first owners here, since in Spain they're not on sale yet; I had to import it from the US







.

I'll post some benchmarks later with and without the HB bridge (using a single ribbon).


----------



## FlatOUT

Anyone else sometimes experiencing micro-freezing problems?


----------



## Antipathy

Quote:


> Originally Posted by *rv8000*
> 
> RIP my luck
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So far I can't even sustain 2038 on the core either, feels like I hit the bottom of the barrel with this card


You and me both. I have the MSI Armor, but I'm not having much OC luck either. I can hit 2038, but it eventually downclocks to 2025. PerfCap reports VRel, but it is actually temp. If I set the fans to 100% it will hold 2038MHz, but anything higher crashes hard. It starts to throttle between 58-60c. I can only get +350MHz on the memory. I can't even crack 100fps in Heaven.

Still a helluva card, I guess; just a little disappointing when you see everybody basically STARTING at +100MHz core, +500MHz memory.


----------



## benjamen50

Quote:


> Originally Posted by *FlatOUT*
> 
> Anyone else sometimes experiencing micro-freezing problems?


Is this during benchmarks? I kind of get FPS drops that recover a few moments later in the Unigine Heaven benchmark. I think it could mean an unstable OC, but I could be wrong.


----------



## ricko99

So I'm torn between the 1070 AMP Extreme and the Super JetStream. I can get a $75 discount on the Super JetStream, and both have the same boost clock, so now I'm just worried about temperature. Anyone with a Super JetStream mind sharing temperatures while gaming?


----------



## bigjdubb

Quote:


> Originally Posted by *Antipathy*
> 
> You and me both. I have the MSI Armor, but I'm not having much OC luck either. I can hit 2038, but it eventually downclocks to 2025. PerfCap reports VRel, but it is actually temp. If I set the fans to 100% it will hold 2038MHz, but anything higher crashes hard. It starts to throttle between 58-60c. I can only get +350MHz on the memory. I can't even crack 100fps in Heaven.
> 
> Still a helluva card, I guess; just a little disappointing when you see everybody basically STARTING at +100MHz core, +500MHz memory.


You aren't missing much; 2050MHz seems to be the high side of average.

Quote:


> Originally Posted by *ricko99*
> 
> So I'm torn between the 1070 AMP Extreme and the Super JetStream. I can get a $75 discount on the Super JetStream, and both have the same boost clock, so now I'm just worried about temperature. Anyone with a Super JetStream mind sharing temperatures while gaming?


I don't think any of the aftermarket cards have much trouble keeping temps under control. Some cards may be a little better than others, but they all seem to get the job done.


----------



## NCSUZoSo

Video of my install of the H110 + Kraken G10 onto my MSI GTX 1070 Armor OC:







With an OC at 2.139 GHz on the core and 9.624 Gbps on the VRAM I was able to barely beat a stock 1080 with the same CPU as mine in 3DMark Time Spy.

Max Load Temp: 41C

(This score is low compared to most 1080 scores because of the older CPU (3770K); most are over 7000, but most of those 1080s are also OC'd.)


----------



## Antipathy

Quote:


> Originally Posted by *bigjdubb*
> 
> You aren't missing much; 2050MHz seems to be the high side of average.


Well that makes me feel a little bit better, thanks.


----------



## chrcoluk

Quote:


> Originally Posted by *GreedyMuffin*
> 
> For the guys who want to flash the Zotac BIOS on their FTW (or vice versa): don't. It can mess up your card since the power delivery is completely different. If both had been FEs (or FE PCBs) it would be no problem.
> 
> Can anyone with a GTX 1070 run a vanilla Fire Strike for me? My buddy, who has a 6700K and a slightly overclocked 1070, gets this score: http://www.3dmark.com/3dm/13684626 (+124 offset on the GPU)
> 
> Seems kinda low when my 980Ti at 1500/1990 (mem was at 2000 in this test, same deal) gets this: http://www.3dmark.com/fs/6664726
> 
> Thanks!


Yeah, to clarify, I wasn't asking anyone to cross-flash the BIOS; I'm asking if Zotac card owners could test, thanks. AMP Extreme if possible.

Also yeah his score is low, he should be able to clear 20k, maybe high 19000s but below 19k seems low.


----------



## mcbaes72

During intense battle on DA Inquisition, PC froze. First time this happened on 1070 Armor and new (used) PG278Q. Temps weren't too high, 76C, fans 66%. Lowered graphics from Ultra down to High settings. Profile 1: +150 Core and +456 Memory, used Profile 2: +100/+300 afterwards and was able to play for over an hour on DAI and even Shadows of Mordor with no more issues.

Hmm... Since it's been about a week with new 1440p monitor, thought the 1070 handled itself well on Ultra settings with fairly low fan speed. Maybe it was just a one-time glitch that caused gaming rig to freeze.


----------



## chrcoluk

Usually a complete freeze isn't GPU related, as the GPU driver is designed to restart itself after a crash. Even in the XP days you would get a BSOD rather than a hard freeze. A freeze seems more likely to be a CPU or motherboard issue.


----------



## NCSUZoSo

Quote:


> Originally Posted by *chrcoluk*
> 
> Usually a complete freeze isn't GPU related, as the GPU driver is designed to restart itself after a crash. Even in the XP days you would get a BSOD rather than a hard freeze. A freeze seems more likely to be a CPU or motherboard issue.


In XP you would get a BSOD with a driver crash the majority of the time. Even in Vista and Windows 7 it didn't always successfully restart the driver, so it wouldn't be a shock to me if it happened randomly in W10 also.
Quote:


> Originally Posted by *mcbaes72*
> 
> During intense battle on DA Inquisition, PC froze. First time this happened on 1070 Armor and new (used) PG278Q. Temps weren't too high, 76C, fans 66%. Lowered graphics from Ultra down to High settings. Profile 1: +150 Core and +456 Memory, used Profile 2: +100/+300 afterwards and was able to play for over an hour on DAI and even Shadows of Mordor with no more issues.
> 
> Hmm... Since it's been about a week with new 1440p monitor, thought the 1070 handled itself well on Ultra settings with fairly low fan speed. Maybe it was just a one-time glitch that caused gaming rig to freeze.


What is your temp limit set to in AB? Hit the drop down box next to the power limit and check.


----------



## mcbaes72

Quote:


> Originally Posted by *chrcoluk*
> 
> Usually a complete freeze isn't GPU related, as the GPU driver is designed to restart itself after a crash. Even in the XP days you would get a BSOD rather than a hard freeze. A freeze seems more likely to be a CPU or motherboard issue.


I didn't consider the CPU and MoBo (6700k/Z170-Pro), will keep a closer eye on temps and core loads using Core Temp and ASUS AI Suite. Thanks for the advice.


----------



## mcbaes72

Quote:


> Originally Posted by *NCSUZoSo*
> 
> In XP you would get a BSOD with a driver crash the majority of the time. Even in Vista and Windows 7 it didn't always successfully restart the driver, so it wouldn't be a shock to me if it happened randomly in W10 also.
> What is your temp limit set to in AB? Hit the drop down box next to the power limit and check.


Temp Limit = 95C
Power Limit = 108%
AB version 4.2.0
OS Win10 Prof.

EDIT: Yeah, didn't auto restart, pressed Reset button on case.


----------



## jrizzz

Quote:


> Originally Posted by *rv8000*
> 
> RIP my luck
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So far I can't even sustain 2038 on the core either, feels like I hit the bottom of the barrel with this card


Don't worry, I didn't know about the gaming/gaming x/gaming z models until after I ordered my card. Hopefully luck is on my side when I get it. At least I get 6 months to pay it off with paypal credit












----------



## NCSUZoSo

Quote:


> Originally Posted by *jrizzz*
> 
> Don't worry, I didn't know about the gaming/gaming x/gaming z models until after I ordered my card. Hopefully luck is on my side when I get it. At least I get 6 months to pay it off with paypal credit
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 


I have an MSI 1070 Armor OC running at 2.139 GHz with the VRAM at 9.624 Gbps (max load temp = 41C); look back at the last page for the video of the Kraken G10 + H110 install.

There is no real difference between the Gaming/Gaming X/Gaming Z except a few cosmetic changes like LEDs and backplates. In the end, the Armor's 6+2 power design is plenty for OCing, so any of the Gaming models will do very well with their 8+2 design. However, it's my opinion that power phases don't matter nearly as much in this 16nm generation as in previous gens. I think the binning of the chip is far more important, given the low power limit we're allowed (108%) vs the 1080's (120%).

OC that GDDR5; it should reach well past 9 Gbps (8 Gbps is stock; the 1080's GDDR5X is 10 Gbps)
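For anyone converting between the memory clock readout and these Gbps figures: GDDR5 moves four bits per pin per clock relative to the clock GPU-Z reports, so the effective data rate is simply 4x that readout. A quick sketch (the 2002/2406MHz values are the stock and overclocked readings discussed in this thread):

```python
# GDDR5 effective data rate is 4x the memory clock GPU-Z reports
# (the chips are quad-pumped relative to that readout).
def gddr5_effective_gbps(gpuz_mem_clock_mhz):
    return gpuz_mem_clock_mhz * 4 / 1000  # MHz -> Gbps per pin

print(gddr5_effective_gbps(2002))  # stock 1070: ~8.008 Gbps
print(gddr5_effective_gbps(2406))  # the 9.624 Gbps OC mentioned above
```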


----------



## danjal

Quote:


> Originally Posted by *Antipathy*
> 
> You and me both. I have the MSI Armor, but I'm not having much OC luck either. I can hit 2038, but it eventually down clocks to 2025. PerfCap is reporting VRel, but it is actually temp. If I set the fans to 100%, it will hold 2038Mhz, but anything higher crashes hard. It starts to throttle between 58-60c. I can only get +350Mhz on the memory. I can't even crack 100fps in heaven.
> 
> Still a helluva card, I guess, just a little disappointing when you see everybody basically STARTING at +100Mhz core, +500Mhz memory.


I run a Zotac 1070 AMP and I've noticed mine throttles when it hits exactly 60c, with no way around it... at 60c it throttles to 2038. I'm pretty sure NVIDIA is doing it on purpose, which is stupid at such a low temperature. I could see maybe 70 or 75c, but not 60c.


----------



## NCSUZoSo

Quote:


> Originally Posted by *danjal*
> 
> I run a Zotac 1070 AMP and I've noticed mine throttles when it hits exactly 60c, with no way around it... at 60c it throttles to 2038. I'm pretty sure NVIDIA is doing it on purpose, which is stupid at such a low temperature. I could see maybe 70 or 75c, but not 60c.


This is not true about NVIDIA limiting to 60C before downclocking. If you watch my YouTube video on the previous page you'll see I was holding a little over 2 GHz at 75C with no throttling going on before installing the H110.

Also even though my max load temp after 2 hours is 41C on the core, it won't OC any further (2.139 GHz), so it's not the temp. Make sure you don't have your power limit and temp linked, there is a checkbox that links them when you hit the drop down box.


----------



## Fr3eWar

Hello there,

I'm new to this forum, but have some questions.
I currently own a Gainward GTX 1070 GLH edition. I'm happy with it, but for fun I'm going to build a custom-loop water-cooled PC.
I've just ordered the parts and the case. My card will also be water-cooled, so for me the solution is EVGA, but I can't decide between the FE and SC; both are reference PCBs, and my warranty will only stay intact if I go with EVGA. My reference full-cover block from EK is already ordered. I'll order the new card, and once it arrives I'll sell the Gainward.
What do you think, FE or SC?
At the moment I can't order the SC here in Hungary at a normal price (last week one shop had it at ~$589; now another shop has it at $670), while the FE is at a "good" price of $582. Should I wait and hope an SC comes to market for around $600, or stick with the FE? The cooler will be replaced anyway.

How do you guys view the FE editions on this forum? On the biggest Hungarian PC forum, FE owners say that FE cards easily run the same clocks as custom cards (like my GLH now).


Thanks for the reps


----------



## danjal

Quote:


> Originally Posted by *NCSUZoSo*
> 
> This is not true about NVIDIA limiting to 60C before downclocking. If you watch my YouTube video on the previous page you'll see I was holding a little over 2 GHz at 75C with no throttling going on before installing the H110.
> 
> Also even though my max load temp after 2 hours is 41C on the core, it won't OC any further (2.139 GHz), so it's not the temp. Make sure you don't have your power limit and temp linked, there is a checkbox that links them when you hit the drop down box.


I'm running the 1070 AMP Edition; I was assuming it's Zotac's BIOS doing it, not NVIDIA's. Guess I should have been more specific.


----------



## Antipathy

Quote:


> Originally Posted by *NCSUZoSo*
> 
> This is not true about NVIDIA limiting to 60C before downclocking. If you watch my YouTube video on the previous page you'll see I was holding a little over 2 GHz at 75C with no throttling going on before installing the H110.
> 
> Also even though my max load temp after 2 hours is 41C on the core, it won't OC any further (2.139 GHz), so it's not the temp. Make sure you don't have your power limit and temp linked, there is a checkbox that links them when you hit the drop down box.


Mine definitely drops 1 bin from 2038 to 2025 when it hits 60c. I've watched it happen repeatedly. It never drops any more than that, though, so presumably, it could still clock at 2025 at 75c. It typically hovers around 60-63c.

I do have power and temp linked, though I'm not sure that is applicable here. I'll give unlinking them a shot, though.

EDIT - Cool video, btw, definitely saving that.


----------



## NCSUZoSo

Quote:


> Originally Posted by *Fr3eWar*
> 
> Hello there,
> How do you see guys the FE editions in this forum? In the biggest Hungarian PC forum the FE owners say, that FE cards run easily the same clock as Custom cards (like my GLH now)


The reason most people are happy with the FE compared to upgraded models is exactly what everyone is talking about: the power limit. With the power limit restricted to 108%, the gains from upgraded power circuitry on custom boards are smaller than on the 1080, which has a 120% power limit.

Founders Edition cards have a 4+1 phase design, which sounds very weak, but thanks to the die shrink to 16nm the required power is much lower. However, if the power limit could be raised, the custom cards should OC MUCH better. I believe with the power limit raised I could easily hit 2.3 GHz on my 1070 Armor (6+2 phase design).

If you are only looking at the EVGA 1070 SC, then it doesn't really matter: the SC is basically a reference design with the stock 4+1 layout. So for you there is no real difference between the two cards except the cooler, which you won't be using. In this case, get the FE if you want it now.

You can see the difference between the EVGA models here; the FTW has an upgraded power phase design (10+2):
http://www.evga.com/products/Product.aspx?pn=08G-P4-6173-KR
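To put those percentage sliders in absolute terms (assuming the published reference board-power figures of 150W for the 1070 and 180W for the 1080):

```python
# Reference board power (W) and max power-limit slider (%) per card
cards = {
    "GTX 1070": (150, 108),
    "GTX 1080": (180, 120),
}

for name, (tdp_w, limit_pct) in cards.items():
    max_w = tdp_w * limit_pct / 100
    print(f"{name}: {max_w:.0f} W ceiling (+{max_w - tdp_w:.0f} W headroom)")
```

The 1070's slider only buys ~12W of extra headroom, versus ~36W on the 1080, which is why the custom 1070s' beefier VRMs go largely unused.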


----------



## TheMiracle

It's very weird that my card throttles depending on the game I'm playing. In Fallout 4 it stays at 2050-2063 all the time, but in The Witcher 3 it has some drops to 2000 and 2025.


----------



## ikjadoon

Quote:


> Originally Posted by *Vaesauce*
> 
> While true, I was only pointing out the fact that the XG also has the same Power Phase Layout.
> 
> I cannot comment on whether it's quality is good or bad until we have the ability to unlock the voltage and see what all cards are capable of. But I do agree with your statement.


OK, sure, right.
Quote:


> Originally Posted by *SuperZan*
> 
> Most definitely. That said, 10+2 high-quality phases is certainly desirable, all things being equal besides. ?


Yeah, the "all things being equal" is the part I'm currently unsure about. The FE cards kill with their 4+1 or 5+1 (I forgot what they're on now).

Quote:


> Originally Posted by *NCSUZoSo*
> 
> This is not true about NVIDIA limiting to 60C before downclocking. If you watch my YouTube video on the previous page you'll see I was holding a little over 2 GHz at 75C with no throttling going on before installing the H110.
> 
> Also even though my max load temp after 2 hours is 41C on the core, it won't OC any further (2.139 GHz), so it's not the temp. Make sure you don't have your power limit and temp linked, there is a checkbox that links them when you hit the drop down box.


I don't know; Anandtech seemed pretty clear. Maybe you are in a unique position to corroborate or disagree with their results. Run the card at 40C to 55C (maybe by limiting the fans on your radiator?). Any difference in maximum clock speed? Anandtech certainly saw one:


----------



## danjal

Quote:


> Originally Posted by *Antipathy*
> 
> Mine definitely drops 1 bin from 2038 to 2025 when it hits 60c. I've watched it happen repeatedly. It never drops any more than that, though, so presumably, it could still clock at 2025 at 75c. It typically hovers around 60-63c.
> 
> I do have power and temp linked, though I'm not sure that is applicable here. I'll give unlinking them a shot, though.


My temp and power limits are, and have been, unlinked since I installed the Zotac AMP Edition... I set my temp limit at 70c and my card runs at around 65-67c. It will run at 2075, then drop to 2050, then to 2038, and it holds 2038 indefinitely. It drops from 2050 to 2038 at exactly 60c, every time... Memory clock is at 8400, and I'm running a 5% undervolt (power limit set at 95%).

I use a custom fan profile in MSI Afterburner; I leave it at the default curve but ramp the fans to 80% at 60c.

I'm pretty sure it wouldn't throttle under water. Tempted to try the G10, but I don't want to screw up my warranty yet.


----------



## danjal

Quote:


> Originally Posted by *TheMiracle*
> 
> It's very weird that my card throttles depending on the game I'm playing. In Fallout 4 it stays at 2050-2063 all the time, but in The Witcher 3 it has some drops to 2000 and 2025.


I would guess it's because The Witcher is hitting the GPU harder and it's getting warmer.


----------



## TheMiracle

Quote:


> Originally Posted by *danjal*
> 
> I would guess it's because The Witcher is hitting the GPU harder and it's getting warmer.


Same temp on both, 57°C.


----------



## FlatOUT

https://forums.geforce.com/default/topic/951723/geforce-drivers/announcing-geforce-hotfix-driver-368-95/
What do you think about it? This hotfix didn't help me.


----------



## Fr3eWar

OK, I think I'll order the FE then.
My Gainward has an 8-phase design, but it only manages a +30MHz overclock at the 114% power limit (its base OC is one of the highest, 1670MHz at the 100% power limit), so I can only OC it to 1700MHz (around 2090 with boost). I think (and hope) that under water my FE will run the same clocks.


----------



## Mudfrog

Quote:


> Originally Posted by *rv8000*
> 
> RIP my luck
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So far I can't even sustain 2038 on the core either, feels like I hit the bottom of the barrel with this card


I'll check mine tonight; mine only jumps to 2004.x on the core, although I haven't tried OC'ing it. I have no need to right now.


----------



## danjal

Quote:


> Originally Posted by *FlatOUT*
> 
> https://forums.geforce.com/default/topic/951723/geforce-drivers/announcing-geforce-hotfix-driver-368-95/
> What do you think about it? This hotfix didn't help me.


I don't have the DPC issue with the Zotac 1070 AMP


----------



## Raikiri

Slight improvement:



2500k @ 4.5Ghz, 1070 @ 2151 core, 9654 mem


----------



## ratskrone

Anyone with a Zotac AMP here who can upload the BIOS, please?

Thanks


----------



## danjal

Zotac1070ampedition.zip 149k .zip file


----------



## danjal

is this any good?
http://www.3dmark.com/3dm/13715065


----------



## Dude970

Yes, that is a good score. OC that CPU some more and you can get 6K


----------



## jrcbandit

Quote:


> Originally Posted by *NCSUZoSo*
> 
> The reason most people are happy with the FE compared to upgraded models is because of what everyone is talking about, the power limit. Due to the power limit being restricted to 108% the gains from upgraded power circuitry on custom boards gives less gains as compared to the 1080 which has a 120% power limit.
> 
> Founders Edition cards have a 4+1 phase design, sounds very weak, but due to a die shrink to 16nm the required power is much less. However if the power limit can be raised the custom cards should OC MUCH better. I believe with the power limit raised I could hit 2.3 GHz easy on my 1070 Armor (6+2 phase design).
> 
> If you are looking at the EVGA 1070 SC model only then it doesn't really matter, the EVGA 1070 SC is basically a reference design and has the stock 4+1 design. So for you there is not really any difference between the two cards except the cooler which you won't be using. In this select case, get the FE if you want it now.
> 
> You can see here the difference in the EVGA models, on the FTW has an upgraded power phase design (10+2):
> http://www.evga.com/products/Product.aspx?pn=08G-P4-6173-KR


Upping the power limit doesn't seem to help much compared to locked 1070s. EVGA FTW 1070s can all do 120% on the second BIOS, yet overclocks have never reached 2.3GHz in any of the posts on the EVGA forums or elsewhere. Most people are still stuck around 2.075-2.15GHz even at 120%. At 120% and max voltage my card isn't even stable at 2.1GHz, though the memory can overclock by at least +650.

I dunno if the results would change much under water cooling, since the voltage is locked down pretty tight without a hard mod. FTW water blocks are coming in August, although there is no guarantee the 1080 FTW waterblock will work on the 1070. I might give it a try if anyone sees really good results, but I'm not that optimistic. My temperature is around 50-60C while gaming and I can't hit a stable 2.1GHz clock; all water cooling would likely do is knock that down to 40-50C.


----------



## NCSUZoSo

Well then it is time to do my 4th hardware mod haha.

BTW putting an AIO cooler on a MSI card doesn't void the warranty. MSI confirmed this.










This obviously also applies to the US.


----------



## Ranguvar

Quote:


> Originally Posted by *jrcbandit*
> 
> Upping the power limit doesn't seem to help much compared to locked 1070s. EVGA FTW 1070s can all do 120% on the second BIOS, yet overclocks have never reached 2.3GHz in any of the posts on the EVGA forums or elsewhere. Most people are still stuck around 2.075-2.15GHz even at 120%. At 120% and max voltage my card isn't even stable at 2.1GHz, though the memory can overclock by at least +650.
> 
> I dunno if the results would change much under water cooling, since the voltage is locked down pretty tight without a hard mod. FTW water blocks are coming in August, although there is no guarantee the 1080 FTW waterblock will work on the 1070. I might give it a try if anyone sees really good results, but I'm not that optimistic. My temperature is around 50-60C while gaming and I can't hit a stable 2.1GHz clock; all water cooling would likely do is knock that down to 40-50C.


My 1070 FTW will do 2278MHz @ 1.09V easily and maintains it fine, barely creeping into the 50C+ range at max fan.

The problem is that you need to use a custom voltage curve, and when you do, you lose performance instantly, clock-for-clock, vs basic offset overclocking.

With more voltage, who knows what is possible. Can't wait for BIOS editor.


----------



## ikjadoon

Quote:


> Originally Posted by *TheMiracle*
> 
> Yep, I have a 144Hz G-Sync monitor!


Quote:


> Originally Posted by *Mudfrog*
> 
> I'm at 1080p 60hz.


Right, so maybe different preferences then. As more cards are targeting 4K, we are getting more CPU bound than we think.


----------



## Antipathy

Quote:


> Originally Posted by *NCSUZoSo*
> 
> This is not true about NVIDIA limiting to 60C before downclocking. If you watch my YouTube video on the previous page you'll see I was holding a little over 2 GHz at 75C with no throttling going on before installing the H110.
> 
> Also even though my max load temp after 2 hours is 41C on the core, it won't OC any further (2.139 GHz), so it's not the temp. Make sure you don't have your power limit and temp linked, there is a checkbox that links them when you hit the drop down box.


Quote:


> Originally Posted by *Antipathy*
> 
> Mine definitely drops 1 bin from 2038 to 2025 when it hits 60c. I've watched it happen repeatedly. It never drops any more than that, though, so presumably, it could still clock at 2025 at 75c. It typically hovers around 60-63c.


Quote:


> Originally Posted by *ikjadoon*
> 
> I don't know. Anandtech seemed pretty clear. Maybe, you are in a unique position to corroborate or disagree with their results. Run the card at 40C to 55C (maybe limiting fans on your radiator?). Any difference in maximum clock speed? Anandtech certainly saw one:
> 
> 


Interestingly, and maybe predictably, temperature is only part of the equation. My card will drop a bin at 60c _at stock voltage_. After playing around with Afterburner, it looks like either increasing voltage _or_ reducing temperature will prevent it from downclocking. I guess maybe that's what the Anandtech picture is illustrating? Anyway, it looks like I may still be able to squeeze more out of it yet.


----------



## ikjadoon

Quote:


> Originally Posted by *Antipathy*
> 
> Interestingly, and maybe predictably, temperature is only part of the equation, apparently. My card will drop a bin at 60c _at stock voltage_. After playing around with Afterburner, it looks like either increasing voltage _or_ reducing temperature will prevent it from down clocking. I guess maybe that is what that anandtech picture is illustrating? Anyway, it looks like I may still be able to squeeze more out of it yet.


Err, right. Downclocking at 60C is expected. From 39C to 69C, the GPU boost clock is gradually reduced in 13MHz increments.

The GPU in Anandtech's test is also at stock voltage. I would think extra voltage only makes you hit the 39C to 69C range faster, unless the cooling is very powerful.
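A toy model of that behaviour (purely illustrative; the 13MHz bin size comes from the discussion above, but the roughly-3°C-per-bin spacing is my assumption, not a published NVIDIA number):

```python
def boosted_clock(max_boost_mhz, temp_c, low_c=39, high_c=69,
                  bin_mhz=13, degrees_per_bin=3.0):
    """Estimate the sustained boost clock at a given core temperature.

    Below low_c the card holds its top boost bin; between low_c and
    high_c it sheds one bin_mhz step every degrees_per_bin degrees C,
    and past high_c it holds the lowest bin of that range.
    """
    if temp_c <= low_c:
        return max_boost_mhz
    bins_dropped = int((min(temp_c, high_c) - low_c) // degrees_per_bin)
    return max_boost_mhz - bins_dropped * bin_mhz

print(boosted_clock(2063, 35))  # cool card: holds the full 2063 MHz
print(boosted_clock(2063, 60))  # warmed up: several 13 MHz bins lower
```

This matches the pattern people report here: a one-bin drop (2038 to 2025) right at a temperature threshold, rather than a hard cliff.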


----------



## GreedyMuffin

My 980Ti is folding at 1500MHz, 39°C at all times. So I guess my 1080 will run a bit cooler, since it draws 70 watts less.


----------



## danjal

I can't get voltage control in MSI Afterburner with my Zotac


----------



## Dude970

I broke 22K graphics score on FireStrike









http://www.3dmark.com/fs/9561012


----------



## LiquidHaus

The struggle is real at work hahaha


----------



## Mad Pistol

Just a reference side by side of GTX 1070 FE SLI, both stock and with a "reasonable" overclock.

Stock (169.4)


Core+150, Mem+600 (183.2)


Decent gains, but nothing like previous generation nvidia cards.


----------



## benjamen50

Quote:


> Originally Posted by *danjal*
> 
> I can't get voltage control in MSI Afterburner with my Zotac


Go into the MSI Afterburner settings and make sure voltage control is enabled and set to extended MSI voltage control. I'm not sure exactly what it's called, as I don't use MSI Afterburner anymore.


----------



## EvilWiffles

Ello,

Just got myself the MSI GTX 1070 Sea Hawk X.
Only played with it for a short time now. At stock, I reach 1980MHz.
OC'd, I can get 2100MHz, but I'm not sure it's stable yet. Anyway, the max temp I've reached so far is 42c; I want to try Witcher 3 because benchmarks rarely push temps as hard as it does.

Wish I could disable GPU boost, really dislike it.


----------



## madmeatballs

Quote:


> Originally Posted by *EvilWiffles*
> 
> Ello,
> 
> Just got myself the MSI GTX 1070 Sea Hawk X.
> Only played with it for a short time now. At stock, I reach 1980MHz.
> OC'd, I can get 2100MHz, but I'm not sure it's stable yet. Anyway, the max temp I've reached so far is 42c; I want to try Witcher 3 because benchmarks rarely push temps as hard as it does.
> 
> Wish I could disable GPU boost, really dislike it.


Same; we just have to wait patiently for a Pascal BIOS tweaker.


----------



## QxY

Pretty happy with the Zotac GTX 1070 AMP Edition (non-Extreme), especially it being my first Zotac card and after hearing some negative things about the company in the past.

The card looks great (though the yellow stripes are subjective) and seems well built. The colored LEDs are cool. Zotac's FireStorm software is alright.

The GPU boosts up to 1980MHz on its own; the lowest throttle I've seen was 1920MHz. The cooling is very good: a Furmark stress test maxes at 73C, and games are usually a couple of degrees lower, except for Fallout 4, which for some odd reason goes all the way up to 76C in some areas at Ultra (e.g. The Institute), probably due to poorly optimized features like god rays. The fans are practically silent; the highest I've seen is 79% RPM. At idle they don't even run, and idle temps are in the late 30s. Mind you, it's mid-summer and there's no AC where I live.

So yeah, I definitely recommend this card. I still don't understand, though, why it requires dual 8-pin connectors when my old MSI 780 Ti Gaming needed only 6-pin + 8-pin and had around 100W more TDP.


----------



## madmeatballs

Quote:


> Originally Posted by *QxY*
> 
> Pretty happy with the Zotac GTX 1070 AMP Edition (non-Extreme), especially being my first Zotac card and after hearing some negative things about the company in the past.
> 
> The card looks great (though Yellow stripes are subjective) and seems well build. Colored LEDs are cool. Zotac's FireStorm software is alright.
> 
> GPU boosts up to 1980MHz on it's own, lowest throttle I've seen was at 1920MHz. The cooling is very good, Furmark stress maxes at 73C...games are usually a couple degrees lower...except for Fallout 4 which for some odd reason goes all the way up to 76C in some levels at Ultra (eg. The Institute), probably due to the not very optimized features like Godrays. Fans are literally silent, highest RPM I've seen at 79%. On idle they don't even run...idles temps are in late 30's. Mind you it's mid-summer and no AC where I live.
> 
> So yeah, I definitely recommended this card. I still don't understand though why it requires dual 8-pin connectors when my old MSI 780 Ti Gaming needed 6-pin + 8-pin and had like 100W more TDP.


Don't forget the 5 year warranty after you register it with Zotac.


----------



## ClashOfClans

I wonder if the fan on the MSI Gaming X 1070 is like the fan on my MSI GTX 970, where it stays off entirely at idle. I like this, as it allows my computer to be dead silent when not gaming. Then again, the fans are also nearly silent when they are spinning and running a game.


----------



## brettjv

Quote:


> Originally Posted by *jrcbandit*
> 
> Upping the power limit doesn't seem to help much compared to locked 1070s. EVGA FTW 1070s can all do 120% on the second bios, yet the overclocking has never reached 2.3 Ghz from any of the posts at EVGA forums or elsewhere. Most people are still stuck at around 2.075-2.15 Ghz even at 120%. At 120% and max voltage my card isn't even stable at 2.1 ghz, the memory can overclock by +650 at least.
> 
> I dunno if the results would change much on water cooling since the voltage is locked down pretty tight without doing a hard mod. FTW water blocks are coming in August, although there is no guarantee that the 1080 FTW waterblock will work on the 1070. I might give it a try if any sees really good results, but I am not that optimistic. My temperature is around 50-60 C while gaming and I can't hit a stable 2.1 Ghz clock, all water cooling would likely do is knock that temperature down some to 40-50 C.


Well said. I would add that there has always been a limit to how much increasing the available voltage (and the related concept of 'TDP', i.e. the power limit) will increase overclock potential.

The traces on a 16nm-class process are obviously EXTREMELY close to one another, and as they've gotten closer with each process shrink over the years, we reach the point of diminishing returns on how much 'more voltage' can actually 'help' our OCs much sooner.

IOW, it's not like 'moar volts = moar OCs', indefinitely.

I'm talking here about simple physics-based limits, as opposed to artificial restrictions in drivers and BIOSes and whatnot. Additional voltage can HELP with the frequencies we can reach, but only to a point.

Beyond that point, you get 'sparks' happening between the traces, and our chips start frying. It's not at all unreasonable to suspect that Pascal is tied down the way it is due to pure physics... and based on what I've seen, I STRONGLY suspect that the extra power plugs, extra power phases, AND higher power limits on Pascal GPUs are all virtually superfluous. We're not really being limited by those things (much); we're being limited by simple physics.

Although things change a bit with LN2 cooling ...


----------



## SlvrDragon50

Just jumped in on the ASUS Strix 1070 deal from Jet. Hopefully it's the OC version, though it seems like that doesn't really make much difference since it's just the BIOS. Gotta start looking for a water block now. Hopefully the other companies will begin releasing Strix blocks.


----------



## brettjv

Quote:


> Originally Posted by *Mad Pistol*
> 
> Just a reference side by side of GTX 1070 FE SLI, both stock and with a "reasonable" overclock.
> 
> Decent gains, but nothing like previous generation nvidia cards.


Or maybe ... nV just underclocked the older 'stock' cards more dramatically









Perhaps you could try calculating the OC scaling in a precise manner for us: do an exactly 5% OC on core and memory on a Maxwell, then again on a Pascal (making sure to set up a purely GPU-limited situation), and see if the returns are the same on a percentage basis. I've personally been very keen to see properly composed OC scaling tests done on both architectures ...

IMHO the raw numbers aren't nearly as important as the % differences, because all the raw numbers tell us is how close to its limits each architecture was 'pushed' initially.

IOW, just because the 980Ti can be OC'd 30% and Pascal 10% doesn't mean Maxwell is 'better'; it just means business decisions made at the time favored under-clocking Maxwell more than Pascal, relative to its actual 'max perf'.

I wanna know if they SCALE equally. And that involves % calculations, not raw numbers.

I'm sure you know the formulas to use
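That percentage comparison is only a couple of lines of arithmetic. A minimal sketch; the FPS and clock figures below are hypothetical placeholders, not measured results from either architecture:

```python
def oc_scaling(fps_stock, fps_oc, clock_stock_mhz, clock_oc_mhz):
    """Perf gained per unit of core clock gained: 1.0 means the benchmark
    scaled perfectly with the overclock, below 1.0 means diminishing
    returns (e.g. a memory or CPU bottleneck creeping in)."""
    perf_gain = fps_oc / fps_stock - 1.0
    clock_gain = clock_oc_mhz / clock_stock_mhz - 1.0
    return perf_gain / clock_gain

# Hypothetical example: a ~7.9% core OC yielding a ~8.1% FPS gain
print(round(oc_scaling(169.4, 183.2, 1900.0, 2050.0), 2))  # → 1.03
```

Run the same GPU-limited benchmark stock and overclocked on each card, then compare the two ratios; the raw FPS numbers drop out entirely.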


----------



## benjamen50

What I'm interested in is an overclocked 1070 @ 2150MHz core with a +600 memory offset vs. a GTX 1080 FE. Would be nice to see how close it comes in terms of performance.


----------



## Jimbags

Quote:


> Originally Posted by *benjamen50*
> 
> What I'm interested in is a overclocked 1070 @ 2150mhz + 600+ offset mem vs GTX 1080 FE. Would be nice to see how close it is to it in terms of performance.


If the voltage wasn't so locked down, it would be even closer.


----------



## syl1979

Some feedback on my Galax 1070 Gamer. Default boost under load seems to be 1987MHz, with a maximum of 2013MHz. Voltage is 1.06V, reduced to 1.05V under load.

Tried overclocking. Voltage locks at 1.075V. The card only accepts +80 on core and +425 on memory, getting 2050MHz under load. That gives around a +5% performance increase.

Good temps (70C max) and noise.


----------



## supermi

Quote:


> Originally Posted by *benjamen50*
> 
> What I'm interested in is a overclocked 1070 @ 2150mhz + 600+ offset mem vs GTX 1080 FE. Would be nice to see how close it is to it in terms of performance.


I have such a 1070, not the 1080 FE

Edit: I meant 1080 'Fools Edition' (a fun jab at the company, not the awesome customers here)!


----------



## benjamen50

Quote:


> Originally Posted by *supermi*
> 
> I have such a 1070 , not the 1080 fe
> 
> Edit: I meant 1080 Fools Edition ( fun jab at company not awesome customers here)!


I have one too, more or less, but I just re-tested my overclock for stability and found it's only stable at a lower overclock now: +75 MHz core and +500 MHz memory instead of +600 MHz on the memory.


----------



## chrcoluk

Quote:


> Originally Posted by *QxY*
> 
> Pretty happy with the Zotac GTX 1070 AMP Edition (non-Extreme), especially being my first Zotac card and after hearing some negative things about the company in the past.
> 
> The card looks great (though Yellow stripes are subjective) and seems well build. Colored LEDs are cool. Zotac's FireStorm software is alright.
> 
> GPU boosts up to 1980MHz on it's own, lowest throttle I've seen was at 1920MHz. The cooling is very good, Furmark stress maxes at 73C...games are usually a couple degrees lower...except for Fallout 4 which for some odd reason goes all the way up to 76C in some levels at Ultra (eg. The Institute), probably due to the not very optimized features like Godrays. Fans are literally silent, highest RPM I've seen at 79%. On idle they don't even run...idles temps are in late 30's. Mind you it's mid-summer and no AC where I live.
> 
> So yeah, I definitely recommended this card. I still don't understand though why it requires dual 8-pin connectors when my old MSI 780 Ti Gaming needed 6-pin + 8-pin and had like 100W more TDP.


See my earlier post: Zotac has a much higher TDP limit in their BIOS, so the second power connector is not just for show; it keeps you from hitting the TDP limit. The Zotac cards will really shine once a Pascal BIOS editor is finished, as they will be able to use higher voltages without hitting the TDP limit.

My Palit, when I set +100% core voltage, cannot hold 1.09V as it hits the TDP limit; it can only hold 1.09V when the card is under light load.


----------



## danjal

Quote:


> Originally Posted by *benjamen50*
> 
> Go into MSI afterburner settings and make sure voltage settings are enabled and that it's set to extended MSI voltage control, not really sure what it was called exactly as I don't use MSI afterburner anymore.


I had tried that before and it still wouldn't work... I had to download the beta; that works.


----------



## danjal

Unlocked the voltage control and put it at 50%.. I was able to go higher on my memory. That's the highest I've been able to run my memory so far.

The core clock doesn't seem to want to go over +100, like it's a wall.

zotac 1070 amp edition


----------



## FuzzDad

Been playing around... I haven't put the card on water yet (that's next month's "present"), but these cards are excellent:


----------



## FlatOUT

Quote:


> Originally Posted by *danjal*
> 
> unlocked the voltage control and put it at 50% .. I was able to go higher on my memory.. Thats the highest I've been able to run my memory so far..
> 
> core clock seems to not want to go over 100, like its the wall..
> 
> zotac 1070 amp edition


I'm getting a +100 score at the same OC (but mem +300 more), and I'm on an i7-2600K (MSI Gaming X).


----------



## mstrmind5

Between the EVGA FTW and the MSI Gaming X versions, which is quieter, i.e. runs the slower fan speed?

Thanks.


----------



## ikjadoon

Quote:


> Originally Posted by *jrcbandit*
> 
> Upping the power limit doesn't seem to help much compared to locked 1070s. EVGA FTW 1070s can all do 120% on the second bios, yet the overclocking has never reached 2.3 Ghz from any of the posts at EVGA forums or elsewhere. Most people are still stuck at around 2.075-2.15 Ghz even at 120%. At 120% and max voltage my card isn't even stable at 2.1 ghz, the memory can overclock by +650 at least.
> 
> *I dunno if the results would change much on water cooling* since the voltage is locked down pretty tight without doing a hard mod. FTW water blocks are coming in August, although there is no guarantee that the 1080 FTW waterblock will work on the 1070. I might give it a try if any sees really good results, but I am not that optimistic. My temperature is around 50-60 C while gaming and I can't hit a stable 2.1 Ghz clock, all water cooling would likely do is knock that temperature down some to 40-50 C.


Quote:


> Originally Posted by *brettjv*
> 
> Well said. I would add that there has always been a limit to how much increasing the available voltage (and the related concept of 'TDP' i.e. power limit) will increase overclock potential.
> 
> The traces on a 12-14nm die process are obviously EXTREMELY close to one another, and as they've gotten closer due to process shrinks (as has happened over the years), we're more quickly reaching a point of diminishing returns in terms of how much 'more voltage' can actually 'help' with our OC's.


I haven't seen a lot of independent testing of water vs. air directly, but AnandTech reached a 76MHz higher boost under water (39C load compared to 77C). The cards, ostensibly, "throttle" early; 83C isn't the only temperature throttle anymore. I feel this is related to 16nm FinFET, but I know nothing about semiconductors, haha.

With Pascal / 16nm FinFET / GPU Boost 3.0, NVIDIA has included a lot of "behind the scenes" clock regulation that our traditional overclocking software doesn't pick up.


----------



## Dreamliner

Just got my OC Strix. What do y'all recommend for testing coil whine, OC stability, and benchmarks?


----------



## FlatOUT

Quote:


> Originally Posted by *mstrmind5*
> 
> Between the EVGA FTW and the MSI Gaming X versions, which is the quieter - meaning slower fan speed?
> 
> Thanks.


I'm pretty sure you won't hear any noise from either card. Just get the cheaper one.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *Dreamliner*
> 
> Just got my OC Strix. What do ya'll mention to test for whine, OC and benchmarks?


I have SLI OC Strix... (got them installed 12 days ago, no problems.)

I needed to adjust fan speeds, especially for the top card. After testing for a week with all defaults, I came up with the following last night...

maybe you can try it too:



Thanks to that, my top card went down from a 69C max to only a *58C max*. My max fan speed went from 51% to 66%.

GL









PS for anyone needs Open Hardware Monitor DL, http://openhardwaremonitor.org/downloads/


----------



## Dude970

Play your favorite game with a good OC; you will know immediately if you have coil whine.


----------



## Dreamliner

My Strix seems to be okay. I played Arkham Knight for 20 minutes or so at 4K, 2x AA, and all settings maxed (GameWorks off).

I used the Strix utility and put it in OC mode, which spins the fans at 38%. The GPU was showing 2025MHz, memory 8014MHz, and a 61C temp. No whine.

Seems fine?


----------



## NCSUZoSo

Easiest way to be sure is to throw on a stress test like Kombuster or similar; any whine will start in the first few seconds.


----------



## NCSUZoSo

Quote:


> Originally Posted by *benjamen50*
> 
> What I'm interested in is a overclocked 1070 @ 2150mhz + 600+ offset mem vs GTX 1080 FE. Would be nice to see how close it is to it in terms of performance.


I already posted these results lol:

http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/1570#post_25390962



That is a stock GTX 1080 vs. my MSI GTX 1070 Armor @ 2.15 GHz | 9.624 Gbps in the new 3DMark Time Spy demo (both CPUs are the same, except his OC is 100 MHz higher)


----------



## NCSUZoSo

Quote:


> Originally Posted by *syl1979*
> 
> Some feedback on my Galax 1070 gamer. Defaut boost under load seems 1987mhz, with maximum 2013 mhz. Voltage at 1.06v reduced to 1.05 under load.
> 
> Tried overclock. Voltage lock at 1.075v. The card only accept +80 on core, +425 on memory. Gets 2050 under load. Gives around +5% performance increase.
> 
> Good temps (70max) and noise


Quote:


> Originally Posted by *chrcoluk*
> 
> see my earlier post, zotac have a much higher TDP limit on their bios, so the 2nd power connector is not just for show, it stops you hitting TDP limits, and the zotac cards will really shine when a pascal bios editor is finished as they will be able to use higher voltages without hitting TDP limit.
> 
> My palit when I set +100% core voltage cannot hold 1.09v as it hits TDP limit, it can only hold 1.09v when the card is under light load.


Guys, I just had some very interesting results with my GTX 1070 Armor + H110/G10. When I turned my temp limit down to 60C (since I never break 41C), my card was stable at a higher clock speed vs. having it set to 80C. Also, my voltage hit 1.093V and held there stably during the end of each part of the 3DMark Time Spy bench:



As you can sort of see in the picture, it runs at 1.075V during the test and in the last 20-30 seconds it bumps to 1.093V; this never happens with the temp limit at 80C.

Before, with it set to 80C, I never went above 1.075V, but it held 1.075V the entire run. It's not as low as others are reporting either way.


----------



## jrizzz

Just got my MSI Gaming 1070 yesterday:




Settled on 2126 core / 2228 Mem:
http://www.3dmark.com/fs/9572589


----------



## Mudfrog

What settings should Heaven be run on? I ran it on Extreme. This is where my card tops out:

2151 core and 8619 memory. The core dropped to 2126, then held steady.

Ran it again, but at 1080 and received:

FPS: 102.4
Score: 2579
Min FPS: 32.8
Max FPS: 212.2


----------



## FlatOUT

Quote:


> Originally Posted by *Mudfrog*
> 
> What setting should Heaven be ran on? I ran it on Extreme. This is where my card tops out.
> 
> 2151 Core and 8619 Memory, The core dropped to 2126 then held steady.
> 
> Ran it again, but at 1080 and received:
> 
> FPS: 102.4
> Score: 2579
> Min FPS: 32.8
> Max FPS: 212.2


Extreme at 1080p


----------



## benjamen50

Quote:


> Originally Posted by *FuzzDad*
> 
> Been playing around...I haven't put the card on water yet (that's next months "present"...but these cards are excellent:


Might wanna update that GPU-Z, because when I used it on my EVGA GTX 1070 ACX 3.0 FTW, the computer shut off immediately and turned back on. This was right after I installed the new GPU.


----------



## NCSUZoSo

Hell yeah guys! I just managed to grab the #1 spot in 3DMark Fire Strike Ultra for my CPU/GPU combo:

http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/fs/R/1419/1090/8157?minScore=0&cpuName=Intel Core i7-3770K Processor&gpuName=NVIDIA GeForce GTX 1070

(In the link you may have to tell it 1 GPU instead of "Any" number)



Those aren't my fastest clock speeds on the VRAM; I backed the memory off to keep the core stable at 2.139 GHz (reported 2126, but it was peaking at 2139). I noticed something strange about the voltage applied, though. By playing with the requested boost and moving between the actual clock steppings, you can make the same core speed stable if you put it directly on a stepping (i.e. +177). I also noticed that when it crashes, it's because the voltage doesn't jump high enough at the needed time: the same run may be stable once and then not the second time. I tried linking and unlinking the voltage/temp limits and using different temp settings. I can't really say what it's doing yet, but I did see runs crash with the temp limit at 82C vs. set to 60C (with my load never passing 45C).

BTW, I can't do what someone asked and lower my fan speeds to check the 60C wall. With my fans at their lowest settings and 3DMark Fire Strike Ultra running back to back, it won't go above 54C lol. I did, however, notice clock speed changes far more often when I was in the 50s, so that makes me believe what is being reported after all.

You can compare to me directly using this: http://www.3dmark.com/fs/9576878


----------



## mickr777

Just installed my 2x Asus ROG STRIX-GTX1070-8G-GAMING cards, got them for AU$699 each (still need to get a better SLI bridge).

Upgraded from a Single Sapphire Radeon HD 7970 Dual-X 3GB


----------



## ikjadoon

Quote:


> Originally Posted by *NCSUZoSo*
> 
> Hell yeah guys! I just managed to grab the #1 spot in 3DMark Fire Strike Ultra for my CPU/GPU combo:
> 
> BTW I can't do what someone asked to lower my fan speeds and check the 60C wall. With my fans at their lowest settings and running 3DMark Fire Strike Ultra back to back, it won't go above 54C lol. I did however notice a change in clock speeds far more often when I was in the 50's so that makes me believe what is being reported after all.
> 
> You can compare to me directly using this: http://www.3dmark.com/3dm/13748081?


Nice.

Oh, hahaha, it's all good. Err, sorry: 60C isn't the actual wall. It starts at 39C and goes to ~69C.



A fair number of people have mentioned it in passing with air coolers, but because most air coolers have that zero-dB technology, it's hard to get a good reading. It does make sense that you'd see fluctuations at 50C, though; from 51C to 69C it has the most drops.


----------



## NCSUZoSo

I will do more testing and let you know what I find, but I definitely saw fluctuations at around 50-52C.

BTW, I linked to the wrong run that got the top spot for my combo: http://www.3dmark.com/fs/9576878


----------



## supermi

Quote:


> Originally Posted by *NCSUZoSo*
> 
> Guys I just had some very interesting results w/ my GTX 1070 Armor + H110 / G10. When I turned my temp limit down to 60C (since I never break 41C) my card was stable at a higher clock speed vs. having it set to 80C. Also my voltage hit 1.093V and held there stably during the end of each part of the 3DMark Time Spy bench:
> 
> 
> 
> As you can kind of see in the picture, it runs at 1.075V during the test and in the last 20-30 seconds it bumps to 1.093V, this never happens with the temp limit at 80C.
> 
> Before with the it set to 80C I never went above 1.075V, but it holds at 1.075V the entire run, it's not as low as others are reporting either way.


I have different card(s) but I am water cooled and will give that a try tonight!!!


----------



## Forceman

Quote:


> Originally Posted by *supermi*
> 
> I have different card(s) but I am water cooled and will give that a try tonight!!!


I tried it on mine, and it didn't seem to affect the voltage. Maybe it's brand specific though.


----------



## NCSUZoSo

I have played around more with the temp limit and, after hours of testing with 3DMark Fire Strike Ultra, I think it's random and has no real effect. I have to say, I am very impressed with the VRAM on this card though.

Stock speed: 8.0 Gbps

My card: 9.754 Gbps

GDDR5X (1080): 10 Gbps

So far the highest overall OC I have been able to bench successfully was 2.139 GHz on the core and a little over 4800 MHz on the VRAM (9.6 Gbps effective).

That is one hell of an OC. I have since brought my core OC down to 2 GHz, but this is still proof of the headroom we should have with more voltage/power limit.
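For anyone confused by the Gbps figures: they map directly onto the clocks the overclocking tools show. A rough sketch of the conversion, assuming the tool reports the GDDR5 double-data-rate clock (which matches the numbers in this thread: 4004 MHz stock on a 1070, a little over 4800 MHz here):

```python
def effective_gbps(reported_clock_mhz, offset_mhz=0):
    """GDDR5 effective per-pin data rate, assuming the tool reports the
    double-data-rate clock (4004 MHz stock on a GTX 1070); the effective
    rate is simply twice that, in Gbps."""
    return (reported_clock_mhz + offset_mhz) * 2 / 1000.0

print(effective_gbps(4004))       # stock: 8.008 Gbps
print(effective_gbps(4004, 800))  # +800 offset: 9.608 Gbps, about the OC above
```

So a +800 memory offset is roughly a 20% bandwidth bump over stock, which is why these memory OCs matter so much on a 256-bit card.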


----------



## danjal

Quote:


> Originally Posted by *mickr777*
> 
> Just installed my 2x Asus ROG STRIX-GTX1070-8G-GAMING, got them for AU$699 each (still need to get a better SLi bridge)
> 
> Upgraded from a Single Sapphire Radeon HD 7970 Dual-X 3GB


what kind of motherboard and processor are you running?


----------



## mickr777

Quote:


> Originally Posted by *danjal*
> 
> what kind of motherboard and processor are you running?


ASRock Fatal1ty Z97 Professional and Intel i7 4790k @ 4.7ghz


----------



## danjal

Quote:


> Originally Posted by *mickr777*
> 
> ASRock Fatal1ty Z97 Professional and Intel i7 4790k @ 4.7ghz


cool


----------



## Dreamliner

Quote:


> Originally Posted by *NCSUZoSo*
> 
> Easiest way to be sure is to throw up a stress test like Kombuster or similar, it will start in the first few seconds.


Just ran Kombuster at 1080p and it sat at 350 fps; the temp was 59C with 38% fan speed at a 2063MHz GPU clock. With my ear right up to it, no whine.

Seems to be good, right? I'm pretty happy.

I just need to figure out the best OC settings to try, and a way to test them.


----------



## owikhan

So finally hands on

http://www.newegg.com/Product/Product.aspx?Item=N82E16814500401


----------



## ITAngel

Quote:


> Originally Posted by *owikhan*
> 
> 
> So finally hands on
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814500401


Nice, congrats! If you overclock yours, let me know what you get. I'm just starting to overclock mine, but I've never really done overclocking with FireStorm, and MSI Afterburner doesn't let me unlock the voltage for some odd reason. So I'm working on that too.


----------



## madmeatballs

Quote:


> Originally Posted by *ITAngel*
> 
> Nice congrats! If you over clock yours let me know what you got. I am just starting to overclock mine but I never really done overclocking with Firestorm. My MSI Afterburner doesn't allow me to unlock the voltage for some odd reason. So I am working on that too.


Did you install Afterburner v4.3.0 Beta 4? That unlocks voltage control.


----------



## danjal

Quote:


> Originally Posted by *ITAngel*
> 
> Nice congrats! If you over clock yours let me know what you got. I am just starting to overclock mine but I never really done overclocking with Firestorm. My MSI Afterburner doesn't allow me to unlock the voltage for some odd reason. So I am working on that too.


Yeah, you have to use the beta. It works; I did it today. What you'll find is that there's a difference in the settings between Zotac's FireStorm and MSI Afterburner.


----------



## ITAngel

Quote:


> Originally Posted by *madmeatballs*
> 
> did you install afterburner v. 4.3.0 beta 4? That unlocks the voltage lock.


Oh, I had not, but I went and got it and yes, it's unlocked now. Thanks!







+1


----------



## ITAngel

Quote:


> Originally Posted by *danjal*
> 
> yea, you have to use the beta. it works done it today... What you will find is there is a difference in the settings between Zotac's Firestorm and Msi Afterburner.


Yeah, I wasn't aware you needed the beta, but it's working now. It'll be easier to overclock this card little by little.


----------



## SupernovaBE

Got my 2 MSI Gaming X 8Gs in the house








1 on an EKWB block and 1 on air; the second WB is on the way ;-)

card 1 / 1164mhz / 4650/9300
card 2 / need to test more on water, 1100 atm / 4650/9300

Waiting for custom bios









Benchmark scores for now (not at the high clock speeds):
Firestrike 24610
http://www.3dmark.com/fs/9560372
Time Spy 10935 (No. 1 for 1070 SLI atm). Cool








http://www.3dmark.com/spy/175573

( old rig 3x780ti classified )
http://www.3dmark.com/fs/8010130

I need to find a good tool to stress the cards; clock speeds that are stable in Furmark still crash 3DMark Fire Strike.. and if you have the demo version, that's annoying.


----------



## pewpewlazer

I installed a pair of EVGA Hybrid Coolers (the 980 Ti models) on my 1070s last night. I had planned on doing some before/after testing and comparisons, but like with my HB SLI bridge, I did precisely zero "before" testing of any value. However:


- Firestrike Ultra score jumped up ~200 points consistently, from ~8500 to ~8700. Core clocks are still jumping around like they have ADHD in 3DMark, but I guess they're higher, or more consistently higher, now.
- Max temp in the low 40C range while gaming is NICE. Definitely exceeded my expectations for an AIO water cooler.
- Adding two fans made my system appreciably louder (duh) at idle, but under load it's night and day vs. the FE coolers. There's basically no difference in system noise between sitting on the desktop and full 3D load now.
- Core clock held a flat line of 2101MHz playing Metro 2033 at 4K. Almost positive it did not do that before.
- And just to throw my 2 cents in on the temp limit thing... un-linking the power/temp limit in Afterburner and dragging the temp limit slider from 92C down to 60C made zero difference for me in 3DMark.


----------



## Dreamliner

I think MSI Afterburner is probably better than the Strix software. What are the best OC settings to try on my OC Strix?

I typically have 2 modes: a silent low-clock mode, and an OC mode with elevated fan speed for gaming. I use the Afterburner hotkeys to switch between them.

Suggestions for OC settings?


----------



## wickedout

Quote:


> Originally Posted by *owikhan*
> 
> 
> So finally hands on
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814500401


Nice, I got mine on Friday afternoon. What are your settings like? Let me know how it works out for you. Thanks.


----------



## iRUSH

Quote:


> Originally Posted by *SupernovaBE*
> 
> Got my 2 MSI gaming x 8G in the house
> 
> 
> 
> 
> 
> 
> 
> 
> 1 on EKWB and 1 air, second WB is on the way ;-)
> 
> card 1 / 1164mhz / 4650/9300
> card 2 / need to test more on water, 1100 atm / 4650/9300
> 
> Waiting for custom bios
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Benchmark scores for now ( not on the hi clock speeds )
> Firestrike 24610
> http://www.3dmark.com/fs/9560372
> Time Spy 10935
> http://www.3dmark.com/spy/175573
> 
> ( old rig 3x780ti classified )
> http://www.3dmark.com/fs/8010130
> 
> I need to find a good tool to stress the cards, clock speeds stable in furmark still crashes 3dmark firestrike.. and if you have the demo vers that's annoying


Nice 1st post! Show off lol


----------



## danjal

Phanteks Enthoo Pro M
Gigabyte GA-Z170X-Gaming G5
[email protected]
16GB G.Skill DDR4-3000
Zotac 1070 AMP Edition with red LED lighting to go with the Gigabyte GA-Z170X-Gaming 5..
Cryorig H5 Universal with the white cover taken off..
One Noctua and two Corsair 140mm fans.

That yellow striping on the Zotac backplate doesn't look too bad, actually. It would look better in bright silver though..


----------



## SupernovaBE

Quote:


> Originally Posted by *iRUSH*
> 
> Nice 1st post! Show off lol


Haha, yes indeed ;-)
I've been doing this hobby for the last 10 years







never made a post here..
First time with an SLI full-cover waterblock.
Looks good, but for the price I could have had 2 1080s.. so from a price/performance standpoint, not that good








The modding and tweaking is more fun!
I will make a build topic on this forum.


----------



## pewpewlazer

I don't foresee mine winning any beauty contests, but it works.


----------



## iRUSH

Quote:


> Originally Posted by *pewpewlazer*
> 
> 
> 
> I don't foresee mine winning any beauty contests, but it works.


I'd have a heart attack if my build looked like that lol. At least it's no slouch!


----------



## SlvrDragon50

Omg lol. Why do you have such long tubing runs?


----------



## owikhan

Quote:


> Originally Posted by *wickedout*
> 
> Nice I got mine on Friday afternoon. What's your setting like? Let me know and how it works out for you? Thanks.


I also got my card on Friday morning







Smiles... at the moment it works like a charm without any issues.. waiting for my 144Hz LCD..
and then a little bit of OC to see the results..

For now I checked Tomb Raider on very high settings and got a full score of 131.9,
and 2440 in Heaven Benchmark 4.0 on Extreme..

Let's see what results come after some OC


----------



## owikhan

Quote:


> Originally Posted by *ITAngel*
> 
> Nice congrats! If you over clock yours let me know what you got. I am just starting to overclock mine but I never really done overclocking with Firestorm. My MSI Afterburner doesn't allow me to unlock the voltage for some odd reason. So I am working on that too.


Thanks! For sure, we should compare notes and try for a max overclock.


----------



## victorrz

My results with the Zotac 1070 AMP Extreme:



http://www.3dmark.com/3dm/13588142



It's running a 2126MHz core clock, which stabilizes at 2001MHz, with the memory at 9360MHz.


----------



## Swolern

Quote:


> Originally Posted by *mickr777*
> 
> Just installed my 2x Asus ROG STRIX-GTX1070-8G-GAMING, got them for AU$699 each (still need to get a better SLi bridge)
> 
> Upgraded from a Single Sapphire Radeon HD 7970 Dual-X 3GB


Nice setup. I'm lovin' my Strix 1070s. Been extremely happy with SLI on a 3440x1440 100Hz monitor. What monitor are you running?

Also, I used 2 flex bridges for SLI, which I've found give the same performance as the HB bridge. The trick is to use 2 single SLI bridges that are exactly identical.


----------



## Mad Pistol

Because we cannot check ASIC quality through GPU-Z at this time, I thought it would be fun to test each of my cards by itself to see what it was able to achieve.

What I found was that they both overclock almost identically. However, my GTX 1070 FE from the Nvidia store scores a little bit higher than my GTX 1070 FE from MSI.

MSI 1070 @ +170/+700 (102.6)


Nvidia 1070 @ +170/+700 (103.4)


Pretty sure it's an anomaly, but it's certainly something to consider.
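
For a sense of scale, here's a quick sanity check on that gap (a throwaway sketch; assuming those are the Heaven average-FPS numbers):

```python
# Relative difference between the two cards at identical +170/+700 offsets.
msi_fps, nvidia_fps = 102.6, 103.4

diff_pct = (nvidia_fps - msi_fps) / msi_fps * 100
print(f"{diff_pct:.2f}%")  # 0.78% - small enough to be run-to-run variance
```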


----------



## showaccord97

Been playing around with my 1070 and finally got it where I'm satisfied as far as benchmarking with synthetic testing. Broke 6000 with my 3570K, which puts me at second overall with the same setup and 191 worldwide. Beating some 1080s (I know they're not OC'd) is still solid! Actual gaming has me at 120MHz less on the memory vs. synthetic, but still happy overall! I want to upgrade, but is there really any benefit other than bench scores?


----------



## benjamen50

840MHz+ on the memory clock?! It must be Samsung VRAM on the GPU; you can check this via GPU-Z if you didn't know. You might want to make sure the memory clock isn't causing artifacting during GPU stress tests.
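
For anyone wondering where numbers like "9.7 Gbps" elsewhere in this thread come from: a rough conversion from an Afterburner memory offset to an effective data rate. This assumes 8 Gbps (8008MHz effective) stock GDDR5 and that the offset applies in the DDR clock domain, so it counts double at the effective rate; worth double-checking against GPU-Z for your own card.

```python
STOCK_EFFECTIVE_MHZ = 8008  # GTX 1070 stock GDDR5, 8 Gbps effective

def effective_gbps(offset_mhz):
    """Effective data rate after an Afterburner memory offset (assumed DDR-domain)."""
    return (STOCK_EFFECTIVE_MHZ + 2 * offset_mhz) / 1000

print(effective_gbps(840))  # 9.688 - lines up with the ~9.7 Gbps figures reported here
```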


----------



## showaccord97

Quote:


> Originally Posted by *benjamen50*
> 
> 840 MHz+ on the memory clock?!? It must be Samsung VRAM on the GPU. You can check this via GPU-Z if you didn't know. You might wanna make sure the memory clock isn't causing artifacting during GPU stress tests.


No artifacts during Heaven, Valley, or Time Spy testing. Running Overwatch with the above gives me the dreaded green or yellow screens. If I back off the memory to 720 I have no issues in game. Also, yes, I have Samsung memory per GPU-Z.


----------



## mickr777

Quote:


> Originally Posted by *Swolern*
> 
> Nice setup. Im lovin my STRIX 1070s. Been extremely happy with SLI at 3440x1440 100Hz monitor. What monitor are you running?
> 
> Also i used 2 flex bridges for SLI which i got that has the same performance as the HB bridge. The trick is to use 2 single SLI bridges that are exactly identical.


I'm using a Samsung 28in UHD monitor (LU28E590DS). I did try the 2 bridges; I have 2 hard SLI ones and I just get green flickering if I use them together. By themselves they're fine.


----------



## DStealth

Finally broke 110FPS barrier










Here's a Valley run at the same clocks, 2126/9700: 115.2FPS


----------



## stoker

Quote:


> Originally Posted by *mickr777*
> 
> I did try the 2 bridges, I have 2 hard sli ones and i just get green flickering if i use them together by them self's there fine.


You need to use 2 soft bridges for it to work


----------



## NCSUZoSo

Quote:


> Originally Posted by *SupernovaBE*
> 
> Got my 2 MSI gaming x 8G in the house
> 
> 
> 
> 
> 
> 
> 
> 
> 1 on EKWB and 1 air, second WB is on the way ;-)
> 
> card 1 / 1164mhz / 4650/9300
> card 2 / need to test more on water, 1100 atm / 4650/9300
> 
> Waiting for custom bios
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Benchmark scores for now ( not on the hi clock speeds )
> Firestrike 24610
> http://www.3dmark.com/fs/9560372
> Time Spy 10935
> http://www.3dmark.com/spy/175573
> 
> ( old rig 3x780ti classified )
> http://www.3dmark.com/fs/8010130
> 
> I need to find a good tool to stress the cards, clock speeds stable in furmark still crashes 3dmark firestrike.. and if you have the demo vers that's annoying
> 
> 
> 
> 
> 
> 
> 
> 


I expect you will get much better clocks after more testing; with a full-coverage waterblock it wouldn't surprise me if the VRAM hit 10 Gbps. I'm already hitting over 9.75 Gbps (Samsung GDDR5) using an AIO cooler + Kraken G10. Also, I believe the MSI Gaming X has a secondary BIOS option that allows more voltage, or was that only the Gaming Z?

Note, I just noticed the GDDR5 is made by Samsung, but in all the PR from NVIDIA on the 1070 they were using Micron's fastest GDDR5 rated at 8 Gbps, so I wonder if that is why I am seeing such crazy stable VRAM speeds (9.7 Gbps+). MSI chose to use Samsung, I guess?

Just put up a new best in 3DMark Fire Strike Ultra: 4976
http://www.3dmark.com/3dm/13777796?

This might help ensure I hold the top spot for my CPU/GPU combo.


----------



## mickr777

Quote:


> Originally Posted by *stoker*
> 
> You need to use 2 soft bridges for it to work


I just tried two flexi bridges; they do the same thing.

edit: seems to affect 3840x2160 resolution the most


----------



## stoker

Quote:


> Originally Posted by *mickr777*
> 
> I just tried two flexi bridges they do the same thing
> 
> edit: seems to effect 3840x2160 resolution the most


Oh yeah at 4K and above, you really need the HB bridge


----------



## Swolern

Quote:


> Originally Posted by *Mad Pistol*
> 
> Because we cannot check the ASIC quality through GPU-Z at this time, I thought it would be fun to test each of my cards by themselves to see what they were able to achieve.
> 
> What I found was that they both overclock almost identically. However, my GTX 1070 FE from the Nvidia store scores a little bit higher than my GTX 1070 FE from MSI.
> 
> MSI 1070 @ +170/+700 (102.6)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> Nvidia 1070 @ +170/+700 (103.4)
> 
> 
> 
> 
> Pretty sure it's an anomaly, but it's certainly something to consider.


Interesting. What are the actual core clocks running on both? They may differ due to each manufacturer's BIOS.
Quote:


> Originally Posted by *mickr777*
> 
> I just tried two flexi bridges they do the same thing
> 
> edit: seems to effect 3840x2160 resolution the most


Weird, I have zero flickering at 3440x1440 with 2 flex bridges, and neither does my friend running 1070 SLI with 2 flex bridges @ 4K resolution. I would try re-seating the GPUs in the mobo and installing fresh NV drivers.


----------



## DStealth

Quote:


> Originally Posted by *NCSUZoSo*
> 
> Just put up a new best in 3DMark Fire Strike Ultra: 4976
> http://www.3dmark.com/3dm/13777796?
> 
> This might help ensure I hold the top spot for my CPU/GPU combo.


Over 5k GPU score in FS Ultra is quite an achievement for a 1070... I had that while testing my card at stock CPU/RAM speeds... I now cannot reproduce it thanks to TDP throttling... my memory wasn't even OCed to the limit...


----------



## mickr777

Quote:


> Originally Posted by *Swolern*
> 
> Weird, i have zero flickering at 3440x1440 with 2 flex bridges, and neither does my friend with 1070SLI with 2 flex bridges @ 4K resolution. I would try to re-seat the GPUs in the mobo and install a fresh NV drivers.


Yeah, done that, even changed which one is the main card (not a major problem; with one bridge I am getting 55-60fps at 4K in Witcher 3 with most settings maxed).
Just need to wait till I can get an HB SLI bridge in Australia.

Here are some bench results with 1x ROG STRIX-GTX1070-8G-GAMING (+100MHz OC)


and this is with 2x ROG STRIX-GTX1070-8G-GAMING (+100MHz OC) with one flex bridge


Haven't had time to fiddle with OC much yet.


----------



## SupernovaBE

Quote:


> Originally Posted by *NCSUZoSo*
> 
> I expect you will get much better clocks after more testing, with a full coverage waterblock it wouldn't surprise me if the VRAM hit 10 Gbps. I'm already hitting over 9.75 Gbps (Samsung GDDR5) using an AIO cooler + Kraken G10. Also I believe the MSI Gaming X has a secondary BIOS option that allows more voltage, or was that only the Gaming Z?
> 
> Note I just noticed that, the GDDR5 is made by Samsung, but in all the PR from NVIDIA on the 1070 they were using Micron's fastest GDDR5 rated at 8 Gbps, so I wonder if maybe this is why I am seeing such crazy stable VRAM speeds (9.7 Gbps+)? MSI choose to use Samsung I guess?
> 
> Just put up a new best in 3DMark Fire Strike Ultra: 4976
> http://www.3dmark.com/3dm/13777796?
> 
> This might help ensure I hold the top spot for my CPU/GPU combo.


I will try that tonight








I never did more than +650.
I will let you know.

I think the Gaming X and Z are no different (no dual BIOS).
Same PCB/power phases etc.
Higher base clocks and GDDR5 speed for the Z.

I use 2 flexible SLI bridges (faster than 1).


----------



## mickr777

Quote:


> Originally Posted by *NCSUZoSo*
> 
> I expect you will get much better clocks after more testing, with a full coverage waterblock it wouldn't surprise me if the VRAM hit 10 Gbps. I'm already hitting over 9.75 Gbps (Samsung GDDR5) using an AIO cooler + Kraken G10. Also I believe the MSI Gaming X has a secondary BIOS option that allows more voltage, or was that only the Gaming Z?
> 
> Note I just noticed that, the GDDR5 is made by Samsung, but in all the PR from NVIDIA on the 1070 they were using Micron's fastest GDDR5 rated at 8 Gbps, so I wonder if maybe this is why I am seeing such crazy stable VRAM speeds (9.7 Gbps+)? MSI choose to use Samsung I guess?
> 
> Just put up a new best in 3DMark Fire Strike Ultra: 4976
> http://www.3dmark.com/3dm/13777796?
> 
> This might help ensure I hold the top spot for my CPU/GPU combo.


Asus is using Samsung GDDR5 on their ROG Strix 1070 too.


----------



## NCSUZoSo

Quote:


> Originally Posted by *DStealth*
> 
> Over 5k GPU in FS Ultra is quite an achievement for 1070...I had such while testing my card with stock CPU/Rams speeds...i now cannot reproduce it thanks to TDP throttling...my memory wasn't even OCed to the limit...


Well, what is most aggravating to me about the whole OCing situation with my MSI 1070 Armor + H110 is that I rarely ever see my load temp break 42C and I still can't push the clocks higher. I was reading an in-depth tutorial on doing the hardware mods to the FE 1070/1080, and at the end of it all they said they didn't recommend it for anyone not running LN2 or similar, telling anyone with watercooling that it isn't worth the trouble because the gains would be small due to the temp rising quickly with voltage:

https://xdevs.com/guide/pascal_oc/
Quote:


> This also answers the question of whether overvolting can help OC on air or water cooling. It does not, due to thermals, which only get worse. Higher temperatures reduce stability and performance; the GPU literally overheats and cannot run at high frequency anymore, even though the temperature is below the specified maximum of +94°C. Think of it as a temperature-to-frequency dependency, all the way down from +94°C to -196°C, with a slope of around 100MHz every 50°C. So, just like in the 980/980Ti/TitanX case, *over-voltage on aircooling/watercooling is not recommended, as it gains little if any performance improvement.*
> 
> Don't get this message wrong, as the GTX 1080/1070 are still great cards for daily gaming/content creation and the VR experience. They are fast, not power hungry, and moderately cool. The only catch is that overclocking them is not as fun and rewarding as it was on previous generations, even considering all the tricks involved to get Maxwell clocks high.
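
Taking the quoted slope at face value (~100MHz per 50°C, i.e. about 2MHz per °C), here's a back-of-the-envelope sketch of what better cooling alone buys you (purely illustrative; the slope is from the xdevs quote, the example temperatures are just numbers mentioned in this thread):

```python
# Estimated sustained-clock gain from a lower load temperature, using the
# ~100 MHz per 50 C slope quoted above (about 2 MHz per degree C).
SLOPE_MHZ_PER_C = 100 / 50

def clock_gain_mhz(temp_before_c, temp_after_c):
    return (temp_before_c - temp_after_c) * SLOPE_MHZ_PER_C

# e.g. dropping from 73 C on air to 42 C on an AIO:
print(clock_gain_mhz(73, 42))  # 62.0 MHz - cooling alone won't get past the ~2.1 GHz wall
```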


----------



## Jimbags

Quote:


> Originally Posted by *showaccord97*
> 
> Been playing around with my 1070 and finally got it where i'm satisfied as far as bench-marking with synthetic testing. Broke 6000 with my 3570K with puts me at second overall with the same setup and 191 worldwide.. beating some 1080's(i know they're not OC'd) is still solid!. Actual gaming has me at 120Mhz less on the memory vs synthetic but still happy overall!. I want to upgrade but is there really any benefit other than bech scores...???


What is your 3570K clocked at? I have the same CPU. Going to have to get some better benches now I've seen what you've achieved.


----------



## stoker

My max stable OC, peaking @ 2124MHz


----------



## showaccord97

Quote:


> Originally Posted by *Jimbags*
> 
> What is your 3570K clocked at? I have the same CPU. Going to have to get some better benches now I've seen what you've achieved.


4653MHz, and hey, glad I could motivate you to get some more benchmarks, lol!


----------



## mickr777

A bit of playing around got the core to 2025MHz and the RAM to 8996MHz, stable (only one SLI bridge).


----------



## kaudiyo

Finally I can post the Fire Strike benchmarks of my SLI 1070 MSI Gaming X:

THE BEAUTIES



WITH SINGLE RIBBON CABLE AND OC



WITH HB BRIDGE AND OC



Is 38k+ graphics fine for this OC SLI??? Nearly a 3.5k difference between ribbon and HB.

*The ribbon run has a higher physics score because my 4690K was running at 4.5GHz instead of the 4.3GHz of the HB run. This week I will upgrade to a 6700K and give this SLI the CPU power it deserves.









----------



## Swolern

That Nvidia bridge just kills the look of those cards! You need this one, man!!


----------



## NCSUZoSo

Guys, wanted to share this pic of my new rig with all the upgrades now in place; really digging how this shot came out. I made it focus on the GPU, which slightly faded out the mobo components. I think it's cool.



Also, I bumped my CPU OC from 4.2 GHz to 4.5 GHz and have now taken over the top spots for my CPU/GPU combo.









I also changed from 16GB of DDR3-2133 G.Skill Sniper to 32GB of DDR3-2400 Corsair Vengeance Pro, which helped my score as well.


----------



## kaudiyo

Quote:


> Originally Posted by *Swolern*
> 
> That Nvidia bridge just kills the look of those cards! U need this one man!!


I think that one doesn't exist in an 80mm length. My new mobo this week will space the cards +1 slot, so I need an 80mm bridge and will sell the current 60mm.


----------



## SupernovaBE

Quote:


> Originally Posted by *NCSUZoSo*
> 
> I expect you will get much better clocks after more testing, with a full coverage waterblock it wouldn't surprise me if the VRAM hit 10 Gbps. I'm already hitting over 9.75 Gbps (Samsung GDDR5) using an AIO cooler + Kraken G10. Also I believe the MSI Gaming X has a secondary BIOS option that allows more voltage, or was that only the Gaming Z?
> 
> Note I just noticed that, the GDDR5 is made by Samsung, but in all the PR from NVIDIA on the 1070 they were using Micron's fastest GDDR5 rated at 8 Gbps, so I wonder if maybe this is why I am seeing such crazy stable VRAM speeds (9.7 Gbps+)? MSI choose to use Samsung I guess?
> 
> Just put up a new best in 3DMark Fire Strike Ultra: 4976
> http://www.3dmark.com/3dm/13777796?
> 
> This might help ensure I hold the top spot for my CPU/GPU combo.


Well, on water (for now):
1152/9650
Time Spy (1 card): 6683
http://www.3dmark.com/3dm/13786969?

The card on air is not this fast and that's a bummer.









No dual BIOS on this card (or the Z).
The Z and X have the same PCB/power phases as far as I know.
I'm gonna wait for the custom BIOS ;-)

The card on air will be tested to the max tomorrow.


----------



## TUFinside

I was getting ready to buy a 1070, but since I heard about the latency (DPC) issues, now I'm not so sure... Did you notice stuttering or sound issues in game?


----------



## Mudfrog

Quote:


> Originally Posted by *TUFinside*
> 
> I was getting ready to buy a 1070, but since i heard the latency issues (DPC), now i'm not so sure...Did you notice stuttering or sound issues in game ?


I occasionally get some stutter, I've only noticed it in Fallout 4 though. GTA 5, Rise of the Tomb Raider and Rust are all smooth. I haven't played much else since getting the card.


----------



## NCSUZoSo

Quote:


> Originally Posted by *TUFinside*
> 
> I was getting ready to buy a 1070, but since i heard the latency issues (DPC), now i'm not so sure...Did you notice stuttering or sound issues in game ?


For the few people who have had this problem there is a hotfix available:


https://www.reddit.com/r/4tydge/dpc_latency_hotfix_driver_for_gtx_1060_1070_1080/
Quote:


> Since this is a hotfix driver and not WHQL, I will not be making any stickied post with the usual template. However, I will sticky this post temporarily for information.
> 
> PS: The DPC Latency issue does not impact everyone. Only a subset of users do. You don't have to update if you are not experiencing the issue.
> 
> PPS: Nvidia's Manuel regarding Netflix stuttering issue - "We have root caused this issue and will provide a fix in a future driver."


Personally, I have noticed zero stuttering in gaming/Blu-ray playback/Netflix W10 app/Premiere Pro/Solidworks/etc. Everything has run great on this card for me except the voltage/power limit that caps all of us right under 2.2 GHz no matter the cooling solution (except LN2 + hardware mods).


----------



## TheMiracle

Quote:


> Originally Posted by *TUFinside*
> 
> I was getting ready to buy a 1070, but since i heard the latency issues (DPC), now i'm not so sure...Did you notice stuttering or sound issues in game ?


I have zero issues with my 1070 G1; it runs smoother than my old 970, which had a little stutter in some games.


----------



## ogow89

So I noticed that I can't check the "Apply overclocking at startup" box in MSI Afterburner. Is there a solution?

Gainward Phoenix 1070.


----------



## NCSUZoSo

Quote:


> Originally Posted by *ogow89*
> 
> So i noticed than i can't check the ''Apply overclocking at startup'' on MSI afterburner, is there a solution?
> 
> Gainward Phoenix 1070.


There is a bug in a few of the skins for the latest beta version (4.3 Beta 4) that cause this, but if you tell Afterburner to start with Windows it should be applying the last clock speed you used when it loads up at boot.


----------



## ogow89

Quote:


> Originally Posted by *NCSUZoSo*
> 
> There is a bug in a few of the skins for the latest beta version (4.3 Beta 4) that cause this, but if you tell Afterburner to start with Windows it should be applying the last clock speed you used when it loads up at boot.


Did so, but it won't apply them sadly. Which skin works bug free?


----------



## NCSUZoSo

Quote:


> Originally Posted by *ogow89*
> 
> Did so, but it won't apply them sadly. Which skin works bug free?


Any of the ones that begin with the word Default should work fine


----------



## criminal

Just picked one of these up: https://www.amazon.com/gp/product/B01IIGVL3W/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1

Cheapest I have ever seen for a full-cover water block, so I wanted to share.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *TUFinside*
> 
> I was getting ready to buy a 1070, but since i heard the latency issues (DPC), now i'm not so sure...Did you notice stuttering or sound issues in game ?


Maybe I'm wrong, but the sound issues may be getting confused with FPS going way too high (a very good thing) and causing a whining sound to be generated. (The first time I ever experienced it was in the good old 8800 GTX days... memories...) From reading somewhere early on in this thread, I believe it could be cured by capping FPS. RivaTuner or something similar has an option to cap FPS.


----------



## Yetyhunter

Quote:


> Originally Posted by *TheMiracle*
> 
> I have zero issue with my 1070 G1, running smoother than my old 970 which had a little stutter in some games.


What temperatures do you get under constant 99% load?
What is the highest gaming-stable clock you can achieve?
I am asking this to compare your results with mine. My card's max temperature is 73°C with a stabilized clock of 2012-2025MHz and 2200MHz for memory.
And a rather curious question: how many heat pipes does your card have?


----------



## danjal

Quote:


> Originally Posted by *Mudfrog*
> 
> I occasionally get some stutter, I've only noticed it in Fallout 4 though. GTA 5, Rise of the Tomb Raider and Rust are all smooth. I haven't played much else since getting the card.


I don't get any with my Zotac AMP edition... at least not in BF4, SW Battlefront, and ArcheAge.


----------



## TheMiracle

Quote:


> Originally Posted by *Yetyhunter*
> 
> What temperatures do you get under constant 99% load?
> What is the highest gaming stable clock you can achieve?
> I am asking this to compare your results with mine. My card max temperature is 73*C with a stabilized clock of 2012-2025 mhz and 2200 mhz for memory.
> And a rather curios question, how many heat pipes does your card have ?


56°C is the max I get with 80% fan speed.
For games I set it to +100 on core and +500 on memory. It runs stable at 2050-2063MHz, but has some spikes down to 2038 and up to 2076.
I don't know how many heat pipes my card has; it's a Gigabyte G1 Gaming.


----------



## ITAngel

Quote:


> Originally Posted by *ogow89*
> 
> So i noticed than i can't check the ''Apply overclocking at startup'' on MSI afterburner, is there a solution?
> 
> Gainward Phoenix 1070.


I have not tried it yet, but have you gotten the latest MSI Afterburner beta version? My settings were all locked until a forum member told me about it, and once I upgraded to the new MSI Afterburner most of my settings started working. I would start there if you have not gotten the latest yet.


----------



## Yetyhunter

Quote:


> Originally Posted by *TheMiracle*
> 
> 56ºC is the max I get with 80% fan speed.
> For games I set it +100 on core and +500 memory. It runs stable in 2050-2063mhz, but have some spikes to 2038 and 2076.
> I dont know how many heat pipes my card has, its a Gigabyte G1 Gaming.


Isn't 80% too loud for you? Anyway, very nice temps.


----------



## ITAngel

What I meant to ask you guys: currently I have the Dark Pro 3 installed as my CPU cooler, and I am wondering if it is dumping heat into the case. Maybe not, but I am thinking I could keep a lot of that heat out by re-installing my dual EK 240mm slim rads and water cooling the CPU alone. My thought is that the fans would push in more cool air just for the CPU, since the CPU's heat would exit out the top through the rads. What do you guys think?


----------



## Dude970

I improved on Heaven


----------



## ikjadoon

Quote:


> Originally Posted by *criminal*
> 
> Just picked one of these up: https://www.amazon.com/gp/product/B01IIGVL3W/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1
> 
> Cheapest I have ever seen for a a full cover water block, so wanted to share.


Shucks, it's out of stock. How much was it for? Amazon, for whatever reason, removes the price after it goes "currently unavailable."


----------



## iRUSH

Quote:


> Originally Posted by *ikjadoon*
> 
> Shucks, it's out of stock. How much was it for? Amazon, for whatever reason, removes the price after it goes "currently unavailable."


$85 iirc


----------



## DStealth

Quote:


> Originally Posted by *Dude970*
> 
> I improved on Heaven


What BIOS, cooling and clocks are you running here? I've got 110.7fps but not a single chance of catching your 22k GPU score in FS....

A new driver for the Titan X launched, still from MS servers only... it also contains the DPC latency fix, and some performance optimizations have been observed:
Quote:


> Enjoy!
> 
> X64
> http://download.windowsupdate.com/c/msdownload/update/driver/drvs/2016/08/20906845_4592ba4f9b66ab642395777522fc8bb101b5d55f.cab
> 
> X86
> http://download.windowsupdate.com/c/msdownload/update/driver/drvs/2016/08/20906846_acf04d65f766bc24cf91390d1aaa5e11daa7261a.cab


----------



## duganator

Just joined the club boys. Testing the card now, it boosted to 1974 out of the box


----------



## Blackfyre

Quote:


> Originally Posted by *duganator*
> 
> Just joined the club boys. Testing the card now, it boosted to 1974 out of the box


Welcome to the club. What card did you get? Have you tried overclocking? What card did you come from? Details man


----------



## NCSUZoSo

Quote:


> Originally Posted by *ITAngel*
> 
> What I meant to say and ask you guys is that. Currently i have the Dark Pro 3 install for my CPU cooler and I am wondering if that is putting heat into the case. Maybe not but I am thinking if I keep a lot of that heat out by re-installing my EK dual 240mm slim rads and water cooling the CPU alone. My thoughts here is that the fast will push in more cooler air just for the CPU since the heat of the CPU will be coming out on top from the rads. What you guys think?


Yes, if you aren't going to water cool the GPU, the next best option is to water cool the CPU so all of its heat is removed away from the GPU. Any air heat sink is going to exhaust some amount of hot air toward the GPU no matter how good the airflow, due to turbulence.


----------



## owikhan

Can anyone check 1070 ASIC quality??


----------



## benjamen50

Quote:


> Originally Posted by *owikhan*
> 
> can any one checked 1070 ASIC QUALITY??


Nobody here can yet; there is no software I know of at the moment that can read ASIC quality from any Pascal GPU.


----------



## duganator

Lol, so much I left out. I got the Zotac 1070 AMP open-box from Micro Center for $402 after tax. I'm coming from a GTX 970 at 1531MHz. I'm running a 1440p 144Hz monitor, so the 970 was sadness. My card boosted out of the box to 1974 and I have it at 2084.
Quote:


> Originally Posted by *Blackfyre*
> 
> Welcome to the club. What card did you get? Have you tried overclocking? What card did you come from? Details man


----------



## Mudfrog

Quote:


> Originally Posted by *TheMiracle*
> 
> 56ºC is the max I get with 80% fan speed.
> For games I set it +100 on core and +500 memory. It runs stable in 2050-2063mhz, but have some spikes to 2038 and 2076.
> I dont know how many heat pipes my card has, its a Gigabyte G1 Gaming.


On my card the core clocks to 2151 and the memory to 8606. For games I run the core between 2050 and 2100 and the memory at 8400; forcing the fans to 65%, it stays around 50-55C. I'm loving this card.


----------



## Dude970

Quote:


> Originally Posted by *DStealth*
> 
> What BIOS, cooling and clocks you're running here...I've got 110.7fps but not a single chance to catch your 22k GPU score in FS....


Stock BIOS. Fans set to 90%, +156 core and +800 mem.


----------



## criminal

Quote:


> Originally Posted by *ikjadoon*
> 
> Shucks, it's out of stock. How much was it for? Amazon, for whatever reason, removes the price after it goes "currently unavailable."


$85


----------



## FlatOUT

Quote:


> Originally Posted by *Dude970*
> 
> I improved on Heaven


How?! I only get 100 fps on a 2600K (4.5GHz) with 2038 core / 9200 mem.


----------



## Dude970

Quote:


> Originally Posted by *FlatOUT*
> 
> How?! I get only 100 fps on 2600k(4,5ghz) with 2038/9200 mem


My 3570K was at 4.8 on that run. I also slid the voltage and power limit sliders to max in Afterburner.


----------



## Mudfrog

Quote:


> Originally Posted by *FlatOUT*
> 
> How?! I get only 100 fps on 2600k(4,5ghz) with 2038/9200 mem


I get 102.8 with my 2500K @ 4.5 and the card at 2151 / 8606.


----------



## LiquidHaus

I got this email this morning and it got me quite excited. Good news for those with the amp extreme like me!



Not sure when production will finish or when they will actually be available, but either way it's awesome that a block is going to be made.


----------



## GunnzAkimbo

Asus 1070 Dual Edition is the lowest priced card in AUS.

$579

http://www.auspcmarket.com.au/asus-nvidia-geforce-dual-gtx1070-8g/

Horrible-looking cooler, but strip it off or custom paint it.


----------



## ikjadoon

Quote:


> Originally Posted by *iRUSH*
> 
> $85 iirc


Quote:


> Originally Posted by *criminal*
> 
> $85


Thanks!

Not bad at all.


----------



## TheDeadCry

Can confirm (ASIC quality can't be read ATM). Maybe for the same reason that all 1070s have pretty much the same clock thresholds?


----------



## Prozillah

G'day lads -

Got my G1 Gaming a week or so ago. Love the card; absolutely awesome increase in virtually all aspects over my twin Gigabyte Windforce 290s.

Running on a 1440P 144hz screen.

Stable OC @ 2113mhz & 8800mhz

HOWEVER, as soon as the card idles down, or I do a Windows restart/boot, it artifacts and either hardlocks or BSODs. It appears there isn't enough voltage being fed to it with that mem OC applied from its downclocked state. When the power is up and cranking, I can edge the OC out of it safely and run long BF4 sessions no worries.

Bios issue? Faulty card? Got me stumped.

Happens with both Afterburner & Gigabyte Xtreme software; however, I can at least boot with the Xtreme software with around a 300MHz mem OC applied.


----------



## SupernovaBE

Quote:


> Originally Posted by *Prozillah*
> 
> G'day lads -
> 
> Got my G1 Gaming week or so ago. Love the card - absolutely awesome increase in virtually all aspects over my twin gigabyte windforce 290's.
> 
> Running on a 1440P 144hz screen.
> 
> Stable OC @ 2113mhz & 8800mhz
> 
> HOWEVER - as soon as the card idles down or I do a windows restart / boot up it artifacts and either hardlocks or BSODS. It appears there isn't enough voltage being feed to it with that mem OC applied from its downclocked position. When the power is up and cranking I can edge it the OC out of it safely and run long BF4 sessions no worries.
> 
> Bios issue? Faulty card? Got me stumped.
> 
> Happens on both Afterburner & Gigabyte Xtreme Software however I can at least boot with the XTREME software around 300mhz Mem OC applied.


EVGA Precision X OC (unlocked voltage for the 1070/1080)
- Dynamically set independent voltage/frequency points for ultimate control
http://www.evga.com/precisionxoc/

This is what you need i think ;-)


----------



## Ranguvar

Quote:


> Originally Posted by *SupernovaBE*
> 
> EVGA precision X OC ( unlcoked voltage for the 1070/1080 )
> - Dynamically set independent voltage/frequency points for ultimate control
> http://www.evga.com/precisionxoc/
> 
> This is what you need i think ;-)


Yes, but also note that when I use a manual voltage curve, I lose FPS at the same frequencies; test this.


----------



## Prozillah

Tried the Precision software; still does it, unfortunately. As soon as it requires the smallest amount of graphics processing (i.e. opening the calculator on the desktop), it artifacts and either BSODs with VIDEO_SCHEDULER_INTERNAL_ERROR or hardlocks and restarts.

Might try a format and clean Windows 10 install. I only recently swapped in these cards from the 290s and was really hoping to avoid formatting...


----------



## jlhawn

Quote:


> Originally Posted by *Prozillah*
> 
> Tried the precision software - still does it unfortunately. As soon as it requires the smallest amount of graphics processing (i.e. opening the calculator to desktop) it artifacts and either BSODS with VIDEO_SCHEDULLAR_INTERNAL_ERROR or hardlocks and restarts
> 
> - Might try a format and clean windows 10 install - I have recently just changed them out from the 290's and was really hoping to avoid formatting...


You would be better off RMAing the graphics card; it should not be doing this.


----------



## Prozillah

Quote:


> Originally Posted by *jlhawn*
> 
> You would be better off RMAing the graphics card; it should not be doing this.


Geezus, really?? That's a damn shame as it OCs well, and there was another guy on here with the same issue. On what grounds can I RMA it when it still works perfectly in its stock config?

Wouldn't it be fixed with a BIOS update or something?


----------



## Prozillah

Also, in saying that: on Gigabyte's website there is a new BIOS (F2_BETA), but it states that cards can only be flashed with the same VBIOS. Mine is F10...

What are the risks of trying to flash to the F2_BETA BIOS to see if that helps the issue?


----------



## DStealth

Flash it, no risk at all, but as far as I know it only fixes fan issues. I have answered you already on the guru3d forum, not gonna repeat myself... but in short: no need to RMA the card, IMO.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *Prozillah*
> 
> G'day lads -
> 
> Got my G1 Gaming week or so ago. Love the card - absolutely awesome increase in virtually all aspects over my twin gigabyte windforce 290's.
> 
> Running on a 1440P 144hz screen.
> 
> Stable OC @ 2113mhz & 8800mhz
> 
> HOWEVER - as soon as the card idles down or I do a windows restart / boot up it artifacts and either hardlocks or BSODS. It appears there isn't enough voltage being feed to it with that mem OC applied from its downclocked position. When the power is up and cranking I can edge it the OC out of it safely and run long BF4 sessions no worries.
> 
> Bios issue? Faulty card? Got me stumped.
> 
> Happens on both Afterburner & Gigabyte Xtreme Software however I can at least boot with the XTREME software around 300mhz Mem OC applied.


I had 2x 760's and before that 2x 560 Ti's, with no BSODs whatsoever. But with my new 2x 1070's, which I got two weeks ago, I had some BSODs during the first ten days, and successfully got rid of them.

A. Understand that the Windows/Minidump folder is where the BSOD info is saved and that the .dmp files there can prove a lot and are well worth reading. But you can skip that if you want.

B. Reset any OCing, then uninstall all video utilities, uninstall hardware monitoring utilities, and finally uninstall the video drivers. (After uninstalling, shut down and use only your primary monitor; if you have more than one monitor, unplug all the others and only plug them back in after Step C below.)

C. Do a clean install of the video drivers and install only a utility to monitor temperatures; skip installing any video utilities. Don't adjust any fan speeds and don't OC. Simply monitor the temperatures of the video cards, their fan speeds, and PC hardware temps.

D. Run a benchmark like Heaven or Valley and, if they seem fine, play a few games. Test restarting, a full shutdown, and logging off and on a few times.

E. If no hardlocks or BSODs occur, install the newest MSI Afterburner beta, but don't change anything, and repeat D (^^).

F. Adjust fan speeds if you want and run the PC a few days to be sure no BSODs or hardlocks occur with all defaults before finally OCing.

GL

My problems have not returned since doing the above; cards and whole PC are running perfectly!

If problems don't go away, and the .dmp files point to the video card drivers as the cause, then start with a fresh OS.
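Step A (reading the Minidump folder) can be sped up a little with a script. Here's a minimal sketch, assuming the default C:\Windows\Minidump location, that lists the newest .dmp files so you know which ones to open in a debugger like WinDbg:

```python
# List recent BSOD minidumps (assumes the default C:\Windows\Minidump location).
import glob
import os
import time

def recent_minidumps(folder=r"C:\Windows\Minidump", limit=5):
    """Return the newest .dmp files as (name, timestamp) pairs, newest first."""
    dumps = glob.glob(os.path.join(folder, "*.dmp"))
    dumps.sort(key=os.path.getmtime, reverse=True)
    return [(os.path.basename(d), time.ctime(os.path.getmtime(d)))
            for d in dumps[:limit]]

if __name__ == "__main__":
    for name, when in recent_minidumps():
        print(f"{when}  {name}")
```

The newest dump is almost always the crash you just had, so that's the one worth reading first.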


----------



## mickr777

After a few hours of OC testing my 2x Asus ROG STRIX-GTX1070-8G-GAMING, I got 2012 core / 9000 memory very stable, and after a 10 min stress test, temps were 56°C and 58°C with fans sitting at about 65%.


----------



## Dreamliner

Quote:


> Originally Posted by *mickr777*
> 
> After a few hours of oc testing my 2x Asus ROG STRIX-GTX1070-8G-GAMING I got 2012 core/9000 memory was very stable and after 10 mins stress test 56c and 58c temps with fans sitting about 65%


What app and settings are you using? I've got the OC edition and I'm not certain how to OC it properly.


----------



## Prozillah

Quote:


> Originally Posted by *Prozillah*
> 
> G'day lads -
> 
> Got my G1 Gaming week or so ago. Love the card - absolutely awesome increase in virtually all aspects over my twin gigabyte windforce 290's.
> 
> Running on a 1440P 144hz screen.
> 
> Stable OC @ 2113mhz & 8800mhz
> 
> HOWEVER - as soon as the card idles down or I do a Windows restart / boot up it artifacts and either hardlocks or BSODs. It appears there isn't enough voltage being fed to it with that mem OC applied from its downclocked position. When the power is up and cranking I can safely edge the OC out of it and run long BF4 sessions, no worries.
> 
> Bios issue? Faulty card? Got me stumped.
> 
> Happens on both Afterburner & Gigabyte Xtreme Software however I can at least boot with the XTREME software around 300mhz Mem OC applied.


Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> i had a 2x 760's and before that 2x 560ti's and no BSODs watsoever. But with my new 2x 1070's, i got two weeks ago, i had some BSODs during the 1st ten days, but successfully got rid of them.
> 
> A. understand that the Windows/Minidump Folder is where the BSOD info is saved and that the .dmp files there can help prove a lot and are well worth reading. But u can skip that if u want.
> 
> B. Reset any OCing, then uninstall all Video Utilities, uninstall Hardware Monitoring Utilities and finally uninstall the Video Drivers. (and after uninstall, then shut down and use only ur Primary Monitor... meaning, if u have more than one monitor unplug all other one(s) and only after "Step C" below, plug them back in.)
> 
> C. do a clean install of the Video drivers and install only something (utility) to monitor temperatures but skip installing any Video Utilities. and don't adjust any Fan speeds and don't OC. simply monitor temperatures of Vid cards and their fan speeds and PC hardware temps.
> 
> D. Run a benchmark like Heaven or Valley and if they seem fine, play a few games. Test restarting, full Shut down, test Logging-Off and Logging-On PC a few times.
> 
> E. If no Hardlocks or BSODs occur, then install the newest MSI Afterburner Beta, but don't change anything and repeat D (^^).
> 
> F. adjust fan speeds if u want and run PC a few days to be sure no BSODs or Hardlocks occur with all defaults before finally OCing.
> 
> GL
> 
> My problems have not returned since doing the above.
> 
> 
> 
> 
> 
> 
> 
> cards and whole PC are running perfectly!
> 
> if problems don't go away; and the .DMP files point to just the Video card drivers, as the cause, then start with a fresh OS.


Cheers Bee Dee - luckily, through the majority of the process you described above, I already know the card runs brilliantly in stock config and also with the core OC only. It's only when the mem OC is applied that the BSODs or hardlocks happen, instantaneously.

I've RMA'd it now anyway... luckily I know the guys at the shop and they are sending me a new one first, with a courier ticket to send the other back... so I should be able to nail it down somewhat once I receive it.


----------



## jlhawn

Quote:


> Originally Posted by *Prozillah*
> 
> Cheers Bee Dee - luckily through the majority of your process you described above I already know the card runs brilliantly in Stock config & also with the Core OC only. its only when the Mem OC is applied do the BSODS or hardlocks happening instantaneously.
> 
> Ive RMA'd it now anway...luckily I know the guys the shop and they are sending me a new one first with a courier ticket to send the other back....so should be able to nail it down somewhat once I receive it.


My reason for saying to RMA is that you should never have to jump through hoops with driver installs and reinstalling Windows to get a GPU to run correctly.
I do believe in a clean driver install, but you should never have to reinstall your operating system to get the GPU running right.
I have never had a BSOD from a failed GPU overclock; all I get with a bad overclock is a TDR of the GPU driver, and a Ctrl+Alt+Del brings me back to my desktop so I can start over.
I have also never had artifacts or odd spots and colors on my screen from a bad GPU overclock.
I really believe your GPU has a VRAM issue and I'm sure the replacement will run correctly. But this is just my opinion, from my experience over the years of building and tweaking computers.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *Prozillah*
> 
> Cheers Bee Dee - luckily through the majority of your process you described above I already know the card runs brilliantly in Stock config & also with the Core OC only. its only when the Mem OC is applied do the BSODS or hardlocks happening instantaneously.
> 
> Ive RMA'd it now anway...luckily I know the guys the shop and they are sending me a new one first with a courier ticket to send the other back....so should be able to nail it down somewhat once I receive it.


Cool. You're so lucky. Cheers to your friends in the shop!

Doom 4 ROCKS on a single 1070 with everything maxed. (95 FPS average)
On my old PC, Rise of the Tomb Raider needs SLI for everything maxed (but just FXAA) with Hair on normal ("On"); it uses 7.9 GB of VRAM. (75 FPS average)
GTA 5 with everything maxed doesn't scale well, but SLI helps; it's plenty smooth for a GTA series game. (65 FPS average)
Will try Witcher 3 soon.

G-Sync at 1440p is the BEST thing ever in video games, and so is the 1070. It's as badass as the 8800 GTX was back in the day.


----------



## Swolern

Anyone see any 1070 SLI vs Titan X Pascal benchmarks?


----------



## Shweller

Quote:


> Originally Posted by *mickr777*
> 
> After a few hours of oc testing my 2x Asus ROG STRIX-GTX1070-8G-GAMING I got 2012 core/9000 memory was very stable and after 10 mins stress test 56c and 58c temps with fans sitting about 65%


I love my Strix OC 1070. It's amazing in OC mode via GPU Tweak. On stock voltages it boosts to 1974-2012MHz on its own, and the temps stay around 60°C with fan speed around 50-55%. This is a beast of a card that almost outperforms my old 290X Crossfire setup, which I could hardly ever get to work right. Very pleased with my purchase.


----------



## svefn

any reviews/owners of the Palit 1070 Dual fan?


----------



## pez

Quote:


> Originally Posted by *Swolern*
> 
> Anyone see any 1070 SLI vs Titan X Pascal benchmarks?


The only 'SLI' benchmark I've seen, which is largely irrelevant to what you're looking for, is the PC World review where they test 1080 SLI vs Titan X P SLI... but only at 4K/5K.

My guess is you're going to see similar to much better performance with 1070 SLI in games that scale. SLI scaling is much better these days, though still not perfect. The biggest factor here, of course, is price: $820-ish to $900-ish vs $1200.


----------



## Swolern

Quote:


> Originally Posted by *pez*
> 
> Only 'SLI' benchmark I've seen, which is largely irrelevant to what you're looking for, is the PC World review where they test 1080 SLI vs Titan X P SLI....but only at 4K/5K....
> 
> My guess essentially is you're going to see similar to much better performance with 1070 SLI with games that scale. SLI scaling is much better these days, though still not perfect. The biggest factor here of course is price. $820-ish-$900-ish vs $1200.


True. I know I don't need it, and I'm perfectly happy with my 1070 SLI. But damn, some of those benchmarks sure look nice. Then again, $400 in my pocket looks nice too.


----------



## pez

Quote:


> Originally Posted by *Swolern*
> 
> True. I know i dont need it, and im perfectly happy with my 1070SLI. But damn some of those benchmarks sure looks nice. But $400 in my pocket looks nice also.


Agreed lol. After seeing the price and performance of the Titan X P, I can't say I'm disappointed in my decision to go 1080 SLI, either. I spent slightly more than the cost of a Titan... but I mean... it would take another Titan X to beat it out ATM.


----------



## GreedyMuffin

Same over here. I'm kinda mad I only had the top-tier card for a few days... (it was out of stock everywhere).

But forget it. The performance is as good as ever and my room doesn't heat up as much as it did with the OCed 980 Ti, which was my goal.


----------



## kevindd992002

Which is the best 1070 around nowadays?


----------



## bigjdubb

Quote:


> Originally Posted by *kevindd992002*
> 
> Which is the best 1070 around nowadays?


Very little variation in performance across the board. For every report of a specific card doing really well, there are two reports of that same type of card not doing as well as expected. Just pick the one that has the right aesthetic appeal for you and go with it.


----------



## TUFinside

Hey ASUS ! Release your ITX GTX 1070 already ! (Pleaaaaase ?)


----------



## mypickaxe

Quote:


> Originally Posted by *bigjdubb*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kevindd992002*
> 
> Which is the best 1070 around nowadays?
> 
> 
> 
> Very little variation in performance across the board. For every report of a specific card doing really well, there are two reports of that same type of card not doing as well as expected. Just pick the one that has the right aesthetic appeal for you and go with it.
Click to expand...

I'd say the right aesthetic *and* thermals. You may want to water cool the thing, in which case the cheapest reference-style model (not necessarily the Founders Edition, depending on when you buy... right now it's hard to find anything less than $44) would be a good bet. You can get water blocks for custom PCB models, but it may be of little benefit until we get BIOS mods (if ever).

If not water cooling, but in a cramped mini-ITX case with poor ventilation, a blower style cooler is the best bet.

If you have breathing room, an open air design may be the best. If you want RGB and GPU controlled PWM fan headers, I'd look at the Asus ROG Strix model.

It's really your call, they're all pretty much within breathing distance of one another.


----------



## ogow89

Quote:


> Originally Posted by *kevindd992002*
> 
> Which is the best 1070 around nowadays?


Gainward Phoenix Golden Sample or Palit Super Jetstream: guaranteed a chip that goes over 2GHz.


----------



## DStealth

Quote:


> Originally Posted by *ogow89*
> 
> a chip that goes over 2ghz.


Can you point to... one that doesn't???


----------



## duganator

So my AMP 1070 is reaching 70°C in games with the fan at 100%. Did I get a faulty card? I read that someone said tightening the screws on the card might help.


----------



## ITAngel

I wonder if any of these companies will ever make a water block for the Zotac GTX 1070 AMP Extreme cards. I hope so; if not it will suck, as I like this card and it does amazingly well on air. However, someday I may want to go water with it.


----------



## madmeatballs

Quote:


> Originally Posted by *ITAngel*
> 
> I wonder if any of these companies will ever make a water cooler block for the Zotac GTX 1070 AMP Extreme cards. I hope so but if not it will suck as I like this card and on air it does amazingly well. However someday I may want to go water with it.


AFAIK, Alphacool is working on one for us 1070 AMP Extreme users.

Btw, what overclock have you achieved with your card?


----------



## aka13

Anyone here with a triple 1080p screen setup? I have been running 4040x1080 on a single 970, and will be switching to 5760x1080 now. Will a single 1070 yield appropriate performance?
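For a rough sense of the workload jump, it helps to compare total pixel counts. A quick sketch, assuming the new target is the standard 3x1080p surround of 5760x1080:

```python
# Compare rendered pixel counts between the old and new surround resolutions.

def pixels(width, height):
    """Total pixels the GPU must render per frame."""
    return width * height

old = pixels(4040, 1080)  # current setup from the post
new = pixels(5760, 1080)  # standard 3x 1080p surround
print(f"old: {old:,} px, new: {new:,} px, ratio: {new / old:.2f}x")
```

That's roughly a 1.4x increase in pixels per frame, which the 970-to-1070 jump should cover comfortably in most titles.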


----------



## ogow89

Quote:


> Originally Posted by *DStealth*
> 
> Can you point ... one that doesn't ???


My MSI GTX 1070 Gaming X maxed at 1936MHz with 126% PL and 100% vcore. Anything above +50MHz on the core clock resulted in a crash. My Gainward Phoenix Golden Sample goes up to 2154MHz, and settles at 2136MHz after a while, with a curve overclock and 114% PL.


----------



## ITAngel

Quote:


> Originally Posted by *madmeatballs*
> 
> afaik, Alphacool is working on one for use 1070 AMP Extreme users.
> 
> Btw, what overclock have you achieved with your card?


Oh, that would be cool. If Alphacool makes one I would totally buy it. At the moment I have been monkeying around with the software but have not pushed the memory yet. On stock settings I have pushed the GPU core up by 190 without crashing and all games run fine. I am trying to find the baseline and how far I can push each section. I have only done CPU overclocking, not GPU, so I'm figuring out what I need to focus on with these 10-series cards.


----------



## TheDeadCry

Quote:


> Originally Posted by *ogow89*
> 
> my msi gtx 1070 gamer x, maxed at 1936mhz, with 126% pl, 100% vcore. Anything above 50+ mhz on the core clock resulted into crash. my gainward phoenix golden sample goes up to 2154mhz, and settles at 2136mhz after awhile, with a curve overclock, and 114% pl.


Bad luck, man. I bumped my core up 100MHz for 2GHz on load immediately when I got it... runs flawlessly. Damn.


----------



## FitNerdPilot

Not sure if this is the right forum for this question... I have the Gigabyte GTX 1070 Xtreme Gaming and have been having bad luck with it in Battlefield 4. Keep getting "device hung, directx error" crashes. I came from an older AMD card. I've tried several fixes I found on youtube but no luck. What am I missing? Let me know what info I failed to provide as I'm sure I missed something crucial.


----------



## ogow89

Quote:


> Originally Posted by *TheDeadCry*
> 
> Bad Luck man
> 
> 
> 
> 
> 
> 
> 
> I bumped up my core to 100mhz for 2ghz on load immediately when I got it...runs flawlessly. Damn.


Well, that is why I swapped the MSI card for the Phoenix Golden Sample. It's cheaper, quieter, and faster. It looks okay in comparison, but it is in the case, so I'm not staring at it all day.


----------



## TheDeadCry

Quote:


> Originally Posted by *ogow89*
> 
> well that is why i changed the msi card to phoenix golden sample. its cheaper, quieter, and faster. looks okay in comparison, but it is in the case, so not starring at it all day.


True. Sounds like you made a good choice. 1070's are finicky when overclocking and, as other people mentioned, almost all identical in overclocking ability. It sounds like you got a good card with a little extra room. Gratz.

I'm very interested to see what you can push. If you make any breakthroughs, I'd love to see!

That being said, I spend too much time just looking inside my case, lmao... I'm a sucker for shiny and colorful things.


----------



## LiquidHaus

Quote:


> Originally Posted by *madmeatballs*
> 
> afaik, Alphacool is working on one for use 1070 AMP Extreme users.
> 
> Btw, what overclock have you achieved with your card?


Quote:


> Originally Posted by *ITAngel*
> 
> Oh that would be cool if Alphacool makes one I would to totally buy it. Well At the moment I have been monkeying around with the software but have not pushed the memory yet. On stock settings I have pushed the GPU Core up to 190 without crashing and running any games fine. I am trying to find the base line and how fast I can push each section. I only done CPU overclocking not GPU so figuring out what I need to focus on with these 10 series cards.


CONFIRMED: Alphacool is making a waterblock for the AMP Extreme. I emailed them last week asking about it and they said it is currently in production.

Here is a SC of the email:

Not exactly sure of the ETA though. I am still waiting on an answer for that.


----------



## ITAngel

Is there a good guide or video showing the steps to overclock a GPU carefully that anyone can post here for me? Thanks in advance!
Quote:


> Originally Posted by *lifeisshort117*
> 
> CONFIRMED Alphacool is making a waterblock for the Amp Extreme. I emailed them last week asking them about it and they said it is currently in production.
> 
> Here is a SC of the email:
> 
> 
> 
> Not exactly sure the ETA though. I am still waiting on an answer for that.


WOW DUDE! THANKS! That is amazing news. Please keep us posted if you see it go up on sale or if you find out when.


----------



## mcbaes72

Quote:


> Originally Posted by *duganator*
> 
> So my amp 1070 is reaching 70 c in games with the fan at 100℅ did I get a faulty card? I read that someone said tightening the screws on the card might help


70°C isn't bad @ 100% fan speed. I run my fan percentage about 10 points below the posted temp. For example, playing a couple hours straight, the highest I've seen was 77°C @ 67% fan speed (MSI Armor). If I ran the fans at a full 100%, it would probably drop to around 68-70°C, where your temp is at.


----------



## Ranguvar

Quote:


> Originally Posted by *mcbaes72*
> 
> 70C isn't bad @ 100% fan speed. I run my fans % 10 lower than posted temp. For example, playing couple hours straight, highest I've seen was 77C @ 67% fan speed (MSI Armor). If I ran full 100% fans, probably drop to around 68-70C where your temp is at.


Are you sure?

My 1070 FTW @ 2.1GHz/9.4GHz will do mid-50s at prolonged load, 100% fan.


----------



## mypickaxe

Quote:


> Originally Posted by *Ranguvar*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mcbaes72*
> 
> 70C isn't bad @ 100% fan speed. I run my fans % 10 lower than posted temp. For example, playing couple hours straight, highest I've seen was 77C @ 67% fan speed (MSI Armor). If I ran full 100% fans, probably drop to around 68-70C where your temp is at.
> 
> 
> 
> Are you sure?
> 
> My 1070 FTW @ 2.1GHz/9.4GHz will do mid-50s at prolonged load, 100% fan.
Click to expand...

Stop talking about your temp and start comparing Delta-T against ambient. It's a silly argument otherwise.
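For anyone who wants to make that comparison, the arithmetic is simple. A quick sketch using two readings from this thread (a 67C card in a 77F room vs a 55C card in a 67F room):

```python
# Compare GPU cooling as Delta-T over ambient, not absolute temperature.

def f_to_c(temp_f):
    """Convert Fahrenheit to Celsius."""
    return (temp_f - 32) * 5 / 9

def delta_t(gpu_c, ambient_c):
    """Degrees Celsius the GPU runs above room temperature."""
    return gpu_c - ambient_c

room_a = f_to_c(77)  # 25.0C
room_b = f_to_c(67)  # ~19.4C
print(f"Card A: {delta_t(67, room_a):.1f}C over ambient")
print(f"Card B: {delta_t(55, room_b):.1f}C over ambient")
```

On those numbers the two cards sit about 42C and 35.6C over ambient, which is a much fairer comparison than the raw temps.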


----------



## mcbaes72

Quote:


> Originally Posted by *Ranguvar*
> 
> Are you sure?
> 
> My 1070 FTW @ 2.1GHz/9.4GHz will do mid-50s at prolonged load, 100% fan.


Wow, your temps are great for air cooling! Hmm... instead of guessing, I'll run my fans at 100% and play for an hour; the weather shouldn't be hot after work. Curious to see how high the max temp gets.

I'm OC'ed +150/+600, with good airflow in the case: (2) 200mm intake, (3) 120mm exhaust.


----------



## Hunched

Quote:


> Originally Posted by *FitNerdPilot*
> 
> Not sure if this is the right forum for this question... I have the Gigabyte GTX 1070 Xtreme Gaming and have been having bad luck with it in Battlefield 4. Keep getting "device hung, directx error" crashes. I came from an older AMD card. I've tried several fixes I found on youtube but no luck. What am I missing? Let me know what info I failed to provide as I'm sure I missed something crucial.


It means your memory clock is too high, from my experience with 2 different 1070's and BF4.
Firestrike is a terrible stability test; if you max out what is stable there and then go play BF4 or Witcher 3 or many other games, you're going to have issues and have to lower your OC.
I assume the device hang is only happening during loading screens?
If you're freezing during actual gameplay, it's more likely too high a core clock.

BF4 likes to tell you your memory is unstable while it loads maps into VRAM during the load screens.


----------



## LiquidHaus

Quote:


> Originally Posted by *ITAngel*
> 
> WOW DUDE! THANKS! That is amazing news. please keep us posted if you see it go up on sale or if you find out when.


Yes it is indeed amazing news lol, and I will definitely keep everyone posted.

Quote:


> Originally Posted by *mypickaxe*
> 
> Stop talking about your temp and start comparing Delta-T against ambient. It's a silly argument otherwise.


This.


----------



## Ranguvar

Quote:


> Originally Posted by *mcbaes72*
> 
> Wow, your temps are great for air cooling! Hmm... instead of guessing, I'll run my fans 100% and play for an hour, weather shouldn't be hot after work, curious to see how high max temp gets.
> 
> I'm OC'ed +150/+600, good airflow in case (2) 200mm intake, (3) 120mm exhaust.


Grab an ambient temp, I should be able to find a thermometer somewhere.

2x 140mm intake, 3x 140mm exhaust.

Also, can you do Time Spy stress test for 10m, or Firestrike Ultra, so we have the same test?


----------



## bigjdubb

Well, my MSI Gaming 1070 maintains a max temp of 54°C with 22-25°C ambient at 100% fan speed. It doesn't matter how long the session is; at 2050MHz the max stays at 54°C in my 22-25°C room.


----------



## GreedyMuffin

If anyone else wants to use their FE backplate with the EK block.


----------



## FitNerdPilot

Hunched,

Thanks for the reply. It's usually happening during gameplay, and it seems sporadic. I did the fix of removing files from one of the folders last week (search "directx fix bf4") and it worked fine for a few days. Now it's happening again. I tried lowering everything to stock 1070 levels and it was still happening. Could it be the Gigabyte software? Something else I'm missing?


----------



## Hunched

Quote:


> Originally Posted by *FitNerdPilot*
> 
> Hunched,
> 
> Thanks for the reply. It's happening in gameplay usually. It seems sporadic. I did the fix of removing files from one of the folders last week (search directx fix bf4) and it worked fine for a few days. Now it's happening again. I tried lowering everything to stock 1070 levels and it was still happening. Could it be Gigabyte software? Something else I'm missing?


I'd do a DDU uninstall of your drivers and a custom clean installation of 368.95. Select "Repair Game" for BF4 in Origin, or completely reinstall it.
Reinstall/repair DirectX.
Or even underclock your GPU and see if that stops it; it's rare, but cards can have stability issues even at their stock settings, and it's more likely if the card comes with aggressive out-of-the-box clocks.

DirectX hangs are almost always caused by GPU instability.
BF4 is about the touchiest thing there is; if your GPU isn't stable for whatever reason, it will let you know.

Good luck, I hope you figure it out. I know how much of a nightmare it can be to solve PC issues...


----------



## Dude970

Quote:


> Originally Posted by *Hunched*
> 
> I'd do a DDU uninstall of your drivers and do a custom clean installation of 368.95. Select "Repair Game" for BF4 in Origin, or completely reinstall it.
> Reinstall/repair DirectX.
> Or even underclock your GPU and see if that stops it, it's rare but cards can have stability issues even at their stock settings, even more likely if the card comes with aggressive out of the box clocks.
> 
> DirectX hangs are almost always the cause of GPU instability.
> BF4 is about the touchiest thing there is, if your GPU isn't stable for whatever reason it will let you know.
> 
> 
> 
> 
> 
> 
> 
> 
> Good luck, I hope you figure it out. I know how much of nightmare it can be to solve PC issues...


Good advice.


----------



## kevindd992002

Will the Accelero Hybrid AIO still fit the AIB GTX 1070's? Or would it be better to ask Arctic Cooling directly?

And is there a prediction for when these 1000-series cards' prices will come down to MSRP?!


----------



## BulletSponge

Quote:


> Originally Posted by *FitNerdPilot*
> 
> Not sure if this is the right forum for this question... I have the Gigabyte GTX 1070 Xtreme Gaming and have been having bad luck with it in Battlefield 4. Keep getting "device hung, directx error" crashes. *I came from an older AMD card*. I've tried several fixes I found on youtube but no luck. What am I missing? Let me know what info I failed to provide as I'm sure I missed something crucial.


Quote:


> Originally Posted by *Hunched*
> 
> *I'd do a DDU uninstall of your drivers and do a custom clean installation of 368.95*. Select "Repair Game" for BF4 in Origin, or completely reinstall it.
> Reinstall/repair DirectX.
> Or even underclock your GPU and see if that stops it, it's rare but cards can have stability issues even at their stock settings, even more likely if the card comes with aggressive out of the box clocks.
> 
> DirectX hangs are almost always the cause of GPU instability.
> BF4 is about the touchiest thing there is, if your GPU isn't stable for whatever reason it will let you know.
> 
> 
> 
> 
> 
> 
> 
> 
> Good luck, I hope you figure it out. I know how much of nightmare it can be to solve PC issues...


My guess is there are some leftover AMD driver files causing issues. Can he run DDU to remove the AMD drivers if the AMD card is not installed? This is only if a fresh install/DX repair does not work.


----------



## Al plants Corn

Probably a silly question, but I've been out of the game too long to remember: can you step up an EVGA SC or FTW to a 1080?

This would give me the option to upgrade if I decide a 1070 isn't enough.


----------



## GreedyMuffin

You can step up a 1070 to a 1080, but not a 1070 SC to a 1070 FTW, for example. AFAIK.

EDIT: You can step up a 1070 SC to a 1080 FTW if you want to. That's correct.


----------



## Ranguvar

Quote:


> Originally Posted by *GreedyMuffin*
> 
> EDIT: You get step-up a 1070 SC to a 1080 FTW if you want to. That's correct.


Nope. Only Founders, custom blowers, and ACX versions are on the list now, and that is consistent with what they've done before.

They will step up to SC if they have no ACX stock though; it seems common, or at least it has been.


----------



## GreedyMuffin

Quote:


> Originally Posted by *Ranguvar*
> 
> Nope. Only Founders, custom blowers, and ACX versions are on the list now, and that is consistent with what they've done before.
> 
> They will step up to SC if they have no ACX stock though, seems common at least has been common.


My bad, thanks for correcting me!


----------



## yanks8981

I just ordered an Evga 1070 FTW to replace my 780. I hope I picked the right one.


----------



## LiquidHaus

Quote:


> Originally Posted by *yanks8981*
> 
> I just ordered an Evga 1070 FTW to replace my 780. I hope I picked the right one.


I would have got that if I had not gone with the Amp Extreme. Solid choice, sir!


----------



## Al plants Corn

Quote:


> Originally Posted by *Ranguvar*
> 
> Nope. Only Founders, custom blowers, and ACX versions are on the list now, and that is consistent with what they've done before.
> 
> They will step up to SC if they have no ACX stock though, seems common at least has been common.


What's the likelihood of a FTW/etc versions being added later on?


----------



## Shweller

Quote:


> Originally Posted by *BulletSponge*
> 
> My guess is there are some left over AMD driver files causing issues. Can he run DDU to remove AMD drivers if the AMD card is not installed? This is only if a fresh install/repair DX try does not work.


Agreed. I went from a 290X Crossfire setup to my Strix 1070 with no issues at all. A DDU uninstall and clean install of the Nvidia drivers should do the trick; some leftover registry items from AMD may be causing issues. I'm just glad to be back on the green team. No problems, finally.


----------



## ITAngel

Quote:


> Originally Posted by *lifeisshort117*
> 
> CONFIRMED Alphacool is making a waterblock for the Amp Extreme. I emailed them last week asking them about it and they said it is currently in production.
> 
> Here is a SC of the email:
> 
> 
> 
> Not exactly sure the ETA though. I am still waiting on an answer for that.


So I did a quick OC, and this is what I found doing it manually instead of using curve mode, which I am beginning to like more.

GPU @ 2113MHz
MEM @ 4654MHz
TEMP AVG @ 64°C/65°C
ROOM TEMP @ 72°F

That is where I am now; I'm working on getting it to run without any artifacts. Once I fine-tune it, I will start upping the voltage to help, and last I'll OC the CPU.
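One caveat when comparing memory numbers across posts: different tools report GDDR5 clocks with different multipliers, so a 4654MHz readout and a 9000MHz readout can describe similar actual speeds. A small sketch of the usual conventions (which convention a given tool uses is an assumption worth verifying against your own readouts):

```python
# GDDR5 effective data rate from different tool readouts. The x4/x2/x1
# multipliers follow the usual GDDR5 conventions; verify which one your
# tool uses before comparing numbers across posts.

def effective_rate(reported_mhz, convention):
    """Effective GDDR5 transfer rate in MT/s from a tool's reported clock.

    'command'  : actual memory command clock (GPU-Z style),        x4
    'ddr'      : double-data-rate figure (Afterburner style),      x2
    'effective': already the effective rate,                       x1
    """
    factor = {"command": 4, "ddr": 2, "effective": 1}[convention]
    return reported_mhz * factor

print(effective_rate(4654, "ddr"))      # ~9308 MT/s effective
print(effective_rate(2002, "command"))  # 8008, the 1070's stock 8 GT/s
```

Under the double-data-rate convention, 4654MHz works out to roughly 9308 MT/s effective, in the same ballpark as the "9000 memory" figures quoted elsewhere in the thread.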


----------



## Ranguvar

Quote:


> Originally Posted by *Al plants Corn*
> 
> What's the likelihood of a FTW/etc versions being added later on?


Hasn't happened in the past, someone correct me if I'm wrong.


----------



## mcbaes72

Quote:


> Originally Posted by *Ranguvar*
> 
> Grab an ambient temp, I should be able to find a thermometer somewhere.
> 
> 2x 140mm intake, 3x 140mm exhaust.
> 
> Also, can you do Time Spy stress test for 10m, or Firestrike Ultra, so we have the same test?


I ran Firestrike (see below). Includes GF Experience, Core Temp, GPU-Z, and Afterburner.

Room Temp = 77F
Fans @ 100% = 50C and up
Max Temp = 67C
OC stats on Afterburner 4.2.0 (see pic)
CPU = 6700K @ 4.5
RAM = 16GB @ 3000
Monitor = 1440p/144


----------



## Al plants Corn

Quote:


> Originally Posted by *Ranguvar*
> 
> Hasn't happened in the past, someone correct me if I'm wrong.


Well fudge. Thanks


----------



## Ranguvar

Quote:


> Originally Posted by *mcbaes72*
> 
> I ran Firestrike (see below). Includes GF Experience, Core Temp, GPU-Z, and Afterburner.
> 
> Room Temp = 77F
> Fans @ 100% = 50C and up
> Max Temp = 67C
> OC stats on Afterburner 4.2.0 (see pic)
> CPU = 6700K @ 4.5
> RAM = 16GB @ 3000
> Monitor = 1440p/144


Afterburner isn't showing your clock speeds from the test, just the offset you set, which may be eaten into by Boost 3.0 downclocking, etc.
If you can link the Firestrike result, it'll show the max or average clock.

I'm just getting to sleep, but I'll be able to run the 10-minute Firestrike test tomorrow.
6700K @ 4.7, 32GB RAM @ 3000.

My ambient is 67F, so definitely cooler.

Here is a screenshot from an older session; 2278MHz doesn't score well due to using a custom voltage curve, but as an example, 45C max: https://i.imgur.com/crJkaUM.jpg
It fails validation because the 1070 FTW is _still_ undetected in 3DMark...


----------



## madmeatballs

Quote:


> Originally Posted by *ITAngel*
> 
> So I did a quick OC, and this is what I found doing it manually instead of using the curve mode which I am beginning to like more.
> 
> 
> 
> GPU @ 2113Mhz
> MEM @ 4654Mhz
> TEMP AVG @ 64C/65C
> ROOM TEMP @ 72F
> 
> That is what I am working on trying to get the rest to run without any artifacts. Then once I fine tune it I will start upping the voltage to help with it and last OC the CPU.


Wow, lucky you. I couldn't get those clocks on the same card without artifacts and driver crashes. I could only get +35 core and +400 memory, voltage and power limit maxed: 2060MHz on the GPU (drops as low as 2038) and 4498MHz on the memory. Sad to say I lost the silicon lottery lol.


----------



## ITAngel

Hang in there, I'm still working on it, but I have yet to see artifacts at that setting. I saved it so I can test it on Heaven again tomorrow. Will keep you posted on my findings.


----------



## Dreamliner

I gotta say, I feel pretty lucky. I knew getting the Strix OC card would guarantee a good chip. This is what I'm seeing after 10 minutes or so of MSI Kombustor. Suggestions on what to try next? Is there some multiple to use when overclocking the core and memory?


----------



## LiquidHaus

Quote:


> Originally Posted by *ITAngel*
> 
> So I did a quick OC, and this is what I found doing it manually instead of using the curve mode which I am beginning to like more.
> 
> 
> 
> GPU @ 2113Mhz
> MEM @ 4654Mhz
> TEMP AVG @ 64C/65C
> ROOM TEMP @ 72F
> 
> That is what I am working on trying to get the rest to run without any artifacts. Then once I fine tune it I will start upping the voltage to help with it and last OC the CPU.


oh very nice! will have to try this soon!


----------



## TheBoom

Quote:


> Originally Posted by *ITAngel*
> 
> So I did a quick OC, and this is what I found doing it manually instead of using the curve mode which I am beginning to like more.
> 
> 
> 
> GPU @ 2113Mhz
> MEM @ 4654Mhz
> TEMP AVG @ 64C/65C
> ROOM TEMP @ 72F
> 
> That is what I am working on trying to get the rest to run without any artifacts. Then once I fine tune it I will start upping the voltage to help with it and last OC the CPU.


How are you guys getting above 2.1GHz? My Strix doesn't want to go any higher than 2075MHz before crashing the driver. Memory seems to top out at about 9.27GHz. And this is with power limit and voltage maxed.

I did notice that the card doesn't get near the max power limit of 120% at all.


----------



## Oj010

Quote:


> Originally Posted by *TheBoom*
> 
> How are you guys getting above 2.1ghz. My strix doesn't wanna go any higher than 2075mhz before crashing the driver. Memory seems to go to about 9.27ghz max. And this is with power limit and voltage maxed.
> 
> I did notice that the card doesn't get near the max power limit of 120% at all.


Strix and Gaming X are two of the poorer clocking cards I've used.

On an unrelated note, two days ago I had a FE that did 2268 MHz GPUPI


----------



## mcbaes72

Quote:


> Originally Posted by *Ranguvar*
> 
> Precision isn't showing your clock speeds from the test, just the offset you set, which may be eaten into by Boost 3.0 downclocking, etc.
> If you can link the Firestrike result it'll show max/avg clock, one or the other.
> 
> I'm just getting to sleep, but I'll be able to test tomorrow 10m Firestrike.
> 6700K @ 4.7, 32GB RAM @ 3000.
> 
> My ambient is 67F, so definitely cooler.
> 
> Here is a screenshot from an older session, 2278MHz doesn't score well due to using custom voltage curve, but as an example, 45C max: https://i.imgur.com/crJkaUM.jpg
> Fails validation because 1070 FTW is _still_ undetected in 3DMark.....


Got it, great score and low temp! FTW OC's nicely.

Here are more detailed results, including max core, memory, etc. I ran it again this morning with a cooler room temp (72F) and GPU temp (63C @ 100% fans) per Afterburner. But I forgot to screen-capture the second Firestrike run this morning, so I re-used the score from last night.


----------



## TheBoom

Quote:


> Originally Posted by *Oj010*
> 
> Strix and Gaming X are two of the poorer clocking cards I've used.
> 
> On an unrelated note, two days ago I had a FE that did 2268 MHz GPUPI


I see. Well, I'd still pick the aesthetics of the Strix over a small difference in overclockability, and I guess you can't have both with the current batch.

Edit: Is the single 8-pin connector the reason for the limitations?


----------



## ITAngel

Quote:


> Originally Posted by *TheBoom*
> 
> How are you guys getting above 2.1ghz. My strix doesn't wanna go any higher than 2075mhz before crashing the driver. Memory seems to go to about 9.27ghz max. And this is with power limit and voltage maxed.
> 
> I did notice that the card doesn't get near the max power limit of 120% at all.


Hi there, I was reporting what the Heaven benchmark was displaying in the top right. I'm not sure which of the two programs reports the actual GPU and memory MHz, but I ran a test before going to work today; here is a full desktop screenshot. Same room temp, 72F.



You guys can tell me if it's wrong; to me, any overclock over stock is free performance.


----------



## TheBoom

Quote:


> Originally Posted by *ITAngel*
> 
> Hi there I was reporting what Heaven Benchmark was displaying on top right. Not sure which of the two software is showing actual reporting of GPU and Mem Mhz but I ran a test before going to work today and here is what I got full desktop screenshot. Same Room Temp 72F.
> 
> 
> 
> You guys can tell me if is wrong, and to me any overclock over stock and stock overclock is free performance to me.


The correct values are shown by Afterburner or GPU-Z if you have that. So in this case it should be 2062MHz core and 9.3GHz memory, if that's what it was showing while the benchmark was actually running.
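For what it's worth, the "9.3GHz" headline number and the ~2327MHz that GPU-Z reports describe the same overclock; only the multiplier differs. A minimal sketch of that conversion, assuming the usual GDDR5 quad-pumping (GPU-Z's sensor tab shows the command clock, reviews quote the effective data rate; worth double-checking against your own tools):

```python
# Sketch: converting the GDDR5 clock GPU-Z reports into the "GHz" figure
# reviews quote. Assumption: GPU-Z shows the command clock (2002 MHz stock on
# a GTX 1070), and the effective GDDR5 data rate is 4x that (quad-pumped).

def effective_rate_mts(gpuz_clock_mhz: int) -> int:
    """Effective GDDR5 transfer rate in MT/s from the GPU-Z command clock."""
    return gpuz_clock_mhz * 4

print(effective_rate_mts(2002))  # stock GTX 1070: 8008 MT/s ("8 GHz")
print(effective_rate_mts(2327))  # the "9.3 GHz" overclock above: 9308 MT/s
```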


----------



## FitNerdPilot

By the way, I have an extra Gigabyte 1070 G1 Gaming if anyone is in need... bought it, then the Xtreme Gaming came out and I bought that instead, hoping I could just sell the G1.


----------



## ITAngel

Quote:


> Originally Posted by *TheBoom*
> 
> The correct values are shown by afterburner or gpu-z if you have that. So in this case it should be 2062mhz core and 9.3ghz mem if that was what it was showing when the benchmark was actually running.


Okay, I will play with GPU-Z when I get back from work today and will fine-tune those settings to see. By the way, is that an okay OC?


----------



## Naykz

Hi guys,

I've got a little question. I have two 1070s from Gigabyte. Each can be overclocked to 2100-2113MHz, but in SLI I already get driver issues @ 2088MHz. Does SLI lower the overclocking potential?

Best regards


----------



## Shut3r

Does anyone have a BIOS from an MSI Aero or ASUS Turbo 1070?

Sent from my LG-D855 using Tapatalk


----------



## bigjdubb

Quote:


> Originally Posted by *Naykz*
> 
> Hi Guys,
> 
> i got litte question. I got two 1070s from Gigabyte. Each can be overclocked to 2100-2113mhz, but in SLI i get already driver issues @2088mhz. Is SLI kinda lowering the overclock potential?
> 
> best regards


I have always been able to get higher clock speeds with single cards than I could while running the same cards in SLI. It's not unusual for this to happen.


----------



## Naykz

Quote:


> Originally Posted by *bigjdubb*
> 
> I have always been able to get higher clock speeds with single cards than I could while running the same cards in SLI. It's not unusual for this to happen.


OK, thx for the info. At first I thought it was some thermal throttling issue, but GPU Boost 3.0 is unpredictable. Even when I run them as single cards they use far less voltage. Guess I should be happy they are at least both stable @ 2075MHz in SLI.


----------



## madmeatballs

Quote:


> Originally Posted by *ITAngel*
> 
> Hi there I was reporting what Heaven Benchmark was displaying on top right. Not sure which of the two software is showing actual reporting of GPU and Mem Mhz but I ran a test before going to work today and here is what I got full desktop screenshot. Same Room Temp 72F.
> 
> 
> 
> You guys can tell me if is wrong, and to me any overclock over stock and stock overclock is free performance to me.




Hmmm, pretty weird. My core clock was brought down to 2050MHz then 2038MHz during the benchmark, GPU temp was at 54c. GPU Boost 3.0 is weird.
(I set heaven to 1600x900 to copy what he had so I could compare.)


----------



## Tasm

Why the hell are unlocked BIOSes taking so long?

Is it hardware locked?


----------



## ITAngel

Quote:


> Originally Posted by *madmeatballs*
> 
> 
> 
> Hmmm, pretty weird. My core clock was brought down to 2050MHz then 2038MHz during the benchmark, GPU temp was at 54c. GPU Boost 3.0 is weird.
> (I set heaven to 1600x900 to copy what he had so I could compare.)


Yea, I set it up that way so I could see the other tools during the test, but I can run one at 1080p during lunch and post it. I also noticed it bringing the clock down; at one point I had it at 2088 and it went down during the bench. Could PCIe 3.0 be throttling these cards? Anyway, to give you a better comparison I will mess with it, hopefully during lunch in an hour, if not when I get home from work today.









By the way, I like your minimum FPS; mine seems low for some odd reason. What CPU do you have, at what speed, and what RAM?

Never mind, I saw it on your test.









Going to try to match speeds and such, so I need to OC my CPU to 4.0GHz.


----------



## madmeatballs

Quote:


> Originally Posted by *Tasm*
> 
> Why the hell unlocked BIOS are taking so long?
> 
> Its hardware locked?


I really hope someone is working on it. We just have to wait.
Quote:


> Originally Posted by *ITAngel*
> 
> Yea the settings I put it that way so I can see other tools during the test but I can run one at 1080p during lunch and post it. I did notice that also were it will bring it down since at one point I had it at 2088 and went down during bench. I think could be PCIE 3.0 throttling these cards? Anyways to give you a better comparison I will mess with it hopefully during lunch in an hour if not today when I get home from work.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> By the way I like your minimum FPS, I seem to get low on those for some odd reasons. What CPU and speed do you have it at and Ram?
> 
> Nevermind I saw it on your test.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Going to try match speeds and such, so I need to OC my CPU to 4.0Ghz.


My CPU is actually OC'd to 4.6GHz, but your CPU is better.


----------



## LiquidHaus

Quote:


> Originally Posted by *Tasm*
> 
> Why the hell unlocked BIOS are taking so long?
> 
> Its hardware locked?


The voltage is hardware locked, yes. BUT it's hardware locked at 1.25v.

Anything between 1.093v and 1.25v is still possible. Indeed, they are definitely taking way too long. I don't think anyone has the incentive, though. In the BIOS thread, people were talking about starting a Kickstarter for it to gain some attention from those who actually have the skills to make it happen.


----------



## TheBoom

Yeah the boost is kinda annoying. Sometimes it will not even go to the highest possible clocks. Other times it takes a while to ramp up. I'm going to leave it as it is until someone comes up with an unlocked bios that disables it completely.


----------



## mypickaxe

Quote:


> Originally Posted by *Tasm*
> 
> Why the hell unlocked BIOS are taking so long?
> 
> Its hardware locked?


Encryption.


----------



## ITAngel

Here is the test I did at lunchtime. The only thing I did was a quick OC on the CPU to 4.5GHz. I would have continued overclocking but ran out of time at lunch. Will continue later tonight and see.


----------



## TheBoom

Quote:


> Originally Posted by *ITAngel*
> 
> Here is the test I did at lunch time. The only thing I did do a quick OC on the CPU for 4.5Ghz


You have a Strix? It seems like 2063 is the sweet spot for all the Strix cards; I've seen the same on other forums. Any higher and it crashes.


----------



## Vaesauce

Finally was able to breach 2700! I am satisfied lol. I can go crawl back into my corner now


----------



## ITAngel

Quote:


> Originally Posted by *TheBoom*
> 
> You have a strix? It seems like 2063 is the sweet spot for all the strix cards. Seen the same on other forums. Any higher and crashes.


No, I have the Zotac GTX 1070 AMP Extreme. I am not sure if Afterburner has a bug or what, but I was running 2088MHz on the core and then it dropped to 2063MHz. Seriously?


----------



## ITAngel

Quote:


> Originally Posted by *Vaesauce*
> 
> 
> 
> Finally was able to breach 2700! I am satisfied lol. I can go crawl back into my corner now


Nice, good work. I am trying to break into the upper mid 2500+ but something is holding me back.


----------



## Hnykill

Just upgraded from an AMD 7950 to a Palit GTX 1070 Super Jetstream today. Man, this thing is a heavy ****er :Þ ..it feels like 1kg. But the rule of silence is: bigger fans and low RPM. Took the Palit over the Gigabyte 3x Windforce. My old Gigabyte 7950 had 3 small, thin fans, and under load each of the 3 fans needed like 3000 RPM to cool the card. High-noise, high-pitched thing.

Also, some shrouds around GPU coolers today totally cover the heatsink, giving the airflow from the fans nowhere to go. The Palit fans sit above the heatsink with an open path through it to exhaust hot air. They are also thicker ("deeper") than MSI's and Gigabyte's. Never wanted the Founders Edition; was going to get EVGA, but this thing... there are no idiots behind this design. RGB light at the front. Backplate. This is one silent card.

i7 5820K @ 4.4GHz
NZXT Kraken X61
Gigabyte X99 Gaming 5
16 GB DDR4 2666MHz
Palit GTX 1070 Super Jetstream
EVGA 1300W Supernova G2

I have a BenQ XL2411T 144Hz monitor with LightBoost. Just maxed out all of my games and well :Þ ..this is something else.. 100+ FPS steady (don't use AA) ..man, it's beautiful. Best was Hitman 2016; saw the most difference there. Now if game developers would stop capping games at 60 FPS I would enjoy it a little more. I mean, I bought Wolfenstein: The New Order, and I have this setup but I am capped at 60 FPS :/ ..Bethesda seems to have this problem in particular, something about their physics engine.

This is one great card, I'm telling you. But don't fall for the thin, many-fan setups on this card: high-spin, high-pitched, high-noise coolers. Been playing since Quake 2 with a Voodoo 1, so I have some experience.

Great card. And like the rule of thumb says, "always skip a generation"

= silent big-ass cooler of a card.


----------



## Vaesauce

Quote:


> Originally Posted by *ITAngel*
> 
> Nice, Good work. I am trying to break the upper mid 2500+ but something something is holding me back.


Thanks!

You'll get there one way or another man!

I have a Gigabyte 1070 Xtreme Gaming, I could barely hit mid 2500s until I switched over to the Zotac AMP Extreme BIOS. Zotac's has less throttling. It also draws a lot more power/watts which I think feeds better into the Core/Memory Clocks. I was able to hit 2700s with no artifacts and that was the goal.

The biggest difference is the fan RPM for sure though. 85% on Zotac BIOS is like 2700rpm. 85% on the Palit Premium Gamerock is like 1600RPM. Meanwhile, they both keep the card at the same exact degree range. Mostly because the Zotac uses up a lot more watts and the card heats up more, might explain it. Weird thing also is that the Palit BIOS is also extremely more stable on Memory overclocking.

Either way, I've been observing a lot lol.


----------



## TUFinside

Quote:


> Originally Posted by *Hnykill*
> 
> Just upgraded from AMD 7950 to Palit GTX 1070 Super Jetstream today. man this thing is a heavy ****er :Þ ..it feels like 1Kg . but the rule of silence is - bigger fans and low RPM. took Palit over Gigabyte 3x Windforce. my old Gigabyte 7950 had 3x small "thin" fans and under load each of the 3 fans needed like 3000 rpm to cool the card. high noise, high picthed thing.
> 
> Also some shrouds around GPU coolers today totally cover the heatsink. giving the airflow of the fans nowhere to go. Palit fans are above and have open way trough the heatsink to exhaust hot air. they are also thicker "deeper" then MSI and Gigabyte. never wanted the Founders Edition. was going to get Evga. but this thing. there are no idiots behind this design. RGB light at front. Backplate. this is one silent card.
> 
> i7 5820K @ 4.4Ghz
> NZXT Kraken X61
> Gigabyte X99 Gaming 5
> 16 GB DDR4 2666 Mhz
> Palit GTX 1070 Super Jetstream
> Evga 1300W Supernova G2
> 
> I have a benQ XL2411T 144Hz monitor with Lightboost. just maxed out all of my games and well :Þ .. this is something alse.. 100+ FPS steady (dont use AA) ..man it's beautiful. best was Hitman 2016. saw most difference there. now if game developers would stop capping games at 60 FPS i would enjoy it a little more. i mean, i bought Wolfenstein : The new Order , and i have this setup but i am capped at 60 FPS :/ ..bethesda seems to have this problem in perticular. something about theyr physic engine.
> 
> This is one great card im telling you. But dont fall for the "thin" many fans setup's on this card. high spin, high pitched, high noise coolers. been playing since Quake 2 with Voodo 1. so i have some experience.
> 
> Great card. and like the rule of thumb say's. "alway's skip a generation"
> 
> = silent big ass cooler of a card.


Good pick, I think it's one of the quietest, if not the quietest, cards out there!


----------



## TUFinside

I asked this question elsewhere, but I thought I could ask it again here if you'll allow me. Is this a good pick for a small mATX case (horizontal mobo design)? I mean blower style vs. open design?

ASUS GTX 1070 TURBO



Also, can I fit this backplate to this card without actually using a waterblock?


----------



## bigjdubb

I don't have a reference card, so I can't be certain whether it will work, but it doesn't appear to have all of the same holes. It is entirely possible that it has enough of them to attach properly, though.

Is the ASUS Turbo a reference board? I can't tell from the pictures. EDIT: It's not a reference board, but it may use the same mounting holes as a reference board.


----------



## criminal

Quote:


> Originally Posted by *ikjadoon*
> 
> Thannkks.
> 
> 
> 
> 
> 
> 
> 
> Not bad at all.


Received the block yesterday and got it installed. Great block for the money.


----------



## FlatOUT

Got my MSI Gaming X to 2126MHz at 100% fan, but there are artifacts. At 2101 it's fine with no artifacts at 100% fan; on auto it drops to 2076MHz. Memory is pretty OK at 4800; still don't know if it can go higher because I'm afraid (still not sure if +800 is safe).


----------



## ikjadoon

Quote:


> Originally Posted by *criminal*
> 
> Received the block yesterday and got it installed. Great block for the money.


Great deal, wow. I'm happily surprised to see parts for these 10-series dipping into more reasonable price segments, haha.


----------



## 113802

2114MHz/9216MHz GTX 1070 FTW. Can't stand the coil whine; I RMA'd it, but the replacement card also had loud coil whine and couldn't even hit 2035MHz.

http://www.3dmark.com/3dm/13872670


----------



## erase

Quote:


> Originally Posted by *TUFinside*
> 
> I asked that question elsewhere but i thought i can ask it again here if you allow me.Is that a good pick for mATX small case (horizontal mobo design) ? I mean blower style vs open design ?
> 
> ASUS GTX 1070 TURBO
> 
> 
> 
> Also can i fit this backplate to this card without actually using a waterblock ?


I have this card; I got it 2 days ago. I doubt the backplate will work. It isn't a reference design card, it's an ASUS custom card; they use the same parts as the Strix on the PCB, just not as many of them, of course. In regards to clock speed, it is reference.

I can't tell for sure, but ASUS may be using different VRAM chips. I was able to overclock my VRAM to +684, which gives me a nice round 300 GB/s on the memory. It can run +1000 briefly, but there will be a lot of snow, which isn't good.

The core doesn't overclock that well; at +165 it will crash after a while. I put my core back to default and just overclocked the RAM.

My ASUS GTX 1070 Turbo is in an HTPC case, a Fractal Node 605, with 2 fans on the minimum setting. The blower fan isn't loud at all when doing next to nothing, although under load it will ramp up. Depends on the game too, and if you cap at 60 FPS the card will lower its clocks and stay pretty quiet; e.g. Doom on Nightmare settings doesn't need to boost to max.

Can post a photo soon if anyone wants to see the card sitting in my Fractal Node 605 build.
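For what it's worth, the "+684 for a nice round 300 GB/s" figure checks out arithmetically. A rough sketch, assuming the offset applies to the 4004MHz double-data-rate memory clock that Afterburner-style tools show for a stock 1070, and the card's 256-bit bus (both standard for this model, but worth verifying in GPU-Z):

```python
# Rough check of the "+684 offset = a nice round 300 GB/s" claim above.
# Assumptions: the offset applies to the 4004 MHz double-data-rate clock shown
# for a stock GTX 1070, the effective GDDR5 rate is twice that again, and the
# memory bus is 256 bits wide.

STOCK_DDR_CLOCK_MHZ = 4004
BUS_WIDTH_BITS = 256

def bandwidth_gbs(offset_mhz: int) -> float:
    """Theoretical memory bandwidth in GB/s for a given memory offset."""
    effective_mts = (STOCK_DDR_CLOCK_MHZ + offset_mhz) * 2
    return effective_mts * (BUS_WIDTH_BITS / 8) / 1000

print(round(bandwidth_gbs(0), 1))    # stock: ~256.3 GB/s
print(round(bandwidth_gbs(684), 1))  # +684: ~300.0 GB/s
```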


----------



## DStealth

Just got my highest scores on FS and FSU. Don't know whether the Anniversary Update or the new driver is responsible... even with the Zotac BIOS, which is a little slower than my original, clock for clock.


----------



## vfrmaverick

I bought the EVGA 1070 SC and have managed an OK overclock: +120 core and +525 memory, very stable, temps around 62-65C, 2114MHz peak and usually dropping to 2088. Went ahead and tried the ASUS Strix BIOS and could only gain about 10 more MHz stable, but had to sacrifice memory, for a grand score of 3982 in Valley. The exact same as my stock BIOS lol. Just want to see what else is out there, or if my chip is just tapped out.


----------



## Shut3r

Quote:


> Originally Posted by *TUFinside*
> 
> I asked that question elsewhere but i thought i can ask it again here if you allow me.Is that a good pick for mATX small case (horizontal mobo design) ? I mean blower style vs open design ?
> 
> ASUS GTX 1070 TURBO
> 
> 
> 
> Also can i fit this backplate to this card without actually using a waterblock ?


Hi, can you load up your Bios file please?


----------



## Swolern

So I admit I have been contemplating buying a new Titan X Pascal for my new Predator X34 monitor. I love new hardware! So I asked a guy with the same 3440x1440 monitor to OC his new Titan XP and run Heaven 4.0 to compare it to my 1070 SLI.

*Here is his Titan XP getting around 1.9-1.95ghz core, 11100mhz mem.*



*
And my 1070SLI to compare 1936/9050mhz :*



Almost a *65%* increase over the Titan X Pascal. Uh, I think I'll keep my $800 GPU solution.


----------



## outofmyheadyo

Something is seriously off here; the Titan XP is faster than SLI 980 Tis, and this claims it's 65% slower?


----------



## Naykz

Quote:


> Originally Posted by *outofmyheadyo*
> 
> something is seriously off here, titanXP is faster than sli 980ti and this here claims its 65% slower?


Yeah, maybe the drivers aren't optimized yet. If you compare 1070 SLI vs Titan X (Pascal) SLI in Time Spy, there isn't even that big of a performance difference. Or they wasted most of the potential due to the reference model.


----------



## Swolern

Quote:


> Originally Posted by *outofmyheadyo*
> 
> something is seriously off here, titanXP is faster than sli 980ti and this here claims its 65% slower?


Quote:


> Originally Posted by *Naykz*
> 
> Yeah, maybe some the drivers arent optimised yet. If you compare SLI 1070 vs Titan X (Pascal) SLI in TimeSpy, there isnt even that big of a performance difference. Or they wasted most of the potencial due to the reference model.


Ya, you're right. I just did the math, and even if SLI is scaling perfectly, 1070 SLI should be anywhere from 19-26% faster than the Titan X, not 65%. Maybe something is up with the drivers or Heaven. This is going by the Titan X review comparison: 26% faster at 1440p and 19% at 4K, going by the chart.
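That back-of-the-envelope math can be sketched directly. Assuming the review chart's single-card gap (Titan XP roughly 59% faster than one 1070 at 1440p and 68% at 4K; both approximations read off the chart) and perfect two-way scaling:

```python
# Sketch of the post's math: if a single Titan X (Pascal) is some fraction
# faster than one 1070, then perfectly scaling 1070 SLI only beats it by
# 2/(1 + gap) - 1. The gap values below are rough reads of the review chart.

def sli_vs_titan(titan_gap: float, scaling: float = 1.0) -> float:
    """Expected 1070 SLI advantage over a Titan X, as a fraction."""
    sli_perf = 1 + scaling      # two cards; `scaling` = second card's yield
    titan_perf = 1 + titan_gap  # single Titan vs a single 1070
    return sli_perf / titan_perf - 1

print(round(sli_vs_titan(0.59), 2))  # ~0.26 -> ~26% faster at 1440p
print(round(sli_vs_titan(0.68), 2))  # ~0.19 -> ~19% faster at 4K
```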


----------



## victorrz

Quote:


> Originally Posted by *ITAngel*
> 
> Here is the test I did at lunch time. The only thing I did do a quick OC on the CPU for 4.5Ghz. I would had continue overclocking but ran out of time at lunch. Will continue later on tonight and see.


These are my results using also a 1070 amp extreme:



Now I will try to overclock more the memory clock.

With these clocks I am getting 2126MHz on the core, but it stabilizes at 2101MHz.


----------



## Prozillah

Soo, I got my RMA'd 1070 G1 Gaming to replace what appeared to be an obvious issue on my initial card.

The first card would run 2113MHz boost consistently, but I couldn't get accurate results on the memory. Afterburner would choke, hardlock, and BSOD on a small OC of 100-200MHz on the memory, and the voltage never seemed to move under load from 1.050v, which I thought was strange considering that under normal load these cards should sit around the 1.092v mark with the power limit pumped all the way up.

Buuuut the new card I received will only do about 2038MHz core but 9.1GHz on the memory, without BSODs or hardlocks or anything.

I'm losing out on 62MHz on the core but gaining hugely on the memory. I always understood core was king, so is it worth keeping the good core clock or the second card with the memory clock?


----------



## Derpinheimer

Memory is king this time. Keep the better memory overclocker.


----------



## ITAngel

Quote:


> Originally Posted by *Swolern*
> 
> So I admit I have have been contemplating buying a new Titan X Pascal for my new Predator x34 monitor. I love new hardware! So i asked a guy with the same 3440x1440 res monitor to OC his new Titan XP and run Heaven 4.0 to compare it to my 1070 SLI.
> 
> *Here is his Titan XP getting around 1.9-1.95ghz core, 11100mhz mem.*
> 
> 
> 
> *
> And my 1070SLI to compare 1936/9050mhz :*
> 
> 
> 
> Almost a *65%* increase over the Titan X Pascal. Uh, I think ill keep my $800 GPU solution.


Personally I think the Titan XP is a waste of money; keep your SLI setup. Not only is it faster, it's also better security: if a card fails you are not down. Send the card out for RMA and keep running while it gets replaced/repaired.

A win-win situation.


----------



## ITAngel

Quote:


> Originally Posted by *victorrz*
> 
> These are my results using also a 1070 amp extreme:
> 
> 
> 
> Now I will try to overclock more the memory clock.
> 
> With these clocks I am getting 2126Mhz on core but It stabilizes at 2101Mhz.


Yea, I am going to play with it more on the weekend and see, but it seems the highest I have gotten was 2088MHz.


----------



## outofmyheadyo

Are you guys on the default coolers or water? I'm wondering whether adding a universal GPU block would give me some proper gains?


----------



## ITAngel

I am on the default cooler, not on water yet; still waiting on a water block to be made for my card.


----------



## bigjdubb

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Are you guys on the default coolers or water? Im wondering will adding an universal gpu block give me some proper gains?


I was originally going to run an Alphacool HF-14 universal GPU block, but since it took me three weeks to get my loop back up and running, the Bitspower block for the MSI Gaming cards came out in the meantime and I decided to get it. The Alphacool block does fit fine on the cards, though.

I don't expect to see any additional performance gains (as far as clock speed is concerned) other than holding a higher boost speed due to lower temps.


----------



## FlatOUT

What are safe OC memory clocks on the Gaming X (for gaming)? I've done +800, but I'm scared it's going to blow up my card.


----------



## trihy

Has anyone tested the differences between the G1 1070's different BIOS versions?

It says the fan duty cycle changed, but for better cooling or for lower noise?


----------



## Swolern

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Are you guys on the default coolers or water? Im wondering will adding an universal gpu block give me some proper gains?


Depends how good the cooler is on the 1070 you get. A single Strix holds my OC to about 58C max temps; SLI goes up to about 64C. You won't see any benefit from watercooling without a voltage unlock, besides maybe 1 or 2 bins of thermal throttling, but that's nothing really. I would just get one with a good cooler.
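The "bins" mentioned here are GPU Boost's clock steps, roughly 13MHz each on Pascal: as the core crosses temperature thresholds, the boost clock steps down a bin at a time. A toy model of that behavior (the threshold temperatures below are illustrative guesses, not NVIDIA-published numbers):

```python
# Toy model of GPU Boost 3.0 thermal bins: Pascal moves the core clock in
# ~13 MHz steps, dropping one step ("bin") each time the core crosses a
# temperature threshold. The thresholds below are illustrative guesses only.

BIN_MHZ = 13
THRESHOLDS_C = [37, 48, 59]  # illustrative, not NVIDIA-published

def boosted_clock(max_boost_mhz: int, temp_c: float) -> int:
    """Estimated boost clock after temperature-based bin drops."""
    bins_lost = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return max_boost_mhz - bins_lost * BIN_MHZ

print(boosted_clock(2088, 35))  # below every threshold: full 2088 MHz
print(boosted_clock(2088, 64))  # three thresholds crossed: 2049 MHz
```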


----------



## Derpinheimer

Quote:


> Originally Posted by *FlatOUT*
> 
> What are the safe oc memory clocks of Gaming X (for playing) ? I`ve done +800 but im scary this gonna blow up my card


It's not gonna do anything...

Just find out what the best value is. I can clock my 1080 at +800 but get the best performance at +575.


----------



## danjal

Quote:


> Originally Posted by *duganator*
> 
> So my amp 1070 is reaching 70 c in games with the fan at 100℅ did I get a faulty card? I read that someone said tightening the screws on the card might help


Mine never gets above 70C running a pretty good overclock, and I run at 80% fan speed. I also have an open-type case, a Phanteks Enthoo Pro M, with two 140 intakes and one 140 exhaust, and keep my house at 73 degrees F.


----------



## danjal

I currently own a Zotac 1070 AMP Edition.

Can I SLI it with another GPU from a different manufacturer, say an EVGA FTW 1070?

All I know is both have two 8-pin connectors. Both are also clocked the same from the factory, 1607.


----------



## Phinix

I'm currently deciding between any single 1080 or 1070 SLI with either two MSI gaming X or two EVGA SC.

Would you guys go with the single 1080 or SLI the 1070?


----------



## danjal

Quote:


> Originally Posted by *Phinix*
> 
> I'm currently deciding between any single 1080 or 1070 SLI with either two MSI gaming X or two EVGA SC.
> 
> Would you guys go with the single 1080 or SLI the 1070?


If I had it to do over again, I would get a 1080, and I would pick one that a full cover water block is available for from either xspc or ekwb or alpha cool.


----------



## danjal

Quote:


> Originally Posted by *Phinix*
> 
> I'm currently deciding between any single 1080 or 1070 SLI with either two MSI gaming X or two EVGA SC.
> 
> Would you guys go with the single 1080 or SLI the 1070?


The bigger issue is availability... not many 1080s to be found.


----------



## Phinix

Quote:


> Originally Posted by *danjal*
> 
> If I had it to do over again, I would get a 1080, and I would pick one that a full cover water block is available for from either xspc or ekwb or alpha cool.


What was your reasoning for saying "do it over again"?


----------



## duganator

Quote:


> Originally Posted by *danjal*
> 
> Mine never gets above 70c and running a decent overclock pretty good, I run at 80% fan speed... I also have an open type case, phantec enthoo pro m and have two 140 intakes and one 140 exhaust.. and keep my house at 73degrees F.


Are you running the AMP card? I'm using a 760T with two 140 intakes and a single 140 exhaust.


----------



## Swolern

Quote:


> Originally Posted by *danjal*
> 
> If I had it to do over again, I would get a 1080, and I would pick one that a full cover water block is available for from either xspc or ekwb or alpha cool.


Quote:


> Originally Posted by *Phinix*
> 
> What was your reasoning for saying "do it over again"?


Upgrade itch. Everyone gets it. Same thing the 1080 owners are saying. Should have got a Titan.


----------



## danjal

Quote:


> Originally Posted by *Phinix*
> 
> What was your reasoning for saying "do it over again"?


I bought a Zotac 1070 AMP Edition last month. It runs fantastic, powers my 1440p monitor great, looks great in my setup, and overclocks well, but these cards need watercooling because they temperature throttle; you need to keep them really cool if you really want to overclock them.

I'm wanting to watercool now, and I sold an old computer I pieced together from scrapped parts that were given to me and made a good profit. Now I have the money for a 1080 or a 1070.

The main thing is I want to build a watercooled setup in my current machine. I currently have the Zotac 1070 AMP Edition, so I can either buy another 1070, either Zotac or EVGA. The Zotac 1070 AMP Edition doesn't have a waterblock available for it yet that I know of, so I can either wait, see if one of the manufacturers would make one for my card in one of the exchange deals they do sometimes, or get a Founders Edition, for which waterblocks are widely available. One 1080 FE would have been a little cheaper than two 1070s, and waterblocks are available for the FE version... but 1070 SLI, where SLI is supported, looks to be very fast; plenty fast to power the 4K monitor I want to upgrade to in the future.


----------



## criminal

Quote:


> Originally Posted by *bigjdubb*
> 
> I was originally going to run an Alphacool HF-14 universal GPU block but since it has taken me three weeks to get my loop back up and running the Bitspower block came out for the MSI gaming cards and I decided to get it. The Alphacool block does fit fine on the cards though.
> 
> I don't expect to see any additional performance gains (as far as clock speed is concerned) other than holding a higher boost speed due to lower temps.


That's exactly what I saw. Same overclock I had on air, just better temps, less noise, and a consistent boost.


----------



## CaptainZombie

Since I was still within the 30-day return policy, I swapped out my EVGA SC, which I paid $449 for, for the EVGA FTW since it went on sale at Microcenter for $429. Hopefully the FTW is a better overall card. I didn't have any issues with the SC, but at $20 cheaper it felt like a no-brainer. I just barely fit the FTW in my 250D.

How have the FTWs been performing and is anyone regretting getting this card?


----------



## CJston15

Grabbed an MSI GTX 1070 Gaming X edition for $429 today at Microcenter.


----------



## ITAngel

Question: which card is superior if you watercool and OC both of them, the ZOTAC GeForce GTX 1070 AMP! Extreme or the MSI GeForce GTX 1070 SEA HAWK X EK?


----------



## ITAngel

Quote:


> Originally Posted by *CJston15*
> 
> Grabbed an MSI GTX 1070 Gaming X edition for $429 today at Microcenter.


Personally I would grab another card and just wait for Alphacool to release the cooler for the Zotac cards.


----------



## danjal

Quote:


> Originally Posted by *duganator*
> 
> are you running the amp card? I'm using a 760t with two 140 intakes and a single 140 exhaust


Yes, the AMP Edition, not the AMP Extreme.


----------



## ITAngel

I like that it has dual 8 pin connectors. I would totally wait for the blocks for them.


----------



## chrcoluk

So, as I guessed, even a single 8-pin power connector card like the Palit comes alive with a Zotac Extreme BIOS. Nice.

I might actually give this a shot now. Can someone be kind enough to refresh me here: which BIOS flasher is needed, and what's the command? Thanks. Also, if possible, a link to the right BIOS.

@Vaesauce


----------



## sammkv

Sucks that people are still getting coil whine after paying $400+ for these GPUs! My Zotac AMP has none.


----------



## ITAngel

Quote:


> Originally Posted by *sammkv*
> 
> Sucks people are still getting coil whine paying $400+ for these gpu's! My Zotac AMP has none


Mine has none either, though mine is the AMP Extreme.


----------



## Curseair

Which 1070 should I get, guys, out of the Zotac AMP Extreme and MSI Gaming X? I was thinking about the EVGA FTW, but that has weird things going on, from twitching fans to bad coil whine.

I have an Air 540 case with a red and black build. I like the look of the Zotac, but it would not look right with everything else black + red unless I could paint the yellow stripes on the backplate red. Is the Zotac card better than the MSI? I don't care about stock clocks/boosts as I'll overclock it myself anyway. Or should I not bother with the hassle of painting the Zotac and just get the MSI? The Zotac is cheaper by 30 dollars where I'm at.


----------



## ITAngel

Quote:


> Originally Posted by *Curseair*
> 
> Which 1070 should I get guys out of the Zotac AMP Extreme and MSI Gaming X, I was thinking about the Evga FTW but that has weird things going on from twitching fans to bad coil whine?
> 
> I have a Air 540 case with a red and black build, I like the look of the Zotac but it would not look right with everything else black + red unless I could paint the yellow stripes on the backplate to red etc, Is the Zotac card better than the MSI, I don't care about stock clocks/boosts as I'll overclock it myself anyway, Or should I not bother with the hassle of painting the Zotac and just get the MSI, The Zotac is cheaper by 30 dollars where i'm at.


Personally, if you go by theme looks and not by clocks, grab the MSI instead. The Zotac is pretty heavy too, so keep in mind you may get some sagging. Both cards are great, and I like the Zotac better; I could change the LED to pretty much any color.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *Swolern*
> 
> Upgrade itch. Everyone gets it. Same thing the 1080 owners are saying. Should have got a Titan.


Any 1070 itching? Just got a cure... just noticed the Gigabyte (GV-N1080G1 GAMING-8GD) 1080 for $649 at NE:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125869

^^limited to 1 per cust.


----------



## FitNerdPilot

Quote:


> Originally Posted by *BulletSponge*
> 
> My guess is there are some left over AMD driver files causing issues. Can he run DDU to remove AMD drivers if the AMD card is not installed? This is only if a fresh install/repair DX try does not work.


So, I installed a new SSD as my boot drive when I installed the new 1070. Can't imagine that the drivers on the old hard drive (which is now just my files drive) would be causing the problem. The SSD was a clean install of windows.


----------



## FitNerdPilot

Quote:


> Originally Posted by *Curseair*
> 
> Which 1070 should I get guys out of the Zotac AMP Extreme and MSI Gaming X, I was thinking about the Evga FTW but that has weird things going on from twitching fans to bad coil whine?
> 
> I have a Air 540 case with a red and black build, I like the look of the Zotac but it would not look right with everything else black + red unless I could paint the yellow stripes on the backplate to red etc, Is the Zotac card better than the MSI, I don't care about stock clocks/boosts as I'll overclock it myself anyway, Or should I not bother with the hassle of painting the Zotac and just get the MSI, The Zotac is cheaper by 30 dollars where i'm at.


I have an extra 1070 G1 Gaming if you're in need. It's a great card. Used it for about a week and then ordered the Xtreme Gaming because it had the extra ports for my setup (3 monitors, 1 Vive, 1 TV in living room to stream what's on Vive)...


----------



## LiquidHaus

Quote:


> Originally Posted by *ITAngel*
> 
> WOW DUDE! THANKS! That is amazing news. please keep us posted if you see it go up on sale or if you find out when.


Quoting you again to notify you that I got an email back from Alphacool claiming the blocks will be available in 6-7 weeks! We just gotta hold out till then lol


----------



## TUFinside

I still don't know which 1070 to get; I have 282mm of GPU clearance.









I would like one from ASUS to match the MoBo. For now only the ASUS 1070 Turbo or DUAL can fit in my case; I was hoping for an SFF card from them, but there is none.


----------



## Prozillah

New RMA'd G1 replacement. Final clock: 2025MHz core, 9.18GHz memory. Meh.


----------



## Amph

Is this portion insulated from electricity? Can I rest the GPU on aluminium at that red place?


----------



## kaudiyo

Quote:


> Originally Posted by *Swolern*
> 
> That Nvidia bridge just kills the look of those cards! U need this one man!!


I finally did my own tweak for the look...


----------



## pez

Quote:


> Originally Posted by *Amph*
> 
> is this portion insulated from electricity? can i put the gpu above alluminium in that red place?


Not quite sure I understand you, but assuming you're talking about the orange design on the cooler, you shouldn't have an issue. From the side profile of the card, the middle fan slightly sticks out of the cooler's shroud. It's hard to explain, but it's not fully 'protected' like the other two fans. Hopefully this explanation does more good than bad







.


----------



## joloxx9

Guys, can somebody share the MSI Gaming Z 1070 BIOS please? I want to put it on my 1070 X version. Thanks.

Sent from my HTC One M9 using Tapatalk


----------



## Swolern

Anyone here running 4K with two of these cards in SLI? If so, how are you liking it? Looking at picking up a 65in 4K OLED TV, so I will need to hold 60fps as much as possible due to no G-Sync. I think Witcher 3 all maxed out is out of the question, but any other game should be OK, I believe.
Quote:


> Originally Posted by *kaudiyo*
> 
> I finally did my own tweak for the look...


That actually looks great! Well done!









What res do you run?


----------



## CaptainZombie

Quote:


> Originally Posted by *CaptainZombie*
> 
> Since I was still within the 30 day return policy I swapped out my EVGA SC which I paid $449 for the EVGA FTW since it went on sale at Microcenter for $429. Hopefully the FTW is a better overall card. I didn't have any issues with the SC, but for $20 cheaper felt like it was a no brainier. I just barely fit the FTW in my 250D.
> 
> How have the FTWs been performing and is anyone regretting getting this card?


I didn't get any responses and was hoping for some additional thoughts. Thanks


----------



## pez

Quote:


> Originally Posted by *Swolern*
> 
> Anyone here running 4K with 2 of these cards in SLI? If so how are you liking it. Looking at picking up a 65in 4K OLED TV, so I will need to hold 60fps as much as possible due to no Gsync. I think Witcher 3 all maxed out is out of the question, but any other game should be ok I believe.
> That's actually looks great! Well done!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What res do you run?


Quote:


> Originally Posted by *CaptainZombie*
> 
> I didn't get any responses and I was hoping for any additional thoughts. Thanks


I did for about a month and had a great experience with 4K. Much more so than 21:9 1440p. I don't have TW3, but most stuff maxed perfectly at 4K with 60+ FPS as long as I disabled AA (which is arguably unnecessary at this res anyway). I think the one exception was Crysis 3. I had to turn a couple more settings down for it.


----------



## Amph

Quote:


> Originally Posted by *pez*
> 
> Not quite sure I understand you, but assuming you're talking about the orange design on the cooler, you shouldn't have an issue. From the side profile of the card, the middle fan slightly sticks out of the coolers shroud. It's hard to explain, but it's not fully 'protected' like the other two fans. Hopefully this explanation does more good than bad
> 
> 
> 
> 
> 
> 
> 
> .


I'm talking about the edge of the PCB; can I touch it without getting shocked?


----------



## pez

As long as there are no solder points or metal-on-metal contact, it technically shouldn't short. I'm also not sure about the portion that contains the SLI fingers. I'd proceed with a high amount of caution.


----------



## Amph

Yeah, also the SLI fingers; I see traces there.


----------



## pez

Yep, if it's going to make contact with that, I'd say avoid it.


----------



## Derpinheimer

Quote:


> Originally Posted by *Prozillah*
> 
> New RMA'd G1 replacement. Final clock - 2025mhz core, 9.18ghz memory. - meh


That's really not so bad. You're losing 1-2% compared to a lottery winner


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *pez*
> 
> as long as there's no solder points or metal on metal, that shouldn't technically short it. I'm also not sure about the portion that contains the SLI fingers. I'd proceed with a high amount of caution.


Yes, agreed. And never touch anything inside a PC when it is on. (I expand the rule: the power cord must be disconnected. I've witnessed power buttons and switches being bumped and turning on a PC, and I've seen PCs almost magically start up for other reasons.) You should only have to wonder "what if" when something was touched by accident. And everyone should know about ESD, and what to do and not do because of it.


----------



## kaudiyo

Quote:


> Originally Posted by *Swolern*
> 
> Anyone here running 4K with 2 of these cards in SLI? If so how are you liking it. Looking at picking up a 65in 4K OLED TV, so I will need to hold 60fps as much as possible due to no Gsync. I think Witcher 3 all maxed out is out of the question, but any other game should be ok I believe.
> That's actually looks great! Well done!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What res do you run?


I'm on 4K and you can play at 60fps with SLI 1070s, no OC needed. I max Witcher 3; you just need to disable antialiasing. There is no difference with it activated: I took screenshots both ON and OFF and couldn't see any. It seems at 4K it has no effect.


----------



## Mad Pistol

Quote:


> Originally Posted by *kaudiyo*
> 
> I'm on 4K and you can play 60fps with SLI 1070, no OC needed. I max Witcher 3, you just need to disable antialiasing, there is no difference with it activated, I took screenshots both ON and OFF and couldn't appreciate it. It seems at 4K it has no effect.


I can actually play Battlefield 4, Battlefront, Overwatch, and a few other titles @ 5160x2160 (ultrawide 4K+), and let me tell you, the 1070s power through it easily. I get mid-80s FPS averages in both Battlefront and Battlefield.

You should have no issue @ 4K on SLI GTX 1070.


----------



## mypickaxe

Quote:


> Originally Posted by *Mad Pistol*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kaudiyo*
> 
> I'm on 4K and you can play 60fps with SLI 1070, no OC needed. I max Witcher 3, you just need to disable antialiasing, there is no difference with it activated, I took screenshots both ON and OFF and couldn't appreciate it. It seems at 4K it has no effect.
> 
> 
> 
> I can actually play Battlefield 4, Battlefront, Overwatch, and a few other titles @ 5160x2160 (Ultrawide 4K+), and let me tell you, 1070's power through it easily. I get mid 80 FPS average for both Battlefront and Battlefield.
> 
> You should have no issue @ 4K on SLI GTX 1070.
Click to expand...

Just picked up a 4K 32" G-Sync to go along with SLI 1070s.

The only real problem games are The Division (just a bit too heavy for 1070s at 4K, though it looks fantastic), those with no SLI support (DOOM, Just Cause 3) or just poorly optimized / too ambitious / old tech / CPU bound. (Mirror's Edge Catalyst, Fallout 4...)

But OMG: Project Cars.

Now for some reason, and I don't know if it's the Windows 10 Anniversary update or Nvidia drivers or the combination of the two, but Codemasters racing titles need SLI disabled now or they crash after launch. Didn't happen before. (DiRT Rally, GRiD Autosport.)

Battlefront looks great too. I didn't notice a big improvement in Overwatch, but it still looks great. Definitely nice on a 32" IPS. So does Rocket League.

Try Forza 6: Apex. No SLI but it looks great and you can get 60 @ 4K with one 1070.


----------



## Mad Pistol

Quote:


> Originally Posted by *mypickaxe*
> 
> Just picked up a 4K 32" G-Sync to go along with SLI 1070s.
> 
> The only real problem games are The Division (just a bit too heavy for 1070s at 4K, though it looks fantastic), those with no SLI support (DOOM, Just Cause 3) or just poorly optimized / too ambitious / old tech / CPU bound. (Mirror's Edge Catalyst, Fallout 4...)
> 
> But OMG: Project Cars.
> 
> Now for some reason, and I don't know if it's the Windows 10 Anniversary update or Nividia drivers or the combination of the two, but Codemasters racing titles need SLI disabled now or they crash after launch. Didn't happen before. DiRT Rally, GRiD Autosport.)
> 
> Battlefront looks great too. I didn't notice a big improvement in Overwatch, but it still looks great. Definitely nice on a 32" IPS. So does Rocket League.
> 
> Try Forza 6: Apex. No SLI but it looks great and you can get 60 @ 4K with one 1070.


Yea, assuming the game is optimized for SLI, GTX 1070 SLI for 3440x1440 or 4K+ is a dream. You do still run into some limitations, but for the most part, it's been a great experience so far.


----------



## LiquidHaus

in terms of playing fallout 4, playing at 3440x1440 is a dream even on one 1070!


----------



## deegzor

http://www.3dmark.com/fs/9592406 WOHOO broke 22k


----------



## mypickaxe

Quote:


> Originally Posted by *lifeisshort117*
> 
> in terms of playing fallout 4, playing at 3440x1440 is a dream even on one 1070!


If by dream you mean, only in your dreams does downtown Boston run at a locked 60 fps...


----------



## danjal

Quote:


> Originally Posted by *mypickaxe*
> 
> Just picked up a 4K 32" G-Sync to go along with SLI 1070s.


What brand and model did you get? How do you like the windows scaling for the text and icons?


----------



## dmasteR




----------



## mypickaxe

Quote:


> Originally Posted by *danjal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mypickaxe*
> 
> Just picked up a 4K 32" G-Sync to go along with SLI 1070s.
> 
> 
> 
> What brand and model did you get? How do you like the windows scaling for the text and icons?
Click to expand...

Acer Predator XB321HK. See the review on Tom's Hardware. The price was salty, but I got it for $150 less than MSRP on Amazon.

The price seems to fluctuate a bit. I got it with Prime, so free shipping. FedEx, two days, no damage. The monitor itself has no dead pixels. Backlight bleed isn't an issue for me, but I suppose if you were super picky, one of the corners may have a tad more glow than the others; it's par for the course with IPS panels. Basically unavoidable, but not an issue if you're not running a black background or 16:10 movies all the time.

As far as scaling, it is fine at 150% (that's how Windows sets it up by default) but I use it at 100% as I prefer the desktop space. I have 20/10 vision so it's no issue for me. I checked it at 200% to simulate 1080p but it's a tad bit on the blurry side for fonts if you have LCD font smoothing enabled. I wouldn't use it at 200% ever so it's not something I'm going to fiddle with.

As far as image quality: it's the best screen I've ever owned outside of a 15" Retina MacBook Pro from 2012.


----------



## danjal

Quote:


> Originally Posted by *mypickaxe*


thank you


----------



## mypickaxe

http://www.tomshardware.com/reviews/acer-predator-xb321hk-uhd-monitor,4681.html


----------



## Prozillah

Quote:


> Originally Posted by *deegzor*
> 
> http://www.3dmark.com/fs/9592406 WOHOO broke 22k


That CPU you got is mental.


----------



## chaous2000

Quote:


> Originally Posted by *dmasteR*


Oh, so it isn't just me going bat**** crazy trying to figure out why anything over +150 on the memory caused artifacts. I thought I was the only one. Thanks for the post!


----------



## Prozillah

Quote:


> Originally Posted by *chaous2000*
> 
> Oh, so it isnt just me going bat**** crazy trying to figure out why anything over +150 on the memory caused artifacts. I was thinking i was the only one. Thanks for the post!


Any kind of memory OC on my 1st G1 Gaming 1070 caused instant snow followed by a hardlock and/or BSOD. Under load and voltage, though, I could take it all the way to 600. Were you getting anything like that?


----------



## chaous2000

Quote:


> Originally Posted by *Prozillah*
> 
> Any kind of memory oc on my 1st g1 gaming 1070 caused instant snow followed by hardlock and/or bsod. Under load and voltage tho I could take it all the way to 600. We're u getting anything like that?


If I go above +100, I get alternating white blocks and a colored block across the whole screen and the computer restarts. If I try to go past this dead zone, there is an immediate lockup when I hit apply. Though, oddly, like you, while under load I can go up to +400 on the mem. I honestly think it's faulty VRAM; I'm going to be RMAing it Monday.


----------



## Prozillah

Quote:


> Originally Posted by *chaous2000*
> 
> If i go above +100, i get alternating white blocks and a colored block across the whole screen and the computer restarts. If i try and go past this dead zone, there is an immediate lock up when i hit apply. Though, odly like you, while under load i can go up to +400 on the mem. I honestly think its faulty vram, im going to be rmaing it monday.


Yup, that's exactly the same as me, and it's not limited to MSI. I was pulling my hair out originally trying to figure out what was going on with it. RMA'd it and got a worse overclocker lol


----------



## chaous2000

Quote:


> Originally Posted by *Prozillah*
> 
> Yup that's exactly the same as me. It's not limited to MSI. I was pulling my hair out originally trying to figure out what was going on with it. Rma'd it and got a worse Overclocker lol


Well dang, I wonder what's going on. If it's across multiple vendors, wouldn't that suggest it's chip-side?
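For what it's worth, hunting for the edge of a dead zone like this by hand is tedious. The trial-and-error described in the last few posts can be sketched as a binary search, assuming you can wrap some pass/fail stability check in a script; the checker below is only a placeholder, and note the sketch assumes stability is monotonic in the offset, which the "memory hole" reports in this thread show isn't always true:

```python
def max_stable_offset(is_stable, lo=0, hi=800, step=25):
    """Binary-search the largest memory offset (MHz) that passes a
    stability check, to within `step` MHz.
    `is_stable(offset)` is assumed to apply the offset, run a stress
    test, and return True/False -- a stand-in, not a real tester."""
    if not is_stable(lo):
        return None  # even the baseline fails: likely a faulty card
    while hi - lo > step:
        mid = (lo + hi) // 2
        if is_stable(mid):
            lo = mid  # passed: stable region extends at least this far
        else:
            hi = mid  # failed: back off
    return lo

# Example with a fake checker for a hypothetical card that fails above +400:
print(max_stable_offset(lambda off: off <= 400))  # -> 400
```

Each probe halves the search range, so a 0 to 800 sweep at 25 MHz granularity needs only about five stress runs instead of thirty.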


----------



## madmeatballs

Quote:


> Originally Posted by *dmasteR*


I hope they finally unlock the 1.09v voltage limit; it may be what is causing it. For some reason I get artifacts with my Zotac GTX 1070 AMP Extreme in SWTOR (only that game; I tried many other games and benches and it works fine). I understand it might just be a driver issue, something else, or the game itself. No point in RMAing it right now; I would have to wait months for new stock. I'd rather wait for new drivers and/or, hopefully, a new BIOS.
Quote:


> Originally Posted by *Prozillah*
> 
> Yup that's exactly the same as me. It's not limited to MSI. I was pulling my hair out originally trying to figure out what was going on with it. Rma'd it and got a worse Overclocker lol


Well, maybe if they allow us more power, these artifacts and whatnot might go away... hoping Nvidia unlocks the 1.09v limit LOL.


----------



## chaous2000

Very good point. For now I guess I'll stay at my "safe" +125 on memory lmao, and hope for the best in the next few months. What makes no sense whatsoever is that even without the voltage increased, the card pulls 1.063v, so why let us increase it at all if they are going to bottom us out like this? Might as well hope for a new BIOS or driver fix to fix all this bull****. Guess this is what we get as early adopters.


----------



## Curseair

I was just about to get an MSI card.. Guess i'll wait.


----------



## Derpinheimer

Quote:


> Originally Posted by *madmeatballs*
> 
> I hope they finally unlock the 1.09v voltage limit, it may be what is causing it. For some reason I get artifacts with my Zotac GTX 1070 AMP Extreme on SWTOR (only that game, I tried many other games and benches it works fine). I understand it might just be a driver issue or a something else, or it could be the game. No point if I RMA it right now, I would have to wait months for new stocks I'd rather wait for new drivers and/or hopefully new bios.
> Well, maybe if they allow us to have more power these artifacts and what not might go away... hoping nvidia unlocks the 1.09v limit LOL.


There's a bios for the 1080 that unlocks voltage. It didn't give me a higher memory OC than stock, so I wouldn't get your hopes up.


----------



## pez

Quote:


> Originally Posted by *mypickaxe*
> 
> If by dream you mean, only in your dreams does downtown Boston run at a locked 60 fps...


Lol, this was literally my first thought. It doesn't matter whether you have one really powerful card or two powerful cards... it's just a crapshoot. I know people criticize Bethesda for the way their games run, but FO3 years ago ran perfectly for me in every part of the game at high+ settings. I didn't even have the top GPU when that game came out, either.


----------



## madmeatballs

Quote:


> Originally Posted by *Derpinheimer*
> 
> There's a bios for the 1080 that unlocks voltage. It didn't give me a higher memory OC than stock, so I wouldn't get your hopes up.


You mean past 1.09v? Well, it could be entirely different though. We'll never know until it comes, "if" it comes. lol


----------



## Derpinheimer

Quote:


> Originally Posted by *madmeatballs*
> 
> You mean past 1.09v? Well, it could be entirely different though. We'll never know until it comes, "if" it comes. lol


Yep, 1.2v (possibly higher with a modified msi afterburner). It's probably coming for the 1070, since we know it's possible now.


----------



## SlvrDragon50

Just got my ASUS Strix. No overclocking until I get a waterblock on it.

Backplate is absolutely gorgeous though.


----------



## Hunched

I think I found the quickest way to find memory instability.
Just open Witcher 3 to the main menu and exit the game and repeat.

Benchmarks like 3DMark lie; they'll tell you your memory is stable when it actually isn't, since they only test at full load.
In reality your crashes are far more likely to happen during idle-load transitions, when things are fluctuating dynamically, like exiting the idle menu screen in Witcher 3 into BANG, full-on 3D render time. That's when my crashes would happen in Witcher 3 after hours of playtime: going from menus to gameplay.

This is my experience with memory clocking 2 completely different 1070's, and it was how my 970 was too.
3DMark, Unigine, all of them will tell you your memory clock is WAY better than Witcher 3, BF4, and the like will.
I've passed 3DMark Time Spy over 10 times at +700mhz without crashing or graphical artifacts... I just had this Witcher 3 "usage spike" test crash me at +340mhz.

Maybe all of our super-high clocks for 3DMark and the like would be stable in Witcher 3, BF4, and so on if we could lock them up there or use a custom BIOS to supply more voltage.
As always, the instability of your clocks is greatest when they are dynamic, fluctuating up and down and stressing their full range, not just at peak usage where things stabilize and lock.


This is with a MSI Gaming 1070 non-X with the 1607mhz X BIOS on it. Last 1070 was a Gigabyte WindForce OC. 970 was a Gigabyte G1.
They all work like this, and a custom BIOS and K-Boost for my 970 G1 greatly improved stability in these areas allowing for a far higher always stable overclock.
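The menu-in/menu-out cycling described above can be automated with any 3D load you can launch from the command line. A minimal sketch, assuming a machine where some benchmark binary is on the PATH; the binary name, cycle counts, and timings below are all placeholders to substitute with your own:

```python
import subprocess
import time

def schedule(cycles, load_s, idle_s):
    """Pure helper: the (phase, seconds) sequence cycle_load follows."""
    return [(phase, dur)
            for _ in range(cycles)
            for phase, dur in (("load", load_s), ("idle", idle_s))]

def cycle_load(cmd, cycles=20, load_s=15, idle_s=10):
    """Alternate short bursts of GPU load with idle periods to stress
    the clock/voltage transitions where memory OCs tend to crash."""
    for phase, dur in schedule(cycles, load_s, idle_s):
        if phase == "load":
            proc = subprocess.Popen(cmd)  # spin up the 3D load
            time.sleep(dur)
            proc.terminate()              # drop back to idle clocks
            proc.wait()
        else:
            time.sleep(dur)               # let the card downclock

# Usage (hypothetical benchmark binary):
# cycle_load(["glmark2"])
```

If the card survives a few dozen transitions without snow or a driver reset, the offset is a much better bet than a clean 3DMark pass alone.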


----------



## Prozillah

Quote:


> Originally Posted by *Hunched*
> 
> I think I found the quickest way to find memory instability.
> Just open Witcher 3 to the main menu and exit the game and repeat.
> 
> Benchmarks like 3DMark lie, they'll tell you your memory is stable when it actually isn't since they only test them in full load.
> When in reality your crashes are far more likely to happen during idle-load transitions or when things are just being dynamic and fluctuating, like exiting the idle menu screen in Witcher 3 into BANG full on 3D game render time. That's when my crashes would happen in Witcher 3 after hours of playtime, from menus to gameplay.
> 
> This is my experience with memory clocking 2 completely different 1070's, and it was how my 970 was too.
> 3DMark, Unigine, all of them will tell you your memory clock is WAY better than Witcher 3, BF4, and the like will.
> I've passed 3DMark Time Spy over 10 times at +700mhz without crashing or graphical artifacts... I just had this Witcher 3 "usage spike" test crash me at +340mhz.
> 
> Maybe all of our super high clocks for 3DMark and the like would be stable in Witcher 3 and BF4 and so on if we could lock them up there or use custom BIOS to supply more voltage.
> As always the instability of your clocks are at their greatest when they are dynamic and fluctuating up and down, stressing their full range, not just peak usage where things stabilize and lock.
> 
> 
> This is with a MSI Gaming 1070 non-X with the 1607mhz X BIOS on it. Last 1070 was a Gigabyte WindForce OC. 970 was a Gigabyte G1.
> They all work like this, and a custom BIOS and K-Boost for my 970 G1 greatly improved stability in these areas allowing for a far higher always stable overclock.


Yes, completely agree. As stated in my previous post, the first G1 1070 I received was awesome when voltage was flowing and consistent, but crashed on boot with any kind of mem OC applied. I could take it all the way to 600+ during Heaven runs, etc. So I RMA'd it for a second card, which loses 80+ on the core but holds stable at 490+ on the mem during all phases.

If I were able to lock the voltage to be consistent, I would have kept the original, as the core clocked much higher.


----------



## Jimbags

Glad I got the FE edition. A lot of people knock it because of the single 8-pin, but I've got 27th in Firestrike Extreme for my CPU (3570K). I'll post some actual numbers tonight, but I'm only using the stock BIOS too. I have the slider in Afterburner at the max 112%; I'll post exact clocks later. I know I've had mem around the 9k mark and core over 2100MHz.


----------



## Dave65

https://www.techpowerup.com/gpuz/details/wm4ay


----------



## chaous2000

Quote:


> Originally Posted by *SlvrDragon50*
> 
> Just got my ASUS Strix. No overclocking until I get a waterblock on it.
> 
> Backplate is absolutely gorgeous though.


There is no reason to wait for a waterblock. The 1070s are locked at 1.09 volts, so your card will never heat up enough to even warrant a waterblock. *Assuming you have proper airflow in your case, that is.*

If I force my fans to 100%, my card never goes above 59c at full tilt.


----------



## Derpinheimer

Quote:


> Originally Posted by *Hunched*
> 
> I think I found the quickest way to find memory instability.
> Just open Witcher 3 to the main menu and exit the game and repeat.
> 
> Benchmarks like 3DMark lie, they'll tell you your memory is stable when it actually isn't since they only test them in full load.
> When in reality your crashes are far more likely to happen during idle-load transitions or when things are just being dynamic and fluctuating, like exiting the idle menu screen in Witcher 3 into BANG full on 3D game render time. That's when my crashes would happen in Witcher 3 after hours of playtime, from menus to gameplay.
> 
> This is my experience with memory clocking 2 completely different 1070's, and it was how my 970 was too.
> 3DMark, Unigine, all of them will tell you your memory clock is WAY better than Witcher 3, BF4, and the like will.
> I've passed 3DMark Time Spy over 10 times at +700mhz without crashing or graphical artifacts... I just had this Witcher 3 "usage spike" test crash me at +340mhz.
> 
> Maybe all of our super high clocks for 3DMark and the like would be stable in Witcher 3 and BF4 and so on if we could lock them up there or use custom BIOS to supply more voltage.
> As always the instability of your clocks are at their greatest when they are dynamic and fluctuating up and down, stressing their full range, not just peak usage where things stabilize and lock.
> 
> 
> This is with a MSI Gaming 1070 non-X with the 1607mhz X BIOS on it. Last 1070 was a Gigabyte WindForce OC. 970 was a Gigabyte G1.
> They all work like this, and a custom BIOS and K-Boost for my 970 G1 greatly improved stability in these areas allowing for a far higher always stable overclock.


I don't have Witcher 3; do you think any other games would work for this test? Seems like a good idea.


----------



## xCamoLegend

Quote:


> Originally Posted by *Derpinheimer*
> 
> I dont have Witcher 3, do you think any other games would work for this test? Seems like a good idea.


Rise of the Tomb Raider is a good one. It has a real-time rendered main menu, and its benchmark seems to reliably show artifacts from unstable OCs.


----------



## Prozillah

Quote:


> Originally Posted by *Derpinheimer*
> 
> I dont have Witcher 3, do you think any other games would work for this test? Seems like a good idea.


anytime I jam bf4 it will find an unstable oc in 5 mins


----------



## saunupe1911

Quote:


> Originally Posted by *SlvrDragon50*
> 
> Just got my ASUS Strix. No overclocking until I get a waterblock on it.
> 
> Backplate is absolutely gorgeous though.


I grabbed one too. I never see it go above 65 degrees, and I don't have the best airflow due to my setup being inside a desk. I'm seeing it hit 2050MHz with no sweat. Nice GPU so far.


----------



## Hunched

I just discovered the current version of MSI Afterburner has some hidden settings. I was looking for a way to use K-Boost, since EVGA has disabled it for everyone but EVGA card owners on Pascal.




Then I was curious how he accessed the voltage/curve area and locked the voltage, core, and mem to their max exactly like K-Boost.
Both still do not override thermal throttling, only a custom BIOS can achieve that.
Found this information/guide
http://www.guru3d.com/files-details/msi-afterburner-beta-download.html
- Added GPU Boost 3.0 technology support for NVIDIA Pascal graphics cards:
- Added percent based overvoltage support
- *Added voltage/frequency curve customization support.* You may use traditional core clock slider on NVIDIA GeForce GTX 1070 and 1080 graphics cards to apply fixed offset to all voltage/frequency curve points as well as use brand new flexible voltage/frequency curve editor window for more precise per-point curve adjustment. *The editor window can be activated with Ctrl + F* keyboard shortcut and it provides you the following features:
- You may independently adjust clock frequency offset for each point with mouse cursor or Up / Down keys
- You may hold Ctrl key to set anchor and fix clock frequency offset in minimum/maximum voltage point and adjust the offset of any other point with mouse to linearly interpolate the offsets between the anchor and adjustment points
- You may hold Shift key while adjusting the offset of any point with mouse to apply the same fixed offset to all points. That's equal to adjusting the offset with the slider in main application window
- You may press Ctrl + D to reset offsets for all points
- You may switch between traditional core clock control slider in the main window and voltage/frequency curve editor window to see how they affect each other in realtime
- *You may press L after selecting any point on the curve with mouse cursor to disable GPU dynamic voltage/frequency adjustment and lock the voltage and core clock frequency to a state defined by the target point. This feature allows you to test graphics card stability independently for each voltage/frequency point of the curve using real 3D applications or any stress test of your choice. In addition to stability testing usage scenario, MSI Afterburner allows you to save a curve with locked point setting to a profile, so you may easily switch between dynamic voltage/frequency management and fixed voltage/frequency settings in realtime (e.g. to achieve the maximum performance during benchmarking). Please take a note that fixed voltage and frequency settings do not allow you to disable power and thermal throttling*









This should help avoid a lot of possible instability in games.
If you research Witcher 3 crashes you will find many of them happen when entering/exiting the inventory or even right at game launch, which are about the only times the memory frequency changes; locking should stop that completely.
It can help with the core clock too; it will still bounce around in ~15MHz steps, but there are no more jumps from idle to 2000MHz+ anymore.
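To make the curve-lock idea above concrete, here is a tiny Python sketch of the two Afterburner controls on a GPU Boost 3.0 voltage/frequency curve: the main slider shifts every point by a fixed offset, while the "L" lock pins the card to a single (voltage, frequency) point. The curve values below are illustrative placeholders, not real GTX 1070 readings.

```python
# Toy model of a GPU Boost 3.0 V/F curve. Each entry is
# (voltage in volts, core clock in MHz) -- made-up example points.
curve = [(0.800, 1607), (0.900, 1785), (1.000, 1936), (1.093, 2050)]

def apply_offset(curve, offset_mhz):
    """The main core-clock slider: shift every curve point by a fixed offset."""
    return [(v, f + offset_mhz) for v, f in curve]

def lock_point(curve, index):
    """The 'L' lock: pin voltage and clock to one chosen point,
    removing the up/down transitions blamed for instability."""
    return curve[index]

shifted = apply_offset(curve, 100)
print(lock_point(shifted, len(shifted) - 1))  # (1.093, 2150)
```

The point of the lock is exactly what the changelog describes: the card no longer ranges over the whole curve, so stability only has to hold at one operating point.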


----------



## DStealth

According to 3DMark, the highest GPU score with a 1070... Don't have a valid key, unfortunately.


----------



## saunupe1911

So I tried a bit of overclocking on my Asus Strix 1070 OC, downloaded Forza Apex, played a few races, and checked my highest clocks after I entered Forza Hub. Dang, I'm ecstatic right now!!!! Over 2100MHz, and look at that max temp with fans running at 50% with my custom profile. Now what can I do to get the maximum out of the memory? So these cards can hit 9000MHz memory, right? Any extra tips!


----------



## Wovermars1996

So I was wondering if somebody could help me out. I'm currently debating on getting an Asus Strix GTX1070 and I was wondering if someone here who has the same card or any GTX 1070 could test out a specific game and settings to see what kind of fps I would get. The Game is Overwatch and these settings.


----------



## saunupe1911

I don't have Overwatch, but check out my previous post. The Strix ain't half bad.


----------



## Hunched

Can someone with Witcher 3 please open and close the game a time or two and let me know if you see anything like this for a millisecond?

It happens most of the time when I launch the game, even with the core and memory underclocked as far as they will go (-400, -502).
They're very fast flickers you almost don't notice, and nothing strange happens afterward.

I don't have another GPU to test with, and I can't remember if this has always happened or not :/
I don't know if it would be bad memory, seeing as +0 or -400 doesn't make it any better or worse.
Identical results with a 400MHz difference in clocks; probably not memory then, right?









Hopefully it's nothing serious, just drivers or my configuration or something, and not a physical defect...
I don't really know what to Google to find this exact issue; no luck so far. If it's widespread, it would show up in videos of the very opening of the game, if I could find any.
I'm thinking nobody really notices or cares, best case scenario

Edit: Found a random YouTube vid with the same **** happening, so I'm good unless by some miracle this guy's memory is dying too












I've just found a few more and same thing


----------



## Curseair

Do the 1070s with more than one 8-pin actually get any benefit? Wondering because I'm thinking about getting the Strix and it has only one 8-pin.


----------



## DStealth

Nope


----------



## Prozillah

Quote:


> Originally Posted by *Hunched*
> 
> I just discovered the current version of MSI Afterburner has some hidden settings, I was looking for a way to use K-Boost since EVGA has disabled it for everyone but EVGA card owners for Pascal.
> 
> 
> 
> 
> Then I was curious how he accessed the voltage/curve area and locked the voltage, core, and mem to their max exactly like K-Boost.
> Both still do not override thermal throttling, only a custom BIOS can achieve that.
> Found this information/guide
> http://www.guru3d.com/files-details/msi-afterburner-beta-download.html
> - Added GPU Boost 3.0 technology support for NVIDIA Pascal graphics cards:
> - Added percent based overvoltage support
> - *Added voltage/frequency curve customization support.* You may use traditional core clock slider on NVIDIA GeForce GTX 1070 and 1080 graphics cards to apply fixed offset to all voltage/frequency curve points as well as use brand new flexible voltage/frequency curve editor window for more precise per-point curve adjustment. *The editor window can be activated with Ctrl + F* keyboard shortcut and it provides you the following features:
> - You may independently adjust clock frequency offset for each point with mouse cursor or Up / Down keys
> - You may hold Ctrl key to set anchor and fix clock frequency offset in minimum/maximum voltage point and adjust the offset of any other point with mouse to linearly interpolate the offsets between the anchor and adjustment points
> - You may hold Shift key while adjusting the offset of any point with mouse to apply the same fixed offset to all points. That's equal to adjusting the offset with the slider in main application window
> - You may press Ctrl + D to reset offsets for all points
> - You may switch between traditional core clock control slider in the main window and voltage/frequency curve editor window to see how they affect each other in realtime
> - *You may press L after selecting any point on the curve with mouse cursor to disable GPU dynamic voltage/frequency adjustment and lock the voltage and core clock frequency to a state defined by the target point. This feature allows you to test graphics card stability independently for each voltage/frequency point of the curve using real 3D applications or any stress test of your choice. In addition to stability testing usage scenario, MSI Afterburner allows you to save a curve with locked point setting to a profile, so you may easily switch between dynamic voltage/frequency management and fixed voltage/frequency settings in realtime (e.g. to achieve the maximum performance during benchmarking). Please take a note that fixed voltage and frequency settings do not allow you to disable power and thermal throttling*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This should help avoid a lot of possible instability in games.
> If you research Witcher 3 crashes you will find many of them happen when entering/exiting the inventory or even right at game launch, which are about the only times the memory frequency changes; locking should stop that completely.
> It can help with the core clock too; it will still bounce around in ~15MHz steps, but there are no more jumps from idle to 2000MHz+ anymore.


This! Solved all my problems thank u!


----------



## Fosion

Quote:


> Originally Posted by *DStealth*
> 
> According to 3DMark, the highest GPU score with a 1070... Don't have a valid key, unfortunately.


Come and get me brah http://www.3dmark.com/fs/9675788


----------



## DStealth

Can't, you have much better core and memory also...
Anyway, selling this card already and jumping on the 1080 wagon... hoping for a better OCing card... fingers crossed


----------



## Fosion

True.
Well, good luck! I will also keep my fingers crossed for you


----------



## madmeatballs

Zotac GTX 1070 AMP Extreme with NZXT's Kraken G10! Max temps were 59-62C prior to installing the G10/X41 combo; now max temps are around 39-42C (on the Heaven bench). Glad it's quieter now lol.


----------



## Jimbags

Just benched my Founders Edition. 13th overall for my CPU. Firestrike Extreme overall score of 8376 points, graphics score of 9651.











Might try heaven next


----------



## pez

Quote:


> Originally Posted by *Wovermars1996*
> 
> So I was wondering if somebody could help me out. I'm currently debating on getting an Asus Strix GTX1070 and I was wondering if someone here who has the same card or any GTX 1070 could test out a specific game and settings to see what kind of fps I would get. The Game is Overwatch and these settings.


Sorry to not be able to give you a direct answer as of yet, but a single 1080 ran OW at 4K for me at like 130FPS maxed out. I believe the lowest dips I saw were in the 90s. Unfortunately, I do not have a 4K monitor anymore.


----------



## saunupe1911

Fellas, what's a good memory overclock? I know the goal for the GPU core was 2100 or more and you've got a winner. But what about the memory? 8500? 9000MHz?


----------



## ITAngel

Quote:


> Originally Posted by *DStealth*
> 
> According to 3DMark, the highest GPU score with a 1070... Don't have a valid key, unfortunately.


I got the same thing, it's annoying. lol


----------



## saunupe1911

Quote:


> Originally Posted by *mickr777*
> 
> After a few hours of oc testing my 2x Asus ROG STRIX-GTX1070-8G-GAMING I got 2012 core/9000 memory was very stable and after 10 mins stress test 56c and 58c temps with fans sitting about 65%


Yeah, what app did you use and what were your voltages?


----------



## criminal

Quote:


> Originally Posted by *Prozillah*
> 
> Anytime I jam BF4, it will find an unstable OC in 5 mins.


Borderlands 2 on a heavy physx map for me.


----------



## Shut3r

Hey. Can somebody upload the BIOS file from their ASUS GTX 1070 Turbo? Is there any news about a BIOS editor for Pascal GPUs?

Sent from my LG-D855 using Tapatalk


----------



## mypickaxe

Quote:


> Originally Posted by *chaous2000*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SlvrDragon50*
> 
> Just got my ASUS Strix. No overclocking until I get a waterblock on it.
> 
> Backplate is absolutely gorgeous though.
> 
> 
> 
> There is no reason to wait for a waterblock. The 1070's are locked at 1.09 volts, so your card will never heat up high enough to even warrant a water block. *assuming you have proper airflow in your case that is.*
> 
> If i force my fans to 100%, my card never goes above 59c at 100% full tilt.
Click to expand...

at 40 degrees, and steps at 50, then 60. So yes, a waterblock is helpful for some users.


----------



## Shut3r

The GPX Pro would be nice; I think I'll buy it so I can use the GameRock BIOS.

Sent from my LG-D855 using Tapatalk


----------



## criminal

Anyone done the shunt mod on their 1070?


----------



## LiquidHaus

Quote:


> Originally Posted by *chaous2000*
> 
> There is no reason to wait for a waterblock. The 1070's are locked at 1.09 volts, so your card will never heat up high enough to even warrant a water block. *assuming you have proper airflow in your case that is.*
> 
> If i force my fans to 100%, my card never goes above 59c at 100% full tilt.


Wrong. BIOS locked at 1.09v; hardware locked at 1.25v. Once modded BIOSes are out, you'll be needing a waterblock.


----------



## bigjdubb

Quote:


> Originally Posted by *criminal*
> 
> Anyone done the shunt mod on their 1070?


That's interesting. I have never used that liquid metal stuff, but it looks easy enough to work with. If BIOS modding doesn't happen, I may have to give this a try.


----------



## LiquidHaus

Quote:


> Originally Posted by *mypickaxe*
> 
> If by dream you mean, only in your dreams does downtown Boston run at a locked 60 fps...


might dip down to 50 or so but nothing to complain that much about.


----------



## wickedout

Quote:


> Originally Posted by *sammkv*
> 
> Sucks people are still getting coil whine paying $400+ for these gpu's! My Zotac AMP has none


I have the Zotac AMP 1070 and love it. It has no coil whine at all. I have it maxed out. Overclocks like a champ. It's super fast for my gaming needs.


----------



## mypickaxe

Quote:


> Originally Posted by *lifeisshort117*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mypickaxe*
> 
> If by dream you mean, only in your dreams does downtown Boston run at a locked 60 fps...
> 
> 
> 
> might dip down to 50 or so but nothing to complain that much about.
Click to expand...

I've seen it dip into the 40s at 1440p and 4K with SLI 1070s (and even the high 30s in the most intense area I've found) with an overclocked 5930K running at 4.5 GHz. Which, to me, means the game is CPU bound and would have benefited greatly by something like DX12 increased draw calls (as others have mentioned elsewhere.) What I'm saying is, the game is demanding in Boston and a locked 60 fps is basically impossible.


----------



## LiquidHaus

Quote:


> Originally Posted by *mypickaxe*
> 
> I've seen it dip into the 40s at 1440p and 4K with SLI 1070s (and even the high 30s in the most intense area I've found) with an overclocked 5930K running at 4.5 GHz. Which, to me, means the game is CPU bound and would have benefited greatly by something like DX12 increased draw calls (as others have mentioned elsewhere.) What I'm saying is, the game is demanding in Boston and a locked 60 fps is basically impossible.


I know what you're saying. I keep up on the Fallout 4 Discussion Thread as well. More people are up in arms complaining about that than anything else there. I'm level 73 in the game and rarely spend enough time downtown to let it ruin my day. And yes, the pattern with Bethesda games is that they are indeed CPU bound. That isn't news. Another common trend with Bethesda is their lack of updates that actually *fix* things - like the problem you're describing.

I don't understand why you tried to negate what I was saying in regards to 3440x1440 being a dream when you have already come to a decision about your issue not being GPU related.


----------



## mypickaxe

Quote:


> Originally Posted by *lifeisshort117*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mypickaxe*
> 
> I've seen it dip into the 40s at 1440p and 4K with SLI 1070s (and even the high 30s in the most intense area I've found) with an overclocked 5930K running at 4.5 GHz. Which, to me, means the game is CPU bound and would have benefited greatly by something like DX12 increased draw calls (as others have mentioned elsewhere.) What I'm saying is, the game is demanding in Boston and a locked 60 fps is basically impossible.
> 
> 
> 
> I know what you're saying. I keep up on the Fallout 4 Discussion Thread as well. More people are up in arms complaining about that than anything else there. I'm level 73 in the game and rarely spend enough time downtown to let it ruin my day. And yes, the pattern with Bethesda games is that they are indeed CPU bound. That isn't news. Another common trend with Bethesda is their lack of updates that actually *fix* things - like the problem you're describing.
> 
> I don't understand why you tried to negate what I was saying in regards to 3440x1440 being a dream when you have already come to a decision about your issue not being GPU related.
Click to expand...

I didn't negate it, I'm saying we both have opinions. Your opinion is not more or less valid. My point was that I don't think it's a dream, and for those who want a locked 60fps, it's not in that game.

p.s. "That isn't news."...wasn't necessary to make your point. Debate tactics are just not necessary on a forum where you are trying to have a congenial conversation.


----------



## Jimbags

Quote:


> Originally Posted by *saunupe1911*
> 
> Fellas, what's a good memory overclock? I know the goal for the GPU core was 2100 or more and you've got a winner. But what about the memory? 8500? 9000MHz?


My FE card with a single 8-pin floats between 2088-2125MHz core at its highest in Firestrike, with 9000MHz memory. I haven't had it crash in a bench yet either. Great card; the Founders Edition is the best looking too, in my opinion anyway.
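For anyone wondering how the "9000MHz" figures relate to an Afterburner offset, here is a rough Python helper. It assumes the GTX 1070's stock GDDR5 command clock of about 2002MHz (8008MT/s effective, since GDDR5 is quad-pumped) and that Afterburner's Pascal memory slider applies its offset to the double-rate clock; tool conventions vary, so treat this as a sketch rather than a spec.

```python
GDDR5_BASE_CMD_CLOCK = 2002  # MHz, approximate GTX 1070 stock memory command clock

def effective_rate(afterburner_offset_mhz):
    """Convert an assumed Afterburner memory offset to the quad-pumped
    effective rate that reviews quote as 'memory MHz'. The slider is
    assumed to act on the DDR clock (2x command clock), so its offset
    counts double toward the effective rate."""
    ddr_clock = GDDR5_BASE_CMD_CLOCK * 2 + afterburner_offset_mhz
    return ddr_clock * 2  # effective MT/s, reported as "MHz" by most tools

print(effective_rate(0))    # 8008 (stock)
print(effective_rate(500))  # 9008, i.e. the "9000MHz" target
```

So an offset around +500 is what gets a stock card to the 9000MHz-class figures people quote in this thread.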


----------



## SlvrDragon50

Quote:


> Originally Posted by *saunupe1911*
> 
> I grabbed one too. I never see it go above 65 degrees, and I don't have the best airflow due to my setup being inside a desk. I'm seeing it hit 2050MHz with no sweat. Nice GPU so far.


Yea, I was reading some bad reviews about it, but I've been very happy with mine. My computer is water cooled with no fresh intake, and the GPU stays around 65 while playing Overwatch.


----------



## gtbtk

Quote:


> Originally Posted by *chaous2000*
> 
> Oh, so it isnt just me going bat**** crazy trying to figure out why anything over +150 on the memory caused artifacts. I was thinking i was the only one. Thanks for the post!


Do you have a Samsung-memory or a Micron-memory 1070?

I have the Micron-memory Gaming X, and the memory overclocks to +520 with no artifacts in Firestrike. It will start artifacting if you push to +530 though.

Won't run Time Spy at that rate though; I need to cut it back to about +450.


----------






## saunupe1911

Quote:


> Originally Posted by *SlvrDragon50*
> 
> Yea, I was reading some bad reviews about it, but I've been very happy with mine. My computer is water cooled with no fresh intake, and the GPU stays around 65 while playing Overwatch.


I've spent the past day or so reading comments. Everyone has had pretty good success with the O8G-Gaming version. It's the 8G-Gaming version that's giving people problems with overclocking, which IMO should have been expected. Why would ASUS produce both cards if one wasn't truly better than the other? That extra $20 is worth it if you want to try for 2100MHz or more. I was playing Forza Apex at 1080p Ultra settings at around 2100MHz; tonight I'm going to see how long it can sustain that clock rate. I've also got to stabilize my memory clocks over 9000MHz and I'm done. My max temp was 54C in a 74F home. Crazy! I'm X61 water cooled with 2 140mm side fans and 2 120mm front fans. Now I'm trying to order another extremely thin bottom fan that I would connect to the Strix fan PWM to hopefully get it below 50C with these overclock settings. Also, the normal Asus OC setting is good enough for most games; this clock rate I'm shooting for is for the future, when there is a game that stresses it out. And at this point the O8G-Gaming is sold out on Newegg. Amazon is the last hope while they last; they just got their shipment last week!


----------



## Prozillah

Question -

I got my original G1 Gaming 1070 RMA'd as I couldn't apply any amount of mem OC without it artifacting, followed by a hardlock or BSOD. Under load, however, it was awesome and could safely pass Firestrike etc. at a +600 mem OC. Core sits at 2114MHz consistently.

Due to knowing the guys at the shop, they provided me a new replacement G1 before I had to return the other....

Once the 2nd turned up it didn't quite perform as well (16023 Firestrike score vs 16311 with the 1st card) but was stable in that OC'd memory range up to +530. The core struggles to maintain 2050 and often drops to 2025 etc., BUT it does hold the memory clock at factory settings.

Since using the Afterburner voltage/curve & lock method mentioned in this thread I was able to lock the 1st card, which performs better in all tests. It is obvious this card's power delivery on the current BIOS is bottoming out, causing the memory to freak out and lock up the card, but with the voltage lock it's great.

Would you keep the 2nd card, which doesn't perform as well but is known to perform at stock settings,

OR

keep the 1st card, which performs better, and just keep the voltage locked to ensure no hardlocks or BSODs?

that is the question....


----------



## chaous2000

Quote:


> Originally Posted by *lifeisshort117*
> 
> Wrong. BIOS locked at 1.09v; hardware locked at 1.25v. Once modded BIOSes are out, you'll be needing a waterblock.


Sorry, I meant that the BIOS was locked, and my post was directed at the lock. Once (if ever) they are unlocked, I can only really see a waterblock being needed above 1.17v.


----------



## chaous2000

Quote:


> Originally Posted by *gtbtk*
> 
> Do you have a Samsung memory card or a micron memory 1070?
> 
> I have the micron memory gaming X and memory overclocks to +520 with no artifacts in firestrike It will start artifacting like if you push to +530 though.
> 
> Wont run in time spy at that rate though, I need to cut it back to about +450


I honestly don't know, I'll have to look once I'm home. I thought all the 1070s came with the high-speed Samsung GDDR5?


----------



## wickedout

Which 3DMark test are some of you using for benchmarking? Just wondering. Thanks.


----------



## chaous2000

Quote:


> Originally Posted by *wickedout*
> 
> Which 3DMark test are some of you using for benchmarking? Just wondering. Thanks.


There are a few: for 1080p use the regular Firestrike, for 1440p use Firestrike Extreme, and for 4K use Firestrike Ultra. There's also the Heaven benchmark as well.


----------



## deegzor

Quote:


> Originally Posted by *criminal*
> 
> Anyone done the shunt mod on their 1070?


Done did it







Gets rid of throttling for me, using water btw.
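For context on why the shunt mod stops power throttling: the card estimates current from the voltage drop across a shunt resistor while assuming its nominal resistance, so lowering the effective resistance (e.g. by bridging the shunt with liquid metal) makes the card under-read its own power draw. A minimal sketch of that arithmetic follows; the 5 mOhm value and the halving are illustrative assumptions, not measurements from any 1070 PCB.

```python
def reported_power(true_power_w, r_nominal_mohm, r_effective_mohm):
    """The controller measures voltage across the shunt and divides by the
    NOMINAL resistance to get current, so if the effective resistance is
    lower, the reported current (and hence power) shrinks proportionally."""
    return true_power_w * (r_effective_mohm / r_nominal_mohm)

# Illustrative: liquid metal roughly halving a 5 mOhm shunt's resistance
# makes a true 180 W draw look like 90 W to the power limiter.
print(reported_power(180.0, 5.0, 2.5))  # 90.0
```

That under-reporting is why the card stops hitting its power limit; the real draw (and heat) is unchanged, which is why deegzor pairs it with water cooling.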


----------



## supermi

Quote:


> Originally Posted by *deegzor*
> 
> Done did it
> 
> 
> 
> 
> 
> 
> 
> Gets rid of throttling for me, using water btw.


Can we do a similar thing on the G1 PCB (or any non-FE PCB, for that matter)?

And AWESOME RESULTS!! NO THROTTLE!!!


----------



## Prozillah

Quote:


> Originally Posted by *supermi*
> 
> Can we do a similar thing on the G1 PCB (or any non-FE PCB, for that matter)?
> 
> And AWESOME RESULTS!! NO THROTTLE!!!


I would say it would be the same. If someone who has pulled the backplate and heatsink off is able to upload a snap, that would be primo; we should be able to tell from there.


----------



## SlvrDragon50

Quote:


> Originally Posted by *saunupe1911*
> 
> I've spent the past day or so reading comments. Everyone has had pretty good success with the O8G-Gaming version. It's the 8G-Gaming version that's giving people problems with overclocking, which IMO should have been expected. Why would ASUS produce both cards if one wasn't truly better than the other? That extra $20 is worth it if you want to try for 2100MHz or more. I was playing Forza Apex at 1080p Ultra settings at around 2100MHz; tonight I'm going to see how long it can sustain that clock rate. I've also got to stabilize my memory clocks over 9000MHz and I'm done. My max temp was 54C in a 74F home. Crazy! I'm X61 water cooled with 2 140mm side fans and 2 120mm front fans. Now I'm trying to order another extremely thin bottom fan that I would connect to the Strix fan PWM to hopefully get it below 50C with these overclock settings. Also, the normal Asus OC setting is good enough for most games; this clock rate I'm shooting for is for the future, when there is a game that stresses it out. And at this point the O8G-Gaming is sold out on Newegg. Amazon is the last hope while they last; they just got their shipment last week!


Yea I got my 1070 STRIX 8G for around 330 so I'm not gonna complain







None of my games really need a hard overclock, but it's always fun seeing how lucky you get with your card!


----------



## chaous2000

Alright, looks like the factories in China have been using Micron, not the high-speed Samsung, RAM. Over on the Nvidia reddit page, people whose cards were made in China, not Taiwan, have been having a lot of artifact issues. Explains why only the Chinese MSI factory issued a recall.


----------



## supermi

Quote:


> Originally Posted by *Prozillah*
> 
> I would say it would be the same. If someone who has pulled the backplate and heatsink off is able to upload a snap, that would be primo; we should be able to tell from there.


I will do!


----------



## saunupe1911

Quote:


> Originally Posted by *SlvrDragon50*
> 
> Yea I got my 1070 STRIX 8G for around 330 so I'm not gonna complain
> 
> 
> 
> 
> 
> 
> 
> None of my games really need a hard overclock, but it's always fun seeing how lucky you get with your card!


Did some more overclocking. Trying to set up a few profiles, such as a max and a milder overclock, kinda like future 4K and 2K overclocks. So far I can get a little over 2100MHz GPU clock with a 9216MHz memory clock, but I had to raise GPU voltage by +50. I really don't want to raise voltage at all, and I have to keep memory at 9000MHz if I don't want to. A profile with 9000MHz is probably what I will rock, with a little gaming at 1080p Ultra settings.


----------



## chaous2000

Quote:


> Originally Posted by *saunupe1911*
> 
> Did some more overclocking. Trying to set up a few profiles, such as a max and a milder overclock, kinda like future 4K and 2K overclocks. So far I can get a little over 2100MHz GPU clock with a 9216MHz memory clock, but I had to raise GPU voltage by +50. I really don't want to raise voltage at all, and I have to keep memory at 9000MHz if I don't want to. A profile with 9000MHz is probably what I will rock, with a little gaming at 1080p Ultra settings.


Well, since the cards are BIOS locked to 1.09v, it doesn't matter if you set it to 100%; it will never cross 1.09, for now.


----------



## saunupe1911

Quote:


> Originally Posted by *chaous2000*
> 
> Well, since the cards are BIOS locked to 1.09v, it doesn't matter if you set it to 100%; it will never cross 1.09, for now.


Oh I see, good info. I don't want to burn it up lmao


----------



## DStealth

Quote:


> Originally Posted by *saunupe1911*
> 
> Oh I see, good info. I don't want to burn it up lmao


The voltage increase from 0 to +100% is negligible, actually two straps, which in my case translates from 1.063v to 1.09v, a 0.027v difference.







You're not going to burn anything, nor will your scores rise, due to this "huge" overvoltage.


----------



## tps3443

Hey everyone, am I now officially in the club? I just sold the AMD RX480 that I purchased on the forums here, for a little extra horsepower.

My new Nvidia EVGA GTX 1070 SC! Picked her up for very cheap, sealed in the package. I just registered it over at the EVGA site, and now I am playing with some overclocking.

Absolutely silent video card! It is quite massive too, a good bit larger than the RX480 8GB I just sold off today.

Anyways, it is quite amazing that this thing can handle 4K at reasonable FPS, once overclocked to the roof that is. But it's impressive. A lot of power for $400. I love it!

http://s1371.photobucket.com/user/tps3443/media/KIMG0538_zpsb3g85jwy.jpg.html

http://s1371.photobucket.com/user/tps3443/media/KIMG0541_zpsgk8fwjd8.jpg.html


----------



## SlvrDragon50

Quote:


> Originally Posted by *saunupe1911*
> 
> Did some more overclocking. Trying to set up a few profiles, such as a max and a milder overclock, kinda like future 4K and 2K overclocks. So far I can get a little over 2100MHz GPU clock with a 9216MHz memory clock, but I had to raise GPU voltage by +50. I really don't want to raise voltage at all, and I have to keep memory at 9000MHz if I don't want to. A profile with 9000MHz is probably what I will rock, with a little gaming at 1080p Ultra settings.


That sounds fantastic! Hopefully my card is just as good.


----------



## supermi

Quote:


> Originally Posted by *Prozillah*
> 
> I would say it would be the same - if someone has pulled the back plate and heatsink off who is able to upload a snap that would be primo - should be able to tell from there


Does this pic work? 1070 WindForce.


----------



## Amph

I overclocked the mem to +1000, but it resulted in artifacts and the computer froze. Can this have damaged the GPU?


----------



## madmeatballs

Quote:


> Originally Posted by *Prozillah*
> 
> I would say it would be the same - if someone has pulled the back plate and heatsink off who is able to upload a snap that would be primo - should be able to tell from there


This too, from the AMP Extreme
(don't mind the circle)


----------



## chaous2000

Quote:


> Originally Posted by *Amph*
> 
> I overclocked the mem to +1000, but it resulted in artifacts and the computer froze. Can this have damaged the GPU?


Possibly. You never start high and work down; you always start low and work your way up on the clock. If you enabled apply-at-Windows-startup in MSI Afterburner, or any OC app, you will need to boot into safe mode and uninstall the OC app, or you will crash every time you try to load into Windows, due to the OC app applying the +1000MHz boost.
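The "start low and work your way up" routine can be sketched as a simple loop. `is_stable` here is a hypothetical hook standing in for whatever manual or scripted stress test you use (a benchmark run plus an artifact check); there is no standard API for this, so the harness is yours to supply.

```python
def find_max_stable_offset(is_stable, start=0, step=25, limit=300):
    """Step the clock offset up from a conservative start until the
    stress test fails, then return the last passing value (or None
    if even the starting offset fails)."""
    best = None
    offset = start
    while offset <= limit:
        if not is_stable(offset):
            break  # first failure: stop before pushing further
        best = offset
        offset += step
    return best

# Toy harness: pretend the card starts artifacting above +150.
print(find_max_stable_offset(lambda off: off <= 150))  # 150
```

In practice people then back off one step from the found maximum for daily use, since a value that barely passes a benchmark may still fail in games.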


----------



## NCSUZoSo




----------



## chaous2000

Quote:


> Originally Posted by *NCSUZoSo*


Is there an adapter that makes this work? This might be the easiest way for me to get water cooling on my GPU.


----------



## Amph

Quote:


> Originally Posted by *chaous2000*
> 
> Possibly. You never start high and work down; you always start low and work your way up on the clock. If you enabled apply-at-Windows-startup in MSI Afterburner, or any OC app, you will need to boot into safe mode and uninstall the OC app, or you will crash every time you try to load into Windows, due to the OC app applying the +1000MHz boost.


I restored to default; no artifacts for now.


----------



## criminal

Quote:


> Originally Posted by *deegzor*
> 
> Done did it
> 
> 
> 
> 
> 
> 
> 
> Gets rid of throttling for me, using water btw.


Did you use Thermal Grizzly Conductonaut or something else?


----------



## Darksides327

Oh guys, help please.
I think I fried my GPU. I slid the voltage to max with +140 core and +600 memory, then played CS:GO demolition and arms race and opened BF4. When I opened it: black screen. Restarted, no display, but the Skype sound is there. I cleared the jumper and CMOS battery and now the PC won't boot. If this is fixable I will never overclock again..


----------



## chaous2000

Quote:


> Originally Posted by *Darksides327*
> 
> Oh guys, help please.
> I think I fried my GPU. I slid the voltage to max with +140 core and +600 memory, then played CS:GO demolition and arms race and opened BF4. When I opened it: black screen. Restarted, no display, but the Skype sound is there. I cleared the jumper and CMOS battery and now the PC won't boot. If this is fixable I will never overclock again..


First, I wouldn't let this deter you from overclocking. As I stated before, you work your way up in clock speed; you never start out high. As for the sliders, you can max out power and voltage, but the card will only pull what it needs. Shut the computer off, restart into safe mode, and uninstall any overclocking software you have installed. The reason for this is that the OC settings are applied on boot, and not stored hardware-side on the card itself, so this will prevent the card from having any OC applied to it. Once that is done, run a benchmark of your choice to make sure the card is still stable. After that, read up on some overclocking guides. While what happened can make you **** your pants the first time, it's a lesson in not doing something rash with an expensive piece of electronic equipment.
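The "start low, work your way up" approach described above can be sketched as a simple loop. This is only an illustration: `test_stable` is a placeholder for whatever you actually do at each step (apply the offset in Afterburner, run Heaven or Firestrike, watch for crashes and artifacts); nothing here talks to real hardware.

```python
def find_max_stable_offset(test_stable, start=0, step=25, limit=300):
    """Step the core offset up in small increments and return the last
    offset that passed the stability test, or None if even the start
    offset failed. test_stable(offset_mhz) is a placeholder callback
    that should apply the offset, run a benchmark, and return True if
    no crashes or artifacts were seen."""
    best = None
    offset = start
    while offset <= limit:
        if not test_stable(offset):
            break  # first failure: stop and keep the last good value
        best = offset
        offset += step
    return best

# Example with a pretend card that artifacts above +150 MHz:
print(find_max_stable_offset(lambda mhz: mhz <= 150))  # -> 150
```

The point of the small `step` is exactly the advice in the post: the first failing run tells you where the ceiling is without ever jumping far past it.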


----------



## Darksides327

I cannot power up the PC, mate. I posted on reddit and they say the GPU is broken?
Also, I cannot power up the PC after clearing the CMOS (jumper too), and I unplugged the GPU and plugged it back in. Is the card dead?


----------



## chaous2000

Quote:


> Originally Posted by *Darksides327*
> 
> I cannot power up the PC, mate. I posted on reddit and they say the GPU is broken?
> Also, I cannot power up the PC after clearing the CMOS (jumper too), and I unplugged the GPU and plugged it back in. Is the card dead?


Possibly; the only way to know is to put it in another computer and try to boot. As a side note, what brand and model of PSU do you have?


----------



## Darksides327

Gigabyte G750H modular gold


----------



## Swolern

Quote:


> Originally Posted by *Darksides327*
> 
> I cannot power up the PC, mate. I posted on reddit and they say the GPU is broken?
> Also, I cannot power up the PC after clearing the CMOS (jumper too), and I unplugged the GPU and plugged it back in. Is the card dead?


Highly doubt you fried your card by just overclocking. You just have to keep troubleshooting. Just try to get into bios. Here are some more troubleshooting steps. http://www.tomshardware.com/forum/261145-31-perform-steps-posting-post-boot-video-problems


----------



## Darksides327

Quote:


> Originally Posted by *Swolern*
> 
> Highly doubt you fried your card by just overclocking. You just have to keep troubleshooting. Just try to get into bios. Here are some more troubleshooting steps. http://www.tomshardware.com/forum/261145-31-perform-steps-posting-post-boot-video-problems


Gigabyte G750H modular gold
Just to mention, I did not touch the card BIOS. Just MSI AB.


----------



## Swolern

Quote:


> Originally Posted by *Darksides327*
> 
> Gigabyte G750H modular gold
> Just to mention, I did not touch the card BIOS. Just MSI AB.


Do the checklist. First thing is to get the PC to post and into the motherboard bios.


----------



## chaous2000

Quote:


> Originally Posted by *Swolern*
> 
> Highly doubt you fried your card by just overclocking. You just have to keep troubleshooting. Just try to get into bios. Here are some more troubleshooting steps. http://www.tomshardware.com/forum/261145-31-perform-steps-posting-post-boot-video-problems


I've seen it happen before, it's rare but does happen. Like this guy said though, go through the checklist and report back. We will help with what we can.


----------



## frikadellenkind

Hi,

got my 1070 (Palit Dual) a week ago. I figured out it can run pretty fast with a custom curve (Afterburner), up to 2177 MHz. Of course I had to raise the voltage. Before I try to go higher I'd like to know if it is safe to raise the voltage? I assume it is, otherwise it would not be possible without changing the BIOS.


----------



## gtbtk

Quote:


> Originally Posted by *chaous2000*
> 
> I honestly don't know, I'll have to look once I'm home. I thought all the 1070s came with the high-speed Samsung GDDR5?


They all come with 8GB of GDDR5 RAM that is specced at 8 Gbps, but some have Samsung and some have Micron memory installed. The Micron memory cards are also using a different BIOS version, 86.04.26.xx.
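As a quick back-of-the-envelope check on that 8 Gbps figure: combined with the 1070's 256-bit memory bus it works out to the card's advertised 256 GB/s of bandwidth, and the same arithmetic shows what a memory overclock buys you. A minimal helper (the function name and defaults are mine, just for illustration):

```python
def bandwidth_gbps(data_rate_gbps, bus_width_bits=256):
    """Memory bandwidth in GB/s: per-pin data rate times bus width,
    divided by 8 bits per byte. Defaults to the GTX 1070's 256-bit bus."""
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gbps(8.0))  # stock 8 Gbps GDDR5 -> 256.0 GB/s
print(bandwidth_gbps(9.2))  # ~9.2 Gbps OC      -> 294.4 GB/s
```

So a memory OC to around 9.2 Gbps (the "9200 MHz" people report) is roughly a 15% bandwidth bump over stock.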


----------



## DStealth

Quote:


> Originally Posted by *Darksides327*
> 
> ...If this is fixable i will never do overclocking again..


Lol you got the wrong forum








May be a faulty card or memory; try another PSU connector and/or MB slot, and if that doesn't fix it, RMA it. It's not related to the OC.








Can you boot into the BIOS, or is it only Windows-related?


----------



## gtbtk

Quote:


> Originally Posted by *Prozillah*
> 
> Yes completely agree - as stated in my previous post - my 1st g1 1070 I received was awesome when voltage was flowing and consistent but crashed on boot up with any kind of mem OC applied. Could take it all the way to 600+ during heaven runs etc. So RMA'd it for a 2nd card which loses 80+ on the core but holds stable at 490+ on the mem during all phases.
> 
> If I was able to lock the voltage to be consistent I would keep the original as the core clocked much higher.


You can lock the voltage with Afterburner on Pascal cards.

You need to open the voltage/frequency graph with Ctrl+F, then select the voltage point that you want to lock and press the "L" key; it will lock the voltage where you selected when the card is under load.


----------



## saunupe1911

Quote:


> Originally Posted by *frikadellenkind*
> 
> Hi,
> 
> got my 1070 (Palit Dual) a week ago. I figured out it can run pretty fast with a custom curve (Afterburner), up to 2177 MHz. Of course I had to raise the voltage. Before I try to go higher I'd like to know if it is safe to raise the voltage? I assume it is, otherwise it would not be possible without changing the BIOS.


We might need a sticky lmao, because I asked this question a few posts back. These GPU voltages are locked to 1.09 volts; it won't go any higher than that. And I proved it last night playing Forza Apex GPU-scaled to 4k: I turned the voltage slider allllllllllll the way up to 100% and power to 100%, and it maxed out at 1.09 volts and 54 C in a 75 F room. Also, I've noticed the GPU will just simply boost on its own; my Strix will just sit at 2114 MHz at 1.09 volts.

Now memory is where things get tricky with the Strix OC. This is what causes it to crash. My Samsung memory is rock solid at around 9200 MHz. I haven't tried anything higher, possibly this weekend. I'm anxious to see what these things can do with unlocked voltages; it might border a stock 1080 for sure. I wouldn't even try it if you bought a card with 1 or 2 fans though. I gotta hand it to Asus, those Strix fans get the job done. You water-cooled guys will have an overclocking monster when we get an unlocked BIOS.


----------



## criminal

Quote:


> Originally Posted by *criminal*
> 
> Anyone done the shunt mod on their 1070?


Quoting myself for context. Ordered some Thermal Grizzly Conductonaut so I can try this shunt mod. I will report back with how it goes once complete.


----------



## Kamikaze-X

Hi guys, I have an MSI 1070 FE under water on an EK-FC block and backplate and i've been dabbling with overclocking.

When people are reporting their firestrike scores, what score are you reporting? the whole combined test score, or just the graphics score?


----------



## amd7674

Hey Guys,









I don't know if this deserves its own thread, MODS if you think it is please let me know I will create one.

I live in Canada and I'm planning to upgrade my Asus GTX670 Non-TOP to GTX1070 in the next month or so. (a little strapped on $$$ now). Also I want to wait for 1070 to be in stock and major issues ironed out.

My current rig has aging CM HAF 922 case, Corsair TX750w PSU, 3570k @ 4.5Ghz, MSI Z77 mobo, 16Gb Sammy 1600 RAM and 32" LG 1080P IPS display (4:4:4) which runs at 75Hz.
I'm not planning on changing the display anytime soon, and to some a GTX 1070 might be overkill at 1080p @ 75Hz, but this will be my last upgrade for the current rig for the next 3-4 years.
After that I will have to do a full build (CPU, mobo, GPU, etc.)…

Also I'm hoping 3570k @ 4.5Ghz won't bottleneck GTX 1070 too much.

I did measure my case: I have about 33-34 cm (13") of clearance for the GPU, so I think I can pretty much fit any of these cards in my HAF 922 case.

I still do not have a clear winner…

MSI Gaming X:
I was leaning toward MSI gaming X card (to match mobo) but with the recent Micron/Sammy vram lottery I don't want to take any chances. Recent MSI statement about their 1070 issues is a little scary.

EVGA FTW:
There are reports of coil whine issues on EVGA cards; not sure if my local store would allow me to swap the card if I got one with the issue. Some peeps say it is related to the PSU, and I'm not sure if my Corsair TX750W is good enough?

Zotac AMP Extreme:
Big mother, which should fit in my case. If someone is kind enough to give me measurements in mm I would really appreciate it.
It's got a wonderful HSF solution, but I HATE its memory pads; for a $500 card I would expect a much better solution. I always find heat pads on VRAM fragile when dealing with them, i.e. if I wanted to change the thermal paste on the GPU.
I don't think I would bother, though, so it might be irrelevant. However, from my understanding, both MSI and EVGA offer quieter fans.

Asus Strix Overclock Gaming:
Big, great HSF. However it has heat pads on vram. Not as quiet as MSI/EVGA solutions.

Gigabyte Windforce:
Or should I just go for the cheapest solution?… It's about $100 CDN cheaper than all of the above.

Any help comments/tips would be much appreciated.


----------



## madmeatballs

Well, I had the same problem too. I figured I'd get the one with the longest warranty, and one that allows you to take off the heatsink to put a third party cooler. I went with Zotac AMP! Extreme.

Also, the AMP Extreme has its own cooling for the VRMs, so if you take off the big heatsink the VRM cooling doesn't come off with it.


----------



## amd7674

Quote:


> Originally Posted by *madmeatballs*
> 
> Well, I had the same problem too. I figured I'd get the one with the longest warranty, and one that allows you to take off the heatsink to put a third party cooler. I went with Zotac AMP! Extreme.
> 
> Also the amp extreme has its own cooling for VRMs so if you take out the big heatsink it doesn't go off with it.


At least in Canada I only get 3 years (as an extended) in total warranty on Zotac GPUs. I get the same from Asus, Gigabyte, MSI and EVGA.

From Zotac's USA Warranty page:

Product Type Graphics Cards
Standard Warranty 2-year
Extended Warranty 3-year total


----------



## tps3443

I highly doubt you broke that video card. Man, if people only knew how many hard locks I go through daily from overclocking. And I just restart and try again.


----------



## saunupe1911

Quote:


> Originally Posted by *amd7674*
> 
> Hey Guys,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't know if this deserves its own thread, MODS if you think it is please let me know I will create one.
> 
> I live in Canada and I'm planning to upgrade my Asus GTX670 Non-TOP to GTX1070 in the next month or so. (a little strapped on $$$ now). Also I want to wait for 1070 to be in stock and major issues ironed out.
> 
> My current rig has aging CM HAF 922 case, Corsair TX750w PSU, 3570k @ 4.5Ghz, MSI Z77 mobo, 16Gb Sammy 1600 RAM and 32" LG 1080P IPS display (4:4:4) which runs at 75Hz.
> I'm not planning on changing the display anytime soon, and to some a GTX 1070 might be overkill at 1080p @ 75Hz, but this will be my last upgrade for the current rig for the next 3-4 years.
> After that I will have to do a full build (CPU, mobo, GPU, etc.)…
> 
> Also I'm hoping 3570k @ 4.5Ghz won't bottleneck GTX 1070 too much.
> 
> I did measure my case: I have about 33-34 cm (13") of clearance for the GPU, so I think I can pretty much fit any of these cards in my HAF 922 case.
> 
> I still do not have a clear winner…
> 
> MSI Gaming X:
> I was leaning toward MSI gaming X card (to match mobo) but with the recent Micron/Sammy vram lottery I don't want to take any chances. Recent MSI statement about their 1070 issues is a little scary.
> 
> EVGA FTW:
> There are reports of coil whine issues on EVGA cards, not sure if my local store would allow me to swap the card if I got one with the issue. Some peeps says it is related to PSU, and I'm not sure if my Corsair TX750w is good enough?
> 
> Zotac AMP Extreme:
> Big mother, which should fit in my case. If someone is kind enough to give me measurements in mm I would really appreciate it.
> It's got a wonderful HSF solution, but I HATE its memory pads; for a $500 card I would expect a much better solution. I always find heat pads on VRAM fragile when dealing with them, i.e. if I wanted to change the thermal paste on the GPU.
> I don't think I would bother, though, so it might be irrelevant. However, from my understanding, both MSI and EVGA offer quieter fans.
> 
> Asus Strix Overclock Gaming:
> Big, great HSF. However it has heat pads on vram. Not as quiet as MSI/EVGA solutions.
> 
> Gigabyte Windforce:
> Or should I just go for the cheapest solution?… It's about $100 CDN cheaper than all of the above.
> 
> Any help comments/tips would be much appreciated.


Asus Strix Overclock Gaming fans won't even turn on until about 50 C in all modes, and you will only see those temps when the card is running 1950 MHz or higher. I can barely even hear the fans until they are running over 60%. Plus you get an extra HDMI, fan management, and the Samsung memory. My only gripe is its size.

If I were you I would go with the aftermarket vendor that you believe has the best cooling solution; 3 fans minimum IMO. Temps are everything with these cards to keep your clocks high.


----------



## madmeatballs

Quote:


> Originally Posted by *amd7674*
> 
> At least in Canada I only get 3 years (as an extended) in total warranty on Zotac GPUs. I get the same from Asus, Gigabyte, MSI and EVGA.
> 
> From Zotac's USA Warranty page:
> 
> Product Type Graphics Cards
> Standard Warranty 2-year
> Extended Warranty 3-year total


Once you buy it, it has a 2-year warranty; after registering it on their website you get an additional 3 years, so 5 years in total. I live in the Philippines by the way.


----------



## bigjdubb

Quote:


> Originally Posted by *criminal*
> 
> Quoting myself for context. Ordered some Thermal Grizzly Conductonaut so I can try this shunt mod. I will report back with how it goes once complete.


I will anxiously await your results; this isn't limited to FE cards, right?


----------



## EnthusiastGamer

Hey guys, I bricked my G1 Gaming GTX 1070 by flashing a wrong BIOS. Could anyone with the same model share a ROM saved with GPU-Z? I'd really appreciate it. Thanks!


----------



## DStealth

Quote:


> Originally Posted by *Fosion*
> 
> True.
> Well, good luck! I will also keep my fingers crossed for you


Thank you for the kind words..







Just a quick run on my new toy... it's faster, faster by far than the 1070...


----------



## GreedyMuffin

Quote:


> Originally Posted by *DStealth*
> 
> Thank you for the kind words..
> 
> 
> 
> 
> 
> 
> 
> Just a quick run on my new toy... it's faster, faster by far than the 1070...


Daym, those scores!

With my 1080 I can run 2139 MHz on stock voltage; my card is not golden, but close!









However, I can't get more than 24,200 graphics score? Any idea why?









My mem is on a +545 offset. Thank you!


----------



## DStealth

Mine is not golden either... memory is just a random number. Actually impressed by the cooler over here, staying as low as 50% fan and benching so high...


----------



## GreedyMuffin

I just can't get why I am getting lower scores than normal. It's very annoying. :/

My 'golden' 980 Ti, which was clocked to 1500/1989 on stock voltage, got 21K. I just can't understand why I can't achieve 25K with my 1080.. :/


----------



## DStealth

Flash the correct BIOS; you could be thermally limited... or just try lowering the memory overclock, that could help.
These cards are huge


----------



## GreedyMuffin

If so, power limited. My temps are 39-43°C under load.

Will see more on it when I come home from Sweden! Awesome scores!


----------



## criminal

Quote:


> Originally Posted by *bigjdubb*
> 
> I will anxiously await your results, this isn't limited to FE cards right?


I don't believe it is. Obviously finding the same resistor on an aftermarket card will be the challenge there though.


----------



## F-Zero

Hey guys! Got myself a Gigabyte GTX 1070 G1 Gaming OC a week ago. I'm coming from an R9 270 so the performance boost is huge!


----------



## Seid Dark

Quote:


> Originally Posted by *EnthusiastGamer*
> 
> Hey guys, I bricked my G1 Gaming GTX 1070 by flashing a wrong BIOS. Could anyone with the same model share a ROM saved with GPU-Z? I'd really appreciate it. Thanks!


Release bios: https://www.techpowerup.com/vgabios/183934/gigabyte-gtx1070-8192-160608

Also it seems like Gigabyte has made several updates to the bios: http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios


----------



## amd7674

Quote:


> Originally Posted by *saunupe1911*
> 
> Asus Strix Overclock Gaming fans won't even turn on until about 50 C in all modes, and you will only see those temps when the card is running 1950 MHz or higher. I can barely even hear the fans until they are running over 60%. Plus you get an extra HDMI, fan management, and the Samsung memory. My only gripe is its size.
> 
> If I were you I would go with the aftermarket vendor that you believe has the best cooling solution; 3 fans minimum IMO. Temps are everything with these cards to keep your clocks high.


Thanks for your post.... Hmm, I'm very happy with my current Asus 670. What were your out-of-the-box speeds (core/vram)? And what OC is stable now? Do you mind sharing your fan profile as well?

Also do I have to use Asus software to control the RGB colors? or once the colors are set I can use MSI AB (my favourite software)?


----------



## DStealth

Quote:


> Originally Posted by *GreedyMuffin*
> 
> If so, power limited. My temps are 39-43'C under load.
> 
> Will see more on it when I come home from Sweden! Awesome scores!


Thanks,
Fosion was my prophet
Quote:


> True.
> Well, good luck! I will also keep my fingers crossed for you thumb.gif


This thing is flying... still not adding all the voltage from the curve, and stock cooled... great card


----------



## saunupe1911

Quote:


> Originally Posted by *amd7674*
> 
> Thanks for your post.... Hmm. I'm very happy with my current Asus 670. What were your out of the box speeds (core/vram)? and now o/c stable? Do you mind sharing your fan profile as well?
> 
> Also do I have to use Asus software to control the RGB colors? or once the colors are set I can use MSI AB (my favourite software)?


Welp, I'm actually new to Nvidia overclocking because I was more of a console gamer and I'm migrating from an old Asus Radeon 6850. Out of the box the GPU would max out around 2050 MHz; I saw it hit over 2100 once in 3DMark. Asus GPU Tweak software keeps the memory at 8300 MHz for all modes, including its overclock mode, so memory tweaks are where the true overclocking comes in, since these 1070 GPUs will naturally hit 2000 MHz. I wasn't really going for silence with my custom profiles: I make the fans kick on to 60% when it hits 50C, then I have a gradual incline to 100% at 70C. For example, it was hot in my home during these 105-degree days, so my case temp was 30C, which made my 1070 temps max out at around 64 under my max profile, when normally it's in the low 50s during full load. My fan was at 80% and keeping it stable. The Strix idles in the low 30s and the fans won't even turn on. I bet it would hit the 20s if the fans ran at maybe 30%, but that's kinda unnecessary IMO.

So far my most stable OC is 2114 MHz GPU and 9210 memory. I GPU-scaled Forza Apex to 4k just to load the system and I'm getting FPS in the 80s with Ultra settings; at 1080p it's in the 120s and 130s. I will never play Forza on Xbox One again!!!!!! Will probably shoot for 9300 at some point. Heck, I'm at work right now itching to play Forza. Now I want a 4k screen smh

Side note... I'm really interested in the Gigabyte Extreme temps, and I wonder if it's using the Samsung memory as well. It was my first choice, but I couldn't find one and I wasn't paying over $500 for it. That price isn't worth it over a Strix OC.
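The fan profile described a couple of paragraphs up (fans off at idle, kicking on to 60% at 50 C, then ramping to 100% by 70 C) is just a piecewise-linear curve. A sketch of the math only; the real profile lives in GPU Tweak or Afterburner, and the threshold values here are the ones from the post:

```python
def fan_percent(temp_c, on_temp=50, on_speed=60, max_temp=70):
    """Piecewise fan curve: fans off below on_temp, then a linear ramp
    from on_speed percent at on_temp up to 100 percent at max_temp."""
    if temp_c < on_temp:
        return 0            # fans fully stopped at idle temps
    if temp_c >= max_temp:
        return 100          # pinned at full speed past the ceiling
    span = max_temp - on_temp
    return on_speed + (100 - on_speed) * (temp_c - on_temp) / span

print(fan_percent(45))  # -> 0
print(fan_percent(50))  # -> 60.0
print(fan_percent(60))  # -> 80.0  (matches "fan was at 80%" at ~60 C)
print(fan_percent(70))  # -> 100
```

That 80% at 64 C in the post lines up with this kind of ramp, which is why the card stayed stable even on a 30 C case-temp day.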


----------



## Kamikaze-X

so finally did some proper overclocking on my MSI 1070 FE with the EK-FC block to see how far the memory would go... and it just wouldn't stop! I thought I was going to run out of slider in Afterburner! :O

I could run +850 MHz (9.8 GHz) in Firestrike but I started seeing green flashes (they didn't really look like artifacts as I'm used to seeing them, but they stopped when I backed off the memory).

So... I can get to +800 MHz for 9.72 GHz on the memory - is this a record for non-extreme cooling?

The GPU will comfortably do +220 MHz and top out at 2126 MHz, which is disappointing: anything beyond that and the 3DMark score goes down, and at +225 MHz the GPU driver will usually stop responding.

15813 is pretty respectable though! Puts me at 5th place overall for a 1070 paired with an i5 4690K.


----------



## amd7674

Quote:


> Originally Posted by *saunupe1911*
> 
> Welp, I'm actually new to Nvidia overclocking because I was more of a console gamer and I'm migrating from an old Asus Radeon 6850. Out of the box the GPU would max out around 2050 MHz; I saw it hit over 2100 once in 3DMark. Asus GPU Tweak software keeps the memory at 8300 MHz for all modes, including its overclock mode, so memory tweaks are where the true overclocking comes in, since these 1070 GPUs will naturally hit 2000 MHz. I wasn't really going for silence with my custom profiles: I make the fans kick on to 60% when it hits 50C, then I have a gradual incline to 100% at 70C. For example, it was hot in my home during these 105-degree days, so my case temp was 30C, which made my 1070 temps max out at around 64 under my max profile, when normally it's in the low 50s during full load. My fan was at 80% and keeping it stable. The Strix idles in the low 30s and the fans won't even turn on. I bet it would hit the 20s if the fans ran at maybe 30%, but that's kinda unnecessary IMO.
> 
> So far my most stable OC is 2114 MHz GPU and 9210 memory. I GPU-scaled Forza Apex to 4k just to load the system and I'm getting FPS in the 80s with Ultra settings; at 1080p it's in the 120s and 130s. I will never play Forza on Xbox One again!!!!!! Will probably shoot for 9300 at some point. Heck, I'm at work right now itching to play Forza. Now I want a 4k screen smh
> 
> Side note... I'm really interested in the Gigabyte Extreme temps, and I wonder if it's using the Samsung memory as well. It was my first choice, but I couldn't find one and I wasn't paying over $500 for it. That price isn't worth it over a Strix OC.


Thanks for the info... I still have several weeks to make up my mind.  I still like the Zotac AMP Extreme card too... it will be a close race between two of the biggest 1070s out there.


----------



## criminal

Quote:


> Originally Posted by *chaous2000*
> 
> Is there an adapter that makes this work? This might be the easiest way for me to get water-cooling on my gpu.


https://www.amazon.com/NZXT-Technologies-Bracket-Cooling-RL-KRG10-W1/dp/B00ITTFNW4/ref=sr_1_4?ie=UTF8&qid=1470772003&sr=8-4&keywords=nzxt+kraken


----------



## saunupe1911

Quote:


> Originally Posted by *amd7674*
> 
> Thanks for the info... I still have several weeks to make up my mind.  I still like the Zotac AMP Extreme card too... it will be a close race between two of the biggest 1070s out there.


Yeah man, the Strix is only a half inch or so shorter. Both are big!!!! That's my only gripe; I thought it wasn't going to fit! My choices were Gigabyte Extreme, Strix OC, and then Zotac, since they all have the best cooling features out there. Zotac is the lowest on the list IMO because I needed multiple HDMI ports, since my PC is primarily an HTPC.


----------



## Kamikaze-X

I just got 2nd for 1070 + 4690K in Timespy! :O

http://www.3dmark.com/spy/244813


----------



## bigjdubb

Quote:


> Originally Posted by *criminal*
> 
> I don't believe it is. Obviously finding the same resistor on an aftermarket card will be the challenge there though.


I don't really feel like taking my waterblock off just to look. Maybe someone will give it a go on an MSI card and do the hard part (figuring out if it works) for me.


----------



## Kamikaze-X

I think the submission form is broken; it asks for memory clock but there's no text box, just 'option 1' to choose :/


----------



## amd7674

Quote:


> Originally Posted by *saunupe1911*
> 
> Yeah man, the Strix is only a half inch or so shorter. Both are big!!!! That's my only gripe; I thought it wasn't going to fit! My choices were Gigabyte Extreme, Strix OC, and then Zotac, since they all have the best cooling features out there. Zotac is the lowest on the list IMO because I needed multiple HDMI ports, since my PC is primarily an HTPC.


Not sure about your audio setup, but perhaps you could use a DVI-D to HDMI adapter to get a 2nd HDMI port? Regardless, it seems you have a very good GPU!!!... For you it is time to play some games; for me it is going to be a painful wait and RTFM... LOL


----------



## Fosion

Quote:


> Originally Posted by *DStealth*
> 
> Thank you for the kind words..
> 
> 
> 
> 
> 
> 
> 
> Just a quick run on my new toy... it's faster, faster by far than the 1070...


Wow nice! Gratz!















But you know... I can try to give some advice. You do not need to use the curve to get the best results. I cannot explain it, but I did some tests on my 1070.
It's a bit worse (63.7 frames per second at 1440p in Heaven benchmark)

than that (64.9)


----------



## saunupe1911

Quote:


> Originally Posted by *amd7674*
> 
> Not sure about your audio setup, but perhaps you could use a DVI-D to HDMI adapter to get a 2nd HDMI port? Regardless, it seems you have a very good GPU!!!... For you it is time to play some games; for me it is going to be a painful wait and RTFM... LOL


Needed a second HDMI for HD audio and video; otherwise any other port is good for video. I wanted more than one just in case I expand my setup.


----------



## tps3443

Here is my verification!

https://www.techpowerup.com/gpuz/details/82drq


----------



## EnthusiastGamer

Does anyone know, once the 1070 (G1 Gaming) is bricked due to flashing a wrong BIOS, whether plugging it into a different adapter will trigger the 2nd BIOS?


----------



## chaous2000

Quote:


> Originally Posted by *gtbtk*
> 
> They all come with 8GB GDDRS ram that is specced at 8gb/s but some have Samsung and some have Micron memory installed. The micron memory cards also are using a different bios version 86.04.26.xx


I get that, I just didn't know at the time if I had Micron or Samsung. I have since verified that my card was made in China and has Micron memory, which, from what I was able to ascertain, is the cause of all the problems everyone has been having. And from what I can tell, while they are supposed to be specced at 8 Gbps, mine isn't operating at nearly that.


----------



## BulletSponge

Quote:


> Originally Posted by *EnthusiastGamer*
> 
> Does anyone know, once the 1070 (G1 Gaming) is bricked due to flashing a wrong BIOS, whether plugging it into a different adapter will trigger the 2nd BIOS?


You have to physically flip the BIOS switch on the card; near the SLI fingers, I believe.


----------



## vfrmaverick

I've been having the worst time with Nvidia drivers lately: constant BSODs with IRQL_NOT_LESS_OR_EQUAL and others, all tracing to nvlddmkm.sys and ntoskrnl.sys. System specs are Sabertooth Z77, 3570K @ 4.6GHz, 16GB PNY 1800MHz DDR3, EVGA 1070 SC (369.09 drivers) and a fresh Win10 with the Anniversary Update. I've tried the last 3 sets of drivers, even using DDU to remove each before installing the next, to no avail.


----------



## Darksides327

Guys, it works!
Okay, if anyone has the same trouble, follow this:
1. Take out the GPU and connect the monitor to the Intel iGPU.
2. Go into safe mode and remove the program you used for overclocking, plus the drivers. Make sure to clean all files from the program you used for overclocking.
3. Plug your GPU back in, reinstall the drivers, and you are ready to continue overclocking.


----------



## LiquidHaus

when you're not used to air cooling anymore so you have to do whatever it takes for lower temps lol..


----------



## tps3443

My first run with the NEW GTX 1070 SC. I had to back down my 6600K some, my water cooler is failing me!

This is a FAST card though! I could not dream of something like this with that RX480 lol.

Is there a way to get past the power limiter yet on the 1070??

Anyways, I feel like my physics score has gone down for some reason. I normally got 10,790 @ 4.8GHz with the 6600K paired with the now-SOLD RX480 8GB. Oh well, maybe it is the memory! I went from DDR4-3000 CAS 15 to DDR4-2133 CAS 15. The previous memory was giving me hotter CPU temps, and it would present random BSODs..

Doesn't feel like that in games!









http://s1371.photobucket.com/user/tps3443/media/tps3443 gtx1070_zpsbabqx4n1.jpg.html


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *vfrmaverick*
> 
> I've been having the worst time with Nvidia drivers lately: constant BSODs with IRQL_NOT_LESS_OR_EQUAL and others, all tracing to nvlddmkm.sys and ntoskrnl.sys. System specs are Sabertooth Z77, 3570K @ 4.6GHz, 16GB PNY 1800MHz DDR3, EVGA 1070 SC (369.09 drivers) and a fresh Win10 with the Anniversary Update. I've tried the last 3 sets of drivers, even using DDU to remove each before installing the next, to no avail.


How many monitors do u have?


----------



## vfrmaverick

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> How many monitors do u have?


single 1080p


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *vfrmaverick*
> 
> single 1080p


have u tried Chkdsk?


----------



## tps3443

Quote:


> Originally Posted by *tps3443*
> 
> My first run with the NEW GTX 1070 SC. I had to back down my 6600K some, my water cooler is failing me!
> 
> This is a FAST card though! I could not dream of something like this with that RX480 lol.
> 
> Is there a way to get past the power limiter yet on the 1070??
> 
> Anyways, I feel like my physics score has gone down for some reason. I normally got 10,790 @ 4.8GHz with the 6600K paired with the now-SOLD RX480 8GB. Oh well, maybe it is the memory! I went from DDR4-3000 CAS 15 to DDR4-2133 CAS 15. The previous memory was giving me hotter CPU temps, and it would present random BSODs..
> 
> Doesn't feel like that in games!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://s1371.photobucket.com/user/tps3443/media/tps3443 gtx1070_zpsbabqx4n1.jpg.html


I was about to run Firestrike, and out of force of habit I turned the fan up to 100% on this 1070 SC. I see in Firestrike it's running about 45C under load while overclocked, LMAO. Obviously I am not used to the new video card. It is amazing how good the cooler is! I cannot seem to heat it up.


----------



## vfrmaverick

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> have u tried Chkdsk?


Just ran it, no issues at all.
I kept changing settings like mad and got the system to not crash after 5 minutes like before, but it hung after 30 minutes of Valley with a high memory overclock. Backed off a bit and trying it again.

PS - I was trying it all without an overclock just to get it running.


----------



## amd7674

Quote:


> Originally Posted by *vfrmaverick*
> 
> Just ran it, no issues at all.
> I kept changing settings like mad and got the system to not crash after 5 minutes like before, but it hung after 30 minutes of Valley with a high memory overclock. Backed off a bit and trying it again.
> 
> PS - I was trying it all without an overclock just to get it running.


Can you try to lower your 3570k o/c? i.e. 4Ghz?


----------



## vfrmaverick

Quote:


> Originally Posted by *amd7674*
> 
> Can you try to lower your 3570k o/c? i.e. 4Ghz?


Already lowered it from what was rock stable at 4.8, then to 4.6, and now 4.4.

Little update: it hung again on a different stress test, but I happened to look at AB right when it happened. Temp was stable at 57C but power just dropped to 0 when it hung. I have a GS500 power supply, so I shouldn't be stressing it out too much.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *vfrmaverick*
> 
> just ran, no issues at all.
> i kept changing settings like mad and got the system to not crash after 5 minutes like before. but it hung after 30 minutes of Valley with a high memory overclock. backed off a bit and trying it again.
> 
> PS- i was trying it all without an overclock just to get it runing.


Try uninstalling all video utilities and any hardware monitoring utilities before uninstalling the video drivers, and don't reinstall the video utilities until the BSODs and lock-ups are gone.

See:
http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/1740#post_25403786

^^ The reason video utilities may make a difference is that the fan speed sensors are being read by multiple hardware utilities at once.

Edit: all the info the utilities stored about older video cards needs wiping out, or else a communication glitch occurs between the new video card and the utilities, which results in BSODs and lock-ups. The best way is to uninstall them before uninstalling the video drivers.


----------



## vfrmaverick

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> Try uninstalling all Video Utilities and also uninstall any Hardware Monitoring Utilities; before uninstalling vid Drivers; and don't reinstall Vid Utilities until BSODs and Lock-ups are gone.
> 
> See:
> http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/1740#post_25403786
> 
> ^^ the reason that Vid utilities may make a difference is the Fan Speed sensors being read by multiple hardware utilities.
> 
> Edit: all previous info in the utilities about older vid cards needs wiping out or else a communication glitch occurs between new Vid cards and utilities which result in BSODs and lock-ups. best way is uninstall before Vid Drivers uninstall.


Well, I did a fresh Win 10 install, so I shouldn't have previous drivers installed. I did remove the custom fan curve just to see if it makes a difference.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *vfrmaverick*
> 
> well i did a fresh win 10 install, so i shouldnt have previous drivers installed. i did remove the custom fan curve just to see if it makes a difference


k

i was getting: "FAILURE_BUCKET_ID: X64_0x116_IMAGE_nvlddmkm.sys".

is that what you got?

Is your Valley installed on the C drive?

If not, then Chkdsk the drive Valley is on.

But *most important is uninstalling the video utilities and finding out if the BSODs still occur after a fresh driver install with no video utilities installed.*

With a fresh OS it is always a good idea to reset the BIOS to optimal defaults with no OCing.

The newest chipset drivers matter a lot too; ones made before these 1070s could need updating.

Win10 still has "Event Viewer" too, I imagine. It usually has too much info, but you could try it.

To be scientific you should use drive imaging tech like Acronis. (A fresh OS, or one free of all BSODs, only takes about 12 minutes to restore from a backup image. It's faster than Chkdsk.)

Just a thought.


----------



## gtbtk

Quote:


> Originally Posted by *chaous2000*
> 
> Alright, looks like the factories in China have been using micron, not the high-speed Samsung, ram. Over on the Nvidia redit page, people that has cards made in China, not Taiwan, have been having a lot of artifact issues. Explains why only the Chinese msi factory issued a recall.


Have you got a link to any details on the recall?

All I have seen is this:



http://imgur.com/Ge10p


----------



## LocutusH

In the club with a ZOTAC 1070FE


----------



## Yetyhunter

Quote:


> Originally Posted by *Seid Dark*
> 
> Release bios: https://www.techpowerup.com/vgabios/183934/gigabyte-gtx1070-8192-160608
> 
> Also it seems like Gigabyte has made several updates to the bios: http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios


What is the correct BIOS for my card? Any idea how to upgrade the BIOS? Thanks


----------



## F-Zero

I tried overclocking my G1 today and it went great! I think I could get a little bit more out of her, but no time right now.
I don't get why my physics score is lower than usual. I usually get around 12800~; now barely 12000.


----------



## chaous2000

Quote:


> Originally Posted by *gtbtk*
> 
> Have you got a link to any details on the recall?
> 
> All I have see is this :
> 
> 
> 
> http://imgur.com/Ge10p


The problem is that MSI only pulled cards from retail stores; you will have to reach out to MSI and request an RMA for an artifacting card. You could also mention yours was one made in China and has the Micron RAM as well; it might help.


----------



## DStealth

Quote:


> Originally Posted by *Fosion*
> 
> Wow nice! Gratz!
> 
> But you know ... I can try to give some advice. You do not need to use the curve to get the best results. I can not explain it, but I did some tests on my 1070.
> It's a bit worse (63.7 frames per second at 1440p in Heaven benchmark)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> than that (64.9)


Yes, I know how the GPU and video clocks are related. This was the best I did yesterday; I will push it further today if my wife allows.








Anyway, I made a quick comparison between the 1070 and 1080 at the maximum OC allowed.
Everything else on my computer stays the same; here's how it goes in 3DMark11:

http://www.3dmark.com/compare/3dm11/...dm11/11468333#

A 21 to 25% difference across tests 1 through 4.


----------



## Prozillah

Quote:


> Originally Posted by *F-Zero*
> 
> I tried overclocking my G1 today and it went great ! I think i could get a little bit more out of her but no time right now
> I don't get why my physics score is lower than usual. I usually get around 12800~ now barely 12000.


What are your clocks at? I'm so close to breaking the 21k graphics mark.


----------



## Kamikaze-X

I broke the 21K graphics score with +800MHz (9.72GHz) on the memory and +220MHz on the GPU (2126MHz)


----------



## F-Zero

Quote:


> Originally Posted by *Prozillah*
> 
> What are you clocks at? Im so close to breaking the 21k graphics mark.


Core 2114 MHz / Memory 2329 MHz


----------



## TUFinside

Quote:


> Originally Posted by *LocutusH*
> 
> In the club with a ZOTAC 1070FE
> 
> 
> Spoiler: Warning: Spoiler!


I have a similar rig in progress; is it noisy at load while gaming?


----------



## CODELESS

Sold my 970 G1 Gaming and upgraded to the GTX 1070 G1 Gaming from Gigabyte.

Most games are stable at +160 on the core, and I have only pushed my RAM to +500MHz so far.

Unturned crashes with any overclock, though GTA5 and Valley are stable with the +160 on the core.

When I start a bench the clocks start off around 2160 (+/- a bit) and then drop down to 2137, where they flatten out.

Pretty happy with the card.

I have an i5 3570K overclocked to 4.5GHz.
The Steam VR test gives me a score of 11.


----------



## Seid Dark

Quote:


> Originally Posted by *tps3443*
> 
> Anyways, I feel like my physics score has gone down for some reason. I normally got a 10,790 @ 4.8Ghz with the 6600K paired with the now SOLD RX480 8GB. Oh well. Maybe it is the memory! I went from DDR4 3000 CAS 15 to, DDR4 2133 CAS 15. The Previous memory was giving me hotter CPU temps, and it would present a random BSOD..
> 
> Doesn't feel like that in games!


Try to overclock that RAM, I've seen benchmarks where DDR4 2133 and 2400 limited Skylake performance a lot in some CPU intensive games, not just in 3DMark. That's the reason why Haswell was faster than Skylake in some launch reviews, reviewers used slow DDR4. Fortunately most games tend to be GPU limited, so your memory shouldn't be a bottleneck in those.


----------



## GreedyMuffin

I run 2400 12-12-12-35-2T. That is completely decent imo.


----------



## Mudfrog

I'm starting to have an issue with my card not being utilized correctly. This occurred last night with two different games, GTA 5 and Borderlands 2. About 15 minutes into each game my card's utilization dropped to 40-50%, which resulted in poor frame rates. The core clock remained constant at 2088; it was just the utilization that dropped. At first I thought CPU bottleneck, which it could have been in GTA 5, but my CPU utilization never went above 50% when this occurred, and Borderlands 2 is definitely not CPU bottlenecked.


----------



## LiquidHaus

I have no idea why any of you guys are snagging cards with a single 8 pin. Power fluctuation is all over the damn place when you overclock them. Water blocks are essentially pointless since frequencies can't stabilize because power delivery is also bad; 4 phases, and a single 8 pin is just not enough.

Simply put.


----------



## GreedyMuffin

Funny.

My 1080 is stable at 2139mhz stock voltage.

That's a FE under water.


----------



## pez

There's plenty of people with FE cards that are clocking above and beyond some of the 'average' OCs of the 'premium' AIBs. It's more of a silicon lottery thing than anything.


----------



## LiquidHaus

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Funny.
> 
> My 1080 is stable at 2139mhz stock voltage.
> 
> That's a FE under water.


Funny when your sig says 2126MHz instead.

Especially when you say stock voltage. Because, you know... you can actually raise the voltage and all.









Whatever your frequency is, it won't matter. Once the TDP limit is reached, power delivery WILL become inconsistent. I have waterblocked 18 1080s and 12 1070s at work so far. They all have acted the same.

Not sure what you get out of attempting to prove someone wrong.


----------



## GreedyMuffin

Perhaps 2126 was my first OC when I got my card and I haven't updated it since?

Lol.

Just saying that mine doesn't do that in games, benchmarks, or folding.


----------



## criminal

Quote:


> Originally Posted by *lifeisshort117*
> 
> I have no idea why any of you guys are snagging cards with a single 8 pin. Power fluctuation is all over the damn place when you overclock them. Water blocks are essentially pointless since frequencies can't stabilize because power delivery is also bad; 4 phases, and a single 8 pin is just not enough.
> 
> Simply put.


Why do you even care? I am happy with my single 8 pin card and my FE seems to be better than others who have AIB cards. And come Friday I will be doing the shunt mod which will take care of any power target issues.


----------



## LiquidHaus

Quote:


> Originally Posted by *criminal*
> 
> Why do you even care? I am happy with my single 8 pin card and my FE seems to be better than others who have AIB cards. And come Friday I will be doing the shunt mod which will take care of any power target issues.


Because I care about my customers' hardware...? It's frustrating to me seeing it time and time again.

Don't get me wrong, OCN, I am not calling you all fools. I am simply stating that my frustration is grounded in tangible experience I've had with these cards.

In my opinion, I'd try to sway you all toward an AIB card rather than reference. But to each their own.


----------



## bigjdubb

I wish my AIB card clocked as well as most of the FE cards I see. The only thing my card can do at 2100mhz is run OCCT.


----------



## criminal

Quote:


> Originally Posted by *lifeisshort117*
> 
> Because I care about my customer's hardware...? It's frustrating to me seeing it time and time again.
> 
> Do not get me wrong OCN, I am not calling you all fools. I am simply stating my frustrations are grounded in tangible experience that I've had with these cards.
> 
> In my opinion, I'd try to sway you all to an AIB card rather than reference. But to each their own.


That's great, but like I said, my FE is faster than many others' AIB cards, so why would I want to change? All the phases in the world aren't going to make a difference until we get a working way to customize the BIOS on these cards, and even then it won't matter much unless the cards are kept really cold.

My issue right now is the power target. And like I said, come Friday I will have that solved.

Oh yeah, last I checked most of us in here aren't your customers.
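For anyone wondering what the shunt mod in question actually does: it lowers the effective resistance of the card's current-sense shunt resistors, so the card under-reads its own power draw and hits the power target later. A minimal sketch of the arithmetic, using an illustrative 5 mOhm shunt value (not measured from a real 1070 PCB):

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def reported_power(actual_w, r_stock, r_modded):
    """Power the card *thinks* it draws after the shunt is modified.

    The sense voltage is V = I * R, so the reported figure scales
    with r_modded / r_stock.
    """
    return actual_w * r_modded / r_stock

R_STOCK = 0.005                   # 5 mOhm shunt (illustrative value)
R_MOD = parallel(R_STOCK, 0.005)  # paralleling an equal resistance halves it

# A card actually pulling 200 W would report only 100 W,
# so it stays under a 150 W power target:
print(reported_power(200, R_STOCK, R_MOD))  # 100.0
```

The catch, of course, is that the card then genuinely draws more than its monitoring reports, so cooling and PSU headroom have to cover the real number.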


----------



## LiquidHaus

Quote:


> Originally Posted by *criminal*
> 
> That's great, but like I said my FE is faster than many others AIB cards, so why would I want to change my card? All the phases in the world aren't going to make a difference until we get a successful way to customize the bios on these cards and even then unless the cards are kept really cold will that make much difference.
> 
> My issue right now is power target. And like I said come Friday I will have that solved.
> 
> Oh yeah, last I checked most of us in here aren't your customers.


You and I got a real problem you know that.

I don't like your attitude. Good luck with your mod, hopefully the manufacturer finds out and voids your warranty.

And I'd love to put some money down that when modded BIOSes DO become a thing, the AIB cards will pull ahead.


----------



## prey1337

This is great stuff.

Started up MGSV again last night, ran smooth as can be, no issues to report.


----------



## GreedyMuffin

Quote:


> Originally Posted by *lifeisshort117*
> 
> And I'd love to put some money down for when modded bios DO become a thing, the AIB cards pull ahead.


I'm in. 5 bucks.









I don't think that will matter. My 980Ti did 1500-1520 on stock voltage. Could probably do 1600+ with custom vmod bios.

EDIT: That was a reference PCB under water. (Now on air.)


----------



## bigjdubb

Quote:


> Originally Posted by *criminal*
> 
> Oh yeah, last I checked most of us in here aren't your customers.


I would be disappointed if anyone on this site was a customer of someone who builds computers.


----------



## LiquidHaus

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I'm in. 5 bucks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't think that will matter. My 980Ti did 1500-1520 on stock voltage. Could probably do 1600+ with custom vmod bios.


Now we got a pot going!

We'll see, everything I've dealt with has been pointing me to this theoretical result. But this is what makes this sort of stuff fun.
Quote:


> Originally Posted by *bigjdubb*
> 
> I would be disappointed if anyone on this site was a customer of someone who builds computers.


Indeed, though I doubt a customer would even know about this site. A ton of my customers can't be bothered to mess with their computer. They want a fast computer and they don't want to have to mess with it. Can't knock that. Everyone has their hobbies, and everyone has things in their life they can't devote time to get into.


----------



## tps3443

Quote:


> Originally Posted by *lifeisshort117*
> 
> I have no idea why any of you guys are snagging cards with a single 8 pin. Power fluctuation is all over the damn place when you overclock them. Water blocks are essentially pointless since frequencies can't stabilize because power delivery is also bad; 4 phases, and a single 8 pin is just not enough.
> 
> Simply put.


Well, when you pick up a brand-new EVGA GTX 1070 SC for $331 sealed, you do not care about a single 8-pin. And adding another $120 or so for a Zotac Amp! Extreme at $459.00 will not justify the performance boost.


----------



## LiquidHaus

Quote:


> Originally Posted by *tps3443*
> 
> Well, when you pick up a Brand new GTX 1070 Evga SC for $331 sealed. You do not care about a single 8 pin. And adding another $120 for a Zotac Amp! extreme which is $459.00 will not justify the performance boost.


At $331, I couldn't agree with you more. Awesome price for a 1070. When I got my Amp Extreme though, Founder's were the same price.


----------



## criminal

Quote:


> Originally Posted by *lifeisshort117*
> 
> I have no idea why any of you guys are snagging cards with a single 8 pin. Power fluctuation is all over the damn place when you overclock them. Water blocks are essentially pointless since frequencies can't stabilize because power delivery is also bad; 4 phases, and a single 8 pin is just not enough.
> 
> Simply put.


Quote:


> Originally Posted by *lifeisshort117*
> 
> Funny when your sig says 2126mhz instead.
> 
> Especially when you say stock voltage. Cause you know...you can actually up voltage and all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Whatever your frequency is, it won't matter. Once the TDP limit is reached, power delivery WILL become inconsistent. I have waterblocked 18 1080s and 12 1070s at work so far. They all have acted the same.
> 
> Not sure what you get out of attempting to prove someone wrong.


Like your attitude was any better in the two posts above?
Quote:


> Originally Posted by *lifeisshort117*
> 
> You and I got a real problem you know that.
> 
> I don't like your attitude. Good luck with your mod, hopefully the manufacturer finds out and voids your warranty.
> 
> And I'd love to put some money down for when modded bios DO become a thing, the AIB cards pull ahead.


We do huh? Mr. Internet tough guy...lol

Anyway, I am not one to "claim" warranty on something I modify, so I am not really concerned about the manufacturer finding out. And if/when a modded BIOS comes out and makes every AIB card better than my FE (which I doubt), I won't really care either way, seeing as I am happy with the performance I have now and any extra I get is just gravy.


----------



## saunupe1911

Quote:


> Originally Posted by *tps3443*
> 
> Well, when you pick up a Brand new GTX 1070 Evga SC for $331 sealed. You do not care about a single 8 pin. And adding another $120 for a Zotac Amp! extreme which is $459.00 will not justify the performance boost.


Ok, so where on earth can I find an EVGA GTX 1070 SC for $331? Best Buy, Newegg, and Amazon are all around $439-ish.


----------



## SuperZan

Quote:


> Originally Posted by *tps3443*
> 
> Well, when you pick up a Brand new GTX 1070 Evga SC for $331 sealed. You do not care about a single 8 pin. And adding another $120 for a Zotac Amp! extreme which is $459.00 will not justify the performance boost.


This. Plus, Mr. Coolguy routine or no, there isn't any indication thus far that the extra 8-pin is substantially improving OC ability. Simply put.


----------



## saunupe1911

Quote:


> Originally Posted by *SuperZan*
> 
> This. Plus, Mr. Coolguy routine or no, there isn't any indication thus far that the extra 8-pin is substantially improving OC ability. Simply put.


So I'll just use some common sense here. Voltage is locked, right? An 8-pin can deliver a stable 1.09 volts, right? So where's the advantage of another connector? A 1080 needs two 8-pin connectors, right? What is its voltage? I would say a 1070 would need two connectors to reach that voltage once it's unlocked... in theory.
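For what it's worth, the connector question is really about power (watts), not the 1.09 V core voltage: the PCIe spec allows roughly 75 W through the slot, 75 W per 6-pin, and 150 W per 8-pin. A quick sketch of the spec ceiling for each layout (the helper function is just for illustration):

```python
# Spec power limits per source, in watts (PCIe CEM spec values).
PCIE_SLOT_W = 75   # power drawn through the motherboard slot
PIN6_W = 75        # per 6-pin auxiliary connector
PIN8_W = 150       # per 8-pin auxiliary connector

def board_power_budget(n_6pin=0, n_8pin=0):
    """Spec power ceiling in watts for a given connector layout."""
    return PCIE_SLOT_W + n_6pin * PIN6_W + n_8pin * PIN8_W

# A single-8-pin card like the 1070 FE:
print(board_power_budget(n_8pin=1))            # 225
# An AIB card with 8-pin + 6-pin:
print(board_power_budget(n_6pin=1, n_8pin=1))  # 300
```

Even the single-8-pin ceiling of 225 W sits well above the 1070's ~150 W TDP, which suggests the throttling people see comes from the BIOS power target rather than the connector itself.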


----------



## GreedyMuffin

A 1080 FE with a custom BIOS can deliver 1.2V. Tested it myself.


----------



## bigjdubb

Quote:


> Originally Posted by *SuperZan*
> 
> This. Plus, Mr. Coolguy routine or no, there isn't any indication thus far that the extra 8-pin is substantially improving OC ability. Simply put.


The extra 6 pin on my MSI card gets me zero extra performance at this point. Time will tell if the extra power will be beneficial once we can open up the BIOS and let the juice loose. I have a strong feeling that bios modding results will be largely dependent on your finishing place in the silicon lottery.

Quote:


> Originally Posted by *saunupe1911*
> 
> So I'll just use some common sense here. Voltage is locked right? An 8 pin can deliver stable 1.09 volts right? So where's the advantage of another connector. So a 1080 needs two 8 pin connectors right? What is its voltage? I would say a 1070 would need 2 pins to reach that voltage once it's unlocked....in theory.


The 1080 only needs a single 8 pin at this point as well.


----------



## criminal

Quote:


> Originally Posted by *bigjdubb*
> 
> bios modding results will be largely dependent on your finishing place in the silicon lottery.


That is pretty much how it has been since Maxwell. Just talk to the guys with a Classified 980/980Ti that have completely unlocked voltage: unless you were going subzero, that extra voltage didn't matter.


----------



## LiquidHaus

Quote:


> Originally Posted by *criminal*
> 
> Like your attitude was any better in the two posts above?
> We do huh? Mr. Internet tough guy...lol
> 
> Anyway, I am not one to "claim" warranty on something I modify. So I am not really concerned about the manufacturer finding out. And if/when modded bios comes out and may make every AIB card better than my FE (which I doubt), I won't really care either way seeing as how I am happy with the performance I have now and any extra I get is just gravy.


I hadn't had my cup of coffee yet.

The warranty deal with Zotac is nice, even with the FE editions I believe. They won't mind you taking the stock coolers off. iirc EVGA is the same way. Not sure about any others though.
Quote:


> Originally Posted by *SuperZan*
> 
> This. Plus, Mr. Coolguy routine or no, there isn't any indication thus far that the extra 8-pin is substantially improving OC ability. Simply put.


All I'm saying is it's pretty interesting that almost every 1070 can cross-flash to another's BIOS - whether reference or AIB - and they all overclock relatively the same. I'd like to think the BIOS files everyone got have the same limits; whether it's Boost 3.0, voltage limits, TDP limits, or whatever specific thing it is, it's the same across the board, and because of that it creates the same limits on any card.

It's why I'm doubling down on a modded BIOS helping greatly.
Quote:


> Originally Posted by *saunupe1911*
> 
> So I'll just use some common sense here. Voltage is locked right? An 8 pin can deliver stable 1.09 volts right? So where's the advantage of another connector. So a 1080 needs two 8 pin connectors right? What is its voltage? I would say a 1070 would need 2 pins to reach that voltage once it's unlocked....in theory.


It is indeed a theory. A theory I'll stick with.


----------



## Prozillah

Quote:


> Originally Posted by *supermi*
> 
> This pic work? 1070 windforce


Do you know if this is the same PCB as a G1? I'm going to be doing the shunt mod on my G1 as soon as my Grizzly turns up.


----------



## chrcoluk

Quote:


> Originally Posted by *Curseair*
> 
> Do the 1070's with more than one 8 pin actually get any benefit? Wondering because i'm thinking about getting the Strix and it's one 8 pin only.


Yes.

My Gamerock Premium cannot hold 1.09v, as it hits the TDP limit and throttles.

A guy with a dual-8-pin Zotac Amp Extreme can easily run 1.09v with TDP usage at only 60%.

The big question, though, is whether a single-8-pin card can raise its TDP limit as high as the dual-8-pin Zotac cards; we won't know until a custom BIOS arrives.

If I was buying today, I would get a dual-8-pin card.


----------



## chrcoluk

Quote:


> Originally Posted by *gtbtk*
> 
> you can lock the voltage with afterburner on pascal cards.
> 
> you need to open the Voltage/frequency graph with ctrl-f then select the voltage point that you want to lock and press the letter "L" key and it will lock the voltage where you select it when the card is under load


Why are all these new Afterburner features undocumented hotkeys? There should be a GUI button to click on.


----------



## Asymmetry

Loving my Seahawk: gaming temps 35°C (Witcher 3 ultra settings @1080p), 20°C idle!




build thread
http://www.overclock.net/t/1480841/htpc-to-dual-water-gaming-pc-htpc-evolution-fight-to-defeat-heat-and-noise


----------



## tps3443

Anyone else gaming at 1080P with a GTX 1070 at 2.126GHz/9.2GHz?

It is kind of silly lol, but man does it feel smooth. A little overkill, but it sure is sweet!

Skyrim with HD packs is running around 225-400 fps.

Vsync forced off.

And the temps do not exceed 65C with the overclock, with the fan profile on silent even in demanding games.

I can keep the temps around the 40s with a higher fan profile, but there's no need.

Nice to have a silent video card again.


----------



## FlatOUT

Looks like no one here likes the MSI Gaming X version


----------



## supermi

Quote:


> Originally Posted by *Prozillah*
> 
> Do you know if this is the same pcb as a g1? I'm going to be doing the shunt mod on my g1 as soon as my grizzly turns up


Pretty sure it is the same PCB; that looks like the right resistor, right? I need some liquid metal! I wanna PASS stock 1080s. If I can get this working and we get some voltage, it might finally be worth taking these to -30C!!!!


----------



## tps3443

I wish I could get some more voltage to mine. My temps are phenomenal on air alone. I can see myself easily getting 2.2-2.35GHz, maybe more, if I just had more voltage. Obviously Nvidia wanted to actually sell 1080s; is this why our voltage is limited?


----------



## _Killswitch_

I have a question for you 1070 owners. I've been thinking about upgrading my video card and I'm stuck between the 1080 and the 1070. The 1080 is really expensive while the 1070 is more affordable; will the 1070 be a really decent upgrade from a 680?

I've been unimpressed with most video card releases until now, so will the 1070 be well worth the money?


----------



## Shut3r

Quote:


> Originally Posted by *Asymmetry*
> 
> Loving my Seahawk, gaming temps 35*C (witcher 3 ultra settings @1080), 20*C idle !
> 
> 
> 
> 
> build thread
> http://www.overclock.net/t/1480841/htpc-to-dual-water-gaming-pc-htpc-evolution-fight-to-defeat-heat-and-noise


Hey, can you upload your BIOS file please?

Sent from my LG-D855 using Tapatalk


----------



## Prozillah

What resolution are you planning to game at?


----------



## xGTx

Totally. I got a 1070 Gaming X about 2 weeks ago and I was amazed that the card gives very similar performance to the GTX 680 tri-SLI setup I had at one point 4 years ago, while having 4 times the RAM, consuming about 1/4 the wattage, and being extremely quiet. No microstuttering either.


----------



## LocutusH

Quote:


> Originally Posted by *TUFinside*
> 
> I have a similar rig ongoing, is it noisy at load while gaming ?


No, not noisy at all. Since the 6 series these reference coolers have been very good. Well, not 50°C good, like the ones with 3-4 times bigger heatsinks, but who cares if 70°C air comes out the rear?


----------



## tps3443

Quote:


> Originally Posted by *_Killswitch_*
> 
> I have question for you 1070 Owners, Been thinking about upgrading my Video card. Stuck between 1080 and 1070. 1080 is really high while 1070 is more affordable will the 1070 be really decent upgrade from 680?
> 
> Been unimpressed with most of release of video card's until now so will the 1070 be well worth the money?


A GTX 1070 is a HUGE upgrade from a GTX 680! I bought an RX480 8GB last month and was scoring a steady graphics score of about 15,000 with it, overclocked to its limits. Gaming performance at 1080P was really, really good. But I wanted more, and I found an extreme deal on a GTX 1070 SC ACX 3.0. I am also upgrading to 1440P 144Hz or 4K 60Hz very shortly, so I wanted some additional power, especially with VR gaming requiring 90+ FPS.

I am getting a 21,000 graphics score now; gaming performance has nearly doubled in some cases!

An RX480 is about two times the performance of a GTX 680 alone, at only around $200 NEW. Which is HUGE, because a lot of people think of a good upgrade as being about 20%+ lol.

So if you are upgrading to a GTX 1070, the performance gain is huge, not minuscule one bit. ONLY A HUGE LEAP!!! Roughly (3) times or more the power.

The GTX 1080 is a bit pricey for me at $650-$700. Maybe if it was 384-bit, but I feel like it's priced at what a GTX 1080 Ti should be, and once the GTX 1070 is overclocked it is either neck and neck or nipping at its heels for around $400.

If you want a huge boost, get a GTX 1070. And overclock your CPU.


----------



## GreedyMuffin

I'd say get what you can afford. It also depends on your gaming res/Hz.

I went for the 1080 as I had some extra money at the time. The difference between the two is 20-25% AFAIK; you will notice it quite well on 1440P/144Hz.

You can't compare an OCed 1070 to a stock 1080, since most 1080s OC well. I'm running 2139 at stock voltage, but that seems to be a bit above average.

Not too many 1070s can do that from what I've seen.

All in all:

Can you afford it?

Will you benefit from a card faster than the 1070?

Which res/Hz are you playing on?


----------



## Prozillah

I play at 1440p 144Hz with my 1070 and honestly it handles it absolutely brilliantly. Now, that's not to say I have everything maxed out; I usually turn down shadows in some games and leave AA and AP around x2-x4. The Division, with a combination of ultra and low settings, nets me 95fps on average - still a very smooth experience. Yes, a 1080 would be better of course, but the price increase isn't justifiable for most.


----------



## LocutusH

With some games, you can even bring a 1080 under 60 fps in FHD.

So it's safe to say that ONLY your budget is the limit, along with what you are willing to sacrifice in graphics settings.


----------



## chrcoluk

680 to 1070 is a no brainer, massive upgrade.


----------



## bpmcleod

Quote:


> Originally Posted by *bpmcleod*
> 
> Any custom BIOS yet for MSI Gaming 1070?
> 
> EDIT: I want a custom BIOS unlocking voltages and power limits and what not. Will be dropping it on water here shortly so wanting to prepare for it... Here is the original BIOS
> 
> GP104.zip 149k .zip file


Anyone able to make a modded BIOS for my MSI 1070 Gaming X? This is my current BIOS file; I haven't found one listed yet. Looking to unlock voltages and increase the power limit to roughly 140% if possible. Thanks a lot.


----------



## Shut3r

Which BIOS file is it?

Sent from my LG-D855 using Tapatalk


----------



## criminal

Quote:


> Originally Posted by *bpmcleod*
> 
> Anyone able to make a modded BIOS for me for my MSSI 1070 Gaming X? This is my current BIOS file. I havent found one listed yet. Looking to unlock voltages and increase power limit to roughly 140% if possible. Thanks a lot


There is currently no way to mod the bios for these cards.


----------



## bpmcleod

Quote:


> Originally Posted by *criminal*
> 
> There is currently no way to mod the bios for these cards.


Whatttt lol. Is it just the MSI cards or the 10xx series in general?


----------



## criminal

Quote:


> Originally Posted by *bpmcleod*
> 
> Whatttt lol. Is it just the MSI cards or the 10xx series in general?


10xx series in general.


----------



## CaptainZombie

I bought the FTW last week, and while searching on Google about the hybrid all-in-one cooler from EVGA for this card I spotted this. I am assuming that they are going to also come out with a standalone kit for both the 1070 and 1080 FTW since they use the same PCB?

Has @EVGA-JacobF said anything about when the standalone kit might be releasing?

http://www.evga.com/Products/Product.aspx?pn=08G-P4-6278-KR


----------



## madmeatballs

Hmmmm, I noticed that if my temps are lower than 45C I get the ability to overclock further (stably; well, at least on my 1070, yours could be completely different). Hours ago I had the *h*t stock fans installed on my Kraken X41 and temps were hovering around 49-52C. I replaced them with higher static pressure fans (the Helix 140mm from my H240-X) and my temps now sit at a consistent 41-42C (this was in the Heaven bench, btw). To be honest these Helix fans aren't the best ones out there, as I replaced my H240-X's fans with Noctua iPPC fans, which do a great job cooling the rad.

Knowing that I got a significant temperature improvement by switching fans, I tried to push my overclock. It is now stable at 2075MHz (I will push it until unstable), and I'm guessing once I hit 45-50C it will start limiting my overclock. I'm not sure what kind of sh** this is, but I'm thinking if you can hold temps lower than 45C you can max your card out. Initially, when I had the stock cooler on this 1070 Amp Extreme, I was only able to OC +30 on the core (temps were hitting 59-60C), and now I can do +50 (2075MHz [41-42C]). Maybe I can hit 2100 now; I'll give it a shot and update when I get results. This is just my opinion and I am not an expert. (I stress test with the Firestrike stress test and 30 mins of Furmark, plus a few hours of BF4 gameplay.)

Update:

Okay, I tried +75 off the bat for 2100MHz; too bad my card can't make it lol! Temps were still at 41-42C.
So far +70 is looking good though. This might be its limit.

Update 2:
+70 is confirmed stable!








Actually, for some reason +70 is still 2100MHz LOL. It's weird: MSI Afterburner, GPU-Z, CAM, and HWMonitor all say both +70 and +75 result in 2100MHz.
+75 wasn't stable while +70 was, even though both read the same clock.

Update 3:
Something weird is going on: from +60 to +75 the core reads 2100MHz,
yet +80 is 2114MHz. Weird.

Update 4:
Now +60 gives 2088MHz and +65 gives 2100MHz. Still the same temps.


----------



## paulclift

Quote:


> Originally Posted by *Asymmetry*
> 
> Loving my Seahawk, gaming temps 35*C (witcher 3 ultra settings @1080), 20*C idle !
> 
> build thread
> http://www.overclock.net/t/1480841/htpc-to-dual-water-gaming-pc-htpc-evolution-fight-to-defeat-heat-and-noise


Got the same card, don't quite get those temps though. Am hitting more like 45c when running The Division.


----------



## LiquidHaus

I tried Afterburner's curve mode last night, but to no avail. Even locked at 1.093v, I couldn't achieve a higher clock speed than with the normal sliders.

This is good and bad though: good that Afterburner is directing the right amount of voltage to the card for a given clock speed, but bad that I couldn't get a higher clock lol.


----------



## Curseair

Which one, guys: MSI Gaming X or EVGA FTW Gaming? Black + red build. I've ruled out the Zotac and Asus.


----------



## Shaitan

Quote:


> Originally Posted by *Curseair*
> 
> Which one, guys: MSI Gaming X or EVGA FTW Gaming? Black + red build. I've ruled out the Zotac and Asus.


Between the two, my personal opinion would be the MSI. I have tried both and I had issues with the FTW not being able to cool itself properly and the LEDs being a bit wonky causing odd colors. I probably just had a defective one though.

Is there a particular reason that you ruled out the Asus? I have one coming tomorrow so I'm just curious.


----------



## Curseair

It's more expensive where I'm at. I love the backplate on the EVGA and the side panel that says what the card is, and the MSI just because I have a red and black build and it looks aggressive.







Saying that, I honestly can't decide between the three, even though I said I'd ruled some out. Posting here will probably help me make up my mind haha.


----------



## saunupe1911

Welp, I've gotten my Asus Strix OC to boost and stabilize at 2114MHz if it's around 60C or below, and it will stay locked at 9304MHz memory no matter the temps. Might try 9400MHz tonight. What I did find out is that it won't function if the GPU is manually set above 2114MHz: instant green screen. So I will leave that alone, set it to 1900-ish, and let the boost do its thing automatically. It may be maxed out at this point. I can't speak for others, but let Boost 3.0 raise the GPU clock automatically; it will naturally go as high as it can from what I'm seeing. Memory, on the other hand, is where static max values need to be set in overclocking software.


----------



## paulclift

GP104.zip 148k .zip file


As requested. My MSI Seahawk BIOS.

86.04.1E.00.55


----------



## amd7674

Quote:


> Originally Posted by *Shaitan*
> 
> Between the two, my personal opinion would be the MSI. I have tried both and I had issues with the FTW not being able to cool itself properly and the LEDs being a bit wonky causing odd colors. I probably just had a defective one though.
> 
> Is there a particular reason that you ruled out the Asus? I have one coming tomorrow so I'm just curious.


I'm still undecided myself









Some peeps report "coil whine" issues with the EVGA FTW. The MSI Gaming has a memory lottery (Micron isn't doing so well in the o/c department)... I might buy the Asus Strix OC (only one 8-pin power input), but I like the extra power input on both the EVGA and MSI products. Living in Canada, from my understanding I have better RMA support with Asus and MSI (both have Canadian depots); for EVGA and Zotac I have to send the card to the US.

The usual first world problems...


----------



## Shaitan

Quote:


> Originally Posted by *amd7674*
> 
> I'm still undecided myself
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Some peeps report "coil whine" issues with the EVGA FTW. The MSI Gaming has a memory lottery (Micron isn't doing so well in the o/c department)... I might buy the Asus Strix OC (only one 8-pin power input), but I like the extra power input on both the EVGA and MSI products. Living in Canada, from my understanding I have better RMA support with Asus and MSI (both have Canadian depots); for EVGA and Zotac I have to send the card to the US.
> 
> The usual first world problems...


I actually have to say I prefer the cards with the single 8 pin connector. I usually don't bother much with overclocking so it doesn't seem that I need anything more than that. The only thing I didn't like about the MSI gaming X is how wide the card is.


----------



## prey1337

Quote:


> Originally Posted by *Shaitan*
> 
> I actually have to say I prefer the cards with the single 8 pin connector. I usually don't bother much with overclocking so it doesn't seem that I need anything more than that. The only thing I didn't like about the MSI gaming X is how wide the card is.


Exactly, it would have been so close to my side panel that I probably wouldn't have been able to plug in the pins.
Plus that red doesn't exactly go with every build theme out there.


----------



## amd7674

Quote:


> Originally Posted by *Shaitan*
> 
> I actually have to say I prefer the cards with the single 8 pin connector. I usually don't bother much with overclocking so it doesn't seem that I need anything more than that. The only thing I didn't like about the MSI gaming X is how wide the card is.


You are probably correct... In my case HAF 922, I can fit any 1070 there is, including the Extreme Amp.

Can someone please tell me what is the difference between EVGA FTW DT and EVGA FTW cards?

http://www.evga.com/Products/Product.aspx?pn=08G-P4-6276-KR

http://www.evga.com/Products/Product.aspx?pn=08G-P4-6274-KR

What does DT stand for? Both require dual 8-pin connectors and both offer dual BIOSes. I see the clocks are lower on one, but the price is the same for both.


----------



## amd7674

Quote:


> Originally Posted by *prey1337*
> 
> Exactly, it would have been so close to my side panel that I probably wouldn't have been able to plug in the pins.
> Plus that red doesn't exactly go with every build theme out there.


My case is closed so I don't care if the card was the ugliest on the planet LOL... As long as it is stable and "quiet"


----------



## criminal

Quote:


> Originally Posted by *amd7674*
> 
> You are probably correct... In my case HAF 922, I can fit any 1070 there is, including the Extreme Amp.
> 
> Can someone please tell me what is the difference between EVGA FTW DT and EVGA FTW cards?
> 
> http://www.evga.com/Products/Product.aspx?pn=08G-P4-6276-KR
> 
> http://www.evga.com/Products/Product.aspx?pn=08G-P4-6274-KR
> 
> What does DT stand for? Both require dual 8-pin connectors and both offer dual BIOSes. I see the clocks are lower on one, but the price is the same for both.


All I see is the clock speed difference, and one has LEDs while the other doesn't.


----------



## BTCHSLP

Quote:


> Originally Posted by *paulclift*
> 
> GP104.zip 148k .zip file
> 
> 
> As requested. My MSI Seahawk BIOS.
> 
> 86.04.1E.00.55


Is that the Sea Hawk X BIOS?
Does that card have the reference design/PCB?

If it has the reference PCB, I will test it on my EVGA FE









EDIT:
Ok - it looks like the Sea Hawk has a reference PCB:
http://www.bit-tech.net/hardware/graphics/2016/08/03/msi-geforce-gtx-1070-sea-hawk-x-review/1
Quote:


> The PCB also looks like a standard reference board, with no fancy features and a 4+1 phase power layout that doesn't make use of upgraded components like MSI's Gaming range of cards does. The card does support SLI, although it's likely to look messy pairing up two of these, and you'd need to ensure adequate spacing between them to give the blower fan of the top card room to breathe.


----------



## amd7674

Quote:


> Originally Posted by *criminal*
> 
> All I see is the clock speed difference and one has leds while the other doesn't.


Thank you... it is a little weird in my books: giving it a 10+2 power phase and dual BIOS support but lower base clocks. Maybe these are second-tier FTWs???

Are you sure about the LEDs? If you look at its specs tab it shows:

Adjustable RGB LED
An adjustable RGB LED that can be controlled in EVGA PrecisionX OC


----------



## prey1337

And they are priced the same, really strange.


----------



## paulclift

Quote:


> Originally Posted by *BTCHSLP*
> 
> Is that the Sea Hawk X-BIOS ?
> Got that card the reference-design / pcb ?
> 
> If the card got the reference-design i will test it on my EVGA FE
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT:
> Ok - it looks like that the Sea Hawk have reference-pcb:
> http://www.bit-tech.net/hardware/graphics/2016/08/03/msi-geforce-gtx-1070-sea-hawk-x-review/1


Yes, Sea Hawk X. Just standard, as I bought it. I don't know much else as I'm new to this.


----------



## BTCHSLP

Thank you for uploading the BIOS.

I benched the BIOS and I'm a little disappointed with the results.
It's a pity that a hybrid-cooled card's maximum power target is only 105%.









SeaHawk:
PT: 105 %
TT: 92 °C
Voltage: 100 %
GPU Off: +0 MHz
MEM Off: +0 MHz

NVIDIA GeForce GTX 1070 video card benchmark result - Intel Core i7-6800K,EVGA INTERNATIONAL CO.,LTD 131-HE-E095

SC-BIOS:
PT: 112 %
TT: 92 °C
Voltage: 100 %
GPU Off: +0 MHz
MEM Off: +0 MHz

NVIDIA GeForce GTX 1070 video card benchmark result - Intel Core i7-6800K,EVGA INTERNATIONAL CO.,LTD 131-HE-E095


----------



## SupernovaBE

I hope the BIOS editor will come out soon









I just did part 2 of my 1070 SLI build:
second waterblock and the parallel full-cover block + red dye








And a second 480 ut60 for Cool silent oc









Next: oc max







( 1.09v )

MSI Gaming X 8G 1070s
I like them a lot!

Atm
+150 / + 750 ( 2126 / 9500 )
http://www.3dmark.com/spy/257704 ( 11 1129 ) Good for first place in the 1070 sli list









Firestrike
+100 / 600 ( higher score than +100 / 750 )
http://www.3dmark.com/fs/9748833 ( 25 860 ) good for first place in the 1070 sli list


----------



## Asymmetry

Quote:


> Originally Posted by *paulclift*
> 
> Got the same card, don't quite get those temps though. Am hitting more like 45c when running The Division.


In my case I mounted a case fan behind the tank, then the stock fan in front; the case has two intake fans in front of that. Ambient temp is 17°C.


----------



## amd7674

Quote:


> Originally Posted by *prey1337*
> 
> And they are priced the same, really strange.


I got a reply from EVGA's excellent support:

"The difference between the two models is just the clock rate. DT stands for De-tuned. They are cards that can overclock fine but could not reach our Full FTW specs."

So it is the lower-tier FTW card.


----------



## prey1337

Did you ask them why it still costs the same then? Haha


----------



## George the Jew

Hi all, I need some help.

I bought a 1070 FTW from EVGA recently and have been having issues playing one game in particular: Counter-Strike: Global Offensive. I stutter a **** ton, get artifacts (overclocked or not), and my display driver keeps crashing. Anyone else having a similar issue, or do I maybe have a bad card? Other games I play work fine; just CS:GO has this issue for me.

EDIT: Just tried some other low-demand games; same result in Rocket League. Time to RMA?

EDIT #2: Tried GTA V, the one game I know works with this card, and it started artifacting. I'm RMAing this thing.


----------



## George the Jew

Prey, it costs about $40 less, what are you talking about?


----------



## prey1337

Quote:


> Originally Posted by *George the Jew*
> 
> Prey it costs about $40 less, what are you talking about?


All I saw were the two links posted earlier to EVGA's site, which show them at the same price. I guess it's a mistake.

http://www.evga.com/Products/Product.aspx?pn=08G-P4-6276-KR

http://www.evga.com/Products/Product.aspx?pn=08G-P4-6274-KR


----------



## amd7674

Quote:


> Originally Posted by *prey1337*
> 
> All I saw was the two links posted earlier of EVGA's site, which show them as the same price. I guess it's a mistake.
> 
> http://www.evga.com/Products/Product.aspx?pn=08G-P4-6276-KR
> 
> http://www.evga.com/Products/Product.aspx?pn=08G-P4-6274-KR


I think Newegg sells them at different prices. However, do I want something that's de-tuned? LOL








I think we wouldn't be on these forums if we were planning on using stock speeds


----------



## amd7674

Quote:


> Originally Posted by *George the Jew*
> 
> Hi all, I need some help.
> 
> I bought a 1070 FTW from EVGA recently, and have been having issues playing one game in particular, that game being Counter Strike: Global Offensive. I tend to stutter a **** ton, have artifacts (overclocked or not) and my display driver keeps crashing. Anyone else having a similar issue, or do I maybe have a bad card? Other games I play work fine, just CS:GO has this issue for me.
> 
> EDIT: Just tried some other low-powered games, same result in Rocket League. RMA an order?
> 
> EDIT #2; Tried GTA V, the one game I know works with this card, and it started artifacting. Im RMAing this thing.


If you have an o/c on your CPU/RAM, try running them at stock speeds. Try reinstalling drivers with DDU. Did you run Afterburner to check how the GPU is behaving... fan speed, core speed, etc.?
Or it could just be a bad card :-(


----------



## criminal

So I did the shunt mod:

Before:


After:


It seems to help with power consumption bouncing off the TDP limit and also helps keep clocks more consistent. Power consumption would constantly bounce off 121%; so far it hasn't exceeded 117.8%. Clocks never drop below 2063 now, when they would drop to ~2010 or so before. 3DMark/Metro LL scores improved slightly at the same clocks, but I haven't had a chance to play a game yet and see long-term results.

So not a bad result, but not a great one either.

FYI, I used an extremely thin layer. A thicker layer should offer better results.
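For anyone curious why a thin layer only moves the needle a little, here is the arithmetic behind the mod as I understand it. Note that the 5 mOhm shunt value and the liquid-metal resistance below are illustrative assumptions, not measured values:

```python
# The card senses current from the voltage drop across a shunt resistor.
# Painting liquid metal over the shunt adds a parallel resistance path,
# so the sensed drop (and the reported power) shrinks by r_eff / r_shunt.
# Both resistance values below are illustrative assumptions.

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R_SHUNT = 0.005   # assumed 5 mOhm stock shunt
R_LM = 0.150      # assumed resistance of a very thin liquid-metal layer

scale = parallel(R_SHUNT, R_LM) / R_SHUNT
print(round(scale, 3))  # ~0.968: the card now reports ~97% of the real power
```

A ~3% drop in reported power is in the same ballpark as the 121% to 117.8% change described above; a thicker (lower-resistance) layer would shift it further.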


----------



## amd7674

criminal... looks great !!!









What paste did you use?


----------



## criminal

Quote:


> Originally Posted by *amd7674*
> 
> criminal... looks great !!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What paste did you use?


Thermal Grizzly Conductonaut


----------



## ucode

Quote:


> Originally Posted by *BTCHSLP*
> 
> It's a pity for a hybrid-cooled VGA that the maximal power-target is 105 %


You do know that is 200W?


----------



## Prozillah

Quote:


> Originally Posted by *criminal*
> 
> So I did the shunt mod:
> 
> Before:
> 
> 
> After:
> 
> 
> Seems to help with power consumption bouncing off the tdp limit and also helps keep clocks more consistent. Power consumption would constantly bounce off 121% and so far it hasn't exceeded 117.8%. Clocks never drop below 2063 now when they would drop down to like ~2010 or something before. 3dmark/Metro LL scores improved slightly at same clocks, but haven't had a chance to play a game yet and see long term results.
> 
> So not a bad results, but really not great either.
> 
> FYI I used an extremely thin layer. A thicker layer should offer better results.


Nice job Crim - I got my Grizzly coming soon. Gonna be doing the same mod on my G1.

What BIOS are you using that gives you 120%? Max on this bad boy is 111%, from memory.


----------



## danjal

Quote:


> Originally Posted by *amd7674*
> 
> I'm still undecided myself
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Some peeps report "coil whine" issues with the EVGA FTW. The MSI Gaming has a memory lottery (Micron isn't doing so well in the o/c department)... I might buy the Asus Strix OC (only one 8-pin power input), but I like the extra power input on both the EVGA and MSI products. Living in Canada, from my understanding I have better RMA support with Asus and MSI (both have Canadian depots); for EVGA and Zotac I have to send the card to the US.
> 
> The usual first world problems...


I had an EVGA FTW and it had a bit of coil whine, much more than my Zotac; the Zotac is silent... I sent the EVGA back and am getting another Zotac AMP Edition. The Zotac overclocked better too.


----------



## smkd13

Getting ready to make the jump from a 550 Ti (yeah yeah yeah, stone age), but it has served its purpose. I just need more now. I have a conundrum: I have my eye on either the MSI 1070 Gaming X (colors go with my system), EVGA 1070 FTW, Gigabyte 1070 G1 (same as my motherboard), or Zotac 1070 AMP Extreme.

Now, I hate making decisions, so I will leave it to you. After reading through this thread, all of them seem to perform fine and each has its little nuances. Currently running a set of 1080p monitors but will hopefully be upgrading to 1440p soonish.

Much appreciate the input in advance.


----------



## danjal

Does Zotac make an SLI LED bridge or a high-bandwidth bridge? I couldn't find the Zotac brand on Newegg.

I run into a problem with my monitor being 75Hz: the single flex bridge only runs at 60Hz, so it downclocks my monitor's refresh rate. I need an SLI LED bridge at minimum, or a high-bandwidth bridge.


----------



## danjal

Quote:


> Originally Posted by *smkd13*
> 
> Getting ready to make that jump from a 550Ti (yeah yeah yeah, stoneage) but it has served its purpose. I just need more now. I have a conundrum. I have my eye on either the MSI 1070 Gaming X (colors go with my system), EVGA 1070 FTW, Gigabyte 1070 G1 (same as motherboard) and Zotac 1070 Extreme.
> 
> Now i hate making decision so i will leave it to you. All of them after reading through this tread seem to preform fine and each have there little nuances. Currently running a set of 1080 monitors but will hopefully be upgrading to 1440 soonish.
> 
> Much appreciate the input in advance.


I just sent back an EVGA FTW because of a lot of coil whine. My first Zotac was silent and overclocked better than the EVGA FTW, so I'm going to get another Zotac AMP Edition.

Something else I noticed: the EVGA LEDs don't turn a true red; it's more of a salmon red. The Zotac's lights do, and the yellow strip, which looks awful when you're looking at the card out of the box, looks great installed with the case lit up in red. I run a Phanteks Enthoo Pro M with the tinted panel and it looks great. I can't complain about my Zotac at all; plus it runs about 10°C cooler than the EVGA FTW did.

The 1070 AMP Extreme would be a good choice, or go up to the 1080 AMP Edition.

I run a 32" 2560x1440 Crossover monitor, and with a single card in BF4 I get around 130fps, same in Star Wars Battlefront and ArcheAge; in GTA V I get around 100fps. I was going to put the EVGA 1070 FTW in my backup rig, but decided to just stick with the RX 480 in it and SLI two of the Zotac 1070 AMP Editions, especially after finding the EVGA didn't overclock well and ran that much hotter than the Zotac.


----------



## BTCHSLP

Quote:


> Originally Posted by *ucode*
> 
> You do know that is 200W?


Are you sure, mate ?









I think the max power draw of the Sea Hawk X is ~158W (150W + 5% power limit),
and the EVGA SC's max is ~168W (150W + 12% power limit).

The SC boosts without any OC at 2000MHz (GPU).


----------



## Curseair

Quote:


> Originally Posted by *danjal*
> 
> I had a evga ftw and it had a bit of coil whine, much more than my zotac, the zotac is silent.... I sent to evga back and getting another zotac amp edition. The zotac overclocked better too.


Quote:


> Originally Posted by *danjal*
> 
> I just sent back an evga ftw because of a lot of coil whine.. My first Zotac was silent and overclocked better than the evga ftw.. Going to get another zotac amp edition..
> 
> Something else I noticed, the evga led lights dont get red, its more of a salmon red.. The zotac lights get red, and the yellow strip which looks awful when your looking at the card out of the box, looks great installed with the case lit up in red.. I run a phantek enthoo pro m with the tinted panel and it looks great. I cant complain about my zotac at all, plus it runs about 10celcius cooler than the evga ftw did.
> 
> The 1070 amp extreme would be a good choice, of go up to the 1080 amp edition.
> 
> I run a 32" 2560x1440 crossover monitor and with a single card in bf4 I get around 130fps, same in starwars bf, and Archeage.. gta5 I get around 100fps.... I was going to put the evga 1070ftw in my backup rig, but decided to just stick with the rx480 in it and sli two of the zotac 1070 amp editions, especially after finding the ega didnt overclock well and ran that much hotter than the zotac.


Could you please show me a picture of the Zotac installed in the case with red lighting if it's possible?


----------



## danjal

Quote:


> Originally Posted by *Curseair*
> 
> Could you please show me a picture of the Zotac installed in the case with red lighting if it's possible?


----------



## gtbtk

I made what I think is an interesting discovery last night experimenting with my MSI Gaming X 1070. This is still a work in progress and I don't have all the answers yet, but I thought I would share with the community nonetheless.

I noticed that the rated base-clock-to-boost-clock ratios don't match up between different brands of card, i.e. one brand will have a higher boost clock for a given OC base clock than a 1070 of another brand. Compared to the Founders Edition, the AIB OC boards have all, to varying degrees, just dragged the left side of the voltage/frequency curve higher: each adds its own minor tweaks to the lower end of the curve while leaving the right side pretty much stock, creating a flatter curve that assigns more headroom to the low end, where you don't seem to need it under load.

I am working with a sample of one, so YMMV, but at least on my card: if you use Afterburner, ctrl-click the left side of the graph and drag it down a touch to increase the slope and slightly lower the apparent base clock. Then, without increasing voltage from 0, drag the 1.071v point up above 2.1GHz and straighten up the curve for the other 1.0xx points so there isn't such a big step-shaped section just before 1.071v, and the card will run stable in Firestrike at the higher frequencies. 100% fan helps, as boost drops in steps as temperature rises, and the card only wants to operate in a fairly narrow temperature band for a given frequency range; if it falls outside that temp/frequency range, the card will crash. I am still trying to work out how to better manage that aspect.

On my initial attempts I did get the card to run at 2150MHz, though that was starting to artifact. I have not yet gone back that high to see what adding a little extra voltage will achieve. I have completed multiple runs through Firestrike at 2101MHz and it seems stable, with the Micron memory I have at +450, without artifacting and without any other fine tuning.

You guys might like to build on my observations and see how this works on other AIB cards. Maybe this is the key to making use of the extra power provided by the 8+6 and 8+8 cards?
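To make the shape of that curve edit concrete, here is a toy model of it. All voltage and frequency numbers below are invented for illustration (real curves have far more points than this):

```python
# Toy model of an Afterburner voltage/frequency curve edit.
# A curve is a list of (voltage, target_mhz) points; all values invented.
curve = [(0.800, 1700), (0.900, 1850), (1.000, 1975), (1.071, 2000), (1.093, 2025)]

def edit_curve(points, pin_voltage, pin_mhz, low_end_drop=25):
    """Drag the low-voltage end down a touch and pin one point up,
    flattening everything to its right so there is no step shape."""
    out = []
    for v, f in points:
        if v >= pin_voltage:
            f = pin_mhz                          # pin and flatten the right side
        else:
            f = min(f - low_end_drop, pin_mhz)   # lower the left side slightly
        out.append((v, f))
    return out

new_curve = edit_curve(curve, pin_voltage=1.071, pin_mhz=2100)
print(new_curve)
```

The result is a curve that still rises monotonically with voltage but tops out at the pinned frequency, which is roughly the shape gtbtk describes.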


----------



## ogow89

So I went all in today, opted for the max stable overclock on my GPU, and tested Firestrike. I think this is the max I will ever get with this crappy motherboard.

Core max boost 2139mhz

vram at 9514mhz.

50% fan speed kept the card under 55°C the whole time. The ambient temps dropped today, so the card stayed above 2100MHz for the entire run. Again, this is with a Phoenix Golden Sample.

My only regret is not paying a bit more for a better motherboard and going for an i7 back then.


----------



## ucode

Quote:


> Originally Posted by *BTCHSLP*
> 
> Are you sure, mate ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think the max power draw of the Sea Hawk X is ~158W (150W + 5% power limit),
> and the EVGA SC's max is ~168W (150W + 12% power limit).
> 
> The SC boosts without any OC at 2000MHz (GPU).


You're assuming 100% is 150W, which isn't always the case. For instance, there's an FTW 1070 VBIOS with 185W at 100% IIRC, and something like 225W max.

Tip: try using HWiNFO to show GPU watts rather than the % readings.
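As a quick sanity check on the percentages: a power target is just a multiplier on whatever the BIOS defines as 100% TDP, which, as noted, is not always 150W. The wattages below are the figures discussed in this thread, not official specs:

```python
# Power-target percentage -> watts, for a given base TDP.
# Base TDP values are the ones mentioned in this thread (assumptions).
def power_target_watts(base_tdp_w, pt_percent):
    return base_tdp_w * pt_percent / 100.0

print(power_target_watts(150, 105))  # Sea Hawk BIOS: 157.5 W
print(power_target_watts(150, 112))  # EVGA SC BIOS:  168.0 W
print(power_target_watts(185, 122))  # a 185 W-base VBIOS: ballpark of the ~225 W max
```

So the same "105%" means very different wattage depending on the VBIOS, which is why reading watts directly in HWiNFO is less ambiguous.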


----------



## BTCHSLP

Quote:


> Originally Posted by *ucode*
> 
> Your assuming 100% is 150W which isn't always the case. For instance there's a FTW 1070 VBIOS with 185W at 100% IIRC and something like 225W max.
> 
> Tip. Try using HWiNFO to show GPU Watts rather than the % readings.


That's correct.
The SC and Sea Hawk have the same (reference) PCB.
The FTW has a custom PCB with two 8-pin connectors and is built for higher power output.

Ok - thank you








I will bench it with 3DMark Time Spy for 10 minutes and post screens of HWiNFO's sensor status.


----------



## BTCHSLP

Here some results:

*SC-BIOS*
PT: 112 %
TT: 92 °C
Voltage: 100 %
GPU Off: +0 MHz
MEM Off: +0 MHz


_Heaven - DX11_


_TimeSpy - DX12_

*SeaHawk-BIOS*
PT: 105 %
TT: 92 °C
Voltage: 100 %
GPU Off: +0 MHz
MEM Off: +0 MHz


_Heaven - DX11_


_TimeSpy - DX12_


----------



## criminal

Quote:


> Originally Posted by *Prozillah*
> 
> Nice job Crim - I got my grizzly coming soon. Gona be doing the same mod on my G1.
> 
> What bios are you using that gives you 120%? Max on this bad boy is 111% from memory


I'm using the default BIOS, which allows a 112% power target. TDP in GPU-Z for this card has always gone to 121%, so I can't explain what's going on there. All I know is it now tops out at 117% with more consistent clock speeds. I'm sure someone who understands power target/TDP/power consumption can explain what's happening.


----------



## kevindd992002

Which is the best 1070 AIB around? As always, I'm after the best overclocker that also has nice thermals and a quiet noise profile, as I live in a tropical country.

My two 670s are using Arctic Cooling's Hybrid Cooler (AIO) and Xtreme III, but I'm not sure if I'd be able to transfer one (specifically the AIO) to the single 1070 that I'll be buying. Any ideas?

Thanks.


----------



## ogow89

Palit Super Jetstream and Gainward Phoenix Golden Sample. Some say the Zotac AMP Edition also delivers over 2100MHz on the core; the rest are pretty much a lottery.

I had an MSI Gaming X before the Golden Sample I have now, and it topped out at 1925MHz. Mine now peaks at 2169MHz, though only for a brief moment, and stabilizes at 2139MHz with the max overclock.


----------



## drunkonpiss

hey guys,

I'm using a Palit Jetstream. I thought I could save a few bucks going for it instead of the Super Jetstream. My question is: is there a way I can adjust the settings to match the Super Jetstream, i.e. copy its base and boost clocks?


----------



## ogow89

Quote:


> Originally Posted by *drunkonpiss*
> 
> hey guys,
> 
> using a Palit Jetstream. Thought I can save a few bucks going for it instead of the Super Jetstream. My question is, is there a way I can adjust the settings the same as Super Jetstream which means I can copy the base and boost clock of it?


Use MSI Afterburner: overclock and test thoroughly at stock voltage and stock power limit, and if it's stable, flash the BIOS of the Super Jetstream. Look it up; there is a website that has a whole list of them. If it isn't there, I can just give you mine; it's basically the same as the Super Jetstream's.


----------



## drunkonpiss

Quote:


> Originally Posted by *ogow89*
> 
> MSI afterburner, overclock and test it thoroughly at stock voltage and with stock powerlimit, and if stable flash the bios of super jetstream. Look it up, there is a website that has a whole list of them. If it isn't there, i can then just give you mine, it's basically the same as the one from super jetstream.


I would appreciate it if you could share it, or point me to the site where I can get the BIOS to flash. I've never done it before.







How are your temps and stability after flashing the BIOS?


----------



## ogow89

Quote:


> Originally Posted by *drunkonpiss*
> 
> I would appreciate it if you could share or point me to the site where I can flash my BIOS settings. Never done it before
> 
> 
> 
> 
> 
> 
> 
> How are your temps and stability after flashing the BIOS?


If you have never flashed a GPU before, then I'd say better not to do it; opt for software overclocking via Afterburner instead. As for me, I didn't flash my GPU; I got the Gainward Phoenix Golden Sample, which is the same as Palit's Super Jetstream. Temps never go beyond 65°C, though that depends on your ambient temps.

Just use MSI Afterburner, or the Palit overclocking software that came with your card.


----------



## saunupe1911

Anyone got their memory to 9500MHz or higher? I got my Strix to 9400MHz last night, but I believe I saw a few artifacts when temps hit 70C. I adjusted my fan profiles to stabilize around the low 60s at 80% fan speed. I think the golden-chip test for this Samsung GDDR5 memory is whether it can do over 9500MHz while easily keeping temps down, and I'm not running my fans above 80% to achieve that... too loud. I'm interested in seeing what the MSI Seahawk and other liquid-cooled GPUs are maxing out at.


----------



## Dude970

My MSI Gaming X will OC mem pretty high. I have Samsung memory


----------



## bpmcleod

Quote:


> Originally Posted by *madmeatballs*
> 
> Hmmmm, I noticed if your temps are lower than 45c I get the ability to overclock more (stable; well, maybe at least for my 1070, yours could be completely different). Hours ago I had *h*t stock fans installed on my Kraken x41 temps were playing around 49-52C. Now I replaced it with higher static pressure fans (helix 140mm from my H240x) I noticed my temps were at 41-42C consistent (this was on heaven bench btw). To be honest these helix aren't the best ones out there as I have replaced my H240-x' fans with Noctua IPPC which done a very great good cooling the rad.
> 
> Now, knowing that I had a significant temperature improvement by switching fans I tried to push my overclock. Now it is stable at 2075MHz( will push it till unstable) and I'm guessing once I hit 45C ~ 50C it will start limiting me from overclocking. I'm not sure what kind of sh** this is but I'm thinking if you can hold temps lower than 45C you can max your card out. Initial when I had the stock cooler for this 1070 amp extreme I was only able to OC +30 on core (temps were hitting 59C-60C) and now I can do +50 (2075MHz[41-42C]). Maybe I can hit 2100 now, I'll give it a shot and update when I get results. Now this is just my opinion and I am not an expert. (I stress test with firestrike stress test and 30mins of furmark + a few hour of bf4 gameplay)
> 
> Update:
> 
> Okay I tried +75 off the bat for 2100MHz too bad my card cant make it lol! Temps was still at 41-42C.
> So far +70 is looking good tho. This might be it's limit.
> 
> Update 2:
> +70 is confirmed stable!
> 
> 
> 
> 
> 
> 
> 
> 
> actually for some reason +70 is still 2100MHz LOL. It is weird MSI afterburner, GPU-z, CAM, HWMonitor says both +70 and +75 results me to 2100MHz
> +75 wasn't stable and +70 was stable both.
> 
> Update 3:
> Something weird is going on from +60 ~ +75 core its 2100MHz
> +80 is 2114MHz tho weird.
> 
> Update 4:
> now +60 became 2088MHz and +65 is now 2100MHz. still same temps.


I'm not sure if someone already commented on this, but the cards have boost bins; it's been this way since the 600 series, I think. Every range of X MHz on the core boosts to the same speed, which is why you are seeing +60 through +75 all going to 2100. That means if +60 is stable and +76 is not, just put it on +60; there's no reason to run +70 or +75, as you get the same clock regardless. This isn't anything weird; it's just how boost is designed.
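A rough sketch of that binning behaviour (the ~13MHz bin width and the boost clock used here are assumptions based on the clocks people report in this thread, not official figures):

```python
# Rough model of GPU Boost clock bins: the requested clock snaps down to
# a multiple of the bin size, so a range of offsets yields the same clock.
BIN_MHZ = 13  # assumed bin width

def effective_clock(boost_mhz, offset_mhz, bin_mhz=BIN_MHZ):
    requested = boost_mhz + offset_mhz
    return (requested // bin_mhz) * bin_mhz  # snap down to the bin boundary

# With an (invented) 2035 MHz boost, +60/+65/+70 land in the same bin
# while +75 tips over into the next one:
for off in (60, 65, 70, 75):
    print(off, effective_clock(2035, off))
```

This is why chasing a bigger offset inside the same bin buys nothing: only the offset that crosses a bin boundary changes the effective clock.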


----------



## bpmcleod

Quote:


> Originally Posted by *lifeisshort117*
> 
> I tried Afterburner's curve mode last night but to no avail. Even locked at 1.093v, a higher clock speed couldn't be achieved compared to using the normal sliders.
> 
> This is good and bad though - good that Afterburner is directing the correct amount of voltage to the card for a given clock speed, but bad that I couldn't get a higher clock speed lol.


I noticed on my MSI Gaming 1070 that at stock voltage with just the max PT slider my clocks were stable at +80/+100, but with 1.09v that becomes unstable. My card does not seem to like extra voltage. I tested those clocks in Firestrike/Time Spy and Heaven: all stable, no artifacts. The drivers crash with more volts though. Not sure if it's MSI Afterburner or my card.


----------



## saunupe1911

Quote:


> Originally Posted by *Dude970*
> 
> My MSI Gaming X will OC mem pretty high. I have Samsung memory


Nice! You're making me want to shoot for 9600MHz. But no way am I running those fans at 100%. These 1070s get hot at those memory clock speeds, and I'm not talking about running them just for benchmarking; I'm referring to everyday maxed-out gaming. It also seems like it can sit at 1.09 volts all day because its temps are still in the 40s and 50s. As long as it lasts the 3 years of my warranty I'm good lmao


----------



## drunkonpiss

Quote:


> Originally Posted by *ogow89*
> 
> If you never flashed a GPU before, then I say better not to do it, and opt for software overclocking via Afterburner. As for me, I didn't flash my GPU; I got the Gainward Phoenix Golden Sample, which is the same as the Super Jetstream from Palit. Temps never go beyond 65°C, though that depends on your ambient temps.
> 
> Just use MSI Afterburner, or the Palit overclock software that came with your card.


Kinda risky, huh? Is there a way to do it safely? I can always revert to my original BIOS, right?


----------



## bigjdubb

Quote:


> Originally Posted by *drunkonpiss*
> 
> Kinda risky huh? is there a way to do it safely? I can always revert to my original BIOS, right?


If your graphics card has a dual BIOS then the risk is very low, if it only has a single BIOS there is always the possibility of bricking the card. It's not all that risky or difficult to do but I don't really see any reason to flash a BIOS at this point.


----------



## ogow89

Quote:


> Originally Posted by *drunkonpiss*
> 
> Kinda risky huh? is there a way to do it safely? I can always revert to my original BIOS, right?


First things first: how much overclock are you getting out of your GPU? Overclock the card via software with either MSI Afterburner or the tool that came with your card. Don't just flash it for the heck of it if you don't even know whether your card is stable at higher clocks. You can always flash it back, and your GPU came with two BIOSes as far as I know.

But before you flash a $500 card, void the warranty, and run the risk of bricking it, just overclock like everybody else.

There is no completely safe way of flashing a card; you are always doing it at your own risk, and that applies to any kind of flashing.


----------



## drunkonpiss

Quote:


> Originally Posted by *bigjdubb*
> 
> If your graphics card has a dual BIOS then the risk is very low, if it only has a single BIOS there is always the possibility of bricking the card. It's not all that risky or difficult to do but I don't really see any reason to flash a BIOS at this point.


Mine has a dual BIOS. I'm using a Palit Jetstream but I want to boost it the same as the Super Jetstream, hence I'm considering flashing the BIOS, though I understand the risks. I just want a faster clock speed with a very minimal impact on my temps.


----------



## smkd13

Quote:


> Originally Posted by *danjal*
> 
> I just sent back an evga ftw because of a lot of coil whine.. My first Zotac was silent and overclocked better than the evga ftw.. Going to get another zotac amp edition..
> 
> Something else I noticed, the evga led lights dont get red, its more of a salmon red.. The zotac lights get red, and the yellow strip which looks awful when your looking at the card out of the box, looks great installed with the case lit up in red.. I run a phantek enthoo pro m with the tinted panel and it looks great. I cant complain about my zotac at all, plus it runs about 10celcius cooler than the evga ftw did.
> 
> The 1070 amp extreme would be a good choice, of go up to the 1080 amp edition.
> 
> I run a 32" 2560x1440 crossover monitor and with a single card in bf4 I get around 130fps, same in starwars bf, and Archeage.. gta5 I get around 100fps.... I was going to put the evga 1070ftw in my backup rig, but decided to just stick with the rx480 in it and sli two of the zotac 1070 amp editions, especially after finding the ega didnt overclock well and ran that much hotter than the zotac.


Thank you for the reply, bud. I have always liked the Zotac cards; this definitely helped with my decision quite a bit. It should look pretty good in the Evolv too. Those are some good numbers for 1440p off one card. Now I am definitely looking forward to the upgrade.

P.S. I wish I had the money for a 1080. That would be happening in a heartbeat


----------



## Powergate

Palit GeForce GTX 1070 Super Jetstream BIOS and NVflash:

nvflash.zip 1154k .zip file
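For anyone grabbing this, the usual nvflash workflow looks roughly like the below. Treat the exact flags as assumptions and check `nvflash --help` on your build before touching anything; `sjs.rom` is a placeholder filename for the Super Jetstream image.

```shell
# Back up the current BIOS first -- this is what you revert to later.
nvflash --save original.rom

# Disable the EEPROM write protection.
nvflash --protectoff

# Flash the new image; -6 overrides the PCI subsystem ID mismatch
# you get when crossing models (e.g. Jetstream -> Super Jetstream).
nvflash -6 sjs.rom

# To revert: nvflash -6 original.rom, then reboot.
```

Run it from an elevated prompt, and don't power off mid-flash.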


----------



## rv8000

It appears I've angered the Micron gods, and they have unleashed their wrath upon me. Another MSI Gaming with Micron, white checkers everywhere


----------



## bigjdubb

Quote:


> Originally Posted by *drunkonpiss*
> 
> Mine has a Dual BIOS. Im using a Palit Jetstream but I would want to boost it the same as the Super Jetstream hence i'm considering flashing the BIOS but I understand the risks. I just want a faster clock speed with a very minimal impact to my temps.


A different BIOS is not going to get you more clock speed at a lower temperature. If the card you are getting the BIOS from runs higher clock speeds at lower temps than your current card it is most likely because of the different cooling solution, not the BIOS.


----------



## SupernovaBE

Looks like my Gaming X has Samsung chips; they clock well.
Just my voltage is stuck at 1.043...

Stable clocks for heaven


----------



## amd7674

Quote:


> Originally Posted by *SupernovaBE*
> 
> Looks like my gaming X have samsung chips, they clock good.
> just my voltage is stuck at 1.043..
> 
> Stable clocks for heaven


Very nice... I would consider buying the MSI Gaming X card to match my MSI mobo; however, I do not want to take part in the VRAM lottery.









So for me it is a tight horse race between the Strix OC, EVGA FTW, and Zotac AMP Extreme.


----------



## drunkonpiss

Appreciate all the responses, guys! Will consider everyone's advice!


----------



## tps3443

How do I join the club? I submitted a GPU-Z verification link, and a fire strike screen shot. What else do I need?

EVGA GTX 1070 ACX 3.0 Super Clocked 8GB

https://www.techpowerup.com/gpuz/details/hya7r


----------



## Dude970

Sign up form on the OP


----------



## BTCHSLP

I flashed my FE back to its standard BIOS:

Here some results of 3Dmark TimeSpy-benchmark:

http://www.3dmark.com/spy/263507

PT: 112%
TT: 92 °C
Voltage: 100%
GPU Offset: +200 MHz
MEM Offset: +800 MHz


----------



## Dude970




----------



## gtbtk

Quote:


> Originally Posted by *rv8000*
> 
> It appears I've angered the Micron gods, and they have unleashed their wrath upon me. Another MSI Gaming with Micron, white checkers everywhere


At what settings? Mine will do that at times somewhere above +520 with high voltage settings; below that it seems OK.


----------



## Prozillah

Quote:


> Originally Posted by *gtbtk*
> 
> At what settings? Mine will do that at times somewhere above +520 and high voltage settings, below it seems ok


I got Samsung memory on the G1 and it won't boot into Windows with anything more than a +50 mem OC. However, when I lock the voltage it will happily do over a +600 mem OC. These need some serious BIOS tweaking....


----------



## PureBlackFire

So, got a Gigabyte GTX 1070 G1. Cannot install drivers on Windows 10. Also, this card came with nothing in the box. Really? And the fan shroud is plastic. What was wrong with the 980 Ti G1 design? And the orange stripes on this card... Anyone know the issue with the driver install?


----------



## jrcbandit

So the highest clock I've been able to stably run is 2075MHz core (+75 overclock on my FTW edition 1070), and when the card gets hot (65+ C) in games it can drop to 2050. Does this mean water cooling would pretty much only give me the extra +25 core speed, since extra voltage doesn't do much for these cards? If I try a +100 overclock to get 2100 core, the game or benchmark will crash, and I suspect better cooling won't do much. My card's temperature is usually around 50-60C when gaming, although some games can push it to 65C, where the core drops from 2075 to 2050.


----------



## Forceman

Quote:


> Originally Posted by *jrcbandit*
> 
> So the highest clock I've been able to stably get is 2075 Mhz core (+75 overclock on FTW edition 1070) and when the card gets hot (65+ C) in games it can drop to 2050 core. Does this mean water cooling would pretty much only give me the extra +25 core speed since extra voltage doesn't do much for these cards? If I try +100 overclock to get 2100 core, the game or benchmark will crash and I suspect better cooling wont do much. My card's temperature is usually around 50-60 C when gaming, although some games can push it to 65 C where the core drops from 2075 to 2050.


Unless and until we get BIOS mods, then yes, the only thing going under water is really going to do for performance is limit the temperature throttling.


----------



## jrcbandit

Quote:


> Originally Posted by *Forceman*
> 
> Until, and unless, we get BIOS mods, then yes, the only thing going under water is really going to do for performance is limit the temperature throttling.


Ah, that's what I figured. Watercooling currently only makes sense for SLI configurations. And if I ever do go back to SLI, I'd likely only watercool the top card, since the bottom one should have adequate airflow for proper cooling. Unless of course we get BIOS mods; then watercooling both cards could be beneficial.

My two radiators are going to waste with only cooling the CPU







. But buying a $130 waterblock (+$30? for a backplate, since I assume the default backplate won't work) for almost no benefit on a single card would also be a waste.


----------



## Tcoppock

Time Spy 

GPU-Z Validation https://www.techpowerup.com/gpuz/details/z9yy2

Core+120
Memory+750
EVGA GTX 1070 SC
I5 6600k 4.7ghz

Does this score seem normal for this setup?


----------



## danjal

Quote:


> Originally Posted by *Tcoppock*
> 
> Time Spy
> 
> GPU-Z Validation https://www.techpowerup.com/gpuz/details/z9yy2
> 
> Core+120
> Memory+750
> EVGA GTX 1070 SC
> I5 6600k 4.7ghz
> 
> Does this score seem normal for this setup?


Zotac 1070 amp edition, [email protected]


----------



## danjal

Quote:


> Originally Posted by *jrcbandit*
> 
> Ah that's what I figured. Watercooling currently would only make sense for SLI configurations. And if I ever do go back to SLI, I'd likely only watercool the top card since the bottom one should have adequate air flow for proper cooling. Unless of course we get bios mods, then watercooling both cards could be beneficial.
> 
> My two radiators are going to waste with only cooling the CPU
> 
> 
> 
> 
> 
> 
> 
> . But buying a $130 waterblock (+$30? for backplate since I assume default backplate wont work) for almost no benefit on a single card would also be a waste.


Why wouldn't it help? If you can keep them below 50 Celsius it should help quite a bit..


----------



## Majentrix

Quote:


> Originally Posted by *Tcoppock*
> 
> Time Spy
> 
> GPU-Z Validation https://www.techpowerup.com/gpuz/details/z9yy2
> 
> Core+120
> Memory+750
> EVGA GTX 1070 SC
> I5 6600k 4.7ghz
> 
> Does this score seem normal for this setup?



http://www.3dmark.com/spy/24775

Looks similar to mine, you're fine.


----------



## danjal

I have a question regarding an SLI bridge... I returned an EVGA FTW card for another Zotac 1070 AMP Edition and I'm thinking about SLI'ing the 1070s... or I might just buy a Zotac 1080 AMP Edition; I haven't definitely made up my mind yet.

My question is: I'm running a Gigabyte Z170X-Gaming 5 motherboard, so what SLI bridge would I need if I go the SLI route? Zotac doesn't sell their own branded bridges, so I guess I'll just get an NVIDIA brand bridge and paint it to match.

I just need to know what length SLI bridge I need, and I can't find it anywhere... I'm guessing the medium length. Any help would be appreciated.


----------



## Majentrix

You would need an SLI bridge with one space between the two cards, so yes, the medium length one is what you want.
Also, you don't need an HB SLI bridge unless you're running 4K or above; you won't see any performance increase at lower resolutions.


----------



## danjal

Quote:


> Originally Posted by *Majentrix*
> 
> You would need an SLI bridge with one space between the two cards. Yes, the medium length one is what you want.
> Also you don't need an HB SLI bridge unless you're doing 4k or above, you won't see any performance increase at lower resolutions.


I need at least an LED bridge; my monitor is a 75Hz monitor and the regular flex bridge caps the refresh rate at 60Hz..


----------



## bpmcleod

Anyone else noticing throttling issues based solely on the amount of voltage you put through the card? My card failed +70/+700 with +100mv on the slider but passed at +75mv, because at +100mv the card was clocking 2050 core while at +75mv it throttled down to 2025/2012 (it would bounce between those two periodically). The temp of my card never exceeds 54C. The only thing changing between the two runs was the +25mv difference.


----------



## leongws

Hi, I got a Zotac 1070 AMP Edition. It is connected to one DisplayPort monitor (U2414H, main display) and one HDMI TV. I use the "PC screen only" option (U2414H) for my PC, switch to extended mode to watch movies on the TV, and switch back to "PC screen only" when I finish watching.

I discovered recently that when I boot the PC, the splash screen is shown on the TV and not my main monitor, even though extended mode isn't active. My main display, the U2414H, has no signal until it reaches the Windows login screen, at which point the TV loses signal (which should have been the case from the start, since extended mode wasn't active).

I only get an immediate signal to my monitor, showing the splash screen, after I remove the HDMI cable to the TV. I'm not sure why, with both the HDMI and DisplayPort cables plugged in, the system shows the splash screen on the HDMI device first rather than on the main DisplayPort one, even though HDMI isn't active in extended mode. Other than this small issue the card is running fine. Has anyone had this issue, and how do you solve it?


----------



## danjal

Quote:


> Originally Posted by *bpmcleod*
> 
> Anyone else noticing throttling issues based solely on the amount of voltages you put through the card? My card failed +70/+700 at +100mv on the slider but passed at +75 because at 100mv the card was clocking at 2050 core and at +75mv the card throttled down to 2025/2012 (would bounce between these two amounts periodically). The temp of my card never exceeds 54c. The only thing changing between the two runs would be the +25mv difference.


I notice throttling at or near 60C and at 70C..
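That matches how GPU Boost throttling is usually described: one clock bin dropped per temperature threshold. A minimal sketch, treating the 60C/70C breakpoints above as the thresholds and assuming the commonly reported ~13 MHz bin size (both are assumptions, not spec values):

```python
# Hedged model of temperature-based boost throttling: one bin dropped
# at each threshold. Thresholds come from the observation above.
BIN_MHZ = 13
THRESHOLDS_C = (60, 70)

def throttled_clock(max_boost_mhz: int, temp_c: float) -> int:
    """Estimate the boost clock after temperature throttling."""
    bins_dropped = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return max_boost_mhz - bins_dropped * BIN_MHZ

print(throttled_clock(2088, 55))  # below both thresholds: full boost
print(throttled_clock(2088, 63))  # one bin down
print(throttled_clock(2088, 72))  # two bins down
```

Real cards may use more thresholds and also drop bins for power limits, so treat this as the shape of the behavior rather than exact numbers.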


----------



## gtbtk

Quote:


> Originally Posted by *Prozillah*
> 
> I got Samsung memory on the g1 and it won't boot into Windows with anything more than 50+ mem oc. However when I lock the voltage it will happily do over 600+ mem oc. They need some serious bios tweaking....


How do you get the overclock to stick between reboots?


----------



## criminal

Quote:


> Originally Posted by *bpmcleod*
> 
> Anyone else noticing throttling issues based solely on the amount of voltages you put through the card? My card failed +70/+700 at +100mv on the slider but passed at +75 because at 100mv the card was clocking at 2050 core and at +75mv the card throttled down to 2025/2012 (would bounce between these two amounts periodically). The temp of my card never exceeds 54c. The only thing changing between the two runs would be the +25mv difference.


I wouldn't even bother adding any additional voltage. It does nothing for my overclock.


----------



## bpmcleod

Quote:


> Originally Posted by *criminal*
> 
> I wouldn't even worry with adding any additional voltage. Does nothing for my overclock.


It did a little for mine. At +75mv I am stable at 2038/2364, so not too shabby. I haven't tried much more on the memory. I got artifacts at +750, but that was at 0mv extra, so maybe I can get there at +75mv? I'm not sure as of right now. I will admit though, the extra voltage isn't much :-\

http://www.3dmark.com/3dm/14081949?


----------



## Prozillah

Quote:


> Originally Posted by *gtbtk*
> 
> How do you get the overblock to stick between reboots?


I make sure Afterburner doesn't load the OC at boot and instead loads a custom profile on both 2D and 3D applications, both with the voltage locked. As long as the volts are higher than 0.800 I can set the memory OC without it ****ting itself, chucking blocky snow all over the desktop and hardlocking.


----------



## tps3443

Quote:


> Originally Posted by *smkd13*
> 
> Getting ready to make that jump from a 550 Ti (yeah yeah yeah, stone age) but it has served its purpose; I just need more now. I have a conundrum: I have my eye on either the MSI 1070 Gaming X (colors go with my system), EVGA 1070 FTW, Gigabyte 1070 G1 (same as motherboard), or Zotac 1070 Extreme.
> 
> Now I hate making decisions, so I will leave it to you. All of them, after reading through this thread, seem to perform fine, and each has its little nuances. Currently running a set of 1080p monitors but will hopefully be upgrading to 1440p soonish.
> 
> Much appreciate the input in advance.


I'm gaming at 1080p with my GTX 1070 ACX 3.0 SC. It is really overkill.

But I plan to get an Acer X34 Predator soon: 3440x1440 at 100Hz, and a GTX 1070 does great at that resolution.

The GTX 1070/1080 are great cards for the money!

A lot of power for $400


----------



## Hunched

I think Rise of the Tomb Raider is the best stress test for VRAM.
It uses over 7.8GB at times at 1080p with almost max settings (no DoF, motion blur, lens flare, or other stupid little things), and I don't even have SSAA enabled.

I don't know why I haven't seen this game get tons of graphical praise; it looks leagues better than Crysis 3 and The Witcher 3.
Easily the best looking game I've ever played.
It looks as good as when games get shown at E3 before the downgrade, except this one didn't actually get downgraded.


----------



## tps3443

Quote:


> Originally Posted by *Hunched*
> 
> I think Rise of the Tomb Raider is the best stress test for VRAM.
> Uses over 7.8gb at times at 1080p with almost max settings (no DoF, motion blur, lens flare, or other stupid little things)
> I don't even have SSAA enabled either.
> 
> I don't know why I haven't seen this game get tons of graphical praise, it looks leagues better than Crysis 3 and Witcher 3.
> Easily the best looking game I've ever played.
> It looks as good as when games get shown at E3 before the downgrade, except this one didn't actually get downgraded.


WOW! That is a lot of VRAM! I have never played the game, but I think I will get it now and test it on my GTX 1070 too.

Hey everyone, I had a concern. My 3-day-old EVGA GTX 1070 ACX SC has a faint clicking noise in one of the fans. It is audible, but you've got to remember it and listen for it to notice. Should I contact EVGA and get another card, or just deal with it? It gets a little louder when the fans spin up under gaming conditions, but it is a really soft clicking noise. I do not see it rubbing anything at all; it's not clipping a wire or the heatsink. I think it is coming from inside the fan motor itself.

I have never RMAed a video card before.

If it is quick and painless I may call EVGA, but if it takes a long time then I might just deal with it. I just hope the one fan does not die!

I'm loving this GTX 1070 after 72 hours in!! SO much fun! It has filled my room with that hot fresh pool float NEW VIDEO CARD SMELL! LMAO


----------



## danjal

Quote:


> Originally Posted by *tps3443*
> 
> WOW! That is a lot of Vram! I have never played the game. But, I think I will try to get it now and test on my GTX 1070 also.
> 
> Hey everyone! I had a concern. My 3 day old EVGA GTX 1070 ACX SC has a faint clicking noise in one of the fans. It is audible. But, you've got to remember it, and listen for it to notice the noise. Should I contact evga and get another card? Or just deal with it? it gets a little louder when the fans spin up, under gaming conditions. But, it is a really soft clicking noise. I do not see it rubbing anything at all, its not clipping a wire, or the heatsink. I think it Is going on inside the fan motor itself.
> 
> I have never RMAed a video card before.
> 
> If it is quick, and painless. I may call evga. But, if it takes a long time then I might just deal with it. I just hope the one fan does not die!
> 
> I'm loving this GTX1070 after 72 hours in!! SO much fun! It has filled my room, with that hot fresh pool float NEW VIDEO CARD SMELL! LMAO


You might take the card out and look at it; you might have a wire hitting somewhere you can't see...

Also feel the fans and see if you can detect a bad bearing.


----------



## CaptainZombie

I haven't OC'd anything for these benchmarks, but how do these numbers look? I'm trying to see if this FTW is worth holding on to or not. I'm running the benchmarks at 1440p.

I'm running an i7 4790K, a 1070 FTW, and 16GB DDR3 3200.


----------



## criminal

Quote:


> Originally Posted by *CaptainZombie*
> 
> I haven't OC anything for these benchmarks, but how does these numbers look? I'm trying to see if this FTW is worth holding on to or not. I'm running these at 1440p res for the benchmarks.
> 
> I'm running an i7 4790k, 1070 FTW, and 16gb DDR3 3200.


Look about right to me.


----------



## rv8000

Quote:


> Originally Posted by *gtbtk*
> 
> At what settings? Mine will do that at times somewhere above +520 and high voltage settings, below it seems ok


Anything above +100 on the memory. Certain games were artifacting at stock as well (FO4 and GW2).


----------



## 113802

Quote:


> Originally Posted by *CaptainZombie*
> 
> I haven't OC anything for these benchmarks, but how does these numbers look? I'm trying to see if this FTW is worth holding on to or not. I'm running these at 1440p res for the benchmarks.
> 
> I'm running an i7 4790k, 1070 FTW, and 16gb DDR3 3200.


Make sure to switch to the slave BIOS and restart your computer a few times. The power limit will reach 122% instead of 112%.

http://www.3dmark.com/3dm/14095543?


----------



## benjamen50

Quote:


> Originally Posted by *WannaBeOCer*
> 
> Make sure to switch the to the slave bios and restart your computer a few times. The power limit will reach 122% instead of 112%
> 
> http://www.3dmark.com/3dm/14095543?


I'm tempted to use the switch while the computer is on, but I know that's a bad idea, so I'll do it with the computer switched off and unplugged.


----------



## CaptainZombie

Quote:


> Originally Posted by *criminal*
> 
> Look about right to me.


Quote:


> Originally Posted by *WannaBeOCer*
> 
> Make sure to switch the to the slave bios and restart your computer a few times. The power limit will reach 122% instead of 112%
> 
> http://www.3dmark.com/3dm/14095543?


Thanks I'll also look at adjusting the power limit.


----------



## USlatin

Haven't read the whole thread and just got my 1070 SeaHawk; could someone please help me out? I am trying to make sure I got a good silicon lottery sample while I can still return it for a different one.









*1. What are you guys using to check the boost clock?*

The only game I play doesn't stress the card enough and furmark is too synthetic and also hasn't shown me the highest clock that I've noticed the card go up to.

*2. Also, what are you guys using to test stability?*

I would like to do an overnight burn since I will use the card as an accelerator for video editing and I want the OC to be rock solid. I tried a quick and dirty OC (haven't had the time for a step-by-step pass/fail) and I don't seem to get much out of it with the mV set to +100... I see all these people posting 2.0+ GHz and I wonder if that is just gaming/benching stable... or if I got a bad ticket in the silicon lotto

The card runs STUPID cold, and my case temp dropped 10C or more, which dropped my CPU temps by a good 5C or so


----------



## 113802

Quote:


> Originally Posted by *USlatin*
> 
> Haven't read the whole thread and just got my 1070 SeaHawk, could someone please help me out? I am trying to make sure I got a good silicone lottery sample while I can still return for a different one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *1. What are you guys using to check the boost clock?*
> 
> The only game I play doesn't stress the card enough and furmark is too synthetic and also hasn't shown me the highest clock that I've noticed the card go up to.
> 
> *2. Also, what are you guys using to test stability?*
> 
> I would like to do an overnight burn since I will use the card as an accelerator for video editing and I want the OC to be rock solid. I tried a quick and dirty OC (haven't had the time for a step by step pass/fail) and I don't seem to get much out of it with the mV set to +100... I see all these people posting 2.0+ GHz that I wonder if that is just gaming-benching stable... or if I got a bad ticket for the silicone lotto
> 
> Card does run STUPID cold, and my case temp dropped 10C or more, which dropped my CPU temps by a good 5C or so


Use the GPU-Z Render Test alongside MSI Afterburner to check the max boost clock. All the synthetic tests I run pass at high clocks, but when playing Overwatch I notice the entire screen changes colors when I have a bad overclock. My max stable overclock is +90MHz core / +600 memory.
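If you'd rather log clocks than watch Afterburner by eye, `nvidia-smi --query-gpu=clocks.sm,temperature.gpu --format=csv,noheader -l 1` dumps them once a second. Here's a hedged sketch of pulling the peak boost clock out of such a log; the sample lines below are made up, not real output from my card:

```python
import csv
import io

# Example of what the nvidia-smi CSV log looks like (fabricated values):
sample_log = """\
2012 MHz, 44
2050 MHz, 47
2038 MHz, 52
"""

def max_boost(log_text: str) -> int:
    """Return the highest SM clock seen in the log, in MHz."""
    clocks = []
    for row in csv.reader(io.StringIO(log_text)):
        # row[0] looks like "2012 MHz"; keep just the number.
        clocks.append(int(row[0].strip().split()[0]))
    return max(clocks)

print(max_boost(sample_log))  # -> 2050
```

Pipe the real command's output to a file during a gaming session and feed it through this to see where the card actually settles.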


----------



## USlatin

OK, I updated to GPU-Z 1.10.0. The updated version shows more realistic clocks than the old version: it claims 1582MHz core, 2002MHz memory, and a 1772MHz boost. It also detects CUDA, which the old version didn't. I ran the Render Test and the boost stayed at 1772MHz.

So I guess now I should do a step-by-step pass/fail with the voltage at +100, correct?


----------



## 113802

Quote:


> Originally Posted by *USlatin*
> 
> OK, I updated to GPU-Z 1.10.0. The updated version does show me a more realistic clocks than the old version I of GPU-Z, it claims 1582MHz/2002MHz and a 1772MHz boost. It also detects CUDA which the old version didn't. I ran the Render Test and the boost stayed the same at 1772MHz.
> 
> So I guess now I should to a pass/fail step by step with the voltage at +100, correct?


Odd, it should boost higher than 1772. Are you looking at the MSI Afterburner GPU clock when running the render test? For example, my card at stock voltage/clocks boosts to 2012MHz core.

I run the Render Test and look at MSI Afterburner to confirm the boost clock. Make sure you are using Beta 4 of Afterburner, or change the theme to the Cyborg skin.

Also, yeah, overclock the card to +100!

Here's an example:


----------



## USlatin

Boom! Rep+

Getting 2.1GHz during the GPU-Z Render Test; will stress test it tonight (and today, by gaming for about 10 hours straight hahahah!)








*
What should I use to stress test overnight?*


----------



## Forceman

I don't think there is much value in testing it overnight. Run Valley or Heaven on loop for 30 minutes or an hour and that should be good enough.


----------



## Wollowon

Core: 2.1GHz (stable)

Memory: 9GHz

Power and voltage maxed.

Card model: ASUS Strix OC

Is this OC normal for a Strix?

This OC is stable, tested with GTA V and The Witcher 3 (4-5 hours).


----------



## USlatin

Welp, the one game I play isn't very heavy: War Thunder. So it doesn't really help test stability; the clock hangs out at 1164MHz with max settings at 1440p 60Hz. But once I get my second DP cable I will add two 24" 1920x1200 monitors beside my 32" 1440p, and that will likely push the 1070 to its limit. It is exactly the same pixel count as 4K.


----------



## tps3443

Is there a way to get my card to a 122% power limit? It's an SC ACX, not an FTW.

Also, on default settings the card will boost to 1987MHz in Kombustor, and the Kombustor GPU monitor says "Voltage limit reached". Does increasing voltage really help?

How do I know if I am exceeding the power limit? Should it stick at 112% on the graph, or move around some?


----------



## benjamen50

Is the TDP % reading in GPU-Z used to check if the GPU is under its power limit? Or is it the power usage in EVGA Precision X?


----------



## TheLAWNOOB

Yes


----------



## Prozillah

Quote:


> Originally Posted by *WannaBeOCer*
> 
> Make sure to switch the to the slave bios and restart your computer a few times. The power limit will reach 122% instead of 112%
> 
> http://www.3dmark.com/3dm/14095543?


Is this only specific to the EVGA FTW cards?


----------



## tps3443

I'm going to check again, When I get home and see what tdp is and power %


----------



## supermodjo

Gigabyte G1: stable at 2152 in BF4, crashes in The Witcher 3. Stable at 2124 at stock voltage, so increasing voltage will not help much, I think. Watercooled card; temps at 45-50 max at full load. Mem at 9200.


----------



## 113802

Quote:


> Originally Posted by *Prozillah*
> 
> Is this only specific to the EVGA FTW cards?


Yup, only the cards with two bios chips.

http://www.evga.com/Products/Product.aspx?pn=08G-P4-6276-KR


----------



## leongws

Quote:


> Originally Posted by *WannaBeOCer*
> 
> Yup, only the cards with two bios chips.
> 
> http://www.evga.com/Products/Product.aspx?pn=08G-P4-6276-KR


If I'm not wrong, the Palit SJS also has a dual BIOS switch.


----------



## Swolern

Quote:


> Originally Posted by *tps3443*
> 
> I'm gaming on 1080P with my gtx 1070 acx 3.0 SC. It is really overkill.
> 
> But, I plan to get a Acer X34 predator soon. 3440x1440 100hz a gtx1070 does great at this resolution .
> 
> GTX1070/1080 are great cards for the money!
> 
> Alot of power for $400


You won't hit 100fps in many new AAA titles, but that's what G-Sync is for. Loving my Predator X34.


----------



## tps3443

Quote:


> Originally Posted by *Swolern*
> 
> You wont hit 100fps in many new AAA titles but thats what Gsync is for. Loving my Predator X34.


I think it's about the best multi-use monitor available. I watched a guy playing Crysis 3 at maxed settings with some AA, averaging about 44-52fps with a dip down to 38fps every now and then on a GTX 1070 SC ACX 3.0.

That is some strong performance!


----------



## tps3443

Hey everyone, I was trying out the auto overclock in Precision XOC 6.0.4 and it automatically applied a 113% power limit after about 3 minutes of running the artifact scanner while it overclocked my card.

The power limit slowly climbed as it overclocked the card and increased voltage. Now it is set at 113%! Has anyone ever had this happen?


----------



## tps3443

It seems like 2,112 MHz is easily achieved, and 9,200 on the memory. I've gotten about 9,600 on the memory and I can benchmark at those speeds, but 9,600+ will artifact in some games, GTA V mainly.

Extreme overclocking is possible on any GTX 1070, and supposedly every card can hit 2.5 GHz or so. But you have to solder on a few things and trick the card's onboard Texas Instruments voltage controller into thinking it is sending less voltage to the GPU and using less power than the TDP limit, so it will actually push past 1.093 V and allow 1.200-1.260 V, and then you can get 2.5 GHz. It is fairly easy, and there is not too much involved. I am considering modding mine. Disassembly is probably the hardest part lol. This modification also allows more memory voltage, so 9,850 MHz to 10,200 MHz could easily be possible.

Anyone done this mod yet? There is a great guide on Google.
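For anyone weighing the mod: the reason tricking the regulator works comes down to how board power is sensed. Current is inferred from the voltage drop across tiny shunt resistors, so lowering the effective shunt resistance (e.g. soldering a resistor in parallel) makes the limiter under-read. The arithmetic below is a rough sketch with hypothetical resistor and current values, not measurements from a real 1070:

```python
# Why a shunt mod fools the power limiter: the controller computes
# I = V_drop / R_assumed, but soldering a resistor in parallel lowers
# the real shunt resistance, shrinking V_drop for the same current.
# All resistor/current values here are hypothetical, for illustration.

def reported_power(actual_current_a, r_actual_mohm, r_assumed_mohm, rail_v=12.0):
    """Watts the limiter 'sees' for a given real current draw."""
    v_drop_mv = actual_current_a * r_actual_mohm      # A * mOhm = mV
    sensed_current_a = v_drop_mv / r_assumed_mohm     # controller's view
    return sensed_current_a * rail_v

# Stock: sensing matches reality at 15 A on a 12 V rail.
print(reported_power(15.0, 5.0, 5.0))   # 180.0 W reported, 180 W real
# Modded: real shunt halved, controller still assumes 5 mOhm.
print(reported_power(15.0, 2.5, 5.0))   # 90.0 W reported, 180 W real
```

In other words, the card can draw roughly twice the power before the limiter reacts, which is why guides like the xdevs one pair this with water cooling.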


----------



## USlatin

Very few people will think a 20% higher OC is worth a hard mod, plus the extra voltage ages the card, but as you point out many have done it and it seems pretty straightforward and easy.

I just wanted to follow up on my OC. I played hours of 100% maxed 1440p 60Hz War Thunder and the card just laughed at it. I also ran about 30 min of the GPU-Z render test full screen, which seems to put the GPU at 100%, and the 2100.5 MHz clock never budged. Temps are stupid low, with a max of 44C against a case temp reading of 28C, so a 16C delta with the stock thermal compound. I will switch to MX-2 soon. I did shroud the rad on both sides, but it is a push-only config.


----------



## USlatin

Oh, one last thing: where can we see a reliable reading of the voltage, and what is the stock/spec voltage?


----------



## Swolern

Quote:


> Originally Posted by *tps3443*
> 
> It seems like 2,112 MHz is easily achieved, and 9,200 on the memory. I've gotten about 9,600 on the memory and I can benchmark at those speeds, but 9,600+ will artifact in some games, GTA V mainly.
> 
> Extreme overclocking is possible on any GTX 1070, and supposedly every card can hit 2.5 GHz or so. But you have to solder on a few things and trick the card's onboard Texas Instruments voltage controller into thinking it is sending less voltage to the GPU and using less power than the TDP limit, so it will actually push past 1.093 V and allow 1.200-1.260 V, and then you can get 2.5 GHz. It is fairly easy, and there is not too much involved. I am considering modding mine. Disassembly is probably the hardest part lol. This modification also allows more memory voltage, so 9,850 MHz to 10,200 MHz could easily be possible.
> 
> Anyone done this mod yet? There is a great guide on Google


I wouldn't be comfortable with 1.26v unless on water. Do you have a link to the mod?


----------



## tps3443

https://xdevs.com/guide/pascal_oc/

As requested, here is the guide! It is exactly the same for the GTX 1070/1080. If you're on water and have basic soldering skills, this mod will get you into another performance territory.


----------



## Prozillah

Quote:


> Originally Posted by *tps3443*
> 
> It seems like 2,112 MHz is easily achieved, and 9,200 on the memory. I've gotten about 9,600 on the memory and I can benchmark at those speeds, but 9,600+ will artifact in some games, GTA V mainly.
> 
> Extreme overclocking is possible on any GTX 1070, and supposedly every card can hit 2.5 GHz or so. But you have to solder on a few things and trick the card's onboard Texas Instruments voltage controller into thinking it is sending less voltage to the GPU and using less power than the TDP limit, so it will actually push past 1.093 V and allow 1.200-1.260 V, and then you can get 2.5 GHz. It is fairly easy, and there is not too much involved. I am considering modding mine. Disassembly is probably the hardest part lol. This modification also allows more memory voltage, so 9,850 MHz to 10,200 MHz could easily be possible.
> 
> Anyone done this mod yet? There is a great guide on Google


I'm considering it but will try the shunt mod first once my grizzly turns up


----------



## Swolern

Quote:


> Originally Posted by *tps3443*
> 
> https://xdevs.com/guide/pascal_oc/
> 
> As requested, here is the guide! It is exactly the same for the GTX 1070/1080. If you're on water and have basic soldering skills, this mod will get you into another performance territory.


Awesome. Thanks. + Rep. I wonder if the performance scales up with the higher clocks.


----------



## ucode

Quote:


> Originally Posted by *Swolern*
> 
> I wonder if the performance scales up with the higher clocks.


With my 1080 1.2V was required for 2.2GHz while 2GHz was about 1.0V. So about 20% more voltage and 60% more dynamic power for a 10% increase in GPU clock.
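Those numbers line up with the usual dynamic-power rule of thumb, P ≈ C·V²·f (switching capacitance C held constant). A quick sanity check, purely illustrative:

```python
# Dynamic switching power scales roughly with voltage squared times
# frequency. Plugging in the figures above: 1.0 V @ 2.0 GHz vs 1.2 V @ 2.2 GHz.

def dynamic_power_ratio(v1, f1, v2, f2):
    """Ratio of dynamic power between two voltage/frequency points."""
    return (v2 / v1) ** 2 * (f2 / f1)

ratio = dynamic_power_ratio(1.0, 2.0, 1.2, 2.2)
print(f"+{(ratio - 1) * 100:.0f}% dynamic power for +10% clock")  # prints +58%
```

Which is right where ucode's "about 60% more dynamic power" estimate lands.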


----------



## dislikeyou

I am still having issues with scaling on Windows 7, Windows 8.1 and Windows 10. Most of the time when I boot into Windows the scaling changes to 100% while still showing as 150% in display settings; when I sign out and log back in, the scaling becomes 150% again.

I also have another issue which I have had since 2 years ago when I got a 4K monitor for the first time. During POST/Boot, the monitor loses signal and goes to sleep/power saving mode and the signal doesn't come back until Windows has loaded, sometimes it doesn't even come back in Windows.

For the signal to come back I have to restart the PC. It is very annoying and it seems to be a displayport and 4K monitor issue, I had it on all 4K monitors but don't remember having such issues with WQHD monitors. It could also be a UEFI issue since I noticed this issue first time when I bought a 4K monitor and had a GTX 660Ti that was the first card with UEFI VBIOS that I owned.

Does anyone else have the same issue, or a solution?


----------



## leongws

Quote:


> Originally Posted by *leongws*
> 
> Hi, i got a Zotac 1070 amp edition. It is connected to 1 displayport(u2414h main display) and 1 HDMI(TV). I only uses pc screen only option(u2414h)for my pc and will only use extended mode to watch movies on TV and will switch to pc screen only after finish watching.
> 
> I discovered recently that when I boot the pc, the splash screen are shown on the TV and not my main monitor even though I did not use extended display to activate it. My main display which is my u2414h will have no signal till it reach windows login screen and at this time the TV will have no signal once it reach the same screen( which should be the case from the start as I did not activate extended mode)
> 
> I am only able to get immediate signal to my monitor and showing the splash screen after I remove the HDMI cable to the TV. Not sure why if I plug in both HDMI and displayport cable, the system will show the splash screen on HDMI device and not on the main displayport first even though HDMI is not activated in extended mode. Other than this small issue, the card is running fine. Anyone got this issue and how to solve it?


Quote:


> Originally Posted by *dislikeyou*
> 
> I am still having issues with scaling on Windows 7, Windows 8.1 and Windows 10. Most of the time when I boot into Windows the scaling changes to 100% while still showing as 150% in display settings; when I sign out and log back in, the scaling becomes 150% again.
> 
> I also have another issue which I have had since 2 years ago when I got a 4K monitor for the first time. During POST/Boot, the monitor loses signal and goes to sleep/power saving mode and the signal doesn't come back until Windows has loaded, sometimes it doesn't even come back in Windows.
> 
> For the signal to come back I have to restart the PC. It is very annoying and it seems to be a displayport and 4K monitor issue, I had it on all 4K monitors but don't remember having such issues with WQHD monitors. It could also be a UEFI issue since I noticed this issue first time when I bought a 4K monitor and had a GTX 660Ti that was the first card with UEFI VBIOS that I owned.
> 
> Does anyone else have the same issue, or a solution?


I posted something similar yesterday. Are you connected to both DisplayPort and HDMI? My issue disappears if I connect only via DisplayPort. Guess I will just live with it till someone has a solution.


----------



## rfarmer

I'd like to join the club; I picked up an Nvidia GTX 1070 FE from Best Buy yesterday. I have an Ncase M1, where the max GPU width is 5.5", and I water cool, so a reference card was pretty much the only way to go. There were several other 1070s I would have liked to use, but they are too wide.

I went with an Aquacomputer Pascal Black Edition block with the active-cooling backplate, mainly because I liked the way it looked, but with max GPU temps of 38C during a Fire Strike run it seems to work pretty well too.

For an initial OC I have the core at 1987 MHz and the memory a bit over 9000 MHz, with a Fire Strike graphics score over 20,000. http://www.3dmark.com/fs/9772431

I am guessing the FE cards are voltage locked? I have no option in Afterburner even with the voltage setting set to unlocked.

All in all a nice improvement over my 970.

Edit: Got to 2025 MHz with no problem. http://www.3dmark.com/fs/9772968


----------



## dislikeyou

Quote:


> Originally Posted by *leongws*
> 
> I posted something similar yesterday. Are you connected to both DisplayPort and HDMI? My issue disappears if I connect only via DisplayPort. Guess I will just live with it till someone has a solution.


Just displayport, mini dp on the monitor and DP on the GPU.


----------



## criminal

My highest Firestrike score yet thanks to the shunt mod: http://www.3dmark.com/3dm/14118243?


----------



## tps3443

What do you guys think about them releasing a GTX 1070 Ti 8GB?

I remember when the Nvidia GTX 260 came out and they then released a Core 216 model. The Core 216 GTX 260 had a good few more CUDA cores and higher clock speeds; it was like a refresh of the whole GTX 260 card.

I started building computers and PC gaming long before there was a huge selection of hardware and parts. The only top-end video cards we had were the 9800 XT and the Nvidia 6800 Ultra. There were some cards in the lower range, of course: the 6600 GT, 6800, 6800 GT, and beyond that the FX 5200, FX 5500, and ATI 9500 Pro.

It is amazing to see how far PC gaming has come! Huge selections of hardware, and different styles.


----------



## SlvrDragon50

Uhh.. not sure why they would start releasing a 1070 Ti... Ti is reserved for the 80.


----------



## tps3443

Quote:


> Originally Posted by *rfarmer*
> 
> I'd like to join the club; I picked up an Nvidia GTX 1070 FE from Best Buy yesterday. I have an Ncase M1, where the max GPU width is 5.5", and I water cool, so a reference card was pretty much the only way to go. There were several other 1070s I would have liked to use, but they are too wide.
> 
> I went with an Aquacomputer Pascal Black Edition block with the active-cooling backplate, mainly because I liked the way it looked, but with max GPU temps of 38C during a Fire Strike run it seems to work pretty well too.
> 
> For an initial OC I have the core at 1987 MHz and the memory a bit over 9000 MHz, with a Fire Strike graphics score over 20,000. http://www.3dmark.com/fs/9772431
> 
> I am guessing that the FE are voltage locked? I have no option in Afterburner even with voltage settings set to unlock.
> 
> All in all a nice improvement over my 970.
> 
> Edit: Got to 2025 MHz with no problem. http://www.3dmark.com/fs/9772968


Your case has plenty of room to fit even the widest custom-PCB GTX 1070s. The EVGA GTX 1070 FTW uses a non-reference board that is much wider than reference; I think it is only about 5" wide, while reference cards are 4" or a little less. There is not much point in buying AIB cards at a huge premium, as they do not overclock any better than a reference card.

I do own an EVGA GTX 1070 ACX 3.0 Superclocked, and I just rave about it all the time! It is a reference board with higher clocks and a better cooler with LEDs.
It runs super silent!

I bought an RX 480 before this, and it was so loud: 100% fan required for a decent overclock, and it still ran at around 90C.

I have to take my case door off and turn the fan up to 85% before this GTX 1070 ACX SC is even audible.

Your reference-design blower GTX 1070 is plenty, and it just looks better than most GTX 1070s. Enjoy it! I have had my 1070 for 5 days now. Going great so far! Amazing cards.


----------



## D13mass

Hi, which card is better for silence (quiet work)?
Someone told me MSI Gaming X, but after my really hot 980ti and noises I have some doubts


----------



## rfarmer

Quote:


> Originally Posted by *tps3443*
> 
> Your case has plenty of room to fit even the widest custom-PCB GTX 1070s. The EVGA GTX 1070 FTW uses a non-reference board that is much wider than reference; I think it is only about 5" wide, while reference cards are 4" or a little less. There is not much point in buying AIB cards at a huge premium, as they do not overclock any better than a reference card.
> 
> I do own an EVGA GTX 1070 ACX 3.0 Superclocked, and I just rave about it all the time! It is a reference board with higher clocks and a better cooler with LEDs.
> It runs super silent!
> 
> I bought an RX 480 before this, and it was so loud: 100% fan required for a decent overclock, and it still ran at around 90C.
> 
> I have to take my case door off and turn the fan up to 85% before this GTX 1070 ACX SC is even audible.
> 
> Your reference-design blower GTX 1070 is plenty, and it just looks better than most GTX 1070s. Enjoy it! I have had my 1070 for 5 days now. Going great so far! Amazing cards.


Mine is water cooled, so I had to take the block width into account, and with it cooling to 42C max I think that is even better.


----------



## tps3443

Quote:


> Originally Posted by *rfarmer*
> 
> Mine is water cooled, so I had to take the block width into account, and with it cooling to 42C max I think that is even better.


Your build is pretty much identical to mine: same CPU, same motherboard, same GPU, same PSU too.

I've got the Gigabyte Z170N Gaming 5 ITX. Anyhow, does the BCLK on your motherboard run at 99.40 to 99.70 MHz?

It makes all of my overclocks at, say, a x48 multiplier come out at an overall speed of about 4,780 MHz.

And man, a 42C load is incredible for something that is totally silent. It really is. I wish I could finish my custom loop; I still need fittings, tubing, a GPU block, etc.


----------



## rfarmer

Quote:


> Originally Posted by *tps3443*
> 
> Your build is pretty much identical to mine: same CPU, same motherboard, same GPU, same PSU too.
> 
> I've got the Gigabyte Z170N Gaming 5 ITX. Anyhow, does the BCLK on your motherboard run at 99.40 to 99.70 MHz?
> 
> It makes all of my overclocks at, say, a x48 multiplier come out at an overall speed of about 4,780 MHz.
> 
> And man, a 42C load is incredible for something that is totally silent. It really is. I wish I could finish my custom loop; I still need fittings, tubing, a GPU block, etc.


BCLK is 99.98 to 100.02 MHz. Yeah I was amazed at the temps, my 970 was in the mid 50s at load. Helped to lower max CPU temps from 52C to 48C.


----------



## gtbtk

Quote:


> Originally Posted by *Prozillah*
> 
> I make sure Afterburner doesn't load the OC at boot and loads a custom profile for both 2D and 3D applications, both with the voltage locked. As long as the volts are higher than 0.800 I can set the memory OC without it ****ting itself, chucking blocky snow all over the desktop and hardlocking


Thanks. What voltage are you locking it at?


----------



## tps3443

Quote:


> Originally Posted by *D13mass*
> 
> Hi, which card is better for silence (quiet work)?
> Someone told me MSI Gaming X, but after my really hot 980ti and noises I have some doubts


This is comparison data between two GTX 1070s: the EVGA ACX 3.0 Superclocked and the MSI GTX 1070 Gaming X.

The MSI Gaming X will reach 70C under gaming load at 30 dB. You can get it cooler with more fan speed, but it will only get louder.

The EVGA ACX 3.0 SC will reach 73C under gaming load at 28.3 dB. This card is quiet under load, and it is completely silent on the 0% fan profile until temps reach 60C. Mine reaches the low 60s on the silent fan profile, which really is silent: no noise at all while gaming. With the optimal fan profile and more aggressive airflow, temps are even better on the EVGA ACX SC, hovering around 55C, and the card is still silent in my case.

I do not tolerate noise! So I purchased the quietest GTX 1070 I could find in stock after checking several reviews.


----------



## Prozillah

Quote:


> Originally Posted by *gtbtk*
> 
> Thanks. What voltage are you locking the voltage at?


So I've just got a stock profile set for both 2D and 3D with the volts locked at 0.800, and from there I've got my hotkeys bound for my general gaming OC, which I know is rock solid at 0% extra voltage, 111% power, +106 MHz on the core and +550 memory @ 1.050 V. Overall it holds a constant 2114 MHz core and 9.1 GHz memory. The funny thing is, if I set the voltage anything higher than 1.05 the performance worsens as it starts to throttle against the TDP. I cracked my best Fire Strike graphics score of 21090 with these settings. Be aware of that: higher clocks and voltage don't always mean the card performs better once GPU Boost 3.0 is in play....
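That observation follows directly from the power cap: with board power fixed, dynamic power goes roughly as V²·f, so the sustainable frequency falls as ~1/V² when you raise voltage. A rough sketch; the power cap and the constant k below are made-up illustrative values, not read from a real card:

```python
# At a fixed power cap, the highest frequency that fits falls off as
# ~1/V^2, which is why a higher voltage can *lower* the average boost
# clock. CAP and K are hypothetical numbers chosen for illustration.

def max_freq_under_cap(power_cap_w, k, voltage_v):
    """Highest frequency (MHz) fitting under the cap, with P = k * V^2 * f."""
    return power_cap_w / (k * voltage_v ** 2)

K = 0.0734     # made-up proportionality constant, W / (V^2 * MHz)
CAP = 170.0    # watts, a hypothetical power cap in a 1070's ballpark

f_low_v = max_freq_under_cap(CAP, K, 1.050)    # ~2101 MHz
f_high_v = max_freq_under_cap(CAP, K, 1.093)   # ~1939 MHz: more volts, fewer MHz
print(round(f_low_v), round(f_high_v))
```

Same cap, 4% more voltage, roughly 8% less clock: exactly the bounce between ~2114 and ~2000 described above.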


----------



## SlvrDragon50

The ASUS STRIX is super quiet. My fan is pretty much never on.


----------



## tps3443

From what I understand, the EVGA GTX 1070 ACX 3.0 SC is the quietest GTX 1070 available.

There are a lot of other 1070s that run a little cooler, or are only a little noisier but cooler lol.

If you are simply after the quietest 1070 out of the box, hands down the lowest load dB you can get your hands on in the reviews, it is the EVGA ACX 3.0 GTX 1070.

There are too many good GTX 1070s out there now; it makes for a very tough choice.

Your case temps and fan setup can affect your load noise from card to card.


----------



## pawelekd9

Hi, I watercooled my GTX 1070 FE. Specifically, I'm using a hybrid cooler. If you want to know how to make one, the thread is here: link

My result in Time Spy: score


----------



## saunupe1911

Quote:


> Originally Posted by *SlvrDragon50*
> 
> The ASUS STRIX is super quiet. My fan is pretty much never on.


Yeah, by default its fans won't turn on until 50C, and it takes a lot to reach 50. It has to be overclocked to 2000 MHz and over 9000 on the memory, with a game that's pushing it, to reach 50C. I use Afterburner to turn the fans to 50% at 45C just for safety. These 1070s are beasts, and so far the only dud I've heard about in this thread is the MSI 1070s with Micron memory. You basically can't go wrong. I kinda wish I'd waited for a water-cooled 1070 at this point though.


----------



## ssgtnubb

My Time Spy score, running stock clocks on my 4790K, with my G1 at +100/+500.


----------



## marik123

Is there any way to disable GPU Boost 3.0? Right now my 1070 Strix runs all games fine except Rise of the Tomb Raider, where I get an app crash or a blue screen while gaming. The problem is that sometimes the frequency drops down to 2000-2025 MHz and the voltage drops to 1.000 V, even though I set it to 2100 MHz @ 1.081 V.


----------



## Forceman

You can't disable it, but you can lock the frequency and voltage at given points using Afterburner. You have to use the curve editor (Ctrl+F), and then you can lock an individual point with L. Give that a try.
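Conceptually, the curve editor maps each voltage step to a boost frequency, and locking a point pins the card to that single voltage/frequency pair. A toy model of that flattening (the curve values below are invented, not read from a real card):

```python
# Toy model of Afterburner's V/F curve lock: the stock curve maps each
# voltage step to a boost frequency; "locking" a point clamps every
# higher voltage step to that point's frequency, so the card never
# requests more volts or clocks than the locked pair.
# Curve numbers are invented for illustration.

def lock_curve(curve, lock_voltage):
    """Return a new V/F curve flattened at the locked voltage point."""
    locked_freq = curve[lock_voltage]
    return {v: (f if v < lock_voltage else locked_freq)
            for v, f in sorted(curve.items())}

stock = {0.800: 1700, 0.900: 1850, 1.000: 1975, 1.050: 2100, 1.093: 2150}
locked = lock_curve(stock, 1.050)
print(locked[1.093])  # 2100: the 1.093 V step no longer offers extra clock
```

This mirrors what the L key does visually: everything to the right of the locked point sits on a horizontal line.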


----------



## Prozillah

Quote:


> Originally Posted by *marik123*
> 
> Is there any way to disable GPU Boost 3.0? Right now my 1070 Strix runs all games fine except Rise of the Tomb Raider, where I get an app crash or a blue screen while gaming. The problem is that sometimes the frequency drops down to 2000-2025 MHz and the voltage drops to 1.000 V, even though I set it to 2100 MHz @ 1.081 V.


Try finding your max core clock with the volts locked at 1.050. It should stop the card from TDP throttling, which I believe is where you are having the issue. If I run my card at anything over 1.050 it bounces the core between 2114 and 2000; at 1.05 V it stays consistent at 2101-2114.


----------



## TurboMach1

Quote:


> Originally Posted by *SlvrDragon50*
> 
> Uhh.. not sure why they would start releasing a 1070 Ti... Ti is reserved for the 80.


What about the 560 Ti, 660 Ti, and 750 Ti?


----------



## rfarmer

Quote:


> Originally Posted by *TurboMach1*
> 
> what about the 560 Ti, 660 Ti, and 750 Ti


True but there was neither a 770 Ti nor a 970 Ti, I would say those days are over.


----------



## TurboMach1

Quote:


> Originally Posted by *rfarmer*
> 
> True but there was neither a 770 Ti nor a 970 Ti, I would say those days are over.


I know, I was just busting balls.


----------



## Dude970

I reached 16K









http://www.3dmark.com/3dm/14124934


----------



## Prozillah

Quote:


> Originally Posted by *Dude970*
> 
> I reached 16K
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/14124934


21,800 in graphics - noooiice. What are your clocks and OC setup?


----------



## Prozillah

BTW, has anyone with an AIB card tried flashing the 1070 FE BIOS? AIB cards seem to be getting higher OC results with the FE BIOS in some cases.


----------



## tps3443

oops : Edit.


----------



## tps3443

Quote:


> Originally Posted by *Dude970*
> 
> I reached 16K
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/14124934


STRONG TO DEATH BRO!! Nice score! What is your 3570K clocked at? They sure are great little chips.

Z77 overclocked well; everything after that was not so good at clocking high, up until Skylake.


----------



## tps3443

I just disassembled my EVGA GTX 1070 ACX SC: extremely high quality everything! The card has a sub-frame lol, and a backplate. Very nice machine work.

I replaced the thermal paste with some better stuff. How these cards are built and assembled is really impressive!

Anyway, there was a TON of factory thermal paste. Man, they put like a 1/2 pound on there. My 1070 is so much lighter now lol. jk.

It is running about 4-6C cooler at idle. Load seems to be about 3-4C lower. I'm still inspecting it.


----------



## Dude970

Quote:


> Originally Posted by *Prozillah*
> 
> 21800 in the graphics- noooiice. What are your clocks and OC setup?


Thanks mate


----------



## Prozillah

Quote:


> Originally Posted by *tps3443*
> 
> I just disassembled the EVGA GTX 1070 ACX SC: extremely high quality everything! The card has a sub-frame lol, and a backplate. Very nice machine work.
> 
> I replaced the thermal paste with some better stuff. How these cards are built and assembled is really impressive!
> 
> Anyway, there was a TON of factory thermal paste. Man, they put like a 1/2 pound on there. My 1070 is so much lighter now lol. jk.
> 
> It is running about 4-6C cooler at idle. Load seems to be about 3-4C lower. I'm still inspecting it.


what did u use?


----------



## Dude970

Quote:


> Originally Posted by *tps3443*
> 
> STRONG TO DEATH BRO!! Nice score! What is your 3570K clocked at? They sure are great little chips.
> 
> Z77 overclocked good, everything after that was not so good at clocking high, up until skylake


Thanks ^^


----------



## DStealth

As far as I remember you had a 22k run previously... nonetheless a respectable score for a 1070...


----------



## Dude970

Quote:


> Originally Posted by *DStealth*
> 
> As far as I remember you had 22k run previously...non or less a respective score for 1070...


Thanks, yes I did hit 22k; I had to tone down the OC and bump up the CPU/RAM to break 16K. I will be getting an i7-3770K soon, so 16K will be a breeze then.


----------



## Prozillah

Nice sample, especially considering the speculation around the Gaming X's memory OCs.


----------



## tps3443

Quote:


> Originally Posted by *Prozillah*
> 
> what did u use?


Well, there are no warranty-void stickers anywhere that get damaged by taking the card apart. The thermal paste looked a bit hard and dry, and there was a lot of it, so it took me a while to clean it all off. The build design of the card is just phenomenal.

Remove the backplate screws and the plate first; then you will have another set of screws going through the PCB into the top sub-frame, a metal plate that covers the entire top of the GPU PCB. Right down to the high quality stainless steel screws, these cards are just built really well. No flexing anywhere: there is an aluminum front plate and a backplate tightened together over the top of the GPU board, and everything fits to perfect tolerances.

It's a big PCB. It looks identical to the GTX 1080's, except missing two VRM phases by the look of it.


----------



## Prozillah

Quote:


> Originally Posted by *tps3443*
> 
> Well, there are no warranty-void stickers anywhere that get damaged by taking the card apart. The thermal paste looked a bit hard and dry, and there was a lot of it, so it took me a while to clean it all off. The build design of the card is just phenomenal.
> 
> Remove the backplate screws and the plate first; then you will have another set of screws going through the PCB into the top sub-frame, a metal plate that covers the entire top of the GPU PCB. Right down to the high quality stainless steel screws, these cards are just built really well. No flexing anywhere: there is an aluminum front plate and a backplate tightened together over the top of the GPU board, and everything fits to perfect tolerances.
> 
> It's a big PCB. It looks identical to the GTX 1080's, except missing two VRM phases by the look of it.


Haha sorry, I meant what paste did you use? I'll be interested to hear your final figures on the temp drop.


----------



## duganator

http://www.3dmark.com/spy/269621
Not a terrible score for running a low-clocked Xeon. I'm kinda bummed that nothing I can do will get this card over 2100 though.


----------



## tps3443

TUNIQ TX-4 thermal gel.

Out of a thermal paste round-up featuring roughly 17-18 different pastes (MX-4, AS5, etc., some of the best available and too many more to list), TUNIQ TX-4 just ran circles around them all.

With my EVGA GTX 1070 set to the default fan profile, I always got a 43C idle with 0% fan RPM using the stock thermal paste that was on the GPU when I got it 5 days ago.

It has just always been 43C idle, period. And I thought that was really good, especially considering there was no fan running at all!

One big flaw with the GTX 1070 ACX is the heatsink: it is really rough, and I cannot see it making the best contact or transferring heat effectively.

So, after sanding it down to a mirror finish, I put on a relatively large amount of TUNIQ TX-4. My idle temps are 36C now, which is 7C lower, with no fan either. Simply amazing difference.
As for load temps, I usually would hit around 70+C on the silent minimal-fan/0 RPM profile. I am well below that now; I'm still trying to pin down the exact figure. It seems a lot lower, but I want to make sure it is accurate.


----------



## marik123

Quote:


> Originally Posted by *Prozillah*
> 
> Try finding your max core clock with the volts locked at 1.050. It should stop the card from TDP throttling, which I believe is where you are having the issue. If I run my card at anything over 1.050 it bounces the core between 2114 and 2000; at 1.05 V it stays consistent at 2101-2114.


My card behaves differently: if I don't add any voltage, it maxes out at 2075 MHz, but if I add +50 mV to the core voltage, the boost hits 2100 MHz. I recently upgraded my Asus GPU Tweak II to the latest version and now Rise of the Tomb Raider is 100% stable. I guess the older version would let the vcore drop all the way down to 1.0 V @ 2000 MHz, and now it stays between 1.04-1.075 V at 2000-2100 MHz.


----------



## zipper17

Hi, I just bought a Galax 1070 EXOC.
I have a question though. I'm new to overclocking the 1070; how do I OC it?
I already installed MSI Afterburner 4.3.0 beta 4. What should I do for a first-time 1070 OC?
Do I need to increase the power limit and temp limit to the max?
By how much should I increase the core clock and memory clock at first?

In general, what is a safe core/memory clock increase on an OC'd 1070 for daily use?
Quote:


> Originally Posted by *Dude970*
> 
> I reached 16K
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/14124934


I have the same 3570K as you. Dude, how do you achieve 5 GHz on a 3570K? Any link to a guide? Is that possible with air cooling?
My 3570K is only at 4.2 GHz; honestly, I'm nervous about OCing that high.

My Firestrike Default:


----------



## Samurai707

Finally got around to messing with my overclock this afternoon...
Currently sitting at stock volts, 2126/4303 (+140/+300 if anyone cares) for core/mem in Valley.

Haven't even maxed it out yet, but I have high hopes for my MSI Gaming X... looks like I might have to block this one after all
















I'm really only gaming right now, and not playing anything that's truly demanding of my card anymore, but it's always fun to max out whatever I'm in, of course!

I'd love to see a cleaner spreadsheet of everyone's highest clocks, like the top-30 Valley and Heaven threads have, for ease of pushing our cards to the fullest (maybe for the 2nd post that was saved for benchmarks?). I'd be willing to help out if that's something everyone would like to add to.


----------



## Prozillah

Has anyone reached 2200mhz core yet?


----------



## Itglows

Anyone else getting drops to really low frames every once in a while? Seems like this only started after putting the 1070 in. I have the newest drivers installed.


----------



## danjal

Quote:


> Originally Posted by *Itglows*
> 
> Anyone else getting drops to really low frames every once in a while? Seems like this only started after putting the 1070 in. I have the newest drivers installed.


I haven't; Zotac 1070. Actually impressed with the GPU... thinking about going SLI or to a single 1080.


----------



## Prozillah

Quote:


> Originally Posted by *danjal*
> 
> I haven't; Zotac 1070. Actually impressed with the GPU... thinking about going SLI or to a single 1080.


I've got a FreeSync 144Hz 1440p monitor, and while it's gorgeous I'm flirting with the idea of going to an ultrawide 1440p G-Sync setup with a 1080 Ti when it's released.


----------



## danjal

Quote:


> Originally Posted by *Prozillah*
> 
> I've got a FreeSync 144Hz 1440p monitor, and while it's gorgeous I'm flirting with the idea of going to an ultrawide 1440p G-Sync setup with a 1080 Ti when it's released.


My next monitor upgrade will be a 40" or bigger 4K monitor.


----------



## D13mass

Quote:


> Originally Posted by *tps3443*
> 
> This is comparison data between two GTX 1070s: the EVGA ACX 3.0 Superclocked and the MSI GTX 1070 Gaming X.
> 
> The MSI Gaming X will reach 70C under gaming load at 30 dB. You can get it cooler with more fan speed, but it will only get louder.
> 
> The EVGA ACX 3.0 SC will reach 73C under gaming load at 28.3 dB. This card is quiet under load, and it is completely silent on the 0% fan profile until temps reach 60C. Mine reaches the low 60s on the silent fan profile, which really is silent: no noise at all while gaming. With the optimal fan profile and more aggressive airflow, temps are even better on the EVGA ACX SC, hovering around 55C, and the card is still silent in my case.
> 
> I do not tolerate noise! So I purchased the quietest GTX 1070 I could find in stock after checking several reviews.


Did you mean this card https://www.bhphotovideo.com/c/product/1262833-REG/evga_08g_p4_6171_kr_geforce_gtx_1070_acx.html ?


----------



## Dude970

Quote:


> Originally Posted by *zipper17*
> 
> Hi, I just bought a Galax 1070 EXOC.
> I have a question though. I'm new to overclocking the 1070; how do I OC it?
> I already installed MSI Afterburner 4.3.0 beta 4. What should I do for a first-time 1070 OC?
> Do I need to increase the power limit and temp limit to the max?
> By how much should I increase the core clock and memory clock at first?
> 
> In general, what is a safe core/memory clock increase on an OC'd 1070 for daily use?
> I have the same 3570K as you. Dude, how do you achieve 5 GHz on a 3570K? Any *link to the Guide*? Is that possible with air cooling?
> My 3570K is only at 4.2 GHz; honestly, I'm nervous about OCing that high.
> 
> My Firestrike Default:


Not on Air

http://www.overclock.net/t/1247413/ivy-bridge-overclocking-guide-with-ln2-guide-at-the-end


----------



## ucode

Quote:


> Originally Posted by *BTCHSLP*
> 
> I flashed back my standard-bios on my FE:


I don't know why a lot of software uses percentages for power limits, but FYI NVIDIA has a command-line tool to check the actual limits and consumption in watts, as well as other things. It's supposed to be accurate to within plus or minus 5%, provided there are no hardware mods.

`C:\Program Files\NVIDIA Corporation\NVSMI>nvidia-smi.exe -q -d power`
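If you want to log those readings instead of eyeballing the console, here's a minimal Python sketch that pulls the wattage fields out of that command's text output. The sample text and field names below are just illustrative; check them against your own `nvidia-smi -q -d power` dump, since the exact labels can vary by driver version:

```python
import re

def parse_power_draw(smi_output: str) -> dict:
    """Pull the wattage fields out of `nvidia-smi -q -d power` text output.

    Returns a mapping like {'Power Draw': 42.5, 'Power Limit': 170.0}.
    Only lines of the form "Name : <number> W" are picked up.
    """
    watts = {}
    for line in smi_output.splitlines():
        m = re.match(r"\s*([A-Za-z ]+?)\s*:\s*([\d.]+)\s*W\b", line)
        if m:
            watts[m.group(1)] = float(m.group(2))
    return watts

# Hypothetical output snippet for illustration:
sample = """
    Power Readings
        Power Draw                  : 42.50 W
        Power Limit                 : 170.00 W
        Default Power Limit         : 151.00 W
"""
print(parse_power_draw(sample))
```

Run that in a loop while gaming and you get actual watts over time instead of a percentage slider.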


----------



## amd7674

Quote:


> Originally Posted by *tps3443*
> 
> TUNIQ TX-4. Thermal Gel
> 
> Out of a thermal paste "Round Up" featuring roughly 17-18 different types of thermal paste. MX4, AS5,ETC.ETC. some of the best paste available and to many more to list. TUNIQ TX-4 just ran circles around them all.
> 
> With my EVGA GTX1070 Set at the default fan profile. Before I always managed a 43C idle, with 0% fan RPM using the STOCK thermal paste that was on the GPU when I got it 5 days ago
> 
> 
> 
> 
> 
> 
> 
> It has just always been 43C idle period. And, I thought it was really good! Especially considering there was no fan running at all!
> 
> One big flaw with the GTX1070 ACX, is the heatsink! It is really rough, And I cannot see it making the best contact or heat transfer effectively.
> 
> So, after sanding it down to a mirror finish, I put a relatively large amount of TUNIQ TX-4 My idle temps are 36C now. This is 7C lower idle temps. No fan either! Simply amazing difference it has made.
> As for load temps, I usually would hit around 70+C with it on the Silent minimal fan/0 RPM fan profile. I am well below this, I'm still trying to figure out what it is. It seems alot lower, but I am trying to make sure it is accurate.


Great job!!!!
Are there any thermal pads anywhere on it? That's one of the things I dislike on the Zotac and Asus Strix cards; my kid's Zotac GTX 970 had them and they were easy to rip apart. What sandpaper did you use to lap the heatsink? I might go for the EVGA FTW. Also, how did you apply the thermal paste: "x" or "+"?

Thx


----------



## saunupe1911

Quote:


> Originally Posted by *marik123*
> 
> My card behaves different where if I don't add any voltage to it, then it will max out at 2075mhz. However if I add +50mv to my gpu core voltage, then the boost will hit 2100mhz. I recently upgraded my Asus GPU Tweak II to the latest version and now rise of tomb raider is 100% stable. I guess the older version will allow vcore drop all the way down to 1.0v @ 2000mhz, and now it will stay between 1.04 - 1.075v between 2000 - 2100 mhz.


Interesting... I've been testing my Asus Strix OC all weekend. I uninstalled GPU Tweak and went to MSI Afterburner; GPU Tweak started losing its mind and resetting my profiles! Luckily I took phone pics of the numbers. I prefer MSI Afterburner because the software as a whole is more stable, plus I can adjust the voltage curve to gain even more stability. I've been alternating between Heaven, Firestrike, and TimeSpy for benchmarking. 2114MHz and 9300 memory is the highest I can get for TimeSpy, but Firestrike crashes. So I bumped down to 2100 and 9200 and I'm stable across the board. It seems safe to say the Strix likes 2000 to 2100MHz core and about 9200 to 9300MHz memory for overall stability. That puts my graphics score in Firestrike at around 20,400, which is sniffing at stock 1080s... not bad at all.


----------



## danjal

Quote:


> Originally Posted by *tps3443*
> 
> TUNIQ TX-4. Thermal Gel
> 
> Out of a thermal paste "Round Up" featuring roughly 17-18 different types of thermal paste. MX4, AS5,ETC.ETC. some of the best paste available and to many more to list. TUNIQ TX-4 just ran circles around them all.
> 
> With my EVGA GTX1070 Set at the default fan profile. Before I always managed a 43C idle, with 0% fan RPM using the STOCK thermal paste that was on the GPU when I got it 5 days ago
> 
> 
> 
> 
> 
> 
> 
> It has just always been 43C idle period. And, I thought it was really good! Especially considering there was no fan running at all!
> 
> One big flaw with the GTX1070 ACX, is the heatsink! It is really rough, And I cannot see it making the best contact or heat transfer effectively.
> 
> So, after sanding it down to a mirror finish, I put a relatively large amount of TUNIQ TX-4 My idle temps are 36C now. This is 7C lower idle temps. No fan either! Simply amazing difference it has made.
> As for load temps, I usually would hit around 70+C with it on the Silent minimal fan/0 RPM fan profile. I am well below this, I'm still trying to figure out what it is. It seems alot lower, but I am trying to make sure it is accurate.


The EVGA 1070 FTW I received had an awful amount of coil whine; it literally sounded like someone lightly tapping a pencil on the side of the case. I'm in the process of returning it. I think I'm going with either another Zotac 1070 AMP Edition for SLI, a 1080 AMP Edition, or an MSI Aero OC that I'd watercool, selling my other 1070 AMP Edition to fund a watercooling kit.

The EVGA 1070 FTW I had also ran 8-10C warmer than the Zotac did. I thought that was odd since both have relatively similar-sized heatsinks and fans (the Zotac's are a bit larger, but not by much); maybe the rough heatsink finish you experienced on the EVGA contributed to that. The EVGA also ran a higher reference voltage than the Zotac.


----------



## Scougar

Asus Strix 1070 owner reporting in. Bought it a week ago, and it started artifacting on booting up (Big boxes with vertical lines just after the windows logo had finished.)

Yesterday on the desktop it artifacted with lots of small colored squares and froze the PC.

On bootup it was fine again. I suspect I'll be returning it to the e-tailer... this is the point where I wish I hadn't chosen to go via Jet.com and ditch the free returns.


----------



## THEROTHERHAMKID

What's the best 1070 to go for? I have a G1 1080 for my PC downstairs on my 4K TV,
but I'm wanting something for the PC in the bedroom for 1080p gaming. Or is a 980 Ti enough for that?


----------



## bigjdubb

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> Whats best 1070 to go for? I have a g1 1080 for my pc downstairs on my 4k TV
> But im wanting something for my pc in the bedroom for 1080p gaming or is a 980ti enough for that?


The 1070 and 980 Ti are basically the same performance-wise; if you already have a 980 Ti, I would just stick with that.


----------



## saunupe1911

Quote:


> Originally Posted by *Scougar*
> 
> Asus Strix 1070 owner reporting in. Bought it a week ago, and it started artifacting on booting up (Big boxes with vertical lines just after the windows logo had finished.)
> 
> Yesterday on the desktop it artifacted with lots of small colored squares and froze the PC.
> 
> On bootup, was fine again. I suspect returning it to the etailer.... this is the point I wish I hadn't chosen to go via Jet.com and ditch the free returns.


I've seen that but it was right after a driver crash and reboot while benchmarking. I'm not thinking much of it at all


----------



## TheGlow

I've decided to get a 1070, but I'm not sure which specific version.
I've used MSI and Gigabyte boards in the past, EVGA cards, and I've had an Asus mobo and video card, so I've been flexible.
I see the Zotac is recommended, which last I recall was a budget brand.
I just got a Dell S2716DG, so I'm looking to enjoy 1440p; I'm on a budget Radeon 380 as a placeholder.


----------



## gtbtk

Quote:


> Originally Posted by *Prozillah*
> 
> So I've just got a stock profile set for both 2d and 3d with the volts locked at .800 and from their I got my hot keys bound for my general gaming OC which I know is rock solid at 0% xtra voltage, 111% power, 106mhz on the core and 550 memoryou @ 1.050v. Overall it holds a constant 2114 mHz core and 9.1ghz memory. The funny thing is if I set set the voltage anything higher thanumber 1.05 the performance worsens as it starts to throttle the TDP. I cracked my best Firestrike graphics score at 21090 with these settings. Be aware of that. Higher clocks and voltage doesn't always mean the card performs better when considering gpu boost 3.0....


I gave it a try. Great discovery! My Micron memory will now clock to +600 without the checkerboard pattern.

Thanks


----------



## gtbtk

Quote:


> Originally Posted by *SupernovaBE*
> 
> Looks like my gaming X have samsung chips, they clock good.
> just my voltage is stuck at 1.043..
> 
> Stable clocks for heaven


Adjust the temp slider to +100 and turn on the custom fan curve, and you should find that the voltage will increase up to 1.093V. With some tweaks, the fan curve can be set so that card temps stay in the low 50s.


----------



## Prozillah

Quote:


> Originally Posted by *gtbtk*
> 
> I gave it a try, Great discovery! my Micron memory will clock without the checkerboard pattern to +600 now.
> 
> Thanks


very nice - how'd ya do it?


----------



## gtbtk

Quote:


> Originally Posted by *Prozillah*
> 
> very nice - how'd ya do it?


Open the curves screen, select the point at 1.075v, and press "L"; it will lock the voltage and show a vertical yellow line.

I have that point pulled up to 2100MHz, and everything to the right of it on the curve goes flat at 2100MHz. I then smooth out the curve to the left a bit. +600 mem starts to artifact a bit, but +590 seems OK.
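For anyone trying to picture what the lock does to the curve, here's a rough Python sketch of the idea. The dict representation and the numbers are made up for illustration; Afterburner obviously doesn't expose the curve like this:

```python
def lock_curve(curve, lock_voltage, lock_freq):
    """Mimic Afterburner's point-lock: the chosen voltage point gets the
    target frequency, every point to the right (higher voltage) is clamped
    flat at that same frequency, and points to the left are capped so the
    curve stays monotonic. `curve` maps voltage in mV -> frequency in MHz."""
    locked = {}
    for v, f in sorted(curve.items()):
        if v >= lock_voltage:
            locked[v] = lock_freq
        else:
            locked[v] = min(f, lock_freq)
    return locked

# Hypothetical stock curve for a 1070-class card:
stock = {900: 1900, 1000: 2000, 1075: 2050, 1093: 2088}
print(lock_curve(stock, 1075, 2100))
```

The flat section to the right is why the card stops boosting past the locked point: GPU Boost can raise voltage, but the frequency it maps to never goes above the locked value.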


----------



## gtbtk

How do I attach a JPG? I was going to post a screenshot of the curve for you.


----------



## Prozillah

Quote:


> Originally Posted by *gtbtk*
> 
> Open the curves screen, I selected the pooint at 1.075v, press "L" and it will lock the voltage and show a vertical yellow line.
> 
> I have that point pulled up to 2100 Mhz and anything to the right on the curve go flat at 2100mhz. I then smooth out the curve to the left a bit. +600 mem starts to artifact a bit but +590 seems OK


Sounds good - yeah, got mine locked similarly at 2114 at 1.05v. 550 mem is the sweet spot, but I can bench at 600. Had the best combined graphics scores with this setup, and it's 100% stable in all games tested.


----------



## gtbtk

Quote:


> Originally Posted by *Prozillah*
> 
> Sounds good - yea got mine locked similar at 2114 at 1.05v. 550mem is the sweet spot but can bench at 600. Had the best combined graphics scores with this setup and it's a 100% stable in all games tested.


I might try pulling my voltages down a bit and see how I go; it should reduce temps a bit too.


----------



## Dude970

Quote:


> Originally Posted by *Prozillah*
> 
> Sounds good - yea got mine locked similar at 2114 at 1.05v. 550mem is the sweet spot but can bench at 600. Had the best combined graphics scores with this setup and it's a 100% stable in all games tested.


Can you post a screenshot of your curve? I want to make sure I understand correctly.


----------



## USlatin

Quote:


> Originally Posted by *tps3443*
> 
> TUNIQ TX-4. Thermal Gel
> 
> Out of a thermal paste "Round Up" featuring roughly 17-18 different types of thermal paste. MX4, AS5,ETC.ETC. some of the best paste available and to many more to list. TUNIQ TX-4 just ran circles around them all.


Link please, 'cause I'd hardly call a 1.5C difference "running circles".


----------



## Prozillah

Quote:


> Originally Posted by *Dude970*
> 
> Can you post a screenshot of your curve. I want to make sure I understand correctly


I'm away till Friday; if u pm me, or if I remember, I'll do it when I get home for u


----------



## Prozillah

I basically set it to 105 core, moved the one point up to 2114MHz, and locked the point. Didn't touch the rest of the curve.


----------



## Dude970

Quote:


> Originally Posted by *Prozillah*
> 
> I'm away till Friday if u pm me or if I remember I'll do it when I get home for u










pm for reminder sent


----------



## saunupe1911

Quote:


> Originally Posted by *Prozillah*
> 
> I'm away till Friday if u pm me or if I remember I'll do it when I get home for u


Yeah, I'm interested in locking the voltage too. Gotta try this out.


----------



## saunupe1911

And man, why is 2114 the sweet spot for these 1070s??? I've been saying this in my posts.


----------



## ITAngel

So I got my CPU overclocked to 4.5GHz @ 1.277v. On the GPU, should I find the max on the core or on the memory first, before pushing the other?

This is what I got so far.



and


----------



## saunupe1911

Welp, got over 20400 by locking the voltage to 1.075. Man, I wish I could crack 21000!!! But hey, I guess that's why Nvidia says I need a 1080 to get those numbers.







Appreciate the voltage lock info!


----------



## madmeatballs

Quote:


> Originally Posted by *ITAngel*
> 
> So I got my CPU overclock to 4.5Ghz @ 1.277v and I was wondering on the GPU what should I find out the max on the GPU Core or Memory Core first before pushing the next part?
> 
> This is what I got so far.
> 
> 
> 
> and


I tried core first, then memory; I don't think it matters anyway. It will just be easier to confirm stability if you do one first, followed by the other. What I did was core first, then memory.


----------



## Prozillah

Quote:


> Originally Posted by *saunupe1911*
> 
> Welp got over 20400 by locking voltage to 1.75. Man I wish I could crack 21000!!!!!! Buy hey I guess that's why Nvidia says I need a 1080 to get those numbers
> 
> 
> 
> 
> 
> 
> 
> Appreciate the voltage lock info!


Is ur core clock jumping around during the benchmark?


----------



## Prozillah

Cause u should be getting a score like mine or higher with those clocks. Also try benching ur memory lower, like 575 or something, and see if that improves ur score - Pascal got mem error correction and there is a sweet spot. It will also reduce ur TDP, allowing the core to remain higher for longer. Failing that, do the shunt mod like me.
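To make the "sweet spot" idea concrete: because of Pascal's memory error correction, a higher offset can pass benchmarks while silently retrying transfers and scoring worse, so the right pick is the offset with the best score, not the highest one that survives. A toy Python sketch, with entirely hypothetical FPS numbers:

```python
def best_offset(results):
    """Given (mem_offset, avg_fps) pairs from repeated benchmark runs,
    return the offset with the highest score. This is NOT the same as
    the highest offset that didn't crash: past the sweet spot, error
    correction keeps the run 'stable' but the score drops."""
    return max(results, key=lambda pair: pair[1])[0]

# Hypothetical sweep: FPS climbs, then dips past the sweet spot.
sweep = [(400, 98.1), (500, 99.4), (575, 100.2), (650, 99.0), (700, 97.3)]
print(best_offset(sweep))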


----------



## saunupe1911

Quote:


> Originally Posted by *Prozillah*
> 
> Cause u should be getting a score like mine or higher with those clocks. Also try benching ur memory lower like 575 or something and see if that improves ur score - pascal got mem error correction and there is a sweet spot. It will also reduce ur TDP allowing the core to remain higher for longer. Failing do the shunt mod like me


Yep, it bounces around a little, but it's a lot more stable. For example, the combined test keeps a steady voltage the entire time. Let me try bumping it down and see what I get.


----------



## M0E

Well, after 7 years I finally had to upgrade; The Division and No Man's Sky forced it. Purchased an ASUS Strix OC edition. I can tell the first-gen i7 I'm still running is holding the card back. Looks like I'll have to upgrade the whole PC very soon.


----------



## Jackharm

Not too certain how to fiddle around with the voltages on my Zotac AMP! Extreme, but I am able to break 21,000 in terms of graphics score at least.

+70 on core and +500 on memory (tried 550, but it would checkerboard) with a custom fan curve. It shows a max of 2100 on the core, but it throttles down to 2088 for the majority of the benchmark.

CPU isn't the best overclocker: 4670K at 4.0.


Spoiler: Warning: Spoiler!


----------



## SupernovaBE

Quote:


> Originally Posted by *gtbtk*
> 
> Adjust the temp slider to +100, turn on the custom fan curve and you should find that the voltage will increase up to 1.093V, With some tweaks, the fan curve can be set so that card temps stay in the low 50s


Ty for the response.
I will try that later today and report back!
I have it @ 92° now on the slider.
And I'm on water, so 50 will not happen; low 40s max.


----------



## madmeatballs

Quote:


> Originally Posted by *Jackharm*
> 
> Not too certain as how to to fiddle around with the voltages on my zotac amp!extreme but I am able to break 21,000 in terms of graphics score at least.
> 
> +70 on core and +500 on memory (tried 550, but would checker) with a custom fan curve. It shows a max of 2100 on core, but it would throttle down to 2088 for the majority of the benchmark.
> 
> CPU isn't the best overclocker, 4670k at 4.0
> 
> 
> Spoiler: Warning: Spoiler!


I also have the AMP Extreme and I'm able to keep 2100MHz without dropping to 2088MHz from heat, but it isn't stable for me in the 3DMark Firestrike Ultra stress test. It was stable in the regular Firestrike stress test and Heaven, so I just opted for +45 core / +500 mem, which gives me 2075MHz. Not bad.

What temps do you get? I don't hit more than 44C; I think it starts to throttle at 45C though.


----------



## Curseair

Guys, should I go for a 980 Ti at $464 or a 1070 at $555? Those are the prices where I'm at (EVGA FTWs etc., brand new). I will be overclocking; is the 980 Ti the better buy then?


----------



## gtbtk

here is my curve using the voltage lock



Managed to get a Firestrike graphics score of just under 20600 using an i7-2600 non-K.

http://www.3dmark.com/fs/9777447


----------



## GreedyMuffin

Quote:


> Originally Posted by *gtbtk*
> 
> here is my curve using the voltage lock
> 
> 
> 
> managed to get a Fire strike graphics score of just under 20600 using i7-2600 non K
> 
> http://www.3dmark.com/fs/9777447


How do you enable the voltage lock?

Thanks!

NVM: I press CTRL + L and it worked. Just thought L would be appropriate to try.


----------



## rfarmer

Quote:


> Originally Posted by *SupernovaBE*
> 
> Ty for the respons
> I wil try that later today and report back !
> I have it @ 92° now on the slider
> And im on water so 50 wil not happen, low 40 max


Yeah mine is the same way, 42C max while gaming. Got to love water cooling.


----------



## danjal

Quote:


> Originally Posted by *Curseair*
> 
> Guys should I go for a 980 Ti for 464 dollars or a 1070 at 555 dollars? That's the prices where i'm at , EVGA FTW's etc. Brand new, I will be overclocking is the 980 ti better then?


I think I would go with the 1070, just because it's the newer platform.


----------



## Mr-Dark

Hello

My 1070 SC arrived today, along with my new build!


----------



## ITAngel

Mine stays downclocked at 2088MHz =/ and I still get a little bit of artifacting.


----------



## HyPiK




----------



## saunupe1911

Quote:


> Originally Posted by *gtbtk*
> 
> here is my curve using the voltage lock
> 
> 
> 
> managed to get a Fire strike graphics score of just under 20600 using i7-2600 non K
> 
> http://www.3dmark.com/fs/9777447


What are your memory clocks? I still see vcore dips whether it's 8800MHz or 9000MHz, with the 2113MHz curve at 1.075v.


----------



## Curseair

Quote:


> Originally Posted by *danjal*
> 
> I think I would go with the 1070 just because its the newer platform..


Yeah I did, Got a EVGA FTW coming tomorrow.


----------



## gtbtk

Quote:


> Originally Posted by *saunupe1911*
> 
> Whats your memory clocks? I still see vcore dips no matter if its 8800 mhz or 9000mhz with 2113 Mhz curve at 1075.


That was at 9216MHz.


----------



## gtbtk

Quote:


> Originally Posted by *GreedyMuffin*
> 
> How do you enable the voltage lock?
> 
> Thanks!
> 
> NVM: I press CTRL + L and it worked. Just thought L would be appropriate to try.


I just select the point and press L


----------






## gtbtk

Quote:


> Originally Posted by *SupernovaBE*
> 
> Ty for the respons
> I wil try that later today and report back !
> I have it @ 92° now on the slider
> And im on water so 50 wil not happen, low 40 max


Even better. The temp slider seems to move a curve that affects voltage. Hope it helps.


----------



## SupernovaBE

Nope








GPU 1 stays at 1.043 and GPU 2 at 1.081.
Not a difference.


----------



## saunupe1911

Quote:


> Originally Posted by *gtbtk*
> 
> That was at +9216Mhz


Damn you have a good chip the highest I've gotten is 20400


----------



## rulezzzor

Could anyone send me a BIOS from a Palit GTX 1070 GameRock Premium Edition or a Gainward GTX 1070 Phoenix GLH?
Thank you.

OK, I found one, though it's listed as not verified.







thx


----------



## Thingol

Submitted information for addition to 1070 owner club


----------



## Dude970

Quote:


> Originally Posted by *Thingol*
> 
> Submitted information for addition to 1070 owner club


----------



## ITAngel

Quote:


> Originally Posted by *saunupe1911*
> 
> Damn you have a good chip the highest I've gotten is 20400


How does he get 9000+MHz? Do you multiply the result by 2?


----------



## saunupe1911

Quote:


> Originally Posted by *ITAngel*
> 
> How does he get 9000+Mhz? You multiply results by 2?


Yep, 2. There are 2 RAM modules inside the GPU.


----------



## Hunched

So does anyone here actually play video games? Or do you all just run benchmarks?
Because both of the 1070's I had could pass FireStrike and TimeSpy at +700 memory, like 10 runs each without issue.
But BF4 wasn't having any of it; I had to lower it to at least +500/+450.
Then playing BF4 for over 4 hours max settings 64 player conquest and it finally freezes, at +350
Rise of the Tomb Raider first starts artifacting at +320; as far as I can tell it happens no earlier, after playing for at least 2 hours at every +10MHz step from +250 to +320.
So 16 hours of ROTTR gameplay alone until I finally had something obvious, like an overlapping black square, appear.
ROTTR uses over 7gb of VRAM at almost all times and is pretty fancy with the effects.

No wonder there's always so many posts for Witcher 3, BF4, ROTTR, Star Citizen, Shadow of Mordor, and so many other games about crashing and lowering overclocks if people are using things like FireStrike to push their cards to their limits.
Compared to Rise of the Tomb Raider, Witcher 3, BF4, etc... FireStrike isn't even close for finding instability.

I'm sure my cards underperform to a degree, but I can't tell by how much, since everyone here probably runs FireStrike twice, and if it doesn't crash or happen to artifact within those 10 minutes, they think it's stable.
It can take many hours for an unstable memory or core overclock to display a visual artifact or crash a game, and depending on the game or application, it can take way longer or never happen.
Lots of people already say Valley allows them higher overclocks than FireStrike. Same thing happens with FireStrike compared to so many games out there.

It would be cool if somebody released a better benchmark, or if people stopped using FireStrike to find stability because it's obviously so terrible for that.
It was the same story with my 970, though not to this large of a degree.

I guess I'll just see more of you on Steam and developer forums when you decide to play your next hyped AAA game for more than 5 hours, complaining about how it's crashing despite never crashing in FireStrike.








I'm sure there will be a huge wave for Cyberpunk 2077, though that will take a while. Damn those pesky RPG's that are demanding and require like 50 hours of stability to beat, like Witcher 3.


----------



## ITAngel

I have many games I test mine with, on stock settings: Star Citizen, BF4, World of Warcraft, Overwatch, Tomb Raider, and more. Normally my settings are +550 memory with +90 core at stock voltage, no curve or anything else modified. Yes, I do play these games for hours when I can.


----------



## Jimbags

Quote:


> Originally Posted by *ITAngel*
> 
> How does he get 9000+Mhz? You multiply results by 2?


Quote:


> Originally Posted by *saunupe1911*
> 
> Yep 2.There's 2 RAM modules inside of the GPU


Isn't it multiplied by 4? Also, 'DDR' stands for double data rate in GDDR5.
That's why desktop RAM shows as 800MHz for 1600MHz DDR3 in CPU-Z, for example.
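For what it's worth, the usual arithmetic: GDDR5 moves data at four times the command clock that tools like GPU-Z report (double data rate on a write clock that itself runs at twice the command clock). A quick sketch, assuming the tool shows the ~2000MHz command clock:

```python
def effective_gddr5_rate(reported_mhz: float) -> float:
    """GDDR5 transfers data on both edges of a write clock that runs at
    twice the command clock most tools report, so the effective data
    rate is 4x the reported figure (or 2x the already-doubled figure
    some utilities display)."""
    return reported_mhz * 4

# A 1070 showing 2002 MHz in GPU-Z is the familiar "8 Gbps" spec:
print(effective_gddr5_rate(2002))
```

So "multiply by 2" and "multiply by 4" are both right, depending on which clock your tool is showing you; the x2 case applies when the utility already displays the doubled (4004MHz-style) figure.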


----------



## Prozillah

Quote:


> Originally Posted by *Hunched*
> 
> So does anyone here actually play video games? Or do you all just run benchmarks?
> Because both of the 1070's I had could pass FireStrike and TimeSpy at +700 memory, like 10 runs each without issue.
> But BF4 wasn't have any of it, lower it at least to +500/+450
> Then playing BF4 for over 4 hours max settings 64 player conquest and it finally freezes, at +350
> Rise of the Tomb Raider first starts artifacting at +320, happens no earlier as far as I can tell after playing for at least 2 hours every +10mhz from +250 to +320.
> So 16 hours of ROTTR gameplay alone until I finally had something obvious like an overlapping black square appear.
> ROTTR uses over 7gb of VRAM at almost all times and is pretty fancy with the effects.
> 
> No wonder there's always so many posts for Witcher 3, BF4, ROTTR, Star Citizen, Shadow of Mordor, and so many other games about crashing and lowering overclocks if people are using things like FireStrike to push their cards to their limits.
> Compared to Rise of the Tomb Raider, Witcher 3, BF4, etc... FireStrike isn't even close for finding instability.
> 
> I'm sure my cards underperform to a degree, but I can't tell by how much, since everyone here probably runs FireStrike twice and if it doesn't crash or if it doesn't happen to artifact within those 10 minutes they think it's stable.
> It can take many hours for an unstable memory or core overclock to display a visual artifact or crash a game, and depending on the game or application, it can take way longer or never happen.
> Lots of people already say Valley allows them higher overclocks than FireStrike. Same thing happens with FireStrike compared to so many games out there.
> 
> It would be cool if somebody released a better benchmark, or if people stopped using FireStrike to find stability because it's obviously so terrible for that.
> It was the same story with my 970, though not to this large of a degree.
> 
> I guess I'll just see more of you complaining on Steam and developer forums when you decide to play your next hyped AAA game for more than 5 hours complaining about how it's crashing despite never crashing in FireStrike
> 
> 
> 
> 
> 
> 
> 
> 
> I'm sure there will be a huge wave for Cyberpunk 2077, though that will take a while. Damn those pesky RPG's that are demanding and require like 50 hours of stability to beat, like Witcher 3.


Ur sounding a touch butthurt bro - if u read a good chunk of the past 30 pages ull see many share that exact experience: we can OC higher in specific benches vs everyday games. But for the record, I can bench at 2114 core / 9.3GHz mem, yet I get the highest bench scores when I run 2114MHz and 9.1GHz memory, which is also 100% BF4-stable for hours. Hope this helps.


----------



## xTesla1856

Joined the club yesterday with a G1 Gaming 1070. Was originally gonna get a 1080, but they were out of stock. Hope this tides me over until Volta (or maybe Vega?







). I will do full testing today once I install Windows and get the drivers going on my Edition 10. Stay tuned


----------



## criminal

Quote:


> Originally Posted by *Hunched*
> 
> So does anyone here actually play video games? Or do you all just run benchmarks?
> Because both of the 1070's I had could pass FireStrike and TimeSpy at +700 memory, like 10 runs each without issue.
> But BF4 wasn't have any of it, lower it at least to +500/+450
> Then playing BF4 for over 4 hours max settings 64 player conquest and it finally freezes, at +350
> Rise of the Tomb Raider first starts artifacting at +320, happens no earlier as far as I can tell after playing for at least 2 hours every +10mhz from +250 to +320.
> So 16 hours of ROTTR gameplay alone until I finally had something obvious like an overlapping black square appear.
> ROTTR uses over 7gb of VRAM at almost all times and is pretty fancy with the effects.
> 
> No wonder there's always so many posts for Witcher 3, BF4, ROTTR, Star Citizen, Shadow of Mordor, and so many other games about crashing and lowering overclocks if people are using things like FireStrike to push their cards to their limits.
> Compared to Rise of the Tomb Raider, Witcher 3, BF4, etc... FireStrike isn't even close for finding instability.
> 
> I'm sure my cards underperform to a degree, but I can't tell by how much, since everyone here probably runs FireStrike twice and if it doesn't crash or if it doesn't happen to artifact within those 10 minutes they think it's stable.
> It can take many hours for an unstable memory or core overclock to display a visual artifact or crash a game, and depending on the game or application, it can take way longer or never happen.
> Lots of people already say Valley allows them higher overclocks than FireStrike. Same thing happens with FireStrike compared to so many games out there.
> 
> It would be cool if somebody released a better benchmark, or if people stopped using FireStrike to find stability because it's obviously so terrible for that.
> It was the same story with my 970, though not to this large of a degree.
> 
> I guess I'll just see more of you complaining on Steam and developer forums when you decide to play your next hyped AAA game for more than 5 hours complaining about how it's crashing despite never crashing in FireStrike
> 
> 
> 
> 
> 
> 
> 
> 
> I'm sure there will be a huge wave for Cyberpunk 2077, though that will take a while. Damn those pesky RPG's that are demanding and require like 50 hours of stability to beat, like Witcher 3.


I run TimeSpy all day, see. http://www.3dmark.com/3dm/14184169?









Seriously though, I see people talking about games in here. And my card is stable in games at +200 core / +600 VRAM for hours. Those games include SOM, Far Cry 4, Borderlands 2, Crysis 3, and Fallout 4. Anyone with any sense knows Firestrike is useless as a stability test.


----------



## Wollowon

Hello

I need ASUS STRIX "DEFAULT OC MODE BIOS" , anyone have it can share it please ?

Thanks


----------



## Mad Pistol

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Hunched*
> 
> So does anyone here actually play video games? Or do you all just run benchmarks?
> Because both of the 1070's I had could pass FireStrike and TimeSpy at +700 memory, like 10 runs each without issue.
> But BF4 wasn't have any of it, lower it at least to +500/+450
> Then playing BF4 for over 4 hours max settings 64 player conquest and it finally freezes, at +350
> Rise of the Tomb Raider first starts artifacting at +320, happens no earlier as far as I can tell after playing for at least 2 hours every +10mhz from +250 to +320.
> So 16 hours of ROTTR gameplay alone until I finally had something obvious like an overlapping black square appear.
> ROTTR uses over 7gb of VRAM at almost all times and is pretty fancy with the effects.
> 
> No wonder there's always so many posts for Witcher 3, BF4, ROTTR, Star Citizen, Shadow of Mordor, and so many other games about crashing and lowering overclocks if people are using things like FireStrike to push their cards to their limits.
> Compared to Rise of the Tomb Raider, Witcher 3, BF4, etc... FireStrike isn't even close for finding instability.
> 
> I'm sure my cards underperform to a degree, but I can't tell by how much, since everyone here probably runs FireStrike twice and if it doesn't crash or if it doesn't happen to artifact within those 10 minutes they think it's stable.
> It can take many hours for an unstable memory or core overclock to display a visual artifact or crash a game, and depending on the game or application, it can take way longer or never happen.
> Lots of people already say Valley allows them higher overclocks than FireStrike. Same thing happens with FireStrike compared to so many games out there.
> 
> It would be cool if somebody released a better benchmark, or if people stopped using FireStrike to find stability because it's obviously so terrible for that.
> It was the same story with my 970, though not to this large of a degree.
> 
> I guess I'll just see more of you complaining on Steam and developer forums when you decide to play your next hyped AAA game for more than 5 hours complaining about how it's crashing despite never crashing in FireStrike
> 
> 
> 
> 
> 
> 
> 
> 
> I'm sure there will be a huge wave for Cyberpunk 2077, though that will take a while. Damn those pesky RPG's that are demanding and require like 50 hours of stability to beat, like Witcher 3.






Many people reference their benchable settings in this thread (because let's face it, what's the best way to show off how good your card is? Bench it, and bench it hard). The fact is that 24/7 stable settings are definitely lower. In my case, I can bench my cards at +170/+600 core/mem in SLI, but if I try to run games at those settings, the video card driver will crash within 30 seconds to a minute. It just doesn't work for games.

I keep my 24/7 max OC around +100/+500 core/mem. It could probably go higher than that for 24/7 use, but I'd rather play the games than worry about whether my OC is stable, so I don't push it too hard. In fact, I run the cards at stock a lot of the time because it is more than enough for the games I play. The OC is just icing on the cake.

But yea, I get what you're saying. Benchmarks are less taxing on hardware over time compared to actual games.
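The step-and-soak method described above (raising the offset +10MHz at a time and logging where artifacts first appear per game) boils down to taking the worst-behaved workload and backing off a margin. A minimal Python sketch of that bookkeeping, purely for illustration - the game names, offsets, and margin are made-up examples, not anyone's real results:

```python
# Illustrative sketch: pick a conservative 24/7 core offset from
# per-workload "first artifact" observations, backing off by a
# safety margin. All numbers/names here are hypothetical.

def stable_offset(first_artifact_mhz, step=10, margin_steps=3):
    """Return the lowest first-artifact offset across workloads,
    minus margin_steps OC steps as headroom."""
    worst = min(first_artifact_mhz.values())
    return worst - step * margin_steps

observations = {
    "Rise of the Tomb Raider": 320,  # artifacts first seen at +320
    "FireStrike": 400,               # never artifacted below +400
    "Witcher 3": 340,
}

print(stable_offset(observations))  # -> 290
```

The point mirrors the posts above: the benchmark's threshold (+400 here) is irrelevant, because the fussiest game sets the real ceiling.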


----------



## GreedyMuffin

My card is game and folding stable at 2139MHz on stock voltage.

Currently running 2050 at 0.950V. ^^

Also game stable and folding stable.


----------



## Mad Pistol

Quote:


> Originally Posted by *GreedyMuffin*
> 
> My 2139mhz stock voltage is game and folding stable.
> 
> Currently running 2050 at 0.950V. ^^
> 
> Also game stable and folding stable.


Yea, but you have a 1080 w/ Waterblock, so yours doesn't count.


----------



## TheLAWNOOB

What is the stock voltage on these things? Can voltage be set using afterburner offset?


----------



## ITAngel

Quote:


> Originally Posted by *saunupe1911*
> 
> Yep, 2. There are 2 RAM modules inside the GPU


Oh okay thanks!


----------



## madmeatballs

If you plan to update your driver to 372.54, hold off; there seems to be a problem with it. My overclock got messed up and it gives me some weird lag in BF4.

Update: fixed the issue, just don't forget to quit/close MSI Afterburner before installing the new driver (372.54)


----------



## GreedyMuffin

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> What is the stock voltage on these things? Can voltage be set using afterburner offset?


If you want to increase voltage you'll need to use the curve, sadly.

Stock voltage is 1.050V. At least for me.
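For anyone wondering what "using the curve" means: Afterburner's curve editor maps voltage points to frequencies, and locking a point effectively flattens everything above it to that point's frequency. A rough conceptual Python sketch of the idea - the voltage/frequency pairs below are invented for illustration, not read from a real card:

```python
# Conceptual sketch of flattening a V/F curve at a voltage cap,
# roughly what "locking" a point in Afterburner's curve editor does.
# The curve values below are invented for illustration only.

def flatten_curve(curve, v_cap):
    """Clamp every point above v_cap to the frequency at v_cap."""
    f_cap = max(f for v, f in curve if v <= v_cap)
    return [(v, f if v <= v_cap else f_cap) for v, f in curve]

curve = [(0.800, 1800), (0.900, 1950), (0.950, 2050), (1.050, 2100)]
for v, f in flatten_curve(curve, 0.950):
    print(v, f)
```

Running at a fixed 0.950V like the post describes means the card never climbs past the flattened section, which is also why it can save a fair bit of heat versus stock boost behavior.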


----------



## watermanpc85

Hi guys!! I'm thinking about upgrading my beloved MSI GTX 970 (custom BIOS, +145 power limit @ 1535/1850/1.256V) to a 1070.

The thing is, I'm a bit disappointed with the performance increase I see in reviews/benches. I hoped for something more than 45-60% at best, especially at the prices they are now. So I wonder if there is any news about a custom BIOS that would unlock the power limit and increase voltage to achieve something like 2200/2300/2400MHz on the core, and so widen the gap between the 970 and the 1070.

Also, if there isn't, do you think there is any hope for these cards to push their OC levels much harder???

Thanks!!


----------



## TheGlow

I went and got the MSI Gaming X 8G 1070.
What do I need to know about it? I read up on a BIOS update to change the default mode after some whining about review copies.
Any repercussions to putting it into OC mode? Normally I would do a mild to medium OC the first year or so, and then push it around the time I'm looking for an upgrade, as I'm unsure if I'd be killing its lifespan.


----------



## STIguy312

Quote:


> Originally Posted by *TheGlow*
> 
> I went and got the MSI Gaming 8 X 1070.
> What do I need to know about it? I read up on a bios upgrade to change default mode after some whining about review copies.
> Any repercussions to putting into OC mode? Normally I would do a mild to medium OC the first year or so, and then push it around the time im looking for an upgrade as Im unsure if I'd be killing its lifespan.


I was able to push my Gaming X 8G to its power limit with a core clock well over 2000 MHz water cooled. I can get a GPU-Z validation later today as I am not at my PC.

If you're looking for a mild overclock, it will handle OC mode from the MSI gaming app that is available from the MSI website. It was designed to work with the card in stock form.


----------



## Mad Pistol

Quote:


> Originally Posted by *watermanpc85*
> 
> Hi guys!!, Im thinking about upgrade my loved gtx 970 MSI (custom BIOS +145 Power limit @1535/1850/1.256v) to a 1070 model.
> 
> The thing is Im a bit dissapointed with the performance increase I see from reviews/benches, I hopped for something more than a 45/60% at best, specially at the price they are now, so I wonder if there is any news about custom BIOS in order to be able to unlock power limit and increase voltage to achieve something like 2200/2300/2400Mhz on the core and so enlarge the gap between 970 and 1070.
> 
> Also, if there isnt, do you think is there any hope for this cards to really push way harder their OC levels???
> 
> Thanks!!


For the moment, the chips seem to max out around the 2000-2150MHz level. Anything past that is virtually unheard of. Any card, whether it's the Founders Edition or an MSI Gaming Z model, does about the same.

If you want one, find the one you want and enjoy it. However, if you're happy with your 970, I would suggest just keeping that for another year.


----------



## STIguy312

Quote:


> Originally Posted by *Mad Pistol*
> 
> For the moment, the chips seem to OC max to around the 2000-2150mhz level. Anything past that is virtually unheard of. Literally, any card, whether it's the Founders Edition or an MSI Gaming Z model, they all do about the same.
> 
> If you want one, find the one you want and enjoy it. However, if you're happy with your 970, I would suggest just keeping that for another year.


As Mad Pistol stated, the OC max is pretty limited. Wish I knew how limited before I bought a water block for it. The card stays nice and cool though!

If you're thinking of selling the 970 to put money towards a new card, I would start thinking about that now as their used value continues to drop.


----------



## watermanpc85

Thanks both guys!!!

Yes, I want to sell my 970 and put the money toward the new 1070. Do you think there will be a price drop anytime soon?? 450-500 € for the xx70 series is quite crazy IMHO.









Btw, your replies focused on the fact that all cards seem to OC the same, but my question was more about whether there is (or will be) a CUSTOM BIOS to unlock the power limit and voltage and thus increase OC levels... any clue about that??


----------



## Mad Pistol

Quote:


> Originally Posted by *watermanpc85*
> 
> Thanks both guys!!!
> 
> Yes, I want to sell my 970 and put the money on the new 1070, do you think there will be a drop in price anytime soon?? 450/500 € for xx70 series is quite crazy IMHO
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Btw, your replies were more focused on the fact that all cards seem to OC the same, but my question was more focused about if there is/will be/maybe will be a CUSTOM BIOS to unlock power limit and voltage and thus increase OC levels...any clue about that??


I think custom BIOSes are already in the works, but it remains to be seen whether unlocking voltage and power limits will have any effect on Pascal's performance. Also, I wouldn't expect a price drop anytime soon. Nvidia is at the head of the market, so at this point they are only competing against themselves.


----------



## watermanpc85

Thanks for your help, Mad Pistol!! As I suspected, no price drop for a while...









Let's hope for a good OC improvement with custom BIOSes.


----------



## 113802

http://www.3dmark.com/fs/9824119

Still can't reach anywhere near 22k

My gaming/stable settings are +90/+555


----------



## gtbtk

Att: MSI Gaming X 1070 owners with Micron memory (default bios 86.04.26.00.3E)

Has anyone with the Micron memory card flashed their card with the MSI Gaming X default OC bios version 86.04.1E.00.40?

If so how well did it work? did you need to flash back?

Other than defaulting to a base clock of 1607MHz, did you notice any other differences?

Did you get the 50 Mhz boost on the video ram?

Did you notice any change in the occurrence of the checkerboard artifacts if you pushed the memory OC past about +450?


----------



## bigjdubb

Quote:


> Originally Posted by *WannaBeOCer*
> 
> http://www.3dmark.com/fs/9824119
> 
> *Still can't reach anywhere near 22k*
> 
> My gaming/stable settings are +90/+555


What do you mean? 21,750 is very near 22k.


----------



## Prozillah

That's the highest score I've seen posted yet.


----------



## M0E

Quote:


> Originally Posted by *WannaBeOCer*
> 
> http://www.3dmark.com/fs/9824119
> 
> Still can't reach anywhere near 22k
> 
> My gaming/stable settings are +90/+555


Doing better than I am









http://www.3dmark.com/compare/fs/9824119/fs/9829550


----------



## TheGlow

What are some recommended benchmarks?
I recall using the Heaven bench a couple months ago when trying to OC my AMD R9 380.
This time I ran it once, mostly extreme settings I think, at 1440p in Gaming mode, and got 1433.
I put the card in OC mode and oddly scored lower, 1418.


----------



## M0E

Quote:


> Originally Posted by *TheGlow*
> 
> What are some recommended benchmarks?
> I recall using Heaven bench a couple months ago when trying to OC my amd r9 380.
> This time I ran it once, mostly extreme i think, 1440p in gaming mode and got 1433.
> I put on OC mode and oddly scored lower, 1418.


FireStrike seems to be the go-to benchmark of the last year or two. I still use 3DMark11 and Heaven as well, so I can compare against previous tests on my system now that I'm beginning to find the need to upgrade.


----------



## striker3

I have a Gigabyte G1 Gaming GTX 1070 and I have a problem with OC.

First, my card's default clocks in game don't even reach 2000MHz; it's 1936 and sometimes 1975. I can't OC the RAM even +200 - it gives me artifacts in 3DMark before the test even starts - and I can't OC the core even +100. I've tried almost everything. I'm using the latest MSI Afterburner version and the latest Nvidia driver. Any help?!


----------



## amd7674

same price EVGA FTW or Strix OC?







I'm leaning toward EVGA.... please advise...


----------



## criminal

Quote:


> Originally Posted by *amd7674*
> 
> same price EVGA FTW or Strix OC?
> 
> 
> 
> 
> 
> 
> 
> I'm leaning toward EVGA.... please advise...


EVGA has better customer service.


----------



## 113802

Quote:


> Originally Posted by *amd7674*
> 
> same price EVGA FTW or Strix OC?
> 
> 
> 
> 
> 
> 
> 
> I'm leaning toward EVGA.... please advise...


Two EVGA 1070 FTWs I received have loud coil whine, and many others have reported coil whine with the FTW card. Asus and Sapphire use higher-end chokes to reduce noise.


----------



## Face2Face

Quote:


> Originally Posted by *amd7674*
> 
> same price EVGA FTW or Strix OC?
> 
> 
> 
> 
> 
> 
> 
> I'm leaning toward EVGA.... please advise...


The STRIX is the better card out of the box. Most boost to 2GHz out of the box and top out at 62-63C under load with average ambient temps. If you plan on overclocking using custom BIOSes in the future, then I would recommend the FTW.


----------



## Face2Face

If you guys could choose one, which would it be?

ASUS GTX 1070 STRIX OC @ $390
MSI GTX 1070 SEA HAWK @ $420


----------



## amd7674

Quote:


> Originally Posted by *Face2Face*
> 
> The STRIX is the better card out of the box. Most boost to 2GHz out of the box and top out at 62c-63c under load with average ambient temps. If you plan on overclocking using custom BIOeS in the future, then I would recommend the FTW.


Thank you... The coil whine issues are a little scary.... hmmm...


----------



## saunupe1911

Quote:


> Originally Posted by *Face2Face*
> 
> If you guys could choose one, which would it be?
> 
> ASUS GTX 1070 STRIX OC @ $390
> MSI GTX 1070 SEA HAWK @ $420


They are equal IMO, but the Strix has multiple HDMI ports, which helps if you use your PC as an HTPC or for VR. Plus it has two PWM fan headers to add extra case fans that are controlled by GPU temps.

The MSI is smaller, so it would be a better fit in a tight case. The Asus's 3-fan setup keeps it really cool, plus you can expand.
The Asus's fans aren't loud at all. No coil whine, and I haven't heard of a Strix OC that won't go over 2000MHz core and 9000MHz memory.

These were all my reasons for going Asus. The Zotac AMP Extreme with its 3 fans was my second choice, but it wouldn't have fit in my current setup.


----------



## saunupe1911

apologies duplicate post


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *amd7674*
> 
> same price EVGA FTW or Strix OC?
> 
> 
> 
> 
> 
> 
> 
> I'm leaning toward EVGA.... please advise...


Because of the timing of ur purchase (now versus the original 1070 release date), and in addition to all else that's been said, u have to weigh any existing or potential advantage(s) of the EVGA 90-day trade-up program, too. (i forget the facts. i haven't had an EVGA card in a few years. idk about any changes either.)

EVGA also has (last i knew) a very nice extended warranty program that was priced ok when i used it a few years ago. and their forums/ support ROCK!

GL









PS i have two 1070 Strix but i bought them shortly after the 1070 first went on sale. (so there was next to no reason for me to consider EVGA versus u now.) Whereas now, we know so much more about 10xx performance, prices are established, and potential new releases and modifications are somewhat apparent, etc.


----------



## Sueramb6753

-snip-


----------



## amd7674

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> Because of the timing of ur purchase (now versus the original 1070 release date); and in addition to all else have said, u have to equate any existing or potential advantage(s) of the EVGA 90 day trade-up program, too. (i forget the facts. i haven't had an EVGA card in a few years. idk any changes either.)
> 
> EVGA also has (last i knew) a very nice extended warranty program that was priced ok when i used it a few years ago. and their forums/ support ROCK!
> 
> GL
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PS i have two 1070 Strix but i bought them shortly after 1070 first went on sale. (so there was next to no reason for me to consider EVGA versus u now.) Whereby now, we know so much more (of the 10xx) performance-wise, plus prices are established, and potential new releases and modifications are somewat apparent, etc.


Thank you for your reply 
I'm flip-flopping between the two. I'm terrified I will have to deal with sever EVGA coil whine... LOL.. My CEO of finances at home would not be too happy. I would buy the card locally with a chance my local store would try to help me first (30 days warranty).

If you were in my shoes what would you do? LOL Strix OC or EVGA FTW?


----------



## amd7674

Quote:


> Originally Posted by *Symix*
> 
> Sent my msi 1070 gaming x back due to faulty fan bearing
> 
> got a palit super jetstream and it has more coil whine than any card i've ever owned.
> 
> what's the next one I should try? maybe evga ?


Myself, I'm leaning toward the EVGA FTW (though EVGA cards are known for coil-whine issues). My buddy has coil whine with his Gigabyte Gaming model.
From what other users report, the Asus Strix might have fewer issues.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *amd7674*
> 
> Thank you for your reply
> I'm flip-flopping between the two. I'm terrified I will have to deal with sever EVGA coil whine... LOL.. My CEO of finances at home would not be too happy. I would buy the card locally with a chance my local store would try to help me first (30 days warranty).
> 
> If you were in my shoes what would you do? LOL Strix OC or EVGA FTW?


np

Seeing the paranoia involved with coil whine, i might do as u and just purchase from a local store, too. I like to buy HDDs and LCDs locally for the same reason... easy, quick, and no-hassle returns.

But also, maybe consider starting a thread on the EVGA forums about ur dilemma. (just refer to the Strix as "other" or "other brand".)

And they will tell u the options about RMAing a card with coil whine and they'll make clear about the advantages of their EVGA 90-day trade-up program.

GL









PS i do luv me Strix cards. (And ASUS did a great job three years ago with a MOBO RMA for me.) But the options i missed out on - as far as the EVGA trade-up program goes - were just a given loss by purchasing so early on. (i couldn't wait. lol)


----------



## USlatin

Quote:


> Originally Posted by *Face2Face*
> 
> If you guys could choose one, which would it be?
> 
> ASUS GTX 1070 STRIX OC @ $390
> MSI GTX 1070 SEA HAWK @ $420


The SeaHawk lets you mount side-panel fans since it is narrower, and it exhausts all hot air out of the case, which makes all of your components run significantly cooler (my CPU dropped 5C+; my case temp, and therefore my RAM, dropped 10C+). The GPU itself runs stupid cold - always sub-45C throughout 8 hours of load. It is also DEAD silent: no coil whine, no pump noise, no fan noise... getting 2.1GHz was a joke, it was so easy.

With the temperature deltas I'm getting, I could literally run my rig at full load for 8 hours, overclocked, in 45C ambient temps, haha.

I couldn't be happier


----------



## Face2Face

Quote:


> Originally Posted by *USlatin*
> 
> The SeaHawk lets you mount side panel fans since it is narrower, it exhausts all hot air out of the case which makes all of your components run significantly cooler (my CPU dropped 5C+, my case temp and therefor RAM dropped 10C+), and the GPU runs stupid cold like 1,000% always sub 45C throughout 8hs of load, it is also DEAD silent no coil whine, no pump noise, no fan noise... getting 2.1GHz was a joke it was so easy
> 
> with the temperature deltas I am getting I can literally run my rig full load for 8hs, overclocked, in 45C ambient temp, haha
> 
> I couldn't be happier


Wow. That's pretty amazing. Also, does the power limit only go up to 105%? If so, is that limiting?


----------



## 113802

Quote:


> Originally Posted by *amd7674*
> 
> Myself I'm leaning toward EVGA FTW (all EVGA are known for coil whine issues). My buddy has coil whine issues with his Gigabyte Gaming model.
> From what other users reported Asus Strix might have less issues.


Get an AMD Fury X for $365 with the visa promo code and call it a day.

https://m.newegg.com/Product/index?itemnumber=N82E16814150742

I'm trying to sell my 1070 FTW because of the annoying coil whine. Both cards I received required me to cap the frame rate at 30 FPS to stop the whine. I tested in this system and in a Rampage IV BE system with a Corsair AX1200i, but it still yells and screams the same.

Edit: I've only owned EVGA cards in the last six years.

GTX 470 - no whine - reference
GTX 780 - no whine - reference
GTX 780 Ti KingPin - only benchmarks
GTX 980 Ti - only benchmarks
GTX 1070 - every game!


----------



## USlatin

Quote:


> Originally Posted by *Face2Face*
> 
> Wow. That's pretty amazing. Also, does the power limit only go up to 105%? If so, is that limiting?


Haven't even checked, because it runs 2.1GHz solid without any dips at all, even at 100% load for 2 hours.


----------



## Swolern

Quote:


> Originally Posted by *WannaBeOCer*
> 
> I'm trying to sell my 1070 FTW because of the annoying coil whine. Both the ones I receieved required me to cap the card at 30 FPS to stop the whine. Tested it in this system along with a Rampage IV BE with a Corsair AX1200i but it still yells and screams fhe same.
> 
> Edit: I've only owned EVGA cards in the last six years.
> 
> GTX 470 - no whine - reference
> GTX 780 - no whine - reference
> GTX 780 Ti KingPin - only benchmarks
> GTX 980 Ti - only benchmarks
> GTX 1070 - every game!


I too bought EVGA exclusively for years. These last 2 gens I found the STRIX cooler to be quieter and cooler-running. Has anyone seen a direct comparison of the two?


----------



## Sueramb6753

-snip-


----------



## Prozillah

Quote:


> Originally Posted by *striker3*
> 
> i have gtx 1070 gigabyte g1 gaming and i have problem with oc
> 
> first my default card clocks in game dont even reach 2000mhz it is 1936 and some times 1975 . i cant oc ram even +200 it give me artifacts with 3dmark even before it starts
> and i cant oc core even +100 and i tried almost everything iam using latest msi afterburner version and latest nvidia driver any help?!


You may have just lost the lottery, my friend... but in any case, try a manual voltage lock - it should help your overclocks.


----------



## Prozillah

my curve as requested by some:


----------



## amd7674

Quote:


> Originally Posted by *WannaBeOCer*
> 
> Get an AMD Fury X for $365 with the visa promo code and call it a day.
> 
> https://m.newegg.com/Product/index?itemnumber=N82E16814150742
> 
> I'm trying to sell my 1070 FTW because of the annoying coil whine. Both the ones I receieved required me to cap the card at 30 FPS to stop the whine. Tested it in this system along with a Rampage IV BE with a Corsair AX1200i but it still yells and screams fhe same.
> 
> Edit: I've only owned EVGA cards in the last six years.
> 
> GTX 470 - no whine - reference
> GTX 780 - no whine - reference
> GTX 780 Ti KingPin - only benchmarks
> GTX 980 Ti - only benchmarks
> GTX 1070 - every game!


I won't be going back to the RED team... LOL... although the HD6950 was a good GPU.







Maybe if they had RX490 out ...

As for the coil whine, are you saying you get it at 60FPS with adaptive sync enabled? My kids' HTPC in the living room has a Zotac GTX 970 that had some coil whine when I was benchmarking in excess of 200FPS. However, that 1080p TV is set to 60Hz and I'm capping FPS to 60 using adaptive sync. Some people say the Witcher 3 menus are the worst (i.e. the best way to test it); I don't own the game so I cannot test it.

Did you try to RMA your cards with EVGA? And out of curiosity, what is your other PSU?


----------



## 113802

Quote:


> Originally Posted by *amd7674*
> 
> I won't be going back to the RED team... LOL... Although HD6950 was good GPU.
> 
> 
> 
> 
> 
> 
> 
> Maybe if they had RX490 out ...
> 
> As for the Coil Whine, are you saying you get it at 60FPS with adaptive sync enabled. My kids HTPC in the livingroom Zotac GTX 970 had some coil whine when I was benchmarking in excess of 200FPS. However the 1080p TV is set to 60Hz and I'm capping FPS to 60 using adaptive sync. Some people say Witcher 3 menu's are the worst (the best way to test it), I don't own the game so I cannot test it.
> 
> DId you try to RMA your cards with EVGA, what is your other PSU? out of curiosity.?


I'm using a Seasonic X-850 Gold in this rig; my Rampage IV Black Edition uses a Corsair AX1200i.

I requested a cross-ship RMA, and both cards screamed at any frame rate above 31 FPS. I kept my first card because, surprisingly, the second one couldn't hit 2GHz. The fans are quiet and the card runs cool. I purchased EVGA because their customer service is great. I also preferred the aesthetics of the card.

http://www.3dmark.com/fs/9640486



Looking at current reviews with the latest drivers, the Fury X has surpassed the 980 Ti, and at 4K it trades blows with the 1080.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *Symix*
> 
> Definitely not evga then, also heard asus strix is a loud cooler compared to others


ASUS GeForce GTX 1070 STRIX Gaming review - Graphics Card Noise Levels:

http://www.guru3d.com/articles_pages/asus_geforce_gtx_1070_strix_gaming_review,11.html


----------



## Prozillah

Quote:


> Originally Posted by *WannaBeOCer*
> 
> I'm using a Seasonic X-850 Gold in this rig, my Rampage IV Black Edition uses a Corsair AX1200i
> 
> I requested a crossship RMA and both cards screamed at any frame rate above 31 FPS. I kept my first card because the second one couldn't hit 2ghz surprisingly. The fans are quiet and the card runs cool. I purchased EVGA because their customer service is great. I also preferred the aesthetics of the card.
> 
> http://www.3dmark.com/fs/9640486
> 
> 
> 
> After looking at current reviews with the latest drivers the Fury X it has surpassed the 980 Ti and @ 4K it trades blows with the 1080


Can u please bang a link up for that? That's quite impressive and good buying if that's the case


----------



## saunupe1911

Quote:


> Originally Posted by *Prozillah*
> 
> my curve as requested by some:


Interesting that you never touch your voltage


----------



## waylo88

So I saw that MSI cards have a problem with the Micron memory. Is this only MSI cards? I ask because my Strix apparently has Micron memory.


----------



## Prozillah

Quote:


> Originally Posted by *saunupe1911*
> 
> Interesting that you never touch your voltage


Yea, my card is funny: it won't boot at stock voltages with any mem OC above +100, but with the voltage locked to 1.05V it's rock solid stable at those clocks.


----------



## 113802

Quote:


> Originally Posted by *Prozillah*
> 
> Can u please bang a link up for that? That's quite impressive and good buying if that's the case







Seems like the Fury X competes with the Maxwell Titan X. I found one review pitting the 1080 against the Fury X @ 4K, but it seems it was a hoax.


----------



## jlhawn

Quote:


> Originally Posted by *waylo88*
> 
> So I saw that MSI cards have a problem with the Micron memory. Is this only MSI cards? I ask because my Strix apparently has Micron memory.


I heard the Micron VRAM doesn't OC well; I'm lucky my MSI GTX 1070 came with Samsung VRAM chips.
OC yours and see what happens, but I would say if there is a problem with MSI cards' VRAM, it comes down to the VRAM chips and not
the graphics card manufacturer.


----------



## owikhan

Quote:


> Originally Posted by *Symix*
> 
> Sent my msi 1070 gaming x back due to faulty fan bearing
> 
> got a palit super jetstream and it has more coil whine than any card i've ever owned.
> 
> what's the next one I should try? maybe evga ?


Why not check out the Zotac 1070 AMP Extreme Edition?


----------



## Swolern

Samsung memory on both of my STRIXs.


----------



## amd7674

Quote:


> Originally Posted by *WannaBeOCer*
> 
> I'm using a Seasonic X-850 Gold in this rig, my Rampage IV Black Edition uses a Corsair AX1200i
> 
> I requested a crossship RMA and both cards screamed at any frame rate above 31 FPS. I kept my first card because the second one couldn't hit 2ghz surprisingly. The fans are quiet and the card runs cool. I purchased EVGA because their customer service is great. I also preferred the aesthetics of the card.
> 
> http://www.3dmark.com/fs/9640486
> 
> 
> 
> After looking at current reviews with the latest drivers the Fury X it has surpassed the 980 Ti and @ 4K it trades blows with the 1080


Thank you for the info. So are you still waiting for a replacement, or does the replacement have coil whine too?
Quote:


> Originally Posted by *waylo88*
> 
> So I saw that MSI cards have a problem with the Micron memory. Is this only MSI cards? I ask because my Strix apparently has Micron memory.


Can you please post a GPU-Z screenshot? I thought MSI only shipped some 1070 GPUs with Micron VRAM.


----------



## lennal

Has anyone experienced a STOP error caused by nvlddmkm.sys with their GTX 1070?
Could the card be faulty and causing the BSOD, or can an overclock cause it in the middle of playing a game?


----------



## waylo88

Quote:


> Originally Posted by *amd7674*
> 
> Can you please post GPU-Z screenshot. I thought MSI only shipped some 1070 GPUs with micron vram.


----------



## amd7674

Quote:


> Originally Posted by *waylo88*


Thank you, is this OC or non-OC strix?


----------



## waylo88

Quote:


> Originally Posted by *amd7674*
> 
> Thank you, is this OC or non-OC strix?


I think non-OC. It's this one:

http://www.newegg.com/Product/Product.aspx?item=N82E16814126111


----------



## amd7674

Quote:


> Originally Posted by *lennal*
> 
> Has anyone experienced with their GTX 1070 a STOP error caused by nvlddmkm.sys?
> Could the card be faulty and causes bsod? or can overclock cause it in the middle of playing a game?


Are you using the latest Nvidia drivers? I don't own a 1070 yet, but a lot of peeps complained about the latest drivers not being OC "friendly".


----------



## amd7674

Quote:


> Originally Posted by *waylo88*
> 
> I think non-OC. It's this one:
> 
> http://www.newegg.com/Product/Product.aspx?item=N82E16814126111


Looks like MSI and Asus are both using Micron in some of their cards. I wonder if other companies are doing the same.


----------



## chaous2000

Finally returned my MSI GTX 1070 Gaming X and switched it for an EVGA SC 1070. Got golden ******* memory: +130 on the core and +600 on the memory.


----------



## whicker

My friend's G1 Gaming that was bought last week has Micron as well. Seems like the Samsung RAM is running out. Glad my Strix OC has Samsung.


----------



## benjamen50

Any EVGA FTWs with micron memory?


----------



## amd7674

Quote:


> Originally Posted by *chaous2000*
> 
> Finally returned my MSI gtx gaming x 1070 and switched for an EVGA SC 1070. Have golden ******* memory, got 130+ on the core, and 600+ on the memory.


Congrats









Any coil whine issues?


----------



## lennal

Quote:


> Originally Posted by *amd7674*
> 
> are you using the latest nvidia drivers? I don't own 1070 yet, but a lot of peeps complained about the latest drivers not being OC "friendly"


So I heard. I am using ver 368.81, so it should be fine with regard to OC, and I didn't push the OC too hard either: +126/+512. There shouldn't be much of a problem.


----------



## amd7674

Quote:


> Originally Posted by *lennal*
> 
> So I heard. I am using ver 368.81. So it should be fine in regards to oc. and I didn't push the oc too hard either. +126/+512. there shouldn't be much of a problem


I would try your crashing game without any OC; it sounds like an OC issue. Don't forget some cards won't do even +30 on the core. Also, just because your benchies pass at a certain speed doesn't mean it's game stable. It is the silicon lottery after all.
Good luck.
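The "drop back to stock, then work back up" advice above generalizes to a simple bisection between a known-good offset (stock, +0) and a known-bad one (where the game crashed). This is only a conceptual Python sketch: `is_stable` stands in for hours of actual gameplay, and all the numbers are invented for illustration:

```python
# Illustrative sketch: bisect between a known-good and known-bad
# core offset to narrow down a stable value. The is_stable()
# predicate stands in for "play your crashing game for a while";
# the numbers below are hypothetical.

def bisect_offset(good, bad, is_stable, step=13):
    """Shrink the [good, bad) window until it is one OC step wide,
    then return the highest offset that passed."""
    while bad - good > step:
        mid = (good + bad) // 2
        if is_stable(mid):
            good = mid
        else:
            bad = mid
    return good

# Pretend this card is stable up to +117 and crashes above that.
print(bisect_offset(0, 126, lambda off: off <= 117))
```

Each probe costs real play time, so halving the window beats creeping up one step at a time when the crash point is far from where you started.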


----------



## madmeatballs

Quote:


> Originally Posted by *lennal*
> 
> So I heard. I am using ver 368.81. So it should be fine in regards to oc. and I didn't push the oc too hard either. +126/+512. there shouldn't be much of a problem


I had this problem. I fixed it by simply closing MSI Afterburner (or whatever overclocking tool you use) prior to installing the new driver - that puts your card back to stock. My OC actually improved after the driver update.


----------



## lennal

Quote:


> Originally Posted by *amd7674*
> 
> I would try your crashing game without any oc. It sounds like oc issue. Don't forget some cards won't do even +30 on core. Also because your benchies pass certain speed, it doesn't mean it is game stable. It is silicon lottery after all.
> Good luck.


I am about to turn oc off to test whether or not I would still get the stop error.
Quote:


> Originally Posted by *madmeatballs*
> 
> I had this problem. I fixed it by simply closing MSI afterburner or your overclocking tool prior to installing the new driver(that way puts your card to stock). My oc actually improved after the driver update.


You got BSODs from OCing with Afterburner, and the update fixed it? Then it does sound like OCing may be the problem.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *lennal*
> 
> I am about to turn oc off to test whether or not I would still get the stop error.
> You got bsod from ocing using afterburner? and the update fixed it? then it does sound like ocing may be the problem


i found that Afterburner and an improperly installed vid driver were not the only things that could cause problems with the 1070.

-a proper and *clean vid driver installation* seemed to eliminate most freeze-ups and BSODs, but not completely over a span of several hours or days; i tried several times

-same for a *complete clean uninstall/install of Afterburner (wiping any evidence that it was ever installed)* prior to ^^; i tried several times

-only after *fully uninstalling any hardware utility(s) (wiping any evidence that they were ever installed)* - i'm talking about ones that include GPU fan sensor reading ability, too - on top of the two steps ^^, did the BSODs and freeze-ups completely disappear and never return - not one time.

-conclusion:
1. uninstall Afterburner fully.
2. complete full uninstall of any and all Hardware utilities with GPU fan sensor reading abilities, and Restart.
3. Uninstall vid drivers fully, and Restart. And do a clean install of vid drivers.
4. test vid card running stock out-of-box settings without any OCing, test Games and Benchmarks. Continue testing without Afterburner or any Hardware Utilities installed. Include testing of PC fully. Example: be sure all is fine after restarting and shutting down and see if Games and Benchmarks are still fine.
5. install Hardware utilities and test PC all around again
6. install Afterburner and test PC all around again without any adjust whatsoever in Afterburner
7. if all is fine ^^, then test OCing with Afterburner

GL


----------



## lennal

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> i found that both afterburner, and not properly installing vid driver, was not the only thing that could cause problems with 1070.
> 
> -proper and *clean vid driver installation* seemed to eliminate all freeze-ups and BSODs; but did not completely over a span of several hours or days; i tried several times
> 
> -same for a *complete clean uninstall/ install (wipe any existence that it was ever installed) of Afterburner* being uninstalled prior to ^^; i tried several times
> 
> -only after *fully uninstalling any Hardware utility(s) (wipe any existence that it or they were ever installed)* - i'm talking about ones that include GPU Fan sensor reading ability, too; and the two steps ^^; and result was that BSOD and freeze-ups DID completely disappear and never returned- not one time.
> 
> -conclusion:
> 1. uninstall Afterburner fully.
> 2. complete full uninstall of any and all Hardware utilities with GPU fan sensor reading abilities, and Restart.
> 3. Uninstall vid drivers fully, and Restart. And do a clean install of vid drivers.
> 4. test vid card running stock out-of-box settings without any OCing, test Games and Benchmarks. Continue testing without Afterburner or any Hardware Utilities installed. Include testing of PC fully. Example: be sure all is fine after restarting and shutting down and see if Games and Benchmarks are still fine.
> 5. install Hardware utilities and test PC all around again
> 6. install Afterburner and test PC all around again without any adjust whatsoever in Afterburner
> 7. if all is fine ^^, then test OCing with Afterburner
> 
> GL


When you say vid driver, I'd only need to uninstall the graphics driver, right? Or should I reinstall all the NVIDIA drivers, such as 3D Vision and PhysX? I assume GeForce Experience can be excluded?
Generally speaking, Afterburner and EVGA Precision XOC would be the only hardware utilities capable of GPU fan sensor readings, correct?


----------



## rfarmer

Quote:


> Originally Posted by *lennal*
> 
> When you say vid driver, I'd only need to uninstall the graphics driver, right? or should I reinstall all nvidia drivers, such as 3d vision and physx? I'd assume geforce experience can be excluded?
> Generally speaking, afterburner and evga precision oc would be the only hardware utility capable of gpu fan sensor readings, correct?


When I was in the 970 owners club, everyone there used this utility to completely remove video drivers: http://www.guru3d.com/files-details/display-driver-uninstaller-download.html


----------



## USlatin

Just to update: I checked my GPU voltage at 100% load; the clock stays at 2101MHz and the voltage stays at *1.012V*, steady.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *lennal*
> 
> When you say vid driver, I'd only need to uninstall the graphics driver, right? or should I reinstall all nvidia drivers, such as 3d vision and physx? I'd assume geforce experience can be excluded?
> Generally speaking, afterburner and evga precision oc would be the only hardware utility capable of gpu fan sensor readings, correct?


A clean uninstall of vid drivers should be ALL nVidia-related (including 3D Vision, PhysX and Experience). Unfortunately i can't advise on GeForce Experience because i've never used it. u might be able to back up some settings, but if ur goal is to prove a 1070 or 1080 is AOK with no freeze-ups or BSODs, then test with only the vid card drivers (including PhysX), and skip Experience and the 3D drivers until all is proven AOK; u can install them later and see if all remains AOK after doing so.

as far as utilities with GPU fan sensor reading go, practically all hardware utilities have them. So take into account all that run automatically when Windows starts, and consider manually-run ones too, but focus on uninstalling the ones that start with Windows and run in the background. Manually-run ones can simply not be run until later, after all is proven AOK; they usually have nothing to uninstall, are often called "self-standing", and should pose no threat if not started. After all is well they should be fine too, though a freshly downloaded and extracted newer version would be advisable.

After u find the steps to follow (or the order of steps to avoid) and prove they were the cause, it should be np to do anything with ur PC with said steps always in the back of ur mind. Although sometimes better drivers from the manufacturer (nVidia) eliminate said problems, or change which steps are needed. The 10xx series cards are fresh out, and stable drivers should take at least a few more months; imho the average time for super-stable drivers that practically never need updating is approximately 3 to 6 months. (one of many obvious advantages to waiting before buying a new series of vid cards. But if one can't wait, because they have to have it like me lol, then they have to put up with any hassles.)


----------



## Dude970

Changed my CPU and Ram, got a bump in FS









A few more tweaks and I will have the GPU score at 22k again

http://www.3dmark.com/fs/9853670



And Timespy


----------



## chaous2000

Quote:


> Originally Posted by *amd7674*
> 
> Congrats
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any coil whine issues?


not at all, which is amazing.


----------



## whicker

Are any of your custom fan profiles staying after you close MSI AB? My OC stays, but MSI AB needs to be open for the fan profile. I never had this problem with my MSI 770, but now I have a Strix.


----------



## nacherc

MSI GTX 1070 Gaming X

In Valley:
+120 on the core
+900 memory
100% fan


----------



## chaous2000

Quote:


> Originally Posted by *nacherc*
> 
> 
> 
> 
> MSI GTX 1070 GAMING X
> 
> IN VALLEY
> +120 ON THE CORE
> +900 MEMORY
> 100% FAN


What was your power % TDP at? It helps having that extra 6-pin so you can draw more power; the EVGA SC only has the one 8-pin.


----------



## tps3443

The extra 6-pin does not help the GTX 1070 draw more power; it cannot even reach the limits of a single 8-pin. The card is held back by a power limiter, so it cannot achieve GTX 1080 levels. A single 8-pin is plenty!

Anyway, look what I just picked up, guys, for $280. I bought it from a young kid going to graphic design school; the card was paid for and supplied by financial aid. I drove 4 hours round trip to pick up this once-in-a-lifetime deal, leaving at 2AM lol. It was that good!

A $280 GTX 1080?! Are you kidding me?

He was desperate for money and had no computer to use it in. He didn't even know what it was used for. I do though









https://www.techpowerup.com/gpuz/details/g83dk

It is not a HUGE upgrade from a 1070, but it is decent, and for the money it is a no-brainer. I sold my GTX 1070 ACX SC for $350, so I went from a GTX 1070 to a GTX 1080 and still put $70 in my pocket. This was definitely a once-in-a-lifetime DEAL.


----------



## gtbtk

Quote:


> Originally Posted by *waylo88*
> 
> So I saw that MSI cards have a problem with the Micron memory. Is this only MSI cards? I ask because my Strix apparently has Micron memory.


I have a Micron-memory 1070 Gaming X. I found a number of Micron-memory vbioses from different manufacturers, including Asus, Gigabyte, Palit and Gainward; these are all bioses versioned 86.04.26.00.XX. I have not tried any Samsung-memory bioses on the card.

I felt daring yesterday and flashed my card with all the different Micron bioses I could find, to see what the differences were and how they affected performance.

With the other brands' bioses installed, there was some variation in how much power limit adjustment I could make; none of the others allow an increase to +126%, but given that I have never managed to get the card to 100% anyway, I didn't think that was much of a problem. Different brands also changed the top fan speed at 100%, but the differences were in the range of about 100-150 rpm.

The main thing I discovered is that with the other brands' vbioses installed, I did not get the checkerboard artifacts that I could quite easily trigger with the stock MSI bios, even when pushing the memory to more than +500 without locking the voltage.

The highest-clocked Micron bios I played with was the Palit one, which default-clocks the card to 1633MHz with stock vRAM clocks. With +100% voltage, the card will boost to 2012MHz at stock clocks.

The hypothesis I came up with from my testing is that the MSI bios seems to have the curve that manages VRM power delivery to the RAM set lower than competing cards. Maybe it was an engineering decision to manage temperatures? The Palit bios with fans at 100% does run the card about 3-4 degrees hotter than the stock MSI bios in the same tests. That said, the MSI bios's power-starvation limitation only makes itself felt if you push the memory to about +500MHz. Maybe MSI will come out with a new bios version at some stage, though the existing VRM power delivery is enough at in-spec settings, so I am not hopeful.


----------



## nacherc

126% TDP


----------



## zipper17

Out of the box:


OC (MSI AB, 111%-86C, +105, +502, no voltage):











btw, I already installed the latest driver, 372.54, but GPU-Z still shows 368.95??

=======================

hey guys, btw, noob question: is it safe to run the memory at 9GHz daily?

I ran Valley, Firestrike, Witcher 3 and Hitman and didn't notice any graphical errors, but will it overheat or decrease the life of the memory itself?
btw, it looks like my card also covers the vRAM area with the heatsink.

Let's just hope this card lasts lol, because I only upgrade my graphics card once per 3-5 years.
This is my jump from a GTX 560 Ti/GTX 750 Ti.
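For what it's worth, that "9GHz" figure is just the Afterburner memory offset added to the 1070's 4004MHz (DDR) readout, then doubled for the effective GDDR5 transfer rate. A minimal sketch of the arithmetic (the base-clock constant is the stock GTX 1070 value; the function name is mine):

```python
# GTX 1070 stock GDDR5: 2002 MHz real clock, shown as 4004 MHz (DDR) in
# MSI Afterburner, for an effective 8008 MT/s transfer rate.
AFTERBURNER_BASE_MHZ = 4004

def effective_rate_mhz(offset_mhz: int) -> int:
    """Effective GDDR5 transfer rate for a given Afterburner memory offset."""
    return (AFTERBURNER_BASE_MHZ + offset_mhz) * 2

print(effective_rate_mhz(0))    # stock: 8008, the advertised "8 GHz"
print(effective_rate_mhz(502))  # +502 offset: 9012, i.e. roughly "9 GHz"
```

So a +502 offset really is running the chips about 1 GHz (effective) past spec, which is why people ask about longevity.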

=======================
*EDIT:* *+105 +502 was causing crashes in Firestrike Extreme and the Firestrike stress test.

I had to lower the core clock a bit, to +80 +502, to be stable in every stress test and benchmark.*

*Stable config (MSI AB, 125%-92C, +105, +502, +100 voltage):*
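The trial-and-error above (pass benchmarks at one offset, crash in a stress test, step the core offset back down) is effectively a downward search. A sketch of that process; `is_stable` here is just a stand-in for actually running Firestrike/Heaven at each offset:

```python
def find_stable_offset(start: int, step: int, is_stable) -> int:
    """Step the core offset down from an ambitious starting value until
    the stability check passes; 0 means back at stock clocks."""
    offset = start
    while offset > 0 and not is_stable(offset):
        offset -= step
    return max(offset, 0)

# Pretend the card crashes above +80, as reported above:
result = find_stable_offset(start=105, step=5, is_stable=lambda o: o <= 80)
print(result)  # 80
```

In practice each `is_stable` call is a long benchmark run, so people use a coarse step first and then fine-tune, exactly as in the edit above.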


----------



## Mad Pistol

Quote:


> Originally Posted by *tps3443*
> 
> The extra 6 pin does not help the GTX1070 draw more power. It cannot reach the limits of a single 8 pin. The card is limited with a power limiter, so It cannot achieve GTX1080 levels.
> a single 8 pin is plenty!
> 
> Anyways, look what I just picked up guys for $280 bucks. I bought it from as young kid going to graphic design school, the card was paid for, and supplied by financial aid. I drove 4 hours round trip to pick up this once in a lifetime deal! I left at 2AM lol. It was that good!
> 
> A $280 GTX1080?! Are you kidding me?
> 
> He was desperate for money. And had no computer to use it in. He did not even know what it was used for. I do though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.techpowerup.com/gpuz/details/g83dk
> 
> It is not a HUGE upgrade from a 1070. But, it is decent. And for the money. It is a no brainer. I sold my GTX1070 ACX SC for $350
> 
> So, I went from a GTX1070, to a GTX1080 and still put $70 in my pocket! This was definitely a once in a lifetime DEAL.


It was probably stolen, especially if he had no clue what it was and was desperate for money.


----------



## Forceman

Quote:


> Originally Posted by *Mad Pistol*
> 
> It was probably stolen, especially if he had no clue what it was and was desperate for money.


That was my thought as well. Like they say: if it seems too good to be true, it probably is.


----------



## Curseair

Quote:


> Originally Posted by *benjamen50*
> 
> Any EVGA FTWs with micron memory?


Mine is Samsung.


----------



## Prozillah

Quote:


> Originally Posted by *gtbtk*
> 
> I have a Micron memory 1070 Gaming X card. I found a number of micron memory vbios from different manufacturers including Micron Memory cards from Asus, Gigabyte, Palit, Gainward. These are all bioses that are version numbered 86.04.26.00.XX. I have not tried any samsung memory bioses on the card.
> 
> I felt daring yesterday and flashed my cards with all the different Micron bioses that I could find to see what the differences were and how it effected performance.
> 
> I found that with the other brand bioses installed, I did get a variation on how much power limit adjustment that I could make, non of the others will allow an increase of +126, but given I have never managed to get the card to 100% anyway I didn't think it was that much of a problem. I did notice different brands would changed the top speed of the fan at 100% but the differences were in the range of about 100-150 rpm.
> 
> The main thing that I discovered is that with the other brand vbioses installed, I did not find the checkerboard artifact issues that I could quite easily get with the stock MSI bios even when pushing the card to more than +500 without locking the voltage.
> 
> The highest clocked Micron bios that I played with was the Palit that default clocks the card to 1633 mhz with stock clocked vRAM. With +100% voltage, the card will boost to 2012Mhz by at stock clocks.
> 
> The hypothesis that I came up with from my testing, is that the MSI bios seems to have had the curve that manages VRM power delivery to the RAM set lower than other competing cards. Maybe it was an engineering decision to manage temperatures? The Palit bios with fans at 100% does run the card about 3-4 degrees hotter than the stock MSI bios in the same tests. Having said that though, the MSI Bios power starvation limitations only make themselves felt if you push the card to abput +500Mhz. Maybe MSI will come out with a new bios version at some stage although the existing VRM power delivery is enough to power the card in spec settings so I am not hopeful.


Very interesting - I'm having the same issue on my G1 bios when OCing without a voltage lock.


----------



## DyndaS

hmmmm.... Palit Super JetStream or EVGA SC









Priority is Noise -> Temp -> Look


----------



## SuperZan

I can tell you first-hand that the SC is fantastic on all three counts. I've only heard good things about the JetStream though, so I'd go with whichever you fancy just a bit more.


----------



## deegzor

Anybody break that 2200mhz mark? --> http://www.3dmark.com/3dm/14284155


----------



## Hunched

Quote:


> Originally Posted by *DyndaS*
> 
> hmmmm.... Palit Super JetStream or EVGA SC
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Priority is Noise -> Temp -> Look


That Palit is supposed to be the quietest 1070, due to a larger cooler than almost anything else and good fans.
It's not available in NA, so I got the next quietest thing, the MSI Gaming 8G.

This is according to every review I could find that tested decibel levels; EVGA wasn't in any of them at the time, so I don't know about those.
Just avoid everything that uses smaller high-RPM fans, like the Gigabyte 1070s.


----------



## zipper17

Quote:


> Originally Posted by *deegzor*
> 
> Anybody break that 2200mhz mark? --> http://www.3dmark.com/3dm/14284155


mine is only 20.7K; maybe I'll try. Does that require increased voltage?
I'm worried about my card's longevity with a high OC; I use Adaptive Sync @60Hz for all games anyway.
I have traumatic experiences with my old 560 Ti, which died from overheating because the FPS was never capped & 1 fan died.

Btw, anyone playing this game? How does it perform on your current specs?

Mankind Divided gamegpu




looks like it's both CPU & GPU bound 0.o


----------



## deegzor

Quote:


> Originally Posted by *zipper17*
> 
> mine only 20.7K, maybe I would try, is that requires increase voltage?
> I'm afraid of my own card longevity for high oc, I used Adaptive Sync @60hz for all games anyway.
> Have traumatic experiences with my old 560ti that died from overheating because FPS never capped & 1 fan died.
> 
> Btw, Anyone play this game? How it performs to yours current specs?
> 
> Mankind Divided gamegpu
> 
> 
> 
> 
> looks like a cpu & gpu bound 0.o


I meant 2200MHz on the core









and yes, i have increased voltage to the max (1.093v) without unlocking the bios limits; the power limit is also set to max (112% on my model). I would stay under 80C and not run the fans at full speed when pushing the card to the limit. My tool picks: MSI AB for OC and GPU-Z for monitoring.

Btw, running GPU-Z in the background and checking it once in a while is one way to figure out if a fan has blown; it should immediately show in the max temp.
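That "watch the max temp" trick is easy to automate if you log temperatures (GPU-Z can write a sensor log to file). A hedged sketch: the threshold and function name are mine, and a real `temps_c` list would come from parsing that log rather than being hard-coded:

```python
def fan_probably_failed(temps_c, limit_c=83):
    """Flag a likely dead fan: the card spends sustained time at or above
    the thermal-throttle limit instead of peaking briefly and recovering."""
    over = [t for t in temps_c if t >= limit_c]
    return len(over) >= 3  # several samples pinned at the limit

healthy = [62, 64, 66, 65, 63]        # normal gaming load
broken = [70, 78, 83, 84, 84, 85]     # temp climbs and stays pinned
print(fan_probably_failed(healthy))   # False
print(fan_probably_failed(broken))    # True
```

The exact limit depends on the card's temp target, so tune it to what your own logs show as "normal".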


----------



## kpo6969

Quote:


> zipper17



Quote:


> btw already installed latest driver 372.54,but in GPuz still show 368.95 ??


Update GPU-Z to version 1.10.0.


----------



## xTesla1856

What are my options for flashing a BIOS on my G1 Gaming 1070? My card boosts itself to 1946MHz at stock settings, so there is not a lot of headroom left: when I crank the voltage, power limit and temp target I can go up to 2024MHz stably, but anything more and I get crashes and stuttering. Memory overclocking is also kinda lacking; +200 and I get crashes.

EDIT: My Afterburner monitoring looks super weird. (Heaven Bench 1440P Ultra)


----------



## amd7674

Strix OC or EVGA FTW???

A couple more days to make a very hard decision lol, first world problems lol. In Canada the EVGA warranty is not as good as in the US: I cannot get an advanced RMA, which covers cross-shipping costs. Asus on the other hand has a depot in my province, Ontario, so in the long run warranty would be cheaper with Asus. I only had one problem with a GPU in my life, with an XFX 8800GT; I had to cover shipping one way to California... but kudos to XFX, it was painless other than the cost.

I currently own an Asus GTX 670, solid build and no issues in the last 3 years.

I've read on another forum that one user reported an Asus Strix (non-OC) using Micron memory; however, the place I would be buying from gives me 14 days to return it, no questions asked. For the EVGA FTW I would get a 30-day return window.

For build I like the Asus: 3 fans, 4 heat pipes, extra fan outputs. However I do not like thermal pads for the vRAM, so I would give the EVGA card an edge there. My HAF 922 case is closed, so RGB bling is not a big issue, and both GPUs can be set to red. EVGA cards have some reports of coil whine, but both the local retailer and EVGA said they would help in case I run into an issue.

From my understanding there is no real advantage of 2x8-pin power on the EVGA vs the 1x8-pin connector on the Asus. 10+2 vs 6+1 (if I'm not mistaken) power phases. All 1070s are power limited and all of them OC to the 2000-2100 range; silicon lottery luck applies. Lol

So at the end I'm still flip-flopping ;-(. Both seem like great GPUs and will be massive upgrades from my current GTX 670.

I want to buy a 1070 to give my current rig ([email protected], 16GB RAM) the last shot in the arm for the next 3-4 years, after which I will build a new gaming rig from the ground up, including a new GPU.

Thanks again for your input (much appreciated) ;-). I'm currently on my week's vacation, so I'm typing this on my tiny phone. Apologies for typos; English is not my first language...

If anyone has any comments please post them. I want to pull the trigger by this Monday or Tuesday.


----------



## Sasquatch in Space

Proof of life.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *amd7674*
> 
> Strix oc or evga ftw???
> 
> A couple more days to make very hard decision lol first world problems lol. In Canada EVGA warranty is not as good as in US. I cannot get an advanced RMA which includes cover costs for cross shipping. Asus on the other hand has depot in my provide Ontario. So in long run warranty would be cheaper with Asus. I only had once problem with gpu in my life with xfx 8800gt, I had to cover shipping one way to California... But kudos to xfx it was painless other than costs.
> 
> I currently own asus gtx 670, solid build and no issues in the last 3 years.
> 
> I've read on another forum one user reported Asus strix (non-oc) using micron memory, however the place I would be buying from I would have 14 days to return it... No questions asked. For evga ftw I would get 30 days warranty.
> 
> For build I like asus 3 fans, 4 heat pipes, extra fan outputs. However I do not like thermal pads for vram. Therefore I would give EVGA card an edge. My haf 922 case is closed so RGB blings are not big issue and both GPUs can be set to red. Evga cards have some reports of coil whine, but both local retailer and evga said they would help in case I would try into an issue.
> 
> From my understanding there is no real advantage of 2x8 power on evga vs 1x8 power connector on asus. 10.2 vs 6.1 (if I'm not mistaken) power phases. All 1070 are power limited and all of them oc to 2000-2100 range, silicon lottery luck applies. Lol
> 
> So at the end I'm still flip flopping ;-(. Both seem like great GPUs and will be massive upgrades from my current gtx 670.
> 
> I want to buy 1070 to give my current rig [email protected], 16gb ram the last shot in the ram for the next 3-4 yesrs. After which I will build the new gaming rig from ground up including new gpu.
> 
> Thanks again for your input. ( Much appreciated ) ;-). I'm currently on my week vacation, as a result I'm typing this on my tiny phone. Apologies for typos, also my English is noty first language...
> 
> If anyone has any comments please post them,. I want to pull the trigger by this Monday or Tuesday.


do u have a g-sync monitor?


----------



## sew333

Quote:


> Originally Posted by *Blackfyre*
> 
> *LINK to higher quality version of image below:*
> 
> http://i.imgur.com/phmKP8F.jpg
> 
> ---


Pwr/Thrm/VRel/VOp. Why do you have this perfcap reason? High temps? But you have low temps. Is this normal?

I sometimes get a similar perfcap, PwrThrmVRel, after starting games. After 1 minute the PwrThrmVRel is gone and it stabilizes to VRel. My screen:


----------



## amd7674

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> do u have a g-sync monitor?


At the moment I have an LG 32" 32LD450 IPS panel, 4:4:4 RGB chroma compatible, running at 75Hz. For now at least I'm not planning on upgrading; when possible I will use 2K/4K DSR.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *amd7674*
> 
> At the moment I have lg 32" 32ld450 IPS panel, 4:4:4 RGB chroma compatible running at 75hz. For now at least I'm not planning on upgrading, when possible I will use 2k/4k DSR.


g-sync (made by nVidia) is the greatest tech EVER!

i was able to afford G-sync last November because i skipped upgrading my PC while the 970s and 980s were still too expensive. (Boy am i ever glad i waited and got two 1070s last month!) G-sync is the best money i ever invested PC-wise. So if ur PC is less than six years old (mine is seven, but it was all the best parts when i built it and still ROCKS thanks to 1070-SLI now), don't do anything but add a G-sync monitor *after adding a 1070*. And realize that the SLI experience is greatly improved with G-sync too. Game-play alone is a reason to join the G-sync wagon.

An example of improved game-play: thanks to G-sync, just today i was repeatedly, from approx 50 feet, literally aiming at an Eagle-from-out-of-nowhere swooping down on a pair of helpless natives in Far Cry 4, as they shouted, "help me help me the Eagle...".

You could even slightly exaggerate it, if u saw it, and say i was aiming at the Eagle's eye. Before G-sync i would NOT have been able to; i would have been lucky to hit the sky in the same time. (Poor natives would have died without me and G-sync.) i believe i actually killed a couple Eagles last night, but it was too late to remember what, how and when, because i was having so much fun hunting White Rhinoceros with an Elephant gun - just another example of game-play being greatly improved by G-sync, because u have to aim exactly behind the ear of the Rhinoceros while simultaneously kangaroo-jumping behind trees and large boulders for protection. i got at least five Rhinoceros last night - wearing my pajamas - and one more today, to unlock a total of two very important things.

All in all, PC gaming comes down to having fun. Benchmarks are a good tool for tweaking and QA, but *"fun"* is *"the PC gaming word"*! And *G-sync delivers it every time!*

GL


----------



## Curseair

Did I lose the silicon lottery with my 1070? It only does +405MHz on the memory; after that BF4 crashes.


----------



## DFroN

This got delivered and installed yesterday





I haven't had time to OC or bench yet, hopefully will get time this week. Will report results if I do. Out of the box it boosts to 1974MHz at 1.05v and settles at 38c.


----------



## Prozillah

Quote:


> Originally Posted by *Curseair*
> 
> Did I lose the silicon lottery with my 1070? Only boosts to 405mhz on the memory, After that BF4 crashes.


Na not really- I'd say middle of the road


----------



## tps3443

Quote:


> Originally Posted by *Mad Pistol*
> 
> It was probably stolen, especially if he had no clue what it was and was desperate for money.


Maybe so. But I am not really sure from where; all of the TigerDirect and CompUSA stores in my state have been shut down for years now.
He knew how much it was worth, but he must have been really desperate to sell a 1080 for $280. Either way, I got a picture of his ID and made a bill of sale that he signed, so this card is mine now. I drove a long way there and back to get it, and I paid for it.

The GTX 1070 that I upgraded from was purchased at nearly full price, so I know what it is like to spend a bunch of money on a graphics card.

I am just happy to have finally settled on a GPU! lol

In 30 days I purchased an RX 480 8GB, then a 1070, and now I'm on a GTX 1080. Well, there is nothing more powerful under $1,000 lol.

But I will tell you guys this right now: there is not all that much difference between a GTX 1070 and a GTX 1080. It is very hard to tell them apart in games at 1440p and 1080p.

So unless you are running 4K, a GTX 1070 is plenty! And even with my 1080 I have to turn down settings at 4K to maintain a happy 50-60 fps. I am glad to see 4K is attainable now with relatively affordable graphics cards.


----------



## TheGlow

What's with the MSI Gaming X Gaming App? The clock reports 1936 whether in Gaming or OC mode, so what is OC mode really doing?
Also, I haven't really OC'd a video card before: what are some good benchmarks to use, and where can I compare results? And what's the effect on life expectancy? Otherwise I'd rather leave it stock.


----------



## Dude970

Quote:


> Originally Posted by *deegzor*
> 
> Anybody break that 2200mhz mark? --> http://www.3dmark.com/3dm/14284155


I did today, running an old benchmark called Tropics. The Afterburner shot was taken with my phone because it surprised me.










The benchmark result


----------



## DyndaS

Quote:


> Originally Posted by *Hunched*
> 
> That Palit is supposed to be the quietest 1070 due to a larger coolers than almost everything and good fans.
> It's not available in NA so I got the next quietest thing, MSI Gaming 8G.
> 
> This is according to every review I could find that tested decibel levels, EVGA wasn't on any of them at the time so I don't know about them.
> Just avoid everything that uses smaller high RPM fans, like Gigabyte 1070's.


I know how loud the Gigabyte and Asus are this time around. I just couldn't really decide what exactly I should go for. I don't like that Palit doesn't allow removing the cooler.

Palit SJS ordered.


----------



## Mad Pistol

Quote:


> Originally Posted by *Dude970*
> 
> I did today running an old benchmark called tropics. Afterburner shot was with my phone cause it surprised me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The benchmark result


Now I'm really curious what my 1070 SLI setup will do at those settings.


----------



## Dude970

Other benchmarks won't take that hard of an OC. Valley, Heaven, and Tropics do; Firestrike has to be lower, and TimeSpy way lower. I am very happy with this 1070.


----------



## Mad Pistol

Might be running into a CPU bottleneck.


----------



## batman900

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> g-sync (made by nVidia) is the greatest tech EVER!
> 
> i was able to afford G-sync last November because i skipped upgrading my PC and the 970s and 980s were still too expensive. (Boy am i ever glad i didn't and got two 1070s last month!?
> 
> 
> 
> 
> 
> 
> 
> ) G-sync is the best money i ever invested PC-wise. So if ur PC is less than six years old (my PC is seven but all the best from the time i built it and still ROCKS because of 1070-SLI now) don't do anything but add a G-sync monitor *after adding a 1070*. And realize that the SLI experience is greatly improved with G-sync, too. And game-play alone, is a reason to join the G-sync wagon.
> 
> An example of improved Game-play: Thanks to G-sync, just today i was repeatedly, from aprox 50 feet, literally aiming at an Eagle-from-out-of-nowhere swooping down on a pair of helpless natives in Far Cry 4, as they shouted, "help me help me the Eagle...".
> 
> You could even slightly exaggerate it, if u saw it, and say i was aiming at the Eagle's eye. (nj) Before G-sync i would NOT have been able to. And instead i would have been lucky to shoot the sky in the same time. (Poor natives would have died without me and G-sync.
> 
> 
> 
> 
> 
> 
> 
> ) I believe i actually killed a couple Eagles last night, but it was too late to remember wat, how and when; because i was having so much fun seeking/ hunting White Rhinoceros with an Elephant gun- just another example of Game-play being greatly improved with G-sync. Because u have to aim exactly behind the ear of the Rhinoceros while simultaneously kangaroo jumping behind trees and large boulders for protection from it. I got at least five Rhinoceros last night- wearing my pajamas
> 
> 
> 
> 
> 
> 
> 
> - and one more today to unlock a total of two very important things.
> 
> All in all, PC gaming comes down to having fun. Benchmarks are a good tool for tweaking and QA, but *"fun"* is *"the PC gaming word"*! And *G-sync delivers it every time!*
> 
> GL


You sir are a strong promoter of this tech lol. I've got about $400 to spend right now and I'm shopping for a 1070 so I can max out Witcher 3. I've also been strongly considering a G-sync monitor. What gets me is that I never see any screen tearing with this BenQ 144Hz, not even when I drop into the 30-40 fps range; it does however feel choppy when it drops, and that's what has me interested in G-sync. I've experienced it short-term before and honestly couldn't tell a difference.

So, as someone who loves it so much: could you enlighten me as to how it reprieves low frame rates? Screen tearing aside, since I don't currently notice any, does that low a frame rate still feel smooth?

Thanks!


----------



## Mad Pistol

Here's the fun one... Unigine 4K, max settings.

Just so we're clear, this is not Intel HD graphics (LOL). It's GTX 1070 FE SLI @ +150 core/ +600 mem.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *batman900*
> 
> You sir are a strong promoter of this tech, lol. I've got about 400 to spend right now and I'm shopping for a 1070 so I can max out Witcher 3. I've also been strongly considering a G-sync monitor. What gets me is that I never have any screen tearing with this BenQ 144hz, not even when I drop into the 30-40 fps range. It does, however, feel choppy when it drops, and that's what has me interested in G-sync. I've experienced it short term before and honestly couldn't tell a difference.
> 
> So, as someone who loves it so much, could you enlighten me as to how you reprieve low frame rates with it? Screen tear aside, since I don't currently notice any, does that low of a frame rate still feel smooth?
> 
> Thanks!


after a quick glance at the wiki g-sync page for a refresh (lol) of my past knowledge, i think i found what u request (the reprieve part):
Quote:


> ...NVIDIA built a special collision avoidance feature to avoid the eventuality of a new frame being ready while a *duplicate is painting on the screen* (something that could generate lag and/or stutter) in which case they anticipate the refresh and wait for the next frame to be completed.... SOURCE


the ^^ *"duplicate is painting on the screen"* part caught my eye immediately, because when the technology was first discussed and as it was developed i read everything, since it seemed i would never truly understand it as well as i hoped to. That said, it makes me realize that now that i have the tech, it doesn't really matter. In other words, it works, i use it always for gaming, and it just doesn't matter how, because it does what everybody said it would do. i just LUV it.







But i should say, the closest i ever got to understanding it fully (a few years ago, before finally getting it last November) included always starting out by saying that the FPS of the current game on the screen always exactly matches the refresh frequency of ur LCD. (Or saying it vice-versa, or both ways, also helps initial comprehension of it.) And that is a no-brainer as far as "perfect" things go. What else could u suggest in order to improve the gaming experience!? NOTHING i tell u. nothing.
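That "FPS exactly matches the refresh" idea can be sketched with a toy timing comparison (Python, all numbers made up; this is not a model of any real driver or panel): with fixed vsync a finished frame waits for the next refresh tick, while a variable-refresh display paints it the moment it's done.

```python
import math

# Toy comparison of frame *display* times: fixed 60 Hz vsync vs. a
# variable-refresh (G-sync-style) display. Illustrative numbers only.

REFRESH_MS = 1000 / 60  # one 60 Hz refresh tick, in milliseconds

def display_times_vsync(render_done_ms):
    # each finished frame waits for the next fixed refresh tick
    return [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in render_done_ms]

def display_times_vrr(render_done_ms):
    # a variable-refresh display paints the frame the moment it is ready
    return list(render_done_ms)

def gaps(times):
    # frame-to-frame gaps between successive displayed frames
    return [round(b - a, 2) for a, b in zip(times, times[1:])]

done = [21 * i for i in range(1, 9)]  # frames finishing every 21 ms (~48 fps)

gaps_vsync = gaps(display_times_vsync(done))
gaps_vrr = gaps(display_times_vrr(done))

print(gaps_vsync)  # uneven mix of single and doubled gaps = judder
print(gaps_vrr)    # perfectly even gaps = smooth pacing
```

The uneven cadence in the vsync case is the "choppy" feeling at sub-refresh frame rates; the variable-refresh case keeps the gaps identical to the render intervals.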









man i'm glad u asked about reprieving low FPS.







Because the perfect excuse for finally updating my PC just popped up in my mind... it's that G-sync might be even better. Noticeably improved. idk how it could be better than it is, but it is a great excuse. Thanks!







(LGA 1150, 1151 and 2011-v3- OH MY!







)

i started out leaning LGA 2011-v3, but recently LGA1151 and the i7-6700K took the lead. i hate on-board GPUs but oh well. It will trample my existing LGA1366. Heck, a new smartphone or tablet could. lol

GL









Edit 1: 50 to 60 FPS with G-sync feels like 80 to 90 without it. And 40 to 50 FPS feels like 70+ without it. Only under 40 FPS does it feel a little slow with G-sync, but it is still very playable and incredibly smoother than no G-sync.


----------



## amd7674

"Bee Dee 3 Dee" thanks for all the info ;-). That sounds wonderful, however I won't have any $$$ for it anytime soon ;-(. If my current display died today, a 2K 144Hz IPS-panel monitor is what I would buy.


----------



## amd7674

This is with the evga ftw? How is your core clock? Power settings? 400mhz is still plenty.

In this review, if I'm not mistaken, they only managed 270mhz...

http://m.hexus.net/tech/reviews/graphics/95446-evga-geforce-gtx-1070-ftw-gaming-acx-30/?page=13
Quote:


> Originally Posted by *Curseair*
> 
> Did I lose the silicon lottery with my 1070? It only boosts to 405mhz on the memory; after that BF4 crashes.


----------



## gtbtk

Quote:


> Originally Posted by *Prozillah*
> 
> Very interesting - Im having same issue on my g1 bios when oc without voltage lock


Is that a Micron Memory Card?


----------



## Prozillah

Quote:


> Originally Posted by *gtbtk*
> 
> Is that a Micron Memory Card?


No, Samsung in this one. But if I lock the voltage I game 100% stable at +550.


----------



## Curseair

Quote:


> Originally Posted by *amd7674*
> 
> This is with evga ftw? How is your core clock? Power settings? 400mhz is still plenty.
> 
> In this review if I'm not mistaken they have only managed 270mhz...
> 
> http://m.hexus.net/tech/reviews/graphics/95446-evga-geforce-gtx-1070-ftw-gaming-acx-30/?page=13


That is 268mhz more than mine on the memory, and their boost clock stays higher too. My slave bios, which unlocks the power limit to 122%, does not do much either, and I get bad coil whine on it. Should I return the card and try another?


----------



## amd7674

Quote:


> Originally Posted by *Curseair*
> 
> That is 268mhz more than mine on the memory and the clock boosts stays higher also, My slave bios which unlocks more power limit to 122% does not do much either but I get bad coil whine on the slave bios. Should I return and try another?


Another case of coil whine with evga ;-(. If possible I would try another PSU or system. If you don't have one, or the coil whine bothers/affects you too much, I would RMA it. Perhaps let it run for 2hrs or so to see if it improves or goes away.

I'm not too happy now; there are user reports of asus shipping their strix with micron, and evga has coil whine issues. I don't know what to buy. Dishing out a lot of coin and playing lotto games is not fun.

I would go for MSI gaming x if I was guaranteed Samsung vram.

Maybe I should look at zotac extreme amp. My other two zotac cards are working fine... Gtx 650ti and gtx 970..

Crap ;-(


----------



## TheGlow

Quote:


> Originally Posted by *amd7674*
> 
> Another case of coil whine with evga ;-(. If possible I would try another PSU or system. If you don't have it or the coil whine bothers/affects you to much, I would RMA it. Perhaps let it go for 2hrs or so to see if it improves or go away.
> 
> I'm not too happy now, there are user reports of asus shipping their strix with micron, and evga has coil whine isdues. I don't know what to buy, dishing out a lot of coin and playing lotto games is not fun.
> 
> I would go for MSI gaming x if I was guaranteed Samsung vram.
> 
> Maybe I should look at zotac extreme amp. My other two zotac cards are working fine... Gtx 650ti and gtx 970..
> 
> Crap ;-(


I just got the MSI Gaming X and it's Micron. What's the issue with Micron vs Samsung? OC potential?


----------



## Curseair

Quote:


> Originally Posted by *amd7674*
> 
> Another case of coil whine with evga ;-(. If possible I would try another PSU or system. If you don't have it or the coil whine bothers/affects you to much, I would RMA it. Perhaps let it go for 2hrs or so to see if it improves or go away.
> 
> I'm not too happy now, there are user reports of asus shipping their strix with micron, and evga has coil whine isdues. I don't know what to buy, dishing out a lot of coin and playing lotto games is not fun.
> 
> I would go for MSI gaming x if I was guaranteed Samsung vram.
> 
> Maybe I should look at zotac extreme amp. My other two zotac cards are working fine... Gtx 650ti and gtx 970..
> 
> Crap ;-(


Yeah, it was a very tough choice between the cards. I heard the Asus does not have coil whine thanks to upgraded chokes, though it does not look as nice as the EVGA, and EVGA in the UK has amazing customer service, which is why I chose them; but they are plagued by coil whine on the 1070's. No point in an RMA yet, though, since I can still return it within 10 days. Maybe I should go with Asus.


----------



## amd7674

Quote:


> Originally Posted by *TheGlow*
> 
> I just got the MSI Gaming X and it's Micron. Whats the issue with Micron vs Samsung? OC potential?


That's my understanding.

Quote:


> Originally Posted by *Curseair*
> 
> Yeah it was a very tough choice between the cards, I heard the Asus does not have coil whine though with upgraded chokes, Does not look as nice as the EVGA though and EVGA in the uk has amazing customer service, Which is why I choose them but plagued by coil whine on the 1070's, No point with an RMA yet though I can still return it within 10 days, Maybe I should go with Asus.


Just watch out for the memory lottery. There is at least one user a few pages back with the asus strix non-OC version with micron. Buy from a dealer where you can easily return it. I think I will go for the asus oc or the zotac extreme amp. With both, from my store, I will have 14 days to test it (keep it or return it).


----------



## wrathofbill

Anyone else fitted one of these puppies? Just ordered one for my FE GTX 1070, will post results when fitted...

http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1


----------



## bigjdubb

I thought the hybrid on the 10 series cards was based on the FTW pcb not the reference pcb.


----------



## wrathofbill

Quote:


> Originally Posted by *bigjdubb*
> 
> I thought the hybrid on the 10 series cards was based on the FTW pcb not the reference pcb.


Maybe, but this fella fits all reference cards as well. Check the compatibility section at the link.


----------



## Blackfyre

I haven't been here since around 40 or 50 pages ago (_10 posts per page_). I skimmed through the posts to get up to date and noticed that:

1. There is no BIOS mod yet. (*Unbelievable*).

2. The discussion about *Micron vs Samsung*

Which manufacturers are opting for the cheap option?

I have an *MSI GTX 1070 Gaming X* from the first week or two of release and it has been great overclocking-wise. And tens of pages ago I noticed someone raged that all we do is overclock & benchmark and that our clocks are not actually stable for gaming.

Well I have been playing games for well over a month now with the settings below. Hours upon hours of gaming & usage with absolutely no crashing/issues & zero artifacts.


----------



## Sueramb6753

-snip-


----------



## amd7674

Quote:


> Originally Posted by *Blackfyre*
> 
> I haven't been here since around 40 or 50 pages ago (_10 posts per page_). I skimmed through the posts to get up to date and noticed that:
> 
> 1. There is no BIOS mod yet. (*Unbelievable*).
> 
> 2. The discussion about *Micron vs Samsung*
> 
> Which manufacturers are opting for the cheap option?
> 
> I have an *MSI GTX 1070 Gaming X* from first week or two of release and it has been a great overclocking wise. And tens of pages ago I noticed someone raged that all we do is overclock & benchmark and that our clocks are not actually stable for gaming.
> 
> Well I have been playing games for well over a month now with the settings below. Hours upon hours of gaming & usage with absolutely no crashing/issues & zero artifacts.


Congrats ;-). I'm currently checking with a local store about their return policy. If it's no questions asked, I might roll the dice to see if I can score a gaming x with Sammy ram.

Sometimes buying at release time pays off...


----------



## amd7674

Quote:


> Originally Posted by *Symix*
> 
> My palit SJS had micron memory (and coil whine)


Did you return it?


----------



## wrathofbill

Quote:


> Originally Posted by *Blackfyre*
> 
> I haven't been here since around 40 or 50 pages ago (_10 posts per page_). I skimmed through the posts to get up to date and noticed that:
> 
> 1. There is no BIOS mod yet. (*Unbelievable*).
> 
> 2. The discussion about *Micron vs Samsung*
> 
> Which manufacturers are opting for the cheap option?
> 
> I have an *MSI GTX 1070 Gaming X* from first week or two of release and it has been a great overclocking wise. And tens of pages ago I noticed someone raged that all we do is overclock & benchmark and that our clocks are not actually stable for gaming.
> 
> Well I have been playing games for well over a month now with the settings below. Hours upon hours of gaming & usage with absolutely no crashing/issues & zero artifacts.


----------



## Curseair

Should I return my EVGA FTW with coil whine on the slave bios and get the Asus OC or MSI Gaming x?


----------



## amd7674

Quote:


> Originally Posted by *Curseair*
> 
> Should I return my EVGA FTW with coil whine on the slave bios and get the Asus OC or MSI Gaming x?


It is a lottery ;-(... There is a good chance you will end up with micron vram if you go MSI or asus... Question is whether you can live with a milder memory overclock?

https://www.reddit.com/r/4upfxo/for_new_gtx_1070_owners_out_there_out_of/

What about zotac extreme amp? Would it fit your case?


----------



## Mad Pistol

Is it pretty much confirmed that all FE's get Samsung RAM?


----------



## TheGlow

What's the preferred in-game OSD to use?
I was liking the MSI Gaming App overlay but it didn't seem to have a toggle feature.
Then I tried Afterburner, which had me use RivaTuner.
Next I ran into issues where the hotkey toggle wouldn't work. I had to relaunch Witcher to get Afterburner's OSD, but then the Gaming App one stopped. I guess you can't have both at once, so the behavior has been odd, as I don't know which will end up loading.
Then RivaTuner was popping up errors about connecting to a server.


----------



## Curseair

Quote:


> Originally Posted by *amd7674*
> 
> It is a lottery ;-(... There is a good chance you will end up with micron vram if you MSI or asus... Question is if you can live with milder memory over clock?
> 
> 
> __
> https://www.reddit.com/r/4upfxo/for_new_gtx_1070_owners_out_there_out_of/
> 
> What about zotac extreme amp? Would it fit your case?


It would, yeah, but it would look a bit off in my case. Everything else is pretty much black + red.


----------



## criminal

Quote:


> Originally Posted by *Mad Pistol*
> 
> Is it pretty much confirmed that all FE's get Samsung RAM?


I believe so.


----------



## wrathofbill

I have a FE Msi 1070 with Samsung ram; the best I can overclock the memory is +200, but the core will take +220... I def got the bottom of the barrel in the memory stakes. Or am I missing something?


----------



## criminal

Quote:


> Originally Posted by *wrathofbill*
> 
> I have a FE Msi 1070 with Samsung ram, the best I can overclock on it it +200 But core will take +220... I def got the bottom of the barrel in the memory stakes. Or am I missing something?????


That does seem awfully low. Does it crash or artifact when you overclock the memory more?


----------



## bigjdubb

Quote:


> Originally Posted by *wrathofbill*
> 
> I have a FE Msi 1070 with Samsung ram, the best I can overclock on it it +200 But core will take +220... I def got the bottom of the barrel in the memory stakes. Or am I missing something?????


Have you tried overclocking the memory first and then the core? I wasn't able to get much of an overclock on my memory at first (+225 I think), but when I removed the core overclock I was able to get +600 on the memory.


----------



## wrathofbill

I get the checkerboard effect on screen when overclocking too high, and the PC freezes. I also tried memory first and did get it to go above +300, but it crashed again, so I stuck it back at +200.


----------



## criminal

Quote:


> Originally Posted by *wrathofbill*
> 
> I get the checkerboard effect on screen when overclocking too high, and the PC freezes. I also tried memory first and did get it to go above +300, but it crashed again, so I stuck it back at +200.


I think you may just have bad clocking vram.


----------



## wrathofbill

Quote:


> Originally Posted by *bigjdubb*
> 
> Have you tried over clocking the memory first and then the core? I wasn't able to get much of an overclock on my memory at first (+225 I think) but when I removed the core overclock I was able to get +600 on memory.


Quote:


> Originally Posted by *criminal*
> 
> I think you may just have bad clocking vram.


Yep!


----------



## Sueramb6753

-snip-


----------



## LikesToSlide

Hey folks, I was overclocking and testing my Gigabyte G1 1070 this weekend and experienced something strange. I had the clocks set to +100/+550 and power limit +111%. Ran Heaven, Valley, Firestrike & FurMark no problem. Max temp was 67.

Thinking everything was stable, I went and played Overwatch for an hour, when all of a sudden everything on screen got a yellow tint. Then as soon as I died (rare, but it happens), everything turned back to normal.

Is that symptomatic of an unstable overclock? Should I back off my settings?


----------



## M0E

Quote:


> Originally Posted by *LikesToSlide*
> 
> Hey folks I was overclocking and testing my Gigabyte G1 1070 this weekend and experienced something strange.. I had the clocks set to +100/+550 and power limit +111%. Ran Heaven, Valley, Firestrike,& FurMark no problem. Max temp was 67.
> 
> Thinking everything was stable I went and played Overwatch for an hour when all of a sudden the everything on screen got a yellow tint. Then as soon as I died (rare, but it happens
> 
> 
> 
> 
> 
> 
> 
> ), everything turned back to normal.
> 
> Is that symptomatic of an unstable overclock? Should I back off my settings?


One time occurrence, or have you been able to duplicate it? If it was just the one time, I'd keep the settings you have until you notice another issue. It may have not been related to the overclock settings.


----------



## BulletSponge

Quote:


> Originally Posted by *LikesToSlide*
> 
> Hey folks I was overclocking and testing my Gigabyte G1 1070 this weekend and experienced something strange.. I had the clocks set to +100/+550 and power limit +111%. Ran Heaven, Valley, Firestrike,& FurMark no problem. Max temp was 67.
> 
> Thinking everything was stable I went and played Overwatch for an hour when all of a sudden the everything on screen got a yellow tint. Then as soon as I died (rare, but it happens
> 
> 
> 
> 
> 
> 
> 
> ), everything turned back to normal.
> 
> Is that symptomatic of an unstable overclock? Should I back off my settings?


At +150/+750 on my Gaming X everything will turn black in Overwatch except for the player names and explosions after a few minutes. At +100/+700 there are no issues so I'd assume it's your OC.


----------



## wrathofbill

Quote:


> Originally Posted by *LikesToSlide*
> 
> Hey folks I was overclocking and testing my Gigabyte G1 1070 this weekend and experienced something strange.. I had the clocks set to +100/+550 and power limit +111%. Ran Heaven, Valley, Firestrike,& FurMark no problem. Max temp was 67.
> 
> Thinking everything was stable I went and played Overwatch for an hour when all of a sudden the everything on screen got a yellow tint. Then as soon as I died (rare, but it happens
> 
> 
> 
> 
> 
> 
> 
> ), everything turned back to normal.
> 
> Is that symptomatic of an unstable overclock? Should I back off my settings?


It's to do with Overwatch; it doesn't like an overclocked card. It can't handle it for some reason. I got round it by syncing it with the display; 144-150 fps is good enough for me.


----------



## gtbtk

Quote:


> Originally Posted by *TheGlow*
> 
> I just got the MSI Gaming X and it's Micron. Whats the issue with Micron vs Samsung? OC potential?


MSI, Gigabyte, ASUS, Palit, Gainward, and EVGA have all produced 1070 cards with Micron memory. They all use a different vbios version compared to their Samsung-memory equivalents (86.04.26.00.XX micron, 86.04.1e.00.xx samsung).

Anecdotally, the samsung memory clocks higher than Micron; I have no experience with a samsung card. Leaving everything else at defaults, micron cards will give you checkerboard artifacts and lock up. However, if you lock the voltage in Afterburner, the Micron memory can go to +580-600 before it starts flashing coloured artifacts, with no lockup, indicating to me at least that the problem is more a bug/design limitation in the MSI Gaming X bios ram power delivery curve than with the RAM itself.

I did some experiments with my Gaming X and flashed it with a number of different micron-version bioses. With the 1633 core clock Palit bios, the memory could clock higher without checkerboards than with the stock MSI bios. The Palit bios did introduce other issues though: the power limit was reduced to 114% from 126%, the fans spun slower at 100%, and the card ran hotter (70 deg vs 55 deg), but still within the limits of the gpu chip.
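For anyone wondering what offsets like +500 or +600 mean in bandwidth terms, here's a back-of-envelope sketch (Python). Assumptions: the 1070's GDDR5 shows up as 4004 MHz (the DDR figure) in Afterburner, the offset is added directly to that figure, and the effective transfer rate is twice the DDR figure; treat the exact offset convention as an assumption.

```python
# Rough math behind the memory offsets discussed in this thread.
# The offset convention (added to the 4004 MHz DDR figure) is an assumption.

AB_BASE_DDR_MHZ = 4004   # stock GTX 1070 GDDR5 DDR clock as Afterburner shows it
BUS_WIDTH_BITS = 256     # GTX 1070 memory bus width

def effective_mtps(offset_mhz):
    # effective transfer rate in MT/s after applying the offset
    return (AB_BASE_DDR_MHZ + offset_mhz) * 2

def bandwidth_gb_s(offset_mhz):
    # theoretical bandwidth: transfers/s * bus width in bytes
    return effective_mtps(offset_mhz) * BUS_WIDTH_BITS / 8 / 1000

for off in (0, 500, 600):
    print(off, effective_mtps(off), round(bandwidth_gb_s(off), 1))
```

At stock this lands on the card's advertised 8008 MT/s and roughly 256 GB/s, which is a decent sanity check on the assumptions.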


----------



## gtbtk

Quote:


> Originally Posted by *wrathofbill*
> 
> I get the checkerboard effect on screen when overclocking too high, and the PC freezes. I also tried memory first and did get it to go above +300, but it crashed again, so I stuck it back at +200.


You can try this in Afterburner:

1. Ctrl-F to bring up the curve window.

2. Select the point at 1.081v on the curve

3. Press the "L" key and a vertical line will appear on the graph.

4. Click on the apply button in afterburner.

After locking the voltage, now try increasing your ram speed and see how you go.
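For anyone curious what that lock actually accomplishes, here's a tiny conceptual sketch (Python, with hypothetical curve points; this is not Afterburner's real data model): selecting the 1.081v point pins the card to one voltage/clock pair so it stops walking up and down the V/F curve while you test the memory overclock.

```python
# Conceptual model of the Ctrl-F voltage lock. The (voltage, boost clock)
# points below are made-up Pascal-style values, purely for illustration.

VF_CURVE = [
    (0.800, 1607),
    (0.900, 1823),
    (1.000, 1961),
    (1.081, 2050),  # the point suggested above
    (1.093, 2076),
]

def lock_voltage(curve, volts):
    """Pin the card to a single (voltage, clock) point instead of letting
    boost move along the whole curve under varying load."""
    for v, mhz in curve:
        if abs(v - volts) < 1e-6:
            return (v, mhz)
    raise ValueError("no curve point at that voltage")

locked_v, locked_mhz = lock_voltage(VF_CURVE, 1.081)
print(locked_v, locked_mhz)
```

The point is that with the lock applied, only the memory offset is changing between test runs, which makes it much easier to tell whether the RAM itself is the limit.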


----------



## wrathofbill

Quote:


> Originally Posted by *gtbtk*
> 
> You can try this in Afterburner:
> 
> 1. Ctrl-F to bring up the curve window.
> 
> 2. Select the point at 1.081v on the curve
> 
> 3. Press the "L" key and a vertical line will appear on the graph.
> 
> 4. Click on the apply button in afterburner.
> 
> After locking the voltage, now try increasing your ram speed and see how you go.


I'll give it a go, thanks.


----------



## amd7674

Quote:


> Originally Posted by *Symix*
> 
> Sending it back tomorrow, not sure which one to get now.
> 
> Not returning it over micron memory lol, it had some fan issues.


Fan issues, coil whine and micron ram...
How lucky were you? ;-)


----------



## amd7674

Quote:


> Originally Posted by *gtbtk*
> 
> MSI, Gigabyte, ASUS, Palit, Gainward, EGVA have all produced 1070 cards with Micron memory. They all use a different version of vbios compared to their samsung memory equivalent. (86.04.26.00.XX micron 86.04.1e.00.xx samsung)
> 
> Anecdotally, the samsung memory clocks higher than Micron. I have no experience with a samsung card. Leaving everything else at defaults, micron cards with give you a checkerboard artifacts and lock up. However, if you lock voltage in Afterburner, the Micron memory can go to +580-600 where it starts flashing coloured artifacating but no lockup indicating to me at least, that the problem is more a bug/design limitation in the MSI Gaming X Bios ram power delivery curve rather than with the RAM itself.
> 
> I did some experiments with my Gaming X and flashed it with a number of different micron version bios. The 1633 core clock Palit bios and the memory could clock higher without checkerboards than with the stock MSI bios. The Palit bios did introduce other issues though, the over power limit was reduced to 114% from 126%, the fans spun slower at 100% and the card ran hotter (70 deg vs 55 deg) but still within limits of the gpu chip.


Thanks for the info... Where did you see evga cards with micron vram? And what about zotac?


----------



## ITAngel

Quote:


> Originally Posted by *gtbtk*
> 
> You can try this in Afterburner:
> 
> 1. Ctrl-F to bring up the curve window.
> 
> 2. Select the point at 1.081v on the curve
> 
> 3. Press the "L" key and a vertical line will appear on the graph.
> 
> 4. Click on the apply button in afterburner.
> 
> After locking the voltage, now try increasing your ram speed and see how you go.


Thanks for the tips and instructions that people can try in the community. I may even give it a go myself on my Zotac card, even though I am not having issues with memory; I run mine at around +500-550. Still, good instructions. Thanks, +1 Rep

Note: I've been trying to remember where to find that graph, lol, so this helps. Often I have to do a search online to find the shortcut keys.


----------



## bigjdubb

Have there been any updates on the latency/audio issue with Pascal? I am getting really sick of the audio problems and frametime spikes in games; so tired of it that I am thinking about just disabling the card and running off of integrated until it's fixed. The drivers and hotfixes are not helping. To be clear, I mean unofficial workarounds/hacks etc., since the official fixes don't fix anything.

The GTX 1070 is turning out to be the only video card purchase I have ever regretted (and I have purchased A LOT of them).


----------



## Jimbags

My FE is samsung ram and it oc's like a champion! Also, that mem oc there, +666? How does it run in DOOM?







@Blackfyre


----------



## Blackfyre

Quote:


> Originally Posted by *Jimbags*
> 
> My FE is samsung ram and it oc's like a champion! Also that mem oc there? +666?How does it run un DOOM?
> 
> 
> 
> 
> 
> 
> 
> @Blackfyre


Sorry I don't have DOOM to test, but I am sure it will run devilishly good


----------



## gtbtk

Quote:


> Originally Posted by *amd7674*
> 
> Thanks for the info... Where did you see evga cards with micron vram? And what about zotac?


This is the EVGA Superclocked Micron with 1595 core clock Bios:

https://www.techpowerup.com/vgabios/185431/185431

The Zotac is the AMP! Edition with 1607 Core clock:

https://www.techpowerup.com/vgabios/185220/185220

I have not focused too much attention on the Zotac firmware as yet. I did have fun trying out the Precision XOC automatic overclocking with the EVGA bios installed, and noticed temps rising to 70 deg with 100% fan, which is the hottest I have seen my card with the fans forced on.


----------



## gtbtk

Quote:


> Originally Posted by *ITAngel*
> 
> Thanks for the tips and instructions that people can try in the community. I may even give it a goal myself to try on my Zotac card even though I am not having issues with memory as I play around with mine around 500-550. Still good instructions though so. Thanks 1+ Rep
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Note I been trying to always remember were to find that graph lol so this helps. Often I have to look online and do a search to find the shortcut keys.


I am starting to form the view that the Micron RAM is not the problem; it is the vbios power delivery in the MSI micron-version bios.

I read somewhere that the next version of Afterburner will include a button to bring up the graph.


----------



## LikesToSlide

Quote:


> Originally Posted by *BulletSponge*
> 
> At +150/+750 on my Gaming X everything will turn black in Overwatch except for the player names and explosions after a few minutes. At +100/+700 there are no issues so I'd assume it's your OC.


I'll do some more testing and look out for this. The color tint was confusing to me in regards to the overclock, because the video card was still rendering all those pixels. All my previous experience with bad overclocks resulted in a driver crash or blank portions of the screen like you're describing.
Quote:


> Originally Posted by *wrathofbill*
> 
> It's to do with Overwatch; it doesn't like an overclocked card. It can't handle it for some reason. I got round it by syncing it with the display; 144-150 fps is good enough for me.


Good point. I'll try some other games.
Quote:


> Originally Posted by *M0E*
> 
> One time occurrence, or have you been able to duplicate it? If it was just the one time, I'd keep the settings you have until you notice another issue. It may have not been related to the overclock settings.


I'll be taking your advice and do some more testing. Thanks!


----------



## Jimbags

Quote:


> Originally Posted by *Blackfyre*
> 
> Sorry I don't have DOOM to test, but I am sure it will run devilishly good


??


----------



## amd7674

Quote:


> Originally Posted by *gtbtk*
> 
> I am starting to form the view that the Micron RAM is not the problem, it is the vbios power delivery in the MSI micron version Bios.
> 
> I read somewhere that the next version of afterburner will inclide a button to bring the graph up


Quote:


> Originally Posted by *gtbtk*
> 
> This is the EVGA Superclocked Micron with 1595 core clock Bios:
> 
> https://www.techpowerup.com/vgabios/185431/185431
> 
> The Zotac is the AMP! Edition with 1607 Core clock:
> 
> https://www.techpowerup.com/vgabios/185220/185220
> 
> I have not focused too much attention on the Zotac firmware as yet. I did have fun trying out the Precision XOC automatic overclocking with the EGVA bios installed and noticed temps rising to 70 deg with 100% fan which is the hottest I have seen my card with the fans forced on.


Thanks... That's awesome... If you had to buy a gtx 1070, which one would you buy? I was choosing between the evga ftw (potential coil whine), the asus strix oc (micron), or my former #1 pick, the MSI gaming x (to match my MSI mobo), before the crappy MSI message about qa issues and micron ram.


----------



## amd7674

Quote:


> Originally Posted by *Blackfyre*
> 
> Sorry I don't have DOOM to test, but I am sure it will run devilishly good


Get the demo on Steam. That's what I'm planning to use when I buy my gtx 1070.


----------



## rfarmer

Quote:


> Originally Posted by *amd7674*
> 
> Get demo on steam. That's what I'm planning to use when I buy my gtx 1070.


I bought it 2 weeks ago when it was on sale for $30; the demo is a lot of fun though.


----------



## deegzor

Quote:


> Originally Posted by *deegzor*
> 
> I meant 2200mhz on core speed, and yes I have increased the voltage to max (1.093v) without unlocking the bios limits; the power limit is also set to max (112% on my model). I would stay under 80C and not run the fans @ full speed when pushing the card to the limit.


Quote:


> Originally Posted by *Dude970*
> 
> I did today, running an old benchmark called Tropics. The Afterburner shot was with my phone cause it surprised me.
> 
> The benchmark result


Nice mate!

Mine's also steady at 2200mhz in Heaven and most games, but Firestrike and Time Spy seem to crash it midway.


----------



## TheGlow

What are some good fan curve suggestions? I'm always worried I won't put in enough and have heat issues; then I worry I put in too much and wonder if I'll burn out the fans.
I was averaging 70º before tweaking and didn't like the heat coming out of the top of my case.


----------



## bigjdubb

I wouldn't worry about burning the fans out. It's impossible for one of us to tell you how they should be set; you just have to decide what balance of noise and temps makes you happy.
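If it helps to think about what the curve editor is actually doing: a custom fan curve is just piecewise-linear interpolation between (temperature, fan %) points. A quick sketch (the points below are made up; pick your own noise/temp trade-off):

```python
# Fan curve = linear interpolation between (deg C, fan %) points.
# Example points are invented, not a recommendation.

CURVE = [(30, 20), (50, 35), (65, 55), (75, 80), (85, 100)]

def fan_percent(temp_c, curve=CURVE):
    """Linearly interpolate the fan duty cycle for a given GPU temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the first point: floor speed
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the last point: full speed
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(70))  # halfway between the 65C and 75C points: 67.5
```

Steeper segments near your target temperature give the controller more authority right where you care about it, at the cost of more audible ramping.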


----------



## ITAngel

I think I am getting somewhere now.

Working on a fan curve, and I need to tune the overclocking a bit.


----------



## wrathofbill

Quote:


> Originally Posted by *LikesToSlide*
> 
> I'll do some more testing and look out for this. The color tint was confusing to me in regards to the overclock, because the video card would still be rendering all those pixels. All my previous experience with bad overclocks resulted in driver crash or blank portions of the screen like you're describing.
> Good point. I'll try some other games.
> I'll be taking your advice and do some more testing. Thanks!


Forgot to say it can also be down to the CPU overclock... Check the game's forums.


----------



## Prozillah

Really weird issue for me. I'd been absolutely rock solid stable at 1.05v, 2100 core, +550 mem. Got a 2nd monitor and tried to connect it via HDMI, which strangely enough didn't work, so I disconnected it. Now my card runs at 1.062v and sits at 2088 core, +550 mem.

*** is going on?


----------



## gtbtk

Quote:


> Originally Posted by *amd7674*
> 
> Thanks ... That's awesome... If you had to buy gtx 1070 which one would you buy? I was choosing between evga ftw (potential coil whine) or asus strix oc (micron)... Or which was #1 pick ( MSI gaming x to match my MSI mobo )before the crappy MSI message about qa issues and micron ram.


Even with the Micron memory, I am happy with my Gaming X. When I bought mine, I had a choice of that or an ASUS Strix that would not fit in my case so the choice was made easier for me.

You should also take a look at the base model MSI Gaming 8G 1070 as well. You save some money and you always have the choice of flashing a Gaming X or Gaming Z bios later.


----------



## ricko99

Can anyone with the ASUS 1070 Strix OC version confirm whether the OC mode in GPUTweakII causes any crashes due to the clock speed being too high? I've been seeing this issue lately on the ROG forum. I was thinking about getting the card, but after seeing this I'm not really sure whether to go with the Strix OC or the G1 Gaming.


----------



## Sueramb6753

-snip-


----------



## zipper17

Damn, my previous config with the core clock offset at +105MHz was crashing in Firestrike Extreme & the Firestrike stress test, but strangely nothing happened when I played some games.
I had to lower my core clock to +80MHz to become stable; even at +85MHz I got a single crash on the FS stress test. What gives? :S

Galax 1070 exoc OC config:
MSI AB, 125%-92C
Core +80,
Mem +502,
+100 voltage
Tested in the Firestrike Extreme stress test, 10 minutes, 20 loops: passed at 98.6%, no crashes, stable.

Newest Score:
Firestrike Graphic Scores: @ 20653
Firestrike Extreme Graphic Scores: @ 9784
TimeSpy Graphic Scores @ 6483



~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Noob question: is Firestrike really reliable for checking stability?
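For context, the 3DMark stress tests judge stability mechanically: they run the same scene in a loop and compare the best and worst loop scores, and a run passes if the frame-rate stability (worst divided by best) stays at or above 97%. A minimal sketch of that arithmetic, based on my understanding of the published pass criterion rather than Futuremark's actual code:

```python
def frame_rate_stability(loop_scores):
    """Frame-rate stability as 3DMark reports it: worst loop
    score divided by best loop score, as a percentage."""
    return 100.0 * min(loop_scores) / max(loop_scores)

def passes_stress_test(loop_scores, threshold=97.0):
    # The published pass criterion is >= 97% stability.
    return frame_rate_stability(loop_scores) >= threshold

# Made-up example: 20 loops whose worst run is within ~1% of the best
scores = [5200, 5180, 5150, 5190, 5175] * 4
print(round(frame_rate_stability(scores), 1))  # 99.0
print(passes_stress_test(scores))  # True
```

So a 98.6% pass means the slowest loop was within 1.4% of the fastest; stable by 3DMark's definition, even if an occasional game still crashes at the same clocks.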


----------



## drunkonpiss

Hi guys,

Guess I'm doing my OC wrong. I did +210 on the clock and +100 on memory. Firestrike and Valley ran OK, but after a Time Spy benchmark my clock speeds were locked at 500MHz. I had to restart my rig to get stock speeds back. Has anyone had this before, or am I just OCing the wrong way? Using a Palit Jetstream.


----------



## madmeatballs

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *zipper17*
> 
> Damn my previous config Core Clock offset at +105mhz causing Crash in Firestrike Extreme & Firestrike Stress Test
> 
> 
> 
> 
> 
> 
> 
> , But strangely I play some games nothing happened.
> I Have to lower my core clock into +80mhz to become stable, even at +85 mhz I got a single crash on FS Stress test, what gives? :S
> 
> Galax 1070 exoc OC config:
> MSI AB, 125%-92C
> Core +80,
> Mem +502,
> +100 voltage
> Tested in Firestrike Extreme Stress test, 10minutes, 20 loops, passed 98.6%, no crash stable.
> 
> Newest Score:
> Firestrike Graphic Scores: @ 20653
> Firestrike Extreme Graphic Scores: @ 9784
> TimeSpy Graphic Scores @ 6483
> 
> 
> 
> ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> noob question, does Firestrike really reliable for checking stability ??






I experienced that too. I used BF4, Valley, Heaven, the Firestrike/Extreme/Ultra stress tests and Time Spy, plus some other games, to test stability. Core +80, Mem +500 as well. I get 2113MHz on the core and 9200MHz on the mem.
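If you want more than pass/fail out of those runs, you can log clocks and temps while the stress test loops and look for throttling afterwards. `nvidia-smi` can emit them as CSV (`--query-gpu=clocks.gr,temperature.gpu --format=csv,noheader,nounits`); below is a rough parser for output in that shape. The sample log lines and the 2113MHz target are placeholders, not real output from my card:

```python
def parse_samples(csv_lines):
    """Parse 'clock, temp' CSV rows (nvidia-smi style) into (MHz, C) tuples."""
    samples = []
    for line in csv_lines:
        clock_s, temp_s = line.split(",")
        samples.append((int(clock_s.strip()), int(temp_s.strip())))
    return samples

def throttle_events(samples, target_mhz):
    # Return the (clock, temp) pairs where the card ran below the target clock.
    return [(c, t) for (c, t) in samples if c < target_mhz]

# Placeholder log, formatted the way nvidia-smi prints it
log = ["2113, 41", "2113, 44", "2100, 47", "2088, 52", "2113, 43"]
events = throttle_events(parse_samples(log), 2113)
print(events)  # [(2100, 47), (2088, 52)]
```

Seeing at which temperatures the clock bins drop tells you whether instability is a voltage problem or just Boost pulling clocks at temperature.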


----------



## bigjdubb

Quote:


> Originally Posted by *drunkonpiss*
> 
> Hi guys,
> 
> Guess i'm doing my OC wrong did a +210 clock and +100 on memory. Tried to run Firestrike and Valley and it was ok but did a benchmark on Timespy then my clock speeds are locked to 500 mhz. Had to restart my rig to get my stock speeds back. Anyone had this before or i'm just OCing the wrong way? using a Palit Jetstream.


Did you have a driver failure/restart? It's nice that the graphics drivers restart themselves so quickly and smoothly after a crash now, but it's still better to reboot after a driver failure. I have this happen all the time when testing out overclocks; sometimes the clock speed is there, it's just that the program you're using to read it is screwed up from the driver crash.


----------



## criminal

Quote:


> Originally Posted by *bigjdubb*
> 
> Have there been any updates to the latency/audio issue with Pascal? I am getting really sick of the audio problems and frametime spikes in games, so tired of it that I am thinking about just disabling the card and run off of integrated until it's fixed. The drivers and hotfixes are not helping. To be clear, I mean unofficial workarounds/hacks etc since the official fixes don't fix anything.
> 
> The GTX 1070 is turning out to be the only video card purchase I have ever regretted (and I have purchased A LOT of them).


@bigjdubb

I never had any audio issues, but I had some latency/stutter issues in games. This was before I even knew it was a widespread issue; I just thought it was specific to my system. What I did was go into Device Manager, click View and check "Show hidden devices". Then I deleted/uninstalled all hidden devices under network adapters, audio adapters and display adapters, and rebooted. This seemed to fix my issue and it hasn't cropped back up.


----------



## bigjdubb

I will give that a shot. I picked up a new headset (Logitech G933) hoping the microphone disconnection issue was a problem with my old headset, but it still happens with the new one. I really wish I hadn't updated from the first 1070 driver; everything was fine until I did that. The frustrating thing is that if I install my GTX 750 or my old GTX 570 the problem is gone. I wish I hadn't gotten rid of my 970s so quickly...


----------



## Curseair

Quote:


> Originally Posted by *amd7674*
> 
> Get demo on steam. That's what I'm planning to use when I buy my gtx 1070.


I'm getting a refund on my EVGA FTW 1070 now. Not sure if I should try a replacement, the Asus Strix OC or the MSI Gaming X. Have you picked yet?


----------



## Nukemaster

Quick question here.

Are these cards supposed to throttle back speeds even before the temperatures get over 70?

My last card was a GTX 670 and it would boost to 1215 as long as the power/thermal limit was not exceeded.

Here is what my Asus Dual 1070 OC does (just 2 fans; I figured that would be plenty for a 150 watt card; factory OC, I have not changed any settings yet). While it is above the quoted boost speed, I do wonder if this is normal or not.



EDIT:
Once the clocks drop like that, they do not seem to come back (the small dips are alt + tab out of a game). I slowed the fan until the card topped 82C, and then it throttled (as expected) and came back.


----------



## criminal

Quote:


> Originally Posted by *Nukemaster*
> 
> Quick question here.
> 
> Are these cards supposed to throttle back speeds even before the temperatures get over 70?
> 
> My last card was a GTX 670 and it would boost to 1215 as long as the power/thermal limit was not exceeded.
> 
> Here is what my Asus Dual(just 2 fans. Figured with a 150 watt card that would be plenty) 1070 OC(factory oc, have not changed any settings yet) does. While it is above the quoted boost speed, I do wonder if this is normal or not.
> 
> 
> 
> EDIT.
> Once the clocks drop like that, they do not seem to come back(small dips are alt + tab out of a game). I slowed the fan until the card topped 82c and then it will throttle(as expected) and come back.


GPU Boost 3.0. Pascal based cards seem to start throttling as soon as temps get into the mid 40s. I have seen users who keep sub-40 degree temps maintain a constant boost clock until hitting the power limit.


----------



## madmeatballs

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *criminal*
> 
> GPU Boost 3.0. Pascal based cards seem to start throttling as soon as temps get in the mid 40's. I have seen user that can keep sub 40 degree temps maintain a constant boost clock until hitting the power limit.






Yup, once you hit around 40C it starts to throttle your boost clock down. For me it starts at 42C, which makes my 2113MHz OC drop to 2100MHz. Water cooling can stop the throttling; I hooked mine up to an NZXT Kraken G10/X41 to keep temps from reaching 42C, and so far it has done a good job.


----------



## wrathofbill

Well I did it....... first time taking a GPU apart, so far so good......

The " FE MSI GTX 1070 EVGA HYBRID"




Idles at 31 degrees, no real change, but I'm gonna try some SP fans against the EVGA one you get with the kit, maybe push and pull.
But boy does it stay cool: 47C max, down from 70C under load, is a damn big reduction. And of course it's a lot, lot quieter....








Overclocks are higher and more stable. I have taken it to 2152MHz but I'm scared to go any higher; I don't know why, fear of breaking something I suppose. I can now stick +240 on the core; how much more can it take?

Mmmmmm.....


----------



## ITAngel

Mine seems to start throttling down around 49C to 52C, so as long as I keep it at 45C to 49C it will run at 2100MHz and even 2114MHz. In one of the tests I did yesterday I saw the boost kick in at the start: it jumped to 2126, then down to 2114, then to 2100 and held there, but once the temp passed 49C I saw it go down to 2088, then 2075, where it held. I found that in order to keep it running stable I needed to add +25 to the power limit.


----------



## bigjdubb

That's how it works; if you can get the temps even lower (quite a bit lower) you can clock it higher. This is why LN2 and other forms of exotic cooling are so popular among extreme overclockers.


----------



## criminal

Quote:


> Originally Posted by *wrathofbill*
> 
> Well I did it....... first time taking a GPU apart, so far so good......
> 
> The " FE MSI GTX 1070 EVGA HYBRID"
> 
> 
> 
> 
> Idles at 31 degrees, no real change but gonna try some SP fans against the EvGA one you get with kit, maybe push and pull.
> But boy does it stay cool, 47C max from 70C under load is a damn big reduction. And of course its a lot, lot more quiet....
> 
> 
> 
> 
> 
> 
> 
> 
> Overclocks are higher and more stable, I have taken it to 2152Mhz but scared to go any higher, don't know why, fear of breaking something I suppose. I can now stick +240 on the core, how much more can it take?
> 
> Mmmmmm.....


Looks good. Push/pull on the rad should help even more.


----------



## Nukemaster

Thanks for all the replies.

It will have to stay like this until I get to place an all-in-one cooler on it some day. This thing is loud coming from my LQ310 (with AP15) + 670. I think it may still be quieter than my old 5870.


----------



## TheBoom

Or wait for custom BIOSes so we can completely disable boost and get fixed clocks.


----------



## bigjdubb

Quote:


> Originally Posted by *TheBoom*
> 
> Or wait for custom bioses so we can completely disable boost and get fixed clocks


That doesn't mean you will be able to keep those higher clocks stable at higher temperatures. I'm sure there is a bit more clock than Boost 3.0 is willing to give at a certain temperature, but I wouldn't expect to run Boost 3.0's sub-50-degree clock speeds at 70 and up.


----------



## SLOWION

Sup dudes

Got an ASUS Strix GTX 1070 OC in-house

















Been pretty satisfied with it so far although it doesn't seem to be the best overclocker. I can only get about 2036 MHz on the core before it starts to get sketchy. Still a nice jump coming from an R9 290X Lightning though

Ran a few benchmarks at 1440p and 1080p for whoever is interested


----------



## Blackfyre

*GTA 5 People:*

Help a brother out. I need setting recommendations to maintain the game above 60FPS and use anti-aliasing too. This game seriously dips when exiting the city area. Also any recommended mods that improve the graphics while not hitting performance too much that are easy to install?

I'm already using Re-Shade 2.0 and one of the popular GTA 5 Re-Shade profiles, but after playing the *Witcher 3* modded out like crazy and running easily over 60FPS, all of a sudden GTA 5 looks like a massive disappointment in comparison (_both graphically & performance optimization wise_).


----------



## vfrmaverick

Quote:


> Originally Posted by *Blackfyre*
> 
> *GTA 5 People:*
> 
> Help a brother out. I need setting recommendations to maintain the game above 60FPS and use anti-aliasing too. This game seriously dips when exiting the city area. Also any recommended mods that improve the graphics while not hitting performance too much that are easy to install?
> 
> I'm already using Re-Shade 2.0 and one of the popular GTA 5 Re-Shade profiles, but after playing the *Witcher 3* modded out like crazy and running easily over 60FPS, all of a sudden GTA 5 looks like a massive disappointment in comparison (_both graphically & performance optimization wise_).


What resolution? I play it @1080p fully, and I mean FULLY, maxed out at 80fps+.


----------



## TheBoom

Quote:


> Originally Posted by *bigjdubb*
> 
> That doesn't mean you will be able to keep those higher clocks stable at higher temperatures. I'm sure there is a bit more clock than boost 3.0 is willing to give at a certain temperature but I wouldn't expect to be able to run boost 3.0's sub 50 degree clock speeds at 70 and up.


I guess that depends if it's anything like 2.0. With those we could eliminate temperature as a factor and make voltage the only stability factor.


----------



## bigjdubb

Quote:


> Originally Posted by *TheBoom*
> 
> I guess that depends if it's anything like 2.0. With those we could eliminate temperature as a factor and make voltage the only stability factor.


Temperature is always a factor, you can't disable physics with a custom bios.


----------



## dminzi

Hello everyone,
Quick question: for some reason, after upgrading to a 1440p monitor that uses DisplayPort, my card will no longer underclock itself when idle. So it sits at 1500MHz all the time, which is kind of pointless and just heats up my case. Anybody have any idea why this is happening? It used to sit at around 700MHz when doing idle tasks...


----------



## amd7674

I just placed an order for the EVGA FTW. As long as there is no (or minimal) coil whine I will be a happy camper ;-)

I should get it early next week.

Thank you everyone for your input ;-)
Quote:


> Originally Posted by *Curseair*
> 
> I'm getting a refund on my EVGA FTW 1070 now, Not sure if I should try a replacement or Asus Strix OC or MSI Gaming X, Have you picked yet?


----------



## connectwise

Quote:


> Originally Posted by *ricko99*
> 
> Anyone with ASUS 1070 Strix OC version can confirm with me whether the OC mode from GPUTweakII causes any crashes due to the clock speed being too high? I've been seeing this issue lately from the ROG forum. Was thinking about getting the card but after seeing this I'm not really sure whether to go with Strix OC or G1 gaming.


How high is too high? I've not had any crashing problems but I haven't OC'd it yet. It's just below 2k stock OC.

Quote:


> Originally Posted by *SLOWION*
> 
> Sup dudes
> 
> Got an ASUS Strix GTX 1070 OC in-house
> 
> Been pretty satisfied with it so far although it doesn't seem to be the best overclocker. I can only get about 2036 MHz on the core before it starts to get sketchy. Still a nice jump coming from an R9 290X Lightning though
> 
> Ran a few benchmarks at 1440p and 1080p for whoever is interested

I can't believe you had the self control to actually take pictures before you put that into your machine.

I took a picture of my box, a day after everything was installed. Who's got the patience to leave the card out of the computer?


----------



## amd7674

Grats;-) very nice looking rig ;-)

My new toy is coming next week.

Quote:


> Originally Posted by *SLOWION*
> 
> Sup dudes
> 
> Got an ASUS Strix GTX 1070 OC in-house


----------



## M0E

Quote:


> Originally Posted by *ricko99*
> 
> Anyone with ASUS 1070 Strix OC version can confirm with me whether the OC mode from GPUTweakII causes any crashes due to the clock speed being too high? I've been seeing this issue lately from the ROG forum. Was thinking about getting the card but after seeing this I'm not really sure whether to go with Strix OC or G1 gaming.


Quote:


> Originally Posted by *connectwise*
> 
> How high is too high? I've not had any carshing problems but I haven't OC'd it yet. It's just below 2k stock oc.


I leave my card in basic OC mode and it boosts to 2025 consistently with zero crashing. In fact I am VERY satisfied with my purchase.


----------



## zipper17

Guys, I have a question.

What is the maximum core clock a 1070 can handle at 1.093V?
At how many degrees Celsius can it stay stable without throttling down?

I wonder if there is any boost/throttle step table for this, or does every card have different behavior?


----------



## bigjdubb

Quote:


> Originally Posted by *zipper17*
> 
> Guys i have a question
> 
> What is the maximum Coreclock on 1070 can handle at Voltage 1093v ?
> At how many celcius it can stable without throttle down?
> 
> I wonder is there any Boost/Throttle Step table for this.
> or every card has different behavior??


Maximum clock speed is going to vary from chip to chip. Most likely it will be between 2000 and 2100MHz, and it will drop in 13MHz increments starting from around 50 degrees C.
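That behaviour can be sketched as a simple bin table: one ~13MHz step comes off the maximum boost for each temperature threshold the card crosses. The thresholds below are illustrative numbers pulled from reports in this thread (roughly one bin every few degrees past ~50C), not an official NVIDIA table:

```python
def boosted_clock(max_boost_mhz, temp_c, step_mhz=13,
                  first_threshold_c=50, step_interval_c=5):
    """Estimate the GPU Boost 3.0 clock at a given temperature:
    drop one step_mhz bin for every step_interval_c degrees past
    first_threshold_c. Illustrative model, not NVIDIA's real curve."""
    if temp_c < first_threshold_c:
        return max_boost_mhz
    bins = 1 + (temp_c - first_threshold_c) // step_interval_c
    return max_boost_mhz - bins * step_mhz

print(boosted_clock(2100, 45))  # 2100: below 50C, full boost
print(boosted_clock(2100, 50))  # 2087: first bin dropped
print(boosted_clock(2100, 63))  # 2061: three bins down
```

Power and voltage limits can pull the clock down further on top of this, which is why two cards at the same temperature can still boost differently.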


----------



## ricko99

Quote:


> Originally Posted by *connectwise*
> 
> How high is too high? I've not had any carshing problems but I haven't OC'd it yet. It's just below 2k stock oc.
> 
> > Originally Posted by *SLOWION*
> > 
> > Sup dudes
> > 
> > Got an ASUS Strix GTX 1070 OC in-house
> > 
> > Been pretty satisfied with it so far although it doesn't seem to be the best overclocker. I can only get about 2036 MHz on the core before it starts to get sketchy. Still a nice jump coming from an R9 290X Lightning though
> > 
> > Ran a few benchmarks at 1440p and 1080p for whoever is interested
> 
> I can't believe you had the self control to actually take pictures before you put that into your machine.
> 
> I took a picture of my box, a day after everything was installed. Who's got the patience to leave the card out of the computer?

Quote:


> Originally Posted by *M0E*
> 
> I leave my card in basic OC mode and it boosts to 2025 consistently with zero crashing. In fact I am VERY satisfied with my purchase.


They don't mention what the core clock is, but some who own the OC variant of the card face crashes whenever they put it in OC mode; some even have to downclock the card from the original Gaming mode to get it stable. It sounds similar to my ASUS R9 280X TOP issue a few years ago, where I had to downvolt the card to get it up and running because ASUS overvolted the card and overclocked the memory too high. That's why I'm pretty torn on getting the Strix. Design-wise it looks the best among all the 3-fan cards, but I've had doubts about ASUS's out-of-the-box quality for a while now.


----------



## benjamen50

Quote:


> Originally Posted by *ricko99*
> 
> They don't mention what's the core clock but some who owns the OC variant of the card will face crashes whenever they put the card on OC mode, some even have to downclock the card from the original gaming mode to get the card stable. Sounds similar to me my ASUS R9 280x TOP issue a few years ago where I had to downvolt the card to get it up and running due to ASUS overvolting the card and overclocking the memory clock too high. That's why I'm pretty torn on getting the Strix. Design wise it looks the best among all the 3 fans cards but out of the box quality, I have doubt with ASUS for a while now

So, TL;DR: Asus cards are currently a mess with the factory OC, where it may not be stable. Looks like EVGA and Gigabyte are the two decent choices ATM that don't really have major issues. Correct me if I'm wrong.


----------



## zipper17

Quote:


> Originally Posted by *bigjdubb*
> 
> Maximum clockspeed is going to vary from chip to chip. Most likely it will be between 2000 and 2100 mhz and it will drop in 13mhz increments starting from around 50 degrees C .


Yeah, I think a water cooler is really needed if you want to eliminate all throttling.
Air cooling is not really effective at holding a steady high core clock.


----------



## SLOWION

Quote:


> Originally Posted by *connectwise*
> 
> How high is too high? I've not had any carshing problems but I haven't OC'd it yet. It's just below 2k stock oc.
> Quote:
> 
> 
> 
> Originally Posted by *SLOWION*
> 
> Sup dudes
> 
> Got an ASUS Strix GTX 1070 OC in-house
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Been pretty satisfied with it so far although it doesn't seem to be the best overclocker. I can only get about 2036 MHz on the core before it starts to get sketchy. Still a nice jump coming from an R9 290X Lightning though
> 
> Ran a few benchmarks at 1440p and 1080p for whoever is interested
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can't believe you had the self control to actually take pictures before you put that into your machine.
> 
> I took a picture of my box, a day after everything was installed. Who's got the patience to leave the card out of the computer?
Click to expand...

Mine sat for about 2 weeks before it finally got installed


----------



## drunkonpiss

Quote:


> Originally Posted by *bigjdubb*
> 
> Did you have a driver failure/restart? It's nice that the graphics drivers restart themselves so quickly and smoothly after a crash now but it's still better to reboot after the driver failure. I have this happen all the time when testing out overclocks, sometimes the clock speed is there, it's just the program you are using to read it is screwed up from the driver crash.


Thanks! It seems my OC was unstable, so I had to redo it. Did another run, but I have to limit the core clock to +200 or below to get steady performance.


----------



## syl1979

I don't think you can get this one in western markets :

GALAX 1070 GAMER








Stock boost is between 1987MHz and 2012MHz.
Still optimizing the overclock, but it works well at 2100MHz core / 2212MHz memory (Samsung).

The PCB should be very similar to the EXOC's. Picture found on the website.


For the curious, the Chinese webpage:
http://www.szgalaxy.com/__ZH_GB__/Product5/ProductDetail?proID=248

And the Bios :
https://www.techpowerup.com/vgabios/184645/184645


----------



## dminzi

I'm sorry to post this again, but I REALLY need help. For some reason my core clock is stuck at 1506MHz and I have no idea how to change it. In Precision X it will just sit there even when my computer is completely idle. It raises all the temps in my comp, so I am a bit worried. I just changed to a DisplayPort 1440p monitor, if that matters at all. It will boost into the 1700s during gameplay, but at idle it no longer scales down... Please help me!


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *dminzi*
> 
> Im sorry to post this again, but I REALLY need help. For some reason my core clock is stuck on 1506mHZ and I have no idea how to change it. In Precision X it will just sit there even when my computer is completely idle. It raises all the temps in my comp so I am a bit worried. I just changed to a displayport 1440p monitor if that matters at all. It will bost into the 1700s during gameplay, but on idle it no longer scales down... Please help me!


Add your PC's specs to your sig, plz.

If the temps of all components are below the suggested max, then there's no need to worry.
Are they?

If the answer is yes...
try the following order and read closely:
1. Uninstall all vid utilities like Precision X and wipe all data they might try to keep, so they can be freshly installed later (and stick to one vid utility for OCing, don't install multiple ones), then restart.
2. Uninstall ALL hardware utilities (the ones that run in the background or have any ability to remember things), then restart.
3. Uninstall the video drivers, then restart.
4. Use GPU-Z to confirm the temps and speeds of the vid card.
5. Run a simple utility to monitor all other hardware temps and speeds (but be sure you run a freshly extracted one; delete existing copies used before the problem).
6. If temps and speeds seem better, install a vid utility like Precision X, but make sure it is a fresh install; restart if all seems fine and check again.

GL


----------



## syl1979

Was it working before changing the monitor?

Which OS ? Can we have a capture of gpuz ? Which card was installed before ?


----------



## dminzi

Quote:


> Originally Posted by *syl1979*
> 
> Was it working before changing the monitor?
> 
> Which OS ? Can we have a capture of gpuz ? Which card was installed before ?


I am running Windows 10, and it was throttling the core up and down like normal when I was using my old 1080p 60Hz monitor. I have only ever had a GTX 1070 FE in this system. I am going to try the driver sweep the guy above suggested and see if that helps at all. A GPU-Z picture is attached, I think.


----------



## dminzi

Quote:


> Originally Posted by *dminzi*
> 
> I am running Windows 10 and it was throttling the core up and down like normal when I was using my old 1080p, 60fps monitor. I have only ever had a GTX 1070 FE in this system. I am going to try the driver sweep like the guy above suggested and see if that helps at all. GPU-Z picture is attached I think


Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> Add ur PC's specs to ur Sig, plz.
> 
> if the temps of all components are all below suggested max, then no need to worry.
> Are they?
> 
> If answer is yes...
> try the following order and read closely:
> 1. uninstall all Vid utilities like Precision-X and wipe all data that it might try to keep. That way it can be freshly installed later, (and stick to one Vid utility for OCing, don't install multiple ones) then restart.
> 2. uninstall ALL Hardware utilities, (the ones that run in the background or ones that have any ability to remember things) then restart.
> 3. uninstall video drivers, then restart.
> 4. use GPU-z to confirm temps and speeds of Vid card.
> 5. run a simple utility to monitor all other hardware temps and speeds. (but be sure u run a freshly extracted one. delete existing ones already used before the problem.)
> 6. if temps and speeds seem better, then install a Vid utility like Precision-x but be sure it is a fresh install, then restart if all seems fine and check again.
> 
> GL


I have done what you said and not much has changed... Here is a GPU-Z screenshot after the reinstalls/resets...


----------



## zipper17

Quote:


> Originally Posted by *syl1979*
> 
> I don't think you can get this one in western markets :
> 
> GALAX 1070 GAMER
> 
> Stock boost between 1987Mhz and 2012 Mhz.
> Still optimizing overclock but works well at 2100Mhz core 2212 Memory (Samsung)
> 
> The PCB should be very similar to the EXOC. Picture found on the website
> 
> 
> For the curious the chinese webpage
> http://www.szgalaxy.com/__ZH_GB__/Product5/ProductDetail?proID=248
> 
> And the Bios :
> https://www.techpowerup.com/vgabios/184645/184645


What temperature do you get running at a 2100MHz core clock, and at what % fan speed?


----------



## vloeibaarglas

Quote:


> Originally Posted by *gtbtk*
> 
> MSI, Gigabyte, ASUS, Palit, Gainward, EGVA have all produced 1070 cards with Micron memory. They all use a different version of vbios compared to their samsung memory equivalent. (86.04.26.00.XX micron 86.04.1e.00.xx samsung)
> 
> Anecdotally, the samsung memory clocks higher than Micron. I have no experience with a samsung card. Leaving everything else at defaults, micron cards with give you a checkerboard artifacts and lock up. However, if you lock voltage in Afterburner, the Micron memory can go to +580-600 where it starts flashing coloured artifacating but no lockup indicating to me at least, that the problem is more a bug/design limitation in the MSI Gaming X Bios ram power delivery curve rather than with the RAM itself.
> 
> I did some experiments with my Gaming X and flashed it with a number of different micron version bios. The 1633 core clock Palit bios and the memory could clock higher without checkerboards than with the stock MSI bios. The Palit bios did introduce other issues though, the over power limit was reduced to 114% from 126%, the fans spun slower at 100% and the card ran hotter (70 deg vs 55 deg) but still within limits of the gpu chip.


I'll dump you an Asus Strix OC 1632 BIOS, 86.04.26.00.62, with Micron today. And let me get this straight: you have not flashed a Samsung BIOS on your card, right? We may need someone with a Palit/Gainward/EVGA FTW with dual BIOS + Micron to test this out for us.

Terrible memory overclocking here. 8800MHz works for me; 9000MHz instantly crashes the PC with a BSOD. My last Strix OC, which was not stable at stock boost, was a piss-poor overclocker; it could not achieve anything over 2012MHz no matter what, but its Samsung memory could go up to at least 9600MHz stable. I didn't push for more before sending it back. The current card does 2126 on the core, possibly 12/25MHz more, but the Micron memory is just piss poor.


----------



## bigjdubb

Quote:


> Originally Posted by *dminzi*
> 
> I have done what you just said and not much has changed... Here is a GPU-Z screen shot post reinstalls/resets...


Have you checked your power settings in the Nvidia control panel? If you have "Prefer Maximum Performance" selected, the card will not clock down below its base 3D clock.

Is your new monitor a G-Sync monitor?


----------



## dminzi

Quote:


> Originally Posted by *bigjdubb*
> 
> Have you checked your power settings in Nvidia control panel? If you have "Prefer Maximum Performance" selected then the card will not clock down below it's base 3d mode clock.
> 
> Is your new monitor a G-Sync monitor?


It is a G-Sync monitor and the power setting is on adaptive. I just dropped the refresh rate to 120Hz and I'm still locked at the high idle core clock.


----------



## bigjdubb

Quote:


> Originally Posted by *dminzi*
> 
> *It is a G-Sync Monitor* and the power is set to adaptive. I just downclocked the hz to 120 and I still am locked at the high core idle.


That is probably the source of the problem. If you look around for the thread on G-Sync and cards not downclocking, you might find a solution. I know the problem existed, but I'm not sure if it was ever resolved; I didn't pay much attention to it since I don't have a G-Sync monitor.


----------



## asdkj1740

Is there any way to lock the voltage at the max level?
My EVGA FTW 1070 can boost the voltage to 1.09V, but it is not stable all the time; it varies from 1.07 to 1.09V.
I have tried the voltage-lock function in MSI AB, but the voltage still seems to fluctuate between 1.08 and 1.09V.


----------



## bigjdubb

Quote:


> Originally Posted by *asdkj1740*
> 
> is there any way to lock the voltage to the max level?
> evga ftw 1070 can boost the voltage to 1.09v but it is not stable all the time, it varies from 1.07~1.09
> i have tried the function of locking voltage in msi ab but it seems that the voltage still fluctuate between 1.08~1.09v


I'm not sure I would concern myself with that small of a fluctuation.


----------



## whicker

Quote:


> Originally Posted by *ricko99*
> 
> They don't mention what's the core clock but some who owns the OC variant of the card will face crashes whenever they put the card on OC mode, some even have to downclock the card from the original gaming mode to get the card stable. Sounds similar to me my ASUS R9 280x TOP issue a few years ago where I had to downvolt the card to get it up and running due to ASUS overvolting the card and overclocking the memory clock too high. That's why I'm pretty torn on getting the Strix. Design wise it looks the best among all the 3 fans cards but out of the box quality, I have doubt with ASUS for a while now


Are they using MSI AB or the Asus utility? With MSI AB you add +25 to the core and that should make it run at the advertised OC boost (1860MHz); in reality it is 2025MHz. With +100 voltage and +113, my Strix OC runs at 2126MHz core and +700 on the Samsung memory. With a custom fan curve the card stays around 60C as long as your ambient is in the low 20s. I find it pretty quiet up to 65% fan speed, and it has no perceptible coil whine. When using stock OC speeds (+25 core) and +100% voltage you should actually be able to hit around 2065MHz.

If anyone has a Strix OC card that can't hit the OC boost, I would for sure contact Asus or your retailer for a replacement. The cards are guaranteed at those speeds; that's why they cost $50 more than the non-OC version.


----------



## saunupe1911

Quote:


> Originally Posted by *amd7674*
> 
> Grats;-) very nice looking rig ;-)
> 
> My new toy is coming next week.


Quote:


> Originally Posted by *whicker*
> 
> Are they using MSI AB or the asus ultility. With msi ab you add +25 to the core and that should make it run at advertised OC boost (1860mhz). In reality it is 2025mhz. With 100% increased voltage and +113 my Strix oc runs at 2126mhz core and +700 samsung memory. With a custom fan curve the card stays around 60c as long as your ambient is in the low 20s. I find it pretty quiet up to 65% fan speed and it has no perceptible coil whine. When using stock OC speeds (+25 core) and +100%v you should actually be able to hit around 2065mhz.
> 
> If anyone has a strix OC card that cant hit the OC boost then I would for sure contact Asus or your retailer for a replacement. The cards are guaranteed at those speeds, that's why they cost 50$ more than the non OC version.


People like me throughout this thread have been posting Strix OC experiences and overclocking benchmarks. I've even posted Afterburner and 3DMark screenshots. Looks like he got a dud; it's the first bad experience I've seen with a Strix OC. That thing should at least do 2000 MHz core and 9000 MHz memory without crashing in anything. Return it.


----------



## bigjdubb

If the card reaches the stated boost speeds then it shouldn't be returned. If it doesn't reach those speeds then you should return it.

Returning a product because it doesn't overclock as well as you hoped drives up prices and leads to stricter return policies.


----------



## gtbtk

Quote:


> Originally Posted by *vloeibaarglas*
> 
> I'll dump you a Asus Strix OC 1632 bios 86.04.26.00.62 with Micron today. And let me get this straight, you have not flashed a Samsung BIOS on your card right? We may need someone with a Palit/Gainward/EVGA FTW with Dual Bios + Micron to test this out for us.
> 
> Terrible memory overclocking here. 8800 mhz works for me, 9000 mhz instantly crashes PC due to BSOD. Last Strix OC, which was not stable at stock boost was a piss poor overclocker, could not achieve anything over 2012 mhz no matter what, but that Samsung memory can go up to at least 9600 mhz stable. Didn't push for more before sending it back. The current card does 2126 core, possibly 12/25 mhz more, but the Micron memory is so piss poor.


If you lock the voltage on the curve page in Afterburner, the Micron memory should clock up to about +600 (9200 MHz) without crashing. At least it does so with my Gaming X. Without locking the voltage, I end up with checkerboard artifacts and a blue screen at about +500. I am coming to the conclusion that the power to support the high memory clocks is not being ramped up fast enough, and that is the reason for the crash.

It will be interesting to see how the Asus OC BIOS goes. I have only seen the non-OC Asus BIOS so far. The EVGA SC BIOS lets you play with the EVGA Precision XOC automatic overclocking utility. It gets 2136 MHz clocks but is unstable on my card.

No, I have not flashed a Samsung BIOS. Everything I have tried has been 86.04.26.00.xx version BIOSes. I did post a question earlier asking that same thing but got no responses.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> is there any way to lock the voltage to the max level?
> evga ftw 1070 can boost the voltage to 1.09v but it is not stable all the time, it varies from 1.07~1.09
> i have tried the function of locking voltage in msi ab but it seems that the voltage still fluctuate between 1.08~1.09v


The card automatically drops voltage as temps increase; it's a function of Boost 3.0.
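
For anyone curious how this shakes out in practice: Pascal sheds boost clock in discrete 13 MHz bins as the core crosses temperature thresholds. A rough model, purely for illustration (the thresholds below are invented; real ones vary per card and BIOS):

```python
# Illustrative model of GPU Boost 3.0 thermal behavior on Pascal.
# The 13 MHz step matches what owners observe in monitoring tools; the
# temperature thresholds here are made-up examples, not NVIDIA's values.
STEP_MHZ = 13

def boosted_clock(base_boost_mhz, temp_c, thresholds=(38, 46, 54, 60)):
    """Drop one 13 MHz bin for every temperature threshold crossed."""
    bins_dropped = sum(1 for t in thresholds if temp_c >= t)
    return base_boost_mhz - bins_dropped * STEP_MHZ

print(boosted_clock(2126, 30))  # cool card keeps full boost -> 2126
print(boosted_clock(2126, 62))  # all four thresholds crossed -> 2074
```

This is why a card that shows 2126 MHz at the start of a benchmark settles lower after a few minutes of load even with no power or voltage limit hit.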


----------



## TheGlow

Quote:


> Originally Posted by *gtbtk*
> 
> If you lock the voltage on the curve page in Afterburner, the Micron memory should clock up to about +600 (9200 MHz) without crashing. At least it does so with my Gaming X. Without locking the voltage, I end up with checkerboard artifacts and a blue screen at about +500. I am coming to the conclusion that the power to support the high memory clocks is not being ramped up fast enough, and that is the reason for the crash.
> 
> It will be interesting to see how the Asus OC BIOS goes. I have only seen the non-OC Asus BIOS so far. The EVGA SC BIOS lets you play with the EVGA Precision XOC automatic overclocking utility. It gets 2136 MHz clocks but is unstable on my card.
> 
> No, I have not flashed a Samsung BIOS. Everything I have tried has been 86.04.26.00.xx version BIOSes. I did post a question earlier asking that same thing but got no responses.


Would you mind posting a screenshot of how you set up Afterburner?
I have the Gaming X as well, using the beta Afterburner, and I'm not sure what's up.
I see people say they add 50-75 on the core, yet I put +150 and it went to around 2100 (not at my PC now and don't have my notes).
As for memory, I'm unsure what to do.
I'm also a bit clueless about the fan curve and how this throttling kicks in.
When I let the Gaming App do the OC, I only saw it go to about 1936/1949.
So I don't know what a good idle temp, idle fan speed, or target temp/fan during gaming should be.


----------



## TheBoom

Quote:


> Originally Posted by *bigjdubb*
> 
> Temperature is always a factor, you can't disable physics with a custom bios.


Not too sure exactly how it works, but with my 970 I got my highest clocks with a custom BIOS even with the card hitting 90°C in games.
Quote:


> Originally Posted by *dminzi*
> 
> Hello everyone,
> Quick question: For some reason, after upgrading to a 1440p monitor that uses display port, my card will no longer under clock itself when idle. So it sits at 1500mhz all the time which is kind of pointless and just heats up my case. Anybody have any idea why this is happening? It used to sit at around 700mhz when doing idle tasks...


Do you have a multi monitor set up or is it a single 1440p monitor? With displayport this issue has been around for ages and the only real way to circumvent it is by using Nvidia Inspector's multi display power saver tool.


----------



## Blackfyre

Quote:


> Originally Posted by *vfrmaverick*
> 
> what resolution? I play it @1080 fully, and i mean FULLY maxed out at 80fps+


How is that possible? I am playing at 1080p too (_with anti-aliasing @ 2x MSAA_). Do you have advanced settings fully maxed out too? Or just the graphics settings and everything in advanced turned off? What driver are you using?

*EDIT*

As you can see, even though average FPS in benchmarks are well over 60FPS, the minimums which correspond to areas in the countryside (_outside the city_), have much lower frame-rates (_*usually between 30 & 40 FPS in-game*_).

Frames Per Second (Higher is better) *Min*, Max, *Avg*

Pass 0, *6*.072186, 135.065979, *91*.556847
Pass 1, *31*.898115, 99.113045, *67*.764236
Pass 2, *49*.795315, 146.027786, *100*.498779
Pass 3, *66*.335686, 151.946594, *112*.582672
Pass 4, *24*.358755, 177.153885, *88*.457359

*EDIT #2*

If I turn off all the advanced graphic options and turn anti-aliasing off too, with everything else in graphics (_not advanced_) maxed out, the lowest FPS I saw was 56FPS (_for a few seconds, otherwise everything is smooth above 60FPS_). However, I am playing on a 40" TV, I cannot play any game without anti-aliasing. They all look horrible with jagged lines on every edge. With anti-aliasing @ 8X the FPS immediately drops between 30 and 40 FPS in those areas outside the city (_especially when you're surrounded by a lot of grass_).


----------



## bigjdubb

What CPU are you using?

There are certain areas that cause FPS drops, especially during the daytime. I have yet to figure out a way to keep the minimums above 60 fps in these areas no matter the settings. At 1440p and 4k I only get drops in these areas even with the settings maxed. I think it's a coding/engine issue and not a MOAR POWA issue.


----------



## syl1979

Quote:


> Originally Posted by *zipper17*
> 
> how much temperature did you get running with coreclock 2100mhz? and at what % fanspeed ?


I made the trials with 80% fans, for both card and case. Voltage at 1.06 V (using the curve). It stays below 60°C in the Firestrike stress test (which is not so demanding).


----------



## TheGlow

I'm not sure if I'm missing something. I set the clock to +150 and got 2100 MHz. Then +175, and still 2100. +185 = 2113. +200 = 2126, +205 = 2126, +215 crashes.
I thought each bump on the clock should show up. There were many sections where I bumped up a bit and saw no impact, or it would even go down sometimes.
For a few seconds just now, alt-tabbed out of Overwatch, it was saying 2152, then dropped back down.

60°C, 60% fan so far. Power limit at 126, temp at 92, voltage untouched.

Memory I'm unsure about; I got it to +800 (4799 in Afterburner) and saw some black strips, rarely. I brought it to +775 and haven't seen anything yet.

Oddly, Firestrike scored less in graphics after the OC but higher in physics.
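
A note on what those memory numbers mean: Afterburner's readout (~4004 MHz stock on a 1070) is the double-data-rate clock, and the "8008 MHz" marketing figure is twice that, since GDDR5 transfers twice per clock. Offsets add directly to the readout, which is how +800 lands near the 4799 reading above. A quick sketch (the 4004 stock readout is the typical 1070 value, assumed here):

```python
# Map an Afterburner memory offset to the tool readout and the effective
# GDDR5 transfer rate, using the typical GTX 1070 stock readout of 4004 MHz.
STOCK_READOUT_MHZ = 4004

def memory_clocks(offset_mhz, stock=STOCK_READOUT_MHZ):
    readout = stock + offset_mhz     # what Afterburner/GPU-Z display
    effective = readout * 2          # GDDR5 moves data twice per clock
    return readout, effective

print(memory_clocks(800))  # (4804, 9608) -> roughly "9.6 GHz effective"
```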


----------



## TheBoom

Quote:


> Originally Posted by *TheGlow*
> 
> I'm not sure if I'm missing something. I set the clock to +150 and got 2100 MHz. Then +175, and still 2100. +185 = 2113. +200 = 2126, +205 = 2126, +215 crashes.
> I thought each bump on the clock should show up. There were many sections where I bumped up a bit and saw no impact, or it would even go down sometimes.
> For a few seconds just now, alt-tabbed out of Overwatch, it was saying 2152, then dropped back down.
> 
> 60°C, 60% fan so far. Power limit at 126, temp at 92, voltage untouched.
> 
> Memory I'm unsure about; I got it to +800 (4799 in Afterburner) and saw some black strips, rarely. I brought it to +775 and haven't seen anything yet.
> 
> Oddly, Firestrike scored less in graphics after the OC but higher in physics.


Unfortunately that's how boost seems to work. You don't get the exact values you add; it depends on a lot of other factors.

The Firestrike score could be one of two things: either it's within the margin of error, or the clocks are unstable at that voltage.


----------



## Forceman

Quote:


> Originally Posted by *TheGlow*
> 
> I'm not sure if I'm missing something. I set the clock to +150 and got 2100 MHz. Then +175, and still 2100. +185 = 2113. +200 = 2126, +205 = 2126, +215 crashes.
> I thought each bump on the clock should show up. There were many sections where I bumped up a bit and saw no impact, or it would even go down sometimes.
> For a few seconds just now, alt-tabbed out of Overwatch, it was saying 2152, then dropped back down.
> 
> 60°C, 60% fan so far. Power limit at 126, temp at 92, voltage untouched.
> 
> Memory I'm unsure about; I got it to +800 (4799 in Afterburner) and saw some black strips, rarely. I brought it to +775 and haven't seen anything yet.
> 
> Oddly, Firestrike scored less in graphics after the OC but higher in physics.


I don't know why you didn't see anything between +150 and +175 unless it throttled before you noticed it, but the clock speeds increment in 13 MHz steps. So adding +10 won't always get you a new clock step. That's why it didn't change between +200 and +205, for example.
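
The granularity Forceman describes can be sketched in a few lines. Offsets only "count" in whole 13 MHz bins, so two requests that fall inside the same bin produce the same final clock (where the bin boundaries sit relative to your card's base clock varies; this just illustrates the step size):

```python
# Pascal clock offsets apply in discrete 13 MHz bins; anything inside the
# same bin yields the same final clock. Bin alignment per card may differ.
def applied_offset(requested_mhz, step=13):
    return (requested_mhz // step) * step

print(applied_offset(200), applied_offset(205))  # both 195 -> same clock bin
print(applied_offset(185))                       # 182 -> one bin lower
```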


----------



## syl1979

When overclocking with Nvidia Boost it is very important to follow BOTH frequency and voltage, and the PerfCap reason.

Always keep the GPU-Z sensors in the background, and if possible use the Afterburner overlay to monitor at least frequency and voltage.

I have made some trials with the Afterburner curve. There are many hidden rules that seem to control the boost:

- Core temperature throttling that may start as low as 35°C, another step at 60°C, with some other limit when the temp is going back down (frequency increases again when getting below 34°C).
You can even see the position of the points changing in the Afterburner curve if you keep it open all the time!

- Power limit throttling
- VRM temperature throttling

If I set all points from 1.05 to 1.093 V to 2100 MHz:

- If voltage % in Afterburner is set to 0, the card will boost in the lowest range of voltage. Below 34°C I get 2100 MHz at 1.05 V. As soon as it goes over 35°C it decreases to 2088 MHz, keeping the same voltage.
- If voltage % is set to 100, what I see is a first boost at 1.093 V, going back to 1.081 V almost immediately. As I am on air, at this voltage it goes very quickly over 35°C, so it drops immediately to 2088.
- If voltage % is at 60, the boost settles somewhere around 1.063 V. It can go up to 1.075 V at some point during the test. Maybe it is linked to power draw.
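
The "set all points from 1.05 to 1.093 V to 2100 MHz" trick above amounts to flattening the V/F curve past a chosen voltage, so the card asks for the same clock no matter which voltage point Boost lands on. A minimal sketch (the voltage/frequency pairs are invented examples, not real curve data):

```python
# Flatten an Afterburner-style V/F curve: clamp every point at or above
# `from_voltage` to one target frequency. Example values are hypothetical.
def flatten_curve(curve, from_voltage, target_mhz):
    return {v: (target_mhz if v >= from_voltage else f)
            for v, f in curve.items()}

stock_curve = {1.000: 1962, 1.031: 2012, 1.050: 2050, 1.062: 2076, 1.093: 2100}
flat = flatten_curve(stock_curve, 1.050, 2100)
print(flat)  # every point at/above 1.050 V now requests 2100 MHz
```

The temperature offsets syl1979 observes then act on top of this flat ceiling, which is why the card still steps down to 2088 once it warms past 35°C.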


----------



## Blackfyre

Quote:


> Originally Posted by *bigjdubb*
> 
> What CPU are you using?
> 
> There are certain areas that cause FPS drops, especially during the daytime. I have yet to figure out a way to keep the minimums above 60 fps in these areas no matter the settings. At 1440p and 4k I only get drops in these areas even with the settings maxed. I think it's a coding/engine issue and not a MOAR POWA issue.


*4790K @ 4.7GHz* so I really doubt it's a CPU bottleneck. Usage never goes above 60% from what I've noticed while playing GTA V. But like you said, I agree that it's most likely a coding/engine issue (_poor optimization_).


----------



## syl1979

Just lower the resolution to 720p; you will see where the CPU limit is.


----------



## gtbtk

Quote:


> Originally Posted by *TheGlow*
> 
> Would you mind posting a screenshot of how you set up Afterburner?
> i have the Gaming X as well and using the Beta Afterburner and not sure whats up.
> I see people say they add 50-75 on the core, yet I put 150 and it went to around 2100. (not at my PC now and don't have my notes).
> As for memory I'm unsure what to do.
> Also a bit clueless in regards to the fan curve and how this throttling kicks in.
> When I let the Gaming App do OC I only saw it go to about 1936/1949.
> So I don't know what should be a good idle temp, idle fan speed, or target temp/fan during gaming.


I am running a Micron memory card. Samsung cards behave differently and use a different BIOS version. With the locked voltage, the +600 memory setting is not exactly stable under load, but it doesn't instantly crash either, the way it does without locking the voltage. Artifacts appear on screen after a little while under load, so I think it is temperature related as opposed to power-delivery related. +500 is a better place for me to be, but this is still experimental. I doubt I would run at this speed 24/7.


Spoiler: Warning: Spoiler!


----------



## bigjdubb

Quote:


> Originally Posted by *TheGlow*
> 
> I'm not sure if I'm missing something. I set the clock to +150 and got 2100 MHz. Then +175, and still 2100. +185 = 2113. +200 = 2126, +205 = 2126, +215 crashes.
> I thought each bump on the clock should show up. There were many sections where I bumped up a bit and saw no impact, or it would even go down sometimes.
> For a few seconds just now, alt-tabbed out of Overwatch, it was saying 2152, then dropped back down.
> 
> 60°C, 60% fan so far. Power limit at 126, temp at 92, voltage untouched.
> 
> Memory I'm unsure about; I got it to +800 (4799 in Afterburner) and saw some black strips, rarely. I brought it to +775 and haven't seen anything yet.
> 
> Oddly, Firestrike scored less in graphics after the OC but higher in physics.


Boost 3.0 is a booger to deal with. If your scores are dropping with the overclock, it is probably unstable. There really isn't much point in overclocking beyond the point where performance stops improving, but Firestrike shouldn't be your only measuring stick for that return either.


----------



## ogow89

Quote:


> Originally Posted by *Blackfyre*
> 
> *GTA 5 People:*
> 
> Help a brother out. I need setting recommendations to maintain the game above 60FPS and use anti-aliasing too. This game seriously dips when exiting the city area. Also any recommended mods that improve the graphics while not hitting performance too much that are easy to install?
> 
> I'm already using Re-Shade 2.0 and one of the popular GTA 5 Re-Shade profiles, but after playing the *Witcher 3* modded out like crazy and running easily over 60FPS, all of a sudden GTA 5 looks like a massive disappointment in comparison (_both graphically & performance optimization wise_).


DSR 2880*1620

Everything maxed out except for grass on very high.

FXAA, no MSAA, you are using dsr anyways. Set DSR smoothness to 15%.

Advanced settings, turn on long shadows and streaming while flying, you can max the bar for distance scaling but the rest turn them off.

Inner city 65-90 fps depending on cpu load, i am on an i5. Your i7 should deliver higher performance.

Out of the city, 60+, with dips to mid 50s if you are on a hill with too much grass. To maintain 60 fps all the time, you will have to turn grass a notch down to high.


----------



## Blackfyre

Quote:


> Originally Posted by *ogow89*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> DSR 2880*1620
> 
> Everything maxed out except for grass on very high.
> 
> FXAA, no MSAA, you are using dsr anyways. Set DSR smoothness to 15%.
> 
> Advanced settings, turn on long shadows and streaming while flying, you can max the bar for distance scaling but the rest turn them off.
> 
> Inner city 65-90 fps depending on cpu load, i am on an i5. Your i7 should deliver higher performance.
> 
> 
> 
> Out of the city, 60+, with dips to mid 50s if you are on a hill with too much grass. To maintain 60 fps all the time, you will have to turn grass a notch down to high.


Great advice above, thanks. The game runs smoothly. It's not as smooth (_visually_) as with anti-aliasing @ 1080p, but it'll do, because it performs much better. Wow, *grass* is basically the single biggest FPS hitter; that's just crazy.


----------



## gtbtk

Quote:


> Originally Posted by *vloeibaarglas*
> 
> I'll dump you a Asus Strix OC 1632 bios 86.04.26.00.62 with Micron today. And let me get this straight, you have not flashed a Samsung BIOS on your card right? We may need someone with a Palit/Gainward/EVGA FTW with Dual Bios + Micron to test this out for us.
> 
> Terrible memory overclocking here. 8800 mhz works for me, 9000 mhz instantly crashes PC due to BSOD. Last Strix OC, which was not stable at stock boost was a piss poor overclocker, could not achieve anything over 2012 mhz no matter what, but that Samsung memory can go up to at least 9600 mhz stable. Didn't push for more before sending it back. The current card does 2126 core, possibly 12/25 mhz more, but the Micron memory is so piss poor.


I got hold of the Asus OC Micron Mem bios this afternoon and tried it out on my Gaming X.

After a few hours of messing around, I'm impressed. I think that I may leave it on my card for a while.

At least on my card, I can get higher stable clocks than I can with the MSI stock BIOS. It looks to me like ASUS has done something a bit different with the power delivery curves in their version. I can get a 2126 MHz clock with +500 memory and it seems stable so far. The best I was able to get out of the MSI BIOS that was consistently stable was 2101. Temps are under control in the same way they were with the stock BIOS.


----------



## TheGlow

Quote:


> Originally Posted by *gtbtk*
> 
> I am running a Micron memory Card. Samsung cards behave differently and use a different bios version. With the locked voltage +600 memory setting is not exactly stable under load but it doesn't instantly crash either the way it does without locking voltage. Artifacts will appear on screen after a little while under load so I think that it is temperature related as opposed to power supply related . +500 is a better place for me to be but this is still experimental. I doubt that I would run at this speed 24/7
> 
> 
> Spoiler: Warning: Spoiler!


I'm on an MSI Gaming X w/ Micron as well.
I see big differences between us, however.
I haven't touched the core voltage yet, but I have the power limit and temp at max like your pic.
I have my clock manually set to +200 as I haven't played with the curve version. Should I try that?
What would show instability from putting memory too high? I had mine at +800 and only saw a couple of random black lines, like knife slashes: maybe 2-3 inches wide, thin at the ends, a little wider in the middle. And I would only see those every 20-30 seconds, not that frequently. I dropped it to +750 and didn't see any more like that.
So not seeing artifacts doesn't necessarily mean it's stable? So +200 could potentially perform worse than +180?
I know Firestrike isn't the end-all, but I figure I need something as a baseline. I'll need to check my notes, but Heaven went up a chunk, +1800 score from the OC, just like Firestrike's physics portion.
I need to add my rig info, but the CPU is an i5-6600K; I forget if I had it at 4.2 or 4.4 GHz.

Lastly, my temps are much different. With just the Gaming App defaults it sits at 70°C. With AB manual and the fan around 100% I think I was still sitting around 55°C.
I'll need to check my notes later, but even at idle I don't think I've gone under 41°C.

How risky is playing with the voltage? Overclocking is fun, but I'd rather not shave a year or more off the lifespan for a few more fps.


----------



## LocutusH

Anyone else experiencing this problem, with low clocks stuck after long idle time?
Geforce Forums thread


----------



## vloeibaarglas

Quote:


> Originally Posted by *gtbtk*
> 
> I got hold of the Asus OC Micron Mem bios this afternoon and tried it out on my Gaming X.
> 
> After a few hours of messing around, I'm impressed. I think that I may leave it on my card for a while.
> 
> At least on my card, I can get higher stable clocks than I can with msi stock bios. It looks to me that ASUS has done something a bit different with the power delivery curves in their version. I can get a stable 2126 Mhz Clock with +500 memory and it seems stable so far. Best I was able to get out of the MSI bios that was consistently stable was 2101. Temps are under control in the same way they were with the stock Bios


How did you find those Micron bios on Techpowerup/GPU-Z website? When I filter on GTX 1070, I only see Samsung: https://www.techpowerup.com/vgabios/?architecture=&manufacturer=&model=GTX+1070&interface=&memType=&memSize=

I submitted my BIOS via GPU-Z, but it is not showing up since it is not 86.04.1E.00.XX bios.


----------



## bigjdubb

Quote:


> Originally Posted by *LocutusH*
> 
> Anyone else experiencing this problem, with low clocks stuck after long idle time?
> Geforce Forums thread


I haven't, but I can't let my card downclock due to the latency problem. The lowest clock my card sees is 1518 MHz.


----------



## gtbtk

Quote:


> Originally Posted by *vloeibaarglas*
> 
> How did you find those Micron bios on Techpowerup/GPU-Z website? When I filter on GTX 1070, I only see Samsung: https://www.techpowerup.com/vgabios/?architecture=&manufacturer=&model=GTX+1070&interface=&memType=&memSize=
> 
> I submitted my BIOS via GPU-Z, but it is not showing up since it is not 86.04.1E.00.XX bios.


Change the brand filter to unverified uploads.

here is the link https://www.techpowerup.com/vgabios/185493/185493


----------



## gtbtk

Quote:


> Originally Posted by *TheGlow*
> 
> I'm on an MSI gaming x w/ Micron as well.
> I see big differences between us however.
> I havent touched the core voltage yet, but I have the power limit and temp to max like your pic.
> I have my clock manually set to 200 as I havent played with the Curved version. Should I try that?
> What would show instability on putting memory too high? I had mine on 800 and only saw a couple random black lines, like knife slashes. maybe 2-3 inches wide, thin at the ends, a little wider in the middle. And I would only see those every 20-30 secs. Not that frequently. I dropped it to +750 and I didnt see any more like that.
> So not necessarily seeing artifacts doesnt mean its stable? So +200 could potentially perform less than +180?
> I know firestrikes not the end all but I figure I need something as a base line. Ill need to check notes but Heaven went up a chunk, +1800 score from OC, just like firestrikes physics portion.
> I need to add my rig info, but cpu is i5-6600k , I forget if I had it at 4.2 or 4.4Ghz.
> 
> Lastly my temps are much different. If in just Gaming APP defaults it sits at 70º. With AB manual and fan around 100% I think I was still sitting around 55º.
> Ill need to check my notes later, but even idle I dont think I've gone under 41º,
> 
> How risky is playing with the voltage? overclocking is fun but I'd rather not shave a year or more of lifespan for a few more fps.


Just remember not to set Afterburner to automatically start and apply a profile; that way, if your computer crashes because you pushed too far, it won't still be overclocked after the reboot.

Do not run the Gaming App and Afterburner at the same time.

The voltage slider at 100% will not hurt your card. The maximum the card can go to is 1.093 V, and the slider only increases the percentage of the way up to that limit. It will increase clocks along the curve that Afterburner sets, and can potentially crash your machine if you have the slider/curve points set too high. At default settings it won't crash.

I have never been able to get the core clock slider near +200, and never got the RAM anywhere close to +750. It sounds more like how the Samsung memory cards behave. You may find that the machine will crash with a +200 OC if you boost the voltage. You need to put the card under load to test whether it is stable or not.

Set a custom fan curve or run the fan at a fixed rate. 100% is good; it will keep the load temps under 60°C, and the MSI cards are not noisy. The lower the temps, the faster the card will run. After you find a stable setting, create a custom fan curve to manage your temps.
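
For anyone setting this up for the first time: a fan curve is just a handful of (temperature, fan%) points with linear interpolation between them, which is what Afterburner's fan tab draws. A sketch with arbitrary example points (not MSI's defaults):

```python
# Linear-interpolation fan curve, the same shape Afterburner's fan tab uses.
# The (temp_C, fan_%) points below are illustrative examples only.
def fan_percent(temp_c, points=((30, 30), (50, 60), (60, 100))):
    if temp_c <= points[0][0]:
        return points[0][1]            # below the curve: floor speed
    for (t0, f0), (t1, f1) in zip(points, points[1:]):
        if temp_c <= t1:
            # interpolate between the two neighbouring points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return points[-1][1]               # past the last point: pinned at max

print(fan_percent(40))  # midway between 30°C/30% and 50°C/60% -> 45.0
print(fan_percent(70))  # beyond the last point -> 100
```

Following gtbtk's advice, you would steepen the middle segment so the fan hits 100% before the card reaches the temperature where Boost starts dropping bins.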


----------



## TheGlow

Quote:


> Originally Posted by *gtbtk*
> 
> Just remember to not set afterburner to automatically start and apply a profile. If your computer crashes cause you push too far it wont be overclocked after the reboot.
> 
> Do not run the gaming app and afterburner at the same time
> 
> The voltage slider at 100% will not hurt your card. The maximum the card can go to is a maximum of 1.093V and the slider only increases the % of power up to that limit. It will increase clocks along the curve that afterburner sets and can potentially crash your machine if you have the slider.curve points set too high. at default settings it wont crash
> 
> I have never been able to get my either the core clock slider near +200 and never got the Ram anywhere close to 750. It sounds more like how the Samsung memory cards work. You may find that the machine will crash with a +200 oc if you boost the voltage. You need to put the card under load to test if it is stable or not
> 
> Set a custom fan curve or run the card at a fixed rate. 100% is good, it will keep the load temps under 60 degrees and the MSI cards are not noisy. The lower the temps the faster the card will run. after you find a stable setting, create a custom fan curve to manage your temps


I had done this with Valley or Heaven running 1440p windowed, so it had a load.
Again, I'll need to check my notes, but I know I set it to +200 and hit apply and it seemed fine. Memory was still sitting down at +400, so I had no artifacts or anything.
The next jump I tried was +225 and it instantly killed my PC. Then I tried +215, same behavior: the system hangs for 5-10 seconds, then Valley/Heaven pops up that it crashed, then after 30 seconds or so Windows would BSOD. I can't recall if I tried +205 or +210 successfully.
As for memory, I didn't see anything drastic like an immediate crash, just those odd artifacts around +800. At +775 I didn't see them.
I need to check, but I believe I left the core at +175/180 and memory at +700 or +750 and played about 2 hours of Overwatch with no incidents.

So regardless of the voltage and core clocks I set, GPU Boost will still throttle and adjust based on the temps?
I had the fan at 100% and honestly didn't really hear anything, but my living room gets hot, so I often have the AC and a fan on (at least now in the summer), so I'm not really hearing anything.
I did feel heat around my feet when I was just using the Gaming App initially, which seemed to not kick the fans on often, or to just aim for 70°C. I was a bit concerned, considering the tower is on the floor. But it's to my left, and the fans all blow left, up, and rear, so to feel heat on the right side it would have to be coming out of the rear and reaching that area through the ambient air.

What's the deal with the core curves? Is that the GPU Boost settings?


----------



## vloeibaarglas

Quote:


> Originally Posted by *gtbtk*
> 
> Change the brand to unverified uploads
> 
> here is the link https://www.techpowerup.com/vgabios/185493/185493


Yup, that is my BIOS. Feels bad, man. 2 out of 26 submitted BIOSes are Micron.


----------



## fwix

Hello folks, an owner of a GTX 1070 Strix here :d. GPU-Z validation: https://www.techpowerup.com/gpuz/details/hkx8p
Some pics of the GPU:






I have a question though: is there any BIOS editor or an unlocked BIOS for the Strix? Or has anyone tried flashing their Strix with another BIOS, like the MSI Gaming X? The throttling this GPU has with the power target set to 112% is damn annoying; I mean, I can get the core clock to 2050 MHz with only 0.980 V, so I believe there is some headroom in this babe xdd

Here's an image of 3DMark 11:


----------



## bigjdubb

No modded BIOS yet. If you don't want the card to limit your boost speeds you will need to crank the fans up and keep it as cool as possible. Just set the fan curve to hit 100% at 50°C and turn your headphones up.


----------



## Cicco99

Quote:


> Originally Posted by *fwix*
> 
> Hello folks , an owner of a GTX 1070 STRIX here :d , validation gpuz : https://www.techpowerup.com/gpuz/details/hkx8p
> some pics of teh gpu
> 
> 
> 
> 
> 
> 
> I have a question though , is there any bios editor or a unloked bios for the strix ?? or maybe any one tried to flash his strix with other bios like msi gaming x , cause it's damn annoying the throttling that this GPU have with the power target set to 112 % , i mean i can get the core clock to 2050 mhz with only 0.980 volt so i believe there is some headroom for this babe xdd
> 
> Here an image of 3dmark 11


You can flash the BIOS of the 1070 Strix OC, which has a 120% power limit instead of 112%. (I have done it and everything works perfectly; in fact, sometimes according to GPU-Z the TDP goes over 120%, to 125%/130%.)


----------



## fwix

Quote:


> Originally Posted by *Cicco99*
> 
> you can flash the bios of the 1070 Strix OC that have power limit to 120% instead of 112% (i have done it and all work perfectly, indeed some time accordly to gpuz tdp goes over 120% like 125%/130%
> 
> 
> 
> 
> 
> 
> 
> )


Can I have a link? I searched the TPU database and found none, dang :dddd Does nobody like Asus or what :d


----------



## TheGlow

Quote:


> Originally Posted by *gtbtk*
> 
> Just remember to not set afterburner to automatically start and apply a profile. If your computer crashes cause you push too far it wont be overclocked after the reboot.
> 
> Do not run the gaming app and afterburner at the same time
> 
> The voltage slider at 100% will not hurt your card. The maximum the card can go to is a maximum of 1.093V and the slider only increases the % of power up to that limit. It will increase clocks along the curve that afterburner sets and can potentially crash your machine if you have the slider.curve points set too high. at default settings it wont crash
> 
> I have never been able to get my either the core clock slider near +200 and never got the Ram anywhere close to 750. It sounds more like how the Samsung memory cards work. You may find that the machine will crash with a +200 oc if you boost the voltage. You need to put the card under load to test if it is stable or not
> 
> Set a custom fan curve or run the card at a fixed rate. 100% is good, it will keep the load temps under 60 degrees and the MSI cards are not noisy. The lower the temps the faster the card will run. after you find a stable setting, create a custom fan curve to manage your temps





http://imgur.com/vesxL1R




http://imgur.com/K4SSrAv

Bumping up the voltage made the core hit 2151, but that's about it. I lowered it and it dropped to 2138, then upped it and it hopped back up again.
Temps shot to 65 pretty quickly.
I lowered the voltage again and then my PC crashed.
For now I guess I'll just sit at +180 core, +700 memory and see how it handles.
What are the side effects or signs if memory is too high? Core just outright kills my PC, so that's an obvious giveaway.
Edit: Well, +825 appeared to be OK for a minute. I kicked it to +850 and Heaven froze. I could still interact with Afterburner, so I dropped back to +800 and Heaven cleared up and started moving again.
Also, I set the power limit back to 100% and see no difference, so I guess the gains are directly related to having the core voltage increased.
Also, underclocked to -400, memory +0, fan 100%, I can't get the temp under 56/57.
At +180/+700 it goes to 61º, so interestingly that extra 580 MHz of core only costs me 5-6º.
The fan curve has it sitting at about 63º and 74% fan. I guess I'll leave it like that for a bit.
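That stepping routine (raise the offset until Heaven freezes, then back off) is essentially a manual binary search between a known-good offset and a known-bad one. Here's a minimal sketch of the idea; `run_stress_test` is a hypothetical stand-in for whatever load you actually use (Heaven, Valley, a couple hours of Overwatch), and the +210 threshold in the demo is invented:

```python
# A minimal sketch of the manual stability search described above, phrased as a
# binary search between a known-good offset and a known-bad one. The
# `run_stress_test` callback is a hypothetical stand-in for a real load test;
# it should return True if the run passed with no crash or artifacts.

def find_max_stable_offset(low, high, run_stress_test, step=5):
    """Bisect between a passing offset (low) and a failing offset (high)."""
    while high - low > step:
        mid = (low + high) // 2
        if run_stress_test(mid):
            low = mid    # passed: the stable ceiling is at least mid
        else:
            high = mid   # crashed or artifacted: the ceiling is below mid
    return low

# Purely illustrative: a fake tester that "passes" up to +210 MHz.
print(find_max_stable_offset(0, 400, lambda off: off <= 210))  # prints 209
```

In practice each "test" costs a benchmark run (or a crash and reboot), which is why bisecting beats stepping up 5 MHz at a time.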


----------



## fwix

Okay, found a BIOS here: https://rog.asus.com/forum/showthread.php?87101-GTX-1070-STRIX-Default-OC-MODE-BIOS and yeah, the difference is really good.

I'm now 100% stable at 2114 MHz with 1.054 V // 2400 MHz memory in gaming, tested in three games: Witcher 3, Rise of the Tomb Raider, and AC Unity.

Plus I'm at a 28k GPU score :D (I believe with a further overclocked CPU I could hit 29k; my 3770S OC'd to 4 GHz limits me a bit).

Also, in benchmark tools I can go up to 2202, but it's not really stable. I believe it's just a temperature limit; with a waterblock I think I could get a stable 2250 MHz with no problem. I really like Pascal, and the Asus is so beautiful xD


----------



## saunupe1911

So what are you guys using to save and upload the BIOS on your 1070? TechPowerUp GPU-Z version 0.8.6 gives me a "BIOS reading not supported on this device" error when I click Save to File.


----------



## Prozillah

Quote:


> Originally Posted by *fwix*
> 
> Okay, found a BIOS here: https://rog.asus.com/forum/showthread.php?87101-GTX-1070-STRIX-Default-OC-MODE-BIOS and yeah, the difference is really good.
> 
> I'm now 100% stable at 2114 MHz with 1.054 V // 2400 MHz memory in gaming, tested in three games: Witcher 3, Rise of the Tomb Raider, and AC Unity.
> 
> Plus I'm at a 28k GPU score :D (I believe with a further overclocked CPU I could hit 29k; my 3770S OC'd to 4 GHz limits me a bit).
> 
> Also, in benchmark tools I can go up to 2202, but it's not really stable. I believe it's just a temperature limit; with a waterblock I think I could get a stable 2250 MHz with no problem. I really like Pascal, and the Asus is so beautiful xD


What card do you actually have? And does anyone know if I can flash my G1 with this BIOS? Wouldn't mind opening it up to the 120% power limit.


----------



## Forceman

Quote:


> Originally Posted by *saunupe1911*
> 
> So what are you guys using to save and upload bios to your 1070. Tech Powerup GPUz version 0.8.6 gives me a "BIOS reading not supported on this device" error when I click save to file


Isn't GPU-Z on version 1.9 or something now? You probably just need to update to the newest version.


----------



## LiquidHaus

For those who want to take another shot at an OC run: Zotac released a new version of their FireStorm OC utility. Much like Afterburner, just with a different skin.

Link is on their home page.


----------



## madmeatballs

Quote:


> Originally Posted by *lifeisshort117*
> 
> For those who want to take another shot at an OC run: Zotac released a new version of their FireStorm OC utility. Much like Afterburner, just with a different skin.
> 
> Link is on their home page.


I'll definitely try it. I'll let you guys know the results when I get it. So far my card is sitting pretty and stable at 2113MHz.


----------



## gtbtk

Quote:


> Originally Posted by *TheGlow*
> 
> I had done this with Valley or Heaven running 1440p windowed, so it had a load.
> Again, I'll need to check my notes, but I know I set it to +200 and hit apply and it seemed fine. Memory was still sitting at +400, so I hadn't seen artifacts or anything.
> The next jump I tried was +225, and it instantly killed my PC. Then I tried +215, same behavior: the system hangs for 5-10 seconds, then Valley/Heaven pops up that it crashed. Then after 30 seconds or so Windows would BSOD. I can't recall if I tried +205 or +210 successfully.
> As for memory, I didn't see anything drastic like an immediate crash, just those odd artifacts around +800. At +775 I didn't see them.
> I need to check, but I believe I left core at +175/180 and memory at +700 or +750 and played about 2 hours of Overwatch with no incidents.
> 
> So regardless of the voltage and core clocks I set, GPU Boost will still throttle and adjust based on the temps?
> I had the fan on 100% and honestly didn't really hear anything, but my living room gets hot, so I often have the AC and a fan on, at least now in the summer, so I'm not really hearing anything.
> I did feel heat around my feet when I was just using the Gaming App initially, which seemed not to kick the fans on often, or just aimed for 70º. I was a bit concerned, considering the tower is on the floor to my left, and its fans all blow left, up, and to the rear, so to feel heat on the right side, it would have to be coming out of the rear and reaching the right through the ambient air.
> 
> What's the deal with the core curves? Is that the GPU Boost settings?


You must have won the silicon lottery. Congrats.

The card will vary clock speed and voltage depending on temperature while under load. The curve lets you control that at individual points across the range, up to the limits of the card. The slider you played with moves the entire curve up and down.
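A toy illustration of that last point, with made-up voltage/frequency pairs rather than real GP104 values: the core slider translates the whole curve by one fixed offset, while curve editing moves individual points.

```python
# A toy illustration of the point above: the Afterburner core slider shifts the
# whole voltage/frequency curve by one offset, while the curve editor moves
# individual points. The (voltage, clock) pairs here are made up, not real
# GP104 values.

curve = [(0.800, 1700), (0.900, 1850), (1.000, 1950), (1.093, 2000)]

def shift_curve(curve, offset_mhz):
    """Apply a uniform core-clock offset to every point on the curve."""
    return [(v, clk + offset_mhz) for v, clk in curve]

for v, clk in shift_curve(curve, 180):
    print(f"{v:.3f} V -> {clk} MHz")
```

GPU Boost then picks a point on that (shifted) curve based on temperature and power headroom, which is why the same offset can land at different clocks on different runs.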


----------



## gtbtk

Quote:


> Originally Posted by *LocutusH*
> 
> Anyone else experiencing this problem, with low clocks stuck after long idle time?
> Geforce Forums thread


I have noticed that happening if I lock the voltage for a while and then take the lock off.

If you hit the reset-to-default button in Afterburner, the card will go back to normal and you can re-apply your overclock.


----------



## gtbtk

Quote:


> Originally Posted by *vloeibaarglas*
> 
> Yup, that is my BIOS. Feels bad man. 2 out of 26 submitted BIOS are Micron


I think the first batch of cards manufactured all used Samsung memory. The second manufacturing run in July was all or mostly Micron, so I suspect we will see more of those start coming through in the next couple of weeks.

One thing I noticed is that the Asus BIOS on my card would report higher clocks than the MSI BIOS did, but my Fire Strike scores were all a bit lower. The Asus BIOS was telling me I was at 2139 and the card was reasonably stable; I can't get near that with the MSI BIOS.

I think the Asus BIOS on MSI hardware might misreport clock speeds.


----------



## gtbtk

Quote:


> Originally Posted by *TheGlow*
> 
> 
> 
> http://imgur.com/vesxL1R
> 
> 
> 
> 
> http://imgur.com/K4SSrAv
> 
> Bumping up the voltage made the core hit 2151, but that's about it. I lowered it and it dropped to 2138, then upped it and it hopped back up again.
> Temps shot to 65 pretty quickly.
> I lowered the voltage again and then my PC crashed.
> For now I guess I'll just sit at +180 core, +700 memory and see how it handles.
> What are the side effects or signs if memory is too high? Core just outright kills my PC, so that's an obvious giveaway.
> Edit: Well, +825 appeared to be OK for a minute. I kicked it to +850 and Heaven froze. I could still interact with Afterburner, so I dropped back to +800 and Heaven cleared up and started moving again.
> Also, I set the power limit back to 100% and see no difference, so I guess the gains are directly related to having the core voltage increased.
> Also, underclocked to -400, memory +0, fan 100%, I can't get the temp under 56/57.
> At +180/+700 it goes to 61º, so interestingly that extra 580 MHz of core only costs me 5-6º.
> The fan curve has it sitting at about 63º and 74% fan. I guess I'll leave it like that for a bit.


Strange that Afterburner is showing your card at 0V. What have you set the Afterburner settings to?


----------



## amd7674

Getting cold feet, guys ;-(. There is a recent EVGA FTW review on Amazon.com which claims it uses Micron. I should get my EVGA FTW next week, but I should be able to get my money back if I don't pick it up. Lol
Coil whine is another issue; perhaps the Asus Strix OC would be a safer buy, with better chokes... I'm not sure what to do... ;-(


----------



## GreedyMuffin

Quote:


> Originally Posted by *amd7674*
> 
> Getting cold feet, guys ;-(. There is a recent EVGA FTW review on Amazon.com which claims it uses Micron. I should get my EVGA FTW next week, but I should be able to get my money back if I don't pick it up. Lol
> Coil whine is another issue; perhaps the Asus Strix OC would be a safer buy, with better chokes... I'm not sure what to do... ;-(


Coil whine could also be because of your PSU.

Before I moved my 1080 to this rig, I had it in a 4770 rig with a cheap B600 PSU. No coil whine at all; it was quiet and nice. With my AX1500i I get coil whine.


----------



## vloeibaarglas

Micron is not that bad. Memory bandwidth doesn't limit the 1070 in most situations. Overclocking the memory from my current 2200 MHz (Micron) to 2400 MHz (the Samsung higher range) increases the Time Spy graphics score by about 100 points, based on the results online. I'm not willing to do another exchange for a 1.5% increase in Time Spy performance when my current core can get to 2100+ stably, and risk getting another Micron card with a worse core. Getting another 2000 MHz core like my first card would be a solid 5% drop in performance.

We've been getting a lot of reports on Micron. It started with the MSI Gaming X, then all the cards, and most recently a lot of EVGA FTWs.
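A back-of-the-envelope version of that trade-off. The ~100-point memory-OC gain and the 2000 vs 2100 MHz cores come from the post above; the 6600-point Time Spy graphics baseline is an assumed round number for illustration:

```python
# Back-of-the-envelope version of the trade-off above. The ~100-point memory-OC
# gain and the 2000 vs 2100 MHz cores come from the post; the 6600-point Time
# Spy graphics baseline is an assumed round number for illustration.

def pct_gain(delta, baseline):
    """Return a gain as a percentage of the baseline."""
    return 100.0 * delta / baseline

print(round(pct_gain(100, 6600), 1))          # memory OC: ~1.5%
print(round(pct_gain(2100 - 2000, 2000), 1))  # core clock gap: 5.0%
```

Which is the point: a worse core costs roughly three times what a better memory bin buys back.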


----------



## saunupe1911

Let's band together and protest against Micron!!!!! I'm sick of this company shipping out crap to my gaming brothers lmao!!!!

But seriously, if you want a 1070 I would buy it now!!!! Why? Because all these returned Micron 1070s will be all that's left. Might be a good Black Friday purchase though.


----------



## amd7674

One of the reasons I went after the EVGA FTW was Samsung memory. If that goes out the window, what should I get: Asus Strix OC, Zotac AMP Extreme, MSI Gaming X, or stick with the EVGA FTW? BTW, in Canada the great EVGA warranty is not so great; I have to ship cards to the USA (same goes for Zotac). Asus and MSI have warehouses in my province, so shipping would be much cheaper. Although my CM HAF 922 case with three 200mm fans is not super quiet, I would like the card quiet. Currently I own an Asus GTX 670 and it has worked perfectly for the last 3-4 years. No coil whine. My PSU is an older Corsair TX750. I want to add a 1070 as the last upgrade to my current [email protected] build. I'm planning to keep this for the next 4 years or so, after which I will build a new rig.

So assuming all 1070s ship with Micron, what should I get: Asus, MSI, Zotac, or stick with EVGA??

Help!!!!!!?


----------



## saunupe1911

Quote:


> Originally Posted by *amd7674*
> 
> One of the reasons I went after the EVGA FTW was Samsung memory. If that goes out the window, what should I get: Asus Strix OC, Zotac AMP Extreme, MSI Gaming X, or stick with the EVGA FTW? BTW, in Canada the great EVGA warranty is not so great; I have to ship cards to the USA (same goes for Zotac). Asus and MSI have warehouses in my province, so shipping would be much cheaper. Although my CM HAF 922 case with three 200mm fans is not super quiet, I would like the card quiet. Currently I own an Asus GTX 670 and it has worked perfectly for the last 3-4 years. No coil whine. My PSU is an older Corsair TX750. I want to add a 1070 as the last upgrade to my current [email protected] build. I'm planning to keep this for the next 4 years or so, after which I will build a new rig.
> 
> So assuming all 1070s ship with Micron, what should I get: Asus, MSI, Zotac, or stick with EVGA??
> 
> Help!!!!!!?


Well, it seems you have one helluva chance of getting Samsung memory with either the Asus Strix OC or the big Zotac AMP Extreme. They have to be using Samsung memory to back those guaranteed OC ratings.


----------



## monza1412

Quote:


> Originally Posted by *amd7674*
> 
> Getting cold feet, guys ;-(. There is a recent EVGA FTW review on Amazon.com which claims it uses Micron. I should get my EVGA FTW next week, but I should be able to get my money back if I don't pick it up. Lol
> Coil whine is another issue; perhaps the Asus Strix OC would be a safer buy, with better chokes... I'm not sure what to do... ;-(


I read that same review and was really pissed; I was about to purchase that same one. Maybe in the official EVGA forum Jacob or someone else can clarify whether all the 1070 models will come with Micron memory from now on.
Asus is also using Micron; it seems it's just the luck of the draw.


----------



## TheGlow

Quote:


> Originally Posted by *gtbtk*
> 
> Strange that Afterburner is showing your card at 0V. What have you set the Afterburner settings to?


I'm not sure what setting may have impacted that. I think there was a setting under "Unlock voltage" that mentioned monitoring voltage; I didn't check that one off. I believe GPU-Z's VDDC is showing the equivalent statistic, but I could be wrong.
I may have won a lottery where it can potentially go further, but my temps beg to differ.
What do you suggest in the way of benchmarks to see if it's stable and working well, as opposed to not simply crashing but potentially not working correctly?
Like how my graphics score in Fire Strike dropped even though the core went up about 200 MHz.


----------



## monza1412

Quote:


> Originally Posted by *gtbtk*
> 
> Strange that Afterburner is showing your card at 0V. What have you set the Afterburner settings to?


You said earlier that you locked your voltage before OC'ing your memory and did not see lockups. I read on a French forum that someone did the same with a Palit JetStream in order to overclock his VRAM; maybe that's the solution. We need more feedback on this. Maybe the chips aren't that bad, and the problem lies in the way the software overclocks them.


----------



## amd7674

Quote:


> Originally Posted by *saunupe1911*
> 
> Well, it seems you have one helluva chance of getting Samsung memory with either the Asus Strix OC or the big Zotac AMP Extreme. They have to be using Samsung memory to back those guaranteed OC ratings.


I can get both locally with a 30-day return policy. I know it's not moral to do it, but at least I have an option. With EVGA I'm stuck with the card... I think there's a 15% restocking fee. ;-(
Quote:


> Originally Posted by *monza1412*
> 
> I read that same review and was really pissed; I was about to purchase that same one. Maybe in the official EVGA forum Jacob or someone else can clarify whether all the 1070 models will come with Micron memory from now on.
> Asus is also using Micron; it seems it's just the luck of the draw.


Hmmm... please keep us posted if you find out. I might pass on EVGA ;-( Maybe I should go for the monster Zotac. My last two Zotac cards were fine: a GTX 970 in my HTPC and a GTX 650 Ti.


----------



## madroxinide

Sorry to be the bearer of bad news, but that is my review of the EVGA 1070 FTW with Micron memory on Amazon. I was quite disappointed when I got around to loading up GPU-Z. I'm at work right now, but later today I can dump the BIOS and/or test some overclocking capability.

I've messed around a little with the OC so far, but I'm not 100% sure I'm doing it right, and I've really only been stress testing against Overwatch, which I've heard doesn't play well with video card overclocks.

A couple of questions I have before I make claims about the speeds I'm getting:

1. Should I increase the voltage limit to 100%?
2. Should I also increase the power limit to 112%?
3. Should I manually set GPU/Mem increases, or should I use the OC boost 3.0 thing?
4. Is Unigine adequate for stress testing?

I have noticed some weird flashing on my 1440p 144Hz G-Sync monitor that seems to be correlated with the video card automatically adjusting its clock speeds. I've found similar claims, and apparently setting the refresh rate to 120 Hz fixes it. These flashes only happen on the desktop or while web browsing, never under load in a game.


----------



## monza1412

Quote:


> Originally Posted by *amd7674*
> 
> Hmmm... Please keep us posted if you find out. I might pass on evga ;-( maybe I should go for monster zotac. My last two zotac cards were fine gtx 970 htpc and gtx 650ti.


I feel your pain, bro, as I'm in the same boat. The other option is to buy an FE and deal with its problems; at least it's guaranteed (for now) to have Samsung chips.
I already posted in the EVGA forum, but I'm not too optimistic about a truthful outcome.

I feel cheated. All the site reviews were made with the first batch from the AIB partners, with cherry-picked cards, and for the retail units they silently switched a key component.


----------



## reflex75

Quote:


> Originally Posted by *gtbtk*
> 
> I got hold of the Asus OC Micron Mem bios this afternoon and tried it out on my Gaming X.
> 
> After a few hours of messing around, I'm impressed. I think that I may leave it on my card for a while.
> 
> At least on my card, I can get higher stable clocks than I can with msi stock bios. It looks to me that ASUS has done something a bit different with the power delivery curves in their version. I can get a stable 2126 Mhz Clock with +500 memory and it seems stable so far. Best I was able to get out of the MSI bios that was consistently stable was 2101. Temps are under control in the same way they were with the stock Bios


Hi, I have the same strange issue with overclocking my new Palit 1070 Super JetStream.
The core runs fine near 2100 MHz, no problem, but the VRAM acts strangely.
If I increase beyond 4200 MHz in Afterburner (which is only +200 compared to the default 4000 MHz), I get an instant freeze on the desktop with checkerboard square patterns.
At first I thought I'd had bad luck and gotten poor Micron memory.

But as you suggested, I tried locking the voltage to 1.093 before overclocking, and this time the VRAM easily increased to +600 MHz without any problem.

I could even run the Time Spy benchmark and saw my graphics score jump from 6300 to almost 6700!

But the side effect of locking the voltage high is a warmer idle temp on the desktop and more power draw.
Moreover, I can't set auto-overclocking at startup, because I first need to increase the voltage to avoid an instant freeze...
So it seems a BIOS update is the only solution.
But which BIOS best handles Micron VRAM? Are you satisfied with the Asus BIOS?


----------



## vloeibaarglas

It would be interesting to see whether the Micron BIOSes are programmed with lower memory voltages than the Samsung ones, or whether they're programmed at the same voltage but the Samsung chips require less voltage to overclock.

I don't have the skills to find the voltage offsets in GPU BIOSes.


----------



## amd7674

I just cancelled my EVGA FTW order. I can pick up the Asus Strix OC or non-OC ($50 cheaper), or the Zotac AMP Extreme. I have a 30-day, no-questions-asked return policy. I'm leaning toward Asus (they have a depot in Canada). Also, my current Asus GTX 670 is excellent.

Thoughts???


----------



## saunupe1911

Quote:


> Originally Posted by *amd7674*
> 
> I just cancelled my EVGA FTW order. I can pick up the Asus Strix OC or non-OC ($50 cheaper), or the Zotac AMP Extreme. I have a 30-day, no-questions-asked return policy. I'm leaning toward Asus (they have a depot in Canada). Also, my current Asus GTX 670 is excellent.
> 
> Thoughts???


No bro... the Strix non-OC has Micron. Grab the OC. I honestly don't understand why people even buy the non-OC; they're clearly stating there's a difference!


----------



## amd7674

Quote:


> Originally Posted by *saunupe1911*
> 
> No bro... the Strix non-OC has Micron. Grab the OC. I honestly don't understand why people even buy the non-OC; they're clearly stating there's a difference!


If I'm not mistaken, I think there were users with Micron RAM on their OC units too. But if the OC version gives me a higher chance of winning Samsung RAM, I will do it. ;-)

Thanks


----------



## LiquidHaus

Get the AMP Extreme. It is indeed a monster.

(mine)


----------



## bigjdubb

I haven't seen any consistency regarding memory brand for any particular card. I think it has more to do with when the card was produced than what brand or version it is.


----------



## vloeibaarglas

Quote:


> Originally Posted by *amd7674*
> 
> If I'm not mistaken I think there were users with micron ram on their oc units too. But if oc version is giving me higher chance of winning Samsung ram I will do it. ;-)
> 
> Thanks


I just got a Strix OC with Micron last week, an August 15th ship date from Newegg. An Aug 8th ship-date Strix OC from Newegg was Samsung.

At this point it's like a double lottery: one for the silicon, the other for the VRAM. I hate it when manufacturers do this. This is worse than the iPhone's TSMC vs Samsung chipgate.


----------



## amd7674

Quote:


> Originally Posted by *vloeibaarglas*
> 
> I just got a Strix OC with Micron last week, an August 15th ship date from Newegg. An Aug 8th ship-date Strix OC from Newegg was Samsung.
> 
> At this point it's like a double lottery: one for the silicon, the other for the VRAM. I hate it when manufacturers do this. This is worse than the iPhone's TSMC vs Samsung chipgate.


Ordered asus oc .... I have 30 days to return it.


----------



## reflex75

Quote:


> Originally Posted by *vloeibaarglas*
> 
> I just got a Strix OC with Micron last week, an August 15th ship date from Newegg. An Aug 8th ship-date Strix OC from Newegg was Samsung.
> 
> At this point it's like a double lottery: one for the silicon, the other for the VRAM. I hate it when manufacturers do this. This is worse than the iPhone's TSMC vs Samsung chipgate.


Triple lottery with coil whine...


----------



## TheGlow

Quote:


> Originally Posted by *madroxinide*
> 
> Sorry to be the bearer of bad news, but that is my review of the EVGA 1070 FTW with Micron memory on Amazon. I was quite disappointed when I got around to loading up GPU-Z. I'm at work right now, but later today I can dump the BIOS and/or test some overclocking capability.
> 
> I've messed around a little with the OC so far, but I'm not 100% sure I'm doing it right, and I've really only been stress testing against Overwatch, which I've heard doesn't play well with video card overclocks.
> 
> A couple of questions I have before I make claims about the speeds I'm getting:
> 
> 1. Should I increase the voltage limit to 100%?
> 2. Should I also increase the power limit to 112%?
> 3. Should I manually set GPU/Mem increases, or should I use the OC boost 3.0 thing?
> 4. Is Unigine adequate for stress testing?
> 
> I have noticed some weird flashing on my 1440p 144Hz G-Sync monitor that seems to be correlated with the video card automatically adjusting its clock speeds. I've found similar claims, and apparently setting the refresh rate to 120 Hz fixes it. These flashes only happen on the desktop or while web browsing, never under load in a game.


So far the only games I've been playing are Overwatch and Witcher 3, and I've been using Unigine to put the card under load when testing.
When I was originally using the Gaming App and switched between OC and Gaming, I would see my desktop flicker.
It has since stopped, but I'm not sure if that's a side effect of playing with Afterburner or of the setting I made in the Nvidia panel to use High Performance instead of Optimal Power.


http://imgur.com/vesxL1R


http://imgur.com/K4SSrAv

I don't have much experience with OC'ing video cards, so I'm still learning.
But as you can see, raising the voltage only got me 2151 MHz as opposed to 2138.
Core +200 works OK; +215 crashes. I haven't played around in between those two values yet.
For memory, +800 looks fine; +825 I only had up for a minute or two, but when I set it to +850 it died.
I have Micron, so it's still a lottery by the looks of it.
Although I am probably doing something wrong.

I settled on +180 core, +700 memory in the meantime, and Overwatch was running fine at about 120-130 fps @ 1440p. I had lowered a few non-critical settings from Ultra to High before OC'ing, so I plan to bump them back up to max and see.


----------



## madroxinide

So I've had some time to test my EVGA GTX 1070 FTW with Micron memory, purchased from Amazon on 8/22/16.

I started by using EVGA Precision XOC to raise the voltage limit to 100%. This allows 1.093 V under most circumstances; sometimes I see it hanging around 1.081 V, not sure why.

I then increased the power target to 112%, the max Precision XOC allows.

I then manually set the fan speed to 100% (2700 RPM).

I adjusted the Nvidia control panel settings to force maximum performance, so the card will not clock below the default clock of 1607 MHz. With the fan at 100%, I'm idling around 40C, with load temperatures hanging around 60C.

At this point, before adjusting the clock offsets, I applied a load with Unigine and was surprised to see a boost clock of about 2000 MHz instead of the advertised 1800 MHz; this must be from the voltage-limit increase.

I began increasing my GPU clock offset. Maybe I'm doing something wrong, or I have a poor overclocking card, because I can't keep Unigine from crashing with anything higher than +85. That results in somewhere around 2050 MHz on the GPU under prolonged load.

I then began adjusting my memory offset and settled around +375. I still think this may be a bit too high to be 100% stable. It results in a memory clock of 4374 MHz, or about 8750 MHz effective. If I push to +400, Unigine crashes.

I am going to test +85/+375 with Overwatch, and I expect that to cause some issues and force me to lower the clocks a bit.

Honestly, I expected more out of this EVGA FTW edition. I don't feel right returning a video card that performs flawlessly at the advertised speeds, but I am definitely disappointed that EVGA did not take more steps to ensure their overclocking-focused (and priced) cards are binned a bit more strictly.

Here is a screenshot just after Unigine finished.


http://imgur.com/OrS3j
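For anyone puzzling over the offset arithmetic above: Afterburner-style tools apply the memory offset to the double-data-rate clock they report, and GDDR5 is quad-pumped, so the effective transfer rate is twice that reported clock. The 4004 MHz base below is an assumed stock GTX 1070 value (8008 MT/s effective); tools disagree by a few MHz.

```python
# The memory-offset arithmetic spelled out. Afterburner applies the offset to
# the double-data-rate clock it reports; GDDR5's effective transfer rate is
# twice that reported clock. The 4004 MHz base is an assumed stock GTX 1070
# value (8008 MT/s effective); readings vary slightly between tools.

def effective_memory_rate(base_mhz, offset_mhz):
    """Effective GDDR5 rate in MT/s from an Afterburner-style reported clock."""
    return (base_mhz + offset_mhz) * 2

print(effective_memory_rate(4004, 375))  # prints 8758, close to the ~8750 above
```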


----------



## monza1412

Quote:


> Originally Posted by *madroxinide*
> 
> So I've had some time to test my EVGA GTX 1070 FTW with Micron memory, purchased from Amazon on 8/22/16.
> 
> I started by using EVGA Precision XOC to raise the voltage limit to 100%. This allows 1.093 V under most circumstances; sometimes I see it hanging around 1.081 V, not sure why.
> 
> I then increased the power target to 112%, the max Precision XOC allows.
> 
> I then manually set the fan speed to 100% (2700 RPM).
> 
> I adjusted the Nvidia control panel settings to force maximum performance, so the card will not clock below the default clock of 1607 MHz. With the fan at 100%, I'm idling around 40C, with load temperatures hanging around 60C.
> 
> At this point, before adjusting the clock offsets, I applied a load with Unigine and was surprised to see a boost clock of about 2000 MHz instead of the advertised 1800 MHz; this must be from the voltage-limit increase.
> 
> I began increasing my GPU clock offset. Maybe I'm doing something wrong, or I have a poor overclocking card, because I can't keep Unigine from crashing with anything higher than +85. That results in somewhere around 2050 MHz on the GPU under prolonged load.
> 
> I then began adjusting my memory offset and settled around +375. I still think this may be a bit too high to be 100% stable. It results in a memory clock of 4374 MHz, or about 8750 MHz effective. If I push to +400, Unigine crashes.
> 
> I am going to test +85/+375 with Overwatch, and I expect that to cause some issues and force me to lower the clocks a bit.
> 
> Honestly, I expected more out of this EVGA FTW edition. I don't feel right returning a video card that performs flawlessly at the advertised speeds, but I am definitely disappointed that EVGA did not take more steps to ensure their overclocking-focused (and priced) cards are binned a bit more strictly.
> 
> Here is a screenshot just after Unigine finished.
> 
> 
> http://imgur.com/OrS3j


Could you please try overclocking just the VRAM, without voltage adjustments? Tell us how far it goes before you see artifacts or driver crashes.

Also, don't forget that you have two BIOSes on that card. Try the slave one; it allows more TDP (122%, I believe) but does not turn off the fans in idle mode.


----------



## madroxinide

I did forget about having a second BIOS! Thanks.

After switching to the slave BIOS, I brought the voltage limit back to 0%, took the power target back to 100%, and also reduced the core clock offset to 0.

I am now able to reach +400 MHz on the VRAM, and after a couple of runs of Unigine it seems to be stable. I tried +450 and Unigine quickly crashed.

I wonder if I can get my core higher now after switching to the slave BIOS, which allows a 122% instead of 112% power target.
Edit: Doesn't really seem like it. Unigine continues to crash at +100/+400.

Gonna take a break from OC'ing and play some games. If anyone wants me to test something with my card, just give me a shout.


----------



## criminal

Quote:


> Originally Posted by *bigjdubb*
> 
> I haven't seen any consistency regarding memory brand for any particular card. I think it has more to do with when the card was produced than what brand or version it is.


I think Samsung is only guaranteed on the FE because they are all built by the same OEM. I am glad I went with the FE now: no coil whine, Samsung RAM, a good overclocker, and an $85 full-cover water block.

And no DPC latency issues.


----------



## kpo6969

Quote:


> Originally Posted by *bigjdubb*
> 
> I haven't seen any consistency regarding memory brand for any particular card. I think it has more to do with when the card was produced than what brand or version it is.


I believe this is the case also.
I purchased an MSI 1070 Gaming (no X or Z, just stock) on 07/07/16 from the Egg.
Samsung memory.


----------



## bigjdubb

Quote:


> Originally Posted by *criminal*
> 
> I think Samsung is only guaranteed on the FE because they are all built by the same OEM. I am glad I went with the FE now: no coil whine, Samsung RAM, a good overclocker, and an $85 full-cover water block.
> 
> And no DPC latency issues.


I wish I had snagged an FE, but I got the first one I could, which was the MSI. Maybe the MSI will have better resale value when I get rid of it for Vega.


----------



## SuperZan

Quote:


> Originally Posted by *criminal*
> 
> I think Samsung is only guaranteed on the FE because they are all built by the same OEM. I am glad I went with the FE now: no coil whine, Samsung RAM, a good overclocker, and an $85 full-cover water block.
> 
> And no DPC latency issues.


Same boat here, minus the FE bit, though the EVGA SC is essentially a dressed-up reference card. Quiet as a mouse, Samsung RAM as well, and no dreaded DPC latency issues (knock on wood). It's unfortunate that it's such a roulette wheel in terms of who will be affected.


----------



## criminal

Quote:


> Originally Posted by *bigjdubb*
> 
> I wish I had snagged an FE, but I got the first one I could, which was the MSI. Maybe the MSI will have better resale value when I get rid of it for Vega.


I felt stupid at first because I didn't have the patience to wait for an AIB card. The Zotac FE was the first card that came in stock on Amazon that I was actually able to snag, so I went with it. No regrets now.
Quote:


> Originally Posted by *SuperZan*
> 
> It's unfortunate that it's such a roulette wheel in terms of who will be affected.


It definitely is. It has to be certain combinations of hardware that cause the issue, and of course the Nvidia drivers as well. To be honest, these are the worst string of Nvidia drivers I have ever seen. I have had to roll my driver back twice due to stability issues. I am using the 368.69 WHQL driver; I had to roll back from 368.81 and 372.54.


----------



## bigjdubb

Well I don't regret my purchase since I bought it knowing I would be replacing it with a Vega card. Vega may get here a little sooner than expected but I figure I can probably sell my card and block at a $150 loss fairly easily. $150 for 3 or 4 months of fun (and hopefully only 2 months of frustration) is fine by me.


----------



## SuperZan

Quote:


> Originally Posted by *bigjdubb*
> 
> Well I don't regret my purchase since I bought it knowing I would be replacing it with a Vega card. Vega may get here a little sooner than expected but I figure I can probably sell my card and block at a $150 loss fairly easily. $150 for 3 or 4 months of fun (and hopefully only 2 months of frustration) is fine by me.


Indeed. You'll be able to get a great resale price for sure. I sold a 780 Ti that none of my local pals wanted for £120 on eBay; I'd expect a used 1070 would still sell for near retail even after Vega's release.


----------



## monza1412

The thing that is BS is that all the AIB partners sent cards with Samsung memory to reviewers; even today I couldn't find a single review of any 1070 using Micron RAM. It feels like a cheap move.
I know the OC is not guaranteed on either chip, but they are promoting something they do not actually sell in the retail channel.

This type of hardware change should be declared on the outside packaging or on the AIB partner's product page: name a second revision, a different product code, or something, so any customer can be aware of it. It's just not right.


----------



## saunupe1911

This micron crap sounds like a class action lawsuit to me!!! (Kanye Shrug)


----------



## criminal

Quote:


> Originally Posted by *saunupe1911*
> 
> This micron crap sounds like a class action lawsuit to me!!! (Kanye Shrug)


Nope, Nvidia has been doing it for years.


----------



## bigjdubb

I think it's pretty far off the mark to even be upset about it. This is not a new practice; it has been going on for years. Both brands of chips are manufactured to the same spec and operate at the stated frequency. No one is entitled to great overclocks.


----------



## criminal

Quote:


> Originally Posted by *bigjdubb*
> 
> I think it's pretty far off the mark to even be upset about it. This is not a new practice, has been going on for years. Both brands of chips are manufactured to the same spec and operate at the stated frequency. No one is entitled to great overclocks.


Agree. Although getting a crappy overclock does suck.


----------



## bigjdubb

Quote:


> Originally Posted by *criminal*
> 
> Agree. Although getting a crappy overclock does suck.


As someone who can't get 2100 MHz out of their 1070, I agree completely that it sucks. Such is life, though; I had a Sandy Bridge (until I sold it) that couldn't do better than 4.5 GHz, and that sucked more.


----------



## monza1412

Quote:


> Originally Posted by *bigjdubb*
> 
> I think it's pretty far off the mark to even be upset about it. This is not a new practice, has been going on for years. Both brands of chips are manufactured to the same spec and operate at the stated frequency. No one is entitled to great overclocks.


To be fair, recognizing that it's not a new practice doesn't make it right, and the OC capacity is not the issue here. The issue is that you, as a consumer, have the right to know what you are buying.

Let me ask you something: how can you be sure your card is performing correctly if all the reviews they show us were made with different components? How can I be sure that the only difference in this second batch of cards is just the memory supplier?

I'm not upset, as I didn't buy one, but these business practices lately suck big time.


----------



## Hunched

I've had two 1070s with Samsung RAM, and both have issues before +500.
My MSI Gaming with Samsung shows black squares after 10 minutes of Rise of the Tomb Raider gameplay at +320 MHz.
I didn't test my original Gigabyte 1070 enough because I sent it back (its fans were faulty), but it was probably about the same.
Neither could achieve anything higher than 2025 MHz core without eventual freezes or crashes.
My i5 4690K can't do more than 4.5 GHz and needs more voltage to be stable there than almost any chip I've seen.

I've lost the lottery on everything but my 970. G502 died, EVGA PSU died, cases arrive falling apart, Zowie EC2-A arrived with a faulty right click, Asus mobo died on Christmas Day. I've never owned a case I could install my HDD in normally because they all vibrate like mad. So much more.
I've had problems with basically every single thing in my PC except my Noctua products.

Feels bad man, when everyone is acting like it's the end of the world because their 1070 performs almost as badly as mine. At least all of you still seem to be getting 2100 MHz or close to it...
I wish my 1070s were as "bad" as all of yours. I guess I'm just super unlucky.


----------



## bigjdubb

Quote:


> Originally Posted by *monza1412*
> 
> To be fair, to recognize that is not a new practice doesn't make it right, and the oc capacity is not the issue here. The issue is that you, as a consumer, have the right to know what you are buying.
> 
> *I ask you something, how could you be sure your card is performing correctly if all the reviews they show us were made with different components? How can I be sure that the only difference in this second batch of cards is just the memory supplier?*
> 
> I'm not upset as I didn't buy one, but this lately business practices suck big time.


*You don't.*

Should these companies have a product label like the ingredients on food packaging? Every time they change suppliers for caps and resistors, should they send out bulletins to everyone? They advertise the card as having GDDR5 memory that runs at a given frequency, and that's what you are getting, whether it's Micron or Samsung. The components that make up electronic products change throughout the product's lifespan; this is the case for all electronics.

"Specifications Subject to Change Without Notice"

(Edited to spare feelings)


----------



## Hunched

It would be super nice if we could get a BIOS editor sometime this year so I could somewhat improve the lackluster performance of my 1070.
All these people flashing random BIOSes to their cards and claiming significantly improved memory overclocks give me hope, but I'd like to stick with BIOSes from my own manufacturer and have the ability to tweak them to perform better.

Not really into flashing a random Asus, Gigabyte, Zotac, etc. BIOS to my MSI card without even knowing the differences between the BIOSes.


----------



## monza1412

Quote:


> Originally Posted by *bigjdubb*
> 
> *You don't.*
> 
> Should these companies have a product label like ingredients on food packaging? Every time they change suppliers for caps and resistors should they be sending out bulletins to everyone? They advertise the card as having GDDR5 memory that runs at a given frequency, that's what you are getting whether or not it's Micron or Samsung. Components that make up electronic products change throughout the lifespan of that product, this is the case for all electronics.
> 
> "Specifications Subject to Change Without Notice"
> 
> If this bothers you, you should stop buying products and just make what you need and live off the land.


I think your last sentence was out of line as I was talking respectfully.

For the ram topic in question, think whatever you like, but then don't be mad when a gtx970 kind of fiasco happens.


----------



## bigjdubb

Quote:


> Originally Posted by *monza1412*
> 
> I think your last sentence was out of line as I was talking respectfully.
> 
> For the ram topic in question, think whatever you like, but then don't be mad when a gtx970 kind of fiasco happens.


I wasn't mad about my GTX 970's either. Sometimes I wish I still had them.


----------



## rfarmer

Quote:


> Originally Posted by *bigjdubb*
> 
> I wasn't mad about my GTX 970's either. Sometimes I wish I still had them.


I wasn't mad about my GTX 970 either. I bought it shortly after release and never had one issue with not having enough RAM; it just performed amazingly. I never saw the point in jumping on the "Nvidia screwed us over" bandwagon.


----------



## criminal

Quote:


> Originally Posted by *monza1412*
> 
> I think your last sentence was out of line as I was talking respectfully.
> 
> For the ram topic in question, think whatever you like, but then don't be mad when a gtx970 kind of fiasco happens.


I was aggravated about the 970 fiasco more than most users that bought one were. What you are talking about with the different memory types on the 1070 is not even in the same ballpark. Unless you can prove that Micron somehow makes the 1070 perform much worse than Samsung, it is a non-issue. Sorry.


----------



## LikesToSlide

I'm sorry if this was covered already but- How do you tell what brand of memory a card has?


----------



## Dude970

Quote:


> Originally Posted by *LikesToSlide*
> 
> I'm sorry if this was covered already but- How do you tell what brand of memory a card has?


Use GPU-Z


----------



## monza1412

Quote:


> Originally Posted by *criminal*
> 
> I was aggravated about the 970 fiasco more than most users that bought one were. What you are talking about with the different memory types on the 1070 is not even in the same ballpark. Unless you can prove that Micron some how makes the 1070 perform much worse than Samsung, it is a non issue. Sorry.


Sorry to hear that; I think you misunderstood me.
Obviously it's not in the same ballpark. The 970 was an extreme example of what a business/marketing policy can do, and as you can see from the posts above, not everyone felt bad about it.


----------



## rfarmer

Quote:


> Originally Posted by *LikesToSlide*
> 
> I'm sorry if this was covered already but- How do you tell what brand of memory a card has?


----------



## batman900

New Asus non-OC Strix here. Ship date from B&H was 8/23. Micron memory. 60.1% ASIC, which is the lowest I've ever had.

No issues with the card, not with latency or anything else. It is probably the most quiet card I've ever owned. Seriously, with my Noctua fans set on 60%, my comp makes zero noise with my head 3 feet from it. I decided on ASUS from the lack of reports of coil whine as I can't stand it, really wanted EVGA for the support but too many reports of whine, and my research was indeed correct with no sign of it in my case.

The memory does a solid +400, and the card will run at almost 2200 on stock voltage and BIOS. I say almost because this thing's boost is all over the place, and once temps get into the upper 60s in my small case with two fans it will throttle down to 2154, I think it was. Stock boost hits close to 1900.

The overclocking was really just for curiosity. It will be running at stock in my little setup that doesn't have the best breathing ability.

I also picked up an Asus 24" ROG 180 Hz G-Sync monitor, which I'm not completely sold on and will probably return. For the life of me I can't get a comfortable setup for the colors and gamma. Side by side with my BenQ 2411Z, there's no doubt it's a worse panel. G-Sync is really nice, but not on a screen worse than what I had, at the price these monitors currently demand. The matte coating is also very aggressive and gives a screen-door effect in games; it's like I'm looking through some weird filter. Ah well. Cheers!


----------



## j4k3nqc

I got my MSI GTX 1070 Sea Hawk EK X three weeks ago and I couldn't be much happier with it. This card is awesome, with such low load temps.
If I'm not mistaken, it uses the same board as the MSI 1070 Gaming X.

The max temp I've seen at stock speed is 39C.

I OC'ed the core to a stable 2088 MHz, and thanks to the Samsung GDDR5 memory, it runs at +750 MHz (9508 MHz effective). Max temp is 49C.

I think I got very lucky with this card.

It's a bit weird though, because I have not set any extra core voltage or power limit; it's at +0 mV and 100% power. I had crashes at +100 mV and a 126% power limit, yet it's all stable with nothing more than stock values. I'm wondering if the card handles the voltage better without specific values put in, even if it ends up higher...


----------



## Hunched

There's a new MSI Gaming BIOS on the MSI forums: https://forum-en.msi.com/index.php?topic=274411.msg1562922#msg1562922
Couldn't find it anywhere else, so I uploaded it myself to TechPowerUp's database.

It's kind of annoying that they aren't uploading new BIOSes to the product pages, since the MSI Live Update tool never finds anything despite updates existing.
So I guess it's going to be a community effort, with people who get new cards uploading the new BIOSes, or people are going to keep having to make BIOS-request topics on their forums for updates.

I guess they aren't making them easily available because there are two types of memory, and people are bound to brick their cards flashing the Micron BIOS onto Samsung cards and vice versa.
Would be nice if they put them on the product page anyway, with a very obvious disclaimer about which type of memory each BIOS is for, but I guess you can never trust the general public to be smart.


----------



## vloeibaarglas

Can we rename this thread [Official] Nvidia GTX 1070 Micron Memory Owner Support Thread?

Life is not that bad with Micron. You only lose a few percentage points in performance at most.


----------



## owikhan

Is this a good score, or can I get better than this?

ZOTAC GTX 1070 AMP Extreme Edition

I use FireStorm.


----------



## gtbtk

Quote:


> Originally Posted by *monza1412*
> 
> You said earlier that you did lock your voltage before oc your memory and did not see lock ups, I read in a french forum that someone did the same as you to a Palit Jetstream in order to overclock his vram, maybe that's the solution. We need more feedback on this, maybe the chips aren't that bad, and the problem relies on the way the software overclock them.


That may well be true. My MSI Gaming behaves differently with an Asus OC BIOS installed.


----------



## gtbtk

Quote:


> Originally Posted by *reflex75*
> 
> Hi, I have the same strange issue overclocking my new Palit 1070 Super Jetstream.
> The core runs fine near 2100 MHz, no problem, but the vram acts strangely.
> If I increase beyond 4200 MHz in Afterburner (which is only +200 over the default 4000 MHz), I get an instant freeze on the desktop with checkerboard square patterns.
> At first I thought, bad luck, I have poor MICRON memory.
> 
> But as you suggested, I tried locking the voltage to 1.093 before overclocking, and this time the vram easily increased to +600 MHz without any problem.
> 
> I could even run the Time Spy benchmark and watch my graphics score jump from 6300 to almost 6700!
> 
> But the side effect of locking the voltage high is a warmer idle temp on the desktop and more power draw.
> Moreover, I can't apply the overclock automatically at startup, because I first need to increase the voltage to avoid an instant freeze...
> So it seems a BIOS update is the only solution.
> But which BIOS best handles Micron vram? Are you satisfied with the Asus BIOS?


You do have one option with Afterburner.

You can set a base profile with locked voltage for 2D apps and a second profile with your overclock for 3D applications. That might be a workaround for you.


----------



## kevindd992002

Do you guys concur that the Zotac GTX 1070 AMP! Extreme is the best 1070 variant?


----------



## owikhan

Quote:


> Originally Posted by *kevindd992002*
> 
> Do you guys concur that the Zotac GTX 1070 AMP! Extreme is the best 1070 variant?


I am using one and fully satisfied; now I'm thinking about SLI.


----------



## asdkj1740

Quote:


> Originally Posted by *kevindd992002*
> 
> Do you guys concur that the Zotac GTX 1070 AMP! Extreme is the best 1070 variant?


It has almost the highest power limit among 1070s.
Its cooling is enough to keep it under 70C at 1xxx RPM.
It comes with a five-year warranty, which is remarkable.
With a separate VRM heatsink, you can easily install an AIO with proper VRM cooling, not to mention there is no sticker over the screws on the back of the PCB.
The MSRP is quite high though, even higher than the Gigabyte Xtreme and the EVGA FTW.
As for the MOSFETs used in this card, neither the quality nor the quantity is tier-1 grade,
but they are still fine for 150 W, or even ~300 W once a BIOS tweaker arrives.


----------



## kevindd992002

Quote:


> Originally Posted by *asdkj1740*
> 
> this has the almost highest power limit among others
> its cooling is enough to keep it cool under 70c at 1XXX rpm
> with five years warranty which is remarkable on average
> with separated vrm heatsink you can easily install aio with proper vrm cooling, not to mention there is no sticker on the screws of the back of the pcb
> msrp is quiet high though, even higher than gigabyte xtreme and evga ftw
> about the mosfets used in this card, quality and quantity both are not the tier 1 grade.
> but it still fine for 150w power or even ~300w if bios tweaker is here.


Gotcha. Is the board of the AMP Extreme generally better binned than the AMP Edition? If I decide to replace the stock cooler with an AIO anyway, would it be better to just go with the cheaper AMP Edition?


----------



## asdkj1740

Quote:


> Originally Posted by *kevindd992002*
> 
> Gotcha
> Gotcha. Is the board of the Amp Extreme generally better binned than the Amp Edition? If I decide on replacing the stock cooler with an AIO anyway, would it be better to just go with the cheaper Amp Edition?


From what I know, both share the same PCB, exactly the same if I remember correctly.
The differences between them are the BIOS settings (e.g. power limit) and the stock cooler.
If you are going to install an AIO, the AMP is good enough; no need to buy the Extreme one.
Don't forget to attach heatsinks to the vram. I hope you get the Samsung one; f**k my 1070 with Micron, only getting 2150 MHz.


----------



## kevindd992002

Quote:


> Originally Posted by *asdkj1740*
> 
> from what i have known, both share the same pcb, exactly the same if i remember them correctly.
> the differences between them are bios settings(eg. power limit) and the stock cooler.
> if you are going to install aio, sure amp is good enough and no need to buy the extreme one.
> dont forget the attach heatsinks for vram, wish you get the samsung one, f**k my 1070 with micron only getting 2150mhz


Yeah, sure. So there's no way for me to know whether I'd be getting Samsung chips over Micron, and it's pure luck of the draw?


----------



## Jackharm

Quote:


> Originally Posted by *kevindd992002*
> 
> Yeah, sure. So there's no way for me to know that I'd be getting the Samsung chips over the Micron and this is pure luck of draw?


Luck of the draw, unless you keep returning cards until you come across Samsung chips.

If it's any consolation, my AMP Extreme is Micron but I can hit 21xx on the core and around 2302 (I'll edit in a GPU-Z screenshot in a bit) on the memory.
EDIT:


Spoiler: Warning: Spoiler!



With +100 on core and +500 on memory


----------



## asdkj1740

Quote:


> Originally Posted by *kevindd992002*
> 
> Yeah, sure. So there's no way for me to know that I'd be getting the Samsung chips over the Micron and this is pure luck of draw?


Don't know.

But it is interesting that, unlike the others, the AMP Extreme 1070 has a higher vram factory clock, at 2050 MHz.

I guess it depends on the brand; MSI and EVGA have started using Micron on their cards now. You should ask the Zotac users.
Good luck.


----------



## vloeibaarglas

Quote:


> Originally Posted by *gtbtk*
> 
> That may well be true. My MSG gaming behaves differrently with an asus OC bios installed.


In a good or a bad way? With the Micron BIOS from my Strix OC, right? Are you saying different BIOSes give different max memory OCs?

Also, can you write up some quick steps for using nvflash?


----------



## TheGlow

I don't think Micron is the end of the world.
Unless I'm doing something wrong, my MSI can handle +800 so far on the memory.
Around +850 is when it started glitching up.


----------



## SLOWION

Quote:


> Originally Posted by *lifeisshort117*
> 
> get the amp extreme. it is indeed a monster.
> 
> (mine)
> 
> 
> Spoiler: Warning: Spoiler!










very nice


----------



## vloeibaarglas

Quote:


> Originally Posted by *TheGlow*
> 
> I don't think Micron is the end of the world.
> Unless I'm doing something wrong, my MSI can handle +800 so far on memory.
> Playing around 850 is when it started glitching up.


Can you dump your bios?


----------



## TheGlow

Quote:


> Originally Posted by *vloeibaarglas*
> 
> Can you dump your bios?


I tried to upload it with GPU-Z and it said it was already a duplicate of 184628.rom, which I couldn't find on the site.
Is there a preferred alternate way to provide this? It's only 251 KB.


----------



## Prozillah

I'm on a G1 with Micron. I get 2100 core and 9 GHz memory, and no coil whine. Nothing wrong with Micron; stop whinging.


----------



## PyroTechNiK

Purchased the Asus Strix OC from ncix Canada on August 10th.



Samsung memory.


----------



## kevininsimi

Considering the ASUS DUAL-GTX1070-O8G for a white-themed build. Only three reviews on Newegg; wondering if there are better options besides this specific edition?

Not sure if I'm willing to look past the lack of a backplate.


----------



## Hunched

Can we have another topic for people to report and complain about the type of memory their card has? Nobody cares, and there are Micron cards that overclock better than Samsung ones.
If SK Hynix memory suddenly starts appearing on 1070s too, the world might end.


----------



## Dude970

I got a new case so I can view my 1070







My cat likes it too

It's still running great. I have been playing Doom maxed out, and running benches. I love this card


----------



## zipper17

Run the 3DMark Fire Strike (Standard/Extreme/Ultra) stress test or the Time Spy stress test and check your result.
Your system must complete all loops (20 loops, 10 minutes) with a Frame Rate Stability of at least 97%.

I use both the 3DMark stress tests and Valley to check stability after overclocking.
In Valley I was stable, but the Fire Strike Extreme stress test crashed.
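As a rough illustration of that pass criterion (this assumes Frame Rate Stability is the worst loop's average FPS as a percentage of the best loop's; the exact definition is UL's, and the per-loop numbers below are invented):

```python
def frame_rate_stability(loop_fps):
    """Stability in percent: worst loop's average FPS over the best loop's."""
    return 100.0 * min(loop_fps) / max(loop_fps)

# Hypothetical per-loop average FPS from a 5-loop stress-test run
loops = [120.4, 119.8, 118.9, 119.5, 117.6]

stability = frame_rate_stability(loops)
print(f"{stability:.1f}% -> {'PASS' if stability >= 97.0 else 'FAIL'}")
```

A card that throttles or glitches in later loops drags the worst loop down, which is why a throttling card can pass a single benchmark run but fail the 20-loop stress test.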


----------



## gtbtk

Quote:


> Originally Posted by *vloeibaarglas*
> 
> In a good or bad way? The Micron bios from my Strix OC right? You are saying different BIOS you different max memory OC?
> 
> Also can you write some quick steps to using nvflash?


I will preface this by saying that I have spent more time tweaking the MSI so familiarity probably gives it some advantages/knowledge on the quirks etc that I have not had time to find with the ASUS firmware.

The fastest reliable clocks I have managed with the MSI firmware are 2101 MHz on the GPU. The ASUS firmware, when installed, clocked easily to 2136-2150 MHz and seemed pretty stable, but the benchmark scores left me scratching my head. I discovered that there must be additional processing running in parallel on these GPUs that Afterburner does not report. Even with the higher clocks on the Asus BIOS, I was getting lower scores in Fire Strike (14600-14700 vs 14800-15000; I only have an i7-2600, so my physics scores are terrible at about 10,000. The best graphics score I have seen is about 20500, but I am not sure what I did to get that, because at the time I didn't understand what I am going to describe next). I saw similar differences in Rise of the Tomb Raider and other benchmarks as well.

After reverting to the MSI BIOS, I noticed that the scores would sometimes vary wildly after different changes, and I realized that something hidden is happening at the low end of the curve, around the 850-900 mV region. It turns out that these cards, in addition to the Core Clock, also have a "Video Clock" that Afterburner, Precision X, etc. do not report, and it has a bearing on the absolute performance of your GPU. The utility HWiNFO64 will report the value, though. The key to the best overclocking performance is to maximize both the Core Clock and the Video Clock while keeping everything stable. That means you want the curve in Afterburner to be as flat as you can make it, pushed as high as you can while keeping everything stable. It also means that the best performance from a given card may well come at a core clock of 2000 MHz with a high video clock, not 2150 MHz with a small video clock. I suspect the ASUS OC BIOS is tuned to give big core clocks at the expense of the "hidden" video clock to score marketing points.

This tutorial is at your own risk and assumes that you have at least a basic knowledge of computer hardware, BIOSes, and PCIe devices.

It also assumes that you won't do something stupid like trying to flash a BIOS for a GTX 680 onto your shiny new GTX 1070. There are ways to recover bricked cards, but that is out of scope for this brief tutorial. I would advise having an alternative graphics option available (your old card or an iGPU is fine), so that if you do mess up, you have a chance to recover the card yourself.

Different BIOS versions will make your card behave differently. The power draw may change, and the power limit slider is likely not to be the same: MSI is 126%, Asus is 120%, the EVGA FTW limit is only 114%, etc. Zotac and EVGA BIOSes on the MSI card can make the card draw more than 100% of its power target, for example. The different BIOSes may also do unexpected things with fan control. The voltage controller and fan controller ICs all seem to be the same and specified by Nvidia, but the BIOSes have been tuned to manage different board designs, so keep a careful eye on everything and revert to stock if the card behaves poorly.

For Micron RAM cards I have only tried BIOS files from the same "family" of BIOS, with version numbers 86.04.26.00.XX. Please let me know if you have a Micron card and have tried flashing a Samsung-memory 86.04.1E.00.XX BIOS, and let us know how it worked out. I am guessing it should probably be OK, but I am not completely sure. I may try it one day, but I have not had time to allocate to recovery if it doesn't work.

To flash a new BIOS to your video card you will need a couple of pieces of software that can all be downloaded from the internet. Google is your friend:

GPU-Z 1.10
NVFlash 5.292 (the build that skips certificate checks)
A new BIOS file to experiment with.

I am assuming that you only have a single card installed. SLI installs need extra commands for NVFlash to address the correct card; not having a second card, I am not sure how that works.

1. *Important* - Use GPU-Z to make a backup of your original BIOS file. It gives you something to fall back on, so keep it safe somewhere. If you are going to tweak BIOSes when software becomes available, only work on copies of your master file so you don't risk corrupting it.

2. Place a copy of nvflash with the associated *.sys files in a separate directory such as c:\nvflash.

3. Obtain the new BIOS ROM file you want to try out and place a copy of it in the nvflash directory you just created. Rename the file something simple but meaningful with a .rom extension so that you don't need to type much.

4. Open Device Manager and disable your video card.

5. Open an administrative command prompt and cd to the c:\nvflash directory.

6. From the command line in c:\nvflash, type the command "nvflash -6 newbiosfile.rom" without the quotes (substitute your file name for the example newbiosfile.rom). The -6 switch tells it to ignore that the BIOS is for a different card.

7. Press "y" twice to answer the two questions the nvflash utility asks, and the card will be flashed with the new ROM. When it finishes, it will tell you whether it succeeded or failed. If it fails, I would suggest grabbing a copy of your backup ROM file and reflashing the original before you reboot, just in case the failed attempt corrupted anything (assuming your card only has a single BIOS).

8. Reboot your PC. You may need to re-enable the graphics card in Device Manager after the reboot.

Remember that what you are doing is experimental and there are no guarantees that a different BIOS will be better on your card. Have a play with the utilities, keeping a close eye on all the reported metrics, and make conservative changes to GPU settings at first to find out where the limits are.

If the card's behaviour gets extreme, you may wish to flash back to the original. For example, EVGA BIOSes on my MSI card will turn a fixed 100% fan speed setting into 600 RPM. Setting a custom fan curve, however, seems stable and keeps working as expected. You may experience something different.
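Steps 5-8 above condense to just a few commands. This is a sketch only: `newbiosfile.rom` is a placeholder for whatever ROM you are trying, and the `--save` line is nvflash's own backup option, in case you want a second copy alongside the GPU-Z dump.

```shell
:: Run from an administrative command prompt, with the card already
:: disabled in Device Manager and nvflash extracted to c:\nvflash.
cd /d c:\nvflash

:: Optional belt-and-braces: save the currently installed BIOS as well
nvflash --save backup.rom

:: Flash the new ROM; -6 overrides the board/subsystem ID mismatch check
:: (answer "y" to both prompts)
nvflash -6 newbiosfile.rom

:: If the flash reports a failure, restore the backup BEFORE rebooting:
:: nvflash -6 backup.rom
```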


----------



## gtbtk

Quote:


> Originally Posted by *batman900*
> 
> New Asus non-OC Strix here. Ship date from BH was 8/23. Micron memory. 60.1% ASIC which is the lowest I've ever had.
> 
> No issues with the card, not with latency or anything else. It is probably the most quiet card I've ever owned. Seriously, with my Noctua fans set on 60%, my comp makes zero noise with my head 3 feet from it. I decided on ASUS from the lack of reports of coil whine as I can't stand it, really wanted EVGA for the support but too many reports of whine, and my research was indeed correct with no sign of it in my case.
> 
> The memory does a solid 400 and the card will run at almost 2200 stock voltage and bios, I say almost because this thing's boost is all over the place and once temps get into the upper 60s in my small case with 2 fans it will throttle down to 2154 I think it was. Stock boost hits close to 1900.
> 
> The overclocking was really just for curiosity. It will be running at stock in my little setup that doesn't have the best breathing ability.
> 
> I also picked up an asus 24' Rog 180hz gsync monitor which I'm not completely sold on yet and will probably return. For the life of me I can't get a comfortable setup for the colors and gamma. Side by side with my benq 2411z there's no doubt it's a worse panel. Gsync is really nice but not on a screen worse than what I had at the price point these monitors currently demand. The matt coating it has is also very aggressive and gives a screen door affect in games, it's like I'm looking through some weird filter. Ah well. Cheers!


How did you get an ASIC score? my version of GPU-Z 1.10 says it is not supported on this card


----------



## TheGlow

Quote:


> Originally Posted by *gtbtk*
> 
> I will preface this by saying that I have spent more time tweaking the MSI so familiarity probably gives it some advantages/knowledge on the quirks etc that I have not had time to find with the ASUS firmware.
> 
> The fastest reliable clocks I have managed with the MSI firmware are 2101 MHz on the GPU. The ASUS firmware, when installed, clocked easily to 2136-2150 MHz and seemed pretty stable, but the benchmark scores left me scratching my head. I discovered that there must be additional processing running in parallel on these GPUs that Afterburner does not report. Even with the higher clocks on the Asus BIOS, I was getting lower scores in Fire Strike (14600-14700 vs 14800-15000; I only have an i7-2600, so my physics scores are terrible at about 10,000. The best graphics score I have seen is about 20500, but I am not sure what I did to get that, because at the time I didn't understand what I am going to describe next). I saw similar differences in Rise of the Tomb Raider and other benchmarks as well.
> 
> After reverting to the MSI BIOS, I noticed that the scores would sometimes vary wildly after different changes, and I realized that something hidden is happening at the low end of the curve, around the 850-900 mV region. It turns out that these cards, in addition to the Core Clock, also have a "Video Clock" that Afterburner, Precision X, etc. do not report, and it has a bearing on the absolute performance of your GPU. The utility HWiNFO64 will report the value, though. The key to the best overclocking performance is to maximize both the Core Clock and the Video Clock while keeping everything stable. That means you want the curve in Afterburner to be as flat as you can make it, pushed as high as you can while keeping everything stable. It also means that the best performance from a given card may well come at a core clock of 2000 MHz with a high video clock, not 2150 MHz with a small video clock. I suspect the ASUS OC BIOS is tuned to give big core clocks at the expense of the "hidden" video clock to score marketing points.


I was wondering about those fluctuations.
I'm only using 3DMark free, so I have to watch the demos and can't run it repeatedly back to back.
The highest I've gotten on Firestrike so far is 15717, and that's with +200 core, +775 memory, and no extra voltage or power limit.
I've gotten memory up to +825 but it scores less. If I raise the voltage it also scores less.
It's very odd. So far I've just settled on +180 core, which gets about 2124/2136, and memory at +700.
I heard I missed 3DMark being on sale for $5, and I'm not sure how often that happens, so I'm waiting in the meantime.


----------



## gtbtk

Quote:


> Originally Posted by *TheGlow*
> 
> I was wondering about those fluctuations.
> I'm only using 3DMark free, so I have to watch the demos and can't run it repeatedly back to back.
> The highest I've gotten on Firestrike so far is 15717, and that's with +200 core, +775 memory, and no extra voltage or power limit.
> I've gotten memory up to +825 but it scores less. If I raise the voltage it also scores less.
> It's very odd. So far I've just settled on +180 core, which gets about 2124/2136, and memory at +700.
> I heard I missed 3DMark being on sale for $5, and I'm not sure how often that happens, so I'm waiting in the meantime.


Those are really good settings. I would be pleased if I could find a way to get that on my card.

With the 3DMark scores, you are actually better off looking at the graphics score as opposed to the total if you are comparing different machines. The CPU you use will vary the physics score by quite a bit and has a big impact on the total score.


----------



## ricko99

Just got a new 1070 FTW. It's got Micron VRAM. Haven't done any OC just yet. Was expecting Samsung; guess luck wasn't with me.


----------



## batman900

Quote:


> Originally Posted by *gtbtk*
> 
> How did you get an ASIC score? my version of GPU-Z 1.10 says it is not supported on this card


I'm using 0.8.7, just right-clicking at the top and selecting read ASIC from the drop-down menu.


----------



## TheGlow

Quote:


> Originally Posted by *gtbtk*
> 
> Those are really good settings. I would be pleased if I could find a way to get that on my card.
> 
> With the 3DMark scores, you are actually better off looking at the graphics score as opposed to the total if you are comparing different machines. The CPU you use will vary the physics score by quite a bit and has a big impact on the total score.


I'll keep that in mind, as I noticed the physics section would bomb out, but that's to be expected. On an i5 [email protected], so it's not too bad.
I took screenshots of the results, so I'll start focusing on the graphics readings for now.
Is it normal for the 3DMark website to show a history of my results when I was NOT logged in, and now that I am, it does not show anything anymore?
I understand if it wasn't tagged prior to my creating an account, but nothing I do going forward is showing up either.
Also, it doesn't seem like the card idles as much anymore. I recall it dipping to 500MHz and 0% fan; now the lowest it goes is 1582MHz and 25% fan.
And does 3DMark differentiate 1440p vs 1080p? I'm on 1440p and not sure if I should change the resolution to 1080p to reflect that.


----------



## Omzig

Time to join the club.









So I just replaced my MSI 970 with a Zotac 1070 AMP! GPU-Z reports the ASIC at 60.2% (that's the lowest of any card I've owned in 20+ years of PC gaming). Curve OC in MSI AB tops out at 2114MHz in most games at 1.085v (dropping back to 2000-ish in a Heaven/Valley loop). Samsung memory (I just slapped +500MHz on it and left it at that, no artifacts). Temps are around 70-ish at 100% load with fans at 85%.

Overall it's a pretty solid upgrade from the 970. I'm retesting a lot of older games at 4K+ DSR, and now I can set very high textures in stuff like SOM & ROTTR without them gagging at 3.5GB.

Just a few slight niggles so far.

First: can't seem to lock any P-states! As soon as I issue nvidiaInspector.exe -forcepstate:0,8 (or any other state or clock speed settings) I get a BSOD related to nvlddmkm.sys.

Not sure if this is a Pascal thing or a recent driver change (or if NVI is incompatible with Pascal?), but it worked fine a few days back when I was running 358.80 on the 970 (now on 372.54 with the 1070 on a clean install of Win7 x64), where I had two .bat files that I switched between to lock my 2D/3D clocks. Now I have to watch Firefox get boosted up to 1016MHz just scrolling a page of forum text.

Second issue: when I wake from sleep, I get an eject/insert noise, then the display comes back at 640x480 but at a refresh rate my monitor doesn't support, so the screen is a split-line mess. A Windows restart fixes the issue, as does restarting the display driver via CRU. I can switch to hybrid sleep for now, but I'll report the issue to Zotac and see what they say.

The 970 would sometimes intermittently black-screen on wake from sleep (say one in 50 wakes), but the 7970 I had in before that never once suffered any wake issues. I'm connected via DL-DVI @ 2560x1440 60/75Hz and I've tried 3 known-good cables, all with the same results.

Might try a clean install of Windows 10 x64 later to see if these issues clear up there; I'll report back my findings.









Have a good One.


----------



## kevindd992002

Quote:


> Originally Posted by *Omzig*
> 
> Time to join the club.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So I just replaced my MSI 970 with a Zotac 1070 AMP! GPU-Z reports the ASIC at 60.2% (that's the lowest of any card I've owned in 20+ years of PC gaming). Curve OC in MSI AB tops out at 2114MHz in most games at 1.085v (dropping back to 2000-ish in a Heaven/Valley loop). Samsung memory (I just slapped +500MHz on it and left it at that, no artifacts). Temps are around 70-ish at 100% load with fans at 85%.
> 
> Overall it's a pretty solid upgrade from the 970. I'm retesting a lot of older games at 4K+ DSR, and now I can set very high textures in stuff like SOM & ROTTR without them gagging at 3.5GB.
> 
> Just a few slight niggles so far.
> 
> First: can't seem to lock any P-states! As soon as I issue nvidiaInspector.exe -forcepstate:0,8 (or any other state or clock speed settings) I get a BSOD related to nvlddmkm.sys.
> 
> Not sure if this is a Pascal thing or a recent driver change (or if NVI is incompatible with Pascal?), but it worked fine a few days back when I was running 358.80 on the 970 (now on 372.54 with the 1070 on a clean install of Win7 x64), where I had two .bat files that I switched between to lock my 2D/3D clocks. Now I have to watch Firefox get boosted up to 1016MHz just scrolling a page of forum text.
> 
> Second issue: when I wake from sleep, I get an eject/insert noise, then the display comes back at 640x480 but at a refresh rate my monitor doesn't support, so the screen is a split-line mess. A Windows restart fixes the issue, as does restarting the display driver via CRU. I can switch to hybrid sleep for now, but I'll report the issue to Zotac and see what they say.
> 
> The 970 would sometimes intermittently black-screen on wake from sleep (say one in 50 wakes), but the 7970 I had in before that never once suffered any wake issues. I'm connected via DL-DVI @ 2560x1440 60/75Hz and I've tried 3 known-good cables, all with the same results.
> 
> Might try a clean install of Windows 10 x64 later to see if these issues clear up there; I'll report back my findings.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Have a good One.


This is for the regular AMP and not the AMP Extreme, correct?


----------



## Omzig

Yep, just the regular AMP! 1070.


----------



## gtbtk

Quote:


> Originally Posted by *ricko99*
> 
> Just got a new 1070 FTW. It's got Micron VRAM. Haven't done any OC just yet. Was expecting Samsung; guess luck wasn't with me.


It is not the end of the world. You can still overclock the Micron RAM by +550 to 600MHz.


----------



## gtbtk

Quote:


> Originally Posted by *batman900*
> 
> I'm using 0.8.7, just right-clicking at the top and selecting read ASIC from the drop-down menu.


That version is using the Maxwell calculations and it is inaccurate.


----------



## gtbtk

Quote:


> Originally Posted by *TheGlow*
> 
> I'll keep that in mind, as I noticed the physics section would bomb out, but that's to be expected. On an i5 [email protected], so it's not too bad.
> I took screenshots of the results, so I'll start focusing on the graphics readings for now.
> Is it normal for the 3DMark website to show a history of my results when I was NOT logged in, and now that I am, it does not show anything anymore?
> I understand if it wasn't tagged prior to my creating an account, but nothing I do going forward is showing up either.
> Also, it doesn't seem like the card idles as much anymore. I recall it dipping to 500MHz and 0% fan; now the lowest it goes is 1582MHz and 25% fan.
> And does 3DMark differentiate 1440p vs 1080p? I'm on 1440p and not sure if I should change the resolution to 1080p to reflect that.


Not sure about 3DMark. I run the test, and if I click compare results it opens a web page where I am logged in already. My best total score of all time is 15002, but I am running an i7-2600. My best graphics scores are in the range of 20500.

Firestrike runs at 1080p, Extreme is the 1440p benchmark, and Ultra is 4K. You should not need to change your monitor's resolution.

Have you locked voltage and are you running a fan curve with the overclock?


----------



## TheGlow

Quote:


> Originally Posted by *gtbtk*
> 
> Not sure about 3DMark. I run the test, and if I click compare results it opens a web page where I am logged in already. My best total score of all time is 15002, but I am running an i7-2600. My best graphics scores are in the range of 20500.
> 
> Firestrike runs at 1080p, Extreme is the 1440p benchmark, and Ultra is 4K. You should not need to change your monitor's resolution.
> 
> Have you locked voltage and are you running a fan curve with the overclock?


OK, I wasn't sure. Yes, I hit open in webpage and it shows it there, but when I go to the history area it doesn't show any; meanwhile, before I signed in it did. Not major, just annoying. I've taken screenshots now and named them for the settings I used, since 3DMark isn't showing that.

The highest I've had on Firestrike thus far is 15717, and graphics was 20761, but the others are in the 20670-20730 range.
I'm not sure what you mean by voltage locking. These scores were with voltage at +0 and power limit at 100; sliding those to max hasn't changed much.
I'm still playing around with Heaven now since I don't have to wait for the demos, and with voltage/power at max I was able to take the core to +210; the clock was at 2190 or so until heat kicked in, then it stuck around 2177.

http://i.imgur.com/yw8m6O3.jpg
I dabbled with a fan curve, but it's probably garbage. I average around 64º under load. Even pushing the fans to 100% I don't see it go down much more, so I'm just trying to find the break point so the fans aren't working for nothing.
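A custom fan curve like the one being tuned above is just piecewise-linear interpolation between (temperature, fan %) points. A minimal sketch, with made-up points rather than the poster's actual settings:

```python
# Interpolate a fan duty cycle from a sorted list of (temp C, fan %) points,
# the same shape Afterburner's fan curve editor uses. Below the first point
# the fan holds the minimum; above the last it holds the maximum.

def fan_speed(temp_c, points):
    """Linearly interpolate fan % for temp_c from sorted (temp, pct) points."""
    if temp_c <= points[0][0]:
        return points[0][1]
    if temp_c >= points[-1][0]:
        return points[-1][1]
    for (t0, p0), (t1, p1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)

curve = [(30, 0), (50, 30), (65, 60), (75, 100)]  # illustrative points
print(fan_speed(64, curve))  # 58.0
```

Finding the "break point" mentioned above amounts to moving the middle points until extra fan speed stops lowering the load temperature.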


----------



## batman900

Quote:


> Originally Posted by *gtbtk*
> 
> That version is using the Maxwell calculations and it is inaccurate.


Oh, good to know! I just read Omzig's post about how his new Zotac is 60.2, just like my Asus, which for both of us seems to be the lowest we've ever seen, lol. I'm betting these older versions are just posting 60.2 since they are incompatible. Now I'm curious what my actual score is.


----------



## Omzig

Aha, so 60.2% is a bad/broken reading. Thanks for the heads up.







I'd read a few posts about people having 100% ASIC scores on 1080s, so I thought getting 100% would be a faulty reading. I'll keep an eye out for an updated GPU-Z.

Right, off to test my issues under Windows 10.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *batman900*
> 
> Oh, good to know! I just read Omzig's post about how his new Zotac is 60.2, just like my Asus, which for both of us seems to be the lowest we've ever seen, lol. I'm betting these older versions are just posting 60.2 since they are incompatible. Now I'm curious what my actual score is.












When, how, why is there not yet a solution?

I wanna know!









Edit 1: But I do have Samsung on my Strix.


----------



## eternal7trance

Anyone know what would cause my 1070 to randomly sit at 2D clocks even while gaming? Restarting fixes it, but that's the only thing that works, and after a while it will randomly drop back to those clocks no matter what game I pick. I have one monitor hooked up via HDMI at 60Hz and one via DVI at 96Hz, if that helps diagnose anything.


----------



## jrcbandit

Quote:


> Originally Posted by *Hunched*
> 
> I've had 2 1070's with Samsung RAM and both have issues before +500.
> My MSI Gaming with Samsung has black squares appear after 10 minutes of Rise of the Tomb Raider gameplay at +320MHz.
> I didn't test my original Gigabyte 1070 enough because I sent it back since its fans were faulty, but it would probably be near the same.
> Neither could achieve anything higher than 2025MHz core without eventual freezes or crashes.
> My i5 4690K can't do more than 4.5GHz, and it needs more voltage at 4.5GHz than almost anyone else I've seen to be stable.
> 
> I've lost the lottery on everything but my 970. G502 died, EVGA PSU died, cases arrive falling apart, a Zowie EC2-A arrived with a faulty right click, an Asus mobo died on Christmas day. I've never owned a case I could install my HDD in normally because they all vibrate like mad. So much more.
> I've had problems with basically every single thing regarding my PC except my Noctua products.
> 
> It feels bad when everyone acts like it's the end of the world because their 1070 performs almost as badly as yours; at least all of you still seem to be getting 2100MHz or close to it...
> I wish my 1070's were as "bad" as all of yours. I guess I'm just super unlucky.


Hah, I sympathize with you because I always lose the silicon lottery too. My 4770K only overclocks stably to 4.4GHz (I can get it to 4.6GHz, but I want perfect stability), and none of the video cards I have purchased have overclocked well, either AMD or Nvidia. I also had my first EVGA power supply fail even though they were rated one of the most reliable, but I RMA'd it and haven't had issues since (I think it has a 10-year warranty?).

It seems normal for most 1070s to overclock only to around 2000-2075MHz on the core and +400-500 on the memory. I think the main problem was with a number of early MSI 1070s with Micron memory only getting +200 or so memory overclocks; more recent cards seem to be about the same as Samsung, with +400-500 overclocks. I can run artifact-free benchmarks at +600 on my Samsung memory, but that isn't any good for gaming, where I need to lower it to around 450-525ish for 0 artifacts in overclock-sensitive games.
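For a rough sanity check on memory offsets like these, the effective data rate can be estimated from the offset. Assumption: the tuning tool applies the offset to the DDR clock, so the effective rate rises by roughly twice the offset; tools report memory clocks differently, so treat this as illustrative arithmetic, not a spec.

```python
# Estimate effective GDDR5 throughput on a GTX 1070 from an
# Afterburner-style memory offset. Stock effective rate is 8008 MT/s;
# under the convention assumed above, effective rate = stock + 2 * offset.

STOCK_EFFECTIVE_MTS = 8008  # GTX 1070 stock effective memory data rate

def effective_rate(offset_mhz, stock=STOCK_EFFECTIVE_MTS):
    """Estimated effective MT/s for a given memory offset in MHz."""
    return stock + 2 * offset_mhz

for off in (400, 500, 600):
    print(f"+{off} MHz offset -> about {effective_rate(off)} MT/s effective")
```

By this arithmetic a +500 offset lands near 9000 MT/s effective, which matches the "9000MHz memory" figures quoted elsewhere in the thread.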


----------



## Omzig

So I just finished testing the niggles I posted a page back on a fresh install of Win10 x64 with 372.54.

Wake from sleep returns to a full-res desktop every time, unlike my Windows 7 x64 install, which wakes to 640x480 @ ??Hz and forces me to reboot. Not sure if this is a Win7 driver bug or something specific to my system; anyone else have wake-from-sleep issues under Windows 7? (I'll just use hibernate for now.)

As with Windows 7, attempting any kind of P-state locking via NV Inspector just results in a BSOD from nvlddmkm.sys (PAGE_FAULT_IN_NONPAGED_AREA). Not sure if this is a driver fault or an incompatibility between NVI and Pascal, but ATM I'm betting on a driver change, as I also tested 372.54 on my old 970 (in my sister's PC now) and I get the same BSOD there, while rolling back to 358.80 results in perfect P-state locking.

It will be pretty lame if NV has broken the ability to lock P-states. I hate seeing my 2D clocks bounce all over the shop when scrolling pages in Firefox; god knows why it needs to boost to 680MHz just to scroll a page of text. I might try to track down which driver broke this function when I have more time/access to the 970 to test a larger number of drivers.


----------



## TheGlow

I just had an odd experience. I've had my 1070 at +180 core, +700 memory for a few days with no problems.
I had Afterburner set to launch with Windows and to apply the profile.
I had to reboot, and for reasons unknown it was locking up with a checker pattern at launch.
Luckily I have a safe mode option in the boot menu and was able to find the cfg file and remove the StartUp reference.
I'm guessing there's a chance that at boot it hadn't gotten enough voltage yet and it died?
Voltage was set to +0 and power limit to 100, so nothing super fancy there.


----------



## gtbtk

Quote:


> Originally Posted by *TheGlow*
> 
> OK, I wasn't sure. Yes, I hit open in webpage and it shows it there, but when I go to the history area it doesn't show any; meanwhile, before I signed in it did. Not major, just annoying. I've taken screenshots now and named them for the settings I used, since 3DMark isn't showing that.
> 
> The highest I've had on Firestrike thus far is 15717, and graphics was 20761, but the others are in the 20670-20730 range.
> I'm not sure what you mean by voltage locking. These scores were with voltage at +0 and power limit at 100; sliding those to max hasn't changed much.
> I'm still playing around with Heaven now since I don't have to wait for the demos, and with voltage/power at max I was able to take the core to +210; the clock was at 2190 or so until heat kicked in, then it stuck around 2177.
> 
> http://i.imgur.com/yw8m6O3.jpg
> I dabbled with a fan curve, but it's probably garbage. I average around 64º under load. Even pushing the fans to 100% I don't see it go down much more, so I'm just trying to find the break point so the fans aren't working for nothing.


Those are good results. You have nothing to be worried about.

Heaven, though, is easier on your hardware than Firestrike or Time Spy. Tweaking to max out Heaven will result in crashes under heavier loads.


----------



## kevindd992002

How does the MSI Gaming X compare to the Zotac AMP! Extreme based on board and components quality? I will eventually be replacing the stock cooler of whichever 1070 I buy with an AIO so I'm contemplating between the two cards.

The MSI has a baseplate that serves as a heatsink for almost all the front PCB components. The Zotac will make me install aluminum heatsinks on the vRAM chips and the MOSFETs, but that would be no problem for me at all, as I've done this with my GTX 670's. I can get the Zotac for $100 less in our country, and it even has a 5-year warranty compared to the MSI's 3.

My question is would installing an AIO coupled with individual heatsink to the chips that need it in the Zotac provide me almost the same overclocking capability (and temps) compared to when going with the MSI wherein I only need to install the AIO and be done with it? The AIO that I'll be installing is my old Arctic Cooling Hybrid Cooler from my old card.


----------



## gtbtk

Quote:


> Originally Posted by *kevindd992002*
> 
> How does the MSI Gaming X compare to the Zotac AMP! Extreme based on board and components quality? I will eventually be replacing the stock cooler of whichever 1070 I buy with an AIO so I'm contemplating between the two cards.
> 
> The MSI has a baseplate that serves as a heatsink for almost all the front PCB components. The Zotac will make me need to install aluminum heatsinks to the vRAM chips and the MOSFETs but that would be no problem for me at all as I've done this with my GTX 670's. I can get the Zotac for $100 less in our country and even has 5 years warranty compared to the MSI which has only 3.
> 
> My question is would installing an AIO coupled with individual heatsink to the chips that need it in the Zotac provide me almost the same overclocking capability (and temps) compared to when going with the MSI wherein I only need to install the AIO and be done with it? The AIO that I'll be installing is my old Arctic Cooling Hybrid Cooler from my old card.


I can't give you a definitive answer to your question; however, the MSI Gaming X and the MSI Gaming 8G are the same hardware with differently clocked BIOSes. You may want to consider the base model MSI Gaming as well and save a bit of extra money if you go MSI.

The Zotac is clocked faster out of the box (1633 vs 1582MHz), and based on my experimentation flashing my MSI Gaming X with the Zotac BIOS, the Zotac firmware sets the VRMs to pump more power into the card. I cannot get the stock MSI card to reach 100% of the power target, but I can get 110% with the Zotac FW. Having said that, I can't get the Zotac FW to perform better than the stock MSI FW in benchmarks.

Especially as you save $100, and the Zotac seems to have more potential to generate heat, I would think the Zotac may give you more benefit if you put it under water.


----------



## syl1979

Please note that 100% is probably different for each vendor. You can see the power reading in watts in HWiNFO.


----------



## ericool69

Does anyone have the MSI version of this card? I have quite a problem: I've noticed that when the MSI Gaming App is open I get lots of frame drops, lag, and stuttering. The frame rate drops from 100+ to below 5. I've been searching for hours and still no solution; when the Gaming App is closed, everything is back to normal with no lag/stutter/frame drops. It's odd how the app meant for the product isn't optimized at all.

I have already tried uninstalling/reinstalling all the drivers and the Gaming App itself, but no luck.


----------



## connectwise

What version of GPU-Z works with this card? My ASIC score doesn't show on the Asus Strix; it says it's not available for this card.


----------



## gtbtk

Quote:


> Originally Posted by *connectwise*
> 
> What version of GPU-Z works with this card? My ASIC score doesn't show on the Asus Strix; it says it's not available for this card.


1.10 is the latest and has support for Pascal GPUs. There is no version of GPU-Z that will report ASIC quality on Pascal as yet; I believe it is coming soon.


----------



## connectwise

Roger thanks.


----------



## Seid Dark

Quote:


> Originally Posted by *ericool69*
> 
> Does anyone have the MSI version of this card? I have quite a problem: I've noticed that when the MSI Gaming App is open I get lots of frame drops, lag, and stuttering. The frame rate drops from 100+ to below 5. I've been searching for hours and still no solution; when the Gaming App is closed, everything is back to normal with no lag/stutter/frame drops. It's odd how the app meant for the product isn't optimized at all.
> 
> I have already tried uninstalling/reinstalling all the drivers and the Gaming App itself, but no luck.


I've seen many people reporting problems with the Gaming App on various forums; it's buggy as hell. MSI may fix it some day, but for now it's best to uninstall it and use Afterburner.


----------



## Omzig

Quote:


> Originally Posted by *Seid Dark*
> 
> I've seen many people reporting problems with the Gaming App on various forums; it's buggy as hell. MSI may fix it some day, but for now it's best to uninstall it and use Afterburner.


I can concur. I installed the Gaming App for my sister the other day after moving my old MSI 970 to her system; about 20 minutes later she was shouting at me, asking why her PC was laggy. Rebooted (very slow to load into Windows), closed and uninstalled the Gaming App, rebooted, and all was back to normal. I'd say avoid that POS.


----------



## syl1979

GALAX 1070 Gamer

100% power = 195W
(92% for 180W)

Max in Afterburner: 125% = 243.7W
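The percentages above are just ratios of the board's 100% power target, so a tiny converter reproduces the numbers (the helper names are mine, not from any tool):

```python
# Convert between a power-limit percentage and watts, given the board's
# 100% power target. 195 W is the GALAX 1070 Gamer target from the post
# above; other vendors ship different targets, so pass your own.

def pct_to_watts(pct, target_w=195.0):
    """Watts drawn at a given power-limit percentage."""
    return target_w * pct / 100.0

def watts_to_pct(watts, target_w=195.0):
    """Power-limit percentage corresponding to a wattage reading."""
    return 100.0 * watts / target_w

print(pct_to_watts(125))         # 243.75, matching the ~243.7 W reading
print(round(watts_to_pct(180)))  # 92, matching "(92% for 180W)"
```

This is also why the same "110%" means different watts on different cards, as noted above: the 100% target differs per vendor BIOS.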


----------



## Vici0us

Here are some benchmarks: GTX 1070 G1 Gaming, i7-4770K @ 4.3GHz.
(I accidentally made a double post, so I figured I'd post a few benchmarks.)
Fire Strike - 15841

Fire Strike Extreme - 8502

Fire Strike Ultra - 4617


----------



## Vici0us

Hey guys, I got my G1 1070 over a month ago. I'll post some benchmarks soon.




----------



## TheGlow

Quote:


> Originally Posted by *gtbtk*
> 
> Those are good results. You have nothing to be worried about.
> 
> Heaven, though, is easier on your hardware than Firestrike or Time Spy. Tweaking to max out Heaven will result in crashes under heavier loads.


Will do. I figure I'll play with Heaven since it's free, get an idea of what the card can handle, then back off a bit and do periodic Firestrike and Time Spy runs.
Although I did have that odd lockup at boot with +180/700, which was surprising. So it boots stock now, and I guess I'll just need to remember to apply the profile every time I boot/play.

Quote:


> Originally Posted by *ericool69*
> 
> Does anyone have the MSI version of this card? I have quite a problem: I've noticed that when the MSI Gaming App is open I get lots of frame drops, lag, and stuttering. The frame rate drops from 100+ to below 5. I've been searching for hours and still no solution; when the Gaming App is closed, everything is back to normal with no lag/stutter/frame drops. It's odd how the app meant for the product isn't optimized at all.
> 
> I have already tried uninstalling/reinstalling all the drivers and the Gaming App itself, but no luck.


I dunno, the Gaming App worked fine for me for the first few days until I switched to Afterburner.
I remember it not turning the fans on until the card got hotter, so I would always sit around 70º. I also can't remember if it downclocked a lot, like 500MHz.
I've tried it a few times since switching to Afterburner and I'm just not fond of it: the OSD in game, the fans not coming on until later, and it only boosts to around 1950MHz or so.
Afterburner gets me to 2130-2170MHz, and my fan settings let me sit around 64-65º.


----------



## amd7674

I should be getting an Asus Strix OC this week (if everything goes well). What driver do you guys recommend installing to test the new GPU?

I'm currently using Asus GTX 670 in my machine, so I would like to run some benchies before I swap the cards.

Thanks in advance


----------



## ricko99

Tested my rig with the Firestrike benchmark and I got 13871 as the score.

With a 1070 FTW (no overclock), a Xeon E3-1231 v3, 16GB of 1600MHz DDR3 RAM, and Firestrike running from a 7200rpm HDD. Currently using NVIDIA driver 368.81. Is this result normal for a non-OC'd 1070 running stable at a 1962MHz core clock?

http://www.3dmark.com/fs/9970041


----------



## Dimensive

UPS just delivered this beauty:



EVGA GTX 1070 SC (Black Edition)


----------



## bigjdubb

Quote:


> Originally Posted by *ricko99*
> 
> Tested my rig with Firestrike benchmark and I got 13871 as the score
> 
> With 1070 FTW (no overclock), Xeon E3 1231 v3 and 16GB 1600Mhz DDR3 RAM and Firestrike running from 7200rpm HDD. Currently using NVIDIA driver 368.81. Is this result normal for a non-oced 1070 running stable at 1962MHz Core clock?
> 
> http://www.3dmark.com/fs/9970041


18,000 seems about right for the graphics score. The total is meaningless for comparison since your CPU plays a huge part in it. 21,000 or so (give or take a couple hundred points) seems to be a good average for overclocked results: graphics score, not total score.


----------



## GreedyMuffin

A shop in Norway mis-priced their Gigabyte Mini-ITX 1070 to like 350 under retail, so I had to buy one. I'll be folding on it, but I've gotta play with it first.

EDIT: Let's hope they don't cancel my order.


----------



## SuperZan

Quote:


> Originally Posted by *GreedyMuffin*
> 
> A shop in Norway mis-priced their Gigabyte Mini-itx 1070 to like 350 under retail. So I had to buy one. Will be folding on it, but gotta play with it.
> 
> EDIT: Let's hope they don't cancel my order.


Their mistake, they ought to pay for it! 

In all seriousness, best of luck. Maybe they were just feeling generous!


----------



## GreedyMuffin

200-250¤

But yeah! They better do! I'm gonna take it further until they give me the card. Haha


----------



## LiquidHaus

Quote:


> Originally Posted by *kevindd992002*
> 
> How does the MSI Gaming X compare to the Zotac AMP! Extreme based on board and components quality? I will eventually be replacing the stock cooler of whichever 1070 I buy with an AIO so I'm contemplating between the two cards.
> 
> The MSI has a baseplate that serves as a heatsink for almost all the front PCB components. The Zotac will make me need to install aluminum heatsinks to the vRAM chips and the MOSFETs but that would be no problem for me at all as I've done this with my GTX 670's. I can get the Zotac for $100 less in our country and even has 5 years warranty compared to the MSI which has only 3.
> 
> My question is would installing an AIO coupled with individual heatsink to the chips that need it in the Zotac provide me almost the same overclocking capability (and temps) compared to when going with the MSI wherein I only need to install the AIO and be done with it? The AIO that I'll be installing is my old Arctic Cooling Hybrid Cooler from my old card.


Alphacool is coming out with a waterblock for the Amp Extreme.
Quote:


> Originally Posted by *gtbtk*
> 
> I cant give you a definitive answer to your question however, the MSI Gaming X and the MSI Gaming 8G are the same hardware with different clocked bioses. You may want to consider the base model MSI Gaming as well and save a but of extra money if you go MSI.
> 
> The Zotac is clocked faster out of the box (1633 vs 1582 Mhz) and the Zotac, based on my experimentation flashing my MSI Gaming X with the Zotac bios sets the VRMs to pump more power into the card. I cannot get the stock MSI Card to reach 100% or the power target but I can get 110% with Zotac FW. Having said that, I cant get the Zotac FW to perform better than the stock MSI FW in benchmarks.
> 
> Especially as you save $100 and you seem to have more potential to generate heat, I would think that the Zotac may give you more benefits if you put it under water


Do you mean FE and not FW? Cause yeah, single 8-pin cards are normal for pulling those power limit numbers. Just clarifying is all.


----------



## gtbtk

Quote:


> Originally Posted by *lifeisshort117*
> 
> Alphacool is coming out with a waterblock for the Amp Extreme.
> Are you meaning FE? and not FW? Cause yeah, single 8 pin cards are normal for pulling those power limit numbers. Just clarifying is all.


No, FW as in firmware.


----------



## Hnykill

Quote:


> Originally Posted by *Hunched*
> 
> That Palit is supposed to be the quietest 1070 due to a larger cooler than almost anything else and good fans.
> It's not available in NA, so I got the next quietest thing, the MSI Gaming 8G.
> 
> This is according to every review I could find that tested decibel levels; EVGA wasn't in any of them at the time, so I don't know about them.
> Just avoid everything that uses smaller high-RPM fans, like Gigabyte 1070's.


I have a Palit GTX 1070 Super JetStream and it is dead silent. The plastic shroud around the heatsink doesn't block the heatsink's airflow at all, and it also has RGB lights. At 100% load the fans spin up to 52%, and it is really just a gentle "hummm" breeze sound. It is perfectly clear that the people who made the cooler for this card know what they are doing. It also looks pretty stellar!







...it's the best GTX 1070 brand I know of, and it overclocks like hell. Mine is at 2088MHz core and 9000MHz memory. It makes about as much sound as a 140mm case fan at about 1200 RPM.

I am all about cooling solutions, and this card is, I can say, the best of them all.


----------



## ricko99

Quote:


> Originally Posted by *bigjdubb*
> 
> 18,000 seems about right for the graphics score. The total is meaningless for comparison since your CPU plays a huge factor in the total score. 21,000 (give or take a couple hundred points) or so seems to be a good average for overclocked results, graphics score not total score.


I see. So I just have to look at the graphics score to compare with others, and that's it, right?


----------



## Blackfyre

I've been using the same settings for over a month now, but I decided to go with a more silent treatment on my *custom fan curve*. Now the maximum temperature reached after about 20 minutes of letting Heaven run, and then actually running the benchmark, is about 68 degrees Celsius. With my *MSI GTX 1070 Gaming X* & my custom curve, it's barely audible even with the speakers off.

With my previous fan curve it used to stabilize at *2088MHz core*; now it stabilizes at *2050MHz*. That little-to-no performance drop is worth a much quieter system. I also dropped the *CPU overclock* & *fan speed* on my *4790K* from *4.7GHz @ 1.280v* to *4.6GHz @ 1.200v* (_that shaved another 10 or so degrees Celsius off the CPU with little to no performance impact in gaming_).

http://i.imgur.com/21sAaGV.png
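A custom fan curve like the one Blackfyre describes is just piecewise-linear interpolation over (temperature, fan %) breakpoints. A minimal sketch in Python; the breakpoints below are illustrative examples, not Blackfyre's actual Afterburner settings:

```python
# Sketch of a piecewise-linear fan curve like the ones set in MSI Afterburner.
# The breakpoints are made-up examples, not anyone's real settings.
CURVE = [(30, 0), (50, 25), (65, 40), (75, 60), (85, 100)]  # (temp C, fan %)

def fan_percent(temp_c, curve=CURVE):
    """Linearly interpolate the fan duty cycle for a given GPU temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: minimum fan speed
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the curve: maximum fan speed
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(68))  # the load temperature reported above -> 46.0
```

With these example breakpoints, the 68C load temperature lands between the (65, 40) and (75, 60) points, giving a quiet 46% duty cycle.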


----------



## asdkj1740

Want to share some findings about EVGA cards.
The EVGA cooling plate for the VRAM and MOSFETs is poor; if you want a higher OC on the VRAM, you need to improve their cooling.
I got a Micron-VRAM EVGA card and the max stable VRAM OC was 2150 with the stock cooler.
Yesterday I removed the front cooling plate and added thermal pads to the back of the PCB so the backplate actually serves a cooling purpose, and I also added some heatsinks to the VRAM on the front of the PCB. Now the max stable VRAM OC is 2250.

I had this experience before with my Elpida-VRAM 970 cards: same issue, same solution, worked like a charm.
EVGA's whole cooling setup is poor (both the ACX heatsink and the cooling plate), and with those two parts your OC isn't going anywhere.

Bear in mind that if the front cooling plate is removed, the backplate can't be screwed in, meaning you have to find your own way to keep the backplate in place.
Another stupid EVGA design decision.


----------



## asdkj1740

Quote:


> Originally Posted by *Hnykill*
> 
> I have a Palit GTX 1070 Super Jetstream and it is dead silent. The plastic shroud around the heatsink doesn't block the heatsink's airflow at all, and it has RGB lights. At 100% load the fans spin up to 52%, and it's really just a gentle "hummm" of a breeze. It's perfectly clear that whoever designed this cooler knows what they're doing. It looks pretty stellar too!
> 
> 
> 
> 
> 
> 
> 
> ..it's the best GTX 1070 brand I know of, and it overclocks like hell. Mine is at 2088MHz core and 9000MHz memory. It makes about as much sound as a 140mm case fan at around 1200 RPM.
> 
> I'm all about cooling solutions, and I'd say this card has the best of them all.


This card's cooler is able to beat a typical 120mm AIO, which is insane. It's the king, and without cheating.
Even with my 120mm AIO installed, I still get one step of thermal throttling, from 2088 down to 2076.


----------



## nellyp67

I've had the Zotac AMP Extreme for almost 2 weeks now. Is this an average OC?


----------



## Sueramb6753

-snip-


----------



## GreedyMuffin

My GTX 1070 order was cancelled..

Should I pursue it further or let it be?


----------



## owikhan

any zotac 1070 amp extreme edition expert here??


----------



## asdkj1740

Quote:


> Originally Posted by *GreedyMuffin*
> 
> My GTX 1070 order was cancelled..
> 
> Should I take further or let it be?


Wait for the sales around the dates when BF1 / Titanfall 2 / Watch Dogs become available.

Currently only the EVGA FTW has a price cut, from 460 to 440, but it probably has Micron VRAM.


----------



## Blackfyre

*Battlefield 1 Beta* is running buttery smooth at 1080p @ ULTRA Preset.


----------



## b0z0

I'm debating between the Evga FTW or the MSI Gaming X. Which one should I go for?


----------



## owikhan

Go for MSI GAMING X:thumb:


----------



## Lennart76

Someone here who can beat my GTX 1070 FTW Graphics Score?









http://www.3dmark.com/fs/9971245


----------



## iluvkfc

So I have finally settled on my gaming stable OC for my Gigabyte 1070 Windforce OC cards. Core clock: 2012 MHz, Memory clock: 9.2 GHz. I'm honestly a bit disappointed, pretty low for a watercooling OC, but these cards are just way too power and voltage limited, even though I'm running ASUS Strix OC BIOS on both for 120% power limit. Also each card is stable at way higher than that on its own. Let's hope we see a BIOS editor one of these days, preferably before Volta. I'm just hoping the eventual launch of the 1080 Ti motivates people to get their hands dirty.

https://www.techpowerup.com/gpuz/details/zd6ap
https://www.techpowerup.com/gpuz/details/vm9vp


----------



## tigertank79

Hi, I have two EVGA 1070 SC cards in SLI and I'm having this problem with the AB (4.3.0 beta 4) and Precision X OC OSDs.
Has anyone had this problem... and found a solution? Thanks!

See the amount of VRAM used...



http://imgur.com/fLZGHlT


P.S. With a single card everything is normal.


----------



## iluvkfc

Quote:


> Originally Posted by *tigertank79*
> 
> Hi, I have 2x evga 1070SC SLI and I have this problem with AB(430 beta4) and PrecisionX OC OSD.
> Someone have this problem....and a solution? Thanks!
> 
> See amount of vram used...
> 
> 
> 
> http://imgur.com/fLZGHlT
> 
> 
> P.S. With single cards all is regular.


Restart AB/Precision after enabling SLI, should do it.


----------



## tigertank79

Quote:


> Originally Posted by *iluvkfc*
> 
> Restart AB/Precision after enabling SLI, should do it.


Thanks, but I already tried that and nothing changes. The problem returns.


----------



## Shin

Hi guys,
I want to replace my GTX 670 SLI with a GTX 1070. But there are so many custom cards and I can't see the difference between them. :S
The Zotac AMP! Extreme Edition is very interesting, but there aren't many reviews of that card. How do the MSI and Asus cards compare to the Zotac?


----------



## TheGlow

Quote:


> Originally Posted by *Blackfyre*
> 
> *Battlefield 1 Beta* is running buttery smooth at 1080p @ ULTRA Preset.


I'm wondering about the performance at 144Hz.


----------



## Mr-Dark

Quote:


> Originally Posted by *TheGlow*
> 
> I'm wondering about the performance at 144Hz.


Around 90-130 FPS at 2050MHz core / 4400MHz memory.


----------



## TheGlow

Quote:


> Originally Posted by *Mr-Dark*
> 
> around 90-130fps.. at 2050mhz 4400mhz memory


I downloaded this morning before I left for work.
I'm stable at 2160/4700 so I'll take a pic. I'll sacrifice a few settings if needed to get closer to 120-144fps.


----------



## Lennart76

Quote:


> Originally Posted by *TheGlow*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mr-Dark*
> 
> around 90-130fps.. at 2050mhz 4400mhz memory
> 
> 
> 
> I downloaded this morning before I left for work.
> I'm stable at 2160/4700 so I'll take a pic. I'll sack a few settings if needed to get closer to 120-144fps.

Firestrike pls

Sent from my iPhone using Tapatalk


----------






## bigjdubb

I just played 2 rounds of BF1, 1440p Ultra was around 90 fps with no overclock. I will play it in 4k tonight and see how it does.

Quote:


> Originally Posted by *ricko99*
> 
> I see. It means I just have to look at the graphics score to compare with others and that's it right?


Yup. The CPU plays a factor in the graphics score too, but it's very minor compared to its effect on the overall score.


----------



## slayer305

Had my 1070 FE since late June. Got a bit unlucky with the silicon lottery, core will only do +120 24/7 stable, +140 will fail 3dmark. Clocks fluctuate between 1911-1987 Mhz boost. RAM will do +500 easily though, possibly more, I just left it there.


----------



## Mr-Dark

Quote:


> Originally Posted by *Lennart76*
> 
> Firestrike pls
> 
> Sent from my iPhone using Tapatalk


Here it is:









http://www.3dmark.com/3dm/14448553?


----------



## amd7674

Is it a bad idea to buy the MSI Gaming Z? I love the design and how quiet the card supposedly runs... However, there's a 99.9% chance it will come with Micron memory.


----------



## kleitos44

The Zotac cards have a really odd power limit, and the AMP! (non-Extreme) tends to run hot because of it. That said, I've flashed my card with the Zotac GTX 1070 AMP! Extreme BIOS and threw a Kraken 280mm and a G10 bracket on it. Solid 2151MHz on the core with that setup. They have a PCB that will negate any hope of a full waterblock, but if that doesn't matter, get the AMP! Extreme.

Haven't had any issues.


----------



## DanielB123

Hello,
I received my Zotac GTX 1070 AMP Extreme today, but it is acting strange in my system. At first I tried overclocking it a little and it's having none of it: I tried +200MHz on the memory, and after a couple of minutes in GTA V it reboots my whole system. So I put the memory back to stock and tried adding 20MHz to the core; unlike the memory, increasing the core instead makes GTA V freeze and the Nvidia drivers stop responding (it's not just GTA V, Witcher 3 and even CS:GO do it as well).

So I thought maybe I got really unlucky in the lottery and put everything back to stock, completely. Fired up GTA V, and after a couple of minutes it still randomly reboots my whole system. Anyone know why this could be?

I've got an Intel i5 4690K OCed to 4.6GHz (also tried putting it back to stock, but it still crashes/reboots) and for the PSU a Super Flower Golden Green HX 550W 80+ Gold.


----------



## LiquidHaus

Quote:


> Originally Posted by *owikhan*
> 
> any zotac 1070 amp extreme edition expert here??


Yes. What would you like to know?
Quote:


> Originally Posted by *kleitos44*
> 
> The Zotac cards have a really odd power limit, and the Amp! (non-extreme) tends to run hot because of it. That being said, I've flashed my card to a Zotac GTX 1070 Amp! Extreme and threw a Kraken 280mm and a g10 bracket on it. Solid 2151 MHZ on core with it - they have PCB that will negate any hope of a full waterblock but if that doesn't matter, get the Amp! Extreme.
> 
> Haven't had any issues.


How so? The AMP and AMP Extreme both run dual 8-pin; they should be sitting at around 70-80% power limit at max overclock.

And how will the PCB negate any hope of a full waterblock?

Alphacool has told me personally they have one coming out, and Watercool is also planning a block for the Zotac cards.

Quote:


> Originally Posted by *DanielB123*
> 
> Hello,
> I've received my Zotac GTX 1070 AMP Extreme today, however it is acting up strange in my system. At first I've tried overclocking it a little and it is having none of it, I've tried 200mhz on memory and after a couple of minutes in GTA V it reboots my whole system. So I've put memory back to stock and I've tried adding 20mhz to the core, unlike the memory increasing the core rather forces GTA V to freeze and Nvidia drivers to stop responding (it's not just GTA V, Witcher 3 and even cs:go do it aswell.).
> 
> So I thought maybe I got really unlucky with the lottery and I've put it all back to stock, completely. Fired up GTA V and after a couple of minutes it still randomly reboots my whole system. Anyone know why this could be?
> 
> I've got an Intel i5 4690k OCed at 4.6ghz (also tried putting it to stock but it still crashes / reboots) and for PSU I've got Superflower Golden Green HX 550W 80+ Gold


550W sounds kind of low in my opinion, especially depending on how old that Super Flower is.


----------



## DanielB123

It's almost 2 years old, and it didn't have any problems with my previous GTX 970, which as far as I'm aware has very similar power usage to the 1070?


----------



## LiquidHaus

Quote:


> Originally Posted by *DanielB123*
> 
> Hello,
> I've received my Zotac GTX 1070 AMP Extreme today, however it is acting up strange in my system. At first I've tried overclocking it a little and it is having none of it, I've tried 200mhz on memory and after a couple of minutes in GTA V it reboots my whole system. So I've put memory back to stock and I've tried adding 20mhz to the core, unlike the memory increasing the core rather forces GTA V to freeze and Nvidia drivers to stop responding (it's not just GTA V, Witcher 3 and even cs:go do it aswell.).
> 
> So I thought maybe I got really unlucky with the lottery and I've put it all back to stock, completely. Fired up GTA V and after a couple of minutes it still randomly reboots my whole system. Anyone know why this could be?
> 
> I've got an Intel i5 4690k OCed at 4.6ghz (also tried putting it to stock but it still crashes / reboots) and for PSU I've got Superflower Golden Green HX 550W 80+ Gold


Quote:


> Originally Posted by *DanielB123*
> 
> It's almost 2 years old, and it didn't have any problems with my previous GTX 970 which if I'm aware has very similiar power usage compared to the 1070?


What overclocking utility are you using? The one that was on the Zotac disc?


----------



## DanielB123

I ignored the disc and downloaded Zotac FireStorm from their website, as the version on the disc is outdated. I've also tried using Afterburner, with the same results.


----------



## LiquidHaus

Quote:


> Originally Posted by *DanielB123*
> 
> I've ignored the disc and downloaded the Zotac firestorm from their website as the one on the disc is outdated. I've also tried using Afterburner with same results.


Wow. That's... strange. I'd uninstall the video drivers, clean them from your registry, re-download the newest drivers, and reinstall.


----------



## LiquidHaus

I lol at myself for saying 550w is too low.

My system full load is 470w with my monitor and receiver on as well. What was I thinking?


----------



## rfarmer

Quote:


> Originally Posted by *lifeisshort117*
> 
> I lol at myself for saying 550w is too low.
> 
> My system full load is 470w with my monitor and receiver on as well. What was I thinking?


Yeah, I was going to say: I'm running an OCed 6600K and a GTX 1070 on a Corsair SF450 with no problems.


----------



## DanielB123

I uninstalled the drivers, then used DDU, installed the newest driver through GeForce Experience, and it's still rebooting my system. Guess I'll send it back and hope it was the card and not my system.


----------



## Blackfyre

Quote:


> Originally Posted by *DanielB123*
> 
> I've uninstalled the drivers and then used the DDU program, installed the newest driver through geforce experience and it's still rebooting my system. Guess I'll send it back and hope it was the card and not my system.


If you weren't having any of these issues before you got the card, then it's 99% the card that's the problem. Send it back under warranty; don't even mention that you overclocked it, there's no need, because you barely pushed it. Just say it straight: I installed the latest drivers, tried to play my games (_a, b, c, and d_), and the video card is causing the driver to crash and, in some cases, the system to restart. I tried different driver versions, asked around on forums, and everyone told me I have a faulty card that needs to go back under warranty.

Good luck, hope the replacement isn't a pain in the ass too.


----------



## sammkv

What I've noticed so far with the Zotac AMP Edition is that a lot of heat gets trapped inside the backplate and fan shroud, so you really need strong direct airflow onto the card.


----------



## kevindd992002

I thought the Zotac AMP! Extreme is one of the coolest running cards?


----------



## sammkv

Quote:


> Originally Posted by *kevindd992002*
> 
> I thought the Zotac AMP! Extreme is one of the coolest running cards?


Yeah, the Extreme's cooling is awesome, but I have the lower-tier AMP Edition.


----------



## shamoke

Well, I am now the proud owner of a 1070 FTW. I could not be more excited.

Hopefully my overclock is good!


----------



## Prozillah

Quote:


> Originally Posted by *DanielB123*
> 
> Hello,
> I've received my Zotac GTX 1070 AMP Extreme today, however it is acting up strange in my system. At first I've tried overclocking it a little and it is having none of it, I've tried 200mhz on memory and after a couple of minutes in GTA V it reboots my whole system. So I've put memory back to stock and I've tried adding 20mhz to the core, unlike the memory increasing the core rather forces GTA V to freeze and Nvidia drivers to stop responding (it's not just GTA V, Witcher 3 and even cs:go do it aswell.).
> 
> So I thought maybe I got really unlucky with the lottery and I've put it all back to stock, completely. Fired up GTA V and after a couple of minutes it still randomly reboots my whole system. Anyone know why this could be?
> 
> I've got an Intel i5 4690k OCed at 4.6ghz (also tried putting it to stock but it still crashes / reboots) and for PSU I've got Superflower Golden Green HX 550W 80+ Gold


Sounds a lot like a power draw issue. I would try a beefier unit if you're able to, or try the card in another system?


----------



## ZakZakXxX

http://www.3dmark.com/fs/9882413

Score GTX 1070 amp


----------



## AngryLobster

Just got my hands on an AMP Extreme and I am super impressed. I mainly purchased it for the cooling potential and it does not disappoint at all.

I play @ 4K and this thing never breaks 65C while hovering between 1000-1200 RPM and is basically inaudible. I'm sure it can play a lot of games @ 1080p without even having to spin the fans.

The only funky thing is the default fan profile: it turns the fans on and off while you're playing, since the minimum fan speed is 1000RPM, and even at that low RPM it sometimes manages to bring temps below whatever the fan shut-off threshold is during low-load areas of games.

I'm so pleased with this thing. Had to resort to a Raijintek Morpheus on my 980Ti to keep temperatures in check at 4K (no problem at 1080p) and my Sapphire Fury required 1600-1800RPM to stay around 75C at 4K as well which was way too loud.

EDIT: Mine came with Micron and although it has a 100mhz memory OC out the box, I managed an additional 300mhz but saw only a 1-1.5 FPS gain at 4K so I put it back to stock. As for the core, the card boosted out the box to 1999mhz and settled at around 1963mhz. Adding +75 keeps it steady at about 2050ish so I'm happy with that even though the performance improvement is minuscule.


----------



## kevindd992002

Quote:


> Originally Posted by *sammkv*
> 
> Yeah the Extreme cooling is awesome but I have the lower model AMP Edition.


Oops, my bad. I didn't notice you only had the regular AMP. Thanks!


----------



## zipper17

My 3570K is somewhat bottlenecking the 1070 in Hitman's Marrakesh level (when running DX12). I noticed a lot of stuttering lag in the crowded market areas, but when I switched to DX11 the lag was gone.

Does DX12 make CPU performance worse than DX11?
Does anyone here have a 3570K and Hitman (2016)?


----------



## Balrogos

Aye comrades







Yesterday I joined the Nvidia and GTX 1070 club. My last GPU from Nvidia was a GeForce 4 MX x). Now I have a Palit GTX 1070 Super JetStream. Why Palit? I watched some tests and the card is among the quietest, with a high factory OC, a good cooler, and a low price; I bought mine for 470 euros ($524). I want to ask you a few questions.

1. Is there no fan and OC control in the Nvidia panel, like in the AMD panel?
2. My card has an 1835MHz boost clock, which is the lowest guaranteed boost (as I've heard), but in gaming my GPU hits 2035MHz.
3. How do I overclock? Do you overclock the base clock and the boost clock separately, or are there more options? (With AMD I just touched the power limit, the voltage, and of course the clock, and the boost was fixed.)
4. Which program is good for burn-in/stability testing on Nvidia? (I used FurMark on AMD.)

Also, greetings from Poland!


----------



## kevindd992002

Quote:


> Originally Posted by *zipper17*
> 
> My 3570k kinda bottlenecking 1070 in Hitman Marakesh Level (When Running on DX12). I noticed a lot of stuttering lags in crowds market areas.
> But when I switched to Dx11, the lag is gone.
> 
> Does DX12 make CPU performances running slower than on DX11??
> does anyone here have 3570K & Hitman 2016 game?


What?! I'm using just a 2600K overclocked to 4.6GHz. I was under the impression that this CPU wouldn't bottleneck a 1080, let alone a 1070.


----------



## Jimbags

Quote:


> Originally Posted by *kevindd992002*
> 
> What?!! I'm using just a 2600K that's overcloked to 4.6GHz.. I was under the impression that the CPU won't bottleneck a 1080 let alone a 1070.


My 3570K only did so in games that can make use of multithreading, like GTA V. Your overclocked 2600K will outdo a 3570K in multithreaded games. I just upgraded my 3570K to a 3770K, and I have a GTX 1070, so I might do some testing. GTA V had the i5's CPU usage hanging around the 90% mark, whereas my HTPC with a GTX 750 Ti and an i7 2600 kept CPU usage around 50-60%.


----------



## kevindd992002

Quote:


> Originally Posted by *Jimbags*
> 
> My 3570k only did in games which can make use of multithreading. Like gtav etc your 2600k overclocked will out do a 3570k in multithreaded games. I just upgraded my [email protected] to a 3770k. I have a gtx 1070. Might do some testing. GTAV got the i5 cpu usage hanging around the 90% mark were as my htpc woth gtx750ti and i7 2600, kept cpu usage around 50-60%


Whew, that is a relief then. I guess I'll test everything out when I get my 1070.


----------



## zipper17

Quote:


> Originally Posted by *kevindd992002*
> 
> What?!! I'm using just a 2600K that's overcloked to 4.6GHz.. I was under the impression that the CPU won't bottleneck a 1080 let alone a 1070.


Other games run just fine though. Witcher 3 and GTA V are pretty much buttery smooth at 1080p, nearly maxed out, at 60Hz with adaptive vsync on, even though they sometimes dip to around 50FPS (GTA 5 grass on Very High detail).

I might make a video of it. It's really laggy in Hitman DX12 on an older i5 system.

It seems Hitman in DX12 mode loads the CPU a lot more heavily than DX11 (especially in the dense Marrakesh areas).
The lag doesn't occur when running DX11 (it's still playable, but still dips as low as 38-40FPS).

Game is installed on SSD.

I'm also curious: can anyone playing Hitman on an older i5 system confirm this? Do you notice massive lag in the Marrakesh level, especially with the DX12 API? (Settings in game maxed out at 1080p, no supersampling, SMAA on, DX12.)


----------



## Fuzzy05

Not too happy with the memory OC, but I got a pretty nice core.

MSI GTX 1070 Gaming X


----------



## gtbtk

Quote:


> Originally Posted by *amd7674*
> 
> is it a bad idea to buy MSI Gaming Z? I love the design and how quiet the card supposedly runs.... However 99.9% it will come with micron.


I don't know of anyone on the forums who claims ownership of a Gaming Z, and availability has been very limited to date, so I don't think the Z has sold in huge numbers.

All the review samples have been Samsung.


----------



## gtbtk

Quote:


> Originally Posted by *Fuzzy05*
> 
> 
> 
> 
> 
> Not to happy with the memory OC but got a pretty nice core.
> 
> MSI GTX 1070 Gaming X


Open the Nvidia Control Panel and set "Maximum performance" in the 3D settings section.

Open the curve editor in Afterburner (Ctrl+F). Select a point up at around 1.093v, press "L", and a vertical line will appear in the graph. Press the tick box in AB to apply. You can then try cranking up the memory.

You want the card sitting at at least 800mV or you will end up with checkerboard artifacts at the higher memory clocks. Without the voltage lock, the card will artifact like crazy at about +500 and above. With the voltage locked, my MSI Gaming X (identical to yours) will run at up to about +600, but it introduces artifacts from about +550MHz.
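While testing a voltage lock and memory offsets like this, it helps to log what the card is actually doing. On a system with Nvidia drivers, `nvidia-smi --query-gpu=clocks.gr,clocks.mem,temperature.gpu,power.draw --format=csv,noheader,nounits -l 1` prints one reading per second; a minimal Python sketch for parsing such a line (the sample values below are made up, not a real capture):

```python
import csv
import io

# One line of `nvidia-smi --query-gpu=clocks.gr,clocks.mem,temperature.gpu,power.draw
# --format=csv,noheader,nounits` output. Hypothetical values, not a real capture.
SAMPLE = "1987, 4303, 68, 148.21"

def parse_sample(line):
    """Split a nounits CSV reading into labeled numeric fields."""
    core, mem, temp, power = next(csv.reader(io.StringIO(line)))
    return {
        "core_mhz": int(core),    # graphics clock
        "mem_mhz": int(mem),      # memory clock
        "temp_c": int(temp),      # GPU temperature
        "power_w": float(power),  # board power draw
    }

print(parse_sample(SAMPLE))
```

Logging a dict like this once per second while Heaven runs makes it easy to spot the clock dropping under a thermal or power limit mid-test, rather than guessing from the OSD.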


----------



## zipper17

Quote:


> Originally Posted by *Jimbags*
> 
> My 3570k only did in games which can make use of multithreading. Like gtav etc your 2600k overclocked will out do a 3570k in multithreaded games. I just upgraded my [email protected] to a 3770k. I have a gtx 1070. Might do some testing. GTAV got the i5 cpu usage hanging around the 90% mark were as my htpc woth gtx750ti and i7 2600, kept cpu usage around 50-60%


I just did some testing in Hitman's Marrakesh Dense Bazaar area, with every option set as low as possible at 1280x720.

Framerates still can't hold a solid 60+ FPS; they still dip down to ~50FPS.

Just confirmed: yes, Hitman's Marrakesh level is very CPU-bound. Holy crap, GG (3570K @ 4.2GHz).

Maybe this is what happens when '3DMark combined scores' come to real-world games.


----------



## Jimbags

Quote:


> Originally Posted by *zipper17*
> 
> I Just did some testing on Hitman Marrakesh Dense Bazaar Area, all option set to LOW as possible, 1280x720P,
> 
> Framerates still can't do a solid +60FPS, still dips down to 45 -50FPS
> 
> Just confirmed Yes, Hitman Marrakesh Level is very CPU BOUND game, Holy crap. GG. (3570K @4,2ghz.)
> 
> Maybe this is what happen if '3dmark combined scores' comes to real world games.


OC that bad boy. Have you delidded it? Mine ran 4.7GHz daily at 1.275V.


----------



## fedeK

Hello all. I know this is the GTX 1070 club, but since a corresponding club for 1060 owners doesn't exist yet, I'll ask here: is there any BIOS with an unlocked power limit and vcore voltage that I can flash to my 1060 (Inno3D Compact X1)?
Thanks for the help.


----------



## zipper17

Quote:


> Originally Posted by *Jimbags*
> 
> OC that bad boy. Have you delidded it? mine ran 4.7 daily on 1.275V


I'm at 4.2GHz without a voltage increase; maybe I'll try 4.5GHz. Not delidded, and I don't usually OC things.
How much voltage does a 3570K need at 4.5GHz in general? 1.2V?


----------



## Fuzzy05

Quote:


> Originally Posted by *gtbtk*
> 
> Open the Nvidia Control panel and set it to "maximum performance" in the 3D settings section
> 
> Open the curve in afterburner (Ctrl-F). Select a point up at around 1.093v press "L" and a vertical line will appear in the graph. Press the tick box in AB to apply. You can then try cranking up the memory.
> 
> You want the card sitting up at at least 800MV or you will end up with checkerboard artifacts with the higher memory clocks. Without the voltage lock, the card will artifact like crazy at about +500 and above. With the Voltage locked, my MSI Gaming X that is identical to yours will run at up to about +600 but introduces artifacts from about +550Mhz


Alright, I did what you told me to do and now my GPU is running at 2126MHz core and 4152MHz memory constantly. Is that bad?


----------



## amd7674

Quote:


> Originally Posted by *Fuzzy05*
> 
> 
> 
> 
> 
> Not to happy with the memory OC but got a pretty nice core.
> 
> MSI GTX 1070 Gaming X


The core OC looks great. Is this on water? 49C max? And there's no fan RPM showing.


----------



## Fuzzy05

Quote:


> Originally Posted by *amd7674*
> 
> Core OC looks great. Is this on water? 49C max? and there is no fan RPM.


No no, max 67C. I just ran Valley for 2 seconds to get the max core and memory to show up in GPU-Z so I could take a screenshot.


----------



## gtbtk

Quote:


> Originally Posted by *Fuzzy05*
> 
> Alrighty i did what you told me to do and now my GPU is running at 2126mhz core and 4152mhz on the memory constantly is that bad?


The GPU clock speed is pretty good. The memory OC is in the average range. What happens if you push the memory OC higher?


----------



## amd7674

So I finally picked up an Asus OC 1070 last night. It came with Samsung RAM.  Also, I haven't noticed any coil whine.









Very nice, solid build. It clocks to just over 2000MHz on the core under load (out of the box). I didn't get much testing done last night because it was late. I will play with a custom fan curve and clocks later this week.

One thing I've noticed is that the card likes to run higher 2D clocks on the desktop than it should. At the moment the card is running at 1600MHz on the core.
This causes the card to sit at 51C with the fans at 0%. When I start a benchmark the card goes to its max; after I close the bench it drops to 200MHz-ish and then jumps back up to 1600MHz after a while.

I'm using the 368.95 driver on Win10 (I'm not sure if I have the Anniversary Update or not).

I didn't do a video driver cleanup or anything; I just removed the older GTX 670, rebooted the system a few times, and everything worked fine.

Any help would be much appreciated.


----------



## gtbtk

Quote:


> Originally Posted by *fedeK*
> 
> Hello all,I know this is gtx 1070's club but since the corrisponding one for 1060 owners does not exist actually, I ask here : there is any bios with unlocked power limit and vcore voltage that can I flash in my 1060 ( Inno3D Compact X1 ) ?
> Thanks for the help


There are no Pascal BIOS editing tools at this time, so I don't think so.


----------



## Balrogos

I had an i5 3570K at 4.9GHz and 1.4v







(I hit 5.0GHz, but at 1.5v, and my old mobo couldn't give more; 5.0GHz was unstable in benchmarks anyway.) It ran for 3 years, until the IHS got stuck to my cooler. I had just placed the IHS on the core without gluing it, and when I tried to remove the cooler for cleaning it scratched the CPU. :| Anyway, the delid was worth it, temps went down by 20C. Now I have an i7 4790K.



The solder mark (everything would have been fine if I had glued the IHS back onto the CPU; I had also lapped the cooler and the IHS with wet sandpaper, so the IHS stuck to the cooler on its own, without any paste x) ).


And guys, it would be cool if you could answer my earlier post :| http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/2800#post_25480629


----------



## Fuzzy05

Quote:


> Originally Posted by *gtbtk*
> 
> gpu clock speed is pretty good. Memory OC is in the average range. What happens if you push the memory oc higher?


Running +175 core and +500 memory now! Pretty sure I can push the core even further. Stress tested for 30 minutes, seems stable!


----------



## frashed

Can anyone help me out?

My MSI Gaming X is overclocking extremely poorly. +40MHz (1961MHz) seems stable, but at anything around +50 and above, Unigine locks up and throws an error.







This is without touching mem.

Running a 750W Silverstone Strider Gold. Is there an issue, or am I just really unlucky?


----------



## amd7674

Quote:


> Originally Posted by *Fuzzy05*
> 
> Running +175 core and +500 memory now! Pretty sure i can push the core even further now. Stresstested for 30 minutes seems stable!


Great 

what are your temps and fan speeds?


----------



## Fuzzy05

Quote:


> Originally Posted by *amd7674*
> 
> Great
> 
> what are your temps and fan speeds?


Maximum 71C and the fan speed is around 45%.


----------



## Jimbags

Quote:


> Originally Posted by *zipper17*
> 
> im oc those 4.2ghz without voltage increase. maybe i would try to 4.5ghz. No delidded. I'm not very usual to OC things.
> how many voltages does it needs 3570K [email protected] 4.5ghz In general? 1.2V?


It all depends on the CPU. If you haven't touched the voltage, it's probably on auto. Try 1.25v at 4.5GHz and go from there; if it's unstable, up it to 1.3v. In my opinion 1.4v and above isn't good for daily use, so stay below that.


----------



## amd7674

Quote:


> Originally Posted by *Fuzzy05*
> 
> Maximum 71C and the fan speed is around 45%.


That's great. I'm sure the MSI beats the Asus Strix when it comes to noise levels. My fans were running around 60% at 70C.

I'm having a little issue with 3D clocks (1600MHz) staying active on the desktop after exiting a benchmark utility.


----------



## TheGlow

Quote:


> Originally Posted by *amd7674*
> 
> Maximum 71c and the fanspeed is arround 45%
> 
> That's great. I'm sure MSI beats Asus Strix when it comes to noise levels. My fans where running around 60% at 70C.
> 
> I'm having a little issue with 3D clocks running on the desktop after exiting benchmark utility... 1600Mhz.


Since I started playing with Afterburner I've never seen the clock go under 1535 or so. Once or twice AB said it was around 500, but GPU-Z would still say 1500.
20895 graphics score, my highest so far.
http://www.3dmark.com/3dm/14527020
Locking the voltage let me get away with +820 on the memory, but I tried +210 on the core and Firestrike died halfway through. It survived the demo...


----------



## amd7674

Quote:


> Originally Posted by *TheGlow*
> 
> Since I started playing with afterburner I've never seen the clock go under 1535 or so. Once or twice AB said it was around 500, but gpuz would still say 1500.
> 20895 for graphics, so far my highest.
> http://www.3dmark.com/3dm/14527020
> Locking voltage let me get away with +820 on memory, but tried +210 on core and firestrike died half way. It survived the demo...


Thank you for your feedback.

Are you saying having 1600MHz on the desktop is normal? I'd better set up a custom fan curve; I do not like the GPU sitting at 51°C.


----------



## Mad Pistol

Quote:


> Originally Posted by *Balrogos*
> 
> Aye comrades
> 
> 
> 
> 
> 
> 
> 
> Yesterday I joined Nvidia and the GTX 1070 club. My last GPU from Nvidia was a GeForce 4 MX. Now I have a Palit GTX 1070 Super JetStream. Why Palit? I watched some tests: the card is very quiet, has a high base OC and a good cooling unit, and is also low-priced. I bought mine for 470 euro ($524). I want to ask you a few questions.
> 
> 1. Is there no fan and OC control in the Nvidia panel like in the AMD panel?
> 2. My card has an 1835MHz boost clock, which is the lowest guaranteed boost (as I heard), but in gaming my GPU hits 2035MHz.
> 3. How do I overclock? Is it just a matter of the base clock and boost clock, or are there more options? (On AMD I just touched the power limit, volts, and the clock, of course, and the boost was fixed.)
> 4. Which program is good for burn-in/stability testing on Nvidia? (I used FurMark on AMD.)
> 
> Also greetings from Poland


1. You will have to download something like MSI Afterburner to get overclocking control. Nvidia doesn't support it at the driver level anymore.
2. Almost all cards will boost well above their "boost clocks" because they can (including the Founders Edition). It's just a part of GPU boost.
3. For overclocking, download MSI Afterburner. You will see several things. (screenshot below for reference)




Core Voltage: as of right now, you may be able to flash a custom BIOS to unlock it fully. Otherwise, don't touch it (it makes no difference in performance).
Power Limit and Temp Limit: max these out for overclocking.
Core Clock: whatever this is set to is the MHz offset over the original clocks; it mostly affects the boost clock. If the driver crashes after changing this, it is probably unstable. Reduce the frequency.
Memory Clock: same as Core Clock, but for memory. If you see graphical artifacts or the computer hardlocks after increasing this, it is probably unstable. Reduce the frequency.

4. Unigine Heaven 4.0 is a great stability test. Run this for about 30 minutes to an hour, and if nothing bad happens, move on to playing games. The ONLY real way to tell if a clock frequency is stable is to game on it and see if it crashes.

Hope this helps.
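The raise-and-test procedure above can be sketched as a small loop. This is only an illustration: `apply_offset` and `is_stable` are hypothetical callbacks standing in for "move the slider in Afterburner" and "run Heaven for 30-60 minutes and then game without a crash"; nothing here talks to a real GPU.

```python
# Sketch of the "raise, test, back off" loop described above.
# is_stable() is a placeholder for your real stability check
# (a long Heaven run plus actual gaming at the candidate offset).

def find_max_offset(apply_offset, is_stable, start=0, step=25, limit=300):
    """Walk the offset up in small steps; on the first failure,
    back off one step and return the last stable value."""
    best = start
    offset = start + step
    while offset <= limit:
        apply_offset(offset)
        if not is_stable(offset):
            break           # driver crash / artifacts: previous value wins
        best = offset
        offset += step
    apply_offset(best)       # leave the card at the last known-good offset
    return best

# Toy demonstration: pretend the card artifacts above +150 MHz.
applied = []
result = find_max_offset(applied.append, lambda o: o <= 150)
print(result)  # 150
```

The same loop applies to the memory offset; just use a larger step, since memory typically tolerates bigger increments before artifacting.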


----------



## pez

Quote:


> Originally Posted by *amd7674*
> 
> Thank you for your feedback.
> 
> Are you saying having 1600Mhz on the desktop is normal? I better setup custom fan curve. I do not like GPU seating at 51c.


Unless you're using Rainmeter or you constantly have something with GPU acceleration open (some Chrome stuff, VLC, even LibreOffice, etc.), it shouldn't stay there. That is, unless a new driver changed something. My 1070 and Titan XP both come down to about 140/5XX at idle. My 1080s did this as well.


----------



## amd7674

Quote:


> Originally Posted by *pez*
> 
> Unless you're using Rainmeter or you constantly have something with GPU acceleration open (some Chrome stuff, VLC, even Libre Office, etc.) then it shouldn't. That is unless a new driver had changed something. My 1070, and Titan XP both come down to like 140/5XX at idle. My 1080s did this as well.


Thank you for the info, I will check it out tonight. I wonder if the Asus Aura LED software or something else running in the background is causing this.


----------



## Fuzzy05

How is my score on the Heaven benchmark?


----------



## Samurai707

Quote:


> Originally Posted by *Fuzzy05*
> 
> How is my score on heaven benchmark?


Probably better off checking it against the Heaven thread...


----------



## Fuzzy05

Quote:


> Originally Posted by *Samurai707*
> 
> Probably better off checking it against the Heaven thread...


True that, but I want to see what other 1070s are getting.


----------



## Samurai707

Quote:


> Originally Posted by *Fuzzy05*
> 
> True that, but i want to see what other 1070's are getting


They label it by video card, so I'd say it's definitely the easiest place to get that data. It's a little... cluttered... in here, to say the least, with people looking for OC help in general. At least that's how I see it!


----------



## Fuzzy05

Quote:


> Originally Posted by *Samurai707*
> 
> They label it by Video card, I'd say it's definitely the easiest place to get that data. It's a little... cluttered... to say the least with people looking for OC help in here in general. At least that's how I see it!


I did not know that, thanks for letting me know!

EDIT: Can't seem to find it.


----------



## owikhan

Zotac GTX 1070 AMP EXTREME EDITION
http://www.3dmark.com/3dm/14530577?


----------



## TheGlow

Quote:


> Originally Posted by *pez*
> 
> Unless you're using Rainmeter or you constantly have something with GPU acceleration open (some Chrome stuff, VLC, even Libre Office, etc.) then it shouldn't. That is unless a new driver had changed something. My 1070, and Titan XP both come down to like 140/5XX at idle. My 1080s did this as well.


I got mine 2 weeks ago and was using the August 15 drivers, and as I mentioned, once I started using Afterburner my clocks never went lower.
Even with Afterburner opening at Windows start but not loading profiles, it still won't downclock.
I could try setting AB to not launch at all, run GPU-Z, and see what it says.
I got the new drivers yesterday and there's no difference in behavior.

Quote:


> Originally Posted by *Fuzzy05*
> 
> True that, but i want to see what other 1070's are getting



2699 here, but that's before I was pushing the memory much, so I'm not sure how much more I'd get now.
Plus I've disabled the onboard Intel graphics since then.


----------



## owikhan

Guys, a question regarding GPU clock: the default is 1633 and I am able to set it at 1708.


----------



## pez

Quote:


> Originally Posted by *TheGlow*
> 
> I got mine 2 weeks ago and was using the August 15 drivers, and as I mentioned once I started using afterburner, my clocks never go lower.
> Even with Afterburner opening at windows start, but not loading profiles, still wont downclock.
> I could try and set AB to not launch at all and run gpu-z and see what it says.
> I got the new drivers yesterday and no difference in behavior.
> 
> 2699 here, but thats before I was pushing the memory much, so not sure how much more I'd get now.
> Plus I've disabled the onboard intel since then.


I meant to mention that I use AB to OC as well. This is with all of my 10-series/Pascal-based GPUs. I am still on 4.3b, however.


----------



## zipper17

Quote:


> Originally Posted by *Jimbags*
> 
> All depends on the cpu. If you havent touched voltage, its prob on auto. Try 1.25v with 4.5ghz and go from there if unstable up it to 1.3v in my opinion 1.4 and above isnt good to run daily. So stay below that


Just OC'd my CPU to 4.5GHz with a steady 1.200V vcore. Prime95 ran stable for 30+ minutes.

Ran some tests again: Hitman, Marrakesh level. In crowded areas it still drops as low as 52-55 FPS with the lowest possible settings at 720p. No use, lol. GTX 1070 @ ~2050MHz/4500MHz.

Those games are CPU-bound and love more cores / Hyper-Threading. I can't hold a solid 60+ FPS with my 3570K. Bottleneck. Or it might just be a poorly optimized game.


----------



## Balrogos

I think it's bad optimization. Did you check the CPU load when benchmarking in Marrakesh?

Check out this poor cooler on the Asus Strix GTX 1070:

I think Asus takes us for idiots.

And the Palit for reference:


----------



## Blackfyre

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Balrogos*
> 
> I think it's bad optimization. Did you check the CPU load when benchmarking in Marrakesh?
> 
> Check out this poor cooler on the Asus Strix GTX 1070:
> 
> I think Asus takes us for idiots.
> 
> And the Palit for reference:






Is there a picture like this for an MSI GTX 1070 Gaming X cooler? I want to see how they've applied their thermal paste too.


----------



## Nukemaster

Quote:


> Originally Posted by *Blackfyre*
> 
> 
> Is there a picture like this for an MSI GTX 1070 Gaming X cooler? I want to see how they've applied their thermal paste too.


Is this the one you are after?
https://www.techpowerup.com/reviews/MSI/GTX_1070_Gaming_X/4.html

The problem with HDC (heatpipe direct contact) coolers is that as cores get smaller, fewer pipes touch the die.

Asus has a dual-fan 1070 with just 2 pipes, so both should touch, but the cooler is not the greatest. It is pretty good under medium load, but seems to get fairly loud when the card maxes out.


----------



## amd7674

Quote:


> Originally Posted by *TheGlow*
> 
> I got mine 2 weeks ago and was using the August 15 drivers, and as I mentioned once I started using afterburner, my clocks never go lower.
> Even with Afterburner opening at windows start, but not loading profiles, still wont downclock.
> I could try and set AB to not launch at all and run gpu-z and see what it says.
> I got the new drivers yesterday and no difference in behavior.
> 
> 2699 here, but thats before I was pushing the memory much, so not sure how much more I'd get now.
> Plus I've disabled the onboard intel since then.


I can easily reproduce my "stuck" high clock issue. After launching the 3DMark menu, the card starts running at 1800MHz and stays there. I tried that with/without GPU-Z and AB running.
I also tried uninstalling and reinstalling AB without any luck. To fix it without rebooting I'm using the driver-restart utility that ships with CRU, which fixes the issue and puts the card back at 300MHz.

Next I will run DDU and install the latest drivers. Someone said it is good practice to cleanly uninstall drivers when swapping to a new GPU. I will post later with my findings.

UPDATE: ran DDU in safe mode and installed the latest drivers. When I launch the 3DMark menu, the clocks jump to 1800MHz and come back down to 300MHz... After I quit 3DMark the speeds remain at 300MHz.







I'm pretty sure DDU (latest version) in safe mode did it for me, unless the new driver fixed something.


----------



## By-Tor

Seriously contemplating switching from the Red to the Green team. My pair of 290x's work great, but I would like to move to a single card that has more power and I'm thinking about the 1070 which has a very nice price point/bang for the buck. AMD is really not releasing anything I must have so far.

I have been reading and looking around, but can't make up my mind on which to get. It will need to be one that EK makes a water block for since it will be added to my custom loop.

My case is a black and red ROG theme and I want to stay with those colors. I do like the ASUS ROG version!!

Would like to hear some thoughts on this...

Thank you


----------



## SuperZan

Quote:


> Originally Posted by *By-Tor*
> 
> Seriously contemplating switching from the Red to the Green team. My pair of 290x's work great, but I would like to move to a single card that has more power and I'm thinking about the 1070 which has a very nice price point/bang for the buck. AMD is really not releasing anything I must have so far.
> 
> I have been reading and looking around, but can't make up my mind on which to get. It will need to be one that EK makes a water block for since it will be added to my custom loop.
> 
> My case is a black and red ROG theme and want to stay with those colors. I do like the ASUS ROG version!!
> 
> Would like to hear some thoughts on this...
> 
> Thank you


I had been using CF 390's before I switched to the 1070, and I've been very pleased with the decision. You'll have plenty of options in terms of cards that fit a full-cover block, though it does seem that the only ASUS cards fitting the reference profile for the blocks are FE cards. I tend to favour EVGA myself, though I've only heard good things from happy FE owners. Personally, if planning to block the card I'd get something like an MSI Aero 1070, as it's around $400 USD and you won't need the cooler so the middling quality of the Aero's cooler wouldn't affect you.


----------



## TheGlow

Quote:


> Originally Posted by *Fuzzy05*
> 
> True that, but i want to see what other 1070's are getting


Quote:


> Originally Posted by *amd7674*
> 
> I can easily reproduce my "stuck" high clock issue. After lunching 3DMark menu, card starts running at 1800Mhz and it stays there. I tried that with/wihout running GPU-Z and AB.
> I also tried to uninstall and install AB without any luck. To fix it without rebooting I'm using CRU's utility driver restart program which fixes the issue and put the card in 300Mhz speed.
> 
> Next I will run DDU and install latest drivers. Someone said it is good practice to uninstall drivers when swapping to a new GPU. I will post later with my findings.
> 
> UPDATE: did DDU in safe mode and installed the latest drivers. When I lunch 3DMark menu the clocks jump to 1800Mhz and go back to 300Mhz... After I quit 3DMark the speeds remain at 300Mhz
> 
> 
> 
> 
> 
> 
> 
> I'm pretty sure DDU (latest version) in safe mode did it for me, unless the driver fixed something.


Which version of AB and which Nvidia drivers did you go with?
I'm on driver 372.70 and AB 4.3.0 beta 4.
Edit: I uninstalled all the MSI stuff and Afterburner, and ran DDU.
On the latest Nvidia driver, GPU-Z confirms the clock is idling at 215MHz, 0% fan.
Now I'm not sure which AB to put back in, in case that's the trigger.
AB 4.3.0 beta 4 is back in and looks fine. I'm even able to get the fan down to 0% in custom profiles; before, it wouldn't let me go under 25% or so, even with the fan curve at 0.
Edit 2: As fun as lower idle clocks are, my PC keeps crashing with checkerboard artifacts about a minute after applying the OC, whereas before it was fine. I think this is a voltage issue now: if I lock the voltage, the clocks shoot up again.
Edit 3: What a horror. I'm pretty much instantly locking up whenever I load one of my previous profiles. Core clock +180 is still fine, since that doesn't kick in until boost comes on, which handles voltage as well.
But the memory clock is sadly independent of that, from what I see. +400 seems fine, but I had been using +700 all week with no problem, and even ran successful 3DMark benches at +800, some at +825.
With the voltage locked at the minimum, which looks like 800mV, I can sit on the desktop at +700, but the clock now sits at 1771 as opposed to 1550 or so before, at 50°C.


----------



## AngryLobster

Quote:


> Originally Posted by *Balrogos*
> 
> I think it's bad optimization. Did you check the CPU load when benchmarking in Marrakesh?
> 
> Check out this poor cooler on the Asus Strix GTX 1070:
> 
> I think Asus takes us for idiots.
> 
> And the Palit for reference:


Wow, that Palit's heatpipes are huge. Its cooler looks a lot like the aftermarket heatsinks you can buy (Prolimatech MK-26 and Raijintek Morpheus). No wonder it performs so well.


----------



## syl1979

Have any of you tried undervolting? On my side, since I was hitting a wall at 2100MHz / 1.093V, I spent some time playing with the curve and the voltage lock. I can be stable at 2050MHz at only 0.993V (point set at 2075).


----------



## TheGlow

Man, I shouldn't have redone all my stuff. Dropping into 2D mode, I guess, is killing me.
Also, I thought locking the voltage would set a minimum that could still go up; nope.
So now I have to lock it at 1.093V or so just so this doesn't crash.


----------



## syl1979

There may be a bug in Afterburner. When you make too many changes to the curve, at some point it will lock to the 2D frequency if not locked to a single point. A reset in Afterburner puts it back to normal; I had to start a fresh curve from the reset position to get it to work.


----------



## Hunched

Quote:


> Originally Posted by *TheGlow*
> 
> Man, i shouldnt have redone all my stuff. Dropping into 2d mode I guess is killing me.
> Also locking voltage I thought would be a minimum setting but it could go up, nope.
> So now I have to lock it at 1.093 or so just so this doesn't crash.


Isn't GPU Boost power saving great?
I just love power-saving features that crash, cause instability, and can't be disabled.


----------



## TheGlow

Quote:


> Originally Posted by *Hunched*
> 
> Isn't GPU Boost power saving great?
> I just love power saving features that crash and cause instability that can't be disabled.


What makes it fun is that mine glitched and the power saving wasn't applying; it was always stuck in boost.
So I had to go out of my way to "fix" it.
Now I just have to figure out how to re-glitch it.
Nice. Now the voltage got stuck at 650mV, so Witcher 3 was running at 5 FPS at 415MHz.


----------



## syl1979

Are you applying an overclock? With which tool?


----------



## TheGlow

Quote:


> Originally Posted by *syl1979*
> 
> Are you applying some overclock ? With witch tool ?


Afterburner 4.3.0 beta 4.
Core +180, mem +700.
I've had it run stable at core +200 and mem +800, so 180/700 shouldn't be an issue.
Someone else mentioned the card getting stuck at 3D speeds, which I also experienced. So I removed the MSI Gaming App and Afterburner, and DDU'd the drivers.
Reinstalled the drivers and Afterburner by itself, and the core can now drop to 215MHz, and temps/fans are better.
But once I reapplied my profile with +700, I usually crash within a minute.
Earlier it was fine browsing for 10 minutes; once I launched a game it crashed.
So now I have to start with the OC off, apply the voltage lock, then raise the memory.
I see the Profiles settings section has 2D and 3D profiles.
I'll play with that next: set stock for 2D and the OC for 3D.
Going to bed now, so that'll wait for tomorrow.


----------



## amd7674

Quote:


> Originally Posted by *TheGlow*
> 
> Afterburner 4.3.0 beta4.
> Core +180, Mem +700.
> Ive had it run stable with Core +200 and Mem +800, so 180/700 shouldn't be an issue.
> Someone else mentioned the card stuck in 3d speeds, which I also experienced. So i removed the MSI Gaming App, after burner, and DDU'd the drivers.
> Redid drivers and Afterburner by itself and can now drop to 215MHz on core, and temp/fans are better.
> But once I reapplied my profile with +700 I usually crash within a minute.
> Earlier it was fine browsing for 10 mins. once I launched a game it crashed.
> So now I have to have off, apply the voltage lock, then raise memory.
> I see the Profiles settings section has 2D and 3D profiles.
> I'll play with that next. Set stock for 2D and then OC's for 3D.
> Going to bed now so that'll wait for tomorrow.


I'm using the same driver/AB version as you. I fixed my issue by reinstalling the driver (DDU in safe mode).

What Power Management mode are you using (under Manage 3D settings)? I'm using Prefer Maximum Performance.

BTW... I cannot get much OC on my Strix OC... +50 on the core and +500 on the memory... 2050/9000 is not too bad in games. Just played 30 minutes of BF1 (ultra settings). In benches the core drops to 2025/2038... I guess it gets throttled by temps. I don't want to change the fan curve as it would be too noisy for me.
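For reference, the arithmetic behind offsets like "+500 on memory → 9000" can be sketched as below. The assumption here (worth verifying in GPU-Z on your own card) is that a stock 1070's GDDR5 reports 4004MHz in Afterburner/GPU-Z and that the memory offset adds directly to that figure before the double-data-rate doubling.

```python
# Rough arithmetic behind "+500 on memory -> 9000": a stock GTX 1070's
# GDDR5 reads 4004 MHz in GPU-Z/Afterburner (8008 MT/s effective), and
# the Afterburner offset appears to add directly to that figure.
# The base clock is an assumption; check your own card in GPU-Z.

STOCK_MEM_MHZ = 4004  # GTX 1070 stock memory clock as the tools report it

def effective_mem_rate(offset_mhz, base_mhz=STOCK_MEM_MHZ):
    """Return the effective (double data rate) memory speed in MHz."""
    return (base_mhz + offset_mhz) * 2

print(effective_mem_rate(500))  # 9008, in line with the "2050/9000" above
print(effective_mem_rate(0))    # 8008 at stock
```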


----------



## GreedyMuffin

I've tried undervolting with my 1080: 2088MHz at 0.975V, stable it seems! :-D


----------



## Yetyhunter

Quote:


> Originally Posted by *Balrogos*
> 
> I think it's bad optimization. Did you check the CPU load when benchmarking in Marrakesh?
> 
> Check out this poor cooler on the Asus Strix GTX 1070:
> 
> I think Asus takes us for idiots.
> 
> And the Palit for reference:


The Gigabyte is even worse. The thermal paste was not evenly applied and it was hard as a rock. I replaced the paste with some Arctic Silver and shaved as much as 4-5°C off the max temp: it dropped from 73°C to a max of 68°C. These are real-world temperatures while gaming for more than an hour; I played The Witcher 3, which gives almost constant 99% GPU usage.
Also, the Gigabyte version has only 2 heatpipes!! How is this possible??


----------



## Jimbags

Quote:


> Originally Posted by *Yetyhunter*
> 
> The gigabyte is even worse. The thermal paste was not evenly applied and it was hard as a rock. I changed the paste with some arctic silver and shaved off as much as 4-5*C from the max temp. Dropped from 73* to a max off 68*C. These are real wold temperatures while gaming for more than an hour. I played the witcher 3 which gives almost constant 99% Gpu usage.
> Also the gigabyte version has only 2 heat-pipes !! How is this possible ??


These are very thermally efficient GPUs, so don't worry about heatpipes... That Asus cooler actually looks pretty good; it's the paste that's the problem. My Gigabyte Founders Edition stays cool enough to do 2100+MHz, and the 1070 isn't vapor chamber like the 1080. I might redo my paste though. Damn you for encouraging me!


----------



## ZakZakXxX

http://www.3dmark.com/fs/9882413

Me... 3DMark Fire Strike.
Graphics score: 20938


----------



## Jimbags

Quote:


> Originally Posted by *ZakZakXxX*
> 
> http://www.3dmark.com/fs/9882413
> 
> Me .. 3d mark fire strike ..
> Ghraphic score 20938


Try Extreme, you pansy. You're rocking a 1070, after all....


----------



## kevindd992002

Quote:


> Originally Posted by *AngryLobster*
> 
> Wow that Palits heatpipes are huge. Their cooler looks a lot like the aftermarket heatsinks you can buy (Prolimatek MK-26 and Raijintek Morpheus). No wonder it performs so well.


How does the Palit fare against the Zotac AMP Extreme?


----------



## patriotaki

Hello all, I'm planning on buying a GTX 1070 very soon, but I don't know which one to get... Have any of you had issues with coil whine? If so, with which brand (MSI, Asus, etc.)?

Does the G1 offer the best OC and cooling solution, like the 970 version did?


----------



## phalae

Hello Guys,

I don't post a lot around here, but I read you almost everyday.









I own a Gigabyte GTX 1070 G1 Gaming and I'm experiencing something bad with the cooler fans: they start to vibrate randomly at high speed (65-70% fan speed) and make an awful sound (like a mobile phone's vibration motor).
Has anyone ever experienced something like that with a GTX cooler?









Btw, I'm impressed with the +150MHz on core some of you are getting, nice!

I'm stable @ +125 for now.


----------



## Jimbags

Quote:


> Originally Posted by *phalae*
> 
> Hello Guys,
> 
> I don't post a lot around here, but I read you almost everyday.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I own a GTX 1070 Gigabyte G1 Gaming and I experience something bad with the cooler fan. They start to vibrate randomly at high speed (65-70% fan speed) and makes an awfull sound (like vibration for a mobile phone).
> is there anyone ever experienced something like that with a GTX cooler ?
> 
> Btw, I'm impressed with +150mhz on core, nice for you guys
> 
> I'm stable @+125 for now


Anything like that, if legit, return it!


----------



## Sueramb6753

-snip-


----------



## patriotaki

Quote:


> Originally Posted by *Symix*
> 
> Had 2 palit SJS with coil whine
> 
> Had 1 MSI gaming X with fan rattle, but zero coil whine. Also fans were less noisy (if you don't count rattle) on the MSI compared to palit.
> 
> From what I've seen gigabyte/asus/evga are also notorious for coil whine.
> 
> Getting a perfect 1070 is pretty much like winning the lottery, first batches are always full of defects.


Which one offers the best cooling solution, then, with good OC headroom?


----------



## Sueramb6753

-snip-


----------



## kevindd992002

Quote:


> Originally Posted by *Symix*
> 
> Had 2 palit SJS with coil whine
> 
> Had 1 MSI gaming X with fan rattle, but zero coil whine. Also fans were less noisy (if you don't count rattle) on the MSI compared to palit.
> 
> From what I've seen gigabyte/asus/evga are also notorious for coil whine.
> 
> Getting a perfect 1070 is pretty much like winning the lottery, first batches are always full of defects.


When do the second batches usually come?


----------



## phalae

Quote:


> Originally Posted by *Jimbags*
> 
> Anything like that, if legit, return it!


I sent an email to Gigabyte support.


----------



## syl1979

Some more documentation about undervolting.

Here are the scores locked at 2075MHz / 0.993V:

Firestrike Graphics 19930
http://www.3dmark.com/fs/10003257


Firestrike Ultra Graphics 4698
http://www.3dmark.com/fs/10003195


Timespy 5589
http://www.3dmark.com/spy/375143


----------



## TheGlow

Quote:


> Originally Posted by *amd7674*
> 
> I'm using the same drivers/AB version as you. I fixed my issue by reinstalling driver (DDU in safe mode).
> 
> What Power Management mode are you using? Under Manage 3D settings. I'm using Prefer Maximum Performance.
> 
> BTW... I cannot get much OC on my Strix OC... +50 on core and +500 on memory... 2050/9000 is not too bad in games. Just played 30 minutes of BF1 (ultra settings) In benches the core drops to 2025/2038... I guess it gets throttled by temps. I don't want to change Fan Curve as it would be too noisy for me.


Yes, Prefer Maximum Performance.
Well, just waking up and checking this: Windows did updates and forced a reboot at 1:20am.
I opened Afterburner and it's sitting at 1582MHz, 800mV.
So after a fresh restart, with no games having been played, it's already stuck at 3D clocks?
Forget about 2D and 3D profiles. As soon as I set them it applies the 3D profile, even at the desktop,
and then it causes a lockup at boot.
I get about 1-2 seconds to close AB before it applies the profile.
And as I typed this up it jumped from 215MHz / 625mV to 1550MHz / 800mV.
And as I hit submit on this post it jumped back down.
Wow, OK, so it seems lots of little apps trigger the higher clocks, like opening Origin or even Microsoft Edge.
Once I close Edge, the clocks drop back to normal.
So now if I apply my OC with the memory offset and stay at low voltage, it seems OK.
Once I open any of those apps, it's an instant crash. I guess it pushes the clocks before they get voltage.
And, as earlier, as I type this the boost shot up. It also doesn't appear the curve editor/voltage lock settings apply per profile; they're just for that session.


----------



## Balrogos

Is it normal that without OC my GPU clock goes up to 2025/2037.5MHz?


----------



## pez

Quote:


> Originally Posted by *phalae*
> 
> I sent a email to gigabyte support


If you have any trouble or find them taking a while to get back to you, hit up @Super Lizard on here. He was a great help in assisting me with my Gigabyte RMA of a mouse.


----------



## TheGlow

Ok, I figured out a little more.
If I install the MSI Gaming App, it adds 3 services,
GamingApp_Service
GamingHotkey_Service
MSI_ActiveX_Service.

Disabling the hotkey service seems to have no effect.
Disabling either GamingApp_Service or the ActiveX service prevents the MSI Gaming App from opening.
The GamingApp_Service seems to be the trigger that keeps the card in "3D mode", or whatever the proper term is: it feeds the card the 800mV minimum and it sits at 1582MHz.
Once I stop that service, the system drops back to 215MHz and 625mV.
So I'm guessing my original lockup at system boot was the high OC profile applying before the Gaming App service kicked in.

Tested numerous times: turn the service on, clocks jump; turn it off, clocks drop. With clocks and voltage low, open MS Edge: instant lockup.
Note I do not need to have the MSI Gaming App set to launch with Windows; just installing it adds the service, which locks in the higher voltage/clock rates.
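The service toggling described above can also be scripted from an elevated prompt with the built-in Windows `sc` tool. A minimal sketch, with the service names taken from this post; `dry_run=True` only builds the command instead of touching a live system, since actually running `sc` requires Windows and admin rights:

```python
# Helper for toggling the MSI services named above via Windows' `sc` tool.
# Windows-only when dry_run=False; needs an elevated (admin) prompt.
import subprocess

MSI_SERVICES = ["GamingApp_Service", "GamingHotkey_Service", "MSI_ActiveX_Service"]

def toggle_service(name, action, dry_run=True):
    """action is 'stop' or 'start'; returns the command line that (would) run."""
    if action not in ("stop", "start"):
        raise ValueError("action must be 'stop' or 'start'")
    cmd = ["sc", action, name]
    if not dry_run:
        subprocess.run(cmd, check=True)  # raises if sc reports an error
    return " ".join(cmd)

print(toggle_service("GamingApp_Service", "stop"))  # sc stop GamingApp_Service
```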


----------



## vloeibaarglas

Quote:


> Originally Posted by *Yetyhunter*
> 
> The gigabyte is even worse. The thermal paste was not evenly applied and it was hard as a rock. I changed the paste with some arctic silver and shaved off as much as 4-5*C from the max temp. Dropped from 73* to a max off 68*C. These are real wold temperatures while gaming for more than an hour. I played the witcher 3 which gives almost constant 99% Gpu usage.
> Also the gigabyte version has only 2 heat-pipes !! How is this possible ??


The Gigabyte G1 is now a budget-line card, and the Gigabyte Windforce (their top of the line two generations ago) is now the super-budget card. Their top card is now the Xtreme Gaming.

The rest of the 1070s have 5-6 heatpipes, so it's no surprise that the Gigabyte 1070/1080 are the worst-performing full-sized custom cards thermally.


----------



## syl1979

Quote:


> Originally Posted by *Balrogos*
> 
> it is normal without oc my gpu clock goes up to 2025/2037.5mhz?


Mine boosts to 2025 without an overclock.


----------



## syl1979

Ahah!
Core 2150, mem 4650 (not stable, artifacts)

Got a graphics score of 22013 in Firestrike
http://www.3dmark.com/fs/10004737
Makes 1st rank today for 2500K + GTX 1070!


For Firestrike Ultra, 4913 graphics
http://www.3dmark.com/fs/10004785

For Timespy the memory can only stand +600
That makes a 6596 graphics score
http://www.3dmark.com/spy/375963

Notes:
Stable clocks are core 2100MHz, mem 4430

Firestrike is 20264
http://www.3dmark.com/fs/10004349

Timespy is 6488
http://www.3dmark.com/spy/375596


----------



## amd7674

Quote:


> Originally Posted by *TheGlow*
> 
> Ok, I figured out a little more.
> If I install the MSI Gaming App, it adds 3 services,
> GamingApp_Service
> GamingHotkey_Service
> MSI_ActiveX_Service.
> 
> Disabling hotkey service seems to have no effects.
> Disabling either GamingApp_Service or the ActiveX service prevents the MSI Gaming App from opening.
> The GamingApp_Services seems to be the trigger to keep the card in "3d mode" or whatever the proper term is. It will feed it the 800mV minimum and sit at 1582MHz.
> Once I stop that service, the system drops back to 215MHz and 625mV.
> So I'm guessing my original lock at system boot was the high oc profile applying before the gaming app service kicks in.
> 
> Tested numerous times. Turn on service, clocks jump. Turn off service, clocks drop. With clocks and voltage low, open MS Edge, instant lock up.
> Note I do not need to have the MSI Gaming App program set to launch with windows, just installing it adds the service which is locking in those higher voltage/clock rates.


I've read in a few places that the MSI Gaming App is evil... I would think uninstalling it should remove all the services associated with it.


----------



## Swiftz

I'm going to upgrade my 290X to the MSI 1070, as the red/black colour matches my build, but I have read online about quality problems with the MSI 1070. Is that only with Micron memory and not Samsung?


----------



## amd7674

Stupid question: I can see some peeps using "Core Voltage (%)" and some not. Does it actually help when OCing the GPU core? The slider goes up to 100%; isn't 100% the default the GPU core will use?


----------



## adamjp

Just got my Zotac AMP and I love it. I've got settings maxed in GTA V and it's actually turning the fans off from time to time because it's staying so cool. I haven't bothered to overclock, but I'm having no issues with the 550W power supply at stock speeds. I'm not sure I see much point in OCing with GPU Boost 3.0.


----------



## amd7674

Quote:


> Originally Posted by *adamjp*
> 
> Just got my Zotac amp and I love it. I've got settings maxed in gta v and its actually turning off the fans from time to time because it's staying so cool. I haven't bothered to overclock but I'm having no issues with the 550w power supply at stock speeds. I'm not sure I see much point in OCing with gpu boost 3.0


If the on/off cycling bothers you, you can build your own AB fan profile and set the fans to come on at 50°C @ 40% and ramp from there.

I did my OC just for the fun of it. At the moment I only game at [email protected]
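A fan curve like the one suggested above (fans off until 50°C, then 40% and ramping) is just linear interpolation between a handful of points. The points below are illustrative only; in practice you'd set them in Afterburner's curve editor rather than in code.

```python
# Illustrative fan curve: off below 50 C, 40% at 50 C, ramping to 100%.
# The points are example values, not a recommendation for any specific card.

CURVE = [(50, 40), (60, 50), (70, 60), (80, 100)]  # (temp in C, fan %)

def fan_percent(temp_c, curve=CURVE):
    """Linearly interpolate the fan duty cycle for a given temperature."""
    if temp_c < curve[0][0]:
        return 0                       # below the first point: fans off
    for (t1, f1), (t2, f2) in zip(curve, curve[1:]):
        if temp_c <= t2:
            # linear interpolation between neighbouring points
            return f1 + (f2 - f1) * (temp_c - t1) / (t2 - t1)
    return curve[-1][1]                # above the last point: max out

print(fan_percent(45))  # 0
print(fan_percent(65))  # 55.0
```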


----------



## Mad Pistol

Quote:


> Originally Posted by *amd7674*
> 
> Stupid question I can see some peeps using "Core Voltage (%)" and some do not. Is it actually helping to OC GPU's core? The slider goes up to 100%, isn't 100% the default the GPU core will use?


Not a stupid question at all.

Depending on the GPU model, it may help. Personally, I haven't found any instance where increasing the voltage percentage helps, but I guess with AIB partner cards/coolers there may be some benefit.


----------



## TheGlow

Quote:


> Originally Posted by *amd7674*
> 
> I've read in few places MSI Gaming App is evil ... I would think Uninstall should remove all services associated with it.


For some, perhaps. The app itself was too simplistic and I didn't like it.
However, installing it added that service, which forces the card to run with more juice instead of idling.
That keeps me from crashing.
I'm not sure about you guys, but if it's reporting low clocks, add a fairly high memory offset (around +700 for me) and open MS Edge on Win10 or something.
For me, instant death.


----------



## chantruong

Quick question, guys. Do you know if the MSI Aero and Asus Turbo blower models use founder's edition PCBs?

Thanks


----------



## vloeibaarglas

Quote:


> Originally Posted by *chantruong*
> 
> Quick question, guys. Do you know if the MSI Aero and Asus Turbo blower models use founder's edition PCBs?
> 
> Thanks


Quick googling reveals that the Asus Turbo has a 'custom' PCB. Aero unknown. Both cards have poor thermals, thus poor Boost 3.0 clocks.

Typically, custom PCBs are better than reference, but not always. XFX's dumpster 79xx/2xx generation featured cards with worse PCBs than the reference designs.


----------



## chantruong

Quote:


> Originally Posted by *vloeibaarglas*
> 
> Quick googling reveals that Asus Turbo has a 'custom' PCB. Aero unknown. Both cards have poor thermals, thus poor boost 3.0 clocks.
> 
> Typically, custom PCB are better than reference, but not always. XFX's dumpster 79xx/2xx generation featured cards with worse PCB than reference PCBs.


Thanks for the reply. The reason I ask is I would like to get a waterblock for two cards in the future. Is the founder's edition the way to go?


----------



## SuperZan

Quote:


> Originally Posted by *chantruong*
> 
> Thanks for the reply. The reason I ask is I would like to get a waterblock for two cards in the future. Is the founder's edition the way to go?


FE or anything like the Aero that is just a replicated FE PCB with a cheaper blower cooler.

https://www.ekwb.com/configurator/waterblock/3831109831472


----------



## kersoz2003

I have a GTX 1070 overclocked to GTX 1080 stock speeds. I have a 144 Hz 2K monitor and get 80-90 fps average in Battlefield 1, so I set my refresh rate to 90 Hz and get very fluid gameplay.


----------



## Nukemaster

If anyone wants to know what the Asus Dual GTX 1070 08G looks like, it is this.

Note the thermal pad/tape that only covers some of the memory and has a holder under some of it.


The VRM cooler does not seem to have quite large enough thermal pad(s).


Temperatures under heavy load seem to be about 78C or so (stock fan auto setting).

The paste is pretty hard and required more work to remove than I expected (maybe new paste would reduce temperatures).


----------



## amd7674

Quote:


> Originally Posted by *Nukemaster*
> 
> If anyone wants to know what the Asus Dual GTX 1070 08G looks like it is this.
> 
> Note the thermal pad/tape that only covers some memory and has a holder under some of it.
> 
> 
> The VRM cooler does not seem to have quite large enough thermal pad(s)
> 
> 
> Temperatures under heavy load seems to be about 78c or so(stock fan auto setting).
> 
> The paste is pretty hard and required more work to remove than I expected(maybe new paste would reduce temperatures).


Did you have a tiny sticker when disassembling the card? I believe it might void the warranty with Asus. I have it on my Strix OC.

Please let me know if the paste replacement helps. My Strix gets to 75C when stress testing with the default fan profile. During regular games it's at 60-65C. The fans over 60% are a little too noisy for me, so I'm using the default fan curve (not very aggressive).


----------



## StrelokAT

Hi guys. Today my MSI GTX 1070 Gaming X arrived. But unfortunately I got one with crappy Micron memory.
Couldn't even OC my memory to +400.








Why does MSI use cheap Micron memory on their cards anyway?? On the Gaming X.









Shall I send it back? Please help me. I don't like that and I cannot live with it.


----------



## sammkv

Quote:


> Originally Posted by *Yetyhunter*
> 
> The gigabyte is even worse. The thermal paste was not evenly applied and it was hard as a rock. I changed the paste with some arctic silver and shaved off as much as 4-5*C from the max temp. Dropped from 73* to a max off 68*C. These are real wold temperatures while gaming for more than an hour. I played the witcher 3 which gives almost constant 99% Gpu usage.
> Also the gigabyte version has only 2 heat-pipes !! How is this possible ??


I just did the same with my card removed the crappy dried up stock paste and put some MX-4 on it. Should be good for 8 years


----------



## TheGlow

Quote:


> Originally Posted by *StrelokAT*
> 
> Hy guys. Today my MSI gtx1070 GamingX has arrived. But unfortunately i got one with crappy MICRON memory.
> Couldnt even oc my mem to +400.
> 
> 
> 
> 
> 
> 
> 
> 
> Why does MSI use cheap Micron memory on there cards anyway?? On the Gaming X.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Shall i send it back. Please help me. I dont like that and i cannot live with it.


Just wondering what you did.
Did you install the MSI Gaming App? Are you just using Afterburner?
Did you apply the memory clock during a load or just at the desktop?
Because I have the Gaming X with Micron and I've gotten it to work with a +800 memory clock. I really don't believe I could have won the lottery that well when so many others are saying otherwise.
Like this morning, I was resetting things and I was locking up over +400 at the desktop, hence my asking you about voltage and load.


----------



## amd7674

Quote:


> Originally Posted by *sammkv*
> 
> I just did the same with my card removed the crappy dried up stock paste and put some MX-4 on it. Should be good for 8 years


Did you notice any temp differences?


----------



## amd7674

Quote:


> Originally Posted by *StrelokAT*
> 
> Hy guys. Today my MSI gtx1070 GamingX has arrived. But unfortunately i got one with crappy MICRON memory.
> Couldnt even oc my mem to +400.
> 
> 
> 
> 
> 
> 
> 
> 
> Why does MSI use cheap Micron memory on there cards anyway?? On the Gaming X.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Shall i send it back. Please help me. I dont like that and i cannot live with it.


Hey I can only do +50 on my core and +500 on samsung ram on my strix OC ...









You can try to lock the voltage using AB.

http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/2510#post_25459448


----------



## amd7674

How are the noise levels on MSI Gaming X cards? Are the fans noisy after 60%? I find my Strix OC a little loud with anything over 60%. I get a little bit of noise when the card is pushed to the max.


----------



## Nukemaster

Quote:


> Originally Posted by *amd7674*
> 
> Did you have a tiny sticker when dissembling the card? I believe it might void warranty with Asus. I have it on my Strix OC.
> 
> Please let me know if the paste replacement help. My strix gets to 75C when stress testing with default fan profile. During regular games its at 60-65C. The fans over 60% are a little to noisy for me so I'm using default fan curve (not very aggressive).


Yes, the card had a single sticker on one of the cooler screws. It does not say anything, but it is safe to say they will not warranty a card if you remove it.

As for the temperatures: I was testing in games. Your Strix is most likely much better (and I would guess has a better voltage regulation system). My regulators get very hot (most likely still in spec).

Unfortunately I am not sure when I will try to reapply thermal paste on the card. I got a Mono Plus from Arctic to see how well it performs. I honestly would have rather stuck an all-in-one cooler on that core (that is how my GTX 670 was, and boy was it quiet compared to this card's stock cooler), but it is kind of expensive, and pumps do fail more often than fans.

In the end, I would have paid a bit more for a Strix, or even better an MSI card, had I known this cooler would be louder than expected. I honestly figured that for a lower-wattage card, almost any cooler would be decent.

My next thing to do is make a larger cooler for the VRMs, since the new cooler does not cover them. I want to use part of a CPU cooler, but it is not wide enough, so I need a thick plate to make a base first.

Mono Plus


Fan to try to keep the VRM area cooler until I can get something better.


What my 670 had







CPU cooler for the VRMs


----------



## naved777

Zotac GTX 1070 AMP extreme flickering and ended up with black bars (At stock)


----------



## naved777




----------



## Balrogos

Quote:


> Originally Posted by *Nukemaster*
> 
> If anyone wants to know what the Asus Dual GTX 1070 08G looks like it is this.
> 
> Note the thermal pad/tape that only covers some memory and has a holder under some of it.
> 
> 
> The VRM cooler does not seem to have quite large enough thermal pad(s)
> 
> 
> Temperatures under heavy load seems to be about 78c or so(stock fan auto setting).
> 
> The paste is pretty hard and required more work to remove than I expected(maybe new paste would reduce temperatures).


Quote:


> Originally Posted by *sammkv*
> 
> I just did the same with my card removed the crappy dried up stock paste and put some MX-4 on it. Should be good for 8 years


Asus is a piece of **** now.

Earlier I had an R9 290X with the same crappy Strix cooler. If you want a good direct-heatpipe paste, try Coollaboratory Liquid Ultra, or a better solution specially designed for DH coolers, Coollaboratory Liquid Copper; that paste is excellent.


----------



## Yetyhunter

Quote:


> Originally Posted by *sammkv*
> 
> I just did the same with my card removed the crappy dried up stock paste and put some MX-4 on it. Should be good for 8 years


Great! Did you notice lower temps?


----------



## sammkv

Quote:


> Originally Posted by *Yetyhunter*
> 
> Great ! Did you noticed lower temps ?


No I don't think it did honestly lol


----------



## jamor

Do the MSI X and Z have the same voltage?


----------



## GreedyMuffin

I believe so.

AKA: take that as a yes.


----------



## Majentrix

Just bought a 1440p 144hz monitor to go with my 1070. Time to see what games I can max out with this


----------



## jamor

My core is overclocked at 2126 mhz stable (for now at least) on MSI GTX 1070 X. Is memory overclocking important? I just left it stock 4007 mhz.. or 2003 mhz.. depending on what program you're looking at.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Majentrix*
> 
> Just bought a 1440p 144hz monitor to go with my 1070. Time to see what games I can max out with this


Congrats! That's a very strong setup.


----------



## Swolern

Quote:


> Originally Posted by *jamor*
> 
> My core is overclocked at 2126 mhz stable (for now at least) on MSI GTX 1070 X. Is memory overclocking important? I just left it stock 4007 mhz.. or 2003 mhz.. depending on what program you're looking at.


Yes memory matters. How much it matters depends on the game and resolution.


----------



## syl1979

That's why the GTX 1080 has high-speed GDDR5X.


----------



## jamor

Quote:


> Originally Posted by *Swolern*
> 
> Yes memory matters. How much it matters depends on the game and resolution.


So do I just double the memory clock number in Unigine? If I get an overclock of 4300 MHz in Unigine, is that really 8600 MHz? And in GPU-Z/Afterburner sensors it reads as 2151.5 MHz, so do I just triple that number?


----------



## Forceman

Quote:


> Originally Posted by *jamor*
> 
> So do I just double the memory clock number in Uningine? If I get overclock of 4300mhz in uningine is that really 8600mhz? And in GPU-Z/Afterburner sensors it reads as 2151.5 mhz so I just triple that number?


Double the Unigine number, and quadruple the number you see in GPU-Z (doesn't AB show 4xxx though? So just double that). 8000 is stock, so whatever math gets you above that, basically.
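To make the multiplier business concrete, here is a rough sketch of the conversion. The per-tool multipliers are assumptions based on the numbers quoted in this thread (GPU-Z showing the ~2003 MHz base command clock, Unigine/AB showing the ~4007 MHz double rate), with GDDR5 being quad-pumped to the 8000 MHz effective stock rate.

```python
# Rough sketch of the clock math above: GDDR5 is quad-pumped, so different
# tools report the same memory speed at different multipliers. The mapping
# below is an assumption based on the values quoted in this thread.
MULTIPLIER = {
    "gpu-z": 4,       # shows the base command clock (e.g. 2003 MHz)
    "unigine": 2,     # shows the double-data-rate clock (e.g. 4007 MHz)
    "afterburner": 2,
}

def effective_mhz(reported_mhz: float, tool: str) -> float:
    """Convert a tool's reported memory clock to the effective GDDR5 rate."""
    return reported_mhz * MULTIPLIER[tool.lower()]

print(effective_mhz(2003, "gpu-z"))    # 8012, right at the ~8000 MHz stock rate
print(effective_mhz(4300, "unigine"))  # 8600, the overclock asked about above
```

Whatever the tool shows, the sanity check is the same: the effective number should land at or above 8000.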


----------



## TheGlow

Quote:


> Originally Posted by *Majentrix*
> 
> Just bought a 1440p 144hz monitor to go with my 1070. Time to see what games I can max out with this


I got the Dell S2716DG last month before the 1070 and honestly, maybe I'm getting old, but I don't see much difference between 1080p and 1440p.
I do like the desktop space, the 27", and the 144Hz.
Overwatch on max is about 120fps.
Witcher 3 all maxed except Hairworks gets me 70-85fps.
Battlefield 1 demo jumps around 90-120; I'll assume a lot of that is it being unoptimized, as sometimes you join the server and you don't even get a gun to shoot anyone. Or melee. Or heal. Just stand there to die.

Quote:


> Originally Posted by *jamor*
> 
> My core is overclocked at 2126 mhz stable (for now at least) on MSI GTX 1070 X. Is memory overclocking important? I just left it stock 4007 mhz.. or 2003 mhz.. depending on what program you're looking at.


I can get up to +800 for memory in Afterburner, but only if the card is running with some juice.
I noticed installing the MSI Gaming App adds the GamingApp_Service, and when that's running, the clock shoots up to 1500MHz and won't idle.
If it's idle and I push memory, I get crashes doing stuff like opening MS Edge, launching Origin, etc.
With that service going I can push more.


----------



## jamor

Quote:


> Originally Posted by *TheGlow*
> 
> I can get up to +800 for memory in After burner but only if the card is running with some juice.
> I noticed installing MSI Gaming App adds the GamingApp_Service and when thats running, the clock shoots up to 1500MHz and won't idle.
> If idle and i push memory i get crashes doing stuff like opening MS Edge, launching Origins, etc.
> With that service going I can push more.


So far I have poor memory and a good core. Up to 2126 core stable so far, but 8800 memory instantly crashes. 8600 was working for a short period but I haven't had time to test it.

FWIW I have the Micron.


----------



## amd7674

Quote:


> Originally Posted by *TheGlow*
> 
> I can get up to +800 for memory in After burner but only if the card is running with some juice.
> I noticed installing MSI Gaming App adds the GamingApp_Service and when thats running, the clock shoots up to 1500MHz and won't idle.
> If idle and i push memory i get crashes doing stuff like opening MS Edge, launching Origins, etc.
> With that service going I can push more.


I think a Pascal BIOS editor (if we ever get it) would be your friend for setting P-states, power levels, etc.

I guess I'm an average Joe; my Asus Strix OC "seems" to be stable at 2050/9000 (more real game time required).

I wish I had a 144Hz IPS G-Sync monitor, but I don't have the $$$.

However my 32" LG 32LD450 (4:4:4) IPS panel at 1080p@60Hz will do for now









BF1 on ultra at 1080p@60Hz doesn't even make the 1070 sweat ... I will test it tonight with vsync off.


----------



## amd7674

Probably most of you have seen it, but there is a new MSI Afterburner 4.3.0 Beta 14 (DX12 compatible).

I will try it tonight.


----------



## TheGlow

Quote:


> Originally Posted by *jamor*
> 
> So far I have poor memory and good core. Up to 2126 core stable so far but 8800 memory instantly crashes. 8600 was working for a short period but I haven't had time to test it.
> 
> FWIW I have the Micron.


I have Micron too. So I'm not sure if I just got crazy lucky, or if there are vastly different types out there.


http://imgur.com/ge3QCEV


Quote:


> Originally Posted by *amd7674*
> 
> I think pascal BIOS editor (if we ever get it) would be your friend to set p states, power levels etc.
> 
> I guess I'm an average Joe, my Asus Strix OC "seems" to be stable at 2050/9000. (more real game time required).
> 
> I wish I had a 144hz ips g-sync monitor, but I don't have $$$.
> 
> However my 32" LG 32LD450 (4:4:4), ips panel, [email protected] will do for now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BF on ultra at [email protected] doesn't even make 1070 sweat ... I will test it tonight with vsync off.


I went with the Dell. It's TN, but damned if I really notice.
My old LG was an IPS; I replaced it 3 years ago with a TN and didn't really notice. I also use IPS 16:10 24" monitors and I don't think the difference is worth another $250+.
The Dell S2716DG drops to $450 fairly often lately.
Quote:


> Originally Posted by *amd7674*
> 
> probably most of you have seen it but there is a new MSI AfterBurner 4.3.0 Beta 14 (dx12 compatible).
> 
> I will try it tonight.


Nope, wasn't aware. I'll blow it up now.


----------



## adamjp

It's sad, the crap I'm hearing about Asus and MSI GPUs lately. Zotac seems hit and miss, but I seem to have done well any time I've bought them. I almost bought EVGA because I've only ever heard good things about them, but ultimately I went for the biggest heatsink I could fit in my Core 500, and that's what I came up with. I love the steel wrap-around backplate on the Zotac AMP, and I've yet to see it hit 70C.


----------



## amd7674

Quote:


> Originally Posted by *TheGlow*
> 
> I went with the Dell. It's TN but damn if I don't really notice.
> My old LG was an IPS and I replaced 3 years ago with a TN didnt really notice. I also use IPS 16:10 24" monitors and I don't think the difference is worth another $250+.
> The Dell s2716dg drops to $450 fairly often lately.
> Nope, wasn't aware. I'll blow it up now.


There are pros/cons for both panels. With 32" glory, I like IPS's viewing angles and vibrant colors for photo editing. However, yes, I do get IPS glow, and TN gives you better response/input lag.

What I hate about any type of display are backlight bleeding and screen uniformity issues.


----------



## ElectroManiac

Gonna buy a 1070 today. Deciding between the Gigabyte Windforce, Gigabyte G1, MSI Armor and MSI Gaming.

Any concerns with these models?

The Windforce is at a really good price compared to the rest, so I'm kind of leaning toward that one. I will overclock, so I don't mind what clocks they have at stock.

Should I just save the money with the Windforce, or is it worth paying $20-30 more for the other options?

How does the Windforce overclock?


----------



## bigjdubb

Quote:


> Originally Posted by *adamjp*
> 
> It's sad the crap I'm hearing about Asus and MSI GPUs lately.


Just keep your internal "internet filter" turned on and consider the source. I think that maybe 1 in 10 people on tech forums are speaking from first-hand experience; the other 9 are just regurgitating something they read somewhere else.

Have you ever seen or taken part in one of those rumor experiments, where one person tells a story and then it gets passed along through the group? By the end of the experiment the story is completely different and is in no way a representation of the original.

That's 90% of the internet. No offense intended, but I wouldn't trust 9 out of 10 of the people in this thread to format a thumb drive for me, much less take hardware advice from them.


----------



## gtbtk

I run the fan at 100% and it stabilises at 55-60 deg if I am wringing as much as possible out of the card


----------



## jamor

Quote:


> Originally Posted by *Forceman*
> 
> Double Uningine, and quadruple the number you see in GPU-Z (doesn't AB show 4xxx though? So just double that). 8000 is stock, so whatever math gets you above that, basically.


Thanks. I thought AB was reading 2000 but I can't recheck until I'm home.


----------



## EvilWiffles

Had to replace my GTX 1070 Sea Hawk X since my PSU apparently decided to kill it. Luckily I was able to get a replacement on Amazon, since it hadn't been past 30 days.
This one overclocks almost the same, except it was throttling more often. 2114MHz on the core at 1.093v, stable over hours of gameplay in BF1. Locking the curve keeps it from throttling now.



Temps average 44c, pretty good and silent. Mostly silent. Samsung VRAM btw, both cards were.


----------



## TheGlow

Quote:


> Originally Posted by *bigjdubb*
> 
> That's 90% of the internet. No offense intended but I wouldn't trust 9 out of 10 of the people in this thread to format a thumb drive for me, much less take hardware advice from them.


Heh, like how I got a 30-min crash course on configuring Cisco phones for our environment, and 3 years later I'm still finding and correcting mistakes the CCNP Voice admin left all over.

As you said, I keep hearing bad stuff about the Micron memory, but so far so good for me. I believe someone with Samsung said they couldn't OC much at all either.
Quote:


> Originally Posted by *EvilWiffles*
> 
> Had replaced my GTX 1070 Sea Hawk X since my PSU apparently decided to kill it. Luckily I was able to get a replacement on Amazon, since it hasn't been past 30 days.
> This one overclocks almost the same, except it was throttling more often. 2114MHz on core 1.093v stable in hours of gameplay with BF1. Locking the curve made it not throttle now.
> 
> 
> 
> Temps average 44c, pretty good and silent. Mostly silent. Samsung VRAM btw, both cards were.


Damn, 44c. I'm idling at desktop 45c, 35% fan.


----------



## Balrogos

Better to take the Palit.

The Windforce is not a top cooler anymore; also, OC depends on the chip itself, as always.


----------



## jamor

Quote:


> Originally Posted by *EvilWiffles*
> 
> Had replaced my GTX 1070 Sea Hawk X since my PSU apparently decided to kill it. Luckily I was able to get a replacement on Amazon, since it hasn't been past 30 days.
> This one overclocks almost the same, except it was throttling more often. 2114MHz on core 1.093v stable in hours of gameplay with BF1. Locking the curve made it not throttle now.
> 
> 
> 
> Temps average 44c, pretty good and silent. Mostly silent. Samsung VRAM btw, both cards were.


How are you adjusting the voltage? I thought the 1070 was locked?


----------



## Vesimas

Here my new shining EVGA 1070 FTW


----------



## EvilWiffles

Quote:


> Originally Posted by *jamor*
> 
> How are you adjusting the voltage? I thought the 1070 was locked?


In Afterburner. Ctrl+F opens the curve, and I locked it to 1093 on the chart.
Before, I thought it was locked, so I only selected 1062, but apparently you can lock it beyond that to a certain degree.
Quote:


> Originally Posted by *Vesimas*
> 
> Here my new shining EVGA 1070 FTW


Nice







Was thinking of getting that card, just didn't like the banner on the side; it was too gaudy for my setup.


----------



## asdkj1740

Quote:


> Originally Posted by *jamor*
> 
> How are you adjusting the voltage? I thought the 1070 was locked?


The MSI AB beta is for Pascal, and you can add 100% voltage to your card through this beta version.


----------



## TheGlow

No difference for me with this version of Afterburner; high clocks still lock up my system when idle.
Leaving that GamingApp service on is the key to not dying for me.


----------



## ElectroManiac

Quote:


> Originally Posted by *Balrogos*
> 
> Better take palit
> 
> 
> 
> 
> 
> 
> 
> windforce is not top cooler anymore, also OC depoends from chip itself as always.


Is the Windforce cooler that bad?

I don't mind if it's not the top of the top coolers, just that it cools well enough to get a nice OC from it.

Money is tight for me now, so I would like to save some, but if the cooler is that bad I would pay extra for the other cards.


----------



## BroPhilip

Hey guys, first post. I have the MSI Gaming Z 1070. Running well; I haven't pushed the OC just yet. Stock, it sits just below 2000 on the core clock and has a 100 MHz OC on memory out of the box, although it does have Micron memory. I had to replace my Asus Strix non-OC; it had issues with the boost clock not kicking in, only half the fps it should get, and some artifacts (it had Samsung memory). Sounds like the later builds are coming with Micron memory.

Also a quick question: I see people reporting benefits from locking the voltage in AB? What do I need to be careful of when adjusting the voltage of my card in AB?

SPECS
i5-6600K @ 4.7 GHz
16GB DDR4
ASUS Z170-A
600W EVGA PSU
MSI Gaming Z 1070


----------



## Balrogos

Quote:


> Originally Posted by *ElectroManiac*
> 
> Is the Windforce cooler that bad?
> 
> I don't mind if is not top of the top coolers just that it cools enough to get a nice OC from it.
> 
> Money is tight now for me so I would like to save some, but if the cooler is that bad I would pay the extra for the other cards.


Palit has a better cooler than Gigabyte, and the Palit is cheaper.


----------



## Yetyhunter

Quote:


> Originally Posted by *ElectroManiac*
> 
> Gonna buy a 1070 today. Deciding between Gigabyte Windforce, Gigabyte G1, MSI Armor and MSI Gaming.
> 
> Any concern on this models?
> 
> The Windforce is in a really good price compare to the rest, so kind looking toward that one. I will overclock so don't mind what clocks they have on stock.
> 
> Should I just save the money on the Windforce, or is worth paying 20/30$ on the other options?
> 
> How does the Windforce overclock?


Go with the MSI. I read many complaints about the Gigabyte in this thread and saw generally higher overclock potential for the MSI Gaming.


----------



## j4k3nqc

Quote:


> Originally Posted by *EvilWiffles*
> 
> In Afterburner. Ctrl+F and locked it to 1093 on chart.
> Before, I thought it was locked so I only selected 1062 but apparently you can lock it beyond that to a certain degree.


Thanks man, I was wondering how to do that, since my card would only go stable at 2075MHz at 1.049V. I think I will be able to do 2115 or 2126MHz with 1.093V.

VRAM is already at +750MHz (9500MHz) stable with no artefacts, so if the core can handle 40-50MHz more it will be a very nice card!
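For anyone wondering why +750 comes out to 9500: the Afterburner offset applies to the double-data-rate clock, so it counts twice toward the quad-pumped effective rate. A quick sketch, assuming the 1070's 8000 MHz stock effective rate:

```python
# Why a +750 Afterburner offset reads as 9500 MHz effective: the offset
# applies to the double-data-rate clock, so it counts twice in the
# quad-pumped effective rate. 8000 MHz stock is an assumption for the 1070.
STOCK_EFFECTIVE_MHZ = 8000

def effective_after_offset(offset_mhz: int) -> int:
    """Effective GDDR5 clock after an Afterburner memory offset."""
    return STOCK_EFFECTIVE_MHZ + 2 * offset_mhz

print(effective_after_offset(750))  # 9500, matching the post above
print(effective_after_offset(500))  # 9000, the 2050/9000 result quoted earlier
```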


----------



## ElectroManiac

Quote:


> Originally Posted by *Balrogos*
> 
> Palit have better cooler than gigabyte and palit is cheaper.


Well on Amazon and Newegg the Gigabyte is cheaper.
Quote:


> Originally Posted by *Yetyhunter*
> 
> Go with the MSI. I read about many complains from the gigabyte in this thread and saw generally higher overclock potential for the msi gaming.


Thanks will take that into consideration.


----------



## Balrogos

Quote:


> Originally Posted by *ElectroManiac*
> 
> Well on Amazon and Newegg the Gigabyte is cheaper.
> Thanks will take that into consideration.


Heh, I buy things in different shops, not only from one.

Check other shops and compare prices.


----------



## ElectroManiac

Quote:


> Originally Posted by *Balrogos*
> 
> Heh i buy things in diffirent shops not only from one
> 
> 
> 
> 
> 
> 
> 
> check other shops and check prices.


Could you point me to one?


----------



## TheGlow

Quote:


> Originally Posted by *ElectroManiac*
> 
> Could you point me to one?


I prefer Newegg, as I don't have to pay tax and often get free or cheap shipping. And I have credit with them.


----------



## ElectroManiac

Quote:


> Originally Posted by *TheGlow*
> 
> I prefer newegg as I don't have to pay tax, often get free or cheap shipping. And I have credit with them.


Agreed. I always use Newegg and Amazon: Newegg for no tax, as you say, and Amazon because I have Prime.

I was just asking this poster, who says he buys stuff from other stores, to see which store has the Palit cheaper. He looks like a Palit rep trying to sell me one


----------



## jakoBTR

'GTX 1070 + 8GB RAM' BF1 performance test


----------



## deegzor

Quote:


> Originally Posted by *owikhan*
> 
> Zotac GTX 1070 AMP EXTREME EDITION
> http://www.3dmark.com/3dm/14530577?


Core is still king http://www.3dmark.com/compare/spy/371076/spy/314999#


----------



## kevindd992002

Which is a better card, Zotac amp extreme or palit sjs?


----------



## jamor

So it seems I can get the 2126 MHz clock stable. Bad news: the memory clock is worse than I thought. I can't even get past +100 MHz (4100/8200).


----------



## Hnykill

Palit Super Jetstream here. Core clock 2100 MHz and 9200 memory. Just won the silicon lottery.

Near-silent card.

Why can't I upload a jpg?


----------



## Vesimas

I have a doubt. I read in the past about the strange behaviour of GTX cards with refresh rates over 120Hz under Windows. Have they fixed that or not? Because right now I'm running at [email protected] under Windows, and in game I choose [email protected] with G-Sync activated. Also, is it correct that I have vertical sync enabled under the Nvidia CP and am not enabling it in the game settings? Ty


----------



## TheGlow

Quote:


> Originally Posted by *Vesimas*
> 
> I have a doubt. I read in the past about the strange behaviour of gtx with refresh over 120hz under windows. They have fixed that or not? Because now im running at [email protected] under windows and in game i choose [email protected] with gsync activated. And also its correct that i have enable vertical sync under nvidia cp and im not enabling it under tha game settings? Ty


I have the Windows desktop at 1440p@144Hz with no problems. Vsync is disabled, as I think that only kicks in if your frames surpass 144Hz; then you'd get either tearing or vsync delay.
So far everything I've been playing is 70-120fps, so it hasn't been a concern.
Otherwise, I hear setting RivaTuner to hard cap at 135-140 frames will ensure you stay in the G-Sync range.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *Vesimas*
> 
> I have a doubt. I read in the past about the strange behaviour of gtx with refresh over 120hz under windows. They have fixed that or not? Because now im running at [email protected] under windows and in game i choose [email protected] with gsync activated. And also its correct that i have enable vertical sync under nvidia cp and im not enabling it under tha game settings? Ty


Always disable v-sync for g-sync to work properly. It must be disabled for g-sync to function as intended. (No "halving of FPS" anymore, YAY!, whenever FPS is less than the default frequency of the monitor.







In other words, u can still reach 120hz and 144hz with v-sync on, but with it on u lose g-sync.)
Rule 1: u can tell nvcp to always use in-game settings; then u disable v-sync from the in-game settings.

If, for some reason (I can't think of one), u set nvcp to not use v-sync overall, or for a particular (individual) game, the result would most likely be, just like with Adaptive-sync, that the nvcp setting overrides any in-game setting. (It is easy to prove with Adaptive-sync. I guess the exact same method used to prove it with Adaptive-sync could also be used to prove that the nvcp v-sync setting overrides in-game v-sync settings for the sake of g-sync working. But u would need a full understanding of V-sync, Adaptive-sync and G-sync. All three are tightly related, so it would be the best exercise imaginable for mastering each and all three at the same time.







worked for me.







)

And the only strange behavior over 120hz (not in this thread) that I ever recall is power draw being too high in 2D (Windows). I guess it would be easy to catch, because GPU clock and fan speeds would be higher than normal on the desktop. Still, as a result, my only experience in Windows is @60hz, and it shouldn't be, because I read how much better the desktop looks @120hz... but one day I was both paranoid about the power draw bug and diagnosing a BSOD, and as a result I switched my desktop to 60hz and never went back. lol

GL


----------



## Vesimas

Thank you both. Now that I'm at home I have just tried it: @144hz the gpu clock sits at 961Mhz, and @120hz the gpu clock is at 253Mhz, so I suppose I'll leave the monitor at 120hz under Windows








The other matter, if I have understood well, is that I need to disable vertical sync in the game settings, and in NVCP set "always use in-game settings".


----------



## issact

My step-up finally got filled a couple weeks ago. Only a few days left before I get to hold my baby! I even have an EKWB waterblock waiting for her.

I've been googling firmware overclocks/power mods in the meantime. Am I sniffing around too soon? I'm too much of a novice to pioneer such a thing; does anyone know more than I do?

Thanks!


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *Vesimas*
> 
> Thank you both. Now that I'm at home I have just tried it: at 144Hz the GPU clock sits at 961MHz, and at 120Hz it sits at 253MHz, so I suppose I'll leave the monitor at 120Hz under Windows.
> 
> The other matter, if I have understood correctly, is that I need to disable vertical sync in the game settings, and in NVCP set it to always use the in-game setting.


np









Yes, you correctly discerned disabling V-Sync in both NVCP and in-game.

But I bet NVCP overrides the in-game V-Sync setting if you disable V-Sync in NVCP. (Mouthful, lol.)

And (not just you, but anyone reading this now or ever) you did enable G-Sync in both the NVCP "Set up G-Sync" page and under "Manage 3D Settings" (Global and/or Program Settings), right?


----------



## Vesimas

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> And (not just you, but anyone reading this now or ever) you did enable G-Sync in both the NVCP "Set up G-Sync" page and under "Manage 3D Settings" (Global and/or Program Settings), right?


G-Sync is enabled under the Set Up G-Sync panel (I enabled it only for full screen, not windowed), but in the other menu you mentioned I can't find anything G-Sync related.

EDIT: never mind, I found it. It was under Monitor Technology and it was not set to G-Sync. :/


----------



## gtbtk

Quote:


> Originally Posted by *BroPhilip*
> 
> Hey Guys, first post, I have the MSI Gaming z 1070. Running well haven't push oc just yet. Stock it sits just bellow 2000 on clock and has 100 oc on memory stock. Although it does have Micron memory. Had to replace my asus strix non oc. It had issues with the boost clock not kicking in and only seeing half fps as it should and some artifacts. (It had samsung memory) Sounds like the later builds are coming with micron memory.
> 
> Also quick question, I see people seeing benefits from locking voltage in AB? What do I need to be careful of in using the voltage of my card in AB.
> 
> SPECS
> I5 -6600k. 4.7 ghz
> 16 ddr4
> ASUS z170-a
> 600w evga gpu
> MSI Gaming Z 1070


The maximum voltage your card will run at is 1.093V, which is safe. Make sure you increase the fan speed with a custom curve or a fixed speed if you are pushing the memory clock.

Could you extract a copy of your vBIOS and either post it here or upload it to techpowerup.com, please?

You can use GPU-Z 1.10 to both extract and upload the BIOS file to TechPowerUp.
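As an aside, the "custom curve" mentioned above is just a temperature-to-duty mapping that the fan controller interpolates between. A minimal sketch of the idea (the curve points below are illustrative, not MSI's defaults):

```python
# Sketch of how a custom fan curve maps GPU temperature to fan duty.
# Tools like MSI Afterburner interpolate between user-set points much
# like this; the points here are made up for illustration.

CURVE = [(30, 30), (50, 45), (65, 70), (75, 100)]  # (temp in C, fan %)

def fan_percent(temp_c, curve=CURVE):
    """Linearly interpolate fan duty for a given core temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]           # below the first point: floor duty
    if temp_c >= curve[-1][0]:
        return curve[-1][1]          # past the last point: pinned at max
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(40))   # halfway between 30 % and 45 %
print(fan_percent(80))   # beyond the last point, pinned at 100
```

The point of a steeper segment in the middle of the curve is to ramp hard under memory-heavy loads without being loud at idle.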


----------



## owikhan

Quote:


> Originally Posted by *deegzor*
> 
> Core is still king http://www.3dmark.com/compare/spy/371076/spy/314999#


Wow, cool.

So far this is the best I can do, bro; I can't get the GPU clock past 1708MHz:
http://www.3dmark.com/3dm/14548749?


----------



## Balrogos

Quote:


> Originally Posted by *ElectroManiac*
> 
> Well on Amazon and Newegg the Gigabyte is cheaper.
> Thanks will take that into consideration.


Heh, I buy things from different shops, not only one. Check other shops and compare prices.
Quote:


> Originally Posted by *ElectroManiac*
> 
> Could you point me to one?


Well, I'm from Poland; Newegg is a US shop, and so is Amazon. Everything electronic in Poland is more expensive than in Germany, the UK or the USA, which is why we snipe lower prices across many shops.

OK, if I want to try OCing, should I set the power limit to maximum? And what maximum voltage is safe?


----------



## Majentrix

I'm going to need a second 1070 to drive this 1440p 144hz monitor, one just isn't fast enough for the games I play.


----------



## Balrogos

And what do you play? Most of the games I play are badly optimized, and I have an i7 4790K and a GTX 1070, of course: PlanetSide 2 (at most 30-40% CPU and 35-37% GPU load, yet FPS doesn't go over 80 and sometimes drops below 60), the same with Warframe, and Black Desert in city locations.


----------



## Vesimas

What do you think about this, all stock speed


----------



## TheGlow

Quote:


> Originally Posted by *Vesimas*
> 
> Thank you both. Now that I'm at home I have just tried it: at 144Hz the GPU clock sits at 961MHz, and at 120Hz it sits at 253MHz, so I suppose I'll leave the monitor at 120Hz under Windows.
> 
> The other matter, if I have understood correctly, is that I need to disable vertical sync in the game settings, and in NVCP set it to always use the in-game setting.


I've got the MSI Gaming App service running, so I'm at 1582MHz idle on the desktop.
If I don't have the service up, it'll drop to 215MHz and then I lock up from the memory overclock.
This is at a 144Hz refresh rate.

Tried some more Time Spy; no lockups or crashes. I think I found the core limit. I don't know if I can squeeze any more out of the memory.
http://i.imgur.com/v5W9Nb7.jpg
http://www.3dmark.com/3dm/14599112


----------



## Blackfyre

Quote:


> Originally Posted by *Vesimas*
> 
> What do you think about this, all stock speed


Normal score at stock. Enjoy the card and good luck with the overclocking.

Check GPU-Z to see whether you have Micron or Samsung vRAM and overclock accordingly. With Micron you won't be able to overclock the memory much; with Samsung you should be safe setting +500MHz in the latest MSI Afterburner beta, then start testing the core by slowly increasing it. Mine is stable at +106MHz, which equates to 2088MHz.

I have an *MSI GTX 1070 Gaming X*.
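For anyone confused by the memory numbers thrown around in this thread, GDDR5 clocks get quoted three different ways. A small sketch of the arithmetic (the 4004MHz Afterburner reading is the stock 1070 figure; treat it as the assumed baseline):

```python
# GDDR5 clocks are quoted three ways: the true command clock (~2002 MHz
# on a stock 1070), the double-data-rate figure Afterburner displays
# (~4004 MHz), and the "effective" marketing rate (8008 MHz, i.e. 8 Gbps).
# An Afterburner offset applies to the displayed figure, so "+500" adds
# 1000 MHz effective.

def effective_mem_mhz(ab_reading_mhz, offset_mhz=0):
    """Effective (marketing) memory clock from the Afterburner reading."""
    return (ab_reading_mhz + offset_mhz) * 2

print(effective_mem_mhz(4004))        # stock -> 8008
print(effective_mem_mhz(4004, 500))   # +500 -> 9008, the "9000" people quote
```

This is why a "+500" offset and "9000MHz memory" describe the same overclock.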


----------



## Vesimas

Thank you again for the answers; for the moment I'll stick to stock speeds.


----------



## victorrz

This is my result with a Zotac 1070 AMP Extreme. 2126/2101Mhz and memory clock at 2340Mhz

http://www.3dmark.com/3dm/14508260


----------



## Omzig

So I've had my Zotac AMP! (standard) for just over a week now and have been running all kinds of tests/games/benchmarks/tweaks, etc., as we do.

After a lot of testing I've got it running solid and stable at 2101MHz core and 4498MHz memory (for me that's +187 core and +500 memory (Samsung) above the default boost settings)...

Now here's the funky thing: even at 1.093V I could not get the core to stick at 2114MHz (it would always drop to 2088MHz), yet running undervolted I'm now able to hold 2101MHz at 0.975V without issue. Very strange.

All in all I'm very happy with the upgrade from the MSI 970 I had been running, but I'm still a bit annoyed that I can't lock the P-states on this card without a BSOD. I'm still not sure if that's an issue with NVIDIA Inspector or something to do with NVIDIA's new power management; I might have to re-register at Guru3D and ask in the NVI thread there.


----------



## tigertank79

My Sli tests at default(1987 and 1974/8000), 2050/8800 and 2100/9000.

3DMark Firestrike:


http://imgur.com/PpFiZeX




http://imgur.com/AVXR3I1




http://imgur.com/jcBZoxr


3DMark Firestrike Extreme:


http://imgur.com/nhwjeq4




http://imgur.com/S3zCOiy




http://imgur.com/vR5dj8e


3Dmark TimeSpy:


http://imgur.com/rcI7uqG




http://imgur.com/VHGYSdx




http://imgur.com/2YnkKzt


No bios mod for these cards? 112% PL is limiting.


----------



## TheGlow

Quote:


> Originally Posted by *Omzig*
> 
> So I've had my Zotac AMP! (standard) for just over a week now and have been running all kinds of tests/games/benchmarks/tweaks, etc., as we do.
> 
> After a lot of testing I've got it running solid and stable at 2101MHz core and 4498MHz memory (for me that's +187 core and +500 memory (Samsung) above the default boost settings)...
> 
> Now here's the funky thing: even at 1.093V I could not get the core to stick at 2114MHz (it would always drop to 2088MHz), yet running undervolted I'm now able to hold 2101MHz at 0.975V without issue. Very strange.
> 
> All in all I'm very happy with the upgrade from the MSI 970 I had been running, but I'm still a bit annoyed that I can't lock the P-states on this card without a BSOD. I'm still not sure if that's an issue with NVIDIA Inspector or something to do with NVIDIA's new power management; I might have to re-register at Guru3D and ask in the NVI thread there.


I believe that's thermal throttling.
I can set my voltage to lock at 1.093V, set voltage +100, max the limits, etc., with core +210, and it will still only report 1.080V and 2189MHz on the clock.
With no load the temp stays low at 42°C with 100% fan.
Once I put it under load, temps shoot up and the voltage starts to throttle down again.


----------



## wrathofbill

Now this is what I should have waited for......

http://eu.evga.com/Products/Product.aspx?pn=08G-P4-6278-KR


----------



## supermodjo

Quote:


> Originally Posted by *TheGlow*
> 
> I believe that's thermal throttling.
> I can set my voltage to lock at 1.093V, set voltage +100, max the limits, etc., with core +210, and it will still only report 1.080V and 2189MHz on the clock.
> With no load the temp stays low at 42°C with 100% fan.
> Once I put it under load, temps shoot up and the voltage starts to throttle down again.


Same situation as you guys, only without the thermal throttle. My G1 1070 is stable at 2129/9000 at 1.050V, watercooled, with average temps of 40-45°C. The card is also stable at 2152/9200 in BF4, but not in The Witcher 3 or Rise of the Tomb Raider; those games crash to desktop. So to find a 100% stable clock you must play demanding games. I play on a 144Hz monitor, which pushes the card harder than 60Hz does. Like the guy above me, I cannot make the card stable at 2152 even at 1.093V; games crash to desktop. Maybe a modded BIOS will help a little, maybe not. One question, guys: how do I underclock the card? Thanks and cheers. PS: the card has Samsung memory.



My PC


----------



## Chaoz

Just jumped over to the green camp. Bought an EVGA GTX 1070 SC. Love this card; it looks cool and runs cool, and it's a hell of a lot faster than my R9 390 STRIX.


----------



## Omzig

Quote:


> Originally Posted by *supermodjo*
> 
> Same situation as you guys, only without the thermal throttle. My G1 1070 is stable at 2129/9000 at 1.050V, watercooled, with average temps of 40-45°C. The card is also stable at 2152/9200 in BF4, but not in The Witcher 3 or Rise of the Tomb Raider; those games crash to desktop. So to find a 100% stable clock you must play demanding games. I play on a 144Hz monitor, which pushes the card harder than 60Hz does. Like the guy above me, I cannot make the card stable at 2152 even at 1.093V; games crash to desktop. Maybe a modded BIOS will help a little, maybe not. One question, guys: how do I underclock the card? Thanks and cheers. PS: the card has Samsung memory.
> 
> My PC


Yep, same here. I don't think my failure to lock at 2126MHz was temperature-limited, as the card never passed 64°C, and 2100MHz keeps it at 62°C at 100% load.

Anyhow, I undervolted by using MSI AB to set up a voltage curve where everything at 0.975V and above is locked to 2100MHz.


----------



## Rage19420

Quote:


> Originally Posted by *wrathofbill*
> 
> Now this is what I should have waited for......
> 
> http://eu.evga.com/Products/Product.aspx?pn=08G-P4-6278-KR


Which is the same as this no? https://www.bhphotovideo.com/c/product/1260360-REG/msi_gtx_1070_sea_hawk_x_geforce_gtx_1070_sea.html


----------



## TheGlow

Quote:


> Originally Posted by *supermodjo*
> 
> Same situation as you guys, only without the thermal throttle. My G1 1070 is stable at 2129/9000 at 1.050V, watercooled, with average temps of 40-45°C. The card is also stable at 2152/9200 in BF4, but not in The Witcher 3 or Rise of the Tomb Raider; those games crash to desktop. So to find a 100% stable clock you must play demanding games. I play on a 144Hz monitor, which pushes the card harder than 60Hz does. Like the guy above me, I cannot make the card stable at 2152 even at 1.093V; games crash to desktop. Maybe a modded BIOS will help a little, maybe not. One question, guys: how do I underclock the card? Thanks and cheers. PS: the card has Samsung memory.
> 
> My PC


Benchmarks ran at +200 core; about 2160MHz seemed fine. I think I saw 2-3 red sparkles in Time Spy.
In the meantime I have it sitting at +180 core, so around 2140, and it runs perfectly fine: The Witcher 3 and Battlefield 1 with no problems. This is with voltage +0, so it sits at roughly 1.05V.
I'll double-check the exact numbers.


----------



## Prozillah

Just successfully completed the shunt mod on my G1 this morning. It reduced the reported power draw by about 30% and leveled out the clocks nicely. No improvement in benchmark results, however; they remained the same.


----------



## Dude970

Made some good progress benching my MSI card. I can hit 22K consistently in Firestrike, and Time Spy is looking good too.


----------



## Hunched

I need a BIOS editor in my life.
+300 mem is crashing in the BF1 test, but only when I'm not playing... only on the server browser.
Because my voltage is always at 1.093v when playing, but it's always at 1.081v on the server browser.

If GPU Boost wasn't the biggest pile of garbage I've ever had to deal with then I'd be able to keep it at 1.093 and never crash.
With a custom BIOS I could push past 1.093 and overclock further.

Out of every problem and everything I have ever done in my life with a PC, dealing with GPU Boost and its dynamic bull**** is by far the worst experience I've ever had to deal with.
**** this ****, my god I want a BIOS editor to control my overclock, to control my voltage, to stop my voltage from deciding to lower and fluctuate around and cause instability.

How bad do your engineers have to be that it's solid during 3D gameplay but can't handle a 2D server browser. That's so backwards, you screwed up so bad it's mind boggling.
My card is constantly being starved of voltage and choking out and there's nothing I can do to fix it or maintain a constant, fixed voltage.

Why does Nvidia hate overclockers so much? I'm pissed.


----------



## Prozillah

Quote:


> Originally Posted by *Hunched*
> 
> I need a BIOS editor in my life.
> +300 mem is crashing in the BF1 test, but only when I'm not playing... only on the server browser.
> Because my voltage is always at 1.093v when playing, but it's always at 1.081v on the server browser.
> 
> If GPU Boost wasn't the biggest pile of garbage I've ever had to deal with then I'd be able to keep it at 1.093 and never crash.
> With a custom BIOS I could push past 1.093 and overclock further.
> 
> Out of every problem and everything I have ever done in my life with a PC, dealing with GPU Boost and its dynamic bull**** is by far the worst experience I've ever had to deal with.
> **** this ****, my god I want a BIOS editor to control my overclock, to control my voltage, to stop my voltage from deciding to lower and fluctuate around and cause instability.
> 
> How bad do your engineers have to be that it's solid during 3D gameplay but can't handle a 2D server browser. That's so backwards, you screwed up so bad it's mind boggling.
> My card is constantly being starved of voltage and choking out and there's nothing I can do to fix it or maintain a constant, fixed voltage.
> 
> Why does Nvidia hate overclockers so much? I'm pissed.


Do the shunt mod


----------



## Prozillah

Quote:


> Originally Posted by *Dude970*
> 
> Made some good progress benching my MSI card. I can hit 22K consistently in Firestrike, and Time Spy is looking good too.


That's outrageous, mate. What's your secret? Or do you just have a super sample of a chip?


----------



## Balrogos

Guys, in MSI Afterburner I have voltage as a percentage; how do I turn that % into volts? Also, for the temperature target: if I set it to, say, 60°C, will the cooling system try to maintain 60°C?


----------



## jovanni

Hey there, club. I've been testing my new Gigabyte G1 for the last couple of weeks and find myself very satisfied.
First I had to read around and form a rounded opinion of the whole 1070 range, so I went through all the posts in this thread.

Coming from two 970s in SLI (OC Strix), it was a difficult decision to jump to a single 1070.
The 970s were poor overclockers: most went up to 1530MHz, and mine both barely managed 1480MHz.
So I chose the Gigabyte. The surprise is that the card not only outperforms the two 970s but is also a very good overclocker:
9000MHz memory
2126MHz core (2126-2100 in the Heaven benchmark)

The memory is Micron, so I am not asking for much...

firestrike.jpg 408k .jpg file


heaven.jpg 146k .jpg file


----------



## Forceman

Quote:


> Originally Posted by *Hunched*
> 
> I need a BIOS editor in my life.
> +300 mem is crashing in the BF1 test, but only when I'm not playing... only on the server browser.
> Because my voltage is always at 1.093v when playing, but it's always at 1.081v on the server browser.
> 
> If GPU Boost wasn't the biggest pile of garbage I've ever had to deal with then I'd be able to keep it at 1.093 and never crash.
> With a custom BIOS I could push past 1.093 and overclock further.
> 
> Out of every problem and everything I have ever done in my life with a PC, dealing with GPU Boost and its dynamic bull**** is by far the worst experience I've ever had to deal with.
> **** this ****, my god I want a BIOS editor to control my overclock, to control my voltage, to stop my voltage from deciding to lower and fluctuate around and cause instability.
> 
> How bad do your engineers have to be that it's solid during 3D gameplay but can't handle a 2D server browser. That's so backwards, you screwed up so bad it's mind boggling.
> My card is constantly being starved of voltage and choking out and there's nothing I can do to fix it or maintain a constant, fixed voltage.
> 
> Why does Nvidia hate overclockers so much? I'm pissed.


Have you tried locking the voltage on the curve in AB?


----------



## wrathofbill

Quote:


> Originally Posted by *Rage19420*
> 
> Which is the same as this no? https://www.bhphotovideo.com/c/product/1260360-REG/msi_gtx_1070_sea_hawk_x_geforce_gtx_1070_sea.html


As much as it's a hybrid 1070... you could say mine is a mix of the two, but I would rather have the EVGA GeForce GTX 1070 FTW HYBRID GAMING!

Compare my pics above of my MSI FE with the EVGA Hybrid kit fitted to the http://eu.evga.com/Products/Product.aspx?pn=08G-P4-6278-KR

I'll take the FTW Hybrid, thanks...


----------



## Dude970

Quote:


> Originally Posted by *Prozillah*
> 
> That's outrageous, mate. What's your secret? Or do you just have a super sample of a chip?


I OCed it pretty well, and my CPU is at 5GHz.


----------



## Balrogos

How do you unlock the power limit beyond 114%?


----------



## jlhawn

Quote:


> Originally Posted by *Balrogos*
> 
> How do you unlock the power limit beyond 114%?


What brand is your GTX 1070? Some don't allow going over 114%.
My MSI lets me go to 126%.


----------



## TheGlow

Quote:


> Originally Posted by *Dude970*
> 
> I OCed it pretty well, and my CPU is at 5GHz.


I dunno, man. I have the MSI as well and don't score nearly that high, and I have it OC'd higher than in your shots.









Quote:


> Originally Posted by *Hunched*
> 
> I need a BIOS editor in my life.
> +300 mem is crashing in the BF1 test, but only when I'm not playing... only on the server browser.
> Because my voltage is always at 1.093v when playing, but it's always at 1.081v on the server browser.
> 
> If GPU Boost wasn't the biggest pile of garbage I've ever had to deal with then I'd be able to keep it at 1.093 and never crash.
> With a custom BIOS I could push past 1.093 and overclock further.
> 
> Out of every problem and everything I have ever done in my life with a PC, dealing with GPU Boost and its dynamic bull**** is by far the worst experience I've ever had to deal with.
> **** this ****, my god I want a BIOS editor to control my overclock, to control my voltage, to stop my voltage from deciding to lower and fluctuate around and cause instability.
> 
> How bad do your engineers have to be that it's solid during 3D gameplay but can't handle a 2D server browser. That's so backwards, you screwed up so bad it's mind boggling.
> My card is constantly being starved of voltage and choking out and there's nothing I can do to fix it or maintain a constant, fixed voltage.
> 
> Why does Nvidia hate overclockers so much? I'm pissed.


I can suggest two things that I've come across.
1. Bring up the curve editor (Ctrl+F), click one of the dots at the 1.093V mark, then hit L. That locks the voltage at that point.
2. I installed the MSI Gaming App, which adds a service called GamingApp_Service. That runs processes called MSIGamingOSD_x64 and x86, and having them run forces the card to stay in 3D mode. So I'm idling at +180 core and +700 memory with no problem, at 1582MHz and 725mV.
If I disable those services and the card drops back to 2D mode, I can't put more than +300 or so on the memory before it dies.
I've seen things like launching MS Edge make AB say I'm in 3D mode; I guess it's too quick for the voltage to compensate, and that would crash with the checkerboard pattern. Opening Origin = crash, etc.
So I just let the service run and I'm good.
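The memory-offset hunting described throughout this thread amounts to a coarse-to-fine search for the highest offset that passes testing. A sketch of that loop, where `is_stable` is purely a stand-in for "apply the offset in Afterburner, run Heaven/Time Spy, watch for artifacts or a driver reset" (the +430 limit is made up for the simulation):

```python
# Sketch of the search people run by hand to find a stable memory offset.
# `is_stable` is a placeholder for real benchmarking; here it's simulated
# with an arbitrary stability limit.

TRUE_LIMIT = 430  # pretend the card artifacts above +430 MHz (made up)

def is_stable(offset):
    """Stand-in for 'run a benchmark at this offset and observe'."""
    return offset <= TRUE_LIMIT

def find_max_offset(lo=0, hi=800, step=25):
    """Binary-search the largest offset in [lo, hi] that passes testing."""
    while hi - lo > step:
        mid = (lo + hi) // 2
        if is_stable(mid):
            lo = mid          # passed: search higher
        else:
            hi = mid          # crashed/artifacted: search lower
    return lo                  # last offset that passed

print(find_max_offset())  # converges just under the simulated +430 limit
```

In practice the final answer also has to survive long game sessions, not just benchmark loops, as the crash reports above show.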


----------



## adamjp

Had to share this snap of my Zotac Amp playing GTA V at 1080p60 maxed settings. Couldn't be happier with the temps, especially since it's in an ITX case.


----------



## Balrogos

Weird, I didn't know manufacturers set their own limits. My Palit SJS has a 114% power limit, and I don't know anything about OCing these NVIDIA cards; you have no voltage control, just a percentage.
Quote:


> Due to the dynamic nature of the boost clock, your frequency is not fixed. Limiters and monitors temperature, load, power and voltages will continuously alter a maximum clock state. For most founders cards these will all be the same, likely for board partners as well. So my prognosis right now is that most GeForce GTX 1070 cards will all run at roughly 2.0 to 2.1 GHz maximum with a few exception here and there.


It killed me when I read this. So there's no real OC on these NVIDIA cards anymore? We're just maxing out the factory settings?


----------



## TheGlow

Quote:


> Originally Posted by *adamjp*
> 
> Had to share this snap of my Zotac Amp playing GTA V at 1080p60 maxed settings. Couldn't be happier with the temps, especially since it's in an ITX case.


Looks good. My MSI at idle has its fan curve at 35% and sits at 45°C. Under load the fans go to around 80% and it sits around 64°C.


----------



## jovanni

Has anyone found a better way to use AB with the Gigabyte G1? AB in my case doesn't have power control. Is that normal?


----------



## jovanni

My card hovers between 1.06V and 1.093V max. I hope someone unlocks this. I wonder if 2200MHz will be easy with a modded BIOS...


----------



## gtbtk

You need the 4.3 beta, not 4.2, to unlock voltage control.


----------



## jovanni

Voltage is adjustable in the latest AB (I got it a few hours ago from Guru3D), but there isn't a power limit slider. Only with the Gigabyte app can I control the power limit.


----------



## Evilsplashy

I'm thinking about getting the EVGA 1070 SC. Has anyone noticed whether the card sags towards its far end, or does it stay straight? Thanks.


----------



## SuperZan

Quote:


> Originally Posted by *Evilsplashy*
> 
> I'm thinking about getting the EVGA 1070 SC. Has anyone noticed whether the card sags towards its far end, or does it stay straight? Thanks.


I've not noticed any slanting or sagging; it's sitting in my sig rig, plainly seated, with no support wire or anything like that.


----------



## Blackfyre

Is it safe to lock the voltage at 1.093v and the clock & memory speeds always running at maximum too? For 24/7 usage that is...

Temperatures are remaining safe always when idle & at load. But at idle the GPU remains clocked at MAX clock & memory speeds, as well as running at highest voltage. Always. Continuously.

Check what I mean, this is after a reboot:

http://i.imgur.com/frq9RLX.png


----------



## Olper

I have an EVGA GTX 1070 water cooled.
Was able to push the GPU to 2114MHz and the (Samsung) memory to 9234MHz.
I maxed the voltage bar in MSI Afterburner to +100mV, yet the card runs at 1.031V. I wish for a custom BIOS one day, as the Power% never goes above 90%. Is this normal??
My CPU is an i5 [email protected].
K-Boost is off. I can't get EVGA Precision XOC to run; it errors out with "0xc000007b".

Firestrike run:

+10 more MHz on the memory produces artifacts; +20 crashes the card in Fire Strike's second graphics test. The core stays at ~40°C at all times and does not throttle at all.

More details in my case mod thread:
http://www.overclock.net/t/1540564/completed-fractal-define-xl-original-case-mod-and-water-cooling/10#post_25492066


----------



## SupernovaBE

Can someone try out whether 1070 SLI plus a dedicated PhysX card is a good idea? I have no third card to try this with.

Maybe a 780 Ti (an old card from my mate).


----------



## Olper

It seems the voltage slider does nothing. And the other BIOS does nothing but increase the power limit to 122%, which doesn't help much as I'm not exceeding 90% anyway.


----------



## TheGlow

Quote:


> Originally Posted by *Olper*
> 
> It seems the voltage slider does nothing. And the other BIOS does nothing but increase the power limit to 122%, which doesn't help much as I'm not exceeding 90% anyway.


I have the voltage at +0 but the power limit at 126%; I sit at 1.043V pretty much. I believe that even when I set it to max AND lock it at 1.093V, it still sits closer to 1.08V.


----------



## TrantaLocked

Quote:


> Originally Posted by *Vesimas*
> 
> What do you think about this, all stock speed


Dude, I got nearly the exact same GPU score, but because I have an i5-4690 my overall score is 5300 instead of your 5550. That's insane: 10.8 FPS in my CPU test gives a CPU score 600 points lower than your 13 FPS run, and then they weight that already low CPU score THAT much in the overall score?

Time Spy is rigged. The overall score difference can be *200+* points between two systems getting the same frame rate in the GPU tests and barely a difference in the CPU test.


----------



## TheGlow

Quote:


> Originally Posted by *TrantaLocked*
> 
> Dude, I got nearly the exact same GPU score, but because I have an i5-4690 my overall score is 5300 instead of your 5550. That's insane: 10.8 FPS in my CPU test gives a CPU score 600 points lower than your 13 FPS run, and then they weight that already low CPU score THAT much in the overall score?
> 
> Time Spy is rigged. The overall score difference can be *200+* points between two systems getting the same frame rate in the GPU tests and barely a difference in the CPU test.


I was wondering how Time Spy scoring works, because I have a 1070 that's OCing pretty hard and yet it says I'm only better than 74% of all results.
Doesn't that mean the other 26% are all 1070 SLIs, 1080s, or Titan Xs?
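Part of what's going on with these overall scores is that 3DMark combines the GPU and CPU sub-scores with a weighted harmonic mean, which punishes the lower score much more than a plain weighted average would. A sketch of the effect; the 0.85/0.15 weights and sub-scores are illustrative assumptions, so check Futuremark's technical guide for the exact constants:

```python
# Why a weak CPU result drags the Time Spy total so hard: the sub-scores
# are combined with a weighted *harmonic* mean. The weights below
# (0.85 GPU / 0.15 CPU) are assumptions for illustration.

def overall(gpu, cpu, w_gpu=0.85, w_cpu=0.15):
    """Weighted harmonic mean of the two sub-scores."""
    return (w_gpu + w_cpu) / (w_gpu / gpu + w_cpu / cpu)

same_gpu = 6000
print(round(overall(same_gpu, 4500)))  # stronger CPU score
print(round(overall(same_gpu, 3900)))  # weaker CPU, same GPU: total drops ~160 pts
```

With identical GPU scores, a modest CPU gap moves the total by well over a hundred points, which matches the 200+ point swings complained about above.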


----------



## Hunched

Quote:


> Originally Posted by *Forceman*
> 
> Have you tried locking the voltage on the curve in AB?


Quote:


> Originally Posted by *TheGlow*
> 
> I dunno man. I have the MSI as well and don't score nearly that high, and I have it OCd higher than in your shots.
> 
> 
> 
> 
> 
> 
> 
> 
> I can suggest 2 things that I've come across.
> 1 Is bring up the curve editor, ctrl+f, then click one of the dots at the 1.093 mark, then hit L. that will lock the voltage on that point.
> or 2, I installed the MSI Gaming App application, and that adds a service called GamingApp_Service. This causes a process to run called MSIGamingOSD_x64 and x86.
> Having that run forces the card to stay in 3d mode. So I am idling at +180 core, +700 mem no problem, 1582MHz and 725mV.
> If I disable those services and my card drops back to 2d mode then I cant put more than +300 or so on memory or it dies.
> I saw stuff like launching MS Edge would make AB say im in 3d mode. I guess to quick for voltage to compensate and that would crash with the checkerboard.
> Opening origin = crash, etc.
> So I just let the service run and I'm good.


I've been using the lock, but it kept lowering the voltage anyway in the less demanding server browser.
Lowering the clock for the 1.081V point on the curve seems to have helped, or completely stopped, the voltage dropping to 1.081V; it's been staying at 1.093V in the server browser now, with no crashing so far.
It might still drop to 1.081V at a certain temperature, but so far BF1 at max settings hasn't got things hot enough for that thermal throttle to kick in.

Of course the core clock still fluctuates, unfortunately, but at least that shouldn't ruin stability if the voltage remains constant.

Also, it's pretty crazy that the MSI Gaming App makes the difference between +300 and +700 for your stability, TheGlow.
I doubt I'll be getting any further than +300 or so regardless.
But now I shouldn't be crashing in the server browser anymore, since I'm maintaining the voltage I have during gameplay, which never crashes. In a world that makes sense, that's how it would work.


----------



## Hunched

Quote:


> Originally Posted by *Blackfyre*
> 
> Is it safe to lock the voltage at 1.093v and the clock & memory speeds always running at maximum too? For 24/7 usage that is...
> 
> Temperatures are remaining safe always when idle & at load. But at idle the GPU remains clocked at MAX clock & memory speeds, as well as running at highest voltage. Always. Continuously.
> 
> Check what I mean, this is after a reboot:
> 
> http://i.imgur.com/frq9RLX.png


It's safe, but just make a profile for idle and a locked profile for gaming.
You can attach profiles to keyboard hotkeys and switch between them on the fly without having to bring MSI AB up on the screen.
In settings, under profiles.


----------



## Olper

I've been under the impression that Micron GDDR5 overclocks less than Samsung, but yours is at 9600MHz; Samsung has been said to reach 9300MHz at best.


----------



## Balrogos

Heh, it would be cool if someone replied to my posts. Anyway: I pushed the voltage slider to 100%, the power limit to 114%, and the temp target to 92°C; the maximum stable without artifacts is +85MHz on the core, so 2113.5MHz total. My card has Micron memory, so I will now try to OC that a bit. Let's say I could push the power limit to 125%: is it possible I'd get a better OC on the core, since I can't mess with the volts?


----------



## kevindd992002

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> Always disable V-Sync for G-Sync to work properly; it must be disabled for G-Sync to function as intended. (No more "halving of FPS" whenever FPS is less than the monitor's native refresh rate. In other words, you can still reach 120Hz and 144Hz with V-Sync on, but with it on you lose G-Sync.)
> Rule 1: you can tell NVCP to always use in-game settings; then you disable V-Sync from the in-game settings.
> 
> If, for some reason, you set NVCP not to use V-Sync globally, or for a particular game, the result would most likely be, just as with Adaptive Sync, that the NVCP setting overrides any in-game setting. (This is easy to prove with Adaptive Sync, and I'd guess the same method could be used to prove that the NVCP V-Sync setting overrides the in-game one for the sake of G-Sync working. But you would need a full understanding of V-Sync, Adaptive Sync and G-Sync. All three are tightly related, so it's the best exercise imaginable for mastering all three at once. Worked for me.)
> 
> The only strange behavior over 120Hz that I ever recall (not on this thread) is power draw being too high in 2D (the Windows desktop). I guess it would be easy to catch, because GPU clocks and fan speeds would be higher than normal on the desktop. Still, as a result, my only experience in Windows is at 60Hz, and it shouldn't be, because I've read how much better the desktop looks at 120Hz... but one day I was both paranoid about the power-draw bug and diagnosing a BSOD, so I switched my desktop to 60Hz and never went back. lol
> 
> GL


I'm not sure where you got the information that enabling V-Sync will disable G-Sync; that is simply not true. By default NVCP enables both, and what happens is that if fps > Hz, V-Sync kicks in, but when fps < Hz, G-Sync takes over. It's more like Adaptive V-Sync, but with G-Sync in the mix.

Don't get me wrong, though. I keep V-Sync disabled all the time and would rather use an in-game fps limiter, or just let the fps run high (above the refresh rate), especially for FPS games.

So yeah.


----------



## Olper

Quote:


> Originally Posted by *Balrogos*
> 
> Heh, it would be cool if someone replied to my posts. Anyway: I pushed the voltage slider to 100%, the power limit to 114%, and the temp target to 92°C; the maximum stable without artifacts is +85MHz on the core, so 2113.5MHz total. My card has Micron memory, so I will now try to OC that a bit. Let's say I could push the power limit to 125%: is it possible I'd get a better OC on the core, since I can't mess with the volts?


Which card do you have?
How high does your power usage rise? My 1070 FTW's power limit is 112%, but my card never goes over 90% power, so the limit is never reached and raising it to 122% would do nothing for me.
So if your card's power usage doesn't get close to the power limit, raising the limit does nothing.
2113.5MHz is a decent overclock! I think you can expect your Micron memory to run at ~8800MHz.
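The power-limit reasoning above boils down to a simple check; here is a hypothetical sketch (the function name and the 5% margin are made up for illustration):

```python
def raising_power_limit_helps(observed_power_pct: float,
                              power_limit_pct: float,
                              margin_pct: float = 5.0) -> bool:
    """True only if the card actually runs near its power limit.

    Hypothetical helper: if observed power (as reported by e.g. GPU-Z)
    never approaches the limit, raising the limit changes nothing.
    The 5% margin is an arbitrary illustration value.
    """
    return observed_power_pct >= power_limit_pct - margin_pct

print(raising_power_limit_helps(90, 112))   # False: limit never reached
print(raising_power_limit_helps(110, 112))  # True: power-limited card
```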


----------



## Prozillah

Quote:


> Originally Posted by *Balrogos*
> 
> Heh, it would be cool if someone could reply to my posts anyway. I pushed the voltage slider to 100%, the power limit to 114, and the temp limit to 92C; max stable on the core without artifacts is +85MHz, so 2113.5MHz total. My card has Micron memory, so I will now try to OC it a bit. Let's say I can push the power limit to 125: is it possible I'd get a better OC on the core, since I can't mess with volts?


If you get over 2100 on the core and 9000 on the memory, regardless of Micron or Samsung, you've got yourself a decent card.


----------



## jovanni

Quote:


> Originally Posted by *Olper*
> 
> Which card you have?
> How high does you power usage rise to? My 1070 FTW power limit is 112%, but my card never goes over 90% power. So the limit is never reached and it would do nothing to me if i raise the limit to 122%.
> So if your card does not reach power usage close to power limit, raising the power limit does nothing.
> 2113.5MHz is a decent overclock! I think you can expect your micron memory to run at ~8800MHz.


My card: Gigabyte G1
Core: 2113-2126 (artifacts at 2138)
Memory: 9000 (Micron)
Temp: 59C (92% for benchmarking)
Power limit maxed; goes up to 109% in GPU-Z (the Gigabyte has a single 8-pin power connector)


----------



## Blackfyre

Quote:


> Originally Posted by *Hunched*
> 
> It's safe, but just make a profile for idle and a locked profile for gaming.
> You can attach profiles to keyboard hotkeys and switch between them on the fly without having to bring MSI AB up on the screen.
> In settings, under profiles.


Thanks for the answer. I had my HD 7970 overclocked, always running at max frequencies with locked voltage, for well over two years without any issues whatsoever, until I upgraded to the GTX 1070. I'd heard it's different with NVIDIA cards and that you should never lock the voltage and frequencies, but apparently not: everyone is doing it, so why not. The important thing is that temperatures remain safe.


----------



## Prozillah

Quote:


> Originally Posted by *Blackfyre*
> 
> Thanks for the answer. But yeah I had my HD7970 overclocked and always running @ max frequencies and locked voltage for well over two years without any issues whatsoever. Until I upgraded to the GTX 1070. Which I heard is different with nVidia cards and you should never lock the voltage and frequencies. But apparently not, everyone is doing it. So why not. The important thing is temperatures remain safe.


What's this about not running NVIDIA cards at max clocks and volts 24/7?


----------



## Prozillah

As long as temps are in check, it shouldn't matter.


----------



## Balrogos

OK. In Unigine or games, +85MHz on the core was fine without artifacts, but Fire Strike crashes, so I lowered it to +75MHz. Here is the score (I don't know why the stats show 600MHz for the GPU clock): http://www.3dmark.com/3dm/14640621

In Time Spy I don't see any artifacts when OCing the memory, but the benchmark crashed over time, so I will try to find a stable memory clock; Fire Strike runs +600MHz (9216MHz) without any problems.

Oh, and I have the GTX 1070 Palit Super JetStream (Micron memory), and my power limit in AB goes even higher than 114.


----------



## Olper

Well done!
Your goal should be to find a stable overclock that does not crash in any game or benchmark, not even once a week.

For me, I'm not sure if my 2114 / 9234 MHz overclock is 100% stable, because I have not tested it long enough.


----------



## rfarmer

http://www.3dmark.com/fs/10053568 Well, that seems to be the limit for my NVIDIA FE: 2050/9300. Anything over +185 on the core causes a crash in the second Fire Strike run.


----------



## Prozillah

The shunt mod I just completed improved my max Fire Strike graphics score by a couple hundred points. Max is a 21226 graphics score.


----------



## kevindd992002

Quote:


> Originally Posted by *Prozillah*
> 
> After the shunt mod I just completed improved my max FireStrike graphics by a couple of hundreds points. Max is 21226 graphics score.


Sorry for being illiterate but what is the shunt mod?


----------



## Prozillah

Voltage hack on the 10 series of cards that tricks the card into thinking it's not using as much power as it actually is.


----------



## Olper

Quote:


> Originally Posted by *Prozillah*
> 
> Voltage hack on the 10 series of cards that tricks the card into thinking it's not using as much power as it actually is.


It's not a voltage mod.
A shunt resistor is used to measure current; a GPU typically has 3 shunt resistors. By decreasing the resistance of the shunts, the GPU is fooled into thinking it is not using as much power as it is. The risk here is that you end up running more current through your FETs and chokes than they can handle: a FET will overheat, or a choke will saturate. I would not do this with a 4+1 phase card.
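The shunt math above can be sanity-checked in a few lines. The controller infers current from the voltage drop across a shunt of assumed value, so a lowered shunt scales the reported power by R_new/R_orig (the numbers below are examples, not from any particular card):

```python
def reported_power(actual_power_w: float,
                   r_orig_mohm: float, r_new_mohm: float) -> float:
    """Power the GPU *thinks* it draws after a shunt mod.

    The sensor assumes the original shunt value, so the reading scales
    by r_new / r_orig while the actual draw stays the same (or climbs,
    since the card now boosts harder). Example numbers only.
    """
    return actual_power_w * (r_new_mohm / r_orig_mohm)

# Soldering a 5 mOhm resistor on top of a 5 mOhm shunt gives
# 1 / (1/5 + 1/5) = 2.5 mOhm in parallel, halving the reported power:
r_parallel = 1 / (1 / 5 + 1 / 5)
print(r_parallel)                     # 2.5
print(reported_power(200, 5, 2.5))    # 100.0 -> a real 200 W reads as 100 W
```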


----------



## kevindd992002

Quote:


> Originally Posted by *Prozillah*
> 
> Voltage hack on the 10 series of cards that tricks the card into thinking it's not using as much power as it actually is.


And how do you achieve this?


----------



## Balrogos

Replace the resistor, or use conductive paste/paint and paint over the resistor.


----------



## Olper

Find the shunt resistors and solder additional resistors on top of them. This example is from a GTX 670, but it's the same mod; it's not just for the NVIDIA 10-series.

The big lump marked R005 is a resistor with a value of 5 milliohms. There are two of them stacked on top of each other (in parallel), equaling 2.5 milliohms.
Picture from my case mod thread: here


----------



## gtbtk

Quote:


> Originally Posted by *TheGlow*
> 
> I was wondering how TimeSpy was working because I have a 1070 that's ocing pretty hard and yet it says I'm only better than 74% of all results.
> Doesnt that mean that 26% are all on 1070 slis, 1080, or TitanX's?


Try adjusting your curve so that the card remains stable but the point at 0.975V is pulled up to 2000MHz, and see how that works for you in both Time Spy and Fire Strike.


----------



## gtbtk

Quote:


> Originally Posted by *Olper*
> 
> I've been under the impression that Micron's GDDR5 overclocks less than Samsungs. But yours are at 9600MHz. Samsungs have been said to reach 9300 MHz at best.


The problem is not so much the Micron RAM's ability to OC, but the fact that it doesn't cope with large voltage jumps from the low power-saving voltages to the higher ones when you click the Apply button. I suspect the OC software changes the memory clock faster than it raises the voltage needed to support memory offsets above +500; that is at least what I have observed with Afterburner 4.3 beta 4. If you lock the voltage using the curve before increasing the memory clock, you can OC the memory over +500 without any checkerboard artifacts.

I have not had the chance to try beta 14 yet, but I did post it as a bug some time ago on the Guru3D forum that the developer moderates.


----------



## rfarmer

Quote:


> Originally Posted by *kevindd992002*
> 
> Sorry for being illiterate but what is the shunt mod?


----------



## TheGlow

Quote:


> Originally Posted by *Olper*
> 
> I've been under the impression that Micron's GDDR5 overclocks less than Samsungs. But yours are at 9600MHz. Samsungs have been said to reach 9300 MHz at best.


I know. From everything I'm reading, I shouldn't be getting it this high. It makes me wonder if the sensors are bad or something.
9600 is fine; at 9660 I saw 1 or 2 red flickers/artifacts in Time Spy, and 9720 had more artifacts but didn't crash Time Spy.

Quote:


> Originally Posted by *gtbtk*
> 
> Try adjusting your curve so that the card remains stable but the graph point at .975 volts is pulled up to 2000mhz and see how that works for you in both time spy and firestrike


It looks like that's already the case: 0.975V on my curve is 2062MHz, and it hits 2000 at 0.925V.
That's with only +180 on the core, which is what I've been gaming successfully at. Benching handles +200 so far, and I'm working the kinks out of +210.
The 3DMark demo mode makes me not want to do too many runs in succession.

Quote:


> Originally Posted by *gtbtk*
> 
> The problem is not so much the Micron rams ability to OC but the fact that the Micron ram doesn't cope with large voltage jumps from the low power saving voltages to the higher voltages when you click the apply button. I suspect that the OC software is changing the memory clock faster than it is increasing the voltage to support the memory running at OC speeds higher than +500. That is at least what I have observed with Afterburner 4.3 beta 4. If you lock the voltage using the curve before increasing the memory clock speed, you can OC the memory over 500 without any checkerboard artifacts
> 
> I have not had the chance to try beta 14 as yet but I did post it as a bug some time ago on the guru3d forum that is moderated by the developer


That's exactly what I was seeing. With AB on and no OC, it would idle at 215MHz; just opening MS Edge would kick it into 3D clocks, jumping to 1582, and closing Edge dropped it back to 215MHz.
I apply my +700 to memory and it seems fine, but once I open Edge: instant death.


----------



## Olper

Quote:


> Originally Posted by *rfarmer*


I have to agree, this method is preferable because you can easily reverse it. It won't kill the warranty... it should, but nobody will know you did it.


----------



## Balrogos

Small update: I made my own fan curve, and now I pass all benchmarks without any problems and don't see any artifacts (earlier I was on the auto fan curve).

2113.6MHz core (+85MHz) / 9216MHz memory (Micron, +600MHz)

http://www.3dmark.com/3dm/14644871
http://www.3dmark.com/3dm/14644527


----------



## jamor

With my minimal testing, +150 core yields about 300 points in 3DMark Fire Strike, while +100 memory yields about 100 points (of note, +200 memory only yielded +110 points, but I left it out because it's not stable).

The fps gains seem to be minimal, however.

Memory, per +100:

Combined test: about +0.6 fps; Graphics test 1: +1 fps; Graphics test 2: +0.5 fps; Physics test: +0 fps.

Core, per +150:

Combined test: about +0.6 fps; Graphics test 1: +4 fps; Graphics test 2: +3 fps; Physics test: +0.4 fps.

This was on a 1440p monitor with an i5-2500K @ 4.5GHz and the 1070.


----------



## kevindd992002

Quote:


> Originally Posted by *rfarmer*


Is this something you think will be achievable once BIOS modding tools become available for Pascal?


----------



## rfarmer

Quote:


> Originally Posted by *kevindd992002*
> 
> Is this something that you think can be achievable in the future when BIOS modding tools are available for Pascal already?


That is what I am hoping for. My 970 was not a great OCer (59.4% ASIC and Hynix memory), but with a BIOS mod I was able to sit at 1506/8000 with no PerfCap reason and be absolutely stable in all games and benchmarks.

That is what I want for my 1070: a good everyday OC that I don't have to mess with.


----------



## StrelokAT

My new MSI GTX 1070 Gaming X now runs at 2152 on the core and +385 on the memory (Micron).

I'm satisfied with my new toy; it's a huge upgrade from a GTX 780 to a GTX 1070.

On the weekend I'll mount a Kraken G10 on it, and then temps shouldn't rise above 40-45°C.
Even with the stock cooler the card is very quiet and cool. Wow, I'm impressed. Well done, MSI.


----------



## ogow89

Quote:


> Originally Posted by *StrelokAT*
> 
> My new MSI GTX 1070 Gaming X now runs at 2152 on the core and +385 on the memory (Micron).
> 
> I'm satisfied with my new toy; it's a huge upgrade from a GTX 780 to a GTX 1070.
> 
> On the weekend I'll mount a Kraken G10 on it, and then temps shouldn't rise above 40-45°C.
> Even with the stock cooler the card is very quiet and cool. Wow, I'm impressed. Well done, MSI.


Do some benching and post your results: Fire Strike normal, Heaven at 1080p with all settings maxed, and the Valley extreme preset.


----------



## StrelokAT

In Valley I got this:

Other results are coming soon. I need to install Fire Strike before benching.

Heaven:

Update, Fire Strike:

Update, Time Spy:


----------



## jasjeet

What's the best card edition to buy this time? Do we have a top 3?


----------



## Balrogos

Quote:


> Originally Posted by *StrelokAT*
> 
> In Valley I got this:
> 
> Other results are coming soon. I need to install Fire Strike before benching.
> 
> Heaven:
> 
> Update, Fire Strike:
> 
> Update, Time Spy:


How much did you OC your i7?

I think:

1. Palit
2. MSI
3. EVGA/Zotac


----------



## StrelokAT

My i7 runs at 4.4GHz.

Ah, where can I download Fire Strike? All I see is Time Spy, but no Fire Strike.


----------



## Prozillah

I've noticed something very interesting when doing my benching after the shunt mod (using the liquid metal method). I did two separate tests:

1. Using AB and locking the point at 1.093V: I moved that one point, and that point only, all the way up over 2200MHz (something I was not able to do before). It didn't pass the entire bench, so I dropped it back down to the stable pass point, which was about 2164. With the memory at +600 (Micron), the max graphics score I was able to achieve was 20880.

2. The same again, using AB with the voltage point locked at 1.093V but via the core clock method: I set +105 on the core and simply moved only the last voltage point up a few points to lock the card at 2100MHz, with the same +600 memory OC. Final graphics score: 21216 (as linked).

What I noticed is that even if you have it locked at a certain point, having the rest of your curve closer to stock speeds affects your overall performance. It seems like a bug in the software/firmware, as the numbers don't make any sense at all.

For both of these tests I had the fan speed manually set to 100%.


----------



## TheGlow

Quote:


> Originally Posted by *StrelokAT*
> 
> My i7 runs @4.4ghz
> 
> Ahh, where can i download firestrike. All what i see is TimeSpy, but no Firestrike?


It should be part of the bundle.
Go to Benchmarks and it'll show TIMESPY big at the top, but at the bottom, or if you scroll a bit, it should show Fire Strike.


----------



## Balrogos

Quote:


> Originally Posted by *StrelokAT*
> 
> My i7 runs @4.4ghz
> 
> Ahh, where can i download firestrike. All what i see is TimeSpy, but no Firestrike?


Oh, a six-core.


----------



## StrelokAT

I've had this six-core since December 2011. It's already an old fart.


----------



## Balrogos

But it's still a six-core.

I must buy a new PSU (I think mine is too old now), and then I will try to OC my CPU. Earlier I had an i5 3570K at 4.9GHz (MSI Z77A-G45 mobo): http://valid.canardpc.com/2719842

But I killed that CPU; I scratched the surface of the laminate :|


----------



## jasjeet

Quote:


> Originally Posted by *Balrogos*
> 
> How much did you OC your i7?
> 
> I think:
> 
> 1. Palit
> 2. MSI
> 3. EVGA/Zotac


I don't think I can fit the Palit in the Node 304, so I'll get the MSI Gaming X then.


----------



## StrelokAT

But don't forget, they now use Micron memory, not Samsung!


----------



## TheGlow

Quote:


> Originally Posted by *jasjeet*
> 
> I don't think I can fit the palit in the node 304. Ill get the msi gaming x then.


Yeah, I think I was leaning towards a Zotac. I could have removed the HDD cage, but I have a bunch of drives and didn't want the headache.
The MSI fit my HAF 912 just right, with maybe half a centimeter of space to spare.
As mentioned, mine came with Micron memory, but I've had no problems OCing it; it seems I'm in the minority, though.


----------



## Balrogos

Quote:


> Originally Posted by *StrelokAT*
> 
> In Valley I got this:
> 
> Other results are coming soon. I need to install Fire Strike before benching.
> 
> Heaven:
> 
> Update, Fire Strike:
> 
> Update, Time Spy:


I also ran Valley and Heaven.


----------



## TheDeadCry

Quote:


> Originally Posted by *Olper*
> 
> I have to agree, this method is preferable, because you can easily reverse it. Won't kill warranty... It should, but nobody will know you did it


Psh, all you need is to trace it with pencil lead.

In all seriousness, nobody do that. I did it on an old PowerPC Mac mini years ago to overclock, and it worked... but I wouldn't try it on anything valuable. Just a little anecdote.


----------



## jamor

Quote:


> Originally Posted by *Prozillah*
> 
> Ive noticed something very interesting - when do my benching after the shunt mod (using the liquid metal method) I did two separate tests:
> 
> 1. Using AB and locking the point at 1.093 - I moved that one point and one point only - all the up over 2200 mhz (something I was not able to do before) it didnt pass the entire bench so dropped it back down to the stable pass point which was about 2164. Had the memory clocked at 600+ (Micron) and the max graphics score I was able to achieve was 20 880.
> 
> 2. Same again using AB with volt point locked at 1.093, used the core clock method and set it to 105+ on the core and simply moved only the last volt point up a few points to lock the card at 2100mhz with 600+ memory OC. Final graphics score - 21216 (as linked)
> 
> What I noticed is even you have it locked at a certain point but the rest of your curve remains closer to the stock speeds that affects your overall performance. It seems like a bug with the software/firmware as the numbers dont make any sense at all.
> 
> Both of these tests I had the fan speed manually set to 100%


The same thing was happening to me. My best performance was with 2123MHz core and 8200-8300MHz memory; when I tested with 8800MHz memory, my performance dropped. I don't know what to make of it.


----------



## stoker

Quote:


> Originally Posted by *jamor*
> 
> Same thing was happening to me. My best performance was with 2123mhz core and 8200-8300mhz memory. When I did a test with 8800mhz memory my performance dropped. I don't know what to make of it.


It's the built-in error correction. When you're close to your max overclock you may not see any artifacts, but your core and/or memory is not stable, so the error correction kicks in and lowers your score.

Try lowering the core one step (25MHz) and test.

Then drop the RAM 5MHz at a time and rerun your test to get the best out of it.
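The step-down retest loop described above could be sketched like this (hypothetical: `run_benchmark` stands in for a real scoring run such as Fire Strike, and the fake benchmark only mimics the hidden error-correction penalty):

```python
def tune_down(core_offset: int, mem_offset: int, run_benchmark):
    """Back the core off one 25 MHz step, then walk memory down 5 MHz at
    a time, keeping whichever settings score best.

    run_benchmark(core, mem) is a hypothetical hook returning a score
    (e.g. one Fire Strike graphics run); step sizes follow the post.
    """
    best = (core_offset, mem_offset, run_benchmark(core_offset, mem_offset))
    core = core_offset - 25                             # one core step down
    for mem in range(mem_offset, mem_offset - 30, -5):  # 5 MHz memory steps
        score = run_benchmark(core, mem)
        if score > best[2]:
            best = (core, mem, score)
    return best

# Fake benchmark where error correction penalizes unstable memory clocks:
fake = lambda c, m: c + m - (50 if m > 580 else 0)
print(tune_down(200, 600, fake))  # (175, 580, 755)
```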


----------



## TheGlow

Quote:


> Originally Posted by *stoker*
> 
> Its the inbuilt error correction. When your close to your max overclock you may not even see any artifacts but your core and or memory is not stable so the error correction kicks in lowering your score.
> Try lowering core down one step 25mhz and test
> Then ram 5mhz at a time and rerun your test to get the best out of it


It's this maddening repetition that kills me, since I have the 3DMark demo. That demo takes forever.


----------



## Prozillah

Quote:


> Originally Posted by *TheGlow*
> 
> Its this maddening repetition that kills me since I have 3dmark demo. That demo takes forever.


Ahaha, yes, you're absolutely right.


----------



## Curseair

Does anyone have the slightest clue where I could find RMA centres in Europe for graphics cards? My friend has to RMA a Gigabyte card, but apparently it's nothing but hassle. Is there anywhere I could look?


----------



## TheGlow

Quote:


> Originally Posted by *Curseair*
> 
> Does anyone have a slightest clue where I could find RMA centres in europe for graphics cards? My friend has to RMA a gigabyte card but it's nothing but hassle apparently, Is there anywhere I could look?


I'm not sure about where you are, but for me it's typically the vendor if it's still within the first 30 days or so.
After that I'm not sure; each manufacturer has different policies.


----------



## gtbtk

Quote:


> Originally Posted by *Prozillah*
> 
> Ive noticed something very interesting - when do my benching after the shunt mod (using the liquid metal method) I did two separate tests:
> 
> 1. Using AB and locking the point at 1.093 - I moved that one point and one point only - all the up over 2200 mhz (something I was not able to do before) it didnt pass the entire bench so dropped it back down to the stable pass point which was about 2164. Had the memory clocked at 600+ (Micron) and the max graphics score I was able to achieve was 20 880.
> 
> 2. Same again using AB with volt point locked at 1.093, used the core clock method and set it to 105+ on the core and simply moved only the last volt point up a few points to lock the card at 2100mhz with 600+ memory OC. Final graphics score - 21216 (as linked)
> 
> What I noticed is even you have it locked at a certain point but the rest of your curve remains closer to the stock speeds that affects your overall performance. It seems like a bug with the software/firmware as the numbers dont make any sense at all.
> 
> Both of these tests I had the fan speed manually set to 100%


There is a "video clock" that the curve controls in addition to the core clock, and it has an effect on overclocking results. The video clock frequency is affected by the curve but not reported in Afterburner's graphs.

Through my own experimentation, I discovered that the video clock frequency is adjusted by the curve at voltage values in the range of 0.975-1.0V.

Try your Fire Strike experiment again with the memory as before and the 1.093V point at 2164, but put a bump in the middle of the curve at 0.975V up to 2000MHz. You may need to adjust the 1.093V point to regain stability, but I think you will find a nice increase in your benchmark scores.


----------



## Curseair

Quote:


> Originally Posted by *TheGlow*
> 
> Im not sure by you but for me typically the vendor if its still in the first 30 days or so.
> After that I'm not sure. Each manufacturer has different policies.


Yeah, it's after the first 30 days. I'm trying to find the closest RMA centre to Denmark for any GPU brand.


----------



## syl1979

Quote:


> Originally Posted by *TheGlow*
> 
> Its this maddening repetition that kills me since I have 3dmark demo. That demo takes forever.


For those complaining about the 3DMark demos: just press Alt-F4 on the demo launch screen. It kills the demo and launches the benchmark, and the result is still valid.


----------



## TheGlow

Quote:


> Originally Posted by *syl1979*
> 
> For those complaining about 3dmark demos. You just have to do alt-f4 on the demo launch screen. It will kill it and launch the bench, that will still be valid




I heard about that, and alt-tabbing out, etc., but I think it just killed the whole thing for me. I'll try again later.
I doubt there's a script I could edit so it just runs for 1 second instead.


----------



## Olper

Quote:


> Originally Posted by *TheDeadCry*
> 
> Psh, all you need is to trace it with pencil lead.
> 
> In all seriousness, nobody do that. I did it on an old PowerPC Mac mini years ago to overclock, and it worked... but I wouldn't try it on anything valuable. Just a little anecdote.


You are wrong.
Some volt mods have been done with pencil, where the modded resistor is in the ballpark of 10 kilohms. The pencil trace makes another route of ~100 kilohms and thus raises the voltage by roughly 10%. You really have no control here: you try, measure the voltage, and try again.

The GPU shunt resistors I have seen are 2 to 5 milliohms. A ~100 kilohm route in parallel with this does nothing; not by a single permille.
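The resistor arithmetic above can be checked with plain parallel-resistance math (the 10 kΩ, 100 kΩ, and 2.5 mΩ values come from the post):

```python
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

# Pencil volt mod: a ~100 kOhm graphite trace across a ~10 kOhm feedback
# resistor lowers it by ~9%, nudging the regulated voltage up ~10%:
print(parallel(10_000, 100_000))  # ~9090.9 ohms

# The same ~100 kOhm trace across a 2.5 mOhm shunt changes essentially
# nothing -- the fractional change is on the order of 1e-8:
r = parallel(0.0025, 100_000)
print((0.0025 - r) / 0.0025)
```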


----------



## muzammil84

So I just got my 1070. It's the Inno3D iChill X4; I have read almost the entire thread, and not a single person has had this card before? It's supposed to cool very well; if not, I'll just slap a water block on it. It's a reference PCB too, so many blocks are available.
I haven't tried it yet; I will report back once I've done some testing.


----------



## Balrogos

I watched some tests, and this card is noisy compared to other products; the little fan doesn't help at all, but it looks nice.


----------



## muzammil84

Quote:


> Originally Posted by *Balrogos*
> 
> I watched some tests, and this card is noisy compared to other products; the little fan doesn't help at all, but it looks nice.


I've seen some reviews too, and it's on par with the Strix and AMP Extreme in terms of noise; only the Palit JetStream is quieter, but I'll find out soon. If it's too loud or hot, I'll get a block for it. It also has quite a high clock out of the box, so I wonder how it will overclock.
I was just a bit surprised no one owned this card and posted here, as the price is not bad and the cooling is better than on many more popular cards.


----------



## asdkj1740

Quote:


> Originally Posted by *muzammil84*
> 
> so I just got my 1070. It's Inno3d iChill x4, I have read almost entire thread and not a single person ever had this card before? it's suppose to cool very well, if not I'll just slam water block on it. It's a reference pcb too so many blocks available.
> I haven't tried it yet, will report back once done some testing.


The Inno3D iChill AirBoss cooler used to be the best stock cooler on the market; very outstanding, in the past.
The Pascal iChills just suck, really suck. I suspect it's because the exhaust vents on the top and bottom are almost blocked, since the Pascal iChill heatsinks are almost the same as the Maxwell ones.

Yeah, RGB: the reason the exhaust vents are blocked is the RGB showcasing.


----------



## QxY

Just noticed my Zotac AMP has started to boost up to 2000MHz on its own as the weather starts to cool down a bit. The highest I'd seen before was around 1970... I wonder if it'll boost any higher in winter.


----------



## Chaoz

Quote:


> Originally Posted by *QxY*
> 
> Just noticed my Zotac AMP has started to boost up to 2000mhz on it's own as the weather is starting to cool down a bit. Highest I've seen it before was around 1970...I wonder if it'll boost any higher in winter.


Mine does the same; I just noticed in BF1 that it reaches up to 2000MHz. Mine's an EVGA SC. This card is so quiet and cool: you can barely hear the fan spinning, and max temps are 62°C while gaming for hours on end.


----------



## muzammil84

Quote:


> Originally Posted by *asdkj1740*
> 
> inno3d ichill airboss cooler was the best stock cooler in the market. very outstanding, in the past.
> the pascal ichills are just suck, really suck. i am curious it is all about the exhaust vent on the top and bottom are almost blocked. as the pascal ichill heatsinks are almost the same as maxwell ones.
> 
> yeah, rgb, the reason why the exhaust vents are block is for the rgb showcasing.


You've got me worried now.

Is it bad in terms of heat dissipation and high temps, or noisy fans?

From the reviews and tests, the iChill was one of the coolest-running cards, so I'm confused now.


----------



## jakoBTR

GTX 1070, 3570K, and 8GB RAM. BF1, 1 hour of gameplay.


----------



## Chaoz

Quote:


> Originally Posted by *jakoBTR*
> 
> 
> 
> 
> 
> gtx 1070 , 3570K and 8gb ram . bf1 1 hour of game play


Not bad. Quite low RAM usage as well; mine goes all the way up to 12 GB on Ultra settings at 100 fps.


----------



## mbm

It seems you have locked your frames to 60... what is the purpose of the video then?


----------



## Balrogos

Quote:


> Originally Posted by *muzammil84*
> 
> You've got me worried now.
> 
> Is it bad in terms of heat dissipation and high temps, or noisy fans?
> 
> From the reviews and tests, the iChill was one of the coolest-running cards, so I'm confused now.


Gigabyte G1 Gaming - 46.8 dB - 68C - 2240 RPM
Inno3D iChill X4 - 41.8 dB - 67C - 1300 RPM
MSI Armor - 41.6 dB - 71C - 1600 RPM
ASUS ROG Strix - 40.5 dB - 65C - 1660 RPM
Zotac AMP Extreme - 38.3 dB - 65C - 1260 RPM
Palit GameRock - 35.8 dB - 69C - 990 RPM
MSI Gaming X - 35.6 dB - 70C - 1350 RPM
Palit JetStream - 34.5 dB - 68C - 920 RPM

And after OC, the quietest cards are the Palits and the MSI Gaming X, not exceeding 37.5 dB; all the other cards are above 41 dB, with the Gigabyte G1 Gaming the noisiest at 48.6 dB.


----------



## jamor

Quote:


> Originally Posted by *mbm*
> 
> seems you have locked your frames to 60.. what is the purpose of the video then?


Maybe to avoid fps drops? A stable 60 fps is important on a 60Hz monitor; dip down to 40 under heavy load and you're going to have a bad time.


----------



## Derpinheimer

Quote:


> Originally Posted by *gtbtk*
> 
> There is a "Video clock" that the curve controls in addition to the core clock and it has an effect on the overclock performance results. The Video clock frequency is effected by but not reported by AfterBurner in the graphs.
> 
> Through my own experimentation, I discovered that the frequency of the video clock is adjusted by the curve at the voltage values in the range of 0.975 - 1.0v.
> 
> Try your firestrike experiment again with memory as before, the 1.093v point at 2164 and put a bump in the middle of the curve at .975v up to 2000hz. You may need to adjust the 1.093 point to get stability again but I think that you will find a nice increase in your benchmark scores.


Interesting finding! How come they hide this from us if the software clearly affects it?


----------



## jovanni

I was trying to find something, I don't remember what, and fell into a post from a guy asking which card to choose: a Zotac AMP! Extreme 980 Ti or a third-party 1070. Almost all of the replies shouted that the overclocked 980 Ti is faster than the overclocked 1070. So I was misguided by most tech sites, because in all the reviews I read they compared a non-reference 1070 with a reference 980 Ti... and guess what, they found the 1070 faster.


----------



## pez

Quote:


> Originally Posted by *mbm*
> 
> seems you have locked your frames to 60.. what is the purpose of the video then?


Quote:


> Originally Posted by *jamor*
> 
> Maybe to not get any fps drops? 60hz stable is important on a 60hz monitor. Dip down to 40 in heavy use and you're going to have a bad time.


Most likely this. <60 FPS can be a nightmare on just about any game that's fast-paced.
Quote:


> Originally Posted by *jovanni*
> 
> I was trying to find something, I don't remember what, and fell into a post from a guy asking which card to choose: a Zotac AMP! Extreme 980 Ti or a third-party 1070. Almost all of the replies shouted that the overclocked 980 Ti is faster than the overclocked 1070. So I was misguided by most tech sites, because in all the reviews I read they compared a non-reference 1070 with a reference 980 Ti... and guess what, they found the 1070 faster.


Seems like you fell victim to users trying to justify their late 980 Ti purchases. Both cards are respectable, and performance isn't so different that you should feel the need to upgrade. Although, I could understand feeling a bit bitter if you had invested in a 980 Ti at a price point of $500+ shortly before or after the 10-series release.


----------



## jamor

Quote:


> Originally Posted by *jovanni*
> 
> I was trying to find something, I don't remember what, and fell into a post from a guy asking which card to choose: a Zotac AMP! Extreme 980 Ti or a third-party 1070. Almost all of the replies shouted that the overclocked 980 Ti is faster than the overclocked 1070. So I was misguided by most tech sites, because in all the reviews I read they compared a non-reference 1070 with a reference 980 Ti... and guess what, they found the 1070 faster.


http://www.overclock.net/t/1601896/overclockersclub-overclock-showdown-gtx-980ti-vs-gtx-1070-vs-gtx-1080

They are close. 1070 slightly faster. 1070 also uses less power so that's a plus.


----------



## jovanni

Quote:


> Originally Posted by *jamor*
> 
> http://www.overclock.net/t/1601896/overclockersclub-overclock-showdown-gtx-980ti-vs-gtx-1070-vs-gtx-1080
> 
> They are close. 1070 slightly faster. 1070 also uses less power so that's a plus.


Of course there are many positives for the 1070 vs the 980 Ti, but in a comparison of raw power (FPS vs FPS, let's say) I found myself a little bit misguided.


----------



## jovanni

Quote:


> Originally Posted by *pez*
> 
> Most likely this. <60 FPS can be a nightmare on just about any game that's fast-paced.
> Seems like you fell victim to the users trying to justify their late 980Ti purchases. Both cards are respectable and performance isn't so different that you should feel the need to upgrade. Although, I could understand feeling a bit bitter if you had invested in a 980Ti at a price-point of $500+ shortly before or not long after the 10-series release.


I am a 1070 owner, and my choice was a result of reviews showing the 1070 in front of the 980 Ti.....


----------



## jamor

Quote:


> Originally Posted by *jovanni*
> 
> I am a 1070 owner, and my choice was a result of reviews showing the 1070 in front of the 980 Ti.....


980ti is more expensive. So what's the problem? If you upgraded from a 980ti then that was really stupid. Otherwise I don't see the problem.


----------



## jamor

So I was looking at the Zotacs because everyone was talking them up.. and holy crap those things are HUGE!!! I'd have to mod my case for that guy.


----------



## jovanni

Quote:


> Originally Posted by *jamor*
> 
> 980ti is more expensive. So what's the problem? If you upgraded from a 980ti then that was really stupid. Otherwise I don't see the problem.


Had 2x 970 in SLI. I want to move away from SLI, too much for too little.....had 2x 670, had 2x 570, etc.....so I chose the 1070....and I am pretty satisfied with that.....
The choice was based on web reviews (like Guru3D).


----------



## asdkj1740

Quote:


> Originally Posted by *muzammil84*
> 
> you got me worried now
> 
> 
> 
> 
> 
> 
> 
> 
> is it bad in terms of heat dissipation and high temps or noisy fans?
> 
> from the reviews and tests that ichill was one of the coolest running cards so I'm confused now.


The hot air can't get out easily through the exhaust vents, as the shroud blocks a lot of the hot air from exhausting....

Be wary of reviews done on an open test bench; do not take those results for granted.
It's pretty clear the Pascal iChill is no longer as dominant as before, when it led the others by 5~10°C.

It is still fine at 70~80°C, as the clock will only drop around 70MHz from the instant max boost you see at idle temps; that means almost nothing in actual gameplay.


----------



## rfarmer

Quote:


> Originally Posted by *muzammil84*
> 
> so I just got my 1070. It's Inno3d iChill x4, I have read almost entire thread and not a single person ever had this card before? it's suppose to cool very well, if not I'll just slam water block on it. It's a reference pcb too so many blocks available.
> I haven't tried it yet, will report back once done some testing.


Put a block on it, max temps I get in benchmarks are 38C and 42C during extended gaming on high end games. You will be much happier with the temps and noise over any air cooled solution.


----------



## TheGlow

Quote:


> Originally Posted by *pez*
> 
> Most likely this. <60 FPS can be a nightmare on just about any game that's fast-paced.


Yea, but it looks like it's still 1080p, so the 1070 shouldn't be having a problem.
I'm on max settings at 1440p and it's only jumping around 90-120, I think dipping as low as 70.
Quote:


> Originally Posted by *rfarmer*
> 
> Put a block on it, max temps I get in benchmarks are 38C and 42C during extended gaming on high end games. You will be much happier with the temps and noise over any air cooled solution.


I've never looked into changing the cooling on a card before. What options/manufacturers are there?
Any benefit vs getting one of the models that already has it?


----------



## gtbtk

Quote:


> Originally Posted by *Derpinheimer*
> 
> interesting finding! How come they hide this slider from us if it's clearly affected by the software


I think Pascal, by introducing GPU Boost 3.0 and the associated curve-adjustment function, has provided the first opportunity we have had to target different parts of the card's performance range at different voltage levels.

Afterburner, Precision XOC, GPU Tweak II, etc. have been updated to support the new features offered by the architecture via a curve, but the support is pretty manual and rudimentary and not overly user friendly.

None of the overclock utilities have been re-engineered from the ground up to make best use of the new functionality by providing a user-friendly front end for the newly available adjustments.

I'm sure that will come, but developing software from the ground up requires understanding exactly how the new architecture works, the inclination to rewrite something that is probably good enough at this point in time, the manpower to write the new software, and a financial budget. I don't think all of those things have lined up yet.
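To make the curve idea concrete, here's a minimal sketch of how a per-point offset over a GPU Boost 3.0 style voltage/frequency curve might be modeled. All the numbers (curve points, offsets, clamp) are made up for illustration; real tools read the actual curve from the driver:

```python
# Illustrative GPU Boost 3.0 style voltage/frequency curve.
# All numbers are invented for the sketch, not read from any real card.
BOOST_CLAMP_MHZ = 2100  # hypothetical upper clock limit

# (voltage in mV, stock boost frequency in MHz) pairs
stock_curve = [(800, 1500), (900, 1700), (1000, 1900), (1093, 2000)]

def apply_offsets(curve, offsets):
    """Apply a per-point MHz offset to each V/F point, clamped to the limit."""
    return [(v, min(f + off, BOOST_CLAMP_MHZ))
            for (v, f), off in zip(curve, offsets)]

# A flat offset shifts the whole curve (what the classic core-clock
# slider does); a shaped offset can push only the low-voltage points
# and leave the top of the curve conservative.
flat = apply_offsets(stock_curve, [100, 100, 100, 100])
shaped = apply_offsets(stock_curve, [150, 120, 80, 0])
print(flat)
print(shaped)
```

The per-point editing is exactly what the current tools expose only through a fiddly manual curve window; a friendlier front end would just generate the offset list for you.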


----------



## rfarmer

Quote:


> Originally Posted by *TheGlow*
> 
> Yea, but it looks like still 1080p, so the 1070 shouldnt be having a problem.
> Im on max settings, 1440p and Its only jumping around 90-120, i think dipping as low as 70.
> I've never looked into changing the cooling on a card before. What options/manufacturer's are there?
> Any benefit vs getting one of the models that already has it?


http://www.hitzestau.com/waterblocks-for-the-nvidia-gtx-1080/ This site lists most of the blocks currently available for the 1080/1070 reference design. EK has several available for non-reference cards.

I would actually have gone with the MSI Sea Hawk EK X https://us.msi.com/Graphics-card/GeForce-GTX-1070-SEA-HAWK-EK-X.html#hero-overview except at 6.50" wide it won't fit in my case. But price for the card with included block is very good. I went with Aquacomputer because I liked the look, build quality and I wanted to try the active cooling backplate.


----------



## muzammil84

Quote:


> Originally Posted by *rfarmer*
> 
> Put a block on it, max temps I get in benchmarks are 38C and 42C during extended gaming on high end games. You will be much happier with the temps and noise over any air cooled solution.


I've been watercooling my GPUs for a couple of years; it's just that I like the look of this card and thought the newest cards would have very good, low-noise coolers. I also wanted to keep my tubing as hidden as possible, routing only two runs to the CPU block, with all the tubing inside the case, not visible. But I will probably end up getting a Heatkiller IV for this GPU, as 38°C looks much better than 65-ish.


----------



## toyz72

i had a quick question concerning the gigabyte gtx 1070 mini. i run itx,and really been thinking of picking this card up. is there anything this card is lacking compared to full size cards?


----------



## gtbtk

Quote:


> Originally Posted by *toyz72*
> 
> i had a quick question concerning the gigabyte gtx 1070 mini. i run itx,and really been thinking of picking this card up. is there anything this card is lacking compared to full size cards?


Size?? ;-)

I have not used this model; however, being a single-fan card, you may find that it runs a little hotter than the larger dual- or triple-fan cards with bigger heatsinks. With the way GPU Boost 3.0 works, higher temps will reduce absolute performance, as the card clocks back as temperatures rise, but the Founders Edition cards have similar challenges and they work fine.

The Mini is not a card designed for absolute overclock performance. If your case has good airflow, you should be fine, as long as you manage your own expectations.


----------



## rfarmer

Quote:


> Originally Posted by *muzammil84*
> 
> I've been watercooling my gpus for couple of years, it just I like the look of this card and thought that newest cards would have very good coolers with low noise, also I wanted to keep my tubing as hidden as possible, routing only two to the cpu block and all the tubing would be inside the case, not visible. But I will probably end up getting Heatkiller IV for this gpu as 38°C looks much better than 65ish


The Heatkiller was on my short list, no backplate available when I ordered mine but that is a nice looking block.


----------



## owikhan

Quote:


> Originally Posted by *jamor*
> 
> So I was looking at the Zotacs because everyone was talking them up.. and holy crap those things are HUGE!!!. I'd have to mod my case for that guy.


Close your eyes and buy the ZOTAC AMP Extreme Edition. I am lucky; I have the Corsair 750D high-airflow edition and it fits in that case easily. :thumb:


----------



## MindBlank

Is anyone else getting minute performance gains from overclocking?

This is where I'm at: 1962MHz usual boost out of the box (Gigabyte G1 Gaming, Samsung memory). So I overclock to my highest stable 2101MHz, 1.05v, 2200MHz memory (will not go higher for memory - so, yeah - "Samsung OCs very well" - not in my case it doesn't).

This should yield a good performance boost, but I am seeing an average 3fps increase across 3 games so far. I'm running an ultrawide 2560x1080 monitor and things are cranked up to ultra for all games, so there is a good load on the GPU.

For Witcher 3 I'm getting 70 instead of 68fps in multiple scenes. For Rise of the Tomb Raider I'm getting 67 instead of 65. I also tested BF1 and it goes from about 110 to 113 fps, or with 100% resolution scale it goes from about 40 to 41 fps...

Something's not right. I'm running this on a 4790k which is currently OC'd to 4.9GHz, paired with fast 2400MHz RAM. I was expecting much more. I also have a GTX 1060, and going from 1980MHz and stock VRAM to 2100MHz and +350MHz VRAM made that card show awesome gains, averages of 10 fps across games. In some instances it shoots up by 14-15 fps. Not 2 fps like the 1070...

I am using the AB overlay and keeping an eye on the clocks - they are pretty much locked 2101MHz and 2200MHz VRAM. TDP is around 106% from the max 111%, temps are 63C load.

Any ideas on this?
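For what it's worth, those numbers are roughly what simple math predicts: 1962 to 2101 MHz is only a ~7% core bump, and a frame is never 100% core-bound. A back-of-the-envelope estimate (the GPU-bound fraction below is a made-up illustration, not a measurement):

```python
def expected_fps(base_fps, base_clk_mhz, oc_clk_mhz, gpu_bound_frac):
    """Amdahl-style estimate: only the core-bound fraction of frame time
    speeds up with the core clock; the rest (CPU, memory, engine) is fixed."""
    speedup = oc_clk_mhz / base_clk_mhz
    base_frame_ms = 1000.0 / base_fps
    new_frame_ms = base_frame_ms * (gpu_bound_frac / speedup
                                    + (1.0 - gpu_bound_frac))
    return 1000.0 / new_frame_ms

# Witcher 3 numbers from the post: 68 fps at 1962 MHz, OC to 2101 MHz.
print(round(expected_fps(68, 1962, 2101, 1.0), 1))  # fully core-bound
print(round(expected_fps(68, 1962, 2101, 0.5), 1))  # half core-bound
```

With even half the frame time spent outside the core, a 7% core overclock predicts roughly the 68 to 70 fps jump the post reports, so the card may be behaving normally.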


----------



## TheGlow

Quote:


> Originally Posted by *MindBlank*
> 
> Is anyone else getting minute performance gains from overclocking?
> 
> This is where I'm at: 1962MHz usual boost out of the box (Gigabyte G1 Gaming, Samsung memory). So I overclock to my highest stable 2101MHz, 1.05v, 2200MHz memory (will not go higher for memory - so, yeah - "Samsung OC's very well" - not in my case it doesn't).
> 
> This should yield a good performance boost, but I am seeing an average of 3fps increase throughout 3 games so far. I'm running a ultrawide 2560x1080 monitor and things are cranked up to ultra for all games, so there is a good load on the GPU.
> 
> For Witcher 3 I'm getting 70 instead of 68fps in multiple scenes. For Rise of the Tomb Raider I'm getting 67 instead of 65. And also tested BF1 and it goes from like 110 to 113 fps, or with 100% resolution scale it goes from like 40 to 41 fps...
> 
> Something's not right. I'm running this on a 4790k which is currently OC'd to 4.9GHz, paired with fast 2400MhZ RAM. I was expecting much more. I also have a GTX 1060 and going from 1980MHz and stock VRAM to 2100MHz and +350MHz VRAM made the card show awesome gains, averages of 10 fps throughout games. Some instances it shoots up by 14-15 fps. Not 2 fps like the 1070...
> 
> I am using the AB overlay and keeping an eye on the clocks - they are pretty much locked 2101MHz and 2200MHz VRAM. TDP is around 106% from the max 111%, temps are 63C load.
> 
> Any ideas on this?


For BF1 there's something odd about the scaling. I believe 42% = your monitor resolution, so 100% is downscaling from 4K or some crap.
In Witcher 3 I have everything but HairWorks on; at 1440p I get 75-85fps usually. I forgot to check HairWorks' impact after I found some stable settings.
I'm not sure about the memory. I have Micron and managed to get it up to 9600 (+800 in AB), but there are definitely some odd issues.
I would instantly crash often if I was in 2D mode. Installing the MSI Gaming App also installs a service that seems to lock the card into 3D clocks,
so it never idles under 1582MHz on the core. This lets me put up to about +850 on memory before it dies on me.
Without that service, it dies at +400. Even just sitting on the desktop and launching apps will kill it.
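On the resolution-scale point: in most engines the slider is a per-axis multiplier on the output resolution, so the shading work grows with the square of the scale. A quick sketch under that per-axis assumption (BF1's exact mapping may differ):

```python
def rendered_resolution(native_w, native_h, scale_pct):
    """Per-axis resolution scale: 100% renders at the native output
    resolution; the rendered pixel count grows with the square of it."""
    s = scale_pct / 100.0
    return round(native_w * s), round(native_h * s)

def pixel_ratio(scale_pct):
    """How much more (or less) shading work vs. native."""
    return (scale_pct / 100.0) ** 2

print(rendered_resolution(2560, 1440, 100))  # native 1440p
print(rendered_resolution(2560, 1440, 150))  # a 4K-sized pixel load
print(pixel_ratio(150))                      # 2.25x the pixels
print(pixel_ratio(50))                       # 0.25x the pixels
```

Whatever the game's baseline is, pushing the slider up doesn't add cost linearly: 150% is already 2.25x the pixels, which is why framerates fall off so hard at high scale settings.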


----------



## jamor

Quote:


> Originally Posted by *MindBlank*
> 
> Is anyone else getting minute performance gains from overclocking?
> 
> This is where I'm at: 1962MHz usual boost out of the box (Gigabyte G1 Gaming, Samsung memory). So I overclock to my highest stable 2101MHz, 1.05v, 2200MHz memory (will not go higher for memory - so, yeah - "Samsung OC's very well" - not in my case it doesn't).
> 
> This should yield a good performance boost, but I am seeing an average of 3fps increase throughout 3 games so far. I'm running a ultrawide 2560x1080 monitor and things are cranked up to ultra for all games, so there is a good load on the GPU.
> 
> For Witcher 3 I'm getting 70 instead of 68fps in multiple scenes. For Rise of the Tomb Raider I'm getting 67 instead of 65. And also tested BF1 and it goes from like 110 to 113 fps, or with 100% resolution scale it goes from like 40 to 41 fps...
> 
> Something's not right. I'm running this on a 4790k which is currently OC'd to 4.9GHz, paired with fast 2400MhZ RAM. I was expecting much more. I also have a GTX 1060 and going from 1980MHz and stock VRAM to 2100MHz and +350MHz VRAM made the card show awesome gains, averages of 10 fps throughout games. Some instances it shoots up by 14-15 fps. Not 2 fps like the 1070...
> 
> I am using the AB overlay and keeping an eye on the clocks - they are pretty much locked 2101MHz and 2200MHz VRAM. TDP is around 106% from the max 111%, temps are 63C load.
> 
> Any ideas on this?


Yea, generally it's only a few FPS unless you have a really, really good overclocker, like 2150MHz / 9600MHz. If it was any better, then you'd have a 1080!


----------



## StrelokAT

Now I finished my Fire Strike benchmark. Go to post #3036 and there you'll see my results.
In Valley the points are not high. Not even above 4000. Strange??


----------



## jamor

Quote:


> Originally Posted by *StrelokAT*
> 
> Now i finished my Firestrike benchmark. Get to post #3036 and there you see my results.
> In Valley the points are not high. Not even above 4000. Strange??


Idk, I'm benchmarking at 1440p so I can't compare. But if you're getting errors because you OC'd too high, then your performance will suffer.


----------



## StrelokAT

No no, everything's all right. No errors because of too much OC, and I know the performance will suffer.
I even benched Valley at stock clocks and the result is only a score of 3763.

Update: now I finally got 4053 points in Valley


----------



## Balrogos

Quote:


> Originally Posted by *StrelokAT*
> 
> No no, everything's all right. No errors because of too much OC, and I know the performance will suffer.
> I even benched Valley at stock clocks and the result is only a score of 3763.
> 
> Update: now I finally got 4053 points in Valley


Did you check my earlier results?


----------



## Prozillah

Quote:


> Originally Posted by *MindBlank*
> 
> Is anyone else getting minute performance gains from overclocking?
> 
> This is where I'm at: 1962MHz usual boost out of the box (Gigabyte G1 Gaming, Samsung memory). So I overclock to my highest stable 2101MHz, 1.05v, 2200MHz memory (will not go higher for memory - so, yeah - "Samsung OC's very well" - not in my case it doesn't).
> 
> This should yield a good performance boost, but I am seeing an average of 3fps increase throughout 3 games so far. I'm running a ultrawide 2560x1080 monitor and things are cranked up to ultra for all games, so there is a good load on the GPU.
> 
> For Witcher 3 I'm getting 70 instead of 68fps in multiple scenes. For Rise of the Tomb Raider I'm getting 67 instead of 65. And also tested BF1 and it goes from like 110 to 113 fps, or with 100% resolution scale it goes from like 40 to 41 fps...
> 
> Something's not right. I'm running this on a 4790k which is currently OC'd to 4.9GHz, paired with fast 2400MhZ RAM. I was expecting much more. I also have a GTX 1060 and going from 1980MHz and stock VRAM to 2100MHz and +350MHz VRAM made the card show awesome gains, averages of 10 fps throughout games. Some instances it shoots up by 14-15 fps. Not 2 fps like the 1070...
> 
> I am using the AB overlay and keeping an eye on the clocks - they are pretty much locked 2101MHz and 2200MHz VRAM. TDP is around 106% from the max 111%, temps are 63C load.
> 
> Any ideas on this?


Yea, something could be up. I have the same card and a similar setup, and I gain on avg 10-20 fps depending on the game. What are your temps?


----------



## pez

Quote:


> Originally Posted by *jovanni*
> 
> I am a 1070 owner and my choise was a result of reviews showing the 1070 in front of the 980ti.....


Quote:


> Originally Posted by *jamor*
> 
> 980ti is more expensive. So what's the problem? If you upgraded from a 980ti then that was really stupid. Otherwise I don't see the problem.


Yeah, I got the impression from the initial post that you bought a 980Ti or were somehow dissatisfied with 980Ti vs 1070 performance.


----------



## MindBlank

Quote:


> Originally Posted by *Prozillah*
> 
> Yea something could be up i have the same card and similar set up and I gain on avg 10 -20 fps depending on the game. What are your temps?


63C on load...


----------



## Prozillah

Quote:


> Originally Posted by *MindBlank*
> 
> 63C on load...


Hermm, possibly try adjusting your fan curve to be a bit more aggressive; something closer to 60-70% at 55C I would suggest as a start. And make sure your power slider is maxed to the right. Lock the voltage point at 1.093v and run 50MHz less OC than your max memory OC.
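For anyone building such a curve by hand, here's a minimal sketch of the piecewise-linear interpolation that Afterburner-style fan curves do between points. The curve points are illustrative, not a recommendation for any specific card:

```python
# Illustrative (temp C, fan %) points; not a recommendation for any card.
CURVE = [(30, 30), (55, 65), (70, 85), (80, 100)]

def fan_percent(temp_c, curve=CURVE):
    """Piecewise-linear interpolation between fan-curve points,
    clamped below the first and above the last point."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            frac = (temp_c - t0) / (t1 - t0)
            return f0 + frac * (f1 - f0)

print(fan_percent(55))  # 65.0
print(fan_percent(63))  # somewhere between 65 and 85
print(fan_percent(90))  # clamped at 100
```

The suggestion above amounts to placing a point around (55, 65) and letting the tool interpolate on either side of it.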


----------



## wrathofbill

Quote:


> Originally Posted by *TheGlow*
> 
> Yea, but it looks like still 1080p, so the 1070 shouldnt be having a problem.
> Im on max settings, 1440p and Its only jumping around 90-120, i think dipping as low as 70.
> I've never looked into changing the cooling on a card before. What options/manufacturer's are there?
> Any benefit vs getting one of the models that already has it?


I changed my MSI FE card into this with a kit from EVGA, pictured below. But I would prefer the new FTW Hybrid 1070 that they are releasing over mine. One benefit is it would have been cheaper.


----------



## KGPrime

Quote:


> Originally Posted by *MindBlank*
> 
> Is anyone else getting minute performance gains from overclocking?
> 
> This is where I'm at: 1962MHz usual boost out of the box (Gigabyte G1 Gaming, Samsung memory). So I overclock to my highest stable 2101MHz, 1.05v, 2200MHz memory (will not go higher for memory - so, yeah - "Samsung OC's very well" - not in my case it doesn't).
> 
> This should yield a good performance boost, but I am seeing an average of 3fps increase throughout 3 games so far. I'm running a ultrawide 2560x1080 monitor and things are cranked up to ultra for all games, so there is a good load on the GPU.
> 
> For Witcher 3 I'm getting 70 instead of 68fps in multiple scenes. For Rise of the Tomb Raider I'm getting 67 instead of 65. And also tested BF1 and it goes from like 110 to 113 fps, or with 100% resolution scale it goes from like 40 to 41 fps...
> 
> Something's not right. I'm running this on a 4790k which is currently OC'd to 4.9GHz, paired with fast 2400MhZ RAM. I was expecting much more. I also have a GTX 1060 and going from 1980MHz and stock VRAM to 2100MHz and +350MHz VRAM made the card show awesome gains, averages of 10 fps throughout games. Some instances it shoots up by 14-15 fps. Not 2 fps like the 1070...
> 
> I am using the AB overlay and keeping an eye on the clocks - they are pretty much locked 2101MHz and 2200MHz VRAM. TDP is around 106% from the max 111%, temps are 63C load.
> 
> Any ideas on this?


Yes, overclocking video cards is mostly pointless save for some games, and that has always been the case. You can go read every review of every GeForce released and overclocked by reputable tech sites, and in general 4-6 fps, 10 at the exceptional best, is about right for the gains, save for some games that are heavily GPU-dependent and take overclocks well; that is not all or even most of them. From the GeForce 256 to the 1080 in 2016, all the reviews are out there in black and white, with all the overclocking results etc. Anyone can read them. I have, and what I'm saying is based on that and on owning and overclocking the cards.

With Pascal, thus far at least, it looks like these cards are never going to perform much better than stock out of the box, and there doesn't seem to be any real reason to bother: not with water, not with dual power connectors, nothing. As far as I can tell, the extra 6-pin connector on my card runs the LED light, lol. I bought the plain non-X MSI 1070 Gaming Twin Frozr card and it boosted to 1850+ MHz stock right out of the box; another 100MHz of stock boost for the "X" version or whatever is quite literally pointless, as overclocking mine +160MHz changed nothing about my gameplay experience in any game I played at those clocks. I tested it in a bunch of games at the highest it would clock on stock voltage without crashing, 2050MHz, in games, not useless benchmarks, and it basically performed the same as stock for all that matters. I was getting 180fps in Aliens and actually just throttled the framerate to 120fps, or I got 40 fps in Skyrim with ENB and all that crap, and it changed nothing. Either way, the card has never yet gone above 45C, and my fan speed has never once risen above 40%. And this card almost literally split my case in half; it's so wide I had to bend my HDD cage and smash the door back on. Not only did my CPU temps not change at all, my GPU temps are lower than I can recall in the last decade or so. This card runs stupid cool, the Twin Frozr anyway. I would have gotten the Aero to save 40 bucks if it had been available at the time, and I imagine temps still would not be an issue. I make a custom fan profile in Riva first thing upon installing, as protocol, and I have been using Riva for nearly 20 years; this card has never utilized that curve once, never went above 40% fan speed, not at stock boost, not at a forced 2050MHz while gaming all night.


----------



## NFSxperts

I'm looking to get a cheap 1070. I've narrowed it down to these choices (ordered from cheapest to most expensive):
The Zotac AMP is out of my budget. Right now I'm leaning towards the Galax or Inno3D since they both have backplates.
Any thoughts?
Not in the US, and buying soon.


**Gigabyte GeForce GTX 1070 WINDFORCE OC**
Advantages: 8-pin only; semi-passive fans; highest stock boost clocks
Disadvantages: no backplate; 90mm fans; restrictive back I/O

**Inno3D GeForce GTX 1070 TWIN X2**
Advantages: 8-pin only; backplate; 3DMark and VRMark key
Disadvantages: lowest stock boost clock; 2-year warranty

**GALAX GeForce® GTX 1070 EX**
Advantages: 100mm fans; metal shroud; backplate?
Disadvantages: 8-pin + 6-pin; stupid fan LEDs

**MSI GEFORCE® GTX 1070 ARMOR 8G**
Advantages: 8-pin only; 3-year warranty
Disadvantages: no backplate

----------



## BroPhilip

So Guys, I have a question....

I purchased the MSI 1070 Gaming Z after having to RMA an Asus Strix non-OC for stability issues. I went with the Z model hoping for a better-binned chip. However, with the MSI Gaming App (I know it is garbage lol) on OC mode I am only getting a core boost of 1975, which then settles down to 1962. That seems low for the highest-end OC of the Gaming series, with others reporting close to 2000 out of the box. In Afterburner the highest stable is 2066, which settles in at 2050. The memory is also Micron (I had Samsung in the Strix) and I have only pushed it to +250, which puts it at 8600MHz. Since the Z was only 10 more than the X model, I was hoping for a better card.

So is this boost low for a High End OC card?


----------



## Azruine

My 1070: highest stable is 2063~2050, with a max clock of 2083. Max mem clock is 2442. I found that the voltage limit was almost always on during benchmarks, and the power limit also kicks in sometimes.

So... does anyone have a modded BIOS for the ROG Strix? I found a modded 1080 Strix BIOS (voltage limit raised to 1.25, no power limit), but I can't find one for the 1070.


----------



## bigjdubb

I still have not come across any Pascal BIOS editing tool. Hopefully one comes out soon, because I really need to get rid of the boost feature on this card before I throw the damn thing in the garbage.


----------



## Nukemaster

wrathofbill, that is great. I have an all-in-one cooler on my 670 and it was great. I would do it with my 1070, but my case is small and having the radiator on the bottom is not always good for the pump (it made mine louder).

BroPhilip, I would guess it is an average boost. All cards are different; you could buy 10 and they would all be different. I am not even sure how much binning they are doing, since the cards have been in high demand since release.

I am always one to think about VRM cooling, since many aftermarket cards cool the VRM worse than the reference design does.

I do not think this is overkill.

The stock thermal pad does not even fully cover all the MOSFETs. Strange.


Time for a bit more heatsink.


I hope this will cut it


I still have to cut off some more of the 90-degree part, because it will block air from the cooler from passing over the heatsink.

I also should make a plate for the back of the board (it may not be able to have any actual fins due to space restrictions), since a great deal of the heat is dumped into the board (normal for surface-mount parts).

In a few hours I will see if the thermal glue from my GPU cooler will hold my new heatsink together.


----------



## Chaoz

Quote:


> Originally Posted by *NFSxperts*
> 
> I'm looking to get a cheap 1070. I've narrowed it down to these choices (ordered from cheapest to most expensive):
> The Zotac AMP is out of my budget. Right now I'm leaning towards the Galax or Inno3D since they both have backplates.
> Any thoughts?
> Not in the US, and buying soon.
> 
> **Gigabyte GeForce GTX 1070 WINDFORCE OC**
> Advantages: 8-pin only; semi-passive fans; highest stock boost clocks
> Disadvantages: no backplate; 90mm fans; restrictive back I/O
> 
> **Inno3D GeForce GTX 1070 TWIN X2**
> Advantages: 8-pin only; backplate; 3DMark and VRMark key
> Disadvantages: lowest stock boost clock; 2-year warranty
> 
> **GALAX GeForce® GTX 1070 EX**
> Advantages: 100mm fans; metal shroud; backplate?
> Disadvantages: 8-pin + 6-pin; stupid fan LEDs
> 
> **MSI GEFORCE® GTX 1070 ARMOR 8G**
> Advantages: 8-pin only; 3-year warranty
> Disadvantages: no backplate


Is the EVGA GTX 1070 SC or FTW not an option? They're cheaper than the other cards where I bought mine.

The SC only has 1x 8-pin and has ****ty OC potential, not that I need it anyway. The FTW has 2x 8-pin, and both have backplates. The whole reason I went with the SC is the single 8-pin.

Both cards have quite high boost clocks and a nice RGB LED on the side.


----------



## Mr-Dark

Quote:


> Originally Posted by *Chaoz*
> 
> EVGA GTX 1070 SC or FTW not an option? They're cheaper than other cards where I bought mine.
> 
> The SC only has 1x 8-pin, but has ****ty OC potential, not that I need it anyways. And the FTW has 2x 8-pin and both have backplates. WHole reason why I went with the SC is for the 1x 8-pin.
> 
> Both cards have quiet high Boost clocks and a nice RGB LED on the side.


The RGB is for the FTW only; the SC is white LED only. Also, it's the silicon lottery all the time.

My 1070 SC is rock solid at 2100MHz and 9GHz memory without any voltage increase.
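Since this thread mixes "9GHz", "9600", "2200MHz" and "+800 in AB" freely, here's a small sketch of the conversions, assuming the tool reports the DDR clock, i.e. half the effective GDDR5 transfer rate, which is how GPU-Z and Afterburner show it on these cards:

```python
# Assumes the reported "memory clock" is the DDR clock (half the
# effective GDDR5 transfer rate); a stock 1070 reports ~4004 MHz,
# which is 8008 MT/s effective.
BUS_WIDTH_BITS = 256  # GTX 1070 memory bus width

def effective_mts(reported_mhz):
    """Effective GDDR5 transfer rate in MT/s from the reported clock."""
    return reported_mhz * 2

def bandwidth_gbs(reported_mhz, bus_bits=BUS_WIDTH_BITS):
    """Theoretical memory bandwidth in GB/s (decimal gigabytes)."""
    return effective_mts(reported_mhz) * 1e6 * bus_bits / 8 / 1e9

print(effective_mts(4004))         # stock 1070: 8008 MT/s
print(round(bandwidth_gbs(4004)))  # ~256 GB/s, the 1070's spec figure
print(round(bandwidth_gbs(4500)))  # "9 GHz" memory: ~288 GB/s
```

So a "9GHz" overclock is roughly a 12% bandwidth bump over the stock 256 GB/s, which is why memory OCs help most in bandwidth-limited scenes.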


----------



## Chaoz

Quote:


> Originally Posted by *Mr-Dark*
> 
> The RGB is for the FTW only; the SC is white LED only. Also, it's the silicon lottery all the time.
> 
> My 1070 SC is rock solid at 2100MHz and 9GHz memory without any voltage increase.


Forgot about that. Anyway, white is the color of my system, so I'm good.

I can clock mine stable to 2114MHz and 8996MHz memory, with the voltage slider at 100% and power at 112%. Lower voltage gives instant crashes.


----------



## striker3

I have a Gigabyte GTX 1070 G1 Gaming and my max core OC is 2050, which I think settles at 2038. I don't know my memory's max; it is Micron and I OC'd it +240MHz with no problems in the Battlefield 1 open beta. I wanted to ask: is it worth OCing? I mean, how many frames do you gain if you overclock the card from 2000 to 2100 or more? What is the average gain in games? For me, in the BF1 open beta, DX12 is more stable than DX11 when the card is overclocked.


----------



## Chaoz

Quote:


> Originally Posted by *striker3*
> 
> I have a Gigabyte GTX 1070 G1 Gaming and my max core OC is 2050, which I think settles at 2038. I don't know my memory's max; it is Micron and I OC'd it +240MHz with no problems in the Battlefield 1 open beta. I wanted to ask: is it worth OCing? I mean, how many frames do you gain if you overclock the card from 2000 to 2100 or more? What is the average gain in games? For me, in the BF1 open beta, DX12 is more stable than DX11 when the card is overclocked.


Max 10 fps increase. Sometimes not even 10 fps. So not really worth it. I leave mine at stock settings.


----------



## Blackfyre

Quote:


> Originally Posted by *BroPhilip*
> 
> So Guys, I have a question....
> 
> I purchased the MSI 1070 Gaming Z after having to RMA a Asus Strix non-oc for stability issues. I went with the Z model hoping for a better binned processor. However with the MSI Gaming App (I know it is garbage lol) on OC mode I am only getting a core boost of 1975 then it settles down to 1962. Which seems low for the highest end oc of the gaming series with others reporting close to 2000 out of the box. In Afterburner the highest stable is 2066 which settles in at 2050. The memory is also Micron (i had Samsung in the Strix) and I have only pushed it to 250 which puts it 8600mhz. While the z was only 10 more than the x model I was hoping for a better card.
> 
> So is this boost low for a High End OC card?


So basically, buying the *MSI GTX 1070 Gaming X* in the first week of release (_1st batch_) was a much better option than waiting for the Z Edition and paying more almost two months later. Not only does the *1st-batch Gaming X* come with Samsung vRAM (_which overclocks higher_), it also overclocks higher on the core than the new "top of the line" *Gaming Z* Edition.


----------



## BulletSponge

Quote:


> Originally Posted by *Blackfyre*
> 
> So basically buying the *MSI GTX 1070 Gaming X* in the first week of release (_1st Batch_) was a much better option than waiting for the Z Edition and paying more almost 2 months after. Not only does the *1st batch Gaming X* come with Samsung vRAM (_which overclocks higher_), it also overclocks higher on the core side than the new "top of the line" *Gaming Z* Edition.


Ditto!


----------



## TheGlow

Quote:


> Originally Posted by *Blackfyre*
> 
> So basically buying the *MSI GTX 1070 Gaming X* in the first week of release (_1st Batch_) was a much better option than waiting for the Z Edition and paying more almost 2 months after. Not only does the *1st batch Gaming X* come with Samsung vRAM (_which overclocks higher_), it also overclocks higher on the core side than the new "top of the line" *Gaming Z* Edition.


Micron isn't the end of the world.


----------



## TheDeadCry

Quote:


> Originally Posted by *TheGlow*
> 
> Micron isn't the end of the world.


HOLY....god damn dude...I have the same card as you and at max I can get maybe 2063 core, 8600 mem. I have Micron. Very impressive. I am jealous.


----------



## Dude970

Quote:


> Originally Posted by *Blackfyre*
> 
> So basically buying the *MSI GTX 1070 Gaming X* in the first week of release (_1st Batch_) was a much better option than waiting for the Z Edition and paying more almost 2 months after. Not only does the *1st batch Gaming X* come with Samsung vRAM (_which overclocks higher_), it also overclocks higher on the core side than the new "top of the line" *Gaming Z* Edition.


What









Quote:


> Originally Posted by *BulletSponge*
> 
> Ditto!


For once I don't regret jumping in early


----------



## Mazda6i07

Kind of off topic, but does anyone have any info on when/if Nvidia is releasing the 'Ti' line of cards?


----------



## jamor

I could barely get 8200 on my MSI Micron. My once decent stable core OC was no longer stable either. So I returned it and will wait a little bit to buy a different brand. Maybe the Zotac. I hate to be that guy but yea.. now I am.


----------



## TheDeadCry

Quote:


> Originally Posted by *jamor*
> 
> I could barely get 8200 on my MSI Micron. My once decent stable core OC was no longer stable either. So I returned it and will wait a little bit to buy a different brand. Maybe the Zotac. I hate to be that guy but yea.. now I am.


Trust me...I've submitted an RMA but I'm reluctant to send my card to them, paying shipping only to have them send it right back - with up to a few weeks of wait time. F**k that...I've had atrocious experiences with companies like G.Skill, who took my memory for SEVERAL weeks, only to have it shipped back.


----------



## Dude970

Quote:


> Originally Posted by *Mazda6i07*
> 
> Kind of off topic, but doesn anyone have any info on when/if nvidia is releasing the 'ti' line of cards?


Just in time for the holiday season, in December


----------



## KawasakiDragonn

Sorry if this is a dumb question: are there any 1070s that have an over-2000MHz stock (or factory-overclocked) GPU boost clock? Kind of curious


----------



## TheDeadCry

Quote:


> Originally Posted by *KawasakiDragonn*
> 
> sorry if this is a dumb question. is there any 1070 have over 2000mhz as stock (or factory overclocked) gpu boosted clock? kind of curious


If you mean are there cards that boost at or over 2000mhz at stock, yes.


----------



## RaleighStClair

Edit: Never mind, ordered the MSI GTX 1070 Seahawk X


----------



## BroPhilip

Thought I would post my latest OC with my MSI Gaming Z. CPU OC: i5-6600K @ 4.7GHz. 6023 in Time Spy with 2100 core at a locked voltage of 1.093, and an 8700 memory clock!


----------



## Mazda6i07

Quote:


> Originally Posted by *Dude970*
> 
> Just about time for the Holiday season, in December


Thanks! Looking forward to that release.


----------



## Dude970

Quote:


> Originally Posted by *Mazda6i07*
> 
> Thanks! Looking forward to that release.


That is from a rumor I read, not official


----------



## Mazda6i07

Quote:


> Originally Posted by *Dude970*
> 
> That is from a rumor I read not official


I figured it was a rumor since there are no official articles out there yet, but it's better than nothing I suppose.


----------



## jamor

Quote:


> Originally Posted by *Dude970*
> 
> Just about time for the Holiday season, in December


Quote:


> Originally Posted by *Mazda6i07*
> 
> Thanks! Looking forward to that release.


Which Ti are you talking about? I doubt there will be a 1070 Ti.


----------



## Mazda6i07

Quote:


> Originally Posted by *jamor*
> 
> Which ti are u talking about? I doubt there will be a 1070ti.


Any of the 'Ti' series; I'll see what's released and at what prices. I was hoping for a 1070 Ti, but if they don't release one, probably the 1080 Ti I'd imagine. Haven't seen any info on the 'Ti' line this year, so I was just curious.


----------



## Azreil24

Hello guys. GTX 1070 Sea Hawk owners, can you please help me with some Idle and Load temps? Would really appreciate it.

Thank you!


----------



## criminal

Quote:


> Originally Posted by *Azreil24*
> 
> Hello guys. GTX 1070 Sea Hawk owners, can you please help me with some Idle and Load temps? Would really appreciate it.
> 
> Thank you!


What's your issue? We need more info.


----------



## Fosion

Quote:


> Originally Posted by *TheGlow*
> 
> Micron isn't the end of the world.


but worse than Samsung anyway =D http://www.3dmark.com/spy/386610


----------



## Azreil24

Quote:


> Originally Posted by *criminal*
> 
> What's your issue? We need more info.


I recently bought one, and it idles at 38-39°C and under load it goes up to 63-64°C. I would have expected much lower temps. All that with 70% fan speed on the radiator fan. Ambient temp is about 25-26 (27 max) °C. Decent airflow in an Evolv ATX case with the CPU cooled by a Kraken X61. Taking the side panel off doesn't affect temps much, a sign that the temp inside my case is OK.


----------



## criminal

Quote:


> Originally Posted by *Azreil24*
> 
> I recently bought one, and it Idles at 38-39 degrees C and in Load it goes up to 63-64 degrees C. I would have expected much lower temps. All that with 70% fan speed for the radiator fan. Ambient temp it's about 25-26 (27 max) degrees C. Decent air flow in a Evolv ATX case with CPU cooled by a Kraken X61. Taking the side panel off doesn't affect temps much, sign that the temp inside my case is OK.


You got the radiator and fan setup as intake or exhaust? Those temps seem about right if setup as exhaust.


----------



## Azreil24

Quote:


> Originally Posted by *criminal*
> 
> You got the radiator and fan setup as intake or exhaust? Those temps seem about right if setup as exhaust.


I tried both ways and got the same temps. I even tried with the radiator outside the case; same thing. Other users report sub-30°C idle temps and max 55°C under load. Here comes the funny part, especially after you say those seem like normal temps: I took the card back to the store I bought it from, and after I returned it they sent me photos showing the card going up to 55°C under load in both Furmark and Heaven, and idling at 30°C. Yet their pictures clearly show that in Furmark the card steps down to 1620MHz core and 0.813V, while in Heaven it goes up to 1920MHz and 1.062V. At the same time GPU-Z shows the card is power limited, so something's fishy.

Now I don't know what to think, as 38 idle and 64 load are air-cooler temps, and from what I have seen liquid cooling takes GPUs down to much better temps than just a 4-5 degree difference. Heck, even the Fury X has better temps, and that is a much hotter card.


----------



## criminal

Quote:


> Originally Posted by *Azreil24*
> 
> I tried both ways, I got the same temps. I even tried with the radiator outside the case, the same. Other users report sub 30 degrees Idle temps and max 55 degrees Load. Here comes the funny part, especially after you say that those seem as normal temps, I took the card to the store I bought it from and after I returned it they sent me some photos that show that the card goes up to 55 degrees in Load both in Furmark and Heaven, and Idles at 30 degrees, although their pictures clearly show that in Furmark the card steps down to 1620Mhz Core and 0.813V, and in Heaven it goes up to 1920Mhz and 1.062V. At the same time GPU-Z shows that the card is power limited, so somethings fishy.
> 
> Now I don't know what to think, as 38 Idle and 64 in Load are air cooler temps, but from what I have seen liquid cooling takes GPU's down to much better temps than just a 4-5 degrees difference. Heck, even the Fury X has better temps, and that is a much hotter card.


That is strange that your temps don't change between intake and exhaust. Your temps are similar to what I had with a hybrid cooler on my GTX 980 when it was set up as exhaust; changing it to intake got me about an 8°C drop under load. Maybe your card has a bad TIM job? Overall your temps aren't terrible, and much better than what some people see on air cooling at higher noise levels. Even at 100% fan on my FE, after repasting with better TIM, I still saw mid-70s. Maybe try a better fan on the radiator, or better yet push/pull on the radiator with better fans.


----------



## Azreil24

My bad, I thought you were referring to installing the radiator in the front (intake) or the back of the case (exhaust)







No, I always had the fan installed in the front in a push config as intake, and I still got these temps


----------



## Chaoz

Even my aircooled GTX 1070 SC idles at 31°C and full load 61°C.


----------



## Azreil24

61 seems too low for an air cooler, maybe with vsync on in light gaming







If it were a GS or Jetstream/Gamerock with fans past 70%, it would have made more sense


----------



## Chaoz

Quote:


> Originally Posted by *Azreil24*
> 
> 61 seems to low for an air cooler, maybe with vsync on in light gaming
> 
> 
> 
> 
> 
> 
> 
> If it was a GS or Jetstream/Gamerock with fans past 70%, it would have made more sense


Ultra settings, completely maxed out in BF1, playing at 100+ fps for hours on end. No V-Sync. Custom fan curve as well.



Not my problem you don't believe me, I know what I see.


----------



## EvilWiffles

Quote:


> Originally Posted by *Azreil24*
> 
> Hello guys. GTX 1070 Sea Hawk owners, can you please help me with some Idle and Load temps? Would really appreciate it.
> 
> Thank you!


Max temp I've seen so far is 47c but averages 44.
Idle is around room temperature, so around 30c and lower, not like it really matters.


----------



## Azreil24

Quote:


> Originally Posted by *EvilWiffles*
> 
> Max temp I've seen so far is 47c but averages 44.
> Idle is around room temperature, so around 30c and lower, not like it really matters.


This is what I would have expected from my unit too, and close to what every customer review says. Thank you! Don't know what to do now. I returned the unit with bad temps, but the store didn't want to replace it, as they say my temps are good... and now the price for this model has jumped by almost 100 euros compared to what I paid for the defective unit


----------



## RaleighStClair

Quote:


> Originally Posted by *Chaoz*
> 
> Ultra settings, completely maxed out on BF1 playing with +100fps hours on end. No V-Sync on. Custom fan curve aswell.
> 
> 
> 
> Not my problem you don't believe me, I know what I see.


Completely maxed out BF1 and only 47% GPU usage? You playing at 720p? lol.


----------



## Chaoz

Quote:


> Originally Posted by *RaleighStClair*
> 
> Completely maxed out BF1 and only 47% GPU usage? You playing at 720p? lol.


As you know, the BF1 open beta is finished, so I can't play anymore. And this is not in-game; I was just browsing and such. That screenshot was just to show the fan curve, not my usage and temp.

And no, I'm playing at ultra-wide 1080p.

Just put 2 and 2 together and you'd realize that screenshot is not from in-game. As I said, my in-game temps are 61°C and it idles around 31°C. The screenshot shows 33°C -_-


----------



## TheDeadCry

Quote:


> Originally Posted by *Chaoz*
> 
> Even my aircooled GTX 1070 SC idles at 31°C and full load 61°C.


Same here


----------



## Prozillah

Yeah, same here on my G1, WITH the shunt mod in place. I replaced my TIM with Thermal Grizzly liquid metal, and with fan speed at approximately 60% temps max out around 54°C.

I'd say your TIM application is bad, ESPECIALLY on water... You shouldn't ever be any higher than 45-50°C max.

Keeping in mind my case airflow is stellar


----------



## ElectroManiac

Ended up getting the MSI Gaming 1070. Thought about getting the X version, but money is tight right now. The card arrived today and I'll install it tonight.


----------



## TheDeadCry

Quote:


> Originally Posted by *Prozillah*
> 
> Yea same here on my G1 WITH the Shunt mod in place. I replaced my TIM with Thermal Grizzly liquid metal and with fan speed approximately 60% temps max out around 54c
> 
> Id say your TIM application is bad. ESPECIALLY on water... You shouldn't be any higher than 45 - 50c max ever.
> 
> Keeping in mind my case airflow is stella


My max on this card has probably been 66 degrees across 2560x1080 benchmarks, games, Furmark, etc. I have an MSI 1070 Gaming X, OC'd at ~2050 core, 4250 memory. Also, this is the default fan profile; the fan doesn't spin up until 60 degrees...so yeah. 22-23 degrees room temp.


----------



## RaleighStClair

Quote:


> Originally Posted by *Chaoz*
> 
> As you know BF1 Open Beta is finished, so I can't play anymore. And this is not in-game, I was just browsing and such. That screenshot was just to show the fan curve, not my usage and temp.
> 
> And no I'm playing at Ultra-wide 1080p.
> 
> Just put 2 and 2 together and you'd realized that that screenshot is n*ot from in-game.* As I said my in-game temps are 61°C and idles around 31°C. Screenshot shows 33°C -_-


Exactly, so im not sure why you posted a screenshot of it.

No biggie though, you must live in a freezer.


----------



## TheGlow

My msi gaming x idles at 45c with fan curve at 35%.
Gaming it hits about 80% speed and 64c.


----------



## TheDeadCry

Quote:


> Originally Posted by *TheGlow*
> 
> My msi gaming x idles at 45c with fan curve at 35%.
> Gaming it hits about 80% speed and 64c.


Room Temp? On average.


----------



## TheGlow

Quote:


> Originally Posted by *TheDeadCry*
> 
> Room Temp? On average.


I have no idea, no thermometer, but hot as balls.
An AC that doesn't do much, and a fan. I can't even hear the card at 100% fan.
Once I turn those off though, I start to sweat.
I have an i5-6600K; it's idling now at 33/35/29/29.
The PC is to the left of me on the floor, with 2 intakes on the front, 1 intake on the left side, 1 big exhaust on top, and 1 exhaust on the back. So no airflow to the right, but when gaming at night I can feel the heat coming off its right side from about 2 feet away.


----------



## TheDeadCry

Quote:


> Originally Posted by *TheGlow*
> 
> I have no idea, no thermometer, but hot as balls.
> AC that doesnt do much, and a fan. I cant even hear the card at 100% fan.
> Once I turn those off though, I start to sweat.
> I have an i5-6600k, its idled now at 33/35/29/29.
> PC is to the left of me on the floor, and has 2 intake on front, 1 intake on left side, 1 big exhaust top, 1 exhaust on back. So no airflow to the right, but when gaming at night I can feel the heat coming about 2 feet away on its right side.


:O I live in a small apartment dorm, like a 9x11 room lmao. I had to get a portable AC unit for several heatwaves over the summer, and it worked pretty well for the tiny room. Anyway, I have the H440 case with an H115i 280mm radiator in the front, 3 Corsair SP120 fans on the top, and one 140mm exhaust in the back. I'm kinda starved for intake, since the only intake I have is a radiator blowing hot air through the front, and that's it, lmao. I'm VERY pleased with my MSI 1070 Gaming X, besides it being a poor OCing card - quietest card I've had yet. Even at 100% fan it's not that loud IMO. My 6700K idles around 25-30°C at 4.5GHz. Your temps seem pretty good, considering I'm in a relatively cool room.







Cheers.


----------



## TheGlow

Quote:


> Originally Posted by *TheDeadCry*
> 
> :O I live in a small apartment dorm, like 9x11 room lmao. I had to get a portable AC unit for several heatwaves over the summer, so it worked pretty well for the tiny room. Anyways, I have the h440 case with an h115i 280mm radiator in the front, 3 corsair sp120mm fans on the top, and one 140mm exhaust in the back. I'm kinda starved for intake, being that the only intake I have is a radiator blowing hot air through the front, and that's it, lmao. I'm VERY pleased with my MSI 1070 Gaming X, besides my poor OCing card - quitest card I've yet had. Even on 100% fan it's not that loud IMO. My 6700k idles around 25-30c at 4.5ghz. Your temps seem pretty good, considering I'm in a relatively cool room.
> 
> 
> 
> 
> 
> 
> 
> Cheers.


Wife turned off fan and AC an hour ago. I now have a nice sheen on my forehead.


----------



## TheDeadCry

Quote:


> Originally Posted by *TheGlow*
> 
> Wife turned off fan and AC an hour ago. I now have a nice sheen on my forehead.


Damn, gl with that. I can't even concentrate or get work done when it's too hot. :| My 8000 BTU portable air conditioning unit pulls 800 watts from the wall... so in some ways it's nice to have a smaller room, so I don't need to turn it on as much. Thankfully it's almost fall, and that means I'll be cranking those clocks.







Tell me, what have you done to your 1070 to get it to OC so well? We have the same card (Same Memory) but my max is barely ~2050 core...
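For what it's worth, those AC figures pencil out: 8000 BTU/h of cooling from 800 W at the wall is an EER of 10 (COP of about 2.9), typical for a small portable unit. A quick sanity check, with the standard 3.412 BTU/h-per-watt conversion as the only assumption:

```python
# Sanity-check the portable AC figures quoted above.
BTU_PER_HOUR_PER_WATT = 3.412  # standard conversion factor

cooling_btu_h = 8000   # rated cooling capacity
input_watts = 800      # stated draw at the wall

cooling_watts = cooling_btu_h / BTU_PER_HOUR_PER_WATT
eer = cooling_btu_h / input_watts   # BTU/h of cooling per watt of input
cop = cooling_watts / input_watts   # dimensionless efficiency

print(f"cooling: {cooling_watts:.0f} W, EER: {eer:.1f}, COP: {cop:.2f}")
# cooling: 2345 W, EER: 10.0, COP: 2.93
```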


----------



## TheGlow

Quote:


> Originally Posted by *TheDeadCry*
> 
> Damn, gl with that. I can't even concentrate or get work done when it's too hot. :| My portable air conditioning unit of 8000 BTU's pulls 800 Watts from the wall... so in some ways its nice to have a smaller room, so I don't need to turn it on so much. Thankfully it's almost fall, and that means I'll be cranking those clocks.
> 
> 
> 
> 
> 
> 
> 
> Tell me, what have you done to your 1070 to get it to OC so well? We have the same card (Same Memory) but my max is barely ~2050 core...


The only thing I can think of that shouldn't be TOO related is that I installed the MSI Gaming App. I noticed it adds a service, gamingapp_service, which in turn runs 2 processes, OSD 32 and 64. Having that on prevents the card from idling down into 2D clocks or whatever the term is. Without it, I can idle down to 215MHz; the problem is that once ANYthing tries to go into 3D with +400 memory or so, instant lockup.
With that service running, it forces the card to a 1582MHz idle.
From there I could play around with stuff and it wouldn't die.
+205 core seems OK. At +210 I had a few red artifacts in Time Spy. +215 initially killed it instantly, but that may have been before I had the app service going, so I need to revisit.
+825 mem I think may have had some artifacts. +850 it took for a few minutes and then died.
For now I've been gaming steadily at +180/+700
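The step-up-until-it-dies procedure described here is just a small search loop. A purely illustrative sketch, where `is_stable` is a hypothetical stand-in for applying the offset and running a real stress test (Time Spy, Heaven) while watching for artifacts or a driver crash:

```python
# Hypothetical sketch of an incremental offset-stability search.
# is_stable(offset) is a stand-in for actually setting the offset and
# running a stress test; it is NOT a real API.

def find_max_stable_offset(is_stable, start=0, limit=300, step=5):
    """Return the highest offset (MHz) that passes the stress test, or None."""
    best = None
    offset = start
    while offset <= limit:
        if is_stable(offset):
            best = offset       # this offset passed; remember it
            offset += step      # try a slightly higher one
        else:
            break               # first failure: stop and keep the last pass
    return best

# Fake stress test that "fails" above +205 MHz, mirroring the numbers above.
if __name__ == "__main__":
    fake_test = lambda mhz: mhz <= 205
    print(find_max_stable_offset(fake_test))  # 205
```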


----------



## TheDeadCry

Quote:


> Originally Posted by *TheGlow*
> 
> The only thing I can think of that shouldnt be TOO related is I installed the msi gaming app. i noticed that adds a service gamingapp_service, which in turn runs 2 processed osd 32 and 64. Having that on prevents the card from idling into 2d clocks or whatever the term is. without that, i can idle to 215MHz. the problem is once ANYthing tries to go into 3d with +400 memory or so, instant lock up.
> With that service running it forces the card up to 1582MHz idle.
> From there I could play around with stuff and it wouldnt die.
> +205 core seems ok. +210 i had a few red artifacts in timespy. 215 initially killed it instantly, but that may be before i had the app service going so I need to revisit.
> +825 mem i think may have had some artifacts. 850 it took for a few mins and died.
> For now I've been gaming steadily at +180/+700


Maximum performance set in Nvidia Control Panel, I assume? Do you lock your voltage?


----------



## TheGlow

Quote:


> Originally Posted by *TheDeadCry*
> 
> Maximum performance set in Nvidia Control Panel, I assume? Do you lock your voltage?


Yes to max performance. For regular use, no voltage lock. I don't even add any +core voltage; I leave it at 0. Power limit is +126.


----------



## TheDeadCry

Quote:


> Originally Posted by *TheGlow*
> 
> Yes to max performance. For regular use, no on voltage lock. I don't even add the +core voltage. I leave it 0. powerlimit is +126.


Alright, I'll try this, thanks. Here's hoping!


----------



## asdkj1740

Gigabyte Xtreme uses Micron VRAM too....damn it...


----------



## Prozillah

Honestly this whole Micron RAM vs Samsung thing has been proven invalid - if there are many Micron cards matching and even beating Samsung results, that means it's down to the individual chips.


----------



## benjamen50

You can pretty much get Micron VRAM on any 1070 brand. Even the EVGA FTW ACX 3.0 one.


----------



## jovanni

Quote:


> Originally Posted by *Prozillah*
> 
> honestly this whole micron ram vs samsung has been proven invalid - if there are many micron cards maintaining and evening beating samsung results that means its down to the chips individually.


That's probably correct. My card has micron and I can get 9000 stable.


----------



## Prozillah

I've got Micron and I get 9.1 stable 24/7 - higher for benches


----------



## ElectroManiac

How do you know if you got Micron or Samsung?

Installed my 1070 yesterday. Tested it with Witcher 3 and all good. Gonna try my first OC today after work.


----------



## juniordnz

Guys, I just had an EVGA 1080 FTW fail on me with less than 1 month of use, and I'm seriously considering changing to 2x 1070s in SLI.

How are these 1070s performing in SLI? Too many problems? Is it worth it?

Also, I'm a little skeptical about EVGA build quality by now (I wonder why...). So I'm considering the MSI Gaming X. I know MSI brags about using military-grade components, but does that mean the card will actually be better built than the EVGA? The MSI 1070 Gaming X is one of the cheapest models I can get here in Brazil...

Thanks very much for everyone's help!


----------



## gtbtk

Quote:


> Originally Posted by *jovanni*
> 
> That's probably correct. My card has micron and I can get 9000 stable.


I don't think the issue is the Micron VRAM itself. It is the way the card seems to delay ramping up the VRM to supply the VRAM with the higher voltage required when it has been in a power-saving mode and we set it to jump from stock levels to a +500 or so overclock.

That suggests to me that the Micron-specific BIOS, or the Nvidia drivers that control the memory VRM through the BIOS, has a bug in the VRM control logic: it is not setting the voltage levels fast enough before it sets the memory clock on the Micron memory cards. The end result is that the memory tries to run at the higher speeds but is temporarily starved of voltage, because if you lock the voltages high, the crash doesn't happen.


----------



## Prozillah

Quote:


> Originally Posted by *gtbtk*
> 
> I dont think issue is the micron VRAM itself. It is the way that the card seems to delay in ramping up the VRM to supply the vram with the higher voltage required when it has been in a power saving mode and we set it to jump from stock levels to a +500 or so overclock.
> 
> That suggests to me, that the Micron specific bios or the Nvidia drivers that control the memory vrm through the Bios has a bug in the vrm control logic that is not setting the voltage levels fast enough before it sets the memory clock on the micron memory cards. The end result is that the memory tries to run at the higher speeds but is being temporarily starved of voltage because if you lock the voltages high, the crash doesn't happen


Touché


----------



## saunupe1911

Quote:


> Originally Posted by *asdkj1740*
> 
> gigabyte xtreme uses micron vram too....damn it...


Wow!


----------



## HaiderGill

He's back: inno3D iChill GeForce GTX 1070. Second GeForce after the GeForce 2 GTS, it's been a long time... Could we put GeForce in the title of the thread?


----------



## Vaesauce

Quote:


> Originally Posted by *asdkj1740*
> 
> gigabyte xtreme uses micron vram too....damn it...


Uhh, you must have been unlucky...

I have a Gigabyte Xtreme and it has Samsung.


----------



## jovanni

Quote:


> Originally Posted by *gtbtk*
> 
> I dont think issue is the micron VRAM itself. It is the way that the card seems to delay in ramping up the VRM to supply the vram with the higher voltage required when it has been in a power saving mode and we set it to jump from stock levels to a +500 or so overclock.
> 
> That suggests to me, that the Micron specific bios or the Nvidia drivers that control the memory vrm through the Bios has a bug in the vrm control logic that is not setting the voltage levels fast enough before it sets the memory clock on the micron memory cards. The end result is that the memory tries to run at the higher speeds but is being temporarily starved of voltage because if you lock the voltages high, the crash doesn't happen


Didn't work with mine. Locked the volts at 1.093 and it crashes at the same point either way. Maybe it needs more trials, but I don't think I'll get much more.
Hoping the unlocked BIOS will give more potential


----------



## jamor

Zotac told me they still use Samsung VRAM, so if you obsess over that kind of thing, I think they are the only ones left still using Samsung, so buy that.

**Customer service could be wrong. You make purchase at your own discretion.


----------



## jovanni

Quote:


> Originally Posted by *juniordnz*
> 
> Guys, I just had an EVGA 1080 FTW fail on me with less than 1 month of use and I'm seriously considering changing to 2x1070 in SLI.
> 
> How are these 1070 performing on SLI? Too many problems? Is it worth it?
> 
> Also I'm a little skeptical about EVGA build quality by now (I wonder why...). So I'm considering MSI Gaming X. I know MSI brags about using Military grade components, but does that mean that the card will actually be a better built than EVGA? MSI 1070 Gaming X is one of the cheapest models I can get here in Brazil...
> 
> Thanks very much for everyone's help!


I will grab one 1080 and stick with that. Had two 970s in SLI and had too many issues. Not recommended by me....


----------



## asdkj1740

Quote:


> Originally Posted by *asdkj1740*
> 
> The hot air can't get out easily, as the exhaust vents are mostly blocked.
> 
> Be wary of those reviews done on an open test bench; don't take those results for granted.
> It's pretty clear the Pascal iChill is no longer as dominant as before, when it led others by 5-10°C.
> 
> It is still fine at 70-80°C, as the clock will only drop around 70MHz from the instant max boost at idle temps, which means almost nothing in actual gameplay


Quote:


> Originally Posted by *jamor*
> 
> Zotac told me they still use Samsung vram so if you obsess over that kind of thing I think they are the only ones left still using Samsung so buy that.
> 
> **Customer service could be wrong. You make purchase at your own discretion.


The Zotac AMP/AMP Extreme PCB VRM is weak, but it comes with a five-year warranty.

TPU reviews show that Samsung VRAM can do ~2400MHz, which is outstanding; 2300MHz should work like a charm in actual gaming
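A note on the clocks being compared here: GDDR5 transfers four bits per pin per command-clock cycle, so the ~2400MHz in TPU's reviews (the clock GPU-Z reports) corresponds to ~9600MHz "effective", the convention behind the 8600MHz-style figures elsewhere in the thread. A quick converter, assuming the standard 4x GDDR5 relation:

```python
# Convert the GDDR5 command clock GPU-Z reports into the "effective"
# data rate quoted in reviews (quad data rate, hence the factor of 4).

def effective_rate(command_clock_mhz):
    """Effective GDDR5 data rate (MHz) from the reported command clock."""
    return command_clock_mhz * 4

print(effective_rate(2002))  # stock GTX 1070: ~8008 MHz effective
print(effective_rate(2400))  # the ~2400 MHz Samsung result above
```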


----------



## asdkj1740

Quote:


> Originally Posted by *Vaesauce*
> 
> Uhh, you must have been unlucky...
> 
> I have a Gigabyte Xtreme and it has Samsung.


I planned to buy it, but I saw this Xtreme GPU-Z screenshot recently


----------



## jamor

Quote:


> Originally Posted by *Vaesauce*
> 
> Uhh, you must have been unlucky...
> 
> I have a Gigabyte Xtreme and it has Samsung.


He's not saying all Gigabytes have Micron. All 1070 manufacturers started out with Samsung, but recently they all switched to Micron (we think because of a shortage).


----------



## Vaesauce

That being said
Quote:


> Originally Posted by *jamor*
> 
> He's not saying all Gigabytes have Micron. All 1070 manufacturers started out with Samsung but recently they all switched to Micron (we think because of a shortage).


No, he's saying that the Gigabyte Xtreme uses Micron "too".

I was just simply stating that I have a Gigabyte Xtreme and it has Samsung VRAM.

We all know the first wave of 1070s and their aftermarket variants came with Samsung and that the later waves came with Micron.


----------



## jamor

Quote:


> Originally Posted by *Vaesauce*
> 
> That being said
> No, he's saying that the Gigabyte Xtreme uses Micron "too".
> 
> I was just simply stating that I have a Gigabyte Xtreme and it has Samsung VRAM.
> 
> We all know that the first of the 1070s and their Aftermarket components came with Samsung and that the later waves came with Micron.


No. He's right. Gigabyte does use Micron now. He's not saying they always used micron. Everyone knows the old 1070s had Samsung. I don't understand what you're trying to argue.


----------



## saunupe1911

Quote:


> Originally Posted by *jamor*
> 
> No. He's right. Gigabyte does use Micron now. He's not saying they always used micron. Everyone knows the old 1070s had Samsung. I don't understand what you're trying to argue.


LMAO yall are saying the same thing LMAO.


----------



## Vaesauce

Quote:


> Originally Posted by *jamor*
> 
> No. He's right. Gigabyte does use Micron now. He's not saying they always used micron. Everyone knows the old 1070s had Samsung. I don't understand what you're trying to argue.


So if you had any reading comprehension, you'd see I stated...

"Uhhh you must have been UNLUCKY"

Soooo, dunno why you're even replying to me.


----------



## Vaesauce

Quote:


> Originally Posted by *saunupe1911*
> 
> LMAO yall are saying the same thing LMAO.


Don't let him know, he's angry for some odd reason.


----------



## jamor

Quote:


> Originally Posted by *Vaesauce*
> 
> Don't let him know, he's angry for some odd reason.


You sound like a 12 year old. Don't be so upset.


----------



## Vaesauce

Quote:


> Originally Posted by *jamor*
> 
> You sound like a 12 year old. Don't be so upset.


Upset? Meh, I'm not the one calling people little kids.


----------



## jamor

Quote:


> Originally Posted by *Vaesauce*
> 
> Upset? Meh, I'm not the one calling people little kids.


Relax. You'll get over it.


----------



## Vaesauce

Quote:


> Originally Posted by *jamor*
> 
> Relax. You'll get over it.


Lol, looks like you really want to get the last word in. You can have it









Angry because I told the guy he was unlucky to have Micron VRAM on his Gigabyte Xtreme when I have Samsung on mine haha. Amazing.


----------



## jamor

Quote:


> Originally Posted by *Vaesauce*
> 
> Lol, looks like you really want to get the last word in. You can have it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Angry because I told the guy that he was unlucky he had Micron VRAM on his Gigabyte Xtreme when I have Samsung on mine haha. Amazing.


Take a deep breath lol. I didn't mean to upset you.


----------



## Chaoz

Quote:


> Originally Posted by *RaleighStClair*
> 
> Exactly, so im not sure why you posted a screenshot of it.
> 
> No biggie though, you must live in a freezer.


I posted the screenshot for the fan curve I have on my card. Just a standard room temp of around 20ish°C.

Max temp I hit with my card is 63°C. Even in Fire Strike my temp stays around 55°C. Must be lucky, I guess? Got 2x NF-A14 iPPC-2000 PWM @ 1000rpm static, blowing directly on the card.


----------



## StrelokAT

Today I mounted my Kraken G10 on my MSI GTX 1070 Gaming X and the temps max out around 43-44°C.


----------



## juniordnz

Quote:


> Originally Posted by *StrelokAT*
> 
> Today i mounted my KrakenG10 on my MSI gtx1070 GamingX and the temps are around @ 43-44°C maximum.


Which cooler are you using on it? H75?


----------



## StrelokAT

I use the Kraken X41 AIO with 2 Thermaltake fans in push and pull.


----------



## Gir

How is everyone's performance in GTA V? To my surprise, I'm really struggling to run it on my HTPC (not my main rig).

I've lowered the settings, turned MSAA off, and I'm just at 1080p. During some stunt races today I even dropped down into the low 40s. I feel like something isn't right; going to do some more investigating once I'm off work. Has anyone else had problems? This replaced a GTX 970 that I don't recall struggling at all.

HTPC Specs:

CPU
Intel Core i5-3330

Motherboard
ASRock H77M-ITX

Graphics
Asus GTX 1070 ROG Strix

RAM
G.Skill 16Gb 1600Mhz

EDIT: And that's with a brand new Windows 10 install, ran DDU, installed newest drivers.


----------



## George the Jew

Quote:


> Originally Posted by *amd7674*
> 
> If you have o/c CPU/RAM try running it at stock speeds. Try reinstalling drivers with DDU. Did you run Afterbuner to check how the GPU is behaving... fan speed, core speed...etc....
> or it could just bad card :-(


Thanks for the advice on DDU. Turns out I still had my old 750 Ti drivers making everything act weird. But now, being a dumbass, I've bricked my master BIOS.


----------



## TheGlow

Quote:


> Originally Posted by *Gir*
> 
> How is everyone's performance in GTA V? To my surprise, I'm really struggling to run it on my HTPC (not my main rig).
> 
> I've lowered the settings, MSAA off, and just at 1080p. During some stunt races today I even dropped down in the low 40's. I feel like something isn't right, going to do some more investigating once I'm off work. Has anyone else had problems? I replaced a GTX 970 that I don't recall struggling at all.


I have no idea. I kind of want to get it but I'm unsure. How's the PC community?
I played it a bunch on the 360 when it first released.
Then when I got an XB1 a year ago I got it again and finished story mode, but didn't do much online.
Now with a beast PC I'm interested, but jeez, that price doesn't come down often.


----------



## George the Jew

Quote:


> Originally Posted by *TheGlow*
> 
> I have no idea. I kind of want to get it but unsure. Hows PC community?
> I played on 360 when it first released, a bunch.
> Then when I got xb1 a year ago I got it and finished story mode but didnt do much online.
> Now with a beast PC I'm interested, but jeez, that price doesn't come down often.


I'm having issues in GTA V as well, and I'm pretty sure it's the low-end CPU. If you have an i5 of any kind, the 1070 can be bottlenecked by it. Not kidding, check out the good old gamer's video on it; it loses 30% performance without an i7.


----------



## George the Jew

Quote:


> Originally Posted by *Gir*
> 
> How is everyone's performance in GTA V? To my surprise, I'm really struggling to run it on my HTPC (not my main rig).
> 
> I've lowered the settings, MSAA off, and just at 1080p. During some stunt races today I even dropped down in the low 40's. I feel like something isn't right, going to do some more investigating once I'm off work. Has anyone else had problems? I replaced a GTX 970 that I don't recall struggling at all.
> 
> HTPC Specs:
> 
> CPU
> Intel Core i5-3330
> 
> Motherboard
> ASRock H77M-ITX
> 
> Graphics
> Asus GTX 1070 ROG Strix
> 
> RAM
> G.Skill 16Gb 1600Mhz
> 
> EDIT: And that's with a brand new Windows 10 install, ran DDU, installed newest drivers.


See my post above yours


----------



## Gir

Can you post a link to that video you're referring to please? I can't seem to find it.


----------



## George the Jew

there you go!


----------



## rfarmer

Quote:


> Originally Posted by *George the Jew*
> 
> 
> 
> 
> 
> there you go!


Damn, looks like I need to upgrade my 6600K. I knew the day would come when I had to go i7; looks like it is here.


----------



## By-Tor

I'm having a hard time choosing a new card. I want to go from a crossfire set up with a pair of 290x's to a single good card and the 1070 looks like a good upgrade path. I owned a geforce 2 15 years ago and sold it to go ATI/AMD and have been with AMD ever since.

I'll be mounting a full cover block on what ever I buy with no need for the fans.

Can anyone suggest a good card?


----------



## rfarmer

Quote:


> Originally Posted by *By-Tor*
> 
> I'm having a hard time choosing a new card. I want to go from a crossfire set up with a pair of 290x's to a single good card and the 1070 looks like a good upgrade path. I owned a geforce 2 15 years ago and sold it to go ATI/AMD and have been with AMD ever since.
> 
> I'll be mounting a full cover block on what ever I buy with no need for the fans.
> 
> Can anyone suggest a good card?


If you are water cooling, the Aero is the one I would get; I ended up getting an FE because I couldn't find any Aeros in stock when I purchased. It has a reference PCB and higher clocks than the FE, and a better price too.


----------



## George the Jew

Quote:


> Originally Posted by *rfarmer*
> 
> Damn looks like I need to upgrade my 6600k. I knew the day would come when I had to go i7, looks like it is here.


I feel the exact same way. Crazy that an i5 is becoming average to below average.


----------



## rfarmer

Quote:


> Originally Posted by *George the Jew*
> 
> I feel the exact same way. Crazy that an i5 is becoming average-below average


Yeah, I went 2500K to 4690K to 6600K, and for a long time I just saw no benefit to an i7. Looks like the new games and new GPUs require a bit more.


----------



## FattysGoneWild

Peasant i5's







i7 or bust. It's just common sense with an extremely strong GPU like the 1070/1080.


----------



## mypickaxe

Quote:


> Originally Posted by *George the Jew*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rfarmer*
> 
> Damn looks like I need to upgrade my 6600k. I knew the day would come when I had to go i7, looks like it is here.
> 
> 
> 
> I feel the exact same way. Crazy that an i5 is becoming average-below average
Click to expand...

A 6600K @ 4.5 (give or take) would be just fine.


----------



## RaleighStClair

Quote:


> Originally Posted by *rfarmer*
> 
> Yeah I went 2500k - 4690k - 6600k and for a long time I just saw no benefit to an i7, looks like the new games and new gpus require a bit more.


I just had to upgrade to a 6700K this year because my 3570K @ 4.6GHz was bottlenecking my 980 Ti in a number of titles.

I used to be one of those people who would wholeheartedly recommend an i5 over an i7, but now I am not so sure. Just this year alone I played 5 games that had bottlenecking issues, especially when trying to get the most out of a 144Hz monitor.


----------



## George the Jew

Quote:


> Originally Posted by *mypickaxe*
> 
> A 6600K @ 4.5 (give or take) would be just fine.


Did you see the video link I gave on the previous page? An i5 gives about a 30% performance decrease on Haswell, and Skylake is a minor improvement, so we'll call it a 20% loss, yes?


----------



## kevindd992002

Quote:


> Originally Posted by *asdkj1740*
> 
> The Zotac AMP/AMP Extreme PCB and VRM are weak, but they come with a five-year warranty.
> 
> TPU reviews show that Samsung VRAM can do ~2400MHz, which is very outstanding; 2300MHz should work like a charm in actual gaming.


What made you say that the Zotac's PCB and VRM are weak?


----------



## BroPhilip

MSI gaming Z and the MSI Gaming app?

So here are my astute observations regarding the MSI Gaming app... it is crap! Lol. Running it with the fans stopped below 60°C is a horrible idea lol. (Maybe I just care more about temps than a little fan sound.)
Also, the max fan button only spins up the fans for a few seconds before reverting. If I run the stock speed in AB there is a 50MHz difference. If I set it to OC mode (1860MHz) in the Gaming app it will start at 1999 and then go to 1975. The same clock in AB (1860, verified in GPU-Z, with no other adjustments except the fan curve) yields a consistent 2012 with no dips and a max temp in the low 40s. This should be the same result using either app!

The other issue is that the RGB only works with the Gaming app open, which overrides the custom fan curve in AB. It's a good card but seriously held back by their own app...


----------



## asdkj1740

Quote:


> Originally Posted by *jamor*
> 
> He's not saying all Gigabytes have Micron. All 1070 manufacturers started out with Samsung but recently they all switched to Micron (we think because of a shortage).


If that were because of a shortage of Samsung VRAM, then the 1060s should all be using Micron too.


----------



## zipper17

If anyone has Galax/KFA2 card, download Xtreme Tuner Plus here to change your LED color, just for info. KFA2 link

Also Just FYI if someone want to search full review about KFA2/Galax 1070 EXOC here.


----------



## zipper17

Quote:


> Originally Posted by *George the Jew*
> 
> 
> 
> 
> 
> there you go!


It can be a bottleneck, but it depends on the game.

Mostly it shows up when a game is CPU bound; in GPU-bound scenarios the results are still nearly identical (as in most older games). If you crank the resolution up to more pixels, the CPU bound goes away.

Yes, some newer games have started to make use of the i7's Hyper-Threading and faster DDR4 systems.

I use an overclocked 3570K, and yes, so far I do suffer a bottleneck, but it's only very noticeable in Hitman's Marrakesh level (though that game could just be poorly optimized DX12; it's CPU bound at the same time, and it also eats more than 8GB of RAM, so I don't know the exact cause).

Most of the time I play capped at 60fps; the GPU runs cool and I get no tearing as a bonus.

Even if you wanted a perfect, locked 144fps, you wouldn't reach it in most games even without a bottlenecked system; it takes more and more power to get into those 144 numbers. Probably the easiest games are CS:GO and the like.


----------



## reeven

I love Micron chips. They've never broken on me. Samsung, well, a lot have died.
For example: ATI 9700, 9800, and 9800 Pro, all with Samsung VRAM, all dead with artifacts.
Nvidia 8800 GTS with Samsung: dead.
Nvidia 8800 GTX with Samsung: dead.
EVGA 560 Ti with Samsung: dead.
Then I moved to ATI: a 290 with Hynix and a 280X with Elpida. The Hynix 280X artifacted for lots of people; my Elpida 280X kept rocking.
Now I'm on a Gigabyte 1060 with Samsung and an EVGA 1080 with Micron.
Samsung are very good at RAM, SSDs, etc., but their VRAM breaks over time...
So enjoy your Micron GPUs, they rock. Micron will last for years to come.


----------



## reflex75

Quote:


> Originally Posted by *reeven*
> 
> I love micron chips...


We change GPUs every year; we want to enjoy them now.


----------



## kens30

Hi whicker, does your Asus Strix OC card have Samsung memory or Micron? I flashed your BIOS successfully on my non-OC Strix with no crashes whatsoever.
The only thing I'm worried about is if your card has Micron memory, as mine has Samsung and I don't want to risk damaging the card.
By the way, I got very lucky with my memory OC: +800 in Afterburner, fully stable.
Thanks in advance.


----------



## gtbtk

Quote:


> Originally Posted by *jamor*
> 
> Zotac told me they still use Samsung vram so if you obsess over that kind of thing I think they are the only ones left still using Samsung so buy that.
> 
> **Customer service could be wrong. You make purchase at your own discretion.


Customer service is wrong. I have copies of the BIOSes for both the AMP! Edition and AMP! Extreme Edition for Micron memory cards.
Quote:


> Originally Posted by *kens30*
> 
> Hi whicker does your card have samsung memory or micron as i have flashed your bios successfully no crashes whatsoever.
> The only thing i am worried about if your card has micron memory as mine has samsung and i don't want to risk damaging the card.
> By the way i got very lucky with my mem oc +800 in Afterburner fully stable.
> Thanks in advance.


Samsung BIOS: 86.04.1E.00.xx

Micron BIOS: 86.04.26.00.xx
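Those two version prefixes give a quick way to guess which memory a BIOS targets before flashing. A minimal sketch, assuming the prefix pattern reported in this thread holds for GTX 1070 cards (it's a community observation, not an official NVIDIA rule):

```python
# Guess a GTX 1070's GDDR5 vendor from its VBIOS version string
# (as shown by GPU-Z or nvflash). Prefixes per this thread's reports:
# Samsung cards shipped with 86.04.1E.xx.xx, Micron cards with 86.04.26.xx.xx.

PREFIXES = {
    "86.04.1E": "Samsung",
    "86.04.26": "Micron",
}

def guess_memory_vendor(vbios: str) -> str:
    """Return the likely memory vendor for a VBIOS version, or 'unknown'."""
    return PREFIXES.get(vbios.upper()[:8], "unknown")

print(guess_memory_vendor("86.04.1e.00.70"))  # Samsung
print(guess_memory_vendor("86.04.26.00.15"))  # Micron
```

Still worth double-checking the memory type in GPU-Z before flashing; a prefix match is circumstantial, not proof.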


----------



## kens30

Thanks a lot for your quick reply and yes the bios i flashed had Samsung memory.


----------



## TheGlow

Damn. I went with the 6600K because the 2500K lasted me years and the i7 had no blatant advantage.
I have a spare i7 2600K, I think, laying around, and hadn't gotten around to replacing my daughter's 2500K since it never seemed that big a deal.
I have my 6600K at 4.4GHz so far on a 212, so I'll hold off for now.
Maybe later I'll grab a 6700K.
What difference would there be between a 6600K and a 6700 non-K?
I might be able to get a non-K for real cheap.


----------



## George the Jew

Quote:


> Originally Posted by *kens30*
> 
> Hi whicker does your Asus Strix oc card have samsung memory or micron as i have flashed your bios successfully on my non oc Strix with no crashes whatsoever.
> The only thing i am worried about if your card has micron memory as mine has samsung and i don't want to risk damaging the card.
> By the way i got very lucky with my mem oc +800 in Afterburner fully stable.
> Thanks in advance.


Hey, so what's your core overclock? I got my 1070 FTW to +800 memory as well, but my core gets fishy going past 2050MHz.


----------



## GunnzAkimbo

Situation:

I now have 3 x 680s installed and scored this;



So I want to get a *SINGLE 1070* that can "trump" that score.

Cheapass(t) 1070 I can get is the Gainward Phoenix (with a free 128GB USB 3 stick and $115 of Paragon in-game value from Nvidia! ...whatever the crap that means).

Gotta move quick as the deal ends soon. Will lurk this thread for days until i see a good score.

If ya need a reason, here be 3

Power consumption = Bad.

Heat output = Bad

Noise = Bad


----------



## TheDeadCry

Quote:


> Originally Posted by *TheGlow*
> 
> Damn. I went with 6600k because the 2500k lasted me years and the i7 had no blatant advantage.
> I have a spare i7 2600k i think, laying around and hadnt gotten around to replacing the 2500k for my daughter since it never seemed that big a deal.
> I have my 6600K at 4.4GHz so far on a 212, so I'll hold off for now.
> Maybe later Ill grab a 6700k.
> What difference would it be for a 6600k vs a 6700 non k?
> I might be able to get a nonk for real cheap.


Highly recommend the 6700K, having upgraded from a 3570K myself, and I would always go for the K SKU. Personally I've found my 6700K reaches ~4.8GHz (but you'd need a pretty beefy cooler). As you can imagine, going from an Ivy Bridge i5 to a Skylake i7 is quite the jump, lmao. I'd say if you bother upgrading, go for the K SKU; then again, you may want to buy an AIO as well to reach those higher clocks. I'm currently sitting at 4.6GHz with an AIO, at ~25 degrees idle. I'd take a look at a good comparison, but these Skylake chips, as far as I can tell, are great overclockers (hence why I'd stick with a K SKU). I'd hate to see that 1070 of yours being held back!


----------



## GunnzAkimbo

err


----------



## mypickaxe

Quote:


> Originally Posted by *George the Jew*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mypickaxe*
> 
> A 6600K @ 4.5 (give or take) would be just fine.
> 
> 
> 
> Did you see the video link I gave in the previous page? i5 gives about a 30% performane decrease on haswell, and skylake is a minor improvement, so we'll call it 20% loss, yes?
Click to expand...

I've seen that before. Crysis 3 is one of the few games that scales well with more cores or SMT (Hyper-Threading); he picked the best-case scenario. I'm not saying there's no benefit, I'm saying it's overblown at this point in time. The vast majority of games are GPU bound, and you're not going to see much of a difference (if any).


----------



## zipper17

Quote:


> Originally Posted by *GunnzAkimbo*
> 
> Situation:
> I now have 3 x 680s installed and scored this;
> 
> 
> 
> So i want to get a *SINGLE 1070* that can "_trump"_ that score.
> Cheapass(t) 1070 i can get is the Gainward Phoenix (with a free 128GB USB 3 stick and $115 of Paragon in-game value from Nvidia! ...whatever tha crap that means)
> Gotta move quick as the deal ends soon. Will lurk this thread for days until i see a good score.
> 
> If ya need a reason, here be 3
> Power consumption = Bad.
> Heat output = Bad
> Noise = Bad


The highest 1070 OC Fire Strike graphics scores are around 22K, I believe, about 1,000 points less than your 3x 680s.

You'd probably need a single 1080 for a solid ~23K or higher with ease.

Btw, 3DMark Fire Strike isn't real-world gaming performance; it's just a benchmark.

You can lurk here. Just select single GPU and uncheck "valid result"; if it shows an error, click "fetch next 1000 results with these search options."


----------



## Hunched

There's finally a Gaming Z 1070 BIOS online... Anyone want to confirm whether it works with Samsung Gaming X and Gaming non-X?
https://www.techpowerup.com/vgabios/185888/msi-gtx1070-8192-160608-2

I believe it's Samsung, since it's "86.04.1E.00" and every Micron BIOS thus far has been "86.04.26.00".
Yeah, I think I'm gonna flash it; its build date is before Micron cards even started appearing from MSI, and the earliest Micron BIOSes are from a month later.
There's no way this isn't a Samsung BIOS.


----------



## TheDeadCry

I've flashed the standard non-review sample of the bios and it works perfectly.







you can find that, too.


----------



## Hunched

Quote:


> Originally Posted by *TheDeadCry*
> 
> I've flashed the standard non-review sample of the bios and it works perfectly.
> 
> 
> 
> 
> 
> 
> 
> you can find that, too.


There are only 2 Gaming Z BIOSes in the TechPowerUp database, and the other one is Micron. So I'm not sure where you found yours, unless you're saying you're using the Micron BIOS.

Anyway, I just flashed the OC-mode Samsung Gaming Z BIOS to my Gaming non-X and it worked, so we're good.
Whether it helps at all I don't know yet.


----------



## TheDeadCry

Quote:


> Originally Posted by *Hunched*
> 
> There are only 2 Gaming Z BIOS's in the Techpowerup database, and the other one is Micron. So I'm not sure where you found yours, unless you're saying you're using the Micron BIOS.
> 
> Anyways I just flashed the OC Mode Samsung Gaming Z BIOS to my Gaming non-X and it worked so we good.
> Whether it helps at all I don't know yet.


I flashed the Micron one (the BIOS with non-OC mode by default). You are correct. Keep me posted on how the Samsung BIOS goes.


----------



## Hunched

Quote:


> Originally Posted by *TheDeadCry*
> 
> I flashed the micron (non oc mode by default bios) You are correct. Keep me posted on how the samsung bios goes.


It's going to take a long time for me to comfortably say whether or not I can get a higher stable OC.
I haven't been gaming as much lately since I'm waiting on getting a new monitor in October, and there just isn't a whole lot I'm excited to play.
I don't feel like running benchmarks all day, and even if I did they're worse stability testers than something like BF1 was and will be.

Once I have my new monitor and all the Q4 games start rolling out, I'll rack up some hundreds of hours of gameplay stability testing by the end of the year









I just ran Unigine Valley once without issue, and it boosts higher now, up to 2050MHz, which was expected.
Realistically things will probably be basically the same; at least they should be, theoretically, if nobody made gimped BIOSes.
I doubt this will unlock crazy overclocking power, but best case I might have a bit more OC headroom.

If only we had a BIOS editor, we could just make the best BIOS, and actually see what all the differences are between the current ones besides the very basics.


----------



## TheDeadCry

Quote:


> Originally Posted by *Hunched*
> 
> It's going to take a long time for me to comfortably say whether or not I can get a higher stable OC.
> I haven't been gaming as much lately since I'm waiting on getting a new monitor in October, and there just isn't a whole lot I'm excited to play.
> I don't feel like running benchmarks all day, and even if I did they're worse stability testers than something like BF1 was and will be.
> 
> Once I have my new monitor and all the Q4 games start rolling out, I'll rack up some hundreds of hours of gameplay stability testing by the end of the year
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just ran Unigine Valley once without issue, and it boosts higher now, up to 2050mhz. Which was expected.
> Realistically things will probably be basically the same, at least they should be theoretically if nobody made gimped BIOSes.
> I doubt this will unlock crazy overclocking power, but best case I might have a bit more OC headroom.
> 
> If only we had a BIOS editor, we could just make the best BIOS, and actually see what all the differences are between the current ones besides the very basics.


Sounds good.


----------



## zipper17

FYI: Hitman just updated to 1.4.3.
Since the update, I haven't noticed any massive lag on Hitman's Marrakesh level (DX12) anymore.

So I think it was a game issue.
Quote:


> Originally Posted by *zipper17*
> 
> My 3570k kinda bottlenecking 1070 in Hitman Marakesh Level (When Running on DX12). I noticed a lot of stuttering lags in crowds market areas.
> But when I switched to Dx11, the lag is gone.
> 
> Does DX12 make CPU performances running slower than on DX11??
> does anyone here have 3570K & Hitman 2016 game?


Quote:


> Originally Posted by *zipper17*
> 
> I used 3570k OC, yes so far I suffer a botneck, but only very noticeable in Hitman marrakesh level. (but this game could be poor optimized Dx12, at the same time cpu bound, and also eat RAM resource more than 8GB. idk what's the exact cause.)


Also, just for info, MSI Afterburner 4.3.0 beta 14 now supports DX12 OSD. Good thing.


----------



## GunnzAkimbo

They are both virtually the same GFX score...

Quote:


> Originally Posted by *zipper17*
> 
> Highest 1070OC in Firestrike Graphic Score are 22K I believe. less ~1000 points from your 3x680.
> 
> You probably need a single 1080 for solid ~23K or higher at ease.
> 
> Btw 3dmark Firestrike is not a real world performances Gaming. It's just a Benchmark sake.
> 
> You can lurk here. Just select single gpu, and uncheck valid result, if showing an error click "fetch next 1000 results with these search options."


Ta.

May have to SLI 1070s: one now, one later.


----------



## mrtbahgs

Just ordered a Gigabyte G1 Gaming that should arrive Wednesday or Thursday and will replace my 4 year old 670.
Very excited to finally be able to give my rig a performance boost and will utilize it on BF1 when it comes out.

I figured you guys would know the correct process nowadays so let me ask a likely often repeated question.
Should I be removing my existing nVidia driver before swapping cards and apply the latest driver once the 1070 is in, or can I just do a "clean install" with the latest driver and all will be well?
I figured going from a 670 to a 1070 will likely be the same driver used, but wasn't sure if something else is also factored in that it needs to realize I have a different card.
According to Geforce Experience, I am currently on 368.39 (6/6/16) and the most recent driver is 372.70 (8/29/16).


----------



## Hnykill

Plug in the new card and the existing driver will recognise it right away; then just install the new drivers. Nvidia has good drivers and support, and switching cards is accounted for. It will be no problem.


----------



## GreedyMuffin

I would use DDU. Then install new drivers. I always do that when I update my driver as well, might be overkill.


----------



## TheDeadCry

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I would use DDU. Then install new drivers. I always do that when I update my driver as well, might be overkill.


Might be a little over the top, lol - but I agree with always using DDU before installing a new graphics card. Or.....you can just go extreme like me and start with a clean install of windows. lol


----------



## zipper17

When I upgraded to a newer card, I uninstalled the old driver and then reinstalled the newest one. I think it's recommended at least to uninstall the old one (to get rid of the old registry entries) and then choose a clean install. A new card may come with new features in the driver, and letting the installer scan the new hardware can unlock them; if you don't uninstall the old driver, some of the card's newest features might stay locked even on the latest driver. CMIIW.


----------



## BroPhilip

Quote:


> Originally Posted by *Hunched*
> 
> There's finally a Gaming Z 1070 BIOS online... Anyone want to confirm whether it works with Samsung Gaming X and Gaming non-X?
> https://www.techpowerup.com/vgabios/185888/msi-gtx1070-8192-160608-2
> 
> I believe it's Samsung since it's "86.04.1E.00" and every Micron BIOS thus far has been "86.04.26.00"
> Yea I think I'm gonna flash it, its build date is before Micron cards even started appearing from MSI. Earliest Micron BIOS's are from a month later.
> There's no way this isn't a Samsung BIOS



One of them is mine, and it is from a Micron Gaming Z model... don't remember the BIOS number, will check when I get home.


----------



## rajaadima

Quote:


> Originally Posted by *BroPhilip*
> 
> One of themails is mine and it is from a micron gaming z model....don't remember the bios number will check when I get home


I flashed both BIOSes on my Gaming X (Samsung memory):
- With the one dated 07-05-2016, Fire Strike crashes instantly, but there's no problem in the Heaven and Valley benchmarks.
- With the other, Fire Strike crashes instantly, and Unigine Heaven shows a black screen with 0% GPU load for 3 to 5 seconds from time to time, and sometimes crashes too. Valley is no problem.

Rolled back to my saved BIOS.

Sorry for my English.


----------



## mrtbahgs

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I would use DDU. Then install new drivers. I always do that when I update my driver as well, might be overkill.


Quote:


> Originally Posted by *TheDeadCry*
> 
> Might be a little over the top, lol - but I agree with always using DDU before installing a new graphics card. Or.....you can just go extreme like me and start with a clean install of windows. lol


Quote:


> Originally Posted by *zipper17*
> 
> When I upgraded to a newer card, I did uninstall the old driver, and reinstall the newest driver. Just in case, I think it's recommended at least to uninstall the old one (get rid of old installed registry) and then choose clean install. New card would have a new features within the driver, and let the driver scanned a new hardware and that would unlock some new features. If you didn't uninstall old driver, probably the newest features from card would be locked, even though it's the latest driver. Cmiiw.


Thanks for all the quick replies, I will give DDU a try just to be safe and that's one of the main things I was thinking, if perhaps some feature isn't working 100% because of the card swap.
I considered a clean install of Win7, but I don't think I want to deal with the hassle of setting it all back up.
I am actually still on my original install from 4 years ago!


----------



## TheDeadCry

Quote:


> Originally Posted by *mrtbahgs*
> 
> Thanks for all the quick replies, I will give DDU a try just to be safe and that's one of the main things I was thinking, if perhaps some feature isn't working 100% because of the card swap.
> I considered a clean install of Win7, but I don't think I want to deal with the hassle of setting it all back up.
> I am actually still on my original install from 4 years ago!


Holy f**k, the longest I go before doing a clean install of Windows is maybe a few months. lol


----------



## mrtbahgs

Quote:


> Originally Posted by *TheDeadCry*
> 
> Holy F**k the longest I go before installing a clean version of windows is maybe a few months. lol


Wow lol we are opposites, I've heard of people doing a yearly reinstall, but is there a particular reason why you do it so often?
Other than some possibly slight performance decreases (and likely not noticeable unless benchmarked) I don't have a reason to reinstall as nothing is corrupt or acting up.
I guess it falls into the "If it ain't broke don't fix it" mantra.

If there is a big selling point (and proven facts) to show that I should take the time to do a clean install, please let me know since this seems like the best time.
I think a big part of the reason I don't want to go through it though is that I have multiple drives set up and I don't know how they would interact upon a fresh install.
- C: drive is my main SSD for OS and programs.
- D: drive is a HDD re-mapped to hold default music, pictures, documents, etc. (likely a simple fix to remap and point to existing data on the drive)
- G: drive is an SSD strictly for Steam and other games (this one might be odd or a hassle if I reinstall my OS and I don't know if Steam will repair any registry parts that are missing to link the games)
- M: drive is a large HDD for my ripped media to watch movies easier and hosts a Plex server. (Should just be straight data so no worries syncing this one back up)


----------



## TheDeadCry

Quote:


> Originally Posted by *mrtbahgs*
> 
> Wow lol we are opposites, I've heard of people doing a yearly reinstall, but is there a particular reason why you do it so often?
> Other than some possibly slight performance decreases (and likely not noticeable unless benchmarked) I don't have a reason to reinstall as nothing is corrupt or acting up.
> I guess it falls into the "If it ain't broke don't fix it" mantra.
> 
> If there is a big selling point (and proven facts) to show that I should take the time to do a clean install, please let me know since this seems like the best time.
> I think a big part of the reason I don't want to go through it though is that I have multiple drives set up and I don't know how they would interact upon a fresh install.
> - C: drive is my main SSD for OS and programs.
> - D: drive is a HDD re-mapped to hold default music, pictures, documents, etc. (likely a simple fix to remap and point to existing data on the drive)
> - G: drive is an SSD strictly for Steam and other games (this one might be odd or a hassle if I reinstall my OS and I don't know if Steam will repair any registry parts that are missing to link the games)
> - M: drive is a large HDD for my ripped media to watch movies easier and hosts a Plex server. (Should just be straight data so no worries syncing this one back up)


No, no, the reason I do a clean install relatively often is that I'm a Windows Insider, so I want everything fresh. Secondly, I can't stand the clutter left by old programs and other junk that gets stuck in every nook and cranny of Windows.


----------



## rfarmer

Quote:


> Originally Posted by *TheDeadCry*
> 
> No, no, I do a clean install relatively often is because I'm a windows insider, so I want everything fresh. Secondly, I can't stand the clutter left by old programs, and other junk that gets stuck in every nook and cranny in windows.


I am the same way, just did a fresh install yesterday. I am also a Windows Insider and much prefer format to upgrade.


----------



## Hunched

Quote:


> Originally Posted by *rajaadima*
> 
> I flashed both bios on my Gaming X (samsung mem):
> -with the one dated 07-05-2016 fire strike crashes instantly and no problem in Heaven and valley benchs
> -with the other fire strike crashes instantly and unigine Heaven shows me black screen and 0% on GPU load during 3 to 5 secondes from time to time. and sometime it crashes too. and Valley no problem
> 
> roll back to my saved Bios
> 
> Sorry for my english.


The Samsung BIOS is fine on my Samsung Gaming non-X.
I feel like saying your issues are because of something you did wrong, since you went ahead and made a choice as stupid as flashing a Micron BIOS to a Samsung card.


----------



## TheDeadCry

Quote:


> Originally Posted by *rfarmer*
> 
> I am the same way, just did a fresh install yesterday. I am also a Windows Insider and much prefer format to upgrade.


Yeah, and I do it so regularly that it's not really an inconvenience anymore. I have program installs ready and waiting, that I update often in my "Clean Install" folder on an external flash drive - stuff like that. The only part that's annoying is signing back into all my accounts (Windows, Steam, Chrome, Twitter, etc.) I have two step verification on everything, lol. Otherwise, I like doing it lol.


----------



## rfarmer

Quote:


> Originally Posted by *TheDeadCry*
> 
> Yeah, and I do it so regularly that it's not really an inconvenience anymore. I have program installs ready and waiting, that I update often in my "Clean Install" folder on an external flash drive - stuff like that. The only part that's annoying is signing back into all my accounts (Windows, Steam, Chrome, Twitter, etc.) I have two step verification on everything, lol. Otherwise, I like doing it lol.


Yeah, I find it pretty painless these days. I have a 128GB M.2 that is just for the OS and programs, a 960GB SSD for Steam, and a 2TB HDD for various other junk. My personal folders are redirected to my D: drive, and I try to keep all my programs updated as needed, so I can usually have it installed and set up the way I like in under an hour.


----------



## nacherc

GTX MSI GAMING 1070

CORE +100
MEMORY +950 (SAMSUNG)


----------



## TheDeadCry

Quote:


> Originally Posted by *mrtbahgs*
> 
> Wow lol we are opposites, I've heard of people doing a yearly reinstall, but is there a particular reason why you do it so often?
> Other than some possibly slight performance decreases (and likely not noticeable unless benchmarked) I don't have a reason to reinstall as nothing is corrupt or acting up.
> I guess it falls into the "If it ain't broke don't fix it" mantra.
> 
> If there is a big selling point (and proven facts) to show that I should take the time to do a clean install, please let me know since this seems like the best time.
> I think a big part of the reason I don't want to go through it though is that I have multiple drives set up and I don't know how they would interact upon a fresh install.
> - C: drive is my main SSD for OS and programs.
> - D: drive is a HDD re-mapped to hold default music, pictures, documents, etc. (likely a simple fix to remap and point to existing data on the drive)
> - G: drive is an SSD strictly for Steam and other games (this one might be odd or a hassle if I reinstall my OS and I don't know if Steam will repair any registry parts that are missing to link the games)
> - M: drive is a large HDD for my ripped media to watch movies easier and hosts a Plex server. (Should just be straight data so no worries syncing this one back up)


Also, just wanted to make a note. My drive setup is pretty similar to yours
- C: Main
- D: Steam as well as my movies
- E: Thumbdrive with important documents and such, as well as music
- F: Backup drive (Thumbdrive backup, and any other important documents/pictures/etc)

In this way, I know all of my important things are safe and have redundancies on my backup drive. Also, with regard to Steam, all you do is launch the installer from the drive and it auto-configures to your new install of Windows. I don't see any problems with a new install; you should be pretty set. That said, I do it all the time, and it takes me maybe 15 minutes to get up and running once Windows is installed. It all depends on what you are willing to do.


----------



## rfarmer

Quote:


> Originally Posted by *nacherc*
> 
> 
> 
> GTX MSI GAMING 1070
> 
> CORE +100
> MEMORY +950 (SAMSUNG)




Nice score, great OC on your memory.

Nvidia FE
Core +180
Memory + 650 (Samsung)


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *TheDeadCry*
> 
> Holy F**k the longest I go before installing a clean version of windows is maybe a few months. lol


Quote:


> Originally Posted by *mrtbahgs*
> 
> Wow lol we are opposites, I've heard of people doing a yearly reinstall, but is there a particular reason why you do it so often?
> Other than some possibly slight performance decreases (and likely not noticeable unless benchmarked) I don't have a reason to reinstall as nothing is corrupt or acting up.
> I guess it falls into the "If it ain't broke don't fix it" mantra.
> 
> If there is a big selling point (and proven facts) to show that I should take the time to do a clean install, please let me know since this seems like the best time.
> I think a big part of the reason I don't want to go through it though is that I have multiple drives set up and I don't know how they would interact upon a fresh install.
> - C: drive is my main SSD for OS and programs.
> - D: drive is a HDD re-mapped to hold default music, pictures, documents, etc. (likely a simple fix to remap and point to existing data on the drive)
> - G: drive is an SSD strictly for Steam and other games (this one might be odd or a hassle if I reinstall my OS and I don't know if Steam will repair any registry parts that are missing to link the games)
> - M: drive is a large HDD for my ripped media to watch movies easier and hosts a Plex server. (Should just be straight data so no worries syncing this one back up)


Quote:


> Originally Posted by *TheDeadCry*
> 
> No, no, the reason I do a clean install relatively often is that I'm a Windows Insider, so I want everything fresh. Secondly, I can't stand the clutter left by old programs and other junk that gets stuck in every nook and cranny in Windows.


Quote:


> Originally Posted by *rfarmer*
> 
> I am the same way, just did a fresh install yesterday. I am also a Windows Insider and much prefer format to upgrade.


Quote:


> Originally Posted by *TheDeadCry*
> 
> Yeah, and I do it so regularly that it's not really an inconvenience anymore. I have program installs ready and waiting, that I update often in my "Clean Install" folder on an external flash drive - stuff like that. The only part that's annoying is signing back into all my accounts (Windows, Steam, Chrome, Twitter, etc.) I have two step verification on everything, lol. Otherwise, I like doing it lol.


Quote:


> Originally Posted by *TheDeadCry*
> 
> Also, just wanted to make a note. My drive setup is pretty similar to yours
> - C: Main
> - D: Steam as well as my movies
> - E: Thumbdrive with important documents and such, as well as music
> - F: Backup drive (Thumbdrive backup, and any other important documents/pictures/etc)
> 
> In this way, I know all of my important things are safe, and have redundancy's on my backup drive. Also, with regards to steam, all you do is launch the installer from the drive and it auto configs to your new install of windows. I don't see any problems with a new install - you should be pretty set. However, I do it all the time, and it takes me maybe 15 minutes to get up and running once windows is installed. It all depends on what you are willing to do.


My OS will be 6 years old on January 14, 2017.









Organized and Backed-up as can be:


----------



## TheDeadCry

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> My OS will be 6 years old on January 14, 2017.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Organized and Backed-up as can be:


----------



## rfarmer

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> My OS will be 6 years old on January 14, 2017.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Organized and Backed-up as can be:
> 
> 


Holy hell that is a lot of drives.


----------



## TheDeadCry

I never keep much data around. I'm totally solid state now, for everything. I have one 1TB HDD left, and it's retired, sitting in my drawer in an external drive cage.


----------



## TheDeadCry

Quote:


> Originally Posted by *rfarmer*
> 
> Yeah I find it pretty painless anymore. I have a 128GB M.2 that is just for the OS and programs, a 960GB SSD for steam and a 2TB HDD for various other junk. My personal folders are redirected to my D: drive and I try and keep all my programs updated as needed. So I can usually have it installed and setup like I like in under an hour.


Notice any benefits with that M.2 over, say, a Samsung 850 EVO SATA? I was looking into perhaps getting 1 or maybe 2 950 Pros in RAID 0 on my board. ;P


----------



## rfarmer

Quote:


> Originally Posted by *TheDeadCry*
> 
> Notice any benefits with that m.2 over say a Samsung 850 EVO SATA? I was looking into perhaps getting 1 or maybe 2 950 pros in raid 0 on my board. ;P




My speeds are excellent, but other than slightly faster boot times, not really much different. I see a much bigger difference moving my games off a mechanical drive onto an SSD. So much nicer when you are loading large maps.

Now, the 950s in RAID 0 might give you a more noticeable improvement.


----------



## TheDeadCry

Quote:


> Originally Posted by *rfarmer*
> 
> 
> 
> My speeds are excellent, but other than a bit faster boot times not really much different. I see a lot bigger difference moving my games off of a mechanical drive onto a SSD. So much nicer when you are loading large maps.
> 
> Now the 950s in raid 0 might give you a more noticeable improvement.


Very nice! People always say how unnecessary things like this are for real-world performance, but most of us are not your average everyday home user. I will not be sated until everything loads instantly, lol. Besides, how does one quantify "real-world performance"? Just a little pet peeve of mine when it comes to some reviewers. I move files, reinstall, etc. all the time, which makes a huge difference with that extra bandwidth. I feel like some use cases are far underrepresented, especially for the community the reviewers often cater to, and especially for those of us who revel in that 1-second or whatever difference. It adds up, all those hundreds of millions of microseconds, y'know. I see it as a worthwhile investment if I think of it that way. That's just my two cents. I'm also just a numbers whore sometimes.


----------



## rfarmer

Quote:


> Originally Posted by *TheDeadCry*
> 
> Very nice! People always quote how unnecessary things like this are in real world performance, but most of us are not your average everyday home user. I will not be sated until everything loads instantly, lol. Besides, how does one quantify "Real world performance" Just a little pet peeve of mine when it comes to some reviewers. Like I move files, reinstall, etc. all the time - which makes a huge difference with that extra bandwidth. I feel like some use cases are far unrepresented. Especially for the community the reviewers often cater to - and especially those of use who revel in that 1 second or whatever difference. It adds up - all those hundreds of millions of microseconds add up, y'know. I see it as an investment worthwhile if I think of it that way. That's just my two cents. I'm also just a numbers whore sometimes.


I am the same way; the main reason I upgraded from Z97 to Z170 was to get the M.2 slot. I'm using m-ITX with only 1 slot, so no RAID 0 for me.








The biggest difference I noticed was when I bought Doom and accidentally installed it to the M.2 rather than my gaming SSD. Doom has some pretty big maps, and they loaded extremely quickly off the M.2; I could really tell the difference when I reinstalled to my gaming SSD.


----------



## TheDeadCry

For sure. It's just a matter of budgeting right now. Hopefully I'll be able to get a 950 Pro to try soon. I do worry what kind of impact RAID 0 will have on my 1070, though. Skylake has what, 22 PCIe lanes? So at x4 for each, I won't be able to run my 1070 @ x16.


----------



## TheDeadCry

Actually, now that I look, I think it's only 20 PCIe lanes? Edit: Ah, maybe 16 lanes from the CPU and 20 from the chipset. F**k this is confusing, but it looks like my 1070 will run at the full x16 even with RAID 0 950s.


----------



## rfarmer

Quote:


> Originally Posted by *TheDeadCry*
> 
> Actually now that I look, I think it's only 20 pcie lanes?


Actually, that was the big improvement with the Z170 chipset. Skylake itself only has 16 PCIe lanes, but Z170 boards add another 20 from the chipset, vs Z97 which only had the 16 PCIe lanes from the CPU.
So that gives you a total of 36 PCIe lanes.
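As a quick sanity check on the arithmetic, assuming both M.2 drives hang off the Z170 chipset (as on most Z170 boards), the GPU never gives up its 16 CPU lanes. A minimal sketch; the helper name is just illustrative:

```python
# Rough sketch of the Skylake/Z170 lane budget discussed above.
# Assumption: M.2 slots are wired to the Z170 chipset (PCH), so NVMe
# traffic never borrows the CPU lanes feeding the graphics card.
CPU_LANES = 16   # PCIe 3.0 lanes direct from the Skylake CPU
PCH_LANES = 20   # additional lanes provided by the Z170 chipset

def gpu_link_width(m2_drives_on_pch=2, lanes_per_m2=4):
    """Return the GPU's link width when M.2 drives use chipset lanes."""
    pch_used = m2_drives_on_pch * lanes_per_m2
    assert pch_used <= PCH_LANES, "chipset lane budget exceeded"
    return CPU_LANES  # untouched by the M.2 drives

print(CPU_LANES + PCH_LANES)  # total platform lanes: 36
print(gpu_link_width())       # GPU stays at x16 with 2 M.2 in RAID 0
```

So a 1070 plus two x4 NVMe drives fits comfortably, which matches the conclusion above.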


----------



## TheDeadCry

Quote:


> Originally Posted by *rfarmer*
> 
> Actually, that was the big improvement with the Z170 chipset. Skylake itself only has 16 PCIe lanes, but Z170 boards add another 20 from the chipset, vs Z97 which only had the 16 PCIe lanes from the CPU.
> So that gives you a total of 36 PCIe lanes.


Ah, thanks for clearing that up. Everyone made it seem far more complicated than it was, lmao.


----------



## gtbtk

Quote:


> Originally Posted by *Hunched*
> 
> There's finally a Gaming Z 1070 BIOS online... Anyone want to confirm whether it works with Samsung Gaming X and Gaming non-X?
> https://www.techpowerup.com/vgabios/185888/msi-gtx1070-8192-160608-2
> 
> I believe it's Samsung since it's "86.04.1E.00" and every Micron BIOS thus far has been "86.04.26.00"
> Yea I think I'm gonna flash it, its build date is before Micron cards even started appearing from MSI. Earliest Micron BIOS's are from a month later.
> There's no way this isn't a Samsung BIOS


There is also a Micron 86.04.26.00.xx Gaming Z BIOS on the site.


----------



## Majentrix

I have an i5 6600K at 4.6GHz and a 1070.
What's limiting my frames the most at 1440p? I've asked around and I'm getting conflicting answers.


----------



## Nukemaster

Quote:


> Originally Posted by *Majentrix*
> 
> I have an i5 6660k at 4.6GHz and a 1070.
> What's limiting my frames the most at 1440p? I've asked around and I'm getting conflicting answers.


It will vary from game to game. I do not think the CPU is holding you back unless you play MMO-type games, since they are hard on any CPU.

If your GPU usage is high when frame rates are lower than expected, the video card is doing all it can.

If your GPU usage is low when frame rates are lower than expected, the CPU (or the game itself) is holding you back.

If you use vsync, GPU usage will be lower whenever the card has power to spare (you are already hitting your sync frame limit), but that is not a bottleneck as such.
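That rule of thumb can be sketched as a tiny classifier over numbers you might read off an overlay or log. The 90% threshold and the sample values are illustrative assumptions, not fixed rules:

```python
# Minimal sketch of the GPU-usage rule of thumb above.
def classify(gpu_util, fps, fps_target, vsync_cap=None):
    """Guess what's limiting frame rate from average GPU utilization."""
    if vsync_cap and fps >= vsync_cap:
        return "capped"   # hitting the vsync/frame limit, not a bottleneck
    if fps >= fps_target:
        return "ok"       # nothing to diagnose
    # Low FPS: high GPU usage means the card is the limit, otherwise
    # suspect the CPU or the game engine itself.
    return "gpu-bound" if gpu_util > 90 else "cpu-or-engine-bound"

print(classify(98, 70, 100))        # gpu-bound
print(classify(55, 70, 100))        # cpu-or-engine-bound
print(classify(60, 144, 100, 144))  # capped
```

In practice you would eyeball the same thing from Afterburner/HWMonitor graphs rather than script it, but the logic is the same.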


----------



## TheDeadCry

Quote:


> Originally Posted by *Majentrix*
> 
> I have an i5 6660k at 4.6GHz and a 1070.
> What's limiting my frames the most at 1440p? I've asked around and I'm getting conflicting answers.


I would HIGHLY recommend using HWMonitor to keep an eye on both your CPU Usage and GPU usage, and from that you can determine whether or not you're being held back. Keep an eye on temps too - throttling is no good either.


----------



## rfarmer

Quote:


> Originally Posted by *Majentrix*
> 
> I have an i5 6660k at 4.6GHz and a 1070.
> What's limiting my frames the most at 1440p? I've asked around and I'm getting conflicting answers.


It would also be good to know which games; is it all games or just certain ones?


----------



## RaleighStClair

Quote:


> Originally Posted by *Majentrix*
> 
> I have an i5 6660k at 4.6GHz and a 1070.
> What's limiting my frames the most at 1440p? I've asked around and I'm getting conflicting answers.


Depends on the game, and it can vary with location/settings in games as well. For example, if you are playing Battlefield 4/1 on a server with more than 32 people, then your 6600K will 100% bottleneck you, whereas you would be totally fine with 32 players or going through the single-player campaign with that setup.

The Division, GTA 5, FC Primal, Fallout 4 (the entire metro Boston area), AC Syndicate, Witcher 3 (cities), etc. are all games I had bottlenecking issues with this year before I upgraded to a 6700K.


----------



## syl1979

Trying to push the limits on my 2500K + 1070

Timespy Graphics 6594
http://www.3dmark.com/spy/434844

Firestrike Extreme Graphics 10407
http://www.3dmark.com/fs/10140539

Top on list


----------



## Majentrix

Quote:


> Originally Posted by *rfarmer*
> 
> Would be good to know on which games also, is it all games or just certain ones?


My main games are WoW, GTA V, Overwatch, Planetside 2 and Trackmania. From what I understand these are mostly CPU limited games.


----------



## NFSxperts

Quote:


> Originally Posted by *Chaoz*
> 
> EVGA GTX 1070 SC or FTW not an option? They're cheaper than other cards where I bought mine.
> 
> The SC only has 1x 8-pin, but has ****ty OC potential, not that I need it anyway. The FTW has 2x 8-pin, and both have backplates. The whole reason I went with the SC is the 1x 8-pin.
> 
> Both cards have quite high boost clocks and a nice RGB LED on the side.


I caved in and bought the Zotac Amp. Haven't needed to oc yet. Usually boosts to high 1900s and sometimes 2000+
How much would it have cost in the US including taxes?
http://www.newegg.com/Product/Product.aspx?Item=N82E16814500400&cm_re=zotac_1070-_-14-500-400-_-Product


----------



## benjamen50

Quote:


> Originally Posted by *NFSxperts*
> 
> I caved in and bought the Zotac Amp. Haven't needed to oc yet. Usually boosts to high 1900s and sometimes 2000+
> How much would it have cost in the US including taxes?
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814500400&cm_re=zotac_1070-_-14-500-400-_-Product


That is pretty good if it already boosts above 2000MHz. The GPU could probably achieve 2150MHz or more, along with high memory overclocks.


----------



## Chaoz

Quote:


> Originally Posted by *NFSxperts*
> 
> I caved in and bought the Zotac Amp. Haven't needed to oc yet. Usually boosts to high 1900s and sometimes 2000+
> How much would it have cost in the US including taxes?
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814500400&cm_re=zotac_1070-_-14-500-400-_-Product


Nice card.

But I'm not from the US; it clearly says in my location that I'm from Belgium.


----------



## Balrogos

Quote:


> Originally Posted by *Majentrix*
> 
> My main games are WoW, GTA V, Overwatch, Planetside 2 and Trackmania. From what I understand these are mostly CPU limited games.


PlanetSide 2 is just an extremely badly optimized game.


----------



## Salem13

Quote:


> Originally Posted by *Majentrix*
> 
> My main games are WoW, GTA V, Overwatch, Planetside 2 and Trackmania. From what I understand these are mostly CPU limited games.


6600K and a 1070 FTW here. This may sound completely backwards, but I was getting low frames in GTA 5 when I had 3 stars on me.

I switched PhysX to the CPU and things got much smoother... I just checked, and driver updates have since switched it back, yet it still runs fine?

Maybe it just let the game know it could use the CPU instead???


----------



## Derpinheimer

Quote:


> Originally Posted by *RaleighStClair*
> 
> Depends on the game, and it can vary on location/settings in games as well. For example, if you are playing Battlefield4/1, on a server with more than 32 people than your 6600k will 100% bottleneck you. Where as you would be totally fine with 32players or going through the singe-player campaign with that setup.
> 
> The Division, GTA 5, FC primal, Fallout 4 (entire metro Boston area), AC Syndicate, Witcher 3 (cities), etc are all games I had bottlencking issues with this year before I upgraded to a 6700K.


Wow, so is an i7 3820 holding back a GTX 1080? It has Hyper-Threading, but it's older.


----------



## jamor

Quote:


> Originally Posted by *Derpinheimer*
> 
> Wow, so is an i7 3820 holding back a GTX 1080? It has Hyper-Threading, but it's older.


Depends what game you're playing. Some games are really CPU intensive, so if you add a 2nd display, for example, and multi-task or stream with it, you'd think you need a better GPU to keep your FPS high, but you actually need a better CPU and faster RAM to handle all of the processes while maintaining high FPS.


----------



## Derpinheimer

Quote:


> Originally Posted by *jamor*
> 
> Depends what game you're playing. Some games are really CPU intensive so if you add a 2nd display for example and multi-task or stream with it you'd think you need a better GPU for that to keep your FPS high but you actually need a better CPU and faster RAM to handle all of the processes while maintaining high FPS.


In the BF1 beta the framerate was OK, but it felt really sluggish. The perf overlay seemed kinda flat on both CPU/GPU but was spiking constantly (I know that's contradictory; I just mean the spikes were not up to like 200ms). Maybe it's memory, cuz I'm on 3x4GB DDR3-1600 (one stick died lol).


----------



## jamor

Quote:


> Originally Posted by *Derpinheimer*
> 
> In the BF1 beta the framerate was ok but it felt really sluggish. The perf_overlay seemed kinda flat on both cpu/gpu but was spiking constantly. (I know that's contradictory, i just mean the spikes were not to like 200ms) maybe it's memory cuz I'm on 3x4gb ddr3 1600 (one stick died lol)


Dual-channel memory doubles your bandwidth; without the 4th stick, your 3rd stick is running single-channel at half the bandwidth.
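For rough numbers: the usual theoretical peak for DDR3-1600 is 12.8 GB/s per 64-bit channel, so the matched pair gets double what the lone third stick does. A back-of-envelope sketch (theoretical peaks, not measured figures):

```python
# Theoretical peak bandwidth for DDR3 at a given transfer rate.
# Each channel has a 64-bit (8-byte) data bus.
def ddr3_bandwidth_gbs(mt_per_s, channels):
    return mt_per_s * 8 * channels / 1000

print(ddr3_bandwidth_gbs(1600, 2))  # 25.6 GB/s for the dual-channel pair
print(ddr3_bandwidth_gbs(1600, 1))  # 12.8 GB/s for the unmatched stick
```

Real-world throughput is lower, but the 2:1 ratio between the matched pair and the odd stick is the point.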


----------



## Chaoz

Quote:


> Originally Posted by *Derpinheimer*
> 
> In the BF1 beta the framerate was ok but it felt really sluggish. The perf_overlay seemed kinda flat on both cpu/gpu but was spiking constantly. (I know that's contradictory, i just mean the spikes were not to like 200ms) maybe it's memory cuz I'm on 3x4gb ddr3 1600 (one stick died lol)


I had no problems whatsoever playing the BF1 beta. Boost was stable @ 2050MHz, VRAM usage was around 4GB if I'm not mistaken, the framerate jumped from 100fps up to 120fps and back to 80 or so, and I saw my RAM usage spike up to 11GB. Doesn't really matter, as I have 32GB in quad channel with all 8 slots filled.

But overall performance didn't feel sluggish at all. It was smooth as hell.


----------



## ITAngel

Any word on Zotac GTX 1070 AMP! Extreme Water block? Still waiting for one.


----------



## GunnzAkimbo

Quote:


> Originally Posted by *nacherc*
> 
> 
> 
> GTX MSI GAMING 1070
> 
> CORE +100
> MEMORY +950 (SAMSUNG)


My 3x 680s score 4900; I can't justify spending $650 for a 1070 and getting exactly the same performance. I'm gonna need a bigger boat, I mean gfx card.


----------



## criminal

Quote:


> Originally Posted by *GunnzAkimbo*
> 
> my 3 x 680s score 4900, can't justify spending $650 for a 1070 and getting exactly the same performance. im gonna need a bigger boat, i mean gfx card.


1. Where do you live that a 1070 cost $650?
2. You will still have a better experience using one graphics card over 2 or 3.


----------



## Chaoz

Quote:


> Originally Posted by *criminal*
> 
> 1. Where do you live that a 1070 cost $650?
> 2. You will still have a better experience using one graphics card over 2 or 3.


That is indeed quite expensive. Maybe Canada, cuz of the taxes and whatnot?

Even where I bought my card, they only cost €460 or $515, which is 'kinda cheap'.


----------



## Prozillah

Quote:


> Originally Posted by *GunnzAkimbo*
> 
> my 3 x 680s score 4900, can't justify spending $650 for a 1070 and getting exactly the same performance. im gonna need a bigger boat, i mean gfx card.


If the only thing ur doing is benchmarking then sure


----------



## adamjp

In Canada, 1070s cost an average of $600. You can get them as low as $570, but the highest-end ones can go to $650. That is the same as a 980 Ti from some places now, so the new architecture offered almost no improvement in performance per dollar in Canada. That wasn't the case when you could get the 970 for very close to the same price as the 770. If you're buying a new GPU, then you should definitely buy a new-generation one to take advantage of the new features and technologies; however, if you already have a high-end GPU in Canada, don't bother unless you intend to move up an entire tier.


----------



## BulletSponge

Quote:


> Originally Posted by *Prozillah*
> 
> If the only thing ur doing is benchmarking then sure


I've wondered about that. With Nvidia saying that Pascal and future cards will not support more than 2-way SLI for gaming, will that 2-card limit apply to previous-generation cards in future drivers as well?


----------



## Lostcase

So I'm working on a build for my son. He just graduated from Air Force BMT and is off to tech school. I'm surprising him with a nice build, but it needs to have a small footprint. His desk space at tech school is extremely limited, so Mini-ITX it is. I picked up a Zotac 1070 AMP Extreme for $429 at Micro Center. Unfortunately, I picked up a case that is too small for this massive card, a Thermaltake V1. I even had someone at Micro Center double-check the specs and confirm that it would fit, just really tight.
That was not the case. Not only is the card about an inch too long, it also takes up about 2.5 PCI slots instead of just two.

Are there any other Mini-ITX users here with this card?

Thanks!


----------



## Nukemaster

One beastly looking card.


----------



## saunupe1911

Quote:


> Originally Posted by *Lostcase*
> 
> So I'm working on a build for my son. He just graduated from Air Force BMT and off to tech school. I'm surprising him with a nice build, but it needs to have a small foot print. His desk space is extremely limited at tech school, so Mini-ITX it is. I picked up a Zotac 1070 Amp Extreme for $429 at Micro Center. Unfortunately, I picked up a case that is too small for this massive card, Thermaltake V1. I even had someone at Micro Center double check the specs and confirmed that it would fit, just really tight.
> That was not the case. Not only is the card about an inch too long, but it takes up about "2.5" PCI slots instead of just two.
> 
> Are there any other Mini-ITX users here with this card?
> 
> Thanks!


You are an awesome dad.


----------



## Lostcase

Thank you, saunupe.


----------



## pez

Quote:


> Originally Posted by *Lostcase*
> 
> So I'm working on a build for my son. He just graduated from Air Force BMT and off to tech school. I'm surprising him with a nice build, but it needs to have a small foot print. His desk space is extremely limited at tech school, so Mini-ITX it is. I picked up a Zotac 1070 Amp Extreme for $429 at Micro Center. Unfortunately, I picked up a case that is too small for this massive card, Thermaltake V1. I even had someone at Micro Center double check the specs and confirmed that it would fit, just really tight.
> That was not the case. Not only is the card about an inch too long, but it takes up about "2.5" PCI slots instead of just two.
> 
> Are there any other Mini-ITX users here with this card?
> 
> Thanks!


That's a rather large card for mITX. I'm a huge fan of the Evolv ITX and the Ncase M1. The Evolv only has two slots and would choke the card too much even if it did fit, and the M1 has enough slots but is a tad too short. What is the max size you're looking for? Do any of the SFF-ish Fractal Design cases fit your criteria?


----------



## GunnzAkimbo

Quote:


> Originally Posted by *criminal*
> 
> 1. Where do you live that a 1070 cost $650?
> 2. You will still have a better experience using one graphics card over 2 or 3.


That's a cheapass 1070.

The quality units are $700+, up to $800+.

https://www.pccasegear.com/category/193_1829/graphics-cards/geforce-gtx-1070/sort/1

It's why I'm so hesitant to buy something that runs exactly the same. And games use 3 cards fine; I've been using these cards for 4 years now, since early 2012.

The only worthy single card is the TXP, and it's very expensive here... outrageously expensive.


----------



## greg8west

So I just picked up a new EVGA GTX 1070 SC, and I have a couple of questions.

First, at higher fan speeds (70%+) it makes a weird noise, like a whine. It's not the sound of air blowing, for sure. It's not very loud, but loud enough to be annoying; is this a common problem with EVGA's ACX 3 coolers?

Second, what's considered an average overclock for the 1070? Right now I have my core up to 2050MHz and my memory at 4700MHz, stable in everything I try, which seems like a pretty good overclock (+700 on the memory). I can even go over 4800 on the memory, but then it gets minor artifacts very often while still booting and running everything.

Is it worth it to keep the card and ignore the fan noise for the overclock potential, or should I just bring it back and upgrade to the FTW for a few extra bucks?
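The stepping approach described here (raise the memory offset until artifacts appear, then back off) can be sketched roughly like this. `apply_memory_offset` and `run_stress_test` are hypothetical stand-ins for whatever you actually drive by hand (an Afterburner profile plus a Heaven/Valley loop watched for artifacts); there is no standard API for this:

```python
# Hedged sketch of sweeping a memory offset to find a stable setting.
# The caller supplies the two hypothetical hooks; step/limit/backoff
# values are illustrative, not recommendations.
def find_stable_offset(apply_memory_offset, run_stress_test,
                       start=0, step=50, limit=800, backoff=50):
    offset = start
    while offset + step <= limit:
        apply_memory_offset(offset + step)
        if not run_stress_test():   # artifacts or a crash: went too far
            break
        offset += step
    stable = max(start, offset - backoff)  # leave headroom below the edge
    apply_memory_offset(stable)
    return stable
```

Backing off a notch from the last "clean" step matters because GDDR5 error correction can mask marginal settings that only show up as lower performance, as reviewers noted with the "memory hole" behavior.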


----------



## gtbtk

Quote:


> Originally Posted by *Lostcase*
> 
> So I'm working on a build for my son. He just graduated from Air Force BMT and off to tech school. I'm surprising him with a nice build, but it needs to have a small foot print. His desk space is extremely limited at tech school, so Mini-ITX it is. I picked up a Zotac 1070 Amp Extreme for $429 at Micro Center. Unfortunately, I picked up a case that is too small for this massive card, Thermaltake V1. I even had someone at Micro Center double check the specs and confirmed that it would fit, just really tight.
> That was not the case. Not only is the card about an inch too long, but it takes up about "2.5" PCI slots instead of just two.
> 
> Are there any other Mini-ITX users here with this card?
> 
> Thanks!


Looks like one of these could be just the trick for you

http://www.gigabyte.com/products/product-page.aspx?pid=5923#kf


----------



## drunkonpiss

@Lostcase

As a dad myself, I wanna be like you










There is a 1070 amp mini by Zotac that you can consider. Instead of having 3 fans, it only has 2. You can probably have it refunded and get the mini to perfectly fit your son's mini-ITX build.

https://www.zotac.com/us/product/graphics_card/zotac-geforce%C2%AE-gtx-1070-mini-0


----------



## Chaoz

Quote:


> Originally Posted by *greg8west*
> 
> So I just picked up a new Evga GTX1070 SC, and I got a couple questions.
> 
> First at higher fan speeds (70+) it makes a weird noise like a whine. Its not the sound of air blowing for sure. Its not very loud but loud enough to be annoying, is this a common problem with evga's ACX 3 cooler's?
> 
> Second whats considered an average overclock for the 1070? Right now I got my core up to 2050mhz and my memory at 4700mhz and is stable in everything I try, that seems like a pretty good overclock (+700 on the memory). I can even go over 4800 on the memory but it gets minor artifacts very often but still boots and runs everything.
> 
> Is it worth it to keep the card and ignore the fan noise due to overclock potential or should a just bring it back and upgrade to the FTW for a few extra bucks?


Mine doesn't make that noise. You can barely hear the fans spinning at all while in-game. I've got a custom fan profile set up, and because my card doesn't go over 63°C in-game, the fan stays at 60%.
Maybe try setting a slightly less aggressive curve?

Is it making that noise with the OC on, or at stock? Mine's still at stock; no need to OC it yet, tbh.
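A custom fan profile like that is just a set of (temperature, fan %) points that the tool linearly interpolates between. A rough sketch with made-up points (they reproduce the "60% at 63°C" behavior by construction; this is not anyone's actual profile):

```python
# Illustrative piecewise-linear fan curve: (temp °C, fan %) points.
CURVE = [(30, 30), (50, 45), (63, 60), (75, 80), (85, 100)]

def fan_percent(temp_c):
    """Interpolate fan speed between the nearest two curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]              # floor speed below the curve
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]                 # max speed above the curve

print(fan_percent(63))  # 60.0
```

Flattening the high-temperature end of the curve is the usual way to trade a few degrees for less whine.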


----------



## TheDeadCry

New insider preview build today, HURRAH! Time to do some benchmarking.







Today is a good day! Time for a clean install?


----------



## Hunched

Am I able to choose which profile MSI Afterburner defaults to on startup?
I have 4 profiles and "apply overclocking at system startup" enabled, but I can't choose which of the 4 it goes with.
It keeps defaulting to my profile with a locked 1.093V voltage, even if my last-used profile was #2, which I created for idle use.

Does anyone know how MSI AB chooses which profile to auto-apply at startup? I can't find an option. Kinda dumb.


----------



## TheDeadCry

Quote:


> Originally Posted by *Hunched*
> 
> Am I able to choose which profile MSI Afterburner defaults to on startup?
> I have 4 profiles and "apply overclocking at system startup" and I can't choose which of the 4 it goes with?
> It keeps defaulting to my profile with locked 1.093v voltage, even if my last used profile was #2 which I created for idle use.
> 
> Anyone know what makes MSI AB choose which profile to auto-apply at startup? I can't find an option. Kinda dumb.


Hmmm... as long as you have that profile selected when you restart, it should start with it. If you have "Force constant voltage" set in the settings, that's probably your issue.


----------



## TheDeadCry

Quote:


> Originally Posted by *Hunched*
> 
> Am I able to choose which profile MSI Afterburner defaults to on startup?
> I have 4 profiles and "apply overclocking at system startup" and I can't choose which of the 4 it goes with?
> It keeps defaulting to my profile with locked 1.093v voltage, even if my last used profile was #2 which I created for idle use.
> 
> Anyone know what makes MSI AB choose which profile to auto-apply at startup? I can't find an option. Kinda dumb.


There is also an option in Afterburner under "Profiles" where you can set a hotkey to switch between profiles.


----------



## Hunched

Quote:


> Originally Posted by *TheDeadCry*
> 
> Hmmm...As long as you have it set to the profile when you restart, it should start with that one. If you have "Force constant voltage" set in the settings, that's probably your issue.


Quote:


> Originally Posted by *TheDeadCry*
> 
> There is an option in afterburner under "profiles" where you can set a hotkey to switch between profiles, also.


I use the hotkeys.
The Force constant voltage option doesn't do anything on my card, not supported or whatever.

It always defaults to profile #4 on startup, I guess I can try swapping #2 and #4 and hope it still chooses #4.
There should really be somewhere you can choose which profile it chooses automatically at startup though... I don't understand what's special about #4 that it picks that every time.


----------



## RaleighStClair

Just messing with my new MSI 1070 Seahawk X with +125 core / +400 mem. Tops out at 2100 core with no throttling at 44C; not bad.


----------



## TheDeadCry

Quote:


> Originally Posted by *Hunched*
> 
> I use the hotkeys.
> The Force constant voltage option doesn't do anything on my card, not supported or whatever.
> 
> It always defaults to profile #4 on startup, I guess I can try swapping #2 and #4 and hope it still chooses #4.
> There should really be somewhere you can choose which profile it chooses automatically at startup though... I don't understand what's special about #4 that it picks that every time.


How odd. I've never experienced that problem before. :| I don't use multiple profiles often though, so I may not be the best source. Hmmm, have you tried changing power settings in the Nvidia control panel? Who knows, maybe maximum performance mode causes Afterburner to automatically use the most power-hungry profile? Idk.


----------



## Hunched

Quote:


> Originally Posted by *TheDeadCry*
> 
> How odd. I've never experienced that problem before. :| I don't use multiple profiles often though, so I may not be the best source. Hmmm have you tried changing power settings in Nvidia control panel. Who nows, maybe maximum performance mode causes afterburner to automatically use the most power hungry profile? Idk.


I'll test some things in a bit, not able to try anything right now.
I don't use maximum performance mode because I don't need my voltage and such high during Google Chrome use or anything like it.
Since I have a profile with the voltage locked at 1.093v for gaming, whenever it's used all power management modes will give identical results, they're nullified.
So I just have it on optimal power so my GPU can chill out when I'm not gaming, that's all I need, full throttle or no throttle which I toggle between.

This MSI AB thing isn't a big issue, it's just annoying that I have to correct it every single time I start my PC though, even though it just takes a second.
Not a fan of problems that shouldn't exist if I had the control I should have over the situation.


----------



## mrtbahgs

Perhaps MSI AB is selecting the last profile you saved?
So if there is a slot for a 5th, just copy the settings of the one you want to be default and put it into #5 and try that.

I've only used PrecisionX so I have no background with AB.


----------



## Hunched

Quote:


> Originally Posted by *mrtbahgs*
> 
> Perhaps MSI AB is selecting the last profile you saved?
> So if there is a slot for a 5th, just copy the settings of the one you want to be default and put it into #5 and try that.
> 
> I've only used PrecisionX so I have no background with AB.


That's a good idea, but I just figured it out.
It has to do with the lock, beside the profiles.

It seems like Profile #4 was what I last created before locking profile modification, and it seems whatever profile you've last chosen prior to locking will be the default startup profile.
So now that I've unlocked profile modification and applied #2 exactly as I did before, but without the lock enabled, it auto applies #2 on startup.

Profile modification is locked again, and if I apply #1, #3, #4, whatever and then restart, it will apply #2 upon startup.

So there we go.
Locking profile modification doesn't just stop you from accidentally modifying profiles.
The profile applied most recently before locking becomes the default startup profile, regardless of which profile was applied last after that.


----------



## Dude970

I use AB, and if "apply at startup" is selected when you switch to a profile, that profile remains selected. If you are switching to a different profile, unclick it first, and it will still remember the one you originally selected.


----------



## gtbtk

Quote:


> Originally Posted by *Hunched*
> 
> Am I able to choose which profile MSI Afterburner defaults to on startup?
> 
> I have 4 profiles and "apply overclocking at system startup" and I can't choose which of the 4 it goes with?
> 
> It keeps defaulting to my profile with locked 1.093v voltage, even if my last used profile was #2 which I created for idle use.
> 
> Anyone know what makes MSI AB choose which profile to auto-apply at startup? I can't find an option. Kinda dumb.


In settings, under the profile tab, you can set default profiles for 2D and 3D computing. Set your default profile to the 2d option and put your gaming profile in the 3d option.

No need to crank up fans and run higher voltages through the card if you are not loading it up


----------



## madmax95

Hi guys!

What's the word on custom bios flashing for GTX 1070 cards?


----------



## bigjdubb

Quote:


> Originally Posted by *madmax95*
> 
> Hi guys !
> 
> What's up about custom bios flashing on GTX 1070 graphic cards ?


Flashing a custom bios isn't a problem. Customizing a 1070 bios is a huge problem though, we don't have a bios editor yet.


----------



## jovanni

Quote:


> Originally Posted by *bigjdubb*
> 
> Flashing a custom bios isn't a problem. Customizing a 1070 bios is a huge problem though, we don't have a bios editor yet.

...this is what we're waiting for... mostly out of curiosity... to discover the limits of Pascal...


----------



## bigjdubb

I want a bios editor so that I can get rid of boost and just set my card to 2050mhz and 1.09 volts all the time. I think that if I can get rid of boost and frequency fluctuations I can get rid of my crazy frame time spikes and audio issues.


----------



## BroPhilip

Quote:


> Originally Posted by *bigjdubb*
> 
> I want a bios editor so that I can get rid of boost and just set my card to 2050mhz and 1.09 volts all the time. I think that if I can get rid of boost and frequency fluctuations I can get rid of my crazy frame time spikes and audio issues.


Just lock the voltage in the curve editor (Ctrl+L on the 1.093V marker) and set it as a preset in AB. That will lock the clock. I can keep a steady 2113 with no artifacts. If I drop the lock it becomes unstable and I have to drop to 2076-2088 with the curve.


----------



## mrtbahgs

I will do some more research on my end if needed, but wanted to try here for a quick response.

Is there much difference between dual link DVI-D and displayport?
Most importantly, will displayport be a hindrance in any way compared to DVI-D such as less monitor options/settings?

I believe displayport is newer tech so I assume it won't reduce/hurt anything, but I have never used it before and will need to with my 1070 so I want to be sure.
My options are to try and hook up my main monitor via DP or get a DP to DVI adapter and hook it up via DVI like I had been.
I have a secondary monitor that is limited to DVI so that is why I may need the adapter if my main monitor will suffer from a DP connection.


----------



## Hunched

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hunched*
> 
> Am I able to choose which profile MSI Afterburner defaults to on startup?
> 
> I have 4 profiles and "apply overclocking at system startup" and I can't choose which of the 4 it goes with?
> 
> It keeps defaulting to my profile with locked 1.093v voltage, even if my last used profile was #2 which I created for idle use.
> 
> Anyone know what makes MSI AB choose which profile to auto-apply at startup? I can't find an option. Kinda dumb.
> 
> 
> 
> In settings, under the profile tab, you can set default profiles for 2D and 3D computing. Set your default profile to the 2d option and put your gaming profile in the 3d option.
> 
> No need to crank up fans and run higher voltages through the card if you are not loading it up

I solved the issue myself, but the options you speak of do not exist in the profiles tab. All the profiles tab shows is Global Profile Hotkeys.


----------



## gtbtk

Quote:


> Originally Posted by *Hunched*
> 
> I solved the issue myself, but the options you speak of do not exist in the profiles tab. All the profiles tab shows is Global Profile Hotkeys.


It does in 4.3 beta 3 and 14, which are the released versions that support Pascal and the Boost 3.0 curves.


----------



## RaleighStClair

Quote:


> Originally Posted by *BroPhilip*
> 
> Just lock the voltage in the curve editor (Ctrl+L on the 1.093V marker) and set it as a preset in AB. That will lock the clock. I can keep a steady 2113 with no artifacts. If I drop the lock it becomes unstable and I have to drop to 2076-2088 with the curve.


How exactly do you lock the overclock at a specific voltage though? For example, 2150MHz core @ 1.093V.

I can't seem to actually lock the overclock at any specific core/voltage using the curve editor (Ctrl+F).


----------



## zipzop

Quote:


> Originally Posted by *RaleighStClair*
> 
> How exactly do you lock the overclock at a specific voltage though? For example, 2150mhz Core @ 1.093.
> 
> I can't seem to actually lock the overclock at any specific core/voltage using the curve editor (CNTRL + F).


http://www.guru3d.com/files-details/msi-afterburner-beta-download.html

"You may press "L" after selecting any point on the curve with mouse cursor to disable GPU dynamic voltage/frequency adjustment and lock the voltage and core clock frequency to a state defined by the target point. This feature allows you to test graphics card stability independently for each voltage/frequency point of the curve using real 3D applications or any stress test of your choice. In addition to stability testing usage scenario, MSI Afterburner allows you to save a curve with locked point setting to a profile, so you may easily switch between dynamic voltage/frequency management and fixed voltage/frequency settings in realtime (e.g. to achieve the maximum performance during benchmarking). Please take a note that fixed voltage and frequency settings do not allow you to disable power and thermal throttling"


----------



## Hunched

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hunched*
> 
> I solved the issue myself, but the options you speak of do not exist in the profiles tab. All the profiles tab shows is Global Profile Hotkeys.
> 
> 
> 
> it does in 4.3 beta 3 and 14 which are the versions that have been released that support pascal and the boost 3.0 curves

Not for me it doesn't.


----------



## gtbtk

Quote:


> Originally Posted by *Hunched*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hunched*
> 
> I solved the issue myself, but the options you speak of do not exist in the profiles tab. All the profiles tab shows is Global Profile Hotkeys.
> 
> 
> 
> it does in 4.3 beta 3 and 14 which are the versions that have been released that support pascal and the boost 3.0 curves
> 
> 
> Not for me it doesn't.

Strange. Maybe that is a feature reserved for MSI cards??


----------



## Hunched

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hunched*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hunched*
> 
> I solved the issue myself, but the options you speak of do not exist in the profiles tab. All the profiles tab shows is Global Profile Hotkeys.
> 
> 
> 
> it does in 4.3 beta 3 and 14 which are the versions that have been released that support pascal and the boost 3.0 curves
> 
> 
> Not for me it doesn't.
> 
> 
> 
> 
> Strange. Maybe that is a feature reserved for MSI cards??

I'm using an MSI Gaming 1070, lol


----------



## gtbtk

That is what my screen looks like.


----------



## Hunched

Quote:


> Originally Posted by *gtbtk*
> 
> that is what my screen looks like


I don't understand why I don't have those options. My general tab has all the same options and my settings are the same, except everything but voltage monitoring is checked.


----------



## Forceman

Quote:


> Originally Posted by *Hunched*
> 
> I don't understand why I don't have those options, my general tab has all the same options and my settings are the same except everything but voltage monitoring is checked.


Maybe uninstall and delete the AB folder? Maybe it has old config files that are messing it up in some way - I've had that happen before when switching from brands.


----------



## gtbtk

I was about to suggest uninstalling as well.

BTW, I have a Gaming X but I am running a gaming Z bios


----------



## Hunched

Quote:


> Originally Posted by *Forceman*
> 
> Maybe uninstall and delete the AB folder? Maybe it has old config files that are messing it up in some way - I've had that happen before when switching from brands.


Not sure I feel like having to reconfigure everything without possibly fixing anything.
It seems to be just these two options, and I don't think I need them.

Quote:


> Originally Posted by *gtbtk*
> 
> I was about to suggest uninstalling as well.
> 
> BTW, I have a Gaming X but I am running a gaming Z bios


I use the Z BIOS too.


----------



## TheDeadCry

Quote:


> Originally Posted by *bigjdubb*
> 
> I want a bios editor so that I can get rid of boost and just set my card to 2050mhz and 1.09 volts all the time. I think that if I can get rid of boost and frequency fluctuations I can get rid of my crazy frame time spikes and audio issues.


This


----------



## Dude970

Quote:


> Originally Posted by *Hunched*
> 
> Not sure I feel like having to reconfigure everything without possibly fixing anything.
> It seems to be just these two options, and I don't think I need them.
> I use the Z BIOS too.


How is the Z bios? What does it boost to when left at stock? I may add it to my Gaming X.


----------



## Hunched

It's hard to find info, but it seems like those profile management options might be part of RivaTuner Statistics Server, which I chose not to install with MSI Afterburner?


----------



## gtbtk

Quote:


> Originally Posted by *Hunched*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> Maybe uninstall and delete the AB folder? Maybe it has old config files that are messing it up in some way - I've had that happen before when switching from brands.
> 
> 
> 
> Not sure I feel like having to reconfigure everything without possibly fixing anything.
> It seems to be just these two options, and I don't think I need them.
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I was about to suggest uninstalling as well.
> 
> BTW, I have a Gaming X but I am running a gaming Z bios
> 
> 
> I use the Z BIOS too.

Try this and it may solve the issue.

In the [settings] section of your afterburner.cfg file in the c:\program files (x86)\MSI afterburner\profiles directory, try adding the following 2 lines and restart AB:

Profile2D=-1
Profile3D=-1

You may also need to check you have write access to the profiles directory.
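For anyone trying this, a sketch of where those lines land in the file. The exact file name and location vary by AB version, and the slot-number semantics in the comment are an assumption based on the post, not documented behavior:

```ini
; c:\program files (x86)\MSI afterburner\profiles\afterburner.cfg (path per the post above)
[Settings]
; ...leave the existing keys as they are, then add:
Profile2D=-1
Profile3D=-1
```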

Quote:


> Originally Posted by *Hunched*
> 
> It's hard to find info, but it seems like those profile management options might be part of Riva Statistics Server which I chose not to install with MSI Afterburner?


The Riva server provides a bunch of functionality to AB. I recommend you install it; it doesn't eat many resources.


----------



## TheDeadCry

Quote:


> Originally Posted by *mrtbahgs*
> 
> I will do some more research on my end if needed, but wanted to try here for a quick response.
> 
> Is there much difference between dual link DVI-D and displayport?
> Most importantly, will displayport be a hindrance in any way compared to DVI-D such as less monitor options/settings?
> 
> I believe displayport is newer tech so I assume it won't reduce/hurt anything, but I have never used it before and will need to with my 1070 so I want to be sure.
> My options are to try and hook up my main monitor via DP or get a DP to DVI adapter and hook it up via DVI like I had been.
> I have a secondary monitor that is limited to DVI so that is why I may need the adapter if my main monitor will suffer from a DP connection.


Always, always, always use DisplayPort when possible. DisplayPort supports higher bandwidth than HDMI and DVI, and it is the only cable capable of pushing very high resolutions at higher refresh rates. From the wiki: "DisplayPort 1.4 can support 8K UHD (7680×4320) at 60 Hz with 10-bit color and HDR, or 4K UHD (3840×2160) at 120 Hz with 10-bit color and HDR. 4K at 60 Hz with 10-bit color and HDR can be achieved without the need for DSC." The latest HDMI standard, by contrast, only supports up to 4K at 60Hz. DisplayPort is far and away the superior connection; not even in the same ballpark as HDMI/DVI. The DVI adapter would be the limiting factor. I have a DisplayPort to HDMI connector, so your DisplayPort to DVI should work just fine, but you will be limited to the bandwidth of DVI. If your monitor supports DisplayPort natively, always use that.
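To put rough numbers on that, a quick back-of-the-envelope check. This is a sketch: the link rates used are commonly cited usable data rates (7.92 Gbit/s for dual-link DVI, 17.28 Gbit/s for DP 1.2 HBR2), and the ~20% blanking margin is an assumption, not a measured figure:

```python
def payload_gbps(w, h, hz, bpp=24):
    """Raw pixel payload in Gbit/s (blanking overhead excluded)."""
    return w * h * hz * bpp / 1e9

# Commonly cited usable data rates in Gbit/s (after encoding overhead)
links = {"dual-link DVI": 7.92, "DisplayPort 1.2 (HBR2)": 17.28}

need = payload_gbps(2560, 1440, 60)   # a 1440p 60Hz monitor
print(round(need, 2))                 # ~5.31 Gbit/s before blanking
for name, cap in links.items():
    # ~20% margin as a rough allowance for blanking intervals
    print(name, "OK" if cap > need * 1.2 else "too slow")
```

So for a 1440p60 panel both links have headroom, which matches the manual; DP only starts to matter once you push resolution or refresh rate higher.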


----------



## TheDeadCry

Quote:


> Originally Posted by *Hunched*
> 
> I don't understand why I don't have those options, my general tab has all the same options and my settings are the same except everything but voltage monitoring is checked.


Have you flashed to a different bios by chance? Maybe the oc by default bios, the gaming z bios, etc?


----------



## Hunched

Quote:


> Originally Posted by *gtbtk*
> 
> The Riva server provides a bunch of functionality to AB. I recommend you install it. It doesnt eat much resources


I found a post from Unwinder who developed RivaTuner and it says:
"The server is no longer being distributed as a part of client applications like MSI Afterburner and EVGA Precision. Now it comes under generic RivaTuner Statistics Server name with own installer and can be optionally installed when necessary. The clients will automatically hide any server dependent functionality (such as On-Screen Display, framerate monitoring, *automatic 2D/3D profiles management*, screen and video capture and so on) when the server is not installed"
This is from 2013 though, probably still accurate.

I always choose to install the bare minimum of what I need.
I have SpeedFan running and that already monitors my GPU and CPU temps, I can run GPU-Z in the background if I need more info.

Quote:


> Originally Posted by *Dude970*
> 
> How is the Z-bios. What does it boost to when left at stock using it? I may add it to my Gaming-x


I didn't test it at "stock" but I tested it with core and memory clocks at +0, but voltage and power/temp limits maxed, and it boosted to 2050mhz on my card.
Everyone's power/temp limits should be maxed anyway, voltage too if you aren't TDP throttling.


----------



## Dude970

Thanks, may give it a go soon


----------



## mrtbahgs

Quote:


> Originally Posted by *TheDeadCry*
> 
> Always, Always, Always use display port when possible. Displayport supports higher bandwidth than HDMI and DVI. It is the only cable capable of pushing very high resolutions at higher refresh rates, from the wiki, "DisplayPort 1.4 can support 8K UHD (7680×4320) at 60 Hz with 10-bit color and HDR, or 4K UHD (3840×2160) at 120 Hz with 10-bit color and HDR. 4K at 60 Hz with 10-bit color and HDR can be achieved without the need for DSC." Whereas the lastest HDMI standard only supports up to 4k at 60hz. Display port is far and away the superior connection. Not even in the same ballpark as HDMI/DVI. The DVI adapter would be the limiting factor. I have a displayport to HDMI connector, so your displayport to DVI should work just fine - but you will be limited to the bandwidth of DVI. If your monitor supports displayport natively, always use that.


Awesome, thank you for the informative post and exactly what I was looking for. +Rep

I will have to see if my main monitor has a DP cable in the box or if I need to pick one up.
It's a 1440p 60Hz monitor, bought when 1440p was pretty new to the market, so I'm pretty sure dual-link DVI-D was plenty (as per the manual or online), but good to know DP should be the same or better.


----------



## gtbtk

Quote:


> Originally Posted by *Dude970*
> 
> Thanks, may give it a go soon


I actually get slightly better benchmarks with the Gaming X bios when overclocked (Firestrike graphics score 20500 vs 20300).

The Z boosts to about 1970MHz at stock Gaming mode settings. It will boost to 2038 then drop back if I increase the voltage to +100 and leave everything else at the defaults.


----------



## TheDeadCry

Quote:


> Originally Posted by *mrtbahgs*
> 
> Awesome, thank you for the informative post and exactly what I was looking for. +Rep
> 
> I will have to see if my main monitor has a DP cable in the box or if I need to pick one up.
> Its 1440p 60hz as it is bought when 1440p was pretty new to the market so I am pretty sure dual link DVI-D was plenty (as per the manual or online), but good to know DP should be the same or better.


No problem. Thank You for the rep.

Idk what DisplayPort version your monitor supports, though. If it's older it won't support the latest standards, but you should still use DisplayPort if possible.


----------



## RaleighStClair

Quote:


> Originally Posted by *zipzop*
> 
> http://www.guru3d.com/files-details/msi-afterburner-beta-download.html
> 
> "You may press "L" after selecting any point on the curve with mouse cursor to disable GPU dynamic voltage/frequency adjustment and lock the voltage and core clock frequency to a state defined by the target point. This feature allows you to test graphics card stability independently for each voltage/frequency point of the curve using real 3D applications or any stress test of your choice. In addition to stability testing usage scenario, MSI Afterburner allows you to save a curve with locked point setting to a profile, so you may easily switch between dynamic voltage/frequency management and fixed voltage/frequency settings in realtime (e.g. to achieve the maximum performance during benchmarking). Please take a note that fixed voltage and frequency settings do not allow you to disable power and thermal throttling"


Thanks. It appears I am power limited, which I believe would need to be fixed with a modded bios, which is not available yet for 10-series NV GPUs? So I guess I will just wait for bios editing tools (like I did on my 970) or stick to 2000MHz core.

Has there been any progress on a Pascal bios tweaker tool?


----------



## mrtbahgs

Quote:


> Originally Posted by *TheDeadCry*
> 
> No problem. Thank You for the rep.
> 
> Idk what displayport version your monitor supports, though. If it's older it won't support the latest standards, but you should still use displayport if possible.


It looks like DP 1.2 is what my monitor has and I will try a DP cable tomorrow.
According to a nice and hopefully correct table on wikipedia, I am good to go with DP1.2

https://en.wikipedia.org/wiki/DisplayPort#Resolution_and_frame_rate_support_for_DisplayPort

Also to keep things on track with this thread, I tested my 1070 G1 Gaming in BF4 for an hour or so and according to HWinfo my out of the box max clock is 1974!
I haven't paid too much attention yet, but that seems pretty promising to me and will hopefully result in a good manual OC later.


----------



## TheDeadCry

Quote:


> Originally Posted by *mrtbahgs*
> 
> It looks like DP 1.2 is what my monitor has and I will try a DP cable tomorrow.
> According to a nice and hopefully correct table on wikipedia, I am good to go with DP1.2
> 
> https://en.wikipedia.org/wiki/DisplayPort#Resolution_and_frame_rate_support_for_DisplayPort
> 
> Also to keep things on track with this thread, I tested my 1070 G1 Gaming in BF4 for an hour or so and according to HWinfo my out of the box max clock is 1974!
> I haven't paid too much attention yet, but that seems pretty promising to me and will hopefully result is a good manual OC later.


Ah, well, talking display performance and such isn't really off track, so to speak. But that's pretty good!

My MSI Gaming X boosts around that range.


----------



## TheDeadCry

My 1070 is running like a champ connected to my two 21:9 monitors - it really is a damn good card. Coming from the 780, I'm still blown away even two months after getting it. The monitors are both 2560x1080...once you go 21:9...you can't go back. Like, not only do games run great, they look great and I'm not even remotely interested in 4k. Unfortunately the two monitors only have HDMI, so I have one connected via HDMI and the other connected via an HDMI to display port adapter.


----------



## kevindd992002

I was told by our supplier that there is a batch of new incoming stocks for this GPU that has a new fan revision. The fans, he said, will be similar to the GTX 1060 fans. Is this accurate information? And if so, how would the new fans be “better”?


----------



## Nightingale

Quote:


> Originally Posted by *adamjp*
> 
> In Canada 1070's cost an average of $600. You can get them as low as $570 but the highest end ones could go to $650. This is the same as 980 Ti from some places now, so basically the advance in architecture offered almost no improvement to "performance per dollar" in Canada. This wasn't the case when you could get the 970 very close to the same price as the 770. If you're buying a new GPU then you definitely should buy a new one to take advantage of the new features and technologies, however if you already have a high end GPU in Canada, don't bother unless you intend to move up an entire tier.


Then comes the tax, so that $570 card is now $650 out the door.


----------



## Nightingale

I've got to say I am very pleased I made the initial investment into custom water loops 3 years ago. For instance, my 1070 uses an EK universal GPU block and I just sit here at 2.2GHz, never having clock frequency fluctuations, not even having to use 100% voltage, never going above 38C at full load. A simple $65 universal block, already used on 2 previous GPUs, is still serving me well today. No need to buy special edition OC bios versions with higher boost, since I can manually set a profile.

I just felt like mentioning this because with Pascal I see all these inconsistent temp/clock frequency behavior issues.


----------



## bigjdubb

Quote:


> Originally Posted by *zipzop*
> 
> http://www.guru3d.com/files-details/msi-afterburner-beta-download.html
> 
> "You may press "L" after selecting any point on the curve with mouse cursor to disable GPU dynamic voltage/frequency adjustment and lock the voltage and core clock frequency to a state defined by the target point. This feature allows you to test graphics card stability independently for each voltage/frequency point of the curve using real 3D applications or any stress test of your choice. In addition to stability testing usage scenario, MSI Afterburner allows you to save a curve with locked point setting to a profile, so you may easily switch between dynamic voltage/frequency management and fixed voltage/frequency settings in realtime (e.g. to achieve the maximum performance during benchmarking). Please take a note that fixed voltage and frequency settings do not allow you to disable power and thermal throttling"


I hadn't been able to get anything to actually lock using Afterburner. Last night I was talking with someone on Teamspeak and they gave me a suggestion that actually worked.

For whatever reason Afterburner would not lock anything with Ctrl+L. BUT, if I lock it, then apply the setting, save it to a profile, make that profile the default, and reboot, I have locked frequencies and voltage.

With my frequency and voltage locked I no longer get any frametime spikes, and I was able to play a game and keep my microphone working for the rest of the night. Boost 3.0 appears to be the source of my problems, but thankfully that problem goes away with a locked frequency. Now if we can get a bios editor I can ditch Afterburner.

My 1070 gaming and desktop experience is finally enjoyable, but I have to keep the frequency locked (I locked mine at 2050MHz) to have an enjoyable experience.


----------



## zipzop

Mine came in today! Yay. It finally arrived this morning after waiting what felt like a century due to Newegg back order, and the warehouse being on the other side of the country.

Anyhow, I got the SC Gaming model from EVGA with the 8-pin connector. Jumping straight into OC'ing: it boosts to the 1930MHz range out of the box. With the power limit cranked, it tends to hold that frequency better. A custom fan curve at around 57% usually keeps it at 60C and under, and nice and quiet. So far it's handling a 2100MHz+ OC without a hitch, which is great. Around 2177MHz/2164MHz is where I got a crash in GTA V. Unigine Heaven crashes at a bit lower frequency... still some tweaking to do with the voltage maybe, and the V/F curve I imagine. Should be interesting.


----------



## zipper17

Hey guys, is it recommended to use Molex to PCIe adapters? I read some Google results saying it's not recommended and causes problems.

I have a dual-Molex to PCIe adapter that came with the graphics card itself (free accessory).

Planning to SLI 1070s; my PSU is non-modular, so PCIe connectors are limited.


----------



## bigjdubb

Quote:


> Originally Posted by *zipper17*
> 
> Hey guys is it recommended to Use Molex to PCIEx ? i read some google, said it's not recommended, it causes a problems.
> 
> I have molex to pciex that come from the graphic card itself (free accessories).
> 
> planning to SLI 1070, my PSU are non modular, so pciex are limited.


You need to make sure that the power supply can handle the second card, but if it can then the adapters should be fine. It's not ideal (adapters never are), but it isn't unsafe unless you are overloading the power supply.
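As a rough sanity check on "can the PSU handle it", a back-of-the-envelope sketch. The 150W TDP is the reference 1070 figure; the 200W rest-of-system number and the 20% headroom factor are assumptions, not measurements:

```python
# Rough PSU sizing check for 1070 SLI (nameplate TDPs, not measured draw)
gpu_tdp = 150        # W, reference GTX 1070 TDP
system_rest = 200    # W, generous ballpark for CPU + board + drives
load = 2 * gpu_tdp + system_rest
print(load)          # 500 W estimated peak
print(load * 1.25)   # with ~20% headroom -> 625.0, so a quality 650W unit
```

Keep in mind overclocking raises the power limit above TDP, so err on the larger side.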


----------



## zipper17

Quote:


> Originally Posted by *Nightingale*
> 
> I've got to say I am very pleased I made the initial investment into custom water loops 3 years ago. For instance my 1070 uses a EK universal gpu block and I just sit here at 2.2Ghz never having clock frequency fluctuations, not even having to use 100% voltage, never going above 38 on full load. A simple $65 universal block used already in 2 previous GPU's still severing me well today. No need to buy special edition OC bios versions with higher boost, since I can manually set a profile.
> 
> I just felt like mentioning this cause I see with Pascal all these temp/clock frequency inconsistent behavioral issues.


Yes, water + 38C + 2.2GHz is awesome; the card will never throttle down.

But is it stable in all testing? 3DMark, the 3DMark stress test, Valley, Heaven, games, etc.?

I can get to 2114MHz and it will hold steady at 2076MHz in Valley 1440p ExtremeHD and a couple of games, but running the 3DMark FS Extreme stress test it crashes after some loops, and Witcher 3 crashes after 10 minutes of play. The chip lottery and air cooling are not the best of friends; GPU Boost will love water.

SLI might be a better option rather than struggling with one inconsistently OC'ed card.
Quote:


> Originally Posted by *bigjdubb*
> 
> You need to make sure that the power supply can handle the second card but if it can then the adapters should be fine. It's not ideal, adapters never are, but it isn't unsafe unless you are overloading the power supply.


Okay. But I read somewhere that some people still say it's not safe because it causes heat in the wires, especially when the card draws a lot of power. I might just go for a new PSU then.


----------



## Nightingale

On water I am 100% stable: Valley, 3DMark, and most importantly the games I play: Witcher 3, Mirror's Edge Catalyst, BF4, GTA V, etc.


----------



## Dude970

Quote:


> Originally Posted by *zipper17*
> 
> Hey guys is it recommended to Use Molex to PCIEx ? i read some google, said it's not recommended, it causes a problems.
> 
> I have molex to pciex that come from the graphic card itself (free accessories).
> 
> planning to SLI 1070, my PSU are non modular, so pciex are limited.


I would upgrade the PSU, get a good modular one with the needed pci-e connectors


----------



## zipper17

Core clock speed steps (from 2000 to 2150):
2150
2138
2126
2114
2100
2088
2076
2063
2050
2037
2025
2012
2000

Does anyone actually know how much is gained step by step between these clock speeds? Is there any apples-to-apples comparison?


----------



## Nukemaster

Those may be rounded.

Kepler used 13MHz per step, so it looks pretty close.

It may also be something strange like 12.xx at a time.

I think the gain is pretty minimal. You can use AB to force -13MHz steps to see how it affects a benchmark.


----------



## zipper17

It's not really worth putting a lot of effort into OCing if the gain between those clock speeds is only a tiny amount.

Mine is stable in the 2063, *2050, 2037, 2025*, 2012 range (the bolded values on average) at 0% voltage.

125% power, 92°C target.
Memory @9000MHz (288GB/s).
Custom fan curve 1:10 (example: 70°C @ 80% fan speed).
Hysteresis at 8°C.

With max voltage I can reach 2114MHz, steady at 2076MHz, but only stable in Valley; the 3DMark stress test and Witcher 3 crash.
I haven't tried higher than 2114 yet. The card has 6+8-pin power though, so I'm not sure what causes the crashes: my GPU (bad luck) or something else (the PSU)?

When gaming:
I choose a 71°C target as the maximum temperature allowed.
Adaptive VSync on via the control panel (which means an FPS cap at the refresh rate in every game, less heat, and no tearing for free).
Mainly playing at 1440p (lighter online games at 1080p).
Optimal Power.

So far, games like Hitman, GTA V, and mainly Witcher 3 are what make the card sweat.
An old game like Crysis 1 at max settings in 1440p still dips to ~45FPS though, lol.


----------



## BroPhilip

Quote:


> Originally Posted by *zipper17*
> 
> Yes water + 38 c + 2.2ghz are awesome, card will never throttle down.
> 
> But is it stable to all testing? 3dmark, 3dmark stress test, valley, heaven, and games, etc?
> 
> I can get @2114mhz and will steady @2076mhz on valley 1440p extremehd and a couple of games, but when running 3dmark FS Extreme stress test crashes after some loops, and witcher 3 crashes after playing 10 minutes. Chip lottery, and aircooling are not the best friend. GPu boost will love water.
> 
> SLI might be a better option rather than struggle with inconsistent OC'ed one card.
> Okay. But I read somewhere, some people still say it's not safe because will cause heat on the wires ..blabla, especially when the card consume a lot of power. I might be just go for a new psu then.


That is probably true... your OC results seem to be on target for 98% of the people out there, except those with Founders Editions or those limited by Micron memory like me. Unless we get a BIOS editor, it seems NVIDIA has locked down the cards; otherwise we could probably push close to stock 1080 performance.


----------



## Forceman

Quote:


> Originally Posted by *zipper17*
> 
> Is there anyone know actually How much gain step by step between those clock speed?? is there any apples to apples comparison ?


Well, at best you are looking at something on the order of 0.6%, with the actual gain being less than that, so a single step is essentially meaningless.
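For anyone who wants to sanity-check that figure, the arithmetic is just the bin size over the base clock. A rough sketch; the 12.5MHz step size is the value reported elsewhere in this thread:

```python
# Relative gain of one GPU Boost 3.0 bin near a 2000MHz core clock.
# 12.5MHz per step is the figure quoted in this thread; real FPS
# scaling is less than linear, so this is a best case.
step_mhz = 12.5
base_mhz = 2000.0

gain_pct = step_mhz / base_mhz * 100
print(f"one step is at most a {gain_pct:.3f}% clock increase")

# Even in a 100FPS game with perfect clock scaling, that is well under 1 FPS:
best_case_fps_gain = 100.0 * step_mhz / base_mhz
print(f"best-case gain at 100FPS: {best_case_fps_gain:.3f} FPS")
```

That 0.625% ceiling is why chasing one extra bin rarely shows up outside of benchmark score noise.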


----------



## phalae

Hello guys,

This is my latest benchmark, but I don't understand why the card is throttling.

I start at 2113MHz, drop to 2088MHz for a while, and finish at 2075MHz at the end of the benchmark. My temps look OK, no?









What do you think ?

http://tof.canardpc.com/view/1731ca3a-19e8-425a-8257-1805b46bfe08.jpg



----------



## Bold Eagle

Quote:


> Originally Posted by *zipper17*
> 
> Hey guys is it recommended to Use Molex to PCIEx ? i read some google, said it's not recommended, it causes a problems.
> 
> I have Dual-molex to pciex that come from the graphic card itself (free accessories).
> 
> planning to SLI 1070, my PSU are non modular, so pciex are limited.


List your PSU so meaningful comments can be made.

I have a Gainward unit.

https://www.techpowerup.com/gpuz/details/mb8gh


----------



## Dude970

Quote:


> Originally Posted by *phalae*
> 
> Hello Guys,
> 
> This is my last benchmark but I don't understand why the card is throttling ?
> 
> I start at 2113mhz go to 2088Mhz for a while and finish at 2075mhz at the end of the benchmark. My temps looks ok no ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What do you think ?
> 
> http://tof.canardpc.com/view/1731ca3a-19e8-425a-8257-1805b46bfe08.jpg
> 


Turn up the fans more; it will lower the GPU temp and reduce the throttling somewhat.


----------



## ZakZakXxX

Quote:


> Originally Posted by *Jimbags*
> 
> Try extreme you pansy. Your rocking a 1070 after all....


Are there any better words than these?

Overall, the result is clear.
Re-tested again.
I can't do the Extreme test at the moment.

http://www.3dmark.com/3dm/14859313

At 2088MHz:

Graphics Score: 20859


----------



## ZakZakXxX

Quote:


> Originally Posted by *phalae*
> 
> Hello Guys,
> 
> This is my last benchmark but I don't understand why the card is throttling ?
> 
> I start at 2113mhz go to 2088Mhz for a while and finish at 2075mhz at the end of the benchmark. My temps looks ok no ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What do you think ?
> 
> http://tof.canardpc.com/view/1731ca3a-19e8-425a-8257-1805b46bfe08.jpg
> 


It's mostly because of heat in the card.
If the temperature stayed below 45°C, it would never clock down.


----------



## mrtbahgs

I have a Gigabyte G1 Gaming so I downloaded the Xtreme Gaming tool to play with the RGB LEDs and notice it also is an OC tool.

Will the Gigabyte OC software work just the same (net similar results) as PrecisionX, or am I better off going with what I consider a more commonly known and proven OC tool (PrecisionX)?

If I should use PrecisionX, I assume I'll have to uninstall the Xtreme Gaming software so it doesn't conflict with PrecisionX's OC settings, and then I give up the ability to tinker with the LED color options. Or can I have both installed without Xtreme Gaming trying to overwrite my OC settings from PrecisionX?


----------



## THEROTHERHAMKID

You can use it to overclock,
but I'd use Precision or MSI Afterburner.
Yes, remove Xtreme Gaming if you're overclocking yourself, or just don't set it to run.


----------



## zipper17

Quote:


> Originally Posted by *phalae*
> 
> Hello Guys,
> 
> This is my last benchmark but I don't understand why the card is throttling ?
> 
> I start at 2113mhz go to 2088Mhz for a while and finish at 2075mhz at the end of the benchmark. My temps looks ok no ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What do you think ?
> 
> http://tof.canardpc.com/view/1731ca3a-19e8-425a-8257-1805b46bfe08.jpg
> 


Do you have 3DMark Fire Strike? I suggest running it, plus the Fire Strike Extreme stress test if you haven't yet.
Complete all 20 loops with a Frame Rate Stability of at least 97%. If you pass that, your card is mostly stable.

Then try running some heavy games, e.g. Witcher 3, GTA 5, etc., that keep the GPU at 99% load during the test.
Crank up the settings, resolution at least 1440p. If you don't have a 1440p monitor, try DSR.
Make sure nothing in any application crashes, artifacts, or does anything weird. Fully 100% stable.
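If you want to check the 97% bar yourself, 3DMark's Frame Rate Stability figure is derived from the best and worst loops (roughly the worst loop's average FPS over the best loop's). A quick sketch of that pass/fail logic; the per-loop FPS lists are made up for illustration:

```python
# Rough sketch of the stress-test pass criterion described above:
# Frame Rate Stability is approximately min(loop avg FPS) / max(loop
# avg FPS), and the pass bar is 97% over all 20 loops.
def stress_test_passed(loop_avg_fps, required=0.97, loops=20):
    if len(loop_avg_fps) < loops:
        return False  # didn't complete the full run
    return min(loop_avg_fps) / max(loop_avg_fps) >= required

# One loop dipping from 98 to 96 FPS still passes (about 98%),
# but a loop dropping to 90 out of 100 fails (90%).
print(stress_test_passed([98.0] * 19 + [96.0]))
print(stress_test_passed([100.0] * 19 + [90.0]))
```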

Yes, this temperature throttling seems a bit unfair in my opinion.
Why keep throttling down if the card is stable at, say, 2050 anyway, even at 70-80°C?


----------



## gtbtk

Quote:


> Originally Posted by *bigjdubb*
> 
> Quote:
> 
> 
> 
> Originally Posted by *zipzop*
> 
> http://www.guru3d.com/files-details/msi-afterburner-beta-download.html
> 
> "You may press "L" after selecting any point on the curve with mouse cursor to disable GPU dynamic voltage/frequency adjustment and lock the voltage and core clock frequency to a state defined by the target point. This feature allows you to test graphics card stability independently for each voltage/frequency point of the curve using real 3D applications or any stress test of your choice. In addition to stability testing usage scenario, MSI Afterburner allows you to save a curve with locked point setting to a profile, so you may easily switch between dynamic voltage/frequency management and fixed voltage/frequency settings in realtime (e.g. to achieve the maximum performance during benchmarking). Please take a note that fixed voltage and frequency settings do not allow you to disable power and thermal throttling"
> 
> 
> 
> I hadn't been able to get anything to actually lock using Afterburner. Last night I was talking with someone on teamspeak and they gave me a suggestion that actually worked.
> 
> For whatever reason afterburner would not lock anything with ctrl L. BUT, If I lock it then apply the setting and save it to a profile and then make that profile the default profile and reboot I have locked frequencies and voltage.
> 
> With my frequency and voltage locked I no longer get any frametime spikes and I was able to play a game and keep my microphone for the rest of the night. Boost 3.0 appears to be the source of my problems but thankfully that problem goes away with a locked frequency. Now if we can get a bios editor I can ditch Afterburner.
> 
> My 1070 gaming and desktop experience is finally enjoyable but I have to keep the frequency locked (I locked mine at 2050 mhz) to have an enjoyable experience.

That is an interesting observation. thanks for sharing


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> core clock speed step (from2000-2150)
> 
> 2150
> 
> 2138
> 
> 2126
> 
> 2114
> 
> 2100
> 
> 2088
> 
> 2076
> 
> 2063
> 
> 2050
> 
> 2037
> 
> 2025
> 
> 2012
> 
> 2000
> 
> Is there anyone know actually How much gain step by step between those clock speed?? is there any apples to apples comparison ?


12.5MHz per step.


----------



## Dude970

Broke 2800 running Heaven


----------



## RaleighStClair

I take it there is no way to increase the power limit on these 1070s yet?

My MSI 1070 Sea Hawk apparently maxes out at 105% in Afterburner. I wish I could bump that to 120%.


----------



## asdkj1740

Quote:


> Originally Posted by *RaleighStClair*
> 
> I take it their is no way to increase the power limit on these 1070s yet?
> 
> My MSI 1070 seahawk apparently maxes out at 105% in Afterburner. Wish I could bump that to 120%.


You can try the hard mod, which is very easy to do and reversible: just short the shunts with Liquid Ultra. Some guide videos for this can be found on YouTube.
I don't think it will help a lot though, as the main limitation is GPU Boost 3.0.


----------



## asdkj1740

I would like to add an EVGA Hybrid AIO to an EVGA FTW 1070.
Does anyone know the dimensions of the GPU mounting holes on the EVGA 1070 FTW, the 980 Ti reference, and the 1070 reference?


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> i would like to add a evga hybrid aio to evga ftw 1070
> does anyone knows the sizes of the evga 1070 ftw gpu mounting holes and 980ti references gpu mounting holes as well as 1070 reference holes?


I believe they are the same.

This is a video of something similar to what you want to do


----------



## Nevii

How much performance am I losing with memory chips (Micron) that overclock poorly? More or less, let's say 2100/8700 vs 2100/9400. I'm thinking about returning my card and buying another one with (I hope) better memory chips. Is it worth the effort, or will the gain in performance be too low? I'm getting 102FPS in Heaven (maxed out at 1080p). I'm also wondering if it's still possible to buy a card with Samsung memory chips, or have all manufacturers switched to Micron?

EDIT: My card only hits 86% of the power limit, and the throttle reason is VRel. Is that normal?


----------



## Skyblaze

Hey I'm new here!









I switched back to nVidia a while ago with a Palit DUAL GTX 1070 after being with AMD for a long time. It was the cheapest non-reference model I could get at the time, and I love it after my old 7970, especially how silent and relatively cool it is. Still, being such a cheap model, it hits the 82°C thermal limit in Overwatch and Witcher 3 at 1440p, so I wondered: has anyone had any luck undervolting their 1070? The new dynamic curve of GPU Boost 3.0 seems to make it a bit harder than usual, but it should theoretically be possible to undervolt by shifting the curve left in Afterburner so the clock speeds are set at a lower voltage. Has anyone tried this before?


----------



## gtbtk

Quote:


> Originally Posted by *Skyblaze*
> 
> Hey I'm new here!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I switched to nVidia a while ago again with a Palit DUAL GTX 1070 after being with AMD for a long while. It was the cheapest non-reference model I could get at the time and I love it after my old 7970, especially how silent and relative cool it is. Though being such a cheap model it still manages to hit the thermal-limit of 82°C in Overwatch and Witcher 3 in 1440p so I wondered, did anyone have any luck undervolting their 1070? The new dynamic-curve of GPU Boost 3.0 seems to make it a bit more difficult than usual but thinking about it it would theoretically be possible to undervolt them by adjusting the curve more to the left in Afterburner so the clockspeeds are set at a lower voltage, did anyone try this before?


Yes, you can use the curve to set the peak at, say, 1.00V, but why not just set a custom fan curve and/or adjust the thermal limit?


----------



## gtbtk

Quote:


> Originally Posted by *Nevii*
> 
> How many performance I'm losing with memory chips (Micron) that overclocks poorly? More or less, let's say 2100/8700 vs 2100/9400. I'm thinking about returning my card and buying another one with (i hope) better memory chips. Is it worth the effort, or gain in performance will be too low? I'm getting 102FPS in Heaven (maxed out with 1080p resolution). Also I'm wondering if it's still possible to buy card with Samsung memory chips, or maybe all producers switched to Micron?
> 
> EDIT: My card is only hitting 86% of power limit, and throttle reason is VRel, is it normal?


Micron RAM doesn't overclock that badly; I can clock my MSI Gaming X with Micron memory to 9100MHz. You just need to lock the voltage in the Afterburner curve to a value higher than 0.800V before you move the memory slider past about +480 to +500. Below that, it doesn't need the voltage lock.

With my i7-2600 OC'd to 4.4GHz I get about the same in Heaven, with a score of about 2580, so that is not an unreasonable score. It may be a bit slow if you have a recent top-end CPU running at high clock rates. 86% power is pretty normal on my MSI.


----------



## Skyblaze

Quote:


> Originally Posted by *gtbtk*
> 
> Yes you can use the curve to set the peak at say 1.00v but why not just set a custom fan curve and/or adjust the thermal limit?


Thanks for the reply! I thought about a custom fan curve, but coming from a Club3D 7970 Royal Queen, which was amazing at overclocking but unfortunately quite loud, the near-silence of the 1070 feels like heaven. Also, wouldn't I slow my card's performance that way? As I understood it, GPU Boost pushes the voltage and clocks until the thermal limit is reached, so if I simply cap it at 1.00V, wouldn't it clock lower than uncapped? And about the thermal limit: I noticed I can unlink Core Clock and Thermal Limit, but what happens if I set the Thermal Limit to 75°C while they are unlinked? Would the fan simply spin faster?

I've been overclocking GPUs and CPUs since the 5770/6850 era, but this new dynamic system is something I have to wrap my head around first.


----------



## syl1979

Quote:


> Originally Posted by *Skyblaze*
> 
> Thanks for the reply! I thought about a custom fan-curve but coming from a Club3D 7970 Royal Queen which was amazing in overclocking but unfortunately quite loud the near silence of the 1070 feels like heaven. Also wouldn't I slow my card's performance then? As far as I understood it GPU Boost pushes the voltages and speed till the thermal limit is reached so if I simply cap it at 1.00v wouldn't it be clocking slower than uncapped? Also about the Thermal Limit, I noticed I can unlink Core Clock and Thermal Limit but what would happen if I set the Thermal Limit to 75°C while they are unlinked? Would the fan simply spin faster?
> 
> I've been overclocking GPUs and CPUs since the 5770/6850 age but this new dynamic-system is something I have to wrap my head around first.


I am using an Afterburner profile with the curve capped starting at 0.993V and overclocked to 2050MHz for 24/7 use, memory at +425. It runs really cool.


----------



## Skyblaze

Quote:


> Originally Posted by *syl1979*
> 
> I am using an afterburner profile with curv capped starting 0.993v and overclock to 2050mhz for 24/24 use, memory at +425. Runs really cool.


Hmm, interesting. How would I go about that? So once I cap the curve, do I basically have to set the clocks manually instead of it being dynamic?


----------



## Nukemaster

Quote:


> Originally Posted by *syl1979*
> 
> I am using an afterburner profile with curv capped starting 0.993v and overclock to 2050mhz for 24/24 use, memory at +425. Runs really cool.


This seems like an interesting idea.


----------



## Skyblaze

Quote:


> Originally Posted by *Nukemaster*
> 
> This seems like an interesting idea.


This was actually my main goal with undervolting. My Palit DUAL doesn't reach frequencies as high with GPU Boost since its cooler is obviously not as effective as the more expensive ones, but I thought that with a slight undervolt I could push the clocks a bit further thermally, so it's more in line with the better models. Then again, I'm mostly CPU-bound anyway with my 3570K, so yeah...


----------



## syl1979

One example:









http://img.techpowerup.org/160905/undervolt-0-993-2075-curv.png

Make this type of curve. Keep voltage at 0%. Push the power limit to max and adjust the memory as well.

Boost will push the card to max frequency. You will still have at least one step of thermal throttling (at 35°C); the second is at 60°C.
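Conceptually, this curve edit clamps every voltage point at or above your chosen cap to one target clock, so Boost never has a reason to request more voltage. A toy sketch; the curve points below are illustrative, not real Afterburner data:

```python
# Toy model of the undervolt curve edit: clamp every point at or
# above the voltage cap to the target clock. Points are made up.
def flatten_curve(curve, v_cap, f_target):
    """Return a new (voltage V, clock MHz) curve flattened at v_cap."""
    return [(v, f_target if v >= v_cap else f) for v, f in curve]

stock = [(0.800, 1700), (0.900, 1850), (0.993, 1975),
         (1.050, 2025), (1.093, 2063)]
undervolted = flatten_curve(stock, v_cap=0.993, f_target=2050)
print(undervolted)  # every point from 0.993V up now sits at 2050MHz
```

Since no point above the cap offers a higher clock, Boost settles at the cap voltage and stays there under load.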


----------



## syl1979

To find the right frequency, you have to check stability (the Fire Strike stress test is the most effective, in my view).

After that, check whether the Fire Strike graphics score increases when you increase the voltage. If it does, that reveals remaining internal errors and means you have to decrease the frequency a little.

That's why I went down from 2075 to 2050.


----------



## Skyblaze

Quote:


> Originally Posted by *syl1979*
> 
> One exemple :
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://img.techpowerup.org/160905/undervolt-0-993-2075-curv.png
> 
> Make this type of curv. Keep voltage at 0%. Push power limit to max and adjust also the memory.
> 
> The boost will push the card to max frequency.. You will still have at least one step of thermal throttling (at 35degC). The second will be at 60degC.


Thanks, but that didn't work out for me. I was testing with Overwatch; unmodified, my card would scratch at 1900MHz with 1024mV locked. I tried what you said, with the only difference being that I set the curve to 1955MHz @ 993mV to be safe at the beginning. Running Overwatch again, it ran at 1911MHz with 1050mV. I'm missing something here :/


----------



## syl1979

Can you share your curve?


----------



## BroPhilip

So, I thought I would post my OC results for everyone to critique my experience with the MSI Gaming Z (Micron). I used Fire Strike over Time Spy as it crashed my driver quicker... thought this might help anyone looking at this card...

Specs:
Asus Z170-A
I5-6600K oc to 4.7ghz
600 evga psu

(I am listing max boost speed plus the step down after first thermal. My Temps averaged between 50-59 with custom fan curve to 40% - 95% ramp up between 40-50 degrees)

Factory Gaming Mode
No voltage or power limit increase
1987-1974. Score 15020
With voltage power increase
1999

Factory OC mode
No voltage or power increase
1999 - 1987. Score 15086
With voltage and power increase
2025

OC with no voltage or power (highest stable)
2050-2037 Mem 8700 score 15635

OC with 126% power, +100 core voltage (max stable)
2088-2062 Mem 8700 score 15706

In Time Spy I was able to break 2100 with 8700 memory with a voltage lock at 1093mV and scored close to 6100. However, this would crash Fire Strike... the sweet spot seems to be around 2050 for daily use.

So how does this compare to everyone else's oc experience?


----------



## Skyblaze

Quote:


> Originally Posted by *syl1979*
> 
> Can you share your curv ?


Sure, here:


----------



## syl1979

It seems OK. Are you sure you set the voltage to 0%?

And just in case: you have to hit "Apply" on the Afterburner main screen...


----------



## Skyblaze

Quote:


> Originally Posted by *syl1979*
> 
> It seems ok. Are you sure you have set the voltage to 0% ?
> 
> And just in case you have to make "apply" in afterburner main screen.....


Hmm, well, yes, my voltage sits at +0% and I did press Apply. Though trying it again now, I noticed that the entire curve keeps its shape but offsets slightly upward once I hit Apply. This is all very weird.


----------



## syl1979

The curve is adjusted in real time based on temperature... It should move to its highest position when below 32-35°C.

Maybe the undervolt doesn't work the same for you due to differences in BIOS or driver parameters (I think I am on the default Balanced power setting)?


----------



## Skyblaze

Quote:


> Originally Posted by *syl1979*
> 
> The curv is real time adjusted based on temperature.... It should move to highest position when below 32-35 degC.
> 
> Maybe the undervolt doesn't work same as me due to differences in the bios or driver parameters (i think i am on defaut balanced power setting)?


Ah, okay, that explains why my curve moved. That could be it, though I'm also on Balanced. Is there any way to keep the curve from passing a specified mV value?


----------



## syl1979

You can try locking the point with "L" on the curve, but it will disable the lower voltage at idle.


----------



## Skyblaze

Quote:


> Originally Posted by *syl1979*
> 
> You can try to lock the point with "L" on the curve
> 
> But it will disable the lower voltage at idle


Eh, that's also not really desirable. I'll play around with the curve a bit more tomorrow; thanks for the help so far!


----------



## GreedyMuffin

I run my GTX 1080 at 2012MHz at 900mV. Mem is +500.

Seems like my card liked undervolting! :-D


----------



## xg4m3

Is it worth paying 37€ more for MSI X edition over G1 from Gigabyte, for 1070?

MSI Gaming X is 537€
Gigabyte G1 is 499€


----------



## syl1979

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I run my GTX 1080 at 2012mhz at 900MV. Mem is 500+.
> 
> Seems like my card liked undervolting! :-D


For me there is a frequency wall at around 2100-2150MHz on these Pascal cards.


----------



## zipper17

Quote:


> Originally Posted by *Nevii*
> 
> How many performance I'm losing with memory chips (Micron) that overclocks poorly? More or less, let's say 2100/8700 vs 2100/9400. I'm thinking about returning my card and buying another one with (i hope) better memory chips. Is it worth the effort, or gain in performance will be too low? I'm getting 102FPS in Heaven (maxed out with 1080p resolution). Also I'm wondering if it's still possible to buy card with Samsung memory chips, or maybe all producers switched to Micron?
> 
> EDIT: My card is only hitting 86% of power limit, and throttle reason is VRel, is it normal?


From what I know, it's not poor overclocking as such: Micron memory on the 1070 has poor voltage management. When the overclocked memory needs a higher voltage at a certain frequency, the voltage isn't bumped up immediately from the idle state, and that causes graphical errors/artifacting. CMIIW.


----------



## Hunched

I kind of wish I could have different fan curves for different profiles in MSI Afterburner.
Since I lock my voltage to 1.093V whenever I game, it gets a bit silly as the fans exit and enter 0RPM mode constantly during pause screens and idle areas.
If you have it set to 0RPM up to 50°C or 60°C, it will do this.

Unfortunately, the "User Define" button in MSI AB is not saved to profiles in its current state, enabled or disabled.
I don't need 0RPM mode when my voltage is 1.093V at all times during gaming, but I do want 0RPM mode when I'm idle browsing, and I can only have one fan curve.

I suppose it's not a big deal, but I'm sure it's not great for the fans' lifespan to go from 0% to 50% like 15 times between finding matches...
I guess I can just make a custom fan curve that never hits 0RPM and toggle the User Define button in MSI AB before and after I play games, since the BIOS fan curve uses 0RPM up to 60°C.


----------



## zipper17

Quote:


> Originally Posted by *Hunched*
> 
> I kind of wish I could have different fan curves for different profiles in MSI Afterburner.
> Since I lock my voltage to 1.093v whenever I game, it gets a bit stupid as they exit and enter 0rpm mode constantly during pause screens and idle areas.
> If you have it set to 0rpm up to 50c or 60c it will do this.
> 
> Unfortunately "User Define" button in MSI AB is not saved to profiles in the current state you have it in, enabled or disabled.
> I don't need 0rpm mode when my voltage is 1.093v at all times during gaming, but I want 0rpm mode when I'm idle browsing, but I can only have 1 fan curve.
> 
> I suppose it's not a big deal, but I'm sure it's not great for the fans lifespan to go from 0% to 50% like 15 times in between finding matches...
> I can just make a custom fan curve that never hits 0rpm and toggle the User Define button in MSI AB before and after I play games I guess, since BIOS fan curve uses 0rpm up to 60c.


There's a setting called Temperature Hysteresis in AB. Set it to 10°C or so and the fan RPM won't drop immediately on pause/idle screens.
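For anyone curious what hysteresis actually does, it is roughly this: ramp up immediately, but only ramp down once the temperature has fallen a set number of degrees below the point where the current speed was set. A toy sketch, with made-up curve values:

```python
# Toy model of fan-speed hysteresis: ramp up right away, but hold the
# current speed through brief temperature dips (pause screens etc.)
# until temp has fallen hyst_c degrees below the last set point.
def fan_target(temp_c, last_set_temp, last_speed, curve, hyst_c=10):
    new_speed = max((s for t, s in curve if temp_c >= t), default=0)
    if new_speed >= last_speed:
        return new_speed, temp_c          # ramp up immediately
    if last_set_temp - temp_c >= hyst_c:  # fell far enough: allow drop
        return new_speed, temp_c
    return last_speed, last_set_temp      # hold speed through the dip

curve = [(40, 30), (55, 50), (70, 80)]   # (°C threshold, % fan speed)
speed, anchor = fan_target(72, 0, 0, curve)           # load -> 80%
speed, anchor = fan_target(66, anchor, speed, curve)  # pause dip -> still 80%
print(speed)
```

With a 10°C hysteresis, the fans only slow down once the card has actually cooled off, instead of cycling every time a menu opens.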


----------



## zipper17

Quote:


> Originally Posted by *syl1979*
> 
> One exemple :
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://img.techpowerup.org/160905/undervolt-0-993-2075-curv.png
> 
> Make this type of curv. Keep voltage at 0%. Push power limit to max and adjust also the memory.
> 
> The boost will push the card to max frequency.. You will still have at least one step of thermal throttling (at 35degC). The second will be at 60degC.


OMG, this works on mine. Thanks, dude! How did you figure this out, btw?









Is there any difference in performance when running 2050 at a higher voltage? And is it safe to run the card undervolted like this?

I got a steady 2050 at 0.993V in a Fire Strike Extreme custom-loop Graphics Test 2 run of about 23,000 frames, no crashes.
60°C temp, though with 100% fan speed.

I didn't know you could run 2050 at only 0.993V.

Btw, in general, how much voltage do I need to get a steady 2101MHz?


----------



## syl1979

On my card i need 1.09v for 2100mhz. Not worth the extra power....


----------



## zipper17

Quote:


> Originally Posted by *syl1979*
> 
> On my card i need 1.09v for 2100mhz. Not worth the extra power....


What happens if the voltage isn't matched correctly to the core speed? Will it crash? Do I need to trial-and-error it?


----------



## syl1979

You can find the optimum voltage for a desired frequency. Put your fans at 80% or max, then run the Fire Strike benchmark from 1.05V down to 0.9V and check at which voltage level the score starts to go down.
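That sweep could be sketched like this; `run_benchmark` is a hypothetical stand-in for launching Fire Strike at a given voltage cap and reading back the graphics score:

```python
# Sketch of the downward voltage sweep: fix the core clock, step the
# voltage cap down in 25mV bins, and keep the lowest voltage whose
# score is still within tolerance of the 1.050V baseline (internal
# errors shave the score before an outright crash).
def find_min_mv(run_benchmark, hi_mv=1050, lo_mv=900,
                step_mv=25, tolerance=0.005):
    baseline = run_benchmark(hi_mv)
    best = hi_mv
    for mv in range(hi_mv - step_mv, lo_mv - 1, -step_mv):
        score = run_benchmark(mv)
        if score < baseline * (1 - tolerance):
            break  # score fell off: the previous voltage was the floor
        best = mv
    return best

# Fake score model just to exercise the loop: stable down to 975mV,
# then errors start costing points.
fake_bench = lambda mv: 20000 if mv >= 975 else 19000
print(find_min_mv(fake_bench))  # -> 975 with this fake model
```

In practice each `run_benchmark` step is you launching Fire Strike by hand and noting the graphics score, but the stopping logic is the same.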


----------



## zipper17

2050MHz stable at only 0.993V; what about the others?

2114 ?
2100 ?
2088 ?
2076 ?
2063 ?
2050 = 0.993V (stable after a custom-loop run of FS Extreme GT2)
2037 ?
2025 ?
2012 ?
2000 ?
Quote:


> Originally Posted by *syl1979*
> 
> You can try to find the optimum voltage for desired frequency. Put your fans to 80% or max. You can make firestrike bench from 1.05v down to 0.9. Check at which voltage level thr score starts to go down.


Hmm, what is the score you mean, btw, lol? I misunderstood.


----------



## syl1979

I mean the benchmark graphics score/result.


----------



## zipper17

Quote:


> Originally Posted by *syl1979*
> 
> I mean the benchmark graphic score/result


Does that mean that even at the same core clock speed, a different voltage will give a different result?
Quote:


> Originally Posted by *syl1979*
> 
> You can try to find the optimum voltage for desired frequency. Put your fans to 80% or max. You can make firestrike bench from 1.05v down to 0.9. Check at which voltage level thr score *starts to go down*.


Will a higher voltage produce a better result even at the same core speed?

I misunderstood, but thanks.


----------



## ucode

Quote:


> Originally Posted by *zipper17*
> 
> core clock speed step (from2000-2150)
> 2150
> 2138
> 2126
> 2114
> 2100
> 2088
> 2076
> 2063
> 2050
> 2037
> 2025
> 2012
> 2000
> 
> Is there anyone know actually How much gain step by step between those clock speed?? is there any apples to apples comparison ?


Gain? Those are the voltage points that get set to a clock frequency. One point could have a 0MHz difference from the next, or a 100MHz difference, for example.


----------



## zipper17

Just did some tests with a 0.993V curve target, 0% voltage.

0.993V - 2063/2050MHz = 20193

0.993V - 2088/2076MHz = 20188

1.050V - 2063/2050MHz = 20298

(Fire Strike graphics scores, +600MHz memory, 0% voltage, 100% fan speed)

Strange; it looks like performance is lost with the undervolt to 0.993V, even though it can maintain those higher clocks. I'm a bit confused by Pascal's overclocking behavior.

My current overclock is 20862 at 2050/2038MHz with 60% voltage.
I probably need better cooling to run at 2100MHz+.
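Putting numbers on those runs: the gap between the undervolted and the 1.050V results is only about half a percent, which would be consistent with syl1979's point that marginal-voltage errors, not the indicated clock, decide the score:

```python
# Relative differences between the Fire Strike graphics scores above,
# using the 0.993V / 2063MHz run as the baseline.
scores = {
    "0.993V 2063/2050MHz": 20193,
    "0.993V 2088/2076MHz": 20188,
    "1.050V 2063/2050MHz": 20298,
}
base = scores["0.993V 2063/2050MHz"]
deltas = {k: (s - base) / base * 100 for k, s in scores.items()}
for label, d in deltas.items():
    print(f"{label}: {d:+.2f}%")
```

Note the higher-clocked 0.993V run actually scores slightly lower than the baseline, which is the usual symptom of running below the voltage the chip wants for that bin.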


----------



## Nevii

Quote:


> Originally Posted by *zipper17*
> 
> From what I know, it's not because poor overclock, Micron memory on 1070 has poor voltage management, when OC'ed Memory need a higher voltage at certain frequency the voltage are not bump up immediately from idle state, that causing a graphical error/artifacting. Cmiiw.


Today I cooled the room where my computer stands down to 16°C and pushed all my fans to 100%. That gives a maximum of 45°C on the GPU under stress and let me overclock to a stable 2101/9100, which gives a nice 106.6FPS in Heaven. When I tried to push my Microns a little further (+600 vs +550), the throttling came back, the card went down to 1.081V, and after a while it crashed, but without any chessboard artifacts; it just reset its clocks and froze on a black screen for a while. Without cooling the room down, just overclocking in regular circumstances, my card's memory would fail with anything above +300.

I guess I won't be returning my card; I'll just wait for maybe a BIOS update, or maybe a BIOS editor.


----------



## jovanni

Quote:


> Originally Posted by *xg4m3*
> 
> Is it worth paying 37€ more for MSI X edition over G1 from Gigabyte, for 1070?
> 
> MSI Gaming X is 537€
> Gigabyte G1 is 499€


My G1 runs like a charm at around 59°C (25°C room temp). Max stable 2126, max mem 9000 (Micron). From everything I have read here (and I went through all the posts in the 1070 thread), it all comes down to the silicon lottery, except for the memory, where it's mostly foggy estimations (Micron vs Samsung).


----------



## BroPhilip

Quote:


> Originally Posted by *jovanni*
> 
> My G1 runs like a charm around 59c temp (25 room temp). Max stable 2126. Max mem 9000 (micron). For all I have read here (and I went through all posts in 1070 thread) it all bind to the silicon lottery except the memory which is kind of foggy estimations (micron vs samsung)


Ladies and gentlemen we have a lottery winner lol


----------



## mrtbahgs

Is there a decent 10-series OC guide that I can read?
It's been a while since I dialed in the OC on my 670, so I am out of practice, and with all these changes I am sure the process is a bit different.

Basically I'm just trying to better understand some of the settings/options in Precision XOC and what OC step-ups are recommended to get through things fairly quickly, like jumping in 50s or 25s, etc.
Is Heaven/Valley enough to benchmark and dial in results, with a deeper stress test for stability once I feel I've hit my limit?
Is it still best to do the core clock by itself first and then add the memory clock after? And at what point is a bit more core worse than a fair amount more memory (+13 vs +50, for example)?

Just hoping to do things right the first time, but also not take 4 hours to find a stable OC.


----------



## Skyblaze

Well I read something about Pascal undervolting in another thread and it works perfectly!







No need to mess with the curve: just lower your power and temperature limits and raise the clock offset, which makes the card automatically undervolt itself. These are my current settings for now; I haven't messed with the memory clocks yet because I have Micron memory:



It boosts itself to around 2000-2050MHz and sits around 875-950mV, temps never exceed 75°C thanks to the temp limit, and it's silent as hell. I could hardly be happier!
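A rough way to see why this works: dynamic power scales roughly with frequency times voltage squared, so dropping ~100mV buys back more power headroom than the higher clock costs. A small sketch under that P ∝ f·V² assumption (the clock and voltage numbers are illustrative, not measured):

```python
# Rough dynamic-power model: P is proportional to f * V^2.
# Numbers are illustrative: a stock-ish 1.050V/1911MHz boost state vs the
# undervolted ~0.950V/2000MHz state described in the post above.
def relative_power(freq_mhz, volts, ref_freq_mhz, ref_volts):
    """Power of (freq, volts) relative to a reference point, per P ~ f*V^2."""
    return (freq_mhz * volts ** 2) / (ref_freq_mhz * ref_volts ** 2)

ratio = relative_power(2000, 0.950, 1911, 1.050)
print(f"Undervolted state draws roughly {ratio:.2f}x the reference power")
```

So under this simple model the undervolted card clocks higher while drawing around 14% less power, which is why it stays inside the reduced power and temperature limits.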


----------



## BroPhilip

I personally liked this video walk through...





Quote:


> Originally Posted by *mrtbahgs*
> 
> Is there a decent 10 series OC guide that I can read?
> It's been awhile since I dialed in the OC on my 670 so I am out of practice and with all these changes I am sure the process is a bit different.
> 
> Basically just trying to better understand some of the settings/options in PrecisionXOC and what OC step ups are recommended to do things fairly quickly, like jump in 50s or 25s, etc.
> Is Heaven/Valley enough to benchmark and dial in results and then a deeper stress test for stability once I feel I hit my limit?
> Is it still best to do core clocks by itself first and then add in memory clocks after, but then at what point is a bit more core worse than a fair amount more memory (+13 vs +50 for example)?
> 
> Just hoping to do things right the first time, but also not take 4 hours to find a stable OC.


----------






## zipper17

I suggest using 3DMark Advanced, if you have it, to test stability instead of Valley/Heaven.

- 3DMark Fire Strike Extreme or Time Spy:
a. Fire Strike Extreme stress test: 20 loops of Graphics Test 1 with at least 97% frame-rate stability.
b. Fire Strike Extreme custom loop of Graphics Test 2, running at least 20,000 frames.
c. Time Spy stress test / custom looped runs. (I don't have Time Spy, so I used FS Extreme instead.)

IMO it's a very good tool and quick at detecting instability and finding the best tweaks for your card.
I never had a single crash in Valley; I could run any kind of tweak easily and think I was stable.
When I challenge my card with 3DMark, things are different: it gives a more constant stress load than Valley, IMO.
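As I understand it, the "frame-rate stability" number 3DMark reports is just the worst loop's average FPS divided by the best loop's, with 97% as the pass threshold, so loops have to stay within about 3% of each other. A minimal sketch of that calculation (the per-loop FPS values are made up):

```python
# 3DMark-style frame-rate stability: lowest loop FPS / highest loop FPS.
# A run passes when stability >= 97%. The loop averages here are made up.
loop_fps = [62.1, 61.8, 62.0, 61.5, 61.9, 61.7]

stability = min(loop_fps) / max(loop_fps) * 100
verdict = "PASS" if stability >= 97.0 else "FAIL"
print(f"Stability: {stability:.1f}% -> {verdict}")
```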


----------



## zipper17

Quote:


> Originally Posted by *mrtbahgs*
> 
> Is there a decent 10 series OC guide that I can read?
> It's been awhile since I dialed in the OC on my 670 so I am out of practice and with all these changes I am sure the process is a bit different.
> 
> Basically just trying to better understand some of the settings/options in PrecisionXOC and what OC step ups are recommended to do things fairly quickly, like jump in 50s or 25s, etc.
> Is Heaven/Valley enough to benchmark and dial in results and then a deeper stress test for stability once I feel I hit my limit?
> Is it still best to do core clocks by itself first and then add in memory clocks after, but then at what point is a bit more core worse than a fair amount more memory (+13 vs +50 for example)?
> 
> Just hoping to do things right the first time, but also not take 4 hours to find a stable OC.


Yes, do the core clock OC first, then overclock the memory last.

If you add a memory overclock, it will probably affect the power target limit and core temperatures.

I've seen it in my Afterburner overlay: with the memory overclocked, my core temp is 2-3 degrees higher than with the memory at stock.

Memory on the 1070 should generally take +500MHz right away. Core clock tweaks are the real challenge of the overclock; every card behaves differently. In general, a 1070 core overclock should be achievable around 2000, 2012, 2025, or 2038MHz (real steady boost clock).

For tools, I use MSI AB for the overlay and tweaks, 3DMark Advanced for stress-testing stability, and finally Witcher 3 on max settings at 1440p.

CMIIW.
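Those "real steady boost" targets aren't arbitrary: Pascal moves its core clock in roughly 12.5MHz bins, so any offset you dial in effectively snaps to a ladder of achievable clocks. A small illustration of that snapping (the exact step size is an approximation):

```python
# Pascal adjusts boost clocks in ~12.5MHz steps, so sustained clocks form a
# ladder (2000, 2012.5, 2025, 2037.5, ...) rather than a continuum.
STEP_MHZ = 12.5  # approximate bin size

def nearest_boost_bin(target_mhz, step=STEP_MHZ):
    """Snap a requested clock to the nearest boost bin."""
    return round(target_mhz / step) * step

for target in (2005, 2016, 2030, 2040):
    print(f"{target}MHz requested -> {nearest_boost_bin(target):.1f}MHz bin")
```

which lines up with the 2000/2012/2025/2038 figures people report (readouts round the half-steps).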


----------



## gtbtk

Quote:


> Originally Posted by *Skyblaze*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Yes you can use the curve to set the peak at say 1.00v but why not just set a custom fan curve and/or adjust the thermal limit?
> 
> 
> 
> Thanks for the reply! I thought about a custom fan-curve but coming from a Club3D 7970 Royal Queen which was amazing in overclocking but unfortunately quite loud the near silence of the 1070 feels like heaven. Also wouldn't I slow my card's performance then? As far as I understood it GPU Boost pushes the voltages and speed till the thermal limit is reached so if I simply cap it at 1.00v wouldn't it be clocking slower than uncapped? Also about the Thermal Limit, I noticed I can unlink Core Clock and Thermal Limit but what would happen if I set the Thermal Limit to 75°C while they are unlinked? Would the fan simply spin faster?
> 
> I've been overclocking GPUs and CPUs since the 5770/6850 age but this new dynamic-system is something I have to wrap my head around first.
Click to expand...

GPU boost 3 drops clocks as temps rise. The higher temps will slow the card down more than any extra power draw the fans will take.
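One way to picture that behavior: past certain temperatures, GPU Boost sheds one ~13MHz clock bin at a time, so a hotter card settles at a lower sustained clock even with power headroom to spare. A toy model of that (the thresholds here are invented for illustration, not Nvidia's actual tables):

```python
# Toy model of GPU Boost 3 thermal behavior: one ~13MHz bin is dropped for
# each temperature threshold crossed. Thresholds below are invented.
THRESHOLDS_C = [37, 45, 54, 63, 71]
BIN_MHZ = 13

def sustained_clock(max_boost_mhz, temp_c):
    """Estimate the sustained core clock at a given core temperature."""
    bins_dropped = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return max_boost_mhz - bins_dropped * BIN_MHZ

print(sustained_clock(2063, 35))  # cool card holds full boost
print(sustained_clock(2063, 72))  # hot card sits several bins lower
```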


----------



## bigjdubb

There is so little difference between my maximum effort overclock performance and my minimal effort overclock performance that I'm not sure it's even worth it. When I got the card I slid the power target up to the max and left everything else alone, the difference between that "overclock" and spending hours trying to find the maximum stable overclock was about 50mhz and maybe a couple of FPS in game. Worth it for benchmarking, not worth it for gaming.


----------



## gtbtk

Quote:


> Originally Posted by *xg4m3*
> 
> Is it worth paying 37€ more for MSI X edition over G1 from Gigabyte, for 1070?
> 
> MSI Gaming X is 537€
> Gigabyte G1 is 499€


The MSI card is quieter.

Also consider the base-model MSI Gaming 8G card. The only difference between it and the Gaming X is the firmware; electrically they are identical.


----------



## zipper17

Quote:


> Originally Posted by *bigjdubb*
> 
> There is so little difference between my maximum effort overclock performance and my minimal effort overclock performance that I'm not sure it's even worth it. When I got the card I slid the power target up to the max and left everything else alone, the difference between that "overclock" and spending hours trying to find the maximum stable overclock was about 50mhz and maybe a couple of FPS in game. Worth it for benchmarking, not worth it for gaming.


Yup, I'd rather forget about overclocking and go for SLI instead: a much bigger improvement, and less difficult.
Still, it's worth learning how to OC your card.


----------



## gtbtk

Quote:


> Originally Posted by *Skyblaze*
> 
> Well I read something about Pascal undervolting in another thread and it works perfectly!
> 
> 
> 
> 
> 
> 
> 
> No need to mess with the curve, just lower your power and temperature limit and drive up the clocks to offset which will make the card automatically undervolt itself. These are my current settings for now, I haven't messed with the memory-clocks yet because I have Micron memory:
> 
> 
> 
> It boosts itself to around 2000-2050mhz and shifts around 875-950mV, temps are never exceeding 75°C in accordance to the temp-limit and it's silent as hell, I could hardly be happier!


There is nothing wrong with Micron memory up to about +500.


----------






## giodegracia

Quote:


> Originally Posted by *bigjdubb*
> 
> There is so little difference between my maximum effort overclock performance and my minimal effort overclock performance that I'm not sure it's even worth it. When I got the card I slid the power target up to the max and left everything else alone, the difference between that "overclock" and spending hours trying to find the maximum stable overclock was about 50mhz and maybe a couple of FPS in game. Worth it for benchmarking, not worth it for gaming.


So true, man. I was once a proud owner of a Gigabyte GTX 970 G1 Gaming and I overclocked it like crazy, running 1600MHz stable on the core and +500 on the memory. Overclocking Maxwell was much more worthwhile than Pascal is nowadays. Although jumping from that 970 to my MSI GTX 1070 Gaming X is still a huge leap, the card just doesn't seem worth my cash at all in terms of overclocking and the extra performance for what I paid. Now I regret not getting a 980 Ti with an overclocked custom BIOS; that would have been a much better choice.


----------



## criminal

Quote:


> Originally Posted by *bigjdubb*
> 
> There is so little difference between my maximum effort overclock performance and my minimal effort overclock performance that I'm not sure it's even worth it. When I got the card I slid the power target up to the max and left everything else alone, the difference between that "overclock" and spending hours trying to find the maximum stable overclock was about 50mhz and maybe a couple of FPS in game. Worth it for benchmarking, not worth it for gaming.


LOL... yep


----------



## zipper17

Quote:


> Originally Posted by *giodegracia*
> 
> this so true man. i was once a proud owner of a Gigabyte GTX 970 G1 Gaming and i overclocked it crazy running 1600mhz stable on the core and +500 on the memory. it was much worth it overclocking Maxwell than Pascal nowadays. Altough jumping from that 970 to my MSI GTX 1070 Gaming X is still a huge jump. the card just doesn't seem to be worth my cash at all in terms of overclocking and getting that extra performance for what i paid. now i regret not getting an overclocked custom bios 980 ti. that would have been much of a better choice.


That doesn't make the 1070 a poor card. The 1070 still has better performance per watt than the 980 Ti, and probably better DX12 async compute performance (dynamic load balancing and preemption improvements). It's a decent overclocker.

The 980 Ti is indeed the far greater overclocking card; it's still a favorite among pro overclockers and can reach 1.9GHz+, though only on LN2, matching Titan X Pascal performance in Fire Strike lol.
http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+performance+preset/version+1.1/1+gpu
But that's probably about a one-in-a-million chip.


----------



## Skyblaze

Quote:


> Originally Posted by *gtbtk*
> 
> there is nothing wrong with micron memory up to about +500


Huh interesting, I'll try that later then, I always had a feeling that it was way overblown.


----------



## X6SweexLV

Hello, I'd like to hear from Zotac GTX 1070 AMP Edition owners about this card...
How are the temperatures and fan noise?
Any problems you've had?
I want to buy either this or the Inno3D GTX 1070 Twin X2.


----------



## muzammil84

Quote:


> Originally Posted by *X6SweexLV*
> 
> Hello, I wanted to know the Zotac GTX 1070 Amp EDITION users reviews on this video card...
> That is the temperature and fans noise?
> Any problem what you have!
> I wanna buy this or inno3d gtx 1070 twin x2


I've got the Inno3D iChill X4. Max temp after a Time Spy stress test is 64°C; the fans can be heard but are not noisy or annoying, and the transition from idle to max load is very smooth (I know some 3-fan cards have very erratic, annoying fan profiles). It's not the strongest card out there, but it has quite a high factory OC, and I got Samsung memory on mine. So far I've managed to push it to 9088MHz and it will go further; I'll keep pushing tomorrow. The core clock is "only" 2068, but I'm not desperate to hit 2100 just for the sake of it.
This is my first GPU on a stock air cooler and it's still new to me, as I'm used to GPUs that make zero noise under a water block, but it's nothing disturbing, and I'm running it in a Core P5, which is an open case.


----------



## Skyblaze

Well, it seems my 1070 doesn't like memory overclocking at all, which may or may not be related to my undervolting, but even a +200MHz boost on the memory makes it extremely unstable :/


----------



## QxY

Quote:


> Originally Posted by *X6SweexLV*
> 
> Hello, I wanted to know the Zotac GTX 1070 Amp EDITION users reviews on this video card...
> That is the temperature and fans noise?
> Any problem what you have!
> I wanna buy this or inno3d gtx 1070 twin x2


I've been using the regular Zotac GTX 1070 AMP for a little more than 2 months now and am pretty happy with the card so far. Its construction is quite solid, with an all-metal backplate and shroud, though the yellow stripes on the backplate are a matter of taste.

It has a pair of 100mm fans that top out at around 2200RPM, so even near full speed they are still relatively quiet.

In games they run at around 65-75% speed with temps around 69-71°C. That's a tad higher than other AIB cards like the MSI Gaming; I'd say on par with EVGA's ACX 3 cooler, and still quite a bit cooler and quieter than a Founders Edition.

Boost clocks range from 1940-2000MHz out of the box. Coil whine is much lower than on my old 780 Ti.


----------



## mrtbahgs

Quote:


> Originally Posted by *BroPhilip*
> 
> I personally liked this video walk through...


Yes, thank you, that was a good video, and it showed some settings that didn't exactly stand out to me at first, like the voltage slider being a bit hidden.
Quote:


> Originally Posted by *zipper17*
> 
> I suggest to Use 3d Mark Advanced if you have it to test stability instead of Valley/Heaven.
> 
> -3d Mark Firestrike Extreme or Timespy
> a. Firestrike Extreme Stress Test. 20 Loops Graphic test 1 and at least 97% of Framerates stability.
> b. Firestrike Extreme Custom Loop Graphic Test 2. Running at least 20.000 Frames.
> c. Timespy Stress Test / Custom Run Loops. (I don't have timespy, so I used FS Extreme instead)
> 
> IMO very good tool and fast to detect stability and to find the best tweaks for your card.
> I never had a single crashes in Valley. I can run any kind of tweaks easily. You think that by yourself.
> When I challenge my card to 3d mark, things it's different. It gives a constant stress load more than valley imo.


I will look into those and try whichever ones are free; I assume from the name "Advanced" that that version costs a few dollars, and I'm not that into benching.

Quote:


> Originally Posted by *zipper17*
> 
> Yes, do coreclock oc first, then last oc the memory.
> 
> If you involved a memory Overclock, probably it will affect the power target limit/temperatures core clock.
> 
> I've seen it on my Overlay AfterBurner, with Memory Overclocked it increase my Coreclock temp by 2-3 Degrees than memory at stock.
> 
> Memory on 1070 in general should be easy for +500mhz right away. Coreclock tweaks are the real challenge for overclock, any card has a different behavior & result. In general Overclock 1070 corespeed, should be achieveable around 2000,2012,2025,2038mhz or so (Real Steady Boost clock).
> 
> For tools, I use MSI AB for overlay and tweaks, 3dmark advanced for stress test stability, and last Witcher 3 on max settings with 1440P resolution.
> 
> cmiw.


Sounds like it's still pretty similar then. The only big thing would be to see what core clock increase equals what memory clock increase, so we know which is worth sacrificing.
(Is it worth OCing memory +100 if we have to drop the core by 13 for it to be stable? That's an example of what I'm referring to; the real numbers just need filling in.)
It seems like temps really take control of the core clock and step it down once or twice, so I will have to look into higher fan speeds or just settle on a "good enough" OC.


----------



## MyNewRig

Hey Guys,

This Micron memory issue is really getting to me. I purchased two different 1070 cards; both came with the freaking Micron ICs and perform like crap. Memory overclocking just isn't there at all: even +100 starts producing artifacts. I have another card from July with Samsung memory, which I love and which performs like a champ. Having had a taste of the stability, performance, and overclockability of cards with Samsung memory, I just can't settle for the Micron card; it feels like a piece of garbage.

I tried contacting all the manufacturers about this, but their responses smell very fishy. They are very vague about the issue, with no transparency at all: they either deny any issue exists or say they do not support overclocking, which is utter BS when they designed, engineered, and built these cards specifically for OVERCLOCKING and charge us a premium for it. Why else would they use 10+2 power phases, 2x 8-pin connectors, and a 125% power target if they do not directly support and promote overclocking? Even Nvidia's official launch event for Pascal bragged about Pascal's overclocking performance, so overclocking is very much at the heart of this generation.

Now I need another 1070 with Samsung memory. Is that even being produced anymore? Is there any chance of ever buying one? Did anyone manage to get a 1070 with Samsung memory lately? According to this thread, the EVGA FTW, MSI Gaming X & Z, Gigabyte Xtreme, ASUS Strix OC & non-OC, and Zotac Extreme are ALL using Micron memory for recent purchases... so is Micron now a fact of life for the 1070? Is there no way around it, and no chance of these manufacturers ever reverting to Samsung?

If I cannot get a 1070 with Samsung memory, or FIXED Micron memory, then I will very likely skip the entire freaking Pascal generation altogether!

What do you guys think about this situation? And please do not tell me it is not a big deal or not an important issue, because I have personally tested cards with both memory types and there is a real, significant difference in performance and overall experience.

Thanks


----------



## benjamen50

Is SK Hynix even being used anymore for Pascal-series GPUs? Sorry if this sounds stupid, but I've never heard of one having it; I've only been hearing about Micron or Samsung VRAM.


----------



## MyNewRig

Quote:


> Originally Posted by *benjamen50*
> 
> Is SK Hynix even being used anymore for pascal series GPUs? Sorry if this sounds stupid but I've never heard of one having it yet I've just been hearing Micron or Samsung VRAM.


No reports of Hynix memory being used so far, as far as I have read (I've researched this issue a lot because it is totally annoying me).


----------



## muzammil84

Quote:


> Originally Posted by *MyNewRig*
> 
> Hey Guys,
> 
> This Micron memory issue is really getting to me, i purchased two different 1070 cards both came with the freaking Micron ICs and perform like crap, memory overclocking is also not there at all, just +100 would start producing artifacts, i have another card from July with Samsung memory which i love and it performs like a champ, having had a taste of cards with Samsung memory, their stability, performance and overclockability i just can't settle for the Micron card and feel it is a piece of garbage.
> 
> I tried contacting all manufacturers about this but their responses smell very fishy, they are very vague about the issue, no transparency at all, they either deny any issues exists or they say we do not support overclocking which is utter BS, when they designed, engineered and made their cards especially for OVERCLOCKING and are charging us a premium for it, why else are they using 10+2 power phases and 2 X 8-pin connector, with 125% power target if they do not directly support and promote overclocking, even Nvidia's Official lunch event for Pascal was bragging about the overclocking performance of Pascal, so overclocking is very much at the heart of this generation.
> 
> Now i need another 1070 with Samsung memory, is that even being produced anymore? is there any chance to ever be able to buy that? did anyone manage to get a 1070 with Samsung memory lately? according to this thread i saw EVGA FTW, MSI Gaming X & Z, Gigabyte Xtreme, ASUS Strix OC & non-OC, Zotac Extreme, ALL using Micron memory for people who made the purchase recently ... so is Micron now a fact of life for the 1070? no way around that? no chance of these manufacturers ever reverting to Samsung again?
> 
> If i can not get a 1070 with Samsung memory or a FIXED Micron memory then i will very likely skip the entire freaking Pascal generation all together!
> 
> What do you guys think about this situation? and please do not tell me that it is not a big deal or not an important issues, because i personally tested cards with both memory types and there is a real and significant different in performance and overall experience.
> 
> Thanks


Check my post a couple of posts up. I recently bought an Inno3D with Samsung memory; it did 9088MHz and I'm not done yet, I'll try to push it further today. A very cool and quiet GPU.


----------



## MyNewRig

Here are the results of trying to get comments from the concerned parties regarding the issue:

1. ASUS support, when pushed to admit why they switched to Micron and whether they will ever fix the problems with this memory or produce cards with Samsung memory again, denied any issue exists. They insisted rigorously that they cannot share any information about this outside the company, said they no longer use Samsung memory and wished me good luck finding a card with it, and do not plan to issue a fix because they do NOT SEE A PROBLEM. The support case was closed with no chance of further follow-up or better information.

2. EVGA, when pushed on the same questions, stopped responding to email and just says "we DO NOT SUPPORT OVERCLOCKING"... period. No fix is planned because, like ASUS, they do not acknowledge the issue.

3. Nvidia totally ignores the issue and did not even bother to comment on the Geforce.com forum thread about it.

By the looks of it, this smells very fishy: a specification downgrade two months after launch, after the review samples went out, without any announcement or admission of the issue, and without the slightest willingness to share any information about it.

So unless this thing really blows up like the GTX 970 3.5GB memory issue, nothing will happen!


----------



## reflex75

Quote:


> Originally Posted by *MyNewRig*
> 
> Hey Guys,
> 
> This Micron memory issue is really getting to me, i purchased two different 1070 cards both came with the freaking Micron ICs and perform like crap, memory overclocking is also not there at all, just +100 would start producing artifacts, i have another card from July with Samsung memory which i love and it performs like a champ, having had a taste of cards with Samsung memory, their stability, performance and overclockability i just can't settle for the Micron card and feel it is a piece of garbage.
> 
> I tried contacting all manufacturers about this but their responses smell very fishy, they are very vague about the issue, no transparency at all, they either deny any issues exists or they say we do not support overclocking which is utter BS, when they designed, engineered and made their cards especially for OVERCLOCKING and are charging us a premium for it, why else are they using 10+2 power phases and 2 X 8-pin connector, with 125% power target if they do not directly support and promote overclocking, even Nvidia's Official lunch event for Pascal was bragging about the overclocking performance of Pascal, so overclocking is very much at the heart of this generation.
> 
> Now i need another 1070 with Samsung memory, is that even being produced anymore? is there any chance to ever be able to buy that? did anyone manage to get a 1070 with Samsung memory lately? according to this thread i saw EVGA FTW, MSI Gaming X & Z, Gigabyte Xtreme, ASUS Strix OC & non-OC, Zotac Extreme, ALL using Micron memory for people who made the purchase recently ... so is Micron now a fact of life for the 1070? no way around that? no chance of these manufacturers ever reverting to Samsung again?
> 
> If i can not get a 1070 with Samsung memory or a FIXED Micron memory then i will very likely skip the entire freaking Pascal generation all together!
> 
> What do you guys think about this situation? and please do not tell me that it is not a big deal or not an important issues, because i personally tested cards with both memory types and there is a real and significant different in performance and overall experience.
> 
> Thanks


Hi,
sorry for you, and I understand, because I have the exact same issue with my Palit Super Jetstream 1070, which came with Micron memory.
It's a biased situation compared to the Samsung memory used for all the good reviews and benchmarks.
Now customers will be disappointed to pay a premium price for nothing.
But many of us hope for a BIOS update with better voltage management to fix it.
The more we complain on the GeForce forum, the better the chance we are heard:
https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/1/


----------



## MyNewRig

Quote:


> Originally Posted by *reflex75*
> 
> Hi,
> sorry for you, and I understand you because I have the exact same issue with my Palit Super Jetstream 1070 which came with Micron memory.
> It's a biased situation compare to Samsung memory used for all the good reviews and benchmarks.
> Now, customers will be disappointed to pay premium price for nothing.
> But we are many to hope for a bios update with better voltage management to fix it.
> The more we complain on Geforce forum and the more we have a chance to be heard:
> https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/1/


Thank you for your response. I already complained in that thread, but it has been up for about two weeks now, and while this issue is blowing up all over the place, Nvidia & Co. are totally ignoring it. Not only that, they bluntly deny any knowledge of the issue when asked about it directly!

Given that during August ALL manufacturers without exception switched to Micron memory, that all the cards remaining with Samsung memory are old batches, and that Nvidia & Co. are totally ignoring the issue, the only explanation I can see is that Nvidia downgraded the 1070's specifications due to poor sales of the 1080: when the 1070's premium Samsung memory ICs can reach 9600MHz without a problem, that narrows the gap to the GDDR5X used on the 1080...

I do not expect any BIOS updates to be issued unless this really, really blows up and they have to restore performance to save face.

I second your suggestion and encourage everyone with a Micron-memory GTX 1070 who feels manipulated by such shady practices, and who is unsatisfied with its poor performance and lack of overclockability compared to first-batch cards and ALL the reviews out there, to complain in the Geforce.com thread to get things moving faster:

https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/


----------



## X6SweexLV

Ty.
I'll buy the Inno3D iChill X3 GTX 1070; it's only 10 euros more expensive (410€) than the Zotac GTX 1070 AMP Edition and much cooler.
If I bought the Inno3D iChill X4, my case's side panel would not fit...


----------



## MyNewRig

Adding more information:

In the GPU information section of ASUS motherboard software, all GTX 1070s display the memory type as "Samsung", even for cards that have Micron memory, which suggests that Samsung memory was the original specification.

So the GTX 1070's original specification, the one we supposedly purchased, is a GP104 GPU with Samsung GDDR5 memory. Changing the memory type without announcing it, and without clearly marking the retail packaging and product description with "Micron", is misleading and manipulative.


----------



## gtbtk

Quote:


> Originally Posted by *Skyblaze*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> there is nothing wrong with micron memory up to about +500
> 
> 
> 
> Huh interesting, I'll try that later then, I always had a feeling that it was way overblown.
Click to expand...

I am pretty sure that it is actually a bug in the code for the memory VRM power delivery.

Coming out of low-voltage states (below 0.800V), the voltage supply does not seem to ramp up fast enough to meet the needs of a high (+500-ish) memory overclock when you put the card under load.


----------



## gtbtk

Quote:


> Originally Posted by *reflex75*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MyNewRig*
> 
> Hey Guys,
> 
> This Micron memory issue is really getting to me, i purchased two different 1070 cards both came with the freaking Micron ICs and perform like crap, memory overclocking is also not there at all, just +100 would start producing artifacts, i have another card from July with Samsung memory which i love and it performs like a champ, having had a taste of cards with Samsung memory, their stability, performance and overclockability i just can't settle for the Micron card and feel it is a piece of garbage.
> 
> I tried contacting all manufacturers about this but their responses smell very fishy, they are very vague about the issue, no transparency at all, they either deny any issues exists or they say we do not support overclocking which is utter BS, when they designed, engineered and made their cards especially for OVERCLOCKING and are charging us a premium for it, why else are they using 10+2 power phases and 2 X 8-pin connector, with 125% power target if they do not directly support and promote overclocking, even Nvidia's Official lunch event for Pascal was bragging about the overclocking performance of Pascal, so overclocking is very much at the heart of this generation.
> 
> Now i need another 1070 with Samsung memory, is that even being produced anymore? is there any chance to ever be able to buy that? did anyone manage to get a 1070 with Samsung memory lately? according to this thread i saw EVGA FTW, MSI Gaming X & Z, Gigabyte Xtreme, ASUS Strix OC & non-OC, Zotac Extreme, ALL using Micron memory for people who made the purchase recently ... so is Micron now a fact of life for the 1070? no way around that? no chance of these manufacturers ever reverting to Samsung again?
> 
> If i can not get a 1070 with Samsung memory or a FIXED Micron memory then i will very likely skip the entire freaking Pascal generation all together!
> 
> What do you guys think about this situation? and please do not tell me that it is not a big deal or not an important issues, because i personally tested cards with both memory types and there is a real and significant different in performance and overall experience.
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hi,
> sorry for you, and I understand you because I have the exact same issue with my Palit Super Jetstream 1070 which came with Micron memory.
> It's a biased situation compare to Samsung memory used for all the good reviews and benchmarks.
> Now, customers will be disappointed to pay premium price for nothing.
> But we are many to hope for a bios update with better voltage management to fix it.
> The more we complain on Geforce forum and the more we have a chance to be heard:
> https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/1/
Click to expand...

That is the thread I started at Nvidia


----------



## Blze001

So, question: if I'm not overclocking my Founder's Edition card, am I likely to really notice the Samsung vs Micron difference? I dunno which mine has.


----------



## gtbtk

Quote:


> Originally Posted by *Blze001*
> 
> So, question: if I'm not overclocking my Founder's Edition card, am I likely to really notice the Samsung vs Micron difference? I dunno which mine has.


As far as I know, Founders Edition cards all report Samsung memory.

Either way, if you don't OC anything you won't notice any difference


----------



## Skyblaze

Quote:


> Originally Posted by *gtbtk*
> 
> I am pretty sure that it is actually a bug in the code for the memory VRM power delivery.
> 
> From low-voltage states (below 0.800 V), the voltage supply does not seem to ramp up fast enough to meet the needs of a high (+500-ish) memory overclock when you put the card under load.


That would certainly explain why my 1070 ran fine with +250 memory at first. I upped the memory-clock while I was already ingame in Witcher 3 and after it was stable for 20 minutes of playing I thought it was fine. Though then as soon as I started a round of League of Legends I was met with a checkerboard freeze and even after rebooting and lowering the memory to +100 the game started but I saw weird black flickering everywhere along with driver-restarts :/


----------



## MyNewRig

Quote:


> Originally Posted by *Skyblaze*
> 
> That would certainly explain why my 1070 ran fine with +250 memory at first. I upped the memory-clock while I was already ingame in Witcher 3 and after it was stable for 20 minutes of playing I thought it was fine. Though then as soon as I started a round of League of Legends I was met with a checkerboard freeze and even after rebooting and lowering the memory to +100 the game started but I saw weird black flickering everywhere along with driver-restarts :/


Welcome to the life of Micron GTX 1070 owners, you can never get that card 100% stable across the board, what a ****ty product


----------



## MyNewRig

I have a strange idea, i have never had an Nvidia GPU in the past few generations (780, 970, 980, 980 Ti) that could not easily do +500 memory at least, these 1070s with the Micron ICs can not even do +100 without producing artifacts.

Could they be using 7 Gbps GDDR5 ICs factory-clocked up to 8 Gbps, such that they can't stand even the slightest further OC without a voltage or power increase?

What do you think about this theory?


----------



## Hnykill

Quote:


> Originally Posted by *MyNewRig*
> 
> Welcome to the life of Micron GTX 1070 owners, you can never get that card 100% stable across the board, what a ****ty product


Well.. I have a GTX 1070 Palit Super Jetstream with Micron memory. Memory OC'd to +600 (9200MHz) and it is just fine that way. Maybe there was a bad batch of Micron memory that got into the market?
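For reference, the offsets and effective speeds quoted throughout this thread are consistent with the Afterburner offset applying to the double-data-rate clock, so each +1MHz of offset adds 2MHz effective (that convention is inferred from the numbers people report here, not from official documentation):

```python
# GTX 1070 GDDR5 runs at 8008 MHz effective at stock. Assuming the
# Afterburner offset applies to the double-data-rate clock, effective
# speed rises by twice the offset -- which matches +600 -> ~9200 MHz.

STOCK_EFFECTIVE_MHZ = 8008

def effective_mem_clock(offset_mhz: int) -> int:
    """Effective GDDR5 speed for a given Afterburner memory offset."""
    return STOCK_EFFECTIVE_MHZ + 2 * offset_mhz

for offset in (0, 350, 500, 600, 700):
    print(f"+{offset:>3} -> {effective_mem_clock(offset)} MHz effective")
    # +350 -> 8708, +600 -> 9208, +700 -> 9408, matching the thread's numbers
```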


----------



## MyNewRig

Quote:


> Originally Posted by *Hnykill*
> 
> Well.. i have a GTX 1070 Palit Super Jetstream with Micron memory. Memory oc to +600 (9200Mhz) and it is just fine that way. maybe there was some bad batch of Micron memory that got into the market ?.


When did you buy your card? Very recently, fairly recently, or one or two months ago?

You seem to be the exception to the rule that all/most Micron memory cards do not OC at all or OC very little, but before you call your OC stable, can you play/benchmark some Rise of the Tomb Raider and see if you get any black/flickering artifacts and at what memory frequency do they stop?

I once managed to reach +350 with my Micron card (8700MHz): it passed one FireStrike bench and I played some Witcher 3 fine, but once I switched to RotTR I started getting horrible black boxes, bars and flickering artifacts. They did not stop until I COMPLETELY removed any memory offset and reverted to +0 (8008MHz); that is when the artifacts seemed to stop during gameplay. I suspect I am still getting some minor artifacts even at stock memory settings, but I have not played long enough to confirm that 100%.

So try that and let me know.


----------



## reflex75

Quote:


> Originally Posted by *gtbtk*
> 
> That is the thread I started at Nvidia


Thank you


----------



## zipper17

I still struggle to get past 21K in Firestrike and 10K in Firestrike Extreme.
Currently 20862 and 9876, just barely short, damn. (Stable in the FS Extreme stress test and games.)
Anything higher and my card is prone to crashes; looks like I have a less capable chip.
Kinda regret not buying a 1080 lol, but in my country prices are crazy high, so I decided to go 1070 instead.
For bare necessity's sake the 1070 is still decent for 1440p anyway: Witcher 3 max settings, PhysX on, 1440p, adaptive sync on, 50-60 FPS, though it can still dip to ~45 FPS.


----------



## Skyblaze

Quote:


> Originally Posted by *MyNewRig*
> 
> Welcome to the life of Micron GTX 1070 owners, you can never get that card 100% stable across the board, what a ****ty product


Now hey, I wouldn't go _that_ far, because my 1070 is silent and relatively cool while overclocking the core just fine with my undervolt







The memory situation does need some kind of addressing though; it seems like a somewhat simple fix if it's really just the voltage regulation :/


----------



## MyNewRig

Quote:


> Originally Posted by *Skyblaze*
> 
> Now hey I wouldn't go _that_ far because my 1070 is silent and relatively cool while overclocking the core just fine with my undervolt
> 
> 
> 
> 
> 
> 
> 
> The memory situation does need some kind of addressing though; it seems like a somewhat simple fix if it's really just the voltage regulation :/


Yeah, because you have not been cursed like me by trying Samsung cards, once you try a Samsung 1070 you can never go back, it becomes an addiction lol









How do you know it is a simple fix? Maybe it's not. And even if it is simple, how do you know they are "willing" to fix it? Maybe they made it like that so we are forced to buy a 1080 for higher memory clocks, which is very useful at high resolutions, a significant difference at anything above 1080p .. but we will see









Also, forcing high voltage on chips to make them stable means they are not good chips to begin with; good chips run on low voltage and handle fluctuations pretty well, so Samsung is much superior in that department. There is just no way around this, you see.

Why in hell are you under-volting your card by the way?!


----------



## zipper17

Quote:


> Originally Posted by *mrtbahgs*
> 
> Yes thank you, that was a good video and showed some settings that didn't exactly stand out to me at first like the voltage slider being hidden a bit.
> I will look into those and try whatever ones are free, I assume by the name "Advanced" that that version costs a few dollars and I'm not that into benching.
> Sounds like it's still pretty similar then, the only big thing would be to see what core clock increase equals what memory clock increase so we know which is worth sacrificing.
> (Is it worth OCing memory +100 if we have to drop core by 13 for it to be stable as an example of what I am referring to, just need to fill in the real numbers)
> It seems like temps really take control of the core clocks and step it down one or 2 times so I will have to look into higher fan speeds or just settle on a "good-enough" OC.


Look for it on Steam during big sales; it's usually cheap. If you're into overclocking, this tool is needed imo: it can quickly detect when the card crashes, which is useful for finding how far your core speed can go. Generally the core clock provides a better gain than memory, so core clock should be your first priority imo.


----------



## zipper17

Quote:


> Originally Posted by *MyNewRig*
> 
> Thank you for your response, i already complained in that thread, but the thread has been created for about two weeks now and this issue is blowing up all over the place while Nvidia & Co. are totally ignoring the issue, not only that, they dare to bluntly deny any knowledge or admission of the issue when asked about it directly!
> 
> Given that during August ALL manufacturers without exception have switched to Micron memory and all the cards that remain with Samsung memory are old batches, also totally ignoring the issue from Nvidia & Co. the only reasonable explanation is that Nvidia downgraded the specifications for the 1070 due to poor sales of the 1080, because when 1070 premium Samsung memory ICs can reach 9600Mhz without a problem, that lowers the gab between this and GDDR5X used on the 1080 ...
> 
> I do not expect any BIOS updates will even be issued unless this really really blows up and then they would have to restore performance to save face.
> 
> I second your suggestion and encourage any and everyone with a GTX 1070 with Micron Memory that feels manipulated by such shady practices and that is unsatisfied with the poor performance and lack of overclockability of their Micron GTX 1070 compared to 1st batch buyers and ALL reviews out there for the card to complain in the Geforce.com thread to get things moving faster ..
> 
> https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/


Oh wow, Samsung can really reach 9600MHz? Luckily I have the Samsung type. If I overclock it to +700MHz (9400 effective) it is not really stable; I see an occasional brief black screen/weird graphic error when stress testing. At +600MHz it is stable (9200MHz effective). In general +500MHz (9000MHz) already does well. However, I didn't notice any gain in games, mostly only in Firestrike scores, and overall it's just a tiny gain anyway.


----------



## MyNewRig

Quote:


> Originally Posted by *zipper17*
> 
> oh wow really samsung can reach 9600mhz? I have samsung type luckily, If I Overlocked it +700mhz (9400effective) is not really stable, I see an occasional fast black screen/weird graphic error, when Stress Testing. However on +600mhz is stable (9200MHZ effective). In General +500mhz(9000mhz) are already good. I didn't notice any gain on games, mostly only on Firestrike scores, but overall it's just a tiny gain imo.


WOW, 9200MHz effective? I envy you; mine with Micron can not even do 8200MHz stable, so you have more than 11% faster memory than I and all Micron card owners have, even though we paid the same for our cards. I did not pay 11% less than you. Care to exchange cards? Please say YES









EDIT: try using your card at 4K with 8000MHz vs 9200MHz memory and let's discuss "gains"
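To put rough numbers on the 4K argument: peak memory bandwidth scales linearly with effective clock on the 1070's 256-bit bus (a back-of-the-envelope sketch; real-world gains are smaller because of things like error retry at high clocks):

```python
# Peak GDDR5 bandwidth = effective clock (MHz) * bus width (bits) / 8,
# in MB/s; divide by 1000 for GB/s. The GTX 1070 has a 256-bit bus.

BUS_WIDTH_BITS = 256

def bandwidth_gbps(effective_mhz: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return effective_mhz * BUS_WIDTH_BITS / 8 / 1000

stock = bandwidth_gbps(8008)  # ~256.3 GB/s
oc = bandwidth_gbps(9200)     # ~294.4 GB/s
print(f"stock {stock:.1f} GB/s vs OC {oc:.1f} GB/s: +{(oc / stock - 1):.1%}")
```

So on paper a 9200MHz card has roughly 15% more peak bandwidth than a stock 8008MHz one, which is why the gap matters more at 4K where bandwidth is the bottleneck.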


----------



## zipper17

Quote:


> Originally Posted by *MyNewRig*
> 
> WOW, 9200Mhz effective? i envy you, mine with Micron can not do 8200Mhz stable, so you have 11% faster memory than i and all Micron card owners have, even though we paid the same for our cards, i did not pay 11% less than you, care to exchange cards? please say YES


I see other posts where people with Micron run just fine at +500-700MHz. Maybe try this?




I don't know; it looks like they make mistakes again and again. After the GTX 970, the xx70 generation always has some issue. Damn.


----------



## MyNewRig

Quote:


> Originally Posted by *zipper17*
> 
> I see other post people with micron run just fine +500-700mhz . maybe try this?
> 
> 
> 
> 
> I don't know, looks like they make some mistake again & again, after gtx 970, Generation XX70 always has a mistake. damn.


Yeah, they f**ked up an otherwise awesome card, the early batches with Samsung memory were so damn flawless, performance and OC were golden, i don't know why they had to mess this up to save a few pennies, i hope it blows up in their face pretty soon.


----------



## criminal

Quote:


> Originally Posted by *MyNewRig*
> 
> Yeah, they f**ked up an otherwise awesome card, the early batches with Samsung memory were so damn flawless, performance and OC were golden, i don't know why they had to mess this up to save a few pennies, i hope it blows up in their face pretty soon.


I still don't know how some of you say/believe this. The VRAM runs at the advertised speed, correct? Overclocking isn't guaranteed, so I am not sure what you expect to "blow up in their face".


----------



## MyNewRig

Quote:


> Originally Posted by *zipper17*
> 
> I see other post people with micron can run just fine +500-700mhz? . maybe try this?


The problem with using "Prefer Maximum Performance" is that once you change any settings in that section, the driver changes from "Let Application Decide" to an all custom settings, this affects all other graphics settings and not just power delivery, so does not sound like a good idea.

I also tried with forced voltage via MSI AB and can only maybe get +50 Mhz more or so before artifacts start to appear, so that is not a solution either,

The Micron ICs are much inferior to Samsung's, it is very obvious; the amount of work and power tweaking they require for a small increase in frequency is just crappy. With Samsung ICs you just dial +600 or +700 on the memory and that is it, it works flawlessly. Big difference, buddy!


----------



## zipper17

Quote:


> Originally Posted by *MyNewRig*
> 
> The problem with using "Prefer Maximum Performance" is that once you change any settings in that section, the driver changes from "Let Application Decide" to an all custom settings, this affects all other graphics settings and not just power delivery, so does not sound like a good idea.
> 
> I also tried with forced voltage via MSI AB and can only maybe get +50 Mhz more or so before artifacts start to appear, so that is not a solution either,
> 
> The Micron ICs are much inferior than Samsung, it is very obvious, the amount of work and power tweaking they require for a small increase in frequency is just crappy, with Samsung ICs you just dial +600 or +700 on the memory and that is it, it works flawless, big difference buddy!


What brand do you have? Maybe ask them whether they have a Step-Up program like EVGA does.

I think you need to RMA your card. Let's see if a newer card runs better, maybe? Sorry, English is not my native language lol


----------



## mrtbahgs

Skimmed the last few pages, but saw someone asked about recently purchased cards and memory brand.

I have a Gigabyte G1 Gaming 1070 that I got last week and I am fairly certain it has Samsung memory (99% sure).
I can double check, but it's just the brand reading in GPU-Z right?

I haven't gotten around to try an OC yet, but hope the memory can do the 500-700 that a lot are claiming.


----------



## MyNewRig

Quote:


> Originally Posted by *criminal*
> 
> Still don't know how some of you say/believe this. The vram runs at the advertised speed correct? Overclocking isn't guaranteed, so I am not sure what you expect "to blow up in their face".


Again with this logic? Okay, AGAIN ..

1- NO, not all Micron-equipped 1070s are stable at stock settings. Many report artifacting in games and on the desktop with Micron cards at stock settings on a daily basis, issues that NEVER existed during the Samsung batches. What has changed in the cards to cause all these issues? The memory ICs .. that is what!

2- These cards are designed, engineered, made, marketed, promoted and sold at a premium as OVERCLOCKING CARDS. People who buy 1070s with two 8-pin power connectors, 10+2 power phases and 3-slot coolers at a premium are not buying them to run at "stock settings", and the manufacturers who make those cards are not making them to run at stock settings; one 8-pin and 4 power phases are more than enough for that.

When Nvidia's Jen-Hsun promotes the overclocking potential of Pascal at its official launch event and brags about how far it can overclock and how cool it runs when overclocked, these are not cards made to run at stock settings.

When the early review samples and original specifications of the 1070 used Samsung memory, and all the reviews show memory reaching 9200-9600MHz, buyers who spend an arm and a leg on these cards expect performance in line with what the reviews show.

When all manufacturers have official overclocking guides and forum sections on their sites, and participate in overclocking contests, these are not cards made to run at "stock settings".

Overclocking is at the heart of Pascal; the cards are marketed and sold as overclocking cards, and buyers decide to buy one brand or another based on "OC potential".

Formally and legally speaking, yeah, we can say they are not responsible for overclocking bla bla bla; but speaking in real-world, market terms, cheaping out on components and systematically downgrading the cards while staying very vague and silent about it is a disaster and a very big deal, and a lot of people have stopped purchasing 1070s because of it.

Leave that "stock settings" talk to the manipulative manufacturers trying to get away with this crap!

An easy solution is to be honest and transparent about it: make the cheap cards list Micron memory in the specs and market them as non-OC cards.

Then put Samsung memory in the high-tier, over-engineered and most expensive cards, mark them as OC cards, and sell them at a premium.


----------



## criminal

Quote:


> Originally Posted by *MyNewRig*
> 
> wall of text


When Jen-Hsun talked about overclocking, all he referred to is core overclocking and even then numbers were never guaranteed. As far as independent reviews and websites are concerned, manufacturers are not responsible for those overclocking results. Again, overclocking is never guaranteed. Just go over to the Titan X Pascal thread and see even they have some issues with their cards not overclocking the best and they paid $1200! Get over yourself.


----------



## msigtx760tf4

check this out !! 5000 on memory is no problem for my Palit gamerock
the core is weak

http://www.3dmark.com/3dm/14962848


----------



## MyNewRig

Quote:


> Originally Posted by *criminal*
> 
> When Jen-Hsun talked about overclocking, all he referred to is core overclocking and even then numbers were never guaranteed. As far as independent reviews and websites are concerned, manufacturers are not responsible for overclocking results. Again, overclocking is never guaranteed. Just go over to the Titan X Pascal thread and see even they have some issues with their cards not overclocking the best and they paid $1200! Get over yourself.


NO, there is nothing to get over here. I remember the same line being thrown at early adopters of the GTX 970 when they complained about micro-stuttering issues, who later discovered that the card can only access 3.5GB at full speed, with people like you telling them oh, nothing is wrong with the card, it runs great, get over yourselves, take it like a man. What kind of consumer logic is that, really? And look how that turned out.

What is forcing us to accept this crap? What is forcing us to pay top dollar for a GPU with inferior components? What forces us to stand the lying, manipulation and shady practices of the manufacturers, and their total lack of transparency and honesty about the issues? Absolutely nothing .. heck with Pascal and heck with GPUs altogether. It is easy: there are so many alternatives that we do not have to accept crappy products or get over anything.

Either they deliver a flawless product to buy or not .. simple .. easy ..

Why are you using people with more money than they know what to do with as an example here? People buying a $1200 card, knowing full well that in a few months a card with the same exact performance will come out at half the price, do not represent the rest of us. Not a good example to use here.

And memory is a major component of the 1070's overall performance at high resolutions. I don't care whether Jen-Hsun was referring to core or memory; he was officially and publicly supporting overclocking and bragging about it, and vRAM is not outside that equation.

Would you be okay with buying a Skylake 6600K, which all reviews show can reach 4.5-4.7GHz easily, only to find that your sample struggles to reach 4GHz? Would a "stock settings" excuse help you get over yourself? What if Intel systematically started capping the K series to not exceed 4GHz while still selling it as an UNLOCKED chip, would you still buy it? Would you still suck it up?

I suggest that you get over yourself and speak with honesty and sense; be realistic about what you expect consumers to accept. The GTX 1070 with Micron memory chips is garbage and not worth the money. If they made this move to save costs, then they should pass the savings on to us and lower the MSRP .. otherwise, they can shove their Micron 1070s up their ....


----------



## Hnykill

Quote:


> Originally Posted by *MyNewRig*
> 
> When did you buy your card? is it very recently? fairly recently, or one or two months ago?
> 
> You seem to be the exception to the rule that all/most Micron memory cards do not OC at all or OC very little, but before you call your OC stable, can you play/benchmark some Rise of the Tomb Raider and see if you get any black/flickering artifacts and at what memory frequency do they stop?
> 
> I managed to once reach +350 with my Micron card (8700Mhz) , passed one FireStrike bench and played some Witcher 3, it was fine, but then once i switched to rotTR i started getting horrible black boxes, bars and flickering artifacts, it did not stop until i COMPLETELY removed any memory offset and reverted back to +0 (8008Mhz) this is when the artifacts seemed to have stopped during gameplay, i suspect i am still getting some minor artifacts even at memory stock settings but i have not played long to confirm that 100%
> 
> So try that and let me know.


The card release date in GPU-Z is May 30, 2016. I kinda won the silicon lottery on this one: core clock @ 2100MHz, memory @ 9200MHz, overvoltage 100% and power target 114%. I have a custom fan curve profile and the card is both silent and cool like that









I have no artifacts or flickering in any games. 3dmark or other benchmarks. it is stable oc.


----------



## Fosion

Quote:


> Originally Posted by *msigtx760tf4*
> 
> check this out !! 5000 on memory is no problem for my Palit gamerock
> the core is weak
> 
> http://www.3dmark.com/3dm/14962848


http://www.3dmark.com/fs/9932598


----------



## MyNewRig

Quote:


> Originally Posted by *Hnykill*
> 
> Card release date is May 30. 2016. in GPU-Z. i kinda won the silicon lottery on this one. Core clock @ 2100Mhz, memory @ 9200Mhz. overvoltage 100% and powertarget 114%. i have a custom fan curve profile and card is both silent and cooled like that


I am not asking what release date GPU-Z shows, because all cards show the same date, either May 7 or May 30. I am asking when YOU purchased the card from the store or site you bought it from?


----------



## Hnykill

Quote:


> Originally Posted by *MyNewRig*
> 
> I am not asking what release date GPU-Z shows, because all cards show the same date, either May 7 or May 30. I am asking when YOU purchased the card from the store or site you bought it from?


August 1. in local store here in Iceland


----------



## TheGlow

Sorry, I've been away for the last 10 days or so and am buried in new posts.
So my Micron is really a god then or something?
What I noticed was that installing the MSI Gaming App adds the MSI gaming service, which triggers an OSD exe to run. That exe stops the card from ever really idling down into 2D clocks (not sure of the terms), so it always gets at least 0.725 V or so.
Maximum performance was set in the Nvidia panel. That's it.
If I stop that MSI service, I get a checkerboard lock-up immediately.
If I have the profile set to apply at startup, there's a good chance I instantly lock up, so I need to apply it afterwards, since it takes a moment for that service to kick in.
If I do not have the MSI service running and then apply about +500 on the memory, it appears to be fine while at low voltage; opening a bunch of apps like MS Edge will immediately kick the checkerboards up.
I ordered it from Newegg, August 15th.


----------



## Skyblaze

Quote:


> Originally Posted by *MyNewRig*
> 
> Yeah, because you have not been cursed like me by trying Samsung cards, once you try a Samsung 1070 you can never go back, it becomes an addiction lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How do you know it is a simple fix? maybe not, even if it is simple, how do you know that they are "willing" to fix it, maybe they made it like that so we are forced to buy a 1080 for higher memory clocks which is very useful in high resolutions, a significant difference in anything above 1080p .. but we will see
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also forcing high voltage on chips to become stable means they are not good chips to begin with, good chips run on low voltage and deal with fluctuations pretty well, so Samsung is much superior in that department, there is just no way around this you see.
> 
> Why in hell are you under-volting your card by the way?!


Well okay, that explains it; I never had a Samsung 1070









And well, I'm not sure it's a simple fix, but applying the voltage before ramping up the memory clock at least seems to help with some Micron cards :/ But you have a point, we shouldn't have to resort to such measures either way :/

As for undervolting: I have a Palit DUAL 1070, which has a custom cooler, but a very cheap one that reaches 83°C running Witcher 3 or Overwatch at 1440p, which was a tad high for my taste. Since I don't like additional noise I didn't want to ramp the fans up, so I tried undervolting and it worked well. A friend's R9 390 gave me the idea: he runs it in a mini-ITX case and it frequently reached 85°C, so I undervolted it by 80mV and it lost almost 10°C at the same clocks.


----------



## MyNewRig

Quote:


> Originally Posted by *TheGlow*
> 
> Sorry, been away for last 10 days or so and buried in new posts.
> So my Micron is really a god then or something?
> What I noticed was that installing the MSI Gaming App adds the MSI gaming service, which triggers an OSD exe to run. That exe stops the card from ever really idling down into 2D clocks (not sure of the terms), so it always gets at least 0.725 V or so.
> Maximum performance was set in nvidia panel. Thats it.
> If I stop that MSI service, I checkerboard lock up immediately.
> If I have the profile set at start up, theres a good chance I instantly lock up. So I need to apply after since it takes a moment for that service to kick in.
> If I do not have the MSI service running, then apply about +500 on the memory, it will appear to be fine as im in low voltage. Opening many apps like MS Edge will immediately kick the checkerboards up.
> I ordered it on NewEgg, August 15th.


Very interesting stuff. The amount of work and effort one has to put in to MAYBE be able to overclock the Micron memory, and it is not even a real OC, rather a workaround.

When you set "Max Performance" in the Nvidia control panel, the driver switches from "Let Application Decide" mode to a totally custom mode. What do you do with the other settings in the profile? Leave them at defaults?

What brand and model of card do you have? Your numbers are mind-blowing


----------



## zipper17

Quote:


> Originally Posted by *Fosion*
> 
> http://www.3dmark.com/fs/9932598


Quote:


> Originally Posted by *msigtx760tf4*
> 
> check this out !! 5000 on memory is no problem for my Palit gamerock
> the core is weak
> 
> http://www.3dmark.com/3dm/14962848


Damn, can you provide FS Extreme stress test result?

I still struggle to get past even a 21K Graphics score; looks like I have a less capable chip.







(Or I'm just a noob at overclocking.)

I can get the memory running at +700MHz (9400MHz), but I noticed a very brief abnormal black screen while testing FS Extreme Graphics Test 2, so I lowered it down.


----------



## TheGlow

Quote:


> Originally Posted by *MyNewRig*
> 
> Very interesting stuff the amount of work and effort one has to put to MAYBE be able to overclock the Micron memory and it is not even a real OC, rather a workaround,
> 
> When you set "max performance" in Nvidia control panel, the driver switched from "Let Application Decide" mode to a totally custom mode, what do you do to other settings in the profile? leave them at defaults?
> 
> What brand and model of card do you have? your numbers are mind-blowing


The work I put in was technically backwards.
I had installed the MSI Gaming App initially and then decided not to use it and to use Afterburner instead.
That's when I noticed the clocks never idled anymore; however, I never had any problems trying to OC it, other than when picking outright crazy numbers like +250 clock or +900 memory.
Then I removed all drivers/apps to get the 2D clocks to work, and that's when I inadvertently experienced the insta-locks with a medium-to-high clock.

This is an MSI Gaming X.

I set it to Max Performance but wasn't aware of anything else being changed, so I would assume defaults.
What kind of settings would have been impacted? I'll need to take a look. Witcher 3 and Overwatch were the main things, but I've had some personal issues/stress and was just playing Diablo 3 for the mindlessness of it.


----------



## MyNewRig

Quote:


> Originally Posted by *TheGlow*
> 
> The work I put in technically was backwards.
> I had installed the MSI gaming app initially and then decided to not use it and use AfterBurner instead.
> Thats when I noticed the clocks never idled anymore, however I never had any problems trying to OC it other then picking outright crazy numbers like +250 clock or +900 memory.
> Then I removed all drivers/apps to get the 2d clocks to work, and thats when I inadvertently experienced the insta-locks with a medium to high clock.
> 
> This is an MSI Gaming X.
> 
> I set it to Max performance, but wasn't aware of anything else being changed, so I would assume defaults.
> What kind of settings would have been impacted? Ill need to take a look. Witcher3 and Overwatch were the main things, but had some personal issues/stress and was just playing Diablo3 for the mindlessness of it.


On the first page of the "3D Settings" in the control panel, "Adjust Image Settings with Preview", the option "Let the 3D application decide" is selected by default, which leaves it up to the game to decide how rendering is done.

When you go to the next page, "Manage 3D Settings", and make ANY kind of change there, including the power settings, the driver switches to "Use the advanced 3D image settings", which uses the default profile to now OVERRIDE any game settings. So, for example, if you have AA disabled in that general profile and enable AA in the game settings, I think the general profile settings will be the ones applied.

If you have Max Performance configured now, go back to the first page of the "3D Settings" and you will notice that "Use the advanced 3D image settings" has been auto-selected for you, replacing the default "Let the 3D application decide", and it will override whatever settings you configure in the game itself.

This is why I did not try this Max Performance trick to stabilize my VRAM: it messes up too many other things.
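If you want to verify the card is actually dropping back to its 2D clocks after reverting that setting, one way is to poll `nvidia-smi` while the desktop is idle. A rough sketch, assuming `nvidia-smi` is on the PATH; the idle-clock ceilings below are guesses for a GTX 1070, so adjust them for your card:

```python
import subprocess

# Guessed idle (2D) clock ceilings for a GTX 1070; adjust for your card/BIOS.
IDLE_CORE_MHZ = 300
IDLE_MEM_MHZ = 500

def parse_clocks(csv_line):
    """Parse one 'core, mem' line from
    nvidia-smi --query-gpu=clocks.gr,clocks.mem --format=csv,noheader,nounits"""
    core, mem = (int(x.strip()) for x in csv_line.split(","))
    return core, mem

def is_idling(core_mhz, mem_mhz):
    """True if both clocks are at or below the assumed 2D-state ceilings."""
    return core_mhz <= IDLE_CORE_MHZ and mem_mhz <= IDLE_MEM_MHZ

def card_is_idling():
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=clocks.gr,clocks.mem",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True).stdout.strip()
    return is_idling(*parse_clocks(out))
```

Run `card_is_idling()` with nothing on screen: if it keeps returning False, something (a driver setting or a background app) is still pinning the 3D clocks.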


----------



## criminal

Quote:


> Originally Posted by *MyNewRig*
> 
> Stuff


The Micron memory situation on the GTX 1070 and the GTX 970 issue are WAY different things. Not even comparable.

Anyway, I don't expect you to ever get what I am saying, so I am done with this discussion.

Oh yeah, my FE has Samsung and does 9312 on the memory. Should've gone with an FE!


----------



## MyNewRig

Quote:


> Originally Posted by *criminal*
> 
> The Micron memory situation on the GTX 1070 and the GTX 970 issue are WAY different things. Not even comparable.
> 
> Anyway, I don't expect you to ever get what I am saying, so I am done with this discussion.


You were not even a part of the discussion to begin with. From the way you sound, I believe you either don't own a 1070 at all, don't own any Pascal card, or have one with Samsung memory.

So in any case you do not seem to be affected by the issue, and therefore I do not expect you to understand our situation or relate to it.

Those of us who bought a 1070 with Micron memory simply do not accept the situation, and for a bunch of very good reasons. You do not seem to be part of this group, so you will never understand our frustration. Whatever, man; I have heard your point of view many times before, and I honestly don't care what you think, because it is not helping the situation. You are just supporting the manufacturers in screwing us over even more. Not a very nice stance!


----------



## msigtx760tf4

http://www.3dmark.com/3dm/14964241


----------



## msigtx760tf4

Quote:


> Originally Posted by *zipper17*
> 
> Damn, can you provide an FS Extreme stress test result?
> 
> I still struggle to get past even a 21K graphics score. Looks like I have a worse chip (or I'm just a noob at overclocking).
> 
> I can get memory running at +700MHz (9400MHz), but I noticed a very brief abnormal black screen while testing FS Extreme Graphics Test 2, so I lowered it back down.


No problem.

http://www.3dmark.com/3dm/14964241

Just broke 16K in Fire Strike; that's all on my rig.

http://www.3dmark.com/3dm/14964806


----------



## zipper17

Quote:


> Originally Posted by *msigtx760tf4*
> 
> No problem.
> 
> http://www.3dmark.com/3dm/14964241
> 
> Just broke 16K in Fire Strike; that's all on my rig.
> 
> http://www.3dmark.com/3dm/14964806


Can you share your OC tweak settings? Any curve or offset settings?

I mean the FS Extreme stress test, like this:

Consistency-under-load stability test.

Did you test those? If so, you got a nice chip. Congrats.


----------



## MyNewRig

Quote:


> Originally Posted by *criminal*
> 
> Oh yeah, my FE has Samsung and does 9312 on the memory. Should've gone with an FE!


Yeah, that was easy to tell; you sound like a happy guy with a Samsung card.

I personally still CAN go with whatever card my heart desires, because 1) I am still within my return period and 2) my card has horrible coil whine and I have an approved RMA already.

So I can go with the FE, or anything else, or even get my money back; all my options are open.

What I am really trying to find out is which 1070 to get, including the FE, or whether to just skip Pascal altogether, which I have no problem doing at all. I simply don't care all that much.

How is your experience with the FE when it comes to noise and temps? Do you have any coil whine or other complaints? And are all FEs exactly the same regardless of brand: EVGA, ASUS, MSI, whatever?


----------



## Awsan

3DMark 11 score

Can I get some other 3DMark 11 scores?


----------



## criminal

Quote:


> Originally Posted by *MyNewRig*
> 
> Yeah, that was easy to tell; you sound like a happy guy with a Samsung card.
> 
> I personally still CAN go with whatever card my heart desires, because 1) I am still within my return period and 2) my card has horrible coil whine and I have an approved RMA already.
> 
> So I can go with the FE, or anything else, or even get my money back; all my options are open.
> 
> What I am really trying to find out is which 1070 to get, including the FE, or whether to just skip Pascal altogether, which I have no problem doing at all. I simply don't care all that much.
> 
> How is your experience with the FE when it comes to noise and temps? Do you have any coil whine or other complaints? And are all FEs exactly the same regardless of brand: EVGA, ASUS, MSI, whatever?


So the FE would be what I would do if I were that unhappy about the Micron VRAM. I believe all FEs still get Samsung.

I have no coil whine; even in game menus that run at excessively high frame rates I don't get any. As far as I know all brands are the same (they all have the same OEM). Temps and noise with the stock cooler were not bad, but I water-blocked the card anyway. No complaints.


----------



## Dude970

Quote:


> Originally Posted by *Awsan*
> 
> 3D Mark 11 Score
> 
> Can i get some other 3D mark 11 scores?


Here is a run I did on 3D Mark 11

http://www.3dmark.com/3dm11/11545706


----------



## Hunched

I feel like it's not even worth overclocking VRAM at this point, memory overclocking whether it be RAM or VRAM is always nightmarish with timings and more variables than simplistic core overclocking.
Not to mention you can't even control timings for GPU memory overclocking...

I don't know if somehow my PC configuration could have caused this, or if it's just an odd coincidence that both 1070's I've had with Samsung memory landed at around +300 maximum stable.
I can run benchmarks like 3DMark FireStrike and TimeSpy 20 times in a row at +600 to +700 without issue like everyone else, but play actual games for hundreds of hours like BF4 or ROTTR and you will have issues.

So right there, ignore 100% of people telling you their 3DMark stable memory overclocks, that information could not be more worthless.
Also, another thing tons of people probably overlook: just because your games aren't crashing and there aren't constant in-your-face artifacts doesn't mean your memory is stable.
All these people saying things like "+800 is stable but I get no more score in 3DMark than +400" *then you're not stable above +400*

I could have an unstable memory overclock and it could take 20+ hours of BF4 gameplay for a visual artifact to appear, in less demanding games it may never happen.
Also, there's Micron memory that overclocks better than Samsung memory.

All I see here is a lot of guessing, stability overestimating, lack of testing, lack of proof and information.
Nobody seems to have a clue what they're talking about, I wouldn't even consider myself close to an expert yet I'm learning nothing from anything anyone is saying right now.

I'm almost certain that if I could test everyone's cards for tens of hours, I'd find nearly everyone's stable Samsung overclock is actually far less than they believe it to be, and that Micron really isn't that far behind.
My Samsung 1070 is stable at +300 or less, but again, if I did what everyone else did I'd be saying it's stable at +700, because it is in 3DMark, even after 20+ runs while most people here probably do like 5.

There may or may not be a problem, but nobody here seems capable of figuring anything out or testing anything properly, so who knows for sure.


----------



## RaleighStClair

Quote:


> Originally Posted by *Hunched*
> 
> I feel like it's not even worth overclocking VRAM at this point, memory overclocking whether it be RAM or VRAM is always nightmarish with timings and more variables than simplistic core overclocking.
> Not to mention you can't even control timings for GPU memory overclocking...
> 
> I don't know if somehow my PC configuration could have caused this, or if it's just an odd coincidence that both 1070's I've had with Samsung memory landed at around +300 maximum stable.
> I can run benchmarks like 3DMark FireStrike and TimeSpy 20 times in a row at +600 to +700 without issue like everyone else, but play actual games for hundreds of hours like BF4 or ROTTR and you will have issues.
> 
> So right there, ignore 100% of people telling you their 3DMark stable memory overclocks, that information could not be more worthless.
> Also another thing tons of people probably overlook, just because your games aren't crashing and there aren't constant in your face artifacts, doesn't mean your memory is stable.
> All these people saying things like "+800 is stable but I get no more score in 3DMark than +400" *then you're not stable above +400*
> 
> I could have an unstable memory overclock and it could take 20+ hours of BF4 gameplay for a visual artifact to appear, in less demanding games it may never happen.
> Also, there's Micron memory that overclocks better than Samsung memory.
> 
> All I see here is a lot of guessing, stability overestimating, lack of testing, lack of proof and information.
> Nobody seems to have a clue what they're talking about, I wouldn't even consider myself close to an expert yet I'm learning nothing from anything anyone is saying right now.
> 
> I'm almost certain if I could test everyone's cards for 10's of hours I'd find nearly everyone's stable Samsung overclock is actually far less than they believe it to be, and that Micron really isn't that far behind.
> My Samsung 1070 is stable at +300 or less, but again, if I did what everyone else did I'd be saying it's stable at +700, because it is in 3DMark, even after 20+ runs while most people here probably do like 5.
> 
> There may or may not be a problem, but nobody here seems capable of figuring anything out or testing anything properly, so who knows for sure.


This is OCN, no one here _actually_ plays games lol.

OCN in a nutshell:

- set up beast PC

- OC

- buy better cooler for OC potential

- OC some more

- find somewhat stable OC

- run benchmarks

- Test in-game, if stable, never load game again

- Repeat.


----------



## syl1979

On my 970, I was able to run a stable +650 on memory (Samsung). When going over that point there were no artifacts, but performance decreased, meaning the error-correction functionality of GDDR5 was kicking in.

My 1070 also has Samsung memory. However, past +425 I start to see artifacts. I can make full runs of Fire Strike or Time Spy at +625; even though what I see is a July 14th fireworks show of artifacts, the performance IS still increasing, letting me pass a 22000 graphics score.

I BELIEVE that NVIDIA decided to disable the memory error checking (in their memory controller?) in order to achieve such high core frequencies.

They may have developed this using Samsung memory only, without seeing any issue when switching between load/idle.

Then card manufacturers started to use Micron memory... and then the issues came for those unlucky in the memory silicon lottery.
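That error-correction behaviour suggests a way to find the sweet spot without staring at the screen: sweep the memory offset, record a benchmark score at each step, and stop at the first regression. A minimal sketch of that selection logic; the sweep numbers below are made-up placeholders, and in practice you would feed in your own benchmark results:

```python
def best_offset(results):
    """Given [(offset_mhz, score), ...] in ascending offset order, return the
    offset whose score is highest before the first regression, i.e. the point
    where GDDR5 error retries start eating the clock gains (the 'memory hole')."""
    best = results[0]
    for offset, score in results[1:]:
        if score < best[1]:
            break  # score went down: error-correction overhead won
        best = (offset, score)
    return best[0]

# Made-up scores shaped like the +400 peak / +500 hole pattern reviewers saw:
sweep = [(0, 20100), (200, 20600), (400, 21050), (500, 20700), (600, 20800)]
```

With those placeholder numbers, `best_offset(sweep)` picks +400: the +500 score regresses even though +600 "passes", which is exactly the hole reviewers described.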


----------



## Awsan

Quote:


> Originally Posted by *Dude970*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Awsan*
> 
> 3D Mark 11 Score
> 
> Can i get some other 3D mark 11 scores?
> 
> 
> 
> Here is a run I did on 3D Mark 11
> 
> http://www.3dmark.com/3dm11/11545706

This is the best i can achieve

http://www.3dmark.com/3dm11/11594400

+125 on core and +500 on memory power limit set to 111% anything past that will crash instantaneously + i saw a constant 2050 core and 4500 memory using EVGA PXOC while benchmarking


----------



## Dude970

Quote:


> Originally Posted by *Awsan*
> 
> This is the best I can achieve:
> 
> http://www.3dmark.com/3dm11/11594400
> 
> +125 on core and +500 on memory with the power limit set to 111%; anything past that crashes instantly. I saw a constant 2050 core and 4500 memory in EVGA PXOC while benchmarking.


Very good score! I am thinking of upgrading to a 6700K.


----------



## Awsan

Quote:


> Originally Posted by *Dude970*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Awsan*
> 
> This is the best I can achieve:
> 
> http://www.3dmark.com/3dm11/11594400
> 
> +125 on core and +500 on memory with the power limit set to 111%; anything past that crashes instantly. I saw a constant 2050 core and 4500 memory in EVGA PXOC while benchmarking.
> 
> Very good score! I am thinking of upgrading to a 6700K.

Yeah, that 6700K is killing it, but the 1070 is trying to kill me.


----------



## Hunched

Quote:


> Originally Posted by *RaleighStClair*
> 
> This is OCN, no one here _actually_ plays games lol.
> 
> OCN in a nutshell:
> 
> - setup up beast PC
> 
> - OC
> 
> - buy better cooler for OC potential
> 
> - OC some more
> 
> - find somewhat stable OC
> 
> - run bechmarks
> 
> - Test in-game, if stable, never load game again
> 
> - Repeat.


It's unfortunate that nearly all the information here is very misleading, since barely anyone tests anything properly.

Until you've put 5-10 hours into 5-10 different games at least as "fragile" as BF4, you can't begin to say your overclock is stable.
I'm sure everyone here has run FireStrike 500 times, eyes peeled for every loop, for at least 10 hours.

Even if someone did that, which nobody here has, it would still be less useful than actually playing games.

I truly wish memory instability were as simple as, say, CPU core instability, where as long as you don't crash/BSOD you're 100% stable.
You have to watch for those often rare and infrequent visual signs of instability.

It seemed like my +300 memory was stable after the last 20 hours of Rainbow Six Siege.
Then suddenly a map loads and textures are black, pink, and green.
That didn't happen in the previous 200 R6S hours on my 970, so probably not a game issue.

I really dislike being uncertain, and you never know when your memory overclock is going to suddenly give you a sign it's not so solid.
A quick test to find instability would be a blessing, but it doesn't exist. Memory is so god damn finicky and varies game to game.
One day you might think you've finally got it, just to play the new BF1 or Star Citizen or whatever, and nope, you were wrong the whole time.


Quote:


> Originally Posted by *syl1979*
> 
> On my 970, I was able to run a stable +650 on memory (Samsung). When going over that point there were no artifacts, but performance decreased, meaning the error-correction functionality of GDDR5 was kicking in.
> 
> My 1070 also has Samsung memory. However, past +425 I start to see artifacts. I can make full runs of Fire Strike or Time Spy at +625; even though what I see is a July 14th fireworks show of artifacts, the performance IS still increasing, letting me pass a 22000 graphics score.
> 
> I BELIEVE that NVIDIA decided to disable the memory error checking (in their memory controller?) in order to achieve such high core frequencies.
> 
> They may have developed this using Samsung memory only, without seeing any issue when switching between load/idle.
> 
> Then card manufacturers started to use Micron memory... and then the issues came for those unlucky in the memory silicon lottery.


I'm pretty certain error checking is still a thing, since people claim to be stable at +700 or more with no performance increase over something like +400.
That's error checking keeping the performance flat; their +700 is not stable just because it passes without artifacts.

Are you saying you get artifacting at +625 in 3DMark?
Or does 3DMark look fine at +625 while games artifact at as low as +425? Because that would be similar to my experience.


----------



## syl1979

Quote:


> Originally Posted by *Hunched*
> 
> Are you saying you do get artifacting at +625 in 3DMark?
> Or does 3DMark look fine at +625, but as low as +425 games will artifact? Because that would be similar to my experience.


I mean I don't see artifacts at +425. At +500 I see some.
Past +550 they start to cover the whole screen.
At +625 it is fireworks.

And that's even if I lower the core frequency and force the core voltage to 1.09v.

Maybe it is BIOS-related on my card (GALAX/KFA2 1070 Gamer).


----------



## TheGlow

Quote:


> Originally Posted by *MyNewRig*
> 
> On the first page of the "3D Settings" in the control panel, "Adjust Image Settings with Preview", the option "Let the 3D application decide" is selected by default, which leaves it up to the game to decide how rendering is done.
> 
> When you go to the next page, "Manage 3D Settings", and make ANY kind of change there, including the power settings, the driver switches to "Use the advanced 3D image settings", which uses the default profile to now OVERRIDE any game settings. So, for example, if you have AA disabled in that general profile and enable AA in the game settings, I think the general profile settings will be the ones applied.
> 
> If you have Max Performance configured now, go back to the first page of the "3D Settings" and you will notice that "Use the advanced 3D image settings" has been auto-selected for you, replacing the default "Let the 3D application decide", and it will override whatever settings you configure in the game itself.
> 
> This is why I did not try this Max Performance trick to stabilize my VRAM: it messes up too many other things.


Yes, it looks like that's exactly what happened.
I just put it back to "Let the 3D application decide", and "Prefer maximum performance" is still set.
I'll leave it that way for now and see how it goes.


----------



## zipper17

Quote:


> Originally Posted by *Hunched*
> 
> I feel like it's not even worth overclocking VRAM at this point, memory overclocking whether it be RAM or VRAM is always nightmarish with timings and more variables than simplistic core overclocking.
> Not to mention you can't even control timings for GPU memory overclocking...
> 
> I don't know if somehow my PC configuration could have caused this, or if it's just an odd coincidence that both 1070's I've had with Samsung memory landed at around +300 maximum stable.
> I can run benchmarks like 3DMark FireStrike and TimeSpy 20 times in a row at +600 to +700 without issue like everyone else, but play actual games for hundreds of hours like BF4 or ROTTR and you will have issues.
> 
> So right there, ignore 100% of people telling you their 3DMark stable memory overclocks, that information could not be more worthless.
> Also another thing tons of people probably overlook, just because your games aren't crashing and there aren't constant in your face artifacts, doesn't mean your memory is stable.
> All these people saying things like "+800 is stable but I get no more score in 3DMark than +400" *then you're not stable above +400*
> 
> I could have an unstable memory overclock and it could take 20+ hours of BF4 gameplay for a visual artifact to appear, in less demanding games it may never happen.
> Also, there's Micron memory that overclocks better than Samsung memory.
> 
> All I see here is a lot of guessing, stability overestimating, lack of testing, lack of proof and information.
> Nobody seems to have a clue what they're talking about, I wouldn't even consider myself close to an expert yet I'm learning nothing from anything anyone is saying right now.
> 
> I'm almost certain if I could test everyone's cards for 10's of hours I'd find nearly everyone's stable Samsung overclock is actually far less than they believe it to be, and that Micron really isn't that far behind.
> My Samsung 1070 is stable at +300 or less, but again, if I did what everyone else did I'd be saying it's stable at +700, because it is in 3DMark, even after 20+ runs while most people here probably do like 5.
> 
> There may or may not be a problem, but nobody here seems capable of figuring anything out or testing anything properly, so who knows for sure.


Yeah, I don't think everyone tested their stability properly, but I believe some are legit.

At least I play a lot of Witcher 3, Hitman, and GTA5 at 1440p max settings; they always stress my card up to 99%.
Other games are AC3, PlanetSide 2, Crysis 1, BF3, Metro Redux, etc.
Combined with the 3DMark Fire Strike Extreme stress test or custom loops, not just running the benchmark.
I have doubts about Valley/Heaven because they seem less stressful; I never encounter a crash in them, while 3DMark has random crashes everywhere.
Do you think they're less demanding than BF4, ROTTR, Rainbow Six Siege?
So far I've tested with +500-600MHz and didn't notice any graphical errors.

Only if I try +700MHz does the memory start to produce errors in 3DMark Fire Strike Extreme Graphics Test 2 custom loops, like a very brief black screen or a weird light-green pop, which doesn't happen with memory at stock. Unfortunately I don't have BF4, ROTTR, or Rainbow Six Siege.

Memory stability is harder to detect than core stability.
An unstable GPU core usually produces crashes; you just need real benchmark tools or games that produce a good stress load.
Unstable memory produces graphical errors or performance loss, which requires eyes to see and long investigation.


----------



## RaleighStClair

Quote:


> Originally Posted by *zipper17*
> 
> Yeah, I don't think everyone tested their stability properly, but I believe some are legit.
> 
> At least I play a lot of Witcher 3, Hitman, and GTA5 at 1440p max settings; they always stress my card up to 99%.
> Other games are AC3, PlanetSide 2, Crysis 1, BF3, Metro Redux, etc.
> Combined with the 3DMark Fire Strike Extreme stress test or custom loops, not just running the benchmark.
> I have doubts about Valley/Heaven because they seem less stressful; I never encounter a crash in them, while 3DMark has random crashes everywhere.
> Do you think they're less demanding than BF4, ROTTR, Rainbow Six Siege?
> So far I've tested with +500-600MHz and didn't notice any graphical errors.
> 
> Only if I try +700MHz does the memory start to produce errors in 3DMark Fire Strike Extreme Graphics Test 2 custom loops, like a very brief black screen or a weird light-green pop, which doesn't happen with memory at stock. Unfortunately I don't have BF4, ROTTR, or Rainbow Six Siege.
> 
> Memory stability is harder to detect than core stability.
> An unstable GPU core usually produces crashes; you just need real benchmark tools or games that produce a good stress load.
> Unstable memory produces graphical errors or performance loss, which requires eyes to see and long investigation.


I always test using Witcher 3 and Battlefield 4. I play with some buddies in BF4 2-3 times a week, and testing while playing 3-4 hour sessions is a great way to check CPU and GPU stability; BF4 and Witcher 3 will let you know real quick if you are not stable.

I usually test for stability by looping Valley/Heaven for 2 hours, then play some games; you know, those things we buy all these expensive GPUs for.

I would definitely recommend playing actual games to check for stability. I don't OC for pointless scores or anything of that nature, but free FPS/performance is always appreciated, especially on a 144Hz monitor.


----------



## Hunched

Quote:


> Originally Posted by *zipper17*
> 
> Yeah, I don't think everyone tested their stability properly, but I believe some are legit.
> 
> At least I play a lot of Witcher 3, Hitman, and GTA5 at 1440p max settings; they always stress my card up to 99%.
> Other games are AC3, PlanetSide 2, Crysis 1, BF3, Metro Redux, etc.
> Combined with the 3DMark Fire Strike Extreme stress test or custom loops, not just running the benchmark.
> I have doubts about Valley/Heaven because they seem less stressful; I never encounter a crash in them, while 3DMark has random crashes everywhere.
> Do you think they're less demanding than BF4, ROTTR, Rainbow Six Siege?
> So far I've tested with +500-600MHz and didn't notice any graphical errors.
> 
> Only if I try +700MHz does the memory start to produce errors in 3DMark Fire Strike Extreme Graphics Test 2 custom loops, like a very brief black screen or a weird light-green pop, which doesn't happen with memory at stock. Unfortunately I don't have BF4, ROTTR, or Rainbow Six Siege.
> 
> Memory stability is harder to detect than core stability.
> An unstable GPU core usually produces crashes; you just need real benchmark tools or games that produce a good stress load.
> Unstable memory produces graphical errors or performance loss, which requires eyes to see and long investigation.


I don't have 3DMark Extreme, but I definitely find BF4, the BF1 beta, ROTTR, and other highly detailed games way better than the free 3DMark and Unigine benches.
I was surprised when I saw issues in Rainbow Six Siege; it had been far more lenient than the others.

I've never had a crash or freeze running 3DMark at +600 to +700, there was a millisecond black flicker in about half the runs at +700 and that was it.
Meanwhile I've had BF4 lock up and freeze at +350 before, though I never get visual errors/artifacts.

During the BF1 Beta earlier this month I had freezes only in the server browser menu, never during gameplay.
My voltage was always at 1.093 during gameplay, but dropped to 1.081 on the server browser...
So always within 5-10 minutes it would freeze on the server browser menu, at +300.

Unfortunately the beta ended before I could see how much I need to lower the memory until the freezing stops.
I found out how to stop my voltage from dropping from 1.093 to 1.081 right after, which would have likely solved this freezing.

Most recently R6S was unhappy with 1.093v and +300 once.
So now I feel like just throwing it at +250 and hopefully get 100% stability.
I feel like I've been losing progress for months, constant disappointment and lowering of overclocks.
The road to +0.

Also, ROTTR first showed black-box artifacts at +320, after a couple of hours at that clock.
+250 is probably pretty solid for my card considering where issues begin, but that number was not worth all this.

I should probably just accept +250 and stop trying to squeeze every last MHz out of my card.
I'm about ready to give up on +300 or higher.


----------



## zipper17

Quote:


> Originally Posted by *Hunched*
> 
> I don't have 3DMark Extreme, but I definitely find BF4, BF1 Beta, ROTTR, and other highly detailed games way better than free 3DMark and Unigine benches.
> I was surprised when I saw issues in Rainbow Six Siege, it had been far more lenient compared to the others.
> 
> I've never had a crash or freeze running 3DMark at +600 to +700, there was a millisecond black flicker in about half the runs at +700 and that was it.
> Meanwhile I've had BF4 lock up and freeze at +350 before, though I never get visual errors/artifacts.
> 
> During the BF1 Beta earlier this month I had freezes only in the server browser menu, never during gameplay.
> My voltage was always at 1.093 during gameplay, but dropped to 1.081 on the server browser...
> So always within 5-10 minutes it would freeze on the server browser menu, at +300.
> 
> Unfortunately the beta ended before I could see how much I need to lower the memory until the freezing stops.
> I found out how to stop my voltage from dropping from 1.093 to 1.081 right after, which would have likely solved this freezing.
> 
> Most recently R6S was unhappy with 1.093v and +300 once.
> So now I feel like just throwing it at +250 and hopefully get 100% stability.
> I feel like I've been losing progress for months, constant disappointment and lowering of overclocks.
> The road to +0.
> 
> Also, ROTTR first showed black-box artifacts at +320, after a couple of hours at that clock.
> +250 is probably pretty solid for my card considering where issues begin, but that number was not worth all this.
> 
> I should probably just accept +250 and stop trying to squeeze every last MHz out of my card.
> I'm about ready to give up on +300 or higher.


What is your core clock offset? Do you OC it? Are you sure it's stable? That could be the culprit.

Saying 3DMark is less stressful than games isn't right either. People probably just ran the benchmark and didn't try to test stability in more depth. 3DMark free? The paid versions are cheap on Steam when they're 80% off. I can run Unigine and heavy games just fine with my tweaks, but when I run 3DMark it's random crashes everywhere. You probably still haven't found the tweaks for full 100% stability yet.

Every test and game has a different amount of stress load; you would need to test every application to be fully stable.

I will keep tracking my +500-600MHz in every application. I always make sure my core speed runs stable first, without touching memory; only after it's stable do I OC the memory.
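That core-first, then-memory ordering can be sketched as a simple search over offsets. Caveat from earlier in the thread: memory "holes" mean stability is not always monotonic in the offset, so a binary search like this is only a first pass, and the `is_stable` callback is a hypothetical stand-in for whatever real test you trust (a long game session, a stress-test loop); none of this is a real overclocking API:

```python
def find_max_stable(lo, hi, is_stable, step=25):
    """Binary-search the highest offset (MHz) in [lo, hi] whose is_stable()
    check passes, narrowing until the window is within `step` MHz.
    Assumes stability is monotonic, which memory 'holes' can violate."""
    best = lo if is_stable(lo) else None
    while hi - lo > step:
        mid = (lo + hi) // 2
        if is_stable(mid):
            best, lo = mid, mid  # passed: search higher
        else:
            hi = mid             # failed: search lower
    return best

# Fake pass/fail boundary at +450, purely for illustration:
fake = lambda offset: offset <= 450
```

With the fake boundary, `find_max_stable(0, 800, fake)` converges on +450 in a handful of tests instead of dozens of linear steps; with a real `is_stable` each probe would be hours of gameplay, which is exactly why narrowing the window quickly matters.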

This thread also seems less active than the others.

Now I kinda regret buying an NVIDIA product. *** GOW4 is only redeemable from Sept 20 to Oct 30, really??


----------



## Hunched

Quote:


> Originally Posted by *zipper17*
> 
> What is your core clock offset? Do you OC it? Are you sure it's stable?
> 
> Saying 3DMark is less stressful than games isn't right either. You probably just ran the benchmark and didn't try to test stability in more depth. 3DMark free? The paid versions are cheap on Steam when they're 80% off.
> 
> Every test and game has a different amount of stress load; you would need to test every application to be fully stable.


I haven't touched my core clock much; it's at stock until I'm comfortable with the memory. Memory gives bigger performance gains, so it's what I've been focusing on.
The core boosts over 2000MHz anyway, and it's not like I'll be able to get much higher than that; maybe 2100MHz if I'm super lucky.
Adjusting the memory definitely affects stability, i.e. whether or not I crash. If the core were unstable, I'd crash regardless of my memory clock.

The free version of 3DMark only lets you run the benchmark, it's what nearly everyone here is using to post their results, and it's not nearly as prone to crashing or artifacting as games have been.


----------



## zipper17

Quote:


> Originally Posted by *Hunched*
> 
> I haven't touched my core clock much; it's at stock until I'm comfortable with the memory. Memory gives bigger performance gains, so it's what I've been focusing on.
> The core boosts over 2000MHz anyway, and it's not like I'll be able to get much higher than that; maybe 2100MHz if I'm super lucky.
> Adjusting the memory definitely affects stability, i.e. whether or not I crash. If the core were unstable, I'd crash regardless of my memory clock.
> 
> The free version of 3DMark only lets you run the benchmark, it's what nearly everyone here is using to post their results, and it's not nearly as prone to crashing or artifacting as games have been.


It's not really right; core > memory, IMO.

That's the culprit. Just like you said, people don't understand how to really test their product's stability after an OC; there is a mode called the 3DMark stress test.
I can pass the benchmark easily, but the stress test crashes after a couple of loops. After 3DMark I throw in Witcher 3, etc.

Don't get me wrong, everyone has their own business. I've said enough; it's up to everyone. Be happy.

Thanks for your cautionary post, though. I will be extra careful with that.


----------



## GreedyMuffin

I fold 24/7 when not gaming, so I believe my OC is stable after two weeks of 24/7 folding.

My memory starts to decrease in performance past +537, so I run +498 daily.


----------



## zipper17

Quote:


> Originally Posted by *syl1979*
> 
> On my 970, I was able to run a stable +650 on memory (Samsung). When going over that point there were no artifacts, but performance decreased, meaning the error-correction functionality of GDDR5 was kicking in.
> 
> My 1070 also has Samsung memory. However, past +425 I start to see artifacts. I can make full runs of Fire Strike or Time Spy at +625; even though what I see is a July 14th fireworks show of artifacts, the performance IS still increasing, letting me pass a 22000 graphics score.
> 
> I BELIEVE that NVIDIA decided to disable the memory error checking (in their memory controller?) in order to achieve such high core frequencies.
> 
> They may have developed this using Samsung memory only, without seeing any issue when switching between load/idle.
> 
> Then card manufacturers started to use Micron memory... and then the issues came for those unlucky in the memory silicon lottery.


Quote:


> Originally Posted by *Dude970*
> 
> Here is a run I did on 3D Mark 11
> 
> http://www.3dmark.com/3dm11/11545706


Do you guys use an offset or a custom curve? Is there any specific trick to the custom-curve method? Maybe I missed some tricks for Pascal overclocking.

I still can't get past 21K in Firestrike or 10K in FS Extreme; anything higher crashes, lol. That's my goal if I'm lucky. Some people must have better chips.


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Skyblaze*
> 
> That would certainly explain why my 1070 ran fine with +250 memory at first. I upped the memory-clock while I was already ingame in Witcher 3 and after it was stable for 20 minutes of playing I thought it was fine. Though then as soon as I started a round of League of Legends I was met with a checkerboard freeze and even after rebooting and lowering the memory to +100 the game started but I saw weird black flickering everywhere along with driver-restarts :/
> 
> 
> 
> Welcome to the life of Micron GTX 1070 owners, you can never get that card 100% stable across the board, what a ****ty product
Click to expand...

Of course you can. You just need to try a bit harder.


----------



## Hunched

Quote:


> Originally Posted by *zipper17*
> 
> It's not really right, Core > memory IMO.
> 
> That's the culprit, Just like you said, because people doesn't understand how to Real Testing their product stability after OC, there is a settings called 3d mark stress test.
> i can run pass the benchmark easily, but stress test after couple of loops it crashes. After 3dmark i throw up witcher 3, etc.
> 
> Dont get me wrong everyone have their own business. I'm enough. its up to everyone. and be happy.
> 
> Thanks for your conservative post though. I will extra cautious with that.


It's been shown that memory overclocks on Pascal usually give bigger performance gains than core overclocks, since for most people there usually isn't much core headroom left to begin with.
The 3DMark stress test isn't part of the free version unfortunately, just the benchmarks.

I can't say I quite understand the last two lines of your post.
Quote:


> Originally Posted by *GreedyMuffin*
> 
> I fold 24/7 when not gaming, so I believe my OC us stable after two weeks of 24/7 folding.
> 
> My men starts to decrease in performance after 537+. So I run 498+ daily.


I'm curious, what's the significance of 537 and 498? I doubt you were stability-testing memory in 1 MHz increments.


----------



## TheGlow

Quote:


> Originally Posted by *Hunched*
> 
> I feel like it's not even worth overclocking VRAM at this point, memory overclocking whether it be RAM or VRAM is always nightmarish with timings and more variables than simplistic core overclocking.
> Not to mention you can't even control timings for GPU memory overclocking...
> 
> I don't know if somehow my PC configuration could have caused this, or if it's just an odd coincidence that both 1070's I've had with Samsung memory landed at around +300 maximum stable.
> I can run benchmarks like 3DMark FireStrike and TimeSpy 20 times in a row at +600 to +700 without issue like everyone else, but play actual games for hundreds of hours like BF4 or ROTTR and you will have issues.
> 
> So right there, ignore 100% of people telling you their 3DMark stable memory overclocks, that information could not be more worthless.
> Also another thing tons of people probably overlook, just because your games aren't crashing and there aren't constant in your face artifacts, doesn't mean your memory is stable.
> All these people saying things like "+800 is stable but I get no more score in 3DMark than +400" *then you're not stable above +400*
> 
> I could have an unstable memory overclock and it could take 20+ hours of BF4 gameplay for a visual artifact to appear, in less demanding games it may never happen.
> Also, there's Micron memory that overclocks better than Samsung memory.
> 
> All I see here is a lot of guessing, stability overestimating, lack of testing, lack of proof and information.
> Nobody seems to have a clue what they're talking about, I wouldn't even consider myself close to an expert yet I'm learning nothing from anything anyone is saying right now.
> 
> I'm almost certain if I could test everyone's cards for 10's of hours I'd find nearly everyone's stable Samsung overclock is actually far less than they believe it to be, and that Micron really isn't that far behind.
> My Samsung 1070 is stable at +300 or less, but again, if I did what everyone else did I'd be saying it's stable at +700, because it is in 3DMark, even after 20+ runs while most people here probably do like 5.
> 
> There may or may not be a problem, but nobody here seems capable of figuring anything out or testing anything properly, so who knows for sure.


Sadly I don't have the full 3DMark, so I can only do a few benches before I lose my sanity staring at the demos.
I've gotten it to not crash at +210/+825. There were some flashes and artifacts past around +805-810 on the memory, but again no outright crashes.
Core +215 would be a gamble on whether it would crash or not.
At +210/+800 I don't observe any issues.
Since I'm unsure how it might behave long term, I've settled on +180/+700 for daily gaming use with no issues. I'll need to double-check, but I think the core voltage is still +0 on that configuration.
No issues at all in Witcher 3 at 1440p, everything maxed but HairWorks. That gives 75-85 fps. I haven't tested with HairWorks after the OC, but I need to.
Overwatch at 1440p with most settings maxed; one or two (fog and something else, I think) are one notch down, proven to have no real visual impact while saving some fps.
That floats around 110-140 fps.

Lately I've been playing Diablo 3, but that ran fine on my 260, so it's not even putting a dent in this card.


----------



## bigjdubb

I think we have discovered some new and interesting things here. It appears as though not all chips overclock equally, regardless of brand or manufacture date. We must inform the world of this revelation....

(the world responded with: no **** Sherlock, we've known this for nearly three decades)


----------



## TheGlow

Quote:


> Originally Posted by *bigjdubb*
> 
> I think we have discovered some new and interesting things here. It appears as though not all chips overclock equally regardless of brand or manufacture date. We must inform the world of this revelation....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (the world responded with: no **** Sherlock, we've known this for nearly three decades)


My first AMD had a jumper that would OC it from 1.0 GHz to 1.3 GHz!


----------



## monza1412

Quote:


> Originally Posted by *MyNewRig*
> 
> By the looks of it this smells very fishy, specifications downgrade two months after launch and after sending review samples out without any announcement or admission of the issue, and without even the slightest willingness to share any information about this.
> 
> So unless this thing really blows up like the GTX 970 3.5GB memory issue nothing will happen!


I complained about this same issue a couple of weeks ago. The shadiest part is that all the 1070 reviews I found were made with cards carrying Samsung ICs, totally misleading the consumer about what to expect, until yesterday:

http://www.madshrimps.be/articles/article/1000901/KFA2-GeForce-GTX-1070-EX-OC-8GB-Video-Card-Review/4#axzz4Ktrx5Kl0

Yes, I know they are running within spec, and that the OC is not guaranteed and all that, but still.

Someone said the cards can be identified by the serial number on the box. I can't confirm that's true, so the safe bet, if you still want a 1070 with Samsung RAM, is to get a FE.


----------



## bigjdubb

Even if we ignore the fact that not all Samsung memory overclocks well and not all Micron memory overclocks poorly, it's not misleading to the consumer. I would be willing to bet that every review you read has a disclaimer at the beginning of the overclocking section stating that results will vary. Only consumers should be held responsible for consumer ignorance.


----------



## monza1412

Quote:


> Originally Posted by *bigjdubb*
> 
> Even if we ignore the fact that not all Samsung memory overclocks well and not all Micron memory overclocks poorly, it's not misleading to the consumer. I would be willing to be that every review you read will have a disclaimer at the beginning of the section on overclocking stating that results will vary. Only consumers should be held responsible for consumer ignorance.


You don't get it, do you? The disclaimer in every review should be that the cards reviewed are cherry-picked and that retail cards, depending on the batch, carry different components. It's not consumer ignorance, it's plain disinformation.


----------



## zipper17

A little insight into how graphics cards are made (Zotac brand; this was from 2011):
http://www.computershopper.com/feature/inside-a-mainland-china-factory-how-a-graphics-card-is-made-in-100-pictures/(page)/9#review-section

Manufacturers really have used 3DMark (presumably the corporate edition) for quality and internal testing.
But we don't know how far they test. They probably only test at factory stock settings.
Overclocking is not part of QA, but who knows ;/

tl;dr

My card can't even run at 2138, 2126, or 2114 without crashing. Blame QA and their engineers ;/
If I were one of them, I'd probably tell the engineers to make an OC-stable card.


----------



## Nightingale

I don't get it. I thought it was common knowledge that while you tinker with your initial overclock using Heaven and 3DMark, the real test is always playing multiple demanding games. I mean, that is what the cards were made for: playing games. The real stable overclock is the one that doesn't crash or suffer performance penalties while playing.

I'm also paranoid about memory overclocks because of the built-in error correction.


----------



## zipper17

Quote:


> Originally Posted by *Nightingale*
> 
> I don't get it I thought it was common knowledge that while tinkering with your initial overclock using Heaven and 3Dmark the real test was always playing multiple high demanding games. I mean that is what the cards were made for, playing games. The real stable overclock is that which does not crash or suffer performance penalty's while playing.
> 
> I also am also paranoid about memory overclock cause of the built in error correction.


lol, my common knowledge about overclocking is to use every stability test available and run them all:
CPU, GPU, hard disk, RAM, etc., plus real apps and games. While gaming, all the components work together, right?

tl;dr
Highest overclock + 100% stable = perfect.


----------



## criminal

Quote:


> Originally Posted by *bigjdubb*
> 
> Even if we ignore the fact that not all Samsung memory overclocks well and not all Micron memory overclocks poorly, it's not misleading to the consumer. I would be willing to be that every review you read will have a disclaimer at the beginning of the section on overclocking stating that results will vary. Only consumers should be held responsible for consumer ignorance.


Exactly.
Quote:


> Originally Posted by *monza1412*
> 
> you don't get it do you? The disclaimer in every review should be that cards reviewed are cherry picked and the retail cards, depending on batch, are carring different components. Is not consumer ignorance, it's plain disinformation.


Are you new to this hobby? Manufacturers have been cherry-picking cards for reviews for as long as I can remember. Again, as long as they run at advertised specs, you are getting what you are paying for.


----------



## monza1412

sigh.

There is no worse blind man than the one who doesn't want to see. There is no worse deaf man than the one who doesn't want to hear. And there is no worse madman than the one who doesn't want to understand.


----------



## tps3443

Coil whine is not that big of a deal. When a card is outputting that many frames, it's just going to whine.

My 1080 whines like crazy exiting the Unigine Heaven benchmark on the advertising splash screen.


----------



## gtbtk

Quote:


> Originally Posted by *monza1412*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MyNewRig*
> 
> By the looks of it this smells very fishy, specifications downgrade two months after launch and after sending review samples out without any announcement or admission of the issue, and without even the slightest willingness to share any information about this.
> 
> So unless this thing really blows up like the GTX 970 3.5GB memory issue nothing will happen!
> 
> 
> 
> I complain about this same issue a couple of weeks ago. The most shady thing is that all the 1070 reviews I found were made with cards carring samsungs ic, totally misleading the consumer in what to expect, until yesterday
> 
> 
> 
> 
> 
> 
> 
> :
> 
> http://www.madshrimps.be/articles/article/1000901/KFA2-GeForce-GTX-1070-EX-OC-8GB-Video-Card-Review/4#axzz4Ktrx5Kl0
> 
> Yes, I know they are running under specs, and that the oc is not guaranteed and all that, but still.
> 
> Someone said that the cards can be identified with the serial number in the box, I can't say that's true though, so the safe bet if you still want to get a 1070 with samsung ram is get a FE.
Click to expand...

Up until now I'd been thinking bait-and-switch a bit myself. Well done on finding the first Micron review I have seen.


----------



## jasjeet

Should I be paying for one of the more expensive 1070 models, or do all of them overclock similarly?
Is the difference mostly down to cooling capacity?


----------



## msigtx760tf4

Quote:


> Originally Posted by *zipper17*
> 
> Can you share your oc tweaks setting? any curve or offset settings??
> 
> I mean FS extreme Stress test Like this:
> 
> Consistency Under load Stability test.
> 
> Did you test those? If so You got a Nice chip then. Congratz.


did right now

http://www.3dmark.com/3dm/14981876


----------



## tps3443

They all do roughly the same. The GTX 1070 FE is as good as any other model. I have a GTX 1080 FE, and it runs with the best, better than most actually. My card overclocks very well.

The biggest advantage of a GTX 1070 partner card is better cooling.

My last video card was a GTX 1070 SC ACX 3.0. It was a fantastic video card: quiet and cool.

Either way, you can heavily overclock a Founders Edition and keep it under 80C, and fairly quiet.

I run my GTX 1080 FE fan at around 58% max once it reaches 70C. It never touches 80C. It's very quiet, and I have a silent case fan blowing on it. I run my card at +235 core, 11,200 memory, 120% power limit, and +74% voltage.

The GTX 1070 FE will run a few degrees cooler, too.

You cannot go wrong with either model.

Good luck!


----------



## zipper17

Quote:


> Originally Posted by *msigtx760tf4*
> 
> check this out !! 5000 on memory is no problem for my Palit gamerock
> the core is weak
> 
> http://www.3dmark.com/3dm/14962848


Quote:


> Originally Posted by *msigtx760tf4*
> 
> no problem
> 
> http://www.3dmark.com/3dm/14964241
> 
> just break 16K in fs that's all on my rig
> 
> http://www.3dmark.com/3dm/14964806


Quote:


> Originally Posted by *msigtx760tf4*
> 
> did right now
> 
> http://www.3dmark.com/3dm/14981876












This is mine:

FS: http://www.3dmark.com/fs/10204853 | 20862 graphics score
FSExt: http://www.3dmark.com/fs/10204898 | 9886 graphics score
FSExt stress test: http://www.3dmark.com/3dm/14979560 | 98.7%
Time Spy demo: http://www.3dmark.com/spy/464659 | 6518 graphics score
Driver 372.70.

Can you share your OC settings? Do you use a custom curve or a basic offset increase?

I can't even pass 21K in FS (that's my current goal). I get crashes if I increase the core any higher (2114, 2126, 2138 MHz).
I wanted to buy a Palit at first, but my local store only had Galax at the time. Am I out of luck?

I use a custom curve in MSI AB:
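For anyone wondering what the curve method actually does: the usual trick is to pick one voltage point, raise it, and flatten everything to its right so the card holds that frequency instead of boosting further. A sketch with made-up points (real curves come out of the Afterburner Ctrl+F editor):

```python
def flatten_curve(points, target_mv, target_mhz):
    """Clamp a voltage/frequency curve: below target_mv the stock
    points stand; at and above it the card is pinned to target_mhz.

    points: [(millivolts, mhz), ...] in ascending voltage order.
    """
    return [(mv, target_mhz if mv >= target_mv else mhz)
            for mv, mhz in points]

# Illustrative stock-ish curve, pinned to 2050 MHz at 1000 mV:
curve = [(800, 1800), (900, 1911), (1000, 1987), (1050, 2025), (1093, 2050)]
flat = flatten_curve(curve, 1000, 2050)
```

The point of flattening is that the card then runs one known frequency/voltage pair under load, instead of bouncing around the boost table.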


----------



## jasjeet

Quote:


> Originally Posted by *tps3443*
> 
> They all do roughly the same. The GTX 1070 FE is particularly good, like any other model. I have a GTX1080 FE, and it runs with best, and better than most actually. My card overclocks very well.
> 
> The biggest advantage for a GTX1070 partner card, is better cooling.
> 
> My last video card was a GTX1070 SC ACX 3.0. It was fantastic video card. It was quiet, and cool.
> 
> Either way, you can heavily Overclock a Founders Edition and keep it under 80C, and fairly quiet.
> 
> I run my GTX1080 FE fan around 58% MAX once it reaches 70C. It never touches 80C. It's very quiet, and I have a silent case fan blowing on it. I run my card at +235 Core, 11,200 memory, 120% power limit, and +74% voltage.
> 
> The GTX1070 FE, will run a few degrees cooler also.
> 
> You cannot go wrong with either model.
> 
> Good luck!


I usually get MSI, but I'm thinking about the Palit GTX 1070 Dual.
I can't fit the larger 2.5-slot cards in the Node 304.

Anything to be wary of?


----------



## deegzor

Quote:


> Originally Posted by *zipper17*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> this is mine
> 
> FS: http://www.3dmark.com/fs/10204853 | 20862 Graphic Scores
> FSExt: http://www.3dmark.com/fs/10204898 | 9886 Graphic Scores
> FSExt Stress test: http://www.3dmark.com/3dm/14979560 | 98.7 %
> Timespy Demo http://www.3dmark.com/spy/464659 | 6518 Graphic Scores
> Driver 372.70.
> 
> Can you share your OC setting maybe?, do you use Custom curve or basic increase offset?
> 
> Can't even pass 21K in FS. (My attempt right now!
> 
> 
> 
> 
> 
> 
> 
> )
> I have crashes if I Increase anything higher (2114, 2126, 2138mhz)
> i wanted to buy Palit at the first time, but my local store only has galax at that time. Im f**ked?
> 
> I use Custom Curve MSI AB:


Here's mine; at least it used to be the top score with a 1070. --> https://www.dropbox.com/s/bytbr2k3pzc7kbm/gtx1070_curve.JPG?dl=0


----------



## zipper17

Quote:


> Originally Posted by *deegzor*
> 
> Here's mine atleast it used to be top score with 1070. --> https://www.dropbox.com/s/bytbr2k3pzc7kbm/gtx1070_curve.JPG?dl=0


Is it fully stable? That's insane. If it's fully stable in all apps, you won the chip lottery, lol.

Any FS graphics scores, etc.?

What's your maximum power consumption shown in GPU-Z?

This must be watercooled / a custom loop? What brand do you have?

I also read, a few hundred pages back, that some guy did 2.2 GHz.


----------



## deegzor

Quote:


> Originally Posted by *zipper17*
> 
> is it fully stable? that's insane. if it fully stable on all apps. You won all of the lottery chip lol.
> 
> Any FS Graphic Scores etc?
> 
> What is your maximum power consumption shown in gpuz?
> 
> This must be Watercooled/Custom Loops?. what brand do you have?
> 
> I'm also read on hundred page back, some guy did 2.2GHZ.


Hey,

These clocks have been stable in most games (GTA V, Project CARS, ROTTR, BF4, Just Cause 3, Evolve Stage 2, CS:GO). For Overwatch, though, I need to lower the core to 2100 and mem to +600 or black artifacts randomly appear.

Here's my FS score -> http://www.3dmark.com/fs/9872907

Max GPU-Z power consumption is around 80%.

The card is watercooled with this https://www.arctic.ac/eu_en/accelero-hybrid-iii-140.html plus added copper heatsinks on the VRAM.

For me, 2190 is the max stable for most gaming and benchmarking; 2.2 GHz will cause a crash in FS, for example.


----------



## GreedyMuffin

Coil whine is so much lower when the card is running at 900 mV instead of 1050 mV.

Try undervolting for fun! :-D


----------



## tps3443

Quote:


> Originally Posted by *jasjeet*
> 
> I usually get the MSI, but im thinking Palit GTX 1070 Dual.
> Can't fit the larger 2.5 slot cards in the node 304.
> 
> Anything to be weary of?


I run my GTX 1080 in a mini-ITX case too! It works out great.

Get whatever GTX 1070 is in your price range; they are all pretty good cards. I like the build quality and factory look of a Founders Edition myself, but its cooling capability is not as good as other GTX 1070s'.

Palit and Zotac make wonderful cards. I would take either of those over a Gigabyte model.

I believe Palit is a leader in trying to make some of the best overclocking cards available. Unfortunately, with the 10-series cards the limiting factor is the power limiter, not the single 8-pin: we have to stay within the 112% and 120% limits, so we hit a wall with most of them. If they could clock to 2.3 GHz or more, they would make the Titan X Pascal look silly in comparison at its "respectable" true-enthusiast-level $1200.

Get what you can afford and what is available. EVGA, Asus, Zotac, MSI, or Palit is what I would stick with. There are some actual Nvidia GTX 1070 cards available if you order direct from Geforce.com, too.


----------



## jasjeet

Quote:


> Originally Posted by *tps3443*
> 
> I run my gtx1080 in a mini ITX case too! It runs out great.
> 
> Whatever gtx1070 is in your price range. They are all pretty good cards. I like the build quality and factory look of a Founders Edition my self. But there cooling capabilities are not as good as other GTX1070's.
> 
> Palit, and Zotac, make wonderful cards. I would take either one of those over a Gigabyte model.
> 
> I believe Palit is a leader in trying to make some of the best overclocking cards available. Unfortunately with the 10 series cards and there power limiter, not the single 8pin, that is not the issue at all. But the fact that we have to stay within the 112%and120% limiters. We are at a wall with most of them. If they could clock to 2.3Ghz or more it would make the titan x pascal look silly in comparison at a respectable true enthusiast level $1200
> 
> Get what you can afford, and get what is available. Evga, Asus, Zotac, MSI, palit is what I would stick with. There are some actual Nvidia GTX1070 cards available, if you order direct from Geforce.com also.


Thanks for the reply. Right now the Zotac 1070 AMP Edition seems the best value at £400.
The EVGA ACX 3.0 is £400 also, but not as good, I believe.
Not sure about the MSI Armor version.
The MSI Gaming X and EVGA FTW are around £440.


----------



## bigjdubb

Quote:


> Originally Posted by *monza1412*
> 
> you don't get it do you? The disclaimer in every review should be that cards reviewed are cherry picked and the retail cards, depending on batch, are carring different components. Is not consumer ignorance, it's plain disinformation.


I do get it. The fact that you feel the disclaimer should be "cards reviewed are cherry picked and the retail cards, depending on batch, are carring different components" is a sign that you don't. A reviewer posting that disclaimer would be taking responsibility for consumer ignorance, because any consumer who doesn't know that the components in the electronic products (or any products) we buy change throughout a product's life cycle is ignorant.
Quote:


> Originally Posted by *monza1412*
> 
> sigh.
> 
> There is no worse blind man than the one who doesn't want to see. There is no worse deaf man than the one who doesn't want to hear. And there is no worse madman than the one who doesn't want to understand.


Stand in front of a mirror and say that 100 times. Maybe it will sink in.


----------



## Dude970

Quote:


> Originally Posted by *zipper17*
> 
> Do you guys use Offset or custom curve? is there any specified trick behind the Custom curve method or anything? maybe i missed some tricks behind pascal overclock.
> 
> I still can't get pass 21K on Firestrike and 10K on FS Extreme, anything higher got crashes lol. its my goal if I'm lucky. Some people must have a better chip omg.


I use offset. For 22K graphics I use
Quote:


> Originally Posted by *Hunched*
> 
> I feel like it's not even worth overclocking VRAM at this point, memory overclocking whether it be RAM or VRAM is always nightmarish with timings and more variables than simplistic core overclocking.
> Not to mention you can't even control timings for GPU memory overclocking...
> 
> I don't know if somehow my PC configuration could have caused this, or if it's just an odd coincidence that both 1070's I've had with Samsung memory landed at around +300 maximum stable.
> I can run benchmarks like 3DMark FireStrike and TimeSpy 20 times in a row at +600 to +700 without issue like everyone else, but play actual games for hundreds of hours like BF4 or ROTTR and you will have issues.
> 
> So right there, ignore 100% of people telling you their 3DMark stable memory overclocks, that information could not be more worthless.
> Also another thing tons of people probably overlook, just because your games aren't crashing and there aren't constant in your face artifacts, doesn't mean your memory is stable.
> All these people saying things like "+800 is stable but I get no more score in 3DMark than +400" *then you're not stable above +400*
> 
> I could have an unstable memory overclock and it could take 20+ hours of BF4 gameplay for a visual artifact to appear, in less demanding games it may never happen.
> Also, there's Micron memory that overclocks better than Samsung memory.
> 
> All I see here is a lot of guessing, stability overestimating, lack of testing, lack of proof and information.
> Nobody seems to have a clue what they're talking about, I wouldn't even consider myself close to an expert yet I'm learning nothing from anything anyone is saying right now.
> 
> I'm almost certain if I could test everyone's cards for 10's of hours I'd find nearly everyone's stable Samsung overclock is actually far less than they believe it to be, and that Micron really isn't that far behind.
> My Samsung 1070 is stable at +300 or less, but again, if I did what everyone else did I'd be saying it's stable at +700, because it is in 3DMark, even after 20+ runs while most people here probably do like 5.
> 
> There may or may not be a problem, but nobody here seems capable of figuring anything out or testing anything properly, so who knows for sure.


@Hunched Hmm, so if I post a benchmark, you think I'm saying it's game stable? Nope, never said that. I am able to get a 22K graphics score in Firestrike, and I can even hit 2.2 in other benchmarks. I have never said it was game stable.

Users post benchmarks to show off what the cards can do. If yours can't, there's no need to question everyone else.


----------



## KawasakiDragonn

I'm trying to overclock my EVGA 1070 SC; I've never overclocked a GPU before. Any suggestions?
Planning on using Precision X OC.


----------



## tps3443

Quote:


> Originally Posted by *jasjeet*
> 
> Thanks for the reply, right now the zotac 1070 amp edition seems the best value at 400£.
> EVGA ACX3.0 is £400 also but not as good I beleive.
> Not sure about the MSI Armour version.
> MSI X and EVGA FTW around 440£.


Before I bought a GTX 1080, I owned an EVGA GTX 1070 SC ACX 3.0. The quality was ridiculously good! It is basically an FE model with better cooling and higher clocks.

The Zotac AMP is really nice too. I believe it is a wider, non-reference board though, so confirm it will fit in your case.

As for the Armor, it's just a cost-cut budget version of the Gaming X, lol.

The FTW, AMP Extreme, and Gaming X all use the 4.99"-wide non-reference board.

If the AMP fits, I would go with that. If not, the quality of the EVGA ACX is just gorgeous once you get it in your hands! Pictures do not do it justice. I thought they looked ugly in pictures, but I ended up buying one on Craigslist, and they do look great in person. It feels like a piece of machinery or something, lol. The Zotac does too, though.


----------



## tps3443

Quote:


> Originally Posted by *KawasakiDragonn*
> 
> I'm trying to overclock my EVGA 1070 SC, never overclock gpu before, any suggestions?
> planning on using Precision X OC


Use the settings below. Since you're new, they will provide an aggressive performance boost, but nothing so extreme that it will cause instability problems.

Increase the power limiter to 112%, then set the GPU clock to +100 MHz and the memory to +500 MHz. Set your temp limit to 91C or 92C.

Then change the fan profile to aggressive in the fan settings and click OK!

I've owned one of those exact cards, and those settings are pretty much guaranteed stable on any GTX 1070, and perfect for everyday gaming. If you want more, continue to add +10 to +15 to your core and memory and check for stability in games.

You will hit a wall eventually, where increasing these sliders does nothing.
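That "add +10 to +15 until it stops working" procedure is just an incremental search. As a loop (with the stability check standing in for real testing, i.e. hours of games, not a single benchmark pass):

```python
def find_max_stable(start, step, is_stable, limit=2000):
    """Walk the offset up by `step` until the stability test fails,
    then keep the last offset that passed. `start` must already be
    known-good; `is_stable` is whatever testing you trust."""
    offset = start
    while offset + step <= limit and is_stable(offset + step):
        offset += step
    return offset

# Toy stand-in: pretend the card artifacts above a +230 core offset
max_core = find_max_stable(100, 15, lambda off: off <= 230)  # -> 220
```

The `limit` guard matters in practice too: past the power/voltage wall the sliders stop doing anything, so there is no point searching forever.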


----------



## Dude970

Quote:


> Originally Posted by *zipper17*
> 
> Do you guys use Offset or custom curve? is there any specified trick behind the Custom curve method or anything? maybe i missed some tricks behind pascal overclock.
> 
> I still can't get pass 21K on Firestrike and 10K on FS Extreme, anything higher got crashes lol. its my goal if I'm lucky. Some people must have a better chip omg.


I use offset, tried curve with no luck


----------



## KawasakiDragonn

Quote:


> Originally Posted by *tps3443*
> 
> Use these settings below, since your new it will provide a aggressive performance boost. But nothing to extreme, that will cause instability problems.
> 
> increase the power limiter to 112%, then set the GPU clock to +100Mhz, and then +500Mhz on your memory. Set your temp limit to 91C or 92C,
> 
> And change the fan profile to aggressive in fan settings, and click OK!
> 
> I've owned one of the exact cards, and those are pretty much guaranteed stable on any gtx1070. And perfect for everyday gaming. If you want more, continue to add +10 to +15 on your core, and memory and check for stability in games.
> 
> There is a point where you will hit a wall eventually, and increasing these sliders does nothing.


Thanks! I was using the Heaven benchmark to test stability.
(EDIT: I can run +350 on memory without problems; +500 will crash.)

Before the overclock my GPU stayed around 1883 MHz.

It's stable around 2025 MHz after overclocking; it sometimes jumps to 2038, but only for a few seconds. This setting is able to pass the whole Heaven benchmark.
I tried +110 on the core. It stays around 2076 MHz but starts glitching the screen. I slid the voltage slider to 50% and the core dropped back to 2025 MHz.
My power limit flag constantly shows 1; I think I'm hitting that wall now.

----------



## gtbtk

Guys,

My initial tests seem to indicate that the Micron memory overclock artifacting problem has been solved with the 372.90 driver pack that has just been released.


----------



## tps3443

Quote:


> Originally Posted by *KawasakiDragonn*
> 
> Thanks! I was using Heaven benchmark to test stability.
> (EDIT: I can run +350 on memory without problem, +500 will crash.)
> 
> Before overclock my GPU stay around 1883Mhz
> 
> Stable around 2025Mhz after overclock, sometimes jump to 2038 but only for few seconds. This setting's able to pass whole Heaven's benchmark.
> I tried to do +110 on core, It stays around 2076Mhz but start making glitch screen. I slide the voltage slider to 50% and the core dropped back to 2025Mhz.
> My Power limits constantly appears on 1, I think I'm hitting that wall now.


I'm glad it's working out for you. I figured every GTX 1070 could run 9,000 MHz memory, though. Try to squeeze out every last drop of power. If these cards could run a 150% power limit, they would be crushers for 400 bucks.

Bear in mind you may make it through a benchmark, and the same overclock will crash in a game you don't even play lol.
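
The "add +10 to +15 and re-test" advice above amounts to a simple step search. A hypothetical sketch, where `test_stable` stands in for an actual stress run (Heaven, a game session) and the step and limit values are illustrative, not card-specific:

```python
# Hypothetical sketch of the "+10 to +15 at a time" approach.
# test_stable(offset) would really launch a stress test and report
# whether the run finished without artifacts or crashes.

def find_wall(test_stable, start=0, step=13, limit=300):
    """Raise the offset step by step until the stability test fails,
    then return the last offset that passed (None if even the start fails)."""
    best = None
    offset = start
    while offset <= limit:
        if not test_stable(offset):
            break
        best = offset
        offset += step
    return best

# Toy stand-in: pretend this card artifacts above +91.
print(find_wall(lambda off: off <= 91))  # 91
```

As the post says, passing a benchmark at the returned offset is no guarantee a different game won't still crash, so treat the result as a starting point, not a final answer.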


----------



## BroPhilip

Quote:


> Originally Posted by *gtbtk*
> 
> Guy,
> 
> My initial tests seem to indicate that the Micron memory Overclock artifacting problem seems to have been solved with the 372.90 driver pack that has just been released.


MORE INFO PLEASE.....WILL TEST OUT TOMORROW

What speeds are you getting?


----------



## gtbtk

Quote:


> Originally Posted by *BroPhilip*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Guy,
> 
> My initial tests seem to indicate that the Micron memory Overclock artifacting problem seems to have been solved with the 372.90 driver pack that has just been released.
> 
> 
> 
> MORE INFO PLEASE.....WILL TEST OUT TOMORROW
> 
> What speeds are you getting?
Click to expand...

I stand corrected. It is not fixed after another reboot.

I was running light loads at +600, but that would cause major stutters in Heaven. Heaven would run at +570 and be quite stable.


----------



## Mjhieu

Same here, 372.90 does not fix the issue yet. Only the Performance mode in the NVIDIA Control Panel works; the Optimal and Adaptive modes will crash/artifact and reboot the system.

My card is MSI GTX 1070 Gaming Z, it has Micron Memory.


----------



## zipzop

372.90, EVGA 1070SC w/ micron memory.

Just tried Heaven @ +400 mem and didn't make a pass without random lockups/delays, then a fatal black screen, audio lockup, and system shutdown.

@ +350 I got a driver crash, plus a few black screens/lockups before that, between scenes. +300 seems stable; that's where I usually keep it...

Seems to me it gets the lockups/delays during the transitions between scenes, where the voltage momentarily drops into the 0.900 V range, and I guess it doesn't scale properly when it goes back up to 1.050 V+, or can't handle the voltage dips.


----------



## zipper17

Quote:


> Originally Posted by *Dude970*
> 
> I use offset, tried curve with no luck


How much +offset from your initial factory settings? What is the highest core clock speed in your curve point?

I can reach a 21,109 graphics score in Firestrike with a memory offset of +715-750 MHz; the highest core clock speed in the curve point is 2100 MHz.
But I get a flashing-green error while benchmarking (memory unstable), so it's not a really impressive OC with this card (many crashes when OC'd).
I can see other 980 Ti/1070 cards reach higher. Looks like I've already found my card's limit.

I'm thinking about 1070 SLI or a 1080 right now...
Do you guys think 1070 SLI is worth it? I play at 2560x1440 60 Hz. I see reviews saying not all games are supported, and that there will be stuttering and worse minimum framerates, plus more heat and power draw.

Also, another little method to gain a few hundred points on the Firestrike CPU/GPU scores:
-set your GPU to prefer maximum performance
-go to power settings and change your plan to High Performance
-close any monitoring software while benching (HWMonitor, MSI Afterburner, GPU-Z, etc.)
-lower ambient temperature might also help a little; and go for 100% fan speed to maximize your cooling potential

In my experience I gained a few hundred points on the GPU score, from around 20,800 into the 20,900s, and a few hundred points on the CPU score as well.


----------



## tps3443

Quote:


> Originally Posted by *zipper17*
> 
> How many +offset from your initial factory settings? what is your highest coreclock speed in the curve point?
> 
> I can reach 21.109 Graphic Scores in Firestrike, with memory offset +715-750mhz, highest coreclock speed in the curve point are 2100mhz.
> but with Flashing Green error while benchmarking, not really that impressive OC with this card.
> I can see other 980Ti/1070 can reach higher. Looks like I already found my limit on my card.
> 
> I'm thinking about 1070 SLI or 1080 right now...
> Do you guys think 1070SLI worth? I play on 2560x1440P 60hz. I see other reviews say not all games supported and there will be a stuttering and worse minimum framerates, plus more heat and draw power.
> 
> Also another little method how to get a few hundred points on Firestrike CPU/GPU scores:
> -make your GPU running on prefer max performances
> -go to power settings, change your plan to High Performances.
> -close any monitoring software while benching, Hwmonitor, msi afterburner,gpuz, etc.
> -temperature ambient might be also help a little bit.
> 
> In my experience i got a few hundred points on GPU scores from around 20.800 into 20.900ish, and a few hundred gain points on CPU scores also.


I sold my GTX 1070 SC ACX and got a GTX 1080 FE. At 1440p, maybe not so much; the GTX 1070 is plenty. But in 4K I can get an average of 60+ fps in everything.

Unless you want 100+ fps at 1440p on a 100 Hz monitor; then a GTX 1080 would be perfect.

I would rather save the money and go with a single used GTX 1080 for around $500+ on Craigslist.

The highest Firestrike graphics score I've gotten is 26,100 with some serious overclocking. It's game stable too.

Memory is at 11,440 with the core running around 2114 MHz.

That is near stock Titan X Pascal performance of roughly 27,000 graphics, so it's close.

The biggest advantage is that the GTX 1080 has a lot of awesome BIOSes available that offer up to 1.250 V and other things to squeeze out even more. I've tried some and can hit 2,230 MHz, but I cannot cool it with the stock FE cooler.

So it's definitely a bit faster. 4K is the biggest difference; it goes from barely playable to smooth. It will add about 10-12 fps, and that's huge for 4K. That's like going from 38 to 50, and add in overclocking and it gets even better.


----------



## zipper17

Quote:


> Originally Posted by *tps3443*
> 
> I sold my GTX1070 SC ACX, and I got a GTX1080 FE. In 1440P maybe not so much, the GTX 1070 is plenty. But in 4K I can get a average of 60+ fps in everything.
> 
> Unless you want 100+ fps with 1440P with a 100HZ monitor, then a GTX1080 would be perfect.
> 
> I would rather save money, and go with a single gtx1080 used for around $500+ on Craigslist.
> 
> The highest firestrike I've gotten is 26,100 with some serious overclocking. It's game stable too.
> 
> Memory is at 11,440 with a core running around 2114
> 
> That is near stock Titan X Pascal performance of roughly 27,000 graphics. So it's close.
> 
> The biggest advantage, is the GTX1080 has alot of awesome BIOS available that offer up to 1.250 volts, and other things to squeeze even more. I've tried some and can get 2,230 but I cannot cool it with the stock FE cooler.
> 
> So it's definitely a bit faster. 4K is the biggest difference, it goes from barely playable, to smooth. It will add about 10-12 fps. And that's huge for 4K. That's like going from 38 to 50 and add in overclocking, and it gets even better.


Thanks for the input. Yeah, I kind of regret not buying a 1080; I wanted a 1080 at first after Pascal launched, but went for a 1070 instead, since in my country prices are crazy high.


----------



## Mjhieu

Quote:


> Originally Posted by *tps3443*
> 
> I sold my GTX1070 SC ACX, and I got a GTX1080 FE. In 1440P maybe not so much, the GTX 1070 is plenty. But in 4K I can get a average of 60+ fps in everything.
> 
> Unless you want 100+ fps with 1440P with a 100HZ monitor, then a GTX1080 would be perfect.
> 
> I would rather save money, and go with a single gtx1080 used for around $500+ on Craigslist.
> 
> The highest firestrike I've gotten is 26,100 with some serious overclocking. It's game stable too.
> 
> Memory is at 11,440 with a core running around 2114
> 
> That is near stock Titan X Pascal performance of roughly 27,000 graphics. So it's close.
> 
> The biggest advantage, is the GTX1080 has alot of awesome BIOS available that offer up to 1.250 volts, and other things to squeeze even more. I've tried some and can get 2,230 but I cannot cool it with the stock FE cooler.
> 
> So it's definitely a bit faster. 4K is the biggest difference, it goes from barely playable, to smooth. It will add about 10-12 fps. And that's huge for 4K. That's like going from 38 to 50 and add in overclocking, and it gets even better.


I really want a GTX 1080, but with tax it's nearly 900 USD, and my GTX 1070 Gaming Z already cost me 635 USD total. Hella expensive. I have no choice; I can't spend more.


----------



## tps3443

That is terrible, you guys shouldn't have to pay so much for hardware.

I paid $400 for my GTX 1070, then sold it for $340 used. And I purchased my brand new GTX 1080 for $280 lol.

I know it sounds crazy, but it's one of those once-in-a-lifetime deals I came across. A young kid had it; college paid for it. He didn't know what it was. He dropped out of a graphic design program and sold it to me; I got it off Craigslist.

It was about a 3-hour drive there, then 3 hours back. Lol, and it was about 2 or 3 in the morning. But I did not care. The deal was too good to pass up!


----------



## zipper17

I think I'm gonna look for a single-card configuration, a single 1080, for my next upgrade, or wait another 3-5 years for the next-gen xx80. Bare necessity with a single 1070.

Yeah, SLI has problems of its own. On my rig I'm not sure a 3570K would make the most of SLI 1070s, and I don't think I'm gonna upgrade the processor; too lazy, because I'd need to upgrade everything with it, including the mobo/RAM etc. I will wait for a bigger next-gen i7. The HB SLI bridge is also another story; more $$ to spend.

Also, my mobo will likely only support x8/x8 PCIe 3.0 for SLI. I'd need some Core i7 E edition to get x16/x16 PCIe 3.0. The gain is pretty big with x8 vs x16 lanes, especially in Witcher 3.


----------



## reflex75

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BroPhilip*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Guy,
> 
> My initial tests seem to indicate that the Micron memory Overclock artifacting problem seems to have been solved with the 372.90 driver pack that has just been released.
> 
> 
> 
> MORE INFO PLEASE.....WILL TEST OUT TOMORROW
> 
> What speeds are you getting?
> 
> Click to expand...
> 
> I stand corrected. it is not fixed after another reboot.
> 
> I was running light loads at +600 but that would cause major stutters in heaven. Heaven would run at +570 and be quite stable
Click to expand...

Could you specify your findings please?
(can't test it my self right now)
Do you see any improvement?


----------



## gtbtk

Quote:


> Originally Posted by *reflex75*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BroPhilip*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Guy,
> 
> My initial tests seem to indicate that the Micron memory Overclock artifacting problem seems to have been solved with the 372.90 driver pack that has just been released.
> 
> 
> 
> MORE INFO PLEASE.....WILL TEST OUT TOMORROW
> 
> What speeds are you getting?
> 
> Click to expand...
> 
> I stand corrected. it is not fixed after another reboot.
> 
> I was running light loads at +600 but that would cause major stutters in heaven. Heaven would run at +570 and be quite stable
> 
> Click to expand...
> 
> Could you specify your findings please?
> 
> (can't test it my self right now)
> 
> Do you see any improvement?
Click to expand...

The moving goalposts that are Pascal: the high memory overclock appeared to work when it was initiated while the card was sitting at 0.725 V, the first time I tried. After a reboot, the card went back to the same problem as before.

It didn't fix anything, except possibly making the problem a little more intermittent than before. I did notice that if you idle my card at the stock core curve, the voltage will sit at 0.825 V. If you increase the slider to overclock the card in the traditional manner, the idle voltage drops to 0.625 V or lower, and that is when you are most likely to get the checkerboard artifacts with high memory overclocks.

The thing that makes this frustrating is that these cards will work one time when they are close to the limits, and then if you repeat the same thing 10 minutes later they will fail; there is not much consistency. There must be something, or a number of things, that changes, but I have not been able to identify exactly what. I am guessing it is a combination of all the undocumented interactions happening within the card that the OC utilities don't report on.


----------



## gtbtk

Quote:


> Originally Posted by *zipzop*
> 
> 372.90, EVGA 1070SC w/ micron memory.
> 
> Just tried Heaven @ +400 mem and didn't make a pass without random lock ups / delays Then a fatal black screen and audio lockup and system shutdown
> 
> @ +350 got a driver crash, but a few black screens / lockups before, between scenes. +300 seems stable that's where I usually keep it...
> 
> Seems to me it has the lock ups / delays during the transitions between scenes where the voltage drops into the .900mv's momentarily and I guess it doesn't scale properly when it goes back up to 1.050mv+ or can't handle the voltage dips


Does heaven crash with the core clock at stock settings and the memory overclocked?

Have you adjusted the voltage and power targets from the default?


----------



## MyNewRig

Nothing has been fixed with driver 372.90; the same checkerboard artifacts, system lock-up, BSOD memory dump, and system crash happen at as low as +300 memory on the absolute garbage Micron GDDR5 chips.

And as mentioned a few posts up, these memory ICs are not consistent: one time +300 is stable, and other times you get artifacts even at a +200 offset. What a garbage memory chip.

Does anyone know if these Micron memory ICs are Chinese or something? Why are they so f**ked up compared to their Samsung GDDR5 counterparts? Are they manufactured on a different node than what Samsung uses, or what foundry is producing them? Is there any memory expert here with an explanation?


----------



## Mjhieu

People, come over to https://forums.geforce.com/default/board/33/ and make NVIDIA answer; they keep silent and ignore this ISSUE.


----------



## syl1979

Maybe there are different sources for the Micron chips. "True" Micron vs. Elpida-Micron?


----------



## deegzor

Hey guys!

Any news on the pascal bios tweaker? Is anybody even working on it anymore?


----------



## X6SweexLV

Now I am 100% in the GTX 1070 owners club


----------



## TheGlow

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *reflex75*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BroPhilip*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Guy,
> 
> My initial tests seem to indicate that the Micron memory Overclock artifacting problem seems to have been solved with the 372.90 driver pack that has just been released.
> 
> 
> 
> MORE INFO PLEASE.....WILL TEST OUT TOMORROW
> 
> What speeds are you getting?
> 
> Click to expand...
> 
> I stand corrected. it is not fixed after another reboot.
> 
> I was running light loads at +600 but that would cause major stutters in heaven. Heaven would run at +570 and be quite stable
> 
> Click to expand...
> 
> Could you specify your findings please?
> 
> (can't test it my self right now)
> 
> Do you see any improvement?
> 
> Click to expand...
> 
> The moving goalposts that are pascal appeared worked with initiating the high memory overclock when the card was sitting at .725v the first time I tried. After a reboot, the card went back to the same problem as before.
> 
> It didn't fix anything except possibly making it a little more intermittent than before. I did notice that if you idle my card at the stock core curve, the Voltage will sit at .825. If you increase the slider to overclock the card in the traditional manner, the idle voltage drops to .625v or lower and that is when you are most likely to get the checker board artifacts with the high value memory overclocks.
> 
> The thing that makes this frustrating is that these cards will work one time when they are close to the limits and then if you repeat the same thing 10 mins later, it will fail there is not much consistency. There must be something or a number of things that change but I have not been able to identify exactly what they are. I am guessing that it is a combination of the interaction of all the undocumented things that are happening within the card that are not reported on by the OC utilities.
Click to expand...

Have you tried installing the MSI Gaming App? I found that with the Gaming App service it installs running, the voltage never goes under, I think, 0.725 V, though I never checked the pattern with it.
Once I disabled the service, it'll dip to something around 0.625 V, and then it crashes if I launch certain things like MS Edge, or Origin back when I was trying the BF1 beta. I didn't play around to see what other apps would trigger it.


----------



## tps3443

Quote:


> Originally Posted by *zipper17*
> 
> I think im gonna look for single configuration, a single 1080 for next upgrade. Or wait for another 3-5years for Next gen XX80, Bare necessity with single 1070.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah SLI has some problem itself. On my RIG im not sure 3570K will optimize those SLI 1070, i dont think im gonna upgrade the processor too lazy. because i need to upgrade all of them included mobo/ram etc. i will wait for bigger NExt gen for processor I7. HB bridge SLi is also another story, need pay another $$ to spend.
> 
> and also likely my mobo will only support 8x-8x pcie 3.0 for SLI. I need some Core i7-E Edition to get 16x-16x pcie3.0. Gain is pretty huge with 8xvs16x lanes especially on Witcher 3.


Save money! Go on eBay, buy an i7-3970X Extreme and an X79 motherboard. Overclock it to 5 GHz, and you will have quad-channel memory and 12 threads for about 250 bucks. And 40 PCIe lanes.

After selling off the Z77 and 3570K, you will come out even better.

Another cheap option is a Z97 mini-ITX board with an i7 Broadwell 5775C. It's a 14nm chip, but socket 1150. They're very fast. And cheap! Faster than a 6700K clock for clock.


----------



## Bold Eagle

Quote:


> Originally Posted by *tps3443*
> 
> Save money! Hi on eBay, buy a 3970X i7 extreme, and a x79 motherboard. Overclock it to 5Ghz, and you will have quad channel, and 12 threads for about $250 bucks. And 40 PCI-E lanes.
> 
> After selling off the z77, and 3570K you will come out even better.
> 
> Another cheap option is a Z97 mini itx board, with a i7 broadwell 5775C it's a 14nm chip but, socket 1150. There very fast. And cheap! Faster than a 6700K clock for clock.


Gets a white box from eBay with a brick in it; ouch, burnt.

Gets a white box from eBay; CPU and mobo don't POST; ouch, burnt.

Money saved: not really sure.


----------



## owikhan

Quote:


> Originally Posted by *X6SweexLV*
> 
> Now i am 100% in the GTX 1070 owners club


wow coool congrats bro


----------



## BroPhilip

Welcome to the blessing / curse.... If nothing else, it's fun to see how far you can push these cards.


----------



## MyNewRig

Quote:


> Originally Posted by *X6SweexLV*
> 
> Now i am 100% in the GTX 1070 owners club


I will not congratulate you before you tell me: did you get the original 1070 with Samsung memory, or the Micron 1070? If it has Samsung, then congratulations, enjoy your awesome GPU; if it has Micron, then I would feel sorry for you


----------



## BroPhilip

Quote:


> Originally Posted by *MyNewRig*
> 
> I will not congratulate you before you tell me did you get the original 1070 with Samsung memory or the Micron 1070? if it has Samsung then congratulations, enjoy your awesome GPU, if it has Micron then i would feel sorry for you










Jerk......You just had to go there


----------



## X6SweexLV

Quote:


> Originally Posted by *owikhan*
> 
> wow coool congrats bro


Ty...

Quote:


> Originally Posted by *MyNewRig*
> 
> I will not congratulate you before you tell me did you get the original 1070 with Samsung memory or the Micron 1070? if it has Samsung then congratulations, enjoy your awesome GPU, if it has Micron then i would feel sorry for you


Ty, I have Micron VRAM, but I can OC mine to +580, which is 4579 MHz; if I go to 600 it starts flickering.

I use 3DMark Firestrike to test.
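
For anyone puzzled by "+580 = 4579 MHz": Afterburner-style tools apply the memory offset to the GDDR5 reference clock (about 3999 MHz on an 8 Gbps 1070), and the effective (marketing) data rate is double that reading. A quick sketch of the arithmetic; the 3999 MHz baseline is inferred from this post, not taken from a spec sheet:

```python
# How a "+580" memory offset maps to the reported and effective clocks
# on a GDDR5 GTX 1070. BASE_MHZ is an assumption inferred from the
# "+580 = 4579 MHz" reading above.

BASE_MHZ = 3999  # stock tool reading assumed for an 8 Gbps 1070

def reported_clock(offset_mhz):
    """Clock the OC tool displays after applying the offset."""
    return BASE_MHZ + offset_mhz

def effective_gbps(offset_mhz):
    """GDDR5 transfers on both edges, so the data rate is 2x the reading."""
    return 2 * reported_clock(offset_mhz) / 1000

print(reported_clock(580))            # 4579
print(round(effective_gbps(580), 3))  # 9.158
```

So a +580 offset on this card works out to roughly a 9.2 Gbps effective data rate, up from the stock 8 Gbps.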


----------



## Mjhieu

Quote:


> Originally Posted by *X6SweexLV*
> 
> Ty...
> Ty, i have Micron vram but i can oc my +580 wats is 4579 mhz if i go 600 its start flickering
> 
> 
> 
> 
> 
> 
> 
> 
> I use 3dmark firestrike to test


The best test now is Rise of the Tomb Raider; it can load your GPU to 99-100%, and VRAM usage reaches 7.6 GB to 8 GB. Firestrike, FurMark, and Time Spy work fine with my card, but when I benchmark Tomb Raider, the issue shows up right after 2-3 rounds. If you can pass 5 looped rounds of it, your card is good for sure. Just max out every setting in Tomb Raider, then run the benchmark 5 times.


----------



## X6SweexLV

Quote:


> Originally Posted by *Mjhieu*
> 
> The best test now is Rise of Tomb Raider, it can fill full ur gpu 99%-100% and your gpu reached 7.6gb to 8gb. Other firestrike, furmark, timespy work well with my card, but when benchmark Tomb, issue come right after 2-3 round. If you can pass 5 round loops of it, your card is good for sure. Just max out everything in Tomb setting then bench mark 5 times.


Ok, I'll try


----------



## zipper17

Quote:


> Originally Posted by *Mjhieu*
> 
> The best test now is Rise of Tomb Raider, it can fill full ur gpu 99%-100% and your gpu reached 7.6gb to 8gb. Other firestrike, furmark, timespy work well with my card, but when benchmark Tomb, issue come right after 2-3 round. If you can pass 5 round loops of it, your card is good for sure. Just max out everything in Tomb setting then bench mark 5 times.


Do you mean it works well when you only ran the 3DMark benchmarks to get scores? From my point of view, I can also run the benchmarks fine just to get scores, but when I run custom loops/stress tests, it crashes. And when I play Witcher 3 at 1440p, it crashes too; the same thing happens.

The new Firestrike/Time Spy Advanced editions come with a stress test, and with the Advanced edition you can run custom loops of every graphics test 1 & 2, the combined test, etc. I bought them on Steam during an 80%-off sale.

I'm definitely looking to grab ROTTR when there's a big sale on Steam.

Right now I'm also downloading FFXIV Heavensward.

My card scores 19,XXX out of the box and 20,9XX overclocked (Firestrike graphics score); anything higher produces a lot of crashes/flashing-green memory errors.

Only about a +1,000 point improvement; pretty sucky.


----------



## gtbtk

Quote:


> Originally Posted by *TheGlow*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *reflex75*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BroPhilip*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Guy,
> 
> My initial tests seem to indicate that the Micron memory Overclock artifacting problem seems to have been solved with the 372.90 driver pack that has just been released.
> 
> 
> 
> MORE INFO PLEASE.....WILL TEST OUT TOMORROW
> 
> What speeds are you getting?
> 
> Click to expand...
> 
> I stand corrected. it is not fixed after another reboot.
> 
> I was running light loads at +600 but that would cause major stutters in heaven. Heaven would run at +570 and be quite stable
> 
> Click to expand...
> 
> Could you specify your findings please?
> 
> (can't test it my self right now)
> 
> Do you see any improvement?
> 
> Click to expand...
> 
> The moving goalposts that are pascal appeared worked with initiating the high memory overclock when the card was sitting at .725v the first time I tried. After a reboot, the card went back to the same problem as before.
> 
> It didn't fix anything except possibly making it a little more intermittent than before. I did notice that if you idle my card at the stock core curve, the Voltage will sit at .825. If you increase the slider to overclock the card in the traditional manner, the idle voltage drops to .625v or lower and that is when you are most likely to get the checker board artifacts with the high value memory overclocks.
> 
> The thing that makes this frustrating is that these cards will work one time when they are close to the limits and then if you repeat the same thing 10 mins later, it will fail there is not much consistency. There must be something or a number of things that change but I have not been able to identify exactly what they are. I am guessing that it is a combination of the interaction of all the undocumented things that are happening within the card that are not reported on by the OC utilities.
> 
> Click to expand...
> 
> Have you tried installing the MSI Gaming App? I found with that Gaming App Service it installs running the voltage never goes under i thin, .725, but I never check pattern with it.
> Once I disabled the service itll dip to something around .625 and then I die if I launch certain things like MS Edge, or Origin when I was trying BF1 beta. I didnt play around to see what other apps would trigger it.
Click to expand...

The Gaming App is installed at the moment. I have tried both with and without it installed. The problem is still there regardless. The memory doesn't like the high clocks being initiated when the voltage is below about 0.8 V.


----------



## MyNewRig

Quote:


> Originally Posted by *Mjhieu*
> 
> The best test now is Rise of Tomb Raider, it can fill full ur gpu 99%-100% and your gpu reached 7.6gb to 8gb. Other firestrike, furmark, timespy work well with my card, but when benchmark Tomb, issue come right after 2-3 round. If you can pass 5 round loops of it, your card is good for sure. Just max out everything in Tomb setting then bench mark 5 times.


Totally agree. If you can survive Rise of the Tomb Raider benches and, most importantly, gameplay, then you know the point at which your memory OC is actually stable. Play 30 minutes or so of ROTTR at whatever settings you think are stable, then exit the game. If you do not see any artifacts during gameplay, and you can exit the game without the checkerboard and a system crash, then you know where you are actually stable. Since you can benchmark at +580, I predict your actual stable OC would be around +350 to +400...


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> Gaming app is installed at the moment. I have tried both with and without the gaming app installed. The problem is still there regardless. The memory doesn't like the high clocks being initiated when the voltage is below about .8


Hey, can you guys share your Firestrike/Time Spy graphics scores?

You probably still have overclocking potential left in your GPU core. If you can push the core far enough, you probably don't need to overclock the Micron memory...

Even at +500-600 MHz, I don't really think my Samsung memory pushes things that far; maybe just a tiny gain overall.


----------



## MyNewRig

Quote:


> Originally Posted by *zipper17*
> 
> Hey, Can you guys share what's your Firestrike/Timespy Graphic Scores?
> 
> You probably still has a overclock potential from your Core GPU. if you can push far enough your GPU, you probably doesn't need overclock the micron memory...
> 
> My Samsung memory even +500-600mhz i dont really think can push far enough. maybe just a tiny gain overall.


Actually, I believe memory OC is what makes the difference on Pascal. Core OC is not that important, since almost all Pascal cards can run at or near 2000 MHz, so 50 or 80 extra MHz makes almost no difference at all. But going from 8000 MHz to 9000 MHz on memory results in a Firestrike score increase of more than 1,000 points and much smoother frames at 2K and 4K. This is why all the early reviews used to say that Pascal really shines in memory OC (that was the original Pascal with Samsung memory); not anymore.


----------



## zipper17

Quote:


> Originally Posted by *MyNewRig*
> 
> Actually i believe that memory OC in Pascal is what makes a difference, core OC is not that much important since almost all Pascal can run at or near 2000Mhz so that 50 or 80 extra Mhz almost does not make any difference at all, but going on memory from 8000Mhz to 9000Mhz results in more than 1000 Firestrike score increase and much smoother frames on 2K and 4K, this is why all early reviews used to say that Pascal really shines in memory OC (that was the original Pascal with Samsung memory) not anymore


I bought my 1070 without knowing anything was going on; if I had gotten Micron, I'd probably be mad now too.
This is my upgrade from a 560 Ti/750 Ti. I rarely even overclocked GPUs, but after I bought the 1070 and read forums where people confirmed it can be pushed further, I tried to see how far mine can go.

Btw, if your highest target core clock is 2000 MHz, most likely your core will only hold around 19XX MHz. It will throttle depending on the temperature/power/voltage targets, unless you have a strong cooling solution (custom loop/watercooling).

If it throttles, you need to overclock your card to maybe ~2050 MHz or so to hold a steady minimum of 2000 MHz.
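
The "overclock to ~2050 to hold 2000" rule of thumb follows from how Pascal boost behaves: clocks move in roughly 13 MHz bins, so a card that drops a few bins under load needs about that much extra offset above the floor you want. A toy calculation; the bin size and throttle depth here are assumptions for illustration, not measured values:

```python
# Rough headroom estimate for GPU Boost 3.0 throttling. Pascal steps
# clocks in ~13 MHz bins; if the card sheds N bins under sustained
# load, aim N bins above the clock floor you want to hold.

BIN_MHZ = 13  # approximate Pascal boost-bin size (assumption)

def offset_for_floor(observed_throttle_bins):
    """Extra MHz to set so the card still sits at the target after
    dropping the observed number of bins."""
    return observed_throttle_bins * BIN_MHZ

# If the card drops ~4 bins under load, target ~52 MHz above the floor:
print(2000 + offset_for_floor(4))  # 2052
```

That 2052 MHz target lines up with the "~2050 MHz" figure in the post above; a card with better cooling sheds fewer bins and needs less headroom.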


----------



## MyNewRig

Quote:


> Originally Posted by *zipper17*
> 
> I bought 1070 without knowing anything happen in general. if I got micron probaly get mad either now.
> This is my upgrade from 560Ti/750Ti. I even rarely do overclocking on GPU, but since i bought 1070, read and join sites/forums people confirmed it can overclock 1070 further, so I tried, how far my 1070 can go..
> 
> Btw, If your Highest target point core clock is 2000mhz, most likely your core only steady at around 19XX mhz. It will throttling depends on temperature/power/voltage target. Except if you has strong cooling solution (custom loops/watercooling).
> 
> If throttle, You need overclock your card to maybe ~2050mhz or so, to get a steady minimum 2000mhz.


Yes, mine boosts to 2088 MHz when not stressed and throttles down to 2012 MHz under heavy load, sometimes 2000 MHz. I had another one with Samsung memory that boosted to 2113 MHz and would run at 2088 MHz all the time under full load; I could do +700 on the memory while being very stable. That was a golden sample I've never seen again.

Currently my Micron 1070 struggles to do +250 on memory, and its Firestrike score is about 1,000 points less than the Samsung one!

So I have tried the best and worst 1070s. The 2113 MHz Samsung card was really a monster, and I miss it.


----------



## mrtbahgs

Quote:


> Originally Posted by *MyNewRig*
> 
> Actually i believe that memory OC in Pascal is what makes a difference, core OC is not that much important since almost all Pascal can run at or near 2000Mhz so that 50 or 80 extra Mhz almost does not make any difference at all, but going on memory from 8000Mhz to 9000Mhz results in more than 1000 Firestrike score increase and much smoother frames on 2K and 4K, this is why all early reviews used to say that Pascal really shines in memory OC (that was the original Pascal with Samsung memory) not anymore


This is what I was wondering/getting at earlier.
Is there a value comparison that says how much core clock it's worth giving up for a given memory OC?

For example (just random values): is it better to drop 25 MHz on the core if I can add another 100 MHz to memory? Those trade-off values are what I was trying to track down without doing hours of comparisons myself, because I'm not one to enjoy the long OC journey.
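
One rough way to frame that trade-off is relative clock gain: express each change as a percentage of its own baseline. FPS won't scale linearly with either number, so treat this strictly as a back-of-the-envelope guide; the baselines below are illustrative, and the 2x factor assumes the +100 offset applies to the GDDR5 reference clock:

```python
# Back-of-the-envelope comparison of a core-vs-memory trade-off.
# Baselines are illustrative round numbers, not measured values.

CORE_BASE = 2000.0  # MHz, assumed typical Pascal core clock
MEM_BASE = 8000.0   # MHz, stock effective GDDR5 data rate

def pct_gain(delta_mhz, base_mhz):
    """Clock change as a percentage of its baseline."""
    return 100.0 * delta_mhz / base_mhz

core_loss = pct_gain(25, CORE_BASE)      # giving up 25 MHz core
mem_gain = pct_gain(2 * 100, MEM_BASE)   # +100 offset ~= +200 effective
print(round(core_loss, 2), round(mem_gain, 2))  # 1.25 2.5
```

By this crude measure the memory bump is the larger relative change, which matches the posts above arguing that Pascal responds more to memory OC than to the last few core MHz; only benchmarking your own workload settles it.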


----------



## MyNewRig

Quote:


> Originally Posted by *mrtbahgs*
> 
> This is what I was wondering/getting at earlier.
> Is there a value comparison that says how much core clock is worth giving up for how much memory OC you can add?
> 
> For example (just random values) is it better to drop 25Mhz on the core if I can add another 100Mhz to memory... what values for each area is what I was trying to track down without having to do hours of comparisons myself because I am not one to enjoy the long OC journey.


I have not noticed memory OC affecting core OC in any way on Pascal with a power target of 120%, if your BIOS allows that. At a 112% power target it seems that past a certain memory frequency the GPU voltage starts throttling down, especially with the latest driver; I have not tested 112% much, to be honest, so I don't have a lot of experience there. But at a 120% power target (or 125% on some cards), I think you can reach your max OC on core and memory without them affecting one another.

For example, I test the max offset for memory and core independently. Once I find the limit of each, I can apply both offsets together at their max and they both remain stable. Whether memory OC causes the core frequency to throttle more is something I have not investigated, so I don't really know, but I think it all comes down to your power target: 120% or 125% provides enough headroom for both to reach their max potential.

Sorry I don't have a better answer.
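The "find each limit independently" routine described above is basically a search over offsets with a pass/fail stability check at each step. A sketch of that search, where `is_stable` is a stand-in you would replace with a real benchmark loop plus artifact/crash check (it is not a real GPU call):

```python
# Binary search for the highest offset that still passes a stability check.
# `step` rounds candidates to a coarse granularity (illustrative; Pascal
# core clocks move in roughly 13 MHz bins anyway).
def max_stable_offset(is_stable, lo=0, hi=800, step=13):
    """Largest offset in [lo, hi], rounded to `step`, that passes is_stable."""
    best = lo
    while lo <= hi:
        mid = (lo + hi) // 2 // step * step  # snap midpoint down to a step
        if is_stable(mid):
            best = mid          # stable: remember it, search higher
            lo = mid + step
        else:
            hi = mid - step     # unstable: search lower
    return best

# Stand-in check: pretend this card artifacts above +450 on memory.
print(max_stable_offset(lambda off: off <= 450))  # 442 (largest 13-step <= 450)
```

Run it once for core and once for memory, then (as the post says) verify the two max offsets together, since a tight power target can make combined limits lower than the independent ones.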


----------



## msigtx760tf4

My Time Spy result:

http://www.3dmark.com/3dm/15004620


----------



## zeeee4

Quote:


> Originally Posted by *msigtx760tf4*
> 
> my time spy resoult
> 
> http://www.3dmark.com/3dm/15004620


Lol, what? Your CPU is clocked at 5.1GHz? I highly doubt you run that 24/7; you just did it so the bench would beat everyone else's. I mean, my i5 6600K at 4.7GHz with an OC'd 1070 gets a 5900 score in Time Spy, sooooooo...


----------



## Dude970

Quote:


> Originally Posted by *zeeee4*
> 
> Lol what? Your CPU is clocked at 5.1ghz? ***? I highly doubt you run that 24/7 just did it for the bench to be better than everyone I mean my i5 6600k at 4.7ghz and oc 1070 gets 5900 score on time spy sooooooo ***


So OC your CPU more and improve your score.


----------



## criminal

Quote:


> Originally Posted by *zeeee4*
> 
> Lol what? Your CPU is clocked at 5.1ghz? ***? I highly doubt you run that 24/7 just did it for the bench to be better than everyone I mean my i5 6600k at 4.7ghz and oc 1070 gets 5900 score on time spy sooooooo ***


That's kinda the point don't you think? To get a higher BENCHMARK score than others? Most benchers do suicide runs in order to get better scores.

Some people make my head hurt.


----------



## Dude970

Quote:


> Originally Posted by *criminal*
> 
> That's kinda the point don't you think? To get a higher BENCHMARK score than others? Most benchers do suicide runs in order to get better scores.
> 
> Some people make my head hurt.


well said!!


----------



## tps3443

Quote:


> Originally Posted by *zipper17*
> 
> Do you mean your card can only run the 3DMark benchmark to get scores? From my point of view, mine can also run the benchmark fine just to get scores, but when it comes to custom loops/stress tests it crashes, and the same thing happens in Witcher 3 at 1440p.
> 
> The new Fire Strike/Time Spy Advanced Edition comes with a stress test, and with the Advanced Edition you can run custom loops of every graphics test 1 & 2, the combined test, etc. I bought them on Steam during an 80%-off sale.
> 
> I'm definitely looking out for ROTTR when there's a big sale on Steam.
> 
> Right now I'm also downloading FFXIV Heavensward.
> 
> My card out of the box scores 19,XXX; overclocked, 20,9XX (Fire Strike graphics scores). Anything higher produces a lot of crashes/memory flashing green.
> 
> Only about a +1,000-point improvement, which pretty much sucks.


This is a $400 video card, so that's a lot of power for the price range. If it kept on clocking up, the GTX 1080 would look silly; Nvidia wants to sell 1080s too, lol.

You've got to pay to play. You could spend three times as much ($400 × 3 = $1,200) and only gain about 6,100 more points in Fire Strike. You've got roughly 77% of the Pascal Titan's performance for about 30% of its cost, lol.

21,000 is just about the wall on a 1070.

You can modify it, though; there are a few guides out there. Just Google "shunt mod GTX 1070".
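The value argument above is easy to sanity-check with points-per-dollar math. A quick sketch using the ballpark Fire Strike graphics numbers quoted in the post (the Titan figure is the post's estimate, not a measured result):

```python
# Points-per-dollar comparison from the numbers quoted above.
cards = {
    "GTX 1070":     {"price": 400,  "score": 21000},
    "Titan (est.)": {"price": 1200, "score": 21000 + 6100},  # ~3x the price
}

for name, c in cards.items():
    print(f"{name}: {c['score'] / c['price']:.1f} points per dollar")

# Relative performance for relative cost:
perf_ratio = cards["GTX 1070"]["score"] / cards["Titan (est.)"]["score"]
cost_ratio = cards["GTX 1070"]["price"] / cards["Titan (est.)"]["price"]
print(round(perf_ratio, 2), round(cost_ratio, 2))  # 0.77 0.33
```

By this metric the 1070 delivers more than twice the score per dollar, which is the point the post is making.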


----------



## tps3443

Quote:


> Originally Posted by *criminal*
> 
> That's kinda the point don't you think? To get a higher BENCHMARK score than others? Most benchers do suicide runs in order to get better scores.
> 
> Some people make my head hurt.


Have you delidded your CPU yet? Massive temp drop, 15-20°C, lol. Makes suicide runs even better!


----------



## 2Lazy2Die

Actually, are stock results of 2265 in Heaven and a graphics score of 5945 in Time Spy normal for a GTX 1070 FTW? I mean, so many other people post benchmarks saying they are not overclocking and get scores of around 2500 in Heaven. This kind of bothers me, considering I got Micron memory.


----------



## bigjdubb

Quote:


> Originally Posted by *criminal*
> 
> That's kinda the point don't you think? To get a higher BENCHMARK score than others? Most benchers do suicide runs in order to get better scores.
> 
> Some people make my head hurt.


Quote:


> Originally Posted by *Dude970*
> 
> well said!!


+Rep for sanity in combination with common sense.

Benchmark stable means you can finish the benchmark. Who cares if you can run 3DMark for two hours when the test only takes a couple of minutes? Since the stress test is meaningless for game stability and the benchmark itself only takes a few minutes, there is zero point in running a 3DMark stress test.


----------



## Mr-Dark

GTX 1070 in SLI



Firestrike score at stock

http://www.3dmark.com/3dm/15006887?

OC + 100mhz on the core and +300 memory

http://www.3dmark.com/3dm/15007001?

I hope this helps someone.


----------



## deegzor

This is for all the naysayers.

At least on my card, core is king, not memory.

Here's the proof:

Higher core -> http://www.3dmark.com/fs/10246704

Higher mem -> http://www.3dmark.com/fs/10246771

Peace.


----------



## supermi

Quote:


> Originally Posted by *deegzor*
> 
> This is for all the naysayers.
> 
> At least on my card, core is king, not memory.
> 
> Here's the proof:
> 
> Higher core -> http://www.3dmark.com/fs/10246704
> 
> Higher mem -> http://www.3dmark.com/fs/10246771
> 
> Peace.


If you clock your memory too high you will hit error correction and lose performance. If your memory did not need to error-correct, I would guess you would see better performance scaling.
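This is why people see a "memory hole": GDDR5 error detection retries corrupted transfers, so past the stable limit effective bandwidth can fall even as the clock rises. A toy model of that behavior, where the stable limit and retry-rate curve are invented for illustration (not measured on any card):

```python
# Toy model: effective memory bandwidth vs. clock when error-detection
# retries start eating the gains past a stability limit.
def effective_bandwidth(clock_mhz, stable_limit=4400):
    """Bandwidth tracks clock until retries kick in past `stable_limit`."""
    if clock_mhz <= stable_limit:
        retry_fraction = 0.0
    else:
        # Assume the retry rate grows quickly once past the limit,
        # capped so the model never goes fully to zero.
        retry_fraction = min(0.9, (clock_mhz - stable_limit) / 200 * 0.3)
    return clock_mhz * (1 - retry_fraction)

for clk in (4000, 4200, 4400, 4500, 4600):
    print(clk, round(effective_bandwidth(clk)))
```

Under these assumptions the model peaks at the stable limit and then declines, which matches the common advice to back memory off until benchmark scores stop improving rather than chasing the highest offset that merely doesn't crash.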


----------



## owikhan

Quote:


> Originally Posted by *Mr-Dark*
> 
> GTX 1070 in SLI
> 
> 
> 
> Firestrike score at stock
> 
> http://www.3dmark.com/3dm/15006887?
> 
> OC + 100mhz on the core and +300 memory
> 
> http://www.3dmark.com/3dm/15007001?
> 
> I hope this help someone


It's a beauty, wow.


----------



## Mjhieu

Quote:


> Originally Posted by *Mr-Dark*
> 
> GTX 1070 in SLI
> 
> 
> 
> Firestrike score at stock
> 
> http://www.3dmark.com/3dm/15006887?
> 
> OC + 100mhz on the core and +300 memory
> 
> http://www.3dmark.com/3dm/15007001?
> 
> I hope this help someone


Omg, you're too rich, love you xD. Congratz! Here, if I go for a GTX 1070, I will have to spend about 1300 USD on it.


----------



## bigjdubb

Quote:


> Originally Posted by *Mjhieu*
> 
> Omg, you're too rich, love you xD. Congratz! *Here, if I go for a GTX 1070, I will have to spend about 1300 USD on it.*


That sucks pretty hardcore. My 1070 is barely worth its $430 retail price... I can't imagine paying any more than that for one.


----------



## zipper17

Quote:


> Originally Posted by *bigjdubb*
> 
> Benchmark stable means you can finish the benchmark. Who cares if you can run 3DMark for 2 hours, the test only takes a couple of minutes. Since the stress test is meaningless for game stability and the benchmark itself only takes a few minutes there is zero point in running a 3DMark stress test.


Why do you say that?

I can also finish the benchmark perfectly fine and get scores, but when it comes to custom loops/stress tests, it randomly crashes after some loops.

Just because you have a perfect chip doesn't mean everyone else does; you don't need to tell other people that stress testing is useless.

If you overclock your CPU and run Prime95, is that also meaningless?
Quote:


> Originally Posted by *bigjdubb*
> 
> Who cares if you can run 3DMark for 2 hours


We need to check whether our cards are stable after an OC for our own good, not to show off or anything.

I don't run it for hours; the program loops for about 10 minutes, which is enough to detect any crash or error. I also play some games afterward as a gaming stress test to make sure; a dedicated stress test and stress-test gaming go together.

Sorry, is this overclocking community against testing stability or something? I won't bother you about it anymore.


----------



## MyNewRig

I have a question for you guys: are ANY GTX 1060s using Micron memory, or is it all Samsung? The idea is this: the GTX 1060 also has GDDR5 rated at 8GHz, the same memory type used in the 1070. If there was indeed a shortage of Samsung GDDR5 that caused Nvidia to switch to Micron, then the GTX 1060 would have Micron too; but if all GTX 1060s still use Samsung GDDR5, then the 1070 is being gimped on purpose to encourage more GTX 1080 sales, since their performance is very close.

If it is being gimped on purpose with the inferior Micron GDDR5 chips, that would explain the secrecy around the issue and why Nvidia and its partners remain totally silent about it, because admitting the practice would be something of a scandal.

So instead of lowering the GTX 1080's price to make it sell better, they cripple the GTX 1070 so the gap in performance is wider.

I'd like your opinion, please, if you're familiar with the GTX 1060.


----------



## Mjhieu

Quote:


> Originally Posted by *bigjdubb*
> 
> That sucks pretty hardcore, my 1070 is barely worth it's $430 retail price..... I can't imagine paying any more than that for one.


I hate my country's taxes. A 1070 Gaming Z cost me 635 USD, and here a GTX 1080 costs from 880 USD up to 1K. I live in Vietnam, not a rich country, and I have to pay nearly double what you do in the USA.


----------



## MyNewRig

Okay, I just finished discussing this with one Nvidia partner (I will not mention their name for obvious reasons). Reading between the lines, the GTX 1070 has been downgraded to Micron memory on purpose to lower its performance relative to the GTX 1080, to widen the gap between the two cards and make the GTX 1080 sell more.

So Nvidia's strategy: if you want higher memory bandwidth for 4K, forget about buying a GTX 1070 and overclocking its memory; you have to buy a GTX 1080.

Nvidia has instructed the AIBs not to share any information about this with end users, so, as I expected, this information is classified.

The implication is that no fix will ever be provided, because it is meant to be this way; the latest driver update only targeted the stability of Micron memory at stock settings, and Nvidia will do nothing more about it.

The GTX 1060 still uses Samsung 8GHz GDDR5 modules in all recent production, which eliminates the possibility that Samsung has the GDDR5 supply issues they are claiming.


----------



## muzammil84

Quote:


> Originally Posted by *MyNewRig*
> 
> Okay, i just finished discussing this with one Nvidia Partner (will not mention their name for obvious reasons), reading between the lines, the GTX 1070 has been downgraded to Micron memory on purpose to lower its performance compared to the GTX 1080, to widen the gap between both cards and make the GTX 1080 sell more.
> 
> So Nvidia's strategy, if you want higher memory bandwidth for 4K, forget about buying the GTX 1070 and overclocking its memory, if you want more memory bandwidth you have to buy the GTX 1080.
> 
> Nvidia has instructed the AIBs to not share any information about this with end-users, so like i expected this information is classified.
> 
> The implications is that no fix will ever be provided because it is meant to be this way, and the latest driver update was only targeted at improving the stability of Micron memory at stock settings and nothing more, Nvidia will not do anything else about this.
> 
> GTX 1060 still uses Samsung GDDR5 8GHz modules in all the recent production so this eliminates the possibility of Samsung having any supply issues with GDDR5 like they are claiming.


So this information is classified, but you got it from an Nvidia partner. Is that right?


----------



## gtbtk

Quote:


> Originally Posted by *2Lazy2Die*
> 
> Actually are the stock results of 2265 in heaven and the graphics score of 5945 in timespy normal for a gtx 1070 FTW? I mean, there are so many benchmarks other people post, saying they are not overclocking and getting scores of like 2500 in heaven. This kinda bothers me considering that i got a micron memory.


The FTW does draw more power than the base cards. Have you tried a custom fan curve or a faster constant fan speed for better cooling? Your card may be reducing its clock speed because of heat.

There is nothing wrong with Micron memory as such, just an issue with the power delivery if you attempt a high overclock from low idle voltages. If the card is idling at 0.800V or above, it will clock above +500 no problem. I tend to get better performance at +400 than at +560, but I believe that is also consistent with a number of Samsung cards.

I am running an MSI Gaming X with Micron memory on an i7-2600 with a BCLK overclock to 4.4GHz.

Heaven is about 2366 in the default Gaming mode, 2376 in OC mode, and 2558 overclocked, all at 1080p in a well-ventilated case.

This is a recent Time Spy run with the same curve-based OC used for Heaven: http://www.3dmark.com/spy/483429


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Okay, i just finished discussing this with one Nvidia Partner (will not mention their name for obvious reasons), reading between the lines, the GTX 1070 has been downgraded to Micron memory on purpose to lower its performance compared to the GTX 1080, to widen the gap between both cards and make the GTX 1080 sell more.
> 
> So Nvidia's strategy, if you want higher memory bandwidth for 4K, forget about buying the GTX 1070 and overclocking its memory, if you want more memory bandwidth you have to buy the GTX 1080.
> 
> Nvidia has instructed the AIBs to not share any information about this with end-users, so like i expected this information is classified.
> 
> The implications is that no fix will ever be provided because it is meant to be this way, and the latest driver update was only targeted at improving the stability of Micron memory at stock settings and nothing more, Nvidia will not do anything else about this.
> 
> GTX 1060 still uses Samsung GDDR5 8GHz modules in all the recent production so this eliminates the possibility of Samsung having any supply issues with GDDR5 like they are claiming.


I feel a class action coming on again... the curse of the x70-model cards.


----------



## MyNewRig

Quote:


> Originally Posted by *muzammil84*
> 
> so this information is classified but you got it from Nvidia's partner, is that right?


Yes, sure thing! By asking the right questions in the right sequence you can uncover a lot from the responses, even when the other party is trying not to disclose much. Nothing new; this technique is used a lot in different contexts.

But some things were said explicitly, not just implied. They literally said:

_"Sorry we cannot tell you that, the memory could be any brand."

"We only guaranteed the function of the card. We cannot guarantee the memory brand."

"If that is the case, we are not informed nor I doubt that information would be given to end-users"

"Please know that your inquiry is quite an impossible one. If you have any suggestions on how we can proceed, I would gladly help you out"

"Please know that whatever the memory, these cards are tested according to NVIDIA standards"_

These are the responses I got while asking systematically about which memory type is used in newer production, how I could get a card with Samsung memory, or whether the Founders Edition is guaranteed to still use Samsung.

You don't have to take my word for it; judge for yourself.


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> I feel a class action coming on again.....the curse of the 70 model cards.


They seem to be covering their legal bases this time around by delivering a product that meets the bare minimum of the advertised specs and has no OC headroom whatsoever to show its quality, and that's if you are lucky enough to even get stable minimum guaranteed performance.

I don't know who they think they are fooling. When has it ever been the case that buyers of high-end GPUs make purchases based on the bare minimum guaranteed performance? I think the answer is never!

Why are these manufacturers developing OC tools and OC BIOSes, OC this and OC that, if they don't know that this is what sells their cards over others?

If they were true to their "we do not support OC" mantra, they would not be pushing all these OC-supporting tools in hardware and software to generate sales with the enthusiasts who buy these GPUs. The market is just full of **** these days.


----------



## madmeatballs

Anyone tried doing the shunt method with a Zotac AMP Extreme?


----------



## Forceman

Quote:


> Originally Posted by *MyNewRig*
> 
> Yes, sure thing!, asking the right questions in the right sequence you can uncover a lot from the responses, even if the other party is trying not to disclose much, nothing new, this technique is used a lot in different contexts.
> 
> But there were explicit things being said, not everything was implied; they literally said:
> 
> _"Sorry we cannot tell you that, the memory could be any brand."
> 
> "We only guaranteed the function of the card. We cannot guarantee the memory brand."
> 
> "If that is the case, we are not informed nor I doubt that information would be given to end-users"
> 
> "Please know that your inquiry is quite an impossible one. If you have any suggestions on how we can proceed, I would gladly help you out"
> 
> "Please know that whatever the memory, these cards are tested according to NVIDIA standards"_
> 
> These are the responses i was getting by asking systematically about what memory types is used in newer production, or how i can get a card with Samsung, or if the Founder's Edition is guaranteed to still use Samsung.
> 
> You do not have to take my word for it, judge for yourself


Yeah, you really uncovered the smoking gun there.


----------



## criminal

Quote:


> Originally Posted by *Forceman*
> 
> Yeah, you really uncovered the smoking gun there.


LOL... that guy.


----------



## MyNewRig

Quote:


> Originally Posted by *criminal*
> 
> LOL... that guy.


Oh, the Happy Samsung GTX 1070 owner again, missed you


----------



## criminal

Quote:


> Originally Posted by *MyNewRig*
> 
> Oh, the Happy Samsung GTX 1070 owner again, missed you


Hey, what can I say? There is actually some benefit from being an early adopter sometimes.


----------



## Dude970

Maybe we should split the thread, one for Samsung and one for Micron


----------



## MyNewRig

Quote:


> Originally Posted by *criminal*
> 
> Hey, what can I say? There is actually some benefit from being an early adopter sometimes.


To be honest, this is the first time I have experienced this phenomenon with computer components, and especially GPUs. In previous generations I was an early adopter, and later cards would release with dual BIOS, better components, fixed issues, game bundles, etc., so products always got better with time. This time around the product got significantly worse. But maybe it's for the better: my Micron GTX 1070s are on their way back to the store, so I will get another one later if they resume Samsung production, or get AMD Vega with 512GB/s HBM2 memory that won't make me miss my 1070s... it just takes patience.

Quote:


> Originally Posted by *Dude970*
> 
> Maybe we should split the thread, one for Samsung and one for Micron


LOL, no kidding, that is actually a good idea, because these really are two different products:

"Original NVIDIA Samsung GTX 1070 Owner's Club"

"NVIDIA Micron GTX 1070 Owner's Club"

Too bad I won't get to see @criminal in my club.


----------



## xg4m3

So, let's say I buy an MSI Gaming X card. Does that mean I can't get the stable OC modes they advertise, or that I can't OC it past the preconfigured factory overclocks (Silent, Gaming, OC mode, or whatever they're called)?


----------



## ricko99

It's all about the business, typical Ngreedia. Everyone should know by now.


----------



## Mjhieu

Quote:


> Originally Posted by *Dude970*
> 
> Maybe we should split the thread, one for Samsung and one for Micron


Maybe. It's fine by me. The driver problem is still there, but with Maximum Performance mode my PC never crashes, and it can run Folding@home at full load 24/7 and game all the time xD.

I love Micron memory, I hate Samsung xD


----------



## Dude970

Quote:


> Originally Posted by *Mjhieu*
> 
> Maybe, If fine for me, since the problem still there with driver, but it's ok with maximum peformance mode my pc never crash and can fulloading folding home 24/7 and gaming all time xD.
> 
> 
> 
> 
> 
> 
> 
> I love Micron Memory, I hate samsung xD


Nice to hear the card works well for you. I can't use the last two drivers.


----------



## Mjhieu

Quote:


> Originally Posted by *xg4m3*
> 
> So, let's say i buy MSI X card. Does that mean i cant get stable OC modes they advertise or i cant OC it past default overclocks which are already preconfigured (normal, gaming, oc mode or whatever they're called)?


Before you manually overclock, first set the power mode to Maximum Performance in the Nvidia Control Panel if your card has Micron VRAM. I'm not sure about Samsung. Who knows, you may have gotten a good GPU.


----------



## tps3443

Quote:


> Originally Posted by *bigjdubb*
> 
> That sucks pretty hardcore, my 1070 is barely worth it's $430 retail price..... I can't imagine paying any more than that for one.


I hate that you guys got all this Micron memory. That's a shame! Nvidia had better not pull this with the GTX 1080; I think there is only one company that makes GDDR5X.

My GTX 1070 was an SC ACX and it had Samsung memory. It clocked to very high levels; I could bench close to 10,000 with a lower core clock.

Hell, my GTX 1080 can only run about 11,100 with a really high core clock.


----------



## bigjdubb

Quote:


> Originally Posted by *zipper17*
> 
> Sorry, does overclock community it against test stability or something ?? I would not bother anymore of you.


I'm not against stability testing; it is absolutely necessary if you plan to run daily overclocks. My problem with the 3DMark stress test is that passing it only means you are stable for 3DMark, and there is no point in being 100% stable in 3DMark when the whole point of it is a high score. That is why I feel the 3DMark stress test is pointless: if you want to stress your card for temperature testing and general stability, use something that actually stresses it, like OCCT.

Games are different because you play them for hours on end, and there is nothing more frustrating than a CTD in the middle of a mission/fight/etc. There is only one way to ensure game stability, and that is to play the game and see what happens. You may have to set up multiple profiles for different games; some games are picky about memory overclocks, and some don't seem to care what you do.

I do not endorse anyone spending time getting 3DMark stable when they could be doing something useful, like running the benchmark or testing game stability.

Quote:


> Originally Posted by *Dude970*
> 
> Maybe we should split the thread, one for Samsung and one for Micron


What about the Micron owners who get better overclocks than Samsung owners?

Quote:


> Originally Posted by *tps3443*
> 
> I hate that you guys got all this Micron memory. That's a shame! Nvidia better not pull this with GTX1080's. I think there is only one company that makes GDDR5X
> 
> My GTX1070 was a SC ACX it had Samsung memory. And it clock to very high levels I could bench at close to 10,000, with a lower core level.
> 
> Hell, my GTX1080 can only run about 11,100 with a really high core clock.


Not sure why you quoted me; my MSI has Samsung memory. Not that getting Micron memory is a bad thing.


----------



## criminal

Quote:


> Originally Posted by *bigjdubb*
> 
> I'm not against stability testing, it is absolutely necessary if you plan to run daily overclocks. My problem with the 3DMark stress test is that it only means you are stable for 3dMark and there is no point in being 100% stable for 3DMark since the whole point of it is high score. This is why I feel the stress test in 3DMark is pointless, if you want to stress your card for temp testing and general stability you can use something that actually stresses it like OCCT.
> 
> Games are different because you play them for hours on end and there is nothing more frustrating than a CTD in the middle of a mission/fight/ etc. There is only one way to ensure game stability and way is to play the game and see what happens. You may have to set up multiple profiles for different games, some games are picky about memory over clocks and some games don't seem to care what you do, they are just stable.
> 
> I do not endorse anyone spending time getting 3DMark stable when they could be doing something useful like running the benchmark or testing game stability.
> What about the Micron owners that get better overclocks than Samsung owners.
> Not sure why you quoted me, my MSI has Samsung memory, not that getting Micron memory is a bad thing.


Yeah, 3DMark is all about getting the highest possible score, so stability in it is just silly. It just needs to be stable enough to pass the test! That goes for Heaven and Valley as well.

And yeah, you can't speak of those Micron users! They must be lying about their overclocks.


----------



## 2Lazy2Die

Quote:


> Originally Posted by *gtbtk*
> 
> the FTW does draw higher power levels than the base cards. have you tried using a fan curve or faster constant fan for better cooling? your card may be reducing clock speed because of heat
> 
> There is nothing wrong with micron memory, just an issue with the power supply if you try a high overclock from low idle voltages. If the card is idling at .800v or above the card will clock above +500 no problem. I tend to get better performance at +400 than at +560 but I believe that is also consistant with a number of samsung cards as well.
> 
> I am running with a MSI Gaming X Micron memory card running on an i7-2600 with a BCLK overclock of 4.4ghz.
> 
> Heaven is about 2366 in the default gaming mode, 2376 in OC mode and 2558 overclocked at 1080p in a well ventilated case.
> 
> This is a recent time spy with the same curve based OC as run on heaven. http://www.3dmark.com/spy/483429


Setting power management to High Performance actually helped with the artifacts, which previously appeared above +200 on memory. However, the Heaven score even with a +350 OC is just 93 FPS / 2345 points, which still feels underpowered. Could the CPU bottleneck the Heaven benchmark?


----------



## BroPhilip

Quote:


> Originally Posted by *Dude970*
> 
> Maybe we should split the thread, one for Samsung and one for Micron


Sort of like dodgeball: all the losers to the left, please.


----------



## MyNewRig

Quote:


> Originally Posted by *ricko99*
> 
> It's all about the business, typical Ngreedia. Everyone should know by now.


Makes perfect sense. Nvidia is so greedy that they milk their customers and products left and right whenever they have the opportunity. But are Micron GDDR5 modules that much cheaper than Samsung's that it is actually worth all this hassle, or are they scalping a few pennies here and there?

Also, this doesn't explain how they are still using Samsung GDDR5 in the GTX 1060. Why didn't they downgrade that one as well?

I have a feeling that, with all the mess this is causing, they will just switch back silently to Samsung in future production and give those who already bought Micron cards the "overclocking is not guaranteed" excuse. The Micron cards that can't run stable at stock settings, which appear to be many, they will just take back under RMA and refurbish with Samsung memory.

I don't see this continuing until the end of the generation, because that would let AMD kill it in a few months with Vega and HBM2.

AMD has already been taking market share from Nvidia for more than a year now, and with this Micron memory issue, Vega with HBM2 will probably be more successful than it normally would be, unless AMD manages to mess it up somehow, like they often do.


----------



## MyNewRig

Quote:


> Originally Posted by *2Lazy2Die*
> 
> Setting power management to high perfomance actually helped with the artifacts, which previously appeared at >200 memory. However the score in heaven, even with +350 oc is just 93 fps,2345 score which still feels underpowered. Could CPU bottleneck the heaven benchmark?


Setting power management to High Performance is not a good workaround, because it causes the driver to switch from "Let the 3D application decide" to a full custom profile override, which makes the in-game render settings ineffective.

It also does not help increase the poor memory's OC; it just prevents the system from locking up and crashing when switching between low-power and high-power modes, so it is kind of useless.


----------



## 2Lazy2Die

Quote:


> Originally Posted by *MyNewRig*
> 
> Setting power management to high performance is not a good workaround because it causes the driver to switch from "Let the 3D application decide" mode to a complete custom profile override which will make the in-game render settings ineffective.
> 
> Also that does not help increase the poor memory's OC, it just prevents the system from locking up and crashing when going from low power to high power mode or vice versa, so it is kinda of useless.


TBH I don't care that much about OC. I tested Max Performance simply because I wanted to see whether it would do anything or not, and it actually did increase FPS by about 6 when I OC'd the memory and core clocks. But again, that's not what really bothers me. What does is those stock scores of 2260 with 1964 core and 4004 memory; that just feels low to me. I might be exaggerating, of course; maybe it's not that big a deal. But seeing people get something like 2413 at stock with the EVGA overlay on just makes me sad. Yeah, those are just numbers... but quite big numbers (an 11 FPS difference). I'm just trying to decide whether or not I should try to exchange it for a different 1070.


----------



## monza1412

Quote:


> Originally Posted by *MyNewRig*
> 
> Also this doesn't explain how are they still using Samsung's GDDR5 in the GTX 1060? why didn't they downgrade that one as well?


This could also change later on; remember that the 1060 was only recently introduced.
Quote:


> Originally Posted by *MyNewRig*
> 
> I have a feeling that with all the mess this is causing they will just switch back silently to Samsung in future production and give those who already bought cards with Micron the "overclocking is not guaranteed" excuse, those Micron cards that can't run stable at stock settings which appear to be many, they will just take them back under RMA and refurbish them with Samsung memory.


I honestly doubt that.


----------



## MyNewRig

Quote:


> Originally Posted by *monza1412*
> 
> I honestly doubt that.


Why? It absolutely makes strong business sense to do something about this before AMD releases Vega with 512GB/s HBM2 memory; otherwise their loss of market share will be pretty significant, don't you think?


----------



## monza1412

Quote:


> Originally Posted by *MyNewRig*
> 
> Why? It makes strong business sense to do something about this before AMD releases VEGA with 512 GB/s HBM2 memory, otherwise their loss of market share will be pretty significant, don't you think?


Because by doing that they would be "acknowledging" that there is indeed a problem. I would also assume it has less economic impact for the Nvidia partners to replace faulty cards than to switch the entire production line for one component. I don't know, I just don't see it as viable. Maybe the problem is easily fixable in software rather than by hardware replacement, so future drivers or BIOSes could help.

AMD is another story, no one knows for sure a specific launch date, retail availability at launch, relative performance or price, or what Nvidia might launch to counter VEGA at that specific time. It's all supposition.


----------



## TheGlow

I don't like the idea that I got a Micron instead of a Samsung, however the performance appears the same and it looks like it OCs well.
Whether it's a placebo or not, I'm ok with it.


----------



## Mr-Dark

Quote:


> Originally Posted by *Mjhieu*
> 
> Omg, You're too rich, love you xD, Congratz. Here If I go for gtx 1070, I will have to spend about 1300 usd for it.


I'm not

Sadly a friend visited me today and once he saw that 1070 SLI, he just gave me $300 and 2x 980 Ti Hybrids. I just couldn't pass up that offer: two 980 Ti Hybrids for the price of a single 1070.

The Hybrids boost to 1342MHz at stock and that's already faster than the 1070 SLI..

1070 SLI score

http://www.3dmark.com/3dm/15006887?

Ti's score

http://www.3dmark.com/3dm/15027316?

Decent enough


----------



## bigjdubb

Quote:


> Originally Posted by *TheGlow*
> 
> I dont like the idea I got a Micron vs a Samsung, however the performance appears the same and it looks like it OC's well.
> Whether its a placebo or not, I'm ok with it.


There is nothing wrong with Micron memory in general. I have had a handful of sli setups where one card had micron and one card had either samsung or sk hynix memory (usually sk or micron though) and have never had any problems getting clocks to match. Even in the case of the 1070 there is no defined line between the capabilities of Micron memory and Samsung memory. Some people with samsung memory get high mem clocks and some don't, it's the same for Micron owners. Luck seems to be better with Samsung, at least in the early batches.


----------



## criminal

Quote:


> Originally Posted by *bigjdubb*
> 
> There is nothing wrong with Micron memory in general. I have had a handful of sli setups where one card had micron and one card had either samsung or sk hynix memory (usually sk or micron though) and have never had any problems getting clocks to match. Even in the case of the 1070 there is no defined line between the capabilities of Micron memory and Samsung memory. Some people with samsung memory get high mem clocks and some don't, it's the same for Micron owners. Luck seems to be better with Samsung, at least in the early batches.


These Micron guys want to complain.... lucky for them they never had to deal with Elpida vram. I remember back when I bought my GTX 780 Classified, the original batch shipped with Samsung, but some started shipping with Elpida. That was a $699 videocard. Now that's something to complain about.


----------



## Dude970

Quote:


> Originally Posted by *criminal*
> 
> These Micron guys want to complain.... lucky for them they never had to deal with Elpida vram. I remember back when I bought my GTX 780 Classified, the original batch shipped with Samsung, but some started shipping with Elpida. That was a $699 videocard. Now that's something to complain about.


I had Elpida vram on a 970, and it OC'd like a champ. Guess everyone's mileage varies


----------



## criminal

Quote:


> Originally Posted by *Dude970*
> 
> I had Elpida vram on a 970, and it OC'd like a champ. Guess everyone's mileage varies


I believe Micron had already bought them by then. So you might have lucked out and gotten some re-branded Micron or something. All I know is that even with the extra voltage and subzero temps that the Classified was built for, the Elpida wouldn't overclock for crap. And the GTX 780 was a true high-end gpu. Especially the Classified version.

Yep: https://www.micron.com/about/about-the-elpida-acquisition


----------



## Dude970

Quote:


> Originally Posted by *criminal*
> 
> I believe Micron had already bought them by then. So you might have lucked out and gotten some re-branded Micron or something. All I know is that even with the extra voltage and subzero temps that the Classified was built for, the Elpida wouldn't overclock for crap. And the GTX 780 was a true high-end gpu. Especially the Classified version.
> 
> Yep: https://www.micron.com/about/about-the-elpida-acquisition


Bought it last December, so I guess it was rebranded


----------



## MyNewRig

Quote:


> Originally Posted by *bigjdubb*
> 
> There is nothing wrong with Micron memory in general. I have had a handful of sli setups where one card had micron and one card had either samsung or sk hynix memory (usually sk or micron though) and have never had any problems getting clocks to match. Even in the case of the 1070 there is no defined line between the capabilities of Micron memory and Samsung memory. Some people with samsung memory get high mem clocks and some don't, it's the same for Micron owners. Luck seems to be better with Samsung, at least in the early batches.


Remember that we are not talking about Micron memory in general or as a brand, this is specifically about Micron's 8Ghz GDDR5 modules used in the GTX 1070. And the problem is not just their lower or non-existent OC headroom; they have other problems: system lock-ups, BSODs, full system crashes, screen-wide artifacts on the desktop and in browsers, artifacts at stock settings (I get desktop artifacts with that Micron memory just switching displays), inability to deal with voltage/power fluctuations, etc. They are just very unstable and totally mess up the product.

Micron's GDDR5X used in the GTX 1080, on the other hand, appears to be working fine without issues, and previous-generation Micron GDDR5 at 6Ghz and 7Ghz was fine, no one was complaining.

So the specific Micron GDDR5 8Ghz chips used recently on the GTX 1070 are problematic, low quality and unstable, and that is what we are referring to when we talk about "Micron memory" in this context.


----------



## bigjdubb

Quote:


> Originally Posted by *MyNewRig*
> 
> Remember that we are not talking about Micron memory in general or as a brand, this is specifically about Micron's GDDR5 8Ghz memory modules used in the GTX 1070, and the problem is not just with their lower or non-existent OC headroom, but they have other problems, they crash with system lock-up, BSOD, full system crash, screen wide artifacts on the desktop and browsers, artifacts at stock settings, i get desktop artifacts with that Micron memory just switching displays, they can not deal with voltage/power fluctuations, etc ... they are just very unstable and totally mess up the product ...
> 
> Micron's GDDR5X that is used in the GTX 1080 on the other hand appear to be working fine without issues, previous generations Micron GDDR5 6Ghz and 7Ghz were fine, no one was complaining ...
> 
> So that specific Micron GDDR5 8Ghz chips used recently on the GTX 1070 is problematic, low quality and unstable, and that is what we are referring to when we talk about "Micron memory" in that context.


That's the memory on YOUR card. There are other users getting 500mhz overclocks on their 1070 micron memory, there are 1070's with samsung memory that can't seem to get more than 200 mhz overclock.


----------



## Dude970

Sounds more like your PSU, MyNewRig. Which 1070 do you have again? And are you using separate PCIe cables to the GPU if you have one like mine, an 8-pin + 6-pin?


----------



## MyNewRig

Quote:


> Originally Posted by *bigjdubb*
> 
> That's the memory on YOUR card. There are other users getting 500mhz overclocks on their 1070 micron memory, there are 1070's with samsung memory that can't seem to get more than 200 mhz overclock.


Quote:


> Originally Posted by *Dude970*
> 
> Sounds more like your PSU MyNewRig. Which 1070 do you have again? And are you using separate PCIe cables to the GPU if you have one like mine, a 8pin+6pin


No, that is not just my card. I tried two different cards with Micron memory, and even the one that overclocks a bit better crashes the system when the voltage changes to a low power state. The majority of Micron cards can only do +200 to +250 stable, and by stable I mean actually using it in gaming for a few hours without artifacts and crashing, not one suicide 3DMark run. Most Samsung cards can do +500 and above very easily without any issue. And when I say "most" I don't mean every single card; there are of course exceptions on both memory types, but those exceptions don't represent the majority of chips out there. Micron cards that OC well and Samsung cards that do NOT OC well are just rare anomalies.

No, it is not my PSU. I have a Platinum rated PSU, and all supplied cables are single 8-pin cables, each with its own separate connector. I ran a Samsung GTX 1070 for two weeks on that same exact system at +700 memory (9400MHz), 100% stable during hours of gaming and benchmarking, and when I pushed it to +750 or +800 all I got was "Display driver stopped responding and has recovered." No full system crash or lockup ever happened with that card.

Of the two Micron cards, one shows artifacts at +100 and the other goes up to around +300 but then crashes the system instantly when power states change. +250 is the highest stable OC that does not cause system crashes going from low power to high power and vice versa. I can lock the voltage to 1093 mV with +300 to prevent the voltage-fluctuation crashes, but I still get artifacts in RotTR until I lower the offset to around +270.

From the data points I have gathered online, the mean/average stable offset for Micron is around +200 to +250, and the mean/average stable offset for Samsung is around +500. Exceptions exist on both sides, but you have much better odds with Samsung cards, especially in the overall stability department and full system crashing.
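To put those offsets in perspective, here is a quick sketch of how an Afterburner memory offset maps to the effective data rate people quote (e.g. "+700 = 9400MHz"). It assumes the 1070's stock Afterburner/GPU-Z reading of 4004MHz; the doubling factor is GDDR5's double data rate, and the poster's 9400 figure is just a rounding of the result:

```python
# Rough sketch: converting an MSI Afterburner memory offset (MHz) on a
# GDDR5 GTX 1070 into the effective data rate quoted in this thread.
# Assumes the stock Afterburner reading of 4004 MHz; GDDR5 transfers
# data on both clock edges, hence the factor of two.

STOCK_AB_CLOCK = 4004  # MHz, as shown in Afterburner/GPU-Z for the 1070

def effective_rate(offset_mhz: int) -> int:
    """Effective memory data rate in MT/s for a given Afterburner offset."""
    return (STOCK_AB_CLOCK + offset_mhz) * 2

for off in (0, 250, 500, 700):
    print(f"+{off:>3} -> {effective_rate(off)} MT/s")
```

So +250 (a typical stable Micron offset by the numbers above) is roughly 8508 MT/s, while +700 on a good Samsung card is 9408 MT/s, which the thread rounds to 9400MHz.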


----------



## Dude970

Quote:


> Originally Posted by *MyNewRig*
> 
> No that is not my card, i tried two different cards with Micron memory and even the one that overclocks a bit better crash the system when voltage changes to low power state, the Majority of Micron cards can only do +200 to +250 stable and by stable i mean actually using it in gaming for a few hours without artifacts and crashing, not one suicide 3DMark run, and most Samsung cards can do +500 and above very easy without any issue, and when i say "most" i don't mean every single card, there are of course exceptions on both memory types but these exception don't represent the majority of chips out there ... Micron cards that OC well and Samsung Cards that DO NOT OC well are just rare anomalies
> 
> No it is not my PSU, i have a Platinum rated PSU, and all cables supplied are single 8-pin cables each with its own separate connector, i ran a Samsung GTX 1070 for two weeks on that same exact system, +700 memory (9400Mhz) 100% stable during hours of gaming and benchmarking, and when i push it to +750 or +800 all i get is "Display driver stopped responding and has recovered." no full system crash or lockup has ever happened with that card.
> 
> the other two Micron cards, one shows artifacts at +100 and the other goes up to around +300 but then instant system crash when power stats change, +250 is the highest stable OC that does not cause system crashes going from low power to high power and vice versa ... i can lock the voltage to 1093 mV with +300 to prevent voltage fluctuation crashing but i still get artifacts in rotTR until i lower the offset to around +270
> 
> From the data points i have gathered online, the mean/average stable frequency for micron is around the +200 to +250 point, and the mean/average stable frequency for Samsung is around +500 ... exceptions exist on both sides, but you have much much better odds with Samsung cards ... especially in the overall stability department and full system crashing ..


Okay, that seems to rule out the PSU. Try older drivers, I get artifacts and crashes with the last two, version .54 and older work fine.
Just trying to help you man, not doubting your issues


----------



## t1337dude

Just ordered an MSI 1070 Gaming Z 8G for my HTPC rig for $400 from Jet

Almost ordered the Gigabyte G1 instead, but literally every other review mentioned coil whine. My Gigabyte 980 Ti G1 had to be returned 2 times before I got a card without coil whine.

The MSI cost $30 more, but sometimes you have to pay a little more to minimize potential headaches and just enjoy the purchase


----------



## MyNewRig

Quote:


> Originally Posted by *Dude970*
> 
> Okay, that seems to rule out the PSU. Try older drivers, I get artifacts and crashes with the last two, version .54 and older work fine.
> Just trying to help you man, not doubting your issues


Yes, sure, I understand, but let me tell you this: I have tested a bunch of GPUs over the years and have a good sense for their behavior. Good high quality cards remain stable in a variety of situations and overclock comfortably without much issue or hassle; low quality cards, on the other hand, are impossible to get stable no matter how much you try. The number of PSUs and system components I have replaced in the past trying to get bad GPUs stable is huge, I have literally tried every single PSU brand and grade in the market during my testing.

But in my experience, a card with quality components you just put in the system, OC, run a couple of tests, and then it runs perfectly until its end of life. A low quality product like that Micron GTX 1070 will not run well no matter what you do; you can change PSUs, motherboards, drivers, do all kinds of workarounds like voltage locking, etc., but it will always remain problematic. That is based on experience with a lot of GPUs I have handled over the years.


----------



## MyNewRig

Quote:


> Originally Posted by *t1337dude*
> 
> Just ordered an MSI 1070 Gaming Z 8G for my HTPC rig for $400 from Jet
> 
> Almost ordered the Gigabyte G1 instead, but literally every other review mentioned coil whine. My Gigabyte 980 Ti G1 had to be returned 2 times before I got a card without coil whine.
> 
> The MSI cost $30 more, but sometimes you have to pay a little more to minimize potential headaches and just enjoy the purchase


I was considering the Gaming Z myself, but browsing the MSI forums I found a bunch of horror stories with that card, mostly related to artifacting and crashing at stock settings just sitting on the desktop, and screens losing signal for some reason.

Gigabyte and EVGA have a bunch of coil whine reports; the ASUS Strix and Nvidia's Founders Edition appear to be the two cards with the least issues. But if you already ordered the Gaming Z, please let me know how it goes once you receive it, because I am really willing to give this card a try if someone can confirm that it does not have QC issues like most MSI products.


----------



## ITAngel

I have run my GPU at 2038MHz and it did boost to 2100MHz. I did realize these GTX 10 series cards want to stay cool or else they will throttle down your speeds. I found the sweet spot for mine: anything below 49C it will run at full speed. It starts to throttle once it passes 49C, which sucks, but that is what I found on my GTX 1070.

I would say this: my old EVGA GTX 980 Ti under water used to heat my room up to 80-85F if I had the door closed. This card will heat my room up to 73-75F max and will not go any higher. Of course, take into consideration that the CPU is also overclocked to 4.5GHz/4.6GHz most of the time depending on what I am doing, and it is water cooled. This GPU is currently on air since I am waiting for a cooler for my GPU to come out.


----------



## MyNewRig

Quote:


> Originally Posted by *ITAngel*
> 
> I have ran my GPU at 2038Mhz and did boot at 2100Mhz. I did realized these GTX 10 series cards want to stay cool or else it will throttle down your speeds. I found the sweet spot for mine anything below 49C It will run at full speed. It will start to throttle once I past 49C which sucks but yes is what I found on my GTX 1070.


That is strange, the 1070 starts throttling at 60C... then 65C... then 70C... Under 60C it always runs at full boost, so I think it is power throttling, not temp throttling. Which type of 1070 do you have? Try increasing the power target to +112 or +120, depending on the max limit allowed by your BIOS, and see if you are still throttling below 60C.


----------



## Nukemaster

I am not sure what controls it, but I get a drop at 36C and 40C. This may never be seen on cards with a passive mode or very low fan speeds.

I used the gpu-z render test because it is pretty slow at heating the card up(and stays under power limits as well).
Do not mind the lack of fps numbers. It only works when the render test is in the foreground.


You can always set a small clock offset to undo this throttle









EDIT. Ohh this is an Asus Dual 1070 08G with a Mono Plus on it.


----------



## MyNewRig

Quote:


> Originally Posted by *Nukemaster*
> 
> I am not sure what controls it, but I get a drop at 36 and 40. This may never be seen on cards with a passive mode or very low fan speeds.
> 
> I used the gpu-z render test because it is pretty slow at heating the card up(and stays under power limits as well).
> Do not mind the lack of fps numbers. It only works when the render test is in the foreground.
> 
> 
> You can always set a small clock offset to undo this throttle
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT. Ohh this is an Asus Dual 1070 08G with a Mono Plus on it.


Okay, I suspect you are using driver 372.90, correct? Notice your voltage is at 0.9930V when it should be 1.067 to 1.093V in normal cases. What I noticed with driver 372.90 is that it uses more power to supply the memory to stabilize it; if you have Micron memory, that causes the core to drop below 1V and hence the throttling. This happens up to the 112% power target. If you increase your power target to 120% and voltage to +100, you will notice that your core voltage goes back to 1.081 and the core breaks the 2000MHz mark. I said yesterday that the latest 372.90 driver is lowering core voltage and causing throttling in an attempt to stabilize the memory, but nobody seemed to care about that info.


----------



## Nukemaster

Quote:


> Originally Posted by *MyNewRig*
> 
> OKay, i suspect you are using driver 372.90 correct? notice your voltage is at 0.9930 v when it should be at 1.067 to 1.093 v in normal cases, but what i noticed with driver 372.90 is that it uses more power to supply the memory to stabilize it, if you have Micron memory causing core to drop below 1 v and hence the throttling, this happens up to 112% power target, if you increase your power target to 120% and voltage to +100 you will notice that your core voltage will go back to 1.081 and core will break the 2000Mhz mark, i said yesterday that the latest 372.90 driver is lowering core voltage and causing throttling in an attempt to stabilize the memory but nobody seemed to care about that info ..


I do have that driver revision. The lower voltage was done with the voltage/frequency curve option in Afterburner.

My card has Samsung memory.

It did it before adjusting the voltage/frequency curve. I first asked here if it was normal or not(in this thread). I am locked at a max of 112% power limit with this card(I still have it at the default 100).

This may all be a BIOS thing. At first I thought my VRMs were getting too hot (and maybe they are?).


----------



## ITAngel

Quote:


> Originally Posted by *MyNewRig*
> 
> That is strange the 1070 starts throttling at 60c ... then 65c ... then 70c ... under 60c it always runs at full boost, i think it is power throttling not temp throttling .. which type of 1070 do you have? try increasing the power target to +112 or +120 depending on the max limit allowed by your BIOS and see if you are still throttling below 60c


I am using a Zotac GTX 1070 AMP! Extreme. I could be wrong, maybe it was 59C, but I could swear it was 49C; I can't find my notes since it's been a while since I overclocked the card.







I may need to start from scratch again. I did save a base setting.

I was using EVGA PrecisionX OC, MSI Afterburner, and FireStorm to test the settings. I didn't like the idea of using a beta Afterburner, so I was focusing on EVGA PrecisionX OC.


----------



## MyNewRig

Quote:


> Originally Posted by *MyNewRig*
> 
> OKay, i suspect you are using driver 372.90 correct? notice your voltage is at 0.9930 v when it should be at 1.067 to 1.093 v in normal cases, but what i noticed with driver 372.90 is that it uses more power to supply the memory to stabilize it, if you have Micron memory causing core to drop below 1 v and hence the throttling, this happens up to 112% power target, if you increase your power target to 120% and voltage to +100 you will notice that your core voltage will go back to 1.081 and core will break the 2000Mhz mark, i said yesterday that the latest 372.90 driver is lowering core voltage and causing throttling in an attempt to stabilize the memory but nobody seemed to care about that info ..


First thing to do is try reverting back to 372.70 after a DDU cleanup and custom driver install with "reset settings" selected, see if you are still throttling that much, most probably you will be breaking the 2000Mhz with 372.70


----------



## MyNewRig

Quote:


> Originally Posted by *ITAngel*
> 
> I am using Zotac GTX 1070 AMP! Extreme. I could be wrong maybe was 59C but I could swear it was 49C but I can't find my notes since is been a while I overclocked the card.
> 
> 
> 
> 
> 
> 
> 
> I may need to start from scratch again. I did save a base setting.
> 
> I was using to test EVGA PrecisionX OC, MSI Afterburner, and FireStorm the settings. I didn't like the idea I was using a beta Afterburner so I was focusing on EVGA PrecisionX OC.


What's the max power target on the AMP Extreme? Does it allow 125%? Afterburner beta 14 is simply the best tool; beta or not, it works the best.









If you are maxing out your power target and voltage and still throttling at 49C, that would be strange behavior. On my end it runs 2088MHz until 60C, then 2050MHz until 65C, then 2032MHz until 69C, then 2012MHz for as long as I use it, even for hours, with some occasional dips to 2000MHz in stressful scenes.
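The step-down behaviour described here acts like a simple temperature lookup. A minimal sketch, using the breakpoints from this post (these are one card's observed values, not a published GPU Boost spec):

```python
# Illustrative only: the temperature step-down one poster observes on his
# GTX 1070 (2088 MHz below 60C, 2050 to 65C, 2032 to 69C, then 2012 MHz).
# Breakpoints and clocks are taken from the post above, not from any spec.

THROTTLE_STEPS = [  # (temperature ceiling in C, boost clock in MHz)
    (60, 2088),
    (65, 2050),
    (69, 2032),
]
FLOOR_CLOCK = 2012  # MHz, sustained clock beyond the last breakpoint

def boost_clock(temp_c: float) -> int:
    """Return the observed boost clock for a given core temperature."""
    for ceiling, clock in THROTTLE_STEPS:
        if temp_c < ceiling:
            return clock
    return FLOOR_CLOCK

print(boost_clock(55), boost_clock(67), boost_clock(75))
```

This is why people report "losing" 70-80MHz over a long session even with a good OC: each few degrees of heat soak steps the card down one bin.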


----------



## t1337dude

Quote:


> Originally Posted by *MyNewRig*
> 
> I was considering the Gaming Z myself but browsing MSI forums i found a bunch of horror stories with that card, mostly related to artifacting and crashing on stock settings just being on the desktop and screens losing signal for some reason,
> 
> Gigabyte and EVGA have a bunch of coilwhine reports, ASUS Strix and Nvidia's Founder's Edition appear to be the two cards with least issues ... but if you already ordered the Gaming Z please let me know how it goes once you receive it because i am really willing to give this card a try if someone can confirm that it does not have QC issues like most MSI products ...


In the past I've avoided MSI after buying a 5850 from them a long time ago. But I feel like this time around they've got a handle on their QC issues (perhaps it has to do with them having a higher price than the competitors?).

I agree that the Strix also looked like a safe bet, but I went with the MSI because I already own the MSI 1080 Gaming X 8G in my primary gaming rig and I'm very satisfied with it compared to the Gigabyte 980 Ti G1 that I upgraded from. I've experienced no coil whine from my MSI 1080 card (except at menus running at 10000 fps), and on top of that it's a lot quieter than my Gigabyte 980 Ti ever was. Not to say that the 1080 Gigabyte G1 is the same as the 980 Ti, but from reviews the consensus seemed to be that it runs louder (and a little cooler) than other cards. In all honesty I prefer to hear my GPU as little as possible, and I hardly ever notice my MSI card in my very quiet case setup.


----------



## MyNewRig

Quote:


> Originally Posted by *t1337dude*
> 
> In the past I've avoided MSI after buying a 5850 from them a long time ago. But I feel like this time around they've got a handle on their QC issues (perhaps it has to do with them having a higher price than the competitors?).
> 
> I agree that the Strix also looked like a safe bet, but I went with the MSI because I already own the MSI 1080 Gaming X 8G in my primary gaming rig and I'm very satisfied with it compared to the Gigabyte 980 Ti G1 that I upgraded from. I've experienced no coil whine from my 1080 MSI card (except at menus running at 10000 fps), and on top of that it's a lot quieter than my Gigabyte 980 Ti ever was. Not to say that the 1080 Gigabyte G1 is the same as the 980 Ti, but from reviews the consensus seemed to be that it runs louder (and a little cooler) than other cards. In all honesty I prefer to hear my GPU as little as possible, and I hardly ever notice my my MSI card in my very quiet case setup.


I agree, I had an MSI GTX 970 and it was almost silent with no QC issues at all, but with all the issues being reported with the 1070 Gaming X and Gaming Z I am feeling uneasy about them, even though I can afford the Gaming Z and also like it.

I am very interested to hear about your experience with the 1070 Gaming Z, OC levels etc. This might help me pull the trigger on one, so please keep me updated.

EDIT: regarding Gigabyte, I totally agree. I had a bunch of issues with Gigabyte motherboards in the last few generations; I never tried a Gigabyte GPU though, I feel they look pretty ugly and cheap.


----------



## xg4m3

I was thinking about getting the MSI Gaming X version, but I'm a little scared now after reading these posts.

Is Zotac a safer bet?


----------



## ITAngel

Okay, running EVGA Precision X and OC Scanner. Looks like after tweaking it a bit I got it to 2138MHz on the core with memory at 9,216.









http://www.ozone3d.net/gpudb/score.php?which=309916


----------



## Mjhieu

Quote:


> Originally Posted by *xg4m3*
> 
> I was thinking about getting the MSI Gaming X version, but I'm a little scared now after reading these posts.
> 
> Is Zotac a safer bet?


I notice some ppl with Zotac 1070 cards also complain that their card crashes with the new driver, and it also uses Micron. So the problem isn't MSI or Zotac, it's about Micron and the Nvidia driver.


----------



## ITAngel

Quote:


> Originally Posted by *Mjhieu*
> 
> I notice some ppl with Zotac 1070 card also complain their card crash with new driver and it also use Micron. So the problem isn't MSI or Zotac, it about Micron and Nvidia driver.


I am using the new drivers on my card and I have yet to crash with them in any of the games I have been playing. My card also has Samsung memory, so not sure if that makes a huge difference. My room also stays within 68F-73F. I am going to play a game now with the new overclocking settings to see how well it does.


----------



## t1337dude

Quote:


> Originally Posted by *xg4m3*
> 
> I was thinking about getting the MSI Gaming X version, but I'm a little scared now after reading these posts.
> 
> Is Zotac a safer bet?


I'm not sure what there is to be scared about. If there's an epidemic of some issue, I haven't seen it reflected in customer reviews or anything like that. I haven't paid much attention to this thread, but the only "mass issue" I've observed from the usual customer is coil whine, and after combing through customer reviews all last night, MSI customers were the ones least likely to report coil whine.


----------



## Mjhieu

Quote:


> Originally Posted by *ITAngel*
> 
> I am using the new drivers on my card and I have yet to crash with them on any of the games I have been playing. My card also has Samsung memory so not sure if that makes a huge difference. My room also stays within 68F-73F. I am going to play a game now with the new overclocking settings to see how well it dose.


Nice, your card has Samsung, so lucky. Here the room temp is about 30 degrees C, and my MSI Gaming Z under full load always stays at 68 degrees C. It has Micron.


----------



## ITAngel

Quote:


> Originally Posted by *Mjhieu*
> 
> Nice, yours card has samsung, so lucky. Here room temp about 30 degree C, and my MSI Gaming Z under fulload always stay at 68 degree C, It has Micron.


I see, good temps. I am not familiar with the Micron chips, but all I know is people say it's good to have Samsung, so I guess I got lucky. After playing one round of Overwatch, temps reached 47C on the GPU; toward the last 60 seconds of the round, the game crashed. The core was at 2126MHz. So not sure what I need to tweak next to make it stable.


----------



## BroPhilip

Quote:


> Originally Posted by *MyNewRig*
> 
> I agree, i had an MSI GTX 970 and it was almost silent and no QC issues at all, but with all the issues being reported with the 1070 Gaming X and Gaming Z i am feeling uneasy about them even though i can afford the Gaming Z and also like it ..
> 
> I am very interested to hear your experience with the 1070 Gaming Z, OC levels etc ... this might help me pull the trigger on one, so please keep me updated
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: regarding Gigabyte i totally agree, had a bunch of issues with Gigabyte motherboards in the last few generations, never tried a Gigabyte GPU though, i feel they look pretty ugly and cheap.


I listed my OC experience with the Gaming Z and posted it a while back... but here it is again

Specs:
Asus Z170-A
i5-6600K OC'd to 4.7GHz
600W EVGA PSU

(I am listing max boost speed plus the step down after the first thermal throttle point. My temps averaged between 50-59C with a custom fan curve: 40% to 95%, ramping up between 40-50 degrees)

Factory Gaming Mode
No voltage or power limit increase
1987-1974. Score 15020
With voltage/power increase
1999

Factory OC mode
No voltage or power increase
1999-1987. Score 15086
With voltage and power increase
2025

OC with no voltage or power increase (highest stable)
2050-2037, Mem 8700, score 15635

OC, 126% power, +100 core voltage (max stable)
2088-2062, Mem 8700, score 15706

In Time Spy I was able to break 2100 with 8700 memory with a voltage lock at 1093 and scored close to 6100. However this would crash Firestrike... the sweet spot seems to be around 2050 for daily use
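The custom fan curve mentioned above can be sketched as a simple linear ramp. The 40%/95% endpoints and the 40-50C window come from the post; the straight-line interpolation between them is an assumption (Afterburner curves are actually whatever points you drag in):

```python
# Minimal sketch of the described fan curve: 40% fan below 40C, ramping
# to 95% by 50C, then flat. Endpoints are from the post; the linear ramp
# between them is an assumed shape, not an exported Afterburner profile.

def fan_speed(temp_c: float) -> float:
    """Fan duty cycle (%) for a given GPU temperature (C)."""
    if temp_c <= 40:
        return 40.0
    if temp_c >= 50:
        return 95.0
    # linear interpolation across the 40-50C ramp window
    return 40.0 + (temp_c - 40.0) * (95.0 - 40.0) / 10.0

print(fan_speed(35), fan_speed(45), fan_speed(55))
```

An aggressive ramp like this is how the card was held in the 50-59C range, below the 60C boost step-down discussed earlier in the thread.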


----------



## MyNewRig

Quote:


> Originally Posted by *BroPhilip*
> 
> I listed my oc experience with the gaming z and posted it awhile back ... but here it is again
> 
> Specs:
> Asus Z170-A
> I5-6600K oc to 4.7ghz
> 600 evga psu
> 
> (I am listing max boost speed plus the step down after first thermal. My Temps averaged between 50-59 with custom fan curve to 40% - 95% ramp up between 40-50 degrees)
> 
> Factory Gaming Mode
> No voltage or power limit increase
> 1987-1974. Score 15020
> With voltage power increase
> 1999
> 
> Factor OC mode
> No voltage or power increase
> 1999 - 1987. Score 15086
> With voltage and power increase
> 2025
> 
> OC with no voltage or power (highest stable)
> 2050-2037 Mem 8700 score 15635
> 
> OC %126 power +100 core voltage (Max Stable)
> 2088-2062 Mem 8700 score 15706
> 
> In time spy I was able to break 2100 with 8700 memory with a voltage lock at 1093 and scored close to 6100. However this would crash firestrike..... the sweet spot seems to be around 2050 for daily use


Thank you very much for that info, exactly what I was looking for. These are almost exactly the same levels I am getting with my ASUS Strix 1070, the same scores at the same settings, and I also have a 6600K overclocked to the same level.

I expected the MSI Gaming Z to provide a bit more performance than this, but it appears to have no advantage over the Strix.

I think the main benefit of the Z is its quiet fans; the Strix starts to sound like a jet engine above 65% fan. How do the fans on the Z sound with that custom curve?


----------



## khanmein

__
https://www.reddit.com/r/4wh8jc/msi_officially_announces_gtx_1070_quality/

Weird, why does nobody complain about the GTX 1080's Micron GDDR5X chips???


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> 
> __
> https://www.reddit.com/r/4wh8jc/msi_officially_announces_gtx_1070_quality/
> 
> weird, y nobody complain on GTX 1080 GDDR5X Micron chip???


Because Micron is the only producer of GDDR5X memory (they probably developed it as a cheaper alternative to HBM), the problem is not with Micron memory in general. The issue is only with the Micron GDDR5 8GHz memory currently used in the GTX 1070, not with any other type of Micron memory chip as far as I am aware.


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> Because Micron is the only producer of GDDR5X memory, they are probably the developer of it as a cheaper alternative to HBM memory, so the problem is not with Micron memory in general, the issue is only with Micron's GDDR5 8Ghz memory currently used in the GTX 1070 and not with any other type of Micron memory chips as far as i am aware.


Actually, I planned to purchase the MSI Gaming X, but now I'm looking around at used cards so I can find out whether the VRAM is Samsung or not. I'm normally against Gigabyte and MSI, but this time around the heatsink, PCB, spare parts, etc. are really nice, plus a beefy cooler. The Gigabyte G1 Gaming cooler was good during the GTX 970/980/980 Ti era, but not paired with Hynix/Elpida VRAM.

Theoretically, Samsung VRAM tends to run tighter memory timings. At the end of the day, perhaps it all depends on luck and the silicon lottery too.


----------



## X6SweexLV

My stable OC, no extra volts: 2050-2088MHz.
It's running stable in Rise of the Tomb Raider, The Witcher 3, 3DMark, and Valley.
If I try +550 it starts to show visual artifacts, but +500 works fine (Micron VRAM).


----------



## BroPhilip

Quote:


> Originally Posted by *MyNewRig*
> 
> Thank you very much for that info, exactly what i was looking for, these are almost the same exact levels i am getting with my ASUS Strix 1070, the same exact scores at the same settings, and i also have a 6600K overclocked to the same level.
> 
> I expected the MSI GAMING Z to provide a little bit higher performance than this but it appears to have no advantage over the Strix.
> 
> I think the main benefit of the Z is its quiet fans, the Strix would start to sound like a jet engine after 65% fan, how do the fans on the Z sound with that custom curve?


I actually had the Strix non-OC model but was getting some pretty bad stuttering in opening videos and such, so I RMA'd it. The sad thing is it had Samsung memory, but it didn't overclock well and only had a 112% power limit. The MSI is much quieter than the Strix model. I also liked the shorter form factor for future SLI setups. Is your Strix the OC or the non-OC model?


----------



## victorrz

Quote:


> Originally Posted by *MyNewRig*
> 
> What the max power target in the AMP Extreme? does it allow 125%? Afterburner beta 14 is simply the best tool, beta or not it works the best
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you are maxing out your power target and voltage and still throttle at 49c that would be a strange behavior, on my end it runs 2088Mhz until 60c then 2050Mhz until 65c then 2032Mhz until 69c then 2012Mhz for as long as i use it even for hours with some occasional dips into 2000Mhz in stressful scenes.
> 
> What is the max power target on the AMP Extreme?


The max power limit on the AMP Extreme is 120%/300W.
With 372.90 my Time Spy score improved; I'm now getting a 6737 graphics score at 2126/2101MHz core clock and 9360MHz memory clock (Samsung).


----------



## gtbtk

Quote:


> Originally Posted by *2Lazy2Die*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> the FTW does draw higher power levels than the base cards. have you tried using a fan curve or faster constant fan for better cooling? your card may be reducing clock speed because of heat
> 
> There is nothing wrong with micron memory, just an issue with the power supply if you try a high overclock from low idle voltages. If the card is idling at .800v or above the card will clock above +500 no problem. I tend to get better performance at +400 than at +560 but I believe that is also consistant with a number of samsung cards as well.
> 
> I am running with a MSI Gaming X Micron memory card running on an i7-2600 with a BCLK overclock of 4.4ghz.
> 
> Heaven is about 2366 in the default gaming mode, 2376 in OC mode and 2558 overclocked at 1080p in a well ventilated case.
> 
> This is a recent time spy with the same curve based OC as run on heaven. http://www.3dmark.com/spy/483429
> 
> 
> 
> Setting power management to high perfomance actually helped with the artifacts, which previously appeared at >200 memory. However the score in heaven, even with +350 oc is just 93 fps,2345 score which still feels underpowered. Could CPU bottleneck the heaven benchmark?
Click to expand...

Possibly. I get lower scores than others running an i7-6700K with the same card, for example, because I am only running an i7-2600. What CPU are you running? What power supply do you have?


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I feel a class action coming on again.....the curse of the 70 model cards.
> 
> 
> 
> They are covering their legal basis this time around it seems by delivering a product that can do the bare minimum of advertised specs and has no OC headroom whatsoever to show its quality, that if you are lucky to even get the stable minimum guaranteed performance.
> 
> I don't know who they are fooling? when has it ever been the case that buyers of high-end GPUs are making purchases based on the very minimum guaranteed performance? i think the answer is never!
> 
> Why are these manufacturers developing OC tools and OC BIOS, OC this and OC that if they don't know that this is what sells their cards over others?
> 
> If they were true to their "we do not support OC" mantra they would have not been pushing all these OC supporting tools in HW and SW! to generate sales with enthusiasts who buy these GPUs, the market is just full of **** these days ..
Click to expand...

Nvidia advertises GPU Boost 3.0 and promotes Precision XOC with its automatic overclocking utility, developed in conjunction with EVGA. Also, all the Nvidia-supplied cards sent to the major English-speaking tech review sites have exclusively been Samsung cards, even in the reviews that have come out recently.

It's a bit difficult to claim they are selling cards with no expectation of overclocking, and given that the 1060s have all been Samsung cards, it's also difficult to claim a shortage of chips. Why not use the Micron chips on the lower-range cards as well?


----------



## tps3443

Quote:


> Originally Posted by *Mr-Dark*
> 
> I'm not
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sadily a friend Visit me today and once he see that 1070 SLI, he just give me 300$ and 2* 980 Ti Hybrid.. I just couldn't pass that offer as 980 Ti's Hybrid for single 1070 Price
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The Hybrid's boost to 1342mhz at stock and that already faster than 1070 SLI..
> 
> 1070 SLI score
> 
> http://www.3dmark.com/3dm/15006887?
> 
> Ti's score
> 
> http://www.3dmark.com/3dm/15027316?
> 
> Decent enough


A GTX 980 Ti is a great card and holds up well against a GTX 1070; it's faster when overclocked. You need to try to get it to 1450-1500MHz core clocks.

The biggest advantage you have is that they're both water-cooled 980 Tis! So temps will be way better than on your previous ACX 1070 cards.

Granted, stock SLI GTX 1070s use about the same power as a single HEAVILY overclocked 980 Ti, lol. WHO CARES?!

They're both fast! If someone brought me a GTX 980 Ti and $300, I would probably hand over my GTX 1080 FE.

That rig is crushing 4K, and obliterating your 1440p 144Hz monitor!


----------



## tps3443

Quote:


> Originally Posted by *t1337dude*
> 
> Just ordered an MSI 1070 Gaming Z 8G for my HTPC rig for 400 from Jet
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Almost ordered the Gigabyte G1 instead but literally every other review mentioned coil whine. My Gigabyte 980 Ti G1 had to be returned 2 times before I got a card with coil whine.
> 
> The MSI cost 30 more but sometimes you have to pay a little more to minimize potential headaches and to just enjoy the purchase


Coil whine is common, and it's not a terrible thing. My GTX 1080 FE will coil whine at 1080p in some rare instances; it's just outputting so many frames per second that some cards whine. Turn on Vsync or Gsync and it goes away.


----------



## Mr-Dark

Quote:


> Originally Posted by *tps3443*
> 
> A GTX980Ti is a great card, and performs great up against a GTX1070. It's faster when Overclocked. You need to try and get it to 1450-1500's Mhz core clocks
> 
> The biggest advantage you have, is there both water cooled 980Ti's! So, temps will be way better over your previous ACX 1070 cards.
> 
> Although stock SLI gtx1070's use about the same power as a single HEAVILY Overclocked 980Ti lol. WHO CARES?!
> 
> There both fast! If someone brought me a GTX980Ti, and $300. I would probably hand over my GTX1080 FE.
> 
> That rig is crushing 4K, and obliterating your 1440P 144hz monitor!


Yeah, two 980 Tis for the price of a single 1070.









1500MHz is easy, as the ASIC quality is high enough (79% and 74%), and I have a 1300W PSU, so no problem.









The temps are very good, around 55C max.


----------



## tps3443

Quote:


> Originally Posted by *Mr-Dark*
> 
> Yea, 2 980 Ti for single 1070 price
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1500mhz is easy as the ASIC quality is high enough, 79% and 74% also I have 1300W psu so no problem
> 
> 
> 
> 
> 
> 
> 
> 
> 
> the temp is very good, around 55c Max


I love deals! I paid $280 for my GTX 1080 brand new, lol. Financial aid paid for it, for this kid to go to school for graphic design, and then he dropped out and sold off everything he got like an idiot. He didn't even know what a GTX 1080 was for. I had to drive like 3 hours to get it, but it was worth it!

You gave him your 1070s too, though, right?

I wonder why they did away with ASIC quality with the 10 series.

I had an RX 480 before my 1070, and before my 1080. Anyway, the RX 480 was 91.5% ASIC quality, and it still couldn't hit 1400, lmao. It's terrible that AMD pre-overclocks their cards; there's just no room left!


----------



## Mr-Dark

Quote:


> Originally Posted by *tps3443*
> 
> I love deals! I paid $280 for my GTX1080 brand new lol. Financial Aid paid for it, for this kid to go to school for graphic design, and then he dropped out and sold off all of the items he got like an idiot. He didn't even know what a GTX1080 was for. I had drive like 3 hours to get it. But, it was worth it!
> 
> You gave him your 1070's to though right?
> 
> I wonder why they did away with ASIC quality with the 10 series.
> 
> I had a RX480 before my 1070, and before my 1080. Anyways, the RX480 was a 91.5% ASIC QUALITY. And, it still couldn't hit 1400 lmao. It's terrible AMD pre overclocks there cards. There's just no room left!


Wow, a 1080 for $280!! I'd drive 6 hours for that deal, lol.

The Hybrids have been mine from day one (they're only 2 months old). That guy purchased them from me a month ago for $1,150; then once he saw the 1070 SLI he said he hated the 4 tubes on the Hybrids. So we traded, and the Hybrids came back to me plus $300.









The only issue now is my CPU temp, up by 5C since all the air inside the case is warmer, but winter is coming.


----------



## t1337dude

Quote:


> Originally Posted by *tps3443*
> 
> Coil whine is common, and it's not a terrible thing. My GTX1080 FE, at 1080P will coil whine in some rare instances, it's just outputting so many frames per second, that some cards whine. Turn on Vsync or Gsync and it goes away.


The coil whine issue isn't about standard coil whine that you get with any card. Many 9xx and 10xx cards and brands exhibit increased amounts of coil whine, which occurs during normal gaming and benchmarking scenarios. Unless you constantly like hearing coil whine, it's a significant issue and requires users to pick their cards carefully otherwise you will be playing the RMA game for awhile.


----------



## BulletSponge

Quote:


> Originally Posted by *t1337dude*
> 
> The coil whine issue isn't about standard coil whine that you get with any card. Many 9xx and 10xx cards and brands exhibit increased amounts of coil whine, which occurs during normal gaming and benchmarking scenarios. Unless you constantly like hearing coil whine, it's a significant issue and requires users to pick their cards carefully otherwise you will be playing the RMA game for awhile.


I've never heard coil whine before myself but to be fair after 2 years on the flight deck I don't hear much of anything.


----------



## Sueramb6753

-snip-


----------



## tps3443

Quote:


> Originally Posted by *Mr-Dark*
> 
> Wow, 1080 for 280$!! I can drive 6h for that deal..lol
> 
> The Hybrid's is mine from day one ( 2 month old only ), that guy purchase them from me from 1 month for 1150$, then once he see the 1070 SLI he just say I hate the 4 tube's from the Hybrid.. So we trade and the Hybrid's back to me + 300$
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The only issue now is my cpu temp, went up by 5c as the the whole AIR inside the case is warmer, but the winter is coming


I love winter for just that reason! Hot case or cold case! Those Hybrids run way cooler; enjoy gaming in the 40C range while overclocked, lol. I've got the doors off and fans at 100% to manage that with my single GTX 1080, lol.


----------



## tps3443

Quote:


> Originally Posted by *t1337dude*
> 
> The coil whine issue isn't about standard coil whine that you get with any card. Many 9xx and 10xx cards and brands exhibit increased amounts of coil whine, which occurs during normal gaming and benchmarking scenarios. Unless you constantly like hearing coil whine, it's a significant issue and requires users to pick their cards carefully otherwise you will be playing the RMA game for awhile.


I've never had such a bad card yet. That would be pretty annoying! I've only heard it a few times from my GTX 1080, from looking at the sky and stuff in games at low resolutions like 1080p, lol.


----------



## t1337dude

Quote:


> Originally Posted by *Symix*
> 
> What do you mean pick them carefully? Any card has a chance of coil whine and there's no place on the internet you can get reliable information about the chance of coil whine.


It means doing research and using common sense. Read user reviews. Some brands have user reviews with ubiquitous mentions of coil whine (e.g. EVGA and Gigabyte); other brands don't. If you buy a brand that many people complain about for coil whine, you obviously shouldn't be shocked if your card is always whining. Then it's the RMA game.
Quote:


> Originally Posted by *tps3443*
> 
> I've never had such a bad card yet. That would be pretty annoying! I've only hear it a few times from my GTX1080 from looking at the sky lol and stuff in games at low resolution like 1080P


Many people have to do multiple RMAs due to the issue. Some cases are exacerbated by the PSU. It's unfortunate, but such is the life of an enthusiast.


----------



## mrtbahgs

Ran a quick series of Heaven and Valley tests and found that the core at +120 stutters or had some odd issue in one of the two benchmarks, so I went with +100 core and then threw in +500 memory (2038/9000 is what it settles on).

I was able to play BF4 for about 4 rounds straight with no issues at all, so I'll see if I can bump things up some more later.
The GPU fan curve hit 72% when it maxed at 62C though, so I'm not sure how much faster and louder I want it to run.

I haven't touched voltage; is it usually worth keeping it at +0%, going straight to +100%, or somewhere in between?
Power is maxed at 111%.
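For anyone wondering how a "+500 memory" offset relates to a 9000 reading: the numbers line up if you assume the Afterburner-style offset is applied to the DDR command clock, so each +1MHz of offset adds 2MHz to the quad-pumped effective GDDR5 rate. A rough sketch of that arithmetic, assuming a nominal 8000MHz stock effective rate (the doubling behavior is an assumption inferred from the reported numbers, not documented tool behavior):

```python
def effective_mem_mhz(offset_mhz, stock_effective=8000):
    """Effective GDDR5 data rate for an Afterburner-style memory offset.

    Assumes the offset applies to the double-rate command clock, so it
    counts twice toward the quad-pumped effective rate (an assumption).
    """
    return stock_effective + 2 * offset_mhz
```

So +500 gives 9000MHz effective, matching the reading above, while +400 would give 8800MHz.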


----------



## MyNewRig

Quote:


> Originally Posted by *t1337dude*
> 
> The coil whine issue isn't about standard coil whine that you get with any card. Many 9xx and 10xx cards and brands exhibit increased amounts of coil whine, which occurs during normal gaming and benchmarking scenarios. Unless you constantly like hearing coil whine, it's a significant issue and requires users to pick their cards carefully otherwise you will be playing the RMA game for awhile.


There is no amount of "careful buying" that can save you from coil whine. Out of the four ASUS Strix 1070s I tested, three are silent and one whines like crazy. You can never be sure whether yours will whine before actually installing it in your system. There are, however, some manufacturers that are more prone to coil whine, EVGA being the top "whiny" brand!


----------



## Mjhieu

Quote:


> Originally Posted by *Mr-Dark*
> 
> Yea, 2 980 Ti for single 1070 price
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1500mhz is easy as the ASIC quality is high enough, 79% and 74% also I have 1300W psu so no problem
> 
> 
> 
> 
> 
> 
> 
> 
> 
> the temp is very good, around 55c Max


Grats. But I wanna tell you that " I love your avatar animation and the lady in avatar xD"


----------



## xg4m3

Ok, the time has come.
MSI Gaming X or Zotac AMP Extreme?
The difference in price is circa €10.

Which one and why?

Amazon links from where i will order it:

MSI
Zotac


----------



## gtbtk

Quote:


> Originally Posted by *xg4m3*
> 
> Ok, the time has come.
> 
> MSI Gaming X or Zotac AMP Extreme?
> 
> Difference in price is cca 10€.
> 
> Which one and why?
> 
> Amazon links from where i will order it:
> 
> MSI
> 
> Zotac


The Zotac will be faster out of the box at stock settings; the Gaming Z is more comparable than the X in the stock performance stakes.
Both OC to about the same levels.
The MSI is one of the quietest cards around.
The Zotac is much larger, so check your available space.
The Zotac draws more power than the MSI.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xg4m3*
> 
> Ok, the time has come.
> 
> MSI Gaming X or Zotac AMP Extreme?
> 
> Difference in price is cca 10€.
> 
> Which one and why?
> 
> Amazon links from where i will order it:
> 
> MSI
> 
> Zotac
> 
> 
> 
> Zotac will be faster out of the box at stock settings. The Gaming Z is more comparable than the X in the stock performance stakes.
> Both OC to about the same levels.
> MSI is one of the quietest cards around
> Zotac much larger so check available space
> Zotac draws more power than the MSI
Click to expand...

The Zotac fan issue is terrible; that's why they offer the 3+2 year warranty. Like Barnacules said, warranty is for #poorpeople.


----------



## t1337dude

Quote:


> Originally Posted by *MyNewRig*
> 
> there is no amount of "careful buying" that can save you from Coilwhine, out of the four ASUS Strix 1070s i tested, 3 are silent and one is whining like a B*itch , you can never make sure if yours will whine or not before actually installing it into the system, there are however some manufacturers that are more prone to Coilwhine, EVGA being the top "whiny" brand!


Yeah, that's what I mostly meant: picking brands that appear less prone to it. It's certainly not equal across brands from what I've observed. I've read little to no complaints about coil whine for MSI this generation, and my first MSI card is free of it. But I had 3 Gigabyte cards that had it to varying degrees (the last one being the least annoying and tolerably quiet).

Another thing I meant by "careful picking" is actually listening for the issue and testing across multiple games. Dying Light, for example, caused my Gigabyte card to screech like no other. I think it's much easier to RMA through the retailer than through the brand, so it's better to figure it out sooner rather than later.


----------



## owikhan

Quote:


> Originally Posted by *xg4m3*
> 
> Ok, the time has come.
> MSI Gaming X or Zotac AMP Extreme?
> Difference in price is cca 10€.
> 
> Which one and why?
> 
> Amazon links from where i will order it:
> 
> MSI
> Zotac


GO FOR ZOTAC


----------



## HOODedDutchman

So I have one Gigabyte Windforce x2 OC card and am debating getting a second for games like The Witcher 3 and Crysis 3. At 1440p the 1070 alone just doesn't quite cut it. So what does everyone think? Is 1070 SLI worth it, or should I wait for the 1080 Ti and sell the 1070, or wait for next gen and deal with turning down a setting or two (hairworks off is beautifully playable in The Witcher 3, and 2x TSAA is fine for 60+ in Crysis 3)? The one thing I know is that the Pascal cards are fairly new, so I'll get a ton of life out of two 1070s. Thanks for any input, guys.


----------



## tps3443

Quote:


> Originally Posted by *HOODedDutchman*
> 
> So I have 1 gigabyte windforce x2 oc card and debating on getting a 2nd for games like witcher 3 and crysis 3 etc. At 1440p the 1070 alone just doesn't quite cut it. So what does everyone think ? Is the 1070 sli worth it or wait for 1080ti and sell the 1070 or wait for next gen and deal with turning down a setting or 2 (hairworks off is beautifully playable on the witcher 3 and 2tsaa is fine for 60+ in crysis 3). The one thing I know is the Pascal cards are fairly new so I will get a ton of life out of 2 1070s. Thanks for any input guys.


Well, the GTX 980 was out for 22 months, nearly 2 years, before the GTX 1080 was released. That's a nice long cycle to keep a video card! I think I would either buy a GTX 1080 for around $500-600, used or new, after selling your 1070 of course, or wait for the 1080 Ti.

Look at the official Nvidia P6000 specs: it's a Quadro card based on the Titan X (Pascal), only with more CUDA cores at 3,840. So it's going to be fast! If the GTX 1080 Ti has the same specs as the P6000, as it is speculated to, it will be a true 4K card, and even faster than the Titan X Pascal!

I use my GTX 1080 for 4K. I don't run AA, and I may turn down a setting or two. In some games the GTX 1080 only loses 10fps going from 1440p to 2160p.

I guess it all depends on the GTX 1080 Ti price. The Titan X is $1,200, so it may be $999. So two used 1080s for $1,000 seems like a better value.


----------



## HOODedDutchman

Quote:


> Originally Posted by *tps3443*
> 
> Well the gtx980 was out for 22 months. Nearly 2 years before GTX1080 was released. That's a nice long cycle to keep a video card! I think I would either buy a GTX1080 for around $500-600 used or new, after selling your 1070 ofcourse. Or wait until 1080Ti.
> 
> Look at the official Nvidia P6000 specs it's a quadro card base on Titan X P, only it has more cuda cores at 3,840. So, it's going to be fast! If the GTX1080Ti has the same specs as the P6000 like it is speculated to have. It will be a true 4K card. And even faster than Titan X Pascal!
> 
> I use my GTX1080 for 4K, I do not run AA. And I may turn down a setting or two. Some games with gtx1080 going from 1440P to 2160P only lose 10fps in performance.
> 
> I guess it all depends on gtx1080ti price. The Titan X is $1200 so, to may be $999. So, (2) used 1080's for $1,000 seems like a better value.


Yeah, the only issue is I'm in Canada. So the cheapest 1070s are $560 CAD + 13% tax and the cheapest 1080s are $900 CAD + 13% tax. Massive price increase for a 20% performance increase.


----------



## Mr-Dark

Quote:


> Originally Posted by *Mjhieu*
> 
> Grats. But I wanna tell you that " I love your avatar animation and the lady in avatar xD"


Thanks bro, Kate is the love


----------



## lanofsong

Hey GTX 1070 owners,

Would you consider putting all that power to a good cause for the next 2 days? If so, come *sign up* and fold with us for our monthly Foldathons - see attached link.

September Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - needs a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## Waleh

Hey guys, which of the blower-style 1070s is best for my little ITX rig? I know there's a Founders Edition card, the MSI Aero, and the Asus Turbo (not sure if there are more). What do you guys recommend?


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xg4m3*
> 
> Ok, the time has come.
> 
> MSI Gaming X or Zotac AMP Extreme?
> 
> Difference in price is cca 10€.
> 
> Which one and why?
> 
> Amazon links from where i will order it:
> 
> MSI
> 
> Zotac
> 
> 
> 
> Zotac will be faster out of the box at stock settings. The Gaming Z is more comparable than the X in the stock performance stakes.
> 
> Both OC to about the same levels.
> 
> MSI is one of the quietest cards around
> 
> Zotac much larger so check available space
> 
> Zotac draws more power than the MSI
> 
> Click to expand...
> 
> zotac fan issue is terrible that's y 3+2 warranty like barnacules said warranty is for #poorpeople.
Click to expand...

Can't comment on Zotac fans. I am pleased with the quietness of my MSI Gaming X, even at 100% fan.


----------



## Chaoz

Quote:


> Originally Posted by *Waleh*
> 
> Hey guys, which of the blower style 1070's is best for my little itx rig? I know there's a founder's edition card, the MSI aero, and Asus turbo (not sure if there are more). What do you guys recommend?


The Aero and Turbo cards use cheaper parts, which is why they're cheaper than the Founders Edition cards.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xg4m3*
> 
> Ok, the time has come.
> 
> MSI Gaming X or Zotac AMP Extreme?
> 
> Difference in price is cca 10€.
> 
> Which one and why?
> 
> Amazon links from where i will order it:
> 
> MSI
> 
> Zotac
> 
> 
> 
> Zotac will be faster out of the box at stock settings. The Gaming Z is more comparable than the X in the stock performance stakes.
> 
> Both OC to about the same levels.
> 
> MSI is one of the quietest cards around
> 
> Zotac much larger so check available space
> 
> Zotac draws more power than the MSI
> 
> Click to expand...
> 
> zotac fan issue is terrible that's y 3+2 warranty like barnacules said warranty is for #poorpeople.
> 
> Click to expand...
> 
> Cant comment on Zotac Fans. I am pleased with the quietness of my MSI Gaming X even at 100% fan
Click to expand...

I'm normally against Gigabyte and MSI, but this time around I prefer the MSI cooler + heatsink + PCB.


----------



## HOODedDutchman

Quote:


> Originally Posted by *Waleh*
> 
> Hey guys, which of the blower style 1070's is best for my little itx rig? I know there's a founder's edition card, the MSI aero, and Asus turbo (not sure if there are more). What do you guys recommend?


The Asus Turbo. I believe the Aero uses the stock PCB and VRM setup, while the Turbo has a custom VRM setup.


----------



## Waleh

Quote:


> Originally Posted by *Chaoz*
> 
> The Aero and Turbo cards use cheaper parts, hence why they're cheaper in price than the Founders Edition cards.


Quote:


> Originally Posted by *HOODedDutchman*
> 
> Asus turbo. I believe the aero uses stock pcb and vrm setup while the turbo has a custom vrm setup.


Thanks for the replies, guys. So is the Founders Edition regarded as superior to the ASUS Turbo and MSI Aero in terms of acoustics/thermals?


----------



## HOODedDutchman

Quote:


> Originally Posted by *Waleh*
> 
> Thanks for the reply guys, so then is the founders edition regarded as being superior to the ASUS turbo and MSI aero in terms of acoustics/thermals?


Personally I would go with the Asus Turbo. It looks way better than reference and has an aftermarket PCB, meaning a better VRM design of some sort (can't tell without a review). Asus does class it as being assembled with their new technology, though. You can see the little Asus logo on the PCB in this picture, so it's obviously not reference.
http://techreport.com/news/30322/asus-turbo-gtx-1070-flies-under-the-radar

This is pretty cool also.
https://www.google.ca/amp/www.techpowerup.com/223676/asus-intros-geforce-gtx-1070-turbo%3famp

Or you could go with the Gigabyte Mini 1070, basically made for ITX rigs. It will dump heat into the case, though.


----------



## MyNewRig

Okay guys, now is decision time and i desperately need your help,

I have been closely monitoring and debating the Samsung vs. Micron GDDR5 memory issue for about two weeks now,

The conclusion is that Nvidia designed the GTX 1070, its BIOS, and its drivers around the original specifications of Samsung's 8GHz GDDR5 modules. The Samsung ICs appear to need little voltage/power to remain stable and can handle voltage fluctuations flawlessly.

Micron's GDDR5, on the other hand, being a lower-quality chip, is power hungry and needs constantly high voltage to remain stable, so it cannot operate within the voltage/power envelope that was originally developed for Samsung GDDR5 in the BIOS and drivers.

The initial assumption was that a BIOS/driver bug was preventing Micron GDDR5 from running stably. What it turned out to be is that, because of the poor power characteristics of these Micron chips, they cannot operate at the original voltage/power specifications, and the fix was to pump as much power into them as the available tools allow, meaning voltage-locking and "prefer maximum performance", neither of which is part of the standard specification.

The GTX 1070 has been significantly downgraded: the initial memory OC range shown in all the early reviews (9000MHz to 9750MHz) is no longer valid, and the new range is 7600MHz to 8800MHz, meaning the lowest Samsung OC is higher than the highest potential Micron OC.

*Now I need to know what to do with my Micron GTX 1070, and I need your suggestions:

1- Return the damn thing, get my money back, and wait for Nvidia to start making cards with Samsung memory again, if ever.

2- Return the damn thing and get a Founders Edition GTX 1070 with Samsung memory, and live with the reference cooling and noise.

3- Return the thing, get my money back, and wait for AMD's Vega.

4- Return the thing and wait for Volta!

5- Get a GTX 1060 or an RX 480 instead.

6- Suck it up and live with the new downgraded performance!

7- Get a PS4 Pro! LOL








*

Please advise ...


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> Okay guys, now is decision time and i desperately need your help,
> 
> I have been closely monitoring and debating the Samsung vs. Micron GDDR5 memory issue for about two weeks now,
> 
> The conclusion is that, Nvidia designed the GTX 1070, its BIOS and Drivers based on Original specifications of Samsung GDDR5 8Ghz modules, the Samsung ICs appear to need low voltage/power to remain stable and can handle voltage fluctuations flawlessly.
> 
> Micron GDDR5 on the other hand, being a lower quality chip, is power hungry which needs constantly high voltage pumping to remain stable, so it can not operate at the same voltage/power envelope that was originally developed for Samsung GDDR5 in BIOS and Drivers.
> 
> The assumption was initially made that there is a BIOS/Driver bug preventing Micron GDDR5 from running stably, but what it turned out to be, because of the poor power properties of these Micron chips, they can not operate at the original voltage/power specifications and thus a fix was to pump as much power into them with the available tools, meaning voltage-locking and "prefer max performance" which are both not part of the standard specification.
> 
> The GTX 1070 has been significantly downgraded as the initial memory OC range showing in all early reviews (from 9000Mhz to 9750Mhz) is no longer valid and the new range is now 7600Mhz to 8800Mhz, meaning that the lowest Samsung OC is higher than the potential highest Micron OC.
> 
> *Now i need to know what to do with my Micron GTX 1070 and i need your suggestions:
> 
> 1- Return the damn thing, get my money back and wait for Nvidia to start making card with Samsung memory again if ever.
> 
> 2- Return the damn thing and get a Founder's Edition GTX 1070 with Samsung memory and live with the reference cooling and noise.
> 
> 3- Return the thing and get my money back and wait for AMD's VEGA.
> 
> 3- Return the thing and wait for Volta!
> 
> 4- Get a GTX 1060 or an RX 480 instead.
> 
> 5- Suck it and live with the new downgraded performance!
> 
> 6- get a PS4 Pro! LOL
> 
> 
> 
> 
> 
> 
> 
> 
> *
> 
> Please advise ...


I'm confused... I've never heard of this. So at stock clocks both Micron and Samsung should work fine, correct?

If I were you, and this stuff actually matters as much as you're saying, I'd just get a reference card. Under normal circumstances it's not going to be noisy. Nvidia is not AMD; their reference coolers are much quieter than AMD's.


----------



## bigjdubb

Quote:


> Originally Posted by *Waleh*
> 
> Thanks for the reply guys, so then is the founders edition regarded as being superior to the ASUS turbo and MSI aero in terms of acoustics/thermals?


If I were looking for a blower-style card I would definitely go with the Founders Edition. The only real advantage the Asus has is two HDMI ports, and that's only an advantage if you have multiple HDMI screens.


----------



## rfarmer

Quote:


> Originally Posted by *MyNewRig*
> 
> Okay guys, now is decision time and i desperately need your help,
> 
> I have been closely monitoring and debating the Samsung vs. Micron GDDR5 memory issue for about two weeks now,
> 
> The conclusion is that, Nvidia designed the GTX 1070, its BIOS and Drivers based on Original specifications of Samsung GDDR5 8Ghz modules, the Samsung ICs appear to need low voltage/power to remain stable and can handle voltage fluctuations flawlessly.
> 
> Micron GDDR5 on the other hand, being a lower quality chip, is power hungry which needs constantly high voltage pumping to remain stable, so it can not operate at the same voltage/power envelope that was originally developed for Samsung GDDR5 in BIOS and Drivers.
> 
> The assumption was initially made that there is a BIOS/Driver bug preventing Micron GDDR5 from running stably, but what it turned out to be, because of the poor power properties of these Micron chips, they can not operate at the original voltage/power specifications and thus a fix was to pump as much power into them with the available tools, meaning voltage-locking and "prefer max performance" which are both not part of the standard specification.
> 
> The GTX 1070 has been significantly downgraded as the initial memory OC range showing in all early reviews (from 9000Mhz to 9750Mhz) is no longer valid and the new range is now 7600Mhz to 8800Mhz, meaning that the lowest Samsung OC is higher than the potential highest Micron OC.
> 
> *Now i need to know what to do with my Micron GTX 1070 and i need your suggestions:
> 
> 1- Return the damn thing, get my money back and wait for Nvidia to start making card with Samsung memory again if ever.
> 
> 2- Return the damn thing and get a Founder's Edition GTX 1070 with Samsung memory and live with the reference cooling and noise.
> 
> 3- Return the thing and get my money back and wait for AMD's VEGA.
> 
> 3- Return the thing and wait for Volta!
> 
> 4- Get a GTX 1060 or an RX 480 instead.
> 
> 5- Suck it and live with the new downgraded performance!
> 
> 6- get a PS4 Pro! LOL
> 
> 
> 
> 
> 
> 
> 
> 
> *
> 
> Please advise ...


Do what I did, get a FE and water cool it.


----------



## MyNewRig

Quote:


> Originally Posted by *bigjdubb*
> 
> If I was looking for a blower style I would definitely go with the founders edition.


I am not looking for a blower style, but I don't want to pay $640 for a card with that garbage Micron GDDR5 memory.

This is why I need your help with my options and alternatives.


----------



## MyNewRig

Quote:


> Originally Posted by *rfarmer*
> 
> Do what I did, get a FE and water cool it.


I don't have a custom loop. Wouldn't buying a reference card and then a water block for it get me into GTX 1080 price range anyway? And wouldn't changing the stock cooler void my warranty? I just don't feel very good about that option. Please give more insight ..


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> I am not looking for a blower style, but i don't want to pay $640 for a card with that garbage Micron GDDR5 memory.
> 
> This is why i need your help about my options and alternatives


Do you think the extra few hundred MHz you get out of Samsung chips is going to make a noticeable difference? I'm on a Gigabyte x2 OC with Samsung memory. Just checked. Not sure if that helps or not lol.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> U think the extra few hundred mhz you get out of Samsung chips is going to make a valuable difference ?. I'm on a gigabyte x2 oc with Samsung memory. Just checked. Not sure if that helps or not lol.


No man, it is not about max OC potential, that is secondary. It is about system stability and quality: these garbage Micron chips lock up the system, produce artifacts on the desktop and in the browser, crash with a BSOD, cause random artifacts in games, and make the core power-throttle because they are pretty power hungry.

It comes down to quality. I want a quality product for my money, not some trash. I tested a GTX 1070 with Samsung memory for 2 weeks and I used to push the hell out of that card; it never locked up or crashed the system. The worst it could do was "Display driver stopped responding and has recovered" until you lowered the offset a little .. that is all. I never saw any of the artifacts I am seeing with these Micron chips ...

Also at 2K and 4K the Samsung memory card was giving smoother frames; that Micron **** gets laggy as you increase the resolution. I just hate it!

So it is a matter of quality first, and performance second ...


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> No man, it is not about max OC potential, that is secondary, it is about system stability and quality, these garbage Micron chips lock-up the system, produce artifacts on the desktop and browser, crash with a BSOD, cause random artifacts in games, and cause the core to power-throttle because they are pretty power hungry.
> 
> It comes down to quality, i want a quality product for my money not some trash, i tested a GTX 1070 with Samsung memory for 2 weeks, and i used to push the hell out of that card, it never locked up or crashed the system, the worst it could do is "Display driver stopped responding and has recovered" until you lower the offset a little .. that is all, never seen any of the artifacts i am seeing with these Micron chips ...
> 
> Also in 2K and 4K the Samsung memory card was giving smoother frames, that Micron **** gets laggy as you increase the resolution, i just hate it!
> 
> So it is a matter of 1st Quality, and 2nd Performance ...


Is it not possible you just got a defective card and it's a coincidence that it has different memory? Why don't you try to RMA the card first before jumping to conclusions?


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Is it not possible you just got a defective card and it's a coincidence that it has different memory. Why don't you try to rma the card first before jumping to conclusions.


I tried. They sent me an advance-RMA card, again with the same Micron garbage and the same issues; the only difference is that the replacement coil-whines, just to add insult to injury ...

Look, I have been researching this and testing hardware for over two weeks. I am not jumping to any conclusions; I arrived at this conclusion slowly, systematically, after a lot of testing and research and after talking to everyone involved, including Nvidia and my AIB ..

So let's move on to the next step, since now is decision time ... debating and researching time is over .. and I am running out of time on my return window ..

Please advise on options now ...


----------



## muzammil84

Quote:


> Originally Posted by *MyNewRig*
> 
> Okay guys, now is decision time and i desperately need your help,
> 
> I have been closely monitoring and debating the Samsung vs. Micron GDDR5 memory issue for about two weeks now,
> 
> The conclusion is that, Nvidia designed the GTX 1070, its BIOS and Drivers based on Original specifications of Samsung GDDR5 8Ghz modules, the Samsung ICs appear to need low voltage/power to remain stable and can handle voltage fluctuations flawlessly.
> 
> Micron GDDR5 on the other hand, being a lower quality chip, is power hungry which needs constantly high voltage pumping to remain stable, so it can not operate at the same voltage/power envelope that was originally developed for Samsung GDDR5 in BIOS and Drivers.
> 
> The assumption was initially made that there is a BIOS/Driver bug preventing Micron GDDR5 from running stably, but what it turned out to be, because of the poor power properties of these Micron chips, they can not operate at the original voltage/power specifications and thus a fix was to pump as much power into them with the available tools, meaning voltage-locking and "prefer max performance" which are both not part of the standard specification.
> 
> The GTX 1070 has been significantly downgraded as the initial memory OC range showing in all early reviews (from 9000Mhz to 9750Mhz) is no longer valid and the new range is now 7600Mhz to 8800Mhz, meaning that the lowest Samsung OC is higher than the potential highest Micron OC.
> 
> *Now i need to know what to do with my Micron GTX 1070 and i need your suggestions:
> 
> 1- Return the damn thing, get my money back and wait for Nvidia to start making card with Samsung memory again if ever.
> 
> 2- Return the damn thing and get a Founder's Edition GTX 1070 with Samsung memory and live with the reference cooling and noise.
> 
> 3- Return the thing and get my money back and wait for AMD's VEGA.
> 
> 3- Return the thing and wait for Volta!
> 
> 4- Get a GTX 1060 or an RX 480 instead.
> 
> 5- Suck it and live with the new downgraded performance!
> 
> 6- get a PS4 Pro! LOL
> 
> 
> 
> 
> 
> 
> 
> 
> *
> 
> Please advise ...


No. 2.
I got the Inno3D iChill X4 1070, which uses the reference PCB (meaning Samsung memory), and the cooler is fantastic. Max temp after 20 loops of the 3DMark stress test was 62°C; in games it usually oscillates around the mid 50s. The fans don't spin until 50°C, and there's an extra fan and heatsink for the VRM.


----------



## MyNewRig

Quote:


> Originally Posted by *muzammil84*
> 
> no 2.
> i got Inno3d iChill x4 1070 which uses reference pcb(means Samsung memory) and the cooler is fantastic. Max temp after 20 loops of stress test in 3dmark was 62°C, in games it usually oscillates around mid 50s. fans don't spin until 50°C and theres an extra fan and heatsink for vrm.


Okay, that is fantastic advice. The retailer who will do the exchange or replacement does not have that particular card in stock. EVGA's SC with ACX 3.0 uses the reference PCB as well, no? Wouldn't that be the next best thing, and would it most probably come with Samsung? Or did they also switch to Micron on that card?


----------



## HOODedDutchman

Order a higher-end binned card, or go reference; that's your best bet. Zotac AMP Extreme, Gigabyte Xtreme Gaming, or the EVGA FTW, and the FTW is probably the one I'd shoot for since the price isn't crazy. The Zotac option is quite reasonable as well. Or go reference; you won't have any issues with reference. It's not nearly as loud as the AMD fanboys make it out to be. Even with a little custom fan curve to keep it in the 70s, the noise won't even be noticeable.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Order a higher end binned card. Or go reference that's your best luck. Zotac amp extreme, gigabyte xtreme gaming, I'd guess evga ftw and that's probably the one I'd shoot for since the price isn't crazy. The zotac option is quite reasonable as well. Or go reference, you won't have any issues with reference. Its not near as loud as the AMD fanboys make it out to be. Even with a little custom curve to keep it in the 70s the noise won't even be noticeable.


Sorry buddy, but you seem pretty late to the party. I have considered all these cards and have spoken with all their manufacturers, and price is not an issue here. The AMP! Extreme, GB Xtreme and EVGA FTW have ALL switched to Micron since August ... not a single one of these cards uses Samsung GDDR5 anymore. Look at all the recent GPU-Z screenshots for these cards in this thread and you will see they are all damn Micron!

If it was that easy I would not have been so desperate for help .. the only cards remaining that use Samsung are the reference cards that were manufactured pretty early ...


----------



## criminal

Quote:


> Originally Posted by *MyNewRig*
> 
> Sorry buddy but you seem pretty late to the party, i have considered all these cards and have spoken with all their manufacturers, price is not an issue here, the AMP! Extreme, GB Xtreme abd EVGA FTW .. have ALL lately switched to Micron since August ... not a single card of these uses Samsung GDDR5 anymore, look at all the recent GPU-Z screenshots for these cards in this thread and you will see all are damn Micron !
> 
> if it was that easy i would have not been so desperate for help .. the only card remaining that uses Samsung is the reference cards that have been manufactured pretty early ...


Get a FE and call it a day.


----------



## TheBoom

Quote:


> Originally Posted by *MyNewRig*
> 
> Sorry buddy but you seem pretty late to the party, i have considered all these cards and have talked with all their manufacturers, price is not an issue here, the AMP! Extreme, GB Xtreme abd EVGA FTW .. have ALL lately switched to Micron since August ... not a single card of these uses Samsung GDDR5 anymore, look at all the recent GPU-Z screenshots for these cards in this thread and you will see all are damn Micron !
> 
> if it was that easy i would have not been so desperate for help .. the only card remaining that uses Samsung is the reference cards that have been manufactured pretty early ...


How about Asus? I bought my Strix not too long ago and it came with Samsung, and a friend very recently got the Turbo and confirmed it was Samsung too. However, I cannot confirm whether those were from older batches. Actually, I think the Turbo cards are very close to the FE design, so they might still be using Samsung chips.


----------



## JackCY

Maybe Samsung can't spit out enough chips? Or Micron is cheaper, or they need to buy more from Micron because their GDDR5X purchase volume is too low, so they buy GDDR5 as well?
I've had a few cards before, and yeah, the Samsung memory chips are the ones to go for. Dunno why Micron is only on par with Hynix and Elpida when it comes to speed when they can do GDDR5X ... maybe their GDDR5X is not as fast as it could be if Samsung made it.

Samsung GDDR5 runs 9 GHz fine most of the time, even more if the core and VRAM can both handle it.

If you have a card with faulty or moody VRAM by design, even at stock clocks, then my advice is to get rid of it ASAP: return it, get your money back. I've had one, and the random crap unstable VRAM causes is difficult to pinpoint at first, but once you put it all together and downclock the VRAM etc., suddenly these random issues and app crashes go away.

In the EU: buy, test for 2 weeks, return if you don't like it.
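To run the "downclock the VRAM and see if the weirdness goes away" idea in the other direction, here's a toy sketch of how you could bisect for the highest stable memory offset. The `is_stable` check is a stand-in of my own (in reality you'd apply the offset in your OC tool, loop Heaven or 3DMark, and watch for artifacts or crashes), and bisection assumes stability is monotonic in the offset, which the "memory hole" reports suggest isn't always true, so treat the result as a starting point only:

```python
def max_stable_offset(is_stable, lo=0, hi=800, step=25):
    """Bisect for the largest memory offset (in MHz, in multiples
    of `step`) for which is_stable(offset) still returns True."""
    best = lo
    while lo <= hi:
        mid = (lo + hi) // (2 * step) * step  # snap midpoint to step
        if is_stable(mid):
            best = mid
            lo = mid + step   # stable: search higher
        else:
            hi = mid - step   # unstable: search lower
    return best

# Example with a fake card that artifacts above +450 MHz:
print(max_stable_offset(lambda off: off <= 450))  # 450
```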


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> Sorry buddy but you seem pretty late to the party, i have considered all these cards and have spoken with all their manufacturers, price is not an issue here, the AMP! Extreme, GB Xtreme and EVGA FTW .. have ALL lately switched to Micron since August ... not a single card of these uses Samsung GDDR5 anymore, look at all the recent GPU-Z screenshots for these cards in this thread and you will see all are damn Micron !
> 
> if it was that easy i would have not been so desperate for help .. the only card remaining that uses Samsung is the reference cards that have been manufactured pretty early ...


Damn, they must have changed their ways. Back in the 6xx and 7xx series days, if you ordered a binned chip like the AMP edition it was always Samsung chips. Any way to tell the manufacture date on the box? If not, just buy a Founders Edition; they are guaranteed Samsung. You're not going to be upset about the noise; even if it's audible it will be slight and very smooth. You're making me want to go buy a second WF2 OC at my local shop because I know it's from the first batch (bought it and returned it because my motherboard fried the same day and it was one or the other)...


----------



## HOODedDutchman

Also, what card do you have? I heard there was a bad batch of cards that went out from MSI and someone else that they were recalling a few weeks back. Think I read it on TweakTown.


----------



## MyNewRig

Quote:


> Originally Posted by *TheBoom*
> 
> How about Asus? I bought my strix not too long ago and it came with Samsung. A friend very recently got the turbo and confirmed it was Samsung. However I cannot confirm if those were from older batches or not. Actually I think the turbo cards are based very similarly to the FE editions so they might still be using Samsung chips.


Oh man, I had the July Strix OC with Samsung memory; how much I loved that card. Flawless, stable, quality, and an OC champ. I had to let it go thinking the next one would be exactly the same. I got two other Strix 1070s, one in August and one this month, and both came with Micron, and I can't hate them enough. I returned one last week and have one left now. I feel it is poisoning my system and I want it out ASAP .. this is why I am asking for replacement recommendations ...


----------



## muzammil84

Quote:


> Originally Posted by *MyNewRig*
> 
> Okay, that is a fantastic advice, my retailer who will do the exchange or replacement does not have that particular card in stock, EVGA's SC with ACX 3.0 uses the reference PCB as well, no? wouldn't that be the next best thing and would most probably come with Samsung? or did they also change to Micron on that card as well?


The iChill X3 is the same card without that extra little fan, so go for that one if possible.


----------



## MyNewRig

Quote:


> Originally Posted by *criminal*
> 
> Get a FE and call it a day.


Do you think I can tolerate its noise and heat? My current Strix runs at 66°C while OCed and I can't hear it (except for the coil whine lol) ... the FE at the same settings is expected to reach 83°C ... do you think I can take it?

I have no plan to water cool, because FE price + water block price = near 1080 price .. so not very good value. Also, do you believe the reference cooler is durable enough, like can it last 2 or 3 years, and how can I clean it? I have a pretty dusty environment and have never tried an FE before .. what will the experience be like?

Also, what FE brand should I go with? I think in this case EVGA, for the warranty?

Sorry, too many questions.


----------



## rfarmer

Quote:


> Originally Posted by *MyNewRig*
> 
> Do you think i can tolerate its noise and heat? my current Strix runs at 66c while OCed and i can't hear it (except for the coil whine lol) ... the FE on same settings is expected to reach 83c ... do you think i can take it?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> no plan to water cool, coz FE price + water block price = near 1080 price .. so not very good value, also do you believe that the reference cooler is durable enough, like can it last 2 or 3 years, and how can i clean it? since i have a pretty dusty environment and never tried an FE before .. what will the experience be like?
> 
> Also what FE brand should i go with? i think in this case EVGA for the warranty?
> 
> Sorry too many questions


Keep an eye on the sales forum here: http://www.overclock.net/t/1611854/fs-gigabyte-gtx-1070-g1-gaming. This guy is selling a Gigabyte G1 Gaming 1070 with Samsung memory for $390.


----------



## MyNewRig

Quote:


> Originally Posted by *JackCY*
> 
> Maybe Samsung can't spit out enough chips? Or is Micron cheaper or they need to buy more from Micron because the GDDR5x volume of buying is too low so they buy GDDR5?
> I've had a few cards before and yeah the Samsung memory chips are the ones to go for really. Dunno why even Micron is only on par with Hynix, Elpida when it comes to speed when they can do GDDR5x... maybe their GDDR5x is not as fast as it could be if Samsung made it.
> 
> Samsung GDDR5 runs 9GHz fine most of the time, even more if the core or VRAM can both handle it.
> 
> If you have a card with faulty or moody VRAM by design even on stock clocks then by advice is to get rid of it ASAP, return it, get your money back. I've had one and the random crap unstable VRAM causes is difficult to pin point at first but once you put it all together downclock the VRAM etc. suddenly these random issues and apps crashing goes away.
> 
> In EU, buy, test for 2 weeks, return if you don't like it.


Agree with you 100% on all points; perfect points you make here. Except that they are still using Samsung's 8 GHz GDDR5 on ALL GTX 1060s, which totally eliminates the shortage/supply theory ..

Looks like a business decision to save money, make more profit, or fulfill their contract with Micron, because maybe GDDR5X sales are slow!

Samsung GDDR5 runs great; it is flawless memory. The problem is that ALL cards with Micron GDDR5 are faulty and moody and all that stuff you say, so it is an entire market out there with issues, not just the specific sample I have. If it was only my specific sample, that would have been a very easy thing to fix; hence my questions to you guys ..

I checked every single card in the market and recently ALL stock is Micron .. I would like to skip the entire Pascal generation, but the wait until Vega or Volta is a very, very long one.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Dam they must have changed their ways. Back in the 6xx and 7xx series days of u ordered a binned chip like the AMP edition it was always Samsung chips. Anyway to tell the manufactured date on the box ? If not just buy a founders edition of they are guaranteed samsung. Your not going to be upset about the noise. Even if it's audible it will be slight and very smooth. Your making me want to go buy a second wf2 oc at my local shop because I know it's first batch (bought it and returned it cuz my motherboard fried the same day and it was 1 or the other)...


Exactly, this is why everybody is so damn disappointed. Buyers of the FTW, the most expensive MSI Gaming Z, the AMP! Extreme, the GB Xtreme, all finding out they are getting Micron memory that can't even do +100 on memory, are feeling very bad. You will see a lot of cards being returned or sold these days because of that.

I have had 4 Strix so far ... 2 Samsung and 2 Micron .. the differences in stability, quality and performance are huge ..

What makes it worse is that nobody wants to TALK about WHY this is the case. Nvidia did not give them permission to even explain to customers why they made this switch .. we are intentionally kept completely in the dark about this ...

It is a mind-blowing situation and is making me consider skipping Pascal altogether ..

Sorry, I hoped there was an easy solution, but I searched for one right and left .. only the option of getting a reference card remains ..


----------



## MyNewRig

Quote:


> Originally Posted by *rfarmer*
> 
> Keep an eye on the sales forum here, http://www.overclock.net/t/1611854/fs-gigabyte-gtx-1070-g1-gaming this guy is selling a Gigabyte G1 Gaming 1070 with Samsung memory for $390.


I am located in Europe .. if he is US-based that could be a problem .. but thanks for pointing it out.


----------



## Newwt

I think some of you guys are being a little dramatic. The cards perform amazingly at the specs they are sold at, but everyone's throwing a fit because you can't modify them to run as fast as you think they should. I understand the frustration, but overclocking potential is not a selling point of the card from the manufacturers.


----------



## *AcidBath*

I was pleasantly surprised last July when I discovered the EVGA SC 1070 card I bought had Samsung VRAM. I don't think any graphics card I've had in the past 6 years had Samsung VRAM. So far I've got it up to 4600 with no issues. Can't say I need such an OC at this time, but it's comforting nonetheless. Bummer for those late to market ... I recall seeing this same scenario before, and more than once!


----------



## juniordnz

I'm just gonna leave this here...


Spoiler: Warning: Spoiler!

(image of the deceased card)
Let us all observe a moment of silence in memory of this poor thing.

So young... so much to see and do... a whole life to live...


----------



## ssgtnubb

Quote:


> Originally Posted by *juniordnz*
> 
> I'm just gonna leave this here...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Let us all make a moment of silence in the memory of this poor thing
> 
> So young...so much to see and do...a whole life to live...


Poor card, never had a chance in this cruel world. That sucks like crazy.


----------



## shadowrain

http://www.gamersnexus.net/news-pc/2427-difference-between-gtx-1080-founders-edition-and-reference

*The (1080)Founder's Edition will cost $700. The MSRP is $600 - so vendors like MSI, EVGA, ASUS, et al. can enter market with cards cheaper than nVidia's own, throw their own coolers on them, and overclock them differently. The vendors will exercise similar control and design/engineering over their versions of the GTX 1000 series as with previous generations.*

By this wording, the choice and control of the AIB design/engineering, including the Micron/Samsung VRAM choice, rests solely with the AIB manufacturers. Manufacturers don't get GDDR5 chips from Nvidia; they buy them themselves.

My speculation is that the Samsung VRAM used in the first batch of 1070s came from FE stock and was put on AIB cards until the more "cost effective" Micron chips were offered to them, or there really was a shortage of Samsung 256 Gbit/s chips.

The Samsung chips on the 1060 are also lower bandwidth, at 192 Gbit/s; putting those chips on the 1070 would hamper it more than the Microns do.

As for the Micron 1070s, plenty of people here are happy with their Microns, some even surpassing Samsung OCs. Yes, there are some Micron cards that are unstable at stock, like the MSI Gaming X recall in China, most likely due to a bad batch of chips delivered to MSI, and maybe some Asus cards. Palit and Zotac Microns are working fine at stock based on their owner reviews and forums.

As another user said here, RMAing working Micron cards to get Samsung cards only makes vendors more strict with their RMA policies.

TL;DR: as written on the Asus 1070 Strix specs page, *All specifications are subject to change without notice*. And Samsung GDDR5 is not a specification at all.
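For reference on the bandwidth figures being tossed around, the arithmetic is simple (my own sketch; the inputs are the commonly quoted 8 Gbps effective GDDR5 data rate and the cards' bus widths):

```python
def memory_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Total memory bandwidth in GB/s: per-pin effective data rate
    times bus width, divided by 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

# GTX 1070: 8 Gbps effective GDDR5 on a 256-bit bus
print(memory_bandwidth_gbs(8, 256))  # 256.0 GB/s
# GTX 1060: same 8 Gbps chips on a 192-bit bus
print(memory_bandwidth_gbs(8, 192))  # 192.0 GB/s
```

This is also presumably why per-chip and whole-card "Gbit/s" numbers get mixed up: each 32-bit GDDR5 chip at 8 Gbps contributes 32 GB/s (256 Gbit/s), and the bus width just sets how many chips run in parallel.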


----------



## shadowrain

Go cry me a river, ASUS ROG badge fanboy. My Zotac AMP Extreme with Micron works and OCs as well as my AMP Extreme with Samsung. Proof enough that Micron is not the issue. Either one is better than any of your Strixes any day of the week. This, and the other successful Micron OCs here and on the web, disproves your blanket statement that "all" Microns are unstable when OCed, let alone at stock. Also, congrats on being another victim of ASUS ROG's horrendous RMA service.


----------



## MyNewRig

Quote:


> Originally Posted by *shadowrain*
> 
> Go cry me a river ASUS ROG badge fanboy. My Zotac Amp Extreme with Micron works and OC's as well as my Amp Extreme with Samsung. Proof enough that Microns are not the issue. Either one better than any of your Strix's any day of the week. This and the other successful Micron OC's here and on the web disproves your blanket statement that "all" Microns are unstable at OC, let alone stock. Also congrats on being another victim of ASUS ROG's horrendous RMA services.


LOL, I don't have to RMA to ASUS at all; I do not even need to RMA. I live in the EU, where the consumer is king, and ASUS can shove their "specifications are subject to change without notice" clause up their butts. As an EU consumer, if a manufacturer changes even a hair in their product from the original review samples or marketing material and I don't like the change, the product goes back in the box and back to the store/manufacturer by the power of the law. As an EU consumer, specifications are what I deem them to be, at my own discretion.


----------



## shadowrain

The sense of EUntitlement is strong with this one. Well, good for you, and good for me, as I'm good too.

Thanks for mistaking me for Roland01 from the GeForce forums. I was intrigued and only had a peek at your discussions, but I'm flattered.

https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/10/

Anyway, Mr GamerSX, you have a good one. I'm out, peace.


----------



## MyNewRig

Quote:


> Originally Posted by *shadowrain*
> 
> The sense of EUntitlement is strong with this one. Well good for you, and good for me as I'm good too.
> 
> Thanks for mistaking me for Roland01 from the geforce forums. Was intrigued and I only had a peek at your discussions but I'm flattered.
> 
> https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/10/
> 
> Anyways Mr GamerSX, you have a good one. I'm out, peace.


Sorry man, I apologize if I mistook you for somebody else; this being your first post did not make it look good. Anyway, cheers.


----------



## criminal

Quote:


> Originally Posted by *MyNewRig*
> 
> I asked my retailer to RMA my card on the condition that they send me a card with Samsung GDDR5 as a replacement, the retailer put that as a replacement condition with the manufacturer/supplier, they said no we can not give you that .. when asked why, they can not tell .. Nvidia has them by the b*alls they can not even talk or explain why they made the switch and why these Micron GDDR5 chips are so bad!
> 
> we are not being any dramatic .. this is a totally f*ucked up situation, if only the lousy AMD had VEGA up in the market already, Nvidia would have not dare to push it so far, but what can i say, this is how monopolies act ..


To be honest, I don't think Nvidia is responsible for the switch. If they were, the FE cards would suffer a similar fate. The only cards Nvidia has no direct control over (other than the Pascal chip itself) are the aftermarket cards put out by other manufacturers. The FEs can be hot and loud, but at least they are consistent in what parts are used. Like I suggested earlier, just get an FE or wait for Vega.


----------



## outofmyheadyo

Are all the non-FE cards using Micron chips?


----------



## MyNewRig

Quote:


> Originally Posted by *criminal*
> 
> To be honest, I don't think Nvidia is responsible for the switch.


I just find it impossible to believe that such systematically coordinated activity, all happening at the same time and with the same approach to shadiness, is not orchestrated by Nvidia. Like @HOODedDutchman said a few posts above, the highest-binned cards have always had Samsung memory. For example, I cannot believe that the super-expensive MSI Gaming Z cannot afford Samsung ICs; that card costs $690 in my market, they could literally use a gold-plated backplate and still turn a profit. It must be that someone with authority over what they can do has given them specific instructions, and who else can do that but Nvidia? AIBs cannot lift a finger without Nvidia's approval; it is not like they just take the chip and do whatever they like with it.

It is official, according to http://www.nvidia.com/object/pf_boardpartners.html, that:

_"NVIDIA has worked closely with Authorized Board Partners to ensure they meet and maintain the *strict requirements*"_

Quote:


> If they were the FE cards would suffer a similar fate. The only cards that Nvidia has no direct control over (other than the Pascal chip itself) is the aftermarket cards put out by other manufacturers. The FE's can be hot and loud, but at least they are consistent with what parts are used.


Actually, in my market the FEs are drying up; there are only a few samples left, and they do not come back in stock once they run out. For example, there are no more ASUS FEs at the retailer I buy from. The FEs look to be an initial limited production run, and once it's gone you will only be left with the AIB cards, all with Micron memory. At least this is how it appears to be working in my market; I could be totally wrong about this one though.

Quote:


> Like I suggested earlier, just get a FE or wait for Vega.


That looks to be the only viable option at the moment, since these Micron memory cards honestly disgust me; I just feel they are a cheap, low-quality product that is not worth the price.

Do you have any idea when Vega will actually hit the market and be available for purchase?


----------



## MyNewRig

Quote:


> Originally Posted by *outofmyheadyo*
> 
> are all the non FE cards micron chips ?


Currently, all recent stock of the EVGA FTW, GB G1 & Xtreme, ASUS Strix, Zotac AMP! & Extreme, Palit, and Gainward is using Micron GDDR5, and all have the same issues to a greater or lesser degree depending on luck: some don't OC at all, some only very slightly, and some crash and artifact at stock settings (I am talking about system-wide crashes and lock-ups, not just soft-crashing). There appear to be a lucky few who can get some OC out of Micron, but only with a bunch of tricks like voltage-locking and forcing the driver to apply max power at all times.

In general, the highest OC potential of Micron GDDR5 is lower than the lowest OC potential of Samsung GDDR5. There are some exceptions of course, but the sample size of these exceptions is very small and not even verified.


----------



## BulletSponge

This is probably a stupid thought, but could Apple have bought up all the Samsung chips?

NVIDIA JOB LISTINGS HINT AT RENEWED GRAPHICS CHIP DEAL WITH APPLE


----------



## Hunched

I've been trying to figure out what is wrong with my VRAM for months, its instability just doesn't make sense.
Like everyone else I can run FireStrike, TimeSpy, Valley, Heaven, all of them hundreds of times at +600 without issue.
Since I've been voltage locking I can play Rise of The Tomb Raider, Rainbow Six Siege, everything at +600 for hours and hours.

I get random artifacts 0.01% of the time, and only ever right after loading a map, for example.
I've NEVER had issues in the middle of playing; once the card is under load, nothing bad ever happens.

Despite locking voltage, the memory doesn't seem to benefit; I believe there are still fluctuations there ruining everything.
It makes literally zero sense that DOUBLING my memory overclock from +300 to +600 does NOT increase instability; it has issues just as rarely.

The issues I'm having at +300 should be significantly worse and more frequent at +600, but nope.
On just 1 out of 100 loading screens it goes to **** the second load is placed on the card.

So because of something in the BIOS, the driver, or my PC configuration, I can't maintain +600 even though I clearly should be able to, and usually can.
I just don't understand, and can't find anyone having a similar experience.

+600 and +300 are identical in how rare the stability issues are; it makes no sense.
Everything I have ever overclocked got more unstable the higher I pushed it, except this, somehow.
Because of some voltage transition it can't handle, one that causes instability for 1 second out of every 20+ hours of gameplay and ONLY when loading new textures, I have to run my memory over 400MHz lower than I should have to.

I'd really like to fix this but I don't understand what is even happening, I want to get another 1070 or a 980 Ti at this point.
Never had these weird issues with Maxwell.


----------



## cookieboyeli

Quote:


> Originally Posted by *Hunched*
> 
> I've been trying to figure out what is wrong with my VRAM for months, its instability just doesn't make sense.
> Like everyone else I can run FireStrike, TimeSpy, Valley, Heaven, all of them hundreds of times at +600 without issue...
> 
> I'd really like to fix this but I don't understand what is even happening, I want to get another 1070 or a 980 Ti at this point.
> Never had these weird issues with Maxwell.


Do you by chance have a setting called *EPU power saving* enabled in your BIOS? I had the same problem, and that setting being enabled was the cause: loading-screen lockups; sometimes after alt-tabbing a bunch and waiting it would unlock after 20 seconds, sometimes the game would crash. Go with what your first instinct tells you - if the memory is no more "unstable" at +300 than it is at +600... then how could it be the memory?


----------



## Hunched

Quote:


> Originally Posted by *cookieboyeli*
> 
> Do you by chance have a setting called *EPU power saving* enabled in your bios? I had the same problem and that setting being enabled was the cause. Loading screen lockups, sometimes after alt tabbing a bunch and waiting it would unlock after 20 seconds, sometimes the game would crash. Go with what your first instinct tells you - if the memory is no more "unstable" at +300 than it is at +600... then how could it be memory!


Thanks for the suggestion, but it's not enabled.

That's what doesn't make sense: +300 and +600 never have issues once I'm playing a game or running a benchmark and there's load on the GPU.
It seems like it's some step in the transition from little or no load to high load that is unstable, since I've never had an issue once visuals have loaded and I'm playing.

In hundreds of hours of testing it has only ever frozen during loading screens, and has only ever artifacted immediately upon entering a game after a loading screen.
Both of these happen maybe 1 in 50 map loads.

At +600 I'd expect it to happen at least 1 in 25, or for the artifacting to look worse, something that would make sense.
If I'm having these issues at +300, doubling that to +600 should obviously make them worse; in fact they should be happening constantly, with near-non-stop crashes.
Or am I wrong?
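For what it's worth, a quick back-of-the-envelope check shows why a 1-in-50 failure rate and a 1-in-25 rate are genuinely hard to tell apart by feel. This assumes each map load fails independently with some probability p; the numbers below are just the rates quoted in the post:

```python
# Probability of observing at least one failure in n map loads,
# assuming each load fails independently with probability p.
def p_at_least_one_failure(p: float, n: int) -> float:
    return 1.0 - (1.0 - p) ** n

# 1-in-50 vs 1-in-25 failure rates, over a sample of 50 observed loads:
# both usually produce at least one failure, so they look much the same.
for p in (1 / 50, 1 / 25):
    print(f"p={p:.2f}: P(>=1 failure in 50 loads) = {p_at_least_one_failure(p, 50):.2f}")
```

So even if +600 really were twice as failure-prone as +300, it could take hundreds of map loads before the difference became obvious.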

With my CPU, adding even another 100MHz is the difference between not having a BSOD in over a year and having one every hour.
The same went for memory on my 970: the more I increased it, the more frequent and worse the artifacting and crashes got, which is how it's supposed to be.

If I get into a game with my GPU at +600MHz I could play for a year straight without leaving and have no problems, but I'd better not load it up 20+ times or eventually there will be issues.
I'm not allowed to overclock because of something that happens 0.01% of the time, which I can't fix and which shouldn't be happening.

I'd feel better if it didn't seem like I was the only person in the world having this issue with Pascal, my 970 overclocked logically and fine, no inexplicable issues with Maxwell.

At this point I just want to understand. It pisses me off that I'm probably never going to know what is happening, I can't figure this out and I doubt I'm going to get a magical answer here.

Welcome to Pascal overclocking, where +300 and +600 are equally unstable, yet 100% stable while gaming, with issues 1% of the time during loading-to-gameplay transitions.
So you better run it at +250 or even less, even though you can actually play forever at +600 without issue as long as you avoid loading.

I can't make sense of it, can't fix it. I give up.


----------



## Forceman

Sounds like the card is dropping the volts too low (or too fast) in those idle periods. Have you tried raising the voltage at all?


----------



## Hunched

Quote:


> Originally Posted by *Forceman*
> 
> Sounds like the card is dropping the volts too low (or too fast) in those idle periods. Have you tried raising the voltage at all?


I have the core voltage raised and locked to 1.093v, and according to GPU-Z it always stays there whenever that profile is in use.
This might just be one of those problems that can't be explained, I have a magical 1070, in a bad way... my memory is cursed.

I'd almost feel better if, like some people, I got no performance increase from +300 to +600 because of error correction; but it's a nice boost, I just don't get to have it.
For all I know I have this issue at +0 on the memory too, probably far less frequently; I haven't used the card without an overclock for any extended period of time.

I'm hoping it stops at +250, but I have no clue. I'm going to keep lowering my memory by 50MHz every time I have an issue, and maybe one day I'll be downclocking.
It doesn't help that it happens so infrequently: I thought +300 was stable for the longest time, and then Rainbow Six Siege surprised me when, the second I loaded into the map, textures were missing, leaving black/pink/green voids.
Despite the previous 20 hours of gameplay being 100% rock solid, I have to lower it and lose performance just because this happens occasionally during transitions.

There's no quick test or benchmark I can do for this; unless maybe loading and quitting Terrorist Hunt hundreds of times for hours on end would speed up the process...
I'll probably be unstable at +250 too, and at this rate it will take another 20+ hours of gameplay to load enough maps to find out.
I'm not a patient person.
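The step-down approach described above can be framed as a simple search loop. A sketch, where `survives_n_map_loads` is a hypothetical stand-in for the hours of actual play-testing each offset requires:

```python
# Walk the memory offset down in 50 MHz steps until a trial run
# (a stand-in for many real map loads) passes without artifacts.
def find_stable_offset(start_mhz: int, survives_n_map_loads, step: int = 50) -> int:
    offset = start_mhz
    while offset > 0 and not survives_n_map_loads(offset):
        offset -= step  # drop one step after any loading-screen artifact
    return offset

# Toy example: pretend anything above +250 eventually artifacts on load.
stable = find_stable_offset(600, lambda mhz: mhz <= 250)
print(stable)  # the toy card settles at +250
```

The painful part isn't the loop, it's that each `survives_n_map_loads` call is 20+ hours of gameplay when failures are this rare.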

I'm also using the exact same PC that my 970 worked flawlessly on, so I don't think anything outside of the 1070 itself could be interfering with stability in some way.

I guess my 1070's memory is just finicky as hell, and I have to cater to the weakest link in the chain until it is satisfied and stable.
I'm losing basically my entire overclock because of whatever this weak link is; even though it's rock solid 99.99% of the time, I need 100%, and that last 0.01% is so far down there.

I've even tried downclocking my core a good bit to see if it would help memory stability but it doesn't seem to care.
I suppose at least I'll probably be able to maintain +600 in seamless open world games with practically no loading screens, always utilizing the GPU... yay...


----------



## reflex75

Quote:


> Originally Posted by *Hunched*
> 
> I've been trying to figure out what is wrong with my VRAM for months, its instability just doesn't make sense...


Many of us are complaining about this memory, which cannot handle the voltage variation.
But we don't know yet whether it's a software or a hardware issue.
If you want, you can add your feedback on the GeForce forum:
https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/1/


----------



## Hunched

Quote:


> Originally Posted by *reflex75*
> 
> Many of us are complaining about this memory, which cannot handle the voltage variation.
> But we don't know yet whether it's a software or a hardware issue.
> If you want, you can add your feedback on GeForce forum:
> https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/1/


Thing is, I even have Samsung memory though...
Which makes me even more disappointed, looks like I have the most broken Samsung memory on OCN.


----------



## JukeBox

My Palit Super Jetstream has been rock solid, sitting at 2.12GHz under load with RAM clocked to 8.5GHz without any issues. I guess I'm in the minority if I'm on Micron; I'll check when I get home from work.


----------



## gtbtk

Quote:


> Originally Posted by *Hunched*
> 
> I have the core voltage raised and locked to 1.093v, and according to GPU-Z it always stays there whenever that profile is in use...
> 
> I've even tried downclocking my core a good bit to see if it would help memory stability but it doesn't seem to care.
> I suppose at least I'll probably be able to maintain +600 in seamless open world games with practically no loading screens, always utilizing the GPU... yay...

I have observed something similar to what you describe. For me, anything up to +450 is fine regardless of voltage.

+450 to +500 occasionally artifacts at voltages below .800 V. Anything above +500 has to have the voltage above .800 V or it will crash every time. I have concluded that the artifacts come from a voltage starvation scenario: when the RAM speed reaches a certain level, the VRM does not ramp up from low voltage levels fast enough to keep up with the memory overclock.

The problem with Pascal is that there seems to be no rhyme or reason to the changes in voltage level at idle. You can restart your machine, run a given task, and all will be good; reboot the machine, try the exact same task, and it will start checkerboard artifacting.

I agree that there appears to be a bug, either in the core sections of the BIOS common to all the Micron cards or in the Nvidia drivers. I actually think the drivers are the more likely culprit, as they provide the API that supplies the "intelligence" and sends the I2C signals that control the card's functions. The brand of memory does not control the card voltages, and if this affected only Asus or MSI cards, or any other single model or range of card, I would agree it was AIB-specific; unfortunately it isn't.

The fanboys over at the nvidia.com forums all want to stick their heads in the sand and blame individual vendors rather than discussing and addressing the core issue, which is shared by all AIB partners selling Micron cards. I would imagine the code change, either delaying the memory OC until the voltage has had time to ramp up or stopping the cards from dropping below .800 V, would be relatively trivial.


----------



## gtbtk

Micron memory is not universally bad.

This is a Micron-memory MSI Gaming X with the RAM running at 9066MHz. It will run a higher memory clock, but it starts showing tearing artifacts.

Firestrike graphics score of 20542; the i7-2600 CPU is hampering the absolute FPS and physics scores.

http://www.3dmark.com/fs/10295168


----------



## reflex75

Quote:


> Originally Posted by *gtbtk*
> 
> I would imagine that the code to either delay the memory OC until the voltage had time to ramp up or alternatively, stop the cards dropping voltage to below .800v should be relatively trivial


If it were so trivial, why is this issue still pending?
Maybe the hardware itself can't handle the required voltage/frequency variation?


----------



## MyNewRig

I am pretty convinced that i have this Micron GTX 1070 memory issue figured out with the aid of Micron's GDDR5 datasheet found here https://www.micron.com/products/datasheets/65c410ee-af9c-4d6f-b35b-595ce11150c4

Micron graphics memory in general is not terrible, but Micron is a second-tier manufacturer behind Samsung; Samsung GDDR5 has consistently offered higher quality and more headroom across generations.

I am talking specifically about Micron's GDDR5 which has a maximum data rate of 8.0 Gb/s

According to its datasheet found here https://www.micron.com/products/datasheets/65c410ee-af9c-4d6f-b35b-595ce11150c4

Micron GDDR5 operates at data rates of 6.0 Gb/s, 7.0 Gb/s, and 8.0 Gb/s (max).

8.0 Gb/s is the maximum this memory is rated for, which could explain why there is no headroom left: at stock settings the memory is already pushed to its limit.

If the same Micron GDDR5 chips were used on a GPU rated for 7 Gb/s they would have been great: very stable, with amazing OC headroom to spare.

The way these Micron GDDR5 modules are being used puts them at a much lower quality standard than Samsung's GDDR5, which appears to support up to 9 Gb/s and so has that great headroom and stability.
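For reference, the datasheet's per-pin data rate translates to card-level numbers like this. GDDR5 transfers four bits per pin per clock (quad data rate), and the GTX 1070 has a 256-bit memory bus; these two facts alone give the familiar figures:

```python
# GDDR5 effective data rate and resulting bandwidth for a GTX 1070.
def bandwidth_gb_per_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8  # bits -> bytes

base_clock_mhz = 2002                        # stock GTX 1070 memory clock
effective_gbps = base_clock_mhz * 4 / 1000   # quad data rate -> ~8.0 Gb/s per pin

print(bandwidth_gb_per_s(8.0, 256))  # 256.0 GB/s at the rated 8 Gb/s
```

Which is why a chip rated at exactly 8.0 Gb/s leaves the stock configuration with no margin at all.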

Cheaping out on components like this is not a good thing: you could cheap out on coolers, chokes, power phases, or any other GPU component and the card would still operate at stock settings, but it will deliver lower overall quality and performance.

When you consider that the highest-binned GTX 1070 cards, the MSI Gaming Z, EVGA FTW, Asus Strix OC, Zotac AMP! Extreme, and Gigabyte Xtreme, have all switched to Micron's lower-quality chips while using premium components elsewhere on the card, you realize this whole thing must be orchestrated by NVIDIA as a deliberate design change to downgrade the GTX 1070.

When you also know that in previous generations (the 700 and 900 series) the top-binned cards all used Samsung, you realize the AIBs had no control over this; it is Nvidia who ordered them to do so.

And when you think about the amount of vagueness, shadiness, and secrecy surrounding this decision, and the fact that Nvidia and all its partners refuse to give any information about the switch, you see how messed up this situation is.

I also predict that no fix will ever be provided, because it is meant to stay this way: the GTX 1070 has been downgraded and capped at 8GHz memory for business and financial reasons. It is manipulative that this was done after review samples were sent out; if they had done it from the very beginning, reactions would have been very different.

It looks like Samsung's GDDR5 had more quality and headroom than Nvidia wanted, given where it positions the GTX 1070 relative to the GTX 1080 and the rest of its lineup, so they downgraded the card to cap its memory performance for business reasons, but can't publicly announce what they have done.

Also, Founders Edition cards look to be drying up in my market, and all that will be left of the GTX 1070 until the end of the generation is AIB cards with the downgraded Micron GDDR5 at a max data rate of 8 Gb/s.

I am very much convinced that this analysis, aided by Micron's GDDR5 datasheet, is pretty accurate and reflects what actually happened behind the scenes.

Also, the card is already selling well above its MSRP; now that its performance has been downgraded, the price should at least come down to MSRP to reflect the cheaper, lower-quality memory chips now being used.

Downgrading the card's performance while keeping it way above MSRP, after review samples went out with better-quality components, and staying in total silence and denial about the situation, is unacceptable from a consumer standpoint. I predict GTX 1070 prices will start coming down to reflect the card's new value proposition and lower demand after this move.


----------



## Oj010

^^ FWIW none of the cards you mentioned are binned, not even the Classified is binned anymore.


----------



## gtbtk

Quote:


> Originally Posted by *reflex75*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I would imagine that the code to either delay the memory OC until the voltage had time to ramp up or alternatively, stop the cards dropping voltage to below .800v should be relatively trivial
> 
> 
> 
> If it were so trivial, why is this issue still pending?
> 
> Maybe the hardware itself can't handle the required voltage/frequency variation?

Because it requires the developers to accept that there is actually an issue and expend time and energy to fix it.

Of course the hardware can cope; if it could not, it would never work on any Micron cards. I run mine at +530 on a regular basis. As long as the card voltage doesn't have to jump from .625 V to 1.050 V in a split second to match the high memory clock when it is instantly placed under load, everything works fine. If the API in the drivers sent the commands as something like:

"prepare to increase memory clock -> increase memory voltage -> pause -> increase memory clock now", instead of "prepare to OC memory -> increase both memory clock and voltage right now", which is what I believe is happening, we would not be having this conversation and blaming Micron. It is easy to blame Micron simply because it is at the end of the chain.

Now, I accept that you may be having more drastic issues with your card, but you seem to be in the minority here. It is more likely your problem is a hardware fault rather than a blanket problem with Micron memory in general.
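The two command orderings gtbtk describes can be sketched abstractly. Everything here is invented for illustration (real driver/VRM control happens over I2C and is not exposed like this); the point is purely the ordering of voltage versus clock:

```python
# Two hypothetical command orderings for applying a memory overclock.
# The claim in the post: ramping voltage *before* the clock avoids the
# split-second starvation window; applying both at once does not.
def apply_oc_safe(commands: list) -> None:
    commands += ["raise_mem_voltage", "wait_for_vrm_ramp", "raise_mem_clock"]

def apply_oc_racy(commands: list) -> None:
    commands += ["raise_mem_clock_and_voltage"]  # clock leads voltage -> artifacts

log = []
apply_oc_safe(log)
# In the safe ordering, voltage is settled before the clock ever moves.
assert log.index("raise_mem_voltage") < log.index("raise_mem_clock")
```

If the theory is right, a driver-side reordering this small would be enough, which is exactly why the "relatively trivial" remark above is plausible.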


----------



## HOODedDutchman

What card do you have? I don't think you ever said... I could have missed it, though. According to this thread there is a glitch with overclocking and power states, but most people say their cards overclock as well as the Samsung ones in many cases once that issue is taken out of the equation:
https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/1/

Also, the memory's speed rating is what it is guaranteed to run at, just like Intel's 2133 memory spec on Skylake: if it goes higher, it goes higher, but no card is guaranteed to overclock even 1MHz over stock. Judging from that link and your situation, having issues at stock, I'd say you may have one of the cards being recalled; or, since you've gotten two, you may just have **** luck. I'm not concerned about memory clocks on mine anyway. Core overclocks scale much better than memory in most cases, and a maximum memory overclock that's stable in absolutely everything is very hard to find. Also, if it looks stable in the one game you play all the time but actually isn't, you can eventually kill the memory. Just my 2 cents. Cards are sold to run at stock settings and no company guarantees overclocking, so if it runs fine at stock (most Micron cards do) you're usually out of luck. In your case I think you got a bad batch, and you probably exchanged your card for another from the same shipment.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> What card do you have I don't think you ever said... I could have missed it tho. From this forum there is a glitch in overclocking and power states...
> 
> In your case I think you got a bad batch n u probably exchanged your card for the same batch that They received in the same shipment.


Is this post directed at me? Anyway, memory scales pretty well according to my testing: at the stock memory frequency of 8GHz I get a graphics score of around 18900 in Firestrike; at 9200MHz the graphics score rises to around 20200. That is about a 7% performance gain, and the result is smoother, more fluid frames at 2K and 4K... tested with an ASUS Strix OC with Samsung GDDR5.
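The quoted gain checks out against the poster's own numbers:

```python
# Percent gain in Firestrike graphics score from the memory overclock alone.
def percent_gain(stock: int, overclocked: int) -> float:
    return (overclocked - stock) / stock * 100

print(round(percent_gain(18900, 20200), 1))  # ~6.9%, i.e. the "about 7%" claimed
```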


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> Is this post directed to me? anyways, memory scales pretty well according to my testing, on stock memory frequency of 8Ghz i get graphics score of around 18900 in Firestrike, at 9200Mhz memory score rises to around 20200 .. that is about 7% performance gain, the result is smoother and more fluid frames in 2K and 4K ... tested with an ASUS Strix OC with Samsung GDDR5


I don't think you get that much of a gain just from adjusting memory. Mine scores about 18900 stock like you (just under 19k), and I max out around 20100 or so; that's with around 2050 on the core and just +450 thrown at the memory, to be on the low end of the averages I see around the forums.

And what card do you have with the Micron that has issues?


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> I don't think u get that much of a gain by just adjusting memory. Mine scores about 18900 stock like u (just under 19k) and I max out around 20100 or so. That's with around 2050 core and just threw +450 at memory to be on the low end of the averages I see around the forums.
> 
> N what card do you have with the micron that has issues ?


Both are ASUS Strix, one from July with Samsung and the other from less than two weeks ago with Micron ...


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> Both are ASUS Strix, one from July with Samsung and the other from less than two weeks ago with Micron ...


Ah, are you running them in SLI?


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Ah u running them in sli ?


No; I only had the Samsung card for about two weeks...


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> No the Samsung card i only had it for about two weeks ...


That sux man. I'm sitting in my living room debating whether or not I should go buy a 2nd 1070 lol.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> That sux man. I'm sitting in my living room debating on whether or not I should go buy a 2nd 1070 lol.


I have been debating that for two weeks now: whether to keep the ****ty Micron one I have or return it and skip Pascal. That Samsung Strix was part of the components for a rig I was building for someone, and I had the opportunity to test how awesome that card was (2113MHz core and 9300MHz memory); I loved it so much that I purchased one for myself... but then mine turned out to have Micron, and I hate it a lot.

I just found out the Gainward GTX 1070 Phoenix GLH has 4250MHz memory out of the box (8500MHz effective)... is that card even any good?

Which brand do you currently have? I bet if you make a new GTX 1070 purchase, anything other than a Founders Edition, you will end up with Micron... which one are you thinking about buying?


----------



## saunupe1911

Quote:


> Originally Posted by *MyNewRig*
> 
> I am pretty convinced that i have this Micron GTX 1070 memory issue figured out with the aid of Micron's GDDR5 datasheet found here https://www.micron.com/products/datasheets/65c410ee-af9c-4d6f-b35b-595ce11150c4
> 
> Micron graphics memory in general is not terrible, but it is a 2nd-tier manufacturer after Samsung, since Samsung GDDR5 has always had a higher quality and headroom across the different generations.
> 
> I am talking specifically about Micron's GDDR5 which has a maximum data rate of 8.0 Gb/s
> 
> According to its datasheet found here https://www.micron.com/products/datasheets/65c410ee-af9c-4d6f-b35b-595ce11150c4
> 
> Micron GDDR5 operate at data rates of 6.0 Gb/s, 7.0 Gb/s, 8.0 Gb/s (MAX)
> 
> 8.0 Gb/s being the maximum rated for this memory which could explain why there is no headroom left because at stock settings the memory is already pushed to its limit.
> 
> If the same Micron GDDR5 chips were used on a 7 Gb/s rated GPU it would have been great, very stable and would have an amazing OC headroom left ..
> 
> The way they have used these Micron GDDR5 modules is bad and makes it of a much lower quality standards than Samsung's GDDR5 which appear to support up to 9 Gb/s and so it has that great headroom and stability ..
> 
> This cheaping out on components is not a good thing, you could cheap out on coolers, chokes, power phases, or any other GPU component, it will still operate at stock settings but will provide an overall lower quality and performance.
> 
> When you consider that the highest-binned GTX 1070 cards, like the MSI Gaming Z, EVGA FTW, Asus Strix OC, Zotac AMP! Extreme, and Gigabyte Xtreme, have all switched to Micron's lower-quality chips while using premium components everywhere else on the card, you realize this whole thing must be orchestrated by Nvidia as a direct design change to downgrade the GTX 1070.
> 
> When you also know that in previous generations (the 700 and 900 series) the top-binned cards all used Samsung, you realize the AIBs had no control over this, because it was Nvidia who ordered them to do so.
> 
> Also, when you think about the amount of vagueness, shadiness, and secrecy surrounding this decision, and the fact that Nvidia and all its partners refuse to give any information about the switch, you see how messed up this situation is.
> 
> I also predict that no fix will ever be provided, because it is meant to stay this way: the GTX 1070 has been downgraded and capped at 8GHz memory for business and financial reasons. It is manipulative that this was done after review samples were sent out; had they done it from the very beginning, reactions would have been very different.
> 
> It looks like Samsung's GDDR5 had more quality and headroom than Nvidia wanted, given where they position the GTX 1070 relative to the GTX 1080 and the rest of the lineup, so they downgraded the card to cap its memory performance for business reasons, but they can't publicly announce what they have done.
> 
> Also, Founders Edition cards look to be drying up in my market, and all that will be left of the GTX 1070 until the end of the generation is AIB cards with the downgraded Micron GDDR5 and its max data rate of 8 Gb/s.
> 
> I am very much convinced that this analysis, done with the aid of Micron's GDDR5 datasheet, is pretty accurate and reflects what actually happened behind the scenes.
> 
> Also, since the card is already selling well above its MSRP and its performance has now been downgraded, it should at least come down to MSRP to reflect the cheaper, lower-quality memory chips now being used.
> 
> Downgrading the card's performance while still keeping it way above MSRP after review samples went out with better-quality components, all while staying in total silence and denial about the situation, is unacceptable from a consumer standpoint, and I predict GTX 1070 prices will start coming down to reflect its new value proposition and lower demand after this move.


I think you are on to something, because it makes sense. An overclocked 1070 is just too close to a low-end 1080; they had to do something. Also, what memory do 1080s have... Micron or Samsung? Or is it a mixture?


----------



## saunupe1911

Also, do you guys think 1070s with Samsung memory will be worth a lot more on the resale market in the near future compared to their Micron siblings?


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> I have been debating this for two weeks now, whether to keep the ****ty Micron one I have or return it and skip Pascal. That Samsung Strix I had was part of a rig I was building for someone, and I had the chance to see how awesome that card was (2113MHz core and 9300MHz memory). I loved it so much that I bought one for myself, but mine turned out to be Micron and I hate it a lot.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just found out that the Gainward GTX 1070 Phoenix GLH runs 4250MHz memory out of the box (8500MHz effective)... is that card even any good?
> 
> Which brand do you currently have? I bet if you make a new GTX 1070 purchase other than the Founders Edition you will end up with Micron. Which one are you thinking about buying?


I have a Gigabyte WF2 OC. The one I can get is most likely Samsung, as it's from the first batch. I owned one and returned it because my motherboard had been fried by a thunderstorm a few days before I bought it, so when I went to install it, obviously nothing worked. That was Z77 and I couldn't find a board anywhere, so I returned the second 1070 and bought a Skylake 6700K setup.


----------



## bigjdubb

Quote:


> Originally Posted by *saunupe1911*
> 
> Also, do you guys think 1070s with Samsung memory will be worth a lot more on the resale market in the near future compared to their Micron siblings?


Depends on how stupid the person you are selling it to is. If "very" then "yes it's worth more", if "not very" then "no it's not worth more".


----------



## MyNewRig

Quote:


> Originally Posted by *saunupe1911*
> 
> I think you are on to something, because it makes sense. An overclocked 1070 is just too close to a low-end 1080; they had to do something. Also, what memory do 1080s have... Micron or Samsung? Or is it a mixture?


The GTX 1080 has Micron GDDR5X, which is a good memory with no problems at all. The X variant was developed by Micron as a cheaper alternative to HBM; it runs at 10GHz and overclocks easily to 11GHz. Fast, stable, plenty of OC headroom, no problems.

Like I said in my post, it is not a general problem with everything Micron makes; it is the way Nvidia is using Micron's GDDR5 (the plain non-X variant) and pushing it to the absolute limit so that no headroom whatsoever is left. That is the problem. If those same 8 Gb/s Micron GDDR5 modules were put in a card advertised at 7 Gb/s out of the box, they would have been great quality and everyone would be happy.

That is the difference I realized: it is not the Micron chip itself, it is how Nvidia got cheaper memory and pushed it right up to its absolute limit that is causing the issue.
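To put rough numbers on "pushed to the absolute limit": per-pin data rate times bus width gives total memory bandwidth, and at 8 Gb/s the 1070's 256-bit bus is already delivering its full rated 256 GB/s with nothing in reserve. A quick sketch (the rates and bus width are the public 1070-era specs; the function name is just for illustration):

```python
def bandwidth_gbs(per_pin_gbps: float, bus_width_bits: int) -> float:
    """Total memory bandwidth in GB/s: per-pin rate (Gb/s) x bus width / 8 bits per byte."""
    return per_pin_gbps * bus_width_bits / 8

# GTX 1070: 256-bit memory bus
print(bandwidth_gbs(8.0, 256))  # 256.0 GB/s at the stock 8 Gb/s (Micron's rated ceiling)
print(bandwidth_gbs(9.0, 256))  # 288.0 GB/s at the 9 Gb/s Samsung cards reach overclocked
print(bandwidth_gbs(7.0, 256))  # 224.0 GB/s at the 7 Gb/s spec of the 970/980 era
```

So the Samsung-vs-Micron argument is really about that last ~32 GB/s of overclocked bandwidth, not stock behavior.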

Quote:


> Originally Posted by *saunupe1911*
> 
> Also, do you guys think 1070s with Samsung memory will be worth a lot more on the resale market in the near future compared to their Micron siblings?


Yes, of course. The market discounts everything. Later, on the second-hand market, you will see cards advertised as "Original Samsung GTX 1070" and "Micron GTX 1070", with the Samsung selling higher, of course. And since the Samsung samples are now rare and hard to find, they could be worth much, much more.

Simple economics. It's the reason diamonds cost so much: they are just stones made of carbon, but because of their scarcity the market values them that highly!


----------



## criminal

Quote:


> Originally Posted by *MyNewRig*
> 
> When you also know that in previous generations (the 700 and 900 series) the top-binned cards all used Samsung, you realize the AIBs had no control over this, because it was Nvidia who ordered them to do so.


This is totally wrong, and I already touched on it earlier in the thread. EVGA has used Hynix and Elpida memory on its Classified cards, back when the Classified was the top card they offered and it originally shipped with Samsung.

https://www.google.com/#q=gtx+780+classified+elpida

https://www.google.com/#q=gtx+780+Ti+classified+hynix

Also MSI with their Lightning:

https://www.google.com/#q=MSI+Lightning+780+elpida+memory

Manufacturers can use whatever memory they want on their custom boards as long as it runs at the specs Nvidia has set. This has been going on for years, so it is nothing new with the 1070.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> I have a gigabyte wf2 oc. The one I can get is most likely Samsung. As it's first batch. I owned it and returned it because my motherboard was fried from a thunderstorm a few days before I bought it. Then when I went to install obv nothing worked. That was z77 and couldn't find a board anywhere so I returned the 2nd 1070 and bought a skylake 6700k setup.


Interesting story, but how can you be sure you'll get the exact same sample you returned, and why hasn't it been sold by now?


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> Interesting story, but how can you be sure you'll get the exact same sample you returned, and why hasn't it been sold by now?


This was like two weeks ago; they had like seven in stock and it's listed as open box. It's just a small store; they don't sell online or anything.


----------



## MyNewRig

Quote:


> Originally Posted by *criminal*
> 
> This is totally wrong, and I already touched on it earlier in the thread. EVGA has used Hynix and Elpida memory on its Classified cards, back when the Classified was the top card they offered and it originally shipped with Samsung.
> 
> https://www.google.com/#q=gtx+780+classified+elpida
> 
> https://www.google.com/#q=gtx+780+Ti+classified+hynix
> 
> Also MSI with their Lightning:
> 
> https://www.google.com/#q=MSI+Lightning+780+elpida+memory
> 
> Manufacturers can use whatever memory they want on their custom boards as long as it runs at the specs Nvidia has set. This has been going on for years, so it is nothing new with the 1070.


Yes, but you know what is new this time around, what is turning this into an issue? This is the first card that pushes Micron's GDDR5 to its absolute maximum rated limit of 8 Gb/s, to the point where there is no stability headroom left in the spec. The previous generation's 970, 980, and 980 Ti all ran these modules at 7 Gb/s, which is why +500 was so easy on those cards even when they used Micron GDDR5. I owned the 970 and the 980 Ti, and with both cards +500 on the memory was easy; I never had any issues.

I am just trying to diagnose the issue technically and systematically, to arrive at a correct conclusion and figure out what Nvidia and the AIBs are trying so hard to hide, none of which is helping anyone make an informed purchase decision, myself included.

This is why I went digging into Micron's GDDR5 datasheet and found that its maximum rated data rate is 8 Gb/s, and it does not seem to be taking it well. When MSI first made the switch and took chips from Micron at that rating, it looks like they got a batch that could not run at 8 Gb/s and therefore had to recall the entire batch.

This is what had been confusing me all along: why doesn't Micron GDDR5 have any headroom left when it is already configured at 8 Gb/s out of the box? Now I know these chips simply cannot do any more than that, which explains why some people are getting artifacts at +100.

People are trying all sorts of tricks (voltage locking, "prefer maximum performance", etc.) and living in the illusion that Nvidia will provide a fix, but none can or will be provided, because that memory cannot do any better. Samsung's GDDR5, on the other hand, looks to be good up to 9 Gb/s, which is why it easily overclocks to at least that level for most people.

If Nvidia cared about the quality of their product they would not have allowed AIBs to switch to Micron GDDR5 when 8 Gb/s is the maximum spec for those chips; if they cared, they would have stuck with Samsung GDDR5 or moved to GDDR5X to ensure these cards run really stable.

I am personally pretty confident that this information is the final word on this topic, and personally I would not spend my money on a card running standard Micron GDDR5 at 8 Gb/s out of the box. That is just a bad purchase, IMO.
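For reference, here is how the offsets people keep quoting (+100, +500) map onto those per-pin rates. This is a sketch that assumes the usual Pascal tool convention, where the reported GDDR5 memory clock (~4004 MHz stock on a 1070) is half the effective data rate and the OC offset adds to that reported clock; the stock figure is the commonly reported one, not something I measured:

```python
# Stock GTX 1070 memory clock as Afterburner-style tools report it (MHz).
STOCK_REPORTED_MHZ = 4004

def effective_gbps(offset_mhz: int) -> float:
    """Per-pin data rate in Gb/s: twice the (reported clock + offset)."""
    return 2 * (STOCK_REPORTED_MHZ + offset_mhz) / 1000

print(effective_gbps(0))    # ~8.0 Gb/s: stock already sits at Micron's rated maximum
print(effective_gbps(100))  # ~8.2 Gb/s: even +100 is beyond an 8 Gb/s part's rating
print(effective_gbps(500))  # ~9.0 Gb/s: the range Samsung-equipped cards were hitting
```

Which is the whole argument in three lines: on an 8 Gb/s-rated chip, any positive offset is out of spec, while a chip good to 9 Gb/s has roughly +500 of in-spec headroom.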


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> This was like two weeks ago; they had like seven in stock and it's listed as open box. It's just a small store; they don't sell online or anything.


Then go for it, before the market goes hunting for those rare Samsung GTX 1070 gems.


----------



## ITAngel

If you had the choice between dual RX 480s and a single GTX 1070, which would you go for?


----------



## criminal

Quote:


> Originally Posted by *MyNewRig*
> 
> SNIP


I can totally understand people who have Micron cards that can't run at rated spec without artifacts or crashes. Those people deserve a replacement card and deserve to complain. But you showed your hand in that Nvidia thread linked earlier in this thread when you said, "We are entitled to same OC range shown in every single review sample that Nvidia and AIBs sent to reviewers from 9000Mhz to 9750Mhz frequency" https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/post/4984726/#4984726

I mean, I assume you are GamerSX, right? You aren't entitled to anything except a card that runs at Nvidia's rated specs. If you got that, I think you should hush about the issue, or send your card back and wait for Vega or Volta, or pony up the cash for a GTX 1080.
Quote:


> Originally Posted by *ITAngel*
> 
> If you had the choice between dual RX 480s and a single GTX 1070, which would you go for?


I always say take a single faster card over two slower cards any day.


----------



## outofmyheadyo

A single card over two, any day.


----------



## guttheslayer

How do you actually check whether your card has Samsung or Micron GDDR5?

This is getting a bit worrying!


----------



## Nightingale

Quote:


> Originally Posted by *criminal*
> 
> This is totally wrong, and I already touched on it earlier in the thread. EVGA has used Hynix and Elpida memory on its Classified cards, back when the Classified was the top card they offered and it originally shipped with Samsung.
> 
> https://www.google.com/#q=gtx+780+classified+elpida
> 
> https://www.google.com/#q=gtx+780+Ti+classified+hynix
> 
> Also MSI with their Lightning:
> 
> https://www.google.com/#q=MSI+Lightning+780+elpida+memory
> 
> Manufacturers can use whatever memory they want on their custom boards as long as it runs at the specs Nvidia has set. This has been going on for years, so it is nothing new with the 1070.


This guy is spot on. I used to own a Lightning and was lucky to get the good memory, but so many people were bummed out when they received Elpida; same goes for the Classified. It seems the early batches of those cards came with the good memory, and then they moved over to the alternatives. I bought my Strix within the first week they were readily available where I live and got Samsung memory. It seems Micron is now everywhere in the new production batches of a lot of AIB cards.


----------



## Nightingale

Quote:


> Originally Posted by *guttheslayer*
> 
> How do you actually check whether your card has Samsung or Micron GDDR5?
> 
> This is getting a bit worrying!


Download GPU-Z and it will display the memory manufacturer used on the card. Here is the direct download link from TechPowerUp: https://www.techpowerup.com/downloads/2794/techpowerup-gpu-z-v1-11-0

Here is an example of me running the program and where to look.
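One extra wrinkle once you have GPU-Z open: it reports the actual GDDR5 command clock, not the marketing number, so a stock 1070 shows roughly 2002 MHz rather than "8000". GDDR5 transfers four bits per pin per command-clock cycle, so multiply the GPU-Z reading by four to get the advertised effective speed (a quick sketch; the 2002 MHz figure is the typical stock reading people post, not a guaranteed value):

```python
def gpuz_to_effective_mhz(gpuz_reported_mhz: float) -> float:
    """GDDR5 is quad-pumped: effective speed = 4 x the command clock GPU-Z shows."""
    return gpuz_reported_mhz * 4

print(gpuz_to_effective_mhz(2002))  # 8008.0, i.e. the "8 Gbps" on the box
```

So a 2002 MHz reading is normal and does not mean your card is underclocked.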


----------



## ITAngel

Thanks criminal & outofmyheadyo!


----------



## guttheslayer

Quote:


> Originally Posted by *Nightingale*
> 
> Download GPU-Z and it will display the memory manufacturer used on the card. Here is the direct download link from TechPowerUp: https://www.techpowerup.com/downloads/2794/techpowerup-gpu-z-v1-11-0


Alright, thanks a lot, bro!!

I wonder why Nvidia's x70 cards keep having issues lately. First the gimped 0.5GB of memory on the GTX 970, and now the 1070's memory is gimped by a cheaper alternative...

This is bad, really bad.


----------



## MyNewRig

Quote:


> Originally Posted by *criminal*
> 
> I can totally understand people who have Micron cards that can't run at rated spec without artifacts or crashes. Those people deserve a replacement card and deserve to complain. But you showed your hand in that Nvidia thread linked earlier in this thread when you said, "We are entitled to same OC range shown in every single review sample that Nvidia and AIBs sent to reviewers from 9000Mhz to 9750Mhz frequency" https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/post/4984726/#4984726
> 
> I mean, I assume you are GamerSX, right? You aren't entitled to anything except a card that runs at Nvidia's rated specs. If you got that, I think you should hush about the issue, or send your card back and wait for Vega or Volta, or pony up the cash for a GTX 1080.
> I always say take a single faster card over two slower cards any day.


What are reviews for, then, if not to set expectation ranges before purchasing? A deceptive marketing vehicle? If what you say actually held in the market, why not drop the whole concept of reviews, read a product's spec sheet, and go buy it directly?

I am pretty sure you would never take your own advice, and I am positive that you, put in the same situation, would be furious about it, judging by your very active participation on overclock.net.

It is very easy to give advice when you are not in other people's shoes.


----------



## Nightingale

Quote:


> Originally Posted by *criminal*
> 
> I can totally understand people who have Micron cards that can't run at rated spec without artifacts or crashes. Those people deserve a replacement card and deserve to complain. But you showed your hand in that Nvidia thread linked earlier in this thread when you said, "We are entitled to same OC range shown in every single review sample that Nvidia and AIBs sent to reviewers from 9000Mhz to 9750Mhz frequency" https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/post/4984726/#4984726
> 
> I mean, I assume you are GamerSX, right? You aren't entitled to anything except a card that runs at Nvidia's rated specs. If you got that, I think you should hush about the issue, or send your card back and wait for Vega or Volta, or pony up the cash for a GTX 1080.
> I always say take a single faster card over two slower cards any day.


Yeah, too bad the single cards we used to pay $500-$550 for are now $1,300.


----------



## Joenc

Another $30 check for 1070 users with Micron memory coming... in about 3 years!

Hey MyNewRig, here's the answer: *6 - get a PS4 Pro!*

I was going to get a 1080 or 1070 about six weeks ago, and after reading about all the problems with both, I'm just going to wait a while and get a PS4 Pro...

Good luck; maybe Nvidia can update the BIOS on the Micron 1070 cards.


----------



## bigjdubb

A console is certainly not the answer to the question "Which video card should I buy?" If someone asked which computer to get, you wouldn't answer "an iPhone 7", would you?


----------



## saunupe1911

Quote:


> Originally Posted by *bigjdubb*
> 
> A console is certainly not the answer to the question "Which video card should I buy?" If someone asked which computer to get, you wouldn't answer "an iPhone 7", would you?


Bad analogy... because some folks would certainly say to grab an iPad or a Windows tablet.

On a side note, I'm sort of switching from console to PC because of the Xbox Play Anywhere feature. I'm sick of gaming at 720p, or 1080p/30fps, on the Xbox One. I only keep a console for Madden and NBA 2K17, so from now on, PS4 plus PC is how I will do my gaming. But it just makes no sense to buy a PS4 Pro with a new one dropping next year.


----------



## bigjdubb

Quote:


> Originally Posted by *saunupe1911*
> 
> Bad analogy... because some folks would certainly say to grab an iPad or a Windows tablet.


That's the point of it. The kind of person who would say that would also say "just get a console". Don't get me wrong, I'm not saying there is anything wrong with consoles/console gaming or tablets/iPads; it's just that they serve a different audience. If someone is asking about graphics cards, they are into PC gaming, and a console isn't an adequate solution.
Quote:


> Originally Posted by *saunupe1911*
> 
> On a side note, I'm sort of switching from console to PC because of the Xbox Play Anywhere feature. I'm sick of gaming at 720p, or 1080p/30fps, on the Xbox One. I only keep a console for Madden and NBA 2K17, so from now on, PS4 plus PC is how I will do my gaming. But it just makes no sense to buy a PS4 Pro with a new one dropping next year.


I don't think there is any confirmation or consensus that there will be a PS5 next year; that would be years earlier than previous generations. If the normal pattern were followed, the PS5 would arrive in 2019-2020.


----------



## criminal

Quote:


> Originally Posted by *MyNewRig*
> 
> What reviews are for then if not used to set expectation ranges before purchasing? a deceptive marketing vehicle? if what you say is actually applicable in the market then why not just drop the whole concept of reviews, just read a product's spec-sheet and then go buy it directly?
> 
> I am pretty sure you would personally never take your own advice, and i am positive that you, if put in the same situation would be furious about it, judging by your very active participation in overclock.net
> 
> It is very easy to give advice when not being in other people's shoes.


For your information, I have been in a similar situation in the past. Luckily for me, the card had a defect (the cooler) and I was able to get a replacement that happened to have better memory (honestly, it was luck of the draw). On another occasion, a card I got had horrible coil whine and wouldn't overclock at all (memory or core), so I returned it altogether and waited it out until new cards released. Those are your options as well, yet you keep beating a dead horse, thinking your card will suddenly overclock better if you whine enough. I can respect that you want a good overclocking card, but the way you are going about it is wrong. You have the option to get a card with Samsung memory (an FE), or to wait for Vega or Volta, but instead you are spinning this into a whole big conspiracy theory without legitimate proof.


----------



## outofmyheadyo

If you live in the US, why don't you just keep ordering cards from Amazon/Newegg until you get one with Samsung memory? I have the cheapest 1070 (the Phoenix GS) and even that has Samsung chips on it.


----------



## G woodlogger

If next year's cards are just next-gen Pascal with faster memory, some small changes, and Boost 4.0, a slower 1070 makes sense. It could just be economic optimization; then again, Intel supposedly has design rules for what size of laptop a given CPU can go in, so who knows. We just have to live with it.


----------



## HOODedDutchman

Couple more benchmarks I ran. Scaling is actually ridiculously good.


----------



## ITAngel

Man, it's frustrating that no company has released a water block for the Zotac GTX 1070 AMP Extreme yet.


----------



## HOODedDutchman

Both cards have Samsung memory. Also, I was informed that the card I grabbed today wasn't old stock; it just came in last week. I have a feeling Gigabyte made a bunch of these WF2 cards at the start and hasn't made any more, as I've seen them go out of stock at a few stores and never come back. Might be a good bet to try one of these; I mean, I'm 2 for 2 on Samsung memory, three months apart.


----------



## Exenth

I think I broke Fire Strike; look at these GPU core and memory clocks.

Also, my EVGA 1070 FTW doesn't do more than +100MHz on the core and +200MHz on the memory. A little disappointing, but still a great card.

http://www.3dmark.com/3dm/15108207?


----------



## Roland0101

Hi @"GamerSX"









After you already accused someone else of being me, I thought it was time to come in person. Let's see if I "can't pull that BS here."
Quote:


> Originally Posted by *MyNewRig*
> I am talking specifically about Micron's GDDR5, which has a maximum data rate of 8.0 Gb/s.
> 
> According to its datasheet found here https://www.micron.com/products/datasheets/65c410ee-af9c-4d6f-b35b-595ce11150c4
> 
> Micron GDDR5 operates at data rates of 6.0 Gb/s, 7.0 Gb/s, and 8.0 Gb/s (max)
> 
> With 8.0 Gb/s being the maximum this memory is rated for, that could explain why there is no headroom left: at stock settings the memory is already pushed to its limit.


Interesting, but let's take a look at the specifications of the Samsung memory used for the 1070.

From Samsungs website: "GDDR5 can achieve a data rate of up to 7 Gb/s per pin which translates to 32 GB/s for the complete memory chip. This is 2.7 times faster than GDDR3's 2.6 Gb/s per pin and 10.4 GB/s."

Source: http://www.samsung.com/semiconductor/products/dram/graphic-dram/








But don't panic yet, because Samsung is selling itself short here.
The memory the 1070 FE uses is the K4G80325FB-HC25 chip from Samsung, which is indeed an 8 Gb/s chip: in other words, a chip with exactly the same data-rate specification as the Micron memory you mentioned here.

Source:
http://wccftech.com/nvidia-pascal-gp104-gpu-leaked/
https://tech4gamers.com/nvidia-geforce-gtx-1080-gtx-1070-will-use-gddr5-memory/
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1070/4.html

Quote:


> Originally Posted by *MyNewRig*
> If the same Micron GDDR5 chips were used on a GPU rated at 7 Gb/s they would have been great: very stable, with amazing OC headroom left.
> 
> So please understand that the way these Micron GDDR5 modules are being used is bad and holds them to a much lower quality standard than Samsung's GDDR5, which appears to support up to 9 Gb/s and so has that great headroom and stability.


No, it doesn't. It supports the same data rate and is specified for a 2000MHz RAM clock.
Plus, there is overclocking headroom on the Micron RAM, something others and I have already proved to you.
It appears the Samsung memory can reach higher clocks at the same voltage than the Micron RAM can, but the Micron RAM can still achieve at least decent OC results if you lock the voltage.

And that is one more indicator that the RAM itself is not the problem.

Quote:


> Originally Posted by *MyNewRig*
> When you consider that the highest-binned GTX 1070 cards, like the MSI Gaming Z, EVGA FTW, Asus Strix OC, Zotac AMP! Extreme, and Gigabyte Xtreme, have all switched to Micron's lower-quality chips while using premium components everywhere else on the card, you realize this whole thing must be orchestrated by Nvidia as a direct design change to downgrade the GTX 1070.


Quote:


> Originally Posted by *MyNewRig*
> After thinking about it a bit more, it looks like Samsung's GDDR5 had more quality and headroom than Nvidia wanted, given where they position the GTX 1070 relative to the GTX 1080 and the rest of the lineup, so they downgraded the card to cap its memory performance for business reasons, but they can't publicly announce what they have done.


That was BS when you posted it the first time, and it is BS now.
The 1070 could never reach the 1080, with its Micron GDDR5X memory, no matter how much you overclock the card.

1070s with Samsung and Micron memory do not perform differently at stock clocks; they are both on the same, pretty impressive level.

And tell me again how business could be a reason?
Downgrading a card on purpose (which factually did not happen, see above), a card that would never reach a 1080 anyway, is a good business decision?

Plus, the evil Nvidia guys don't do this at release; they do it later, so that the overwhelming majority of people, who look at benchmarks to decide which card to buy, don't notice it.
Yes, that sounds like a good plan...


----------



## HOODedDutchman

Quote:


> Originally Posted by *Exenth*
> 
> I think I broke Fire Strike; look at these GPU core and memory clocks.
> 
> Also, my EVGA 1070 FTW doesn't do more than +100MHz on the core and +200MHz on the memory. A little disappointing, but still a great card.
> 
> http://www.3dmark.com/3dm/15108207?


It says your clocks are 2300-something. It's an FTW; it's probably close to its limit at stock.


----------



## ITAngel

Welcome, Roland0101! I have been following this whole thing and it has been an interesting, and for me quite informative, argument. Thanks for dropping by, and great post, by the way.


----------



## Roland0101

Quote:


> Originally Posted by *ITAngel*
> 
> Welcome, Roland0101! I have been following this whole thing and it has been an interesting, and for me quite informative, argument. Thanks for dropping by, and great post, by the way.


Thanks for the warm welcome ITAngel.


----------



## zipper17

Micron's memory has a power-saving memory-voltage feature, with 1.5V and 1.35V states.


While Samsung's memory seems not to:










That might be the culprit for why overclocking the VRAM is less stable on Micron.

http://www.samsung.com/us/samsungsemiconductor/pdfs/PSG_1H_2016.pdf
https://www.micron.com/resource-details/63c49896-86ea-4067-80e8-46635707233f

If you want faster VRAM, better to buy a card with GDDR5X, or wait for Samsung's 14-16 Gbps GDDR6 in 2018.


----------



## criminal

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Both cards are Samsung memory. Also I was informed the card I grabbed today wasn't the old stock but just came in last week. I have a feeling gigabyte made a bunch of these wf2 cards to start and haven't made anymore as I've seen them go out of stock at a few stores and never come back. Might be a good bet to try one of these. I mean I'm 2 for 2 3 months apart on the Samsung memory.


Good looking setup.








Quote:


> Originally Posted by *Roland0101*
> 
> That was BS when you posted it the first time, and it is BS now.
> The 1070 could never come close to the 1080, with its Micron GDDR5X memory, no matter how much you overclock the card.
> 
> 1070s with Samsung and Micron memory do not perform differently at stock clocks; they are both on the same, pretty impressive level.
> 
> And tell me again how business could be a reason?
> Downgrading a card on purpose (which factually did not happen, see above), a card that would never reach a 1080 anyway, is a good business decision?
> 
> Plus, the evil Nvidia guys don't do this at release; they do it later, so that the overwhelming majority of people, who look at benchmarks to decide which card to buy, don't notice it.
> Yes, that sounds like a good plan...


Welcome.

I believe the GTX 1070 is the first X70 card that couldn't overclock to match the stock performance of the X80 card of the same generation, and that's when using the Samsung memory. His whole conspiracy theory about that being the reason for the memory change is so freaking silly! The 1070 was never able to get close to the 1080's performance, so there was no need to "secretly" change to Micron memory to cripple it... lol

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1070/27.html
https://www.techpowerup.com/reviews/MSI/GTX_970_Gaming/30.html


----------



## criminal

derp


----------



## reflex75

Quote:


> Originally Posted by *zipper17*
> 
> Micron's memory has a power-saving memory-voltage feature, with 1.5V and 1.35V states.
> 
> 
> While Samsung's memory seems not to:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That might be the culprit for why overclocking the VRAM is less stable on Micron.
> 
> http://www.samsung.com/us/samsungsemiconductor/pdfs/PSG_1H_2016.pdf
> https://www.micron.com/resource-details/63c49896-86ea-4067-80e8-46635707233f
> 
> If you want faster VRAM, better to buy a card with GDDR5X, or wait for Samsung's 14-16 Gbps GDDR6 in 2018.


Very good find








Thank you!
Could you please share your feedback on the GeForce forum issue thread:
https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/13/


----------



## Roland0101

Quote:


> Originally Posted by *zipper17*
> 
> Micron's memory has a power-saving memory-voltage feature, with 1.5V and 1.35V states.
> http://www.overclock.net/content/type/61/id/2879973/
> 
> While Samsung's memory seems not to:
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/content/type/61/id/2879974/
> 
> that might be the culprit, why 'overclocking' the memory vram less stable on micron.
> 
> http://www.samsung.com/us/samsungsemiconductor/pdfs/PSG_1H_2016.pdf
> https://www.micron.com/resource-details/63c49896-86ea-4067-80e8-46635707233f
> 
> If you want faster memory vram, better buy card that has GDDR5X, or wait in 2018 with Samsung GDDR6 14-16 Gbps.


That is a really good point. It might very well be the culprit.


----------



## Roland0101

Quote:


> Originally Posted by *criminal*
> 
> Welcome.


Thanks!
Quote:


> I believe that the GTX 1070 is the first X70 card that couldn't overclock and match the stock performance of the X80 card of the same generation. And that's when using the Samsung memory. His whole conspiracy theory for that being the reason for the memory change is so freaking silly! The 1070 was never able to get close to the performance of the 1080, so there was no need to "secretly" change to Micron memory to cripple performance... lol
> 
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1070/27.html
> https://www.techpowerup.com/reviews/MSI/GTX_970_Gaming/30.html


Yes, as you know I agree.

(Thanks for the links)

I think the reason was either cost efficiency or this: https://www.techpowerup.com/reviews/AMD/RX_480/4.html
Meaning, Samsung could not provide enough RAM for everyone.


----------



## tps3443

Quote:


> Originally Posted by *MyNewRig*
> 
> Okay guys, now is decision time and i desperately need your help,
> 
> I have been closely monitoring and debating the Samsung vs. Micron GDDR5 memory issue for about two weeks now,
> 
> The conclusion is that, Nvidia designed the GTX 1070, its BIOS and Drivers based on Original specifications of Samsung GDDR5 8Ghz modules, the Samsung ICs appear to need low voltage/power to remain stable and can handle voltage fluctuations flawlessly.
> 
> Micron GDDR5 on the other hand, being a lower quality chip, is power hungry which needs constantly high voltage pumping to remain stable, so it can not operate at the same voltage/power envelope that was originally developed for Samsung GDDR5 in BIOS and Drivers.
> 
> The assumption was initially made that there is a BIOS/Driver bug preventing Micron GDDR5 from running stably, but what it turned out to be, because of the poor power properties of these Micron chips, they can not operate at the original voltage/power specifications and thus a fix was to pump as much power into them with the available tools, meaning voltage-locking and "prefer max performance" which are both not part of the standard specification.
> 
> The GTX 1070 has been significantly downgraded as the initial memory OC range showing in all early reviews (from 9000Mhz to 9750Mhz) is no longer valid and the new range is now 7600Mhz to 8800Mhz, meaning that the lowest Samsung OC is higher than the potential highest Micron OC.
> 
> *Now i need to know what to do with my Micron GTX 1070 and i need your suggestions:
> 
> 1- Return the damn thing, get my money back and wait for Nvidia to start making card with Samsung memory again if ever.
> 
> 2- Return the damn thing and get a Founder's Edition GTX 1070 with Samsung memory and live with the reference cooling and noise.
> 
> 3- Return the thing and get my money back and wait for AMD's VEGA.
> 
> 4- Return the thing and wait for Volta!
> 
> 5- Get a GTX 1060 or an RX 480 instead.
> 
> 6- Suck it and live with the new downgraded performance!
> 
> 7- get a PS4 Pro! LOL
> 
> *
> 
> Please advise ...


My last GTX 1070 had the Samsung memory, so it was fantastic while I had it, before I went to a GTX 1080.

But I love the FE cards! And boy, do they overclock! At least when you pay extra, you know you're getting OEM quality. I guess it's not for nothing after all.

My GTX 1080 is an FE. That thing is an overclocking beast!


----------



## tps3443

If anyone running a GTX 1070 with Micron memory wants more overclocking room, I would sell the card before it's a widely known problem and just grab an FE with Samsung memory. I ran my old GTX 1070 with Samsung memory at around 9,800 MHz, and some of you can't even hit 8,500? That's terrible.


----------



## MyNewRig

Quote:


> Originally Posted by *criminal*
> 
> I believe that the GTX 1070 is the first X70 card that couldn't overclock and match the stock performance of the X80 card of the same generation. And that's when using the Samsung memory. His whole conspiracy theory for that being the reason for the memory change is so freaking silly! The 1070 was never able to get close to the performance of the 1080, so there was no need to "secretly" change to Micron memory to cripple performance... lol
> 
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1070/27.html
> https://www.techpowerup.com/reviews/MSI/GTX_970_Gaming/30.html


LOL, I see you two are getting along quite well, which is pretty surprising to be honest. Being a fanboy who would defend even the farts of Nvidia until his last breath, he is not a very intelligent individual to converse with; he is a fake and a liar who twists facts to defend his master Nvidia. But you going along with this is pretty surprising, given how involved an OC.net member you are.

Anyway, that very chart you posted puts a well-OCed Samsung 1070 pretty darn close to stock 1080 performance: 133.7 FPS on an OCed 1070 vs. 137.9 FPS on an FE 1080. That is only a ~3% performance difference for cards with a 34% difference in asking price (going by EU market prices). That also happens to be the reason I bought a 1070 instead of a 1080: with Samsung memory and a decent OC, you get a GTX 1080-type experience.
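Just to make the math explicit, here's a quick sketch using the chart numbers above. The FPS figures are from the cited TechPowerUp review; the euro prices are illustrative placeholders I've picked to match the ~34% gap mentioned, not official quotes:

```python
# Performance-per-price comparison from the TechPowerUp chart cited above.
fps_1070_oc = 133.7   # well-OCed Samsung 1070
fps_1080_fe = 137.9   # stock FE 1080

perf_gap = fps_1080_fe / fps_1070_oc - 1
print(f"1080 FE is only {perf_gap:.1%} faster")

# Hypothetical EU street prices chosen to illustrate the ~34% gap above.
price_1070 = 480.0
price_1080 = 645.0
price_gap = price_1080 / price_1070 - 1
print(f"...for {price_gap:.1%} more money")
print(f"FPS per euro: 1070 = {fps_1070_oc / price_1070:.3f}, "
      f"1080 = {fps_1080_fe / price_1080:.3f}")
```

On those numbers the OCed 1070 comes out clearly ahead on FPS per euro, which is the whole value argument here.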



What makes this a conspiracy is not just the fact that they switched to Micron, it is the fact that they did that AFTER all review samples went out with Samsung memory producing chart smashing results, and the fact that they are being extremely secretive about this on purpose, even if asked about it directly they just refuse to talk or give any information.

If it were just "business as usual" as you imply, how hard would it be for them to say "sorry, we ran out of Samsung GDDR5 supply," or "these memory chips are good, but we need to fix the BIOS or drivers, which we will do soon," or anything AT ALL? But they are keeping us totally in the dark here intentionally. If that is not a conspiracy, or at least a very deceptive marketing and business practice, then I don't know what is. How come you don't see it this way? Please tell me; maybe I am missing something here...

The thing is that these choices you are suggesting are not so easy in practice, i had a 980 Ti which i sold to get that 1070, while both have an almost identical performance, i got the 1070 for efficiency reasons, the 980 Ti used to run hot and noisy (MSI Gaming) it went into the 80c+ and would make the entire system sound like a vacuum cleaner while gaming, and i did not like the experience.

The reason I got the ASUS Strix 1070 is that it does not exceed 70C under load, runs quiet, has great OC potential (according to reviews), has an 8GB framebuffer, and, on aesthetics, yes, it looks pretty cool with the RGB lighting and all.

Now if i get an FE 1070 i lose many of these benefits, i will be right back at the 980 Ti days, will run hot (83c according to reviews), be noisier, no RGB ... etc .. not something i am very excited about .. also getting an FE and putting a water block on it does not look possible, because my retailer is my warranty provider and they are uneasy about changing the cooler on the card, i care a lot for warranty and don't like voiding it.

Also waiting for VEGA or Volta means that i will have to stay for so long without a GPU which is also not an easy choice ..

Buying a 1080 is impossible, after forking $872 on the 980 Ti , i decided to never pay way too much for a GPU anymore .. so the 1070 price is the highest i am willing to pay ..

My "whining" as you call it is not like beating a dead horse as you put it, i am trying to raise awareness and get results here, and most importantly information, results in the form of a BIOS/Driver fix, later production starting to use Samsung GDDR5 again, or information that for example this will or will never be fixed so i can make an informed decision what to buy or what not to buy .. any of these outcomes will be a positive thing.

So all is good, and i hope you can see that the choices you are proposing are not that easy to implement, at least not with the current level of information NVIDIA and AIBs are making available, which is absolutely nothing besides "we will look into it" posted by an Nvidia rep. about a week ago ..


----------



## AngryLobster

I have 3 1070s all with Micron memory and don't give a damn whether they OC or not. Not only is the performance gain minuscule but the entire "overclocking" experience with Pascal is just an illusion.

I find the outrage here pretty pathetic.


----------



## MyNewRig

Quote:


> Originally Posted by *criminal*
> 
> His whole conspiracy theory for that being the reason for the memory change is so freaking silly! The 1070 was never able to get close to the performance of the 1080, so there was no need to "secretly" change to Micron memory to cripple performance... lol


Samsung GTX 1070 beating a GTX 1080 at only 66% of the cost.



Source: https://www.techpowerup.com/reviews/MSI/GTX_1070_Gaming_Z/29.html


----------



## HOODedDutchman

It's not a conspiracy, it's common practice. This has happened with every GPU from AMD and Nvidia since the beginning of time, usually because one supplier cannot provide enough. Hopefully a BIOS is issued through MSI, Gigabyte, EVGA, etc. to fix the power state issue. Other than that, I don't think there's any issue here.


----------



## HOODedDutchman

Quote:


> Originally Posted by *tps3443*
> 
> Anyone running a GTX1070, with Micron memory. That wants more Overclocking room. I would sell the cards before it's a worldwide known problem. And just grab a FE with Samsung memory. I ran my old gtx1070 with Samsung memory at like 9,800Mhz. And some of you guys cannot even hit 8500? That's terrible.


Many people with Samsung have gotten unlucky on the memory side of things as well. I've seen lots of Micron owners say they get between +400 and +600, which is a decent clock. I'd be happy with anything that does 8800 MHz or more; that's a 10% overclock.
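For reference, here's how those offsets map to effective clocks. This assumes (based on observed Afterburner behavior on Pascal GDDR5 cards, not on any vendor documentation) that the offset applies at the double-data-rate clock and therefore counts twice at the quad-pumped effective rate:

```python
# GDDR5 on the 1070 runs at 8000 MHz effective out of the box.
STOCK_EFFECTIVE_MHZ = 8000

def effective_clock(offset_mhz):
    """Effective memory clock for a given Afterburner-style offset.
    Assumes the offset counts double at the effective (quad-pumped)
    rate -- an assumption from observed behavior, not a documented spec."""
    return STOCK_EFFECTIVE_MHZ + 2 * offset_mhz

for offset in (400, 500, 600):
    eff = effective_clock(offset)
    pct = eff / STOCK_EFFECTIVE_MHZ - 1
    print(f"+{offset} offset -> {eff} MHz effective ({pct:.0%} overclock)")
```

Under that assumption, +400 lands exactly on the 8800 MHz / 10% figure above.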


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Its not a conspiracy it's common practice. This has happened with every gpu with AMD and Nvidia since the beginning of time. Usually because 1 company cannot supply enough. Hopefully a bios is issues through msi, gigabyte, evga, etc. to fix the power state issue. Other then that I don't think there's any issue here.


Two points here which i mentioned before, i feel i am starting to repeat myself a lot:

1- This is the first generation where GDDR5 is being pushed to its absolute limit of 8 Gb/s, and it appears that Micron's GDDR5 ICs are not taking it very well; Samsung GDDR5, Micron GDDR5X, or even HBM look much more suitable for this application.

2- If the power states were fixed in a BIOS update, it would make the situation a lot better; not perfect, but the Micron GTX 1070 would be a little less crippled.

But why is no one willing to talk about it? Why all the secrecy from Nvidia and the AIBs?

Again, I ask a very logical question and hope for an objective answer: if it is common practice, and the power state issue is as easy to fix in software as many of you imply, how hard would it be for Nvidia to share this information with us? How hard would it be for them to say that Samsung is not supplying enough GDDR5 to keep up with production needs, that they switched to Micron's GDDR5 which is equally good, and that they are working on a new BIOS/driver to resolve the power state issue?

If it were that simple, why all the secrecy, vagueness, and shadiness after more than a month of this issue first being raised? Please answer that...
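To put the "8 Gb/s absolute limit" in context: per-pin data rate times the 1070's 256-bit bus gives the card's total memory bandwidth. This is simple arithmetic on the public specs, nothing vendor-specific:

```python
# Memory bandwidth = per-pin data rate * bus width / 8 bits per byte.
GBPS_PER_PIN = 8      # GDDR5 at its 8 Gb/s ceiling, as discussed above
BUS_WIDTH_BITS = 256  # GTX 1070 memory bus

bandwidth_gbs = GBPS_PER_PIN * BUS_WIDTH_BITS / 8
print(f"{bandwidth_gbs:.0f} GB/s")
```

That works out to 256 GB/s, i.e. the memory is already running at the very top of what GDDR5 is rated for, with zero headroom in the spec itself.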


----------



## khanmein

How come a review sample from Sept 2016 received the Samsung VRAM version???

http://www.dragonblogger.com/evga-geforce-gtx-1070-ftw-acx-3-0-video-card/6/


----------



## khanmein

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Its not a conspiracy it's common practice. This has happened with every gpu with AMD and Nvidia since the beginning of time. Usually because 1 company cannot supply enough. Hopefully a bios is issues through msi, gigabyte, evga, etc. to fix the power state issue. Other then that I don't think there's any issue here.


Seriously, it's not just an overclocking issue. The default stock settings cause artifacts and other unknown bugs too.


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> how come review sample on Sept 2016 received Samsung VRAM version???
> 
> http://www.dragonblogger.com/evga-geforce-gtx-1070-ftw-acx-3-0-video-card/6/


Because if they started sending Micron GTX 1070s to reviewers, performance results and ratings would tank, and this issue would blow up, halting sales of GTX 1070s to unaware consumers.

And some people don't think that this is at least manipulative?!

When ASUS and MSI sent review samples with slightly OCed BIOSes, just a tiny +25 MHz on the core, people made a very big deal out of it; it blew up, and ASUS put out a press release about the issue. But when they send review samples with memory that can OC around +1000 MHz above the effective rate of the samples currently being sold, some people think this is okay?! How? I really don't get it at all!


----------



## Waleh

Quote:


> Originally Posted by *MyNewRig*
> 
> Because if they start sending Micron GTX 1070s to reviewers performance results and ratings will tank and this issue will blow up halting the sales of GTX 1070s by unaware consumers.
> 
> And some people don't think that this is at least manipulative?!
> 
> When ASUS and MSI sent review samples with slightly OCed BIOS, just a tiny +25Mhz on the core, people made a very big deal out of this and it blow up and ASUS makes a press release about the issue, but when they send review samples with memory that can OC around +1000Mhz above the effective rate of the samples currently being sold, some people think this is okay?!! how? i really don't get it at all!


Do the founders cards all use Samsung memory?


----------



## MyNewRig

Quote:


> Originally Posted by *Waleh*
> 
> Do the founders cards all use Samsung memory?


I am not 100% sure about that; some say they do. I am debating with myself about getting an EVGA Founder's Edition 1070, so if I end up getting one, and not just skipping Pascal altogether, I will let you know if it has Samsung.

I am just worried that soon after I buy that FE, Nvidia will realize that using Micron's GDDR5 and running it at 8 Gb/s was a terrible decision and switch back to Samsung on all AIB cards; I will feel really bad if that happens.


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> Because if they start sending Micron GTX 1070s to reviewers performance results and ratings will tank and this issue will blow up halting the sales of GTX 1070s by unaware consumers.
> 
> And some people don't think that this is at least manipulative?!
> 
> When ASUS and MSI sent review samples with slightly OCed BIOS, just a tiny +25Mhz on the core, people made a very big deal out of this and it blow up and ASUS makes a press release about the issue, but when they send review samples with memory that can OC around +1000Mhz above the effective rate of the samples currently being sold, some people think this is okay?!! how? i really don't get it at all!


Frankly speaking, starting with the 7xx series I started bashing LinusTechTips and JayzTwoCents, because the review samples are totally different from the consumer products. Perhaps I'm the only one concerned about this particular matter. They often declare themselves enthusiast content creators who are legit and professional, but then how come they waited almost half a year to announce the 3.5 GB VRAM fiasco?

Other tech reviewers are the same, so guys, stop believing them and do a little research before purchasing. Furthermore, they also like to say "this will be my daily driver." All bull-crap. I watch those reviewers for basic entertainment, like wrestling: "don't try this at home."


----------



## khanmein

Quote:


> Originally Posted by *Waleh*
> 
> Do the founders cards all use Samsung memory?


Most likely, and a better silicon lottery on the FE too. It depends on the batch as well.


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> most likely & better silicon lottery on FE too. depend batches as well.


Why do you say the FE has a better silicon lottery? Is it binned, or does it get cherry-picked GPUs? I'm feeling better and better about the FE every day now, and it looks like that's what I'll end up with...


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> Why do you say that FE has better silicon lottery? is it binned or have cherry picked GPUs? i am starting to feel better and better about FE everyday now and looks like this what i will end up with ..


Why do I say so? Based on observation: e.g. jokerslunt bought reference cards and they all overclock quite well, and the single 8-pin on the 1070/1080 FE doesn't hurt overclocking compared with custom cards, so common sense points to the silicon lottery.

Have you ever noticed that the early reviews of a first-release product always overclock like a beast?

For the FE this time around, I believe it's binned for consumers and cherry-picked for reviewers.

The Pascal architecture has a limitation on core voltage, and this time around there isn't a huge difference between reference and custom cards.

Looking at the price, the FE costs a premium for a reason: "there ain't no such thing as a free lunch!"

(This is all my own perspective, so no offense.)


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> y i said so? based on observation e.g. jokerslunt bought all the reference cards can really over-clock quite well & single 8-pin on 1070/1080 FE won't affect the over-clocking compare with custom cards so common sense is regarding silicon lottery.
> 
> have u ever noticed the early review for 1st release product always can over-clock like a beast?
> 
> for FE this time round i believe is binned for consumers & cherry picked for reviewers.
> 
> pascal architecture got limitation on the voltage core & this time round there's not huge different between reference & custom.
> 
> by looking at the price, FE cost premium for a reason. "there ain't no such thing as a free lunch!"
> 
> (everything is my own perspective view so no offense)


Interesting info. What about thermal throttling? I don't intend to install a water block on it, so will the reference cooler keep it above 2000 MHz under stress without having to run the fan at 100%? Also, what about cleaning? I'm in a dusty environment, and I'm not sure whether dust gets stuck inside that enclosed shroud, or how to clean it if it does.


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> Interesting info. what about temp throttling? i don't intend to install a water block on it so will the reference cooler be able to make it stay above 2000Mhz under stress and without having to run the fan at 100%? also how about cleaning? i have a dusty environment and not sure if dust gets stuck inside that vacuum chamber? and how to clean it if it happens?


FYI, if you want to hit 2 GHz stable, you need to set a custom fan curve and raise the fan speed, and put up with the noise; 3-4k RPM is pretty loud, but it gets decent temps. Please note that ambient temperature is very important.

A blower-type cooler is usually more suitable for a smaller/compact case with restricted airflow, and for SLI users.

Disassembling the FE is not that difficult if you have the right tools, and I suggest removing the bigger backplate, which dissipates some margin of heat; as a last resort, buy an Arctic Accelero Xtreme IV.

Obviously, dismantling it might void the warranty. To clean this type of blower or heatsink, try an automotive air compressor, or the cheapest way is to visit a nearby petrol/gas station that provides a tire air pump, set it for a flat tire, and blow the dust out. P.S. Remember to hold the fan so it doesn't spin while you blow it out.


----------



## HOODedDutchman

I don't think there's any evidence out there saying the FE is binned. Kind of ridiculous. I've not seen any reviews of an FE doing 2100 MHz or above, and I've seen many aftermarket cards do that.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> I don't think there's any evidence out there saying FE is binned. Kind of ridiculous. I've not seen any reviews of fe doing 2100mhz or above and I've seen many aftermarket cards do that.


I would be pretty happy with an FE doing 2050 MHz or so with 9200 MHz memory, without the system crashing, artifacting, and BSODing all the time. That would actually be a higher-performing, more stable, and higher-quality experience than the garbage I have now, provided I can tolerate the fan noise and it doesn't throttle down to 1900 MHz or so under stress.


----------



## khanmein

Quote:


> Originally Posted by *HOODedDutchman*
> 
> I don't think there's any evidence out there saying FE is binned. Kind of ridiculous. I've not seen any reviews of fe doing 2100mhz or above and I've seen many aftermarket cards do that.


I don't know why you need to hit 2.1 GHz; 2 GHz is more than enough, and neither the FE nor the custom vendors guarantee it anyway. Do you want a better benchmark score to brag about? You're not a reviewer, just a normal user who plays games. A reviewer's task is to push the card to its limit, end of story (well, some highlight the pros and cons, but not all).

Like I said, at the end of the day there's no evidence to prove it, but based on the reviewers you've watched, none of them pointed out any major issue.


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> I would be pretty happy with an FE doing 2050Mhz or so and 9200Mhz memory without system crashing, artifacts and BSOD all the time .. that will actually result in a higher performance, stable and quality experience than the garbage i have now, given that i can tolerate the fan noise, and that it does not throttle down to 1900Mhz or so under stress ..


Achieving 1.8-1.9 GHz is not too shabby at all. And regarding the Micron chips, it's not about the overclocking potential; there are issues even at the default stock settings. If not, why did MSI report a 100% return rate for the recent batches?


----------



## AngryLobster

Are you pulling numbers out of thin air or have a source? I have 3 MSI 1070s all micron equipped that all run flawlessly. All 3 purchased within the last 2 weeks.


----------



## khanmein

Quote:


> Originally Posted by *AngryLobster*
> 
> Are you pulling numbers out of thin air or have a source? I have 3 MSI 1070s all micron equipped that all run flawlessly. All 3 purchased within the last 2 weeks.


I expected this kind of answer. That's good for you, but others might not be as lucky as you. Try requesting that your favorite reviewer test a Micron GTX 1070 instead of a Samsung one; then we'd have a source for sure.


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> achieving 1.8~1.9k MHz is not too shabby at all & regarding the micron chip is not bout the potential over-clocking but even with the default stock also got issue, if not y MSI stated 100% return rate for the recent batches??


Yes, exactly. I just want a card with stable, quality components that I don't have to be nervous about all the time. Owning a Micron GTX 1070 right now is a risk: you never know what will happen with future driver updates or more demanding game releases when, as it stands, it is barely stable at stock settings... that damn card just makes me nervous.

I just checked a noise test video of the 1070 FE, and it sounds very acceptable at 70-80% fan speed while keeping the card at a cool 69-72C. Not bad at all; it actually sounds quieter than my Strix cooler at the same levels. The Strix starts to sound like a jet engine past 65% fan speed.

If the same cooler can cool the Titan XP with its 250W TDP, then it can surely cool the 1070 with its 150W TDP.

I get quality components and great craftsmanship, and with EVGA I can buy a 5-year extended warranty for just 20€, for an overall awesome package...


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> Yes exactly, i just want a card with stable quality components that i would not have to be nervous about all the time, owning a Micron GTX 1070 currently is a risk, you never know what will happen with future drivers updates or more demanding game releases if the way it is now it is barely stable at stock settings... that damn card just makes me nervous.
> 
> I just checked a noise test video on the 1070 FE and it sounds very acceptable at 70-80% fan speed, keeps the card cool 69-72c .. not bad at all, it actually sounds quieter than my Strix cooler at the same levels, Strix starts to sound like a Jet Engine after 65% fan
> 
> If the same cooler can cool the Titan XP with its 250W TDP, then it sure can cool the 1070 with 150W TDP
> 
> I get quality components, and great craftsmanship, and with EVGA i can buy 5 years extended warranty for just 20€ for an overall awesome package ...


Take note: if you purchase an EVGA FE, it is slightly overclocked, but I'm not sure about other brands. If bought directly from Nvidia, it will run at the defaults stated on the official website.


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> take noted if u purchase EVGA FE is slightly over-clocked but i'm not sure with other brand. if directly from NV it will be default like it stated on the official website.


Does not look to be any different from original specs:

http://www.evga.com/Products/Product.aspx?pn=08G-P4-6170-KR

EVGA GeForce GTX 1070 FOUNDERS EDITION 08G-P4-6170:

Base / Boost (Clock): 1506 /1683
Power Type: 8-Pin
Power Phase: 4+1
Max Power Draw: 150W

Exactly the same as: http://www.geforce.com/hardware/10series/geforce-gtx-1070

They should also list in the specs:

*Premium Memory: Samsung GDDR5 8 Gb/s*


At least i know that i am going to list that in my ebay description when i sell it next year


----------



## muzammil84

Quote:


> Originally Posted by *MyNewRig*
> 
> LOL, i see you two are getting along quite well which is pretty surprising to be honest, being a fanboy who would defend even the farts of NVIDIA until his last breath, he is not a very intelligent individual to converse with, he is a fake, and a liar who would twist facts to defend his master Nvidia, but you going along with this crap is pretty surprising given you being such an involved OC.net member
> 
> Anyways, that very chart you posted puts a well OCed Samsung 1070 pretty darn close to a stock 1080 performance, 133.7 FPS on an OCed 1070 vs. 137.9 FPS on FE 1080 , that is only a 3% performance difference for cards that have 34% asking pricing difference (according to EU market prices) , that happens to also be the reason why i bought a 1070 instead of a 1080, with Samsung memory and some decent OC, you get a GTX 1080 type experience.
> 
> 
> 
> What makes this a conspiracy is not just the fact that they switched to Micron, it is the fact that they did that AFTER all review samples went out with Samsung memory producing chart smashing results, and the fact that they are being extremely secretive about this on purpose, even if asked about it directly they just refuse to talk or give any information.
> 
> If it was just "business as usual" as you imply, how hard for them to say, sorry we ran out of Samsung GDDR5 supply, or these memory chips are good but we need to fix the BIOS or drivers which we will do soon. or anything AT ALL. but they are keeping us totally in the dark here intentionally, if that is not a conspiracy or at least very deceptive marketing and business practices then i don't know what is? how come you don't see it this way? please tell me, maybe i am missing something here...
> 
> The thing is that these choices you are suggesting are not so easy in practice, i had a 980 Ti which i sold to get that 1070, while both have an almost identical performance, i got the 1070 for efficiency reasons, the 980 Ti used to run hot and noisy (MSI Gaming) it went into the 80c+ and would make the entire system sound like a vacuum cleaner while gaming, and i did not like the experience.
> 
> The reason i got the ASUS Strix 1070 ... is the fact that it does not exceed 70c under load, runs quite, great OC potential (according to reviews), has 8GB framebuffer, aesthetics, yes it looks pretty cool with the RGB lighting and all ..
> 
> Now if i get an FE 1070 i lose many of these benefits, i will be right back at the 980 Ti days, will run hot (83c according to reviews), be noisier, no RGB ... etc .. not something i am very excited about .. also getting an FE and putting a water block on it does not look possible, because my retailer is my warranty provider and they are uneasy about changing the cooler on the card, i care a lot for warranty and don't like voiding it.
> 
> Also waiting for VEGA or Volta means that i will have to stay for so long without a GPU which is also not an easy choice ..
> 
> Buying a 1080 is impossible, after forking $872 on the 980 Ti , i decided to never pay way too much for a GPU anymore .. so the 1070 price is the highest i am willing to pay ..
> 
> My "whining" as you call it is not like beating a dead horse as you put it, i am trying to raise awareness and get results here, and most importantly information, results in the form of a BIOS/Driver fix, later production starting to use Samsung GDDR5 again, or information that for example this will or will never be fixed so i can make an informed decision what to buy or what not to buy .. any of these outcomes will be a positive thing.
> 
> So all is good, and i hope you can see that the choices you are proposing are not that easy to implement, at least not with the current level of information NVIDIA and AIBs are making available, which is absolutely nothing besides "we will look into it" posted by an Nvidia rep. about a week ago ..


I've been following this thread for a while now, and:

First of all, your ignorance and abuse towards other users is not acceptable. This is getting ridiculous and it should be reported.
Secondly, we are all tired of seeing 56 posts a day from you griping about the memory on 1070s and trying to reveal the biggest conspiracy theory of modern times.

Like others have said already, you get factory performance; that's what's guaranteed and within Nvidia specs. If you really are having such problems at stock speeds, RMA your card and get a replacement; if it's the same, repeat.

Overclocking memory IS NOT guaranteed by any manufacturer, and the reason we don't hear any info on that conspiracy theory is that no one at Nvidia gives a flying thing about a couple of crying kids...

Once, I got an i5 4690K which couldn't do 4.4 GHz stable (while two others did 4.6 easily). I think Intel did it on purpose to make me buy the more expensive i7 4790K, lol...


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> Does not look to be any different from original specs:
> 
> http://www.evga.com/Products/Product.aspx?pn=08G-P4-6170-KR
> 
> EVGA GeForce GTX 1070 FOUNDERS EDITION 08G-P4-6170:
> 
> Base / Boost (Clock): 1506 /1683
> Power Type: 8-Pin
> Power Phase: 4+1
> Max Power Draw: 150W
> 
> Exactly the same as: http://www.geforce.com/hardware/10series/geforce-gtx-1070
> 
> They should also list in the specs:
> 
> *Premium Memory: Samsung GDDR5 8 Gb/s*
> 
> At least i know that i am going to list that in my ebay description when i sell it next year


my mistake, that's GPU Boost 3.0: u might boost higher than the official specs, and that applies to other brands too.


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> my mistake, that's GPU Boost 3.0: u might boost higher than the official specs, and that applies to other brands too.


My understanding is that EVGA, ASUS, MSI etc. just get these cards pre-manufactured directly from NVIDIA; all they do is provide the retail box, logistics and post-sale customer support, so why would they be any different from what NVIDIA sells on its website?


----------



## F3niX69

Quote:


> Originally Posted by *muzammil84*
> 
> I've been following this thread for a while now and:
> 
> first of all, your ignorance and abuse towards other users are not acceptable, this is getting ridiculous and it should be reported
> secondly, we are all tired of seeing 56 posts a day from you moaning about memory on 1070s and trying to reveal the biggest conspiracy theory of the modern age.
> 
> Like others said already, you get factory performance, that's what's guaranteed and within nvidia specs. if you're really having such problems at stock speeds, RMA your card and get a replacement; if it's the same, repeat.
> 
> Overclocking memory IS NOT guaranteed by any manufacturer, and the reason we don't hear any info on that conspiracy theory is that no one at nVidia gives a flying thing about a couple of crying kids...
> 
> once, i got an i5 4690k which couldn't do 4.4 GHz stable (while two others did 4.6 easily). I think Intel did it on purpose to make me buy the more expensive i7 4790k lol...


Problem is we can go higher with Micron memory if we lock the voltage. My memory goes up to 8900 if i lock it; otherwise it only goes to 8400.
What we ask from nvidia or the AIB partners is a voltage regulation fix so we can achieve what we are capable of without limitations (by limitations i mean the high TDP usage at idle when voltage locking).
I don't have a problem with my Micron memory. 8900 is ok with me, even though it's lower than most Samsung. I just don't want to lock the voltage to achieve that clock.

Now people with Micron memory that are not stable at stock clocks should just go and return their cards instead of posting here, and not wait for a fix that may never come. Instability at stock is a different problem and you should RMA those cards now.
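For anyone confused by the 8400/8900 numbers, those are effective GDDR5 data rates. GDDR5 is quad-pumped, so the effective rate is 4x the real memory clock, and peak bandwidth follows from the 1070's 256-bit bus. A quick sketch (the clock figures are just the examples from this thread, not a spec):

```python
# GDDR5 is quad-pumped: effective data rate = 4 x real memory clock.
# Peak bandwidth = effective rate (MT/s) x bus width (bits) / 8.

BUS_WIDTH_BITS = 256  # GTX 1070 memory bus width

def effective_rate_mhz(real_clock_mhz: float) -> float:
    """Effective GDDR5 data rate from the real memory clock."""
    return real_clock_mhz * 4

def bandwidth_gbps(effective_mhz: float, bus_bits: int = BUS_WIDTH_BITS) -> float:
    """Peak memory bandwidth in GB/s."""
    return effective_mhz * 1e6 * bus_bits / 8 / 1e9

# Stock 1070: 2002 MHz real -> 8008 MT/s ("8 Gbps"), ~256 GB/s peak
print(effective_rate_mhz(2002))     # 8008
print(round(bandwidth_gbps(8008)))  # 256

# The 8400 / 8900 figures above correspond to real memory clocks of:
print(8400 / 4, 8900 / 4)           # 2100.0 2225.0
```

So a +400 offset in the usual tools means +400 MHz effective, i.e. only +100 MHz on the real clock.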


----------



## muzammil84

Quote:


> Originally Posted by *F3niX69*
> 
> Problem is we can go higher with Micron memory if we lock the voltage. My memory goes up to 8900 if i lock it; otherwise it only goes to 8400.
> What we ask from nvidia or the AIB partners is a voltage regulation fix so we can achieve what we are capable of without limitations (by limitations i mean the high TDP usage at idle when voltage locking).
> I don't have a problem with my Micron memory. 8900 is ok with me, even though it's lower than most Samsung. I just don't want to lock the voltage to achieve that clock.
> 
> Now people with Micron memory that are not stable at stock clocks should just go and return their cards instead of posting here, and not wait for a fix that may never come. Instability at stock is a different problem and you should RMA those cards now.


that's what i mean: instead of crying in every post and hoping for a miracle, just return the bloody card and the problem is solved.
it's ok to discuss problems, but not to go on and on about the same thing in every single post


----------



## Roland0101

Quote:


> Originally Posted by *khanmein*
> 
> seriously, it's not about an overclocking issue. the default stock settings cause artifacts & other unknown bugs too.


A card that is not running flawlessly at stock settings is a defective card.
You can and should RMA such a card ASAP.

From all I've read, only MSI has real problems with cards at stock settings, and they have already admitted that.


----------



## MyNewRig

Quote:


> Originally Posted by *F3niX69*
> 
> Problem is we can go higher with Micron memory if we lock the voltage. My memory goes up to 8900 if i lock it; otherwise it only goes to 8400.
> What we ask from nvidia or the AIB partners is a voltage regulation fix so we can achieve what we are capable of without limitations (by limitations i mean the high TDP usage at idle when voltage locking).
> I don't have a problem with my Micron memory. 8900 is ok with me, even though it's lower than most Samsung. I just don't want to lock the voltage to achieve that clock.
> 
> Now people with Micron memory that are not stable at stock clocks should just go and return their cards instead of posting here, and not wait for a fix that may never come. Instability at stock is a different problem and you should RMA those cards now.


When the GTX 970 3.5GB + 0.5GB issue was first observed, it started just like this one: a few people here and there talking about it, and others, like some of our friends here, getting irritated, telling them how perfect and flawless their 970s were and asking them to also shut up and RMA.

Nvidia remained silent about it for months and did not want to admit any wrongdoing until much later.

Would an RMA have helped any of the early complainers get a less stuttery GTX 970 at the time? no...

Why is this one different? because some people are running fine at stock settings? An RMA is a very inconvenient avenue that involves a lot of time and hassle. Without full disclosure from NVIDIA as to the extent and severity of the issue and the potential for a fix, an RMA is not a suitable solution. That disclosure will come; NVIDIA is just being a bit stubborn because the competition is not putting any pressure on them yet, but it will come, maybe with some patience.


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> My understanding is that EVGA, ASUS, MSI .. etc just get these cards pre-manufactured directly from NVIDIA, all they do is provide the retail box, logistics and post-sale customer support, why would they be any different from what NVIDIA sells on its website?


yeah all is the same.


----------



## MyNewRig

Quote:


> Originally Posted by *muzammil84*
> 
> that's what i mean: instead of crying in every post and hoping for a miracle, just return the bloody card and the problem is solved.
> it's ok to discuss problems, but not to go on and on about the same thing in every single post


The only consolation i can offer for the irritation i have caused you is a promise that if and when i get my original GTX 1070 Founders Edition with Samsung GDDR5 memory, i will push the hell out of it to the limit and bombard you with benches and all the good stuff that makes you happy, so cheer up buddy


----------



## F3niX69

Quote:


> Originally Posted by *MyNewRig*
> 
> When the GTX 970 3.5GB + 0.5GB issue was first observed it started just like this one, a few people here and there talking about it, others like some of our friends here getting irritated by them and telling them how perfect and flawless their 970s were and asked them to also shut up and RMA,
> 
> Nvidia remained silent about it for months and did not want to acknowledge they did anything wrong until after 2 or 3 years later.
> 
> Would an RMA have helped someone of the early complainers get a less stuttery GTX 970 at the time? no...
> 
> Why is this one different? because some people are running fine at stock settings? RMA is a very inconvenient venue to use and involves a lot of time and hassle, without full disclosure from NVIDIA as to the extent and severity of the issue and the potential of a possible fix, RMA is not a suitable solution before we get such disclosure, it will come, but NVIDIA is just being a bit stubborn because the competition is not putting any pressures on them just yet, but it will come, maybe with some patience.


That 970 fiasco is a different story, involving false advertising and other things.
May i ask if you have instability at stock settings?


----------



## MyNewRig

Quote:


> Originally Posted by *F3niX69*
> 
> That 970 fiasco is a different story, involving false advertising and other things.
> May i ask if you have instability at stock settings?


YES. With driver 372.70 i was getting artifacts in rotTR at stock settings. With 372.90 i *thought* i was stable at +250 (8500 MHz effective) until i found artifacts; i lowered to +200 and thought that was stable. I played rotTR for an hour and a half yesterday: the first hour went well, then flashing artifacts started appearing, and 30 minutes later, the second i clicked exit to main menu, checkerboard artifacts appeared on a pink background, the system locked up, BSOD, memory dump and restart ...

And BTW this is my 2nd Micron card with the same crap. What is the point of an RMA without more information from Nvidia? get a 3rd Micron card with the same issue and RMA again and again?

Do you know how much time it takes to process an RMA and how long you wait for a replacement? i am not working as a beta tester for NVIDIA to turn my house into an RMA center; they need to provide full information about this before any further action is taken









EDIT: this is not different from the GTX 970; there is a lot of deceptive marketing involved, with all the review samples that were sent out being Samsung, and there is an underlying quality issue with running these Micron GDDR5 chips at 8 Gb/s that is affecting all users to different degrees and extents. wait and see how bad this will blow up ..


----------



## khanmein

Quote:


> Originally Posted by *F3niX69*
> 
> That 970 fiasco is a different story, involving false advertising and other things.
> May i ask if you have instability at stock settings?


this is another case of clever false advertising, but this time round NV is smart enough to avoid another legal battle. the 970-style VRAM games will continue on Volta for sure, even trickier than before.


----------



## muzammil84

Quote:


> Originally Posted by *MyNewRig*
> 
> YES, with Driver 372.70 i was getting artifacts in rotTR at stock settings, with 372.90 i *thought* that i was stable at +250 (8500Mhz effective) until i found artifacts, i lowered to +200 and thought that was stable, played rotTR for an hour and a half yesterday, first hour went good, then flashing artifacts started appearing, 30 minutes later i exited to main menu, the second i clicked on exit to main menu, checkerboard artifacts on a pink background appeared, system locked up, BSOD memory dump and restart ...
> 
> And BTW this is my 2nd Micron card with the same crap, what is the point of RMA without more information from Nvidia? get a 3rd Micron card with the same and RMA again and again?
> 
> Do you know how much time it takes to process an RMA and how long you wait to get a replacement? i am not working as a beta tester for NVIDIA to turn my house into an RMA center, they need to provide full information about this before any further action is taken
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: this is not different from the GTX 970; there is a lot of deceptive marketing involved, with all the review samples that were sent out being Samsung, and there is an underlying quality issue with running these Micron GDDR5 chips at 8 Gb/s that is affecting all users to different degrees and extents. wait and see how bad this will blow up ..


may i ask where you live?

my intention wasn't to argue or make you angry. i was trying to point out what others already have: instead of looking for a solution here you should waste no more time and go for an RMA; i doubt any fix is coming.

like i said a few days ago, if you can buy from the UK, get the inno3d iChill. they are not very popular cards so I'm pretty sure overclockers.co.uk still have the same batch as i got mine (it wasn't that long ago): great temperatures and noise levels, and Samsung memory. Good luck mate and time for some action


----------



## MyNewRig

Quote:


> Originally Posted by *muzammil84*
> 
> may i ask you where do you live?
> 
> my intention wasn't to argue or make you angry. i was trying to point what already others have, instead of looking for a solution here you should waste no more time and go for RMA, i doubt if any fix is coming.
> 
> like i said few days ago, if you can buy from UK, get inno3d iChill, they are not very popular cards so I'm pretty sure overclockers.co.uk still have the same batch as i got mine(it wasn't that long ago), great temperatures and noise level and Samsung memory. Good luck mate and time for some action


no harm done buddy, i am in the EU as well (not the UK, you are even exiting us now so you are a de facto non-EU state). if you see a few posts above, i am debating what to get as a replacement, so you see, i am working on it, not just complaining and doing nothing. my retailer, who is providing the replacement, does not have the inno3d iChill in stock, so one of my options now is the EVGA FE .. still looking at all possible options, and really looking forward to a response from Nvidia giving full disclosure about the issue so we know what is really going on before i take any further action. i am going to continue to raise awareness and pressure them for a response until they give us more information ... they are being so secretive about it, but nothing can stand in the face of angry and disappointed customers; they have to give in before AMD releases VEGA anyways


----------



## khanmein

Quote:


> Originally Posted by *muzammil84*
> 
> may i ask you where do you live?
> 
> my intention wasn't to argue or make you angry. i was trying to point what already others have, instead of looking for a solution here you should waste no more time and go for RMA, i doubt if any fix is coming.
> 
> like i said few days ago, if you can buy from UK, get inno3d iChill, they are not very popular cards so I'm pretty sure overclockers.co.uk still have the same batch as i got mine(it wasn't that long ago), great temperatures and noise level and Samsung memory. Good luck mate and time for some action


RMA'ing to another brand is not allowed here. the inno3d 980ti had a high return rate, but maybe this time round they've improved.

e.g. Bryan from Tech City RMA'd one & received a better version, but ended up selling it off too & sticking with his GIGA G1 Gaming GTX 970.


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> no harm done buddy, i am in the EU as well (not the UK, you are even exiting us now so you are a de facto non-EU state). if you see a few posts above, i am debating what to get as a replacement, so you see, i am working on it, not just complaining and doing nothing. my retailer, who is providing the replacement, does not have the inno3d iChill in stock, so one of my options now is the EVGA FE .. still looking at all possible options, and really looking forward to a response from Nvidia giving full disclosure about the issue so we know what is really going on before i take any further action. i am going to continue to raise awareness and pressure them for a response until they give us more information ... they are being so secretive about it, but nothing can stand in the face of angry and disappointed customers; they have to give in before AMD releases VEGA anyways


grab the EVGA FE without any hassle cos they've got the better service. less headache.. in my country you're not allowed to RMA to another brand.


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> RMA other brand is not allowed. inno3d 980ti got a high return rate but maybe this time round they improved.
> 
> e.g. techcity bryan RMA & received better version but ended up sold it off too & stick with his GIGA G1 gaming GTX 970.


No, in the EU in general, and with my retailer in particular, i can pick whatever brand and model i like as long as it is a GTX 1070, so it's not an issue for me; probably an issue in other regions of the world


----------



## F3niX69

Quote:


> Originally Posted by *MyNewRig*
> 
> YES, with Driver 372.70 i was getting artifacts in rotTR at stock settings, with 372.90 i *thought* that i was stable at +250 (8500Mhz effective) until i found artifacts, i lowered to +200 and thought that was stable, played rotTR for an hour and a half yesterday, first hour went good, then flashing artifacts started appearing, 30 minutes later i exited to main menu, the second i clicked on exit to main menu, checkerboard artifacts on a pink background appeared, system locked up, BSOD memory dump and restart ...
> 
> And BTW this is my 2nd Micron card with the same crap, what is the point of RMA without more information from Nvidia? get a 3rd Micron card with the same and RMA again and again?
> 
> Do you know how much time it takes to process an RMA and how long you wait to get a replacement? i am not working as a beta tester for NVIDIA to turn my house into an RMA center, they need to provide full information about this before any further action is taken
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: this is not different from the GTX 970; there is a lot of deceptive marketing involved, with all the review samples that were sent out being Samsung, and there is an underlying quality issue with running these Micron GDDR5 chips at 8 Gb/s that is affecting all users to different degrees and extents. wait and see how bad this will blow up ..


Instability at stock should always be a reason to return the cards. And if you don't mind the nvidia cooler, just get a FE, as they have Samsung memory and, as it seems, OC higher on the core clock than AIB cards.
I know an RMA takes a while, but if you can't run at stock settings there is not much else you can do, except lock the voltage. Maybe the 2nd Micron card you got was from the same batch, hence the same problems at stock??

Also, about the 970, you have to look at it from a business standpoint. Nvidia is only able to guarantee you a GTX 1070 with 8GB running at 8000 MHz and nothing more.
The 970 wasn't able to run all 4GB at the promised settings, and that's why it went to court.
The Micron 1070s with instability at stock are a different case and should be treated as such, because not all Micron cards have problems at stock.
Just return it and get a FE and save yourself a headache, while i wait here for a voltage fix to run my Micron VRAM at max speeds


----------



## JackCY

Quote:


> Originally Posted by *MyNewRig*
> 
> No in the EU in general and with my retailer in particular i can pick whatever brand and model i like as long as it is a GTX 1070, so not an issue for me, probably an issue in other regions of the world


Yeah, 14 days to return. So you return it and buy a different one. Sometimes retailers go further and offer you a swap even without an extra charge, but that's rarer I think; small retailers that don't mind a loss. I've returned two Seasonic PSUs for coil whine and got a higher-wattage EVGA G2 without paying any extra; it was the lowest wattage the retailer had for the G2.

Such swaps are possible; it's all about the retailer and the laws. All in all, in the EU you can return within 14 days if you ordered via a distance method (online, phone, ...), get your money back and buy something else.

The 970 debacle is not comparable; they messed up the marketing on that one, promising more than the HW actually had.
As for the 1070: manufacturers always change suppliers for parts, since products are manufactured for many months or even years. As long as it runs at the specs they advertise/market there is no issue; you're only disappointed that you didn't buy the first samples, which are most of the time better than what is released later on.

IMHO Samsung can't keep up with demand, and as such their VRAM was always more expensive and put into the products of whoever was willing to pay the most for it.

If you get a Micron 1070 that doesn't run a stock 8GHz without artifacts, crashes or other issues, then go return it or RMA it to get your money back, stating that the newly used Micron VRAM chips do not meet the advertised specifications. It is up to the AIBs to deal with it; they're the ones putting the Micron chips on the cards to lower costs and/or keep up with demand.


----------



## MyNewRig

Quote:


> Originally Posted by *JackCY*
> 
> 1070, they always change suppliers for parts as products are manufactured for many months even years, as long as it runs with the specs they advertise/market there is no issue only you disappointed that you didn't buy the first samples that are most of the time better than what is released later on.
> 
> IMHO Samsung can't keep up the demand and as such their VRAM was always more expensive and put into products of who ever was willing to pay the most for them.
> 
> If you get a Micron 1070 that doesn't run stock 8GHz without artifacts or crashes or other issues at stock settings, then go return it or RMA it to get your money back with stating that the newly used Micron VRAM chips do not meet the advertised specifications and you want your money back. It is up to the AIBs to deal with it, it's them who is putting the Micron chips on the cards to lower costs and or keep up with demand.


Great, all that info has been communicated to me countless times and i fully understand it now, but there are a few remarks/observations that no one seems able to answer successfully until now:

1- If it is that simple and straightforward, why all the secrecy and vagueness from NVIDIA and the AIBs surrounding the issue? i contacted every single AIB, and there is a whole thread on GeForce.com that has been on the first page for 3 weeks now, nearing 11,000 views and 200 replies. Despite all that effort to obtain information, no one wants to talk, aside from Nvidia, who said one week ago "we will look into it". why can't they explain this as simply and easily as you just did?

2- This is the first time that Micron GDDR5 has been pushed to its limit of 8 Gb/s, and it looks like it is unable to comfortably and stably provide that data rate; this is different from any other generation.

3- How much more expensive are Samsung's GDDR5 modules compared to Micron's? is it 20, 30 or 100% more expensive? and what percentage of the product's overall price comes from the RAM? let's say Samsung would add $20 or $30 to the cost over Micron. okay, there are cards like the FTW, GB Xtreme, Zotac Extreme, ASUS OC and MSI Z that are selling way beyond MSRP and already using premium components all over; why did they decide to cheap out on memory specifically on these cards when the customer is already paying a huge premium? why not save on the power phases, chokes, heatsink, PCB or connectors, why memory specifically?

4- If, like you and many others say, Nvidia is not involved in all this, how come ALL AIBs without exception made the switch to Micron around August, without any coordinated mass change of specification from Nvidia? if what you are saying explains the situation, then some AIBs would switch memory suppliers on SOME of their cards, say the cheaper ones, but all AIBs and ALL cards, ALL in August, is that a coincidence?

5- If there is a shortage or a pricing issue, how come all other 8 Gb/s cards like the GTX 1060 and the RX 480 are using Samsung GDDR5 8 Gb/s modules until today? wouldn't a shortage affect everybody?

The answer to these questions is key to understanding what is really going on. can you please answer all or some of them in light of your knowledge of how the market works?


----------



## HOODedDutchman

Quote:


> Originally Posted by *khanmein*
> 
> achieving 1.8~1.9k MHz is not too shabby at all & the micron chip issue is not about overclocking potential; there are issues even at default stock settings, otherwise why would MSI state a 100% return rate for the recent batches??


Any aftermarket card will do over 1800.. Usually over 1900 bone stock.


----------



## khanmein

Quote:


> Originally Posted by *Roland0101*
> 
> Read your post again, maybe you can see it.
> 
> You are lying about that all the time, as I have proved to you more than once. http://www.3dmark.com/fs/10258734


i know the benchmark score is legit, but are u sure there's no stuttering or artifacts at all? no other issues spotted?? i'm really curious..


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> YES, with Driver 372.70 i was getting artifacts in rotTR at stock settings, with 372.90 i *thought* that i was stable at +250 (8500Mhz effective) until i found artifacts, i lowered to +200 and thought that was stable, played rotTR for an hour and a half yesterday, first hour went good, then flashing artifacts started appearing, 30 minutes later i exited to main menu, the second i clicked on exit to main menu, checkerboard artifacts on a pink background appeared, system locked up, BSOD memory dump and restart ...
> 
> And BTW this is my 2nd Micron card with the same crap, what is the point of RMA without more information from Nvidia? get a 3rd Micron card with the same and RMA again and again?
> 
> Do you know how much time it takes to process an RMA and how long you wait to get a replacement? i am not working as a beta tester for NVIDIA to turn my house into an RMA center, they need to provide full information about this before any further action is taken
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: this is not different from the GTX 970; there is a lot of deceptive marketing involved, with all the review samples that were sent out being Samsung, and there is an underlying quality issue with running these Micron GDDR5 chips at 8 Gb/s that is affecting all users to different degrees and extents. wait and see how bad this will blow up ..


Dude is ur card stable at stock or not. Every post u talk about instability you are talking about overclocking at the same time. If ur card isn't stable at stock then why r u trying to overclock in rotr. If it IS stable at stock then this is all just ridiculous.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Dude is ur card stable at stock or not. Every post u talk about instability you are talking about overclocking at the same time. If ur card isn't stable at stock then why r u trying to overclock in rotr. If it IS stable at stock then this is all just ridiculous.


Welcome to the world of the Micron GTX 1070: sometimes it is stable and sometimes it is not, with some drivers it is stable and with others it is not. in general the memory is not stable at stock settings, it just barely works.


----------



## aliquis

MyNewRig, you have made your point more than clear.

Seriously, act like a mature consumer and vote with your money: if you dislike nvidia for this, then suck it up and don't buy their products.

It's well known that manufacturers use components from different suppliers in the same product to meet high demand, and that this can result in different performance (remember the iPhone SoCs that were manufactured at both TSMC and Samsung; the early PS4 revisions had blowers from 2 different suppliers, one very loud and one decent; Gigabyte GTX 970 cards had fans from different suppliers resulting in different cooling/performance, and you couldn't regulate some of them down to lower RPM regions; the list goes on...)

You simply fail to get what many others already posted: you are promised the advertised/stock specs, anything else is a bonus. Now if your card has problems at stock, that is a reason to complain/RMA, but that seems to be the case for a minority of users.


----------



## mrtbahgs

Sorry if it has been discussed recently, every time I check this thread I have 3+ pages of unread very long posts that I just pass over because it seems to be repetitive chatter about memory brands or something.

I am wondering a few things regarding voltage increases for overclocking:
1) Will increased voltage help with memory overclocks or just core?
(For example, say +600 is stable, but +700 shows an artifact or 2. Can voltage potentially make +700 stable?)

2) I am using PrecisionXOC, is the voltage slider adjusting how much you increase or how much is used overall?
(For example, does 40% voltage mean it will actually be undervolted or that it is adding 40% of some unknown amount?
I assume 100% voltage means it is running at max voltage, but it is hard to understand something in between 0 and 100.)

3) If i decide to touch the voltage slider to try and get one more step up stable, should I just go straight to 100% or actually play with it in between?

4a) What is the current max voltage on these cards?
4b) I think mine usually runs at 1.05v or so, and if the bump is only a few hundredths I find it hard to believe it will do much either for stability or for increased temps, so I don't see a negative in just slamming it to the max. Am I wrong?

Thank you for anyone that can take the time to answer these.
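(For question 1, the way people here usually find their ceiling is a coarse climb followed by a bisect between the last good and first bad offset. A toy sketch of just that search logic; `is_stable` is a stand-in for an actual test run, e.g. looping Heaven and watching for artifacts, not a real stability tester:)

```python
# Toy sketch of the "walk up, then bisect" search for the highest
# artifact-free memory offset. is_stable() is a placeholder for a real
# check (run a benchmark, look for artifacts/crashes); only the search
# logic is shown here.

def find_max_stable_offset(is_stable, step=100, limit=800):
    # Coarse pass: climb in big steps until the first failure.
    offset = 0
    while offset + step <= limit and is_stable(offset + step):
        offset += step
    lo, hi = offset, min(offset + step, limit)
    # Fine pass: bisect between last-good and first-bad.
    while hi - lo > 12:  # ~12 MHz granularity is plenty here
        mid = (lo + hi) // 2
        if is_stable(mid):
            lo = mid
        else:
            hi = mid
    return lo

# Pretend this card artifacts above +460:
result = find_max_stable_offset(lambda off: off <= 460)
print(result)  # 450
```

One caveat from the reviews quoted earlier in the thread: some cards have a "memory hole" where a higher offset still runs but performs worse (error correction eating the gains), so a pure pass/fail search can overshoot; sanity-check the winner with an FPS score too.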


----------



## criminal

Quote:


> Originally Posted by *mrtbahgs*
> 
> Sorry if it has been discussed recently, every time I check this thread I have 3+ pages of unread very long posts that I just pass over because it seems to be repetitive chatter about memory brands or something.
> 
> I am wondering a few things regarding voltage increases for overclocking:
> 1) Will increased voltage help with memory overclocks or just core?
> (For example, say +600 is stable, but +700 shows an artifact or 2. Can voltage potentially make +700 stable?)
> 
> 2) I am using PrecisionXOC, is the voltage slider adjusting how much you increase or how much is used overall?
> (For example, does 40% voltage mean it will actually be undervolted or that it is adding 40% of some unknown amount?
> I assume 100% voltage means it is running at max voltage, but it is hard to understand something in between 0 and 100.)
> 
> 3) If i decide to touch the voltage slider to try and get one more step up stable, should I just go straight to 100% or actually play with it in between?
> 
> 4a) What is the current max voltage on these cards?
> 4b)I think mine usually runs at 1.05v or so and if the bump is only a few hundredths I find it hard to believe that it will really do much both for stability as well as increased temps so I suppose I don't see a negative in just slamming it to max out, am I wrong?
> 
> Thank you for anyone that can take the time to answer these.


1.) The voltage slider pertains to core voltage, but some here claim it helps with memory clocks as well.
2.) The slider adds a percentage of the small extra overvoltage range the BIOS allows, not a percentage of stock voltage, so 40% means 40% of that extra headroom.
3.) You can play with it and see what works best for your card, but for my card stock voltage has given me the best overclock.
4.) I think 1.093v is the max voltage.
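Worth spelling out in numbers: on Pascal the slider is generally understood to dial in a fraction of the small extra headroom the BIOS permits, not a fraction of stock voltage. A rough sketch using the ~1.05v stock and 1.093v cap figures from this thread (card-dependent examples, not a spec):

```python
# Hedged sketch of the usual reading of the Pascal overvoltage slider:
# the percentage maps onto the extra range between peak stock voltage
# and the BIOS cap. The 1.050 V / 1.093 V figures are examples taken
# from this thread and vary per card.

STOCK_V = 1.050   # typical peak stock voltage reported here
MAX_V   = 1.093   # reported hard cap

def slider_to_volts(percent: float, stock=STOCK_V, vmax=MAX_V) -> float:
    """Approximate resulting core voltage for a given slider setting."""
    return stock + (vmax - stock) * percent / 100

print(round(slider_to_volts(0), 3))    # 1.05
print(round(slider_to_volts(40), 3))   # 1.067
print(round(slider_to_volts(100), 3))  # 1.093
```

So even 100% is only ~0.04v over stock, which is why the slider rarely transforms an overclock.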


----------



## boostnek9

Can we talk about video cards instead of feelings for a minute?

that'd be great. Thanks.


----------



## Roland0101

Quote:


> Originally Posted by *khanmein*
> 
> i know the benchmark score is legit, but are u sure there's no stuttering or artifacts at all? no other issues spotted?? i'm really curious..


No, no artifacts, no stuttering, and the result you can see. It's stable.

I couldn't get over a 2088 MHz real memory clock (8352 MHz effective) in 3DMark without setting 3DMark to "prefer maximum performance", while i could get ROTTR to a 2202 MHz real memory clock (8808 MHz effective) without doing that.

And that is imho a pretty clear indicator that the RAM itself isn't the problem; the insufficient voltage control is the culprit.


----------



## jlhawn

Wow, What is happening to the GTX 1070 owners club? this is getting stupid with all the arguing over memory.


----------



## mrtbahgs

Quote:


> Originally Posted by *criminal*
> 
> 1.) The voltage slider pertains to core voltage, but some here claim it helps with memory clocks as well.
> 2.) Adding any voltage with the voltage slider is an increase over stock. So 40% would be 40% increase over stock voltage.
> 3.) You can play with it and see what works best for your card, but for my card stock voltage has given me the best overclock.
> 4.) I think 1.093v is the max voltage.


Awesome, thank you!

Are you saying that stock voltage actually gave you better OC stability than raising it?
I could see raising it either helping a bit or doing nothing, but can it also potentially hurt stability?

1.093v does sound familiar, so I'd be raising mine by .04v at most, which at least to me doesn't sound like much.
Assuming it gives me the little boost I need to make one step higher remain stable, will that increase in voltage have any degrading or other negative effects on the card?
I am thinking it shouldn't be noticeable other than maybe a degree of temperature, but I want to be safe since it's a new card and I plan to use it for 4 years.
BIOS modding and larger overvolting I could see being a potential threat, but I am not looking at doing extreme tweaks.


----------



## bigjdubb

Quote:


> Originally Posted by *mrtbahgs*
> 
> Awesome, thank you!
> 
> Are you saying that stock voltage actually did better for your OC stability than raising it?
> I could see raising it either help a bit or not do anything, but it can also potentially hurt stability?
> 
> 1.093v does sound familiar, so I'd be raising mine by .04v at the most which at least to me doesn't sound like much.
> Assuming it gives me that little boost i needed to make one step higher remain stable, will that increase in voltage have any degrading or other negative effects on the card?
> I am thinking it shouldn't be noticeable other than maybe a degree of temperature, but I want to be safe since its a new card and I plan to use it for 4 years.
> Bios modding and larger overvolting I could see being a potential threat, but I am not looking at doing extreme tweaks.


Too much voltage can hurt your overclock potential and stability, especially if you are air cooled. My card achieves the same clocks at 1.062 and 1.093 (max) so I run it at 1.062. No need to add more juice than required.

Quote:


> Originally Posted by *jlhawn*
> 
> Wow, What is happening to the GTX 1070 owners club? this is getting stupid with all the arguing over memory.


If you block 2-3 people it cleans up the thread pretty good.


----------



## Roland0101

Quote:


> Originally Posted by *bigjdubb*
> 
> Too much voltage can hurt your overclock potential and stability, especially if you are air cooled. My card achieves the same clocks at 1.062 and 1.093 (max) so I run it at 1.062. No need to add more juice than required.


I also encountered that behavior. I can run 3DMark at stock voltage settings with a +90 core clock offset (on top of the already higher clocks of my STRIX OC), but if I set the card to 1.093v it crashes.
Sometimes less is better.


----------



## jamor

The problem is that Samsung Memory was forged in the fires of Mount Doom and hand picked by Sauron himself while the Micron Memory was made in China.


----------



## xg4m3

Quote:


> Originally Posted by *jamor*
> 
> The problem is that Samsung Memory was forged in the fires of Mount Doom and hand picked by Sauron himself while the Micron Memory was made in China.


Ok. You just won the internet with that comment. Made my day haha


----------



## criminal

Quote:


> Originally Posted by *mrtbahgs*
> 
> Awesome, thank you!
> 
> Are you saying that stock voltage actually did better for your OC stability than raising it?
> I could see raising it either help a bit or not do anything, but it can also potentially hurt stability?
> 
> 1.093v does sound familiar, so I'd be raising mine by .04v at the most which at least to me doesn't sound like much.
> Assuming it gives me that little boost i needed to make one step higher remain stable, will that increase in voltage have any degrading or other negative effects on the card?
> I am thinking it shouldn't be noticeable other than maybe a degree of temperature, but I want to be safe since its a new card and I plan to use it for 4 years.
> Bios modding and larger overvolting I could see being a potential threat, but I am not looking at doing extreme tweaks.


What bigjdubb said below.
Quote:


> Originally Posted by *bigjdubb*
> 
> Too much voltage can hurt your overclock potential and stability, especially if you are air cooled. My card achieves the same clocks at 1.062 and 1.093 (max) so I run it at 1.062. No need to add more juice than required.
> If you block 2-3 people it cleans up the thread pretty good.


This^


----------



## Star Forge

I find it funny that people here are still arguing about VRAM. I had both a Samsung and a Micron copy of the 1070 FTW; the Samsung one couldn't overclock for crap, and that unit failed to properly apply core voltages even after I set them manually. The Micron one I ended up keeping can push the core over 2100, and the VRAM right now is purring at +300, maybe more, with core voltage at 1.168 to 1.175, stable on air.

So it is more YMMV than "Samsung is better than Micron", because my Micron card has done better than most. The overclocking problem I have seen with the 1070 is that GPU Boost 3.0 sometimes fails to scale voltages properly to keep the card's overclock alive.


----------



## tps3443

Founders Edition for the WIN!

Never thought I would say that.


----------



## Roland0101

Quote:


> Originally Posted by *jamor*
> 
> The problem is that Samsung Memory was forged in the fires of Mount Doom and hand picked by Sauron himself while the Micron Memory was made in China.


Are we sure Mount Doom isn't in China?


----------



## Star Forge

Samsung RAM is made in Korea I think. However, that doesn't really matter in the end.


----------



## Prothean

The *"crash on exit"* bug is hard to reproduce because of the card's dynamic voltage regulation. Sometimes it crashes and sometimes it doesn't. However, I found a reliable way to crash my MSI GTX 1070 with Micron ram using driver 372.90.

Here's how I crash it:

1) Reset all my clocks in MSI Afterburner.

2) Set the memory to +600.

3) Launch the Rise of the Tomb Raider benchmark.

It will crash after the benchmark completes, or when I try to exit the game. (Try +500 if you can't complete the benchmark).

Here's how I fix it:

1) Set the Rise of the Tomb Raider power management mode to "prefer maximum performance" in the Nvidia control panel.

2) Re-run the Rise of the Tomb Raider benchmark with the same memory overclock.

It does not crash after the benchmark completes, and I am able to exit the game without a crash.

_This is a temporary fix and I hope a permanent solution is coming, either a driver or bios update._
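For anyone who wants to script this kind of stepped repro, the procedure above can be sketched as a small search harness. Everything named here is hypothetical: `run_benchmark` is a stand-in for "apply the memory offset, run the Rise of the Tomb Raider benchmark, and report whether the run (and the exit) completed without a crash".

```python
# Sketch of the stepped repro above as a search harness.
# run_benchmark(offset) is a hypothetical callback: it should apply
# the given memory offset, run the benchmark, and return True only
# if both the run and the exit survive without a crash.

def find_max_stable_offset(run_benchmark, start=600, step=100, floor=0):
    """Walk the memory offset down from `start` until a run passes."""
    offset = start
    while offset >= floor:
        if run_benchmark(offset):
            return offset   # highest offset that survived a full run
        offset -= step      # e.g. fall back from +600 to +500, as above
    return None             # even the floor crashed: something else is wrong

# Simulated card that crashes above +500, standing in for a real run:
stable_up_to_500 = lambda offset: offset <= 500
print(find_max_stable_offset(stable_up_to_500))  # -> 500
```

With a real `run_benchmark` wired up, the same loop reproduces the "+600 crashes, +500 passes" pattern in one unattended pass instead of manual retries.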


----------



## criminal

Quote:


> Originally Posted by *Prothean*
> 
> The *"crash on exit"* bug is hard to reproduce because of the card's dynamic voltage regulation. Sometimes it crashes and sometimes it doesn't. However, I found a reliable way to crash my MSI GTX 1070 with Micron ram using driver 372.90.
> 
> Here's how I crash it:
> 
> 1) Reset all my clocks in MSI Afterburner.
> 
> 2) Set the memory to +600.
> 
> 3) Launch the Rise of the Tomb Raider benchmark.
> 
> It will crash after the benchmark completes, or when I try to exit the game. (Try +500 if you can't complete the benchmark).
> 
> Here's how I fix it:
> 
> 1) Set the Rise of the Tomb Raider power management mode to "prefer maximum performance" in the Nvidia control panel.
> 
> 2) Re-run the Rise of the Tomb Raider benchmark with the same memory overclock.
> 
> It does not crash after the benchmark completes, and I am able to exit the game without a crash.
> 
> _This is a temporary fix and I hope a permanent solution is coming, either a driver or bios update._


The aftermarket cards come with a custom BIOS because their power target is higher than an FE's. Someone has to be modifying these BIOS files, otherwise there wouldn't be any difference between the BIOS on an aftermarket card and on an FE. A BIOS fix would probably solve the whole issue, and that fix falls squarely on MSI/Asus/Zotac/Gigabyte.


----------



## Star Forge

Quote:


> Originally Posted by *Prothean*
> 
> The *"crash on exit"* bug is hard to reproduce because of the card's dynamic voltage regulation. Sometimes it crashes and sometimes it doesn't. However, I found a reliable way to crash my MSI GTX 1070 with Micron ram using driver 372.90.
> 
> Here's how I crash it:
> 
> 1) Reset all my clocks in MSI Afterburner.
> 
> 2) Set the memory to +600.
> 
> 3) Launch the Rise of the Tomb Raider benchmark.
> 
> It will crash after the benchmark completes, or when I try to exit the game. (Try +500 if you can't complete the benchmark).
> 
> Here's how I fix it:
> 
> 1) Set the Rise of the Tomb Raider power management mode to "prefer maximum performance" in the Nvidia control panel.
> 
> 2) Re-run the Rise of the Tomb Raider benchmark with the same memory overclock.
> 
> It does not crash after the benchmark completes, and I am able to exit the game without a crash.
> 
> _This is a temporary fix and I hope a permanent solution is coming, either a driver or bios update._


To be honest, the problem with Pascal in general is how aggressive GPU Boost 3.0 is. On the cards I have used, the voltage does not boost, or gets reduced heavily as temperatures rise, to the point that the upper clock bins are always starved of the voltage GPU Boost 3.0 is denying them, while certain BIOSes fix the tables so the cards don't starve. So if your card is crashing on small overclocks, that suggests GPU Boost 3.0 is denying it significant voltage. No matter how much you argue about VRAM, if GPU Boost 3.0 is starving your card of voltage, you are going to run into issues, period. That applies to my Samsung unit as well: even if in a better life it could beat the Micron card at memory overclocking, a card that can't even hit 2088 MHz stable with overvolting is still being crippled by a poor GPU Boost 3.0 table, and in the end it fails at overclocking.

Also, by the time you sell your 1070 for a Volta, second-hand prices will have tanked to the point that any "premium" on Samsung RAM is irrelevant. Deal with it.
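As a rough illustration of the voltage starving described above, here is a toy model of a temperature-dependent voltage cap. The thresholds and step sizes are invented purely for illustration; the real Pascal V/F tables live in the BIOS and are far more complex than this.

```python
# Toy model of the behavior described above: GPU Boost drops the
# voltage (and with it the highest stable clock bin) as temperature
# climbs. The 60 C threshold and 25 mV-per-10-C step are made up for
# illustration only.

def boost_voltage(requested_v, temp_c, vmax=1.093):
    """Return the voltage a Boost-like governor would actually allow."""
    v = min(requested_v, vmax)               # hard cap at the card's max voltage
    if temp_c >= 60:                         # hypothetical throttle point
        v -= 0.025 * ((temp_c - 50) // 10)   # shed one bin per 10 C over 50
    return round(v, 3)

print(boost_voltage(1.093, 45))  # cool card: the full requested voltage
print(boost_voltage(1.093, 75))  # hot card: the ceiling gets pulled down
```

The point of the sketch is only that the same requested voltage produces different delivered voltages at different temperatures, which is why an overclock that passes a short benchmark can crash once the card heat-soaks.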


----------



## HOODedDutchman

Temps are really good considering I'm just running an H440 case with stock fans (3x 120 mm intake, 1x 140 mm exhaust). Was planning to add a fan on the other side of the HDD trays, but might not have to. This is after about 1 hour of The Witcher 3 on maximum settings at 2560x1440 with vsync and the frame cap off, so the GPUs were at 100% usage most of the time.



I'll throw in my firestrike score here as well as my first one got deleted for swearing... My bad


----------



## MyNewRig

Quote:


> Originally Posted by *mrtbahgs*
> 
> 1) Will increased voltage help with memory overclocks or just core?
> (For example, say +600 is stable, but +700 shows an artifact or 2. Can voltage potentially make +700 stable?)


On my sample, if you dial in the memory offset while a 3D application is running (Heaven, for example) and artifacts start appearing at, say, +300, then that is your max OC; voltage locking will not get you any higher without artifacting. What voltage locking helps overcome is the checkerboard system lockup, followed by a BSOD and restart, when the power state changes from high to low voltage or vice versa, because it prevents the BIOS from decreasing the voltage in low power states or at idle.

Quote:


> 2) I am using PrecisionXOC, is the voltage slider adjusting how much you increase or how much is used overall?
> (For example, does 40% voltage mean it will actually be undervolted or that it is adding 40% of some unknown amount?
> I assume 100% voltage means it is running at max voltage, but it is hard to understand something in between 0 and 100.)


The voltage is not calculated as a percentage; it is an offset with a max of +100 mV on top of the stock voltage. So +0 is stock voltage and +100 is stock voltage plus 100 mV. The card maxes out at 1.093v, but with GPU Boost you cannot actually force or lock the voltage: all you can do is set the MAX voltage the card will reach. Even if you have it "locked" at 1.093v, it will thermal or power throttle down to 1.0v or below if GPU Boost determines there is not enough power to support your clock, or that the temp is too high to run that voltage.

Quote:


> 3) If i decide to touch the voltage slider to try and get one more step up stable, should I just go straight to 100% or actually play with it in between?


You can slide to +100 right away, no problems at all. It will not make much difference, because GPU Boost is what actually controls the voltage, as explained above; you just give it a slightly higher ceiling to work with. Voltage on Pascal is tightly managed by the BIOS, and you really have very little control over it aside from preventing it from dropping to 600-700 mV at idle.

Quote:


> 4a) What is the current max voltage on these cards?
> 4b)I think mine usually runs at 1.05v or so and if the bump is only a few hundredths I find it hard to believe that it will really do much both for stability as well as increased temps so I suppose I don't see a negative in just slamming it to max out, am I wrong?


No negatives at all, since the BIOS overrides your voltage offset at its discretion. Increasing the voltage and power targets will raise your temps by some 4 or 5 degrees C.
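The slider math above can be sketched in a few lines. This is just the arithmetic as described in the post, with 1.093 V taken from this thread as the assumed hard cap; the sketch only computes the ceiling, since (as explained above) GPU Boost can still run below it.

```python
# Sketch of the voltage slider as described above: the offset is
# millivolts added on top of stock, hard-capped at the card's limit.
# The slider raises the ceiling; it never locks the floor, because
# GPU Boost can still throttle below it under power/thermal pressure.

VMAX = 1.093  # V, the max reported for these cards in this thread

def voltage_ceiling(stock_v, offset_mv):
    """Maximum voltage the card may reach for a given slider setting."""
    return round(min(stock_v + offset_mv / 1000.0, VMAX), 3)

print(voltage_ceiling(1.050, 0))    # -> 1.05  (slider at +0: stock)
print(voltage_ceiling(1.050, 100))  # -> 1.093 (1.15 uncapped, clamped to VMAX)
```

Which is why sliding straight to +100 is harmless: for a card already near 1.05 V stock, the cap absorbs most of the offset anyway.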


----------



## pheoxs

Forgive my ignorance, but how do you tell if you have Micron memory or not?


----------



## MyNewRig

Quote:


> Originally Posted by *pheoxs*
> 
> Forgive my ignorance but how do you tell if you have micron memory or not?


GPU-Z, check the memory type field.


----------



## pheoxs

Quote:


> Originally Posted by *MyNewRig*
> 
> GPU-Z check memory type field.


Perfect, thanks!


----------



## Roland0101

Quote:


> Originally Posted by *Star Forge*
> 
> Samsung RAM is made in Korea I think. However, that doesn't really matter in the end.


Samsung's biggest memory factory is in China too; they have at least two other facilities, one in Austin and one in Korea, but I don't know where the 1070's RAM modules are produced. And you are right, in the end it doesn't matter.


----------



## MyNewRig

Quote:


> Originally Posted by *pheoxs*
> 
> Perfect, thanks!


You're welcome. If you bought your card in June or early July, or have an FE, then most probably you have Samsung; from the August batches forward it is Micron in all non-FE cards.


----------



## pheoxs

Quote:


> Originally Posted by *MyNewRig*
> 
> Welcome, if you bought your card in June or early July or have an FE then most probably you have Samsung, from August batches forward it is Micron in all non-FE cards.


Just picked it up, but it's a smaller local store so it might be old inventory.


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> Welcome, if you bought your card in June or early July or have an FE then most probably you have Samsung, from August batches forward it is Micron in all non-FE cards.


Dude, you know I just bought a card yesterday with Samsung memory lol. Store said it came in last week. It's Canada though, so maybe Gigabyte has a warehouse here with a stockpile.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Dude u know I just bought a card yesterday with Samsung memory lol. Store said it came in last week.


That is really unique; I hope I can find the same. I think you explained before that the card you have might have been out of production for a while, or maybe has low sales, so batches from a couple of months ago are still available in the market. You are lucky, though, to score a Samsung during this messed-up Micron market.


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> That is really unique, i hope i can find the same, i think you explained before that the card you have might have been out of production for a while or maybe have low sales so batches from a couple months ago are still available in the market, you are lucky though to be able to score a Samsung during that messed up Micron market.


Ya, that's what I thought, but I got there and asked him, and he said he's only had it about a week...
Maybe you should look into this card. 8500 MHz RAM speed stock and damn close to a 1080 at stock settings.
http://www.tweaktown.com/reviews/7885/palit-geforce-gtx-1070-gamerock-premium-edition-review/index.html


----------



## TheGlow

My Micron seems to be doing fine, even without wacky voltage play.
Pic with default core volt/power:
http://i.imgur.com/ge3QCEV.jpg

Max without crashing. Some artifacts in Time Spy; none observed at 200/800.
http://i.imgur.com/v5W9Nb7.jpg

I haven't run any more benchmarks since the last driver update, but I haven't noticed any difference in use since. I run it at 180/750 stable.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Ya that's what I thought but I got there n asked him n he said he's only had it about a week...
> Maybe you should look into this card. 8500mhz ram speed stock and dam close to 1080 at stock settings.
> http://www.tweaktown.com/reviews/7885/palit-geforce-gtx-1070-gamerock-premium-edition-review/index.html


Very interesting. I also found the Gainward GTX 1070 Phoenix GLH, which also runs 8500 MHz effective out of the box, and I have a feeling that both of these cards must use Samsung GDDR5 to be able to guarantee that memory frequency out of the box. I asked my retailer today about the Phoenix GLH and they said it is not very high quality and they are getting more RMA cases on it compared to other first-tier brands. The Palit you recommended appears to be of higher quality, but unfortunately no one has it in stock yet; it looks like it will arrive in October.

Right now I feel that the FE is the safest bet; at least I get guaranteed quality and know exactly what components I will be buying. But I will monitor the situation for a few more days and then decide which one to go with.


----------



## MyNewRig

Quote:


> Originally Posted by *TheGlow*
> 
> My Micron seems to be doing fine. Even without whacky voltage playing.
> Pic with default core volt/power
> http://i.imgur.com/ge3QCEV.jpg
> 
> Max without crashing. Some artifacts in timespy. None observered at 200/800.
> http://i.imgur.com/v5W9Nb7.jpg
> 
> I haven't run anymore since last driver update but I haven't experienced any difference in usage since. I run it at 180/750 stable.


Wow, which card is that? When did you buy it?


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> Very interesting, i also found the Gainward GTX 1070 Phoenix GLH which also has 8500Mhz effective out of the box, and i have a feeling that any of these two cards must use Samsung GDDR5 to actually be able to guarantee that memory frequency out of the box, i asked my retailer today about the Phoenix GLH and they said it is not a very high quality and they are getting more RMA cases on that one compared to other 1st-tired brands, the Palit you recommended appears to be of a higher quality though but unfortunately no one has it in Stock yet, looks like it will arrive in October.
> 
> Right now i feel that the FE is the safest bet, at least i get guaranteed quality and know exactly what components i will be buying, but i will monitor the situation for a few more days and then decide which one i will go with.


Ya, there are reviews of this card from June, but I don't see any stock around either.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Ya there are reviews of this card from June but I don't see any stock around either.


You got the Gigabyte Windforce dual fan, right? My head is starting to hurt from trying to figure this out. How can Gigabyte use Samsung on the dual-fan Windforce and Micron on its highest-tier Xtreme Gaming, which costs a lot more? I just feel like I don't understand anything anymore.


----------



## HOODedDutchman

Changed out the SLI bridges for some nicer looking black ones I had kicking around. Ordered an ASRock HB bridge off Newegg; a cheap and simple plain black HB bridge suits my rig nicely. Might put a square of Plasti Dip over the ASRock logo though.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Changed out the sli bridges for some nicer looking black ones I had kicking around. Ordered asrock hb bridge off newegg. Cheap and simple plain black hb bridge suits my rig nicely. Might put a square of plastidip over the a stock logo tho.


That is a very sweet setup, man, looks very neat. I love it... it was a smart move on your part to score that second Samsung card. Enjoy your setup.


----------



## mrtbahgs

Quote:


> Originally Posted by *bigjdubb*
> 
> Too much voltage can hurt your overclock potential and stability, especially if you are air cooled. My card achieves the same clocks at 1.062 and 1.093 (max) so I run it at 1.062. No need to add more juice than required.


Ok, thank you. I guess I will have to invest some more time in dialing things in, and there won't be a "quick OC" solution.
The main reason I am curious about voltage is that according to GPU-Z my limiting factor is always "reliability voltage" (VRel), so I am trying to prevent too many down-steps and keep things stable with a decent OC. I have temps pretty much locked at 61C or so once my fans hit 70%.

Quote:


> Originally Posted by *MyNewRig*
> 
> The voltage is not calculated as a percentage, it is an offset with a max of +100 mV on top of the stock voltage, so +0 is stock voltage ... +100 is stock voltage + 100 mV, the card maxes out at 1.093v but with GPU Boost you can not actually force or lock the voltage, all you can do is set the MAX voltage that the card will reach, but even if you have it locked at 1.093v it will temp or power throttle down to even 1.0v or below if GPU boost determines that there is not enough power to support your clock or that temp is too high to run that voltage.
> 
> You can slide to +100 right away, no problems at all, it will not make much difference because actually GPU Boost is what controls the voltage as explained above, you just give it a little bit higher ceiling to work with. voltage on Pascal is tightly managed by the BIOS you really have very little control over it aside from preventing it from going into 600-700 mV in idle.
> 
> No negatives at all since the BIOS overrides your voltage offset at its discretion, increasing voltage and power targets will raise your temps by some 4c or 5c degrees.


Nice, thank you, that makes a lot more sense in regards to the voltage slider.
As I mentioned above, voltage is apparently what is limiting my card's OC, so I will see if I gain anything with the slider at 100%.
I already have the power target maxed at (I believe) 111%, so hopefully it will be voltage or temperature that decides how high I can clock.
I don't think I want to go beyond 70% fan speed, so temperature may indeed keep me down a step or two on core clock, but it is still running pretty well, and I am now seeing how much memory I can add; currently thinking +700 will be my max, possibly dropping to +650. Core seems game-stable at +100, but I believe it crashed or stuttered in Heaven at +125 or +133, so I haven't tried to find the highest OC in between.


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> That is a very sweet setup man, looks very neat, i love it ... it was a smart move on your part to score that second Samsung card, enjoy your setup ..


Thanks man. Just need a few more things. Like figuring out what I'm gonna do about that cooler blocking the x1 slot so I can move my sound card up there.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Thanks man. Just need a few more thing. Like figure out what I'm gonna do about that cooler blocking the x1 slot so I can move my sound card up there.


Judging by the way you have your cables arranged and how neat and clean the whole system looks, you seem like an OCD guy like me. Mine looks almost as neat as yours, but I admit yours looks better. My ASUS Strix RGB backplate improves its looks a lot, but unfortunately that awesome backplate has to be sacrificed because of the lame Micron GDDR5 on the card, and I have to prepare myself to live with the dull FE backplate. I really loved the Strix when it had Samsung, could not have been happier, but now I have to make compromises with my toys, which is not nice at all.


----------



## HOODedDutchman

Mine don't have backplates, which is kind of driving me nuts. I was going to order some of those V1 Tech ones, but they won't fit under the USB 3 connector. At least the PCBs are black lol.


----------



## HOODedDutchman

Also might have to swap out the CPU cooler to fit the sound card on top; I do not like it at the bottom. Was thinking of an NZXT AIO, but not sure how their warranty works. I know Corsair will cover your parts if they leak.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Mine don't have backplate which is kind of driving me nuts. I was going to order some of those v1tech ones but they won't fit under the usb3 connector. At least the pcb are black lol.


I honestly think that your setup looks pretty awesome without backplates on the cards, but if you actually care about that kind of thing, why didn't you get a card with a backplate, like the G1 for example?


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Also might have to swap out the cpu cooler to fit the sound card on top. I do not like it at the bottom. Was thinking nzxt aio but not sure how their warranty works. I know corsair will cover ur parts if they leak.


I would avoid anything NZXT at all costs. I have both case LED kits by NZXT, the Hue and the Hue+, and both have HUGE quality issues. I have had the Hue replaced 4 times, and even the 4th unit has issues. Then I bought the Hue+, which has better build quality but does not save LED settings: every time you shut down and start the PC, the LED settings are deleted and the LEDs turn white again! I contacted NZXT about that about a month ago and they said they are aware of the issue and working on it; one month later, nothing from them yet. NZXT has horrible, horrible quality. I would definitely stick with Corsair: their quality, warranty and customer support are top notch.


----------



## gtbtk

Quote:


> Originally Posted by *saunupe1911*
> 
> Also do you guys think 1070s with samsung memory will be like worth a lot a more on the resell market in the near future compared to their micron siblings


It depends on whether they fix the driver/firmware bug for the memory voltage control or not.


----------



## gtbtk

Quote:


> Originally Posted by *criminal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MyNewRig*
> 
> When you also know that in previous generations (700 and 900 series) top binned cards were all using Samsung, you will realize that AIBs had no control over this because it is Nvidia who ordered them to do so.
> 
> 
> 
> This above is totally wrong and I already touched on this earlier in the thread. EVGA have used Hynix and Elpida memory on their Classified cards, when the Classified was the top card that they offered and originally shipped with Samsung.
> 
> https://www.google.com/#q=gtx+780+classified+elpida
> 
> https://www.google.com/#q=gtx+780+Ti+classified+hynix
> 
> Also MSI with their Lightning:
> 
> https://www.google.com/#q=MSI+Lightning+780+elpida+memory
> 
> Manufacturers can use what ever memory models they want on their custom boards as long as it runs at specs Nvidia has specified. It has been going on for years, so it is nothing new with the 1070.
Click to expand...

Elpida was taken over by Micron


----------



## Star Forge

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *saunupe1911*
> 
> Also do you guys think 1070s with samsung memory will be like worth a lot a more on the resell market in the near future compared to their micron siblings
> 
> 
> 
> It depends if they fix the driver/firmware bug for teh memory voltage control or not.
Click to expand...

To be brutally honest, it doesn't have a lasting effect. The price of most 900 series cards has stayed within a +/- 20 dollar margin of the average, with the exception of something like the Kingpin Edition. The thing is, RAM overclocking realistically only gains an additional 1-2 FPS in games, and that will be pretty moot by the time you are selling your cards after Volta's release.


----------



## gtbtk

Quote:


> Originally Posted by *Exenth*
> 
> I think I broke Firestrike, look at these GPU Core and Memory clocks
> 
> Also my EVGA 1070 FTW doesn't do more than +100MHz on the Core and +200MHz on the Memory, a little disappointed but still a great card.
> 
> http://www.3dmark.com/3dm/15108207?


Remember that your FTW already has a +88 MHz factory overclock, so your +100 is actually +188.

If you add, say, +50 to +75 on the offset and then lift the 1.093v point on the curve higher, then depending on the chip you can potentially get a 2100-2150 MHz OC or above, assuming you have set the voltage to +100.

The EVGA cards are set up to draw a lot of watts by default. It is likely you are limited to +200 on the memory because you are loading the card up to 220+ watts and it is giving up.
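To put the factory-offset point in numbers (a sketch only: the 1683 MHz reference boost is the published GTX 1070 spec, and the +88 MHz factory figure comes from the post above):

```python
# Slider offsets are relative to the card's own factory clocks, so a
# factory-overclocked card at "+100" is further over reference than
# an FE at "+100". Numbers per the post above and the published spec.

REFERENCE_BOOST = 1683  # MHz, reference GTX 1070 boost clock (published spec)
FACTORY_OFFSET = 88     # MHz, FTW factory OC over reference, per the post

def effective_offset(slider_mhz):
    """Total offset over the reference card for a given slider value."""
    return FACTORY_OFFSET + slider_mhz

print(effective_offset(100))                    # -> 188 MHz over reference
print(REFERENCE_BOOST + effective_offset(100))  # -> 1871 MHz nominal boost
```

So two owners comparing raw slider values across different models are not comparing the same thing; the factory offset has to be added back in first.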


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Micron has power save feature mem voltage, 1.5V and 1.35V.
> 
> 
> While Samsung memory seems not:
> 
> that might be the culprit, why 'overclocking' the memory vram less stable on micron.
> 
> http://www.samsung.com/us/samsungsemiconductor/pdfs/PSG_1H_2016.pdf
> https://www.micron.com/resource-details/63c49896-86ea-4067-80e8-46635707233f
> 
> If you want faster memory vram, better buy card that has GDDR5X, or wait in 2018 with Samsung GDDR6 14-16 Gbps.


VDD is the input voltage and VDDQ is the output voltage, and they equal each other in a given usage scenario.

The Micron and Samsung documents both say that the various SKUs can run at either 1.35v or 1.5v.

One thing in the Micron document that stands out to me is the feature "precharge for better burst access". Samsung may have something similar, but I have not seen anything from Samsung that talks about a precharge, though the Samsung datasheets are very light on detailed spec information.

If I were an Nvidia software engineer, that is where I would start: investigating the Micron feature and comparing its behavior with that of the Samsung cards in the firmware and the Nvidia drivers.


----------



## gtbtk

Quote:


> Originally Posted by *tps3443*
> 
> Anyone running a GTX1070, with Micron memory. That wants more Overclocking room. I would sell the cards before it's a worldwide known problem. And just grab a FE with Samsung memory. I ran my old gtx1070 with Samsung memory at like 9,800Mhz. And some of you guys cannot even hit 8500? That's terrible.


It is a software bug. I am sure it will be addressed and resolved at some stage. In the meantime there is a workaround.

No need to run around screaming "the sky is falling!" yet.


----------



## Dude970

Here is a post that is not talking about memory; geesh, I may unsubscribe... this is getting old.

http://www.3dmark.com/spy/504479



And Firestrike

http://www.3dmark.com/fs/10191519



***Disclaimer**** This is Samsung Memory


----------



## rfarmer

Quote:


> Originally Posted by *Dude970*
> 
> Here is a post that is not talking about memory, geesh I may unsubscribe... this is getting old
> 
> http://www.3dmark.com/spy/504479
> 
> 
> 
> And Firestrike
> 
> http://www.3dmark.com/fs/10191519
> 
> 
> 
> ***Disclaimer**** This is Samsung Memory


Nice score and damn nice OC on your i7, and I totally agree about unsubscribing.


----------



## Roland0101

Quote:


> Originally Posted by *gtbtk*
> 
> VDD is the input voltage and VDDQ is output voltage and they equal each other in a given usage scenario.
> 
> The Micron and the Samsung Documents are both saying that various SKUs can run at either 1.35 or 1.5V.


You are right, wasn't looking at the second column.
Quote:


> If I was an Nvidia software engineer, that would be the place where I started investigating micron feature and comparing it with the functionality of the samsung memory cards in firmware and Nvidia drivers


It seems like they already did on the driver side; it is slightly better with 372.90, but a real fix will need a BIOS update from the AIBs.
Quote:


> It is a software bug. I am sure it will be addressed and resolved at some stage. In the mean time there is a work around
> 
> No need to be running around yet screaming "the sky is falling!"


----------



## Roland0101

Quote:


> Originally Posted by *Dude970*
> 
> Here is a post that is not talking about memory, geesh I may unsubscribe... this is getting old
> 
> http://www.3dmark.com/spy/504479
> 
> 
> 
> And Firestrike
> 
> http://www.3dmark.com/fs/10191519
> 
> 
> 
> ***Disclaimer**** This is Samsung Memory


Very nice score.







But 3DMark really needs to fix the Brawn achievement.


----------



## Quantium40

So...

I just bought this card a week ago. Is it safe to say the cheap Gigabyte Windforce models have Samsung memory at a higher frequency?


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> I honestly think that your setup looks pretty awesome without backplates on the cards, but if you actually care about that kind of thing why didn't you get a card with backplate like the G1 for example?


Wasn't in stock at the time.


----------



## BulletSponge

Quote:


> Originally Posted by *Quantium40*
> 
> So...
> 
> I just bought this card a week ago. Safe to say the cheap Gigabyte Windforce models have Samsung memory at a higher frequency?


You oughta be able to get a fair bit more oomph outta that card with some tweaking.











I just realized my GPU-Z is out of date, any ASIC scores supported yet?


----------



## BulletSponge

Double post-muh bad.


----------



## mrtbahgs

I noticed in a Valley run that the core clock displayed from Valley is higher than the core clock from PrecisionXOC's OSD, but the memory is the same as well as temps and what not.
I will have to double check Heaven to see if it does the same.

Is this normal, and I assume a clock that cannot be trusted, or what's going on?
I don't normally look at the small text in the top right corner so I hadn't noticed the difference before.

Precision said 2050/9408 and Valley said 2100/9408


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tps3443*
> 
> Anyone running a GTX1070, with Micron memory. That wants more Overclocking room. I would sell the cards before it's a worldwide known problem. And just grab a FE with Samsung memory. I ran my old gtx1070 with Samsung memory at like 9,800Mhz. And some of you guys cannot even hit 8500? That's terrible.
> 
> 
> 
> It is a software bug. I am sure it will be addressed and resolved at some stage. In the mean time there is a work around
> 
> No need to be running around yet screaming "the sky is falling!"
Click to expand...

Remember that we are just making an assumption here. We don't know for sure whether it is a software bug or a hardware issue where these Micron memory ICs cannot maintain stability under variable/adaptive voltage operation. The proposed workaround of voltage locking basically disables adaptive voltage mode and runs these chips at a constant voltage, a fix that cannot simply be implemented in BIOS because it would require changing the entire power management algorithm, which was initially designed for Samsung GDDR5 ICs that don't exhibit the same power properties as Micron's.

Also remember that what we are actually locking is core voltage, not memory voltage. The memory is running at a much higher voltage than the core, around 1.5v, but we have no access to or control over that rail; we can't even see it with the tools we have available.

So the problem could be rooted much deeper than some might think, and a solution might not be possible at all, or might not be easy or viable to implement. All I know is, if it were as easy to fix as you imply, a solution would have been provided by now; with this delay and complete silence from NVIDIA, it does not look good at all.


----------



## chrcoluk

Something interesting I noticed.

I accidentally left multi-monitor power saving on in Nvidia Inspector while playing a game, and I noticed the power utilisation was a lot lower than normal, so my card could hold max voltage without TDP throttling. I thought ***? I then unticked the GTX 1070 in multi display power saver and, sure enough, power utilisation jumped from an average of 75% to about 105% at the same clock speeds, momentarily, until it TDP throttled.


----------



## Hunched

Quote:


> Originally Posted by *MyNewRig*
> 
> Also remember that what we are actually locking is Core voltage not memory voltage, since memory is actually running at a much higher voltage than core, memory is running on 1.5v but that we don't have access to or control over, we can't even see with the tools we have available.


If that's true, it explains why my Samsung memory only ever artifacts or has issues 1 in 100 transitions from idle to load, and is 100% solid when under load (as high voltage is always being applied).
But if this is true, why does locking core voltage to 1.093v help memory at all like so many people are saying?
It should have no effect whatsoever on the stability of the memory.

Or does locking core voltage also lock memory voltage? But then I wouldn't have stability issues only during transitions...


----------



## MyNewRig

Quote:


> Originally Posted by *Hunched*
> 
> If that's true, it explains why my Samsung memory only ever artifacts or has issues 1 in 100 transitions from idle to load, and is 100% solid when under load (as high voltage is always being applied).
> But if this is true, why does locking core voltage to 1.093v help memory at all like so many people are saying?
> It should have no effect whatsoever on the stability of the memory.
> 
> Or does locking core voltage also lock memory voltage? But then I wouldn't have stability issues only during transitions...


Because it looks like core and memory power states are linked. I have never seen a card, for example, run the core in high-performance 3D mode while keeping the memory in power-saving mode at the same time. So what appears to be happening is that locking the core voltage forces the core into its max-performance power state, which pushes the memory controller into the same state as well. We are not directly controlling memory voltage in any way; we are just forcing it to stay in the high-performance state constantly, so the memory controller applies max voltage all the time instead of swinging from low to high and back.

But I notice you saying that Samsung memory also randomly shows artifacts during transitions? Can you explain how and when exactly this happens? If that is the case, it could be a general Pascal voltage issue.
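Purely to illustrate the linkage described above (a toy model with made-up state names and voltage figures, not actual NVIDIA driver code), the behavior would look something like this:

```python
# Toy model of linked core/memory power states. Illustration only:
# the state names and voltages here are invented, not NVIDIA's.

P_STATES = {
    "P8": {"core_v": 0.650, "mem_state": "power_saving"},    # idle
    "P5": {"core_v": 0.800, "mem_state": "power_saving"},    # light load
    "P0": {"core_v": 1.093, "mem_state": "max_performance"}  # 3D load
}

def effective_state(load_pct, voltage_locked):
    """Pick a P-state. Locking the core voltage pins the card in P0,
    which drags the memory controller into max performance with it."""
    if voltage_locked or load_pct > 30:
        return "P0"
    if load_pct > 5:
        return "P5"
    return "P8"

# With the lock, the memory never passes through the low-voltage states:
for load in (0, 10, 90):
    state = effective_state(load, voltage_locked=True)
    print(load, state, P_STATES[state]["mem_state"])
```

With `voltage_locked=True` every load level maps to P0, which would match the observation that artifacts appearing only on idle-to-load transitions disappear once the lock is applied.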


----------



## Hunched

Quote:


> Originally Posted by *MyNewRig*
> 
> But i notice you saying that Samsung memory also randomly show artifacts during transitions? can you explain how and when this exactly happen? since if this is the case it could be a general Pascal voltage issue then?


I seem to be an exception, somehow.
It's quite infrequent, but every now and then after a loading screen, loading a new map, everything will go to ****.
Once everything is loaded and running, I've never had an issue. Only ever immediately after load is applied from an idle state.

A bad analogy is if you had a car, 1 in 50 attempts to start the engine would be a disaster, but if you get it started without issue it will run forever without issue.

It's a bit annoying, because I could run +600 all the time if I could keep the memory voltage at whatever it is under load.
It's possible that there are still voltage fluctuations with memory even with the core voltage locked at 1.093v, its higher power state could be a voltage range that fluctuates according to load/utilization.
Which would explain why I never have issue when there is load on the card/memory, sometimes the jump from no load to load simply breaks the memory overclock.

I don't know exactly what my memory issue is or why I have it, so sorry for explaining it poorly.


----------



## agntallen

wow. what have i been reading in this thread for the past couple of pages.... all i see is samsung or micron posts all around

i'm just happy that i switched over from an r9 390 to a 1070. what else do i gotta post to be a part of the owners club

any game recommendations & is a g-sync monitor really worth switching over to? (does anybody know if the rainbow 6 siege community is still up? was thinking of getting that) last time i owned a nvidia card was way way back when the gt 7600 was out.


----------



## HOODedDutchman

Quote:


> Originally Posted by *agntallen*
> 
> wow. what have i been reading in this thread for the past couple of pages.... all i see is samsung or micron posts all around
> 
> i'm just happy that i switched over from an r9 390 to a 1070. what else do i gotta post to be a part of the owners club
> 
> any game recommendations & is a g-sync monitor really worth switching over to? (does anybody know if the rainbow 6 siege community is still up? was thinking of getting that) last time i owned a nvidia card was way way back when the gt 7600 was out.


I don't play much multiplayer lately. Waiting for BF1. Last couple months or so I've been playing Fallout 4, Witcher 3, Rise of the Tomb Raider. All amazing. Recommend Witcher 3 the most tho. Gameplay is amazing, really pushes the hardware, and looks stunning. Fallout 4 and Rise of the Tomb Raider do too, but Fallout 4 doesn't look as good and the Witcher has WAY more gameplay than Rise of the Tomb Raider. Plus u can find Witcher 3 for $30, game of the year edition with all DLC.


----------



## Roland0101

Quote:


> Originally Posted by *mrtbahgs*
> 
> I noticed in a Valley run that the core clock displayed from Valley is higher than the core clock from PrecisionXOC's OSD, but the memory is the same as well as temps and what not.
> I will have to double check Heaven to see if it does the same.
> 
> Is this normal and I assume a clock that cannot be trusted or whats going on?
> I don't normally look at the small text in the top right corner so I hadn't noticed the difference before.
> 
> Precision said 2050/9408 and Valley said 2100/9408


Valley reads the clock at the start and then never again, and it's known for reporting Nvidia GPU clocks incorrectly.

The reading you get from PrecisionX is the right one.
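If you want a third reading that isn't an OSD, you can poll the driver itself. A rough sketch (this assumes `nvidia-smi` is installed and on your PATH; note it reports the actual memory clock, not the doubled effective data rate the OC tools show):

```python
import subprocess
import time

def parse_clocks(csv_line):
    """Parse one 'clocks.gr, clocks.mem' CSV line, e.g. '2050, 4352'."""
    core, mem = (int(x.strip()) for x in csv_line.split(","))
    return core, mem

def poll_clocks(samples=5, interval=1.0):
    """Poll the driver's own clock readings (MHz) once per interval."""
    readings = []
    for _ in range(samples):
        out = subprocess.check_output([
            "nvidia-smi",
            "--query-gpu=clocks.gr,clocks.mem",
            "--format=csv,noheader,nounits",
        ]).decode()
        readings.append(parse_clocks(out.strip()))
        time.sleep(interval)
    return readings

# Example (on a machine with an NVIDIA GPU):
#   for core, mem in poll_clocks():
#       print(f"core {core} MHz, mem {mem} MHz")
```

Run that during a benchmark and compare against the OSD; the polled numbers should track Precision, not Valley's one-time readout.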


----------



## Roland0101

Quote:


> Originally Posted by *agntallen*
> 
> wow. what have i been reading in this thread for the past couple of pages.... all i see is samsung or micron posts all around
> 
> i'm just happy that i switched over from an r9 390 to a 1070. what else do i gotta post to be a part of the owners club


That's a good attitude, it's a great card.








Quote:


> any game recommendations & is a g-sync monitor really worth switching over to? (does anybody know if the rainbow 6 siege community is still up? was thinking of getting that) last time i owned a nvidia card was way way back when the gt 7600 was out.


What type of games do you like to play?
G-Sync is a nice feature; there are however problems for some users with the Windows Anniversary Update and G-Sync, so you might read up on that or wait a few months until MS and Nvidia can sort that out.


----------



## F3niX69

Quote:


> Originally Posted by *Dude970*
> 
> Here is a post that is not talking about memory, geesh I may unsubscribe... this is getting old
> 
> http://www.3dmark.com/spy/504479
> 
> 
> 
> And Firestrike
> 
> http://www.3dmark.com/fs/10191519
> 
> 
> 
> ***Disclaimer**** This is Samsung Memory


Damn i have an i7 3770k at 4.2 ghz and a gtx 1070 and only get 14266 in firestrike and 5758 in timespy
Pretty high overclocks you have there


----------



## HOODedDutchman

Quote:


> Originally Posted by *F3niX69*
> 
> Damn i have an i7 3770k at 4.2 ghz and a gtx 1070 and only get 14266 in firestrike and 5758 in timespy
> Pretty high overclocks you have there


Ya that's a MASSIVE score for a 3770k. He must be doing suicide runs at 5ghz or something lol. Think my 3770k only scored like high 11k's at 4.5ghz.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Ya that's a MASSIVE score for a 3770k. He must be doing suicide runs at 5ghz or something lol. Think my 3770k only scored like high 11k's at 4.5ghz.


This is not only about the CPU running at 5Ghz, He has his 1070 OCed to 2,164 MHz core and 2,448 MHz memory, which is pretty crazy









I wonder what kind of card and cooling he is using for that impressive run!


----------



## TheGlow

Quote:


> Originally Posted by *mrtbahgs*
> 
> I noticed in a Valley run that the core clock displayed from Valley is higher than the core clock from PrecisionXOC's OSD, but the memory is the same as well as temps and what not.
> I will have to double check Heaven to see if it does the same.
> 
> Is this normal and I assume a clock that cannot be trusted or whats going on?
> I don't normally look at the small text in the top right corner so I hadn't noticed the difference before.
> 
> Precision said 2050/9408 and Valley said 2100/9408


I noticed that a few times as well, in Valley and maybe Heaven. The core clock it's reporting was probably what your core was originally at when you launched it, but then heat and power throttling made it dial down a little, so GPU-Z/Precision is correctly updating. If you close and launch Valley again it may report correctly again. Or I think adjusting the clock manually will kick it back into sync.

Quote:


> Originally Posted by *agntallen*
> 
> wow. what have i been reading in this thread for the past couple of pages.... all i see is samsung or micron posts all around
> 
> i'm just happy that i switched over from an r9 390 to a 1070. what else do i gotta post to be a part of the owners club
> 
> any game recommendations & is a g-sync monitor really worth switching over to? (does anybody know if the rainbow 6 siege community is still up? was thinking of getting that) last time i owned a nvidia card was way way back when the gt 7600 was out.


Witcher 3 really is nice. I played it on XB1, got it again when I went with a 1070, and the difference is mindblowing.
I also got a 27" 1440p, 144Hz GSync monitor. I had the monitor when still on an r9 380 so I got to play without gsync for a couple months.
Long story short I always played with VSync on and forgot about the input delay until playing Street Fighter V. Turned it off and I could see the tearing clear as day.
Then played Overwatch with the 120+fps and could still see tearing but only if actively looking for it.
Now with gsync its very smooth. I havent tried out the ULMB yet.


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> This is not only about the CPU running at 5Ghz, He has his 1070 OCed to 2,164 MHz core and 2,448 MHz memory, which is pretty crazy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I wonder what kind of card and cooling he is using for that impressive run!


Ya I could care less about the total score tho. I never look at that. Total score doesn't mean anything. You could have an i5 and tri sli 980ti and have some stupid high score that means nothing cuz u r severely bottlenecked lol.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Ya I could care less about the total score tho. I never look at that. Total score doesn't mean anything. You could have an i5 and tri sli 980ti and have some stupid high score that means nothing cuz u r severely bottlenecked lol.


Sorry i forgot to add in my last post that he has a whopping 22028 Graphics Score in Firestrike, pretty impressive when i think how i struggle to even reach 20000


----------



## HOODedDutchman

Quote:


> Originally Posted by *TheGlow*
> 
> I noticed that a few times as well. Valley a nd maybe Heaven. The core clock its reporting was probably what your core was originally at when you launched it, but then heat throttling and power made it dial down a little, so gpuz/precision is correctly updating. If you close and launch valley again it may fix and report correctly again. Or I think adjusting the clock again manually will kick it back into sync.
> Witcher3 really is nice. I played it on xb1 and got it when I went with a 1070 and mindblowing difference.
> I also got a 27" 1440p, 144Hz GSync monitor. I had the monitor when still on an r9 380 so I got to play without gsync for a couple months.
> Long story short I always played with VSync on and forgot about the input delay until playing Street Fighter V. Turned it off and I could see the tearing clear as day.
> Then played Overwatch with the 120+fps and could still see tearing but only if actively looking for it.
> Now with gsync its very smooth. I havent tried out the ULMB yet.


I'm on 60Hz 1440p and I see basically zero screen tearing in Witcher 3, and that's with SLI, vsync off and frame limit unlimited. If I run vsync off and limit fps to 60 I get tons of screen tearing. It's kind of like Battlefield that way. Screen tearing is negligible and it feels much smoother over 60fps, even on a 60Hz panel.


----------



## F3niX69

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Ya that's a MASSIVE score for a 3770k. He must be doing suicide runs at 5ghz or something lol. Think my 3770k only scored like high 11k's at 4.5ghz.


Quote:


> Originally Posted by *MyNewRig*
> 
> This is not only about the CPU running at 5Ghz, He has his 1070 OCed to 2,164 MHz core and 2,448 MHz memory, which is pretty crazy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I wonder what kind of card and cooling he is using for that impressive run!


I wonder what cpu cooling and which 1070 he is using.
5GHz is A LOT for this CPU. My sample needs 1.325v to stay at 4.2, which is disappointing.
Quote:


> Originally Posted by *HOODedDutchman*
> 
> I'm on 60Hz 1440p and I see basically 0 screen tearing in witcher 3. N that's with sli. Vsync off and frame limit unlimited. If I run vsync off and limit fps to 60 I get tons of screen tearing. Its kind of like battlefield that was. Screen tearing is nothing and feels much smoother over 60fps even on a 60Hz.


in some games tearing is much worse than others, and i don't know why. in overwatch for example i can't see any distracting tearing


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> Sorry i forgot to add in my last post that he has a whopping 22028 Graphics Score in Firestrike, pretty impressive when i think how i struggle to even reach 20000


That sucks. I score like 21k without even trying lol. Quick +100 core, +400 mem. Well I score over 36k graphics stock actually... Just sayin.


----------



## HOODedDutchman

Quote:


> Originally Posted by *F3niX69*
> 
> I wonder what cpu cooling and which 1070 he is using.
> 5ghz is ALOT for this cpu.my sample need 1.325v to stay at 4,2 which is dissapointing.
> in some games tearing is much worse than others, and i don't know why. in overwatch for example i can't see any distracting tearing


Ya some games are just brutal. But some games have a lot less noticeable input lag/slowdown feeling than others. Vsync feels terrible at 60Hz in Witcher 3, but in Fallout 4 it feels fine and I have no issue.


----------



## criminal

Quote:


> Originally Posted by *Dude970*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Here is a post that is not talking about memory, geesh I may unsubscribe... this is getting old
> 
> http://www.3dmark.com/spy/504479
> 
> 
> 
> And Firestrike
> 
> http://www.3dmark.com/fs/10191519
> 
> 
> 
> ***Disclaimer**** This is Samsung Memory


Great graphics score! These are my best runs:

http://www.3dmark.com/spy/290317

http://www.3dmark.com/fs/9859378


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *criminal*
> 
> I believe that the GTX 1070 is the first X70 card that couldn't overclock and match the stock performance of the X80 card of the same generation. And that's when using the Samsung memory. His whole conspiracy theory for that being the reason for the memory change is so freaking silly! The 1070 was never able to get close to the performance of the 1080, so there was no need to "secretly" change to Micron memory to cripple performance... lol
> 
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1070/27.html
> https://www.techpowerup.com/reviews/MSI/GTX_970_Gaming/30.html
> 
> 
> 
> LOL, i see you two are getting along quite well which is pretty surprising to be honest, being a fanboy who would defend even the farts of NVIDIA until his last breath, he is not a very intelligent individual to converse with, he is a fake, and a liar who would twist facts to defend his master Nvidia, but you going along with this crap is pretty surprising given you being such an involved OC.net member
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyways, that very chart you posted puts a well OCed Samsung 1070 pretty darn close to a stock 1080 performance, 133.7 FPS on an OCed 1070 vs. 137.9 FPS on FE 1080 , that is only a 3% performance difference for cards that have 34% asking pricing difference (according to EU market prices) , that happens to also be the reason why i bought a 1070 instead of a 1080, with Samsung memory and some decent OC, you get a GTX 1080 type experience.
> 
> 
> 
> What makes this a conspiracy is not just the fact that they switched to Micron, it is the fact that they did that AFTER all review samples went out with Samsung memory producing chart smashing results, and the fact that they are being extremely secretive about this on purpose, even if asked about it directly they just refuse to talk or give any information.
> 
> If it was just "business as usual" as you imply, how hard for them to say, sorry we ran out of Samsung GDDR5 supply, or these memory chips are good but we need to fix the BIOS or drivers which we will do soon. or anything AT ALL. but they are keeping us totally in the dark here intentionally, if that is not a conspiracy or at least very deceptive marketing and business practices then i don't know what is? how come you don't see it this way? please tell me, maybe i am missing something here...
> 
> The thing is that these choices you are suggesting are not so easy in practice, i had a 980 Ti which i sold to get that 1070, while both have an almost identical performance, i got the 1070 for efficiency reasons, the 980 Ti used to run hot and noisy (MSI Gaming) it went into the 80c+ and would make the entire system sound like a vacuum cleaner while gaming, and i did not like the experience.
> 
> The reason i got the ASUS Strix 1070 ... is the fact that it does not exceed 70c under load, runs quite, great OC potential (according to reviews), has 8GB framebuffer, aesthetics, yes it looks pretty cool with the RGB lighting and all ..
> 
> Now if i get an FE 1070 i lose many of these benefits, i will be right back at the 980 Ti days, will run hot (83c according to reviews), be noisier, no RGB ... etc .. not something i am very excited about .. also getting an FE and putting a water block on it does not look possible, because my retailer is my warranty provider and they are uneasy about changing the cooler on the card, i care a lot for warranty and don't like voiding it.
> 
> Also waiting for VEGA or Volta means that i will have to stay for so long without a GPU which is also not an easy choice ..
> 
> Buying a 1080 is impossible, after forking $872 on the 980 Ti , i decided to never pay way too much for a GPU anymore .. so the 1070 price is the highest i am willing to pay ..
> 
> My "whining" as you call it is not like beating a dead horse as you put it, i am trying to raise awareness and get results here, and most importantly information, results in the form of a BIOS/Driver fix, later production starting to use Samsung GDDR5 again, or information that for example this will or will never be fixed so i can make an informed decision what to buy or what not to buy .. any of these outcomes will be a positive thing.
> 
> So all is good, and i hope you can see that the choices you are proposing are not that easy to implement, at least not with the current level of information NVIDIA and AIBs are making available, which is absolutely nothing besides "we will look into it" posted by an Nvidia rep. about a week ago ..
Click to expand...

I suspect part of the reason the power management/BIOS/driver problem is not being acknowledged as an issue in need of attention is all the noise screaming that the RAM chips are faulty.

While the RAM chips are certainly part of the chain of components that makes the issue visible, there is evidence that the problem is not with the chips themselves but elsewhere in the chain, which is common to the Samsung cards as well.

I do understand that it is frustrating and disappointing that the expensive shiny new toy doesn't operate as expected.

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tps3443*
> 
> Anyone running a GTX1070, with Micron memory. That wants more Overclocking room. I would sell the cards before it's a worldwide known problem. And just grab a FE with Samsung memory. I ran my old gtx1070 with Samsung memory at like 9,800Mhz. And some of you guys cannot even hit 8500? That's terrible.
> 
> 
> 
> It is a software bug. I am sure it will be addressed and resolved at some stage. In the mean time there is a work around
> 
> No need to be running around yet screaming "the sky is falling!"
> 
> Click to expand...
> 
> Remember that we are just making an assumption here, we don't know for sure if it is a software bug or a hardware issue that these Micron memory ICs are not able to maintain stability under variable / adoptive voltage operation, the proposed workaround of voltage locking is basically disabling adaptive voltage mode and running these chips under constant voltage, a solution that can not be implemented in BIOS because that requires changing the entire power management algorithm that was initially designed for Samsung GDDR5 ICs that don't exhibit the same power properties as Micron.
> 
> Also remember that what we are actually locking is Core voltage not memory voltage, since memory is actually running at a much higher voltage than core, memory is running on 1.5v but that we don't have access to or control over, we can't even see with the tools we have available.
> 
> So the problem could be much deeper rooted than some might think and a solution might not be possible at all or might not be that easy or viable to implement, all i know is, if it was that easy to fix as you imply, a solution would have been provided by now, but with that delay and complete silence from NVIDIA, this does not look good at all.
Click to expand...

Here you go. I just found these documents on the Memory Voltage controller chip installed on the cards and the Nvidia standards framework in which it operates.

http://www.upi-semi.com/en-article-upi-362-1566

and this document from Nvidia. It describes the standardized interface that is used to manage and control the memory VRM.

http://international.download.nvidia.com/openvreg/openvreg-type2-specification.pdf

Pages 7-11 are particularly informative. In the table on page 11, the row "Power Saving Interface Low Threshold" has its max voltage listed at 0.8V.

In my experience, the artifacts start if the voltage is below 0.8V when you apply the high memory OC.

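Based on that observation, a cautious overclocking script could refuse to apply a large memory offset while the card is still in the power-saving voltage range. This is only a sketch of the rule of thumb: the 0.8V figure is the threshold from the spec table above, while `safe_to_apply_mem_oc` is a hypothetical helper, not part of any real tool.

```python
# Threshold taken from the OpenVReg type-2 table (page 11);
# everything else here is an illustrative, hypothetical helper.
POWER_SAVING_LOW_THRESHOLD_V = 0.8

def safe_to_apply_mem_oc(core_voltage_v, offset_mhz, big_offset_mhz=400):
    """Only allow a large memory offset once the card has left the
    power-saving voltage range; small offsets pass through unconditionally."""
    if offset_mhz < big_offset_mhz:
        return True
    return core_voltage_v >= POWER_SAVING_LOW_THRESHOLD_V

# At idle voltage (~0.65v), a +500 offset would be held back:
print(safe_to_apply_mem_oc(0.65, 500))   # False
print(safe_to_apply_mem_oc(1.093, 500))  # True
```

The same check, applied by the driver on every power-state transition, is essentially what the voltage-lock workaround achieves by brute force.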


----------



## TheGlow

Quote:


> Originally Posted by *HOODedDutchman*
> 
> I'm on 60Hz 1440p and I see basically 0 screen tearing in witcher 3. N that's with sli. Vsync off and frame limit unlimited. If I run vsync off and limit fps to 60 I get tons of screen tearing. Its kind of like battlefield that was. Screen tearing is nothing and feels much smoother over 60fps even on a 60Hz.


I think some games just perform a bit better. SF5 was hideously tearing all over the place at 60fps. Actually it's hard-locked at 60 since the game's moves are designed around specific frame counts, so that's understandable.
Overwatch was ok. Witcher3 i originally had to lower some settings since i was on the 380. With the 1070 everything is maxed except hair works and I was getting [email protected] That was before I was playing with the overclocking so I need to revisit that and see where i stand.
Also I appear to have an above average superman micron.


----------



## HOODedDutchman

Quote:


> Originally Posted by *TheGlow*
> 
> I think some games just perform a bit better. SF5 was hideously tearing all over the place at 60fps. Actually its hard locked at 60 since the games designed with the moves around specific frame amounts so thats understandable.
> Overwatch was ok. Witcher3 i originally had to lower some settings since i was on the 380. With the 1070 everything is maxed except hair works and I was getting [email protected] That was before I was playing with the overclocking so I need to revisit that and see where i stand.
> Also I appear to have an above average superman micron.


I don't think that's the only thing you turned down. With 1 card I had hairworks off, sharpening in the middle, foliage distance on high instead of ultra (hated it, even the friggin high setting makes foliage pop up like 20 feet in front of you) and shadow quality high instead of ultra. With those settings I was getting a constant 60 with vsync and it was using 95% of the GPU at full clocks.


----------



## tps3443

The Samsung memory on my RX480 was mediocre at best!

It ran stable at about 8,800.

I think for memory to be good, it should do at least +1,000MHz over stock speed.

Great memory will do +1,200-1,500 or more over stock.

I have not seen a GTX 1080 that doesn't run at least 11,000MHz effective speed.

I've seen some do nearly 12,000MHz.
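The rule of thumb above, expressed as a quick helper (the cutoffs are just the numbers from this post, in effective MHz):

```python
def rate_memory_headroom(stock_mhz, max_stable_mhz):
    """Classify memory OC headroom per the rule of thumb above
    (effective data-rate MHz, e.g. 8008 stock on a 1070)."""
    gain = max_stable_mhz - stock_mhz
    if gain >= 1200:
        return "great"
    if gain >= 1000:
        return "good"
    return "mediocre"

print(rate_memory_headroom(8008, 9800))  # the Samsung 1070 from the post: great
print(rate_memory_headroom(8000, 8800))  # the RX 480 example: mediocre
```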


----------



## MyNewRig

Quote:


> Originally Posted by *tps3443*
> 
> The Samsung memory on my RX480 was mediocre at best!
> 
> It ran stable at about 8,800.
> 
> I think memory to be good, it should do it least +1,000Mhz over stock speed.
> 
> Great memory will do 1,200-1,500+ over stock.
> 
> I have not seen a GTX1080 that doesn't run it least 11,000Mhz effective speed.
> 
> I've seen some do nearly 12,000 mhz


You switched from an RX 480 to a GTX 1070? If so, why did you buy the 480 in the first place, and why did you switch?


----------



## MyNewRig

Quote:


> Originally Posted by *criminal*
> 
> Great graphics score! These are my best runs:
> 
> http://www.3dmark.com/spy/290317
> 
> http://www.3dmark.com/fs/9859378


holy moly, that is a Founders? i am definitely buying that thing ..
















So it can do all that with 4+1 power phases and one 8-pin? So are these aftermarket cards with 10+2 phases, two 8-pins etc. just a bunch of useless BS?

What co-brand FE do you have? or did you get that one directly from Nvidia?


----------



## montyben101

Quote:


> Originally Posted by *MyNewRig*
> 
> holy moly, that is a Founders? i am definitely buying that thing ..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So it can do all that with 4+1 power phases and one 8-pin? so these after market cards with 10+2 phases, two 8-pin etc is just a bunch of useless BS?
> 
> What co-brand FE do you have? or did you get that one directly from Nvidia?


Have a look at this:






It seems they are a bit unnecessary; they might matter at really extreme overclocks, but the FE can still get very high.


----------



## MyNewRig

Quote:


> Originally Posted by *montyben101*
> 
> Have a look at this:
> 
> 
> 
> 
> 
> 
> It seems that they are a bit unnecessary, possibly at really high overclocks but the FE can still get very high.


Pretty informative video; this explains a lot. But it also means that if a Pascal BIOS editor becomes available one day, and we are finally able to raise the core voltage to 1.250 V with a custom BIOS, then cards like the FTW could become real beasts and overclock to impressive levels, while cards like the FE with 4+1 phases would be stuck near stock settings.


----------



## Star Forge

I think the whole memory issue comes down to Pascal's overly aggressive voltage system. Someone needs to modify the BIOS so the voltages don't aggressively downclock and starve the VRAM and core of voltage. While some cards here (especially FEs) show no voltage problem and are doing very well, I feel like a majority of Pascals (with both types of VRAM) are having issues, and from my experience it all points to voltage regulation. Each 1070 seems to manage voltage a bit differently, some worse than others. It is also worth noting that EVGA literally has four different versions of the FTW BIOS, two for Micron and two for Samsung. I might actually switch to the second BIOS on my card to see if it makes a difference.


----------



## MyNewRig

Great news update: we actually managed to get Nvidia to make a new vBIOS for Micron cards that should improve the overclocking experience, or at least make them stable at stock. The vBIOS will be made available through board partners soon.

Can't wait to actually try it, pretty excited about it


----------



## xg4m3

Quote:


> Originally Posted by *MyNewRig*
> 
> Great news update: we actually managed to get Nvidia to make a new vBIOS for Micron cards that should improve the overclocking experience, or at least make them stable at stock. The vBIOS will be made available through board partners soon.
> 
> Can't wait to actually try it, pretty excited about it


Source?

Edit: ah, stupid me







the GeForce forums


----------



## MyNewRig

Quote:


> Originally Posted by *xg4m3*
> 
> Source?


Official post by Nvidia on the official GeForce support forums:

https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/post/4986835/#4986835


----------



## Star Forge

Wow, amazing! Right now my card trades off between a high core clock and a high VRAM clock; running both isn't sustainable for the long term.


----------



## criminal

Quote:


> Originally Posted by *MyNewRig*
> 
> holy moly, that is a Founders? i am definitely buying that thing ..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So it can do all that with 4+1 power phases and one 8-pin? So are these aftermarket cards with 10+2 phases, two 8-pins, etc. just a bunch of useless BS?
> 
> Which brand of FE do you have? Or did you get yours directly from Nvidia?


Yep. I am really satisfied with my card. I believe all the extra phases and power connections are wasted on Pascal. I have not seen any aftermarket cards that consistently overclock better than the FE's. My FE is a Zotac.


----------



## montyben101

Quote:


> Originally Posted by *MyNewRig*
> 
> Pretty informative video; this explains a lot. But it also means that if a Pascal BIOS editor becomes available one day, and we are finally able to raise the core voltage to 1.250 V with a custom BIOS, then cards like the FTW could become real beasts and overclock to impressive levels, while cards like the FE with 4+1 phases would be stuck near stock settings.


Yep! I just hope that editor becomes available. I would bet the 4+1 cards could also go up a bit, but this is where FTW cards with a large number of phases would really be beneficial. It's a shame Nvidia locked the voltage down so hard, as it limits the cards so much...


----------



## Star Forge

Quote:


> Originally Posted by *montyben101*
> 
> Yep! I just hope that editor becomes available, I would bet that 4+1 could also go up a bit but this is when FTW cards with a large number of phases would really be beneficial. Its a shame Nvidia really locked the voltage a lot as it limits the cards so much...


I have a feeling this is what the future looks like from Nvidia: slowly making overclocking irrelevant. I feel like Pascal out of the box is already running about as high as it can go, and Nvidia gave us only a little headroom to whet our overclocking appetites. The way they designed Pascal, it is first and foremost a chip that delivers the most performance with the least power draw, and that is exactly what they did here. GPU Boost 3.0 was implemented so the chip keeps its efficiency while the voltage never pushes it over the edge, since it is already running about as high as it possibly can.

A good analogy would be that Nvidia released Pascal like a 2.0 twin-turbo engine already tuned to near its maximum capacity. Shove more power into it and the whole motor blows, so they locked down the ECU to the point that it can never get there, no matter how much we tinker with it.


----------



## gtbtk

Quote:


> Originally Posted by *criminal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MyNewRig*
> 
> holy moly, that is a Founders? i am definitely buying that thing ..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So it can do all that with 4+1 power phases and one 8-pin? So are these aftermarket cards with 10+2 phases, two 8-pins, etc. just a bunch of useless BS?
> 
> Which brand of FE do you have? Or did you get yours directly from Nvidia?
> 
> 
> 
> Yep. I am really satisfied with my card. I believe all the extra phases and power connections are wasted on Pascal. I have not seen any aftermarket cards that consistently overclock better than the FE's. My FE is a Zotac.
Click to expand...

I certainly agree that the extra 6- or 8-pin power connector is purely a marketing gimmick that adds no practical value. Extra phases may smooth out the power delivery, but they all end up underutilized.

I'm not planning on water cooling any time soon, so the so-so cooling of an FE turns me off. I have found that the base-clocked cards are no handicap compared to the factory-OC cards if you are prepared to tweak things.


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xg4m3*
> 
> Source?
> 
> 
> 
> Official post by Nvidia in GeForce Official Support Forums
> 
> https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/post/4986835/#4986835
Click to expand...

did you notice that Sora got very quiet? hehe


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xg4m3*
> 
> Source?
> 
> 
> 
> Official post by Nvidia in GeForce Official Support Forums
> 
> https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/post/4986835/#4986835
> 
> Click to expand...
> 
> did you notice that Sora got very quiet? hehe
Click to expand...

LOL, Sora is just chilling and enjoying himself in the forums, not really looking for support or helping others. He called what we were doing "a piss in the ocean"; well, that piss got us a vBIOS update. I hope it actually fixes the issue and does not introduce a new set of issues, if the memory chips themselves are so unstable that they cannot be stabilized via software.

I was discussing the issue with my retailer today, since this is now a known and recognized issue thanks to your thread, about what solutions are available aside from getting an FE, because like you I am turned off by the FE cooling, and I really, really like the RGB LED backplate of the Strix; it makes my rig look so pretty. We agreed that I will wait one more week for a possible response from Nvidia before I either replace the card with an FE or return it and get my money back. Luckily, Nvidia responded today, and I hope the new vBIOS will be available ASAP so I can test it before my return period runs out.









That was a successful campaign we ran here to get Nvidia to do something about it. It feels satisfying to get results after all the "whining".


----------



## fauka

Hello, I just want to say that I don't recommend using a BIOS from other cards. I have a 1070 MSI Sea Hawk X and flashed the Palit GameRock BIOS; I saw a few people using it fine, but I had bad luck and my card went down. I only hope the warranty will apply and they'll send me back a working one. It seems strange, though, because I was running SLI and the second card went down, not the first. Anyway, just letting you know about my problem.


----------



## MyNewRig

Quote:


> Originally Posted by *fauka*
> 
> Hello, I just want to say that I don't recommend using a BIOS from other cards. I have a 1070 MSI Sea Hawk X and flashed the Palit GameRock BIOS; I saw a few people using it fine, but I had bad luck and my card went down. I only hope the warranty will apply and they'll send me back a working one. It seems strange, though, because I was running SLI and the second card went down, not the first. Anyway, just letting you know about my problem.


What do you mean by "went down"? Did the BIOS flash brick your card? You can still flash a bricked card back to the original BIOS: boot with another GPU plugged in, then use NVFlash to reflash the bricked one. The warranty will of course be void if you send the card in without flashing the stock BIOS back.


----------



## Jimbags

Those people whining about the FE having Samsung RAM while a lot of other manufacturers use Micron or whatever on non-FE models: well, that's not Nvidia's fault. The Strix PCB, for example, is not manufactured by Nvidia, nor is any other non-Nvidia card for that matter. I have a GTX 1070 FE and it overclocks very decently: 2,100+ MHz on the core and over 9,000 MHz effective memory clock. It never hits 80 C (and my ambient is higher than most, too). It also really isn't that noisy at all; I game with headphones anyway, but I've had it run BOINC overnight and it's only 1 m from where I sleep. Honestly, I love the look of the FE, better than most non-stock models IMHO (some come close). It does have adjustable lighting behind the lettering too; it's not the RGB LEDs that are the fad at the moment, but I like it.







No malice intended toward anyone, just another point of view.







Also, maybe the VRAM is part of the reason the FE is a special edition card...


----------



## gtbtk

Quote:


> Originally Posted by *fauka*
> 
> Hello, I just want to say that I don't recommend using a BIOS from other cards. I have a 1070 MSI Sea Hawk X and flashed the Palit GameRock BIOS; I saw a few people using it fine, but I had bad luck and my card went down. I only hope the warranty will apply and they'll send me back a working one. It seems strange, though, because I was running SLI and the second card went down, not the first. Anyway, just letting you know about my problem.


You can easily recover the bricked card.

Disable any SLI settings and take off the bridge. Make sure the bricked card is in the second slot and boot off the good card in the primary slot, or even better, an iGPU.

Force-update the bricked card with the stock BIOS using the command "nvflash -6 -i1 stockbiosfile.rom".

I flashed a Galaxy BIOS to my Gaming X the other day and it bricked the card. I suspect the BIOS file that failed was mislabeled and was actually for a different type of card.

After finishing the flash job, cross your fingers and reboot; it should come back to life again.
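For reference, a tiny sketch documenting how those NVFlash arguments fit together. The flag meanings below are as described in the post above; treat them as assumptions and verify against the `--help` output of your own nvflash build before flashing anything:

```python
# Build the NVFlash recovery command line from the post above.
# Assumed flag meanings (verify with `nvflash --help`):
#   -6         : override the PCI subsystem-ID mismatch check
#   -i<index>  : target adapter index, e.g. 1 = card in the second slot
def nvflash_recovery_cmd(rom_path, index=1):
    """Return the argv list for force-flashing a stock ROM back."""
    return ["nvflash", "-6", f"-i{index}", rom_path]

print(nvflash_recovery_cmd("stockbiosfile.rom"))
# → ['nvflash', '-6', '-i1', 'stockbiosfile.rom']
```

Run it (e.g. via `subprocess.run`) only from the working GPU's desktop, with the bricked card in the secondary slot as described above.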


----------



## jlhawn

My GTX 1070 (in my sig) is performing very well out of the box, so I haven't overclocked it; I don't see the need to.
It has Samsung memory.

this is while Folding


----------



## MyNewRig

Quote:


> Originally Posted by *Jimbags*
> 
> thats not nvidias fault.


OMG, Again?!









We discussed this point to death for weeks, and it is 100% on Nvidia. The proof is that Nvidia is the one who actually provided the FIX for ALL partner cards with all their different PCBs, and Nvidia HAS to approve major components like memory, and write the BIOS and drivers for them, before partners can actually use them.

Please, no more of that "not Nvidia's fault" stuff.


----------



## MyNewRig

@gtbtk did you notice NVIDIA's wording? _*"improve the overclocking experience for certain users"*_

They make it sound like an OC-only issue that is affecting only some users. It's understandable though; they are playing it safe this time to avoid any legal liability that might arise.


----------



## gtbtk

https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/post/4986835/#4986835

Quote:


> Originally Posted by *Jimbags*
> 
> Those people whining about FE having samsung ram and alot of other manfacturers using micron or what ever on no fe models. Well thats not nvidias fault. The strix pcb for example is not manufactured by nvidia. Or any other nin nvidia card for that mattter. I have a GTX 1070 FE, it overclocks very decent, 2100+mhz on the core and over 9000mhz effective memory clock. Never hits 80c (my ambient is higher than most too). Also really isnt that noisy at all, I game with headphones anyway but ive had it run boinc oveenight and its only 1m from where I sleep. Honestly I love the look of the FE, Better than most non-stock models imho. (Some come close). It does have adjustable lighting behind the righting too, its not rgb leds thats the fad atm but I like it
> 
> 
> 
> 
> 
> 
> 
> Not malice intended towards anyone just another point of view
> 
> 
> 
> 
> 
> 
> 
> Also maybe the vram is apart of the reason FE is a special edition card....


You should really get your facts sorted out before making sweeping assumptions like that. Just because product X works fine doesn't mean that products Y and Z work the same way. That is particularly true in electronics, where products are revised in the middle of production runs; there is no guarantee that a substitute component isn't faulty, or doesn't need a control-software modification that slipped through.

While you are right that Asus makes its own customized PCBs, as do MSI, EVGA, Zotac and others, all the custom boards are designed to comply with base-level electrical standards defined by Nvidia, and all the boards run a BIOS whose base code is supplied by Nvidia. The vendors overlay that with their branding, clock settings, and tweaked power-management parameters, but that is all done within the framework Nvidia has created and is in effect little more than window dressing.

Besides, Nvidia has just announced an updated BIOS that is being distributed to partners to address the memory-voltage bug in the Micron cards' BIOS.

https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/post/4986835/#4986835


----------



## criminal

Quote:


> Originally Posted by *MyNewRig*
> 
> That was such a successful campaigning we have done here to get Nvidia to do something about it, feels satisfying when you get results after all the "whining"


I am glad all the whining paid off for you guys.









Good luck!


----------



## MyNewRig

Quote:


> Originally Posted by *criminal*
> 
> I am glad all the whining paid off for you guys.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Good luck!


I was honestly using the word "whining" sarcastically. We did what any customer with a broken product would do when turned away through private support channels: we went public, explained the problem objectively, and demanded a fix or more information about what was going on. We were aiming for results from the beginning, so it was not really whining.

But let's not jump the gun here. We have not seen or tested the proposed BIOS fix, or even gotten an ETA yet, so this might not end so easily after all. It is a very good start, though: the issue is now officially recognized and acknowledged, so if the BIOS update does not really improve the situation, we have a good base to work from. At least no one can claim anymore that no issue exists, which was the case during the past few weeks.


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> @gtbtk did you notice NVIDIA's wording? *"improve the overclocking experience for certain users"*
> 
> They make it sound like an OC only issue that is effecting only some users, it is pretty understandable though, they are playing it safe this time to avoid any legal liabilities that might arise


Yes, I saw that. Given that Micron memory is only installed on a subset of all 1070s, their statement is accurate. There are some Micron 1070 owners, I'm sure, who have never experienced the bug because they never tried to overclock their card. Remember, doing a straw poll of the users here will give you a skewed statistical result, because the only people who come here are interested in overclocking their cards in the first place.

Your card may be slightly faulty regardless of the bug, or it may even be something like your PC overclock causing PCIe instability and crashing your card.

Could I suggest that you reset your CPU and system RAM to stock settings and see if you still get the same crashes at stock clocks? If you have a high BCLK OC or tweaked voltages on the motherboard, that may be causing your GPU to freak out, with nothing to do with the memory at all.

If your card works at stock, reapply your OC one change at a time and test whether the card starts acting up again at each step. If you find it is an element of your PC overclock, dial that back a bit so you are stable again.


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> Yes I saw that. Given that Micron memory is only installed on a subset of all 1070s their statement is accurate. There are some Micron 1070 owners, I'm sure, who have never experienced the bug because they never tried to overclock their card. Remember doing a straw pole of the users here, will give you a skewed statistical result because the only people who come here are interested in overclocking their cards in the first place
> 
> Your card may be slightly faulty regardless of the bug or it may even be something like your PC overclock that is causing PCIe instability and causing your card to crash.
> 
> Could I suggest that you reset your CPU and System RAM to stock settings and then see if you are still having the same crashing issues at stock clocks? If you have a high BCLK OC or tweaked voltages on the motherboard, it may be causing your GPU to have freak out and have nothing to do with the memory at all.
> 
> If your card starts working at stock, make a single change to start to put your OC back in place and test if the card starts freaking out again at each step. If you find it is an element of your PC overclock, dial that back a bit so you get stable again.


There is no need for all that stability testing, because I have a much simpler and better test: I ran a couple of Samsung GTX 1070s for about a month on the same system, with the same BIOS settings and all, in games and benchmarks, even with +600 memory, and they were flawless; not a single crash or artifact during one whole month. It only started after the Micron card went into the system, so it is clear what the culprit is.

Also, the artifacting at stock settings became less frequent with the latest driver update, and when it crashes, it crashes with a checkerboard artifact followed by a BSOD, so it is very clear what is going on here.


----------



## TheDeadCry

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xg4m3*
> 
> Source?
> 
> 
> 
> Official post by Nvidia in GeForce Official Support Forums
> 
> https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/post/4986835/#4986835
> 
> Click to expand...
> 
> did you notice that Sora got very quiet? hehe
Click to expand...

That dude is obnoxious, if only for the sole reason that he's a pretentious, condescending D-Bag who tries to silence all dissent and anything that doesn't fit his narrative. He can't even consider alternatives. It's pretty pathetic. I wish I could hide his comments on the Nvidia forums. Don't get me wrong, others' opinions are great, but his comments do not promote positive discourse IMHO. I feel like he tries to devolve the conversation into a pissing match where nothing constructive can get done. There are grown-up ways to approach these things, and "Sora" in my opinion doesn't handle it as such.


----------



## TheDeadCry

Quote:


> Originally Posted by *criminal*
> 
> I am glad all the whining paid off for you guys.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Good luck!


"Whining" With that attitude I assume you either have a lot of disposable income, or that you just love riding the corporate D*ck ( I am aware that the asterisk is stupid...but I'm unsure of what language is "acceptable") *Sigh* Explain your issue. Let's talk about this. I'm honestly curious as to why you seem content with mediocrity. Hell, maybe I'm misinterpreting - If so, disregard the previous comments. lmao There is a +1 rep in it for you


----------



## gtbtk

Quote:


> Originally Posted by *TheDeadCry*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xg4m3*
> 
> Source?
> 
> 
> 
> Official post by Nvidia in GeForce Official Support Forums
> 
> https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/post/4986835/#4986835
> 
> Click to expand...
> 
> did you notice that Sora got very quiet? hehe
> 
> Click to expand...
> 
> That dude is obnoxious - if only for the sole reason that he's a pretentious, condescending D-Bag that tries to silence all dissent and anything that doesn't otherwise fit his narrative. He can't even consider any alternatives. It's pretty pathetic. I wish I could hide his comments from the nvidia forums. Don't get me wrong, other's opinions are great but his comments do not promote positive discourse IMHO...I feel like he tries to devolve the conversation into a pissing match where nothing constructive can get done. There are grown-up ways to approach these things, and "Sora" in my opinion doesn't handle it as such.
Click to expand...

He certainly doesn't add anything constructive to the conversation.

If anything he undermines the Nvidia product range


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Yes I saw that. Given that Micron memory is only installed on a subset of all 1070s their statement is accurate. There are some Micron 1070 owners, I'm sure, who have never experienced the bug because they never tried to overclock their card. Remember doing a straw pole of the users here, will give you a skewed statistical result because the only people who come here are interested in overclocking their cards in the first place
> 
> Your card may be slightly faulty regardless of the bug or it may even be something like your PC overclock that is causing PCIe instability and causing your card to crash.
> 
> Could I suggest that you reset your CPU and System RAM to stock settings and then see if you are still having the same crashing issues at stock clocks? If you have a high BCLK OC or tweaked voltages on the motherboard, it may be causing your GPU to have freak out and have nothing to do with the memory at all.
> 
> If your card starts working at stock, make a single change to start to put your OC back in place and test if the card starts freaking out again at each step. If you find it is an element of your PC overclock, dial that back a bit so you get stable again.
> 
> 
> 
> There is no need for all that stability testing because i have a much simpler and better test, tested a couple Samsung GTX 1070s for about a month on the same system with the same BIOS settings and all, in games and benchmarks even with +600 memory and they were flawless, not a single crash or artifact during one whole month, it only started after that Micron card went into the system, so it is clear what is the culprit here.
> 
> Also the artifacting at stock settings were less frequent with the latest driver update, and when it crashes it crashes with a checkerboard artifact followed by a BSOD, so it is very clear what is going on here.
Click to expand...

That pretty much makes me think the Micron memory in your case is a red herring. I think your card is actually faulty; it should not be crashing at stock.


----------



## TheDeadCry

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheDeadCry*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xg4m3*
> 
> Source?
> 
> 
> 
> Official post by Nvidia in GeForce Official Support Forums
> 
> https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/post/4986835/#4986835
> 
> Click to expand...
> 
> did you notice that Sora got very quiet? hehe
> 
> Click to expand...
> 
> That dude is obnoxious - if only for the sole reason that he's a pretentious, condescending D-Bag that tries to silence all dissent and anything that doesn't otherwise fit his narrative. He can't even consider any alternatives. It's pretty pathetic. I wish I could hide his comments from the nvidia forums. Don't get me wrong, other's opinions are great but his comments do not promote positive discourse IMHO...I feel like he tries to devolve the conversation into a pissing match where nothing constructive can get done. There are grown-up ways to approach these things, and "Sora" in my opinion doesn't handle it as such.
> 
> Click to expand...
> 
> He certainly doesn't add anything constructive to the conversation.
> 
> If anything he undermines the Nvidia product range
Click to expand...

Exactly. If anything I believe he's a turn-off. He certainly wouldn't have convinced me to get an Nvidia card, lmao.


----------



## Hunched

Does anyone know why using "Prefer Maximum Performance" will eventually go stupid and force the card into 3D mode even when no 3D apps are running?
A restart fixes it, only for a while then it happens again.

It's pretty annoying.


----------



## Forceman

Quote:


> Originally Posted by *Hunched*
> 
> Does anyone know why using "Prefer Maximum Performance" will eventually go stupid and force the card into 3D mode even when no 3D apps are running?
> A restart fixes it, only for a while then it happens again.
> 
> It's pretty annoying.


Which driver version? I thought they fixed that a few drivers ago.


----------



## TheDeadCry

Quote:


> Originally Posted by *Hunched*
> 
> Does anyone know why using "Prefer Maximum Performance" will eventually go stupid and force the card into 3D mode even when no 3D apps are running?
> A restart fixes it, only for a while then it happens again.
> 
> It's pretty annoying.


Are you using any type of program that has hardware acceleration enabled? Maybe your browser, possibly even Steam, stuff like that. Have you been monitoring the GPU usage? Some applications can cause issues like this. I'd recommend checking Task Manager and monitoring your GPU usage. Maybe try closing one application at a time and see if one is causing the strange behaviour.
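If you want numbers rather than guesswork, you can poll the card's utilization and clocks and flag the "idle but full 3D clocks" state, e.g. from one line of `nvidia-smi --query-gpu=utilization.gpu,clocks.sm --format=csv,noheader,nounits` output. A minimal sketch of the decision logic; the thresholds and the sample lines below are illustrative assumptions, not captured from a real card:

```python
# Sketch: decide whether the card looks stuck in 3D clocks while idle.
# Input is one CSV line from nvidia-smi: "utilization %, SM clock MHz".
# The idle thresholds are assumptions; tune them for your own card.
def stuck_in_3d(csv_line, max_idle_util=5, max_idle_clock=600):
    util, clock = (int(field) for field in csv_line.split(","))
    return util <= max_idle_util and clock > max_idle_clock

print(stuck_in_3d("0, 1886"))  # idle utilization but full 3D clocks → True
print(stuck_in_3d("0, 139"))   # properly idling in 2D clocks → False
```

Running that check in a loop while killing background tasks one at a time would show exactly which process drops the card back to idle.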


----------



## Roland0101

Quote:


> Originally Posted by *TheDeadCry*
> 
> Are you using any type of program that has hardware acceleration enabled? Maybe your browser, even steam possibly - stuff like that. Have you been monitoring the GPU usage? Some applications can cause issues like this. I'd recommend checking the task manager, and monitoring your gpu usage. Maybe try closing one application at a time and see if one is causing the strange behaviour.


Yes, that is probably the reason. It's better to set "Prefer Maximum Performance" in the Game profiles and not globally.


----------



## Hunched

Quote:


> Originally Posted by *Forceman*
> 
> Which driver version? I thought they fixed that a few drivers ago.


372.90
Every now and then, for no reason, it will get stuck as if it believes a game is running while I'm just sitting at the desktop.
Quote:


> Originally Posted by *TheDeadCry*
> 
> Are you using any type of program that has hardware acceleration enabled? Maybe your browser, even steam possibly - stuff like that. Have you been monitoring the GPU usage? Some applications can cause issues like this. I'd recommend checking the task manager, and monitoring your gpu usage. Maybe try closing one application at a time and see if one is causing the strange behaviour.


I have as few things open as one can have: no Steam, browser, or monitoring.
I guess I'll just end every single task possible next time it happens, even though they're basically all Windows tasks that need to be running.

I'll walk away from my computer and come back, and somehow it's in 3D mode with high clocks and voltages.
The only things I have running that interact with the GPU are MSI Afterburner and SpeedFan, with as little monitoring enabled as possible.

It would be cool if anything ever worked properly; I'm so tired of my 1070.


----------



## HOODedDutchman

I did a little writeup on sli scaling if anyone is interested.
http://www.overclock.net/t/1612629/gtx-1070-sli-scaling-comparison


----------



## TheDeadCry

Quote:


> Originally Posted by *Hunched*
> 
> 372.90
> Every now and then for no reason it will get stuck like it believes a game is running when I'm just sitting at the desktop.
> I have as little things open as one can have, no Steam, browser, monitoring.
> I guess I'll just end every single task possible next time it happens, even though they're basically all Windows tasks that need to be running.
> 
> I'll walk away from my computer and come back and somehow its in 3D mode with high clocks and voltages.
> The only things I have running that interact with the GPU is MSI Afterburner and SpeedFan, with as little monitoring enabled as you can have.
> 
> It would be cool if anything worked properly ever, I'm so tired of my 1070.


The fact that you leave for a bit and come back to having full 3D clocks makes me think that it is, in fact, the case that something running in the background is messing with your card. I can't say for sure, obviously. I'm just giving you some suggestions, that's all. I've had this kind of issue in the past. Additionally, Roland gave some good advice. If all else fails you should try changing the game profiles individually to only use the "Prefer Maximum Performance" setting when in game, instead of globally in the settings.


----------



## Hunched

Quote:


> Originally Posted by *TheDeadCry*
> 
> The fact that you leave for a bit and come back to having full 3D clocks makes me think that it is, in fact, the case that something running in the background is messing with your card. I can't say for sure, obviously. I'm just giving you some suggestions, that's all. I've had this kind of issue in the past. Additionally, Roland gave some good advice. If all else fails you should try changing the game profiles individually to only use the "Prefer Maximum Performance" setting when in game, instead of globally in the settings.


Yes, that is a solution, though it's a bit of a pain to have to do that for every game, especially if you ever need to do a clean install, lose all your settings, and have to redo it.

GPU-Z utilization is 0% when this is happening, everything is how it should be except clocks and voltage.
Hopefully it's one of the non-Windows things running in Task Manager, since that's only about 5 things. It should be easy to find and fix if one of those is the problem.
Nothing can ever be simple though, so I doubt there's an easy fix. It would be nice to end one of these tasks and see the card drop back into idle mode.

Just gonna wait for it to happen again.
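A rough way to catch it in the act: poll the core clock and snapshot the process list whenever the card jumps out of idle, then diff against a baseline. A minimal Python sketch, assuming `nvidia-smi` is on PATH and Windows' `tasklist` is available; the idle threshold is a guess you'd tune for your own card:

```python
import subprocess
import time

def parse_clock_mhz(csv_text: str) -> int:
    """Parse nvidia-smi's clock query output, e.g. '1607' -> 1607."""
    return int(csv_text.strip().splitlines()[0])

def read_clock_mhz() -> int:
    # nvidia-smi ships with the driver; clocks.gr is the graphics clock.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=clocks.gr", "--format=csv,noheader,nounits"],
        text=True)
    return parse_clock_mhz(out)

def snapshot_processes() -> set:
    # Windows process list as CSV with no header; first field is the image name.
    out = subprocess.check_output(["tasklist", "/fo", "csv", "/nh"], text=True)
    return {line.split('","')[0].strip('"') for line in out.splitlines() if line}

def monitor(idle_mhz: int = 300, interval_s: int = 5) -> None:
    """Whenever the core clock sits above idle_mhz, print any processes
    that appeared since the baseline snapshot (threshold is illustrative)."""
    baseline = snapshot_processes()
    while True:
        if read_clock_mhz() > idle_mhz:
            new = snapshot_processes() - baseline
            print("clock above idle; new processes:", sorted(new) or "(none)")
        time.sleep(interval_s)

# monitor()  # run on the affected machine; Ctrl+C to stop
```

This only narrows it down to whole processes, but it would at least confirm which background task shows up at the same moment the clocks jump.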


----------



## gtbtk

Quote:


> Originally Posted by *Hunched*
> 
> Does anyone know why using "Prefer Maximum Performance" will eventually go stupid and force the card into 3D mode even when no 3D apps are running?
> 
> A restart fixes it, only for a while then it happens again.
> 
> It's pretty annoying.


I have not seen that behavior before. Is that running in multi-monitor or single-monitor mode?


----------



## Roland0101

Quote:


> Originally Posted by *Hunched*
> 
> Yes that is a solution, though its a bit of a pain to have to do that for every game, especially if you ever need to do a clean install and lose all your settings and have to redo it.


You can use NVIDIA Inspector (http://www.majorgeeks.com/files/details/nvidia_inspector.html)
to export and import (after a clean install) all your modified game profiles in one go.
It also shows settings not available in the NVCP.


----------



## TheDeadCry

Quote:


> Originally Posted by *Hunched*
> 
> Yes that is a solution, though its a bit of a pain to have to do that for every game, especially if you ever need to do a clean install and lose all your settings and have to redo it.
> 
> GPU-Z utilization is 0% when this is happening, everything is how it should be except clocks and voltage.
> Hopefully it's one of the non-Windows things running in task manager, since that's about 5 things. Should be easy to find and fix if one of those are the problem.
> Nothing can ever be simple though, so I doubt there's an easy fix. Would be nice though to end one of these tasks and see it drop back into idle mode.
> 
> Just gonna wait for it to happen again.


Agreed. Sometimes it's the little things. I clean install like every week, lol. Unfortunately, I can't think of anything more that you could try, aside from maybe using DDU and/or making sure that inbuilt ***** like Xbox DVR is turned off. Xbox DVR has been known to cause a host of problems for people. I hate how Windows 10 makes my computer seem infected with malware.


----------



## F3niX69

Quote:


> Originally Posted by *criminal*
> 
> Yep. I am really satisfied with my card. I believe all the extra phases and power connections are wasted on Pascal. I have not seen any aftermarket cards that consistently overclock better than the FE's. My FE is a Zotac.


All FE cards are manufactured by the same OEM, so the EVGA, ASUS, Zotac, etc. FEs are all the same; some of them just offer better warranties.
It also seems that FEs have a tendency to OC higher than AIB cards, which is really annoying. And the extra phases don't seem to help much, as you said. A single 8-pin is all a 1070 will ever need, I think; there's no point in 8+6 or 8+8 setups.
The 1070 and 1080 chips are basically locked by Nvidia so they can't go higher, even though they could.
Also, I would like to make a statement and mark my words, based on what I said above: I foresee that the next 11x Nvidia GPUs will probably be a Pascal refresh, like the 6 and 7 series.
*Drops the mic*


----------



## Hunched

I found the cause but I don't know how to fix it.

It's "Settings" in the Task Manager: whenever it is open the GPU goes into 3D mode, and when it's closed, clocks and voltages lower.


I don't know why it's showing up in Task Manager when it isn't even visible or opened on screen.
It seems to just randomly automatically run in the background.

I found its .exe and added it to NVCP programs and changed the power preference, but it doesn't listen.

Anyone know how to stop "Settings" from randomly starting and running in the background, or how to make power preferences for the executable actually work?


----------



## Hunched

It happens if I open Calculator too, anything that uses that Windows 10 app interface style.
Am I the only person running Windows 10 and Prefer Maximum Performance or something?

Suddenly nobody can help that I found the cause...

When I add Windows Calculator .exe to NVCP Program Settings and change it to adaptive or optimal performance, that actually works and clocks and voltages lower.
Doesn't work for SystemSettings.exe though


----------



## Forceman

Quote:


> Originally Posted by *Hunched*
> 
> It happens if I open Calculator too, anything that uses that Windows 10 app interface style.
> Am I the only person running Windows 10 and Prefer Maximum Performance or something?
> 
> Suddenly nobody can help that I found the cause...
> 
> When I add Windows Calculator .exe to NVCP Program Settings and change it to adaptive or optimal performance, that actually works and clocks and voltages lower.
> Doesn't work for SystemSettings.exe though


I had the same thing when I first got my card, but it went away with a driver change and then I switched back to adaptive (or whatever the other choice is) so I don't know if it came back. Must be some UWP thing with the apps - I just tested it with the 960 in my HTPC and the same full power thing happens, so it isn't just a 1070 problem.


----------



## TheDeadCry

Quote:


> Originally Posted by *Hunched*
> 
> It happens if I open Calculator too, anything that uses that Windows 10 app interface style.
> Am I the only person running Windows 10 and Prefer Maximum Performance or something?
> 
> Suddenly nobody can help that I found the cause...
> 
> When I add Windows Calculator .exe to NVCP Program Settings and change it to adaptive or optimal performance, that actually works and clocks and voltages lower.
> Doesn't work for SystemSettings.exe though


Ah, sorry. I was AFK for a bit. It's both very odd, and also not surprising, that it's the Windows apps causing an issue. Windows 10 apps, unless I'm somehow horribly mistaken, are hardware accelerated. However, I'm not sure I'm going to be very helpful here; I have not experienced the issue personally, though I may try to replicate it. You may want to try a PowerShell command for uninstalling all Windows 10 apps. Open PowerShell as admin, paste this: *Get-AppxPackage -AllUsers | Remove-AppxPackage* and let it do its thing. May be worth a try.


----------



## Roland0101

Quote:


> Originally Posted by *Hunched*
> 
> It happens if I open Calculator too, anything that uses that Windows 10 app interface style.
> Am I the only person running Windows 10 and Prefer Maximum Performance or something?
> 
> Suddenly nobody can help that I found the cause...
> 
> When I add Windows Calculator .exe to NVCP Program Settings and change it to adaptive or optimal performance, that actually works and clocks and voltages lower.
> Doesn't work for SystemSettings.exe though


Did you reboot after you changed the settings?

I haven't looked into this for quite some time, so there might be changes in the drivers by now, but the whole idea of "Prefer maximum performance" was to indeed prefer maximum performance.
Or in other words: set the card to its base clocks, don't let the voltage go down on normal idle, and don't let the PCI-E slot power down.

Then Nvidia introduced profiles for some applications or services that are present at system start, overriding the global "Prefer maximum performance" setting with "Adaptive", so "Prefer maximum performance" would only become active if a 3D application starts. But that caused problems on some systems, because these applications or services remain active for the entire session and prevented "Prefer maximum performance" from becoming active at all.

On a Windows 10 computer, just about every program that uses GPU acceleration will set the clock speed to the base clock of the GPU (along with the other things I mentioned) if you have "Prefer maximum performance" globally activated.
That costs you money, radiates unnecessary heat into your case, and will probably reduce the lifetime of your components.

The better way is to set "Prefer maximum performance" only in the profiles of the applications you need to run on that setting, and to use NVIDIA Inspector (which I linked on the previous page) to export your profiles. That way you avoid the negative effects of a globally set "Prefer maximum performance", plus you can have your game profiles back inside a minute after you perform a clean installation of the driver.


----------



## TheDeadCry

Quote:


> Originally Posted by *TheDeadCry*
> 
> Ah, sorry. I was AFK for a bit. It's both very odd, but also not surprising, in that it's the windows apps causing an issue. Windows 10 Apps, unless I'm somehow horribly mistaken, are hardware accelerated. However, I'm not sure I'm going to be very helpful here. I have not experienced the issue personally. I may try to replicate the issue. You may want to try a power shell command for uninstalling all windows 10 apps. Open powershell as admin paste this: *Get-AppxPackage -AllUsers | Remove-AppxPackage* and let it do its thing. May be worth a try.


Additionally, if this doesn't work and you need your UWP apps back, paste this command: *Get-AppxPackage -allusers | foreach {Add-AppxPackage -register "$($_.InstallLocation)\appxmanifest.xml" -DisableDevelopmentMode}*


----------



## TheGlow

For me it's the MSI Gaming App service that keeps me in 3D clocks.
Once I realized that and stopped it, my clocks would wind down, but then I would get the checkerboard freeze all the time when opening other things like MS Edge.
So I leave the service running; it locks me into 3D clocks and keeps voltage at 725-800, so I don't checkerboard anymore.
The only issue is that I need to manually apply my OC each startup, because I will checkerboard-lock at boot since Afterburner applies it before the service kicks in.


----------



## Roland0101

Quote:


> Originally Posted by *TheGlow*
> 
> For me its running the MSI Gaming app service that keeps me in 3d clocks.
> Its when I realized that and stopped it my clocks would wind down. but then I would get the checkboard freeze all the time opening other things like MS Edge.
> So I leave the service running, locks me into 3d clocks and keeps voltage at 725-800 so I dont checker board anymore.
> Only issue is I need to manually apply my OC each start up because I will checkboard lock at boot since Afterburner applies it before the service kicks in.


Did you consider using the automatic profile management in Afterburner? The 2D profile at clocks that don't give you problems, and the 3D profile for OC settings?


----------



## BroPhilip

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hunched*
> 
> Does anyone know why using "Prefer Maximum Performance" will eventually go stupid and force the card into 3D mode even when no 3D apps are running?
> 
> A restart fixes it, only for a while then it happens again.
> 
> It's pretty annoying.
> 
> 
> 
> I have not seen that behavior before. is that running with the multi monitor or single monitor mode?

I had the same issue... the only fix was a clean driver install with DDU.


----------



## TheGlow

Quote:


> Originally Posted by *Roland0101*
> 
> Did you consider to use the automatic profile management in Afterburner? The 2D profile at clocks that don't give you problems and the 3D profile for OC settings?


Yeah, that didn't do anything. At boot it would think I was in 3D already and checkerboard me anyway.
As I said, the MSI Gaming App service is running; that keeps me in 3D, so I can tweak and set my card all I want and it doesn't glitch up on me.
I just need to manually apply after a reboot, which I haven't had to do in weeks, so it's a decent option for now. Afterburner is set to open on launch, just not apply the profile, so that's a reminder to apply it then.


----------



## Hunched

What doesn't make sense is that setting the global profile to Optimal Power fixes it, which means it's changing the Program Setting of something out there to Optimal Power as well, correct?
Attempting to open "Settings" with the .exe it points to when you "open file location" does literally nothing, so I don't think this .exe has anything to do with power management for "Settings", even though it should.
It's like a dummy .exe or something. Somewhere out there, the thing I actually need to change is being changed when I change my global power setting. If I could just find that and set it in Program Settings...

This is what the solution should be, but these settings do not actually change anything.


Any idea on how to find the correct program which I can then set to Optimal Power? Or how to stop "Settings" from randomly deciding to start running in the background?
It would be easier and makes more sense to leave maximum performance as the global setting if I could change this 1 program setting to optimal or adaptive power.

Quote:


> Originally Posted by *Forceman*
> 
> I had the same thing when I first got my card, but it went away with a driver change and then I switched back to adaptive (or whatever the other choice is) so I don't know if it came back. Must be some UWP thing with the apps - I just tested it with the 960 in my HTPC and the same full power thing happens, so it isn't just a 1070 problem.


The nice thing about the other apps, though, such as Calculator, is that they will appear under recently used programs when you go to add one in NVCP, and setting Calculator to Optimal Power actually works.
I'd rather it work with Settings and not Calculator if I had to choose, since Calculator doesn't randomly run itself in the background and never needs to be used.

Quote:


> Originally Posted by *TheDeadCry*
> 
> Ah, sorry. I was AFK for a bit. It's both very odd, but also not surprising, in that it's the windows apps causing an issue. Windows 10 Apps, unless I'm somehow horribly mistaken, are hardware accelerated. However, I'm not sure I'm going to be very helpful here. I have not experienced the issue personally. I may try to replicate the issue. You may want to try a power shell command for uninstalling all windows 10 apps. Open powershell as admin paste this: *Get-AppxPackage -AllUsers | Remove-AppxPackage* and let it do its thing. May be worth a try.


Yes, they are hardware accelerated, but you can fix some of them, such as Calculator, in NVCP the same way you fix Google Chrome: by choosing Optimal Power.
Uninstalling "Settings" isn't a great idea. It's a pretty necessary thing to keep, and unfortunately it's the one I can't figure out how to set to Optimal Power or stop from running in the background.
Quote:


> Originally Posted by *Roland0101*
> 
> -snip-


I rebooted, yes. The idea of Prefer Maximum Performance is to only give you maximum performance when 3D apps are running, like games; that's what I want it for... not Windows Settings, lol.
Yes, I can set Prefer Maximum Performance for the hundreds of games I play... but there are only 2 things that aren't games that I use which have an issue with this:
Google Chrome and "Settings". Google Chrome is easily fixed by selecting Optimal Power in NVCP, which should work the same for Settings, but it doesn't.

There are far fewer things I'd need to configure in Program Settings if I use Prefer Maximum Performance as the global setting; there are only 2 of them.
Unfortunately Windows is stupid, and so far it's been impossible to find the actual .exe or whatever for "Settings" so I can adjust it like I have with Google Chrome.

I've been trying to find an answer about SystemSettings.exe on Google but no luck, though I have found 1 or 2 others complaining of it and other Windows "apps" autorunning randomly and interfering with the global maximum performance setting they're using. No solutions though; only some of the apps work properly in NVCP Program Settings, as far as people know.

There's clearly a solution out there, using Optimal Power globally is making something somewhere related to "Settings" use my global profile, and it would show that in program settings if I could find it, and then I could just set it to Optimal. It's just hiding somewhere, unless I'm wrong in thinking it works this way, and somehow it has no program settings anywhere.


----------



## Roland0101

Quote:


> Originally Posted by *Hunched*
> 
> There's clearly a solution out there, using Optimal Power globally is making something somewhere related to "Settings" use my global profile, and it would show that in program settings if I could find it, and then I could just set it to Optimal. It's just hiding somewhere, unless I'm wrong in thinking it works this way, and somehow it has no program settings anywhere.


Try "Shellexperiencehost.exe" and Cortana.


----------



## Hunched

Quote:


> Originally Posted by *Roland0101*
> 
> Try "Shellexperiencehost.exe" and Cortana.


I tried that already; I put them both on Optimal Power regardless of my global setting.
"Settings" still boosts everything up to max clocks if my global setting is Prefer Maximum Performance.

If I could find where to set it to Optimal Power like everything else, or disable it from running in the background whenever it pleases for no reason, we would have this solved.


----------



## TheDeadCry

Quote:


> Originally Posted by *Hunched*
> 
> What doesn't make sense is that if the global profile is set to Optimal Power that will fix it, which means it's changing the Program Setting of something out there to Optimal Power as well correct?
> Attempting to open "settings" with the .exe it directs to when you "open file location" does literally nothing, I don't think this .exe has to do with power management for "Settings" even though it should.
> It's like a dummy .exe or something. Somewhere out there what I actually need to change, is being changed when I change my global power setting, if I could just find that and do that in program settings...
> 
> This is what the solution should be, but these settings do not actually change anything.
> 
> 
> Any idea on how to find the correct program which I can then set to Optimal Power? Or how to stop "Settings" from randomly deciding to start running in the background?
> It would be easier and makes more sense to leave maximum performance as the global setting if I could change this 1 program setting to optimal or adaptive power.
> The nice thing about the other apps though, such as Calculator, is they will appear in recently used programs when you go to add one in NVCP, and setting Calculator to Optimal Power actually works.
> I'd rather it work with Settings and not Calculator if I had to choose, since Calculator doesn't randomly run itself in the background and does not need to ever be used.
> Yes they are hardware accelerated, but you can fix this on some of them such as Calculator in NVCP the same way you can fix Google Chrome by choosing Optimal Power.
> Uninstalling "Settings" isn't a great idea. It's pretty necessary thing to keep, and unfortunately is the one that I can't figure out how to set to Optimal Power or stop it from running in the background.
> I rebooted yes. The idea of Prefer Maximum Performance is to only give you the maximum performance when 3D apps are running, like games, that's what I want it for... not Windows Settings lol.
> Yes, I can set it to Prefer Maximum Performance for the hundreds of games I play... but there are only 2 things that aren't games that I use which have issue with this.
> Google Chrome and "Settings", Google Chrome which is easily fixed by selecting Optimal Power in NVCP, as should be the same for Settings but it's not.
> 
> There's far less things I'd need to configure in program settings if I use Prefer Maximum Performance as global, there's only 2 things.
> Unfortunately Windows is stupid and it's so far impossible to find the actual .exe or whatever for "Settings" to adjust accordingly like I have with Google Chrome.
> 
> I've been trying to find an answer about SystemSettings.exe on Google but no luck, though I have found 1 or 2 others complain of it and other Windows "apps" autorunning randomly and interfering with the global maximum performance settings they're using. No solutions though, not for all the apps as only some work properly in NVCP Program Settings as far as people know.
> 
> There's clearly a solution out there, using Optimal Power globally is making something somewhere related to "Settings" use my global profile, and it would show that in program settings if I could find it, and then I could just set it to Optimal. It's just hiding somewhere, unless I'm wrong in thinking it works this way, and somehow it has no program settings anywhere.


The command I showed you will only delete non-critical apps.


----------



## Hunched

Quote:


> Originally Posted by *TheDeadCry*
> 
> The command I showed you will only delete non critical apps.


But the issue itself is Windows Settings, no other apps are causing the GPU to boost.
Well, other apps can, but they aren't a problem because I simply don't run them. Windows Settings keeps running itself in the background, which is why it's a problem.
Unlike other apps that can be fixed via NVCP, like Calculator, I can't find a way to control the power setting of Windows Settings besides the global performance setting, or a way to stop it from always running itself.

Oh Microsoft...


----------



## TheDeadCry

Quote:


> Originally Posted by *Hunched*
> 
> But the issue itself is Windows Settings, no other apps are causing the GPU to boost.


Yeah, I don't know. Who knows, maybe it will knock something loose. That and DDU are the only things I can think of, unless you want to do a clean install.


----------



## Hunched

Quote:


> Originally Posted by *TheDeadCry*
> 
> Yeah, I don't know. Who knows, maybe it will knock something loose. That, and DDU are the only things I can think of. Unless you want to clean install


I have a clean install of 372.90. I think I'm just going to give up and use Optimal Power globally.

It seems it's impossible to set Windows Settings to Optimal, Adaptive, or Maximum Performance.
I also don't know how to stop it from randomly starting itself and running in the background. Obviously if it weren't running I wouldn't be having issues, but apparently it really likes hanging out in the Task Manager and keeps coming back uninvited.

There isn't a specific service I can turn off simply called "Windows Settings" or anything like that; there probably is something, but I just can't find it, or it's grouped in with a service that does multiple things.
If only the picture I posted worked, because it should, and does work with everything else like Chrome, Calculator, and every other application you could have on your PC.

Leave it to Microsoft to create an .exe that cannot work with NVCP somehow...


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> That pretty much makes me think that the Micron Memory in your case is a red herring. I think your card is actually faulty. It should not be crashing at stock


Let me set something straight here: it occasionally shows artifacts at stock, it doesn't crash. The checkerboard crashing happens at +200 memory (the last time I ran a game, 3 days ago). I have not tested +150 and +100 yet, but if it is BSODing at +200 and showing artifacts at stock, there is probably no need for any further testing, as it barely works at stock and isn't even stable enough to maintain stock frequencies at stock settings. I expect that if I underclock it to the 7600-7800MHz region it will get into the range of full stability, but what is the point?

The way I stumbled upon this issue eliminates any other problems with my system besides the Micron memory chips on the GPU.

What happened is that I had the opportunity to try two different Samsung cards for a few weeks. I loved everything about them, so I bought one for myself in August. At that time I did not know anything about all of this; I just installed the card, was pretty excited, and ran Rise of the Tomb Raider in 2K. After a few minutes of gameplay I felt there was something off about the card: it was laggy and did not feel nearly as fluid as it had with the Samsung card. I thought maybe it was because I had not tried OCing it yet; given that I was doing +600-700MHz on the Samsungs, I dialed in +500 memory on that card and got an instant checkerboard BSOD! At that point I knew there was something totally off about that card, but I did not know what. Eventually I ran GPU-Z screenshots of both cards side by side and noticed that the only difference was that one had Samsung and the other had Micron!

The next day I contacted my retailer asking what that piece of garbage they sent me was; I wanted a card with Samsung memory, because the one they sent does not perform right and is laggy during high-res gaming. They set up an RMA with their supplier requesting a replacement card with Samsung memory. The supplier contacted ASUS, who said they could not provide that, so the retailer told me to just return the card, get my money back, and buy another one, since it was not easy to handle that under warranty.

During that time the retailer received a restock of the card from a new batch, so I bought one from the new batch, put it in the system, and noticed the same symptoms along with a lot of coil whine. I ran GPU-Z and it turned out to have Micron memory again! At this point I went Googling "GTX 1070 Micron Memory"...

You know the rest of the story...


----------



## agntallen

Quote:


> Originally Posted by *HOODedDutchman*
> 
> I don't play much multiplayer lately. Waiting for bf1. Last couple months or so I've been playing fallout4, witcher 3, rise of the tomb raider. All amazing. Recommend witcher 3 the most tho. Gameplay is amazing and really pushes the hardware and looks stunning. Fallout4 and ride of the tomb raider do too but fallout4 doesn't look as good and the witcher has WAY more gameplay then rise of the tomb raider. Plus u can find witcher 3 for $30 game of the year edition with all dlc.


I do have The Witcher 3, and I can notice a lot more detail in the game than I did on the R9 390.

Quote:


> Originally Posted by *Roland0101*
> 
> That's a good attitude, it's a great card.
> 
> 
> 
> 
> 
> 
> 
> 
> What type of games do you like to play?
> G-Sync is a nice features, there are however problems for some users with the Windows AU and G-sync, you might read up on that or wait a few months until MS and Nvidia can short that out.


I ended up getting the ASUS Strix 1070 and I'm definitely happy with it. Hardware Canucks' review of the Strix 1070 is what brought me to that card.
I've been playing a lot of BF4 lately, but I'm looking to diversify my options. I already own The Witcher 3 and am still in the process of beating the game.

I currently run it on the ASUS VG248QE, which is a 1080p 144Hz monitor, and was thinking of possibly selling that and upgrading to 1440p.


----------



## kevindd992002

What is the difference between Zotac ZT-P10700I-10P and ZT-P10700B-10P? The former is not even in Zotac's website but it can be found in Newegg CA and Jet.com.


----------



## madmeatballs

Quote:


> Originally Posted by *kevindd992002*
> 
> What is the difference between Zotac ZT-P10700I-10P and ZT-P10700B-10P? The former is not even in Zotac's website but it can be found in Newegg CA and Jet.com.


Well, it's weird that Newegg doesn't label it as an AMP! Extreme, even though it looks pretty much like one and matches its specs.

I have the Zotac ZT-P10700B-10P SKU

I have Samsung memory if that matters.

Someone else with an AMP! Extreme should check theirs if they have ZT-P10700I-10P SKU and see what is different.


----------



## HOODedDutchman

Quote:


> Originally Posted by *kevindd992002*
> 
> What is the difference between Zotac ZT-P10700I-10P and ZT-P10700B-10P? The former is not even in Zotac's website but it can be found in Newegg CA and Jet.com.


The first one isn't really overclocked and has 2x 6-pin connectors, while the AMP and AMP Extreme have 2x 8-pin connectors and are overclocked. This card also has no backplate. It's a cheaper version of the 1070, made to compete with the Gigabyte Windforce 2 and other well-priced aftermarket cards with fewer features. The cooler is also dual-slot instead of 2.5-slot like the AMP Extreme (the regular AMP is dual-slot too). The AMP is basically the same PCB etc. as the AMP Extreme, just with a somewhat smaller cooler. It still runs very cool and is a very good price for what you get. I actually bought one while I was running an HTPC, realized on the way home that my PSU didn't have dual 8-pin connectors, and turned around and exchanged it without even opening the box, lol.


----------



## gtbtk

Quote:


> Originally Posted by *Hunched*
> 
> It happens if I open Calculator too, anything that uses that Windows 10 app interface style.
> Am I the only person running Windows 10 and Prefer Maximum Performance or something?
> 
> Suddenly nobody can help that I found the cause...
> 
> When I add Windows Calculator .exe to NVCP Program Settings and change it to adaptive or optimal performance, that actually works and clocks and voltages lower.
> Doesn't work for SystemSettings.exe though


Are you running Afterburner?

Have a look in the RivaTuner app: you may find the apps that are ramping up your overclock in the application profiles section. There are also application detection level settings you can play with for the global setting.


----------



## gtbtk

Quote:


> Originally Posted by *madmeatballs*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kevindd992002*
> 
> What is the difference between Zotac ZT-P10700I-10P and ZT-P10700B-10P? The former is not even in Zotac's website but it can be found in Newegg CA and Jet.com.
> 
> 
> 
> Well, its weird how Newegg doesn't label it as an AMP! Extreme even if it looks pretty much like it and it's specs.
> 
> I have the Zotac ZT-P10700B-10P SKU
> 
> I have Samsung memory if that matters.
> 
> Someone else with an AMP! Extreme should check theirs if they have ZT-P10700I-10P SKU and see what is different.

MSI cards have different serial number ranges that differentiate the Samsung cards from the Micron cards. I am surmising that the difference between the Zotac part numbers is also the memory brand.


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> That pretty much makes me think that the Micron Memory in your case is a red herring. I think your card is actually faulty. It should not be crashing at stock
> 
> 
> 
> Let me set something straight here, it occasionally shows artifacts at stock not crash, the checkerboard crashing happens at +200 memory last time i ran a game (3 days ago) , i have not tested +150 and +100 yet, but if it is BSODing at +200 and shows artifacts at stock then there is probably no need for any further testing as it barely works at stock and not even stable enough to maintain stock frequencies at stock settings, i expect if i under-clock it to 7600-7800Mhz region it will get into the range of full stability, but what is the point?
> 
> The way i stumbled upon this issue eliminates any other problems with my system other than the Micron memory chips on the GPU,
> 
> What happened is that i had the opportunity to try two different Samsung cards for a few weeks, i loved everything about them that i bough one for myself in August, at that time i did not know anything about all of this, i just installed the card, was pretty excited, ran Rise of the Tomb Raider in 2K, and after a few minutes of gameplay i felt there was something off about the card, the gameplay was laggy and did not feel near as fluid as it used to be with the Samsung card, i thought maybe because i have not tried OCing it yet, given that i was doing +600-700Mhz on the Samsungs, i dialed +500 memory on that card and get instant checkerboard BSOD! at that point i knew there was something totally off about that card for sure but i did not know what it was, eventually i ran GPU-Z screenshots of both cards side by side and noticed that the only difference is that one had Samsung and the other had Micron!
> 
> The next day I contacted my retailer, saying: what is this piece of garbage you sent me? I want a card with Samsung memory, because the one you sent does not perform right and is laggy during high-res gaming. They set an RMA condition with their supplier requesting a replacement card with Samsung memory; the supplier contacted ASUS, who said they could not provide that, so the retailer told me to just return the card, get my money back, and buy another one, since it was not easy to handle that under warranty.
> 
> During that time the retailer received a restocked batch of the card, so I bought one from the new batch, put it in the system, and noticed the same symptoms along with a lot of coil whine. I ran GPU-Z and it turned out to have Micron memory again! At this point I went Googling "GTX 1070 Micron Memory" ...
> 
> You know the rest of the story ...
Click to expand...

It should be rock solid at stock speeds and not show ANY artifacts or cause BSOD for that matter. That is what makes me think your card actually has a fault and warrants being replaced.

My card shows the same artifacts if it is idling at voltages below .780V and I hit apply on a +500MHz memory OC. If the voltage is locked in AB at anything above .800V, the memory OC works fine. That is how I determined it was a software bug and not a hardware fault, and why I started bugging Nvidia to fix the bug.

In your case though, I think you have two issues and the micron memory problem at oc speeds is ending up being a distraction to the stock speed artifact problem.


----------



## Tobe404

Thought I'd chuck up a Firestrike run of my Gainward GTX 1070 Phoenix. I'm pretty stoked with the results. Cheers all.

http://www.3dmark.com/3dm/15160082


----------



## Ljanmi

What is max I can expect from Samsung memory in Afterburner? +700 or more?


----------



## Tobe404

Quote:


> Originally Posted by *Ljanmi*
> 
> What is max I can expect from Samsung memory in Afterburner? +700 or more?


I'd say more around +600, give or take. Things start getting funny at anything above +625 for me. Most reviews only go up to +600-+650.


----------



## HOODedDutchman

Seems a bit low on the graphics score. I see better results from the Skylake platform for some reason. Even my scaling is higher than most reviews that use X99.


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> It should be rock solid at stock speeds and not show ANY artifacts or cause BSOD for that matter. That is what makes me think your card actually has a fault and warrants being replaced.
> 
> My card shows the same artifacts if it is idling at voltages below .780V and I hit apply on a +500MHz memory OC. If the voltage is locked in AB at anything above .800V, the memory OC works fine. That is how I determined it was a software bug and not a hardware fault, and why I started bugging Nvidia to fix the bug.
> 
> In your case though, I think you have two issues and the micron memory problem at oc speeds is ending up being a distraction to the stock speed artifact problem.


I don't think so. I strongly believe that 8GHz is not the lowest common denominator for Micron GDDR5 ICs: some run stable at 8400MHz+ and some run stable at 7600MHz without funny power tricks like voltage-locking or prefer-maximum-performance. I might just have an IC that is not 100% stable at 8000MHz causing the issue. These Micron GDDR5 chips should have been run at 7000-7500MHz to make sure that all samples are 100% stable; they are just not cutting it at their maximum rated data rate.

As discussed before, an RMA is a huge pain. I would not go through the hassle without at least making sure I have a very high probability of getting a quality product in return, and as you know, that is not the case right now. If I RMA and get a replacement from current stock, I might get lucky and get a sample that is stable up to, say, 8200MHz. That is not enough headroom to make sure the card will stay stable for its lifetime, across driver updates and future, more stressful games, so an RMA at this point is not useful.

I will RMA or return it in the end, but I was waiting for an update from NVIDIA to test on the sample I have now before deciding what to replace it with. If the new BIOS gets me to 8400-8600MHz stable without doing anything funny, then I will know there is hope and this is not a completely lost cause. On the other hand, if the BIOS does not improve the headroom enough, then I will know that Micron GDDR5 is not suitable to run at 8GHz to begin with, and I will either get a Founders Edition with Samsung or just get my money back and wait for VEGA with HBM2.

That is my plan. It involves a lot less hassle and should produce much better results than blindly RMAing now without knowing what I am getting myself into.


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> MSI cards have different serial number ranges that differentiate the Samsung cards from the Micron cards. I am surmising that the difference in the Zotac ranges also comes down to memory brand.


Could you give me more information? I have compared six different serial numbers for ASUS Strix cards, 3 with Samsung and 3 with Micron, and there are no clues whatsoever as to which one has what memory type based on the serial number. If that is not the case with MSI, I would like to learn more; maybe I could identify a Samsung card by its serial number, which would be a pretty good solution if the information is indeed accurate and dependable.


----------



## SpirosKGR

Guys, does anyone know if the EVGA Superclocked model comes with Samsung memory?

Thanks


----------



## MyNewRig

Quote:


> Originally Posted by *SpirosKGR*
> 
> Guys, does anyone know if the EVGA Superclocked model comes with Samsung memory?
> 
> Thanks


Currently, no, they don't; all recent production is exclusively Micron.


----------



## criminal

Quote:


> Originally Posted by *TheDeadCry*
> 
> "Whining" With that attitude I assume you either have a lot of disposable income, or that you just love riding the corporate D*ck ( I am aware that the asterisk is stupid...but I'm unsure of what language is "acceptable") *Sigh* Explain your issue. Let's talk about this. I'm honestly curious as to why you seem content with mediocrity. Hell, maybe I'm misinterpreting - If so, disregard the previous comments. lmao There is a +1 rep in it for you


For one, did you notice the wink smiley? I was mostly joking. Some people with Micron memory cards have legitimate issues with their cards running at rated specs; those people deserve to be upset and to have something done about it. But the people "whining" because their Micron-equipped card doesn't overclock very well... well, those people need to grow up and move on. You can read my past comments in this thread if you want to know more, because I am not getting into all of this again.


----------



## Kronos8

Quote:


> Originally Posted by *MyNewRig*
> 
> Currently no, they don't, all recent production is exclusively Micron


Any feedback on FE cards? Do they still have Samsung memory?
I really wonder what the case will be with the new FE cards that are coming.
MSI 1070 FE due Oct 7 on Amazon.de.
Asus 1070 FE due Oct 17 on Amazon.de.
PNY 1070 FE, no ETA on Amazon.de.


----------



## MyNewRig

Quote:


> Originally Posted by *Kronos8*
> 
> Any feedback on FE cards? Do they still have Samsung Memories?
> I really wonder what will be the case on new FE cards that are coming.
> MSI 1070 FE due on Oct 7 on Amazon.de.
> Asus 1070 FE due on Oct 17 on Amazon.de.
> PNY1070 FE no ETA on Amazon.de.


All FEs are manufactured directly by NVIDIA and are sent to board partners fully assembled; the partners just provide the retail box, logistics, and customer support. As far as we know, all FEs on the market right now have Samsung memory; there has not been a single report of an FE with Micron so far. Whether they switch to Micron or not remains solely Nvidia's decision. I highly doubt they will, though; Nvidia would probably be crazy to do it.

EDIT: just remember that the upcoming BIOS could change everything. If the Micron memory controller is simply misconfigured in the current BIOS and they have now configured it properly, then Micron cards could end up being as good as Samsung ones. The next few days will reveal a lot; we just have to wait and see how it turns out.


----------



## Kronos8

@MyNewRig
Thanks for reply.


----------



## fauka

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fauka*
> 
> Hello, I just want to say that I don't recommend using a BIOS from another card. I had a 1070 MSI Sea Hawk X and flashed the Palit GameRock BIOS; I saw a few people using it and they were fine, but I got bad luck and my card went down. I only hope the warranty will apply and they send me back a working one :/ Anyway, it seems strange, because I was running SLI and the second card went down, not the first ;/ Anyway, I'm just letting you know about my problem :S
> 
> 
> 
> You can easily recover the bricked card.
> 
> Disable any SLI settings you have and take off the bridge, Make sure the bricked card is in the second slot and boot off the good card in the primary slot or even better, an iGPU
> 
> Force update the bricked card with the stock bios using the command. "nvflash -6 -i1 stockbiosfile.rom"
> 
> I flashed a Galaxy BIOS to my Gaming X the other day and it bricked the card. I suspect the BIOS file that failed was mislabeled and was actually for a different type of card.
> 
> After finishing the flash job, cross your fingers, reboot and it should come back to life again.
Click to expand...

Nah mate... it didn't brick, it's just dead... that's it. I don't see the card anywhere, not in the system BIOS or anything. Even the light from the MSI logo is gone ;|


----------



## MyNewRig

Quote:


> Originally Posted by *fauka*
> 
> Nah mate... it didn't brick, it's just dead... that's it. I don't see the card anywhere, not in the system BIOS or anything. Even the light from the MSI logo is gone ;|


Yeah, that is exactly what bricking is: the card turns into a worthless piece of metal that cannot do anything. Did you try the method we described to flash the original BIOS back?


----------



## fauka

Yeah, nothing ;// when I was trying to flash it, it always said no display adapter :|


----------



## MyNewRig

Quote:


> Originally Posted by *fauka*
> 
> yeah nothing ;//


Crap. Your warranty is most probably void by now, but you can try sending it in anyway and see what happens; maybe they can fix it for a small fee or something. Good luck, man.


----------



## BroPhilip

Quote:


> Originally Posted by *MyNewRig*
> 
> All FEs are directly manufactured by NVIDIA and are sent to board partners in a fully manufactured form, they just provide the retail box, logistics and customer support, as far as we know all FEs in the market right now have Samsung, not a single report of an FE with Micron so far, whether they will switch to Micron or not remains solely Nvidia's decision, i highly doubt they will switch to Micron though, Nvidia would probably be crazy to do it.
> 
> EDIT: just remember that the new upcoming BIOS could possibly change everything, if the Micron memory controller is just simply misconfigured in the current BIOS and they have now configured it properly in the upcoming BIOS then Micron cards could end up being as good as Samsung ones, the next few days will reveal a lot, we just have to wait and see how it turns out.


I am hoping that is the truth... my Micron card will only checkerboard if I push +425 to +450; everything below that is pretty good. +350 seems to be pretty stable, but the most demanding games I run are Battlefront (full settings, no vsync) and Arkham Knight. That's 8700 to 8800MHz, though I feel it is limited by my core overclock, which is only stable at 2088 (could be wrong). Either way it is decent; if this BIOS boosts the OC any more, it will be a win on my part. I still feel people are right about the Micron issue and am thankful to those who called attention to it. I have been nervous, as I don't have anything that really stresses the GPU. Being that I have a high-end card (Gaming Z), the extra power and voltage may be what has kept me from some of the issues. Anyway, it's a win for the consumer any time a company responds, and it also earns the company respect. I can endure a lot if I feel the company is dealing with it.


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> MSI cards have different serial number ranges that differentiate the Samsung cards and the Micron cards. I am surmising that the difference is in the Zotac ranges is also the different memory brand
> 
> 
> 
> Could you give me more information? i have compared six different serial numbers for ASUS Strix cards, 3 with Samsung and 3 with Micron, there are no clues whatsoever as to which one has what memory type based on the serial number, if it is not the case with MSI i would like to learn more about that, maybe i will be able to identify a Samsung card by its serial number and that would be a pretty good solution if the information is indeed accurate and dependable.
Click to expand...

I can only speak for MSI Gaming X cards.

The Samsung cards were 602-v330-06xxxxxxx and the Micron cards are 602-v330-4xxxxxxx. At least that was the case up until last August.
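As a rough illustration of that prefix rule, here is a quick Python sketch. The prefixes are just what I have observed on Gaming X cards, so treat any match as a hint, not a guarantee; the function name and example serials are made up for illustration:

```python
import re

# Observed MSI Gaming X 1070 serial prefixes (assumption: only known to hold
# for Gaming X cards produced up to roughly August; other models may differ).
PATTERNS = {
    "Samsung": re.compile(r"^602-v330-06", re.IGNORECASE),
    "Micron": re.compile(r"^602-v330-4", re.IGNORECASE),
}

def guess_memory_vendor(serial: str) -> str:
    """Return the likely memory vendor for an MSI Gaming X serial, or 'Unknown'."""
    serial = serial.strip()
    for vendor, pattern in PATTERNS.items():
        if pattern.match(serial):
            return vendor
    return "Unknown"

print(guess_memory_vendor("602-V330-06B123456"))  # Samsung
print(guess_memory_vendor("602-V330-4A7654321"))  # Micron
```

Note the two prefixes cannot collide: the Samsung range continues with "06" after the dash while the Micron range continues with "4", so checking them in either order gives the same answer.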


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Yes, I saw that. Given that Micron memory is only installed on a subset of all 1070s, their statement is accurate. There are some Micron 1070 owners, I'm sure, who have never experienced the bug because they never tried to overclock their card. Remember, a straw poll of the users here will give you a skewed statistical result, because the only people who come here are interested in overclocking their cards in the first place.
> 
> Your card may be slightly faulty regardless of the bug or it may even be something like your PC overclock that is causing PCIe instability and causing your card to crash.
> 
> Could I suggest that you reset your CPU and system RAM to stock settings and then see if you are still having the same crashing issues at stock clocks? If you have a high BCLK OC or tweaked voltages on the motherboard, it may be causing your GPU to freak out, and this may have nothing to do with the memory at all.
> 
> If your card starts working at stock, make a single change to start to put your OC back in place and test if the card starts freaking out again at each step. If you find it is an element of your PC overclock, dial that back a bit so you get stable again.
> 
> 
> 
> There is no need for all that stability testing, because I have a much simpler and better test: I tested a couple of Samsung GTX 1070s for about a month on the same system with the same BIOS settings, in games and benchmarks, even with +600 memory, and they were flawless. Not a single crash or artifact during one whole month; it only started after the Micron card went into the system, so it is clear what the culprit is.
> 
> Also, the artifacting at stock settings became less frequent with the latest driver update, and when it crashes, it crashes with a checkerboard artifact followed by a BSOD, so it is very clear what is going on here.
Click to expand...

Just been thinking: check your VRM/power settings in the PC BIOS. If the settings are turned down for maximum power economy rather than performance, they may be starving the card of power.


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> Just been thinking. Check your VRM/power settings in the PC bios. If the settings are turned down for maximum power economy rather than performance it may be starving the card of power


I don't have any more patience to tinker with this. In my experience, unstable chips remain sensitive and problematic no matter what you try to do to stabilize them unless the issue is addressed at a deeper level, so I will just save my energy until the new NVIDIA vBIOS is released for my card, and then I will continue testing.


----------



## Nukemaster

Quote:


> Originally Posted by *gtbtk*
> 
> Just been thinking. Check your VRM/power settings in the PC bios. If the settings are turned down for maximum power economy rather than performance it may be starving the card of power


I do not think any setting on the board will regulate power to the video card. It gets 12 volts from the power supply (and a bit of 3.3V) and has its own VRMs to lower the voltage as needed. These features affect the CPU (because it has a power system on the board) and maybe some onboard devices (green Ethernet).

It sucks to see users having issues with these cards. Someone should have done more testing with the new memory before releasing those cards.


----------



## MyNewRig

Quote:


> Originally Posted by *Nukemaster*
> 
> I do not think any setting on the board will regulate power to the video card. It gets 12 volts from the power supply (and a bit of 3.3V) and has its own VRMs to lower the voltage as needed. These features affect the CPU (because it has a power system on the board) and maybe some onboard devices (green Ethernet).
> 
> It sucks to see users having issues with these cards. Someone should have done more testing with the new memory before releasing those cards.


Exactly. I don't believe any VRM/power setting in the BIOS has anything to do with the discrete GPU; those settings relate to the CPU, the iGPU, and onboard components. The GPU is an entire subsystem of its own, with its own BIOS, VRM, etc., so I won't even bother. That system was rock-solid stable for months until the Micron GTX 1070 went in, and since then the system has been poisoned.

Anyway, I will not waste my life on this broken Micron 1070. If the BIOS fix does not work flawlessly after some minor testing, and Nvidia does not move back to Samsung GDDR5, which is known to work well, then I will just move on.


----------



## TheGlow

Quote:


> Originally Posted by *gtbtk*
> 
> My card shows the same artifacts if it is idling at voltages below .780V and I hit apply on a +500MHz memory OC. If the voltage is locked in AB at anything above .800V, the memory OC works fine. That is how I determined it was a software bug and not a hardware fault, and why I started bugging Nvidia to fix the bug.


This is similar to what I observed. If I disable the Gaming App to drop back into 2D clocks and apply +300 on the memory, it seems fine. I can put in +500 or so and it also appears good, but then launching something like MS Edge or opening Origin, which triggers 3D clocks, would give me the checkerboard.
As I mentioned, the Gaming App has me locked at 800mV currently, but when it sat at 725 or 745 (I can't recall which), it was fine at +800.
Quote:


> Originally Posted by *gtbtk*
> 
> I can only speak for MSI Gaming X cards.
> The Samsung cards were 602-v330-06xxxxxxx and the Micron cards are 602-v330-4xxxxxxx. At least that was the case up until last August.


My MSI Gaming X with Micron follows this pattern: 602-v330-4xxxxxxx.


----------



## zipzop

Anyone know what type of metal the ACX 3.0 contact surface for the die is, whether it's nickel-plated copper or aluminum? The reason is I'd like to put on some CLU, though I hear it shouldn't be used on aluminum.

Happy that Nvidia recognized the Micron issue; I'll be watching the EVGA forums for the update.


----------



## TheDeadCry

Quote:


> Originally Posted by *MyNewRig*
> 
> Don't have any more patience to tinker with this any more, in my experience, unstable chips remain sensitive and problematic no matter what you try to do to stabilize them, unless the issue is addressed on a deeper level, so i will just save my energy until that new NVIDIA vBIOS is released for my card and then i will continue testing.


Once the new bios was announced, I immediately went back to stock. I'm not going to waste my time on OCing anymore, unless I know we have a proper bios. I can wait.


----------



## Roland0101

Quote:


> Originally Posted by *Nukemaster*
> 
> I do not think any setting in the board will regulate power to the video card. It gets 12 volts off the power supply(and a bit of 3.3) and has its own VRMs to deal with lowering the voltage as needed. These features effect the cpu(because it has a power system on the board) and maybe some onboard devices(green ethernet).


That depends. The FE draws from the PCIe slot's 12V rail, as do cards with a single 8-pin power connector, so any BIOS power-saving features that influence the PCIe slots still influence the GPU.


----------



## Star Forge

I do have a question for some of you: Are there people here with Samsung VRAM that couldn't get their cards to overclock well? I am curious.


----------



## TheDeadCry

Quote:


> Originally Posted by *criminal*
> 
> For one, notice the wink smiley? I was mostly joking. Some people with Micron memory cards have legitimate issues with their card running at rated specs. Those people deserve to be upset about the issue and have something done about the issue. But the people "whining" because their micron equipped card doesn't overclock very well.... well those people need to grow up and move on. You can read my past comments in this thread if you want to know more because I am not getting into all of this again.


Sure, but this "whining" actually got Nvidia to acknowledge that there ARE issues with selected 1070s based on a possible voltage regulation management issue. I would agree that if someone's card is unstable at stock, there is clearly some other issue that needs to be addressed via RMA and will probably not be solved by a new BIOS. However, these complaints are clearly legitimate, or Nvidia wouldn't have acknowledged them. THOSE people are NOT whiners; they are smart consumers. I understand that you were being somewhat facetious, but dismissing people by calling them "whiners" is unfair. Stop "whining" about how some people actually give a damn about making sure their product works as expected. I look forward to your response.


----------



## TheGlow

Quote:


> Originally Posted by *Star Forge*
> 
> I do have a question for some of you: Are there people here with Samsung VRAM that couldn't get their cards to overclock well? I am curious.


I think I recall someone mentioning that, but I'm not 100% sure.
On the flip side, there are some rare Micron cards like mine that can OC to +800 with no artifacts that I recall; I didn't see any until +825.
I've settled on +700 in the meantime. On the core I've gotten up to +210 without crashing in Time Spy, but it artifacted; at +200 I didn't notice any, but I've settled on +180 for now.


----------



## Star Forge

Quote:


> Originally Posted by *TheGlow*
> 
> I think I recall someone mentioning that, but not 100%.
> On the flipside there are some rare microns like mine that I can oc to +800 and dont recall artifacts.I didnt see any until +825.
> I've settled on +700 in the meantime. Core I've gotten up to 210 w/o crash on timespy, but artifacted. 200 I didnt notice any, but settled on +180 for now.


For your Micron board, did you tweak anything in regards to voltages? What is your power limit? Curious again.


----------



## Nukemaster

Quote:


> Originally Posted by *Roland0101*
> 
> That depends. The FE uses the PCIe slot 12v Rail, as cards with a 1x8 Pin power connector do too. So all Bios power saving features that influence the PCIe slots are still influencing the gpu.


Telling the slot to enter its power-saving state does not actually limit the electrical power available at the slot (the device also has to allow those savings). If it did, then no card would ever have burned a PCIe slot by going over the slot limit. Some boards with 3 and 4 x16 slots have burned because the 24-pin connector does not provide enough power (or contacts for 12 volts) and burned at the connector; this is why many boards have an extra Molex-type power connector to feed additional x16 slots.

I am *NOT* saying that no board has ever had more control over the slot, just that most do not mess with the power limits.

I do not even think that a board can tell how much power the slot is using at this point in time. They tend to rely on the card to not take too much.

USB on the other hand generally uses something like a polyswitch(self resetting) to limit current and avoid issues of overdraw.


----------



## Roland0101

Quote:


> Originally Posted by *Nukemaster*
> 
> Telling the slot to enter its power savings does not actually limit the electrical power in the slot(the device also has to allow these savings). If it did, then no card would have ever burned a pci-e slot from going over the slot limit. Some boards with 3 and 4 x16 slots have burned because the 24 pin connector does not provide enough power(or contacts for 12 volts) and burned at the connector.


True of course, if the power-saving features are working as intended, but we are talking about a possible problem with those features.
Quote:


> I do not even think that a board can tell how much power the slot is using at this point in time. They tend to rely on the card to not take too much.
> 
> USB on the other hand generally uses something like a polyswitch(self resetting) to limit current and avoid issues of overdraw.


You are right, there is no safeguard that limits the power consumption on the PCIe slot, see the RX 480 problem.


----------



## TheGlow

Quote:


> Originally Posted by *Star Forge*
> 
> For your Micron board, did you tweak anything in regards to voltages? What is your power limit? Curious again.


When pushing for max, the core voltage was +100 and the power limit 126%.
But for my daily 180/700 use, the core voltage is at +0, though the power limit is still 126%.


----------



## criminal

Quote:


> Originally Posted by *TheDeadCry*
> 
> Sure, but this "Whining" actually got Nvidia to acknowledge that there ARE issues with selected 1070's based on a possible voltage regulation management issue. I would agree that if someones card is unstable at stock, there is clearly some other issue that needs to be addressed via RMA and will probably not be solved by a new bios. However, clearly these complaints are legitimate, or Nvidia wouldn't have acknowledged it. THOSE people are NOT whiners. They are smart consumers. I understand that you were being somewhat facetious, but disregarding people by calling them "Whiners" is "unfair". Stop "Whining" about how some people actually give a damn about making sure their product is working as expected. I look forward to your response.


LOL... I think we can agree that people wanting their defective product to work right aren't whiners.


----------



## reflex75

Quote:


> Originally Posted by *TheGlow*
> 
> When pushing for max the core voltage was +100 and power limit 126%.
> But for my daily use 180/700 the Core voltage is at +0, but Powerlimit still 126.


Don't forget to mention that you use the MSI Gaming App, which keeps the card in 3D mode all the time to prevent the voltage drop and the resulting crash.


----------



## TheDeadCry

Quote:


> Originally Posted by *criminal*
> 
> LOL... I think we can agree that people wanting their defective product to work right aren't whiners.


People have a right to complain. I just wanted to make that clear. I don't want to see pages upon pages of people complaining about problems with their cards; on the other hand, if it is legitimately a widespread concern, it's beneficial to talk about. Of course nobody likes to hear someone screaming that their smoking graphics card needs to be fixed by Nvidia. From what I've seen, most people's complaints are legitimate. I do know that there have been actual defective modules (the MSI recall, for example); it's really a mess. My card is limited by voltage: my max OC is maybe 2025MHz stable, and my power usage always stays very low, around 80% max. It's very misleading to have a slider that goes up to 126%, but I digress. I'm very curious how a new BIOS will impact the power dynamics of the card.


----------



## gtbtk

Quote:


> Originally Posted by *Star Forge*
> 
> I do have a question for some of you: Are there people here with Samsung VRAM that couldn't get their cards to overclock well? I am curious.


I think there are people who cannot get their cards to +600, but I am not aware of any having the checkerboard artifact and subsequent BSOD problem.


----------



## RaleighStClair

Quote:


> Originally Posted by *Star Forge*
> 
> I do have a question for some of you: Are there people here with Samsung VRAM that couldn't get their cards to overclock well? I am curious.


I have Samsung memory and can only do +500 mem stable. But I can get 2100 core stable, until it power-throttles down to a lower bin. My card never gets above 50C, so technically I would need a modded BIOS to keep it at that clock speed.

I was able to run Valley at 2200 on a suicide run using the voltage-lock trick. It does eventually drop down another bin in games, though not in benchmarks. But my memory is pretty much locked at +500, which isn't that great.


----------



## gtbtk

Quote:


> Originally Posted by *TheDeadCry*
> 
> Quote:
> 
> 
> 
> Originally Posted by *criminal*
> 
> LOL... I think we can agree that people wanting their defective product to work right aren't whiners.
> 
> 
> 
> People have a right to complain. I just wanted to make that clear. I don't want to see pages upon pages about people complaining about problems with their cards. On the other hand, if it is legitimately a widespread concern, its beneficial to talk about. Of course nobody likes to hear someone screaming that their smoking graphics card needs to be fixed by a Nvidia. From what I've seen, most peoples complaints are legitimate. I do know that there have been actual defective modules (MSI Recall, for example) it's really a mess. My card is limited by voltage. My max OC is maybe 2025mhz stable. My power usage always stays very low, seems around maybe 80% max. It's very misleading to have a slider that goes up to 126%, but I digress. I'm very curious as to how a new bios will impact the power dynamics of the card.
Click to expand...

OC.N is a technically based discussion forum. I know some of it is boasting about how much you can get in Firestrike, but it is also about technical discussion and problem solving when overclocking computer devices. Many of the people who come here don't really understand how this stuff works (that's a good thing, it keeps IT salaries high), so they come here to get their problems solved by the "experts". The problem is, there are no experts when a new architecture has just been released to the market and the inventor of the product is not telling you the secrets of how the thing works internally.

The posts here told the problem solvers among us important things about this particular problem, like it being common to all brands and not isolated to one vendor substituting shoddy parts. That would not have been obvious on a single-brand forum, where it would have taken much longer to identify and resolve, since there would have been no way to compare notes between different brands of cards.

Yes in China, the country of counterfeit eggs, counterfeit rice, counterfeit iphones and melamine substitution in milk powder, there was a recall of MSI product locally. I still don't know what the quality problems were that caused them to do that recall from vendors. MSI cards weren't recalled anywhere else that I am aware of.

BTW, if you want higher overclocks and to get closer to your power target, move the voltage, power target, and temp sliders to maximum.

Apply a custom fan curve or a fixed fan speed of, say, 85%.

Set the core clock slider to -50 (yes, negative) in Afterburner, then before you apply that, press Ctrl+F to open the curve screen. Start off by dragging the 1.093 V point to 2100 MHz, then drag the 0.975 V point to 1999 MHz and hit apply. You will end up with a stair-step-looking curve that remains flat from 1.093 V to the top.

Adjust the memory slider to taste. I find +400 is reliable on both DX11 and DX12 with my card. Above that point, DX11 is OK but DX12 doesn't like it so much.

You can dial in this OC by adjusting either the 1.093 V point or the 0.975 V point up or down by 12 to 13 MHz at a time. The 0.975 V point seems to have a big impact on the amount of power the card draws.

I tend to find my card falls over and crashes the graphics driver if it tries to draw more than about 220 W. I use a program called HWiNFO64 to monitor how much power my card is drawing under load. You can set it up to display extra monitoring info, like the number of watts the card is drawing and even how much the power supply is outputting if you have a smart power supply.

The best overclocks I have gotten tend to hover around 180-205 W under load in Firestrike.
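The curve trick described above can be sketched in code. This is a minimal sketch of the idea only: the point spacing and voltage/clock values below are illustrative assumptions, not Afterburner's actual internal table.

```python
# Sketch of the stair-step voltage/frequency curve described above:
# every point at or above the pivot voltage is clamped to one target
# clock, so the curve stays flat from that voltage to the top.
def flatten_curve(points, pivot_v=1.093, pivot_mhz=2100):
    """Return a new (voltage, MHz) list, flat at pivot_mhz above pivot_v."""
    return [(v, pivot_mhz if v >= pivot_v else mhz) for v, mhz in points]

# Illustrative curve points (Afterburner's real curve has many more).
curve = [(0.800, 1700), (0.975, 1999), (1.043, 2050), (1.093, 2088), (1.100, 2093)]
stepped = flatten_curve(curve)
# Points at or above 1.093 V now all sit at 2100 MHz; lower points are untouched.
```

Dialing the OC in then amounts to nudging `pivot_mhz` (or the 0.975 V point) up or down in small steps, as the post suggests.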


----------



## Roland0101

Quote:


> Originally Posted by *RaleighStClair*
> 
> I have samsung memory and can only do +500 mem stable.


Afterburner or effective?


----------



## TheDeadCry

Quote:


> Originally Posted by *gtbtk*
> 
> OC.N is a technical based discussion forum. I know some of it is boasting about how much you can get in Firestrike but it is also about Technical discussion and problem solving when overclocking computer devices. Many of the people who do come here dont really understand how this stuff works (thats a good thing, it keeps IT salaries high...
> 
> 
> 
> 
> 
> 
> 
> ) so the come here to get their problems solved by the "experts". Problem is, there are no experts when a new architecture is just released to the market and the inventor of the product is not telling you the secrets of how the thing works internally.
> 
> The posts here told the problem solvers among us important things about this particular problem. Like it being common to all brands not isolated to one vendor substituting shoddy parts. That would not have been obvious on a single brand forum where It would have taken much longer to identify and resolve as there is no way to compare notes between different brands of cards as it were.
> 
> Yes in China, the country of counterfeit eggs, counterfeit rice, counterfeit iphones and melamine substitution in milk powder, there was a recall of MSI product locally. I still don't know what the quality problems were that caused them to do that recall from vendors. MSI cards weren't recalled anywhere else that I am aware of.
> 
> BTW, if you want to get higher overclocks and get closer to your power target, move the Voltage, power target and temp sliders to maximum.
> 
> Apply a custom curve or fixed fan speed at say 85%.
> 
> Set the core clock slider to -50 (yes negative) in Afterburner then before you apply that, Ctrl-F and use the curve screen. Start off by dragging the 1.093 point to 2100mhz and then drag the 0.975v point to 1999Mhz and hit apply. You will end up with a stair step looking curve that remains flat from 1.093v to the top.
> 
> Adjust memory slider to taste. I find +400 is reliable on both DX11 and DX12 with my card. Above that point, DX11 is OK but and DX12 doesn't like it so much.
> 
> You can dial in this OC by adjusting either the 1.093 point or the .975 point up or down by 12 to 13 Mhz at a time. The 0.975 point seems to have a big impact on the amount of power the card draws
> 
> I tend to find my card falls over and crashes the graphics driver if it tries to draw more than about 220Watts. I use a program called HWiNFO64 and monitor how much power my card is drawing under load. You can set it up to display extra monitoring info like the number of Watts the card is drawing and even how much the power supply is outputting if you have a smart power supply.
> 
> Best Overclocks i have gotten tend to hover around 180-205w under load in firestrike.


Ah yes, thanks for the overclocking advice. HWiNFO is always open.







Overclocking is just getting more and more convoluted. Remember when we used to be able to overclock the shaders independently? I remember. I wish we just had straightforward, direct access like we used to. I always have my sliders maxed.







In any case, good info.


----------



## gtbtk

Quote:


> Originally Posted by *TheDeadCry*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> OC.N is a technical based discussion forum. I know some of it is boasting about how much you can get in Firestrike but it is also about Technical discussion and problem solving when overclocking computer devices. Many of the people who do come here dont really understand how this stuff works (thats a good thing, it keeps IT salaries high...
> 
> 
> 
> 
> 
> 
> 
> ) so the come here to get their problems solved by the "experts". Problem is, there are no experts when a new architecture is just released to the market and the inventor of the product is not telling you the secrets of how the thing works internally.
> 
> The posts here told the problem solvers among us important things about this particular problem. Like it being common to all brands not isolated to one vendor substituting shoddy parts. That would not have been obvious on a single brand forum where It would have taken much longer to identify and resolve as there is no way to compare notes between different brands of cards as it were.
> 
> Yes in China, the country of counterfeit eggs, counterfeit rice, counterfeit iphones and melamine substitution in milk powder, there was a recall of MSI product locally. I still don't know what the quality problems were that caused them to do that recall from vendors. MSI cards weren't recalled anywhere else that I am aware of.
> 
> BTW, if you want to get higher overclocks and get closer to your power target, move the Voltage, power target and temp sliders to maximum.
> 
> Apply a custom curve or fixed fan speed at say 85%.
> 
> Set the core clock slider to -50 (yes negative) in Afterburner then before you apply that, Ctrl-F and use the curve screen. Start off by dragging the 1.093 point to 2100mhz and then drag the 0.975v point to 1999Mhz and hit apply. You will end up with a stair step looking curve that remains flat from 1.093v to the top.
> 
> Adjust memory slider to taste. I find +400 is reliable on both DX11 and DX12 with my card. Above that point, DX11 is OK but and DX12 doesn't like it so much.
> 
> You can dial in this OC by adjusting either the 1.093 point or the .975 point up or down by 12 to 13 Mhz at a time. The 0.975 point seems to have a big impact on the amount of power the card draws
> 
> I tend to find my card falls over and crashes the graphics driver if it tries to draw more than about 220Watts. I use a program called HWiNFO64 and monitor how much power my card is drawing under load. You can set it up to display extra monitoring info like the number of Watts the card is drawing and even how much the power supply is outputting if you have a smart power supply.
> 
> Best Overclocks i have gotten tend to hover around 180-205w under load in firestrike.
> 
> 
> 
> Ah yes. Thanks for the overclocking advice. HWinfo is always open
> 
> 
> 
> 
> 
> 
> 
> Overclocking is just getting more and more convoluted. Remember when we used to be able to overclock the shaders independently? I remember. I wish we just had straightforward and direct access like we used to. I always have my sliders maxed.
> 
> 
> 
> 
> 
> 
> 
> In any case/good info.
Click to expand...

My first work PC was an 80286 with 512K of RAM and a CGA graphics adapter, hehe.

You couldn't overclock those at all.


----------



## Roland0101

Quote:


> Originally Posted by *gtbtk*
> 
> BTW, if you want to get higher overclocks and get closer to your power target, move the Voltage, power target and temp sliders to maximum.


Is voltage helping you with the memory clocks? Because if I set games or 3DMark to "prefer maximum performance", I don't need to touch the core voltage in Afterburner at all (as it should be; it's core voltage). 8808 MHz effective is stable no matter what (DX11).
Furthermore, I can run a +90 core clock offset (over the STRIX clocks) if I don't touch the voltage; if I do set it to 100% (1.093 V), it crashes the driver.

Quote:


> Adjust memory slider to taste. I find +400 is reliable on both DX11 and DX12 with my card. Above that point, DX11 is OK but and DX12 doesn't like it so much.


This is interesting. I can run Time Spy and Fire Strike at 8808 MHz effective on 368.81, but on 372.90 Time Spy crashes the driver. Fire Strike still runs fine.
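On the "Afterburner or effective?" distinction being discussed: on Pascal GDDR5 cards, Afterburner reports the memory clock at half the effective (quad-pumped) data rate, so a slider offset counts double toward the effective figure. A quick sketch, assuming stock GTX 1070 numbers:

```python
# Convert an Afterburner memory offset to an effective GDDR5 data rate.
# AB_STOCK is the stock memory clock as Afterburner displays it on a
# GTX 1070 (an assumption here); effective rate is twice that figure.
AB_STOCK = 4004  # MHz as shown in Afterburner

def effective_mhz(offset):
    """Effective GDDR5 data rate for a given Afterburner slider offset."""
    return 2 * (AB_STOCK + offset)

# A +400 offset lands at the 8808 MHz effective figure quoted above.
```

So "+400 in Afterburner" and "8808 MHz effective" describe the same setting.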


----------



## Hunched

Quote:


> Originally Posted by *Roland0101*
> 
> Furthermore, I can run +90 offset core clock (over the STRIX clocks) if I don't touche the voltage, if I do set it to 100% (1.093v), it crashes the driver.


At least in my experience with single 8-pin cards, increasing the voltage doesn't help, because you're going to be smashing into the TDP limit of the card even more if you do.
With my MSI 8+6 I never get close to TDP throttling; it's only VRel limited, which is what you want.
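The "voltage makes it worse" effect follows from first-order power scaling: dynamic power goes roughly as V² × f, so a voltage bump eats power budget quadratically before the clock moves at all. Illustrative numbers only, not measured card behavior:

```python
# Rough dynamic-power scaling: P ~ C * V^2 * f.
# Relative to a 1.043 V / 2000 MHz baseline, raising voltage to the
# 1.093 V cap at the same clock costs about 10% more power on its own,
# which is why voltage-limited cards slam into the TDP cap sooner.
def rel_power(v, f_mhz, v0=1.043, f0=2000):
    """Power relative to the (v0, f0) baseline."""
    return (v / v0) ** 2 * (f_mhz / f0)

voltage_bump = rel_power(1.093, 2000)  # ~1.10, i.e. ~10% more power
```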


----------



## Roland0101

Quote:


> Originally Posted by *Hunched*
> 
> At least my experience with single 8-pin cards, is increasing the voltage doesn't help because you're going to be smashing into the TDP limit of the card even more if you do so.
> With my MSI 8+6 I never get close to TDP throttling, it's only VRel limited which is what you want.


Yes, that is exactly the reason: the card is capped by the power limit and the driver doesn't seem to like it.
If I don't increase the voltage, the card is capped at the reliability voltage and runs fine.


----------



## Roland0101

Quote:


> Originally Posted by *gtbtk*
> 
> My first work PC was a 80286 with 512K of Ram and CGA graphics adapter. hehe.
> 
> You couldn't overclock those at all.


Which one, the version with 10 MHz or the speedy one with 16 MHz? (Maybe even a 25 MHz racer.)









My first was a Commodore 64. OK technically that wasn't a PC, but it was a computer.


----------



## gtbtk

Quote:


> Originally Posted by *Roland0101*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> BTW, if you want to get higher overclocks and get closer to your power target, move the Voltage, power target and temp sliders to maximum.
> 
> 
> 
> Is Voltage helping you regarding the memory clocks? Because if I set the games or 3Dmark to "prefer Maximum performance" I don't need to touch the core voltage in Afterburner at all. (as it should be, it's core voltage.) 8808Mhz effective is stable no matter what. (DX11)
> Furthermore, I can run +90 offset core clock (over the STRIX clocks) if I don't touche the voltage, if I do set it to 100% (1.093v), it crashes the driver.
> 
> Quote:
> 
> 
> 
> Adjust memory slider to taste. I find +400 is reliable on both DX11 and DX12 with my card. Above that point, DX11 is OK but and DX12 doesn't like it so much.
> 
> Click to expand...
> 
> This is interesting. I can run Time Spy and Fire strike at 8808Mhz effective on 368.81, but on 372.90 Time Spy crashes the driver. Fire strike still runs fine.
Click to expand...

I am still experimenting.

I have found I can push the core clock slider to higher levels with the core voltage percentage at 0, but I don't get maximum performance. You need to set the voltage before you increase the core frequency slider, and then not be so greedy with the core clock slider value.

One thing I have noticed is that if the GPU tries to pull more than about 210-220 W as measured with HWiNFO64, it crashes the graphics driver. That is my main limitation at the moment, not artifacts, and I'm still scratching my head to work out how to better manage it. The card is advertised to draw 150 W at stock but is physically wired for 300 W. I'm running an HX850i PSU in single-rail mode, so the PSU is not limiting anything; the whole PC only uses about 350 W under load, so there is loads of headroom.

Any suggestions for tweaking BIOS voltage settings to give me more power headroom, or for software to monitor PCIe bus power loads, are welcome.

Another observation I have made with my Gaming X is that the card's idle voltage is determined by where on the curve the stock default clock value sits.

In my case, the stock clock of my Gaming X is 1582 MHz. If I increase the core clock slider, the voltage point that matches 1582 MHz gets lower the more the slider is increased, until the active point drops off the left side of the page, making the artifact problem more likely. That is something to keep in mind with Micron memory overclocks and checkerboard artifacts. It is better to overclock the core with the curve rather than the slider, lifting values above the stock clock point and leaving the bottom 3 or 4 points at stock.
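The "advertised 150 W but wired for 300 W" point is just the sum of the PCIe spec limits for the connectors on the card: the slot plus an 8-pin and a 6-pin. A back-of-envelope check (these are spec limits, not measured draw):

```python
# PCIe power delivery limits per spec: 75 W from the slot,
# 75 W from a 6-pin connector, 150 W from an 8-pin connector.
# An 8+6 pin card like the Gaming X is therefore wired for 300 W
# even though the stock power target is far lower.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

wired_for = SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300 W total
headroom_over_stock = wired_for - 150         # vs the advertised 150 W
```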


----------



## TheDeadCry

Quote:


> Originally Posted by *Roland0101*
> 
> Is Voltage helping you regarding the memory clocks? Because if I set the games or 3Dmark to "prefer Maximum performance" I don't need to touch the core voltage in Afterburner at all. (as it should be, it's core voltage.) 8808Mhz effective is stable no matter what. (DX11)
> Furthermore, I can run +90 offset core clock (over the STRIX clocks) if I don't touche the voltage, if I do set it to 100% (1.093v), it crashes the driver.
> This is interesting. I can run Time Spy and Fire strike at 8808Mhz effective on 368.81, but on 372.90 Time Spy crashes the driver. Fire strike still runs fine.


I've noticed this dynamic with CPU and RAM overclocking as well. People more knowledgeable than me could probably explain it better, but I think it's partially due to things like auto rules in the BIOS of whatever you're tuning, be it graphics card or motherboard. For example, sometimes when I apply an overclock to the CPU, the BIOS alters settings in such a way as to make it unstable; when I reset the CMOS and apply the same OC, it's rock steady.

There are so many properties I have yet to fully understand, such as electron tunneling within the chip, so I'm not surprised that increasing the voltage may actually destabilize the machine. How thermal and voltage thresholds affect the silicon at a low level is very interesting; again, someone more knowledgeable than me could explain this. I think we overclockers are starting to get a feel for these chips' dynamics, at least at a base level. There's still the silicon lottery itself, of course, but it seems we can still apply similar rules. It's all about voltage regulation, and obviously that's up in the air right now with all this Micron controversy.

Every time you change the clock or the memory, a lot of other stuff gets altered automatically that isn't shown, like shader clocks. This is pretty obvious stuff, I know, but it's worth bringing up for people who find their once-stable clocks cannot be re-stabilized after fiddling around and going back. This is the most tangential, digressive post I've ever typed up, I think. I'm tired, and I just drank my first energy drink in months, so my tired, college-siphoned brain is not the most focused. Anyway, I love discussing this stuff; this is what overclocking is all about.







I'm loving the discussions on here.


----------



## BulletSponge

Quote:


> Originally Posted by *Roland0101*
> 
> Which one, the version with 10Mhz or the speedy one with 16Mhz. (Maybe even a 25Mhz racer)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My first was a Commodore 64. OK technically that wasn't a PC, but it was a computer.


Aww, my memories of Choplifter on a C64 are priceless.


----------



## TheDeadCry

Quote:


> Originally Posted by *gtbtk*
> 
> I am still experimenting.
> 
> I have found I can increase core clock slider to higher levels on the slider with core voltage percent at 0 but i don't get maximum performance. You need to set the voltage before you increase the core frequency slider then not be so greedy with the core clock slider value.
> 
> one thing I have noticed is that if the GPU tries to pull more than about 210 - 220W as measured with HWiNFO64, it crashes the Graphics driver. That is my main limitation at the moment, not artifacts. Still scratching my head to work out how to better manage that. The card is advertised to draw 150W at stock but it is physically wired for 300W. I'm running an HX 850i PSU in single rail mode so the PSU is not limiting anything. The whole PC is only using about 350W under load so there is loads of headroom.
> 
> Any suggestions for tweaking bios voltage settings to give me more power headroom or software to monitor PCIe bus power loads are welcome
> 
> Another observation that I have made with my Gaming X is that the card idle voltage is determined by where on the curve the stock default clock value sits.
> 
> In my case, the stock clock of my gaming X is 1582mhz. If I increase the Core clock slider, the voltage point that matches 1582mhz gets lower the more the slider is increased until the active drops off the left side of the page making the artifact problem more likely. That is something to keep in mind with Micron memory overclocks and checkerboard artifacts. It is better to overclock the core with the curve rather than the slider, lifting values higher than the stock clock point and leaving the bottom 3 or 4 points at stock.


It's a very curious thing. I also have the Gaming X. While the slider can reach 126%, I notice the card gets nowhere even close. I highly doubt it's power related if your card is like mine, which uses maybe 80-90% of the maximum power draw. I think it comes back to voltage regulation, which makes sense: the voltage is either not sufficient or, more likely, not being delivered correctly. While the card can clearly handle the additional power draw, the voltage regulation, it seems, cannot cope. I believe the card is "starved" for power only because it isn't being delivered properly. I'd very much like to see what comes of this new BIOS. I use a 1000 W Cooler Master Gold power supply, which has headroom on top of headroom, so I very much doubt it has to do with the power supply; mine has top-notch delivery and I face similar issues.


----------



## TheDeadCry

Quote:


> Originally Posted by *gtbtk*
> 
> My first work PC was a 80286 with 512K of Ram and CGA graphics adapter. hehe.
> 
> You couldn't overclock those at all.


I don't go quite so far back, lol. In particular, I think of my old 9600GT.







I'm still in my mid-twenties. I was fortunate to actually work directly with Nvidia on a sort of PC package, with an Nvidia-branded case, graphics card, and whatever else. I don't know if they actually decided to sell it, though. I was offered the chance to build the computer under the direction of a relative of my step-uncle who worked for Nvidia. From what I understand, this was done to see how easy the instruction manual for the Nvidia kit was for novices to follow (it was my first build at the time; I believe I was around 14). Anyway, in exchange I got to keep the computer, and that's actually how I got so into computers. Just a neat little story I thought I'd share.


----------



## TheDeadCry

Quote:


> Originally Posted by *TheDeadCry*
> 
> I don't go quite so far back, lol. In particular I think about my old 9600GT.
> 
> 
> 
> 
> 
> 
> 
> I'm still in my mid-twenties. I was fortunate to actually be able to work directly with Nvidia on this sort of PC package, with a Nvida branded case and GFX card and whatever else. I don't know if they actually decided to sell it or not, though. I was offered to build the computer with the direction of a relative of my step-uncle who worked for Nvidia. From what I'm to understand, this was done to see how well the instruction manual for the Nvidia kit was to understand for Novices (At the time being my first build) I believe I was around 14 maybe. Anyways, in exchange I got to keep the computer, and that's actually how I got so into computer's. Just a neat little story I thought i'd share.


I think I found it. This one has a 9800GT, though. I feel special now, lmao. http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=5574380


----------



## gtbtk

Quote:


> Originally Posted by *Roland0101*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> My first work PC was a 80286 with 512K of Ram and CGA graphics adapter. hehe.
> 
> You couldn't overclock those at all.
> 
> 
> 
> Which one, the version with 10Mhz or the speedy one with 16Mhz. (Maybe even a 25Mhz racer)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My first was a Commodore 64. OK technically that wasn't a PC, but it was a computer.
Click to expand...

It was a hotrod.

An 8 MHz Olivetti M28. I had the luxury of dual 5.25" 1.2 MB floppies and a 10 MB hard drive.


----------



## gtbtk

Quote:


> Originally Posted by *TheDeadCry*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I am still experimenting.
> 
> I have found I can increase core clock slider to higher levels on the slider with core voltage percent at 0 but i don't get maximum performance. You need to set the voltage before you increase the core frequency slider then not be so greedy with the core clock slider value.
> 
> one thing I have noticed is that if the GPU tries to pull more than about 210 - 220W as measured with HWiNFO64, it crashes the Graphics driver. That is my main limitation at the moment, not artifacts. Still scratching my head to work out how to better manage that. The card is advertised to draw 150W at stock but it is physically wired for 300W. I'm running an HX 850i PSU in single rail mode so the PSU is not limiting anything. The whole PC is only using about 350W under load so there is loads of headroom.
> 
> Any suggestions for tweaking bios voltage settings to give me more power headroom or software to monitor PCIe bus power loads are welcome
> 
> Another observation that I have made with my Gaming X is that the card idle voltage is determined by where on the curve the stock default clock value sits.
> 
> In my case, the stock clock of my gaming X is 1582mhz. If I increase the Core clock slider, the voltage point that matches 1582mhz gets lower the more the slider is increased until the active drops off the left side of the page making the artifact problem more likely. That is something to keep in mind with Micron memory overclocks and checkerboard artifacts. It is better to overclock the core with the curve rather than the slider, lifting values higher than the stock clock point and leaving the bottom 3 or 4 points at stock.
> 
> 
> 
> It's a very curious thing. I also have the gaming x. While the slider can reach 126% I notice the card gets nowhere even close. I highly doubt its power related, if your card is like mine, which uses maybe 80-90% of the maximum power draw. I think it comes back to voltage regulation, which makes sense. The voltage is either not sufficient, or more likely, not being delivered correctly. Because while the card can clearly handle this additional power draw - the voltage regulation it seems cannot cope. I believe the card is "starved" for power, only because it isn't being delivered properly. I'd very much like to see what comes of this new bios thingamajig. I use 1000w Cooler Master Gold power supply....which has headroom on top of headroom. I very much doubt it has to do with the power supply, since mine has top notch delivery and I face similar issues.
Click to expand...

I have scratched my head over the power target and what the card is actually doing with power as well.

The EVGA SC and FTW BIOSes that I flashed to my card both report pulling higher power levels, reaching above 100% under load. The voltage and clock also jumped around a lot more than they do with the MSI BIOS.


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> I don't think so; I strongly believe that 8 GHz is not the lowest common denominator for Micron GDDR5 ICs. Some run stable at 8400 MHz+ and some run stable at 7600 MHz without funny power tricks like voltage-locking or "prefer maximum performance". I might just have an IC there that is not 100% stable at 8000 MHz, which is causing the issue. These Micron GDDR5 chips should have been run at 7000-7500 MHz to make sure that all samples are 100% stable; they are just not cutting it at their maximum rated data rate.
> 
> Like discussed before, RMA is a huge pain in the butt. I would not go through the hassle of an RMA without at least making sure I have a very high probability of getting a quality product in return, and as you know, that is not the case now. If I do RMA and get a replacement from current stock, I might get lucky and get a sample that is stable up to, say, 8200 MHz... that is not enough headroom to make sure the card will stay stable for its lifetime, across multiple driver updates and future, more stressful games, so an RMA at this point is not useful.
> 
> I will RMA or return it in the end, but I was waiting for an update from NVIDIA to test on the sample I have now before deciding what to replace it with. If the new BIOS gets me to 8400-8600 MHz stable without doing anything funny, then I will know there is hope and that this is not a completely lost cause. On the other hand, if the BIOS does not improve the headroom enough, then I will know Micron GDDR5 is not suitable to run at 8 GHz to begin with, and I will get a Founders Edition with Samsung or just get my money back and wait for VEGA with HBM2.
> 
> That is my plan; it involves a lot less hassle and would produce much better results than just blindly RMAing now without knowing what I am getting myself into.


I highly recommend an RMA; try to get a Samsung chip without any hassle. I don't think a single vBIOS can solve a hardware limitation issue.


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> i highly recommend RMA & try to get samsung chip without any hassle. i don't think a single vbios can solve the h/w limitation issue.


Let's not guess too much. The vBIOS update should be coming out shortly and I will put it to the test. If what we suspect is true, that this is a hardware limitation issue (which I also believe to be the case), then the Founders Edition with Samsung memory will still be there. I just do not want to downgrade to a blower-style cooler before making absolutely sure that Micron GDDR5 is a lost cause...


----------



## TheDeadCry

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roland0101*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> My first work PC was a 80286 with 512K of Ram and CGA graphics adapter. hehe.
> 
> You couldn't overclock those at all.
> 
> 
> 
> Which one, the version with 10Mhz or the speedy one with 16Mhz. (Maybe even a 25Mhz racer)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My first was a Commodore 64. OK technically that wasn't a PC, but it was a computer.
> 
> Click to expand...
> 
> It was a hotrod.
> 
> An 8Mhz Olivetti M28. I had Luxury of dual 5 1/4" 1.2mb floppies and a 10mb hard drive
Click to expand...

I'm experienced with floppies, though.


----------



## 9colai

I just ran a test with my MSI 1070 Gaming X card with Samsung memory.

I ran Time Spy with +0 MHz and +510 MHz RAM offsets respectively, and I got the following results:

GPU score: 6403 @ 2003.4 MHz (read by GPU-Z)

GPU score: 6653 @ 2256.8 MHz (read by GPU-Z)

That's a 3.9% performance increase for a roughly 1000 MHz effective RAM increase, which would yield a gain of approximately 2 more FPS at 60 Hz gaming.

I'm just curious... why is this such a big deal? Is it worth spending time on RMAs and so forth for a potential 2 FPS gain? It's still an awesome GPU!
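The arithmetic in the post checks out; a quick sketch of the same numbers:

```python
# Reproduce the scaling math above: Time Spy GPU score at stock memory
# vs a +510 MHz offset, and what that gain means at 60 FPS.
base_score, oc_score = 6403, 6653

gain = oc_score / base_score - 1   # fractional speedup from the memory OC
fps_at_60 = 60 * gain              # extra frames per second at 60 FPS
# gain is about 0.039 (3.9%), i.e. roughly 2.3 extra FPS.
```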


----------



## reflex75

Quote:


> Originally Posted by *9colai*
> 
> I just made a test with my MSI 1070 gaming X card with Samsung memory:
> 
> I ran time spy with + 0 MHz RAM and + 510 MHz RAM respectively and i got the following results:
> 
> GPU score: 6403 @ 2003,4 MHz (read by GPU-z)
> 
> GPU score: 6653 @ 2256,8 MHz (read by GPU-z)
> 
> Thats a 3,9 % performance increase for 1000 MHz RAM Increase. That would yield a gain of approximately 2 more FPS at 60 Hz gaming.
> 
> I'm just curious... Why is this such a big deal? Is it worth spending time on RMA's and so forth for a 2 potential FPS gain? It's still an awesome GPU!


Your test is right, but your conclusion is wrong.
Every game behaves differently, and some games scale better with increased memory bandwidth.
Moreover, it also depends on the settings you choose for each game.


----------



## 9colai

Quote:


> Originally Posted by *reflex75*
> 
> Your test is right, but your conclusion is wrong.
> Every game behaves differently, and some games sclaes better with memory bandwidth increase.
> Moreover, it depends also on the settings you choose for each game.


Okay, could you mention a VRAM-sensitive game for me? I would like to test it just out of curiosity.


----------



## SpirosKGR

What do you think about KFA2? Is it a good company?
I am interested in the KFA² GeForce GTX 1070 EX OC (Samsung or Micron memory?)

thanks in advance


----------



## khanmein

Quote:


> Originally Posted by *SpirosKGR*
> 
> What do you think about KFA2, its good company?
> I am interested about KFA² GeForce GTX 1070 EX OC ( Samsung / micron memory ? )
> 
> thanks in advance


Most likely all brands now come with the Micron chip; if you're lucky enough to get older stock, that's another story. Apparently, more than 90% of reviewers had samples with Samsung chips.

The Leadtek WinFast Hurricane GTX 1070 uses Micron.

Videocardz reviewed a Colorful iGame that came with a Samsung chip.


----------



## Roland0101

Quote:


> Originally Posted by *BulletSponge*
> 
> Aww, my memories of Choplifter on a C64 are priceless.
















And now we play ultra-realistic games with HairWorks or PureHair...


----------



## Star Forge

Quote:


> Originally Posted by *khanmein*
> 
> i highly recommend RMA & try to get samsung chip without any hassle. i don't think a single vbios can solve the h/w limitation issue.


The thing is, not all Samsung PCBs overclock well. I swear the Samsung board I have can't overclock for crap. The issue is mostly due to failing voltage states while sustaining overclocked power more than anything. I am considering a retest myself shortly, but out of the box, the Samsung board I got from an RMA isn't beating the Micron board.


----------



## RyanRazer

Hey, new here, just decided to share my experience.
_Up to now I've owned AMD GPUs, just as a side note







_

So, I decided to ditch my Sapphire R9 290 (great card, btw) and went for a GTX 1070, getting myself a *Gigabyte G1 Gaming*. Great GPU; I had a blast playing games at 1440p and 144 Hz... it really is something. But as I read the net and forums and compared people's reports to mine, I found I was left behind a bit.

https://s25.postimg.org/5oyxx28bz/g1_gaming_diag.png


First of all, as you can see in this picture (I hope you can see it), my temps were high and the card was relatively loud. I was kinda used to it since I had the 290.







That thing was hot and loud, I tell you... so it didn't seem like a big deal until I started comparing. Temps under load were around 73-77°C (163-171°F), while I saw people usually hover under 70°C. Plus, my fan was spinning at around 50%. But percentage alone doesn't tell you much, since different GPUs have different max RPM; those on the G1 max out at 4000+. In the screenshot you can see them running at 72%, translating to 3003 RPM. This is crazy...
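The %-to-RPM point is worth making concrete: fan percentages aren't comparable across cards, only RPM is. A small sketch, where the G1's max RPM is an assumption chosen to match the ~3003 RPM reading:

```python
# Convert a fan duty-cycle percentage to RPM. MAX_RPM is an assumed
# figure matching the screenshot (72% ~ 3003 RPM on the G1's
# 4000+ RPM fans); a card with slower-spinning fans would be much
# quieter at the same percentage.
MAX_RPM = 4170

def pct_to_rpm(pct):
    """Approximate fan RPM for a given duty-cycle percentage."""
    return round(MAX_RPM * pct / 100)
```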

Overclocking:
That thing would run at 2011-2025MhZ, which is kinda OK. Certainly not the best but not the worst either. But i saw this discussion about micron and samsung memory chips and how micron are not doing very well, i figured G1 had the micron one, as my max mem. overclock was about 185mhz... anything close to 200 would be unstable and crash. I was kinda surprised as i saw on net before that new gddr5 can hit 9ghz mark on memory easily...

Anyways, I sold the G1 and ordered the AMP! Extreme, so I hope for the best: a cooler-running, quieter card with a bit better overclocking. Will report back with a comparison between the cards.

Sry for lengthy post


----------



## MyNewRig

Quote:


> Originally Posted by *Star Forge*
> 
> The thing is not all Samsung PCB's can overclock well. I swear the one I have that is a Samsung board can't overclock for crap. The issue is mostly due to failing voltage states sustaining overclocked power more than anything. I am considering a retest myself shortly but out of the box, the Samsung board I got from an RMA isn't beating out the Micron board.


Of course not ALL of them overclock well, but most do, and whatever OC level they reach, they are (1) consistent at that level, (2) stable, and (3) don't lock up and crash-restart the system.

When Samsung memory reaches its max OC level, anything above that just gets you a soft crash of "Display driver has stopped responding and has recovered."

Micron, on the other hand, is generally not stable and not consistent, at least on the current BIOS. Sometimes a memory offset is stable in a certain application; other times the same application at the same offset crashes or artifacts. It is really sensitive, and when it crashes it is an ugly hard crash with lock-ups, BSODs and restarts...

OC potential alone does not explain the whole problem; the issue goes beyond overclocking. This has been explained countless times by many people, and I am surprised it is so hard for people to get.

EDIT: Micron GDDR5 in general degrades the overall quality and experience of the product. What is surprising is that the $200 RX 480 is using Samsung chips while the $420 card ($640 in my case due to EU VAT) is using the inferior Micron chips. It just blows my mind when I think about it, regardless of what the BIOS fix will bring!


----------



## Star Forge

Quote:


> Originally Posted by *MyNewRig*
> 
> of course not ALL are overclocking well, but most do, and whatever OC level they reach they are 1) consistent at that level and 2) stable and 3) don't lock-up and crash restart the system.
> 
> When Samsung memory reaches its max OC level, anything above that you just get a soft crash of "Display Driver has stopped responding and has recovered."
> 
> Micron on the other hand is generally not stable and not consistent, at least on the current BIOS, sometimes a memory offset is stable in a certain application, some other times the same application and the same offset crashes or causes artifacts, it is really sensitive, and when it crashes it is an ugly hard crash with lock-ups, BSOD and restart ...
> 
> OC potential alone does not explain the whole problem, the issue goes beyond being an OC-only problem, this has been explained a countless times by many and i am surprised why is it so hard for people to get that?


On the core side, both of my units are 100% stable at their factory EVGA specs. If I don't touch anything, neither the Micron nor the Samsung unit fails at all. So it seems that if you run everything bone stock, neither VRAM choice affects the performance of the card; if it does, you got a dud and an obvious RMA. The thing is, everyone here seems unhappy with Micron because they report better OC stability and clocks on Samsung, and they push people towards Samsung where possible. My point is that Samsung isn't 100% bulletproof either, and it all comes back to the good old silicon lottery.

My argument is that the type of VRAM is not the biggest issue; the BIOS written to handle the voltage states is. The BIOS doesn't seem to be properly implemented for overclocking Micron VRAM, hence why the nVidia post saying they are going to hotfix it is a good thing. Micron VRAM feels more sensitive than Samsung in certain applications when running beyond its initial specification, and I hope the revised BIOS will give it the juice it needs to maintain higher speeds.

As much as I want to keep my Samsung unit, the core on this thing fails to pass Heaven 4.0 at 2100 MHz, while the Micron one can do so easily.


----------



## Roland0101

Quote:


> Originally Posted by *gtbtk*
> 
> I am still experimenting.
> 
> one thing I have noticed is that if the GPU tries to pull more than about 210 - 220W as measured with HWiNFO64, it crashes the Graphics driver. That is my main limitation at the moment, not artifacts. Still scratching my head to work out how to better manage that. The card is advertised to draw 150W at stock but it is physically wired for 300W. I'm running an HX 850i PSU in single rail mode so the PSU is not limiting anything. The whole PC is only using about 350W under load so there is loads of headroom.
> 
> Any suggestions for tweaking bios voltage settings to give me more power headroom or software to monitor PCIe bus power loads are welcome


You use a different BIOS, right? Because your card's standard power limit is 230W. Furthermore, I doubt that your card, with its 1x6-pin and 1x8-pin power connectors, draws much from the PCIe slot, and definitely not its max.
The only way to measure that would be an oscilloscope.

As for the BIOS voltage, I don't know if there are OC BIOS hacks available for the Gaming X. You can tweak your BIOS on your own, there are guides out there, but I don't know if anyone has done this successfully for Pascal yet. (Still new cards.)

I also think this might be a driver problem. WDDM 2.1 is still brand new, and it will take time for Nvidia and MS to optimize it the way WDDM 2.0 was. You might test whether you get better results with 368.81.


----------



## MyNewRig

Quote:


> Originally Posted by *Star Forge*
> 
> In the core, both of my units are 100% stable at their factory EVGA specs. If I don't touch anything, both the Micron and the Samsung units don't fail at all. Therefore, it seems if you run everything bone stock, neither VRAM choices affect the performance of the card. If they do, they you got a dud and obvious an RMA. The thing is, it seems everyone here is not happy with Micron because they all say they got better OC stability and clocks on Samsung and pushing people towards Samsung if necessary. My point is that Samsung isn't 100% bulletproof either and that it all comes back to good old silicon lottery.
> 
> My argument is that the type of VRAM is not the biggest issue, but the BIOS written to deal with voltage states are. It seems the BIOS written for Micron isn't 100% properly implemented for Micron VRAM for sufficient overclocking, hence why I when I saw the nVidia post that they are going to hotfix it, that is going to be a good thing as I feel like Micron VRAM is more sensitive than the Samsung VRAM on certain applications when running beyond their initial specifications, but I hope the revised BIOS will give them the juice they need to maintain higher speeds.
> 
> As much as I want to keep my Samsung unit, the core on this thing fails to even successfully pass Heaven 4.0 at 2100 MHz, while the Micron one can easily so.


That is not entirely correct based on the many data points we have seen reported; customer experience is the final answer for determining whether a product is problematic. During the first two months after release, when all GTX 1070s were using Samsung, you don't see a single report or complaint about memory, artifacting, quality or stability issues. After the switch to Micron, you can't find a single computer forum without at least one such complaint. That must mean something.

It could very well be an incomplete BIOS implementation for the Micron memory controller that is responsible for most issues, but OC aside, from my experience with the two Micron samples I tested, they feel laggier at 1440p and 4K than their Samsung counterparts, even at stock settings. I did not measure frame time latency, but there seems to be something off about the latency or timings of these ICs. Also, with my current Micron sample I was getting occasional artifacts in RotTR with driver 372.70; after updating to 372.90 the artifacts disappeared at stock settings.

In the month or so I spent with the two Samsung samples, I never got any kind of issue whatsoever, either at stock settings or when OCed to +600 or more. That also must mean something.

Micron GDDR5 running at 8 Gb/s is not outright defective, but it is sensitive and unstable, which lowers the perceived quality and reliability of the product; maybe that is the best way to explain the problem universally. And I keep wondering why they made the switch. NVIDIA has not given us an answer yet, and it looks like they don't want to give that information out no matter what.


----------



## Roland0101

Quote:


> Originally Posted by *TheDeadCry*
> 
> I've noticed this dynamic with CPU and RAM overclocking, as well. People more knowledgeable than me could probably explain it better. I think it's partially due to things like auto rules in the bios' of whatever - be it graphics card, or motherboard. For example, sometimes when I apply an overclock to the CPU the bios alters settings in such a way as to make it unstable. When I reset the CMOS I do the same OC and it's rock steady. Again, a little bit of a digression. There are so many properties I am yet to fully understand, such as electron tunneling and such within the chip. I'm not surprised that increasing the voltage may actually destabilize the machine. The role of thermal and voltage thresholds and how they effect the silicon on a low level, is very interesting. Again, someone more knowledgeable than me could explain this. I think we overclockers are starting to get a feel for these chips' dynamics though - at least on a base level. Of course, we still have the silicon lottery itself, but it seems we can still apply similar rules. It's all about that voltage regulation, and obviously that's up in the air right now with all this micron controversy. Every time you change the clock or the memory, or anything there's a lot of other stuff that's automatically getting altered that isn't shown - like shader clocks. This is pretty obvious stuff, I know. It's worth bringing up though, for people who find their once stable clocks cannot be re-stabilized once fiddling around with the clocks before going back. This is the most tangential, digressive post I've ever typed up I think. I'm tired, and I drank my first energy drink as well as just the caffeine itself in months. My tired, college-siphoned brain is not the most focused. Anyways, I love discussing this stuff - this is what overclocking is all about.
> 
> 
> 
> 
> 
> 
> 
> I'm loving the discussions on here.


It's of course true that many, many things influence OC potential, but in this case it shouldn't make a difference. A stock Pascal card is always capped by reliability voltage; my card can do +90 core clock on standard voltage. If I raise the voltage, the card becomes capped by the power limit, and that should be it, just another cap reason, but instead it crashes the driver.

Edit: And you are right, a good discussion is always fun.


----------



## TheDeadCry

Quote:


> Originally Posted by *Roland0101*
> 
> You use a different BIOS, right? Because your card's standard power limit is 230W. Furthermore, I doubt that your card, with its 1x6-pin and 1x8-pin power connectors, draws much from the PCIe slot, and definitely not its max.
> The only way to measure that would be an oscilloscope.
> 
> As for the BIOS voltage, I don't know if there are OC BIOS hacks available for the Gaming X. You can tweak your BIOS on your own, there are guides out there, but I don't know if anyone has done this successfully for Pascal yet. (Still new cards.)
> 
> I also think this might be a driver problem. WDDM 2.1 is still brand new, and it will take time for Nvidia and MS to optimize it the way WDDM 2.0 was. You might test whether you get better results with 368.81.


As of the last time I checked, we still can't modify Pascal BIOSes. As you mentioned, there are several factors, software and hardware, that have yet to fully mature within the community's cards. Whatever the case may be, for me at least this card is the only one I have come across that doesn't get anywhere near even the standard power limit. This is good, obviously; I LOVE this card's power efficiency compared to my old 780. However, there is definitely an incongruity between power and voltage. I've heard of people reaching the power limit, but some of us don't come close even to the standard, unaltered limit. Traditionally I would take low power consumption as the sign of a great bin. If that is the case, and our cards being limited by voltage comes down to regulation, we could see a dramatic difference with a BIOS editor or an upcoming BIOS from board partners.


----------



## Roland0101

Quote:


> Originally Posted by *RyanRazer*
> 
> Anyways, I sold the G1 and ordered the AMP! Extreme, so I hope for the best: a cooler-running, quieter card with a bit better overclocking. Will report back with a comparison between the cards.
> 
> Sry for lengthy post


Seeing those temps on a G1 Gaming, which has a pretty good cooling solution, I would suggest you take a look at your system (case) cooling.


----------



## TheDeadCry

Quote:


> Originally Posted by *Roland0101*
> 
> It's of course true that there are many, many things that influence OC potential, but in this case it shouldn't make a difference. A stock pascal card is always capped by reliability voltage, now my card can do +90 core clock on standard voltage. If I raise the voltage, the card becomes capped by power limit, and that should be it, just another Cap reason, but instead it crashes the driver.
> 
> Edit: And you are right, a good discussion is always fun.


I don't see the power cap I'm accustomed to seeing with previous Nvidia generations, which I find curious. I expect to see the same power limit cap you described when I increase the voltage, but I don't. I don't know what kind of wizardry is going on behind the scenes, but I'm interested in investigating further, because either the card is using more power than reported, or it's most probably voltage related. I'm biding my time for now. My card is reliable, just underutilized.


----------



## RyanRazer

Quote:


> Originally Posted by *Roland0101*
> 
> Seeing your temps at a Gaming G1, who has a pretty good cooling solution, I would suggest that you take a look at you system (case) cooling.


I have an open case, so airflow is 100%.


----------



## Roland0101

Quote:


> Originally Posted by *TheDeadCry*
> 
> I don't see the power cap I'm accustomed to seeing with previous nvidia generations. I find this curious. I expect to see the same power limit cap as you have described, with an increase in voltage - but I don't. I don't know what kind of wizardy is going on behind the scenes, but I'm interested in investigating further. Because either the card is using more power than reported, or is most probably related to voltage. I'm biding my time for now. My card is reliable, just ill utilized.


Well it will depend on your specific card, your settings (not just OC software, also NVCP, VBios and Bios) and what game or benchmark you run.


----------



## Roland0101

Quote:


> Originally Posted by *RyanRazer*
> 
> I have an open case. so air flow is 100%


An open case doesn't mean you have good airflow; sometimes it even hinders it.

It could be that your G1 had a cooling problem, but a normally working sample cools almost as well as my STRIX does.
http://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/39678-roundup-5x-geforce-gtx-1070-mit-custom-design-im-test.html?start=17

(Sorry for the German link, but the graphs are self-explanatory.)

Edit: How many case fans do you have?


----------



## TheDeadCry

Quote:


> Originally Posted by *Roland0101*
> 
> Well it will depend on your specific card, your settings (not just OC software, also NVCP, VBios and Bios) and what game or benchmark you run.


For sure. You can rest assured that I'll keep you posted.

Damn, I remember my previous build (3570K, GTX 780) could peak at around 500W when overclocked (burn-in). Now, with EVERYTHING connected to my Kill-A-Watt (monitors, router and all else), it draws less than 350W at max. I may differ from a lot of people here in that what really gets me excited is power efficiency; I get excited about seeing how low I can go. I'm currently idling at ~150W. It's incredible to me that all of this draws the power of only two or three 60W incandescent light bulbs. All of it.


----------



## SpirosKGR

MSI Gaming X or EVGA SC? Thoughts on the difference between these two? Which card will cause me fewer problems (coil whine, fans, latency issues)? I know the EVGA comes with Micron now, but I hope NVIDIA's new BIOS fixes that.

What's your opinion?

Thanks again guys


----------



## RyanRazer

Zero. One, actually, but I've disabled it. I don't go by decibels... all the cards were roughly the same ~39 dB in the reviews I looked at. Those figures don't vary much on paper, while in real life the difference is big. The decibel is a weird scale for measuring "loudness": it's a logarithmic measure of sound pressure, not a linear unit. Going from 100 to 101 dB adds far more sound pressure than going from 20 to 21 dB... it does not scale linearly...
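If anyone wants the actual math behind that, here's a quick sketch using the standard acoustics formulas: sound-pressure ratio is 10^(ΔdB/20), and the usual rule of thumb is that +10 dB sounds roughly twice as loud.

```python
# Why small dB differences are deceiving: the scale is logarithmic.

def pressure_ratio(delta_db: float) -> float:
    """Sound-pressure ratio corresponding to a level difference in dB."""
    return 10 ** (delta_db / 20)

def loudness_ratio(delta_db: float) -> float:
    """Rule-of-thumb perceived loudness: roughly 2x per +10 dB."""
    return 2 ** (delta_db / 10)

# The 1 dB gap between, say, a 43 dBA card and a 42 dBA card:
print(pressure_ratio(1))    # ~1.12x the sound pressure
print(loudness_ratio(10))   # 2.0: a +10 dB jump sounds about twice as loud
```

It also shows why two cards one apart on a review chart can still sound clearly different: the chart compresses a huge range into a few numbers.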

As for the fans: I have none except the GPU and CPU fans, and there is nothing stopping the air from flowing. The side of the case is wide open, so the air has a free path. When I get the new GPU I'll do a smoke test to see what the deal is, but I doubt the air is trapped.


----------



## RyanRazer

Quote:


> Originally Posted by *Roland0101*
> 
> An open case doesn't mean that you have a good airflow, sometimes it even hinders the airflow.
> 
> It can be that your G1 had a cooling problem, but this card normally working has an almost as good cooling as my STRIX has.
> http://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/39678-roundup-5x-geforce-gtx-1070-mit-custom-design-im-test.html?start=17
> 
> (sorry for the German link, but the graphics are self explaining.)
> 
> Edit: How many case fans do you have?


http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_1070_g1_gaming_review,11.html

Judging by this list, all the cards sound pretty much the same. I mean, if you showed this to a layperson, like my mother, she would probably say they are all about equal, *whereas you and I both know the R9 290X is a much, much louder card than the G1 Gaming, despite the list putting the 290X at 43 dBA and the G1 at 42 dBA...*


----------



## MyNewRig

Okay, now I am playing Quantum Break (great game BTW) with +200 memory on driver 372.90. 1440p was going well, but after increasing the resolution to 4K and playing for a while, I started seeing red flashing artifacts. So it looks like applications that fill most of the available VRAM, like RotTR or QB at 4K, are where the Micron memory starts showing its true colors.


----------



## Roland0101

Quote:


> Originally Posted by *RyanRazer*
> 
> http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_1070_g1_gaming_review,11.html
> 
> Judging by this list, all cards sound pretty much the same. I mean, if you showed this to some layperson, like my mother, she would probably say they are all pretty much the same, *whereas you and me both know that r9 290x is much much louder card than G1 Gaming, despite list stating 290x is 43DBa and G1 42DBa...*


Yes, the triple-fan GTX 1070 cards are all at a very good level.
That is my point.

Your G1 was running way too hot in your system. So either the card itself was defective regarding cooling (pretty unlikely), your room temperature is pretty high, or your system cooling is not optimal.
It's of course entirely your decision whether you look into that, but if I'm right, the AMP! Extreme will not run any cooler than the G1 did.


----------



## RyanRazer

Quote:


> Originally Posted by *Roland0101*
> 
> Yes, the triple-fan GTX 1070 cards are all at a very good level.
> That is my point.
> 
> Your G1 was running way too hot in your system. So either the card itself was defective regarding cooling (pretty unlikely), your room temperature is pretty high, or your system cooling is not optimal.
> It's of course entirely your decision whether you look into that, but if I'm right, the AMP! Extreme will not run any cooler than the G1 did.


i will definitely report back in a week when it comes in house!


----------



## MyNewRig

Quote:


> Originally Posted by *RyanRazer*
> 
> i will definitely report back in a week when it comes in house!


My Strix runs from 66C to 71C on the default fan curve, depending on load, OC, power target, etc. I tested with 100% fan yesterday and it stayed at 54C after 10 minutes of continuous load. I saw a guy with the exact same card hitting 81C, so there are samples out there where the cooler doesn't make perfect contact with the die or the thermal compound wasn't applied properly. In an open case you should have perfect airflow, because the ambient air isn't restricted in any way. If you're getting such high temps in an open case, I would consider an RMA, or re-seating the cooler with a fresh application of thermal compound, though that can void the warranty on cards that have anti-tamper markings on the screws.


----------



## MyNewRig

I just found out that the EVGA GTX 1070 FTW Hybrid is coming into stock in my market in less than two weeks. If anyone has it, or can find reviews of it, that would be great. If that card comes with Samsung, it would probably be the best 1070 around. I'm afraid that asking EVGA what type of memory it ships with will just get the "we cannot guarantee the memory type" kind of response; I'm tired of getting the same answer from manufacturers, so we will probably have to find out on our own.

This seems to be a new production run, and maybe, just maybe, after the Micron memory fiasco EVGA has made up its mind to use Samsung on its top-of-the-line 1070...

What do you guys think about that card?


----------



## Star Forge

Quote:


> Originally Posted by *MyNewRig*
> 
> I just found out that the EVGA GTX 1070 FTW Hybrid is coming to stock in my market in less than two weeks, if anyone has that or can find reviews about it would be great, if only that card comes with Samsung then it would probably be the best 1070 around, i am afraid to ask EVGA about what type of memory this one comes with and get the "we can not guarantee the memory type" kind of response, am just tired of getting the same response from manufacturers so we probably have to find out on our own.
> 
> This is a new production it seems, and maybe, i mean maybe after that Micron memory fiasco EVGA has made up its mind to use Samsung on its top of the line 1070 ...
> 
> What do you guys think about that card?


It uses the FTW PCB so Micron most likely. The Hybrid is just a FTW with a Hybrid cooler.


----------



## MyNewRig

Quote:


> Originally Posted by *Star Forge*
> 
> It uses the FTW PCB so Micron most likely. The Hybrid is just a FTW with a Hybrid cooler.


It is a new batch of the FTW, so why couldn't they change their mind about using Micron memory? Why would they insist on it?


----------



## MyNewRig

Also curious whether the review sample of that one will be sent out with Samsung too. It would be really hilarious to make special review samples with Samsung and retail samples with Micron, even for very recently produced cards. I am really looking forward to a review of that card...


----------



## Star Forge

Quote:


> Originally Posted by *MyNewRig*
> 
> It is a new batch of the FTW, so why can't they change their mind about using Micron memory? why would they insist?


Well, don't you think that by now a second batch would already be depleted, and people buying cards now would be on the third batch? The 1070 is a hot seller, and if EVGA acknowledged any official issues, we should already be seeing Samsung FTWs back in the retail wild. EVGA is not going to make Samsung-only boards for the Hybrids when they are just slapping the Hybrid cooler on a FTW; it would make no sense for them. If the Hybrids were getting Samsung, then so would the FTWs, and so far I am not seeing a retail shortage of FTWs. If there were a shortage, it would seem very likely they were making changes to the PCB and most likely going back to Samsung. However, it doesn't seem that way.

As I said, my Samsung FTW is going back to EVGA as it is a crappy overclocker, and my Micron one is doing fine so far, so I am not worried. Worst case, I can fork over another $100 and EVGA Step-Up to a 1080 SC and deal with it.


----------



## MyNewRig

Quote:


> Originally Posted by *Star Forge*
> 
> so far I am not seeing a retail shortage of FTW's. If they would to have a shortage, then it seems very likely they are making changes to the PCB and most likely going back to Samsung. However, it doesn't seem that way.


That is a very important observation; there was indeed a shortage of Strix stock for about three weeks before the flood of Micron Strix cards came back in the thousands.


----------



## Lavajuice

So after following the memory issue for the past few weeks, I finally pulled the trigger on a 1070. I went with Gigabyte due to a nice rebate and a free GoW 4 copy. I didn't want to keep waiting because I had (unfortunately) already gotten a G-SYNC monitor, so I figured I'd take the risk and hope I'd either get lucky with a Samsung (others have recently reported Samsung on different Gigabyte models too) or that the driver would help solve the Micron issue. Turns out I ended up with a Samsung!

Got it 2 days ago off Newegg: http://www.newegg.com/Product/Product.aspx?Item=N82E16814125871 Hopefully this is helpful information for you guys!


----------



## RyanRazer

Quote:


> Originally Posted by *MyNewRig*
> 
> My Strix runs from 66c to 71c on the default fan curve depending on load, OC or not, power target etc ... i tested with 100% fan yesterday and it stayed at 54c after 10 minutes of continuous load, i saw a guy with the same exact card hitting 81c , so there are some samples out there that seem to have the cooler not making perfect contact with the die or the thermal compound is not applied properly, in an open case you should have the perfect airflow because the ambient air is not being restricted in any way, if you getting such high temps in an open case then i would consider an RMA or re-seating the cooler with new thermal compound application but that could void the warranty on some cards that have anti-intrusion markings on the screws.


No worries, G1 is no longer mine. Sold it, waiting for amp ext


----------



## Waleh

Hey guys! I'm planning on getting a 1070 FE soon for my ITX build and I had a question. I currently have a 6600k and plan to mainly play BF1 once it comes out. However, I've been reading mixed reviews where some people are saying the 6600k bottlenecks the 1070 and others are saying you're still getting great performance with a 6600k/1070. I would be moving from a 970 to a 1070. I have considered a 6700k but I don't really want to spend money on a new processor. What do you guys say? Thanks


----------



## jlhawn

Quote:


> Originally Posted by *Waleh*
> 
> Hey guys! I'm planning on getting a 1070 FE soon for my ITX build and I had a question. I currently have a 6600k and plan to mainly play BF1 once it comes out. However, I've been reading mixed reviews where some people are saying the 6600k bottlenecks the 1070 and others are saying you're still getting great performance with a 6600k/1070. I would be moving from a 970 to a 1070. I have considered a 6700k but I don't really want to spend money on a new processor. What do you guys say? Thanks


It will run just fine with the 6600K; I have no issues with my old X58 system and a GTX 1070 that's overclocked to a 1975MHz boost.


----------



## BroPhilip

Quote:


> Originally Posted by *Waleh*
> 
> Hey guys! I'm planning on getting a 1070 FE soon for my ITX build and I had a question. I currently have a 6600k and plan to mainly play BF1 once it comes out. However, I've been reading mixed reviews where some people are saying the 6600k bottlenecks the 1070 and others are saying you're still getting great performance with a 6600k/1070. I would be moving from a 970 to a 1070. I have considered a 6700k but I don't really want to spend money on a new processor. What do you guys say? Thanks


I have the same setup with 6600k clocked at 4.7 aircooled. I played the open beta all maxed out with frames to spare. You'll be just fine...


----------



## Waleh

Quote:


> Originally Posted by *jlhawn*
> 
> It will run just fine with the 6600k, I have no issues with my old X58 system and a GTX 1070 thats over clocked to 1975mhz boost.


Thanks! Good to know
Quote:


> Originally Posted by *BroPhilip*
> 
> I have the same setup with 6600k clocked at 4.7 aircooled. I played the open beta all maxed out with frames to spare. You'll be just fine...


Ah okay. I play on a 144Hz monitor, so I wanted to squeeze out as many frames as I can get, hence the upgrade to a 1070. Out of curiosity, how do you think frames would differ between your CPU at stock and the 4.7 OC you've got? In other words, is the OC giving you substantial real-world performance? I have a tiny ITX case with a small air cooler, so I definitely can't clock that high.


----------



## khanmein

Quote:


> Originally Posted by *Star Forge*
> 
> The thing is not all Samsung PCB's can overclock well. I swear the one I have that is a Samsung board can't overclock for crap. The issue is mostly due to failing voltage states sustaining overclocked power more than anything. I am considering a retest myself shortly but out of the box, the Samsung board I got from an RMA isn't beating out the Micron board.


I suggest Samsung not for the overclocking potential but for stability. A Micron chip at defaults can also cause artifacts, so to reduce the risk to a minimum, why not go for Samsung, which has fewer issues than Micron?

Why don't they use Samsung chips? Obviously the Micron chips are cheaper and reduce cost, like the 9xx series: early batches came with Samsung, then later ones with Hynix and Elpida (now Micron).

By the way, I totally agree with what you highlighted, that the written BIOS isn't good enough, but at the end of the day, getting Samsung VRAM is the wiser choice.


----------



## long99x

just got a 1070 amp extreme yesterday


----------



## Star Forge

Quote:


> Originally Posted by *khanmein*
> 
> i suggest samsung is not due to the potential over-clocking but stability. default micron chip also can cause artifacts & to reduce the risk to minimal y not go for samsung with lesser issue compare with micron.
> 
> y they don't use samsung chip? obviously micron chip is way cheaper & reduce the cost. like previously 9xx series early batches come with samsung then follow up with hynix, elpida aka micron.
> 
> by the way, i totally agreed what u highlighted the written bios is not good enough but at the end of the day, get samsung vram is a wiser choice.


To be honest, I am keeping my Micron board because so far it has been more stable for me than the Samsung board overall. If there is a notable issue with Micron, then I hope nVidia fixes it very soon with a revised BIOS for the many of you who are struggling. Both of my boards have been stable at stock clocks, but in my tests the Samsung one doesn't hold overclocks better than my Micron, so that alone means I am sticking with the Micron. If you all think Samsung is better, then go for it, but to me, not all Microns are bad. Would I want more stability, especially when overclocked? Yes. But is my overclock bad right now on Micron? No. Is it better than the Samsung I had? To be honest, yes.


----------



## MyNewRig

Quote:


> Originally Posted by *Star Forge*
> 
> To be honest, I am keeping my Micron board because so far that thing is more stable for me than the Samsung board overall. If there is a notable issue with Micron, then I hope nVidia is going to fix it very soon with a revised BIOS to many of you who are struggling. Both of my boards has been stable at stock clocks, but in my tests the Samsung one doesn't seem to hold overclocks better than my Micron, so that alone means I am sticking with the Micron. If you all think Samsung is better then go for it, but to me, not all of the Micron's are bad but would I want more stability especially on an overclocked front? Yes, but is my overclocked bad right now on Micron? No. Is it better than the Samsung that I had? To be honest, yes.


How far are you able to OC your Micron board on the memory side? Also, at stock settings, do you notice any frame "fluidity" differences between the two cards? What resolution do you game at? If you run 2K or 4K, could you measure frame time latency on both cards at those resolutions at stock settings and report back?

There are also a few leaks that the 1070 will get a refresh with GDDR5X around the time Vega releases, which is in about 4 months, and that might explain the switch to Micron.


----------



## khanmein

Quote:


> Originally Posted by *Lavajuice*
> 
> So after following the memory issue for the past few weeks, I finally pulled the trigger on a 1070. I went with Gigabyte due to a nice rebate and a free GoW 4 copy. I didn't want to keep waiting because I had unfortunately already bought a G-SYNC monitor, so I figured I'd take the risk and hope I either get lucky with a Samsung (I saw others getting Samsung on different Gigabyte models recently) or that the driver helps solve the Micron issue. Turns out I ended up with a Samsung!
> 
> 
> 
> 
> 
> 
> 
> Got it 2 days ago off Newegg http://www.newegg.com/Product/Product.aspx?Item=N82E16814125871 Hopefully this is helpful information for you guys!


Great news for you, but I personally won't get a Gigabyte. Look at JayzTwoCents giving away his review-sample GTX 1080 G1 Gaming.

Usually he gives the better products to his friends and family.
Quote:


> Originally Posted by *MyNewRig*
> 
> How far are you able to OC your Micron board on the memory side? also on stock settings, do you notice any frame "fluidity" differences between both cards at stock? what resolution do you game on? if you do 2K or 4K, can you measure frame time latency on both cards at these high resolutions at stock settings and report back?
> 
> There are also a few leaks that the 1070 will be getting a refresh with GDDR5X around the time VEGA releases which is in about 4 months or so, this might explain the switch to Micron


I think they reserved the Samsung VRAM chips for the upcoming GTX 1080 Ti in case of a shortage. It's great news for Star Forge if the Micron chips on his card are solid, but I don't want to risk my hard-earned money like that. Now I'm looking for a used GTX 1070.


----------



## Star Forge

Quote:


> Originally Posted by *MyNewRig*
> 
> How far are you able to OC your Micron board on the memory side? also on stock settings, do you notice any frame "fluidity" differences between both cards at stock? what resolution do you game on? if you do 2K or 4K, can you measure frame time latency on both cards at these high resolutions at stock settings and report back?
> 
> There are also a few leaks that the 1070 will be getting a refresh with GDDR5X around the time VEGA releases which is in about 4 months or so, this might explain the switch to Micron


I don't have a 2K monitor, but I do supersample 4K via DSR on 1080p. In terms of fluidity, the two feel the same at stock, and latency seems fine (I know Pascal in general had latency issues that nVidia supposedly fixed in a driver a while back). I can do some tests later in the week while I still have that card. The RAM I was able to get stable is around +400, and that is with the FTW's slave BIOS feeding the card a 122% power target and the core clock pulled down to 2088-2100 MHz. Adding voltage doesn't reliably add stability on either card, which makes me feel the 1070's biggest enemy is how the voltage tables work in GPU Boost 3.0. The more this gets discussed, the more I'm willing to plug the Samsung card back in for a final retest. My Samsung board at stock voltage can barely hit 2100 MHz core and 8600 MHz RAM before Heaven kicks the bucket. The Micron one can go up two more bins and still hold 8800 MHz with a bit more stable voltage.


----------



## RyanRazer

Quote:


> Originally Posted by *long99x*
> 
> 
> 
> just got a 1070 amp extreme yesterday


I'm waiting for mine. Are you satisfied with it?


----------



## MyNewRig

Quote:


> Originally Posted by *Star Forge*
> 
> I don't have a 2K monitor, but I do supersample 4K via DSR on 1080p. In terms of fluidity, the two feel the same at stock, and latency seems fine (I know Pascal in general had latency issues that nVidia supposedly fixed in a driver a while back). I can do some tests later in the week while I still have that card. The RAM I was able to get stable is around +400, and that is with the FTW's slave BIOS feeding the card a 122% power target and the core clock pulled down to 2088-2100 MHz. Adding voltage doesn't reliably add stability on either card, which makes me feel the 1070's biggest enemy is how the voltage tables work in GPU Boost 3.0. The more this gets discussed, the more I'm willing to plug the Samsung card back in for a final retest. My Samsung board at stock voltage can barely hit 2100 MHz core and 8600 MHz RAM before Heaven kicks the bucket. The Micron one can go up two more bins and still hold 8800 MHz with a bit more stable voltage.


The first thing that drew my attention to the Micron card having something off about it, even before I knew GPU-Z had memory type detection, is that it felt laggy at 2K compared to the Samsung one. That was the first symptom I noticed the minute I switched cards. I wasn't knowledgeable enough about the issue back then to actually measure frame time latency, which is why I'd like you to run that test if possible. My feeling is that the timings of the Micron ICs are looser, and that it manifests at higher resolutions that are VRAM dependent.

By "stable at +400", are we talking about stability with voltage/power tricks, or stability at stock power management settings? If you are stable at +400 on stock power settings, then you have indeed got an exceptional sample. I was testing this morning with the 3DMark Fire Strike stress test, and I cannot pass with anything higher than +200. At +250 it does not crash or anything, but it stops the test in the middle for "frame time inconsistency", which probably indicates that the memory is producing errors and the error correction kicking in is hurting frame time consistency. Locking the voltage or "prefer maximum performance" did not help much.
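
The "frame time inconsistency" idea above can be sketched as a simple check on logged frame times. The 20% threshold and the sample data below are made-up illustrations, not 3DMark's actual (unpublished) criterion:

```python
# Rough sketch: flag a benchmark run as inconsistent when the slowest 1% of
# frames deviate too far from the median frame time. Threshold is arbitrary.
import statistics

def frame_times_consistent(frame_times_ms, max_deviation=0.20):
    """Return True if the 99th-percentile frame time stays near the median."""
    ordered = sorted(frame_times_ms)
    median = statistics.median(ordered)
    p99 = ordered[int(len(ordered) * 0.99) - 1]
    return (p99 - median) / median <= max_deviation

steady = [16.7] * 95 + [17.0] * 5   # smooth run at ~60 FPS
spiky  = [16.7] * 95 + [25.0] * 5   # occasional long frames, as with VRAM errors
print(frame_times_consistent(steady))  # True
print(frame_times_consistent(spiky))   # False
```

With error-correcting retransmits on the memory bus, average FPS can look fine while the tail of the frame-time distribution blows out, which is exactly what a percentile-vs-median check catches.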

I also figured out that the +100 voltage offset was causing instability, so I disabled Afterburner's voltage monitoring and control and just set the sliders to 120% power and +60 core, which brings me to 2050 MHz effective and throttles down to 2025-2012 MHz shortly after load is applied. Increasing voltage does not raise the core OC limit; it just uses more power to hold the same frequency, resulting in about 3°C warmer temps at the same core clock.

Your Samsung sample only doing 8600 MHz is pretty rare, but it happens. Did your Samsung ever crash the system with a BSOD when you went higher, or was it just a soft crash like in my case?
Quote:


> Originally Posted by *khanmein*
> 
> I think they reserved the Samsung VRAM chips for the upcoming GTX 1080 Ti in case of a shortage. It's great news for Star Forge if the Micron chips on his card are solid, but I don't want to risk my hard-earned money like that. Now I'm looking for a used GTX 1070.


Reserving the Samsung GDDR5 8 Gbps chips for the 1080 Ti is highly unlikely. The 1080 Ti is probably getting 10 Gbps modules, and the Samsung chips previously used in the 1070 cannot do that; I don't think any current GDDR5 can reach 10 Gbps aside from the GDDR5X modules Micron has developed.
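
For reference, the per-pin data rates being thrown around translate into total memory bandwidth like this (the 256-bit bus width comes from the public 1070/1080 spec sheets; the helper function is just a worked example):

```python
# Total memory bandwidth = per-pin data rate x bus width / 8 bits per byte.
def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Return bandwidth in GB/s from a per-pin rate in Gb/s."""
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gbs(8.0, 256))   # GDDR5 @ 8 Gb/s   -> 256.0 GB/s (GTX 1070)
print(bandwidth_gbs(10.0, 256))  # GDDR5X @ 10 Gb/s -> 320.0 GB/s (GTX 1080)
```

This is also why the per-pin Gb/s figure, not GB/s, is the number the chips themselves are rated at; the GB/s totals depend on how wide a bus the card pairs them with.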


----------



## Roland0101

Quote:


> Originally Posted by *Waleh*
> 
> Ah okay. I play on a 144 hz monitor so I wanted to squeeze as many frames as I can get hence the upgrade to a 1070. Out of curiosity, how do you think frames will differ from your CPU at stock vs the 4.7 OC you got? In other words, is the OC giving you substantial real world performance? I have a tiny itx case with a small air cooler so I definitely can't clock that high.


BF1 is a CPU-heavy game, so his OC will give him a decent real-world performance boost.
Still, you don't have to worry; it should run just fine on your rig without overclocking the CPU.


----------



## RyanRazer

Quote:


> Originally Posted by *Waleh*
> 
> Ah okay. I play on a 144 hz monitor so I wanted to squeeze as many frames as I can get hence the upgrade to a 1070. Out of curiosity, how do you think frames will differ from your CPU at stock vs the 4.7 OC you got? In other words, is the OC giving you substantial real world performance? I have a tiny itx case with a small air cooler so I definitely can't clock that high.
Quote:


> Originally Posted by *Roland0101*
> 
> BF1 is a CPU heavy game, so his OC will give him a decent real world performance boost.
> Still, you don't have to worry it should run just fine on your rig without OC the CPU.


I wouldn't be so sure. I don't know whether BF1 is a CPU-heavy game, I didn't look it up, but if it is, the 6600K could be a bit weak. My years-old 4790 has a better single-threaded score, and obviously better multi-threaded, since it has Hyper-Threading for 8 threads, whereas the 6600K doesn't have HT. New games usually benefit from multiple threads; they say some games can utilize even more than 8, like on an i7-6850K with 6 cores / 12 threads.


----------



## RyanRazer

Here is the video I was referring to... worth a look.










As you can see, the difference manifests itself best at the minimums. With the 6600K the mins are substantially lower than with the 6700K, at the same frequency and with the exact same GPU. The draw calls are something a 4-thread CPU can't handle in tight situations; this is where 8 threads come in handy.

If you look at the scenario: same generation CPUs, same frequency, same architecture; the only difference is the thread count.

Now I imagine an i7-6850K with 12 threads would perform even better here, even though it would probably run at a bit lower clock speeds. Would be cool to see it in the list.


----------



## rfarmer

Quote:


> Originally Posted by *RyanRazer*
> 
> Here the video i was referencing to... Worth the look.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As you can see, the difference manifests itself best at the minimums. With the 6600K the mins are substantially lower than with the 6700K, at the same frequency and with the exact same GPU. The draw calls are something a 4-thread CPU can't handle in tight situations; this is where 8 threads come in handy.
> 
> If you look at the scenario: same generation CPUs, same frequency, architecture, the difference is just in threads.
> 
> Now I imagine an i7-6850K with 12 threads would perform even better here, even though it would probably run at a bit lower clock speeds. Would be cool to see it in the list.


I had watched one of his earlier videos on the same topic; he blows it out of proportion, making it seem like the 6700K is better in all situations. In most games the 6600K and 6700K are close enough that it doesn't even matter, but as you see in Crysis and The Witcher 3, there are definitely games that benefit from the extra 4 threads.

The minimum requirement for BF1 is the 6600K, with the 6700 recommended, so this may very well be one of those games that benefits from the 4 extra threads.


----------



## Roland0101

Quote:


> Originally Posted by *RyanRazer*
> 
> Originally Posted by Waleh View Post
> 
> Ah okay. I play on a 144 hz monitor so I wanted to squeeze as many frames as I can get hence the upgrade to a 1070. Out of curiosity, how do you think frames will differ from your CPU at stock vs the 4.7 OC you got? In other words, is the OC giving you substantial real world performance? I have a tiny itx case with a small air cooler so I definitely can't clock that high.
> 
> I wouldn't be so sure. I don't know about BF1 being a CPU heavy game, didn't look it up, but if it is, 6600k could be a bit weak. My years old 4790 has a better single threaded score and multi threaded obviously since it has HT 8 cores, whereas 6600 isn't HT. New games usually benefit from multiple threads, they say some games can utilize even more threads than 8. Like i7-6850K, 6 cores / 12 threads...


It always depends on the game engine. Some benefit, but most do not.
http://www.guru3d.com/articles_pages/core_i7_6950x_6900k_6850k_and_6800k_processor_review,16.html

Most games do not benefit from more than 8 threads. It is also worth noting that the higher your resolution, the smaller the CPU's influence, as long as the CPU is not a bottleneck in the strict sense of the word.

But that is unrelated to the original question, which was whether the 6600K will be a significant bottleneck in BF1, and your own link proves the 6600K is a pretty good gaming CPU.
Of course it is not as fast as the 6700K, which has the better single-core performance (the deciding factor in most games), but it performs very well, and it won't be a real bottleneck for BF1.


----------



## RyanRazer

Quote:


> Originally Posted by *rfarmer*
> 
> I had watched one of his earlier videos on the same topic, he blows it out of proportion more than it really is making it seem like the 6700k is better in all situations. Most games the 6600k and 6700k are close enough to not even matter, but as you see in Crysis and The Witcher 3 there are definitely games that benefit from the extra 4 threads.
> 
> Minimum requirements on BF1 is the 6600k with 6700 as recommended, so this may very well be one of those games that benefit from the 4 extra threads.


That the 6700K is better in every situation is not true; however, I think we can agree the 6700K is better than or equal to the 6600K in every situation. Some games benefit, some don't. It's up to each of us whether that is worth the extra $100.

On another note, in theory more and more games will utilize more threads as consoles get higher-threaded CPUs. The new PS4 Pro runs an x86-64 AMD "Jaguar" with 8 cores, and the PS5, coming out in a year or so, will presumably have 8 cores or even more. But probably 8.
We all know consoles are the majority of the gaming market and there's much more money in consoles, so developers will start coding with multi-threaded CPUs in mind, probably leading to better performance on the PC side too (easier porting, I assume).

I am no developer, so there is plenty of assumption and speculation in this post.


----------



## RyanRazer

Quote:


> Originally Posted by *Roland0101*
> 
> It always depends on the game Engine. Some benefit but most do not.
> http://www.guru3d.com/articles_pages/core_i7_6950x_6900k_6850k_and_6800k_processor_review,16.html
> 
> Most games do not benefit from more than 8 threads. It is also worth noticing that the higher your resolution gets the CPU influence is reduced as long as your CPU is not a bottleneck in the sense of the word.
> 
> But that is unrelated to the question that was if the 6600k will be a significant bottleneck in BF1, and your own link proves that the 6600k is a pretty good gaming CPU.
> Of course it is not as fast as the 6700k who has the better single core performance (the deciding factor at most games.) but it is preforming very good, and it won't be a real bottleneck for BF1.


Well, single threaded performance with 6600 and 6700 should be pretty much the same.




Note: the 6700K test was run at a higher frequency, 4.0-4.2 GHz vs the 6600K's 3.5-3.9 GHz. If you clocked both at, say, 4.4 GHz, I think the scores would be near identical. I don't own these CPUs and can't verify this, and I don't know how each overclocks, but I don't think there is much difference in IPC.

So these two CPUs are near identical, except that the 6700K has 8 threads while the 6600K does not. Assuming upcoming games will utilize 8 threads (looking at the console trend), I think it is safe to assume the 6700K is quite a bit the better option, especially taking future-proofing into account.
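
The "same IPC, different clocks" point above can be checked by normalizing a single-threaded benchmark score by clock speed. The scores and clocks below are hypothetical placeholders, not real measurements:

```python
# Crude IPC proxy: single-threaded benchmark points per GHz. If two chips
# share the same core (both Skylake here), this ratio should land within a
# few percent of each other even when their stock clocks differ.
def score_per_ghz(score: float, clock_ghz: float) -> float:
    """Return benchmark points per GHz as a rough IPC stand-in."""
    return score / clock_ghz

i5_6600k = score_per_ghz(score=152.0, clock_ghz=3.9)  # hypothetical ST score at boost
i7_6700k = score_per_ghz(score=164.0, clock_ghz=4.2)  # hypothetical ST score at boost

diff_pct = abs(i5_6600k - i7_6700k) / i5_6600k * 100
print(f"IPC proxy difference: {diff_pct:.1f}%")
```

If the per-GHz numbers come out nearly equal, the raw score gap is explained by clocks alone, which is the argument being made for clocking both at 4.4 GHz.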


----------



## Hnykill

I have a Palit Super JetStream GTX 1070 and use the ThunderMaster utility that comes with it; it's simple and very easy. RGB lights and one badass cooler. It overclocks well, and the plastic shroud around the heatsink allows good airflow. It has Micron memory, but it is one of the better ones for overclocking: 9200 MHz at all times, core at 2100 MHz, custom fan profile, and dead silent. The fans on this card are big and move a lot of air at low RPM. If you are thinking about buying a GTX 1070, I would recommend this one.


----------



## Roland0101

Quote:


> Originally Posted by *RyanRazer*
> 
> So these two CPUs are near identical, except that the 6700K has 8 threads while the 6600K does not. Assuming upcoming games will utilize 8 threads (looking at the console trend), I think it is safe to assume the 6700K is the better option, especially taking future-proofing into account.


The 6700K is of course better; that's why it is significantly more expensive.









Here is Waleh's question:
Quote:


> Originally Posted by *Waleh*
> 
> Hey guys! I'm planning on getting a 1070 FE soon for my ITX build and I had a question. I currently have a 6600k and plan to mainly play BF1 once it comes out. However, I've been reading mixed reviews where some people are saying the 6600k bottlenecks the 1070 and others are saying you're still getting great performance with a 6600k/1070. I would be moving from a 970 to a 1070. I have considered a 6700k but I don't really want to spend money on a new processor. What do you guys say? Thanks


While the 6700K is better, the 6600K is a good CPU that will not prevent him from playing BF1 at very good settings.
I think we should not unsettle Waleh here and push him into buying a CPU he doesn't need for the task he specifically asked about.


----------



## Waleh

Thanks so much for all the responses! Of course the 6700K is a better CPU than the 6600K due to HT and newer titles taking advantage of it. However, I think I'm going to stick with my 6600K, as the upgrade from a 970 to a 1070 is quite substantial for me and I don't want to spend the extra money right now on a new CPU.


----------



## Tobe404

Not sure why the clocks read differently in each utility, but in game it boosts to a max of 2113 MHz and the memory sits at about 9250 MHz. It maxes out at about 72°C in games and is dead quiet. Very happy.


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> Reserving Samsung GDDR5 8 Gb/s for the 1080Ti is highly unlikely, the 1080Ti is probably getting 10 Gb/s modules and the Samsung chips previously used in the 1070 can not do that, i don't think that any current GDDR5 tech is able to do 10 Gb/s aside from the X modules that Micron has developed.


10.5 Gbps and up is for the GTX 1080 and Titan XP. GDDR5 has already reached its practical limit at around 9 Gbps. And FYI, if the GTX 1080 Ti came with 10 Gbps memory like you said, who would buy a Titan XP or even a GTX 1080? NV won't repeat what they did with Maxwell.


----------



## RyanRazer

Quote:


> Originally Posted by *Waleh*
> 
> Thanks! Good to know
> Ah okay. I play on a 144 hz monitor so I wanted to squeeze as many frames as I can get hence the upgrade to a 1070. Out of curiosity, how do you think frames will differ from your CPU at stock vs the 4.7 OC you got? In other words, is the OC giving you substantial real world performance? I have a tiny itx case with a small air cooler so I definitely can't clock that high.


Quote:


> Originally Posted by *Roland0101*
> 
> The 6700k is of course better, that's why it is significantly more expensive.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> here is Waleh question.
> While the 6700K is better, the 6600K is a good CPU that will not prevent him from playing BF1 at very good settings at all.
> I think we should not unsettle Waleh here and push him into buying a CPU he doesn't need for the task he specifically asked about.


I was throwing the 6700K into the game because he stated "_*Ah okay. I play on a 144 hz monitor so I wanted to squeeze as many frames as I can get hence the upgrade to a 1070*_". Of course it is solely his decision how he spends his money. I just gave my perspective and showed some benches from "The Good Old Gamer". That's all.


----------



## rfarmer

Quote:


> Originally Posted by *Waleh*
> 
> Thanks so much for all the responses! Of course the 6700k is a better CPU than the 6600k due to HT and newer titles taking advantage of HT. However, I think I'm going to stick to my 6600k as the upgrade from a 970 to 1070 is quite substantial for me and I don't want to spend the extra money right now on a new CPU


I have a 6600k AND a GTX 1070 and I am really happy with the combo; it plays every game I throw at it at extremely high fps.


----------



## tps3443

Quote:


> Originally Posted by *MyNewRig*
> 
> You switched from RX 480 to GTX 1070? if so why did you buy the 480 in the first place and then why you switched?


I bought an RX 480 because they were supposed to be great cards, but you could bog it down slightly at 1080p, much like a GTX 1080 at 4K.

Then I bought a GTX 1070.

Then I bought a GTX 1080.

The GTX 1080 is a great card. Once overclocked, you will average 60 FPS at 4K, with an occasional dip into the 40s. But it's a pretty smooth experience!

Oh, and a GTX 1070 smokes an RX 480; there is no comparison. An RX 480 falls short of a GTX 980 in a lot of games, whereas a GTX 1070 is as fast as two GTX 970s in SLI in a lot of games.

Going from the RX 480 to the 1070 was HUGE.

Going from the GTX 1070 to the GTX 1080 was not as huge, but it was still a nice boost.


----------



## tps3443

Quote:


> Originally Posted by *Waleh*
> 
> Thanks so much for all the responses! Of course the 6700k is a better CPU than the 6600k due to HT and newer titles taking advantage of HT. However, I think I'm going to stick to my 6600k as the upgrade from a 970 to 1070 is quite substantial for me and I don't want to spend the extra money right now on a new CPU


GPU is paramount for me! I run a 6600K and a GTX 1080.

I'd recommend fast memory though: DDR4-4000 if you play Fallout 4.


----------



## tps3443

Quote:


> Originally Posted by *Waleh*
> 
> Hey guys! I'm planning on getting a 1070 FE soon for my ITX build and I had a question. I currently have a 6600k and plan to mainly play BF1 once it comes out. However, I've been reading mixed reviews where some people are saying the 6600k bottlenecks the 1070 and others are saying you're still getting great performance with a 6600k/1070. I would be moving from a 970 to a 1070. I have considered a 6700k but I don't really want to spend money on a new processor. What do you guys say? Thanks


I play the BF1 beta with my 6600K and a GTX 1080 at 4K. My 6600K does a fantastic job!

I prefer GPU power over CPU power.

A 6600K will not bottleneck a GTX 1070. I had one before, and it always stayed at 99% GPU utilization.

And 93+% of all 6600Ks will run at 4.65 GHz+, which helps it out a little more.

If you have the money for a 6700K, get it! But if buying the 6700K makes you step down on GPU, for example a 6700K with an RX 480, it is not worth it.

A 6600K with a GTX 1070 is a lot more capable than a 6700K with an RX 480/GTX 1060.

BF1 is demanding. You will need the GPU horsepower.


----------



## MyNewRig

Quote:


> Originally Posted by *tps3443*
> 
> Going from GTX1070, to GTX1080 was not as huge. But, it was still a nice boost.


It is not huge in terms of performance, but that nice boost costs 35% more. The 1080 is ridiculously priced, especially here in Europe (about $890), and I am not willing to spend that much on a GPU knowing it will become obsolete in a matter of months, or at least get a significant price reduction once the 1080 Ti is released. It feels like throwing money down the toilet, even though I would highly appreciate a solid 60 FPS at 4K.


----------



## tps3443

Quote:


> Originally Posted by *khanmein*
> 
> 10.5 GB/s ++ is for GTX 1080 & Titax XP. GDDR5 already reached the potential limit of 9 GB/s + & FYI, GTX 1080Ti come with GDDR5 if like u said 10 GB/s who wanna buy Titan XP or even GTX 1080? NV won't repeat the same like they did on Maxwell..


GTX 1080 Ti specs have already been officially released.


----------



## MyNewRig

Quote:


> Originally Posted by *tps3443*
> 
> And 93+% of all 6600K's will run at 4.65GHz+ so this will help it out a little more.


How are you getting 4.65 GHz on your 6600K? Mine OCs very easily to 4.6 GHz and stays below 60°C under stress on air, but at 4.7 GHz it starts getting a bit unstable, so 4.65 GHz is probably my optimal range. I'm not sure how to get there with multiplier OC though; I can only dial in a 46 or 47 multiplier, not 46.5 for example, so how are you getting that frequency?
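
One way to land between whole multipliers is to nudge the base clock (BCLK) instead: the effective frequency is multiplier × BCLK, so 46 × 101.1 MHz is roughly 4.65 GHz. The sketch below just enumerates combinations; the BCLK range is an illustrative assumption, and you should check your board's BCLK limits before raising it:

```python
# Sketch: find multiplier/BCLK combinations near a target CPU frequency.
# Multipliers are whole numbers on Skylake, but BCLK can move in small steps.
def combos(target_mhz: float, tolerance_mhz: float = 10.0):
    """Yield (multiplier, bclk_mhz, effective_mhz) tuples near the target."""
    for mult in range(40, 50):
        for bclk_tenths in range(1000, 1041):  # 100.0 to 104.0 MHz in 0.1 steps
            bclk = bclk_tenths / 10
            eff = mult * bclk
            if abs(eff - target_mhz) <= tolerance_mhz:
                yield mult, bclk, eff

for mult, bclk, eff in combos(4650.0):
    print(f"{mult} x {bclk:.1f} MHz = {eff:.0f} MHz")
```

On Skylake the BCLK domain is decoupled from most other clocks, which is why small BCLK offsets are a common way to hit half-multiplier frequencies; on older platforms the same trick dragged other buses along with it.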


----------



## rfarmer

Found this review; I think it shows that the 6600K really is all you need, and that an OC does make a real difference.

http://techbuyersguru.com/intels-core-i5-6600k-vs-i7-6700k-vs-i7-6900k-games?page=0


----------



## tps3443

Quote:


> Originally Posted by *MyNewRig*
> 
> It is not huge in terms of performance but that nice boost costs 35% more in price, the 1080 is ridiculously priced especially here in Europe (about $890) and i am not willing to spend that much on a GPU knowing that it will become obsolete in a matter of months, or at least gets a significant price reduction once the 1080Ti is released, it just feels like throwing money down the toilet even though i would highly appreciate a good 60 FPS @ 4K.


OK, first of all, the GTX 980 was out for 22 months before the GTX 1080 was released.
That is a two-year life cycle!

The GTX 1080 Ti has already been announced, with specs released. It will be $900.

So the $619-$649.99 price of a GTX 1080 is not getting any cheaper for now.

The GTX 1080 Ti costs almost 50% more than a GTX 1080, so why would Nvidia let that gap grow to 60-75%? Lol. That is silly.

I also only paid $280 for my GTX 1080, brand new.

The GTX 1080 is expected to have a long two-year life cycle, and the GTX 1080 Ti will have its own two-year cycle as well, because they release at different times.

You gotta pay to play!

Starting level: GTX 1060, $250-$329

Mid to high level: GTX 1070, $389-$469

High end: GTX 1080, $600-$699

Ultra high: GTX 1080 Ti, $899-$1000+

How does all that not sound fair?

I mean, I'm going to upgrade to a GTX 1080 Ti in January, and it offers only a 15% boost at some times, 25% at others.

A GTX 1070 is as fast as a Titan X, and an overclocked GTX 1080 can be up to 45% faster than a Titan X. So that actually is a reasonable boost!


----------



## long99x

Quote:


> Originally Posted by *RyanRazer*
> 
> I'm waiting for mine. U satisfied?


Yeah, the card runs very cool: idle is 30°C, and after about 3 hours of The Witcher 3 on the auto fan profile the max temp was 59°C. Default boost maxes out at 2023 MHz. This card is fine for me.









Only one issue: at idle my card is stuck at 1025 MHz and I don't know how to fix it. I use triple monitors.


----------



## MyNewRig

Quote:


> Originally Posted by *tps3443*
> 
> I also only paid $280 for my GTX1080 brand new.


??


----------



## MyNewRig

Quote:


> Originally Posted by *long99x*
> 
> Only one issue, when idle my card stuck at 1025mhz, don't know how to fix it, I use tripple monitors


You cannot fix it; this is normal behavior and has been for many generations. Once your combined resolution crosses a certain threshold, the card automatically stays in a higher performance state even on an idle desktop. I now run a combined 6K of desktop space (2K plus 4K): if I only run the 2K monitor or the 4K TV, I don't get this constant pump, but with all monitors active at a combined 6K the card always stays in high performance mode. Not only does the core run higher, the memory also runs at its maximum frequency all the time.
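
The behavior described above amounts to summing the pixel counts of all attached displays and comparing against a cutoff. The threshold value below is a made-up illustration; NVIDIA does not document the exact point at which idle downclocking is disabled:

```python
# Sketch: total desktop pixel count across monitors vs a hypothetical cutoff
# above which the driver holds higher clocks at idle.
MONITORS = [(2560, 1440), (3840, 2160)]  # "2K" monitor + 4K TV

def total_pixels(monitors):
    """Sum width x height over all attached displays."""
    return sum(w * h for w, h in monitors)

# Hypothetical cutoff: anything beyond a single 4K display's worth of pixels.
IDLE_DOWNCLOCK_LIMIT = 3840 * 2160 + 1

pixels = total_pixels(MONITORS)
state = "idle clocks" if pixels <= IDLE_DOWNCLOCK_LIMIT else "forced high-performance state"
print(f"{pixels} total pixels -> {state}")
```

The reason the driver does this at all is that reclocking the memory mid-scanout across several displays can cause visible flicker, so past a certain pixel load it simply pins the memory clock.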


----------



## tps3443

Quote:


> Originally Posted by *MyNewRig*
> 
> How are you getting 4.65Ghz on your 6600K? mine OC very easily to 4.6Ghz and runs below 60c under stress on air, but once i go to 4.7Ghz it starts getting a bit unstable, so 4.65Ghz is probably my optimal range but i am not sure how to get there with multiplier OC, i can only dial 46 or 47 multiplier, i can not do 46.5 for example, so how are you getting that frequency?


I said 4.65 meaning that's what most of them will do.

I run mine at 4,912 MHz. It has been delidded and I run 1.45 V through it.

I purchased mine from Silicon Lottery as a binned 4.8 GHz model for only $249.99, and it turns out it runs stable at 4.91 GHz as well. I can boot at 5 GHz with 1.5 V, but I cannot cool it.


----------



## tps3443

Quote:


> Originally Posted by *MyNewRig*
> 
> ??


One of those once-in-a-lifetime deals that seem too good to be true, but it actually was true. A local buy; I had to drive a 6-hour round trip at 3 AM!

Do not hold your breath on price drops. Nvidia is very greedy, and they love money, lol. But their product is reliable and fast as hell, with great drivers, and they hold their value like a Honda.


----------



## MyNewRig

Quote:


> Originally Posted by *tps3443*
> 
> OK first of all, The GTX980 was out for 22 months before GTX1080 was released.
> That is a 2 year life cycle!
> 
> The GTX1080 is expected to have a long 2 year life cycle. And the GTX1080 Ti will have the same 2 year life cycle as well! Because they release at different times.


The life cycle *WAS* two years, but that was a different market. Now everyone is on 14nm (AMD and Intel) except Nvidia, and Vega with HBM2 is coming in Jan 2017, so Nvidia will have to refresh its lineup: partly to get onto the 14nm node, and partly to counter AMD's HBM2 with GDDR5X or whatever else they have in store. So both the 1070 and 1080 will be refreshed in less than a year; the word is that the 1070 will be optimized on the 14nm node for higher clocks and even better power efficiency, and get 10 Gbps GDDR5X memory around the time Vega releases or a few months after.

Volta is what comes after that, around a year and a half out, but in the meantime the only logical move is that Pascal gets refreshed sooner than you think, and the current cards will not be worth a damn by then.


----------



## khanmein

Quote:


> Originally Posted by *tps3443*
> 
> Gtx1080ti specs have already been officially released.


Where's the official announcement?


----------



## tps3443

Quote:


> Originally Posted by *MyNewRig*
> 
> The life cycle *WAS* two years, that was in a different market, now everyone is on 14nm (AMD & Intel) except Nvidia and VEGA with HBM2 is coming in Jan 2017 so Nvidia will have to refresh its lineup, for one to get on the 14nm node and to counter AMD's HBM2 with GDDR5X or whatever else they have in store, so both the 1070 and 1080 will be refreshed in less than a year, the word is that the 1070 will be optimized on the 14nm node for higher clocks and even more power efficiency and get GDDR5X 10 Gb/s memory around the time VEGA releases or a few months after.
> 
> Volta is what will be released after around a year and a half, but during that time the only logical move is that Pascal will be refreshed sooner than you think, and the current cards will not be worth a damn by that time.


I heard that the next Nvidia GPUs will only be a Pascal V2; they are just going to refresh the lineup, because the next generation is having trouble with die shrinks and is being pushed out further.

Even Intel used to release a new socket every year, with new chipsets and a new CPU lineup.

Now they offer three years of CPUs that fit the same socket:

LGA1150 = 4770K, 4790K, 5775C (Broadwell)
LGA1151 = 6600K, 7600K

Bear in mind the 6600K is pushing 14 or 15 months now.

It used to be: buy a motherboard, and here comes a new socket and CPU.


----------



## MyNewRig

Quote:


> Originally Posted by *tps3443*
> 
> I heard that the next Nvidia GPUs will only be a Pascal V2. They are only going to refresh the GPUs, because the next generation is having trouble with die shrinks and is being pushed out further.


Yes, that is what I mean, and it will probably be called the 1170 and 1180: a 14nm die shrink, higher stock clocks, and faster memory. I expect these to be out in Q1 2017 or early Q2; they cannot let Vega take a lot of market share, so they will have to do it.

No more two-year cycle in this market; things are moving much faster than last gen.


----------



## Roland0101

Quote:


> Originally Posted by *RyanRazer*
> 
> I was bringing the 6700K into the conversation because he stated "_*Ah okay. I play on a 144 hz monitor so I wanted to squeeze as many frames as I can get hence the upgrade to a 1070*_". Of course it is solely his decision how he spends his money. I just gave my perspective and showed some benchmarks from "The Good Old Gamer". That's all.


Fair enough.


----------



## Hnykill

A 6600K will never bottleneck a GTX 1070; it's a very powerful chip. But CPU power is becoming less of a limiting factor in games. A graphics card is the best upgrade you can buy for your computer if you play games. Game designers somehow tend to overlook the CPU's potential and its multicore capability. But Intel and AMD are the only ones making good CPUs on planet Earth, so they will slowly give us what they have designed, assuring their supremacy in the meantime. Them first; then we get some chips.


----------



## long99x

Quote:


> Originally Posted by *MyNewRig*
> 
> You cannot fix it; this is normal behavior and has been for many generations. Once your combined resolution reaches a certain threshold, the card automatically goes into high performance mode even on an idle desktop. I now have a combined resolution of 6K (2K plus 4K). If I only run the 2K monitor or the 4K TV I don't get this constant pump, but if I run all monitors at a combined 6K the card always stays in high performance mode: not only does the core run higher, but the memory also runs at max frequency all the time.


Thanks for the answer, very useful.


----------



## Waleh

Ladies and Gentlemen, I have ordered the MSI 1070 FE for my ITX rig


----------



## xg4m3

Quote:


> Originally Posted by *MyNewRig*
> 
> The life cycle *WAS* two years, that was in a different market, now everyone is on 14nm (AMD & Intel) except Nvidia and VEGA with HBM2 is coming in Jan 2017 so Nvidia will have to refresh its lineup, for one to get on the 14nm node and to counter AMD's HBM2 with GDDR5X or whatever else they have in store, so both the 1070 and 1080 will be refreshed in less than a year, the word is that the 1070 will be optimized on the 14nm node for higher clocks and even more power efficiency and get GDDR5X 10 Gb/s memory around the time VEGA releases or a few months after.
> 
> Volta is what will be released after around a year and a half, but during that time the only logical move is that Pascal will be refreshed sooner than you think, and the current cards will not be worth a damn by that time.


True, but personally I don't see the point in waiting for refreshed versions, since they will be too close to the Volta release. The timing is a little weird: 2016 is Pascal, 2017 is Pascal again but with GDDR5X for all cards, and then 2018 is Volta. For me it's either upgrade now or just wait two years until Volta is out. Just my 2 cents.


----------



## MyNewRig

Quote:


> Originally Posted by *xg4m3*
> 
> True, but personally I don't see the point in waiting for refreshed versions, since they will be too close to the Volta release. The timing is a little weird: 2016 is Pascal, 2017 is Pascal again but with GDDR5X for all cards, and then 2018 is Volta. For me it's either upgrade now or just wait two years until Volta is out. Just my 2 cents.


I used to believe the same during the first two months of Pascal, when they were using the trouble-free Samsung GDDR5 on the 1070. But now, with Micron and all the issues it brought to the table, I just don't find the product as attractive as it used to be. The next few months will be all about memory upgrades: HBM2 on Vega and GDDR5X on Pascal. I think those will be much better products, products you can keep for a couple of years without worrying too much about them.

Your logic can still be valid: get the Pascal refresh next year and then the Volta refresh two years after. Aside from memory, Pascal has plenty of issues that Maxwell did not; it is just not mature yet in terms of drivers and BIOS, and the refresh will probably be much more solid tech.


----------



## reflex75

Quote:


> Originally Posted by *9colai*
> 
> Okay, could you mention a VRAM-sensitive game for me? I would like to test it just out of curiosity


You can check this video about the gains from memory overclocking:


----------



## BroPhilip

Quote:


> Originally Posted by *MyNewRig*
> 
> How are you getting 4.65GHz on your 6600K? Mine OCs very easily to 4.6GHz and runs below 60C under stress on air, but once I go to 4.7GHz it starts getting a bit unstable, so 4.65GHz is probably my optimal range. I am just not sure how to get there with multiplier OC; I can only dial in a 46 or 47 multiplier, not 46.5 for example, so how are you getting that frequency?


Mine runs smooth as butter at 4.7 with no gaming problems or stability issues, and it's just air-cooled with an EVO 212. ASUS auto-tune overclocked it without my having to touch a single thing. Max temp after hours of SW Battlefront is 67C peak, but normally 57-64C.


----------



## MyNewRig

Quote:


> Originally Posted by *BroPhilip*
> 
> Mine runs smooth as butter at 4.7 with no gaming problems or stability issues, and it's just air-cooled with an EVO 212. ASUS auto-tune overclocked it without my having to touch a single thing. Max temp after hours of SW Battlefront is 67C peak, but normally 57-64C.


For my sample, the voltage required to stabilize 4.7GHz (about 1.4V) cannot be kept cool on air; it exceeds 80C in Prime95. So I left it at 4.6GHz, which runs between 1.312V and 1.360V, the max I am comfortable with on air. It reaches about 76C in Prime95 and around 54C in a typical gaming session, so I did not find the need to squeeze out that extra 100MHz.


----------



## Star Forge

Quote:


> Originally Posted by *MyNewRig*
> 
> The first thing that drew my attention to the Micron card having something off about it, even before I knew that GPU-Z has a memory type detection engine, is that it felt laggy at 2K compared to the one with Samsung. That was the first symptom that caught my attention the minute I switched cards. I was not knowledgeable enough about the issue back then to actually measure frame time latency, which is why I would like you to do that test if possible; I feel that the timings of the Micron ICs are looser, and it manifests at higher resolutions that are VRAM dependent.
> 
> By "stable at +400", are we talking stability with voltage/power tricks, or stability at stock power management settings? If you are stable at +400 on stock power settings then you have indeed got an exceptional sample. I was testing this morning with the 3DMark Firestrike stress test: I cannot pass the test with anything higher than +200. If I go to +250 it does not crash or anything, but it stops the test in the middle for "frame time inconsistency", which probably indicates that the memory is producing errors and a correction mechanism is kicking in, affecting frame time consistency. Locking voltage or "Prefer Maximum Performance" did not help much.
> 
> I also figured out that +100 voltage was causing instability, so I disabled Afterburner's voltage monitoring and control and just put the sliders at 120% power and +60 core, which brings me to 2050MHz effective and throttles down to 2025-2012MHz shortly after load is applied. Increasing voltage does not raise the core OC limit; it just uses more power to get there, resulting in about 3C warmer temps at the same core frequency.
> 
> Your Samsung sample only being able to do 8600MHz is pretty rare, but it sure happens. Did your Samsung ever crash the system with a BSOD when you went higher, or was it just a soft crash like in my case?
> Reserving Samsung 8 Gb/s GDDR5 for the 1080 Ti is highly unlikely; the 1080 Ti is probably getting 10 Gb/s modules, and the Samsung chips previously used in the 1070 cannot do that. I don't think any current GDDR5 tech can do 10 Gb/s aside from the X modules Micron has developed.


Mine is set right now with no voltage changes but the power limit at 122% using the Slave BIOS. On the Master BIOS, with its 112% target, it would hold just fine, but once in a blue moon it would BSOD. At 122%, however, the Micron seems to hold well. My Samsung at +300 under normal conditions (no voltage tweaks, PT not over 112%) would crash Heaven halfway through the 2nd round. So YMMV.


----------



## TheGlow

Quote:


> Originally Posted by *BroPhilip*
> 
> Mine runs smooth as butter at 4.7 with no gaming problems or stability issues, and it's just air-cooled with an EVO 212. ASUS auto-tune overclocked it without my having to touch a single thing. Max temp after hours of SW Battlefront is 67C peak, but normally 57-64C.


Nice to hear. I have a 6600K with an EVO 212 and have it at 4.4GHz. I wasn't sure how high I should expect to take it.


----------



## MyNewRig

Quote:


> Originally Posted by *Star Forge*
> 
> Mine has been set right now with no voltage changes but the Power Limit set to 122% using Slave BIOS. On the Master BIOS with the target of 112%, it would hold just fine but once in a blue moon it would BSOD. However at 122%, the Micron seems to hold well. On my Samsung at +300 with normal conditions (no voltage or PT over 112%), it would crash Heaven halfway after the 2nd round. So YMMV.


Since you are an EVGA guy, I have a question. I am now looking at the FTW or the FTW Hybrid; both are priced close enough. But looking at EVGA's forums I see a ton of people having all kinds of funny issues with the FTW: screen signal cutting off with fans at 100%, artifacts on the desktop, coil whine, LEDs not working or unable to change colors, very disappointing OC levels or people unable to OC at all... and a bunch of other issues I can't even begin to remember.

How are your FTWs doing? The FTW looks great on paper, but the number of threads with issues in the forums is scary. What are your views on this?


----------



## tps3443

Quote:


> Originally Posted by *Waleh*
> 
> Ladies and Gentlemen, I have ordered the MSI 1070 FE for my ITX rig


The GTX 1070 is the best card for the money. Check GPU-Z when it arrives to see whether you got Samsung memory.


----------



## TheGlow

I realized my benchmark pics were nearly 30 days old, so I decided to rerun them.
I'm assuming display driver versions can have an impact like this? I'm seeing a 140-150 point difference between Time Spy runs now versus last month, and lower.
Also, coincidentally, I was able to try +215 core without a crash, hitting 2200 on occasion. Artifact red flashes all over and other rips in the screen, so definitely not stable, but it didn't crash.
I had a run at +210/+850 as well without a crash.
However, again, everything scores about 140 less than when tested with the other settings.
I'll admit I still have some apps like Firefox open and my regular services in the tray, but I had those running last time as well.


----------



## gtbtk

Quote:


> Originally Posted by *Roland0101*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I am still experimenting.
> 
> one thing I have noticed is that if the GPU tries to pull more than about 210 - 220W as measured with HWiNFO64, it crashes the Graphics driver. That is my main limitation at the moment, not artifacts. Still scratching my head to work out how to better manage that. The card is advertised to draw 150W at stock but it is physically wired for 300W. I'm running an HX 850i PSU in single rail mode so the PSU is not limiting anything. The whole PC is only using about 350W under load so there is loads of headroom.
> 
> Any suggestions for tweaking bios voltage settings to give me more power headroom or software to monitor PCIe bus power loads are welcome
> 
> 
> 
> You use a different BIOS, right? Because your card's standard power limit is 230W. Furthermore, I doubt that your card, with its 1x6-pin and 1x8-pin power connectors, uses the PCIe slot power very much, and definitely not to its max.
> 
> The only way to measure that would be an oscilloscope.
> 
> For the BIOS voltage, I don't know if there are OC BIOS hacks available for the Gaming X. You can tweak your BIOS on your own, there are guides out there, but I don't know if anyone has done this successfully for Pascal. (Still new cards.)
> 
> I also think that this might be a driver problem. WDDM 2.1 is still brand new, and it will take time for Nvidia and MS to optimize it the way WDDM 2.0 was. You could test whether you get better results with 368.81.
Click to expand...

I have been playing with different vendor BIOSes on my MSI Gaming X. The one that seems most stable, other than the MSI BIOSes, is the ASUS Strix BIOS, but that card also has an 8+2 phase VRM. Currently I am running the base-model MSI Gaming 8G BIOS, and based on the limited time I have played with it, it seems an easier BIOS to overclock.

The most I have observed my card's power draw reach is about 220W. Anything above that and it will either lock up or crash.

The TechPowerUp GPU BIOS database claims the MSI Gaming has a base power target of 230W with a limit of 291W. While the BIOS may be programmed that way, at least on my card it does not get anywhere close to the base target. I did say voltage in my original post when I should probably have said power; voltage does not seem to be an issue with these cards, whereas power management does seem a bit sloppy, unless I am missing something.

As far as I am aware, no one other than Nvidia and the vendors themselves has successfully edited a Pascal BIOS file yet.

Driver stability at higher clocks has improved over time, so maybe the slight extra stability is coming from WDDM 2.1?
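Since people are comparing power numbers between tools, here is a rough sketch of pulling the driver's own reading instead of relying on HWiNFO64. It assumes `nvidia-smi` is on the PATH; the `--query-gpu` fields are real `nvidia-smi` options, but the helper names are made up for illustration:

```python
import subprocess

def parse_field(field: str) -> float:
    """Turn an nvidia-smi CSV field like '219.73 W' or '1923 MHz' into a number."""
    return float(field.strip().split()[0])

def sample_gpu(query: str = "power.draw,clocks.gr"):
    """Ask the driver for board power draw and graphics clock via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={query}", "--format=csv,noheader"],
        text=True,
    )
    # One line per GPU; assume a single card here.
    return [parse_field(f) for f in out.strip().split(",")]
```

Call `sample_gpu()` in a loop while a benchmark runs and you get the same board-power figure the BIOS power limiter acts on, which is handy for spotting whether you are actually near the 230W target or stuck around 220W as described above.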


----------



## gtbtk

Quote:


> Originally Posted by *9colai*
> 
> Quote:
> 
> 
> 
> Originally Posted by *reflex75*
> 
> Your test is right, but your conclusion is wrong.
> Every game behaves differently, and some games sclaes better with memory bandwidth increase.
> Moreover, it depends also on the settings you choose for each game.
> 
> 
> 
> Okay, could you mention a VRAM sensetive game for me? I would like to test it just of couriosity
Click to expand...

In my experience, Firestrike will run without artifacts and finish at higher memory clocks than Time Spy.

Did the Time Spy tests you ran have blinking red spots everywhere?

Quote:


> Originally Posted by *TheDeadCry*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roland0101*
> 
> You use a different BIOS, right? Because your card's standard power limit is 230W. Furthermore, I doubt that your card, with its 1x6-pin and 1x8-pin power connectors, uses the PCIe slot power very much, and definitely not to its max.
> The only way to measure that would be an oscilloscope.
> 
> For the BIOS voltage, I don't know if there are OC BIOS hacks available for the Gaming X. You can tweak your BIOS on your own, there are guides out there, but I don't know if anyone has done this successfully for Pascal. (Still new cards.)
> 
> I also think that this might be a driver problem. WDDM 2.1 is still brand new, and it will take time for Nvidia and MS to optimize it the way WDDM 2.0 was. You could test whether you get better results with 368.81.
> 
> 
> 
> We have yet to be able to modify Pascal BIOSes, as of the last time I checked. As you mentioned, there are several factors, software and hardware, that have yet to fully mature within the cards in the community. Whatever the case may be, for me at least, this card is the only one I have come across that hasn't come even close to the standard power limit. This is good, obviously. I LOVE this card's power efficiency compared to my old 780. However, there is definitely an incongruity between power and voltage. I've heard of people reaching the power limit, but some of us don't even come close to the standard unaltered limit. Traditionally I would take the low power consumption as a sign of a great bin. If that's the case, and our cards are limited by voltage regulation, we could see a dramatic difference with a BIOS editor or upcoming BIOSes from board partners.
Click to expand...

An EVGA BIOS running on an MSI card will report up to the power limit set by that BIOS; the MSI BIOS will not hit 100% at all. There is certainly a coding difference being seen here.


----------



## Roland0101

Quote:


> Originally Posted by *gtbtk*
> 
> I have been playing with different vendor bioses on my MSI Gaming X. The one that seems most stable, other than the MSI bioses, is the ASUS Strix bios but that also has an 8+2 phase VRM. Currently I am running the base model MSI Gaming 8G bios and based on the limited time I have played with it. It seems an easier bios to overclock.


It's the BIOS made for the card, and Afterburner is made for it, so that is not surprising. But the Strix has a 6+1 VRM; believe me, I counted.


Quote:


> The most I have observed my card's power draw reach is about 220W. Anything above that and it will either lock up or crash.
> The TechPowerUp GPU BIOS database claims the MSI Gaming has a base power target of 230W with a limit of 291W. While the BIOS may be programmed that way, at least on my card it does not get anywhere close to the base target. I did say voltage in my original post when I should probably have said power; voltage does not seem to be an issue with these cards, whereas power management does seem a bit sloppy, unless I am missing something.


My Strix has a power limit of 200W. I can reach that, but as I said, 372.90 doesn't like it. That's why I think it's more a driver issue than a power control issue. Still, brand-new WDDM, so it's not surprising that the driver is not optimized.
Quote:


> Driver stability at higher clocks has improved over time, so maybe the slight extra stability is coming from WDDM 2.1?


I am pretty sure of it; at the very least the new driver improved stability at high clocks overall (if you stay beneath the power limit). On the other hand it seems to reduce memory OC potential in DX12 a bit (about 50MHz effective, so not a big deal), at least on my card.


----------



## BroPhilip

Quote:


> Originally Posted by *TheGlow*
> 
> Nice to hear. I have a 6600k w/ evo 212 and have it on 4.4ghz. I wasnt sure how high I should expect to take it.


I love my ASUS Z170-A; the software will auto-benchmark and auto-tune the overclock, and it even auto-programs the fans based on their position in the case. This is what it sets it at each time. I don't remember what my voltage is, but I can check next time I boot it up. I even used the supplied thermal paste that came with the EVO... (shhhuush... don't tell anyone, because it's factory and has to stink lol)


----------



## Star Forge

Quote:


> Originally Posted by *MyNewRig*
> 
> Since you are an EVGA guy, I have a question. I am now looking at the FTW or the FTW Hybrid; both are priced close enough. But looking at EVGA's forums I see a ton of people having all kinds of funny issues with the FTW: screen signal cutting off with fans at 100%, artifacts on the desktop, coil whine, LEDs not working or unable to change colors, very disappointing OC levels or people unable to OC at all... and a bunch of other issues I can't even begin to remember.
> 
> How are your FTWs doing? The FTW looks great on paper, but the number of threads with issues in the forums is scary. What are your views on this?


From both cards so far:

1. No artifacts on desktop.
2. No screen signal turn off.
3. Samsung doesn't OC as well as my Micron (will retest as soon as I can since I need to return one of the cards back to EVGA for a defective cooler).
4. LED's do work for me, but there are some LED issues where one or two of the LED's are stuck bright green (hence the RMA from my end, replacement unit had a perfect cooler).
5. Both cards (Samsung and Micron) has coil buzzing, but not a total whine. That is probably the downer on the entire FTW line right now.

Just remember that the Hybrid edition will nearly 100% use the FTW PCB, just instead using the Hybrid cooler, not the ACX 3.0.


----------



## MyNewRig

Quote:


> Originally Posted by *Star Forge*
> 
> From both cards so far:
> 
> 1. No artifacts on the desktop.
> 2. No screen signal cutting off.
> 3. The Samsung doesn't OC as well as my Micron (I will retest as soon as I can, since I need to return one of the cards to EVGA for a defective cooler).
> 4. The LEDs do work for me, but there are some LED issues where one or two of them get stuck bright green (hence the RMA from my end; the replacement unit had a perfect cooler).
> 5. Both cards (Samsung and Micron) have coil buzzing, though not a full-on whine. That is probably the downer on the entire FTW line right now.
> 
> Just remember that the Hybrid edition will almost certainly use the FTW PCB, just with the Hybrid cooler instead of the ACX 3.0.


Thank you, very useful stuff. I will avoid it then, because one of the things I cannot tolerate with GPUs is coil buzzing; it annoys me a lot. My previous last-gen EVGA card also had this, and I had to purchase a very expensive Seasonic Platinum PSU to lessen it a bit, but it did not totally help, so I do not want to go through the same nightmare again.

Out of the four Strix cards I put into the system, only one has this, so it looks less common in the Strix than in the FTW.

So far it looks like the ASUS Strix OC or the FE are my only viable options, once we get that BIOS update and I test it:

- EVGA has coil buzzing, runs hotter, and has problematic LEDs.
- MSI has a ton of QC issues, as has always been the case with MSI, so I simply don't trust it.
- Gigabyte looks ugly as sin and has a flimsy plastic shroud which makes it look like a cheap toy.
- The Zotac Extreme is bigger than it needs to be and sags more than I would like, and there's also that issue with its fans that everyone is talking about.

But the question is: do the 2x 8-pin connectors actually help with anything on Pascal, since they seem like the one main difference between the FTW/MSI and the other cards? And can the dual BIOS even be put to use without any BIOS modding tools available?


----------



## xg4m3

So do you guys experience the artifacts at default OC clocks, or only if you try to overclock further? For example, the MSI Gaming X states it can run at 1797MHz in its OC mode. Do artifacts appear at those clocks, or only if one pushes further?

Today I have to decide whether to swap my 970 for a 1070 or just get 2 new 1080p screens, and I'm torn between those two options.


----------



## MyNewRig

Quote:


> Originally Posted by *xg4m3*
> 
> So do you guys experience the artifacts at default OC clocks, or only if you try to overclock further? For example, the MSI Gaming X states it can run at 1797MHz in its OC mode. Do artifacts appear at those clocks, or only if one pushes further?
> 
> Today I have to decide whether to swap my 970 for a 1070 or just get 2 new 1080p screens, and I'm torn between those two options.


If you already have a working GPU then now is a very bad time to decide on a 1070 purchase. I think you should wait until the dust settles on this Micron memory issue: either the BIOS fix proves to be actually helpful and fixes the stability issues most of us are having with Micron, or they switch back to the more stable Samsung GDDR5, which I suspect some are starting to do. I found a guy who purchased a GB G1 from Newegg a couple of days ago and it came with Samsung! So maybe they have changed their mind about using Micron.

Wait a few more weeks; by then we will have much more information on the extent of this issue, and it will be a better time to decide.

With Micron, some are having artifacts at stock clocks unless they use "Prefer Maximum Performance", some get them with certain drivers or certain games at stock, some get them with as little as +100 or +200 OC on the memory, and some rare people say they can push their Micron to +500 without any issues; I have seen only TWO people say that so far.

It is still unclear what is going on here and what the ultimate fix will be. Stick with your 970 for a few more weeks if you can.
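All this offset hunting is basically a step search. A minimal sketch, with the caveat that `is_stable` is a hypothetical stand-in for whatever stress test you actually run (Heaven, the Firestrike stress test, etc.) and the step sizes are only examples:

```python
def max_stable_offset(is_stable, start=0, stop=800, step=50):
    """Walk memory offsets upward and return the last one that passed.

    is_stable(offset) should run your stress test at that offset and
    return True only if no artifacts or crashes were seen.  We walk
    linearly and stop at the first failure rather than binary-searching,
    because of the reported "memory hole": a pass at a high offset does
    not prove the offsets below it are safe or faster.
    """
    best = None
    for offset in range(start, stop + 1, step):
        if not is_stable(offset):
            break
        best = offset
    return best
```

With a card that artifacts above +400, `max_stable_offset(my_test)` would stop at +450 and report +400; after that it is still worth benchmarking a step or two lower, since (as the reviews quoted earlier note) FPS can peak below the last "stable" offset.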


----------



## Star Forge

Quote:


> Originally Posted by *MyNewRig*
> 
> Thank you, very useful stuff. I will avoid it then, because one of the things I cannot tolerate with GPUs is coil buzzing; it annoys me a lot. My previous last-gen EVGA card also had this, and I had to purchase a very expensive Seasonic Platinum PSU to lessen it a bit, but it did not totally help, so I do not want to go through the same nightmare again.
> 
> Out of the four Strix cards I put into the system, only one has this, so it looks less common in the Strix than in the FTW.
> 
> So far it looks like the ASUS Strix OC or the FE are my only viable options, once we get that BIOS update and I test it:
> 
> - EVGA has coil buzzing, runs hotter, and has problematic LEDs.
> - MSI has a ton of QC issues, as has always been the case with MSI, so I simply don't trust it.
> - Gigabyte looks ugly as sin and has a flimsy plastic shroud which makes it look like a cheap toy.
> - The Zotac Extreme is bigger than it needs to be and sags more than I would like, and there's also that issue with its fans that everyone is talking about.
> 
> But the question is: do the 2x 8-pin connectors actually help with anything on Pascal, since they seem like the one main difference between the FTW/MSI and the other cards? And can the dual BIOS even be put to use without any BIOS modding tools available?


EVGA does have a better cooler, IMHO; I never hit over 62C, ever. Also, the cooler on my RMA replacement is perfect, even the LEDs, so it isn't a lost cause with FTW QC. And the coil whine only shows up when the card runs applications at over 75 FPS, from what I've noticed.

As for the power delivery, I imagine it helps, since on the FTWs I am getting power targets of 122% on the Slave BIOS and, without voltage tweaks, sustaining a decent, stable overclock on my Micron board; I just played three hours of GTA V without an issue. So to an extent it does, but Nvidia is still hindering the card's true potential. I think people are working on BIOS editors right now, and hopefully they'll be ready soon.


----------



## weskeh

Bought myself a Zotac AMP Extreme 1070 last week and it came with Samsung memory. Bought it at B&H NY.


----------



## MyNewRig

Quote:


> Originally Posted by *Star Forge*
> 
> EVGA does have a better cooler, IMHO; I never hit over 62C, ever. Also, the cooler on my RMA replacement is perfect, even the LEDs, so it isn't a lost cause with FTW QC. And the coil whine only shows up when the card runs applications at over 75 FPS, from what I've noticed.
> 
> As for the power delivery, I imagine it helps, since on the FTWs I am getting power targets of 122% on the Slave BIOS and, without voltage tweaks, sustaining a decent, stable overclock on my Micron board; I just played three hours of GTA V without an issue. So to an extent it does, but Nvidia is still hindering the card's true potential. I think people are working on BIOS editors right now, and hopefully they'll be ready soon.


EVGA does have better fans indeed; along with MSI they are the only cards with double ball bearing fans. The fans on the Strix are a simple sleeve bearing design, which starts to sound like a jet engine at around 70%. The downside of the double ball bearings on the ACX is that they sound a bit louder and exhibit a pitch change at different RPMs. But all the reviews out there show the FTW's ACX 3.0 running around 4-5C warmer than the Strix and MSI, so if your FTW is not exceeding 62C then maybe those reviews are misleading us like they always do.

75+ FPS is what I am getting on most titles at 2K; that sounds like a lot of buzzing to take in.

The power target on the 2nd OC BIOS is only 122%? I thought it was 125%, no? My Strix OC gets to 120% with one 8-pin connector, so what is the point of the 2x 8-pins on the FTW?

If the custom BIOS tools become available then the FTW will probably really outshine everything else, since it can handle whatever power you throw at it, and the dual BIOS makes experimenting safe.

So you think the FTW is a better choice than the Strix? I am drawn more to the Hybrid because it will come out of the factory about a week from now; it was produced with the Micron memory issue out in the open, so maybe they have done something to improve the situation.


----------



## Star Forge

Quote:


> Originally Posted by *MyNewRig*
> 
> EVGA does have better fans indeed; along with MSI they are the only cards with double ball bearing fans. The fans on the Strix are a simple sleeve bearing design, which starts to sound like a jet engine at around 70%. The downside of the double ball bearings on the ACX is that they sound a bit louder and exhibit a pitch change at different RPMs. But all the reviews out there show the FTW's ACX 3.0 running around 4-5C warmer than the Strix and MSI, so if your FTW is not exceeding 62C then maybe those reviews are misleading us like they always do.
> 
> 75+ FPS is what I am getting on most titles at 2K; that sounds like a lot of buzzing to take in.
> 
> The power target on the 2nd OC BIOS is only 122%? I thought it was 125%, no? My Strix OC gets to 120% with one 8-pin connector, so what is the point of the 2x 8-pins on the FTW?
> 
> If the custom BIOS tools become available then the FTW will probably really outshine everything else, since it can handle whatever power you throw at it, and the dual BIOS makes experimenting safe.
> 
> So you think the FTW is a better choice than the Strix? I am drawn more to the Hybrid because it will come out of the factory about a week from now; it was produced with the Micron memory issue out in the open, so maybe they have done something to improve the situation.


Personally, I will be gobsmacked if the Hybrid has differences from the FTW. The 1080 Hybrid is basically a 1080 FTW with a Hybrid setup, and the 1070 Hybrid should be no different. Even with the VRAM issue aside, you are still getting the same FTW board with the same-ish BIOS and everything; I don't really see a difference, short of EVGA giving the 1080 Hybrid exclusivity before the 1070. Also, the reason my card runs around 62-64C maximum is that I run an aggressive fan curve. Reviewers tend not to use the pre-set aggressive fan curve, favoring the default one, which puts silence first and does not spin the fans up until the card hits the mid-50s. The FTW therefore shows higher temps in reviews because the fans spin up late and never exceed 60% fan speed. The aggressive curve has the fans spinning at 20% by 40-50C and around 70% by the mid-60s, and no, I don't hear any RPM variations that make this card annoying to use. In fact, this is the quietest air-cooled card I've ever had, surpassing the ACX 2.0 on a 970 I briefly had, and it beats any blower.

I will tell you this: if the 1070 Hybrid has improvements over the FTW, I will personally sell my 1070 FTW to get the Hybrid, or I will step up to a 1080 SC Gaming. However, I doubt the 1070 Hybrid will have any improvements beyond its cooler.
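An aggressive curve like that is just linear interpolation between a couple of temperature/fan-speed points. A quick sketch; the breakpoints below only approximate the numbers in the post above and are not EVGA's or Precision's actual presets:

```python
# (temperature C, fan %) breakpoints, roughly matching the post:
# ~20% around 40C, ~70% by the mid-60s, flat outside that range.
CURVE = [(40, 20), (65, 70)]

def fan_percent(temp_c, curve=CURVE):
    """Linearly interpolate the fan speed for a temperature along the curve."""
    if temp_c <= curve[0][0]:          # below the first point: floor speed
        return curve[0][1]
    if temp_c >= curve[-1][0]:         # above the last point: cap speed
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
```

The design point is simply that a steeper segment between 40C and 65C trades noise for the 4-5C delta the reviews measured on the default curve.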


----------



## MyNewRig

Quote:


> Originally Posted by *Star Forge*
> 
> I will tell you this: if the 1070 Hybrid has improvements over the FTW, I will personally sell my 1070 FTW to get the Hybrid or I am stepping-up to a 1080 SC Gaming. However, I doubt the 1070 Hybrid would have any improvements with the exception of its cooler.


I've been looking for reviews since yesterday, but none exist yet, apart from some guy on PCPer claiming that his runs 2200MHz core out of the box! I asked him if his came with Samsung or Micron memory; no response yet.

Check the comments here https://www.pcper.com/news/Graphics-Cards/EVGA-Adds-Water-Cooled-GTX-1070-FTW-Hybrid-Lineup
Quote:


> _"My EVGA 1070 Hybrid arrived yesterday. 2200 core out of box. Scores VERY close to my stock nVidia 1080 Founders Edition for $200 less. 5100 Firestrike Ultra and 6800 Timespy graphics scores. Runs around 37c while benchmarking and 44c after hours of gameplay. It's currently residing in a Fractal Define S."_


----------



## kevindd992002

When will this new BIOS come out?

What is the most high-end air-cooled EVGA GTX 1070 available?


----------



## MyNewRig

Quote:


> Originally Posted by *kevindd992002*
> 
> When will this new BIOS come out?


No idea, ask Freakvidia https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/post/4986942/#4986942

Quote:


> What is the most high-end air-cooled EVGA GTX 1070 available?


MSI Gaming Z

EVGA FTW

Gigabyte Xtreme

Zotac AMP! Extreme


----------



## reflex75

Quote:


> Originally Posted by *MyNewRig*
> 
> some guy in pcper claiming that his runs 2200Mhz core out of the box!


That doesn't make sense.
And to score 6800 in Time Spy, 2100MHz is already enough.
(The best scores are above 7k.)


----------



## kevindd992002

Quote:


> Originally Posted by *MyNewRig*
> 
> No idea, ask Freakvidia https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/post/4986942/#4986942
> MSI Gaming Z
> 
> EVGA FTW
> 
> Gigabyte Xtreme
> 
> Zotac AMP! Extreme


Thanks. I will actually be buying the Zotac AMP! Extreme for my personal rig, and I figured I would buy another EVGA card or two (since EVGA has a global warranty and I live in the Philippines) to sell for a profit.

Do you have any idea whether the Zotac includes the GOW4 game when bought from Jet.com? If I buy the Zotac here, I can get the code here: https://www.zotac.com/ph/page/redeem-gears-of-war-4, but that's only for Asian countries. The only reason I want to buy from the US is that I can get it for less.


----------



## madmeatballs

Quote:


> Originally Posted by *kevindd992002*
> 
> figured that I will buy another EVGA card or two (as it has global warranty and I live in the Philippines) that I will be selling for profit


You are buying the EVGA from the US? If I recall correctly, PCHub carries EVGA cards locally; I wonder if they will honor the warranty for cards bought outside the country, though.


----------



## kevindd992002

Quote:


> Originally Posted by *madmeatballs*
> 
> You are buying the EVGA from the US? If I can recall PCHub is carrying EVGA cards locally, I wonder if they will accept warranty for cards bought outside of our country though.


Yes, and yes again: EVGA has confirmed that they have a global warranty. It won't be PCHub that honors the warranty, but I'm sure there's an official contact point here that will. Let me confirm with them once more and I'll report back.


----------



## weskeh

Quote:


> Originally Posted by *kevindd992002*
> 
> Thanks. I actually will be buying the Zotac AMP! Extreme for my personal rig and figured that I will buy another EVGA card or two (as it has global warranty and I live in the Philippines) that I will be selling for profit, so.
> 
> Do you have any ideas if the Zotac includes the GOW4 game when bought from Jet.com? If I buy the Zotac here, you can get the code here: https://www.zotac.com/ph/page/redeem-gears-of-war-4 but you see it's only for Asian countries. The only reason that I want to buy from the US is because I can get it for less.


Yes, they give a GOW4 key code with purchase in the US. I bought mine there and will get the key code via e-mail. That link you posted is for Asia only. I'm from Europe, btw.


----------



## TheGlow

Quote:


> Originally Posted by *reflex75*
> 
> That doesn't make sense.
> And to score 6800 in Timespy, 2100Mhz is already enough.
> (bests are above 7k)


6800 for graphics? I wasn't able to hit 6800 until about 2176MHz, and that was last month. Rerunning my tests last night I couldn't even hit 6700, and that was at 2200MHz.

However, I think there may have been a glitch. Afterwards I was playing Overwatch and experiencing random stuttering that I didn't have the day before. So I don't think it's driver-related, since I played on the latest driver before without incident.
I'll run some more quick tests tonight.

Quote:


> Originally Posted by *weskeh*
> 
> Yes, they give a GOW4 key code with purchase in the US. I bought mine there and will get the key code via e-mail. That link you posted is for Asia only. I'm from Europe, btw.


Sucks. I see Newegg offering the code now, but of course when I bought mine there was no promo. I still have an unused Splinter Cell code from my 660. It's always either no game or something I have no interest in.


----------



## kevindd992002

Quote:


> Originally Posted by *weskeh*
> 
> Yes, they give a GOW4 key code with purchase in the US. I bought mine there and will get the key code via e-mail. That link you posted is for Asia only. I'm from Europe, btw.


You mean you bought yours from jet.com, right? I'm reading on reddit that some people are having problems getting the code from them. Since most of the cards on jet.com really come from Newegg, jet.com directs its customers to Newegg, and Newegg directs them back to jet.com because the purchase was made there. So I'm kind of confused here.


----------



## weskeh

Nah, I bought it in-store in the US; I went on holiday and brought one back with me. B&H is where I bought it.

The salesperson told me they had a depot in Europe, so maybe contact them and ask where they ship from if you're buying online...


----------



## madmeatballs

Quote:


> Originally Posted by *kevindd992002*
> 
> Yes and yes again, EVGA has confirmed that they have global warranty. It won't be PCHub that will honor the warranty but I'm sure there's an official contact point here that would do so. Let me confirm with them once more and I'll report back.


Nice! But you know how it is here: whoever EVGA's distributor is might go crazy and not honor global warranties. I also found out that we have a 5-year warranty on Zotac cards, while the US market only has a 3-year warranty now. Zotac told me they have different warranty policies per region.

Edit: Too bad I can't avail of the GoW4 promo from Zotac, since I purchased my card before September. lol


----------



## Roland0101

Quote:


> Originally Posted by *xg4m3*
> 
> So you guys experience the artifacts on default OC clocks or only if you try to overclock it further? Like MSI X states it can run at 1797 MHz in it's OC clock. Do artifacts appear at those clocks or only if one tries to push it further.
> 
> Today i have to decide if will switch 970 with 1070 or just get 2 new 1080p screens and i'm torn between those two options.


A card that has problems at its advertised clocks is a defective card. Some people have problems, but those are mostly MSI cards.

Here is a comparison between my old Asus Strix GTX 970 OC, overclocked even further, and my Asus Strix GTX 1070 with a +50 offset on the core clock and no memory overclocking: http://www.3dmark.com/compare/fs/10247130/fs/8588136

As you can see, the result is pretty amazing for just one generation between the cards.

Is there a risk that you get a card with issues?
Yes, but that risk is always there. The chances that you get a card with problems at stock are pretty slim, and if that happens, you can RMA the card.


----------



## TheGlow

Quote:


> Originally Posted by *BroPhilip*
> 
> I love my ASUS z170-a the software will auto benchmark and auto tune the overclock even auto prams the fans based on position in the case. This is what it sets it at each time. I don't remember what my voltage is but I can check next time I boot it up. I even used the supplied thermal past that came with the evo.... (shhhuush...don't tell anyone because its factory and has to stink lol)


I should look into that. I have the Asus Z170 Pro Gaming, so I assume one of those apps it tries to push at me can do that. I'm just wary of touching the voltage, as I suspect it shortens the chip's lifespan.


----------



## jamor

Quote:


> Originally Posted by *madmeatballs*
> 
> Nice!, but you know how it is here, whoever EVGA's distro is here might crazy and not honor global warranties. I also found out that we have 5 year warranty for Zotac cards while the US market only has a 3 year warranty now. Zotac told me they have different warranty policies per region.
> 
> Edit: Too bad I can't avail GoW4 from Zotac since I purchased my card before september. lol


The US market gets 5 years too; the only catch is that you have to register your product within 30 days.


----------



## MyNewRig

Quote:


> Originally Posted by *reflex75*
> 
> That doesn't make sense.
> And to score 6800 in Timespy, 2100Mhz is already enough.
> (bests are above 7k)


What really doesn't make sense is that a card configured with a 1797MHz boost clock would automatically boost to 2200MHz "out of the box" without him touching anything, even running at 40-50C. GPU Boost 3.0 never boosts that far without an offset and an increased power target, unless this card has the most cherry-picked Pascal chip in the world!

But I am just wondering what would make someone lie about something like that? I don't see the incentive!
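A quick back-of-the-envelope check shows why the claim looks off. The only inputs are numbers quoted in this thread (the advertised boost clock, a typical real-world stock boost, the claimed clock); this is plain arithmetic, not a model of how GPU Boost 3.0 actually picks clocks:

```python
# Sanity check of the "2200MHz out of the box" claim.
# GPU Boost 3.0 does boost past the advertised clock on its own, but
# typically by one or two hundred MHz, not four hundred.

advertised_boost = 1797   # MHz, the FTW/Hybrid spec-sheet boost clock
observed_stock = 1989     # MHz, a typical stock boost reported in this thread
claimed = 2200            # MHz, the PCPer comment

print(claimed - advertised_boost)  # 403 MHz above the advertised boost
print(claimed - observed_stock)    # 211 MHz: roughly a +215 offset's worth
```

In other words, hitting 2200MHz usually takes a manual offset of around +200, which is why an untouched card sitting there is hard to believe.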


----------



## MyNewRig

Quote:


> Originally Posted by *TheGlow*
> 
> I should look into that. I have the Asus Z170 pro gaming so I assume one of those apps it tries to push at me can do that. I'm just wary of touching the voltage as I attribute that to affecting the lifespan.


That would be AI Suite 3, but I don't recommend ever installing it; it has issues that conflict with Nvidia drivers, and it is bloated, unneeded software. Just use AI Tweaker in the BIOS, or EZ Tune.

Even without touching the voltage, when you set a manual CPU multiplier the BIOS increases the voltage above stock automatically anyway, as long as the CPU over-voltage jumper is in the ON position on the Pro Gaming.

Why do you care about lifespan? If it reduces the chip's lifespan from 15 years to 10, why would you even care? Do you plan to keep it running that long? Anything below 1.4V on air is fine, as long as you can cool it properly so it stays in the 50-70C range most of the time.


----------



## MyNewRig

Quote:


> Originally Posted by *jamor*
> 
> US Market is 5 years too the only catch is you have to register your product within 30 days.


The problem with Zotac's extra 3 years of warranty upon registration is that those 3 years are only valid for the original purchaser. If you don't register, the first 2 years are valid for whoever has the card in hand.

If you intend to sell your card after a year or two, I think Zotac's 2-year transferable warranty is bad, because it will have run out by the time you try to sell, reducing the resale value significantly. The standard 3-year transferable warranty of other brands is much better if you don't actually intend to keep the card for 5 years.

A 5-year extended warranty can also be bought from EVGA upon registration for about $20, but it has the same problem: the first 3 years are transferable and the other 2 are only valid for the original owner. And who keeps a GPU for 5 years anyway? Most people will be tempted to upgrade much earlier than that.
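To put numbers on that resale argument, here is a small sketch of how much transferable warranty is left at sale time, for Zotac's 2 transferable years versus the 3 years typical of other brands. The dates are purely illustrative, and the warranty terms are paraphrased from this discussion, not from policy text:

```python
from datetime import date

def transferable_left(purchase: date, sale: date, transferable_years: int) -> float:
    """Years of transferable warranty remaining when the card changes hands."""
    elapsed_years = (sale - purchase).days / 365.25
    return max(0.0, transferable_years - elapsed_years)

bought = date(2016, 10, 1)
sold = date(2018, 10, 15)  # upgrading after roughly two years

print(transferable_left(bought, sold, 2))  # 0.0 -- Zotac: nothing left for the buyer
print(transferable_left(bought, sold, 3))  # ~0.96 -- other brands: about a year left
```

That remaining year is exactly the selling point MyNewRig is describing: the buyer still gets covered, which props up the price.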


----------



## TheGlow

Quote:


> Originally Posted by *MyNewRig*
> 
> What doesn't really make sense is a card configured with 1797Mhz boost clock, would automatically boost to 2200Mhz "out of the box" without him touching anything even if it is running at 40-50c , GPU boost 3 never boosts by that much without an offset, and increased power target unless this card has the most cherry picked Pascal chip in the world!
> 
> But i am just wondering what would make someone lie about something like that? i don't see the incentive!


Yeah, I was able to hit 2200MHz on the desktop last night, and I still had to set the offset to +215. I think I'd get to around 1989 or so at stock boost.
Quote:


> Originally Posted by *MyNewRig*
> 
> That would be the AI Suite 3 but i don't recommend that you ever install that, it has issues that conflict with Nvidia drivers and it is such a bloated unneeded software, just use the AI Tweaker in the BIOS or EZ Tune.
> 
> Even without touching the voltage, when you set a manual CPU multiplier the BIOS increase the voltage above stock automatically anyways if you have the CPU over-voltage jumper on the ON position in the Pro Gaming
> 
> Why do you care for lifespan? if it will reduce the chip lifespan from 15 years to 10 years why would you even care? do you plan to keep it running for that long? anything below 1.4v on air is fine, as long as you can properly cool it so that it is in the 50c-70c at most times


That's what I thought. I remember AI Suite leaving some residue behind that took a while to nuke.
I got 5 years out of my 2500K so far, then gave it to my daughter. A friend gave me his 2600K, but I haven't bothered to swap it in for her yet.
I doubt I would wait 15 vs. 10 years for an upgrade, but the 2500K lasted so long that I wonder how long I'll ride the 6600K. And at this rate the 2500K still works perfectly fine. If anything, I can't find another motherboard for it to build a third PC, since I have that spare 2600K.


----------



## kevindd992002

Quote:


> Originally Posted by *madmeatballs*
> 
> Nice!, but you know how it is here, whoever EVGA's distro is here might crazy and not honor global warranties. I also found out that we have 5 year warranty for Zotac cards while the US market only has a 3 year warranty now. Zotac told me they have different warranty policies per region.
> 
> Edit: Too bad I can't avail GoW4 from Zotac since I purchased my card before september. lol


We'll see what they say, but I'm not getting my hopes up either. And they're correct; it's also 5 years for US Zotac customers.
Quote:


> Originally Posted by *MyNewRig*
> 
> The problem with Zotac and getting extra 3 years of warranty upon registration is that these 3 years are only valid for the original purchaser, if you don't register, the first 2 years are valid to whoever have the card in hand.
> 
> If you intend to sell your card after a year or two i think the 2 years transferable warranty of Zotac is bad because it will run out by the time you try to sell it reducing the value significantly, the standard 3 years transferable warranty of other brands is much better if you don't actually intend keeping the card for 5 years.
> 
> 5 years Extended warranty can also be bought with EVGA upon registration for some $20 bucks, but same problem, first 3 years are transferable and the other 2 only valid for original owner, and who keeps a GPU for 5 years anyways? most people will be tempted to upgrade much earlier than that.


I'm confused. I thought Zotac was also 3 years of original warranty plus 2 years extended upon registration? So which part is transferable and which is not? If the extended warranty isn't, can't the 2nd buyer return the card to you (the original purchaser) and have you go through the RMA process yourself, assuming someone bought it from you locally?


----------



## madmeatballs

Quote:


> Originally Posted by *kevindd992002*
> 
> We'll see what they say but I'm not getting my hopes high too. And they're correct, it's also 5 years for US Zotac customers.
> I'm confused, I thought it was also 3 years original warranty and 2 years extended upon registration for Zotac? So which is transferrable and which is not? If the extended warranty isn't, can't the 2nd buyer return it to you (the original purchaser) and you go through the RMA process yourself assuming someone bought it from you locally?


Hmm, I have experienced this already with Zotac locally. I RMA'd a third-hand GTX 980 for fan issues (I wasn't the first owner, I was the third). They told me the warranty for the card was still valid until 2019 and that they only needed the official receipt for the card. Then I sold it.


----------



## MyNewRig

Quote:


> Originally Posted by *kevindd992002*
> 
> I'm confused, I thought it was also 3 years original warranty and 2 years extended upon registration for Zotac? So which is transferrable and which is not? If the extended warranty isn't, can't the 2nd buyer return it to you (the original purchaser) and you go through the RMA process yourself assuming someone bought it from you locally?


Zotac only has a 2-year standard warranty without registration (almost all other manufacturers have 3). Those first two years are transferable and can be used with the original purchase receipt, so within that period, if the card is sold, say, 4 times and every seller passes the purchase receipt along, the buyer can still use the warranty.

To get 5 years with Zotac you need to register with your personal info. Registration grants a 3-year extended warranty for a total of 5 years, but those 3 extra years can only be used by the original registrant, i.e. the original owner.

Of course the 2nd, 3rd, or 4th buyer can still use the warranty with the assistance of the original purchaser, but how would you feel about being contacted by someone 2 years after selling a card because they want to use the warranty? And then maybe having to handle the logistics?

You could also hand over the account in which you registered the product so the buyer can change the address and details and use the extended warranty on their own; I am not sure how it actually works in practice. What I am sure of is that Zotac comes with only a 2-year standard warranty that follows the product itself, not the owner. This is why I do not buy Zotac: everyone else gives a 3-year transferable warranty, and I tend to upgrade my GPU every two years, so when it is time to sell, I can sell the card with one more year of warranty left on it.

Hope that helps.


----------



## criminal

Quote:


> Originally Posted by *khanmein*
> 
> i think they reserved samsung vram chip for upcoming GTX 1080Ti in case of shortage. that's great news for star forge if micron chip on his card is pretty solid but i don't wanna risk this kind hard-earned money. now i'm seeking for used GTX 1070.


This may have already been addressed in one of the 85 posts I haven't read, but the 1080 Ti would use GDDR5X, which Samsung does not make. So the 1080 Ti has nothing to do with 1070s not getting Samsung memory.


----------



## mrtbahgs

I looked into Zotac a month or so back, and while I recall reading somewhere that the warranty could be 5 years if registered, the policy on their actual website made no mention of a 5-year warranty for US customers; it states up to 3 years total.

I immediately took them off my list of potential cards once I read (according to their website) that it was only 2+1 years. If they really are still offering a 5-year warranty, then they not only lost a sale to me but likely many others as well; it's bad business not to keep your information up to date, if that's true.
Quote:


> Graphics Cards
> Standard Warranty: 2 years
> Extended Warranty: +1 year
> Total Warranty: 3 years total


https://www.zotac.com/us/page/product-warranty-policy


----------



## RyanRazer

Quote:


> Originally Posted by *mrtbahgs*
> 
> I looked into Zotac a month or so back and while I recall reading somewhere that the warranty could be 5 years if registered, when I looked at the policy on their actual website it made no mention of the 5 year warranty for US customers, it states up to 3 total.
> 
> I immediately took them off my list of potential cards to buy once (according to their website) I read that it was only 2+1 years. If they really are still offering a 5 year warranty then they not only lost a sale to me, but likely many others as well, bad business to not keep your information up to date if it is true.
> https://www.zotac.com/us/page/product-warranty-policy


Quote:


> Terms and Conditions
> 
> The warranty terms vary by region. Please ensure you are viewing your regional ZOTAC web page for proper warranty details.
> Please note: ZOTAC Europe reserves the right to change the terms and conditions without advance notice. Please check back regularly for updates.
> 
> Overview to the ZOTAC Product type
> 
> Graphics Cards
> · Standard Warranty: 2 years
> · Extended Warranty: +3years
> · Total Warranty: 5 years total


Ouch


----------



## ZakZakXxX

GTX 1070 AMP flashed with the GTX 1070 AMP Extreme OC BIOS.


----------



## Lavajuice

Quote:


> Originally Posted by *khanmein*
> 
> great news for u but i personally won't get a giga. look at jayztwocents giving away his review sample GTX 1080 g1 gaming.
> 
> usually better products he will give away for his friends & family.
> i think they reserved samsung vram chip for upcoming GTX 1080Ti in case of shortage. that's great news for star forge if micron chip on his card is pretty solid but i don't wanna risk this kind hard-earned money. now i'm seeking for used GTX 1070.


Yeah, I was initially thrilled to get a Samsung card, but upon running it I found an extremely loud whirring noise that blasts through my entire apartment if I keep my door open. It turns out that even spinning the middle fan with my fingers while the card is unpowered produces significant clicking and clacking, whereas the other two fans are as silent as can be. I game with a headset anyway, which makes the sound barely noticeable, so I'm left in a predicament. Newegg unfortunately doesn't seem to refurbish items on RMA, they just replace them, so I'm worried I'll end up with a problematic Micron card that is silent but performs worse. I haven't had time to test OC potential yet, which I'm planning to do when I get the chance, but what do you guys think I should do?


----------



## TheGlow

Quote:


> Originally Posted by *Lavajuice*
> 
> Yeah, while I was initially thrilled to get a Samsung but upon running the card it had an extremely loud wurring noise that blasts through my entire apartment if I keep my door open. Turns out even spinning the middle fan with my fingers while it's not connected produces significant clicking and clacking, whereas the other two fans are silent as can be. I game with a headset anyway which makes the sound barely noticeable so I'm left in a predicament. Newegg unfortunately doesn't seem to refurbish items with RMA, they just replace them, so I'm worried I'll end up with a problematic Micron card which is silent but performs worse. Haven't had time to test OC potential yet which I'm planning on doing when I get the chance, but what do you guys think I should do?


I'd do an RMA for a new card. If you just got it, then hopefully that stock still has Samsung memory.
Otherwise, I have a Micron card and it performs wonderfully. And there's supposed to be a new BIOS coming that should help.


----------



## MyNewRig

Quote:


> Originally Posted by *mrtbahgs*
> 
> I looked into Zotac a month or so back and while I recall reading somewhere that the warranty could be 5 years if registered, when I looked at the policy on their actual website it made no mention of the 5 year warranty for US customers, it states up to 3 total.
> 
> I immediately took them off my list of potential cards to buy once (according to their website) I read that it was only 2+1 years. If they really are still offering a 5 year warranty then they not only lost a sale to me, but likely many others as well, bad business to not keep your information up to date if it is true.
> https://www.zotac.com/us/page/product-warranty-policy


It's region-dependent; that is why you don't see it in the US terms while some other regions do. It is not an omission on their part.

For us in the EU it is 2 + 3. It sucks that it is only 2 + 1 in the US, but regardless, it should be 3 years transferable, with anything that applies only to the original purchaser on top of those 3 years. Giving only two transferable years is bad. Zotac is the same company as Sapphire, and they both give only two transferable years; this is why I never buy GPUs from these companies, since I usually need at least 3 years transferable.


----------



## mrtbahgs

Quote:


> Originally Posted by *RyanRazer*
> 
> Terms and Conditions
> 
> The warranty terms vary by region. Please ensure you are viewing your regional ZOTAC web page for proper warranty details.
> Please note: ZOTAC Europe reserves the right to change the terms and conditions without advance notice. Please check back regularly for updates.
> 
> Overview to the ZOTAC Product type
> 
> Graphics Cards
> · Standard Warranty: 2 years
> · Extended Warranty: +3years
> · Total Warranty: 5 years total


Quote:


> Originally Posted by *MyNewRig*
> 
> Region dependent, this is why you don't see it in the term while some others do, it is not an omission on their part,
> 
> For us in the EU it is 2 + 3 , it suck that it is only 2 + 1 in the US, but regardless, it should be 3 transferable and anything else that only applies to original purchaser would be on top of the 3 years, but giving only two years transferable sucks, Zotac is the same company as Sapphire and they both only give two years transferable, this is why i never buy GPUs from these companies since i usually need 3 years transferable at least.


Yeah, I can understand it being different for the EU, and that's fine, but someone here said the US is 5 years as well, while their site's US warranty info doesn't agree.
If they are indeed offering 5 years for the US, they are not doing themselves any favors by listing 2+1 on their US website.


----------



## matti2

OC or not, I'm hitting "Voltage limit 1" on the MSI Afterburner graphs.
Is that normal?
MSI Gaming X


----------



## Roland0101

Quote:


> Originally Posted by *matti2*
> 
> OC or not, im hitting "Voltage limit 1" on MSI afterburner graphs.
> Is that normal?
> MSI gaming x


Yes. It just means that it's not possible to add more voltage to the card, due to the voltage limits set in the VBIOS.


----------



## kevindd992002

Quote:


> Originally Posted by *madmeatballs*
> 
> Hmm, I had experienced this already with Zotac locally. I RMA'd a third hand GTX 980 for fan issues (I wasn't the first owner, I was the third). They told me the warranty for the card was still valid until 2019 and that they only need the official receipt of the card. Then sold it.


I see. And 2019 is the end of that 5-year warranty period, correct?
Quote:


> Originally Posted by *MyNewRig*
> 
> Zotac only has 2 years standard warranty without registration or anything (almost all other manufacturers have 3) that first two years is transferable and can be used with the original purchase receipt so within these first two years, if the card is sold say 4 times and every time the seller provides the purchase receipt to the buyer, the buyer can still use the warranty,
> 
> To get 5 years with Zotac you need to register with your personal info, this registration gives you a 3 years extended warranty for a total of 5 years, but these 3 extra year can only be used by the original registrant, or the original owner.
> 
> Of course the 2nd, 3rd or 4th buyer can still use the warranty with the assistance of the original purchaser, but how would it be like to be contacted by someone 2 years after selling a card because they want to use the warranty? and then maybe you would have to handle the logistics?
> 
> Of course you might also provide the account in which you registered the product to the buyer so they can go in, change the address and some information and then use the extended warranty on their own, i am not sure how it actually works in practice, but what i am sure of is that Zotac comes only with 2 years standard warranty that follows the product itself not the owner, and this is why i do not buy Zotac because everyone else is giving 3 years transferable warranty and i tend to upgrade GPU every two years so when it is time to sell it, i can sell it with one more year of warranty left on it.
> 
> Hope that helps


Yes, that explains a lot. And you have a point about the buyer returning the card to the original purchaser for RMA purposes.

Now I want to confirm whether the US really offers the 5-year warranty or just 3. I swear their chat support told me it was also 5 when I asked a few weeks back. I'll confirm with them again.

EDIT: I just got off Zotac's chat support, and they confirmed that for the USA and Canada it really is just 2 + 1. That's a bummer! I'll have to reconsider buying the Zotac from jet.com.


----------



## Star Forge

So apparently there are two versions of the Hybrid from EVGA: the Gaming Hybrid, which has an FTW PCB mated to the Hybrid cooler, and a normal Hybrid version that is just an FE PCB mated to a Hybrid cooler. If you have an FE-style PCB card, you can also upgrade it yourself by paying another $100 for the Hybrid upgrade kit.


----------



## BroPhilip

https://www.techpowerup.com/reviews/MSI/GTX_1070_Gaming_Z/29.html

So TechPowerUp reviewed the MSI Gaming Z, and guess what: it's Samsung memory... this just makes me angry. Oh, and it's dated the 20th of September, which is long after I bought mine with Micron memory...


----------



## kevindd992002

Quote:


> Originally Posted by *weskeh*
> 
> nah, i bought it in store in the u.s, whent on holiday and brought one with me. B&H is where i bought it.
> 
> the salesperson told me they had a depot in europe, so maybe contact them and ask where they ship from if you buying online...


OK. I was actually asking specifically about jet.com in my original question, but thanks anyway, as I just confirmed that they don't send out any codes.


----------



## MyNewRig

Quote:


> Originally Posted by *Star Forge*
> 
> So apparently there are two versions of the Hybrid from EVGA. The Gaming Hybrid is the version that has a FTW PBC mated to Hybrid cooler and a normal Hybrid version that is just a FE PCB mated to a Hybrid cooler, which you could also upgrade if you have a FE PCB style and pay another $100 for the Hybrid upgrade set.


Okay, so that is an FE "style" board, not an actual FE board with FE components? I don't think they are taking Nvidia's FE cards, removing the cooler, and putting on their own, so they must be using an FE "design" and soldering on their own chosen components, including Micron GDDR5. So this card is more like a 1070 SC with ACX, but slapped with a closed loop?


----------



## gtbtk

Quote:


> Originally Posted by *BroPhilip*
> 
> https://www.techpowerup.com/reviews/MSI/GTX_1070_Gaming_Z/29.html
> 
> So techpowerup reviewed the msi gaming z and guess what it's samsung memory......this just makes me angry. Oh and it's dated the 20th of September. Which is long after I bought mine with micron memory.....


Bit of a bait and switch, isn't it?


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> Bit of a bait and switch isnt it?


They received the sample a few months ago, but they needed some time to write the whole article.


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> Bit of a bait and switch isnt it?


Only a bit? That is outright dirty marketing...


----------



## gtbtk

Quote:


> Originally Posted by *BroPhilip*
> 
> https://www.techpowerup.com/reviews/MSI/GTX_1070_Gaming_Z/29.html
> 
> So techpowerup reviewed the msi gaming z and guess what it's samsung memory......this just makes me angry. Oh and it's dated the 20th of September. Which is long after I bought mine with micron memory.....


Bit of a bait and switch, isn't it?

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Bit of a bait and switch, isn't it?
> 
> 
> 
> only a bit? that is outright dirty marketing ...

I have been told a billion times not to exaggerate. ;-)


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> I have been told a billion times not to exaggerate. ;-)


Yeah, those same consumers who will swallow whatever trash Greedvidia throws their way are ruining the market, and are letting Nvidia eat us all for breakfast...

Extremely inflated prices, and they still buy. Cheaping out on components? No problem. Review samples sent with components that are non-existent at retail? "Don't exaggerate"... They keep us totally in the dark and don't mind, even when they have to use strange power tricks and workarounds to make their products run as they should!

A monopoly with a huge pool of fanboys surrounding it is the worst kind of market. I am even more mad at AMD now for delaying Vega this much; if Vega were available right now, I would have switched instantly...

I really don't know why the other manufacturers are leaving the discrete GPU market to Nvidia to play us like this. Maybe they don't care, since the mobile/tablet market is much bigger and still expanding, and that is where everyone is focused?

Sony and MS were smart to use AMD hardware exclusively in consoles, because Nvidia's business practices are disgusting. Nvidia tried to counter with its failed Shield tablet, but no one cares about it or buys it.

If it were not for AMD's thirty-something percent market share (which is rising quarter after quarter, for good reasons), Nvidia would already have been broken up as a monopoly under antitrust law, because they are acting exactly like one: inflated prices, bad customer treatment, shady business... they have all of it. I wish there were four or five players in that market, because things would be much better for the consumer.

I am so sick of Nvidia and its products, but there are currently zero alternatives for my needs, so I have to either alter my needs or wait for the slow AMD to come to the party, late as usual.


----------



## asdkj1740

Quote:


> Originally Posted by *BroPhilip*
> 
> https://www.techpowerup.com/reviews/MSI/GTX_1070_Gaming_Z/29.html
> 
> So techpowerup reviewed the msi gaming z and guess what it's samsung memory......this just makes me angry. Oh and it's dated the 20th of September. Which is long after I bought mine with micron memory.....


no surprise with review samples...


----------



## RyanRazer

Quote:


> Originally Posted by *MyNewRig*
> 
> Yeah those same consumers who would just suck any trash Greedvidia throws their way are ruining the market, and are making Nvidia eat us all for breakfast...
> 
> Extremely inflated prices and they still buy, cheaping out on components, no problem, review samples sent with components that are non-existent in retail, don't exaggerate ... keeping us totally in the dark here, they don't mind, even if they have to use strange power tricks and workarounds to make their products run as it should!
> 
> A monopoly with a huge pool of fanboys surrounding it is the worst kind of market, i am even more mad at AMD now for delaying VEGA all that much, if VEGA was currently available i would have switched instantly ...
> 
> I really don't know why the other manufacturers are leaving the discrete GPU market all to Nvidia to play us like this? maybe they don't care since the mobile/tablet market is much bigger and expanding and this is where everyone is focused?
> 
> Sony and MS were smart to use AMD hardware exclusively in consoles because Nvidia business practices are disgusting, Nvidia tried to counter with its failed shield tablet but no one cares about that or buy it.
> 
> If it was not for AMDs 30% something market share (which is rising quarter after quarter for good reasons) Nvidia would have already been broken down as a monopoly under antitrust law, because they are acting exactly like one, inflated prices, bad customer treatment, shady business .. they have all of it and i hope that there were 4 or 5 players in that market because things would have been much better for the consumer if it was the case.
> 
> I am so sick of Nvidia and its products but there are currently zero alternatives for my needs, so i have to either alter my needs or wait for the slow AMD to come to the party, late as usual.


I second that! Exactly my view and situation. I am waiting for Vega; if the price is right and it's suitable for 1440p/144Hz gaming, I'm all for it.

_Disclaimer: I know the GTX 1070 isn't a 1440p/144Hz gaming GPU, but you can get there in most games by lowering some settings. 100 FPS is also good for me; anything above 60-90 is OK with me..._


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I have been told a billion times not to exaggerate. ;-)
> 
> 
> 
> Yeah those same consumers who would just suck any trash Greedvidia throws their way are ruining the market, and are making Nvidia eat us all for breakfast...
> 
> Extremely inflated prices and they still buy, cheaping out on components, no problem, review samples sent with components that are non-existent in retail, don't exaggerate ... keeping us totally in the dark here, they don't mind, even if they have to use strange power tricks and workarounds to make their products run as it should!
> 
> A monopoly with a huge pool of fanboys surrounding it is the worst kind of market, i am even more mad at AMD now for delaying VEGA all that much, if VEGA was currently available i would have switched instantly ...
> 
> I really don't know why the other manufacturers are leaving the discrete GPU market all to Nvidia to play us like this? maybe they don't care since the mobile/tablet market is much bigger and expanding and this is where everyone is focused?
> 
> Sony and MS were smart to use AMD hardware exclusively in consoles because Nvidia business practices are disgusting, Nvidia tried to counter with its failed shield tablet but no one cares about that or buy it.
> 
> If it was not for AMDs 30% something market share (which is rising quarter after quarter for good reasons) Nvidia would have already been broken down as a monopoly under antitrust law, because they are acting exactly like one, inflated prices, bad customer treatment, shady business .. they have all of it and i hope that there were 4 or 5 players in that market because things would have been much better for the consumer if it was the case.
> 
> I am so sick of Nvidia and its products but there are currently zero alternatives for my needs, so i have to either alter my needs or wait for the slow AMD to come to the party, late as usual.

That's the thing: there is nothing comparable currently on the market, and there is high demand. Intel are doing exactly the same thing.

The reason there are not more players making GPUs is the billions of dollars of investment required to set something up from scratch; given the finite size of the market, it would probably take many, many years to recoup that investment, making it not viable for anyone trying to catch up from a starting point of nothing.

I am sure any delays with Vega are more because the process shrink has been problematic and didn't work properly initially, meaning they need more time to develop it. Intel had similar problems with Broadwell CPUs. As frustrating as it may be, I don't think they are doing it to spite any of us.


----------



## kevindd992002

I just asked the sole distributor in our country and was told that the only shipments they got from Zotac (until now) were one from June and another from the first week of July. With that information, do you guys think that those still contain Samsung vRAM chips? I read that the Micron chips started appearing last August, right?


----------



## TheGlow

Quote:


> Originally Posted by *MyNewRig*
> 
> Sony and MS were smart to use AMD hardware exclusively in consoles because Nvidia business practices are disgusting, Nvidia tried to counter with its failed shield tablet but no one cares about that or buy it.


IMO this is one of the main reasons the consoles are trash. I remember getting a Wii and seeing the ATI logo on the side: "That's why the graphics are already outdated."
Back then a console lasted a good while and performance was pretty much static. Once they started going with AMD is also when we started seeing games no longer running at 1080p but at odd resolutions like 920p, and games can't even run consistently at 30fps the majority of the time, let alone 60.
I've tried ATI/AMD PC cards on a few occasions and always had trouble with driver issues and other oddities. Just recently, with my R9 380, I couldn't set the refresh rate higher than 60; it said the monitor didn't support it. A 144Hz Dell. I had to set the refresh rate via Windows. If at any point I launched the extra Radeon monitor settings area I would get, not exaggerating, 12-15 notification-tray pop-ups about unsupported resolutions, and then it would force me back to 60Hz. Then I had to go back into the Windows monitor settings and set it back to 144Hz.
Once I got my 1070 and got back on Nvidia's wagon, no more problems.
Anecdotal, maybe just my experience, but for 15+ years this has been the pattern.


----------



## xg4m3

To me it seems like AMD/ATI is always one step behind Nvidia. They're far from bad, but there is always something missing. No matter what they do, Nvidia will always squeeze more power out of their hardware and, boom, the majority of people will ignore AMD. That, and games obviously working better with Nvidia drivers. Every single friend of mine with an AMD card worries about support whenever a new game is released, and that says something. It's a shame; we need a stronger competitor to get lower prices, but it doesn't seem like that will happen any time soon.


----------



## TheGlow

Quote:


> Originally Posted by *xg4m3*
> 
> To me it seems like AMD/ATI is always one step behind Nvidia. They're far from being bad, but there is always something missing. No matter what they do, Nvidia will always squeeze more power into their hardware and boom, majority of people will ignore AMD. That and games obviously working better with nvidia drivers. Every single friend of mine with AMD card always worries about support when new game is released and that says something. It's a shame, we need stronger competitor to have lower prices, but it doesn't seem like that will happen any time soon.


Minor and petty, but I had the Crimson drivers keep giving me a notification that there was a driver upgrade, which was a version behind what I was already on. And I don't mean I was on a beta or something; I was on the next full release. It kept at it for a good week until it caught up and left me alone.
It's like how MS said they'd use AMD in the next console, Scorpio, but not until next Christmas, kind of indicating AMD's new line isn't ready for at least another year.


----------



## matti2

It's good to have both AMD and Nvidia in the market, and it wouldn't be a bad idea if there were more competitors to keep prices decent.
I've been a PC gamer since the 90s, and after the 1070 I might switch back to AMD, you never know...


----------



## madmeatballs

Quote:


> Originally Posted by *kevindd992002*
> 
> I just asked the sole distributor in our country and was told that the only shipments they got from Zotac (until now) were one from June and another from the first week of July. With that information, do you guys think that those still contain Samsung vRAM chips? I read that the Micron chips started appearing last August, right?


You contacted JTPX? I knew they would still have these cards from June-July's stock.


----------



## asdkj1740

Quote:


> Originally Posted by *RyanRazer*
> 
> I second that! Exactly my view & situation. I am waiting for Vega. If the price is right and suitable for 1440p 144hz gaming, i'm all for it.
> 
> _Disclaimer: I know the GTX 1070 isn't a 1440p/144Hz gaming GPU, but you can get there in most games by lowering some settings. 100 FPS is also good for me; anything above 60-90 is OK with me..._


The price must be "wrong" for Vega...

But more and more FreeSync monitors are coming out; the Samsung VA 1ms quantum-dot panels are very appealing, and all of them support FreeSync...


----------



## kevindd992002

Quote:


> Originally Posted by *madmeatballs*
> 
> You contacted JTPX? I knew they would still have these cards from June-July's stock.


Yes. I'm talking to Jerome and he said that they only have two shipments so far, so I think I have a high chance of getting a card with Samsung vRAM chips.


----------



## Lavajuice

Quote:


> Originally Posted by *TheGlow*
> 
> I'd do an RMA for a new card. If you just got it then hopefully that stack was still Samsungs.
> Otherwise I have a micron and it performs wonderfully. And theres supposed to be the new bios that should help.


Yeah, it's now a matter of whether I go through Gigabyte or Newegg. Newegg would pay for shipping both ways, while Gigabyte would make me pay to ship it out there and would potentially take a lot longer, though with them I would be guaranteed to keep my card. My initial OC results are +100 core / +400 memory, stable playing OW, which seems not amazing compared to others, but it could also be worse, which is what I fear with getting a new one. Obviously yours is an amazing Micron example, but it seems like most other Micron users on here are reporting lower than +400.


----------



## shadowrain

Quote:


> Originally Posted by *kevindd992002*
> 
> I just asked the sole distributor in our country and was told that the only shipments they got from Zotac (until now) were one from June and another from the first week of July. With that information, do you guys think that those still contain Samsung vRAM chips? I read that the Micron chips started appearing last August, right?


Pinas? I got my Zotac 1070 AMP Extreme from Hub the first week of July, from a new shipment, as all 1070s were sold out in June. Samsung VRAM, highly OCable.


----------



## RyanRazer

Quote:


> Originally Posted by *shadowrain*
> 
> Pinas? I got my Zotac 1070 Amp Extreme from Hub 1st week of July, new shipment as all 1070's were sold out in June. Samsung vram, highly ocable.


So extreme, ha? I just got one today. I had the G1 Gaming but sold it and went for the AMP Extreme. It is better: quieter, cooler, and it OCs a bit better, but I can't get past 2088MHz on the core... What can you push yours to?

Quote:


> Originally Posted by *kevindd992002*
> 
> I just asked the sole distributor in our country and was told that the only shipments they got from Zotac (until now) were one from June and another from the first week of July. With that information, do you guys think that those still contain Samsung vRAM chips? I read that the Micron chips started appearing last August, right?


I have the Extreme, got it today via amazon.it, and it has Samsung chips. I think all Extremes have Samsung chips; I read that somewhere. Not sure about the regular AMP, though...


----------



## MyNewRig

Quote:


> Originally Posted by *RyanRazer*
> 
> So extreme, ha? i just got one today. I had G1 gaming but i sold it and went for amp extreme. It is better, quieter, cooler, OCs a bit better but i cant get pass 2088MhZ core... What can you push it to?
> I have extreme, got it today via amazon.it, it has Samsung chip. I think all extremes have samsung chips, i read it somewhere. Not sure about regular amp though...


How come all GB Extremes have Samsung? I've seen a few people with the Extreme posting GPU-Z screenshots showing Micron, and I even asked Gigabyte's Taiwan HQ whether all Extremes have Samsung, but they said they cannot guarantee that and the memory on the Extreme can be either, so it is highly unlikely that all Extremes have Samsung.


----------



## RyanRazer

Quote:


> Originally Posted by *MyNewRig*
> 
> How come all GB Extreme have Samsung? i seen a few people with the Extreme posting GPU-Z screenshots with Micron, and i even asked Gigabyte Taiwan HQ if all Extreme have Samsung but they said they can not guarantee that and memory can be anything on the Extreme, so it is highly unlikely that all Extremes have Samsung.


Zotac AMP Extreme, not the GB Extreme Gaming.


----------



## MyNewRig

Quote:


> Originally Posted by *RyanRazer*
> 
> Zotac amp extreme, no gb extreme gaming


Ah, sorry. Even the Zotac Extreme switched to Micron in August; only the June/July/early-August production used Samsung, and everything after that is Micron.


----------



## bobbyh

Hey guys, I have the MSI Gaming X and was looking at the charts in the Tom's Hardware article: http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1070-8gb-pascal-performance,4585-7.html

I was wondering if I could flash the FE BIOS to my Gaming X 1070 to get the efficiency of the 1070 FE with the better cooling of my Gaming X card. I don't really care about absolute max performance, just the lowest idle and load power usage.


----------



## Nukemaster

Quote:


> Originally Posted by *bobbyh*
> 
> Hey guys I have the MSI Gaming X and was looking at the charts on the tomshardware article http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1070-8gb-pascal-performance,4585-7.html
> 
> I was wondering if I could flash the FE bios to my gaming X 1070 in order to get the efficiency of the 1070fe but with the better cooling of my gaming x card. I don't really care about absolute max performance, just lowest idle power usage and load power usage.


You could either set a negative offset or lower the power limit in afterburner. It would be a safer bet than risking the card with the flash.


----------



## mrtbahgs

What's the best free game capture software that is pretty easy to set up?

In the past I used OBS to stream, but I had help setting it up from someone who already had it running and I remember there was a lot going on so I don't think I want to try it all over again.

Is Shadow Play or whatever it is called actually pretty decent? I have zero experience with it.

I think for now I would just record current gameplay and not stream and then I can cut out scenes I want to keep and delete the rest since I assume it takes up some good HDD space. Perhaps I would be better off with the "record my last 10 minutes" button and I can selectively use it when something worth saving occurs. Is that feature included with Shadow Play or whatever other software you may recommend?
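On the disk-space question, some rough back-of-the-envelope math (a sketch; the 50 Mbps figure is only in the ballpark of ShadowPlay's high-quality 1080p60 setting, and actual bitrates vary by preset and resolution):

```python
def recording_size_gb(bitrate_mbps: float, minutes: float) -> float:
    """Approximate file size of a constant-bitrate recording.

    Converts megabits/second into gigabytes over the given duration:
    Mbps * seconds / 8 -> megabytes, / 1000 -> gigabytes.
    """
    return bitrate_mbps * minutes * 60 / 8 / 1000

# A 10-minute replay buffer at ~50 Mbps works out to roughly 3.75 GB,
# and an hour of continuous recording to ~22.5 GB.
print(recording_size_gb(50, 10))  # 3.75
print(recording_size_gb(50, 60))  # 22.5
```

So a "last 10 minutes" buffer is a few gigabytes, not a drive-filler, as long as you trim and delete regularly.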


----------



## Nukemaster

Quote:


> Originally Posted by *mrtbahgs*
> 
> What's the best free game capture software that is pretty easy to set up?
> 
> In the past I used OBS to stream, but I had help setting it up from someone who already had it running and I remember there was a lot going on so I don't think I want to try it all over again.
> 
> Is Shadow Play or whatever it is called actually pretty decent? I have zero experience with it.
> 
> I think for now I would just record current gameplay and not stream and then I can cut out scenes I want to keep and delete the rest since I assume it takes up some good HDD space. Perhaps I would be better off with the "record my last 10 minutes" button and I can selectively use it when something worth saving occurs. Is that feature included with Shadow Play or whatever other software you may recommend?


Try ShadowPlay (they call it Share now, but you can turn off all the streaming stuff); it works very well.

The actual replay feature lets you run a constant buffer so you can grab the last X minutes of video and save them. I don't use it myself, but it is a cool idea for those random things that games do.


----------



## bobbyh

Quote:


> Originally Posted by *Nukemaster*
> 
> You could either set a negative offset or lower the power limit in afterburner. It would be a safer bet than risking the card with the flash.


Thanks for the reply. I just realized the page I linked was the wrong one; this is the right one: http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1070-8gb-pascal-performance,4585-8.html

Yeah, I am currently setting the power limit to 50%. The problem is that, if you check the charts in that link, the FE at a 50% power limit consumes only ~70 watts while the Gaming X consumes ~110. Idle power usage is also twice as high on the Gaming X as on the FE. I saw somewhere that this could be because the Gaming X added frequency bins at the top and removed some from the bottom, limiting how far the clock can drop at idle.
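Those numbers roughly line up if you remember the power-limit slider is just a percentage of each card's default power target (a sketch; 150 W is the published FE board power, while the Gaming X's default target here is inferred from the measurements, not a spec I've verified):

```python
def target_watts(board_power_w: float, limit_pct: float) -> float:
    """Power target in watts for a given power-limit percentage."""
    return board_power_w * limit_pct / 100

# FE reference board power is 150 W, so a 50% limit targets 75 W --
# close to the ~70 W Tom's Hardware measured.
print(target_watts(150, 50))  # 75.0

# If the Gaming X draws ~110 W at the same 50% setting, its default
# target is presumably somewhere around 220 W (an inference, not a spec).
print(target_watts(220, 50))  # 110.0
```

Which is why the same slider position lands at very different wattages on the two cards: the percentage is relative to each BIOS's own power target.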


----------



## Roland0101

Quote:


> Originally Posted by *bobbyh*
> 
> Thanks for the reply. I just realized the page I sent was the wrong one, this is the right one http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1070-8gb-pascal-performance,4585-8.html
> 
> Yeah I am currently setting the power limit to 50, the problem is that if you check out the charts in that link, the fe at 50% power limit consumes only ~70 watts while the gaming x consumes ~110. Idle power usage is also twice as high on the gaming x as the FE. I saw somewhere that this could be because the gaming x added frequency bins at the top and removed some from the bottom limiting the frequency it could drop down to when idle.


I agree with Nukemaster. I would not try to flash an FE BIOS onto an MSI Gaming X.
If you don't mind me asking, why did you purchase a 1070 with very high power consumption if power usage is the most important factor for you?


----------



## RyanRazer

Quote:


> Originally Posted by *mrtbahgs*
> 
> What's the best free game capture software that is pretty easy to set up?
> 
> In the past I used OBS to stream, but I had help setting it up from someone who already had it running and I remember there was a lot going on so I don't think I want to try it all over again.
> 
> Is Shadow Play or whatever it is called actually pretty decent? I have zero experience with it.
> 
> I think for now I would just record current gameplay and not stream and then I can cut out scenes I want to keep and delete the rest since I assume it takes up some good HDD space. Perhaps I would be better off with the "record my last 10 minutes" button and I can selectively use it when something worth saving occurs. Is that feature included with Shadow Play or whatever other software you may recommend?


Do try ShadowPlay. I tried it and the FPS hit was almost zero, whereas when I tried to record with FRAPS, for example, it was horrible.


----------



## Dude970

Quote:


> Originally Posted by *mrtbahgs*
> 
> What's the best free game capture software that is pretty easy to set up?
> 
> In the past I used OBS to stream, but I had help setting it up from someone who already had it running and I remember there was a lot going on so I don't think I want to try it all over again.
> 
> Is Shadow Play or whatever it is called actually pretty decent? I have zero experience with it.
> 
> I think for now I would just record current gameplay and not stream and then I can cut out scenes I want to keep and delete the rest since I assume it takes up some good HDD space. Perhaps I would be better off with the "record my last 10 minutes" button and I can selectively use it when something worth saving occurs. Is that feature included with Shadow Play or whatever other software you may recommend?


MSI Afterburner has recording built in, and it worked pretty well the times I tried it.


----------



## bobbyh

Quote:


> Originally Posted by *Roland0101*
> 
> I agree with Nukemaster. I would not try to flash a FE Bios to a MSI Gaming X.
> If you don't mind me asking, way did you purchased a 1070 with a very high power consumption if power usage is the most important factor for you?


I had bought it when I lived at a different place, and where I am living now doesn't have air conditioning, so I am trying to limit how much the computer heats up my room. I have an undervolted i7 5775C too; I'm just trying to go for minimum power draw and heat while maintaining the best performance I can.


----------



## bobbyh

Quote:


> Originally Posted by *bobbyh*
> 
> I had bought it when I lived at a different place and where I am living now doesn't have air conditioning so I am trying to limit how much the computer heats up my room, I have an undervolted i7 5775c to, jsut trying to go for minimum power draw, and heat, while maintaining the best performance I can.


Noise also bothers me, so I want the fans turned down as low as possible.


----------



## RyanRazer

*Switched G1 Gaming for AMP! Extreme*

So I just sold my G1. It was an OK card, but it ran too hot and pretty loud.

Here's an SS of the G1 in action









The card runs much quieter and cooler. Temps when OCed and in game are 66-70C. Besides that, the fans are much quieter than the G1's, which spin much faster and are generally louder. In that regard I am very pleased with the upgrade; those were my two main reasons for switching.

Here's AMP E gaming session, medium OC









MAX OC









As far as OCing goes, it is just OK. The memory overclocks fine to 9222MHz effective, but on the GPU I just didn't win the silicon lottery, I guess. I was hoping for a stable core clock in the 2100-2150MHz range, but mine is stable at 2088MHz max. Oh, and I have Samsung memory modules.









Q1: Does the choice of OCing tool play any role here? I was using MSI Afterburner, of course, but would using Zotac's own utility be any benefit?
Q2: Does anyone know of a BIOS to flash that would let my card use 125% power or more?

tl;dr

Lovely, quiet, cool-running card. The memory overclocks fine; the GPU, not so much.


----------



## kevindd992002

Quote:


> Originally Posted by *shadowrain*
> 
> Pinas? I got my Zotac 1070 Amp Extreme from Hub 1st week of July, new shipment as all 1070's were sold out in June. Samsung vram, highly ocable.


Yes, I'm from the Philippines. Well, that's good news then. It matches the information I got from the distributor, whose last shipment from Zotac was the first week of July.

How were you able to confirm that the one you got was from the new shipment? Did you have a "waiting period" until the new stock arrived?


----------



## Waleh

Noob question - I received my 1070 today and this is my first time upgrading a GPU. Do I uninstall my previous 970 drivers or do I just stick in the 1070 and then update from GeForce experience?


----------



## Forceman

Quote:


> Originally Posted by *Waleh*
> 
> Noob question - I received my 1070 today and this is my first time upgrading a GPU. Do I uninstall my previous 970 drivers or do I just stick in the 1070 and then update from GeForce experience?


Nvidia drivers are the same for all cards, so it really doesn't matter. I'd install the new card, then do a clean install (click the clean install check box in the installer) with the newest drivers.


----------



## BulletSponge

Quote:


> Originally Posted by *Waleh*
> 
> Noob question - I received my 1070 today and this is my first time upgrading a GPU. Do I uninstall my previous 970 drivers or do I just stick in the 1070 and then update from GeForce experience?


Quote:


> Originally Posted by *Forceman*
> 
> Nvidia drivers are the same for all cards, so it really doesn't matter. I'd install the new card, then do a clean install with the newest drivers.


^This, before the wave of DDU recommenders say otherwise.


----------



## Waleh

Awesome, thanks guys! I'll do that


----------



## Forceman

Quote:


> Originally Posted by *BulletSponge*
> 
> ^This, before the wave of DDU recommenders say otherwise.


There's probably at least one person burning to tell us how they do a fresh Windows install every time they change cards.


----------



## Roland0101

Quote:


> Originally Posted by *Waleh*
> 
> Noob question - I received my 1070 today and this is my first time upgrading a GPU. Do I uninstall my previous 970 drivers or do I just stick in the 1070 and then update from GeForce experience?


Best way IMHO is:
Uninstall the 970 in Device Manager.
Shut down the PC.
Pull the power plug.
Press the power button for a few seconds to drain residual power.
Pull the 970 (toggle the PCIe slot latch to unlock the card).
Put in the 1070 (make sure the card is firmly seated in the PCIe slot and that the power cables are connected properly).
Boot into the BIOS and check that the card is recognized.
Boot into Windows and install the driver.


----------



## Roland0101

Quote:


> Originally Posted by *bobbyh*
> 
> I had bought it when I lived at a different place and where I am living now doesn't have air conditioning so I am trying to limit how much the computer heats up my room, I have an undervolted i7 5775c to, jsut trying to go for minimum power draw, and heat, while maintaining the best performance I can.


Ok, that is very understandable.


----------



## BroPhilip

So I have had two BSODs lately with my once-stable MSI Gaming Z, in Star Wars Battlefront... the only change was turning vsync on. Just guessing from the numbers, it appears the card is really cutting back its power, since it only needs to push 60fps, but the RAM clock is lagging and staying at the higher clock... both times were hard freezes (once in game and once going to the menu) followed by a BSOD... so when is this BIOS update coming?


----------



## abdidas

Just got an EVGA FTW edition, holy moly this thing is quiet. Couldn't be happier with the upgrade, btw I came from a GTX 550 ti so this is a huge step up for me.


----------



## gtbtk

Quote:


> Originally Posted by *kevindd992002*
> 
> I just asked the sole distributor in our country and was told that the only shipments they got from Zotac (until now) were one from June and another from the first week of July. With that information, do you guys think that those still contain Samsung vRAM chips? I read that the Micron chips started appearing last August, right?


My MSI Gaming with Micron memory was from the 2nd week of July 2016, but I think I was one of the first to get a Micron card.


----------



## jlhawn

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kevindd992002*
> 
> I just asked the sole distributor in our country and was told that the only shipments they got from Zotac (until now) were one from June and another from the first week of July. With that information, do you guys think that those still contain Samsung vRAM chips? I read that the Micron chips started appearing last August, right?
> 
> 
> 
> My MSI Gaming with micron was 2nd week July 2016 but I think I was one of the first to get a Micron card

I bought my MSI Gaming X July 5th and I have Samsung.


----------



## jlhawn

Quote:


> Originally Posted by *abdidas*
> 
> Just got an EVGA FTW edition, holy moly this thing is quiet. Couldn't be happier with the upgrade, btw I came from a GTX 550 ti so this is a huge step up for me.


Big Time step for you.







Enjoy.


----------



## gtbtk

Quote:


> Originally Posted by *jlhawn*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kevindd992002*
> 
> I just asked the sole distributor in our country and was told that the only shipments they got from Zotac (until now) were one from June and another from the first week of July. With that information, do you guys think that those still contain Samsung vRAM chips? I read that the Micron chips started appearing last August, right?
> 
> 
> 
> My MSI Gaming with micron was 2nd week July 2016 but I think I was one of the first to get a Micron card
> 
> 
> I bought my MSI Gaming X July 5th and I have Samsung.
Click to expand...

Mine was manufactured the 2nd week of July. Purchasing your card on July 5th would place it in the first batch, manufactured in June.


----------



## jlhawn

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jlhawn*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kevindd992002*
> 
> I just asked the sole distributor in our country and was told that the only shipments they got from Zotac (until now) were one from June and another from the first week of July. With that information, do you guys think that those still contain Samsung vRAM chips? I read that the Micron chips started appearing last August, right?
> 
> 
> 
> My MSI Gaming with micron was 2nd week July 2016 but I think I was one of the first to get a Micron card
> 
> 
> I bought my MSI Gaming X July 5th and I have Samsung.
> 
> 
> Mine was manufactured 2nd week July. purchasing your card on 5 July would place it in the first batch manufactured in June

True, didn't think of that.

GPU-Z shows a release date on mine of May 30, 2016, if that's the same as the manufacture date.


----------



## abdidas

Quote:


> Originally Posted by *jlhawn*
> 
> Big Time step for you.
> 
> 
> 
> 
> 
> 
> 
> Enjoy.


Yeah, I upgraded pretty much everything except the GPU back in April: i7 6700K, 16GB RAM, Z170, ITX build. My trusty old Q6600 lasted me many years, but it's now found a new home.


----------



## jlhawn

Quote:


> Originally Posted by *Roland0101*
> 
> Ok, that is very understandable.


Are you the same person on the nvidia forums?
If so, some of the users there are jerks toward you, even though you have been very helpful to lots of them, in my opinion.


----------



## jlhawn

Quote:


> Originally Posted by *abdidas*
> 
> Yeah I upgraded pretty much everything except the gpu back in April, i7 6700k, 16gb ram, z170, itx business. My trusty old Q6600 lasted me many years but it's now found a new home.


I have an old Q9650 that I gave to a buddy's daughter; it's still running strong.


----------



## abdidas

BTW, I got a Micron chip in my 1070 FTW, so I guess I'm screwed in the overclocking department if I'm following right


----------



## jlhawn

Quote:


> Originally Posted by *abdidas*
> 
> BTW I got Micron chip in my 1070 FTW so I guess am screwed in the overclock department if am following right


Per nvidia forum mod (Manuel G), there is a BIOS fix in the works for the Micron issue, but that could take months, as they have to work with every graphics card manufacturer: EVGA, MSI, Gigabyte, etc.

See page 16 of this thread:
https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/16/


----------



## gtbtk

Quote:


> Originally Posted by *jlhawn*
> 
> Quote:
> 
> 
> 
> Originally Posted by *abdidas*
> 
> BTW I got Micron chip in my 1070 FTW so I guess am screwed in the overclock department if am following right
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> per nvidia forum mod (Manuel G) there is a bios fix in the works for the micron issue but, that could take months as
> they have to work with every manufacturer of the graphics cards, evga msi gigabyte etc.
> 
> this thread page 16
> https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/16/

This forum has been at the centre of all of that.

I am the one who identified the bug and started that thread. After 17-odd pages, I finally managed to convince Nvidia, with Roland's and GamersX's assistance, that they needed to give it some attention.

Yes, you are right; there are a number of know-nothing blowhards on that forum who contribute little that is constructive.


----------



## gtbtk

Quote:


> Originally Posted by *abdidas*
> 
> BTW I got Micron chip in my 1070 FTW so I guess am screwed in the overclock department if am following right


No, you are not screwed at all.

Read through this thread; there are many posts that describe how you can work around and avoid the bug that causes the "Micron" issue.


----------



## pez

What exactly is the 'bug'?


----------



## Nukemaster

Quote:


> Originally Posted by *bobbyh*
> 
> Thanks for the reply. I just realized the page I sent was the wrong one, this is the right one http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1070-8gb-pascal-performance,4585-8.html
> 
> Yeah I am currently setting the power limit to 50, the problem is that if you check out the charts in that link, the fe at 50% power limit consumes only ~70 watts while the gaming x consumes ~110. Idle power usage is also twice as high on the gaming x as the FE. I saw somewhere that this could be because the gaming x added frequency bins at the top and removed some from the bottom limiting the frequency it could drop down to when idle.


I am not sure how much more you can do. That should still be a noticeable drop in power. Does a negative offset help any?


----------



## MyNewRig

Quote:


> Originally Posted by *jlhawn*
> 
> per nvidia forum mod (Manuel G) there is a bios fix in the works for the micron issue but, that could take months as
> they have to work with every manufacturer of the graphics cards, evga msi gigabyte etc.
> 
> this thread page 16
> https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/16/


Are you sure this would take *MONTHS* instead of days? Because if that is the case, I will just return my Micron 1070 today and wait a few months for Vega HBM2 or Pascal v2 with GDDR5X memory. I keep checking for that BIOS update daily!

Please give me more details on how you arrived at this prediction.


----------



## asdkj1740

Hey, how many of you are using EVGA Precision X to manually adjust the voltage curve for overclocking?
I found that manually adjusting the voltage curve, instead of shifting it up linearly, may gain more overclocking headroom, around ~40MHz in gaming.
But in FSU I get a lower score for a higher GPU clock, and I am sure there is no GPU MHz throttling during the FSU run.

For 2101MHz, I used MSI AB with the voltage increased by 100%, and the max voltage is 1.09v.
For 2139MHz, I used EVGA PX with a manually set voltage curve, and the max voltage is 1.08v.


----------



## gtbtk

Quote:


> Originally Posted by *pez*
> 
> What exactly is the 'bug'?


This has been described in detail a number of times, both here and on nvidia.com, so I will be brief here.

When a Micron 1070 is in a low power state, idling below about .780 - .800v, and you set a high memory overclock and put the card under load, it will show checkerboard artifacts and then BSOD.

The reason is that the memory VRM does not increase the voltage to the memory fast enough to match the voltage requirements of the higher memory frequency, resulting in the memory being starved and crashing.

You can work around the problem if you lock the voltage in the curve to ensure it stays above .800v. If you do that, the memory doesn't checkerboard and BSOD. Of course, if you overclock too far, you will start to see the traditional artifacts/display errors, just like on any other card.

The BIOS fix should better coordinate the increased voltage supply with the increased memory clock voltage requirements.
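The workaround amounts to trimming the low end of the voltage/frequency curve so the card can never idle below the danger zone. A minimal sketch of that idea in Python, with entirely made-up curve points (the real tuning happens in MSI Afterburner or EVGA Precision, not in code):

```python
# Illustrative only: a voltage (V) -> clock (MHz) map with made-up values,
# standing in for the card's boost curve.

def clamp_curve(curve, min_voltage=0.800):
    """Drop every point below min_voltage so boost can never idle there."""
    return {v: mhz for v, mhz in curve.items() if v >= min_voltage}

curve = {0.650: 1278, 0.750: 1544, 0.800: 1633, 0.900: 1822, 1.000: 1987, 1.093: 2100}
locked = clamp_curve(curve)
print(min(locked))  # lowest selectable voltage is now 0.8
```

The upper points (and the maximum clock) are untouched; only the sub-.800v idle states become unselectable.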


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> hey, how many of you using evga precision x to manually adjust the voltage curve for overclocking???
> i found that manually adjust the voltage curve instead of using linear upward shifting may gain more in overclocking, around ~40mhz in gaming.
> but in fsu i get lower mark for higher gpu clock and i am sure there is no throttling in gpu mhz during fsu testing.
> 
> for 2101mhz, it is using msi ab to increase 100% voltage and the max voltage is 1.09v
> for 2139mhz, it is using evga px and manually set the voltage curve and max voltage is 1.08v


Assuming you have the voltage at +100, modify the curve you set up by increasing the curve point at .975v to somewhere in the range of 1950 - 2000MHz. If the run finishes, your benchmark scores should improve. If it crashes, reduce the .975v level slightly; if it passes, increase it slightly and retest until you find the maximum level at which the .975v point remains stable.

If you do not add any extra voltage, use the .950v point instead of .975v. MSI Afterburner allows finer control of the curve than EVGA.
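That pass/fail retest loop is essentially a bisection search over clock values. A sketch, assuming a hypothetical `is_stable(mhz)` callback that stands in for actually running the benchmark:

```python
def find_max_stable(lo, hi, is_stable, step=13):
    """Bisect between a known-good clock (lo) and a failing clock (hi),
    narrowing in ~13 MHz bins (roughly the granularity Pascal clocks move in)."""
    while hi - lo > step:
        mid = (lo + hi) // 2
        if is_stable(mid):   # benchmark finished: push higher
            lo = mid
        else:                # crashed/artifacted: back off
            hi = mid
    return lo

# Fake stability boundary at 2050 MHz, purely for illustration:
best = find_max_stable(1950, 2150, lambda mhz: mhz <= 2050)
print(best)  # 2050
```

In practice each `is_stable` call is a full benchmark run, so the bisection mainly saves you from retesting every single offset by hand.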


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> Assuming you have the voltage at +100, Modify the curve you set up by increasing the curve point at .975 up to somewhere in the range of 1950 - 2000mhz. If the run finishes, your benchmark scores should improve. If it crashes, reduce the .975 level slightly, if it passes increase it slightly and retest until you find the maximum level you can run the .975v point while remaining stable.
> 
> If you do not add any extra voltage, use the .950 point instead of .975. MSI afterburner allows finer control of the curve than EVGA.


How do I access the voltage curve adjustment in MSI Afterburner? I can't find it, thanks.

EDIT: Oh, I see it now by pressing Ctrl+F; many thanks, that's awesome.
I was adjusting only the 1.09v and 1.08v points in EVGA Precision X, and that gave me more overclocking room.
During gaming, the voltage mostly ranges from 1.07v to 1.09v (if adding 100% voltage), so why does the 0.975v point matter?


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Assuming you have the voltage at +100, Modify the curve you set up by increasing the curve point at .975 up to somewhere in the range of 1950 - 2000mhz. If the run finishes, your benchmark scores should improve. If it crashes, reduce the .975 level slightly, if it passes increase it slightly and retest until you find the maximum level you can run the .975v point while remaining stable.
> 
> If you do not add any extra voltage, use the .950 point instead of .975. MSI afterburner allows finer control of the curve than EVGA.
> 
> 
> 
> how to access the voltage curve adjustment in msi afterburner?? i cant find it, thanks.

You need Afterburner version 4.3 beta 14.

Type Ctrl-F, or click the little graph icon to the left of the GPU core clock slider, and it will open a new window with the graph.


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> you need After burner version 4.3 beta 14.
> 
> type Ctrl-F or click on the little graph icon to the left of the GPU core clock slider and it will open a new window with the graph


I don't know why my MSI AB does not have that button, but Ctrl-F does the trick too, many thanks dude.

I was using these settings, and they give me ~50MHz, from 2114 to 2164, playing Witcher 3 without crashes.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> you need After burner version 4.3 beta 14.
> 
> type Ctrl-F or click on the little graph icon to the left of the GPU core clock slider and it will open a new window with the graph
> 
> 
> 
> i dont know why my msi ab does not have that button... but CRTL F does the trick too, many thanks dude
> 
> 
> 
> i was using this settings and this give me ~50mhz from 2114 to 2164 in playing witcher 3 without crashes.

You are probably using AB 4.3 beta 4. The icon was added in beta 14. You can download the update from guru3d.com.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> you need After burner version 4.3 beta 14.
> 
> type Ctrl-F or click on the little graph icon to the left of the GPU core clock slider and it will open a new window with the graph
> 
> 
> 
> i dont know why my msi ab does not have that button... but CRTL F does the trick too, many thanks dude
> 
> 
> 
> i was using this settings and this give me ~50mhz from 2114 to 2164 in playing witcher 3 without crashes.

The .950 - .975v range seems to have the best effect for adjusting the video clock, a "hidden" clock internal to the GPU made adjustable by the curve feature that was enabled with Pascal. That clock, combined with the core clock, maximizes performance.

HWiNFO64 monitors the video clock, whereas EVGA and MSI do not show it to you, as you could never adjust it directly before now.


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> you need After burner version 4.3 beta 14.
> 
> type Ctrl-F or click on the little graph icon to the left of the GPU core clock slider and it will open a new window with the graph
> 
> 
> 
> i dont know why my msi ab does not have that button... but CRTL F does the trick too, many thanks dude
> 
> 
> 
> i was using this settings and this give me ~50mhz from 2114 to 2164 in playing witcher 3 without crashes.
> 
> 
> You are probably using AB 4.3 beta 4. The icon was added in beta 14. you can download the update from guru3d.com

I am using beta 14, but with the traditional skin. It's not important now; huge thanks to you, you made my day.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> you need After burner version 4.3 beta 14.
> 
> type Ctrl-F or click on the little graph icon to the left of the GPU core clock slider and it will open a new window with the graph
> 
> 
> 
> i dont know why my msi ab does not have that button... but CRTL F does the trick too, many thanks dude
> 
> 
> 
> i was using this settings and this give me ~50mhz from 2114 to 2164 in playing witcher 3 without crashes.
> 
> 
> You are probably using AB 4.3 beta 4. The icon was added in beta 14. you can download the update from guru3d.com
> 
> 
> i am using beta 14, but in form of traditional skin, its not important now, huge thanks to you, you make my day

Ctrl-F does the job. You can try lifting the curve points between .950v and .975v in EVGA Precision and see how that works for you too.


----------



## RyanRazer

Does changing from "quality" to "performance" in the nvidia settings have any impact on gaming visuals, or on the ability to OC the GPU more? My AMP Extreme can't pass a 2088MHz core clock :/


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> crtl-F does the job. you can try lifting the curve points in Evga precision between .950-.975 and see how that works for you too.


When using MSI AB, is the max voltage still 1.09v?


----------



## gtbtk

Quote:


> Originally Posted by *RyanRazer*
> 
> Does changing from "quality" to "performance" in nvidia settings have any impact in gaming visuals and ability to OC gpu more? My amp extreme can't pass 2088mhz core clock :/


Try it and see.

I think those settings have more to do with the card at idle rather than under load, but I have never done any performance comparisons.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> crtl-F does the job. you can try lifting the curve points in Evga precision between .950-.975 and see how that works for you too.
> 
> 
> 
> by using msi ab, is the max voltage still be 1.09v?

Yes.

Both programs actually stem from the same code base.


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> yes.
> 
> both software actually stems from the same base


Thanks.
Quote:


> Originally Posted by *gtbtk*
> 
> Assuming you have the voltage at +100, Modify the curve you set up by increasing the curve point at .975 up to somewhere in the range of 1950 - 2000mhz. If the run finishes, your benchmark scores should improve. If it crashes, reduce the .975 level slightly, if it passes increase it slightly and retest until you find the maximum level you can run the .975v point while remaining stable.
> 
> If you do not add any extra voltage, use the .950 point instead of .975. MSI afterburner allows finer control of the curve than EVGA.


Would you mind showing me your curve by uploading a pic here?


----------



## pez

Quote:


> Originally Posted by *gtbtk*
> 
> This has been described in detail a number of times both here and on nvidia.com so I will be brief here.
> 
> When the micron 1070 is in a low power state and idling below about .780 - .800v, if you set a high memory overclock and put the card under load it will checkerboard artifact and then BSOD.
> 
> The reason is that the Memory VRM does not increase the voltage to the memory fast enough to match with the high frequency memory voltage requirements resulting in the memory being starved and crashing.
> 
> you can work around the problem If you lock the voltage in the curve to ensure the voltage stays above .800v. If you do that, the memory doesn't checkerboard and bsod. Of course if you overclock too far, it will start the traditional artifact/display errors you see just like any other card.
> 
> The bios fix should better co-ordinate the increased voltage supply with the increased memory clock voltage requirements.


Good info, thank you.

I've only OC'ed the 1080s I had previously in any sort of "heavy" manner, but I don't remember the memory brand on those. However, is this just for the GDDR5 cards, or GDDR5X as well? I'll check out the 1070 in my GF's PC soon enough.


----------



## gtbtk

Quote:


> Originally Posted by *pez*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> This has been described in detail a number of times both here and on nvidia.com so I will be brief here.
> 
> When the micron 1070 is in a low power state and idling below about .780 - .800v, if you set a high memory overclock and put the card under load it will checkerboard artifact and then BSOD.
> 
> The reason is that the Memory VRM does not increase the voltage to the memory fast enough to match with the high frequency memory voltage requirements resulting in the memory being starved and crashing.
> 
> you can work around the problem If you lock the voltage in the curve to ensure the voltage stays above .800v. If you do that, the memory doesn't checkerboard and bsod. Of course if you overclock too far, it will start the traditional artifact/display errors you see just like any other card.
> 
> The bios fix should better co-ordinate the increased voltage supply with the increased memory clock voltage requirements.
> 
> 
> 
> Good info, thank you
> 
> 
> 
> 
> 
> 
> 
> .
> 
> I've only OC'ed the 1080s I had previously in any sort of 'heavy' manner, but I don't remember the memory brand on those. However, is this just for the GDDR5 cards or GDDR5X as well? I'll check out the 1070 in my GFs PC soon enough.

1080s use GDDR5X memory. They do not have this issue.

The original batch of 1070s had Samsung GDDR5 memory; subsequent batches have Micron GDDR5 memory, and the bug seems to have snuck in with the change.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> yes.
> 
> both software actually stems from the same base
> 
> 
> 
> thanks
> sho
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Assuming you have the voltage at +100, Modify the curve you set up by increasing the curve point at .975 up to somewhere in the range of 1950 - 2000mhz. If the run finishes, your benchmark scores should improve. If it crashes, reduce the .975 level slightly, if it passes increase it slightly and retest until you find the maximum level you can run the .975v point while remaining stable.
> 
> If you do not add any extra voltage, use the .950 point instead of .975. MSI afterburner allows finer control of the curve than EVGA.
> 
> 
> would you mind to show me your curve by uploading a pic here??

This curve is one of my later experimental curves. It leaves the left end of the curve at stock, so it keeps the idle voltage above .800v to avoid memory checkerboarding, but it will peak at about 2136MHz. In performance mode, the idle voltage settles at the default base clock point on the curve. If you increase the slider, it pushes the idle voltage point further down the curve, below .800v.

This approach is the best way I have discovered, so far, to maximize frame rate. However, if you are too aggressive at either of the two points, it will crash the card with too much power draw. It is a matter of balance between the two.

In my case, the MSI shuts itself off if it exceeds about 220w, so I need to juggle the two points to balance power draw.
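The balancing act described above can be sanity-checked with the usual back-of-envelope rule that dynamic power scales roughly with frequency times voltage squared. Everything here, the constant `K` and the resulting wattages, is illustrative, not measured from any real card:

```python
K = 0.825  # fudge factor, picked only to land in a plausible wattage range

def est_power(mhz, volts):
    """Very rough dynamic-power estimate: P ~ f * V^2 (illustrative)."""
    return K * (mhz / 1000.0) * volts ** 2 * 100

def fits_cap(mhz, volts, cap_w=220):
    """Would this curve point stay under a ~220 W shutoff limit?"""
    return est_power(mhz, volts) <= cap_w

print(fits_cap(2136, 1.093))  # True with these made-up numbers
```

The point of the model is only that raising either the clock or the voltage point pushes you toward the cap, which is why the two curve points have to be traded off against each other.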


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> This curve is one of my later experimental curves that leaves the default curve at the left end at stock, so it keeps the idle voltage above .800 to avoid memory checkerboarding but it will peak at about 2136. The idle voltage in the performance mode settles at match the default base clock point on the curve. If you increase the slider it pushes the idle voltage point further down the voltage curve to below .800
> 
> This approach is the best way I have discovered, so far, to maximize frame rate. However, if you are too aggressive at either of the two points it will crash the card with too much power draw. It is a matter of balance between the two.
> 
> In my case the MSI shuts itself off if it exceeds about 220w so I need to juggle the 2 points to balance power draw.


If I lock the minimum voltage at .85v / 16xx MHz, then I can't get a higher clock in gaming. How do I use this lock function in MSI AB?


----------



## Arturo.Zise

Quick question.

I will be grabbing a new Panasonic 4K TV for my media room this weekend and was going to hook up a new PS4 Pro to it. But I have the ability to move my computer setup into the same room and am now considering selling my 970 and buying a 1070. Would a 1070 be able to run most games at 4k 30fps minimum? I use a controller for 90% of the games I play so I'm not fussed on a constant 60fps.

Also, would I notice a big difference, clarity-wise, going from my 32" 1440p monitor to a 4K TV? I'm about 2-3 ft from my monitor and about 6-7 ft from my TV.


----------



## asdkj1740

Quote:


> Originally Posted by *Arturo.Zise*
> 
> Quick question.
> 
> I will be grabbing a new Panasonic 4K TV for my media room this weekend and was going to hook up a new PS4 Pro to it. But I have the ability to move my computer setup into the same room and am now considering selling my 970 and buying a 1070. Would a 1070 be able to run most games at 4k 30fps minimum? I use a controller for 90% of the games I play so I'm not fussed on a constant 60fps.
> 
> Also, would I notice a big difference between my 32" 1440p monitor to a 4K TV clarity wise? I'm about 2-3ft from my monitor and about 6-7 ft from my TV.


It depends on your in-game graphics settings. If you leave AA and shadow effects at default, or set them to off, then it shouldn't be hard to hold 30fps.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> This curve is one of my later experimental curves that leaves the default curve at the left end at stock, so it keeps the idle voltage above .800 to avoid memory checkerboarding but it will peak at about 2136. The idle voltage in the performance mode settles at match the default base clock point on the curve. If you increase the slider it pushes the idle voltage point further down the voltage curve to below .800
> 
> This approach is the best way I have discovered, so far, to maximize frame rate. However, if you are too aggressive at either of the two points it will crash the card with too much power draw. It is a matter of balance between the two.
> 
> In my case the MSI shuts itself off if it exceeds about 220w so I need to juggle the 2 points to balance power draw.
> 
> 
> 
> 
> 
> 
> 
> if i lock the min voltage at .85v 16xx mhz, then i cant get higher clock at gaming, how to use this lock function in msi ab?

You need to lock the voltage at 1.093


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> This curve is one of my later experimental curves that leaves the default curve at the left end at stock, so it keeps the idle voltage above .800 to avoid memory checkerboarding but it will peak at about 2136. The idle voltage in the performance mode settles at match the default base clock point on the curve. If you increase the slider it pushes the idle voltage point further down the voltage curve to below .800
> 
> This approach is the best way I have discovered, so far, to maximize frame rate. However, if you are too aggressive at either of the two points it will crash the card with too much power draw. It is a matter of balance between the two.
> 
> In my case the MSI shuts itself off if it exceeds about 220w so I need to juggle the 2 points to balance power draw.
> 
> 
> 
> 
> 
> 
> 
> if i lock the min voltage at .85v 16xx mhz, then i cant get higher clock at gaming, how to use this lock function in msi ab?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You need to lock the voltage at 1.093

But then it won't ever go down.
Is there any way to set both the min voltage and max voltage?


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> This curve is one of my later experimental curves that leaves the default curve at the left end at stock, so it keeps the idle voltage above .800 to avoid memory checkerboarding but it will peak at about 2136. The idle voltage in the performance mode settles at match the default base clock point on the curve. If you increase the slider it pushes the idle voltage point further down the voltage curve to below .800
> 
> This approach is the best way I have discovered, so far, to maximize frame rate. However, if you are too aggressive at either of the two points it will crash the card with too much power draw. It is a matter of balance between the two.
> 
> In my case the MSI shuts itself off if it exceeds about 220w so I need to juggle the 2 points to balance power draw.
> 
> 
> 
> 
> 
> if i lock the min voltage at .85v 16xx mhz, then i cant get higher clock at gaming, how to use this lock function in msi ab?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You need to lock the voltage at 1.093
> 
> 
> but then it wont go down ever.
> is there any way to set the min voltage and max voltage?

Afterburner allows you to set one profile for 2D and a separate profile for 3D. You could save a stock 2D profile for when you are not gaming and a 3D profile for when you run 3D applications.


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> Afterburner allows you to set one profile for 2D and a separate profile for 3D. You could save a stock 2D one for when you are not gaming and a 3D profile for when you do run a 3D applications


I'm sorry, how do I set the 2D one? Or do you mean I can set another profile for non-gaming, so when I want to play games I just need to change the profile each time?


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Afterburner allows you to set one profile for 2D and a separate profile for 3D. You could save a stock 2D one for when you are not gaming and a 3D profile for when you do run a 3D applications
> 
> 
> 
> i am sorry how to set the 2d one...or you mean i can set another profile for non gaming so when i want to play games i just need to change the profile each time?

Save your overclock settings to, say, profile 1.

Reset the card to default, add a fan curve if you want, and then save those settings to profile button 2.

Then open the settings dialog and select the Profiles tab.

There you can choose a default profile for 2D (profile 2) and a default profile for 3D (profile 1).


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> Save your Overclock settings to say, profile 1
> 
> Reset the card to default, add a fan curve if you want and then save those settings in to profile button 2.
> 
> Then open the settings dialog and select the profiles tab.
> There you can choose a default profile for 2d (profile 2) and a default profile for 3d (profile 1)


I got it, thanks a lot.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Save your Overclock settings to say, profile 1
> 
> Reset the card to default, add a fan curve if you want and then save those settings in to profile button 2.
> 
> Then open the settings dialog and select the profiles tab.
> There you can choose a default profile for 2d (profile 2) and a default profile for 3d (profile 1)
> 
> 
> 
> i got it, thanks a lot

deleted


----------



## RyanRazer

Can someone please explain why I hit the thermal limit if my card is running at 53C? This is cold... When gaming at 2000MHz at stock settings I hit 67C and everything is OK; when I try to overclock, the card goes from 2113MHz and 1.093v to 2088MHz and 1.081v, damn it... Either that, or if I push the core too high, it crashes.

Can I flash some other card's BIOS with the power limit set to 125%?


----------



## gtbtk

Quote:


> Originally Posted by *RyanRazer*
> 
> Can someone please explain why i hit thermal limit if my card is running at 53C? This is cold... When gaming at 2000mhz at stock speeds i hit 67C and everything is OK, when i try to overclock, the card goes from 2113 mhz and 1.093V to 2088 and 1.081V, damn it...
> 
> 
> 
> Can i flash some other cards BIOS with power limit set to 125%?


That is not a thermal limit. That is how USB boost 3.0 works. It manages voltages and clock speeds vs temperature.

What card are you running?

You could flash another BIOS and try it, but the basic USB boost 3.0 behavior won't change, and the power draw may change, so you need to keep a close eye on it.
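The throttling being described (GPU Boost stepping clocks down as temperature rises) can be modeled roughly as below. The temperature thresholds are illustrative guesses, not NVIDIA's actual tables; only the ~13 MHz bin size reflects how Pascal clocks are commonly observed to step:

```python
BIN_MHZ = 13  # Pascal moves clocks in roughly 13 MHz steps

def boost_clock(max_clock, temp_c, thresholds=(38, 46, 54, 62)):
    """Drop one clock bin for each (made-up) temperature threshold crossed."""
    bins_dropped = sum(1 for t in thresholds if temp_c >= t)
    return max_clock - bins_dropped * BIN_MHZ

print(boost_clock(2113, 35))  # cool card: full boost, 2113
print(boost_clock(2113, 53))  # warmer: two bins down, 2087
```

This is why a card can shed clock and voltage in the 50-60C range while still being nowhere near its actual thermal limit.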


----------



## kevindd992002

Quote:


> Originally Posted by *gtbtk*
> 
> That is not a thermal limit. That is how USB boost 3.0 works. It manages voltages and clock speeds vs temperature.
> 
> What card are you running?
> 
> you could flash another bios and try it but the basic usb boost 3.0 behavior wont change and the power draw may change so you need to keep a close eye on it.


I'm sure you mean GPU Boost 3.0 instead of USB, right?


----------



## RyanRazer

Quote:


> Originally Posted by *gtbtk*
> 
> That is not a thermal limit. That is how USB boost 3.0 works. It manages voltages and clock speeds vs temperature.
> 
> What card are you running?
> 
> you could flash another bios and try it but the basic usb boost 3.0 behavior wont change and the power draw may change so you need to keep a close eye on it.


I have the AMP Extreme. I was hoping for a bit more from it, 2100-2150. Oh well, 2088 is fine as well, I guess.

I really hope we get an unlocked BIOS so we can apply a bit more voltage to it. If I get artifacts at 2100+MHz (with 1.081 - 1.093v), could that be resolved by amping up the voltage? Or does that mean my GPU just can't go any higher?


----------



## gtbtk

Quote:


> Originally Posted by *RyanRazer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> That is not a thermal limit. That is how USB boost 3.0 works. It manages voltages and clock speeds vs temperature.
> 
> What card are you running?
> 
> you could flash another bios and try it but the basic usb boost 3.0 behavior wont change and the power draw may change so you need to keep a close eye on it.
> 
> 
> 
> I have amp ext. I was hoping a bit more from it. 2100-2150. Oh well, 80 is fine as well i guess
> 
> 
> 
> 
> 
> 
> 
> I really hope we get an unlocked bios so we can apply a bit more voltage to it. if i get artifacts at 2100+mhz (with 1.081 - 1.093V), could this be resolved with amping up a voltage? Or does that mean my GPU just cant get any higher?
Click to expand...

The bios you are running is set to allow the highest power draw of any bios I am aware of (300w).

If you are getting artifacts at 2100, try lowering the middle part of the curve a few points.
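For what "lowering the middle part of the curve" means in practice: Afterburner's curve editor maps voltage points to target clocks, and you can pull down just the mid-voltage points where the artifacts occur while leaving the top of the curve alone. A toy sketch with invented voltage/clock numbers:

```python
# Illustrative sketch of "lowering the middle of the curve": an
# Afterburner-style V/F curve maps voltage points to target clocks.
# All values below are made up; dial in your own card's numbers.

curve = {0.900: 1911, 0.950: 1974, 1.000: 2025, 1.050: 2088, 1.093: 2113}

def lower_middle(curve: dict, v_lo: float, v_hi: float, bins: int) -> dict:
    """Drop clocks by `bins` 13 MHz steps for voltages in [v_lo, v_hi]."""
    return {v: f - bins * 13 if v_lo <= v <= v_hi else f
            for v, f in curve.items()}

adjusted = lower_middle(curve, 0.950, 1.050, 2)
print(adjusted[1.000])  # mid point dropped two bins: 1999
print(adjusted[1.093])  # top point untouched: 2113
```

The idea is that artifacts appearing only at intermediate load voltages can often be fixed without giving up the peak clock.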


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> The bios you are running is set to allow the highest power draw of any bios I am aware of (300w).
> 
> If you are getting artifacts at 2100, try lowering the middle part of the curve a few points.


Yeah, leaving all points at default except for raising the gaming-load voltage points (1.05-1.09V) may help a bit, at least for my card.
Also try locking 1.08V instead of 1.09V; in my case I get a higher core clock at 1.08V than at 1.09V.
Honestly though, leave it be. It's too frustrating testing all these numbers for an insignificant improvement in the actual gaming experience. Just wait and pray for a Pascal BIOS tweaker...


----------



## jlhawn

Quote:


> Originally Posted by *MyNewRig*
> 
> Are you sure this would take *MONTHS* instead of days? because if this is the case i will just return my Micron 1070 today and wait a few months for VEGA HBM2 or Pascal v2 with GDDR5X memory .. i keep checking for that BIOS update daily!
> 
> Please give me more details on how you arrived at this prediction ..


----------



## TheGlow

Quote:


> Originally Posted by *gtbtk*
> 
> deleted


Quote:


> Originally Posted by *jlhawn*


That was 6 days ago. But no way to know for sure.


----------



## asdkj1740

Quote:


> Originally Posted by *TheGlow*
> 
> That was 6 days ago. But no way to know for sure.


It is crazy that the AICs can't fix it alone... it seems NVIDIA does not allow the AICs to modify the BIOS, or they are simply unable to do it. The Pascal BIOS is well protected...


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheGlow*
> 
> That was 6 days ago. But no way to know for sure.
> 
> 
> 
> it is crazy that aic cant fix it alone.....seems nvidia does not allow aic to modify the bios..... or aic simply unable to do it...pascal bios is well protected.....
Click to expand...

If you compare all the different brands' BIOSes in a hex editor, they all have minor changes for branding, clock speeds, voltage limits, etc. The main body of each file, though, is very similar.
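A quick way to check this claim yourself is to compare two ROM dumps byte by byte and see what fraction actually differs. A minimal sketch, assuming you have already dumped the ROMs (the file names here are hypothetical; GPU-Z can save a card's BIOS to a file):

```python
# Compare two vBIOS dumps byte by byte and report how similar they are.
# File names are hypothetical placeholders; dump your own ROMs first
# (e.g. with GPU-Z's BIOS-save button).

def diff_ratio(path_a: str, path_b: str) -> float:
    """Fraction of byte positions that are identical in both files."""
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        a, b = fa.read(), fb.read()
    n = min(len(a), len(b))
    same = sum(1 for x, y in zip(a, b) if x == y)
    return same / n

# e.g. diff_ratio("msi_1070.rom", "gigabyte_1070.rom")
```

Anything close to 1.0 supports the point that only small regions (branding, clock/voltage tables) differ between vendors.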


----------



## Roland0101

Quote:


> Originally Posted by *jlhawn*
> 
> Are you the same person on the nvidia forums?
> If so some of the users on there are jerks towards you when you have been very helpful to lots of them in my opinion.


Yes I am, and thanks, but it's not a big deal, I can take care of myself.









Quote:


> Originally Posted by *gtbtk*
> 
> This forum has been at the centre of all of that.
> 
> I am the one who identified the bug and started that thread. After 17 odd pages finally managed to convince Nvidia, with Roland and GamersX assistance, that they needed to give it some attention.
> 
> Yes you are right, there are a number of know nothing blowhards on that forum that contribute nothing much that is constructive.


Thanks gtbtk, but I can't take credit for that.
My opinion was that you should seek help from your specific AIB, because it would take a new vBIOS to take care of this issue.
And while the vBIOS part was right, going directly to Nvidia's forum turned out to be the best move.

So all the credit belongs to you.


----------



## saunupe1911

Quote:


> Originally Posted by *asdkj1740*
> 
> it is crazy that aic cant fix it alone.....seems nvidia does not allow aic to modify the bios..... or aic simply unable to do it...pascal bios is well protected.....


Welp, looks like what you've got now is all the performance you'll be able to squeeze. So the rules of thumb are:

1. Boost 3.0 will throttle GPU speeds, so keep the card as cool as possible.
2. Pray for Samsung memory so you can get the best memory overclocks.

So honestly this means you really need to grab one of those water-cooled 1070s. Not sure which is the best.

Or if you stick with air cooling, you'd better get the coolest triple-fan 1070 and be sure to install it in a case with good airflow to keep temps down.
But again, this is only if you absolutely need max speeds. Locking my voltage and creating stabilized profiles got me locked in at 2050MHz (Boost 3.0 fluctuates between 2000 and 2088MHz depending on ambient temps) and 9500MHz memory for my max setting. I normally game at 2050 and 9000MHz because that's all most games need maxed out at 1080p or 2K. We'll see when Gears drops next week and Mafia 3 drops tomorrow.


----------



## Waleh

Hey guys, so I just installed my new 1070 and have been noticing that my GPU Usage is not at 99%. It's usually around 70ish. My temps are fine. I tried both BF4 and Fallout and both show the same thing. I took a picture of BF4 for you guys.

My FPS is decent (usually over 100 in BF4) but it should be higher. On a side note, my 1070 FE has Samsung memory. Any suggestions?

BF4.png 4343k .png file


----------



## RyanRazer

TNX @gtbtk and @asdkj1740


----------



## TheDeadCry

Quote:


> Originally Posted by *Waleh*
> 
> Hey guys, so I just installed my new 1070 and have been noticing that my GPU Usage is not at 99%. It's usually around 70ish. My temps are fine. I tried both BF4 and Fallout and both show the same thing. I took a picture of BF4 for you guys.
> 
> My FPS is decent (usually over 100 in BF4) but it should be higher. On a side note, my 1070 FE has Samsung memory. Any suggestions?
> 
> BF4.png 4343k .png file


No Vsync or anything enabled? I ask because I don't know if your monitor is 120Hz or greater.


----------



## Roland0101

Quote:


> Originally Posted by *Waleh*
> 
> Hey guys, so I just installed my new 1070 and have been noticing that my GPU Usage is not at 99%. It's usually around 70ish. My temps are fine. I tried both BF4 and Fallout and both show the same thing. I took a picture of BF4 for you guys.
> 
> My FPS is decent (usually over 100 in BF4) but it should be higher. On a side note, my 1070 FE has Samsung memory. Any suggestions?
> 
> BF4.png 4343k .png file


Download 3DMark. http://www.majorgeeks.com/files/details/3dmark.html
Run Fire Strike and post the link to the result page.


----------



## Waleh

There you go:

http://www.3dmark.com/3dm/15265824

I also OC'd my GPU (First time ever OC'ing). I used MSI Afterburner and did not touch my Voltage. I added 210 to the core clock and 600 to the memory clock. It was stable in Firestrike and Heaven Benchmark.

I've included those results here for you too:


----------



## BroPhilip

Quote:


> Originally Posted by *Waleh*
> 
> There you go:
> 
> http://www.3dmark.com/3dm/15265824
> 
> I also OC'd my GPU (First time ever OC'ing). I used MSI Afterburner and did not touch my Voltage. I added 210 to the core clock and 600 to the memory clock. It was stable in Firestrike and Heaven Benchmark.
> 
> I've included those results here for you too:


That 3DMark score is really low. I have the same processor and mine is hitting 15700, and my memory OC is nowhere near yours (+400 only)... it might be error correction kicking in from the memory clock being too high.


----------



## TheGlow

Quote:


> Originally Posted by *Waleh*
> 
> There you go:
> 
> http://www.3dmark.com/3dm/15265824
> 
> I also OC'd my GPU (First time ever OC'ing). I used MSI Afterburner and did not touch my Voltage. I added 210 to the core clock and 600 to the memory clock. It was stable in Firestrike and Heaven Benchmark.
> 
> I've included those results here for you too:


Yeah, I'm checking a screenshot I had and it's 15750, with 20895 for graphics. I'm on a 6600K as well and my physics score is 9427.


----------



## Waleh

I ran it again and got about the same result

http://www.3dmark.com/3dm/15266248

Just to note, my 6600k is at stock


----------



## Roland0101

Quote:


> Originally Posted by *Waleh*
> 
> There you go:
> 
> http://www.3dmark.com/3dm/15265824
> 
> I also OC'd my GPU (First time ever OC'ing). I used MSI Afterburner and did not touch my Voltage. I added 210 to the core clock and 600 to the memory clock. It was stable in Firestrike and Heaven Benchmark.


It's not terrible, but it should be better.
A 6600K should boost to 3900MHz, not 3800MHz, but that could be a simple reading error.

There is also the time measuring inaccuracy warning. Did you Alt-Tab out of the test at some point?

Download hwinfo64 > sensors only > fullscreen > open all sensors.

Set your GPU to stock.

Run Fire Strike again.

After the benchmark is finished make a screenshot of hwinfo64. Make sure all CPU, Motherboard and GPU readings are visible. (clocks, temps voltage etcetera.)


----------



## Waleh

Quote:


> Originally Posted by *Roland0101*
> 
> It's not terribly but it should be better.
> A 6600k should clock to 3900Mhz, not to 3800Mhz, but that could be a simple reading error.
> 
> There is also the Time measuring inaccuracy, did you Alt - Tab out of the test on some point?
> 
> Download hwinfo64 > sensors only > fullscreen > open all sensors.
> 
> Set your GPU to stock.
> 
> Run Fire Strike again.
> 
> After the benchmark is finished make a screenshot of hwinfo64. Make sure all CPU, Motherboard and GPU readings are visible. (clocks, temps voltage etcetera.)


There you go:


----------



## TheGlow

Quote:


> Originally Posted by *Waleh*
> 
> I ran it again and got about the same result
> 
> http://www.3dmark.com/3dm/15266248
> 
> Just to note, my 6600k is at stock


Put that K to work, that's why the K is there.
Mine is at 4.4GHz.


----------



## Roland0101

@Waleh

There is nothing glaringly wrong with these numbers. The differences in the VID voltages of your CPU cores, and the 1.000V maximum voltage the GPU reports, are things to bear in mind.

What is afterburner reporting as your max core voltage?

If you want, you can create a hwinfo64 log file (click the sheet with the green cross) and run Fire Strike again. Then upload the log file to a free file hosting site and post the link here, so we can see how your machine behaves inside the benchmark.
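For anyone digging through such a log by hand: hwinfo64 exports a plain CSV, so a few lines of Python can summarize a sensor column. The column name below is an assumption; sensor names vary by board and HWiNFO version, so check the header row of your own log:

```python
# Minimal sketch for eyeballing an exported hwinfo64 CSV log: pull one
# sensor column and report min/max/average. "GPU Clock [MHz]" is an
# assumed column name -- check the header row of your own log file.

import csv

def column_stats(log_path: str, column: str):
    """Return (min, max, mean) of a numeric column, skipping bad rows."""
    values = []
    with open(log_path, newline="", encoding="latin-1") as f:
        for row in csv.DictReader(f):
            try:
                values.append(float(row[column]))
            except (KeyError, ValueError):
                continue  # skip footer rows and blank cells
    return min(values), max(values), sum(values) / len(values)

# e.g. column_stats("testing_3.CSV", "GPU Clock [MHz]")
```

A clock column whose average sits well below its maximum during a benchmark is a quick sign of throttling.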


----------



## gtbtk

Quote:


> Originally Posted by *Roland0101*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jlhawn*
> 
> Are you the same person on the nvidia forums?
> If so some of the users on there are jerks towards you when you have been very helpful to lots of them in my opinion.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes I am, and thanks, but it's not a big deal, I can take care of myself.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> This forum has been at the centre of all of that.
> 
> I am the one who identified the bug and started that thread. After 17 odd pages finally managed to convince Nvidia, with Roland and GamersX assistance, that they needed to give it some attention.
> 
> Yes you are right, there are a number of know nothing blowhards on that forum that contribute nothing much that is constructive.
> 
> Click to expand...
> 
> Thanks gtbtk, but I can't take credit for that.
> My Opinion was that you should seek help at you specific AIB, because it would take a new vBios to take care of this issue.
> And while the VBios part was right, the approach directly at Nvidias forum turned out to be the best move.
> 
> So all the credit belongs to you.
Click to expand...

yes, but you came around in the end.


----------



## gtbtk

Quote:


> Originally Posted by *Waleh*
> 
> There you go:
> 
> http://www.3dmark.com/3dm/15265824
> 
> I also OC'd my GPU (First time ever OC'ing). I used MSI Afterburner and did not touch my Voltage. I added 210 to the core clock and 600 to the memory clock. It was stable in Firestrike and Heaven Benchmark.
> 
> I've included those results here for you too:


Your Fire Strike graphics score is OK; I am sure with tuning you can improve it a little, but your CPU/physics score is very low.

Have you overclocked your CPU at all?

Try setting your memory to the XMP profile. I see you are on an Asus board. If you are not sure what to do, try the automatic OC and see how that goes. You should be able to get a 4.4-4.5GHz OC without too much trouble. The only proviso is that the automatic utilities tend to add too much core voltage. That will not hurt you in the short term, but higher voltage = higher heat = shortened lifespan of your CPU years from now. So after you have played with that and run a few benchmarks, try reducing the vcore a step at a time and see if your PC remains stable.


----------



## gtbtk

Quote:


> Originally Posted by *Roland0101*
> 
> @Waleh
> 
> There is nothing outstanding wrong with this numbers. The difference in the VID voltage of you CPU cores and the 1.000v maximal voltage the GPU reports are to bear in mind.
> 
> What is afterburner reporting as your max core voltage?
> 
> If you want you can create a hwinfo64 log file. (click on the sheet with the green cross) and run Fire Strike again. Then lode up the log file to a free file hosting site and post the link here. So we could see how your machine behaves inside the benchmark.


@Waleh

The 107 degree C Motherboard/VRM temps look very high. Do you have any case fans providing ventilation into your case?

CPU fan speed is only running in the 1000-1100 RPM range as well. Asus lets you have automatic fan control; maybe you should look at that too.


----------



## Roland0101

Quote:


> Originally Posted by *gtbtk*
> 
> yes, but you came around in the end.


Ok, I take that.








Quote:


> Originally Posted by *gtbtk*
> 
> Firestrike Graphics score is OK, I am sure with tuning you can improve that a little bit but your cpu/physics score is very low.
> 
> Have you overclocked your CPU at all?


No, he has not, and that is the main thing I am focusing on as well: only one core going to 100% usage, and the different VID voltages.
But I would rather look into the problems at stock than overclock a CPU that is not working entirely correctly.
Quote:


> Originally Posted by *gtbtk*
> 
> @Waleh
> 
> The 107 degree C Motherboard/VRM temps look very high. Do you have any case fans providing ventilation into your case?
> 
> CPU Fan speed is only running in the 1000-1100 range as well. Asus lets tou have automatic fan control. Maybe you should look at that as well


I doubt that the 107 degrees Celsius reading is correct, it would not stay that way without load.


----------



## gtbtk

Quote:


> Originally Posted by *Roland0101*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> yes, but you came around in the end.
> 
> 
> 
> Ok, I take that.
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Firestrike Graphics score is OK, I am sure with tuning you can improve that a little bit but your cpu/physics score is very low.
> 
> Have you overclocked your CPU at all?
> 
> Click to expand...
> 
> No he has not, and that is the main thing I focus on as well. Only one core going to 100% use and the different VID voltages.
> But I would rater look at the problems on stock then overclock a not entirely correct working CPU.
Click to expand...

I think cpu/vrm cooling is the first priority. If that is cooler, then the cpu or voltages will not throttle back giving better performance


----------



## Roland0101

Quote:


> Originally Posted by *gtbtk*
> 
> I think cpu/vrm cooling is the first priority. If that is cooler, then the cpu or voltages will not throttle back giving better performance


I edited my post.
As said, I doubt that the Temp3 reading is correct; there is no change without load, and the CPU is staying pretty cool.

I would like to see how the CPU behaves during Fire Strike in a hwinfo log file.


----------



## Waleh

Quote:


> Originally Posted by *gtbtk*
> 
> @Waleh
> 
> The 107 degree C Motherboard/VRM temps look very high. Do you have any case fans providing ventilation into your case?
> 
> CPU Fan speed is only running in the 1000-1100 range as well. Asus lets tou have automatic fan control. Maybe you should look at that as well


This is an ITX case. I have one Noctua Fan at the front as intake. These are my curves from Asus Fan Xpert (I set it to the silent preset). I have not OC'd as I have never done it before and I'm worried that my cooling is inadequate (I have a small air cooler, AXP 100)


----------



## Waleh

Quote:


> Originally Posted by *Roland0101*
> 
> I edited my post.
> As said, I doubt that the Temp3 reading is correct; there is no change without load, and the CPU is staying pretty cool.
> 
> I would like to see how the CPU behaves during Fire Strike in a hwinfo log file.


I'll do that now for you


----------



## gtbtk

Quote:


> Originally Posted by *Waleh*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> @Waleh
> 
> The 107 degree C Motherboard/VRM temps look very high. Do you have any case fans providing ventilation into your case?
> 
> CPU Fan speed is only running in the 1000-1100 range as well. Asus lets tou have automatic fan control. Maybe you should look at that as well
> 
> 
> 
> This is an ITX case. I have one Noctua Fan at the front as intake. These are my curves from Asus Fan Xpert (I set it to the silent preset). I have not OC'd as I have never done it before and I'm worried that my cooling is inadequate (I have a small air cooler, AXP 100)
Click to expand...

Silent fan profiles will hobble performance. Noctua fans are really quiet anyway, so I would suggest trying the standard or even the turbo profile for the chassis and CPU fan settings, particularly when you are benchmarking. You can always switch back when you are not loading up the PC, or you may find they are not too noisy for daily use anyway.

Get the cooling balance right and you should be able to run at least a small overclock, up to say 4.2GHz, without running into temperature problems.


----------



## Waleh

@Roland0101

This is my test file:
http://www.filedropper.com/testing_3


----------



## Waleh

Quote:


> Originally Posted by *gtbtk*
> 
> Silent fans will hobble the performance. Noctua fans are really quiet anyway so I would suggest just trying the standard or even a turbo profile for the chassis and CPU fan settings, particularly when you are bench marking. You can always switch back if you are not loading up the PC or You may find that they are not too noisy anyway for daily use anyway.
> 
> Get the cooling balance right and you should be able to run at least a small overclock up to say 4.2Ghz without running into temp problems.


I changed them to Turbo now. In terms of OC, do I just sync all cores, change the multiplier to 42, and set a manual voltage? What should my voltage be? Also, I once tried to set a manual voltage, but for some reason HWMonitor started to report a higher voltage than what I set, so I went back to stock. Do I need the latest BIOS or something?


----------



## smkd13

Quote:


> Originally Posted by *abdidas*
> 
> Just got an EVGA FTW edition, holy moly this thing is quiet. Couldn't be happier with the upgrade, btw I came from a GTX 550 ti so this is a huge step up for me.


funny cause that is what i am gonna be upgrading from. congrats and hope to join the club soon


----------



## gtbtk

Quote:


> Originally Posted by *Waleh*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Silent fans will hobble the performance. Noctua fans are really quiet anyway so I would suggest just trying the standard or even a turbo profile for the chassis and CPU fan settings, particularly when you are bench marking. You can always switch back if you are not loading up the PC or You may find that they are not too noisy anyway for daily use anyway.
> 
> Get the cooling balance right and you should be able to run at least a small overclock up to say 4.2Ghz without running into temp problems.
> 
> 
> 
> I changed them to Turbo now. In terms of OC, do I just sync all cores change the multiplier to 42 and set a manual voltage? What should my voltage be? Also, once I tried to set a manual voltage but for some reason HWmonitor started to report a higher voltage than what I set so I went back to stock. Do I need to have the latest BIOS or something?
Click to expand...

How is the noise with the turbo settings?

A new BIOS is up to you, but if it ain't broken, I don't think there is a need right now.

I would suggest saving your existing profile as one of the user-definable presets so you can always revert to it later if things go wrong.

Then I would suggest that you reset to optimized defaults and check your SATA settings to make sure any SATA drives you have are enabled and set correctly.

You can try setting all cores to, say, 42. Leave all voltages on auto to start with and see how that goes. If it doesn't boot, or if it crashes, look at increasing the vcore voltage slightly and retest. Keep a close eye on your temps.

After that is stable, look at the RAM settings, select XMP for your RAM, and test again.


----------



## Waleh

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Waleh*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Silent fans will hobble the performance. Noctua fans are really quiet anyway so I would suggest just trying the standard or even a turbo profile for the chassis and CPU fan settings, particularly when you are bench marking. You can always switch back if you are not loading up the PC or You may find that they are not too noisy anyway for daily use anyway.
> 
> Get the cooling balance right and you should be able to run at least a small overclock up to say 4.2Ghz without running into temp problems.
> 
> 
> 
> I changed them to Turbo now. In terms of OC, do I just sync all cores change the multiplier to 42 and set a manual voltage? What should my voltage be? Also, once I tried to set a manual voltage but for some reason HWmonitor started to report a higher voltage than what I set so I went back to stock. Do I need to have the latest BIOS or something?
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> How is the noise with turbo settings?
> 
> new bios is up to you but if it aint broken, I dont think there is a need right now.
> 
> I would suggest saving your existing profile as one of the user definable presets so you can always revert to it later if things go wrong.
> 
> Then I would suggest that you reset to optimum defaults and check your sata settings to make sure any sata drives you may have are enabled and set correctly.
> 
> You can try setting all cores to say 42. Leave all voltages on auto to start with. See how that goes for you. if it doesn't boot or if it crashes, look at increasing the vcore voltage slightly and retest. Keep a close eye on your temps
> 
> after that is stable, look at the RAM settings and select XMP for your ram and test again
Click to expand...

At idle it seems pretty quiet. I haven't tested during gaming, but I don't mind higher acoustics there since I have headphones on, as long as it's not a jet engine. Let's say I boot into Windows; what should I use to test my OC? I heard x264 is pretty good. I even downloaded a custom version of x264 from the Skylake thread. Should I use that?


----------



## gtbtk

Quote:


> Originally Posted by *Waleh*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Waleh*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Silent fans will hobble the performance. Noctua fans are really quiet anyway so I would suggest just trying the standard or even a turbo profile for the chassis and CPU fan settings, particularly when you are bench marking. You can always switch back if you are not loading up the PC or You may find that they are not too noisy anyway for daily use anyway.
> 
> Get the cooling balance right and you should be able to run at least a small overclock up to say 4.2Ghz without running into temp problems.
> 
> 
> 
> I changed them to Turbo now. In terms of OC, do I just sync all cores change the multiplier to 42 and set a manual voltage? What should my voltage be? Also, once I tried to set a manual voltage but for some reason HWmonitor started to report a higher voltage than what I set so I went back to stock. Do I need to have the latest BIOS or something?
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> How is the noise with turbo settings?
> 
> new bios is up to you but if it aint broken, I dont think there is a need right now.
> 
> I would suggest saving your existing profile as one of the user definable presets so you can always revert to it later if things go wrong.
> 
> Then I would suggest that you reset to optimum defaults and check your sata settings to make sure any sata drives you may have are enabled and set correctly.
> 
> You can try setting all cores to say 42. Leave all voltages on auto to start with. See how that goes for you. if it doesn't boot or if it crashes, look at increasing the vcore voltage slightly and retest. Keep a close eye on your temps
> 
> after that is stable, look at the RAM settings and select XMP for your ram and test again
> 
> Click to expand...
> 
> On idle it seems pretty quiet. I haven't tested during gaming but I don't mind higher acoustics there since I have headphones on as long as it's not a plane engine. Let's say I boot into windows, what should I use to test my OC? I heard x264 is pretty good. I even downloaded a custom version of x264 from the skylake thread. Should I use that?
Click to expand...

AIDA64 is a handy utility that can stress test a CPU and run benchmarks.

The demo mode works for a couple of weeks without a key, so you could try that.

RealBench is a free one that you could use as well.


----------



## Forceman

Maybe y'all should turn off the nested quoting.


----------



## Waleh

Quote:


> Originally Posted by *Forceman*
> 
> Maybe y'all should turn off the nested quoting.


Sorry about that!

@gtbtk

So, I'm at 4.2GHz right now with the voltage on auto. I'm using the AIDA64 system stability test to stress the CPU. My max temp is 67C after about 10 minutes of testing (monitoring with HWMonitor). The CPU fan is at ~1800 RPM and the chassis fan is at 984 RPM. It's not very loud either; definitely good acoustics for me. How are things looking so far? Do I keep this test running overnight? XMP is also enabled.


----------



## gtbtk

Quote:


> @gtbtk
> 
> So, I'm at 4.2 right now with the voltage on auto. I'm using the AIDA64 systems stability test to stress the CPU. My max temp is 67 and it's been about 10 min of testing (monitoring with HWMonitor). The CPU fan is at 1800ish rpm and the chassis fan is at 984 rpm. It's not very loud either. Definitely good acoustics for me. How are things looking so far? Do I keep this test running overnight? XMP is also enabled.


That all looks pretty good, and the temp looks OK. The beauty of this is that you can experiment with different multipliers and see how it goes; if it fails, you can revert to settings that you know work well.

If you have gone 15-20 minutes, you can stop the stress test, as your temps should have leveled out by now. If you are at 67C they are fine. Give some of your benchmarks a try and see how it goes.

I would expect the Heaven benchmark at the settings you posted to be around 100ish FPS and the Fire Strike physics score to be around 10000ish rather than in the 7000s.

If you want, you can go back to a stress test later, but the benchmarks are also stressing your system.


----------



## Waleh

@gtbtk

So,

This is my new score. How does that look? Physics definitely went up quite a bit. In Heaven I got 102.2 FPS.









Also, is 210 on the core and 600 on the memory a pretty good OC? Today was my first time OC'ing my GPU and CPU.









http://www.3dmark.com/3dm/15268550


----------



## Roland0101

@Waleh

Ok, the first picture is virtual memory committed.

https://flic.kr/p/LY9oqa

The green line is my system and the red is yours. This picture is just for you to see that we ran the same test at the same time.

Edit: due to new testing, see the next page.

The fourth picture is GPU usage. The green arrow shows the combined test at the end of Fire Strike. As you can see, your GPU load is caving in. This is not because of a GPU problem; it's most likely because your CPU, which has to handle the physics as well, cannot give the GPU enough to do (in very simple terms).

https://flic.kr/p/MKAVZE

All in all, your CPU is simply not working as it should. It also never goes to its 3900MHz boost clock for more than a second.

Where is the problem?
If your VRMs are really that hot, that would be an explanation. I would go to the hwinfo64 forum and ask whether the readings for your motherboard are correct, or whether the sensor is just not getting any data.
Alternatively, you could measure them yourself if you have a suitable thermometer.

Did you change your BIOS settings in any way? Clearing your CMOS and applying default settings could be a good idea.
What BIOS are you on?
Did you install the newest chipset driver for your motherboard?
Do you have a different PSU over 500W you could test with?

Edit: the OC settings look much better. Still, I would investigate the problem at stock; it could come back around if you don't take care of it.


----------



## Waleh

@Roland0101 and @gtbtk

I need to thank you guys greatly. You're really helping me a lot! I posted on the HWiNFO forum about the VRM temps and I'm waiting on a response.

Did you change your BIOS settings in any way? Clearing your CMOS and applying default settings could be a good idea. - All I did in the BIOS was enable XMP (prior to the OC; now I have also set the cores to the 42x multiplier).

What BIOS are you on? - I am on 0702 (quite an old version, maybe I'll update that).

Did you install the newest chipset driver for your motherboard? - I will do that now.

Do you have a different PSU over 500W you could test with? - I do not. This PSU should be able to handle the system, though.


----------



## gtbtk

Quote:


> Originally Posted by *Waleh*
> 
> @Roland0101 and @gtbtk
> 
> I need to thank you guys greatly. You're really helping me a lot! I posted in HWinfo about the VRM temps and I'm waiting on a response.
> 
> Did you changed your Bios settings in any way? Clearing your CMOS and applying default settings could be a good idea. All I did in BIOS was enable XMP (Prior to OC, now I also set cores to 42x multplier)
> 
> What Bios are you on? I am on 0702 (Quite old version, maybe I'll update that)
> 
> Did you installed the newest chip-set driver for your motherboard? I will do that now.
> 
> Do you have a different over 500w PSU you could test with? I do not. This PSU should be able to handle the system though.


Are your motherboard readings still sitting at 107 degrees?


----------



## Waleh

Quote:


> Originally Posted by *gtbtk*
> 
> Are your motherboard readings still sitting at 107 degrees?


Yeah, they are. I'm going to try setting my bios to default, updating it, and then applying OC again.


----------



## Roland0101

Quote:


> Originally Posted by *Waleh*
> 
> @Roland0101 and @gtbtk
> 
> I need to thank you guys greatly. You're really helping me a lot! I posted in HWinfo about the VRM temps and I'm waiting on a response.


No problem.
Let us know how it goes. And yes, your PSU should easily handle your system; the question is whether it works correctly. But I also don't think it's the culprit.


----------



## asdkj1740

Quote:


> Originally Posted by *saunupe1911*
> 
> Welp, looks like what you got now is all the performance you will be able to squeeze. So the rules of thumb are
> 
> 1. Boost 3.0 will throttle GPU speeds, so keep the card as cool as possible.
> 2. Pray for Samsung memory so you can get the best overclocking speeds for the RAM.
> 
> So honestly this means you really need to grab one of those water cooled 1070s. Not sure which is the best.
> 
> Or if you grab a blower style then you better get the coolest 3 fan 1070 and be sure to install it into a cool PC case with good air flow to keep temps down.
> But again this is only if you absolutely need max speeds. Locking my voltage and creating stabilized profiles got me locked in at 2050 MHz (Boost 3.0 fluctuates between 2000 and 2088 MHz depending on ambient temps) and 9500 MHz memory for my max setting. I normally game at 2050 and 9000 MHz because that's all that most games need at maxed 1080p or 2K. We'll see when Gears drops next week and Mafia 3 drops tomorrow.


If that were the case, then EVGA Kingpin and Galax HOF's Mad Tse wouldn't use boost-off BIOSes for their LN2 world records.
GPU Boost is not that simple, and it's totally unfriendly to overclockers, like those crazy Maxwell modded-BIOS guys out there who are willing to pull ~100W more just for a ~50MHz gain in core clock.


----------



## RyanRazer

I just thought I'd drop my $0.02 here









http://www.3dmark.com/3dm/15271896

EDIT: wow, now I see my CPU only boosts to 3.8 GHz. It's not a K, but Intel says 3.6 - 4.0 GHz boost.. Hmmm


----------



## gtbtk

Quote:


> Originally Posted by *RyanRazer*
> 
> I just thought i drop my $0.02 here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/15271896
> 
> EDIT: wow now i see my CPU only boosts to 3.8 Ghz. It's not a K but intel says 3.6 - 4.0 Ghz boost.. Hmmm


4.0 would be for a single core, 3.9 for 3 cores, 3.8 for 2 cores


----------



## RyanRazer

tnx


----------



## gtbtk

Correction: 3.9 is 2 cores, 3.8 is 3 cores


----------



## Roland0101

@Waleh

Ok, I was at a friend's house who runs an i5 6600K this morning to help him with a problem on his machine.
After I was finished I took the opportunity to test a little.

The voltage behavior at BIOS defaults is exactly the same as with your processor, so that is actually nothing unusual, just the dynamic voltage/frequency behavior a Skylake i5 exhibits at stock settings.

That doesn't change the fact that your Physics score in Fire Strike is low, and that Temp3 and the following temperatures are too high, if they are correct.

What is Afterburner reporting as your max GPU core voltage under load?


----------



## Waleh

@Roland0101

So, a few things.

I got a response from the HWinfo Admin and it seems like the temp readings 1) are not VRMs 2) are not accurate and can be disregarded. This is what he said:

"Why do you think that Temp 3-6 are VRM temperatures? Those are just generic values from the sensor, which I believe are not correct (most probably not connected inputs on your mainboard providing erratic values). So I suggest not to worry about those and just ignore them."

Second, I OC'd my CPU to 4.3 GHz on stock voltage and ran the x264 OCN stress test overnight with no crashes, as well as AIDA64 for an hour or so (the OCN stress test was more taxing), and I averaged around 68 degrees during the stress periods.

I also redid the firestrike test and got:

http://www.3dmark.com/3dm/15276328

OC'ing my CPU seems to have helped in games too. GPU usage has gone up quite a bit in some titles like BF4. Again, my 1070 is at +210 MHz on the core clock and +600 MHz on the memory (Samsung memory, FE). I could probably go higher but I'll just settle with this for now.









Does that look better now?


----------



## Roland0101

Quote:


> Originally Posted by *Waleh*
> 
> @Roland0101
> 
> So, a few things.
> 
> I got a response from the HWinfo Admin and it seems like the temp readings 1) are not VRMs 2) are not accurate and can be disregarded. This is what he said:
> 
> "Why do you think that Temp 3-6 are VRM temperatures? Those are just generic values from the sensor, which I believe are not correct (most probably not connected inputs on your mainboard providing erratic values). So I suggest not to worry about those and just ignore them."


Yes, that was what i expected.
Quote:


> Second, I OC'd my CPU to 4.3 on stock voltage and ran the x264 OCN Stress test overnight with no crashes as well as aida x64 for an hour or so (The OCN Stress test was more taxing) and I averaged around 68 degrees during the stress periods.
> 
> I also redid the firestrike test and got:
> 
> http://www.3dmark.com/3dm/15276328
> 
> OC'ing my CPU seems to have helped in games too. GPU usage has gone up in some titles like BF4 quite a bit. Again, my 1070 has 210 MHz on the core clock and 600 MHz on the memory (Samsung memory, FE). I could probably go higher but I'll just settle with this for now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Does that look better now?


Yes it does.
The only thing I would look at is vcore and VID under load. The BIOS will adjust these settings automatically on auto. I don't think the values are too high considering your temps, but it's better to be sure.


----------



## RyanRazer

Any of you guys play PARAGON?


----------



## Deadcry

Quote:


> Originally Posted by *Waleh*
> 
> On idle it seems pretty quiet. I haven't tested during gaming but I don't mind higher acoustics there since I have headphones on as long as it's not a plane engine. Let's say I boot into windows, what should I use to test my OC? I heard x264 is pretty good. I even downloaded a custom version of x264 from the skylake thread. Should I use that?


I use RealBench; it works really well. I start with the 15-minute test, then run it for 4-8 hours overnight. It will run and notify you of instabilities. I believe you can download it from Asus' website.


----------



## Waleh

Quote:


> Originally Posted by *Roland0101*
> 
> Yes, that was what i expected.
> Yes it does.
> The only thing I would look at is vcore and VID under load. The bios will adjust this settings automatically on auto. I don't think that the values are to high considering your Temps, but it's better to be sure.


So I ran x264 (OCN version) again for about 30 min to stress the CPU. I used HWmonitor to look at the values.

Vcore- Value (during stress): 1.232 V Min: 0.608 V Max: 1.248 V

VID- Value (during stressing): 1.213 V Min: 1.193 V Max: 1.256 V


----------



## Roland0101

Quote:


> Originally Posted by *Waleh*
> 
> So I ran x264 (OCN version) again for about 30 min to stress the CPU. I used HWmonitor to look at the values.
> 
> Vcore- Value (during stress): 1.232 V Min: 0.608 V Max: 1.248 V
> 
> VID- Value (during stressing): 1.213 V Min: 1.193 V Max: 1.256 V


Completely OK.


----------



## Waleh

@Roland0101. I must thank you again! I appreciate all your help. Definitely a +rep


----------



## RyanRazer

Who has the balls to do that?









GTX 1080 FE + GTX 1070 FE Power Limit Mod - Unlock the Power Target


----------



## TheGlow

Quote:


> Originally Posted by *RyanRazer*
> 
> 
> 
> 
> 
> 
> Who has the balls to do that?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GTX 1080 FE + GTX 1070 FE Power Limit Mod - Unlock the Power Target


This is old. I believe a bunch of people already did it.


----------



## Prozillah

Quote:


> Originally Posted by *GunnzAkimbo*
> 
> my 3 x 680s score 4900, can't justify spending $650 for a 1070 and getting exactly the same performance. im gonna need a bigger boat, i mean gfx card.


If the only thing ur doing is benchmarking then sure
Quote:


> Originally Posted by *RyanRazer*
> 
> 
> 
> 
> 
> 
> Who has the balls to do that?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GTX 1080 FE + GTX 1070 FE Power Limit Mod - Unlock the Power Target


I did it on my Gaming G1 1070 - worked an absolute treat. 2100 core / +500 mem, consistent, no drops. Definitely recommend it if you have the ability to do it.


----------



## RyanRazer

Quote:


> Originally Posted by *Prozillah*
> 
> If the only thing ur doing is benchmarking then sure
> I did it on my Gaming G1 1070 - worked an absolutely treat. 2100 core 500 mem consistent no drops. Definitely rate it u have the ability to do it


Those numbers are relative. Some hit 2100 core and +500 mem without bypassing that resistor... if you got only a ~50MHz bump in core clock, I don't think it is worth the hassle. Or is it? What were the numbers before?
How's the warranty with that? I understand that the liquid metal can be completely removed, so RMA would still be possible; the only thing to check beforehand is whether the specific card has a mechanism for them to see if the card was disassembled, right?


----------



## Prozillah

Quote:


> Originally Posted by *RyanRazer*
> 
> those numbers are relative. Some hit 2100 core and 500 mem without bypassing that resistor... if you got like 50mhz bump in core clock, i don't think it is worth the hustle. or is it? What were the numbers before?
> How's with warranty with that. I understand that liquid metal can be completely removed so RMA would be still possible, the only thing to check before doing is if specific card has mechanism for them to see if card was disassembled, right?


Yeah, you're right, I didn't gain any additional OC, but I did gain consistency - the card no longer power throttles, which it would do before in certain apps and games. It just increases the headroom to give it a smoother level of power available. I believe the constant dips in volts and power can cause instability and potential stutters etc. That was my reason for doing it.

And yes the metal can be completely removed.


----------



## RyanRazer

Thanks for that.


----------



## TheDeadCry

Quote:


> Originally Posted by *Forceman*
> 
> Maybe y'all should turn off the nested quoting.


shhhhh...


----------



## Roland0101

Quote:


> Originally Posted by *Waleh*
> 
> @Roland0101. I must thank you again! I appreciate all your help. Definitely a +rep


No problem, have fun with the new toy.


----------



## RyanRazer

Quote:


> Originally Posted by *Prozillah*
> 
> Yea your right I didn't gain any additional OC but I did gain consistency - the card no longer power throttles which it would prior in certain apps and games. It just increases the headroom to give it a smoother level of power available. I believe the constant dips in volts and power can cause instability and potential stutters etc. That was my reason for doing it.
> 
> And yes the metal can be completely removed.


Sorry to bother you again, but what did you use to bridge that resistor? Would something like that work, or is there a dedicated liquid-metal compound for better conductivity? Coollaboratory Ultra


----------



## criminal

Quote:


> Originally Posted by *RyanRazer*
> 
> Sorry to bother you again but what did you use to bridge that resistor? Would something like that work, or is there dedicated liquid-metal compound for better conductivity? Coollaboratory Ultra


That would work or this: https://www.amazon.com/gp/product/B01A9KIGSI/ref=oh_aui_detailpage_o05_s00?ie=UTF8&psc=1


----------



## AngryLobster

gtbtk fighting the 17 day long battle for 1 FPS.


----------



## zipper17

Fully Functional unlockable 3/4 Way Pascal card. [X-post from Titan X Pascal Owner Thread.]
Quote:


> Originally Posted by *Ghoxt*
> 
> Source: Functional 4 Way SLI Titan X Pascal - in several games - how he did it, in his words.
> 
> Educational/ Info only on how one guy says he did it in his niche "server" setup with listed games
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Current build:
> 
> Supermicro X10DRG-Q _(One of the few boards that numbers the PCI slots in reverse. Slot #1 is furthest away from CPU) hmm_
> 
> 2x E5-2699 v4 (44 cores/ 88HT) (3.7ghz turbo)
> 
> 512GB ram DDR4 2400mhz ecc reg
> 
> (QUAD SLI) 4 Titan X PASCAL
> 
> 2x Samsung NVMe 961 Pro pcie 3.0 (os drives)
> 
> 10x Samsung 850 Pro SSD RAID (apps drive)
> 
> LG 31MU97z 4096x2160 true 4K rev C
> 
> modded P5 case, Noctua heatsinks
> 
> Digital power supply 1650w
> 
> Windows Server 2012 R2 Data Center, Ubuntu 15
> 
> Again this is as far as I'm concerned, info only, about a wild one-off. But this is OCN so nothing can be considered out of bounds as we have the crazy 1%'ers here as well.
> 
> P.S., the politics of Nvidia shutting >2 way SLI down, someone else can talk to it. My personal opinion is that ultimately even for me it's too expensive to play with, for it to not work on a whim or driver update etc... I'm going single card next gen.


Interesting read. 3-/4-way can actually work and scale properly (using all 4 GPUs) like in the old days; the problem is just a software/hardware limitation from Nvidia that requires a proper HB SLI bridge and custom SLI profiles.


----------



## gtbtk

Quote:


> Originally Posted by *AngryLobster*
> 
> gtbtk fighting the 17 day long battle for 1 FPS.


What?


----------



## BroPhilip

Quote:


> Originally Posted by *gtbtk*
> 
> What?


Haters going to hate lol...


----------



## jlhawn

Quote:


> Originally Posted by *TheGlow*
> 
> This is old. I believe a bunch of people already did it.


I have done this on a GTX 680 and GTX 970; it works well.
Here is another guide, which I think is better than the video.


----------



## madmeatballs

Anyone tried the new driver (373.06)? Someone claimed it improved their OC "a bit" lol.


----------



## asdkj1740

Quote:


> Originally Posted by *Prozillah*
> 
> Yea your right I didn't gain any additional OC but I did gain consistency - the card no longer power throttles which it would prior in certain apps and games. It just increases the headroom to give it a smoother level of power available. I believe the constant dips in volts and power can cause instability and potential stutters etc. That was my reason for doing it.
> 
> And yes the metal can be completely removed.


""I believe the constant dips in volts and power can cause instability and potential stutters etc. ""
thats true and thats why maxwell bios post will turn off the gpu boost 2.0 completely in most of the cases.
frame time is related and fluctuated according to fps, like 60fps 16.6ms, and a stable frame time helps a lot in smoothing out stutterings
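The frame-time figure quoted above is just the reciprocal of the frame rate; a quick sketch (the FPS values are arbitrary examples):

```python
# Frame time in milliseconds for a given frame rate; a steady
# frame time is what makes motion feel smooth.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 120, 144):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")
```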


----------



## asdkj1740

Quote:


> Originally Posted by *jlhawn*
> 
> I have done this on a GTX 680 and GTX 970, works good.
> Here is another guide which I think is better than the video.


You don't need to do that on those two cards... there are already BIOS tweakers for Kepler and Maxwell....


----------



## jlhawn

Quote:


> Originally Posted by *asdkj1740*
> 
> dont need to do that on these two cards...there are already bios tweakers for kepler and maxwell....


Agreed, I'm not doing it to my 1070


----------



## asdkj1740

Quote:


> Originally Posted by *jlhawn*
> 
> Agreed, I'm not doing it to my 1070


Even if you change the cooling to an AIO solution you may still see 1-2 bins of core clock drop (13 MHz each). I found that even when the temp stays very low, around 50C, with the voltage locked, the core clock may not stay constant, and I guess that is related to power usage... so I think using Liquid Ultra to short those shunts, combined with an AIO, should give you the most stable core clock experience. However, if you run 3DMark Fire Strike Ultra you may find that the higher and steadier your core clock, the lower your score.


----------



## TheDeadCry

Quote:


> Originally Posted by *AngryLobster*
> 
> gtbtk fighting the 17 day long battle for 1 FPS.


Erm... This is overclock.net... I expect no less than this.


----------



## gtbtk

Quote:


> Originally Posted by *BroPhilip*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> What?
> 
> 
> 
> Haters going to hate lol...
Click to expand...

He obviously likes blue screens of death and putting his data at risk. It seems he doesn't use his computer for anything important in any way, and believes everyone else should be the same.


----------



## RyanRazer

Quote:


> Originally Posted by *jlhawn*
> 
> I have done this on a GTX 680 and GTX 970, works good.
> Here is another guide which I think is better than the video.


thnx for the link bro


----------



## kevindd992002

Oh God! Thank you, Samsung!











I did expect this though. All the stock we have in the Philippine market was manufactured around June.


----------



## MyNewRig

Quote:


> Originally Posted by *kevindd992002*
> 
> Oh God! Thanks for Samsung
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I did expect this though. All stocks that we have in the Philippine market are manufactured around June.


Oh.. oh, look who got lucky







enjoy it man ..

Rumors about a Pascal refresh "GeForce 20" series started surfacing a few hours ago, along with news of Vega launching in a couple of months or so. I am returning my broken Micron GTX 1070 today and will wait for one of these instead; the GTX 2070 should come with GDDR5X clocked at 10 Gb/s, and Vega with HBM2. It is going to be a few months' wait, but it looks very much worth it with the current messed-up Micron GDDR5 situation. I hope AMD manages to surprise us with quality and performance this time around, because I am really itching to switch to AMD after the horrible experience with Nvidia's x70 products for two generations in a row!


----------



## kevindd992002

Quote:


> Originally Posted by *MyNewRig*
> 
> Oh.. Oh look who's got lucky
> 
> 
> 
> 
> 
> 
> 
> enjoy it man ..
> 
> Rumors about Pascal Refresh "GeForce 20" Series started surfacing a few hours ago along with VEGA news launching in a couple of months or so, i am returning my broken Micron GTX 1070 today and will wait for any of these instead, GTX 2070 should come with GDDR5X clocked at 10 Gb/s and VEGA with HBM2 .. it is going to be a few months wait but looks very much worth it with the current messed up Micron GDDR5 situation, i hope AMD manages to surprise us with quality and performance this time around because i am really itching to switch to AMD after that horrible experience with Nvidia's x70 products for two generations in a row!


Thanks!

Yeah, I did read about that yesterday. I came from the 600 series (2 x 670's) so it would kill me if I waited until next year for the 20 series to come out. I can't go to the other side either as my monitor is a GSYNC monitor. So I hope the 1070 serves me well.


----------



## vloeibaarglas

Looks like Palit pushed an update. Can someone with a Palit Micron card update it and dump the BIOS for us? A Micron card is needed since the BIOS update utility will likely flash a different BIOS based on your VRAM manufacturer.

All the different card versions link to the same update utility (Dual vs Gamerock, etc): http://www.palit.com/palit/vgapro.php?id=2639&lang=en&pn=NE51070H15P2-1041G&tab=do


----------



## TheGlow

Quote:


> Originally Posted by *vloeibaarglas*
> 
> Looks like Palit pushed an update. Can someone with Palit Micron update their card and dump us the BIOS? A Micron card is needed since the bios update utility will likely flash a different bios based on your VRAM manufacturer.
> 
> All the different card versions link to the same update utility (Dual vs Gamerock, etc): http://www.palit.com/palit/vgapro.php?id=2639&lang=en&pn=NE51070H15P2-1041G&tab=do


The vbios are probably coming out within a day or so, just because MyNewRig sent his card back.


----------



## vloeibaarglas

Brave souls who are not afraid of bricking, can recover, or have dual BIOS can try this dump:

https://www.reddit.com/r/56bc17/vbios_update_released_for_palit_gtx_1070/d8i0yzk

Poster claims to have dumped an updated Micron card.


----------



## F3niX69

Quote:


> Originally Posted by *TheGlow*
> 
> The vbios are probably coming out within a day or so, just because MyNewRig sent his card back.


Ahahahaha i really laughed on that one.

I hope ASUS releases an update soon. I am waiting for their vBIOS.


----------



## Waleh

Hey guys, just a quick question. I recently purchased a 1070 FE and OC'd it to +210 MHz on the core and +600 MHz on the memory, and it's stable at these values. However, I sometimes notice that my core clock doesn't stay stable; it fluctuates from 2075ish to 1950ish. I know that GPU Boost does this, but is there a way to keep the core clock constant? Should I be adjusting the power limit? I have it at the default power limit and I do notice that it sometimes goes to 101% in games. Should I increase the power limit a few percent in MSI Afterburner?

Also, does adjusting the power limit mean that the GPU will use more than 150 W or is the maximum power limit taken into consideration with the 150 W value?

Finally, is there a specific temperature where this card throttles? I usually max at 75 degrees but I can set the fan curve a bit higher to bring down the temperatures further if needed. Thanks!


----------



## Roland0101

Quote:


> Originally Posted by *Waleh*
> 
> Hey guys, just a quick question. I recently purchased a 1070 FE and OC'd it to 210 Mhz on the core and 600 Mhz on the memory and have stability with these values. However, I sometimes notice that my core clock doesn't stay stable, it fluctuates from 2075ish to 1950ish. I know that GPU boost does this but is there a way to keep the core clock constant? Should I be adjusting the power limit? I have it at the default power limit and I do notice that it sometimes goes to 101% in games. Should I increase the power limit a few percent higher is MSI afterburber?


You are running into the power limit if you don't adjust it. If you overclock that high you should set it to max. But monitor the temps so that they don't hinder you.
Quote:


> Also, does adjusting the power limit mean that the GPU will use more than 150 W or is the maximum power limit taken into consideration with the 150 W value?


Yes, it will use more power, but with your card probably not more than 180W. You can monitor that with HWiNFO64.
Quote:


> Finally, is there a specific temperature where this card throttles? I usually max at 75 degrees but I can set the fan curve a bit higher to bring down the temperatures further if needed. Thanks!


Pascal's GPU Boost 3.0 is more sensitive to temperatures than GPU Boost 2.0 was.
But you shouldn't turn your machine into a turbine just to gain 15 or 20 MHz.


----------



## TheDeadCry

Quote:


> Originally Posted by *Waleh*
> 
> Hey guys, just a quick question. I recently purchased a 1070 FE and OC'd it to 210 Mhz on the core and 600 Mhz on the memory and have stability with these values. However, I sometimes notice that my core clock doesn't stay stable, it fluctuates from 2075ish to 1950ish. I know that GPU boost does this but is there a way to keep the core clock constant? Should I be adjusting the power limit? I have it at the default power limit and I do notice that it sometimes goes to 101% in games. Should I increase the power limit a few percent higher is MSI afterburber?
> 
> Also, does adjusting the power limit mean that the GPU will use more than 150 W or is the maximum power limit taken into consideration with the 150 W value?
> 
> Finally, is there a specific temperature where this card throttles? I usually max at 75 degrees but I can set the fan curve a bit higher to bring down the temperatures further if needed. Thanks!


With regards to power, 150W is the rated power consumption at stock. If you set the power slider to 126% in Afterburner, for example, you would get an added 26% of headroom - so 150W + 0.26(150W). The same goes for temperature. If you look at Afterburner - or any OC software - you will also see a temperature slider. Whatever temperature you set it to, let's say 92, the card will only start to throttle around 92 degrees (+- a few degrees).
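The slider math above can be sketched in a couple of lines. The 150W base TDP is the 1070 FE's rated figure; the 126% slider is the hypothetical example from the post, and the 112% value is the FE's cap mentioned elsewhere in the thread:

```python
def power_limit_watts(base_tdp_w: float, slider_percent: float) -> float:
    """Effective power ceiling: the base TDP scaled by the Afterburner slider."""
    return base_tdp_w * slider_percent / 100.0

# GTX 1070 FE: 150 W base TDP, slider capped at 112% -> 168 W ceiling
print(power_limit_watts(150, 112))
# The hypothetical 126% slider from the example above -> 189 W
print(power_limit_watts(150, 126))
```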


----------



## zipper17

Quote:


> Originally Posted by *Waleh*
> 
> Hey guys, just a quick question. I recently purchased a 1070 FE and OC'd it to 210 Mhz on the core and 600 Mhz on the memory and have stability with these values. However, I sometimes notice that my core clock doesn't stay stable, *it fluctuates from 2075ish to 1950ish*. I know that GPU boost does this but is there a way to keep the core clock constant? Should I be adjusting the power limit? I have it at the default power limit and I do notice that it sometimes goes to 101% in games. Should I increase the power limit a few percent higher is MSI afterburber?
> 
> Also, does adjusting the power limit mean that the GPU will use more than 150 W or is the maximum power limit taken into consideration with the 150 W value?
> 
> Finally, is there a specific temperature where this card throttles? I usually max at 75 degrees but I can set the fan curve a bit higher to bring down the temperatures further if needed. Thanks!


Do you mean from 2075 straight down to 19xx right away? That's too much; the core clock boosts in increments/decrements of about 12.5 MHz,
for example: 2101, 2088, 2075, 2068, 2050, 2038, 2025, 2012, 2000, 19xx, ...
You should see 2068, 2050, 2038, 2025, 2012, 2000 before 19xx. CMIIW.

Do you use the custom curve method?
Maybe you need to adjust your custom curve points more.

To hold a constant top core speed, you need a very good cooling solution.
Adjusting GPU fan speed, case fans with good static pressure, case airflow, ambient temperature, or going to a water cooler - that kind of thing might help.

e.g. my average clocks are 2050, 2038, 2025 at about 60-70C while playing Witcher 3. Highest is 2101 MHz.
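The stepping described above can be illustrated by generating the bin ladder; this is a sketch assuming a fixed 12.5 MHz bin size (real Pascal bins are roughly 13 MHz and not perfectly uniform):

```python
def boost_bins(top_clock_mhz: float, step_mhz: float = 12.5, count: int = 10):
    """The ladder of clock bins GPU Boost steps down through from the top bin."""
    return [top_clock_mhz - i * step_mhz for i in range(count)]

# A card topping out at 2101 MHz steps through roughly these bins;
# a jump from ~2075 straight to ~1950 skips many intermediate steps.
print(boost_bins(2101))
```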


----------



## mrtbahgs

Not sure if this is the best spot to ask this or if I should make my own thread, but I will try here initially anyway.

My previous setup was a GTX670 connected to a receiver via HDMI, and then the receiver to my 1080p TV via HDMI; everything was simple to set up and use as a "monitor" for watching shows and movies from my PC while sitting on my couch.

Now I am running a GTX1070 still connected via HDMI to my receiver, but the receiver now connects to my new 4k TV via HDMI.
I was able to set the resolution of the new TV to its native 4k on my PC, but I now I am getting various display issues and I am not sure what is causing it and if it can be fixed.
I will have to revisit the other minor things and report back (currently not at home), but I wanted to ask the main question first to see what answers pop up.

Assuming I have the term right for the distortion I see, I believe I am seeing screen tearing.
I have horizontal lines of lag from time to time when scrolling down and possibly when video is playing as well (I will have to further test the latter).
The strangest thing to me though is that it is on all 3 of my screens and not just the 4k display (at least the scrolling portion for sure).
Also, again I need to look into it further, but if I turn off my receiver and TV so the computer goes back to just 2 monitors, the tearing remains.

Now I am trying to understand if this is a limitation of my receiver, a driver issue with my GPU (I have the latest installed), or what?

It is likely important to note that I could only set the 4k TV to 30Hz refresh rate in windows and the other 2 monitors are on their normal 60Hz.
Is 30Hz the reason I see tearing? If so, why is it also showing up on the other 2 monitors now?
What, if anything, can I do to try and improve the distortion?

I assume at least one of you is also hooked up in the same way as me, so do you experience the same things?
Are you able to connect the TV at 60Hz instead of 30Hz? Perhaps this is limited by a few years old receiver that I have.


----------



## TheDeadCry

Almost assuredly due to the display being 4K. It's likely that your HDMI input does not support 4K/60Hz. I have no idea why it would tell you it is running at 60Hz if that isn't the case... but who knows. What you're describing is almost assuredly a bandwidth issue; it's clear to me that your computer is not actually running at 60Hz with your TV. What is the HDMI revision on your 4K display(s)?


----------



## MyNewRig

Quote:


> Originally Posted by *Waleh*
> 
> Hey guys, just a quick question. I recently purchased a 1070 FE and OC'd it to 210 Mhz on the core and 600 Mhz on the memory and have stability with these values. However, I sometimes notice that my core clock doesn't stay stable, it fluctuates from 2075ish to 1950ish. I know that GPU boost does this but is there a way to keep the core clock constant? Should I be adjusting the power limit? I have it at the default power limit and I do notice that it sometimes goes to 101% in games. Should I increase the power limit a few percent higher is MSI afterburber?
> 
> Also, does adjusting the power limit mean that the GPU will use more than 150 W or is the maximum power limit taken into consideration with the 150 W value?
> 
> Finally, is there a specific temperature where this card throttles? I usually max at 75 degrees but I can set the fan curve a bit higher to bring down the temperatures further if needed. Thanks!


Max out all sliders for best performance. The FE has a max power target of 112%: that is 12% above the base TDP of 150W, so you will be running at about 168W max, which is perfectly fine, and it will lower the effect of power throttling. Next, create a custom fan curve to match your temps, for example 50% fan at 50C, 60% at 60C, 70% at 70C, etc. This fan curve tends to work best on the FE and will also prevent temp throttling. With these settings you should be running above 2000MHz most of the time.
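The fan curve suggested above (fan % roughly tracking temperature) can be sketched as a lookup table with linear interpolation. The 50/60/70 breakpoints are from the post; the endpoints at 40C and 80C are illustrative additions:

```python
# Fan % roughly tracking GPU temp: 50% at 50C, 60% at 60C, 70% at 70C,
# with illustrative clamp points added at 40C and 80C.
CURVE = [(40, 40), (50, 50), (60, 60), (70, 70), (80, 85)]

def fan_percent(temp_c: float) -> float:
    """Linear interpolation between curve points, clamped at both ends."""
    if temp_c <= CURVE[0][0]:
        return float(CURVE[0][1])
    if temp_c >= CURVE[-1][0]:
        return float(CURVE[-1][1])
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(65))  # halfway between the 60C and 70C points
```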


----------






## MyNewRig

Quote:


> Originally Posted by *mrtbahgs*
> 
> Not sure if this is the best spot to ask this or if I should make my own thread, but I will try here initially anyway.
> 
> My previous setup was a GTX670 connect to a receiver via HDMI and then the receiver to my 1080p TV via HDMI and everything was simple to setup and use as a "monitor" for watching shows and movies from my PC while sitting on my couch.
> 
> Now I am running a GTX1070 still connected via HDMI to my receiver, but the receiver now connects to my new 4k TV via HDMI.
> I was able to set the resolution of the new TV to its native 4k on my PC, but I now I am getting various display issues and I am not sure what is causing it and if it can be fixed.
> I will have to revisit the other minor things and report back (currently not at home), but I wanted to ask the main question first to see what answers pop up.
> 
> Assuming I have the term right for the distortion I see, I believe I am seeing screen tearing.
> I have horizontal lines of lag from time to time when scrolling down and possibly when video is playing as well (I will have to further test the latter).
> The strangest thing to me though is that it is on all 3 of my screens and not just the 4k display (at least the scrolling portion for sure).
> Also, again i need to look into it further, but if I turn off my receiver and TV to where the computer goes back to just 2 monitors, the tearing remains.
> 
> Now I am trying to understand if this is a limitation of my receiver, a driver issue with my GPU (I have the latest installed), or what?
> 
> It is likely important to note that I could only set the 4k TV to 30Hz refresh rate in windows and the other 2 monitors are on their normal 60Hz.
> Is 30Hz the reason i see tearing? If so, why is it also showing up on the other 2 monitors now?
> What, if anything, can I do to try and improve the distortion?
> 
> I assume at least one of you is also hooked up in the same way as me, so do you experience the same things?
> Are you able to connect the TV at 60Hz instead of 30Hz? Perhaps this is limited by a few years old receiver that I have.


Your TV needs at least HDMI 1.4b to run 4K@60Hz with 4:2:0 chroma subsampling over a high speed HDMI cable, or HDMI 2.0 for 4K@60Hz with 4:4:4 chroma subsampling and possibly HDR. Anything lower than HDMI 1.4b will only be able to do 4K@30Hz, which is laggy and practically unusable as a computer monitor refresh rate.

The most reliable way to see what resolution and refresh rate your TV is actually running is the TV's own OSD signal-info readout, but the NVIDIA Control Panel will usually show the same information and it is normally correct.

If you have a high speed HDMI cable and your TV is still only running 4K@30Hz, then you need a new TV with an HDMI 2.0 port for best performance; unfortunately there is no other way around it, and cheap 4K TVs don't always support 4K@60Hz.
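For reference, here is a back-of-the-envelope check of the bandwidth math behind those limits (assuming the standard 4400x2250 total timing for a 3840x2160 picture and 8-bit color; figures are approximate, not a substitute for the spec):

```python
# Rough HDMI bandwidth estimate for 4K signals.
# Assumes CTA-861 total timings of 4400x2250 for 3840x2160 at 8 bits
# per component; the x10/8 factor accounts for 8b/10b TMDS encoding.

def tmds_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    pixel_clock = h_total * v_total * refresh_hz          # pixels per second
    return pixel_clock * bits_per_pixel * 10 / 8 / 1e9    # Gbit/s on the link

# 4:4:4 keeps every chroma sample (24 bpp); 4:2:0 averages it down to 12 bpp.
full   = tmds_gbps(4400, 2250, 60, 24)   # 4K@60 4:4:4
sub    = tmds_gbps(4400, 2250, 60, 12)   # 4K@60 4:2:0
thirty = tmds_gbps(4400, 2250, 30, 24)   # 4K@30 4:4:4

print(f"4K@60 4:4:4 ~ {full:.2f} Gbit/s")    # only fits HDMI 2.0's 18 Gbit/s
print(f"4K@60 4:2:0 ~ {sub:.2f} Gbit/s")     # fits older ~10.2 Gbit/s links
print(f"4K@30 4:4:4 ~ {thirty:.2f} Gbit/s")  # also fits older links
```

That is why 4K@60 with full 4:4:4 color needs an HDMI 2.0 port end to end, while 4K@30 (or 4:2:0) squeezes through older hardware.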


----------



## MyNewRig

Quote:


> Originally Posted by *TheGlow*
> 
> The vbios are probably coming out within a day or so, just because MyNewRig sent his card back.


LOL, that was a good one









I actually have not taken it out of the system yet; I'm planning to do it over the weekend when I'm in the mood for it. So if they release something in a day or two, I might be able to give it a try. I'm not very excited about it, though, because I've lost interest in the product altogether by now. Might try it for the sake of science, though...


----------



## zipper17

Quote:


> Originally Posted by *mrtbahgs*
> 
> Not sure if this is the best spot to ask this or if I should make my own thread, but I will try here initially anyway.
> 
> My previous setup was a GTX670 connect to a receiver via HDMI and then the receiver to my 1080p TV via HDMI and everything was simple to setup and use as a "monitor" for watching shows and movies from my PC while sitting on my couch.
> 
> Now I am running a GTX1070 still connected via HDMI to my receiver, but the receiver now connects to my new 4k TV via HDMI.
> I was able to set the resolution of the new TV to its native 4k on my PC, but I now I am getting various display issues and I am not sure what is causing it and if it can be fixed.
> I will have to revisit the other minor things and report back (currently not at home), but I wanted to ask the main question first to see what answers pop up.
> 
> Assuming I have the term right for the distortion I see, I believe I am seeing screen tearing.
> I have *horizontal lines of lag from time to time when scrolling down* and possibly when video is playing as well (I will have to further test the latter).
> The strangest thing to me though is that it is on all 3 of my screens and not just the 4k display (at least the scrolling portion for sure).
> Also, again i need to look into it further, but if I turn off my receiver and TV to where the computer goes back to just 2 monitors, the tearing remains.
> 
> Now I am trying to understand if this is a limitation of my receiver, a driver issue with my GPU (I have the latest installed), or what?
> 
> It is likely important to note that I could only set the 4k TV to 30Hz refresh rate in windows and the other 2 monitors are on their normal 60Hz.
> Is 30Hz the reason i see tearing? If so, why is it also showing up on the other 2 monitors now?
> What, if anything, can I do to try and improve the distortion?
> 
> I assume at least one of you is also hooked up in the same way as me, so do you experience the same things?
> Are you able to connect the TV at 60Hz instead of 30Hz? Perhaps this is limited by a few years old receiver that I have.


Do you have any frame limiter on? Disable any frame limiter in third-party apps such as NVIDIA Inspector or MSI Afterburner.

I also experienced that kind of horizontal tearing line in the middle of the screen when a frame limiter and V-Sync/Adaptive were enabled at the same time in games.
After disabling the third-party frame limiter, the horizontal tearing was gone. I didn't experience it on the desktop, however.


----------



## Roland0101

Quote:


> Originally Posted by *mrtbahgs*
> 
> Are you able to connect the TV at 60Hz instead of 30Hz? Perhaps this is limited by a few years old receiver that I have.


Possible, and easy to test. Put the receiver out of the equation and connect the TV directly to the PC.
Brand names and models would also be nice, so it would be possible to check.


----------



## mrtbahgs

Wow, thank you all for the quick posts!
I will reply now and also try some things out and reply again later.
Quote:


> Originally Posted by *TheDeadCry*
> 
> Almost assuredly due to the displays being 4k. It's likely that your HDMI input does not support 4k/60hz. I have no idea why it would tell you it is running at 60hz, if it isn't the case...but who knows. What you're describing is almost assuredly a bandwidth issue. it's clear to me that your computer is not actually functioning at 60hz with your tv. What is the HDMI revision on your 4k display(s)?


Perhaps I am misunderstanding what you are saying, but to clarify: my PC is recognizing the TV as 30Hz and my other 2 monitors as 60Hz.
I have no way to tell the PC to run my 4K TV at 60Hz.
It should be possible, though (assuming everything is 4K 60Hz compatible), for a PC to run a TV at 4K 60Hz, correct?
The TV is the latest 4K standard, so HDMI 2.0a. The receiver I am 95% sure has HDMI 2.0, definitely not 2.0a, but there is a very small chance it is the old HDMI 1.4.
Quote:


> Originally Posted by *MyNewRig*
> 
> Your TV needs at least HDMI 1.4b to run 4K@60Hz with 4:2:0 chroma subsampling with a high speed HDMI cable, or HDMI 2.0 for 4K@60Hz for 4:4:4 chroma subsampling and possibly HDR, anything lower than HDMI 1.4b will only be able to do 4K@30Hz which is laggy and practically unusable as a computer monitor refresh rate.
> 
> The best way to know what resolution and refresh rate your TV is actually running is the OSD display on the TV itself showing signal specifications, but usually NVIDIA control panel will show that information and it is normally correct.
> 
> If you have a high speed HDMI cable and your TV is only running 4K@30Hz then you need a new TV with an HDMI 2.0 port for best performance, unfortunately there is no other way around it, cheap 4K TVs don't always support 4K@60Hz


The TV is 100% the latest HDMI standard, so it should be 2.0a.
I am fairly confident the cables I am using are all high speed cables, but I can look into them more closely. I think they straight up would not work at all if they weren't high speed cables.
I will try to check my OSD or the NVIDIA Control Panel to verify the speeds. I don't recall my TV's OSD showing anything, though, and the remote has limited buttons, so I can't hit something like Display to pop it up manually.
Quote:


> Originally Posted by *zipper17*
> 
> Do you have any framelimiter on ? disable framelimiter from 3rd apps such nvidia inspector/ msi ab.
> 
> I was also experience some kind of those horizontal lines tearing in the middle of screen, when enable Framelimiter & Vsync/Adaptive at the same time in games.
> Disable any framelimiter 3rd apps, horizontal lines tearing gone. However I didn't experience those while on desktop.


I haven't tried gaming at all, so it may be even worse there. Gaming is not my intention for this TV, though.
I haven't heard of a frame limiter and certainly didn't turn one on myself, but I will look to see if it is enabled somewhere.
Quote:


> Originally Posted by *Roland0101*
> 
> Possible, and easy to test. Put the receiver out of the equation and connect the TV directly to the PC.
> Brand names and models would also be nice, so it would be possible to check.


Yes, I think that is likely the best way to help narrow things down a bit so I will try a direct connection to the TV and see if options change.
In regards to brands, my sig is likely up to date minus the TV, but I will list everything here for ease.

GPU - Gigabyte GTX 1070 G1 Gaming
Receiver - Onkyo TX-NR727
TV - Samsung UN65KS8000 (3840x2160 @120Hz, but 60Hz max for PCs I think)
Main Monitor - Asus PB278Q (2560x1440 @ 60Hz)
Secondary Monitor - HP 2311x (1920x1080 @ 60Hz)

Thanks again for the quick help thus far.


----------



## Roland0101

Quote:


> Originally Posted by *mrtbahgs*
> 
> Yes, I think that is likely the best way to help narrow things down a bit so I will try a direct connection to the TV and see if options change.
> In regards to brands, my sig is likely up to date minus the TV, but I will list everything here for ease.
> 
> Thanks again for the quick help thus far.


Yes, I could have taken a look at your sig...









Ok, the TV supports 4K @ 60Hz with and without 4:4:4.
The thing is, your receiver also supports 4K pass-through according to its spec sheet. (Still, test it.)

Are the HDMI cables High Speed rated?
How long are the cables?
Did you set the HDMI input to 'PC'?
Do you use the PC HDMI port?
Did you enable "HDMI UHD Color" in the picture menu?


----------



## Deadcry

Quote:


> Originally Posted by *mrtbahgs*
> 
> Wow, thank you all for the quick posts!
> I will reply now and also try some things out and reply again later.
> Perhaps I am miss-understanding what you are saying, but to clarify, my PC is recognizing the TV as 30Hz and my other 2 monitors as 60Hz.
> I have no way to tell the PC to run my 4k TV at 60Hz.
> It should be possible though (assuming everything is 4k 60hz compatible) that a PC can run a TV at 4k 60hz, correct?
> TV is the latest 4k, so HDMI 2.0a. The receiver I am 95% sure has HDMI 2.0, definitely not 2.0a, but very small chance it is the old HDMI 1.4.
> The TV is 100% the latest HDMI standard, so it should be 2.0a.
> The cables I am using I am pretty confident are all high speed cables, but I can look into them closer. I think they straight up would not work at all though if they weren't high speed cables.
> I will try and check my OSD or NVidia control panel to verify the speeds, i don't recall my TV's OSD showing anything though and the remote has limited buttons so I can't hit like display or something to try and pop it up manually.
> I haven't tried gaming at all so it may be even worse in that. Gaming is not my intention for this TV though.
> I haven't heard of a framelimiter and certainly didn't turn it on myself, but I will try and look to see if it is on somewhere.
> Yes, I think that is likely the best way to help narrow things down a bit so I will try a direct connection to the TV and see if options change.
> In regards to brands, my sig is likely up to date minus the TV, but I will list everything here for ease.
> 
> GPU - Gigabyte GTX 1070 G1 Gaming
> Receiver - Onkyo TX-NR727
> TV - Samsung UN65KS8000 (3840x2160 @120Hz, but 60Hz max for PCs I think)
> Main Monitor - Asus PB278Q (2560x1440 @ 60Hz)
> Secondary Monitor - HP 2311x (1920x1080 @ 60Hz)
> 
> Thanks again for the quick help thus far.


120Hz, you say? This may be due to your 4K TV interpolating. Interpolation is basically adding "fake" frames; it tricks the eye via a combination of effects. This effect can cause stuttering, frame duplication, and blurring. Turn it off if you can find it. I can guarantee your TV doesn't natively run at 120Hz. It usually has some weird proprietary name somewhere in the OSD. What brand again? Edit: I'm on my phone while my PC updates, in case I missed something. Edit Edit: Forgive me! I see that you posted your specs in the post! Lmao


----------



## zipzop

I just noticed I get 7-10°C cooler temps by running the memory at stock (8GHz Micron). Wow... so in addition to not being able to OC, it also runs hot AF. Well, at least this keeps my core at 2100+ without temp throttling.


----------



## Deadcry

Found it. As I expected: "Since this TV is 120Hz, 'Auto Motion Plus' can be used on 30Hz and 60Hz signals. This will add the soap opera effect (SOE). Low custom values will work well to smooth out motion with a minimum of the soap opera effect."

Try disabling "Auto Motion Plus"; this will likely solve your issue.


----------



## mrtbahgs

Quote:


> Originally Posted by *Roland0101*
> 
> Yes, could have take a look at your sig...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ok, TV supports 4k @ 60Hz with and without 4:4:4.
> Problem is that your receiver also supports 4K loop through according to it's spec sheet. (Still, test it)
> 
> Are the HDMI cables Highspeed?
> How long are the cables?
> Did you set HDMI input to 'PC'?
> Do you use the PC HDMI port?
> Did you set "HDMI UHD Color" in the picture menu?


Not 100% sure if they are high speed cables; I will see if I can find out, but I am fairly sure they are.
6' cables for both PC to receiver and receiver to TV.
I am not sure what you mean by setting the HDMI input to 'PC'. I go to the input for my PC, though; I think it names itself? I will look into it further and try to update.
The PC HDMI port on the receiver? No, I had to use the Game HDMI input because the one labeled PC is not a 4K input on my receiver.
Yes to UHD Color.

Quote:


> Originally Posted by *Deadcry*
> 
> 120hz you say? This may be due to your 4K tv interpolating. What interpolation is, is basically like adding "fake" frames. It tricks the eye via a combination of effects. This affect cab cause stuttering, frame duplication, and blurring. Turn it off if you can find it. I can guarantee your tv doesn't run natively 120hz. It usually has some weird proprietary name, somewhere in the osd. What brand again? Edit: I'm on my phone while my PC updates, in case I missed something. Edit Edit: Forgive me! I see that you posted your specs in the post! Lmao


Quote:


> Originally Posted by *Deadcry*
> 
> Found it. As I expected "Since this TV is 120Hz, 'Auto Motion Plus' can be used on 30Hz and 60Hz signals. This will add the soap opera effect (SOE). Low custom values will work well to smooth out motion with a minimum of the soap opera effect."
> 
> Try disabling "auto motion plus" this will likely solve your issue.


My TV does indeed have a native 120Hz refresh rate, but I don't think 120Hz is possible over a PC connection, or at least my receiver won't support above 60Hz.
Auto Motion Plus is, I think, currently set to Auto and I believe turned down to 0, unless the grayed-out numbers don't correlate to Auto. I'll tinker with it later tonight and certainly try completely off.


----------



## kevindd992002

Can anyone with a Zotac GTX 1070 AMP! Extreme do me a favor? Before installing my new card in the system, I did a simple stock fan test by just spinning the fans with my finger. With the output ports as the reference, the leftmost and center fans spin about the same as regular fans do. But the rightmost fan seems to have the most "resistance", in that it spins significantly fewer rounds than the other two (making me think it lacks oil or something).

The thing is that when I plug the card into the system, they all seem to spin the same way, even at 100% RPM. I don't have values to back this up because all three fans are connected to the same RPM monitor pin, so I cannot monitor them individually.

Is this common for this card or what? Is it something to worry about? Please help.

Thanks.


----------



## gtbtk

Quote:


> Originally Posted by *mrtbahgs*
> 
> Not sure if this is the best spot to ask this or if I should make my own thread, but I will try here initially anyway.
> 
> My previous setup was a GTX670 connect to a receiver via HDMI and then the receiver to my 1080p TV via HDMI and everything was simple to setup and use as a "monitor" for watching shows and movies from my PC while sitting on my couch.
> 
> Now I am running a GTX1070 still connected via HDMI to my receiver, but the receiver now connects to my new 4k TV via HDMI.
> 
> I was able to set the resolution of the new TV to its native 4k on my PC, but I now I am getting various display issues and I am not sure what is causing it and if it can be fixed.
> 
> I will have to revisit the other minor things and report back (currently not at home), but I wanted to ask the main question first to see what answers pop up.
> 
> Assuming I have the term right for the distortion I see, I believe I am seeing screen tearing.
> 
> I have horizontal lines of lag from time to time when scrolling down and possibly when video is playing as well (I will have to further test the latter).
> 
> The strangest thing to me though is that it is on all 3 of my screens and not just the 4k display (at least the scrolling portion for sure).
> 
> Also, again i need to look into it further, but if I turn off my receiver and TV to where the computer goes back to just 2 monitors, the tearing remains.
> 
> Now I am trying to understand if this is a limitation of my receiver, a driver issue with my GPU (I have the latest installed), or what?
> 
> It is likely important to note that I could only set the 4k TV to 30Hz refresh rate in windows and the other 2 monitors are on their normal 60Hz.
> 
> Is 30Hz the reason i see tearing? If so, why is it also showing up on the other 2 monitors now?
> 
> What, if anything, can I do to try and improve the distortion?
> 
> I assume at least one of you is also hooked up in the same way as me, so do you experience the same things?
> 
> Are you able to connect the TV at 60Hz instead of 30Hz? Perhaps this is limited by a few years old receiver that I have.


To get 4K at 60Hz over HDMI you need HDMI 2.0. Your 1070 has that, but I suspect the receiver is using HDMI 1.4, which only supports 4K at 30Hz.


----------



## gtbtk

Quote:


> Originally Posted by *vloeibaarglas*
> 
> Looks like Palit pushed an update. Can someone with Palit Micron update their card and dump us the BIOS? A Micron card is needed since the bios update utility will likely flash different bios based on your VRAM manufacturer.
> 
> All the different card versions link to the same update utility (Dual vs Gamerock, etc): http://www.palit.com/palit/vgapro.php?id=2639&lang=en&pn=NE51070H15P2-1041G&tab=do


The update does not seem to resolve the voltage starvation issue, at least if you flash it to an MSI Gaming card.


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> The update does not seem to resolve the Voltage starvation issue, At least if you flash it to an MSI Gaming card


I keep wondering: how are you flashing all these vendor-specific BIOSes onto your MSI card without bricking it? Aren't these BIOSes customized for their specific target boards?


----------



## Roland0101

Quote:


> Originally Posted by *gtbtk*
> 
> The update does not seem to resolve the Voltage starvation issue, At least if you flash it to an MSI Gaming card


Which bios version is it?


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The update does not seem to resolve the Voltage starvation issue, At least if you flash it to an MSI Gaming card
> 
> 
> 
> I keep wondering, how are you flashing all these vendor-specific BIOSes on your MSI card without bricking it? aren't these BIOSes customized for their specific targeted boards?
Click to expand...

I have been flashing PC BIOS EEPROMs since the late 80s. That was when you needed UV light to erase the chip before you could reflash it.

I am using the nvflash version that bypasses the signature check, posted here by Joe Dirt. The command is nvflash -6 filename.rom. The cards are pretty much all the same; they all use the same voltage controllers. The variation is in how many VRM channels the controller is connected to, but I don't think the firmware actually cares. The other variations are in things like the max power the card can draw and the core and memory clock speeds.

I have bricked the card twice: once when I flashed a "1070" BIOS that was actually a mislabeled 1080 BIOS, and one other time where I have no idea why it didn't work.

It is easy to recover a bricked card as long as you have an iGPU you can boot from - set the motherboard BIOS to default to the iGPU rather than the PCIe slot before you risk bricking the card. The physical PCI addressing is still in place, which lets you communicate with the flash ROM; you can then write the correct ROM back to it and recover the card.

The problem with a bricked card is that the code on the ROM is either corrupted or sending wrong control signals to the hardware.


----------



## gtbtk

@Roland0101

It is a Palit bios ver 86.04.3B.00.94


----------



## sammkv

Quote:


> Originally Posted by *kevindd992002*
> 
> Can anyone with a Zotac GTX 1070 AMP! Extreme do me a favor? So before installing my new card to the system I went ahead and did a simple stock fan test by just making the fans spin with my finger. The output ports being the reference, the leftmost and center fans spin at about the same way as regular fans do. But the rightmost fan seems to have the most "resistance" in a way that it spins significantly fewer rounds than the other two (making me think that it lacks oil or something).
> 
> The thibg is that when I plug the card into the system, they all kinda spin the same way event at 100% RPM. Now I don't have values to back this up because all three fans are connected to the same RPM monitor pin so I cannot monitor them individually.
> 
> Is this common for this card or what? Is it something to worry about? Please help.
> 
> Thanks.


As long as you don't get any weird/odd noises coming from the fans during usage, I think it's fine. I really regret getting the AMP Edition; I should have gotten the AMP Extreme.









----------



## kevindd992002

Quote:


> Originally Posted by *sammkv*
> 
> Long as you don't get any weird/odd noises coming from the fans during usage I think it's fine. I really regret getting the AMP Edition and should have got the AMP Extreme
> 
> 
> 
> 
> 
> 
> 


Yup, no weird noises at all. I'm just being OCD about it, but I still wanted to confirm.

Oh, and by the way, how do you start OC'ing this card? I came from two Keplers (GTX 670) in SLI, so I'm not sure if the process is still the same with Pascal. What I did back then was to increase the voltage and the power target limit in the vBIOS using Kepler BIOS Tweaker (KBT), and then use Afterburner to increase the power target, core clock, and mem clock until they were stable in Heaven.


----------



## RyanRazer

Quote:


> Originally Posted by *sammkv*
> 
> Long as you don't get any weird/odd noises coming from the fans during usage I think it's fine. I really regret getting the AMP Edition and should have got the AMP Extreme
> 
> 
> 
> 
> 
> 
> 


The AMP! Extreme is not without problems either... http://www.overclock.net/t/1613126/gtx-1070-amp-extreme-owners
That's just as of now; there are probably other people with other problems.

Quote:


> Originally Posted by *kevindd992002*
> 
> Yup, no weird noises at all. I'm just being OC but I still wanted to confirm.
> 
> Oh and by the way, how do you start OC'ing this? I came from two Keplers (GTX 670) in SLI so I'm not sure if the process is still the same with Pascals. What I did back then was to increase the voltage and the power target limit in the vBIOS by using KBT and then use Afterburner to increase the power target, core clock, and mem clock until they are stable in Heaven.


No BIOS modding so far (I think). Just max the power and voltage sliders, and slowly increase the core clock, then the memory.
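The routine above can be sketched as a simple step-and-test loop. `run_stress_test` here is a hypothetical stand-in for a Heaven/Fire Strike run that you would watch and judge yourself, and the numbers are made up for illustration:

```python
# Sketch of the incremental OC routine: raise the offset in small steps,
# stop at the first failure, and keep the last offset that passed.
# Repeat once for the core, then again for the memory.

def find_max_offset(run_stress_test, step=25, limit=300):
    offset = 0
    while offset + step <= limit and run_stress_test(offset + step):
        offset += step
    return offset  # last offset (MHz) that passed the stress test

# Toy stand-ins: pretend the card artifacts past +210 core / +520 mem.
core = find_max_offset(lambda mhz: mhz <= 210)
mem = find_max_offset(lambda mhz: mhz <= 520, step=50, limit=700)
print(f"core +{core} MHz, mem +{mem} MHz")
```

In practice each "pass" means a full benchmark loop with no artifacts or crashes, so the real process takes hours rather than milliseconds.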


----------



## kevindd992002

Quote:


> Originally Posted by *RyanRazer*
> 
> AMP! Extreme is not without problems either... http://www.overclock.net/t/1613126/gtx-1070-amp-extreme-owners
> This is as of now. There are probably other people with other problems.
> No bios moddig till now (i think). Just max power and volt sliders and slowly increase core clock and than mem


What does increasing the volt sliders do?


----------



## RyanRazer

It allows the card to draw a bit more voltage. But we are talking very small increases, so you don't have to worry about overvolting (frying) your card; Boost 3.0 is very protective with voltage and temperature. We are all waiting for an unlocked BIOS to increase the voltage more ;D

We are locked at 1.093V... so you can easily max it out. Press Ctrl+F in MSI Afterburner and you'll see the voltage/core clock curve.
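As a rough mental model, that Ctrl+F editor exposes the Boost 3.0 curve as a table of (voltage, clock) points, and a flat core offset just shifts every point's clock upward while the voltage cap stays put. The point values below are made up for illustration:

```python
# Toy model of the Boost 3.0 voltage/frequency curve shown by
# Afterburner's Ctrl+F editor: (voltage, clock) points. A flat core
# offset raises every point's clock; the top voltage point (the
# ~1.093 V cap on most 1070s) never moves. Values are illustrative.

curve = [(0.800, 1607), (0.900, 1785), (1.000, 1936), (1.093, 2012)]

def apply_offset(curve, offset_mhz):
    """Shift every point's clock by the flat core offset."""
    return [(v, clk + offset_mhz) for v, clk in curve]

oc_curve = apply_offset(curve, 100)      # e.g. +100 MHz in the slider
v_max, clk_max = oc_curve[-1]
print(f"max boost ~ {clk_max} MHz at {v_max} V")  # cap voltage unchanged
```

This is why maxing the voltage slider only buys a little: it extends where on the curve the card is allowed to sit, not the curve itself.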


----------



## shadowrain

Quote:


> Originally Posted by *kevindd992002*
> 
> Yup, no weird noises at all. I'm just being OC but I still wanted to confirm.
> 
> Oh and by the way, how do you start OC'ing this? I came from two Keplers (GTX 670) in SLI so I'm not sure if the process is still the same with Pascals. What I did back then was to increase the voltage and the power target limit in the vBIOS by using KBT and then use Afterburner to increase the power target, core clock, and mem clock until they are stable in Heaven.


I myself prefer to use the latest Zotac FireStorm app. The OC settings I use are exactly what was used in the HW Canucks review for a "mild" OC. I'm also using their fan curve for low temps.
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/73251-zotac-gtx-1070-amp-extreme-review-17.html


----------



## RyanRazer

So you are saying your OC is better with FireStorm than with MSI? You said you get the same clocks as Hardware Canucks but didn't compare to MSI... That would be weird if that were the case. Can you confirm?


----------



## shadowrain

Quote:


> Originally Posted by *RyanRazer*
> 
> So you are saying your OC is better wit firestorm than msi? You said you get clock the same as hardwarkanucks but didn't compare to MSI... That would be weird if that is the case. Can you confirm?


My OC is stable, yes. Better? Not sure. I tried to copy the OC settings from this other review using Afterburner, but no cigar. So I stayed with what works.
http://thepcenthusiast.com/zotac-geforce-gtx-1070-amp-extreme-review/5/


----------



## RyanRazer

Quote:


> Originally Posted by *shadowrain*
> 
> My oc is stable yes. Better? Not sure. I tried to copy the OC settings from this other review using Afterburner but no cigar. So I stayed with what works.
> http://thepcenthusiast.com/zotac-geforce-gtx-1070-amp-extreme-review/5/


I will try that one plus the overclocking guide from their article. However, FireStorm's monitoring isn't as good as MSI's... You can't monitor fan speed, GPU usage, voltage limit, etc., or have I missed something? How would running two overclocking programs at the same time work: FireStorm to overclock, MSI to monitor? Would that be a bad idea? Is there any other monitoring software, besides HWMonitor and AIDA64, with graphs like MSI's?


----------



## shadowrain

Quote:


> Originally Posted by *RyanRazer*
> 
> I will try that one + overclocking guide from their article. However, firestorms monitoring isn't as good as MSI's... You cant monitor fan sped and gpu, voltage limit etc... or have i missed something? How would having 2 overclocking software at the same time work? firestorm to overclock, MSI to monitor? Would that be a bad idea? Is there and other monitoring software, besides hwmonitor and aida64... the one with graphs like msi.


I use the GPU Meter gadget from 8GadgetPack with PCMeter to monitor stuff easily. It has a smaller DPC footprint than AIDA et al. It has worked for me since Vista and 7, so I just stuck with it.
http://i.imgur.com/vXJoVvW.jpg


----------



## RyanRazer

Hmm, thanks, but I'll stick with MSI.


----------



## kevindd992002

Quote:


> Originally Posted by *RyanRazer*
> 
> Allows card to draw a bit more voltage. But we are talking very small increases so you don't have to worry about overvolting (frying) your card. boost 3.0 is very protective with voltage and temperature. We are all waiting an unlocked bios to increase voltage more ;D
> 
> we are locked at 1.093V... so you can easily max it out. press ctrl + f when in msi afterburner, you'll see the voltage/core clock curve


I see. I'll try this out on Monday and see what OC I can come up with, at least for the meantime that we still don't have a BIOS unlocker.
Quote:


> Originally Posted by *shadowrain*
> 
> I myself prefer use the latest Zotac Firestorm App. OC settings I use are exactly what is used in the HW Cannucks review for a "mild" OC. Also using the fan curve for low temps.
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/73251-zotac-gtx-1070-amp-extreme-review-17.html


Thanks, but I'm sticking with Afterburner. Have you tried aiming for the max OC of your card (not copying the settings from the reviews) yet?


----------



## rfarmer

Has anyone else run into this? With driver 372.70: http://www.3dmark.com/fs/10127647; with driver 373.06: http://www.3dmark.com/fs/10408274. I lost nearly 1000 points on my graphics score with exactly the same clock speeds.


----------



## madmeatballs

Quote:


> Originally Posted by *rfarmer*
> 
> Anyone else run into this. With driver 372.70 http://www.3dmark.com/3dm/14778848?, with driver 373.06 http://www.3dmark.com/3dm/15316411?. I lost nearly 1000 points on my graphics score with exactly the same clock speeds.


I did not lose points, but I lost my OC stability with the 37x.xx drivers. I used to hit 2126MHz core; now I can only be stable at 2088MHz.


----------



## Dude970

Both 372.70 and 373.06 give me artifacts and crashes running Fire Strike at the same OC. I still use 372.54.


----------



## Joenc

Quote:


> Originally Posted by *TheGlow*
> 
> The vbios are probably coming out within a day or so, just because MyNewRig sent his card back.


Oh, that hurts! Hahah.

Still, he can go buy another one and test...


----------



## MyNewRig

Quote:


> Originally Posted by *Joenc*
> 
> oh, that hurts ! hahah
> 
> still he can go buy another one and test ...


I wrote a few posts above that I still have the card in the system and that nothing has come out yet. Nothing is coming in a day or two, or even a week: EVGA said it will take around two weeks and is only targeted at overclocking; nothing will be fixed at stock settings. That Palit BIOS that came out yesterday has nothing to do with this issue, and it is the only reason some people assumed the fix was coming in a day or two. It is not.


----------



## MyNewRig

Quote:


> Originally Posted by *madmeatballs*
> 
> I did not lose points but I lost my oc stability from 37x.xx drivers, I used to hit 2126MHz core now I can only be stable with 2088Mhz.


It has always been the case, even in previous generations, that driver updates keep resulting in lower and lower overclocks, as they tend to utilize the GPU more aggressively for increased performance. It usually gets to the point where OC is not possible once drivers reach full maturity.


----------



## rfarmer

Quote:


> Originally Posted by *Dude970*
> 
> Both . 70 and the 373.06 give me artifacts and crash running FireStrike at the same OC. I still use 372.54


Well, I completely uninstalled the display driver and installed 372.54. Better results: http://www.3dmark.com/fs/10410042


----------



## Roland0101

Quote:


> Originally Posted by *gtbtk*
> 
> @Roland0101
> 
> It is a Palit bios ver 86.04.3B.00.94


Thanks.
Could you find any release notes?

I think this bios has nothing to do with the issue.


----------



## Dude970

Quote:


> Originally Posted by *rfarmer*
> 
> Well I completely uninstalled display drive and installed 372.54, better results. http://www.3dmark.com/fs/10410042










Hopefully the next set of drivers does better.


----------



## rfarmer

Quote:


> Originally Posted by *Dude970*
> 
> 
> 
> 
> 
> 
> 
> 
> Hopefully the next set of drivers do better


Yeah, I updated the driver because I got Mafia III, but at 30 fps I don't think there is a lot of optimization going on.


----------



## kevindd992002

Quote:


> Originally Posted by *MyNewRig*
> 
> It has always been the case even in previous generations that driver updates keep resulting in lower and lower overclocks as they tend to utilize the GPU more aggressively for increased performance, it usually gets to the point where OC is not possible as drivers reach their max maturity.


What happens then if the GPU utilization of later drivers is more aggressive than that of previous drivers? Does that mean that OC is no longer "beneficial"? I'm not sure what "aggressive GPU utilization" means.


----------



## MyNewRig

Quote:


> Originally Posted by *kevindd992002*
> 
> What happens then if the GPU utilization for later drivers is more aggressive than previous drivers? Does that mean that OC is no longer "beneficial"? I'm not sure what it means by "aggressive GPU utilization".


Newer drivers come with optimizations that seek to squeeze as much performance as possible from the chip, and this results in a previously stable OC becoming unstable as the GPU is pushed harder. I have always experienced this with previous generations; I had to lower my OC with new drivers.


----------



## gtbtk

Quote:


> Originally Posted by *Roland0101*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> @Roland0101
> 
> It is a Palit bios ver 86.04.3B.00.94
> 
> 
> 
> Thanks.
> Could you find any release notes?
> 
> I think this bios has noting to do with the issue.
Click to expand...

No. Found it along the way in a reddit post and thought I would see what happened if I flashed it.

I think that you are right.


----------



## RyanRazer

Quote:


> Originally Posted by *MyNewRig*
> 
> Newer drivers come with optimizations that seek to squeeze as much performance as possible from the chip and this results in stable OC becoming no longer stable as the GPU is pushed harder with newer drivers, i have always experienced this with previous generations that i had to lower my OC with new drivers.


I don't know, mate. Nvidia is known to be a dick even towards their own customers, deliberately crippling old GPUs with new drivers so people upgrade faster.

http://www.techspot.com/review/1000-project-cars-benchmarks/page6.html

https://forums.geforce.com/default/topic/831639/nvidia-is-deliberately-downgrading-kepler-gpus-performance-in-favor-of-maxwell-/






Maybe in a few more months we will in fact get drivers that are beneficial, but I am afraid of what happens when the 2000 series comes out...

More can be found on the net.
I can't stand Nvidia's policy...


----------



## TylerAD

The game is LOCKED at 30 Hz / 30 FPS for now. They are supposed to release a patch this weekend or early next week to resolve it. It's not your PC...


----------



## gtbtk

Quote:


> Originally Posted by *rfarmer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dude970*
> 
> Both . 70 and the 373.06 give me artifacts and crash running FireStrike at the same OC. I still use 372.54
> 
> 
> 
> Well I completely uninstalled display drive and installed 372.54, better results. http://www.3dmark.com/fs/10410042
Click to expand...

Interesting.

The newest drivers let me run memory 100mhz faster in Timespy than I could before. Maybe they focus on DX12 performance??


----------



## kevindd992002

Quote:


> Originally Posted by *MyNewRig*
> 
> Newer drivers come with optimizations that seek to squeeze as much performance as possible from the chip and this results in stable OC becoming no longer stable as the GPU is pushed harder with newer drivers, i have always experienced this with previous generations that i had to lower my OC with new drivers.


I understand, but is that a good thing? I guess what I really want to comprehend is whether increased GPU utilization is more favorable than a higher stable OC.


----------



## Roland0101

Quote:


> Originally Posted by *gtbtk*
> 
> Interesting.
> 
> The newest drivers let me run memory 100mhz faster in Timespy than I could before. Maybe they focus on DX12 performance??


Could very well be.

I will test that on my system tonight.


----------



## MyNewRig

Quote:


> Originally Posted by *kevindd992002*
> 
> I understand but is that a good thing though? I guess what I really want to comprehend is if increased GPU utilization is more favorable than having a higher stable OC?


I really don't know. Newer drivers are probably better for real gaming performance, maybe not so much for benchmarks, but newer drivers only cost you 10-20MHz or so of core OC, so it is not a really big deal.


----------



## zipper17

Quick test, Firestrike basic, new driver 373.06 vs 372.90.

Still pretty much the same for me: graphics score ~20,8XX.
Quote:


> Originally Posted by *shadowrain*
> 
> I myself prefer use the latest Zotac Firestorm App. OC settings I use are exactly what is used in the HW Cannucks review for a "mild" OC. Also using the fan curve for low temps.
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/73251-zotac-gtx-1070-amp-extreme-review-17.html


Strange that he can manage 2151MHz & 9000MHz but only scores 20,3XX graphics. He should easily get at least a 21K graphics score, imo.



My card can only manage around 2101-2068MHz & 9000/9200MHz in the Firestrike bench, but my graphics score is slightly higher at 20,8XX.

How can this happen? Many times I see people with a higher core speed whose graphics scores are about the same as, or below, those of people with a lower core speed.


----------



## zipper17

Quote:


> Originally Posted by *rfarmer*
> 
> Anyone else run into this. With driver 372.70 http://www.3dmark.com/fs/10127647, with driver 373.06 http://www.3dmark.com/fs/10408274. I lost nearly 1000 points on my graphics score with exactly the same clock speeds.


372.90 (or 372.70, I forget):
http://www.3dmark.com/fs/10235442
373.06:
http://www.3dmark.com/fs/10411448

Pretty much the same, ~20,8XX.

Did you try this? It might help:
- set every global setting to default
- then choose "prefer maximum performance"
- go to Control Panel > Power Options and choose High Performance (this turns off the CPU & PCIe power-saving states)
- set your GPU fan speed to a fixed 100% or 80%, or whatever you prefer
- close any third-party monitoring programs
- run the benchmark

I can manage to get 21K, but only with around +700-800MHz on the memory (9450-9500MHz):
http://www.3dmark.com/fs/10234540, 21,109
It's not stable, though; flashing green pops up on the screen.
Any core speed higher than 2101MHz crashes in the stability test & Witcher 3.
I may wait for a fully unlocked modded BIOS as a last overclocking resort.
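As a rough sanity check on those memory numbers: on a reference GTX 1070 (8008MHz effective GDDR5), Afterburner's offset appears to count double toward the effective clock GPU-Z reports. A quick sketch of that rule of thumb (the stock figure and the x2 assumption are mine, not anything from Afterburner's docs):

```python
# Rule of thumb for Pascal GDDR5 cards (assumption: Afterburner's memory
# offset applies to the double-data-rate clock, so it counts twice toward
# the "effective" figure that tools like GPU-Z report).
STOCK_EFFECTIVE_MHZ = 8008  # reference GTX 1070 effective memory clock

def effective_memory_clock(offset_mhz: int) -> int:
    """Estimate the effective GDDR5 clock for a given Afterburner offset."""
    return STOCK_EFFECTIVE_MHZ + 2 * offset_mhz

print(effective_memory_clock(500))  # 9008 MHz
print(effective_memory_clock(750))  # 9508 MHz, in line with the 9450-9500 above
```

That matches the +700-800 offset landing around 9450-9500MHz mentioned above.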


----------



## Dude970

Quote:


> Originally Posted by *gtbtk*
> 
> Interesting.
> 
> The newest drivers let me run memory 100mhz faster in Timespy than I could before. Maybe they focus on DX12 performance??


Did you have to lower your core clock? Voltage slider?


----------



## TheGlow

Newer drivers do seem a bit worse. With the previous ones I could run Timespy as high as +215/+850 without crashing. Here mine crashed once and froze another time. When I did get it to finish, the score was 130 points lower.


----------



## shadowrain

Quote:


> Originally Posted by *zipper17*
> 
> Quick test Firestrike basic, new driver v373.06 vs 372.90
> 
> Still pretty much the same for me Graphic scores ~20,8XX.
> Strange he can manage 2151mhz & 9000mhz, but Graphic scores 20,3XX. He should easily at least get 21K graphic scores, imo
> 
> 
> 
> My card only can manage around 2101-2068mhz & 9000/9200mhz in Firestrike Bench, but graphic scores slightly higher 20,8xx.
> 
> How can this happen, many time i see people with higher corespeed, but graphic scores pretty much the same or below the people who has lower corespeed.


Many factors: CPU, mobo, CPU OC, RAM and RAM latency, voltage, temps, drivers, background processes, bloat apps, etc. These may lower the fps by only one frame or a few, but that affects the graphics score nonetheless.
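For a feel of how much core clock alone should matter: graphics score scales at most roughly linearly with core clock, so a back-of-envelope estimate (pure assumption of linear scaling, not how 3DMark actually behaves) shows why a 2151MHz card scoring below a 2101MHz one looks off:

```python
def expected_score(base_score: float, base_clock_mhz: float, new_clock_mhz: float) -> float:
    # Naive linear scaling with core clock; real Firestrike graphics scores
    # scale sub-linearly (memory bandwidth, CPU, etc. cap them), so treat
    # this as an optimistic upper bound.
    return base_score * new_clock_mhz / base_clock_mhz

# A 20,800-scoring card at 2101MHz "should" land near 21.3K at 2151MHz,
# which is why a 20,3XX result at 2151MHz suggests something else is the bottleneck.
print(round(expected_score(20800, 2101, 2151)))
```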


----------



## FastOne8

Hello guys. Does this method http://www.overclock.net/t/1523391/easy-nvflash-guide-with-pictures-for-gtx-970-980/70 work with GTX 1070 cards? On cards with dual BIOS, can I flash both BIOSes, or is the second BIOS protected so it can't be flashed?


----------



## mrtbahgs

Ahhhhhh... I didn't realize swapping out a TV (or GPU) would cause so many troubles.
I think the tearing no longer happens, perhaps it just needed a restart, but now I get other odd issues.
*Edit: Tearing is back after another restart to try something. I think it is related to when Windows no longer lets Aero work, but I can't turn it back on, and it may come back on its own after sleeping, and I believe with the TV/receiver off.*
**Edit 2: I found something showing some of this is a Windows 7 limitation, so I guess it won't be fixable. Microsoft Link**

The biggest issue now is that if the TV is not on the PC input, my screens turn on and off and keep cycling or checking inputs, idk.
This is the first time I have noticed any of this, so something started it that I can't explain.
I disconnected the TV display in Windows for now, so it stopped.

I am also pretty sure that even with my TV and receiver off, Windows still somehow thinks a third monitor is connected, and I can even move my mouse over into the "3rd screen".
I would expect it to just see two screens and work with them until I power the receiver and TV back on, but it isn't working like that.

I have also seen Windows turn off Aero, make my desktops go to a black background, and no longer show any icons on my desktop; this is with the TV connected and on the PC input.
Perhaps it is just shifting the other two monitors into a corner of a display or something, because the third screen is 4K and it tries to make the others big too, but it's not like my mouse thinks the screens are bigger.
(^^^This rolls into the above "Edit 2")

I have a feeling most of these things occur once I have the TV and receiver off, or change inputs from PC to OTA TV or smart TV apps.
These few things are hard to explain and maybe not consistent, so it's even harder to troubleshoot.
Everything may change too once I restart the PC again.

One last one that I am pretty sure is GPU related: the GPU must have its display inputs in some default priority.
When I do a restart, my BIOS splash screen and such show up on my secondary monitor connected via DVI, but once the Windows login finally pops up, it shifts to the correct main display connected via DP.
This may have occurred even before I swapped out my TV, and certainly only started after changing GPUs and how my main screen is connected (both used to be via DVI).

So many little problems and troubles that I don't know where to begin or how to troubleshoot which component is causing them... very frustrating.


----------



## gtbtk

Quote:


> Originally Posted by *Dude970*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Interesting.
> 
> The newest drivers let me run memory 100mhz faster in Timespy than I could before. Maybe they focus on DX12 performance??
> 
> 
> 
> Did you have to lower your core clock? Voltage slider?
Click to expand...

I am using a custom curve and not the slider so that skews the "Core Clock" reading. I reduced the power target slider to 95%. The card started the bench at 2126mhz +500 (9100Mhz), was mostly running at 2114Mhz with occasional drops to 2080. The way I have my curve set up combined with the higher memory clocks, has a tendency for the card to want to pull more than 220Watts at peak loads and that tends to crash the card. Reducing the target slider takes the top off those power "spikes".

I have come to the conclusion that the MSI 126% power target setting may only be useful if you are using some sort of active cooling like LN2 because it is certainly way higher than the circuitry of the card on air can cope with in my rig. Techpowerup.com claims the MSI bios is set for 230W at 100% with a boost to 291W but, at least on an Asus Z68 motherboard with PCIe 2.0, the card cannot cope at those levels even though it is being fed by a platinum rated HX850i power supply in single rail mode.

I don't have a modern pcie 3.0 board to test it available so the power draw issues may vary with the newer hardware.
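The power-target arithmetic here is straightforward. Taking Techpowerup's claimed 230W base figure for the MSI BIOS at face value (an assumption on my part), the slider positions work out to:

```python
BASE_TDP_W = 230  # TPU's claimed 100% power target for the MSI bios (assumption)

def power_limit_watts(slider_percent: float, base_tdp_w: float = BASE_TDP_W) -> float:
    """Power cap in watts for a given power-target slider setting."""
    return base_tdp_w * slider_percent / 100.0

print(power_limit_watts(126))  # 289.8 W, close to the quoted 291 W boost cap
print(power_limit_watts(95))   # 218.5 W, the reduced cap described above
```

So dropping the slider to 95% shaves roughly 11W off the 100% cap, which is consistent with it clipping the >220W spikes described above.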


----------



## madmeatballs

Quote:


> Originally Posted by *gtbtk*
> 
> Interesting.
> 
> The newest drivers let me run memory 100mhz faster in Timespy than I could before. Maybe they focus on DX12 performance??


Quote:


> Originally Posted by *TheGlow*
> 
> Newer drivers do seem a bit less. Previous ones I could run timespy as high as +215/+850 without crashing. Here mine crashed once and froze another time. When I did get it to finish, score was 130 points less.


Are you guys running the 3DMark stress test or just the benchmark? Mine is still the same and fails the stress test (Firestrike Ultra). No artifacts tho.


----------



## gtbtk

Quote:


> Originally Posted by *madmeatballs*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Interesting.
> 
> The newest drivers let me run memory 100mhz faster in Timespy than I could before. Maybe they focus on DX12 performance??
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheGlow*
> 
> Newer drivers do seem a bit less. Previous ones I could run timespy as high as +215/+850 without crashing. Here mine crashed once and froze another time. When I did get it to finish, score was 130 points less.
> 
> Click to expand...
> 
> Are you guys running 3dmark stress test or just the benchmark test. Mine is still the same and fails on stress test(Firestrike Ultra). No artifacts tho.
Click to expand...

Mine was the benchmark


----------



## DeathAngel74

May I please join?


----------



## asdkj1740

Quote:


> Originally Posted by *DeathAngel74*
> 
> May I please join?


Welcome to the Micron family, you are the chosen one.


----------



## asdkj1740

Quote:


> Originally Posted by *FastOne8*
> 
> Hello guys. Does this method http://www.overclock.net/t/1523391/easy-nvflash-guide-with-pictures-for-gtx-970-980/70 works with GTX 1070 cards? Can I flash both bioses for cards with Dual bios or second bios are protected and can't be flashed?


No, don't do it.


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> I am using a custom curve and not the slider so that skews the "Core Clock" reading. I reduced the power target slider to 95%. The card started the bench at 2126mhz +500 (9100Mhz), was mostly running at 2114Mhz with occasional drops to 2080. The way I have my curve set up combined with the higher memory clocks, has a tendency for the card to want to pull more than 220Watts at peak loads and that tends to crash the card. Reducing the target slider takes the top off those power "spikes".
> 
> I have come to the conclusion that the MSI 126% power target setting may only be useful if you are using some sort of active cooling like LN2 because it is certainly way higher than the circuitry of the card on air can cope with in my rig. Techpowerup.com claims the MSI bios is set for 230W at 100% with a boost to 291W but, at least on an Asus Z68 motherboard with PCIe 2.0, the card cannot cope at those levels even though it is being fed by a platinum rated HX850i power supply in single rail mode.
> 
> I don't have a modern pcie 3.0 board to test it available so the power draw issues may vary with the newer hardware.


TPU is probably wrong. Check these two out:






















http://tech.sina.com.cn/n/t/2016-07-19/doc-ifxuapvw2296082.shtml


----------



## ucode

TDP in ambiguous percentages again instead of Watts :/


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> tpu should be wrong. check these two out
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://tech.sina.com.cn/n/t/2016-07-19/doc-ifxuapvw2296082.shtml


It may well be my motherboard that shuts things down prematurely.

I made my comments based on monitoring my PSU and graphics-card power draw combined with the Afterburner TDP % reading. None of the power-draw numbers seem out of place or obviously wrong. With the MSI BIOS installed I don't remember Afterburner ever hitting 100% TDP in gaming or 3DMark, the exception being Furmark, where I can get the card to pull 250W maximum, but that is still not getting near 126%.


----------



## asdkj1740

It is said that the 1080 EVGA Classified LN2 BIOS has no power limit.
Quote:


> Originally Posted by *gtbtk*
> 
> It may well be my motherboard that shuts things down prematurely.
> 
> I made my comments based on monitoring my PSU and graphics card power draw combined with the afterburner TDP % reading. None of the power draw numbers seem out of place or obviously wrong. With the MSI bios installed I don't remember Afterburner ever hitting 100% TDP in gaming or 3dmark. The exception being in furmark where i can get the card to pull 250W maximum but that is still not getting near 126%


Have you flashed the MSI BIOS on your non-MSI card? Maybe that is the reason the number is wrong.
I am considering flashing the BIOS with the highest power limit among 1070s. Do you have the right version of nvflash?
But I have no idea which BIOS can be flashed to my FTW with Micron VRAM, as some in another thread said the Galax HOF and Gigabyte Xtreme BIOSes can't be cross-flashed.
Can any Zotac AMP Extreme user here tell me the max power draw during benchmarking and gaming?


----------



## asdkj1740

Or should I simply apply Liquid Ultra to the shunts on my PCB? But then the power draw can't be read correctly in software anymore.


----------



## RyanRazer

Quote:


> Originally Posted by *asdkj1740*
> 
> it is said that 1080 evga classified ln2 bios has no power limit
> have you flashed the msi bios on your non msi card? maybe that is the reason why the number is wrong.
> i am considering flash a bios that has highest power limit among 1070. do you have the right version of nvflash??
> but i have no idea which bios can be flashed to my ftw micron vram as some in another thread said galax hof and gigabyte xtreme bios cant be cross flashed.
> any zotac amp extreme user here can tell me how much is the max power draw during benchmarking and gaming?


120% slider in MSI AB... can't say about wattage.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> it is said that 1080 evga classified ln2 bios has no power limit
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> It may well be my motherboard that shuts things down prematurely.
> 
> I made my comments based on monitoring my PSU and graphics card power draw combined with the afterburner TDP % reading. None of the power draw numbers seem out of place or obviously wrong. With the MSI bios installed I don't remember Afterburner ever hitting 100% TDP in gaming or 3dmark. The exception being in furmark where i can get the card to pull 250W maximum but that is still not getting near 126%
> 
> 
> 
> have you flashed the msi bios on your non msi card? maybe that is the reason why the number is wrong.
> i am considering flash a bios that has highest power limit among 1070. do you have the right version of nvflash??
> but i have no idea which bios can be flashed to my ftw micron vram as some in another thread said galax hof and gigabyte xtreme bios cant be cross flashed.
> any zotac amp extreme user here can tell me how much is the max power draw during benchmarking and gaming?
Click to expand...

I have an MSI card. I have flashed the Asus Strix, Zotac AMP Extreme, Galax Gamer, EVGA SC and FTW, Palit SJ, Gainward GLH, and Gigabyte Xtreme BIOSes to my card to see how they go.

None of the BIOSes changed the Micron behavior, and overclocks didn't improve over what I could get with the MSI BIOSes on my card. The BIOSes with 1671MHz core clocks were not stable last time I tried them; maybe I should revisit them with the new drivers.

The EVGA BIOSes did allow me to try out the auto OC in Precision XOC - not particularly reliable. Their max power limit is set significantly lower than the MSI Gaming cards'. The end result is that the EVGA cards' core frequency jumps around a lot more than the other cards', as it hits the power limit when it doesn't on the MSI BIOS.

NVflash is available early on in this thread. Search for "Joe Dirt".

The MSI BIOS has the highest power-limit slider adjustment at 126%, although I have noticed that the Zotac has a higher TDP. The Asus Strix is 125% and is a pretty close match for the MSI card, in spite of the Strix only having an 8-pin and the MSI having 8+6-pin power.

The HOF cards have something like a 14-phase VRM, so they may use a different voltage controller than everyone else. I have not seen a HOF BIOS, so I don't know if it works or not. The standard VC is only 8-channel; EVGA use 5 channels and doublers to make the 10-phase VRM on the FTW.

I would suggest that you find a PC with an iGPU that you can get access to, just in case you brick the card and need to recover it.


----------



## TheGlow

Quote:


> Originally Posted by *madmeatballs*
> 
> Are you guys running 3dmark stress test or just the benchmark test. Mine is still the same and fails on stress test(Firestrike Ultra). No artifacts tho.


Benchmark. I have the free demo so can't do the other things.


----------



## Dude970

Quote:


> Originally Posted by *gtbtk*
> 
> I am using a custom curve and not the slider so that skews the "Core Clock" reading. I reduced the power target slider to 95%. The card started the bench at 2126mhz +500 (9100Mhz), was mostly running at 2114Mhz with occasional drops to 2080. The way I have my curve set up combined with the higher memory clocks, has a tendency for the card to want to pull more than 220Watts at peak loads and that tends to crash the card. Reducing the target slider takes the top off those power "spikes".
> 
> I have come to the conclusion that the MSI 126% power target setting may only be useful if you are using some sort of active cooling like LN2 because it is certainly way higher than the circuitry of the card on air can cope with in my rig. Techpowerup.com claims the MSI bios is set for 230W at 100% with a boost to 291W but, at least on an Asus Z68 motherboard with PCIe 2.0, the card cannot cope at those levels even though it is being fed by a platinum rated HX850i power supply in single rail mode.
> 
> I don't have a modern pcie 3.0 board to test it available so the power draw issues may vary with the newer hardware.


Thanks, I have played with the curve a little. Will try to figure it out


----------



## mrtbahgs

Small update to my screen issues: I thought a bit outside the box to fix the Windows 7 limitation of 8192 pixels of desktop width by setting up my displays like this:


So now I no longer have Aero issues, along with a few other things that seemed to be tied to that.
It is a bit awkward at first to get used to where my third screen is when I need it, but it's worth it to have things seemingly working correctly now.
Hopefully this saves someone else the trouble that I had, and they can just go to this type of setup from the start.

Hopefully the only thing left to do now is to look into my cables and receiver to see which is causing the TV to run at only a 30Hz refresh rate.
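One quick thing to check on the 30Hz issue: if the signal runs through an older HDMI 1.4 receiver, 4K at 60Hz simply doesn't fit in the link budget. A rough pixel-clock estimate (the ~20% blanking overhead and the 340Mpx/s HDMI 1.4 ceiling are ballpark assumptions, not exact spec figures):

```python
HDMI_14_MAX_PIXEL_RATE = 340e6  # approx. max TMDS pixel rate for HDMI 1.4

def pixel_clock_hz(width: int, height: int, refresh_hz: int, blanking: float = 1.2) -> float:
    """Rough pixel clock including ~20% blanking overhead (CVT-RB ballpark)."""
    return width * height * refresh_hz * blanking

# 4K60 blows past the HDMI 1.4 budget, while 4K30 fits - hence a 30Hz cap.
print(pixel_clock_hz(3840, 2160, 60) <= HDMI_14_MAX_PIXEL_RATE)  # False
print(pixel_clock_hz(3840, 2160, 30) <= HDMI_14_MAX_PIXEL_RATE)  # True
```

If the receiver is the HDMI 1.4 link in the chain, a direct cable to an HDMI 2.0 port on the TV would be the thing to try.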


----------



## zipper17

Quote:


> Originally Posted by *madmeatballs*
> 
> Are you guys running 3dmark stress test or just the benchmark test. Mine is still the same and fails on stress test(Firestrike Ultra). No artifacts tho.


For stability testing after applying an OC, I run at least a couple of passes of the 3DMark Firestrike Extreme stress test, plus custom loops of Firestrike Extreme Graphics Test 2 for about 20,000-30,000 frames.
Games: Witcher 3, GTA5, Hitman at 1440p, settings maxed out, moderate AA.

If I haven't run those, I doubt my card is running stable (perfectly free of crashes/artifacting).


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> I am using a custom curve and not the slider so that skews the "Core Clock" reading. I reduced the *power target slider to 95%*. The card started the bench at 2126mhz +500 (9100Mhz), was mostly running at 2114Mhz with occasional drops to 2080. The way I have my curve set up combined with the higher memory clocks, *has a tendency for the card to want to pull more than 220Watts at peak loads and that tends to crash the card.* Reducing the target slider takes the top off those power "spikes".
> 
> I have come to the conclusion that the MSI 126% power target setting may only be useful if you are using some sort of active cooling like LN2 because it is certainly way higher than the circuitry of the card on air can cope with in my rig. Techpowerup.com claims the MSI bios is set for 230W at 100% with a boost to 291W but, at least on an Asus Z68 motherboard with PCIe 2.0, the card cannot cope at those levels even though it is being fed by a platinum rated HX850i power supply in single rail mode.
> 
> I don't have a modern pcie 3.0 board to test it available so the power draw issues may vary with the newer hardware.


Interesting, might want to try that.

I'm getting crashes to desktop in 3DMark tests/benches if the GPU load reaches 100%.

Btw, who else is still struggling to get a 21K or 22K graphics score in base Firestrike? 'Cause we're about in the same boat.

I don't know if my chip is already at its limit or I just haven't found the best tweaks for my card.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I am using a custom curve and not the slider so that skews the "Core Clock" reading. I reduced the *power target slider to 95%*. The card started the bench at 2126mhz +500 (9100Mhz), was mostly running at 2114Mhz with occasional drops to 2080. The way I have my curve set up combined with the higher memory clocks, *has a tendency for the card to want to pull more than 220Watts at peak loads and that tends to crash the card.* Reducing the target slider takes the top off those power "spikes".
> 
> I have come to the conclusion that the MSI 126% power target setting may only be useful if you are using some sort of active cooling like LN2 because it is certainly way higher than the circuitry of the card on air can cope with in my rig. Techpowerup.com claims the MSI bios is set for 230W at 100% with a boost to 291W but, at least on an Asus Z68 motherboard with PCIe 2.0, the card cannot cope at those levels even though it is being fed by a platinum rated HX850i power supply in single rail mode.
> 
> I don't have a modern pcie 3.0 board to test it available so the power draw issues may vary with the newer hardware.
> 
> 
> 
> Interesting, might want to try that.
> 
> Im getting crash to desktop in 3dmark test/bench if GPU load reach 100%.
> 
> Btw, Who else still struggle to get +21K or 22K Graphic on Firestrike Base? coz we're about on the same boat.
> 
> I dont know If my chip already at the limit or just still not find the best tweaks of my card.
Click to expand...

I tried Furmark this evening and the card was happily pulling 250W, so I don't know what's going on.

My best Firestrike graphics score is just under 20600.


----------



## outofmyheadyo

My Phoenix 1070 GS with +75 core (2050) and +600 mem (9200?) gives me a 20,767 graphics score in Firestrike.


----------



## TheGlow

Question: I have a Sony 60" TV that I hooked up to my 1070 with a high-quality fiber-optic HDMI cable.
I was looking to try Mafia 3 on it and I don't know the best way. I tried to set my PC to extended to keep separate resolutions, but couldn't see an option to get the game on that screen. I'm guessing I'd have to switch the primary and whatnot; extra work.
So I set it to mirror and then in game set it to 1080p. My TV is 1080p, but I was able to select 2560x1440 and it displayed. Is that an error? I would assume it would say out of range or something?
Also, I saw something like screen tearing, but incredibly bad. There would often be one line across the screen, usually scrolling down, kind of like when you'd record older TVs with a camera.
I tried vsync on and off; it didn't make a difference.
The only other thing I had tried was Witcher 3, but I only did that for 5 minutes, so I'm not sure if it also exhibited the line.


----------



## DeathAngel74

I have mine connected to a 47 in. Samsung, using an Audioquest Cinnamon HDMI cable. Set fast sync and triple buffering; no screen tearing. Latest game-ready drivers. Hope this helps.


----------



## TheDeadCry

Quote:


> Originally Posted by *DeathAngel74*
> 
> I have mine connected to 47 in. Samsung. I'm using Audioquest Cinnamon hdmi cable. Set fast sync and triple buffering, no screen tearing. Latest game ready drivers. Hope this helps.


This, and from all reports I've heard, the PC port is not very good (relatively speaking) I love me dual 21:9's


----------



## Roland0101

Quote:


> Originally Posted by *gtbtk*
> 
> Interesting.
> 
> The newest drivers let me run memory 100mhz faster in Timespy than I could before. Maybe they focus on DX12 performance??


So, I didn't have much time over the weekend; only Timespy for now.
Same results on my rig: I can run Timespy at an 8808MHz effective memory clock again (as last time, with 368.81).
Also, if I don't max out the memory, let's say using only 8692MHz effective, I can now get a +110 offset on the core clock (+90 before).


----------



## saunupe1911

Quote:


> Originally Posted by *outofmyheadyo*
> 
> My phoenix 1070 GS with +75 core (2050) +600mem(9200?) gives me 20 767 graphics score in firestrike.


My memory maxes out at 9450, which gives me about a 20800 score, but I normally game around 9200 to keep temps down and fans silent. That's all I need for this generation of games maxed out at 1080p or 2K.

I've quit trying for 21000. I would need to sit at 2114 on the GPU and probably 9600 on the memory consistently during the entire run. Boost 3.0 just throttles down to the high 19000s once the card passes 55 degrees.

I just wish we could programmatically tell Boost 3.0 to only throttle at a certain temp, let's say 65 degrees. Seems to me Boost 3.0 is flat out too conservative, so that we won't burn these cards up or hit GTX 1080 numbers.


----------



## RyanRazer

Quote:


> Originally Posted by *saunupe1911*
> 
> My memory maxes out at 9450 which gives m about 20800 score, but I normally game around 9200 to keep temps and fans silent. That's all I need for this generation of games at maxed out 1080p or 2k games
> 
> I've quit trying for 21000. I would need to sit at 2114 GPU and probably 96000 memory consistent during the entire run. Boost 3.0 just gone throttle done to the high 19000 once the card goes 55 degrees
> 
> I just wish we could programmatic tell Boost 3.0 to only throttle at a certain temp. Lets say 65 degrees. Seems to me Boost 3.0 is flat out too conservative so that we won't burn these cards up or 1080 GTX numbers


I think it's not about Nvidia cooling our GPUs so much as about keeping clocks from getting high enough to potentially match GTX 1080 performance (like the 900 series).

"You want 1080 performance? Pay for it..." is the logic, I think. Same for the 1080 and Titan: performance-class separation.


----------



## outofmyheadyo

It has been so long since the release that I really don't think we will see a BIOS editor. I hope this won't be the norm going forward.


----------



## madmeatballs

Quote:


> Originally Posted by *RyanRazer*
> 
> I think it's not about Nvidia cooling our GPUs as much as about not reaching to high clocks and potentially matching gtx1080 performance (like 900 series).
> 
> "If you wan't 1080 performance? Pay for that..." is the logic i think... Same for 1080 and Titan; performance (class) separation.


Well, as someone said a few posts back we are in "overclock.net" after all.


----------



## RyanRazer

Quote:


> Originally Posted by *madmeatballs*
> 
> Well, as someone said a few posts back we are in "overclock.net" after all.


I am aware of that. But that doesn't mean Nvidia is prepared to give you free performance. They want you to buy the 1080... (a higher-tier GPU in general).


----------



## saunupe1911

But hey, at least quite a few cards are able to sniff at very low-end 1080 speeds. Not bad... not bad at all.


----------



## RyanRazer

Great card indeed


----------



## lanofsong

Hey GTX 1070 owners,

We are having our monthly Foldathon from Monday 17th - 19th 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

October Foldathon

To get started:

1. Get a passkey (allows for a speed bonus) - needs a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter the Team OCN number - 37726

later
lanofsong


----------



## RyanRazer

Quote:


> Originally Posted by *lanofsong*
> 
> Hey GTX 1070 owners,
> 
> We are having our monthly Foldathon from Monday 17th - 19th 12noon EST.
> Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.
> 
> October Foldathon
> 
> To get started:
> 
> 1.Get a passkey (allows for speed bonus) - need a valid email address
> http://fah-web.stanford.edu/cgi-bin/getpasskey.py
> 
> 2.Download the folding program:
> http://folding.stanford.edu/
> 
> Enter your folding name (mine is the same as my OCN name)
> Enter your passkey
> Enter Team OCN number - 37726
> 
> later
> lanofsong


I suppose I need a good internet connection for that?

Would 100KB/s upload be good enough?


----------



## TheDeadCry

Quote:


> Originally Posted by *RyanRazer*
> 
> I suppose i need a good internet connection for that?
> 
> 100kb upload wouldn't be much of a good?


I've got Gigabit internet, ftw.







Finally. Free for students...although it's technically provided by the county, so I use my VPN. lol


----------



## DeathAngel74

First off, I was one of those unfortunate customers who got Micron VRAM. If I'm getting 2038/8496MHz, should I RMA the card? I saw a white checkerboard once yesterday, but I was OCing the VRAM to 8996MHz and the core to 2114MHz. It's been fine since I toned it down. Temps are 38-42C under full load. I was not even aware of this issue; I just assumed my card couldn't do 9GHz and the VRAM OC was too high, lol.


----------



## lanofsong

Quote:


> Originally Posted by *RyanRazer*
> 
> I suppose i need a good internet connection for that?
> 
> 100kb upload wouldn't be much of a good?


No harm in trying to fold a few units just to see.


----------



## outofmyheadyo

Unless you have problems at default frequency, you have no reason to RMA. That's what you were sold; no OC is guaranteed.


----------



## DeathAngel74

Oh OK, so it was really that I had just pushed the card to its limit. Just making sure, thanks! By default you mean the SC'd speeds, correct? Mine overclocks fine, even past what EVGA set it to. I just have to be mindful of VRAM OCing and overall temps.


----------



## TheDeadCry

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Unless you have problems on default frequenzy u have no reason to rma, thats what u are sold, no OC is quaranteed.


"Please understand": this is an OC forum, so you have to expect some people taking issue with their overclocking potential. lol Edit: However, I do agree that an RMA isn't necessary, and it will probably cost you money for a process that is unlikely to change the overclocking situation much, if at all. Consider that MSI, at least, just ships a "refurbished" card that somebody else sent them for RMA.


----------



## DeathAngel74

I just bought the card 2 days ago. EVGA is working with NVIDIA on a fixed vBIOS update. I am within the return policy.


----------



## kevindd992002

When using the latest beta version of Afterburner, do I just choose "third party" for the unlock voltage control option considering that my 1070 is a Zotac?


----------



## long99x

Hey, can I overclock my Zotac Extreme 1070? My voltage is stuck at 1050mV; I raise the voltage in MSI Afterburner but it has no effect.


----------



## RyanRazer

Quote:


> Originally Posted by *long99x*
> 
> Hey, can I overclock my zotac extreme 1070, my vol stuck at 1050mV, raise vol from msi afterburn but no effect.


Did you allow your card more power too? Some screenshots would be nice. Like this?


----------



## TheGlow

Quote:


> Originally Posted by *TheDeadCry*
> 
> I've got Gigabit internet, ftw.
> 
> 
> 
> 
> 
> 
> 
> Finally. Free for students...although it's technically provided by the county, so I use my VPN. lol


Since you go through a VPN, doesn't that affect your ping and speeds? Unless that also supports gigabit.


----------



## kevindd992002

@RyanRazer

Any ideas on my question above? Also, what core and memory clocks are you running your overclocked 1070 at?


----------



## TheDeadCry

Quote:


> Originally Posted by *TheGlow*
> 
> Since you got the through VPN, doesnt that affect your ping and speeds? Unless that also supports gigabit.


All VPNs have some overhead, that's true, but I use PIA, which is exceptionally good. Speed and ping are only minimally affected; PIA has a great network of servers. For the most part, you wouldn't notice any difference in network performance. On a very few occasions, due to server load or otherwise, you get more significant overhead, but that happens rarely. When it does, you can simply switch it off for a few minutes or change servers (they have tons). I've played online with the VPN connected and it's completely playable, with minimal latency (except for a server hiccup on rare occasion). Download speed is more or less the same, if you can believe it: maybe a 50Mb/s dip on a gigabit connection, so a very small amount.


----------



## RyanRazer

Quote:


> Originally Posted by *kevindd992002*
> 
> @RyanRazer
> 
> Any ideas on my question above? Also, what core and memory clocks are you running your overclocked 1070 at?


I use standard MSI; works fine for me. I also have an AMP Extreme.


----------



## long99x

Quote:


> Originally Posted by *RyanRazer*
> 
> did you allow you card more power too? Some SS would be nice. Like this?


You mean the power limit? Yes, I raised that too, but my card can't go over a 2200MHz core clock. My temp is fine, max 60C.


----------



## kevindd992002

Quote:


> Originally Posted by *RyanRazer*
> 
> i use standard msi, works fine with me. i also have amp ex


Any reason for not using "third party", though? I was assuming that standard MSI is for their own cards.


----------



## RyanRazer

Quote:


> Originally Posted by *long99x*
> 
> you mean power limit, yes, I raise this too, but my card can't go over 2200mhz core clock, my temp is fine, max 60C


My card can't go past a 2088MHz core clock...







You are fine, mate. Temp is great too.







Are you sure you get 2100+MHz with 1.05V?
Could you post some screenshots with the graph?

Quote:


> Originally Posted by *kevindd992002*
> 
> Any reason for not using "third party" though. I was assuming that standard MSI is for their cards.


Not really. I just used that from the start; never tried third party. You can try it and report back.







I don't have time to test that now. Maybe tomorrow.


----------



## long99x

Quote:


> Originally Posted by *RyanRazer*
> 
> My card can't go pass 2088Mhz core clock...
> 
> 
> 
> 
> 
> 
> 
> You are fine mate. Temp is great too
> 
> 
> 
> 
> 
> 
> 
> Are you sure you get 2100+Mhz with 1,05V?
> Could you post some SS with graph?
> Not rly. I just used that from the start, never tried third party. you can try and report back
> 
> 
> 
> 
> 
> 
> 
> i don't have time now to test that. Maybe tomorrow.


I'm at work now; when I get home I will take a screenshot.


----------



## kevindd992002

Quote:


> Originally Posted by *RyanRazer*
> 
> My card can't go pass 2088Mhz core clock...
> 
> 
> 
> 
> 
> 
> 
> You are fine mate. Temp is great too
> 
> 
> 
> 
> 
> 
> 
> Are you sure you get 2100+Mhz with 1,05V?
> Could you post some SS with graph?
> Not rly. I just used that from the start, never tried third party. you can try and report back
> 
> 
> 
> 
> 
> 
> 
> i don't have time now to test that. Maybe tomorrow.


I tried, but I'm not sure how to compare it with "standard MSI". Mine can't go past a 2088MHz core clock either. The weird thing in the monitoring graph is that mine starts at 2101MHz for a few seconds, then drops to 2088MHz, and finally stays longest at 2076MHz, all during a Heaven benchmark run. That is throttling, correct? But due to what, temp or voltage? I don't think it's TDP, because power usage is nowhere near 100%.
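The 2101 / 2088 / 2076 pattern is consistent with GPU Boost stepping down one bin (roughly 13MHz) at a time as the core crosses internal temperature thresholds. A rough illustrative model; the threshold temperatures below are made up for the example, and only the bin size is taken from the clocks reported in this thread:

```python
# Illustrative model of GPU Boost dropping the core clock in ~13 MHz bins as
# the GPU warms up. The temperature thresholds here are hypothetical; the bin
# size matches the 2101 -> 2088 -> 2076 steps observed above.

BIN_MHZ = 13

def boosted_clock(max_clock: int, temp_c: int, thresholds=(37, 45, 55)) -> int:
    """Drop one boost bin for each temperature threshold crossed."""
    bins_dropped = sum(1 for t in thresholds if temp_c >= t)
    return max_clock - bins_dropped * BIN_MHZ

print(boosted_clock(2101, 30))  # cool card: full boost, 2101
print(boosted_clock(2101, 50))  # two thresholds crossed: two bins lower
print(boosted_clock(2101, 60))  # all three crossed: three bins lower
```

This is only a sketch of the behavior, not the actual Boost 3.0 algorithm; the point is that small, stepped clock drops at modest temperatures are temperature throttling, not a TDP limit.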


----------



## ygryk

Guys, I need your experience and advice.

My rig is a Raven RVZ02 with an MSI Aero GTX 1070 OC, a motherboard from an Alienware X51 R1, and a Core i7-3770.
My GPU is not overclocked, but I found it heating up to 82C while playing.

Checking GPU-Z and MSI Afterburner, I found it throttling the GPU core from 1870MHz down to 1730MHz. Even setting the fan to 80% at 55C doesn't resolve the issue.

The max GPU core clock I saw was 1936MHz for the first seconds of tests or games. Then it goes down to 1730MHz.

Is this normal? I've seen the same issue reported by FE owners and other blower-cooler owners, and all of them say it's a normal situation if the GPU heats up to 82C.

Please help me understand what can be done, or whether I should leave it as it is.

Thanks!


----------



## khanmein

Quote:


> Originally Posted by *ygryk*
> 
> Guys, I need your experience and advise.
> 
> My rig is Raven RVZ02 and MSI Aero GTX 1070 OC, motherboard from Alienware X51 R1, and Core i7-3770.
> I have my GPU without overclocking, but I found it heating up to 82C degrees while playing.
> 
> Cheking GPU-Z and MSI Afterburner I found it throttling the GPU Core from 1870MHZ down to 1730MHZ. Even I set cooler for 80% at 55C it doesn't happen to resolve the issue.
> 
> So the MAX GPU Core I saw is at 1936MHZ for first seconds of tests or playing games. Then it goes down for 1730MHz.
> 
> Is it normal? I saw the same issues for FE owners or owners of Turbine cooling system and all of them saying - It is normal situation if GPU heats up to 82C.
> 
> Please help me to understand what can be done or should I leave it as it is.
> 
> Thanks!


Your case has very restricted airflow. For me personally, above 80°C is not acceptable, and gaming at 82°C will throttle the card down a little. My suggestion: use a blower-type cooler like the reference (Founders Edition) design, or point a stand fan at your case.


----------



## ygryk

Quote:


> yr case got very restricted air flow but for me personally, above 80°c is not acceptable but during gaming with 82°c will slightly throttled down a lil bit. my suggestion for u is use blower type like reference cooler aka founder edition or put a stand fan blow yr case.


I know airflow is limited compared to regular mid-towers. But the blower is supposed to exhaust the card's air outside the case, which is fine from my point of view. Based on what I've seen on the Internet from FE and other blower-cooler owners, it is a normal situation for these cards to heat up to 82C (but not for me).

The cooler type is the same as the FE - see the MSI product page: https://www.msi.com/Graphics-card/GeForce-GTX-1070-AERO-8G-OC.html#hero-overview

Thanks!


----------



## outofmyheadyo

Quote:


> Originally Posted by *TheDeadCry*
> 
> "Please Understand" This is an OC forum, so you have to expect some people taking issue with their overclocking potential. lol Edit: However I do agree that an RMA isn't necessary, and will probably cost you money for a process that will not likely change the overclocking situation by much, if at all. Consider MSI at least, just ships a "refurbished" card whom somebody else sent them to RMA.


I understand, but it's like buying a car that has a top speed of 200km/h and wanting to take it back because it won't go 300; makes no sense.


----------



## khanmein

Quote:


> Originally Posted by *outofmyheadyo*
> 
> I understand but it`s like buying a car that has a top speed of 200km/h and u want to take it back because it wont go to 300, makes no sense.


Nope, some people have issues even at default stock settings. If not, why are so many users in my country selling off their GTX 1070s in a hurry?

https://forum.lowyat.net/index.php?act=Search&CODE=show&searchid=fae3f4c55c3e318bab5330b0d73121e8&search_in=posts&result_type=topics&highlite=GTX+1070


----------



## khanmein

Quote:


> Originally Posted by *ygryk*
> 
> I know it is limited for AirFlow compare to regular MidTowers. But the thing is that turbine MUST flow the air through the card outside the case and it is ok from my point of view. Based on what I saw in Internet from FE owners or Turbine cooling system owners they said that it is regular situation for this cards to heat up to 82 (but not for me
> 
> 
> 
> 
> 
> 
> 
> .
> The type of cooler is the same as FE - See the link to MSI product https://www.msi.com/Graphics-card/GeForce-GTX-1070-AERO-8G-OC.html#hero-overview
> 
> Thanks!


No need to worry as long as it doesn't constantly hit above 90°C.


----------



## misoonigiri

Quote:


> Originally Posted by *ygryk*
> 
> Guys, I need your experience and advise.
> 
> My rig is Raven RVZ02 and MSI Aero GTX 1070 OC, motherboard from Alienware X51 R1, and Core i7-3770.
> I have my GPU without overclocking, but I found it heating up to 82C degrees while playing.
> 
> Cheking GPU-Z and MSI Afterburner I found it throttling the GPU Core from 1870MHZ down to 1730MHZ. Even I set cooler for 80% at 55C it doesn't happen to resolve the issue.
> 
> So the MAX GPU Core I saw is at 1936MHZ for first seconds of tests or playing games. Then it goes down for 1730MHz.
> 
> Is it normal? I saw the same issues for FE owners or owners of Turbine cooling system and all of them saying - It is normal situation if GPU heats up to 82C.
> 
> Please help me to understand what can be done or should I leave it as it is.
> 
> Thanks!


Just an FYI,
http://www.overclock.net/t/1466816/silverstone-raven-rvz01-rvz02-ml07-ml08-ftz01-owners-club/6300#post_25421677
http://www.overclock.net/t/1466816/silverstone-raven-rvz01-rvz02-ml07-ml08-ftz01-owners-club/6380#post_25481577


----------



## gtbtk

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RyanRazer*
> 
> i use standard msi, works fine with me. i also have amp ex
> 
> 
> 
> Any reason for not using "third party" though. I was assuming that standard MSI is for their cards.

The 1070s all use the same voltage controller (except possibly the Galax HOF). Afterburner has a hardware database update feature that lets you add new types of supported voltage controllers, accessed through the third-party section. As the Zotac uses the same type of voltage controller as the MSI card, it should work fine.


----------



## gtbtk

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheDeadCry*
> 
> "Please Understand" This is an OC forum, so you have to expect some people taking issue with their overclocking potential. lol Edit: However I do agree that an RMA isn't necessary, and will probably cost you money for a process that will not likely change the overclocking situation by much, if at all. Consider MSI at least, just ships a "refurbished" card whom somebody else sent them to RMA.
> 
> 
> 
> I understand but it`s like buying a car that has a top speed of 200km/h and u want to take it back because it wont go to 300, makes no sense.

You are forgetting that the 200km/h car is sold with an empty NO2 bottle installed as standard. It is not unreasonable to expect that if you take the time to refill the NO2 bottle, the car will get an extra kick.

While I agree that there is no guarantee of a 2200MHz core overclock or an 800MHz memory OC, all of these cards are marketed as overclockable and ship with Afterburner or an equivalent software package, which suggests there is some sort of OC potential.


----------



## kevindd992002

Quote:


> Originally Posted by *gtbtk*
> 
> The 1070s all use the same voltage controller (except possibly the Galax HOF). Afterburner has a hw database update feature that lets you add new types of supported voltage controllers that are accessed through the third party section. As the Zotac uses the same as the same type of voltage controller as the MSI one, it should work fine.


I see, thanks for the clarification! Now, what is the difference between standard MSI and extended MSI?


----------



## ucode

Quote:


> Originally Posted by *gtbtk*
> 
> You are forgetting that the 200 mph car is sold with an empty NO2 bottle installed as standard. It is not unreasonable to expect that if you do take the time to refill the NO2 bottle the car will get an extra kick


But if you don't also upgrade the brakes and transmission you may crash and burn.


----------



## gtbtk

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The 1070s all use the same voltage controller (except possibly the Galax HOF). Afterburner has a hw database update feature that lets you add new types of supported voltage controllers that are accessed through the third party section. As the Zotac uses the same as the same type of voltage controller as the MSI one, it should work fine.
> 
> 
> 
> I see, thanks for the clarification! Now what is the difference between the standard MSI and extended MSI?

Don't know exactly, other than that "extended" is supposed to give you more features.

If you look in the Afterburner program directory, there is a subdirectory called "Doc" containing a PDF that describes how the features work.


----------



## gtbtk

Quote:


> Originally Posted by *ucode*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You are forgetting that the 200 mph car is sold with an empty NO2 bottle installed as standard. It is not unreasonable to expect that if you do take the time to refill the NO2 bottle the car will get an extra kick
> 
> 
> 
> But if you don't also upgrade the brakes and transmission you may crash and burn.

Assuming the brakes and transmission are sub-par to begin with. As with everything, cars and GTX 1070s included, you pay more and you get better extras and supporting components.


----------



## RyanRazer

off-topic: anyone plays paragon here?


----------



## long99x

Quote:


> Originally Posted by *RyanRazer*
> 
> My card can't go pass 2088Mhz core clock...
> 
> 
> 
> 
> 
> 
> 
> You are fine mate. Temp is great too
> 
> 
> 
> 
> 
> 
> 
> Are you sure you get 2100+Mhz with 1,05V?
> Could you post some SS with graph?
> Not rly. I just used that from the start, never tried third party. you can try and report back
> 
> 
> 
> 
> 
> 
> 
> i don't have time now to test that. Maybe tomorrow.


ss here


----------



## RyanRazer

Quote:


> Originally Posted by *long99x*
> 
> ss here


So this is during benchmarking? Crazy... I can't run Heaven at 2114 without artifacts even with voltage +100. Can you pass the Heaven benchmark without artifacting? That is one efficient chip there. Unbelievable.


----------



## saunupe1911

Quote:


> Originally Posted by *RyanRazer*
> 
> So this is during benchmarking? crazy... i can't run heaven with 2114 without artifacts with voltage +100... Can you pass heaven benchmark without artifacting? That is some efficient chip there. unbelievable


Man, I run Heaven and Firestrike at 2050/9400 with voltage set to 0 (normal) and the power limit set to 120%. I'm going to check the lowest power limit % I can use before it crashes. Efficiency plus power is the goal of my setup: keep temps, wattage, and longevity in check at near-max performance, then use other profiles to crank it up if need be.


----------



## Roland0101

Quote:


> Originally Posted by *khanmein*
> 
> nope even at default stock setting also got issue.. if not y so many user selling off their GTX 1070 in a hurry over my country??


DeathAngel74, who asked the question outofmyheadyo answered, made it specifically clear that he doesn't have a problem at stock settings, so outofmyheadyo is simply right.
Overclocking is not guaranteed, and therefore a card that doesn't overclock as high as others (DeathAngel74 can reach an 8496MHz memory clock; not sure if he locked the voltage or not) is not a reason to RMA.
Of course, he could try anyway; maybe he gets lucky.


----------



## zipper17

I doubt Valley/Heaven; in 3DMark Firestrike I got random crashes right away.

If you can get stable in 3DMark too, congrats, and please shed some light on how to get the core clock higher.


----------



## khanmein

Quote:


> Originally Posted by *Roland0101*
> 
> DeathAngel74 who asked the question outofmyheadyo answered to made specifically clear that he don't has a problem on stock settings, so outofmyheadyo is simply right.
> Overclocking is not guaranteed and therefor a card that is not overclocking as high as others (DeathAngel74 can reach 8496MHz memory clock , not sure if he looked the voltage or not.) is not a reason to RMA.
> Off course he could try anyway, maybe he get lucky.


I don't wanna take the risk. A few users, including yourself, got lucky. To think positive, I'd rather stick with Samsung, even if it might give me an explosion surprise.


----------



## madmeatballs

The shunt method might improve stability, but I'm unsure. I'll be trying it on my AMP Extreme soon; let's see what happens.


----------



## Roland0101

Quote:


> Originally Posted by *khanmein*
> 
> i don't wanna take the risk. few users including yourself got lucky & to think positive i rather stick with samsung that might give me explosion surprise.


Hope they didn't attach a Galaxy S7 to your GPU.

Edit: Oh, and I doubt that only a few users got lucky. I have access to three GTX 1070s, and all of them run perfectly fine at stock clocks and also overclock decently: not as well as many Samsung cards on the memory side, but not badly either.


----------



## Kronos8

I've been following this thread ever since I decided to go with the GTX 1070 and found out about the Micron "issue". I have never bought a GPU this expensive, and I don't yet have the "feel" of using a high-end GPU. I can't afford a bad decision, so I am still undecided on when to buy and which brand. I would like to add some food for thought to the conversation, and maybe you can help me understand some details I missed.

Partner cards have additional VRM phases (compared to the FE) and a higher TDP (compared to the FE). These, I understand, exist mainly for OC. If 4 people can lift a couch, there is no reason to lift it with 8 people, unless you intend to make the couch heavier; in our example, that means raising the power delivered to the GPU (i.e., OCing it). From many videos I have watched on YT, I understand that the way to OC a 1070 (all 10xx cards, actually) is pretty straightforward: max out the TDP, temp limit, and voltage (max voltage is locked), add a core and/or memory offset, and test. Given how the core clock drops as temp and TDP rise, I understand that a good OC is not something the user can achieve solely on their own; it is software- and/or hardware-dependent, and each parameter affects the others.

If the above is correct, then for much better OC results, unlocking the max voltage and/or changing the way each parameter affects the others (through a BIOS edit, maybe?) is the only way to go.

If that thought is correct, my questions are:

1. Is it possible/feasible to unlock the max voltage and edit the BIOS?
2. If yes, won't Micron memory with the issues mentioned hold the card back from even better OCs?
3. If no, then what is the point of buying more expensive cards with more VRM phases and higher TDPs?

As I wrote, food for thought.
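The procedure described above (max out the limits, then raise the offset and test) is essentially a search for the highest stable offset. A minimal sketch of that loop, where `is_stable` is a hypothetical stand-in for a real stress test (a Heaven/Firestrike run plus an artifact check), not a real tuning API:

```python
# Sketch of the usual offset-and-test overclocking loop. `is_stable` stands in
# for a real stress test; here it is a dummy that pretends this particular
# chip becomes unstable past a +110 MHz core offset.

STEP_MHZ = 10

def is_stable(offset_mhz: int) -> bool:
    """Hypothetical stability check; replace with an actual benchmark run."""
    return offset_mhz <= 110

def find_max_offset(limit: int = 300, step: int = STEP_MHZ) -> int:
    """Raise the core offset in small steps until the stress test fails."""
    best = 0
    for offset in range(0, limit + 1, step):
        if not is_stable(offset):
            break
        best = offset
    return best

print(find_max_offset())  # -> 110 with this dummy stability model
```

In practice each `is_stable` call is minutes of benchmarking, which is why people step in 10-25MHz increments rather than doing a fine-grained search.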


----------



## abctoz

Quote:


> Originally Posted by *Kronos8*
> 
> I've been following this thread ever since I decided to go with GTX 1070 and found out about the Micron "issue". I have never bought a GPU so expensive. I don't have the "feeling" of using a high end GPU. I can't afford a "bad decision", so I am still skeptical on when I will buy and which brand. I would like to add some "food for thought" in the conversation and maybe help me understand some details I missed.
> 
> Additional VRM's (compared to FE) and higher TDP (compared to FE). These, I understand, have to do mainly with OC. If you can lift a couch with 4 people, there is no reason to lift the couch with 8 people, unless you intend to raise the weight of the couch, meaning in our example the GPU delivered power (meaning OCing it). From many videos I have watched on YT, I understand that the way of OCing a 1070 (all 10xx actually) is pretty much straight forward. Max out TDP, Temp and voltage (max volt is locked), add cores and/or ram and test. The way the core drops as temp and TDP raises, I understand that good OC is something the user can not achieve solely by his/her own but it is software and/or hardware related and the way each parameter affects the other.
> 
> If the above are correct, then for much better OC results, unlocking the max voltage and/or manipulating the way each parameter affects the other (through BIOS edit, maybe?) is the only way to go.
> 
> If that thought is correct, my questions are
> 
> 1. Is it possible/feasible to unlock the max volt and edit the BIOS?
> 2. If yes, having Micron memories with issues mentioned won't push back the card from even better OC's?
> 3. If no, then what is the necessity of buying more expensive cards with more VRM's and higher TDP's?
> 
> As I wrote, food for thought.


1. Not sure, but from what I read about a month ago, nobody can go over 1.07v.
2. I think the Micron memory issue has something to do with the BIOS too; my Micron card could do ~+500-550 under load, but when idle, Windows would artifact. Samsung clocks higher on average, and AFAIK all FE cards have Samsung memory.
3. More VRM phases give more stable power output, meaning you could theoretically get a fractionally better overclock; in practice, because you're limited to 1.07v, it doesn't seem like an issue to me. A lower load on the VRMs also means they can potentially last longer.

I would recommend FEs, except for the fact that they run noisier and hotter. They're actually built like tanks with that metal shroud/backplate, plus you get Samsung memory. But custom coolers are still superior in heat/noise.


----------



## long99x

Quote:


> Originally Posted by *RyanRazer*
> 
> So this is during benchmarking? crazy... i can't run heaven with 2114 without artifacts with voltage +100... Can you pass heaven benchmark without artifacting? That is some efficient chip there. unbelievable


In Heaven and Valley my card throttles to around 2133~2150, and it's stable in The Witcher 3.


----------



## KedarWolf

I read that 1070 Founders Editions are the way to go because they are guaranteed to have Samsung memory. Anyone know if this is true?


----------



## kevindd992002

Quote:


> Originally Posted by *Roland0101*
> 
> Hope they didn't attached a galaxy s7 to your GPU.


You really meant Galaxy Note 7, right?







The S7 doesn't have the explosion issue.


----------



## madmeatballs

Quote:


> Originally Posted by *abctoz*
> 
> 1. not sure but from what I read about a month ago nobody can go over 1.07v
> 2. I think the Micron memory issue has something to do with bios too, my Micron card could do ~+500-550 with load but when idle windows would artifact. Samsung clocks higher on average, AFAIK all FE cards have Samsung memory.
> 3. More VRM's give more stable power output, meaning you could theoretically get a fractional better overclock, in practice because you're limited to 1.07v it doesn't seem like an issue to me. Lower load on VRMs mean they can potentially last longer
> 
> I would recommend FE's except for the fact that they run more noisy/hotter, they're actually built like tanks with that metal shroud/backplate plus you get Saumsung memory. But customer coolers are still superior in heat/noise.


I think the limit is 1.093v.


----------



## KedarWolf

I saw a 1070 voltage mod that uses conductive paste, which is removable and won't void the warranty.


----------



## sblocc10

OK, two questions/additions:

Where can I find that paste-based voltage mod?

AND

I own an MSI 1070 Armor and shorted the two shunts (2m0 and 5m0) --> the power limit reading now stays below 100 (around 95-99).

Next I will add SMD resistors on top of three caps around the power IC --> the power limit should then be gone completely (e.g. ~150 watts?).

Currently I max out at 2050MHz (sometimes 2037, or 2012 when the system gets over 80°).

The voltage is around 1.07 when gaming, but the heat on air won't allow much more, I think, because it sometimes reaches 85° or more!!

Now I read that I can also swap the installed GPU power caps (4x 330uF on the back of the card) for, say, 4x 470uF, which should also help reach higher clocks?!

That would be about 1/3 more capacitance available to the GPU...

SO, has anyone ever heard of someone trying this mod and reaching maybe over 2100 or 2200 on air at, say, 90-95°???

SHOULD BE POSSIBLE ANYHOW!!

https://xdevs.com/guide/pascal_oc/
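For context, shorting or stacking shunts raises the effective power limit because the card computes power from the voltage drop across those sense resistors; lowering the sensed resistance makes the card under-report its draw by the same ratio. A sketch of the arithmetic (the 200W figure is a made-up example; the 5mΩ value comes from the post above, and the xdevs guide covers the actual mod):

```python
# Sketch: stacking a resistor in parallel with a power-sense shunt lowers the
# sensed resistance, so the card under-reports power by the same ratio and
# its power limiter kicks in later. Values are illustrative.

def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def reported_power(actual_w: float, r_orig: float, r_new: float) -> float:
    """Power the card thinks it draws after the shunt mod."""
    return actual_w * (r_new / r_orig)

r_mod = parallel(0.005, 0.005)  # 5 mOhm shunt with 5 mOhm stacked -> 2.5 mOhm
print(reported_power(200.0, 0.005, r_mod))  # 200 W real draw reads as 100 W
```

This is also why the mod is risky: the VRM and PCIe power delivery still see the full real current even though the limiter no longer does.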


----------



## asdkj1740

Quote:


> Originally Posted by *madmeatballs*
> 
> Shunt method might improve stability but unsure. I'll be try it on my AMP Extreme soon, lets see what happens.


Has your card already reached the power limit at 300W?


----------



## sblocc10

As for mine, I either hit the power limit because of TDP (not the rated power limit!!!) or the voltage limit, depending on whether I use 8x MSAA in games, according to GPU-Z.


----------



## Roland0101

Quote:


> Originally Posted by *kevindd992002*
> 
> You really meant Galaxy Note 7, right?
> 
> 
> 
> 
> 
> 
> 
> The S7 doesn't have the explosion issue.


True; forgive my ignorance about Samsung phones, I am a typical iPhone user.









Quote:


> Originally Posted by *Kronos8*
> 
> 1. Is it possible/feasible to unlock the max volt and edit the BIOS?


It is possible, but I don't know if anyone has done it for a Pascal card yet. All "normal" 1070 vBIOSes limit the card to 1.093v.
Still, nothing kills a GPU faster than too much voltage, so even if it becomes available, you should know exactly what you are doing (or have enough money) before overvolting a card that much, especially one on a new manufacturing process. You will probably also need better cooling than air to achieve really good results.
Quote:


> 2. If yes, having Micron memories with issues mentioned won't push back the card from even better OC's?


I can overclock my Micron card to an 8808MHz effective memory clock. If I could get more voltage to the memory chips I could probably overclock them further. (The new BIOS will tell.)
Will I test that? Yes, and then I will run it near stock clocks for the next 1.5 - 2.5 years to keep it cool and quiet until I buy a new GPU. And that is because I play at 1440p (DSR) and there is nothing on the market or in the pipeline this card can't handle for that period of time.
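For reference, that effective figure lines up with the usual Afterburner arithmetic — a sketch assuming the GTX 1070's stock 8008 MHz effective GDDR5 clock and Afterburner's convention of applying the memory offset to the double-data-rate clock (so it counts twice in the quad-pumped figure):

```python
STOCK_EFFECTIVE_MHZ = 8008  # GTX 1070 stock GDDR5, quad-pumped ("effective") clock

def effective_mem_clock(offset_mhz):
    # Afterburner applies the GDDR5 offset to the double-data-rate clock,
    # so each +1 MHz of offset adds +2 MHz to the effective clock.
    return STOCK_EFFECTIVE_MHZ + 2 * offset_mhz

print(effective_mem_clock(400))  # a +400 offset gives 8808 MHz effective
```

This is also why the same card can be quoted as "8808MHz", "+400", or "2202MHz actual": they are all the same setting in different units.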
Quote:


> 3. If no, then what is the necessity of buying more expensive cards with more VRM's and higher TDP's?


Most partner cards are overclocked to begin with. One reason these cards have more VRMs and a higher TDP is that the manufacturers want to make sure their own overclocking is rock solid.
The second reason is of course to attract people who want to overclock the cards further.
Still, imho some designs are just overblown.


----------



## Roland0101

Quote:


> Originally Posted by *sblocc10*
> 
> SO, has anyone ever heard of someone trying this mod and reaching maybe over 2100 or 2200 on air at, say, 90-95° ???
> 
> SHOULD BE POSSIBLE ANYHOW !!


No, it isn't. The GPU would never hold those frequencies at those temperatures.


----------



## sblocc10

nah.. i could play doom for more than 10 hours at 2025 MHz with 1.063v, hovering around 87°C

afterburner lets you set up to 92° as the limit, so where's the problem?


----------



## sblocc10

and that was just in summer, in an mATX case with bad airflow!


----------



## asdkj1740

i flashed the zotac amp extreme bios on my evga ftw. works fine.
the zotac amp extreme has a 300w power limit at 120%, and the max power draw running fsu is 225w at 90%.
my card has an arctic hybrid 140 aio mounted, and the max temp running fsu is 48c.

that is, even without power throttling and temperature throttling, my card's core clock still can't be absolutely stable.
i have overclocked my card to 2164mhz at 1.08v using the voltage curve adjustment method, and most of the time running fsu it sat at 2152mhz, sometimes dropping to 2139mhz, which is the same as with the stock evga bios (215w max).
i am afraid that once an aio is used, a higher power limit bios may not help with stabilizing the core clock...
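The "voltage curve adjustment method" mentioned above is the usual Afterburner curve-editor trick: raise one voltage/frequency point to the target clock and flatten everything to its right, so GPU Boost never requests more voltage than that point. A rough sketch of the idea — the curve points below are made-up illustrative numbers, not read from any real card:

```python
def flatten_curve(curve, v_target, f_target):
    """Pin the V/F curve: set the point at v_target to f_target and clamp
    every higher-voltage point to the same frequency, so GPU Boost stops
    climbing past v_target."""
    return {v: (f_target if v >= v_target else f) for v, f in curve.items()}

# Hypothetical stock curve points (voltage -> MHz)
stock = {1.000: 1987, 1.043: 2050, 1.081: 2100, 1.093: 2126}
locked = flatten_curve(stock, 1.081, 2164)
# locked[1.081] and locked[1.093] are both 2164; lower-voltage points are untouched
```

The upside over a plain core offset is that only the chosen point gets the aggressive clock, so lower-voltage states stay at safe frequencies.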


----------



## Roland0101

Quote:


> Originally Posted by *sblocc10*
> 
> nah.. i could play doom for more than 10 hours at 2025 MHz with 1.063v, hovering around 87°C
> 
> afterburner lets you set up to 92° as the limit, so where's the problem?


The problem is that GPU Boost 3.0 will down-clock your card more the higher the temps go.

Read the last paragraph in the article you linked.


----------



## ucode

Here's an example of temperature down clocking running Fire Strike on air.










Down to 2150MHz at 80C; that's about a 60MHz drop for a 40C rise in temperature.
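That slope is consistent with how GPU Boost 3.0 steps the clock: it drops in fixed bins of roughly 13 MHz as the temperature crosses internal thresholds. A back-of-the-envelope sketch (the ~13 MHz bin size is the commonly observed Pascal value, not something read from this particular card):

```python
BIN_MHZ = 13  # approximate size of one Pascal boost clock bin

def bins_dropped(clock_drop_mhz):
    """Estimate how many boost bins a given clock drop corresponds to."""
    return round(clock_drop_mhz / BIN_MHZ)

drop_mhz, temp_rise_c = 60, 40
bins = bins_dropped(drop_mhz)
print(bins, temp_rise_c / bins)  # ~5 bins, i.e. roughly one bin per 8 C of heat
```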


----------



## asdkj1740

Quote:


> Originally Posted by *ucode*
> 
> Here's an example of temperature down clocking running Fire Strike on air.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Down to 2150MHz at 80C, that's about a 60MHz drop for 40C rise in temperature.


which card is that?
strix oc modded bios?


----------



## sblocc10

however, i just tried reading the old bios and flashing a different bios with nvflash with checks bypassed; it still doesn't accept any bios!
how do i manage that?


----------



## hkuve

IMAG0412.jpg 5300k .jpg file
Hi Guys,

I've recently replaced my GTX 970 with an Asus Strix 1070 (Micron VRAM) and did a fresh Windows installation. Everything was going fine; however, after I installed MSI Afterburner and did a little overclocking on the core (+170) and memory (+300), I started seeing dots when booting into Windows. I removed the MSI application but I'm still getting this flickering and these artifacts; I've attached a screenshot.

Could someone kindly advise whether the issue is from the new driver or from overclocking the Micron memory, or whether it could be a defective card that I should replace?

Thanks...


----------



## khanmein

Quote:


> Originally Posted by *hkuve*
> 
> IMAG0412.jpg 5300k .jpg file
> Hi Guys,
> 
> I've recently replaced my GTX 970 with an Asus Strix 1070 (Micron VRAM) and did a fresh Windows installation. Everything was going fine; however, after I installed MSI Afterburner and did a little overclocking on the core (+170) and memory (+300), I started seeing dots when booting into Windows. I removed the MSI application but I'm still getting this flickering and these artifacts; I've attached a screenshot.
> 
> Could someone kindly advise whether the issue is from the new driver or from overclocking the Micron memory, or whether it could be a defective card that I should replace?
> 
> Thanks...


try an RMA to get a card with samsung vram; if u can't, then stick with stock defaults & enjoy your gaming.


----------



## asdkj1740

Quote:


> Originally Posted by *hkuve*
> 
> IMAG0412.jpg 5300k .jpg file
> Hi Guys,
> 
> I've recently replaced my GTX 970 with an Asus Strix 1070 (Micron VRAM) and did a fresh Windows installation. Everything was going fine; however, after I installed MSI Afterburner and did a little overclocking on the core (+170) and memory (+300), I started seeing dots when booting into Windows. I removed the MSI application but I'm still getting this flickering and these artifacts; I've attached a screenshot.
> 
> Could someone kindly advise whether the issue is from the new driver or from overclocking the Micron memory, or whether it could be a defective card that I should replace?
> 
> Thanks...


did you press the reset button in msi ab? check gpu-z to see whether you are still overclocked.
if you still get those dots, then you should go for an rma or return it to the seller.


----------



## ucode

Quote:


> Originally Posted by *asdkj1740*
> 
> which card is that?
> strix oc modded bios?


It was a Pascal card run to show clock response to thermals; the graphs have been windowed enough to hopefully show that. FWIW though, it's a Founders Edition GTX 1080 with a cross-flashed T4 VBIOS. Do note that the T4 does not set a thermal limit, although hopefully the hard limits at 96C and 99C are still in effect.


----------



## hkuve

I didn't press the reset button but I uninstalled it and still observed these artifacts.


----------



## zipper17

Quote:


> Originally Posted by *ucode*
> 
> Here's an example of temperature down clocking running Fire Strike on air.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Down to 2150MHz at 80C, that's about a 60MHz drop for 40C rise in temperature.


What's your Firestrike graphics score? (basic Firestrike?)

Mine is about 20,8XX, the highest I can achieve stable (stability tested with the 3DMark stress test, witcher 3, hitman, gta5).

I can reach 21.1K, but with unstable memory and green flash-pop artifacts (+700-800mhz, 9400mhz).

Edit: nevermind, you have a GTX 1080.


----------



## RyanRazer

Quote:


> Originally Posted by *ucode*
> 
> Here's an example of temperature down clocking running Fire Strike on air.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Down to 2150MHz at 80C, that's about a 60MHz drop for 40C rise in temperature.


That Voltage though. Unlocked BIOS?


----------



## RyanRazer

Quote:


> Originally Posted by *RyanRazer*
> 
> 
> 
> 
> 
> 
> Who has the balls to do that?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GTX 1080 FE + GTX 1070 FE Power Limit Mod - Unlock the Power Target


Quote:


> Originally Posted by *sblocc10*
> 
> ok two questions/additions:
> 
> where can i find that mod with voltage paste-based ??
> 
> AND
> 
> i own an MSI 1070 ARMOR and shorted the two shunts (2m0 and 5m0) --> power limit reads constantly below 100 (around 95-99)
> 
> next i will add SMD resistors on top of three caps around the power IC ---> the power limit should be gone completely! (e.g. ~150 WATTS??)
> 
> currently i have a maximum of 2050 MHz (sometimes 2037, or 2012 when the system gets over 80°)
> 
> the voltage is around 1.07 when gaming, but the heat on air won't allow much more i think, because it reaches 85° or more sometimes!!
> 
> now i read that i can also change the installed GPU power caps of 4x 330 uF on the back of the card to, say, 4x 470 uF, which should help the card hold higher clocks as well ?!!
> 
> that would be about 1/3 more bulk capacitance feeding the GPU...
> 
> SO, has anyone ever heard of someone trying this mod and reaching maybe over 2100 or 2200 on air at, say, 90-95° ???
> 
> SHOULD BE POSSIBLE ANYHOW !!
> 
> https://xdevs.com/guide/pascal_oc/


here


----------



## TheBoom

Are the rumours true that there's a strix oc bios floating around with voltage at 1.24v?

That being said, has anyone managed to safely flash a different bios to the strix oc itself?


----------



## ucode

@zipper17 FWIW, graphics score is just a little over 26.1k. Great score for your 1070, bro. Much better performance per dollar than this 1080, which also has the advantage of extra voltage and no set power limit. It would be nice if Pascal VBIOSes could be modded by the public.

Seems like some bugginess with the GDDR5X as well, plus some issues with the core. Forcing p-states seems to kill the video clock, leaving it at a low frequency. Does the same happen with the 1070?

@RyanRazer it's a specially made VBIOS for the Asus Strix, not really intended for my card, but I seem to have gotten away with cross flashing it. Not sure, but I think it might have been propagated by an Asus engineer who has access to signing it or knows someone who does. I think he was also looking at a special 1070 VBIOS but I'm not sure what happened with that. More info can be found on the hwbot forum in the strix overclock thread.


----------



## Roland0101

Quote:


> Originally Posted by *hkuve*
> 
> I didn't press the reset button but I uninstalled it and still observed these artifacts.


But what are your clocks at? Are they back to stock?


----------



## DeathAngel74

2100/8504 stable on micron, 1.075v, 47c.


----------



## ITAngel

Quote:


> Originally Posted by *DeathAngel74*
> 
> 2100/8504 stable on micron, 1.075v, 47c.


Nice clock there. =)


----------



## asdkj1740

Quote:


> Originally Posted by *DeathAngel74*
> 
> 2100/8504 stable on micron, 1.075v, 47c.


47C, watercooling?


----------



## outofmyheadyo

Gainward Golden Sample GTX 1070 @ default 1.05v (don't think I can increase it with this card via software, and I don't have any liquid metal at hand)
Fans are on auto as well, so around ~33%; can't stand the noise









http://www.3dmark.com/3dm/15398566

*21 040 graphics score*

@ 2050 core
@ 9450 memory
4600 @ 6700K
3200-14-14-14-34 ram

Can't seem to go higher than that; I'm sure watercooling would help as well.


----------



## RyanRazer

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Gainward Golden Sample GTX 1070 @ default 1.05v (don't think I can increase it with this card via software, and I don't have any liquid metal at hand)
> 
> http://www.3dmark.com/3dm/15398566
> 
> *21 040 graphics score*
> 
> @ 2050 core
> @ 9450 memory
> 4600 @ 6700K
> 3200-14-14-14-34 ram
> 
> Can't seem to go higher than that; I'm sure watercooling would help as well.


this overclocking is a mystery to me.. You get a higher graphics score than me with lower clocks. HOW? Does the CPU count here too?!


----------



## outofmyheadyo

I think the memory clock has a bigger effect on the score than the core clock, not sure tho


----------



## DeathAngel74

No watercooling, on air @ 65% fans.


----------



## Roland0101

Quote:


> Originally Posted by *RyanRazer*
> 
> this overclocking is a mystery to me.. You get a higher graphics score than me with lower clocks. HOW? Does the CPU count here too?!


It can have several causes: cooling, PSU ripple, or error correction eating up the difference.
Try lowering your clocks a little, memory first, and test.


----------



## outofmyheadyo

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Gainward Golden Sample GTX 1070 @ default 1.05v (don't think I can increase it with this card via software, and I don't have any liquid metal at hand)
> Fans are on auto as well, so around ~33%; can't stand the noise
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/15398566
> 
> *21 040 graphics score*
> 
> @ 2050 core
> @ 9450 memory
> 4600 @ 6700K
> 3200-14-14-14-34 ram
> 
> Can't seem to go higher than that; I'm sure watercooling would help as well.


Oh lordy, I found the voltage slider in afterburner now (for some reason I took it for granted that it didn't work on my phoenix 1080, so I didn't try it on my 1070). It turns out +100mv bumps the voltage from 1.0500v to 1.0810v. Time to do some more testing.


----------



## DeathAngel74

I just found it yesterday in PX OC. 1.075-1.093v


----------



## outofmyheadyo

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Oh lordy, I found the voltage slider in afterburner now (for some reason I took it for granted that it didn't work on my phoenix 1080, so I didn't try it on my 1070). It turns out +100mv bumps the voltage from 1.0500v to 1.0810v. Time to do some more testing.


Well, I got some increase in core clock, from 2050 to 2125; that seems to be the limit for now!

timespy - graphics score 6837 - (6700K @ 4600, 3200 ram 14-14-14-34, gainward gtx 1070 phoenix GS @ 2125/9450)

firestrike - graphics score 21 485 - (6700K @ 4600, 3200 ram 14-14-14-34, gainward gtx 1070 phoenix GS @ 2125/9450)

Pretty happy with this card








Quote:


> Originally Posted by *DeathAngel74*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just found it yesterday in PX OC. 1.075-1.093v


Don't tell me about TXP, it destroys the 1070 and 1070 sli and whatever you throw at it


----------



## zipzop

I tried the CLU shunt mod today. And boy, that definitely works! Power% doesn't go above 100% with a 112% power limit (maxed slider) and no more "PerfCap Reason: Pwr" in GPU-Z (power throttling)... I did drop one frequency bin, 2088mhz to 2076, with the PerfCap reason being "VRel" (voltage reliability?).....

I also replaced the regular thermal paste with CLU. 50C is the hottest it gets! (with a semi-aggressive fan curve at 70% / 4400rpm) Great stuff, that is.

The test was GTAV maxed out at 1440p with constant 99% GPU usage


----------



## RyanRazer

Quote:


> Originally Posted by *zipzop*
> 
> I tried the CLU shunt mod today. And boy, that definitely works! Power% doesn't go above 100% with a 112% power limit (maxed slider) and no more "PerfCap Reason: Pwr" in GPU-Z (power throttling)... I did drop one frequency bin, 2088mhz to 2076, with the PerfCap reason being "VRel" (voltage reliability?).....
> 
> I also replaced the regular thermal paste with CLU. 50C is the hottest it gets! (with a semi-aggressive fan curve at 70% / 4400rpm) Great stuff, that is.
> 
> The test was GTAV maxed out at 1440p with constant 99% GPU usage


i always get voltage limit, never power... It usually stays at 98-99%.

By _"I tried the CLU shunt mod today"_, do you mean you applied a conductive compound like liquid metal over the shunt resistor?

EDIT: _actually no, it stays around 70%.... MSI afterburner shows voltage limit for some reason...
GPU usage stays at 98-99%, my bad. I double checked just now..._


----------



## DeathAngel74

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Well, I got some increase in core clock, from 2050 to 2125; that seems to be the limit for now!
> 
> timespy - graphics score 6837 - (6700K @ 4600, 3200 ram 14-14-14-34, gainward gtx 1070 phoenix GS @ 2125/9450)
> 
> firestrike - graphics score 21 485 - (6700K @ 4600, 3200 ram 14-14-14-34, gainward gtx 1070 phoenix GS @ 2125/9450)
> 
> Pretty happy with this card
> 
> 
> 
> 
> 
> 
> 
> 
> Dont tell me about TXP, it destroys the 1070 and 1070 sli and whatever you throw at it


EVGA Precision X OC?


----------



## TheDeadCry

Quote:


> Originally Posted by *RyanRazer*
> 
> this overclocking is a mystery to me.. You get a higher graphics score than me with lower clocks. HOW? Does the CPU count here too?!


Check your background processes; it could be that you're using more resources while running the benchmark.


----------



## zipper17

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Well, I got some increase in core clock, from 2050 to 2125; that seems to be the limit for now!
> 
> timespy - graphics score 6837 - (6700K @ 4600, 3200 ram 14-14-14-34, gainward gtx 1070 phoenix GS @ 2125/9450)
> 
> firestrike - graphics score 21 485 - (6700K @ 4600, 3200 ram 14-14-14-34, gainward gtx 1070 phoenix GS @ 2125/9450)
> 
> Pretty happy with this card
> 
> 
> 
> 
> 
> 
> 
> 
> Dont tell me about TXP, it destroys the 1070 and 1070 sli and whatever you throw at it


can you share your FS Extreme stress test result as well? (if you have the advanced edition)
if you can complete it at least once or twice without crashing/artifacting, congratz on your card.

I can reach 21.1K, but with unstable memory.
http://www.3dmark.com/3dm/14985885? (@2088/@9500)
http://www.3dmark.com/3dm/14986015? (@2088/@9450)
Btw, in my custom curve the highest core speed point was 2101mhz.

do you notice any very fast artifacts during benchmarking with memory @9450?
because my memory at 9300-9500mhz starts to show green-ball flash-pop artifacts lasting milliseconds.
FYI, small memory artifacts are pretty hard to catch with the bare eye.

Best stable result I can get without crashing/artifacting in the Firestrike bench/stress test (also in games: witcher3, gta5, hitman, etc):
http://www.3dmark.com/fs/10456650 (graphics score 20,866, 2100/9216mhz)


In the pics, the stress test was marked invalid because I disabled the SystemInfo scan & hardware monitoring; some people say that helps stability.
On the bench I got "Time measurement inconsistent"; it's random on my system, idk, sometimes it's valid and sometimes not.

I can get 20.9XX, but that needs nearly +700mhz memory, so I use +500-600mhz instead for safety & daily gaming.

At this point I've pretty much given up; I don't know if I can push it further, because anything higher crashes Firestrike and witcher 3.


----------



## DeathAngel74

I forgot to post a screenshot of the overclock...


----------



## outofmyheadyo

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Oh lordy, I found the voltage slider in afterburner now ( for some reason I took it for granted that it did not work on my phoenix 1080 and so I did not try it on my 1070 ) now it turns out +100mv bumps up the voltage from 1.0500v to 1.0810v time to do some more testing


Quote:


> Originally Posted by *zipper17*
> 
> can you share your FS Extreme stress test result as well? (if you have the advanced edition)
> if you can complete it at least once or twice without crashing/artifacting, congratz on your card.
> 
> I can reach 21.1K, but with unstable memory.
> http://www.3dmark.com/3dm/14985885? (@2088/@9500)
> http://www.3dmark.com/3dm/14986015? (@2088/@9450)
> Btw, in my custom curve the highest core speed point was 2101mhz.
> 
> do you notice any very fast artifacts during benchmarking with memory @9450?
> because my memory at 9300-9500mhz starts to show green-ball flash-pop artifacts lasting milliseconds.
> FYI, small memory artifacts are pretty hard to catch with the bare eye.
> 
> Best stable result I can get without crashing/artifacting in the Firestrike bench/stress test (also in games: witcher3, gta5, hitman, etc):
> http://www.3dmark.com/fs/10456650 (graphics score 20,866, 2100/9216mhz)
> 
> In the pics, the stress test was marked invalid because I disabled the SystemInfo scan & hardware monitoring; some people say that helps stability.
> On the bench I got "Time measurement inconsistent"; it's random on my system, idk, sometimes it's valid and sometimes not.
> 
> I can get 20.9XX, but that needs nearly +700mhz memory, so I use +500-600mhz instead for safety & daily gaming.
> 
> At this point I've pretty much given up; I don't know if I can push it further, because anything higher crashes Firestrike and witcher 3.


I did a quick run of firestrike extreme and did not see any artifacts: http://www.3dmark.com/3dm/15409265 - graphics score 10 093
And here is the firestrike extreme stress test: http://www.3dmark.com/3dm/15409400 - it says 99.6%; not sure if that is good enough or not.
Fans were run at 100% because I am trying to figure out whether the card is worth buying a waterblock for. I was going to wait for the 1080ti, but my loop would look silly with only the cpu in it.


----------



## zipper17

Quote:


> Originally Posted by *outofmyheadyo*
> 
> I did a quick run of firestrike extreme and did not see any artifacts: http://www.3dmark.com/3dm/15409265 - graphics score 10 093
> And here is the firestrike extreme stress test: http://www.3dmark.com/3dm/15409400 - it says 99.6%; not sure if that is good enough or not.
> Fans were run at 100% because I am trying to figure out whether the card is worth buying a waterblock for. I was going to wait for the 1080ti, but my loop would look silly with only the cpu in it.


2114/9450mhz with 99.6% is a very good stability result; your card holds up very well under that load & those clocks. Yours should be a better-than-average 1070 chip imo. All you have to do now is keep watching your memory @9450mhz for any small artifacts.

My card doesn't seem fully stable at anything higher than 2100mhz (random crashes in the fsx stress test). I might try OCing again; I'm still struggling to get a fully stable 21K graphics score. As for memory, anything higher than +700-800mhz (9325-9500mhz) and I start to see random green-ball flashing artifacts for short milliseconds.

Have you tried for a 22K graphics score?


----------



## outofmyheadyo

Right now it's +100 on the core; I tried 125 and it did not like it. I haven't tried going higher on the memory; I tried +800 a while back and it didn't like that either. I'm sure I'd need water to get 22k, if it's possible at all.


----------



## FastOne8

Hello guys. I've purchased a Palit GameRock 1070 today and I'm getting a very low 3DMark Firestrike 1080p score:

http://i.imgur.com/kiHW8UC.png

Nothing is overclocked, because I just wanted to test stability. I don't play any demanding games, so I never bothered to overclock the CPU. I've seen people getting average overall scores of 17-18k with this card... Please help me.


----------



## outofmyheadyo

it's where it should be; the people getting 17-18k have both their CPU and GPU overclocked. you seem to run both at stock + the slow ddr3


----------



## FastOne8

Thank you for your reply. So nothing to worry about?


----------



## MyNewRig

Quote:


> Originally Posted by *FastOne8*
> 
> Hello guys. I've purchased a Palit Gamerock 1070 today and I'm getting very low 3Dmark Firestrike 1080p score:
> 
> http://i.imgur.com/kiHW8UC.png
> 
> Nothing is overlocked, because I just wanted to test stability. I don't play any demanding games so I never bother to overclock CPU. I've seen people getting average 17-18k of overall score with this card... Please help me.


An 18,831 graphics score is very normal for your core frequency of 1557MHz. All those getting graphics scores above 20,000 are overclocking their GPUs; your card can probably achieve much higher clocks than what it came with out of the box. If you overclock your core and memory a little, you will get into the range everyone else is getting. You also have a good CPU; if you overclock it a bit along with some GPU overclocking, you can easily break a 16,000 overall Firestrike score. But as it is now, your score looks absolutely normal.


----------



## msigtx760tf4

this is my palit game rock (no PE) at stock settings, 1557 mhz base core

http://www.3dmark.com/3dm/15416470

only the cpu is OC'd to 5.1 ghz

the difference is ~500 on the gpu. why? maybe because of boost


----------



## outofmyheadyo

3dmark shows your card was running @ 1962 on the core


----------



## msigtx760tf4

it's the stock boost to 1962, from a 1557 base core speed

i'll do +100 on the core in a minute

http://www.3dmark.com/3dm/15416883

and +100 core and +450 mem

http://www.3dmark.com/3dm/15416990


----------



## asdkj1740

Quote:


> Originally Posted by *msigtx760tf4*
> 
> this is my palit game rock (no PE) at stock settings, 1557 mhz base core
> 
> http://www.3dmark.com/3dm/15416470
> 
> only the cpu is OC'd to 5.1 ghz
> 
> the difference is ~500 on the gpu. why? maybe because of boost


what vram does your card have?


----------



## msigtx760tf4

Samsung.


----------



## asdkj1740

warning.

do not flash the new bios available on techpowerup, the ones uploaded on 30/9 from palit and gainward.
https://www.techpowerup.com/vgabios/?architecture=&manufacturer=&model=GTX+1070&interface=&memType=&memSize=&since=6+months+ago

these new bios have some problems, as i found when i flashed them on my evga ftw:
1. the file size of these new bios is not 250KB like the other, older bios available on techpowerup
2. i still flashed them and they worked fine generally, no problem in gaming
3. the only problem i found is that the reported power usage was strangely low, <130W in gaming and benchmarking; other stats like temp/core clock/voltage/gpu load were all fine.
4. i think the idle artifacts problem is still there: when i overclocked the vram to 2300mhz it was fine when i pressed the apply button, but it crashed and showed artifacts after i opened gog/steam.
5. vram overclocking improved slightly, from 2200mhz on my old bios to 2275mhz; i ran witcher 3/gtav/rise of the tomb raider.

so i went back to my zotac amp extreme bios (works fine and gives me a higher power limit, although it is hard to reach 100% = 250W)
and will wait for the true micron bios released by zotac.
i think those new bios on techpowerup were wrongly extracted, otherwise the file size should be the same 250kb as the others.
maybe it is due to my non-palit/gainward card, i don't know yet.
from the info on techpowerup, the new gamerock premium bios has a higher power limit, 225w max, up from 195w max in the old version. but the gamerock premium page on the palit website still shows only 195w max (170w at 100%); nothing changed after the new bios was released. so those new bios on techpowerup probably have some problems...

only a flashing exe can be downloaded from the gainward and palit websites now; no xxx.rom can be downloaded from these official websites.
therefore don't flash it if your card is not palit/gainward, as the exe will detect and reject a non-palit/gainward card before flashing starts.
don't even be a smart ass like me, flashing an old palit bios first and then using the official palit flashing exe to flash the new bios: it got stuck at 99%, in a process that never ends, and your card may be bricked; although after rebooting i was fine to get into windows and flashed the zotac amp extreme bios back.

good luck, safe cross flashing, and as always, **** you nvidia

i took this screenshot while testing the new palit bios overclocked to 2275mhz; no problem appeared except the freakishly low reported power, which was just 99w.... no actual negative influence. my previous bios generally shows ~180w in witcher 3, maybe ~130w minimum; 99w is not normal, in my opinion.









this is the record of furmark 1080 benchmark runs on the new palit bios downloaded from techpowerup.
in general, normalized gpu power and gpu power should be very close to each other, but with the new palit bios from techpowerup there was, strangely, a huge gap between these two power readings.


----------



## asdkj1740

can any gainward or palit user tell us about the newly released bios??
any improvement in vram overclocking, at idle and in the 3d/gaming/benchmark stages?
any increase in available power draw?


----------



## Roland0101

Quote:


> Originally Posted by *DeathAngel74*
> 
> I forgot to post a screenshot of the overclock...


Did you lock your voltage?


----------



## DeathAngel74

Nope, I don't know how; it's been too long since I've used a hex editor.


----------



## Roland0101

Quote:


> Originally Posted by *DeathAngel74*
> 
> Nope I don't know how, been too long since i've used a hex editor.


There are easier ways than a hex editor. Just set the benchmark or the game to "prefer maximum performance" in the NVCP and apply your overclock after the application has started. (A custom voltage/clock curve is also an option.)

If you use Heaven or Valley, you can run them in windowed mode (the voltage stays high) and gradually set the clocks higher.
Just make sure you have a non-OC 2D profile you can apply with a key combination (or automatically with afterburner) before you leave the benchmark, in case you go over what is stable for you without "prefer maximum performance".


----------



## gtbtk

Quote:


> Originally Posted by *Roland0101*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DeathAngel74*
> 
> Nope I don't know how, been too long since i've used a hex editor.
> 
> 
> 
> There are easier ways than a hex editor. Just set the benchmark or the game to "prefer maximum performance" in the NVCP and apply your overclock after the application has started. (A custom voltage/clock curve is also an option.)
> 
> If you use Heaven or Valley, you can run them in windowed mode (the voltage stays high) and gradually set the clocks higher.
> Just make sure you have a non-OC 2D profile you can apply with a key combination (or automatically with afterburner) before you leave the benchmark, in case you go over what is stable for you without "prefer maximum performance".
Click to expand...

You can set the default performance in the NV control panel to maximum performance. That should make the card idle above .800v and that should stop the checkerboarding without having to mess with overclocking after you start a 3d app.


----------



## Roland0101

Quote:


> Originally Posted by *gtbtk*
> 
> You can set the default performance in the NV control panel to maximum performance. That should make the card idle above .800v and that should stop the checkerboarding without having to mess with overclocking after you start a 3d app.


Absolutely right.
You may have to add these programs to the NVCP and set them to prefer maximum performance to make sure that it is always active.
C:\Windows\system32\dwm.exe
C:\Windows\explorer.exe
C:\Windows\SystemApps\ShellExperienceHost_cw5n1h2txyewy\ShellExperienceHost.exe


----------



## gtbtk

Quote:


> Originally Posted by *Roland0101*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You can set the default performance in the NV control panel to maximum performance. That should make the card idle above .800v and that should stop the checkerboarding without having to mess with overclocking after you start a 3d app.
> 
> 
> 
> Absolutely right.
> You may have to add these programs to the NVCP and set them to prefer maximum performance to make sure that it is always active.
> C:\Windows\system32\dwm.exe
> C:\Windows\explorer.exe
> C:\Windows\SystemApps\ShellExperienceHost_cw5n1h2txyewy\ShellExperienceHost.exe
Click to expand...

With Chrome or Firefox open, my card idles at its base core clock. It only ramps up to boost clock under a 3d load.


----------



## ucode

Quote:


> Originally Posted by *asdkj1740*
> 
> only flashing exe can be downloaded at gainward and palit website now. no xxx.rom can be downloaded from these official website.


Do you have a link?


----------



## gtbtk

Quote:


> Originally Posted by *ucode*
> 
> Quote:
> 
> 
> 
> Originally Posted by *asdkj1740*
> 
> only flashing exe can be downloaded at gainward and palit website now. no xxx.rom can be downloaded from these official website.
> 
> 
> 
> Do you have a link?
Click to expand...

the palit bios does not fix the micron issue


----------



## asdkj1740

Quote:


> Originally Posted by *ucode*
> 
> Do you have a link?


as i said, go to those official websites and look for the specific model, like super jetstream or gamerock premium; on the support or download page you can find the bios update exe.


----------



## ucode

nvm, I was just trying to offer help by extracting the VBIOS from the exe. For instance, the 7-Oct-2016 GTX 1070 Super JetStream update 1070_BIOS_Upgrade_1006-1.zip actually contains seven VBIOSes.

11228-10703BM0N1.rom, 10DE:1B81, GeForce GTX 1070 VGA BIOS, 86.04.3B.00.38, 75W/151W/170W
11230-10703BM0G1.rom, 10DE:1B81, GeForce GTX 1070 VGA BIOS, 86.04.3B.00.3A, 75W/151W/170W
11232-10703BM0N1.rom, 10DE:1B81, GeForce GTX 1070 VGA BIOS, 86.04.3B.00.3C, 75W/170W/195W
11234-10703BM0G1.rom, 10DE:1B81, GeForce GTX 1070 VGA BIOS, 86.04.3B.00.3E, 75W/170W/195W
11268-10703BM0N1.rom, 10DE:1B81, GeForce GTX 1070 VGA BIOS, 86.04.3B.00.72, 75W/195W/225W
11269-10703BM0G1.rom, 10DE:1B81, GeForce GTX 1070 VGA BIOS, 86.04.3B.00.95, 75W/195W/225W
11270-10703BM0N1.rom, 10DE:1B81, GeForce GTX 1070 VGA BIOS, 86.04.3B.00.94, 75W/195W/225W

Don't have a GTX 1070 myself.
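For anyone curious how ROMs like these can be carved out of a vendor updater: legacy PCI option ROM images start with the 0x55 0xAA signature, and the byte at offset 2 holds the image size in 512-byte blocks, so a naive scan over the exe finds the candidates. A rough sketch of the idea (not the exact tool used here):

```python
def carve_option_roms(blob: bytes):
    """Scan a blob for legacy PCI option ROM images (0x55 0xAA header).

    Byte 2 of the ROM header is the image size in 512-byte blocks,
    which tells us where each candidate image ends.
    """
    found = []
    i = 0
    while i < len(blob) - 3:
        if blob[i] == 0x55 and blob[i + 1] == 0xAA:
            size = blob[i + 2] * 512  # size field is in 512-byte units
            if size and i + size <= len(blob):
                found.append(blob[i:i + size])
                i += size
                continue
        i += 1
    return found
```

Real vBIOS images also carry a PCIR data structure with the vendor/device ID (the 10DE:1B81 above), which is how each extracted ROM can be matched to a card.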


----------



## asdkj1740

Quote:


> Originally Posted by *ucode*
> 
> nvm, was just trying to offer help by extracting VBIOS from exe. For instance update 7-Oct-2016 GTX 1070 Super JetStream VBIOS file 1070_BIOS_Upgrade_1006-1.zip actually contains seven VBIOS's.
> 
> 11228-10703BM0N1.rom,10DE:1B81,GeForce GTX 1070 VGA BIOS,86.04.3B.00.38 , 75W,151W,170W
> 11230-10703BM0G1.rom,10DE:1B81,GeForce GTX 1070 VGA BIOS,86.04.3B.00.3A , 75W,151W,170W
> 11232-10703BM0N1.rom,10DE:1B81,GeForce GTX 1070 VGA BIOS,86.04.3B.00.3C , 75W,170W,195W
> 11234-10703BM0G1.rom,10DE:1B81,GeForce GTX 1070 VGA BIOS,86.04.3B.00.3E , 75W,170W,195W
> 11268-10703BM0N1.rom,10DE:1B81,GeForce GTX 1070 VGA BIOS,86.04.3B.00.72 , 75W,195W,225W
> 11269-10703BM0G1.rom,10DE:1B81,GeForce GTX 1070 VGA BIOS,86.04.3B.00.95 , 75W,195W,225W
> 11270-10703BM0N1.rom,10DE:1B81,GeForce GTX 1070 VGA BIOS,86.04.3B.00.94 , 75W,195W,225W
> 
> Don't have a GTX 1070 myself.


Oh, that's really helpful, thanks mate.
Did you get these ROMs separately? Would you mind uploading them for us to try?


----------



## ucode

They are all supplied in a single download from Palit. Without a link, I don't know if this is what you are after or something else.


----------



## asdkj1740

Quote:


> Originally Posted by *ucode*
> 
> They are supplied in a single download from Palit. Without a link given I don't know if this is what you are after or something else.


http://www.palit.com/palit/vgapro.php?id=2639&lang=en&pn=NE51070H15P2-1041G&tab=do

this is the new bios on techpowerup
https://www.techpowerup.com/vgabios/186488/palit-gtx1070-8192-160930

edited: looks like they are the same... maybe these only work on Palit cards.
Thank you.
May I know how you read those power limits from the BIOS?


----------



## asdkj1740

Has anyone tried flashing the Galax HOF BIOS successfully on a non-HOF card?
Or did anyone brick their card by flashing the Galax HOF BIOS onto a non-HOF card?


----------



## DeathAngel74

I've only flashed a 1070 SC from the Micron BIOS to the Samsung BIOS. Voltage went up to 1.093V; didn't notice much else.


----------



## Powergate

Quote:


> Originally Posted by *ucode*
> 
> nvm, was just trying to offer help by extracting VBIOS from exe. For instance update 7-Oct-2016 GTX 1070 Super JetStream VBIOS file 1070_BIOS_Upgrade_1006-1.zip actually contains seven VBIOS's.
> 
> 11270-10703BM0N1.rom,10DE:1B81,GeForce GTX 1070 VGA BIOS,86.04.3B.00.94 , 75W,195W,225W


Thanks for this information; I'd already wondered why my 3DMark score slightly increased after the update.
So the default TDP is now increased by 25W (170W -> 195W).


----------



## Roland0101

Quote:


> Originally Posted by *gtbtk*
> 
> With Chrome or Firefox open, my card is idling at the cards base core clock. It only ramps up to boost clock under a 3d load


The problem is that Windows (on most systems) will not apply "prefer maximum performance" at idle without an application open that uses 3D acceleration, unless these three programs are added to the NVCP and set to "prefer maximum performance". So the GPU will drop not just to its base clocks with higher voltage, but all the way to its idle clocks and idle voltage.
And you know what that means if you run an OC profile all the time.


----------



## ITAngel

Nevermind wrong forum to talk about my question.


----------



## outofmyheadyo

What's the deal with my Gainward Phoenix 1070 GS fans? The video was taken while idle, and the fans do this start/stop/start/stop cycle 24/7. What's the point, and how do I fix it?


----------



## Burtman88

Found a place that has 4 1070 cards with Samsung memory!!! I had to return 2 Micron ones due to freezing and lock-ups on the stock OC recommended by Asus and EVGA.


----------



## Burtman88




----------



## gtbtk

Quote:


> Originally Posted by *outofmyheadyo*
> 
> 
> 
> 
> 
> Whats the deal with my gainward phoenix 1070 gs fans, the video was taken while idle, and fans seem to do this 24x7 start/stop/start/stop what`s the point, and how do I fix it ?


Your card is idling at the temperature where it activates the fans. For MSI cards that is 60 degrees; not sure about the Gainward cards.

The card reaches the temp threshold, the fans start and cool it to under the threshold, then stop again. If you can direct a little more case airflow towards your card, that may be enough to keep it under the threshold at idle; alternatively, set a custom fan curve.
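The semi-passive behavior is just temperature hysteresis. A toy sketch of the logic (the thresholds here are made up; the real ones live in the card's fan table):

```python
def fan_should_run(temp_c, fan_on, start_at=60, stop_below=50):
    """Semi-passive fan logic: spin up once temp hits start_at,
    spin down only after cooling below stop_below (hypothetical values)."""
    if not fan_on and temp_c >= start_at:
        return True
    if fan_on and temp_c < stop_below:
        return False
    return fan_on

# An idle card hovering around the start threshold cycles the fans
# on and off exactly as described above.
```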


----------



## outofmyheadyo

There is no case airflow; it's a Thermaltake P5, basically an open case. I'll try a custom curve.


----------



## ITAngel

Hey there *gtbtk*, it's mainly to test a couple of applications to determine if I want to go full MacBook Pro. I am attempting it on my Lenovo Y700 laptop instead, keeping my main PC for gaming and multimedia.

Sorry, I didn't think anyone read that, but honestly it's so hard to let go of this Zotac GTX 1070 AMP! Extreme, especially when I know it overclocks like crazy on air.


----------



## KedarWolf

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *outofmyheadyo*
> 
> Oh lordy, I found the voltage slider in afterburner now ( for some reason I took it for granted that it did not work on my phoenix 1080 and so I did not try it on my 1070 ) now it turns out +100mv bumps up the voltage from 1.0500v to 1.0810v time to do some more testing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well I got some increase in coreclock from 2050 to 2125 thats seems to be the limit for now!
> 
> timespy - graphics score 6837 - ( [email protected],3200ram 14-14-14-34. gainward gtx 1070 phoenix GS @ 2125/9450)
> 
> firestrike - graphics score 21 485 - ( [email protected],3200ram 14-14-14-34. gainward gtx 1070 phoenix GS @ 2125/9450)
> 
> Pretty happy with this card
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *DeathAngel74*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just found it yesterday in PX OC. 1.075-1.093v
> 
> Click to expand...
> 
> Dont tell me about TXP, it destroys the 1070 and 1070 sli and whatever you throw at it
Click to expand...

Saw a long article on 1070s in SLI vs. a single TXP; in most every benchmark the 1070s were a bit better than the TXP. The only ones where they weren't were games with no SLI support.


----------



## khanmein

Another Samsung review. All the popular, well-known review sites on YouTube and the web seem unable to review a Micron GTX 1070.

http://www.hardocp.com/article/2016/10/12/msi_geforce_gtx_1070_gaming_x_8g_video_card_review#.WAESm8miPN4


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> another samsung review. all the popular & well known review side from youtube/webpage unable to review a micron GTX 1070.
> 
> http://www.hardocp.com/article/2016/10/12/msi_geforce_gtx_1070_gaming_x_8g_video_card_review#.WAESm8miPN4


This is getting so f*cking ridiculous: reviewing a product that doesn't exist on the market, and they even dare to brag about its memory OC!!!!


----------



## gtbtk

Quote:


> Originally Posted by *ITAngel*
> 
> Hey there *gtbtk*, It mainly to test a couple of application to determined if I want to go full MacBook Pro. I am attempting it on my Lenovo Y700 laptop instead, keeping my main PC for other main stuff like gaming and multimedia.
> 
> 
> 
> 
> 
> 
> 
> Sorry I didn't think anyone read that, but honestly is so hard to let go this Zotac GTX 1070 AMP! Extreme card and specially when I know it overclocks like crazy on air.


If it was me, and I say this owning a 2014 15" MacBook Pro, keep your 1070. I usually run it on Windows 10 from a Samsung T1 USB drive because that gives me better functionality and performance for most things.

Certainly play with macOS or Yosemite and make a Hackintosh of your Lenovo; it could be fun. For experimentation the iGPU is fine: unless your applications are 3D based, you don't need a discrete GPU for day-to-day things. The dGPUs Apple puts in the MacBooks are pretty low end compared to Windows gaming laptops anyway, and I believe applications like Final Cut Pro leverage Quick Sync for rendering, so the iGPU has you covered there.

Even when the new 2016 Apples drop, I think it has already been announced they will have AMD GPUs in the higher models, and I can't believe they could have anything better than an RX 480.


----------



## asdkj1740

EVGA's Jacob said the BIOS will be available early next week.


----------



## ucode

Quote:


> Originally Posted by *asdkj1740*
> 
> http://www.palit.com/palit/vgapro.php?id=2639&lang=en&pn=NE51070H15P2-1041G&tab=do
> 
> may i know how do you read those power limit from the bios?


Okay, that's the same VBIOS update.

This thread helps explain power and other settings. Sadly it was closed by Laithan after the cross-flashing orgies became too much to bear.


----------



## asdkj1740

Quote:


> Originally Posted by *ucode*
> 
> Okay, that's the same VBIOS update.
> 
> This thread helps explain about power and other settings. Sadly the thread was closed by Laithan after the start of cross flashing orgies became too much to bare.


thanks

It seems that cross flashing a 1070 BIOS won't brick the card, as most 1070s have the same voltage controller chip, the uP9511P. Just leave the Galax HOF alone, as HOF cards use a different voltage controller even on Pascal.

I flashed a few BIOSes on my EVGA FTW. None of them bricked my card, but only the Zotac AMP Extreme one gives me "fine" operation.
I flashed the new Palit BIOS and the MSI Gaming and MSI Gaming Z BIOSes, and all of them give me a huge gap between total GPU power and total GPU normalized power: total GPU power is strangely low while normalized power is freakishly high, resulting in power throttling...
Only the Zotac AMP Extreme BIOS behaves properly, like my EVGA stock BIOS did.

msi bios on evga card


zotac bios on evga card


Another thread is talking about shunt shorting with Liquid Ultra for a higher power limit; in my opinion, why not just flash a high-power-limit BIOS that suits the card...
The 1080 even has that Strix OC T4 BIOS with no power limit.

This is from the TechPowerUp BIOS collection. Some BIOSes on TPU may be wrong or broken, so be careful with numbers like default clock and boost clock.


----------



## asdkj1740

ucode, can we increase the power limit in the BIOS using a hex editor? Just the power limit.
Have you ever tried this?


----------



## RyanRazer

Quote:


> Originally Posted by *outofmyheadyo*
> 
> 
> 
> 
> 
> Whats the deal with my gainward phoenix 1070 gs fans, the video was taken while idle, and fans seem to do this 24x7 start/stop/start/stop what`s the point, and how do I fix it ?


I think you have the same issue. My workaround: http://www.overclock.net/t/1613126/gtx-1070-amp-extreme-owners/20_10#post_25581285


----------



## msigtx760tf4

who knows how to flash gtx1070 bios ?
i downloaded NVFLash certificate checks bypassed v5.287 from here :http://www.overclock.net/t/1521334/official-nvflash-with-certificate-checks-bypassed-for-gtx-950-960-970-980-980ti-titan-x
but i have this error : BIOS Cert 2.0 Verification Error, update aborted


----------



## asdkj1740

Quote:


> Originally Posted by *msigtx760tf4*
> 
> who knows how to flash gtx1070 bios ?
> i downloaded NVFLash certificate checks bypassed v5.287 from here :http://www.overclock.net/t/1521334/official-nvflash-with-certificate-checks-bypassed-for-gtx-950-960-970-980-980ti-titan-x
> but i have this error : BIOS Cert 2.0 Verification Error, update aborted


https://www.techpowerup.com/downloads/2709/nvflash-5-292-0-for-windows


----------



## MyNewRig

Quote:


> Originally Posted by *msigtx760tf4*
> 
> who knows how to flash gtx1070 bios ?
> i downloaded NVFLash certificate checks bypassed v5.287 from here :http://www.overclock.net/t/1521334/official-nvflash-with-certificate-checks-bypassed-for-gtx-950-960-970-980-980ti-titan-x
> but i have this error : BIOS Cert 2.0 Verification Error, update aborted


Just get the latest version of NVFlash from https://www.techpowerup.com/downloads/Utilities/BIOS_Flashing/NVIDIA/

Follow the guide here http://www.overclock.net/t/1523391/easy-nvflash-guide-with-pictures-for-gtx-970-980


----------



## msigtx760tf4

Thank you, Rep+.
I flashed the BIOS from the GameRock PE (new, 225W) onto my GameRock (non-PE)
and there is no difference in performance








back to original bios


----------



## ucode

Quote:


> Originally Posted by *asdkj1740*
> 
> ucode, can we increase the power limit on the bios using hex editior? just power limit.
> have you ever tried this?


The problem is that when the VBIOS is modified the signature becomes invalid, producing a certification error when we try to flash it. There is provision for a user license key by request from Nvidia, but I've never seen one. The only solutions at this time are for someone with signing keys to provide a modified and signed VBIOS, or hardware modification. A special 1070 VBIOS was reported as a work in progress but so far has not seen the light of day.
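The principle is the same as any signed-image check: the flasher computes a digest of the image and verifies it against a signature made with the vendor's private key, so any byte you change invalidates it. A simplified illustration using a plain hash in place of the real vendor-key scheme:

```python
import hashlib

def image_digest(image: bytes) -> str:
    # Stand-in for the digest the flash tool verifies against a signed
    # value; the real scheme uses vendor keys, this only shows why a
    # one-byte hex edit fails the check.
    return hashlib.sha256(image).hexdigest()

rom = bytearray(b"\x55\xAA" + b"\x00" * 510)
signed = image_digest(bytes(rom))

rom[0x100] = 0xE1  # e.g. bumping a power-limit byte in a hex editor
assert image_digest(bytes(rom)) != signed  # -> certification error
```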


----------



## caenlen

Quote:


> Originally Posted by *msigtx760tf4*
> 
> thank you Rep+
> i flashed bios from gamerock PE (new) with 225W to Gamerock (non PE)
> and there is No difference in performance
> 
> 
> 
> 
> 
> 
> 
> 
> back to original bios


Yeah, I am not entirely sure why anyone is messing with the BIOS or power limits; 2.1GHz is the max for these cards no matter what, voltage no longer matters, it's just where we are in life. Can't complain honestly, lol.


----------



## asdkj1740

Quote:


> Originally Posted by *ucode*
> 
> The problem is when the VBIOS is modified the signing becomes invalid and will produce a certification error if we try to flash it. There is provision for a user license key by request from nVidia but I've never seen one. Only solution at this time is for someone with signing keys to provide a modified and signed VBIOS or hardware modification. A special 1070 VBIOS was reported as work in progress but so far has not seen the light of day.


oic, thx. It's a shame there is no Strix OC T4 BIOS for the 1070..
That post ended sadly, being locked... cross flashing should be discussed further, as it is a safer way to increase the power limit than the Liquid Ultra method.
I have no idea why my EVGA FTW can't cross flash other BIOSes properly; only the Zotac AMP Extreme one is suitable so far.


----------



## kevindd992002

Any reason why they can't produce a Pascal BIOS Editor?


----------



## asdkj1740

Quote:


> Originally Posted by *msigtx760tf4*
> 
> thank you Rep+
> i flashed bios from gamerock PE (new) with 225W to Gamerock (non PE)
> and there is No difference in performance
> 
> 
> 
> 
> 
> 
> 
> 
> back to original bios


Why do that? The Premium BIOS has a higher power limit.
The old non-Premium BIOS has just a 170W max, which is not enough for gaming; even a 190W max is still not enough..
225W is generally enough.


----------



## khanmein

Quote:


> Originally Posted by *kevindd992002*
> 
> Any reason why they can't produce a Pascal BIOS Editor?


what for? voltage is locked & sensitive.


----------



## kevindd992002

Quote:


> Originally Posted by *khanmein*
> 
> what for? voltage is locked & sensitive.


I thought voltage was locked with Kepler and Maxwell as well?


----------



## khanmein

Quote:


> Originally Posted by *kevindd992002*
> 
> Any reason why they can't produce a Pascal BIOS Editor?
> I thought voltage was locked with Kepler and Maxwell as well?


Don't know, but there's not much difference.


----------



## outofmyheadyo

I think it was encrypted, no ?


----------



## cryptos9099

nVidia vBIOS's use a Vendor-specific signed certificate that "acts" like a checksum to prevent tampering and bad firmware from being used.


----------



## kevindd992002

Quote:


> Originally Posted by *cryptos9099*
> 
> nVidia vBIOS's use a Vendor-specific signed certificate that "acts" like a checksum to prevent tampering and bad firmware from being used.


So this wasn't present in Kepler or Maxwell?


----------



## cryptos9099

Quote:


> Originally Posted by *kevindd992002*
> 
> So this wasn't present in Kepler or Maxwell?


This methodology started with Maxwell, but JoeDirt learned how to get nvflash to ignore the sig check afaik. I'm sure the process can be replicated.


----------



## Awsan

Holy smokes, this game uses up to 7.5GB of VRAM at max settings + SMAA









And a question please:

TXAA vs. SMAA, SMAAx2, SMAAx4.

What should I pick? I noticed that TXAA is lighter than SMAA, but is SMAA worth it? The AA at the x4 setting looks amazing, and plain SMAA looks good too, but I can't really tell if it's better than TXAA.


----------



## kevindd992002

Quote:


> Originally Posted by *cryptos9099*
> 
> This methodology started with Maxwell, but JoeDirt learned how to get nvflash to ignore the sig check afaik. I'm sure the process can be replicated.


This is exactly what I thought. So technically a Pascal BIOS editor would be an easy development for the usual people in this scene. I'm just wondering why no one seems to care enough to produce one yet.

@khanmein

This is what I was telling you about. It's not about being voltage locked and all.


----------



## Roland0101

Quote:


> Originally Posted by *Awsan*
> 
> Holly smokes this game uses upto 7.5GB Vram at max settings + SMAA


7.5? I have 7.8 on some points.








Quote:


> And a question please
> 
> TXAA VS SMAA, SMAAx2, SMAAx4
> 
> What should i consider as i noticed that TXAA is lighter than SMAA but is SMAA worth it? as i saw amazing AA on the x4 settings but on normal SMAA it looks good but cant really tell if its better than TXAA


First, you have TXAA as an option? Because then I need to have a look at my update history; FXAA, SMAA, SSAA 2x, and SSAA 4x is all I get.

For the rest, look for yourself

Or here scroll down and you will find the same anti aliasing comparisons as above for almost all possible settings.

And the last one.

Imho SMAA is the best compromise: it looks pretty good and doesn't cost much performance.
SSAA 2x means the game is rendered at a higher resolution, 2688x1512 on a 1920x1080 monitor; at SSAA 4x it is 3840x2160 on a 1920x1080 monitor. That of course means the performance cost is immense, but both settings are superior to FXAA and SMAA.
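The resolution math is easy to check; the per-axis factors implied by the numbers quoted are 1.4x for SSAA 2x (about 2x the pixels) and 2x for SSAA 4x (4x the pixels):

```python
# Per-axis scale factors implied by the resolutions quoted above:
# SSAA 2x renders ~1.4x each axis, SSAA 4x renders 2x each axis.
SSAA_SCALE = {2: 1.4, 4: 2.0}

def ssaa_resolution(width, height, samples):
    s = SSAA_SCALE[samples]
    return round(width * s), round(height * s)

# 1920x1080 -> SSAA 2x: (2688, 1512), SSAA 4x: (3840, 2160)
```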


----------



## Awsan

Quote:


> Originally Posted by *Roland0101*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Awsan*
> 
> Holly smokes this game uses upto 7.5GB Vram at max settings + SMAA
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 7.5? I have 7.8 on some points.
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> And a question please
> 
> TXAA VS SMAA, SMAAx2, SMAAx4
> 
> What should i consider as i noticed that TXAA is lighter than SMAA but is SMAA worth it? as i saw amazing AA on the x4 settings but on normal SMAA it looks good but cant really tell if its better than TXAA
> 
> Click to expand...
> 
> First, you have TXAA as an Option? Because then I need to have a look at my update history. FAXX, SMAA, SSAA 2x SSAA 4x is all what I get.
> 
> For the rest, look for yourself
> 
> Or here scroll down and you will find the same anti aliasing comparisons as above for almost all possible settings.
> 
> And the last one.
> 
> Imho SMAA is the best compromise. It's looking pretty good and cost not much performance.
> SSAA 2x means the game is rendered at a higher resolution, 2688x1512 @ a 1920x1080 Monitor. At SSAA 4x it is 3840x2160 @ a 1920x1080 Monitor. That means of course that the performance cost is immense, but both settings are superior to FAXX and SMAA.
Click to expand...

Thanks for the info.

Oh, my bad, I meant FXAA.


----------



## TheGlow

Quote:


> Originally Posted by *caenlen*
> 
> Yeah, I am not entirely sure why anyone is messing with the BIOS or power limits, 2.1ghz is the max for these cards no matter what, voltage no longer matters, its just where we are in life. Can't complain honestly, lol.



Just on desktop I can get it to hit 2.2GHz.


----------



## asdkj1740

Palit just replied to my email asking what is new in the newly released BIOS.
Palit said it is for Micron overclocking improvement, enlarging the power limit, and fixing the fans turning on and off repeatedly at idle.

Based on my cross flashing, Palit did improve Micron VRAM overclocking; in my case my Micron memory can now overclock to 2275 from 2200. But I don't think Palit fixed the idle checkerboard problem...


----------



## reflex75

Quote:


> Originally Posted by *asdkj1740*
> 
> palit just replied my email of asking what is new about the new bios released
> palit said it is for micron overclocking improvement, enlarging power limit, fixing fan turning on and off repeatedly at idle.
> 
> according to my cross flashing, palit indeed improves the micron vram overclocking, in my case my micron can now overclocking to 2275 from 2200, but i dont think palit fixed the idle checkerboard problem...


I also have a Palit Super Jetstream 1070 and I've tried their latest BIOS.
This BIOS update is a great improvement, with better fan management: it's cooler, still silent, and there's no more constant on/off fan cycling at idle. The power limit has also been increased to 225W, which is the limit for this card with a single 8-pin. This power increase provides better max core frequency stability.








But it doesn't fix the Micron memory's violent crash-with-reboot, which happens even at desktop idle.
And I think Palit can't solve this problem without a new BIOS update provided by Nvidia itself.
EVGA announced they are working on this issue:
http://forums.evga.com/gtx-1070-FTW-Bios-m2564665.aspx
Pity, because this Palit card could be perfect without this memory issue...
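The 225W figure follows from the PCIe power budget: up to 75W from the slot plus 150W per 8-pin plug (75W per 6-pin). As a quick sanity check:

```python
# PCIe power budget: 75W from the slot, 150W per 8-pin, 75W per 6-pin
PLUG_W = {"6pin": 75, "8pin": 150}
SLOT_W = 75

def board_power_limit(*plugs):
    return SLOT_W + sum(PLUG_W[p] for p in plugs)

# Single 8-pin (Super JetStream): 225W
# 8+6-pin: 300W, dual 8-pin: 375W
```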


----------



## asdkj1740

Quote:


> Originally Posted by *reflex75*
> 
> I also have a Palit Super Jetstream 1070 and I've tried their last bios.
> This bios update is a great improvement with better fan management. it's cooler, still silent, and no more constant on/off fan speed during idle. Also the power limit has been increased to 225W, which is the limit for this card with a single 8pin. This power increase provide better max core frequency stability
> 
> 
> 
> 
> 
> 
> 
> 
> But it doesn't fix Micron memory violent crash with reboot even during desktop idle.
> And I think Palit can't solve this problem without the new bios update provided by Nvidia himself.
> EVGA announced they are working on this issue:
> http://forums.evga.com/gtx-1070-FTW-Bios-m2564665.aspx
> Pity, because this Palit card could be perfect without this memory issue...


Indeed, the Palit JetStream and GameRock Pascal cards are very impressive, with an excellent PCB design (very nice vcore MOSFETs, dual BIOS support) and insane cooling (should be the best cooling among 1070s).
Increasing the power limit to 225W from 195W is very helpful, as in games power draw ranges around ~200W, so 225W should be enough for gaming.

The blame for the Micron idle crashing problem should go to Nvidia. EVGA's Jacob also said their upcoming Micron BIOS mainly aims to improve Micron overclocking, not to deal with the idle checkerboard problem. It seems the AICs can't fix this; only Nvidia can tackle it, but I am afraid Nvidia won't give a **** about it...

Is your card Micron VRAM? Any overclocking improvement?


----------



## saunupe1911

Has anyone with an Asus Strix OC flashed their BIOS to something different with a higher power limit? Did it improve performance? I understand it's technically already pretty high, but I would love to get mine to 2100+ with a stable, locked-in voltage. The highest I can get is 2050 at 52 to 54 degrees before Boost 3.0 knocks it down during certain periods.


----------



## reflex75

Quote:


> Originally Posted by *asdkj1740*
> 
> indeed, palit jetstream and gamerock pascal cards are very impressive, with excellent pcb design (very nice vcore mosfets, dual bios support) and insane cooling (should be the best cooling among other 1070s).
> increasing power limit to 225 from 195 is very helpful as during game the range of power draw is around~200w, therefore 225w should be enough for gaming.
> 
> the blame of idle crashing problem of micron should go to nvidia. and evga jacob also said their new coming evga micron bios mainly aims to improve micron overclocking but not dealing with idle checkerboard problem. it seems that aic cant fix this problem, only nvidia can tackle it, but i am afraid nvidia wont give a **** about this problem...
> 
> is your card micron vram? any overclocking improvement ?


I agree, this card would be perfect if it was not spoiled by such bad memory...

If I try to overclock my Micron VRAM even a little bit, I get the checkerboard artifact and a reboot!

And many people are facing this issue even at stock speed, so treating it as an overclocker problem is just lawyer bull***** from Nvidia to cover their back!


----------



## asdkj1740

Quote:


> Originally Posted by *saunupe1911*
> 
> Has anyone with a Asus Strix OC flashed their bios to something different with a higher power limit? Did you improve performance? I understand technically it's already pretty high up but I would love to get mine to 2100 + with a stable locked in voltage. The highest I can get is 2050 and 52 to 54 degrees before Boost 3.0 knocks it down during certain time periods


No; technically, the Asus Strix OC's power limit is too low if you know how good the vcore MOSFETs on the Asus Strix PCB are. They should be the best among 1070s.
You may have heard that the only modded BIOS right now for the 1080 is the Strix OC T4 BIOS from Asus...

Enough power limit should give you stable overclocking without throttling, but I don't think a higher power limit will help you achieve a higher core clock.


----------



## asdkj1740

Quote:


> Originally Posted by *reflex75*
> 
> I agree, this card would be perfect
> 
> 
> 
> 
> 
> 
> 
> if it was not spoiled by such a bad memory...
> 
> 
> 
> 
> 
> 
> 
> 
> If I try to overclock my Micron vram just a little bit then I get the checkerboard artifact and reboot!
> 
> 
> 
> 
> 
> 
> 
> 
> And you have many people facing this issue even at stock speed, so treating this issue as an overclocker problem is just lawyer bull***** from Nvidia to cover their back!


The only problem with the Palit 1070 JetStream and GameRock should be the single 8-pin connector design.
It would be fine if we had a BIOS tweaker like the Maxwell one that let us tweak how much power is drawn from the 8-pin connector.
However....

I bought my EVGA 970 from US Amazon, but I am not eligible for the $30 refund as I am not an American. Nvidia treats customers like ****.
The Pascal DPC latency problems are also still not fixed.


----------



## reflex75

Quote:


> Originally Posted by *asdkj1740*
> 
> i bought my evga 970 from us amazon but i am not eligible for the $30 refund


Sorry for you, but looks like Nvidia X70 generations are cursed!


----------



## gtbtk

Quote:


> Originally Posted by *reflex75*
> 
> Quote:
> 
> 
> 
> Originally Posted by *asdkj1740*
> 
> indeed, palit jetstream and gamerock pascal cards are very impressive, with excellent pcb design (very nice vcore mosfets, dual bios support) and insane cooling (should be the best cooling among other 1070s).
> increasing power limit to 225 from 195 is very helpful as during game the range of power draw is around~200w, therefore 225w should be enough for gaming.
> 
> the blame of idle crashing problem of micron should go to nvidia. and evga jacob also said their new coming evga micron bios mainly aims to improve micron overclocking but not dealing with idle checkerboard problem. it seems that aic cant fix this problem, only nvidia can tackle it, but i am afraid nvidia wont give a **** about this problem...
> 
> is your card micron vram? any overclocking improvement ?
> 
> 
> 
> I agree, this card would be perfect
> 
> 
> 
> 
> 
> 
> 
> if it was not spoiled by such a bad memory...
> 
> 
> 
> 
> 
> 
> 
> 
> If I try to overclock my Micron vram just a little bit then I get the checkerboard artifact and reboot!
> 
> 
> 
> 
> 
> 
> 
> 
> And you have many people facing this issue even at stock speed, so treating this issue as an overclocker problem is just lawyer bull***** from Nvidia to cover their back!
Click to expand...

What OC utility are you using?

What performance setting have you set in Nvidia Control panel?

What voltage is your card sitting at when you click the apply button to OC the card?

If you lock the voltage on the afterburner curve at 1.093v, do you still get checkerboards?


----------



## ogider

Can someone share new bios for Palit 1070.
Looking for Game Rock Premium.
Just rom file.


----------



## asdkj1740

Quote:


> Originally Posted by *reflex75*
> 
> Sorry for you, but looks like Nvidia X70 generations are cursed!


I would say it's the x70 buyers who are cursed, and cursed by Nvidia.


----------



## iluvkfc

Can anyone confirm this is the BIOS I should use for a 225W power limit? Palit Super Jetstream. I am currently using the ASUS Strix BIOS for a 200W power limit, but it's still not enough for some real-world applications. The cards are Gigabyte GTX 1070 G1 PCBs, by the way, with Samsung memory.


----------



## asdkj1740

Quote:


> Originally Posted by *ogider*
> 
> Can someone share new bios for Palit 1070.
> Looking for Game Rock Premium.
> Just rom file.


Already posted in this thread, just a few pages back.
TechPowerUp has it too.


----------



## asdkj1740

Quote:


> Originally Posted by *iluvkfc*
> 
> Can anyone confirm this is the BIOS I should use for 225W power limit? Palit Super Jetstream. I am currently using ASUS Strix for 200W power limit but it's still not enough for some real-world applications. Cards are Gigabyte GTX 1070 G1 PCBs by the way, with Samsung memory.


Give it a try.

Btw, I would like to know which MOSFETs the Gigabyte 1070 G1 uses, so if you ever take off the cooler, please do me a favor and tell me the model numbers printed on them. Thank you.


----------



## iluvkfc

Quote:


> Originally Posted by *asdkj1740*
> 
> give it a try.
> 
> btw, i would like to know which mosfets are gigabyte 1070 g1 using so if one day you take off the cooler then please do me a favor telling me those models printed on the surface of those mosfets. thank you.


Already watercooling them; I didn't take a close look at the MOSFET part numbers, but I know it's the same as the 1080 G1, just with 2 phases missing.


----------



## asdkj1740

The reason Kingpin's latest world-record Titan X Pascal cards are using a normal BIOS, rather than a GPU-Boost-disabled modded BIOS like the 1080 one, should be the lack of EVGA's help, as EVGA doesn't sell the Titan X Pascal... and Nvidia doesn't give a **** about Kingpin's record.


----------



## asdkj1740

Quote:


> Originally Posted by *iluvkfc*
> 
> Already watercooling them, didn't take a close look at MOSFET part numbers but I know it's the same as 1080 G1, just 2 phases missing.


Never mind, thanks.
I am afraid the Gigabyte 1070 G1 Gaming VRMs are the same as on its RX 480 G1 Gaming... if that is the case then the VRMs on the 1070 G1 Gaming are quite weak and I don't think you should use a 250W BIOS. So the 225W Palit BIOS should be the safest one to look for.


----------



## F3niX69

Quote:


> Originally Posted by *asdkj1740*
> 
> no, technically, asus strix oc's power limit is too low, if you know how good is the vcore mosfet on asus strix pcb. should be the best among 1070.
> you may have known that the only one modded bios right now for 1080 is strix oc t4 bios from asus...
> 
> enough power limit should give you stable overclocking without throttling, but i dont think higher power limit will help you to achieve higher core clock.


So the ASUS Strix has very good power MOSFETs and is only held back by the 200W limit?


----------



## iluvkfc

Quote:


> Originally Posted by *asdkj1740*
> 
> never mind thanks.
> i am afraid gigabyte 1070 g1 gaming vrms are the same as its RX480 g1 gaming... if that is the case then the vrms in 1070 g1 gaming are quiet weak and i dont think you should use a 250w bios. so 225w palit bios should be the safest one you should look for.


Now I am interested in this 250W BIOS. Where can I find 250W BIOS that will actually let card draw 250W from a single 8-pin card, I thought these were for 8+6 and 8+8 cards?


----------



## asdkj1740

Quote:


> Originally Posted by *F3niX69*
> 
> so the asus strix has a very good power mosfet and is limited by the 200watt limit?


The IR3555, a very high-end power stage.
The ASUS Strix and Galax HOF (as well as the Gainward G Soul, a Chinese version) use this beast.
The 1080 Strix has eight of these MOSFETs: 60A x 8 = 480A, and 480A x 1.1V > 500W. There is no doubt the Pascal Strix is built for LN2.

Edit: I made a mistake; the 1070 Strix has only a 6-phase vcore, while the 1080 has 8.
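The power-stage arithmetic above can be sketched in a few lines; a minimal Python example, where the 60A-per-stage rating and the ~1.1V core voltage are taken from this post (assumptions, not measurements):

```python
# Rough VRM current/power headroom estimate, per the post above.
# Assumptions: each IR3555 power stage is rated for 60 A, and the core
# runs at roughly 1.1 V under load (both figures from the post).

def vrm_headroom(phases: int, amps_per_phase: float = 60.0, vcore: float = 1.1):
    """Return (total amps, approximate watts) the vcore VRM can deliver."""
    total_amps = phases * amps_per_phase
    return total_amps, total_amps * vcore

amps, watts = vrm_headroom(8)   # 1080 Strix: 8 phases -> 480 A, ~528 W
print(amps, watts)

amps, watts = vrm_headroom(6)   # 1070 Strix: 6 phases (per the edit above)
print(amps, watts)
```

This is only a back-of-the-envelope ceiling; real sustained power is limited by cooling and the BIOS power limit long before the stages themselves.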


----------



## asdkj1740

Quote:


> Originally Posted by *iluvkfc*
> 
> Now I am interested in this 250W BIOS. Where can I find 250W BIOS that will actually let card draw 250W from a single 8-pin card, I thought these were for 8+6 and 8+8 cards?


You can try the Zotac AMP BIOS and the Galax EXOC BIOS, but neither of them is for single-8-pin cards.
My dual-8-pin card, flashed to a single-8-pin BIOS and to an 8+6-pin BIOS, showed strange power usage and throttling in Furmark... so they may not suit your card, but you can try them all; just take the risks on yourself.
TechPowerUp has lots of BIOSes for the 1070.


----------



## F3niX69

Quote:


> Originally Posted by *asdkj1740*
> 
> IR3555, very high end mosfet.
> asus strix and galax hof (as well as gainward g soul, a chinese version) use this beast.
> it has eight this mosfets...60A*8=480A*1.1V>500W. it is no doubt that pascal strix is built for ln2.


Interesting. So why did ASUS choose a 200W max TDP in their BIOS, and why doesn't it come with extra bells and whistles like, let's say, a BIOS switch, if it's such a premium board?


----------



## asdkj1740

Quote:


> Originally Posted by *F3niX69*
> 
> Interesting. So why did asus choose a 200w max tdp on their bios and why doesn't it come with extra bells and whistles like (lets say) a bios switch?(if its such a premium board)


Lower temperature and noise.
Lower power consumption for better efficiency; that's the main trend nowadays...

ASUS acts superior; offering too much might harm its brand image.
In my opinion, ASUS = a premium price for the same product others provide.

The ASUS 1080 Turbo uses the 6-phase IR3555 PCB, the same PCB as the Strix with a little cut down, which is freaking awesome and really surprised me at that MSRP. It is rare to see ASUS offer such a great card at a competitive price.


----------



## F3niX69

Quote:


> Originally Posted by *asdkj1740*
> 
> lower temperature and noise.
> lower power consumption for better power efficiency, it is the main trend nowadays...
> 
> asus is superior, offering too much may harm its brand image.
> in my opinion, a premium price for the same product like other provided=asus.
> 
> asus 1080 turbo use the 6 phases ir3555 pcb, the same pcb of asus strix with little cut down, which is freaking awesome and very surprised me, at such msrp. it is rare to see asus to offer such great card in competitive price.


It would be awesome if they included a BIOS switch though. I'm very disappointed by that, because I'm a bit afraid to flash other higher-TDP vBIOSes to see if I get higher overclocks.


----------



## asdkj1740

Quote:


> Originally Posted by *F3niX69*
> 
> It would be awesome if they included a bios switch though,very dissapointed with that cause i am bit afraid to flash other higher tdp vbioses and see if i get higher overclocks


No higher clock, I'm afraid; just a more stable clock and less throttling. It's easier to maintain the clock at its instant max boost level.
Quote:


> Originally Posted by *F3niX69*
> 
> It would be awesome if they included a bios switch though,very dissapointed with that cause i am bit afraid to flash other higher tdp vbioses and see if i get higher overclocks


You can check out the locked-BIOS thread; lots of freaks have cross-flashed, lol. Their work and results are admirable.


----------



## gtbtk

EVGA have just released their Micron-bug BIOSes. They can be downloaded here:

http://forums.evga.com/EVGA-GeForce-GTX-1070-BIOS-Update-v8604500070-m2565056.aspx


----------



## DeathAngel74

Is there a way to just extract the ROM? I feel more comfortable using nvflash. I tried 7-Zip to extract it, with no luck.
https://drive.google.com/file/d/0B007JgCLgXQLSGkxcUxSS2cwdm8/view?usp=sharing
08g-p4-6173-kr bios update file


*Edit:*
Now stable at 8812 MHz, previously only 8500 MHz.


----------



## Ljanmi

Guys I have a massive problem


----------



## DeathAngel74

What are your full pc specs? Have you made any recent hardware or driver changes?


----------



## Ljanmi

Quote:


> Originally Posted by *DeathAngel74*
> 
> What are your full pc specs? Have you made any recent hardware or driver changes?


No. Just type "GTX 1070 lag spikes" into Google; you will see what I am talking about.


----------



## DeathAngel74

I dunno, I just did a clean install of Windows 7; 10 was pissing me off! I've only had my 1070 for a little over a week. Could it be a hardware conflict or a program running in the background?
The only thing I can suggest is:
remove the nVidia driver with DDU, and remove overclocking software too (delete the corresponding folders as well).
reboot
clean install of nVidia drivers
reboot
re-install overclocking software
reboot
reapply NVCP tweaks and overclocking parameters
Hopefully... PROFIT!


----------



## zipper17

Quote:


> Originally Posted by *Ljanmi*
> 
> Guys I have a massive problem


That's not 1070 stuttering; the game itself, AC4, is known for terrible PhysX optimization. Even with SLI, FPS still drops with PhysX set to high; it's simply impossible to hold solid framerates.

There are tons of references on the GeForce forums.

https://forums.geforce.com/default/topic/660850/physx/physx-in-assassins-creed-iv-black-flag-causing-low-fps-on-gtx-780/4/
http://forums.ubi.com/showthread.php/839455-Assassins-Creed-Black-Flag-and-my-pc-question


----------



## TheGlow

Quote:


> Originally Posted by *Ljanmi*
> 
> Guys I have a massive problem


Oh, this game. I got it free with my Xbox One and couldn't get more than an hour in. I don't think you're missing much.


----------



## Ljanmi

I don't think it will help









I have the same problem in Dying Light, Crew and now Black Flag


----------



## DeathAngel74

Alice: Madness Returns as well. Horribly optimized and laggy. It was driving me batty, so I deleted it. All other games are fine though... TW3, Batman: AK, SW:BF 2015, all maxed out.


----------



## Exenth

The new BIOS allowed me to OC my memory by +600MHz, which results in an actual clock of 9216 MHz,

and I was finally able to beat a 20k graphics score in Fire Strike.


----------



## DeathAngel74

......damn you, lol! I can't complain, i guess. From 8008 to 8812. Nice job!


----------



## Blasius

FOR ME:

1080P

MY ASUS FOUNDERS

GPU 2139 + DDR 4335










2K


----------



## muzammil84

Would this new BIOS help with Samsung memory OC too?


----------



## DeathAngel74

no, only Micron specific


----------



## zipper17

Quote:


> Originally Posted by *Ljanmi*
> 
> I dont think it will help
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have the same problem in Dying Light, Crew and now Black Flag


What resolution do you use in Dying Light and The Crew? What are your minimum framerates?

As for Black Flag, PhysX is simply broken; I have the game myself. Set PhysX to low or off for butter-smooth gameplay.


----------



## Ljanmi

Quote:


> Originally Posted by *zipper17*
> 
> that's not 1070 stuttering, the game itself AC4 are known for terrible physx optimization. Even with SLI still FPS drop with physx set to high, it's just simply unplayable with solid framerates.
> 
> there's ton of reference on geforce forums.
> 
> https://forums.geforce.com/default/topic/660850/physx/physx-in-assassins-creed-iv-black-flag-causing-low-fps-on-gtx-780/4/
> http://forums.ubi.com/showthread.php/839455-Assassins-Creed-Black-Flag-and-my-pc-question


Not really, it doesn't change a thing - *the same result*


----------



## Ljanmi

Quote:


> Originally Posted by *Ljanmi*
> 
> Not really, it doesnt change a thing - *the same result*


TXAA 4X was the problem. *zipper17 WAS RIGHT* about nVidia PhysX. Set to Low it seems to have no problems; I'll have to test that setting some more, but with it Off there are definitely *NO LAG SPIKES*









I apologize for the confusion


----------



## zipper17

Quote:


> Originally Posted by *asdkj1740*


I have the Galax EXOC board; it has a 250W limit.
During the 3DMark FSX stress test and in gaming, the highest GPU power I've seen on my card is 228W (114% GPU power TDP), as far as I remember.
2100 MHz core / 9200 MHz Samsung memory; anything higher than 2100 MHz on the core will randomly crash during 3DMark (nvlddmkm driver crash).

Will a BIOS with a higher power limit help me get a higher core clock stable? Curious about the AMP Extreme BIOS.
But IMO stable overclocking still mainly depends on how good the silicon is, am I right?
Quote:


> Originally Posted by *Ljanmi*
> 
> TXAA 4X was the problem, you were *RIGHT* about nVidia Physics. Set to Low seems to have no problem, Ill have to test that setting some more but on Off definitely there is *NO LAG SPIKES*
> 
> 
> 
> 
> 
> 
> 
> 
> I apologize for confusion


I used TXAA only, not TXAA 4x, btw. As for lag spikes, yeah, they will always happen if PhysX is set to high.


----------



## zipzop

The EVGA BIOS for my SC did nothing. Still artifacting / hanging / crashing in the +350 to +400 range.


----------



## DeathAngel74

My previous happy dance was crushed by reality. I'm forced back to 8726 MHz; the previous 88xx was not game stable. Still a big improvement, no need to go kick rocks over it. +362 (x2 effective) MHz offset; maybe I just got lucky.


----------



## zipper17

Quote:


> Originally Posted by *DeathAngel74*
> 
> My previous happy dance was crushed by reality. I'm am forced to 8726 MHz. The previous 88xx was not game stable. Still a big improvement. No need to go kick rocks over it. +362(x2 effective) MHz offset, maybe I just got lucky.


Did you try bumping your core more, instead of the memory?

I have Samsung memory, but anything higher than 2100 MHz on my core chip will trigger an 'nvlddmkm' driver crash. It's ridiculous. Overall the OC potential of the 1070 is mediocre at best, for both the GPU and the memory chips.

Core speed safe rate: 2100 MHz
Average core speed: 2050, 2038 MHz
Memory safe rate: 2250-2300 MHz (9000-9200 MHz effective)


----------



## DeathAngel74

No. Same. 2101 on core or crash. So now, 2101/8726


----------



## bobfig

Oo, I wanna join. Just got an MSI Seahawk 1070 EK X and it's been nice. As far as OC, I didn't really push the memory, but I'm at 2088 MHz core and 4101 MHz on the VRAM. If I try for 2100 MHz on the core it starts to crash the drivers.


----------



## DeathAngel74

I think 2100 is the sweet spot for most 1070s


----------



## ahmedmo1

Just switched over from a 980ti... why? Got it for the exact same price. Will get around to OCing when I have time.


----------



## DeathAngel74

Do we still need to run NVCP at max performance with the bios update? Wondering if I can remove dwm.exe and explorer.exe from NVCP as well? I used to set it at optimal power before all this started.


----------



## gtbtk

Quote:


> Originally Posted by *ahmedmo1*
> 
> 
> 
> Just switched over from a 980ti... why? Got it for the exact same price. Will get around to OCing when I have time.


There is a BIOS update available for your card that should assist with overclocking; you should apply it. The link is in post #4660.


----------



## zipper17

GTX1070(1x) 24K FS Graphic scores, holy ****
http://www.3dmark.com/fs/9740626


----------



## asdkj1740

Quote:


> Originally Posted by *zipper17*
> 
> i have the galax exoc board, it has 250w.
> During 3dmark FSX stress test/gaming, the highest gpu power i've seen on my card is 228w (114% GPu power TDP) as far as i remember.
> 2100/9200mhz samsung memory, the corespeed anything higher than 2100mhz it will randomly crash during 3dmark (nvlddmkm driver crash)
> 
> does bios with higher powerlimit will help me get higher core clock speed stable? curious about for amp extreme bios.
> but imo stable overclocking always still depends how good is the silicon chip, am I right?.


It may help stability; you can try it. When I run Furmark I get 280W and no throttling on the core using the Zotac AMP Extreme BIOS. If your stock BIOS downclocks when running Furmark, then you may consider flashing the Zotac AMP Extreme one.
Try setting 100% fan speed for the lowest temperature you can get.
Your Samsung can only run at 2300 MHz??? Can yours run at 2350 or 2400 in gaming?


----------



## asdkj1740

Quote:


> Originally Posted by *DeathAngel74*
> 
> I think 2100 is the sweet spot for most 1070s


AIO and water cooling should be able to get a higher result if the voltage curve is fine-tuned.


----------



## asdkj1740

Quote:


> Originally Posted by *DeathAngel74*
> 
> Do we still need to run NVCP at max performance with the bios update? Wondering if I can remove dwm.exe and explorer.exe from NVCP as well? I used to set it at optimal power before all this started.


no, at least i didnt


----------



## DeathAngel74

Thanks


----------



## zipper17

Quote:


> Originally Posted by *asdkj1740*
> 
> may help stability, you can try it. when i run furmark i get 280w and no throttling at core core using zotac amp extreme. if your stock bios give you downclock in running furmark then you may consider flashing zotac amp extreme.
> try to set 100% fan speed for lowest temp you can get.
> your samsung can run at 2300mhz only??? can yours run at 2350 or 2400 in gaming?


I can get 2325-2376 MHz (9300-9500 MHz) memory, but I got noticeable artifacts (green-ball flashing) in 3DMark/Witcher 3, as far as I remember.
Small memory artifacts are pretty hard to detect, as they last only milliseconds, but I can see them.
The application keeps running fine, though.

2250-2300 (9000-9200 MHz) is the safe zone.

But my core speed sucks. I tried bumping the peak core to 2114-2151 MHz = crashing in 3DMark (nvlddmkm driver crash).

My current stable settings (custom curve via the shift+click method, 60% core voltage, max power & temp limits):
peak core = 2101 MHz, average = ~2063, 2050, 2038 MHz, depending on the game and temperatures.
mem = 2250-2300 (9000-9200 MHz)
Fire Strike GS = 20,8XX.
Fire Strike GS with 9500 MHz = 21,1XX (but with small memory artifacts).
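The memory numbers in these posts mix several conventions; a small Python sketch of the bookkeeping, assuming the usual GDDR5 convention (GPU-Z shows the base command clock, MSI Afterburner shows double that and applies its +MHz offset in that domain, and the "effective" marketing rate is 4x the base):

```python
# GDDR5 clock bookkeeping, as used loosely throughout this thread.
# Convention (an assumption, consistent with the numbers posted here):
#   - GPU-Z shows the base command clock (e.g. 2002 MHz)
#   - MSI Afterburner shows double that (e.g. 4004 MHz) and applies
#     its +MHz offset in that domain
#   - the "effective" data rate is 4x the base clock

def effective_from_base(base_mhz: float) -> float:
    """Effective GDDR5 rate from the GPU-Z base command clock."""
    return base_mhz * 4

def effective_from_offset(afterburner_mhz: float, offset_mhz: float) -> float:
    """Effective rate from the Afterburner reading plus an offset."""
    return (afterburner_mhz + offset_mhz) * 2

print(effective_from_base(2250))         # 9000 -> the "9000 MHz" safe zone above
print(effective_from_offset(4004, 362))  # 8732 -> close to the 8726 MHz reported earlier
```

This is why a "+400" offset in one post and a "2250 MHz" figure in another can describe similar actual memory speeds.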


----------



## mrmouse

Good morning, everyone









I just want to pass on the information I collected for myself to anyone who can use it.
I'm from Bavaria, so there could be some grammatical errors.

First, my system is as in my signature.
After nearly 7 years (20.1.2010 - 19.9.2016) I switched from an ATI XFX 5850 BE to a KFA² GTX 1070 EX. This card is (was) available in Germany for 399 Euros.
I bought it almost exactly 4 weeks ago.

For the first two weeks, I just observed how the card works and feels.
I watched the clocks and wrote them down.

Default: 1519 MHz @ 0.8000V
Default Boost: 1709 MHz
Default Boost REAL: 1860-1923 MHz @ 1.0313V-1.0500V

The stock cooler of the KFA² card wasn't so bad. But it's not in my "nature" to leave hardware stock.

Then I remembered that I've got an Alpenföhn PETER cooler down in the basement. From then on there was no turning back.

The VRAM heatsinks were glued on with Arctic Silver Alumina Thermal Adhesive (1:1). I mixed it with normal thermal compound (Arctic Silver Alumina) in a ratio of about 1:1:2, to reduce the glue's adhesive force.

Between PETER and the GPU I decided to use Arctic Cooling MX-2 thermal compound.

I mounted 2x Scythe Kaze Jyu SLIM SY1012SL12M 100x100x12 mm fans @ 7 volts on PETER.



IMPORTANT:
The Stock Cooler Temperatures are AVG, with manual OC and with default boost.
The PETER Temperatures are at 2012/2224 @ 0.9000V.
Memory: MICRON

BIOS Screen:


This is my 24/7 Setting now:

2025/2224 @ 0.9000V - 2025 drops at about 50°C to 2012. But this clock (2012) is Furmark, Benchmark and Game stable!
IDLE 164/202 @ 0.6250V

Games I used to test (everything maxed, even with downsampling; I just made sure the GPU was at 99% and then played for hours):

Mad Max
Grid Autosport
Dirt Rally
GTA V
Portal 2
Crysis Warhead

The biggest difference are the Temperatures:

Stock Cooler (Auto-Fan / till 50°C Zero-Mode):
Idle: 38-46°C
Normal Load: 68-78°C
Furmark Load: 82°C and more (doesn't increase the Max °C)

PETER with 2x Scythe Kaze Jyu SLIM @ 7Volt
Idle: 28-32°C
Normal Load: 44-52°C
Furmark Load (15Minutes): max. 64°C

So far, so good. I'm very happy that I changed the VGA cooler.
Maybe I will try liquid metal between the GPU and the cooler. I've got it here, but I don't want to take the card out of the case.

Now I will show you my personal notes, which I kept ONLY FOR MYSELF.

Maybe somebody can use some of this information.

Best greetings from cold Upper Bavaria.

MY TEXT-DOCUMENTS (just copy and paste)

Furmark 1280x720 8xMSAA 'Window Mode' (FAST TESTS 1-5Min)

MSI Afterburner: Corevoltage +0%
Power Limit 100%
Temp Limit 83°C
Memory Clock +0MHz
Fan Speed 100% (Stock KFA2 Cooler)

1784 @ 0,8000v (crash at 1794) ~55% TDP ~55°C
1885 @ 0,8125v (crash at 1898)
1936 @ 0,8250v (crash at 1949)
1936 @ 0,8313v (crash at 1949) !
1949 @ 0,8438v (crash at 1961)
1961 @ 0,8500v (crash at 1974)
1974 @ 0,8625v (crash at 1987)
1987 @ 0,8750v (crash at 1999)
1999 @ 0,8812v (crash at 1999) !
1999 @ 0,8938v (crash at 2012) ~65% TDP ~61°C
2012 @ 0,9000v (crash at 2037 after 4-5Min)
2037 @ 0,9125v (crash at 2050)
2050 @ 0,9250v (crash at 2050) !

Furmark 1280x720 8xMSAA 'Window Mode'

MSI Afterburner: Corevoltage +0%
Power Limit 125% (Does not work on my card - only with the tool from KFA2)
Temp Limit 92°C
Memory Clock +400MHz

VGA Cooler: Alpenföhn Peter (Rev.1)
VGA Memory Heatsinks: Black low Heatsinks from Peter
Memory Heatsink Glue: Arctic Silver 'Alumina Thermal Adhesive' mixed with Arctic Silver 'Alumina Thermal Compound' Mixing Ratio: 1:1:2
FAN: 2x Scythe Kaze Jyu SLIM SY1012SL12M, 100x100x12 mm, @ 7Volt (12V=2000RPM)
Thermal Compound: Arctic Cooling 'Arctic MX-2' (a few years old)

GPU:
2025MHz @ 0.9000v

drops to 2012, then to 1999 MHz (hitting 58-59°C) - manually pushed back to 2012 MHz, then it's stable
Max.Temp:63-64° TDP:90-93% GPU-Load: 99% MemoryControllerLoad: 99-100%

Memory:
2202Mhz/4404MHz/8808MHz - +200MHz/400MHz/800MHz

Up to 50°C, 2300 MHz mem is OK. Then small artefacts.
Up to 55°C, 2250 MHz mem is OK. Then small artefacts.
Up to 64°C (and maybe more?), 2202 MHz is OK. NO artefacts so far.

Mad-Max [email protected] max.Details vsync off about 10 hours

AVG Temperature 60-74°C

MSI Afterburner: Corevoltage +0%
Power Limit 100%
Temp Limit 83°C
Memory Clock +0MHz
Fan Speed 69% (Stock KFA2 Cooler)
2025 @ 0,9000v

jumping to 2012 @ 0,9000v

3D Mark Fire Strike:

MSI Afterburner: Corevoltage +0%
Power Limit 125%
Temp Limit 92°C
Memory Clock +0MHz
PETER

Passed with: 2176MHz @ 1,0625V (drop to 2164MHz)

Passed with: 2227MHz @ 1.0625V (1st drop to 2214, 2nd drop to 2202, 3rd drop to 2189)
Tried 2240 @ 1.0625V: instant crash

FRAMEBUFFER Crysis WARHEAD Benchmarking Tool 0,40

DirectX 10 ENTHUSIAST 15X @ Map: frost flythrough @ 8 2880 x 1800 AA 8x
==> Framerate [ Min: 45.85 Max: 75.74 Avg: 60.53 ]

DirectX 10 ENTHUSIAST 15X @ Map: cargo flythrough @ 8 2880 x 1800 AA 8x
==> Framerate [ Min: 24.39 Max: 59.43 Avg: 47.71 ]

DirectX 10 ENTHUSIAST 15X @ Map: ambush flythrough @ 8 2880 x 1800 AA 8x
==> Framerate [ Min: 31.89 Max: 75.23 Avg: 54.85 ]

GPU:
2025MHz @ 0,9000v
drops at 50°C to 2012MHz - then stable at 2012MHz

Max.Temp:52° Max.TDP:68.0% Max.GPU-Load:99% Max.MemoryControllerLoad:78%
Max.BusInterfaceLoad:5% Max.MemoryUsed:1567MB

Memory:
2202Mhz/4404MHz/8808MHz - +200MHz/400MHz/800MHz
Stable - No Artefacts
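The quick-test voltage/clock pairs recorded above can be turned into a simple lookup; a Python sketch using the Furmark-stable values from this post (stability in games may differ, as the notes themselves show):

```python
# Lowest tested voltage that held a given core clock in the quick Furmark
# tests above (values copied from the post; duplicate-voltage rows merged).
stable_points = [  # (voltage in V, highest stable clock in MHz)
    (0.8000, 1784), (0.8125, 1885), (0.8250, 1936), (0.8438, 1949),
    (0.8500, 1961), (0.8625, 1974), (0.8750, 1987), (0.8812, 1999),
    (0.9000, 2012), (0.9125, 2037), (0.9250, 2050),
]

def min_voltage_for(clock_mhz: int):
    """Return the lowest recorded voltage that held clock_mhz, or None."""
    for volts, mhz in stable_points:   # list is sorted by voltage
        if mhz >= clock_mhz:
            return volts
    return None

print(min_voltage_for(2012))  # 0.9 -> matches the 24/7 setting above
```

Every chip's curve is different; these points only describe this particular KFA² card, but the same bookkeeping works for anyone tuning a custom voltage/frequency curve.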


----------



## Teshreni

Hi,

is this a normal result?

I have seen better than this with the same GPU and CPU.



http://www.3dmark.com/fs/10364535


----------



## outofmyheadyo

Way too low


----------



## gtbtk

Quote:


> Originally Posted by *Teshreni*
> 
> Hi ..
> 
> Is this Normal Result ?
> 
> i have seen better than this With the same GPU and CPU
> 
> 
> 
> http://www.3dmark.com/fs/10364535


Your 6600K should be capable of getting you a physics score of about 10,000, so that is underperforming. It could be running way too hot and need a better cooler; you could try overclocking the CPU if you haven't done that; or you may have many things running in the background while the benchmark runs, taking resources away from the CPU.

Graphics scores should be somewhere around 20,000-21,000 with a reasonable CPU. 18,000 could be because your CPU is underperforming.


----------



## DeathAngel74

Finally got RGB/FULL/12bpc working. Colors in-game are really vibrant now.


----------



## asdkj1740

Another 1070 round-up review:
https://www.computerbase.de/2016-10/geforce-gtx-1070-partnerkarten-vergleich-test/

Palit/Gainward, well done. I am sure that with a higher power limit BIOS they could perform even better.
The G1 is a trap, and the iChill has lost the air-cooling crown it held in the Kepler and Maxwell generations.
The Zotac AMP Extreme really has the most insane BIOS settings; however, its cooling may not be good enough to match its 300W BIOS.


----------



## netok

The new EVGA BIOS seems to work well with Micron VRAM.
Can someone extract their new EVGA BIOS and upload it to TPU or here?
It can be done in GPU-Z by pressing the button next to the BIOS version.
Thanks


----------



## asdkj1740

Quote:


> Originally Posted by *netok*
> 
> New EVGA bios seems to work well with Micron vram.
> Can someone extract their new EVGA BIOS and upload to TPU or here?
> It can be done in GPU-z, pressing the button next to bios version.
> Thanks


FTW secondary; that is the 215W BIOS.
https://www.sendspace.com/file/5h060m

I extracted it with GPU-Z. It's not 250KB, so I don't know whether it will work / whether I extracted it correctly.


----------



## DeathAngel74

https://drive.google.com/file/d/0B007JgCLgXQLTk5vaXhCa1Jwc0U/view?usp=sharing
New 1070 SC primary (single-BIOS card) 170W BIOS. Also not 250KB; it's 259KB. I used nvflash 5.287 x64 to extract it.
86.04.50.00.70


----------



## netok

Quote:


> Originally Posted by *asdkj1740*
> 
> ftw secondary, that is the 215w bios.
> https://www.sendspace.com/file/5h060m
> 
> i extracted it by gpuz, its not 250KB, so i dont know whether it would work//i extracted it correctly


Quote:


> Originally Posted by *DeathAngel74*
> 
> https://drive.google.com/file/d/0B007JgCLgXQLTk5vaXhCa1Jwc0U/view?usp=sharing
> 1070 SC primary (single bios card) 170W BIOS. Also not 250kb, its 259kb. I used nvflash 5.287 x64 to extract it.
> 86.04.50.00.70


Much appreciated!


----------



## gtbtk

The new EVGA bioses require a new version of nvflash, later than version 5.292 which is the latest one I have.

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *netok*
> 
> New EVGA bios seems to work well with Micron vram.
> Can someone extract their new EVGA BIOS and upload to TPU or here?
> It can be done in GPU-z, pressing the button next to bios version.
> Thanks
> 
> 
> 
> ftw secondary, that is the 215w bios.
> https://www.sendspace.com/file/5h060m
> 
> i extracted it by gpuz, its not 250KB, so i dont know whether it would work//i extracted it correctly

That won't work; you need a new 5.3X version of nvflash to flash that file, and it is not generally available as yet. What you can do, to try it out, is install the 86.04.26.xx FTW BIOS using nvflash 5.292 or 5.287 from Joe Dirt, and then use the EVGA update utility that you can download from the EVGA forums to flash your new "EVGA FTW" card.

I tried it on my MSI Gaming X and the Micron memory now works as expected. The limited TDP did mean that my card ran slower than it did with the MSI BIOS, because it kept throttling back when it hit the power limit.


----------



## DeathAngel74

I noticed that the update utility came with an nvflash64.sys from Sept 2016. The new BIOS seems stable after 12 hours. I tried different things to get it to crash after finding stable clocks, lol. No crashing or checkerboard screen of death (CSOD), so I think I'm good to go!


----------



## Roland0101

Quote:


> Originally Posted by *ahmedmo1*
> 
> 
> 
> Just switched over from a 980ti... why? Got it for the exact same price. Will get around to OCing when I have time.


369.09 is the MS driver, you should install one from Nvidia.


----------



## FastOne8

Hello. I've been testing my new Palit GTX 1070 GameRock (stock clocks). I played Crysis 3 for an hour and then my video froze, but the sound continued. I had to kill it with Task Manager to be able to return to Windows. I checked Event Viewer, but there is nothing about the driver crashing. I guess this is not normal, right? Because it looks like a driver crash.
Also, I have Micron memory, if that matters. I'm really worried about this crash, and I don't want to sound stupid, but should I RMA my card?


----------



## Jurgennoppe

Hi guys,

I'm new to this whole video card BIOS flashing thing.

I have a Palit 1070 Super Jetstream. I discovered my vBIOS version is 86.04.26.00.28.
Is there a newer version of this, and what does it do?
Also, how do I install it? Help would be greatly appreciated!

I checked GPU-Z; it's Micron, not Samsung.


----------



## gtbtk

Palit has released new BIOSes that are supposed to solve the Micron checkerboard crash bug on 1070 cards. Go to the link for your card, download the BIOS update utility, and run the program. It will update the BIOS for your card.

Quote:



> Originally Posted by *FastOne8*
> 
> Hello. I've been testing my new Palit GTX 1070 gamerock (Stock clock). I've played Crysis 3 for a hour and then my video froze, but sound continued. I had kill with task manager to be able to return to the Windows. I've checked Event Viewer but there is nothing about driver crashing. I guess this is not a normal, right? because this crash looks like driver crash.
> Also I got micron memory if thats' matter. I'm really worried about this crash and I don't want to sound stupid, but should I RMA my card?


Before you RMA the card, update the BIOS and see if that resolves your problem:

http://www.palit.com/palit/vgapro.php?id=2634&lang=en&pn=NE51070T15P2-1041G&tab=do

Quote:


> Originally Posted by *Jurgennoppe*
> 
> Hi guys,
> 
> i'm new to this whole videocard bios flashing thing.
> 
> I have a palit 1070 super jetstream. I discovered my vbios version is 86.04.26.00.28
> Is there a newer version of this and what does it do?
> Also, how do i install it? Help would be greatly appreciated!
> 
> i checked gpu-z, it's micron, not samsung.


http://www.palit.com/palit/vgapro.php?id=2629&lang=en&pn=NE51070S15P2-1041J&tab=do


----------



## Velocifera

Hey all!

I have a few questions about doing the shunt mod to unlock the power limit on an EVGA GTX 1070 ACX3.0. I saw that many users have used liquid metal paste to do this mod but I was curious about alternatives.

I recently attempted to do this mod using a tiny strip of conductive copper foil tape on the shunt resistor and upon testing I noticed it did not change my desktop consumption % at all and when I launched a game to check for sure, my card froze up and died. Removing and cleaning the mod didn't undo whatever damage was caused and I was forced to seek a replacement.

I feel terrible about what happened, but I don't want to be discouraged, because overclocking is a bit of a passion of mine. My problem is that I have no idea what might have happened. I made very sure nothing was physically damaged and nothing else was shorted out by the tape before I tested. I really would like to know what might have happened to my card, and why this mod didn't work at all for me when so many other users have reported success. Should I try actually using liquid metal instead of copper foil tape? Does EVGA's reference-style card behave differently than others?


----------



## Roland0101

Quote:


> Originally Posted by *FastOne8*
> 
> Hello. I've been testing my new Palit GTX 1070 gamerock (Stock clock). I've played Crysis 3 for a hour and then my video froze, but sound continued. I had kill with task manager to be able to return to the Windows. I've checked Event Viewer but there is nothing about driver crashing. I guess this is not a normal, right? because this crash looks like driver crash.
> Also I got micron memory if thats' matter. I'm really worried about this crash and I don't want to sound stupid, but should I RMA my card?


Do what gtbtk suggested.
Furthermore, did you monitor your system temps?


----------



## mrtbahgs

Quote:


> Originally Posted by *Nukemaster*
> 
> Try shadowplay(They call it Share now, but you can turn off all the streaming stuff) it works very well.
> 
> The actual replay feature lets you run a constant buffer so you can grab the last X min of video and save it. I do not use it, but it is a cool idea for those random things that games do.


Quote:


> Originally Posted by *RyanRazer*
> 
> Do try shadow play. I tried it and FPS hit was almost zero, whereas when i tried to record with FRAPS for example, it was horrible.


I tried Share (ShadowPlay) tonight to see what it is like, and while it seemed easy to set up and record my last 5 minutes, I noticed that the quality of the video is not as good as when I am actually playing.
I have it set to use my "in-game" resolution, so it should be capturing at 1440p, and it maxes out at 60fps. Is the fps cap the reason it would be a bit blurry?

I tried a quick custom setup where I made sure it was 1440p, but then either that or something else screwed up, because I was only recording sound and not video.
It took me a few tries, and I don't know exactly what got it recording video again, but I am back to lower quality in what it captures.

If it matters, I tried playback in both Windows Media Player and VLC.

I forgot to pay attention to the performance hit, so I will have to try to remember it next time.


----------



## _Killswitch_

I have a question: next week I'll have enough spare cash to upgrade my GTX 680 to a GTX 1070. The question is, is there any real gain (worth the extra money) from dual-8-pin cards vs single-8-pin cards?


----------



## Snuckie7

What would be faster, a ~1500 MHz GTX 980Ti or a ~2100 MHz GTX 1070? I was about to pull the trigger on a new 1070 but it seems like the last gen 980Ti might still be faster. . .


----------



## Forceman

Quote:


> Originally Posted by *Snuckie7*
> 
> What would be faster, a ~1500 MHz GTX 980Ti or a ~2100 MHz GTX 1070? I was about to pull the trigger on a new 1070 but it seems like the last gen 980Ti might still be faster. . .


Unless the 980 Ti is a fair bit cheaper, you are better off with a 1070. Even if the 980 Ti is faster, it won't be by much, and the 1070 has the advantage in power draw and new feature support (however limited that may be).


----------



## khanmein

Quote:


> Originally Posted by *_Killswitch_*
> 
> I have a question, next week i'll have enough spare cash to upgrade my GTX 680 too a GTX 1070. Question is, there any real gain (worth the extra money) in Dual 8-pin cards vs single 8-pin cards?


For Pascal, the extra power connector is pretty much a gimmick; a single 8-pin is more than enough. FYI, try to get Samsung VRAM on a GTX 1070 to avoid all the hassle, and good luck.


----------



## khanmein

Quote:


> Originally Posted by *Snuckie7*
> 
> What would be faster, a ~1500 MHz GTX 980Ti or a ~2100 MHz GTX 1070? I was about to pull the trigger on a new 1070 but it seems like the last gen 980Ti might still be faster. . .


The 384-bit bus is the main reason the 980 Ti > 1070.

Let's stick with the 970 for a moment and wait for the Pascal refresh.


----------



## Snuckie7

Quote:


> Originally Posted by *khanmein*
> 
> 384-bit bus is the main point 980Ti > 1070
> 
> let's us stick with 970 for a moment & wait for pascal refresh


My 970 can't quite handle the highest settings at 1440p and there is a 1070 selling for $360 right now. . .decisions


----------



## gtbtk

Quote:


> Originally Posted by *mrtbahgs*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nukemaster*
> 
> Try shadowplay(They call it Share now, but you can turn off all the streaming stuff) it works very well.
> 
> The actual replay feature lets you run a constant buffer so you can grab the last X min of video and save it. I do not use it, but it is a cool idea for those random things that games do.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *RyanRazer*
> 
> Do try shadow play. I tried it and FPS hit was almost zero, whereas when i tried to record with FRAPS for example, it was horrible.
> 
> Click to expand...
> 
> I tried Share (Shadowplay) tonight to see what it is like and while it seemed easy to setup and record my last 5 minutes, I noticed that the quality of the video is not as good as when I am actually playing.
> I have it set to use my "In-game" resolution so it should be capturing at 1440p and then it maxes at 60fps, is the fps the reason it would be a bit blurry?
> 
> I tried a quick custom setup to where I made sure it was 1440p but then either that or something else screwed up because I was only recording sound and not video.
> It took me a few tries and i don't know what exactly to get it recording video again, but I am back to less quality with what it captures.
> 
> If it matters, I tried playback in both Windows Media Player and VLC player
> 
> I forgot to pay attention to the performance hit for me, so I will have to try and remember it next time.
Click to expand...

The video is being encoded on the fly, so it will never look as good as the in-game display. Video playback frame rates are capped at 60 fps regardless; you will give the card an easier time if you use vsync and cap the game framerate at 60 fps while recording, so it does not have the overhead of doing a pulldown from a higher, unstable frame rate to 60 fps. The maths gets really tricky, or it ends up just dropping many of the frames it has encoded, wasting GPU resources.

You should see better results if you output the video at 1080p, as the down-res will hide some of the encoding artifacts.
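That pulldown waste is easy to see with a toy sketch (my own illustration in Python, not how Shadowplay/NVENC actually schedules frames): sample a game's frames onto a fixed 60 fps recording timeline and count how many rendered frames never make it into the video.

```python
# Toy model (editor's own illustration): a fixed 60 fps recording grabs the
# most recent game frame at each capture instant; any game frame that is
# never grabbed was rendered for nothing.
def frames_kept(game_fps: float, record_fps: float = 60.0, seconds: float = 1.0):
    """Return (frames that reach the recording, frames wasted)."""
    game_times = [i / game_fps for i in range(int(game_fps * seconds))]
    kept = set()
    for j in range(int(record_fps * seconds)):
        t = j / record_fps
        # index of the latest game frame rendered at or before this capture
        latest = max((i for i, gt in enumerate(game_times) if gt <= t), default=0)
        kept.add(latest)
    dropped = len(game_times) - len(kept)
    return len(kept), dropped

# An uncapped game at 90 fps throws away ~30 frames every second of recording,
# while a vsync-capped 60 fps source maps 1:1 with nothing wasted.
print(frames_kept(90))  # (60, 30)
print(frames_kept(60))  # (60, 0)
```

Capping at the recording rate means the encoder sees a clean 1:1 stream instead of having to pick and discard frames.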


----------



## gtbtk

Quote:


> Originally Posted by *_Killswitch_*
> 
> I have a question, next week i'll have enough spare cash to upgrade my GTX 680 too a GTX 1070. Question is, there any real gain (worth the extra money) in Dual 8-pin cards vs single 8-pin cards?


At this point in time, not really. If someone ever works out how to make the BIOS editable to unlock voltages and TDP caps, then the answer is a resounding maybe.


----------



## khanmein

Quote:


> Originally Posted by *Snuckie7*
> 
> My 970 can't quite handle the highest settings at 1440p and there is a 1070 selling for $360 right now. . .decisions


I'm in the same scenario as you; I haven't pulled the trigger yet because of the Micron memory.

I might wait for December, or for my brother to come back from the US, or for the GTX 1080 Ti release.


----------



## Snuckie7

Quote:


> Originally Posted by *khanmein*
> 
> theoretically, i having same scenario like u, i didn't pull the trigger yet due to the micron.
> 
> literally, i might wait for "DEC" or my bro come back from US or GTX 1080Ti release.


Is it widespread that cards are using Micron instead of Samsung memory now? I was looking at getting an MSI Gaming X.


----------



## khanmein

Quote:


> Originally Posted by *Snuckie7*
> 
> Is it widespread that cards are using micron instead of Samsung memory now? Was looking at getting an MSI Gaming X


Yeah, most likely. I'm torn on the MSI Gaming X too, but I prefer the Asus Strix.


----------



## Pittster

Quote:


> Originally Posted by *Snuckie7*
> 
> Is it widespread that cards are using micron instead of Samsung memory now? Was looking at getting an MSI Gaming X


Everything I have read says the actual 1070s consumers have in their hands are a mix of Micron and Samsung memory.

It seems all brands have a mix, including some Founders Edition cards.

A vBIOS is coming from most brands, and MSI has said on their forums that it is coming in 2-4 weeks.

For reference, I have an MSI Gaming 1070 running at 2062MHz core and 4353MHz memory, with 100% core voltage increase, power limit 126%, and temp limit 92°C.

If I go any higher on the memory I get the white-squares crash, but only when a 3D application is loading or unloading.


----------



## Snuckie7

Pulled the trigger on a $355 MSI 1070 Gaming X.

Let's see how this goes.


----------



## asdkj1740

Quote:


> Originally Posted by *_Killswitch_*
> 
> I have a question, next week i'll have enough spare cash to upgrade my GTX 680 too a GTX 1070. Question is, there any real gain (worth the extra money) in Dual 8-pin cards vs single 8-pin cards?


Go check TechPowerUp's 1070 BIOS collection page and compare the power limits of the different cards instead of looking at the connectors.
Some dual 8-pin cards like the Zotac AMP Extreme have 300W.
Some dual 8-pin cards like the EVGA FTW have only 226W.
Judging by the number of connectors is a mistake, like I made...


----------



## asdkj1740

Quote:


> Originally Posted by *Velocifera*
> 
> Hey all!
> 
> I have a few questions about doing the shunt mod to unlock the power limit on an EVGA GTX 1070 ACX3.0. I saw that many users have used liquid metal paste to do this mod but I was curious about alternatives.
> 
> I recently attempted to do this mod using a tiny strip of conductive copper foil tape on the shunt resistor and upon testing I noticed it did not change my desktop consumption % at all and when I launched a game to check for sure, my card froze up and died. Removing and cleaning the mod didn't undo whatever damage was caused and I was forced to seek a replacement.
> 
> I feel terrible about what happened but I don't want to be discouraged because overclocking is a bit of a passion of mine. My problem is that I have no idea what might have happened. I made very sure nothing was physically damaged and nothing else was shorted out by the tape before I tested. I really would like to know what might have happened to my card and why this mod didn't work at all for me when so many other users have reported success? Should I try actually using liquid metal instead of copper foil tape? Does EVGAs reference style card behave differently than others?


Flashing another high-power-limit BIOS is safer.


----------



## Jurgennoppe

Thx,

but what does this update actually do? So I just download and run the BIOS setup file there?


----------



## Nukemaster

Quote:


> Originally Posted by *mrtbahgs*
> 
> I tried Share (Shadowplay) tonight to see what it is like and while it seemed easy to setup and record my last 5 minutes, I noticed that the quality of the video is not as good as when I am actually playing.
> I have it set to use my "In-game" resolution so it should be capturing at 1440p and then it maxes at 60fps, is the fps the reason it would be a bit blurry?
> 
> I tried a quick custom setup to where I made sure it was 1440p but then either that or something else screwed up because I was only recording sound and not video.
> It took me a few tries and i don't know what exactly to get it recording video again, but I am back to less quality with what it captures.
> 
> If it matters, I tried playback in both Windows Media Player and VLC player
> 
> I forgot to pay attention to the performance hit for me, so I will have to try and remember it next time.


As others have said, the video is compressed; it will not look as good as gameplay. If you have the space, you can try higher bitrates (this can improve things, but will create larger files as well).
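As a rough guide to what higher bitrates cost in disk space (simple arithmetic, assuming a roughly constant bitrate; the function name and example bitrates are just my own illustration):

```python
# Back-of-envelope recording size: megabits/s -> gigabytes (decimal GB).
def recording_size_gb(bitrate_mbps: float, minutes: float) -> float:
    """bitrate * seconds gives megabits; /8 -> megabytes; /1000 -> GB."""
    return bitrate_mbps * minutes * 60 / 8 / 1000

# 10 minutes at a high 50 Mbps setting vs a more modest 15 Mbps:
print(recording_size_gb(50, 10))  # 3.75 GB
print(recording_size_gb(15, 10))  # 1.125 GB
```

So roughly tripling the bitrate triples the file size; quality gains flatten out well before that, so it is worth testing a few steps.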


----------



## Jurgennoppe

Quote:


> Originally Posted by *gtbtk*
> 
> Before you RMA the card, update the bios file and see if that resolves your problem
> 
> http://www.palit.com/palit/vgapro.php?id=2634&lang=en&pn=NE51070T15P2-1041G&tab=do
> 
> http://www.palit.com/palit/vgapro.php?id=2629&lang=en&pn=NE51070S15P2-1041J&tab=do


Thx,

but what does this update actually do? So I just download and run the BIOS setup file there?


----------



## asdkj1740

Quote:


> Originally Posted by *Jurgennoppe*
> 
> Thx,
> 
> but what does this update actually do?So I just download and run the bios setup file there?


The Palit BIOS improves the power limit, Micron memory overclocking, and the fan spin-up problem.
If your card is misbehaving, you have nothing to lose by trying the new BIOS.
Palit's new BIOS update utility is smart: it auto-detects the right BIOS for your card.


----------



## asdkj1740

http://forums.evga.com/GTX1060-SC-6GB-with-Micron-GDDR5-now-m2565778.aspx

Indeed, VRAM overclocking beyond factory settings is not guaranteed by the AIC, but it is implicitly guaranteed by the VRAM chip producers like Samsung and Micron.
And some games really love memory bandwidth; Digital Foundry has some videos about this.
Overclocking VRAM gives a big improvement, shown not just in FPS but in frame times, which I think is a more important index of the actual gaming experience (e.g. stuttering).
Do not be fooled by fools... google the truth...


----------



## gtbtk

Quote:


> Originally Posted by *Snuckie7*
> 
> Pulled the trigger for a $355 MSI 1070 Gaming X
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Let's see how this goes.


The Micron memory on these cards, while it needs a workaround to avoid crashing when you overclock the memory, is not actually all the doom and gloom some people would have you believe.


----------



## gtbtk

Quote:



> Originally Posted by *Jurgennoppe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Before you RMA the card, update the bios file and see if that resolves your problem
> 
> http://www.palit.com/palit/vgapro.php?id=2634&lang=en&pn=NE51070T15P2-1041G&tab=do
> 
> http://www.palit.com/palit/vgapro.php?id=2629&lang=en&pn=NE51070S15P2-1041J&tab=do
> 
> 
> 
> Thx,
> 
> but what does this update actually do?So I just download and run the bios setup file there?
Click to expand...

asdkj1740 is correct; you also get an instant performance upgrade for free because of the higher TDP.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> http://forums.evga.com/GTX1060-SC-6GB-with-Micron-GDDR5-now-m2565778.aspx
> 
> indeed vram overclocking over factory settings is not guarantee by aic, but it is implicitly guaranteed by vram chip producers like samsung and micron.
> and some games really love memory bandwidth, digital foundry has some videos talked about this.
> huge improvement on overclocking vram, not just shown on fps but rather than frame time which i think it is an more important index to actual gaming experience like stuttering.
> do not be fooled by fools...google the truth...


Not owning a 1060, I don't have the inclination to have the same discussions all over again for exactly the same problem the 1070 has.

I wonder how long it will take for them to work it out by themselves?


----------



## MyNewRig

Quote:


> Originally Posted by *asdkj1740*
> 
> http://forums.evga.com/GTX1060-SC-6GB-with-Micron-GDDR5-now-m2565778.aspx
> 
> indeed vram overclocking over factory settings is not guarantee by aic, but it is implicitly guaranteed by vram chip producers like samsung and micron.
> and some games really love memory bandwidth, digital foundry has some videos talked about this.
> huge improvement on overclocking vram, not just shown on fps but rather than frame time which i think it is an more important index to actual gaming experience like stuttering.
> do not be fooled by fools...google the truth...


EDIT: REP+ to you for making things much clearer in my mind with two such short paragraphs.

Best information I've received in a long time; it makes perfect sense based on my experience with both memory types.

Frame time could explain why, the instant I installed the Micron card, I felt it was laggy at high resolutions compared to Samsung, even at the same stock frequencies. Samsung chips are more expensive for a reason; you usually get what you pay for, and vRAM is no exception.

Overclocked Samsung vRAM produces much more fluid FPS in 4K, especially in titles that use almost all 8GB of vRAM like RotTR, and Samsung usually has an OC range exceeding 9 Gbps unless you get really unlucky with your sample.

I have actually requested frame time measurements from a guy here who has both a Micron and a Samsung GTX 1070 to verify this phenomenon, but unfortunately it does not seem like he has had the time to perform such a test.

Can you please share a link to that Digital Foundry video, or any other links discussing the topic in more technical depth? I am highly interested in that information.


----------



## JukeBox

Just chipping in regarding Micron RAM:

my Palit Super JetStream 1070 (Micron) has been rock solid from day 1.

Gonna update to the new BIOS to hopefully get more overclock headroom (as the power limit will go up ~30W).

Should be good!


----------



## bobfig

Also, my MSI 1070 Sea Hawk EK X came with Samsung memory.


----------



## MyNewRig

Just did a very quick test with The Witcher 3 in 4K, and the frame time graph is jumping all over the place despite the consistent FPS graph. Will do more testing later today, but I think I can finally express what is going on in numbers and graphs.


----------



## khanmein

Quote:


> Originally Posted by *JukeBox*
> 
> just chipping in regarding micron RAM,
> 
> my palit superjetstream 1070 (micron) has been rock solid for day 1.
> 
> Gonna update to the new bios to hopefully get more overclock headroom (as power limit will go up ~30w).
> 
> should be good!


Sorry, I don't believe it's rock solid; if there were no issue, why did they release a new vBIOS in the first place? I haven't seen any vBIOS for Samsung. Avoid Micron, for god's sake.


----------



## JukeBox

Different RAM batches will behave differently (obviously).

What I said still stands: my particular card, from a particular batch, is fine.


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *asdkj1740*
> 
> http://forums.evga.com/GTX1060-SC-6GB-with-Micron-GDDR5-now-m2565778.aspx
> 
> indeed vram overclocking over factory settings is not guarantee by aic, but it is implicitly guaranteed by vram chip producers like samsung and micron.
> and some games really love memory bandwidth, digital foundry has some videos talked about this.
> huge improvement on overclocking vram, not just shown on fps but rather than frame time which i think it is an more important index to actual gaming experience like stuttering.
> do not be fooled by fools...google the truth...
> 
> 
> 
> EDIT: REP+ to you for making things much clearer in my mind with such two short paragraphs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Best information i received in a long time, makes perfect sense based on my experience with both memory types,
> 
> Frame time could explain why the instant i installed the Micron card i felt it was laggy in high resolutions compared to Samsung even at the same stock frequencies, Samsung chips are more expensive for a reason, you usually get what you pay for and vRAM is no exception.
> 
> Overclocked Samsung vRAM produces much more fluid FPS in 4K especially in titles that uses almost all 8GB of vRAM like rotTR, and Samsung has an OC range usually exceeding 9 Gbps unless you get really unlucky with your sample.
> 
> I have actually requested frame time measurement data from a guy here who has both a Micron and a Samsung GTX 1070 to verify this phenomenon but unfortunately it does not seem like he had the time to perform such test.
> 
> Can you please share a link to that digital foundry video? or any other links discussing the topic technically in more depth? i am highly interested in that information.
Click to expand...

Don't lose sight of the fact that frame time and frame rate are connected to each other. You can't have long frame times and high frame rates at the same time, because frame rate is a measure of the number of frames per second, and frame time is a measure of the time between the presentation of each frame.

If you run a benchmark or play a game and measure framerates, you will see measurements for maximum, average, and minimum frame rates. The minimum frame rate is the more meaningful one for gauging how smooth the experience will be. In any game or benchmark, different scenes vary the load on the GPU, and frame rates rise and fall depending on that load plus other factors: PCI bus bandwidth availability, CPU load, RAM load, disk access if the application is reading from or writing to virtual memory, and other processes running in the background. It is not just the video card in isolation; the CPU has to tell the GPU what to process before the GPU can actually do anything.

You could install a Titan X Pascal in a Core Duo PC from 2006 and you would still get choppy frame rates, simply because the CPU/RAM/disk/PCI bus are not powerful enough, or don't have enough bandwidth, to feed the GPU data at a fast enough rate. The GPU just ends up sitting there idle.

If you are experiencing choppy gameplay, you need to look not just at the GPU and drivers, but also at everything else the PC is doing while you are playing the game.
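A quick sketch of that relationship (my own toy Python; `fps_stats` and the sample numbers are made up for illustration): two captures can share the same average FPS while the minimum, driven by the single longest frame, tells you which one stutters.

```python
# Frame rate and frame time are two views of the same data (toy example).
def fps_stats(frame_times_ms):
    """Return (average FPS, instantaneous minimum FPS) for a capture."""
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s
    min_fps = 1000.0 / max(frame_times_ms)  # longest frame = worst moment
    return avg_fps, min_fps

# Both captures average ~60 fps over one second, but only one is smooth:
smooth = [16.7] * 60                    # evenly spaced frames
spiky = [12.0] * 59 + [294.7]           # one ~295 ms hitch hidden in the average
print(fps_stats(smooth))  # avg ~59.9, min ~59.9
print(fps_stats(spiky))   # avg ~59.8, min ~3.4 during the spike
```

The average alone would call these identical; the minimum (or the raw frame-time trace) is what separates them.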


----------



## ucode

Quote:


> Originally Posted by *asdkj1740*
> 
> oic, thx. it is shame there is not strix oc t4 bios for 1070..
> that post ended sad as it is locked...cross flashing should be discuss further as it is the safer way to increase power limit than the liquid ultra method.
> i have no idea why my evga ftw cant cross flash other bios properly, only zotac amp extreme is suitable so far.


Well there's still HW mod for the brave / foolish depending how it goes. Would have been interesting but perhaps the 1070 performance is already too close to the 1080.

Here's those files you asked for.

Code:

  ROM File  ,      Vendor ID's   ,  BIOS Version , Min , Def , Max 
=====================================================================
  6278_2.rom, 10DE:1B81-3842:6278, 86.04.50.00.70,  92W, 185W, 226W
  6278_1.rom, 10DE:1B81-3842:6278, 86.04.50.00.70,  92W, 185W, 208W
  6276_2.rom, 10DE:1B81-3842:6276, 86.04.50.00.70,  92W, 185W, 226W
  6276_1.rom, 10DE:1B81-3842:6276, 86.04.50.00.70,  92W, 185W, 208W
  6274_2.rom, 10DE:1B81-3842:6274, 86.04.50.00.70,  92W, 185W, 226W
  6274_1.rom, 10DE:1B81-3842:6274, 86.04.50.00.70,  92W, 185W, 208W
    6178.rom, 10DE:1B81-3842:6178, 86.04.50.00.70,  75W, 151W, 170W
    6173.rom, 10DE:1B81-3842:6173, 86.04.50.00.70,  75W, 151W, 170W
    6171.rom, 10DE:1B81-3842:6171, 86.04.50.00.70,  75W, 151W, 170W
    6170.rom, 10DE:1B81-3842:6170, 86.04.50.00.70,  75W, 151W, 170W
    5173.rom, 10DE:1B81-3842:5173, 86.04.50.00.70,  75W, 151W, 170W
    5171.rom, 10DE:1B81-3842:5171, 86.04.50.00.70,  75W, 151W, 170W
    5170.rom, 10DE:1B81-3842:5170, 86.04.50.00.70,  75W, 151W, 170W
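A side note on reading that table: the percentage power-limit slider you see in tools like Afterburner appears to be simply Max over Default (my assumption; `slider_max_pct` is my own helper name, not from any tool):

```python
# Convert a BIOS power table row (default W, max W) into the familiar
# percentage slider ceiling. Assumes the slider is relative to default TDP.
def slider_max_pct(default_w: float, max_w: float) -> int:
    return round(100 * max_w / default_w)

print(slider_max_pct(185, 226))  # 122 -> the 226W FTW roms give a 122% cap
print(slider_max_pct(151, 170))  # 113 -> the 170W roms give a 113% cap
```

So the "_2" FTW ROMs buy you roughly 9% more slider headroom than the "_1" variants at the same default TDP.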


----------



## gtbtk

Quote:


> Originally Posted by *JukeBox*
> 
> just chipping in regarding micron RAM,
> 
> my palit superjetstream 1070 (micron) has been rock solid for day 1.
> 
> Gonna update to the new bios to hopefully get more overclock headroom (as power limit will go up ~30w).
> 
> should be good!


If you have the card set to high performance in the NV control panel with the original BIOS, and have Chrome or Firefox running in the background before you hit the apply-OC button, you won't see the checkerboard problem at all.


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> Don't lose sight of the fact that frame time and frame rate are connected to each other. You cant have long frame times and high frame rates at the same time because frame rates are a measure of the the number frames per second and frame time is a measure of the time between the presentation of each frame.
> 
> If you run a benchmark or play a game amd measure framerates, you will see a measurement for maximum, average and minimum frame rates. The minimum frame rate measurement is more meaningful in gauging how smooth the experience will be. In any game or benchmark different scenes vary the loads on the GPU and frame rates increase and decrease depending on load plus other factors such as PCI bus bandwidth availability, CPU load, Ram load, disk access if the application is writing to or reading from virtual memory, other processes running in the background. It is not just the video card in isolation. The CPU has to instruct the GPU what to do and what to process before the GPU can actually do anything.
> 
> You could install a Titan X Pascal card in a core duo PC from 2006 and you will still get choppy frame rates simply because the CPU/Ram/Disk/PCI bus are not powerful enough or dont have enough bandwidth to supply the GPU with data to process at a fast enough rate. The GPU just ends up sitting there idle
> 
> If you are experiencing choppy game play, you need to look at not just the GPU and drivers, but also everything else that the PC is doing at the time you are playing the game.


Like you said, FPS is the total number of frames output during one second, while frame time measures the time taken to render each frame. If that time is distributed evenly across the second, the game feels fluid; if the distribution is highly variable, the game feels laggy or stuttery.

Also keep in mind that my observations are not the result of trying only one card. I tried four of them, two with Samsung and two with Micron, in the same system with the same configuration.

How do you think I first noticed the Micron issue? I noticed it before reading anything about it, before I was even aware that different memory types were being used, and before trying any OC, so my observations are unbiased; I had not been exposed to any prior information.

What happened is that I mostly game in 2K. The 970 I had previously was laggy and not handling it well, so I got a 980 Ti, which was much better but ran too hot and loud, so I got rid of it. Next came the GTX 1070 with Samsung memory; that one gave me a very fluid 2K experience, and games were very smooth. Then came the Micron 1070: I installed the card, ran the same games at the same 2K resolution, and it was noticeably laggy. I did not understand what was wrong, and it took me about a day to figure it out. By opening GPU-Z for both cards side by side and comparing what could be different, I noticed that one said GDDR5 (Samsung) and the other GDDR5 (Micron). So I contacted my retailer, said the card does not perform as it should (using the Samsung card as a reference), and got it replaced with another one, which also turned out to have Micron memory and the same symptoms. I Googled the issue looking for clues and ended up in that thread you started.

Similar things happened to other people who first got cards with Samsung and then had them replaced with Micron cards for various reasons; they complain the most, because they tested both and noticed an obvious difference.

Two hours ago I logged The Witcher 3 FPS and frame time with Afterburner, running at 4K for 10 minutes. I was standing in the same spot, just looking around or walking back and forth in a very small area, so nothing was loading; the game and the area's textures were all in RAM, nothing was running in the background, and my 6600K was at 4.6GHz, not even fully utilized, sitting in the 50°C region. Checking the graphs after those 10 minutes, the FPS graph is an almost straight line, so FPS is very consistent, which is expected from staying in one small area. The frame time graph, however, is going all over the place, with huge spikes every few data points.

I will post screenshots here later today, when I have time to generate more data for you to see.

Having tested that many cards eliminates other issues with the system. I cannot blame other components here, like the CPU, loading, or system RAM, since all those variables have been ruled out in my case.
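For what it's worth, this is roughly the analysis I'd run on such a log (a toy Python sketch of my own; `frametime_report` and the sample numbers are invented for illustration, not real Micron/Samsung data): percentile frame times and a spike count expose the hitches the FPS average hides.

```python
# Summarize a frame-time log (ms per frame) the way a frame-time graph does:
# median, 99th percentile, and frames that took more than 2x the average.
def frametime_report(frame_times_ms):
    xs = sorted(frame_times_ms)
    def pct(p):
        return xs[min(len(xs) - 1, int(p / 100 * len(xs)))]
    avg = sum(xs) / len(xs)
    spikes = sum(1 for t in frame_times_ms if t > 2 * avg)
    return {"p50_ms": pct(50), "p99_ms": pct(99), "spikes>2x_avg": spikes}

# Hypothetical 100-frame slices: a steady card vs one with periodic hitches.
print(frametime_report([16.0] * 100))              # flat line, no spikes
print(frametime_report([14.0] * 95 + [45.0] * 5))  # p99 jumps to 45 ms, 5 spikes
```

On numbers like these, both slices would report similar average FPS, while p99 and the spike count immediately separate them; that is the kind of side-by-side I'm hoping the screenshots will show.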


----------



## asdkj1740

Quote:


> Originally Posted by *ucode*
> 
> Well there's still HW mod for the brave / foolish depending how it goes. Would have been interesting but perhaps the 1070 performance is already too close to the 1080.
> 
> Here's those files you asked for.
> 
> Code:
> 
> 
> 
> Code:
> 
> 
> ROM File  ,      Vendor ID's   ,  BIOS Version , Min , Def , Max
> =====================================================================
> 6278_2.rom, 10DE:1B81-3842:6278, 86.04.50.00.70,  92W, 185W, 226W
> 6278_1.rom, 10DE:1B81-3842:6278, 86.04.50.00.70,  92W, 185W, 208W
> 6276_2.rom, 10DE:1B81-3842:6276, 86.04.50.00.70,  92W, 185W, 226W
> 6276_1.rom, 10DE:1B81-3842:6276, 86.04.50.00.70,  92W, 185W, 208W
> 6274_2.rom, 10DE:1B81-3842:6274, 86.04.50.00.70,  92W, 185W, 226W
> 6274_1.rom, 10DE:1B81-3842:6274, 86.04.50.00.70,  92W, 185W, 208W
> 6178.rom, 10DE:1B81-3842:6178, 86.04.50.00.70,  75W, 151W, 170W
> 6173.rom, 10DE:1B81-3842:6173, 86.04.50.00.70,  75W, 151W, 170W
> 6171.rom, 10DE:1B81-3842:6171, 86.04.50.00.70,  75W, 151W, 170W
> 6170.rom, 10DE:1B81-3842:6170, 86.04.50.00.70,  75W, 151W, 170W
> 5173.rom, 10DE:1B81-3842:5173, 86.04.50.00.70,  75W, 151W, 170W
> 5171.rom, 10DE:1B81-3842:5171, 86.04.50.00.70,  75W, 151W, 170W
> 5170.rom, 10DE:1B81-3842:5170, 86.04.50.00.70,  75W, 151W, 170W


thank you very much


----------



## watermanpc85

Hi guys, just wanted to ask what model of 1070 you would recommend RIGHT NOW, in terms of max OC, future BIOS mods, performance out of the box, etc. Also keep in mind my case takes a max card length of about 30cm.

I'm thinking about the MSI Gaming X 8G, but now I'm scared about the Micron memory problem.

thanks!!


----------



## ITAngel

Quote:


> Originally Posted by *watermanpc85*
> 
> Hi guys, just wanted to ask you what model of 1070 do you recommend me RIGHT NOW??? in terms of max OC, future BIOS mod, performance out of the box, etc...also, keep in mind my max case lenght is like 30cm...
> 
> Im thinking about the MSI gaming X 8g but now Im scared about the micron memory problem
> 
> 
> 
> 
> 
> 
> 
> 
> 
> thanks!!


EVGA cards have always been solid, with amazing customer service, etc.; you might want to look into them.


----------



## MyNewRig

Quote:


> Originally Posted by *watermanpc85*
> 
> in terms of max OC


None! If you want to OC, forget about the 1070 altogether. If you must buy one, get a Founders Edition, though that won't go well with your BIOS-modding requirement. Just avoid the EVGA FTW, because that one has tons of issues, and I would also avoid Gigabyte; the rest are a matter of taste. I would personally not buy a GTX 1070 now. It was great in June/July when it had Samsung memory, but not anymore; not worth the money IMO.


----------



## TheGlow

Quote:


> Originally Posted by *Snuckie7*
> 
> Is it widespread that cards are using micron instead of Samsung memory now? Was looking at getting an MSI Gaming X


Yes, but as mentioned, it's not the end of the world. The problem, as some people here discovered, is higher memory overclocks not getting enough voltage.
I found that having the MSI Gaming App installed added a service that kept me in 3D clocks, so I didn't know about the problem until later; I had accidentally found a workaround.
It's also been suggested to add Windows processes like dwm.exe and explorer.exe to the NVIDIA panel at max performance to ensure you stay in 3D clocks.
At that point you can OC like a "normal" Samsung card. Hopefully the new vBIOS will fix those voltage issues at 2D clocks and we'll be fine.

Quote:


> Originally Posted by *Pittster*
> 
> Everything I have read about the actual 1070's consumers have in there hands are a mix of Micron and Samsung memory.
> 
> It seems all brands have a mix including some founders edition cards having it.
> 
> A Vbios is coming from most brands and MSI has said on there forums that it is coming in 2-4 weeks.
> 
> For reference I have a MSI Gaming 1070 running at core 2062mhz and memory of 4353mhz this is with %100 Core voltage increase Power limit 126% and temp limit 92deg.
> 
> If I go any higher on the memory I get the white Squares crash only when a 3d application is loading unloading


Yes, without locking my system into 3D clocks, I saw things like launching MS Edge or EA Origin alone crash me.
I get up to 2200MHz core at the desktop; in-game GPU "un"boost brings me down to 2172 or so. On memory I've gotten away with +825 and +850 in Time Spy benches.
Quote:


> Originally Posted by *khanmein*
> 
> sorry i don't believe is rock solid if got no issue y they released new vbios at the 1st place. i don't saw any vbios for samsung. avoid micron for god sake.


Rock solid for what I'm doing as well. Again, I love how you mention avoiding Micron as if there's a filter on Newegg to check off Samsung, Micron, Any.
If you're using high-end cards like these, you're an enthusiast. If you're an enthusiast, you are probably going to overclock, and you know you sometimes have to do things a bit differently. I found out what these tweaks and quirks are and am enjoying my card perfectly fine.
My only complaint is that I can't have my OC profile apply at boot, because the system will lock up; instead I have to wait a few seconds and apply it once I'm in 3D clocks again.
Again, a quirk I've adapted to.


----------



## reflex75

Quote:


> Originally Posted by *ucode*
> 
> Code:
> 
> 
> 
> Code:
> 
> 
> ROM File  ,      Vendor ID's   ,  BIOS Version , Min , Def , Max
> =====================================================================
> 6278_2.rom, 10DE:1B81-3842:6278, 86.04.50.00.70,  92W, 185W, 226W
> 6278_1.rom, 10DE:1B81-3842:6278, 86.04.50.00.70,  92W, 185W, 208W
> 6276_2.rom, 10DE:1B81-3842:6276, 86.04.50.00.70,  92W, 185W, 226W
> 6276_1.rom, 10DE:1B81-3842:6276, 86.04.50.00.70,  92W, 185W, 208W
> 6274_2.rom, 10DE:1B81-3842:6274, 86.04.50.00.70,  92W, 185W, 226W
> 6274_1.rom, 10DE:1B81-3842:6274, 86.04.50.00.70,  92W, 185W, 208W
> 6178.rom, 10DE:1B81-3842:6178, 86.04.50.00.70,  75W, 151W, 170W
> 6173.rom, 10DE:1B81-3842:6173, 86.04.50.00.70,  75W, 151W, 170W
> 6171.rom, 10DE:1B81-3842:6171, 86.04.50.00.70,  75W, 151W, 170W
> 6170.rom, 10DE:1B81-3842:6170, 86.04.50.00.70,  75W, 151W, 170W
> 5173.rom, 10DE:1B81-3842:5173, 86.04.50.00.70,  75W, 151W, 170W
> 5171.rom, 10DE:1B81-3842:5171, 86.04.50.00.70,  75W, 151W, 170W
> 5170.rom, 10DE:1B81-3842:5170, 86.04.50.00.70,  75W, 151W, 170W


Thank you.
Has anyone tried flashing one of these new EVGA BIOSes onto a non-EVGA card?


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Like you said, FPS is the total number of frames output during one second, while Frame Time measures the time it takes to render each frame so if that time is distributed evenly across that one second the game will feel the most fluid but if the distribution is highly variable the game will feel laggy or stuttery,
> 
> Also keep in mind that my observations are not the result of trying only one card, i tried four of them, two with Samsung and two with Micron, in the same system, with the same configurations,
> 
> How do you think i first noticed that Micron issue? i did that before reading anything about it or even have any awareness that different memory types are being used, even before trying any OC, so my observations are unbiased because i have not been exposed to any sort of prior information,
> 
> What happened is that i mostly game in 2K, the 970 i had previously was laggy and was not handling it well, then i got a 980 Ti which was much better but used to get too hot and loud so i got rid of it, next it was the GTX 1070 with Samsung memory, that one gave me a very fluent 2K experience, the games were very smooth and fluid, then came the Micron 1070, i installed the card, ran the same games on the same 2K resolution and it was getting noticeably laggy, i did not understand what was wrong, it took me about one day to figure it out, by opening GPU-Z of both cards side by side and comparing what could be different i noticed that one said GDDR5 (Samsung) and the other said GDDR5 (Micron), so i contacted my retailer and said that the card does not perform as it should, using the Samsung card as a reference, so i got the card replaced with another one which also turned out to have Micron and the same symptoms, i Googles the issue looking for any clues and ended up in that thread you started.
> 
> Similar observations happened to other people who first got cards with Samsung and then got them replaced with Micron cards for different reasons, these are the ones who complain the most because they tested both and noticed an obvious difference.
> 
> Two hours ago i logged The Witcher 3 FPS and FrameTime with AfterBurner running at 4K during 10 minutes in the game, i was standing in the same spot just looking around or walking back and forth in a very small area, nothing has been loading since game and textures of the area were all loaded in RAM, nothing is running in the background and my 6600K is running at 4.6Ghz and not even being fully utilized, it is running in the 50c region, checking the graphs after these 10 minutes, the FPS graph is an almost straight line so FPS is very consistent which is to be expected by just staying in the same area in the game just moving around and looking around, while the FrameTime graph is going all over the place with some huge spikes every few data points.
> 
> I will post screenshots here later today when i have time to generate more data for you to see.
> 
> Having tested that many cards eliminates any other issues with the system, i can not play the blame game on other components here, like it is the CPU or the loading or system RAM, since all these variables have been ruled out in my case.


The problem is that blaming the Micron memory's quality makes huge assumptions and ignores the rest of the card's ecosystem. Loudly jumping to the conclusion that it is the Micron memory's fault, to the exclusion of anything else, takes everyone's attention away from all the other parts that make up the VRAM subsystem and from finding a real solution to the problem. Is there a connection to Micron chips? Yes, there is, but that is just the starting place for the investigation. It turns out that the "fuel supply" settings to the memory are not configured correctly on the Micron cards. Just like a car with a blocked fuel filter, it won't run as efficiently or smoothly as it otherwise might. That doesn't make the engine, or the fuel itself, poor quality.

I am not playing any blame game at all. That is what you are doing by blaming the quality of the Micron memory chips. I understand that you may be experiencing some issues, but they will be resolved when the BIOS that corrects the memory control issues is delivered and installed. Hopefully sooner rather than later.

The EVGA BIOS update proved that the vbios was at fault: the new one improves performance and overclock stability for both core and memory, and seems to put the cards at a similar level to the Samsung-memory cards. As soon as the update is rolled out to all brands, problem solved.

I have certainly not observed the same issues that you seem to be having with your card. Odds are you are experiencing a combination of things that in isolation would cause only minor or unnoticeable issues, but together add up to the problems you are seeing. CPU clocks, temperatures, background processes, unstable PC overclocks, motherboard voltages, the GPU, the vbios and its bugs, and your GPU overclock settings all play a part in the performance you will get from your rig.
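As a side note on the FPS-vs-frame-time logging discussed above, the effect is easy to demonstrate with a few lines of code. A minimal sketch with made-up frame times (not anyone's actual AfterBurner log):

```python
def frame_stats(frame_times_ms):
    """Return average FPS and the 99th-percentile frame time in ms."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    p99 = sorted(frame_times_ms)[int(round(0.99 * (n - 1)))]
    return avg_fps, p99

# a steady ~60 FPS run: every frame takes 16.7 ms
steady = [16.7] * 100
# same average frame time, but every 10th frame spikes to 50 ms
stutter = ([13.0] * 9 + [50.0]) * 10

print(frame_stats(steady))   # ~59.9 FPS, p99 = 16.7 ms
print(frame_stats(stutter))  # ~59.9 FPS, p99 = 50.0 ms
```

An FPS counter alone would call both runs identical; the percentile (or a frame-time graph) is what exposes the stutter.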


----------



## gtbtk

Quote:


> Originally Posted by *reflex75*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ucode*
> 
> Code:
> 
> 
> 
> Code:
> 
> 
> ROM File  ,      Vendor ID's   ,  BIOS Version , Min , Def , Max
> =====================================================================
> 6278_2.rom, 10DE:1B81-3842:6278, 86.04.50.00.70,  92W, 185W, 226W
> 6278_1.rom, 10DE:1B81-3842:6278, 86.04.50.00.70,  92W, 185W, 208W
> 6276_2.rom, 10DE:1B81-3842:6276, 86.04.50.00.70,  92W, 185W, 226W
> 6276_1.rom, 10DE:1B81-3842:6276, 86.04.50.00.70,  92W, 185W, 208W
> 6274_2.rom, 10DE:1B81-3842:6274, 86.04.50.00.70,  92W, 185W, 226W
> 6274_1.rom, 10DE:1B81-3842:6274, 86.04.50.00.70,  92W, 185W, 208W
> 6178.rom, 10DE:1B81-3842:6178, 86.04.50.00.70,  75W, 151W, 170W
> 6173.rom, 10DE:1B81-3842:6173, 86.04.50.00.70,  75W, 151W, 170W
> 6171.rom, 10DE:1B81-3842:6171, 86.04.50.00.70,  75W, 151W, 170W
> 6170.rom, 10DE:1B81-3842:6170, 86.04.50.00.70,  75W, 151W, 170W
> 5173.rom, 10DE:1B81-3842:5173, 86.04.50.00.70,  75W, 151W, 170W
> 5171.rom, 10DE:1B81-3842:5171, 86.04.50.00.70,  75W, 151W, 170W
> 5170.rom, 10DE:1B81-3842:5170, 86.04.50.00.70,  75W, 151W, 170W
> 
> 
> 
> 
> 
> thank you.
> Has anyone tried to test one of this new EVGA bios to a none EVGA card?
Click to expand...

Yes I did.

I put the FTW's second BIOS on my MSI Gaming X. The memory is stable, but my card's performance is reduced because the TDP limit is significantly lower than the MSI card's.

Because the file wants a later version of NVFlash than I have available, I flashed the original FTW BIOS to the card first and then used the EVGA update utility to update the card to the new version.
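The TDP gap is easy to quantify from ucode's table. A quick sketch (rows copied by hand from the table quoted above, with the vendor-ID and version columns trimmed for brevity):

```python
# min / default / max power limits from ucode's EVGA BIOS table
rows = """\
6278_2.rom,  92W, 185W, 226W
6278_1.rom,  92W, 185W, 208W
6178.rom,  75W, 151W, 170W"""

headroom = {}
for line in rows.splitlines():
    rom, lo, default, hi = [f.strip() for f in line.split(",")]
    default_w = int(default.rstrip("W"))
    hi_w = int(hi.rstrip("W"))
    # how far the power slider can go past 100%
    headroom[rom] = round(100.0 * hi_w / default_w - 100.0)
    print(f"{rom}: {default_w}W default, {hi_w}W max ({headroom[rom]:+d}% slider)")
```

At the maxed slider, the 226W BIOS has 56W more board power to play with than the 170W one, which is why the BIOS a card ships with matters so much for overclocking.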


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> The problem is that blaming the micron memory quality is making huge assumptions that ignores the rest of the ecosystem of the card. By loudly jumping to the conclusion that it is the Micron memory's fault because of poor quality to the exclusion of anything else, takes everyone's attention away from looking at all the other parts that make up the vram sub system to find a real solution to the problem. Is there a connection to Micron chips? Yes there is, but that is just the starting place for the investigation. It turns out that the "fuel supply" settings to to the memory are not configured correctly on the micron cards. Just like a car with a blocked fuel filter, it wont run as efficiently or smoothly as it may otherwise. That doesn't make the engine or the fuel itself poor quality.


I have been in IT for about 20 years, and the saying goes, "If it ain't broke, don't fix it." All I care about is that I had a really stable system when that Samsung card was installed: it never crashed or artifacted once, at stock or OC settings. In fact it was the most stable PC I have ever built, and I have been doing this for a very long time.

All that changed is that I took out that card and put in an identical card with only one component changed, the memory modules, from Samsung to Micron, and the system started acting up. I then removed that card and got another one with the same Micron memory, and it had the same symptoms. As far as I am concerned, the memory ICs are the culprit. It is not my job to test every sub-component; that is Nvidia's and the AIBs' job. I am also not obligated as a consumer, in the EU at least, to wait months for a fix: the law states that a fix has to be delivered in a reasonable time, and the electronics market standard for that reasonable time is two weeks maximum.

Also, getting a memory chip to run properly is not rocket science like you make it sound. With standard system memory we switch between several memory brands and ICs on the same board and the same BIOS, and they just run with a simple configuration from an XMP profile or a simple voltage setting. You make it sound like it needs months of R&D to make these Micron ICs run within the 1070 ecosystem, even though they have the same specs as the Samsung ICs. Getting them to run properly should pretty much be a plug-and-play experience if they are actually of the good quality you would like to assume.

If I were managing Nvidia or one of these board-partner companies, I would not go through the trouble just to save a few pennies. The card was designed and developed with Samsung memory ICs, and they work best within that ecosystem you were referring to; I see that they also work well within all the other ecosystems, because they are simply good-quality ICs.

Google has already indexed a bunch of sites with people talking about the issue; many are having problems at stock settings and many are returning their cards, just search and watch. It is also not the first generation where Micron memory has had problems or mediocre performance; there is a history behind these chips, and this time is probably the worst because they have been pushed to GDDR5's maximum data rate.

Anyway, my card will not be with me when that BIOS update is released, if it ever is. Fortunately my country has laws in place to protect against such shady product policies; even if I were outside my return period, I would still have a legal right to get my money back. I simply do not accept the product in this state, and if I had the ability to desolder these Micron ICs and replace them with Samsung myself, I would have done it. But there is no need; I am not that desperate to have a GPU currently, and not gaming for a few months will not kill me. So, all good.


----------



## madmeatballs

Do you guys think an AMP Extreme with max temps around 70-75°C (Witcher 3 load, stock, no OC, ~80-90% fan speed, 30°C ambient) is RMA-worthy?


----------



## asdkj1740

https://www.techpowerup.com/downloads/2786/nvflash-5-319-0-for-windows
Do not use version 5.319.0 from TechPowerUp to cross-flash Pascal BIOSes; several people have told me it does not work.


----------



## asdkj1740

Quote:


> Originally Posted by *madmeatballs*
> 
> Do you guys think an AMP Extreme with max temps around 70-75C(load witcher 3, stock no oc, ~80-90% fan speed, ambient temp 30c) is RMA worthy?


What's wrong with your card? Is the fan speed on auto? What RPM is it running at?


----------



## khanmein

Quote:


> Originally Posted by *madmeatballs*
> 
> Do you guys think an AMP Extreme with max temps around 70-75C(load witcher 3, stock no oc, ~80-90% fan speed, ambient temp 30c) is RMA worthy?


Yes, avoid Zotac. The 3+2 warranty is a gimmick and pretty useless. Have you seen any other GPU offered with this kind of warranty? Most are 3 years only.


----------



## madmeatballs

Quote:


> Originally Posted by *khanmein*
> 
> yes avoid zotac. 3+2 warranty is gimmick & pretty useless. did u see any gpu offered this kinda warranty? mostly 3 years only.


Well, I picked Zotac since it was the only high-OC out-of-the-box card available when I bought it. Their after-sales service is also pretty easy to deal with here, unlike other brands, whose after-sales service has been awful.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> The problem is that blaming the micron memory quality is making huge assumptions that ignores the rest of the ecosystem of the card. By loudly jumping to the conclusion that it is the Micron memory's fault because of poor quality to the exclusion of anything else, takes everyone's attention away from looking at all the other parts that make up the vram sub system to find a real solution to the problem. Is there a connection to Micron chips? Yes there is, but that is just the starting place for the investigation. It turns out that the "fuel supply" settings to to the memory are not configured correctly on the micron cards. Just like a car with a blocked fuel filter, it wont run as efficiently or smoothly as it may otherwise. That doesn't make the engine or the fuel itself poor quality.
> 
> I am not playing any blame game at all. That is what you are doing by assigning blame Micron memory chips quality. I understand that you may be experiencing some issues but they will be resolved when the bios that corrects the memory control issues is delivered and installed. Hopefully sooner, rather than later.
> 
> The EVGA Bios update proved that the vbios was at fault as the new one improves performance and overclock stability of the cards for both core and memory overclocks and seems to put the cards at a similar level to the samsung memory cards - as soon as the update is rolled out to all brands, problem solved.
> 
> I have certainly not observed the same issues that you seem to be having with your card. Odds are that you are experiencing a combination of things that, in isolation may only cause a minor or unnoticable issues but together add up to give you the problems you are seeing. CPU clocks, temperatures/background processes/unstable PC overclocks/motherboard voltages/GPU/vbios and associated bugs/GPU overclock settings/ all play a part in the performance that you will get from your rig.


I didn't blame Micron, but rather NV and the vendors that chose Micron because it's cheaper. Why go through the trouble of flashing a vbios? If I try to RMA my graphics card and the shop finds I flashed the vbios, that voids the warranty. And even though I didn't RMA my card, the new vbios still can't reach the same level as Samsung at all.

I know you will say to flash back the default vbios, but they can find out easily and use it as an excuse to deny the whole RMA process. (my country related to MH370, MH17 & 1MDB)

I personally won't get EVGA because their VRM cooling is poor on Pascal, unlike Maxwell, where it was pretty good.


----------



## khanmein

Quote:


> Originally Posted by *madmeatballs*
> 
> Well, I picked zotac since it was the only high oc out of the box card available when I bought it. Their after sales is pretty much easy to deal with here too, unlike other brands which had awful after sales.


For Pascal, go for ASUS. The MSI PCB looks like the FE (reference), or slightly below it, but their cooling is superior. As for Zotac, the card is thick but the cooling is only so-so, plus their fans are the no. 1 source of problems.

Second-hand value is good for Zotac because of the warranty, but I'd still avoid it. I've also had bad experiences with MSI and Gigabyte. I support EVGA, but this round their VRM cooling almost hit 90°C (tested over a few days); imagine the state of the VRM after one year.

http://www.guru3d.com/articles_pages/evga_geforce_gtx_1070_sc_superclocked_gaming_review,10.html


----------



## madmeatballs

Quote:


> Originally Posted by *khanmein*
> 
> if pascal go for ASUS whereas MSI pcb look like FE (reference) or slightly lower but the their cooling is superior. as for Zotac so thick but the cooling is so-so + their fan issue is no.1 problematic.
> 
> 2nd hand value is good for Zotac due to the warranty but i'll avoid it. apparently, i still got bad exp with MSI & GIGA. i support EVGA but this round their VRM cooling almost hit 90°c (tested few days) & imagine after one year the VRM = GG.com
> 
> http://www.guru3d.com/articles_pages/evga_geforce_gtx_1070_sc_superclocked_gaming_review,10.html


Wow, I didn't know that about EVGA.


----------



## khanmein

Quote:


> Originally Posted by *madmeatballs*
> 
> Wow din't know that thing about EVGA.


Look carefully: 96°C is totally unacceptable. ASUS was poor on Maxwell, but this time around they're the best.


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The problem is that blaming the micron memory quality is making huge assumptions that ignores the rest of the ecosystem of the card. By loudly jumping to the conclusion that it is the Micron memory's fault because of poor quality to the exclusion of anything else, takes everyone's attention away from looking at all the other parts that make up the vram sub system to find a real solution to the problem. Is there a connection to Micron chips? Yes there is, but that is just the starting place for the investigation. It turns out that the "fuel supply" settings to to the memory are not configured correctly on the micron cards. Just like a car with a blocked fuel filter, it wont run as efficiently or smoothly as it may otherwise. That doesn't make the engine or the fuel itself poor quality.
> 
> 
> 
> I have been in IT for about 20 years, the saying goes "If it ain't Broken Don't Fix it" , for all i care about is that i had a really stable system when i had that Samsung card installed, never crashed or artifacted once, on stock or OC settings, in fact it was the most stable PC i ever built, and i been doing this for a very long time.
> 
> All that changed is that i took out that card and put an identical card with only one component changed, the Memory modules, from Samsung to Micron, the system started acting up, next removed that card and got another one with the same Micron memory component and the same symptoms, as far as i am concerned the memory ICs are the culprit, it is not my job to test every sub-component, that is Nvidia and AIB's job to do, i am also not obligated as a consumer, in the EU at least, to wait months for a fix, the law states that a fix has to be delivered in a reasonable time, and electronics market standard for that reasonable time is two weeks maximum.
> 
> Also getting a memory chip to run properly is not rocket science like you make it sound, with standard system memory we switch between several memory brands and ICs on the same board and the same BIOS and they just run with simple configuration from an XMP profile or simple voltage setting, you make it sound like it needs months of R&D to make these Micron ICs run within the 1070 Ecosystem even though they have the same specs as Samsung ICs and it should pretty much be a plug-n-play experience getting them to run properly if they are actually of good quality as you would like to assume.
> 
> If i was managing Nvidia or one of these board partner companies i would not go through the trouble to save a few pennies, the card was designed and developed with Samsung memory ICs and they work the best within that ecosystem you were referring to, and i see that they work good within all other ecosystems because they are just good quality ICs,
> 
> Google has already indexed a bunch of sites with people talking about the issue, many are having them at stock settings and many are returning their cards, just search and watch, it is also not the first generation where Micron memory is having problems or mediocre performance, there is a history behind these chips, probably this time it is the worst because they been pushed to GDDR5 max data rate,
> 
> Anyways my card will not be with me when that BIOS update is released if ever, fortunately my country has laws in place to protect against such shady product policies, even if i was not in my return period still i have a legal right to get my money back, i simply don't accept the product in this fashion and if i had the ability to desolder these Micron ICs and replace them with Samsung myself i would have done it, but no need, i am not that desperate to have a GPU currently, not gaming for a few months will not kill me .. so all good ..
Click to expand...

I have been diagnosing these sorts of hardware and infrastructure problems, working in the IT industry, since the mid 1980s. I am certainly not a kid. I usually get paid to solve problems that other people can't fix. Nvidia got my help for free because I had an interest in getting this solved.

I am not making any judgement call on the change in memory vendor. Would I have preferred to get Samsung memory and not have had these issues to work around? Of course I would. Would I have been as cavalier about swapping out parts without testing them? No, of course not.

However, I didn't get Samsung and I can't change that. I have a choice: either howl at the moon and achieve nothing but my own unhappiness, or do something about identifying and fixing my problem. If it solves everyone else's problem as well, so much the better, even if half the people getting the free benefit want to argue with me because they believe it is a conspiracy targeted directly at them.

I also never suggested that memory or VRM settings were rocket science. But it does require the right settings to be coded in before the BIOS file is locked in with an encrypted certificate, and in this case it appears that they weren't. I can imagine that an accountant probably made the decision: they installed it in a prototype, it booted and displayed something on screen, and they decided that was enough testing.

Yes, you are right, there are a number of sites talking about this thread and the one I started at Nvidia.com, which seem to be the only two I am aware of where some of the participants are trying to be constructive. Other than that, there is just noise about the doom of getting Micron memory.

There has been a common theme in the responses to your complaints that your card crashes at stock: everyone has suggested that you return it. I am glad that you are finally planning to do that.


----------



## asdkj1740

Quote:


> Originally Posted by *khanmein*
> 
> see carefully 96°c too is totally not acceptable. maxwell on ASUS sux but this time round they're best.


Guru3D tests on an open bench; 96°C on the VRM is bad regardless.

EVGA is using a single cooling plate for both VRAM and VRM cooling, and it is not up to the job.


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> Everyone has suggested that you you return it. I am glad that you are finally planning to do that.


In fact I have been planning on returning it since the moment I switched on the system for the first time with that card installed and ran the first game. I knew there was something off about the card, something I cannot accept or live with at that price point. All I have been doing all this time is trying to find a solution other than skipping the entire generation and staying without a GPU until 2017: a decent manufacturer still using Samsung, or a recognition of the inferiority of the Micron ICs and a switch back to Samsung, at least on the more expensive models. But none of that happened after a whole month and a half of trying; actually the opposite is happening, as they have now started poisoning the GTX 1060 with Micron memory and complaints are starting to surface there too. They are very stubborn, so it is a lost cause; best is to vote with my money: NO, I do not want the product in this state.

So I am simply back at my original plan, getting my money back and skipping the generation. At least I tried as hard and as vigorously as I could, so now I can comfortably and happily live with that decision.


----------



## watermanpc85

Quote:


> Originally Posted by *ITAngel*
> 
> EVGA always been solid cards, with amazing customer service etc... might want to look into them.


Quote:


> Originally Posted by *MyNewRig*
> 
> None! if you want to OC forget about the 1070 all together, if you must buy get a Founders but that will not go well with your BIOS modding requirement, if you must buy just avoid the EVGA FTW coz that has tons of issues, i would also avoid Gigabyte, the rest are a matter of taste, i would personally not buy a GTX 1070 now, it was great in June/July when it had Samsung memory, not anymore, not worth the money IMO.


Thanks to both guys!!...

Well, EVGA seems to have a low TDP limit according to the table posted a few pages back, so I guess it won't handle an OC as well as the MSI card, right? I know the 1070 is nowhere near the OC beast the 970 and other cards were, but by "OC" I meant "the best OC possible" (specific samples aside, of course). Is the Micron memory a REAL issue that affects the card in general, or only when OCing? Also, is VRAM OC worthwhile on Pascal?

But I want to ask you guys another question: how do you feel about upgrading to a 1070 from my 970, which can run at 1550/3800? Do you think it would be worth the hassle?

Thanks!!


----------



## DeathAngel74

I've read threads elsewhere about people using hex editors to modify the power limit and voltage.
Technically, if we find the right offsets, we could change the core frequency, VRAM frequency, power limit, and fan curve, lock the voltage, re-sign the ROM, etc., and then we wouldn't need a tweaker. Or am I wrong? Just thinking out loud, I guess.
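Finding those offsets is basically a byte search. A minimal sketch, assuming (and this is only an assumption, not a documented format) that a power limit is stored somewhere in the ROM as a little-endian 32-bit value in milliwatts; the demo runs on a fake ROM image, not a real dump:

```python
import struct

def find_u32(rom: bytes, value: int):
    """Return every offset where `value` appears as a little-endian uint32."""
    needle = struct.pack("<I", value)
    hits, start = [], 0
    while (i := rom.find(needle, start)) != -1:
        hits.append(i)
        start = i + 1
    return hits

# fake ROM image with 151 W = 151000 mW embedded at offset 8
fake_rom = b"\x00" * 8 + struct.pack("<I", 151000) + b"\x00" * 8
print(find_u32(fake_rom, 151000))  # [8]
```

Re-signing is the hard part, though: Pascal BIOSes are certificate-signed, so a hex-edited ROM won't flash without a valid signature, which is exactly why everyone is stuck waiting on vendor updates.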


----------



## vallonen

EVGA has the same problem XFX had a few years back with the VRM cooling on their 200 series; if I recall correctly, XFX addressed the issue on their 300 series.


----------



## ITAngel

The highest I have run my card is 2168MHz on the core; not sure if that is a good OC or not. Zotac GTX 1070 AMP! Extreme.


----------



## MyNewRig

Quote:


> Originally Posted by *watermanpc85*
> 
> Thanks to both guys!!...
> 
> Well, EVGA seems to have a low TDP limit according with the table posted a few pages back so I guess it wiont handle OC as well as MSI card right??...I know the 1070 is no where near an OC beast like the 970 and other cards are but with "OC" I meant, "the best OC possible" (specific samples aside of course). Is the micron memory a REAL issue that really affects the card or its only when OCing?? also, is vram OC worth in pascal??
> 
> But I want to ask you guys another question...how do you feel about upgrading from my 970 which can run at 1550/3800 to a 1070??...do you think it will worth the hustle.
> 
> Thanks!!


EVGA FTW cards are overheating, dying, or catching fire; plenty of reports are surfacing every day, so just forget that card.

In all seriousness, the current Pascal lineup is a mess: it has a bunch of issues, and that Micron memory thing totally kills it. If you already have a card, I would hold on to it for a few more months until AMD Vega and the more mature, higher-performance Pascal V2 show up. They will be faster and cheaper, since MSRPs will be lowered by the upcoming competition, and that will be a much better time to buy.

Not worth the hassle at all; do yourself a favor and skip it...


----------



## ITAngel

Quote:


> Originally Posted by *MyNewRig*
> 
> EVGA FTW is overheating and dying or catching fire, plenty of reports are surfacing every day so just forget that card.
> 
> In all seriousness the current Pascal is a mess, it has a bunch of issues and that Micron memory thing totally kills it, if you have a card already i would hold on to it for a few more months until AMD VEGA and the more mature and higher performance Pascal V2 show up in a few months, it will be faster and cheaper since MSRP will be lowered due to upcoming competition, that will be a much better time to buy.
> 
> Not worth the hassle at all, do yourself a favor and skip ...


I guess I was lucky picking my Zotac GTX 1070 AMP! Extreme graphics card.


----------



## DeathAngel74

Ummm, that almost makes me not want to overclock my card. Dayum.


----------



## MyNewRig

Quote:


> Originally Posted by *DeathAngel74*
> 
> ummm, almost make me not want to overclock my card. dayum


The SC's thermal imaging is not promising either, just saying...


----------



## Roland0101

Quote:


> Originally Posted by *watermanpc85*
> 
> Well, EVGA seems to have a low TDP limit according with the table posted a few pages back so I guess it wiont handle OC as well as MSI card right??


Not if you don't plan to make physical changes to your card. It isn't easy to reach the power limit with a 1070, and it is almost impossible if you have a high-TDP card; you will simply be limited by VRel instead.
Quote:


> Is the micron memory a REAL issue that really affects the card or its only when OCing?? also, is vram OC worth in pascal??


It is an OC issue that will probably be fixed by the coming vbios update. (EVGA already has that BIOS out; ASUS and MSI have confirmed theirs are coming.) There are cards that have problems at stock clocks, but such a card is simply defective. (Most of those reports were on MSI cards, and MSI acknowledged an issue.)
Quote:


> But I want to ask you guys another question...how do you feel about upgrading from my 970 which can run at 1550/3800 to a 1070??...do you think it will worth the hustle.
> 
> Thanks!!


See for yourself. Your 970 is overclocked pretty nicely, so compare your result to mine from a 1070 that, in this benchmark, was not overclocked.


----------



## HaiderGill

I'm still messing about with my Inno3D iChill X3 Herculez GTX 1070. I play on a 1080p60 42-inch TV, so I have only overclocked the core to 2.08GHz. Very impressed: cool and quiet...


----------



## khanmein

Quote:


> Originally Posted by *asdkj1740*
> 
> guru3d is on open tech bench. 96c on vrm is suck at all.
> 
> evga is using cooling plate for vram and vrm cooling, and it is bad.


Tell me which reviewer isn't using an open test bench? Except Joker, who followed the NV SOP.


----------



## Cybianno

GTX 480? You mean the reference-design 290X? I guess this is what a mix of not-so-good quality control paired with a poor cooler design does.

Seriously, a card like EVGA's shouldn't exceed 70°C even given a small voltage bump.

As I'm a little out of touch with the current situation, is this happening with other non-reference 1070/1080s from other brands? 1070 Gaming X with Samsung RAM here, no issues so far.


----------



## muzammil84

Quote:


> Originally Posted by *HaiderGill*
> 
> I'm still messing about with mine Inno3D iChill X3 Herculez GTX 1070. I play on a 1080P60 42 inch TV so have only overclocked to 2.08GHz core. Very impressed, cool'n'quiet...


I've got the X4 myself. The cooling on that card is amazing; in some games with vsync it rarely even spins its fans, so the card is very quiet (even at full load the max temp I've ever seen was 62°C).

Got mine at 2050MHz core and 9260 memory; that seems to be the max for my GPU, but I'm not really bothered about an extra few MHz which do absolutely nothing in games. Samsung memory, happy about that too. I was originally planning to put a water block on it, but it runs so cool and quiet that I don't see the point at the moment (plus the fact that it voids the warranty puts me off a bit; there are a LOT of discussions about dying 1070s, especially Palit, often within days of purchase). This very efficient cooler has its price: the card is huge. I'm running it in a P5 so there's no problem, but I doubt it fits in most ATX cases. They don't seem very popular here; people go with MSI, EVGA, or Zotac and complain a lot about various issues. My Inno3D has been spotless so far; I definitely recommend it to anyone considering a 1070 purchase.


----------



## khanmein

@Dans Tech (I received his comment; he at least dares to talk about Micron, though maybe he doesn't even know Micron exists on GDDR5, or he's confusing it with GDDR5X):

"I dont understand your issue with the 1070 cards, all the cards I've taken a look at contain memory from Samsung. In the most recent review of the Zotac GTX 1070, that contained 8x Samsung K4G80325FB-HC25 GDDR5 memory chips. The 1070 is a very powerful GPU and feel it offers fantastic performance for the cost. Your opinion? Dan." (https://www.youtube.com/channel/UCgRSCe2siDmMP_cDyjqpPAg)

A card released 4~5 days ago comes with Samsung???

Where are the other YouTube tech reviewers, aka content creators, on this bull-crap????


----------



## kevindd992002

Quote:


> Originally Posted by *madmeatballs*
> 
> Well, I picked zotac since it was the only high oc out of the box card available when I bought it. Their after sales is pretty much easy to deal with here too, unlike other brands which had awful after sales.


Correct. I guess the Zotac after-sales support depends on how the official distributor/s of a specific country operates. In our case (Philippines), the distributor is very helpful and easy to deal with. So the 3+2 warranty is not a gimmick for us as it's a REAL 5 years warranty.
Quote:


> Originally Posted by *khanmein*
> 
> if pascal go for ASUS whereas MSI pcb look like FE (reference) or slightly lower but the their cooling is superior. as for Zotac so thick but the cooling is so-so + their fan issue is no.1 problematic.
> 
> 2nd hand value is good for Zotac due to the warranty but i'll avoid it. apparently, i still got bad exp with MSI & GIGA. i support EVGA but this round their VRM cooling almost hit 90°c (tested few days) & imagine after one year the VRM = GG.com
> 
> http://www.guru3d.com/articles_pages/evga_geforce_gtx_1070_sc_superclocked_gaming_review,10.html


Dude, I think you're misunderstanding madmeatballs' query. He's not asking for a comparison of the different 1070's out there. He already has the Zotac GTX 1070 AMP! Extreme and he's just asking about his temps in particular.
Quote:


> Originally Posted by *ITAngel*
> 
> I see the highest I have run my card is 2168Mhz on the core. Not sure if that is a good OC or not . Zotac GTX 1070 amp extreme.


I'd say that's a decent core overclock; I have the same card and anything higher than 2076MHz gives me artifacts in Heaven. I haven't installed the AIO that I have on this 1070 yet though, so I'm expecting an improvement in OC when I do.


----------



## weskeh

Quote:


> Originally Posted by *RyanRazer*
> 
> AMP! Extreme is not without problems either... http://www.overclock.net/t/1613126/gtx-1070-amp-extreme-owners
> This is as of now. There are probably other people with other problems.
> No bios moddig till now (i think). Just max power and volt sliders and slowly increase core clock and than mem


I also have a Zotac AMP Extreme, very happy with it. Added 150MHz on the core through FireStorm; the core clock boosts to about 2100MHz with this setting but downclocks to about 2088-2055-ish because of GPU Boost 3.0. Haven't tried higher because of this, as I don't see the point. Added 400MHz on the memory. Haven't tried any higher there either at this point.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> @Dans Tech; (i received his comment so far dare to talk about micron & maybe he don't even know there's micron exist on GDRR5 or confused with GDRR5X)
> 
> I dont understand your issue with the 1070 cards, all the cards I've taken a look at contain memory from Samsung. In the most recent review of the Zotac GTX 1070, that contained 8x Samsung K4G80325FB-HC25 GDDR5 memory chips. The 1070 is a very powerful GPU and feel it offers fantastic performance for the cost. Your opinion? Dan.﻿ (https://www.youtube.com/channel/UCgRSCe2siDmMP_cDyjqpPAg)
> 
> 
> 
> 
> 
> (released 4~5 days ago) come with samsung???
> 
> where's the other youtube tech reviewer aka content creator bull-crap????


Just direct him to the nvidia.com thread


----------



## derfer

Twice now my GTX 1070 has had checkerboard artifacts at stock during web browsing. Goes away if I change tabs. I found others reporting the same issue but nobody could figure out what to pin it down to, very sporadic. Samsung memory. Wondering if I should RMA or not.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> Just direct him to the nvidia.com thread


https://forums.geforce.com/member/1940330/

i'm not sure this guy is dans or not but he commented at NV thread already.


----------



## khanmein

Quote:


> Originally Posted by *derfer*
> 
> Twice now my GTX 1070 has had checkerboard artifacts at stock during web browsing. Goes away if I change tabs. I found others reporting the same issue but nobody could figure out what to pin it down to, very sporadic. Samsung memory. Wondering if I should RMA or not.


RMA it & get one with Samsung chips. Avoid editing settings or flashing the vBIOS; why risk it.


----------



## gtbtk

Quote:


> Originally Posted by *derfer*
> 
> Twice now my GTX 1070 has had checkerboard artifacts at stock during web browsing. Goes away if I change tabs. I found others reporting the same issue but nobody could figure out what to pin it down to, very sporadic. Samsung memory. Wondering if I should RMA or not.


what model card is it?


----------



## weskeh

Quote:


> Originally Posted by *derfer*
> 
> Twice now my GTX 1070 has had checkerboard artifacts at stock during web browsing. Goes away if I change tabs. I found others reporting the same issue but nobody could figure out what to pin it down to, very sporadic. Samsung memory. Wondering if I should RMA or not.


Did you try lowering the clocks to see if it's still the same?

If so, I would RMA, yes.

Did it only start with this card? Never had problems with an older card?

Drivers correctly deleted and reinstalled? With DDU? Sufficient PSU?


----------



## watermanpc85

Quote:


> Originally Posted by *MyNewRig*
> 
> EVGA FTW is overheating and dying or catching fire, plenty of reports are surfacing every day so just forget that card.
> 
> In all seriousness the current Pascal is a mess, it has a bunch of issues and that Micron memory thing totally kills it, if you have a card already i would hold on to it for a few more months until AMD VEGA and the more mature and higher performance Pascal V2 show up in a few months, it will be faster and cheaper since MSRP will be lowered due to upcoming competition, that will be a much better time to buy.
> 
> Not worth the hassle at all, do yourself a favor and skip ...


Quote:


> Originally Posted by *Roland0101*
> 
> Not if you don't plan to make physical changes to your card. It isn't easy to reach the Power limit with an 1070, it is almost impossible if you have a high TDP card, then you simply will be limited by VRel.
> 
> It is a OC issue that will probably get fixed with the coming Vbios update. (EVGA already has that BIos out, Asus and MSI confirmed that it is coming) There are cards that has problems on stock clocks, but such a card is simply defective. (Most of the time that was reported on MSI cards, and MSI acknowledged an issue.)
> See for yourself. Your 970 is pretty nice overclocked, so compare your result to the, in this benchmark, not overclocked 1070 of mine.


Thanks guys!!, nice answers!!... Looking at the 970 vs 1070 comparison bench, it looks like the 1070's gain over the 970 is really modest...

Bearing in mind mine goes easily to 1550 on the core, and your 1070 is a nice overclocker at 2000+ MHz on the core, I guess the difference between my 970 and a good 1070 sample would be like a 45-50% improvement.

Not an easy decision here







because I don't want to wait almost a year to upgrade and then see the 2xxx series turn out even worse (which could happen, seeing how badly AMD is competing right now)... mmmmmmm


----------



## madmeatballs

Quote:


> Originally Posted by *Snuckie7*
> 
> My 970 can't quite handle the highest settings at 1440p and there is a 1070 selling for $360 right now. . .decisions


Are you aiming for above 60FPS 1440p? What games do you play?


----------



## MyNewRig

Quote:


> Originally Posted by *watermanpc85*
> 
> Thanks guys!!, nice answers!!...looking at the 970 vs 1070 comparsion bench, it looks like the 1070 is having a really poor performance over the 970...
> 
> Having in mind mine goes easy to 1550 on the core, and your 1070 is a nice overclocker at 2000+ Mhz on the core, I guess the difference between my 970 and a good 1070 sample would be like 45/50% improvement.
> 
> Not an easy decision here
> 
> 
> 
> 
> 
> 
> 
> because I dont want to wait almost a year to upgrade and then see how the 2xxx series are even worse (which could happen seeing how bad AMD is competing right now)...mmmmmmm


You would not be waiting a year or anything close to that; it's 4 to 5 months tops, and probably an earlier announcement. AMD Vega will not be weak competition by any means, since Vega 10 is coming with 12 TFLOPS of performance and HBM2 memory with 512 GB/s of bandwidth, and that is a Titan XP contender. Nvidia has to counter immediately: the 1050 Ti has already moved to Samsung's 14nm fab, and next is Pascal V2 — 20% performance improvement or more, faster GDDR5X memory, and a lower MSRP. With competition in the market, Nvidia will have to make sure the product is the best it can be, unlike the situation we are in now, where due to lack of competition Nvidia can play us all they want.

It is your call in the end, so do whatever you think is best for your situation.


----------



## HaiderGill

Quote:


> Originally Posted by *muzammil84*
> 
> I've got x4 myself. The cooling on that card is amazing, on some games using vsync it rarely even spins its fans so the card is very quiet(even at full load max temp i've ever seen was 62°C).
> Got mine @2050 mhz and 9260 memory, seems to be the max for my gpu but not really bothered about extra few mhz which do absolutely nothing in games. Samsung memory, happy about that too. I was originally planning on putting a water block on it but it runs so cool and quiet that i don't see a point atm(plus the fact that it voids warranty puts me off a bit, there's A LOT discussions about dying 1070s, especially Palit, and that within days from purchase). This very efficient cooler got its price; gpu is huge, I'm running it in P5 so there's no problem but i doubt if it fits in most atx cases. They don't seem very popular in here, ppl go with MSI, EVGA, Zotac and complain a lot about certain issues, my Inno3d has been spotless so far, i definitely recommend to anyone who is considering 1070 purchase.


Yeah, I did see the X4 but my budget for video cards has always been up to £350, so I went over budget at £409. Probably a bit stupid I didn't just pay the extra £10. I have good cooling in my case (Define R4 with two Fractal front fans blowing towards the GPU, a rear exhaust fan, a Noctua NH-D15 to cool the CPU, and a Seasonic 850W M12II). I run the front two case fans on the lowest setting with the door closed, and all in all it's a very quiet PC. I can hardly tell it's on. Even under full load (Furmark) it's nowhere near as noisy as my old PC displaying a desktop. That's the clincher for the iChill Air Boss: it's quiet under load. It's quenched my thirst for water cooling... Performance wise I'm maxing everything out at 1080p60 in GTA 5, and I'm picky about a fast, smooth frame rate...


----------



## weskeh

Quote:


> Originally Posted by *HaiderGill*
> 
> Yeah I did see the X4 but my budget for video cards has always been upto £350 so I went over budget at £409. Probably a bit stupid I didn't just pay the extra £10. I have good cooling in my case (Define R4 with two Define front fans blowing towards the GPU, a define rear exhaust fan and Noctua NH D15 to cool the CPU and Seasonic 850W M12ii. I run the front two case fans on the lowest setting with the door closed and all in all a very quiet PC. I can hardly tell it's on. Even under full load (Furmark) it's no where as noisy as my old PC displaying a desktop. That's the clincher for the iChill Air Boss it's quiet under load. It's quenched my thirst for water cooling...Performance wise I'm maxing eveything out at 1080P60 in GTA 5, I'm picky about a fast smooth frame-rate...


I don't have any bad words about my Zotac AMP Extreme 1070, so there's that


----------



## weskeh

Quote:


> Originally Posted by *Roland0101*
> 
> Not if you don't plan to make physical changes to your card. It isn't easy to reach the Power limit with an 1070, it is almost impossible if you have a high TDP card, then you simply will be limited by VRel.
> 
> It is a OC issue that will probably get fixed with the coming Vbios update. (EVGA already has that BIos out, Asus and MSI confirmed that it is coming) There are cards that has problems on stock clocks, but such a card is simply defective. (Most of the time that was reported on MSI cards, and MSI acknowledged an issue.)
> See for yourself. Your 970 is pretty nice overclocked, so compare your result to the, in this benchmark, not overclocked 1070 of mine.


What does VRel actually do/mean? I know it's "voltage reliability".

But I have a hard time understanding what it really means here. I'm not reaching the TDP by any means, so what is holding what back? The voltage? How can I be limited if I don't even reach the TDP? Can you elaborate?


----------



## watermanpc85

Quote:


> Originally Posted by *MyNewRig*
> 
> You would not be waiting for a year or anything even close to that, it is 4 to 5 months tops, and probably an earlier announcement, AMD VEGA will not be a weak competition by any means since VEGA 10 is coming with 12 TFLOPS of performance and is using HBM2 512 GB/s bandwidth memory, and that is a Titan XP contender, Nvidia have to counter immediately, the 1050 Ti has already moved to Samsung 14nm fab, and next is Pascal V2, you get 20% performance improvement or more, faster GDDR5X memory, and a lower MSRP , with competition in the market Nvidia will have to make sure that the product is the best it can be, not like the situation we are in now due to lack of competition Nvidia can play us all they want.
> 
> It is your call at the end so do whatever you think is best for your situation.


Thanks man!!... I guess I will wait then... or maybe if I find a cheap 980 Ti out there I can pull the trigger and then wait and see what happens with Vega and the Nvidia 2xxx series...


----------



## khanmein

Quote:


> Originally Posted by *watermanpc85*
> 
> Thanks man!!...I guess I will wait then...or maybe If I find a 980ti cheap out there I can pull the trigger and then wait and see what happens with vega and 2xxx nvidia series...


don't expect anything about vega. HBM 2 is freaking exp. i'm more excited with Zen to compete with Intel. AMD CPU is the key point. NV still rule no.1 for graphic card.


----------



## ITAngel

Quote:


> Originally Posted by *kevindd992002*
> 
> Correct. I guess the Zotac after-sales support depends on how the official distributor/s of a specific country operates. In our case (Philippines), the distributor is very helpful and easy to deal with. So the 3+2 warranty is not a gimmick for us as it's a REAL 5 years warranty.
> Dude, I think you're misunderstanding madmeatballs regarding his query. He's not asking any comparison about the different 1070's out there. He already has the Zotac GTX 1070 AMP! Extreme and he's just asking about the his temps in particular.
> I say that's a decent core overclock as I have the same card and anything higher thatn 2076MHz gives me artifacts in Heaven. I haven't installed the AIO that I have to this 1070 though so I'm expecting an improvement in OC when I do that.


Yeah, I agree. With an AIO/block I think I could hold those speeds over long hours of gaming, but once the case, card, and environment start to heat up, it throttles down to around 2132MHz and finally to about 2088MHz even with good airflow and temps. Note I am not even configuring the fans; I figure I won't run my fans higher than 60% and I normally have them on Auto. I would say 2088/2076MHz seems to be the sweet spot on these cards. I can't complain; anything over stock speed is free performance.


----------



## TheGlow

Quote:


> Originally Posted by *khanmein*
> 
> don't expect anything about vega. HBM 2 is freaking exp. i'm more excited with Zen to compete with Intel. AMD CPU is the key point. NV still rule no.1 for graphic card.


M. Bison? Experience? Zenyatta? Nevada?


----------



## JukeBox

Successfully flashed my Palit Super JetStream GTX 1070 last night.

Device ID is: 10DE 1B81

Flashed to VBIOS 86.04.3B.00.94

"GPU Device Id: 0x10DE 0x1B81
Version: 86.04.3B.00.94
GeForce GTX 1070 VGA BIOS
Copyright (C) 1996-2016 NVIDIA Corp.
BIOS-P/[email protected]
Connectors
1x DVI-D
1x HDMI
3x DisplayPort
Board power limit
Target: 195.0 W
Limit: 225.0 W
Adj. Range: -62%, +15%
Thermal limits
Rated: 83.0C
Max: 92.0C
Memory Support
GDDR5, Samsung
GDDR5, Micron
Boost Clock: 1835 MHz"

Reason was for the higher 225W power limit.

I was initially on this BIOS:

GPU Device Id: 0x10DE 0x1B81
Version: 86.04.1E.00.2C
GeForce GTX 1070 VGA BIOS
Copyright (C) 1996-2016 NVIDIA Corp.
BIOS-P/[email protected]
Connectors
1x DVI-D
1x HDMI
3x DisplayPort
Board power limit
Target: 151.0 W
Limit: 170.0 W
Adj. Range: -50%, +13%
Thermal limits
Rated: 83.0C
Max: 92.0C
Memory Support
GDDR5, Samsung
GDDR5, Micron
Boost Clock: 1683 MHz


Stability is perfect.

Overclocking to 2150MHz core and 4450MHz memory.

Slowly turning down the voltage till it hits instability now.







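For anyone wondering where the +15% / +13% slider limits come from, they are just the ratio of "Limit" to "Target" in the two power tables above. A quick sketch (wattages taken from the BIOS dumps quoted above; the rounding is my assumption):

```python
# Check that the positive "Adj. Range" matches Limit relative to Target,
# using the two vBIOS power tables quoted above.

def max_power(target_w: float, limit_w: float) -> float:
    """Return the positive power adjustment headroom as a percentage."""
    return (limit_w / target_w - 1.0) * 100.0

# Palit Super JetStream stock BIOS: 151 W target, 170 W limit
stock = max_power(151.0, 170.0)    # ~12.6%, shown in the BIOS as +13%
# Flashed BIOS: 195 W target, 225 W limit
flashed = max_power(195.0, 225.0)  # ~15.4%, shown in the BIOS as +15%

print(round(stock, 1), round(flashed, 1))
```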


----------



## bobfig

Was looking at the BIOS collection and noticed that the Sea Hawk power limit is way lower than the Gaming X and Gaming Z. Anyone know why that would be, when the whole board is under water?


----------



## Roland0101

Quote:


> Originally Posted by *watermanpc85*
> 
> Having in mind mine goes easy to 1550 on the core, and your 1070 is a nice overclocker at 2000+ Mhz on the core, I guess the difference between my 970 and a good 1070 sample would be like 45/50% improvement.
> 
> Not an easy decision here
> 
> 
> 
> 
> 
> 
> 
> because I dont want to wait almost a year to upgrade and then see how the 2xxx series are even worse (which could happen seeing how bad AMD is competing right now)...mmmmmmm


Even worse? That would be quite a lot for just one generation. Plus the 1070 doubles the VRAM and it has better DX12 support.
But I don't want to convince you. I myself thought long about maybe skipping this generation. My old Strix 970 is also a very good card with very good performance, and if yours gives you what you need, it makes no sense to upgrade.


----------



## Snuckie7

Quote:


> Originally Posted by *madmeatballs*
> 
> Are you aiming for above 60FPS 1440p? What games do you play?


I'm aiming for 60 fps at 1440p, hopefully for some time. Last game I played was Rise of the Tomb Raider, and the next games on my docket are GTA V and The Witcher 3.


----------



## derfer

Quote:


> Originally Posted by *gtbtk*
> 
> what model card is it?


Sea Hawk X. Samsung memory.
Quote:


> Originally Posted by *weskeh*
> 
> Did you try lowering the clocks and see if its still the same?
> 
> If so i would rma yes.
> 
> Only started with this card? Never had problems with an older card?
> 
> Drivers correctly deleted and reinstalled? With ddu? Sufficiant psu?


No prior issues. No overclock besides the small factory one. 850-watt Titanium PSU. The issue just started out of the blue; nothing was updated prior to it. I was on older drivers, so I cleaned that up and am trying the latest ones now, but I can't recall ever seeing checkerboarding when it wasn't a hardware issue. When it happens in the browser it's not full screen, it's just parts of the window.


----------



## flexy123

Quote:


> Originally Posted by *vloeibaarglas*
> 
> You only lose a few percentage point in performance at most.


Yeah that sounds...ATTRACTIVE. /S

PS: Some people are enthusiasts and they like to push hardware.


----------



## gtbtk

Quote:


> Originally Posted by *derfer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> what model card is it?
> 
> 
> 
> Sea Hawk X. Samsung memory.
> Quote:
> 
> 
> 
> Originally Posted by *weskeh*
> 
> Did you try lowering the clocks and see if its still the same?
> 
> If so i would rma yes.
> 
> Only started with this card? Never had problems with an older card?
> 
> Drivers correctly deleted and reinstalled? With ddu? Sufficiant psu?
> 
> 
> No prior issues. No overclock besides the small stock one. 850watt titanium psu. The issue just started out of the blue, nothing was updated prior to it. I was on older drivers so I cleaned that up and am trying the latest ones now but I can't recall if I've ever seen checkerboard when it's not a hardware issue. When it happens in the browser its not full screen it's just parts of the window.

Checkerboard artifacts on the Micron cards are a voltage supply issue when the card ramps up from low-power idle 2D mode to 3D mode.

It is not typical of the Samsung cards as far as I am aware, but you could try setting the card to maximum performance mode in the NV control panel and assigning max performance to dwm.exe so it keeps the voltage up at 0.800v in 2D mode.
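If you want to verify whether the card really is sitting in a low-power state on the desktop, `nvidia-smi` can report the current P-state and graphics clock. A rough sketch of the idea (the polling interval and the CSV parsing are my own; this only illustrates watching P-state transitions, not the exact dwm.exe workaround):

```python
import subprocess
import time

# nvidia-smi query for performance state and current graphics clock
QUERY = ["nvidia-smi", "--query-gpu=pstate,clocks.gr",
         "--format=csv,noheader,nounits"]

def parse_sample(line: str) -> tuple:
    """Parse one CSV line like 'P8, 139' into ('P8', 139)."""
    pstate, clock = (field.strip() for field in line.split(","))
    return pstate, int(clock)

def watch(samples: int = 10, interval_s: float = 1.0) -> None:
    """Poll the GPU and print P-state/clock, to catch drops to idle (P8)."""
    for _ in range(samples):
        out = subprocess.check_output(QUERY, text=True).strip()
        pstate, clock = parse_sample(out.splitlines()[0])
        print(f"{pstate}  {clock} MHz")
        time.sleep(interval_s)

# On a machine with the NVIDIA driver installed you would call watch();
# here is just the parser on a captured sample line:
print(parse_sample("P8, 139"))
```

If the card stays in P0 with "Prefer maximum performance" set, the workaround is doing its job.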


----------



## gtbtk

Guru3D are keeping track of the Bios update releases for us

http://www.guru3d.com/news-story/manufacturers-roll-out-firmware-updates-for-geforce-gtx-1070-due-to-memory-issue,4.html


----------



## QPSS

So, has anyone tried to confirm MyNewRig's claims that there are frametime issues with Micron VRAM?
I just tried it (I have a 1070 with Micron running @ 1080p), and MSI Afterburner shows me quite a few frametime jumps (no matter the game). Jumps of 5 ms up and down, or 10 ms at lower FPS, happen even though the FPS stays the same while standing still. But I am not sure if that's normal. The jumps stop if I limit my framerate with RivaTuner Server or Vsync.

Can anyone with Samsung VRAM verify, and someone with Micron (patched and unpatched) check if they have the same jumps? I would like to send my card back if that's really an issue, because that would be unacceptable.
Or MyNewRig, could you post some screenshots of your frametime graph, so I can see if it looks like mine?


----------



## Forceman

What sample rate are you using for Afterburner? Is it recording every frame time, because AB used to just report the highest frame time for each sample period, not every frame time. So that may account for some of the disparity. Have you tried using Fraps to record the actual frame times (for each frame) and then graphed that to compare?
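If you do go the Fraps route: it logs a frametimes CSV with one cumulative timestamp (in ms) per frame, so per-frame durations and the jumps between them fall out of the differences. A rough sketch (the 5 ms threshold and the sample timestamps are made up for illustration):

```python
def frame_durations(cumulative_ms):
    """Convert cumulative Fraps-style timestamps into per-frame durations."""
    return [b - a for a, b in zip(cumulative_ms, cumulative_ms[1:])]

def spikes(durations, threshold_ms=5.0):
    """Indices of frames whose duration jumps more than threshold_ms
    relative to the previous frame (both up and down count)."""
    return [i + 1 for i, (a, b) in enumerate(zip(durations, durations[1:]))
            if abs(b - a) > threshold_ms]

# Made-up timestamps: a steady ~16.7 ms cadence with one 25 ms hitch.
times = [0.0, 16.7, 33.4, 50.1, 75.1, 91.8]
durs = frame_durations(times)  # approx [16.7, 16.7, 16.7, 25.0, 16.7]
print(spikes(durs))            # the jump up to 25 ms and back down
```

Graphing `durs` directly gives the same kind of plot as the Valley one, but with every frame recorded rather than one sample per polling period.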


----------



## QPSS

Sample rate as in hardware polling rate?
1000 ms.
I didn't do that yet, since it's more work and I don't have Fraps installed.


----------



## Forceman

Quote:


> Originally Posted by *QPSS*
> 
> Sample rate as in hardware polling rate?
> 1000 ms.
> I didnt do that yet, since its more work and I dont have Fraps installed.


It used to be that way, I don't have RTSS installed so I can't check it myself.


----------



## QPSS

I can change it down to 100 ms, which I just tried. It doesn't change the jumps, though; they're just further apart on the graph.


----------



## Forceman

Quote:


> Originally Posted by *QPSS*
> 
> I can change it down to 100 ms, which I just tried. Doesnt change the jumps, though, they just are further apart on the graph.


That sounds like it's not recording every frame time individually. Here's what the Fraps result looks like on a graph (Valley; the scene changes are why it's in groups).


----------



## QPSS

Alright, Fraps results here:


----------



## Forceman

Quote:


> Originally Posted by *QPSS*
> 
> Alright, Fraps results here:


Generally pretty good, but there are periodic spikes there.

Edit: is there a demo? If so, I can run it to compare for you.


----------



## QPSS

Nope, there's no demo AFAIK. But I'll download Valley real quick.
It's not DX12.


----------



## Forceman

Quote:


> Originally Posted by *QPSS*
> 
> Nope, theres not a demo AFAIK. But Ill download valley rq.
> Its not DX12.


Anything else you notice the same behavior on? I don't have much installed, but I have a decent selection of recent games I can test for you.


----------



## QPSS

I actually noticed those huge spikes. Felt like something was loading.

I don't have many games installed right now. RotTR, Mirror's Edge Catalyst.


----------



## Forceman

Here's two runs of Mirror's Edge Catalyst. Top one is 40ms scale, bottom is 20.


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> Guru3D are keeping track of the Bios update releases for us
> 
> http://www.guru3d.com/news-story/manufacturers-roll-out-firmware-updates-for-geforce-gtx-1070-due-to-memory-issue,4.html


What? Gigabyte has never used Micron VRAM on 1070s????
If that is true, then it shows that using Micron VRAM may not be compulsory, contrary to what those AICs claim about supply problems...


----------



## QPSS

Quote:


> Originally Posted by *Forceman*
> 
> Here's two runs of Mirror's Edge Catalyst. Top one is 40ms scale, bottom is 20.




Doesn't seem different from yours.
So do you have Samsung or Micron? If Micron... patched?


----------



## Forceman

Quote:


> Originally Posted by *QPSS*
> 
> 
> 
> Doesnt seem different from yours.
> So do you have Samsung or Micron? If Micron... patched?


I assume Samsung since I got it early on, but I've never checked.


----------



## QPSS

Could you please?


----------



## Forceman

Quote:


> Originally Posted by *QPSS*
> 
> Could you please?


Samsung, like I expected. +500 on the memory.


----------



## QPSS

Thanks.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Guru3D are keeping track of the Bios update releases for us
> 
> http://www.guru3d.com/news-story/manufacturers-roll-out-firmware-updates-for-geforce-gtx-1070-due-to-memory-issue,4.html
> 
> 
> 
> what? gigabye has not used micron vram on 1070s ever????
> if that is true then it show that using micron vram may not be compulsory as what those aic claim about supply problem...

I don't think it's true that Gigabyte has never used Micron VRAM on any card, but I cannot confirm that absolutely.

Can we do a straw poll?

Is there anyone with a Gigabyte card with Micron Memory out there who can confirm?


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> I don't think that is true that gigabyte have not used Micron VRam on some cards but I cannot confirm that absolutely.
> 
> Can we do a straw poll?
> 
> Is there anyone with a Gigabyte card with Micron Memory out there who can confirm?


Even the Zotac FE comes with Micron too.

http://www.guru3d.com/news-story/manufacturers-roll-out-firmware-updates-for-geforce-gtx-1070-due-to-memory-issue.html

There are two users there with GIGA cards that had Micron.


----------



## gtbtk

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *QPSS*
> 
> 
> 
> Doesnt seem different from yours.
> So do you have Samsung or Micron? If Micron... patched?
> 
> 
> 
> I assume Samsung since I got it early on, but I've never checked.

I will jump in.

MSI Gaming X 1070 with Micron Ram. Still on the original 86.04.26.00.3E bios

First graph at stock settings:



second graph with custom curve OC and +550 memory overclock



Both my graphs seem to match each other and look similar to your valley graphs as far as I can see.
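To compare runs like these with numbers instead of eyeballing graphs, the mean plus a high percentile of the per-frame durations works well; the spikes show up in the tail. A quick sketch with hypothetical data (none of these numbers come from my actual runs):

```python
import statistics

def summarize(frametimes_ms):
    """Return (mean, ~p99) of a list of per-frame durations in ms."""
    ordered = sorted(frametimes_ms)
    p99 = ordered[min(len(ordered) - 1, int(0.99 * len(ordered)))]
    return statistics.mean(frametimes_ms), p99

# Hypothetical runs: mostly ~16.7 ms (60 FPS) with a couple of spikes.
stock = [16.7] * 98 + [22.0, 30.0]
oc    = [15.2] * 98 + [21.0, 28.0]
print(summarize(stock))
print(summarize(oc))
```

If the overclock lowers the mean but the p99 barely moves, you gained throughput without fixing (or causing) the stutter.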


----------



## F3niX69

Quote:


> Originally Posted by *gtbtk*
> 
> I don't think that is true that gigabyte have not used Micron VRam on some cards but I cannot confirm that absolutely.
> 
> Can we do a straw poll?
> 
> Is there anyone with a Gigabyte card with Micron Memory out there who can confirm?


My friend has a g1 gaming 1070 with micron


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> I don't think that is true that gigabyte have not used Micron VRam on some cards but I cannot confirm that absolutely.
> 
> Can we do a straw poll?
> 
> Is there anyone with a Gigabyte card with Micron Memory out there who can confirm?


No need for a poll; people in the link below report having Micron on their Gigabyte G1 cards, and Gigabyte's official BIOS update page has a "Release for Micron Memory" made in August.

https://www.reddit.com/r/58fw54/manufacturers_roll_out_firmware_updates_for/


----------



## khanmein

Quote:


> Originally Posted by *khanmein*
> 
> even Zotac FE come with micron too.


Quote:


> Originally Posted by *MyNewRig*
> 
> No need for a poll, people in the following link report having Micron on their Gigabyte G1 cards and Gigabyte's official BIOS update page has a "Release for Micron Memory" made in August
> 
> 
> https://www.reddit.com/r/58fw54/manufacturers_roll_out_firmware_updates_for/


I always avoid getting GIGA because they often release too many beta vBIOSes due to the different PCB revisions like v1.0, v1.1, etc.

Damn scary if you flash the wrong vBIOS..

The new driver 375.57 is out, & the W10 issues list is longer than AMD's.

When can NV solve these issues???


Spoiler: Warning: Spoiler!



- [GM204] Quantum Break window either remains blank or freezes in game scene in windowed mode. [1804910]
- Assassin's Creed - Syndicate shows intermittent flickering black or white patches on game character faces. [200211264]
- Surround Display icon disappears after rotate mode set to portrait. [200201040]
- [Forza Horizon 3] Possible driver memory leak. [1826143]
- [GeForce GTX 1070][Alienware Graphics Amplifier] The graphics card is not detected upon installing the driver. [200236450]
- [367.77, WDDM 2.1] Driver install/overinstall requires reboot. [1757931]
- [Luxmark 3.0] Display driver stopped responding while running benchmark LuxBall HDR (Simple Benchmark: 217K triangles). [200153736]
- [347.09, GM204] Blank screen observed on an ASUS Tiled display when system resumes from shutdown or hibernation with Fast boot option enabled from BIOS. [1591053]


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> i always avoid getting GIGA cos they often release to many beta vbios due to different pcb revision like v1.0, v1.1, etc..


Totally agree. I was having hell with my last few motherboards from Gigabyte with the beta BIOS releases that stay in beta state for many months; I don't buy anything Gigabyte anymore because of this.


----------



## EDK-TheONE

Does the MSI Gaming (non-X) have Micron memory?


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> Totally agree, i was having hell with my last few motherboards from Gigabyte with the beta BIOS releases that stay in beta state for many months, i do not buy anything Gigabyte anymore because of this.


You can still buy GIGA, but you need to triple-check & do some homework before purchase. Usually the 1st/2nd batch/month is OK; then they tend to use cheap parts or cut costs wherever possible.

I bought a Leadtek because it was cheap + on promotion along with a Plextor M6S 128 GB, but now my card is being RMA'd because one of the fans stopped spinning; so far no other issue.

FYI, both my fans are Colorful CF-12815B (I'm not sure if they're single or dual ball bearing)


----------



## khanmein

Quote:


> Originally Posted by *EDK-TheONE*
> 
> msi gaming non-x has micron memory?


Most likely it comes with Micron. Try to find some older stock, because right now more than 90% of GTX 1070s have Micron.

Go ask your favorite reviewer how come their GTX 1070 has Samsung?


----------



## MyNewRig

Funny thing is that reports are surfacing that EVGA and Zotac Founders Edition cards are starting to use Micron memory. Where are all those who said this is not Nvidia's fault on the grounds that all reference designs use Samsung?

Entertaining to watch all the recent 1070s and 1060s being trashed. Game on...









Since Nvidia's driver and software support is now no better or faster than AMD's, and they are using cheap memory ICs in their products: AMD Vega 10 for the win. Faster memory, a proper DX12 async compute implementation, soon with PS4 Pro-style checkerboard 4K rendering and half-precision (FP16) rendering, and currently better and faster software and driver support. They fixed the power draw issue on the RX 480 in three days while Nvidia has been playing us for two months, and they released their Battlefield 1 driver earlier!

C'mon AMD, give me Vega already... my wallet is ready


----------



## vallonen

AMD used to be the leader and the innovator; it would be nice to see them back in the driver's seat again, it's been long enough. Besides, as consumers we could do with some competition again. Intel, and more so Nvidia, cost too much money for what they offer performance-wise, and Pascal has not impressed me much.


----------



## Avendor

Hello guys. For the last 5-6 years I had a GTX 580; I'm proud to now have a GTX 1070, which is a big jump in performance for me. My model is the Gigabyte G1 Gaming, and I am amazed how cool it runs at full load: the temperature hardly exceeds 60°C.







I ran some tests in Fire Strike / Time Spy; my scores:

i5 3570K @ 4.4 GHz (air)
core clock: +85
mem clock: +426

Pretty OK-ish scores, I suppose?
There's nothing wrong with Micron memory per se. The problem is GPU voltage not being delivered sufficiently. All credit goes to @gtbtk, who reported this issue. His workaround should do the job: NVCP > Power management mode > Prefer maximum performance.








If I set Adaptive, the checkerboard pattern instantly appears. Hopefully a new VBIOS will resolve this.


----------



## madmeatballs

Quote:


> Originally Posted by *Snuckie7*
> 
> I'm aiming for 60 fps at 1440p, hopefully for some time. Last game I played was Rise of the Tomb Raider, and the next games on my docket are GTA V and The Witcher 3.


I assume you're aiming for max settings as well? If so, The Witcher 3 performs okay at 1440p max settings with 2x AA, HairWorks, and HBAO; the lowest FPS I saw was around 55. Decent enough, but I wonder how it will do with future titles, so if you're aiming for 60 FPS and above, this card might not meet what you want. In Fallout 4 Far Harbor I get around 50+ FPS too, with HBAO and god rays on. If you don't mind turning those off, I'm sure it will stay above 60 FPS with no problems. I am just worried about new titles and how Nvidia handles DX12.


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I don't think that is true that gigabyte have not used Micron VRam on some cards but I cannot confirm that absolutely.
> 
> Can we do a straw poll?
> 
> Is there anyone with a Gigabyte card with Micron Memory out there who can confirm?
> 
> 
> 
> No need for a poll, people in the following link report having Micron on their Gigabyte G1 cards and Gigabyte's official BIOS update page has a "Release for Micron Memory" made in August
> 
> 
> __
> https://www.reddit.com/r/58fw54/manufacturers_roll_out_firmware_updates_for/

I found that too


----------



## Lavajuice

Quote:


> Originally Posted by *gtbtk*
> 
> I don't think that is true that gigabyte have not used Micron VRam on some cards but I cannot confirm that absolutely.
> 
> Can we do a straw poll?
> 
> Is there anyone with a Gigabyte card with Micron Memory out there who can confirm?


Purely anecdotal of course, but I RMA'd the Gigabyte Samsung card I had gotten 9/28 due to one of the fans being excessively loud, and the replacement Newegg sent me about a week ago was also Samsung.


----------



## gtbtk

Hmmmmmmm??!!!!!


----------



## gtbtk

Quote:


> Originally Posted by *Lavajuice*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I don't think that is true that gigabyte have not used Micron VRam on some cards but I cannot confirm that absolutely.
> 
> Can we do a straw poll?
> 
> Is there anyone with a Gigabyte card with Micron Memory out there who can confirm?
> 
> 
> 
> Purely anecdotal of course, but I RMA'd my Gigabyte Samsung card I had gotten 9/28 due to one of the fans being excessively loud, and the replacement from Newegg which was sent to me about a week ago was also Samsung.

There are three possibilities here.

Either you received old stock (unlikely), or they have dropped Micron and gone back to Samsung because it was all "too hard", or they are alternating memory brands by production run because neither supplier can fully satisfy demand for continuous production, which I think is most likely.

I notice that 1060s are now showing up with Micron memory as well.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*


This Kyle is a jerk and a cheater. He doesn't have the balls to show us the Micron...


----------



## JukeBox

Not scientific or anything, but the Path of Exile client has a frame time counter, and my frame time sits around a constant 4-7 ms; it never spikes above that on the graphs. The poll rate appears to be fixed at 1000 ms. This is on a patched Micron card; same behaviour for me while unpatched. Palit card.
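For reference, frame times and FPS are reciprocals, so readings like the 4-7 ms above translate directly into frame rates. A minimal sketch:

```python
def fps_from_frametime(frametime_ms):
    """Convert a frame time in milliseconds to frames per second."""
    return 1000.0 / frametime_ms

# 4-7 ms frame times correspond to roughly 143-250 FPS:
print(round(fps_from_frametime(7.0)))  # 143
print(round(fps_from_frametime(4.0)))  # 250
```

A flat 4-7 ms band on the graph is the sign of a healthy card here; the checkerboard/memory issue discussed in this thread tends to show up as crashes rather than frame-time spikes.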


----------



## TylerAD

Installed my MSI Gaming X (Newegg purchase from 9/28) and it does have Samsung memory, confirmed by GPU-Z. I haven't had much time to tweak/OC yet, but with BF1 last night I was running +110 on the core (boost was around 2012, but I don't remember exactly) and +450 on memory. So far so good... can't wait to see what I can squeeze out of this card.


----------



## SpirosKGR

Any recommendations for a custom fan profile for the EVGA SC GTX 1070? (After 2 hours of Rainbow Six Siege, I max out at 76°C on the default stock fan profile.)









I flashed the new BIOS; no problems found, no artifacts.

Also some benches with my old i7 2600K at 4.2 GHz.


----------



## Chaoz

Quote:


> Originally Posted by *SpirosKGR*
> 
> Any recommendation for custom fan profile for Evga SC GTX 1070 ? ( 2 hours Rainbow Six Siege gaming, max 76 C at Default - stock fan profile - )


This is my fan profile for my GTX 1070 SC. The card doesn't go over 58°C while gaming at 1080p ultra-wide at max settings, and it's really quiet as well.
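A custom profile like this is just a set of temperature/fan-speed breakpoints that the tool interpolates between. A minimal sketch of that interpolation, using hypothetical breakpoints (the actual points from the profile above aren't listed in the post):

```python
def fan_speed(temp_c, curve):
    """Linearly interpolate a fan % from a list of (temp_C, fan_%) points."""
    curve = sorted(curve)
    if temp_c <= curve[0][0]:      # below the first point: floor speed
        return curve[0][1]
    if temp_c >= curve[-1][0]:     # above the last point: ceiling speed
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:     # interpolate within this segment
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

# Hypothetical aggressive curve in the spirit of the profiles discussed here
curve = [(30, 30), (50, 45), (60, 60), (70, 80), (80, 100)]
print(fan_speed(55, curve))  # 52.5
```

The takeaway from the EVGA VRM discussion above is that the breakpoints below ~60°C matter most: a curve that idles at 30-40% fan lets the VRM heat-soak long before the GPU core temperature looks alarming.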


----------



## khanmein

Steve Burke aka Gamers Nexus said "Need to do more research." (clever answer)

"No worries. It's really just a time thing -- I don't know much about this issue, so have to research heavily before making any statements. We've got a lot on our plate already. Thanks for understanding! ^SB"

I personally think Micron is crippled vs Samsung (GDDR5), and I guess GDDR5X is the same crap.

I finally got a reply from a well-known tech reviewer, and when I asked a few tech reviewers about it, their fans started shielding them like the previous Thai king.


----------



## Nelly.

SERIOUS ISSUE with EVGA cards - Any of you EVGA owners seen this? >>

__
https://www.reddit.com/r/580b8w/welp_my_evga_1070_ftw_just_killed_itself/


----------



## khanmein

Quote:


> Originally Posted by *Nelly.*
> 
> SERIOUS ISSUE with EVGA cards - Any of you EVGA owners seen this? >>
> 
> __
> https://www.reddit.com/r/580b8w/welp_my_evga_1070_ftw_just_killed_itself/
> 
> I have 5x Gainward Bliss 1070 (Micron) and a bunch of Gainward Phoenix (Samsung)...
> 
> So far I've tested memory OC in mining mode only.
> 
> So far Samsung is stable at +780-900, but lower P-states and voltages can crash the driver, so it's a bit hard to test right!
> The Bliss (Micron) cards are updated to the new BIOS:
> +800 is stable ONLY IF I force a static frequency and voltage so the card NEVER drops below the P2 state!
> If I don't force high clocks, even +450 crashes the miner at start with the new Micron BIOS from the site!
> 
> So the problem is still not fixed!
> 
> The MSI Gaming X (Micron) is CRAP. The first MSI Gaming died, and this piece of crap is overheating. Guru3D, I blame you and all the review sites for misleading us!!!!! I am waiting for an IR temp tool to write my article about MSI faking the GPU temperature and the VRM being dangerously HOT!
> 
> The MSI Gaming X on the original Micron BIOS is stable at +700 if I force the P2 state and use Afterburner to limit the clock to 1700 MHz!


----------



## F3niX69

Has Nvidia ditched the power saving features? I noticed today that my card is sitting at base clocks without me choosing so. I haven't locked any voltage and my NVCP is set to Adaptive. Even changing back to Optimal mode didn't get the clocks down, and I restarted my PC several times. The driver version is 375.57, and the problem occurred on the previous version too.


----------



## EDK-TheONE

I finally ordered a Zotac 1070 AMP Extreme; it will arrive within 48 hours. I hope it has Samsung memory.


----------



## QPSS

Quote:


> Originally Posted by *Nelly.*
> 
> SERIOUS ISSUE with EVGA cards - Any of you EVGA owners seen this? >>
> 
> __
> https://www.reddit.com/r/580b8w/welp_my_evga_1070_ftw_just_killed_itself/


Doesn't really surprise me. The EVGA 970's cooler was designed badly as well: one of the heatpipes didn't even touch the GPU, and it was very loud. Before that I had a 260 from EVGA, which they refused to replace even though it crashed regularly, and I found out the cooler was simply installed badly.
I am done with EVGA because of all this. I gave them enough chances, and I don't understand why people still praise them so much.


----------



## khanmein

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Finally i ordered Zotac 1070 Amp Extreme and will be received in 48 hours later . I hope it has Samsung memory.


If you receive Samsung, then good luck to you; but even if it's Micron, I'd say this card is more than satisfactory in quality.

"We decided not to touch the memory as 8.2 Gbps seems more than adequate, and in terms of real world performance, it's much more effective to focus power to the GPU's core clocks instead."

https://www.custompcreview.com/reviews/zotac-geforce-gtx-1070-amp-graphics-card-review/32803/

Actually, their heatsink design looks quite good, and if I'm not mistaken, better than the Asus Strix; but Asus's MOSFETs (IR3555) are far superior.


----------



## asdkj1740

Quote:


> Originally Posted by *QPSS*
> 
> Doesnt really surprise me. The EVGAs 970s cooler was designed badly aswell, one of the heatpipes didnt even touch the GPU and was very loud. Before that I had a 260 from EVGA, which they refused to replace, even though it crashed regularly. and I found out the cooler was simply installed badly.
> I am done with EVGA because of all this crap. I gave them enough chance, and I dont understand why people still praise them so much.


I got that one too. It really sucks at cooling, especially when you unlock the power limit: 90°C at 4000 RPM was what I got...


----------



## asdkj1740

""With this being said, EVGA understands that lower temperatures are preferred by reviewers and customers.""

what do you guys feel about this statement?


----------



## khanmein

Quote:


> Originally Posted by *asdkj1740*
> 
> ""With this being said, EVGA understands that lower temperatures are preferred by reviewers and customers.""
> 
> what do you guys feel about this statement?


The EVGA representative said a 40% fan curve is too low; we need to increase it to a minimum of 60% so that the VRM stays cool.


----------



## asdkj1740

Quote:


> Originally Posted by *khanmein*
> 
> the EVGA representative said 40% fan curve is too low, we need to increase to min 60% so that the VRM is cooled.


I won't say it's wrong, because EVGA really does have a conservative stock fan profile this time.
But that still can't justify the bad VRM cooling.

When people complain about EVGA cooling, some always say "increase the fan speed"...
That's not addressing the right dimension of the problem.


----------



## khanmein

Quote:


> Originally Posted by *asdkj1740*
> 
> i wont say it is wrong becasue evga really has conservative stock fan profile this time.
> but it still cant justify the bad vrm cooling problem.
> 
> when people complain about evga cooling, some always said increase the fan speed....
> this is not the right dimension...


Yeah, this is what I'm hinting at: a bad VRM cooling implementation. And I don't know why the Science Studio fella bought two EVGA GTX 1070s (for the color scheme matching his ASRock X99 Taichi) and plans to give away an Asus Strix GTX 1070 at 100k subscribers.


----------



## Roland0101

Quote:


> Originally Posted by *F3niX69*
> 
> Has nvidia ditched the power saving features? I noticed today that my card is sitting on base clocks without me choosing so.I haven't locked any voltage and my nvcp is set to adaptive.Even changing back to optimal mode didn't get clocks down. I even restarted my pc several times.Driver version is 375.57 and the problem occured on the previous version too.


No, they didn't ditch the power saving features. Use DDU in Safe Mode and install 373.06.


----------



## F3niX69

Quote:


> Originally Posted by *Roland0101*
> 
> No, they didn't ditched the power saving features. Use DDU in save mode and install 373.06.


It's only the latest drivers that do this, then?


----------



## Roland0101

Quote:


> Originally Posted by *F3niX69*
> 
> its only the latest drivers that do this then?


The latest (375.57) has some issues, with Windows 10 in particular; that's why I don't recommend using it.

I am on 373.06 and have no issues whatsoever, but that may not be the case for your system. In other words, it's probably not the driver alone that causes the issue; it's either a corrupt driver installation or an interaction between something on your system and the driver.


----------



## derfer

That artifact issue I reported a few pages ago seems to affect 1080s too, so it's not connected to the 1070 memory woes. http://forums.guru3d.com/showthread.php?t=409841


----------



## khanmein

Quote:


> Originally Posted by *derfer*
> 
> That artifact issue I reported a few pages ago seems to affect 1080s too so its not connected to the 1070 memory woes. http://forums.guru3d.com/showthread.php?t=409841


GDDR5X micron issue?


----------



## gtbtk

Quote:


> Originally Posted by *derfer*
> 
> That artifact issue I reported a few pages ago seems to affect 1080s too so its not connected to the 1070 memory woes. http://forums.guru3d.com/showthread.php?t=409841


Those are different types of artifacts from the checkerboard artifact that the BIOS update will fix. I have not seen a problem like that one before.

The 1070 memory voltage issue puts a white checkerboard pattern over the entire screen and then usually crashes the PC.


----------



## zipzop

Quote:


> Originally Posted by *F3niX69*
> 
> Has nvidia ditched the power saving features? I noticed today that my card is sitting on base clocks without me choosing so.I haven't locked any voltage and my nvcp is set to adaptive.Even changing back to optimal mode didn't get clocks down. I even restarted my pc several times.Driver version is 375.57 and the problem occured on the previous version too.


In all likelihood you have something running in the background that is boosting your card up from its idle states. ShadowPlay or "Share" puts the card in a high power state even when it's not recording, simply by being enabled. Pull up the Nvidia Control Panel, and in the menu bar under "Desktop", enable "Display GPU activity icon in notification area" to see what is using the GPU.


----------



## F3niX69

Quote:


> Originally Posted by *zipzop*
> 
> In all likelihood you have something running in the background that is boosting your card up from idle states. Shadowplay or "share" puts the card in a high power state , even when it's not recording but just simply by being enabled. Pull up Nvidia control panel, in the menu bar under "desktop", enable "display GPU activity icon in notification area" and see what is using the GPU


Aaaaand indeed it was ShadowPlay. Disabled it and the clocks went down. Sad that even when not recording it keeps the card at base clocks. Any workarounds to use ShadowPlay and keep the power states at the same time?


----------



## zipzop

Quote:


> Originally Posted by *F3niX69*
> 
> Aaaaand indeed it was shadowplay. Disabled it and clocks went down. Sad that even when not recording it runs the card at base clocks. Any work arounds to use shadowplay and have the power states at the same time?


Actually, it is recording: it keeps up to the most recent 5 minutes (or 10, 15, or 20, whatever you set it to).

Hit Alt+Z any time to disable/enable Instant Replay. When it's enabled, it is always recording the last 5 minutes.

Edit: also, I think that by turning off desktop capture in Preferences > Privacy Control, it will *not* boost clocks on the desktop with Instant Replay enabled, because it won't be recording desktop activity.


----------



## Chaoz

Quote:


> Originally Posted by *Roland0101*
> 
> The latest , (375.57) as some issues, (with windows 10 in particular.) that's why I don't recommend to use it.
> 
> I am on 373.06 and have no issues whatsoever, but that must not be the same for your system. Or in other words, its probably not the driver alone that causes the issue, it's either a corrupt installation of the driver or a interaction between something on your system and the driver.


But not in Win 7? I'm asking since I'm on those new drivers; otherwise I can't play BF1, as they're required to play it.

I don't have those problems with my card, though. I've been playing games all week for hours on end.
I made a custom fan curve, so hopefully I'm out of danger.


----------



## F3niX69

Quote:


> Originally Posted by *zipzop*
> 
> Actually it is recording. it records up to the most recent 5 minutes(or 10 or 15 or 20, whatever you set it to)
> 
> Hit alt+z any time and disable/enable instant replay. When it's enabled, it will always be recording the last 5 minutes
> 
> edit: also I think that by turning off desktop capture in preferences->privacy control, it will also *not* boost clocks on the desktop with instant replay enabled. Because it will not be recording latest desktop activity


Disabling desktop recording gets the clocks down, which is nice. I just tested it.
I have another question, though: where does ShadowPlay store the footage it records before you press the button to save the last few minutes? Is it constantly writing to the drives?


----------



## zipzop

Quote:


> Originally Posted by *F3niX69*
> 
> Disabling desktop recording is getting the clocks down which is nice. I just tested it.
> I got another question though. Where is shadowplay storing the footage it is recording before (pressing the button to save the last few minutes) Is it constantly writing on the drives?


Good question, but most likely AppData > Local > NVIDIA Corporation > ShadowPlay. Also, I think it's only encoded into an .mp4 after you hit the save button; before that it's just metadata?


----------



## TheGlow

Quote:


> Originally Posted by *F3niX69*
> 
> Aaaaand indeed it was shadowplay. Disabled it and clocks went down. Sad that even when not recording it runs the card at base clocks. Any work arounds to use shadowplay and have the power states at the same time?


From what I recall, I think it uses your default windows\user\Videos folder.
I moved my whole user folder to a different drive out of habit: after an OS reinstall I just point to the user folder and I'm back to where I was. It also means fewer reads/writes to the SSD.
But until the VBIOS comes out, staying at high clocks isn't necessarily bad. I rigged mine to do this on purpose, so I can OC the memory to +800 or higher and not instantly checkerboard.


----------



## F3niX69

Quote:


> Originally Posted by *zipzop*
> 
> Good question but most likely AppData->Local->NVIDIA Corporation->ShadowPlay. Also I think it's only encoded into a .mp4 after you hit the save button. Before that it's just meta data?


My question is mainly this: does it burden my SSD with extra writes even when not saving a clip? Because if it does, it will wear the SSD out.


----------



## mrtbahgs

Quote:


> Originally Posted by *F3niX69*
> 
> Disabling desktop recording is getting the clocks down which is nice. I just tested it.
> I got another question though. Where is shadowplay storing the footage it is recording before (pressing the button to save the last few minutes) Is it constantly writing on the drives?


Hit Alt+z
Go to the little double gear icon on the bottom right
Go to Recordings
You can now see where it is storing and also change it to somewhere else.


----------



## F3niX69

Quote:


> Originally Posted by *mrtbahgs*
> 
> Hit Alt+z
> Go to the little double gear icon on the bottom right
> Go to Recordings
> You can now see where it is storing and also change it to somewhere else.


It's on the main drive, which is the SSD. Not good.


----------



## RyanRazer

http://wccftech.com/nvidia-gtx-1070-memory-issue/

Micron RAM issue has been acknowledged


----------



## khanmein

Quote:


> Originally Posted by *RyanRazer*
> 
> http://wccftech.com/nvidia-gtx-1070-memory-issue/
> 
> Micron RAM issue has been acknowledged


They always want to be the first tech outlet to announce this kind of stuff.


----------



## RyanRazer

Quote:


> Originally Posted by *khanmein*
> 
> they always wanna be the 1st tech reviewer to announced this kinda stuff.


Yeah, not my favorite site, but they cover a lot...

More like a gossip yellow-pages magazine than a tech site.


----------



## TheGlow

Quote:


> Originally Posted by *F3niX69*
> 
> My question is mainly this.Does it burden my ssd with extra writes even when not saving a clip? Cause if it does it destroys ssds.


Yes, if you have ShadowPlay/Instant Replay on. Mine's set to 1 minute, so it's always recording the last minute. If I press my save hotkey, it dumps the clip to the Videos/GameName/ folder and starts another 1-minute clip in the buffer. So I guess it's technically writing constantly to keep the buffer updated. Hence I have the user's Documents, Videos, Pictures, Downloads, etc. on a separate 1TB drive. All but AppData, which I've debated moving as well.
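A rolling buffer like this means a steady write stream whenever Instant Replay is armed. A rough back-of-the-envelope sketch of the write volume, assuming a 50 Mbps recording bitrate (the actual figure depends on your ShadowPlay quality setting; treat it as an assumption):

```python
def mb_written_per_minute(bitrate_mbps):
    """Approximate MB written per minute of continuously buffered video."""
    return bitrate_mbps / 8 * 60  # Mbit/s -> MB/s -> MB/min

# At an assumed 50 Mbps quality setting, a rolling buffer rewrites roughly:
print(mb_written_per_minute(50))  # 375.0 MB per minute
```

Even at 375 MB/min, a few hours of gaming per day adds up to tens of GB of buffer writes, which is why remapping the buffer location to an HDD (as suggested above) is a sensible precaution for small SSDs.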


----------



## netok

Quote:


> Originally Posted by *khanmein*
> 
> they always wanna be the 1st tech reviewer to announced this kinda stuff.


They are just good at spreading rumors and copying from other sites.
Guru3D covered this 2 days ago:
https://www.guru3d.com/news-story/manufacturers-roll-out-firmware-updates-for-geforce-gtx-1070-due-to-memory-issue.html


----------



## TheGlow

Quote:


> Originally Posted by *Roland0101*
> 
> The latest , (375.57) as some issues, (with windows 10 in particular.) that's why I don't recommend to use it.
> 
> I am on 373.06 and have no issues whatsoever, but that must not be the same for your system. Or in other words, its probably not the driver alone that causes the issue, it's either a corrupt installation of the driver or a interaction between something on your system and the driver.


I updated last night and played some Overwatch with no problems. I loaded it up today and it's a totally different story. Rebooting didn't help. Instead of sitting at 144, FPS was dipping to 120, and I could feel hard stuttering that didn't show in the FPS readout. I also swear I was seeing some tearing; it felt like G-Sync wasn't on or something.
I confirmed the desktop was at 144Hz with G-Sync on, and toggled the game's display settings away from and back to [email protected]
So I just rolled back to 373.06 and hope that fixes it.

Edit: No, it's still horrible all over. FPS hits 110-120 and flickers. Something's up.
TestUFO also won't stay on "valid" for more than 4 seconds; many stutters, even up to major (6-10 detected).
It definitely seems to me that G-Sync stopped working.
Did DDU and reinstalled as well, no change. I dropped it to 60 FPS, where G-Sync should have been buttery smooth, and it was choppy all over.


----------



## monza1412

So what's the latest word on the Micron issue? I saw a couple of vendors release new BIOSes. Did the BIOS updates solve the crashes? Any memory OC gains after the update?


----------



## Roland0101

Quote:


> Originally Posted by *TheGlow*
> 
> I updated last night and played some Overwatch with no problem. I load it up today and totally different story. Reboot, no help. FPS instead of sitting at 144 were dipping to 120 and i could feel hard stuttering but not showing in the fps readout. Also swear i was seeing some tearing, felt like GSync wasnt on or something.
> I confirmed desktop was 144Hz, GSync on, change game display settings from and back to [email protected]
> So I just rolled back to 373.06 and hope that fixes it.
> 
> Edit: No, still horribly all over. FPSs hit 110-120 and flicker. Somethings up.
> Testufo also wont stay on valid more t han 4 seconds, many stutters, even up to major, 6-10 detected.
> Definitely seems to me GSync stopped working.
> Did ddu and reinstalled as well, no change. I dropped it to 60fps which the gsync should have been buttery smooth and it was choppy all over.


That doesn't sound good.

I read a report saying that DDU is not capable of cleaning 375.57 completely from the system due to changes in the driver's structure.
I can't confirm that since I don't have access to a machine with this driver, but after your experience it seems likely.

Did you test 368.81?
You may have to search for driver residues after using DDU again. Running SFC /scannow and/or an OS repair might help too.


----------



## gtbtk

Quote:


> Originally Posted by *RyanRazer*
> 
> http://wccftech.com/nvidia-gtx-1070-memory-issue/
> 
> Micron RAM issue has been acknowledged


Guru3d announced it on the 21st http://www.guru3d.com/news-story/manufacturers-roll-out-firmware-updates-for-geforce-gtx-1070-due-to-memory-issue.html

NCIX tech tips gave it a mention too at 3:40


----------



## TheGlow

Quote:


> Originally Posted by *Roland0101*
> 
> That doesn't sound good.
> 
> I read a report that said that DDU is not capable of cleaning 375.57 completely from the system due to changes in the drivers structure.
> I can not confirm that, I don't have access to a machine with this driver, but after your experience it seams likely.
> 
> Did you tested 368.81?
> You may have to search for driver residues after using DDU again. Running SFC /scannow and/or a OS repair might help too.


I dunno what's up. I thought I'd found the fix, but it might have been a placebo. I switched to borderless window and it seemed fine; I went back to fullscreen, no problem.
I'd been playing for the last 3 hours with no problems. I switch to McCree and instantly I have the issue. I remembered using him before when it happened. I switched back to other characters, no difference. I then tried window/fullscreen toggles again, nothing.
Brought up the RivaTuner OSD and GPU usage was going as low as 61%. I think it normally just ran at 95%+, but I can't really remember.
The only other pic I have is from when I first got the card on Aug 16, and it shows 99% usage.
I tried lowering my OC, no difference. And when it happens, as I mentioned, G-Sync looks like it isn't doing anything.
I checked in Witcher 3 and it seemed OK.


----------



## Roland0101

Quote:


> Originally Posted by *TheGlow*
> 
> I dunno whats up. I thought I found the fix but might have been a placebo. I switched to borderless window, and seemed fine. I went back to fullscreen, no problem.
> Been playing for last 3 hours no problems. I switch to McCree and instantly I have the issue.


So the problem occurs only in Overwatch, and then only with a specific character?


----------



## TheGlow

Quote:


> Originally Posted by *Roland0101*
> 
> So the problem accrues only in Overwatch and then only with a specific character?


Not character specific, or at least that was coincidental. I just loaded up without joining a match, and in the skin preview area (where you can hold left click and move the cursor to rotate the view) it randomly stutters; that's when I see GPU usage drop to 60-70% and FPS to 110-120. Then it goes back to 144 FPS (which I have it capped at), GPU usage comes back up, and it seems smooth again. It keeps alternating like this.
Tried TestUFO again and got lots of stutter reports there too.
So I don't know what to think.


----------



## khanmein

Quote:


> Originally Posted by *netok*
> 
> They are just good at spreading rumors and copying from other site.
> Guru3d covered this 2 days ago.
> https://www.guru3d.com/news-story/manufacturers-roll-out-firmware-updates-for-geforce-gtx-1070-due-to-memory-issue.html


Why don't you say Guru3D copied gtbtk, who started the thread on the GeForce.com forums???

Other tech reviewers only slowly started covering this issue because I made a lot of f***-ing noise at them.


----------



## mrtbahgs

Quote:


> Originally Posted by *F3niX69*
> 
> its on the main drive which is the ssd,not good


Yeah, I'm not sure whether it constantly writes or not. I can try to remember to check, since I have a little gadget that tracks my drive reads and writes among other things.
Like someone else mentioned, I have my Videos folder remapped to my 1TB HDD, but you could at least remap just the ShadowPlay file location to fix that portion if you have an HDD as well.


----------



## Arturo.Zise

Got myself a Gainward Phoenix 1070 two weeks ago. I've been running a few games at 4K on my new Panasonic TV and am loving it so far. I've only tried a +100/+400 OC; the core seems to hover around 2000 MHz and the memory is at 8800. The fans spin to 40% max and temps sit at a constant 70°C; I can hardly hear it spinning up. My room temps are 27°C+ on most days as well.

I managed to get lucky with Samsung RAM on this card. How high does Samsung usually OC? Will I notice much difference in most games at higher memory clocks?
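The 8800 figure lines up with how GDDR5 clocks are usually reported: monitoring tools typically show the double-data-rate figure (4004 MHz stock on the 1070), the effective rate is twice that, and the Afterburner offset appears to apply to the DDR figure. A small sketch under those assumptions:

```python
def effective_mem_clock(base_ddr_mhz, offset_mhz):
    """GDDR5 effective (quad-pumped) clock, assuming the overclocking
    offset applies to the double-data-rate figure (2x command clock)."""
    return (base_ddr_mhz + offset_mhz) * 2

# GTX 1070 stock: 4004 MHz DDR figure (8008 MHz effective).
# A +400 offset lands close to the ~8800 observed above:
print(effective_mem_clock(4004, 400))  # 8808
```

By that reckoning, Samsung cards in this thread reporting stable +780-900 offsets would be running roughly 9.5-9.8 Gbps effective, though whether that shows up as real FPS gains depends on how bandwidth-bound the game is.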


----------



## mstrmind5

All recent buyers, say within the last 6 weeks: what memory does your 1070 have? Could you also tell me which 1070 model you have?

Thanks.


----------



## asdkj1740

Quote:


> Originally Posted by *Arturo.Zise*
> 
> Got me a Gainward Phoenix 1070 2 weeks ago. Been running a few games at 4k on my new Panasonic TV and am loving it so far. Only tried +100/+400 OC and it seems to hover around 2000mhz core and memory is at 8800. Fans spin to 40% max and temps sit at 70c constant. Can hardly hear it spinning up. My room temps are 27c+ on most days as well.
> 
> I managed to get get lucky with Samsung ram on this card. How high OC does Samsung usually go? Will I notice much difference in most games at higher memory clocks?


The newly released BIOS gives you ~25W more to use, which should help stabilize the core clock.


----------



## asdkj1740

Quote:


> Originally Posted by *mstrmind5*
> 
> All recent buyers, say last 6 weeks, what memory does your 1070 have? Could you also tell me what model 1070 you have?
> 
> Thanks.


Most likely Micron is what you will get.
The 1060 has started using Micron too; you should buy a 1050 Ti.
Actually, there is non-Samsung VRAM on the 1050 Ti too.


----------



## Blasius

my 1070


----------



## luciferuru

Hi,

I own an MSI Gaming X GTX 1070.

I've noticed that when I play Gears of War 4 and The Witcher 3 at max settings at 1080p, my TDP doesn't exceed 70-80% while my GPU usage is at 100%.

Is that normal?


----------



## netok

Quote:


> Originally Posted by *khanmein*
> 
> y don't u said guru3d copied gtbtk that started the thread at geforce.com forums???
> 
> other tech reviewer slowly started to make this issue cos i make a lot f***-ing noise at them.


I am talking about the media, not someone on a forum.
If you count that, you do know this issue was first reported on the Chinese forum Chiphell long before gtbtk started that thread, right?


----------



## Mr-Dark

Hello

My new Gaming X arrived today to replace the EVGA 1070 SC. I'm planning for SLI; another one is on the way.

This one is way better than the EVGA SC: power usage is around 65% to 75%, not more, with a max temp of 68°C, while the SC hit 105% or more and throttled at stock clocks at 77°C on the stock fan curve.

Something weird happened after installing the Gaming X: my monitor started flickering on the desktop. I changed the power management mode from Optimal to Adaptive and that fixed the problem completely. Is that normal for anyone else, or is it just me?


----------



## Dude970

Great looking rig, Mr-Dark. I don't have that issue with mine, but I have read others saying they had to switch to Adaptive.


----------



## asdkj1740

Quote:


> Originally Posted by *Mr-Dark*
> 
> Hello
> 
> My New Gaming X arrive today to replace the Evga 1070 SC.. planing for SLI an another one on the way..
> 
> 
> 
> This one is way better than EVGA SC.. Power usage around 65% to 75% not more and max temp is 68c, while the SC hit 105% or more and throttle at stock clock with 77c at stock fan curve
> 
> 
> 
> 
> 
> 
> 
> 
> 
> something wired happen after installing the Gaming X. My monitor start flicker on the desktop.. I change the power plan from Optimal to Adaptive and that fix the problem completely.. is that normal ? anyone or its just me ?


Welcome back, dude.
Yes, it's "normal" for now, and MSI is going to release a new BIOS for you.
Yours is Micron, right?
Try setting maximum performance mode to avoid this issue for the time being.


----------



## asdkj1740

Quote:


> Originally Posted by *luciferuru*
> 
> Hi
> 
> I own an MSI Gaming X GTX 1070.
> 
> I notice that when I play Gears of War 4 and The Witcher 3 at max settings at 1080p,
> 
> my TDP doesn't exceed 70-80% while my GPU usage is at 100%.
> 
> Is that normal?


It should be normal.
Your card is fine, as the MSI has a "290W" power max, and Pascal cards really have great power control.

But I personally don't think it is normal: the Zotac 300W BIOS gives me zero throttling in FurMark 1080p 0xAA, while the MSI 290W BIOS gave me serious throttling. Maybe only my card fails with this BIOS.

Is anyone here using an MSI Seahawk?


----------



## luciferuru

When I have the time, I will run a Fire Strike benchmark.

In this thread, I saw some cards at 70-80% TDP
while others are at 100-110% TDP.


----------



## Mr-Dark

Quote:


> Originally Posted by *Dude970*
> 
> 
> 
> 
> 
> 
> 
> 
> Great looking rig MR-Dark. I don't have that with mine, but have read others saying they had to switch to adaptive.


Thanks, bro. I'm using the Adaptive plan now and no problems so far.








Quote:


> Originally Posted by *asdkj1740*
> 
> Welcome back, dude.
> Yes, it's normal for now, and MSI is going to release a new BIOS for it.
> Yours is Micron, right?
> Try setting max power to avoid this issue for now.


Thanks. The strange thing is that the SC never flickered on me.. only some black screens while playing games on the stock fan curve, due to VRM overheating... lol









Yes, it's Micron.. I ordered two from Newegg last week.. another one is on the way so I can SLI them.


----------



## QPSS

Quote:


> Originally Posted by *luciferuru*
> 
> When I have the time, I will run a Fire Strike benchmark.
> 
> In this thread, I saw some cards at 70-80% TDP
> while others are at 100-110% TDP.


Since most of the cards have a different stock power target in terms of wattage, the percentage alone doesn't really tell you anything anyway.
Some have 175W, some 190W, some 200W, some 225W, etc.
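Since the TDP% readings people are comparing are relative to each card's own stock power target, the same percentage can mean very different wattage. A quick sketch of the conversion (the card labels are placeholders; the wattages are the example values above):

```python
# Illustrative only: the card names are placeholders, and the wattages are
# the example stock power targets mentioned in this thread.
STOCK_POWER_TARGET_W = {
    "card A": 175,
    "card B": 190,
    "card C": 200,
    "card D": 225,
}

def tdp_percent_to_watts(card: str, tdp_percent: float) -> float:
    """Convert a monitoring tool's TDP% reading into absolute watts."""
    return STOCK_POWER_TARGET_W[card] * tdp_percent / 100.0

# 80% on a 175 W card is already less power than 65% on a 225 W card:
print(tdp_percent_to_watts("card A", 80))   # 140.0 W
print(tdp_percent_to_watts("card D", 65))   # 146.25 W
```

So two cards showing "75% TDP" can be drawing very different amounts of power, which is why the percentages alone aren't comparable.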


----------



## khanmein

Quote:


> Originally Posted by *netok*
> 
> I am talking about the media, not someone on some forum.
> If you count that, you do know that this issue was first reported on the Chinese forum Chiphell long before gtbtk started that thread, right?


yeah, Chiphell mentioned it earlier..


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *netok*
> 
> They are just good at spreading rumors and copying from other site.
> Guru3d covered this 2 days ago.
> https://www.guru3d.com/news-story/manufacturers-roll-out-firmware-updates-for-geforce-gtx-1070-due-to-memory-issue.html
> 
> 
> 
> why don't you say Guru3D copied gtbtk, who started the thread on the geforce.com forums???
> 
> other tech reviewers slowly started covering this issue because I made a lot of f***-ing noise at them.

thanks for the support


----------



## gtbtk

Quote:


> Originally Posted by *netok*
> 
> Quote:
> 
> 
> 
> Originally Posted by *khanmein*
> 
> why don't you say Guru3D copied gtbtk, who started the thread on the geforce.com forums???
> 
> other tech reviewers slowly started covering this issue because I made a lot of f***-ing noise at them.
> 
> 
> 
> I am talking about the media, not someone on some forum.
> If you count that, you do know that this issue was first reported on the Chinese forum Chiphell long before gtbtk started that thread, right?

I never claimed I was the first person to notice the artifacts. I was actually one of the first owners of a Micron card, as I got mine on 21 July, so I probably noticed it early too, but that is not relevant here.

I was the first person to work out what was causing it and to start trying to get it resolved.

As the Micron cards were introduced with a new .26 BIOS family, compared to the older Samsung cards with the .1E family, I suspect Nvidia and the partners may have already seen the problem before launching the Micron-memory cards and tried to fix it with the .26 BIOS, but could not work out exactly what was causing it in the time they had available.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> I never claimed I was the first person to notice the artifacts. I was actually one of the first owners of a Micron card, as I got mine on 21 July, so I probably noticed it early too, but that is not relevant here.
> 
> I was the first person to work out what was causing it and to start trying to get it resolved.
> 
> As the Micron cards were introduced with a new .26 BIOS family, compared to the older Samsung cards with the .1E family, I suspect Nvidia and the partners may have already seen the problem before launching the Micron-memory cards and tried to fix it with the .26 BIOS, but could not work out exactly what was causing it in the time they had available.


This is what I'm mad about: they knew it earlier. Samsung chips are in shortage because Samsung needs to supply AMD too.. What I think happened is that NV asked Micron, "since you guys released GDDR5X, why don't you supply us GDDR5?" Micron said, "we don't know how to produce GDDR5." NV said, "then ask Elpida," and Micron said, "oh yeah, we bought Elpida." TL;DR: Micron released crippled GDDR5, and the cost is way cheaper compared with Samsung..

Once the end users found out: "we'll just use a new vBIOS to hold them off. Give it 3 months and they will forget it."


----------



## Dude970

Playing BF1, card is running it great. I game at stock settings.


----------



## asdkj1740

Quote:


> Originally Posted by *khanmein*
> 
> This is what I'm mad about: they knew it earlier. Samsung chips are in shortage because Samsung needs to supply AMD too.. What I think happened is that NV asked Micron, "since you guys released GDDR5X, why don't you supply us GDDR5?" Micron said, "we don't know how to produce GDDR5." NV said, "then ask Elpida," and Micron said, "oh yeah, we bought Elpida." TL;DR: Micron released crippled GDDR5, and the cost is way cheaper compared with Samsung..
> 
> Once the end users found out: "we'll just use a new vBIOS to hold them off. Give it 3 months and they will forget it."


My last 970 was Elpida. It could barely get 1850MHz stable, up from the 1750MHz default.
But after I attached VRAM heatsinks (when I later put an AIO on the 970), I got 2000MHz stable.
Elpida seems to be very temperature sensitive. Not good, but it should still be better than Micron, I guess.

And I won't even tell you that my 970 was an EVGA SSC and its VRAM was cooled by that **** plate before, lol.
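For anyone converting between the command clocks quoted here (1750/1850/2000MHz) and the "7000/8000MHz effective" figures seen in spec sheets: GDDR5 moves data at four times the command clock, and bandwidth is the data rate times the bus width in bytes. A small sketch of that arithmetic (bus width defaults to the 1070's 256-bit; the 970's is different):

```python
# GDDR5's data rate is 4x the command clock (double data rate on a
# write clock that itself runs at 2x the command clock).
def gddr5_bandwidth_gbps(command_clock_mhz: float, bus_width_bits: int = 256) -> float:
    effective_mt_s = command_clock_mhz * 4               # e.g. 2000 MHz -> 8000 MT/s
    return effective_mt_s * (bus_width_bits / 8) / 1000.0  # bytes/transfer -> GB/s

print(gddr5_bandwidth_gbps(2000))   # 256.0 GB/s, stock GTX 1070 (8000 MT/s)
print(gddr5_bandwidth_gbps(2250))   # 288.0 GB/s at a 9000 MT/s overclock
```

This is also why a "+800MHz" effective overclock is really only a +200MHz bump to the command clock.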


----------



## khanmein

Quote:


> Originally Posted by *asdkj1740*
> 
> My last 970 was Elpida. It could barely get 1850MHz stable, up from the 1750MHz default.
> But after I attached VRAM heatsinks (when I later put an AIO on the 970), I got 2000MHz stable.
> Elpida seems to be very temperature sensitive. Not good, but it should still be better than Micron, I guess.
> 
> And I won't even tell you that my 970 was an EVGA SSC and its VRAM was cooled by that **** plate before, lol.


The EVGA 970 SSC is quite good compared to my cheap Leadtek.

Elpida was officially acquired by Micron.


----------



## DeathAngel74

I had one FTW+ with Samsung and one with Elpida. The card with Elpida caused artifacts at 1440p or anything higher than 1506.5MHz. Elpida is teh suxx0rz


----------



## RyanRazer

Quote:


> Originally Posted by *TheGlow*
> 
> I updated last night and played some Overwatch with no problem. I loaded it up today and it's a totally different story. Rebooted, no help. FPS, instead of sitting at 144, was dipping to 120, and I could feel hard stuttering that wasn't showing in the FPS readout. I also swear I was seeing some tearing; it felt like G-Sync wasn't on or something.
> I confirmed the desktop was 144Hz and G-Sync was on, and changed the game display settings away from and back to [email protected]
> So I just rolled back to 373.06 and hope that fixes it.
> 
> Edit: No, still horrible all over. FPS hits 110-120 and flickers. Something's up.
> Testufo also won't stay on "valid" more than 4 seconds; many stutters detected, even up to major, 6-10.
> Definitely seems to me G-Sync stopped working.
> Did DDU and reinstalled as well, no change. I dropped it to 60fps, which with G-Sync should have been buttery smooth, and it was choppy all over.


Sorry to hear that. For me the new 375 driver works just fine. Here's a video:


----------



## TheGlow

Quote:


> Originally Posted by *RyanRazer*
> 
> Sorry to hear. for me new driver 375 work just fine. Heres video:


I'm going to just chalk it up to something with Overwatch, because there was a patch the other day as well.
Again, very erratic. I had it glitch again but played through it for 10 minutes or so, and then it cleared up again for 2 hours.
I just started again today and it's choppy. I have no idea. It looks like the card is either throttling back or the game just isn't syncing up right.
A brief test in Witcher 3 and everything is fine and smooth, 85-100fps.


----------



## RyanRazer

Quote:


> Originally Posted by *TheGlow*
> 
> I'm going to just chalk it up to something with Overwatch, because there was a patch the other day as well.
> Again, very erratic. I had it glitch again but played through it for 10 minutes or so, and then it cleared up again for 2 hours.
> I just started again today and it's choppy. I have no idea. It looks like the card is either throttling back or the game just isn't syncing up right.
> A brief test in Witcher 3 and everything is fine and smooth, 85-100fps.


I just tried Overwatch with Genji and McCree (I think I've read he gave you trouble) and it works just fine.


----------



## TheGlow

Quote:


> Originally Posted by *RyanRazer*
> 
> i just tried overwatch with genji and mcree (i think i've read he gave you trouble) and it works just fine.


Yeah, I'm wondering if it's just some oddity with McCree or certain maps. From what I can gather, when it happens McCree is there, but it doesn't always happen when McCree is there.
I think I need to take stock of who is in the match in general, and which map.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I never claimed I was the first person to notice artifacts. I was actually one of the first owners of a micron card as I got mine on 21 July so I probably noticed it early too but that is not relevant here.
> 
> I was the first person to work out what was causing it and start trying to get it resolved.
> 
> As the Micron cards were introduced with a new .26 bios family compared to teh older Samsung cards with the .1E family of bios, I have a suspicion that Nvidia and the partners may have already seen the problem before launching the micron memory cards and tried fixing it with the .26 version bios but could not work out exactly what was causing it in the time they had available.
> 
> 
> 
> this is what i'm mad bout it. they knew it earlier. samsung chip is shortage cos they need to supply to AMD too.. what i think is that NV asked Micron since u guys released GDDR5X y don't u guys supply us GDDR5 but micron said we don't know how to produce GDDR5. NV said asked elpida then micron said oh yeah we bought elpida over. TLDR; micron released crippled GDDR5 & the cost is way cheaper compare with samsung..
> 
> once the end user found out, we just use new vbios to hold them up. give the 3 months they will forget it.

We don't know what they actually knew; that was only a guess on my part. The new BIOS could have just been coincidental, because all the recently made cards, including the newer Samsung ones, have come with the .26 BIOS as well. The Micron change was 1 July. Humans plan around calendar dates that stand out, so the 1st of the month could well be a likely cut-over date for a new BIOS release too.

Getting all emotional and paranoid about this will not fix it or make it be resolved any faster. I am pretty sure there was never any intent to roll out something that would give them a headache to fix later, but stuff happens..... The support people should probably have dealt with it better, but that assumes the support guys actually know what they are talking about, when in reality they work in an outsourced call-centre environment, read off a script and look things up in a knowledge-base database. Escalation procedures all tend not to be thought through very well.

That creates an issue when a new problem arises, because there are no database entries to regurgitate. The other problem these companies have is that the last thing a knowledgeable and experienced IT guy wants to be doing is first-level tech support.


----------



## gtbtk

Quote:


> Originally Posted by *RyanRazer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheGlow*
> 
> I updated last night and played some Overwatch with no problem. I load it up today and totally different story. Reboot, no help. FPS instead of sitting at 144 were dipping to 120 and i could feel hard stuttering but not showing in the fps readout. Also swear i was seeing some tearing, felt like GSync wasnt on or something.
> I confirmed desktop was 144Hz, GSync on, change game display settings from and back to [email protected]
> So I just rolled back to 373.06 and hope that fixes it.
> 
> Edit: No, still horribly all over. FPSs hit 110-120 and flicker. Somethings up.
> Testufo also wont stay on valid more t han 4 seconds, many stutters, even up to major, 6-10 detected.
> Definitely seems to me GSync stopped working.
> Did ddu and reinstalled as well, no change. I dropped it to 60fps which the gsync should have been buttery smooth and it was choppy all over.
> 
> 
> 
> Sorry to hear. for me new driver 375 work just fine. Heres video:

Mine started black-screen crashing this morning. I had to roll back to 373.06, which has been stable since.


----------



## RyanRazer

Quote:


> Originally Posted by *gtbtk*
> 
> Mine started black screen crashing this morning. I had to roll back to 373.06 which has since been stable


All games or game specific?


----------



## TheGlow

Man, the last match got BAD. It said 108 frames, but I know it was more like 20. GPU usage went as low as 48% from what I could tell. Makes no damn sense.


----------



## Roland0101

Quote:


> Originally Posted by *TheGlow*
> 
> I'm going to just chalk it up to something with Overwatch, because there was a patch the other day as well.
> Again, very erratic. I had it glitch again but played through it for 10 minutes or so, and then it cleared up again for 2 hours.
> I just started again today and it's choppy. I have no idea. It looks like the card is either throttling back or the game just isn't syncing up right.
> A brief test in Witcher 3 and everything is fine and smooth, 85-100fps.


The hotfix driver (375.63) is out and Nvidia pulled 375.57. Maybe it helps, but it could very well be a game problem.
Any related issues on the Overwatch forums?


----------



## gtbtk

Quote:


> Originally Posted by *RyanRazer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Mine started black screen crashing this morning. I had to roll back to 373.06 which has since been stable
> 
> 
> 
> All games or game specific?

It actually crashed out to a black screen with a mouse cursor in Sniper Fury, a game that hardly puts any load on the card. It has never done that before.


----------



## SpirosKGR

Hello guys. While playing Battlefield 1 multiplayer, I noticed that the GPU usage on my new GTX 1070 drops to 60-70-80% in game almost all the time, in Conquest mode. I tried lots of configurations with the Ultra/High/Medium/Low presets, but the problem still exists. I have a 1080p monitor and the resolution scale in game is 100% (native). Max CPU usage was 90-95%.
What can I do to fix the problem?







I believe it's not a GPU problem (Fire Strike bench for reference: http://www.3dmark.com/3dm/15560429 )
*Specs*
Intel Core i7 2600K at 4.2GHz | Kingston DDR3 4x2GB (total 8GB) 1600MHz |
Asrock Fatality z77 Professional motherboard |
Arctic Liquid 120 Freezer ( AIO cpu cooler ) |
EVGA GTX 1070 SC 8GB |
Samsung 840 Pro 128GB |
WD Caviar Black 1 TB |
OS: Windows 7 Home 64 bit

Max GPU temperature: 64C (with a fan profile) |
Max CPU temperature: 59-61C |

I tried installing the game on the SSD, but it didn't help.
Thanks in advance


----------



## TheGlow

Quote:


> Originally Posted by *SpirosKGR*
> 
> Hello guys. While playing Battlefield 1 multiplayer, I noticed that the GPU usage on my new GTX 1070 drops to 60-70-80% in game almost all the time, in Conquest mode. I tried lots of configurations with the Ultra/High/Medium/Low presets, but the problem still exists. I have a 1080p monitor and the resolution scale in game is 100% (native). Max CPU usage was 90-95%.
> What can I do to fix the problem?
> 
> 
> 
> 
> 
> 
> 
> I believe that its not a gpu problem ( fire strike bench for test http://www.3dmark.com/3dm/15560429 )
> *Specs*
> Intel Core i7 2600K at 4.2 GHz | Kingston DDR3 4x2GB ( total 8GB ) 1600mhz |
> Asrock Fatality z77 Professional motherboard |
> Arctic Liquid 120 Freezer ( AIO cpu cooler ) |
> EVGA GTX 1070 SC 8GB |
> Samsung 840 Pro 128GB |
> WD Caviar Black 1 TB |
> OS: Windows 7 Home 64 bit
> 
> Max gpu temperature: 64C ( with fan profile ) |
> Max cpu temperature: 59-61 |
> 
> I tried to install the game on ssd but it didnt work
> Thanks in advance


Double-check the resolution scale. In the beta it was something stupid, like 42% was really 100%, so 100% was really trying to push 4K or some crap.
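For reference, assuming the slider scales each render axis (how DICE's slider has generally worked, though the mapping reportedly changed between beta and release), the render-target math looks like this; a purely illustrative sketch:

```python
# Illustrative sketch: a per-axis resolution-scale slider multiplies both
# the width and the height, so the pixel load grows with the square.
def render_target(w: int, h: int, scale_pct: float):
    s = scale_pct / 100.0
    return round(w * s), round(h * s)

print(render_target(1920, 1080, 100))  # (1920, 1080), native 1080p
print(render_target(1920, 1080, 200))  # (3840, 2160), a 4K-sized load at "200%"
```

So a mislabeled slider pushing well past 100% would quietly multiply the pixel count several times over, which matches the "trying to push 4K" symptom.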


----------



## RyanRazer

Quote:


> Originally Posted by *TheGlow*
> 
> Double-check the resolution scale. In the beta it was something stupid, like 42% was really 100%, so 100% was really trying to push 4K or some crap.


I am pretty sure the 100% res scale is fixed now. I doubt my card would run 4K at 100FPS.


----------



## DeathAngel74

Yeah, a new driver's out, a hotfix...


----------



## RyanRazer

Quote:


> Originally Posted by *DeathAngel74*
> 
> Yeah new drivers out, hotfix...


Just saw it. If @Roland0101 hadn't mentioned it, I wouldn't have noticed. My GeForce Experience does not notify me about the update; it says I am on the latest version, 375.57... Any improvements? Worth the upgrade?


----------



## F3niX69

Quote:


> Originally Posted by *SpirosKGR*
> 
> Hello guys. While playing Battlefield 1 multiplayer, I noticed that the GPU usage on my new GTX 1070 drops to 60-70-80% in game almost all the time, in Conquest mode. I tried lots of configurations with the Ultra/High/Medium/Low presets, but the problem still exists. I have a 1080p monitor and the resolution scale in game is 100% (native). Max CPU usage was 90-95%.
> What can I do to fix the problem?
> 
> 
> 
> 
> 
> 
> 
> I believe that its not a gpu problem ( fire strike bench for test http://www.3dmark.com/3dm/15560429 )
> *Specs*
> Intel Core i7 2600K at 4.2 GHz | Kingston DDR3 4x2GB ( total 8GB ) 1600mhz |
> Asrock Fatality z77 Professional motherboard |
> Arctic Liquid 120 Freezer ( AIO cpu cooler ) |
> EVGA GTX 1070 SC 8GB |
> Samsung 840 Pro 128GB |
> WD Caviar Black 1 TB |
> OS: Windows 7 Home 64 bit
> 
> Max gpu temperature: 64C ( with fan profile ) |
> Max cpu temperature: 59-61 |
> 
> I tried to install the game on ssd but it didnt work
> Thanks in advance


It sounds like the CPU can't keep up, though it should be capable of 60fps gaming at 1080p. What are your framerates?


----------



## Pittster

Quote:


> Originally Posted by *SpirosKGR*
> 
> Hello guys. While playing Battlefield 1 multiplayer, I noticed that the GPU usage on my new GTX 1070 drops to 60-70-80% in game almost all the time, in Conquest mode. I tried lots of configurations with the Ultra/High/Medium/Low presets, but the problem still exists. I have a 1080p monitor and the resolution scale in game is 100% (native). Max CPU usage was 90-95%.
> What can I do to fix the problem?
> 
> 
> 
> 
> 
> 
> 
> I believe that its not a gpu problem ( fire strike bench for test http://www.3dmark.com/3dm/15560429 )
> *Specs*
> Intel Core i7 2600K at 4.2 GHz | Kingston DDR3 4x2GB ( total 8GB ) 1600mhz |
> Asrock Fatality z77 Professional motherboard |
> Arctic Liquid 120 Freezer ( AIO cpu cooler ) |
> EVGA GTX 1070 SC 8GB |
> Samsung 840 Pro 128GB |
> WD Caviar Black 1 TB |
> OS: Windows 7 Home 64 bit
> 
> Max gpu temperature: 64C ( with fan profile ) |
> Max cpu temperature: 59-61 |
> 
> I tried to install the game on ssd but it didnt work
> Thanks in advance


I have near-identical specs (i7 2600K @ 4.5GHz, 16GB 2133 RAM, 1070 @ 2.1GHz core / 8400MHz mem, Windows 10); my Fire Strike run for reference: http://www.3dmark.com/3dm/15560429

I run the Ultra preset at 1080p. GPU utilization is a constant 99%, and frames vary between 100-160fps all game (64-player Conquest).

Is Vsync off? Do you run a frame-rate cap?
Win 7 vs Win 10?
When I went from 8GB 1600 to 16GB 2133, I noticed smoother gameplay in BF4 (did the upgrade last year).
Your PC should be fine. What are you actually averaging in frames over a match?


----------



## SpirosKGR

Quote:


> Originally Posted by *Pittster*
> 
> I have near-identical specs (i7 2600K @ 4.5GHz, 16GB 2133 RAM, 1070 @ 2.1GHz core / 8400MHz mem, Windows 10); my Fire Strike run for reference: http://www.3dmark.com/3dm/15560429
> 
> I run the Ultra preset at 1080p. GPU utilization is a constant 99%, and frames vary between 100-160fps all game (64-player Conquest).
> 
> Is Vsync off? Do you run a frame-rate cap?
> Win 7 vs Win 10?
> When I went from 8GB 1600 to 16GB 2133, I noticed smoother gameplay in BF4 (did the upgrade last year).
> Your PC should be fine. What are you actually averaging in frames over a match?


The 3DMark bench is the same.








Vsync is, of course, OFF.
Windows 7 (maybe Windows 10 would fix the problem?)


----------



## SpirosKGR

Quote:


> Originally Posted by *F3niX69*
> 
> It sounds like the cpu can't keep up.Though it should be capable for 60fps gaming on 1080p.What are your framerates?


My frames are close to 100+ at Ultra, but the gameplay is not so smooth... The GPU usage goes up to 80-90%, jumps to 60, then 70, 75, 83, back to 67, 70, 80~, like that.

Sorry for my bad English. It is not my native language.


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> We don't know what they actually knew; that was only a guess on my part. The new BIOS could have just been coincidental, because all the recently made cards, including the newer Samsung ones, have come with the .26 BIOS as well. The Micron change was 1 July. Humans plan around calendar dates that stand out, so the 1st of the month could well be a likely cut-over date for a new BIOS release too.
> 
> Getting all emotional and paranoid about this will not fix it or make it be resolved any faster. I am pretty sure there was never any intent to roll out something that would give them a headache to fix later, but stuff happens..... The support people should probably have dealt with it better, but that assumes the support guys actually know what they are talking about, when in reality they work in an outsourced call-centre environment, read off a script and look things up in a knowledge-base database. Escalation procedures all tend not to be thought through very well.
> 
> That creates an issue when a new problem arises, because there are no database entries to regurgitate. The other problem these companies have is that the last thing a knowledgeable and experienced IT guy wants to be doing is first-level tech support.


Why are you so confident that the new BIOS fixed the power-state problem? I see you keep spreading that claim on every single site that has an article or people talking about the issue, without any evidence. Did you go through that EVGA BIOS update thread or the comments on Guru3D?

The new BIOS simply did what Nvidia and EVGA product manager Jacob said it would do, namely "improve the overclocking experience for certain users": they increased the memory voltage and nothing more, so people get 100MHz or so of additional OC headroom. But the voltage/frequency switching algorithm still did not change, and as a result people are still getting the checkerboard artifacts of death if they don't lock the card in 3D mode or a high power/voltage state.

Your proposed solution was to increase voltage -> then increase frequency -> then decrease frequency -> then decrease voltage, so that the memory never finds itself at a high frequency with a simultaneously low voltage. But this is not what happens: checkerboarding still occurs without locking 3D mode or the voltage, or running something like a browser in the background, even on the new BIOS.

The BIOS update is just a gimmick by Nvidia to camouflage the real issue from the market. One more piece of evidence that the Micron ICs are not stable across the board is that someone like TheGlow can do +800MHz without any special tricks, while others have to downclock their VRAM to 7600MHz to run stable.

If the ICs were okay and the BIOS had a general problem, then everyone would have the same results. But when some people on the same BIOS can reach 9000+MHz while others cannot run stable at 8000MHz, the memory ICs are hit and miss, indicating a mass-production quality problem.

The BIOS did not fix the power-state switching issue at all. Why are you so active in spreading misinformation like that when the evidence suggests otherwise? I don't get it.
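For what it's worth, the transition ordering gtbtk proposed (raise voltage before frequency on the way up; drop frequency before voltage on the way down) can be sketched in a few lines. This is purely illustrative pseudologic, not actual BIOS or driver behavior, and the clock/voltage numbers are made up:

```python
# Illustrative sketch: apply a clock/voltage change so the memory is never
# left running at a high frequency with only a low-state voltage.
def safe_transition(state, target_freq, target_volt):
    if target_freq > state["freq"]:
        state["volt"] = target_volt   # going up: raise voltage first...
        state["freq"] = target_freq   # ...then raise frequency
    else:
        state["freq"] = target_freq   # going down: drop frequency first...
        state["volt"] = target_volt   # ...then drop voltage
    return state

s = {"freq": 405, "volt": 0.725}      # made-up 2D idle state
s = safe_transition(s, 4000, 1.05)    # enter 3D: volt up, then clock up
s = safe_transition(s, 405, 0.725)    # back to 2D: clock down, then volt down
```

In both directions the intermediate state is "high voltage, low frequency", which is safe; the failure mode being described in the thread is the opposite intermediate state.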


----------



## TheGlow

Quote:


> Originally Posted by *MyNewRig*
> 
> The BIOS update is just a gimmick by Nvidia to camouflage the real issue from the market. One more piece of evidence that the Micron ICs are not stable across the board is that someone like TheGlow can do +800MHz without any special tricks, while others have to downclock their VRAM to 7600MHz to run stable.


I have indicated several times that I DID have to do something special: I need to make sure I don't enter 2D clocks.
I installed the MSI Gaming App, which installs a GamingApp service. This in turn launches two exes, MsiGamingOSD_x64 and MsiGamingOSD_x32.
I noticed that if the x64 one is running, I get stuck in 3D clocks. That way I usually get 800mV in Afterburner, sometimes as low as 725mV.
The clock idles at 1582MHz.
If I stop that exe, I drop to something like a 315MHz clock and a voltage of something like 90.
Now, when I'm down there, if I run anything that tries to bring me up to 3D clocks, I checkerboard, like when launching MS Edge.
So, as gtbtk has mentioned, it appears to be voltage related. While I'm in 3D clocks I can move the memory clock up and down all day fine. Once I go 2D and run something with hardware acceleration, game over.


----------



## MyNewRig

Quote:


> Originally Posted by *TheGlow*
> 
> I have indicated several times that I DID have to do something special: I need to make sure I don't enter 2D clocks.
> I installed the MSI Gaming App, which installs a GamingApp service. This in turn launches two exes, MsiGamingOSD_x64 and MsiGamingOSD_x32.
> I noticed that if the x64 one is running, I get stuck in 3D clocks. That way I usually get 800mV in Afterburner, sometimes as low as 725mV.
> The clock idles at 1582MHz.
> If I stop that exe, I drop to something like a 315MHz clock and a voltage of something like 90.
> Now, when I'm down there, if I run anything that tries to bring me up to 3D clocks, I checkerboard, like when launching MS Edge.
> So, as gtbtk has mentioned, it appears to be voltage related. While I'm in 3D clocks I can move the memory clock up and down all day fine. Once I go 2D and run something with hardware acceleration, game over.


The thing is that with the new EVGA BIOS the same thing is still happening. The only difference is that some users can OC a bit higher than before "in 3D mode", but eventually, if the voltage is not locked, the checkerboard of death still happens. So how is that a fix? The original complaints were about instability and crashing in the form of checkerboard artifacting when switching power states; that is still the case, so is the new BIOS the fix we wanted?

Also, your OC level is rare and unique. Many cards are crashing at stock, crashing that goes away when the memory is underclocked. How is that not a mass-production quality issue with the memory ICs?

gtbtk is spreading information all over the place that the BIOS update is the final answer that makes the Micron memory issues go away, which is simply not the case.


----------



## TheGlow

I had installed the Gaming App first, not knowing about Afterburner. The rest I found out later when cleaning up and removing unneeded apps; that's when I was checkerboarding like crazy. It took me a bit to figure all that out about the OSD exes.


----------



## MyNewRig

Quote:


> Originally Posted by *TheGlow*
> 
> I had installed the Gaming App first, not knowing about Afterburner. The rest I found out later when cleaning up and removing unneeded apps; that's when I was checkerboarding like crazy. It took me a bit to figure all that out about the OSD exes.


So with the updated BIOS your OC level can increase by +50MHz or +100MHz due to the higher voltage being fed to the VRAM, but if your voltage or power state is not locked you will still checkerboard-artifact and crash. So how is that the final fix gtbtk makes it sound like it is, spreading that claim all over the place?


----------



## DeathAngel74

I only got the checkerboard twice: once while overclocking the VRAM past 8500MHz (Windows 10), and a second time, after installing the fixed BIOS, when overclocking too close to 9000MHz (Windows 7). Both times self-inflicted. Maybe I got lucky purchasing mine from BBY? Mine is model #08G-P4-6173-KB, not -KR. Once the GPU and CPU overclocks were stable, the black screens and crashes to desktop ceased. I also set dwm.exe, chrome.exe and explorer.exe to maximum performance in NVCP, so the card idles at 1594/8726 at 0.762V. No problems switching between 2D and 3D apps (games). All games are set to maximum performance in NVCP.


----------



## MyNewRig

Quote:


> Originally Posted by *DeathAngel74*
> 
> I only got the checkerboard twice: once while overclocking the VRAM past 8500MHz (Windows 10), and a second time, after installing the fixed BIOS, when overclocking too close to 9000MHz (Windows 7). Both times self-inflicted. Maybe I got lucky purchasing mine from BBY? Mine is model #08G-P4-6173-KB, not -KR.


The fact that you are still getting the checkerboard crashing after updating the BIOS (even though your OC increased) means the problem is not fixed: the BIOS is still lowering the voltage before the frequency, starving the memory of voltage and causing an immediate crash.

Also, the fact that results are so variable on both the old and new BIOS (or, as you put it, some people getting lucky and some getting so unlucky that they crash at stock settings) indicates a mass-production quality issue.

We are dealing with two different problems here: memory-IC quality issues and improper voltage management even after the update. These problems are still not fixed, so there is no need to mislead people into thinking everything is okay now!


----------



## DeathAngel74

I'm just stating that I have no more issues. Twice in almost 2 weeks, and both times the checkerboard went away when I lowered my VRAM OC. I'm not trying to argue about anything, just stating that I'm not having issues anymore whatsoever. Sorry everyone else is still having issues. Geez!


----------



## MyNewRig

Quote:


> Originally Posted by *DeathAngel74*
> 
> I'm just stating that I have no more issues. Twice in a almost 2 weeks. Both times, when I lowered my VRAM OC, the checkerboard went away. I'm not trying to argue about anything. Just stating that I'm not having issues anymore, whatsoever. Sorry everyone else is still having issues. Geez!


No, no, it's perfectly fine; I'm not pissed at you at all. I'm just pissed because gtbtk is making the market believe the BIOS update is the magical answer to all the issues and makes this a problem of the past, even when the evidence suggests otherwise. Sorry if my anger sounded like it was directed at you.


----------



## DeathAngel74

I'm pissed too. I bought the card without knowing any of this, lol! My fault for not researching... but damn, it's frustrating. Then I start overclocking and boom! Checkerboard. I thought it was just for the normal reasons, lol: too-high core or VRAM clocks, not enough voltage, etc. Then I found all this here and at the EVGA forums. Should have kept my SLI 970 FTW+'s. Sorry, brah, my post was kinda insensitive to those still experiencing frustration and problems. This blows... Hopefully they do a Pascal v2 refresh to compensate, for free of course.


----------



## MyNewRig

Quote:


> Originally Posted by *DeathAngel74*
> 
> I'm pissed too. I bought the card without knowing any of this, lol! My fault for not researching....but damn its frustrating. Then I start overclocking and boom! checkerboard. I thought it was just for the normal reasons, lol. Too high core or vram clocks, etc. Then I found all this here and at evga forums. Should have kept my SLI 970 FTW+'s.


Yeah, me too. I feel so damn stupid for not keeping my GTX 980 Ti and falling for this GTX 1070 trap. I have now returned it and have to go without a GPU for God knows how long, until there are decent enthusiast-grade GPUs on the market. But how could I have known that Pascal would turn out to be the mess it is, unless I can see into the future? It was always the case that newer is better; not this time!

You could even be double-pissed, because you are not only affected by that Micron crap but also got an EVGA product, and I am sure you have heard about the hotspot and fire-hazard issues on Pascal ACX cards. It is just crazy what they are doing with these cards. Does lack of competition really result in this much negligence? I am beyond surprised, to be honest.


----------



## TheGlow

I have an MSI card and haven't used any of the BIOSes released so far. It's still stock, so the behavior is understandable.


----------



## DeathAngel74

Yeah... as of tomorrow, EVGA will send us thermal pads free of charge to address the hotspot issue. Woohoo! LOL! Already ordered some Gelid GC Extreme.


----------



## MyNewRig

Quote:


> Originally Posted by *DeathAngel74*
> 
> Yeah...As of tomorrow EVGA will send us thermal pads to address the hotspot issue free of charge. Woohoo! LOL! Already ordered some Gelid GC Extreme.


So instead of going the Samsung route, issuing a mass recall and fixing their products themselves to make sure they stay safe and cool, they are sending thermal pads to their customers for a complex and time-consuming repair job. To me this is a bad joke given the product's price; good for EVGA that they have customers who are okay with such a thing.


----------



## DeathAngel74

Like I said in the other post, we should be given a Pascal v2 refresh free of charge for our trouble.


----------



## MyNewRig

Quote:


> Originally Posted by *DeathAngel74*
> 
> Like I said in the other post. We should be given Pascal V2 refresh free of charge for our trouble.


Hahahaha, it is probably cheaper for these companies to file for bankruptcy than to do that; your ambitions are way out of control. Nvidia usually considers these troubles "features" of the product.


----------



## Vectorized

Hi guys,
I have a GIGABYTE GeForce GTX 1070 G1 and I wish to overclock it.

Do you suggest raising the core voltage or not? If yes, why?

I read that the core voltage is locked to 1.09 V no matter how far you push the voltage slider: is this true?

Thank you very much!


----------



## MyNewRig

Quote:


> Originally Posted by *Vectorized*
> 
> Hi guys,
> I have a GIGABYTE GeForce GTX 1070 G1 and I wish to overclock it.
> 
> Do you suggest me to raise core voltage or not? If yes, why?
> 
> I read that core voltage is locked to 1.09 V, no matter how far you set the voltage slider: is this true?
> 
> Thank you very much!


Yes, the voltage is locked at that level. Stock voltage is about 1.07 V, so the difference between maxing out the voltage slider or not is only about 0.02 V, which is very small. All that maxing the voltage will give you is roughly an extra 20-40 MHz of boost, or that much less power throttling. I would max the voltage slider to hold the highest clock possible under load, although some people report a better OC without moving the slider at all.

Just try it; there is not much difference and no risk. You will run about 2-3°C warmer at max voltage versus stock.


----------



## TheGlow

Hmm, it seems the new drivers are ignoring my OC...
I had to change it a few times before it kicked back in...


----------



## TheGlow

Quote:


> Originally Posted by *Roland0101*
> 
> The hot fix driver (375.63) is out and Nvidia pulled 375.57. Maybe it helps but it could be very well a game problem.
> Any related issue on the Overwatch forums?


Well, I think I figured it out. It's not the drivers...
I use a Plex server to share videos on my PC's HDD over the network, to Rokus and tablets.
It never gave me a problem before, but it seems that, at least when the kids are watching Rick and Morty (1080p HEVC rips), that's what was killing my PC.
I went to Task Manager and raised Overwatch's priority to High and the problem went away; back to Normal priority, chop fest.
I hope that's it, but earlier when it was occurring I closed Plex, among many other misc apps, to see if it was the culprit, and I was still impacted.
Then I was playing and my daughter started watching in the same room as me, so I could see the correlation.
Earlier they were in their own room watching it. Madness.
Damn you Rick, wubba lubba dub dub!
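The Task Manager trick above can be scripted so it survives game restarts. A minimal Windows-only sketch using the Win32 `SetPriorityClass` API through ctypes; the PID lookup is left out, and this is an illustration rather than a supported tool.

```python
# Script the Task Manager "Set priority -> High" click (Windows only).
# Priority-class constants are the standard values from winbase.h.
import ctypes
import sys

PRIORITY_CLASSES = {
    "idle": 0x00000040,
    "below_normal": 0x00004000,
    "normal": 0x00000020,
    "above_normal": 0x00008000,
    "high": 0x00000080,
    "realtime": 0x00000100,
}

PROCESS_SET_INFORMATION = 0x0200  # access right needed by SetPriorityClass

def priority_class(name):
    """Map a friendly name to its Win32 priority-class constant."""
    return PRIORITY_CLASSES[name.lower()]

def set_priority(pid, level="high"):
    """Raise a process's priority class, like Task Manager's right-click menu."""
    if sys.platform != "win32":
        raise OSError("SetPriorityClass is a Windows API")
    kernel32 = ctypes.windll.kernel32
    handle = kernel32.OpenProcess(PROCESS_SET_INFORMATION, False, pid)
    if not handle:
        return False
    try:
        return bool(kernel32.SetPriorityClass(handle, priority_class(level)))
    finally:
        kernel32.CloseHandle(handle)
```

You would still need the game's PID (e.g. from Task Manager or `tasklist`); "realtime" is best avoided since it can starve the rest of the system.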


----------



## Roland0101

Quote:


> Originally Posted by *TheGlow*
> 
> Well i think I figured it out. not the drivers...
> I use plex server to share videos on my pc's hdd to network, like roku's and tablets.
> Never gave me a problem before but seems to be at least when theyre watching Rick and Morty, 1080p HEVC rips, thats what was killing my PC.
> I went to task manager and raised priority on overwatch to high and it went away. back to Normal priority, chop fest.
> I hope thats it but earlier when it was occurring I closed plex among many other misc apps to see if that was the culprit and i was still impacted.
> Then i was playing and my daughter just started watching in the same room as me so I could see to correlation.
> Earlier they were in their own room watching it. Madness.
> Damn you Rick, Wubaluba dubdub!


Yea, the kids always make trouble.

Good that you figured it out.


----------



## TheGlow

Quote:


> Originally Posted by *Roland0101*
> 
> Yea the kids, always make trouble.
>
> Good that you figured it out.


Yeah, I went in the other room and the other kid was watching in there too. I'm fairly certain I've played while something was being shared, but the majority are older DivX or MP4 files. H.265, I don't know; maybe it needs more power, or it needs to be transcoded for the Rokus, not sure. But yes, both were watching at the same time, which I don't think has ever happened before.
Again, I'm puzzled, as I did shut it off earlier and that wasn't the trigger initially; unless it just throttled the CPU for a bit and didn't recover until later.
Still odd that the FPS counter would show 110+ but performance was god-awful.
Spinning would show tearing, but spamming the screenshot button didn't capture any tears.


----------



## Roland0101

Quote:


> Originally Posted by *MyNewRig*
> 
> If the ICs were okay and BIOS have a general problem then everyone would have the same results, but when some people on the same BIOS can reach 9000+Mhz while others can not run stable at 8000Mhz then the memory ICs are a hit and miss, indicating a mass production quality problem.


No, not everyone would get the same results, because there are variations with Samsung memory cards too. There are always variations.

Plus, how many times did I write this, here and over at the Dark Side? 7600 MHz memory clock = defective card = RMA. Yes, it's really that easy.
Quote:


> You could even be double-pissed for you are not only affected by that Micron crap but also you got an EVGA product and i am sure you heard about that hotspots and fire hazard issues on Pascal ACX cards, it is just crazy what they are doing with these cards, does lack of competition really result in so much negligence? i am beyond surprised to be honest.


Actually, EVGA has lots of competition: Asus, MSI, Zotac, GIGABYTE, etcetera.


----------



## MyNewRig

Quote:


> Originally Posted by *Roland0101*
> 
> No, not everyone would get the same results, because there are variations at Samsung Memory cards too. There are always variations.


The variation with Samsung *in general* (don't quote one odd guy who can't do +100 MHz with Samsung and use him as evidence) is whether you can OC to 9600 MHz or only 9200 MHz or so.

The variation with Micron is whether the chips are stable at rated specs at all; nothing comparable.

Even those rare Samsung chips that don't OC well don't checkerboard and crash the system; they just soft-crash. But that has been said a thousand times already as well.

Quote:


> Plus, how many times did I write that, here and over there at the Dark Side? 7600 MHz memory clock = defective card = RMA. Yes, it's really that easy.


It does not matter how many times you say it; I am not convinced, and I hold another opinion. This is a mass-production quality issue because it is common, not just one or two odd cases. Saying it ten thousand more times will not make it any more convincing to me, so just accept that people can disagree with you and stop repeating yourself.


----------



## THEROTHERHAMKID

What's the 1070 like for 1080p gaming?
I have a 1080 G1 in my main PC in the living room on my 4K TV,
but I'm thinking of a 1070 for my second PC on my 1080p monitor.


----------



## Majentrix

Gonna buy a second 1070 for SLI tomorrow


----------



## Mr-Dark

Quote:


> Originally Posted by *TheGlow*
> 
> I have MSI and didnt use the bios' released so far. This is still stock so behavior is understandable.


So MSI released a BIOS update??


----------



## Majentrix

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> What's the 1070 like for 1080 gaming ?
> I have a 1080 g1 for my main pc in the living room on my 4K tv
> But I'm thinking of a 1070 for my second pc on my 1080 monitor


Currently overkill for 1080p 60 Hz gaming; perfect for 1080p 144 Hz gaming.
If you buy one now, you're probably set for this generation.


----------



## THEROTHERHAMKID

Quote:


> Originally Posted by *Majentrix*
> 
> Currently overkill for 1080p 60hz gaming. Perfect for 1080p 144hz gaming.
> If you buy one now you're probably set for this generation.


OK, thanks for the help.
Or should I go for a 980 Ti?


----------



## Majentrix

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> Ok thanks for the help
> Or should I go for a 980ti?


A 980 Ti is roughly on par with a 1070. If you can find one for less than a 1070, then by all means go for it. Bear in mind that over the long run the 1070 will probably be the better choice, as it's going to receive more driver updates improving performance.


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> We don't know what they actually knew that was only a guess on my part. The new bios could have just been coincidental cause all the cards including the newer samsung ones made recently have come with the 26 bios as well. The micron change was 1 July. Humans plan around calender dates that stand out so 1st of the month could well be a likely cut over date for a new bios release too.
> 
> Getting all emotional and paranoid about this will not fix it or make it be resolved any faster. I am pretty sure that there has never been any intent to roll out something that will give them a headache to fix later but shuff happens..... The Support people should probably have dealt with it better but that assumes that the support guys actually know what they are talking about when in reality, they work in an outsourced call centre type environment, read off a script and look things up in a knowledge base database. Escalation procedures all tend to be not thought through very well.
> 
> That creates an issue when a new problem arises because there are no database entries to regurgitate. The other problem these companies have, is that the last thing a knowledgable and experienced IT guy wants to be doing is first level tech support.
> 
> 
> 
> Why are you so confident that the new BIOS fixed the power state problem? i see you keep spreading that information in every single site that has an article or people talking about the issue without any evidence, did you go through that EVGA BIOS update thread or comments on Guru3D?
> 
> The new BIOS simply just did what Nvidia and EVGA product manager Jacob said it will do, that is "improve the overclocking experience for certain users" , they increased the memory voltage and nothing more so people can have 100Mhz or so additional OC headroom, but the voltage/frequency switching algorithm still did not change, and as a result people are still getting the checkerboard artifacts of death if they don't lock the card in 3D mode or high power/voltage state.
> 
> Your proposed solution was to increase voltage -> then increase frequency -> then decrease frequency -> then decrease voltage, so that the memory does not find itself in a situation with high frequency accompanied by simultaneously low voltage, but this is not the case, checkerboard is still happening without locking 3D mode or voltage or running something like a browser in the background, even on the new BIOS.
> 
> The BIOS update is just a gimmick by Nvidia to camouflage the real issue from the market, one more evidence that Micron ICs are not stable across the board is that someone like TheGlow can do +800Mhz without any special tricks while others have to downclock their vRAM to 7600Mhz to run stable.
> 
> If the ICs were okay and BIOS have a general problem then everyone would have the same results, but when some people on the same BIOS can reach 9000+Mhz while others can not run stable at 8000Mhz then the memory ICs are a hit and miss, indicating a mass production quality problem.
> 
> The BIOS did not fix the power state switching issue at all, why are you so active in spreading misinformation like that when evidence suggests otherwise? i don't get it?

Other than assuming the fault is in the Micron chips, where is your evidence that the actual Micron silicon is faulty in any card, including your own? How have you eliminated faulty memory MOSFETs, inductors, voltage controllers, or PCB trace faults on your board? What analysis have you done comparing the numbers of faulty Samsung vs. Micron cards? For all you know, the RMA percentages could be similar, the only difference being that the Samsung-memory guys are not complaining about checkerboard artifacts and blaming the memory chips to the exclusion of anything else. You haven't done any of that. You have just jumped to a conclusion based on your sample of one and all the misinformation and assumptions drawn by people posting in forums without any scientific method.

I would say that not crashing with a +500 memory overclock on a properly working card was an improvement. It is not me spreading misinformation; it is you. I have confidence in the diagnosis because I can prove it by running the same experiment over and over again on different systems and getting the same results. If a number of cards, such as mine, can run above 9000 MHz with the workaround, it indicates that the problem is not the chip's core design prohibiting memory overclocks. If it were, no card could run at those sorts of speeds.

My WORKAROUND for a timing issue was to keep the voltage at a higher level at idle, so that the time to ramp up from idle to full load was reduced compared to ramping from the ultra-low voltage of the power-saver modes up to full voltage. Was it ideal? No, it wasn't. It also wasn't a final solution, and it certainly was not a workaround to fix intrinsically broken cards that crash at stock speeds. The solution for a card that will not run even at stock speeds is to RETURN THE CARD under warranty and get a replacement. That has been suggested to you on multiple occasions by a number of different people. If someone mods their card before they test it out, they only have themselves to blame; for all anyone knows, the modding itself may have broken the card.

If you push the card too far and the driver crashes and resets, it sometimes loses the higher-voltage idle-state settings and goes back to the lower-voltage state until you reset everything back to the workaround. If you miss resetting the workaround, you can crash the card again. Workarounds are not perfect, which is why I want this BIOS update to work as advertised: a permanent fix so I don't have to deal with this fiddling.

TheGlow's card does seem to be made from amazing pieces of silicon, but the rest of his rig will also have something to do with supporting those high clocks. What it is that makes his setup special, I can't say without seeing it.

At this point in time, I cannot have an opinion on this BIOS update being the absolute solution to anything wider than a single EVGA FTW card until I can install and try the one for my specific hardware; at the moment, for anything other than EVGA, the update doesn't exist in the public arena. If it ends up not providing the solution for me, I will keep lobbying Nvidia/MSI to work harder to fix it. I did flash the new EVGA FTW BIOS to my Gaming X, and the white-checkerboard/Video Scheduler BSOD crash did not occur under the same circumstances it did with the EVGA FTW .26 BIOS. It does introduce other performance limitations on an MSI card, such as a lower max power limit due to the differences in board design, so it is not a viable long-term swap for MSI cards, and it is not conclusive that it solves every other brand's problem. But the sample of one that I have so far looks promising; it gives me hope that there will be a solution to this specific problem.

Trying to form a mob to burn down Nvidia is not the solution. That approach ultimately achieves nothing except to get the Nvidia engineers, who could actually fix a demonstrable, repeatable problem, to dismiss everything as noise and put up barricades to defend themselves from emotional attacks. My approach is not fanboyism; it is pragmatism.


----------



## ucode

Quote:


> Originally Posted by *gtbtk*
> 
> My WORKAROUND to a timing issue was to keep the voltage higher level at idle so that the the time to ramp up from idle to full load was reduced compared to increasing voltage from the ultra low level to full voltage that you get in the power saver modes.


What voltage? Wasn't the workaround to use a fixed GPU clock to prevent P5 and P8?


----------



## asdkj1740

The new Micron BIOSes, not just from EVGA but also from Palit and Gainward, obviously work better on my card.
These new BIOSes certainly alleviate the checkerboard problem to some extent, but it isn't totally fixed.
I tested all of them on my card: I can still get the checkerboard if I raise the VRAM clock too much, like to 2400 MHz, but there's no more checkerboard from 2000 MHz to 2350 MHz.
I would say Nvidia and the AIC partners fixed the checkerboard problem in the VRAM clock range between 2000 MHz and 2300 MHz.
I don't think it's that they can't fix the problem at higher VRAM clocks like 2400 MHz or above; I guess they simply don't want users to overclock that much.

I still believe Samsung VRAM is better at overclocking (generally, of course), based on what I have seen.
However, I won't blame Micron now, given what I have seen with the new BIOSes.
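The procedure above (step the VRAM clock up and note where the checkerboard appears) can be sketched as a simple search. `apply_vram_clock` and `passes_stress_test` are hypothetical stand-ins for whatever OC tool and artifact check you actually use. Note the two MHz conventions in this thread: tools show the ~2000 MHz command clock, while GDDR5 is quad-pumped, so 2000 MHz corresponds to the 8000 MHz "effective" figure quoted elsewhere.

```python
# Sketch of the step-up stability test: raise the VRAM clock in fixed steps
# and report the highest setting that survives the stress test.
# apply_vram_clock() / passes_stress_test() are hypothetical stand-ins for
# your OC tool and your artifact check (e.g. watching for the checkerboard).

def effective_mhz(command_clock_mhz):
    """GDDR5 is quad-pumped: a 2000 MHz command clock is 8000 MHz effective."""
    return 4 * command_clock_mhz

def find_max_stable(apply_vram_clock, passes_stress_test,
                    start=2000, stop=2450, step=50):
    """Return the highest clock (MHz) that passes, or None if even start fails."""
    best = None
    for clock in range(start, stop + 1, step):
        apply_vram_clock(clock)
        if not passes_stress_test():
            break  # first failure: stop and keep the last good clock
        best = clock
    return best
```

With a card that starts checkerboarding at 2400 MHz, this reports 2350 MHz (9400 MHz effective), matching the range described above.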


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> Yeah me too, I feel so damn stupid for not keeping my GTX 980 Ti and falling for this GTX 1070 trap, i now returned it and have to stay without a GPU for God knows how long until there are decent enthusiast grade GPUs in the market, but how could i have known that Pascal would turn out to be the mess it is unless i can see into the future? it was always the case that newer is better, not this time!
> 
> You could even be double-pissed for you are not only affected by that Micron crap but also you got an EVGA product and i am sure you heard about that hotspots and fire hazard issues on Pascal ACX cards, it is just crazy what they are doing with these cards, does lack of competition really result in so much negligence? i am beyond surprised to be honest.


If I owned a 980 Ti like you, I wouldn't upgrade, since I prefer the 384-bit bus. The downsides are that H.265 (hybrid) isn't fully supported, and 28nm is less efficient, but it's mature.

Previously I really liked Hynix, since I used to be an AMD user, but now I've noticed Samsung is way more stable.

At the end of the day, I can guarantee a new Pascal refresh is coming. To avoid all the trouble, avoid Micron for God's sake and buy a GTX 1070/60/50 Ti with a Samsung VRAM chip for peace of mind.


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> Other than assume the fault is in Micron chips ......


You have not made any scientific analysis either, because you don't have the tools or means to do so. And if the manufacturer, with ten other partners with their labs, resources, equipment, and expertise, has not been able to figure it out already, then one individual with one card would be foolish to think he can do better. How would you analyze memory MOSFETs, inductors, or the voltage controller? You can't.

Starting a thread on the GeForce forums based on observational analysis of collective symptoms, while many others including myself have made the same independent observations, does not automatically make you an authority or an expert on the subject. But many people are taking your word for it regardless. So when you go around every site and forum spreading the message that everything is fine and that the BIOS is the final fix, and people then act on that information by buying the cards and end up disappointed, that is just misleading, and in that regard just as dishonest as the reviewers who review only Samsung-memory cards and keep totally quiet about the Micron switch.

Again, it is a question of roles. Is it your role to spend months and months analyzing products for Nvidia, down to the level of PCB traces? You say your time has high value and that you analyze and fix systems for companies for a living, right? Then how can you spend your days and nights following this and spreading information on every single site that discusses it? You say your goal is only to get your own single card fixed, yet you have not succeeded after more than three months of trying: you observed this in July, we are now approaching November, and even your case is not settled. So how can you go around telling others that things are fine or will be fine? I really don't get it; it does not make sense.

You keep lying to yourself by saying that I only tested one single card, but you know damn well that, unlike you, I tested four of them, every one from a different batch and different stock, purchased a few weeks apart. The two Samsung cards ran flawlessly, with memory OCs easily exceeding 9000 MHz without any sort of trickery, while the two Micron cards had a bunch of issues at stock settings and both crashed the system with a mild OC. Enough said; that is as far as I can go in my consumer role to draw conclusions, because I actually value my time too much to spend months lobbying Nvidia for an impossible software fix. Just don't do what these reviewers do and spread misleading information that gets others burnt in the end. That is all I am asking.


----------



## Vectorized

Quote:


> Originally Posted by *MyNewRig*
> 
> Yes voltage is locked at that level, with stock voltage being about 1.07v so the difference between maxing out the voltage slider or not is only 0.02v which is a very small difference, all what maxing voltage will give you is about extra 20-40Mhz boost or less power throttling by that much, i would just max the voltage slider to get as high clock as possible under load, some people report better OC without moving the voltage slider.
> 
> Just try it, no much difference and no risk, you will get about 2c to 3c degrees warmer with max voltage vs. stock voltage.


Thanks!

REP+


----------



## khanmein

I told you the new vBIOS can't solve the hardware limitation of the Micron chips. If it could, we would surely have seen a lot of YouTube tech reviewers mention it, but so far none have.

I'm looking forward to Steve Burke (Gamers Nexus) making a comparison. FYI, the AMD RX 480 PCI-E fix was also only partial: it reduced the power draw from the PCI-E slot, but it still exceeds the spec requirement.

(RX 480 Endurance Test on a Cheap Motherboard - Part 1)

It has been three months since that video. Steve Burke mentioned verbally in one of his videos (I can't remember which one, but you can ask him yourself if you don't believe me) that there have been no issues so far. But if there really are no issues at all, why is there no video for Part II? Think about that part for yourselves.


----------



## asdkj1740

Quote:


> Originally Posted by *khanmein*
> 
> i told u the new vbios can't solve the hardware limitation of micron. if they can solve, we sure have saw a lot youtube tech reviewer mention but so far none~
> 
> i'm looking forward to steve burke (gamers nexus) make a comparison.. FYI, the AMD RX 480 PCI-E also partially fix & reduce the power draw from PCI-E to the minimal but still exceed the requirement spec.
> 
> 
> 
> 
> 
> (RX 480 Endurance Test on a Cheap Motherboard - Part 1)
> 
> within the vid until now is 3 months, Steve Burke mentioned verbally on one of his vid but i can't remember which one but u can ask him yourself if u don't believe. he said so far no issue but how come no vid for part II if really no issue at all? this part u guys think yourself.


Actually, the Hardcore Overclocking YouTube channel just got a free RX 480 sample from XFX.
This card has six vcore phases, three supplied by the PCI-E slot and three from the 8-pin connector, exactly the same as the reference design.
The channel is going to do some hard mods on this card, so it's a great chance to see whether the motherboard can survive a crazy hard mod.


----------



## gtbtk

Quote:


> Originally Posted by *ucode*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> My WORKAROUND to a timing issue was to keep the voltage higher level at idle so that the the time to ramp up from idle to full load was reduced compared to increasing voltage from the ultra low level to full voltage that you get in the power saver modes.
> 
> 
> 
> What voltage? Wasn't the work around to use a fixed GPU clock to prevent P5 and P8?

No. If the card's idle voltage is above 0.800 V or so, then the memory doesn't crash.

Locking the voltage will also work around the problem, but that is using a sledgehammer to crack a walnut.
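Since the crashes trace back to the card dropping into the low-voltage P5/P8 idle states while a memory OC is applied, one way to check whether the workaround is holding is to poll the card's performance state. A sketch that shells out to `nvidia-smi -q -d PERFORMANCE`; the exact output layout varies between driver versions, so treat the parsing as an assumption.

```python
# Watch for the low-power P-states (P5/P8) that coincide with the
# checkerboard crashes when a memory OC is applied. Polls
# `nvidia-smi -q -d PERFORMANCE` and parses its "Performance State" line.
import re
import subprocess
import time

def parse_pstate(smi_output):
    """Pull 'P8' etc. out of a 'Performance State : P8' line, else None."""
    m = re.search(r"Performance State\s*:\s*(P\d+)", smi_output)
    return m.group(1) if m else None

def watch(interval=2.0, risky=("P5", "P8")):
    """Loop forever, warning whenever the card sits in a risky idle state."""
    while True:
        out = subprocess.run(["nvidia-smi", "-q", "-d", "PERFORMANCE"],
                             capture_output=True, text=True).stdout
        state = parse_pstate(out)
        if state in risky:
            print(f"warning: card idling in {state} with a memory OC applied")
        time.sleep(interval)
```

Run `watch()` in a background terminal while testing; if warnings appear right before a checkerboard event, that supports the P-state explanation.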


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> i told u the new vbios can't solve the hardware limitation of micron. if they can solve, we sure have saw a lot youtube tech reviewer mention but so far none~
> 
> i'm looking forward to steve burke (gamers nexus) make a comparison.. FYI, the AMD RX 480 PCI-E also partially fix & reduce the power draw from PCI-E to the minimal but still exceed the requirement spec.
> 
> 
> 
> 
> 
> (RX 480 Endurance Test on a Cheap Motherboard - Part 1)
> 
> within the vid until now is 3 months, Steve Burke mentioned verbally on one of his vid but i can't remember which one but u can ask him yourself if u don't believe. he said so far no issue but how come no vid for part II if really no issue at all? this part u guys think yourself.


With the GTX 970's partitioned 3.5 + 0.5 GB memory, Nvidia also claimed that a BIOS/driver fix would resolve the problem. How do you resolve the stuttering of physically partitioned memory via software? You can't, but Nvidia claimed it regardless: they released drivers that keep the card away from that 0.5 GB partition, which improved the situation a little, just like this Micron fix. The market responded by making the GTX 970 one of the best-selling GPUs of all time, so Nvidia learned that consumers are willing to accept mediocrity, and they did it again with the GTX 1070, cheaping out on memory components in the middle of the production cycle. It looks like many consumers are still willing to take it, and gtbtk is helping them justify their foolishness, which is just sad!
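For context on why a driver that merely avoids the slow partition only helps a little: by the widely reported figures from Nvidia's disclosure, the 970's 3.5 GB segment peaks around 196 GB/s and the 0.5 GB segment around 28 GB/s, and the two cannot be read in parallel. A back-of-envelope sketch (not a measurement) of what a full 4 GB sweep costs:

```python
# Back-of-envelope for the GTX 970's split memory: sweeping all 4 GB at the
# widely reported segment bandwidths (~196 GB/s fast, ~28 GB/s slow, not
# accessible in parallel) roughly halves effective bandwidth.

def effective_bandwidth(fast_gb=3.5, fast_gbps=196.0, slow_gb=0.5, slow_gbps=28.0):
    """Effective GB/s for a full sequential sweep across both segments."""
    total_gb = fast_gb + slow_gb
    sweep_seconds = fast_gb / fast_gbps + slow_gb / slow_gbps
    return total_gb / sweep_seconds
```

The slow half-gigabyte takes as long to read as the entire fast segment, so a 4 GB sweep lands around 112 GB/s, half the card's nominal 224 GB/s; that is why the stutter appears as soon as a game spills past 3.5 GB.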


----------



## khanmein

Quote:


> Originally Posted by *asdkj1740*
> 
> actually hardcore overclocking youtube channel just get the free rx480 sample from xfx.
> this card has six phases vcore and 3 of them are supplied by pcie and 3 of them are from 8pin connector, exactly the same as reference design.
> this channel is going to do some hard mod on this card, so it is a great chance to see can the motherboard survive with crazy hard mod


Yeah, actually the XFX GTR draws power exactly the same way as the reference PCB.

Even though he highly recommends XFX/HIS due to the voltmod potential etc., I recommend going for the ASUS RX 480.

He is interested in voltmods, and the video had more compliments than cons (obviously XFX was delighted); the cons he mentioned were only rough and indirect, hinting that the PCB shares the reference design.

This is why I can't see PowerColor or Gigabyte sending him a review sample. ASUS didn't send him one either; why would they need him when there are plenty of reviewers out there with more viewers?

I personally had a very bad experience with HIS cards!


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> With the GTX 970 partitioned memory 3.5 + 0.5 GB Nvidia also claimed that a BIOS/Drivers fix will resolve the problem, how do you resolve physically partitioned memory stuttering issues via software? you can't, but Nvidia claimed it regardless, they released drivers that make the card not go near that 0.5GB partition which improved the situation a little, just like this Micron fix, the market responded by making the GTX 970 one of the best selling GPUs of all times, so Nvidia learned that consumers are willing to accept mediocrity and they did it again with the GTX 1070, cheaping out on memory components in the middle of the production cycle, looks like many consumers are still willing to take it and gtbtk is helping them justify their foolishness which is just sad!


This world runs on routine. For example, my ex-prime minister was corrupt, and when his staff asked how to deal with the issue, he said: give our citizens three months and they will forget it.

Look at pastor Kong Hee, who manipulated church income for his beloved wife's indulgences. And the 3.5 GB thing: it took those who call themselves professional, unbiased content creators half a year to announce the issue. Shame on them again.


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Other than assume the fault is in Micron chips ......
> 
> 
> 
> You have not made any scientific analysis either for you don't have the tools or means to do so and if the developer with 10 other partners with their labs, resources, equipment and expertise are not able to figure it out already then one individual with one card would be foolish to think that they can do better, how would you analyze memory mosfets, inductors, voltage controller? you can't.
> 
> Starting a thread in GeForce forums based on observational analysis of collective symptoms while many others including myself have also had the same independent observations does not automatically make you an authority or an expert in the subject, but many people are taking your word for it regardless, so when you go around in all sites and forums spreading information that everything is fine and that the BIOS is the final fix and then people acting on this information by buying the cards and then end up being disappointed is just misleading, and in that regard just as dishonest as these reviewers who are only reviewing Samsung memory cards and keeping totally shut about the Micron switch.
> 
> Again it is a question of roles, is it your role to spend months and months analyzing products down to the PCB traces level for Nvidia? you say that your time has high value and that you analyze and fix systems for companies for a living, right? how can you spend your days and nights following this and spreading information in every single site that discusses it? you say that your goal is to only get your own single card fixed, you have not been successful in doing so yet after more than three months of trying, you observed this July, now we are approaching November and even your case is not settled yet, so how come you go round telling others that things are fine or will be fine? i don't really get it, it does not make sense.
> 
> You keep lying to yourself by saying that i only tested one single card, but you know damn well that unlike you i tested four of them, every single one was from a different batch and a different stock, purchased few weeks apart, the two Samsung cards ran flawlessly with memory OC easily exceeding 9000Mhz without any sort of trickery while the two Micron cards are having a bunch of issues at stock settings and both were crashing the system with a mild OC, enough said, that is as far as i can go in my consumer role to draw conclusions, because i actually value my time more than to spend months lobbying Nvidia to provide an impossible software fix, just don't do like these reviewers and spread misleading information that makes others get burnt at the end, that is all i am asking.
Click to expand...

I said experiment. I didn't say analysis.

You have been complaining that your card won't run at stock speeds for the last two months and blaming Micron memory for it. If it won't run at stock, then the problem is that your specific card is broken, not a universal problem with every card, although you would have everyone believe that it is.

No one is disputing that there is a problem with 1070s with Micron memory; there obviously is. The problem is your aggressive approach and your conclusion that the cards are all fatally flawed without any chance of a solution. If you feel that way, sell or return your cards and get something you are happy with.

Or is what makes you happy just being an internet troll?


----------



## asdkj1740

Quote:


> Originally Posted by *MyNewRig*
> 
> With the GTX 970 partitioned memory 3.5 + 0.5 GB Nvidia also claimed that a BIOS/Drivers fix will resolve the problem, how do you resolve physically partitioned memory stuttering issues via software? you can't, but Nvidia claimed it regardless, they released drivers that make the card not go near that 0.5GB partition which improved the situation a little, just like this Micron fix, the market responded by making the GTX 970 one of the best selling GPUs of all times, so Nvidia learned that consumers are willing to accept mediocrity and they did it again with the GTX 1070, cheaping out on memory components in the middle of the production cycle, looks like many consumers are still willing to take it and gtbtk is helping them justify their foolishness which is just sad!


Digital Foundry has a video about this problem, and I think its consequences are exaggerated.

1. It is a real 4GB of VRAM that can all be accessed, but at different speeds.
2. Nvidia's driver must be doing something to keep the first 3.5GB fully utilized first, and the result is not bad at all.
3. At 1080p without SLI, it is hard to notice the effects of the slow 0.5GB of VRAM (Digital Foundry's conclusion).


----------



## khanmein

Quote:


> Originally Posted by *asdkj1740*
> 
> Digital Foundry has a video about this problem, and I think its consequences are exaggerated.
> 
> 1. It is a real 4GB of VRAM that can all be accessed, but at different speeds.
> 2. Nvidia's driver must be doing something to keep the first 3.5GB fully utilized first, and the result is not bad at all.
> 3. At 1080p without SLI, it is hard to notice the effects of the slow 0.5GB of VRAM (Digital Foundry's conclusion).


Regarding that, Richard Leadbetter is way better than other tech reviewers. He often hints at the cons himself in his videos, but we need to look for ourselves.

Unfortunately, he didn't respond to my comment regarding the Micron fiasco.


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> I said experiment. I didn't say analysis.
> 
> You have been complaining that your card won't run at stock speeds for the last two months and blaming Micron memory for it. If it won't run at stock, then the problem is that your specific card is broken, not a universal problem with every card, although you would have everyone believe that it is.
> 
> No one is disputing that there is a problem with 1070s with Micron memory; there obviously is. The problem is your aggressive approach and your conclusion that the cards are all fatally flawed without any chance of a solution. If you feel that way, sell or return your cards and get something you are happy with.
> 
> Or is what makes you happy just being an internet troll?


What is this now? Didn't I say before that both of the Micron cards I had were having issues at stock, not just one? And that the two were from different batches and production runs? Why do you ignore the information communicated to you and keep repeating the same things over and over?

I did not say that there is no solution; I said that there is no solution in software. The solution is to use Samsung ICs or higher-grade Micron ICs, not to fool us with software fix promises, and certainly not a fix that takes more than three months to deliver. That time frame is not acceptable under any consumer law in the world; two to three weeks is the most this should take, which is not the case. You raised the issue in July, and we are at the end of October now. Did you get your fix yet?

Your willingness to accept being played like that by Nvidia does not make it a normal or legally acceptable situation. With the time and effort I have put into replacing GPUs, handling the logistics and finances of the process, and contacting the manufacturers politely about it several times, the least I can do is be aggressive about it, and I am actually trying to be as civil as possible.

A troll? How is that for a scientific argument? Just because I call you out on spreading misleading information all over the place, just like these corrupt reviewers, I am automatically a troll in your eyes? Resorting to an ad hominem attack when you run out of logical arguments makes you the troll, or even worse.

https://en.wikipedia.org/wiki/Ad_hominem


----------



## MyNewRig

Quote:


> Originally Posted by *asdkj1740*
> 
> Digital Foundry has a video about this problem, and I think its consequences are exaggerated.
> 
> 1. It is a real 4GB of VRAM that can all be accessed, but at different speeds.
> 2. Nvidia's driver must be doing something to keep the first 3.5GB fully utilized first, and the result is not bad at all.
> 3. At 1080p without SLI, it is hard to notice the effects of the slow 0.5GB of VRAM (Digital Foundry's conclusion).


The problem is very real. I owned the GTX 970 for a few weeks before returning it due to stuttering; GTA V at 1440p was a stuttery mess even with the fixed drivers, and I could not stand it. Maybe at 1080p it was somewhat acceptable if you watched your settings, but in many games, if you max out your settings and exceed the fast 3.5GB partition, you are guaranteed a stuttering experience. If you watch your settings and keep VRAM usage around the 3GB mark, it runs fine.

Did you actually try the card? Did you have a different experience?


----------



## khanmein

@MyNewRig & @gtbtk, don't fight with each other.

Now we should gather all our forces and call out Nvidia, its board partners, and the tech reviewers, e.g. linustechtips (gimmick benchmarks), jayztwocents (showing off his Nissan 370Z), paul h/w (he only knows how to drink beer), bitwit (kyle awesomesauce but not awesome at all, that's why it was removed), randomfrankp (everything RGB), etc. A lot more I'm too lazy to mention...


----------



## F3niX69

Why is it taking manufacturers so long to release the vBIOS? Asus (which is the one I care about) hasn't said anything for weeks.


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> If you feel that way, sell or return your cards and get something you are happy with.


Ah, and didn't you notice that I said a couple of days ago that my card was going back to the store, and then a few posts above that it has already been returned? Are you only selectively reading what is being posted and ignoring what doesn't suit you? Or are you so busy following ten other forums, while you continue to spread the word that everything is fine, that you can't keep track of what is being said anymore?

I understand that you are stuck with your card and that in your country you have no right of return at this point; you badly need your issue fixed, and I can appreciate that. But you don't have to keep going around encouraging everyone else to get stuck with a bad product like you, just to feel better.


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> @MyNewRig & @gtbtk, don't fight with each other.
> 
> Now we should gather all our forces and call out Nvidia, its board partners, and the tech reviewers, e.g. linustechtips (gimmick benchmarks), jayztwocents (showing off his Nissan 370Z), paul h/w (he only knows how to drink beer), bitwit (kyle awesomesauce but not awesome at all, that's why it was removed), randomfrankp (everything RGB), etc. A lot more I'm too lazy to mention...


I am not fighting; he is the one who just called me a troll! How is that for a scientific approach?


----------



## khanmein

Quote:


> Originally Posted by *F3niX69*
> 
> Why is it taking manufacturers so long to release the vBIOS? Asus (which is the one I care about) hasn't said anything for weeks.


I might be an amateur or a noob, but let me tell you that editing a vBIOS is as simple as ABC. Why it took them so long is the duration of testing, plus the Micron chips have already reached their limit, just like the GTX 970's 3.5 GB + 0.5 GB; same story.

Return your card and request Samsung; if they can't provide it, then don't buy.


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> I am not fighting; he is the one who just called me a troll! How is that for a scientific approach?


By the way, I already commented to PCPER; if they really do discuss the Micron thing, then I'll pledge some amount to the church. A few bucks is more than I can afford, since I haven't purchased a GTX 1070 yet.


----------



## asdkj1740

Quote:


> Originally Posted by *MyNewRig*
> 
> The problem is very real. I owned the GTX 970 for a few weeks before returning it due to stuttering; GTA V at 1440p was a stuttery mess even with the fixed drivers, and I could not stand it. Maybe at 1080p it was somewhat acceptable if you watched your settings, but in many games, if you max out your settings and exceed the fast 3.5GB partition, you are guaranteed a stuttering experience. If you watch your settings and keep VRAM usage around the 3GB mark, it runs fine.
> 
> Did you actually try the card? Did you have a different experience?


I have not tried 1440p, so I can't speak to that.
But the main point is: are you sure the stuttering / FPS drops / frametime spikes are all caused by the slow 0.5GB problem?
The DF video shows that whenever the 970 has "issues" at 1080p, the 980 suffers to the same extent.
If you max out settings at 1440p, I don't think a single 970 can give you a good gaming experience, even with a full-speed 4GB.
The DF video shows that at 1440p with maxed-out settings, a single 970 will start to suffer from the slow 0.5GB problem; that is indeed true in some demanding games like ACU.


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> Return your card and request Samsung; if they can't provide it, then don't buy.


Exactly: this, or you don't get to take my money. Simple. That is the fix I am calling for, not a gimmicky political BIOS update aimed at silencing the market until the issue is forgotten and buried.

Three months to make a BIOS tweak? And it is not even a profound tweak, just a voltage adjustment. What kind of a stupid game is that?


----------



## khanmein

Quote:


> Originally Posted by *asdkj1740*
> 
> I have not tried 1440p, so I can't speak to that.
> But the main point is: are you sure the stuttering / FPS drops / frametime spikes are all caused by the slow 0.5GB problem?
> The DF video shows that whenever the 970 has "issues" at 1080p, the 980 suffers to the same extent.
> If you max out settings at 1440p, I don't think a single 970 can give you a good gaming experience, even with a full-speed 4GB.
> The DF video shows that at 1440p with maxed-out settings, a single 970 will start to suffer from the slow 0.5GB problem; that is indeed true in some demanding games like ACU.


Yeah, playing GTA V & Tomb Raider 2013 I got stuttering at 1440p 60 Hz, but if you lower the settings to around medium, or turn off all the AA and don't let VRAM usage go above 3.2 GB, then the stuttering is gone. (I don't know how to explain it clearly.)

FYI, I play NBA & FIFA more often than triple-A titles. This is one of the reasons I want to buy a GTX 1070!


----------



## QPSS

Quote:


> Originally Posted by *Majentrix*
> 
> Currently overkill for 1080p 60hz gaming. Perfect for 1080p 144hz gaming.
> If you buy one now you're probably set for this generation.


Not at all.
With max details in RotTR, for example, I can get it down to 35 FPS with the VRAM maxed. There are other games like this too, if you want the best quality (especially AA).


----------



## HaiderGill

Quote:


> Originally Posted by *weskeh*
> 
> I don't have any bad words about my Zotac AMP Extreme 1070, so there's that.


Yeah they are good cards too...


----------



## MyNewRig

Quote:


> Originally Posted by *asdkj1740*
> 
> I have not tried 1440p, so I can't speak to that.
> But the main point is: are you sure the stuttering / FPS drops / frametime spikes are all caused by the slow 0.5GB problem?
> The DF video shows that whenever the 970 has "issues" at 1080p, the 980 suffers to the same extent.
> If you max out settings at 1440p, I don't think a single 970 can give you a good gaming experience, even with a full-speed 4GB.
> The DF video shows that at 1440p with maxed-out settings, a single 970 will start to suffer from the slow 0.5GB problem; that is indeed true in some demanding games like ACU.


Low FPS and stuttering are not the same. The same settings on my previous GTX 780 gave low FPS, but it was smooth; after upgrading to the GTX 980 Ti, I could do 4K, also with low FPS but no stuttering. The GTX 970's stuttering was correlated with the amount of VRAM being utilized: 720p was very smooth, 1080p with conservative settings was also acceptable, 1080p Ultra got stuttery, and 1440p was a stuttering mess.

My observation is that the card reports having 4GB of VRAM, and some game engines try to utilize all of it, which gets stuttery. With the 780 the situation was better because the card reported only 3GB of VRAM, so most game engines restricted themselves to that amount, and the experience was smooth at high and low FPS alike.

But from personal experience, the problem was very real and annoying.
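The reported-VRAM observation above can be sketched with some toy numbers. This is purely illustrative: the 90% margin and the `streaming_budget` helper are assumptions made up for the sketch, not any real engine's logic.

```python
# Sketch of the budgeting issue described above: an engine that sizes its
# texture-streaming pool from the *reported* VRAM has no way of knowing that
# the last 0.5 GB of a GTX 970 is the slow segment. The 90% margin and the
# helper are illustrative assumptions, not real engine behavior.

FAST_PARTITION_MB = 3584   # GTX 970: full-speed 3.5 GB segment
REPORTED_VRAM_MB = 4096    # what the driver reports to the engine

def streaming_budget(reported_mb, margin=0.9):
    """Hypothetical engine heuristic: claim ~90% of reported VRAM."""
    return int(reported_mb * margin)

budget = streaming_budget(REPORTED_VRAM_MB)
overflow = max(0, budget - FAST_PARTITION_MB)  # MB landing in the slow segment
print(f"budget={budget} MB, spills {overflow} MB into the slow partition")
```

With the same heuristic, a card that reported only 3.5 GB would give `overflow = 0`, which matches the observation that the 3GB-reporting GTX 780 stayed smooth.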


----------



## Majentrix

Funny, apparently Gainward cards are meant to have Micron memory and are affected by the bug. GPU-Z says my Gainward card has Samsung memory.

Guess I got a good batch?


----------



## asdkj1740

Quote:


> Originally Posted by *QPSS*
> 
> Not at all.
> With max details in RotTR, for example, I can get it down to 35 FPS with the VRAM maxed. There are other games like this too, if you want the best quality (especially AA).


A 1080 is fine for 1080p 144Hz; I don't think a 1070 is capable of it.


----------



## QPSS

Quote:


> Originally Posted by *MyNewRig*
> 
> You have not made any scientific analysis either for you don't have the tools or means to do so and if the developer with 10 other partners with their labs, resources, equipment and expertise are not able to figure it out already then one individual with one card would be foolish to think that they can do better, how would you analyze memory mosfets, inductors, voltage controller? you can't.
> 
> Starting a thread in GeForce forums based on observational analysis of collective symptoms while many others including myself have also had the same independent observations does not automatically make you an authority or an expert in the subject, but many people are taking your word for it regardless, so when you go around in all sites and forums spreading information that everything is fine and that the BIOS is the final fix and then people acting on this information by buying the cards and then end up being disappointed is just misleading, and in that regard just as dishonest as these reviewers who are only reviewing Samsung memory cards and keeping totally shut about the Micron switch.
> 
> Again it is a question of roles, is it your role to spend months and months analyzing products down to the PCB traces level for Nvidia? you say that your time has high value and that you analyze and fix systems for companies for a living, right? how can you spend your days and nights following this and spreading information in every single site that discusses it? you say that your goal is to only get your own single card fixed, you have not been successful in doing so yet after more than three months of trying, you observed this July, now we are approaching November and even your case is not settled yet, so how come you go round telling others that things are fine or will be fine? i don't really get it, it does not make sense.
> 
> You keep lying to yourself by saying that i only tested one single card, but you know damn well that unlike you i tested four of them, every single one was from a different batch and a different stock, purchased few weeks apart, the two Samsung cards ran flawlessly with memory OC easily exceeding 9000Mhz without any sort of trickery while the two Micron cards are having a bunch of issues at stock settings and both were crashing the system with a mild OC, enough said, that is as far as i can go in my consumer role to draw conclusions, because i actually value my time more than to spend months lobbying Nvidia to provide an impossible software fix, just don't do like these reviewers and spread misleading information that makes others get burnt at the end, that is all i am asking.


I have already asked you for some evidence, to see if I have the same problem. You ignored me. Yet some other people were helpful, and we didn't really see a difference in frametimes compared to a Samsung card.


----------



## MyNewRig

Quote:


> Originally Posted by *Majentrix*
> 
> Funny, apparently Gainward cards are meant to have Micron memory and are affected by the bug. GPU-Z says my Gainward card has Samsung memory.
> 
> Guess I got a good batch?


Reports such as yours keep surfacing every day. When did you buy the card? It could be that some vendors silently switched back to Samsung, just like they silently switched to Micron in the past, to save themselves the headache or a potential sales halt, since more and more people are becoming aware of this every day.


----------



## HaiderGill

On the Micron memory issue, I would go to your local trading standards office; a product, when sold, has to be fit for purpose. It's amazing the press reaction is so muted compared to the RX 480 power non-issue...


----------



## MyNewRig

Quote:


> Originally Posted by *QPSS*
> 
> I have already asked you for some evidence, to see if I have the same problem. You ignored me. Yet some other people were helpful, and we didn't really see a difference in frametimes compared to a Samsung card.


Yes, I apologize for that. I was busy and exhausted from testing and had to take my card out of the system to return it, because my return period was coming to an end. I saw others provided you with frametime graphs that looked okay, so I thought that settled the issue, no? At least some Micron cards are getting consistent frametimes, so my messed-up frametimes were irrelevant, because the problem does not seem to affect every single card.


----------



## asdkj1740

Quote:


> Originally Posted by *MyNewRig*
> 
> Low FPS and stuttering are not the same. The same settings on my previous GTX 780 gave low FPS, but it was smooth; after upgrading to the GTX 980 Ti, I could do 4K, also with low FPS but no stuttering. The GTX 970's stuttering was correlated with the amount of VRAM being utilized: 720p was very smooth, 1080p with conservative settings was also acceptable, 1080p Ultra got stuttery, and 1440p was a stuttering mess.
> 
> My observation is that the card reports having 4GB of VRAM, and some game engines try to utilize all of it, which gets stuttery. With the 780 the situation was better because the card reported only 3GB of VRAM, so most game engines restricted themselves to that amount, and the experience was smooth at high and low FPS alike.
> 
> But from personal experience, the problem was very real and annoying.


Yes, they are not the same; that is why I used "//". Stuttering is more related to unstable frametimes.
I would say Kepler is not a good example to compare with Maxwell cards; Kepler has issues of its own.....
So 970 vs. 980 is a better comparison for seeing how that 0.5GB affects actual gaming.
My 970 used to play Middle-earth: Shadow of Mordor at 1080p with maxed-out settings, all 4GB of VRAM fully utilized, and I did not feel any stuttering.
Maybe you should check out this video; there are other videos comparing the 970 and 980 on that channel.
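Since the thread keeps distinguishing low FPS from unstable frametimes, here is a minimal sketch of how a frametime log separates the two. All frametime values below are made-up illustrative numbers; a real log would come from a capture tool such as FRAPS or PresentMon.

```python
# Sketch: quantifying "stutter" from a frametime log (milliseconds per frame).
# Average FPS can look fine while individual frames spike; comparing the
# 99th-percentile frametime against the average exposes that.

def percentile(samples, pct):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(samples)
    rank = max(0, min(len(ordered) - 1, round(pct / 100 * (len(ordered) - 1))))
    return ordered[rank]

def stutter_report(frametimes_ms):
    avg = sum(frametimes_ms) / len(frametimes_ms)
    p99 = percentile(frametimes_ms, 99)
    spikes = sum(1 for t in frametimes_ms if t > 2 * avg)  # frames >2x average
    return {"avg_fps": 1000.0 / avg, "p99_ms": p99, "spike_frames": spikes}

# A smooth ~60 FPS run vs. one with the same average but periodic spikes:
smooth = [16.7] * 100
stuttery = [14.0] * 95 + [68.0] * 5  # same average FPS, ugly 99th percentile

print(stutter_report(smooth))
print(stutter_report(stuttery))
```

Both runs report roughly the same average FPS, but the stuttery run's 99th-percentile frametime and spike count give it away, which is why frametime graphs, not FPS counters, are the right evidence in this argument.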


----------



## Arturo.Zise

Quote:


> Originally Posted by *Majentrix*
> 
> Funny, apparently Gainward cards are meant to have Micron memory and are affected by the bug. GPU-Z says my Gainward card has Samsung memory.
> 
> Guess I got a good batch?


Yeah, my 2-week-old Gainward Phoenix has Samsung RAM. About to flash the Golden Sample BIOS and try some OC gaming at 4K.


----------



## khanmein

Quote:


> Originally Posted by *asdkj1740*
> 
> Yes, they are not the same; that is why I used "//". Stuttering is more related to unstable frametimes.
> I would say Kepler is not a good example to compare with Maxwell cards; Kepler has issues of its own.....
> So 970 vs. 980 is a better comparison for seeing how that 0.5GB affects actual gaming.
> My 970 used to play Middle-earth: Shadow of Mordor at 1080p with maxed-out settings, all 4GB of VRAM fully utilized, and I did not feel any stuttering.
> Maybe you should check out this video; there are other videos comparing the 970 and 980 on that channel.


1080p with the 970 was no issue, but once you upgrade to 1440p the stuttering is there, yet some weird joker claimed there was no stuttering at all..

I gave my Viewsonic VX2370Smh-LED to my bro for his PS4 & bought a Dell U2717D for better image quality.


----------



## QPSS

Quote:


> Originally Posted by *asdkj1740*
> 
> i have not tried on 1440p so i cant say that.
> but the main point is that are you sure that stutterings//fps dropped//frame time raised are all about the 0.5g low speed problem??
> df video shows that whenever 970 has "issues", actually 980 also suffer the same extent, under 1080p.
> if you maxed out settings and under 1440p, i dont think a single 970 is capable to give you good gaming experience in nature even 970 is full 4g full speed.
> df video shows under 1440p and maxed out settings, single 970 will start to suffer by that 0.5g low speed problem, indeed that is true on some crazy games like acu.


I had a 970 too. It stuttered much more than my friend's 980, especially when driving up the hills. We never measured the frametimes, because it really was that obvious. I saw some tests claiming the 970 is no different from a 980, but they didn't measure frametimes. And I don't care what they claimed, because I saw it myself, and not only in GTA V; that was the main reason I bought a 1070 this early.


----------



## asdkj1740

Quote:


> Originally Posted by *khanmein*
> 
> 1080p with the 970 was no issue, but once you upgrade to 1440p the stuttering is there, yet some weird joker claimed there was no stuttering at all..
> 
> I gave my Viewsonic VX2370Smh-LED to my bro for his PS4 & bought a Dell U2717D for better image quality.


That's why I said the effects of the 0.5GB problem are exaggerated.
But I am not saying there are no such effects at all.


----------



## Majentrix

Quote:


> Originally Posted by *MyNewRig*
> 
> Reports such as yours keep surfacing every day. When did you buy the card? It could be that some vendors silently switched back to Samsung, just like they silently switched to Micron in the past, to save themselves the headache or a potential sales halt, since more and more people are becoming aware of this every day.


I bought mine on July 9 in Australia, so an early batch.


----------



## MyNewRig

Quote:


> Originally Posted by *asdkj1740*
> 
> That's why I said the effects of the 0.5GB problem are exaggerated.
> But I am not saying there are no such effects at all.


Exaggerated or not, that is a subjective matter in the eye of the beholder. Did the problem exist? Yes. Did it affect every game and every resolution the same? No. Was it very annoying for some and okay for others? Yes.

But like QPSS said, for me it was very obvious, in-your-face stuttering; GTA V at 1440p, or Ultra at 1080p, was almost unplayable for me, especially when driving, because of that issue.


----------



## MyNewRig

Quote:


> Originally Posted by *Majentrix*
> 
> I bought mine on July 9 in Australia, so an early batch.


Okay, that explains it.


----------



## khanmein

^^ My Leadtek PCB is the same as reference and supports NV debug mode too. Right now I can't test any new driver, because one of my fans stopped spinning; I sent it for RMA on 17th Sept and it is still stuck at the supplier.

I don't need to play Ultra, but as long as it supports High + some AA for triple-A titles, I'm content with 60 Hz.


----------



## asdkj1740

Quote:


> Originally Posted by *MyNewRig*
> 
> Exaggerated or not, that is a subjective matter in the eye of the beholder. Did the problem exist? Yes. Did it affect every game and every resolution the same? No. Was it very annoying for some and okay for others? Yes.
> 
> But like QPSS said, for me it was very obvious, in-your-face stuttering; GTA V at 1440p, or Ultra at 1080p, was almost unplayable for me, especially when driving, because of that issue.


Yes, it is all about personal feelings.
So I suggest looking at reviews/testing done by others, like these YouTubers, as they have more resources to do an apples-to-apples comparison. These videos are not going to excuse the problem, but they will show you to what extent the 0.5GB problem really hurts.

There is one thing we need to know: knowing about the 0.5GB problem or not has zero impact on how the card operates.....


----------



## MyNewRig

Quote:


> Originally Posted by *asdkj1740*
> 
> Yes, it is all about personal feelings.
> So I suggest looking at reviews/testing done by others, like these YouTubers, as they have more resources to do an apples-to-apples comparison. These videos are not going to excuse the problem, but they will show you to what extent the 0.5GB problem really hurts.
> 
> There is one thing we need to know: knowing about the 0.5GB problem or not has zero impact on how the card operates.....


But why re-open the issue when it has been closed to most people's satisfaction on multiple levels? A few remedies were provided: first, if I remember correctly, the GTX 970's price was reduced after the issue blew up; also, retailers in my country gave everyone who had purchased according to the old spec sheet, before the issue was officially announced, the opportunity to return their cards for a full refund even after the return period had passed; and eventually, those who knowingly kept their cards after being made aware of the situation got a $30 compensation under the class-action settlement.

So probably everyone had the opportunity to seek satisfaction one way or another: buy cheaper, return for a full refund, or get compensated for their trouble.


----------



## QPSS

Quote:


> Originally Posted by *asdkj1740*
> 
> yes, it is all about personal feelings.
> so i suggest to see some reviews//testings done by others like these youtubers as they have more resources to do a apple to apple comparison. these videos are not going to justify the problem but is going to show you to what extent that 0.5g problem will really suffer.
> 
> there is one things we need to know: knowing 0.5g problem or not has zero impact on the card operation.....


Well, of course it doesn't have any impact on the card's operation, since that is normal behavior for a gimped card like that. So I don't know what you mean by that sentence.
The problem is there, and I am extremely happy I got rid of it with my 1070. Now, seeing it use its full 8 GB of memory in quite a few games, and realizing 3.5 GB/4 GB was far, far too little, I am even more convinced.


----------



## asdkj1740

Quote:


> Originally Posted by *MyNewRig*
> 
> But why re-open the issue when it has been closed to most people's satisfaction on multiple levels? A few remedies were provided: first, if I remember correctly, the GTX 970's price was reduced after the issue blew up; also, retailers in my country gave everyone who had purchased according to the old spec sheet, before the issue was officially announced, the opportunity to return their cards for a full refund even after the return period had passed; and eventually, those who knowingly kept their cards after being made aware of the situation got a $30 compensation under the class-action settlement.
> 
> So probably everyone had the opportunity to seek satisfaction one way or another: buy cheaper, return for a full refund, or get compensated for their trouble.


I am the one who can't be satisfied... no refund, no price reduction, no compensation, not even eligible for the US lawsuit even though I bought the card from US Amazon... I live in a hell of a country.
Closing the issue or not won't stop me from looking at what happened inside, because it interests me. As I said, knowing the truth or not won't affect how the card operates, so we can still blame every problem on that 0.5GB if it makes us happier.

One thing that will never close: Nvidia.......


----------



## ucode

Quote:


> Originally Posted by *gtbtk*
> 
> No. If the card's idle voltage is above 0.800 V or so, then the memory doesn't crash.
> 
> Locking the voltage will also work around that problem, but it is using a sledgehammer to crack a walnut.


Sorry, I'm missing something here. When you say idle voltage, do you mean you are idling at GPU 139MHz / Mem 101MHz with 0.8V?


----------



## khanmein

@Mjhieu received Samsung, no issues. Like I said again and again, Micron is really crippled.

https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/41/


----------



## shadowrain

All this talk of Samsung vs. Micron and conspiracy theories... think about this: maybe it was Samsung themselves who were hoarding or jacking up the prices of their GDDR5. The 1070's Samsung-to-Micron switch happened just before the advent of the iPhone 7 and Google Pixel, the increased demand for RAM chips in these phones, and the Note 7 fiasco.

Numerous sites have noted that the increased DDR4 demand from phones, plus the exploding Note 7, will cause Samsung to increase prices on their phones, RAM, and SSDs, GDDR5 included. Prices of DDR4 in Asian countries, including mine, have been increasing steadily for the past two months; a 5-10% increase as of now.

Now, back to Micron: it is a BIOS voltage issue, not resolved by the AIBs before the launch of the Micron cards, that made *SOME*, not *ALL*, but *SOME* Micron cards unstable at stock.

And as proof, and for those asking for a Micron 1070 review, here is one: http://asuswrt.net/2016/09/24/zotac-gtx1070-mini-review/ Stable at stock and memory stable at +100MHz (8400 effective). You know why? Because this company tests their cards one by one in Heaven and Firestrike.




Just my $0.02.
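The per-card screening described above (step the memory offset up, run Heaven or Firestrike, watch for artifacts) can be sketched as a simple search loop. `check` and `find_max_stable_offset` are stand-ins invented for this sketch; the real check step, launching a benchmark and inspecting for artifacts or crashes, is simulated here with a lambda.

```python
# Sketch of per-card memory-OC screening: raise the memory offset in steps
# until a stability check fails, then report the last passing offset.
# In reality `check` would launch Heaven/Firestrike at that offset and
# watch for artifacts or driver crashes; here it is simulated.

def find_max_stable_offset(check, start=0, step=25, limit=600):
    """Return the highest offset (MHz) up to `limit` that still passes `check`."""
    best = start
    offset = start
    while offset + step <= limit:
        offset += step
        if not check(offset):   # first failure ends the search
            break
        best = offset
    return best

# Simulated card that starts artifacting past +100 MHz:
simulated_card = lambda offset: offset <= 100
print(find_max_stable_offset(simulated_card))  # prints 100
```

Note this linear ramp would also walk straight into the "memory hole" behavior mentioned earlier in the thread, where a higher offset can pass while costing performance, so a real screen should also record a benchmark score at each step, not just pass/fail.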


----------



## khanmein

Quote:


> Originally Posted by *shadowrain*
> 
> All this talk of Samsung vs. Micron and conspiracy theories... think about this: maybe it was Samsung themselves who were hoarding or jacking up the prices of their GDDR5. The 1070's Samsung-to-Micron switch happened just before the advent of the iPhone 7 and Google Pixel, the increased demand for RAM chips in these phones, and the Note 7 fiasco.
> 
> Numerous sites have noted that the increased DDR4 demand from phones, plus the exploding Note 7, will cause Samsung to increase prices on their phones, RAM, and SSDs, GDDR5 included. Prices of DDR4 in Asian countries, including mine, have been increasing steadily for the past two months; a 5-10% increase as of now.
> 
> Now, back to Micron: it is a BIOS voltage issue, not resolved by the AIBs before the launch of the Micron cards, that made *SOME*, not *ALL*, but *SOME* Micron cards unstable at stock.
> 
> And as proof, and for those asking for a Micron 1070 review, here is one: http://asuswrt.net/2016/09/24/zotac-gtx1070-mini-review/ Stable at stock and memory stable at +100MHz (8400 effective). You know why? Because this company tests their cards one by one in Heaven and Firestrike.
> 
> 
> 
> 
> Just my $0.02.


oh yeah, then you must make sure you purchase a Micron card & show us the same result you showed us. thanks.

let me remind you that Heaven & Firestrike don't represent daily gaming. I don't wanna see proof from a review; I prefer to see an end user's actual benchmark.. come show us yours... where's your Micron card that works w/o any issue???

the AIB MOSFETs are the same as reference; my Leadtek PCB uses this type of MOSFET too. I don't believe there's a stable one with Micron, because a lot of users never comment on forums & don't even know what artifacts or checkerboarding are.


----------



## shadowrain

Quote:


> Originally Posted by *khanmein*
> 
> oh yeah, then you must make sure you purchase a Micron card & show us the same result you showed us. thanks.


How did you know that I don't have Micron? Only MyNewRig aka GamerSX knows that. It shows these trolls are one and the same, spreading and preaching that 1070 Microns are the devil. I don't need to show you that 1070 Microns are stable at stock or at +100; many in this and numerous other forums have already attested to that fact. The only Microns I see with flaws, and with evidence, are ASUS and MSI. All other AIBs have good records on their Microns at stock.

So you, my friend, just like MyNewRig, welcome to my blocklist. It makes this thread much fresher without this Micron-is-the-devil nonsense.

PS: Whether it be Samsung or Micron, I'm getting either a Zotac 1070 AMP Mini or AMP Edition to SLI with my AMP Extreme. It's my 1st Zotac and I'm very happy, and I'm not paying the premium for the big-name brands who don't even focus on making their GPUs work properly.


----------



## MyNewRig

Quote:


> Originally Posted by *shadowrain*
> 
> All these talks of Samsung vs Micron and conspiracy theories... think about this... maybe it was Samsung themselves who were hoarding or jacking up the prices of their GDDR5. The 1070 Samsung-to-Micron switch happened just before the advent of the iPhone 7, the Google Pixel, the increased demand for RAM chips in these phones, and the Note 7 fiasco.
> 
> Numerous sites have noted that the increased DDR4 demand from phones, plus the exploding Note 7, will cause Samsung to increase prices on their phones, RAM and SSDs. GDDR5 included. Prices of DDR4 in Asian countries, including mine, have been increasing steadily for the past 2 months. A 5-10% increase as of now.
> 
> Now back to Micron: it is a BIOS voltage issue, not resolved by the AIBs before the launch of the Micron cards, that made *SOME*, not *ALL*, but *SOME* Micron cards unstable at stock.
> 
> And as proof, and also for those asking for a Micron 1070 review, here is one: http://asuswrt.net/2016/09/24/zotac-gtx1070-mini-review/ Stable at stock, and memory stable at +100MHz (8400 effective). You know why? Because this company tests their cards 1 by 1 in Heaven and Firestrike.
> 
> 
> 
> 
> Just my $0.02.


You know what, most of the people who talk like you are just making theoretical conclusions. Go try both cards side by side, or one after the other, and come back and tell us what you think; you will know then how misinformed you are.

Here is another guy who tried a Samsung card after having used a Micron card for a while; he got the same experience as me, who also tried both cards. Check what he says:

https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/post/5003274/#5003274

You just contradicted yourself in your post: how can it be a BIOS issue when, on the same BIOS, some cards are stable at stock and some have great OC headroom while many are unstable at stock? If the ICs were okay and the problem was in the BIOS, then everyone would have the same experience.

Pricing and availability is not an excuse for using low-grade memory components. Samsung made the chips more expensive? So be it, we are already paying a premium and margins are huge on these cards; they don't have to screw us all to improve their margins by a few pennies. Stock not available? Fine, simply switch to GDDR5X and clock it lower.

And in all cases, be transparent and communicative; don't keep customers guessing what is going on for months. We have every right to come up with all sorts of conspiracy theories when we are not being given any information. Who can blame us? Do you have any better information for us? You are just guessing as well.


----------



## khanmein

Quote:


> Originally Posted by *shadowrain*
> 
> How did you know that I don't have micron? Only MyNewRig aka GamerSX knows that. It shows these trolls are one and the same. Spreading, preaching that 1070 Microns are the devil. I don't need to show you that 1070 Microns are stable at stock or at +100, many in this and numerous forums have already attested to that fact. Only Microns I see with flaws and with evidence are ASUS and MSI. All other AIB's have good records on their microns at stock.
> 
> So you my friend, just like MyNewRig, welcome to my blocklist. Makes this thread much fresher without this Micron is the devil nonsense.
> 
> PS: Whether it be Samsung or Micron, I'm getting either a Zotac 1070 Amp Mini or Amp Edition to SLI with my Amp Extreme. My 1st Zotac and very happy and I'm not paying the premium for the big named brands who doesn't even focus on making thier GPU's work properly.


I'm so scared you wanna block me.. please do it.. I wish you good luck with your Micron Zotac. I'm not rich, but if the product is premium I'm willing to pay. Same concept with food: if the service is bad then I won't visit them again..

My GTX 970 came with a low-key & rare memory brand that you might not be able to obtain, but seriously I haven't faced any artifacts or BSODs. There were some issues like TDRs & MFAA, but those were software-related & NV fixed them on my side.

If they do a good job I'll praise & support them, but if they manipulate us, I'll bash them like there's no tomorrow. Hey folks, so far I've only read 3 Micron chip articles.

I can't even find a well-known article from AnandTech, Hexus, KitGuru, Guru3D, Tech Report, Tom's Hardware, PCPer, HardOCP, HardwareCanucks, Gamers Nexus, etc. that reviewed anything other than Samsung! C'mon, are you blind? Where's the Micron review??

There are more than 20 articles out there viewed by the majority of tech users. None of them came with Micron, so you come across as a pro shill.


----------



## vallonen

Face it. There are issues with Micron memory.


----------



## EDK-TheONE

Finally I got my Zotac AMP Extreme with Samsung memory, and it does 2037 MHz out of the box.


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> I'm so scared you wanna block me.. please do it.. I wish you good luck with your Micron Zotac.


LOL, he blocked you too? That must be the saddest day of your life









He says that only I know he has a Micron card? How the hell would I know that? I don't even recall the guy; he makes it sound like I am his neighbor or something haha

I was going to ask you to ask him what evidence he has that ASUS and MSI have the issue and the other brands don't. These guys talk about a screenshot as evidence; I don't know how that is considered evidence?

Let's not bother him, he can do +100MHz on his memory, that must be the happiest overclocker on earth at the moment


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> LOL he blocked you too? that must be the saddest day in your life
> 
> 
> 
> 
> 
> 
> 
> 
> 
> He says that only I know he has a Micron card? How the hell would I know that? I don't even recall the guy; he makes it sound like I am his neighbor or something haha
> 
> I was going to ask you to ask him what evidence he has that ASUS and MSI have the issue and the other brands don't. These guys talk about a screenshot as evidence; I don't know how that is considered evidence?
> 
> Let's not bother him, he can do +100MHz on his memory, that must be the happiest overclocker on earth at the moment


I don't care if his Micron overclocks +1000MHz or breaks the Guinness world records. When a competition is held, someone please ask this guy for a GPU-Z or GPU Shark screenshot showing whether the overclockers use Micron or Samsung in the 1st place. I bet you won't see any Micron out there.

He might say Micron GDDR5X, but I'm talking about Micron GDDR5.


----------



## shadowrain

You have no evidence either


----------



## shadowrain

Good, talk with yourself. Make it believable. Bring out all the other accounts too.


----------



## khanmein

Quote:


> Originally Posted by *shadowrain*
> 
> You have no evidence either


Did NV release a new vBIOS (Micron) for EVGA & Palit/Gainward?

If this is not evidence, then why release a new vBIOS in the 1st place if there's no issue?

As a matter of fact, I don't think you accept the truth. The truth is bitter! No offense~


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> I don't care if his Micron overclocks +1000MHz or breaks the Guinness world records. When a competition is held, someone please ask this guy for a GPU-Z or GPU Shark screenshot showing whether the overclockers use Micron or Samsung in the 1st place. I bet you won't see any Micron out there.
> 
> He might say Micron GDDR5X, but I'm talking about Micron GDDR5.


I wouldn't mind him, he is just trolling. I don't believe that anyone in their right mind would actually buy not one but two of these crippled cards; he will just come back crying later when the issue gets fixed for real in hardware and he finds himself stuck with a very bad product. That is, if he even owns the product, which I doubt at this point. The only real evidence would be a purchase receipt in his own name with a valid photo ID; all these GPU-Z screenshots people are posting we believe in good faith, none of that is actually evidence.

Did you see the score Mjhieu is now getting with his Samsung memory card? That is really awesome. He also noticed how much smoother the frame rate is, and he posted a screenshot "evidence" that pretty much seals the deal









These guys just can't handle the truth, they are fooling themselves into accepting a broken product and it hurts when someone points out where it is broken. You know, if he tried a Samsung memory card for a couple of days he would come back very angry at his Micron card.


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> I wouldn't mind him, he is just trolling. I don't believe that anyone in their right mind would actually buy not one but two of these crippled cards; he will just come back crying later when the issue gets fixed for real in hardware and he finds himself stuck with a very bad product. That is, if he even owns the product, which I doubt at this point. The only real evidence would be a purchase receipt in his own name with a valid photo ID; all these GPU-Z screenshots people are posting we believe in good faith, none of that is actually evidence.
> 
> Did you see the score Mjhieu is now getting with his Samsung memory card? That is really awesome. He also noticed how much smoother the frame rate is, and he posted a screenshot "evidence" that pretty much seals the deal
> 
> 
> 
> 
> 
> 
> 
> 
> 
> These guys just can't handle the truth, they are fooling themselves into accepting a broken product and it hurts when someone points out where it is broken. You know, if he tried a Samsung memory card for a couple of days he would come back very angry at his Micron card.


yeah, I believe the end user himself, e.g. Mjhieu. If there's really no difference, why can't he achieve the same thing with Micron, if Micron is not the root cause? There might be a few very specific lucky users out there with Micron having no issues whatsoever, but we don't wanna take the risk, cos I'm personally scared someone wants to block me!

What I know is that the Micron issue can appear after days, weeks, months or years & will get more serious, so let the dude enjoy it 1st. God bless him with all the Micron thingy..

Try asking JayzTwoCents to make a comparison vid between Samsung & Micron with a live benchmark. I dare to challenge him...


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> Try asking JayzTwoCents to make a comparison vid between Samsung & Micron with a live benchmark. I dare to challenge him...


They get a ton of free products for themselves and for giveaways to boost their channels, it is in their best interest to sweep this under the rug for as long as they possibly can until everyone else is talking about it and then they will start talking too, they just can't be the first to do it or there will be no more free products for them.

I think at this point most people know about the issue, if they still choose to buy the product, it is their choice and their money, the only thing that is bothering me is that Nvidia is getting away with low quality products and shady business policies all the time, this encourages them to do the same with Pascal V2 which would be a disaster.

You see when the RX 480 had that power draw issue that got fixed in a few days everyone was hammering them for it without any evidence that anyone's board got fried, but when Nvidia gives us cheap memory they are okay with it and call those who complain trolls, Nvidia managed to brainwash so many people, it is sad.


----------



## gtbtk

Quote:


> Originally Posted by *Majentrix*
> 
> Funny, apparently Gainward cards are meant to have Micron memory and are affected by the bug. GPU-Z says my Gainward card has Samsung memory.
> 
> Guess I got a good batch?


They seem to be alternating: the first batch was Samsung, the 2nd batch Micron, and now it seems they are back on Samsung again.


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> They seem to be alternating: the first batch was Samsung, the 2nd batch Micron, and now it seems they are back on Samsung again.


Quote:


> Originally Posted by *Majentrix*
> 
> I bought mine on July 9 in Australia, so an early batch.


----------



## Mr-Dark

Quote:


> Originally Posted by *MyNewRig*
> 
> They get a ton of free products for themselves and for giveaways to boost their channels, it is in their best interest to sweep this under the rug for as long as they possibly can until everyone else is talking about it and then they will start talking too, they just can't be the first to do it or there will be no more free products for them.
> 
> I think at this point most people know about the issue, if they still choose to buy the product, it is their choice and their money, the only thing that is bothering me is that Nvidia is getting away with low quality products and shady business policies all the time, this encourages them to do the same with Pascal V2 which would be a disaster.
> 
> You see when the RX 480 had that power draw issue that got fixed in a few days everyone was hammering them for it without any evidence that anyone's board got fried, but when Nvidia gives us cheap memory they are okay with it and call those who complain trolls, Nvidia managed to brainwash so many people, it is sad.


Even with all Nvidia's problems, it's just way better than AMD.. trust me.

AMD still can't make quality drivers like Nvidia.. super slow support & no optimization & few drivers compared to Nvidia...

Nvidia has been on top for a long time.. AMD has been dead on GPUs since the 7000 series.. Tell me how they are still selling the 390X, which is the 290X chip from 2013? And how do they compete with Pascal? Only the RX 480!!

I hope the 490X will not be a dual 480 on the same PCB, as if that happens then it's the end for AMD.


----------



## gtbtk

Quote:


> Originally Posted by *ucode*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> no. if the card idle voltage is above .800 or so then the memory doesn't crash.
> 
> locking voltage will also work around that problem, but it is using a sledgehammer to crack a walnut
> 
> 
> 
> Sorry I'm missing something here. When you say idle voltage do you mean you are idling at GPU 139MHz / Mem 101MHz with 0.8V?

yes


----------



## khanmein

Guys, you can ask @Mr-Dark; from what I know he's a pro at vBIOS stuff.

Based on your experience @Mr-Dark, do you think the new Micron vBIOS makes a difference & achieves the same result as Samsung? Thanks.


----------



## QPSS

Quote:


> Originally Posted by *MyNewRig*
> 
> Did you see the score Mjhieu is now getting with his Samsung memory card? That is really awesome. He also noticed how much smoother the frame rate is, and he posted a screenshot "evidence" that pretty much seals the deal
> 
> 
> 
> 
> 
> 
> 
> 
> 
> These guys just can't handle the truth, they are fooling themselves into accepting a broken product and it hurts when someone points out where it is broken. You know, if he tried a Samsung memory card for a couple of days he would come back very angry at his Micron card.


While I sure as hell don't defend these practices of Nvidia and their retailers, what you claim is a bit far-fetched. He didn't say it's smoother now, he just said it's smooth. He also didn't really show any evidence of that. It's just a simple screenshot that shows he has Samsung now; the resolution is too low to see anything else anyway.

Not to mention that my Micron card works fine too. No crashes or artifacts at all. I can only get it to run +250 (+300 all in all), but that's OK, and it might get better after the BIOS upgrade.


----------



## MyNewRig

Quote:


> Originally Posted by *Mr-Dark*
> 
> Even with all Nvidia's problems, it's just way better than AMD.. trust me.
> 
> AMD still can't make quality drivers like Nvidia.. super slow support & no optimization & few drivers compared to Nvidia...
> 
> Nvidia has been on top for a long time.. AMD has been dead on GPUs since the 7000 series.. Tell me how they are still selling the 390X, which is the 290X chip from 2013? And how do they compete with Pascal? Only the RX 480!!
> 
> I hope the 490X will not be a dual 480 on the same PCB, as if that happens then it's the end for AMD.


AMD was focusing more on consoles during that time, because that is where most of its money comes from if you check their latest earnings reports. Sony and MS are also happy to do business with AMD because they say AMD is really easy to deal with, and they get good prices from them as well. AMD has been doing more than great on consoles with that same old architecture; now they have shifted focus back to PC, and I expect Vega and Zen will be really good products, since AMD has not always been that bad.

Regarding drivers and software support, that is no longer the case. Have you been following the latest Nvidia driver issues? It is a mess: open bugs carry on from one driver to the next without being fixed for months, the last BF1 driver was breaking Windows before they released the hotfix yesterday, and a bunch of open bugs from a few months ago are still there. Also, that fake Micron BIOS fix has been promised for a very long time; it took them forever to deliver, and until now only EVGA has managed to get it out, nothing yet from ASUS, MSI, Zotac and all the rest.

AMD is surely not releasing a dual-GPU card; we are all waiting for the Vega architecture, which is rumored to come with 12 TFLOPS of compute performance, HBM2 memory with 512 GB/s of bandwidth, and a bunch of exciting rendering features borrowed from the PS4 Pro development cycle. Also, AMD's drivers are better than Nvidia's in Windows 10: issues are much fewer, they work great, get released fast and more frequently, and they even released the BF1 driver two days before Nvidia and it was good.

Vega development is being done in silence, without the hype that preceded Polaris. That is a very good sign: they are so confident the performance will be really good that no hype is needed, the product will just speak for itself. I see the tide is shifting.


----------



## MyNewRig

Quote:


> Originally Posted by *QPSS*
> 
> While I sure as hell don't defend these practices of Nvidia and their retailers, what you claim is a bit far-fetched. He didn't say it's smoother now, he just said it's smooth. He also didn't really show any evidence of that. It's just a simple screenshot that shows he has Samsung now; the resolution is too low to see anything else anyway.
> 
> Not to mention that my Micron card works fine too. No crashes or artifacts at all. I can only get it to run +250 (+300 all in all), but that's OK, and it might get better after the BIOS upgrade.


Forget about all that evidence talk; no one of us has ever produced any evidence of anything. We deal in words and screenshots and believe one another in good faith. Actual evidence would be an official purchase receipt in one's own name with a valid photo ID, and no one of us is providing anything remotely close to that, so all that evidence talk is invalid.

I encourage you to try a Samsung card if you can get your hands on one; play on it for a few hours and judge for yourself. Otherwise, if you are happy with your Micron card, so be it. I believe it is a low-quality product and a waste of money, and I returned mine and got my money back for that reason. If it is good enough for you, then it is good enough for you; no one is judging you on that, enjoy..


----------



## QPSS

Quote:


> Originally Posted by *MyNewRig*
> 
> Forget about all that evidence talk; no one of us has ever produced any evidence of anything. We deal in words and screenshots and believe one another in good faith. Actual evidence would be an official purchase receipt in one's own name with a valid photo ID, and no one of us is providing anything remotely close to that, so all that evidence talk is invalid.
> 
> I encourage you to try a Samsung card if you can get your hands on one; play on it for a few hours and judge for yourself. Otherwise, if you are happy with your Micron card, so be it. I believe it is a low-quality product and a waste of money, and I returned mine and got my money back for that reason. If it is good enough for you, then it is good enough for you; no one is judging you on that, enjoy..


You also claimed you could back your choppy FPS up with frametime measurements, yet I've never seen them. And our tests showed that it's not the case.
Subjectivity may very well play a huge role here. If you think you have been scammed, you won't like it at all and will find all kinds of stuff that really isn't there.

I would try a Samsung card if there was a way, but since they all use Micron now, it's kinda hard to get one with Samsung.
But in any case, I don't see "unsmooth" FPS. I am used to smooth FPS, and I am very sensitive to stuff like that, so much so that I can easily spot the difference between 60 FPS and 70. It runs smooth.
If anything it actually feels smoother, because even at 35 FPS I can't really complain as much as I would complain about 35 FPS normally.

Yeah. Subjective. That's why direct comparisons are needed. But for some funny reason nobody wants to do them. Not you and not reviewers.
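For what it's worth, the frametime comparison being asked for here is mechanical once you have a per-frame log. Below is a minimal, hypothetical sketch: the sample values are invented, and a real Fraps "frametimes" CSV actually logs cumulative timestamps that you would difference into per-frame durations like these first. It computes the usual stats people cite — average FPS, 1%-low FPS, and a count of spike frames ("all over the place" moments).

```python
# Sketch of a basic frametime analysis on a Fraps-style per-frame log.
# The sample data is made up: 95 smooth ~60 FPS frames plus 5 long stutters.
from statistics import median

frame_ms = [16.7] * 95 + [56.7] * 5  # per-frame durations in milliseconds

def frametime_report(samples):
    avg_fps = 1000.0 / (sum(samples) / len(samples))
    worst = sorted(samples, reverse=True)
    one_pct = worst[: max(1, len(worst) // 100)]        # worst 1% of frames
    low_1pct_fps = 1000.0 / (sum(one_pct) / len(one_pct))
    m = median(samples)
    spikes = sum(t > 2 * m for t in samples)            # frames >2x the median
    return round(avg_fps, 1), round(low_1pct_fps, 1), spikes

print(frametime_report(frame_ms))  # (53.5, 17.6, 5)
```

A card can show a healthy average FPS while the 1%-low and spike count expose exactly the stutter being argued about, which is why screenshots of average FPS alone settle nothing.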


----------



## MyNewRig

Quote:


> Originally Posted by *QPSS*
> 
> You also claimed you could back your choppy FPS up with frametime measurements, yet I've never seen them.


I am sorry, but I found myself pretty exhausted after all the wasted time and energy spent on this issue; I did not have the energy to run even more tests, take screenshots and post them, especially when I had to take the card out, clean it, package it, ship it and rearrange all the cables and everything, with all the screens I had connected to it, and the next day was my last day for the return. Would you have preferred me to spend all my energy on doing the tests instead of getting the card ready for shipment and rearranging my desk?

Sorry, but I had to draw a line when enough testing was enough and actually send my card back before my return period ended. You have been provided with other tests that showed proper frametimes, and yours show proper frametimes, so what would my messed-up frametime screenshots have added to the picture?

You don't want to take my word for it and want solid, indisputable evidence? Then your only option is to try a Samsung card yourself and draw your own conclusions.


----------



## EDK-TheONE

Here is my result with a little OC. GPU: Zotac AMP Extreme






fs: http://www.3dmark.com/3dm/15627237


----------



## khanmein

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Here is my result with a little OC. GPU: Zotac AMP Extreme
> 
> 
> 
> 
> 
> 
> fs: http://www.3dmark.com/3dm/15627237


Looks like you can't hit 2400 MHz on the memory clock, right?


----------



## QPSS

Quote:


> Originally Posted by *MyNewRig*
> 
> I am sorry, but I found myself pretty exhausted after all the wasted time and energy spent on this issue; I did not have the energy to run even more tests, take screenshots and post them, especially when I had to take the card out, clean it, package it, ship it and rearrange all the cables and everything, with all the screens I had connected to it, and the next day was my last day for the return. Would you have preferred me to spend all my energy on doing the tests instead of getting the card ready for shipment and rearranging my desk?
> 
> Sorry, but I had to draw a line when enough testing was enough and actually send my card back before my return period ended. You have been provided with other tests that showed proper frametimes, and yours show proper frametimes, so what would my messed-up frametime screenshots have added to the picture?
> 
> You don't want to take my word for it and want solid, indisputable evidence? Then your only option is to try a Samsung card yourself and draw your own conclusions.


You claimed you could see very clearly that frametimes were jumping "all over the place" and said you would post screenshots. Yet a day later you still had your card and still didn't deliver anything. It would have been easy to do a simple frametime benchmark, which takes only a minute or two. It was obvious you were the only one claiming stuff like that, yet you thought it was a good idea not to back it up with your claimed messed-up frametimes.
Can't blame people who don't want to believe you now.


----------



## benjamen50

Is it just me, or does EVGA Precision X seem to be causing microstutters that freeze a whole game for a second or two, a few times every hour?


----------



## Avendor

@EDK-TheONE You really need to change your CPU, that's one hell of a bottleneck; at least try to OC it. Good graphics score.








this is mine


----------



## MyNewRig

Quote:


> Originally Posted by *QPSS*
> 
> You claimed you could see very clearly that frametimes were jumping "all over the place" and said you would post screenshots. Yet a day later you still had your card and still didn't deliver anything. It would have been easy to do a simple frametime benchmark, which takes only a minute or two. It was obvious you were the only one claiming stuff like that, yet you thought it was a good idea not to back it up with your claimed messed-up frametimes.
> Can't blame people who don't want to believe you now.


First of all, it was not that easy to do, because I was using Afterburner to measure it; then I saw in your discussions that Fraps is more reliable, so I had to install Fraps and figure out how to do it and take screenshots, which I did not have the time and energy for. I am not a dedicated tester and have other things to do in my life.

Also, you do not have to believe me at all. I bought the GTX 970 in the midst of all the memory fiasco surrounding it; the sales guy at the store told me that all that was nonsense and the card was great and a best seller, so I just ignored everything and bought the card to test for myself, because the price was good at the time. I said to myself that if it worked well for me then that was all that mattered, and I was intending to keep it if it did not stutter that much, but it turned out that what people were reporting was very true, so I returned the card.

Your Micron card is working well for you, right? Why are you feeling unsure about it now? Why do you need my or anyone else's approval to feel confident about your purchase?


----------



## khanmein

Quote:


> Originally Posted by *Avendor*
> 
> @EDK-TheONE You really need to change your CPU, that's one hell of a bottleneck; at least try to OC it. Good graphics score.
> 
> 
> 
> 
> 
> 
> 
> 
> this is mine


If his CPU is a bottleneck, then I might as well throw my CPU away.


----------



## TheGlow

Quote:


> Originally Posted by *Mr-Dark*
> 
> So MSI release bios update ??


Misunderstanding: there is no MSI BIOS yet, but some people have flashed other vendors' BIOSes to their MSI cards. I haven't tried it, nor do I plan to.
I will play it safe and wait for MSI's.

Quote:


> Originally Posted by *gtbtk*
> 
> no. if the card idle voltage is above .800 or so then the memory doesn't crash.
> 
> locking voltage will also work around that problem, but it is using a sledgehammer to crack a walnut


Exactly. When mine is riding the 3D clocks it will dip to .800 most of the time, sometimes sitting around .725. Even .725 is fine. If I leave 3D clocks then it goes lower, I can't remember, maybe .600, and then it's time for checkerboard artifacts.
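The workaround logic described in this exchange boils down to one check: per these reports, the affected Micron cards only corrupt memory when the core drops into the low-voltage idle state, which is why locking the voltage avoids it. A crude sketch of spotting that state in a monitoring log (the sample values below are made up, as is the 0.725 V floor — it is just the figure reported here, not an official spec; a real log would come from a GPU-Z or Afterburner export):

```python
# Sketch: scan a (clock, voltage) monitoring log for the low-voltage idle
# state that the posts above associate with checkerboard artifacts.
RISKY_VOLTAGE = 0.725  # per the post above, dips below ~0.725 V invite artifacts

log = [  # (core_clock_mhz, core_voltage_v) samples -- invented for illustration
    (1873, 1.043),  # gaming boost clock
    (1657, 0.800),  # light 3D load, reported fine
    (139, 0.725),   # idle on the edge, still reported fine
    (139, 0.600),   # deep idle -> "time for checkerboarding"
]

risky = [(clk, v) for clk, v in log if v < RISKY_VOLTAGE]
print(risky)  # [(139, 0.6)] -> the state a locked-voltage profile avoids
```

This also illustrates the "sledgehammer" point quoted above: locking voltage removes the deep-idle state entirely, at the cost of higher idle power.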


----------



## EDK-TheONE

Quote:


> Originally Posted by *khanmein*
> 
> Looks like you can't hit 2400 MHz on the memory clock, right?


New results here:




http://www.3dmark.com/3dm/15627478

This is only a noob OC!


----------



## EDK-TheONE

Quote:


> Originally Posted by *khanmein*
> 
> If his CPU is a bottleneck, then I might as well throw my CPU away.


I do not think my CPU impacts the graphics score.


----------



## Avendor

Exactly, I'm waiting as well for the Gigabyte vBIOS to squeeze more from my 1070. I heard it's coming out this week, can't wait


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> First of all it was not that easy to do because i was using Afterburner to measure it, then i saw in your discussions that Fraps is more reliable so i had to install Fraps and figure out how to do it and take screenshots, which i did not have the time and energy for, i am not a dedicated tester and have other things to do in my life.
> 
> Also you do not have to believe me at all, i bought the GTX 970 in the midst of all the memory fiasco surrounding it, the sales guy at the store told me that all that is nonsense and the card is great and is a best seller so i just ignored everything and bough the card to test for myself because the price was good at the time, i said to myself that if it works well for me then that is all that matters, i was intending to keep it if it did not stutter that much, but it turned out that what people are reporting was very true so i returned the card.
> 
> Your Micron card is working well for you right? why are you feeling unsure about it now? why do you need my or anyone else's approval to feel confident about your purchase?


Quote:


> Originally Posted by *EDK-TheONE*
> 
> New results here:
> 
> 
> 
> 
> http://www.3dmark.com/3dm/15627478
> 
> This is only a noob OC!


Micron users can't hit 2.4K at all (confirmed artifacts).


----------



## QPSS

Quote:


> Originally Posted by *MyNewRig*
> 
> First of all it was not that easy to do because i was using Afterburner to measure it, then i saw in your discussions that Fraps is more reliable so i had to install Fraps and figure out how to do it and take screenshots, which i did not have the time and energy for, i am not a dedicated tester and have other things to do in my life.
> 
> Also you do not have to believe me at all, i bought the GTX 970 in the midst of all the memory fiasco surrounding it, the sales guy at the store told me that all that is nonsense and the card is great and is a best seller so i just ignored everything and bough the card to test for myself because the price was good at the time, i said to myself that if it works well for me then that is all that matters, i was intending to keep it if it did not stutter that much, but it turned out that what people are reporting was very true so i returned the card.
> 
> Your Micron card is working well for you right? why are you feeling unsure about it now? why do you need my or anyone else's approval to feel confident about your purchase?


As I said, it's no issue at all; it takes only a minute or two to run a benchmark after installing Fraps and FRAFS. Yet you say you only saw it in MSI Afterburner, where I also found it odd, while a real benchmark showed there's no problem at all.

I'm not someone who ignores criticism; I know myself that Nvidia pulls **** like that all the time. And I don't need your approval. I'm just trying to set things straight, since you still claim it's proven and then use posts like those on the GeForce forum to prove your point, while they don't even say what you claim they do.
Why do you think I feel unsure because of unproven claims and the methods you use to make your point? If someone asserts things that are not proven at all and that only he has claimed so far, do you think I am unsure just because I put it into context?
Alright... That actually makes you look unsure.


----------



## Roland0101

Quote:


> Originally Posted by *MyNewRig*
> 
> The variations with Samsung "in *GENERAL*" (don't quote one odd guy who can't do +100Mhz with Samsung and use him as evidence) is whether you can OC to 9600Mhz or only 9200Mhz or so.


And that is true: one sample is not evidence. The problem is that you are doing exactly that the entire time.

You are using a few examples to claim that Micron memory is generally flawed, without any evidence for it.
Quote:


> Even those rare Samsung chips that don't OC well don't checkerboard and crash the system, they just soft-crash, but that has been said a thousand times already as well.
> It does not matter how many times you say that, i am not convinced, i have another opinion, this is a mass production quality issue because it is something common and not just one or two odd cases, saying it ten thousand more times will not make it any more convincing to me, so just accept that people can disagree with you and stop repeating yourself.


Then tell me: where are the tens of thousands of people who must be having problems?

Quote:


> Originally Posted by *MyNewRig*
> 
> With the GTX 970 partitioned memory 3.5 + 0.5 GB Nvidia also claimed that a BIOS/Drivers fix will resolve the problem, how do you resolve physically partitioned memory stuttering issues via software? you can't, but Nvidia claimed it regardless, they released drivers that make the card not go near that 0.5GB partition which improved the situation a little, just like this Micron fix, the market responded by making the GTX 970 one of the best selling GPUs of all times, so Nvidia learned that consumers are willing to accept mediocrity and they did it again with the GTX 1070, cheaping out on memory components in the middle of the production cycle, looks like many consumers are still willing to take it and gtbtk is helping them justify their foolishness which is just sad!


There was no hardware problem in the first place.

The card was designed that way. Nvidia made a big mistake in advertising the wrong specs, but the card was exactly the way Nvidia wanted it to be.

Quote:


> Originally Posted by *MyNewRig*
> 
> The problem is very real, i owned the GTX 970 for a few weeks before returning it due to stuttering, GTA V in 2K was a stuttery mess even with the fixed drivers, i could not stand it, maybe in 1080p it was somewhat acceptable if you watch your settings, but with many games if you max out your settings and manage to exceed the fast 3.5GB partition you are guaranteed a stuttering experience, if you watch your settings to keep vRAM usage around the 3GB mark it runs fine.
> 
> Did you actually try the card? did you have a different experience?


I did, and I purchased the card knowing about the 3.5 + 0.5 memory specs. I then tested the card extensively, including games that used more than 3.5 GB of VRAM.
There was no "guaranteed stuttering" at 1080p or 1440p on my system, even when exceeding 3.5 GB of VRAM.
I am sure that some people had this problem, especially on SLI systems, but in general that issue was as overblown as your attempt to turn the Micron memory problem into a universal quality problem.

Quote:


> Originally Posted by *MyNewRig*
> 
> I understand that you are stuck with your card and in your country you have no right of return at this point, you badly need your issue fixed, i can appreciate that, but you don't have to keep going round encouraging everyone else to get themselves stuck with a bad product like you just to feel better.


Would you come down from your high horse please?

I am an EU citizen too, and I am also not returning my card (your claims about return rights in the EU are not entirely accurate either; you just ordered from a shop that is accommodating), simply because there is no reason to.

The 1070 with Micron memory is not a "bad product". Stop trying to unsettle people just because you are butthurt.


----------



## QPSS

Quote:


> Originally Posted by *EDK-TheONE*
> 
> I do not think my CPU has an impact on the graphics score.


It does. With a 4.5 GHz 6700K, the same VRAM OC, and ~100 MHz less GPU OC, I get 16k points there.


----------



## MyNewRig

Quote:


> Originally Posted by *QPSS*
> 
> As I said, its no issue at all, it takes only 1 or 2 mins to do a benchmark after installing Fraps and Frafs, yet you say you only saw it in MSI Afterburner, where I also found it weird, yet a real benchmark showed its no problem at all.
> 
> Im not someone who ignores any criticism, as I know myself Nvidia does **** like that all the time. And I dont need your approval, Im just trying to put things right, since you still claim its proven and then use posts like those on the Geforce forum, to prove your point, while it doesnt even say what you claim it does.
> Why do you think that I feel unsure because of unproven stuff and methods you use to make your point? If someone talks stuff that is not proven at all and only himself has claimed so far, you think I am unsure if I put it into context?
> Alright... Makes actually you look unsure.


I did not say I was sure about the frametime consistency at all; go back to my posts on this issue. I proposed it as a possibility after I saw a few people mentioning it, and I asked people to test it and compare. I was mainly asking those who have access to cards with both Micron and Samsung memory to do a comparative measurement. I intended to post mine and did not get the chance, but I saw you and a few other people posting Fraps measurements and that was enough for me. So what are you complaining about? I don't get it. What difference would my frametime charts have made, for you to make all that fuss, when it was merely a proposed theory and I was actually asking people to verify whether it is true?


----------



## kevindd992002

C'mon, stop arguing, you guys! Your replies are all TL;DR to most people now.


----------



## QPSS

Quote:


> Originally Posted by *MyNewRig*
> 
> I did not say i was sure about the Frametime consistency at all, go back to my posts regarding this issue, i proposed it as a possibility after i saw a few people mentioning it and asked people to test it and compare, i was mainly asking those who have access to cards with both Micron and Samsung to do a comparative measurement, i was intending to post mine and did not get a chance to do it, i saw you and a few other people posting Fraps measurement and that was enough for me, what are you complaining about? i don't get it, what difference would my Frametime charts have made for you to make all that fuss about it when it was just merely a proposed theory and when i was actually asking people to verify if it is true or not?


Alright: if there is no problem with frametimes, your "smoothness" point is irrelevant, so why do you still bring it up? As I said, like with that mention of the GeForce forum post.


----------



## MyNewRig

Quote:


> Originally Posted by *QPSS*
> 
> Alright, if there is no problem with frametime, your "smoothness" point is irrelevant, yet why do you still bring it up? As I said, with that mention of the Geforce forum post.


Because, like you said, the stuttering on your GTX 970 was clearly noticeable and that was enough for you. Same thing here: the Samsung card I tested had clearly smoother frames than the Micron one at the same resolution and in the same games. I am sure of how the games felt; if frametime measurement cannot capture it, then maybe the answer lies in timing or something else. I am not an expert in frametime measurement, and only you and one other guy have posted your charts. You say they look normal; well, I take your word for it, but I am really not sure.

When I saw another guy who switched from Micron to Samsung mention the same thing, I knew it was not just me imagining this.

I am usually an objective person, and I understand that if both memory types produce the same FPS and the same frametimes then they should provide the same level of frame smoothness, but I am sure of what I saw. More frametime testing will probably reveal something; we just need more people to test this and do comparisons.


----------



## QPSS

Quote:


> Originally Posted by *MyNewRig*
> 
> Because like you said with your GTX 970 that stuttering was clearly noticeable and that was enough for you, same thing, the Samsung card i tested had clearly smoother frames compared to Micron in the same resolution and the same games, i am sure how the games felt, if frametime can not capture it then maybe the answer is in timing or anything else, i am not an expert in frametime measurement, and it is only you and some other guy who posted their charts, you say it looks normal well i take your word for it, i am really not sure.
> 
> When i saw another guy who switched from Micron to Samsung mention the same thing, then i know it was not just me hallucinating about this.
> 
> I am usually an objective person, and i understand that if both memory types are producing the same FPS and same Frametime then they must provide the same level of frame smoothness but i am sure of what i saw, so probably more Frametime testing will reveal something, we just need more people to test this and do comparisons.


And yet he didn't say it. He only said "smooth", not "it's smooth again" or "it's smoother".

Frametimes pretty much show all of these problems. I didn't run a benchmark because I didn't have to prove it to anyone, and I wasn't the only one claiming it. Dozens in just one forum said the same thing, so there was no point in me proving already proven things. And I highly doubt that what you thought you saw was as obvious as the 970 issue in GTA V, for example. Not to mention I had an OSD running and saw the FPS drop the moment VRAM usage went over 3.5 GB.


----------



## TheBoom

Quote:


> Originally Posted by *khanmein*
> 
> micron users can't hit 2.4k at all (confirm got artifacts)
> .


My Samsung card can't go past 2.3k, and a 2.1k core is more than decent. Not sure what he means by "nooby OC", lol.


----------



## MyNewRig

Quote:


> Originally Posted by *Roland0101*
> 
> Then tell me where are the tens of thousands of people who must have problems?


Use Google, read the posts, and gauge the sentiment; I am not going to do this homework for you.

Quote:


> The card was designed in that way. Nvidia made a big mistake with the wrong specs, but the card was exactly the way Nvidia wanted it to be.
> I did, and I purchased the card knowing about the 3.5 + 0.5 memory specs. Then I tested the card extensively and also with games that used more than 3.5 GB vram.
> There was no "guaranteed stuttering" at 1080p or 1440p even if you exceeded 3.5 GB vram on my system.
> I am sure that some people got this problem, especially at SLI systems, but in general that issue was as overblown as your attempt to make the micron memory problem into a universal quality problem.


I don't care. I bought the card, tested it, it stuttered badly, and I returned it. Stores in my country allowed everyone who bought before the official announcement to get a refund, and Nvidia eventually settled and provided compensation. That you are still arguing this after the issue was closed against Nvidia just shows how biased you are. Why argue a closed issue?

Quote:


> Would you come down from your high horse please?
> 
> I am an EU citizen too, and I also don't return my card, (your claims about the return rights in the EU are also not entirely accurate, you just ordered from a shop that is accommodating.) simply because there is no reason too.
> 
> The 1070 with micron memory is not a "bad product". Stop trying to unsettling people just because you are butthurt.


Not all EU citizens know their rights or exercise them well, and not all EU states have exactly the same consumer laws; we only share the basics. In my country I have returned a number of electronic products in the past over things much smaller than this, and I was never once denied a return, so I know what I am talking about.

It is a very bad product. And why exactly would I be butthurt? Would you please stop using that childish language?


----------



## MyNewRig

Quote:


> Originally Posted by *QPSS*
> 
> And yet he didnt say it. He only said smooth, not "its smooth again" or "its smoother".
> 
> Frametime pretty much shows all these problems. I didnt make a benchmark because I didnt have to prove it to anyone and I wasnt the only one claiming it. Dozens in just one forum said the same thing. No point for me to prove already proven things. And I highly doubt what you thought you saw was as obvious as the 970 issue in GTA5 for example. Not to mention I had an OSD running and saw the FPS go down directly as soon as the VRAM usage went over 3.5.


From what I have read, Fraps is not the best frametime measurement tool due to the way it works. Like I told you, I am not a frametime expert; more testing is needed with professional tools, by people who have access to both cards and know what they are doing.

And yes, the frame smoothness difference was that obvious: the card with Samsung was very fluid and the games felt normal, while the one with Micron was very noticeably laggy.
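For anyone who wants to put numbers on the smoothness question, here is a rough Python sketch that summarizes a Fraps "frametimes" CSV, which logs cumulative milliseconds per frame. The column layout, the 99th-percentile index, and the 2x-average "spike" threshold here are just illustrative assumptions, not an official method:

```python
import csv

def load_fraps_frametimes(path):
    """Parse a Fraps frametimes CSV ("Frame, Time (ms)") into cumulative ms values."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    return [float(r[1]) for r in rows[1:] if len(r) >= 2]  # skip the header row

def frametime_stats(cumulative_ms):
    """Return (average ms, ~99th-percentile ms, spike count) from cumulative frame-end times."""
    # frame-to-frame deltas are the actual per-frame times
    deltas = [b - a for a, b in zip(cumulative_ms, cumulative_ms[1:])]
    deltas.sort()
    avg = sum(deltas) / len(deltas)
    p99 = deltas[int(0.99 * (len(deltas) - 1))]
    # crude stutter candidates: any frame taking more than twice the average
    spikes = sum(1 for d in deltas if d > 2 * avg)
    return avg, p99, spikes
```

Running this on logs from a Micron card and a Samsung card at the same settings would make the "smoothness" comparison concrete instead of a matter of feel.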


----------



## EDK-TheONE

another bench:
http://www.3dmark.com/3dm/15628501


----------



## asdkj1740

Quote:


> Originally Posted by *EDK-TheONE*
> 
> another bench:
> http://www.3dmark.com/3dm/15628501


RUN FSE/FSU?


----------



## QPSS

Quote:


> Originally Posted by *MyNewRig*
> 
> Fraps as i read is not the best Frametime measurement tool due to the way it works, this is what i have read, and like i told you i am not a Frametime expert, more testing is needed using professional tools by people who have access to both cards and know what they are doing.
> 
> And yes the frame smoothness difference was that obvious, the card with Samsung was very fluent, the games felt normal, the one with Micron was very noticeably laggy.


Another possibility could have been issues with your DPC latency for some reason. That can happen if you remove a card and just put a new one in without doing a clean driver reinstall first (even if it's the same kind of card). Or maybe even other software on your system caused it, due to minor changes in the system, or in rare cases even without any obvious change at all.
Yeah, we need more tests. At least there we agree.


----------



## MyNewRig

Quote:


> Originally Posted by *QPSS*
> 
> Another possibility could have been issues with your DPC latency for some reason. Can happen if you remove a card and just put a new one in without reinstalling the drivers clean first (even if its the same kind of card). Or maybe even other software on your system causing it due to even minor changes in the system or in rare cases even without an obvious change in anything.
> Yeah, we need more tests. At least there we agree.


I never switch cards without completely refreshing my system: resetting the system BIOS and reapplying my settings, cleaning with DDU followed by CCleaner, and then reinstalling the drivers. I even uninstall Afterburner with its settings deleted and reinstall it. Everything gets refreshed with the new card; nothing else changed on the software side.

Yes, this is what I am telling you: we need more testing with professional tools by people who have both cards. I was just making a proposal and asking for it to be tested. I never said I was sure about anything to do with frametime measurement; it is just an idea proposed by a few people, and I was asking for verification. All I have in that regard is the clear lag I experienced.

It is also worth mentioning that the Micron card was producing the right 3DMark score for its configuration. After installing a new card and going through the whole process described above, I habitually run 3DMark to make sure the CPU/RAM/GPU settings are applied correctly and that my score is at the level it should be before launching games.

After doing all that, RotTR for example was very noticeably laggy in 2K compared to the Samsung card, where it was really smooth and fluid. By the way, this was the first thing that drew my attention to the Micron card having something off about it, even before I realized it had Micron memory or knew anything at all about this issue.


----------



## Mr-Dark

Quote:


> Originally Posted by *MyNewRig*
> 
> AMD was focusing more on consoles during that time because this is where most of its money is coming from if you check their latest earnings reports, also Sony and MS are happy to do business with AMD because they say that they are really easy to deal with, they get good prices from them as well, AMD has been doing more than great on consoles with that same old architecture, now they have shifted focus to PC again and i expect VEGA and Zen will be really good products since AMD has not always been that bad.
> 
> Regarding drivers and software support, that is no longer the case, have you been following the latest Nvidia drivers issues? it is a mess, open bugs carry on from one driver to the next without being fixed for months, the last BF1 driver was breaking windows before they release the hotfix yesterday and still a bunch of open bugs from a few months ago are still there, also that fake Micron BIOS fix has been promised for a very long time, it took them forever to deliver and until now only EVGA managed to get it out, nothing yet from ASUS, MSI, Zotac and all the rest.
> 
> AMD is sure not releasing a dual GPU card; we are all waiting for the VEGA architecture, which is rumored to come with 12 TFLOPS of compute performance, HBM2 memory with 512 GB/s of bandwidth, and a bunch of exciting rendering features borrowed from the PS4 Pro development cycle. Also, AMD drivers are better than Nvidia's in Windows 10: issues are far fewer, they work great, they get released fast and more frequently, and they even released the BF1 drivers two days before Nvidia and they were good.
> 
> VEGA development is being done silence without the hype that preceded Polaris, that is a very good sign that they are so confident the performance will be really good that no hype is needed, the product will just speak for itself, i see the tide is shifting.


I read the same thing when Maxwell came out, and still nothing... I don't hate AMD, but after 4 years with AMD and finally switching to Nvidia... night and day difference.

I expect nothing from Zen... it's the same old Bulldozer core but with a bigger cache and more cores... lol


----------



## MyNewRig

Quote:


> Originally Posted by *Mr-Dark*
> 
> I read the same when Maxwell out and still nothing... i don't hate AMD but after 4 year with amd and finaly switching to nvidia... night and day difference..
> 
> i expect nothing from Zen.. its same old buldozar core but with big cache and more core's... lol










How come you say that Zen is just Bulldozer when everyone knows it is a brand-new architecture, designed by the same guy who previously worked on the Athlon architecture, and has been in the works for 5+ years? Anyway, I don't care much about Zen because Intel's products have been good so far, but the market could still use some competition, so I hope AMD provides it.

I am, however, highly anticipating Vega. If it turns out to be as good as expected, I will be the first to jump on it; I game at high resolutions, so that super-fast HBM2 will be very useful to me. If not, Pascal V2 will be there with better memory, assuming they don't figure out some way to gimp the GTX 2070 like they have been doing with the x70 products.


----------



## msigtx760tf4

Quote:


> Originally Posted by *EDK-TheONE*
> 
> another bench:
> http://www.3dmark.com/3dm/15628501


Nice!
This is my s**t
http://www.3dmark.com/3dm/15629338


----------



## EDK-TheONE

Quote:


> Originally Posted by *asdkj1740*
> 
> RUN FSE/FSU?


http://www.3dmark.com/3dm/15629396


----------



## Avendor

Quote:


> Originally Posted by *msigtx760tf4*
> 
> Nice!
> This is my s**t
> http://www.3dmark.com/3dm/15629338




Pretty good. Do you have a water cooler for the CPU? And if I may ask, what's the max temperature at full load?


----------



## TheGlow

Quote:


> Originally Posted by *ucode*
> 
> Sorry I'm missing something here. When you say idle voltage do you mean you are idling at GPU 139MHz / Mem 101MHz with 0.8V?


GPU sits at around 1582 I think, and Mem at 4100MHz.
Quote:


> Originally Posted by *khanmein*
> 
> micron users can't hit 2.4k at all (confirm got artifacts)
> .


Please stop with the maltruths.
I am a Micron user I can hit 2.4k at all (confirm no artifacts)


As for stuttering, mine was fine since I got it in August, but this weekend I was having those crazy stuttering issues, after an Overwatch update and new drivers.
It took me a while, but it turned out to be my Plex media server streaming 2 instances of HEVC.
From what I can tell, 1 session was fine. Going to Task Manager and setting Overwatch.exe's priority to High or Realtime stopped it.
Task Manager was still showing low CPU usage, hence I didn't think it could have been related.


----------



## Roland0101

Quote:


> Originally Posted by *MyNewRig*
> 
> 3 Months to make a BIOS tweak?


ManuelG announced the new BIOS on 09/29/2016. That is less than a month ago. You really should try to get your facts straight.

Quote:


> Use Google, read the posts and measure the sentiment, i am not going to do this homework for you.


Translation: I don't have any facts, I just make things up because I saw a few posts on the internet.


----------



## EDK-TheONE

OC'd my CPU 5% via BCLK and memory @ 2950 MHz DDR4 and got a better score:
http://www.3dmark.com/3dm/15629678


----------



## MyNewRig

Quote:


> Originally Posted by *Roland0101*
> 
> ManuelG announced the new Bios on 09/29/2016. That is less than a months. You really should try to get your facts straight.


Count from when the issue was first discovered and reported, not from the announcement.

Quote:


> Translation: I don't have any facts, I just make things up because I saw a few posts on the internet.


Correction: I experienced these issues first-hand by trying a few different cards and comparing. You have only one card and have also just read some posts on the internet; you don't have any better tools than I do, and you never tested a Samsung card yourself.


----------



## EDK-TheONE

another bench: got 21874 GS








http://www.3dmark.com/3dm/15629899


----------



## msigtx760tf4

Quote:


> Originally Posted by *Avendor*
> 
> Pretty good, do you have water-cooler for Cpu, if i may ask, what's the max. temperature in full load?


Yeah, my i5 3570K is water cooled by a Zalman LQ 320. Max temp at full load at 5.1 GHz is 82-85 C at 1.490 V. The CPU is delidded, with Coollaboratory Liquid Ultra applied to the core.
My 24/7 setup is 5.0 GHz at 1.480 V and it's OK; max temp is 79-82 C in OCCT, and 70 C max in games.
Even 5.1 GHz is OK in games, with temps around 75 C.


----------



## EDK-TheONE

Quote:


> Originally Posted by *TheGlow*
> 
> GPU sits at at around 1582 I think, and Mem 4100MHz.
> Please stop with the maltruths.
> I am a Micron user I can hit 2.4k at all (confirm no artifacts)
> 
> 
> As for stuttering, mine was fine since I got it in August and this weekend I was having those crazy stuttering issues. After an Overwatch update and new drivers.
> It took me a while but it turned out to be my plex media server streaming 2 instances of hevc.
> From what I can tell 1 session was fine. Going to task manager and setting Overwatch.exe's priority to high or real-time stopped that.
> Task manager was still showing low CPU usage hence I didnt think that could have been related.


Could you provide a 3DMark FS bench?


----------



## Avendor

Quote:


> Originally Posted by *msigtx760tf4*
> 
> yeah my i5 3570K is water cooled by Zalman LQ 320. Max tem at full load at 5.1 ghz is 82-85 C at 1,490 V. CPU Delided and coollaboratory liquid ultra applied on Core
> 24/7 got setup at 5.0 ghz at 1.480 and it's ok, max temp is 79-82 C in OCCT. In game is 70 C max
> Even 5.1 ghz at game is ok and the temp is about 75C


Wow, thank you for the valuable information. One last question: isn't that voltage too high? What's the max range, 1.500 V? From what I know, it should not go above 1.4.


----------



## khanmein

Quote:


> Originally Posted by *TheGlow*
> 
> GPU sits at at around 1582 I think, and Mem 4100MHz.
> Please stop with the maltruths.
> I am a Micron user I can hit 2.4k at all (confirm no artifacts)
> 
> 
> As for stuttering, mine was fine since I got it in August and this weekend I was having those crazy stuttering issues. After an Overwatch update and new drivers.
> It took me a while but it turned out to be my plex media server streaming 2 instances of hevc.
> From what I can tell 1 session was fine. Going to task manager and setting Overwatch.exe's priority to high or real-time stopped that.
> Task manager was still showing low CPU usage hence I didnt think that could have been related.


Congrats, but I would like to know: do you set the fan speed to 100% max for 24/7 use + gaming? Nice temperatures there.

My ears can accept a max of 2.2K RPM during gaming; anything above that is loud to me.


----------



## QPSS

Quote:


> Originally Posted by *MyNewRig*
> 
> I never switch cards without totally refreshing my system, resetting system BIOS and reapply my settings, cleaning with DDU followed by CCleaner and then reinstalling the drivers, i even uninstall Afterburner with delete settings and reinstall it, everything gets refreshed with the new card, nothing else was changed on the software side.
> 
> Yes this is what i am telling you, we need more testing with professional tools by people who have both cards, i was just making a proposal and asked for it to be tested, never said i was sure at all about anything that has to do with Frametime measurement, it is just an idea proposed by a few people and i was asking for verification, all i have in that regard is the clear lag i experienced.
> 
> It is also worth mentioning that the Micron card was providing the right 3DMark score for its configuration, i have a habit after installing a new card and doing all that process described above that i run 3DMark to make sure that CPU/RAM/GPU settings are applied correctly and that my score is in the level is should be before lunching games.
> 
> After doing all that rotTR for example was very noticeably laggy in 2K compared the Samsung card were it was really smooth and fluent, and BTW this is the first thing that draw my attention to the Micron card having something off about it even before realizing it had Micron memory or knowing anything at all about this issue.


Lots of stuff can go wrong there. Just how inconsistent Windows installs are is a huge issue in itself. That's why pros use disk images to be 100% comparable.


----------



## MyNewRig

Quote:


> Originally Posted by *QPSS*
> 
> Lots of stuff can go wrong there. Just how inconsistent Windows installs are, its a huge issue. Thats why pros use images to be 100% comparable.


Yeah, it is interesting that these things that can go wrong never happen when I switch from Samsung to Samsung, but then happen when I change from Samsung to Micron. Looks like Windows loves Samsung and conspires against Micron.









Also interesting how all of Mjhieu's problems, which he had been complaining about for months, just magically disappeared once he installed the Samsung card in his system; he suddenly became a happy, satisfied user after months of misery.

Yeah, it is the things that can go wrong with Windows and not the VRAM. Great thinking.

Man, you are satisfied with your card, right? Just forget all this and go play some games already.


----------



## TheGlow

Quote:


> Originally Posted by *EDK-TheONE*
> 
> could you provide 3d FS bench??



This is an older Time Spy run. I have screenshots of the 3DMark results. Something happened with my install where the Steam version was using the wrong account, so I grabbed the standalone version. I have to bookmark the results because if I check my profile there are none. I have screenshots of them at home in a folder, so I can check if you want.
Quote:


> Originally Posted by *khanmein*
> 
> congrats but i would like to know u set the fan speed 100% max for 24/7 + gaming? nice temperature there..
> 
> my ear can accept max 2.2K RPM during gaming & above that consider loud to me.


That was only for the test. With no stress it was at 40 C; I was just showing I can push my sliders up high on the desktop without checkerboarding since I have the voltage up.
I have a fan curve where it goes to about 60% at 60 C, which is where it hovers when playing Overwatch.
I really don't hear it, and I often have headphones on, so I don't hear much of anything anyway.
But even at 100% I don't really recall hearing it, as it's in my tower, on the floor, off to the side.
I refuse to get really close and check for coil whine; that's the last thing I need bugging me.
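A fan curve like that is just linear interpolation between (temperature, duty) breakpoints. As a toy Python sketch, with made-up example points rather than anyone's actual curve:

```python
# Toy fan curve: linearly interpolate fan duty (%) between temperature breakpoints.
# The breakpoints below are made-up examples, not a real card's defaults.
CURVE = [(30, 30), (60, 60), (75, 90), (85, 100)]

def fan_duty(temp_c, curve=CURVE):
    """Return the fan duty in percent for a given GPU temperature in Celsius."""
    if temp_c <= curve[0][0]:
        return curve[0][1]   # below the first point: minimum duty
    if temp_c >= curve[-1][0]:
        return curve[-1][1]  # above the last point: maximum duty
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # straight line between the two surrounding breakpoints
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
```

Tools like Afterburner do essentially this when you drag points on the custom fan curve graph.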


----------



## criminal

Quote:


> Originally Posted by *MyNewRig*
> 
> Yeah, it is interesting when these things that can go wrong won't happen when i switch from Samsung to Samsung but then happens when i changed from Samsung to Micron, looks like windows love Samsung and conspires against Micron
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also interesting how all Mjhieu problems that he has been complaining about for months just magically disappeared once he installed the Samsung card in his system, he suddenly became a happy satisfied user after months of misery.
> 
> Yeah it is the things that can go wrong with windows and not the vRAM, great thinking.
> 
> Man you are satisfied with your card right? just forget all this and go play some games already ..


So did you give up on Pascal, or are you getting a Samsung card?


----------



## gtbtk

Quote:


> Originally Posted by *EDK-TheONE*
> 
> here is my result with little oc gpu: zotac amp extreme
> 
> 
> 
> 
> 
> 
> fs: http://www.3dmark.com/3dm/15627237


You should really look at upgrading that i3 if you can.


----------



## Roland0101

Quote:


> Originally Posted by *MyNewRig*
> 
> Count from when the issue was first discovered and reported not from the announcement.


gtbtk made his thread at geforce.com 1 1/2 months ago.
Quote:


> Correction: i experienced these issues first hand by trying a few different cards and comparing, you have only one card and have read some posts on the internet as well, you don't have any better tools than i do and you never tested a Samsung card yourself.


If everything you said regarding your card is true, you had a defective card.
I, on the other hand, have access to 3 Micron 1070s, plus the reports of gtbtk, TheGlow, QPSS and others.

So, do Samsung cards overclock more easily? Yes. Are Micron cards generally flawed? No.
Quote:


> Also interesting how all Mjhieu problems that he has been complaining about for months just magically disappeared once he installed the Samsung card in his system, he suddenly became a happy satisfied user after months of misery.


That's because he had a defective card as well (I told him to RMA that card a month ago), from the brand that acknowledged quality problems.
If he had changed that card a month ago for another brand, he could have enjoyed that feeling sooner.
Quote:


> Man you are satisfied with your card right? just forget all this and go play some games already ..


You don't even have a 1070 anymore.


----------



## gtbtk

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Quote:
> 
> 
> 
> Originally Posted by *khanmein*
> 
> if his cpu bottle-neck then my cpu can throw away.
> 
> 
> 
> i do not think my cpu impact on graphics score.
Click to expand...

It does to a small extent, but gaming also uses physics, which does make use of the CPU. An i5 will give you around a 10000 physics score; a 6700K even better.


----------



## MyNewRig

Quote:


> Originally Posted by *criminal*
> 
> So did you give up on Pascal or you getting a Samsung card?


Totally gave up. I'm playing games from my 2010-era library on my iGPU now; that Intel HD 530 is really not that bad, it runs quite a few things pretty well.









Where can I get that Samsung card? I decided I don't want an FE, and I've seen reports lately of some FEs coming with Micron, namely EVGA and Zotac. It's also a risk: if I somehow magically find a Samsung card now and it then fails, say, in a few months, I will have to accept a Micron replacement, and at that point I will have to go and shoot my retailer.


----------



## QPSS

Quote:


> Originally Posted by *MyNewRig*
> 
> Yeah, it is interesting when these things that can go wrong won't happen when i switch from Samsung to Samsung but then happens when i changed from Samsung to Micron, looks like windows love Samsung and conspires against Micron
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also interesting how all Mjhieu problems that he has been complaining about for months just magically disappeared once he installed the Samsung card in his system, he suddenly became a happy satisfied user after months of misery.
> 
> Yeah it is the things that can go wrong with windows and not the vRAM, great thinking.
> 
> Man you are satisfied with your card right? just forget all this and go play some games already ..


Yep, that's the funny thing about problems: they always happen in the weirdest fashion, ESPECIALLY on computers.

For the last time:
1) He only had the crashing problem.
2) He even said that his friend with Samsung has the same problem.
3) He temporarily fixed it by going maximum performance and later using 368.81 (the last WDDM 2.0 driver).
4) He never said anything about the card not being smooth. He actually said it's working fine with the temporary fix.

But I see you're posting your assumptions on the GeForce forum now too and sowing uncertainty among people.


----------



## MyNewRig

Quote:


> Originally Posted by *Roland0101*
> 
> That's because he had a defective card as well, (I told him to RMA that card a months ago.) from the brand that acknowledged quality problems. If he would have changed that card a months ago to another brand he could have enjoyed that feeling sooner.


Are you serious? I already exchanged one Micron Strix for another, which didn't fix the issues, and he has an MSI which he also attempted to switch earlier, but his retailer told him all cards are the same. So brand switching was not the answer; memory switching is. If someone had offered us Samsung-memory cards from day one, we would never have gotten to the point of posting in forums, but nothing of the sort was offered. I am not sure yet how he scored a Samsung, but I asked him on the dark side.

Quote:


> You not even have a 1070 anymore.


Yes, I am now the happy owner of an Intel HD 530, LOL. It is not fast, but it is stable, and Rocket League runs very well on it. I love Intel and hate Nvidia.


----------



## MyNewRig

Quote:


> Originally Posted by *QPSS*
> 
> But I see youre posting your assumptions on the Geforce forum now too and seeding uncertainty among people.


What assumptions? The dude is asking if it's a placebo or not, and I tell him I don't know; we need professional testing. What do you want? Why aren't you playing games now? Are you satisfied or not?


----------



## QPSS

Quote:


> Originally Posted by *MyNewRig*
> 
> What assumptions? the dude is asking if it s a placebo or not and i tell him i don't know we need professional testing, what do you want? why aren't you playing games now? you satisfied or not?


Your assumptions about what Mjhieu said.

I have to play when you tell me to? Sorry, I play when I want to and have time for it. And no worries, I am very much enjoying this card, and hopefully will even more once the BIOS fix is out and I can refit the cooler and squeeze a bit more out of the GPU and memory thanks to better cooling.


----------



## MyNewRig

Quote:


> Originally Posted by *QPSS*
> 
> Your assumptions about what Mjhieu said.
> 
> I have to play when you tell me to? Sorry, I play when I want to and have time for it. And no worries, I am very much enjoying this card, and hopefully will even more once the BIOS fix is out and I can refix the cooler and tickle a bit more out of the GPU and memory due to better cooling.


So what do you want now? Are you against testing? Did you buy a card to defend it or to play? If you are satisfied, go play and don't give me a headache; if you are not, just return it and get something else, or wait for Vega. I don't know what exactly you are arguing.


----------



## QPSS

Quote:


> Originally Posted by *MyNewRig*
> 
> So what do you want now? are you against testing? you bought a card to defend it or to play? if you are satisfied go play and don't give me a headache, if you are not just return it and get something else or wait for VEGA, i don't know what you are arguing exactly.


Maybe read my last few comments to find out what I am arguing about.
But I've noticed you ignore many parts of a comment, especially those that prove you wrong.
And again this time: I said I play when I want to. Being happy with your card doesn't mean you have to play 24/7.
But I guess you have achieved what you wanted. I give up trying to talk reason into you; you just ignore everything anyway.


----------



## MyNewRig

Quote:


> Originally Posted by *QPSS*
> 
> Maybe read my last few comments to find out what I am arguing about.
> But I noticed you ignore many parts of a comment, especially those that prove you wrong.


Okay, I am wrong. Feel better now? Time to shake off your doubts and go use your product. It is your money; you decide what to do with it, not me! If you feel the product is solid enough, you would not even care. When I had that Samsung GTX 1070, if someone had come and told me it would eat me alive in my sleep, I wouldn't have cared; that is how much I loved it. It was perfect. Is yours?


----------



## asdkj1740

Quote:


> Originally Posted by *EDK-TheONE*
> 
> http://www.3dmark.com/3dm/15629396


Very nice result, dude.
The vRAM clock plays an important role in these benchmarks.
Mine (Micron) is at 2190/2350 and scores less than 4900 marks....

You should upgrade your CPU if you care about the actual gaming experience.
At the very least, you should find the right motherboard to overclock your i3 to 4.5GHz.


----------



## MyNewRig

Quote:


> Originally Posted by *QPSS*
> 
> But I guess you have achieved what you wanted. I give up trying to talk reason into you. You just ignore everything anyway.


Dude, seriously, I no longer own the product; I know what I saw when I had it. Why does it bother you if I ask others to test a hypothesis? Really, we are both just trolling now. You have doubts which you should not have; the product is installed in your PC now and you can test it. Either you like it or you don't. If you are really satisfied with it, your FPS and frametimes are good in your own testing, and your sensitive eyes see smooth frames, then THAT IS ALL THAT MATTERS, regardless of what I or anyone else thinks!

If you keep going in the same direction I will just stop responding, because we are just bothering everyone else here and not adding any value to the thread.


----------



## zipzop

EVGA forums moderator says Micron memory was an NVidia choice that board partners did not have control over. Not sure if that question was ever completely answered up until now but there you go.

http://forums.evga.com/FindPost/2568431


----------



## MyNewRig

Quote:


> Originally Posted by *zipzop*
> 
> EVGA forums moderator says Micron memory was an NVidia choice that board partners did not have control over. Not sure if that question was ever completely answered up until now but there you go.
> 
> http://forums.evga.com/FindPost/2568431


+REP

WOW, thank you very much for this; that is totally new information. I took screenshots and saved the page in case they change it under pressure. It was very obvious to me that these were Nvidia's orders and that the manufacturers' hands were forced; it makes no sense for them to trash their own products like that unless it was forced upon them.

The poor manufacturers got all the blame for this. All the Nvidia fans and lovers blamed it on them: bad choice on their part, bad design, bad engineering, lack of testing and QC, and a bunch of other things, while Nvidia was as innocent as a baby.

All board partners switching to Micron at the same exact time, claiming a shortage in the face of consumer complaints when the ICs were obviously widely available and used in abundance in lower-end cards, plus all the shadiness and secrecy surrounding the issue: it was very obvious that it was a gimmick from Nvidia for business and marketing purposes.

When I presented this information, some used to laugh and say oh, that is a conspiracy theory, etc...

That guy will probably get in trouble for finally admitting what actually happened. I will post a screenshot for future reference in case things get changed or deleted.


----------



## zipzop

EVGA free thermal pads request form---> http://www.evga.com/thermalmod/


----------



## Aretak

Quote:


> Originally Posted by *zipzop*
> 
> EVGA forums moderator says Micron memory was an NVidia choice that board partners did not have control over. Not sure if that question was ever completely answered up until now but there you go.
> 
> http://forums.evga.com/FindPost/2568431


I don't think that's what he's saying. He's saying the *bug* that has been affecting all cards was something inherent to the base BIOS code that AIBs build their custom BIOSes from, and needed to be fixed on their end. Not that Nvidia made the choice to switch to Micron memory. That'd make no sense, as partners have always sourced their own components for their custom boards.


----------



## Roland0101

Quote:


> Originally Posted by *MyNewRig*
> 
> +REP
> 
> WOW, thank you very much for this, that is totally new information, i took screenshots and saved the page in case they change it under pressure, it was very obvious to me that it was Nvidia's orders and that manufacturers' hand were forced, it does not make sense for them to trash their products like that if it was not forced upon them.


The problem here is that the guy at EVGA did not say that.

He just said that the issue was out of EVGA's hands, probably because they rely on the vBIOS from Nvidia; something we (or I, in this case) learned from gtbtk's GeForce thread.
Quote:


> When i presented this information some used to laugh and say oh that is a conspiracy theory etc ...


First, you didn't present any information; you just speculated.
Second, I didn't call you a conspiracy theorist because you claimed that Nvidia ordered the switch. I still don't believe that Nvidia did that, but that alone would be no reason to argue.
I called you a conspiracy theorist because you claimed that Nvidia ordered the switch to gimp the 1070 in favour of the 1080.
And yes, that still is a conspiracy theory, and it is still not based on any facts whatsoever.
Quote:


> That guy will probably get in trouble for finally admitting what actually happened, i will post a screenshot for future reference in case things get changed or deleted.


Do you think they will ban him?









Quote:


> Originally Posted by *Aretak*
> 
> I don't think that's what he's saying. He's saying the *bug* that has been affecting all cards was something inherent to the base BIOS code that AIBs build their custom BIOSes from, and needed to be fixed on their end. Not that Nvidia made the choice to switch to Micron memory. That'd make no sense, as partners have always sourced their own components for their custom boards.


+1
Furthermore, the guy at EVGA who made this statement is not even an EVGA employee; he is just a forum moderator.


----------



## Forceman

Quote:


> Originally Posted by *MyNewRig*
> 
> Dude, seriously, i no longer own the product, i know what i know when i had it, why does it bother you if i ask others to test a hypothesis? really we are both just trolling now, you have doubts which you should not have, the product is installed in your PC now and you can test it, it is either you like it or you don't, if you are really satisfied with it, your FPS and Frametime is good in your own testing and your sensitive eyes are detecting smooth frames then, THAT IS ALL THAT MATTERS. regardless of what me or anyone else thinks!
> 
> If you keep going in the same direction i will just stop responding because we are just bothering everyone else here and not adding any value to the thread.


If you no longer own a 1070, why are you still spamming this thread? We get it, you don't like that they switched to Micron RAM, but since it no longer affects you maybe just let it go.


----------



## Snuckie7

Does anyone know why most 1070's can't boost past 2.1GHz? Is it artifacting, power/voltage limit, or something arbitrary in the BIOS?


----------



## Forceman

Quote:


> Originally Posted by *Snuckie7*
> 
> Does anyone know why most 1070's can't boost past 2.1GHz? Is it artifacting, power/voltage limit, or something arbitrary in the BIOS?


Most likely just a process/architecture limitation. 1080s hit about the same wall. Maybe with voltage mods you could go a little higher, but I don't think even LN2 is doing all that much.


----------



## ucode

Quote:


> Originally Posted by *TheGlow*
> 
> GPU sits at at around 1582 I think, and Mem 4100MHz.


So it's still in P0 or P2.

Can you try fixed voltage, say 1.000V or above while allowing the GPU to idle at the lower clock speeds in P5, P8. Cheers.

Quote:


> Originally Posted by *Forceman*
> 
> Most likely just a process/architecture limitation. 1080s hit about the same wall. Maybe with voltage mods you could go a little higher, but I don't think even LN2 is doing all that much.


With my 1080 I can run full load at 2.2GHz, with next to no load it will run nearly 2.4GHz but will crash with load. So it certainly doesn't seem like a programmed hard limit in VBIOS.


----------



## Snuckie7

Quote:


> Originally Posted by *Forceman*
> 
> Most likely just a process/architecture limitation. 1080s hit about the same wall. Maybe with voltage mods you could go a little higher, but I don't think even LN2 is doing all that much.


Huh, I see. I was just watching Digital Foundry's review of the card, and Richard commented that something seemed to be holding the core frequency back.


----------



## MyNewRig

Quote:


> Originally Posted by *Forceman*
> 
> If you no longer own a 1070, why are you still spamming this thread? We get it, you don't like that they switched to Micron RAM, but since it no longer affects you maybe just let it go.


Since you have a Samsung card on which you did some basic frametime measurement: do you have experience with that kind of thing, measuring and properly interpreting frametime charts? I ask because I am not sure whether that test was conclusive or not. I found the following article discussing how to do it properly; can you please go over it and follow that methodology if you have the time?

http://techreport.com/blog/28679/is-fcat-more-accurate-than-fraps-for-frame-time-measurements
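For anyone who wants to sanity-check smoothness themselves, here is a rough sketch of the kind of analysis those FRAPS/FCAT reviews do; all numbers below are made up purely for illustration, not measured on any card:

```python
# Toy frametime analysis in the style of FRAPS/FCAT reviews:
# average FPS can hide stutter, so also look at high-percentile frame
# times and at total time spent beyond a slowness threshold.

def frametime_stats(frametimes_ms):
    """Summarize a list of per-frame render times (milliseconds)."""
    ordered = sorted(frametimes_ms)
    n = len(ordered)
    avg = sum(ordered) / n
    # 99th-percentile frame time: 99% of frames rendered at least this fast.
    p99 = ordered[min(n - 1, int(n * 0.99))]
    # "Time spent beyond 33.3 ms" flags frames slower than 30 FPS (stutter).
    beyond_33 = sum(t - 33.3 for t in ordered if t > 33.3)
    return {
        "avg_fps": 1000.0 / avg,
        "p99_ms": p99,
        "ms_beyond_33": beyond_33,
    }

# Two invented runs with the same average FPS but very different smoothness:
smooth = [16.7] * 100
stutter = [15.0] * 99 + [185.0]  # one big spike

print(frametime_stats(smooth))
print(frametime_stats(stutter))
```

Both runs report the same average FPS, but the second one's 99th-percentile frame time exposes the spike, which is exactly why the article argues averages alone are not conclusive.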


----------



## kevindd992002

Quote:


> Originally Posted by *gtbtk*
> 
> You should really look at upgrading that i3 if you can


Do you think an i7-2600K clocked at 4.5GHz is bottlenecking my 1070? How would I test anyway?


----------



## Majentrix

Quote:


> Originally Posted by *kevindd992002*
> 
> Do you think an i7-2600K clocked at 4.5GHz is bottlenecking my 1070? How would I test anyway?


Look at your Firestrike graphics score. If it's significantly less than what others get with the same card, then you are bottlenecked. To directly answer your question: no.
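That rule of thumb can be written down as a tiny check; note the reference score and 10% tolerance below are placeholders I picked for illustration, not official figures, so substitute the typical Firestrike graphics score for your exact card model:

```python
# Rough CPU-bottleneck heuristic from the thread: compare your Firestrike
# graphics score against the typical score others get with the same GPU.
# Reference score and tolerance are illustrative placeholders.

def looks_cpu_bottlenecked(my_graphics_score, typical_graphics_score,
                           tolerance=0.10):
    """True if the score is more than `tolerance` below the typical result."""
    return my_graphics_score < typical_graphics_score * (1 - tolerance)

# Example: a GTX 1070 result vs. a hypothetical ~18000 typical score.
print(looks_cpu_bottlenecked(17500, 18000))  # within 10% -> False
print(looks_cpu_bottlenecked(14000, 18000))  # well below  -> True
```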


----------



## Majentrix

I wish Gainward made SLI HB bridges. Guess I'll settle for MSI.


----------



## gtbtk

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You should really look at upgrading that i3 if you can
> 
> 
> 
> Do you think an i7-2600K clocked at 4.5GHz is bottlenecking my 1070? How would I test anyway?
Click to expand...

I have an i7-2600 non-K in my rig, OC'd to 4.4GHz. I can pull a 10200 physics score (similar to a Skylake i5), but I cannot get better than a 20500 graphics score, and that is with settings that I cannot use 24/7. I would think that a 2600K should be similar or slightly better than what I get.

There are two things going on with the CPU during gaming/benchmarks:

There is the CPU's ability to feed data to the GPU over the life of the run ("write polygon here"), which in DX11 and OpenGL is typically single-threaded, and

the additional computational load of the physics calculations in games, which can be multi-threaded.

The original poster is running an i3-6100 and getting a 21000 graphics score; that is a good result. The i3 does a good job of feeding the GPU with instructions to draw graphical elements. The 5000 physics score just shows that a fast 2-core, 4-thread CPU does not have the same processing muscle as even a slower 4-core, 8-thread CPU. Is there an urgent need for him to upgrade? No. Will some games be affected more than others? Yes.

Given that gameplay in many titles relies on the physics calculations as much as on the graphics instructions, the user experience will be held back as the GPU waits for the physics to catch up. If he can stretch to an i5-6600K, which should be a relatively inexpensive upgrade as it is only a straight chip swap, the user experience should improve all around. Is that requirement urgent? No.

Sandy Bridge CPUs are amazing given their age. Will the performance of a 2600K match a 6700K? No, it won't. However, the difference is not big enough to make an upgrade urgent either. In my opinion it is still in the category of "good enough", although it is getting near retirement age.
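The interplay described above can be illustrated with a toy model; every number in it is invented purely for illustration, not measured on any real CPU or GPU. The point is that the single-threaded draw-call feed does not parallelize, while physics work scales across cores, so more cores shift a frame from CPU-bound to GPU-bound:

```python
# Toy model of a DX11-era frame: the CPU submits draw calls on one thread,
# physics spreads across cores, and the GPU renders in parallel. The frame
# cannot finish faster than its slowest stage. All numbers are invented.

def frame_time_ms(draw_submit_ms, physics_ms_total, cores, gpu_render_ms):
    # CPU-side cost: the slower of single-threaded submission and
    # physics work divided across the available cores.
    cpu_ms = max(draw_submit_ms, physics_ms_total / cores)
    # The GPU overlaps with CPU work, so the frame takes the max of both.
    return max(cpu_ms, gpu_render_ms)

# Same hypothetical GPU (8 ms/frame), different CPUs:
i3_like = frame_time_ms(draw_submit_ms=6.0, physics_ms_total=24.0, cores=2,
                        gpu_render_ms=8.0)   # physics: 12 ms -> CPU-bound
i7_like = frame_time_ms(draw_submit_ms=5.0, physics_ms_total=24.0, cores=8,
                        gpu_render_ms=8.0)   # physics: 3 ms -> GPU-bound

print(i3_like, i7_like)  # 12.0 8.0
```

With two cores the physics stage dominates and the GPU waits; with eight cores the same workload finishes early and the GPU becomes the limit, matching the "good graphics score, low physics score" pattern in the benchmark results above.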


----------



## kevindd992002

Quote:


> Originally Posted by *gtbtk*
> 
> I have an i7-2600 non K in my rig OC to 4.4Ghz. I can pull a 10200 physics score (similar to a skylake i5) but I cannot get better than a 20500 graphics score and that is with settings that I cannot use 24/7. I would think that a 2600K should be similar or slightly better then what I get.
> 
> There are two things going on with the CPU during gaming/benchmarks:
> 
> There is the CPU's ability to feed data to the GPU over the life of the run (write polygon here) which is typically in DX11 and open GL single threaded and,
> 
> The additional computational load being used in games doing the physics calculations which can be multi threaded.
> 
> The original poster is running an i3-6100 and getting a 21000 graphics score, that is a good result. The i3 does a good job of feeding the GPU with instructions to draw graphical elements. The 5000 physics score in just shows that a fast 2 core, 4 thread CPU does not have the same processing muscle as even a slower 4 core, 8 thread CPU. Is there an urgent need for him to upgrade? No. will there be some games effected more than others ? Yes.
> 
> Given that game play in many titles relies on the physics calculations to progress as well as as much as the graphics instructions, the user experience will be held back as the GPU has to wait more for the physics to catch up. If he can stretch to a i5-6600K which should be a relatively inexpensive upgrade as it is only a straight chip swap, then the user experience should be improved all around. Is that requirement urgent? No.
> 
> Sandy bridge CPUs are amazing given their age. Will the performance with a 2600K match a 6700K? No it wont. However the performance difference is not huge enough to need urgent attention to upgrade either. In my opinion it is still in the category of "good enough" although it is getting near to retirement age.


Thanks for the insight; I'm glad to read your answer. I'll probably upgrade when Kaby Lake arrives.


----------



## Roland0101

Quote:


> Originally Posted by *kevindd992002*
> 
> Do you think an i7-2600K clocked at 4.5GHz is bottlenecking my 1070? How would I test anyway?


In addition to what gtbtk said, you have a 1080p G-SYNC monitor. I doubt there is anything on the market you can't play on high settings with your setup.
So, will a more modern CPU be faster? Sometimes, depending solely on the games you play, but an i7-2600K is in no way a bottleneck in the real sense of the word for a configuration like yours.


----------



## khanmein

Today I decided to give up, since I saw this vid skipping my question regarding Micron, and if I'm not mistaken they still need another few months of testing Micron, or will find some other statement to prove there's no difference in performance. (If there's no performance difference, why can't they just announce it to us publicly?)

Steve Burke, where's the part 2 of the AMD RX 480 power issue on the cheap mobo you mentioned?? I commented but got no response.. I guess they're still doing research.

Another vid didn't talk about the VRM cooling, but I saw in the Guru3D review that the VRM hits nearly 100°C, and none of them said anything about it for a few months. This fella has a measurement tool to measure the VRM temp but mentioned nothing regarding the VRM. What a great reviewer, reaching 800k subscribers.

All the Micron fiasco thing is fake, and I got blamed for creating this whole havoc situation. I'm done~


----------



## gtbtk

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I have an i7-2600 non K in my rig OC to 4.4Ghz. I can pull a 10200 physics score (similar to a skylake i5) but I cannot get better than a 20500 graphics score and that is with settings that I cannot use 24/7. I would think that a 2600K should be similar or slightly better then what I get.
> 
> There are two things going on with the CPU during gaming/benchmarks:
> 
> There is the CPU's ability to feed data to the GPU over the life of the run (write polygon here) which is typically in DX11 and open GL single threaded and,
> 
> The additional computational load being used in games doing the physics calculations which can be multi threaded.
> 
> The original poster is running an i3-6100 and getting a 21000 graphics score, that is a good result. The i3 does a good job of feeding the GPU with instructions to draw graphical elements. The 5000 physics score in just shows that a fast 2 core, 4 thread CPU does not have the same processing muscle as even a slower 4 core, 8 thread CPU. Is there an urgent need for him to upgrade? No. will there be some games effected more than others ? Yes.
> 
> Given that game play in many titles relies on the physics calculations to progress as well as as much as the graphics instructions, the user experience will be held back as the GPU has to wait more for the physics to catch up. If he can stretch to a i5-6600K which should be a relatively inexpensive upgrade as it is only a straight chip swap, then the user experience should be improved all around. Is that requirement urgent? No.
> 
> Sandy bridge CPUs are amazing given their age. Will the performance with a 2600K match a 6700K? No it wont. However the performance difference is not huge enough to need urgent attention to upgrade either. In my opinion it is still in the category of "good enough" although it is getting near to retirement age.
> 
> 
> 
> Thanks for the insight and I'm glad reading your answer. I'll probably upgrade when Kaby Lake arrives.
Click to expand...

That is my plan too


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> today, i decided to give up since i saw this vid skipping my question regarding micron & if no mistake they still need another few month for testing micron or find other statement to prove there's no different in terms of performance >>
> 
> 
> 
> (if no performance different, y can't they just public announce to us?)
> 
> steve burke, where's the part 2 of AMD RX 480 power issue on the cheap mobo u mentioned?? i commented but no respond.. i guess still doing research.
> 
> another vid
> 
> 
> 
> didn't talk bout the VRM cooling but i saw the GURU3D review that the VRM is hit up nearly 100°c but none of them said anything bout it until few months. this fella got a measurement tool to measure the VRM temp but mentioned nothing regarding the VRM. what a great reviewer ever & reaching 800k subscribers.
> 
> all the micron fiasco thang are fake, blamed me created this whole havoc situation. i'm done~


Steve Burke mentioned in a Q&A video not long after they did part 1 that after a week of testing, they could not find any evidence of RX 480 PCIe power problems and that it did not harm their test motherboard. The AMD drivers solved the problem anyway.

The VRM heat thing is not really new; it's just that the complaints have gone viral for the 1070. Look at the 970 review here: http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/67958-evga-geforce-gtx-970-ftw-acx-2-0-review-4.html. 91 degrees is getting close, and since different reviewers did the tests, there is no matching of GPU loads to compare exactly like for like.

Just remember that he is a journalist with some knowledge about technology. That doesn't mean he knows everything. There are probably people on this forum with tech knowledge as good as or better than his.


----------



## khanmein

^^ no issue
Quote:


> Originally Posted by *gtbtk*
> 
> Steve Burke mentioned in a Q&A video not long after they did part 1 that after a week of testing, they could not find any evidence of RX480 PCIe power problems and that it did not harm their test motherboard. The AMD drivers solved the problem anyway.
> 
> 
> 
> 
> The VRM heat thing is nothing really that new other than the complaints have gone viral for the 1070. Look at the 970 review here http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/67958-evga-geforce-gtx-970-ftw-acx-2-0-review-4.html. 91 degrees is getting close and as different reviewers did the tests there is no matching in gpu loads to compare exactly like for like.
> 
> Just remember that he is a journalist with some knowledge about technology. That doesnt mean he knows everything. There are probably people on this forum with as good or better tech knowledge as him


Yeah, I saw that vid before (like I said, he mentioned it before, but I can't remember which vid) and he said there's no more issue. But if there's no issue, why no part 2? (Still gathering data?) Look carefully at the vid at 11:01.

Steve Burke is using his right hand, so you judge for yourself. (http://www.blifaloo.com/info/lies_eyes.php)

No VRM heat issue, yet EVGA is giving away free thermal pads (weird). If you received extras, can you give me some? Seriously, anyone who got extra thermal pads from EVGA, can you deliver them to my bro who stays at Edmond, OKC!

Vendors choose them to review their products based on their knowledge, and I don't think it's due to their popularity or subscriber count.. they're the professional tech people, and I'm just a troll keyboard warrior..


----------



## gtbtk

My take was that those 2 minutes of screen time were part 2. If he could find no issues, a dedicated part 2 would otherwise have been a very short video.

EVGA is taking the approach that it is better to deal with a perceived issue now, with a low-cost solution, rather than let it blow up, have everyone saying EVGA is crap, and lose business. They also get the benefit of the publicity, promoting the idea that they are proactive and have good support.

Vendors choose to market through them by supplying hardware because they have an audience of hundreds of thousands of people through YouTube and/or their web site, and they appear credible on screen. I can't see that it is in their interest to drop bombs on the vendors who supply them, unless they feel the negative press will not disadvantage them or the story is already out in the general public consciousness. Forum members here and at nvidia.com know about the issue, but we are not a very large share of the buying public. I suspect most of these sites only realized it is a real thing, and not the standard moaning of dissatisfied buyers, when Guru3D published the "new BIOS coming" article.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> My take was that 2 mins of screen time was part 2. If he could find no issues then a part 2 would be a very short video otherwise.
> 
> EVGA is taking the approach that it is better to deal with a perceived issue now with something that is a low cost solution rather than let it blow up and having everyone saying EVGA is crap and they lose business. They also get the benefit of the publicity, promoting the idea that they are proactive and have good support.
> 
> Vendors choose to market through them by supplying hardware because they have an audience of 100s of thousands of people through youtube and/or their web site and they appear credible on screen. I cant see that it is in their interests to drop bombs on the vendors who supply them unless they feel the negative press is not going to disadvantage them or if the story is already out in the general public consciousness. Forum members here and at nvidia.com know about the issue but we are not a very large number of the buying public. I suspect most of these sites only realized it is a real thing and not the standard moaning of dissatisfied buyers when guru3d published the New Bios coming article.


i still prefer EVGA more than MSI & GIGA.


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> My take was that 2 mins of screen time was part 2. If he could find no issues then a part 2 would be a very short video otherwise.
> 
> EVGA is taking the approach that it is better to deal with a perceived issue now with something that is a low cost solution rather than let it blow up and having everyone saying EVGA is crap and they lose business. They also get the benefit of the publicity, promoting the idea that they are proactive and have good support.
> 
> Vendors choose to market through them by supplying hardware because they have an audience of 100s of thousands of people through youtube and/or their web site and they appear credible on screen. I cant see that it is in their interests to drop bombs on the vendors who supply them unless they feel the negative press is not going to disadvantage them or if the story is already out in the general public consciousness. Forum members here and at nvidia.com know about the issue but we are not a very large number of the buying public. I suspect most of these sites only realized it is a real thing and not the standard moaning of dissatisfied buyers when guru3d published the New Bios coming article.


That's quite disgusting. Lots of fools on the EVGA forum said EVGA has good after-sales support...
These people won't think about the nature of the problem... actually they are encouraging EVGA to keep doing this.

The MSI Gaming X has the mid cooling plate too, but its plate has fins above it, so the VRM temps are not bad at all.


----------



## HOODedDutchman

Lol, haven't checked this thread in weeks and it's still going on about Micron memory. Glad both of mine are Samsung so I don't have to worry about this crap.


----------



## khanmein

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Lol haven't checked this thread in weeks and still going on about micron memory. Glad both mine are Samsung so I don't have to worry about this crap.


Usually after 3 months people start to forget, but I like to remind them.


----------



## HOODedDutchman

Quote:


> Originally Posted by *khanmein*
> 
> usually 3 months people will start to forget but i like to remind people..


Thought Nvidia was releasing a bios to fix the issue.


----------



## Roland0101

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Thought Nvidia was releasing a bios to fix the issue.


It's already out for EVGA and Gainward.


----------



## HOODedDutchman

Quote:


> Originally Posted by *Roland0101*
> 
> It's already out for evga and Gainward.


Well then lol.
Lets see more of this...

...instead of pouting.


----------



## Joenc

I'd like to thank all the 1070/1080 owners for being early adopters!

I'm sure in a couple of months there will be 1070ssc+, 1070ftw+, 1080ssc+ and 1080ftw+ cards released with all the fixes from the past few months...

So, thank you.


----------



## khanmein

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Well then lol.
> Lets see more of this...
> 
> ...instead of pouting.


Nice, same as mine.


----------



## khanmein

https://rog.asus.com/forum/showthread.php?87773-ASUS-Strix-1070-ALL-Using-Micron-Memory-ICs


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> https://rog.asus.com/forum/showthread.php?87773-ASUS-Strix-1070-ALL-Using-Micron-Memory-ICs


Meanwhile I am enjoying my Intel HD graphics with a 1300 Firestrike score.


----------



## Roland0101

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Well then lol.
> Lets see more of this...
> 
> ...instead of pouting.


I couldn't agree more.


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> Meanwhile i am enjoying my Intel HD graphics with 1300 Firestrike score


My Intel HD 4600 can play Dota 2 (everything low, 40%, windowed 1080p) at 60 fps, and Theme Hospital is playable.

Titan Quest is playable but a headache on my 1440p screen. I seriously can't play my FIFA 17 at all. I haven't been enjoying things for 5+ weeks and it's been miserable.

FYI, I rarely benchmark. When I got this rig I ran MemTest86 for a day, then AIDA64 Extreme for less than 5 minutes, until it reached a constant max temperature.

The second AIDA run was only because I changed the thermal paste on my processor, that's all.


----------



## HOODedDutchman

You guys are buying cards that are too expensive, that's your problem lol. I have two of the cheapest 1070s you can buy and both are Samsung. Maybe no one buys them because they think they are cheap lol.


----------



## khanmein

Quote:


> Originally Posted by *HOODedDutchman*
> 
> You guys buying too expensive of cards that's ur problem lol. I have 2 of cheapest 1070 u can buy n both Samsung. Maybe no one buys cuz they think they are cheap lol.


I didn't buy a GTX 1070 yet. I plan to purchase an Asus once I sell my 970.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> My take was that 2 mins of screen time was part 2. If he could find no issues then a part 2 would be a very short video otherwise.
> 
> EVGA is taking the approach that it is better to deal with a perceived issue now with something that is a low cost solution rather than let it blow up and having everyone saying EVGA is crap and they lose business. They also get the benefit of the publicity, promoting the idea that they are proactive and have good support.
> 
> Vendors choose to market through them by supplying hardware because they have an audience of 100s of thousands of people through youtube and/or their web site and they appear credible on screen. I cant see that it is in their interests to drop bombs on the vendors who supply them unless they feel the negative press is not going to disadvantage them or if the story is already out in the general public consciousness. Forum members here and at nvidia.com know about the issue but we are not a very large number of the buying public. I suspect most of these sites only realized it is a real thing and not the standard moaning of dissatisfied buyers when guru3d published the New Bios coming article.
> 
> 
> 
> thats quiet disgusting, lots of fools in evga forum said evga has good after sales support....
> these people wont think about the nature of the problem...actually they are encouraging evga to keep doing this.
> 
> msi gamingx has the mid cooling plate too but its plate has got some fins above the plate so vrm temp is not bad, at all.
Click to expand...

I don't think EVGA is doing anything wrong. All good companies try to manage their public perception and use marketing so the buying public knows they and their products exist. You can have the best product in the world, but no one will buy it if they don't know it exists.

Putting your brand and your product in the public's face, combined with doing things that show you care about customers, is generally how a company becomes the market leader, particularly if it doesn't have something unique that is much better than its direct competition. EVGA sells GPUs, motherboards and power supplies just the same as Asus, MSI, etc. RGB lights and plastic shrouds covering a heatsink can only do so much to make your product more desirable than anyone else's. At the end of the day, EVGA exists to make a profit just like every other company that makes things we buy. None of them are charities.

The real issue here is not the thermal pads, though, but the question: are the temperatures really a major problem, or is this something people are being stirred up to get emotional about? Distributing inexpensive thermal pads solves both problems without costing a fortune. Ignoring it and having it blow up, or having a media campaign launched against them, would cost a lot more. Look at the GTX 970 memory issue Nvidia tried to deal with by ignoring it.

I have not noticed that the MSI gets too hot, but I don't have a laser thermometer to test it and I have not found a sensor that reports on it.


----------



## TheGlow

Quote:


> Originally Posted by *gtbtk*
> 
> I have not noticed that MSI gets too hot but I don't have a laser thermometer to test it and I have not found a sensor to report on it.


I don't recall really feeling heat from other cards, but my MSI Gaming X does seem to put some out.
My new rig uses the same case as the old one, a CM HAF 912, and instead of 2x 80mm fans on top I have a 120mm, but otherwise a similar layout.
Often when playing I can feel a warm breeze coming out of the top of the case and hitting my legs.
This is at 60C. 60C is safe, correct? I've thought about turning the fans up more, since the noise doesn't bother me, but I'm not sure if that would shorten the fans' lifespan.


----------



## M3Stang

I got an evga superclocked one.


----------



## gtbtk

Quote:


> Originally Posted by *TheGlow*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I have not noticed that MSI gets too hot but I don't have a laser thermometer to test it and I have not found a sensor to report on it.
> 
> 
> 
> I don't recall really feeling heat with other cards but my MSI Gaming X does seem to put some out.
> My new right uses same case as old, CM Haf 912, and instead of 2x80mm fans on top I have a 120mm, but otherwise similar layout.
> Often when playing I can feel a warm breeze coming out of the top of the case and hitting my legs.
> This is at 60c. 60c is safe correct? I thought about bringing fans up more as the acoustics don't bother me but then I'm not sure if I will kill lifespan on the fans.
Click to expand...

60 degrees is fine. GPUs will generate heat; that is a fact of life and the reason for the fans in the first place. I was referring to overly hot VRM/backplate temps like in the EVGA report.

When I overclock, I run my fans at about 85% or with a curve and sit at 57-60 degrees, and it is not too loud for me even at 100%. However, I do also have a window AC running. The MSIs have ball-bearing fans, so they should last a long time.
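For anyone wanting to set up a curve like that in Afterburner, the mapping is just linear interpolation between breakpoints. A rough sketch of the idea (the breakpoints below are made up for illustration, not Afterburner's defaults):

```python
# Sketch of a temperature -> fan-speed curve, similar in spirit to a custom
# fan curve in MSI Afterburner. Breakpoints are hypothetical examples.
CURVE = [(30, 30), (50, 50), (60, 85), (70, 100)]  # (temp C, fan %)

def fan_percent(temp_c):
    """Linearly interpolate fan % between curve points, clamped at the ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(57))  # falls between the 50C/50% and 60C/85% points
```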


----------



## Roland0101

Quote:


> Originally Posted by *TheGlow*
> 
> Often when playing I can feel a warm breeze coming out of the top of the case and hitting my legs.
> This is at 60c. 60c is safe correct?


60° Celsius is a warm breeze for this card. The maximum operating temp is 94° Celsius.


----------



## TheGlow

Quote:


> Originally Posted by *Roland0101*
> 
> 60° Celsius is a warm breeze for this card. The max operation temp is 94° Celsius.


That's what I remember seeing. But then, humans can also go over two weeks without food; that doesn't mean I want to go even one day without it.


----------



## MyNewRig

Quote:


> Originally Posted by *TheGlow*
> 
> That's what I remember seeing. But then humans can also go over 2 weeks without food but doesnt mean I want to go past 1 day without.


All that is being measured is the core temp and core voltage. We don't know the VRM or memory chip temps or voltages on these cards, and we have no tools to measure them except thermal imaging or a laser thermometer.

The core staying below 70C does not mean the VRM is not exceeding 100C or so, as is the case with the FTW. That overheats the backplate, the PCB, the memory chips and maybe the die legs as well, none of which is necessarily reflected in the core temp the software reads.
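For what it's worth, the only temperature the driver exposes on these cards is the core sensor. A sketch of pulling it programmatically; the sample output line is hard-coded here so it runs without a GPU, but in practice you would capture stdout from the `nvidia-smi` command shown in the comment:

```python
import csv
import io

# The only temperature nvidia-smi reports is the core; there are no VRM or
# memory temperature fields. A real run would look like:
#   nvidia-smi --query-gpu=name,temperature.gpu,power.draw --format=csv,noheader
# A captured sample line stands in for the command output here.
sample = "GeForce GTX 1070, 60, 145.20 W\n"

name, temp_c, power = next(csv.reader(io.StringIO(sample), skipinitialspace=True))
print(name, int(temp_c), power)
```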


----------



## KedarWolf

Quote:


> Originally Posted by *TheGlow*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roland0101*
> 
> 60° Celsius is a warm breeze for this card. The max operation temp is 94° Celsius.
> 
> 
> 
> That's what I remember seeing. But then humans can also go over 2 weeks without food but doesnt mean I want to go past 1 day without.
Click to expand...

I know in the previous generation of cards you wanted to always stay under 80C. Likely still true.


----------



## MyNewRig

Quote:


> Originally Posted by *KedarWolf*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I know in the previous generation of cards you wanted to always stay under 80C. Likely still true.


I once had an AMD Sapphire card, I don't remember which model, where the VRM temps were actually reported by software. They used to exceed 100C and the card would produce a mild burning smell. I contacted Sapphire about the issue; they said it was normal and that those VRMs are rated for up to 120C. The core stayed below 80C most of the time, but the burning smell was worrying, so I ended up returning the card anyway.


----------



## EDK-TheONE

Is memory at 2402 MHz (effective 9608 MHz) safe for 24/7?
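For reference, the "effective" figure is just the reported GDDR5 clock times four (data is transferred on both edges of two clocks). A quick sanity check:

```python
# GDDR5 effective rate is 4x the clock that GPU-Z/Afterburner report,
# so 2402 MHz reads back as 9608 MHz effective.
def effective_gddr5(clock_mhz):
    return clock_mhz * 4

print(effective_gddr5(2402))  # 9608
print(effective_gddr5(2002))  # 8008, the stock GTX 1070 spec
```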


----------



## Majentrix

Some prelim results with SLI 1070s. +150MHz on the core and +400MHz on memory. I'll take it further when I get home tonight.


----------



## TylerAD

I thought so also, which is why I started using MSI Afterburner. I noticed the small stutters only when I had the graphs open.


----------



## Roland0101

Quote:


> Originally Posted by *gtbtk*
> 
> I have not noticed that MSI gets too hot but I don't have a laser thermometer to test it and I have not found a sensor to report on it.


I think you don't have to worry. Guru 3D
Quote:


> Originally Posted by *TheGlow*
> 
> That's what I remember seeing. But then humans can also go over 2 weeks without food but doesnt mean I want to go past 1 day without.


Neither do I. My wife is a chef.







What brand is your card?


----------



## TheGlow

Quote:


> Originally Posted by *Roland0101*
> 
> I think you don't have to worry. Guru 3D
> Neither do I. My wife is a chef.
> 
> 
> 
> 
> 
> 
> 
> From what brand is your card?


MSI Gaming X


----------



## Roland0101

Quote:


> Originally Posted by *TheGlow*
> 
> MSI Gaming X


Well, then the link I posted for gtbtk is also for you.


----------



## TheDeadCry

I think up to ~95 is "safe", but I would in no way recommend anybody coming within 10 degrees of that threshold. BTW, our Gaming Xs are supposed to keep the fans off until the card hits 60 degrees, so that should tell you 60 is pretty safe. I have as many fans as the case can hold, and my card hasn't hit 60 degrees in a while, not since I added all the fans. I'd be comfortable with up to 80, maybe 85 if you REALLY want to push it and are confident in your knowledge of your hardware. Then again, if anybody's 1070 is actually getting into the 80s, the card is straight up bad or is in a case without airflow. The 1070, at realistic overclocks, should not come close to 80... unless you live in an oven... or maybe have a cat, a blanket and your coffee placed on it... with a giant fat man, and also a fire blanket. Anyway, I'd consider anything under 70 degrees acceptable by my standards.


----------



## TheDeadCry

Quote:


> Originally Posted by *Majentrix*
> 
> Some prelim results with SLI 1070s. +150MHz on the core and +400MHz on memory. I'll take it further when I get home tonight.


That combined score seems fairly low. I can't imagine it's your CPU... I wonder what's up with that. I have a single 1070 and a 6700K and get about the same score, I think. EDIT: Wait, I just checked - I lied.

I have about 17000 max, I think.


----------



## Dude970

I think that is low for SLI. The graphics score is good; it's the Physics score bringing it down. OC that i5.


----------



## Majentrix

Quote:


> Originally Posted by *TheDeadCry*
> 
> That score seems fairly low in terms of combined score. I can't imagine it's your cpu... I wonder what's up with that. I have a single 1070, and a 6700k, and get about the same score, I think, at least. EDIT: Wait, I just checked - I lied
> 
> 
> 
> 
> 
> 
> 
> I have like 17000 max I think.


You can see my i5 6600k in the screenshot. It might have been running at stock speeds when I ran that bench, I cleared CMOS when I installed the second card.


----------



## Dude970

Quote:


> Originally Posted by *Majentrix*
> 
> You can see my i5 6600k in the screenshot. It might have been running at stock speeds when I ran that bench, I cleared CMOS when I installed the second card.


It is a good score; I meant you can squeeze another 1-2 out of the CPU.


----------



## TheDeadCry

Quote:


> Originally Posted by *Majentrix*
> 
> You can see my i5 6600k in the screenshot. It might have been running at stock speeds when I ran that bench, I cleared CMOS when I installed the second card.


Oh, I know. I was just saying that your 6600K shouldn't cause too much of a discrepancy, because you have an SLI setup whereas I have a single card. I was confused because your score seemed a little low compared to my single-card score; even though I have a 6700K, you should still get a far better score with dual 1070s. Like I said in the edit, my score was a lot lower than I remembered, so your score falls more in line with my expectations.


----------



## Majentrix

In any case I'll push it higher tonight


----------



## Majentrix

Highest I can get without artifacts.


----------



## MyNewRig

Quote:


> Originally Posted by *Majentrix*
> 
> 
> 
> Highest I can get without artifacts.


Is your CPU OCed? I can get a 10200 physics score with my 6600K running at 4.6GHz with 3000MHz DDR4.


----------



## Majentrix

4.5GHz with 1.285V. I can get it higher but I feel comfortable at the current speed.


----------



## MyNewRig

Quote:


> Originally Posted by *Majentrix*
> 
> 4.5GHz with 1.285V. I can get it higher but I feel comfortable at the current speed.


I am running 4.6GHz on auto voltage; it hovers between 1.32V and 1.36V. Temps in gaming are around 54-55C and the max is 61C during a CPU stress test. That is pretty safe IMO.

What are your temps at 1.285V, and what cooler are you using?


----------



## Majentrix

Quote:


> Originally Posted by *MyNewRig*
> 
> I am running 4.6Ghz on auto-voltage it hovers between 1.32v and 1.36v , temps in gaming are around 54-55c and the max is 61c during CPU stress test, that is pretty safe IMO
> 
> What are your temps on 1.285V and what cooler are you using?


A Phanteks TC12DX.



Temps reach 65C under a CPU-intensive game like Planetside 2.
Bear in mind I have no problems with my CPU's performance at the moment; if I overclocked it further it would just be for bigger numbers in benchmarks.


----------



## Mr-Dark

Hello

My second 1070 Gaming X arrived today. Here is my build with the two cards in SLI.



With some RGB's











Both at 16x PCIe 3.0... 5960X at stock and 32GB @ 2666MHz.



I had a problem with screen flickering on the desktop; the only fix was changing the power plan from Optimal to Adaptive. Now, after installing the second card and doing a clean install from Safe Mode, the problem is gone! No problems so far, no flicker on the Optimal power plan.
















My Firestrike:

http://www.3dmark.com/3dm/15662464?


----------



## MyNewRig

Quote:


> Originally Posted by *Mr-Dark*
> 
> With some RGB's


Are you using the NZXT HUE+ for the RGB?


----------



## Mr-Dark

Quote:


> Originally Posted by *MyNewRig*
> 
> Are you using the NZXT HUE+ for the RGB?


For sure


----------



## MyNewRig

Quote:


> Originally Posted by *Mr-Dark*
> 
> For sure


Okay, I noticed your lighting looks the same as mine when using the spectrum wave mode, so I suspected it was the HUE+.

Where are you hiding the control box?

I have a problem with my HUE+ and I'm not sure if you have the same one. I have the spectrum wave mode configured, and I also have ErP enabled in the BIOS to save power when the system is off. As a result, when I shut down, power is cut from the USB ports and the HUE+ control box. When I turn the system back on, all the LEDs are white until I log back into Windows and CAM loads, at which point it goes into spectrum wave mode again. This is very annoying. Are you having the same issue?


----------



## Mr-Dark

Quote:


> Originally Posted by *MyNewRig*
> 
> Okay, i noticed your lighting looks the same as mine when using that spectrum wave mode so i suspected it is the HUE+
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Where are you hiding the control box?
> 
> I have a problem with my HUE+ i am not sure if you are having the same, i have the spectrum wave mode configured, and i also have ErP enabled in BIOS to save power when my system is off, as a result, when i shutdown the power is cutoff from the USB and the HUE+ control box, when i turn my system back on all the LEDs are white until i login back to windows and CAM loads then it goes into spectrum wave mode again, this is very annoying, are you having the same issue?


I have the same; it's not a problem at all. The LEDs stay white until CAM loads... it's okay since boot time is less than 20 seconds.









My control box is in the HDD cage.


----------



## TheGlow

*poke*
9 hours since last post is too long.
Wheres dat msi bios?


----------



## HOODedDutchman

Might sell one of my 1070s if anyone is interested. It's a hassle keeping the noise down with two cards on air lol. Both my cards are Samsung so...


----------



## HOODedDutchman

Quote:


> Originally Posted by *Mr-Dark*
> 
> Hello
> 
> My Second 1070 Gaming-X Arrive today.. Here is my build with 2 card in SLI
> 
> 
> 
> With some RGB's
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Both at 16X PCI-E 3.0... 5960X at stock and 32GB @2666mhz
> 
> 
> 
> I had problem with screen flickering on the Desktop, the only fix for that changing the power plan from Optimal to Adaptive.. now after installing another card and a clean install from Safe mod, the problem gone! No problem So far.. no flicker on Optimal power plan
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My Firstrike
> 
> http://www.3dmark.com/3dm/15662464?




Not sure why, but people's graphics scores on X99 are **** compared to Z170. This is stock. Your cards should come clocked much higher than mine. I consistently see this all the time. Not sure how much overclocking your CPU would help, but I don't think it would give you 3000 more points on the graphics score. I also get better scaling in every game compared to SLI reviews using the X99 platform. Not sure where this behaviour comes from. You would think X99 would be better in every way, but apparently not.


----------



## khanmein




----------



## khanmein

Quote:


> Originally Posted by *HOODedDutchman*
> 
> 
> 
> Not sure why but people graphics scores on x99 are **** compared to z170. This is stock. Your cards should come clocked much higher then mine. I consistantly see this all the time. Not sure how much overclockijg your cpu would help but i dont think it would give u 3000 points graphics score. I also get better scaling in every game compared to sli reviews using x99 platform. Not sure where this behaviour comes from. You would think x99 would be better in every way but aparently not.


This is why you see a lot of tech reviewers on YouTube using a personal X99 rig for video editing and Z170 for their test bench. E.g. joker slunt, with less than 100k subscribers, had no choice but to spend some money on Z170 just to benchmark and show better scores; JayzTwoCents did the same thing. So quite obviously, if you prefer higher benchmark numbers go for Z170, and if you want to reduce time spent on video editing go for X99. Seriously, the difference between Z170 and X99 is significant, and I personally suggest a Z170 mobo because X99 is a real pain in the butt.


----------



## HOODedDutchman

Quote:


> Originally Posted by *khanmein*
> 
> this is y u saw a lot tech reviewer from youtube using personal rig with x99 for video editing & z170 for their test bench. e.g. joker slunt less than 100k subscriber no choice but to spent some money for z170 just for benchmark & show better score. another example, jayztwocents did the same thang too so quite obviously if u prefer higher gimmick benchmark go for z170 & if u're interested to reduce the time spent for video editing just go for x99. seriously,the different is significantly between z170 vs x99 & i personally suggest go for z170 mobo cos x99 is real pain in the butt.


Very weird though. Had a 3930K and a 3770K rig back in the day with GTX 670s in SLI and never saw this much difference. I guess the 670s obviously never pushed the system as hard as 1070s do. I wonder if the GPU score would go up if someone shut off all but 4 cores with Hyper-Threading on. That would be a good way to test. Either that or it's just an issue with the platform.


----------



## HOODedDutchman

I could probably get 40k out of my 1070s on GPU score if I tried. When I bump to about 1950 and +400 memory I get over 38k, and that's not even trying to overclock; that's just throwing some random guaranteed-stable numbers at my cards.


----------



## Mr-Dark

Quote:


> Originally Posted by *HOODedDutchman*
> 
> 
> 
> Not sure why but people graphics scores on x99 are **** compared to z170. This is stock. Your cards should come clocked much higher then mine. I consistantly see this all the time. Not sure how much overclockijg your cpu would help but i dont think it would give u 3000 points graphics score. I also get better scaling in every game compared to sli reviews using x99 platform. Not sure where this behaviour comes from. You would think x99 would be better in every way but aparently not.


I think the score is fine; 33.5k is about right for stock GTX 1070s in SLI. What clocks are your cards at?

Btw, I had 2x EVGA SC and a 6700K @ 4.7GHz. Here is the score:

http://www.3dmark.com/3dm/15006887?

and this with a 2GHz core and 4300MHz memory:

http://www.3dmark.com/3dm/15007001?

Nothing wrong with X99 at all... same score.









The 6700K, even at 4.7GHz, bottlenecked my 1070s very hard at 1440p @ 144Hz. Now with this 5960X at the stock 3.5GHz clock, my fps is stable at 144 with no drops at all!


----------



## HOODedDutchman

Quote:


> Originally Posted by *Mr-Dark*
> 
> I think the score is fine as 33.5k is fine for stock gtx 1070 in SLI.. your at which clock ?
> 
> Btw I had 2* Evga SC and 6700k @4.7ghz.. here is the score
> 
> http://www.3dmark.com/3dm/15006887?
> 
> and this with 2ghz core and 4300mhz memory
> 
> http://www.3dmark.com/3dm/15007001?
> 
> Nothing wrong with X99 at all.. same score
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 6700k even at 4.7ghz bottleneck my 1070's very hard on 1440p @144hz... now with this 5960X at stock clock which 3.5Ghz my fps stable at 144 and no drop at all!


That's stock on my cards, around 1860 core and reference memory clocks. No idea how your scores are lower than mine; you should be boosting quite a bit higher.


----------



## Mr-Dark

Quote:


> Originally Posted by *HOODedDutchman*
> 
> That's stock on my cards. Around 1860 core and reference memory clocks. No idea how ur scores are lower then mine. U should be boosting quite a bit higher.


Which driver do you have there? I'm not sure, but 33.5k is fine for stock 1070s in SLI.

You know the latest drivers are broken.


----------



## gtbtk

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mr-Dark*
> 
> Hello
> 
> My Second 1070 Gaming-X Arrive today.. Here is my build with 2 card in SLI
> 
> Both at 16X PCI-E 3.0... 5960X at stock and 32GB @2666mhz
> 
> 
> 
> I had problem with screen flickering on the Desktop, the only fix for that changing the power plan from Optimal to Adaptive.. now after installing another card and a clean install from Safe mod, the problem gone! No problem So far.. no flicker on Optimal power plan
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My Firstrike
> 
> http://www.3dmark.com/3dm/15662464?
> 
> 
> 
> 
> 
> Not sure why but people graphics scores on x99 are **** compared to z170. This is stock. Your cards should come clocked much higher then mine. I consistantly see this all the time. Not sure how much overclockijg your cpu would help but i dont think it would give u 3000 points graphics score. I also get better scaling in every game compared to sli reviews using x99 platform. Not sure where this behaviour comes from. You would think x99 would be better in every way but aparently not.
Click to expand...

Single-core CPU performance has an impact on these benchmarks. The CPU has to send draw calls for every frame and can be the bottleneck at lower resolutions, especially paired with a 1070 SLI pair. The Extreme and Ultra benchmarks will not bottleneck as much, since the lower number of higher-resolution frames per second does not need as many instructions from the CPU.

The 6700K is much faster than a stock 5960X in single-core performance. It is only in multi-threaded scenarios like the physics test that the 5960X destroys Skylake.

Overclocking the 5960X to 4.4 or 4.5GHz would certainly improve the graphics score at the lower resolutions.


----------



## Mr-Dark

Quote:


> Originally Posted by *gtbtk*
> 
> Single core CPU performance has an impact on these benchmarks. The the CPU has to send draw calls for every frame and can can be bottlenecked at lower resolutions, especially paired with a 1070 SLI pair. The Extreme and Ultra benchmarks will not bottleneck as much as the lower numbers of higher resolution frames per second do not need as many instructions from the CPU.
> 
> The 6700K is much faster than a stock 5960X in single core performance. It onlu in multi threaded scenarios like the physics test that the 5960X destroys Skylake performance.
> 
> Overclocking the 5960X to 4.4 or 4.5 Ghz would certainly improve the graphics score at the lower resolutions.


That's true, but not in Firestrike. Some benchmarks, such as Valley, prefer single-core performance. Btw, I'm planning to push my CPU on the weekend; I will report back on whether the graphics score changes.


----------



## MyNewRig

Quote:


> Originally Posted by *Mr-Dark*
> 
> That true but not on Firestrike.. some benchmark's as Valley prefer single core performance.. Btw I'm planing to push my cpu on the weekend.. I will report back if the graphic score changed or not


I wonder: if the drivers are already broken enough for single-card operation, what is the point of opting for an SLI setup that will probably give you a lot more issues? Do you usually play a particular game that has good SLI support?


----------



## Mr-Dark

Quote:


> Originally Posted by *MyNewRig*
> 
> I wonder, if drivers are already broken enough for single card operation, what is the point of opting for an SLI setup that will probably give you a lot more issues? are you usually playing a particular game that has good SLI support?


I have no problems with the drivers at all; some people report many issues with the latest drivers, which is why I said maybe it's the driver.

SLI works fine for the games I play: BF3, BF4, Hardline, GTA V, Battlefront, etc. No problems so far.


----------



## gtbtk

Quote:


> Originally Posted by *Mr-Dark*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Single core CPU performance has an impact on these benchmarks. The the CPU has to send draw calls for every frame and can can be bottlenecked at lower resolutions, especially paired with a 1070 SLI pair. The Extreme and Ultra benchmarks will not bottleneck as much as the lower numbers of higher resolution frames per second do not need as many instructions from the CPU.
> 
> The 6700K is much faster than a stock 5960X in single core performance. It onlu in multi threaded scenarios like the physics test that the 5960X destroys Skylake performance.
> 
> Overclocking the 5960X to 4.4 or 4.5 Ghz would certainly improve the graphics score at the lower resolutions.
> 
> 
> 
> That true but not on Firestrike.. some benchmark's as Valley prefer single core performance.. Btw I'm planing to push my cpu on the weekend.. I will report back if the graphic score changed or not
Click to expand...

Firestrike still favors single-threaded performance; it is the nature of DX11. If you run it with CPU thread utilization displayed on the OSD, there is certainly activity on all cores from physics calculations and the like (characters falling down, swords swinging in an arc, sparks flying, etc.), but during the graphics tests there is usually one thread running much harder than the others at any given time. At least that is true on a hyper-threaded quad core. The physics load will be spread over a greater number of threads, but I can't see that being conceptually different on an 8C/16T CPU.

You are pumping out 175 fps of quite highly detailed frame data, and every one of those frames needs the CPU to give it the appropriate draw call instructions. Moving from 1080p to 1440p or 4K increases the GPU work per frame, but then you are only outputting 50-100 frames a second. The number of "draw these elements" commands per frame will be roughly the same, or just a bit more, but the number of times per second the CPU has to issue them drops, simply because the GPUs can complete fewer frames given the number of pixels they have to push to the screen.

Edit:

I should have mentioned that Heaven has very little physics requirement, so it is much more single-thread focused. Valley has a bit more physics, with the raindrop section being the main part.
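The arithmetic behind that point, with made-up numbers (2000 draw calls per frame is a hypothetical figure, not something measured from Firestrike):

```python
# Draw calls per frame stay roughly constant across resolutions, so the
# CPU-side workload scales with FPS, not with pixel count.
DRAW_CALLS_PER_FRAME = 2000  # hypothetical scene complexity

def cpu_calls_per_second(fps, calls_per_frame=DRAW_CALLS_PER_FRAME):
    return fps * calls_per_frame

for res, fps in [("1080p", 175), ("1440p", 100), ("4K", 50)]:
    print(res, cpu_calls_per_second(fps), "draw calls/sec issued by the CPU")
```

So at 1080p the CPU is issuing roughly 3.5x as many draw calls per second as at 4K, which is why the bottleneck shows up at the lower resolutions.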


----------



## khanmein

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Very weird tho. Had a 3930k and a 3770k rig back in the day with gtx 670s in sli and never had this much difference. I guess the 670s obv never pushed the system as hard as 1070s do. I wonder if someone shut off all but 4 cores with hyperthreading if gpu score would go up. Be a good way to test. Either that or it's just an issue with the platform.


Seriously, would the benchmark be accurate with 4 cores disabled and HT on? Some processors have different L3 cache sizes, e.g. the 5960X vs the 6700K (20 vs 8).


----------



## HOODedDutchman

Quote:


> Originally Posted by *Mr-Dark*
> 
> Which driver do you have there? I'm not sure, but 33.5k is fine for stock 1070s in SLI..
> 
> You know how the latest drivers are broken


Not the latest, but the one before that. I don't have Titanfall 2 or BF1, so I haven't bothered updating yet.


----------



## gtbtk

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *khanmein*
> 
> This is why you see a lot of tech reviewers on YouTube using a personal rig with X99 for video editing and Z170 for their test bench. E.g. joker slunt, with less than 100k subscribers, had no choice but to spend some money on Z170 just for benchmarks to show better scores. Another example: JayzTwoCents did the same thing. So quite obviously, if you prefer higher benchmark numbers go for Z170, and if you're interested in reducing the time spent on video editing just go for X99. Seriously, the difference between Z170 and X99 is significant, and I personally suggest going for a Z170 mobo cos X99 is a real pain in the butt.
> 
> 
> 
> Very weird tho. Had a 3930k and a 3770k rig back in the day with gtx 670s in sli and never had this much difference. I guess the 670s obv never pushed the system as hard as 1070s do. I wonder if someone shut off all but 4 cores with hyperthreading if gpu score would go up. Be a good way to test. Either that or it's just an issue with the platform.
Click to expand...

GPU performance has increased much faster than CPU performance over the last 5 or 6 years.

Your 5960X running 4C/8T would perform about the same as, or maybe slightly better than, an i7-4790K downclocked to the same 3 GHz clock speed, thanks to the larger on-chip caches.


----------



## HOODedDutchman

Quote:


> Originally Posted by *gtbtk*
> 
> GPU perfomance has increased much faster than than CPU performance over the last 5 or 6 years.
> 
> Your 5960x running 4c/8T would perform about the same or maybe slightly better due to the larger on chip caches as a i7-4970K downclocked to the same 3Ghz clock speed.


It's only 3 GHz? Damn, well there's the answer lol. That's a big per-core bottleneck.


----------



## gtbtk

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> GPU perfomance has increased much faster than than CPU performance over the last 5 or 6 years.
> 
> Your 5960x running 4c/8T would perform about the same or maybe slightly better due to the larger on chip caches as a i7-4970K downclocked to the same 3Ghz clock speed.
> 
> 
> 
> It's only 3ghz. Dam well there's the answer lol. That's a big per core bottleneck.
Click to expand...

That's why I suggested the overclock.

A 4.2 GHz OC should be quite easy if you have a reasonable cooler; 4.5 GHz is possible if the chip cooperates and you have a high-end cooler.

You should see a good increase in your Firestrike scores.


----------



## HOODedDutchman

Quote:


> Originally Posted by *gtbtk*
> 
> That's why I suggested the Overclock.
> 
> A 4.2Ghz OC should be quite easy if you have a reasonable cooler. 4.5Ghz if the chip co-operates with you and you have a high end cooler is possible.
> 
> You should see a good performance increase in your firestrike scores


I never see reviewers using more than 4.2 GHz on them. It's stupid; a hex-core at 4.5-4.8 would likely outperform an eight-core at 4.2 for gaming all day long.


----------



## Mr-Dark

Quote:


> Originally Posted by *HOODedDutchman*
> 
> It's only 3ghz. Dam well there's the answer lol. That's a big per core bottleneck.


The base clock is 3 GHz, while all cores boost to 3.5 GHz with Intel Turbo Boost..

4.5 GHz is possible on my cooler at 1.300 V.. I think 4.2 GHz is the sweet spot


----------



## HOODedDutchman

So here is my score at stock. I was wrong about the GPU clocks: both cards spike to 1924 for a second, then settle at 1911 for the rest of the test. Memory is stock. This is with only the CPU overclocked, at 4.5 GHz and 1.3 V.
http://www.3dmark.com/3dm/15689672?

Here is +75 core and +400 memory. Just a random clock I threw on. Core ranged from 1962 to 1987.
http://www.3dmark.com/3dm/15689788?

For the hell of it I threw in +125 core and +450 memory with +50mv. Core ranged from 2012 to 2038.
http://www.3dmark.com/3dm/15689884?

Here is my Valley run, just for the heck of it... Stock clocks.


Driver for all this is 372.90.


----------



## HOODedDutchman

Here is how she looks right now.


----------



## rfarmer

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Here is how she looks right now.
> 
> 
> Spoiler: Warning: Spoiler!


Looks good.


----------



## _Killswitch_

Edited!

Well bought a GTX 1070 =)


----------



## HOODedDutchman

Quote:


> Originally Posted by *rfarmer*
> 
> Looks good.


Thanks. Just a few things left to do and she will be done for a while. Black Friday will hopefully get me into a 1 TB Samsung 850 EVO so I can sell off the 840 EVO and have two 850s with one big game drive. Currently on a 250 GB 840, 500 GB 850, and 1.5 TB HDD; I want that HDD out of there. Also need to find a way to run those top fans on 7 V instead of 12 V without having to use three individual 7 V 4-pin Molex adapters. I have a 4-pin Molex to 3x 3-pin fan header right now; I wonder if I could just switch the pin over... I'll have to compare it to a single 4-pin Molex to 3-pin fan adapter I have lying around and see if I can make it work.


----------



## Nukemaster

If they are uncontrolled fans, I can't see splitting a 7-volt adapter being an issue.

Some power supplies did not like the 7-volt trick. If that worries you, an inexpensive regulator (switching, to avoid too much heat) would be another option. One of those could run a good pile of fans.
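For context, the "7-volt trick" wires the fan between the +12 V and +5 V rails, so the fan sees 12 − 5 = 7 V; some PSUs dislike it because the fan returns current into the +5 V rail. A rough sketch of what that does to fan speed, assuming a hypothetical 1300 RPM fan and roughly linear RPM-vs-voltage scaling above a start-up threshold (both are illustrative assumptions, not datasheet facts):

```python
# Ballpark fan RPM at common Molex-derived voltages.
# Assumption: RPM scales roughly linearly with voltage above a ~4 V
# start-up threshold. Real fans deviate, so treat these as estimates.

RATED_VOLTAGE = 12.0
RATED_RPM = 1300          # hypothetical 140 mm case fan rating
START_VOLTAGE = 4.0       # below this the fan may not spin up at all

def estimated_rpm(volts: float) -> int:
    """Linear RPM estimate between start-up and rated voltage."""
    if volts < START_VOLTAGE:
        return 0
    return round(RATED_RPM * volts / RATED_VOLTAGE)

# 12 V = full speed, 7 V = the 12V-minus-5V trick, 5 V = quietest
for rail, volts in [("12V", 12.0), ("7V trick", 7.0), ("5V", 5.0)]:
    print(f"{rail}: ~{estimated_rpm(volts)} RPM")
```

A dedicated regulator or a fan controller avoids the back-feed issue entirely, which is why it is the safer option on picky PSUs.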


----------



## TheGlow

Quote:


> Originally Posted by *_Killswitch_*
> 
> Edited!
> 
> Well bought a GTX 1070 =)


Go visit their NYC super store! I've passed it so many times but never been in.


----------



## _Killswitch_

TheGlow, I live in Missouri, NYC is a little far away lol


----------



## gtbtk

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> That's why I suggested the Overclock.
> 
> A 4.2Ghz OC should be quite easy if you have a reasonable cooler. 4.5Ghz if the chip co-operates with you and you have a high end cooler is possible.
> 
> You should see a good performance increase in your firestrike scores
> 
> 
> 
> I never see reviewers using more then 4.2ghz on them. It's stupid a hex core at 4.5-4.8 would likely outperform an 8 core at 4.2 for gaming all day long.
Click to expand...

The problem is mostly heat related. 4.2 will run at 1.3 V, but 4.5 on many chips needs something like 1.45 V to run stable. That extra voltage makes the CPU run much hotter, so with the CPU at 4.5 under load there is the possibility that it is running at 90 degrees. That high heat will kill the CPU much faster.

A 6700K quad core will run games faster than a hex core. The situation changes when you start loading the machine up with many multi-threaded applications: the eight-core could probably game while you encode video, where the 6700K could not do that and stay usable.


----------



## gtbtk

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Here is how she looks right now.


nice rig


----------



## M3Stang

Anyone have problems with random failures to POST on Gigabyte motherboards? Maybe once every 15-20 startups? It started when I put the 1070 in.


----------



## rfarmer

Quote:


> Originally Posted by *M3Stang*
> 
> Anyone have problems with random no posting on gigabyte motherboards? Maybe once every 15-20 startups? Started when I put the 1070.


I have a Gigabyte Z170N G1 Gaming 5 and haven't had any issues.


----------



## M3Stang

Quote:


> Originally Posted by *rfarmer*
> 
> I have a Gigabyte Z170N G1 Gaming 5 and haven't had any issues.


I have a Gaming 3. It started right after I upgraded, and I have 3 days left to return the board. Wondering if I damaged it pulling the 1060 out; it was so hard to get out. I realized that was because it was catching on one of the screws for the case's PCI slot covers, so I don't think I really hurt the board.


----------



## gtbtk

Quote:


> Originally Posted by *M3Stang*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rfarmer*
> 
> I have a Gigabyte Z170N G1 Gaming 5 and haven't had any issues.
> 
> 
> 
> I have a gaming 3. Started right after I upgraded. Have 3 days toi retunr the board. Wondering if I damaged it pulling the 1060 out was so hard to get it out. Realized it was because it was grabbing one of the screws for the case pci delete plate things so dont think I really hurt the board.
Click to expand...

running the latest motherboard bios?


----------



## Majentrix

Tight fit.


----------



## khanmein

Quote:


> Originally Posted by *Majentrix*
> 
> 
> 
> 
> Tight fit.


I suggest using the reference (aka FE) blower-style cooler; it is more suitable for SLI.


----------



## M3Stang

Quote:


> Originally Posted by *gtbtk*
> 
> running the latest motherboard bios?


I was running it, and I am still running the same one: F7. A new beta BIOS came out like 4 days ago, but I'm afraid to play with that. F7 came out, I think, on 6/10; I don't even know if the 1070 had been announced yet.


----------



## HOODedDutchman

Quote:


> Originally Posted by *M3Stang*
> 
> I have a gaming 3. Started right after I upgraded. Have 3 days toi retunr the board. Wondering if I damaged it pulling the 1060 out was so hard to get it out. Realized it was because it was grabbing one of the screws for the case pci delete plate things so dont think I really hurt the board.


Is it not POSTing, or is the GPU not firing up and displaying anything? I'd bet GPU, since it's the only thing you changed.


----------



## HOODedDutchman

Quote:


> Originally Posted by *khanmein*
> 
> i suggest use reference aka FE blower style is more suitable for SLI mode.


I agree; I'd have gone FE for my cards if they weren't ugly and didn't have the green LED on the side. With 7 fans in my H440 I max around 70° on the top card now, though, at 65% fan. Not bad really. The case is louder than the GPUs most of the time. That's with 100% utilization in The Witcher 3 at 1440p with the frame cap and vsync off. The Witcher 3 is definitely the hottest my cards get, by far, compared to most games.


----------



## Arturo.Zise

Quote:


> Originally Posted by *Majentrix*
> 
> 
> 
> 
> Tight fit.


Those coolers are no joke. Some of the best air cards available.


----------



## asdkj1740

Quote:


> Originally Posted by *Arturo.Zise*
> 
> Those coolers are no joke. Some of the best air cards available.


it is the best.


----------



## Roland0101

Palit's BIOS is out: http://www.palit.com/palit/download.php?pc_cate=vga&pc=282&id=2629&lang=en


----------



## HOODedDutchman

Quote:


> Originally Posted by *Arturo.Zise*
> 
> Those coolers are no joke. Some of the best air cards available.


No cooler is efficient with that 3 mm gap between the cards. Three-slot coolers are usually terrible in SLI.


----------



## asdkj1740

Quote:


> Originally Posted by *Roland0101*
> 
> Palits BIOS is out: http://www.palit.com/palit/download.php?pc_cate=vga&pc=282&id=2629&lang=en


Oh my god, another new BIOS?
Or did they just upload the update log?


----------



## Ryusaki

For the people that had (Micron) memory problems: did the vBIOS update actually fix the problem, and do the cards now overclock similarly to the Samsung memory?

I want to upgrade to a 1070 and I am really not sure now, since the chance of getting a Samsung card is very slim.


----------



## HOODedDutchman

Quote:


> Originally Posted by *Arnezaki*
> 
> For the people that got memory ( micron) problems. DId the vBIOS update actually fix the problem and do they overclock now similiar like the samsung memory?
> 
> I want to upgrade to a 1070 and I am really not sure now. Since the chances to get a samsung is very slim.


You're talking about maybe a 1-2% performance advantage between what Micron chips can overclock to vs what Samsung chips can. Not worth skipping a card over. The issue was with power states on previous BIOS versions, so I doubt it will help maximum overclocks at all. Well, it will help with overclocking without forcing constant voltage, which many had issues with on the Micron cards.


----------



## MyNewRig

A BIOS update is now available for the ASUS Strix 1070 on the "GLOBAL" support page (not on the localized pages); you can download it for both the non-OC and OC versions:

Standard non-OC version: https://www.asus.com/Graphics-Cards/ROG-STRIX-GTX1070-8G-GAMING/HelpDesk_Download/

OC version: https://www.asus.com/Graphics-Cards/ROG-STRIX-GTX1070-O8G-GAMING/HelpDesk_Download/

You will find a new BIOS entry from which you can download an .exe update file.

Changelog:
GTX1070 updat bios
1. DUAL GTX 1070 series now support 0dB function.
2. Improve Micron memory overclock stability.

Looking forward to your testing results!


----------



## khanmein

Quote:


> Originally Posted by *Arnezaki*
> 
> For the people that got memory ( micron) problems. DId the vBIOS update actually fix the problem and do they overclock now similiar like the samsung memory?
> 
> I want to upgrade to a 1070 and I am really not sure now. Since the chances to get a samsung is very slim.


Don't waste your money on the crippled Micron cards, cos you won't see a single well-known tech reviewer using a Micron sample at all. It's worth the 1-2% to get Samsung, or try to buy a used one.


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> BIOS update now available for the ASUS Strix 1070 in the "GLOBAL" support page not in localized pages, here you can download for the non-OC and OC versions
> 
> Standard non-OC version: https://www.asus.com/Graphics-Cards/ROG-STRIX-GTX1070-8G-GAMING/HelpDesk_Download/
> 
> OC Version: https://www.asus.com/Graphics-Cards/ROG-STRIX-GTX1070-O8G-GAMING/HelpDesk_Download/
> 
> You will find a new entry BIOS from which you can download an .exe update file
> 
> Changelog:
> GTX1070 updat bios
> 1. DUAL GTX 1070 series now support 0dB function.
> 2. Improve Micron memory overclock stability.
> 
> Looking forward to your testing results!


Hopefully it works out so people stop whining.


----------



## HOODedDutchman

Quote:


> Originally Posted by *khanmein*
> 
> don't waste your money on crippled micron cos u can't see a single well known tech reviewer use micron sample at all. there's 1-2% to get samsung or tried to buy used one.


If you show me one game that goes from unplayable to playable by overclocking your GPU, even with Samsung memory, you have a valid point. There's no situation where a game becomes playable when it wasn't at stock. The only reason to overclock is to mess around and get some solid benchmark numbers.


----------



## khanmein

Quote:


> Originally Posted by *HOODedDutchman*
> 
> If u show me 1 game that becomes playable from unplayable by overclocking your gpu even with Samsung memory you have a valid point. There's no situation that a game becomes playable when it wasn't at stock. The only reason to overclock is to mess around n get some solid benchmark numbers.


E.g. GoW4 flickering, artifacts & freezes; a lot of users have this issue.. If there's no issue, then why is there not a single review from a YouTube tech reviewer?

I can't even find one. And if there's no issue, why release a new vBIOS in the first place?


----------



## HOODedDutchman

Quote:


> Originally Posted by *khanmein*
> 
> e.g GoW4 flickering, artifacts & freeze. there's a lot user got this issue..


Ok, yeah, I agree the fix BIOS is needed. I'm just saying that on a stable card, even with Samsung memory, overclocking vs stock does not make for a better experience other than slightly bigger numbers. There are no games that become playable at overclocked speeds that were unplayable at stock speeds.


----------



## khanmein

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Ok ya I agree the fix bios is needed. I'm just saying on a stable card even with Samsung memory overclocking the card vs stock does not make a better experience other then slightly bigger numbers. There's no games that become playable at overclocked speed vs unplayable at stock speed.


To this day I have never said a single word about Samsung OCing better than Micron making a huge performance difference. What I said is that Samsung has fewer issues like artifacts, flickering, stuttering, and other unknown issues.

Never mind, let's wait for the ASUS owners' feedback..


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Ok ya I agree the fix bios is needed. I'm just saying on a stable card even with Samsung memory overclocking the card vs stock does not make a better experience other then slightly bigger numbers. There's no games that become playable at overclocked speed vs unplayable at stock speed.


There is actually a performance difference of over 11% between an OCed 1070 and stock clocks in actual games, as reported by multiple reviewers and experienced by myself. This is especially true at higher resolutions like 2K and 4K, where frames become noticeably smoother.

Keep in mind that 11% is almost the difference between the 1070 and the 1080, because an OCed 1070 gets very close to the performance territory of the 1080.

Also, the Micron issue is not only about OC; there are many people, myself included, who experience multiple stability issues at stock settings with Micron memory. It is actually a bigger deal than you think for an enthusiast; the average user may not care that much if the card is stable enough at stock settings.

And like I said above, there is a reason they ensure that all media samples are sent with Samsung memory; it is not a coincidence.


----------



## HOODedDutchman

Quote:


> Originally Posted by *khanmein*
> 
> until today i never said a single word about samsung OC better than micron will make huge performance. what i said is samsung got less issue like artifacts, flickering, shuttering, & other unknown issue.
> 
> never mind let's wait the asus owner feedback..


Ok, just saying for the ignorant people whining about overclockability: I don't remember any manufacturer ever guaranteeing a card's ability to overclock.


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> There is actually a performance difference of over 11% between an OCed 1070 vs stock clocks in actual games as reported by multiple reviewers and experienced by myself, this is especially true in higher resolutions like 2K and 4K, frames become noticeably smoother.
> 
> Keep in mind that 11% is almost the difference between the 1070 and the 1080 because an OCed 1070 gets very close to the performance territory of the 1080.
> 
> Also the Micron issue is not only about OC, there are many people myself included who experience multiple stability issues on stock settings with Micron memory, it is actually a bigger deal than you think for an enthusiast, the average user may not care that much if the card is stable enough at stock settings.
> 
> And like said above, there is a reason they ensure that all media samples are sent with Samsung memory, it is not a coincidence.


The difference between a 1070 and a 1080 is over 20%. 11% at 50 fps is 5.5 fps. You're telling me you can see the difference in any game between 50 fps and 55.5 fps? Not a chance. Or 100 and 111 fps, or 30 and 33.3 fps. 11% will not make a difference in any game. And all media samples were sent out long before Micron cards started being sold. Stop thinking the world is some big conspiracy.
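The numbers being argued here reduce to simple percentage scaling; a quick sketch of what an 11% gain means in absolute frames at a few arbitrary base rates (the base rates are just the examples from the post):

```python
# Absolute fps gained from an 11% overclock at different base frame rates.
OC_GAIN = 0.11

for base_fps in (30, 50, 100):
    oc_fps = base_fps * (1 + OC_GAIN)
    print(f"{base_fps} fps -> {oc_fps:.1f} fps (+{oc_fps - base_fps:.1f})")
```

Whether a handful of extra frames is perceptible depends heavily on where you sit relative to your display's refresh rate, which is really the crux of the disagreement in the posts that follow.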


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Ok just saying for ignorant people whining about overclockability. I never remember any manufacturer ever guaranteeing a cards ability to overclock.


We discussed that a million times before and I really don't want to repeat myself, but like many said before, OC headroom is not a luxury; it is a quality assurance that stock settings will stay stable in a variety of situations: stressful workloads, future driver updates, future games that push the memory to the max, power/voltage fluctuations, etc. All decent DDR memory chips have around 10% OC headroom. Going back many generations, I have never owned a GPU that could not do at least +10% on memory; even my standards-spec DDR4 system memory can do that 10%. Everything except this Micron GDDR5 on the 1070, which is one more point to keep in mind.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> The difference between a 1070 and a 1080 is over 20%. 11% at 50fps is 5.5fps. Your telling me you can see the difference in any game between 50fps and 55.5fps. No a chance. Or 100 and 111fps or 30 and 33.3fps. 11% will not make a difference in any game. N all media samples were sent out long before micron cards started being sold. Stop thinking the world is some big conspiracy.


Charts have been posted many pages ago that show an OCed 1070 exceeding a stock 1080; if you browse back in this thread to when this discussion started you will find them. And yes, 11% is very noticeable at high resolutions when the card is already struggling. At 1080p it is not noticeable if you go from 100 FPS to 111 FPS, but at 4K going from 45 FPS to 50 FPS, or from 50 FPS to 55 FPS, is actually a very big deal and is noticeable.

Reviews that are coming out UNTIL THIS DAY are using Samsung memory; we are not talking about reviews from release. But anyway, all this has been discussed extensively before and I don't want to go there again. I personally believe the Micron memory is inferior and not worth the money, and I acted accordingly: I returned my card and got my money back a few days ago. If you believe otherwise, that is up to you, but you have Samsung cards anyway; you can't judge a product without owning it. Try one and let me know what you think.


----------



## Lahatiel

Quote:


> Originally Posted by *HOODedDutchman*
> 
> The difference between a 1070 and a 1080 is over 20%. 11% at 50fps is 5.5fps. Your telling me you can see the difference in any game between 50fps and 55.5fps. No a chance. Or 100 and 111fps or 30 and 33.3fps. 11% will not make a difference in any game. N all media samples were sent out long before micron cards started being sold. Stop thinking the world is some big conspiracy.


It depends.
Example: if you are not able to hold a constant 60 fps and only reach 55 fps, 11% more GPU speed can be the difference between a fluid and a stuttering feel (on the condition that no FreeSync or G-Sync is present and you rely on vsync).

But mostly there is no big change; it's only a nice-to-have.
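The stutter described here is v-sync quantization: with plain double-buffered v-sync on a 60 Hz panel, a frame that misses the ~16.7 ms deadline waits a full extra refresh, so a "55 fps" card alternates between 60 and 30 fps intervals rather than delivering a smooth 55. A minimal sketch of the effect (assuming strict double buffering and a constant per-frame render time):

```python
import math

# Double-buffered v-sync on a 60 Hz panel: a frame that misses a refresh
# deadline waits for the next one, so instantaneous fps is quantized to
# 60, 30, 20, ... Simplified model: strict double buffering, constant
# render time per frame.
REFRESH_HZ = 60
INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.7 ms per refresh

def displayed_fps(render_ms: float) -> float:
    """Effective frame rate when every frame takes render_ms to draw."""
    intervals = math.ceil(render_ms / INTERVAL_MS)  # refreshes consumed
    return REFRESH_HZ / intervals

print(displayed_fps(16.0))  # fits in one interval: full 60 fps
print(displayed_fps(18.2))  # a ~55 fps render rate drops to 30 fps
```

This is also why Adaptive V-Sync (mentioned a few posts down) helps: it disables sync for frames that miss the deadline instead of holding them for the next refresh.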


----------



## MyNewRig

Quote:


> Originally Posted by *Lahatiel*
> 
> It depends.
> Example: if you are not able to get a constant 60fps and only reach 55fps, 11% more gpu-speed can be the difference between a fluent or a stuttering feeling (on condition that no freesync or gsync is present and you rely on vsync).
> 
> But mostly there is no big change and only a nice to have.


+1

Exactly, and this too: dipping below 60 FPS with vsync enabled becomes laggy and stuttery, and that 11% can make a lot of difference in that scenario without sacrificing visual fidelity.


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> Charts have been posted before many pages ago that show an OCed 1070 exceeds a stock 1080, if you browse back in this thread to when this discussion started you will find them, and yes 11% is very noticeable in high resolutions when the card is already struggling, in 1080p it is not noticeable if you go from 100FPS to 111FPS, but in 4K going from 45FPS to 50FPS or from 50FPS to 55FPS is actually a very big deal and is noticeable.
> 
> Reviews that are coming out UNTIL THIS DAY are using Samsung memory, we are not talking about reviews from release, but anyways all this has been discussed extensively before and i don't want to go there again, i personally believe that Micron memory is inferior and not worth the money, i acted accordingly, returned my card and got my money back a few days ago, if you believe otherwise it is up to you, but you have Samsung cards anyways, you can't judge a product without owning it, try one and let me know what you think.


Let's see proof. I don't think I've seen a card yet that can even match a reference-clocked 1080; it's over 20% more powerful. Even a golden-sample 1070 that does 2200 MHz core would not match a stock 1080.


----------



## HOODedDutchman

If you look here, the Gaming Z overclocked slightly edges out the stock 1080 by less than 1%, and it says that was an 11% gain over the Gaming Z's stock clocks. But if you look at the performance summary, the Gaming Z is 14% slower than a reference 1080, and a reference 1070 is 18-21% slower depending on resolution. So overall, even overclocked with that 11% gain, it would still be about 3% slower.
https://www.techpowerup.com/reviews/MSI/GTX_1070_Gaming_Z/29.html


----------



## stulda

MSI please hurry up!


----------



## Roland0101

Quote:


> Originally Posted by *MyNewRig*
> 
> We discussed that a million times before and i really don't want to repeat myself, but like many said before OC headroom is not a luxury, it is a quality assurance that guarantee stock settings will be stable in a variety of situations like stressful workloads, future drivers updates, future games that pushes the memory to the max, power/voltage fluctuations, etc


No, not many said that; you are repeating it over and over again. And it's still not true: OC ability says nothing about the overall reliability of a memory chip.
Quote:


> +1
> 
> exactly and this too, dipping below 60FPS when vsync is enabled becomes laggy and stuttery, and that 11% can make a lot of difference in that scenario without sacrificing visual fidelity.


No.

It can get laggy at 60 FPS if you have a 60 Hz monitor and v-sync on, but it will never get laggy at 55 FPS. Plus, there is a nice little feature called Adaptive V-Sync that takes care of the under-60 = 30 FPS v-sync problem.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Let's see proof. I don't think I've seen a card yet that can even match a reference clocked 1080. Its over 20% more powerful. Even a golden sample 1070 that does 2200mhz core would not match a stock 1080.


https://www.techpowerup.com/reviews/MSI/GTX_1070_Gaming_Z/29.html



_"Actual 3D performance gained from overclocking is 11.0%."_

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1070/27.html



_"Actual 3D performance gained from overclocking is 13.4%."_


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> https://www.techpowerup.com/reviews/MSI/GTX_1070_Gaming_Z/29.html
> 
> 
> 
> _"Actual 3D performance gained from overclocking is 11.0%."_
> 
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1070/27.html
> 
> 
> 
> _"Actual 3D performance gained from overclocking is 13.4%."_


Did you even look at the performance summary page? That card is 22% slower than the 1080 overall at stock, so even if that 13.4% gain spans all games, it is still around 11% slower than a 1080 at stock speeds (the gaps compound: 0.78 × 1.134 ≈ 0.885).
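One nit on the arithmetic: percentage gaps compound multiplicatively rather than by subtraction, so the 22% deficit and the 13.4% OC gain combine like this (figures taken from the TechPowerUp numbers quoted above):

```python
# Percentage gaps compound multiplicatively, not by subtraction.
stock_1080 = 1.00
stock_1070 = stock_1080 * (1 - 0.22)   # 22% slower than the 1080 at stock
oc_1070 = stock_1070 * (1 + 0.134)     # +13.4% from overclocking

deficit = 1 - oc_1070 / stock_1080
print(f"OC'd 1070 vs stock 1080: {deficit:.1%} slower")
```

Either way the conclusion in the post holds: an overclocked 1070 still trails a stock 1080 overall.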


----------



## HOODedDutchman

Quote:


> Originally Posted by *Roland0101*
> 
> No, not many said that, you are repeating that over and over again. And it's still not true, OC ability says noting about the overall reliability of a Memory chip.
> No.
> 
> It can get laggy at 60FPS if you have a 60Hz Monitor and V-SYNC on, but it will never get laggy at 55 FPS. Plus, there is a nice little feature called Adaptive V-Sync that takes care of the under 60 = 30FPS V-SYNC problem.


I'm going to agree with you for the most part, but The Witcher 3 lags at 55 fps with vsync on or off, with the frame cap at 60 fps.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Did u even look at the performance summary page. That card is 22% slower then the 1080 overall at stock so that 13.4% gain if it spans across all games is still 8.6% slower then 1080 at stock speeds.


So? You got your proof; it is not 2 FPS slower like you told the other guy. Actually, the gap between the Samsung 1070 and the Micron 1070 is wider than the gap between an OCed Samsung 1070 and a stock 1080.

The lineup should probably be:

Micron GTX 1070 -> GTX 1070

Samsung GTX 1070 -> GTX 1070 Ti (with OC)

GTX 1080

GTX 1080 Ti

Not exactly, of course, but you get the idea.


----------



## HOODedDutchman

So here's the reason I got Samsung memory, and why nobody I've seen is having issues with Gigabyte cards: from what this says, they don't have 1070s with Micron memory.
http://www.kitguru.net/components/graphic-cards/matthew-wilson/firmware-updates-begin-rolling-out-to-fix-gtx-1070-memory-issues/


----------



## khanmein

Actually, a lot of the users that bought Micron cards don't post on or visit forums at all. Where are the ASUS Micron users? I want feedback??? Thanks.


----------



## khanmein

Quote:


> Originally Posted by *HOODedDutchman*
> 
> So here's the reason I got Samsung memory and nobody is having issues I've seen with gigabyte cards. They don't have 1070s with micron memory from what this says.
> http://www.kitguru.net/components/graphic-cards/matthew-wilson/firmware-updates-begin-rolling-out-to-fix-gtx-1070-memory-issues/


Actually, Gigabyte started using Micron quite some time ago, and even Zotac FE cards, etc.


----------



## HOODedDutchman

Quote:


> Originally Posted by *khanmein*
> 
> actually giga already started to use micron for quite some times & even zotac FE... etc


Hmmm, I haven't seen any, and this says otherwise? Ah, I read somewhere else that they use both. Maybe Gigabyte was smart enough not to just throw cards on the market because Micron said they would work fine, and actually made a BIOS to make them work properly. I would be surprised, cuz it's Gigabyte, but still lol.


----------



## Roland0101

Quote:


> Originally Posted by *HOODedDutchman*
> 
> I'm gonna agree with you for the most part but the witcher 3 lags at 55fps with vsync on or off with frame cap at 60fps.


I don't play The Witcher 3, but if it lags at 55 FPS then there is a different problem than normal frame-buffer lag.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> So here's the reason I got Samsung memory and nobody is having issues I've seen with gigabyte cards. They don't have 1070s with micron memory from what this says.
> http://www.kitguru.net/components/graphic-cards/matthew-wilson/firmware-updates-begin-rolling-out-to-fix-gtx-1070-memory-issues/


That is a lie from Gigabyte; this has also been addressed before in this thread. Gigabyte is using both memory types and even released a BIOS in August for Micron-memory G1 cards:

http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios


----------



## Roland0101

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Hmmm I haven't seen any and this says otherwise ?


Gigabyte is using Micron,

and why shouldn't they? It is completely fine memory.


----------



## HOODedDutchman

Quote:


> Originally Posted by *Roland0101*
> 
> Gigabyte is using Micron,
> 
> and why shouldn't they, It is completely fine memory.


Maybe they found the issue during quality checks before release and made a proper BIOS. I have a feeling a lot of the other companies just asked Micron if it would work at the rated speed, didn't bother to test properly, and threw cards on the market without in-depth testing to make money.


----------



## MyNewRig

Quote:


> Originally Posted by *Roland0101*
> 
> Blah


Would you just stop babbling and do something useful for once? Go update your Strix BIOS and let us know what results you get and how far you can OC.


----------



## Roland0101

Quote:


> Originally Posted by *MyNewRig*
> 
> Go update your Strix BIOS and let us know what results you get and how far you can OC.


Why do you care? You don't have a 1070 anymore.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Maybe they found the issue during quality checks before release and made a proper BIOS. I have a feeling a lot of the other companies just asked Micron whether it would work at the rated speed, didn't bother to test properly, and threw cards on the market without in-depth testing to make money.


Having multiple memory suppliers and switching memory during a production run is common practice when both chips are identical spec-for-spec. Until last gen, when the highest-end cards were configured at 7 Gbps, that was perfectly fine: no one noticed the difference or complained, and the different ICs worked with the same BIOS, much like different DDR3/DDR4 chips work with the same motherboard BIOS.

This time, because we have reached the maximum GDDR5 data rate of 8 Gbps, the Samsung and Micron ICs are not spec-for-spec identical; there are differences in stability, OC potential, voltage, power consumption, etc., and hence the complaints.


----------



## MyNewRig

Quote:


> Originally Posted by *Roland0101*
> 
> Why do you care? You don't have a 1070 anymore.


I want to buy a GPU. Go do something actually helpful for once: test it and tell us, and stop arguing.


----------



## Roland0101

Quote:


> Originally Posted by *MyNewRig*
> 
> I want to buy a GPU. Go do something actually helpful for once: test it and tell us, and stop arguing.


Oh, I will, when I have time, and not when you tell me to.


----------



## MyNewRig

Quote:


> Originally Posted by *Roland0101*
> 
> Oh, I will, when I have time, and not when you tell me to.


Okay, waiting for your feedback, hope it is not biased as usual.


----------



## _Killswitch_

I started looking at this thread because I wanted to upgrade my GTX 680 to a 1070, and I did buy a 1070 last night, but there seems to be a lot of arguing going on in this thread =S


----------



## Dude970

Yes, the thread has gone to %(#! I improved my TimeSpy score with the latest driver


----------



## _Killswitch_

Wish I could say that's good, but I don't know anything about the new benchmarks. I'm not really a benchmark person, but once I get my new 1070 I will run the new benchmarks just to see


----------



## Dude970

Quote:


> Originally Posted by *_Killswitch_*
> 
> Wish I could say that's good, but I don't know anything about the new benchmarks. I'm not really a benchmark person, but once I get my new 1070 I will run the new benchmarks just to see


You will love the card. Games like a champ


----------



## khanmein

Quote:


> Originally Posted by *_Killswitch_*
> 
> Wish I could say that's good, but I don't know anything about the new benchmarks. I'm not really a benchmark person, but once I get my new 1070 I will run the new benchmarks just to see


I'm not keen on benchmarks either, but there's some reason behind the use of Micron. A few months ago I started questioning the tech reviewers on YouTube, and they either skipped my question or treated it as routine.

They keep repeating that the review sample is Samsung and that there's nothing much they can do. I don't care whether some say there's no issue, but ask yourself: if there's no issue, why has no tech reviewer made a video to clarify it?

A video like that would gain more viewers than they can imagine. I'm helping them, but they clearly don't dare to mention it.


----------



## _Killswitch_

Well, I ordered this EVGA GTX 1070 last night. I'm sure the ACX-type cooler is probably better, but a blower fits how I want to do the fans in my new PC, plus every card I have owned has been blower style, so it's just what I'm used to. Seems I picked an odd card, because there are no reviews on it or anything as far as I can tell. So whether I get Micron or Samsung memory is up in the air; we will see what I get Wednesday when my card arrives


----------



## Dude970

Congrats and welcome to the club


----------



## khanmein

Quote:


> Originally Posted by *_Killswitch_*
> 
> 
> 
> Well, I ordered this EVGA GTX 1070 last night. I'm sure the ACX-type cooler is probably better, but a blower fits how I want to do the fans in my new PC, plus every card I have owned has been blower style, so it's just what I'm used to. Seems I picked an odd card, because there are no reviews on it or anything as far as I can tell. So whether I get Micron or Samsung memory is up in the air; we will see what I get Wednesday when my card arrives


Micron for sure, & I really hope that one day I can be a US citizen and vote for Donald Trump to solve this NVIDIA fiasco.


----------



## Dude970

Quote:


> Originally Posted by *khanmein*
> 
> Micron for sure, & I really hope that one day I can be a US citizen and vote for Donald Trump to solve this NVIDIA fiasco.


Geez man, let the Micron doom talk go, you guys are ruining the thread. Yes, I understand the grievance, but at this point enough is enough


----------



## MojoW

So I just got myself a 1070 G1 Gaming and I am pretty happy with it.
It has Samsung memory and boosted to 1960 MHz out of the box.
This is my first NVIDIA card in I don't know how long.

I'm trying to overclock, but with that aggressive boost I am not getting it much above the stock boost clocks.


----------



## Dude970

Quote:


> Originally Posted by *MojoW*
> 
> So I just got myself a 1070 G1 Gaming and I am pretty happy with it.
> It has Samsung memory and boosted to 1960 MHz out of the box.
> This is my first NVIDIA card in I don't know how long.
> 
> I'm trying to overclock, but with that aggressive boost I am not getting it much above the stock boost clocks.


Congrats and welcome to the club


----------



## _Killswitch_

Thanks, Dude. My GTX 680 has been a great card, but it's time for it to be retired lol. I really don't know why this Micron vs. Samsung memory thing is such a big deal =S As long as my card works and nothing is majorly wrong with it, I'm happy.
Besides, I've been pretty lucky: in the few years I've been building PCs I've only had to RMA one part, a 680i motherboard that came DOA, so I'm feeling pretty good about my new card =D


----------



## khanmein

Quote:


> Originally Posted by *_Killswitch_*
> 
> Thanks, Dude. My GTX 680 has been a great card, but it's time for it to be retired lol. I really don't know why this Micron vs. Samsung memory thing is such a big deal =S As long as my card works and nothing is majorly wrong with it, I'm happy.
> Besides, I've been pretty lucky: in the few years I've been building PCs I've only had to RMA one part, a 680i motherboard that came DOA, so I'm feeling pretty good about my new card =D


Same feeling as if you bought a Benz but it came with a Volkswagen engine.


----------



## asdkj1740

You'd better cancel it if it won't cost you anything extra.

An ASUS Turbo would be a better choice.


----------



## _Killswitch_

I suppose? I wouldn't buy a Benz to start with lol, but anyway, guess I just have to see how it turns out. I'm not going to get overly excited about getting Samsung or Micron. If my card works and suits my needs, I really don't care beyond that; call me weird I guess.


----------



## khanmein

Quote:


> Originally Posted by *asdkj1740*
> 
> You'd better cancel it if it won't cost you anything extra.
> 
> An ASUS Turbo would be a better choice.


Yeah, agreed. If I wanted to pick a blower type, I'd go for either ASUS or KFA2 (Galax).

This time round, EVGA's Pascal cards have a VRM cooling issue.


----------



## _Killswitch_

Only on the FTW models; the FE and blower-type cards aren't affected by it.


----------



## khanmein

Quote:


> Originally Posted by *_Killswitch_*
> 
> I suppose? I wouldn't buy a Benz to start with lol, but anyway, guess I just have to see how it turns out. I'm not going to get overly excited about getting Samsung or Micron. If my card works and suits my needs, I really don't care beyond that; call me weird I guess.


My comparison is that we paid a Benz price but received Volkswagen parts.

I didn't say the Volkswagen can't drive, but they cheated.

Others say "I'm using petrol, no issue; the one with the issue is diesel, so I don't even bother as long as my car is working."


----------



## M3Stang

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Is it not POSTing, or is the GPU not firing up and displaying anything? I'd bet the GPU, since it's the only thing you changed.


The computer doesn't beep; it just sits there with the fans running. My keyboard and mouse don't light up or anything.


----------



## rfarmer

Quote:


> Originally Posted by *M3Stang*
> 
> The computer doesn't beep; it just sits there with the fans running. My keyboard and mouse don't light up or anything.


Have you tried pulling the card out and seeing if the problem is still there?


----------



## _Killswitch_

Edited; my last reply might lead into conversations I do not like being involved in.


----------



## khanmein

Quote:


> Originally Posted by *_Killswitch_*
> 
> Edited; my last reply might lead into conversations I do not like being involved in.


God bless you & happy Deepavali~


----------



## M3Stang

Quote:


> Originally Posted by *rfarmer*
> 
> Have you tried pulling the card out and seeing if the problem is still there?


No. Since the problem is so rare, it's hard to replicate, and every time I have the computer on I am playing games and using dual monitors, so switching to the iGPU isn't really the best option for me. It has probably been about 5 days since the last occurrence, and before that it was 8 days, so it's happening about twice every two weeks. Before I reseated the RAM it would happen about twice every day or two.


----------



## _Killswitch_

I could be wrong, but you say the fans will spin, yet your keyboard/mouse don't light up or anything? Kind of sounds like your motherboard is flaky.

Edited: maybe the PSU? What PSU are you using? I didn't see PC specs.


----------



## MojoW

Quote:


> Originally Posted by *M3Stang*
> 
> No. Since the problem is so rare, it's hard to replicate, and every time I have the computer on I am playing games and using dual monitors, so switching to the iGPU isn't really the best option for me. It has probably been about 5 days since the last occurrence, and before that it was 8 days, so it's happening about twice every two weeks. Before I reseated the RAM it would happen about twice every day or two.


Doesn't your board have (code-based) error LEDs on it?
Where does it hang, and on which error code?


----------



## Lahatiel

Quote:


> Originally Posted by *_Killswitch_*
> 
> Only on the FTW models; the FE and blower-type cards aren't affected by it.


On all cards with the ACX 3.0 cooler, not just the FTW models.


----------



## M3Stang

Quote:


> Originally Posted by *_Killswitch_*
> 
> I could be wrong, but you say the fans will spin, yet your keyboard/mouse don't light up or anything? Kind of sounds like your motherboard is flaky.
> 
> Edited: maybe the PSU? What PSU are you using? I didn't see PC specs.


I salvaged it from my old build; it's about 4 years old. It's a Corsair TX850. It worked fine powering a build with a Q9450 and a GTX 590, and this build uses much less power. Current specs:
6700K
16GB DDR4
GTX 1070
TX850


----------



## asdkj1740

Quote:


> Originally Posted by *Lahatiel*
> 
> On all cards with the ACX 3.0 cooler, not just the FTW models.


The ASUS Turbo 1080 uses the IR3555, and it is said the 1070 uses the same MOSFET too.
From what I know, the last time I saw a blower-style card with such a great PCB was the Classified 680.


----------



## _Killswitch_

Quote:


> Originally Posted by *Lahatiel*
> 
> On all cards with the ACX 3.0 cooler, not just the FTW models.


Case in point: that's why I bought a blower-style card, and why I didn't go with the Black Edition, which was cheaper but had the ACX cooler.


----------



## gtbtk

Quote:


> Originally Posted by *HOODedDutchman*
> 
> So here's the reason I got Samsung memory, and why nobody I've seen is having issues with Gigabyte cards: they don't have 1070s with Micron memory, from what this says.
> http://www.kitguru.net/components/graphic-cards/matthew-wilson/firmware-updates-begin-rolling-out-to-fix-gtx-1070-memory-issues/


The article is wrong; they do have Micron cards.


----------



## gtbtk

Well, I put the new ASUS OC BIOS on my MSI Gaming X card tonight to see how it would go.

The memory checkerboard/BSOD issue seems to have been resolved; it works like any other graphics card I have ever owned. It is now easy to start an OC session with memory set to +550 from .625v (9100 MHz effective). Before, that would guarantee a white checkerboard screen and a Video Scheduler BSOD.

If I OC the memory too high, to +600, I still get blue checkers on the screen as happened before; however, it is only on part of the screen and not full screen like it was with the MSI BIOS.

With the ASUS BIOS the card draws less power than with the MSI BIOS, but performance in Heaven is slightly better than the best score I could ever get with the MSI BIOS (129 on the Extreme preset; the MSI BIOS was just over 128).

In Fire Strike, the best I managed in the short time I have been playing is close to the highest scores I ever got with the old MSI BIOS. I am sure that with more time I could beat the MSI BIOS's best score.

ASUS BIOS: http://www.3dmark.com/fs/10609084

MSI BIOS: http://www.3dmark.com/fs/10573318

I'm really looking forward to getting hold of the MSI version to see how that works with proper power management.


----------



## Dude970

Quote:


> Originally Posted by *gtbtk*
> 
> Well, I put the new ASUS OC BIOS on my MSI Gaming X card tonight to see how it would go.
> 
> The memory checkerboard/BSOD issue seems to have been resolved; it works like any other graphics card I have ever owned. It is now easy to start an OC session with memory set to +550 from .625v (9100 MHz effective). Before, that would guarantee a white checkerboard screen and a Video Scheduler BSOD.
> 
> If I OC the memory too high, to +600, I still get blue checkers on the screen as happened before; however, it is only on part of the screen and not full screen like it was with the MSI BIOS.
> 
> With the ASUS BIOS the card draws less power than with the MSI BIOS, but performance in Heaven is slightly better than the best score I could ever get with the MSI BIOS (129 on the Extreme preset; the MSI BIOS was just over 128).
> 
> In Fire Strike, the best I managed in the short time I have been playing is close to the highest scores I ever got with the old MSI BIOS. I am sure that with more time I could beat the MSI BIOS's best score.
> 
> ASUS BIOS: http://www.3dmark.com/fs/10609084
> 
> MSI BIOS: http://www.3dmark.com/fs/10573318
> 
> I'm really looking forward to getting hold of the MSI version to see how that works with proper power management.


Nice, look forward to the results


----------



## gtbtk

One thing that I have noticed is that even if the application crashes, I do not get DPC Watchdog or Video Scheduler BSODs any more.

The card is much more stable at the OS level.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> Well, I put the new ASUS OC BIOS on my MSI Gaming X card tonight to see how it would go.
> 
> The memory checkerboard/BSOD issue seems to have been resolved; it works like any other graphics card I have ever owned. It is now easy to start an OC session with memory set to +550 from .625v (9100 MHz effective). Before, that would guarantee a white checkerboard screen and a Video Scheduler BSOD.
> 
> If I OC the memory too high, to +600, I still get blue checkers on the screen as happened before; however, it is only on part of the screen and not full screen like it was with the MSI BIOS.
> 
> With the ASUS BIOS the card draws less power than with the MSI BIOS, but performance in Heaven is slightly better than the best score I could ever get with the MSI BIOS (129 on the Extreme preset; the MSI BIOS was just over 128).
> 
> In Fire Strike, the best I managed in the short time I have been playing is close to the highest scores I ever got with the old MSI BIOS. I am sure that with more time I could beat the MSI BIOS's best score.
> 
> ASUS BIOS: http://www.3dmark.com/fs/10609084
> 
> MSI BIOS: http://www.3dmark.com/fs/10573318
> 
> I'm really looking forward to getting hold of the MSI version to see how that works with proper power management.


I think MSI will release a new vBIOS soon. So did ASUS really fix the issue?


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> I think MSI will release a new vBIOS soon. So did ASUS really fix the issue?


A week ago I was told by MSI that their BIOS should be available in about two weeks. If that is true, it should arrive at the end of next week or early the following week.

Not having an ASUS card, I cannot give an absolute answer, but the BIOS on my card seems to work perfectly. In the few hours I have had it installed, I have not had a single BSOD, whereas with the MSI BIOS I can trigger a BSOD on command.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> A week ago I was told by MSI that their BIOS should be available in about two weeks. If that is true, it should arrive at the end of next week or early the following week.
> 
> Not having an ASUS card, I cannot give an absolute answer, but the BIOS on my card seems to work perfectly. In the few hours I have had it installed, I have not had a single BSOD, whereas with the MSI BIOS I can trigger a BSOD on command.


Great news for you. Thanks.


----------



## TheDeadCry

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Hopefully works out so people stop whining.


waaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaahhhhhhhh

rio...


----------



## rfarmer

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Hopefully works out so people stop whining.


You think this is bad? You should have been in the 970 owners club when the news came out about the 3.5GB memory. OMG you would have thought someone shot their dogs and stole all their lunch money, never seen so much whining in my life. This is minor by comparison.


----------



## Avendor

Quote:


> Originally Posted by *HOODedDutchman*
> 
> So here's the reason I got Samsung memory and nobody is having issues I've seen with gigabyte cards. They don't have 1070s with micron memory from what this says.
> http://www.kitguru.net/components/graphic-cards/matthew-wilson/firmware-updates-begin-rolling-out-to-fix-gtx-1070-memory-issues/


That's not true. I personally do have Micron memory on my Gigabyte card; they promised a new vBIOS.


http://www.christiantoday.com/article/nvidia.gtx.1070.news.technical.issues.plague.graphics.card.owners.fix.to.roll.out.soon.via.bios.update/98705.htm


----------



## TheDeadCry

Quote:


> Originally Posted by *M3Stang*
> 
> No. Since the problem is so rare, it's hard to replicate, and every time I have the computer on I am playing games and using dual monitors, so switching to the iGPU isn't really the best option for me. It has probably been about 5 days since the last occurrence, and before that it was 8 days, so it's happening about twice every two weeks. Before I reseated the RAM it would happen about twice every day or two.


What are your BIOS settings? Latest BIOS? Is anything overclocked (motherboard, RAM, CPU, anything in the mobo settings; in other words, not stock)? What is your FCLK at, 800 MHz or 1 GHz?


----------



## Roland0101

OK, new ASUS vBIOS results.

First, the IMHO most important thing: the checkerboard-artifact crashes are completely gone.
I set all my games and benchmarks back to Adaptive or Optimal power mode and tested about every 3D application I have with the highest memory OC I could achieve.
No checkerboard crashes whatsoever.

Furthermore, if the OC is set too high, it results in driver crashes, not in blue screens or checkerboard-artifact crashes.

The overall overclocking headroom is only slightly higher: before the BIOS update I could achieve 8808 MHz effective (with locked voltage); now I can achieve 8912 MHz effective (without locked voltage).

Fire Strike 1

Fire Strike 2


----------



## Wonnebju

Quote:


> Originally Posted by *HOODedDutchman*
> 
> If you show me one game that goes from unplayable to playable by overclocking your GPU, even with Samsung memory, then you have a valid point. There's no situation where a game becomes playable when it wasn't at stock. The only reason to overclock is to mess around and get some solid benchmark numbers.


I may prove you wrong. On my Gigabyte Radeon HD 7970 WindForce at stock 1000/1375 MHz, playing The Witcher 3 was impossible due to FPS in the single digits.
When I set it to +50 MHz on the core, the game ran at ~40-50 FPS. To this date I still don't have a freaking clue why.


----------



## blued

Posted this in another forum (hello khanmein). Micron Zotac AMP vRAM test: +700 MHz, no issues at all (no OC on the core). No BIOS update from Zotac yet, and I will likely not bother with it when it's released. I bought the card knowing in advance about the Micron issue. Since I don't normally OC my cards, all I required was for the card to function normally at stock clocks; anything above that is a bonus which I may not even utilize.


Spoiler: Warning: Spoiler!


----------



## blued

Quote:


> Originally Posted by *Roland0101*
> 
> The overall overclocking headroom is only slightly higher: *before the BIOS update I could achieve 8808 MHz effective (with locked voltage)*; now I can achieve 8912 MHz effective (without locked voltage).


Missed that. Cool. The way some folks were going on about this, anything above +300 MHz on a Micron card was not possible, and every Micron card owner had a certified lemon and was not entitled to enjoy his card. Thanks.


----------



## TheGlow

Quote:


> Originally Posted by *blued*
> 
> Missed that. Cool. The way some folks were going on about this, anything above +300 MHz on a Micron card was not possible, and every Micron card owner had a certified lemon and was not entitled to enjoy his card. Thanks.


Any time they cry about Micron cards not being capable of +300, I just direct them to my screenshot.


----------



## gtbtk

Quote:


> Originally Posted by *blued*
> 
> Posted this in another forum (hello khanmein). Micron Zotac AMP vRAM test: +700 MHz, no issues at all (no OC on the core). No BIOS update from Zotac yet, and I will likely not bother with it when it's released. I bought the card knowing in advance about the Micron issue. Since I don't normally OC my cards, all I required was for the card to function normally at stock clocks; anything above that is a bonus which I may not even utilize.
> 
> 
> Spoiler: Warning: Spoiler!


The Micron cards with the original BIOS seem to be OK up to just over +400 (+800 MHz effective) without having to work around voltage issues. The original BIOS also caused DPC Watchdog and Video Scheduler blue screens. The new ASUS BIOS no longer causes blue screens; it just crashes the driver if you go too high.


----------



## M3Stang

Quote:


> Originally Posted by *M3Stang*
> 
> I salvaged it from my old build; it's about 4 years old. It's a Corsair TX850. It worked fine powering a build with a Q9450 and a GTX 590, and this build uses much less power.


Quote:


> Originally Posted by *TheDeadCry*
> 
> What are your BIOS settings? Latest BIOS? Is anything overclocked (motherboard, RAM, CPU, anything in the mobo settings; in other words, not stock)? What is your FCLK at, 800 MHz or 1 GHz?


I haven't touched anything in the BIOS at all. It's the latest stable BIOS as of 6/10/2016, F7. There is a beta one that came out last week, but I don't want to mess with that stuff. If FCLK is what you're asking about: when the computer is fully idle, the CPU runs at 800 MHz.


----------



## MyNewRig

Quote:


> Originally Posted by *rfarmer*
> 
> You think this is bad? You should have been in the 970 owners club when the news came out about the 3.5GB memory. OMG you would have thought someone shot their dogs and stole all their lunch money, never seen so much whining in my life. This is minor by comparison.


hahahahahahah, hilarious comment, that cracked me up big time

Comment of the week award


----------



## HOODedDutchman

Quote:


> Originally Posted by *Roland0101*
> 
> OK, new ASUS vBIOS results.
> 
> First, the IMHO most important thing: the checkerboard-artifact crashes are completely gone.
> I set all my games and benchmarks back to Adaptive or Optimal power mode and tested about every 3D application I have with the highest memory OC I could achieve.
> No checkerboard crashes whatsoever.
> 
> Furthermore, if the OC is set too high, it results in driver crashes, not in blue screens or checkerboard-artifact crashes.
> 
> The overall overclocking headroom is only slightly higher: before the BIOS update I could achieve 8808 MHz effective (with locked voltage); now I can achieve 8912 MHz effective (without locked voltage).
> 
> Fire Strike 1
> 
> Fire Strike 2


Beautiful, right where it should be. I wonder if they increased the TDP as well, to help with the core throttling some people reported. Unless they just don't understand how GPU Boost works lol. Those are perfectly good overclock results; no reason to complain there at all. Congrats. Get some long game sessions in to confirm.


----------



## DeathAngel74

Quote:


> Originally Posted by *TheGlow*
> 
> 
> Any time they cry about Micron cards not being capable of +300, I just direct them to my screenshot.


I totally agree. I'm tired of hearing everyone complaining, every single day, day after day... after day. You get the idea. My Micron card is only capable of +313.5 (+627 effective). I have sympathy for those who are truly having issues, but for all the people just complaining for the sake of complaining, not so much! Just like everything else, one person goes off, and then all of a sudden three people jump in.
People forget to double what they adjust for the memory clock... Anyway, rant over :/
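Since that doubling trips people up throughout this thread, here is a minimal sketch of the offset arithmetic. The stock effective rate of 8008 MT/s and the assumption that an Afterburner-style offset counts twice toward the effective GDDR5 rate are inferred from the numbers posters quote here, not from official tool documentation:

```python
# Minimal sketch of the memory-offset math discussed in this thread.
# Assumptions (from the figures posted here, not official docs): a stock
# GTX 1070 runs its GDDR5 at an effective 8008 MT/s, and an Afterburner-
# style offset is applied to the double-pumped clock, so it counts twice
# toward the effective ("MHz effective") rate.

STOCK_EFFECTIVE = 8008  # MT/s on a stock GTX 1070

def effective_rate(offset_mhz: float) -> float:
    """Effective GDDR5 data rate for a given slider offset."""
    return STOCK_EFFECTIVE + 2 * offset_mhz

# DeathAngel74's example: a +313.5 offset is +627 effective.
print(effective_rate(313.5) - STOCK_EFFECTIVE)  # 627.0

# gtbtk's example: a +550 offset lands near the ~9100 MHz he reports.
print(effective_rate(550))  # 9108
```

So a card that is "only" stable at a +400 offset is actually running about 8808 MT/s effective, which is why those offsets are better than the raw slider numbers make them look.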


----------



## Hunched

New driver 375.70 sneaks in scheduled tasks and telemetry-monitoring garbage to eat up more resources; I didn't see it mentioned anywhere else.

This is with only choosing the driver and PhysX during installation...
The tasks are enabled by default; I simply disabled them prior to the screenshot.

Here's where the "NVIDIA crash and telemetry reporter", or "NvTmRep.exe", is now installed on all your PCs.


----------



## khanmein

This Bryan from Australia received an MSI Gaming Z, & guess what? Samsung again. For real? This kind of marketing...


----------



## TheDeadCry

Quote:


> Originally Posted by *Hunched*
> 
> New driver 375.70 sneaks in scheduled tasks and telemetry-monitoring garbage to eat up more resources; I didn't see it mentioned anywhere else.
> 
> This is with only choosing the driver and PhysX during installation...
> The tasks are enabled by default; I simply disabled them prior to the screenshot.
> 
> Here's where the "NVIDIA crash and telemetry reporter", or "NvTmRep.exe", is now installed on all your PCs.


Nice Heads Up. +1


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> This Bryan from Australia received an MSI Gaming Z, & guess what? Samsung again. For real? This kind of marketing...


I think it is pretty obvious that NVIDIA and the AIB partners have known there were issues with the Micron cards from the beginning. I suspect they had not worked out exactly what the problem was until we started to experiment and then shake the tree at nvidia.com. In all probability, they got sidetracked looking at the memory chips themselves rather than the supporting logic and circuitry.

In my experience, these manufacturers usually work on a deny, deny, deny basis for months until they are cornered, like in the 970 memory drama, or until they have a fix to distribute so they can feel they were the "savior".

I was initially surprised that NVIDIA actually volunteered to work on a BIOS update so quickly.

If I were in their position with the same issues, I would not send out Micron cards for review knowing they would crash, either.

In light of this issue, though, we now know that BIOS updates are initially created by NVIDIA and pushed out to the partners for branding customization. The limited default clock-speed variations across brands, together with the absence of tweaking tools, suggest that the core aspects of the Pascal BIOS are not customizable by the partners and have to come from NVIDIA themselves.

The Samsung-only review samples, together with what surely must have been NVIDIA-supplied reviewer BIOS versions, suggest that the review cards are all sourced from and cherry-picked by a single source. The only common denominator here is NVIDIA themselves.


----------



## gtbtk

Quote:


> Originally Posted by *Hunched*
> 
> New driver 375.70 sneaks in scheduled tasks and telemetry-monitoring garbage to eat up more resources; I didn't see it mentioned anywhere else.
> 
> This is with only choosing the driver and PhysX during installation...
> The tasks are enabled by default; I simply disabled them prior to the screenshot.
> 
> Here's where the "NVIDIA crash and telemetry reporter", or "NvTmRep.exe", is now installed on all your PCs.


I have no idea how many resources they eat up. They look like the kind of app that runs a small job periodically and sends a small report home. They are a bit naughty installing them without telling you, though.

To be fair, in light of the BIOS updates that are supposed to fix the Micron memory crashes, it is a way for them to confirm whether their fix actually worked.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> I think that It is pretty obvious that Nvidia and the AIB partners have known that there were issues with the Micron card from the beginning. I suspect that they had not worked out exactly what the problem was to fix it until we started to experiment and then shake the tree at nvidia.com. In all probability, they probably got side tracked looking at the memory chips themselves and not the supporting logic and circuitry.
> 
> In my experience, these manufacturers usually work on the deny, deny, deny process for months until they are cornered like in the 970 memory drama or have a fix to distribute so they can feel that they were the "savior".
> 
> I was initially surprised that nvidia actually volunteered to work on a bios update so quickly.
> 
> If I was in their position with the same issues, I would not send out micron cards for review knowing they would crash either.
> 
> In the light of this issue though, we now know that Bios updates are initially created by Nvidia and pushed out to the Partners for branding customization. The limited default clock speed variations across different brands together with the absence of tweaking tools would tend to suggest that the core aspects of the pascal bios are not customizable by the partners and have to come from Nvidia themselves.
> 
> The samsung only review samples, together with what surely must have been nvidia supplied reviewer bios version cards suggests that the cards are all sourced from and cherry picked by single source. The only common denominator here is Nvidia themselves.


I already provided him two articles that reviewed Micron cards, and he promised to make a video about it. I would like him to give a proper explanation regarding the Micron memory.

Damn~ Steve Burke gave me the kind of explanation that makes changing parts sound like nothing, & Joker, with his slant towards EVGA, said that the VRM heat won't cause any big deal if you're not overclocking.

----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I think that it is pretty obvious that Nvidia and the AIB partners have known there were issues with the Micron cards from the beginning. I suspect that they had not worked out exactly what the problem was, or how to fix it, until we started to experiment and then shake the tree at nvidia.com. In all probability, they got sidetracked looking at the memory chips themselves and not the supporting logic and circuitry.
> 
> In my experience, these manufacturers usually work on the deny, deny, deny process for months until they are cornered, like in the 970 memory drama, or until they have a fix to distribute so they can feel that they were the "savior".
> 
> I was initially surprised that nvidia actually volunteered to work on a bios update so quickly.
> 
> If I was in their position with the same issues, I would not send out micron cards for review knowing they would crash either.
> 
> In light of this issue though, we now know that BIOS updates are initially created by Nvidia and pushed out to the partners for branding customization. The limited default clock speed variations across different brands, together with the absence of tweaking tools, would tend to suggest that the core aspects of the Pascal BIOS are not customizable by the partners and have to come from Nvidia themselves.
> 
> The Samsung-only review samples, together with what surely must have been Nvidia-supplied reviewer-BIOS cards, suggest that the cards are all sourced from, and cherry-picked by, a single source. The only common denominator here is Nvidia themselves.
> 
> 
> 
> i already provided 2 articles that reviewed micron cards to him. he promised to make a vid about it. i would like him to give a proper explanation regarding the micron memory.
> 
> damn~ steve burke gave me that kind of explanation, like changing parts is nothing, & joker slunt, biased towards EVGA, said that the VRM heat won't cause any big deal if u're not overclocking.
Click to expand...

I know. I saw it and left a detailed comment under yours as well.

Steve Burke is sort of half right. Changing parts is common and should be nothing to be upset about - provided, of course, that the alternative part and its associated firmware and circuitry work properly and exactly the same as the original part. In this case, they don't, and there is more to the situation than just a card with a different brand of memory. That is why I have been harping on about not just blaming the chips and saying they are intrinsically defective. That couldn't be the case, because you could work around the issue to improve the situation on the vast majority of cards. If the chips were defective, there would be no workaround at all.

The Steve Burkes of the world have only focused on the "Micron memory is defective" claim because it is a simplistic concept. The chips are fixed and can't be changed, so slagging off Nvidia when there is no hope of a solution, simply because they are looking in the wrong place, is counterproductive for them if they want to continue creating content. Because the root cause is not immediately in your face, and particularly when there are a few people spamming every forum known to man claiming that the chips are defective, or dismissing the claim because "overclocking is not guaranteed", they ignore what is actually causing these cards to crash the operating system. In this case, it was the firmware logic managing the interaction between the Micron memory and the card, not the chips themselves. In other cases it may well be a specific chip, but we don't know that if we don't look past the first thing in the chain of components that all work together.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> I know. I saw it and left a detailed comment under yours as well.
> 
> Steve Burke is sort of half right. Changing parts is common and should be nothing to be upset about - provided, of course, that the alternative part and its associated firmware and circuitry work properly and exactly the same as the original part. In this case, they don't, and there is more to the situation than just a card with a different brand of memory. That is why I have been harping on about not just blaming the chips and saying they are intrinsically defective. That couldn't be the case, because you could work around the issue to improve the situation on the vast majority of cards. If the chips were defective, there would be no workaround at all.
> 
> The Steve Burkes of the world have only focused on the "Micron memory is defective" claim because it is a simplistic concept. The chips are fixed and can't be changed, so slagging off Nvidia when there is no hope of a solution, simply because they are looking in the wrong place, is counterproductive for them if they want to continue creating content. Because the root cause is not immediately in your face, and particularly when there are a few people spamming every forum known to man claiming that the chips are defective, or dismissing the claim because "overclocking is not guaranteed", they ignore what is actually causing these cards to crash the operating system. In this case, it was the firmware logic managing the interaction between the Micron memory and the card, not the chips themselves. In other cases it may well be a specific chip, but we don't know that if we don't look past the first thing in the chain of components that all work together.


totally agree with what u said. yeah i guess that's u. ROFL~ steve burke is busy with the vrm cooling.


----------



## asdkj1740

Quote:


> Originally Posted by *khanmein*
> 
> totally agree with what u said. yeah i guess that's u. ROFL~ steve burke is busy with the vrm cooling.


i think the vram cooling is worse than the vrm cooling


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> associated firmware and circuitry work properly and exactly the same as the original part. In this case, they don't, and there is more to the situation than just a card with a different brand of memory. That is why I have been harping on about not just blaming the chips and saying they are intrinsically defective. That couldn't be the case, because you could work around the issue to improve the situation on the vast majority of cards. If the chips were defective, there would be no workaround at all.
> 
> it was the firmware logic managing the interaction between the Micron memory and the card, not the chips themselves. In other cases it may well be a specific chip, but we don't know that if we don't look past the first thing in the chain of components that all work together.


One question comes to mind every time i read that kind of logic in your posts,

Why wouldn't a memory chip with an identical spec sheet to another chip function properly within the same parameters that the other chip functioned flawlessly under, if it was indeed conformant to its own spec sheet?

There is a difference between a "defective" and an "out of spec" chip. You can sure lock voltage, raise voltage, make voltage constant and prevent fluctuations, raise power or halt proper power management to get the "out of spec" chip to appear like it is working properly, but if it was conformant to spec in the first place, why wouldn't it work within the same parameters as the other?

I have asked you that question many times and never received a proper answer from you to support your theory.


----------



## MyNewRig

Quote:


> Originally Posted by *Hunched*
> 
> New driver 375.70 sneaks in tasks and telemetry monitoring garbage to eat up more resources, didn't see it mentioned anywhere else.
> 
> This is with only choosing the driver and PhysX during installation...
> The tasks are enabled by default, I simply disabled them prior to the screenshot.
> 
> Here's where "NVIDIA crash and telemetry reporter" or "NvTmRep.exe" is now installed on all your PCs.


That kind of monitoring is a good thing at its essence, so that the software team can monitor the performance of BIOS, drivers and games, gauge the spread and severity of issues and the effectiveness of fixes. Nvidia sure has a ton of software-related problems lately after the Windows 10 Anniversary Update.

The dark side of this is that Nvidia, being the horribly shady company it is, doesn't inform the user upon installation that monitoring tools are being installed on their own PCs and doesn't give the user the option to opt in or out of that testing/monitoring pool like every other decent company does, and that is pretty ugly. Nvidia's insistence on using shady practices and policies is very disturbing, since with them you don't know what you are getting when buying hardware or downloading software that is installed on your own wholly owned PC.


----------



## HOODedDutchman

The amount of whining in this thread is ridiculous. Grow up. U guys acting like this are the same type of people fighting against police. U whine cuz there's a bios issue with cards with micron. They roll out fixes n add something to the driver to monitor the fix n u whine some more then whine about other subjects. Start a whiner thread.


----------



## Lahatiel

Quote:


> Originally Posted by *HOODedDutchman*
> 
> The amount of whining in this thread is ridiculous. Grow up. U guys acting like this are same type of people fighting against police. U whine cuz there's a bios issue with cards with micron. They roll out fixes n add something to driver to monitor the fix n u whine some more then whine about other subjects. Start a whiner thread.


So for you, companies like Nvidia have the right to install spy telemetry software on customers' PCs without asking for permission? Your police/company comparison is naive at best.
Even the police can't do what they want with the property of citizens. Our PCs aren't self-service stores.


----------



## rfarmer

Quote:


> Originally Posted by *HOODedDutchman*
> 
> The amount of whining in this thread is ridiculous. Grow up. U guys acting like this are same type of people fighting against police. U whine cuz there's a bios issue with cards with micron. They roll out fixes n add something to driver to monitor the fix n u whine some more then whine about other subjects. Start a whiner thread.


http://www.overclock.net/t/1614656/gtx-1070s-micron-feedback There was a thread started for 1070 with Micron memory.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> The amount of whining in this thread is ridiculous. Grow up. U guys acting like this are same type of people fighting against police. U whine cuz there's a bios issue with cards with micron. They roll out fixes n add something to driver to monitor the fix n u whine some more then whine about other subjects. Start a whiner thread.


You are pretty much the one whining right now about what you perceive as whining; you have whined in the past and are continuing to whine again now. i don't see any whining taking place in this thread by others, i see mature adults discussing current issues affecting the GTX 1070 in a way that is in many cases as objective as possible. Talking about the different hardware components of the card, or software issues affecting our PCs, is pretty much what this thread is for. It just happens that Micron memory and its related software fixes are the most important recent events for the product, so we talk about them. What else do you want?

It is naive to assume that the monitoring tools are being involuntarily force-installed on people's computers just to monitor the effects of the updated BIOS. Nvidia has many more driver- and game-related issues reported on a daily basis that they need to monitor and track at mass scale to resolve, and there are bugs that have been carrying on from one driver update to the next, unresolved for months; hence the monitoring. But Nvidia MUST ask PC owners for their consent before installing such tools on their PCs, and that is common practice; we are not asking for something new here.

However you keep whining about what you think is whining. When we stay quiet and leave space, the thread dies for hours without a single post, and for good reason: all benchmarks have been posted already, we already know the limits of the card, all manufacturers have released their cards and we know the pros and cons of each. Nothing new is being introduced to the lineup currently, so there is nothing else to talk about but the recent memory, BIOS and driver issues. So you stop whining, not us.


----------



## Brohem0th

Got the EVGA GTX 1070 FTW the first day it was available on Amazon; it shipped with Samsung memory. Does 2100MHz/9200MHz rock-solid stable. I can pass 3DMark at 2126MHz/9400MHz if I crank the fans all the way up, but it fails after about half an hour in games.

Normally I run it at 2100/9000 at 90% fan speed, but I've been running it at stock since I heard about the VRM overheating thing. Already ordered the thermal pads from EVGA, and ordered some copper fin heatsinks for the VRMs/MOSFETs on the "front" side of the PCB. Will update if I get any higher stable clocks.

For the record, I've never experienced the black screen glitch, and I've never seen core temp get over 65°C. Checked the VRMs and MOSFETs with an infrared thermometer the other day and they were in the 85-90°C range with the side panel off the case. I've got a 140mm fan mounted on the side panel pushing cool air in, so if anything I suspect they operate at a lower temperature with the panel on than off. I also have more fans pulling air out of the case than pushing in; ambient temp inside the case doesn't go over 23°C during hour-long loops of 3DMark.

Moral of the story: don't skimp on the $1 thermal pads, EVGA! Also, have good case airflow. I'm sure if I didn't I would have damaged my card by now. Thank goodness for being paranoid and running fans hard.
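If you want to watch core temperature over long runs without sitting in front of a monitoring tool, NVIDIA's own CLI can be polled from a script. A minimal sketch, assuming `nvidia-smi` is on your PATH; the helper names here are illustrative, not any official tool, and note this only reads the core sensor, so an infrared thermometer is still the way to check VRM temps:

```python
import subprocess

def parse_temps(csv_text):
    """Turn the CSV output of nvidia-smi's temperature query into a list of
    per-GPU integer temperatures (one line per GPU)."""
    return [int(line.strip()) for line in csv_text.splitlines() if line.strip()]

def read_gpu_temps():
    """Poll the driver for current core temperatures; requires nvidia-smi on PATH."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        text=True,
    )
    return parse_temps(out)

# Example usage: warn when any GPU crosses a threshold (65 C here, per the post).
# for t in read_gpu_temps():
#     if t > 65:
#         print(f"GPU running hot: {t} C")
```

Run it in a loop during a 3DMark session and log the values to see whether panel-on vs. panel-off actually changes anything.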


----------



## Roland0101

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Beautiful. Right where it should be. Those are perfectly good overclock results. No reason to complain there at all. Congrats. Get some long game sessions in to confirm.


I agree, but to be accurate, neither gtbtk nor I complained or whined about the issue; we just stated facts and countered false arguments regarding it.

Played 1 hour RotTR DX11 last night, highest possible settings, DSR 1440P + SSAA 2x. All stable.


----------



## HOODedDutchman

Fixed


----------



## HOODedDutchman

Simply something that reports to Nvidia if u get a driver crash. Soooo terrible omg. They're not gonna check your bank info.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> You are the worst one of all. Whining about bad memory and the bios fix for weeks now, and when it finally comes out you move your whining to evga. Evga has a fix offered. Ya they screwed up n it happens. It's evga. If you don't want to install the thermal pads yourself I'm sure they would be happy to rma the card for one that has them. They have the best customer service in the gpu game. Pointless trivial whining. U don't even own a 1070 anymore ffs. Sit back and wait for people's bios testing to come forward and stop fuming like a child. There's nothing rational coming from you. Going on about how everything is a conspiracy, like they totally knew the cards weren't working right when they released them.
> 
> Or if u can't help yourself go over to that micron thread. I'm sure there's a lot more than me subscribed to this thread that are sick of hearing it.


Tip: i think you quoted the wrong person. calm down, look at your screen properly and don't mix up posters


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> You are the worst one of all. Whining about bad memory and the bios fix for weeks now, and when it finally comes out you move your whining to evga. Evga has a fix offered. Ya they screwed up n it happens. It's evga. If you don't want to install the thermal pads yourself I'm sure they would be happy to rma the card for one that has them. They have the best customer service in the gpu game. Pointless trivial whining. U don't even own a 1070 anymore ffs. Sit back and wait for people's bios testing to come forward and stop fuming like a child. There's nothing rational coming from you. Going on about how everything is a conspiracy, like they totally knew the cards weren't working right when they released them.
> 
> Or if u can't help yourself go over to that micron thread. I'm sure there's a lot more than me subscribed to this thread that are sick of hearing it.


Still wrong, i did not complain about EVGA at all. i don't have an EVGA card so i am not affected. these are two different posts by two different people ... try again please


----------



## HOODedDutchman

The mobile site doesn't work well. It double-quotes sometimes n I deleted the wrong one. Tip, buy a 1070 or leave.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> The mobile site doesn't work well. It double-quotes sometimes n I deleted the wrong one. Tip, buy a 1070 or leave.


And who are you to ask me to leave? Why leave? Also, installing spyware on people's computers without their consent is illegal. What if 50 different programs on your computer did the same? Your PC would turn into a zombie reporting station; would that be okay?


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> And who are you to ask me to leave? Why leave? Also, installing spyware on people's computers without their consent is illegal. What if 50 different programs on your computer did the same? Your PC would turn into a zombie reporting station; would that be okay?


Lol how is it spyware. You installed their software, which sends something to them if their software screws up. What are they spying on. You really that protective of your driver crashes. Lol ridiculous. N the thread is called 1070 owners club. Which you are not.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Lol how is it spyware. You installed their software, which sends something to them if their software screws up. What are they spying on. You really that protective of your driver crashes. Lol ridiculous. N the thread is called 1070 owners club. Which you are not.


It is spyware if it sends data out without the owner's consent, even if it is their own data. If every piece of software you installed were allowed to do the same, then everything you do on your PC would be available to third parties, so it is illegal; even Windows does not dare do that without consent, and gives you the option to opt out.

i bought 6 different GTX 1070s since June, out of which two i bought directly for myself. my PC has always had a 1070 installed in it since June, up until a few days ago. i invested a lot of time into the product, and only returned it until the dust settles so i can decide on a re-purchase; consider it a replacement period. so if someone sends their 1070 in for warranty and the replacement takes a few weeks to arrive, they should get out of the thread until their replacement comes in, according to your logic?


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> It is spyware if it sends data out without the owner's consent, even if it is their own data. If every piece of software you installed were allowed to do the same, then everything you do on your PC would be available to third parties, so it is illegal; even Windows does not dare do that without consent, and gives you the option to opt out.
> 
> i bought 6 different GTX 1070s since June, out of which two i bought directly for myself. my PC has always had a 1070 installed in it since June, up until a few days ago. i invested a lot of time into the product, and only returned it until the dust settles so i can decide on a re-purchase; consider it a replacement period. so if someone sends their 1070 in for warranty and the replacement takes a few weeks to arrive, they should get out of the thread until their replacement comes in, according to your logic?


You didn't. You don't own one. Your logic is flawed in that that's not at all what happened. Stay, I don't care. Just stop with the conspiracy theories about reviewers being sent binned cards and Nvidia and everyone else knowing about it and purposely screwing everyone over.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> You didn't. You don't own one. Your logic is flawed in that that's not at all what happened. Stay, I don't care. Just stop with the conspiracy theories about reviewers being sent binned cards and Nvidia and everyone else knowing about it and purposely screwing everyone over.


I didn't what? so you are questioning that i made those purchases?

So first you want me to leave, and now you took that back and want me to stop expressing what i believe happened? if my version of the truth does not match yours, you seek to silence me?

Now you are trolling big time and whining, so you are doing much worse than any of us. you just wasted two pages of this thread trolling; what value did you add for the readers?


----------



## Lahatiel

Quote:


> Originally Posted by *HOODedDutchman*
> 
> Lol how is it spyware. You installed their software, which sends something to them if their software screws up. What are they spying on. You really that protective of your driver crashes. Lol ridiculous. N the thread is called 1070 owners club. Which you are not.


Seriously, telemetry software doesn't only send crash reports back. Your definition of telemetry is wrong. You are describing the behavior of a simple crash report tool, which creates a snapshot of the system and error codes.
Telemetry logs the programs you run, and what you do and when and how you interact with your PC. That is a lot more information (sensitive data!) and should only be collected with the knowledge and acceptance of the user.
Big data is the cash cow of tomorrow, and many companies want to step in early.


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> I didn't what? so you are questioning that i made those purchases?
> 
> So first you want me to leave, and now you took that back and want me to stop expressing what i believe happened? if my version of the truth does not match yours, you seek to silence me?
> 
> Now you are trolling big time and whining, so you are doing much worse than any of us. you just wasted two pages of this thread trolling; what value did you add for the readers?


You're throwing out propaganda conspiracy theories that would direct people against purchasing Nvidia products. Your "version of the truth" is so out there it's laughable. N I mean you didn't rma your card, you just outright don't own one.


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> One question comes to mind every time i read that kind of logic in your posts,
> 
> Why wouldn't a memory chip with an identical spec sheet to another chip function properly within the same parameters that the other chip functioned flawlessly under, if it was indeed conformant to its own spec sheet?
> 
> There is a difference between a "defective" and an "out of spec" chip. You can sure lock voltage, raise voltage, make voltage constant and prevent fluctuations, raise power or halt proper power management to get the "out of spec" chip to appear like it is working properly, but if it was conformant to spec in the first place, why wouldn't it work within the same parameters as the other?
> 
> I have asked you that question many times and never received a proper answer from you to support your theory.


Obviously it doesn't run at exactly the same spec. If it did, then it would behave exactly the same as the Samsung memory chips. But having to change a setting within the bios doesn't make the memory bad, defective, substandard etc. It only makes it different. You have been using terms like defective for a long time; in fact you were so adamant that the chips were defective that you returned your cards after the bios update had been announced.

The only specs Nvidia have published about the 1070's GDDR5 memory are that it is 8GB at 8 Gb/s on a 256-bit memory bus. Both brands of chips conform exactly to that published spec. But those limited specifications do not address RAM timings, the differences in operating parameters for power-saving voltage states, and all the other settings that need to be baked into the bios and that the memory relies on to operate in a stable manner. Can you tell me what the RAM timings are for the Samsung memory? Can you tell me how both RAM types deal with sleep states? I don't know what they are, because Samsung have never published them in their data sheet for the memory, and Micron have only published their RAM timings in their datasheet. I have nothing to compare them to from either Nvidia or Samsung.

Given that the memory appears to run fine and no longer appears to crash after the bios update has been applied, it would seem to suggest that Nvidia did make the required setting adjustments.
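For what it's worth, that headline spec pins down only the raw bandwidth, which both vendors' chips meet. A back-of-the-envelope check of the figure (the inputs are from the published spec; the arithmetic is just illustration):

```python
# GTX 1070 published memory spec: 8 Gb/s effective data rate per pin, 256-bit bus.
effective_rate_gbps = 8          # gigabits per second, per pin
bus_width_bits = 256             # memory interface width

# Total bandwidth: per-pin rate times bus width, converted from gigabits to gigabytes.
bandwidth_gbytes_per_s = effective_rate_gbps * bus_width_bits / 8
print(bandwidth_gbytes_per_s)    # 256.0 -> the advertised ~256 GB/s
```

Everything else - timings, training, power-state behaviour - lives below that one number, which is exactly why two chips can match the spec sheet yet need different firmware.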


----------



## HOODedDutchman

Stating your opinion is one thing, but going on and on about your conspiracy theory for weeks and shoving it down people's throats is obscene.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> You're throwing out propaganda conspiracy theories that would direct people against purchasing Nvidia products. Your "version of the truth" is so out there it's laughable. N I mean you didn't rma your card, you just outright don't own one.


What you think is laughable, others think makes perfect sense given the events. we did not stop anyone from buying; we just say what we believe happened, especially with the manufacturers not having the confidence to send a Micron card out for review up until this day. you and others think otherwise, fine. i could also call you a biased fanboy or a corporate shill who would defend Nvidia no matter how shady they are, but i did not call you that, so have some respect for other people's minds and opinions.

The second part is truly none of your business; you are not aware of what agreement i had with my retailer that led me to handle it via this channel instead of RMA. it does not matter. i bought more GTX 1070s than most people here and have had the product since launch. being active in this thread means that i am still interested in the topic and will repurchase when i see signs of improvement. so just because you don't agree with my views does not give you the right to ask me to leave; that is just outright trolling, my friend.


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> What you think is laughable, others think makes perfect sense given the events. we did not stop anyone from buying; we just say what we believe happened, especially with the manufacturers not having the confidence to send a Micron card out for review up until this day. you and others think otherwise, fine. i could also call you a biased fanboy or a corporate shill who would defend Nvidia no matter how shady they are, but i did not call you that, so have some respect for other people's minds and opinions.
> 
> The second part is truly none of your business; you are not aware of what agreement i had with my retailer that led me to handle it via this channel instead of RMA. it does not matter. i bought more GTX 1070s than most people here and have had the product since launch. being active in this thread means that i am still interested in the topic and will repurchase when i see signs of improvement. so just because you don't agree with my views does not give you the right to ask me to leave; that is just outright trolling, my friend.


I don't care anymore. U don't own a card lol. N my views make a hell of a lot more sense. Do this: for the high-end cards that have been released or reviewed in the last month, go find evidence of any of them having micron chips.


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> Obviously it doesn't run at exactly the same spec. If it did, then it would behave exactly the same as the Samsung memory chips. But having to change a setting within the bios doesn't make the memory bad, defective, substandard etc. It only makes it different. You have been using terms like defective for a long time; in fact you were so adamant that the chips were defective that you returned your cards after the bios update had been announced.
> 
> The only specs Nvidia have published about the 1070's GDDR5 memory are that it is 8GB at 8 Gb/s on a 256-bit memory bus. Both brands of chips conform exactly to that published spec. But those limited specifications do not address RAM timings, the differences in operating parameters for power-saving voltage states, and all the other settings that need to be baked into the bios and that the memory relies on to operate in a stable manner. Can you tell me what the RAM timings are for the Samsung memory? Can you tell me how both RAM types deal with sleep states? I don't know what they are, because Samsung have never published them in their data sheet for the memory, and Micron have only published their RAM timings in their datasheet. I have nothing to compare them to from either Nvidia or Samsung.
> 
> Given that the memory appears to run fine and no longer appears to crash after the bios update has been applied, it would seem to suggest that Nvidia did make the required setting adjustments.


Okay, the thing is that in previous generations this has not been the case: memory chips from Samsung, Hynix and Micron were mixed on the product line without the need for a special BIOS for each; that is spec-for-spec matching. now it is different: Micron and Samsung chips are not the same, and they switched to Micron without a proper BIOS in place. we found a problem, tested and reported it; one month later they say they will make a new BIOS; one more month and the BIOS starts rolling out, and to this day MSI and Gigabyte have nothing. so if you have had your card since July, and it is now almost November, you still have not got your fix. add to that the total lack of transparency and information sharing and the initial denial that any problem existed, and this picture is unacceptable under any consumer law in the world. that is one thing.

Not having the confidence to send Micron cards out for review means that the developer and manufacturer themselves have doubts about the product, which is a disaster in and of itself.

I don't know the timings of the Samsung chip, of course, because like you said Samsung does not publish that to the public, but in testing they are indeed different.

Personally, i returned the card right before the BIOS was out, not after, because i had to. i had been with that particular card for more than a month just waiting. i asked my retailer for a return period extension and was awarded one; the original return period ran out, the extension ran out, and they requested that i send it back until the problem is resolved, because they wanted these cards back to make a case with their supplier, as they had a lot of people complaining and returning as well. the consumer law in my country states that a fix should be provided within 2 weeks; we were way past two months by the time i returned it and there was still nothing.

If you put the technical and business aspects together and look at the whole picture, it is a total disaster from such big companies, and after all that i personally don't trust the Micron memory at all, given that a changelog has not even been provided for the BIOS; we don't even know what they did to either fix the root cause or hide the problem. i am still waiting on professionally done frametime testing and, more importantly, for a bunch of review samples to be sent out with Micron memory and the new BIOS so that all aspects can be tested, including thermals, power consumption etc.

So to this day the issue still smells very fishy to me, as my confidence in the product and the company behind it is beyond shaken after this painful incident.


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> I don't care anymore. U don't own a card lol. N my views make a hell of a lot more sense. Do this: for the high-end cards that have been released or reviewed in the last month, go find evidence of any of them having micron chips.


Keep trolling, go on... just don't complain about whining when you are clearly a troll.

I don't understand the second sentence: what high-end cards have what? Slow down and type clearly, or wait until you get back to your PC before responding, because you are not making any sense and you are misquoting others.


----------



## khanmein

guys stop, i don't own the card yet so i'll leave, but not now.. wait for the day it arrives.. let's wait for the PCPER review.


----------



## muzammil84

guys... PLEASE! Use your 1070s for something more demanding than browsing this forum. Enough has been said (actually, way too much) about Micron and Samsung. All I see every day in this thread, which I subscribed to myself, is kids moaning about Nvidia lying, ripping people off and spying... stop arguing, disrespecting each other and writing conspiracy theories. Let some of us actually enjoy our 1070s in this thread.


----------



## khanmein

Quote:


> Originally Posted by *muzammil84*
> 
> guys... PLEASE! use your 1070s for something more demanding than browsing this forum. Enough been said(actually, way too much) about Micron and Samsung, all I see everyday in this, subscribed by myself, thread is some kids soaking because of Nvdia lying, ripping off and spying... stop arguing, disrespecting each other and writing conspiracy theories. Let some of us actually enjoy our 1070s in this thread.


Yeah, I'm so free because I can't play anything on my HD 4600 except Dota 2 or Theme Hospital. I'm bashing the tech reviewers, especially the YouTube fellas and the popular outlets like Tom's Hardware, AnandTech, TechReport, PCPer, etc. Where's the damn Micron review?

send me your GTX 1070 & i'll leave this forum right now.


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> Keep trolling , go on ... just don't complain about whining when you are clearly a troll.
> 
> I don't understand the second sentence, what high end cards have what? slow down and type clearly or wait until you get back to your PC before responding because you are not making any sense and you are misquoting others.


One misquote, lol. What cards that have been reviewed lately are you talking about? Micron chips haven't been around since launch. It's likely the cards being reviewed don't come with Micron chips at all. It's also likely that these beastly triple-slot cards like the Palit were made long before launch and just not released until now, to milk sales of their other cards.


----------



## MyNewRig

Quote:


> Originally Posted by *muzammil84*
> 
> guys... PLEASE! use your 1070s for something more demanding than browsing this forum. Enough been said(actually, way too much) about Micron and Samsung, all I see everyday in this, subscribed by myself, thread is some kids soaking because of Nvdia lying, ripping off and spying... stop arguing, disrespecting each other and writing conspiracy theories. Let some of us actually enjoy our 1070s in this thread.


How much time/space do you need to enjoy your 1070 in the thread? We have previously left this thread alone multiple times for days and nothing gets posted; it just totally halts. These are the current issues affecting the card, so we discuss them. In what other way would you like to enjoy your 1070 in this thread if we agree to completely stop talking about Micron memory, BIOS updates, driver updates, driver issues and brand-specific issues? What else do you propose we do?


----------



## MyNewRig

Quote:


> Originally Posted by *HOODedDutchman*
> 
> 1 misquote lol. What cards have been reviewed lately that you are talking about. Micron haven't been around since launch. Its likely those cards that are being reviewed don't come with micron chips at all. Its also likely these beast triple slot cards like palit were made long before launch and just not released until now to milk the sales of their other cards.


Two misquotes so far, plus post mixing; you attacked me furiously a few posts ago over an EVGA post that I did not even make.

Micron memory has been around since July; it has been out for at least three months now.

@khanmein, could you please take that one? Our friend here has been hiding under a rock for the past few months and only today learned of very recent reviews done with Samsung-memory cards that no longer exist at retail. Please point him to some; it seems he wants us to do all the research for him while he sits on his mobile phone, trolling and misquoting us.


----------



## HOODedDutchman

Quote:


> Originally Posted by *MyNewRig*
> 
> Two misquotes so far and post mixing, you attacked me furiously a few posts ago for an EVGA post that i did not even make.
> 
> Micron has been around since July, been out 3 months now at least.
> 
> @khanmein could you please take that one? our friend here has been hiding under a rock for the past few months and is not aware of very recent reviews until today that are being made with Samsung memory cards that no longer exist in retail, please point him out to some, it seems like he wants us to do all the research for him while he sits with his mobile phone trolling and misquoting us.


Ah well. Guess I'm an idiot. I'm not a conspiracy theorist, so I have no intention of giving a damn about doing research for a theory that is ridiculous. If I were on my rig, I'd be gaming. But instead of being patient and using the BIOS you were promised, you're using your rig to sit here running your mouth about subjects you don't know the truth behind and never will.


----------



## HOODedDutchman

Has anyone had a driver crash with this new driver with the "spyware"? Maybe it asks if it can send info to Nvidia, like Microsoft does when Windows crashes.


----------



## HOODedDutchman

I still don't see any legit links made recently. I did a little searching and see nothing from after August. Some random YouTuber doing an unboxing is not a legit review and means nothing, and they very likely weren't sent the card for review either. The only ones I found were the Gainward review and that all-white Palit card review I linked here a long time ago, which were likely made before release and held onto, likely all Samsung, and will be limited production. Not special; they'll just be around for a while and then you won't be able to find one. This happens all the time.


----------



## guttheslayer

Just upgraded my PC from a 670 to 1070....

The score jump is pretty amazing..











Comparison of OLD vs NEW PC score for FSE


----------



## Dude970

Congrats on the upgrade, enjoy the new GPU


----------



## khanmein

@MyNewRig you mean this link: https://www.youtube.com/results?search_query=GTX+1070 ?? Their viewership has also increased recently.

@HOODedDutchman 



 (so far the only micron review from kyle) *maybe*cough*cough*


----------



## guttheslayer

Quote:


> Originally Posted by *Dude970*
> 
> Congrats on the upgrade, enjoy the new GPU


Thank you. I wanted a Titan XP at first, but I realised that getting one here would cost 1.5-1.6K USD.

That's too much for me to handle, so I picked up a pair of 1070s to get the same level of performance. (I got both brand new for less than US$650 total.)


----------



## MyNewRig

Just found this picture posted in a comment on a news article about the EVGA FTW cards catching fire; it explains a lot of the sentiment towards Nvidia products.


----------



## Dude970

Quote:


> Originally Posted by *guttheslayer*
> 
> Thank you. I wanted a Titan XP at first but I realised to bring it it would cost 1.5-1.6K USD.
> 
> Its too much for me to handle. So I pick up a pair 1070 to give that same level of performance. (I got both for less than US$650, total brand new)


You did great at that price!


----------



## blued

Quote:


> Originally Posted by *MyNewRig*
> 
> 
> 
> Just found that picture posted in a comment on a news article about EVGA FTW cards catching fire, explains a lot of the sentiment towards Nvidia products


Evga issue, not an Nvidia one.


----------



## TheGlow

Quote:


> Originally Posted by *blued*
> 
> Evga issue, not an Nvidia one.


He can't differentiate. By next week, Nvidia will be to blame for only 8 pepperonis on his pizza instead of 9.5.


----------



## Roland0101

Quote:


> Originally Posted by *MyNewRig*
> 
> If you put the technical and business aspects together and look at the whole picture, it is a total disaster from such big companies


The issue was published at Geforce.com by gtbtk.

*You ranted.*

NVIDIA acknowledged the issue.

*You ranted.* (and introduced weird conspiracy theories.)

NVIDIA said they have a fix and that they will provide the fix to the AIBs.

*You ranted.*

Some AIBs acknowledged that they have the fix and that they will release a new VBios.

*You ranted.*

The issue got fixed.

*You rant on and on...*

It was nothing more than a technical problem; everything else is just in your head.

Quote:


> Originally Posted by *MyNewRig*
> 
> Just found that picture posted in a comment on a news article about EVGA FTW cards catching fire, explains a lot of the sentiment towards Nvidia products


It's an EVGA product.

Can you really not comprehend that this is an AIB card and that NVIDIA is not responsible for the design mistakes its partners make?
Now that one problem is fixed, you search for another reason to bash NVIDIA?

Man, get a life.


----------



## Roland0101

Quote:


> Originally Posted by *guttheslayer*
> 
> Just upgraded my PC from a 670 to 1070....
> 
> The score jump is pretty amazing..
> 
> 
> 
> 
> 
> 
> 
> 
> Comparison of OLD vs NEW PC score for FSE










Great upgrade, have fun with the new card.


----------



## Dude970

Quote:


> Originally Posted by *Roland0101*
> 
> The issue was published at Geforce.com by gtbtk.
> 
> *You ranted.*
> 
> NVIDIA acknowledged the issue.
> 
> *You ranted.* (and introduced weird conspiracy theories.)
> 
> NVIDIA said they have a fix and that they will provide the fix to the AIBs.
> 
> *You ranted.*
> 
> Some AIBs acknowledged that they have the fix and that they will release a new VBios.
> 
> *You ranted.*
> 
> The issue got fixed.
> 
> *You rant on and on...*
> 
> It was nothing else than a technical problem, everything else is just in your head.
> It's an EVGA product.
> 
> Can you really not comprehend that this is a AIB card and that NVIDIA is not responsible for the design mistakes the partners make.
> Now after one problem is fixed you search for another reason to bash NVIDIA?
> 
> Man, get a life.


Well said


----------



## blued

--


----------



## Klue22

Everyone needs to take a deep breath and calm down. Some people are upset about the memory situation, others are upset about the BIOS. You can be upset about both. Heck, you can be upset about the color of the PCB, just as long as you don't make a disruptive fuss. Further, you don't need to own a 1070 to view or comment in this thread; telling others to leave because they don't is ridiculous, this is not a boys' club. Don't make me start handing out warnings.


----------



## HOODedDutchman

Quote:


> Originally Posted by *khanmein*
> 
> @MyNewRig this link https://www.youtube.com/results?search_query=GTX+1070 ?? recently their viewership also increased.
> 
> @HOODedDutchman
> 
> 
> 
> (so far the only micron review from kyle) *maybe*cough*cough*


The first link is just what you get if you go to YouTube and search "gtx 1070". It doesn't go directly to one video.


----------



## HOODedDutchman

Quote:


> Originally Posted by *guttheslayer*
> 
> Just upgraded my PC from a 670 to 1070....
> 
> The score jump is pretty amazing..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Comparison of OLD vs NEW PC score for FSE


Cool comparison. I wonder how my old Zotac AMP 670 that did just under 1400MHz (literally 139x-something) would have compared. It was a monster card with a 98% ASIC.


----------



## MyNewRig

Quote:


> Originally Posted by *blued*
> 
> Evga issue, not an Nvidia one.


Quote:


> Originally Posted by *TheGlow*
> 
> He cant differentiate. By next week NVidia will be to blame for only 8 pepperonis on his pizza instead of 9.5


Quote:


> Originally Posted by *Roland0101*
> 
> Can you really not comprehend that this is a AIB card and that NVIDIA is not responsible for the design mistakes the partners make.
> Now after one problem is fixed you search for another reason to bash NVIDIA?


Wow, looks like this picture hit a nerve with a lot of people; I did not expect such an effect. You are about to start a revolution and burn me alive like a witch.









I fully understand that EVGA is not Nvidia, although they are Nvidia's closest partner, but my comment was not meant to say that EVGA's problems are Nvidia's problems. You are just way too sensitive about your beloved Brainwashvidia to even read a message as it is.

I said the picture "explains the sentiment towards Nvidia's products", which is exactly what you are demonstrating.

It is like you GPU buyers live on a planet of your own making, using words like bashing, whining, ranting, "just a technical problem", "not Nvidia's fault", etc. The same language is not used in the markets for TVs, cars, refrigerators, home appliances and so on. If LG, for example, did the same with one of its TVs and people complained, we would not hear such vocabulary; but when it is Nvidia and GPUs, suddenly we live in another world and have unlimited tolerance for practices that would be completely unacceptable in other markets.

With that open appetite for buying inferior products and happily waiting months for fixes, I wish I were a GPU manufacturer with customers like you; it would be the best easy-money-making machine in the world.


----------



## Azazil1190

Hi guys, I need your help here.
By the way, I flashed my Strix 1070 OC with the new BIOS for Micron (yes, I got Micron).
With the first BIOS that came with the card I was stable at +500 on the memory. After flashing the new one I'm not stable at +500; I'm stable at +400, and anything above is unstable.
So I flashed the first BIOS again and I'm stable at +500 once more. I don't have any issues with that BIOS at all, but I can't understand why the new one cut my OC. I didn't touch the core with the new one.
Sorry for my English, and thanks in advance.


----------



## J Burgen

Quote:


> Originally Posted by *Azazil1190*
> 
> Hi guys I need your help here.
> Btw I flash my strix 1070 oc with the new bios for micron (yes I got micron)
> With the first bios that came with the card I was able to be stable at +500 for memory.Now after flash the new one at +500 im not stable. Im stable at +400 anything above is unstable.
> So I flash again the first bios and im stable again at +500 I dont have issues with that bios at all but I cant understand why the new one cut my oc.I didn't play with the core with the new one.
> Sorry for my English and thanks in advance


What do you need help with?


----------



## HOODedDutchman

Nobody planned this and nobody makes money off this. Obviously nobody knowingly put out a product with a BIOS issue (that's what we're calling it now, I'm guessing). No good comes of it: they undoubtedly drove away potential customers who will migrate to Vega when it is released, and it cost Nvidia a bunch of time to get the issue resolved. Nvidia should be commended for stepping up, making the fix, releasing it to all partners and pushing them to apply it. That is exactly what happened, so don't think otherwise. You can bet someone is losing their job over this, or being fined without anybody knowing, due to a contract issue, and it's likely Micron. I say contract issue because Micron likely guaranteed to partners that their memory would work in identical situations to Samsung's.


----------



## MyNewRig

Quote:


> Originally Posted by *Azazil1190*
> 
> Hi guys I need your help here.
> Btw I flash my strix 1070 oc with the new bios for micron (yes I got micron)
> With the first bios that came with the card I was able to be stable at +500 for memory.Now after flash the new one at +500 im not stable. Im stable at +400 anything above is unstable.
> So I flash again the first bios and im stable again at +500 I dont have issues with that bios at all but I cant understand why the new one cut my oc.I didn't play with the core with the new one.
> Sorry for my English and thanks in advance


Are you sure the correct Strix OC BIOS was the one flashed in the update? Is your power target still going to 120%? Are your base clock still 1657 MHz and your boost clock still 1860 MHz after the update?

If that is the case, then no one here or anywhere else can answer your question. We don't know what the original problem was, and we don't know what changes were implemented in that BIOS, so we can't predict its behavior. ASUS cannot help you either; if you create a ticket with their support, you will just get the "overclocking is not guaranteed" or "we don't support overclocking" treatment.

So you pretty much just run the BIOS that works best for you, or wait a few more weeks or months until we uncover more about all this, with more reviews and investigation on the way.
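To make that check concrete, here is a minimal sketch of the comparison being described. The expected numbers are the Strix OC figures quoted above (1657 MHz base, 1860 MHz boost, 120% power target); the `bios_matches_spec` helper and the `readings` dict are hypothetical, standing in for values you would read off GPU-Z or a similar tool after reflashing:

```python
# Expected spec for the Strix 1070 OC BIOS, per the figures in this thread.
STRIX_OC_SPEC = {"base_mhz": 1657, "boost_mhz": 1860, "power_target_pct": 120}

def bios_matches_spec(readings: dict, spec: dict = STRIX_OC_SPEC) -> bool:
    """True only if every reported value matches the expected spec."""
    return all(readings.get(key) == value for key, value in spec.items())

# Example: readings taken (manually, from GPU-Z) after a reflash.
after_flash = {"base_mhz": 1657, "boost_mhz": 1860, "power_target_pct": 120}
print(bios_matches_spec(after_flash))  # True -> the OC BIOS took effect
```

If any field differs after flashing, flashing the wrong image (for instance the non-OC Strix BIOS) is the first thing to suspect.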


----------



## Azazil1190

Quote:


> Originally Posted by *J Burgen*
> 
> What do you need help with?


Why do I get a lower OC on the memory with the new BIOS from ASUS? I'm asking whether someone else has the same behavior.
Thanks


----------



## HOODedDutchman

Who knows, though. It could be that Nvidia certified the use of these chips; it could be that the partners did, if Nvidia allows partners to change hardware without confirmation or testing on its side. It could have been Micron. We're never going to know, so let's move on. If anyone comes in here having issues, we point them to the fixed BIOS. Other than that, we're going in circles, and it's defeating the purpose of the thread.


----------



## Azazil1190

Quote:


> Originally Posted by *MyNewRig*
> 
> Are you sure the correct Strix OC BIOS was the one flashed in the update? is your power target still going to 120%? is your Base Clock still 1657 MHz and your Boost Clock still 1860 MHz after the update?
> 
> If that is the case then no one here or anywhere else can help answer your question, we don't know what the original problem was and we don't know what changes were implemented in that BIOS to predict the behavior, also ASUS can not help you because if you create a ticket with their support you will just get the "overclocking is not guaranteed" or "we don't support overclocking" treatment.
> 
> So you pretty much just run the BIOS that works best for you or wait a few more weeks/months until we uncover more about all this with more reviews and investigation on the way.


Thanks for the answer mate, appreciate it.
Yes, the power target and the clocks are OK; I already checked them. The only difference is the lower OC on the memory, that's all.
I'm going to give the new one one more try and do a fresh install of the drivers.


----------



## MyNewRig

Quote:


> Originally Posted by *Azazil1190*
> 
> Thanks for the answer mate appreciate.
> Yes the power target and the clcocks are ok I already checked them.The only different is the less oc on the memories thats all.
> I m gonna give one more try to the new one and im gonna make a fresh install the drivers.


You are welcome, no problem buddy. Did the new BIOS at least solve the hard-crashing checkerboard BSOD when you reach your max memory OC?

By that I mean: on the old BIOS, for all of us, if the card's max memory OC limit was, say, +300 and you set +350 and played a game or ran a benchmark, you got checkerboard crashing and a system restart, or loss of screen signal on exit.

You say the new BIOS only allows +400 for you now. What happens if you dial in +450 or +500? Does your system hard-crash and restart, or does only the driver crash with "display driver has stopped responding and has recovered"?
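The offset-stepping procedure being described can be sketched roughly like this. `run_stress` is a hypothetical hook around whatever stress test you use (Heaven, 3DMark, etc.) that reports how a given offset behaved; the result strings, step size and ceiling are all assumptions for illustration:

```python
def find_max_stable(run_stress, start=0, stop=600, step=50):
    """Step the memory offset upward and return
    (highest stable offset, first failure mode seen, or None)."""
    last_stable, failure = start, None
    for offset in range(start + step, stop + step, step):
        result = run_stress(offset)  # "stable", "driver_crash" or "hard_crash"
        if result != "stable":
            failure = result
            break
        last_stable = offset
    return last_stable, failure

# Dry run with a fake card that driver-crashes above +400:
fake_card = lambda offset: "stable" if offset <= 400 else "driver_crash"
print(find_max_stable(fake_card))  # (400, 'driver_crash')
```

The failure mode matters here: a soft driver recovery versus a checkerboard hard-crash is exactly the difference between the new and old BIOS behavior being asked about.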


----------



## MyNewRig

Quote:


> Originally Posted by *Azazil1190*
> 
> Thanks for the answer mate appreciate.
> Yes the power target and the clcocks are ok I already checked them.The only different is the less oc on the memories thats all.
> I m gonna give one more try to the new one and im gonna make a fresh install the drivers.


Sorry, one more small thing in case you forgot to do it: did you restart your PC after flashing the updated BIOS? A vBIOS update does not take effect until you restart your system. Also, yes, uninstall your current driver completely: use DDU in safe mode to remove all traces, restart, and install the latest driver. You can also try previous drivers and see if the behavior changes. I know that is a lot of work and probably a lot of hassle, but that is the world Nvidia has us living in now.


----------



## Roland0101

Quote:


> Originally Posted by *Dude970*
> 
> Well said


Thanks.


----------



## muzammil84

Quote:


> Originally Posted by *TheGlow*
> 
> He cant differentiate. By next week NVidia will be to blame for only 8 pepperonis on his pizza instead of 9.5


Quote:


> Originally Posted by *Roland0101*
> 
> The issue was published at Geforce.com by gtbtk.
> 
> *You ranted.*
> 
> NVIDIA acknowledged the issue.
> 
> *You ranted.* (and introduced weird conspiracy theories.)
> 
> NVIDIA said they have a fix and that they will provide the fix to the AIBs.
> 
> *You ranted.*
> 
> Some AIBs acknowledged that they have the fix and that they will release a new VBios.
> 
> *You ranted.*
> 
> The issue got fixed.
> 
> *You rant on and on...*
> 
> It was nothing else than a technical problem, everything else is just in your head.
> It's an EVGA product.
> 
> Can you really not comprehend that this is a AIB card and that NVIDIA is not responsible for the design mistakes the partners make.
> Now after one problem is fixed you search for another reason to bash NVIDIA?
> 
> Man, get a life.


???


----------



## Brohem0th

Darn you moderators, doing your job effectively and efficiently! I wanted to come in here and take a dump on whoever replied to my post four pages back with a long, silly string of ranting and nonsense!

But then you removed their post and hopefully banned them. I am now oddly sad and happy at the same time.


----------



## Klue22

Quote:


> Originally Posted by *Brohem0th*
> 
> Darn you moderators, doing your job effectively and efficiently! I wanted to come in here and take a dump on whoever replied to my post four pages back with a long, silly string of ranting and nonsense!
> 
> But then you removed their post and hopefully banned them. I am now oddly sad and happy at the same time.


Nobody has been banned. We aren't that strict here.








As a general reminder, it is better to report posts that you think are disruptive rather than reply to them. Worst-case scenario, I dismiss your report but still end up poking around in the reported thread. I'd rather that happen than have to clean up a flame war and do all that silly infraction business.


----------



## shhek0

Hello guys,

which 1070 would you recommend? I know this is an overclocking forum, but the person I'm asking for is not going to overclock. He just bought a 1440p monitor (and a GTX 1080 is too expensive), so a 1070 would be a good fit. From what I saw, at stock the Gainward Phoenix (Golden Sample), which is available in my country, has the highest boost clock. Is that the thing to look for, given that most cards boost past the official numbers anyway? Also, the price is like $50-60 below the MSI versions, for example. Thanks in advance!


----------



## Hunched

Quote:


> Originally Posted by *shhek0*
> 
> -snip-


Gainward and Palit are great; same company, different names.

Just stay away from EVGA and Gigabyte, as they are constantly making stupid design decisions. I don't care how good EVGA's support is when their products aren't.

MSI is the best you can get in North America, with Asus and Zotac in second.


----------



## Majentrix

The Gainward Phoenix has the best cooler of all the partner cards. It's also the quietest and overclocks quite well, and the GS and GLH models have some of the highest out-of-the-box performance.
Just make sure you have room for it: it's a triple-slot leviathan and a hair longer than 10.5".


----------



## TheGlow

Yeah, be sure to confirm the dimensions. The measurements said my MSI would be a touch too tight, but it fits, nice and snug.


----------



## Azazil1190

Quote:


> Originally Posted by *MyNewRig*
> 
> Sorry one more small thing in case you forgot to do it, do you restart your PC after flashing the updated BIOS? a vBIOS update does not take effect until you restart your system, also yes uninstall your current driver completely, use DDU in safe mode to remove all traces, restart and install the latest driver, you can also try with previous drivers and see if behavior changes, i know that is a lot of work to do and probably a lot of hassle but that is the world Nvidia has us living in now.


Yeah, I restarted, of course. The only thing I didn't do is a fresh install; I always use DDU, by the way.
The strange thing with my card is that even with the "old" BIOS I never had a blue screen, only crashes in 3DMark Fire Strike if I was past the stable memory clocks.
With the new one it's the same, but with less memory OC; that's the difference. I'm going to flash the new one again, do a fresh install of the drivers, and try to OC the core first to see if I get any improvement there, and then the memory. If not, I'll flash back to the old one.
Note that I read about a case like mine on the ASUS forum: the person has the same behavior with the memory (less OC), but now he can do 2200 on the core with the new BIOS, whereas with the previous one he was unstable at 2100.
Strange things.


----------



## LogicusMPS

Quote:


> Originally Posted by *Azazil1190*
> 
> Hi guys I need your help here.
> Btw I flash my strix 1070 oc with the new bios for micron (yes I got micron)
> With the first bios that came with the card I was able to be stable at +500 for memory.Now after flash the new one at +500 im not stable. Im stable at +400 anything above is unstable.
> So I flash again the first bios and im stable again at +500 I dont have issues with that bios at all but I cant understand why the new one cut my oc.I didn't play with the core with the new one.
> Sorry for my English and thanks in advance


I have the same problem... Please, can you or someone give me a link to the old BIOS? I just want to check it one more time, because as far as I can see my stable memory OC is around 150 lower with this new BIOS.


----------



## MyNewRig

Quote:


> Originally Posted by *Azazil1190*
> 
> The strange thing on my card is even with the "old" bios never had a blue screen only crashes on 3dmark f.s if I was pas the stable memory clocks.
> Notice that I read on asus forum case like mine.
> Person have the same behavior on his card with the memory(less oc) but now he can do 2200 on the core with the new.
> With the previous bios he was unstable at 2100.
> Strange things.


Hmm, that is a first. How come you did not get the checkerboard and BSOD crash on the old BIOS when you reached your max memory OC? Every single one of us with a Micron card had the same issue. Forgive me for the stupid question: are you sure you have Micron memory?









If you can confirm you actually have Micron and have checked for it properly, then there really are some strange anomalies with these memory chips. You, for example, did not hard-crash on the old BIOS, and we have a guy here with the nickname TheGlow who can do +850 on his Micron card; that is 9700 MHz effective, almost the same data rate as GDDR5X!

No one really knows what is going on here. Can one chip have such variable yield results, with a stable range as wide as 7600 MHz to 9700 MHz from the lowest to the highest sample?

Did your core OC change as well? It seems obvious that the new BIOS increased the TDP on all cards: it looks like Micron ICs use more power than Samsung ICs, so they increased the power envelope of the entire card to account for that, and as a result people with exceptional core silicon were able to achieve a higher OC on the core. But that does not explain how your memory OC became lower, or why you didn't hard-crash with the old BIOS to begin with!

No one understands anything, man.
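For anyone wondering where figures like "+850 = 9700 MHz effective" come from, this is the arithmetic, assuming (as overclocking tools such as MSI Afterburner display it for these cards) that the offset applies to the double-data-rate clock and that the effective GDDR5 rate is twice that:

```python
# GTX 1070 stock memory clock as displayed by OC tools (double data rate),
# i.e. 8008 MHz effective GDDR5.
STOCK_DDR_CLOCK = 4004  # MHz

def effective_mhz(offset_mhz: int) -> int:
    """Effective GDDR5 data rate for a given overclocking offset."""
    return (STOCK_DDR_CLOCK + offset_mhz) * 2

print(effective_mhz(850))  # 9708 -> the ~9700 MHz quoted above
print(effective_mhz(0))    # 8008 -> stock effective speed
```

By the same math, the ~7600 MHz low end mentioned above would correspond to an offset of roughly -200.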

Quote:


> Originally Posted by *LogicusMPS*
> 
> I have the same problem... Please can you or someone give me a link of old BIOS. Just to check it one more time, because as i saw i got around -150 stable OC on memory with this new BIOS.


Tell me the exact model you have, and I can either send you the correct BIOS that I extracted from my own cards (I have both the OC and non-OC versions) or point you to the appropriate TechPowerUp link so you can download it.


----------



## MyNewRig

Quote:


> Originally Posted by *Klue22*
> 
> Nobody has been banned. We aren't that strict here.
> 
> 
> 
> 
> 
> 
> 
> 
> As a general reminder it is better to report posts that you think are disruptive rather than reply. Worst case scenario is I dismiss your report but still end up poking around in the reported thread. I'd rather that happen than have to clean up a flame war and have to do all that silly infraction business.


Excellent moderation policies; man, I like your style.


----------



## gtbtk

Quote:



> Originally Posted by *LogicusMPS*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Azazil1190*
> 
> Hi guys I need your help here.
> Btw I flash my strix 1070 oc with the new bios for micron (yes I got micron)
> With the first bios that came with the card I was able to be stable at +500 for memory.Now after flash the new one at +500 im not stable. Im stable at +400 anything above is unstable.
> So I flash again the first bios and im stable again at +500 I dont have issues with that bios at all but I cant understand why the new one cut my oc.I didn't play with the core with the new one.
> Sorry for my English and thanks in advance
> 
> 
> 
> I have the same problem... Please can you or someone give me a link of old BIOS. Just to check it one more time, because as i saw i got around -150 stable OC on memory with this new BIOS.

Here you go https://www.techpowerup.com/vgabios/185940/asus-gtx1070-8192-160711-2


----------



## Azazil1190

Quote:


> Originally Posted by *MyNewRig*
> 
> mmmm, that is a first, how come you did not checkerboard and BSOD crash on the old BIOS when you reached your max memory OC? every single one of us with a Micron card had the same issue, forgive me for the stupid question, are you sure you have Micron memory?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you can confirm you actually have Micron and have checked for it properly then there are really some strange anomalies with these memory chips, you for example did not hard-crash on the old BIOS and we have a guy with the nickname TheGlow here who can do +850 on his Micron card, that is 9700Mhz effective!!! almost the same data rate of the GDDR5X!
> 
> No one really knows what the hell is going on here, can a chip have such variable yield results? where its stable range is as wide as being 7600Mhz to 9700Mhz from lowest to highest sample?
> 
> Did your core OC change as well? it seems obvious that the new BIOS increased TDP on all cards because it looks like Micron ICs use more power than Samsung ICs do, so they increased the power envelope for the entire card to account for that and as a result people with exceptional core silicon were able to achieve higher OC on the core, but that does not explain how your memory OC became lower and actually how you didn't hard-crash with the old BIOS to begin with!
> 
> No one understands anything man.
> Tell me the exact model you have and i can send you the correct BIOS that i extracted from my cards (have both the OC and non-OC versions) or point you to the appropriate TechPowerUp link for your card so you can download it.


Yep mate, I got Micron for sure (GPU-Z info). I didn't flash the new one yet because I have a lot of work, and I want time to run some tests more carefully.
I will post again about the core clocks if anything improves beyond the 2088 max OC I currently get on the core.


----------



## ahmedmo1

Just sold my desktop (in sig) and bought a laptop with a GTX 1070. 

Hopefully, I won't ever have to get a desktop again. Consolidating everything into one machine should be awesome!


----------



## shhek0

Thanks to all who replied to my post! It would be the Gainward Phoenix GS. Thank you again.


----------



## M3Stang

Just caught it in the act


----------



## TheGlow

Quote:


> Originally Posted by *ahmedmo1*
> 
> Just sold my desktop (in sig) and bought a laptop with a GTX 1070.
> 
> Hopefully, I won't ever have to get a desktop again. Consolidating everything into one machine should be awesome!


I never liked laptops. They always seemed to underperform, have power issues, etc.
Want to upgrade that CPU? Good luck. GPU? Nope. Screen? Only as an add-on, and then you're talking about setting up a desk again.
Keyboard? Only as a USB add-on, etc., etc.
Quote:


> Originally Posted by *M3Stang*
> 
> Just caught it in the act


Act of what? It didn't blow up; waste of 2 minutes.


----------



## M3Stang

Quote:


> Originally Posted by *TheGlow*
> 
> I never liked laptops. They always seemed to underperform, have power issues, etc.
> Want to upgrade that CPU? Good luck. GPU? Nope. Screen? Only as an add-on, and then you're talking about setting up a desk again.
> Keyboard? Only as a USB add-on, etc., etc.
> Act of what? It didn't blow up; waste of 2 minutes.


If you had been following my previous posts about it not POSTing, you would have seen that I've been having intermittent POST issues since I installed the 1070, and a few people in this thread have been trying to help me out. I figured now that I've got it on video, it would help to further diagnose the issue. Thanks for watching!


----------



## ahmedmo1

Quote:


> Originally Posted by *TheGlow*
> 
> I never liked laptops. They always seemed to underperform, have power issues, etc.
> Want to upgrade that CPU? Good luck. GPU? Nope. Screen? Only as an add-on, and then you're talking about setting up a desk again.
> Keyboard? Only as a USB add-on, etc., etc.
> Act of what? It didn't blow up; waste of 2 minutes.


Well, that's traditionally been the case, but it's becoming less relevant. Laptops have improved tremendously with the Pascal chips; the GPU has always been the problem, but much less so now. With CPUs that are sufficiently powerful for most use-cases (including high-end gaming, although throttling is still a concern in a small chassis), the barriers are becoming smaller. The system I got allows for upgrading the CPU, GPU, RAM, and storage.

My ideal scenario is a laptop that can be quickly docked at a desk with a keyboard, mouse, monitor, and speakers, then subsequently used in a portable fashion. The improvements in packing powerful hardware into increasingly small builds have been stellar.

The major technical roadblocks have been the GPU and thermals, but the primary issue has been price. It's especially a roadblock at the low end and mid-range. In my case, though, I sold my system for ~1600 CAD without the HDD and bought the laptop for $2100 CAD, with the following specs:

i7-6700 (got it because it's a 65W TDP part)
16GB DDR4-2133 RAM
GTX 1070
15.6" 4K PLS G-Sync display
1 x HDMI 2.0, 2 x DP 1.3, 2 x USB 3.1 Gen 2 (with Thunderbolt)

Those specs for $2100 CAD are pretty damn impressive. The one issue was the size, but it'll be stationary most of the time. This isn't the ideal solution, but it's pretty close. We'll get there from a specs perspective in a few years.

Price is a different story entirely...


----------



## gtbtk

Quote:


> Originally Posted by *M3Stang*
> 
> Just caught it in the act


The first copy of the MSI Gaming X that I bought did that. I returned it and got a second card that works flawlessly.

I have been playing around with cross-flashing my cards, and occasionally I will end up "bricking" one after what is reported as a successful flash. The symptoms look exactly the same as what you showed in the video: the PC will not wake the monitor from sleep mode to display POST. Booting from the iGPU and reflashing solves the problem.

I would suggest that you get hold of the identical BIOS file and try reflashing your card, either after a successful boot on your 1070, or by booting off the iGPU if you have one available. You may find that it solves the problem.
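For what it's worth, that backup-then-reflash sequence can be sketched as a tiny script. This is only a sketch: nvflash has to be run with admin rights, the `--save` and `-6` flags are from common nvflash builds and may differ on yours (check `nvflash --help` first), and the `dry_run` guard is just there so you can see the commands before committing to anything.

```python
import subprocess

def reflash(bios_file, dry_run=True):
    """Back up the current VBIOS, then reflash the card with bios_file.

    Flag names (--save, -6) follow common nvflash builds; verify them
    against your own copy of nvflash before running for real.
    """
    steps = [
        ["nvflash", "--save", "backup.rom"],  # save the current VBIOS first
        ["nvflash", "-6", bios_file],         # flash, overriding ID-mismatch prompts
    ]
    if dry_run:
        return steps  # just show what would run, without touching the card
    for cmd in steps:
        subprocess.run(cmd, check=True)
    return steps
```

With `dry_run=True` it only returns the two commands, so you can sanity-check them before doing the real thing from an elevated prompt.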


----------



## LogicusMPS

Quote:


> Originally Posted by *MyNewRig*
> 
> Tell me the exact model you have and I can send you the correct BIOS that I extracted from my cards (I have both the OC and non-OC versions), or point you to the appropriate TechPowerUp link for your card so you can download it.


Strix 1070 OC. With Micron ofc... 

Quote:


> Originally Posted by *gtbtk*
> 
> Here you go https://www.techpowerup.com/vgabios/185940/asus-gtx1070-8192-160711-2


Thanks man. Is this the OC version for sure?


----------



## gtbtk

Quote:


> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Here you go https://www.techpowerup.com/vgabios/185940/asus-gtx1070-8192-160711-2
> 
> 
> 
> Thanks man. Is this for OC version for sure?
Click to expand...

Yes, that is the OC version.

I pulled the link from the verified section of their database as well.


----------



## LogicusMPS

Quote:


> Originally Posted by *gtbtk*
> 
> Yes that is the OC version.
> 
> I pulled the link from the verified section of their database as well


Thanks my friend, I will try it and give feedback as soon as possible.


----------



## MyNewRig

Quote:


> Originally Posted by *LogicusMPS*
> 
> Strix 1070 OC. With Micron ofc...
> Thanks man. Is this for OC version for sure?


I can confirm that file is identical to the one I pulled from my Strix OC, with only one difference: it looks like these BIOS files are serialized to identify the specific card, batch, or production run, because the only difference between that file and mine in a hex comparison is a number that looks like a serial.

The moral of the story: yes, go ahead and flash it to your Strix OC, it is the correct BIOS for the card. But be careful: if you send the card for RMA, ASUS will know it is not the BIOS the card shipped with, due to the different serial, and might deny warranty. It is identical in every other way, but if you ever RMA the card, don't send it with that BIOS flashed; flash the official updated BIOS back on instead to avoid trouble.
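If anyone wants to check this on their own dumps, here is a minimal sketch of the kind of byte-level comparison described above. The function and the stand-in data are mine, not from any vendor tool; a short run of consecutive differing offsets usually points at a serial/ID string rather than a functional change.

```python
def diff_bytes(a: bytes, b: bytes):
    """List (offset, byte_a, byte_b) for every position where two
    equally sized BIOS dumps differ."""
    if len(a) != len(b):
        raise ValueError("dumps differ in size; not the same base BIOS")
    return [(i, x, y) for i, (x, y) in enumerate(zip(a, b)) if x != y]

# Example with stand-in data; read real dumps with open(path, "rb").read()
mine   = b"STRIX-OC-BIOS-SN0001"
theirs = b"STRIX-OC-BIOS-SN0002"
print(diff_bytes(mine, theirs))  # one differing offset: the "serial"
```

If the list comes back as a single cluster of offsets and everything else matches, the files are functionally the same BIOS.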


----------



## MyNewRig

Quote:


> Originally Posted by *ahmedmo1*
> 
> Well, that's traditionally been the case, but it's becoming less relevant. Laptops have improved tremendously with the Pascal chips; the GPU has always been the problem, but much less so now. With CPUs that are sufficiently powerful for most use-cases (including high-end gaming, although throttling is still a concern in a small chassis), the barriers are becoming smaller. The system I got allows for upgrading the CPU, GPU, RAM, and storage.
> 
> My ideal scenario is a laptop that can be quickly docked at a desk with a keyboard, mouse, monitor, and speakers, then subsequently used in a portable fashion. The improvements in packing powerful hardware into increasingly small builds have been stellar.
> 
> The major technical roadblocks have been the GPU and thermals, but the primary issue has been price. It's especially a roadblock at the low end and mid-range. In my case, though, I sold my system for ~1600 CAD without the HDD and bought the laptop for $2100 CAD, with the following specs:
> 
> i7-6700 (got it because it's a 65W TDP part)
> 16GB DDR4-2133 RAM
> GTX 1070
> 15.6" 4K PLS G-Sync display
> 1 x HDMI 2.0, 2 x DP 1.3, 2 x USB 3.1 Gen 2 (with Thunderbolt)
> 
> Those specs for $2100 CAD are pretty damn impressive. The one issue was the size, but it'll be stationary most of the time. This isn't the ideal solution, but it's pretty close. We'll get there from a specs perspective in a few years.
> 
> Price is a different story entirely...


I never understood the concept of a high-end, bulky, heavy gaming laptop; the form factor and price just do not fit the use-case. My desktop with a GTX 1070 costs exactly 1/3 (one third) of a similarly specced laptop in my market, and it runs faster and cooler.

My thinking is that a gaming laptop will need to be plugged in anyway for any serious gaming session; the battery would not last you one hour under a heavy gaming load, so you will be tethered and lose mobility anyway when you game. You also don't have any upgrade options; you put all your eggs in one basket, and if one component is damaged after your warranty runs out, you lose your entire investment in the machine.

Beyond all that, who actually plays high-end games on the go? If you are on your way to work, on a vacation, at school, or wherever you may be, how would that be the time and place to play games?

My strategy is to always have a high-end gaming desktop and a cheap i3 or i5 thin-and-light laptop that I use for browsing, text editing, programming, PowerPoint, etc. This way I distribute my budget properly, and for the same money or less I get two devices instead of one, each serving a different purpose. A high-end desktop plus a cheap work laptop together cost less than a 1070 gaming laptop, and they let me work on the go while carrying a light, slim device, and game at home with the best performance possible.

One more workaround to add to the picture: I live in a country with very fast internet, so if I am away from home and badly need to game (which rarely happens), I use Steam in-home streaming while connected to my home desktop via VPN and stream the game I want.

I feel that is a much more efficient setup. What are your views? Why do you think putting all that money into a gaming laptop is actually a good thing? And what is that model you have that allows you to upgrade the GPU? How is that possible?


----------



## M3Stang

Quote:


> Originally Posted by *gtbtk*
> 
> The first copy of msi gaming x that I bought did that. I returned it and got a second card that works flawlessly.
> 
> I have been playing around with cross flashing my cards and occasionally I will end up "bricking" it after what is reported as a Sucessful flash. The symptoms look exactly the same as what you showed on the video. The PC will not activate the monitor from its sleep mode tpo display POST. Booting from iGPU and reflashing solves the problem
> 
> I would suggest that you get hold of the identical bios file and try reflashing your card, either after a sucessful boot in your 1070, or by booting off iGPU if you have one available. You may find that it solves the problem


I got the graphics card at Best Buy about a week or so ago, so I think I have about a week or two left to return it. I can't remember if Best Buy has a 14- or 30-day policy.


----------



## LogicusMPS

Quote:


> Originally Posted by *MyNewRig*
> 
> I can confirm that file is identical to the one I pulled from my Strix OC, with only one difference: it looks like these BIOS files are serialized to identify the specific card, batch, or production run, because the only difference between that file and mine in a hex comparison is a number that looks like a serial.
> 
> The moral of the story: yes, go ahead and flash it to your Strix OC, it is the correct BIOS for the card. But be careful: if you send the card for RMA, ASUS will know it is not the BIOS the card shipped with, due to the different serial, and might deny warranty. It is identical in every other way, but if you ever RMA the card, don't send it with that BIOS flashed; flash the official updated BIOS back on instead to avoid trouble.


Thank you, I'm aware of that. I know the procedure and everything, but thanks for reminding me.

I'm going to try it.


----------



## gtbtk

Quote:


> Originally Posted by *M3Stang*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The first copy of the MSI Gaming X that I bought did that. I returned it and got a second card that works flawlessly.
> 
> I have been playing around with cross-flashing my cards, and occasionally I will end up "bricking" one after what is reported as a successful flash. The symptoms look exactly the same as what you showed in the video: the PC will not wake the monitor from sleep mode to display POST. Booting from the iGPU and reflashing solves the problem.
> 
> I would suggest that you get hold of the identical BIOS file and try reflashing your card, either after a successful boot on your 1070, or by booting off the iGPU if you have one available. You may find that it solves the problem.
> 
> 
> 
> I got the graphics card at Best Buy about a week or so ago, so I think I have about a week or two left to return it. I can't remember if Best Buy has a 14- or 30-day policy.
Click to expand...

If you can still swap it out, do that. You deserve reliability out of the box.


----------



## M3Stang

Quote:


> Originally Posted by *gtbtk*
> 
> If you can still swap it out, do that. You deserve reliability out of the box.


Yeah, I might go do that this week. Or maybe RMA it, if it's too late to return, since I already registered it. Even with the very long gaps between these startups, you still think it's the card?


----------



## MyNewRig

Quote:


> Originally Posted by *M3Stang*
> 
> Yeah, I might go do that this week. Or maybe RMA it, if it's too late to return, since I already registered it. Even with the very long gaps between these startups, you still think it's the card?


If I remember correctly, you can unregister cards with EVGA, so you can still return it. Did you try the latest BIOS for that motherboard? Gigabyte has a habit of releasing beta BIOSes all the time with new features or improvements, but these beta BIOSes usually have some strange glitches. Do you happen to have one of those flashed? If so, try the latest stable BIOS for a while (I mean downgrading to the latest one that is not marked as beta) and see if it still happens. I personally suspect the motherboard more, because I had all sorts of glitches with Gigabyte motherboards over the past few generations, including a black screen on boot, though that was when connected directly to the HDMI port on the motherboard.


----------



## ahmedmo1

Quote:


> Originally Posted by *MyNewRig*
> 
> I never understood the concept of a high-end, bulky, heavy gaming laptop; the form factor and price just do not fit the use-case. My desktop with a GTX 1070 costs exactly 1/3 (one third) of a similarly specced laptop in my market, and it runs faster and cooler.
> 
> My thinking is that a gaming laptop will need to be plugged in anyway for any serious gaming session; the battery would not last you one hour under a heavy gaming load, so you will be tethered and lose mobility anyway when you game. You also don't have any upgrade options; you put all your eggs in one basket, and if one component is damaged after your warranty runs out, you lose your entire investment in the machine.
> 
> Beyond all that, who actually plays high-end games on the go? If you are on your way to work, on a vacation, at school, or wherever you may be, how would that be the time and place to play games?
> 
> My strategy is to always have a high-end gaming desktop and a cheap i3 or i5 thin-and-light laptop that I use for browsing, text editing, programming, PowerPoint, etc. This way I distribute my budget properly, and for the same money or less I get two devices instead of one, each serving a different purpose. A high-end desktop plus a cheap work laptop together cost less than a 1070 gaming laptop, and they let me work on the go while carrying a light, slim device, and game at home with the best performance possible.
> 
> One more workaround to add to the picture: I live in a country with very fast internet, so if I am away from home and badly need to game (which rarely happens), I use Steam in-home streaming while connected to my home desktop via VPN and stream the game I want.
> 
> I feel that is a much more efficient setup. What are your views? Why do you think putting all that money into a gaming laptop is actually a good thing? And what is that model you have that allows you to upgrade the GPU? How is that possible?


I indicated the size was an issue, but things change, so I don't know why you're reiterating that point. What I'm saying is that the paradigm you're referring to is the one I was stuck in for quite a while, but it's changing quite quickly. This is the same sentiment that tech reviewers like LinusTechTips and HardwareCanucks have expressed. I don't claim to be in the largest segment of the population, but my segment will represent a larger and larger share of the market.

I'd also like to know where you live, because a desktop shouldn't be 1/3 the price of a similarly specced laptop, unless the buyer went out of their way to be especially thrifty on the desktop but spent stupidly on the laptop. I did the price comparison in USD and CAD, and the desktop is around 2/3 the price of the laptop at the high end. Add the secondary laptop and you're at the same price as the high-end laptop, or more.

The point, for a segment of the market, is to consolidate the number of devices, and some folks are willing to pay a premium. Plus, there'll almost always be an outlet nearby.

It's also worth noting that I appreciate you indicated having two systems is YOUR strategy. But consumers fall into different segments and have different requirements. There is a segment actively seeking to consolidate the number of devices they own. They don't want a desktop, laptop, and smartphone (the tablet market is already in terrible shape as it is).

The model I have is the Eurocom SKY X4 E2; the GPU is an MXM module, so it can be replaced. See the video below.


----------



## MyNewRig

Quote:


> Originally Posted by *ahmedmo1*
> 
> I indicated the size was an issue, but things change, so I don't know why you're reiterating that point. What I'm saying is that the paradigm you're referring to is the one I was stuck in for quite a while, but it's changing quite quickly. This is the same sentiment that tech reviewers like LinusTechTips and HardwareCanucks have expressed. I don't claim to be in the largest segment of the population, but my segment will represent a larger and larger share of the market.
> 
> I'd also like to know where you live, because a desktop shouldn't be 1/3 the price of a similarly specced laptop, unless the buyer went out of their way to be especially thrifty on the desktop but spent stupidly on the laptop. I did the price comparison in USD and CAD, and the desktop is around 2/3 the price of the laptop at the high end. Add the secondary laptop and you're at the same price as the high-end laptop, or more.
> 
> The point, for a segment of the market, is to consolidate the number of devices, and some folks are willing to pay a premium. Plus, there'll almost always be an outlet nearby.
> 
> It's also worth noting that I appreciate you indicated having two systems is YOUR strategy. But consumers fall into different segments and have different requirements. There is a segment actively seeking to consolidate the number of devices they own. They don't want a desktop, laptop, and smartphone (the tablet market is already in terrible shape as it is).
> 
> The model I have is the Eurocom SKY X4 E2; the GPU is an MXM module, so it can be replaced. See the video below.


In Europe, my CPU/motherboard/RAM/1070 combo cost about €800, while the cheapest GTX 1070 laptop is €2000. I did not take the case and PSU into account because I already have them, but even if you include those, the desktop would cost about half of a similarly specced laptop in my market. The difference would allow for the purchase of a very high-end non-gaming laptop, and I end up with both a gaming device and a work/travel device.

I see device consolidation in this case as unneeded, because I cannot think of a use-case where I will need very high-end 4K gaming on the go. Of course, if you are the kind of person who travels or moves a lot, or goes to LAN parties frequently as part of your usual lifestyle, then it makes sense in your case. But for me, having to carry a heavy, bulky device around all the time, on which I would mostly be doing work-related activities when out and about and only rarely run a game, is a bad idea.

Smartphones and tablets are in totally different categories, so they cannot be included in the same pool of devices one aims to consolidate. A tablet is mainly a content-consumption device, not a content-creation device like a laptop; I cannot get any serious work done on my tablet, usually only reading, watching YouTube and such, due to the lack of a keyboard and convenient multitasking. And a smartphone is mainly a voice-communications device that can be used as a tablet, though inconveniently, due to the small screen real estate.

These small swappable MXM GPUs are very interesting. How do you reseat the cooling solution when changing them? And where do you actually buy these GPUs, and for how much?


----------



## ucode

Quote:


> Originally Posted by *MyNewRig*
> 
> it looks like these BIOS files are serialized to identify the specific card, batch or production run, because the only difference between that file and the one i have in a hex comparison is a number that looks like a serial.


Flashing shouldn't change the original serial number; however, the date of the last flash is recorded.

There are some laptops with dual 1070s out there that would put a lot of desktops to shame. IMHO the high price tag is more about portability; nice if one can afford it.


----------



## LogicusMPS

Guys, just a mention that I hope will be helpful for most of you. As I said a few pages back, I had problems with an unstable OC on my Strix 1070 OC (Micron). After updating the drivers today (clean install) from 375.63 to 375.70, I got a huge stability improvement. I'm still testing, step by step, but for now I passed 17k in 3DMark (2013); with the last drivers it couldn't go over 16400. The same goes for Unigine Valley: it's now hitting over 5500, where with the last drivers it was around 5100.

I've left the BIOS as-is: didn't flash anything else, just the updated one from the ASUS page.

Nice improvements.

3DMark 2013

Valley benchmark
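For a rough sense of scale, those score jumps work out like this (a back-of-the-envelope sketch, taking "17k" as 17000; not a benchmark methodology):

```python
def pct_gain(new: float, old: float) -> float:
    """Percentage improvement of a new score over an old one."""
    return (new - old) / old * 100.0

print(round(pct_gain(17000, 16400), 1))  # 3DMark gain on 375.70, about 3.7%
print(round(pct_gain(5500, 5100), 1))    # Valley gain, about 7.8%
```

So the Valley uplift is roughly double the 3DMark one in relative terms, which is a big change for a driver update alone.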


----------



## TheGlow

Quote:


> Originally Posted by *M3Stang*
> 
> The graphics card I got at Best Buy about a week later so I had about a week or two to return the graphics card I think. I can't remember if Best Buy has 14 or 30 day policy.


You need to check, as Best Buy has changed its policy countless times. 14 days is the default now, I think; 30 if you have the Silver-tier membership.


----------



## DeathAngel74

It's 14 days.


----------



## M3Stang

Guess I'm dealing with it for now, then.


----------



## DeathAngel74

If you call or go to the store and explain the issue in person, they might be able to honor the mfg. warranty and swap it out. One of my old 970s died 2 days before the 1-year mfg. warranty ran out, and they swapped it out for me. I still had the receipt, though.


----------



## gtbtk

Quote:


> Originally Posted by *M3Stang*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> If you can still swap it out, do that. You deserve reliability out of the box.
> 
> 
> 
> Yeah, I might go do that this week. Or maybe RMA it, if it's too late to return, since I already registered it. Even with the very long gaps between these startups, you still think it's the card?
Click to expand...

If it won't even POST, there is something going on.


----------



## khanmein

Kev & Bryan blamed the board partners, saying the Micron issue is not NVIDIA's fault.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I think that it is pretty obvious that Nvidia and the AIB partners have known there were issues with the Micron cards from the beginning. I suspect they had not worked out exactly what the problem was, in order to fix it, until we started to experiment and then shake the tree at nvidia.com. In all probability, they got sidetracked looking at the memory chips themselves and not the supporting logic and circuitry.
> 
> In my experience, these manufacturers usually work on a deny, deny, deny basis for months until they are cornered, like in the 970 memory drama, or until they have a fix to distribute so they can feel they were the "savior".
> 
> I was initially surprised that nvidia actually volunteered to work on a bios update so quickly.
> 
> If I were in their position with the same issues, I would not send out Micron cards for review knowing they would crash, either.
> 
> In light of this issue, though, we now know that BIOS updates are initially created by Nvidia and pushed out to the partners for branding customization. The limited default clock-speed variations across different brands, together with the absence of tweaking tools, suggest that the core aspects of the Pascal BIOS are not customizable by the partners and have to come from Nvidia themselves.
> 
> The Samsung-only review samples, together with what surely must have been Nvidia-supplied reviewer-BIOS cards, suggest that the cards were all sourced from, and cherry-picked by, a single source. The only common denominator here is Nvidia themselves.
> 
> 
> 
> I already provided him with 2 articles that reviewed Micron cards. He promised to make a video about it. I would like him to give a proper explanation regarding the Micron memory.
> 
> Damn~ Steve Burke gave me the kind of explanation that changing parts is no big deal, and Joker, slanted towards EVGA, said the amount of VRM heat won't be a big deal if you're not overclocking.
Click to expand...

Well, you got some YouTube attention, quoting the WCCFTech article that rips off the Guru3D article, which was only half-researched in the first place.


----------



## khanmein

^^ Yeah, Bryan mentioned that a lot of users received Micron units in Europe, but I'm staying quite near Bryan, in Australia. F Asia (MH370 missing plane).

I don't want any attention, but I want a proper answer: why did they receive Samsung cards and deceive viewers? Now they blame the board partners? The FE didn't use Micron at all? This is NVIDIA's, the board partners', and the tech reviewers' fault.

https://www.facebook.com/AcheenAudioShoppe/

How many users in my country have bought a GTX 1070? FYI, I just pointed out one of the shops that gets its stock from the main local distributor, "Ban Leong Technologies Sdn Bhd".

For the VRM issue, the credit should go to Hilbert Hagedoorn from Guru3D.

http://www.guru3d.com/articles_pages/evga_geforce_gtx_1070_sc_superclocked_gaming_review,10.html (July)

http://www.tomshardware.de/nvidia-geforce-gtx-1080-gtx-1070-grafikkarten-roundup,testberichte-242137-2.html (October, after the FTW release)

Palit released a new vBIOS on 28th Oct and started shipping new cards where GPU-Z doesn't even show the memory manufacturer (e.g. Samsung or Micron) anymore.

Palit's continued tweaking proves the Micron chips are not stable, and now they are using this method. The only way to find out which chips a card has is to dismantle it, and be prepared to void the warranty.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> ^^ Yeah, Bryan mentioned that a lot of users received Micron units in Europe, but I'm staying quite near Bryan, in Australia. F Asia (MH370 missing plane).
> 
> I don't want any attention, but I want a proper answer: why did they receive Samsung cards and deceive viewers? Now they blame the board partners? The FE didn't use Micron at all? This is NVIDIA's, the board partners', and the tech reviewers' fault.
> 
> https://www.facebook.com/AcheenAudioShoppe/
> 
> How many users in my country have bought a GTX 1070? FYI, I just pointed out one of the shops that gets its stock from the main local distributor, "Ban Leong Technologies Sdn Bhd"


I'm in HK and bought my card on 21 July; it was manufactured in July. There was not a single copy of an 86.04.26 BIOS from anyone in the TechPowerUp VGA database when I uploaded mine, the day I got the card up and running in my PC.

I don't think they appreciated that all brands produced these Micron cards starting in July, including the Founders Edition, though it seems they switched back to Samsung in September. All brands moved the BIOS up to the .26 range with the introduction of Micron memory. Gigabyte's story, which changed from "we don't have Micron" to "we have it but have never heard of any problems", seems to have been taken as fact.

I think the reviewers have been kept in the dark to allow for some plausible deniability. I certainly would not go so far as to say that it is their fault or that they have intentionally deceived their audience.


----------



## mrmouse

KFA² has released the new BIOS versions.

http://www.kfa2.com/kfa2/1070bios/


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> 
> 
> 
> 
> 
> kev & bryan blamed board partners not NVIDIA fault due to the micron.


That is utter BS. There are very strong clues that Nvidia is the one behind the move. First, Nvidia in a sense "rushed" to deliver the BIOS fix, while getting the updated BIOS from board partners has really been like pulling teeth, as @gtbtk put it in one of his comments somewhere. That points to the partners being annoyed at having to do extra work to fix something that was not their fault in the first place; if it were the partners' fault, they would be running to deliver that BIOS fix while Nvidia would just say "it is not our problem, go deal with the company you bought from", but that was not the case at all.

Another clue is that the mass switch happened on all 1070s at almost exactly the same time. During that period the partners had Samsung 8Gbps chips lying around and were putting them in the 1060s, but never in the 1070. If it had really been their choice, both the 1060 and 1070 would have been fully switched, or at least mixed; again, that was not the case.

These reviewers are either misinformed or are ignoring very strong clues that contradict the conclusions they are making.


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> That is utter BS. There are very strong clues that Nvidia is the one behind the move. First, Nvidia in a sense "rushed" to deliver the BIOS fix, while getting the updated BIOS from board partners has really been like pulling teeth, as @gtbtk put it in one of his comments somewhere. That points to the partners being annoyed at having to do extra work to fix something that was not their fault in the first place; if it were the partners' fault, they would be running to deliver that BIOS fix while Nvidia would just say "it is not our problem, go deal with the company you bought from", but that was not the case at all.
> 
> Another clue is that the mass switch happened on all 1070s at almost exactly the same time. During that period the partners had Samsung 8Gbps chips lying around and were putting them in the 1060s, but never in the 1070. If it had really been their choice, both the 1060 and 1070 would have been fully switched, or at least mixed; again, that was not the case.
> 
> These reviewers are either misinformed or are ignoring very strong clues that contradict the conclusions they are making.


If it's not an NV issue, then why did Manuel Guzman make the statement, "Thank you for your feedback. I have filed a bug so that our software team can look into this."?

Usually when I report an issue with the driver, he asks me to provide more information so the software team can reproduce it, like when I reported the TDR and MFAA issues; e.g. enabling MFAA in the global settings would cause an error, but that is now totally fixed.

This proves that MG and his team already knew about the issue with Micron.


----------



## khanmein

where's Roland0101 or Roland01? i expected same guys cos so far i'm the one using the same name at here, guru3d & geforce forums. this is my f real name..

Roland0101 said "First, the imho most important thing, checkerboard artifacts crashes are completely gone."

http://www.overclock.net/t/1614656/gtx-1070s-micron-feedback/10

does he mean that before Asus released the new vbios for Micron, his card got checkerboard-artifact crashes?

Roland01 said "My Strix OC with micron memory don't shows any of the problems described for RotTR. No flickering, no artifacts, no core clock throttling, stays not OCed (that means not further as Asus did overclocked the card anyway.) at 1987Mhz even in several hour long seasons. "

https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/2/

does he mean that the "no flickering, no artifacts, no core clock throttling, stays not OCed" applies to RotTR only, but he got checkerboard-artifact crashes in other games??

i'm confused; please enlighten me. thanks (no offense)


----------



## asdkj1740

the pascal ftw from evga is a big upgrade compared to the fe pcb.
the maxwell and kepler ftw were only slightly upgraded compared to the reference pcb.

but this time evga totally ****ed up......


----------



## asdkj1740

Quote:


> Originally Posted by *asdkj1740*
> 
> 
> 
> 
> 
> 
> pascal ftw from evga is a big upgaded compared to the fe pcb.
> maxwell and kelper ftw are just slightly upgraded compared to the reference pcb.
> 
> but this time evga is totally ****ed up......


----------



## asdkj1740

1070ftw may have the same issue


----------



## MyNewRig

Quote:


> Originally Posted by *asdkj1740*
> 
> 
> 
> 
> 
> 
> pascal ftw from evga is a big upgaded compared to the fe pcb.
> maxwell and kelper ftw are just slightly upgraded compared to the reference pcb.
> 
> but this time evga totally ****ed up......


SH**t, this happened when the card was not even under stress or heated up! i thought this was related to the VRMs overheating after hours of stressful workloads, but the guy had just turned on his PC!! so it is actually defective circuitry or a bad design that is causing shorts.

too many nasty things happening this generation, i guess Gaming is not "Perfected" yet!


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> SH**t, this happened when the card was not even under stress or heat up! i thought this was related to VRMs overheating after hours of stressful workloads but the guy just turn on his PC!! so it is actually defective circuitry or bad design that is causing shorts.
> 
> too many nasty stuff happening in this generation, i guess Gaming is not "Perfected" yet!


the VRM overheating news is overshadowing the micron issue.


----------



## asdkj1740

Quote:


> Originally Posted by *MyNewRig*
> 
> SH**t, this happened when the card was not even under stress or heat up! i thought this was related to VRMs overheating after hours of stressful workloads but the guy just turn on his PC!! so it is actually defective circuitry or bad design that is causing shorts.
> 
> too many nasty stuff happening in this generation, i guess Gaming is not "Perfected" yet!


dude, evga is right: evga said the fires are not related to VRM overheating. this video proves it.


----------



## GunnzAkimbo

Galax HOF heavy duty?

*Don't care about colour scheme, size, number of fans or noise, LEDs, length, width, breadth, patterns, logos.*

Backplate and full frontplate, surely it is tough stuff?

*All I care about is that it WORKS and doesn't break after 30 seconds.*


----------



## MyNewRig

Quote:


> Originally Posted by *asdkj1740*
> 
> dude, evga is right, evga said the caught fire is not related to overheat of vrm. this video proves it.


If i remember correctly, EVGA was talking about the black screen and 100% fan issue when they said it was not related to overheating; they were not talking about catching fire.

So now an EVGA Pascal card has Micron memory, is a fire hazard, has VRMs that run hot as hell, and can black screen randomly at any time. that is four different issues affecting one single card, which is really bad.

If you have an EVGA card you have to at least update the BIOS (which looks like a problem for many people, judging from the comments i see in the forums) and remove the cooler to apply thermal pads yourself, plus keep your fingers crossed that nothing else goes bad afterwards. and it still costs $400 or more.


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> If i remember correctly EVGA was talking about the Black screen and 100% fan issue when they said it was not related to overheating, they were not talking about catching fire.
> 
> So now an EVGA Pascal card has MIcron memory, is a fire hazard, has VRMs that run hot as hell, and can black screen randomly at any time, these are four different issues affecting one single card, that is really bad.
> 
> If you have an EVGA card you have to at least update the BIOS (which looks like a problem for many people from the comments i see in the forums) and remove the cooler to apply thermal pads yourself, plus keep your fingers crossed that nothing else goes bad afterwards and it still costs $400 or more


the vram is samsung, it will explode, that's y they changed to micron.


----------



## Nukemaster

Quote:


> Originally Posted by *khanmein*
> 
> the vrm is samsung will exploded that's y they changed to micron.


I hope this is just an S7 joke.


----------



## asdkj1740

Quote:


> Originally Posted by *MyNewRig*
> 
> If i remember correctly EVGA was talking about the Black screen and 100% fan issue when they said it was not related to overheating, they were not talking about catching fire.
> 
> So now an EVGA Pascal card has MIcron memory, is a fire hazard, has VRMs that run hot as hell, and can black screen randomly at any time, these are four different issues affecting one single card, that is really bad.


1. evga never revealed the cause of the black screen problem
2. lots of users have recently complained on the evga forum about cards catching fire because of overheating VRMs, and an evga tech support rep said the fires are not related to overheating.
3. all AIBs use micron
4. the ftw mosfets can provide 30a at 100c, so keeping them under 100c is totally fine. at 125c some ppl guess the mosfets derate to ~20a, which is still fine for the 1070 ftw and 1080 ftw at 100% power limit (208w for the 1070 ftw and 215w for the 1080 ftw).
evga set the bios very conservatively.
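To sanity-check point 4, here is a rough per-phase load estimate. This is only a sketch: the ~10 GPU power phases, ~1.0 V core rail, ~90% VRM efficiency and even current sharing are my assumptions, not specs from this thread.

```python
# Rough per-phase current estimate for the 1070 FTW's 208 W power limit.
# Assumptions (not from the thread): ~10 GPU power phases, ~1.0 V core
# voltage, ~90% VRM efficiency, current shared evenly across phases.

def per_phase_current(board_power_w, vcore=1.0, phases=10, efficiency=0.9):
    """Approximate output current (A) each phase must carry."""
    output_power = board_power_w * efficiency  # power actually delivered to the core rail
    total_current = output_power / vcore       # I = P / V on the output side
    return total_current / phases

amps = per_phase_current(208)
print(round(amps, 1))  # ~18.7 A per phase, under even the ~20 A derated guess at 125c
```

Under those assumptions each phase carries well under 30 A at 100c, which is consistent with the "keep them under 100c and it's fine" reading.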


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> the vrm is samsung will exploded that's y they changed to micron.


It would be interesting to count how many times we've typed the words Samsung and Micron during the past two months; my S and M keys are starting to wear out


----------



## MyNewRig

Quote:


> Originally Posted by *asdkj1740*
> 
> 1. evga never reveal the causes of black screen problem
> 2. lots of users complaint about caught fire because of overheating vrm at evga forum recently and an evga tech support said caught fire is not related to overheating.
> 3. all aib uses micron
> 4. ftw mosfet can provide 30a at 100c, so keep them under 100c is totally fine. at 125c some ppl guess the mosfet will derate to ~20a, which is still fine for 1070 ftw and 1080 ftw at 100% power limit (208w for 1070 ftw and 215w for 1080 ftw).
> evga set bios very conservatively.


1- Of course they did; look at the first post, EDIT BY EVGA (10/24/2016): http://forums.evga.com/GTX-1080-FTW-Black-screen-fans-spin-up-to-100-m2530081.aspx

2- I did not see that; i only saw them saying the BS100% is not related to overheating.

3- But they don't have cards catching fire or black screening, and they certainly never ask end users to mod the cooling solution themselves.

4- The risk is not to the MOSFETs themselves but to the surrounding components like the vRAM and PCB. also, i would not want anything inside my case to reach 120c even if it can handle it; would you put a boiler or an oven inside your case?


----------



## Lahatiel

Quote:


> Originally Posted by *MyNewRig*
> 
> ...
> i would not want anything inside my case to reach 120c even if it can handle it, would you put a boiler or an oven inside your case?


Well, a PC with built-in pizza oven is a mod I would actually buy.


----------



## MyNewRig

Quote:


> Originally Posted by *Lahatiel*
> 
> Well, a PC with built-in pizza oven is a mod I would actually buy.


LOL, or a tea maker/boiler mod; that one is safe, it only reaches 100c tops


----------



## asdkj1740

Quote:


> Originally Posted by *MyNewRig*
> 
> 1- Of course they did, look at the first post EDIT BY EVGA (10/24/2016) http://forums.evga.com/GTX-1080-FTW-Black-screen-fans-spin-up-to-100-m2530081.aspx
> 
> 2- Did not see that, i only saw them saying the BS100% is not related to overheating.
> 
> 3- But they don't have cards catching fire, black screen and they sure never ask end users to mod the cooling solution themselves.
> 
> 4- The risk is not on the MOSFETs themselves but the surrounding components like vRAM, and PCB, also i would not want anything inside my case to reach 120c even if it can handle it, would you put a boiler or an oven inside your case?


oh, i didn't notice the evga response, thank you.


----------



## khanmein

Quote:


> Originally Posted by *Nukemaster*
> 
> I hope this is just an S7 joke.


the note 7; meanwhile i'm still using an S3 from the 1st batch that came with the SOD (sleep of death), but it's rocking android 6.0.1..


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> note 7 while i'm still using S3 that came with SOD 1st batch but rocking with android 6.0.1..


LOL, only now did i get the joke; i could not link Samsung to explosions at first and thought you were just talking nonsense. good joke though


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> Would be interesting to count how many times we typed the word Samsung and Micron during the past two months, my S and M on the keyboard are starting to wear out


my PBT keycaps won't wear out that easily. every 3 months i switch from white to black (http://www.ikbc.com.cn/)


----------



## syl1979

I have just updated the bios on my Galax 1070 Gamer.
The bios is COMMON to Micron and Samsung memory:
https://www.techpowerup.com/vgabios/187077/187077

Here is GPU-Z after the update.


I have a Samsung card, but could still find a small gain:
- Original bios: stable, no artefacts at +425 (2215 MHz)
- Updated: stable, no artefacts at +465 (2235 MHz)
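As a side note on how those +offset numbers map to the clocks in parentheses: a small sketch, assuming a GTX 1070 GDDR5 base clock of 2002 MHz and an Afterburner-style offset applied at half value (both are my assumptions, but they reproduce the figures above).

```python
# Map an Afterburner-style memory offset to the resulting GDDR5 clock.
# Assumptions: 2002 MHz base memory clock on the GTX 1070, and the
# software offset being halved before it is applied (common for GDDR5).

GDDR5_BASE_MHZ = 2002

def memory_clock(offset):
    """Actual memory clock in MHz for a given +offset setting."""
    return GDDR5_BASE_MHZ + offset / 2

def effective_rate(offset):
    """Effective data rate in Mbps (GDDR5 transfers 4x per clock)."""
    return memory_clock(offset) * 4

print(memory_clock(425))  # 2214.5, i.e. the ~2215 MHz quoted for the original bios
print(memory_clock(465))  # 2234.5, i.e. the ~2235 MHz quoted after the update
```

At offset 0 this gives the stock 8008 Mbps effective rate, which is why GPU-Z shows 2002 MHz for an un-overclocked card.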


----------



## khanmein

Quote:


> Originally Posted by *syl1979*
> 
> I have just updated the bios on my Galax 1070 Gamer
> The bios is COMMON for Micron and Samsung memories
> https://www.techpowerup.com/vgabios/187077/187077
> 
> Here is GPU-Z after update
> 
> 
> I have Samsung card, but could find some small gain
> - Original bios : stable no artefacts +425 (2215 Mhz)
> - Updated : stable no artefacts +465 (2235 Mhz)


samsung is almost at its peak, so the small gain is really quite good, but don't push it too hard or it can cause an explosion

whereas micron is meant to be crippled to prevent future damage. be grateful for what u have now.

y don't u grab it from the official website? http://www.galax.com/1070bios/


----------



## asdkj1740

MyNewRig, if you check the recent threads on the evga forum, you may see another fatal mistake with that cooling plate:
there are gaps between the vram chips and the thermal pads on the cooling plate. this is baking....


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> samsung almost reaching the peak so the small gain is really quite good but don't push too hard can cause explosion
> 
> whereas micron is meant to be crippled to prevent future damage. be grateful for what u had now.


You know what is cool? Being without my 1070 for a few days, i discovered that my Intel HD 530 overclocked to 1350Mhz is quite capable: i can play DOOM, Shadow Warrior 2, Mirror's Edge Catalyst etc on Medium-Ultra at 720p and get over 30FPS; it actually reaches 46FPS in DOOM on the Ultra preset!

So i have quite enough staying power to wait for Vega/Pascal V2 without having to jump on a bad purchase.


----------



## Nukemaster

Quote:


> Originally Posted by *khanmein*
> 
> note 7 while i'm still using S3 that came with SOD 1st batch but rocking with android 6.0.1..


My bad.


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> You know what is cool? being without my 1070 for a few days now i discovered that my Intel HD 530 overclocked to 1350Mhz is quite powerful, i can play DOOM, Shadow Warrior 2, Mirror Edge Catalyst etc on Medium-Ultra at 720p and get over 30FPS , actually reaches 46FPS in DOOM on Ultra preset!
> 
> So i have quite strong staying power to wait for VEGA/Pascal V2 without having to jump on a bad purchase.


i just finished downloading FIFA 17 & during the lineup i got headless-player glitches, but i know this is a well-known issue with intel graphics. 720p is not playable for me on a 1440p screen. my GTX 970 is still being RMA'd cos one fan stopped spinning. damn, i've waited 6~7 weeks and still no feedback.


----------



## MyNewRig

Quote:


> Originally Posted by *asdkj1740*
> 
> MyNewRig, if you go evga forum to check about the recent threads, you may see another fatal mistake of that cooling plate.
> there are gaps between the vram chip and the thermal pads on the cooling plate. this is baking....


I am tired of checking that forum, it has nothing but issues, problems and fixes. I never actually saw EVGA as a high-quality manufacturer; my previous EVGA cards had nothing but problems, especially the most annoying coil whine of any cards i've had, and their support was not that great either. i always wonder what the fuss is about with EVGA. maybe they are good in the US, but here in the EU they are nothing special; i actually had a much better RMA experience here with Gigabyte than i did with EVGA. GB were really very nice and helpful while EVGA was not.


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> i just finished dl FIFA 17 & during the lineup got glitches headless but i know this is the well known issue with intel graphics. 720p is not playable for me under 1440p screen. my GTX 970 still RMA-ing cos one fan stop spinning. damn i waited for 6~7 weeks still no feedback.


Yeah, Intel iGPUs have always had texture artifacts in most games. Actually, DOOM at 1080p with 50% resolution scale, the Ultra preset and 8x AA looks very nice and is also very playable, and i can use Ultra textures in most games; i just have to lower shadows and reflections to get a decent FPS. But hey, that is an emergency temp solution, i am not going to play like that for the rest of my life. i just don't want the current Micron 1070, and i also don't want to buy an RX 480 because i know AMD is about to announce new stuff. so at least with the iGPU i can get a "taste" of gaming while waiting for a satisfying product to hit the market. if they start making 1070s with Samsung memory again, i will buy one immediately.


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> Yeah Intel iGPU always have textures artifacts in most games, it has always been that way, actually DOOM on 1080p with 50% resolution scale, Ultra preset and 8x AA looks very nice and is also very playable, i can also play with Ultra textures in most games, just have to lower shadows and reflections to get a decent FPS, but hey that is an emergency temp solution, i am not going to be playing like that for the rest of my life, i just don't want the current Micron 1070 and also don't want to buy an RX 480 because i know AMD is about to announce new stuff, so at least with the iGPU i can get a "taste" of gaming while waiting for a satisfying product to hit the market, if they start making 1070s with Samsung memory again i will buy it immediately.


my cash is ready & i'd let NV rip me off, but they don't appreciate it. come take my money when u show me the samsung explosion.


----------



## MyNewRig

Quote:


> Originally Posted by *khanmein*
> 
> my cash is ready & let NV rip off but they don't appreciate it. come take my money when u show me the samsung explosion.


Exactly, we accept paying the Nvidia tax/premium, but now we get nothing for it. if they continue to use the "cheap" Micron chips like that reviewer said, then the product is not premium anymore. even if they reduce the price and still use Micron, i will not buy it, because it is not stable and it messes up my PC.


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> Exactly, we accept to pay the Nvidia tax/premium but now we get nothing for it, if they continue to use the "cheap" Micron chips like that reviewer said then the product is not premium anymore, even if they reduce the price and still use Micron i will not buy it because it is not stable, it messes up my PC.







The FX 8350 works quite well with the GTX 1070, but my i5-4460 is decent too.


----------



## gammagoat

I have the Gigabyte G1; everything is at stock and I am quite happy with the card's performance.

I haven't tried any overclocking yet, as I wanted to get my CPU overclock settled first.

I like to keep things as cool as possible, and I am finding the Windforce fans very annoying. Loud and whiny.

I am thinking I could remove the current fans and shroud and replace them with either two 120mm or three 92mm fans. Any suggestions on the best way to set this up and which fans would work best?


----------



## syl1979

Quote:


> Originally Posted by *khanmein*
> 
> y don't u grab it from official website? http://www.galax.com/1070bios/


I did get it from the official Galax website; the techpowerup database link is my upload of the bios after the update.

It may be useful for cross flashing for people with an EX or EXOC card; the PCB should be the same, but the power limit can move up to 250W.


----------



## zipzop

Quote:


> Originally Posted by *asdkj1740*
> 
> 
> 
> 
> 
> 
> pascal ftw from evga is a big upgaded compared to the fe pcb.
> maxwell and kelper ftw are just slightly upgraded compared to the reference pcb.
> 
> but this time evga totally ****ed up......


Ok that card is not even plugged in


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> samsung almost reaching the peak so the small gain is really quite good but don't push too hard can cause explosion
> 
> whereas micron is meant to be crippled to prevent future damage. be grateful for what u had now.


Where did that come from?

My pre-update-bios Micron memory card overclocks and runs without artifacts at +540 (2275Mhz), as long as the voltage is at 0.800 when the 3D load starts. There is another Micron card owner, TheGlow, who can run his card at +800.

There is no evidence of Micron memory being crippled to prevent damage, only evidence of a bug that is being rectified by the bios update.


----------



## MyNewRig

Quote:


> Originally Posted by *zipzop*
> 
> Ok that card is not even plugged in


WOW, you are right, i just noticed: none of the PCI-E connectors are plugged in. but would that make the card short and catch fire? it should just not run at all, no?


----------



## syl1979

The new Galax 1070 bios seems to help with core frequency as well. The frequency wall was at around 2100 MHz before and moved to 2150 after the update (stable at 2164 @ 1.081v / 2050 @ 1.06v).


----------



## gtbtk

Quote:


> Originally Posted by *syl1979*
> 
> New Galax 1070 bios seems to help also for core frequency. The frequency wall was at around 2100 Mhz before, moved to 2150 after update (stable at 2164 1.081v / 2050 1.06v)


Galax increased the power limit from 225W to 250W with this bios update, i believe.


----------



## Avendor

New VBIOS for Gigabyte. Released for SAMSUNG memory; meanwhile, i'm waiting for the Micron one
http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios
F2 - 2016/10/31


----------



## MojoW

Quote:


> Originally Posted by *Avendor*
> 
> New VBIOS for Gigabyte. Release for SAMSUNG Memory, meantime i'm waiting for Micron
> http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios
> F2 - 2016/10/31


Well, i tried to download it, but the file comes out corrupt every time, and on the NL side of the Gigabyte site it is not even posted yet.
Guess i'll have to wait as well.


----------



## Avendor

Indeed, it seems to be corrupted for the time being:
"This XML file does not appear to have any style information associated with it. The document tree is shown below."


----------



## gtbtk

Quote:


> Originally Posted by *MojoW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Avendor*
> 
> New VBIOS for Gigabyte. Release for SAMSUNG Memory, meantime i'm waiting for Micron
> http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios
> F2 - 2016/10/31
> 
> 
> 
> Well i tried to download it but the file is corrupt everytime and on the NL side of gigabyte it is not even updated yet.
> Guess i'll have to wait aswell.

Strange. The base bios comes from Nvidia, and all the other 86.04.50.00.XX bioses from the other manufacturers have been released to resolve the Micron memory problem.

I suspect that the "for Samsung" is a typo, or that the entry is only a placeholder, as the file is not on any of the sites for download.


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> Strange. The base bios comes from Nvidia and all other 86.04.50.00.XX bioses from other manufacturers have been to resolve the micron Memory problem.
> 
> I suspect that the "for samsung" is a typo or that entry is only a place holder as the file is not on any of the the sites for download.


LOL, man, that amount of negligence! as if the situation is not confusing enough as it is, they don't even know which memory type they are releasing the BIOS for. unbelievable.

Where is MSI in all this? what have they been doing all this time, building a new Great Wall of China or new pyramids?! how hard is it for them to throw some custom settings into a base BIOS, test it and release it?!


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Strange. The base bios comes from Nvidia and all other 86.04.50.00.XX bioses from other manufacturers have been to resolve the micron Memory problem.
> 
> I suspect that the "for samsung" is a typo or that entry is only a place holder as the file is not on any of the the sites for download.
> 
> 
> 
> LOL, man that amount of negligence! as if the situation is not confusing enough as it is, they even don't know what memory type they are releasing the BIOS for, unbelievable.
> 
> Where is MSI from all this? what are they doing all that time? building a new wall of china or new pyramids! , how hard is it for them to throw some custom settings at a base BIOS test it and release it?!

The best I can get out of MSI is this



so hopefully by the end of this week


----------



## gtbtk

I suspect there is more expertise on Nvidia 1070s in this forum than in all the AIB vendors' tech support departments put together


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> Strange. The base bios comes from Nvidia and all other 86.04.50.00.XX bioses from other manufacturers have been to resolve the micron Memory problem.
> 
> I suspect that the "for samsung" is a typo or that entry is only a place holder as the file is not on any of the the sites for download.


agree...


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> Where did that come from?
> 
> My preupdate bios Micron Memory card overclocks and runs without artifacts at +540 (2275Mhz) as long as the voltage is at .800 when start the 3d load. There is another micron card from TheGlow who can run his Micron card at +800
> 
> There is no evidence of Micron being crippled to prevent damage. Only evidence of a bug that is being rectified by the bios update


can u see i'm trolling? by the way, the new GIGA F2 is for micron, right?


----------



## ITAngel

Have they released any bios that helps with OC on the GTX 1070? Just curious, but I know the Zotac GTX 1070 AMP Extreme has no bios in its download/support section. I have no issues with my card; maybe that's why they have not needed to release one yet.


----------



## Avendor

Quote:


> Originally Posted by *MojoW*
> 
> Well i tried to download it but the file is corrupt everytime and on the NL side of gigabyte it is not even updated yet.
> Guess i'll have to wait aswell.


It seems they fixed it; it can be downloaded now, but i'll wait for the Micron one


----------



## saunupe1911

Hey Asus Strix OC owners,

Has anyone with Samsung memory downloaded the new BIOS??? Does it improve performance? It seems like it's tailored more toward Micron memory chips.


----------



## MojoW

Is there no way of flashing the bios from dos? I don't trust bios flashing from the OS.
I'm new to nvidia so if anybody can point me in the right direction that would be great.


----------



## Avendor

Quote:


> Originally Posted by *khanmein*
> 
> can u see i'm trolling? by the way, the new GIGA F2 is for micron right?


Don't think so. I guess we have to wait a few more days for Micron... or maybe they just messed up and wrote Samsung instead of Micron; that would be really odd


----------



## gtbtk

Quote:


> Originally Posted by *Avendor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MojoW*
> 
> Well i tried to download it but the file is corrupt everytime and on the NL side of gigabyte it is not even updated yet.
> Guess i'll have to wait aswell.
> 
> 
> 
> It seems they fixed, now can be downloaded but ill wait for Micron

You should clarify that with Gigabyte. I think that someone made a mistake in the description.

The core bios comes from Nvidia, and every other 86.04.50.00.xx bios released by the other vendors has been for the Micron fix. One of the other guys here is running Palit's updated .50 bios on his Samsung card and says it works fine; he even got a performance boost because Palit increased the power limit for his card to 250W.

It would greatly surprise me if Gigabyte had reverted the Micron bug changes to make this one Samsung specific, but you had better check whether that is the case.


----------



## emsj86

I'm looking into getting a 1070. I have 2 x 780s, bios modded and water cooled. Will the 1070 be a real upgrade, or mainly just more vram and not having to worry about SLI scaling by having only one card? Also, I was looking at the EVGA SC version. Have owners had any problems with it?


----------



## Avendor

Quote:


> Originally Posted by *MojoW*
> 
> Is there no way of flashing the bios from dos? I don't trust bios flashing from the OS.
> I'm new to nvidia so if anybody can point me in the right direction that would be great.


You can check: http://www.overclock.net/t/1523391/easy-nvflash-guide-with-pictures-for-gtx-970-980
http://www.overclock.net/t/1521334/official-nvflash-with-certificate-checks-bypassed-for-gtx-950-960-970-980-980ti-titan-x
Personally, I just ran the installer from Gigabyte to version F11_Beta when i bought the 1070; nothing went wrong.
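For the DOS/command-prompt route, the usual nvflash sequence looks roughly like this. Treat it as a sketch: the exact executable name and whether `--protectoff` is needed vary by nvflash build and card, so check `nvflash --help` and the guides linked above first.

```shell
:: Back up the current vbios before anything else
nvflash --save backup.rom

:: Disable EEPROM write protection (required on some nvflash builds)
nvflash --protectoff

:: Flash the new image; -6 allows a PCI subsystem ID mismatch,
:: which cross-vendor or cross-model flashes usually trigger
nvflash -6 newbios.rom
```

Keeping the `backup.rom` on the boot stick means you can always reflash the original image the same way if the new bios misbehaves.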


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> It would greatly surprise me if Gigabyte have reversed out any changes for the micron bug to make it Samsung specific but you are better to check that is the case.


I think the BIOS is interchangeable; there is no BIOS that is exclusively Micron and one that is exclusively Samsung. The new and old BIOSes work on both memory types, either by identifying the chip type and changing settings like timings and voltages accordingly, or by having baseline settings that work on both. if you check TechPowerUp's dumps of the BIOS values, you will find that all the old BIOS versions are marked as supporting Samsung/Micron memory.

This explains why Samsung memory OC improved with that BIOS as well: it probably increased the memory voltage in general, or changed the voltage adjustment algorithm in a way that works with both chips without causing the Micron chips to checkerboard.


----------



## MyNewRig

Quote:


> Originally Posted by *MojoW*
> 
> Is there no way of flashing the bios from dos? I don't trust bios flashing from the OS.
> I'm new to nvidia so if anybody can point me in the right direction that would be great.


It is not really that risky, and it takes only a second to flash; i don't know why people make such a big deal out of this. if you want to be safe, just disable your video adapter and close all open apps prior to flashing. even in the very unlikely event that you end up bricking your card, it is easily recoverable by booting with the iGPU and reflashing. i would say just go for it, there is really nothing to worry about.


----------



## Mr-Dark

Hello

Regarding the Evga ACX overheating, are the FTW Hybrid and the normal Hybrid affected or not?


----------



## MyNewRig

Quote:


> Originally Posted by *Mr-Dark*
> 
> Hello
> 
> Regarding the Evga ACX overheat, is the FTW Hybrid and the normal Hybrid affected or ?


not sure about the overheating, but the black screen with 100% fans glitch affects the Hybrid as well, cuz they have identical PCB designs and components, and the problems go beyond being just a cooler issue. it is a PCB design issue, so the Hybrid must also be affected to some degree; whether it is fatal or not, that i am not sure of.


----------



## Mr-Dark

Quote:


> Originally Posted by *MyNewRig*
> 
> not sure about the overheating but the black screen with 100% fans glitch affects the Hybrid as well cuz they have identical PCB design and components and the problems go beyond being just a cooler issue, it is a PCB design issue so the Hybrid must also be effected to some degree, weather it is fatal or not, that i am not sure of.


Thanks for the reply

I had an Evga SC with the black screen problem when the temp hit +75c.. also, as you say, 60% fan speed fixes the black screen problem..

should note that Evga cards with Samsung memory have zero problems.. my sister has an Evga SC with samsung memory and no problems at all..









No more Evga for Pascal.... Enough!


----------



## DeathAngel74

We need a pascal bios editor already.....


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MyNewRig*
> 
> Yeah Intel iGPU always have textures artifacts in most games, it has always been that way, actually DOOM on 1080p with 50% resolution scale, Ultra preset and 8x AA looks very nice and is also very playable, i can also play with Ultra textures in most games, just have to lower shadows and reflections to get a decent FPS, but hey that is an emergency temp solution, i am not going to be playing like that for the rest of my life, i just don't want the current Micron 1070 and also don't want to buy an RX 480 because i know AMD is about to announce new stuff, so at least with the iGPU i can get a "taste" of gaming while waiting for a satisfying product to hit the market, if they start making 1070s with Samsung memory again i will buy it immediately.
> 
> 
> 
> my cash is ready & let NV rip off but they don't appreciate it. come take my money when u show me the samsung explosion.

you got your mention


----------



## MyNewRig

Quote:


> Originally Posted by *Mr-Dark*
> 
> Thanks for the reply
> 
> I had Evga SC with black screen problem when the temp hit + 75c.. also as you say 60% fan speed fix the black screen problem..
> 
> should note Evga card's with Samsung memory have Zero problem.. as my sister have Evga SC with samsung memory and no problem at all..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No Evga anymore with Pascal.... Enough!


yeah, EVGA messed up big time. their forums are a disaster, filled with all sorts of ugly reports, from heat to fires to black screens and much more: poor engineering and QC. the entire Pascal line seems like a rushed product with so many issues; it is a mess all round.


----------



## MojoW

Quote:


> Originally Posted by *MyNewRig*
> 
> It is not really that risky and takes only a second to flash, i don't know why people make such a big deal out of this, if you want to make sure just disable your video adapter and close all open apps prior to flashing, even in the very unlikely event that you end up bricking your card it is easily recoverable by booting with iGPU and reflashing, i would say just go for it, there is really nothing to worry about.


If X99 had an iGPU, you mean?
I've flashed my 290s and other cards countless times through DOS (booted from a USB stick) and it has never failed, not once.
But I've also flashed a total of 8 times in Windows and it went wrong twice; luckily the card had a dual-BIOS switch, so recovery was hassle-free.
So in my experience it IS safer to do it from DOS.
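For anyone curious, the usual nvflash routine looks something like the sketch below. The file names are placeholders, and the commands are only echoed rather than executed, since flashing the wrong image can brick a card:

```shell
#!/bin/sh
# Sketch of a typical nvflash session (file names are placeholders).
# Steps are echoed, not run: never flash blindly.
flash_steps() {
  echo "nvflash --save backup.rom"   # 1. always dump the current vBIOS first
  echo "nvflash --protectoff"        # 2. disable the EEPROM write protection
  echo "nvflash -6 new_bios.rom"     # 3. flash; -6 overrides the subsystem-ID mismatch check when cross-flashing
}
flash_steps
```

Run from a DOS/UEFI USB stick, nothing else is touching the card mid-write, which is exactly why it fails less often than flashing from inside Windows.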


----------



## Mr-Dark

Quote:


> Originally Posted by *DeathAngel74*
> 
> We need pascal bios editor already.....


I think the benefit of a custom BIOS isn't as big as it was on Maxwell; Nvidia limited the OC potential with the voltage-scaling thing.

Some cards need a higher power limit to avoid throttling, and maybe boost disabled to avoid temperature throttling, but some cards with a high power limit don't need that. You know, I had 2 EVGA SCs in SLI and both throttled at stock clocks without any OC, while now with 2x MSI Gaming X the power usage barely hits 70%, and I can push both to 2050 MHz without any voltage increase.


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> you got your mention


This guy is really good; I'm surprised he doesn't have more viewers. The only problem is that he's repeating the same broken record the manufacturers are using: how can Samsung 8 Gbps memory be in shortage when tens of thousands of GTX 1060s are shipping with it at the same time the 1070s are coming with Micron? Someone needs to point that out in the comments.

Also, why does every one of them say they don't have a Micron card? How hard is it to go out and buy a Micron retail sample, or request one from the manufacturers, for testing and comparison? It makes no sense. Even the reviewers who actually have a Micron card are hiding their GPU-Z screenshots and not bothering with any testing or comparisons. This is one of the strangest dilemmas out there!


----------



## Hunched

I can't wait for everyone to once again forget another colossal EVGA failure, and recommend them everywhere as the best option when the next line of Nvidia cards release.


----------



## saunupe1911

Quote:


> Originally Posted by *Hunched*
> 
> I can't wait for everyone to once again forget another colossal EVGA failure, and recommend them everywhere as the best option when the next line of Nvidia cards release.


LMAO, I stick to Gigabyte and Asus for graphics cards and motherboards. EVGA makes helluva power supplies though!


----------



## Hunched

Quote:


> Originally Posted by *saunupe1911*
> 
> LMAO I stick to Gigabyte and Asus Graphics cards and motherboards. EVGA make helluva power supplies though!


Gigabyte is garbage too as far as I'm concerned; their Pascal cards are by far the cheapest looking and feeling, and their fans (the fans on the GPUs themselves) are terrible.

Asus is probably the best for motherboards; their BIOSes are well designed.
EVGA PSUs are great because EVGA doesn't make them: an actually competent company by the name of Super Flower does, and EVGA just puts its name on them as a rebrand.

EVGA has some Seasonic PSUs as well.

MSI makes the best Pascal cards in North America; I don't think it's even a competition.


----------



## saunupe1911

Quote:


> Originally Posted by *Hunched*
> 
> Gigabyte is garbage too as far as I'm concerned, their Pascal cards are by far the cheapest looking and feeling and their fans, as in fans on the GPU's, are terrible.
> 
> Asus is probably the best for motherboards, their BIOS are well designed.
> EVGA PSU's are great because they don't make them, an actual competent company by the name of SuperFlower does. EVGA just puts their name on them, rebranding.
> 
> EVGA has some SeaSonic PSU's as well.
> 
> MSI makes the best Pascal cards in North America, I don't think it's even a competition.


My Asus Strix OC begs to differ. The card is amazing; I honestly don't have any issues, and none of my Asus cards ever have. I haven't had a Gigabyte graphics card, but they damn sure make some good motherboards. The BIOS could be better, but the hardware is top notch.


----------



## gtbtk

Quote:


> Originally Posted by *MyNewRig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> you got your mention
> 
> 
> 
> 
> 
> 
> 
> This guy is really good i am surprised why he doesn't have a lot of viewers, only problem is that he is repeating the same broken record manufacturers are using, how can Samsung 8 Gbps be in shortage when 10s of thousands of GTX 1060s are coming out with it in the same time the 1070s are coming with Micron? someone needs to inform him in the comments.
> 
> Also why everyone of them say we don't have a Micron card? how hard is it for them to go out and buy a Micron retail sample for testing or request a Micron sample from manufacturers for review and comparison? this does not make sense at all, even those reviewers who actually have a Micron card are hiding their GPU-Z and not bothering performing any testing or comparisons, this is one of the strangest dilemmas in the world!
Click to expand...

it is certainly easy to be cynical.


----------



## zipzop

Quote:


> Originally Posted by *Mr-Dark*
> 
> I think the benefit from custom bios not big as Maxwell.. Nvidia limited the OC potential with voltage scale thing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Some card need higher power limit to avoid the throttle, and maybe boost off to avoid temp throttle but some card's with high power limit don't need that.. yo know I had 2 Evga SC in SLI and both throttle at stock clock without any OC... while now with 2* MSI Gaming-X the power usage barely hit 70%.. also i can push both to 2050mhz without any voltage increase...


Since you had SCs, can you recommend a good BIOS for that card with a higher power limit / TDP? I regularly hit power-limit throttling in many games at 1440p 144 Hz. I got the temps under control using a liquid metal TIM.


----------



## gtbtk

Quote:


> Originally Posted by *zipzop*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mr-Dark*
> 
> I think the benefit from custom bios not big as Maxwell.. Nvidia limited the OC potential with voltage scale thing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Some card need higher power limit to avoid the throttle, and maybe boost off to avoid temp throttle but some card's with high power limit don't need that.. yo know I had 2 Evga SC in SLI and both throttle at stock clock without any OC... while now with 2* MSI Gaming-X the power usage barely hit 70%.. also i can push both to 2050mhz without any voltage increase...
> 
> 
> 
> Since you had SC's can you recommend a good BIOS for that card which power limit / TDP is higher?? I get power limit throttling regularly in many games at 1440p 144hz. Got the temps under control with using a liquid metal TIM
Click to expand...

The SC cards only have a single 8-pin power connector, don't they? That will limit you to a BIOS that maxes out at a power limit of about 200 W.

The Asus Strix OC BIOS could be one to try out and see how it goes.

Remember that if you do cross-flash, you will lose all the Precision XOC extras like KBoost.
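Those connector figures can be sanity-checked against the PCIe spec power budget (these are spec ceilings only; vendors set the actual BIOS power limit below them):

```shell
#!/bin/sh
# Back-of-envelope PCIe power budget per the spec:
# 75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin.
SLOT=75
PIN6=75
PIN8=150
echo "single 8-pin:  $((SLOT + PIN8)) W ceiling"         # BIOS limits land around 200 W in practice
echo "8-pin + 6-pin: $((SLOT + PIN8 + PIN6)) W ceiling"
echo "8-pin + 8-pin: $((SLOT + PIN8 + PIN8)) W ceiling"
```

That's why a single-8-pin card ends up with a roughly 200 W BIOS limit while 8+6-pin boards can go well past it.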


----------



## TheGlow

And I was leaning towards the EVGA. I figured for a few bucks more, who cares, and went with the MSI. Even though it technically has Micron memory, I am very satisfied.
Once the vBIOS comes out, I can undo those voltage workarounds.


----------



## blued

Quote:


> Originally Posted by *Hunched*
> 
> Gigabyte is garbage too as far as I'm concerned, their Pascal cards are by far the cheapest looking and feeling and their fans, as in fans on the GPU's, are terrible.


Disagree. Not sure about the 1070 Gigabytes, but look at their 980 below: a lot of expensive copper there. Many other GPUs use alloys or other cheaper materials (especially EVGA).



Meanwhile, Evga 1080:


----------



## MojoW

Quote:


> Originally Posted by *gtbtk*
> 
> The SC cards only have an 8 pin power don't they? That will limit you to something that maxes out at a power limit of about 200W.
> 
> The Asus Strix OC could be one to try out and see how it goes.
> 
> Remember if you do cross flash, you will lose all the Precision XOC extras like KBoost


Crossflashing sounds interesting... I didn't know the BIOS could differ that much between cards and still work.


----------



## QPSS

Quote:


> Originally Posted by *blued*
> 
> Disagree. Not sure about 1070 Gigabytes, but look at their 980 below. Lot of expensive copper there. Many other GPUs use alloy or other cheaper materials (esp Evga).
> 
> 
> 
> Meanwhile, Evga 1080:


While I dislike EVGA too, how can you be sure that's not nickel-plated copper, like most coolers use?


----------



## LuckLess7

Hi. The last couple of weeks have been really stressful. I bought new computer parts and have been having immense problems with my hardware (ASUS Z170 Pro Gaming, i7 6700K, Corsair 3000 MHz DDR4, MSI GTX 1070 Gaming X, on air). I had just started to get into overclocking, but shortly after my first attempts, BSODs started appearing hourly. So I reverted to stock settings and was even more disappointed that I STILL got those BSODs... or random freezes and crashes. I've installed Windows 10 Pro three times since, and I asked my retailer for a complete swap of the hardware (board, CPU, and a brand-new PSU, a be quiet! Straight Power 500 W).

Except for the graphics card I swapped it all, did a fresh install of Windows and all the drivers, used DDU for every new driver version that came out, and didn't even touch overclocking the card or the CPU.

I read about that Micron memory problem and I pray... no, really, at this point I do pray that this is the problem. I get video_scheduler_internal_error, kernel_power, etc. I've been troubleshooting for roughly one month now and I'm about done; I can't take one more BullS***OfDeath. Every issue listed in the Event Viewer points to a video card and/or driver problem.

Could you confidently confirm that the problems I've been having (and a lot of others too?) are related to the Micron memory issue that's going around, and that a vBIOS update will fix it?

There's just nothing else I can think of that could cause this many problems, especially when the crashes only happen in 3D applications, like games.

Is it correct that I could simply flash ASUS's vBIOS onto my MSI GTX 1070 Gaming X?


----------



## syl1979

Quote:


> Originally Posted by *gtbtk*
> 
> Galax increased the power limit from 225 W to 250W with this bios update i believe


It was already at 250 W.

Here is my original bios :
https://www.techpowerup.com/vgabios/184645/184645


----------



## Roland0101

Quote:


> Originally Posted by *LuckLess7*
> 
> Hi. The last couple of weeks have been really stressful. I bought new computerparts and I've been having immense problems with my hardware (ASUS Z170 Gaming Pro, i7 6700K, Corsair 3000MHz DDR4, MSI GTX 1070 Gaming X - on AIR). I've just started to get into overclocking, but shortly after my first attempts, BSOD started to appear hourly. So I reverted to stock settings and was even more disappointed that I STILL got those BSOD... Or random freezes and crashes... I've installed Windows 10 Pro three times since, I asked my retailer for a complete swap of the hardware (Board, CPU and a brand new PSU (be quiet straight power 500W).
> 
> Except for the graphics card I swapped it all, did a fresh install of Windows and all the drivers, used DDU for every new version that came out and didn't even touch ocing the card nor the CPU.
> 
> I read about that Micron memory problem and i pray.... no, really, at this point I do pray, that this must be the problem. I get video_scheduler_internal_error, Kernel_pwr, etc... I've been troubleshooting for roughly one month now - I'm about done, I can't take one BullS***OfDeath anymore. And every issue listed in the event viewer directs to a video card and or -driver problem
> 
> Could you confidently confirm that the problems I've been having (and a lot of others too??) are related to the Micron memory issue that's around? And that a VBIOS update will fix it?
> 
> There's just nothing I can think of what else could cause that many problems, especially when these crashes only happen in 3d applications, like games.
> 
> Is that correct that I could simply flash ASUS's VBIOS onto my MSI GTX 1070 Gaming X ?


Did you get checkerboard artifact crashes?
If you use (and stress) the CPU's integrated graphics, do the crashes stop?
Did you try the card in the second PCIe slot of the motherboard?
Can you test the card in a different PC?

It could be the memory, but it could also be another part of the card that is defective.
If you can't test the card in another PC, I would rather RMA it than flash a BIOS from a different brand onto it.


----------



## syl1979

Do you get the crashes without the graphics card, running on the integrated GPU? If not, RMA your card.

If yes, then it looks a lot like a PSU or main RAM issue.

You've already ruled out the PSU.
I suggest you verify the memory settings in the BIOS (frequency, voltage, timings). Try some standard settings (e.g. 2133 MHz) and see if there's any improvement. Also try with only one stick.

You should also run some Memtest86+ testing to verify there are no problems there.


----------



## M3Stang

So maybe returning mine for the fire recall will fix my black screen. Look at that, an excuse to return it. I hope they send a new one first; I don't want to wait two weeks to use my computer again because of their mistake.


----------



## DeathAngel74

Quote:


> Originally Posted by *M3Stang*
> 
> So maybe me returning mine for the fire recall will fix my black screen. Look at that an excuse to return it. Hope they send a new one first. Don't want to wait 2 weeks to get to use my computer again for their mistake.


Is there an official recall now or??


----------



## M3Stang

Quote:


> Originally Posted by *DeathAngel74*
> 
> Is there an official recall now or??


Just saw this; it looks like you just have to ask for the exchange.

http://www.digitaltrends.com/computing/evga-gtx-1080-1070-overheating-issue/


----------



## DeathAngel74

But only for those unfortunate people with overheating issues, correct?


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> you got your mention


I'm not ryanrazer1, but what he answered is the same as what he said during TGW with Joker and Slunt, and even on the New Tech Lounge with Bryan and Kevin; Bryan also kept dissing "Slunt".

Steve Burke can compare all the AIO water coolers, so how come he's still unable to make this comparison? Based on his statement, I don't think he will make a video.

I pushed them into announcing it, but at the end of the day what I want is a proper neck-and-neck comparison between Samsung and Micron GTX 1070/1060 cards.


----------



## M3Stang

Quote:


> Originally Posted by *khanmein*
> 
> i'm not ryanrazer1 but what he answered is the same like he said during TGW with joker slunt & even the new tech lounge, bryan & kevin, bryan also continued dissed "slunt"


Not sure what you're referencing. The way I interpreted the article is that you should replace yours before it happens to you.


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> But only for those unfortunate people with overheating issues correct?


Of the tech reviewers who tested the FTW, none reported a heat issue; everything worked flawlessly. If you don't overclock, the VRM cooling is not a big deal.


----------



## khanmein

Quote:


> Originally Posted by *blued*
> 
> Disagree. Not sure about 1070 Gigabytes, but look at their 980 below. Lot of expensive copper there. Many other GPUs use alloy or other cheaper materials (esp Evga).
> 
> 
> 
> Meanwhile, Evga 1080:


Did you check that the Pascal G1 Gaming comes with 3 copper heat pipes on the 1080 but only 2 on the 1070?

Gigabyte is cutting costs and cheaping out on the MOSFETs too (depending on which revision/batch) *cough*, yet I saw most tech reviewers highly recommending Gigabyte!

Colorful claims 4 heat pipes but there are actually only 3, and it also runs hot; maybe that's why some users say the fan noise is pretty loud on the Gigabyte.

The G1 Gaming cooling on the 970/980 was really darn good for the first 1-2 batches, but on the later ones that came with Hynix memory I have no comment!


----------



## khanmein

Quote:


> Originally Posted by *MyNewRig*
> 
> yeah EVGA messed up big time, their forums is a disaster filled with all sorts of ugly reports from heat to fire to black screen and much more, poor engineering and QC, the entire Pascal line seem like a rushed out product with so much issues, it is a mess all round.


I trust EVGA will bounce back, but for Pascal, avoid EVGA. And did you notice prices are slightly lower compared to when Maxwell released?

For Pascal I'd go for Asus, MSI, Palit/Gainward, or Galax. Guess what, my GTX 970 is back...

I had to drive 250 km to reach the service center, pay a rip-off road toll (lousy), pay for parking, show my service note, and pay USD 7.xx (because the warranty was past 1 year, *stupid policy*), then drive another 250 km back.


----------



## MyNewRig

Quote:


> Originally Posted by *LuckLess7*
> 
> Hi. The last couple of weeks have been really stressful. I bought new computerparts and I've been having immense problems with my hardware (ASUS Z170 Gaming Pro, i7 6700K, Corsair 3000MHz DDR4, MSI GTX 1070 Gaming X - on AIR). I've just started to get into overclocking, but shortly after my first attempts, BSOD started to appear hourly. So I reverted to stock settings and was even more disappointed that I STILL got those BSOD... Or random freezes and crashes... I've installed Windows 10 Pro three times since, I asked my retailer for a complete swap of the hardware (Board, CPU and a brand new PSU (be quiet straight power 500W).
> 
> Except for the graphics card I swapped it all, did a fresh install of Windows and all the drivers, used DDU for every new version that came out and didn't even touch ocing the card nor the CPU.
> 
> I read about that Micron memory problem and i pray.... no, really, at this point I do pray, that this must be the problem. I get video_scheduler_internal_error, Kernel_pwr, etc... I've been troubleshooting for roughly one month now - I'm about done, I can't take one BullS***OfDeath anymore. And every issue listed in the event viewer directs to a video card and or -driver problem
> 
> Could you confidently confirm that the problems I've been having (and a lot of others too??) are related to the Micron memory issue that's around? And that a VBIOS update will fix it?
> 
> There's just nothing I can think of what else could cause that many problems, especially when these crashes only happen in 3d applications, like games.
> 
> Is that correct that I could simply flash ASUS's VBIOS onto my MSI GTX 1070 Gaming X ?


How come you have been troubleshooting for a month and still haven't found the issue? It is not that hard:

1- Remove the GPU and play a game on the processor graphics. If no crash happens, return your GPU and get it replaced; if a crash happens, move to step two.

2- Update to the latest system BIOS (currently version 2003 for the Pro Gaming), reset the BIOS to default settings, and play some more on the processor graphics. If a crash happens, move to the next step.

3- Apply the XMP profile for the RAM, exit, and test. If it's still crashing...

4- Run memtest overnight to check for memory errors, and test one RAM stick at a time. If the RAM is good and you're still crashing on the latest BIOS at stock settings, then it is the motherboard or PSU.

5- I have the Pro Gaming myself and have never had a BSOD caused by the motherboard, and it is unlikely for the CPU to cause any of this; I suspect it is the GPU.

You can also run the GPU in debug mode, downclock the memory and core to test, or run it in high-performance mode with locked voltage, to see whether it is actually the Micron memory or something else.
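The elimination order above boils down to a small decision tree; a sketch (the inputs are just the yes/no answers you'd get from steps 1 and 4, no real probing happens):

```shell
#!/bin/sh
# Hypothetical sketch of the troubleshooting decision tree above.
# $1: does it still crash with the dGPU removed (running on the iGPU)?
# $2: does memtest find RAM errors?
diagnose() {
  crash_on_igpu=$1
  ram_errors=$2
  if [ "$crash_on_igpu" = "no" ]; then
    echo "RMA the GPU"                     # crashes vanish without the card: it's the card
  elif [ "$ram_errors" = "yes" ]; then
    echo "replace the RAM"                 # memtest errors: faulty stick
  else
    echo "suspect motherboard or PSU"      # crashes persist on good RAM at stock BIOS
  fi
}
diagnose no no
```

The point is that two cheap tests (iGPU-only gaming, overnight memtest) already separate the three expensive suspects.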


----------



## gtbtk

Quote:


> Originally Posted by *MojoW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The SC cards only have an 8 pin power don't they? That will limit you to something that maxes out at a power limit of about 200W.
> 
> The Asus Strix OC could be one to try out and see how it goes.
> 
> Remember if you do cross flash, you will lose all the Precision XOC extras like KBoost
> 
> 
> 
> Crossflashing sounds interesting
> 
> 
> 
> 
> 
> 
> 
> ... didn't know the bios could differ that much and still work.
Click to expand...

Do not flash the Galax HOF or Snipr BIOSes; they use different voltage controllers and will brick the card.


----------



## gtbtk

Quote:


> Originally Posted by *M3Stang*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DeathAngel74*
> 
> Is there an official recall now or??
> 
> 
> 
> Just saw this looks like you just have to ask for the exchange.
> 
> http://www.digitaltrends.com/computing/evga-gtx-1080-1070-overheating-issue/
Click to expand...

EVGA is offering thermal pads to all users. You have to ask, and they will send them to you so you can upgrade the cooling yourself. They claim it solves the problem.

There are details on the EVGA tech forum.


----------



## M3Stang

Quote:


> Originally Posted by *gtbtk*
> 
> EVGA is offering Thermal pads to all users. You have to ask them and they will send them to you to allow you to upgrade your cooling yourself. They claim it solves the problem.
> 
> There are details at the EVGA Tech forum


Then I guess I'll RMA it for the no-POST issue when I get around to it. It hasn't had the problem since yesterday; I probably won't do it for another week or so, lol. Then again, the computer has been off for about 12 hours, so by the time I wake up in the morning that might be enough to trigger it.


----------



## asdkj1740

Quote:


> Originally Posted by *Mr-Dark*
> 
> Thanks for the reply
> 
> I had Evga SC with black screen problem when the temp hit + 75c.. also as you say 60% fan speed fix the black screen problem..
> 
> should note Evga card's with Samsung memory have Zero problem.. as my sister have Evga SC with samsung memory and no problem at all..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No Evga anymore with Pascal.... Enough!


The EVGA Hybrid has a unique cooling design, so I wouldn't worry about that one.


----------



## asdkj1740

Quote:


> Originally Posted by *blued*
> 
> Disagree. Not sure about 1070 Gigabytes, but look at their 980 below. Lot of expensive copper there. Many other GPUs use alloy or other cheaper materials (esp Evga).
> 
> 
> 
> Meanwhile, Evga 1080:


This Gigabyte cooler was not the one originally planned for release; what Gigabyte planned for its Pascal Xtreme cards was a heat-pipe direct-touch cooler.


----------



## khanmein

http://www.ferra.ru/ru/system/review/nvidia-geforce-gtx-1070-asus-gigabyte-msi-palit-zotac/


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> http://www.ferra.ru/ru/system/review/nvidia-geforce-gtx-1070-asus-gigabyte-msi-palit-zotac/


Are you really surprised?

From the comments (translated): "Where did you find an MSI 1070 card with Samsung memory? All the latest retail ones have been Micron."
Sergey Plotnikov, 06.10.2016: "Good afternoon. The card was provided by MSI itself."


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> are you really surprised?
> 
> "Balin, where did you find MSI 1070 card c memory from Samsung-a? All the latest released games have been with Micron.-0+REPLYSergey Plotnikov> Alex Nemich06.10.2016Good afternoon. Map provided MSI itself."


Not surprised, but the Zotac is Micron while the rest (Asus, Gigabyte, MSI, and Palit) are Samsung.

Looks like the Palit runs the hottest of the bunch, hitting above 80-something °C.

Funny that a wannabe YouTube PC tech reviewer doesn't even know this kind of stuff.


https://twitter.com/i/web/status/793176313179803649


----------



## LuckLess7

Quote:


> Originally Posted by *MyNewRig*
> 
> How come you have been troubleshooting for a month and still don't find the issue? it is not that hard:
> 
> 1- Remove the GPU, play a game on the processor graphics if no crash happens return your GPU and get it replaced, if crash happens move to step two.
> 
> 2- Updates to the latest system BIOS , currently version 2003 for the Pro Gaming, reset BIOS to default settings, play some more on the processor graphics, if crash happens move to next step.
> 
> 3- Apply XMP profile for RAM exit and test, if still crashing ..
> 
> 4- Run memtest overnight to check for any memory errors and test one RAM stick at a time, if RAM is good and you still crashing on the latest BIOS and stock settings then it is the motherboard or PSU.
> 
> 5- I have the Pro Gaming myself never had a BSOD once that was caused by the motherboard and it is unlikely for the CPU to cause any of this, i suspect it is the GPU.


Thank you for your reply. I really have done every step you listed, multiple times, and I _already know_, as stated, that it must be the GPU. What I really want to know is whether _the issues come from the Micron memory_. Because if it's the faulty memory, would it make sense to return the card only to have it swapped for another one with Micron memory? There are none left with Samsung memory, right?
Quote:


> Originally Posted by *MyNewRig*
> 
> You can also run the GPU i debug mode, downclock memory and core to test or run it with high performance mode and locked voltage to test if it is actually the MIcron memory or something else.


*This I haven't tried.* How would I proceed here? Running the GPU in debug mode is completely new to me.


----------



## RyanRazer

Well then. He replied, but nothing was clarified. I highly doubt they will make a comparison video.


----------



## Mr-Dark

Quote:


> Originally Posted by *zipzop*
> 
> Since you had SC's can you recommend a good BIOS for that card which power limit / TDP is higher?? I get power limit throttling regularly in many games at 1440p 144hz. Got the temps under control with using a liquid metal TIM


Stay on the stock BIOS, as flashing a BIOS from another card isn't a good idea... also, the SC uses a single 8-pin, while the cards with higher limits use 8+6-pin...


----------



## asdkj1740

Quote:


> Originally Posted by *Mr-Dark*
> 
> Stay on the stock bios as flashing a bios from another card isn't good idea... also the SC use single 8Pin while other card with higher limit use 8+6Pin...


It's much safer than on Maxwell cards.
The SC has only a 170 W max power limit, which is not really enough even for gaming (~200 W).


----------



## asdkj1740

Quote:


> Originally Posted by *RyanRazer*
> 
> Well then. He replied. Nothing clarified. Highly doubt they will make comparison video.


Yeah, Micron cards can be found on the market easily, so there is no excuse for him to say he doesn't have one...


----------



## gtbtk

Quote:


> Originally Posted by *LuckLess7*
> 
> Hi. The last couple of weeks have been really stressful. I bought new computerparts and I've been having immense problems with my hardware (ASUS Z170 Gaming Pro, i7 6700K, Corsair 3000MHz DDR4, MSI GTX 1070 Gaming X - on AIR). I've just started to get into overclocking, but shortly after my first attempts, BSOD started to appear hourly. So I reverted to stock settings and was even more disappointed that I STILL got those BSOD... Or random freezes and crashes... I've installed Windows 10 Pro three times since, I asked my retailer for a complete swap of the hardware (Board, CPU and a brand new PSU (be quiet straight power 500W).
> 
> Except for the graphics card I swapped it all, did a fresh install of Windows and all the drivers, used DDU for every new version that came out and didn't even touch ocing the card nor the CPU.
> 
> I read about that Micron memory problem and i pray.... no, really, at this point I do pray, that this must be the problem. I get video_scheduler_internal_error, Kernel_pwr, etc... I've been troubleshooting for roughly one month now - I'm about done, I can't take one BullS***OfDeath anymore. And every issue listed in the event viewer directs to a video card and or -driver problem
> 
> Could you confidently confirm that the problems I've been having (and a lot of others too??) are related to the Micron memory issue that's around? And that a VBIOS update will fix it?
> 
> There's just nothing I can think of what else could cause that many problems, especially when these crashes only happen in 3d applications, like games.
> 
> Is that correct that I could simply flash ASUS's VBIOS onto my MSI GTX 1070 Gaming X ?


One question that hasn't been asked: what types of BSOD are you getting?

The tool WhoCrashed, which you can get here http://www.resplendence.com/downloads, can be handy in pointing you toward the source of the problem.

There are two types of BSOD that I have experienced that are caused by the MSI Gaming card with the current 86.04.26.00.3e BIOS; anything other than these and I would suspect something else is causing the crashes.

1. DPC Watchdog violation BSOD. This happens after a machine lockup and timeout.

2. Video Scheduler error. This happens after the white full-screen checkerboard artifacts.

The DPC Watchdog error, I think, is caused by the GPU being power-overloaded. I found that it helps to reduce the power-limit slider to 90% or even 85%. The 90% setting will cap the MSI card at a max draw of about 210 W. That is still 10 W more at max voltage than the Asus Strix OC with its slider at 120%, so it is not going to nerf the card's performance, but it should get you more stability. Under general usage the card does not pull much more than 80% anyway, though it will spike higher. If you get the system stable, you can then start increasing the slider back toward 100% and see how you go.

The Video Scheduler error is caused by the Micron bug that will be addressed when the MSI BIOS update is released. To mitigate it until then, open the Nvidia Control Panel and, in the 3D settings section, add the programs DWM.exe and Explorer.exe and set them both to maximum-performance mode. The aim is to keep the card's voltage above 0.800 V at idle. The other, less desirable, option is to lock the voltage curve in Afterburner to keep the card at max voltage.

You can flash the new 86.04.50.00.xx BIOS from Asus. I have done that and my Gaming X works well, but the card is then limited to a max power draw of 200 W. However, the MSI BIOS is due this week or next, so it may not be worth the added risk of multiple flashes if you can work around the problems in the meantime.
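For what it's worth, the ~90% power-limit workaround can also be approximated from the command line with nvidia-smi, where the driver exposes power-limit control (the 210 W figure is the one mentioned above; commands are only echoed here as a sketch, since -pl needs admin rights and a card that permits it):

```shell
#!/bin/sh
# Sketch: capping board power via nvidia-smi instead of the Afterburner slider.
# Steps are echoed, not executed; -pl takes the cap in watts.
power_cap_steps() {
  echo "nvidia-smi -q -d POWER"   # read the current, default, and max enforceable limits
  echo "nvidia-smi -pl 210"       # cap the board at 210 W (~the 90% slider setting)
}
power_cap_steps
```

Unlike the slider, this setting does not persist across reboots unless the driver is set to persistence mode, so treat it as a quick test rather than a permanent fix.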


----------



## khanmein

Quote:


> Originally Posted by *LuckLess7*
> 
> Thanks you for your reply. I really have done every step you listed, multiple times, and I _already know_, as stated, it must be the GPU. What I really want to know is if _the issues with it come from micron memory_. Because if it's the faulty memory, would it make sense to return the card only to have it swapped for another one with Micron memory? There are none left with Samsung memory, right?
> *This I haven't tried.* How would I proceed here? Running GPU in debug mode is completely new to me.


FYI, not every card supports debug mode. My GTX 970 shows it, but I've seen some GeForce forum posts saying the option is grayed out on their cards.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RyanRazer*
> 
> Well then. He replied. Nothing clarified. Highly doubt they will make comparison video.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> yeah, micron cards can be found on market easily so there is no excuse for him to say he doesnt have one...
Click to expand...

To be fair, he may never have gone looking for one, particularly if all his gear is supplied by vendors who are going out of their way to hide the Micron cards from the media.


----------



## gtbtk

Quote:


> Originally Posted by *Mr-Dark*
> 
> Quote:
> 
> 
> 
> Originally Posted by *zipzop*
> 
> Since you had SCs, can you recommend a good BIOS for that card with a higher power limit/TDP? I get power-limit throttling regularly in many games at 1440p/144Hz. I got the temps under control using a liquid metal TIM.
> 
> 
> 
> Stay on the stock BIOS, as flashing a BIOS from another card isn't a good idea... also, the SC uses a single 8-pin while cards with higher limits use 8+6-pin...
Click to expand...

Actually, Galax and MSI are the only vendors with 8+6-pin power that I am aware of. The high-end Zotac and Gigabyte cards use 8+8-pin; all the others, like the Asus, are a single 8-pin.


----------



## Avendor

The GTX 1070 Xtreme Gaming uses 1x 8-pin + 1x 6-pin, while the G1 Gaming has a single 8-pin.


----------



## LuckLess7

Quote:


> Originally Posted by *gtbtk*
> 
> One question that hasn't been asked is what types of BSOD are you getting?


I've been getting both types you mention, in a ratio of about 9:1, with the _Video Scheduler error after the white full-screen checkerboard artifacts_ accounting for 90% of the crashes.

However, without any overclocking applied, not even the mild OC from the MSI Gaming App, *I do not get the checkerboard prior to the crash/BSOD*
Quote:


> Originally Posted by *gtbtk*
> 
> There are two types of BSOD that I have experienced that are caused by the MSI Gaming card with the current 86.04.26.00.3e BIOS. Anything other than these types and I would suspect something else is causing the crashes.
> 
> 1. DPC Watchdog error BSOD. This happens after a machine lockup and timeout.
> 
> 2. Video Scheduler error. This happens after the white full-screen checkerboard artifacts.
> 
> The DPC watchdog error I think is caused by the GPU being power overloaded. I found that it helps if you reduce the power limit slider to 90% or even 85%. The 90% setting will cap the MSI card at 210W. That is still 10W more at max voltage than the Asus Strix OC card with its slider set to 120%, so it is not going to nerf the card's performance, but it should give you more stability. Under general usage the card does not pull much more than 80% anyway, but it will spike higher. Once the system is stable you can start increasing the slider back towards 100% and see how you go.


I have already read about this. So I should keep the card at full performance at all times, to prevent it from lowering the voltage and causing the Micron memory to fail and produce crashes? Or do you mean to set the power target to 90% and still set "prefer maximum performance" in the control panel?

Quote:


> Originally Posted by *gtbtk*
> 
> The Video Scheduler error is caused by the Micron bug that will be addressed when the MSI BIOS update is released. To mitigate it until then, open the Nvidia Control Panel and, in the 3D settings section, add the programs DWM.exe and Explorer.exe and set them both to maximum performance mode. The aim is to keep the card voltage above 0.800V at idle. The other, less desirable option is to lock the voltage in the Afterburner curve to keep the card at max voltage.
> 
> You can flash the new 86.04.50.00.xx BIOS from Asus. I have done that and my Gaming X works well, but the card is limited to a max power draw of 200W. However, the MSI BIOS is due this week or next, so it may not be worth the added risk of multiple flashes if you can work around the problems in the meantime.


I really hope this fixes it. It's certainly good to know... in a way ... that others have the same issue. I've never had this much trouble with new hardware!


----------



## MyNewRig

Quote:


> Originally Posted by *LuckLess7*
> 
> I've been getting those two types that you mention, in a ratio of 9:1, the _2. Video Scheduler error. This happens after the white full screen checkerboard artifacts_ being 90% of the time.
> 
> However, without any overclocking applied, not even the mild OC that the MSI Gaming App does, *I do not get the checkerboard prior to the crash/BSOD*
> I have already read about this. I should try to keep the card at full performance at all times, in order to prevent it from lowering the voltage and thus causing the Micron memory to fail and produce crashes. Or do you mean to set the power target to 90% and still set "prefer maximum performance" in the control panel?
> I really hope this fixes it. It's certainly good to know... in a way ... that others have the same issue. I've never had this much trouble with new hardware!


Welcome to the Micron memory club. You have multiple options:

1- Wait for the new Micron-fix BIOS to be released by MSI (no ETA yet) and hope your problem goes away.

2- Get your card replaced with a new one, which has a chance of coming with the same problem.

3- Ask your retailer for a Samsung-memory card as a replacement, which is very difficult or impossible to do, but I know one guy who managed it and the Samsung card solved all his problems.

4- Return the 1070 and get something lower-performance but stable, like a 1060 with Samsung memory or an RX 480, which typically comes with Samsung or Hynix memory; both work properly.

5- Wait a few months for Vega or Pascal v2 with improved memory.


----------



## Gurkburk

Just bought a Gigabyte 1070. Do these cards barely OC or am i doing something wrong?


----------



## MyNewRig

Quote:


> Originally Posted by *Gurkburk*
> 
> Just bought a Gigabyte 1070. Do these cards barely OC or am i doing something wrong?


Barely OCs in terms of core or memory? What memory type does GPU-Z show?


----------



## syl1979

Quote:


> Originally Posted by *Gurkburk*
> 
> Just bought a Gigabyte 1070. Do these cards barely OC or am i doing something wrong?


Which real frequency are you getting?


----------



## Gurkburk

Quote:


> Originally Posted by *MyNewRig*
> 
> barely OC in terms of core or memory? what memory type do you have showing in GPU-Z?


Right now, it's the Clock I'm looking to OC.

GDDR5 Micron.
Quote:


> Originally Posted by *syl1979*
> 
> Which real frqiency are you getting ?


Before I touched it, it showed 1999MHz when it maxed its boost in FurMark.


----------



## MyNewRig

Quote:


> Originally Posted by *Gurkburk*
> 
> Right now, it's the Clock I'm looking to OC.
> 
> GDDR5 Micron.
> Before i touched it, it said 1999Mhz when it maxed its boost in Furmark.


Which Gigabyte model are we talking about here? What is your current core clock and max OC? Anything near or above 2000MHz is normal; are you getting less than that? Micron memory, on the other hand, might not OC at all, not even +100 in some cases.


----------



## Gurkburk

Quote:


> Originally Posted by *syl1979*
> 
> Which real frqiency are you getting ?


Quote:


> Originally Posted by *MyNewRig*
> 
> Which Gigabyte model are we talking about here? what is your current Core clock and max OC? anything near or above 2000Mhz is normal, are you getting less than that? Micron memory on the other hand might not OC at all, not even +100 in some cases.


Gigabyte GeForce GTX1070 G1 Gaming

I placed it on max performance in Battlefield 1, and it's boosting to 2076MHz. I have core voltage at +100 and core clock at +100.


----------



## MyNewRig

Quote:


> Originally Posted by *Gurkburk*
> 
> Gigabyte GeForce GTX1070 G1 Gaming
> 
> I placed it on max performance in Battlefield 1, and it's boosting to 2076Mhz. I have CoreV to +100 & Core Clock to +100.


And how is that "barely OCs"? The card came factory-overclocked already and you were able to push +100 on top of that; what were you expecting?


----------



## Gurkburk

Quote:


> Originally Posted by *MyNewRig*
> 
> And how is that "barely OC" ? the card came factory OCed already and you were able to push +100 on top of that, what were you expecting?


My just-replaced 780 had what, an 800-something MHz base clock? 1100 boosted? I clocked that up to 1400MHz.

I expected this one to clock well above +150MHz.


----------



## MyNewRig

Quote:


> Originally Posted by *Gurkburk*
> 
> My just replaced 780 had what, 800-something baseclock? 1100 boosted? I clocked that up to 1400mhz.
> 
> I expected it to clock pretty well above 150mhz.


Anything above 2000MHz is good for the 1070; a few samples can go above 2100MHz, but that is not common.


----------



## Gurkburk

Quote:


> Originally Posted by *MyNewRig*
> 
> Anything above 2000Mhz is good for the 1070, a few samples can go above 2100Mhz but that is not common


And you mean any brand of 1070, and Samsung memory as well?


----------



## MyNewRig

Quote:


> Originally Posted by *Gurkburk*
> 
> And you mean Any brand of 1070 as well as the samsung memory?


Yes, 2100MHz core is pretty much the ceiling across the board. I had a 1070 that could barely reach 2000MHz and another that could do 2113MHz but would throttle down to 2072MHz under load.

Samsung memory helps with the memory OC, not the core OC.


----------



## Mr-Dark

Quote:


> Originally Posted by *asdkj1740*
> 
> Much safer than Maxwell cards.
> The SC has only a 170W max, which is less than what's really needed even for gaming (~200W).


Yeah, 170W is very low... not even enough for EVGA's own OC...


----------



## Avendor

Quote:


> Originally Posted by *Gurkburk*
> 
> My just replaced 780 had what, 800-something baseclock? 1100 boosted? I clocked that up to 1400mhz.
> 
> I expected it to clock pretty well above 150mhz.


Core clock is pretty much restricted; you need a lot of luck to obtain +150/+200, and unfortunately not many 1070s can be pushed that far. Squeeze more from the memory clock; it matters a lot at higher resolutions.

The best core clock I've ever seen for a GTX 1070. Is it legit?

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1070/27.html
LE: We still need Gigabyte's new vBIOS for the Micron cards, but the Samsung ones have been rolled out... I am reading so many different opinions...
http://www.guru3d.com/news-story/manufacturers-roll-out-firmware-updates-for-geforce-gtx-1070-due-to-memory-issue,11.html

__
https://www.reddit.com/r/5aeoqi/new_bios_for_gtx_1070_g1_gaming_edition_samsung/


----------



## yorfi86

I need a little help. I'm thinking of buying a 1070 this week: Zotac AMP Edition or EVGA SC?


----------



## Avendor

Quote:


> Originally Posted by *yorfi86*
> 
> I need a little help, this week thinking of buying a 1070, Zotac Amp Edition or Evga SC?


Don't go for EVGA; they have overheating issues. I don't know about the SuperClocked edition, though.


----------



## MyNewRig

Quote:


> Originally Posted by *yorfi86*
> 
> I need a little help, this week thinking of buying a 1070, Zotac Amp Edition or Evga SC?


Stay away from EVGA; they have had nothing but problems this generation. Go ASUS or Zotac; some prefer MSI.


----------



## Roland0101

Quote:


> Originally Posted by *LuckLess7*
> 
> I've been getting those two types that you mention, in a ratio of 9:1, the _2. Video Scheduler error. This happens after the white full screen checkerboard artifacts_ being 90% of the time.
> 
> However, without any overclocking applied, not even the mild OC that the MSI Gaming App does, *I do not get the checkerboard prior to the crash/BSOD*


This indicates that there may be a second problem. Did you test the card in the second PCIe slot of the motherboard?
Quote:


> I have already read about this. I should try to keep the card at full performance at all times, in order to prevent it from lowering the voltage and thus causing the Micron memory to fail and produce crashes. Or do you mean to set the power target to 90% and still set "prefer maximum performance" in the control panel?


Do both.
Quote:


> I really hope this fixes it. It's certainly good to know... in a way ... that others have the same issue. I've never had this much trouble with new hardware!


Neither gtbtk nor I experienced crashes at stock clocks with our Micron memory cards. We got these crashes only after applying higher memory overclocks; over 8352MHz effective, in my case.


----------



## yorfi86

Quote:


> Originally Posted by *Avendor*
> 
> Don't go for EVGA they got overheating issues. I don't know about SuperClocked edition though


That's what worries me. I haven't heard of problems with the SC, but it being EVGA...


----------






## gtbtk

Quote:


> Originally Posted by *LuckLess7*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> One question that hasn't been asked is what types of BSOD are you getting?
> 
> 
> 
> I've been getting those two types that you mention, in a ratio of 9:1, the 2. Video Scheduler error. This happens after the white full screen checkerboard artifacts being 90% of the time.
> 
> However, without any overclocking applied, not even the mild OC that the MSI Gaming App does, *I do not get the checkerboard prior to the crash/BSOD*
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> There are two types of BSOD that I have experienced that are caused by the MSI Gaming card with the current 86.04.26.00.3e BIOS. Anything other than these types and I would suspect something else is causing the crashes.
> 
> 1. DPC Watchdog error BSOD. This happens after a machine lockup and timeout.
> 
> 2. Video Scheduler error. This happens after the white full-screen checkerboard artifacts.
> 
> The DPC watchdog error I think is caused by the GPU being power overloaded. I found that it helps if you reduce the power limit slider to 90% or even 85%. The 90% setting will cap the MSI card at 210W. That is still 10W more at max voltage than the Asus Strix OC card with its slider set to 120%, so it is not going to nerf the card's performance, but it should give you more stability. Under general usage the card does not pull much more than 80% anyway, but it will spike higher. Once the system is stable you can start increasing the slider back towards 100% and see how you go.
> 
> Click to expand...
> 
> I have already read about this. I should try to keep the card at full performance at all times, in order to prevent it from lowering the voltage and thus causing the Micron memory to fail and produce crashes. Or do you mean to set the power target to 90% and still set "prefer maximum performance" in the control panel?
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The Video Scheduler error is caused by the Micron bug that will be addressed when the MSI BIOS update is released. To mitigate it until then, open the Nvidia Control Panel and, in the 3D settings section, add the programs DWM.exe and Explorer.exe and set them both to maximum performance mode. The aim is to keep the card voltage above 0.800V at idle. The other, less desirable option is to lock the voltage in the Afterburner curve to keep the card at max voltage.
> 
> You can flash the new 86.04.50.00.xx BIOS from Asus. I have done that and my Gaming X works well, but the card is limited to a max power draw of 200W. However, the MSI BIOS is due this week or next, so it may not be worth the added risk of multiple flashes if you can work around the problems in the meantime.
> 
> Click to expand...
> 
> I really hope this fixes it. It's certainly good to know... in a way ... that others have the same issue. I've never had this much trouble with new hardware!
Click to expand...

The power limit deals with the watchdog BSOD, and the Nvidia Control Panel max-performance setting deals with the checkerboard problem; they are separate issues.

As long as you keep the voltage above 0.800V, I have not seen the cards checkerboard. DWM.exe and explorer.exe are base applications of the operating system, so they are always loaded; that puts an idle load on the card, keeps the voltage up at 0.800V, and in my experience stops the checkerboards. Try it and see what happens. It is difficult for me to give an absolute answer on hardware I have never seen.

I have uninstalled the Gaming App and only use Afterburner.


----------



## LuckLess7

Quote:


> Originally Posted by *Roland0101*
> 
> Did you tested the card in the second PCIe slot of the motherboard?


Yes I did, and I also tested my old GTX 770 in that slot, which didn't cause any issues. Besides, it's the same problem in two different Asus Z170 Pro Gaming boards; the chance of the PCIe slot being faulty in both boards is really small, isn't it?

Is there a test I could do like memtest86, except for graphics card memory?


----------



## Avendor

@gtbtk That's one of the reasons I'm waiting for the Micron vBIOS; I don't want to touch or modify voltages in Afterburner.
It looks like this


----------



## iARDAs

I have a good offer for my 980 Ti and will sell it to a close friend. I will buy a 1070 and get a second one later down the road for SLI.

One question: is my EVGA SuperNOVA 750W PSU good enough for 1070 SLI?

I know it was not for 980 Ti SLI.


----------



## Gurkburk

I'm clocking the memory of my 1070 now; seems like it can go pretty high. I'm running FurMark, with the Nvidia settings set to max performance. I'm up at +600ish now.


----------



## MyNewRig

Quote:


> Originally Posted by *LuckLess7*
> 
> Yes I did, I also tested my old gtx 770 in that slot which didn't cause any issues. Besides, its the same problem in two different Asus z170 pro gaming boards. the chance of the pcie slot being faulty in two boards are really small, istn't it?
> 
> Is there a test I could do like memtest86 except for graphics card memory?


Man, don't waste your time. I and others also had artifacts at stock settings with our Micron-memory 1070s; just because some people don't see it does not mean no one is affected. Don't bother with any further testing; we have already gone over all that and it is a known issue. Either run your card in high-performance mode or with locked voltage, wait for the BIOS update, or just return it. Your motherboard is fine. Don't let anyone make you think there is a problem with your system; after replacing the entire system once already, you have eliminated everything else.

Some people here will confuse you, telling you our cards are okay or that the problem is with your motherboard, your RAM, or your neighbors. Just don't listen to that crap and end up wasting your time.


----------



## LuckLess7

Quote:


> Originally Posted by *gtbtk*
> 
> I have uninstalled the gaming app and only use afterburner


OK, I will try all this and report back once I have more data.

I know my priorities could be wrong here, but I want to run the Gaming App solely for LED color control; if I close it, the LED light goes off. I will uninstall it now anyway.

But in any case, when your card crashes, with or without overclocked memory, does the driver crash and show the nvlddmkm error in Event Viewer?


----------



## Roland0101

Quote:


> Originally Posted by *iARDAs*
> 
> I have a good offer for my 980ti and will sell it to a close friend. I will be buying a 1070 and getting a 2nd one later down the road to SLI.
> 
> One question: is my EVGA SuperNOVA 750W PSU good enough for 1070 SLI?
> 
> I know it was not for 980ti SLI.


Look here.

It's of course entirely your decision, but I would rather buy a 1080 or a Titan (Pascal) than a 1070 SLI system. Less headache.


----------



## Avendor

Quote:


> Originally Posted by *Gurkburk*
> 
> I'm clocking the memory of my 1070 now. Seems like it's able to go pretty high. I'm doing furmark & Nvidia settings are told to go max performance. I'm up at +600ish now.


Don't use FurMark; it's not compatible with the GTX 10 series yet. Use stress tests or benchmarks like Fire Strike and Time Spy for stability, and afterwards test games as well.


----------



## Roland0101

Quote:


> Originally Posted by *LuckLess7*
> 
> Yes I did, I also tested my old gtx 770 in that slot which didn't cause any issues. Besides, its the same problem in two different Asus z170 pro gaming boards. the chance of the pcie slot being faulty in two boards are really small, istn't it?


Yes, it was just to make sure.

Quote:


> Is there a test I could do like memtest86 except for graphics card memory?


No, not really. Set the system to "prefer maximum performance" as gtbtk described.

If you still get BSODs at stock settings, I would suggest RMAing the card.
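For what it's worth, VRAM pattern testers do exist (MemtestG80/MemtestCL, though they are dated), and the idea behind them is simple: write known bit patterns across the whole buffer and read them back, counting mismatches. A rough sketch of that pattern logic, in plain Python over ordinary system memory purely for illustration (a real VRAM test would have to allocate the buffer through CUDA or OpenCL):

```python
# Illustrative walking/inverse pattern check, the same idea MemtestG80-style
# VRAM testers use. This runs on system RAM via a bytearray stand-in;
# a real GPU test would allocate the buffer through CUDA or OpenCL.

def pattern_test(buf_size=1 << 16):
    buf = bytearray(buf_size)  # stand-in for a VRAM allocation
    errors = 0
    for pattern in (0x00, 0xFF, 0x55, 0xAA):  # all-zeros, all-ones, alternating bits
        for i in range(buf_size):
            buf[i] = pattern           # fill the whole buffer
        for i in range(buf_size):
            if buf[i] != pattern:      # faulty cells flip bits on readback
                errors += 1
    return errors

print(pattern_test())  # 0 on healthy memory
```

Running something like this on the GPU is exactly what those tools automate; a nonzero error count on any pattern is grounds for an RMA.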


----------



## gtbtk

Quote:


> Originally Posted by *Avendor*
> 
> @gtbtk That's one of the reason why i'm waiting for Micron VBIOS, I don't want to touch, to modify voltages into AF.
> It looks like this


I have never been able to run my card at the settings on the TechPowerUp page. But all cards are different, so I won't say it is impossible.

I have found that with my card, if the left-end points are set too high, the left side of the curve robs performance from the right side, where the card actually runs under load. If you want to keep a smooth curve like yours, Ctrl-click the point at 0.800V and drag it down to 1600MHz and see how that performs for you. Remember, it is all trial and error.


----------



## gtbtk

Quote:


> Originally Posted by *Roland0101*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iARDAs*
> 
> I have a good offer for my 980ti and will sell it to a close friend. I will be buying a 1070 and getting a 2nd one later down the road to SLI.
> 
> One question: is my EVGA SuperNOVA 750W PSU good enough for 1070 SLI?
> 
> I know it was not for 980ti SLI.
> 
> 
> 
> Look here.
> 
> It's of course entirely your decision, but I would rather buy a 1080 or a Titan (Pascal) than a 1070 SLI system. Less headache.
Click to expand...

agreed


----------



## TylerAD

Here is a recent review of an MSI card that has the Micron memory. This sample does not seem to have the issues others are facing here.

http://www.guru3d.com/articles-pages/msi-geforce-gtx-1070-quick-silver-8g-oc-review,1.html


----------



## khanmein

Quote:


> Originally Posted by *TylerAD*
> 
> Seems like here is a recent review of an MSI card that has the Micron memory. This sample does not seem to have the issues as others are facing here.
> 
> http://www.guru3d.com/articles-pages/msi-geforce-gtx-1070-quick-silver-8g-oc-review,1.html


Yeah, the first well-known tech site with a proper Micron review, but this card has coil whine.


----------



## MyNewRig

Quote:


> Originally Posted by *TylerAD*
> 
> Seems like here is a recent review of an MSI card that has the Micron memory. This sample does not seem to have the issues as others are facing here.
> 
> http://www.guru3d.com/articles-pages/msi-geforce-gtx-1070-quick-silver-8g-oc-review,1.html


They sent it with the 86.04.50.xx.xx new BIOS branch that is not even available to the public yet.


----------



## MojoW

Quote:


> Originally Posted by *gtbtk*
> 
> do not flash the Galax HOF or Snipr BIOSes; they use different voltage controllers and will brick the card.


Well, I was thinking of flashing the Xtreme Gaming BIOS, but that card is 8+6-pin and I'm not sure that BIOS will work with my card.
I wish we could edit our own BIOS, as that would be way safer.


----------



## Mr-Dark

So, MSI released a new card today: the Quick Silver GTX 1070, also with Micron memory. Here is the review:

http://www.guru3d.com/articles-pages/msi-geforce-gtx-1070-quick-silver-8g-oc-review,1.html

A 9GHz memory OC and normal GTX 1070 performance. I don't think Samsung vs Micron makes any difference in performance at the stock 8GHz clock...
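For anyone confused by the clock figures, this is just standard GDDR5 arithmetic on my part, not anything from the review: Afterburner reports the DDR data clock (4004MHz stock on a 1070), while the advertised "8GHz" is the quad-pumped effective rate, i.e. twice the reported clock (and four times the 2002MHz command clock GPU-Z shows).

```python
# Back-of-envelope GDDR5 clock arithmetic for a stock GTX 1070.
# Assumptions: Afterburner reports the 4004 MHz DDR data clock and its
# memory offset applies to that number; the advertised "8 GHz" is the
# quad-pumped effective rate, i.e. 2x the reported clock.

STOCK_DDR_CLOCK = 4004  # MHz, as reported by Afterburner on a stock 1070

def effective_rate(ab_offset_mhz):
    """Effective GDDR5 transfer rate (MHz) for a given Afterburner memory offset."""
    return 2 * (STOCK_DDR_CLOCK + ab_offset_mhz)

print(effective_rate(0))    # 8008 -> the stock "8 GHz"
print(effective_rate(500))  # 9008 -> roughly the "9 GHz" OC in the review
```

This also matches the numbers people post here: 4576MHz in Afterburner is a "9.1GHz" memory overclock, since 2 x 4576 = 9152.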


----------



## LuckLess7

A couple of posts back, someone already posted the link to that review. The BIOS version is 50.00.2A; it's not available to the public yet.


----------



## asdkj1740

Quote:


> Originally Posted by *Mr-Dark*
> 
> SO, MSI release new card today.. the Quick Silver GTX 1070.. also with Micron memory.. here is the review
> 
> http://www.guru3d.com/articles-pages/msi-geforce-gtx-1070-quick-silver-8g-oc-review,1.html
> 
> 9Ghz memory OC and normal GTX 1070 performance.. I don't think Samsung vs Micron make any difference on the performance at 8Ghz clock...


I see lots of Samsung cards reaching 9.4-9.6GHz.


----------



## asdkj1740

MSI users should check the Live Update app from MSI to see whether the new BIOS has been released.


----------



## Mr-Dark

Quote:


> Originally Posted by *asdkj1740*
> 
> i see lots of samsung reaching 9.4g~9.6g


I'm talking about the performance at stock clocks... some say Samsung gives better FPS, yet no one had reviewed a GTX 1070 with Micron memory... that's my point.


----------



## khanmein

Quote:


> Originally Posted by *Mr-Dark*
> 
> SO, MSI release new card today.. the Quick Silver GTX 1070.. also with Micron memory.. here is the review
> 
> http://www.guru3d.com/articles-pages/msi-geforce-gtx-1070-quick-silver-8g-oc-review,1.html
> 
> 9Ghz memory OC and normal GTX 1070 performance.. I don't think Samsung vs Micron make any difference on the performance at 8Ghz clock...


Stability and latency? That Micron review barely hit 9GHz.


----------



## Mr-Dark

Quote:


> Originally Posted by *khanmein*
> 
> stability & latency? that micron review barely hit 9 GHz


I'm talking about the performance at stock clocks... Micron vs Samsung = nothing. You can see the review.


----------



## MyNewRig

Quote:


> Originally Posted by *Mr-Dark*
> 
> I'm talking about the performance at stock clock.. someone say Samsung give better fps and no one review gtx 1070 with Micron memory.. that's my point


I personally said smoother frames, not higher average FPS. Unfortunately, Guru3D does not measure frame times so we could compare; they only report average FPS, which does not say much about how gameplay actually feels on screen. I think Tom's measures frame times in their reviews, so I would like to see a Micron review from them.


----------



## khanmein

86.04.1E.00.AA (Samsung) on the Gigabyte.

Even now, reviews are still being done on Samsung cards.


----------



## MyNewRig

I read somewhere that Gigabyte will not release the Micron BIOS fix because they say they do not have a problem with their Micron cards and cannot reproduce any issues in their lab. Did anyone else with a Gigabyte card get this from support? It might explain why they released the new BIOS branch only for Samsung memory, but that is the strangest thing in the world!


----------



## Gurkburk

I'm slowly moving upwards with my clocks.

Right now it's 2113/4576MHz.

Unigine Heaven score: 2659.


----------



## gtbtk

Quote:


> Originally Posted by *TylerAD*
> 
> Seems like here is a recent review of an MSI card that has the Micron memory. This sample does not seem to have the issues as others are facing here.
> 
> http://www.guru3d.com/articles-pages/msi-geforce-gtx-1070-quick-silver-8g-oc-review,1.html


Exciting. That card does indeed have Micron memory. It is also running the new BIOS that we are all waiting for.


----------



## Gurkburk

Okay, so what the actual f*ck is up with these cards? Is there no way to force constant voltage? This crap is causing the damn card to lower the clocks mid-test.

The voltage is jumping between 1.040V and 1.062V. No way to get to 1.3V, like on the 780?

Anyone have an explanation?

Last Unigine test scored 2664.


----------



## criminal

Cheap waterblock for the FE back in stock: https://www.amazon.com/dp/B01IIGVL3W/ref=cm_cr_ryp_prd_ttl_sol_0

Great block for the price.
Quote:


> Originally Posted by *Gurkburk*
> 
> Okay. So what the actual f*ck is up with these cards. Is there no way to force constant voltage? This crop is causing the damn card to lower the clocks mid-testing.
> 
> Jumping between 1040mn to 1062. No way to get to 1.3v, like the 780?
> 
> Anyone have an explanation?
> 
> Last Unigine test was scored 2664.


Just the nature of Pascal and the BIOS Nvidia has provided. The cooler you can keep the card, the more consistent the clocks. 1.3V is not possible.


----------



## zipzop

Quote:


> Originally Posted by *Gurkburk*
> 
> Okay. So what the actual f*ck is up with these cards. Is there no way to force constant voltage? This crop is causing the damn card to lower the clocks mid-testing.
> 
> Jumping between 1040mn to 1062. No way to get to 1.3v, like the 780?
> 
> Anyone have an explanation?
> 
> Last Unigine test was scored 2664.


Check the GPU-Z Sensors tab after running a test, under PerfCap Reason; common ones are Pwr (power limit) and VRel (voltage reliability).


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> msi users should check the live updata app from msi to see whether new bios is released


It is not on Live Update yet.

The MSI user forum is offline as well. I wonder if they are being bombarded?


----------



## Gurkburk

Quote:


> Originally Posted by *zipzop*
> 
> Check GPU-z sensors tab after running a test. PerfCap Reason, common are Pwr(power limit) and Vrel(voltage reliability)


It says "Utility" is the reason. What does that mean? I've never used this part of the software.


----------



## gtbtk

Quote:


> Originally Posted by *Gurkburk*
> 
> Okay. So what the actual f*ck is up with these cards. Is there no way to force constant voltage? This crop is causing the damn card to lower the clocks mid-testing.
> 
> Jumping between 1040mn to 1062. No way to get to 1.3v, like the 780?
> 
> Anyone have an explanation?
> 
> Last Unigine test was scored 2664.


That is the nature of GPU Boost 3.0.

The 16nm FinFET process requires much less voltage than the older 28nm process used in the 780 and 900-series cards. Even if a tweaker is released, I doubt these Pascal cards will ever run at 1.3V. You can currently run the card at a maximum of 1.093V by setting Afterburner to unlock voltage adjustment and increasing the voltage slider to 100.

Cooling plays a big part in keeping the voltages and clock speeds higher in the range, but they will start dropping from their peak at around 47C. I read somewhere that above 68C it bottoms out and the clock and voltage stabilize, but I have not tested that myself. Personally, I am running on air cooling and keep my card at a maximum of about 60C with a fan curve, so I don't really have any experience at the higher temps.
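As a toy model only: the step size (~13MHz per temperature bin), the bin width, and the thresholds below are my illustrative assumptions, not an Nvidia spec, and they vary from card to card. But the temperature throttling described above behaves roughly like this:

```python
# Toy model of GPU Boost 3.0 temperature throttling: the card sheds one
# boost bin (~13 MHz here) roughly every few degrees once it passes the
# throttle point (~47 C here). All numbers are illustrative assumptions.

def boost_clock(temp_c, peak_clock=2076, step_mhz=13,
                throttle_start=47, bin_width=5, floor_clock=1987):
    """Approximate sustained boost clock (MHz) at a given GPU temperature (C)."""
    if temp_c < throttle_start:
        return peak_clock                       # full boost below the throttle point
    bins = (temp_c - throttle_start) // bin_width + 1
    return max(peak_clock - bins * step_mhz, floor_clock)

for t in (40, 50, 60, 70):
    print(t, boost_clock(t))                    # clock steps down as temp rises
```

Which is why an aggressive fan curve that holds the card near 60C keeps the sustained clock a few bins higher than letting it drift into the 70s.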


----------



## gtbtk

Quote:


> Originally Posted by *Gurkburk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *zipzop*
> 
> Check GPU-z sensors tab after running a test. PerfCap Reason, common are Pwr(power limit) and Vrel(voltage reliability)
> 
> 
> 
> It says "Utility" is the reason. What does that mean? Never used this part of the software
Click to expand...

Hover your mouse over the sensor title and an explanation of each term will pop up.

Util means the card is limited only by GPU utilization, i.e. it is not being throttled back by anything.


----------



## gtbtk

Quote:


> Originally Posted by *Mr-Dark*
> 
> Quote:
> 
> 
> 
> Originally Posted by *asdkj1740*
> 
> i see lots of samsung reaching 9.4g~9.6g
> 
> 
> 
> I'm talking about the performance at stock clock.. someone say Samsung give better fps and no one review gtx 1070 with Micron memory.. that's my point

It will be interesting to see if Steve Burke at Gamers Nexus actually does a back-to-back comparison, as he half-heartedly promised yesterday.

I will reserve judgement on the range of overclocks on the Micron memory until we get a better range of samples without the flawed bios. Yes, on an overclocking web site we are certainly seeing many people claiming 9.5GHz Samsung memory overclocks, but the people reporting that are also tuning their overclocks with real attention. And is that 9.5-9.6GHz stable on both DX11 and DX12? It never really gets discussed. We have also seen people reporting that their Samsung card will only do 8.5GHz. Are many people going to post about cards that don't overclock well on an overclocking site? I doubt it, so taking these enthusiast-site reports at face value may be skewing the statistics a bit.

Personally, my card, which is Micron, will run OK at up to +550 on DX11 (9.1GHz) on the original bios if I am careful with the voltages. In DX12, Time Spy starts flashing red spots all over the screen at anything higher than 8.8-8.9GHz. I don't know what will happen once I get the new bios made for my card installed. The Asus bios on my card, even with a 200W power target instead of the 291W target that the MSI cards come with, pretty much matches the best performance I can manage with the original MSI bios.
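The offset-to-effective-clock arithmetic used here (+550 coming out at roughly 9.1GHz) is easy to sketch. Assumption: the GTX 1070 ships at an 8008 MHz effective data rate and Afterburner's offset counts twice in the effective figure; some tools apply the offset only once, so check which convention yours uses.

```python
# Sketch of the Afterburner memory-offset arithmetic used above. The GTX
# 1070's GDDR5 ships at an 8008 MHz effective data rate, and Afterburner's
# offset is applied to the double-data-rate clock, so it counts twice in
# the effective figure. Some tools apply the offset once instead -- this
# assumes the doubled convention.

STOCK_EFFECTIVE_MHZ = 8008  # the GTX 1070's stock "8 GHz" memory

def effective_memory_mhz(offset_mhz, doubled=True):
    factor = 2 if doubled else 1
    return STOCK_EFFECTIVE_MHZ + factor * offset_mhz

# A +550 offset lands at ~9.1 GHz effective, matching the post above:
print(effective_memory_mhz(550))  # 9108
print(effective_memory_mhz(700))  # 9408 -> the oft-quoted "9.4 GHz" Samsung figure
```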


----------



## Gurkburk

Quote:


> Originally Posted by *gtbtk*
> 
> That is the nature of GPU Boost 3.0.
> 
> The 16nm Finfet process requires much less voltage than the older 28nm Process used in the 780 and 900 cards. Even if a tweaker is released, I doubt that these pascal cards will even run at 1.3v. You can currently run the card at a maximum of 1.093V but setting Afterburner to unlock the voltage adjustment and increasing the voltage slider to 100.
> 
> Cooling plays a big part in keeping the voltages and clock speeds higher in the range but they will start dropping from their peak at around 47 deg. I read somewhere that above 68 degrees it bottoms out and the clock and voltage stablizes but I have not tested that myself. Personally, I am running on Air cooling and I keep my card at a maximum of about 60 deg with a fan curve so I don't really have any experience at the higher range temps.


Quote:


> Originally Posted by *gtbtk*
> 
> hover your mouse over the sensor title and an explanation describing each of the terms will popup.
> 
> Util means it is limited by GPU operation. ie. it is not being throttled back by anything


I see. I'm maxing the fan speed while overclocking, so it runs around 50°C at "max volt" and roughly 2126MHz core / 4579MHz memory.

The voltage jumping puts the clocks in a twist, up and down. Irritating when you're trying to get every inch of performance.


----------



## headd

Quote:


> Originally Posted by *Mr-Dark*
> 
> SO, MSI release new card today.. the Quick Silver GTX 1070.. also with Micron memory.. here is the review
> 
> http://www.guru3d.com/articles-pages/msi-geforce-gtx-1070-quick-silver-8g-oc-review,1.html
> 
> 9Ghz memory OC and normal GTX 1070 performance.. I don't think Samsung vs Micron make any difference on the performance at 8Ghz clock...


The difference in performance at stock is 0. The difference after you OC the memory, Micron to 9GHz or Samsung to 9.4GHz, is 1-2 FPS.
I really don't know where this Micron madness comes from.
Both cards are equal at stock and after OC.


----------



## gtbtk

Quote:


> Originally Posted by *Gurkburk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> That is the nature of GPU Boost 3.0.
> 
> The 16nm Finfet process requires much less voltage than the older 28nm Process used in the 780 and 900 cards. Even if a tweaker is released, I doubt that these pascal cards will even run at 1.3v. You can currently run the card at a maximum of 1.093V but setting Afterburner to unlock the voltage adjustment and increasing the voltage slider to 100.
> 
> Cooling plays a big part in keeping the voltages and clock speeds higher in the range but they will start dropping from their peak at around 47 deg. I read somewhere that above 68 degrees it bottoms out and the clock and voltage stablizes but I have not tested that myself. Personally, I am running on Air cooling and I keep my card at a maximum of about 60 deg with a fan curve so I don't really have any experience at the higher range temps.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> hover your mouse over the sensor title and an explanation describing each of the terms will popup.
> 
> Util means it is limited by GPU operation. ie. it is not being throttled back by anything
> 
> 
> I see. I'm maxing the fanspeed when doing overclocking, so it runs around 50*c at "max volt" & 2126mh//4579mhz-ish.
> 
> The volt-jumping is putting the clocks in a twist & jumps up and down. Irritating while trying to get every inch of performance

That is a good overclock.

I would only suggest not fixating on the clock-speed numbers to the exclusion of frames per second. If you really want to max your benchmark scores, once you have something stable enough to finish a run, set the OC in Afterburner and then exit it; the OC is kept without AB running. Also close GPU-Z and anything else running in the background, and you should see a nice bump in your score.


----------



## gtbtk

Quote:


> Originally Posted by *headd*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mr-Dark*
> 
> SO, MSI release new card today.. the Quick Silver GTX 1070.. also with Micron memory.. here is the review
> 
> http://www.guru3d.com/articles-pages/msi-geforce-gtx-1070-quick-silver-8g-oc-review,1.html
> 
> 9Ghz memory OC and normal GTX 1070 performance.. I don't think Samsung vs Micron make any difference on the performance at 8Ghz clock...
> 
> 
> 
> The difference in performance at stock is 0.The difference in performance after you oc memory with micron to 9Ghz or with samsung with 9.4Ghz is 1-2Fps.
> I really dont know where this micron madness come from.
> Both cards are equal after oc and at stock.

It did go a bit viral, didn't it?


----------



## MyNewRig

Quote:


> Originally Posted by *gtbtk*
> 
> It will be interesting to see if Steve Burke at Gamers Nexus actually does a back to back comparison as he half heartedly promised yesterday.
> 
> I will reserve judgement on the range of overclocks on the micron memory until we can get a better range of samples without the flawed bios. Yes on an overclocking web site we are certainly seeing many people claiming 9.5Ghz samsung memory overclocks but the people reporting that are also tuning the overclocks with solid attention. Also is that 9.5-9.6Ghz stable on both DX11 and DX12? It never really gets discussed. We have also seen other people reporting that their card samsung card will only do 8.5Ghz as well. Are many people going to be posting about cards that don't overclock well all that much on an overclocking site? I doubt it. so just taking these enthusiast sites reports mat be skewing the statistics a bit.
> 
> Personally, my card which is micron, will run OK at up to +550 on DX11 (9.1ghz) on the original bios if I am careful with the voltages. In DX12, timespy starts flashing red spots all over the screen at anything higher than 8.8-8.9Ghz. I don't know what will happen once I get the new bios that is made for my card installed. The Asus Bios on my card, even with a 200W power target instead of the 291W target that the MSI cards comes with, pretty much matches the best performance that I can manage with the original MSI bios.


Historically, Samsung memory chips have always clocked better than the others (Hynix/Micron/Elpida); it is a pattern, not a single case.


----------



## MyNewRig

Quote:


> Originally Posted by *headd*
> 
> The difference in performance at stock is 0.The difference in performance after you oc memory with micron to 9Ghz or with samsung with 9.4Ghz is 1-2Fps.
> I really dont know where this micron madness come from.
> Both cards are equal after oc and at stock.


That is after months of begging Nvidia & Co., and with a bunch of workarounds to make it work before that; it did not come easy out of the box, hence the "madness". Not all manufacturers have a bios out yet: MSI and Gigabyte do not, and Gigabyte is not planning to release one for Micron, as I read. Also, that is still a single review; we are not sure the results are consistent, and there are people still not getting good results with the new bios (there is a guy here reporting a worse memory OC after the update). There is also no proper frametime/latency or power-draw comparison between the two chips yet, so don't rush to conclusions. Samsung worked perfectly and easily out of the box from day one; Micron is a totally different story.


----------



## Sycksyde

I have an MSI Gaming X with Micron and it does +500 on the RAM easily, I don't know what all the butthurt over Micron is about...?


----------



## Roland0101

Quote:


> Originally Posted by *khanmein*
> 
> where's Roland0101 or Roland01? i expected same guys cos so far i'm the one using the same name at here, guru3d & geforce forums. this is my f real name..
> 
> Roland0101 said "First, the imho most important thing, checkerboard artifacts crashes are completely gone."
> 
> http://www.overclock.net/t/1614656/gtx-1070s-micron-feedback/10
> 
> what he mean is that before the Asus released the new vbios for micron, his card got checkerboard artifacts crashes?
> 
> Roland01 said "My Strix OC with micron memory don't shows any of the problems described for RotTR. No flickering, no artifacts, no core clock throttling, stays not OCed (that means not further as Asus did overclocked the card anyway.) at 1987Mhz even in several hour long seasons. "
> 
> https://forums.geforce.com/default/topic/963768/geforce-drivers/gtx-1070-memory-vrm-driver-or-bios-bug-in-micron-memory-1070-cards-/2/
> 
> what he mean is that "no flickering, no artifacts, no core clock throttling, stays not OCed" for RotTR only but got checkerboard artifacts crashes on other games??
> 
> i'm confusing & please enlighten me. thanks (no offense)


Lol, I'm only seeing this post now. I hope I answered your question to your satisfaction.


----------



## Hunched

Quote:


> Originally Posted by *Sycksyde*
> 
> I have an MSI Gaming X with Micron and it does +500 on the RAM easily, I don't know what all the butthurt over Micron is about...?


It's people seeing the name "Samsung" and then the name "Micron" and thinking "Damn, I've heard more good things about Samsung, their SSD's and stuff are great, I want to say I have Samsung, who even is Micron?"

If we weren't able to easily identify the VRAM in GPU-Z, people wouldn't even know there's any difference.


----------



## BroPhilip

I have laid low on the whole Micron issue lately on the promise of a new bios, but this review of the new MSI card angered me greatly. After waiting this long, they put out a new card carrying the new bios, so everyone who buys the new card gets the update, while those of us who didn't return our cards and stuck with them get to sit and wait.

So you release the new Micron memory bios in a new card but leave the early adopters in the dark about a bios update to fix the Micron memory problems existing 1070 owners are having.

http://www.guru3d.com/articles_pages/msi_geforce_gtx_1070_quick_silver_8g_oc_review,29.html


----------



## Roland0101

Quote:


> Originally Posted by *Hunched*
> 
> It's people seeing the name "Samsung" and then the name "Micron" and thinking "Damn, I've heard more good things about Samsung, their SSD's and stuff are great, I want to say I have Samsung, who even is Micron?"


And then they purchased a Samsung Note 7...


----------



## MyNewRig

Quote:


> Originally Posted by *Hunched*
> 
> It's people seeing the name "Samsung" and then the name "Micron" and thinking "Damn, I've heard more good things about Samsung, their SSD's and stuff are great, I want to say I have Samsung, who even is Micron?"
> 
> If we weren't able to easily identify the VRAM in GPU-Z, people wouldn't even know there's any difference.


LOL, you make it sound like all of us with problematic Micron cards are complete fools. I noticed something was terribly off with my Micron cards without looking at GPU-Z and without even knowing the memory had been switched; I've said that multiple times. It is not as ridiculously simple as you make it sound, and I assume your card has Samsung, so you don't even know what you are talking about.


----------



## Mr-Dark

Well, the same thing happens each gen..

Let's talk about Maxwell.. the GTX 970, for example.

The first cards were Samsung and capable of an 8GHz memory clock (7GHz stock).. while later ones switched to Hynix, which barely hit 7600MHz.. then they all switched to Elpida, which was worse...

No one complained at all.. we all said it's just luck.. Samsung or Hynix or Elpida.. and we can't really complain about the OC ability, as the card works just fine at the 8GHz clock, so it meets the minimum spec for a GTX 1070...


----------



## Hunched

Quote:


> Originally Posted by *MyNewRig*
> 
> LOL, you make it sound like all of us who have problematic cards with Micron are complete fools, i noticed there was something terribly off with my Micron cards without looking at GPU-Z and without even knowing that memory was witched, i said that multiple times, it is not as ridiculously simple as you make it sound, and i assume your card has Samsung so you don't even know what you are talking about.


Samsung has problems too; in fact many Micron cards are more stable and overclock better than many cards with Samsung.
Wow!

Everyone using these blanket statements and blowing things out of proportion is being stupid.
Maybe I'd care to listen if everyone wasn't spreading so much misinformation about how Samsung is better than it actually is and Micron worse than it actually is.


----------



## Hunched

According to this topic, 99% of Samsung cards hit 9.5GHz+ and 99% of Micron cards aren't stable at stock.
Nope.


----------



## MyNewRig

Quote:


> Originally Posted by *Mr-Dark*
> 
> Well, Same thing happen each gen..
> 
> lets talk about Maxwell.. the gtx 970 for example
> 
> first card's was Samsung and they capable for 8Ghz memory clock ( 7Ghz stock ).. while later all switch to Hynix which barely hit 7600mhz.. then all switch to Elpida which worst...
> 
> No one complain at all.. we all say its just the Luck.. Samsung or Hynix or Elpida.. also we can't complain about the OC ability as the card work just fine at 8Ghz clock so its meet the minimum space for GTX 1070...


Quote:


> Originally Posted by *Hunched*
> 
> Samsung has problems too, in fact many Micron cards are more stable and overclock better than many cards with Samsung.
> Wow!
> 
> Everyone using these blanket statements and blowing things out of proportion are stupid.
> Maybe I'd care more to listen if everyone wasn't spreading so much misinformation, about how Samsung is better than it usually actually is and that Micron is worse than it usually actually is.


You both realize that Micron 8Gbps memory had problems for many people out of the box that require a bios update to restore normal operation, that we have been begging for that bios for ages, and that not everyone has it yet? Did the 970 have that? Did Samsung 1070s have that?


----------



## Hunched

Quote:


> Originally Posted by *MyNewRig*
> 
> You both realize that Micron 8 Gbps had problems for many people out of the box that requires a BIOS update to restore normal operations and that we been begging for that BIOS for so long and not everyone got it yet? did the 970 have that? did Samsung 1070 have that?


Yeah, actually.
Samsung cards have had issues at stock, and people have had to underclock or RMA.
Is this what you wanted to hear?

It happened with the 900 series too, with all brands of VRAM.
Even core clocks are unstable out of the box sometimes; time to boycott Nvidia because their cores are faulty, amirite?


----------



## Mr-Dark

Quote:


> Originally Posted by *MyNewRig*
> 
> You both realize that Micron 8 Gbps had problems for many people out of the box that requires a BIOS update to restore normal operations and that we been begging for that BIOS for so long and not everyone got it yet? did the 970 have that? did Samsung 1070 have that?


All Micron cards not stable at stock?? I can see that a few have that problem, not all...

Gigabyte & MSI & Asus cards work just fine...


----------



## MyNewRig

Quote:


> Originally Posted by *Hunched*
> 
> Yeah, actually.
> Samsung cards have had issues at stock, and people have had to underclock or RMA.
> Is this what you wanted to hear?


Proof? Give me at least 10 cases, not a one-off, or a thread anywhere with people complaining about Samsung memory problems that received any significant number of views.


----------



## MyNewRig

Quote:


> Originally Posted by *Mr-Dark*
> 
> all Micron not stable at stock ?? i can see a little have that not all...
> 
> gigabyte & msi & asus work just fine...


Not all, but many. Asus and MSI in particular do not work fine at all; MSI actually looks to have the most complaints. Why did they make the bios fix if everything was okay?


----------



## MyNewRig

Quote:


> Originally Posted by *Hunched*
> 
> It happened with 900 Series too, with all brands of VRAM.
> Even core clocks are unstable out of the box sometimes, time to boycott Nvidia because their cores are faulty amirite?


What is the memory type in your own 1070?


----------



## Hunched

Quote:


> Originally Posted by *MyNewRig*
> 
> Proof? give me 10 cases at least not one off case, or a thread anywhere were people complaining about Samsung memory problems that received any significant number of views.


No. You go research every Nvidia card ever made and every issue that has ever happened with VRAM.
This isn't new, and you would know that if you had been around for any meaningful amount of time; it's pretty clear you have little experience, like many others here, and this is your first big scandal.

Cards being unstable at stock settings has been around forever; it happens to everything, all brands.

How do you explain people hitting 9GHz+ with Micron? It means you're wrong.
If their bios wasn't gimped, would they be hitting 10GHz+? Is that the theory here? Lol.
I'm sure it's possible the bios isn't as fine-tuned as it could be, and you're probably more likely to get lucky with Samsung.

You can always get a lemon, probably more often with Micron, but not all Micron cards are lemons; that claim is false.
I have nothing more to say; I have more important things to do right now.
I shouldn't have bothered saying anything, but this topic has been ruined for weeks and it's annoying.
It's just regurgitated vomit, every post.


----------



## MyNewRig

Quote:


> Originally Posted by *Hunched*
> 
> you have little experience like many others here and this is your first big scandal.


I've been around longer than you, at least on this forum, and have owned GPUs, Nvidia and AMD, for years; I've never seen such a problem happen on a mass scale.
Quote:


> How do you explain people hitting 9ghz+ with Micron?


That Micron has a significant QC problem when some cards can't run stable at stock and others, like TheGlow's, can run 9700MHz effective; that is more than 2000MHz of variance, which is not normal.
Quote:


> I have nothing more to say, I have more important things to do right now.
> 
> I shouldn't have bothered saying anything but this topic has been ruined for weeks and its annoying.
> It's just regurgitated vomit, every post.


I have to go to bed myself. When this topic is not "running", nothing else is; the thread just halts, as there is nothing else to talk about. And we are not idiots, so don't take us for fools: I had one Micron card with issues, got it replaced with another Micron card from another batch which also had issues, while the Samsung card I had for a month before these two was flawless. In my case the odds were clear enough, but not everybody's experience with Micron is the same; as I explained above, the variance is huge.

Good Night!


----------



## Hunched

Quote:


> Originally Posted by *MyNewRig*
> 
> That Micron has a significant QC problem when some can't run stable at stock and others like TheGlow can run 9700Mhz effective, that is more than 2000Mhz variance which is not normal.


You realize the exact same thing happens with Samsung VRAM?
There are cards that aren't stable at stock and there are cards that hit 9.5GHz+.
So Samsung is just as bad as Micron, by that logic.
Okay.

If the bios were completely gimped and broken, nobody would be hitting 9.7GHz with it.
So thanks for proving my point that it's actually just the lottery.
Quote:


> Originally Posted by *MyNewRig*
> 
> never seen such problem happen on a mass scale


Well, that's what happens when 10-20 people in this topic create an echo chamber and keep posting the same thing for weeks straight.
It's really very small in the grand scheme of things.


----------



## BroPhilip

It's also interesting that the MSI 1070 Quick Silver card (new Micron bios), which is clocked a good bit lower than the Gaming Z model, scores higher in Time Spy in the chart on this page:
http://www.guru3d.com/articles-pages/msi-geforce-gtx-1070-quick-silver-8g-oc-review,30.html


----------



## MyNewRig

Quote:


> Originally Posted by *Hunched*
> 
> If the BIOS was completely gimped and broken, nobody would be hitting 9.7ghz with it.
> So thanks for proving my point that it's actually just the lottery.


Micron's problems go way beyond a simple silicon lottery; in my own experience they hard-crash the system and make it unstable in general, and if the bios were fine we would not be getting a fix from Nvidia (starting to go in circles now).

Anyway, further arguing is unnecessary, since neither of us is affected: you have Samsung memory, so you have no stake in this mess, and I returned my Micron card and refuse to spend a penny on a card with that memory type after my horrible experience with it. No one needs to convince the other of their stance; this issue is becoming way too political, to each their views.


----------



## Snuckie7

Quote:


> Originally Posted by *Mr-Dark*
> 
> SO, MSI release new card today.. the Quick Silver GTX 1070.. also with Micron memory.. here is the review
> 
> http://www.guru3d.com/articles-pages/msi-geforce-gtx-1070-quick-silver-8g-oc-review,1.html
> 
> 9Ghz memory OC and normal GTX 1070 performance.. I don't think Samsung vs Micron make any difference on the performance at 8Ghz clock...


Love the new color scheme. Anyone want to buy one to trade for my Gaming X? Haha


----------



## gtbtk

Quote:


> Originally Posted by *Mr-Dark*
> 
> Well, Same thing happen each gen..
> 
> lets talk about Maxwell.. the gtx 970 for example
> 
> first card's was Samsung and they capable for 8Ghz memory clock ( 7Ghz stock ).. while later all switch to Hynix which barely hit 7600mhz.. then all switch to Elpida which worst...
> 
> No one complain at all.. we all say its just the Luck.. Samsung or Hynix or Elpida.. also we can't complain about the OC ability as the card work just fine at 8Ghz clock so its meet the minimum space for GTX 1070...


The 3.5GB VRAM issue took everyone's attention off the brand of the memory.


----------



## Forceman

Quote:


> Originally Posted by *gtbtk*
> 
> The 3.5Gb vram issue took everyone's attention off the brand of the memory.


Excellent diversion strategy by Nvidia. Worked flawlessly.


----------



## gtbtk

Quote:


> Originally Posted by *BroPhilip*
> 
> It's also interesting that the msi 1070 quick silver card (new Micron bios) which is clocked a good bit lower than the gaming z model but scores higher in time spy in the chart on this page
> http://www.guru3d.com/articles-pages/msi-geforce-gtx-1070-quick-silver-8g-oc-review,30.html


CPU performance does have an impact on Time Spy scores. I have an older CPU and I am almost 1000 points below his score, but my graphics scores are almost identical to his.


----------



## gtbtk

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The 3.5Gb vram issue took everyone's attention off the brand of the memory.
> 
> 
> 
> Excellent diversion strategy by Nvidia. Worked flawlessly.

Misdirects don't take much, do they?


----------



## Roland0101

Quote:


> Originally Posted by *MyNewRig*
> 
> You both realize that Micron 8 Gbps had problems for many people out of the box that requires a BIOS update to restore normal operations and that we been begging for that BIOS for so long and not everyone got it yet? did the 970 have that? did Samsung 1070 have that?


No, the bios is for increased overclocking stability, and it works. People with problems at stock had/have a defective card. (And some Samsung cards are defective out of the box too.) You just mixed these two problems together without any facts to back the claim up.
Quote:


> this issue is becoming way too political, to each their views.


And who is responsible for that...


----------



## DeathAngel74

I have some spare pads lying around. Will these suffice for the gap between MMCP and backplate??
Fujipoly / mod/smart Ultra Extreme XR-m Thermal Pad - 100 x 15 x 0.5 - Thermal Conductivity 17.0 W/mK
Fujipoly / mod/smart Ultra Extreme XR-m Thermal Pad - 100 x 15 x 1.0 - Thermal Conductivity 17.0 W/mK
Fujipoly / mod/smart Ultra Extreme XR-m Thermal Pad - 100 x 15 x 1.5 - Thermal Conductivity 17.0 W/mK
EVGA is releasing a new bios update to resolve temp issues in a few days.
http://www.evga.com/thermalmod/


----------



## abctoz

Quote:


> Originally Posted by *Roland0101*
> 
> No, the Bios is for increased Overclocking stability, and it works. People with problems at stock had/have a defective card. (And some Samsung cards are defective out of the box too.) You just mixed this two problems together without any facts to back this claim up.
> And who is responsible for that...


I think we can all agree that

1. More Micron cards are unstable at stock
2. Micron VRAM have much higher variance on overclock ability

But the real issue for enthusiasts is that Samsung clocks higher on average, thus:

1. Nobody in their right mind would choose Micron over Samsung VRAM
2. NVIDIA is responsible for this bait-and-switch (early cards + review samples all used Samsung)


----------



## Hunched

Quote:


> Originally Posted by *abctoz*
> 
> 2. NVIDIA is responsible for this bait-and-switch(early cards + review samples all use Samsung)


It's not a bait-and-switch though, because they're still making cards with Samsung, just also with Micron now too...
There was no switch, there was an addition; they could add Hynix next if they like, they're allowed to do that.

As for sending out Samsung units for review, cherry-picking always happens with review samples.
It's the reviewer's fault if they overclock VRAM and do not specify that you may not get the same results, or may get a different memory variant.
Reviewers could also choose not to get review units directly from the manufacturer, and they could request Micron variants if they wanted.

As much as I dislike Nvidia and their practices, this isn't really anything new, or even that bad.
There are FAR worse things Nvidia is doing that I could complain about.

If Nvidia had been marketing how their cards have superior Samsung VRAM and always will, you would have a point, but that never happened, so you don't.


----------



## pacopepe

Quote:


> Originally Posted by *MyNewRig*
> 
> I read somewhere that Gigabyte will not release the Micron BIOS fix because they say that they do not have a problem with their Micron cards and can not reproduce any issues in their lab, did anyone else with a Gigabyte card get this information from support? this might explain why they released the new BIOS branch only for Samsung memory, but that is the strangest thing in the world!


I didn't see anything about that from Gigabyte; where did you read it? If they don't update it, I will send the card back to the shop and buy another brand.


----------



## khanmein

Quote:


> Originally Posted by *Mr-Dark*
> 
> all Micron not stable at stock ?? i can see a little have that not all...
> 
> gigabyte & msi & asus work just fine...


Where's the Micron review on YouTube? Even now, 1st Nov, they're still reviewing Samsung (embedded video).

My favourite YouTube reviewer is PC Games Hardware (I love German).

What about the Guru3D articles? This is the 3rd or 4th review article with Micron. Try Google and search for a Micron article, then you'll know what I'm trying to say.

I'm a typical NV fanboy, but if something is wrong, I'll point it out!

Since you have plenty of experience modding bioses on the 970 and others, why did it take them so long if there's just an OC issue? Meaning there's more than meets the eye. You also haven't answered me yet (I asked you last month).


----------



## abctoz

Quote:


> Originally Posted by *Hunched*
> 
> If Nvidia was marketing how their cards had superior Samsung VRAM and always will, you would have a point, but that never happened, so you don't.


I agree that it is well within NVIDIA's rights to do what they are doing, since they never advertised it. I used the phrase "bait-and-switch" because it sounded right, but the last point should be amended to:

2. NVIDIA is responsible for using components of lesser caliber on an enthusiast card.

I think my other 3 points still stand.


----------



## Roland0101

Quote:


> Originally Posted by *abctoz*
> 
> 1. More Micron cards are unstable at stock


Probably true, but hard facts are not available. MSI acknowledged quality problems, and I personally think a bad batch of Micron memory is responsible. The point is that it is not the norm; if it were, we would see many more complaints.
Quote:


> 2. Micron VRAM have much higher variance on overclock ability


If I look at the OC results people with normally working Micron cards are getting, I doubt that.
Quote:


> NVIDIA is responsible for this bait-and-switch(early cards + review samples all use Samsung)


The "bait-and-switch" point has already been discussed, but you are right about the hand-picked samples; that is a bad practice, and review sites (at least the professional ones) should start using "street" samples as soon as they are available.
Quote:


> 2. NVIDIA are responsible for using components of lesser caliber on an enthusiast card.


Lesser caliber? It's not the first time graphics cards have used different RAM chips, and one of those chips was always of "lesser caliber". The Guru3D review proves that a card with Micron is not in any way slower than a Samsung card at stock clocks, and that is all that is guaranteed. (I also would not call a *70-series card an enthusiast card, but if you like.)

Furthermore, what do you mean by "responsible"? Nvidia used Samsung longer than the AIBs did, and it is far more likely the switch was made for practical reasons, i.e. a shortage of Samsung RAM.


----------



## Majentrix

Does anyone else have a Gainward or Palit 1070? When and in what country did you buy it, and does it have Micron or Samsung memory?
I have early and late Gainward cards that both have Samsung memory, and I'm beginning to think it may just have been specific batches that got Micron memory.


----------



## abctoz

Quote:


> Originally Posted by *Roland0101*
> 
> Probably true, but hard facts are not available. Msi acknowledged quality problems, and I personally think that there was a bad batch of Micron memory that is responsible for this problem. Point is that it is not the norm, if it would be the norm we would see much more complains about this.
> If I look at the OC results people with normal working Micron cards have, I doupt that.
> Lesser caliber? It's not the first time that Graphic cards uses different RAM chips, and one of this RAM chips was always from "Lesser caliber". The Guru 3D review proofs that a card with Micron is not in any way slower than a Samsung card on stock clocks and that is all what is guaranteed. (I also would not call a 70th number card a enthusiast card, but if you like to.)
> 
> Furthermore, what do you mean by "responsible"? Nvidia used Samsung longer than the AIBs and it is far more likely that the switch was made out of practical reasons, meaning a shortage of Samsung Ram.


From what I have read on these and other forums, as well as my personal experience, I've concluded that Samsung VRAM clocks higher on average. I have tested 6 cards personally:

2x Micron: clock to +500-550 before artifacts (8700MHz effective)
4x Samsung: stable at +700, haven't tried higher (9000MHz effective)

My sample is small, but I see similar results around the internet: few people with Micron memory can get above +700, while that figure is reported for Samsung VRAM regularly.

When I say caliber I mean the ability to overclock. Sure, this is a well-known practice by NVIDIA, and it can be justified by the shortage argument, but at the end of the day consumers lose, and NVIDIA and its AIB partners win.


----------



## Skyblaze

Quick question, as I have never flashed a vBIOS before: if I want to update my Palit 1070 DUAL with their newly supplied vBIOS and utility from here: http://www.palit.com/palit/vgapro.php?id=2679&lang=en&pn=NE51070015P2-1043D&tab=do are there any things I should be aware of? I'm guessing closing Afterburner first would be a good idea, but anything else?


----------



## DeathAngel74

I usually close PrecisionX and Microsoft Security Essentials. Nothing more.


----------



## Skyblaze

Quote:


> Originally Posted by *DeathAngel74*
> 
> I usually close PrecisionX and Microsoft Security Essentials. Nothing more.


Okay thanks I'll try it tomorrow then.


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> I usually close PrecisionX and Microsoft Security Essentials. Nothing more.


booting into safe mode is the safest bet.


----------



## asdkj1740

Quote:


> Originally Posted by *Skyblaze*
> 
> Quick question as I have never flashed a vBIOS before, if I want to update my Palit 1070 DUAL with their newly supplied vBIOS and utility from here: http://www.palit.com/palit/vgapro.php?id=2679&lang=en&pn=NE51070015P2-1043D&tab=do are there any things I should be aware of? I'm guessing closing Afterburner first would be a good idea but anything else?


reset your OC settings first


----------



## 3jackdaws

Hello. I'm a first-time GPU overclocker and 1070 owner.

I wanted to try overclocking my GPU this morning with what little information I knew about OC'ing.
Using MSI AB, I applied the following settings to my GPU:

Core clock +100
Memory +360
Fan speed @ 80%

I did not touch the voltage yet since I wanted to see how it would go, but I quickly experienced major artifacting.
It was so bad that it was visible every 3-8 seconds.

Can someone tell me what the proper settings are for this GPU in terms of overclocking?

i7-3770
Zotac 1070 Amp! Extreme
16GB RAM
700W PSU

Also, does anybody use the FireStorm app from Zotac to OC? Or is MSI AB still the best one to use?

thanks!


----------



## khanmein

^^ increase the power limit to the max & don't touch the voltage yet.


----------



## Avendor

Indeed, the same problem happens to me (not in Heaven; I've tested in Fire Strike) if I try to raise memory more than +325MHz... that guy is also getting occasional freezing, which is really bad. Even if I set Prefer maximum performance in NVCP, the result is the same, checkerboard... can't get higher.


----------



## LuckLess7

So, yesterday I did what was suggested in the thread. I went to the Nvidia control panel and set "prefer maximum performance" for dwm.exe and explorer.exe. The voltage never dips below 0.8250V. I haven't had a crash since. I played the games where it usually crashed, and so far so good.

Obviously, the card now consumes more power at idle, which I don't consider ideal. The fans start at 60°C, so watching a YouTube video now makes the fans spin up.
Quote:


> Originally Posted by *Roland0101*
> 
> No, the Bios is for increased Overclocking stability, and it works. People with problems at stock had/have a defective card. (And some Samsung cards are defective out of the box too.)


You say that problems at stock settings indicate a defective card, and the upcoming vBIOS update is in any case just for the overclocking issues? I have contacted my retailer and described my problem; they answered within 30 minutes and just said please send it in, without any questions. This indicates that there is some kind of acknowledged issue; otherwise the retailer would not accept an RMA so easily without asking the usual questions, like "did you plug it in". I think I'll send it in, because if the upcoming BIOS won't solve crash issues at stock, I don't need to wait.


----------



## gtbtk

Quote:


> Originally Posted by *Majentrix*
> 
> Does anyone else have Gainward or Palit 1070s? When and what country did you buy it from? Does it have Micron or Samsung memory?
> I have early and late Gainward cards that both have Samsung memory and I'm beginning to think it may just have been specific batches that got Micron memory.


I think that is exactly what has happened for all manufacturers.


----------



## khanmein

Quote:


> Originally Posted by *LuckLess7*
> 
> So, I yesterday I did what was suggested in the thread. I went to nvidia control panel and put in "prefer maximum performance" for dwm.exe and explorer.exe. The voltage never dips below 0.8250Volts. I haven't had a crash since. I played the games where it usually crashed and so far so good.
> 
> Obviously, the card now consumes more power in idle states, which I don't consider idial. The fans will start at 60°C - so watching a Youtube Video now makes the fans start.
> You say that problems at Stock settings indicate a defective card, and the upcoming VBIOS update is in any case just for the Overclocking issues? I have contacted my reatiler and described my problem, they answered within 30 minutes and just said please send it in, without any questions or anything. This indicates that there is some kind of acknowledged issue, otherwise this retailer will not accept rma so easily without asking the usual questions, like, did you plug it in, you know these kind of questions. I think I send it in, because really if the upcoming BIOS will not solve crash-issues at stock, I don't need to wait.


don't overclock & stay at stock settings to confirm there's no freezing issue or other unknown problem, as recommended by a lot of users here. luckily i didn't sell my 970 yet. i almost pulled the trigger on an Asus Strix GTX 1070 non-OC/OC (depending which was available) & this coming friday my beloved 970 is coming back home with the courier.

why don't you just flash the latest vbios from asus? we paid around USD 400, no need to set dwm or explorer. RMA asap!


----------



## gtbtk

Quote:


> Originally Posted by *LuckLess7*
> 
> So, I yesterday I did what was suggested in the thread. I went to nvidia control panel and put in "prefer maximum performance" for dwm.exe and explorer.exe. The voltage never dips below 0.8250Volts. I haven't had a crash since. I played the games where it usually crashed and so far so good.
> 
> Obviously, the card now consumes more power in idle states, which I don't consider idial. The fans will start at 60°C - so watching a Youtube Video now makes the fans start.
> Quote:
> 
> 
> 
> Originally Posted by *Roland0101*
> 
> No, the Bios is for increased Overclocking stability, and it works. People with problems at stock had/have a defective card. (And some Samsung cards are defective out of the box too.)
> 
> 
> 
> You say that problems at Stock settings indicate a defective card, and the upcoming VBIOS update is in any case just for the Overclocking issues? I have contacted my reatiler and described my problem, they answered within 30 minutes and just said please send it in, without any questions or anything. This indicates that there is some kind of acknowledged issue, otherwise this retailer will not accept rma so easily without asking the usual questions, like, did you plug it in, you know these kind of questions. I think I send it in, because really if the upcoming BIOS will not solve crash-issues at stock, I don't need to wait.
Click to expand...

The new BIOS should be released in the next week or so. You won't need to run the workaround for long.

You also have the option of flashing your card with the new Asus Strix OC BIOS for the time before you get the correct update. You will need to flash back to the original before you apply the real update, because their utility checks what card you have, and the cross-flash will make the card appear to the PC to be an Asus card. It is a riskier option if you don't know exactly what you are doing, as you can brick the card. It worked well on my Gaming X though.

If your card is crashing at stock speeds and you can swap it, I would do that. We don't know if the new BIOS will fix that issue or not until it is here to try.


----------



## gtbtk

Quote:


> Originally Posted by *Avendor*
> 
> 
> 
> 
> 
> Indeed, same problem happens to me (not in Heaven, ive tested in Fire Strike) if i try to raise more than +325Mhz... this guy also having occasionally freezing which is really bad. Even i set in NVCP - Prefer maximum performance the result is the same, checkerboard.... can't get higher


That guy is running an ASUS Strix OC card. There is already a BIOS update available for that card.

He is idling his card at .625V, below the magical .800V that keeps the cards more stable and prevents that artifact/crash issue.

These workarounds are not perfect by any means. If you crash the driver and the card resets itself, it will often ignore the control panel settings and restart with the card's default settings. When you then apply a load, that leads to the checkerboards.


----------



## gtbtk

Quote:


> Originally Posted by *3jackdaws*
> 
> Hello. Im a first time GPU overclocker and 1070 owner.
> 
> I wanted to try overclocking my GPU this morning with what little information I knew about OC'ing.
> Using MSI AB, I have add the follow settings to my GPU.
> 
> Core clock + 100
> Meemory + 360
> Fan speed @80%
> 
> Did not touch voltage yet since I wanted to see how'd it go. But I quickly experienced major artifacting.
> It was so bad that it was visible every 3-8 seconds.
> 
> Can someone tell me what are the proper settings for this GPU in terms of overclocking?
> 
> i7-3770
> Zotac 1070 Amp! Extreme
> 16GB RAM
> 700w PSU
> 
> Also, does anybody use the fire storm app from Zotac to OC? or is MSI AB still the best one to use?
> 
> thanks!


Afterburner is a superior overclocking utility and will give you more flexibility; however, the Zotac software may have extras to control things like LEDs etc. You can install both, but do not run them at the same time.

Do not set either utility to automatically apply overclock settings at boot, at least until you are absolutely sure that what you have is stable.

When it comes to overclocking, every card in every installation is a little bit different, so there is no magic formula or single correct set of settings to apply.

I would suggest that you use this process:

1. Set the power limit and temp settings to the maximum. Leave voltage at 0 for now and set a custom fan curve. Overclocking will produce more heat, so faster fans are better than slower fans.

2. Starting from the default core clock and memory settings, increase the core clock only and test. It looks like you can run stable at +100, so start there. Increase the slider by +25 after each successful test until a test fails, then back it off by 25.

3. Next, adjust the memory clock settings using the same method you used for the core clock: set the value and apply, then test, then increase if the test passed or back it off if it failed.

4. You can then start increasing the voltage slider using the same method.

That will give you a baseline that you will probably need to play with over time. What is stable in Heaven may not be stable in Fire Strike or Time Spy. It may take you a month of fine tuning to get settings that are stable for everything.
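The step-up/back-off search in steps 2-3 can be sketched as a small loop. This is only an illustration of the procedure, not anything from an actual tool: `apply_offset` and `is_stable` are hypothetical stand-ins for setting the Afterburner slider and running a benchmark pass while watching for artifacts.

```python
# Sketch of the incremental overclock search described above:
# raise the offset in +25 steps after each passing test, and
# back off on the first failure. Both callables are mock
# stand-ins; real testing happens in Afterburner + a benchmark.

def find_stable_offset(apply_offset, is_stable, start=100, step=25, limit=400):
    """Return the highest offset (MHz) that passed the stability test."""
    offset = start
    last_good = None
    while offset <= limit:
        apply_offset(offset)          # stand-in: move the clock slider
        if not is_stable(offset):     # stand-in: benchmark run, check artifacts
            return last_good if last_good is not None else offset - step
        last_good = offset
        offset += step
    return last_good

# Example with a mock card that artifacts above +175:
result = find_stable_offset(
    apply_offset=lambda mhz: None,
    is_stable=lambda mhz: mhz <= 175,
)
print(result)  # 175
```

The same loop applies unchanged to the memory offset in step 3; only the stand-in functions differ.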


----------



## Avendor

Quote:


> Originally Posted by *gtbtk*
> 
> That guy is running an ASUS strix OC card. There is already a bios update available for that card.
> 
> He is idling his card at .625v, below the magical .800v that keeps the cards more stable and prevents that arfifact/crash issue.
> 
> These work arounds are not perfect by any means. If you crash the driver and the card resets itself, it will often ignore the control panel settings and restart wi the card default settings. when you apply a load it then lead to the checkerboards.


What about mine? Breaking news... Gigabyte stated they won't release a vBIOS for Micron; no issue was found...
http://www.guru3d.com/news-story/manufacturers-roll-out-firmware-updates-for-geforce-gtx-1070-due-to-memory-issue,11.html
http://hexus.net/tech/news/graphics/98335-nvidia-geforce-gtx-1070-bios-updates-fix-memory-issues/

Do I have to deal with voltages manually after all? If that's the case, I want to get +500MHz if possible. Can you show me exactly on a chart what I have to do? Just reproduce a new chart if you can.
You're right, at idle it stays at .625V.


----------



## khanmein

Quote:


> Originally Posted by *Avendor*
> 
> What about mine? breaking news... Gigabyte stated they won't release vbios for Micron, no issue was found...
> http://www.guru3d.com/news-story/manufacturers-roll-out-firmware-updates-for-geforce-gtx-1070-due-to-memory-issue,11.html
> http://hexus.net/tech/news/graphics/98335-nvidia-geforce-gtx-1070-bios-updates-fix-memory-issues/
> 
> Do i have to deal with voltages manually after all? If that's the case, I want to get +500Mhz if will be possible. Can you show me exactly on chart what do i have to do? just reproduce a new chart if you can
> You're right, in IDLE it stays .625v


this is why i never pick giga. sell it & change to asus or msi or even galax


----------



## gtbtk

Quote:


> Originally Posted by *Avendor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> That guy is running an ASUS strix OC card. There is already a bios update available for that card.
> 
> He is idling his card at .625v, below the magical .800v that keeps the cards more stable and prevents that arfifact/crash issue.
> 
> These work arounds are not perfect by any means. If you crash the driver and the card resets itself, it will often ignore the control panel settings and restart wi the card default settings. when you apply a load it then lead to the checkerboards.
> 
> 
> 
> What about mine? breaking news... Gigabyte stated they won't release vbios for Micron, no issue was found...
> http://www.guru3d.com/news-story/manufacturers-roll-out-firmware-updates-for-geforce-gtx-1070-due-to-memory-issue,11.html
> http://hexus.net/tech/news/graphics/98335-nvidia-geforce-gtx-1070-bios-updates-fix-memory-issues/
> 
> Do i have to deal with voltages manually after all? If that's the case, I want to get +500Mhz if will be possible. Can you show me exactly on chart what do i have to do? just reproduce a new chart if you can
> You're right, in IDLE it stays .625v
Click to expand...

Gigabyte also stated at first that they did not have Micron memory in any of their cards. Not sure that I would believe those sorts of announcements.

Do you get checkerboard artifacts and a crash if you set a +500 overclock while the voltage is at .625? I assume that you do, so Giga saying there are no problems is also untrue. The core BIOS comes from Nvidia to fix a problem with the memory power interface to the Micron chips. Every other manufacturer has released their 86.04.50.00.xx BIOS saying it was for the Micron memory problem.

Giga have released a BIOS in the 86.04.50 range, but they said it was for Samsung. I can't help feeling that the people responsible are not so bright and that the label for the download is wrong. I cannot see any reason why the "Samsung" 86.04.50.00.xx BIOS would not also work on a Micron card. It should fix the memory OC issues too.

In the meantime, the easiest workaround is to right-click your desktop and open the Nvidia Control Panel, go to Manage 3D Settings, and click the Program Settings tab. In there, add dwm.exe if it is not already there and set it to maximum performance; do the same thing with explorer.exe, then click Apply.

After you have done that, log out and then log in again and you are good to go.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> Gigabyte also stated that they did not have micron memory in any of their cards at first. Not sure that I would believe those sort of announcements.
> 
> Do you get checkerboard artifacts and crash if you set a +500 overclock while the voltage is at .625? I assume that you do so Giga saying no problems is also untrue. The core bios comes from Nvidia to fix a problem with the memory power interface to the micron chips. Every other manufacturer have released their 86.04.50.00.xx bios saying it was for the micron memory problem.
> 
> Giga have released the bios in the 86.04.50 range but they said it was for samsung. I cant help feeling that the people responsible are not so bright and that the label for the download is wrong. I cannot see any reason why the "samsung" 86.04.50.00.xx bios would not also work on a Micron card. It should fix the memory OC issues too.
> 
> To set a work around in the mean time, the easiest work around is to right click your desktop and open the Nvidia Control Panel, Go to manage 3d settings, click the program settings tab. In there add dwm.exe if it is not already there and set it to maximum performance, do the same thing with explorer.exe and set it to maximum performance then click apply.
> 
> After you have done that, Log out and then log in again and you are good to go.


the latest GIGA BIOS for Samsung is 86.04.1E.00.AA


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> latest GIGA bios for samsung is 86.04.1E.00.AA


Nope

http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios

http://www.gigabyte.com/products/product-page.aspx?pid=5922#bios

http://www.gigabyte.com/products/product-page.aspx?pid=5921#bios


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> Nope
> 
> http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios


my mistake.. looks like they have an even newer one for the Xtreme version http://www.gigabyte.com/products/product-page.aspx?pid=5921#bios


----------



## gtbtk

The G1 and Xtreme got updates.

I was considering getting a Gigabyte motherboard for my next build. After this, they are off the list.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> G1 and xtreme got updates.
> 
> I was considering getting a gigabyte motherboard for my next build. after this, they are off the list


i will go for Asus or ASRock.. giga & msi mobos are prone to issues.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> G1 and xtreme got updates.
> 
> I was considering getting a gigabyte motherboard for my next build. after this, they are off the list
> 
> 
> 
> i will go for Asus or ASRock.. giga & msi mobo prone with issue.
Click to expand...

I have Asus now. I have decided that I will stick with them.

Never used ASRock, so I don't know much about them.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> I have Asus now. I have decided that I will stick with them.
> 
> Never used asrock so I dont know much about them


you stick with asus. i can't afford asus, that's why i go for asrock.


----------



## Skyblaze

Well, my vBIOS update went painlessly, and I know it worked because the Palit DUAL release vBIOS had a bug where the card wouldn't shut off its fan at low usage. Now that I've flashed the new, improved vBIOS it does, which I know is the intended behavior, but I liked it better when it didn't


----------



## syl1979

I am running an ASRock P67 board for my 2500K after I ran into RAM compatibility issues on an MSI board... Runs well. Made runs at 5GHz... Frequent BIOS updates. At the time, a good price (not really the case anymore). I also like the fact that they make legacy boards: I bought an ASRock Socket 939 mATX motherboard for my X2 3800+, based on the AMD 790 chipset, to replace an Asus board that died.
http://www.asrock.com/mb/AMD/939A790GMH/


----------



## emsj86

Which 1070 with a water block would you guys recommend? I was leaning towards an EVGA SC 1070 with an EK water block.


----------



## DanielB123

Thought I would share my experiences with the GTX 1070 so far, in my case the Zotac AMP Extreme.

I bought the card in late August. It had Samsung memory in it, but my system was acting up like crazy and I thought it was because of the card, so I sent it back for replacement (it wasn't the card, but I fixed the issue, so all good now).

The replacement also has Samsung memory and seems to be working well. I'm not much of an overclocker; however, using Zotac's FireStorm utility I have bumped the core to +130, which gives me a stable 2050MHz under load (+160 caused Battlefield 1 to crash within minutes). As for the memory overclock, at the time of writing I'm at +650 on the memory, giving me 9.5GHz overall. I tried +700, but that caused flickering in the Battlefield 1 menu (soldier customization specifically); in-game it was fine.

As for temperatures, I have set up a custom fan curve that doesn't turn the fans on until 40 degrees Celsius, and after 40 they spin at 50%. During long periods of gaming it usually never goes above 55 degrees Celsius. It's reasonably quiet at that speed as well; I can't really hear it with headphones on when no sound is playing.
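A fan curve like the one described here (fans off below 40°C, 50% from 40°C up) is just a step function over temperature points. The sketch below is a hypothetical illustration of that idea; real utilities like Afterburner or FireStorm use several points and interpolate between them.

```python
# Minimal step-function version of the fan curve described above:
# 0% duty (fans off) below 40 C, 50% from 40 C upward.

def fan_duty(temp_c, points=((40, 50),)):
    """Return fan duty in percent for a given GPU temperature."""
    duty = 0  # fans stay off below the first curve point
    for threshold, pct in points:
        if temp_c >= threshold:
            duty = pct  # last threshold reached wins
    return duty

print(fan_duty(35))  # 0  (idle, fans off)
print(fan_duty(55))  # 50 (typical gaming load from the post)
```

Adding more `(threshold, percent)` pairs to `points` gives a steeper multi-step curve, similar to what a BIOS fan-curve update changes.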


----------



## khanmein

Quote:


> Originally Posted by *emsj86*
> 
> Which 1070 that has a water block would you guys recommend. I was leaning towards a EVGA sc 1070 with ek water block


avoid the EVGA SC, but since u goin water cool then no need to worry about over-heating.


----------



## emsj86

Quote:


> Originally Posted by *khanmein*
> 
> avoid using EVGA SC but since u goin water cool then don't worried over-heating.


I've always used the EVGA ACX version in the past. What exactly is wrong with the 1070? I was thinking if not EVGA, then Asus Strix.


----------



## THEROTHERHAMKID

What's the best 1070 up to now?
What's the Quick Silver like? Has it been released yet?


----------



## Mr-Dark

Quote:


> Originally Posted by *emsj86*
> 
> I be always used the EVGA AC version in the past. What's exactly wrong with the 1070. I was thinking if not EVGA than Asus strix


The power limit on the SC is very low.. 170W, and that's not enough for EVGA's stock clocks... the FTW has a 215W limit, and that's not enough to push the card without throttling..

Check if there is a WB for the MSI Gaming X & Z.. both have a 290W limit...



Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> What's the best 1070 up to now?
> What's the quicksilver like? Has it been released yet ?


I'd say MSI Gaming X or Z or Quick Silver.. also the AMP! Extreme (but the 3-slot cooler is heavy on the PCI slot)..
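The board-power figures quoted above matter because Pascal cards throttle clocks once the power limit is hit, so the effective ceiling is the stock limit times the maximum slider percentage. A rough comparison using the wattages from this post; the max-slider percentages here are hypothetical examples, since real values vary per BIOS (GPU-Z shows your card's actual limits):

```python
# Rough power-ceiling comparison using the stock limits quoted above.
# The slider maximums are hypothetical illustrations, not BIOS data.

def max_board_power(stock_limit_w, max_slider_pct):
    """Board-power ceiling in watts with the limit slider maxed."""
    return stock_limit_w * max_slider_pct / 100

for name, stock_w, slider_pct in [
    ("EVGA SC", 170, 112),
    ("EVGA FTW", 215, 112),
    ("MSI Gaming X/Z", 290, 108),
]:
    print(f"{name}: {max_board_power(stock_w, slider_pct):.0f} W ceiling")
```

Even with a generous slider, a 170W base leaves far less headroom than a 290W base, which is the point being made about the SC versus the MSI cards.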


----------



## emsj86

That's the other one I was looking at. The Gaming X does have a water block. Sucks that it isn't the full-length one (which is just for looks, as the current one has VRAM cooling). Looks like that will be my choice then.


----------



## BroPhilip

Quote:


> Originally Posted by *Mr-Dark*
> 
> The power limit on the SC is very low.. 170W as that not enough for stock evga clock... the FTW has 215W limit and that not enough to push the card without throttle..
> 
> check if there is a WB for the MSI Gaming-X & Z.. both has 290W limit...
> 
> 
> 
> 
> 
> 
> 
> 
> I'd way MSI Gamin-X or Z or Quick silver.. also the AMP! EXtreme (but the 3 slot cooler is heavy on the Pci slot ) ..


It's available on Newegg right now for order and includes the updated BIOS... apparently that was the priority over all those who had already bought a card with Micron. Maybe one day they will get around to updating ours as well.


----------



## BroPhilip

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> What's the best 1070 up to now?
> What's the quicksilver like? Has it been released yet ?


It's available on Newegg right now for order and includes the updated BIOS... apparently that was the priority over all those who had already bought a card with Micron. Maybe one day they will get around to updating ours as well.


----------



## Roland0101

Quote:


> Originally Posted by *LuckLess7*
> 
> Obviously, the card now consumes more power in idle states, which I don't consider idial. The fans will start at 60°C - so watching a Youtube Video now makes the fans start.
> You say that problems at Stock settings indicate a defective card, and the upcoming VBIOS update is in any case just for the Overclocking issues? I have contacted my reatiler and described my problem, they answered within 30 minutes and just said please send it in, without any questions or anything. This indicates that there is some kind of acknowledged issue, otherwise this retailer will not accept rma so easily without asking the usual questions, like, did you plug it in, you know these kind of questions. I think I send it in, because really if the upcoming BIOS will not solve crash-issues at stock, I don't need to wait.


A card that crashes at stock is a defective card by definition, and as a paying customer you deserve a card that works as advertised.

If I remember correctly, you said that you have an MSI card; MSI acknowledged quality problems on some 1070s (only in China, but they did).
Now it is possible that the BIOS will resolve the issue at stock clocks, and if that is all you want you can wait and test, but IMHO you should RMA that card, especially if your retailer is so accommodating.


----------



## THEROTHERHAMKID

Quote:


> Originally Posted by *BroPhilip*
> 
> Available now newegg right now for order and includ3s the updated bios....apparently it was the priority over all those that had already bought a card with micron. Maybe one day they will get around to updating ours as well.


Thanks for the reply, rep'd.
What's Newegg? American?
I'm in the UK.


----------



## saunupe1911

Asus Strix OC owners,

I updated to their new BIOS and my Samsung card is even more stable. Fire Strike never dips below 2038 with my locked voltage and fan speeds. I got a pretty damn good card.

My only gripe is this, and I blame Nvidia: the Nvidia control panel won't detect my HDMI connection when I switch away from my TV's HDMI input, and it forces Windows to only display my monitors. My AMD 6850 would always detect my TV's HDMI connection once I turned the TV on and made it my primary display. I hope Nvidia fixes this, but I don't think the average gamer cares, and most HTPC guys keep their TV as their main display 24/7.


----------



## emsj86

Looks like I'll get an EVGA FTW with a water block. It uses 2x 8-pin, so it should handle some good overclocks. Correct me if I'm wrong. If not, MSI Gaming X. But I like the FTW water block better.


----------



## Avendor

Quote:


> Originally Posted by *khanmein*
> 
> this is y i never pick giga. sell it & change to asus or msi or even galax


No, I won't do that. I'll stick with the G1; maybe in the future I'll move to a different GPU manufacturer. I didn't know the GTX 1070 would come with Micron memory until I grabbed one. I sold my GTX 580 with Samsung memory and bought the G1; the OC potential is really low compared to the Samsung in my previous GTX 580, but I am still pleased because the temperatures are so much better now.
Quote:


> Originally Posted by *gtbtk*
> 
> Gigabyte also stated that they did not have micron memory in any of their cards at first. Not sure that I would believe those sort of announcements.
> 
> Do you get checkerboard artifacts and crash if you set a +500 overclock while the voltage is at .625? I assume that you do so Giga saying no problems is also untrue. The core bios comes from Nvidia to fix a problem with the memory power interface to the micron chips. Every other manufacturer have released their 86.04.50.00.xx bios saying it was for the micron memory problem.
> 
> Giga have released the bios in the 86.04.50 range but they said it was for samsung. I cant help feeling that the people responsible are not so bright and that the label for the download is wrong. I cannot see any reason why the "samsung" 86.04.50.00.xx bios would not also work on a Micron card. It should fix the memory OC issues too.
> 
> To set a work around in the mean time, the easiest work around is to right click your desktop and open the Nvidia Control Panel, Go to manage 3d settings, click the program settings tab. In there add dwm.exe if it is not already there and set it to maximum performance, do the same thing with explorer.exe and set it to maximum performance then click apply.
> 
> After you have done that, Log out and then log in again and you are good to go.


If I set more than +325MHz on memory, I get checkerboard patterns in Fire Strike's combined test or in the Fire Strike stress test; it also happened in The Witcher 3. It's really weird: I tested the Rise of the Tomb Raider benchmark with no checkerboard. Usually after the checkerboard, every setting reverts to stock, or sometimes the PC reboots itself. I already tried NVCP - Global Settings - Power management mode - Prefer maximum performance; it doesn't help me at all. Now I'm keeping Adaptive mode, which works great for me. Someone said about Gigabyte, "Gigabyte have released 3 BIOS updates in as many months. What makes you think they will stop with the Samsung one?" I really hope they will not stop here.


----------



## Mr-Dark

Quote:


> Originally Posted by *emsj86*
> 
> Looks like I'll get a EVGA FtW with a water block. Uses 2x 8 pin so should handle some good over locks. Correct me if I'm wrong. If not Msi gaming x. But I like the FtW water block better


Bad idea.. 215W is the limit for that card.. go with MSI and you will be happy


----------



## gtbtk

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> What's the best 1070 up to now?
> What's the quicksilver like? Has it been released yet ?


It was announced yesterday. The OC version is the same as the Gaming X performance-wise.


----------



## gtbtk

Quote:


> Originally Posted by *emsj86*
> 
> Which 1070 that has a water block would you guys recommend. I was leaning towards a EVGA sc 1070 with ek water block


EVGA has VRM heat issues right now. I would wait and see about them first.


----------



## DeathAngel74

Their solution is 2mm thermal pads over the chokes and on the back of the PCB, plus another BIOS update for a steeper fan curve. More bandaids.


----------



## ITAngel

I would say if you get an EVGA, just double-check the thermal pads on the VRAM to make sure they are fully making contact so they don't overheat. I suspect they are just separating, so the heat is being trapped instead of spreading out across the metal backplate.


----------



## THEROTHERHAMKID

Quote:


> Originally Posted by *gtbtk*
> 
> Announced yesterday. The OC version is the same as Gaming X card performance wise


So they are doing 2 versions?


----------



## BroPhilip

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> Thanks for reply repd
> What's newegg? American?
> I'm in the uk


Yes, it is a North American site, but I believe a UK partner site has been mentioned here; I don't remember its name.


----------



## gtbtk

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Announced yesterday. The OC version is the same as Gaming X card performance wise
> 
> 
> 
> So they are doing 2 versions?
Click to expand...

Guru3d said there was a reference-clock version and an OC version with the same clocks as a Gaming X card. I could only see the OC version on Newegg.


----------



## headd

Quote:


> Originally Posted by *abctoz*
> 
> From what I read from these and other forums, as well as my personal experience I concluded that Samsung VRAM clocks higher on average, I have tested 6 cards personally:
> 
> 2x Micron clocks to +500-550 before artifacts 8700mhz effective
> 4x Samsung stable at +700 (haven't tried for higher) 9000mhz effective
> 
> My sample is small, but I see similar results on the internet, few people with Micron memory can get above +700, while you see this figure reported for Samsung VRAM regularly.
> 
> When I say caliber I mean ability to overclock, sure this is a well known practice by NVIDIA and it can be justified by the shortage argument, but at the end of the day consumers lose, NVIDIA+AIB partners win.


And? You know 9000MHz vs 9400MHz RAM gains you about 1 fps. Just stop this Micron hate.
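The raw-bandwidth math behind that claim can be sketched quickly. The 256-bit bus width is the 1070's actual spec; the clock numbers are the ones quoted in the posts above, treated as GPU-Z-style "effective" (quad-pumped) data rates.

```python
# Rough peak-bandwidth math behind the "9400 vs 9000 adds ~1 fps" argument.
# Assumes the GTX 1070's 256-bit GDDR5 bus; clocks are effective data rates.

BUS_WIDTH_BITS = 256

def bandwidth_gbs(effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s for a given effective data rate in MHz."""
    return effective_mhz * 1e6 * BUS_WIDTH_BITS / 8 / 1e9

stock = bandwidth_gbs(8008)   # stock 1070: ~256 GB/s
oc_lo = bandwidth_gbs(9000)
oc_hi = bandwidth_gbs(9400)

print(f"9000 MHz: {oc_lo:.0f} GB/s ({oc_lo / stock - 1:+.1%} over stock)")
print(f"9400 MHz: {oc_hi:.0f} GB/s (only {oc_hi / oc_lo - 1:+.1%} over 9000)")
```

The last 400MHz buys under 5% more raw bandwidth, which is why the real-world FPS difference stays small unless a game is badly bandwidth-bound.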


----------



## Nukemaster

Do they have a bios that covers both Samsung and Micron now?

I ask this because Asus lists Micron in the latest bios for my card, but I have Samsung memory. I am also somewhat interested in if the zeroRPM mode will work on my Mono Plus cooler.

I can't wait for a BIOS editor, if only for better fan control, since the BIOS controls this cooler's low speeds pretty poorly.


----------



## emsj86

Really wish the MSI Gaming X block covered the whole length of the card, as from what I'm being told it's the best one to get that has a water block, and then the MSI Gaming X block below.


----------



## saunupe1911

Quote:


> Originally Posted by *Nukemaster*
> 
> Do they have a bios that covers both Samsung and Micron now?
> 
> I ask this because Asus lists Micron in the latest bios for my card, but I have Samsung memory. I am also somewhat interested in if the zeroRPM mode will work on my Mono Plus cooler.
> 
> I can not wait for a bios editor just for better fan control since this coolers low speed does not work well with the bios since it controls it pretty poorly.


I think you missed my posts about this. That Asus BIOS is good for both Samsung and Micron, and the default Asus fan settings work flawlessly. I have no idea what you're talking about; the fans don't even kick on until 45 or 50 degrees. I used Afterburner to program my own custom speeds anyway. That's user error, it has nothing to do with a BIOS update.


----------



## LuckLess7

Quote:


> Originally Posted by *headd*
> 
> And? You know 9000MHz vs 9400MHz RAM gains you about 1 fps. Just stop this Micron hate.


I think this is not about hate towards Micron as a manufacturer per se. It's more about what was communicated by nVidia and the third-party manufacturers when they changed the supplier of the memory used in these graphics cards. The cards are advertised, and benchmarked by trusted websites, as able to hit a specific overclock, and while it might even be true that you'd only see a 1 fps increase in real-life performance (which I don't know, really), that is obviously not the point. This is about you, as a customer, feeling disappointed after spending (depending on how rich you are) a s***tload of money on a product that does not perform as insinuated, or rather implied, by the manufacturers. They all carry OC in the name, and you buy these products in order to do just that with them: overclock them. If it turns out there's a discrepancy of about 600% between what you expect and what you get, you naturally don't feel good about your purchase and want to see that acknowledged as an issue that will be taken care of.

At any rate, that's how I see it.


----------



## gtbtk

Quote:


> Originally Posted by *Nukemaster*
> 
> Do they have a bios that covers both Samsung and Micron now?
> 
> I ask this because Asus lists Micron in the latest bios for my card, but I have Samsung memory. I am also somewhat interested in if the zeroRPM mode will work on my Mono Plus cooler.
> 
> I can not wait for a bios editor just for better fan control since this coolers low speed does not work well with the bios since it controls it pretty poorly.


The 86.00.26 "Micron BIOS" should work on Samsung cards, but I have no way of testing it.


----------



## gtbtk

Quote:


> Originally Posted by *emsj86*
> 
> Really wish the MSI Gaming X block covered the whole length of the card, as from what I'm being told it's the best one to get that has a water block, and then the MSI Gaming X block below.


You mean this one?

https://www.msi.com/Graphics-card/GeForce-GTX-1070-SEA-HAWK-EK-X.html#hero-overview


----------



## Gurkburk

Does the Gigabyte with the Windforce cooler reach high temps, or will the card give up before getting anywhere near too-high clocks?

Edit: It seems like my card crashes when it reaches just above 60-63°C.


----------



## smokerings

I guess I can join in as I just got my Gigabyte G1 gaming 1070 today.
The card has the samsung vram which was a surprise as I was just expecting the micron ICs.

After a few hours of use the card is doing well and is running very cool and quiet.

I noticed a big difference in gameplay experience in GTAV as I was able to actually max out all the settings and drive around smoothly with no stutters and drops compared to my GTX680 2gb SLI setup that was running lower settings and suffered greatly from only having 2 gb of vram.

I also got some Noctua 140mm redux 1500pwm fans to add to my case but I can't find where I put the swiftech pwm splitter that I bought last year!


----------



## Nukemaster

Quote:


> Originally Posted by *saunupe1911*
> 
> I think you missed my posts about this. That Asus BIOS is good for both Samsung and Micron, and the default Asus fan settings work flawlessly. I have no idea what you're talking about; the fans don't even kick on until 45 or 50 degrees. I used Afterburner to program my own custom speeds anyway. That's user error, it has nothing to do with a BIOS update.


Thanks for the info.

By fan speed issues, I mean lower-speed aftermarket coolers don't control properly, because some Nvidia cards adjust PWM to hit a target RPM instead of just applying a straight percentage. With a fan that maxes at 1500rpm replacing one that used to max around 3500, the speed adjusts very erratically (the lowest it goes is in the 1050 area).

The stock cooler would control perfectly fine, but the card I have did not have any zero rpm mode at release.
Quote:


> Originally Posted by *gtbtk*
> 
> the 86.00.26 "micron bios" works on samsung cards. It should work but I have no way of testing it


Thanks.
This thread moves along pretty good at times.
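The RPM-targeting behaviour described above can be sketched with a toy model. The linear fan response and the specific duty/RPM numbers are made-up illustrations, not measurements of any real cooler, but they show why a slow aftermarket fan ends up with a loud "minimum" speed.

```python
# Toy model contrasting open-loop duty control with RPM-targeting control.
# The linear fan response is an idealization for illustration only.

def fan_rpm(duty: float, max_rpm: int) -> float:
    """Idealized fan: RPM scales linearly with PWM duty (0.0-1.0)."""
    return duty * max_rpm

def rpm_targeting_duty(target_rpm: float, max_rpm: int) -> float:
    """Duty a closed-loop controller settles at to hit an RPM target,
    clamped to the valid PWM range."""
    return min(1.0, target_rpm / max_rpm)

# The card derives its target from the stock ~3500 RPM fan at, say, 30% duty:
target = fan_rpm(0.30, 3500)             # 1050 RPM
# An aftermarket 1500 RPM fan must then run at 70% duty to reach that target,
# so the bottom of the curve is anything but quiet:
print(rpm_targeting_duty(target, 1500))  # 0.7
```

An open-loop curve would just run the new fan at 30% duty (450 RPM here); the RPM-targeting loop instead chases a number the slow fan can barely reach, which matches the erratic behaviour described.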


----------



## gtbtk

Quote:


> Originally Posted by *Gurkburk*
> 
> Can the gigabyte with Windforce reach high temps or will the card give up before going anywhere near too high clocks?
> 
> Edit: It seems like my card crashes when it reached just above 60-63*C.


That question is like asking how long a piece of string is. No one can give you a meaningful answer without knowing the rest of the environment.

what settings are you running the card at when the crash happens?

what are you running to put your card under load when it crashes?

does it do that at stock settings or when overclocked?


----------



## BroPhilip

Here is the reply that I got from the MSI USA facebook regarding the bios update....

"I got a response that puts the timeframe some time this month, the earliest being next week
not a very good one, but one nonetheless"


----------



## _Killswitch_

Well got my GTX 1070 today, haven't played with it much yet. Guess I can actually join the club now


----------



## Dude970

Quote:


> Originally Posted by *smokerings*
> 
> I guess I can join in as I just got my Gigabyte G1 gaming 1070 today.
> The card has the samsung vram which was a surprise as I was just expecting the micron ICs.
> 
> After a few hours of use the card is doing well and is running very cool and quiet.
> 
> I noticed a big difference in gameplay experience in GTAV as I was able to actually max out all the settings and drive around smoothly with no stutters and drops compared to my GTX680 2gb SLI setup that was running lower settings and suffered greatly from only having 2 gb of vram.
> 
> I also got some Noctua 140mm redux 1500pwm fans to add to my case but I can't find where I put the swiftech pwm splitter that I bought last year!


Quote:


> Originally Posted by *_Killswitch_*
> 
> Well got my GTX 1070 today, haven't played with it much yet. Guess I can actually join the club now










Welcome to the club!


----------



## Roland0101

Quote:


> Originally Posted by *_Killswitch_*
> 
> Well got my GTX 1070 today, haven't played with it much yet. Guess I can actually join the club now


Welcome!


----------



## _Killswitch_

Using my current PC: 2550K OC'd to 4.8GHz, GTX 1070 OC'd with +255MHz core (tried 260 but it crashed) and +385MHz on memory, as far as I've tried.
And thanks for the welcome =)


----------



## hitladen

Quote:


> Originally Posted by *saunupe1911*
> 
> Asus Strix OC owners,
> 
> I updated to their new bios and my Samsung card is even more stable. Firestrike never dips below 2038 with my locked voltage and fan speeds. I got a pretty good damn card.
> 
> My only gripe is this and I blame Nvidia...the Nvidia control suite won't detect my HDMI connection when I switch from my TVs HDMI input and force Windows to only display my monitors. My AMD 6850 card would always detect my TV's HDMI connection once I turned the TV on and made it my primary display. I hope Nvidia fixes this but I don't think the average gamer cares. And most HTPC guys keep their TV as their main display 24/7.


I also updated the Bios but I have Micron Memory.

I didn't really have any problems with Micron, before the updated bios I could do +100 on the Core and +200 on the Memory.

After the update I can do +300 on the memory an still +100 on the Core.

Getting around 19,900 in Fire Strike. I'm pretty sure I'm over 2100 under load but need to double check.


----------



## gtbtk

Lots of information about installing the 1070 /1080 EVGA ACX 3.0 cooler VRM heat pads out there now


----------



## syl1979

Here are official details
http://www.evga.com/thermalmod/thermal_pad_mod_installation_guide.pdf

Would I be OK skipping the pad on the front coils? Are those really getting hot?


----------



## asdkj1740

Quote:


> Originally Posted by *syl1979*
> 
> Here are official details
> http://www.evga.com/thermalmod/thermal_pad_mod_installation_guide.pdf
> 
> I would skip the pad on the front coils ? Are these really getting hot ?


I would rather put this front pad on top of the caps under the cooling plate, because lots of IR images show those caps get very hot too.
I don't think placing the pad on top of the chokes and cooling plate to make contact with the main heatsink will help much, since the heatsink fins above the chokes aren't folded at 90 degrees; a vertical fin makes poor contact with the pad. Very stupid.


----------



## Hnykill

Palit GTX 1070 Super JetStream here. Core +150MHz, memory +650, Micron memory, and no artifacts or problems whatsoever. The cooler is a beast. Fan profile: fans at 0% until the card warms up, then 40% at 40°C, ramping to 70% at 70°C and 90% under full load. My card never even hits 70°C. One badass card if you ask me. Sorry to hear about your Micron problems, man :/

But I also bought it right when it came out. Sad to see how manufacturers seem to cheap out :/
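A custom profile like the one described can be expressed as (temperature, fan %) points with linear interpolation between them. The breakpoints below are illustrative guesses at that curve, not the card's actual fan table.

```python
# Linear-interpolated fan curve, sketched from the profile described above.
# The breakpoints are illustrative, not any card's real fan table.

CURVE = [(40, 0), (41, 40), (70, 70), (80, 90)]  # (temp °C, fan %)

def fan_percent(temp_c: float) -> float:
    """Fan duty (%) for a temperature, interpolating between curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]           # zero-RPM region below the first point
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]          # pinned at the top point
    for (t0, p0), (t1, p1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(35))   # 0  (fans off)
print(fan_percent(70))   # 70.0
```

Afterburner's custom fan curve does essentially this, just with the points dragged in a GUI instead of written as a list.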


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> Lots of information about installing the 1070 /1080 EVGA ACX 3.0 cooler VRM heat pads out there now


No use at all if you suffer from this issue :(

https://www.reddit.com/r/5aqjvb/so_i_disassembled_my_two_evga_gtx_1080_ftws/


----------



## gtbtk

For all the impatient MSI Gaming users out there. I found the new micron fix 1070 gaming Z bios on line

https://www.techpowerup.com/vgabios/187155/187155


----------



## TheGlow

Quote:


> Originally Posted by *gtbtk*
> 
> For all the impatient MSI Gaming users out there. I found the new micron fix 1070 gaming Z bios on line
> 
> https://www.techpowerup.com/vgabios/187155/187155


After waiting this long, I'll wait a bit more to know it's 100% kosher.
Besides staying at 3D clocks and higher voltage, I don't have issues as is.


----------



## gtbtk

Quote:


> Originally Posted by *TheGlow*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> For all the impatient MSI Gaming users out there. I found the new micron fix 1070 gaming Z bios on line
> 
> https://www.techpowerup.com/vgabios/187155/187155
> 
> 
> 
> After waiting this long, I'll wait a bit more to know its 100% kosher.
> Besides staying in 3d clocks and higher voltage, I dont have issues as is.
Click to expand...

I am running it on my Gaming X. Works fine and reports a Gaming Z card


----------



## Avendor

It seems they made a change for Gigabyte G1 Gaming Micron, previous days was F11_Beta now it's F11. It has been added today
http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios

Could not be downloaded at this time...


----------



## gtbtk

Quote:


> Originally Posted by *Avendor*
> 
> It seems they made a change for Gigabyte G1 Gaming Micron, previous days was F11_Beta now it's F11. It has been added today
> http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios
> 
> Could not be downloaded at this time...


That bios will not fix the Checkerboard bug. The .26 range of bioses are the ones that introduced the bug in the first place


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> For all the impatient MSI Gaming users out there. I found the new micron fix 1070 gaming Z bios on line
> 
> https://www.techpowerup.com/vgabios/187155/187155


https://www.msi.com/Graphics-card/support/GEFORCE-GTX-1070-GAMING-X-8G.html#down-bios
officially out


----------



## Avendor

Quote:


> Originally Posted by *gtbtk*
> 
> That bios will not fix the Checkerboard bug. The .26 range of bioses are the ones that introduced the bug in the first place


How did you arrive at that conclusion? Maybe they fixed the older vBIOS; they've certainly changed something, since it's no longer beta.


----------



## LuckLess7

I just sent mine back. I hope they don't just flash the new BIOS and ship it back to me as is


----------



## BroPhilip

It's official for the MSI Gaming Z as well

https://us.msi.com/Graphics-card/support/GeForce-GTX-1070-GAMING-Z-8G.html


----------



## DeathAngel74

Mod is done




It's not pretty, but it will work until eVGA sends out the other pads.


----------



## asdkj1740

Quote:


> Originally Posted by *DeathAngel74*
> 
> Mod is done
> 
> 
> 
> 
> It's not pretty, but it will work until eVGA sends out the other pads.


omg evga started sending thermal pads out?


----------



## asdkj1740

Quote:


> Originally Posted by *asdkj1740*
> 
> omg evga started sending thermal pads out?


did you check the vram contact problem?


----------



## DeathAngel74

Yeah, they're fine. I thought I broke something, lol. The dang LED cable was in the path of one of the far-right fans (rattle, rattle). D'oh!
Browsing and YouTube, temps look OK so far! I hadn't used my PC in almost a week, I was so pissed......


----------



## Roland0101

Quote:


> Originally Posted by *asdkj1740*
> 
> omg evga started sending thermal pads out?


And for a good reason....



It's not a 1070 though.


----------



## asdkj1740

Quote:


> Originally Posted by *DeathAngel74*
> 
> yeah, they're fine. I thought I broke something, lol. The dang LED cable was in the path of one of the far right fan (rattle, rattle). DoH!
> Browsing and youtube, temps look ok so far! I haven't used my PC in almost a week, I was so pissed......


i think you may also suffer by the contact problem


----------



## DeathAngel74

yeah, it was a little short. Temporary fix though.


----------



## asdkj1740

Quote:


> Originally Posted by *DeathAngel74*
> 
> yeah, it was a little short. Temporary fix though.


The gaps on yours are smaller than in those pics posted on the EVGA forums and Reddit, but it's clear the VRAM chips haven't fully contacted the stock pads under the midplate.
You'd better change them too; a 1.5mm pad should get the VRAM cooling job done.


----------






## gtbtk

Quote:


> Originally Posted by *Avendor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> That bios will not fix the Checkerboard bug. The .26 range of bioses are the ones that introduced the bug in the first place
> 
> 
> 
> How did you arrive at this conclusion? Yeah, but maybe they fixed the older VBIOS, certainly they've changed something. No longer beta
Click to expand...

Because the base BIOS for all the manufacturers comes from Nvidia, and the bug is contained in the Nvidia base code, not in the parts customized by Gigabyte.


----------



## Prothean

I just flashed the new micron bios for my MSI card.

There is no change in memory overclocking potential for me. That is, I was getting +500 Mhz stable on memory with the old bios, and now I'm getting the same with the new bios.

The only difference is that I don't need to set the power management to "maximum power" anymore. I just leave it at "optimal power" now.


----------



## DeathAngel74

new bios is out. adds 200-250 rpms.


----------



## Gurkburk

Quote:


> Originally Posted by *gtbtk*
> 
> that question is like one asking how long is a piece of string. No-one can give you a meaningful answer if they don't know the rest of the environment
> 
> what settings are you running the card at when the crash happens?
> 
> what are you running to put your card under load when it crashes?
> 
> does it do that at stock settings or when overclocked?


Around +100ish Core clock and around +590 Memory.


----------



## xGeNeSisx

Is this nightmare finally over? I have had my G1 since mid-July and ended up RMA'ing the first one (it had trouble holding stock clocks). The second card's cooler had to run so loud to keep the card from throttling at 54C that the fans needed to be at 85% to stop it stuttering from the throttle. Both cards had Micron memory. Needless to say, I ripped the stock cooler off and put a Corsair H55 AIO on the core, and added thermal pads and heatsinks to the VRAM and VRM. Here's the best part: the card would still throttle even at stock clocks. It kept trying to clock down from 1999MHz core to 1986MHz or lower, even though the GPU core never exceeded 40C and the maximum recorded VRM temp was 45C. Ridiculous.

No modification of voltage/power target (more or less) resulted in any change. The temps were not a factor in why the card was throttling, and there was plenty of room for more power target/voltage as the card was only hitting 80-85% at max. I finally resorted to shunt modding the 1070 G1 Gaming with CLU. It was great, the card ran hotter but stopped throttling. Sort of. While the voltage/clock speed did not actually drop according to extensive monitoring, the card was still attempting to choke. The dips and stutters in game were greatly reduced, but still somewhat there as the card repeatedly tried to reduce clock and voltage but was unable to do so.

This morning I removed the shunt mod, checked the entire thermal setup and reassembled the card. After flashing this BIOS it seems as though it's finally gone. Now when I enter a different area of the map in DOOM, the card doesn't choke needlessly over and over. I attempted overclocking the card many times, overclocks were stable, but it felt as though not enough voltage/power headroom was available and it lagged and stuttered.

I just tested +50 mhz core to 2050 mhz, and +250 (4125 mhz) to memory. Completely stable after running Heaven and playing DOOM. Maybe I can finally enjoy gaming on the card and stop trying in vain to get a stable gaming experience









Edit: Also all through this Gigabyte support has been total pricks. Dealt with them when I inquired about RMA'ing the first card which was terrible. Second dealing was recently when I inquired about the release of the new BIOS and potential Micron memory voltage issue fix. I was told flat out "Gigabyte does not use Micron DDR5 memory in the 1070 G1 Gaming Series."
>mfw
Sent them readings from the card showing it had Micron memory with a nice arrow to direct their gaze. Support reply asked "is this a 1070 or 1080."
I tell them it's a 1070 G1 Gaming as shown in the screenshot and including the serial number and all relevant information. Got this reply before they closed the ticket.
"There is no micron memory issue in 1070 gaming."








Thanks Gigabyte, I really appreciate your superior product quality and technical support! I will never buy another Gigabyte product again.
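For anyone wanting to catch dips like those in a log rather than by eye, one approach is to poll `nvidia-smi --query-gpu=clocks.sm,temperature.gpu --format=csv,noheader,nounits` once a second and flag low samples. The sketch below shows only the parsing and flagging, run against canned sample lines; the 1999MHz threshold is taken from the post above and the polling loop itself is left out.

```python
# Flag throttle dips in nvidia-smi samples. The lines mimic output of:
#   nvidia-smi --query-gpu=clocks.sm,temperature.gpu --format=csv,noheader,nounits

def parse_sample(line: str) -> tuple:
    """Parse one 'clock, temp' CSV line into (sm_clock_mhz, temp_c)."""
    clock, temp = (field.strip() for field in line.split(","))
    return (int(clock), int(temp))

def find_dips(samples, expected_mhz=1999):
    """Return the samples whose SM clock fell below the expected boost bin."""
    return [s for s in samples if s[0] < expected_mhz]

lines = ["1999, 38", "1986, 39", "1999, 40"]   # e.g. one sample per second
samples = [parse_sample(l) for l in lines]
print(find_dips(samples))                      # [(1986, 39)]
```

Logging the dips alongside temperature makes it easy to show support (or yourself) that the downclocks are not thermal.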


----------



## BroPhilip

So, question, guys and girls... after the BIOS update there are no changes to my OC limits, but if I push memory over +375 (with the factory OC that's +425) I now get a hard reboot.


----------



## Roland0101

Quote:


> Originally Posted by *Gurkburk*
> 
> Around +100ish Core clock and around +590 Memory.


Is that afterburner or effective?
Quote:


> Originally Posted by *BroPhilip*
> 
> So question guys and girls.... after bios update no changes to oc limits but if I push memory over +375 (with factory oc +425) I now get a hard reboot.


Same question. + what Card is that.


----------



## BroPhilip

Quote:


> Originally Posted by *Roland0101*
> 
> Is that afterburner or effective?
> Same question. + what Card is that.


MSI gaming z


----------



## LeonardoHLB

The technician who analyzed the card and said that EVGA has no protection system against short circuits and overload managed to turn a GTX 1070 FTW into a GTX 1080!


----------



## Hunched

There are no new BIOS for Samsung cards from MSI, only for Micron correct?
Every card on their product page now has a BIOS with 0 information as to what it is.


----------



## gtbtk

Quote:


> Originally Posted by *Prothean*
> 
> I just flashed the new micron bios for my MSI card.
> 
> There is no change in memory overclocking potential for me. That is, I was getting +500 Mhz stable on memory with the old bios, and now I'm getting the same with the new bios.
> 
> The only difference is that I don't need to set the power management to "maximum power" anymore. I just leave it at "optimal power" now.


I am seeing pretty much the same. That is exactly what was supposed to happen. Success!


----------



## gtbtk

Quote:


> Originally Posted by *Hunched*
> 
> There are no new BIOS for Samsung cards from MSI, only for Micron correct?
> Every card on their product page now has a BIOS with 0 information as to what it is.


The new bios should work fine on both cards but it does not increase functionality. Gigabyte has released a bios from the same core code and say it is for samsung cards.

If you are not having any dramas, there is no need for you to install it unless you want to try it out. If you do decide to install it and don't like it, you can always reflash your card to the original bios using nvflash
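For anyone who hasn't done the reflash dance before, the usual nvflash steps look like the sketch below. It only prints the commands rather than running them; the file names are placeholders, and `-6` (override a mismatched PCI subsystem ID) is only needed when cross-flashing a BIOS from a different card model.

```python
# The back-up-then-reflash workflow with nvflash, printed rather than executed.
# Run the real commands from an elevated prompt; file names are placeholders.

BACKUP = ["nvflash", "--save", "original.rom"]  # always dump the current vBIOS first
FLASH  = ["nvflash", "-6", "new_bios.rom"]      # flash the new image (-6: cross-flash override)
REVERT = ["nvflash", "-6", "original.rom"]      # go back if you don't like it

for step, cmd in [("backup", BACKUP), ("flash", FLASH), ("revert", REVERT)]:
    print(f"{step:>6}: {' '.join(cmd)}")
```

Keeping the `original.rom` dump somewhere safe is the whole point: as long as you have it, a bad vendor BIOS is a five-minute revert instead of an RMA.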


----------



## gtbtk

Quote:


> Originally Posted by *LeonardoHLB*
> 
> The technician who analyzed and said that EVGA has no protection system against short circuit and overload, managed to turn the GTX 1070 FTW in a GTX 1080!


GP104-200 is a 1080 GPU.


----------



## Hunched

Quote:


> Originally Posted by *gtbtk*
> 
> The new bios should work fine on both cards but it does not increase functionality. Gigabyte has released a bios from the same core code and say it is for samsung cards.
> 
> If you are not having any dramas, there is no need for you to install it unless you want to try it out. If you do decide to install it and don't like it, you can always reflash your card to the original bios using nvflash


I tried installing the new BIOS and it failed, so I had to reflash my old one.


----------



## LeonardoHLB

Quote:


> Originally Posted by *gtbtk*
> 
> GP104-200 is a 1080 GPU.


Yes, it's the same chip.


----------



## gtbtk

Quote:


> Originally Posted by *LeonardoHLB*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> GP104-200 is a 1080 GPU.
> 
> 
> 
> Yes, it's the same chip.
Click to expand...

No the 1070 has a slightly different model GPU chip.

1070 uses a GP104-400 GPU chip with reduced cuda cores

Edit: I just looked at some review photos and it seems that I am wrong. the 1080 is gp104-400.


----------



## LeonardoHLB

Quote:


> Originally Posted by *gtbtk*
> 
> No the 1070 has a slightly different model GPU chip.
> 
> 1070 uses a GP104-400 GPU chip with reduced cuda cores


Still the same chip.


----------



## BroPhilip

The following is from the moderators regarding the msi update being for Samsung only...

Quote from: flobelix on 03 Nov, 2016 18:10
No, it can not be confirmed. All new vbios versions released November 3 are for Samsung ram batches. Either no vbios for micron cards are released yet or some are universal but there is no documentation. Even the Gaming Z November 3 released vbios is - at least according to MSI's list - only meant for a Samsung batch card.

The product site is also not clear about this issue. So far I can't say if there is a risk of crossflashing a wrong vbios.


----------



## gtbtk

Quote:


> Originally Posted by *Hunched*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The new bios should work fine on both cards but it does not increase functionality. Gigabyte has released a bios from the same core code and say it is for samsung cards.
> 
> If you are not having any dramas, there is no need for you to install it unless you want to try it out. If you do decide to install it and don't like it, you can always reflash your card to the original bios using nvflash
> 
> 
> 
> I tried installing the new BIOS and it failed, so I had to reflash my old one.
Click to expand...

The utility must check the card before installing.

Edit:

In light of what BroPhilip said above, I am not sure what is going on now. The MSI forums are claiming it is for the Samsung cards.


----------



## gtbtk

Quote:



> Originally Posted by *BroPhilip*
> 
> The following is from the moderators regarding the msi update being for Samsung only...
> 
> Quote from: flobelix on 03 Nov, 2016 18:10
> No, it can not be confirmed. All new vbios versions released November 3 are for Samsung ram batches. Either no vbios for micron cards are released yet or some are universal but there is no documentation. Even the Gaming Z November 3 released vbios is - at least according to MSI's list - only meant for a Samsung batch card.
> 
> The product site is also not clear about this issue. So far I can't say if there is a risk of crossflashing a wrong vbios.


The bios version that the utility installed on my Gaming X is the same version number as the bios installed on the Quick Silver card (same PCB as Gaming 8g, X and Z) with Micron Memory reviewed on Guru3d the other day


----------



## DeathAngel74

Figures....After modding the card myself....eVGA is preparing to ship the FREE thermal pads, lol...... Rotten luck.


----------



## Roland0101

Quote:


> Originally Posted by *DeathAngel74*
> 
> Figures....After modding the card myself....eVGA is preparing to ship the FREE thermal pads, lol...... Rotten luck.


Better safe than sorry.


----------



## Hunched

Quote:


> Originally Posted by *BroPhilip*
> 
> The following is from the moderators regarding the msi update being for Samsung only...
> 
> Quote from: flobelix on 03 Nov, 2016 18:10
> No, it can not be confirmed. All new vbios versions released November 3 are for Samsung ram batches. Either no vbios for micron cards are released yet or some are universal but there is no documentation. Even the Gaming Z November 3 released vbios is - at least according to MSI's list - only meant for a Samsung batch card.
> 
> The product site is also not clear about this issue. So far I can't say if there is a risk of crossflashing a wrong vbios.


But... the BIOS's don't even work on my Samsung card.
I'm using the old OC Mode Gaming Z BIOS from Techpowerup currently, for Samsung memory and it works fine.

*** is going on








Wow, 'ftw' backwards is really censored? Is this website hosted in Saudi Arabia? What is this censorship?


----------



## DeathAngel74

Quote:


> Originally Posted by *Roland0101*
> 
> Better safe than sorry.


I'll see how my mod performs and save the free pads as spares.


----------



## SuperZan

Well, the new vBIOS fan settings are very nearly the same ones I've been running as a custom profile, so that was reassuring if nothing else. I'll still go ahead and apply the thermal pads, because a cool card is a happy card. I've got the five-year warranty and EVGA cross-ships in the UK, so I feel pretty good about the situation.

I really hope that EVGA doesn't make this sort of mistake again but their service response is why I buy their Nvidia cards.


----------



## Hunched

So has anyone with a MSI Samsung card tried any of the new BIOS's from MSI's product page or Techpowerup?
Do they work?


----------



## BroPhilip

My results with the new BIOS show exactly the same OC potential as the previous BIOS, except now I'm getting hard system locks when trying to test memory overclocks... it also seems to run hotter than before, with quicker step-downs of the GPU clock. Not really what I was expecting.


----------



## Hunched

Quote:


> Originally Posted by *BroPhilip*
> 
> My results with the new BIOS show exactly the same OC potential as the previous BIOS, except now I'm getting hard system locks when trying to test memory overclocks... it also seems to run hotter than before, with quicker step-downs of the GPU clock. Not really what I was expecting.


Is your card Micron or Samsung?


----------



## TheGlow

I just applied the MSI bios for my micron but it looks like something else is running to keep my voltage at 800mV.
I undid everything I thought had an impact and seems theres something else.
Is there something to figure out what it might be?
Edit: ok, those msi gaming osd exes were still running with service disabled. I killed those and dropped to core 215MHz, Memory 405MHz and 649mV.
I can open Edge or chrome, watch it kick up for a second core to 1582MHz, Memory 4704MHz and 725mV.
I can keep opening and closing and watch the spikes in the graph. Appears to be ok.
Too tired to play the OC max finding game again right now.
Going to leave my usual +180 core/+700 memory and see how Overwatch handles it.


----------



## BroPhilip

Quote:


> Originally Posted by *Hunched*
> 
> Is your card Micron or Samsung?


Micron


----------



## Hunched

I just want to know if these new BIOSes from MSI are for Micron or Samsung.
The product page descriptions are wrong and useless, and the MSI forum mods say they're for Samsung.

This is stupid.


----------



## gtbtk

Quote:


> Originally Posted by *Hunched*
> 
> So, has anyone with an MSI Samsung card tried any of the new BIOSes from MSI's product page or TechPowerUp?
> Do they work?


I flashed the TechPowerUp Gaming Z BIOS manually and it worked fine on my Gaming X (Micron) for the hour or two I used it, before I found out about the proper update from MSI.

I then reverted to the original X BIOS and used the utility to update to the Gaming X .50 BIOS, and it worked fine.


----------



## gtbtk

Quote:


> Originally Posted by *Hunched*
> 
> I just want to know if these new BIOS's from MSI are for Micron or Samsung.
> The product page descriptions are wrong and useless, the MSI forum mods say they're for Samsung.
> 
> This is stupid.


If you are on Samsung and have not had problems, don't worry about it for now. The BIOS does not seem to improve performance.


----------



## Dude970

I am curious about this, but I will hold off until I read more. I have Samsung memory, but I thought the BIOS update was for Micron. Flobelix has always been spot on, so I will wait.


----------



## Hunched

Can somebody provide a Samsung Gaming Z BIOS?
The one from TechPowerUp is not recognized by MSI's update tool, since it's a reviewer OC Mode BIOS.
I cannot find a regular consumer Gaming Z BIOS anywhere.

Does anyone here have a Gaming Z 1070 with Samsung? If you upload your BIOS that would be awesome.


----------



## pacopepe

Gigabyte just posted a new BIOS for the Micron G1:

http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios

Release for Micron Memory
NVIDIA Source BIOS Version: 86.04.26.00.56
2016/11/03

And it's based on a June BIOS, while the Samsung one from a few days ago was based on the newest one from October, 86.04.50.00.7A.

So, not sure what the actual **** Gigabyte is doing.


----------



## Hunched

Quote:


> Originally Posted by *pacopepe*
> 
> gigabyte just posted a new bios for micron G1
> 
> http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios
> 
> Release for Micron Memory
> NVIDIA Source BIOS Version: 86.04.26.00.56
> 2016/11/03


Hm.
Looks like Micron is still 86.04.*26*.xx.xx.
Gigabyte's Samsung is also 86.04.*50*.xx.xx.

MSI's new BIOSes allow older Samsung BIOSes to be updated to them, even though their tool is capable of flagging incompatibility between product lines. For example, if you're using a Gaming X BIOS you cannot flash to a Gaming Z BIOS with their tool, and vice versa.

I'm pretty certain, as the MSI forum mods have been saying, that MSI still has not released a Micron BIOS.

I'm going to flash to the new 86.04.50 user-uploaded Gaming Z BIOS on TechPowerUp.
Too bad there's no more OC Mode by default, though.


----------



## gtbtk

Quote:


> Originally Posted by *Hunched*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pacopepe*
> 
> gigabyte just posted a new bios for micron G1
> 
> http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios
> 
> Release for Micron Memory
> NVIDIA Source BIOS Version: 86.04.26.00.56
> 2016/11/03
> 
> 
> 
> Hm.
> Looks like Micron is still 86.04.*26*.xx.xx
> Gigabyte's Samsung is also 86.04.*50*.xx.xx
> 
> MSI's new BIOS's allow older Samsung BIOS's to be updated to them, even though they're capable of flagging incompatibility between product lines.
> Example, if you're using a Gaming X BIOS you cannot flash to a Gaming Z BIOS with their tool, vice versa.
> 
> I'm pretty certain, as the MSI forum mods have been saying, that MSI still has not released a Micron BIOS.
> 
> I'm going to flash to the new 86.04.50 user uploaded Gaming Z BIOS on techpowerup.
> Too bad no more OC Mode by default though.
Click to expand...

While overclocking with curves, I can actually get more performance out of my card with the X BIOS than with the Z BIOS.


----------



## Hunched

I can't flash the new BIOS. A newer version of NVFlash is required, and the latest version doesn't work either.
Okay.

Looks like I need to find a consumer Z BIOS so I can actually use MSI's tool.
Damn, I can't find one anywhere.


----------



## Hunched

Quote:


> Originally Posted by *gtbtk*
> 
> While overclocking, using curves, I can actually get more performance out of my card with the X bios compared to the Z bios


I might just use the new X BIOS then; it's not like I have a choice right now.
I'm unable to flash these new BIOSes with anything but MSI's official tool, and you cannot flash to a Z BIOS unless you're already using one.
The only one I've ever seen, the TechPowerUp OC Mode BIOS, doesn't work; they didn't account for the ten people using a reviewer BIOS with their tool.

I find it odd that nobody in the world has posted a Gaming Z BIOS except for a reviewer.


----------



## BroPhilip

Quote:


> Originally Posted by *Hunched*
> 
> I might just use the new X BIOS then, it's not like I have a choice right now.
> I'm unable to flash these new BIOS's with anything but MSI's official tool, and you cannot flash to a Z BIOS unless you're already using one.
> The only one I've ever seen, the Techpowerup OC Mode BIOS does not work, they didn't account for the 10 people using a reviewer BIOS with their tool.
> 
> I find it odd how nobody in the world has posted a Gaming Z BIOS except for a reviewer.


I posted a Gaming Z BIOS, though it's Micron.


----------



## Hunched

Quote:


> Originally Posted by *BroPhilip*
> 
> I posted a gaming z bios it's micron though


Thanks, but yeah, unfortunately that doesn't help me.

With NVFlash 5.319.0 I can't flash to the newest MSI BIOS on TechPowerUp; it says a newer version is needed.
The new version, 5.328.0, doesn't work either.

Somebody did upload the new Z BIOS here, but I can't flash to it:
https://www.techpowerup.com/vgabios/187155/187155
(It's not Micron; the uploader's description is wrong.)

The new version of the X BIOS, which I've flashed with MSI's BIOS tool since I can actually acquire the old version of the X BIOS, is also on there and is what I'm now using.
New X BIOS: https://www.techpowerup.com/vgabios/187158/187158 (probably can't be flashed with NVFlash)
Old X BIOS: https://www.techpowerup.com/vgabios/184207/msi-gtx1070-8192-160603-1
MSI new X BIOS: https://us.msi.com/Graphics-card/support/GEFORCE-GTX-1070-GAMING-X-8G.html#down-bios

Just flash to the old one with NVFlash, and then you can get the new version directly from MSI and it won't be flagged as incompatible.

I hate these hoops. Just provide the damn .rom files, MSI. Though I suppose they don't work with NVFlash for now anyway.
I'll flash my Gaming 1070 to Z status one day if I want to, probably as soon as NVFlash is updated again.
It seems MSI's tool is using NVFlash version 5.333, which isn't publicly available yet.


----------



## pacopepe

After flashing the Micron BIOS for the G1, GPU-Z says my BIOS is now 86.04.50.00.7A.

So the patch notes must be wrong.


----------



## Hunched

Quote:


> Originally Posted by *pacopepe*
> 
> After flashing the bios for micron from G1, GPU-Z says that my bios is now a 86.04.50.00.7A
> 
> So, the patch notes must be wrong.


Or Gigabyte put up the wrong BIOS.
All the new MSI BIOSes are 86.04.50, and the mods on the MSI forum say they're Samsung.
Every single Micron BIOS from MSI and Gigabyte to date has been 86.04.26.

The .50 is a new thing:
.26 = Micron
.1E = Samsung

None of the manufacturers providing these BIOSes seem to understand what they're providing.
I'm not certain of anything yet.

The MSI mods have been instructed that the .50s are Samsung, so that's what I'm believing.
Their tool also updated my Samsung BIOS to a .50 BIOS.
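For anyone keeping track of which dump is which, the pattern above is easy to script. This is just a sketch encoding the folklore from this thread; the meaning of the third version field is our inference, not anything Nvidia documents:

```python
# Guess which memory a GTX 1070 vBIOS targets from its version string.
# The mapping is inferred purely from version numbers reported in this thread:
#   86.04.26.xx.xx -> Micron, 86.04.1E.xx.xx -> Samsung,
#   86.04.50.xx.xx -> the new BIOS whose vendor labels conflict.
def classify_vbios(version: str) -> str:
    fields = version.upper().split(".")
    if len(fields) != 5 or fields[:2] != ["86", "04"]:
        return "unknown (not an 86.04.xx.xx.xx Pascal version string)"
    return {
        "26": "Micron",
        "1E": "Samsung",
        "50": "new unified BIOS (vendor labels conflict)",
    }.get(fields[2], "unknown revision " + fields[2])

print(classify_vbios("86.04.26.00.56"))  # Micron
print(classify_vbios("86.04.50.00.7A"))  # new unified BIOS (vendor labels conflict)
```

Paste your GPU-Z "BIOS Version" string in and it will tell you which bucket it falls into, for whatever this thread's pattern is worth.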


----------



## Hunched

Quote:


> Originally Posted by *gtbtk*
> 
> I then reverted back to the original X bios and used the utility to update to the Gaming X .50 bios and it worked fine


This is what I've done, but my card is Samsung.
86.04.50.00.2A?


----------



## gtbtk

Quote:


> Originally Posted by *Hunched*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BroPhilip*
> 
> I posted a gaming z bios it's micron though
> 
> 
> 
> Thanks, but yea unfortunately that doesn't help me.
> 
> With NVFlash 5.319.0 I can't flash to the newest MSI BIOS on Techpowerup, says a new version is needed.
> The new version 5.328.0 doesn't work either.
> 
> Somebody did upload the new Z BIOS here but I can't flash to it.
> https://www.techpowerup.com/vgabios/187155/187155
> (It's not Micron, uploader was wrong with their description)
> 
> The new version of the X BIOS, which I've flashed to with MSI's BIOS tool since I can actually acquire the old version of the X BIOS, is also on there and what I'm now using.
> New X BIOS: https://www.techpowerup.com/vgabios/187158/187158 (probably can't flash with NVFlash)
> Old X BIOS: https://www.techpowerup.com/vgabios/184207/msi-gtx1070-8192-160603-1
> MSI new X BIOS: https://us.msi.com/Graphics-card/support/GEFORCE-GTX-1070-GAMING-X-8G.html#down-bios
> 
> Just flash to the old one with NVFlash and then you can get the new version directly from MSI and it won't be incompatible.
> 
> I hate these hoops, just provide the god damn .rom's MSI. Though I suppose they don't work anyway with NVFlash for now.
> I'll flash my Gaming 1070 to Z status if I want to, one day, probably as soon as NVFlash is updated again.
> It seems MSI's tool is using 5.333 version of nvflash, which isn't available yet.
Click to expand...

I flashed my Gaming X with the Gaming Z BIOS from TechPowerUp using the OEM version of NVFlash 5.328 that I extracted from the Micron-fix Asus vBIOS updater. I did flash it from a Gaming X .26 BIOS.

If you want it, I have posted it here. Download both files from:

https://1drv.ms/f/s!AplTNK-q9mm4iKlEeAoHeULWcLrWBw


----------



## gtbtk

Quote:


> Originally Posted by *Hunched*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I then reverted back to the original X bios and used the utility to update to the Gaming X .50 bios and it worked fine
> 
> 
> 
> This is what I've done, but my card is Samsung.
> 86.04.50.00.2A?
Click to expand...

That is the version I am running


----------



## Hunched

Quote:


> Originally Posted by *gtbtk*
> 
> That is the version I am running


So either I'm running the wrong BIOS, or you're running the wrong BIOS, or it's somehow universal.
MSI cared enough to flag incompatibility when their tool recognized a Gaming X BIOS trying to flash to a Gaming Z BIOS.
I'd think they would also be able to recognize a Samsung BIOS trying to flash to a Micron BIOS, or the other way around, and stop that as well.

I still don't understand why they would instruct their forum moderators, the ones who handle BIOS requests, that it's Samsung-only if it's not.
Somebody is doing their job wrong.


----------



## gtbtk

Quote:



> Originally Posted by *Hunched*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> That is the version I am running
> 
> 
> 
> So either I'm running the wrong BIOS, or you're running the wrong BIOS, or it's somehow universal.
> MSI cared enough to flag incompatibility if they recognized a Gaming X BIOS trying to flash to a Gaming Z BIOS.
> I'd think they would be able to recognize a Samsung BIOS trying to flash to a Micron BIOS or the other way around and stop that as well.
> 
> I still don't understand why they would instruct their forum moderators, the ones who handle BIOS requests, that it's Samsung only if it's not.
> Somebody is doing their job wrong.
Click to expand...

This is what I know.

The BIOSes are interchangeable. A .26 BIOS also shipped on later Samsung cards; they are not memory-specific.

If you compare .26 BIOSes from two different manufacturers in a hex editor, the files are almost identical. The only bits that have been changed are mostly in the header area and a section near the end of the file covering branding, clock speeds, and power limits: basically the bits that differentiate the models. I have not done a comparison with .50 BIOSes, but the manufacturers are not going to rewrite their own BIOS files from scratch.

EVGA, ASUS, and Palit/Gainward have all published .50 BIOSes as Micron fixes.

Gigabyte has published a .50 as a Samsung fix, but there is nothing wrong with the Samsung cards that I am aware of that would need a BIOS fix. The MSI forum has now said their .50 BIOS is only for Samsung as well, but given that Nvidia wrote the bug fix, the BIOSes from all brands are mostly the same file, and the .50 does fix the checkerboarding, I can't help thinking Gigabyte and MSI both attended the same meeting with Nvidia and there has been a communication mix-up.
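For anyone who wants to check that hex-editor observation themselves, here's a rough sketch of the comparison. It just reports where two ROM dumps disagree; the file names in the usage comment are hypothetical:

```python
# Byte-diff two vBIOS dumps and summarize where they differ. If the
# observation above holds, the differing offsets should cluster near the
# start (header/branding) and end of the file rather than being spread
# throughout.
def diff_offsets(rom_a: bytes, rom_b: bytes) -> list[int]:
    """Offsets where the two images disagree, compared up to the shorter length."""
    n = min(len(rom_a), len(rom_b))
    return [i for i in range(n) if rom_a[i] != rom_b[i]]

def summarize(rom_a: bytes, rom_b: bytes) -> None:
    diffs = diff_offsets(rom_a, rom_b)
    n = min(len(rom_a), len(rom_b))
    same_pct = 100.0 * (n - len(diffs)) / n
    print(f"{same_pct:.1f}% identical; first diff at {hex(diffs[0]) if diffs else 'n/a'}")

# Usage with real dumps (paths are hypothetical):
#   summarize(open("msi_g1.rom", "rb").read(), open("giga_g1.rom", "rb").read())
summarize(b"\x00" * 100, b"\x00" * 99 + b"\x01")  # 99.0% identical; first diff at 0x63
```

Saving a dump with `nvflash --save` (or GPU-Z's BIOS export) gives you the input files.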


----------



## benjamen50

I don't think I have 0dB mode anymore on my EVGA card after updating the BIOS with the updated PWM profile. Is this intended, or is it abnormal for my EVGA GTX 1070 ACX 3.0?

Here is a GPU-Z screenshot with the GPU vBIOS version:
http://prntscr.com/d2v448


----------



## SuperZan

Quote:


> Originally Posted by *benjamen50*
> 
> I don't think I have 0db anymore on my EVGA graphics card after updating the BIOS with the updated PWM profile, is this intended or not normal for my EVGA GTX 1070 ACX 3.0?
> 
> Here is a GPU-Z screenshot with the GPU vBIOS version:
> http://prntscr.com/d2v448


When I looked into the new settings from the updated vBIOS, the change to the Quiet-oriented profile moved the absolute minimum fan speed from 0% to 20%. You can still make a custom profile that begins at 0 and then follows their recommended settings past, say, 50 °C. I noticed that the new fan profiles aim to prevent critical temperatures from developing early in the ramp-up rather than trying to walk high temperatures back, which is how I typically design my custom fan profiles anyway. If you follow their methodology you should be able to tweak a bit within safe parameters, provided you pay attention to where their 'Quiet' profile makes aggressive fan-speed changes. Err on the side of caution, I'd say. If you've ordered the thermal pads, then once they're applied you should be able to tame your fan profile a bit further.
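For what it's worth, the kind of profile I'm describing is just a piecewise-linear curve over temperature. A sketch, with illustrative breakpoints rather than EVGA's shipped values:

```python
# Piecewise-linear fan curve: 0% (silent) below 50 °C, then ramp aggressively
# early so temperatures never need to be "walked back". The breakpoints are
# illustrative numbers, not EVGA's actual profile.
CURVE = [(50, 0), (60, 40), (70, 65), (80, 85), (90, 100)]  # (temp °C, fan %)

def fan_percent(temp_c: float) -> float:
    if temp_c <= CURVE[0][0]:
        return 0.0  # 0 dB zone: fans fully stopped
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # linear interpolation between adjacent breakpoints
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return 100.0  # above the last breakpoint, pin the fans at max

print(fan_percent(45))  # 0.0
print(fan_percent(65))  # 52.5
```

Afterburner's curve editor is doing essentially this interpolation between the points you drag around.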


----------



## khanmein

Quote:


> Originally Posted by *SuperZan*
> 
> When I looked into the new settings from the updated vBIOS the change to the Quiet-oriented profile moved the absolute minimum fan speed from 0 to 20%. You can still make a custom profile that begins at 0 and then moves to their recommended settings past, say, 50 °C. I noticed that the new fan profiles aim to prevent critical temperature development early in the ramp-up rather than trying to walk high temperatures back, which is how I typically design my custom fan profiles anyway. If you follow their methodology you should be able to tweak a bit within safety parameters provided you pay attention to when their 'Quiet' profile makes any aggressive fan-speed changes. Err on the side of caution, I'd say. If you've ordered the thermal pads then once they've been applied you should be able to tame your fan profile a bit further.


There's no way to force my Leadtek GTX 970 into fanless mode, even with an edited vBIOS or MSI AB. I flashed back to the original stock BIOS.

On my RMA card, both fans spin damn fast and loud. HWMonitor and GPU-Z show 0 RPM, but FANPWMIN0 shows 37% during idle and 59% max, while the fan RPM still reads 0.

Temperature is stable, no overheating, and the same goes for the VRM. I re-applied fresh thermal paste and cleaned the fan and heatsink.


----------



## Hunched

Quote:


> Originally Posted by *gtbtk*
> 
> This is what I know.
> 
> The BIOSes are interchangeable. A .26 BIOS also shipped on later Samsung cards; they are not memory-specific.
> 
> If you compare .26 BIOSes from two different manufacturers in a hex editor, the files are almost identical. The only bits that have been changed are mostly in the header area and a section near the end of the file covering branding, clock speeds, and power limits: basically the bits that differentiate the models. I have not done a comparison with .50 BIOSes, but the manufacturers are not going to rewrite their own BIOS files from scratch.
> 
> EVGA, ASUS, and Palit/Gainward have all published .50 BIOSes as Micron fixes.
> 
> Gigabyte has published a .50 as a Samsung fix, but there is nothing wrong with the Samsung cards that I am aware of that would need a BIOS fix. The MSI forum has now said their .50 BIOS is only for Samsung as well, but given that Nvidia wrote the bug fix, the BIOSes from all brands are mostly the same file, and the .50 does fix the checkerboarding, I can't help thinking Gigabyte and MSI both attended the same meeting with Nvidia and there has been a communication mix-up.


Yes, it does look like these new BIOSes are compatible with both Samsung and Micron.
I'm not going to dig into it much further unless I have issues, but I did find a post from EVGA's Jacob saying their new BIOSes for the Micron fix were fine for Samsung users to update to.
It's probably the same for everybody.

It's just a real lack of communication from the manufacturers; they all seem pretty clueless.
There would never have been an issue in the first place had any of them been capable of properly designing a BIOS for the Micron cards.


----------



## DeathAngel74

From what I remember, the EVGA 970 BIOSes were universal: Samsung/Hynix/Elpida *cough* Micron *cough*

@khanmein
this won't work on your card?


----------



## benjamen50

Never mind what I said about the new EVGA vBIOSes; my card actually still has the 0dB fanless mode with the new BIOS as well. The card was just too hot at the time, since it's getting close to summer here.


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> from what i remember eVGA 970 bios were universal. Samsung/Hynix/Elpida*cough*Micron*cough*
> 
> @khanmein
> this won't work on your card?


Yeah, I set RPM11, PER01 & RPM01 to 0.

I now estimate my fan speed is around 3K RPM (min is 1.31K, max is 3.9K); the lowest PWMFANIN0 is around 34~39% during idle.

https://www.techpowerup.com/vgabios/168531/168531

1) v5.287: flash the edited 0 RPM vBIOS ROM

2) v5.287: run --protectoff

3) v5.196.0.1: flash back the original ROM

4) v5.287: run --protecton

FYI, memory support (Maxwell):
GDDR5, Elpida
GDDR5, Hynix
GDDR5, Samsung

So my card's BIOS is also universal across the different VRAM chips, not split like *cough* Micron or Samsung *cough* with Pascal!

I had no choice but to order the long sleeve-bearing fans (https://www.aliexpress.com/item/Free-Shipping-Brand-New-2016-2pcs-Lot-POWER-LOGIC-PLD08010S12HH-75mm-DC12V-0-35A-4Pins-Graphics/32616914727.html).

My two fans came as Colorful CF-12815B PWM ball-bearing units, but that model is hard to find. (http://bbs.expreview.com/thread-62980-1-1.html)

My card just came back from RMA; I received the parcel yesterday. I tested for many hours with FIFA 17 & Dota 2 with no temperature issues, but the fan noise is like a low-end hair dryer.

The fans behave normally if no NV driver is installed or the GPU is disabled, but once I boot into Windows they constantly spin at this speed and noise, whether idle or gaming. Weird.


----------



## Avendor

Quote:


> Originally Posted by *gtbtk*
> 
> Because the base bios for all the manufacturers comes from Nvidia and the bug is contained in the Nvidia base code and not the customized parts done by gigabyte


Apparently after installing F11 for my G1 Micron I'm getting a new BIOS version: 86.04.50.00.7A (which is indeed for Samsung, according to Gigabyte's site). I think they finally fixed it. Why do I think that? Because with the previous f11_beta BIOS I couldn't pass +325 on the memory clock; now with F11 I've just tested +410MHz (8.8GHz, getting close to 9GHz effective; I'm going to push it more to see where it crashes) with no checkerboard and no problems so far. Seems stable. I'm on Adaptive mode.








New results in Fire Strike:






LE: Interesting, no artifacts or video driver crashes at:
+125 Core Clock
+500MHz Memory Clock
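As a side note on those numbers: the 1070's GDDR5 runs at 2002 MHz, quad-pumped to 8008 MT/s effective, and Afterburner's memory offset appears to be counted in double-data-rate MHz, so each +1 on the slider adds 2 MT/s. That reading is our assumption from the figures in this thread, not anything MSI documents, but it reproduces the clocks being quoted:

```python
# GTX 1070 GDDR5: 2002 MHz base clock, quad-pumped to 8008 MT/s effective.
# Assumption (inferred from this thread, not documented by MSI): Afterburner's
# memory offset is in double-data-rate MHz, so effective = 8008 + 2 * offset.
BASE_EFFECTIVE = 8008  # MT/s

def effective_mts(offset_mhz: int) -> int:
    return BASE_EFFECTIVE + 2 * offset_mhz

print(effective_mts(410))  # 8828 -> the "8.8, getting close to 9GHz" figure
print(effective_mts(500))  # 9008
```

That's why a +500 offset is being described as "9GHz" memory.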


----------



## pacopepe

With the new G1 Micron BIOS I can get a +600 memory overclock; +625 is not stable in Unigine and crashes after 15 scenes or so, but at least it's something.

Stable in Heaven and Fire Strike, and zero problems in real games like Titanfall 2 and Company of Heroes 2.


----------



## Avendor

+50% Core Voltage
+700MHz Memory Clock
Couldn't finish the stress tests; it starts flickering in some scenes. Overall a significant improvement in my case.


----------



## gtbtk

Quote:


> Originally Posted by *Avendor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Because the base bios for all the manufacturers comes from Nvidia and the bug is contained in the Nvidia base code and not the customized parts done by gigabyte
> 
> 
> 
> Apparently after installing F11 for my G1 Micron I'm getting a new Bios version: 86.04.50.00.7A (which indeed it's for Samsung, according to Gigabyte site) I think they finally fixed it, why do I think that? because with previous bios f11_beta couldn't pass more than +325 on mem. clock that was my limit, now with F11 I've just tested with +410Mhz (8.8 getting close to 9GHz effective I'm gonna push it more to see where I'm crashing at) no checkerboard, no problems so far, seem to be stable. I'm on Adaptive mode.
> 
> 
> 
> 
> 
> 
> 
> 
> New results in Fire Strike:
Click to expand...

Gigabyte didn't fix it. Nvidia did, in the core .50 BIOS that they distributed to all the vendors. I have no idea why they are saying the BIOS is for Samsung.

Glad you are a happy chappy


----------



## Avendor

Thanks for this information. You're the best


----------



## F3niX69

My friend has a G1 Gaming with Micron and BIOS version 86.04.26.00.56; he updated the vBIOS with the F11 BIOS and it installed version 86.04.50.00.7A.


----------



## Mr-Dark

So the new MSI BIOS is out.. how about updating the BIOS in SLI? Will the update apply to both cards, or should I swap the cards?

I'm too lazy to open the case door and take out the 2nd card.. lol


----------



## TheGlow

I applied the MSI BIOS but found it a bit odd.
I grabbed the one on their site: https://us.msi.com/Graphics-card/support/GEFORCE-GTX-1070-GAMING-X-8G.html#down-bios
I didn't like that nothing mentioned a BIOS version number, patch notes, etc.
It's just an exe. It shows some info in a command prompt but closes itself, even when run from a command prompt, so I couldn't see what it said.
It left me in low res, so I rebooted, assuming that was the next step, since nothing prompted me.
Only GPU-Z showed the new BIOS revision number, which I forgot to compare to old screenshots, so I could only confirm the update by trying to crash it.


----------



## DirektEffekt

I actually recently got a G1 Gaming with Micron, and the F11 BIOS from the Gigabyte website seems to have fixed the memory instability for me. (It actually comes out as 86.04.50.00.7A when you flash it, contrary to what the description says; particularly interesting since the F11 beta BIOS most definitely was not the .50 version.) Before, I was stuck at +350; now, on a quick tweak, I am at +600.

Edit: On another quick tweak it looks like I artifact at around +700, so I will see how far I can get without artifacting. Not too shabby.


----------



## BroPhilip

Quote:


> Originally Posted by *TheGlow*
> 
> I applied the MSI bios but found it a bit odd.
> I grabbed the one on their site, https://us.msi.com/Graphics-card/support/GEFORCE-GTX-1070-GAMING-X-8G.html#down-bios
> I didnt like how I didnt see anything mentioning a bios version number, patch notes, etc.
> It's just an exe. It shows some info in a command prompt but closes itself, even if run from command prompt, so I couldn't see what it said.
> It left me in low res so I had to reboot assuming that was the next step as nothing prompted me.
> Only GPU-Z had the new bios revision # which I forgot to compare to old screen shots so I could only assume I had it successfully updated by trying to crash it.


Yeah, the MSI release has been very odd, to say the least. Mine became more unstable and ran hotter, so I downgraded to the old BIOS. The old one was a .27 and the new one a .50 in the BIOS number.


----------



## BroPhilip

Let's add more confusion to the MSI BIOS release... it has been updated on the website again, with a release date of today. I don't remember the file size of the last one.


----------



## gtbtk

Quote:


> Originally Posted by *TheGlow*
> 
> I applied the MSI bios but found it a bit odd.
> I grabbed the one on their site, https://us.msi.com/Graphics-card/support/GEFORCE-GTX-1070-GAMING-X-8G.html#down-bios
> I didnt like how I didnt see anything mentioning a bios version number, patch notes, etc.
> It's just an exe. It shows some info in a command prompt but closes itself, even if run from command prompt, so I couldn't see what it said.
> It left me in low res so I had to reboot assuming that was the next step as nothing prompted me.
> Only GPU-Z had the new bios revision # which I forgot to compare to old screen shots so I could only assume I had it successfully updated by trying to crash it.


The exe has NVFlash and the BIOS ROM file embedded. It extracts the components to a temp directory and runs a script to automate the entire update process.

NVFlash will stop without changing anything if the card is the wrong model, as determined by its VID/PID identifier.

A reboot after a BIOS update is always necessary; the card reads the new firmware instructions at start-up. The script should probably have told you that or prompted for a restart, but they obviously forgot that bit.

The drivers you were using with the old BIOS will operate as before, since the PCIe identifiers are the same after the update. As far as UEFI and Windows are concerned, you still have the same hardware.


----------



## gtbtk

Quote:


> Originally Posted by *BroPhilip*
> 
> Lets add more confusion to the MSI bios release.... it has been updated on the Web site again with a release date for today....I don't remember the file size of the last one.


I would say that the Micron checkerboarding is the problem it addresses. The BIOS is fine for updating Micron cards, or at least it is on my card.


----------



## Mr-Dark

Updated my cards now.. the BIOS update tool updated both cards in SLI automatically









Stock bios



New bios


----------



## BroPhilip

Quote:


> Originally Posted by *gtbtk*
> 
> I would say that micron checkerboards are a problem. The bios is fine to update Micron cards or at least it is on my card


What's odd is that I got a checkerboard after the BIOS update when pushing the memory OC.


----------



## khanmein

Quote:


> Originally Posted by *BroPhilip*
> 
> What's odd is I got a checkerboard after the bios update, when pushing the memory oc


That's minor; no need to worry. A lot of people have said they see no white checkerboard after updating the vBIOS.


----------



## asdkj1740

Quote:


> Originally Posted by *BroPhilip*
> 
> What's odd is I got a checkerboard after the bios update, when pushing the memory oc


If you push your Micron too hard you will still get the checkerboard.
The BIOS seems to alleviate the checkerboard problem, not eliminate it.


----------



## Dude970

Has anyone here with Samsung Vram tried the MSI BIOS?


----------



## BroPhilip

Removed


----------



## emsj86

Reading this forum scares me off getting a 1070. It seems every card I want (and I need one that supports a water block), whether EVGA, MSI, or Gigabyte, has problems and/or doesn't overclock much, if at all.


----------



## gtbtk

Quote:


> Originally Posted by *emsj86*
> 
> Reading this forum scares me for getting a 1070. Seems as every card that I want (as I need one that supports a water block) EVGA, Msi, gigabyte all have problems and or don't over clock much if at all


The Micron memory problems have been solved for most brands by BIOS updates. That includes EVGA and MSI, and there is a BIOS for Gigabyte that solves the problems on the G1 and Xtreme cards. Zotac is about the only brand that has not yet resolved the memory issue with a BIOS update.

The EVGA heat problems have a solution. New cards will come with extra thermal pads and a revised fan curve that solves the problem, and existing cards will be retrofitted either by EVGA or by the user.

You have come along after all the dramas have pretty much been resolved.

Yes, there has been a lot of discussion delving into the Nvidia secret sauce, but don't be dissuaded. In spite of a few problems that you can work around anyway, the 1070s are great cards, and I am really happy with mine.


----------



## Dude970

Quote:


> Originally Posted by *gtbtk*
> 
> Micron memory problems have been solved for most brands by bios update. That includes EVGA, MSI and there is a bios for Gigabyte that solves problems with the G1 and xTreme cards. Zotac is about the only brand who has not resolved the memory issue by releasing a bios update as yet.
> 
> EVGA heat problems have a solution. New cards will come with extra heat pads and a revised fan curve that solves the problem and existing cards will be retrofitted either by EVGA or the user themselves.
> 
> You have come along after all the dramas have pretty much been solved.
> 
> Yes there has been a lot of discussion delving into the nvidia secret sauce but don't be dissuaded. In spite of a few problems that you can work around anyway, the 1070s are great cards. In spite of the dramas, I am really happy with mine.


+1 Well said. The 1070's are great cards


----------



## emsj86

I wonder if a card bought now, say an EVGA FTW, will already have the memory fix. Heat won't be a problem, as an EK water block will be installed.


----------



## gtbtk

Quote:


> Originally Posted by *emsj86*
> 
> I wonder if buying a card now say EVGA FtW will allready have memory fix. Heat won't be a problem as a ek water block would be installed


If it doesn't, because it's older stock, you can do the BIOS update yourself to fix it. You may still want to apply the thermal pad between the backplate and the PCB even if you are under water.

Very easy; it takes about a minute.

The only problem I can see with the EVGA cards is that they have limited power limits: the FTW is only 226W max.

I assume you are talking about a custom loop. Unless you have bought your water block already, what about this card?

https://www.msi.com/Graphics-card/GeForce-GTX-1070-SEA-HAWK-EK-X.html#hero-overview


----------



## Prothean

Quote:


> Originally Posted by *BroPhilip*
> 
> Lets add more confusion to the MSI bios release.... it has been updated on the Web site again with a release date for today....I don't remember the file size of the last one.


I just flashed this newer BIOS, and it's the same BIOS version and date as the one from yesterday.

The only difference is that it comes with extra documentation (a PDF guide), extra error checking (the batch file checks whether you're running 64-bit Windows), and it even reboots automatically after flashing.

It flashes the same BIOS as yesterday.


----------



## emsj86

Quote:


> Originally Posted by *gtbtk*
> 
> If it doesn't cause it is older stock, you can do the bios update yourself to fix it. You may want to still apply the heat pad between the backplate and the PCB even if you are under water.
> 
> Very easy, takes about 1 minute.
> 
> The only problem I can see with the EVGA cards is that they have limited power limits: the FTW is only 226W max.
> 
> I assume you are talking about a custom loop. Unless you have bought your waterblock already what about this card?
> 
> https://www.msi.com/Graphics-card/GeForce-GTX-1070-SEA-HAWK-EK-X.html#hero-overview


Haven't bought the card yet; going to buy one this weekend. I want the MSI one but I'm not a fan of the dragon on it. I'd prefer the block for the EVGA cards, but I may just have to cover up the dragon or get the original MSI Gaming X water block. Rig is in my profile.


----------



## gtbtk

Quote:


> Originally Posted by *emsj86*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> If it does, because it is older stock, you can do the BIOS update yourself to fix it. You may still want to apply the heat pad between the backplate and the PCB even if you are under water.
> 
> Very easy, takes about 1 minute.
> 
> The only problem I can see with the EVGA cards is that they have limited power limits; the FTW is only 226W max.
> 
> I assume you are talking about a custom loop. Unless you have bought your waterblock already what about this card?
> 
> https://www.msi.com/Graphics-card/GeForce-GTX-1070-SEA-HAWK-EK-X.html#hero-overview
> 
> 
> 
> Haven't bought the card yet. Going to buy one this weekend. I want the MSI one, but I'm not a fan of the dragon on it. I prefer the block for EVGA cards, but I may have to just cover up the dragon or get the original MSI Gaming X water block. Rig is in my profile
Click to expand...

Of course it is up to you. The dragon on the block will be facing the floor, so you won't see it anyway.

I just think that the performance of the EVGA cards has been nerfed a bit, as the card constantly hits the power limit and throttles itself back. Possibly they did that because they were trying to manage/cover up the VRM heat issues that have come back to bite them?

Unfortunately, at this time there is no BIOS editor to open up the power limits either, so the EVGA cards are stuck, and water blocks won't really help with power limits.
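
For anyone new to how the power limit interacts with boost: the slider in Afterburner/Precision scales the card's base TDP, and GPU Boost throttles clocks whenever measured board power reaches that ceiling. A minimal sketch of the arithmetic — the 215W/105% figures are illustrative assumptions chosen to land near the 226W limit discussed above, not official EVGA specs:

```python
# Sketch of Pascal's power-limit behaviour. Figures are illustrative.

def power_ceiling_watts(base_tdp_w: float, limit_percent: float) -> float:
    """Board power ceiling for a given power-limit slider setting."""
    return base_tdp_w * limit_percent / 100.0

def is_power_throttled(board_power_w: float, base_tdp_w: float,
                       limit_percent: float) -> bool:
    """True if the card is pinned at (or above) its power ceiling."""
    return board_power_w >= power_ceiling_watts(base_tdp_w, limit_percent)

# Example: a 215W base TDP with the slider maxed at 105% tops out
# around 226W, so a card drawing 226W will be dropping boost bins.
print(round(power_ceiling_watts(215, 105)))       # 226
print(is_power_throttled(226, 215, 105))          # True
```

This is why a water block alone doesn't help here: it removes the thermal limit, but the power ceiling still caps the clocks.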


----------



## EDK-TheONE

Quote:


> Originally Posted by *emsj86*
> 
> Reading this forum scares me for getting a 1070. Seems as every card that I want (as I need one that supports a water block) EVGA, Msi, gigabyte all have problems and or don't over clock much if at all


Buy the Zotac AMP Extreme. It is rock solid stable @ 2150MHz during gaming with 50% fan RPM, without noise and with a low temp (60C)

Also at stock (2000MHz) with 30% RPM the temp is 62C


----------



## emsj86

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Buy the Zotac AMP Extreme. It is rock solid stable @ 2150MHz during gaming with 50% fan RPM, without noise and with a low temp (60C)


There is no water block for it. They have their own, but it's made with aluminum, which will cause corrosion


----------



## madmeatballs

Quote:


> Originally Posted by *emsj86*
> 
> There is no water block for it. They have their own, but it's made with aluminum, which will cause corrosion


I know alphacool has one.

Here


----------



## Gurkburk

Quote:


> Originally Posted by *Roland0101*
> 
> Is that afterburner or effective?
> Same question. + what Card is that.


I'm using Afterburner.

Also, I'm running the Fire Strike stress test in 3DMark; it's around 65°C and it's not crashing. Might have been the BIOS update?

Edit: it did crash after I posted this. But after testing again and maxing the fan speed, keeping the temp around 54°C, it's been running a while now.

Edit2: I've lowered my core clock a little, from 100ish to 20 now, but my memory clock is at +640 and seems to be passing the Fire Strike stress test so far. Still have a few runs at +20 to go. Hope for the best!

Edit3: Are there any good aftermarket air coolers out there? I had the 3-fan Accelero Xtreme for my GTX 780. I couldn't find an updated version; will it fit my 1070 as well, perhaps?


----------



## DeathAngel74

I have a question. Since I've taken 2 BIOS revisions from EVGA and done the thermal pad mod, should I leave dwm.exe and explorer.exe set to max performance in NVCP, or can I set Optimal Power or Adaptive as the global profile now? Idle temps are 33-34C, gaming 44-47C. Just wondering if leaving it at the 1594 base clock 24/7 will cause issues later in the future, or am I safe? TIA


----------



## Dude970

I would say you are good to switch to adaptive now.


----------



## DeathAngel74

Thanks. If I leave it the way it is, will there be any damage over time? The only reason I ask is I like the speed, lol. It's been set like this for about 3 weeks, after I bought the card and read about the Micron issues here. The card has never run at idle clocks for more than the first five or ten minutes after I installed it.


----------



## Dude970

That is just a matter of power savings. The GPU will run reliably for a long time, even at the higher level. Could it shorten the lifespan? I would say yes, but it will still last for years


----------



## DeathAngel74

Thank you.


----------



## asdkj1740

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Buy the Zotac AMP Extreme. It is rock solid stable @ 2150MHz during gaming with 50% fan RPM, without noise and with a low temp (60C)
> 
> Also at stock (2000MHz) with 30% RPM the temp is 62C


Could you run FurMark at 1080p, 0xAA, with 100% fan speed to see whether your card throttles? Thanks


----------



## Samurai707

Quote:


> Originally Posted by *emsj86*
> 
> There is no water block for it. They have there own but it's made with aluminum which will cause corrosion


I have the MSI 1070 Gaming X. Mine, fortunately, was the last one at Central Computers in downtown San Francisco, and it also has Samsung memory

Knock on Ikea wood, but mine's going strong after some recent driver updates on Win 10, and I have little to no issues in major games due to the card itself.

Can't complain when clocking at the upper end of the core clock spectrum shown before grabbing my water block!


----------



## Joeyclown

I recently purchased an EVGA 1070 FTW a few weeks ago (before all the heating issues came up).

I have a question regarding running these cards in SLI. I have never run anything in SLI or CrossFire before, and I've always heard, and seen first hand, how the top card naturally gets hotter just from the heat of the card below it.

My question is: would it be a smart idea to get, say, the Hybrid version of the EVGA 1070 to help keep the cards cool? Or would I be better off just getting another FTW edition and saying screw it for now?

Just curious as to other people's input, due to the fact that I recently had 200 dollars' worth of Amazon gift cards fall into my lap, basically for free.

I debated putting it aside and saving it for when I do my next upgrade. But my i5 2500K and 16 gigs of 1866 RAM are still getting the job done with anything I need to do right now. Adding the 1070 was a big improvement in itself, so that's why I'm debating throwing the gift cards at another 1070.

Just to give an idea, I'm running Battlefield 1 at high settings and getting anywhere from 85-110 fps at 1440p. I'm sure I may get a few extra fps by upgrading the CPU; I'm just not sure if it would be worth it at the moment.

Any info and input would be greatly appreciated. Thanks guys!


----------



## gtbtk

Quote:


> Originally Posted by *Joeyclown*
> 
> I recently purchased a EVGA 1070 FTW few weeks ago (before all the heating issues came up).
> 
> I have a question regarding running these cards in SLI. I have never ran anything in SLI nor crossfire before. And I've always heard and seen first hand how top cards get hotter naturally just by the added temps from the card below it.
> 
> My question is, would it be a smart idea to get say, the hybrid version of EVGA 1070 to help with keep card cool? Or would I be better of just getting another FTW edition and say screw it for now?
> 
> Just curious as to other peoples input due to the fact i recently had 200 dollars worth of amazon gift cards fall into my lap, for free basically.
> 
> I debated about putting it aside and saving them for when i do my new upgrade. But, my I5 2500k and 16 gigs 1866 ram are still getting the job done with anything i need to do with it right now. Adding the 1070 was a big improvement in itself. So, thats why im debating about throwing the gift cards at another 1070.
> 
> just to give an idea, I'm running battlefield 1 at high settings and getting anywhere from 85-110 fps at 1440P. I'm sure i may get a few extra fps by upgrading the CPU, I'm just not sure if it would worth it at the moment.
> 
> Any info and input would be greatly appreciated. Thanks guys!


The Hybrid in the top slot would certainly be a benefit


----------



## _Killswitch_

Been playing with my GTX 1070 a little, Max I can go on core and didn't really play with memory much. Would say my 2550K is my limiting factor, need to finish my new pc =S


----------



## TylerAD

Quote:


> Originally Posted by *_Killswitch_*
> 
> Been playing with my GTX 1070 a little, Max I can go on core and didn't really play with memory much. Would say my 2550K is my limiting factor, need to finish my new pc =S


Nice clocks, but I doubt you will be able to play games at that core speed. I would bet that BF1 (or another really taxing game) would crash it within a few minutes. Most cards can bench around the 2.1GHz mark, but few if any can play extended games at those speeds. I can't really pass 2075 for continued gaming.


----------



## _Killswitch_

Well, I don't play BF so I can't test that game. I'm playing Dishonored at the moment so that when I buy Dishonored 2 I'll know the story. I don't know how taxing Dishonored is, but I'll give it a try at those clocks for giggles in a little bit


----------



## Gurkburk

I'm running +15 CC and + 600 memory atm.


----------



## DeathAngel74

I'm playing Star Wars Battlefront, TW3 and Batman AK @ 2101/4363MHz. Anything past that and things go downhill quick


----------



## _Killswitch_

I was able to play Dishonored at 2190/4404MHz with graphics settings turned up as high as they can go for a little under an hour without any issues. I'm only doing this out of curiosity; I won't run my GPU like this 24/7. The GTX 1070 is enough for me at stock settings, but I'm just having a little fun I guess.

Edit: I do have Crysis 3; I can see if it will hold up on that


----------



## GunnzAkimbo

Got an upgrade setup.

I like the MSI Quick Silver. Prefer the Zotac AMP Extreme monster.


----------



## khanmein

What's wrong with MSI? The non-OC version is more expensive than the OC version, for real???


https://www.amazon.com/MSI-GAMING-GTX-1070-8G/dp/B01GXOX3SW/ref=sr_1_1?ie=UTF8&qid=1478315364&sr=8-1&keywords=MSI%2BGaming%2BGeForce%2BGTX%2B1070&th=1


----------



## GunnzAkimbo

Quote:


> Originally Posted by *khanmein*
> 
> what's wrong with MSI? non OC is more expensive than OC version for really???
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.amazon.com/MSI-GAMING-GTX-1070-8G/dp/B01GXOX3SW/ref=sr_1_1?ie=UTF8&qid=1478315364&sr=8-1&keywords=MSI%2BGaming%2BGeForce%2BGTX%2B1070&th=1


The zotac can be used as a zombie apocalypse weapon.


----------



## EDK-TheONE

Quote:


> Originally Posted by *_Killswitch_*
> 
> I was able to play Dishonored at 2190/4404mhz with graphic settings turn up high as they can go for little under an hour without any issues. I only doing this out of curious, I won't run my gpu like this 24/7, the GTX 1070 is enough for me at stock settings but just having little fun i guess.
> 
> Edit: I do have Crysis 3 i can see if it will hold up on that


Nvidia decreases performance above 2150MHz if your temp is higher than 45C.
Nvidia uses a new, undocumented scheme in Pascal (voltage, frequency and temperature are all factored in), so going to a higher frequency doesn't necessarily make the card stronger.
Please post an FS benchmark.

this is my bench: http://www.3dmark.com/fs/10573019 with i3-6100.


Graphics Score 21 874
Physics Score 6 379
Combined Score 3 988


----------



## asdkj1740

Quote:


> Originally Posted by *EDK-TheONE*
> 
> nvidia decrease performance higher than 2150mhz if your temp higher than 45 c.
> nvidia use new secret pattern (voltage/frequency/temperature are used) in pascal so if go higher freq. that means it is not to be stronger.
> please post FS benchmark.
> 
> this is my bench: http://www.3dmark.com/fs/10573019 with i3-6100.
> 
> 
> 
> 
> 
> 
> 
> 
> Graphics Score 21 874
> Physics Score 6 379
> Combined Score 3 988


That's GPU Boost 3.0 at work.
Nvidia even locks the highest core and VRAM speeds of the 1050 Ti to 1911/2000.
GPU Boost 3.0 restricts cards from running faster than Nvidia's expectations.
My card gets a lower FS Ultra score when my core clock exceeds 2100MHz.
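
A rough mental model of what's being described: on Pascal, GPU Boost 3.0 steps the effective clock down in ~13MHz "bins" as the core crosses certain temperature thresholds, regardless of your offset. The thresholds below are assumed, loosely based on owner observations rather than anything Nvidia publishes:

```python
# Illustrative model of GPU Boost 3.0 temperature binning.
# Thresholds and bin counts are assumptions, not published values.

BIN_MHZ = 13  # one boost bin

# (temperature threshold in C, total bins dropped once exceeded)
TEMP_STEPS = [(37, 1), (46, 2), (54, 3), (63, 4)]

def effective_clock(max_boost_mhz: int, temp_c: float) -> int:
    """Boost clock after temperature-based bin reductions."""
    dropped = 0
    for threshold, bins in TEMP_STEPS:
        if temp_c > threshold:
            dropped = bins
    return max_boost_mhz - dropped * BIN_MHZ

print(effective_clock(2150, 30))  # cool card: full 2150
print(effective_clock(2150, 50))  # 2124: two bins lower
```

This is why a card that benches at 2150MHz cold can score worse once it warms up: the reported offset hasn't changed, but the delivered clock has.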


----------



## Bold Eagle

Gainward Phoenix GTX1070 has a new BIOS: 86.04.50.00.4D - from their site "p00983_bios_16215812ae37d783a"


----------



## Roland0101

Quote:


> Originally Posted by *khanmein*
> 
> what's wrong with MSI? non OC is more expensive than OC version for really???
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.amazon.com/MSI-GAMING-GTX-1070-8G/dp/B01GXOX3SW/ref=sr_1_1?ie=UTF8&qid=1478315364&sr=8-1&keywords=MSI%2BGaming%2BGeForce%2BGTX%2B1070&th=1


Amazon does the pricing on their Website, not MSI.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> what's wrong with MSI? non OC is more expensive than OC version for really???
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.amazon.com/MSI-GAMING-GTX-1070-8G/dp/B01GXOX3SW/ref=sr_1_1?ie=UTF8&qid=1478315364&sr=8-1&keywords=MSI%2BGaming%2BGeForce%2BGTX%2B1070&th=1


Maybe everyone is buying the Gaming 8G card, creating higher demand, because they worked out that you can just flash it with the Z BIOS and get a faster card for what they think is a "cheaper" price.


----------



## TheGlow

Quote:


> Originally Posted by *Roland0101*
> 
> Amazon does the pricing on their Website, not MSI.


Don't lie to the boy. We all know the prices MSI picks are forced on Amazon, influenced by the season's harvest and the successful transport of orange juice and other commodities, depending on the Illuminati's objectives that month.
Quote:


> Originally Posted by *gtbtk*
> 
> maybe everyone is buying the Gaming 8G card creating a higher demand because they worked out that you can just flash it with the Z bios and get a faster card for what they think is a "cheaper" price.


In what ways? I have the Gaming X, so what benefits would I get with a Z bios? Higher voltage limits?


----------



## Gurkburk

So let me get this straight... GPU Boost 3.0 ruins any possibility of overclocking the core clock once a certain pre-decided temperature is reached?

This explains why my 1070 crashes when going above 63°C.


----------



## gtbtk

> Quote:TheGlow
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> maybe everyone is buying the Gaming 8G card creating a higher demand because they worked out that you can just flash it with the Z bios and get a faster card for what they think is a "cheaper" price.
> 
> 
> 
> In what ways? I have the Gaming X, so what benefits would I get with a Z bios? Higher voltage limits?
Click to expand...

If you are manually overclocking the card anyway, nothing.

If you want to be lazy and not bother learning how to OC, or actually OCing, you get a faster card at stock settings.


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> maybe everyone is buying the Gaming 8G card creating a higher demand because they worked out that you can just flash it with the Z bios and get a faster card for what they think is a "cheaper" price.


Maybe more and more people have figured out that the only difference between the Gaming, Gaming X and Gaming Z is the BIOS, which can be cross-flashed safely.


----------



## backie

Just flashed the new MSI BIOS to my Gaming X and managed +500 on memory easily; before, it checkerboarded at +400. Got a nice 21k graphics score in Fire Strike now

http://www.3dmark.com/3dm/15878648


----------



## duganator

How's everyone's GPU usage in Battlefield 1? Mine seems to spike quite a bit; I can't even come close to maintaining 90%+ GPU usage.


----------



## backie

+500 seems to be the max; after that it gets artifacts with prolonged use, and +700 crashed my graphics driver. I didn't actually expect the BIOS to make this much difference. Very pleased


----------



## Mr-Dark

Quote:


> Originally Posted by *duganator*
> 
> How's everyone's GPU usage in battlefield one? Mine seems to spike quite a bit, I can't even come close to maintaining 90%+ GPU usage.


Which CPU do you have there? BF1 is super demanding on the CPU.

At 1440p with one card my GPU usage is constantly over 95%, while with SLI on I see 80%+ all the time. That's on a 5960X @ 4.2GHz


----------



## Dude970

Quote:


> Originally Posted by *backie*
> 
> Just flashed the new MSI bios to my gaming x and managed +500 on memory easily, before it checkerboarded on +400. Got a nice 21k graphics score on firestrike now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/15878648


Quote:


> Originally Posted by *duganator*
> 
> How's everyone's GPU usage in battlefield one? Mine seems to spike quite a bit, I can't even come close to maintaining 90%+ GPU usage.


Mine stays at 99%


----------



## duganator

I have an E5-2660 v2. I'm thinking the low clock speed is slowing me down. I may have to throw the 3930K back in
Quote:


> Originally Posted by *Mr-Dark*
> 
> which cpu you have there ? bf1 is super demand on the cpu..
> 
> at 1440p and one card my gpu usage is over 95% constant while with sli on i see +80% all time.. that on 5960x @4.2ghz


----------



## Mr-Dark

Quote:


> Originally Posted by *duganator*
> 
> I have an e5 2660v2. I'm thinking the low clock speed is slowing me down. I may have to throw the 3930k back in


The 3930K at 4.5GHz should easily beat that, as 12 threads at 4.5GHz are better than 20 threads at 3GHz for games


----------



## duganator

It actually only boosts to 2.6 on all cores. I'm really debating picking up a Xeon with a faster clock speed
Quote:


> Originally Posted by *Mr-Dark*
> 
> the 3930k at 4.5ghz should easily beat that as 12 thread at 4.5ghz better than 20 thread at 3ghz for games


----------



## SuperZan

Quote:


> Originally Posted by *Mr-Dark*
> 
> the 3930k at 4.5ghz should easily beat that as 12 thread at 4.5ghz better than 20 thread at 3ghz for games


Empirically without having taken down any numbers or made any charts, the 4.6GHz 3930k is still managing frames very well at 1080p with my 1070. No complaints whatsoever with Battlefield 1, Skyrim SE modded to the brim, StarCraft II, Witcher 3, The Secret World, Smite, TW:Warhammer, or Fallout 4. I think that's a fairly good cross-section of games so I'd say IME SB-e still has plenty of life left and handles the 1070 without issue.


----------



## duganator

Yeah, I've thought about going x99, but the x79 platform still has tons of life left
Quote:


> Originally Posted by *SuperZan*
> 
> Empirically without having taken down any numbers or made any charts, the 4.6GHz 3930k is still managing frames very well at 1080p with my 1070. No complaints whatsoever with Battlefield 1, Skyrim SE modded to the brim, StarCraft II, Witcher 3, The Secret World, Smite, TW:Warhammer, or Fallout 4. I think that's a fairly good cross-section of games so I'd say IME SB-e still has plenty of life left and handles the 1070 without issue.


----------



## gtbtk

Quote:


> Originally Posted by *backie*
> 
> +500 seems to be the max after that it gets artifacts after prolonged use, +700 crashed my graphics driver didnt actualy expect the bios to make this much difference, very pleased


Isn't it amazing how well the memory works when it is configured properly?


----------



## DeathAngel74

I was able to manage +726, just barely...


----------



## _Killswitch_

Well, these are the max core and memory clocks I could run Fire Strike at. Almost broke 21K in graphics score


http://www.3dmark.com/3dm/15880508


----------



## Dude970

Quote:


> Originally Posted by *_Killswitch_*
> 
> Well these are max Core and Memory clocks i could run Firestrike at. Almost broke 21K in graphic score
> 
> 
> http://www.3dmark.com/3dm/15880508


and nice wallpaper too


----------



## _Killswitch_

Thanks, my wallpaper's usually the only pretty thing about my PCs lol


----------



## Mr-Dark

Quote:


> Originally Posted by *_Killswitch_*
> 
> Well these are max Core and Memory clocks i could run Firestrike at. Almost broke 21K in graphic score
> 
> 
> http://www.3dmark.com/3dm/15880508


That's big enough to call decent. Oh, the OC... lol


----------



## Dude970

Quote:


> Originally Posted by *Dude970*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Mine stays at 99%


Just checked, 99% for campaign, seems more like 90 in MP


----------



## _Killswitch_

Well, off the topic of my wallpaper, come on OCN, focus! My EVGA GTX 1070 has performed better than I was expecting, even with Micron memory. I was able to play Crysis 3 with graphics on max with my GPU OC'ed +280MHz core and +400MHz memory.

So overall, very pleased. I was worried because it seemed the card I bought was an unloved one or something lol


----------



## Roland0101

Quote:


> Originally Posted by *TheGlow*
> 
> Dont lie to the boy. We all know the prices MSI picks are forced on Amazon, influenced by the the season's harvest and successful transport of orange juice and other commodities, depending on the Illuminati's objectives that month.


Damn, that was supposed to be a secret. How do you know?


----------



## Dude970

Quote:


> Originally Posted by *_Killswitch_*
> 
> well off the topic of my wallpaper , come on OCN focus! My EVGA GTX 1070 has performed better than I was expecting even with Micron memory. I was able too play Crysis 3 with graphics on Max with my gpu Oc'ed +280Mz Core and 400Mhz memory.
> 
> so over all very pleased and was worried bc seems card I bought was an unloved one or something lol


Even at stock settings the 1070's play games great


----------



## Forceman

Quote:


> Originally Posted by *Roland0101*
> 
> Damn, that was supposed to be a secret. How do you know?


For a secret society, the Illuminati suck at keeping secrets.


----------



## BroPhilip

All these reports of increased memory OCs are making me feel bad... my memory OC was +400 before the BIOS update, and after it, it is still +400. Odd how this works... oh well, back to The Division. I have gear to collect!


----------



## EDK-TheONE

Finally Got 22K GS on FS
http://www.3dmark.com/3dm/15885784

Graphics Score 22 069
Physics Score 6 937
Combined Score 4 357


----------



## rfarmer

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Finally Got 22K GS on FS
> http://www.3dmark.com/3dm/15885784
> 
> Graphics Score 22 069
> Physics Score 6 937
> Combined Score 4 357


Very nice, you have a good card there.


----------



## Dude970

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Finally Got 22K GS on FS
> http://www.3dmark.com/3dm/15885784
> 
> Graphics Score 22 069
> Physics Score 6 937
> Combined Score 4 357


Well done


----------



## _Killswitch_

Well, by lowering my core clock a little and upping my memory clock I was able to break 21K.

http://www.3dmark.com/3dm/15886691

Edit: this is the best it will do; any higher on memory and it starts going bonkers.

http://www.3dmark.com/3dm/15887761


----------



## TheGlow

Anyone have suggestions on confirming G-Sync is working? Using a Dell S2716DG. I have it enabled, my monitor says it's on, and I even found the G-Sync indicator that says it's on. Capped frames at 142fps, and in Overwatch I swear I'm still seeing tearing. Very minor, but I see it. I even downloaded the G-Sync pendulum demo and I can see some flickering at random times.


----------



## Roland0101

Quote:


> Originally Posted by *TheGlow*
> 
> Anyone have suggestions on confirming GSync is working? Using a Dell S2716DG. I have it enabled, my monitor says its on. I even found the gsync indicator that says its on. Capped frames at 142fps and in overwatch I swear I'm still seeing tearing. Very minor, but I see it. I even dled the gsync pendulum demo and i can see some flickering at random times.


Did you try a little bit less, a 140fps cap for example?


----------



## DeathAngel74

Or 120fps, although it may be counter-productive


----------



## gtbtk

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Finally Got 22K GS on FS
> http://www.3dmark.com/3dm/15885784
> 
> Graphics Score 22 069
> Physics Score 6 937
> Combined Score 4 357


How do you have your Zotac card set up? That really is an impressive result.

Could you please post some screenshots of your OC utility settings?


----------



## gtbtk

Quote:


> Originally Posted by *TheGlow*
> 
> Anyone have suggestions on confirming GSync is working? Using a Dell S2716DG. I have it enabled, my monitor says its on. I even found the gsync indicator that says its on. Capped frames at 142fps and in overwatch I swear I'm still seeing tearing. Very minor, but I see it. I even dled the gsync pendulum demo and i can see some flickering at random times.


are you still running your memory OC at +800? Do you see the tearing if you run the memory at +450 or +500 instead?


----------



## Roland0101

Quote:


> Originally Posted by *DeathAngel74*
> 
> Or 120fps, although it may be counter-productive


If G-Sync works, it won't be counter-productive.


----------



## DeathAngel74




----------



## syl1979

Quite difficult to beat a 22k graphics score in Fire Strike. I need to push the card to 2126 core, 2330 memory... not really stable, but it passes the test...
http://www.3dmark.com/fs/10685342


----------



## _Killswitch_

Quote:


> Originally Posted by *syl1979*
> 
> Quite difficult to beat 22k graphic score on firestrike. I need to push the card to 2126 core, 2330 memory.... not really stable but it pass the test....
> http://www.3dmark.com/fs/10685342


Well, I'm slightly confused. I have a 2550K and had my GPU clocked higher, yet my overall/graphics score isn't close to that, so what's the difference here?


----------



## syl1979

I score less at 2150 than at 2126 — some stability issue, I guess; errors at higher frequencies will reduce the score. I also found a better score without the voltage lock (the GPU can cool down a bit between tests)


----------



## Pittster

Just a heads up: MSI released their updated BIOS for the 1070s. I have the Gaming 8GB and have gone from a +220 mem OC to +500 (4512MHz).

Testing lots, stable so far; any higher and it's not so stable. No black squares etc. so far. Core is at 2100. Happy with the improvement

http://www.3dmark.com/fs/10682249
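
The GDDR5 clock bookkeeping in these posts is confusing because three different numbers describe the same thing: GPU-Z reports the command clock (2002MHz stock on a reference 1070), Afterburner displays twice that (4004MHz, which is where figures like "4512MHz" come from), and the marketing "8008MHz" is the effective data rate — four transfers per command-clock cycle. A quick converter, using reference-card figures (factory-OC cards like the Gaming 8G evidently start slightly higher, e.g. 4012 in Afterburner):

```python
# GDDR5 clock conversions for the GTX 1070. Stock figures are for a
# reference card; factory-OC cards start from a higher base.

def effective_rate_mts(afterburner_mhz: float) -> float:
    """Afterburner-displayed memory clock -> effective MT/s."""
    return afterburner_mhz * 2

def apply_offset(base_afterburner_mhz: float, offset_mhz: float) -> float:
    """Afterburner's memory offset adds directly to its displayed clock."""
    return base_afterburner_mhz + offset_mhz

stock = 4004                   # reference 1070 as shown in Afterburner
oc = apply_offset(stock, 500)  # the +500 offset discussed above
print(oc)                      # 4504
print(effective_rate_mts(oc))  # 9008 MT/s effective
```

So a "+500" offset is roughly a 12% memory overclock, which matches the Fire Strike graphics-score gains people are reporting after the BIOS fix.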


----------



## khanmein

@gtbtk how's the new vBIOS? Any feedback? Thanks. *Is the new vBIOS for Samsung, Micron or both?

https://www.msi.com/Graphics-card/support/GEFORCE-GTX-1070-GAMING-X-8G.html#down-bios


----------



## Gurkburk

Any changelog for that BIOS update?

Would love it if Gigabyte released something as well.


----------



## macwin2012

Hey guys,

I recently got a *Zotac GTX 1070 AMP Extreme*! Currently using the following settings:
Zotac: Firestorm with advanced fan settings
Core: +200
Memory: +600
Voltage: 100%
Power: 120%
GPU: 92C max

How is my performance? I could probably squeeze 1-1.5 fps more, but I felt some headroom is good for stability


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> @gtbtk how's the new vbios? any feedback? thanks. *the new vbios is for samsung or micron or both?
> 
> https://www.msi.com/Graphics-card/support/GEFORCE-GTX-1070-GAMING-X-8G.html#down-bios


It resolves the issue. No more checkerboard artifacts or Video Scheduler BSODs. The MSI BIOS is listed for anyone having "problems" with their card.

Questions about vBIOS versions for specific RAM types should be taken up with the vendor. I am not in a position to give a definitive answer; however, Nvidia supplies the core code for all these BIOS versions and this bug is fixed in the core, so a .50 BIOS should solve the Micron problem. The BIOS interfaces with the memory controller, not the memory chips themselves, so the BIOS should work for both types of RAM. I don't know why Gigabyte is publishing memory-brand-specific BIOSes like they are, because the controller is the same for both cards and is built into the GPU silicon.

My Gaming X now tops out at +550 stable; with careful voltage management I could get that before the update, but with major artifacts.

I have been playing around with other BIOSes on my card and am currently running the ASUS Strix OC BIOS. It seems more stable, consistent and reliable at high overclocks than the original MSI BIOS, so I will leave it on for a while.

I tried the Palit 1671MHz OC BIOS with the factory-overclocked RAM and could not get it stable at anything above stock; maybe my chip is not binned high enough? I have not tried the new Gigabyte Xtreme BIOS as yet. The old .26 one gave me stability problems similar to the old Palit version, so I am not hopeful.


----------



## gtbtk

Quote:


> Originally Posted by *macwin2012*
> 
> Heys Guys ,
> 
> I recently got *Zotac Gtx 1070 AMP extreme* ! Currently using following settings :
> Zotac : Firestorm with Advance Fan settings
> Core : +200
> Memory : +600
> Voltage : 100%
> Power : 120%
> GPU : 92C Max
> 
> How is my performance ? i could probably squees probably 1-1.5 fps more but i felt some headroom is good for stability


Those results are about what I can get from my card when overclocked as well, so they look OK. You could try playing with the curve in Afterburner instead of using the core slider, and it might get you a little more. Try starting with the core clock slider at 0 and only increase the 1.093V and 0.975V points to, say, 2100 and 1999 respectively, and see how that works for you. Make minor adjustments either way to see if that can fine-tune the settings.

Having said that, high settings that work in Heaven will not always work in, say, Fire Strike, so use a variety of benchmark tools.
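
The per-point curve editing described above can be pictured as editing a small voltage-to-frequency table: GPU Boost 3.0 lets you raise individual (voltage, clock) points instead of shifting the whole curve with one offset. A sketch of the idea — the curve values below are made up for illustration, not a dump from a real card:

```python
# Toy model of Afterburner's V/F curve editor (Ctrl+F window).
# Curve values are illustrative, not from a real 1070.

curve = {  # voltage (V) -> target clock (MHz)
    0.900: 1850,
    0.975: 1911,
    1.043: 1999,
    1.093: 2050,
}

def set_point(curve: dict, voltage: float, clock_mhz: int) -> None:
    """Raise one point, keeping the curve monotonically non-decreasing."""
    curve[voltage] = clock_mhz
    # clamp any lower-voltage point that now exceeds the new one
    for v in sorted(curve):
        if v < voltage and curve[v] > clock_mhz:
            curve[v] = clock_mhz

# Leave the global slider at 0 and bump only the two suggested points:
set_point(curve, 1.093, 2100)
set_point(curve, 0.975, 1999)
print(curve[1.093], curve[0.975])  # 2100 1999
```

The advantage over the global slider is that the low-voltage points (where instability usually shows up under load) stay conservative while the top of the curve gets the extra clocks.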


----------



## Gurkburk

The best score I've gotten was 2661


----------



## QPSS

Just flashed my Inno3D X3 with their new BIOS, and now instead of +250 I can do +550.

I also don't get checkerboards anymore, even if I OC too high; instead it's now large black blocks. So I guess it's fixed.
Boost is still only 2000 stable, but that might change once I replace the TIM and thermal pads with more expensive stuff, which always gets me about an additional 50-75MHz on the GPU and more on the VRAM.


----------



## Roland0101

Quote:


> Originally Posted by *Gurkburk*
> 
> Any changelog to that bios update?
> 
> Would love if Gigabyte released something as well.


They did. http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios
(Or were you only referring to a detailed changelog?)

Quote:


> Originally Posted by *_Killswitch_*
> 
> Well im slightly confused, I have a 2550K and had my GPU clocked higher yet my score over-all/ graphic score isn't close to that so whats the difference here?


How high is your 2550K overclocked? Pascal's error correction probably plays a role too.


----------



## Gurkburk

Quote:


> Originally Posted by *Roland0101*
> 
> They did. http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios
> (Or did you only refereed to a detailed change-log?)
> How high is your 2550K overclocked? The pascal error correction probably plays a role too.


A detailed changelog for the MSI one. I know Gigabyte released that update, but it didn't do anything to my clocks that I have noticed. Before and after, I get +100 CC and +550 mem.


----------



## asdkj1740

Quote:


> Originally Posted by *QPSS*
> 
> Just flashed my Inno3D X3 with their new BIOS and now instead of +250, I can do +550.
> 
> I also dont get checkerboards anymore, even if I do too high OC. Instead its now large black blocks. So I guess its fixed.
> Boost is still only 2000 stable, but that might change, once I change the TIM and thermal pads with more expensive stuff, which always gets me about an additional 50-75 MHz on the GPU and more on VRAM.


Any link?
I can't find anything on the downloads page.
http://www.inno3d.com/products_detail.php?refid=242


----------



## QPSS

https://www.facebook.com/inno3Ditaly/


----------



## TheGlow

Quote:


> Originally Posted by *Roland0101*
> 
> Did you try a little bit less, a 140fps cap for example?


I think that may have been it. I had it around 140 initially and don't remember if I noticed anything. I've been working on lowering my sensitivity in Overwatch, so now that I'm comfortable I've been spinning more and noticing it more.
I put it down to 135 for now and it seems to be OK.

Quote:


> Originally Posted by *DeathAngel74*
> 
> or 120fps, although it may be counter-productive


I set to 135 and seems ok for now.

Quote:


> Originally Posted by *gtbtk*
> 
> are you still running your memory OC at +800? Do you see the tearing if you run the memory at +450 or +500 instead?


I think it's the frame cap being too close to 144. My daily config is +180 core/+700 mem, which is what I've been using since the start.
I haven't tried pushing it since the vBIOS update because I have the free 3DMark. Alt-F4 on the Fire Strike demo looks like it kills the benchmark, but Time Spy goes through.
I set it to +210/+825 and was seeing the red sparkles, so I think I need to back the core off a little.


----------



## asdkj1740

Quote:


> Originally Posted by *QPSS*
> 
> https://www.facebook.com/inno3Ditaly/


Do you know if the iChill one is for the X4 or X3?
Which nvflash version did you use to flash the new BIOS?


----------



## QPSS

Quote:


> Originally Posted by *asdkj1740*
> 
> Do you know if the iChill one is for the X4 or X3?
> Which nvflash version did you use to flash the new BIOS?


It's for both, apparently (the ichill1070.rom file).
I used the newest nvflash version (5.328.0).


----------



## asdkj1740

Quote:


> Originally Posted by *QPSS*
> 
> It's for both, apparently (the ichill1070.rom file).
> I used the newest nvflash version (5.328.0).


downloaded from techpowerup?


----------



## QPSS

yes


----------



## asdkj1740

Quote:


> Originally Posted by *QPSS*
> 
> yes


thank you


----------



## LogicusMPS

My 3D Mark result - 17366


----------



## BroPhilip

So this is me on the new BIOS trying to push more than +400 on the RAM... it only does it on the second screen, not the one showing Valley... I had fewer BSODs on the old BIOS. Any suggestions?


----------



## BroPhilip

I think the new bios hates me


----------



## macwin2012

Hey,

My core clock while benchmarking downclocks to 2100 MHz. My temperatures are constant at 57/56°C and never go above 60°C.

What is the issue?


----------



## DeathAngel74

Could you be hitting the power limit maybe? Haven't done any benchmarks with the 1070, just installed thermal pads on Thursday. Will run some when I get a day off.


----------



## _Killswitch_

Quote:


> Originally Posted by *Roland0101*
> 
> They did. http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios
> (Or were you only referring to a detailed change-log?)
> How high is your 2550K overclocked? Pascal's error correction probably plays a role too.


Well, I realized after I posted that my CPU was only running at 3.8; I must have reset my overclock and forgotten about it. So I went into the BIOS and selected my 4.8 profile, which was stable and had been running fine since I built this PC with Win 7 installed. Now that I've upgraded to Win 10 it's not stable anymore; I keep getting WHEA_UNCORRECTABLE and watchdog_error BSODs, and I've done everything suggested online to fix it. *shrugs* I'm lost atm.


----------



## gtbtk

Quote:


> Originally Posted by *macwin2012*
> 
> Hey,
> 
> My core clock while benchmarking downclocks to 2100 MHz. My temperatures are constant at 57/56°C and never go above 60°C.
> 
> What is the issue?


No issue, that's normal. It's a function of GPU Boost 3.0: temperature and power load all get balanced against each other.
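Not NVIDIA's actual algorithm, of course, but the balancing idea can be sketched as a toy model (all the scaling assumptions and limits below are made up for illustration):

```python
def boost_clock(max_clock_mhz, temp_c, power_w, temp_limit=83, power_limit=150):
    """Toy model of GPU Boost: walk the clock down in 13 MHz bins until the
    estimated temperature and power both fit under their limits.
    Assumes temp/power scale linearly with clock, which is only a rough
    illustration, not how the real firmware estimates them."""
    clock = max_clock_mhz
    while clock > 0:
        est_temp = temp_c * clock / max_clock_mhz
        est_power = power_w * clock / max_clock_mhz
        if est_temp <= temp_limit and est_power <= power_limit:
            return clock
        clock -= 13  # Pascal adjusts clocks in ~13 MHz steps
    return 0

# A card that would draw 160 W at 2100 MHz settles a few bins lower:
print(boost_clock(2100, temp_c=57, power_w=160))  # 1957
```

So a card sitting at 57°C can still downclock purely on the power term, which matches what you're seeing.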


----------



## gtbtk

Quote:


> Originally Posted by *_Killswitch_*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roland0101*
> 
> They did. http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios
> (Or were you only referring to a detailed change-log?)
> How high is your 2550K overclocked? Pascal's error correction probably plays a role too.
> 
> 
> 
> Well, I realized after I posted that my CPU was only running at 3.8; I must have reset my overclock and forgotten about it. So I went into the BIOS and selected my 4.8 profile, which was stable and had been running fine since I built this PC with Win 7 installed. Now that I've upgraded to Win 10 it's not stable anymore; I keep getting WHEA_UNCORRECTABLE and watchdog_error BSODs, and I've done everything suggested online to fix it. *shrugs* I'm lost atm.
Click to expand...

A WHEA uncorrectable BSOD is a 0x124 error. Overclocked Sandy Bridge errors like that can normally be fixed by adding a little more vcore voltage in the BIOS.

I get occasional DPC Watchdog errors as well if I push my overclocks. I have found, though, that if I kill the tasks "Google Crash Handler", "Google Crash Handler (32-bit)", and "Google Installer (32-bit)", my card tends to be more stable.


----------



## gtbtk

DPC watchdog error BSOD

Just made a discovery that I think was causing the computer lockups and 0x133 DPC watchdog errors.

I noticed that when running my PC under load, my CPU was exhibiting some vdroop until it eventually crashed. The last time I stability tested my rig I didn't notice the same behaviour. I am running the Windows Insider preview build of Win 10, so I can only guess that the new version of the operating system has changed the power draw behaviour of Windows.

Since I increased the CPU vcore voltage by 0.1 V, I have not been able to repeat the watchdog crashes, and my CPU has gained a little extra performance as well.

If you are experiencing the DPC watchdog BSOD on your PC, I suggest you check your CPU overclock voltages.


----------



## ahmedmo1

Has anyone tried OCing the 1070 on their laptop? How'd it go? Especially when it comes to temps.

Mine runs at ~70-85°C when playing games (locked to 59 FPS). It hits 80°C+ when it's running at near-full utilization, for example Overwatch at 4K. When I run BF4 at 1440p locked to 59 FPS, temps stay at ~73°C.


----------



## syl1979

I would try to both undervolt and overclock. My daily setting on desktop is 2075 MHz at 0.993 V.


----------



## ucode

Quote:


> Originally Posted by *macwin2012*
> 
> Hey,
> 
> My core clock while benchmarking downclocks to 2100 MHz. My temperatures are constant at 57/56°C and never go above 60°C.
> 
> What is the issue?


FWIW, a 1050 Ti running 4 minutes of FurMark with the fan speed fixed at low RPM: temperature rose from 33°C to 74°C.


Perhaps the effects can be reduced depending on OC settings.

@ahmedmo1 do you have a link to the laptop's vBIOS? Probably going to be power limited.


----------



## lawrencelyl

Quote:


> Originally Posted by *gtbtk*
> 
> It resolves the issue. No more checker board artifact/video scheduler BSOD. The MSI bios is listed for anyone having "problems" with their card.
> 
> Answers on vBIOS versions for specific RAM types should be taken up with the vendor. I am not in a position to give a definitive answer; however, Nvidia supplies the core code for all these BIOS versions and this bug is fixed in the core, so a .50 BIOS should solve the Micron problem. The BIOS interfaces with the memory controller, not the memory chips themselves, so the BIOS should work for both types of RAM. I don't know why Gigabyte is publishing memory-brand-specific BIOSes like they are, because the controller is the same for both cards and built into the GPU silicon.
> 
> My Gaming X tops out at now at +550 stable, but with careful voltage management I could get that before the update but with major artifacts.
> 
> I have been playing around with other bioses on my card and I am currently running the ASUS Strix OC bios. It seems more stable, consistent and reliable at high overclocks than the original MSI bios so I will leave it for a while.
> 
> I tried the Palit 1671Mhz OC bios with the factory overclocked Ram and could not get that stable at anything above stock, maybe my chip is not binned high enough? I have not tried the new Gigabyte xTreme bios as yet. The old .26 one gave me stability problems similar to the old Palit version so I am not hopeful.


Is it possible to flash one brand's BIOS onto another brand's 1070? I always thought that was not possible.


----------



## gtbtk

Quote:


> Originally Posted by *lawrencelyl*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> It resolves the issue. No more checker board artifact/video scheduler BSOD. The MSI bios is listed for anyone having "problems" with their card.
> 
> Answers for Vbios versions for specific Ram types should be taken up with the vendor. I am not in a position to give a definitive answer, however, Nvidia supplies the core code for all these Bios versions and this bug is fixed in the core so a .50 bios should solve the micron problem. The Bios interfaces the memory controller, not the memory chips themselves so the bios should be working for both types of ram. I don't know why Gigabyte is publishing memory brand specific Bioses like they are because the controller is the same for both cards and built into the GPU silicon.
> 
> My Gaming X tops out at now at +550 stable, but with careful voltage management I could get that before the update but with major artifacts.
> 
> I have been playing around with other bioses on my card and I am currently running the ASUS Strix OC bios. It seems more stable, consistent and reliable at high overclocks than the original MSI bios so I will leave it for a while.
> 
> I tried the Palit 1671Mhz OC bios with the factory overclocked Ram and could not get that stable at anything above stock, maybe my chip is not binned high enough? I have not tried the new Gigabyte xTreme bios as yet. The old .26 one gave me stability problems similar to the old Palit version so I am not hopeful.
> 
> 
> 
> Is it possible to flash other brand's Bios into another brand's 1070? I always thought that is not possible.
Click to expand...

Absolutely possible. It will work as long as they have the same voltage controllers. The only 1070s that I know you can't do that with are the HOF cards and the Galaxy SNPR, which use unique voltage controllers.

You do need to be careful if you have a low-end VRM and install a top-level card's BIOS, as power delivery is not the same on all cards.

You need to get hold of the latest OEM version of nvflash and use the -6 flag to override the PCI subsystem ID check when you flash the card.


----------



## mbm

still no news regarding BIOS custom modding?


----------



## lawrencelyl

Quote:


> Originally Posted by *gtbtk*
> 
> Absolutely possible. It will work as long as they have the same voltage controllers. The only 1070s that I know you can't do that with are the HOF cards and the Galaxy SNPR, which use unique voltage controllers.
> 
> You do need to be careful if you have a low-end VRM and install a top-level card's BIOS, as power delivery is not the same on all cards.
> 
> You need to get hold of the latest OEM version of nvflash and use the -6 flag to override the PCI subsystem ID check when you flash the card.


Sounds risky. I have the Zotac AMP Extreme edition. Wondering which BIOS would be compatible... in case Zotac is very late releasing a new BIOS to resolve the Micron memory issue.


----------



## asdkj1740

Quote:


> Originally Posted by *lawrencelyl*
> 
> Sounds risky. I have the Zotac AMP Extreme edition. Wondering which BIOS would be compatible... in case Zotac is very late releasing a new BIOS to resolve the Micron memory issue.


You don't need to flash another BIOS if you are using the AMP Extreme; this card has the best BIOS among the 1070s.
I asked Zotac about the new BIOS release and they said it should arrive this month.


----------



## lawrencelyl

Quote:


> Originally Posted by *asdkj1740*
> 
> You don't need to flash another BIOS if you are using the AMP Extreme; this card has the best BIOS among the 1070s.
> I asked Zotac about the new BIOS release and they said it should arrive this month.


Can't wait for the new Bios


----------



## RyanRazer

Quote:


> Originally Posted by *lawrencelyl*
> 
> Sounds risky. I have the Zotac AMP Extreme edition. Wondering which BIOS would be compatible... in case Zotac is very late releasing a new BIOS to resolve the Micron memory issue.


Do you even have Micron memory? I have the Zotac AMP Extreme and mine has Samsung... no need for the update. Still checking for BIOS updates though, in hope of 1 more FPS.


----------



## LuckLess7

Hey guys, so I just received my replacement card for the one I sent in. The one I sent in had caused me headaches for a whole month; I wrote that in the RMA issue description. I now have an MSI GTX 1070 Gaming X with Samsung memory. Should I bother updating the BIOS at all?

Out of the box I just put in +105 core and +350 memory in Afterburner, the best result I've ever gotten. http://www.3dmark.com/fs/10709518 And this score is with GeForce Experience enabled, which always deducts a few points with its stupid game share overlay... This card is finally going where I think it belongs


----------



## gtbtk

Quote:


> Originally Posted by *lawrencelyl*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Absolutely possible. It will work as long as they have the same voltage controllers. The only 1070 that I know that you cant do that is the HOF cards and the Galaxy SNPR that youe unique voltage controllers
> 
> You do need to be careful if you have low end VRM and install a Top level Card's bios as power delivery is not the same on all cards.
> 
> You need to get hold of the latest OEM version of nvflash and use the -6 flag to override the PCI check when you flash the card.
> 
> 
> 
> Sounds risky I have Zotac Amp Extreme edition. Wondering which Bios will be compatible...in case Zotac is very late to update new Bios to resolve Micron memory issue.
Click to expand...

Something is risky when you don't know what you are doing. 

All of these cards can be recovered if they end up bricked by the flash process, as long as you have another graphics option you can boot from (an iGPU is great for that). I am certainly not recommending that you flash your card; it is at your own risk. The Micron bug can be worked around and does not impact performance that much anyway. Given that the Zotac has the highest power limit of any of the cards as far as I am aware, you are only likely to reduce your performance.

If you do decide to try, you also need to understand the power capacity of your card and monitor power draw when you are testing. Although the Zotac is probably less likely to have issues, it is still a good approach to take: many phases allow you to spread loads, and some cards have only half the number of phases, with correspondingly higher current per phase.
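To put rough numbers on that phase-count point (illustrative figures, not measurements from any particular card):

```python
def amps_per_phase(board_power_w, core_voltage_v, n_phases):
    """Idealized per-phase VRM current: total core current split evenly.
    Real controllers never balance phases perfectly, so treat this as a
    back-of-the-envelope estimate only."""
    total_current_a = board_power_w / core_voltage_v
    return total_current_a / n_phases

# ~150 W core power at ~1.0 V: halving the phase count doubles the load
print(amps_per_phase(150, 1.0, 8))  # 18.75 A per phase
print(amps_per_phase(150, 1.0, 4))  # 37.5 A per phase
```

That per-phase difference is why a high-power-limit BIOS on a budget VRM deserves extra monitoring.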

I don't have a Zotac card, mine is MSI gaming X, however, I have previously flashed the following bioses and successfully tested my card:

Asus Strix OC - .26 and .50 versions

MSI Gaming Z - .1E Reviewer, .26 and .50 versions

MSI Gaming 8G - .1E reviewer, .26 and .50 versions

Galaxy Gamer - .26 version

Palit super jet stream - .26 version

Palit Gamerock Premium - .26, .3B and .50 versions

Gigabyte Xtreme - .26

Zotac Amp Extreme .26

EVGA SC .26

EVGA FTW .26 and .50

All of which worked, with various levels of performance. The differences in VRM design do impact performance.

The EVGA BIOSes are so power limited that the clock bounces around way more than with any other version. That may have been EVGA's attempt to keep temps down while hoping no one noticed.

The Automatic OC in Precision XOC works but I could not get really reliable overclocks. I have discovered something in my CPU overclock since then that may have been causing that so I will reserve judgement.

The Asus bios pairs with MSI hardware very well.

The Gamerock Premium and Giga Xtreme BIOSes crash my PC easily.

The others never performed as well as the ASUS or MSI BIOS, so I did not spend much time with them.

KAF2/Galax HOF/Galaxy SNPR - .26 bricked the card - do not flash. These cards use a different voltage controller.


----------



## gtbtk

Quote:


> Originally Posted by *LuckLess7*
> 
> Hey guys, so I just received my replacement card for the one I sent in. The one I sent in had caused me headaches for a whole month; I wrote that in the RMA issue description. I now have an MSI GTX 1070 Gaming X with Samsung memory. Should I bother updating the BIOS at all?
> 
> Out of the box I just put in +105 core and +350 memory in Afterburner, the best result I've ever gotten. http://www.3dmark.com/fs/10709518 And this score is with GeForce Experience enabled, which always deducts a few points with its stupid game share overlay... This card is finally going where I think it belongs


There is no pressing need; the update is optional.


----------



## guttheslayer

My pair of 1070s is doing pretty well in SLI. With 150% resolution and everything at ultra with TAA on a 1440p display, it constantly stays above 60 fps.

Still, I guess it would take a pair of 1080 Tis to really push 200% DSR on a 144Hz monitor.


----------



## Mr-Dark

Hello

MSI SLI Bridge arrived today!



Still waiting another Af140 fan for the rear



MSI


----------



## watermanpc85

Quote:


> Originally Posted by *mbm*
> 
> still no news regarding BIOS custom modding?


+1, really interested on this...


----------



## Dude970

Quote:


> Originally Posted by *Mr-Dark*
> 
> Hello
> 
> MSI SLI Bridge arrived today!
> 
> 
> 
> Still waiting another Af140 fan for the rear
> 
> 
> 
> MSI



Looks great


----------



## Inelastic

Just got in my EVGA GTX 1070 FTW. It'll have to be on air for now until I get the parts in, so pay no mind to that funky angle fitting


----------



## lawrencelyl

Quote:


> Originally Posted by *RyanRazer*
> 
> Do you even have micron memory? I have Zotac Amp! Etr. and i have Samsung.. No need for update, still checking for bios update though, in hope for 1 more FPS


GPU-Z is showing Micron memory, and I get screen corruption/checkerboarding/freezes if I OC above +400 on memory. Also, I couldn't OC much at idle; +200 on memory is enough to crash my PC.


----------



## lawrencelyl

Quote:


> Originally Posted by *gtbtk*
> 
> Something is risky when you don't know what you are doing.
> 
> All of these cards can be recovered if they end up bricked by the flash process, as long as you have another graphics option you can boot from (an iGPU is great for that). I am certainly not recommending that you flash your card; it is at your own risk. The Micron bug can be worked around and does not impact performance that much anyway. Given that the Zotac has the highest power limit of any of the cards as far as I am aware, you are only likely to reduce your performance.
> 
> If you do decide to try, you also need to understand the power capacity of your card and monitor power draw when you are testing. Although the Zotac is probably less likely to have issues, it is still a good approach to take: many phases allow you to spread loads, and some cards have only half the number of phases, with correspondingly higher current per phase.
> 
> I don't have a Zotac card, mine is MSI gaming X, however, I have previously flashed the following bioses and successfully tested my card:
> 
> Asus Strix OC - .26 and .50 versions
> MSI Gaming Z - .1E Reviewer, .26 and .50 versions
> MSI Gaming 8G - .1E reviewer, .26 and .50 versions
> Galaxy Gamer - .26 version
> Palit super jet stream - .26 version
> Palit Gamerock Premium - .26, .3B and .50 versions
> Gigabyte Xtreme - .26
> Zotac Amp Extreme .26
> EVGA SC .26
> EVGA FTW .26 and .50
> 
> All of which worked with various levels of performance. The differences in VRM design do impact performance
> 
> The EVGA BIOSes are so power limited that the clock bounces around way more than with any other version. That may have been EVGA's attempt to keep temps down while hoping no one noticed.
> 
> The Automatic OC in Precision XOC works but I could not get really reliable overclocks. I have discovered something in my CPU overclock since then that may have been causing that so I will reserve judgement.
> 
> The Asus bios pairs with MSI hardware very well.
> 
> the Gamerock premium and Giga Xtreme crash my PC easily
> 
> The others never performed as well as the ASUS or MSI bios so I did not spend much time with them
> 
> KAF2/Galax HOF/Galaxy SNPR - .26 Bricked the card - do not flash. These cards use a different voltage controller


Thanks for the detailed explanation. I'll probably wait till Zotac release new Bios first. Hopefully the new Bios will improve my memory o/c.


----------



## khanmein

Quote:


> Originally Posted by *lawrencelyl*
> 
> GPU-Z is showing Micron memory, and I get screen corruption/checkerboarding/freezes if I OC above +400 on memory. Also, I couldn't OC much at idle; +200 on memory is enough to crash my PC.


What brand of graphics card? Did you flash the new vBIOS?


----------



## lawrencelyl

Quote:


> Originally Posted by *khanmein*
> 
> What brand of graphics card? Did you flash the new vBIOS?


Zotac Amp Extreme. Still waiting for new Bios to be released...


----------



## TheGlow

Quote:


> Originally Posted by *lawrencelyl*
> 
> Thanks for the detailed explanation. I'll probably wait till Zotac release new Bios first. Hopefully the new Bios will improve my memory o/c.


If you want, for now in Afterburner press Ctrl+F to bring up the curve. Click the point around the 1.093 V area and press L, then apply.
This will lock the voltage at max. Now try increasing the memory and see what you can get to without checkerboarding.
Basically your voltage needs to be at least 725-800 mV to play with memory clocks on Micron without checkerboarding.
If you lock it at 800, then when you game it won't scale up to 1.093 for the core clock to keep up, sadly.
It'd be nice if there were a minimum, but the new BIOS fixes these issues.
Mine seems to be an extreme case, but I can go to +850 without checkerboarding pre-BIOS. I haven't tried to find the max yet with the new BIOS, as I was already burnt out messing with it when I first got it.


----------



## ucode

Quote:


> Originally Posted by *TheGlow*
> 
> If you lock it at 800, then when you game it won't scale up to 1.093 for the core clock to keep up, sadly.


Meh, I already offered some posters a way to set a minimum of 0.8V while at idle clocks and still have voltage scale up past 1V for the higher clocks. Didn't get so much as a reply, let alone thanks.

At least it's fixed / being fixed properly BIOS wise now. :/


----------



## DeathAngel74

Could you please repost? I never saw the original, but thanks.


----------



## TheGlow

Quote:


> Originally Posted by *ucode*
> 
> Meh, I already offered some posters a way to set a minimum of 0.8V while at idle clocks and still have voltage scale up past 1V for the higher clocks. Didn't get so much as a reply, let alone thanks.
> 
> At least it's fixed / being fixed properly BIOS wise now. :/


I might have missed it since I had found my own fix: the MSI Gaming App adds a service that runs an OSD exe that kept me in 3D clocks, so it never dipped under 0.725.


----------



## BroPhilip

Quote:


> Originally Posted by *TheGlow*
> 
> If you want, for now in Afterburner press Ctrl+F to bring up the curve. Click the point around the 1.093 V area and press L, then apply.
> This will lock the voltage at max. Now try increasing the memory and see what you can get to without checkerboarding.
> Basically your voltage needs to be at least 725-800 mV to play with memory clocks on Micron without checkerboarding.
> If you lock it at 800, then when you game it won't scale up to 1.093 for the core clock to keep up, sadly.
> It'd be nice if there were a minimum, but the new BIOS fixes these issues.
> Mine seems to be an extreme case, but I can go to +850 without checkerboarding pre-BIOS. I haven't tried to find the max yet with the new BIOS, as I was already burnt out messing with it when I first got it.


Please update when you can on your progress... I would like to know. On mine I could OC right at 8700 to 8800 on memory before, and really had to push hard to get the checkerboard effects. After the BIOS and driver updates I only got around a +25 increase.


----------



## BroPhilip

Quote:


> Originally Posted by *TheGlow*
> 
> I might have missed it since I had found my own, the MSI gaming app adds a service that runs an osd exe that kept me in 3d clocks, so it never dipped under .725.


The MSI Gaming App actually hindered my OC experience.


----------



## gtbtk

Quote:


> Originally Posted by *TheGlow*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lawrencelyl*
> 
> Thanks for the detailed explanation. I'll probably wait till Zotac release new Bios first. Hopefully the new Bios will improve my memory o/c.
> 
> 
> 
> If you want, for now in Afterburner press Ctrl+F to bring up the curve. Click the point around the 1.093 V area and press L, then apply.
> This will lock the voltage at max. Now try increasing the memory and see what you can get to without checkerboarding.
> Basically your voltage needs to be at least 725-800 mV to play with memory clocks on Micron without checkerboarding.
> If you lock it at 800, then when you game it won't scale up to 1.093 for the core clock to keep up, sadly.
> It'd be nice if there were a minimum, but the new BIOS fixes these issues.
> Mine seems to be an extreme case, but I can go to +850 without checkerboarding pre-BIOS. I haven't tried to find the max yet with the new BIOS, as I was already burnt out messing with it when I first got it.
Click to expand...

A better solution than locking the voltage is to make sure that you have added dwm.exe and explorer.exe to the Nvidia control panel and set their performance setting to "maximum performance". They both run all the time, so it keeps the voltage up high enough but doesn't force the max voltage through the card all the time. Solution courtesy of ucode.


----------



## gtbtk

Quote:


> Originally Posted by *ucode*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheGlow*
> 
> If you lock it at 800 then when you game it wont scale up to 1.093 for the core clock to keep up sadly.
> 
> 
> 
> Meh, I already offered some posters a way to set a minimum of 0.8V while at idle clocks and still have voltage scale up past 1V for the higher clocks. Didn't get so much as a reply, let alone thanks.
> 
> At least it's fixed / being fixed properly BIOS wise now. :/
Click to expand...

Welcome to the club


Interesting how the internet in general is not at all interested in solutions, only in the drama that can be created by complaining about it. I thought your solution was an excellent one.


----------



## TheGlow

Quote:


> Originally Posted by *BroPhilip*
> 
> Please update when you can on your progress... I would like to know. On mine I could OC right at 8700 to 8800 on memory before, and really had to push hard to get the checkerboard effects. After the BIOS and driver updates I only got around a +25 increase.


I was able to go as high as 9660 safely before the BIOS update, hence I'm not in a rush to stress it, because I was already getting above average.
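For anyone converting these numbers: a rule-of-thumb sketch, assuming a stock 8008 MT/s 1070 and that Afterburner's GDDR5 offset applies at the double-data-rate clock, so each +1 of offset adds 2 MT/s effective:

```python
def effective_rate_mts(offset_mhz, stock_mts=8008):
    """Map an Afterburner memory offset to the effective transfer rate.
    Assumes Pascal GDDR5 behaviour where the offset is applied at the
    DDR clock, i.e. each +1 MHz of offset adds 2 MT/s effective."""
    return stock_mts + 2 * offset_mhz

print(effective_rate_mts(825))  # 9658, in line with the ~9660 figure
```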
Quote:


> Originally Posted by *BroPhilip*
> 
> The msi gaming app actually hindered my oc experience


The app itself, yes; it was just bad and conflicted with Afterburner. But just having it installed, and the service it added, would lock my minimum voltage.

As confirmed again last night, my Overwatch session started randomly dropping to 110 fps. My wife came in and said I should go watch a movie with her. Turns out it was another H.265 video I downloaded, so it seems those cause some resource conflicts with Plex Media Server. Setting the Overwatch exe to higher priority in Task Manager fixed it.
I'll need to see if other videos do it too, or if it is just H.265.


----------



## BroPhilip

Quote:


> Originally Posted by *TheGlow*
> 
> I was able to go as high as 9660 safely before the BIOS update, hence I'm not in a rush to stress it, because I was already getting above average.
> The app itself, yes; it was just bad and conflicted with Afterburner. But just having it installed, and the service it added, would lock my minimum voltage.
> 
> As confirmed again last night, my Overwatch session started randomly dropping to 110 fps. My wife came in and said I should go watch a movie with her. Turns out it was another H.265 video I downloaded, so it seems those cause some resource conflicts with Plex Media Server. Setting the Overwatch exe to higher priority in Task Manager fixed it.
> I'll need to see if other videos do it too, or if it is just H.265.


That's what I'm talking about: the background services from the app. I killed them and got a more stable OC.


----------



## Nukemaster

I just want to post that I flashed the new Asus BIOS to my Dual (2 fans, not 2 cards) 1070 and it DOES work fine on Samsung-memory cards.

The 0 RPM mode works fine on the stock cooler, but I get full speed on my Mono Plus. Afterburner does not change it unless I first manually max it out (I have to do this on every startup).

For now I am just running the fan off my board and it is working well (the board's speed is somewhat limited to 1050-1600 RPM via the BIOS; it adjusts with case temp, and since the video card dumps heat into the case it works out well). I have not had any stability issues or anything so far.


----------



## ucode

Quote:


> Originally Posted by *DeathAngel74*
> 
> Could you please repost? I never saw the original, but thanks.


Well, I've kind of given up on it now, especially as there is a BIOS fix, but here it is if you want to try it.

800mV.zip 2k .zip file


It's pretty crude, with limited error checking, and it won't give you any message if it failed.

*How does it work:* It sets all the curve points below 0.8V to the lowest idle clock. Now, if other software such as Afterburner allowed adjustments below 0.8V, you could do all this in AB, but unfortunately it doesn't. I did contact Unwinder (the author of AB) to ask why the points below 0.8V and above 1.2V are not shown, but never got a reply. This probably means 800mV.exe will need to be run last, after OC settings are made in third-party software such as AB, to avoid the settings being reset by that software.

There's an example using 900mV here.

I don't have a 1070 and only wrote this to try to help out. I can say it works okay as of today on a 1050 Ti and a 1080, so maybe you will be the first to try it on a 1070 and actually see if it helps where the updated BIOS has not yet been applied. The BIOS is obviously the better fix.
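The clamping logic described above is simple enough to sketch in a few lines (hypothetical data structure; the real 800mV.exe talks to the driver, which this sketch does not):

```python
def clamp_low_voltage_points(curve, floor_v=0.80):
    """Mimic the described fix: every V/F curve point below the voltage
    floor gets forced down to the lowest idle clock, so under load the
    card jumps straight from idle to points at or above the floor.
    `curve` is a list of (voltage_v, clock_mhz) pairs."""
    idle_clock = min(clock for _, clock in curve)
    return [(v, idle_clock if v < floor_v else clock) for v, clock in curve]

points = [(0.65, 1278), (0.70, 1420), (0.75, 1560), (0.80, 1657), (1.00, 1950)]
print(clamp_low_voltage_points(points))
# the sub-0.80 V points all collapse to the 1278 MHz idle clock
```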


----------



## gtbtk

Quote:


> Originally Posted by *TheGlow*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BroPhilip*
> 
> Please update when you can on your progress... I would like to know. On mine I could OC right at 8700 to 8800 on memory before, and really had to push hard to get the checkerboard effects. After the BIOS and driver updates I only got around a +25 increase.
> 
> 
> 
> I was able to go as high as 9660 safely before the bios, hence I'm not in a rush to stress it because I was already getting above average.
> Quote:
> 
> 
> 
> Originally Posted by *BroPhilip*
> 
> The msi gaming app actually hindered my oc experience
> 
> Click to expand...
> 
> The app itself yes, it was just bad and conflicting with Afterburner, but just having it installed and the service it added, that would lock my minimum voltage.
> 
> As confirmed again last night, my Overwatch session started randomly dropping to 110fps. My wife comes in and says I should go watch a movie with her. Turns out it was another h265 video I downloaded so it seems those cause some resource conflicts with plex media server. Setting overwatch exe to higher priority in Task manager fixed it.
> Ill need to see other videos do it too or if it is just h265.
Click to expand...

Plex transcodes movies for the device they are playing on if that device cannot play the format natively. H.265 is most likely to be transcoded on any device that is not extremely recent; H.264 should play natively on many more of the devices out there.


----------



## khanmein

@gtbtk what driver are you using now? I've heard a lot of people say 375.70/.76 can cause GPUs to burn out and overheat.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> @gtbtk what driver are you using now? I've heard a lot of people say 375.70/.76 can cause GPUs to burn out and overheat.


I'm running 375.70. I don't have a 144 Hz monitor, so I didn't need the hotfix driver.

I have not noticed anything strange. My temps appear normal.

This is the first time I heard anything about a card being burned out.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> I'm running 375.70. I don't have a 144 Hz monitor, so I didn't need the hotfix driver.
> 
> I have not noticed anything strange. My temps appear normal.
> 
> This is the first time I heard anything about a card being burned out.


I'm not using 144 Hz either. 1440p at 60 Hz is more than enough.

By the way, .76 fixed artifacts in GIFs.

https://forums.geforce.com/default/topic/975448/geforce-drivers/first-generation-titan-burnt-out-using-375-70-drivers/


----------



## MechanimaL

Hey guys, my new card, an MSI GTX 1070 Gaming X, has Samsung RAM and BIOS 86.04.1E.00.41. I see that's not the newest BIOS, but the latest (and only) one I see on the MSI site seems to be the one specifically designed to fix the Micron RAM problems. So my question is: what is the best BIOS for my card, and where can I get it if it's not the one I currently have installed?
Thanks!


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> I'm not using 144 Hz either. 1440p at 60 Hz is more than enough.
> 
> By the way, .76 fixed artifacts in GIFs
> 
> https://forums.geforce.com/default/topic/975448/geforce-drivers/first-generation-titan-burnt-out-using-375-70-drivers/


I have not been affected with gif issues.

The latest Chrome version fixed the black artifacts in youtube videos.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> I have not been affected with gif issues.
> 
> The latest Chrome version fixed the black artifacts in youtube videos.


No artifacts with Chrome from the previous version until now (I always use the stable version only), but the .gif issue still hasn't been completely resolved, because I visited https://www.asus.com/Graphics-Cards/ROG-STRIX-GTX1070-O8G-GAMING/

I noticed the XSplit Gamecaster image got artifacts, but I checked other ASUS product pages and there's no issue for the non-OC cards. *weird*

Apparently I can't stand my graphics card's cooling noise. I'm going to mod the fans with 2 x Noctua NF-F12 + cable ties + a custom 4-pin female mini GPU cable.


----------



## gtbtk

I can't comment on Strix noise. My MSI is pretty quiet, even at 100%.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> I can't comment on Strix noise. My MSI is pretty quiet, even at 100%.


I don't have a Strix. Woot, I meant my Leadtek.


----------



## ucode

Quote:


> Originally Posted by *khanmein*
> 
> https://forums.geforce.com/default/topic/975448/geforce-drivers/first-generation-titan-burnt-out-using-375-70-drivers/


Unlike the original cooler, which appears to have a plate and thermal pads for the memory and VRM, it looks like that custom cooler runs the VRM and memory bare, using just the PCB and fans to cool them.

Original

Custom

What does that mean? Well, if that's the case, then with the pads and plate the components would have higher thermal latency, while the bare components, relying on the fans, would have only the PCB if the fans failed, and under the same conditions would heat up much more quickly, allowing thermal runaway to happen sooner too. Perhaps too quickly for thermal shutdown to kick in properly, just saying. Fans do fail or stop for either hardware or software reasons, so it might be something to think about before employing such a cooling solution.
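The thermal-latency point can be made concrete with a toy lumped-capacitance model. All numbers below are illustrative assumptions, not measurements of any real card:

```python
# Toy thermal model of the point above: after a fan failure, a bare
# VRM (small thermal mass, PCB-only passive losses) reaches a danger
# temperature far sooner than one under a plate + pads (larger
# thermal mass, same passive losses). Numbers are made up.

def time_to_temp(power_w, mass_j_per_k, loss_w_per_k, t_limit_c,
                 t_ambient_c=40.0, dt=0.1):
    """Seconds until the component reaches t_limit_c (simple Euler steps)."""
    temp, t = t_ambient_c, 0.0
    while temp < t_limit_c:
        # net heating = dissipated power minus passive losses to ambient
        temp += (power_w - loss_w_per_k * (temp - t_ambient_c)) / mass_j_per_k * dt
        t += dt
    return t

bare = time_to_temp(power_w=15, mass_j_per_k=5, loss_w_per_k=0.1, t_limit_c=125)
plated = time_to_temp(power_w=15, mass_j_per_k=60, loss_w_per_k=0.1, t_limit_c=125)
print(f"bare VRM hits 125C in ~{bare:.0f}s, plated in ~{plated:.0f}s")
```

With these assumed values the bare component crosses the limit roughly an order of magnitude faster, which is the "higher thermal latency" buying time for thermal shutdown to react.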


----------



## khanmein

Quote:


> Originally Posted by *ucode*
> 
> Unlike the original cooler, which appears to have a plate and thermal pads for the memory and VRM, it looks like that custom cooler runs the VRM and memory bare, using just the PCB and fans to cool them.
> 
> Original
> 
> Custom
> 
> What does that mean? Well, if that's the case, then with the pads and plate the components would have higher thermal latency, while the bare components, relying on the fans, would have only the PCB if the fans failed, and under the same conditions would heat up much more quickly, allowing thermal runaway to happen sooner too. Perhaps too quickly for thermal shutdown to kick in properly, just saying. Fans do fail for either hardware or software reasons, so it might be something to think about before employing such a cooling solution.


I'm planning to add 2 x 120 mm fans connected to the (provided) Y-split cable, then the 4-pin female mini. Once I'm done I'll let you know how it goes.


----------



## gtbtk

Quote:


> Originally Posted by *ucode*
> 
> Quote:
> 
> 
> 
> Originally Posted by *khanmein*
> 
> https://forums.geforce.com/default/topic/975448/geforce-drivers/first-generation-titan-burnt-out-using-375-70-drivers/
> 
> 
> 
> Unlike the original cooler, which appears to have a plate and thermal pads for the memory and VRM, it looks like that custom cooler runs the VRM and memory bare, using just the PCB and fans to cool them.
> 
> Original
> 
> Custom
> 
> What does that mean? Well, if that's the case, then with the pads and plate the components would have higher thermal latency, while the bare components, relying on the fans, would have only the PCB if the fans failed, and under the same conditions would heat up much more quickly, allowing thermal runaway to happen sooner too. Perhaps too quickly for thermal shutdown to kick in properly, just saying. Fans do fail for either hardware or software reasons, so it might be something to think about before employing such a cooling solution.
Click to expand...

Agreed. It's too early to wildly jump to the conclusion that the drivers are at fault, especially given a sample of just one device.

A custom cooler with no heatsink on the RAM and VRMs is a much more likely explanation. These cards monitor the GPU chip, but there is no monitoring on the VRMs. Maybe this is something the card makers need to start including on future models?

Electronic components also deteriorate over time if they are constantly heated to temps outside of design parameters. It is possible that one of the inductors was slightly out of spec to begin with and ran hotter than what is considered normal, but not enough for an immediate failure. After two years of being overheated, it finally failed. As there is no monitoring, the real reason will likely never be determined.


----------



## khanmein

^^ Indeed, and I still don't think the driver can cause overheating. Maybe AMD, but not NV.

My GTX 970's VRM has no thermal pad or heatsink; it just gets airflow from the graphics cooler, that's all.


----------



## ucode

Quote:


> Originally Posted by *khanmein*
> 
> My GTX 970's VRM has no thermal pad or heatsink; it just gets airflow from the graphics cooler, that's all.


But was it designed that way? There are many graphics card designs that are produced to work with bare components. It would seem, however, that the Titan was not originally designed this way. The question is whether it would have failed if its proprietary cooling solution had been used instead of third-party cooling. If drivers are causing fans to not run as expected, then that is still a concern, more so for those with reduced thermal latency through the use of third-party cooling.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> ^^ Indeed, and I still don't think the driver can cause overheating. Maybe AMD, but not NV.
> 
> My GTX 970's VRM has no thermal pad or heatsink; it just gets airflow from the graphics cooler, that's all.


Your card is not a two-year-old Titan X with an Arctic cooler. Not all graphics cards are the same, and right now there is a sample of one card.

If you are worried, roll your drivers back to an earlier version.


----------



## benjamen50

Do the newer AMD GPUs all come with VRM temperature sensors? I know that some of them do, while I'm pretty sure the Nvidia ones do not.


----------



## DeathAngel74

My solution to the VRM cooling problem: I have the newest BIOS + thermal pads too.


----------



## gtbtk

Quote:


> Originally Posted by *benjamen50*
> 
> Do the newer AMD GPUs all come with VRM temperature sensors? Because I know that some of them do while I'm pretty sure Nvidia ones most likely do not.


I don't know; I have not touched an AMD GPU in years.


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> my solution to vrm cooling problems, I have newest bios + thermal pads too


The FTW at 3120 RPM? I can't stand that noise.


----------



## TheGlow

Quote:


> Originally Posted by *gtbtk*
> 
> Plex transcodes movies for the device it is playing on if that device cannot play the format natively. H.265 is most likely to be transcoded on any devices that are not extremely recent. H.264 should play natively on many more devices that are out there.


Yes, this is exactly what I figured, Roku 3s being what I have on my three TVs.
My phone and tablets seem to handle H.265 fine, though.


----------



## TheBoom

Just noticed earlier that my card refused to boost beyond 1632 MHz and 725 mV no matter what until I rebooted. This is after the latest drivers; I never had this issue before. Asus Strix OC.

It seems like a few 1080 owners have been plagued with this "stuck clocks" phenomenon since launch, though. They claim it's usually caused by temp-monitoring software conflicts, although I'm not sure how accurate that is. It might be instability after all.


----------



## BroPhilip

Quote:


> Originally Posted by *TheBoom*
> 
> Just noticed earlier that my card refused to boost beyond 1632 MHz and 725 mV no matter what until I rebooted. This is after the latest drivers; I never had this issue before. Asus Strix OC.
> 
> It seems like a few 1080 owners have been plagued with this "stuck clocks" phenomenon since launch, though. They claim it's usually caused by temp-monitoring software conflicts, although I'm not sure how accurate that is. It might be instability after all.


I had the same issue early on, and the only fix was a clean driver reinstall.


----------



## gtbtk

Quote:


> Originally Posted by *TheBoom*
> 
> Just noticed earlier that my card refused to boost beyond 1632 MHz and 725 mV no matter what until I rebooted. This is after the latest drivers; I never had this issue before. Asus Strix OC.
> 
> It seems like a few 1080 owners have been plagued with this "stuck clocks" phenomenon since launch, though. They claim it's usually caused by temp-monitoring software conflicts, although I'm not sure how accurate that is. It might be instability after all.


I have seen that behavior after a driver crash


----------



## DeathAngel74

I've had that issue after a driver crash as well.


----------



## KawasakiDragonn

I'm upgrading my EVGA 1070 to the newest BIOS. Has anyone had any issues with the new BIOS? I want to make sure before I flash.

My EVGA 1070 SC is still waiting for its thermal pads, and still running strong.


----------



## amstech

I'm going to pick up one of these cards soon.


----------



## DeathAngel74

I ran mine at 2101/8726 at 1.075 V without thermal pads for almost a month. I got tired of waiting, so I used pads and paste I already had, and it's been running fine so far. Core temps went down after the thermal pad and TIM change: 47°C before, 44°C after (35°C idle before, 32°C after).


----------



## Jackharm

I have a Zotac GTX 1070 AMP Extreme with Micron memory. Fortunately it does overclock well, as I am able to hit 2088-21xx on the core and 92xx on the memory, stable while gaming/benchmarking. I also notice I am able to stay above 21xx when I max out the voltage slider.

But I notice that my BIOS version is 86.04.26.00.22, which I suppose is an unverified VGA BIOS according to the TechPowerUp VGA BIOS collection? The latest verified BIOS uploaded there for the AMP Extreme is 86.04.1E.00.89, released on 2016-06-16, while the BIOS my card has was released several days after (2016-06-27). Perhaps the BIOS on my card is for cards with Micron memory.
EDIT: Just read through some posts via the search and saw that the 86.04.26.xx BIOSes are indeed for Micron memory cards.

The other oddity is that before (at least lately it hasn't happened, *knock on wood*), from a cold boot I would randomly be greeted with a white checkerboard soon after my computer's BIOS finished loading (just before being prompted with a login), or would even see my screens turn on only to be greeted with a 'no signal' message. This would normally be fixed with a quick reboot.
Reading through the posts, this issue is apparently due to voltage issues related to Micron memory? Although perhaps the current driver (375.70) I updated to fixed it, since, as mentioned, I haven't encountered the issue anymore. That, and apparently you can set some custom power-management rules to keep the voltages up (low voltage being the main cause of this issue?), which I have yet to do.

But it is good that someone reached out to Zotac and confirmed that they are indeed working on/planning to release a BIOS for cards with Micron memory, since, if I recall correctly, there was an earlier post stating that Zotac weren't going to do so because they said their cards were unaffected by the issue.

But overall, I am loving my card.









P.S. It will fit in a Node 804


----------



## ElectroManiac

I still haven't OC'd my 1070 because, to be honest, for 1080p gaming this card is overkill, but I want to ask: even though I'm not getting any issues with the card as it is right now, should I update to the newer BIOS?

The card has Micron memory.


----------



## gtbtk

Quote:


> Originally Posted by *Jackharm*
> 
> I have a Zotac GTX 1070 AMP Extreme with Micron memory. Fortunately it does overclock well, as I am able to hit 2088-21xx on the core and 92xx on the memory, stable while gaming/benchmarking. I also notice I am able to stay above 21xx when I max out the voltage slider.
> 
> But I notice that my BIOS version is 86.04.26.00.22, which I suppose is an unverified VGA BIOS according to the TechPowerUp VGA BIOS collection? The latest verified BIOS uploaded there for the AMP Extreme is 86.04.1E.00.89, released on 2016-06-16, while the BIOS my card has was released several days after (2016-06-27). Perhaps the BIOS on my card is for cards with Micron memory.
> EDIT: Just read through some posts via the search and saw that the 86.04.26.xx BIOSes are indeed for Micron memory cards.
> 
> The other oddity is that before (at least lately it hasn't happened, *knock on wood*), from a cold boot I would randomly be greeted with a white checkerboard soon after my computer's BIOS finished loading (just before being prompted with a login), or would even see my screens turn on only to be greeted with a 'no signal' message. This would normally be fixed with a quick reboot.
> Reading through the posts, this issue is apparently due to voltage issues related to Micron memory? Although perhaps the current driver (375.70) I updated to fixed it, since, as mentioned, I haven't encountered the issue anymore. That, and apparently you can set some custom power-management rules to keep the voltages up (low voltage being the main cause of this issue?), which I have yet to do.
> 
> But it is good that someone reached out to Zotac and confirmed that they are indeed working on/planning to release a BIOS for cards with Micron memory, since, if I recall correctly, there was an earlier post stating that Zotac weren't going to do so because they said their cards were unaffected by the issue.
> 
> But overall, I am loving my card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S. It will fit in a Node 804


The .26 BIOS did come with Micron cards, but it has a bug that causes checkerboard crashes. Nvidia has released an update to its partners, and while some of them have released their BIOS updates, the one from Zotac is still pending; it's due this month, I believe.

Until then you will need to keep the Nvidia Control Panel power-management setting on 'Prefer maximum performance'. There are a number of posts in this thread that discuss exactly how this is done.


----------



## gtbtk

Quote:


> Originally Posted by *ElectroManiac*
> 
> I still haven't OC'd my 1070 because, to be honest, for 1080p gaming this card is overkill, but I want to ask: even though I'm not getting any issues with the card as it is right now, should I update to the newer BIOS?
> 
> The card has Micron memory.


I would.

It will ensure stability and slightly improve performance for the rest of the life of the card. The update utility that you download for your card will be automated and very easy to use.


----------



## ElectroManiac

Quote:


> Originally Posted by *gtbtk*
> 
> I would.
> 
> It will ensure stability and slightly improve performance for the rest of the life of the card. The update utility that you download for your card will be automated and very easy to use.


Sorry for my dumbness but what's this update utility? The program that came with my MSI card?


----------



## gtbtk

Quote:


> Originally Posted by *ElectroManiac*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I would.
> 
> It will ensure stability and slightly improve performance for the rest of the life of the card. The update utility that you download for your card will be automated and very easy to use.
> 
> 
> 
> Sorry for my dumbness but what's this update utility? The program that came with my MSI card?
Click to expand...

The BIOS update utility that you download from MSI for your card.

This one is the Gaming X utility

https://www.msi.com/Graphics-card/support/GEFORCE-GTX-1070-GAMING-X-8G.html#down-bios


----------



## ElectroManiac

Quote:


> Originally Posted by *gtbtk*
> 
> the bios update utility you have to download from MSI for your card.
> 
> This one is the Gaming X utility
> 
> https://www.msi.com/Graphics-card/support/GEFORCE-GTX-1070-GAMING-X-8G.html#down-bios


I see, yeah, thanks. I thought you meant the utility you use to change LED colors and stuff.

I have the regular Gaming, not the X, though. I will do the update tonight.


----------



## Jackharm

Yup, I went ahead and added the two .exes to the NVCP.

Thanks for the advice!


----------



## mbm

I have had no issues with my MSI 1070 Armor.
I can't find any release notes.
Should I update?


----------



## khanmein

Quote:


> Originally Posted by *mbm*
> 
> I have had no issues with my MSI 1070 Armor.
> I can't find any release notes.
> Should I update?


If there's no issue, don't flash; my fingers are always itchy to update, though.


----------



## _Killswitch_

I have no issues with my EVGA GTX 1070; I'm actually thinking about going SLI. I know SLI isn't supported much, or is "dead", but I hope it will gain support. Plus, I've never had an SLI system before, so for my new system I figured why the hell not. If all else fails, at least I'll have a backup GPU if one goes down.


----------



## asdkj1740

Who can tell me which of the following will arrive first?

A. evga thermal pads
B. zotac micron bios
C. 1080ti


----------



## asdkj1740

Quote:


> Originally Posted by *_Killswitch_*
> 
> I have no issues with my EVGA GTX 1070; I'm actually thinking about going SLI. I know SLI isn't supported much, or is "dead", but I hope it will gain support. Plus, I've never had an SLI system before, so for my new system I figured why the hell not. If all else fails, at least I'll have a backup GPU if one goes down.


Wait for the 1080 Ti; SLI is not good, especially with the EVGA Pascal FTW.


----------



## _Killswitch_

I have a 1070; I'm not buying a 1080 Ti or anything else. I want to go SLI. People are writing SLI off way too fast. I'm going SLI 1070s and won't replace them for another 4-5 years.


----------



## gtbtk

Quote:


> Originally Posted by *_Killswitch_*
> 
> I have no issues with my EVGA GTX 1070; I'm actually thinking about going SLI. I know SLI isn't supported much, or is "dead", but I hope it will gain support. Plus, I've never had an SLI system before, so for my new system I figured why the hell not. If all else fails, at least I'll have a backup GPU if one goes down.


You would probably have a better experience selling the 1070 and buying a 1080: fewer glitches, and it will work with all games, while SLI will only work with select games.


----------



## _Killswitch_

My GTX 1070 works flawlessly, and it was a HUGE upgrade from my GTX 680. If I had wanted a GTX 1080, Titan, or Ti, I would have bought one.


----------



## fauka

Guys, I have one question: will the GeForce GTX SLI HB Bridge (2-slot) work with the MSI GTX 1070 Sea Hawk X?

I know they work with Founders Edition cards, but will it work with that MSI?


----------



## saunupe1911

Quote:


> Originally Posted by *_Killswitch_*
> 
> My GTX 1070 works flawlessly, and it was a HUGE upgrade from my GTX 680. If I had wanted a GTX 1080, Titan, or Ti, I would have bought one.


I feel the same way. A 1080 is simply overkill right now. It's going to take 4 to 5 years (if ever) before games outmatch a 1070 at 1080p or 2K. One might say, "Well, I have a 4K monitor, so I need a 1080." OK, cool... that's the only scenario, IMO, where a 1080 is useful. Otherwise, folks with 2K-and-below monitors just wasted their cash.


----------



## _Killswitch_

Well, I don't mean to sound like an a$$ about it; I just thought about what GPU I wanted, and the 1070 suited my needs, and as far as I can tell it will well into the future. I don't always buy the top-of-the-line stuff; I just wanted better performance over my 680, and I got it.

If I want to go SLI, well, that's my choice too. Sure, a 1080 is better, but I'm happy with my choice and I'm sticking with it.


----------



## zipper17

Obviously a 1070 at 2560x1440 is not overkill at all. Witcher 3 at 1440p max settings still dips down to 40-50 FPS and *can't maintain a perfect minimum of 60 FPS*, so what's so overkill?

The GTX 1070/1080 has just the right amount of performance for 1440p; it's not overkill yet.

For 1080p, yeah, it might be, but not entirely: GTA V at 1080p max settings with Ultra Grass is still below 60 FPS in wild areas.


----------



## saunupe1911

Quote:


> Originally Posted by *zipper17*
> 
> Obviously a 1070 at 2560x1440 is not overkill at all. Witcher 3 at 1440p max settings still dips down to 40-50 FPS and *can't maintain a perfect minimum of 60 FPS*, so what's so overkill?


Witcher and other open-world games seem to be the exceptions. If it's that important to you, then you need a 1080; it fits your needs. Or heck, just lower it to 1080p; it will still be gorgeous. But guess what? I can't stand that boring game, so I'm good to go, lmao!!!


----------



## _Killswitch_

Well, I'm still on 1080p. I'd like to upgrade to 1440p, but I haven't found a monitor I liked that wasn't ungodly expensive =S


----------



## supermodjo

Overkill? I don't think so, dude. At 1080p my 1070 cannot handle games on my 144 Hz monitor. People mostly upgrade to 120/144 Hz+, so the 1070 and 1080 are weak cards even at 1080p; we need more powerful cards.


----------



## BroPhilip

This argument is about as tired as the Micron memory one. There will always be an argument for a more powerful card, hence the reason they keep evolving. This is the beauty and curse of the PC gaming market over consoles. I bought a 1070 because of its cost-to-performance for my particular needs. I honestly also went with it because they gimped SLI on the 1060 model; I like the thought that if down the road I decide I want SLI to play with, I can possibly buy a used card. We all can't afford a Titan XP, and if we waited until we could, it would be outdated. I could have waited, but then I would be sitting with a PC that doesn't meet my needs... So if you want to SLI, then go for it.


----------



## TheGlow

Quote:


> Originally Posted by *zipper17*
> 
> Obviously 1070 on 2560x1440P is not overkill at all. Witcher 3 1440P max settings still dips down to 40-50FPS, still *can't maintain a perfect minimum 60 FPS*, what's so overkill?
> 
> GTX 1070/1080 is just have a right portion of performances at 1440P resolution, is not overkill yet.
> 
> For 1080P yeah it might be, but not really GTA5 1080P max settings Ultra Grass still below 60FPS, in Wild areas.


Double-check Hairworks. To me it seems to have little visual impact, and turning it off lets me hover at 80-85 FPS.


----------



## khanmein

Quote:


> Originally Posted by *asdkj1740*
> 
> who can tell me which of the followings would arrive first
> 
> A. evga thermal pads
> B. zotac micron bios
> C. 1080ti


Obviously the thermal pads will arrive first, followed by the 1080 Ti. Don't expect Zotac to release any BIOS, because they're busy celebrating their 10-year anniversary with the Microsoft Windows 10 Anniversary Update. woot~


----------



## khanmein

Quote:


> Originally Posted by *_Killswitch_*
> 
> I have no issues with my EVGA GTX 1070; I'm actually thinking about going SLI. I know SLI isn't supported much, or is "dead", but I hope it will gain support. Plus, I've never had an SLI system before, so for my new system I figured why the hell not. If all else fails, at least I'll have a backup GPU if one goes down.


Look at Salazar Studio; that fella's EVGA GTX 1070 SLI setup is full of trouble.

A single card is always the best solution (for NVIDIA), but I guess AMD may be better in the future for a dual-card setup.


----------



## _Killswitch_

I'm still getting SLI =S, and it's purely because I want to.


----------



## gtbtk

Quote:


> Originally Posted by *_Killswitch_*
> 
> Well, I don't mean to sound like an a$$ about it; I just thought about what GPU I wanted, and the 1070 suited my needs, and as far as I can tell it will well into the future. I don't always buy the top-of-the-line stuff; I just wanted better performance over my 680, and I got it.
> 
> If I want to go SLI, well, that's my choice too. Sure, a 1080 is better, but I'm happy with my choice and I'm sticking with it.


No one is criticizing your choices. The 1070 is a great card; as this is the owners' club, I think everyone here agrees with that.

Having said that, each card does have finite performance capabilities, and you phrased a question that sounded like you were about to get a second card now. The statement you made in this discussion seemed to indicate that a single 1070 does not meet your performance requirements.

We were only offering opinions to try and help you out.


----------



## khanmein

Quote:


> Originally Posted by *_Killswitch_*
> 
> I'm still getting SLI =S, and it's purely because I want to.


"The cash is yours."

I dropped the whole idea of upgrading the GTX 1070. Cash flow issue.


----------



## Aretak

Quote:


> Originally Posted by *TheGlow*
> 
> Double-check Hairworks. To me it seems to have little visual impact, and turning it off lets me hover at 80-85 FPS.


Hairworks doesn't have a particularly amazing visual impact when it comes to Geralt or other people, but it makes a huge difference to certain monsters in the game. I've wanted an option to enable it for monsters only since the game launched (as that seems like a good way to lessen its general performance impact whilst keeping its biggest draw), but it never happened, sadly.


----------



## TheGlow

Quote:


> Originally Posted by *Aretak*
> 
> Hairworks doesn't have a particularly amazing visual impact when it comes to Geralt or other people, but it makes a huge difference to certain monsters in the game. I've wanted an option to enable it for monsters only since the game launched (as that seems like a good way to lessen its general performance impact whilst keeping its biggest draw), but it never happened, sadly.


Check on Nexus Mods. I think I saw a mod there where, if Hairworks is set to All On, it actually turns it off for Geralt so you get it for everything else.
Edit: http://www.nexusmods.com/witcher3/mods/165/?
Although it looks like the last update was Oct 2015, it may still be relevant.


----------



## asdkj1740

Quote:


> Originally Posted by *khanmein*
> 
> Obviously the thermal pads will arrive first, followed by the 1080 Ti. Don't expect Zotac to release any BIOS, because they're busy celebrating their 10-year anniversary with the Microsoft Windows 10 Anniversary Update. woot~


lol, but I have faith in Zotac, not EVGA.


----------






## _Killswitch_

Quote:


> Originally Posted by *gtbtk*
> 
> No one is criticizing your choices. The 1070 is a great card; as this is the owners' club, I think everyone here agrees with that.
> 
> Having said that, each card does have finite performance capabilities, and you phrased a question that sounded like you were about to get a second card now. The statement you made in this discussion seemed to indicate that a single 1070 does not meet your performance requirements.
> 
> We were only offering opinions to try and help you out.


Actually, gtbtk, the GTX 1070 is a great card and I'm loving it. I want to SLI mainly because I've never run SLI before, and with this PC I'm going a little more "extreme" than I normally do. I have hopes SLI will get better support, but like I said, if it doesn't, then at least I'll have a backup card. It has nothing to do with a single GTX 1070 being bad performance or not enough =)


----------



## gtbtk

Quote:


> Originally Posted by *_Killswitch_*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> No one is criticizing your choices. The 1070 is a great card; as this is the owners' club, I think everyone here agrees with that.
> 
> Having said that, each card does have finite performance capabilities, and you phrased a question that sounded like you were about to get a second card now. The statement you made in this discussion seemed to indicate that a single 1070 does not meet your performance requirements.
> 
> We were only offering opinions to try and help you out.
> 
> 
> 
> Actually, gtbtk, the GTX 1070 is a great card and I'm loving it. I want to SLI mainly because I've never run SLI before, and with this PC I'm going a little more "extreme" than I normally do. I have hopes SLI will get better support, but like I said, if it doesn't, then at least I'll have a backup card. It has nothing to do with a single GTX 1070 being bad performance or not enough =)
Click to expand...

My suggestion was purely to alleviate the frustration that SLI can bring.

While it does work in certain cases, it is not universal, and it introduces other challenges like micro-stutter and card cooling. Additionally, depending on where you live in the world, the best-case economics are two GTX 1070s + one HB SLI bridge at ~$900-$1000, while one GTX 1080 is ~$650. It doesn't make a lot of fiscal sense if you have an alternative, especially as the 1070 is still an in-demand card with great resale value. If we were talking about an older card that is no longer in high demand, the story would be different.

But I guess if you add the educational value you would gain from it, it may be worth it to you.


----------



## ZakZakXxX

From the Zotac support center:

Hi Moaath,

This will be released before the end of this month, November. You will be updated when it is ready. Thank you for your patience.

Best regards,

Jon Villamor
ZOTAC Technical Support
Mon-Fri 9am-6pm PST

I sent this email:

Hi sir,

October has ended... I need this BIOS. How much longer must we wait for the release of the BIOS for Micron-memory GTX 1070 AMP cards?

Micron memory has locked VRAM voltage; the VRAM can't be overclocked.

Update, new email:

Hi Sir,

We're currently working on a BIOS update for GTX 1070. As soon as it is available we will notify you by email, so you no longer need to check the page from time to time. Expected release will be no later than LATE NOVEMBER. What's the best email address we can reach you at to give you an update as soon as it gets released?

Sorry for the inconvenience

Cheers,
Paul

I have 2 GTX 1070 AMPs: one with Samsung memory (old), and one with Micron memory, bought new from Amazon UK with a new system (i7 6859k, X99, etc.).


----------



## _Killswitch_

Where I live I can get another GTX 1070 for $380, so yeah, a single GTX 1080 would have been better, but I already got the 1070, so I might as well stick with it.

I fully understand SLI may be a complete headache, but it wouldn't be my first headache in life.


----------



## M3Stang

Here's a new one. I just Amazon same-dayed an Asus 24" 1440p monitor. It looks great, and games still run at 60 fps at the highest settings and look better, so it's nice. I'm also running my other two 1080p displays, so now I have a total of 3 displays with the 1440p as the main. Well, I was watching some Netflix and the computer randomly shut off and rebooted itself. I went to the kitchen to get some food, came back, and hit the space bar. The keyboard was lit up, but the computer was off again and rebooted. Now it's working normally. Could the monitor be causing this?


----------



## Inelastic

Ok, now that my 1070 FTW is under water, here are the overclocking results I got using the secondary bios. The best part was that the temperature never got above 37C.

No overclock scores (2016MHz gpu, 4006MHz memory)



overclock scores (2151MHz gpu +150, 4498MHz memory +500)


----------



## Exenth

Quote:


> Originally Posted by *Inelastic*
> 
> Ok, now that my 1070 FTW is under water, here are the overclocking results I got using the secondary bios. The best part was that the temperature never got above 37C.
> 
> No overclock scores (2016MHz gpu, 4006MHz memory)
> 
> 
> 
> overclock scores (2151MHz gpu +150, 4498MHz memory +500)


I was thinking about getting the Hydro Upgrade Kit for my 1070 FTW too, but for only 100MHz more I don't think that's worth another 100 bucks.

Now my question: how quiet is it? The ACX 3.0 cooler is already really quiet.


----------



## EDK-TheONE

I got a better score with a lower-end CPU at a lower frequency, on air:
http://www.3dmark.com/spy/669145









SCORE 5327 with NVIDIA GeForce GTX 1070(1x) and Intel Core i3-6100 Processor
Graphics Score 6705
CPU Score 2461
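For anyone wondering how 3DMark combines those subscores: if I recall the Time Spy technical guide correctly, the overall score is a weighted harmonic mean of the graphics and CPU scores with weights 0.85 and 0.15. A quick sketch that reproduces the result above (the weights are my assumption from memory, not taken from this thread):

```python
# Time Spy overall score as a weighted harmonic mean of the
# graphics and CPU scores (assumed weights: 0.85 / 0.15).
def time_spy_score(graphics: float, cpu: float) -> float:
    return 1.0 / (0.85 / graphics + 0.15 / cpu)

# Using the subscores posted above:
print(round(time_spy_score(6705, 2461)))  # 5327, matching the posted SCORE
```

The harmonic mean is why a weak CPU drags the total down much harder than a simple weighted average would.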


----------



## EDK-TheONE

double post!


----------



## Inelastic

Quote:


> Originally Posted by *Exenth*
> 
> Was thinking about getting the Hydro Upgrade Kit for my 1070 FTW too, but for only 100MHz more i don't think thats worth another 100 bucks.
> 
> Now my Question how silent is it, because the AXC 3.0 Cooler is already really silent.


Are you talking about the hybrid cooler? I have no idea how silent it is; I've never used one. It does have a blower-style fan as well as one on the radiator. I'm using the EK waterblock, so it doesn't have any fans which makes it quieter than having the ACX cooler. When I had the ACX cooler on, it was the loudest part of my computer when gaming. The other benefit I see is the temperature. It was running somewhere in the mid 60s on air with my case cover off. I never tried with the cover on since my case kind of has poor air flow due to the size and all the openings being covered in rads. Now the highest I see is 37C with the cover back on.

I don't think temperature is the limiting factor for overclocking right now; I think it's the power limit, so I don't really see water cooling allowing for higher overclocks at this point.


----------



## fauka

Quote:


> Originally Posted by *Inelastic*
> 
> Are you talking about the hybrid cooler? I have no idea how silent it is; I've never used one. It does have a blower-style fan as well as one on the radiator. I'm using the EK waterblock, so it doesn't have any fans which makes it quieter than having the ACX cooler. When I had the ACX cooler on, it was the loudest part of my computer when gaming. The other benefit I see is the temperature. It was running somewhere in the mid 60s on air with my case cover off. I never tried with the cover on since my case kind of has poor air flow due to the size and all the openings being covered in rads. Now the highest I see is 37C with the cover back on.
> 
> I don't think temperature is the limiting factor right now for overclocking, I think it's the power limit so I don't really see water cooling allowing for higher overclocks at this point.


Well, I have 2x MSI GTX 1070 Sea Hawk X and the power limit is 105%, so the OC is kind of low: both cards max out at 2088 core and 9200 VRAM, and the core voltage still jumps around like crazy. So yeah, it's the power limit, because the temps on my cards don't pass 47C.


----------



## Gurkburk

Quote:


> Originally Posted by *Inelastic*
> 
> Ok, now that my 1070 FTW is under water, here are the overclocking results I got using the secondary bios. The best part was that the temperature never got above 37C.
> 
> No overclock scores (2016MHz gpu, 4006MHz memory)
> 
> 
> 
> overclock scores (2151MHz gpu +150, 4498MHz memory +500)


Why does it seem like your Intel HD graphics are actually somehow active & boosting your scores?

I've not seen any 1070 reach these scores on Unigine heaven. Not even overclocked.

I'm at +125 core and +610 memory, and I get 2600ish in the Heaven benchmark.


----------



## Forceman

Quote:


> Originally Posted by *Gurkburk*
> 
> Why does it seem like your Intel HD graphics are actually somehow active & boosting your scores?
> 
> I've not seen any 1070 reach these scores on Unigine heaven. Not even overclocked.
> 
> I'm +125 CC & +610 Memory and i get 2600ish in Heaven benchmark.


Probably because he is running 2xAA when most people run it at 8xAA.


----------



## Gurkburk

Quote:


> Originally Posted by *Forceman*
> 
> Probably because he is running 2xAA when most people run it at 8xAA.


Aaaaaaand there we have it. Lmao, I didn't even see this. Disqualified "benchmark test," then!


----------



## Inelastic

Quote:


> Originally Posted by *Gurkburk*
> 
> Aaaaaaand there we have it. Lmao didnt even see this. Disqualified "benchmark test" that is then!


Why do people run it at custom settings? I just ran it at whatever it brought up.

I don't know why it shows the intel graphics. I have it disabled.


----------



## M3Stang

I RMA'd my card. While playing Infinite Warfare at 1440p, the card gets to like 80% load, then the screen goes black and the fans speed up to 100%. I updated the BIOS on the card; same result. Doing a cross-ship and will probably step up to the 1080. I think that kind of behavior is unacceptable at just 1440p.


----------



## jrp0079

Looking to upgrade from my old 660 Ti to a GTX 1070. Just wanted to ask whether my old Intel 3570 would bottleneck the 1070? Also wondering what the difference is between PCIe 2.0 and 3.0?


----------



## benjamen50

Quote:


> Originally Posted by *M3Stang*
> 
> I RMA'd my card. During playing infinite warfare in 1440p the card gets to like 80% load then the screen goes black and the fans speed up to 100%. I updated the BIOS on the card same result. Doing cross ship and will probably step up to the 1080. I think the 83% load is unacceptable at just 1440p.


I thought the game was just badly optimized.


----------



## Gurkburk

Quote:


> Originally Posted by *Inelastic*
> 
> Why do people run it at custom settings? I just ran it at whatever it brought up.
> 
> I don't know why it shows the intel graphics. I have it disabled.


I dont remember changing anything.

DX11
Quality Ultra
Tess: extreme
3d disabled
multi disabled
AA x8
Fullscreen
1920x1080

These are the settings I got when installing, I think. They are the "unofficial benchmark settings," at least. Anything else will be disqualified.


----------



## Inelastic

Quote:


> Originally Posted by *Gurkburk*
> 
> I dont remember changing anything.
> 
> DX11
> Quality Ultra
> Tess: extreme
> 3d disabled
> multi disabled
> AA x8
> Fullscreen
> 1920x1080
> 
> These are the settings i got when installing. I think. They are the "unofficial benchmark settings" at least. Anything else will be disqialified.


I see. Thanks.


----------



## asdkj1740

Quote:


> Originally Posted by *Inelastic*
> 
> Why do people run it at custom settings? I just ran it at whatever it brought up.
> 
> I don't know why it shows the intel graphics. I have it disabled.


1080p with 0xAA is more demanding for stressing GPUs in terms of temp and power.


----------



## madmeatballs

Just a warning for people who are planning on getting the most out of their non-reference card by doing the shunt mod: do not do it unless you really know the risks and accept them. My card went crazy hot after doing the shunt mod. I guess it doesn't apply to all Pascal cards. Even der8auer said it would be fine, and I followed him on what to short (_now, I am not blaming der8auer, as I knew this was a possible risk_). I also made sure I was shorting the right shunt resistors by using https://xdevs.com/guide/pascal_oc/#voltsc as a reference.

I don't really know what went wrong with the card, but I am sure I heard something pop on my initial run after applying CLU to short the resistors. It could be that the other parts couldn't handle the power passing through them (I don't really know; I don't have any background in electronics). I ran a few tests and found my card was running a lot hotter than normal, and GPU Boost was downclocking the card badly. The card was still usable, to be exact; there didn't seem to be anything visibly wrong with it. Looking at the PCB, I did not notice any burn marks (I even smelled it, lol), although what I failed to do was remove the VRM heatsink of the AMP Extreme. Maybe that was the culprit, since it was the only part I never saw bare, but I can't be sure; I may be wrong. Anyway, I sent the card for RMA and it was accepted for replacement. Thanks to Zotac.

Tl;dr: I just wanted to warn people to think twice before doing the shunt mod.
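For anyone unfamiliar with why the mod works (and backfires) like this: the card infers current from the voltage drop across shunt resistors of known value, so lowering the real resistance with liquid metal makes every power reading scale down by the same ratio, letting the card pull far more than its limit while reporting less. A rough sketch of the arithmetic (the resistor values are made up for illustration, not real board measurements):

```python
# Why a shunt mod under-reports power: the controller assumes a fixed
# shunt resistance R_assumed and computes current as I = V_drop / R_assumed.
# Bridging the shunt lowers the real resistance, shrinking V_drop, so the
# reported power scales down by r_actual / r_assumed even as true draw rises.
# Illustrative values only.

def reported_power(true_power_w: float, r_assumed: float, r_actual: float) -> float:
    # Reported power scales with the ratio of actual to assumed resistance.
    return true_power_w * (r_actual / r_assumed)

# Card really pulls 250 W, but the shunt resistance was halved:
print(reported_power(250.0, r_assumed=0.005, r_actual=0.0025))  # 125.0
```

This is also why the VRM can cook: the card happily feeds the extra watts through MOSFETs that were sized for the original limit.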


----------



## M3Stang

I ended up canceling the RMA; since I'm a BB Elite Plus member I still had time to return it, so I did. Ordered a 1070 FTW from Amazon; should be here today.


----------



## herkalurk

Ordered a 1070 last night, should be here later this week.

https://www.amazon.com/gp/product/B01HHCA1IO/


----------



## M3Stang

I hope this EVGA one doesn't give me issues, or I'm sending it back tonight and probably going to get a Strix.


----------



## asdkj1740

Quote:


> Originally Posted by *madmeatballs*
> 
> Just a warning for people who are planning on getting the most out of their non-reference card by doing the shunt mod, do not do it unless you really know the risks and accept it. I had my card gone crazy hot after doing the shunt mod. I guess it doesn't apply to all pascal cards. Even der8eur said it would be fine, I followed him on what to short (_now I am not blaming der8eur as I knew this possible risk_). Also made sure I am shorting the right shunt resistors by getting reference from https://xdevs.com/guide/pascal_oc/#voltsc.
> 
> I don't really know what went wrong with card but I am sure I heard something pop on my initial run after applying CLU to short the resistors. Could be that the other parts couldn't handle the power passing through it(don't really know, I don't have any background on electronics). Ran a few tests found out my card was running a lot hotter than normal, GPU boost even downclocks the card so bad. The card was still usable to be exact, it seemed like there wasn't anything wrong with it. Upon looking at the PCB, I did not notice any burn marks (even smelled it lol) although what I failed to do was remove the VRM heatsink of the amp extreme, maybe that was the culprit since it was the only part I never saw bare but I can't be sure, I may be wrong. Anyway, so I sent the card for RMA and got accepted for replacement. Thanks to Zotac.
> 
> Tl;dr, I just wanted to warn people to think twice before doing the shunt mod.


The AMP Extreme BIOS already has enough power. If your card under-measures the power input, the AMP Extreme's MOSFETs may not be good enough to handle >300W.


----------



## Gurkburk

Is the only way to bump the voltage through physical modding? Won't a custom BIOS help? Or will that never happen?


----------



## xixou

You can enable 3 way sli for now to speed up ^^


----------



## gtbtk

Quote:


> Originally Posted by *jrp0079*
> 
> Looking to upgrade from my old 660 ti to a gtx 1070. Just wanted to ask if my old intel 3570 would bottleneck with the 1070? Also wondering what the difference is between a pci 2.0 versus the 3.0?


I am running an i7-2600 non-K overclocked to 4.4GHz on a Z68 motherboard and upgraded from a GTX 660. With the OC, the CPU is almost comparable to an i5-6600.

It does impact performance a bit compared to, say, a 6700K, but it is not major. While it is not the absolute fastest performer, it still falls into the category of good enough for the time being.

The best Firestrike score I have managed is 15150, with a 20600 graphics score and a 10390 physics score. 6700K results are in the 16000s with a physics score of around 15000.

http://www.3dmark.com/fs/10705136

The best 3570 Firestrike score I can find is 14238, with a graphics score of 20500 and a physics score of 7652. It appears that CPU is running at about 4.2GHz.

http://www.3dmark.com/fs/10291672

I would suggest that it still falls into the category of being good enough for the time being, particularly if you are running 60Hz monitors. If you are trying to run 144Hz 1440p monitors you may struggle a little. If you have a Z68 or Z77 motherboard, you always have the option to pick up a used 3770K at some stage and give your rig an extra performance boost without spending a load of extra cash.


----------



## madmeatballs

Quote:


> Originally Posted by *asdkj1740*
> 
> amp extreme bios already has enough power. if your card under-measure the power input, amp extreme's mosfets may not be good enough to handle >300w.


True


----------



## M3Stang

Welp, this one is having a different issue. Playing Infinite Warfare and it just randomly reboots. So great. Regretting the 1440p monitor.


----------



## jrp0079

Quote:


> Originally Posted by *gtbtk*
> 
> I am running an overclocked i7-2600 non K overclocked at 4.4Ghz with a Z68 motherboard and upgraded from a GTX 660. With the OC, the cpu is almost comparable with a i5-6600.
> 
> It does impact performance a bit compared to say, a 6700K but it is not major. While it is not the absolute fastest performer, it still fallis into the category of good enough for the time being.
> 
> The best firestrike score I have managed is a 15150 with a 20600 graphics score and 10390 Physics score. 6700K results are in the 16000s with a physics score of around 15000
> 
> http://www.3dmark.com/fs/10705136
> 
> The best 3570 firestrike score I can find is still 14238 with graphics score of 20500 and a physics score of 7652. It appears that CPU is running at about 4.2Ghz.
> 
> http://www.3dmark.com/fs/10291672
> 
> I would suggest that it still falls into the category of being good enough for the time being particularly if you are running 60 Hz monitors. If you are trying to run 144hz 1440p monitors you may struggle a little . If you have a Z68 or Z77 motherboard, you always have the option to pick up a used 3770K at some stage and give your rig an extra performance boost without spending a load of extra cash.


Yes, I normally play at 1080p 60Hz. Thank you for the informative reply. Might have to overclock my CPU a bit and pick up a 1070 on Black Friday or Cyber Monday!


----------



## Majentrix

The RGB LEDs on my Gainward 1070s are slightly different on each card. Blue and green on one card are slightly weaker than on the other, but red appears to be the same.
Really annoying.


----------



## khanmein

Quote:


> Originally Posted by *jrp0079*
> 
> Yes i normally play at 60Hz at 1080p. But thank you for the informative reply. Might have to overclock my cpu a bit and pick up a 1070 on black friday or cyber monday!


https://www.amazon.com/MSI-GAMING-GTX-1070-8G/dp/B01GXOX3SW/ref=sr_1_2?s=pc&ie=UTF8&qid=1479198248&sr=1-2&keywords=gtx+1070

$385.99 (after rebate)

No need to wait for Black Friday.


----------



## gtbtk

Quote:


> Originally Posted by *jrp0079*
> 
> Yes i normally play at 60Hz at 1080p. But thank you for the informative reply. Might have to overclock my cpu a bit and pick up a 1070 on black friday or cyber monday!


In spite of being a bit CPU constrained, I am sure you will enjoy the performance increase regardless.


----------



## asdkj1740

Quote:


> Originally Posted by *madmeatballs*
> 
> True


If you check the Tom's Hardware and Guru3D reviews of the Zotac AMP Extreme, the MOSFET temp already reached ~110C during FurMark.
If you raise the power via the shunt mod, the MOSFETs may well rise over 125C, which is dangerous for 24/7 use.


----------



## lawrencelyl

Quote:


> Originally Posted by *asdkj1740*
> 
> if you check the tomshardware and guru3d review about zotac amp extreme, during furmark the temp of mosfet already reached ~110c.
> if you raise the power by shunt mod, the mosfet may probably rise over 125c, which is dangerous for 24/7.


Couldn't find tomshardware or guru3d review on Zotac 1070 Amp Extreme...


----------



## shadowrain

*PSA*

*DO NOT update to the latest 375.86 drivers! I repeat, DO NOT update to the latest 375.86 drivers!*

There are multiple reports of users with 10-series cards having low FPS in all games (<30 FPS at 1080p on my Zotac 1070 AMP Extreme in all my games and Heaven), caused by the drivers forcing the VRAM clock down to only 810 MHz effective. The issue didn't appear when I was using 375.70, and I'm now running 375.76. DDU and clean installs were always used.

https://forums.geforce.com/default/topic/976636/geforce-drivers/official-375-86-game-ready-whql-display-driver-feedback-thread-released-11-15-16-/4/


https://www.reddit.com/r/5d2v3u/driver_37586_faqdiscussion_thread/

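A note on reading that 810 MHz figure: monitoring tools report the GDDR5 command clock, and the quoted "effective" rate is 4x that. So 810 MHz is the low-power idle state (~3240 MHz effective) versus the 1070's stock 2002 MHz (8008 effective). A tiny converter, assuming GDDR5's 4x multiplier:

```python
# GDDR5 transfers 4 bits per pin per command clock, so the "effective"
# rate quoted in marketing is 4x the clock that monitoring tools report.
GDDR5_MULT = 4

def effective_mhz(command_clock_mhz: int) -> int:
    return command_clock_mhz * GDDR5_MULT

print(effective_mhz(810))   # stuck idle clock -> 3240 MHz effective
print(effective_mhz(2002))  # GTX 1070 stock   -> 8008 MHz effective
```

That ~4x drop in memory bandwidth lines up with the sub-30 FPS reports.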


----------



## xixou

373.06 are the best for now.


----------



## khanmein

Pascal owners should avoid installing the new 375.86, because the memory clock gets stuck at 810 MHz; Maxwell has no issue (I personally feel the driver is good otherwise).


----------



## Gurkburk

New driver boosted my BF1 FPS from around 90~ to 110-120 on the map i just tested. Will test the rest of the maps soon.

Edit: Should note that there was a massive BF1 patch today as well, might have helped too.


----------



## msigtx760tf4

http://www.3dmark.com/3dm/16077480


----------



## ElectroManiac

So the newer Nvidia 375.86 drivers are having problems with Pascal cards?

Some say it boosts BF1, and others are saying there are problems with it.


----------



## Gurkburk

Quote:


> Originally Posted by *ElectroManiac*
> 
> So newer Nvidia drivers 375.86 are having problems with Pascal cards?
> 
> Some say it boost BF1 and others saying there are problems with it.


No problems here.


----------



## M3Stang

Looks like the PSU was my issue all along. The 12V rail was running at 11.4-11.5 at idle in the bios. New PSU is at 12.24 stable at idle. No issues with the card yet.
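For anyone checking their own rails: the ATX spec allows roughly ±5% on the 12V rail, i.e. about 11.40-12.60V, so 11.4-11.5V at idle was already sitting on the floor of tolerance before the GPU even pulled a load. A trivial check using that tolerance:

```python
# Check a rail reading against the ATX ~±5% tolerance for the 12 V rail.
def rail_ok(reading_v: float, nominal_v: float = 12.0, tolerance: float = 0.05) -> bool:
    low, high = nominal_v * (1 - tolerance), nominal_v * (1 + tolerance)
    return low <= reading_v <= high

print(rail_ok(11.45))  # technically passes, but right at the 11.40 V floor
print(rail_ok(12.24))  # comfortably in spec
```

A rail that's borderline at idle will usually sag out of spec under load, which matches the black-screen-under-load symptom.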


----------



## StrelokAT

Just did a BIOS update on my MSI GTX 1070. I hope my Micron memory overclocks a little bit higher now (at least they did something :thumb:).
Did anybody notice an improvement? I unfortunately don't have time for benching right now.

I will test it with the Metro LL benchmark tool and BF1 at the weekend.
Of course for many hours, and not only for 5 or 10 minutes. But right now I don't have time. Dammit!


----------



## DanielB123

Quote:


> Originally Posted by *asdkj1740*
> 
> if you check the tomshardware and guru3d review about zotac amp extreme, during furmark the temp of mosfet already reached ~110c.
> if you raise the power by shunt mod, the mosfet may probably rise over 125c, which is dangerous for 24/7.


Please link these articles, I'm interested in having a look.


----------



## gtbtk

Quote:


> Originally Posted by *StrelokAT*
> 
> Just made Bios update on my Msi gtx1070. I hope that my Micron Memory overcocks now a little bit higher. (at least they did something:thumb
> Did anybody notice an improvement? I unfortunately dont have time for benching right now?
> 
> I will test it with MetroLL BenchmarkTool and Bf1 at the weekend.
> Of course for many hours and not only for 5 or 10 minutes. But right now i dont have time. Damit!


The whole card/memory is more stable. No more Video Scheduler BSODs. I get an improvement of about 100MHz over what I could achieve on the original .26 BIOS. Before, anything much above 9100MHz after fiddling with the voltages would start to display blue checkerboard artifacts. I can now run just under 9200MHz before I start seeing the traditional flashing/tearing type artifacts that I would expect from a heavily OC'd GPU.


----------



## Mad Pistol

Hey guys,

Just installed the 375.86 drivers, and now I am getting text flickering on my 1070 SLI setup in Battlefield 1. Can anyone else with a 1070 SLI setup confirm this?


----------



## khanmein

Quote:


> Originally Posted by *ElectroManiac*
> 
> So newer Nvidia drivers 375.86 are having problems with Pascal cards?
> 
> Some say it boost BF1 and others saying there are problems with it.


I don't have BF1, but from what I know a lot of people are complaining about SLI, BF1, flickering, and the memory clock stuck at 810 MHz (memory leak).

Maxwell users have no issue, and I noticed some improvement.


----------



## Hunched

I've never had trouble installing an Nvidia driver, but no matter what I do, the 375.86 installation fails.
Every other driver still installs fine... So I guess I'll just install 375.76 again, because I can actually install that one.

Hopefully the next driver will actually install on my pretty fresh and clean Windows 10 install.
Not even DDU helps.


----------



## herkalurk

Changed my order, but Amazon same day delivery is fun. Installed a couple hours ago.

https://www.techpowerup.com/gpuz/details/zbnkf


----------



## TheNoub

Hey guys, I tried searching the forums for something about my card but couldn't find it so I figured i'd ask you guys. I'm new to this site (just registered) but I read a lot over the past few years.

So now to the problem: I'm stuck at 2113-2150MHz on my Gigabyte GTX 1070 G1 Gaming, and sometimes (mostly in more demanding games like Witcher 3 or DOOM) I hit the power limit of 111%; then the card throttles to a little over 2050MHz. My voltage stays the same (mostly) at 1.07v or 1.81v. I only see 1.093v in 3DMark or other benchmarks.

The question: I'm not asking about the voltage part, I get that the 1070s and 1080s don't do so well with more voltage, but could I flash my BIOS to, let's say, a Strix OC one (which I think removes the power limit completely) and have better results in achieving 2150-2200MHz?

BTW, I can get my card to 2200MHz, but then the throttling is constant; no artifacts or glitches noticed. So I think I won the lottery with my card, but I'm limited because of the goddamn power limit.

Any help is appreciated, and I would like to know if someone has successfully flashed the Strix OC BIOS onto the G1 Gaming card.


----------



## gtbtk

Quote:


> Originally Posted by *TheNoub*
> 
> Hey guys, I tried searching the forums for something about my card but couldn't find it so I figured i'd ask you guys. I'm new to this site (just registered) but I read a lot over the past few years.
> 
> So now with the problem: I'm stuck at 2113-2150mHz on my Gigabyte GTX 1070 G1 Gaming and sometimes (mostly in more demanding games like Witcher 3 or DOOM) I hit the power limit of 111%, then the card throttles a little over 2050mHz. My voltage stays the same (mostly) at 1.07v or 1.81v. I only see 1.093v in 3Dmark or other benchmarks.
> 
> The Question: I'm not asking for the voltage part, I get that the 1070s and 1080s don't go so well with more voltage, but could I flash my BIOS to let's say a STRIXX OC (which I think removes the power limit completly) and have better results in achieving 2150-2200mHz?
> 
> BTW, I can get my card to 2200mHz but then the throttling is constant, no artifacts or glitches noticed. So I think I won the lotery with my card but I'm limited because of the goddamn power limit.
> 
> Any help appreciated and I would like to know if someone succesfully flashed the STRIXX OC BIOS onto the G1 Gaming card.


Your card is a very good one. The behavior of your card is completely normal and is the way GPU Boost 3.0 works, balancing load, temps, and voltage. Even at 2050MHz under load, you are still getting about a 300MHz overclock. There is currently no way to push more than 1.093V through 1070 cards, as there is currently no way to create a custom BIOS.

I suggest that you do some performance comparisons at different starting frequencies. I can run my card at speeds well above 2100, but I end up with more FPS if I run it at 2088MHz. I cannot explain why that is the case, and your mileage may vary, but it is worth confirming with your hardware.

The Strix XOC BIOS you are talking about is for 1080 cards, not 1070s, as far as I am aware. The 1080s using that BIOS only work well under water; they tend to overheat with air cooling, from reports I have read.
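As a mental model of that throttling (just a sketch, not NVIDIA's actual algorithm): GPU Boost steps the clock down in small bins until estimated board power falls back under the limit, which is why the card settles a bit above 2050 instead of holding the requested clock:

```python
# Toy model of power-limit throttling: step the core clock down in
# ~13 MHz bins (roughly Pascal's clock granularity) until estimated
# board power is under the limit. The linear power model is a crude
# simplification for illustration; this is not NVIDIA's real algorithm.

BIN_MHZ = 13

def throttle(clock_mhz: int, power_w: float, limit_w: float) -> int:
    watts_per_mhz = power_w / clock_mhz  # crude linear power estimate
    while clock_mhz * watts_per_mhz > limit_w:
        clock_mhz -= BIN_MHZ
    return clock_mhz

# A 2200 MHz target drawing 180 W against a 166 W board power limit:
print(throttle(2200, 180.0, 166.0))
```

The takeaway is that the settled clock is a function of the power limit, not of stability, so a "stable" 2200 MHz offset can still run slower in practice than a lower offset that never trips the limit.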


----------



## xGeNeSisx

Quote:


> Originally Posted by *TheNoub*
> 
> Hey guys, I tried searching the forums for something about my card but couldn't find it so I figured i'd ask you guys. I'm new to this site (just registered) but I read a lot over the past few years.
> 
> So now with the problem: I'm stuck at 2113-2150mHz on my Gigabyte GTX 1070 G1 Gaming and sometimes (mostly in more demanding games like Witcher 3 or DOOM) I hit the power limit of 111%, then the card throttles a little over 2050mHz. My voltage stays the same (mostly) at 1.07v or 1.81v. I only see 1.093v in 3Dmark or other benchmarks.
> 
> The Question: I'm not asking for the voltage part, I get that the 1070s and 1080s don't go so well with more voltage, but could I flash my BIOS to let's say a STRIXX OC (which I think removes the power limit completly) and have better results in achieving 2150-2200mHz?
> 
> BTW, I can get my card to 2200mHz but then the throttling is constant, no artifacts or glitches noticed. So I think I won the lotery with my card but I'm limited because of the goddamn power limit.
> 
> Any help appreciated and I would like to know if someone succesfully flashed the STRIXX OC BIOS onto the G1 Gaming card.


I also have the G1 1070, and you absolutely have a good card. I cannot even get a meaningful overclock, and I have tried for quite some time. My card will throttle when it hits 102%, even with the slider maxed out to 111%. I can't even get past 2050MHz core clock, as the power readings show the card going from the 80s at 2000MHz to 104-107 at 2050MHz. Do you have Micron memory or Samsung, out of curiosity?

I have never crossflashed a BIOS on the 10xx series, but the biggest problem is that not all manufacturers used the same VRM layouts or components in their voltage controllers. The TDP power slider levels are not the same.

I am unsure whether the limitation is imposed by having only a single 8-pin powering the card. A BIOS from another card that is also powered by a single 8-pin (I believe the Strix might be) would be the best choice, though I am still very wary of crossflashing.

I don't have much information on crossflashing, but I did attempt a shunt mod on the card. It essentially makes the card think it is drawing less power than it is, by lowering the resistance of the shunts in the VRM area using CLU. Less power gets reported against the limit, so the card can be pushed further via the slider. After doing so, the card had two modes: idle and max. GPU Boost was essentially disabled and a constant, steady voltage was consistently applied. I did not get to experiment with the mod for long, as the new BIOS update was released later that day. I cleaned up the CLU shunt mod and flashed the update before I could fully experiment with the card in that state.

I have been experimenting with the card for quite some time, and have been very curious about why the card throttles so aggressively even when it seems unnecessary. Ultimately, the 8-pin power delivery may be the main factor.


----------



## gtbtk

Quote:


> Originally Posted by *xGeNeSisx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheNoub*
> 
> Hey guys, I tried searching the forums for something about my card but couldn't find it so I figured i'd ask you guys. I'm new to this site (just registered) but I read a lot over the past few years.
> 
> So now with the problem: I'm stuck at 2113-2150mHz on my Gigabyte GTX 1070 G1 Gaming and sometimes (mostly in more demanding games like Witcher 3 or DOOM) I hit the power limit of 111%, then the card throttles a little over 2050mHz. My voltage stays the same (mostly) at 1.07v or 1.81v. I only see 1.093v in 3Dmark or other benchmarks.
> 
> The Question: I'm not asking for the voltage part, I get that the 1070s and 1080s don't go so well with more voltage, but could I flash my BIOS to let's say a STRIXX OC (which I think removes the power limit completly) and have better results in achieving 2150-2200mHz?
> 
> BTW, I can get my card to 2200mHz but then the throttling is constant, no artifacts or glitches noticed. So I think I won the lotery with my card but I'm limited because of the goddamn power limit.
> 
> Any help appreciated and I would like to know if someone succesfully flashed the STRIXX OC BIOS onto the G1 Gaming card.
> 
> 
> 
> I also have the G1 1070, and you absolutely have good card. I cannot even get a meaningfully overclock and I have tried for quite some time. My card will throttle when it hits 102% even when the slider is maxed out to 111%. I can't even get past 2050mhz core clock as the power readings show the card goes from 80s at 2000mhz to 104-107 at 2050mhz. Do you have micron memory or Samsung out of curiosity?
> 
> I have never crossflashed a bios on the 10XX series, but the biggest problem is that not all manufacturers used the same VRM layouts or components in their voltage controllers. The TDP power slider levels are not the same.
> 
> I am unsure if the limitation is imposed by only having a single 8pin powering the card. Any card (I believe the Strixx might) that is also powered by an 8 pin is the best choice, though I am still very wary of crossflashing.
> 
> I don't have much information on crossflashing, but I attempted to use a shunt mod on the card. It essentially makes the card think that it is drawing less power than it is by lowering the resistance of the shunts in the VRM area using CLU. The card will have less power limit reported to it, and thus it can be further increased via the slider. After doing so the card had two modes - idle and max. GPU boost was essentially disabled and constant steady voltage was consistently applied. I did not get to experiment with the mod for long as the new Bios update was released later that day. I cleaned up the CLU shunt mod and flashed the update before I could begin to fully experiment with the card in that state.
> 
> I have been experimenting with the card for quite some time and have been very curious concerning why the card throttles so aggressively and when it is unnecessary to do so. Ultimately, the 8pin power delivery may be the main factor
Click to expand...

Your card only has a 6+2 phase VRM. You should be fine flashing EVGA BIOSes, as they use 5 channels from the voltage controller and unmanaged doublers to get their "10 phase" VRM. The FTW BIOS will also increase your power limit a little, from 200W to 226W.

The Strix has an 8+2 phase VRM. The MSI cards are also 8+2 phase VRM cards, but they have 8+6 pin power. The voltage controllers are the same 8-channel units on just about all of them, but I don't know what impact there would be with a BIOS that expects extra voltage controller channels that do not exist.

The Galax/KFA2 HOF cards are the ones that use a different voltage controller and should be avoided for crossflashing.


----------



## zipper17

Quote:


> Originally Posted by *TheNoub*
> 
> Hey guys, I tried searching the forums for something about my card but couldn't find it so I figured i'd ask you guys. I'm new to this site (just registered) but I read a lot over the past few years.
> 
> So now with the problem: I'm stuck at 2113-2150mHz on my Gigabyte GTX 1070 G1 Gaming and sometimes (mostly in more demanding games like Witcher 3 or DOOM) I hit the power limit of 111%, then the card throttles a little over 2050mHz. My voltage stays the same (mostly) at 1.07v or 1.81v. I only see 1.093v in 3Dmark or other benchmarks.
> 
> The question: I'm not asking about the voltage part, I get that the 1070s and 1080s don't do so well with more voltage, but could I flash my BIOS to, let's say, a Strix OC (which I think removes the power limit completely) and get better results in achieving 2150-2200MHz?
> 
> BTW, I can get my card to 2200MHz, but then the throttling is constant, no artifacts or glitches noticed. So I think I won the lottery with my card, but I'm limited because of the goddamn power limit.
> 
> Any help appreciated, and I would like to know if someone successfully flashed the Strix OC BIOS onto the G1 Gaming card.


Did you ever run the Firestrike Extreme/Ultra Stress Test? What's your % result?
Just giving you some tips to make sure your card is really stable at that core speed.
Quote:


> Originally Posted by *M3Stang*
> 
> Looks like the PSU was my issue all along. The 12V rail was running at 11.4-11.5 at idle in the bios. New PSU is at 12.24 stable at idle. No issues with the card yet.


Yes, voltage rail ripple is pretty important for overclocking.
If the PSU's rails can't deliver stable output, your overclocked components won't get stable voltages.
"This would result in the GPU rendering incorrectly, driver crashes, and general instability."
http://www.gamersnexus.net/guides/2053-power-supply-voltage-ripple-and-relevance
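To make the spec side of that concrete, here's a minimal check of a 12V rail reading against the ATX tolerance window of ±5% (11.40V to 12.60V). The readings are the ones reported above; software sensors are rough, so a multimeter is the real test.

```python
# Check a 12V rail reading against the ATX +/-5% window (11.40V-12.60V).
# Readings sitting right at the edge of the window are flagged as marginal.

ATX_12V_MIN = 11.40
ATX_12V_MAX = 12.60

def rail_status(volts):
    if volts < ATX_12V_MIN:
        return "out of spec (low)"
    if volts > ATX_12V_MAX:
        return "out of spec (high)"
    if volts - ATX_12V_MIN < 0.1 or ATX_12V_MAX - volts < 0.1:
        return "in spec, but marginal"
    return "in spec"

print(rail_status(11.4))    # the old PSU: right at the 11.40V floor
print(rail_status(12.24))   # the new PSU: comfortably inside the window
```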
Quote:


> Originally Posted by *shadowrain*
> 
> *PSA*
> 
> *DO NOT update to the latest 375.86 drivers! I repeat, DO NOT update to the latest 375.86 drivers!*
> 
> Multiple users of 10-series cards are reporting low FPS in all games (<30 FPS at 1080p on my 1070 Zotac AMP Extreme in all my games and Heaven), caused by the drivers forcing the VRAM clock to only 810MHz effective. The issue doesn't appear on 375.70, and I'm now running 375.76. DDU and clean installs were always used.
> 
> https://forums.geforce.com/default/topic/976636/geforce-drivers/official-375-86-game-ready-whql-display-driver-feedback-thread-released-11-15-16-/4/
> 
> 
> __
> https://www.reddit.com/r/5d2v3u/driver_37586_faqdiscussion_thread/
> 
> EDIT: DDU and Clean Installs were always used.


Wow. I'm still on 373.06.
Looks like drivers newer than 373.06 have a lot of problems.


----------



## Avendor

Not really, installed 375.86 no problems so far


----------



## khanmein

Quote:


> Originally Posted by *Avendor*
> 
> Not really, installed 375.86 no problems so far


For the majority of Maxwell users, but not for Pascal. I think it's related to the Micron row hammer issue.


----------



## Avendor

Thanks for the heads up


----------



## LogicusMPS

Quote:


> Originally Posted by *Avendor*
> 
> Not really, installed 375.86 no problems so far


I also installed it, and from what I can see I got higher GPU usage in games and fewer FPS drops. Works fine.


----------



## msigtx760tf4

Quote:


> Originally Posted by *LogicusMPS*
> 
> I also installed it, and from what I can see I got higher GPU usage in games and fewer FPS drops. Works fine.


Working fine for me.


----------



## herkalurk

I guess I should install it too... I'm quite behind on driver versions (369.09).


----------



## TheNoub

Quote:


> Originally Posted by *gtbtk*
> 
> Your card is a very good one. The behavior of your card is completely normal and is the way GPU Boost 3.0 works, balancing load, temps and voltage. Even at 2050MHz under load, you are still getting about a 300MHz overclock. There is currently no way to push more than 1.093V through 1070 cards, as there is currently no way to create a custom BIOS.
> 
> I suggest that you do some performance comparisons at different starting frequencies. I can run my card at speeds well above 2100, but I end up with more FPS if I run it at 2088MHz. I cannot explain why that is the case and your mileage may vary, but it is worth confirming with your hardware.
> 
> The Strix XOC BIOS you are talking about is for 1080 and not 1070 cards, as far as I am aware. The 1080s using that BIOS only work well under water; they tend to overheat with air cooling, from reports I have read.


I did some comparisons, especially with 3DMark and Heaven. My final score with the Time Spy benchmark is 6051 (http://www.3dmark.com/spy/734584); I had around 5800-5900 scores with lower clocks. Thanks for the info on the 1080 XOC.
Quote:


> Originally Posted by *xGeNeSisx*
> 
> I also have the G1 1070, and you absolutely have a good card. I cannot even get a meaningful overclock, and I have tried for quite some time. My card will throttle when it hits 102%, even when the slider is maxed out to 111%. I can't even get past a 2050MHz core clock, as the power readings show the card going from the 80s at 2000MHz to 104-107 at 2050MHz. Do you have Micron memory or Samsung, out of curiosity?


I have Samsung memory on my card (I thought all Gigabyte cards had Samsung, might be wrong though), also the latest BIOS, F2.
Quote:


> Originally Posted by *zipper17*
> 
> Did you ever run Firestrike Extreme/Ultra Stress Test? What's your % result?


Yes, I ran several benchmarks and they all passed; the latest result is 6051 in Time Spy (http://www.3dmark.com/spy/734584)
Quote:


> Originally Posted by *Avendor*
> 
> Not really, installed 375.86 no problems so far


Same, I'm on 375.86 with no problems at all, memory still at 9216MHz. gl guys


----------



## cutty1998

Holy moly! I finally have a working Strix 1070, after 2 months of RMA torture! They (Asus) actually sent me a new card that works. Amazingly, it can play Crysis! Now I can finally try to sell my 980; I am hoping to get $200 for it, which is what I got for my 680s after 2 years. The 980 is a great card and gave me 2 years of flawless performance. The Strix 1070 looks awesome in my HAF case with the red fans, and the card is pulsing red.


----------



## BroPhilip

I hate to bring up the Micron issue again, but I wonder if the new driver issue is connected to Micron. Mine is Micron memory and it wouldn't clock my RAM higher than 800MHz, so I had to downgrade. What are other Samsung and Micron users finding with this release?


----------



## TheGlow

I know I played some Heroes of the storm last night and upgraded drivers, but can't recall which order I did it.
I think drivers first.
I'll double check later.


----------



## Nukemaster

I have not noticed any issues with the latest drivers, but I am not playing anything that demanding. I will check tonight.

I have Samsung memory on my Asus card.


----------



## BroPhilip

Turns out it is an issue with AIB cards that have factory-overclocked memory.

From ManuelGuzman

Just a quick update on the memory clock speed stuck at 810MHz issue. This issue seems to affect some factory overclocked GeForce GTX 1060/1070/1080 cards. We have reproduced this issue and identified the root cause. A hotfix driver will be made available as soon as a fix has been confirmed.
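If you want to check whether you're affected without eyeballing graphs, the symptom is easy to test for. The `nvidia-smi --query-gpu=clocks.mem --format=csv,noheader` invocation is real, but the sample strings below are made up for illustration; feed in the command's actual output from your own system.

```python
# Detect the "VRAM stuck at 810MHz under load" symptom from
# nvidia-smi-style output. The sample strings are hypothetical.

STUCK_MHZ = 810

def memory_clock_stuck(smi_output, expected_mhz):
    """True if any reported memory clock sits at the known-bad 810MHz."""
    for line in smi_output.strip().splitlines():
        mhz = int(line.split()[0])   # lines look like "810 MHz"
        if mhz == STUCK_MHZ and expected_mhz > STUCK_MHZ:
            return True
    return False

print(memory_clock_stuck("810 MHz", expected_mhz=2002))   # affected card
print(memory_clock_stuck("2002 MHz", expected_mhz=2002))  # healthy card
```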


----------



## cutty1998

That's right, I just read about the whole Micron/Samsung memory issue with these cards. With my luck, I'm sure I got the 8000MHz Micron memory. I do have the serial number on the RMA receipt. Is there a way to tell which memory by the serial #? Mine is G6C0YZKK001Z. Any help appreciated!


----------



## backie

No downclocking with the new driver on micron here.


----------



## Nukemaster

Quote:


> Originally Posted by *cutty1998*
> 
> That's right, I just read about the whole Micron/Samsung memory issue with these cards. With my luck, I'm sure I got the 8000MHz Micron memory. I do have the serial number on the RMA receipt. Is there a way to tell which memory by the serial #? Mine is G6C0YZKK001Z. Any help appreciated!


Use gpu-z to check the memory brand.
https://www.techpowerup.com/gpuz/


----------



## chrcoluk

For those who don't know, the new Palit BIOS raises the TDP limit, so my card no longer throttles.


----------



## cutty1998

Quote:


> Originally Posted by *Nukemaster*
> 
> Use gpu-z to check the memory brand.
> https://www.techpowerup.com/gpuz/


Thank you very much. I am downloading the newest version of GPU-Z now!


----------



## gtbtk

Quote:


> Originally Posted by *TheNoub*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Your card is a very good one. The behavior of your card is completely normal and is the way GPU Boost 3.0 works, balancing load, temps and voltage. Even at 2050MHz under load, you are still getting about a 300MHz overclock. There is currently no way to push more than 1.093V through 1070 cards, as there is currently no way to create a custom BIOS.
> 
> I suggest that you do some performance comparisons at different starting frequencies. I can run my card at speeds well above 2100, but I end up with more FPS if I run it at 2088MHz. I cannot explain why that is the case and your mileage may vary, but it is worth confirming with your hardware.
> 
> The Strix XOC BIOS you are talking about is for 1080 and not 1070 cards, as far as I am aware. The 1080s using that BIOS only work well under water; they tend to overheat with air cooling, from reports I have read.
> 
> 
> 
> I did some comparisons, especially with 3DMark and Heaven. My final score with the Time Spy benchmark is 6051 (http://www.3dmark.com/spy/734584); I had around 5800-5900 scores with lower clocks. Thanks for the info on the 1080 XOC.
> Quote:
> 
> 
> 
> Originally Posted by *xGeNeSisx*
> 
> I also have the G1 1070, and you absolutely have a good card. I cannot even get a meaningful overclock, and I have tried for quite some time. My card will throttle when it hits 102%, even when the slider is maxed out to 111%. I can't even get past a 2050MHz core clock, as the power readings show the card going from the 80s at 2000MHz to 104-107 at 2050MHz. Do you have Micron memory or Samsung, out of curiosity?
> 
> Click to expand...
> 
> I have Samsung memory on my card (I thought all Gigabyte cards had Samsung, might be wrong though), also the latest BIOS, F2.
Click to expand...

They are all different. Your Time Spy results are really good. I am only running an i7-2600, so it impacts my physics and combined scores, but the best I have managed is just over 5800.

Gigabyte cards have both Micron and Samsung memory, just like all the other branded cards, including branded Founders Edition cards. Gigabyte lied to Guru3D when they wrote the article about the new BIOS to fix the Micron bug. Isn't it amazing how a single piece of misinformation becomes widespread "fact"?

Since the .50 BIOS upgrade fixed the voltage/timing bug, there is nothing wrong with Micron memory. It certainly has no effect on the core frequency overclock potential.


----------



## gtbtk

Quote:


> Originally Posted by *cutty1998*
> 
> That's right, I just read about the whole Micron/Samsung memory issue with these cards. With my luck, I'm sure I got the 8000MHz Micron memory. I do have the serial number on the RMA receipt. Is there a way to tell which memory by the serial #? Mine is G6C0YZKK001Z. Any help appreciated!


If you have Micron memory, make sure that you have the 86.04.50.00.xx BIOS installed on your card; then there is no Micron issue any more. The BIOS update fixed the bug that caused the issue.

If you have an earlier version of the bios, the update utility can be downloaded from Asus.


----------



## shadowrain

Quote:


> Originally Posted by *BroPhilip*
> 
> I hate to bring up the Micron issue again, but I wonder if the new driver issue is connected to Micron. Mine is Micron memory and it wouldn't clock my RAM higher than 800MHz, so I had to downgrade. What are other Samsung and Micron users finding with this release?


Not this Micron thing again.

I'm Samsung, I wrote the PSA of the driver issue here.
Quote:


> Originally Posted by *gtbtk*
> 
> If you have Micron memory, make sure that you have the 86.04.50.00.xx BIOS installed on your card; then there is no Micron issue any more. The BIOS update fixed the bug that caused the issue.
> 
> If you have an earlier version of the bios, the update utility can be downloaded from Asus.


+100

also *crickets* MySX *crickets*


----------



## zipper17

Quote:


> Originally Posted by *TheNoub*
> 
> Yes, I ran several benchmarks and they all passed; the latest result is 6051 in Time Spy (http://www.3dmark.com/spy/734584)


The Firestrike Stress Test, not the Firestrike benchmark.


----------



## TheNoub

Quote:


> Originally Posted by *zipper17*
> 
> Firestrike Stress Test not Firestrike benchmark.


I will as soon as I get home. By the way, the GPU is stable in all the games I've played so far; it's been 2 weeks now on the same clocks. I played Overwatch, Heroes of the Storm, Witcher 3, DOOM 2016, Deus Ex: Human Revolution, and Ashes of the Singularity (also the benchmark), so I guess if I don't have issues playing games the GPU should be Firestrike stable. I'll give you an update.

As for throttling, I noticed some games drop to 2080 from 2113 because of the TDP limit; voltage stayed between 1.05v and 1.07v. Only the Ashes of the Singularity benchmark (1440p Crazy settings, 2xAA) gave me both Vrel AND Pwr limits.


----------



## cutty1998

Quote:


> Originally Posted by *gtbtk*
> 
> If you have Micron memory, make sure that you have the 86.04.50.00.xx BIOS installed on your card; then there is no Micron issue any more. The BIOS update fixed the bug that caused the issue.
> 
> If you have an earlier version of the bios, the update utility can be downloaded from Asus.


Thanx, yeah, I have the Micron memory modules. I am going to check the BIOS version. I just got the card yesterday and it looked to be brand new, although it could very well be a refurbished card. I have never flashed a video card, as I was always afraid of bricking! It would be nice to be able to overclock the memory though.


----------



## ITAngel

Has anyone released any tools to modify the GPU BIOS on a GTX 1070? Are we still having issues with locked voltages in the BIOS?


----------



## ITAngel

Quote:


> *DO NOT update to the latest 375.86 drivers! I repeat, DO NOT update to the latest 375.86 drivers!*
> 
> Multiple users of 10-series cards are reporting low FPS in all games (<30 FPS at 1080p on my 1070 Zotac AMP Extreme in all my games and Heaven), caused by the drivers forcing the VRAM clock to only 810MHz effective. The issue doesn't appear on 375.70, and I'm now running 375.76. DDU and clean installs were always used.
> 
> https://forums.geforce.com/default/topic/976636/geforce-drivers/official-375-86-game-ready-whql-display-driver-feedback-thread-released-11-15-16-/4/
> 
> 
> __
> https://www.reddit.com/r/5d2v3u/driver_37586_faqdiscussion_thread/
> 
> EDIT: DDU and Clean Installs were always used.


I agree... I was running Overwatch just fine on Epic with the previous drivers. I updated to the new drivers and could barely play the game; I actually had to remove myself from the team to do some testing, and it kept causing issues. I then removed the drivers, installed the previous ones, and all the problems were gone.

The symptoms felt like lag on the memory side; this was on a Zotac GTX 1070 AMP! Extreme. I had no issues with that card until these new drivers.

Thanks mate! +1


----------



## Gurkburk

After playing a while, the new drivers went to ****; I had to downgrade.

But I noticed after downgrading to .76 that my core clock is being used 100%, but the memory on my 1070 only shows 2766 out of 8GB used, or rather 4600MHz per memory(?).

Any ideas? Checking this in GPU-Z.


----------



## jadenx2

Hey guys,

I have the exact GPU that has been having overheating issues and realized I need to update my firmware. But I have a dual bios card, and I have no idea how to install the firmware or how to 'set my GPU to BIOS'.

any help would be great, thanks!


----------



## Inelastic

Quote:


> Originally Posted by *Majentrix*
> 
> The RGB LEDs on my Gainward 1070s are slightly different on either card. Blue and green on one card are slightly weaker than on the other, but red appears to be the same.
> Really annoying.


Ugh, I bet that's annoying. I wonder if it's an issue with voltage differences to the RGB LEDs. I was looking at the EVGA ACX 3.0 LEDs and noticed that the pinout is exactly the same as SMD 5050 strips. If I hadn't already replaced it with an EK water block, I would have hooked it up to my Arduino Uno and run it off there so I could sync it up with the rest of my PC lights. I'm guessing the LEDs on yours are similar and could be controlled the same way.

I've updated to the latest 375.86 drivers and haven't had any issues so far. It has actually fixed an issue I had with artifacts in gifs.


----------



## msigtx760tf4

Quote:


> Originally Posted by *shadowrain*
> 
> *PSA*
> 
> *DO NOT update to the latest 375.86 drivers! I repeat, DO NOT update to the latest 375.86 drivers!*
> 
> Multiple users of 10-series cards are reporting low FPS in all games (<30 FPS at 1080p on my 1070 Zotac AMP Extreme in all my games and Heaven), caused by the drivers forcing the VRAM clock to only 810MHz effective. The issue doesn't appear on 375.70, and I'm now running 375.76. DDU and clean installs were always used.
> 
> https://forums.geforce.com/default/topic/976636/geforce-drivers/official-375-86-game-ready-whql-display-driver-feedback-thread-released-11-15-16-/4/
> 
> 
> __
> https://www.reddit.com/r/5d2v3u/driver_37586_faqdiscussion_thread/


here is an Update/Fix of that issue LINK

----------



## SpirosKGR

Has anyone tried the new hotfix drivers with Battlefield 1?


----------



## ITAngel

Quote:


> Originally Posted by *msigtx760tf4*
> 
> here is an Update/Fix of that issue LINK


Thanks I will be testing this now I will report back if it works for me or not.


----------



## EDK-TheONE

The new 375.95 driver is a beast! Rock solid stable at 2176MHz (2189MHz, one step down!) and 9400MHz. The Firestrike Ultra stress test hit 99.7% (~250 watts).

The Zotac 1070 AMP! Extreme is a real beast!


----------



## ITAngel

Yes the new drivers are working perfectly. Thank You!


----------



## gtbtk

Quote:


> Originally Posted by *cutty1998*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> If you have Micron memory and make sure that you have the 86.04.50.00.xx bios installed on your card, there is no Micron issue any more. The bios update fixed the bug that caused the issue.
> 
> If you have an earlier version of the bios, the update utility can be downloaded from Asus.
> 
> 
> 
> thanx , Yeah I have the Micron memory modules. I am going to check on the bios version,I just got the card yesterday ,and it looked to be brand new,although it could very well be a refurbished card. I have never flashed a video card as I was always afraid of bricking! It would be nice to be able to overclock the memory though.
Click to expand...

It will depend on how long the card was sitting on a shelf or in a warehouse before you bought it. The new BIOS has only been out for about 4 weeks, so anything made before then will not have the new BIOS ex factory. It does not necessarily mean that the card is refurbished.

The BIOS update is very straightforward and automated when you use the Asus update utility. The one download has the BIOS files for their entire range of 1070 cards and will check that it applies the correct BIOS to your card. Just run the card at default settings and you should be fine.


----------



## Snuckie7

Quote:


> Originally Posted by *EDK-TheONE*
> 
> The new 375.95 driver is a beast! Rock solid stable at 2176MHz (2189MHz, one step down!) and 9400MHz. The Firestrike Ultra stress test hit 99.7% (~250 watts).
> 
> The Zotac 1070 AMP! Extreme is a real beast!


Is this a better overclock than before you updated your drivers?


----------



## ITAngel

Quote:


> Originally Posted by *EDK-TheONE*
> 
> The new 375.95 driver is a beast! Rock solid stable at 2176MHz (2189MHz, one step down!) and 9400MHz. The Firestrike Ultra stress test hit 99.7% (~250 watts).
> 
> The Zotac 1070 AMP! Extreme is a real beast!


Hey what are your settings like? I want to compare them once I do my new test. Maybe I can push my card even higher now.


----------



## gtbtk

Quote:


> Originally Posted by *jadenx2*
> 
> Hey guys,
> 
> I have the exact GPU that has been having overheating issues and realized I need to update my firmware. But I have a dual bios card, and I have no idea how to install the firmware or how to 'set my GPU to BIOS'.
> 
> any help would be great, thanks!


I assume you are talking about the EVGA FTW 1070? There is a 2-position slide switch on the card for BIOS selection, usually next to the PCIe power sockets.

You need to download 2 update files: one for the primary BIOS and the other for the secondary BIOS.

Turn off your PC and set the switch to the primary BIOS, boot the computer, and run the primary BIOS update until it completes and reports the update successful.

Shut down your PC, change the switch to the secondary BIOS, boot up again, and run the update utility for the secondary BIOS.


----------



## _Killswitch_

With the newest drivers, I really haven't found the "sweet spot" for my 1070 yet. Maybe I'll play with it tomorrow.


http://www.3dmark.com/3dm/16123901


----------



## ITAngel

I was testing mine, and I could swear these drivers push my GTX 1070 AMP! Extreme past its default OC on the core. I played Overwatch for 3 hours at a 72F ambient temp, and the GPU was running at 1999MHz with temps at 50C. Is it normal for stock settings on these Zotac GTX 1070 AMP Extremes to run that high without an OC? By the way, the CPU was at 32C, so I take it the case was running pretty cool. I will do a better test this weekend and record my findings. I used GPU-Z to monitor.


----------



## Bold Eagle

Quote:


> Originally Posted by *_Killswitch_*
> 
> With the newest driver's, really haven't found the "sweet spot" for my 1070 yet, maybe i'll play with it tomorrow
> 
> 
> http://www.3dmark.com/3dm/16123901


Transgender unit there?


----------



## RyanRazer

Quote:


> Originally Posted by *EDK-TheONE*
> 
> The new 375.95 driver is a beast! Rock solid stable at 2176MHz (2189MHz, one step down!) and 9400MHz. The Firestrike Ultra stress test hit 99.7% (~250 watts).
> 
> The Zotac 1070 AMP! Extreme is a real beast!


Wow, that looks insane. I play at 2050MHz now with the AMP! Ext. Let me test that. What were your clocks before?


----------



## _Killswitch_

Quote:


> Originally Posted by *Bold Eagle*
> 
> Transgender unit there?


Lol, no bold is a woman but "fit" ....lol wasnt expecting a comment like that


----------



## Avendor

I assume 375.95 isn't released via GeForce Experience yet? It says "You have the latest GeForce driver," which is 375.86.


----------



## DeathAngel74

http://nvidia.custhelp.com/app/answers/detail/a_id/4260


----------



## Avendor

Got it! https://forums.geforce.com/default/topic/977133/geforce-drivers/announcing-hot-fix-driver-375-95/

NOTE: We have submitted this driver to Microsoft for WHQL-certification and intend to replace 375.86 with this version online and within GeForce Experience as soon as possible.


----------



## _Killswitch_

Wanted to see the max core I could get, which was a little over +300MHz with a slight bump in the memory. Now I'm seeing what max memory I can do and how far the core will go... I'm bored.


http://www.3dmark.com/3dm/16133374

Edited:


----------



## Mr-Dark

Some Red LED memory on my MSI build


----------



## philhalo66

Just got mine a few hours ago, this thing is insane! HUGE upgrade from my 580. Downsampling from above 4K, the only game to drop below 30 FPS was Crysis 3, for obvious reasons.
https://www.techpowerup.com/gpuz/details/gsxc4
http://valid.x86.fr/k2agi1


----------



## Avendor

Nice, identical to mine, just different memory type


----------



## ITAngel

Quote:


> Originally Posted by *_Killswitch_*
> 
> Wanted to see max core I could get, which was a little over 300Mhz with slight bump in the Memory, now im see what max Memory i can do and how far the core will go...I'm bored
> 
> 
> http://www.3dmark.com/3dm/16133374
> 
> Edited:


Cool! Yeah, I'm going to see how high mine can go this weekend.


----------



## philhalo66

Quote:


> Originally Posted by *Avendor*
> 
> Nice, identical to mine, just different memory type


Yeah, same card from the looks of it, just a different BIOS and memory manufacturer. Weird, I wonder why Gigabyte did that, and which one OCs better.


----------



## _Killswitch_

Never ran GPU-Z on mine before; I have Micron memory as well.


----------



## Avendor

Quote:


> Originally Posted by *philhalo66*
> 
> Yeah, same card from the looks of it just different BIOS and Memory manufacturer. Weird i wonder why gigabyte did that and i wonder which one oc better.


Well, I can take mine up to a +600MHz memory clock with the latest BIOS; +700MHz is too much for me, it starts flickering in the stress test. How far can you go with your G1?
http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/5760#post_25628150
I keep mine on OC mode in the Giga utility software right now. I really don't need a hefty OC; when I upgrade my monitor to 1440p, maybe then it will be a necessity.
I've been using MSI Afterburner for OCing, never tried the Giga software.


----------



## philhalo66

Quote:


> Originally Posted by *Avendor*
> 
> Well, I can take mine up to a +600MHz memory clock with the latest BIOS; +700MHz is too much for me, it starts flickering in the stress test. How far can you go with your G1?
> http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/5760#post_25628150
> I keep mine on OC mode in the Giga utility software right now. I really don't need a hefty OC; when I upgrade my monitor to 1440p, maybe then it will be a necessity.
> I've been using MSI Afterburner for OCing, never tried the Giga software.


I haven't tried to OC yet, but MSI Afterburner says it's auto overclocking to a 2100MHz core, and it looks like the memory stays at a solid 8004MHz. I'm honestly not even sure how to overclock, lol; going from Fermi to this new boost stuff is a little overwhelming.


----------



## DeathAngel74

I installed eVGA thermal pad mod on gtx 1070 and android 7.0 nougat on my s7. I wonder which one will blow up first........


----------



## Avendor

Quote:


> Originally Posted by *philhalo66*
> 
> I haven't tried to OC yet, but MSI Afterburner says it's auto overclocking to a 2100MHz core, and it looks like the memory stays at a solid 8004MHz. I'm honestly not even sure how to overclock, lol; going from Fermi to this new boost stuff is a little overwhelming.


We are in the same boat. I also came from a GTX 580, aka Fermi.







Actually, it's pretty easy to OC the GTX 1070; follow this review: http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_1070_g1_gaming_review,29.html
Yes, the memory runs at 8GHz effective out of the box. Just make sure you don't start with frequencies that are too high. Start with small offsets like +70 on the core clock and +250MHz on the memory clock, and don't touch the core voltage; even if you do touch it and decide to ramp up, the maximum is close to 1.1v.
After you apply new frequencies, run benchmarks. I recommend 3DMark Fire Strike, Time Spy, or the stress test for stability; if the driver crashes or you notice scenes starting to flicker, you need to back down: lower the memory clock and run the test once more. You might need Windows 10 to be able to run Time Spy, though.
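That step-and-test loop can be sketched as code. `is_stable` stands in for actually running Firestrike/Time Spy and watching for crashes or flicker; the stub below pretends the card gives out past +120MHz, purely for illustration.

```python
# Walk the offset up in small steps; back off one step at the first
# failure. `is_stable` is a placeholder for a real benchmark pass.

def find_max_offset(is_stable, start=70, step=10, limit=300):
    offset = start
    while offset <= limit:
        if not is_stable(offset):
            return offset - step   # last offset that passed
        offset += step
    return limit

fake_card = lambda off: off <= 120   # hypothetical card, stable to +120
print(find_max_offset(fake_card))    # 120
```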


----------



## zipper17

The Firestrike Extreme/Ultra Stress Test is much more credible than Unigine Valley/Heaven, at least for me. I can overclock my card as high as 2100/2151MHz and pass the Unigine Valley test just fine, but in the Firestrike Extreme Stress Test I had crashes in a matter of seconds. Witcher 3 also produced the same crashes.

Make sure that after overclocking your card is stable in all stress tests and all the games that you have. If you get crashes, your overclock is too high or not stable yet.

For memory overclocking, make sure there are no mini-artifacts on your screen (green/red sparkles), and make sure there's no performance degradation from memory error checking or anything like that.

Samsung memory overclocking should be easier and stable in 9GHz territory since day 1 (2250MHz x 2 = 4500MHz, x 2 = 9000MHz effective). For Micron memory, you should update your VGA BIOS to the latest recommended version.
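The clock arithmetic in that last paragraph, written out. The 2250MHz figure is the Samsung example from the post; GPU-Z shows the base number, and the "effective" figure is 4x for GDDR5.

```python
# GDDR5 effective clock: base clock x 2 (double data rate) x 2 again
# (quad-pumped interface), i.e. base x 4.

def gddr5_effective_mhz(base_mhz):
    return base_mhz * 4

print(gddr5_effective_mhz(2250))   # 9000 - the 9GHz territory above
print(gddr5_effective_mhz(2002))   # 8008 - roughly stock 1070 memory
```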


----------



## DeathAngel74

I'm happy with my 1070 SC after the new thermal pads from eVGA and 2 BIOS revisions from nVidia: 2101/8726, 1.075v, 31C idle, 46C max. It's cold in the living room, lol. I just hope it lasts longer than my S7.


----------



## Avendor

Looks like it's available now.


----------



## StrelokAT

After the BIOS update on my MSI GTX 1070 I can OC +500 on memory. Earlier I could only go up to ~+315 :thumb:


----------



## Gurkburk

I think I've reached my limit at this voltage.

+110 CC and +600 memory.

Wish I could go further :/ Damn limitations. Can't wait for some custom BIOS to unlock this **** and disable this Boost 3.0 ****.


----------



## Yetyhunter

Can someone please help me choose the correct BIOS for my Gigabyte card? I would like to flash the newest version but don't know how.


----------



## Gurkburk

Quote:


> Originally Posted by *Yetyhunter*
> 
> Can someone please help me choose the correct BIOS for my gigabyte card. I would like to flash the newest version but don't know how.


http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios

& pick the one that says "Release for samsung memory"


----------



## Yetyhunter

Thank you. It was so obvious XD.


----------



## Forceman

Quote:


> Originally Posted by *Yetyhunter*
> 
> Can someone please help me choose the correct BIOS for my gigabyte card. I would like to flash the newest version but don't know how.


Are there any benefits from the new BIOS if you have Samsung memory?


----------



## lanofsong

Hey GTX 1070 owners,

We are having our monthly Foldathon from Monday the 21st to the 23rd, 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us; see the attached link.

November Foldathon

To get started:

1. Get a passkey (allows for the speed bonus); needs a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. Enter your folding name (mine is the same as my OCN name)
4. Enter your passkey
5. Enter the Team OCN number - 37726

later
lanofsong


----------



## Mad Pistol

Nvidia's 375.95 drivers are magic for overclocking. I can now bench at a core offset of +200 on my SLI 1070 FE setup... I could barely get past +160 with all previous drivers. The new scores are WAY higher than they were before. I even broke 10k overall on my Time Spy run.


----------



## zipzop

Firestrike results on a single 1070 SC @ 2126MHz, with some power throttling dips to 2088MHz. Tested on a 1440p monitor, though I'm not sure that affects the test results?

FS normal: http://www.3dmark.com/fs/10838403

Time Spy: http://www.3dmark.com/spy/753815

Edit: I tried 2139MHz, which is about all it's got, I think. It crashed on the first try until I went back and tweaked the voltage curve a bit.

http://www.3dmark.com/fs/10838655

Better overall score on everything; oddly, the graphics score is lower.


----------



## zipper17

Quote:


> Originally Posted by *zipzop*
> 
> Firestrike results on a single 1070 SC @ 2126MHz, with some power throttling dips to 2088MHz. Tested on a 1440p monitor, though I'm not sure that affects the test results?
> 
> FS normal: http://www.3dmark.com/fs/10838403
> 
> Time Spy: http://www.3dmark.com/spy/753815
> 
> Edit: I tried 2139MHz, which is about all it's got, I think. It crashed on the first try until I went back and tweaked the voltage curve a bit.
> 
> http://www.3dmark.com/fs/10838655
> 
> Better overall score on everything; oddly, the graphics score is lower.


2126MHz memory? That's an 8.5GHz effective clock.

Try overclocking the memory to 9GHz? I'm sure you will get higher graphics scores.


----------



## Forceman

Pretty sure that's core clock, not memory.


----------



## zipper17

If you click through to the score's details, his memory bus clock is at 2126MHz.


----------



## zipzop

Yeah, it's +250 and my base memory clock is 4,002MHz for some reason. 2126MHz / *4252MHz* / 8504MHz effective
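For anyone confused by the three numbers: GPU-Z and 3DMark report the GDDR5 command clock, the data bus runs at 2x that, and data moves on both edges, so the "effective" marketing clock is 4x. Quick sanity-check sketch:

```python
# GDDR5 clock relationships, using the figures from the post above.
reported = 2126              # MHz - the command clock GPU-Z / 3DMark show
data_clock = reported * 2    # DDR data bus clock
effective = reported * 4     # quad-pumped "effective" clock

print(data_clock)   # 4252
print(effective)    # 8504
```

So a +250 offset in Afterburner shows up as +1000 on the effective number.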


----------



## DeathAngel74

I have a 1070 SC that I've OC'd the memory on to +362: 2181.5/4363/8726 effective. Is this average or pretty good for Micron VRAM, or for VRAM overclocking in general on the 1070s? I have yet to try overclocking the card higher since I installed new TIM (GELID GC Extreme) and thermal pads on the VRAM/VRMs (baseplate and backplate) from EVGA.


----------



## Avendor

From my perspective +362 is low; with the F11 BIOS for Gigabyte G1 Micron cards some of us easily get +500. I can push further to +650+, which is good, but my core clock seems to max out at +140MHz, I cannot get more than that. Bad limit.









P.S. Is it true the biggest boost comes from the core? Does the same apply to the GTX 1070, or is memory clock more important on the Pascal architecture? Please enlighten me, which is more valuable?


----------



## headd

Quote:


> Originally Posted by *Avendor*
> 
> From my perspective +362 it's low, with F11 for Gigabyte G1 Microns some of us easily we get +500. I can push further to +650+ which is good, but my Core Clock seems to be max. 140MHz cannot get more from that, bad limit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S. Is it true, the biggest boost come from Core? same applies to GTX 1070 or Memory Clock it's more important to Pascal architecture. Please enlighten me, which is more valuable?


Both gains are about the same. Of course you get more from memory if you have an aftermarket card and already run at 2GHz core.
But going from an 1800MHz core (stock GTX 1070) to 2100MHz gives around the same FPS increase as overclocking memory from 8GHz to 9.4GHz (you still get slightly more from the core speed increase).


----------



## DeathAngel74

I run 2101 core /8726 memory, no throttling.


----------



## GeneO

Getting an MSI GTX 1070 X next Monday. Just peeing here so it shows up in my subscriptions.


----------



## khanmein

Quote:


> Originally Posted by *GeneO*
> 
> Getting an MSI GTX 1070 X next Monday. Just peeing here so it shows up in my subscriptions.


how much did u buy it for?


----------



## GeneO

Quote:


> Originally Posted by *khanmein*
> 
> how much u bought?


On sale at Amazon for $394, plus there is a $20 rebate, so $374.


----------



## khanmein

Quote:


> Originally Posted by *GeneO*
> 
> On sale at Amazon 394 plus there is a 20 dollar rebate so 374.


yeah, dropped from $399 to $394. please let me know whether the VRAM comes with Micron or Samsung chips when you receive it. cheers.


----------



## Avendor

Quote:


> Originally Posted by *headd*
> 
> Both gains are same.Of course you get more from memory if you have aftermarket card and you run already at 2Ghz core.
> But with 1800mhz core(stock GTX1070) to 2100Mhz you get around same fps increase like oc memory from 8Ghz to 9.4Ghz(still you get slightly more with core speed increase)


So basically you're saying memory won't need to be OC'd if the core clock stays at 2100MHz without throttling, or did I get it wrong? And what if you OC the memory besides the core, won't you gain more performance? I'm pretty confused


----------



## GeneO

Quote:


> Originally Posted by *khanmein*
> 
> yeah from 399 dropped to 394. please let me know the vram come with micron or samsung when u received. cheers.


I actually decided to get it from local microcenter instead tonight. Will let you know - they have a good return policy.


----------



## headd

Quote:


> Originally Posted by *Avendor*
> 
> So basically you're saying memory will not be required OC-ed if Core Clock stays at 2100MHz without throttling, did I get it wrong? what about if you OC-ing memory too besides Core, you won't acquire more performance? I am pretty confused


I said if you already have an aftermarket card that boosts to 2GHz, OC'ing the core will have almost zero effect (because it already has a heavy OC). If you have a reference card that boosts to 1800MHz, OC'ing the core will have more effect than OC'ing the memory.

Witcher 3
GTX 1070 reference 1730-1770MHz/8000
Avg: 44.851 - Min: 39 - Max: 52
GTX 1070 2114/8000
Avg: 50.526 - Min: 43 - Max: 60
GTX 1070 1730-1770/9400
Avg: 46.471 - Min: 40 - Max: 55
GTX 1070 2114/9400
Avg: 53.501 - Min: 46 - Max: 63

Crysis 3 1440p all max 4xMSAA
GTX 1070 1730-1770/8000
Avg: 45.959 - Min: 40 - Max: 56
GTX 1070 2120/8000
Avg: 49.553 - Min: 42 - Max: 59
GTX 1070 1730-1770/9400
Avg: 50.779 - Min: 43 - Max: 63
GTX 1070 2120/9400
Avg: 55.162 - Min: 48 - Max: 67

Rise of the Tomb Raider 1440p DX12 all max
GTX 1070 stock 1730-1770/8000
Avg: 50.358 - Min: 45 - Max: 58
GTX 1070 2120/8000
Avg: 56.769 - Min: 50 - Max: 64
GTX 1070 1730-1770/9400
Avg: 53.946 - Min: 47 - Max: 62
GTX 1070 2120/9400
Avg: 60.846 - Min: 54 - Max: 69
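The relative gains in those runs are easy to sanity-check with quick arithmetic. A sketch using the Witcher 3 averages from the post above:

```python
# Percentage FPS uplift over the stock run, from headd's Witcher 3 numbers.
baseline = 44.851   # 1730-1770/8000 (stock)
core_oc = 50.526    # 2114/8000     (core OC only)
mem_oc = 46.471     # 1730-1770/9400 (memory OC only)
both_oc = 53.501    # 2114/9400     (both)

def gain(fps, base=baseline):
    """Percent FPS uplift over the stock run, rounded to one decimal."""
    return round((fps / base - 1) * 100, 1)

print(gain(core_oc))  # 12.7 -> core OC alone
print(gain(mem_oc))   # 3.6  -> memory OC alone
print(gain(both_oc))  # 19.3 -> both together
```

So in this title a ~19% core bump is worth roughly 3-4x what the ~17% memory bump is, which matches headd's point that the core matters more on a stock-clocked card.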


----------



## Avendor

That makes perfect sense. Thanks for clarifying.


----------



## GeneO

Quote:


> Originally Posted by *khanmein*
> 
> yeah from 399 dropped to 394. please let me know the vram come with micron or samsung when u received. cheers.


Samsung. Yaaaay.

Out of the box with an open case: Valley at 2000 core and 9000 mem, no issues. Shows I am Vrel capped - might be cooling; hadn't tuned the fan yet. Very quiet.


----------



## khanmein

Quote:


> Originally Posted by *GeneO*
> 
> Samsung. Yaaaay.
> 
> Out of the box and open case valley at 2000 core and 9000 mem no issues. Shows I am Vrel capped - might be cooling- hadn't tuned the fan. Very quiet.


Micro Center or Amazon? What are the first 4 digits of your S/N?


----------



## GeneO

Quote:


> Originally Posted by *khanmein*
> 
> micro center or amazon? your 1st 4 digit S/N?


Microcenter. 08SB


----------



## khanmein

Quote:


> Originally Posted by *GeneO*
> 
> Microcenter. 08SB


thanks, then i should ask my bro to go to the nearest Microcenter.

is the price the same as Amazon after the rebate?


----------



## GeneO

$10 more.


----------



## khanmein

Quote:


> Originally Posted by *GeneO*
> 
> $10 more.


worth it & enjoy!


----------



## GeneO

Quote:


> Originally Posted by *khanmein*
> 
> worth it & enjoy!


Agree. Amazon now charges taxes, so I'd rather give the extra $10 to a local brick and mortar.

It is working great so far. Got Gears of war from nvidia free too. Downloading it.


----------



## khanmein

Quote:


> Originally Posted by *GeneO*
> 
> Agree. Amazon now charges taxes, so would rather give it to local brick and morter fro $10.
> 
> It is working great so far. Got Gears of war from nvidia free too. Downloading it.


how come not watch dogs 2?


----------



## Gurkburk

Why wont my damn overclocks go through Time spy in 3Dmark, but works just fine in Fire Strike Extreme?


----------



## AliasOfMyself

New 1070 owner here







i have the GV-N1070XTREME-8GD (Gigabyte 1070 Xtreme Gaming).


----------



## khanmein

Quote:


> Originally Posted by *AliasOfMyself*
> 
> New 1070 owner here
> 
> 
> 
> 
> 
> 
> 
> i have the GV-N1070XTREME-8GD (Gigabyte 1070 Xtreme Gaming).


samsung or micron?


----------



## AliasOfMyself

Quote:


> Originally Posted by *khanmein*
> 
> samunng or micron?


Samsung, i've had a quick look on google, but still not sure which is better lol


----------



## khanmein

Quote:


> Originally Posted by *AliasOfMyself*
> 
> Samsung, i've had a quick look on google, but still not sure which is better lol


of course samsung is way better. congrats..


----------



## AliasOfMyself

Quote:


> Originally Posted by *khanmein*
> 
> of course samsung is way better. congrats..


Awesome, thanks!


----------



## GeneO

Quote:


> Originally Posted by *khanmein*
> 
> how come not watch dogs 2?


Don't know. But it is not a very good game anyhow.


----------



## cutty1998

I am so happy with my Strix 1070 now! After 2 months of waiting, it is working great! This is the first AIB custom (non-stock) card I have owned (although I did mod my old GTX 470 with a Zalman twin-fan green cooler), so I need to officially join this group now. I just sold my GTX 980 (I actually kind of miss it already!) to a dude who had an HP pre-built minitower with a GTX 760 in it and a 450W PSU. He went out and got a used EVGA 600W PSU, but it would not fit in his little case, so I just threw in my old Thermaltake Armor MX case to help the kid out. I think he got a good deal, as I sold him the Gigabyte 980 and my old case for $200. However, I really needed the $$ to get new front tires for my Honda Accord. Anyway, the combination of the Strix 1070 and my Ivy Bridge 3770K @ 4.4GHz seems nice! Maybe next year I'll upgrade to a new chipset. I am very impressed with how cool the 1070 runs, even under load! Highest clock seen so far is 2038MHz playing Crysis Warhead! The buyer of my 980 called me and reported he was getting 80°C temps playing Witcher 3 @ 1080p. I have never played that game, but I hear it is pretty demanding. I never saw 80° temps on my 980, ever.


----------



## AliasOfMyself

Anyone running the F3 bios on the same card I have? I have questions! (I'm running F2)


----------



## khanmein

Quote:


> Originally Posted by *cutty1998*
> 
> I am so happy with my Strix 1070 now! After 2 months of waiting ,it is working great! This is my first AIB custom -Non stock Card I have owned. (although I did mod my old GTX 470 with a Zalman twin fan green cooler ) I need to officially join this group now. So I just sold my GTX980( I actually kind of miss it already! ) card to a dude who had an HP pre-built minitower with a GTX 760 in it ,and a 450W PSU. He went out and got a used EVGA 600W PSU ,but it would not fit in his little case , so I just Threw in my old Thermaltake Armor MX case to help the Kid out. I think he got a good deal ,as I sold him the Gigabyte 980 ,and my old case for $200. However , I really needed the $$ to get new front tires for my Honda Accord. Anyway , the combination of the Strix 1070 ,and my Ivy Bridge 3770K @ 4.4 Ghz seems to be nice! Maybe next year I'll upgrade to New chipset . I am very impressed how cool the 1070 runs ,even under load ! So far showing highest clock at 2038 Mhz playing Crysis Warhead ! The buyer of my 980 called me and reported he was getting 80* C temps playing Witcher 3 @ 1080P. I have never played that game ,but here it is pretty demanding. I never saw 80* temps on my 980,ever .


indeed 9xx series maxwell playing witcher 3 is pretty toasty.. FYI, my ambient is around 30-34°c


----------



## Benny89

Quote:


> Originally Posted by *khanmein*
> 
> indeed 9xx series maxwell playing witcher 3 is pretty toasty.. FYI, my ambient is around 30-34°c


lol, that depends on the card, case airflow, ambient temperature and GPU cooler... not on whether it is 9xx series or 10xx series....

My 1535MHz (1550 max) Gigabyte 980 Ti Gaming G1 in Witcher 3 at 1440p ultra settings never went above 75C.

Look at the EVGA 1070 and 1080 overheating issues. The 9xx series from EVGA did not have them. They just failed this time.

It all depends on how well they made the cooling solution on the card. So far Gigabyte has never let me down with their cooling


----------



## Marin

Did the EVGA thermal mod last night, so hopefully that prevents any issues.


----------



## i2CY

So, I did some benching with 3DMark last night, and my old pair of R9 290s @ 947MHz/1250MHz score higher than my new Strix GTX 1070 Gaming card OC'd to 2000MHz


----------



## cutty1998

Quote:


> Originally Posted by *Benny89*
> 
> lol, that depends on Card, case airflow, temperature and GPU cooler... No if it is 9xxx series or 10xx series....
> 
> My 1535 (1550 max) Gigabyte 980Ti Gaming G1 in Witcher 3 on 1440p Ultra settings never went above 75C.
> 
> Look at EVGA 1070 and 1080 overheating issues. 9xxx series from EVGA did not have them. They just failed this time.
> 
> That all depends of how well they made their cooling solutions in cards. So far Gigabyte never let me down with their cooling


Yeah, I gave him my old Thermaltake Armor MX case from 2009; I think the design is even older. It was a beast of a case back in its day. The front fan was not really working too well, and the large 200mm side-door fan is seized, so he isn't getting much airflow through the case. I told him to run it with the side cover off and get a new pair of intake fans for the front asap. I never had 80° temps on that card. I think it is safe, but a little high for my comfort level. Anyway, I think he got a good deal!
$200!


----------



## rfarmer

Quote:


> Originally Posted by *cutty1998*
> 
> Yeah , I gave him my old Thermaltake Armor MX case from 2009 ,I think the design is even older. It was a beast of a case back in it's day. The front fan was not really working too good ,and the large 200MM side door fan is siezed. . So he isn't getting much airflow through the case. I told him to run it with side cover off ,and get a new pair of intake fans for front asap. I never had 80* temps on that card. I think it is safe , but a little high for my comfort level. Anyway ,I think he got a good deal!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> $200 !


$200 for the 980 is a good deal, really good deal with a case thrown in.


----------



## i2CY

GTX 980! $200! yeah i mean cripes, I got that for each of my R9 290s, $200CND that is
like wow man


----------



## _Killswitch_

So I noticed something odd about my card: in Firestrike my core clock bounces around, dropping by as much as 93MHz. Yet when I played a game it didn't move; thought it was a little odd


----------



## MrTOOSHORT

Quote:


> Originally Posted by *_Killswitch_*
> 
> so I noticed something odd about my card. I noticed in firestike my core will bounce around on Mhz, 93 being lowest jump. Yet when i played a game it didnt move, thought it was little odd


The game was less demanding than the benchmark; Firestrike uses all of your GPU, so it hits the power limit running FS.


----------



## GeneO

https://www.techpowerup.com/gpuz/details/u22ud

OC to 2025/9028 with no voltage bump, 2037.5/9028 with approx. 10mV.

I am Vrel capped, temps on the VRM I suspect. I could go higher with more fan on the case and card, but I am OK with this; it is a beast out of the box:


----------



## ITAngel

That right there my friend is a sweet card.


----------



## Quadrider10

any news on when custom bios or bios editors will come out?


----------



## Dude970

Quote:


> Originally Posted by *Quadrider10*
> 
> any news on when custom bios or bios editors will come out?


none at all


----------



## GeneO

Quote:


> Originally Posted by *Quadrider10*
> 
> any news on when custom bios or bios editors will come out?


Well at least for the MSI cards and with the beta Afterburner, you can edit the frequency voltage curve. I haven't tried it (only verified it is available and works) and I don't know if it will work with other manufacturers cards.


----------



## CaptainZombie

I just picked up the ASUS Strix 1070 OC and it has Micron memory. I remember reading some time back about people complaining about Micron memory; is there much to worry about?


----------



## DeathAngel74

Depends on who you ask. Happy Turkey day


----------



## Dude970

Quote:


> Originally Posted by *GeneO*
> 
> Well at least for the MSI cards and with the beta Afterburner, you can edit the frequency voltage curve. I haven't tried it (only verified it is available and works) and I don't know if it will work with other manufacturers cards.


They have the 4.3 Final out now, check Guru3D


----------



## Br3ach

Hey there. Just moved on to MSI Gaming X. Core overclock sucks - 1999Mhz max boost and that's it... ;( Well, it doesn't really suck, but I'm just envious of people getting 2.1 ;-) Pity there's no BIOS editor to try 1.25V...

Could playing with the curve help get anything better?

Memory on the other hand is great - 4720x2 without artifacts, yay (Samsung)!


----------



## ITAngel

I got my card to 2100.5MHz on the core and 9408MHz on the memory, fan speed at 60%, with a GPU temp of 53C and an ambient temp of 73F. But during the test I got 1 or 2 artifacts, so I may need to drop the memory down.









Looks like at stock, just cranking up the core and memory, I got 2062.5MHz core / 9216MHz memory, temps at 65C with auto fans running at 31%. Not bad; no core voltage, power limit, or temp limit increase needed, all stock.


----------



## syl1979

Quote:


> Originally Posted by *Br3ach*
> 
> Hey there. Just moved on to MSI Gaming X. Core overclock sucks - 1999Mhz max boost and that's it... ;( Well, it doesn't really suck, but I'm just envious of people getting 2.1 ;-) Pity there's no BIOS editor to try 1.25V...
> 
> Could playing with the curve help get anything better?
> 
> Memory on the other hand is great - 4720x2 without artifacts, yay (Samsung)!


You should try locking the voltage at 1.04V and see what you can get at effective boost (the real one from the sensors).

I have had some strange behavior with high voltages (over 1.06V), maybe linked to VRM stability


----------



## gtbtk

Quote:


> Originally Posted by *Gurkburk*
> 
> Why wont my damn overclocks go through Time spy in 3Dmark, but works just fine in Fire Strike Extreme?


Firestrike will allow higher overclocks than Time Spy, particularly when it comes to memory clocks. The latest drivers have improved things a little; however, in my experience, Time Spy will start to crash as memory speeds get close to 9000MHz, whereas Firestrike will complete a run with memory at 9100MHz and above. Heaven and Valley are even more forgiving of overclocks than anything in the 3DMark benchmarks.

Remember that every game/3D application loads GPUs in different ways. You will probably find that the best run you can get in Firestrike will still crash when you run Witcher 3 or Crysis 3.


----------



## gtbtk

Quote:


> Originally Posted by *Br3ach*
> 
> Hey there. Just moved on to MSI Gaming X. Core overclock sucks - 1999Mhz max boost and that's it... ;( Well, it doesn't really suck, but I'm just envious of people getting 2.1 ;-) Pity there's no BIOS editor to try 1.25V...
> 
> Could playing with the curve help get anything better?
> 
> Memory on the other hand is great - 4720x2 without artifacts, yay (Samsung)!


Using the curve is the only way you can get the core clock up to 2100 and above for any of these cards


----------



## gtbtk

Quote:


> Originally Posted by *CaptainZombie*
> 
> I just picked up the ASUS Strix 1070 OC and it has Micron memory. I know reading sometime back people complaining about Micron memory is there much to worry about?


With the latest BIOS, no.


----------



## gtbtk

Quote:


> Originally Posted by *GeneO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Quadrider10*
> 
> any news on when custom bios or bios editors will come out?
> 
> 
> 
> Well at least for the MSI cards and with the beta Afterburner, you can edit the frequency voltage curve. I haven't tried it (only verified it is available and works) and I don't know if it will work with other manufacturers cards.
Click to expand...

Afterburner works with all pascal cards


----------



## gtbtk

Quote:


> Originally Posted by *_Killswitch_*
> 
> so I noticed something odd about my card. I noticed in firestike my core will bounce around on Mhz, 93Mz being lowest dropped amount. Yet when i played a game it didnt move, thought it was little odd


Different 3D loads will affect the card in different ways. If you run Furmark, the card will only run at about 1600-1800MHz, regardless of what you can get in Firestrike


----------



## Jackharm

I wonder when Zotac will release their micron memory bios


----------



## GeneO

Quote:


> Originally Posted by *gtbtk*
> 
> Afterburner works with all pascal cards


Yes that is well known. I was specifically referring to the voltage-frequency curve editor feature.


----------



## gtbtk

Quote:


> Originally Posted by *GeneO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Afterburner works with all pascal cards
> 
> 
> 
> Yes that is well known. I was specifically referring to the voltage-frequency curve editor feature.
Click to expand...

so was I


----------



## GeneO

Quote:


> Originally Posted by *gtbtk*
> 
> so was I


Cool. Will have to play with it after the holiday.


----------



## Gurkburk

Anyone know what happened?

I'm getting very bouncy GPU usage around 70-90% and around 80-90% CPU usage on my 4770k.

Didnt have this high CPU usage before..

This is on BF1


----------



## DeathAngel74

My cpu usage would sometimes spike to 95-100%. i7 4790k and gtx 970...last year's build+ star wars battlefront 2015


----------



## Br3ach

Quote:


> Originally Posted by *gtbtk*
> 
> Using the curve is the only way you can get the core clock up to 2100 and above for any of these cards


Thanks! I thought I had to change each and every curve point, but in fact I only need to change the one for the last voltage (1.09). This got me stable at 2062 Mhz max boost clock vs max old-school overclock of 1999Mhz!


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> Firestrike will allow higher overclocks that Time spy, particularly when it comes to memory clocks. The latest drivers have improved things a little, however, in my experience, timespy will start to crash as memory speeds start getting close to 9000 mhz where Firestrike will complete a run with memory at 9100mhz and above. Heaven and Valley are even more forgiving of overclock speeds than anything in the 3dmark benchmarks.
> 
> Remember that every game/3d application applies load to GPUs in different ways. You will probably find that the best run you can get in Firestrike will also crash when you run Witcher 3 or Crysis 3.


Yeah, i can run Valley just fine with certain high overclock.
In 3dmark Firestrike/StressTest i got crashes right away.

However just Upgraded my RAM from 1600mhz into 2400MHZ, it won't Run with XMP2400 profile, damn.
There's a chance to increase minimum framerates with higher RAM speed.


----------



## gtbtk

Quote:


> Originally Posted by *Br3ach*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Using the curve is the only way you can get the core clock up to 2100 and above for any of these cards
> 
> 
> 
> Thanks! I thought I had to change each and every curve point, but in fact I only need to change the one for the last voltage (1.09). This got me stable at 2062 Mhz max boost clock vs max old-school overclock of 1999Mhz
Click to expand...

You can try this trick and it should get you even better FPS.

In addition to moving the 1.093V point up higher (I suggest setting the 1.093V point to 2088MHz and seeing how that goes), you can also try moving the 0.975V point up to the 1999MHz level so you end up with a curve with a double hump. Of course, experiment with both points; you may get a bit more, or it may be a bit less.

If you do not increase the voltage slider and leave it at 0, the 2 key voltage points to adjust are 0.950 and 1.065.
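Conceptually the V/F curve is just a mapping from voltage points to the maximum clock the card may boost to at that voltage, and the trick above amounts to raising two specific points. A sketch with made-up illustrative values (the MHz figures are hypothetical, not from a real card):

```python
# Hypothetical Afterburner-style voltage/frequency curve.
# Keys: voltage points (V); values: max boost clock (MHz) at that voltage.
curve = {0.950: 1911, 0.975: 1936, 1.000: 1962, 1.043: 2000, 1.093: 2025}

def apply_offsets(curve, offsets):
    """Return a new curve with per-point MHz offsets applied."""
    return {v: mhz + offsets.get(v, 0) for v, mhz in curve.items()}

# gtbtk's suggestion: raise the top (1.093 V) point, and the 0.975 V
# point up toward the ~2000 MHz level, leaving the rest alone.
tuned = apply_offsets(curve, {1.093: 63, 0.975: 63})
print(tuned[1.093])  # 2088
print(tuned[0.975])  # 1999
print(tuned[1.000])  # 1962 (unchanged)
```

In Afterburner itself you drag the dots on the curve editor (Ctrl+F) rather than editing numbers, but the effect is the same: only the points you move change.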


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Firestrike will allow higher overclocks that Time spy, particularly when it comes to memory clocks. The latest drivers have improved things a little, however, in my experience, timespy will start to crash as memory speeds start getting close to 9000 mhz where Firestrike will complete a run with memory at 9100mhz and above. Heaven and Valley are even more forgiving of overclock speeds than anything in the 3dmark benchmarks.
> 
> Remember that every game/3d application applies load to GPUs in different ways. You will probably find that the best run you can get in Firestrike will also crash when you run Witcher 3 or Crysis 3.
> 
> 
> 
> Yeah, i can run Valley just fine with certain high overclock.
> In 3dmark Firestrike/StressTest i got crashes right away.
> 
> However just Upgraded my RAM from 1600mhz into 2400MHZ, it won't Run with XMP2400 profile, damn.
> There's a chance to increase minimum framerates with higher RAM speed.
Click to expand...

I'm not sure that Z77 MBs support ram clocks much higher than 2133 or so but don't quote me on that.

I started overclocking the 1600mhz Ram in my Sandy bridge rig to about 1950mhz and it did improve 3dmark physics and cinebench r15 scores.


----------



## syl1979

Quote:


> Originally Posted by *zipper17*
> 
> However just Upgraded my RAM from 1600mhz into 2400MHZ, it won't Run with XMP2400 profile, damn.
> There's a chance to increase minimum framerates with higher RAM speed.


You may run it at 2133 with lower timings. That's what I do (2133 9-10-10-27) with my 2500K and P67 MB


----------



## loopy750

Quote:


> Originally Posted by *GeneO*
> 
> ...


Nvidia Control Panel set to "High Performance"?


----------



## khanmein

Quote:


> Originally Posted by *Benny89*
> 
> lol, that depends on Card, case airflow, temperature and GPU cooler... No if it is 9xxx series or 10xx series....
> 
> My 1535 (1550 max) Gigabyte 980Ti Gaming G1 in Witcher 3 on 1440p Ultra settings never went above 75C.
> 
> Look at EVGA 1070 and 1080 overheating issues. 9xxx series from EVGA did not have them. They just failed this time.
> 
> That all depends of how well they made their cooling solutions in cards. So far Gigabyte never let me down with their cooling


GTX 970 (core 1231 / memory 2003) (max 1471/4001) + 1440p 60Hz + ultra setting (GFE optimized) + max temp 70°c + voltage untouched (low 0.85 / max 1.00)


----------



## Dude970

Quote:


> Originally Posted by *zipper17*
> 
> Yeah, i can run Valley just fine with certain high overclock.
> In 3dmark Firestrike/StressTest i got crashes right away.
> 
> However just Upgraded my RAM from 1600mhz into 2400MHZ, it won't Run with XMP2400 profile, damn.
> There's a chance to increase minimum framerates with higher RAM speed.


Could be the RAM has compatibility issues with your board. Try manually setting the RAM timings and DRAM voltage, or exchange it for a different brand, maybe G.Skill or Patriot. I had the same issue with Corsair RAM on my Z77 board; the Corsair wouldn't do 2400 stable, Patriot and G.Skill did with no issues.


----------



## asdkj1740

The very first Micron review sample on TechPowerUp - does Micron OC the same as Samsung?
https://www.techpowerup.com/reviews/MSI/GTX_1070_Quick_Silver_OC/32.html

btw, seriously, why does this card have a significant cooling improvement over the MSI Gaming X and Gaming Z???????
https://www.techpowerup.com/reviews/MSI/GTX_1070_Quick_Silver_OC/33.html

I'm really curious about the power settings in its BIOS.
According to the TechPowerUp BIOS collection, this new MSI card has the same BIOS as the MSI Gaming X: 230W @ 100% and 291W max.
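Those two BIOS figures also tell you where the power-limit slider tops out. Quick arithmetic sketch using the numbers quoted above:

```python
# Power-limit slider math for the BIOS figures quoted above
# (same BIOS as the Gaming X per the TPU BIOS collection).
tdp_100 = 230   # watts at the 100% power-limit setting
tdp_max = 291   # watts with the slider maxed

max_slider_pct = round(tdp_max / tdp_100 * 100)
print(max_slider_pct)  # 127 -> the slider should top out around 127%
```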


----------



## zipper17

Quote:


> Originally Posted by *syl1979*
> 
> You may run it at 2133 with lower timings. That s what i do (2133 9-10-10-27) with my 2500k and p67 MB


Tested so far, it can only run up to 2000MHz CL9/10 @ 1.6V for right now.
Quote:


> Originally Posted by *Dude970*
> 
> Could be the RAM has compatibility issues with your board. Try manualy setting the RAM timings and DRAM Voltage. Or exchange it for a different brand, maybe G-Skill or Patriot. I had the same issue with Corsair RAM on my Z77 board, Corsair wouldnt do 2400 stable, Patriot or G-SKILL did with no issues.


yeah, it might be compatibility issues, a defective RAM kit, a weak CPU/mobo or something else...
I already tried manually setting timings (CL8/9/10/11/12/13) and the XMP timings, setting 1.65V-1.75V, and bumping VCCSA & VCCIO step by step; it still won't POST at all with the Vulcan at 2400/2133MHz. It always reverts back to SPD 1333MHz.

However, switching back to my old 8GB Vengeance 1600MHz CL9, it can boot up to 2200MHz CL9/10 @ 1.6V just fine, but not stably.


----------



## Snuckie7

Finally got around to setting up my MSI Gaming X. Is it better to overclock with the offset or with the custom curve?


----------



## _Killswitch_

Just received the backplate for my EVGA GTX 1070 Gaming (blower style) from coldzero; had them reverse the writing for my reversed STH10 and make the "1070" match the color of my blue sleeving. They did an awesome job!


----------



## Dude970

Quote:


> Originally Posted by *_Killswitch_*
> 
> Just received my back plate for my EVGA GTX 1070 Gaming (blower style) from coldzero, had them reversed writing for my reversed STH10 and make "1070" match the color of my blue sleeving. They did an awesome job!










Looks good


----------



## Nukemaster

Quote:


> Originally Posted by *_Killswitch_*
> 
> Just received my back plate for my EVGA GTX 1070 Gaming (blower style) from coldzero, had them reversed writing for my reversed STH10 and make "1070" match the color of my blue sleeving. They did an awesome job!


Looks good.
Does it have thermal pads to prevent touching the card and help pull some heat into it?

EDIT. Looks like it sits high enough to clear all parts. They have one for my card


----------



## Pittster

Quote:


> Originally Posted by *asdkj1740*
> 
> the very first micron's review sample on techpowerup, is micron oc the same as samsung oc?
> https://www.techpowerup.com/reviews/MSI/GTX_1070_Quick_Silver_OC/32.html
> 
> btw, seriously, why this card has significant cooling improvement than msi gamingx and gamingz???????
> https://www.techpowerup.com/reviews/MSI/GTX_1070_Quick_Silver_OC/33.html
> 
> i really curious about the power settings on its bios.
> according to techpowerup bios collection, this new msi card has the same bios as the msi gamingx: 230W @ 100% and 291W MAX.


TechPowerUp do not report delta T for their temps, so it's probably just a case of different ambient temps, even if the cards are tested on the same open-air bench.

The cards look identical except for different colors and a slight backplate design change

https://www.techpowerup.com/reviews/MSI/GTX_1070_Quick_Silver_OC/4.html

https://www.techpowerup.com/reviews/MSI/GTX_1070_Gaming_Z/4.html


----------



## _Killswitch_

Nuke, they sent me spacers/longer screws. I haven't noticed any difference in my temps so far.


----------



## Br3ach

Quote:


> Originally Posted by *Snuckie7*
> 
> Finally got around to setting up my MSI Gaming X. Is it better to overclock with the offset or with the custom curve?


See a few posts back ;-) Using the curve netted me about 60 Mhz extra.


----------



## GeneO

Quote:


> Originally Posted by *loopy750*
> 
> Nvidia Control Panel set to "High Performance"?


You mean texture filtering quality? Yes, it makes about 1.5 fps difference from quality.


----------



## asdkj1740

Quote:


> Originally Posted by *Pittster*
> 
> Tech Power Up do not report Delta T on there temps so its probably just a case of different ambient temps if the cards are tested in the same ope air bench.
> 
> Cards look identical except different colors and slight back plate design change
> 
> https://www.techpowerup.com/reviews/MSI/GTX_1070_Quick_Silver_OC/4.html
> 
> https://www.techpowerup.com/reviews/MSI/GTX_1070_Gaming_Z/4.html


maybe, i don't know, but a well-known reviewer should have a stable testing environment.

if you look closer at the MOSFETs of the two cards, you should be able to find something different.


----------



## Pittster

Quote:


> Originally Posted by *asdkj1740*
> 
> maybe, i dont know, but a well known reivewer should have a stable testing environment.
> 
> if you look closer to the mosfets of two cards, you should be able to find something different.


I have no doubt that TechPowerUp try to keep as many variables consistent as possible; please do tell of this difference for the less technically inclined.


----------



## Snuckie7

Quote:


> Originally Posted by *asdkj1740*
> 
> the very first micron's review sample on techpowerup, is micron oc the same as samsung oc?
> https://www.techpowerup.com/reviews/MSI/GTX_1070_Quick_Silver_OC/32.html
> 
> btw, seriously, why this card has significant cooling improvement than msi gamingx and gamingz???????
> https://www.techpowerup.com/reviews/MSI/GTX_1070_Quick_Silver_OC/33.html
> 
> i really curious about the power settings on its bios.
> according to techpowerup bios collection, this new msi card has the same bios as the msi gamingx: 230W @ 100% and 291W MAX.


Maybe it's because of this?
Quote:


> In terms of capabilities and performance the card sits between the GTX 1070 Gaming X and GTX 1070 Gaming Z. It has the slightly more powerful thermal solution of the Gaming Z, but comes with the same clocks as the Gaming X.


----------



## asdkj1740

Quote:


> Originally Posted by *Snuckie7*
> 
> Maybe it's because of this?


I didn't read the review thoroughly.
What is the "slightly more powerful thermal solution"?


----------



## gtbtk

Quote:


> Originally Posted by *Snuckie7*
> 
> Finally got around to setting up my MSI Gaming X. Is it better to overclock with the offset or with the custom curve?


I have had better results using the curve but it is worth trying both


----------



## xGeNeSisx

Quote:


> Originally Posted by *gtbtk*
> 
> Your card only has a 6+2 phase VRM, You should be fine flashing EVGA bioses as they use 5 channels from the voltage controller and unmanaged doublers to get the "10 Phase" vrm. the FTW bios will increase your power limit a little from 200W to 226W too.
> 
> The Strix has 8+2 Phase VRM. The MSI cards are also 8 +2 phase VRM cards but they are 8+6 pin power. The voltage controllers are the same 8 channel units on just about all of them but I don't know what impact there would be with a bios that expects extra voltage controller channels to be there that do not exist.
> 
> The Galax/KAF HOF cards are the ones that use a different voltage controller and should be avoided for cross flashing


I was able to increase my Heaven score from mid 1500s to slightly past 1700 by flashing the EVGA 1070 FTW bios to my Gigabyte G1 Gaming. The power slider was able to be increased from 111 to 112. I was able to overclock to 2050mhz at 1.063v and +500mhz (reaching 9000mhz on Micron GDDR5) on the memory. I backed the memory down to +450mhz for now until I can screw with it later.

I have not increased the voltage slider at all. It seems as though any core clock above 2050MHz yields no performance increase. I will try to put a double hump in the curve by setting the 0.950V or 0.975V point to 1999MHz and see if FPS increases. I'll also see if maxing the voltage slider at 1.081V, or all the way to 1.093V, shows increased performance with a higher core clock.

Thank you so much for your advice! I gained between 10 and 20 FPS in GTAV at 2880x1620 DSR resolution maxed out with just FXAA on


----------



## gtbtk

Quote:


> Originally Posted by *xGeNeSisx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Your card only has a 6+2 phase VRM, You should be fine flashing EVGA bioses as they use 5 channels from the voltage controller and unmanaged doublers to get the "10 Phase" vrm. the FTW bios will increase your power limit a little from 200W to 226W too.
> 
> The Strix has 8+2 Phase VRM. The MSI cards are also 8 +2 phase VRM cards but they are 8+6 pin power. The voltage controllers are the same 8 channel units on just about all of them but I don't know what impact there would be with a bios that expects extra voltage controller channels to be there that do not exist.
> 
> The Galax/KAF HOF cards are the ones that use a different voltage controller and should be avoided for cross flashing
> 
> 
> 
> I was able to increase my Heaven score from mid 1500s to slightly past 1700 by flashing the EVGA 1070 FTW bios to my Gigabyte G1 Gaming. The power slider was able to be increased from 111 to 112. I was able to overclock to 2050mhz at 1.063v and +500mhz (reaching 9000mhz on Micron GDDR5) on the memory. I backed the memory down to +450mhz for now until I can screw with it later.
> 
> I have not increased the voltage slider at all. It seems as though any core clock speed above 2500 yields no performance increase. I will have try to put a double hump in the curve by setting the .950 or .975 voltage to 1999mhz and see if FPS increases. I'll see if maxing the voltage slider at 1.081 or all the way to 1.093v shows increased performance with higher core clock.
> 
> Thank you so much for your advice! I gained between 10 and 20 FPS in GTAV at 2880x1620 DSR resolution maxed out with just FXAA on
Click to expand...

As you are using an EVGA card, One thing that you can try out is to install the EVGA Precision XOC software and run the auto overclock utility built into that.

In my experience it doesn't produce the most stable overclocks, and I find the controls quite clunky, only allowing +25-point curve adjustments as opposed to the infinite adjustments in Afterburner. What I have noticed, though, is that it will show how your card overclocks at each and every voltage point on the curve, going from 0.800V all the way to 1.1V in 25mV steps. Make sure that you run the utility through a couple of times, because things like temperature will also affect the upper limits of OC headroom. The trick is to find the points on the curve that are stable when the card is both hot and cold.

The cards that I have seen run through the utility showed a dip in overclock headroom in the 1.0 to 1.025V range of voltage points. For example, a card may be stable at +125 at the 0.800V and 1.050V points but crash if you go above +75 at the 1.0V point. Contrary to what the utilities imply with their sliders, which make it appear that everything happens at that single high frequency, the cards don't actually work that way. They operate at multiple frequency points all along the curve; performance is the area under the curve, not the single point at the end.

Using the traditional slider moves the whole fixed curve up and down at once. One part of the curve will hit the lowest part of that dip in OC headroom, and you end up wasting the performance you could have had at the other parts of the curve that would support being boosted higher.

After you see the rough curve that the Precision auto utility creates, take some notes; that gives you a pretty good view of where the OC headroom dips and where you can push higher. You can then use that knowledge to create a much more finely tuned curve in Afterburner.
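To make the slider-vs-curve point concrete, here is a rough sketch with invented headroom numbers; the real per-point values would come from running the Precision XOC auto scanner on your own card:

```python
# Hypothetical stable headroom (MHz offset) at each voltage point,
# with the dip around 1.0 V described above. These are made-up numbers.
stable_headroom = {
    0.800: 125,
    0.900: 110,
    1.000: 75,    # the dip
    1.025: 80,
    1.050: 125,
    1.093: 100,
}

# A single offset slider shifts every point equally, so it is capped by
# the weakest point on the curve:
slider_offset = min(stable_headroom.values())   # 75

# A custom curve can use the full headroom at every point; on average
# that recovers the performance the slider leaves on the table:
curve_average = sum(stable_headroom.values()) / len(stable_headroom)

print(slider_offset, curve_average)   # 75 102.5
```

The gap between the two numbers is the performance a flat offset wastes at every voltage point that could have tolerated more.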


----------



## gtbtk

Quote:


> Originally Posted by *Pittster*
> 
> Quote:
> 
> 
> 
> Originally Posted by *asdkj1740*
> 
> maybe, i dont know, but a well known reivewer should have a stable testing environment.
> 
> if you look closer to the mosfets of two cards, you should be able to find something different.
> 
> 
> 
> I have know doubt that TechpowerUp try keep as many variable's consistent as possible, please do tell of this difference for the less technically inclined.
Click to expand...

If you measure the absolute operating temps of a GPU or CPU under load, you are actually measuring the ambient room temperature plus the temperature rise caused by the load. The rise above ambient caused by, for example, a 150W load will always be roughly the same, but ambient room temps can vary from 15 to 30 degrees, giving you different totals.

Unless you can make sure the room is at a fixed temperature every time you run a test, absolute readings taken today cannot be meaningfully compared with temps taken six months ago on a different device. If you instead test and report the difference between the current ambient temp and the absolute temp the card reaches, you can compare the cooling performance of any two devices measured with the same test. It then doesn't matter if the first test was done in the snow and the second in the desert, even though the absolute temp measurements differ.
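The delta-T arithmetic is trivial, but a sketch makes the point (all numbers invented for illustration):

```python
# Delta-T: subtract ambient so cooler results are comparable across
# sessions. All numbers here are invented for illustration.

def delta_t(ambient_c, load_c):
    """Temperature rise above ambient under load."""
    return load_c - ambient_c

# The same card under the same load, tested in a cold room and a hot room:
winter = delta_t(ambient_c=15, load_c=58)   # 43 C rise
summer = delta_t(ambient_c=30, load_c=73)   # 43 C rise

# Absolute temps differ (58 vs 73 C), but the cooler performed identically.
print(winter, summer)  # 43 43
```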


----------



## gtbtk

Quote:


> Originally Posted by *_Killswitch_*
> 
> Nuke, they sent me spacers/longer screws. I haven't noticed any difference in my temps so far.


You won't notice any difference. These cards don't have any way to measure VRM temps unless you stick a thermocouple on the chips themselves.


----------



## Gurkburk

Getting driver crashes in BF1. I doubt it's due to the overclock; I've lowered it by quite a bit after each of the crashes.

Anyone else getting this?


----------



## gtbtk

Quote:


> Originally Posted by *Gurkburk*
> 
> Getting driver crashes in BF1. I'm doubting it's due to the overclocks, I've lowered them by quite a bit a few times when every crash happens.
> 
> Anyone else getting this?


you are going to need to give us a bit more information to help you.

Card model

Driver version - was it installed from a clean install?

OC and fan settings you are using

room and card operating temperatures

other applications running in the background


----------



## Bee Dee 3 Dee

New Rig is all _settled in_...

1st Valley test of Preset: "Extreme HD Valley"


----------



## anthonyg45157

Any recommendations for a Zotac 1070 Extreme owner with Micron memory? Just wait for the BIOS? I just got my card last night and it seems I have issues above +100 on memory, sometimes with artifacts even in Windows. I can do 2100 on the core @ 60 degrees Celsius.


----------



## melodystyle2003

edited


----------



## weskeh

nevermind..


----------



## xGeNeSisx

Quote:


> Originally Posted by *gtbtk*
> 
> As you are using an EVGA card, One thing that you can try out is to install the EVGA Precision XOC software and run the auto overclock utility built into that.
> 
> In my experience it doesn't produce the most stable overclocks and I find the controls quire clunky, only allowing +25 point curve adjustments as opposed to the infinite adjustments in afterburner. What I have noticed though, is that it will show how your card overclocks at each and every voltage point in the curve going from .800 all the way to 1.1v in +25unit steps. Make sure that you run the utility through a couple of times because I think things like temperature will also effect the upper limits of oc headroom. The trick is to find the points on the curve that are stable when it is both hot and cold
> 
> The cards that I have seen run through the utility showed that there is a dip in overclock headroom in the 1.0 to 1.025V range of voltage points. for example, it may be stable at +125 at the .800 and 1.050v points but may crash if you go above +75 at the 1.0 volts point for example. Contrary to what the utilities imply with the sliders making it appear that things happen at that single high frequency, the cards dont actually work that way. the are functioning at multiple frequency point all along the curve in parallel. The the curve shows performance in the area under the curve and not specifically the point at the end.
> 
> Using the traditional slider moves the whole fixed curve up and down at once. One part of the curve will hit the lowest part of that dip in OC headroom and you end up wasting the rest of the performance that you could have used at the other parts of the curve that will support being boosted higher.
> 
> After you see the rough curve that the Precision auto utility creates, take some notes and that gives you a pretty good view on where the OC headroom dips and where you can push it higher. You can then use that knowledge in creating a much more fine tuned curve using Afterburner.


The G1 has a TDP of 200W and the EVGA FTW 226W. Is there any other BIOS using the same VRM phases that would let me squeeze out a bit more performance? Not that it matters; I'm pretty happy. With the FTW BIOS the card isn't choking itself for no reason like it did on the G1 BIOS.


----------



## gtbtk

Quote:


> Originally Posted by *xGeNeSisx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> As you are using an EVGA card, One thing that you can try out is to install the EVGA Precision XOC software and run the auto overclock utility built into that.
> 
> In my experience it doesn't produce the most stable overclocks and I find the controls quire clunky, only allowing +25 point curve adjustments as opposed to the infinite adjustments in afterburner. What I have noticed though, is that it will show how your card overclocks at each and every voltage point in the curve going from .800 all the way to 1.1v in +25unit steps. Make sure that you run the utility through a couple of times because I think things like temperature will also effect the upper limits of oc headroom. The trick is to find the points on the curve that are stable when it is both hot and cold
> 
> The cards that I have seen run through the utility showed that there is a dip in overclock headroom in the 1.0 to 1.025V range of voltage points. for example, it may be stable at +125 at the .800 and 1.050v points but may crash if you go above +75 at the 1.0 volts point for example. Contrary to what the utilities imply with the sliders making it appear that things happen at that single high frequency, the cards dont actually work that way. the are functioning at multiple frequency point all along the curve in parallel. The the curve shows performance in the area under the curve and not specifically the point at the end.
> 
> Using the traditional slider moves the whole fixed curve up and down at once. One part of the curve will hit the lowest part of that dip in OC headroom and you end up wasting the rest of the performance that you could have used at the other parts of the curve that will support being boosted higher.
> 
> After you see the rough curve that the Precision auto utility creates, take some notes and that gives you a pretty good view on where the OC headroom dips and where you can push it higher. You can then use that knowledge in creating a much more fine tuned curve using Afterburner.
> 
> 
> 
> The G1 has a TDP of 200W and the EVGA FTW 226, is there any other BIOS which uses the same VRM phases which would allow me to squeeze a bit more performance out. Not that it matters, I'm pretty happy. The G1 bios isn't choking itself with no reason with the FTW bios
Click to expand...

I honestly couldn't say.

If you browse through the Techpowerup VGA bios database, you might find something. It extracts the Power limits values from the bios file.


----------



## asdkj1740

Quote:


> Originally Posted by *Pittster*
> 
> Tech Power Up do not report Delta T on there temps so its probably just a case of different ambient temps if the cards are tested in the same ope air bench.
> 
> Cards look identical except different colors and slight back plate design change
> 
> https://www.techpowerup.com/reviews/MSI/GTX_1070_Quick_Silver_OC/4.html
> 
> https://www.techpowerup.com/reviews/MSI/GTX_1070_Gaming_Z/4.html


The MOSFETs are different; the Quick Silver has changed to UBIQ MOSFETs.
You can also see this in the TechPowerUp pics from the two reviews above if you zoom in.
MSI widely uses UBIQ MOSFETs on many cards, like the 480, 470, and 1060.



https://www.chiphell.com/portal.php?mod=view&aid=16999&page=3


----------



## asdkj1740

del


----------



## Gurkburk

Quote:


> Originally Posted by *gtbtk*
> 
> you are going to need to give us a bit more information to help you.
> 
> Card model
> Driver version - was it installed from a clean install?
> OC and fan settings you are using
> room and card operating temperatures
> other applications running in the background


Gigabyte 1070 G1

Newest, 95 or whatever it ends at.

I lowered the clocks a bit more, +40 core & +550 memory; I'm not sure, but it seems not to crash.

At 1100rpm it's at 46°C idle and around 60°C in Battlefield 1. It depends on what I set the fans to, but following my curve, that's the temp. Room is about 20°C.

I've got Chrome running a lot of tabs most of the time.


----------



## brettjv

After way too many years w/the same old GPU, I just finally ordered an upgrade ... MSi Quicksilver ... earlier tonight, should be here this week. Also grabbed an X5675 for some 6-core goodness to extend the life of my trusty LGA1366 platform (Rampage III Extreme), but now I'm disappointed to hear I might've gotten a card with not so good a memory OC cause it has Micron memory ... but then there's talk of new BIOSes that can correct the issue and make the Micron OC as well as the Samsung or some such?

Kinda confused ... are all the Quicksilver's shipping without the 'good' bios that allows the higher VRAM? If so, where do I get the 'good' bios? Is it available from MSi directly, or ... what's the deal exactly? Does the Micron OC similarly as long as you have the right BIOS?

BTW, as an Oakland Raiders fan, once I saw that card ... along w/the $379 after MIR tag on it ... basically as cheap as any 1070 on NE ... the choice became a no-brainer ... Now I just need to figure out a way to get some Raiders regalia onto my card ...

As always, TIA!


----------



## asdkj1740

Quote:


> Originally Posted by *brettjv*
> 
> After way too many years w/the same old GPU, I just finally ordered an upgrade ... MSi Quicksilver ... earlier tonight, should be here this week. Also grabbed an X5675 for some 6-core goodness to extend the life of my trusty LGA1366 platform (Rampage III Extreme), but now I'm disappointed to hear I might've gotten a card with not so good a memory OC cause it has Micron memory ... but then there's talk of new BIOSes that can correct the issue and make the Micron OC as well as the Samsung or some such?
> 
> Kinda confused ... are all the Quicksilver's shipping without the 'good' bios that allows the higher VRAM? If so, where do I get the 'good' bios? Is it available from MSi directly, or ... what's the deal exactly? Does the Micron OC similarly as long as you have the right BIOS?
> 
> BTW, as an Oakland Raider fan, once I saw that card ... along w/the $379 after MIR tag on it ... basically as cheap as any 1070 on NE ... the choice became very no-brainer ... Now I just need to figure a way to get some Raider regalia onto my card ...
> 
> As always, TIA!


The Quick Silver comes with the correct BIOS; no need to flash and fix it yourself.
The Quick Silver should also be the only one using Micron VRAM at all, and from the beginning too; there's no chance of getting Samsung VRAM on a Quick Silver, as even the review samples are Micron.

You can go to TechPowerUp and Guru3D and compare the Micron Quick Silver to the Samsung cards to see whether there's any difference between the two VRAMs in terms of overclocking.


----------






## gtbtk

Quote:


> Originally Posted by *Gurkburk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> you are going to need to give us a bit more information to help you.
> 
> Card model
> Driver version - was it installed from a clean install?
> OC and fan settings you are using
> room and card operating temperatures
> other applications running in the background
> 
> 
> 
> Gigabyte 1070 G1
> 
> Newest, 95 or whatever it ends at.
> 
> I lowered the clocks a bit more, +40 CC & 550Memory, im not sure, but it seems to not crash.
> 
> At 1100rpm its at 46*C idle and around 60*C in Battlefield 1, depends on what i set the fans on, but following my curve, thats the temp. Room is about 20*C.
> 
> I've got Chrome running a lot of tabs most of the time.
Click to expand...

Temps look OK; 60°C is fine.

Maybe turn your memory clock down to about +400. I cannot get Time Spy to run properly with memory clocks that high; it might be similar with Battlefield 1.

I have noticed that the Google updater and crash handler apps that run in the background do impact overclocking stability, so you could experiment by closing Chrome and the Google background apps and seeing if that has any effect.


----------



## gtbtk

Quote:


> Originally Posted by *anthonyg45157*
> 
> Any recommendations for a zotac 1070 exreme owner with micron memory ? Just wait for BIOS? I just got my card last night and it seems I have issues above +100 on Mem sometimes artifacts in Windows even. I can do 2100 on core @60 degrees Celsius


Set the NVIDIA Control Panel 3D power management mode to maximum performance. Add dwm.exe and explorer.exe to the list of applications in the control panel and make sure they are set to max performance as well; that should stabilize the card by having it idle at or above about 0.800V until Zotac releases their BIOS update.

The other option would be to cross-flash a Gigabyte 1070 Xtreme .50 BIOS to your card as a temporary measure. It will give you faster default clocks on the core and memory, but lower maximum power targets. It should resolve the memory issues if your card is not physically faulty. I think this will give you a more enjoyable experience, but there is always a small risk of bricking your card if the flash doesn't go properly.

Make sure that you back up the original BIOS so you can restore it before you update with the new Zotac BIOS when it is released.

You will need to get hold of the latest version of nvflash, but here is the Gigabyte BIOS if you want it: https://www.techpowerup.com/vgabios/187291/187291


----------



## anthonyg45157

Quote:


> Originally Posted by *gtbtk*
> 
> set the nvidia control panel 3d power management mode setting to maximum performance. add dwm.exe and explorer.exe to the list of applications in the control panel and also make sure that they are set to max performance as well and it should stabilize your card by having the card Idle at or above about .800v until Zotac releases their Bios update.
> 
> The other option you could try out would be to cross flash a gigabyte 1070 xtreme .50 bios to your card as a temporary measure. It will give you faster default clocks on the core and the memory but will have lower maximum power targets. It should resolve the memory issues if your card is physically not faulty. This will give you a more enjoyable experience I think but there is always a small risk that you could brick your card if the flash update doesn't work properly
> 
> make sure that you back up the original bios so you can reinstall that before you update with the new zotac bios when it is released.
> 
> you will need to get hold of the latest version of nvflash but here is the gigabyte bios if you want it https://www.techpowerup.com/vgabios/187291/187291


Thanks for all your helpful tips! Gonna test and see how it goes. It always seems to freeze with artifacts on the desktop after running a benchmark, so it seems something is going on with the idle voltage. Hopefully Zotac will release something soon. I sent them an email asking about BIOS updates; I doubt I'll get a response, however...


----------



## Jackharm

A few pages back, a fellow posted that Zotac support (through an email) said there should be a BIOS released "late November". Hopefully that remains true, as the month of November is coming to a close.


----------



## MEC-777

Hey all, just ordered a Zotac 1070 FE (black Friday sales). Should arrive sometime this week









Upgrading from a Strix 980. Some people have told me it's not worth it, but the 4GB of VRAM has proven to be a limiting factor in a number of games I like to play, so I think the 1070 will be a much better card for longevity. I've had the 980 for just over a year and it's been awesome, but I'm looking for just a bit more.


----------



## TheBoom

Quote:


> Originally Posted by *Gurkburk*
> 
> Getting driver crashes in BF1. I'm doubting it's due to the overclocks, I've lowered them by quite a bit a few times when every crash happens.
> 
> Anyone else getting this?


Disable origin in game for bf1.


----------



## InvalidUserID

Ordered an MSI GTX 1070 Quicksilver to replace my GTX 680 Lightning. Should arrive tomorrow (Thankfully NewEgg is in the same state).

Loved that 680 as I coveted it back in the day but time for something new. Will post up tomorrow (hopefully).


----------



## Luckael

any gtx 1070 xtreme gaming user here. what is your bios?


----------



## zipper17

Finally got my RAM working at 2133MHz CL10; a slight improvement in minimum framerates.
In Hitman's Marrakesh, in a dense area, the lowest minimum FPS at 1600MHz was 48-49.7; at 2133MHz I never see 49.7, and the lowest is about 51+ FPS.
Quote:


> Originally Posted by *Gurkburk*
> 
> Getting driver crashes in BF1. I'm doubting it's due to the overclocks, I've lowered them by quite a bit a few times when every crash happens.
> 
> Anyone else getting this?


Look in Event Viewer: is there a Display warning with an nvlddmkm error? That could mean the overclock is not stable.


----------



## Luckael

Is there any fix for this? The Division (the game) has been blocked from accessing graphics hardware.


----------



## ITAngel

Quote:


> Originally Posted by *Luckael*
> 
> any gtx 1070 xtreme gaming user here. what is your bios?


Yo, I will let you know what my card has once I get home today from work.


----------



## iARDAs

Bought Asus Dual 1070..

Coming from a Gigabyte 980 Ti Gaming...

What an upgrade huh?


----------



## KSIMP88

So I bought a 1070 for my graphics amplifier, but I'm worried it's too big. The amp says up to a 10.5" card, but the card is 11.9". I didn't know until I found out on a forum by accident. Hope I can jerry-rig it.


----------



## gtbtk

Quote:


> Originally Posted by *Luckael*
> 
> is there any fix for this? the division (games) has been block from accessing graphics hardware?


Reduce your overclock.


----------



## Luckael

Quote:


> Originally Posted by *gtbtk*
> 
> reduce you overclock


My card is a GTX 1070 Xtreme Gaming and it's factory OC'd. The Division is the only game that has the error. I already tried removing the driver with DDU and doing a fresh install, but it's still the same. I don't think it's a faulty card, since only The Division is affected.


----------



## Vowels

I bought a Gigabyte GTX 1070 Mini ITX yesterday and it has some crazy coil whine running Unigine Heaven and Valley. I'm not sure if it's because I'm getting high FPS in those two benchmarks. 100 - 120 fps doesn't seem like it's in the range of excessive where I expect coil whine like in game menus where I can get 300+ fps. I'm not getting any coil whine running Time Spy (<60fps though) and I haven't had the time yet to run actual games since I've just been OC'ing and running benchmarks.

Should I try to return and exchange it for a different card? Is there a way to fix or reduce coil whine? I'm planning to test using actual games tonight and hopefully it goes away when it counts.


----------



## khanmein

Quote:


> Originally Posted by *Vowels*
> 
> I bought a Gigabyte GTX 1070 Mini ITX yesterday and it has some crazy coil whine running Unigine Heaven and Valley. I'm not sure if it's because I'm getting high FPS in those two benchmarks. 100 - 120 fps doesn't seem like it's in the range of excessive where I expect coil whine like in game menus where I can get 300+ fps. I'm not getting any coil whine running Time Spy (<60fps though) and I haven't had the time yet to run actual games since I've just been OC'ing and running benchmarks.
> 
> Should I try to return and exchange it for a different card? Is there a way to fix or reduce coil whine? I'm planning to test using actual games tonight and hopefully it goes away when it counts.


The Gigabyte 1000 series has this well-known issue. Some users say they don't have it, but that comes down to personal circumstances, e.g. the case fans being louder, the rig being further away, etc.

Apparently, with Gigabyte you have a higher chance of receiving Samsung VRAM chips.


----------



## KSIMP88

It fits! 60FPS Stable on the Witcher III. Finally. All max 1080p. Been too long since I had a good GPU

Now I can safely mod skyrim.... HA


----------



## Dude970

Quote:


> Originally Posted by *KSIMP88*
> 
> It fits! 60FPS Stable on the Witcher III. Finally. All max 1080p. Been too long since I had a good GPU
> 
> Now I can safely mod skyrim.... HA


----------



## Jackharm

To be honest, this was one of the first things I checked too.







Quote:


> Originally Posted by *KSIMP88*
> 
> It fits! 60FPS Stable on the Witcher III. Finally. All max 1080p. Been too long since I had a good GPU
> 
> Now I can safely mod skyrim.... HA


----------



## gtbtk

Quote:


> Originally Posted by *Luckael*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> reduce you overclock
> 
> 
> 
> my card is Gtx 1070 xtreme gaming and its factory oc. The Division is only the game that has error. i already try remove driver by ddu and fresh install. but still the same. i think its not a faulty card since The Division is only affected.
Click to expand...

What Power Supply do you have?

Are you running the GPU in OC mode with the Gigabyte software? If so, run it in Gaming mode (the card default) and try again to see if that makes a difference.

Is your PC's CPU/RAM overclocked? You can try backing off some of that overclock and running closer to stock speeds. If it has been stable for a long time I do not expect it to be the cause, but we should properly eliminate it as the base cause. You could also try increasing or decreasing the VCCIO voltage in the BIOS a little to see if that improves your stability.

As an experiment to humor me, could you also try downclocking the GPU by -50MHz in the GPU OC utility and trying the game again? The 1671MHz base clock of the Xtreme is very high and relies on a well-binned GPU to operate reliably; those cards have very little OC headroom to start with. It is possible that the binning is not quite as high as it should be on your particular GPU. This can confirm or rule out whether the card is binned high enough to cope with its default clocks. If it ends up being confirmed, you will have to decide whether an RMA is something you want to go through.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vowels*
> 
> I bought a Gigabyte GTX 1070 Mini ITX yesterday and it has some crazy coil whine running Unigine Heaven and Valley. I'm not sure if it's because I'm getting high FPS in those two benchmarks. 100 - 120 fps doesn't seem like it's in the range of excessive where I expect coil whine like in game menus where I can get 300+ fps. I'm not getting any coil whine running Time Spy (<60fps though) and I haven't had the time yet to run actual games since I've just been OC'ing and running benchmarks.
> 
> Should I try to return and exchange it for a different card? Is there a way to fix or reduce coil whine? I'm planning to test using actual games tonight and hopefully it goes away when it counts.
> 
> 
> 
> giga 1000 series got this well known issue, there's some users said don't have this issue due to personal preferences. e.g. the case fans is louder & the rig distance is too far etc.
> 
> apparently, giga got higher chance u receive samsung vram chip.
Click to expand...

There is nothing intrinsically wrong with the Micron memory chips used on Gigabyte cards, or any other brand for that matter (the statement to Guru3D that they don't use Micron memory was a lie). The previous issue was a firmware bug that has been fixed with the recent .50 BIOS update. Please don't keep spreading fear and trepidation. I note that the Mini does not have any updates available, so maybe those cards were not manufactured at a time when Micron memory was being installed?

Having said that, memory chips of any type certainly don't cause coil whine.

Does it whine in Firestrike? That will push 100fps in the first test.

If the card only whines in Heaven/Valley, it may be something you can decide to live with; but if 100fps in any application triggers it, returning the card and getting a replacement is probably the better option.


----------



## philhalo66

I've been watching my card for a few days with GPU-Z and I keep seeing VREL in the perf cap window. What does VREL actually mean? Google gave me a wide array of answers, but nobody really seemed to have a definite one.

Also, I have BIOS F1; is it worth my time to flash to F2?


----------



## gtbtk

Quote:


> Originally Posted by *philhalo66*
> 
> I've been watching my card for a few days with GPU-Z and I keep seeing VREL in the perf cap window. What does VREL actually mean? Google gave me a wide array of answers, but nobody really seemed to have a definite one.
> 
> Also, I have BIOS F1; is it worth my time to flash to F2?


V.rel means it is using all the voltage that it has been allowed to use.

F1 is the Samsung memory BIOS. Other than bringing it into line with the current .50 range of BIOSes, I cannot see what the update actually improves on those cards. If you like the latest and greatest, go ahead; if you are not having any issues, leave it alone. I am not aware of any performance increases.


----------



## GeneO

Quote:


> Originally Posted by *philhalo66*
> 
> I've been watching my card for a few days with GPU-Z and I keep seeing VREL in the perf cap window. What does VREL actually mean? Google gave me a wide array of answers, but nobody really seemed to have a definite one.
> 
> Also, I have BIOS F1; is it worth my time to flash to F2?


Vrel is reliable voltage. There is also Vop, which is the maximum operating voltage you set.
From observation, I find Vrel is temperature dependent. With good temps on the VRM, etc., the card will run at the frequency of Vop. If the temperature gets high enough, Vrel will lower your voltage and frequency to bring the temps down.
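You can check GeneO's breakdown against your own card: GPU-Z can export a sensor log, and tallying the PerfCap Reason column shows how often each limiter (VRel, VOp, Pwr, Thrm...) kicked in during a run. A minimal sketch in Python, using a hypothetical log excerpt; real logs have more columns and the exact header text varies by GPU-Z version:

```python
from collections import Counter
import csv, io

# Hypothetical excerpt of a GPU-Z sensor log export; real logs have many
# more columns and the exact header text varies by GPU-Z version.
sample_log = """Date , GPU Clock [MHz] , GPU Temperature [C] , PerfCap Reason
2016-11-01 20:00:01 , 2088.0 , 44.0 , VRel
2016-11-01 20:00:02 , 2075.5 , 46.0 , VRel
2016-11-01 20:00:03 , 2050.0 , 52.0 , Pwr
2016-11-01 20:00:04 , 2063.0 , 49.0 , VRel
"""

def perfcap_histogram(log_text):
    """Tally how often each PerfCap reason (VRel, VOp, Pwr, Thrm...) appears."""
    reader = csv.DictReader(io.StringIO(log_text))
    counts = Counter()
    for row in reader:
        # GPU-Z pads its headers and cells with spaces, so strip everything.
        cleaned = {k.strip(): v.strip() for k, v in row.items() if k}
        reason = cleaned.get("PerfCap Reason", "")
        if reason:
            counts[reason] += 1
    return counts

print(perfcap_histogram(sample_log))  # e.g. Counter({'VRel': 3, 'Pwr': 1})
```

If VRel dominates the histogram while temps are modest, the card is simply sitting at its allowed voltage ceiling rather than hitting a power or thermal wall.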


----------



## Vowels

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *khanmein*
> 
> 
> 
> There is nothing intrinsically wrong with the Micron memory chips that have been used on Gigabyte cards, or any other brand for that matter (the statement to Guru3D that they don't use Micron memory was a lie); the previous issue was a firmware bug that has been fixed with the recent .50 BIOS update. Please don't keep spreading fear and trepidation. I note that the Mini does not have any updates available, so maybe they were not manufactured at a time when Micron memory was being installed?
> 
> Having said that memory chips of any type certainly don't cause coil whine.
> 
> Does it whine in firestrike? that will push 100fps in the first test.
> 
> If the card only whines in Heaven/Valley, it may be something you can decide to live with; but if 100fps in any application is the cause, returning the card and getting a replacement is probably the way to go.
Click to expand...

Having tested it with games, there's still just as much coil whine as in Heaven and Valley. Witcher 2 with everything maxed plus ubersampling, hovering around 70 fps, goes crazy. I re-tested Time Spy, this time with my case side panel off and my head close to the video card, and I could hear some coil whine as well. Fire Strike whines too.

I'm starting to think my card whines more and more as it gets closer to 100% usage. There's no whine when I turn on V-sync. I'm going to exchange it for another card.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> There is nothing intrinsically wrong with the Micron memory chips that have been used on Gigabyte cards, or any other brand for that matter (the statement to Guru3D that they don't use Micron memory was a lie); the previous issue was a firmware bug that has been fixed with the recent .50 BIOS update. Please don't keep spreading fear and trepidation. I note that the Mini does not have any updates available, so maybe they were not manufactured at a time when Micron memory was being installed?
> 
> Having said that memory chips of any type certainly don't cause coil whine.
> 
> Does it whine in firestrike? that will push 100fps in the first test.
> 
> If the card only whines in Heaven/Valley, it may be something you can decide to live with; but if 100fps in any application is the cause, returning the card and getting a replacement is probably the way to go.


what i mean is that giga coil whine is a well-known issue, whether Micron or Samsung; the coil whine does exist. i didn't say Micron causes coil whine.

i stated very clearly that giga cards have a higher chance of coming with Samsung in my country. the giga 9xx series didn't have much of a coil whine problem, but the 1xxx series, giga especially, has a pretty high chance of coil whine.

based on my survey, msi & asus don't have coil whine. giga is the worst for coil whine, followed by evga, then palit/zotac, but their percentages are way lower compared to giga.

i've already stopped with the micron fiasco, but i still suggest people get samsung!


----------



## philhalo66

Quote:


> Originally Posted by *gtbtk*
> 
> V.rel means it is using all the voltage that it has been allowed to use.
> 
> F1 is the Samsung memory BIOS. Other than bringing it into line with the current .50 range of BIOSes, I cannot see what the update actually improves on those cards. If you like the latest and greatest, go ahead; if you are not having any issues, leave it alone. I am not aware of any performance increases.


Yeah, I don't have any issues. It's been rock solid since day one, so I'll just leave it alone then.

Quote:


> Originally Posted by *GeneO*
> 
> Vrel is reliable voltage. There is also Vop, which is the maximum operating voltage you set.
> From observation, I find Vrel is temperature dependent. With good temps on the VRM, etc., the card will run at the frequency of Vop. If the temperature gets high enough, Vrel will lower your voltage and frequency to bring the temps down.


Hmm, I set the voltage to +100%, so I guess the VRM is getting too warm. The GPU seems to max out at 2120MHz with a core temp of 55C.


----------



## Lord Xeb

I just got my GTX 1070. Will be selling my GTX 970s on OCN tomorrow.


----------



## Quadrider10

Can we get a BIOS editor and custom BIOSes already! Jeez lol, tired of my card throttling because of some weird power or temp targets.


----------



## LeandroJVarini

Hi guys, add me to the club! I'm back on the green side.









GeForce PNY GTX 1070 OC edition


----------



## MEC-777

Quote:


> Originally Posted by *Quadrider10*
> 
> Can we get a bios editor and custom bioses already! Jeez lol tired of my card throttling cause of some weird power or temp targets.


Open MSI afterburner and move the temp and power target sliders all the way up. No more throttling.


----------



## criminal

Quote:


> Originally Posted by *MEC-777*
> 
> Open MSI afterburner and move the temp and power target sliders all the way up. No more throttling.


LOL... nope. Not with Pascal. GPU Boost 3.0 makes sure of that.


----------



## MEC-777

Quote:


> Originally Posted by *criminal*
> 
> LOL... nope. Not with Pascal. GPU Boost 3.0 makes sure of that.


Right, but GPU boost reacts based on those targets, no? Increase power and temp targets and it should hold higher boost clocks more consistently. (Unless you have a serious case temp/air flow problem)

@Quadrider10 How much is it throttling?


----------



## criminal

Quote:


> Originally Posted by *MEC-777*
> 
> Right, but GPU boost reacts based on those targets, no? Increase power and temp targets and it should hold higher boost clocks more consistently. (Unless you have a serious case temp/air flow problem)
> 
> @Quadrider10 How much is it throttling?


Temps help more than anything with Pascal. I think throttling starts happening as early as 35C though.

Pascal really, really needs a bios editor and a way to disable GPU boost 3.0


----------



## ogow89

Quote:


> Originally Posted by *criminal*
> 
> Temps help more than anything with Pascal. I think throttling starts happening as early as 35C though.
> 
> Pascal really, really needs a bios editor and a way to disable GPU boost 3.0


53°C*

There, corrected it


----------



## criminal

Quote:


> Originally Posted by *ogow89*
> 
> 53°C*
> 
> There, corrected it


No, I don't think it is that high.

http://images.anandtech.com/doci/10325/TempComp.png?_ga=1.111884292.1437714659.1474307155


----------



## ogow89

Quote:


> Originally Posted by *criminal*
> 
> No, I don't think it is that high.
> 
> http://images.anandtech.com/doci/10325/TempComp.png?_ga=1.111884292.1437714659.1474307155


The card runs at 35°C when idle, so it can't be throttling right away. Launching a game kicks the temps into the low 40s on the game's menu before the fans even start spinning.

I tested it on my card and the core clock holds at the max possible boost as long as the temps don't go over 54°C; as soon as it goes 1 degree over that, it drops by around 13MHz.
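The behaviour described above can be modeled as GPU Boost shedding one ~13MHz clock bin each time the core crosses a temperature threshold. A toy sketch of that idea; the threshold temperatures here are purely illustrative (they vary from card to card and are not from any NVIDIA spec), chosen only to match the ballpark figures posters in this thread have reported:

```python
# Toy model of GPU Boost 3.0 temperature binning: one ~13 MHz bin is lost
# per temperature threshold crossed. Threshold values are assumptions for
# illustration only; real cards differ and NVIDIA publishes no such table.
BIN_MHZ = 13
THRESHOLDS_C = [37, 48, 54, 60]  # illustrative, not from any spec

def boosted_clock(max_boost_mhz, temp_c):
    """Estimate the sustained boost clock at a given core temperature."""
    bins_lost = sum(1 for t in THRESHOLDS_C if temp_c > t)
    return max_boost_mhz - bins_lost * BIN_MHZ

print(boosted_clock(2088, 35))  # below every threshold: full 2088 boost
print(boosted_clock(2088, 55))  # three thresholds crossed: 2088 - 39 = 2049
```

This also reconciles the disagreement above: a card that idles in the 20s visibly drops bins in the 30s/40s, while a card idling at 35°C on a zero-fan profile never shows those first drops because it starts above them.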


----------



## Dude970

Quote:


> Originally Posted by *ogow89*
> 
> The card runs at 35°C when idle, so it can't be throttling right away. Launching a game kicks the temps into the low 40s on the game's menu before the fans even start spinning.
> 
> I tested it on my card and the core clock holds at the max possible boost as long as the temps don't go over 54°C; as soon as it goes 1 degree over that, it drops by around 13MHz.


If you had lower idle temps, you would see that Criminal is correct; they start throttling in the 35C area.


----------



## ogow89

Quote:


> Originally Posted by *Dude970*
> 
> If you had lower idle temps you would see that Criminal is correct, they start throttling in the 35c area


Not with the zero fan profile









Well, I can only speak to what I've noticed with my own card until more test sites actually prove it.


----------



## Dude970

The fan profile, while it may affect temps, doesn't matter. Pascal will throttle in the low 30s; google it, it's already established. My card idling in the 20s, then running a bench and watching the temps, shows that.


----------



## GeneO

Quote:


> Originally Posted by *ogow89*
> 
> 53°C*
> 
> There, corrected it


That is about where I see it start throttling - the low 50s, but it probably has already throttled some before I see it.


----------



## GeneO

Quote:


> Originally Posted by *Dude970*
> 
> The fan profile, while it may affect temps, doesn't matter. Pascal will throttle in the low 30s; google it, it's already established. My card idling in the 20s, then running a bench and watching the temps, shows that.


I see temperature throttling in the low 50s. I can maintain higher clocks if I set my fan profile accordingly. It shows up as Vrel; I presume this is because the VRM temp gets high enough that the controller thinks the voltage is not reliable and lowers it. And my idle is 30.


----------



## criminal

Quote:


> Originally Posted by *Dude970*
> 
> The fan profile, while it may affect temps, doesn't matter. Pascal will throttle in the low 30s; google it, it's already established. My card idling in the 20s, then running a bench and watching the temps, shows that.


QFT.

My card idles at 29C and tops out at 45C. I drop two to three clock bins by the time my card hits max temp.


----------



## philhalo66

Mine doesn't throttle till the high 50s. It runs at a constant 2200 till around 58C, then drops to 2120 and stays there regardless of the temps.


----------



## Nukemaster

Quote:


> Originally Posted by *philhalo66*
> 
> Mine doesn't throttle till the high 50s. It runs at a constant 2200 till around 58C, then drops to 2120 and stays there regardless of the temps.


Lucky you. Cards with semi-passive cooling run hotter at idle and may not show the drops. I think mine starts to drop in the 40s. The overall drop is very small, maybe 2-3 bins, but it is still a drop.


----------



## philhalo66

Quote:


> Originally Posted by *Nukemaster*
> 
> Lucky you. Cards with semi-passive cooling run hotter at idle and may not show the drops. I think mine starts to drop in the 40s. The overall drop is very small, maybe 2-3 bins, but it is still a drop.


why are these cards throttling at such low temps? is it a bug or a design flaw?


----------



## Nukemaster

I am not sure. When I first got the card, I posted in this thread and some others said they had the same thing, so I just figured it was normal for Pascal cards. It may be on a per-GPU basis, similar to how the boost feature works.

This was when I first got the card on stock cooling and settings.


----------



## RJTablante

ZOTAC 1070 owners, get ready for the vBios update to address the Micron issue! Just got off of Zotac chat support and they confirmed that the vBios will be dropping tomorrow morning!

They're literally putting the .exe together for distribution to adhere to nvidia standards, so if all holds true then we're finally in business!

Created an account just to post this here since I figure others have been waiting for this as well


----------



## KSIMP88

Micron issue? I just got this thing


----------



## GeneO

Quote:


> Originally Posted by *KSIMP88*
> 
> Micron issue? I just got this thing


Search. Google. You should be all right.


----------



## KSIMP88

Please don't make me do stuff


----------



## Dude970

Quote:


> Originally Posted by *KSIMP88*
> 
> Please don't make me do stuff


The micron issue has been resolved with a BIOS update


----------



## KSIMP88

I have Samsung GDDR5 on mine. ZT-P10700I-10P

Haven't fiddled with overclocking yet. I'm a practical overclocker, so I don't even know if I need to overclock yet.

I upgraded my RAM and SSD in my laptop, so I did a fresh install of Windows, and reinstalling my games. If I feel I need to overclock, I'll do so, but I don't think I need to. I'm using a 1080P display for now.


----------



## Quadrider10

My G1 usually settles around 1949MHz stock at 60C on the core. If I move the temp and power targets up, it settles around 1980MHz at 60C. But it boosts to 2000MHz until 50C-ish or so, no matter the settings.


----------



## Forceman

Quote:


> Originally Posted by *KSIMP88*
> 
> I have Samsung GDDR5 on mine. ZT-P10700I-10P
> 
> Haven't fiddled with overclocking yet. I'm a practical overclocker, so I don't even know if I need to overclock yet.
> 
> I upgraded my RAM and SSD in my laptop, so I did a fresh install of Windows, and reinstalling my games. If I feel I need to overclock, I'll do so, but I don't think I need to. I'm using a 1080P display for now.


Honestly not much point in overclocking the core, the boost already does a pretty good job there. Overclocking the memory can help though, and Samsung memory should do +400 or +500 with no problem.
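To put numbers on the memory suggestion: GDDR5 transfers data four times per clock tick, and Afterburner-style offsets are usually applied to the double-data-rate figure the tool displays, so a +400 offset adds 800MHz effective. A quick sketch of that bookkeeping for a stock 1070, assuming the Afterburner convention (other tools may display a different figure):

```python
# GDDR5 clock bookkeeping for a stock GTX 1070 (a sketch; tools differ in
# which figure they display). The chips tick at 2002 MHz, transfer data
# four times per tick, and an Afterburner-style offset is assumed here to
# apply to the double-data-rate figure (2 x base).
BASE_MHZ = 2002               # actual memory clock
DDR_MHZ = BASE_MHZ * 2        # 4004 - what Afterburner/GPU-Z typically show
EFFECTIVE_MHZ = BASE_MHZ * 4  # 8008 - the marketing "8 Gbps" number

def effective_after_offset(offset_mhz):
    """Effective data rate after a +offset applied to the DDR figure."""
    return (DDR_MHZ + offset_mhz) * 2

print(effective_after_offset(400))  # 8808 MHz effective for a +400 offset
print(effective_after_offset(500))  # 9008 MHz effective for a +500 offset
```

So +400/+500 on the slider is roughly a 10-12% memory bandwidth bump, which is why it tends to help more than chasing a few extra core bins.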


----------



## gtbtk

Quote:


> Originally Posted by *criminal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MEC-777*
> 
> Open MSI afterburner and move the temp and power target sliders all the way up. No more throttling.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> LOL... nope. Not with Pascal. GPU Boost 3.0 makes sure of that.
Click to expand...

founders edition....there's your problem....

the only way you are going to get what you want is to put the card under water.


----------



## gtbtk

Quote:


> Originally Posted by *ogow89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *criminal*
> 
> Temps help more than anything with Pascal. I think throttling starts happening as early as 35C though.
> 
> Pascal really, really needs a bios editor and a way to disable GPU boost 3.0
> 
> 
> 
> 53°C*
> 
> There, corrected it
Click to expand...

Idling at 35 with the fan running, it starts ramping down while the card is still in its 40s - 47 degrees, from memory.


----------



## gtbtk

Quote:


> Originally Posted by *KSIMP88*
> 
> I have Samsung GDDR5 on mine. ZT-P10700I-10P
> 
> Haven't fiddled with overclocking yet. I'm a practical overclocker, so I don't even know if I need to overclock yet.
> 
> I upgraded my RAM and SSD in my laptop, so I did a fresh install of Windows, and reinstalling my games. If I feel I need to overclock, I'll do so, but I don't think I need to. I'm using a 1080P display for now.


Samsung memory is fine; no need for the BIOS update.


----------



## AstralReaper

Can't wait to dig into my EVGA 1070 FTW edition. EVGA finally sent me my thermal pads so I can do the thermal pad mod, and I am also replacing the thermal paste with aftermarket. Hoping to see some good results when it's all said and done. Anyone else notice a good difference after the thermal pad upgrade?


----------



## gtbtk

Quote:


> Originally Posted by *AstralReaper*
> 
> Can't wait to dig into my EVGA 1070 FTW edition. EVGA finally sent me my thermal pads so I can do the thermal pad mod, and I am also replacing the thermal paste with aftermarket. Hoping to see some good results when it's all said and done. Anyone else notice a good difference after the thermal pad upgrade?


I wouldn't expect too much difference from the pads. The thermals were not the cause of the problem; Gamers Nexus found that it was actually a bad batch of capacitors that were blowing up.

No idea how large the batch was, and you won't know if you were affected until the card blows a capacitor.


----------



## gtbtk

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KSIMP88*
> 
> I have Samsung GDDR5 on mine. ZT-P10700I-10P
> 
> Haven't fiddled with overclocking yet. I'm a practical overclocker, so I don't even know if I need to overclock yet.
> 
> I upgraded my RAM and SSD in my laptop, so I did a fresh install of Windows, and reinstalling my games. If I feel I need to overclock, I'll do so, but I don't think I need to. I'm using a 1080P display for now.
> 
> 
> 
> Honestly not much point in overclocking the core, the boost already does a pretty good job there. Overclocking the memory can help though, and Samsung memory should do +400 or +500 with no problem.
Click to expand...

Even Micron memory, after the BIOS update, will do +400 to +500 without too much drama. I can run mine at +600, but it starts to show the odd artifact.


----------



## asdkj1740

For those who would like to buy a bbn/FE/reference-PCB card, you should go for the Inno3D iChill X3 or X4.
It's an upgraded reference PCB with a full six-phase design, the reference 4C85N MOSFETs, ten PCB layers like the reference, better cooling and noise control, and a higher power limit thanks to an extra 6-pin connector. It is 2.5 slots, though, and the caps are different (not worse - they actually seem better).

No need to hunt for the Samsung-VRAM cards: the newest MSI models already come with Micron, meaning Samsung VRAM is now very, very rare. With the correct BIOS installed, you are good to go.


----------



## asdkj1740

There is no reason to buy the Pascal EVGA FTW now, because the FTW Hybrid is only ~$30 USD more.
The Hybrid's AIO is from Asetek, with good VRAM and VRM cooling too, so there's no reason to choose the FTW over the FTW Hybrid.


----------



## asdkj1740

https://www.techpowerup.com/228323/gainward-intros-the-gamesoul-geforce-gtx-1080-and-gtx-1070-graphics-cards
The Gainward GameSoul is the same as the Galax HOF - very good PCB quality.


----------



## anthonyg45157

Quote:


> Originally Posted by *RJTablante*
> 
> ZOTAC 1070 owners, get ready for the vBios update to address the Micron issue! Just got off of Zotac chat support and they confirmed that the vBios will be dropping tomorrow morning!
> 
> They're literally putting the .exe together for distribution to adhere to nvidia standards, so if all holds true then we're finally in business!
> 
> Created an account just to post this here since I figure others have been waiting for this as well


I hope this holds true. I talked to zotac chat support two days ago and they couldn't confirm or give a date.

Messaged them again and got this



Take it with a huge grain of salt


----------



## MEC-777

Quote:


> Originally Posted by *gtbtk*
> 
> founders edition....there's your problem....
> 
> the only way you are going to get what you want is to put the card under water.


When my Zotac FE finally arrives, I'll report back with how much it throttles out of the box vs after some tweaking.

I don't need insane high overclocks, so I don't think I'll need water to get what I want.









A little concerned about the Micron issue, though. This is the first I've heard of it. Hope flashing the vBIOS is easier on Nvidia than it is on AMD cards.









If all the FE cards are the same from all manufacturers, then you should be able to flash the updated vBIOS from another onto the Zotac and it should work just fine, no?


----------



## RJTablante

Looks like I got bamboozled... apologies guys, it doesn't look like 'Paolo' from Zotac support was being honest with me.

Hopefully we're still looking at a release within the next week.
Quote:


> Originally Posted by *anthonyg45157*
> 
> I hope this holds true. I talked to zotac chat support two days ago and they couldn't confirm or give a date.
> 
> Messaged them again and got this
> 
> 
> 
> Take with huge grain of salt


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> for those who would like to buy bbn/fe/ref pcb card, you should go for inno3d ichill x3 or x4.
> its upgraded ref pcb, with full six phases design, 4c85n ref's mosfet, ten pcb layers as ref does, better cooling and noise control, higher power limit with extra 6pin connector, but its 2.5 slot, and caps are different (not worse but seems even better).
> 
> no need to find the samsung vram cards, as the newest model in msi comes with micron already meaning samsung vram is now very very rare. with correct bios installed, you are good to go.


Inno3D and Colorful are not available everywhere in the world, though; I don't think they are being sold in the US. I have not seen much coverage of the iChill Pascal cards to date, but the Maxwell cards got quite good reviews.

The manufacturers, including MSI, have been alternating batches between Micron and Samsung, but Micron memory no longer has any issues after the .50 BIOS update. The Quick Silver cards are just Gaming X cards with a Gaming Z backplate painted silver and a silver/black fan shroud. Those reviewed to date have had Micron memory with the Gaming X .50 BIOS from the factory, but I expect we will see those cards with Samsung memory sometime in the future.


----------



## iARDAs

Anyone owns the asus dual 1070?


----------



## gtbtk

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> founders edition....there's your problem....
> 
> the only way you are going to get what you want is to put the card under water.
> 
> 
> 
> When my Zotac FE finally arrives, I'll report back with how much it throttles out of the box vs after some tweaking.
> 
> I don't need insane high overclocks, so I don't think I'll need water to get what I want.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Little concerned about the micron issue though. This is the first I've heard of it. Hope flashing the vBIOS is easier on Nvidia than it is on AMD cards.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If all the FE cards are the same from all manufacturers, then you should be able to flash the updated vBIOS from another onto the Zotac and it should work just fine, no?
Click to expand...

Anything above 1683MHz is a free overclock that GPU Boost 3.0 gives you anyway.

Most cards can be made to start above 2100MHz by overclocking with the curve, but they will settle somewhere in the 20xxMHz range under most 3D loads in normal operating conditions. That is not throttling in the same way a CPU cuts its clocks back when it hits 95 degrees to protect itself from destruction.

Yes, Pascal does balance clock speed and voltage draw against the temp and power limits, but it is not doing that to prevent destruction unless you have closed up all airflow and the card is overheating. The only caveat is that the FE cooler is not as efficient as the custom coolers, though it is better in confined spaces because the heat is exhausted out the back instead of into the case, where it heats up everything else.

As for flashing, all the manufacturers seem to be using a standardized method which is really easy and takes care of it all for you.


----------



## gtbtk

Quote:


> Originally Posted by *RJTablante*
> 
> Looks like I got bamboozled.... apologies guys, doesn't look like 'Paolo' from Zotac support was being honest with me.
> 
> Hopefully still we're looking at a release taking place within the next week
> Quote:
> 
> 
> 
> Originally Posted by *anthonyg45157*
> 
> I hope this holds true. I talked to zotac chat support two days ago and they couldn't confirm or give a date.
> 
> Messaged them again and got this
> 
> 
> 
> Take with huge grain of salt
Click to expand...

It will have to arrive sometime soon unless Zotac are shutting down the sale of Nvidia Pascal cards, which I doubt.

The BIOS update is provided to them by Nvidia as part of the core package; Zotac only has to modify the clock settings and power limits applicable to each card in their range.

I suspect that, as Zotac are not exactly the largest supplier of cards, they manufacture the cards in batches and warehouse them until they can ship to the sellers before manufacturing the next batch. I guess they are tying the BIOS update into the schedule for their next manufacturing run.


----------



## chrcoluk

Quote:


> Originally Posted by *asdkj1740*
> 
> the very first micron's review sample on techpowerup, is micron oc the same as samsung oc?
> https://www.techpowerup.com/reviews/MSI/GTX_1070_Quick_Silver_OC/32.html
> 
> btw, seriously, why this card has significant cooling improvement than msi gamingx and gamingz???????
> https://www.techpowerup.com/reviews/MSI/GTX_1070_Quick_Silver_OC/33.html
> 
> i really curious about the power settings on its bios.
> according to techpowerup bios collection, this new msi card has the same bios as the msi gamingx: 230W @ 100% and 291W MAX.


Seems a coincidence that a reviewer only reviews a Micron card after the Micron BIOS fixes?


----------



## gtbtk

Quote:


> Originally Posted by *chrcoluk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *asdkj1740*
> 
> the very first micron's review sample on techpowerup, is micron oc the same as samsung oc?
> https://www.techpowerup.com/reviews/MSI/GTX_1070_Quick_Silver_OC/32.html
> 
> btw, seriously, why this card has significant cooling improvement than msi gamingx and gamingz???????
> https://www.techpowerup.com/reviews/MSI/GTX_1070_Quick_Silver_OC/33.html
> 
> i really curious about the power settings on its bios.
> according to techpowerup bios collection, this new msi card has the same bios as the msi gamingx: 230W @ 100% and 291W MAX.
> 
> 
> 
> Seems a coincidence that a reviewer only reviews a Micron card after the Micron BIOS fixes?
Click to expand...

It is very easy to be a cynic isn't it?


----------



## AliasOfMyself

Still need to know what BIOS version those of you with the Gigabyte 1070 Xtreme Gaming are running. Mine shipped with F2 and refuses to flash F3; it gives me an error saying "board is incompatible with firmware version 86.04.1E.00.AA".

I've got a rep from the Gigabyte Xtreme Gaming Facebook page working on the problem with me (AFAIK Gigabyte has only released one version of my card, so the BIOS SHOULD flash fine).

If someone with the same card could tell me what BIOS version they're running, and also post a screenshot of the main tab of GPU-Z so I can compare some stuff, I'd really appreciate it!


----------



## gtbtk

Quote:


> Originally Posted by *AliasOfMyself*
> 
> Still need to know what bios version those of you that have the Gigabyte 1070 Xtreme Gaming are running.. mine shipped with F2 and refuses to flash F3, gives me an error saying "board is incompatible with firmware version 86.04.1E.00.AA"
> 
> I've got a rep from the Gigabyte Xtreme Gaming facebook page working on the problem with me(AFAIK Gigabyte only have one version of my card released, so the bios SHOULD flash fine).
> 
> If someone that has the same card as me could tell me what version bios they're running, and also post a screenshot of the main tab of gpu-z so i can compare some stuff, i'd really appreciate it!


There are two different directories in the zip file. Doesn't the xtreme have a dual bios implementation?

Are you sure that you are trying to update the correctly selected bios on the card with the correct update version?

The xtreme gaming engine has a bios update tool that may help as well


----------



## Luckael

Quote:


> Originally Posted by *gtbtk*
> 
> What Power Supply do you have?
> 
> Are you running the GPU in OC mode with the gigabyte software? if so run it in gaming mode (the card default) and try again see if that makes a difference.
> 
> Is your PC CPU/RAM overclocked? You can try taking some of your PC overclock off and running that closer to stock speeds. If that has been stable for a long time i do not expect that to be the cause but we should eliminate it properly
> that as the base cause. You could also try experimenting by increasing or decreasing VCCIO voltage in the bios a little to see if that improves your stability
> 
> As an experiment to humor me, could you also try down clocking the GPU by -50mhz in the gpu OC utility and trying the game again? The 1671 base clock of the xtreme is very high and relies on having a binned GPU to operate reliably. those cards have very little if any OC headroom to start with. It is possible that the binning is not quite as high as it maybe should be on your model GPU. This can confirm or eliminate if the card is not binned high enough to cope with the default clocks. If it ends up being confirmed you will have to decide if an RMA is something you want to go through.


Finally, I fixed my problem. I ticked Debug Mode in the NVIDIA settings; the downside is the boost clock dropped from 2000MHz to around 1890MHz.


----------



## gtbtk

Quote:


> Originally Posted by *Luckael*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> What Power Supply do you have?
> 
> Are you running the GPU in OC mode with the gigabyte software? if so run it in gaming mode (the card default) and try again see if that makes a difference.
> 
> Is your PC CPU/RAM overclocked? You can try taking some of your PC overclock off and running that closer to stock speeds. If that has been stable for a long time i do not expect that to be the cause but we should eliminate it properly
> that as the base cause. You could also try experimenting by increasing or decreasing VCCIO voltage in the bios a little to see if that improves your stability
> 
> As an experiment to humor me, could you also try down clocking the GPU by -50mhz in the gpu OC utility and trying the game again? The 1671 base clock of the xtreme is very high and relies on having a binned GPU to operate reliably. those cards have very little if any OC headroom to start with. It is possible that the binning is not quite as high as it maybe should be on your model GPU. This can confirm or eliminate if the card is not binned high enough to cope with the default clocks. If it ends up being confirmed you will have to decide if an RMA is something you want to go through.
> 
> 
> 
> Finally, I fixed my problem. I ticked Debug Mode in the NVIDIA settings; the downside is the boost clock dropped from 2000MHz to around 1890MHz.
Click to expand...

good to hear


----------



## AliasOfMyself

Quote:


> Originally Posted by *gtbtk*
> 
> There are two different directories in the zip file. Doesn't the xtreme have a dual bios implementation?
> 
> Are you sure that you are trying to update the correctly selected bios on the card with the correct update version?


Yeah, Gigabyte told me one BIOS is for DisplayPort (ends in DP in the zip) and the other is for DVI/HDMI (ends in DH in the zip). I'm running on DVI, so I tried to flash the DH BIOS; this is all Gigabyte told me to do lol.

It's not like they provide this info anywhere - trust me, I looked. I told them they need to include some instructions with their files; no good making people guess.









The Xtreme Engine software isn't the greatest. It says there's no BIOS update available when I use it, and that was the first thing I tried. I'd understand if there were two Xtreme 1070s made by Gigabyte, but there's only one, so it's strange...


----------



## gtbtk

Quote:


> Originally Posted by *AliasOfMyself*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> There are two different directories in the zip file. Doesn't the xtreme have a dual bios implementation?
> 
> Are you sure that you are trying to update the correctly selected bios on the card with the correct update version?
> 
> 
> 
> Yeah Gigabyte told me one bios is for display port(ends in DP in the zip) and the other is for DVI/HDMI(ends in DH in the zip). I'm running on DVI so I tried to flash the DH bios, this is all what Gigabyte have told me to do lol.
> 
> It's not like they provide this info anywhere, trust me I looked, told them they need to include some instructions with their files, no good making people guess
Click to expand...

I don't have a Giga Xtreme card, but doesn't the Xtreme have custom circuitry that switches automatically to the other output on a reboot? Shouldn't the card have both installed?

I had a look at the bios download and noticed that they were using psychic support and documentation


----------



## Luckael

Quote:


> Originally Posted by *AliasOfMyself*
> 
> Still need to know what bios version those of you that have the Gigabyte 1070 Xtreme Gaming are running.. mine shipped with F2 and refuses to flash F3, gives me an error saying "board is incompatible with firmware version 86.04.1E.00.AA"
> 
> I've got a rep from the Gigabyte Xtreme Gaming facebook page working on the problem with me(AFAIK Gigabyte only have one version of my card released, so the bios SHOULD flash fine).
> 
> If someone that has the same card as me could tell me what version bios they're running, and also post a screenshot of the main tab of gpu-z so i can compare some stuff, i'd really appreciate it!


we have the same card and bios.


----------



## AliasOfMyself

Quote:


> Originally Posted by *gtbtk*
> 
> I don't have a giga xtreme card but doesnt the xtreme card have custom circuitry that will switch automatically to the other output with a reboot? shouldnt the card have both installed?
> 
> I had a look at the bios download and noticed that they were using psychic support and documentation


Oh, I asked them about that. I told them I don't have a DisplayPort-capable display, so they told me to flash just the DH BIOS. The hilarious thing is, after the nvflash DOS box closes, another window pops up saying the flash was successful and asking for a reboot, yet the card still remains on the F2 BIOS.


----------



## AliasOfMyself

Quote:


> Originally Posted by *Luckael*
> 
> we have the same card and bios.


Have you tried flashing the F3 bios? Also can you attach a screenshot of the main tab in gpu-z for me? Thanks


----------



## Luckael

Quote:


> Originally Posted by *AliasOfMyself*
> 
> Have you tried flashing the F3 bios? Also can you attach a screenshot of the main tab in gpu-z for me? Thanks


Yeah, I already did, but I reverted back to F2. I will post later; I'm in the office right now.


----------



## AliasOfMyself

Quote:


> Originally Posted by *Luckael*
> 
> yeah i already did it. but i revert back to F2. i will post later, im in the office right now


How come you went back to F2? And thanks, I appreciate it


----------



## Luckael

Quote:


> Originally Posted by *AliasOfMyself*
> 
> How come you went back to F2? And thanks, I appreciate it


I downloaded the Xtreme Gaming BIOS here https://www.techpowerup.com/vgabios/ and flashed it using nvflash.


----------



## AliasOfMyself

Quote:


> Originally Posted by *Luckael*
> 
> i downloaded the extreme gaming bios here https://www.techpowerup.com/vgabios/ and flash it using nvflash


I meant why did you go back to F2 from F3?


----------



## ajx

Hello, which is the most stable Nvidia driver for the 1070?
I am running Windows 10 Pro 64-bit and encountering severe issues:
- no app tiles, an unstable taskbar, and an incomplete Start menu (the Settings button is missing)
- can't update through Windows Update
- an unstable OS

I figured out there was a leftover Nvidia folder; faulty drivers, I assume.
Windows behaves oddly: even after a proper clean install, the Nvidia folder still remains on the hard drive partition. I have never seen such a thing.








I was using 373.06; I never want to install that version again.








I guess the latest one would be the most stable version, right?

Thanks


----------



## AliasOfMyself

Try 375.95, that's what I'm running and everything is running nicely..


----------



## gtbtk

Quote:


> Originally Posted by *ajx*
> 
> Hello, which is the most stable Nvidia driver for the 1070?
> I am running Windows 10 Pro 64-bit and encountering severe issues:
> - no app tiles, an unstable taskbar, and an incomplete Start menu (the Settings button is missing)
> - can't update through Windows Update
> - an unstable OS
> 
> I figured out there was a leftover Nvidia folder; faulty drivers, I assume.
> Windows behaves oddly: even after a proper clean install, the Nvidia folder still remains on the hard drive partition. I have never seen such a thing.
> 
> 
> 
> 
> 
> 
> 
> 
> I was using 373.06; I never want to install that version again.
> 
> 
> 
> 
> 
> 
> 
> 
> I guess the latest one would be the most stable version, right?
> 
> Thanks


A couple of options to fix your menu.

Option 1 is to open an administrator PowerShell and run `Get-AppXPackage -AllUsers | Foreach {Add-AppxPackage -DisableDevelopmentMode -Register "$($_.InstallLocation)\AppXManifest.xml"}`. That should repair your Start menu.

Another option is to create a new user ID; this is a Windows problem, not one caused directly by the Nvidia drivers. If you really need to keep the old login ID, copy all your user files to a new account, delete the old messed-up ID including the AppData folders, then recreate it with the name you want to use. Log in and copy your user files back again.

The latest 376.09 drivers seem to be quite good for me.

The C:\NVIDIA folder only contains the unpacked installation files from the driver installer.


----------



## Dan-H

hey 1070 club.

I'm building a system for a family member and looking to choose a 1070.

right now these two are top of my list.

ASUS ROG GeForce GTX 1070 STRIX-GTX1070-O8G-GAMING

http://www.newegg.com/Product/Product.aspx?Item=N82E16814126109&Tpk=N82E16814126109

and

MSI GeForce GTX 1070 DirectX 12 GeForce GTX 1070 Quick Silver 8G OC

http://www.newegg.com/Product/Product.aspx?Item=N82E16814137046&Tpk=N82E16814137046

I've owned a few MSI cards, but not any Asus.

This is going into a skylake 6700K/16GB system, similar in ways to my daily driver, just newer.

Opinions please?


----------



## jlhawn

Quote:


> Originally Posted by *Dan-H*
> 
> hey 1070 club.
> 
> I'm building a system for a family member and looking to choose a 1070.
> 
> right now these two are top of my list.
> 
> ASUS ROG GeForce GTX 1070 STRIX-GTX1070-O8G-GAMING
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814126109&Tpk=N82E16814126109
> 
> and
> 
> MSI GeForce GTX 1070 DirectX 12 GeForce GTX 1070 Quick Silver 8G OC
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814137046&Tpk=N82E16814137046
> 
> I've owned a few MSI cards, but not any Asus.
> 
> This is going into a skylake 6700K/16GB system, similar in ways to my daily driver, just newer.
> 
> Opinions please?


MSI


----------



## GeneO

MSI









Or Gigabyte. I am seeing good things about their OC ability.

.


----------



## MEC-777

Quote:


> Originally Posted by *Dan-H*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> hey 1070 club.
> 
> I'm building a system for a family member and looking to choose a 1070.
> 
> right now these two are top of my list.
> 
> ASUS ROG GeForce GTX 1070 STRIX-GTX1070-O8G-GAMING
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814126109&Tpk=N82E16814126109
> 
> and
> 
> MSI GeForce GTX 1070 DirectX 12 GeForce GTX 1070 Quick Silver 8G OC
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814137046&Tpk=N82E16814137046
> 
> I've owned a few MSI cards, but not any Asus.
> 
> This is going into a skylake 6700K/16GB system, similar in ways to my daily driver, just newer.
> 
> Opinions please?


I've had a Strix 980 OC for just over a year now (running at 1459 daily without added voltage). Best GPU I've owned to date. If the price was right, I would have sprung for the Strix 1070. IMO, they are great cards.

Can't really go wrong with either, honestly. Just pick whichever looks better to you.


----------



## gtbtk

Quote:


> Originally Posted by *Dan-H*
> 
> hey 1070 club.
> 
> I'm building a system for a family member and looking to choose a 1070.
> 
> right now these two are top of my list.
> 
> ASUS ROG GeForce GTX 1070 STRIX-GTX1070-O8G-GAMING
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814126109&Tpk=N82E16814126109
> 
> and
> 
> MSI GeForce GTX 1070 DirectX 12 GeForce GTX 1070 Quick Silver 8G OC
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814137046&Tpk=N82E16814137046
> 
> I've owned a few MSI cards, but not any Asus.
> 
> This is going into a skylake 6700K/16GB system, similar in ways to my daily driver, just newer.
> 
> Opinions please?


I have the MSI and love it, but to be honest they are both about the same. The MSI is slightly quieter, they both cool about the same, and the MSI BIOS has a higher power limit, so it will give you slightly more stable clocks, but nothing to get anxious about.

Unless you have a design/colour priority, I would suggest whichever one is cheaper when you go to buy, and save the extra cash. If they are the same price, I would sway towards the MSI for the reasons mentioned above, but I don't think you would go wrong with the ASUS either. I was running an Asus GTX 660 OC DirectCU II before my 1070, and that was a quality piece of gear. I'm not crazy about ASUS GPU Tweak II, though; Afterburner is much better OC software.

What motherboard do you have, or which are you getting? Matching brands means you only need one management utility to control both the motherboard and GPU LEDs etc.; mixing brands will require an extra management utility to be installed.


----------



## Dan-H

Thanks for the comments and feedback.

Leaning towards the MSI, as I like how quiet my MSI 970 Twin Frozr is under heavy load. Price is the same for both. Edit: the Asus is $20 less.

MoBo is ASUS Z170 ROG Maximus VIII Hero on order but hasn't arrived.

RGB control isn't a priority. Well it isn't to me. Case chosen is a Fractal R5 and this will be on the floor under a desk with the side that would have had a window facing a wall. Not much room for it...

Kind of a shame to hide the snazzy hardware inside a black-box mini fridge, but...


----------



## gtbtk

Quote:


> Originally Posted by *Dan-H*
> 
> Thanks for the comments and feedback.
> 
> Leaning towards the MSI as I like how quiet my MSI 970 Twin Frozr is under a heavy load. Price is the same for both.
> 
> MoBo is ASUS Z170 ROG Maximus VIII Hero on order but hasn't arrived.
> 
> RGB control isn't a priority. Well it isn't to me. Case chosen is a Fractal R5 and this will be on the floor under a desk with the side that would have had a window facing a wall. Not much room for it...
> 
> kindof a shame to hide the snazzy hardware inside of a black box mini fridge but....


Nice board. I am toying with the idea of upgrading my Sandy Bridge, but I am not in a hurry; I'm debating Z170/Skylake versus waiting until January and going Z270/Kaby Lake. I have had Asus boards forever but thought I might try something different this time around. If I go Z170, my thinking was to try the MSI XPower Titanium or the ASRock OC Formula board.

My favorite RGB setting is "Off", but the MSI card defaults to "On" at startup. Fortunately, like you, my case window faces away from where I sit, so it is not a biggie for me.


----------



## chrcoluk

Quote:


> Originally Posted by *MEC-777*
> 
> Open MSI afterburner and move the temp and power target sliders all the way up. No more throttling.


Depends on the card; some cards have been hitting TDP limits.

My card, a GameRock Premium, was throttling due to TDP, but now that the new BIOS has added 25 watts to the TDP limit, it no longer TDP-throttles.

On the other hand, the Zotac Extreme cards had a high TDP limit from the off, so they never throttled on the shipped BIOS.


----------



## chrcoluk

Quote:


> Originally Posted by *philhalo66*
> 
> why are these cards throttling at such low temps? is it a bug or a design flaw?


TDP

I've noticed 95% of people don't even bother checking TDP; they just think it's all down to temps and voltage.


----------



## gtbtk

Quote:


> Originally Posted by *chrcoluk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MEC-777*
> 
> Open MSI afterburner and move the temp and power target sliders all the way up. No more throttling.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> depends on the card, some cards have been hitting TDP limits.
> 
> My card was throttling due to TDP a gamerock premium, but now the new bios has added 25 watts to the TDP limit it no longer TDP throttles.
> 
> On the other zotac extreme cards had a high TDP limit from the off so they never throttled on shipped bios.
Click to expand...

The MSI Gaming cards have a max power limit of +126%, which works out to 291W. They don't throttle either. EVGA cards jump all over the place under heavy load.
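For anyone wondering how the power-limit slider relates to actual watts, the arithmetic is just base power × limit percent. A quick sketch; note that the ~231W base below is derived from the figures in this post (126% max = 291W), not an official spec:

```python
# Power-limit slider arithmetic: the slider is a percentage of the
# card's base power limit. The base value below is implied by the
# MSI figures quoted above (126% max = 291 W), not an official spec.

def power_limit_watts(base_w: float, slider_percent: float) -> float:
    """Convert a power-limit slider percentage into absolute watts."""
    return base_w * slider_percent / 100.0

BASE_W = 291 / 1.26  # implied base power limit, roughly 231 W

print(round(power_limit_watts(BASE_W, 100)))  # ~231 W at the stock 100% setting
print(round(power_limit_watts(BASE_W, 126)))  # 291 W with the slider maxed
```

Other cards use different bases and caps, so plug in your own card's numbers from GPU-Z or the BIOS.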


----------



## RJTablante

Lo and behold, Zotac has pulled through! I'm passing on the email that was sent to me to you all so that you can also reap the benefits of the updated Zotac 1070 series bios to address the Micron memory issue.

I flashed it already and honestly it's like I have a new card!

As this is straight from Zotac, I take no responsibility for any damage done, so please follow their instructions, ensure you flash the proper BIOS meant for your card, and get ready for the 1070 you've always wanted









_Hi,

Here are the temporary links to our FTP, for the respective GTX1070 Cards affected by the "Micron RAM " issue.

Please select the correct BIOS for your card based on SKU. The ZIP filenames are self-explaining and in EXE format so this will execute when you double click. This bios only supports windows operating system.

SKU : ZT-P10700E-10S
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_(ZT-P10700E-10S__288-1N435-200Z8-201Z8)_Micron_RAM_(2016_11).zip

SKU : ZT-P10700F-10P
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_(ZT-P10700F-10P__288-1N424-200Z8)_Micron_RAM_(2016_11).zip

SKU : ZT-P10700I-10P
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_(ZT-P10700I-10P__299-1N424-300Z8)_Micron_RAM_(2016_11).zip

SKU : ZT-P10700C-10P
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_AMP_(ZT-P10700C-10P__288-1N435-100Z8-101Z8)_Micron_RAM_(2016_11).zip

SKU :ZT-P10700B-10P
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_AMP_Extreme_(ZT-P10700B-10P__288-1N435-000Z8-001Z8)_Micron_RAM_(2016_11).zip

SKU :ZT-P10700A-10P
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_Founders_(ZT-P10700A-10P__288-1N424-000Z8)_Micron_RAM_(2016_11).zip

SKU : ZT-P10700G-10M
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_Mini_(ZT-P10700G-10M__288-1N445-030Z8)_Micron_RAM_(2016_11).zip

SKU : ZT-P10700K-10M
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_Mini_(ZT-P10700K-10M__288-1N445-130Z8)_Micron_RAM_(2016_11).zip

+++++++++++++++
Important Remark :
+++++++++++++++
- The VBIOS files are made into .EXE files, for 32bit and 64bit Windows respectively. Run the .EXE file in the suitable Windows type.
- Check to make sure the card is really built from "Micron RAM" before starting to do the VBIOS change !!!
- These VBIOS changes are "One-Way" only, no return path to go back.

Yours,

Fred
ZOTAC Technical Support_


----------



## MEC-777

Quote:


> Originally Posted by *RJTablante*
> 
> Lo and behold, Zotac has pulled through! I'm passing on the email that was sent to me to you all so that you can also reap the benefits of the updated Zotac 1070 series bios to address the Micron memory issue.
> 
> I flashed it already and honestly it's like I have a new card!
> 
> As this is straight from Zotac, I take no responsibility for any damage done, so please follow their instructions, ensure you flash the proper BIOS meant for your card, and get ready for the 1070 you've always wanted
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> _Hi,
> 
> Here are the temporary links to our FTP, for the respective GTX1070 Cards affected by the "Micron RAM " issue.
> 
> Please select the correct BIOS for your card based on SKU. The ZIP filenames are self-explaining and in EXE format so this will execute when you double click. This bios only supports windows operating system.
> 
> SKU : ZT-P10700E-10S
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_(ZT-P10700E-10S__288-1N435-200Z8-201Z8)_Micron_RAM_(2016_11).zip
> 
> SKU : ZT-P10700F-10P
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_(ZT-P10700F-10P__288-1N424-200Z8)_Micron_RAM_(2016_11).zip
> 
> SKU : ZT-P10700I-10P
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_(ZT-P10700I-10P__299-1N424-300Z8)_Micron_RAM_(2016_11).zip
> 
> SKU : ZT-P10700C-10P
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_AMP_(ZT-P10700C-10P__288-1N435-100Z8-101Z8)_Micron_RAM_(2016_11).zip
> 
> SKU :ZT-P10700B-10P
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_AMP_Extreme_(ZT-P10700B-10P__288-1N435-000Z8-001Z8)_Micron_RAM_(2016_11).zip
> 
> SKU :ZT-P10700A-10P
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_Founders_(ZT-P10700A-10P__288-1N424-000Z8)_Micron_RAM_(2016_11).zip
> 
> SKU : ZT-P10700G-10M
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_Mini_(ZT-P10700G-10M__288-1N445-030Z8)_Micron_RAM_(2016_11).zip
> 
> SKU : ZT-P10700K-10M
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_Mini_(ZT-P10700K-10M__288-1N445-130Z8)_Micron_RAM_(2016_11).zip
> 
> +++++++++++++++
> Important Remark :
> +++++++++++++++
> - The VBIOS files are made into .EXE files, for 32bit and 64bit Windows respectively. Run the .EXE file in the suitable Windows type.
> - Check to make sure the card is really built from "Micron RAM" before starting to do the VBIOS change !!!
> - These VBIOS changes are "One-Way" only, no return path to go back.
> 
> Yours,
> 
> Fred
> ZOTAC Technical Support_


Awesome! Thanks! My card should be arriving today or tomorrow, so it's great to know the updated BIOS is here waiting. If it is a Micron card, I'll make sure I flash it before doing anything else.









Now I just need to learn how to flash the BIOS on Nvidia cards...


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> inno3d or colorful is not available everywhere in the world though. I dont think they are being sold in the US. I have not seen much coverage on the ichill pascal cards to date. The Maxwell cards got quite good reviews though.
> 
> The manufacturers, including MSI, have been alternating batches in both micron and samsung but Micron memory no longer has any issues after the .50 bios update. The quicksilver cards are just Gaming X cards with a Gaming Z backplate painted silver and a Silver/black coloured fan shroud. Those reviewed to date have had Micron memory with the Gaming X .50 bios ex factory, but I expect that well will see those cards with samsung memory sometime in the future


It is sad that these two brands don't provide international RMA like EVGA does; taking the risk on them may not be a good choice when there are lots of alternatives available on the US market.
Quote:


> Originally Posted by *RJTablante*
> 
> Lo and behold, Zotac has pulled through! I'm passing on the email that was sent to me to you all so that you can also reap the benefits of the updated Zotac 1070 series bios to address the Micron memory issue.
> 
> I flashed it already and honestly it's like I have a new card!
> 
> As this is straight from Zotac, I take no responsibility for any damage done, so please follow their instructions, ensure you flash the proper BIOS meant for your card, and get ready for the 1070 you've always wanted
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _Hi,
> 
> Here are the temporary links to our FTP, for the respective GTX1070 Cards affected by the "Micron RAM " issue.
> 
> Please select the correct BIOS for your card based on SKU. The ZIP filenames are self-explaining and in EXE format so this will execute when you double click. This bios only supports windows operating system.
> 
> SKU : ZT-P10700E-10S
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_(ZT-P10700E-10S__288-1N435-200Z8-201Z8)_Micron_RAM_(2016_11).zip
> 
> SKU : ZT-P10700F-10P
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_(ZT-P10700F-10P__288-1N424-200Z8)_Micron_RAM_(2016_11).zip
> 
> SKU : ZT-P10700I-10P
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_(ZT-P10700I-10P__299-1N424-300Z8)_Micron_RAM_(2016_11).zip
> 
> SKU : ZT-P10700C-10P
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_AMP_(ZT-P10700C-10P__288-1N435-100Z8-101Z8)_Micron_RAM_(2016_11).zip
> 
> SKU :ZT-P10700B-10P
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_AMP_Extreme_(ZT-P10700B-10P__288-1N435-000Z8-001Z8)_Micron_RAM_(2016_11).zip
> 
> SKU :ZT-P10700A-10P
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_Founders_(ZT-P10700A-10P__288-1N424-000Z8)_Micron_RAM_(2016_11).zip
> 
> SKU : ZT-P10700G-10M
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_Mini_(ZT-P10700G-10M__288-1N445-030Z8)_Micron_RAM_(2016_11).zip
> 
> SKU : ZT-P10700K-10M
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_Mini_(ZT-P10700K-10M__288-1N445-130Z8)_Micron_RAM_(2016_11).zip
> 
> +++++++++++++++
> Important Remark :
> +++++++++++++++
> - The VBIOS files are made into .EXE files, for 32bit and 64bit Windows respectively. Run the .EXE file in the suitable Windows type.
> - Check to make sure the card is really built from "Micron RAM" before starting to do the VBIOS change !!!
> - These VBIOS changes are "One-Way" only, no return path to go back.
> 
> Yours,
> 
> Fred
> ZOTAC Technical Support_


Why didn't I get this? Zotac support said I would be notified by email too. Dammit.


----------



## KSIMP88

I bought the base Zotac 1070, and it uses Samsung RAM. And man, it runs cool and quiet! It never reached 60°C in GTA V yesterday. Gonna force some high settings on it to see if I can get the FPS to drop and find out where my temps land under stress, but it's in a separate enclosure and has an excellent stock cooler.


----------



## asdkj1740

ZT-P10700I-10P
ZOTAC GeForce® GTX 1070
https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1070-NE
SKU : ZT-P10700I-10P
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_(ZT-P10700I-10P__299-1N424-300Z8)_Micron_RAM_(2016_11).zip

ZT-P10700G-10M
ZOTAC GeForce® GTX 1070 Mini
https://www.zotac.com/us/product/graphics_card/zotac-geforce%C2%AE-gtx-1070-mini-0
SKU : ZT-P10700G-10M
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_Mini_(ZT-P10700G-10M__288-1N445-030Z8)_Micron_RAM_(2016_11).zip

ZT-P10700B-10P
ZOTAC GeForce® GTX 1070 AMP Extreme
https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1070-amp-extreme
SKU :ZT-P10700B-10P
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_AMP_Extreme_(ZT-P10700B-10P__288-1N435-000Z8-001Z8)_Micron_RAM_(2016_11).zip

ZT-P10700C-10P
ZOTAC GeForce® GTX 1070 AMP Edition
https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1070-amp-edition
SKU : ZT-P10700C-10P
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_AMP_(ZT-P10700C-10P__288-1N435-100Z8-101Z8)_Micron_RAM_(2016_11).zip

ZT-P10700A-10P
ZOTAC GeForce® GTX 1070 Founders Edition
https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1070-founders-edition
SKU :ZT-P10700A-10P
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_Founders_(ZT-P10700A-10P__288-1N424-000Z8)_Micron_RAM_(2016_11).zip


----------



## iARDAs

Does this 1070 overclock like my old 980 Ti did?

Turn on MSI Afterburner.
Set the power limit to maximum.
Play with the core clock to find a stable OC.


----------



## MEC-777

Quote:


> Originally Posted by *iARDAs*
> 
> Does this 1070 overclock like my old 980ti?
> 
> Turn on MSI Afterburner.
> Set Power Limit to maximum
> And play with core clock to find a stable OC.


Follow the same procedure for OCing as with the 900 series, but keep in mind that GPU Boost 3.0 will interfere a lot more. From what most people have been saying, it won't hold a steady boost clock but will "throttle" down slightly over time, then level off based on a number of different factors.

Also, with GPU Boost 3.0, these cards OC themselves quite a bit right out of the box without you touching anything.
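The trial-and-error procedure being discussed (raise the core offset in small steps, stress test, back off at the first failure) can be sketched as a simple loop. `run_stable_core_offset`'s stress-test call here is a hypothetical stand-in for manually running Heaven or Firestrike and watching for artifacts, and the +190MHz failure point is just borrowed from a result reported earlier in the thread:

```python
# Sketch of the manual core-OC loop: raise the core offset in small
# steps, stress test, and keep the last offset that passed.
# run_stress_test() is a hypothetical stand-in for launching
# Heaven/Firestrike and watching for artifacts or crashes by hand.

def run_stress_test(core_offset_mhz: int) -> bool:
    # Hypothetical: pretend this card starts artifacting beyond +190 MHz,
    # like the Heaven result quoted earlier in the thread.
    return core_offset_mhz <= 190

def find_stable_core_offset(step_mhz: int = 25, limit_mhz: int = 300) -> int:
    stable = 0
    for offset in range(step_mhz, limit_mhz + 1, step_mhz):
        if not run_stress_test(offset):
            break  # first failure: stop and keep the previous offset
        stable = offset
    return stable

print(find_stable_core_offset())  # 175 with the stand-in above
```

In practice people use smaller steps near the limit and longer stress runs to confirm the final value, since GPU Boost 3.0 can mask marginal instability for a while.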


----------



## iARDAs

Quote:


> Originally Posted by *MEC-777*
> 
> Follow same procedure for OCing as with the 900 series, but just keep in mind; GPUboost 3.0 will interfere a lot more. From what most people have been saying, it won't hold a steady boost clock but will "throttle" down slightly over time, then level off based on a number of different factors.


I see, thanks. So it is best to have an aggressive fan profile as well?


----------



## MEC-777

Quote:


> Originally Posted by *iARDAs*
> 
> I see, thanks. So it is best to have an aggressive fan profile as well?


Someone else may want to chime in here to correct me if I'm wrong, but I believe it does take fan speed into account, so in that case, yes. To what extent, though, I'm not sure. I'm expecting my 1070 to arrive today, so I haven't yet experimented with all this.


----------



## iARDAs

Quote:


> Originally Posted by *MEC-777*
> 
> Someone else may want to chime in here to correct if I'm wrong, but I believe it does take into account the fan speed. So in that case, yes. To what extent though, I'm not sure. I'm expecting my 1070 to arrive today, so I haven't yet experimented with all this.


Mine is here, but I did not unbox it yet.

I'll be gaming at 2560x1080 at 60Hz, maybe 75Hz, so maybe there's no need for an OC.


----------



## anthonyg45157

Quote:


> Originally Posted by *RJTablante*
> 
> Lo and behold, Zotac has pulled through! I'm passing on the email that was sent to me to you all so that you can also reap the benefits of the updated Zotac 1070 series bios to address the Micron memory issue.
> 
> I flashed it already and honestly it's like I have a new card!
> 
> As this is straight from Zotac, I take no responsibility for any damage done, so please follow their instructions, ensure you flash the proper BIOS meant for your card, and get ready for the 1070 you've always wanted
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _Hi,
> 
> Yours,
> 
> Fred
> ZOTAC Technical Support_


How does it make your card feel new again? Can you explain in more detail? Higher over clock? I have micron as well.

Have you noticed voltage being any different or the power limit?


----------



## ogow89

Quote:


> Originally Posted by *iARDAs*
> 
> Does this 1070 overclock like my old 980ti?
> 
> Turn on MSI Afterburner.
> Set Power Limit to maximum
> And play with core clock to find a stable OC.


It would really help to know which card you have and what the VRAM type is (Micron/Samsung).

With Samsung VRAM you will be able to add 500MHz to the memory, for an effective 9000MHz.

Micron memory could maybe do 8800MHz effective.

The core, however, is a bit tricky. I would suggest adding perhaps around 50MHz to the core before touching the power limit; increasing the power intake increases the temps, causing throttling earlier on. Make a fan profile to keep the card cool, but I would also take noise into account.

I have the gainward golden sample, and here is my oc;

no powerlimit/overvoltage changes
57+ mhz core
500+ mhz memory (samsung)
50% fan speed when card hits 60°C

2050MHz average core clock in The Witcher 3 at 1440p with GameWorks on; that's the most demanding game on the card.

With other games i get around 2063mhz.

I can clock my card to boost to 2189 mhz on the core and 9600 mhz on the memory, but the 1-2 fps increase didn't seem worth it, considering how loud the fans have to get to keep the card adequately cool.
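A side note on the numbers above, since the thread mixes three clock conventions for GDDR5: GPU-Z reports the command clock (~2002MHz stock on a reference 1070), Afterburner displays and offsets the double-data-rate figure (4004MHz), and the marketed "effective" number is 4x the command clock. A small bookkeeping sketch (which figure a given Afterburner version applies the offset to is an assumption here, so treat it as illustrative):

```python
# GDDR5 clock bookkeeping for the GTX 1070. Assumption: the Afterburner
# offset is applied to the double-data-rate (DDR) figure it displays,
# which matches "+500 -> effective 9000MHz" as quoted in this thread.

STOCK_CMD_CLOCK = 2002  # MHz, the GPU-Z "memory" reading on a stock 1070

def clocks_from_afterburner_offset(offset_mhz: int):
    """Return (command, DDR, effective) clocks for a given offset."""
    ddr = STOCK_CMD_CLOCK * 2 + offset_mhz  # what Afterburner displays
    return ddr // 2, ddr, ddr * 2

print(clocks_from_afterburner_offset(0))    # (2002, 4004, 8008) at stock
print(clocks_from_afterburner_offset(500))  # (2252, 4504, 9008), ~"9000 effective"
```

This is why a "+500" offset and "effective 9000MHz" describe the same setting, and why GPU-Z and Afterburner never seem to agree on the memory clock.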


----------



## RJTablante

Quote:


> Originally Posted by *anthonyg45157*
> 
> How does it make your card feel new again? Can you explain in more detail? Higher over clock? I have micron as well.
> 
> Have you noticed voltage being any different or the power limit?


The primary difference I found is simply the memory overclocking potential. Before the update I was limited to +360 on the memory to stay stable in the standard benchmark suites and my sample games (3DMark Firestrike, Time Spy, Unigine Heaven, etc., plus Rise of the Tomb Raider and Gears of War 4). Now I can seemingly go +600 on the memory and still have the benchmarks complete, with a decent score bump in all benchmarks and in gaming performance.

Note that this bios update did not impact the core overclock potential, voltage or power limit settings I had previously and only allowed my memory overclock to go much much higher without issue.

Something I noticed is that increasing the memory past a certain point yields worse performance, so right now I'm sitting at +550 and seeing better gains than when I set it to +600, and I'm happy to keep it lower for the better performance. Granted, this is with the little time I've spent overclocking the card since the BIOS update, but so far I'm a very happy camper.









My memory overclock is at stock voltage with +120 power limit, so I guess I could feasibly go higher with more voltage, but I found my core overclock loses stability with higher voltage (weird, but it's what I'm seeing: benchmark runs would crash out at the higher voltage). I do wonder whether more voltage would help with a higher memory overclock, so I'll try that later. That's part of the fun of overclocking, really: seeing what potential your card has. Even though I'm pretty certain I don't have a golden sample or anything, I've already seen enough gains from a simple BIOS flash to be happy with my purchase (which initially I wasn't, as it didn't have Samsung RAM).
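The "+550 beats +600" observation above is the classic sign of GDDR5 error retry: past some offset the memory controller starts re-sending failed transfers, so the raw clock keeps rising while real performance falls. That means the right way to pick a memory offset is to score each candidate rather than stop at the highest one that survives. A sketch of that approach, where `bench_score` is a hypothetical stand-in for reading back a Firestrike/Heaven score at each setting (the toy score curve just mimics the +550 vs +600 result described above):

```python
# Sketch: pick the memory offset by benchmark score, not by the highest
# offset that survives, since GDDR5 error retry can make +600 slower
# than +550. bench_score() is a hypothetical stand-in for running a
# benchmark and reading back its score at each candidate offset.

def bench_score(mem_offset_mhz: int) -> float:
    # Hypothetical shape: score rises with clock, then sags past +550
    # as the memory controller starts retrying transfers.
    penalty = max(0, mem_offset_mhz - 550) * 0.05
    return 100.0 + mem_offset_mhz * 0.01 - penalty

def best_mem_offset(candidates=range(0, 701, 50)) -> int:
    """Return the candidate offset with the highest benchmark score."""
    return max(candidates, key=bench_score)

print(best_mem_offset())  # 550 with the toy score model above
```

With a real benchmark in place of the stub, this is just "sweep the offsets, keep the one with the best score", which is exactly what reviewers mean when they report a "memory hole".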


----------



## hammelgammler

Hey guys,

Can anyone tell me what the checkerboard pattern is supposed to look like with Micron VRAM? I have two Gainward 1070 Phoenix cards with Micron, and no matter what memory clock I set, I sometimes get weird video artifacts when watching two YouTube videos at the same time on different monitors (I have three).

When gaming, or when only watching one video at a time, I don't get the artifacts (I've seen them on YouTube but not on Twitch yet).
_Does anyone know if that's because of the Micron VRAM? I already have the latest BIOS, which should solve the issues._

Besides that, I can run my memory at 2300-2400MHz before I get artifacts in games, although my performance in Tomb Raider doesn't increase when pushing past 2300MHz.

Thanks!


----------



## asdkj1740

Quote:


> Originally Posted by *hammelgammler*
> 
> Hey guys,
> 
> can anyone tell me what those Checkerboard Pattern should look like with Micron VRAM? Because I have two Gainward 1070 Phoenix with Micron, and it doesn't matter how much Mem Clock I put on the card, I have sometimes weird VIdeo Artifacts when watching two YouTube Videos at the same time on different monitors (I have 3).
> 
> When gaming or when only watching one video at a time (YouTube, didn't saw those Artifacts with Twitch yet), then I don't get those Artifacts.
> _Anyone know if that's because of the Micron VRAM? I already have the latest BIOS which should solve the issues._
> 
> Besides that, I can run my memory with 2300-2400MHz before I get Artifacts in Games, although my performance doesn't increase in Tomb Raider when pushing it besides 2300MHz.
> 
> Thanks!


Try the latest Nvidia driver if you are already using the .50 Micron-fix BIOS.


----------






## blued

Having Micron VRAM does not mean you will get those issues; some cards have them, others do not. The YouTube video artifacts might not be related to the Micron VRAM at all. If the card works at 2300MHz without artifacts but not above 2400MHz, that could simply be its limit.


----------



## hammelgammler

Those YouTube artefacts are there even at stock clocks, and I already have the latest Nvidia driver (376.09 WHQL) installed. :/

I get them with both cards, even at stock and with the fixed BIOS. It's just weird; for now it only happens in this one scenario, with no issues while gaming whatsoever.


----------



## iARDAs

Quote:


> Originally Posted by *hammelgammler*
> 
> Those YouTube artefacts are there even at stock clocks, and I already have the latest Nvidia driver (376.09 WHQL) installed. :/
> 
> I get them with both cards, even at stock and with the fixed BIOS. It's just weird; for now it only happens in this one scenario, with no issues while gaming whatsoever.


If you are using Chrome, can you please open this flag:

chrome://flags/#disable-accelerated-video-decode

and disable it?


----------



## anthonyg45157

Quote:


> Originally Posted by *hammelgammler*
> 
> Those YouTube artefacts are there even at stock clocks, and I already have the latest Nvidia driver (376.09 WHQL) installed. :/
> 
> I get them with both cards, even at stock and with the fixed BIOS. It's just weird; for now it only happens in this one scenario, with no issues while gaming whatsoever.


I had a similar issue. It would only freeze or lock up on the desktop (moving windows) or when using Chrome. It seems it was related to the memory not getting enough voltage, or the memory controller not upping the voltage when needed. I solved this by adding explorer.exe, the Windows shell, dwm.exe, and Chrome to the 3D settings in the NVIDIA Control Panel and setting them to prefer maximum performance. That keeps the memory voltage up, and I haven't had this issue since.
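If you want to confirm the workaround is actually holding the memory clock up, one option is to poll `nvidia-smi` (shipped with the driver) and parse the reported memory clock. A rough sketch, assuming `nvidia-smi` is on your PATH; the function names are my own:

```python
import subprocess
import time

def parse_mem_clock(line: str) -> int:
    """Parse a line like '3802 MHz' from nvidia-smi's CSV output."""
    return int(line.strip().split()[0])

def current_mem_clock(gpu: int = 0) -> int:
    """Ask the driver for the current memory clock of one GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=clocks.mem",
         "--format=csv,noheader", "-i", str(gpu)],
        text=True,
    )
    return parse_mem_clock(out)

def watch(seconds: int = 10) -> None:
    """Print the memory clock once per second; with the workaround in
    place it should hold its boosted value instead of dropping to idle."""
    for _ in range(seconds):
        print(current_mem_clock(), "MHz")
        time.sleep(1)
```

Call `watch()` while moving windows around or scrubbing a YouTube video; if the clock keeps sagging to the idle state, the workaround isn't sticking.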


----------



## KSIMP88

Overclocking my 1070 now. What's a good OC? I'm at 2000 core 2250 mem and rising. Or do we say 4500 mem?


----------



## Dude970

Quote:


> Originally Posted by *KSIMP88*
> 
> Overclocking my 1070 now. What's a good OC? I'm at 2000 core 2250 mem and rising. Or do we say 4500 mem?


Try 9K


----------



## KSIMP88

Really? Why


----------



## ogow89

Quote:


> Originally Posted by *KSIMP88*
> 
> Really? Why


He means we say 9000MHz effective, referring to the VRAM clock.
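The two numbers are the same thing in different notation: GDDR5 transfers data four times per memory clock cycle, so the "effective" figure quoted in reviews is four times the actual clock that tools like GPU-Z report. A quick sanity check:

```python
def effective_gddr5_clock(base_mhz: float) -> float:
    """GDDR5 is quad-pumped: 4 data transfers per memory clock cycle."""
    return base_mhz * 4

# 2250 MHz actual memory clock -> the "9000 MHz effective" people quote
assert effective_gddr5_clock(2250) == 9000
# Stock GTX 1070: 2002 MHz actual -> 8008 MHz effective ("8 Gbps")
assert effective_gddr5_clock(2002) == 8008
```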


----------



## KSIMP88

Weird

EDIT:

Timespy 5684
http://www.3dmark.com/3dm/16424723?


----------



## RJTablante

Quote:


> Originally Posted by *KSIMP88*
> 
> Weird
> 
> EDIT:
> 
> Timespy 5684
> http://www.3dmark.com/3dm/16424723?


I'm not sure why your core clock is showing up so low.

Here is mine for reference:

http://www.3dmark.com/3dm/16425477?

Oh wait, are you not running a desktop setup? Are you running an external enclosure with the 1070? I'm still not sure why the core is so low.


----------



## KSIMP88

No idea, really. It was set higher, and was running higher in GPU-Z.


----------



## iARDAs

So my Asus Dual 1070 is here. I did not do anything, but I can see it reaching the 1950s in The Division....

Interesting


----------



## spddmn24

Anyone having power limit issues on the MSI cards? I just got a Quick Silver 1070 and I'm banging against the power limit at 100-105% despite it being set to 126% in Afterburner.


----------



## philhalo66

Quote:


> Originally Posted by *spddmn24*
> 
> Anyone having power limit issues on the MSI cards? I just got a quicksilver 1070 and I'm banging against the power limit at 100-105% despite being set at 126 in afterburner.


I'm having the same issue on my Gigabyte card. I have the power limit set to 110%, which is the highest I can set it, but it has never gone above 99%, even in Rise of the Tomb Raider. It might be throttling from heat on the VRMs or something.


----------



## CriZ93

Is the new BIOS for the Zotac AMP Extreme safe? Is it stable?
My GPU has Micron VRAM and I'm having overclocking problems.


----------



## RJTablante

Quote:


> Originally Posted by *CriZ93*
> 
> Is the new BIOS for the Zotac AMP Extreme safe? Is it stable?
> My GPU has Micron VRAM and I'm having overclocking problems.


If anything it adds more stability during an overclock, as it "cures" the issue with Micron-based RAM. Zotac's BIOS is really not that different from the other manufacturers', who already pushed out their updates weeks ago, so I'd say it's safe (I've done it already and have experienced nothing bad from it).


----------



## CriZ93

Quote:


> Originally Posted by *RJTablante*
> 
> If anything it adds more stability during an overclock as it "cures" the issue w/ Micron based ram. Zotac's bios is really not that different than all of the other manufacturers who've already pushed out their update weeks ago, so I'd say it's safe (I've done it already and have experienced nothing bad from it).


I'll wait a few more days; if no one else in this forum has any problems, I'll try it.
Thanks!


----------



## spddmn24

I didn't gain any additional OC on my Micron RAM with the MSI update. +500 on both.


----------



## Jackharm

I didn't gain much of an additional memory OC with the Zotac update either.

But that wasn't my main concern, since even before the update I was hitting 9200 on the memory.

What I do hope the BIOS fixes is that from a cold boot I would sometimes get a white checkerboard or a blue screen with a video scheduling error, which would go away once I restarted.

So I guess I'll see if that issue pops up again over the next few days/weeks.


----------



## KSIMP88

Probably not much gain in OC, but maybe more stability.


----------



## RJTablante

I suppose I should add that my previous, lower memory overclock (pre-BIOS-flash) was achieved without any workarounds in place (no adjustments made to the NVIDIA Control Panel), which likely explains my stable +360 overclock, as that resulted in no checkerboarding or freezing at all.

Post-flash, I can overclock to +550 and haven't seen any checkerboarding yet, which raised my stable overclock significantly (again with no NVIDIA Control Panel workarounds in place).

So if you already applied the workarounds previously listed in this thread and elsewhere around the net, you've likely already found something very close to your max overclock; the difference is that post-flash you no longer need any workarounds to achieve it.


----------



## anthonyg45157

Does anyone know what Nvidia "fixed" in all these bios releases?


----------



## Dan-H

Quote:


> Originally Posted by *jlhawn*
> 
> MSI


Quote:


> Originally Posted by *GeneO*
> 
> MSI


Quote:


> Originally Posted by *gtbtk*
> 
> I have the MSI and love it but to be honest they are both about the same.


Went with the MSI 1070 quicksilver. Should be here early next week. I may have to try it out in my system first.

Thanks for the feedback.


----------



## kevindd992002

So is the Zotac BIOS update really just for Micron-based cards?


----------



## Luckael

Quote:


> Originally Posted by *Luckael*
> 
> yeah i already did it. but i revert back to F2. i will post later, im in the office right now


hey, this is the screenshot of my bios


----------



## Gurkburk

Quote:


> Originally Posted by *anthonyg45157*
> 
> Does anyone know what Nvidia "fixed" in all these bios releases?


Nvidia isn't releasing BIOS updates...


----------



## philhalo66

What exactly does the Gigabyte F2/F9 BIOS even do?


----------



## Avendor

Quote:


> Originally Posted by *philhalo66*
> 
> what exactly does the gigabyte F2-F9 bios even do?


Check here: http://www.guru3d.com/news-story/manufacturers-roll-out-firmware-updates-for-geforce-gtx-1070-due-to-memory-issue,15.html
Both were updated to:
F11 - Release for Micron Memory
F2 - Release for SAMSUNG Memory
http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios


----------



## AliasOfMyself

Quote:


> Originally Posted by *Luckael*
> 
> hey, this is the screenshot of my bios


Hmmm, I don't know if this means anything, but my subvendor is undefined and my device ID doesn't quite match yours. Also worth mentioning: this is my first Nvidia card in almost 15 years, lol. I've attached a screenshot of my GPU-Z.


----------



## boostnek9

For those who were told the Zotac 1070 Bios would come out within a week or so, I received this email Nov 29th after asking about the update...

Quote:


> Dear Sir,
> 
> The Zotac graphic card does fulfill the card specification.
> Currently, we have no update VBIOS for the GTX 1070.
> 
> B.rgds
> ZOTAC Support Session
> [email protected]


----------



## Santosh

You can download the BIOS update here:
http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/6300#post_25685869


----------



## gtbtk

Quote:


> Originally Posted by *iARDAs*
> 
> Does this 1070 overclock like my old 980ti?
> 
> Turn on MSI Afterburner.
> Set Power Limit to maximum
> And play with core clock to find a stable OC.


You can do the OC that way.

Pascal cards also give you the option of using a voltage curve, as these cards will clock higher at some voltage levels than at others. Using the slider to manage core clocks hits the lowest voltage/clock point first and leaves performance on the table at the other voltage levels.

"Throttling" overstates the situation and makes it sound more dramatic than it is. Pascal does manage clock speed vs. voltage vs. temperature, and it will move the number Afterburner is reporting, but what is actually happening is that the card is using many different frequencies all along the curve. My Firestrike performance is better with a reported 2088MHz OC than it is at 2100MHz.

Using a more aggressive fan curve will keep temps lower and allow the card to clock higher, leading to better performance.
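A custom fan curve in Afterburner is just piecewise-linear interpolation between (temperature, fan %) points. If you want to reason about what a "more aggressive" curve does before a card hits its boost-bin temperature thresholds, here is a sketch; the curve points are made-up examples, not recommendations:

```python
def fan_speed(temp_c: float, curve: list[tuple[float, float]]) -> float:
    """Piecewise-linear interpolation over (temp, fan %) points,
    clamped to the first/last point outside the curve's range."""
    points = sorted(curve)
    if temp_c <= points[0][0]:
        return points[0][1]
    if temp_c >= points[-1][0]:
        return points[-1][1]
    for (t0, f0), (t1, f1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

# An "aggressive" example curve: ramp hard in the 50-65C range
curve = [(30, 30), (50, 50), (65, 80), (75, 100)]
print(fan_speed(57.5, curve))  # halfway between (50C, 50%) and (65C, 80%) -> 65.0
```

Steepening the middle segment is what keeps the core below the temperatures where GPU Boost 3.0 starts dropping bins.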


----------



## gtbtk

Quote:


> Originally Posted by *boostnek9*
> 
> For those who were told the Zotac 1070 Bios would come out within a week or so, I received this email Nov 29th after asking about the update...
> 
> Quote:
> 
> 
> 
> Dear Sir,
> 
> The Zotac graphic card does fulfill the card specification.
> Currently, we have no update VBIOS for the GTX 1070.
> 
> B.rgds
> ZOTAC Support Session
> [email protected]

Just goes to show the quality of the Zotac tech support team. Just like the teams everywhere else.


----------



## gtbtk

Quote:


> Originally Posted by *hammelgammler*
> 
> Hey guys,
> 
> Can anyone tell me what the checkerboard pattern is supposed to look like with Micron VRAM? I have two Gainward 1070 Phoenix cards with Micron memory, and no matter what memory clock I set, I sometimes get weird video artifacts when watching two YouTube videos at the same time on different monitors (I have three).
> 
> When gaming, or when only watching one video at a time (I haven't seen the artifacts with Twitch yet), I don't get them.
> Does anyone know if that's because of the Micron VRAM? I already have the latest BIOS, which should solve the issue.
> 
> Besides that, I can run my memory at 2300-2400MHz before I get artifacts in games, although my performance doesn't increase in Tomb Raider when pushing past 2300MHz.
> 
> Thanks!


The checkerboard pattern is what it sounds like: small white checkerboards covering the entire screen.

Gainward has already released a BIOS update that solves the problem. You can check whether you need it in GPU-Z: the bug-fix BIOS version is 86.04.50.00.xx. If you are running an 86.04.26.00.xx version, then you need to do the update.
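If you'd rather not eyeball the version string from GPU-Z, the check reduces to comparing the first three dotted fields numerically (the trailing field is per-vendor, so it's ignored here; the suffixes in the examples are illustrative, not real releases):

```python
def has_micron_fix(bios_version: str) -> bool:
    """True if a Pascal BIOS version string is at least 86.04.50.00.xx,
    the level at which the Micron memory fix shipped."""
    fields = bios_version.split(".")
    major, minor, build = (int(f) for f in fields[:3])
    return (major, minor, build) >= (86, 4, 50)

assert has_micron_fix("86.04.50.00.70")        # fixed BIOS
assert not has_micron_fix("86.04.26.00.3C")    # pre-fix BIOS, needs update
```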


----------



## gtbtk

Quote:


> Originally Posted by *hammelgammler*
> 
> Those YouTube artefacts are there even at stock clocks, and I already have the latest Nvidia driver (376.09 WHQL) installed. :/
> 
> I get them with both cards, even at stock and with the fixed BIOS. It's just weird; for now it only happens in this one scenario, with no issues while gaming whatsoever.


Some monitors have special video processing that can be turned on or off in the settings. Have a look at the monitor's menu and try toggling that feature to see if it helps.


----------



## gtbtk

Quote:


> Originally Posted by *iARDAs*
> 
> So my Asus Dual 1070 is here. I did not do anything but I can see it reaching to 1950s in Division....
> 
> Interesting


That is fairly typical for a 1070.


----------



## gtbtk

Quote:


> Originally Posted by *CriZ93*
> 
> Is the new BIOS for the Zotac AMP Extreme safe? Is it stable?
> My GPU has Micron VRAM and I'm having overclocking problems.


The BIOS that affects all the core functions of the card actually comes from Nvidia and is standard across all 1070s. The partners only change their branding, power-limit settings, and the default clock speeds for each model of card in their range.


----------



## gtbtk

Quote:


> Originally Posted by *anthonyg45157*
> 
> Does anyone know what Nvidia "fixed" in all these bios releases?


Yes. When ramping up from a low-voltage sleep state to the OC frequency, the Micron memory would try to operate before the memory VRMs could raise the voltage to the chips. The chips ended up trying to run at high frequency without the voltage required to support it, and the memory would basically give up, throwing white checkerboard patterns over the screen before BSODing the PC.


----------



## gtbtk

Quote:


> Originally Posted by *kevindd992002*
> 
> So is the Zotac BIOS update really just for Micron-based cards?


It should run fine on both, but the Micron cards will gain the non-crash behavior that the Samsung cards had from the beginning.


----------



## gtbtk

Quote:


> Originally Posted by *Gurkburk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *anthonyg45157*
> 
> Does anyone know what Nvidia "fixed" in all these bios releases?
> 
> 
> 
> Nvidia isnt releasing Bios updates..

Yes they are, but not to the public.

Nvidia sends the core BIOS to the partners, who brand it and finalize the clock speeds, power limits, etc. for each card in their range before distributing the update to their customers.


----------



## gtbtk

Quote:


> Originally Posted by *AliasOfMyself*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Luckael*
> 
> hey, this is the screenshot of my bios
> 
> 
> 
> 
> 
> Hmmm, i don't know if this means anything, but my subvendor is undefined and my device id doesn't quite match yours... also it's worth mentioning that this is my first Nvidia card in almost 15 years lol, i've attached a screenshot of my gpu-z

That is a bug with the AMD chipset not passing the subvendor ID along to the operating system.


----------



## AliasOfMyself

Quote:


> Originally Posted by *gtbtk*
> 
> That is a bug with the AMD chipset not passing the sub vendor ID along to the operating system


Aaah, I see. Is that only with Nvidia cards, then? My last few AMD cards' subvendor IDs showed up fine, lol. I hate how old the 990FX chipset and Vishera processors are; if Zen turns out to be a letdown, I'll end up going Intel. Thanks for the info.


----------



## kevindd992002

Quote:


> Originally Posted by *gtbtk*
> 
> It should run fine on both but the Micron cards will get the non crash benefits that the Samsung cards had from the beginning


But is there even any benefit from flashing it on Samsung-based cards?


----------



## Vitto

Hello everyone! I've been following this thread for the last few days and finally decided to join the site.







I'm running a 6600K @ 4.7GHz with Asus Strix OC 1070s in SLI. Some benchmarks to follow shortly! I'm coming from a GTX 465, so it's quite an upgrade.


----------



## Dude970

Quote:


> Originally Posted by *Vitto*
> 
> Hello everyone! I've been following this thread for last few days and finally decided to join the site
> 
> 
> 
> 
> 
> 
> 
> I'm running 6600k @ 4.7 Ghz with an Asus Strix OC 1070s in SLI. Some benchmarks to follow shortly! I'm coming from a GTX 465 so it's quite an upgrade










Welcome to the club, great upgrade for you


----------



## Vitto

Thank you! I built it a few months ago and just last week added the second 1070 for SLI. The original came with Samsung memory, this one with Micron (hopefully that's OK and I don't have to go looking for a different one). I updated the vBIOS on both to the newest version, which is supposed to improve cards with Micron memory. Here are some screenshots.


----------



## rfarmer

Quote:


> Originally Posted by *Vitto*
> 
> Thank you! I built it a few months ago and just last week added the 2nd 1070 for SLI, now, the original came with samsung memory, this one with micron (hope that's ok and i don't have to go look for a different one). i updated the vbios on them to the newest one which is suppose to improve cards with micron memory. Here are some screenshots.
> 
> 


Very nice. One 1070 would be a huge upgrade from a 465, 2 must be insane.


----------



## loopy750

Quote:


> Originally Posted by *kevindd992002*
> 
> But is there even any benefit from flashing it on Samsung-based cards?


Unlikely. I know flashing a new BIOS is tempting, but they've stated that whatever adjustments were made are specific to Micron memory. If your Samsung memory isn't crashing when transitioning from low to high voltage, I doubt you'd gain anything from this BIOS.

Quote:


> Originally Posted by *gtbtk*
> 
> just goes to show the quality of the Zotac tech support team. just like the teams everywhere else


Indeed. I asked two separate support people a couple of weeks ago about when to expect the BIOS update, and on both occasions I was told that my "card was within specifications and if there is an issue to RMA". They knew nothing about a new BIOS.

Anyway, I downloaded the link a few pages back for my GTX 1070 AMP Extreme and it's working great.
OC speeds are 2114 MHz / 9065 MHz. A custom fan curve sits around 50% when gaming and temps never go over 53°C.

Many thanks to those who shared the BIOS links. This is a beast of a card and now I'm officially happy with it.


----------



## RJTablante

Quote:


> Originally Posted by *loopy750*
> 
> Unlikely. I know flashing a new BIOS is tempting but they've stated that whatever adjustments were made are specifically for Micron memory. If your Samsung memory isn't crashing when changing from low to high voltage then I doubt you'd be gaining anything from this BIOS.
> Indeed. I asked two separate support people a couple of weeks ago about when to expect the BIOS update and on both occasions told that my "card was within specifications and if there is an issue to RMA". They knew nothing about a new BIOS.
> 
> Anyway I downloaded the link a few pages back for my GTX 1070 AMP Extreme and it's working excellent.
> OC speeds are 2114 MHz / 9065 MHz. Custom fan speed sits around 50% when gaming and temps never go over 53°C.
> 
> Many thanks to those who shared the BIOS links. This is a beast of a card and now I'm officially happy with it.


Glad I could help







It's why I shared the email that was sent to me containing the BIOS links. It appears the rep I established a connection with at Zotac hooked me up ahead of the "official" BIOS release, so I had to share the love.


----------



## Vitto

It's pretty mind-blowing XD, and at 4K I tend to get distracted by the scenery in games.


----------



## Vitto

I was wondering: I'm using Asus GPU Tweak II for my OC now. Do you guys think I should switch to MSI Afterburner? I noticed that once in a while the two cards are not exactly synced; sometimes when alt-tabbing out of a game, one card keeps going at full speed while the other clocks itself down. I'm pretty sure it's the Micron card, and I wonder if the new vBIOS introduced a bug. Or would you recommend returning the Micron card and trying to find one with Samsung memory? I got this one from my local Micro Center.


----------



## gtbtk

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> It should run fine on both but the Micron cards will get the non crash benefits that the Samsung cards had from the beginning
> 
> 
> 
> But is there even any benefit from flashing it on Samsung-based cards?

If it ain't broke, no need to fix anything.


----------



## gtbtk

Quote:


> Originally Posted by *Vitto*
> 
> Hello everyone! I've been following this thread for last few days and finally decided to join the site
> 
> 
> 
> 
> 
> 
> 
> I'm running 6600k @ 4.7 Ghz with an Asus Strix OC 1070s in SLI. Some benchmarks to follow shortly! I'm coming from a GTX 465 so it's quite an upgrade


Question for you: why did you decide to do 1070s in SLI and not go for a 1080 straight up?


----------



## Vitto

I originally built my rig with one 1070 and after a few months decided to get a second one.


----------



## gtbtk

Quote:


> Originally Posted by *loopy750*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kevindd992002*
> 
> But is there even any benefit from flashing it on Samsung-based cards?
> 
> 
> 
> Unlikely. I know flashing a new BIOS is tempting but they've stated that whatever adjustments were made are specifically for Micron memory. If your Samsung memory isn't crashing when changing from low to high voltage then I doubt you'd be gaining anything from this BIOS.
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> just goes to show the quality of the Zotac tech support team. just like the teams everywhere else
> 
> 
> Indeed. I asked two separate support people a couple of weeks ago about when to expect the BIOS update and on both occasions told that my "card was within specifications and if there is an issue to RMA". They knew nothing about a new BIOS.
> 
> Anyway I downloaded the link a few pages back for my GTX 1070 AMP Extreme and it's working excellent.
> OC speeds are 2114 MHz / 9065 MHz. Custom fan speed sits around 50% when gaming and temps never go over 53°C.
> 
> Many thanks to those who shared the BIOS links. This is a beast of a card and now I'm officially happy with it.

I am the guy who worked out what was causing the bug and got Nvidia to produce the BIOS update to fix it. Glad you are happy; I know I was when I got my MSI update.

I went through a similar exercise to yours with MSI.

I have worked in the IT business for years and have experienced what these guys are dealing with. These vendor companies don't exactly make it easy for the poor blokes on the front line: the support staff are usually the last to hear about anything new in the pipeline or to receive any documentation for products or product updates. The upside is that, for the good guys, it really sharpens their problem-solving, so they get promoted to roles where those skills are no longer needed, while the not-so-good guys are left doing more customer support work.


----------



## gtbtk

Quote:


> Originally Posted by *Vitto*
> 
> I originally built my rig with 1 1070 and after few months decided to get a second one


that makes sense


----------



## zipper17

Quote:


> Originally Posted by *Vitto*
> 
> Hello everyone! I've been following this thread for last few days and finally decided to join the site
> 
> 
> 
> 
> 
> 
> 
> I'm running 6600k @ 4.7 Ghz with an Asus Strix OC 1070s in SLI. Some benchmarks to follow shortly! I'm coming from a GTX 465 so it's quite an upgrade


That's a bigger upgrade than mine; I'm coming from a GTX 560 Ti after 5 years of waiting.
I think I'm going to do the same with the GTX 1070 and upgrade again 5 years from now.


----------



## CriZ93

No change in overclock after installing the new Zotac BIOS; I guess it only fixes the Micron memory voltage problem.
My OC speeds are 2164/8910 with core voltage and power limit at max, on a Zotac 1070 AMP Extreme.

I guess each GPU will give different results.


----------



## Vitto

Hmm, I noticed that my SLI cards are not always synced (GPU clock, memory clock, and thus voltage and fan speed), but I believe it only happens while using Steam, watching YouTube, etc., i.e. the non-gaming, non-benchmarking apps. Do you guys think it's normal that they only sync for heavy tasks?


----------



## Vitto

Even so, that's still a very nice OC, Cri.







I'm getting 2114/8706 on my Strix. How are your temps? Air or liquid cooled?


----------



## Vitto

Cri inspired me to go further XD.



+100 on GPU for 2114 MHz,
+800 on mem for 8808 MHz,
120% on power target for stability, cards got to 106% max,
No GPU voltage changed.
Temps at 58 and 60 for 1070A and 1070B with fans at 60%.

Pretty sure this would be unstable in 3DMark.


----------



## CriZ93

Vitto: air, 50°C idle, 70°C max.
My PC case is bad.


----------



## Vitto

Ouch, that's quite a high idle D: What case have you got? I just tried 8858 on the memory; it went through 18 scenes, gave me a score, and then froze XD.


----------



## CriZ93

generic black case








Someday I'll change it.
Samsung or Micron VRAM?


----------



## Vitto

Samsung on the one I got while building the PC a few months ago, but Micron on the one I got a week ago. :/


----------



## Vitto

I got the NZXT Noctis 450; the cooling on it is quite good: 3x 120s in the front for intake, 2x 180s at the top, and 1x 180 in the back for exhaust. It idles at around 35°C.


----------



## CriZ93

Quote:


> Originally Posted by *Vitto*
> 
> Samsung on the one I got while building the pc few months ago but Micron on the one I got a week ago :/


It seems that Micron memory overclocking varies from GPU to GPU;
other people achieved better frequencies.


----------



## Vitto

That's what I heard :/ and this is after the vBIOS update from Asus that was supposed to help with Micron memory.


----------



## anthonyg45157

My Zotac 1070 is hitting 2164 on the core and 9200 on the Micron memory. Not much different after the BIOS update; however, I'm no longer getting the occasional desktop freezes or blue screens when I alt-tab out of games. Thanks for the BIOS!


----------



## Vitto

Looks like Zotac is the boss! @_@


----------



## Vitto

Do you guys think I could get better results by OCing each card individually, since one comes with Samsung and the other with Micron? GPU Tweak II has an option to sync the settings on both cards, and I've been using that so far.


----------



## Munchkinpuncher

Hey guys, I've got two Asus Strix GTX 1070s. I haven't gotten too deep into overclocking them yet; they have Micron memory, which I hear isn't as good as Samsung, but they seem pretty decent so far. This is at 2073 core / 8120 mem.

I'm going to see how far I can push them tonight.


----------



## asdkj1740

Quote:


> Originally Posted by *CriZ93*
> 
> No change in overclock after installing the new zotac bios. I guess only fix the micron memory voltage problem
> my OC speeds are 2164/8910 core voltage and power limit on max
> zotac 1070 amp extreme
> 
> I guess in each gpu will give different results


How about running FurMark at 1080p with 0x AA? What is the voltage? Any serious throttling?


----------



## gtbtk

Quote:


> Originally Posted by *Munchkinpuncher*
> 
> 
> Hey guys ive got 2 asus strix gtx 1070s, havent really gotten too deep yet on overclocking them yet, they have micron memory which I hear isn't as good as samsung, they seem to be pretty decent so far. This is at 2073 core 8120 mem.
> 
> Im going to see how far I can push them tonight


If you have updated the BIOSes to the 86.04.50.00.xx version, then there is no problem with Micron memory.


----------



## CriZ93

Quote:


> Originally Posted by *asdkj1740*
> 
> how about running furmark 1080p 0xaa? what is the voltage? any serious throttling?


I don't have FurMark.
I am using the voltage curve in MSI Afterburner; at 2164 the maximum voltage under load is 1093mV.
I test stability with games: GTA V, Doom, Tomb Raider, etc.
So far, no artifacts or throttling.


----------



## REDEFINE

Hey guys, I'm new to this club and was just wondering if any of you are still seeing the high DPC latency that was reported back in August? I feel like it might be a little worse than my old 970, but I would like other input. Thanks.


----------



## gtbtk

Quote:


> Originally Posted by *REDEFINE*
> 
> Hey guys Im new to this club and I was just wondering if any of you are still having the high DPC latency that reported back in August? I feel like it might be a little worse then my old 970 but I would like other input. Thanks


No. However, if you are not using the Nvidia audio device, disable it.


----------



## asdkj1740

Quote:


> Originally Posted by *CriZ93*
> 
> I do not have the furmark
> I am using the voltage curve in afterburner msi at 2164 the maximum voltage in load is 1093mv
> And test stability with games gta v, doom, tomb raider etc
> For now no artifact or throttling


If you have time, please download it and run it at 1080p with 0x AA (off):
http://www.ozone3d.net/benchmarks/fur/


----------



## blued

Avoid FurMark; it's unnecessary over-stressing of the GPU, and both Nvidia and AMD recommend against it. Furthermore, no game or benchmark comes close to FurMark's stress level, so there's no point in it.


----------



## spddmn24

What am I doing wrong here? Hitting the power limit in Fire Strike Extreme at 100% TDP.


----------



## MEC-777

Quote:


> Originally Posted by *blued*
> 
> Avoid furmark. Unnecessary over-stressing of GPU.. both Nvidia and AMD recommend against it. Furthermore, no game or bench exists that can approximate the furmark stress level, so no point in it.


^This. I don't understand why people keep using it.


----------



## MEC-777

Quote:


> Originally Posted by *Vitto*
> 
> Do you guys think i could get better results by OCing each card individually, since one comes with samsung and other with micron? GPU Tweak II has an option to sync the settings on both cards and i've been using that so far.


I know when I ran crossfire R9 290's, if the clocks were not synced, performance suffered. I'm assuming it's the same with SLI, you want to have both cards running the same clocks for the best performance. This unfortunately limits you to the best OC of the worst card you have, but I would keep them both synced.


----------



## JAM3S121

Should I be happy with 68-73°C while playing Battlefield 1 at ultra settings, locked to 100Hz? My case is pretty restrictive and I don't think the airflow is as good as it should be. This is an EVGA GTX 1070 FTW at stock clocks.


----------



## Caveat

Hey guys. I just sold my two Sapphire R9 290Xs because the power consumption was way too high. I want to get a 1070. Any tips on what brand to get? I am leaning toward an Asus Strix GTX 1070 8GB Gaming.

specs:
Mobo: Asus Crosshair V Formula-Z
CPU: AMD FX-9590 at 4.7GHz cooled NZXT Kraken x61
GPU: 2 Sapphire R9 290X Tri-X in crossfire
Memory: G.Skill TridentX F3-2400C10D-16GTX
PSU: 1200Watt Sirtek Rock Solid


----------



## anthonyg45157

Quote:


> Originally Posted by *Caveat*
> 
> Hey guys. I just sold my 2 Sapphire R9 290x's because the power consumption was way to high. I want to get an 1070. But any tips on what brand to get for an 1070? I am leaning to an Asus strix gtx1070 8gb gaming.
> 
> specs:
> Mobo: Asus Crosshair V Formula-Z
> CPU: AMD FX-9590 at 4.7GHz cooled NZXT Kraken x61
> GPU: 2 Sapphire R9 290X Tri-X in crossfire
> Memory: G.Skill TridentX F3-2400C10D-16GTX
> PSU: 1200Watt Sirtek Rock Solid


Can't go wrong with the MSI Gaming, Asus Strix, or Zotac AMP Extreme; it basically comes down to looks.


----------



## Caveat

Quote:


> Originally Posted by *anthonyg45157*
> 
> Can't go wrong with MSI Gaming, Asus Strix or Zotac Amp! Extreme , basically comes down to looks.


OK, thanks. That's all I needed to know.


----------



## Bold Eagle

Get one with a dual BIOS, given the need to flash so many of the 1070s; it's always good to have that backup.


----------



## Luckael

Quote:


> Originally Posted by *AliasOfMyself*
> 
> Hmmm, i don't know if this means anything, but my subvendor is undefined and my device id doesn't quite match yours... also it's worth mentioning that this is my first Nvidia card in almost 15 years lol, i've attached a screenshot of my gpu-z


I returned my GTX 1070 Xtreme Gaming for RMA, because BF4, The Witcher 3, and 3DMark Fire Strike were always freezing and I needed to downclock the GPU before it would work fine. Even the default Gaming mode was consistently crashing my games.


----------



## zipper17

Quote:


> Originally Posted by *Luckael*
> 
> i returned my Gtx 1070 xtreme gaming for rma, because the BF4,The Witcher 3, 3d mark fire strike is always freezing and i need to downclock my gpu before it works fine. even the default gaming mode is consistently crashing my games.


What are your specs and PSU?
Be sure you have a good PSU.
Power supply voltage ripple and relevance:
http://www.gamersnexus.net/guides/2053-power-supply-voltage-ripple-and-relevance


----------



## Caveat

Quote:


> Originally Posted by *Bold Eagle*
> 
> Get one with a dual BIOS after the need to flash many of the 1070's - always good to have that back up.


Ehm... how do I know if it has a dual BIOS?


----------



## EDK-TheONE

Did they fix the fan speed issue below 30%?


----------



## EDK-TheONE

Quote:


> Originally Posted by *CriZ93*
> 
> No change in overclock after installing the new zotac bios. I guess only fix the micron memory voltage problem
> my OC speeds are 2164/8910 core voltage and power limit on max
> zotac 1070 amp extreme
> 
> I guess in each gpu will give different results


Did they fix the fan speed issue below 30%?


----------



## TK421

Hi

I have shorted the R005 resistor on the motherboard (Laptop GTX1070) but I am not seeing a decrease in power consumption. The card still power throttles. What should I do to fix this problem?

I also have the Samsung VRAM, what is the average overclock for this type of GDDR5 memory on a GTX1070? I have done +700 in the past but that is on a different laptop (G752VS) with unknown VRAM brand.
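For what it's worth, a shunt mod only changes what the monitoring IC *reads*, not what the card actually draws. The sketch below is purely illustrative: the 5 mOhm value comes from the "R005" part name, while the 120 W figure and the parallel-resistor variant are made-up assumptions. One possible reason a hard short changes nothing is that a near-zero sensed drop can register as a sensor fault rather than as extra headroom.

```python
# Rough sketch of why a shunt mod changes *reported* power, not actual draw.
# Values are illustrative assumptions, not measurements from this laptop.

def reported_power(actual_watts, r_shunt=0.005, r_mod=None):
    """Power the monitoring IC 'sees' across an R005 (5 mOhm) shunt.

    r_mod: optional resistor soldered in parallel with the shunt.
    A hard short drives the sensed drop toward 0, which some
    controllers treat as a sensor fault, not infinite headroom.
    """
    if r_mod is None:
        r_eff = r_shunt                       # stock: IC reads true power
    else:
        r_eff = (r_shunt * r_mod) / (r_shunt + r_mod)
    # The sensed voltage drop scales with r_eff, so reported power does too.
    return actual_watts * (r_eff / r_shunt)

print(reported_power(120))                    # stock: reports the true 120 W
print(reported_power(120, r_mod=0.005))       # parallel 5 mOhm: ~60 W reported
```

The takeaway is that a parallel resistor halves (or otherwise scales down) the *reported* number, which is what actually moves the throttle point.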


----------



## khanmein

Quote:


> Originally Posted by *TK421*
> 
> Hi
> 
> I have shorted the R005 resistor on the motherboard (Laptop GTX1070) but I am not seeing a decrease in power consumption. The card still power throttles. What should I do to fix this problem?
> 
> I also have the Samsung VRAM, what is the average overclock for this type of GDDR5 memory on a GTX1070? I have done +700 in the past but that is on a different laptop (G752VS) with unknown VRAM brand.


I don't suggest shorting that resistor. Damn risky~


----------



## AliasOfMyself

Quote:


> Originally Posted by *Luckael*
> 
> i returned my Gtx 1070 xtreme gaming for rma, because the BF4,The Witcher 3, 3d mark fire strike is always freezing and i need to downclock my gpu before it works fine. even the default gaming mode is consistently crashing my games.


Awww no way :/ What PSU are you using in your rig? Apart from not being able to flash the F3 BIOS to fix a few bugs, e.g. my 3D clocks being stuck on until I reboot, my card has been rock solid.. let's hope you don't get Micron VRAM next time around lol ?


----------



## TK421

Quote:


> Originally Posted by *khanmein*
> 
> i don't suggest u shorted the resistor. damn risky~


?


----------



## RyanRazer

I am also interested in that info: does the new Zotac BIOS fix the fan issue at low (under 30%) RPM? Calling any Zotac people.


----------



## Luckael

Quote:


> Originally Posted by *zipper17*
> 
> What is your spec and PSU?
> be sure u have a good PSU.
> Powersupply voltage ripple and relevance:
> http://www.gamersnexus.net/guides/2053-power-supply-voltage-ripple-and-relevance


I'm using a Seasonic 750 W Snow Silent Platinum PSU.


----------



## Luckael

Quote:


> Originally Posted by *AliasOfMyself*
> 
> Awww no way :/ what psu are you using in your rig? Apart from not being able to flash the F3 bios to fix a few bugs e.g my 3d clocks being stuck on til I reboot, my card has been rock solid.. let's hope you don't get micron vram next time around lol ?


I'm using a Seasonic 750 W Snow Silent Platinum. I think the factory OC on my chip is just not good.


----------



## anthonyg45157

Is the Zotac fan issue the one where, with the fan set under 30 percent, the fan stops, spins back up, and repeats?

I can check soon if so.

Edit: This issue still persists with the updated BIOS. I just set the fan curve to 0 until 45 Celsius with Afterburner, which works.
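For anyone wanting the same workaround, the idea is just a piecewise fan curve that holds 0% until the card warms up. A minimal Python sketch; the 45 C / 0% point comes from the workaround above, while the breakpoints above 45 C are placeholder values, not Zotac's actual curve:

```python
# Afterburner-style custom fan curve: hold 0% below 45 C to avoid the
# stop/spin-up loop, then ramp linearly between breakpoints.
# Breakpoints above 45 C are made-up placeholders.

CURVE = [(45, 0), (55, 30), (70, 55), (85, 100)]  # (temp C, fan %)

def fan_percent(temp_c, curve=CURVE):
    if temp_c <= curve[0][0]:
        return 0                       # fan fully off below the first point
    if temp_c >= curve[-1][0]:
        return curve[-1][1]            # pegged at the top of the curve
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between adjacent breakpoints
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(40))   # 0 -- idle, fan off
print(fan_percent(60))   # somewhere between 30 and 55
```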


----------



## AliasOfMyself

Quote:


> Originally Posted by *Luckael*
> 
> im using seasonic 750w snow silent platinum. i think the factory oc on my chip is not good.


You'd think the binning process would have picked that up, really.. How long did you have your card before it started to die? Good thing Gigabyte has a 4-year warranty on these cards!

That PSU should be more than good enough, so I really doubt that's the cause.


----------



## khanmein

The fan issue from Zotac can't be fixed, cos that's their signature.


----------



## philhalo66

Anyone having issues with downsampling? I can DS to anything except 2560x1440, 2560x1600, and 2880x1800; those three just give me a black screen with a small white strip at the top, and it's not my monitor losing signal. Downsampling 4K works fine, and I can even go up to 5K; it's just those three. I didn't have this problem on my GTX 580. It only seems to do it on the desktop; in-game it's fine.


----------



## Gurkburk

Quote:


> Originally Posted by *Caveat*
> 
> Ok thanks. Thats all i needed to know


Gigabyte has superior cooling to all other cards.


----------



## biffenl

Quote:


> Originally Posted by *Gurkburk*
> 
> Gigabyte has superior cooling to all other cards.


Lol


----------



## Hunched

Quote:


> Originally Posted by *Gurkburk*
> 
> Gigabyte has superior cooling to all other cards.


Lol no.
Also their fans are loud as hell, and they're the cheapest looking and feeling 1070's


----------



## philhalo66

Quote:


> Originally Posted by *Hunched*
> 
> Lol no.
> Also their fans are loud as hell, and they're the cheapest looking and feeling 1070's


I don't know about that, my 1070 is dead silent and never goes above 58C even with a hefty OC.


----------



## Caveat

Quote:


> Originally Posted by *Gurkburk*
> 
> Gigabyte has superior cooling to all other cards.


Thanks for the info, but I already ordered the ASUS STRIX-GTX1070-8G-GAMING. I don't know why, but I like the Asus software. Probably because it is easy to use...


----------



## TheBoom

MSI AB is probably still the easiest to use.

But good choice; the card is relatively silent and cool. It's very solidly built but extremely long, so make sure you have clearance in your case.


----------



## Caveat

Quote:


> Originally Posted by *TheBoom*
> 
> MSI AB is probably still the easiest to use.
> 
> But good choice, the card is relatively silent and cool. It's very solidly built but extremely long so make sure you have allowance in your case.


I have the NZXT Phantom 820 case, so I think it should fit, right?


----------



## spin5000

My GPU is constantly hitting its power limit even with the power slider turned up to max. It's literally non-stop power-limit throttling. Is there still no way to use a custom BIOS, or at the very least a way to just increase the power limit (i.e. raise what wattage = 100%)?

My 980 Ti, which was not heavily overclocked compared to many others (1483 MHz / 7700 MHz), hits 101 and 82 fps in the two GPU-only Fire Strike tests, while my 1070 gets only 91 and 76 fps.

It's like there is a blanket, overly low power cap on every single 1070 out there, no matter the brand or model, as if Nvidia forced every company, across all their models, to cap power very harshly. Weird...
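That matches how the slider works: it's a percentage of a board power target baked into the BIOS, so maxing it only buys a fixed ceiling. A quick sketch of the arithmetic, assuming a 1070 FE-style 150 W target and a 112% slider cap; your card's BIOS values may differ:

```python
# Why maxing the power slider can still leave you power-limited: the slider
# is a percentage of a board power target fixed in the BIOS.
# 150 W / 112% are assumed FE-style values, not read from any specific card.

TDP_WATTS = 150
MAX_LIMIT_PCT = 112

def power_cap(slider_pct, tdp=TDP_WATTS, max_pct=MAX_LIMIT_PCT):
    pct = min(slider_pct, max_pct)     # the BIOS clamps the slider
    return tdp * pct / 100

print(power_cap(100))  # 150.0 W at stock
print(power_cap(112))  # 168.0 W -- the hard ceiling without a BIOS mod
print(power_cap(150))  # still 168.0 W; extra slider range does nothing
```

So if the card wants more than that ceiling at your clocks, it throttles no matter where the slider sits, which is why people resort to cross-flashing or shunt mods.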


----------



## zipper17

My card can only barely reach 21K in Fire Strike

21.0XX - 21.1XX graphics scores

and it's not perfectly stable.

I get a lot of crashes (nvlddmkm.sys) every time I attempt to reach 21K in Fire Strike. I can't get it perfectly rock stable across every benchmark run or every Fire Strike/Extreme stress test; sometimes it's stable, but the next run it will crash randomly. I've tried many offset and curve settings but still cannot get it rock stable at all.

Is anyone experiencing the same?

I guess this means I've pretty much reached my silicon's limit? Or is it something else?


----------



## JAM3S121

I know there were a few problems with these cards but I'm just inquiring about mine

GTX 1070 EVGA FTW edition, purchased from Amazon on December 3rd, 2016.
At idle the fans don't spin; is that normal? Playing BF1 at ultra, the temps max out at 73C. My case is a little restrictive; are these temps okay?


----------



## MEC-777

Quote:


> Originally Posted by *JAM3S121*
> 
> I know there were a few problems with these cards but I'm just inquiring about mine
> 
> GTX 1070 EVGA FTW edition purchased from amazon December 3rd 2016.
> At idle the fans don't spin? Playing BF1 at ultra the temps max at 73c, my case is a little restrictive are these temps okay?


Fans aren't supposed to spin unless necessary. Yep, those temps are totally safe.

You may want to take note of the recent issues with those cards and order/install the revised thermal pad kit they have put out.


----------



## Nukemaster

Quote:


> Originally Posted by *JAM3S121*
> 
> I know there were a few problems with these cards but I'm just inquiring about mine
> 
> GTX 1070 EVGA FTW edition purchased from amazon December 3rd 2016.
> At idle the fans don't spin? Playing BF1 at ultra the temps max at 73c, my case is a little restrictive are these temps okay?


Your temps are within spec.

The 0dB (0 RPM) mode on most of these cards keeps the fan off under 50C or so.

If you are worried about temperatures, you can make a custom fan profile with MSI Afterburner or EVGA Precision X.


----------



## Caveat

Finally got mine (I know, 1200 W is complete overkill; I'm already trying to sell it or trade it for a 600-700 W unit). I've never had Nvidia, always ATI/AMD. Anything I need to know?


----------



## MEC-777

Quote:


> Originally Posted by *Caveat*
> 
> Finally got mine (i know, 1200Watt is a complete overkill. I already trying to sell or trade it against 600-700Watt). I never had Nvidia. Always had ATI/AMD. Anything i need to know?


Nvidia drivers are much easier, cleaner to deal with vs AMD drivers. Basically just download via Geforce Experience, click express install and you're laughing. No need to run DDU or all that junk.









Also, Shadowplay is freakin awesome if you're into recording gameplay footage.

Other than that, enjoy the silence, low temps and efficiency.


----------



## amstech

Quote:


> Originally Posted by *Caveat*
> 
> Thanks for the info. But i ordered already the ASUS STRIX-GTX1070-8G-GAMING. I don.t know why, but i like the Asus software. Probably because it is easy to use...


I am torn between both those cards (Windforce/STRIX) and honestly, I don't think you can go wrong with either.


----------



## philhalo66

Turns out my 1070 has coil whine, but I can barely hear it over the fans on my radiator. Plus I've got noise-cancelling headphones, so I can't hear it at all 90% of the time. I assume coil whine doesn't actually mean failure, right?


----------



## Caveat

Gamed for 2 hours without any problems. Then it said "hey, new Nvidia drivers," so I installed them, rebooted, aaand now I get this all the time. Going to look into it tomorrow. Maybe a complete reinstall of Windows...


----------



## philhalo66

Quote:


> Originally Posted by *Caveat*
> 
> 
> Gamed for 2 hours without any problems. Than it said "hey new nvidia drivers" so i installed, reboot aaaand now i get this all the time. Going to look into it tomorrow. Maybe a complete reinstall of windows...


Get rid of all the ASUS software on your system. IOMAP64.sys is part of it, and according to the bug check code, that's what caused the BSOD. If that doesn't fix it, boot into safe mode and delete the file from C:\Windows\System32. It does seem to be caused by ASUS software somehow.


----------



## TK421

Hi

I have shorted the R005 resistor on the motherboard (Laptop GTX1070) but I am not seeing a decrease in power consumption. The card still power throttles. What should I do to fix this problem?

I also have the Samsung VRAM, what is the average overclock for this type of GDDR5 memory on a GTX1070? I have done +700 in the past but that is on a different laptop (G752VS) with unknown VRAM brand.


----------



## Caveat

Quote:


> Originally Posted by *philhalo66*
> 
> get rid of all the asus software on your system IOMAP64.sys is part of that and that BSOD was caused by that according the BC Code. if that doesnt fix it then boot into safe mode and delete the file from c:\windows\system32. it does seem to be caused by ASUS software somehow.


All the Asus software? You mean the drivers?


----------



## philhalo66

Quote:


> Originally Posted by *TK421*
> 
> Hi
> 
> I have shorted the R005 resistor on the motherboard (Laptop GTX1070) but I am not seeing a decrease in power consumption. The card still power throttles. What should I do to fix this problem?
> 
> I also have the Samsung VRAM, what is the average overclock for this type of GDDR5 memory on a GTX1070? I have done +700 in the past but that is on a different laptop (G752VS) with unknown VRAM brand.


Might be throttling that's hardwired into the laptop motherboard, or the power adapter not supplying enough wattage, or even heat related. Shorting anything on a laptop is unimaginably risky; I strongly recommend you reverse that. Is it really worth killing your laptop for a few extra FPS? Check your temps and check that you have the correct wattage adapter.

I've seen people in here tossing around +500MHz as an average OC on Samsung memory, but I'm sure it will vary from card to card.


----------



## philhalo66

Quote:


> Originally Posted by *Caveat*
> 
> All the Asus software? You mean the drivers?


No, I mean the ASUS stuff. It seems to be caused by ASUS software somehow; a quick Google search says getting rid of the ASUS AI tweak tools helped a lot of people. Maybe the new driver is conflicting or something.


----------



## Caveat

Quote:


> Originally Posted by *philhalo66*
> 
> no, i mean asus stuff it seems to be caused by asus software somehow a quick google search says getting rid of the asus ai tweak things helped alot of people. maybe the new driver is conflicting or something.


Ah ok. I will try that tomorrow. Thank you.


----------



## philhalo66

Quote:


> Originally Posted by *Caveat*
> 
> Ah ok. I will try that tomorrow. Thank you.


NP i hope it helps.


----------



## Caveat

Quote:


> Originally Posted by *philhalo66*
> 
> NP i hope it helps.


Yeah, me too. Not in the mood to reinstall the whole darn thing.


----------



## MEC-777

Quote:


> Originally Posted by *philhalo66*
> 
> Turns out my 1070 has coil whine but i can barely hear it over the fans on my radiator Plus I got noise cancelling headphones so i cant hear it at all 90% of the time. I assume coil whine doesn't actually mean failure right?


Some coil whine is fine. Some cards do it and some don't. Not a sign of failure. If it's really bad, you could RMA it.

Quote:


> Originally Posted by *philhalo66*
> 
> no, i mean asus stuff it seems to be caused by asus software somehow a quick google search says getting rid of the asus ai tweak things helped alot of people. maybe the new driver is conflicting or something.


Yep. I had the Asus AI Suite software causing issues and interfering when I tried to use SpeedFan to control all my case fans. Just got rid of all of it except the BIOS updater, and all is well.


----------



## Caveat

Quote:


> Originally Posted by *MEC-777*
> 
> Yep. I have the Asus AI suite software cause me issues and interfering when trying to use the Speedfan program to control all my case fans. Just got rid of all of it except the BIOS updater and all is well.


Yes, I figured that AI Suite sucks; it was always saying my processor is at -450 degrees or whatever. But I thought I needed it... I'm going to delete it tomorrow. Maybe it will help.


----------



## TK421

Quote:


> Originally Posted by *philhalo66*
> 
> might be throttling thats hardwired into the laptop motherboard. or could be the power adapter not giving enough wattage or even heat related. shorting anything on a laptop is unimaginably risky i strongly recommend you reverse that. is it really worth killing your laptop for a few extra FPS? Check your temps and check to see if you have the correct wattage adapter.
> 
> i seen people in here tossing around 500MHz for an average OC on samsung memory but im sure it will vary from card to card.


Absolutely.

I have liquid metal applied on the heatsink and a premium + accidental warranty to back it up (and I'm still in the return window to boot).

Max temp in Fire Strike Ultra: 54C.

Power adapter: 230 W

Model: Alienware 15 R3 2016 (6820HK @ 4.2GHz)

I tried a baseline of +550, then +700, and neither held a stable improvement in points. +650 nets a sizeable improvement in overall scores.

FS Ultra

4306+550
4297+550
4287+550

4312+700
4302+700
4332+700

4316+650
4311+650
4321+650
4331+650
4312+650

4308+600
4287+600

CPU clocked at 4.2GHz with max temp 70c (80w max power draw)

I also have an MSi GT73VR coming in sometime next year, that one has a 1080 so hopefully the shunt resistor is actually used for power measurement.
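Averaging the runs above per offset makes the pattern easier to see; a quick sketch using the posted numbers:

```python
# Average the Fire Strike Ultra runs posted above, per memory offset,
# to see which offset actually comes out ahead on average.

from statistics import mean

runs = {
    550: [4306, 4297, 4287],
    600: [4308, 4287],
    650: [4316, 4311, 4321, 4331, 4312],
    700: [4312, 4302, 4332],
}

for offset, scores in sorted(runs.items()):
    print(f"+{offset}: avg {mean(scores):.1f} over {len(scores)} runs")
```

+650 comes out highest on average (~4318) with +700 a bit behind it, which matches the conclusion above, though with only a handful of runs each the gaps are within run-to-run noise.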


----------



## Luckael

Quote:


> Originally Posted by *AliasOfMyself*
> 
> You'd think the binning process would have picked that up really.. how long did you have your card before it started to die? Good thing Gigabyte have a 4 year warranty on these cards really!
> 
> That PSU should be more than good enough, so i really doubt that is the cause


Since the first day I bought it. The Division always crashes every time I check the map, but when I downclock, it runs fine.


----------



## AliasOfMyself

Quote:


> Originally Posted by *Luckael*
> 
> since first day, i bought. my division game is always crashing every time i check the map. but when i downclock, it runs fine.


Sounds to me like it wasn't binned properly then. Are you going to get a replacement from Gigabyte or try another brand?


----------



## Nukemaster

Quote:


> Originally Posted by *Luckael*
> 
> since first day, i bought. my division game is always crashing every time i check the map. but when i downclock, it runs fine.


My GTX 670 did the same thing in some games. It sucks that they do not fully test the chips before sending them out to make cards (as far as I know, the chip's max boost is set by Nvidia).


----------



## Luckael

Quote:


> Originally Posted by *AliasOfMyself*
> 
> Sounds like it wasn't binned properly to me then, you going to get a replacement from gigabyte or try another brand?


I'm still going with the Xtreme Gaming; it has good RMA support here in our country.


----------



## philhalo66

Quote:


> Originally Posted by *Luckael*
> 
> im still going with the xtreme gaming, it has a good rma support here in our country.


What country are you from? I've been reading nothing but horror stories, to the point that I'm more than likely going to sell this and get an EVGA 1070 in a few months. First and last Gigabyte card unless they prove me wrong.


----------



## gtbtk

Quote:


> Originally Posted by *Caveat*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gurkburk*
> 
> Gigabyte has superior cooling to all other cards.
> 
> 
> 
> Thanks for the info. But i ordered already the ASUS STRIX-GTX1070-8G-GAMING. I don.t know why, but i like the Asus software. Probably because it is easy to use...
Click to expand...

Shame it is not accurate.

You also have the option of cross-flashing the card with the Strix OC firmware. Better performance for free ;-)


----------



## Caveat

Quote:


> Originally Posted by *gtbtk*
> 
> shame it is not accurate.
> 
> You have the choice of cross flashing the card with the Strix OC firmware too. Better performance for free ;-)


Yeah, I need to look into that later. I reinstalled Windows and now Windows cannot find updates; it just keeps on searching and searching. Sigh.


----------



## gtbtk

Quote:


> Originally Posted by *Caveat*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> shame it is not accurate.
> 
> You have the choice of cross flashing the card with the Strix OC firmware too. Better performance for free ;-)
> 
> 
> 
> Ye i need to look into that later. I reinstalled windows and now windows can not find updates. It keeps on searching and searching. Sigh
Click to expand...

Try running the Windows Update troubleshooter, then sfc /scannow, and reboot.


----------



## iARDAs

Forgot to add a picture of my 1070...

Here you go


----------



## Caveat

Quote:


> Originally Posted by *gtbtk*
> 
> try running the windows update troubleshooter, sfc/scannow and reboot


I tried. It was still ****. The blue screen error always comes when I install the driver through the GeForce Experience program...


----------



## asdkj1740

Quote:


> Originally Posted by *iARDAs*
> 
> Forgot to add a picture of my 1070...
> 
> Here you go


glorious girl.


----------



## iARDAs

Quote:


> Originally Posted by *asdkj1740*
> 
> glorious girl.


She is barely 3 and already ascended to the master race. What a time to be alive for her









Thanks by the way buddy.


----------



## gtbtk

Quote:


> Originally Posted by *Caveat*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> try running the windows update troubleshooter, sfc/scannow and reboot
> 
> 
> 
> I tried. It was still ****. The blue screen error always comes if i installed the version through the Geforce Experience program...
Click to expand...

So the problem is not the Windows operating system update, but updating the Nvidia drivers through GFE?

In that case, uninstall all the Nvidia drivers using the Windows uninstaller, then use DDU to remove all the leftover bits of the Nvidia drivers. Run it in safe mode.

Then manually download the latest drivers from the Nvidia website and try installing them manually. That should sort out the driver problem.


----------



## gtbtk

Quote:


> Originally Posted by *iARDAs*
> 
> Forgot to add a picture of my 1070...
> 
> Here you go


Pretty Girl. I have 2 daughters myself but they are a lot further along the growth path than yours.


----------



## Caveat

Quote:


> Originally Posted by *gtbtk*
> 
> so the problem is not windows operating system update but updating nvidia drivers though GFE?
> 
> In that case, uninstall all the nvidia drivers uning the windows uninstaller then use DDU to uninstall all the left over bits of the nvidia drivers. Run it in safe mode.
> 
> Then manually download the latest version of drivers from the nvidia web site and try installing manually. That should sort out the driver problem


Lol sorry, I think I explained it wrong. The first blue screen was when I installed the latest GeForce drivers. I could not fix it by deleting AI Suite, because the blue screen came 5 seconds after entering Windows. So today I did a reinstall of Windows and did not install any of the motherboard drivers. Now it is working fine without a blue screen. But now I have the problem that Windows Update keeps searching and saying there are no updates, even though it is a completely fresh install, so there must be some update.


----------



## MEC-777

Quote:


> Originally Posted by *Caveat*
> 
> Lol sorry. I think i explained it wrong. First Blue screen was when i installed latest drivers of Geforce. I could not fix it by deleting AISuite. Because the blue screen came 5 seconds after entering windows. So today i did an reainstall of windows and i did not install any drivers of the motherboard. Now it is working fine without bluescreen. But now i have the problem that windows update keeps searching and saying there is no update, but it is an complete fresh install. So there must be any update


Is it possible it downloaded and installed the latest updates during the reinstall?


----------



## MEC-777

Quote:


> Originally Posted by *iARDAs*
> 
> She is barely 3 and already ascended to the master race. What a time to be alive for her
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks by the way buddy.


Start 'em while they're young, right!


----------



## philhalo66

Quote:


> Originally Posted by *Caveat*
> 
> Lol sorry. I think i explained it wrong. First Blue screen was when i installed latest drivers of Geforce. I could not fix it by deleting AISuite. Because the blue screen came 5 seconds after entering windows. So today i did an reainstall of windows and i did not install any drivers of the motherboard. Now it is working fine without bluescreen. But now i have the problem that windows update keeps searching and saying there is no update, but it is an complete fresh install. So there must be any update


did you try booting in safe mode and removing the IOMAP64.sys?


----------



## Caveat

Quote:


> Originally Posted by *philhalo66*
> 
> did you try booting in safe mode and removing the IOMAP64.sys?


Tbh I have no idea how to start in safe mode in Windows 8.1


----------



## Caveat

Quote:


> Originally Posted by *MEC-777*
> 
> Is it possible it downloaded and installed the latest updates during the reinstall?


No, I don't think so. I always have to install a lot of updates after installing this version of Windows.


----------



## philhalo66

Quote:


> Originally Posted by *Caveat*
> 
> Tbh I have no idea how to start in safe mode in Windows 8.1


This guide will tell you everything you need. It's very useful for deleting files that cause issues.


----------



## gtbtk

Quote:


> Originally Posted by *Caveat*
> 
> Quote:
> 
> 
> 
> Originally Posted by *philhalo66*
> 
> did you try booting in safe mode and removing the IOMAP64.sys?
> 
> 
> 
> Tbh I have no idea how to start in safe mode in Windows 8.1
Click to expand...

Hold the shift key down while you click on restart and it will allow you to start safe mode


----------



## Caveat

Quote:


> Originally Posted by *gtbtk*
> 
> Hold the shift key down while you click on restart and it will allow you to start safe mode


It does not actually. I only get restart, troubleshoot. That kind of stuff


----------



## gtbtk

Quote:


> Originally Posted by *Caveat*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Hold the shift key down while you click on restart and it will allow you to start safe mode
> 
> 
> 
> It does not actually. I only get restart, troubleshoot. That kind of stuff
Click to expand...

It is under the Troubleshoot section, I think.


----------



## Joenc

Quote:


> Originally Posted by *JAM3S121*
> 
> I know there were a few problems with these cards but I'm just inquiring about mine
> 
> GTX 1070 EVGA FTW edition purchased from amazon December 3rd 2016.
> At idle the fans don't spin? Playing BF1 at ultra the temps max at 73c, my case is a little restrictive are these temps okay?


Switch the card to its other BIOS. I think that one runs the fan at 30% or so at idle, plus it applies a slight overclock...


----------



## MEC-777

Quote:


> Originally Posted by *Caveat*
> 
> It does not actually. I only get restart, troubleshoot. That kind of stuff


I think it's under troubleshoot > advanced > restart in safemode.

Something like that.


----------



## Caveat

Quote:


> Originally Posted by *gtbtk*
> 
> it is under troubleshoot section i think


Quote:


> Originally Posted by *MEC-777*
> 
> I think it's under troubleshoot > advanced > restart in safemode.
> 
> Something like that.


Thanks. I will take a look into it


----------



## DeathAngel74

Or msconfig > check "Safe boot" > Apply > OK > restart.
Windows 10 is being a PITA on my laptop right now. The HP recovery partition was corrupt. After 2 hours of trying to reset the PC or do it through recovery, I gave up (2AM). Grabbed my Windows 10 1607 USB and reformatted that way (3AM). Installed the necessary drivers and went to bed, lol (4:30AM). I hate Windows 10!!!!!! I wish HP would release Windows 7 drivers for the OMEN 17.....


----------



## IRO-Bot

Just got the MSI GTX 1070 Quick Silver. That back plate is sexy. Works perfectly with the Dominator RAM.


----------



## IRO-Bot

Edit: Not sure why it double posted. I'd take a pic of it but don't have a camera and my phone camera is crappy.


----------



## criminal

Quote:


> Originally Posted by *spin5000*
> 
> My GPU is constantly hitting power limits even with the power slider turned up to max. It's literally non-stop power-limit-hitting like crazy. Is there still no way to use a custom BIOS if at the very least a way to just increase the power limit (or up the baseline power limits, ie. what wattage = 100%)???
> 
> My 980 Ti which was not heavily overclocked compared to many others - 1483 MHz / 7700 MHz - hits 101 and 82 fps in Firestrike (2 GPU-only tests) while my 1070 gets only 91 and 76 fps.
> 
> It's like there is a "blanket" overly-low power-cap on every single 1070 out there no matter what brand or model. It's like Nvidia forced every single company, with all their models, to be power-capped very harshly. Weird...


Why would you side-grade to a 1070 in the first place?


----------



## silencespr

https://www.techpowerup.com/gpuz/details/d8kwn

PNY 1070


----------



## Luckael

Quote:


> Originally Posted by *philhalo66*
> 
> what country are you from? I been reading nothing but horror stories to the point im more than likely going to sell this and get an EVGA 1070 in a few months. first and last gigabyte card unless they prove me wrong.


I'm from the Philippines


----------



## asdkj1740

https://www.zotac.com/cn/files/download/graphics_cards?driver_type=238&g_card_series=1168&g_card_os=All&sku=&skuSelect=
there are some Chinese versions of the Zotac 1070 Micron BIOS there, like for the PGF


----------



## asdkj1740

these two pics show the power throttling under furmark. the msi 291w bios seems to be....

















but this is strange: stock air cooling vs water cooling









http://tieba.baidu.com/p/4861293577


----------



## philhalo66

Quote:


> Originally Posted by *asdkj1740*
> 
> these two pics show the power throttling under furmark. the msi 291w bios seems to be....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but this is strange: stock air cooling vs water cooling
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://tieba.baidu.com/p/4861293577


FurMark is really bad for your graphics card anyway; likely it throttles so it won't fry the VRMs. Both AMD and Nvidia tell you not to touch it, as it causes an unrealistically high load and will damage your card.


----------



## asdkj1740

Quote:


> Originally Posted by *philhalo66*
> 
> Furmark is really bad for your graphics card anyway, likely it throttles so it wont fry the VRM's. Both AMD and Nvidia tell you not to touch it as it causes unrealistically high load and will damage your card.


It's because the stock coolers from Nvidia and AMD both suck.


----------



## philhalo66

Quote:


> Originally Posted by *asdkj1740*
> 
> its because the stock coolers from nvidia and amd are both suck.


Show me any game that puts the same amount of load on the card that FurMark does and I'll eat my words. It has nothing to do with coolers. I have seen people fry graphics cards from both AMD and Nvidia with full-cover waterblocks. FurMark is a stupid piece of software that does nothing more than eventually kill your graphics card.


----------



## asdkj1740

Quote:


> Originally Posted by *philhalo66*
> 
> show me any game that puts the same amount of load furmark does and i'll eat my words. It has nothing to do with coolers. I have seen people fry out graphics cards from both AMD and Nvidia with full cover waterblocks. Furmark is a stupid piece of software that does nothing more than kill your graphics card eventually.


The GamersNexus report that deliberately stressed the FTW's VRMs to 125°C suggests VRM temperature may not be the crucial cause of those failures,
and the Tom's Hardware DE review shows that on most cards the VRM temp stays below 100°C under FurMark.

This is all about cooling: not just the GPU, but the VRMs, VRAM, chokes, caps, the voltage-controller ICs, etc.

Calm down, dude. No need to eat anything over the rubbish stock coolers from Nvidia and AMD.


----------



## asdkj1740

del del


----------



## Caveat

Quote:


> Originally Posted by *Luckael*
> 
> im from Philippines


Kumusta, kabayan! (Hello, fellow countryman!)


----------



## foxmagic

Anyone got the new Zotac 1070 AMP ROM? Can't get the Zotac flashing tool working.


----------



## philhalo66

Quote:


> Originally Posted by *asdkj1740*
> 
> the gamernexus report on crazy stressing the ftw vrms to 125c shows that vrm temp may not be the crucial reason of firing.
> and the tomshardware de review shows most of the cards under furmark the vrm temp is below 100c.
> 
> this is all about cooling....not just gpu but vrm and vram and choke and cap and those voltage ic etc.
> 
> calm down dude. dont need to eat anything for the rubbish stock coolers from nvidia and amd.


I'd take anything from Tom's Hardware with a grain of salt; half the time they get paid for good reviews. And how does one test VRM temps on cards that don't have sensors on the VRMs? 90% of Nvidia cards don't have one, and the actual component can easily run 70-90°C hotter than what an infrared reading shows. My point is it's stupid to use FurMark, because it's been confirmed to kill cards even with water cooling.


----------



## MEC-777

Quote:


> Originally Posted by *asdkj1740*
> 
> its because the stock coolers from nvidia and amd are both suck.


It's not only stock-cooled cards that have been killed by FurMark. It's cards of all kinds, stock and aftermarket.

The problem with FurMark is that it puts an extreme, unrealistic load on the GPU that could never be duplicated in any game. Reviewers like Tom's can run it all they want; they didn't buy the cards (most of the time) and it doesn't matter if they kill them. But when I've saved up for months just to buy a new GPU, I'm not going to subject it to that kind of torture and potentially damage or kill it. It doesn't prove anything, because no actual game you run will ever be that demanding.

In-game benchmark tests are the best, most realistic and relevant test for GPUs, IMO.


----------



## MEC-777

After waiting nearly 2 weeks now, my 1070 should finally be arriving today.









Ran through a gauntlet of benchmarks the other night to get some numbers on the 980 before replacing it. Wanted to get an idea of the performance difference, (stock & OC'd) between it and the 1070. Really going to miss this 980. Was my first venture to the green team and let's just say it's going to take a lot for the red team to win me back.


----------



## loopy750

Quote:


> Originally Posted by *MEC-777*
> 
> It's not stock cooled cards that have been killed by furmark. It's any cards, stock and aftermarket.
> 
> The problem with furmark is it puts extreme unrealistic load on the GPU that could never be duplicated in any game. Reviewers like Tom's can run it all they want. They didn't buy the cards (most of the time) and it doesn't matter if they kill it. But when I've saved up for months just to buy a new GPU, I'm not going to subject it to this kind of torture that could potentially damage or kill it. It doesn't prove anything because any actual game you run will never be that demanding.
> 
> In-game benchmark tests are the best, most realistic and relevant test for GPUs, IMO.


Exactly this. I completely agree with your opinion. I have an R9 290X I tested with OCCT. In any game with all settings maxed out, VRM1 would peak in the low 80s °C. In OCCT, VRM1 quickly hit over 110°C before throttling to save itself.

There's no way I'll be running anything like this on my GTX 1070, for two reasons: 1) No VRM sensors, 2) Absolutely nothing to gain from doing so.

Not only that, your card may appear stable under one of these stress tests but then still crash from a bad OC in-game anyway.


----------



## gtbtk

Quote:


> Originally Posted by *foxmagic*
> 
> Anyone got the new zotac 1070 amp rom? cant get the zotac flashing tool working.


Amp extreme 1633Mhz

https://www.techpowerup.com/vgabios/187899/187899

Amp+ 1607Mhz

https://www.techpowerup.com/vgabios/187937/187937


----------



## philhalo66

How do you even flash Pascal? I know the 700 series had some kind of PLX chip; doesn't Pascal have that too?


----------



## GnarlyCharlie

Quote:


> Originally Posted by *Caveat*
> 
> Thanks for the info. But i ordered already the ASUS STRIX-GTX1070-8G-GAMING. I don.t know why, but i like the Asus software. Probably because it is easy to use...


I have a Strix 1070 and that Asus software just didn't work for me, Windows 10 machine. The problem I had was that no matter what I tried, I couldn't get the program to start with Windows. After every cold boot (not re-start) I'd have to manually start the program or the fans on the GPU wouldn't run, and that would cause some serious temps. I was in the shakedown phase, so I was always on the lookout for issues, but it could have been a bad situation if I wasn't right there. I messed with it for about a week before just giving up and installing AB, which I'm more familiar with anyway. And yes, I checked the "start with Windows" boxes or whatever they were called in Settings, all of that.

This machine isn't mine, building it as a gift for non-techie family, and I just couldn't send them a rig that they'd have to remember to start the GPU management software every time - there will be kids involved. But AB fires right up, I have 5 OC profiles established with nice fan curves, should be pretty painless.

The card is great, absolutely no issues with the hardware part of the purchase. In fact I like it quite a bit and wouldn't mind having one for myself.


----------



## gtbtk

Quote:


> Originally Posted by *philhalo66*
> 
> How do you even flash pascal? I know the 700 series had some kind of PLX chip doesn't pascal have that too?


Do this at your own risk: it can potentially brick the card, though that is recoverable if you have another GPU (an iGPU or a second card) you can boot from as primary.

To flash a card manually:

1. Get the current version of the nvflash.exe utility.

2. Get hold of the BIOS you want to flash to your card; techpowerup has a database of VGA BIOSes. Copy it into the same directory as nvflash.

3. Disable the card in Device Manager.

4. In an admin command prompt, type the command (without the quotes, changing the ROM name to your file name) "nvflash -6 vgabiosfilename.rom" and hit Enter.

5. Answer Y to the prompts and it will flash the BIOS to the card.

6. Re-enable the card in Device Manager.

7. Reboot your PC.

The -6 flag overrides the compatibility check and allows you to cross-flash a BIOS from a different model card if you wish.
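To tie those steps together, here's a minimal Python sketch of the same sequence. The executable name and the .rom filename are placeholders, the Device Manager disable/re-enable steps stay manual, and it must be run from an elevated console:

```python
import subprocess
from pathlib import Path

def build_flash_cmd(rom: str, override_check: bool = True) -> list[str]:
    """Build the nvflash command line; -6 skips the board compatibility check."""
    cmd = ["nvflash.exe"]
    if override_check:
        cmd.append("-6")
    cmd.append(rom)
    return cmd

def flash(rom: str) -> None:
    """Run nvflash on a BIOS image (nvflash itself asks the Y/N prompts)."""
    rom_path = Path(rom)
    if not rom_path.is_file():
        raise FileNotFoundError(f"BIOS image not found: {rom}")
    subprocess.run(build_flash_cmd(str(rom_path)), check=True)

print(build_flash_cmd("vgabiosfilename.rom"))
```

Only a sketch of the steps described above, not an official tool; cross-flashing with `-6` carries the same brick risk either way.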


----------



## philhalo66

Quote:


> Originally Posted by *gtbtk*
> 
> Do this at your own risk. It can potentially brick the card which can be recovered if you have another gpu (igpu of a different card) that you can boot from as a primary
> 
> To flash a card manually, you get the current version of a utility called nvflash.exe
> 
> you get hold of the bios you want to flash to your card. techpowerup has a database of vga bioses. copy it into the same directory as the nvflash utility
> 
> you disable the card in device manager
> 
> in an admin command prompt type the command without the quotes change the rom name to your file name "nvflash -6 vgabiosfilename.rom" and hit enter.
> 
> answer Y to the prompts and it will flash the bios to the card.
> 
> re-enable the card in device manager
> 
> reboot your PC.
> 
> The -6 flag will override the compatibility check and allow you to cross flash a bios from a different model card if you wish


Oh i have no intention of flashing my card. I was just wondering.


----------



## owikhan

Quote:


> Originally Posted by *gtbtk*
> 
> Amp extreme 1633Mhz
> 
> https://www.techpowerup.com/vgabios/187899/187899
> 
> Amp+ 1607Mhz
> 
> https://www.techpowerup.com/vgabios/187937/187937


No official BIOS release:
https://www.zotac.com/us/files/download/by_product?p_nid=558498&driver_type=238&os=All


----------



## gtbtk

Quote:


> Originally Posted by *owikhan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Amp extreme 1633Mhz
> 
> https://www.techpowerup.com/vgabios/187899/187899
> 
> Amp+ 1607Mhz
> 
> https://www.techpowerup.com/vgabios/187937/187937
> 
> 
> 
> No officialy bios relase
> https://www.zotac.com/us/files/download/by_product?p_nid=558498&driver_type=238&os=All
Click to expand...

He said that he could not get the Zotac update utility to work


----------



## watermanpc85

Quick question, guys... I've just installed my new 1070 Gaming G1. All seems to be working fine and it boosts to 1927MHz without touching anything, but do I need to install the Gigabyte XTREME GAMING software to set the card in "OC" mode? I have MSI Afterburner installed, but I'm not sure if I must install that software just to enable some extra OC capability.

thanks!


----------



## BulletSponge

Quote:


> Originally Posted by *watermanpc85*
> 
> Fast question guys...I have just installed my new 1070 Gaming G1, all seems to be working fine and boost to 1927Mhz without touching anything but I have a question, do I need to install the Gigabyte XTREME GAMING software to set the card in "OC" mode???? I have MSI Afterburner installed but not sure if I MUST install that soft just to ¿enable? some more OC capability??
> 
> thanks!


No, AB is all you need.


----------



## zipper17

Quote:


> Originally Posted by *MEC-777*
> 
> In-game benchmark tests are the best, most realistic and relevant test for GPUs, IMO.


I don't entirely agree with this one.

So what about 3DMark etc.? Are they useless too? (Firestrike, Time Spy, the 3DMark stress test, etc.)

Personally I use 3DMark for benching and stress testing.
I run the benchmark and then the stress test, especially after overclocking.
At some point it tells the truth about whether your card is stable or not. After that, go play some games.


----------



## gtbtk

Quote:


> Originally Posted by *watermanpc85*
> 
> Fast question guys...I have just installed my new 1070 Gaming G1, all seems to be working fine and boost to 1927Mhz without touching anything but I have a question, do I need to install the Gigabyte XTREME GAMING software to set the card in "OC" mode???? I have MSI Afterburner installed but not sure if I MUST install that soft just to ¿enable? some more OC capability??
> 
> thanks!


"OC" mode is just a small overclock to Core and Ram. It does not turn on any hidden features of the card so AB can do the OC for you in a profile anyway.


----------



## gtbtk

Quote:


> Originally Posted by *philhalo66*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Do this at your own risk. It can potentially brick the card which can be recovered if you have another gpu (igpu of a different card) that you can boot from as a primary
> 
> To flash a card manually, you get the current version of a utility called nvflash.exe
> 
> you get hold of the bios you want to flash to your card. techpowerup has a database of vga bioses. copy it into the same directory as the nvflash utility
> 
> you disable the card in device manager
> 
> in an admin command prompt type the command without the quotes change the rom name to your file name "nvflash -6 vgabiosfilename.rom" and hit enter.
> 
> answer Y to the prompts and it will flash the bios to the card.
> 
> re-enable the card in device manager
> 
> reboot your PC.
> 
> The -6 flag will override the compatibility check and allow you to cross flash a bios from a different model card if you wish
> 
> 
> 
> Oh i have no intention of flashing my card. I was just wondering.
Click to expand...

I'll remember that next time you ask a question.


----------



## MEC-777

Quote:


> Originally Posted by *zipper17*
> 
> Not really exactly agree with this one.
> 
> so how about 3dmark etc, they are useless too? (Firestrike, Timespy, 3dmark stress test etc?)
> 
> Personally i use 3dmark Test for Bench & Stress testing.
> I Run benchmark and run the Stress test, especially after overclocking.
> At same point it telling the truth if your card stable or not. After that then go play some games.


3DMark is fine because it puts stress on the card, but not overkill, unrealistic stress. It's a good benchmark, and so are Heaven and Valley, because everyone can run them for free or at very little cost and they're a good way to compare systems.

I stress test my overclocks by first seeing if they can make it through 3DMark Firestrike. If it makes it through that, I then run Valley loops for at least 30 mins. Then I run the most demanding games I have (TW3) for a good hour or more as a final test. If it's stable through all of that, there's no need to subject the card to FurMark and potentially damage or kill it.


----------



## DeathAngel74

Yep. Tw3 and any of the battlefield games are a good enough stability test after realbench. Sw:bf 2015 stresses pretty well too.


----------



## Caveat

Quote:


> Originally Posted by *GnarlyCharlie*
> 
> I have a Strix 1070 and that Asus software just didn't work for me, Windows 10 machine. The problem I had was that no matter what I tried, I couldn't get the program to start with Windows. After every cold boot (not re-start) I'd have to manually start the program or the fans on the GPU wouldn't run, and that would cause some serious temps. I was in the shakedown phase, so I was always on the lookout for issues, but it could have been a bad situation if I wasn't right there. I messed with it for about a week before just giving up and installing AB, which I'm more familiar with anyway. And yes, I checked the "start with Windows" boxes or whatever they were called in Settings, all of that.
> 
> This machine isn't mine, building it as a gift for non-techie family, and I just couldn't send them a rig that they'd have to remember to start the GPU management software every time - there will be kids involved. But AB fires right up, I have 5 OC profiles established with nice fan curves, should be pretty painless.
> 
> The card is great, absolutely no issues with the hardware part of the purchase. In fact I like it quite a bit and wouldn't mind having one for myself.


I have the same problem here, though... Sometimes I can't start the software. And I set it so that the software starts when Windows starts, but it still won't start, so it's really weird. In-game I have the NZXT CAM overlay that keeps info on the CPU, GPU and stuff...


----------



## khanmein

Quote:


> Originally Posted by *Caveat*
> 
> I have the same problem here tho... Sometimes i can not start the software. And i turned it on so that if windows starts, the software starts... But it still will not start. so it is really weird. Ingame i have the NZXT CAM overlay that keeps info on the cpu, gpu and stuff...


Remove NZXT CAM. That software is pretty buggy!


----------



## Caveat

Quote:


> Originally Posted by *khanmein*
> 
> remove the NZXT CAM. the s/w is pretty buggy!


To be honest, I have no problems with it; it's working fine. And I found a solution for the Asus GTX software: turn on starting the software when Windows starts, but do not allow it to start minimized. Now it's working fine for me.


----------



## khanmein

Quote:


> Originally Posted by *Caveat*
> 
> To be honoust i have no problems with it. Working fine. Found a suluyion for the Asus Gtx software. Turn on that you want to start the software up when windows starts, but do not allow it to start it minimalized. Now its working fine for me


alright. cheers.


----------



## GnarlyCharlie

Quote:


> Originally Posted by *Caveat*
> 
> To be honoust i have no problems with it. Working fine. Found a suluyion for the Asus Gtx software. Turn on that you want to start the software up when windows starts, but do not allow it to start it minimalized. Now its working fine for me


Ah, I never tried to start it non-minimized. Thanks for the tip!

But I really think AB will be better suited for what I'm trying to do here. If they learn the ins and outs of GPU overclocking and want to try the Asus software, they can load it up then.


----------



## Caveat

Quote:


> Originally Posted by *GnarlyCharlie*
> 
> Ah, I never tried to start it non-minimized. Thanks for the tip!
> 
> But I really think AB will be better suited for what I'm trying to do here. If they learn the ins and outs of GPU overclocking and want to try the Asus software, they can load it up then.


No problem







Yeah, for overclocking AB is better, but I don't overclock, so I'll stick with the standard software. Sometimes it changes things by itself, though, like the fan setup and the gaming/standard OC mode.


----------



## philhalo66

I'm actually really liking the new EVGA Precision X


----------



## watermanpc85

Quote:


> Originally Posted by *BulletSponge*
> 
> No, AB is all you need.


Quote:


> Originally Posted by *gtbtk*
> 
> "OC" mode is just a small overclock to Core and Ram. It does not turn on any hidden features of the card so AB can do the OC for you in a profile anyway.


Thanks, both of you!!

Btw, this is what I've done so far with my 1070 Gaming G1 (Samsung memory):

- Raised the TDP and temp limits to the max (111% and 92°C).

- Upped the memory 100MHz at a time, and right now my card seems stable at +650MHz (maybe more, but I think that's "enough"). However, I've found some weird behaviour here: Firestrike crashes (well, it tells me the "user" cancelled the bench, which is obviously not the case) if I raise the voltage, even with the memory at +400MHz. As soon as I set the voltage back to +0 it finishes the bench no problem. So raising core voltage makes it crash because of memory??? Also, not a single memory artifact so far. Why is that happening?

- Raised the GPU core +115MHz, which gives me 2050MHz stable. BUT, another weird thing: as soon as the core goes over 2076MHz I get a lot of artifacting, even at max voltage (voltage slider maxed, which applies 1.09v if I'm correct). It seems really bad that I get core artifacts at max voltage at just 2076MHz. Is that to be expected?

Also, the TDP limit is sometimes being hit (not often, but sometimes), so the best balance I could find is:

- +25% core voltage.
- +115MHz core, which gives 2050 boost most of the time but drops to 2025 sometimes.
- +625MHz mem.
- Temps between 57 and 62°C max with a custom fan curve.

Due to the size of the thread I can't read all the details, so I'd love it if someone could help me increase the TDP limit, if that's possible for this card. Also, what do you guys think about my card's behaviour? Artifacts at 1.09v and 2088 core seems very low to me...

3D mark:

http://www.3dmark.com/3dm/16571401

Heaven:



thanks guys!!


----------



## GeneO

Quote:


> Originally Posted by *philhalo66*
> 
> I'm actually really liking the new EVGA Precision X


I like the curve editor better than Afterburner's. It has discrete steps and you get what you set, unlike AB, which has a mind of its own when you apply changes.


----------



## philhalo66

Quote:


> Originally Posted by *GeneO*
> 
> I like the curve editor better than Afterburner's. It has discrete steps and you get what you set, unlike AB, which has a mind of its own when you apply changes,


I don't ever bother with my 1070's fan; the auto profile keeps it under 60°C. But on my last card I did notice it kept resetting itself randomly; I thought it was my card.


----------



## GeneO

Quote:


> Originally Posted by *philhalo66*
> 
> I don't ever bother with my 1070's fan the auto profile keeps it under 60C But on my last card i did notice it kept resetting itself randomly i thought it was my card.


I meant the frequency voltage curve, not the fan rpm temp curve.


----------



## gtbtk

Quote:


> Btw, this is what I have done by now with my 1070 Gaming G1 (samsung memory):
> 
> -TDP and Temp rised to the max (111% and 92º)
> 
> -Upped mem 100 by 100 Mhz, and right now, it seems like my card is stable at +650Mhz (dont know if probably more but I think its "enough"). However, I have found a weird behaviour here...Firestrike crashes (well, it tells me that the "user" has cancelled the bench, which is obviously not the case) if I rise the voltage even with the memory at +400Mhz?¿?¿...as soon as I set voltage to +0v it finish the bench no problems, so if I rise core voltage then crash due to memory???... Also, not a single artifact due to memory by now. Why is that happening??
> 
> -GPU core rised to +115Mhz which gives me 2050Mhz stable, BUT, another weird thing...as soon as the core is over 2076Mhz, I get a lot of artifacting, even at max voltage (voltage slider to the max, which applies 1,09v if Im correct). I find it very bad Im getting core artifacts at max voltage just at 2076Mhz, is that expectable??...
> 
> Also, TDP is sometimes reaching limits (no many times, but some times) so the best balance I could find is:
> 
> - +25% core voltage.
> - +115Mhz core, which gives 2050 boost most of the time but drops at 2025 sometimes
> - +625Mhz mem.
> - Temps are between 57/62º max with custom fan curve.
> 
> DUe to the size of the thread I cant read all the details so I would love if someone can help me to increase the TDP limit if its possible for this card...also, what do you think guys about the behaviour of my card??, artifacts at 1,09v at 2088 core seems very low to me...
> 
> 3D mark:
> 
> http://www.3dmark.com/3dm/16571401
> 
> Heaven:
> 
> thanks guys!!


Understand that the card's performance is not a single point on the voltage curve; the card operates all along the curve. Moving the voltage slider up increases how much of the curve, going to the right, your card can access.

How far you can overclock each voltage point above the baseline before it crashes varies from point to point. On my 1070, the voltage points between about 1v and 1.043v will not overclock as far as the points in the 1.050-1.093v range of the curve.

When you overclock with the slider, the fixed-shape curve moves all the points up together, and when it hits the voltage point with the least OC headroom, the card crashes. The potential extra performance at the other points on the curve, the ones not yet at their limit, ends up being wasted.

OCing with the curve instead of the slider is trickier, because you have to identify where the weak points are, but you can let the curve dip at the point that limited the slider and raise the points on either side of that dip to use the extra OC potential elsewhere on the curve.

If you open the curve window in AB and move the slider, you will see how the slider moves the entire curve up and down.

With that in mind, it sounds like the points in the high range of your card's curve, such as 1.081v, have crossed into the zone where that point tries to clock higher than the GPU can cope with. When you increase the voltage, the card moves along the curve until it tries to operate at a point in that "too much" zone, and it crashes.

The other thing is not to get hung up on the OC numbers; concentrate on maximizing frame rates. Some cards will produce fewer frames with the memory at +650 than at +500. You need to experiment to see how different memory overclocks affect your particular card.
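The slider-vs-curve idea can be sketched in a few lines of Python. This is an illustrative model only; the voltages, stock clocks and per-point headroom numbers below are made up, not read from any real card:

```python
# Toy model of a GPU Boost 3.0 frequency/voltage curve: each point has its own
# maximum stable offset ("headroom"). A global slider offset is capped by the
# weakest point; per-point curve editing keeps the rest of the headroom.
curve = [
    # (voltage V, stock clock MHz, max stable offset MHz at this point)
    (1.000, 1936, 115),
    (1.043, 1974, 115),   # weakest points: anything past +115 crashes here
    (1.050, 1987, 165),
    (1.093, 2012, 165),
]

# Slider OC: one offset moves every point up together, so the minimum wins.
slider_limit = min(headroom for _, _, headroom in curve)

# Curve OC: each point is raised to its own individual limit.
per_point = [(v, f + h) for v, f, h in curve]

print(f"max global slider offset: +{slider_limit}MHz")
for v, clk in per_point:
    print(f"{v:.3f}v -> {clk}MHz")
```

Here the 1.043v point caps a slider overclock at +115MHz, while point-by-point editing lets the 1.050-1.093v range keep its full +165MHz, which is exactly the wasted headroom described above.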


----------



## DennyCorsa86

I ordered the 1070 HALL OF FAME card today; hoping it puts up good scores xD


----------



## gtbtk

Quote:


> Originally Posted by *DennyCorsa86*
> 
> I have order the 1070 HALL OF FAME card today, i will hope it running scores xD


Nice card. Not sure it will compare well with 2x Titan X cards, though.


----------



## SHiMZ

I've had my 1070 Strix O8G for 2-3 months and it's a beast for 1440p.
But I have a 1440p 144Hz monitor (MG278Q), so I've ordered a second 1070 Strix O8G with the ROG SLI bridge, and they're arriving in the next 10 hours.




It's not my first Strix; I also have a second O8G in another rig and tried SLI (with the old SLI bridge from the 970 Strix) in 3DMark.

The first setup is my second rig, with a stock 4690K (turbo boost only, 3.9GHz) and a VII Ranger.
The second is my current rig, with the 4690K at 4.5GHz and a Z97 Gaming 7.

http://www.3dmark.com/compare/spy/845769/spy/713873#

I can't try 3-way because my motherboard doesn't support 3-way SLI; I have both the 2-way and 3-way SLI bridges and an 850W Toughpower DPS G RGB from Thermaltake.
Edit: it does support 3-way, but it's not compatible with my bridge configuration.

I'll change my motherboard when the 7700K comes out, for a VIII Formula or IX Formula (I have an open-air Thermaltake Core P3), but both have the same layout as the Gaming 7, so it's not possible without buying 6x old bridges.

I'm sorry for my basic English.


----------



## reb0rn

Quote:


> Originally Posted by *GeneO*
> 
> I like the curve editor better than Afterburner's. It has discrete steps and you get what you set, unlike AB, which has a mind of its own when you apply changes,


I'm using 1070s in mining rigs with five cards per system. Do you know if EVGA Precision X can apply a curve editor (voltage/frequency) setting to all identical cards in the system at once?
MSI Afterburner's curve editor works on only one card; you must edit each card one by one and save the settings in 5 profiles for 5 cards... and I can do that over TeamViewer.


----------



## GeneO

Quote:


> Originally Posted by *reb0rn*
> 
> I am using 1070 as mining rigs, they have five cards per system, do you know if evga precesion X can set curve editor (voltage / freq) to all same cards in system at once?
> MSI afterburner curve editor works only on one card and you must edit each one by one and set in in 5 profiles for 5 cards.... and I can do that over team viwer


Nope, don't know. Going to stick with Afterburner myself though, just used to it.


----------






## ucode

Quote:


> Originally Posted by *DennyCorsa86*
> 
> I have order the 1070 HALL OF FAME card today, i will hope it running scores xD


Nice. Would you mind confirming whether operation above 1.1v is possible using the Galax Extreme Tuner Plus? You might see this either through increased power consumption at the same clocks, or through higher clocks being possible.


----------



## TeslaHUN

I have a GTX 1070 Inno3D with an EK FC waterblock. I'm running the card at a 2150MHz GPU clock and +300MHz on the memory (120% TDP limit in MSI AB).
Anyone here got a watercooled card? How far can you overclock? I'm too lazy to test myself.


----------



## SHiMZ

1070 Strix OC (O8G) SLI


----------



## Dude970

Looks great


----------



## rfarmer

Quote:


> Originally Posted by *SHiMZ*
> 
> 1070 Strix OC (O8G) SLI
> 
> 
> Spoiler: Warning: Spoiler!


Looks good, what case is that?


----------



## asdkj1740

Quote:


> Originally Posted by *TeslaHUN*
> 
> I have one GTX1070 Inno3D +EK FC waterblock. Im using the card on 2150 mhz GPU clock +300mhz on memory. (120% TDP limit on MSI AB)
> Any1 here got watercooled card ? How far can u overclock ? Im lazy to test myself


Which 1070 from Inno3D? There are the FE, Twin, X3, X4 and Black.
The X3, X4 and Black have a six-phase power design.


----------



## SHiMZ

Quote:


> Originally Posted by *rfarmer*
> 
> Looks good, what case is that?


Thermaltake Core P3 Black


----------



## TeslaHUN

Quote:


> Originally Posted by *asdkj1740*
> 
> which 1070 from inno3d? there are fe, twin, x3, x4 and black.
> x3 x4 and black has six phases power design.


I have ichill x4


----------



## zipper17

Up until now I still don't fully understand Pascal's overclocking logic, the GPU Boost 3.0 logic, and every combination of the curve/offset methods.

Whenever I attempt to reach a 21K Firestrike graphics score, sometimes it passes the test/bench, but sometimes it fails/crashes.

Core chip lottery?

I'd love it if someone posted a full theoretical guide on how to overclock Pascal with the curve/offset.

I set up two different curves.
Both have the same core speed, mem speed and voltage while running Firestrike,
but the results are different, even though they run at the same core/mem/voltage frequencies.

Example:

Curve A:

*Offset +0
1.093 +140*

Curve B:

*Offset +51
1.093 +140*

As you can see, curves A and B both run constantly at ~2151-2138MHz core, 4600MHz (9.2GHz effective) memory and 1.093v during the Firestrike test.

But in real performance, curve B gives much better FPS than curve A, if you know what I mean.
What's the logic behind this? Is curve A running a fake 2151MHz?


----------



## asdkj1740

Quote:


> Originally Posted by *zipper17*
> 
> Up until now I'm still not really fully understand The Pascal Overclock logic, GPU Boost 3.0 logic, and every bit combination of curve/offset method.
> 
> Whenever I attempt to reach 21K Firestrike Graphic scores, sometime will pass the test/bench, but sometime it will fail/crash.
> 
> Core chip lottery?
> 
> I would hope someone post some full theoretically a guide or something how to overclock with curve/offset Pascal.
> 
> I set up a different Curve methods,
> both curve still has the same core speed, mem speed, and voltage while running Firestrike.
> But the result is different, although it running at same speed of core/mem/voltage frequencies.
> 
> here Example:
> 
> Curve A:
> 
> *Offset +0
> 1.093 +140*
> 
> Curve B:
> 
> *Offset +51
> 1.093 +140*
> 
> as you can see Curve A & B,
> 
> Curve A & B has same speed constantly running at ~2151-2138MHZ, 4600MHZ(9.2 GHZ), and voltage 1.093 during Firestrike Test.
> 
> But for Real performance; Curve B will provide a much better FPS than Curve A. If you know what i mean.
> Where is the logic behind this?Curve A is running a fake performances of 2151MHZ ??


What you are doing with the curve is right; raising just the 1.09v point is enough.
You should try to lock the voltage at 1.09v during gaming; maybe the voltage fluctuates too much while gaming.

You can also try 1.08v; my card works even better at 1.08v than at 1.09v.


----------



## asdkj1740

Quote:


> Originally Posted by *TeslaHUN*
> 
> I have ichill x4


this is great. you can try other high power limit bios to access more power.


----------



## MEC-777

Finally, my 1070 FE came in yesterday.







Hit a 2000/9000 OC (+150/+500) no problem. It's also surprisingly quiet for a blower card. I set up a custom fan curve, so even in the most demanding games it never cracks 75°C and still stays very quiet. Much prefer the sound of a blower fan over axial fans: much smoother, kind of like background white noise.

Overall very happy with it.









Also, I got lucky, it has Samsung memory, so no BIOS flashing needed.


----------



## zipper17

Quote:


> Originally Posted by *asdkj1740*
> 
> what you are doing with the curve is rigth, just raise the 1.09v point is enough.
> you should try to lock the voltage at 1.09v during gaming. maybe it is because the voltage is too fluctuated during gaming.
> 
> you can also try the 1.08v, my card works even better at 1.08v compared to 1.09v.


With both curves the voltage is locked at 1.093 V; it never dropped.

But curve A performs worse than curve B in the Firestrike graphics tests, even though both curves run at the same core/mem/voltage speeds (monitored via MSI AB). It's not fluctuating, IMO.

So offset +51 combined with +140 at 1.093 V gives better performance?

Fan speed was at 100% when I tested.

Maybe someone else could try it and see what is happening.


----------



## TeslaHUN

Heaven:
GPU 2150 MHz


Firestrike:

21k graphics score, not even close to a 1080


----------



## philhalo66

Quote:


> Originally Posted by *TeslaHUN*
> 
> Heaven
> GPU 2150mhz
> 
> 
> Firestrike
> 
> 21k graphics score , not even close to 1080


Well, of course you won't come close. The 1080 has 2560 CUDA cores to the 1070's 1920, which is a huge difference, plus the 1080 has a much higher texture fill rate and GDDR5X. Nothing you can do will bring you close to a 1080 except LN2 runs; NVIDIA made sure of this.


----------



## watermanpc85

Quote:


> Originally Posted by *gtbtk*
> 
> Understand that the cards performance is not a single point on the voltage curve but the card operates all along the curve in parallel. Adjusting the voltage slider up will increase the amount of the curve going to the right that your card can access.
> 
> The amount you can overclock each of the different voltage points above the baseline before it crashes is variable. On My 1070, the voltage points between about 1v and 1.043v will not overclock as much as as the points in the 1.050 - 1.093V range of the curve.
> 
> When you overclock with the slider, the fixed shape curve will move all the points up together and when it hits the voltage point with the lowest OC headroom, it will cause the card to crash. The potential extra performance that can be found at other points on the curve that are not hitting the low point of OC potential ends up being wasted.
> 
> If you OC with the curve instead of the slider, It is more tricky because you have to identify where the low points are, but you can adjust the curve to dip under the low point you hit with the slider and increase the points on either side of that dip to make use of the extra OC potential at the other points on the curve.
> 
> If you open the curve window in AB and move the slider, you will see how the slider moves the entire curve up an down.
> 
> With that in mind, It sounds like the points on the curve on your card, in the high range, such as 1.081v have crossed over into the zone where that point is trying to overclock higher than the gpu can cope. When you increase the voltage it moves the cards performance along the curve until it tries to operate at the point on the curve in that "too much" zone and it crashes.
> 
> The other thing is not to get hung up on the OC numbers, you need to concentrate on maximizing frame rates. There are cards that will produce less frames with the memory set to +650 then it will with the memory at +500. You need to experiment with your card to see how the different levels of memory overclock effects your particular card.


Thanks for your help mate!!









Anyway, I'm not sure that explains why the card crashes due to memory OC when I raise the voltage a fixed percentage, or whether it's expected to get heavy artifacting from a core OC at 2076+ MHz even at max voltage (1.09 V)... maybe I didn't understand correctly what you explained.

Btw, I'm getting what I think is really good performance (20900 Fire Strike and 105.7 Heaven) at only 2050 core and 4622 mem... looks like the memory OC is giving me as big a performance increase as the core OC for some reason. Also, no artifacts or crashes; in fact, my card hasn't crashed A SINGLE TIME YET.

...so I guess I should leave it as it is now, right? What do you think? 2050 (dropping to 2038 sometimes due to TDP) and 4622 mem...


----------



## Caveat

Question: my Strix 1070 is not going 100% into the PCI-E slot because the "white lock thing" is blocking the card. Is this a major problem? Sometimes I get a blue screen, then I run sfc /scannow and everything works fine. Could this be the reason for that? If yes, how do I remove the PCI-E lock without destroying my mobo?


----------



## benjamen50

What is the blue screen error code / debug code being displayed? Because that usually tells us what hardware / drivers / software may be at fault.


----------



## Caveat

Quote:


> Originally Posted by *benjamen50*
> 
> What is the blue screen error code / debug code being displayed? Because that usually tells us what hardware / drivers / software may be at fault.


I posted a screenshot before. The blue screens started after I swapped my 290Xs in CrossFire for a single Strix 1070 and reinstalled Windows. Also, Windows can't find any updates anymore.


----------



## Caveat

Wow, my phone is tripping. I don't know how many times I've sent this now, haha.


----------



## philhalo66

Quote:


> Originally Posted by *Caveat*
> 
> 
> Question. My Strix 1070 is not going 100% in the pci slot because the "white lock thing" is blocking the card. Is this a major problem? Sometimes i get a blue screen, than i do sfc/scannow and everything works fine. But could this the reason for that? If yes, how do i remove the pci lock thing without destroying my mobo?


You should be able to remove the PCI-E lock. Most just pop off with hardly any force, but check yours first.


----------



## Caveat

Ok, thank you. I will try that if I have some time.


----------



## NErdgOd56

Guys, I feel like I've read about 200 pages of this and searched all through it, and I still can't seem to find the answers I'm looking for.

Here's what I know:

I have the Gigabyte GTX 1070 G1 Gaming 8G with Micron GDDR5.

The current BIOS available from Gigabyte does not flash; however, the wrapper software the manufacturer uses reports that it flashed successfully.

After trying multiple times, the actual nvflash terminal box that appears briefly shows something like "board is incompatible with firmware version [alphanumeric string]" before closing abruptly and incorrectly reporting a successful flash.

The board currently has the F10 (F11 beta?) 86.04.26.00.56 firmware on it.

From what I've read, here are the questions I have:

Can you just flash any Pascal-chip firmware onto any card as long as the memory brand is the same?

Has a way been found to remove the power limit of the board?

I'm not necessarily worried about overclocking, but I read some mentions of a "micron fix". Do I need to worry about this?

And finally, is there a way to debrand or rebrand my card, so I no longer have to deal with Gigabyte's... general... weirdness?

I'm not a graphics newb, but I am an NVIDIA newb (due to lack of funding, not preference), and I'm extremely satisfied with this card. I can provide any information I may have missed in this post if necessary.

Thanks in advance for any advice and answers.


----------



## Blackfirehawk

Hi folks, I picked up a Gainward GTX 1070 in the Black Friday sale for about 380€ (400 US dollars),
and I wonder what the difference is between these two:
http://www.gainward.com/main/vgapro.php?id=987&lang=de
http://www.gainward.com/main/vgapro.php?id=986&lang=de
My card has Micron memory. I can OC the card +175 core and +550 MHz memory at a power target of +12% and don't get temps over 64 degrees Celsius; the boost clock is about 2000-2025.
What would happen if I flashed the GLH BIOS onto it? Would it be more stable to overclock, or should I stick with the standart BIOS for my card?

I know the GLH has a TDP of 170 W and the standart only has 150 W.


----------



## NErdgOd56

Quote:


> Originally Posted by *Blackfirehawk*
> 
> standart


#1 forum pet peeve, just sayin'


----------



## Blackfirehawk

Quote:


> Originally Posted by *NErdgOd56*
> 
> #1 form pet peeve just sayin


Sorry, my English isn't the best because it's not my native language.








I mean the Gainward stock GTX 1070 vs the GTX 1070 GLH.


----------



## Dan-H

Quote:


> Originally Posted by *Blackfirehawk*
> 
> Hi folks, I Pick up a Gainwand GTX 1070 on Black Fridays Sale for about 380€ (400 US Dollar)
> and i Wonder Whats the Difference between these Two..
> http://www.gainward.com/main/vgapro.php?id=987&lang=de
> http://www.gainward.com/main/vgapro.php?id=986&lang=de
> my Card have Micron Memory i can OC the Card + 175 Core and +550mhz Memory on a Powertarget +12% and didn´t get temps over 64 degree Celsius boostclock is about 2000-2025
> What would happen if i Flash the GLH bios on it? would it be more stable to overclock? or should i stick with the standart Bios for my card
> 
> i know the GLH has a TDP of 170W and the standart only has 150w


The GLH is the performance-boosted version.

Check out one of the reviews. Here's one:

https://us.hardware.info/reviews/6856/gainward-geforce-gtx-1070-phoenix-glh-review-sizeable-performance

If reading it in English isn't working for you, try using the Google Chrome browser and its "translate" feature.


----------



## Caveat

Quote:


> Originally Posted by *philhalo66*
> 
> you should be able to remove the PCI-E lock. most just pop off without hardly any force but make sure first.


Found it. It's called a PCI retention clip or something.


----------



## Blackfirehawk

Quote:


> Originally Posted by *Dan-H*
> 
> the GLH is the performance boosted version.
> 
> checkout one of the reviews. Here's one:
> 
> https://us.hardware.info/reviews/6856/gainward-geforce-gtx-1070-phoenix-glh-review-sizeable-performance
> 
> If reading it in English isn't working for you, try using Google Chrome browser and the "translate" feature.


I can read English very well, but I do have some problems writing it.

My question is:
what happens if I flash the GLH BIOS onto the stock Gainward GTX 1070?
Would I gain some benefit overclocking my card? Can I easily flash that BIOS onto the stock GTX 1070?

Or should I stick with the stock GTX 1070 BIOS and be happy with my +175 core and +550 memory?


----------



## watermanpc85

Quote:


> Originally Posted by *NErdgOd56*
> 
> guys, i feel like ive read about 200 pages of this and searched all through it and i still cant seem to find the answers im looking for.
> 
> heres what i know:
> 
> i have the Gigabyte GTX 1070 G1 gaming X8g with micron gddr5
> 
> the current bios available from gigabyte does not flash, however the wrapper software the manufacture uses reprts that it flashes sucessfully
> 
> after trying multiple times the actual nvflash terminal box that appears flashes something like "board is incompatible with firmware version [alphanumeric string]" before closing abruptly and incorrectly reporting a successful flash
> 
> the board currently has the f10 (f11beta?) 86.04.26.00.56 firmware on it
> 
> from what ive read, here are the questions i have:
> 
> can you just flash any pascal chip firmware on any card as long as the memory brand is the same?
> 
> has a way been found to remove the power limit of the board?
> 
> im not necessarily worried about overclocking, but i read some mentions of a "micron fix" do i need to worry about this?
> 
> and finally is there a way to debrand, or rebrand my card, so i no longer have to deal with gigabytes.... general..... weirdness?
> 
> in not a graphics newb, but i am a nvidia newb (due to lack of funding not preference.) and am extremely satisfied with this card. i can provide any information i may have missed in this post if necessary
> 
> thanks in advance for any advice and answers


I'm also very interested in flashing a BIOS with an increased TDP... I hope someone can help us.


----------



## NErdgOd56

Quote:


> Originally Posted by *watermanpc85*
> 
> Im also very interested in flashing a bios with increased TDP...hope someone can help us...


I made a new thread over here: http://www.overclock.net/t/1618201/confused-1070-owner

Bump it, yo.


----------



## NErdgOd56

Quote:


> Originally Posted by *Caveat*
> 
> Found it. It is called pci Retention clip or something


In my experience these come off rather easily. Pry more toward one side than the other; they're usually held in by little pins on the sides.


----------



## philhalo66

Quote:


> Originally Posted by *Caveat*
> 
> Found it. It is called pci Retention clip or something


Yeah, I always called it a PCI-E lock, haha. Hopefully yours pops off without any issue and it solves your problems. Good luck!


----------



## gtbtk

Quote:


> Originally Posted by *watermanpc85*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Understand that the cards performance is not a single point on the voltage curve but the card operates all along the curve in parallel. Adjusting the voltage slider up will increase the amount of the curve going to the right that your card can access.
> 
> The amount you can overclock each of the different voltage points above the baseline before it crashes is variable. On My 1070, the voltage points between about 1v and 1.043v will not overclock as much as as the points in the 1.050 - 1.093V range of the curve.
> 
> When you overclock with the slider, the fixed shape curve will move all the points up together and when it hits the voltage point with the lowest OC headroom, it will cause the card to crash. The potential extra performance that can be found at other points on the curve that are not hitting the low point of OC potential ends up being wasted.
> 
> If you OC with the curve instead of the slider, It is more tricky because you have to identify where the low points are, but you can adjust the curve to dip under the low point you hit with the slider and increase the points on either side of that dip to make use of the extra OC potential at the other points on the curve.
> 
> If you open the curve window in AB and move the slider, you will see how the slider moves the entire curve up an down.
> 
> With that in mind, It sounds like the points on the curve on your card, in the high range, such as 1.081v have crossed over into the zone where that point is trying to overclock higher than the gpu can cope. When you increase the voltage it moves the cards performance along the curve until it tries to operate at the point on the curve in that "too much" zone and it crashes.
> 
> The other thing is not to get hung up on the OC numbers, you need to concentrate on maximizing frame rates. There are cards that will produce less frames with the memory set to +650 then it will with the memory at +500. You need to experiment with your card to see how the different levels of memory overclock effects your particular card.
> 
> 
> 
> Thanks for your help mate!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway, Im not sure that explains why is the card crashing due to memory OC when I rise the voltage a fixed percentage or if its expectable to get heavy artifacting due to core OC at 2076+Mhz even at max voltage (1,09v)...maybe I didnt undertand correctly what you explained but not sure....
> 
> Btw, Im getting pretty much what I think is a really good performance (20900 Fire strike and 105,7 Heaven) at only 2050 Core and 4622 Mem...looks like memory OC is giving me as good performance increase as core OC for some reason, also, no artifacts nor crashes...in fact, my card hasnt even crashed A SINGLE TIME YET
> 
> 
> 
> 
> 
> 
> 
> ...so I guess I should leave it as its now right?? what do you think?? 2050 (dropping to 2038 some times due to TDP) and 4622 mem...
Click to expand...

Mine crashes if I OC too high, too. With a given setting, the crash always happens when the temperature hits a certain value, GPU Boost tries to move the primary point reported by AB down to the next voltage point, and the clock at that point happens to be above the limit for that voltage level that I mentioned above.
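That crash mode can be sketched as a toy model: the card downshifts to a lower voltage point as temperature rises, and it is the overclocked clock at that lower point, not the peak, that has to be stable (all numbers below are invented for illustration):

```python
# Toy model of a temperature-triggered downshift crash. CURVE holds the
# overclocked clock (MHz) at each voltage point; STABLE_LIMIT holds the
# maximum the silicon can actually sustain at that voltage. Invented numbers.
CURVE        = {1.093: 2151, 1.062: 2100, 1.043: 2088}
STABLE_LIMIT = {1.093: 2164, 1.062: 2113, 1.043: 2076}

def downshift_ok(to_v):
    """True if dropping to this voltage point stays within its own limit."""
    return CURVE[to_v] <= STABLE_LIMIT[to_v]

print(downshift_ok(1.062))  # True: 2100 MHz fits under the 2113 MHz limit
print(downshift_ok(1.043))  # False: 2088 MHz exceeds 2076, so it crashes here
```

So a curve can run fine at the peak for minutes and still crash the moment heat forces GPU Boost down onto a weaker point.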


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Up until now I'm still not really fully understand The Pascal Overclock logic, GPU Boost 3.0 logic, and every bit combination of curve/offset method.
> 
> Whenever I attempt to reach 21K Firestrike Graphic scores, sometime will pass the test/bench, but sometime it will fail/crash.
> 
> Core chip lottery?
> 
> I would hope someone post some full theoretically a guide or something how to overclock with curve/offset Pascal.
> 
> I set up a different Curve methods,
> both curve still has the same core speed, mem speed, and voltage while running Firestrike.
> But the result is different, although it running at same speed of core/mem/voltage frequencies.
> 
> here Example:
> 
> Curve A:
> 
> *Offset +0
> 1.093 +140*
> 
> Curve B:
> 
> *Offset +51
> 1.093 +140*
> 
> as you can see Curve A & B,
> 
> Curve A & B has same speed constantly running at ~2151-2138MHZ, 4600MHZ(9.2 GHZ), and voltage 1.093 during Firestrike Test.
> 
> But for Real performance; Curve B will provide a much better FPS than Curve A. If you know what i mean.
> Where is the logic behind this?Curve A is running a fake performances of 2151MHZ ??


The card actually accesses all voltage points along the curve in parallel. Performance is determined by the area under the whole curve, not by the highest point on the curve reported by AB.

While the part of the card operating at 1.093 V can reach that 2151 MHz high point, the potential performance at the other voltage levels is still restricted to the values along the default curve.
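To make the area-under-the-curve idea concrete, here is a toy model of the two curves being compared (voltage points and clocks are invented, not read from a real card): both peak at the same 2151 MHz at 1.093 V, but curve B also lifts every lower point by the +51 offset:

```python
# Toy GPU Boost 3.0 voltage/frequency curve: voltage point (V) -> clock (MHz).
# Numbers are illustrative only. The card slides between these points as
# temperature and power limits bite, so delivered FPS depends on the whole
# curve, not just the peak clock Afterburner reports.
BASE_CURVE = {0.975: 1860, 1.000: 1899, 1.043: 1949, 1.062: 2000, 1.093: 2011}

def apply_offset(curve, offset_mhz):
    """The global offset slider: shifts every point up by the same amount."""
    return {v: mhz + offset_mhz for v, mhz in curve.items()}

def avg_clock(curve):
    """Crude proxy for 'area under the curve': mean clock over all points."""
    return sum(curve.values()) / len(curve)

# Curve A: stock curve, only the 1.093 V point raised by +140 MHz.
curve_a = dict(BASE_CURVE)
curve_a[1.093] += 140  # peak = 2151 MHz

# Curve B: +51 MHz global offset, then the 1.093 V point set to the same peak.
curve_b = apply_offset(BASE_CURVE, 51)
curve_b[1.093] = curve_a[1.093]  # same 2151 MHz peak as curve A

print(avg_clock(curve_a), avg_clock(curve_b))  # B averages higher
```

The average over the points is a crude stand-in for "area under the curve", but it shows why two curves with identical peaks and identical reported clocks can deliver different FPS.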


----------



## cutty1998

I have to say, I am finally so happy with my Asus Strix 1070! Seeing as I am still running an ancient Ivy Bridge platform and an old-school Yamakasi Catleap monitor, this seems like the perfect card for Ivy Bridge at 1440p gaming! It just took Asus two months to get me an actual working card. I just hope it lasts as long as my reference-model GTX 980, which gave me 2+ years of flawless gaming bliss!


----------



## Caveat

Quote:


> Originally Posted by *NErdgOd56*
> 
> in my experience these come off rather easily. pry more toward one side than the other theyre usually held in by little pins on the side


Quote:


> Originally Posted by *philhalo66*
> 
> Yeah i always called it a PCI-E lock haha. hopefully yours pops off without any issue and it solves your problems. Good luck


I tried the card in another PCI-E slot, and it seems to fit more easily in this slot. The lock didn't block the card going in, and the lock clicked into the card, so I'm not sure why or what. Hopefully this solves it. Tomorrow (my gf and son are sick, so I can't do it tonight) I'll try a fresh install of Windows, because when I booted up the PC it didn't give the right resolution right away.


----------



## philhalo66

Quote:


> Originally Posted by *Caveat*
> 
> I tried the card in another pci-e slot and it seems that it fit more easy in this slot. The lock doesn't hold the card for going in and it tge lock locked into the card. So i am not sure why or what. Hopefully this solves it. Tomorrow (my gf and son are sick so i can't do it tonight) i try a fresh install of windows, because when i booted up the pc it didn't give the right resolution rightaway.


Oh, that's normal. When you change PCI-E slots it changes the hardware IRQ, so Windows needs to reinstall the driver; let it go ahead and you should be good to go.


----------



## Caveat

Quote:


> Originally Posted by *philhalo66*
> 
> Oh thats normal when you change PCI-E Slots it changes the Hardware IRQ so windows needs to reinstall the driver, let it go ahead and you should be good to go.


Yeah, but my Windows can't find updates, so it's f*cked up anyway. I need to do a reinstall regardless.


----------



## philhalo66

Quote:


> Originally Posted by *Caveat*
> 
> Ye but my windows can't find uodates. So it is f*cked up anyways. So i need to do an reinstall anyways.


That's a known bug. I'm not sure about the fix for Windows 8 updates, but for the NV driver, go to the NVIDIA folder on your C: drive and keep clicking until you see the setup files; that will fix the GPU for you. It should look something like this, but replace the driver number and change Windows 10 to 8 or whatever: *C:\NVIDIA\DisplayDriver\375.95\Win10_64\International*
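If you would rather not click through the folders by hand, a throwaway script can list the extracted driver setup folders (the path layout follows the example above; the root path and folder names are assumptions, so adjust them to your system):

```python
import glob
import os

# List extracted NVIDIA driver setup folders under the given root, newest
# driver version first. Assumed layout: <root>\<version>\<os>\International
def find_driver_setups(root=r"C:\NVIDIA\DisplayDriver"):
    pattern = os.path.join(root, "*", "*", "International")
    return sorted(glob.glob(pattern), reverse=True)

for folder in find_driver_setups():
    print(folder)
```

Run the installer's setup.exe from the newest folder it prints, or pass a different root if your driver extracts elsewhere.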


----------



## Caveat

Quote:


> Originally Posted by *philhalo66*
> 
> That's a known bug. not sure about the fix for windows 8 updates but for the NV driver go to the nvidia folder in your C drive and keep clicking till you see the setup files that will fix the GPU for ya. it should look something like this but replace the driver number and change windows 10 to 8 or whatever *C:\NVIDIA\DisplayDriver\375.95\Win10_64\International*


Ok thank you. I will try that tomorrow.


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> the card actually accesses all voltage points along the curve in parallel. The amount of performance is measured by the area under the whole curve. not by the highest point on the curve reported by AB.
> 
> While the card process that is operating at 1.093V can reach that 2151 high point the rest of the potential performance at the other voltage levels is being restricted to the values along the default curve


Yup, it really confuses me, see? It's kinda strange.

Is 2151 MHz with the other voltage points on a lowered curve running a "fake" 2151 MHz performance?
When I set up a 2151 MHz curve with the overall curve points higher, the performance at 2151 MHz is better.

What's the logic? Same speed, different result, because of a different curve?

I'm pretty sure both tested curves ran the same core/mem/voltage speeds through custom Firestrike graphics tests 1 & 2 the whole time, but the performance is different; the FPS results are different.

Anyway, my main concern is why my GPU crashes so often near a 21K Firestrike graphics score. It is not really rock stable; I still haven't found the best settings for my card.


----------



## zipper17

Quote:


> Originally Posted by *NErdgOd56*
> 
> guys, i feel like ive read about 200 pages of this and searched all through it and i still cant seem to find the answers im looking for.
> 
> heres what i know:
> 
> i have the Gigabyte GTX 1070 G1 gaming X8g with micron gddr5
> 
> the current bios available from gigabyte does not flash, however the wrapper software the manufacture uses reprts that it flashes sucessfully
> 
> after trying multiple times the actual nvflash terminal box that appears flashes something like "board is incompatible with firmware version [alphanumeric string]" before closing abruptly and incorrectly reporting a successful flash
> 
> the board currently has the f10 (f11beta?) 86.04.26.00.56 firmware on it
> 
> from what ive read, here are the questions i have:
> 
> can you just flash any pascal chip firmware on any card as long as the memory brand is the same?
> 
> has a way been found to remove the power limit of the board?
> 
> im not necessarily worried about overclocking, but i read some mentions of a "micron fix" do i need to worry about this?
> 
> and finally is there a way to debrand, or rebrand my card, so i no longer have to deal with gigabytes.... general..... weirdness?
> 
> in not a graphics newb, but i am a nvidia newb (due to lack of funding not preference.) and am extremely satisfied with this card. i can provide any information i may have missed in this post if necessary
> 
> thanks in advance for any advice and answers


As I remember, I'm pretty sure some guys over here posted about that when they tried a different BIOS on their cards.

Some BIOSes will not work, for example the Galax HOF BIOS on another brand's card.

Also, be cautious with a BIOS that has a higher power limit; your card's VRM design might be lower quality than that of the card the BIOS came from.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> the card actually accesses all voltage points along the curve in parallel. The amount of performance is measured by the area under the whole curve. not by the highest point on the curve reported by AB.
> 
> While the card process that is operating at 1.093V can reach that 2151 high point the rest of the potential performance at the other voltage levels is being restricted to the values along the default curve
> 
> 
> 
> yup it confused me really, see? it's kinda strange
> 
> 2151mhz with other voltage at lowered curve point is running a fake performances of 2151mhz?
> When i setup a curve 2151mhz with overall higher curve point, performances of 2151mzh is better.
> 
> What's the Logic? same speed different result because of different curve?
> 
> I'm pretty sure both tested curves has same core/mem/voltage speed running a custom Graphic test 1 & 2 Firestrike for the whole time, but the performances is different, FPS result is different.
> 
> Anyway, my main concern is why my gpu got many crashing when near a 21K Graphic Scores Firestrike, It is not really Rockstable. I'm still not get the best settings for my card.
Click to expand...

No, the 2151 is not fake, but it is being described/reported in a misleading way. The performance is all about the area below the curve, not the curve's peak. As you increase access to the higher voltages, the curve opens up more, increasing the area under it. Try using your best curve but modify the 0.975 V point: increase it to 1999 MHz, 2012 MHz and above, and see if that helps your performance.

BTW, most people can't get 21000 in FS. If you can't quite make it, don't sweat it; you are not alone.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NErdgOd56*
> 
> guys, i feel like ive read about 200 pages of this and searched all through it and i still cant seem to find the answers im looking for.
> 
> heres what i know:
> 
> i have the Gigabyte GTX 1070 G1 gaming X8g with micron gddr5
> 
> the current bios available from gigabyte does not flash, however the wrapper software the manufacture uses reprts that it flashes sucessfully
> 
> after trying multiple times the actual nvflash terminal box that appears flashes something like "board is incompatible with firmware version [alphanumeric string]" before closing abruptly and incorrectly reporting a successful flash
> 
> the board currently has the f10 (f11beta?) 86.04.26.00.56 firmware on it
> 
> from what ive read, here are the questions i have:
> 
> can you just flash any pascal chip firmware on any card as long as the memory brand is the same?
> 
> has a way been found to remove the power limit of the board?
> 
> im not necessarily worried about overclocking, but i read some mentions of a "micron fix" do i need to worry about this?
> 
> and finally is there a way to debrand, or rebrand my card, so i no longer have to deal with gigabytes.... general..... weirdness?
> 
> in not a graphics newb, but i am a nvidia newb (due to lack of funding not preference.) and am extremely satisfied with this card. i can provide any information i may have missed in this post if necessary
> 
> thanks in advance for any advice and answers
> 
> 
> 
> As i remember, pretty sure some guys overhere posted about that, they try a different bios for their card.
> 
> Some bios will not work, example: galax hof bios on other brand card.
> 
> And you also be caution with BIOS that has higher powerlimit, your VRM design card might be lower quality than the original actual card.
Click to expand...

@NErdgOd56

I am the serial 1070 cross flasher here. 

Not all BIOS files will flash to all cards. The Galax HOF cards, for example, use a different voltage controller from just about everyone else's, and their BIOS will brick a more standard card.

It is not safe to assume an 86.04.26 BIOS means Micron memory. Some Gigabyte cards have Samsung memory with the .26 BIOS installed as stock; you need to check for yourself with GPU-Z. You may find Samsung memory, and I think there is a different .50 BIOS update utility for Samsung cards from Gigabyte as well. I don't know why they do that, because the core parts of a given BIOS generation that affect the memory controller are exactly the same for both Samsung and Micron cards and are default NVIDIA code. The .50 BIOS will fix a bug that causes instability with Micron memory chips.

What Gigabyte weirdness are you talking about?

You can flash other brands' BIOSes to these cards; however, you need to understand the VRM design. Obviously, at a given power limit, an 8-phase card will have less power going through each phase than a 5-phase card. Different card models' BIOSes have different power limits as well. Anything you do is at your own risk and can potentially brick the card; that is recoverable for the most part, but no guarantees. It will also void your warranty if you get caught by Gigabyte, so if you ever do have to RMA the card, flash it back to stock before you send it in.

I do not have a Gigabyte card; mine is an MSI with an 8+6-pin power input and an 8-phase VRM similar to the G1's, but listed with a higher 230-291 W power limit. I have successfully flashed G1 and Xtreme BIOSes to my Gaming X, but neither worked as well as the original MSI BIOS on the MSI card; the Asus BIOS actually seems to work pretty well on MSI cards. They seem to be OK to cross-flash, but I have no idea what experience you would have. Different card/BIOS combinations interact differently, and high power limits do not always mean better performance.

If you want to try a cross flash, make sure you back up your original BIOS first. As a first attempt, I suggest you look at the ASUS Strix OC BIOS. Like the G1, it has 8-phase power and an 8-pin power input with a listed 200 W power limit, so power levels should be similar. You will get a boost in base clock to 1633 MHz from your default 1595.

To flash the card you will need to get hold of the latest NVflash utility and the ROM file, both of which you can obtain from techpowerup.com.

BIOSes from cards that have 2x8-pin or 8+6-pin power inputs can potentially damage your card, as they can overload the installed MOSFETs, which may not have the capacity for the extra power that, say, a Zotac AMP Extreme can pump through the card.

Unfortunately, that is the only way you can modify the operation of 1070 cards; there is currently no way to edit Pascal BIOS files.
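For anyone taking the manual-flash route, the sequence is always: back up first, then flash. Here is a small sketch of the two command lines involved (the `--save` and `-6` switches follow commonly cited nvflash usage; verify both against your own nvflash version's help output before running anything):

```python
# Build, but do not run, the nvflash command lines for a backup-then-flash
# sequence. "--save" dumps the current BIOS to a file; "-6" is the commonly
# cited switch that overrides the PCI subsystem ID mismatch check when
# cross-flashing another vendor's ROM. Verify both against your nvflash.
def nvflash_plan(new_rom, backup_path="stock_backup.rom"):
    return [
        ["nvflash", "--save", backup_path],  # always back up the stock BIOS
        ["nvflash", "-6", new_rom],          # flash, overriding the ID check
    ]

for cmd in nvflash_plan("strix_oc.rom"):
    print(" ".join(cmd))
```

Run the printed commands from an administrative command prompt, and keep the backup ROM somewhere safe in case you need to flash back to stock for an RMA.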


----------



## AliasOfMyself

Quote:


> Originally Posted by *NErdgOd56*
> 
> guys, i feel like ive read about 200 pages of this and searched all through it and i still cant seem to find the answers im looking for.
> 
> heres what i know:
> 
> i have the Gigabyte GTX 1070 G1 gaming X8g with micron gddr5
> 
> the current bios available from gigabyte does not flash, however the wrapper software the manufacture uses reprts that it flashes sucessfully
> 
> after trying multiple times the actual nvflash terminal box that appears flashes something like "board is incompatible with firmware version [alphanumeric string]" before closing abruptly and incorrectly reporting a successful flash
> 
> the board currently has the f10 (f11beta?) 86.04.26.00.56 firmware on it
> 
> from what ive read, here are the questions i have:
> 
> can you just flash any pascal chip firmware on any card as long as the memory brand is the same?
> 
> has a way been found to remove the power limit of the board?
> 
> im not necessarily worried about overclocking, but i read some mentions of a "micron fix" do i need to worry about this?
> 
> and finally is there a way to debrand, or rebrand my card, so i no longer have to deal with gigabytes.... general..... weirdness?
> 
> in not a graphics newb, but i am a nvidia newb (due to lack of funding not preference.) and am extremely satisfied with this card. i can provide any information i may have missed in this post if necessary
> 
> thanks in advance for any advice and answers


You're not the only one having issues with Gigabyte BIOS files not flashing. I have the Xtreme Gaming, and I can't get the latest BIOS to flash to my card either. What motherboard and CPU do you run with your card?


----------



## gtbtk

> You're not the only one having issues with Gigabyte bios files not flashing.. i have the Xtreme Gaming, and i can't get the latest bios to flash to my card either. What motherboard and CPU do you run with your card?


Exactly what is the error message/on screen message that you are getting when you try to flash the bios?


----------



## AliasOfMyself

Quote:


> Originally Posted by *gtbtk*
> 
> Exactly what is the error message/on screen message that you are getting when you try to flash the bios?


Exactly the same as the guy i just quoted.



Right after NVFlash quits I get the wrapper software from Gigabyte telling me the BIOS flash was successful. There's literally no reason it shouldn't be working; I even tried it on my partner's rig and got the same result.


----------



## gtbtk

Quote:


> Originally Posted by *AliasOfMyself*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Exactly what is the error message/on screen message that you are getting when you try to flash the bios?
> 
> 
> 
> Exactly the same as the guy i just quoted.
> 
> 
> 
> Right after NVflash quits i get the wrapper software from Gigabyte telling me the bios flash was successful. There's literally no reason it shouldn't be working, i even tried it on my partners rig and got the same result.
Click to expand...

There appears to be a bug in the installer wrapper they have provided. Did you try running the installer from an administrative command prompt?

You can do the bios update manually if you want. You need to download

Latest version of NVflash from here

https://www.techpowerup.com/downloads/2850/nvflash-5-328-0-for-windows

A copy of the extracted 1070 xtreme gaming bios

https://www.techpowerup.com/vgabios/187291/187291

or Gaming G1 F11 bios for G1 Micron Cards as appropriate

https://www.techpowerup.com/vgabios/187267/187267

The process is quite simple:

1. Open Device Manager and disable your graphics card.
2. Copy the BIOS file for your card into the same directory as the extracted nvflash files and rename it to something short, such as 1070.rom.
3. Open an administrator CMD prompt and change to the nvflash directory you created.
4. Run the command nvflash -6 1070.rom
5. Type Y at the prompts and it will flash your card.
6. Upon getting an "update successful" message, re-enable your graphics card in Device Manager and then reboot your PC.

Job done
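For reference, the whole thing from the elevated CMD prompt looks roughly like this (the C:\nvflash path is just an example location, and the backup step is my own suggestion, not required):

```bat
:: Run from an administrator CMD prompt, after disabling the GPU in
:: Device Manager. Paths and filenames here are examples only.
cd /d C:\nvflash

:: Optional but sensible: save a copy of the current BIOS first
nvflash --save backup.rom

:: Flash the new BIOS; -6 lets the flash proceed even if the PCI
:: subsystem ID in the ROM doesn't match the board
nvflash -6 1070.rom
```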


----------



## AliasOfMyself

Quote:


> Originally Posted by *gtbtk*
> 
> There appears to be a bug in the installer wrapper they have provided. Did you try running the installer from an administrative command prompt?
> 
> You can do the bios update manually if you want. You need to download
> 
> Latest version of NVflash from here
> 
> https://www.techpowerup.com/downloads/2850/nvflash-5-328-0-for-windows
> 
> A copy of the extracted 1070 xtreme gaming bios
> 
> https://www.techpowerup.com/vgabios/187291/187291
> 
> or Gaming G1 F11 bios for G1 Micron Cards as appropriate
> 
> https://www.techpowerup.com/vgabios/187267/187267
> 
> The process is quite simple
> 
> Open device manager and disable your graphics card
> 
> copy the bios file for your card into the same directory as where you put extracted nvflash and rename it to a short name such as 1070.rom
> 
> open an administrator CMD prompt and change to the nvflash directory you created
> 
> run the command nvflash -6 1070.rom
> 
> type Y at the prompts and it will flash your card
> 
> Upon getting an "update successful" message, re-enable your graphics card in device manager and then reboot your PC
> 
> Job done


Yeah, I ran it as administrator. I've got Gigabyte looking into it. I personally think it's part of the sub-vendor ID bug I have, which is why I asked what motherboard and CPU the other guy has.

I'll hold off on a manual flash for now; the dual-BIOS nature of my card is putting me off. I'll give Gigabyte another week or two, and if they don't have an answer I'll either flash it manually or RMA it lol.


----------



## c0nsistent

Has anyone done the shunt resistor trick to increase power limits? I'm tempted to do so, although I have an EVGA SC version and not a Founder's edition. I'm hitting the power limits as is typical and that seems like a quick fix... has anyone tried this?


----------



## gtbtk

Quote:


> Originally Posted by *AliasOfMyself*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> There appears to be a bug in the installer wrapper they have provided. Did you try running the installer from an administrative command prompt?
> 
> You can do the bios update manually if you want. You need to download
> 
> Latest version of NVflash from here
> 
> https://www.techpowerup.com/downloads/2850/nvflash-5-328-0-for-windows
> 
> A copy of the extracted 1070 xtreme gaming bios
> 
> https://www.techpowerup.com/vgabios/187291/187291
> 
> or Gaming G1 F11 bios for G1 Micron Cards as appropriate
> 
> https://www.techpowerup.com/vgabios/187267/187267
> 
> The process is quite simple
> 
> Open device manager and disable your graphics card
> 
> copy the bios file for your card into the same directory as where you put extracted nvflash and rename it to a short name such as 1070.rom
> 
> open an administrator CMD prompt and change to the nvflash directory you created
> 
> run the command nvflash -6 1070.rom
> 
> type Y at the prompts and it will flash your card
> 
> Upon getting an "update successful" message, re-enable your graphics card in device manager and then reboot your PC
> 
> Job done
> 
> 
> 
> Yeah i ran it as administrator, I've got Gigabyte looking into it. I personally think it's part of the sub vendor ID bug I have, which is why I asked what motherboard and CPU the other guy has.
> 
> I'll hold off on a manual flash for now, the dual bios nature of my card is putting me off. I'll give Gigabyte another week or two and if they don't have an answer I'll either flash it manually or rma it lol.
Click to expand...

I just noticed that you have an AMD motherboard.

That bug causes the board to fail to report the device ID to CPUID, and it will be the cause of your problem. That is not just a Gigabyte problem; it's a chipset issue that AMD should be fixing.

MSI brand AMD motherboards have the same bug.


----------



## philhalo66

Quote:


> Originally Posted by *gtbtk*
> 
> I just noticed that you have an AMD motherboard.
> 
> That bug fails to report the device ID to cpuid and will be the cause of your problem. That is not just a gigabyte problem but is a chipset issue that AMD should be fixing.
> 
> MSI brand AMD motherboards have the same bug.


MSI brand AMD motherboards have a much more serious problem than VGA BIOS flashing: their VRMs are complete garbage on the AMD side and tend to go up in smoke with or without an OC.


----------



## gtbtk

Quote:


> Originally Posted by *philhalo66*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I just noticed that you have an AMD motherboard.
> 
> That bug fails to report the device ID to cpuid and will be the cause of your problem. That is not just a gigabyte problem but is a chipset issue that AMD should be fixing.
> 
> MSI brand AMD motherboards have the same bug.
> 
> 
> 
> MSI brand AMD motherboards have a much more serious problem than VGA BIOS flashing. they're VRM's are complete garbage on the AMD side and tend to go up in smoke with or without an OC.
Click to expand...

No need to attack anyone here. I am not proposing an MSI board, and we are not talking about VRMs of any brand. I am just saying that it exhibits the same bug the Gigabyte board does. I dare say that AMD boards from other manufacturers are the same as well.


----------



## philhalo66

Quote:


> Originally Posted by *gtbtk*
> 
> No need to attack anyone here. I am not proposing an MSI board and we are not talking about VRMs of any brand. I am just saying that it exhibits the same bug that the gigabyte board does. I dare say that AMD boards from other manufacturers are the same as well


nobody is attacking you. I was just stating a fact.


----------



## benjamen50

Quick question, is it fine to have EVGA Precision X OC and MSI Afterburner installed at the same time?


----------



## philhalo66

Quote:


> Originally Posted by *benjamen50*
> 
> Quick question, is it fine to have EVGA Precision X OC and MSI Afterburner installed at the same time?


Yeah, I have those 2 and the gigabyte software installed


----------



## AliasOfMyself

Quote:


> Originally Posted by *philhalo66*
> 
> MSI brand AMD motherboards have a much more serious problem than VGA BIOS flashing. they're VRM's are complete garbage on the AMD side and tend to go up in smoke with or without an OC.


Far be it from me to be an AMD fangirl, but isn't it down to the motherboard manufacturer to decide which voltage regulators to use? AMD just make the chips, not the boards. Oh, and I've never had an AMD board die on me, especially not a VRM-based failure. MSI boards are maybe an exception when it comes to AMD, but still, no need for that behaviour just because I don't have an Intel-based board, dude. Besides, mine's a Gigabyte board.


----------



## AliasOfMyself

Quote:


> Originally Posted by *gtbtk*
> 
> I just noticed that you have an AMD motherboard.
> 
> That bug fails to report the device ID to cpuid and will be the cause of your problem. That is not just a gigabyte problem but is a chipset issue that AMD should be fixing.
> 
> MSI brand AMD motherboards have the same bug.


That's exactly what I'm thinking. Is this just with Nvidia cards though? Not had this issue come up with any AMD card I've had, the RX 480 I had before my 1070 had a bios update and it flashed fine.


----------



## philhalo66

Quote:


> Originally Posted by *AliasOfMyself*
> 
> Far be it for me to be an AMD fan girl, but isn't it down to the motherboard manufacturer to decide which voltage regulators to use? AMD just make the chips, not the boards. Oh and I've never had an AMD board die on me, especially VRM based failure. No need in your behaviour just because I don't have an Intel based board dude.


What are you even talking about? When did I once mention Intel? I stated a fact! MSI AMD boards are junk and have a very high fail rate; that has nothing to do with Intel, and it's only MSI that has such a high fail rate, too. Most other brands are fine if you buy fairly decent ones. Not sure how me stating a fact (look up the VRM failure database on here, 99% are MSI boards) equates to me insulting someone and being an Intel fanboy at the same time. Just because I criticize something doesn't mean I'm a fanboy for the other team.


----------



## AliasOfMyself

Quote:


> Originally Posted by *philhalo66*
> 
> what are you even talking about? when did i once mention intel? I stated a fact! MSI AMD boards are junk and have a very high fail rate, nothing to do with intel it's only MSI that that has such a high fail rate too. most other brands are fine if you buy fairly decent ones. Not sure how me stating a fact (look up VRM failure database on here 99% are MSI boards) equates to me insulting someone and being an intel fanboy at the same time. Just because i criticize something doesn't mean im a fanboy for the other team


did you read my edit?







I avoid MSI-based AMD boards because of their sketchy practices with them. Take their 990FX Gaming board: it's a rubbish 6+2 phase design that would struggle to run my CPU at default boost clocks. I went with ASRock over them because at the time there wasn't anything better from Asus or Gigabyte. I wasn't accusing you of being a fanboy, I just didn't get why you suddenly needed to say what you did in the first place; nobody had even mentioned me having an MSI AMD motherboard.


----------



## philhalo66

Quote:


> Originally Posted by *AliasOfMyself*
> 
> did you read my edit?
> 
> 
> 
> 
> 
> 
> 
> I avoid MSI based AMD boards because of their sketchy practices with them, take their 990FX Gaming board, it's a rubbish 6+2 phase design that would struggle to run my cpu on default boost clocks, I went with ASRock over them because at the time there wasn't anything better from Asus or gigabyte. I wasn't accusing you of being a fan boy, I just didn't get why you suddenly needed to say what you did in the first place, nobody even mentioned me having an MSI AMD motherboard


It was more of a "yeah, that's true, but MSI will never fix it, they have bigger fish to fry, such as the insanely high fail rates on the AMD side." I was mostly just stating a fact; it wasn't a jab at anyone. Yeah, anything less than 8+2 for an 8xxx series is going to cause problems because of the power consumption. I used to be an AMD user till my PSU fried my Sabertooth 990FX board (I forget which one), and I found my current CPU+MB for $225 on here a few years ago.


----------



## AliasOfMyself

Quote:


> Originally Posted by *philhalo66*
> 
> it was more of a "yeah that's true but msi will never fix it, they have bigger fish to fry such as the insanely high fail rates on the AMD side." I was mostly just stating a fact it wasn't a jab at anyone. Yeah anything less than 8+2 for an 8xxx series is going to cause problems because of the power consumption. I used to be an AMD user till my PSU fried my sabertooth 990FX board (i forget which one) and i found my current CPU+MB for $225 on here a few years ago.


They don't even list what the phase design is for their flagship AMD board on their website







I had to do a bit of searching to find out. I did ring them to try to get an answer; they said they'd get back to me and never did, lol. I always go for an 8+2 setup when I'm looking for a motherboard, so MSI will never be a choice for me when it comes to an AMD-based setup. What PSU took out your motherboard? That had to suck really hard! The ASRock board I had was nice, til it started dying just under a year after I bought it: it would randomly power itself on and off really quickly until I cut the power at the PSU. It happened every so often when I shut Windows down, so back to Gigabyte I went, and I'm glad I did; for an aging chipset they did really well with the 990FX-Gaming.


----------



## philhalo66

Quote:


> Originally Posted by *AliasOfMyself*
> 
> They don't even list what the phase design is for their flagship AMD board on their website
> 
> 
> 
> 
> 
> 
> 
> I had to do a bit of searching to find out, i did ring them to try to get an answer, they said they'd get back to me and never did lol. I always go for an 8+2 setup when i'm looking for a motherboard, so MSI will never be a choice for me when it comes to an AMD based setup. What PSU took out your motherboard? That had to suck really hard! The ASRock board i had was nice, til it started dying just under a year after i bought it, it would randomly power itself on and off really quickly, til i cut the power from the PSU. It happened every so often when i shut windows down, so back to Gigabyte i went, i'm glad i did, for an aging chipset they did really well with the 990FX-Gaming.


lol, they know their boards are junk and they're trying to cover it up. Have a look at this; it says exactly why they're hiding it.

It was a Seasonic 620W. Guess I got "lucky" and got a lemon. The best part was it died Christmas morning 2013, and I had to wait about a week just for ASUS to deny my RMA on the grounds it wasn't the board's fault.







And Seasonic told me to take a hike; they don't care. I've had a ton of ASRock boards and they were all junk for me, but I was always overclocking on 4+2 phase systems lol, so it was my fault they failed so quickly.

I never had a Gigabyte system till this one, and it's been rock solid for 2 years and change even with extreme overclocking (5GHz+). But their graphics card RMA is a nightmare from what I keep seeing, so I won't be buying any more Gigabyte cards.


----------



## benjamen50

Quote:


> Originally Posted by *philhalo66*
> 
> I never had a gigabyte system till this one and it's been rock solid for 2 years and change even with extreme overclocking (5GHz+) But their graphics card RMA is a nightmare from what i keep seeing so i wont be buying anymore gigabyte cards.


I've RMA'd a faulty Gigabyte Windforce 3X GTX 780 OC that was almost out of its 3-year warranty (through the retailer/store, who handled the RMA for me), and they ended up giving me a GTX 780 Ti with Samsung memory. Guess I got lucky? The reason I still get Gigabyte motherboards is that I'm friends with a guy who works at Gigabyte, and I haven't run into DOA or faulty boards so far.


----------



## philhalo66

Quote:


> Originally Posted by *benjamen50*
> 
> I've RMA'd a faulty Gigabyte windforce 3x GTX 780 oc almost out of warranty from 3 years (Through retailer / store to get them to RMA it for me) and they ended up giving me a GTX 780ti with samsung memory. Guess I got lucky? Well the reason I get gigabyte motherboards still is because I'm friends with a guy that works in gigabyte and I haven't ran into DOA, faulty boards for now.


Hmm, maybe when I go SLI in March I will get another Gigabyte card then. Kinda torn between that or selling my current card and getting an EVGA 1080 FTW. Just very wary about spending 840 dollars on 2 cards if the RMA support sucks, you know?


----------



## benjamen50

Quote:


> Originally Posted by *philhalo66*
> 
> Hmm maybe when i go SLI in march i will get another gigabyte card then. Kinda torn between that or selling my current card and getting an EVGA 1080 FTW. just very weary about spending 840 dollars on 2 cards if the RMA support sucks you know?


I currently use an EVGA GTX 1070 ACX 3.0 FTW. I'm sticking with EVGA for graphics cards from now on, even though I had stuck with Gigabyte for GPUs for years.

For motherboards I'm sticking with Gigabyte, because I like how their CPU voltage control works and their boards have most of the settings most people would use, compared to other manufacturers, IMO.


----------



## philhalo66

Quote:


> Originally Posted by *benjamen50*
> 
> I currently use a EVGA GTX 1070 ACX 3.0 FTW. I'm sticking with EVGA for graphics cards from now on even though I have sticked with Gigabyte for GPU's for years.
> 
> For motherboards I'm sticking with Gigabyte because I like how their CPU voltage control works and has most the settings most people would use compared to other manufacturers IMO.


Did you get in contact with EVGA about the free thermal pad upgrade? EVGA 10x0 ACX 3.0 cards didn't come with any thermal pads on the VRMs, so they can overheat, but EVGA is giving out free thermal pad upgrades to fix it.


----------



## benjamen50

Quote:


> Originally Posted by *philhalo66*
> 
> Did you get in contact with EVGA about getting the free thermal pad upgrade? EVGA 10x0 ACX 3.0 cards didnt come with any thermal pads on the VRM's so they overheat but EVGA is giving out free thermal pads upgrades to fix it.


Already done the thermal pad upgrade and the GPU vBIOS fan profile for the EVGA thermal mod, thanks for the heads up though. I also got the PowerLink adapter.


----------



## philhalo66

Quote:


> Originally Posted by *benjamen50*
> 
> Already done the thermal pad upgrade and GPU vBIOS fan profile for EVGA thermal mod, thanks for the heads up though. Also additionally got the power link adapter too.


Excellent







I'm so tempted to get the 1070 FTW Hybrid; those super low temps look pretty darn nice to me.


----------



## owikhan

Which card should I buy, one with Micron memory chips or Samsung?


----------



## GeneO

Well, Samsung is probably the safer bet, but it's not like you get a choice.


----------



## owikhan

Quote:


> Originally Posted by *GeneO*
> 
> Well probably samsung is a safer bet, but it is not like you get a choice.


I do have a choice: a Zotac 1080 AMP Edition with Micron memory for 650 USD, or a Zotac 1080 AMP Edition with Samsung memory for 710 USD.

I hear the Micron memory only has a BIOS issue? Is it true that the issue is resolved when you update the BIOS, or not?

Also, is the Micron memory actually installed on the card as a separate chip?


----------



## MEC-777

Quote:


> Originally Posted by *owikhan*
> 
> i have a choice... micron memory zotach 1080 amp editioo 650USD and Zotach 1080 amp edition Samsung memory 710USD
> 
> i hear micron memory only bios issue?is it true when u update bios issue resolve?or not?
> 
> or in real on card micron memory installed?like a chip?


That's quite the price difference. Where are you looking to buy these from?

It doesn't really matter either way. If you get a Micron card, just flash it with the updated bios. Most people have reported this solved the problems they had. Or, if you don't want to hassle with that, just get a card you know has Samsung memory.


----------



## owikhan

Quote:


> Originally Posted by *MEC-777*
> 
> That's quite the price difference. Where are you looking to buy these from?
> 
> It doesn't really matter either way. If you get a Micron card, just flash it with the updated bios. Most people have reported this solved the problems they had. Or, if you don't want to hassle with that, just get a card you know has Samsung memory.


I live in Pakistan; these are the two cards I can get here.

Thanks for your reply. Is there any chance of something going wrong while flashing the card?

Last question: does a Micron-memory card have any problems with games, display output, etc.?


----------



## philhalo66

Is anyone else having issues with YouTube videos making Chrome freeze with the last two drivers? 375.70 is perfectly fine; it's only the last two.


----------



## MEC-777

Quote:


> Originally Posted by *owikhan*
> 
> i live in Pakistan...here i am getting these two card,s
> 
> Thanks for your reply.Is there any chance to come issue during flash your card?
> 
> last question micron memory card gpu have no problem with games and display etc?


There are risks in flashing the bios, but as long as you do it correctly, there shouldn't be any issues.

The issue with the Micron memory is only really a problem if you're planning to overclock a lot. Even then, some people have reported no issues with overclocking even without the updated BIOS.

Quote:


> Originally Posted by *philhalo66*
> 
> Is anyone else having issues with youtube videos making chrome freeze with the last two drivers? 375.70 is perfectly fine its only the last two.


No issues here. Running like a charm and I never even reinstalled the drivers when switching from the 980 to 1070 (had just updated the drivers the day before). Just swapped them and continued on running as usual.


----------



## TheNoub

Quote:


> Originally Posted by *philhalo66*
> 
> Is anyone else having issues with youtube videos making chrome freeze with the last two drivers? 375.70 is perfectly fine its only the last two.


I am getting the same problem, but it is very intermittent: I can watch a couple hours of videos with no issues, or run into a problem a few seconds into a single video. Might get a fix in the next driver.


----------



## Kreeker

So I bought an EVGA GTX 1070 FTW last night, and after reading some comments on Reddit with people blasting EVGA I'm a little nervous that I made the wrong decision... Is this card bad?


----------



## JAM3S121

Quote:


> Originally Posted by *Kreeker*
> 
> So I bought a EVGA GTX 1070 FTW last night and after reading some comments on reddit with people blasting EVGA I'm a little nervous that I made the wrong decision... Is this card bad?


I dunno, I have one too and it seems rock solid


----------



## philhalo66

Quote:


> Originally Posted by *Kreeker*
> 
> So I bought a EVGA GTX 1070 FTW last night and after reading some comments on reddit with people blasting EVGA I'm a little nervous that I made the wrong decision... Is this card bad?


If you buy it directly from the EVGA site it will have the thermal pad upgrade and the BIOS fix already installed, so you have nothing to worry about. Also, EVGA's customer service is legendary for a reason; if you ever do have a problem they will take care of you.


----------



## Kreeker

Quote:


> Originally Posted by *JAM3S121*
> 
> I dunno, I have one too and it seems rock solid


That makes me feel better. Did you get one with the pads and updated bios?
Quote:


> Originally Posted by *philhalo66*
> 
> if you buy it directly from the EVGA site they will have the thermal pad upgrade and the BIOS fix already installed so you have nothing to worry about. Also the EVGA customer service is legendary for a reason if you ever do have a problem they will take care of you.


So you think if I bought it from Newegg last night it probably won't have the pad upgrade and BIOS fix already installed?

Does the updated BIOS make the card very loud?


----------



## philhalo66

Quote:


> Originally Posted by *Kreeker*
> 
> That makes me feel better. Did you get one with the pads and updated bios?
> So you think if I bought it from Newegg last night it probably won't have the pad upgrade and BIOS fix already installed?
> 
> Does the updated BIOS make the card very loud?


EVGA has a Q&A on their website. According to them, most places such as Newegg etc. should have the fixed cards, and even if you get unlucky, EVGA will give you the thermal pad fix for free; you won't need to pay a dime and it only takes 30 minutes to install. I can't say about the BIOS, but it only brings the fan speed up to 2200 RPM give or take, so you won't hear it at all at idle.

Here is a direct link to their Q&A. It's got a lot of useful information, and I think you should take a look; they worded things better than I did, that's for sure: http://www.evga.com/thermalmod/

*Edit* There is an EVGA rep on OCN too; here is a link to his profile. Feel free to send him a PM, he is better suited to answer any questions you might have, and he's a pretty chill guy too, so don't be shy: http://www.overclock.net/u/264974/evga-jacobf


----------



## gtbtk

Quote:


> Originally Posted by *AliasOfMyself*
> 
> Sav
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I just noticed that you have an AMD motherboard.
> 
> That bug fails to report the device ID to cpuid and will be the cause of your problem. That is not just a gigabyte problem but is a chipset issue that AMD should be fixing.
> 
> MSI brand AMD motherboards have the same bug.
> 
> 
> 
> That's exactly what I'm thinking. Is this just with Nvidia cards though? Not had this issue come up with any AMD card I've had, the RX 480 I had before my 1070 had a bios update and it flashed fine.
Click to expand...

To be honest, I have no idea. I am only aware of the problem because there are posts about it over at the MSI graphics card forum. The last AMD CPU I had anything to do with was back when AMD CPUs were really fast and were leaving Intel Pentium 4 performance in the dust.

I would think the flaw is in reading a section of the PCIe device information that AMD doesn't use to identify the manufacturer of its cards. The bug does not affect the actual usage of the card, only vendor-specific firmware updates. Since graphics cards are, generally speaking, the only slot devices that need vendor-specific firmware upgrades, and AMD and Nvidia are the only two players in town, the bug for the most part likely affects only Nvidia cards. I suspect AMD don't feel it is in their interests to rush out a bug fix any time soon.


----------



## gtbtk

Quote:


> Originally Posted by *GeneO*
> 
> Well probably samsung is a safer bet, but it is not like you get a choice.


Quote:


> Originally Posted by *owikhan*
> 
> which memory chip card should i buy micron or samsung?


Since the bug-fix BIOS updates came out, it doesn't really make much of a difference. The majority of cards with both memory brands will clock to about +550 to +600 over the reference speed, and there are many things other than the graphics card, like the CPU, BCLK overclocks and system voltage settings, that can impact those results. There are some cards that can do +800MHz, but they tend to be silicon-lottery-winning golden examples and are not typical.


----------



## icold

I bought an ASUS ROG Strix GTX 1070. This model runs very cool (excellent cooler), but I only got 2050MHz/4322MHz. My memory is Micron; it doesn't overclock much.


----------



## MEC-777

Quote:


> Originally Posted by *icold*
> 
> I bought a GTX 1070 ASUS Strix ROG, this Model is very COLD ( excellent cooler), I got only: 2050mhz/4322mhz . My memory is Micron, dont ups much.


Have you flashed it with the recent bios update? That should fix the poor memory OC issue.


----------



## gtbtk

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *icold*
> 
> I bought a GTX 1070 ASUS Strix ROG, this Model is very COLD ( excellent cooler), I got only: 2050mhz/4322mhz . My memory is Micron, dont ups much.
> 
> 
> 
> 
> 
> Have you flashed it with the recent bios update? That should fix the poor memory OC issue.
Click to expand...

Further to that: overclocking with the core clock slider alone is never likely to get you into the 2100MHz range; you need to fine-tune the voltage/frequency curve.


----------



## owikhan

Is somewhere ZOTAC 1080 AMP Edition (Micron) Bios Available?


----------



## boostnek9

Here you go.

http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/6300#post_25685869


----------



## usoldier

Hey guys, just want to post my Asus Strix OC 24/7 settings.

Afterburner settings:
Core Voltage +30
Power Limit +120
Core Clock +60
Memory Clock +502

This gives me a 2088MHz core speed and 4514MHz memory speed.

Been running this flawlessly on BF1 for 5 days now; speeds remain constant with no drops and the temp tops out at 56°C. Do you guys think this is good for 24/7?


----------



## MEC-777

Quote:


> Originally Posted by *gtbtk*
> 
> Further to that. Overclocking with the Core clock slider is never likely to get you into the 2100Mhz range, you need to fine tune the curve


Isn't that just two ways to do the same thing in the end? The curve applies an offset, just as the slider does. I've only tried the sliders with mine, thus far, but IIRC in Jayztwocents Pascal overclocking video, he showed you basically end up with the same end result using the sliders or the curve. It's just a matter of preference, no?
Quote:


> Originally Posted by *usoldier*
> 
> Hey Guys juts want to post my Asus Strix OC 24/7 settings
> 
> Aftherburner Settings
> Core Voltage +30
> Power Limit +120
> CoreClock +60
> Memory clock +502
> 
> With Gives me a 2088 core speed and 4514 Mem Speed
> 
> Been Running this flawlessly on BF1 for 5 days now speeds remain constant no drops , temp tops at 56cº you guys think this is good for 24/7


As long as it's stable, you should be good to go.


----------



## benjamen50

Quote:


> Originally Posted by *usoldier*
> 
> Hey Guys juts want to post my Asus Strix OC 24/7 settings
> 
> Aftherburner Settings
> Core Voltage +30
> Power Limit +120
> CoreClock +60
> Memory clock +502
> 
> With Gives me a 2088 core speed and 4514 Mem Speed
> 
> Been Running this flawlessly on BF1 for 5 days now speeds remain constant no drops , temp tops at 56cº you guys think this is good for 24/7


Is that the highest you can go for core voltage? Because I have pretty much the same overclock on my 1070, and I may be using too much, as I tend to just set a higher voltage.


----------



## philhalo66

People here have been telling me to just max out the slider and not worry about it, so a bit more than what's needed should be safe.


----------



## iARDAs

Quote:


> Originally Posted by *hammelgammler*
> 
> Hey guys,
> 
> can anyone tell me what those checkerboard patterns should look like with Micron VRAM? I have two Gainward 1070 Phoenix cards with Micron, and no matter how much memory clock I put on the card, I sometimes get weird video artifacts when watching two YouTube videos at the same time on different monitors (I have 3).
> 
> When gaming, or when only watching one video at a time (YouTube; haven't seen those artifacts with Twitch yet), I don't get those artifacts.
> _Anyone know if that's because of the Micron VRAM? I already have the latest BIOS, which should solve the issues._
> 
> Besides that, I can run my memory at 2300-2400MHz before I get artifacts in games, although my performance doesn't increase in Tomb Raider when pushing it past 2300MHz.
> 
> Thanks!


Do you still have the issue?

The latest 376.33 driver has this in its fixed-issues list:

Windows 10 Fixes - [375.86, GP104] Corruption in YouTube video playback when two or more videos are playing at the same time in Chrome. [1843100]


----------



## zipper17

Quote:


> Originally Posted by *owikhan*
> 
> I have a choice: a Zotac 1080 AMP Edition with Micron memory for 650 USD, or a Zotac 1080 AMP Edition with Samsung memory for 710 USD.
> 
> I hear the Micron memory is only a BIOS issue? Is it true the issue is resolved when you update the BIOS, or not?
> 
> Or is the Micron memory actually a problem on the card itself, like the chip?


The Micron memory issue only affected the GTX 1070.

There's no problem with Micron on the GTX 1080: all 1080s use Micron GDDR5X, and AFAIK there is no Samsung GDDR5X.


----------



## gtbtk

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Further to that. Overclocking with the Core clock slider is never likely to get you into the 2100Mhz range, you need to fine tune the curve
> 
> 
> 
> Isn't that just two ways to do the same thing in the end? The curve applies an offset, just as the slider does. I've only tried the sliders with mine, thus far, but IIRC in Jayztwocents Pascal overclocking video, he showed you basically end up with the same end result using the sliders or the curve. It's just a matter of preference, no?
Click to expand...

No, it is not exactly the same thing. These cards run at many different voltage levels (which these tools "translate" into frequencies for us to interact with), but only the maximum is reported by Afterburner/Precision. The measure of performance is actually the area under the curve, as opposed to a single point on it.

Jay is only partly right in his video: the card will tend to operate at the higher voltage values, but it is not exclusively running at the single point on the curve being reported in your OSD. It is running at multiple values, with the value reported in AB or Precision being the highest voltage point available on a curve that is constantly being adjusted by GPU Boost 3.0.

If you use the slider, it applies an equal offset to every point along the curve. Say the 1.000 V point can cope with a +50 offset before it crashes, but the 1.093 V point can cope with a +100 offset. The slider will be limited by the point with the lowest tolerance for the increase, namely +50 in this example. During use, the card is constantly shifting between lots of different voltage points, and at some stage it will try to use the 1.000 V point. If you have the slider at +100, the run may start fine, but eventually it will hit the +100 offset at the 1.000 V level and crash, even though your OSD is telling you that the card is running at 2088 at 1.081 V.

If you use the curve and can identify the maximum offset that each point can tolerate, you can put a dip in the curve to work around the voltage point or points that cannot tolerate such a large offset. That means you can make use of the extra offset OC potential (the +50 to +100 range of offsets at all the voltage points except 1.000 V in my example) that would otherwise be unavailable with a flat offset. Finding the maximum offsets available at the various voltage levels along the curve is exactly what the Precision XOC software attempts to do during its auto-overclock routine on EVGA cards.

Regardless of what card you have, you can test it yourself. In Afterburner, starting from the stock curve, raise only the 1.093 V point to, say, 2100, run a benchmark, and note the score. Then, leaving everything else as it is, raise the offset at 0.975 V so that it matches a 2000 MHz setting. You will see your curve now has a double bump. Rerun the benchmark. If you are still within the offset limits of your card at those two points, you should see increased framerates and scores, even though the OSD is telling you the whole run took place at 2100 MHz and 1.093 V.

I'm just pulling settings out of the air here; all cards behave slightly differently with regard to overclocking. If either of your two tests crashes, reduce the offset at the last point you adjusted and rerun the benchmark.
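gtbtk's slider-vs-curve argument can be sketched in a few lines. The voltage points, stock frequencies, and per-point offset limits below are invented for illustration (every card's real limits differ and have to be found by testing); the only point being made is that a single slider offset is capped by the weakest point on the curve, while a hand-tuned curve is not.

```python
# Toy model of the slider-vs-curve argument. All numbers here are
# hypothetical; real per-point limits vary card to card.

# (voltage, stock_frequency_mhz, max_stable_offset_mhz) for a few points
VF_POINTS = [
    (0.900, 1750, 90),
    (1.000, 1900, 50),   # the weak point in gtbtk's example
    (1.050, 1975, 80),
    (1.093, 2025, 100),
]

def max_slider_offset(points):
    """A single slider offset shifts every point equally, so it is
    capped by the point with the lowest tolerance."""
    return min(limit for _, _, limit in points)

def per_point_curve(points):
    """A hand-tuned curve can give each voltage point its own maximum
    offset, dipping only where the silicon is weak."""
    return [(v, f + limit) for v, f, limit in points]

if __name__ == "__main__":
    print(f"Best uniform slider offset: +{max_slider_offset(VF_POINTS)} MHz")
    for v, f in per_point_curve(VF_POINTS):
        print(f"  {v:.3f} V -> {f} MHz")
```

With these made-up numbers the slider is stuck at +50 (set by the 1.000 V point), while the tuned curve reaches 2125 MHz at 1.093 V.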


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *owikhan*
> 
> I have a choice: a Zotac 1080 AMP Edition with Micron memory for 650 USD, or a Zotac 1080 AMP Edition with Samsung memory for 710 USD.
> 
> I hear the Micron memory is only a BIOS issue? Is it true the issue is resolved when you update the BIOS, or not?
> 
> Or is the Micron memory actually a problem on the card itself, like the chip?
> 
> 
> 
> The Micron memory issue only affected the GTX 1070.
> 
> There's no problem with Micron on the GTX 1080: all 1080s use Micron GDDR5X, and AFAIK there is no Samsung GDDR5X.
Click to expand...

On 1070s the BIOS update fixed the Micron memory bug.

As zipper17 states, all 1080s have Micron GDDR5X memory, as Micron is the only company that makes GDDR5X.


----------



## GeneO

Quote:


> Originally Posted by *gtbtk*
> 
> No it is not the exactly same thing. These cards run using all different voltage levels (that are "translated" by these tools into frequencies for us to interact with) at the same time but only the maximum is reported by afterburner/precision. The measure of performance is actually the area under, as opposed to a single point on the curve.


If you really want to see what is happening, bring up the voltage/frequency curve in Afterburner and watch it while you play. It will dynamically display not only the voltage you are at, but the frequency at that voltage, which may be less than the frequency point on the curve for that voltage (boost will lower the frequency at a given voltage due to temperature, etc.). Of course you need two monitors or windowed mode to do this.











----------



## ucode

Even running at one voltage level with the frequency flat-lined, you can still see the GPU clock drop as temperature increases.


----------



## gtbtk

Quote:


> Originally Posted by *GeneO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> No it is not the exactly same thing. These cards run using all different voltage levels (that are "translated" by these tools into frequencies for us to interact with) at the same time but only the maximum is reported by afterburner/precision. The measure of performance is actually the area under, as opposed to a single point on the curve.
> 
> 
> 
> If you really want to see what is happening, bring up the voltage/frequency curve in afterburner and watch it while you play. It will dynamically display not only the voltage you are at, but the frequency at that voltage, which may be less than frequency point on the curve for that voltage (boost will lower the frequency at a given voltage due to temperature, etc.). Of course you need two monitors or windowed mode to do this.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> .
Click to expand...

That is still a bit misleading. By showing only a single point, it implies that the card is operating at a single point. It isn't. The white dotted lines show you the current high point of the operating range in the area under the curve.


----------



## GeneO

Quote:


> Originally Posted by *gtbtk*
> 
> That is still a bit misleading. By showing only a single point, It still infers that the card is operating at a single point. It isnt. The white dotted lines shows you the current high point of the operating range in the area under the curve.


No it isn't. Afterburner displays a dynamic point with cross-hairs at the actual frequency, which may be lower than the one set on the curve.


----------



## Flisker_new

Hey guys,

so I got an Asus Strix GTX 1070 yesterday and I'm super curious about the 6 points on top of the card. Does anyone know how this works with Asus ROG series motherboards (I've got a Rampage IV Extreme)?



Thanks a lot for any info on this topic

o/


----------



## gtbtk

Quote:


> Originally Posted by *GeneO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> That is still a bit misleading. By showing only a single point, It still infers that the card is operating at a single point. It isnt. The white dotted lines shows you the current high point of the operating range in the area under the curve.
> 
> 
> 
> No it isn't. Afterburner displays a dynamic point and cross-hairs at the actual lower frequency than the one set on the curve.
Click to expand...

But the card is NOT using a single dynamic point. It is using points all along the curve, up to a maximum of the dynamic voltage value it is reporting.


----------



## gtbtk

Quote:


> Originally Posted by *Flisker_new*
> 
> Hey guys,
> 
> so I got Asus Strix GTX 1070 yesterday and I'm super curious about the 6 points on top of the card, anyone knows how does this work with Asus ROG Series motherboards (I've got Rampage IV Extreme)
> 
> 
> 
> Thanks a lot for any info on this topic
> 
> o/


I don't have a Strix card and I'm only guessing here, but they look like voltage measurement points that you could use for extreme overclocking(?)


----------



## Flisker_new

Quote:


> Originally Posted by *gtbtk*
> 
> I dont have a strix card and I am only guessing here but they look like voltage measurement points that you could use for extreme overclocking (?)


Yep, it's called "VGA Hotwire", but I can't find more info for this specific card.

Do I just solder wires there from the Rampage IV Extreme and get free voltage control on the GPU? Or do I need to short/remove some resistors as well to get it working?


----------



## philhalo66

I'd wager those aren't even connected to anything; it might be the same PCB as a higher-end card, so those could be blank pads. Shorting anything out will more than likely fry your card and void your warranty at the same time.


----------



## Flisker_new

Quote:


> Originally Posted by *philhalo66*
> 
> I'd wager those arent even connected to anything it might be the same PCB as a higher end card so those could be blank traces. Shorting out anything will more than likely fry your card and invalidate your warranty at the same time.


Hmm, you might be right ! Thanks for reply.


----------



## GeneO

Quote:


> Originally Posted by *gtbtk*
> 
> But the card is NOT using a dynamic point. It is using points all along the curve to a Maximum of that dynamic Voltage value that it is reporting


No it isn't. It will follow the curve for the most part, and that curve defines the maximum frequency at a given voltage, but it can and does lower the frequency at that voltage as it deems necessary when, for example, the temperature increases. Just watch it in conjunction with GPU-Z.

For example, I can start a benchmarking run with 1.081 V set on the curve for 2062 MHz. It will initially start at 2062, but after a while the temperature rises and boost drops it down to 2050 MHz @ 1.081 V. It doesn't lower the voltage, only the MHz. If it gets too hot it may lower both. And these changes are displayed dynamically in both the Afterburner f/V curve and GPU-Z.
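The behaviour GeneO describes (the clock stepping down at a fixed voltage as temperature rises) can be modelled as a toy sketch. The temperature thresholds below are made up for illustration; only the ~13 MHz step size reflects the commonly observed Pascal clock bins.

```python
# Toy sketch of GPU Boost dropping clocks with temperature at a fixed
# voltage point. Thresholds are hypothetical; the ~13 MHz bin size is
# the step Pascal cards are commonly observed to use.

BIN_MHZ = 13

def boosted_clock(curve_mhz, temp_c):
    """Effective clock: the curve frequency minus one bin for every
    (illustrative) temperature threshold crossed."""
    bins_dropped = sum(1 for threshold in (38, 46, 54, 60) if temp_c >= threshold)
    return curve_mhz - bins_dropped * BIN_MHZ

print(boosted_clock(2062, 35))  # cool card: full 2062 MHz
print(boosted_clock(2062, 50))  # warmer: two bins down -> 2036 MHz
```

That matches the shape of GeneO's observation: the curve point stays at 2062 MHz / 1.081 V, but the delivered clock steps down a bin or two as the card heats up.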



----------



## MEC-777

Quote:


> Originally Posted by *gtbtk*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> No it is not the exactly same thing. These cards run using all different voltage levels (that are "translated" by these tools into frequencies for us to interact with) at the same time but only the maximum is reported by afterburner/precision. The measure of performance is actually the area under, as opposed to a single point on the curve.
> 
> Jay is only partly right in his video in that the card will tend to operate in the higher Voltage values but the card is not exclusively running at only that single point on the curve being reported in you OSD. It is running at multiple values in parallel with the reported value in AB or precision being the highest voltage point available on a curve that is constantly being adjusted by GPU Boost 3.0.
> 
> If you use the slider, it will apply an equal offset to every point along the graph. If say, the 1.000 voltage point can cope with a +50 offset before it crashes but the 1.093V point can cope with a +100 offset The slider will be limited to the point with the lowest tolerance for the increase, namely +50 in this example. At some stage during your usage, the card is constantly making calculations at lots of different voltage values in parallel, the card will try to use a 1.000v value. If you have the slider at +100 the run may start fine, but at some stage it will hit the +100 offset at the 1.000 level and crash even though your OSD is telling you that the card is running at 2880 at 1.081V.
> 
> If you use the curve and can identify the maximum offset that each point can tolerate, you can put a dip in the curve to work around the voltage point or points that cannot tolerate such a large offset meaning that you can make use of the extra offset OC potential (the +50 to +100 range of offsets all the voltage points except 1.000 in my example) that would otherwise be unavailable if you only use a fixed curve. Finding the maximum offsets available at various voltage levels along the curve is exactly what the precision XOC software attempts to do when it is doing the auto overclock thing with EVGA cards.
> 
> Regardless of what card you have, you can test it yourself. In Afterburner, starting from the stock curve, only increase the 1.093v point to say 2100 and run a bench mark and note the score. Then, while leaving everything as it is change the offset at, 0.975 so that it matches the 2000mhz setting. You will see your curve now has a double bump. Rerun the benchmark. If you are still within the offset limits of your card at those two points, You should see an increased framerate scores even though the OSD is telling you the whole run took place at 2100Mhz 1.093v.
> 
> I'm just pulling settings from the air here, All cards behave slightly differently with regards overclocking. If either of your two tests crash, obviously reduce the offset at the last point that you adjusted and rerun the benchmark.


Holy crap, that's confusing.







I miss the old days of overclocking where all you had to do was add clock speed until unstable, then add voltage, and repeat until you find the absolute limit of the card.

Now with GPU Boost 3.0, the card is constantly adjusting everything, and these cards all seem to run way faster than their rated stock clocks out of the box, so we don't see major gains; it's doing most of the work for you automatically. Even my FE 1070 runs up to 1911, then levels out in the mid/high 1800s without touching a thing. Following Jay's guide I'm able to keep it around 1970-2000MHz even when it reaches 80°C (I like silence), which I'm happy with.

Is there really much to gain beyond that by playing with the curve?

I'm surprised they don't do this with the memory as well, seeing as most cards are capable of +450-500.


----------



## Flisker_new

Btw, anyone got the latest *Asus Strix OC version BIOS*? I would like to try flashing the OC BIOS onto a non-OC Strix card, but I can't find the latest OC version BIOS... if that makes any sense.


----------



## gtbtk

Quote:


> Originally Posted by *Flisker_new*
> 
> Quote:
> 
> 
> 
> Originally Posted by *philhalo66*
> 
> I'd wager those arent even connected to anything it might be the same PCB as a higher end card so those could be blank traces. Shorting out anything will more than likely fry your card and invalidate your warranty at the same time.
> 
> 
> 
> Hmm, you might be right ! Thanks for reply.
Click to expand...

That could be right too. The 1080 does have 2 x 8-pin connectors on some models, and I think the boards are shared between models.


----------



## asdkj1740

Quote:


> Originally Posted by *Flisker_new*
> 
> Btw anyone got latest *Asus Strix OC version BIOS* ? I would like to try flashing OC BIOS on non-OC Strix card, but can't find latest OC version BIOS .... if that makes any sense


200W is still not enough for the 1070. You can first try flashing the old OC BIOS and then run the latest update exe.


----------



## philhalo66

Quote:


> Originally Posted by *Flisker_new*
> 
> Hey guys,
> 
> so I got Asus Strix GTX 1070 yesterday and I'm super curious about the 6 points on top of the card, anyone knows how does this work with Asus ROG Series motherboards (I've got Rampage IV Extreme)
> 
> 
> 
> Thanks a lot for any info on this topic
> 
> o/


Just thought of a safe way to test: grab a multimeter and check the points for voltage to see if you get any readings. Just be super careful not to short anything out.


----------



## AliasOfMyself

Quote:


> Originally Posted by *philhalo66*
> 
> Is anyone else having issues with YouTube videos making Chrome freeze with the last two drivers? 375.70 is perfectly fine; it's only the last two.


I don't use Chrome, but I had to turn off HW acceleration in Firefox; video-heavy pages gave my browser really bad stutters, lol. Have you tried turning acceleration off in Chrome?


----------



## Flisker_new

Quote:


> Originally Posted by *asdkj1740*
> 
> 200W is still not enough for 1070, you can first try to flash the old oc bios and then download the latest update exe


Tried that, but it says "no need for update" or something like that


----------



## Flisker_new

Quote:


> Originally Posted by *philhalo66*
> 
> Just thought of a safe way to test. grab a multi meter and check the points for voltage see if you get any readings. Just be super careful not to short anything out.


That's a great tip, will test it out.


----------



## gtbtk

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> No it is not the exactly same thing. These cards run using all different voltage levels (that are "translated" by these tools into frequencies for us to interact with) at the same time but only the maximum is reported by afterburner/precision. The measure of performance is actually the area under, as opposed to a single point on the curve.
> 
> Jay is only partly right in his video in that the card will tend to operate in the higher Voltage values but the card is not exclusively running at only that single point on the curve being reported in you OSD. It is running at multiple values in parallel with the reported value in AB or precision being the highest voltage point available on a curve that is constantly being adjusted by GPU Boost 3.0.
> 
> If you use the slider, it will apply an equal offset to every point along the graph. If say, the 1.000 voltage point can cope with a +50 offset before it crashes but the 1.093V point can cope with a +100 offset The slider will be limited to the point with the lowest tolerance for the increase, namely +50 in this example. At some stage during your usage, the card is constantly making calculations at lots of different voltage values in parallel, the card will try to use a 1.000v value. If you have the slider at +100 the run may start fine, but at some stage it will hit the +100 offset at the 1.000 level and crash even though your OSD is telling you that the card is running at 2880 at 1.081V.
> 
> If you use the curve and can identify the maximum offset that each point can tolerate, you can put a dip in the curve to work around the voltage point or points that cannot tolerate such a large offset meaning that you can make use of the extra offset OC potential (the +50 to +100 range of offsets all the voltage points except 1.000 in my example) that would otherwise be unavailable if you only use a fixed curve. Finding the maximum offsets available at various voltage levels along the curve is exactly what the precision XOC software attempts to do when it is doing the auto overclock thing with EVGA cards.
> 
> Regardless of what card you have, you can test it yourself. In Afterburner, starting from the stock curve, only increase the 1.093v point to say 2100 and run a bench mark and note the score. Then, while leaving everything as it is change the offset at, 0.975 so that it matches the 2000mhz setting. You will see your curve now has a double bump. Rerun the benchmark. If you are still within the offset limits of your card at those two points, You should see an increased framerate scores even though the OSD is telling you the whole run took place at 2100Mhz 1.093v.
> 
> I'm just pulling settings from the air here, All cards behave slightly differently with regards overclocking. If either of your two tests crash, obviously reduce the offset at the last point that you adjusted and rerun the benchmark.
> 
> 
> 
> 
> 
> 
> Holy crap, that's confusing.
> 
> 
> 
> 
> 
> 
> 
> I miss the old days of overclocking where all you had to do was add clock speed until unstable, then add voltage, and repeat until you find the absolute limit of the card.
> 
> Now with GPUboost 3.0, it's constantly adjusting everything and they all seem to run way faster than their rated stock clocks out of the box so we don't get to see major gains as it's doing most of it for you, automatically. Even my FE 1070 runs up to 1911, then levels out in the mid/high 1800's without touching a thing. Following Jay's guide I'm able to get it to stay around 1970-2000Mhz even when it reaches 80*C (I like silence), which I'm happy with.
> 
> Is there really much to gain beyond that by playing with the curve?
> 
> I'm surprised they don't do this with the memory as well, seeing as most cards are capable of +450-500.
Click to expand...

Overclocking well to maximize your performance has a learning curve, but it is not really that difficult once you get the hang of it. Think of the upper limit of performance as like the ceiling in a house: the performance you can get is the airspace inside the building. In some places you have doorways, and you can't get as high there as you can in the open-plan spaces with higher ceilings.

If you experiment a bit and are, at times, willing to accept a bit more fan noise, there is more performance you can get from the card if you need it.

There are some partner models of the 1070 that come with factory-overclocked memory. Yes, most cards will overclock to +500 but not all will, and some will even go to +800; that is the nature of silicon. It is always a bit of a lottery as to how much overclock you will get.

One has to ask: if you like silence, why did you buy a Founders Edition card?


----------



## philhalo66

Quote:


> Originally Posted by *AliasOfMyself*
> 
> I don't use chrome, but I had to turn off hw acceleration in Firefox, video heavy pages gave my browser really bad stutters lol. Tried turning acceleration off in chrome?


Actually yeah, and that fixed it. I'm running the new driver too; I wonder why it was doing this.


----------



## ucode

Quote:


> Originally Posted by *MEC-777*
> 
> I'm surprised they don't do this with the memory as well, seeing as most cards are capable of +450-500.


Well, they do seem to do something with the timings on Pascal as the clock changes. Maybe that's enough, since voltage is going to be pretty much constant.

A quick plot of memory gain vs clock shows some interesting points with Pascal.



For this particular card, artifacts can start at around ~9200 MT/s.
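For anyone wanting to relate Afterburner's memory offset to effective data rates like the ~9200 MT/s figure above: on a GTX 1070, Afterburner reports the GDDR5 clock as roughly 4004 MHz at stock, and the effective transfer rate is double that. A rough sketch (the stock figure is the usual 1070 value; check your own card in GPU-Z):

```python
# Rough conversion between the Afterburner memory offset and the
# effective GDDR5 data rate on a GTX 1070. Assumes the typical stock
# reading of ~4004 MHz in Afterburner; verify on your own card.

STOCK_AB_MHZ = 4004  # what Afterburner typically shows at stock on a 1070

def effective_mts(ab_offset_mhz):
    """Effective transfer rate in MT/s for a given Afterburner offset."""
    return (STOCK_AB_MHZ + ab_offset_mhz) * 2

print(effective_mts(0))    # 8008 MT/s at stock
print(effective_mts(500))  # 9008 MT/s -- the common +500 overclock
print(effective_mts(600))  # 9208 MT/s -- near the ~9200 artifact point above
```

So a +600 offset lands right around where this particular card starts to artifact.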


----------



## M0E

I posted because I had to get in on page 666


----------



## zipper17

Quote:


> Originally Posted by *Prozillah*
> 
> I've noticed something very interesting when doing my benching after the shunt mod (using the liquid metal method). I did two separate tests:
> 
> 1. Using AB and locking the point at 1.093 - I moved that one point, and one point only, all the way up over 2200MHz (something I was not able to do before). It didn't pass the entire bench, so I dropped it back down to the stable pass point, which was about 2164. Had the memory clocked at +600 (Micron), and the max graphics score I was able to achieve was 20,880.
> 
> 2. Same again using AB with the volt point locked at 1.093, but using the core clock method: set +105 on the core and simply moved only the last volt point up a few notches to lock the card at 2100MHz, with a +600 memory OC. Final graphics score: 21,216 (as linked).
> 
> What I noticed is that even when you have it locked at a certain point, if the rest of your curve remains closer to stock speeds, that affects your overall performance. It seems like a bug with the software/firmware, as the numbers don't make any sense at all.
> 
> For both of these tests I had the fan speed manually set to 100%.


Digging through old comments, I found a post with similar thoughts to mine.
GPU Boost 3.0 is confusing.


----------



## iARDAs

Quote:


> Originally Posted by *Flisker_new*
> 
> Hey guys,
> 
> so I got Asus Strix GTX 1070 yesterday and I'm super curious about the 6 points on top of the card, anyone knows how does this work with Asus ROG Series motherboards (I've got Rampage IV Extreme)
> 
> 
> 
> Thanks a lot for any info on this topic
> 
> o/


I believe that is for this?


----------



## zipper17

My memory at +700-800 produces obvious green dot/flash sparkle anomalies, especially when running the 3DMark Fire Strike Extreme stress test after some loops.

The minor graphics artifacting is something like this:

but smaller, flashing very quickly on the screen and appearing at random positions.

It is Samsung memory.

Has anyone else experienced this? Should I keep it at +700? Without +700 on the memory I can't break a 21K FS graphics score.


----------



## asdkj1740

Quote:


> Originally Posted by *zipper17*
> 
> My memory at +700-800 will produce an obvious Dot Green Flash Sparkles Anomalies especially when running 3D Mark Stress Test FireStrike Extreme after some loops.
> 
> The Minor graphic artifacting is Something like this:
> 
> but it's smaller and it flashing very fast on the screen. Randomly appear at random position.
> 
> it is Samsung memory.
> 
> Is anyone experience this? should i keep it at +700?? without +700 mem I can't break 21K FS graphic scores.


Christmas edition?


----------



## asdkj1740

Spending time studying GPU Boost 3.0 is pointless, as GPU Boost 3.0 is by nature stupid.
The driver must be doing something to reduce performance starting from 2000MHz.


----------



## zipper17

Quote:


> Originally Posted by *asdkj1740*
> 
> Christmas edition?


Yeah, the memory is artifacting like that: green flash sparkles while running the 3DMark Fire Strike Extreme stress test at +700 or higher.
Didn't test in games yet.

Quote:


> Originally Posted by *asdkj1740*
> 
> spending times on studying the gpu boost 3.0 is stupid, as the gpu boost 3.0 in nature is stupid.
> the driver must have done somethings to reduce the performance starting from 2000mhz.


An overclocked 1070 is overall not better than an overclocked 980 Ti; the only thing the 1070 wins on is performance per watt.

I can see that most of the Top 100 Fire Strike results with a 980 Ti + 3570K easily reach 21-22K.
The 980 Ti probably has better silicon for overclocking in general.

Some 1070s can reach 22K, though probably fewer overall than 980 Tis.

It doesn't feel very high-end for overclocking.
Out of the box my 1070 scores 19.2XX; overclocked, only 20.8XX stable.
Only gaining ~1,000-1,500 points :X


----------



## asdkj1740

Quote:


> Originally Posted by *zipper17*
> 
> yeah memory its artifacting like that, green flash sparkles during running 3dmark stress test Firestrike Extreme, when on +700 or higher.
> didnt test for games yet.
> overclocked 1070 overall is not better than 980ti overclocked.
> the only thing 1070 wins is performance per watt..
> 
> i can see most of Top 100 Firestrike result 980ti +3570k easily reach 21-22k .
> 980ti has better silicon chip for overclocking probably in general.
> 
> some people of 1070 can reach 22k though, but probably less in general than 980ti.
> 
> It's not feel so high-end for overclocking.
> out of the box my 1070 scores 19.2XX, once overclocked only 20.8XX stable.
> only gaining ~1000-1.500 points :X


I can pass the Fire Strike Ultra test at 2215/2300 with the Zotac AMP Extreme BIOS cross-flashed, but my FSU graphics score is freaking low, only 4864.
My HWiNFO log shows no throttling at all: I'm using AIO cooling and the GPU temp stays below 40°C, so there is no thermal throttling on my card.
Voltage is locked at 1.08V for the whole test.
The Zotac AMP Extreme BIOS has a 300W max power limit, so there is no power throttling either. My stock BIOS allows only 226W, and my card crashes during the test on the stock BIOS.

Assuming the Zotac BIOS works properly on my non-Zotac card and nothing else has gone wrong, this result shows that something is limiting my card's performance. Probably the driver...

Nvidia even dares to lock the max core and memory clocks on the 1050s and 1050 Tis...
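One way to double-check a "no throttling" claim like this is `nvidia-smi -q -d PERFORMANCE`, which prints a "Clocks Throttle Reasons" section. The sample text below only approximates that layout (field names vary across driver versions); the little parser just pulls out whichever reasons are marked Active.

```python
# Parse the "Clocks Throttle Reasons" section of nvidia-smi output.
# SAMPLE is an illustrative approximation of the real report layout;
# in practice you would capture the output of
#   nvidia-smi -q -d PERFORMANCE
# via subprocess and feed it in.

SAMPLE = """\
    Clocks Throttle Reasons
        Idle                        : Not Active
        SW Power Cap                : Not Active
        HW Slowdown                 : Not Active
        SW Thermal Slowdown         : Active
"""

def active_throttle_reasons(report):
    """Return the throttle reasons currently marked Active."""
    reasons = []
    for line in report.splitlines():
        if ":" in line:
            name, _, state = line.partition(":")
            if state.strip() == "Active":
                reasons.append(name.strip())
    return reasons

print(active_throttle_reasons(SAMPLE))  # ['SW Thermal Slowdown']
```

If that list comes back empty under load, the card really isn't being held back by power or temperature, which would point the finger at the driver or BIOS instead.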


----------



## philhalo66

Quote:


> Originally Posted by *asdkj1740*
> 
> i can pass fsu test at 2215/2300 with zotac amp extreme bios cross flashed. but the mark of fsu is freaking low, 4864 only.
> my hwinfo record shows that there are no throttling at all because i am using aio cooling and the gpu temp is below 40c so there is no temp throttling for my card.
> voltage is locked at 1.08v for the whole test.
> the zotac amp extreme bios has 300w max power limit, so there is no power throttling too. my stock bios has only 226w max and my card crashes during the test with stock bios using.
> 
> assuming the zotac bios can work fine and properly on my non zotac card and nothings go wrong, then this result shows that there is somethings influencing the performance of my card. properly is the driver...
> 
> nvidia dares to lock those 1050s and 1050tis' the max core clock and memory clock...


What card do you have?


----------



## asdkj1740

Quote:


> Originally Posted by *philhalo66*
> 
> What card do you have?


ftw


----------



## philhalo66

Quote:


> Originally Posted by *asdkj1740*
> 
> ftw


The EVGA FTW Hybrid? And you flashed it with the Zotac BIOS? Wow, that's quite impressive.


----------



## asdkj1740

Quote:


> Originally Posted by *philhalo66*
> 
> the EVGA FTW Hybrid? and you flashed it with the Zotac bios? wow that's quite impressive


Not the FTW Hybrid, just the FTW, with an Arctic 140 hybrid cooler in a push/pull setup.
Yes, I have been trying different BIOSes to find the one that best suits my FTW, and the Zotac AMP Extreme is my king.

I really regret buying the FTW so early. The FTW Hybrid is just $30~$40 more than the FTW, but if you first buy the FTW at $460 you then have to spend $120 more for the hybrid kit. There's no discount at all for FTW buyers on the hybrid kit...


----------



## MEC-777

Quote:


> Originally Posted by *gtbtk*
> 
> Good overclocking to maximize your performance has a learning curve, it is not really that difficult once you get the hang of it. You just need to think of the upper limit of the performance as like the ceiling in a house and the performance you can get is available in the airspace inside the building. In some places you have doorways and you can't get as high as you can in the open plan spaces that has higher ceilings.
> 
> If you experiment a bit and are, at times, willing to have a bit more fan noise, you have more performance you can get from the card if you need it.
> 
> There are some partner models of 1070 that come with factory overclocked memory. Yes most cards will overclock to +500 not not all will, some will even go to +800, that is the nature of silicon. it is always a bit of a lottery as to how much overclock you can get.
> 
> One has to ask, if you like silence, why did you buy a Founders edition card?


I understand what you're saying. I'm just wondering how much more there is to gain vs just setting the slider as high as it will go and remain stable, basically.

Might play around with the curve a bit and see if I can get a bit more out of it, but honestly I'm totally happy with the performance as it is with +150 core and +500 memory. 65% fan speed seems to be enough to keep temps under 80C and keep the core clock close to 2000 while still remaining very quiet. Should also mention I have fairly high ambient temps in my room for my Bearded Dragon.









I went with the Founders Edition for a number of reasons: (in no particular order)
-Pushes hot air outside the case.
-It was the best priced card at the time of purchase (black Friday).
-I really like the look of them.
-It's a 150w TDP card, so I figured the blower should be able to cool it sufficiently, even when OC'd, without needing to run at mach 5 rpm. lol (this has proven to be true).
-My case has really good airflow (S340), with no obstructions (none of the case fans even turn on until the GPU hits about 55C and don't need to run faster than 50% to keep everything cool).
-Still overclocks very well (I'm able to match the OC on a friend's Gigabyte 1070 G1).
-I prefer the sound of blower card fan to axial fans (if I have to hear them at all).


----------



## zipper17

Quote:


> Originally Posted by *asdkj1740*
> 
> i can pass fsu test at 2215/2300 with zotac amp extreme bios cross flashed. but the mark of fsu is freaking low, 4864 only.
> my hwinfo record shows that there are no throttling at all because i am using aio cooling and the gpu temp is below 40c so there is no temp throttling for my card.
> voltage is locked at 1.08v for the whole test.
> the zotac amp extreme bios has 300w max power limit, so there is no power throttling too. my stock bios has only 226w max and my card crashes during the test with stock bios using.
> 
> assuming the zotac bios can work fine and properly on my non zotac card and nothings go wrong, then this result shows that there is somethings influencing the performance of my card. properly is the driver...
> 
> nvidia dares to lock those 1050s and 1050tis' the max core clock and memory clock...


Btw, what is your absolutely stable 3DMark Fire Strike graphics score?
By absolutely stable I mean it consistently passes the stress test (at least 97% in FS Extreme/Ultra), never crashes, never artifacts, and of course is stable in games too.

It seems my max absolutely stable score is around 20.8xx with the stock BIOS.

I can get my card to break 21K, but that gets me into trouble: driver crashes (nvlddmkm) and minor memory artifacting.
TBH it sucks; it's like my PC is screaming for SLI or an upgrade to a 1080 lol.

edit:
If anyone else could share theirs, that would be great: what is your absolutely stable Fire Strike graphics score?


----------



## Curseair

Sent my EVGA FTW back for a refund 3 days ago. I'm now deciding between the Gigabyte Xtreme Gaming 1070 and the MSI Gaming Z 1070 with the RGB backplate. I think the MSI looks better and has better build quality, but the Gigabyte cools better and possibly has more overclocking headroom.

Which one? Hopefully ordering in a few hours; they're the same price, really.


----------



## Kreeker

Quote:


> Originally Posted by *Curseair*
> 
> Sent my EVGA FTW back for a refund 3 days ago I am now looking for another out of the Gigabyte Xtreme gaming 1070 and the MSI Gaming Z 1070 with RGB on the back-plate, I think the MSI looks better and has better build quality but the gigabyte cools better and possibly has better overclocking potential?
> 
> Which one? Hopefully ordering in a few hours. same price really.


Can I ask why you sent back the FTW? I just installed mine last night.


----------



## Dan-H

Quote:


> Originally Posted by *Curseair*
> 
> Sent my EVGA FTW back for a refund 3 days ago I am now looking for another out of the Gigabyte Xtreme gaming 1070 and the MSI Gaming Z 1070 with RGB on the back-plate, I think the MSI looks better and has better build quality but the gigabyte cools better and possibly has better overclocking potential?
> 
> Which one? Hopefully ordering in a few hours. same price really.


I bought the MSI Gaming X, and have yet to push it hard. There is also the Quick Silver, which is essentially the Gaming X but without the red.

edit: Actually I bought both (I'm building two systems).

I've had really good success with MSI, and at the time the Gaming X was $40 less, so it was hard to justify a roughly 10% higher cost for a fraction more performance.


----------



## Curseair

Quote:


> Originally Posted by *Kreeker*
> 
> Can I ask why you send back the FTW? I just installed mine last night.


Coil whine, Thermal pads, Bios updates.


----------



## philhalo66

Avoid Gigabyte. When I'm playing a game at high framerates, it screams like a banshee from coil whine. I don't normally hear it through my headset, so it doesn't bother me, but it might bother you.


----------



## Hunched

Quote:


> Originally Posted by *philhalo66*
> 
> Avoid gigabyte. when i get playing a game and i get high framerates it screams like a banshee from coil whine. i dont normally hear it through my headset so it doesn't bother me but it might you.


They should also be avoided for having the loudest fans, cheapest build quality, and worst customer service from any company you could possibly purchase a 1070 from.


----------



## philhalo66

Quote:


> Originally Posted by *Hunched*
> 
> They should also be avoided for having the loudest fans, cheapest build quality, and worst customer service from any company you could possibly purchase a 1070 from.


Not sure what you're doing to your cards, man, but even with the fan maxed mine is barely audible over my H100i on balanced.


----------



## Hunched

Quote:


> Originally Posted by *philhalo66*
> 
> not sure what your doing to your cards man, but even with the fan maxed its barely audible over my H100i on balanced.


I have 5 Noctua fans and use SpeedFan (so quieter than your H100i if you're using Corsair fans).
I had a Gigabyte 970 and a Gigabyte 1070; their fans are high-RPM, loud garbage compared to my MSI 1070.

Might just be the sound they create; high RPM is much more high-pitched.
High RPM is just bad, and they're smaller fans, so they need to spin faster to cool equivalently.


----------



## philhalo66

Quote:


> Originally Posted by *Hunched*
> 
> I have 5 Noctua fans and use SpeedFan (so quieter than your H100i if you're using Corsair fans)
> I had a Gigabyte 970 and a Gigabyte 1070, their fans are high RPM and loud garbage compared to my MSI 1070.


Your card was faulty then, man; idk what to tell you. My fans are not the stock Corsair ones. Those were junk, so I switched to Noctua NF-A14s.


----------



## Hunched

Quote:


> Originally Posted by *philhalo66*
> 
> your card was faulty then man idk what to tell you. my fans are not the stock corsair ones they were junk i switched to Noctua NF-A14's


I didn't have 6 faulty Gigabyte GPU fans, 3x on a 970 and 3x on a 1070.
But alright.


----------



## philhalo66

Quote:


> Originally Posted by *Hunched*
> 
> I didn't have 6 faulty Gigabyte GPU fans, 3x on a 970 and 3x on a 1070.
> But alright.


I don't know then. All I know is mine is dead silent aside from the coil whine.


----------



## Hunched

Quote:


> Originally Posted by *philhalo66*
> 
> I don't know then, All i know is mine is dead silent aside from the coil whine.


Every review out there of 1070s and 1080s shows the triple-fan setups are many decibels louder than MSI's and Palit's dual fans, and they often cool worse too.
What is 100% RPM on your card? Mine is 2500.
Three small fans need to spin faster to cool as well as two larger fans, and they will be louder and more annoying by a noticeable amount.

You can run a laid-back fan curve and end up with way higher temps at the same noise as a card with better fans, but that doesn't mean it's quiet or efficient.

The rest of your case isn't quiet if a Gigabyte 1070 isn't the loudest thing in your PC.
I think Gigabyte cards go up to 3200rpm.


----------



## philhalo66

Quote:


> Originally Posted by *Hunched*
> 
> Every review there of 1070's and 1080's show triple fan setups are many decibels louder than MSI and Palit's dual fans, and they often cool worse too.
> What is 100% RPM on your card? Mine is 2500.
> 3 small fans need to spin faster to cool as well as two larger fans, and will be louder and more annoying by a noticeable amount.
> 
> You can use a laid back fan curve and have way higher temps at equal noise of a card with better fans, but that doesn't mean it's quiet or efficient.
> 
> The rest of your case isn't quiet if a Gigabyte 1070 isn't the loudest thing in your PC.
> I think Gigabyte cards go up to 3200rpm


I never said my system was quiet. I wear headphones, so I don't care if it sounds like a shop vac as long as my components are cooled. EVGA Precision says 3500 RPM.


----------



## gtbtk

Quote:



> Originally Posted by *zipper17*
> 
> My memory at +700-800 will produce an obvious Dot Green Flash Sparkles Anomalies especially when running 3D Mark Stress Test FireStrike Extreme after some loops.
> 
> The Minor graphic artifacting is Something like this:
> 
> but it's smaller and it flashing very fast on the screen. Randomly appear at random position.
> 
> it is Samsung memory.
> 
> Is anyone experience this? should i keep it at +700?? without +700 mem I can't break 21K FS graphic scores.


Your memory OC is too high. Pull it back to +600, or even try +500; you may find that with fewer errors your scores actually increase.
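To put that advice another way: GDDR5 quietly retries on errors, so past a certain offset the clock keeps climbing while throughput falls. A toy sketch of picking the offset by benchmark score rather than by highest stable clock (`run_benchmark` and all the numbers in it are made-up placeholders, not real results or a real API):

```python
# Sketch: choose the memory offset that gives the best score, not the
# highest clock. `run_benchmark` stands in for e.g. a Fire Strike
# graphics run at each offset; its numbers are invented for illustration.

def run_benchmark(mem_offset):
    # placeholder data: pretend error-correction retries past +600
    # start costing performance even though the card still "passes"
    scores = {400: 20650, 500: 20820, 600: 20900, 700: 20780, 800: 20510}
    return scores[mem_offset]

def best_offset(candidates):
    """Return the offset with the highest benchmark score."""
    return max(candidates, key=run_benchmark)

print(best_offset([400, 500, 600, 700, 800]))  # 600 in this mock data
```

The point of the sketch is just that the score-vs-offset curve can peak below the highest offset that "works".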


----------



## gtbtk

Quote:


> Originally Posted by *Hunched*
> 
> Quote:
> 
> 
> 
> Originally Posted by *philhalo66*
> 
> I don't know then, All i know is mine is dead silent aside from the coil whine.
> 
> 
> 
> Every review there of 1070's and 1080's show triple fan setups are many decibels louder than MSI and Palit's dual fans, and they often cool worse too.
> What is 100% RPM on your card? Mine is 2500.
> 3 small fans need to spin faster to cool as well as two larger fans, and will be louder and more annoying by a noticeable amount.
> 
> You can use a laid back fan curve and have way higher temps at equal noise of a card with better fans, but that doesn't mean it's quiet or efficient.
> 
> The rest of your case isn't quiet if a Gigabyte 1070 isn't the loudest thing in your PC.
> I think Gigabyte cards go up to 3200rpm
Click to expand...

My MSI Gaming's 100% fan speed is about 2500.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *asdkj1740*
> 
> Christmas edition?
> 
> 
> 
> yeah memory its artifacting like that, green flash sparkles during running 3dmark stress test Firestrike Extreme, when on +700 or higher.
> didnt test for games yet.
> 
> Quote:
> 
> 
> 
> Originally Posted by *asdkj1740*
> 
> spending times on studying the gpu boost 3.0 is stupid, as the gpu boost 3.0 in nature is stupid.
> the driver must have done somethings to reduce the performance starting from 2000mhz.
> 
> Click to expand...
> 
> overclocked 1070 overall is not better than 980ti overclocked.
> the only thing 1070 wins is performance per watt..
> 
> i can see most of Top 100 Firestrike result 980ti +3570k easily reach 21-22k .
> 980ti has better silicon chip for overclocking probably in general.
> 
> some people of 1070 can reach 22k though, but probably less in general than 980ti.
> 
> It's not feel so high-end for overclocking.
> out of the box my 1070 scores 19.2XX, once overclocked only 20.8XX stable.
> only gaining ~1000-1.500 points :X
Click to expand...

As I have been saying in a number of posts, the GPU does not operate at only a single voltage point; it is doing things at multiple voltage levels in parallel. That is why using the slider to raise the whole curve, with a kick up at the end, will give you a better score than just pulling the 1.093v point up to 2164MHz and leaving the rest at stock. The 980 Ti did not have the luxury of a curve, so by definition all points got an overclock, not just the highest voltage point.

Having access to the curve is both a blessing and a curse, because not every voltage level can be increased by the same amount: some points may be limited to +50 while others are fine at +150. The core slider moves the whole curve as a fixed shape, so the lowest-headroom point is hit first and +50 becomes the most you can lift it. You still have the potential of the extra +100 at 1.093v, but you can't reach it with the slider alone. If you adjust the curve manually instead, you can leave the weak point at +50 and raise the other points to their individual limits to get maximum performance.

The downside of the curve is that we all obsess over the highest clock number, chase it, and forget that the ultimate aim is maximum FPS.

Also remember that performance in these tests, even the graphics score, is not just about how powerful the GPU is or how much GPU overclock you are running. It also relies on your CPU being able to feed the GPU instructions, the CPU getting the right voltage levels, CPU cooling, the efficiency of the PCIe bus, your system RAM, the quality of the power supplied by the PSU, etc.

If you want to try to increase your scores, at a physical level:

Try cleaning your CPU cooler - extra heat interferes with CPU Turbo and reduces performance.

Review your case cooling - maybe an extra fan can improve temps for both CPU and GPU.

Check that the CPU PLL is adjusted to avoid vdroop.

Try increasing your VCCIO voltage a little bit.

Some people have found that increasing the PCH voltage slightly improved scores, but others didn't see any difference.

You could also try overclocking your CPU and system RAM, if you have not done so already.

If the CPU is overclocked just with the multiplier at a 100MHz BCLK, experiment with a 103 BCLK. That will OC the CPU an extra ~100MHz and OC the RAM at the same time.
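The slider-vs-curve point can be made concrete with a toy model (all voltage points, stock clocks, and headroom numbers below are invented for illustration, not measured from any card):

```python
# Toy model: why per-point curve editing can beat the global core slider.
# The weakest point's headroom caps a global offset; manual curve editing
# lets every point run at its own limit.

# stock V/F curve: voltage (V) -> stock clock (MHz)  [made-up values]
stock_curve = {0.80: 1670, 0.90: 1810, 1.00: 1935, 1.093: 2025}

# assumed stability headroom (MHz) found by testing each point
headroom = {0.80: 50, 0.90: 75, 1.00: 120, 1.093: 150}

def slider_oc(curve, limits):
    """Global slider: every point moves by the same offset, so the
    weakest point caps the whole curve."""
    offset = min(limits.values())
    return {v: mhz + offset for v, mhz in curve.items()}

def curve_oc(curve, limits):
    """Manual curve: each point gets its own maximum stable offset."""
    return {v: mhz + limits[v] for v, mhz in curve.items()}

s = slider_oc(stock_curve, headroom)
c = curve_oc(stock_curve, headroom)
print(s[1.093])  # 2075 (slider capped by the weakest +50 point)
print(c[1.093])  # 2175 (per-point editing keeps the full +150 at 1.093 V)
```

In this mock data the slider leaves 100MHz on the table at the top voltage point, which is exactly the situation described above.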


----------



## gtbtk

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Good overclocking to maximize your performance has a learning curve, it is not really that difficult once you get the hang of it. You just need to think of the upper limit of the performance as like the ceiling in a house and the performance you can get is available in the airspace inside the building. In some places you have doorways and you can't get as high as you can in the open plan spaces that has higher ceilings.
> 
> If you experiment a bit and are, at times, willing to have a bit more fan noise, you have more performance you can get from the card if you need it.
> 
> There are some partner models of 1070 that come with factory overclocked memory. Yes most cards will overclock to +500 not not all will, some will even go to +800, that is the nature of silicon. it is always a bit of a lottery as to how much overclock you can get.
> 
> One has to ask, if you like silence, why did you buy a Founders edition card?
> 
> 
> 
> I understand what you're saying. I'm just wondering how much more there is to gain vs just setting the slider as high as it will go and remain stable, basically.
> 
> Might play around with the curve a bit and see if I can get a bit more out of it, but honestly I'm totally happy with the performance as it is with +150 core and +500 memory. 65% fan speed seems to be enough to keep temps under 80C and keep the core clock close to 2000 while still remaining very quiet. Should also mention I have fairly high ambient temps in my room for my Bearded Dragon.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I went with the Founders Edition for a number of reasons: (in no particular order)
> -Pushes hot air outside the case.
> -It was the best priced card at the time of purchase (black Friday).
> -I really like the look of them.
> -It's a 150w TDP card, so I figured the blower should be able to cool it sufficiently, even when OC'd, without needing to run at mach 5 rpm. lol (this has proven to be true).
> -My case has really good airflow (S340), with no obstructions (none of the case fans even turn on until the GPU hits about 55C and don't need to run faster than 50% to keep everything cool).
> -Still overclocks very well (I'm able to match the OC on a friend's Gigabyte 1070 G1).
> -I prefer the sound of blower card fan to axial fans (if I have to hear them at all).
Click to expand...

For me, using the curve is worth an extra 400-500 graphics score in Fire Strike, or an extra 4-5 fps in the Ultimate preset in Heaven. Your mileage may vary, of course.


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> As I have been saying in a number of posts, the GPU does not only operate at a single voltage point, It is doing things at multiple voltage levels in parallel. That is why using the slider to increase the whole curve with a kick up at the end will give you a better score than just pulling the 1.093 point up to 2164Mhz and leaving the rest at stock. The 980ti did not have the luxury of using a curve so by definition all points were getting an overclock and not just the highest voltage value.
> 
> Having access to the curve is both a blessing and a curse because not every voltage level can be increased by the same amount. some points may be limited to +50 and others may be OK if you increase them by +150. Using the Core slider will move the whole curve but because the lider is moving the fixed curve. +50 will be the highest that you can lift it up because the curve will hit the lowest point first, you still have the performance potential of the extra +100 at 1.093v but you cant use it if you have a fixed curve because of the slider. If you manually adjust the slider, you can leave the lower lying point at that +50 level and increase the other points up to their limits an get maximum performance.
> 
> The down side of the curve is that we are all obsessed with the highest number so we find how to get that and forget that the ultimate aim is maximum FPS.


For me the problem is that I get crashes and minor memory artifacts no matter what I do with the curve or slider, every time I attempt to reach a 21K Fire Strike graphics score.
So I think I'm limited by the silicon lottery. I think the luxury of Boost 3.0 doesn't help at all once you've already hit the silicon's wall.

How has your experience been overclocking your card toward 21-22K Fire Strike graphics scores? Have you encountered any crashing or artifacting so far?
Quote:


> Also remember that the performance in these tests, even in the graphics score, is not just how powerful the GPU is and how much GPU overclock you are running. It is also relying on your CPU being able to feed it instructions to process, the CPU getting the right voltage levels, CPU cooling, The efficiency of the PCIe bus, your system RAM, the quality of the power being supplied by the PSU, etc etc.
> 
> If you want to try and increase your scores, at a physical level:
> 
> Try cleaning your CPU cooler - extra heat interferes with CPU Turbo and reduces performance.
> 
> Review you case cooling, maybe an extra fan can improve temps for both CPU and GPU
> 
> Check that the CPU PLL is adjusted to avoid vdroop.
> 
> Try increasing your VCCIO voltage a little bit.
> 
> Some people have found that increasing the PCH voltage slightly improved scores but others didnt see any difference.
> 
> You could also try overclocking your CPU and system RAM as well if you have not done so already.
> 
> If the CPU is overclocked just with the multiplier with BCLK at 100Mhz, experiment and try a 103 BCLK. That will OC the CPU an extra 100Mhz and OC the Ram at the same time.


In my personal testing, CPU and RAM speed don't seem to affect the 3DMark graphics score:
at 4.5GHz + 1600MHz RAM = graphics score 20.8xx
at 4.7GHz + 2400MHz RAM = still around 20.8xx
Faster CPU and RAM speeds only increase the physics and combined scores.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> As I have been saying in a number of posts, the GPU does not only operate at a single voltage point, It is doing things at multiple voltage levels in parallel. That is why using the slider to increase the whole curve with a kick up at the end will give you a better score than just pulling the 1.093 point up to 2164Mhz and leaving the rest at stock. The 980ti did not have the luxury of using a curve so by definition all points were getting an overclock and not just the highest voltage value.
> 
> Having access to the curve is both a blessing and a curse because not every voltage level can be increased by the same amount. some points may be limited to +50 and others may be OK if you increase them by +150. Using the Core slider will move the whole curve but because the lider is moving the fixed curve. +50 will be the highest that you can lift it up because the curve will hit the lowest point first, you still have the performance potential of the extra +100 at 1.093v but you cant use it if you have a fixed curve because of the slider. If you manually adjust the slider, you can leave the lower lying point at that +50 level and increase the other points up to their limits an get maximum performance.
> 
> The down side of the curve is that we are all obsessed with the highest number so we find how to get that and forget that the ultimate aim is maximum FPS.
> 
> 
> 
> for me the problem is i got many crashes & minor memory artifacts no matter what i do to the curves/slider like you mentioned, everytime when attempt to get 21k Firestrike graphic scores.
> so i think i've been limited by silicon chip lottery. ithink The luxury Boost 3.0 didn't help at all once you already hit the wall limit by silicon chip.
> 
> Hows your experience so far overclocked your card to get 21-22k Firestrike gs scores ? do you encountered any crashing/artifacting so far?
> Quote:
> 
> 
> 
> Also remember that the performance in these tests, even in the graphics score, is not just how powerful the GPU is and how much GPU overclock you are running. It is also relying on your CPU being able to feed it instructions to process, the CPU getting the right voltage levels, CPU cooling, The efficiency of the PCIe bus, your system RAM, the quality of the power being supplied by the PSU, etc etc.
> 
> If you want to try and increase your scores, at a physical level:
> 
> Try cleaning your CPU cooler - extra heat interferes with CPU Turbo and reduces performance.
> 
> Review you case cooling, maybe an extra fan can improve temps for both CPU and GPU
> 
> Check that the CPU PLL is adjusted to avoid vdroop.
> 
> Try increasing your VCCIO voltage a little bit.
> 
> Some people have found that increasing the PCH voltage slightly improved scores but others didnt see any difference.
> 
> You could also try overclocking your CPU and system RAM as well if you have not done so already.
> 
> If the CPU is overclocked just with the multiplier with BCLK at 100Mhz, experiment and try a 103 BCLK. That will OC the CPU an extra 100Mhz and OC the Ram at the same time.
> 
> Click to expand...
> 
> In my personal test CPU & RAM speed seem doesn't affected 3dmark Graphic scores
> with @4,5GHZ + @1600 = graphic scores 20.8xx
> with @4,7GHZ + @2400 ram = graphic scores still score the same around 20.8xx
> faster CPU & RAM speed it only increase the physic score & combined scores.
Click to expand...

I am running an i7-2600. The best graphics score I have managed is about 20600, but I am also limited to PCIe 2.0, so I supposedly lose about 1% total performance compared to a PCIe 3.0 system. Maybe PCIe 3.0 is the point where that stops giving you an improvement?

After tweaking the VCCIO voltage and a good spring clean of my cooler, my physics score went from 10300 to 10500 with the better CPU temps, plus a smaller boost to graphics scores.


----------



## zipper17

@gtbtk I'm at PCIe 3.0.

Ivy Bridge & my mobo both support PCIe 3.0.

btw my goal is to get a *perfect minimum framerate of 60FPS* in all games at 1440P max settings,
but it looks like that will take a few more years.


----------



## ucode

Quote:


> Originally Posted by *zipper17*
> 
> but it looks need another years.


Maybe, if you're only going to keep playing the same games and not the newer ones that require even more power.


----------



## khanmein

Quote:


> Originally Posted by *zipper17*
> 
> @gtbk I'm at pcie 3.0
> 
> ivybridge & my mobo support pcie 3.0
> 
> btw my goal is to get a *perfect minimum framerates of 60FPS* in all games 1440P max settings.
> but it looks need another years.


I personally prefer High settings because I think Ultra is overkill!


----------



## cyronn

Finally got around to upgrading my old GTX 480. It lasted me six good years but was struggling on a couple of maps in BF1. Anyway, I decided to go with a Palit JetStream GTX 1070; I'm not going to buy a waterblock for it, so I'll just run it stock for now. Just have to wait for it to arrive now, grrr.


----------



## philhalo66

Quote:


> Originally Posted by *cyronn*
> 
> Finnaly got around to upgrade my old GTX 480. It lasted me 6 good years but was struggling on a couple of maps in bf1. Anyways I decided to go with a Palit Jetstream GTX 1070 as I am not going to buy a wb for it and just used it as stock for now. Just have to wait for it to come now grrr.


I upgraded from a 580. You're going to see a massive improvement. Enjoy!


----------



## Kreeker

I just got an EVGA 1070 FTW. I'm getting about 85 FPS in Overwatch on Ultra. Is this normal?

My CPU doesn't seem to want to OC past 43x (although I didn't try too hard).

I've heard faster memory can help?


----------



## iARDAs

Quote:


> Originally Posted by *Kreeker*
> 
> I just got a EVGA 1070 FTW. I'm getting about 85 FPS in Overwatch on Ultra. Is this normal?
> 
> My cpu doesn't seem to want to OC past 43x (although I didn't try too hard).
> 
> I've heard faster memory can help?


Check the resolution scale. It is probably rendering at a higher resolution.


----------



## Kreeker

Quote:


> Originally Posted by *iARDAs*
> 
> Check resolution scale. It is probably scaled to a higher resolution.


Oh I forgot to mention I'm running 1440p... Just got a new Dell 1440p 165 Hz monitor.


----------



## Luckael

Quote:


> Originally Posted by *philhalo66*
> 
> what country are you from? I been reading nothing but horror stories to the point im more than likely going to sell this and get an EVGA 1070 in a few months. first and last gigabyte card unless they prove me wrong.


I'm from the Philippines.







Quote:


> Originally Posted by *AliasOfMyself*
> 
> Sounds like it wasn't binned properly to me then, you going to get a replacement from gigabyte or try another brand?


Just got my replacement.. but it has Micron memory lol.. still, it runs very well with no crashing in The Division.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> @gtbk I'm at pcie 3.0
> 
> ivybridge & my mobo support pcie 3.0
> 
> btw my goal is to get a *perfect minimum framerates of 60FPS* in all games 1440P max settings.
> but it looks need another years.


I know that you are at 3.0; that is why I brought up the subject. I am only at 2.0, and I think I have hit a ceiling.

Given that the GPU runs in a multitasking environment that relies on another processor to feed it instructions, I doubt that you will ever see the "perfect" results you are looking for. That might be possible if we went back to DOS, or a 2016 equivalent of a single-tasking operating system.


----------



## gtbtk

Quote:


> Originally Posted by *cyronn*
> 
> Finnaly got around to upgrade my old GTX 480. It lasted me 6 good years but was struggling on a couple of maps in bf1. Anyways I decided to go with a Palit Jetstream GTX 1070 as I am not going to buy a wb for it and just used it as stock for now. Just have to wait for it to come now grrr.


You should enjoy the cooler room and the lower power bill.


----------



## gtbtk

Quote:


> Originally Posted by *Luckael*
> 
> Quote:
> 
> 
> 
> Originally Posted by *philhalo66*
> 
> what country are you from? I been reading nothing but horror stories to the point im more than likely going to sell this and get an EVGA 1070 in a few months. first and last gigabyte card unless they prove me wrong.
> 
> 
> 
> im from Philippines
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *AliasOfMyself*
> 
> Sounds like it wasn't binned properly to me then, you going to get a replacement from gigabyte or try another brand?
> 
> Click to expand...
> 
> just got my replacement.. but it has a micron memory lol.. but it runs very good and no crashing on the Division
Click to expand...

There is nothing wrong with Micron memory as long as you have the .50 BIOS running on it!


----------



## lanofsong

Hey GTX 1070 owners,

We are having our monthly Foldathon from Monday 19th - 21st 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

December Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - needs a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. Enter your folding name (mine is the same as my OCN name), your passkey, and the Team OCN number - 37726

later
lanofsong


----------



## Luckael

Quote:


> Originally Posted by *gtbtk*
> 
> There is Nothing wrong with micron memory as long as you have the .50 bios running on it!!!!!!


Noted! Thank you


----------



## PloniAlmoni

I'm having a problem flashing the BIOS on my Micron-memory Zotac 1070 FE. I run the command-prompt Windows x64 program and it never reboots; it blanks the screen briefly, then gives me an error.



Anyone have any suggestions on getting this to flash properly? It's the only program Zotac offered, and only via their tech support, so I can't simply use a DOS floppy and flash from that or the like.


----------



## Clocknut

Hey guys, how safe is it to run 1070s in SLI on a Seasonic Gold X-series 650W under the conditions below?

The system spec will be something like this:

AMD Ryzen 8-core (the CPU is unreleased, so let's assume an OCed 8-core Zen draws as much power as an OCed 6700K; both are 95W)
1x 120mm CPU fan + 1x 180mm fan, 3x 120mm fans
2x 7200rpm HDDs
Two 1070s running at stock
4 sticks of DDR4 RAM @ stock speed
A socket AM4 enthusiast motherboard

That's all my system has. Only the CPU is being OCed.

Reason for SLI: I managed to find a local retailer selling the Zotac 1070 Mini at a very cheap price.


----------



## shadowrain

Quote:


> Originally Posted by *PloniAlmoni*
> 
> I'm having a problem flashing my BIOS for my micron-memory Zotac 1070 FE. I run the command-prompt Windows x64 program and it never reboots, it makes the screen blank briefly, but it gives me an error.
> 
> 
> 
> Anyone have any suggestions on getting this to flash properly? It's the only program that Zotac offered, and only via their tech support, so I can't simply use a DOS floppy and flash from that or the like.


Zotac FireStorm has a BIOS flash utility. Might try that.
Quote:


> Originally Posted by *Clocknut*
> 
> Hey guys, how safe it is, to run 1070 in SLI on a Seasonic Gold X series 650w on conditions below?
> 
> System spec will be something like these.
> 
> AMD RyZen 8 core (CPU is unreleased, so let assuming an OCed 8 core Zen takes as much power an OCed 6700K both are 95w)
> 1x 120mm CPU fan + 1x 180mm fan, 3x 120mm fan
> 2x7200rpm HDD
> two 1070s will be running at stock.
> 4 sticks of DDR4 RAM @ stock speed
> an socket AM4 Enthusiast motherboard?
> 
> thats all my system has. It is only CPU being OCed
> 
> Reason to SLI = manage to find a local retailer selling Zotac 1070 mini at very cheap price.


AMD has historically made high-wattage CPUs. If Intel's 6-cores are 100+ W TDP, expect Ryzen 8-cores to be at least 120W TDP.

As for your question: it would probably work, but you'd be at or above the 650W limit of the PSU, which could cause it to fail fast. Source: psucalculator.com


----------



## PloniAlmoni

Quote:


> Originally Posted by *shadowrain*
> 
> zotac firestorm has a bios flash utility. Might try that.


It only accepts .ROM files, tech support gave me an EXE...


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Clocknut*
> 
> Hey guys, how safe it is, to run 1070 in SLI on a Seasonic Gold X series 650w on conditions below?
> 
> System spec will be something like these.
> 
> AMD RyZen 8 core (CPU is unreleased, so let assuming an OCed 8 core Zen takes as much power an OCed 6700K both are 95w)
> 1x 120mm CPU fan + 1x 180mm fan, 3x 120mm fan
> 2x7200rpm HDD
> two 1070s will be running at stock.
> 4 sticks of DDR4 RAM @ stock speed
> an socket AM4 Enthusiast motherboard?
> 
> thats all my system has. It is only CPU being OCed
> 
> Reason to SLI = manage to find a local retailer selling Zotac 1070 mini at very cheap price.


I'd say no problem, even with overclocks on the 1070s.


----------



## allsop

Hello all! Figured I'd join up as I just purchased my ASUS Dual 1070 OC 8GB. It has Micron memory, which I read up on, so I downloaded the ASUS vBIOS update. But when I run the updater, it hangs on a black screen at the end. I waited a minute or two, but the black screen stayed. I think the entire computer shut off at that point; my monitor (on DVI) turned itself off, and when I turned it back on it reported no input and shut off again.

That's when I had to hard reboot, and my mobo beeped a few times. Once it booted back up it seemed to act normally, but my 1070's BIOS is still 86.04.26.00.80 (which I believe is the older version with the memory issues).

Am I doing something wrong? Could it be an outdated motherboard BIOS, or not having installed Windows 7 updates for a while? I've never updated my mobo BIOS. My motherboard is the "ASRock Fatal1ty Gaming Fatal1ty Z97 Killer LGA 1150" (http://www.newegg.com/Product/Product.aspx?Item=N82E16813157501). I'm not sure the rest of my specs matter here, so I'll just link them below.

https://ca.pcpartpicker.com/list/r4pDQV


----------



## icold

I can run 2126MHz stable and 4363MHz on memory; that's the max.


----------



## zipper17

Quote:


> Originally Posted by *Clocknut*
> 
> Hey guys, how safe it is, to run 1070 in SLI on a Seasonic Gold X series 650w on conditions below?
> 
> System spec will be something like these.
> 
> AMD RyZen 8 core (CPU is unreleased, so let assuming an OCed 8 core Zen takes as much power an OCed 6700K both are 95w)
> 1x 120mm CPU fan + 1x 180mm fan, 3x 120mm fan
> 2x7200rpm HDD
> two 1070s will be running at stock.
> 4 sticks of DDR4 RAM @ stock speed
> an socket AM4 Enthusiast motherboard?
> 
> thats all my system has. It is only CPU being OCed
> 
> Reason to SLI = manage to find a local retailer selling Zotac 1070 mini at very cheap price.


What are your current specs?

If you want software that reads CPU/GPU power, try these:

Download Intel Power Gadget, AIDA64, or Core Temp to see your real-time CPU package power.
Download HWiNFO64 to see your GPU power.

For example:
My 3570K @ 4.7GHz @ 1.320V draws 50-55W while playing Witcher 3, Hitman, or the 3DMark Fire Strike physics test at 90-100% usage.
My 1070, overclocked, draws roughly 200-220W at 90-100% usage.

If you want more accurate numbers, you'd need dedicated equipment (a multimeter, clamp meter, etc.).
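
For a rough PSU sanity check, the kind of back-of-the-envelope math discussed in this thread can be sketched in a few lines of Python. All wattages below are illustrative assumptions pulled from typical figures quoted here, not measurements:

```python
# Rough PSU headroom estimate for a hypothetical 1070 SLI build.
# Every wattage here is an assumption, not a measured value.
component_draw = {
    "cpu_oc": 140,        # assumed OCed 8-core at ~95 W TDP plus OC headroom
    "gpu_1070_a": 170,    # assumed per-card board power limit
    "gpu_1070_b": 170,
    "motherboard_ram": 50,
    "hdds_fans": 30,
}

psu_watts = 650
total = sum(component_draw.values())
headroom = psu_watts - total

print(f"estimated load: {total} W, headroom: {headroom} W")
```

Software sensor readings (HWiNFO64 etc.) are a better starting point than guesses like these, and a wall-plug meter is better still.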


----------



## MEC-777

Quote:


> Originally Posted by *Clocknut*
> 
> Hey guys, how safe it is, to run 1070 in SLI on a Seasonic Gold X series 650w on conditions below?
> 
> System spec will be something like these.
> 
> AMD RyZen 8 core (CPU is unreleased, so let assuming an OCed 8 core Zen takes as much power an OCed 6700K both are 95w)
> 1x 120mm CPU fan + 1x 180mm fan, 3x 120mm fan
> 2x7200rpm HDD
> two 1070s will be running at stock.
> 4 sticks of DDR4 RAM @ stock speed
> an socket AM4 Enthusiast motherboard?
> 
> thats all my system has. It is only CPU being OCed
> 
> Reason to SLI = manage to find a local retailer selling Zotac 1070 mini at very cheap price.


Should be fine.

Quote:


> Originally Posted by *shadowrain*
> 
> zotac firestorm has a bios flash utility. Might try that.
> amd's has historically been a high wattage cpu. If 6 core intels are 100plus tdp, expect ryzen 8 cores to be 120tdp at least.
> 
> As for your question, probably yes but will be at the above 650watts over wattage limit of the psu, causing it to fail fast. Source, psucalculator.com


I guess you haven't been following the news on AMD's new CPU platform... No, they won't be 120W, and you can't extrapolate from their old CPUs. The new Zen architecture is completely different from the old Piledriver architecture, and it's on a much smaller process node (14nm). The top-of-the-line enthusiast 8-core Ryzen CPU will be 95W TDP. This has already been announced, and it competes with Intel's 140W TDP 6900K.

All that said, no, Clocknut shouldn't be at the limit of a 650W PSU (especially a Seasonic Gold unit) with that setup. It will have about 100W of headroom, give or take, which is totally fine.


----------



## zipper17

Quote:


> Originally Posted by *icold*
> 
> I can 2126mhz stable and 4363MHZ on memory, is the max


On Pascal, a higher core clock doesn't always mean better performance than a lower one.

I can run my card at a 2152-2101MHz core clock (per GPU-Z and the MSI Afterburner sensor readings the whole time) with certain curves, but it gets a lower Fire Strike graphics score than when running at 2076MHz with a decent curve. That's down to the complexity of GPU Boost 3.0 curves.

GPU Boost 3.0 can show a camouflage frequency, lol: it reports 2151MHz but the actual performance is lower.

http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/6540#post_25703006
http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/6560#post_25704868


----------



## shadowrain

Quote:


> Originally Posted by *MEC-777*
> 
> Should be fine.
> I guess you haven't been following the news on AMD's new CPU platform... No, they won't be 120w and you cannot assume based on their old CPUs. Their new Zen CPU architecture is 100% completely different and new from their old pile-driver architecture. It's also on a new and much smaller process node (14nm). The top-of-the-line enthusiast 8 core Ryzen CPU will be 95w TDP. This has already been announced and it competes with Intel's 140w TDP 6900k.
> 
> All that being said, no, Clocknut shouldn't be on the limit of a 650w PSU (especially a Seasonic gold unit) with that setup. It will have about 100w of headroom +/- which is totally fine.


The RX 480 was also 14nm and a "new architecture," but look what happened with its PCIe power draw (it exceeded the 75W limit of the PCIe slot) and its 150W TDP versus the 16nm 1060 at 120W. The RX 480 keynote was the last time I'll ever get hyped for AMD. And I was lucky my store denied me an RX 480 a few hours before the review NDA lifted; I got my 1070 instead and everything is smooth and silky.

Let's also not forget that even if Ryzen really is 95W TDP, the AM platforms and chipsets have historically been power hogs.

Don't get me wrong, I want Ryzen to be a success and I might get it, but no, it is not the Holy Grail, not until real people get their hands on it for scrutiny.

TL;DR: wait for reviews and benchmarks before assuming what they say in keynotes and marketing is correct. Always be guided by a company's previous products, not the dream product they want you to preorder without reviews. And for 10+ years, AMD and power draw never got along. No amount of technical mumbo jumbo can change that; only Ryzen can, IF (and it's a big IF) it delivers.
Quote:


> Originally Posted by *PloniAlmoni*
> 
> It only accepts .ROM files, tech support gave me an EXE...


check if techpowerup has that bios rom uploaded, maybe someone did. Other than that, maybe someone here who got ur bios working can send the rom to you by saving it from firestorm too.


----------



## MEC-777

Quote:


> Originally Posted by *shadowrain*
> 
> The RX 480 was also 14nm and "new architecture" but look what happened with their PCIE draw wattage(was over the 75watts max of PCIE slots)and the tdp of 150w vs the 16nm 1060 at 120w. The RX 480 keynote was the last time I'll ever get hyped for AMD. And I was lucky to be denied getting the RX 480 from my store a few hours before the reviews NDA lifting. Got my 1070 and everything is smooth and silky.
> 
> Let's also not forget the fact that even if Ryzen is really at 95w TDP, the AM# platform and chipset is also historically a power hog.
> 
> Don't get me wrong, I want Ryzen to be a success and I might get it, but no it is not the Holy Grail, not until real people gets their hands on it for scrutiny.
> 
> TLR- wait for reviews and benchmarks before you assume what they say in keynotes and marketting are correct. And always be guided by a company's previous products, not the dream product they want you to preorder without reviews. And in the past AMD and Power Draw never got along, for 10+ years. No amount of technical mumbo jumbo can change that fact. Only Ryzen can, IF and a Big *IF* it can.


The PCIe power draw issue was partly a BIOS thing and partly how AMD designed the card to split its power draw. It has since been resolved.

You can't judge it historically, because it's a completely new architecture and platform. The AM4 chipsets are also new.

Yes, I agree we should all wait for proper reviews and benchmarks before buying anything new. But don't assume that hardware which is now completely different will behave the same as before. There is no relation.


----------



## allsop

I was able to successfully update my vBIOS with the Micron fix. However, I've noticed my idle temps are a few degrees higher, or more. Is that normal?


----------



## allsop

Is it safe to go over the 112% power limit by quite a bit? HWiNFO64 shows the max as 130% power. Could that damage the card? I've had a few minor green flash/flicker artifacts; lowering the clocks a bit doesn't stop them, but it does make them less frequent. Could exceeding the power limit like that cause the green flicker artifacts on my screen? I have the ASUS Dual 1070, overclocked.

Also, what is the difference between "Total GPU Power [% of TDP]" and "Total GPU Power (normalized) [% of TDP]"?

Both my CPU and GPU usage are quite high: GPU is usually very close to 100%, with the CPU around 80% or so, if that matters.

My motherboard is the ASRock Fatal1ty Z97 Killer LGA 1150 and my CPU is an i7-4790K overclocked to 4.7GHz. From what people tell me, my power supply isn't great: the EVGA SuperNOVA 650 G1 120-G1-0650-XR 80+ Gold 650W.


----------



## gtbtk

Quote:


> Originally Posted by *PloniAlmoni*
> 
> I'm having a problem flashing my BIOS for my micron-memory Zotac 1070 FE. I run the command-prompt Windows x64 program and it never reboots, it makes the screen blank briefly, but it gives me an error.
> 
> 
> 
> Anyone have any suggestions on getting this to flash properly? It's the only program that Zotac offered, and only via their tech support, so I can't simply use a DOS floppy and flash from that or the like.


You could try disabling the video card in Device Manager before running the update.

If it completes successfully, re-enable the card before you reboot the PC.


----------



## ucode

Quote:


> Originally Posted by *allsop*
> 
> Is it safe to be going over the 112% power limit by quite a bit?


Welcome to OCN. If it's inside the manufacturer's spec, it should be fine barring defects. Percentages can be misleading; better to use watts where available. For instance, my own card can report above 200%, but it's within its spec and well within what it can handle.

Green flickers are usually a sign of the memory clock being too high and/or the memory chip(s) running too hot.


----------



## gtbtk

Quote:


> Originally Posted by *Clocknut*
> 
> Hey guys, how safe it is, to run 1070 in SLI on a Seasonic Gold X series 650w on conditions below?
> 
> System spec will be something like these.
> 
> AMD RyZen 8 core (CPU is unreleased, so let assuming an OCed 8 core Zen takes as much power an OCed 6700K both are 95w)
> 1x 120mm CPU fan + 1x 180mm fan, 3x 120mm fan
> 2x7200rpm HDD
> two 1070s will be running at stock.
> 4 sticks of DDR4 RAM @ stock speed
> an socket AM4 Enthusiast motherboard?
> 
> thats all my system has. It is only CPU being OCed
> 
> Reason to SLI = manage to find a local retailer selling Zotac 1070 mini at very cheap price.


If Ryzen is indeed a 95W chip, overclocked it might end up pulling maybe 140W. The Zotac Mini is specced at 170W maximum power draw, so you're looking at a total maximum draw of about 480W. The 650W should work, but you may have to deal with fan noise, and it won't be running at peak efficiency.


----------



## ucode

Quote:


> Originally Posted by *PloniAlmoni*
> 
> I'm having a problem flashing my BIOS


If you would provide a link maybe someone would extract the ROM image for you.


----------



## gtbtk

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *shadowrain*
> 
> The RX 480 was also 14nm and "new architecture" but look what happened with their PCIE draw wattage(was over the 75watts max of PCIE slots)and the tdp of 150w vs the 16nm 1060 at 120w. The RX 480 keynote was the last time I'll ever get hyped for AMD. And I was lucky to be denied getting the RX 480 from my store a few hours before the reviews NDA lifting. Got my 1070 and everything is smooth and silky.
> 
> Let's also not forget the fact that even if Ryzen is really at 95w TDP, the AM# platform and chipset is also historically a power hog.
> 
> Don't get me wrong, I want Ryzen to be a success and I might get it, but no it is not the Holy Grail, not until real people gets their hands on it for scrutiny.
> 
> TLR- wait for reviews and benchmarks before you assume what they say in keynotes and marketting are correct. And always be guided by a company's previous products, not the dream product they want you to preorder without reviews. And in the past AMD and Power Draw never got along, for 10+ years. No amount of technical mumbo jumbo can change that fact. Only Ryzen can, IF and a Big *IF* it can.
> 
> 
> 
> The PCIe power draw issue was partially a BIOS thing and how AMD designed the card to split the power draw. This issue has since been resolved.
> 
> You can't look and judge historically because it's a 100% completely new and different architecture and platform. The AM4 chipsets are also new.
> 
> Yes, I agree we should all wait for proper reviews and benchmarks before buying anything new. But don't assume based on previous hardware, which is now completely different, that it will be the same as before. There is no relation.
Click to expand...

Ryzen is effectively an SoC from what I have read, so the chipset is mostly built into the chip.

I agree that we can't judge AMD by what came before; this is an entirely new architecture. Besides, AMD hasn't always made bad CPUs: the original Athlon and Athlon 64 chips beat anything Intel was selling at the time. It wasn't until the Core Duo/Core 2 Duo chips came along that Intel overtook them and headed to the position they hold now.

The current demo chips are claimed to be 95W at 3.4GHz. I didn't see any mention of power draw with Turbo enabled, or whether the 95W includes headroom for turbo clocks. Zen clock speeds have increased over time, and judging from their demos so far they are desperate to take on Broadwell-E. If they raised power draw to, say, 110-120W, they'd have room to push clock speeds further out of the box while still being less power hungry than Intel. But who knows what will actually happen until the chips are formally announced.


----------



## gtbtk

Quote:


> Originally Posted by *allsop*
> 
> I was able to successfully update my vbios with the micron fix. I notice a change in idle temps however, a few degrees higher or more. Is that normal?


Idle temps up to 60°C are normal with the silent-fan option Pascal cards have. You can set your own fan curve in Afterburner and the card should idle at around 35-40°C if you want. The cooler you keep the card, the better performance you'll get, so a custom curve is actually a good idea if you are overclocking.


----------



## PloniAlmoni

Quote:


> Originally Posted by *gtbtk*
> 
> You could try disabling the video card in device manager before trying to do the update.
> 
> If it completes sucessfully, re-enable the card before you reboot the PC


I'm using a CPU with no integrated graphics; would that be a problem?


----------



## gtbtk

No, the card will just switch to basic VGA mode.


----------



## allsop

Quote:


> Originally Posted by *ucode*
> 
> Welcome to OCN. If it's inside the manufacturers spec it should be fine barring defects. Percentages can be misleading, better to use Watts where available. For instance my own card can report above 200% but it's within it's spec and well within what it can handle.
> 
> Green flickers are usually a sign of the memory clock being too high and/or memory chip(s) too hot.


Thank you!

The green flickers are quite rare; is it fine to just live with them happening once in a while? Also, my card's specs list 112% as the max power limit, but it keeps going over that by quite a bit.

What is the best stress test for overclocked GPU memory stability? I ran the FurryE GPU memory burner in EVGA OC Scanner for a few minutes and it didn't detect any artifacts. My card isn't even EVGA, so I'm not sure that's an accurate test for an overclocked ASUS 1070.

Is there a program that can run in the background while I play games and detect/count errors or artifacts?


----------



## philhalo66

Against my better judgement I flashed my card with the F2 BIOS, and now the damn thing won't drop to idle clocks at all -_- GG Gigabyte.

*Edit: anyone got the F1 BIOS for the G1 Gaming 1070?*


----------



## msigtx760tf4

old 3570K + 1070

http://www.3dmark.com/3dm/16796081


----------



## cyronn

Finally I have my GTX 1070; man, it's been a mission. Basically I ordered it and had it delivered to my sister's work, as I wasn't at home. So I thought it would be easy, but no: I got a message saying it was delivered last Tuesday at 15:36 (signed for by Nelson). I was like, great, I can pick it up on the way home. So I arrived at my sister's work (she was ill and wasn't in) and asked if there was a parcel for me; the staff said no, nothing had arrived...

I'm like, balls, OK, I'll contact the delivery company and see what's going on. The delivery company said it would take 5 days to investigate, which made me mad. I called the company I bought it from; they couldn't do anything until the delivery company finished its investigation. A few frustrated emails went back and forth. My sister was back at work today and said, oh, your parcel is here (she works for a big company that helps all kinds of animals, fixing broken bones etc.). The new guy hadn't looked at the name.

Sorry for the long story, but man, a week later and I have it.

http://s1355.photobucket.com/user/cyronn/media/gtx1070_zpsimsoeaa4.jpg.html

Also worked out how to set Afterburner to see all 8GB of VRAM. Took me a while, after a BIOS flash and a few driver installs, lol.

I'm super happy now with my GTX 1070 Jetstream. So much better than my 6-year-old GTX 480, which was watercooled. I don't need to watercool the GTX 1070 though; it runs super cool.


----------



## rfarmer

Quote:


> Originally Posted by *msigtx760tf4*
> 
> old 3570K + 1070
> 
> http://www.3dmark.com/3dm/16796081


Nice score on the 1070, but the 5.1 GHz OC on the i5 is what impresses me. Damn son.


----------



## Forceman

Quote:


> Originally Posted by *philhalo66*
> 
> Against my better judgement i flashed my card with the F2 bios and now the damn thing wont go into idle clocks at all -_- GG gigabyte.
> 
> *Edit anyone got the F1 bios for G1 gaming 1070?*


Why not just grab it from the Techpowerup database?

https://www.techpowerup.com/vgabios/183934/gigabyte-gtx1070-8192-160608


----------



## zipper17

Quote:


> Originally Posted by *allsop*
> 
> Thank you!
> 
> The green flickers are quite rare, is it fine to just live with it happening once in a while? And my card specs list 112% as max power limit but it keeps going over that by quite a bit.
> 
> *What is the best stress test for the gpu memory overclocked stability?* I ran the Furry E GPU memory burner in the EVGA OC scanner for a few minutes but it did not detect any artifacts? My card isn't even EVGA so I'm not sure if that's even an accurate test for asus 1070 overclocked cards.
> 
> Is there a program that can run in the background while I play games that will detect and count errors/artifacts?


I use the 3DMark Fire Strike Extreme stress test; with that I can see my memory overclocking limit is +600.
Above +650 or so I get minor memory artifacting, such as green flashing sparks. My memory is Samsung, though.
Quote:


> Originally Posted by *msigtx760tf4*
> 
> old 3570K + 1070
> 
> http://www.3dmark.com/3dm/16796081


Nice CPU OC,
and a nice sample chip on the 1070.
My card can't even break 21K while staying absolutely stable.


----------



## Duvar

What do you think about my clock speeds? 



And here a little bit better score due to mem oc http://www.3dmark.com/3dm/16766701?
Heaven score too at the bottom http://extreme.pcgameshardware.de/grafikkarten/447445-video-2303-mhz-gtx-1070-boostclock-seite-37-optimierungswahn-gtx-1070-980-ti-bios-mod-resultate-videos-etc-38.html#post8601375


----------



## Curseair

Hi guys, anyone know why my fans keep hitting 700 RPM then dropping to 0 again? It's an MSI Gaming Z 1070; is something wrong with my fan curve? (Note my fan speed in the red trace.)


----------



## BulletSponge

Quote:


> Originally Posted by *Curseair*
> 
> Hi guys, Anyone know why my fans keep hitting 700 then goes to 0 again? MSI Gaming Z 1070 is something wrong with my fan curve? (Note my fan speed in red pattern)


I can only say that you have the weirdest custom fan curve I have ever seen, no offense intended. Here's mine.



The constant on-off-on-off you are experiencing can't be good for fan life. I would delete your current curve and start over with 30°C as the starting point.


----------



## Curseair

Quote:


> Originally Posted by *BulletSponge*
> 
> I can only say that you have the weirdest custom fan curve I have ever seen, no offense intended. Here's mine.
> 
> 
> 
> The constant on-off-on-off you are experiencing now cannot possibly be good for fan life. Delete your current curve and start over at 30C as a starting point is what I would try.


I had the same fan curve on an EVGA FTW and never had the issue I'm having now.


----------



## GeneO

Quote:


> Originally Posted by *Curseair*
> 
> Hi guys, Anyone know why my fans keep hitting 700 then goes to 0 again? MSI Gaming Z 1070 is something wrong with my fan curve? (Note my fan speed in red pattern)


You are hitting the lowest % the fan can run at (fans can't run continuously all the way down to 0V and 0 RPM). It drops to the lowest RPM it can sustain, then stops; the GPU heats back up, the fan voltage ramps back up to the point where the fan can restart, cools the GPU down again, and the cycle repeats.

Just raise the lowest point to about 20% or so. I see the same behavior on mine if I use a curve like yours.

Here is a curve I use that avoids that pitfall: it is either zero, or above 20%.
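
A fan curve shaped like this can be sketched in a few lines: any nonzero target speed is clamped to the fan's minimum sustainable duty cycle, so the controller never asks for a speed the motor can't hold. The 20% floor and the 45-80°C ramp are assumptions for illustration; the real floor varies by card:

```python
MIN_DUTY = 20  # percent; assumed minimum the fan can sustain

def fan_duty(temp_c: float) -> int:
    """Fan duty (%) for a GPU temperature: off below 45 C, then a
    linear ramp from MIN_DUTY at 45 C to 100% at 80 C. Never returns
    a nonzero value below MIN_DUTY, avoiding on/off surging."""
    if temp_c < 45:
        return 0
    if temp_c >= 80:
        return 100
    duty = MIN_DUTY + (temp_c - 45) * (100 - MIN_DUTY) / (80 - 45)
    return max(MIN_DUTY, round(duty))
```

The key design point is the jump from 0 straight to MIN_DUTY: there is deliberately no value in between, which is exactly what the surging curve above was missing.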


----------



## zipper17

Quote:


> Originally Posted by *Duvar*
> 
> What do you think about my clocksspeeds?
> 
> 
> 
> And here a little bit better score due to mem oc http://www.3dmark.com/3dm/16766701?
> Heaven score too at the bottom http://extreme.pcgameshardware.de/grafikkarten/447445-video-2303-mhz-gtx-1070-boostclock-seite-37-optimierungswahn-gtx-1070-980-ti-bios-mod-resultate-videos-etc-38.html#post8601375


What are your Fire Strike basic and Fire Strike Extreme graphics scores, btw?

So you used the curve method and only raised the highest voltage point (1.093V) to 2300MHz?

In my experience, if you only raise the highest voltage point and not the rest, performance will be lower: even though it shows 2300MHz, it's not delivering real 2300MHz performance, CMIIW.

http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/6540#post_25703006
http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/6560#post_25704868

Try raising the core clock to 2300MHz using just the offset slider.
If you can run that absolutely stable (no crashes, no artifacting, etc.), you probably have a golden sample of a chip.


----------



## icold

I used this specs:

I7 [email protected]
P8Z77-V LX
GTX1070 ROG: 2126mhz/4363mhz
16GB corsair [email protected]

I made the terrible mistake of buying a CPU with a locked multiplier in 2013; I envy that 5.1GHz clock :thumb:


----------



## zipper17

My fan curve is probably weirder than anyone's, lol. When the GPU hits 50°C the fan ramps to 100% speed. This is best for heavy games in my case, but in light gaming the GPU stays under 50°C, so it isn't at 100% fan speed all the time. Different games, different loads. Overall this setting suits me.

I don't mind 100%; my number-one concern is temperature, and why not use the fan's full potential since you bought the whole card anyway? At 100% the dual fans max out at only 2000RPM, which isn't even that loud in my case. The stock BIOS fan is unstoppable (the lowest is 50%, 1030RPM), but Galax recently added a BIOS update for the EXOC model that lets the fan stop.

Btw, you guys should try temperature hysteresis; it's an important thing to add, especially in games like Witcher 3.
If you play a lot of Witcher 3, you'll notice that every time you go to the pause screen, inventory, map, etc.,
the GPU load, GPU heat, and the fan all immediately drop. With fan temperature hysteresis you keep your fan speed stable.

With temperature hysteresis at 11°C: 50 - 11 = 39°C.
So the fan only turns back down once the GPU drops to 39°C. CMIIW.
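
The hysteresis behavior described above can be sketched as a tiny state machine: the fan latches to full speed once the GPU reaches the trigger temperature, and only releases after it cools `hysteresis` degrees below it (50°C trigger and 11°C hysteresis as above; the 40% idle speed is an assumed placeholder):

```python
class HysteresisFan:
    """Latching fan control: full speed above `trigger_c`, and it stays
    there until the GPU cools to `trigger_c - hysteresis_c`."""

    def __init__(self, trigger_c: float = 50, hysteresis_c: float = 11):
        self.trigger = trigger_c
        self.release = trigger_c - hysteresis_c  # 50 - 11 = 39 C
        self.full_speed = False

    def update(self, temp_c: float) -> int:
        if temp_c >= self.trigger:
            self.full_speed = True
        elif temp_c <= self.release:
            self.full_speed = False
        # between release and trigger, keep the previous state (the
        # hysteresis band that stops the fan bouncing in menus)
        return 100 if self.full_speed else 40  # assumed 40% idle speed
```

Dipping into a pause menu only drops the GPU a few degrees, so the fan stays latched at 100% instead of spinning down and back up.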


----------



## MEC-777

Quote:


> Originally Posted by *Curseair*
> 
> Hi guys, Anyone know why my fans keep hitting 700 then goes to 0 again? MSI Gaming Z 1070 is something wrong with my fan curve? (Note my fan speed in red pattern)


This is because the fans can't run at 15%. There's a minimum % they will run at (usually around 25%). What you need to do is set the fan speed to 0% below 50 degrees (or wherever you want) and then have them come on at 30% or more to "kick" them on; otherwise they'll surge on and off like that because they aren't getting enough power to keep spinning continuously. If you look up how electric motors work, you'll see why.

It has nothing to do with temps. The MSI Gaming Z was designed to run fully passive (fans off) below about 50 degrees out of the box. My 980 Strix was the same way.


----------



## GeneO

Yeah, what I already said.


----------



## Duvar

Quote:


> Originally Posted by *zipper17*
> 
> what is your Firestriike Basic & Firestrike Extrreme Graphic scores btw?
> 
> So you use curve method and only increase the highest Voltage point at 1.093 V into 2300mhz?
> 
> For my experiences if you only increase the highest voltage point and the rest are not, the performances will be lower, even though it's showing running at 2300mhz
> its not running a real 2300mhz's performances cmiiw.
> 
> http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/6540#post_25703006
> http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/6560#post_25704868
> 
> Try increasing coreclock using offset slider only into the 2300mhz
> if you can running it with absolute stable (no crash, no artifacting,etc) you probably has super sample on the chip.


Up to 1V I raised each voltage step to game-stable clocks; beyond that I just wanted to see how far I could push at max voltage. I reached 2330MHz, but that crashed during the second graphics test, so I ended up at 2303MHz. It isn't fully optimized yet, only up to 1V, and I also still need to tune the in-between points a little (0.825V, 0.875V, 0.925V, etc.).

You can check out the thread (the PC Games Hardware link); I've posted some results there, but I only got the good card a couple of days ago, so it's fine to just check the last 2 pages.
You can try Fire Strike Ultra; I scored a 5121 graphics score, but that wasn't at a 2303MHz core clock, because 122% power isn't enough if I raise both mem and core (2228MHz was doable with max mem clock, though).

I scored 2744 in Heaven; it would be nice if some of you could test that too (I think that was at 2190MHz and +775MHz?).
I scored a 21.5K graphics score in basic Fire Strike, but only at 2177MHz on the core; it's harder to go high in basic.
For Fire Strike, mem clock is bread and butter; mine won't go higher than ~+805 (Micron).
Overall I'm happy with the card, and I won't run it at that voltage and those high clocks. I made 5 rock-stable profiles in MSI AB.

All profiles use only +400 mem clock:

1. 0.80V at 1797MHz
2. 0.85V at 1911MHz
3. 0.90V at 2000MHz
4. 0.95V at 2063MHz
5. 1.00V at 2100MHz

Profile 5, for example, also runs Fire Strike at 2126MHz and 1V, but with slight artifacting.
IMHO profile 2 is the best and it's what I use most of the time; it scores over 4.5K graphics in Fire Strike Ultra, and profile 3, for example, scores 4.6K+.
I like the 0.85V profile most: very low power consumption during gaming, nice performance, and very good temps.
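
As a rough illustration of why the lower-voltage profiles save so much power: dynamic power scales roughly with f·V², so a quick sketch can compare the profiles above relative to the 1.00V one. This ignores leakage and board power, so treat the percentages as ballpark only:

```python
# Relative dynamic power of the undervolt profiles above, using the
# first-order rule P_dyn ~ f * V^2 (leakage and board power ignored).
profiles = {
    "0.80 V / 1797 MHz": (0.80, 1797),
    "0.85 V / 1911 MHz": (0.85, 1911),
    "0.90 V / 2000 MHz": (0.90, 2000),
    "0.95 V / 2063 MHz": (0.95, 2063),
    "1.00 V / 2100 MHz": (1.00, 2100),
}

ref_v, ref_f = profiles["1.00 V / 2100 MHz"]
for name, (v, f) in profiles.items():
    rel = (f * v * v) / (ref_f * ref_v * ref_v)
    print(f"{name}: ~{rel:.0%} of the 1.00 V profile's dynamic power")
```

By this estimate the 0.85V profile draws roughly two thirds of the 1.00V profile's dynamic power while keeping ~91% of the clock, which matches the "best profile" impression above.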


----------



## philhalo66

Quote:


> Originally Posted by *Forceman*
> 
> Why not just grab it from the Techpowerup database?
> 
> https://www.techpowerup.com/vgabios/183934/gigabyte-gtx1070-8192-160608


How do you even tell whether it's the F1 BIOS?


----------



## allsop

Quote:


> Originally Posted by *zipper17*
> 
> Try increasing coreclock using offset slider only into the 2300mhz
> if you can running it with absolute stable (no crash, no artifacting,etc) you probably has super sample on the chip.


What's the difference with turning up an offset slider? I use ASUS GPU Tweak II and it only has the normal core **** slider; it doesn't say the word offset anywhere. Can you not use both?


----------



## gtbtk

Quote:


> Originally Posted by *philhalo66*
> 
> Against my better judgement i flashed my card with the F2 bios and now the damn thing wont go into idle clocks at all -_- GG gigabyte.
> 
> *Edit anyone got the F1 bios for G1 gaming 1070?*


Take a look in the TechPowerUp VGA BIOS database; you'll be able to download it there, along with nvflash to flash the card manually.


----------



## gtbtk

Quote:


> Originally Posted by *allsop*
> 
> Quote:
> 
> 
> 
> Originally Posted by *zipper17*
> 
> Try increasing coreclock using offset slider only into the 2300mhz
> if you can running it with absolute stable (no crash, no artifacting,etc) you probably has super sample on the chip.
> 
> 
> 
> Whats the difference between turning up an offset slider? I use the asus gpu tweak 2 and it only has the normal core **** slider, it doesn't say the word offset anywhere. Can you not use both of them?

The sliders on GPU Tweak and Afterburner adjust offset values, not absolute values; the offsets are what get sent to the driver to adjust the card's settings. What they display is just designed to make the program user-friendly.


----------



## gtbtk

Quote:


> Originally Posted by *msigtx760tf4*
> 
> old 3570K + 1070
> 
> http://www.3dmark.com/3dm/16796081


you have a good card there


----------



## philhalo66

Quote:


> Originally Posted by *gtbtk*
> 
> take a look in the techpowerup vga bios data base. you will be able to download it there along with nvflash to manually flash the card


I actually got the F1 BIOS from Gigabyte: I sent an email last night and this morning they sent it to me, so I'm back up 100%. TechPowerUp doesn't list whether it's F1 or not.


----------



## Avendor

Quote:


> Originally Posted by *philhalo66*
> 
> I actually got the F1 BIOS from Gigabyte i sent an email last night and this morning they sent it to me so im back up 100%. Techpowerup doesn't list if its F1 or not.


I guess F1 is outdated. Which memory type do you have? Get the latest BIOS for the Gigabyte GTX 1070 G1; there is one for each memory type, Micron or Samsung, and either will serve your needs: http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios
To check your version, go into Xtreme Gaming Engine (third-party software) -> Settings (top right corner) -> VGA.
The BIOS version code can also be found in GPU-Z.
Here are mine:


----------



## philhalo66

Quote:


> Originally Posted by *Avendor*
> 
> I guess F1 it's outdated. Which memory type do you have? get latest BIOS for Gigabyte GTX 1070 G1, either one will serve your needs Micron or Samsung memory: http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios
> To check your version, go into Xtreme Gaming Engine (third-party software) - Setting (top right corner) - VGA
> Also the BIOS Version Code can be found in GPU-Z
> Here are mines:


The latest one is a mess for me; that's why I wanted the F1. With F2 I got stuck in 3D clocks, so the card ran hot all the time, and for some reason performance was a solid 5-7% lower than with F1. I don't know why. My card has Samsung memory, so I don't need the F2 BIOS. I'll stick with F1; as the old saying goes, "if it ain't broke, don't fix it."


----------



## Forceman

Quote:


> Originally Posted by *philhalo66*
> 
> How do you even tell if it's the F1 bios?


It says in the ID string:

GPU Device Id: 0x10DE 0x1B81
Version: 86.04.1E.00.68
GV-N1070G1 GAMING-8GD/*F1*/0408


----------



## philhalo66

Quote:


> Originally Posted by *Forceman*
> 
> It says in the ID string:
> 
> GPU Device Id: 0x10DE 0x1B81
> Version: 86.04.1E.00.68
> GV-N1070G1 GAMING-8GD/*F1*/0408


Ohh, I didn't know what those numbers meant. And it wasn't worth killing my card, so I asked Gigabyte.

Also, here's something interesting: on the stock BIOS I was able to do 2160MHz on the core and be stable in all games, but with the one Gigabyte gave me it maxes out at 2081MHz; any more than that and it artifacts and crashes. Any ideas?


----------



## allsop

Quote:


> Originally Posted by *gtbtk*
> 
> the sliders on gpu tweak and afterburner adjust offset values, not actual values, that it sends to the driver to adjust card settings. What they display are just designed to make the program user friendly


That was a dumb question haha sorry. Thank you though


----------



## allsop

Again, the card I have is the ASUS Dual 1070. I haven't touched the voltage slider in GPU Tweak II at all. I keep reading that I can slide it over to max, since the voltage is locked past 1.09V so it won't ever cause damage. The highest I've seen it go is 1.063. I maxed the voltage slider and the OSD showed 1.088V and I almost pooped. But I'm still being told that won't damage anything at all, as long as my temps stay below 80.

When I look online for the max voltage for 1070s, I see both 1.063 and 1.093, so I'm confused. Those numbers were for the EVGA version, and I have ASUS; are those numbers the same across all 1070s? I've had difficulty reaching +150 core, so if maxing out the voltage is normal for 1070 overclocks, I'll feel pretty stupid.


----------



## philhalo66

Quote:


> Originally Posted by *allsop*
> 
> Again, the card I have is the ASUS dual 1070. I haven't touched the voltage slider at all on GPU tweak 2. I keep reading that I can slide it over to max, since the voltage is locked past 1.09 so it won't ever cause damage. The highest I've seen it go is 1.063. I maxed the voltage slider and the OSD showed 1.088v and I almost pooped. But I'm still being told that won't damage anything at all, as long as my temps stay below 80.
> 
> When I look online for max voltage for 1070s it shows both 1.063 and 1.093, so I'm confused? Those numbers were for the EVGA version, and I have asus. But are those numbers the same across all 1070s? I've had difficulties reaching +150 core, so if maxing the voltage out for 1070s is normal for overclocks, I'll feel pretty stupid


My understanding is that 1.093V is the max Nvidia set for the Pascal chip, and 1.063V is the max for some partner cards. When it comes down to it, your card is a pretty good quality 1070, so you should be able to max out the slider and not have a worry in the world. Overclock away, friend.


----------



## allsop

So having it set to 1.093v is fine? My ASIC quality is 61% if that matters. For the partner cards that are limited to 1.063, wouldn't they be limited in the vbios so you wouldn't ever see a reading higher than that in monitor programs?


----------



## gtbtk

Quote:


> Originally Posted by *philhalo66*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> take a look in the techpowerup vga bios data base. you will be able to download it there along with nvflash to manually flash the card
> 
> 
> 
> I actually got the F1 BIOS from Gigabyte i sent an email last night and this morning they sent it to me so im back up 100%. Techpowerup doesn't list if its F1 or not.

You need to keep track of the Nvidia BIOS version; it's listed in GPU-Z.


----------



## gtbtk

Quote:


> Originally Posted by *allsop*
> 
> So having it set to 1.093v is fine? My ASIC quality is 61% if that matters. For the partner cards that are limited to 1.063, wouldn't they be limited in the vbios so you wouldn't ever see a reading higher than that in monitor programs?


There is no way to tell ASIC quality on Pascal. You must have a very old version of GPU-Z that is trying to read the chip as Maxwell; the number is inaccurate.

You can only access 1.093V if you use an overclocking utility; stock for all cards is 1.063V as far as I am aware, and I'm not aware of any cards where you cannot increase the voltage. 1.093V is fine, but it will increase temps over stock voltage, so you may find clocks start high but drop off quicker as the temps rise. The key is finding the best balance between temps, frequency, and voltage to maximize framerates.


----------



## gtbtk

Quote:


> Originally Posted by *philhalo66*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Avendor*
> 
> I guess F1 it's outdated. Which memory type do you have? get latest BIOS for Gigabyte GTX 1070 G1, either one will serve your needs Micron or Samsung memory: http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios
> To check your version, go into Xtreme Gaming Engine (third-party software) - Setting (top right corner) - VGA
> Also the BIOS Version Code can be found in GPU-Z
> Here are mines:
> 
> 
> 
> 
> 
> 
> 
> the latest one is a mess for me, That's why i wanted the F1. With F2 i got stuck in 3d clocks so it ran hot all the time and i didn't like that And for some reason the performance was a solid 5-7% lower than with the F1 i don't know why either. My card has samsung memory so i don't need the F2 bios i'll stick to the F1 like the old saying says "If it aint broke don't fix it"

You are running Samsung memory; unless the new BIOS changes something like the max power draw or default fan curves, the core part is fine. There are no bugs that affect performance.


----------



## philhalo66

Quote:


> Originally Posted by *gtbtk*
> 
> You are running Samsung memory, unless the new bios changes something like the max power draw or default fan curves, the core part is fine, there are no bugs that effect performance


Clearly something didn't like my system. Either way F1 is much better for me.


----------



## comagnum

Hey gang!

Long-time AMD/ATI user here. I'm getting my MSI Gamer X 1070 in the mail this week, and I was wondering if you guys could offer any tips/tricks/info on overclocking. I'm sure it's similar to AMD cards; I use Afterburner for all of my OCing needs. I'm more curious about expected OC ranges, voltages, etc. From the confirmed owners table I saw that a typical OC on the Gamer X is between 1800MHz and 2100MHz. Hell, stock will be a vast improvement over my RX 480, but who the hell wants to run anything at stock?


----------



## asdkj1740

Quote:


> Originally Posted by *comagnum*
> 
> Hey gang!
> 
> Long time AMD/ATI user here. Getting my MSI Gamer X 1070 in the mail this week and I was wondering if you guys could offer up any tips/tricks/info on overclocking. I'm sure it's similar to amd cards, I do use AfterBurner for all of my ocing needs. More curious about expected oc ranges, voltages, etc. From the confirmed owners table I saw that a typical OC on the Gamer X is between 1800mhz and 2100mhz. Hell, stock will be a vast improvement over my rx 480, but who the hell wants to run anything stock?


Without decent cooling the card can't hold 2100 all the time, and 2000 vs 2100 won't make any significant difference in performance, so leaving the OC to GPU Boost 3.0 is actually fine.


----------



## allsop

Thanks for the help to all, appreciate it!


----------



## MEC-777

Quote:


> Originally Posted by *comagnum*
> 
> Hey gang!
> 
> Long time AMD/ATI user here. Getting my MSI Gamer X 1070 in the mail this week and I was wondering if you guys could offer up any tips/tricks/info on overclocking. I'm sure it's similar to amd cards, I do use AfterBurner for all of my ocing needs. More curious about expected oc ranges, voltages, etc. From the confirmed owners table I saw that a typical OC on the Gamer X is between 1800mhz and 2100mhz. Hell, stock will be a vast improvement over my rx 480, but who the hell wants to run anything stock?


Overclocking Pascal is not like overclocking AMD. You can do a quick and dirty OC, but GPU Boost 3.0 still controls everything for the most part, aside from what you do with the sliders. If you really want to fine-tune Pascal, it will take time and a lot of testing.

GPU Boost 3.0 overclocks all 10-series cards like crazy out of the box, without even touching anything. My FE 1070 at stock boosts to 1911 and settles in the low-to-mid 1800s, even though it's supposed to have a base/boost of 1503/1607. Overclocked, I can get it to peak at 2050 and settle in the mid 1900s. The lower you can keep the temps, the higher the sustained boost clocks will be. And that's just the tip of the iceberg, lol.

Memory overclocking is very simple though. Most can get +450-500 stable. If your card has Micron memory, make sure it has the latest BIOS version. If not, update it.


----------



## gtbtk

Quote:


> Originally Posted by *comagnum*
> 
> Hey gang!
> 
> Long time AMD/ATI user here. Getting my MSI Gamer X 1070 in the mail this week and I was wondering if you guys could offer up any tips/tricks/info on overclocking. I'm sure it's similar to amd cards, I do use AfterBurner for all of my ocing needs. More curious about expected oc ranges, voltages, etc. From the confirmed owners table I saw that a typical OC on the Gamer X is between 1800mhz and 2100mhz. Hell, stock will be a vast improvement over my rx 480, but who the hell wants to run anything stock?


If you overclock, use a custom fan curve: 100% at 60C on the curve will keep temps below 60C under load. Running the fans fixed at 100% is audible but not that loud, so it's an option for extended use as well, particularly when benchmarking.

Check the BIOS version of your card with GPU-Z when you get it. If the BIOS is 86.04.26.00.3E, you need to apply the latest BIOS update to take it to 86.04.50.00.XX, because the .26 BIOS has the Micron bug that will cause memory artifacts and crash your PC if you overclock the memory more than about +400. The 86.04.1E.41 BIOS is fine, as is the .50 BIOS.

Memory should be fine to overclock to +500 or so reliably. You may get +600 or more if you are lucky, but check both levels and see which one performs better.

You will have the traditional slider for core frequency, which will probably work up to about +100 or so; that should get you to about 2050MHz or maybe a bit more. You also have the option of tweaking the new-for-Pascal voltage curve to fine-tune performance. The curve will give you the really high frequencies, although that does not always give you the best FPS compared to only increasing the slider at 1.093V. Read back through this thread for many, many discussions. I don't think anyone runs at 1800MHz unless they are running at 90C, which the Gaming X does not do even with the stock fan settings.

Pascal will reduce clocks in 13MHz steps, and voltages along with them, as temps increase, starting at 43C. That is normal, a function of GPU Boost 3.0, and the reason I suggested the custom fan curve.

Default max voltage is 1.063V; if you increase the voltage slider to 100%, you can get up to 1.093V.
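To make that 13MHz stepping concrete, here's a rough Python sketch of how the sustained boost clock falls off with temperature. The 13MHz bin and the 43C starting point are from the post above; the 5C spacing between subsequent steps is an illustrative assumption, since the exact thresholds vary per card.

```python
# Rough model of GPU Boost 3.0's temperature-based clock reduction on Pascal.
# The 13 MHz bin size and the 43 C first step match the description above;
# the 5 C spacing between later steps is an assumption (it varies per card).

STEP_MHZ = 13        # Pascal moves clocks in 13 MHz bins
FIRST_STEP_C = 43    # first thermal step-down reportedly begins here
STEP_SPACING_C = 5   # assumed spacing of subsequent steps

def boost_clock(peak_mhz, temp_c):
    """Estimate the sustained boost clock at a given GPU core temperature."""
    if temp_c < FIRST_STEP_C:
        return peak_mhz
    steps = 1 + (temp_c - FIRST_STEP_C) // STEP_SPACING_C
    return peak_mhz - steps * STEP_MHZ

for t in (40, 43, 55, 70):
    print(f"{t} C -> {boost_clock(2050, t)} MHz")
```

This is why a more aggressive fan curve keeps the card nearer its peak bin: every few degrees saved is another 13MHz step it never takes.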


----------



## philhalo66

I wonder if a custom BIOS will be able to disable the temp throttle until a reasonable temp like 70C+. It's kind of stupid to throttle at 43C when it's pretty widely accepted that under 80C you're fine.


----------



## ucode

Quote:


> Originally Posted by *philhalo66*
> 
> I wonder if a custom bios will be able to disable the temp throttle till a reasonable temp like 70C+ It's kind of stupid to throttle at 43C when it's pretty widely accepted that under 80C your fine.


A driver mod would probably work; however, IMHO the point is that Pascal is heat sensitive. Perhaps the method of control is a little too generalized (understandably) to work perfectly.


----------



## Clocknut

Quote:


> Originally Posted by *MEC-777*
> 
> Overclocking Pascal is not like overclocking AMD. You can do a quick and dirty OC, but GPUboost 3.0 still controls everything for the most part, aside from what you do with the sliders. If you really want to fine-tune pascal it will take time and a lot of testing.
> 
> GPUboost 3.0 overclocks all 10 series cards like crazy out of the box, without even touching anything. My FE 1070 at stock boost to 1911 and settles in the low-mid 1800's even though it's supposed to have a base/boost of 1503/1607. Overclocked I can get it to peak at 2050 and settle in the mid 1900's. The lower you can keep the temps, the higher sustained boost clocks will be. And that's just the tip of the iceberg, lol.
> 
> Memory overclocking is very simple though. Most can get +450-500 stable. If your card has micron memory, make sure it has the latest BIOS version. If not, updated it.


I'm kind of wondering: as the 16nm process matures, will GPU Boost 3.0 on later-manufactured 1070s automatically boost to higher clocks?


----------



## GeneO

Quote:


> Originally Posted by *philhalo66*
> 
> I wonder if a custom bios will be able to disable the temp throttle till a reasonable temp like 70C+ It's kind of stupid to throttle at 43C when it's pretty widely accepted that under 80C your fine.


With my card, I can overclock it so that it will run stably if I keep the temps low and the fans at 100%. If I let it run with a less aggressive fan curve and let the temps rise (but still below 65), even though the chip lowers the frequency (at the same voltage), it becomes unstable. So everything is not fine, even under 70. It seems very temperature sensitive to me.


----------



## gtbtk

Quote:


> Originally Posted by *philhalo66*
> 
> I wonder if a custom bios will be able to disable the temp throttle till a reasonable temp like 70C+ It's kind of stupid to throttle at 43C when it's pretty widely accepted that under 80C your fine.


It's not throttling; you are still 300MHz or more above what has been advertised as the boost clock. It's power management. The card will overboost, as it were (think of it as emergency war power), for short periods of time before it settles down to a stable clock rate as per the settings of the voltage curve. The clock speed you end up with is still an overclock.


----------



## comagnum

Quote:


> Originally Posted by *gtbtk*
> 
> If you overclock, use a custom fan curve. 100% at 60 deg on the curve will keep temps below 60 deg under load. Running the fans fixed at 100% is audible but not that loud so it is an option for extended use as well particularly if benchmarking
> 
> Check the bios version of your card with GPU-Z when you get the card. If the bios is 86.04.26.00.3E you need to apply the latest bios update to take it to 86.04.50.00.XX because the .26 bios has the micron bug that will cause memory artifacts and crash your PC if you overclock the memory more than about +400. The 86.04.1E.41 bios is fine as is the .50 bios.
> 
> memory should be fine to overclock to to +500 or so reliably. You may get +600 or more is you are lucky but check both levels and see which one performs better.
> 
> You will have the traditional slider for core frequency that will probably work up to about +100 or so, that will probably get you to about 2050mhz or maybe a bit more. You will also have the option of tweaking the new for pascal voltage curve to fine tune performance. the curve will give you the really high frequencies although that does not always give you the best FPS performance if you only increase the slider at 1.093V. Read back through this thread for many many discussions. I dont think anyone runs at 1800Mhz unles they are running at 90 deg which the Gaming X does not do, even with the stock fans settings.
> 
> Pascal will reduce clocks in 13Mhz steps and voltages as temp increases, starting at 43 deg. That is normal and a function of GPU Boost 3.0 and the reason why I suggested the Custom Fan curve.
> 
> Default max voltage is 1.063. if you increase the voltage slider to 100% you can get up to 1.093V


Wow, thanks for all of the useful info! I'll definitely look through the thread and get a good idea of everything before I dive in. Again, much appreciated.


----------



## Arturo.Zise

Definitely recommend adjusting the fan curve on these cards.

I flashed my Gainward Phoenix with the Golden Sample BIOS, which boosted my default clocks and power target. I was happy to see the card run at over 2GHz, but started to notice it would drop into the mid 1900s after an hour. I checked my Afterburner stats and saw the fan speeds never going above 40%, with temps in the high 60s. I created a custom fan profile to get the fans spinning up earlier, and now have them hitting 50% at 60C. My clocks are now rock solid above 2GHz at all times and my temps never go above 60C. I can hardly hear any noise difference either.

Very happy I read this thread
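A custom fan profile like this is just a handful of (temperature, fan %) points with linear interpolation between them, which is roughly how Afterburner and GPU Tweak apply a curve. Here's a minimal Python sketch; the 60C/50% point mirrors the profile described above, while the other points are made-up examples.

```python
# A fan curve is a set of (temperature C, fan %) points; the utility
# linearly interpolates between them. The (60, 50) point mirrors the
# profile above; the remaining points are illustrative.

CURVE = [(30, 20), (50, 35), (60, 50), (70, 80), (80, 100)]

def fan_percent(temp_c):
    """Fan duty cycle for a given temperature, clamped to the curve ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

print(fan_percent(60))  # 50.0
print(fan_percent(65))  # 65.0
```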


----------



## Duvar

I'm running @ ~1900MHz and +400MHz memclock with 0.85V, getting ~52°C at 2000 RPM (EVGA 1070 FTW); max RPM is ~3250.
The goal is to stop it from dropping the boost clock. Maybe I can go a little lower with the fan speed, but I'm pleased so far.
You won't lose much FPS if you use the sweet spot instead of the max reachable clock. I think the best voltage is 0.85V, or 0.9V max.
If you can get 1900MHz+ with that voltage, you're fine.
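A back-of-the-envelope way to see why a 0.85V sweet spot saves so much power: dynamic power in CMOS scales roughly with V² × f. This Python sketch compares operating points against an assumed stock point of 1.062V at 2050MHz; the numbers are illustrative, not measurements from any specific card.

```python
# Dynamic CMOS power scales roughly with V^2 * f. The reference point of
# 1.062 V @ 2050 MHz is an assumed stock operating point, not measured data.

def relative_power(volts, mhz, ref_volts=1.062, ref_mhz=2050):
    """Power draw relative to the reference V/f point (P ~ V^2 * f)."""
    return (volts / ref_volts) ** 2 * (mhz / ref_mhz)

for v, f in [(1.062, 2050), (0.90, 1900), (0.85, 1900)]:
    print(f"{v:.3f} V @ {f} MHz -> {relative_power(v, f):.2f}x reference power")
```

Under this rough model, 0.85V at 1900MHz draws only about 60% of the reference power while giving up about 7% of the clock, which lines up with the low temps and fan speeds reported above.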


----------



## AngryLobster

Wow, that's an impressive card. Of the three 1070s I've had, none would go lower than 0.91-0.93V at ~1932MHz. I did it to reduce noise though, and found around 1900 to be the sweet spot, as anything more required too much voltage and the performance gains were pretty worthless all the way up to 2050MHz. They only required 950-1050 RPM for 57-63C on the MSI Gaming X model (4K).


----------



## EvilWiffles

Just wanted to say that temps aren't the reason Pascal throttles so much.
I run an MSI GTX 1070 Sea Hawk X, and my GPU would throttle before the temps ever touched 40C.


----------



## gtbtk

Quote:


> Originally Posted by *EvilWiffles*
> 
> Just wanted to say that temps isn't the reason Pascal throttles so much.
> I run MSI GTX 1070 Sea Hawk X, my GPU would throttle before my temps ever touch 40c.


The Sea Hawk X BIOS has lower power limits than the Gaming X cards. If you hit the power limit, it will also reduce clocks, regardless of temperature, to keep the card within its operating envelope.

Sea Hawk X power limits are 190W, up to 200W with the 105% power-limit slider, whereas the Gaming X or Z cards' BIOS power limits are 230W-291W with a 126% power-limit slider. I have never managed to get my Gaming X card with an MSI BIOS installed to pull more than about 106% before it too starts to reduce clocks, even though I can set the limit to 126%. I haven't worked out why it does that yet.

The Sea Hawk has that limit because of its single 8-pin power connector and 4+1 VRM, as opposed to the Gaming/Quicksilver custom boards' 8+6-pin power with an 8+2-phase VRM.

Given that performance between the Sea Hawk and the Gaming is fairly comparable in benchmarks, it's my belief that the 8+6-pin power and 8-phase VRM are more marketing fluff than substance. The Twin Frozr cooler is a great air solution though.
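The slider math is simple: the percentage scales the BIOS base power target. A quick Python sketch using the figures quoted above; note the Gaming X's quoted 291W ceiling suggests its real base target is a touch above 230W, so treat these as approximations.

```python
# Max board power = base power target (W) * power-limit slider (%).
# Base/slider figures are the ones quoted above for the two MSI cards.

CARDS = {
    "Sea Hawk X": (190, 105),  # base watts, max slider %
    "Gaming X": (230, 126),
}

for name, (base_w, max_pct) in CARDS.items():
    max_w = base_w * max_pct / 100
    print(f"{name}: {base_w} W base, ~{max_w:.0f} W at {max_pct}%")
```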


----------



## gtbtk

3DMark benchmark is 85% off at the steam Winter sale.

Cheap way to pick up the full suite of benchmarks and avoid having to sit through the demos

http://store.steampowered.com/sub/114165/


----------



## rfarmer

Quote:


> Originally Posted by *gtbtk*
> 
> 3DMark benchmark is 85% off at the steam Winter sale.
> 
> Cheap way to pick up the full suite of benchmarks and avoid having to sit through the demos
> 
> http://store.steampowered.com/sub/114165/


Damn that is a nice deal, I paid $20 for it.


----------



## EvilWiffles

Yeah, I thought the power limit might be an issue, but I can hit 2101MHz on the core pretty easily; I can go higher than that, but I just stuck to 2101.
I've even tested the stock GPU Boost clocks, and it'll still throttle about the same as my OC does.


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> 3DMark benchmark is 85% off at the steam Winter sale.
> 
> Cheap way to pick up the full suite of benchmarks and avoid having to sit through the demos
> 
> http://store.steampowered.com/sub/114165/


Already have it; I only need the Time Spy upgrade. It also includes the 3DMark stress test, which is important for testing stability after overclocking, IMO.

Btw, I'm torn between buying ROTTR and Dying Light. Both games need a lot of horsepower.


----------



## Bratislav

Zotac FE @ 2139/9600MHz(2114/9400MHz for 24/7), i5-3570K @ 5.2GHz(5GHz for 24/7), EK CPU & GPU water block:


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> 3DMark benchmark is 85% off at the steam Winter sale.
> 
> Cheap way to pick up the full suite of benchmarks and avoid having to sit through the demos
> 
> http://store.steampowered.com/sub/114165/
> 
> 
> 
> Already have only need the Timespy upgrade. it's also included the 3dmark Stress Test, which is important for test stability after overclocking imo.
> 
> Btw I'm struggle between buying ROTR or Dying Light. Both game its a horse power.

I bought my copy at the sale last year. Mine came with time spy and it didn't cost me anything extra.


----------



## MEC-777

Quote:


> Originally Posted by *zipper17*
> 
> Already have only need the Timespy upgrade. it's also included the 3dmark Stress Test, which is important for test stability after overclocking imo.
> 
> Btw I'm struggle between buying ROTR or Dying Light. Both game its a horse power.


ROTTR is a much better game than Dying Light, IMO.


----------



## philhalo66

I agree ROTTR was amazing. Worth every penny IMO and it looks insanely good.


----------



## stulda

Quote:


> Originally Posted by *gtbtk*
> 
> 3DMark benchmark is 85% off at the steam Winter sale.
> 
> Cheap way to pick up the full suite of benchmarks and avoid having to sit through the demos
> 
> http://store.steampowered.com/sub/114165/


Great find! Thanks


----------



## rfarmer

Quote:


> Originally Posted by *philhalo66*
> 
> I agree ROTTR was amazing. Worth every penny IMO and it looks insanely good.


I even pre-ordered it. It's the first time I ever pre-ordered a game; I always swore I never would. But I absolutely wanted to play this game day 1, and it was great.


----------



## allsop

I use the GPU tweak 2 that came with my asus dual 1070 card, and I use precision xoc as well for the OSD.

I was able to get the core clock to stay above 2000MHz by adding +150, but once in a while a quick green flash artifact happens. I'm told that is related to a memory overclock, but my memory OC is only +400, which seems a bit low since most people seem to get +500-600 easily. I haven't maxed the voltage for continuous gameplay yet, just tested it quickly; it's set to +90. When I maxed it out, the Precision OSD showed 1.097V, which is over the safe, locked overclocking voltage of 1.093V, so I stopped and went back to the previous voltage. I go over the max power limit a lot; the limit is 112%, but the OSD shows nearly 130% most of the time.

Is it normal for readings to list voltage above 1.093v? I thought that number was locked as max. I am using the asus dual 1070oc. And as I said I still get those green monitor flashes, but I'm not sure if putting the voltage on max and having it above the 1.093v safe limit to stabilize is still safe to do.

I cannot find any way to monitor the cards VRAM or anything similar so I don't know whether it could be overheating or not. The temp for the card itself never reaches 70°. I tend to keep my gpu fans at nearly maximum rpm, around 90% fan usage. Could the two gpu fans spinning that fast for long gaming sessions take more than normal voltage amounts? I also have my noctua NH D15 cpu fans at maximum as well along with my one case fan.

Could it be motherboard related issue with the mobo volts? (click for the one I have) I never plugged the noctua NH D15 in the PWM header, but I can still control it and make a curve for it in my BIOS settings.

Could I just leave everything as it is and just deal with the occasional green flicker or will it cause damage in the long run if not addressed?


----------



## philhalo66

Quote:


> Originally Posted by *rfarmer*
> 
> I even pre ordered it. First time I ever pre ordered a game, always swore I never would. But I absolutely wanted to play this game day 1 and it was great.


I didn't bother getting it till after I got my 1070; I didn't think my 580 could run it, and I was probably right. It was a fantastic game. I'm hyped for the next one.


----------



## gtbtk

Quote:


> Originally Posted by *stulda*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> 3DMark benchmark is 85% off at the steam Winter sale.
> 
> Cheap way to pick up the full suite of benchmarks and avoid having to sit through the demos
> 
> http://store.steampowered.com/sub/114165/
> 
> 
> 
> Great find! Thanks

you are welcome


----------



## Clocknut

Quote:


> Originally Posted by *gtbtk*
> 
> if Ryzen is indeed a 95W chip, overclocked would end up pulling maybe 140W. Zotac Mini is specced at 170W maximum power draw so you are looking at a maximum power draw of about 480W. The 650 should work but you may have to deal with fan noise and it will not be running at peak efficency


Quote:


> Originally Posted by *MrTOOSHORT*
> 
> I'd say no problem even overclocks on the 1070s too.


Quote:


> Originally Posted by *MEC-777*
> 
> Should be fine. All that being said, no, Clocknut shouldn't be on the limit of a 650w PSU (especially a Seasonic gold unit) with that setup. It will have about 100w of headroom +/- which is totally fine.


Cool. I wasn't thinking of SLI right away, but I just wanted to keep my options open for the future.

I was looking for a 1060 6GB; most of them sit around $250, but the Zotac 1070 Mini retailed at about $330. That price is way too close to pass up.

And thanks to everyone who replied (can't quote you all).


----------



## gtbtk

Quote:


> Originally Posted by *allsop*
> 
> I use the GPU tweak 2 that came with my asus dual 1070 card, and I use precision xoc as well for the OSD.
> 
> I was able to get the core clock to stay above 2000mhz by adding +150, but once in a while that green flash artifact happens quickly. I'm told that is related to a memory overclock, but my memory clock oc is only +400, which seems a bit low since most people seem to get to +500-600 easily. I haven't maxed the voltage yet for continuous gameplay, just tested it quick, it's set to +90. When I maxed it out, precision osd showed 1.097v which is over the safe and locked voltage for overclocking at 1.093v so I stopped and went back to the previous voltage. I go over the max power limit allot, limit is 112% but the OSD shows nearly 130% most of the time
> 
> Is it normal for readings to list voltage above 1.093v? I thought that number was locked as max. I am using the asus dual 1070oc. And as I said I still get those green monitor flashes, but I'm not sure if putting the voltage on max and having it above the 1.093v safe limit to stabilize is still safe to do.
> 
> I cannot find any way to monitor the cards VRAM or anything similar so I don't know whether it could be overheating or not. The temp for the card itself never reaches 70°. I tend to keep my gpu fans at nearly maximum rpm, around 90% fan usage. Could the two gpu fans spinning that fast for long gaming sessions take more than normal voltage amounts? I also have my noctua NH D15 cpu fans at maximum as well along with my one case fan.
> 
> Could it be motherboard related issue with the mobo volts? (click for the one I have) I never plugged the noctua NH D15 in the PWM header, but I can still control it and make a curve for it in my BIOS settings.
> 
> Could I just leave everything as it is and just deal with the occasional green flicker or will it cause damage in the long run if not addressed?


In my opinion, MSI afterburner is superior to both GPU Tweak and Precision XOC. It is more flexible to fine tune and monitor your card than either of the other two.

You can use HWInfo64 to monitor just about everything in your PC including everything that is monitor-able out of your Graphics card. It will even integrate into the Afterburner OSD to let you monitor things on screen that the OC utilities do not cover like how many watts your card is drawing. There is no way to monitor VRAM temps on Pascal cards.

I suspect that the Precision 1.097V voltage report and the 130% power limit report are rounding/programming errors in Precision; you can confirm that by also running HWiNFO.

If your card has Micron Memory with bios version number 86.04.26.00.xx (look in GPU-Z or system information in the Nvidia control panel) there is a bios update available from ASUS (86.04.50.00.xx) that you need to install to fix a bug with the memory controller. If you have Micron memory/old bios, the bios update usually improves memory overclock performance by about +100 or so and that may solve your green flashing problem.

The GPU fans running at 90% should not be causing you any major issues although they will contribute a couple of watts to your GPUs total power usage. Your +150 /+400 OC is the reason that you are hitting the power limits of the card. You could maybe try using the Voltage curve instead of the slider and only overclock the high end of the curve. That should reduce the total power the card is drawing under load. try increasing only the voltage points at 1.043, 1.050, 1.063, 1.075, 1.081 and 1.093 by +150, leave the rest at default and see how it goes for you.

Your Noctua NH-D15 must be running through a PWM fan controller that is in turn connected to the motherboard if you can set a fan curve and control it through the BIOS. The Noctua fans are not going to affect the performance of your GPU.
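The curve tweak described above — bumping only the high-voltage points and leaving the rest at stock — can be sketched in a few lines. All the clock values below are made up for illustration; real GPU Boost 3.0 curves live in Afterburner, not in a dict.

```python
# Sketch of the "only overclock the high end of the curve" idea above.
# The curve values are hypothetical, not read from a real card.

STOCK_CURVE = {  # voltage (V) -> boost clock (MHz), illustrative numbers
    0.900: 1822, 0.950: 1873, 1.000: 1911, 1.043: 1936,
    1.050: 1949, 1.063: 1961, 1.075: 1974, 1.081: 1987, 1.093: 2000,
}

def tweak_top_points(curve, offset_mhz=150, min_voltage=1.043):
    """Return a new curve with `offset_mhz` added only at points >= min_voltage."""
    return {v: f + (offset_mhz if v >= min_voltage else 0)
            for v, f in curve.items()}

tweaked = tweak_top_points(STOCK_CURVE)
print(tweaked[1.093])  # top point raised by the offset
print(tweaked[0.900])  # low point left at stock
```

Because the low-voltage points stay at stock, the card never asks for extra clocks in the power-hungry part of the curve, which is why this tends to keep you under the power limit.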


----------



## rfarmer

https://unigine.com/en/products/benchmarks/superposition

New Unigine benchmarking tool, delayed until Q1 2017 but looks very promising.


----------



## RyanRazer

Silent PC

Hey, a bit off-topic, but still:

I'm in the middle of getting a new PC case. I also fried my new ATX mobo, so I had to go back to my old mATX board and threw it into the old case for a day or two until the new case arrives. Since it's a temporary setup I only hooked up my SSD and left the HDDs out. The HDDs were the only noise source, so the PC is dead silent now. It's unbelievable how noticeably "loud" HDDs can get when they are the only source of sound, especially at night.

The GPU of course has a fan-stop feature (like every GPU today), the CPU cooler has the same feature (great little CPU cooler for 30€, great performance, though not for OCing of course), and lastly a semi-passive PSU.

Mad props to that *Corsair RM750x PSU*! Couldn't recommend it enough. I had an XFX TS PRO 650W; man, that thing was loud, like a vacuum cleaner, roughly the same dB as my old Sapphire R9 290 Tri-X, and we know how hot and loud those are. I got rid of it and got this one. Believe it or not, neither the GTX 1070 nor the 250-300W R9 290 could get its fan to spin. It just never spins. At first I thought it was broken; then I noticed it spins a few times when the PC turns on.

Anyway, having a silent PC is awesome. It almost doesn't feel right, like something is missing. Like having a smartphone with a giant screen.
But it will be gone soon when the new case with fans arrives... I was about to get one more SSD and get rid of the HDDs altogether for a silent PC, but I don't know how loud the case fans will be and whether it makes sense. We'll see.

Oh, and merry Christmas everyone! Happy holidays!


----------



## gtbtk

Quote:



> Originally Posted by *rfarmer*
> 
> https://unigine.com/en/products/benchmarks/superposition
> 
> New Unigine benchmarking tool, delayed until Q1 2017 but looks very promising.


Have they actually announced a delay? The website still says end of 2016.


----------



## rfarmer

Quote:


> Originally Posted by *gtbtk*
> 
> Have they actually announced a delay? Web site still says End of 2016


https://www.overclock3d.net/news/software/unigine_delays_their_superposition_benchmark_until_q1_2017/1

I saw it there.


----------



## gtbtk

Quote:


> Originally Posted by *rfarmer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Have they actually announced a delay? Web site still says End of 2016
> 
> 
> 
> https://www.overclock3d.net/news/software/unigine_delays_their_superposition_benchmark_until_q1_2017/1
> 
> I saw it there.
Click to expand...

Thanks


----------



## icold

This guy grabbed a 2202 MHz OC on his 1070 AMP Extreme!

Has anyone here got 2.2 GHz?


----------



## gtbtk

If you only increase the 1.093 V point on the curve to 2200 and leave the rest at stock (or with a slight underclock), that is not that difficult to obtain for many cards.

It doesn't give you huge performance gains, though.


----------



## defyoddz

what are the best non reference 1070s for using stock, and possibly under water in future?


----------



## ucode

Quote:


> Originally Posted by *Duvar*
> 
> Goal ist to stop it from dropping the boostclock


Here's one after messing with the curve points and running Fire Strike where Pascal clocks actually increase with temperature.



Lol, not really recommended though for obvious reasons.


----------



## Bratislav

Quote:


> Originally Posted by *defyoddz*
> 
> what are the best non reference 1070s for using stock, and possibly under water in future?


MSI Gaming X - top air cooler, 291W power limit, cheap EK FC block.


----------



## Chuck Nourrish

MSI GTX 1070 Quick Silver.


----------



## gtbtk

Quote:


> Originally Posted by *Chuck Nourrish*
> 
> 
> 
> MSI GTX 1070 Quick Silver.


Nice, the silver Gaming X. Should be a good card.


----------



## RyanRazer

Quote:


> Originally Posted by *Chuck Nourrish*
> 
> 
> 
> MSI GTX 1070 Quick Silver.


sexy


----------



## RyanRazer

deleted, repost


----------



## V3n0m15

My 1070 in all her glory. Going to block the GPU eventually.

Sent from my XT1585 using Tapatalk


----------



## rfarmer

Quote:


> Originally Posted by *V3n0m15*
> 
> My 1070 in all her glory. Going to block the GPU eventually.
> 
> Sent from my XT1585 using Tapatalk


Looks good.

Yeah, throw a block on that; Pascal runs extremely cool under water. I get 42C max with my FE, and that is after extended gaming, 38C if I am running benchmarks.


----------



## V3n0m15

Quote:


> Originally Posted by *rfarmer*
> 
> Looks good.
> 
> Yeah, throw a block on that; Pascal runs extremely cool under water. I get 42C max with my FE, and that is after extended gaming, 38C if I am running benchmarks.

Awesome, yeah, I'm really looking forward to it. Getting a 240 and a 360 rad with five black Noctuas, so it should run pretty cool.

Sent from my XT1585 using Tapatalk


----------



## saunupe1911

Quote:


> Originally Posted by *V3n0m15*
> 
> My 1070 in all her glory. Going to block the GPU eventually.
> 
> Sent from my XT1585 using Tapatalk


Sick!!! Please post when you do; a set of instructions would be great. I would love to put a water block on my Strix, but I'm nervous about doing it.


----------



## Gorhell

I bought my GTX 1070 yesterday; it's a ZOTAC GTX 1070 AMP! Extreme Edition. I'm satisfied so far — temps never reach 70C.


----------



## shadowrain

Quote:


> Originally Posted by *Gorhell*
> 
> I bought my GTX 1070 yesterday it's a ZOTAC GTX 1070 AMP! Extreme Edition. I'm satisfied right now temps never go up to 70C.


Samsung or Micron VRAM? You can check with GPU-Z. And welcome to the club.


----------



## icold

How do I do that?


----------



## Dude970

Use GPU-Z


----------



## icold

I was talking about the 2202 MHz OC. My memory is Micron; it only goes to 4363 MHz. How do I get a fixed voltage of 1.093 V?


----------



## Dude970

Pull up the curve in Afterburner and click on the voltage you want, I believe L will lock it.


----------



## icold

Quote:


> Originally Posted by *Dude970*
> 
> Pull up the curve in Afterburner and click on the voltage you want, I believe L will lock it.


what curve?


----------



## Dude970

http://www.guru3d.com/files-details/msi-afterburner-beta-download.html


Spoiler: Warning: Spoiler!



You may press L after selecting any point on the curve with the mouse cursor to disable dynamic GPU voltage/frequency adjustment and lock the voltage and core clock frequency to the state defined by the target point. This feature allows you to test graphics card stability independently for each voltage/frequency point of the curve using real 3D applications or any stress test of your choice. In addition to the stability-testing scenario, MSI Afterburner allows you to save a curve with a locked point to a profile, so you may easily switch between dynamic voltage/frequency management and fixed voltage/frequency settings in real time (e.g. to achieve maximum performance during benchmarking). Please note that fixed voltage and frequency settings do not allow you to disable power and thermal throttling.


----------



## GeneO

Quote:


> Originally Posted by *icold*
> 
> what curve?


Ctrl+F in AB will bring up the curve. Selecting a point and pressing L will lock it. Sort of; Boost may still lower it.

But really, just moving the voltage and power sliders to their max should give you 1.093 V if the conditions are right (mainly temp).

BTW, I got weird behavior after locking a point and saving it in a profile. I ended up locked at base voltage and frequency and couldn't undo it. I had a backup of my profiles I had to restore.


----------



## icold

Thanks, I found it.


----------



## Gorhell

Quote:


> Originally Posted by *shadowrain*
> 
> Pre, Samsung or Micron VRAM? Can check with GPU-Z. And welcome to the club.


It's Samsung VRAM. I overclocked it by +100 MHz.


----------



## shadowrain

Quote:


> Originally Posted by *Gorhell*
> 
> It's a Samsung VRAM. I overclocked to 100Mhz


Six months have passed and Zotac still has Samsung VRAM on 1070s in the PH. Noice. I got mine last July and am still amazed at the power of this thing.

Also, +100 MHz on the GPU clock is good, but your VRAM can go way more than +100 MHz. Way more. Find your limit. Hint: mine is +777 MHz in EVGA Precision, or half that number in GPU-Z.
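The offset bookkeeping above trips up a lot of people: Precision/Afterburner express GDDR5 offsets in the double-data-rate domain, while GPU-Z reports the command clock, so a +777 offset shows up as roughly +388 in GPU-Z. A quick sketch of the arithmetic — the 2002 MHz stock clock is the typical GTX 1070 value, and all numbers are illustrative:

```python
# Convert an EVGA Precision / Afterburner GDDR5 memory offset to the
# clock GPU-Z would report, and to the effective (quad-pumped) data rate.
# Assumes the typical GTX 1070 stock command clock of 2002 MHz.

STOCK_GPUZ_CLOCK_MHZ = 2002  # what GPU-Z shows at stock (8008 MHz effective)

def gpuz_clock(precision_offset_mhz):
    """Precision offsets are in the DDR domain: half shows up in GPU-Z."""
    return STOCK_GPUZ_CLOCK_MHZ + precision_offset_mhz / 2

def effective_rate(precision_offset_mhz):
    """GDDR5 transfers four bits per command-clock cycle."""
    return gpuz_clock(precision_offset_mhz) * 4

print(gpuz_clock(777))      # 2390.5 — what GPU-Z would show for +777
print(effective_rate(777))  # 9562.0 — effective MHz
```

This is why a "+777" brag and a "4390 MHz in GPU-Z" screenshot can describe the same overclock.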


----------



## Gorhell

@shadowrain

Thanks, I'll OC more if I feel I need the power. Right now I'm satisfied, and you are right — with my temps I think I can reach more. My highest so far is 65C in benchmarks and 67C if I play for more than two hours.


----------



## icold

With Samsung VRAM you can go much higher, around +700, but with Micron only about half that.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> I talked about OC 2202MHZ. My memory is micron, go only 4363mhz. How do I get fixed voltage at 1,093mv?


Have you updated the vBIOS? You should be running an 86.04.50.00.xx BIOS on the card to fix a bug with the memory controller.

You can set the voltage you want by opening the curve window, selecting the point you want, and typing the letter L.


----------



## gtbtk

Quote:


> Originally Posted by *shadowrain*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gorhell*
> 
> It's a Samsung VRAM. I overclocked to 100Mhz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 6 months have passed and Zotac still has Samsung 1070 vrams in the PH. Noice. I got mine last July and still amazed at the power of this thing.
> 
> Also, 100mhz on GPU clock is good, but your VRAM speeds can go way more than 100mhz. Way more. Find your limit. Hint, mine is +777mhz in evga precision or half of that number in gpu z.

@Gorhell your memory OC should be able to get up to about +500 or maybe more.

@shadowrain They are alternating between Samsung and Micron in different production runs. Having said that, your card still has a 1E BIOS on it, which they have been using since the start of manufacturing, so the card may have been warehoused in the PH for the last 6 months.


----------



## icold

Quote:


> Originally Posted by *gtbtk*
> 
> Have you updated the vBios? You should be running an 86.04.50.00.xx bios on the card to fix a bug with the memory controller
> 
> You an set the voltage that you want by opening the curve window, slecting the point that you want and typing the letter L


Yes, I am using the latest vBIOS, and my memory won't go stable past +360.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Have you updated the vBios? You should be running an 86.04.50.00.xx bios on the card to fix a bug with the memory controller
> 
> You an set the voltage that you want by opening the curve window, slecting the point that you want and typing the letter L
> 
> 
> 
> Yes, i have used this latest vbios, and my memory dont up more stable than +360

I have found that GPU frequency and memory clocks tend to trade off against each other; the key is to find the right compromise between the two. I have also found that 2100 MHz does not always give you the best performance.

Have you tried leaving the core at stock and seeing how far you can push the memory?


----------



## Mad Pistol

I was beginning to have my doubts about Nvidia and their future. However, today I got a new Dell 27" 1440P GSYNC monitor.

My doubts are officially gone. I have NEVER seen such a smooth and responsive gaming experience as this 144hz GSYNC monitor. It's like constant Vsync-like picture without the input latency. It's stupidly smooth!

I wish I could take a video of it, but it won't do it justice. It's just something that you have to experience to believe... it makes a massive difference in First Person Shooters and action games.

Lots of people say Gsync and Freesync are gimmicks. I'm here to tell you that this is NOT a gimmick. The difference is real, and it's AMAZING!!!


----------



## Gorhell

Hi All.

I'm trying to OC my Zotac GTX 1070 AMP! Extreme Edition, but I'm getting quick black-screen flashes and flashing green dots, and my 3DMark Firestrike run stopped. My temps never reach 70C.

Here's my result:
http://www.3dmark.com/fs/11235107

My GPU-Z:


My OC settings:


----------



## icold

Quote:


> Originally Posted by *gtbtk*
> 
> I have found that GPU frequency and memory clocks tend to trade off between each other, the key is to find the right compromised between the two. I have also found that 2100Mhz does not always give you the best performance
> 
> Have you tried leaving the core at stock and seeing how far you can push the memory?


Yes, it doesn't go stable past +360 (4363 MHz), and I mean stable for gaming; for benchmarks I can push around +400 or +450, but it freezes after some time. On the core clock I get 2126 MHz max game-stable, but after discovering the curve overclock I'm trying to get more core clock.


----------



## shadowrain

Quote:


> Originally Posted by *Gorhell*
> 
> Hi All.
> 
> I'm trying to OC my Zotac GTX 1070 AMP! Extreme edition but I'm getting black screen flashing quickly and green dots flashing and my 3DMark Firestrike stopped. My temps never reach 70C.
> 
> Here's my result:
> http://www.3dmark.com/fs/11235107
> 
> My GPU-Z:
> 
> 
> My OC settings:


Yeah, for some reason I never got stable OCs using Afterburner, but I got stable OCs in Zotac FireStorm, and my max OC of +100 core / +777 VRAM with EVGA Precision on my AMP Extreme. I don't have Firestrike, but I can confirm this is Heaven- and gaming-stable, even on PCIe 2.0.


----------



## zipper17

Quote:


> Originally Posted by *Mad Pistol*
> 
> I was beginning to have my doubts about Nvidia and their future. However, today I got a new Dell 27" 1440P GSYNC monitor.
> 
> My doubts are officially gone. I have NEVER seen such a smooth and responsive gaming experience as this 144hz GSYNC monitor. It's like constant Vsync-like picture without the input latency. It's stupidly smooth!
> 
> I wish I could take a video of it, but it won't do it justice. It's just something that you have to experience to believe... it makes a massive difference in First Person Shooters and action games.
> 
> Lots of people say Gsync and Freesync are gimmicks. I'm here to tell you that this is NOT a gimmick. The difference is real, and it's AMAZING!!!


Yeah, that's basically a vsync function at heart.

Vsync is very important in gaming, after all; screen tearing has plagued PC gaming since the old days.

Glad they implemented a new kind of vsync called G-Sync/FreeSync, so vsync is much more perfected: screen tearing and input lag are totally gone, producing perfectly butter-smooth motion on screen.

I don't have a G-Sync/FreeSync monitor, but I always use at least Adaptive Vsync in all games.

Vsync will also reduce your GPU's power and heat. For example, if your refresh rate is 60 Hz, vsync will cap your framerate at 60 FPS, so wasted frames are gone: GPU power and heat are reduced, plus screen tearing and input lag are eliminated.

When you rotate the camera quickly in games, you'll notice whether there's screen tearing or micro-stuttering.

It's also important to make sure your PC can actually hold a perfect 60 FPS at 60 Hz, 120 FPS at 120 Hz, or 144 FPS at 144 Hz.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I have found that GPU frequency and memory clocks tend to trade off between each other, the key is to find the right compromised between the two. I have also found that 2100Mhz does not always give you the best performance
> 
> Have you tried leaving the core at stock and seeing how far you can push the memory?
> 
> 
> 
> Yes, it dont go stable more than 360+(4363mhz) and i tell stable for gaming, for bechmark i can push around 400+ ou 450+ but freeze after some time. To my core clock i get 2126mhz max game stable, but after discovery "The curve" overclock, i´m trying get more core clock.

Time Spy and Rise of the Tomb Raider memory OC is limited to about +400 to +450 for me. I have found slight improvements at higher overclocks by increasing the VCCIO and PCH voltages a bit, but I am running on a Z68 board; that may or may not be relevant on later boards, but you may want to give it a try. 2126 is a pretty good OC to start with.


----------



## DeathAngel74

GPU Boost 3.0 is weird. I can play SW:BF 2015 for hours and the core will stay at 2101 the whole time. TW3 and BM:AK are a different story....2050-2025 on and off. This is completely normal though correct? I was spoiled with my 970's and being able to mod the bios so Boost 2.0 was off.
I apologize if this has already been answered.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> GPU Boost 3.0 is weird. I can play SW:BF 2015 for hours and the core will stay at 2101 the whole time. TW3 and BM:AK are a different story....2050-2025 on and off. This is completely normal though correct? I was spoiled with my 970's and being able to mod the bios so Boost 2.0 was off.
> I apologize if this has already been answered.


Boost 3.0 rewards temperature stability. Different applications will stress the GPU in different ways, changing temps and memory loads. You will probably find that the load and temps you were getting were higher in the second and third titles.

Try increasing case and GPU fan speeds for the slower-boosting titles and see if that makes a difference for you.


----------



## dallemon

Hi. 

So I got an ASUS STRIX 1070 OC recently (with Samsung memory), watercooled it immediately, and feel that I have hit a power wall. I can get the core to 2253 in most tests (I still need experience with the curve to limit throttling), but when I hit approx 120% power the card throttles. I tried crossflashing an MSI Gaming Z BIOS, and the increased power limit let me run at 2253 with no throttling, but the memory became very unstable; any attempt at a memory OC and it would artifact. :-(

Is there any other BIOS that would be good to try crossflashing, or am I stuck until proper BIOS modding arrives (or I do the shunt mod)?
Other than this issue I am really happy with my card, as I came from a BIOS-modded ASUS 780 DCII (which also hit a power limit).


----------



## Snuckie7

So I've finally got to overclocking my 1070 Gaming X, using the curve. The behavior of GPU Boost 3.0 is extremely confusing to me though.

Stock clocks and voltage on my card are 1974 MHz / 1.050 V. When I apply a small overclock to 2050 MHz, the voltage actually drops to 1.031 V to sustain a clock of 2038 MHz. I never specified 2038 MHz anywhere on my frequency curve, though. If I want to force higher frequencies, why does the voltage automatically drop? Shouldn't it be the other way around? Temperatures are fine too; it doesn't exceed 55C during testing.

Also, adding to the voltage slider has no effect. The voltage never increases past 1.031 V despite changing the slider.


----------



## gtbtk

Quote:


> Originally Posted by *dallemon*
> 
> Hi.
> 
> So I got a ASUS STRIX 1070 OC recently (with Samsung memory), watercooled it instantly and feel that I have hit a powerwall. I can get the core to 2253 in most tests (still need experience with the curve to limit throttling) but when i hit approx 120% power the card will throttle. I tried crossflashing an MSI GAMING Z BIOS and the increased power limit let me run at 2253 with no throttling, but the memory became very unstable, any attempt at memory OC and it would artifact. :-(
> 
> Is there any other BIOS that could be good to try and crossflash or am I stuck until proper BIOS modding arrives (or I do the shunt mod)?
> Other than this issue I am really happy with my card as I came from a BIOS modded ASUS 780 DCII (which also hit a powerlimit).


What version of the MSI BIOS did you flash? The current version for Samsung memory cards is the 1E BIOS.


----------



## gtbtk

Quote:


> Originally Posted by *Snuckie7*
> 
> So I've finally got to overclocking my 1070 Gaming X, using the curve. The behavior of GPU Boost 3.0 is extremely confusing to me though.
> 
> Stock clocks and voltage on my card are 1974 MHz/1.050V. When I apply a small overclock to 2050 MHz, the voltage actually drops to 1.031V to sustain a clock of 2038 MHz. I never specified 2038 MHz on anywhere on my frequency curve though. If I want to force higher frequencies, why does the voltage automatically drop? Shouldn't it be the other way around? Temperatures are fine too, it doesn't exceed 55C during testing.
> 
> Also, adding to the voltage slider has no effect. Voltage never increases past 1.031 despite changing the slider.


Are you running Afterburner 4.3?

What settings have you enabled in the general tab/compatability properties section? Voltage control has never been an issue for me.

I have enabled:

Hardware Control and monitoring

Enable Low Level I/O driver

Restore Settings after suspend

Unlock Voltage control (Standard MSI)

Unlock Voltage monitoring


----------



## icold

Quote:


> Originally Posted by *dallemon*
> 
> Hi.
> 
> So I got a ASUS STRIX 1070 OC recently (with Samsung memory), watercooled it instantly and feel that I have hit a powerwall. I can get the core to 2253 in most tests (still need experience with the curve to limit throttling) but when i hit approx 120% power the card will throttle. I tried crossflashing an MSI GAMING Z BIOS and the increased power limit let me run at 2253 with no throttling, but the memory became very unstable, any attempt at memory OC and it would artifact. :-(
> 
> Is there any other BIOS that could be good to try and crossflash or am I stuck until proper BIOS modding arrives (or I do the shunt mod)?
> Other than this issue I am really happy with my card as I came from a BIOS modded ASUS 780 DCII (which also hit a powerlimit).


Micron memory is ****ty; you're lucky :thumb:. I had a GTX 780 DCII with a skyn3t BIOS @ 1241 MHz.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Quote:
> 
> 
> 
> Originally Posted by *dallemon*
> 
> Hi.
> 
> So I got a ASUS STRIX 1070 OC recently (with Samsung memory), watercooled it instantly and feel that I have hit a powerwall. I can get the core to 2253 in most tests (still need experience with the curve to limit throttling) but when i hit approx 120% power the card will throttle. I tried crossflashing an MSI GAMING Z BIOS and the increased power limit let me run at 2253 with no throttling, but the memory became very unstable, any attempt at memory OC and it would artifact. :-(
> 
> Is there any other BIOS that could be good to try and crossflash or am I stuck until proper BIOS modding arrives (or I do the shunt mod)?
> Other than this issue I am really happy with my card as I came from a BIOS modded ASUS 780 DCII (which also hit a powerlimit).
> 
> 
> 
> Micron memory is a ****ty, you have luck:thumb:. I had a GTX 780 DCII with a skynet bios @1241mhz.

There is nothing wrong with Micron memory. The firmware code from Nvidia that programmed the on-GPU memory controller in the initial BIOS release was ****ty.


----------



## JackCY

Samsung may be stable in some reviews and bragging benchmark results up to 9.4 GHz, but when you properly test it with games you're often down to 9.0-9.2 GHz. And when you run an application that uses most of the GPU at full load for many hours, it's even lower, more like 9.0 GHz — and that doesn't include the fact that the live memory clocks are lower in this state, around 8.6 GHz. If you look at AMD cards, which do report memory errors, they get no errors up to about that same 8.6 GHz mark. But sure, run crazy memory clocks if the errors don't bother you in encoded or played video, random freezes, etc.

AMD limited the chips to 9.0 GHz in the drivers, or at least did for some time, and while that is not the highest you can risk for a benchmark, it is a sensible max for general use of the Samsung memory chips.

I wouldn't recommend going above 9.0 GHz for gaming with Samsung, which in my experience is also stable for extended use by other applications that are far more sensitive to memory errors, though NV forces the clocks a little down for those automatically.

Micron chips... they are probably glad they even run 8.0 GHz stable. I wouldn't touch that crap again, nor Elpida, Hynix, etc.


----------



## Snuckie7

Quote:


> Originally Posted by *gtbtk*
> 
> Are you running Afterburner 4.3?
> 
> What settings have you enabled in the general tab/compatability properties section? Voltage control has never been an issue for me.
> 
> I have enabled:
> 
> Hardware Control and monitoring
> Enable Low Level I/O driver
> Restore Settings after suspend
> Unlock Voltage control (Standard MSI)
> Unlock Voltage monitoring


Yep, I have all those settings enabled; I even tried it with extended MSI voltage control.

I don't think I'm going to bother with the custom frequency curve. The overall offset worked much better for me, and I could hold 2114 MHz @ 1.081 V.


----------



## zipper17

Quote:


> Originally Posted by *DeathAngel74*
> 
> GPU Boost 3.0 is weird. I can play SW:BF 2015 for hours and the core will stay at 2101 the whole time. TW3 and BM:AK are a different story....2050-2025 on and off. This is completely normal though correct? I was spoiled with my 970's and being able to mod the bios so Boost 2.0 was off.
> I apologize if this has already been answered.


TW3 is very demanding imho; it easily puts your GPU at 99-100% load.
I play at 1440p max settings with HairWorks on, and my GPU is at 99-100% most of the time; the temperature rises in no time, which makes the core clock throttle.
Every game/stress test has a different load.


----------



## gtbtk

MSI 1070 Gaming X/Z/8G owners,

Have any of you, in a high-load scenario such as Firestrike Ultra, ever been able to make your card draw more than 105-106% of the card's power limit before it starts to downclock as though it has hit the upper power limit?

I have just been testing in FS Ultra and 105% is the maximum I can get my card to draw. If anyone can get higher, what settings in afterburner or nvidia control panel were you using?


----------



## Derek1387

Any good walkthroughs on overclocking these cards? I just picked up an Asus STRIX and am gaming at 4K and feel the need to OC the card a bit.


----------



## gtbtk

Quote:


> Originally Posted by *Derek1387*
> 
> Any good walkthroughs on overclocking these cards? I just picked up an Asus STRIX and am gaming at 4K and feel the need to OC the card a bit.


Use Afterburner 4.3. Time Spy is a good test, as it seems to match games like Tomb Raider more closely; Firestrike and Heaven will let you run higher settings that tend to crash in games.

Set max voltage.

Set max power and temp limit.

Start with the core clock at +75 and increase in +25 steps. See how high you can get before it crashes, then step it back a little after you find the limit.

Start with memory at +400 and increase in increments as you test until you start seeing artifacts, then step the memory slider back a bit.
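The step-up procedure above is just a linear search for the highest stable offset. A sketch with a hypothetical stability check standing in for an actual benchmark pass — the "true limit" constant is invented so the example is self-contained:

```python
# Sketch of the "+25 steps until it crashes, then back off" procedure above.
# `is_stable` is a stand-in; a real version would apply the offset in
# Afterburner and run a Time Spy / game pass, returning False on a crash.

TRUE_CORE_LIMIT = 180  # pretend the card crashes above +180 MHz

def is_stable(offset_mhz):
    return offset_mhz <= TRUE_CORE_LIMIT

def find_max_offset(start=75, step=25, backoff=25, check=is_stable):
    """Raise the offset in `step` increments until `check` fails,
    then step back by `backoff` from the last passing value."""
    offset = start
    if not check(offset):
        return 0  # not even the starting offset is stable
    while check(offset + step):
        offset += step
    return max(0, offset - backoff)

print(find_max_offset())  # 150: last pass was +175, backed off by 25
```

The back-off step matters: an offset that survives one benchmark run is not necessarily stable over hours of gaming, so you bank a margin below the last passing value.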


----------



## WarbossChoppa

Quote:


> Originally Posted by *Cakewalk_S*
> 
> Definitely interested to see how this 1070 is going to pump out the frames. Seems like quite a hefty performance jump from a GTX970 with little to no increase in power consumption...
> 
> Looks like this card will easily handle today's games upto 1440p with no problem at all. Not quite there yet for 4k but getting there, that's more of the 1080's business.


I have seen up to 120% before, but that was the max, and only for a split second.


----------



## gtbtk

Quote:


> Originally Posted by *Snuckie7*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Are you running Afterburner 4.3?
> 
> What settings have you enabled in the general tab/compatability properties section? Voltage control has never been an issue for me.
> 
> I have enabled:
> 
> Hardware Control and monitoring
> Enable Low Level I/O driver
> Restore Settings after suspend
> Unlock Voltage control (Standard MSI)
> Unlock Voltage monitoring
> 
> 
> 
> Yep I have all those settings enabled, even tried it with Extended MSI voltage control.
> 
> I don't think I'm going to bother with the custom frequency curve. Overall offset worked much better for me and I could hold 2114MHz @ 1.081V.

1.081 V is the increased voltage; you will probably see occasional bumps to 1.093 V if you don't tweak the curve. The top default voltage is 1.061 V.


----------



## dallemon

Quote:


> Originally Posted by *gtbtk*
> 
> What version of MSI bios did you flash? the current version for Samsung memory cards is the 1E version of bios


86.04.1E.00.6C is the version I tried first, then I tried 86.04.50.00.29 and it was no better (both for Gaming Z)
I'll try a different 1E MSI BIOS when I return from work.
Quote:


> Originally Posted by *icold*
> 
> Micron memory is a ****ty, you have luck:thumb:. I had a GTX 780 DCII with a skynet bios @1241mhz.


My old 780 got up to 1360 with a modded skyn3t BIOS


----------



## Avendor

Quote:


> Originally Posted by *Mad Pistol*
> 
> I was beginning to have my doubts about Nvidia and their future. However, today I got a new Dell 27" 1440P GSYNC monitor.
> 
> My doubts are officially gone. I have NEVER seen such a smooth and responsive gaming experience as this 144hz GSYNC monitor. It's like constant Vsync-like picture without the input latency. It's stupidly smooth!
> 
> I wish I could take a video of it, but it won't do it justice. It's just something that you have to experience to believe... it makes a massive difference in First Person Shooters and action games.
> 
> Lots of people say Gsync and Freesync are gimmicks. I'm here to tell you that this is NOT a gimmick. The difference is real, and it's AMAZING!!!


I would like to know, how's with Fast-Sync for you instead of using G-Sync? can you test with Fast-Sync please and then switch it back to G-Sync? I am interested to know if you can tell the differences. From my knowledge Fast-Sync eliminate frame tearing, there's minor input lag (it delivers latency that is very close to VSYNC off, I assume G-Sync does not have input lag at all?) Fast-Sync it's the most viable option for Pascal without spending lots of money in order to have G-Sync. I'm using Fast-Sync from the first day I've bought my GTX 1070 and it's really smooth experiences from my point of view. Is G-Sync truly worth it?



LE: It's kinda weird, but NVIDIA recommends enabling both, yet that's not possible: in NVCP you have to choose either G-Sync or Fast Sync. I am not sure how you would enable both at the same time...
http://nvidia.custhelp.com/app/answers/detail/a_id/4161/related/1


----------



## Mad Pistol

Quote:


> Originally Posted by *Avendor*
> 
> I would like to know, how's with Fast-Sync for you instead of using G-Sync? can you test with Fast-Sync please and then switch it back to G-Sync? I am interested to know if you can tell the differences. From my knowledge Fast-Sync eliminate frame tearing, there's minor input lag (it delivers latency that is very close to VSYNC off, I assume G-Sync does not have input lag at all?) Fast-Sync it's the most viable option for Pascal without spending lots of money in order to have G-Sync. I'm using Fast-Sync from the first day I've bought my GTX 1070 and it's really smooth experiences from my point of view. Is G-Sync truly worth it?
> 
> 
> 
> LE: It's kinda weird but NVIDIA recommends it both to be enabled, but that's not possible you have to choose in NVCP either is G-Sync or Fast-Sync. I am not sure...
> http://nvidia.custhelp.com/app/answers/detail/a_id/4161/related/1


This is an interesting question. I did a few tests to see what the difference is. I played Titanfall 2, mainly because it gets around 60 FPS maxed out @ 1440P (sometimes above and sometimes below 60) and because it does not support SLI, so my impressions won't be biased by the "feel" of running only one card.

I used these different scenarios.


60hz w/ Fast Sync
144hz w/ Fast Sync
144hz w/ G-Sync

For 60hz w/ Fast Sync, it wasn't that smooth. It was cool that there was no tearing in the image, but you could still see visible judder from dropped frames any time the frame rate went above or below 60 FPS. However, the game was still very responsive and I could not detect any lag between input and what I was seeing on screen. If you've only got a 60hz screen and a single GPU, this is a decent choice.

For 144hz w/ Fast Sync, it was much smoother than 60hz. Again, no tearing in the image, and the gameplay experience was great. However, I did notice dropped frames as I was moving. In fast-moving gameplay you won't notice this, but if you're just walking around in a Titan, it is there. That said, it isn't annoying. If you can afford a 144hz monitor but don't want to pay the premium for G-Sync, Fast Sync @ 144hz is an excellent option that will satisfy probably 95% of gamers out there.

For 144hz w/ G-Sync, this is hands down the best experience. No judder. No dropped frames. Just completely smooth gameplay. The other advantage of G-Sync is that it DOES support SLI setups, while Fast Sync does not work with SLI at this time. It's difficult to describe G-Sync to someone who has never experienced it (myself included up until about 2 days ago), but it and Freesync are literally game-changing tech. Being able to see a completely smooth streaming image with no tearing and no input lag is an amazing experience.

My takeaway is that if you're a gamer, you owe it to yourself to at least get a 144hz screen. They are so much better than 60hz screens when it comes to smoothness in gaming. If you can go the little extra and get a G-Sync monitor, do it. It's incredible!


----------



## Avendor

Many thanks for the prompt reply. Certainly in the near future I will buy a 1440p monitor with G-Sync. For the time being I am pleased with Fast Sync, though.


----------



## Mad Pistol

Quote:


> Originally Posted by *Avendor*
> 
> Many thanks for the prompt reply
> 
> 
> 
> 
> 
> 
> 
> Certainly in the near future will buy a 1440p with G-Sync. For the time being I am pleased with Fast-Sync though


Definitely pick it up when you can. G-Sync is definitely not a gimmick. It's an incredible piece of tech that I wish had existed years ago.

Because 4k @ 144hz does not exist (yet), 2560x1440 @ 144hz is the holy grail of gaming experiences. Even a good CRT could not hit this combination of resolution and refresh rate.


----------



## CoD511

Quote:


> Originally Posted by *Mad Pistol*
> 
> Definitely pick it up when you can. G-Sync is definitely not a gimmick. It's an incredible piece of tech that I wish had existed years ago.
> 
> Because 4k @ 144hz does not exist (yet), 2560x1440 @ 144hz is the holy grail of gaming experiences. A good CRT could not hit these resolutions/refresh rates.


The physical tech certainly is helping in a significant way, in terms of allowing very tight timing control over every pixel and the dynamically managed driving (or overdriving) of them. I'm not sure 240Hz would be possible on those recent TNs without the chip itself.

The only thing annoying me about my monitor is the matte finish, which looks terrible compared to the glossy one I had previously.

Otherwise, 1440p at 144Hz (or 165Hz, which I confirmed wasn't skipping frames) is quite luxurious with G-Sync.


----------



## Gorhell

Hello guys,

Question: I can OC my GTX 1070 up to +100 on the core clock and +700 on the memory clock, but at that level it gives me artifacts. My OC now is +50 core and +500 memory and I can play games fine; if I go any higher I get artifacts.
My temps are nowhere near 70C; it stays at 67C or below. I wonder where these artifacts are coming from? My core voltage is at 100% in MSI AB.


----------



## R432

Got really bad Msi gtx 1070 gaming x, only goes 2000 core and 8500 memory (micron+stock voltage) even with bios update... meh.

Should i return and try for better?


----------



## Mad Pistol

Quote:


> Originally Posted by *R432*
> 
> Got really bad Msi gtx 1070 gaming x, only goes 2000 core and 8500 memory (micron+stock voltage) even with bios update... meh.
> 
> Should i return and try for better?


2000 core before boost, or 2000 core after boost?


----------



## dallemon

Just a quick update, I have now tried all 1E MSI BIOS and all with the same result, better core OC, but anything on the memory and I get corruption and crashes. :-( I may just have to do the shunt mod as I feel this card can perform much better than it does. Or I may just enjoy it for a while and cross my fingers in hope of some BIOS modding tool support.


----------



## R432

Quote:


> Originally Posted by *Mad Pistol*
> 
> 2000 core before boost, or 2000 core after boost?


It boosts to ~1930mhz at factory settings. I can put +80mhz on the core and boost settles to 2000mhz after a while in 3DMark Time Spy and Fire Strike; +100mhz resulted in a crash.

Memory goes +250mhz for a total of 8500mhz; +275mhz resulted in a crash.


----------



## Mad Pistol

Quote:


> Originally Posted by *R432*
> 
> It boosts to ~1930mhz at factory settings, i can put +80mhz to core and boost settles to 2000mhz after while in 3dmark timespy and firestrike, +100mhz resulted crash
> 
> Memory goes +250mhz total of 8500mhz, +275mhz resulted crash


Yeah, that's not great. You could try to exchange it for another one to see if you get a better-clocking chip. Not sure about the memory, though.


----------



## asdkj1740

Happy new year! I hope the Pascal BIOS tweaker will arrive very soon!


----------



## asdkj1740

Quote:


> Originally Posted by *dallemon*
> 
> Just a quick update, I have now tried all 1E MSI BIOS and all with the same result, better core OC, but anything on the memory and I get corruption and crashes. :-( I may just have to do the shunt mod as I feel this card can perform much better than it does. Or I may just enjoy it for a while and cross my fingers in hope of some BIOS modding tool support.


Don't do the shunt mod... try a higher-power-limit BIOS instead, like the Zotac AMP Extreme 300W BIOS.


----------



## dukeReinhardt

I got an ASUS 1070 Strix (non-OC). I'm pretty happy with it, though the stock boost (1873mhz) isn't mindblowing.

I have two questions, if someone wouldn't mind helping me out.

1. *With the Micron BIOS updates, does anyone know how the AIB partners managed to make the vRAM more stable? Was it an increase in power, or perhaps a lowering of the memory timings? Or was it actually a bug in the BIOS which was fixed?*

2. At temps near-ish 50 degrees, the fans rev up and down every second non-stop, regardless of the program. Power settings are at optimal, and the card does idle properly, but sometimes after a gaming session, the card stays around that temp for a while, especially if I continue to browse or watch videos. Sometimes this actually happens for extended periods in less stressful games as well.

My PC is near silent and I have open headphones, so the fans constantly revving up and down is actually quite loud and distracting, especially during music or movies. But it's just as concerning that the fans switching on and off continuously is obviously going to wear them down and break the motors prematurely.

*Is this typical behaviour, and is there a way to increase the temperature hysteresis or increase the minimum duration of fans being on or something like that, without using overclocking software? Or is this a common issue that can only be fixed with a custom fan profile?*


----------



## dallemon

Too drunk to give a proper response, but I'll try the Zotac BIOS tomorrow and hope for the best.


----------



## SavageBrat

Well... just got my new build done. Here is my Asus Strix 1070 non-OC: at stock volts with the power target raised to +112, I got +180 on boost and +640 on memory.



----------



## LogicusMPS

Here are my two scores, with different drivers.

17613 - http://www.3dmark.com/3dm/16625781

17604 - http://www.3dmark.com/3dm/16626017


----------



## Mache

Is anyone having any stuttering/GPU usage problems?

I had 670 SLI before with no problems at all.

Cheers


----------



## gtbtk

Quote:


> Originally Posted by *R432*
> 
> Got really bad Msi gtx 1070 gaming x, only goes 2000 core and 8500 memory (micron+stock voltage) even with bios update... meh.
> 
> Should i return and try for better?


You are never going to get the great results you've heard about by using the core clock slider on Pascal cards. You need to use the curve to maximize your performance (Ctrl-F in AB).

If the BIOS you have is version 86.04.26.00.2E, you need to do the BIOS update to take it up to the 86.04.50.00.2A version. That should improve the memory by resolving the memory controller bug and get you close to a +500 memory OC setting.

With +100 voltage, try starting with the curve at the default settings, setting the 1.093V curve point to 2088-2100 and the 0.975V point to 2012-2025, and see how that works for you.


----------



## gtbtk

Quote:


> Originally Posted by *dukeReinhardt*
> 
> I got an ASUS 1070 Strix (non-OC). I'm pretty happy with it, though the stock boost (1873mhz) isn't mindblowing.
> 
> I have two questions, if someone wouldn't mind helping me out.
> 
> 1. *With the Micron BIOS updates, does anyone know how the AIB partners managed to make the vRAM more stable? Was it an increase in power, or perhaps a lowering of the memory timings? Or was it actually a bug in the BIOS which was fixed?*
> 
> 2. At temps near-ish 50 degrees, the fans rev up and down every second non-stop, regardless of the program. Power settings are at optimal, and the card does idle properly, but sometimes after a gaming session, the card stays around that temp for a while, especially if I continue to browse or watch videos. Sometimes this actually happens for extended periods in less stressful games as well.
> 
> My PC is near silent and I have open headphones, so the fans constantly revving up and down is actually quite loud distracting, especially during music or movies. But it's just as concerning that fans switching on and off continuously is obviously going to wear them down and break the motors prematurely.
> 
> *Is this typical behaviour, and is there a way to increase the temperature hysteresis or increase the minimum duration of fans being on or something like that, without using overclocking software? Or is this a common issue that can only be fixed with a custom fan profile?*


You can get much better performance from the card with an overclock.

1. The AIB partners only rebranded the cards and set clock speeds and power limits for the different models in their ranges. The BIOS file that controls the memory and GPU function is actually created by NVIDIA and supplied to each of the vendors. The BIOS update resolved a bug in the memory controller that did not properly sync the two clocks used by GDDR5 with the Micron memory.

2. Use Afterburner as opposed to GPU Tweak II; it gives you better control. Create a custom fan curve. You can set it to be 0 fan until a particular temp if you want to - most cards are silent until 60 deg. If you still get the fans cycling on and off, set the hysteresis setting for the fan to 4 or 5 deg.
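To illustrate the hysteresis suggestion: here is a minimal Python sketch of the idea (the trigger temperature and polling model are hypothetical, not the actual Afterburner or firmware logic). Once the fans spin up, they keep running until the card has cooled a full hysteresis band below the trigger, which is what stops the every-second on/off cycling near the trigger point.

```python
# Sketch of fan-start hysteresis. Values are illustrative only:
# a 55 C trigger with a 5 C band, polled once per interval.

def fan_should_run(temp_c, fans_on, trigger=55, hysteresis=5):
    """Return True if the fans should be running this polling interval."""
    if fans_on:
        # Already spinning: only stop once well below the trigger.
        return temp_c > trigger - hysteresis
    # Currently stopped: only start once the trigger is actually reached.
    return temp_c >= trigger
```

With no band (hysteresis=0), a card hovering at 54-56 deg flips the fans on and off every interval; with a 5 deg band they stay on until the card has cooled to 50 deg.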


----------



## dukeReinhardt

Quote:


> Originally Posted by *gtbtk*
> 
> You can get much better performance from the card with an overclock.
> 
> 1. The AIB partners only rebranded, set clock speeds and power limits for the different model cards in their ranges. The bios file that controls the memory and GPU function is actually created by NVIDIA and supplied to each of the vendors. The bios resolved a bug with the memory controller that did not sync the 2 clocks used by GDDR5 memory properly with the micron memory.
> 
> 2. Use Afterburner as opposed to GPU Tweak II, It gives you better control. Create a custom fan curve. You can set it to be 0 fan until a particular temp if you want to - most cards are silent until 60 deg. If you still get on/off fans cycling. Set the hysteresis setting for the fan to 4 or 5 deg.


Thanks for getting back to me









1. So to be clear, the fix wasn't something like adding a delay to dynamic clock changes or anything like that, right? There's zero performance decrease from the new BIOS?

2. I'm familiar with Afterburner, but I just didn't want to use it because I'm 99% satisfied with the card's default noise and performance. ASUS made the RIGHT CHOICE putting a 1 ton cooler and 3 90mm fans on this chip. If a custom profile is the only way, then so be it.

Anyway, this is intended behaviour, right? Fans are actually MEANT to rev up and down like that every second if you're at the temperature range where they're meant to turn on? First thing I want to do is make sure I don't need to RMA my brand new card.


----------



## gtbtk

Quote:


> Originally Posted by *dukeReinhardt*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You can get much better performance from the card with an overclock.
> 
> 1. The AIB partners only rebranded, set clock speeds and power limits for the different model cards in their ranges. The bios file that controls the memory and GPU function is actually created by NVIDIA and supplied to each of the vendors. The bios resolved a bug with the memory controller that did not sync the 2 clocks used by GDDR5 memory properly with the micron memory.
> 
> 2. Use Afterburner as opposed to GPU Tweak II, It gives you better control. Create a custom fan curve. You can set it to be 0 fan until a particular temp if you want to - most cards are silent until 60 deg. If you still get on/off fans cycling. Set the hysteresis setting for the fan to 4 or 5 deg.
> 
> 
> 
> Thanks for getting back to me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1. So to be clear, the fix wasn't something like adding delay to dynamic clock changing or anything like that, right? There's 0 performance decrease from the new BIOS?
> 
> 2. I'm familiar with Afterburner, but I just didn't want to use it because I'm 99% satisfied with the card's default noise and performance. ASUS made the RIGHT CHOICE putting a 1 ton cooler and 3 90mm fans on this chip. If a custom profile is the only way, then so be it.
> 
> Anyway, this is intended behaviour, right? Fans are actually MEANT to rev up and down like that every second if you're at the temperature range where they're meant to turn on? First thing I want to do is make sure I don't need to RMA my brand new card.
Click to expand...

1. There is a performance increase from the BIOS update; the Micron memory now works as it was intended to.

2. I have an MSI card and I have not noticed mine doing that at 60 deg, but the MSI fans are very quiet, so it may be happening and I just haven't noticed. If I am ever using the card in anger, I have a curve set or am running 100% fans anyway. I would not think start/stop behavior at 50 deg was an intended design feature; I thought all the cards stayed silent until they hit 60 deg. Possibly the solution is to install an additional case fan to give the card a little airflow while its own fans are not engaged?


----------



## dukeReinhardt

Thanks. Actually, I found videos of the same issue with other Strix 1070s. Apparently the fans rev at or near 55 degrees. I'll probably just install AB and let the card keep its fans off until 60.


----------



## kuelho

Did you guys notice increased temperatures after the BIOS update? I got a Zotac 1070 AMP Edition; it was maxing at 72°C in the Heaven benchmark with a room temp of 28°C. Doing tests after the Micron BIOS fix, it's getting to 80°C... running stock, same voltage, power limit, etc. The score is the same in Heaven, though, so I wonder what else was changed in those BIOSes to increase temps like that.


----------



## philhalo66

Quote:


> Originally Posted by *kuelho*
> 
> Did you guys noticed increased temperatures after bios update? I got a Zotac 1070 amp edition, it was maxing 72°C on Heaven benchmark, room temp 28ºC. So doing tests after the micron bios fix, its getting to 80ºC... running stock, same voltage, power limit etc. Though the score is the same on Heaven, wonder what else was changed on those bios to increase temps like that.


I did on my Gigabyte card. Not as extreme as yours, but mine went up a solid 5-6C with the F2 BIOS. I flashed it back to F1 and temps went back to normal.


----------



## MEC-777

So strangely enough, by manually creating a custom offset curve (see below) in Precision XOC, I was able to see a noticeable gain in synthetic benchmarks (beyond margin of error), like firestrike, valley and heaven. I don't fully understand how this is possible because the offset is the same as if I simply adjust the slider to +150. All points on the curve in the picture below are also +150. If I try +175 at any point, it crashes the benchmark. In-game benchmarks and gaming performance is exactly the same, however.

Everything else is the same; power and temp target maxed with priority set to temp, voltage slider maxed and memory at +500.


----------



## c0nsistent

I'm running Micron memory and anything over +480mhz causes a crash. I've overclocked dozens of cards in the past, and a crash was not typically caused by memory; you would notice via artifacts or lost performance due to errors. Is that how everyone else is overclocking their memory on these cards?

I've also noticed adding 100% voltage in Precision XOC allowed me to add another 13 mhz to the core clock on my EVGA SC card.


----------



## kuelho

Quote:


> Originally Posted by *philhalo66*
> 
> i did on my gigabyte card not as extreme as yours but mine went up a solid 5-6C with the F2 bios. i flashed it back to F1 and temps went back to normal.


Hmm, the problem is... some games, like Fallout 4, lock up on startup in fullscreen. The workaround is to disable PCIe power management in Windows, because on the moderate option it locks up most games in fullscreen. Now I need to figure out what to do: revert and get lower temps, or ignore it and worry about the card's lifespan...


----------



## gtbtk

Quote:


> Originally Posted by *kuelho*
> 
> Did you guys noticed increased temperatures after bios update? I got a Zotac 1070 amp edition, it was maxing 72°C on Heaven benchmark, room temp 28ºC. So doing tests after the micron bios fix, its getting to 80ºC... running stock, same voltage, power limit etc. Though the score is the same on Heaven, wonder what else was changed on those bios to increase temps like that.


It is possible Zotac added something like a modified default fan curve; that is in the area of things they can change in their BIOSes. If the Zotac BIOS has also increased the power limits of the card, you may see higher temps, but you would also likely see better benchmark scores.

Did you ever do a before/after 100% fan on temp comparison?


----------



## gtbtk

Quote:


> Originally Posted by *c0nsistent*
> 
> I'm running Micron memory and anything over +480mhz causes a crash. I've overclocked dozens of cards in the past and a crash was not typically caused by memory. You would notice via artifacts or lost performance due to errors. Is that how everyone else is overclocking their memory on these cards?
> 
> I've also noticed adding 100% voltage in Precision XOC allowed me to add another 13 mhz to the core clock on my EVGA SC card.


Update the vBIOS of your card. You can get it from the EVGA web site; look in the 1070 forum section for download links.

The latest EVGA BIOS update fixes that and modifies the fan curves to help reduce VRM temps a bit.

Extra voltage will increase clock speeds; that is normal and the point of the voltage slider. The curve that you can access in Precision directly maps voltage to frequency.


----------



## gtbtk

Quote:


> Originally Posted by *kuelho*
> 
> Quote:
> 
> 
> 
> Originally Posted by *philhalo66*
> 
> i did on my gigabyte card not as extreme as yours but mine went up a solid 5-6C with the F2 bios. i flashed it back to F1 and temps went back to normal.
> 
> 
> 
> hum, problem is... some games lock up on startup like fallout 4 if on fullscreen. The work around is disable pcie power management on Windows, because on moderate option it lock up most games on fullscreen. Now i need to figure out what to do revert and get lower temps or ignore and worry about lifetime...
Click to expand...

Create a custom fan curve?

The extra temps could only have come from the embellishments Gigabyte added in their version of the update. The core BIOS from Nvidia that is common to all 1070s has not changed temps on other cards.

In fact, you two are the only people I have seen complain about increased temperatures here on OCN.


----------



## gtbtk

Quote:


> Originally Posted by *MEC-777*
> 
> So strangely enough, by manually creating a custom offset curve (see below) in Precision XOC, I was able to see a noticeable gain in synthetic benchmarks (beyond margin of error), like firestrike, valley and heaven. I don't fully understand how this is possible because the offset is the same as if I simply adjust the slider to +150. All points on the curve in the picture below are also +150. If I try +175 at any point, it crashes the benchmark. In-game benchmarks and gaming performance is exactly the same, however.
> 
> Everything else is the same; power and temp target maxed with priority set to temp, voltage slider maxed and memory at +500.


A couple of things from my observations.

1. Not all voltage points can OC by the same offset. Some may go to +200 while others are limited to, say, +150. If you use the slider, the +150 point is as high as you can take the OC before the card shows instability, because above that level the curve keeps a fixed shape: at some stage the card will try to run the +150-limited voltage point at the higher level and crash the driver. The performance you could obtain from +200 at, say, 1.093V is left on the table and can only be accessed with a custom curve.

2. The card uses all the voltage points when it is under a 3D load, but the bulk of the processing work is biased towards the right side of the curve. The measure of performance is effectively the area under the curve, not the rightmost point.

3. High OC levels at the left (low-voltage) end of the curve also draw power against the power target, yet do not turn that power into frames as efficiently as the higher voltage levels do. Leaving the 0.800V end low and letting the higher end of the curve run at higher frequencies gives the card more power budget with which to create frames efficiently.

4. While keeping the curve shown in your screenshot, also try raising the 0.975V point a couple of values. While you are running the +100 voltage setting, that voltage point increases the GPU video clock (you need HWiNFO64 to see that reported value), and that also seems to help in getting higher frame rates.
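The "area under the curve" idea in point 2 can be sketched numerically. The voltage/frequency numbers below are made up for illustration, not measured clocks: the point is only that a custom curve which leaves the low-voltage points alone and pushes the upper points as far as they individually allow can have a higher average clock than a flat slider offset limited by the weakest point.

```python
# Toy model of the V/F curve comparison: curves as (volts, MHz) points.
# All numbers are illustrative, not real GPU Boost 3.0 data.

def avg_clock(curve):
    """Mean frequency across the curve's voltage points - a rough stand-in
    for the 'area under the curve' the card works through under load."""
    return sum(mhz for _, mhz in curve) / len(curve)

# Flat slider offset: every point lifted by the same amount, capped by
# the least-capable point on the curve.
slider = [(0.800, 1950), (0.900, 2000), (0.975, 2025), (1.093, 2050)]

# Custom curve: low-voltage points left low to save power budget,
# upper points pushed individually.
custom = [(0.800, 1900), (0.900, 2000), (0.975, 2063), (1.093, 2100)]

print(avg_clock(slider), avg_clock(custom))
```

Even though the custom curve is lower at 0.800V, its average across the points the card actually cycles through under load comes out higher, which is the behaviour described above.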


----------



## kuelho

Quote:


> Originally Posted by *gtbtk*
> 
> It is possible Zotac added something like a modified default fan curve. That is in the area of things that they can change on their bioses. If the Zotac bios has also increased the power limits of the card as well you may see higher temps but you would also likely see better benchmark scores as well.
> 
> Did you ever do a before/after 100% fan on temp comparison?


Yeah, I did test that; luckily I had the preset of clocks and fans saved. At 100% it doesn't change much - about 2~3°C versus the default 70%. It's definitely not the fan curve; it's the same 1:1 as before, and in Heaven the voltage maxes at 1.063V / ~165W with no change and the same score in benchmarks...

Synthetic tests aside, the problem is that it now hits the stock 83°C temp limit in The Witcher 3; before, this never happened.


----------



## philhalo66

Quote:


> Originally Posted by *kuelho*
> 
> Yeah i did test that, luckly i had the preset from clocks and fans saved. With 100% it doesnt change much about 2~3ºC from default 70%. For sure its not fan curve its the same 1/1 as before, on heaven voltage max at 1.063 ~165w also no change, same score on benchs...
> 
> Leaving synthetic tests, the problem is, it caps the stock 83ºC temp limit on witcher 3, before this never happened.


It might help to reflash the updated BIOS again, just to rule out a bad flash. If that doesn't help, try asking the manufacturer if they know of any issues.


----------



## spddmn24

Quote:


> Originally Posted by *gtbtk*
> 
> MSI 1070 Gaming X/Z/8G owners,
> 
> Have any of you, in a high load scenario such as Firestrike Ultra, ever been able to make your card draw more than 105-106% of the cards power limit before it starts to downclock as though it has hit the upper power limit?
> 
> I have just been testing in FS Ultra and 105% is the maximum I can get my card to draw. If anyone can get higher, what settings in afterburner or nvidia control panel were you using?


Same issue with my Quick Silver OC 1070 (same as the Gaming X). It power throttles around that level despite having the power limit set at max.

http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/6400#post_25689920


----------



## gtbtk

Quote:


> Originally Posted by *spddmn24*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> MSI 1070 Gaming X/Z/8G owners,
> 
> Have any of you, in a high load scenario such as Firestrike Ultra, ever been able to make your card draw more than 105-106% of the cards power limit before it starts to downclock as though it has hit the upper power limit?
> 
> I have just been testing in FS Ultra and 105% is the maximum I can get my card to draw. If anyone can get higher, what settings in afterburner or nvidia control panel were you using?
> 
> 
> 
> Same issue with my quicksilver oc 1070(same as gaming x). It power throttles around that despite having the power limit set at max.
> 
> http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/6400#post_25689920
Click to expand...

Well, they are the same card with different coloured cladding.

I would like to work out what's happening, but I am glad it is not just me, which means it is something in the design of the card's BIOS.

Another interesting thing I am seeing: if the voltage is set to +100%, GPU-Z gives me a rainbow of power-limit reasons that don't seem to make any sense. Have you noticed that?


----------



## MEC-777

Quote:


> Originally Posted by *gtbtk*
> 
> A couple of things from my observations.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 1. Not all voltage points can OC by the same offset. Some may go to +200 others are limited at say +150 for example. If you use the slider, then the +150 point is as high as you can take the OC before the card will show instability. Because above that level, the curve is a fixed shape and at some stage it will try and use that voltage point with the +150 limitation at the higher level and crash the driver. The performance that you can obtain from the +200 at say, 1.093 is left on the table and can only be accessed by using a custom curve.
> 
> 2. the card uses all the voltage points when it is under a 3d load, however the bulk of the processing work is biased towards the right side of the curve. The measure of performance is actually the area under the curve, not the right most point.
> 
> 3. High OC levels at the left end (low voltage end) of the curve also draw power against the power target and yet do not use the power to create frames as efficiently as the higher voltage levels do so leaving the .800 end low and allowing the higher end of the curve to run at higher frequencies gives the card more power budget to efficiently creates graphic frames.
> 
> 4. while keeping the curve that you show in your screenshot, try also increasing the .975v point up a couple of values. That voltage point, while you are running the +100 voltage setting, increases the GPU Video clock (you need HWinfo64 to see that reported value) and that also seems to help getting higher frame rates.



Yeah, when I tried putting any point on that green curve up to +175 (testing one point at a time, leaving the rest at +150 which I know is stable) it would crash the benchmark after a few minutes.

It's just odd because, as I mentioned before, the line I set is no higher than it is when I set the slider to +150, since all the points you see are also +150, and not one of those points can handle +175 without crashing. So if the area under the curve is actually less now (because I only raised the curve in the 0.975 to 1.093V range), how and why are the benchmark scores higher?


----------



## Gurkburk

Any news on custom bios?

I stopped overclocking when I found my stable setting... these cards are so boring when it comes to overclocking.


----------



## MEC-777

Quote:


> Originally Posted by *Gurkburk*
> 
> Any news on custom bios?
> 
> Stopped overclocking when i found my stable setting... These cards are so boring when it comes to overclocking..


They are and they aren't. I think what many of us are forgetting is that GPU Boost 3.0 is already overclocking these cards WAY above stock, right out of the box, without touching anything. Thus, any gains above that aren't seen as anything too significant.


----------



## gtbtk

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> A couple of things from my observations.
> 
> 
> 
> 1. Not all voltage points can OC by the same offset. Some may go to +200 while others are limited at, say, +150. If you use the slider, the +150 point is as high as you can take the OC before the card shows instability, because above that level the curve is a fixed shape and at some stage it will try to use the voltage point with the +150 limitation at a higher level and crash the driver. The performance you could get from the +200 at, say, 1.093V is left on the table and can only be accessed with a custom curve.
> 
> 2. The card uses all the voltage points when it is under a 3D load; however, the bulk of the processing work is biased towards the right side of the curve. The measure of performance is actually the area under the curve, not the rightmost point.
> 
> 3. High OC levels at the left (low-voltage) end of the curve also draw power against the power target, yet do not use that power to create frames as efficiently as the higher voltage levels do. Leaving the 0.800V end low and letting the higher end of the curve run at higher frequencies gives the card more power budget to create frames efficiently.
> 
> 4. While keeping the curve you show in your screenshot, try also increasing the 0.975V point up a couple of values. That voltage point, while you are running the +100 voltage setting, increases the GPU video clock (you need HWiNFO64 to see that reported value), and that also seems to help get higher frame rates.
> 
> Yeah, when I tried putting any point on that green curve up to +175 (testing one point at a time, leaving the rest at +150 which I know is stable) it would crash the benchmark after a few minutes.
> 
> It's just odd because like I mentioned before, this line I set is no higher than it is when I set the slider to +150, since all those points you see are also +150. Not one of those points can handle +175 without crashing. So then, if the area under the curve is actually less now (because I only raised the curve in the .975 to 1.093v range), how and why then are benchmark scores higher?

+175 is probably past what the card can do at that voltage point. You will only find the right number through experimentation, as all cards are different; I can't give you an exact answer. Experimentation is the best way to identify all the quirks your card/PC has, and you may find that +150 is actually near the highest you can go, or that some points stop at +160.
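The per-point limit idea can be sketched in a few lines of Python. The stable limits below are purely hypothetical examples (every card's limits differ, as noted above); the point is only to show why a flat slider offset is capped by the weakest voltage point while a custom curve is not:

```python
# Why a custom curve can beat the flat offset slider: the slider is
# capped by the weakest voltage point, while a custom curve lets each
# point run at its own stable limit.
# Hypothetical per-voltage-point stable offsets in MHz (illustrative only).
stable_limit_mhz = {0.800: 200, 0.900: 175, 0.975: 200, 1.050: 150, 1.093: 200}

# The flat slider must stay at or below the weakest point's limit.
flat_offset = min(stable_limit_mhz.values())
print(flat_offset)  # 150

# Headroom a per-point custom curve claws back across the whole curve.
left_on_table = sum(lim - flat_offset for lim in stable_limit_mhz.values())
print(left_on_table)  # 175 (MHz, summed across the curve's points)
```

With these made-up numbers, the 1.050V point caps the slider at +150 even though three other points could run +200.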

I have found that if you run a 4K benchmark such as Firestrike Ultra, the card draws more power than it does at 1080p and will downclock to stay under the power limit. If the card does not have all the points at +150, but only the ones from 0.975V to 1.093V set higher than default, it will still operate at the higher frequencies but will pull less power overall and not downclock itself as much. I don't know exactly what the GPU Video Clock (controlled at the 0.975V point) actually does, as Nvidia provides next to no documentation that I can find describing it. But I have found that the higher you can set that value, the better the performance I get, even at the expense of the 1.093V frequency value.

My card tends to get the best results with the 0.975V point at 2025 and the 1.093V point at 2088. I can run the card at 2114MHz, but the performance is worse than at 2088. Remember that you are also balancing memory clock against core clock. In the Firestrike benchmarks, most of the score increases tend to come from memory clock increases, not so much from core clock increases.


----------



## dallemon

Just wanted to say that the Zotac BIOS did help improve clock stability on my ASUS card (now I just need to fine-tune clocks at certain voltages). Highest stable now seems to be 2278 :-D (with hardly any throttling in Time Spy test 2; Firestrike Ultra test 1 is another story though)


----------



## gtbtk

Quote:


> Originally Posted by *dallemon*
> 
> Just wanted to say that the Zotac BIOS did help improve clock stability on my ASUS card  (now I just need to finetune clocks at certain voltages) Highest stable now seems to be 2278. :-D (With hardly any throttling in Time Spy test 2, Firestrike Ultra test 1 is another story though)


Just be careful and manage your power draw. Firestrike Ultra pulls more power than any of the other benchmarks in 3DMark.

While the Asus card has a good 8+2-phase VRM, it only has a single 8-pin power connector and an official 200W limit baked into the original BIOS. Officially you can only get 225W in total per the specs (150W from the 8-pin plus 75W from the PCIe slot), though I think the card can actually pull more than the official spec says. You should be OK, but the MOSFETs on the ASUS card are not likely to be as robust as the ones on the Zotac card (300W, 2x 8-pin power), so you don't want to pull really high wattages through the Asus card for long periods of time or you put the card at risk of cooking itself.

HWiNFO64 will integrate with the OSD that is part of Afterburner. HWiNFO can monitor temps, frequencies and GPU power draw, among many other things. You can set it up so that the GPU power draw wattage is displayed on screen with the standard AB GPU clock speeds etc. I find it very handy.


----------



## zipper17

@gtbtk Yeah, actually HWiNFO64 and Afterburner integrate through RivaTuner Statistics Server.

Btw, I have a question: my PSU only has 2x 8-pin PCIe cables. If I add another 1070 for SLI, is it safe to use 2x Molex to 6/8-pin PCIe adapters? The Molex-to-PCIe adapters came in the box as accessories with the card. Some say it's better to buy a new PSU, but why not just use the adapters and save some bucks?


----------



## Dan-H

Quote:


> Originally Posted by *zipper17*
> 
> @gtbtk yeah actually Hwinfo64 & Afterburner integrate with the Rivatuner Statistic Server.
> 
> Btw I Have some question, My PSU only has 2x8pin PCIE cable, If I add another 1070 to SLI, is it safe to use 2xMolex to 6/8 pin PCIE? The Molex to PCIE adapter it come from the box I bought together with the card as accessories. Some say It's better to buy a new PSU, but why not to just use adapter to save bucks.


What PSU do you have?

edit: I saw "Seasonic SS 650 AT" in your sig but it isn't clear which one that is.


----------



## zipper17

Quote:


> Originally Posted by *Dan-H*
> 
> What PSU do you have?
> 
> edit: I saw "Seasonic SS 650 AT" in your sig but it isn't clear which one that is.


I bought this PSU already re-branded under another name in my local market, but it's actually a Seasonic OEM unit, the SS-650AT with APFC, Bronze rated (written on the label), with the quality of a good old Seasonic PSU (probably 2010-2011). There is a local review of it, and the ripple quality is good.

If Molex-to-PCIe adapters are actually safe, why not just use them and save money and time?


----------



## Dan-H

Quote:


> Originally Posted by *zipper17*
> 
> I bought this PSU already re-branded with other brandname in my local place. But actually it's a Seasonic OEM version ss650at APFC bronze (Written on the Label), and quality is a good old seasonic PSU (probably 2010-2011). There is a local review for it, and the ripple quality is good.
> 
> If actually molex to pcie are safe, why not just to use them, save money & time.


Can you post the specs from the label on the side of the PSU? Without seeing them it is hard to say whether it is safe. It might be OK. It might not.

I'm not a PSU expert, but I consider a good PSU the foundation investment for the rest of the system.

A 650 Bronze PSU from any MFG is worth about $30 to $40 USD.

The pair of 1070s are worth around $800 USD.

If it were my system, I'd invest in a better PSU if I were trying to SLI that much graphics card.

YMMV.


----------



## icold

I won't risk flashing a BIOS from another manufacturer. What we really need is a Pascal BIOS tweaker.


----------



## ucode

Quote:


> Originally Posted by *gtbtk*
> 
> GPU Video Clock (controlled at .0975V)


What does that mean? (even if assuming typo 0.975V)


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> @gtbtk yeah actually Hwinfo64 & Afterburner integrate with the Rivatuner Statistic Server.
> 
> Btw I Have some question, My PSU only has 2x8pin PCIE cable, If I add another 1070 to SLI, is it safe to use 2xMolex to 6/8 pin PCIE? The Molex to PCIE adapter it come from the box I bought together with the card as accessories. Some say It's better to buy a new PSU, but why not to just use adapter to save bucks.


Yes, I know that, but since AB is the front end and RivaTuner is the integral part that provides the OSD, I did not feel the need to complicate matters.

Using Molex-to-PCIe adapters is never a recommended course of action.

A 6-pin PCIe cable is rated to supply 75W and has 2x 12V conductors supplying 3.35A each, so it would need 2 Molex cables that connect directly to the PSU. An 8-pin cable has 3x 12V conductors at 4A each and would require 3 directly connected Molex cables to provide its 150W. You could not use 2 Molex connectors on a single daisy-chained cable, as the single shared conductor only supplies a total of 5A.
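As a rough sanity check of that arithmetic, here is a sketch using the 5A-per-directly-connected-Molex-line figure from the paragraph above (real connector ratings vary with pin type and wire gauge, so treat the numbers as assumptions, not spec values):

```python
# Rough power-budget check for Molex -> PCIe adapters.
# Assumption: 5 A max per dedicated Molex 12 V line (figure from the
# post above); real ratings depend on pin and wire gauge.
PCIE_SPEC_W = {"6-pin": 75, "8-pin": 150}
MOLEX_LINE_A = 5.0
RAIL_V = 12.0

def adapter_headroom(connector: str, dedicated_molex_lines: int) -> float:
    """Watts of margin the adapter has over the PCIe connector's rating.
    Negative means the adapter cannot safely feed that connector."""
    available = dedicated_molex_lines * MOLEX_LINE_A * RAIL_V
    return available - PCIE_SPEC_W[connector]

# Two dedicated Molex cables feeding a 6-pin: 120 W available vs 75 W needed.
print(adapter_headroom("6-pin", 2))   # 45.0 W of margin
# One daisy-chained cable (a single shared 12 V conductor) feeding a 6-pin:
print(adapter_headroom("6-pin", 1))   # -15.0 W -> under-specced
# Three dedicated cables feeding an 8-pin: 180 W vs 150 W.
print(adapter_headroom("8-pin", 3))   # 30.0 W of margin
```

Which lines up with the point above: the adapters only work out on paper when every Molex plug has its own run back to the PSU.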

A reasonable 750W PSU can be picked up for not much money; an adequate unit is about $75 from Newegg. Is it really worth cheaping out when connecting a $400 graphics card?


----------



## gtbtk

Quote:


> Originally Posted by *ucode*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> GPU Video Clock (controlled at .0975V)
> 
> 
> 
> What does that mean? (even if assuming typo 0.975V)

Yes, 0.975V.

I have no idea what the video clock actually does. Nvidia doesn't seem to mention it in any of the doco I have ever seen, and the overclocking utilities don't address or report the clock value. To be fair, until Pascal arrived with the voltage curve, there was no way to directly address the video clock anyway.

I have found, though, that if you set the 1.093V point to 2100 or so and tune the video clock to run faster by increasing the 0.975V point on the curve to about 2025, you get more FPS than if you leave the 0.975V point at the default curve setting.


----------



## ucode

Okay, so if I understand correctly when increasing the 0.975V GPU clock point it increases the video clock at the 1.093V point.

From what I've seen it does seem there's an option to independently program the video clock but VBIOS has it disabled. Instead we seem to be left with a buggy implementation that can vary from one driver version to the next. Usually the video clock tends to follow the GPU clock some ~200MHz behind and the gap seems to be bigger if one gets a VBIOS with increased base/boost clocks. Forcing a p-state seems to drop the video clock to a much lower static value with a subsequent drop in overall performance. Shame really.


----------



## Gurkburk

People on reddit are so keen on spewing out that a "4770k bottlenecks a 1070", even when overclocked.

I find it extremely hard to believe that this processor bottlenecks a 1070. So what's really the deal?


----------



## philhalo66

Quote:


> Originally Posted by *Gurkburk*
> 
> People on reddit are so keen on spewing out that a "4770k bottlenecks a 1070", even when overclocked.
> 
> I find it extremely hard to believe that this processor bottlenecks a 1070. So whats really the deal?


It depends on the game. My 3570K at 4.8GHz doesn't bottleneck my 1070 in most games, but it does in GTA V, Rise of the Tomb Raider, and a few heavily single-threaded games like Crysis 1.


----------



## Gurkburk

Quote:


> Originally Posted by *philhalo66*
> 
> It depends on the game. My 3570K at 4.8GHz doesn't bottleneck my 1070 in most games, but it does in GTA V, Rise of the Tomb Raider, and a few heavily single-threaded games like Crysis 1.


Yeah, I'm talking about BF1.


----------



## philhalo66

Quote:


> Originally Posted by *Gurkburk*
> 
> Yea im talking about bf1.


I can't help you there, I don't have Battlefield 1.


----------



## gtbtk

Quote:


> Originally Posted by *ucode*
> 
> Okay, so if I understand correctly when increasing the 0.975V GPU clock point it increases the video clock at the 1.093V point.
> 
> From what I've seen it does seem there's an option to independently program the video clock but VBIOS has it disabled. Instead we seem to be left with a buggy implementation that can vary from one driver version to the next. Usually the video clock tends to follow the GPU clock some ~200MHz behind and the gap seems to be bigger if one gets a VBIOS with increased base/boost clocks. Forcing a p-state seems to drop the video clock to a much lower static value with a subsequent drop in overall performance. Shame really.


The video clock only varies with the position of the voltage slider. At +0 voltage the clock is adjusted with the 0.950V point; at +100 it is adjusted with the 0.975V point.

That has been consistent since the day the Pascal cards were introduced and applies equally to MSI, ASUS, EVGA, Zotac and Galax. Given that I do not know exactly what the video clock is actually doing or measuring, I certainly would not say it is buggy; it has remained consistent across brands, BIOS updates and driver updates. What I do know is that increasing that clock point together with the point at 1.093V, while leaving all the other points alone, has given me the best overclocks.

Yes, the video clock will follow the core slider, because the slider moves the entire curve up and down and the relative heights of the points on the curve stay the same. But I am not talking about using the core slider; I am talking about a custom curve with a double hump. And yes, in a lower P-state all the clocks drop to a lower idle; the screenshot I posted was displaying exactly that.


----------



## icold

Even an i7 [email protected] will "bottleneck" a GTX 1080... but that's not a bottleneck. With a real CPU bottleneck, the CPU stays at 100% and the GPU at (for example) 80%. This is a CPU-bound game.


----------



## philhalo66

Quote:


> Originally Posted by *icold*
> 
> Even I7 [email protected] "bottleneck" with gtx 1080... this is not bottleneck. Bottleneck cpu stays at 100% and gpu (example): 80%. This is a cpu bound game.


I wonder how one of those 12-core Xeons would handle BF1.


----------



## icold

Quote:


> Originally Posted by *philhalo66*
> 
> I wonder how one of those 12 core core Xeons would handle BF1


At 5GHz :sonic:

The game hammers processors, but the situation is mitigated with an OC.


----------



## Gurkburk

Quote:


> Originally Posted by *icold*
> 
> Even I7 [email protected] "bottleneck" with gtx 1080... this is not bottleneck. Bottleneck cpu stays at 100% and gpu (example): 80%. This is a cpu bound game.


Lol, yeah, then my 4770K isn't bottlenecking at all.


----------



## icold

It was the game developers' fault; your CPU can handle up to a Titan X Pascal :thumb:


----------



## spddmn24

Quote:


> Originally Posted by *icold*
> 
> *It was the fault of the developers of the game,* your cpu handle until titan x pascal:thumb:


Got a source to back that up? My 6700K @ 4.7GHz with 3333MHz RAM can't keep my 1070 pegged in quite a few other games too. CPUs have stayed relatively stagnant compared to GPUs since Sandy Bridge.


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> Yes I know that, but as AB is the front end and Riva is an integral part to provide the OSD. I did not feel the need to complicate matters.
> 
> Using adapters from Molex to Pcie is never a recommended course of action.
> 
> A 6 pin PCIE cable is rated to supply 75W and has 2 x 12v conductors that supply 3.35A and would need 2 molex cables that directly connect to the PSU. An 8 pin cable has 3 x 12V conductors at 4 amps and would require 3 directly connected molex cables to provide the 150W for an 8 pin. You could not use 2 molex connectors on a single daisy chained cable as the single shared conductor only supplies a total of 5A.
> 
> A reasonable 750 watt PSU can be picked up for not that much money. You could get an adequate 750W PUS for about $75 from newegg. Is it really worth cheaping out when trying to connect a $400 graphics card?


Yeah, I'm also planning to get a new PSU if I go SLI, but I'm still curious about the technical answer: is dual Molex-to-PCIe actually safe? Why do they still include those adapters in the box even now? What were the hardware engineers thinking?


----------



## zipper17

Quote:


> Originally Posted by *Gurkburk*
> 
> People on reddit are so keen on spewing out that a "4770k bottlenecks a 1070", even when overclocked.
> 
> I find it extremely hard to believe that this processor bottlenecks a 1070. So whats really the deal?


From Sandy Bridge i5/i7 onwards, they have largely been a cure for bottlenecks.
Any overclocked i5/i7 should still be enough; only in games that really use the i7's Hyper-Threading will an i5 start to fall behind.
RAM speed also helps minimum framerates. 1600MHz used to be the standard, but now something from 2400MHz up is becoming the new baseline.

It's on generations older than Sandy Bridge that bottlenecks become the real deal.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gurkburk*
> 
> People on reddit are so keen on spewing out that a "4770k bottlenecks a 1070", even when overclocked.
> 
> I find it extremely hard to believe that this processor bottlenecks a 1070. So whats really the deal?
> 
> 
> 
> Since i5/i7 Sandybridge - up until now, they're cures to the bottlenecks.
> Any i5/i7 overclocked should be still enough. Only on games that really need lot of i7 Hyperthreading that probably where i5 will start to fall behind.
> RAM speed also help increasing minimum framerates. RAM 1600MHZ probably the most standard, but I think now maybe start from 2400mhz or so is become a new standard.
> 
> Generation behind the Sandybridge or worst is where probably bottlenecks are the real deal.

An i7-2600 with a 4.439GHz BCLK overclock performs similarly to an i5-6600, and a 2600K would perform even better. It is for that reason that I have not been able to justify a CPU/motherboard upgrade.

Do the 2600 or the 6600 affect performance compared to a 6700K? Yes. But not to the level of being unusable; even a Sandy or Ivy i5 is not that much further behind the 2600.

The performance of both of those CPUs still falls into the category of "good enough", which is why I have not yet upgraded to a new CPU that doesn't really perform much better in the real world.
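The 4.439GHz figure checks out arithmetically if you assume a 42x multiplier (the i7-2600's 38x max single-core turbo plus the +4 "limited unlocked" bins non-K Sandy Bridge chips allow) on a 105.7MHz BCLK; both of those inputs are my assumptions, not stated in the post:

```python
# Sketch: non-K Sandy Bridge BCLK overclock arithmetic.
# Assumptions (not from the post): 42x multiplier, 105.7 MHz BCLK.
bclk_mhz = 105.7        # bus clock after overclock
multiplier = 42         # assumed 38x turbo + 4 extra non-K bins
core_mhz = bclk_mhz * multiplier
print(f"{core_mhz:.1f} MHz")  # 4439.4 MHz, i.e. ~4.439 GHz
```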


----------



## MEC-777

I'm running a locked i5-4570 (3.6 max turbo). It doesn't like any BCLK overclocking. Not even a little. Anyways, until I upgraded to the 1070, I hadn't noticed any perceived CPU bottleneck - only in a few games in certain situations (example: racing with a lot of AI), and very minimal. Now with the 1070 though, there are more noticeable moments where the i5 holds it back.

But as gtbtk said, it's nothing game-breaking and doesn't make games unplayable. Though I am planning to upgrade to a 4790K very soon, the i5 is still definitely "good enough" to be paired with a 1070, IMO.


----------



## icold

My BCLK won't go above 104.3. You've got a nice clock for a non-K :thumb:


----------



## msigtx760tf4

Quote:


> Originally Posted by *philhalo66*
> 
> It depends on the game. My 3570K at 4.8GHz doesn't bottleneck my 1070 in most games, but it does in GTA V, Rise of the Tomb Raider, and a few heavily single-threaded games like Crysis 1.


Bull****, maybe it bottlenecks your brain hehe


----------



## gtbtk

Quote:


> Originally Posted by *MEC-777*
> 
> I'm running a locked i5-4570 (3.6 max turbo). It doesn't like any BCLK overclocking. Not even a little. Anyways, until I upgraded to the 1070, I hadn't noticed any perceived CPU bottleneck - only in a few games in certain situations (example: racing with a lot of AI), and very minimal. Now with the 1070 though, there are more noticeable moments where the i5 holds it back.
> 
> But as gbtk said, it's nothing game-breaking and doesn't make games unplayable. Though I am planning to upgrade to a 4790K very soon, the i5 is still definitely "good enough" to be paired with a 1070, IMO.


I never noticed any CPU deficiency until I updated to my 1070 either.

All the Z270/Kaby Lake vids that are dropping are getting my juices flowing. Do I really need it? Probably not, but I want a new toy.


----------



## ucode

Quote:


> Originally Posted by *gtbtk*
> 
> Given that I do not know exactly what a Video Clock is actually doing/measuring, i certainly would not say it is buggy it has remained consistent across brands, bios updates and driver updates.


Bugs have been with Pascal for over six months now, and not just in the video clock. The base clock of the video clock is 658MHz, and if a P-state is forced it stays at 658MHz while the memory and GPU clocks can still be adjusted. That certainly seems like a bug to me, as if someone forgot about the video clock in this scenario.

Also, it's hard to have confidence in the reported clocks.

Running FS with the GPU clock reported at 139MHz and the video clock over 2GHz.

Then running FS at clocks reported as less than a tenth of the above, but with a graphics score that doesn't come close to scaling accordingly.

Here are two different programs reporting a 5.1GHz GPU clock before a crash. It gives the impression that programmed clocks are being displayed rather than the actual clocks.

The GDDR5X bug present since day one still has not been fixed: performance drops on every other clock setting unless the PC is put into sleep mode and woken, whereupon performance returns to where expected.

While IMO none of these are show-stoppers, it can still be a disappointment for some.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> my bclk dont up more than 104.3, you have a nice clock for a non k:thumb:


I have got it up to 105.7, and the 1600MHz Kingston Fury RAM is running at 1972MHz.

I discovered that VCCIO voltage, as well as stabilizing memory overclocks, also helps with my graphics performance.


----------



## xGeNeSisx

Quote:


> Originally Posted by *gtbtk*
> 
> An i7-2600 with a 4.439Ghz BCLK overclock has a similar performance to an i5-6600. A 2600K would perform even better. It is for that reason that I have not been able to justify a CPU/Motherboard upgrade
> 
> Does the 2600 or a 6600 have an effect on the performance when compared to a 6700K? Yes it does. But it does not impact performance to the level of being unusable. Even a sandy or Ivy i5 is not that much further behind the 2600
> 
> The performance of both of those CPUs still falls into the category of "good enough" and is the reason that I have not yet upgraded to a new CPU that doesn't really perform that much better in the real world.


This is unrelated to the topic of CPU bottlenecks, but I would just like to say thank you gtbtk. Your advice on crossflashing allowed me to flash my Gigabyte G1 to EVGA FTW bios, which completely eliminated the issues I was experiencing. Constant stutter going into a new area in Doom, card choking for no reason and throttling at 40 C with an AIO loop on it (VRMs never hit more than 55C either), and for some reason it was reporting that it was hitting the power limit at around 92-96%.

I tried using the shunt mod on my 1070 which helped, but the card was still trying to downclock itself every few frames. After getting that extra base clock and increased power limit from the crossflash, everything has been running exceptionally. I was not feeling too great about my 1070 purchase until Micron mem got fixed, and then I flashed the card.
Anyway, just wanted to let you know I appreciate it dude. I'm trying to gain a bit more insight into how Pascal overclocks, and your posts are certainly helping!


----------



## gtbtk

Quote:


> Originally Posted by *ucode*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Given that I do not know exactly what a Video Clock is actually doing/measuring, i certainly would not say it is buggy it has remained consistent across brands, bios updates and driver updates.
> 
> 
> 
> Bugs have been with Pascal for over 6 months now and not just the video clock. The base clock of the video clock is 658MHz and if P-State is forced it stays at 658MHz while memory and GPU clock can still be adjusted. That certainly seems like a bug to me, like someone forgot about the video clock in this scenario.
> 
> Also it's hard to get confidence in the clocks reported.
> 
> Running FS with GPU clock reported at 139MHz and Video clock over 2GHz
> 
> 
> Then running FS at clocks reported as less than a tenth of the above clocks but graphics score not even close to scaling.
> 
> 
> Here's two different softwares reporting 5.1GHz GPU clock before crash. Gives the impression programmed clocks are being displayed rather than actual real clocks.
> 
> 
> The GDDR5X bug still has not been fixed since day one where performance drops every other clock setting unless the PC is put in sleep mode and woken whereby performance returns to where expected.
> 
> While IMO none of these are show stoppers it still can be a disappointment for some.

There have certainly been bugs with the 1070, just not anything I have seen relating to the video clock. I was the guy who worked out what was causing the Micron artifact problem and started the big thread on the Nvidia forums that lobbied Nvidia to create the BIOS update fixing the memory-controller bug for all Micron-memory 1070 cards.

You are running a 1080. I don't have any experience with one of those.

1070s do not behave in the same manner, showing 5GHz GPU clocks, and I have never been able to get my 1070 to idle below 260MHz. The MSI Gaming X card that I have, though, will start to power-limit downclock at 105% in spite of having a 126% power limit available. GPU-Z shows me a rainbow pattern in the power-limit reason graph that I had never seen before; it claims that everything, all at the same time, is the reason it is power limited. Not sure what is going on there. I posted about it on the MSI forums, only to be met with total silence.

As I said before, I don't know the official spec-sheet definition of what the video clock is supposed to do. I have just found that running it at a higher clock speed (in my case I can keep it stable at 1822MHz) gives me a better overclock FPS result.


----------



## gtbtk

Quote:


> Originally Posted by *xGeNeSisx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> An i7-2600 with a 4.439Ghz BCLK overclock has a similar performance to an i5-6600. A 2600K would perform even better. It is for that reason that I have not been able to justify a CPU/Motherboard upgrade
> 
> Does the 2600 or a 6600 have an effect on the performance when compared to a 6700K? Yes it does. But it does not impact performance to the level of being unusable. Even a sandy or Ivy i5 is not that much further behind the 2600
> 
> The performance of both of those CPUs still falls into the category of "good enough" and is the reason that I have not yet upgraded to a new CPU that doesn't really perform that much better in the real world.
> 
> 
> 
> This is unrelated to the topic of CPU bottlenecks, but I would just like to say thank you gtbtk. Your advice on crossflashing allowed me to flash my Gigabyte G1 to EVGA FTW bios, which completely eliminated the issues I was experiencing. Constant stutter going into a new area in Doom, card choking for no reason and throttling at 40 C with an AIO loop on it (VRMs never hit more than 55C either), and for some reason it was reporting that it was hitting the power limit at around 92-96%.
> 
> I tried using the shunt mod on my 1070 which helped, but the card was still trying to downclock itself every few frames. After getting that extra base clock and increased power limit from the crossflash, everything has been running exceptionally. I was not feeling too great about my 1070 purchase until Micron mem got fixed, and then I flashed the card.
> Anyway, just wanted to let you know I appreciate it dude. I'm trying to gain a bit more insight into how Pascal overclocks, and your posts are certainly helping!

You are very welcome. I appreciate the acknowledgement. Glad that you found a BIOS that works for you. Did you know that you can now play with the EVGA-only features of Precision XOC, including the auto-overclock utility?

Getting Nvidia to create the Micron bugfix BIOS was certainly a challenge, particularly when the Nvidia forum is full of naysayers who just want to make sure Nvidia products stay broken by arguing against everything. Pascal cards certainly have more complexities that have been made visible by the introduction of the voltage curve. I suspect the "features" were always there in older cards, just locked in a back room and inaccessible to the user.

My MSI card is making me scratch my head with strange power-limit behaviour as well. GPU-Z gives me a rainbow-coloured sensor reading in the power-limit reason graph. I am thinking it may be a BIOS bug introduced by MSI; in spite of the 126% power slider, the card wants to downclock itself at 105%.


----------



## gtbtk

Quote:


> Originally Posted by *Gurkburk*
> 
> People on reddit are so keen on spewing out that a "4770k bottlenecks a 1070", even when overclocked.
> 
> I find it extremely hard to believe that this processor bottlenecks a 1070. So whats really the deal?


You realize that by definition, there is always a bottleneck in the Graphics Pipeline? The limiting factor could either be CPU/Memory, PCIE bus, Graphics card itself or even the choice of cable connecting the monitor.

Even if you are creating 250 fps playing COD, the GPU is bottlenecking the system and will not let the system produce 400 fps. Because the frame rates are higher than what can actually be displayed on a monitor, no one complains about it, but the graphics pipeline is still being bottlenecked.

Will a 4770K produce the same Physics scores in Firestrike as a 6950X? No, it won't. 10 cores/20 threads will kill any 4-core/8-thread CPU in a heavily multi-threaded application such as the Physics test in Firestrike.

In this context though, the 4770K is creating a bottleneck in the graphics pipeline when compared to a 6950X. But that doesn't mean that you cannot have a good gaming experience with the 4770K. Don't get carried away with forum herd mentality; stop and think about it yourself first.


----------



## xGeNeSisx

Quote:


> Originally Posted by *gtbtk*
> 
> You are very welcome. I appreciate the acknowledgement. Glad that you found a BIOS that works for you. Did you know that you can now play with the EVGA-only features of Precision XOC, including the auto-overclock utility?
> 
> Getting Nvidia to create the Micron bugfix BIOS was certainly a challenge, particularly when the Nvidia forum is full of naysayers who just want to make sure that Nvidia products stay broken by arguing against everything. Pascal cards certainly have more complexities that have been made visible with the introduction of the voltage curve. I suspect that the "features" have always been there in older cards but locked in a back room and inaccessible to the user.
> 
> My MSI card is making me scratch my head with strange power limit behavior as well. GPU-Z gives me a nice rainbow coloured sensor reading in the power limit reason graph. I am thinking that it may be a BIOS bug introduced by MSI. In spite of the 126% power slider, the card wants to downclock itself at 105%.


I actually found Precision XOC under the software section in Steam yesterday. Tried to see how the Auto OC feature worked, but the program was kind of buggy in general (in hindsight I may have had Afterburner open with no OC profile set, or just recently closed it). After launching it a few times I got it to start progressing up via the Auto OC feature. It hit ~2075 before I ended the process. The weird part is that the clock speed would not revert back to stock. A reboot fixed the issue, of course. I likely did leave AB on in the background, so I will definitely give it another try.









One of the strangest things about that post on the GeForce forums is that I saw your thread get created while searching for fixes for my G1-related issues. I remember the first post by Sora, I believe their name is, and then the chaos that ensued lol. The Nvidia forums are... awful, to say the least. Taking the problem with proof and laying it out for all to see is astounding in itself. Having to bear personal attacks and trolls and maintain your level of patience throughout that situation? Now that's something.









Is there a particular reason why most manufacturers seem to throttle cards early, even when they don't hit the power limit %? I am still learning about Pascal, but I seem to remember reading an article that detailed that Nvidia provides a warranty on the integrity of the PCB design so long as it is operated within specified limits. Manufacturers would lessen the risk of loss due to hardware being operated out of spec and subsequently malfunctioning. This is just speculation though; I am not that familiar with the strategies Pascal uses to achieve such power efficiency.

Your MSI card downclocking is akin to the situation with my G1. The card just would not stop trying to throttle, and went through absolutely erratic clock speeds, bouncing all over the place. It would downclock itself in the mid 80s occasionally, and any spike into the low 90s was even worse. I RMA'd the first G1 I bought at the beginning of August due to the same problem and coil whine so loud it was like a tea kettle boiling. The replacement displayed the same power-limiting behavior until I crossflashed it.


----------



## icold

Quote:


> Originally Posted by *gtbtk*
> 
> I have got it up to 105.7 and the 1600MHz Kingston Fury RAM is running at 1972MHz.
> 
> I discovered that VCCIO voltage, as well as stabilizing memory overclocks, also helps with my graphics performance


What are your latencies?


----------



## Snuckie7

Quote:


> Originally Posted by *gtbtk*
> 
> The Video clock only varies with the position of the voltage slider. +0 Voltage, the clock will be adjusted with the .950V point, +100 it is adjusted with .975.
> 
> That has been consistent since the day the Pascal cards were introduced and applies equally to MSI, ASUS, EVGA, Zotac and Galax. Given that I do not know exactly what the video clock is actually doing/measuring, I certainly would not say it is buggy; it has remained consistent across brands, BIOS updates and driver updates. What I do know is that increasing that clock point together with the point at 1.093V, while leaving all the other points alone, has given me the best overclocks.
> 
> Yes, the video clock will follow the core slider, because the core slider moves the entire curve up and down and the relative heights of the points on the curve to each other remain the same. But I am not talking about using the core slider. I am talking about a custom curve that has a double hump. Yes, in a lower P-state all the clocks drop to a lower idle. The screenshot I posted was displaying just that.


How do you get a double hump in the Afterburner curve editor? After I apply settings, all the voltage points between 0.975V and 1.093V will be raised to the value at 0.975V.
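To illustrate what I'm seeing, here's a rough sketch (not Afterburner's actual code, just my mental model): the driver seems to enforce a monotonically non-decreasing frequency-vs-voltage curve, so raising one point drags every later, lower point up to it.

```python
# Sketch of the apparent Boost 3.0 rule: each voltage point's clock must
# be at least as high as the clock at the previous voltage point, so
# intermediate points get "raised" when you lift an earlier one.

def enforce_monotonic(curve):
    """curve: list of (voltage, mhz) pairs sorted by voltage.
    Returns the curve as the driver seems to accept it."""
    adjusted = []
    floor = 0
    for volts, mhz in curve:
        floor = max(floor, mhz)   # clocks may never decrease along the curve
        adjusted.append((volts, floor))
    return adjusted

# Raising 0.975V to 2000MHz while the points up to 1.093V sit lower:
curve = [(0.950, 1950), (0.975, 2000), (1.000, 1960), (1.050, 1970), (1.093, 2050)]
print(enforce_monotonic(curve))
# the 1.000V and 1.050V points get pulled up to the 0.975V value
```

Which would explain why a literal "double hump" can't survive an apply: the dip between the two humps gets flattened up.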


----------



## skupples

Tri-SLI GK110 Titans gone, hello single EVGA 1070. I'll add a second when the prices drop after AMD does whatever it is they're about to do.

hai guise.

it'll be here in a couple days.

I promised OCN I'd upgrade when Pascal hit, and Pascal 2.0 looks pretty lame, so 1070 for 2-3 years until DX12 actually means something.

soooo lemme guess, we still don't have manual control over the voltage controllers via command prompt?


----------



## duganator

Finally got around to upgrading the aging PC. Zotac 1070, 32gb 3200 ram, 5960x. I'm really debating adding another 1070 soon.


----------



## gtbtk

Quote:


> Originally Posted by *xGeNeSisx*
> 
> I actually found Precision XOC under the software section in Steam yesterday. Tried to see how the Auto OC feature worked, but the program was kind of buggy in general (in hindsight I may have had Afterburner open with no OC profile set, or just recently closed it). After launching it a few times I got it to start progressing up via the Auto OC feature. It hit ~2075 before I ended the process. The weird part is that the clock speed would not revert back to stock. A reboot fixed the issue, of course. I likely did leave AB on in the background, so I will definitely give it another try.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One of the strangest things about that post on the GeForce forums, is that I saw your thread get created while searching for fixes for my G1 related issues. I remember the first post by Sora, I believe their name is, and then the chaos that ensued lol. The Nvidia forums are...awful to say the least. Taking the problem with proof and laying it out for all to see is astounding in itself. Having to bear personal attacks and trolls and maintain your level of patience throughout that situation? Now that's something
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is there a particular reason why most manufacturers seem to throttle cards early, even when they don't hit the power limit %? I am still learning about Pascal, but I seem to remember reading an article that detailed that Nvidia provides a warranty on the integrity of the PCB design so long as it is operated within specified limits. Manufacturers would lessen the risk of loss due to hardware being operated out of spec and subsequently malfunctioning. This is just speculation though; I am not that familiar with the strategies Pascal uses to achieve such power efficiency.
> 
> Your MSI card downclocking is akin to the situation with my G1. The card just would not stop trying to throttle, and went through absolutely erratic clock speeds, bouncing all over the place. It would downclock itself in the mid 80s occasionally, and any spike into the low 90s was even worse. I RMA'd the first G1 I bought at the beginning of August due to the same problem and coil whine so loud it was like a tea kettle boiling. The replacement displayed the same power-limiting behavior until I crossflashed it.


The OC utilities should only be run in isolation. The Precision auto-overclock utility is interesting, and it will help you quickly see which voltage levels overclock better than others across the range. It is still immature, and the OC that you get from it is not guaranteed to be stable, but it is handy for quickly discovering where the card's weaknesses are. If it crashes, restart Precision and it will carry on from where it left off. Just remember to flash the card back to the original BIOS if you ever have to RMA the second G1.

It is fascinating to see how the self-appointed Nvidia forum police like Sora work so hard to ensure Nvidia products remain flawed. I didn't really have much choice but to be patient. I wanted my expensive GPU fixed and MSI would not RMA the card. I also figured that it would help out thousands of other 1070 owners as well. It is amusing when some people try to educate me on how bad Micron memory chips are.

I really don't know why the power limits kick in early, other than to say that it is definitely something in the way the BIOS file has been written and it is not a hardware problem. It is not such a huge problem for me, as the card only tries to pull more than 100% under a 4K load and I don't have a 4K monitor. The only time that I see the reduced power limit is in Firestrike Ultra, and that only bugs me because I always want MOAR POWER!!! One thing that just crossed my mind is that the Seahawk card has a power limit of 105%. I wonder if they grabbed a Seahawk BIOS and edited that version to make it a Gaming BIOS, and missed one of the internal settings that defines the power limit as they were preparing the Gaming X and Gaming Z BIOSes?

These cards are engineered to operate in the temperature range that we use them at, even with an overclock; that is why they built in the downclocking safeguards in the first place. I have run almost all of the high-end BIOSes on my card. The Zotac BIOS will pull 300W without any drama at all, so the early downclocking is not a hardware issue and it is not an Nvidia issue. It is a shame that the extra power draw doesn't really translate to lots of extra performance though. From memory, and I admit that I had never really focused any attention on it at the time, I think that the replacement bugfix BIOS that MSI distributed actually downclocks early even worse than the original BIOS did.

The AIB partners are providing the warranty for their own cards. Many models have custom PCBs that follow the high-level electrical, but not physical, design that came from Nvidia. Maybe it was MSI's and Gigabyte's idea to sneak in a power limitation because they were having a run of warranty claims?
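For anyone following along with the slider percentages, they map to watts like this. A quick sketch; the 230W base (100%) board power here is an assumption for illustration, so read your own card's real default limit from monitoring software before drawing conclusions:

```python
# Rough sanity check of power-slider % vs watts. BASE_W is an assumed
# illustrative default power limit for a custom 1070, not a spec value.

BASE_W = 230.0

def slider_to_watts(percent):
    """Convert a power-slider percentage to an absolute wattage."""
    return BASE_W * percent / 100.0

for pct in (100, 105, 126):
    print(f"{pct:>3}% -> {slider_to_watts(pct):.1f} W")
```

With a 230W base, 105% works out to about 241W and 126% to about 290W, which is roughly the range being discussed here.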


----------



## gtbtk

Quote:


> Originally Posted by *Snuckie7*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The Video clock only varies with the position of the voltage slider. +0 Voltage, the clock will be adjusted with the .950V point, +100 it is adjusted with .975.
> 
> That has been consistent since the day the Pascal cards were introduced and applies equally to MSI, ASUS, EVGA, Zotac and Galax. Given that I do not know exactly what the video clock is actually doing/measuring, I certainly would not say it is buggy; it has remained consistent across brands, BIOS updates and driver updates. What I do know is that increasing that clock point together with the point at 1.093V, while leaving all the other points alone, has given me the best overclocks.
> 
> Yes, the video clock will follow the core slider, because the core slider moves the entire curve up and down and the relative heights of the points on the curve to each other remain the same. But I am not talking about using the core slider. I am talking about a custom curve that has a double hump. Yes, in a lower P-state all the clocks drop to a lower idle. The screenshot I posted was displaying just that.
> 
> 
> 
> How do you get a double hump in the Afterburner curve editor? After I apply settings, all the voltage points between 0.975V and 1.093V will be raised to the value at 0.975V.
That's right, the points will be raised.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I have got it up to 105.7 and the 1600MHz Kingston Fury RAM is running at 1972MHz.
> 
> I discovered that VCCIO voltage, as well as stabilizing memory overclocks, also helps with my graphics performance
> 
> 
> 
> what is your latencies?

I am running the 2x8GB sticks at 986.7MHz at 11-12-11-33 1T.

I could never get the PC to boot with the memory set any faster, even if I really loosened up the timings. Now, with my discovery of VCCIO, I might have another go at getting it to the next frequency bin.
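To actually answer the latency question in nanoseconds, the timings convert like this (quick sketch; 986.7MHz is the real I/O clock of DDR3-1973):

```python
# Convert memory timings given in clock cycles at a known memory clock
# into absolute nanoseconds: ns = cycles / freq.

def cycles_to_ns(cycles, clock_mhz):
    return cycles * 1000.0 / clock_mhz

clock = 986.7  # MHz
for name, cyc in [("tCL", 11), ("tRCD", 12), ("tRP", 11), ("tRAS", 33)]:
    print(f"{name}: {cycles_to_ns(cyc, clock):.2f} ns")
```

The CAS latency works out to about 11.15ns, which is in the same ballpark as stock DDR3-1600 CL9 (11.25ns), just with much more bandwidth.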


----------



## gtbtk

Quote:


> Originally Posted by *duganator*
> 
> Finally got around to upgrading the aging PC. Zotac 1070, 32gb 3200 ram, 5960x. I'm really debating adding another 1070 soon.


Nice rig


----------



## Snuckie7

Quote:


> Originally Posted by *gtbtk*
> 
> That's right, the points will be raised.


Thanks, cheers mate


----------



## Luckael

Guys, a question: I OC my Asus GTX 1070 Strix using MSI Afterburner, but after restarting my PC it goes back to the default clocks. I already ticked "apply overclocking at system startup". Anyone having the same issue as mine?


----------



## ucode

Quote:


> Originally Posted by *gtbtk*
> 
> 1070s do not behave in the same manner showing 5GHz GPU clocks, and I have never been able to get my 1070 to idle below 260MHz.


That 12.5MHz was a P0 P-state and not an idle state. The 1080 and 1070 are not so different I think, both GP104, differing mainly in memory. You can probably do those things on 1070s too, but may be limited by the tools to do it. For instance, I'm not sure if AB will let you run a curve fully flat-lined, but it can be done, as an example.



The idea was that if temperature/frequency behaviour is controlled by points on the curve, then making them all the same should stop temperature-driven frequency changes. Instead, clocks increase as temperature increases unless the clocks are set when the core is hottest.
Quote:


> Originally Posted by *gtbtk*
> 
> The MSI Gaming X card that I have, though will start to power limit down clock at 105% in spite of having a 126% power limit available. GPU-Z shows me a nice rainbow pattern in the power limit reason graph that I had never seen before. It claims that everything at the same time is the reason that it is power limited. Not sure what Is going on there. I posted about it at the MSI forums only to be met with total silence in return.


Seen those too; the silence is probably because no one but Nvidia knows why. Try checking your power limits with the SMI utility that comes with the Nvidia driver; it can also be used to set the limit. Have you tried running FurMark (small window size)?


----------



## GeneO

Quote:


> Originally Posted by *gtbtk*
> 
> The OC utilities should only be run in isolation. The Precision auto-overclock utility is interesting, and it will help you quickly see which voltage levels overclock better than others across the range. It is still immature, and the OC that you get from it is not guaranteed to be stable, but it is handy for quickly discovering where the card's weaknesses are. If it crashes, restart Precision and it will carry on from where it left off. Just remember to flash the card back to the original BIOS if you ever have to RMA the second G1.
> 
> It is fascinating to see how the self-appointed Nvidia forum police like Sora work so hard to ensure Nvidia products remain flawed. I didn't really have much choice but to be patient. I wanted my expensive GPU fixed and MSI would not RMA the card. I also figured that it would help out thousands of other 1070 owners as well. It is amusing when some people try to educate me on how bad Micron memory chips are.
> 
> I really don't know why the power limits kick in early, other than to say that it is definitely something in the way the BIOS file has been written and it is not a hardware problem. It is not such a huge problem for me, as the card only tries to pull more than 100% under a 4K load and I don't have a 4K monitor. The only time that I see the reduced power limit is in Firestrike Ultra, and that only bugs me because I always want MOAR POWER!!! One thing that just crossed my mind is that the Seahawk card has a power limit of 105%. I wonder if they grabbed a Seahawk BIOS and edited that version to make it a Gaming BIOS, and missed one of the internal settings that defines the power limit as they were preparing the Gaming X and Gaming Z BIOSes?
> 
> These cards are engineered to operate in the temperature range that we use them at, even with an overclock; that is why they built in the downclocking safeguards in the first place. I have run almost all of the high-end BIOSes on my card. The Zotac BIOS will pull 300W without any drama at all, so the early downclocking is not a hardware issue and it is not an Nvidia issue. It is a shame that the extra power draw doesn't really translate to lots of extra performance though. From memory, and I admit that I had never really focused any attention on it at the time, I think that the replacement bugfix BIOS that MSI distributed actually downclocks early even worse than the original BIOS did.
> 
> The AIB partners are providing the warranty for their own cards. Many models have custom PCBs that follow the high-level electrical, but not physical, design that came from Nvidia. Maybe it was MSI's and Gigabyte's idea to sneak in a power limitation because they were having a run of warranty claims?


The Nvidia forums need real forum police. Some people are just out of control there because they're allowed to be, including Nvidia sycophants/fanboys like Sora.

I have an MSI 1070 X too. I just don't get the point of delivering more power with an extra 6-pin connector if you can't use it; just plain marketing, I guess. I can tweak one point on my curve and I will go from no perfcap to a rainbow. I can save a curve profile, apply it after a reboot, and it will apply differently from what I saved. I can run stable at one temperature but not at another, even though the card adjusts. I can make a step-function curve and run at a high frequency stable but get low FPS. I get no consistency, and I am at this point pretty much frustrated and disappointed because of all of this.


----------



## shadowrain

Quote:


> Originally Posted by *Luckael*
> 
> Guys, a question: I OC my Asus GTX 1070 Strix using MSI Afterburner, but after restarting my PC it goes back to the default clocks. I already ticked "apply overclocking at system startup". Anyone having the same issue as mine?


Do you happen to have the NZXT CAM software or something similar installed? The same thing happened to me at startup before: EVGA Precision applies the OC, then since CAM loads late, it resets the clocks to default after it loads. Fixed by putting a 30 sec to 1 min delay on EVGA Precision with Task Scheduler.


----------



## Quadrider10

What's the shortcut to bring up voltage control in AB?


----------



## zipper17

Quote:


> Originally Posted by *ucode*
> 
> That 12.5MHz was a P0 P-state and not an idle state. The 1080 and 1070 are not so different I think, both GP104, differing mainly in memory. You can probably do those things on 1070s too, but may be limited by the tools to do it. For instance, I'm not sure if AB will let you run a *curve fully flat-lined*, but it can be done, as an example.
> 
> 
> 
> The idea was that if temperature/frequency behaviour is controlled by points on the curve, then making them all the same should stop temperature-driven frequency changes. Instead, clocks increase as temperature increases unless the clocks are set when the core is hottest.
> Seen those too; the silence is probably because no one but Nvidia knows why. Try checking your power limits with the SMI utility that comes with the Nvidia driver; it can also be used to set the limit. Have you tried running FurMark (small window size)?


I remember trying that once in the past with a fully flat curve in AB; my card immediately crashed, black screen, and the PC needed a hard reboot. Even now I'm not sure I want to do that again.

If you set the very first curve point too high, it will make the GPU hard-crash.
Quote:


> Originally Posted by *gtbtk*
> 
> I am running the 2x8GB sticks at 986.7MHz at 11-12-11-33 1T.
> 
> I could never get the PC to boot with the memory set any faster, even if I really loosened up the timings. Now, with my discovery of VCCIO, I might have another go at getting it to the next frequency bin.


If I'm not mistaken, on the Sandy Bridge platform the highest RAM frequency SB can handle is DDR3-2133 (1066MHz); I'm not sure about the Sandy Bridge HEDT platform though.
Ivy Bridge supports up to 2666/2800 RAM, CMIIW. Also, "K" CPUs have better support for higher RAM frequencies.


----------



## Quadrider10

So does anyone think a BIOS editor will ever actually come out? I'm trying to get my GPU to sustain 2000MHz in game and it's impossible despite a factory clock of 2000; it drops and settles to 1949MHz.


----------



## Luckael

Quote:


> Originally Posted by *shadowrain*
> 
> Do you happen to have NZXT CAM software or something similar installed? Same thing happened to me at startup before, when evga precision applies the oc then since CAM loads late, it defaults the clocks after load. Fixed by putting a 30sec to 1min delay to evga precision with task scheduler.


Yes, I have the NZXT CAM software installed, so that's the issue. Thanks for your help man!


----------



## dallemon

Quote:


> Originally Posted by *gtbtk*
> 
> Just be careful and manage your power draw. Firestrike Ultra pulls more power than any of the other benchmarks in 3dMark
> 
> While the Asus card has a good 8+2-phase VRM, it only has a single 8-pin power input and an official 200W limit baked into the original BIOS. Officially you can only get 225W through the PCIe slot plus the one 8-pin connector per the specs, but I think that the card can actually pull more power than what the official spec says. You should be OK, but the MOSFETs in the ASUS card are not likely to be as robust as the ones on the Zotac card (300W, 2x 8-pin power), so you don't want to be pulling really high wattages through the ASUS card for long periods of time, else you will be putting your card at risk of cooking itself.
> 
> HWiNFO64 will integrate with the OSD that is part of Afterburner. HWiNFO can monitor temps, frequencies and GPU power draw, among many other things. You can set it up so that the GPU power draw wattage is displayed on screen with the standard AB GPU clock speeds etc. I find it very handy.


Thanks, didn't know about that hwinfo plugin  But the card doesn't go above 211W (yet) and it seems that I've got the core stable at [email protected],043V (so far)
The Zotac BIOS has really helped make my ASUS card shine. :-D Haven't started pushing the memory clock yet, so that will probably make the card throttle again in Firestrike, but with a very subtle memory OC there is no longer any throttling in any benches or games.


----------



## syl1979

Quote:


> Originally Posted by *Quadrider10*
> 
> So does anyone think a BIOS editor will ever actually come out? I'm trying to get my GPU to sustain 2000MHz in game and it's impossible despite a factory clock of 2000; it drops and settles to 1949MHz.


Did you try undervolting?
It will reduce heat and thus the thermal throttling (the first step is at 35°C!)

If you use the curve function of Afterburner you should be able to set 2038MHz at 1.025V. It should not go below 2000MHz with a proper cooling / fan curve (try to stay under 60°C).
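As a side note, the clocks only ever land on discrete bins, so a curve target like 2038 is really the 2037.5 bin. A rough illustration; the ~12.5MHz step size is based on observed behaviour mentioned earlier in the thread, not a documented spec:

```python
# Pascal adjusts clocks in discrete steps (observed to be ~12.5MHz bins);
# whatever target you dial in on the curve lands on the nearest bin.

STEP = 12.5  # MHz, assumed bin size

def snap_to_bin(target_mhz):
    """Snap a requested clock to the nearest boost bin."""
    return round(target_mhz / STEP) * STEP

for t in (2000, 2038, 2050, 2088):
    print(f"{t} MHz target -> {snap_to_bin(t)} MHz bin")
```

This is also one possible reason some in-between targets (like 2088) feel less steady than ones that sit exactly on a bin boundary.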


----------



## gtbtk

Quote:


> Originally Posted by *ucode*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> 1070s do not behave in the same manner showing 5GHz GPU clocks, and I have never been able to get my 1070 to idle below 260MHz.
> 
> 
> 
> That 12.5MHz was a P0 P-state and not an idle state. The 1080 and 1070 are not so different I think, both GP104, differing mainly in memory. You can probably do those things on 1070s too, but may be limited by the tools to do it. For instance, I'm not sure if AB will let you run a curve fully flat-lined, but it can be done, as an example.
> 
> The idea was that if temperature/frequency behaviour is controlled by points on the curve, then making them all the same should stop temperature-driven frequency changes. Instead, clocks increase as temperature increases unless the clocks are set when the core is hottest.
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The MSI Gaming X card that I have, though will start to power limit down clock at 105% in spite of having a 126% power limit available. GPU-Z shows me a nice rainbow pattern in the power limit reason graph that I had never seen before. It claims that everything at the same time is the reason that it is power limited. Not sure what Is going on there. I posted about it at the MSI forums only to be met with total silence in return.
> 
> 
> Seen those too; the silence is probably because no one but Nvidia knows why. Try checking your power limits with the SMI utility that comes with the Nvidia driver; it can also be used to set the limit. Have you tried running FurMark (small window size)?

A flat line in the Afterburner curve will simply run the card at the lowest voltage point on the flat line, which is fine if your card will run at the value you set at 0.800V. I can get pretty stable results just pulling up the 1.093V point. One thing I did notice though is that the clock seems to move around less if you set the curve to 2076 or 2100 compared to 2088 and the other in-between points.

I have looked at the SMI utility before and just took another look at it now. It tells me that my max power limit is 289W if I set it with Afterburner. I can manually change it to 291W from the command line, but it still downclocks when the power load hits 105% (about 225-230W) under FS Ultra. FurMark is different again. The TDP value that I get in FurMark is lower than the value reported by AB. That downclocks at 105% according to AB as well, but the card is only boosting to about 1900MHz. It never behaved that way with my GTX 660 that I can remember.

Another interesting observation is that GPU Shark, like GPU-Z, will light up all the limits (voltage, power, temp etc.) at the same time when the readings on the card indicate that it is not close to its set limits. It will do that at random times, but I have not worked out how to trigger it on demand as yet.

I hope you can read the attachment.


----------



## ucode

That power limit is total power; perhaps one of the circuits is hitting its individual limit long before the total limit has a chance to take effect.

See http://www.overclock.net/t/1601329/gtx-1070-1080-titan-x-2nd-gen-bios-who-has-it/110#post_25319052 for an example.

For FurMark testing purposes, by starting with a small custom window (160x120 for instance) and increasing it, it should be possible to find a size that runs a fairly constant load near to, but under, the power limit with no throttling.
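The procedure is basically a search over window sizes. Something like this sketch, where `measure_watts()` is a stand-in model; in practice you would read the actual draw from HWiNFO/GPU-Z while FurMark runs at each size:

```python
# Sketch of the window-size search: grow the FurMark window until
# measured board power approaches, but stays under, the power limit.
# measure_watts() is a toy stand-in for a real sensor reading.

POWER_LIMIT_W = 230.0
MARGIN_W = 10.0

def measure_watts(width, height):
    # toy model: idle floor plus draw proportional to rendered pixels
    return 60.0 + (width * height) * 1.0e-4

def find_window(sizes):
    """Return the largest window that stays under limit minus margin."""
    best = None
    for w, h in sizes:
        if measure_watts(w, h) <= POWER_LIMIT_W - MARGIN_W:
            best = (w, h)
        else:
            break  # sizes are ascending; everything larger draws more
    return best

sizes = [(160, 120), (320, 240), (640, 480), (1024, 768), (1280, 960), (1600, 1200)]
print(find_window(sizes))
```

The margin matters because FurMark's draw isn't perfectly constant; you want steady load just under the limit, not a window that trips the limiter every few frames.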


----------



## gtbtk

Quote:


> Originally Posted by *ucode*
> 
> That power limit is total power; perhaps one of the circuits is hitting its individual limit long before the total limit has a chance to take effect.
> 
> See http://www.overclock.net/t/1601329/gtx-1070-1080-titan-x-2nd-gen-bios-who-has-it/110#post_25319052 for an example.
> 
> For FurMark testing purposes, by starting with a small custom window (160x120 for instance) and increasing it, it should be possible to find a size that runs a fairly constant load near to, but under, the power limit with no throttling.


I understand that it is total power, but 60W between what I actually get and what I'm supposed to get is a huge difference.

That table was very interesting. Do you know what he used to extract and convert the values?

Just played with FurMark again. The highest power draw I could get out of it was 230W while running with my curve. GPU Shark reported that I had exceeded every limit the card had. The only problem was that the card was not exceeding any of the limits while it was reporting that. Methinks I have found a bug in the firmware.


----------



## Onlygear

Hi folks

I'm trying to overclock an ASUS STRIX 1070 OC. The GPU core can reach 1220MHz easily, but unfortunately I'm still having problems with the Micron memory; even with the latest BIOS available my GPU becomes really unstable with anything above 2172MHz (+340MHz).

Any fix or idea to improve it?

Thanks in advance


----------



## I Am The Stig

Hi all - so I just purchased a 1070 FTW last night. Great card so far; the max temps I've had playing BF1, RB6: Siege and The Division have been 64°C, and I'm happy with that. However, I keep hearing about previous issues with this card overheating, to the point of possibly catching on fire.

Is the new batch of FTWs safe? Should I consider exchanging it for maybe a Strix?


----------



## skupples

No one should be using FurMark for anything in 2016+ besides checking TDP. It's a worthless tool that gives modern architectures a large chance of throttling, or just exploding.

That is, unless it's been brought back from the dead in the last 2 years, which doesn't seem to be the case.


----------



## BroPhilip

Will the i7-7700K match my GTX 1070 better than the i5-6600K?

So here is my question. I enjoy gaming and actively play The Division and other open-world games. My i5-6600K (OC'd to 4.6, stable) is running at 100% in these games, and during active sections with NPCs the bottleneck will drop my GPU (GTX 1070 MSI Gaming Z) down to 75-80% usage. I am running an Asus 1080p gaming monitor with a 144Hz refresh rate, and my fps drops from 120 down to almost 60-70 during these moments. My question is this: should I upgrade to the i7-7700K, or wait for Zen info, which seems to be absent from CES? I have around $500 for upgrades.

Also would running sli make the problem worse?


----------



## skupples

is your CPU actually pegging on any cores?



What's the deal with EVGA and working with MSI AB these days?


----------



## BroPhilip

Quote:


> Originally Posted by *skupples*
> 
> is your CPU actually pegging on any cores?
> 
> 
> 
> What's the deal with EVGA and working with MSI AB these days?


My CPU is showing 100% on all 4 cores.


----------



## skupples

I've been gone for too long!

Then it's up to you, my friend. Yes, you'll see some benefit in the titles where you peg all 4 cores, but is it worth the money? Only you can make that choice.

I'm still on X79 with a 4930K, and just ditched my Kepler products. The 4930K is still humming at damn near 5.0 though.


----------



## spin5000

Quote:


> Originally Posted by *Quadrider10*
> 
> So does anyone think a BIOS editor will ever actually come out? I'm trying to get my GPU to sustain 2000MHz in game and it's impossible despite a factory clock of 2000; it drops and settles to 1949MHz.


+1


----------



## c0nsistent

As someone with an EVGA SC 1070, should I flash another BIOS onto this card for a higher power limit or are there any benefits to it whatsoever? I have micron memory and have already flashed the updated EVGA BIOS to the card, which gave me 0 gains on the memory, as I was already at around +475

My highest stable GPU clock is around 2126 to 2152 or so with the voltage slider in Precision XOC maxed out.

I haven't tried overclocking using the curve method much yet, as I figured I would not get much more of an overclock than I already have.


----------



## gerardfraser

I thought I would downclock one of my GTX 1070 cards, as mentioned a few posts back.
So, first try at 1.025V. Overall good; I may just keep these settings. The core clock settled at 2025MHz with no throttling.

The difference in benchmarks is really nothing compared with running the GTX 1070 at 1.093V and 2100/9000 with clocks all over the place, even dropping below 2000 on the core after a while due to throttling.

GPU voltage: 1.025V
Core clock (settled): 2025MHz
Memory: 8506MHz
Max temp reached: 64C
Max fan speed: 66%

https://postimg.org/image/o4rdfr5vf/


----------



## syl1979

You may try 2025MHz at 0.993V...


----------



## philhalo66

the fans on my card are rattling already and it's super, super loud, like metal rattling on metal -_- any suggestions?


----------



## khanmein

Quote:


> Originally Posted by *philhalo66*
> 
> the fans on my card are rattling already and it's super, super loud, like metal rattling on metal -_- any suggestions?


I've read a lot of news about the audible coil whine & fan rattle, but try tapping softly on the middle hub of each of the 3 fans. By the way, Power Logic replacement fans are very easy to find, around USD 10.


----------



## gtbtk

Quote:


> Originally Posted by *Onlygear*
> 
> Hi folk
> 
> I'm trying to overclock an ASUS STRIX 1070 OC. The GPU core can reach 1220MHz easily, but unfortunately I'm still having problems with the Micron memory; even with the last BIOS available my GPU becomes really unstable with anything above 2172MHz (+340MHz)
> 
> Any fix or idea to improve it?
> 
> Thanks in advance


Is 1220Mhz a typo? That is way lower than stock

Have you checked that you are actually running the 86.04.50.00.XX bios on the card? You can check in GPU-Z. It is possible that the update failed and maybe you didn't notice?

What exact symptoms of instability are you suffering?

What are the specs of your system?

Is your PC overclocked? If so, what have you overclocked (CPU/memory) and how did you do it/what settings are you using?

All of those things could be contributing to instability


----------



## gtbtk

Quote:


> Originally Posted by *I Am The Stig*
> 
> Hi all - so I just purchased a 1070 FTW last night. Great card so far; the max temps I've had playing BF1, RB6: Siege and The Division have been 64C, so I'm happy with the temps. However, I keep hearing about previous issues with this card overheating, to the point of possibly catching fire.
> 
> Is the new batch of FTWs safe? Should I consider exchanging it for maybe a Strix?


It wasn't overheating; that was an internet beat-up that wrongly assumed the VRMs were overheating. Some cards had faulty capacitors that blew up. EVGA will cover it under warranty.


----------



## gtbtk

Quote:


> Originally Posted by *BroPhilip*
> 
> Will the i7-7700k match with my gtx 1070 better than the i5-6600k
> 
> So here is my question. I enjoy gaming and actively play The Division and other open-world games. My i5-6600k (OC'd to 4.6, stable) runs at 100% in these games, and during busy sections with NPCs the bottleneck drops my GPU (GTX 1070 MSI Gaming Z) down to 75-80% usage. I am running a 1080p ASUS gaming monitor with a 144Hz refresh rate, and my fps drops from 120 down to almost 60-70 during these moments. My question is this: should I upgrade to the i7-7700k, or wait for Zen info, which seems to be absent from CES? I have around $500 for upgrades.
> 
> Also would running sli make the problem worse?


A 7700K will give you about 150-160% of the computing power of the 6600K. The upgrade can be a direct chip swap, assuming there is a Kaby Lake BIOS update for your motherboard.

You will certainly see a small improvement in frame rates with the 7700K overclocked to 4.9-5.0GHz in games that are CPU-bound or physics-intensive. While we are all excited about Ryzen, unfortunately we don't have any concrete knowledge of what the production Ryzen chip can actually do or how it will overclock. If it only overclocks to 4GHz, the Ryzen chip could well perform worse in gaming than the 7700K.

When the GPU drops to 75%, is the CPU or any of the cores running at 100%?


----------



## Gurkburk

Quote:


> Originally Posted by *BroPhilip*
> 
> Will the i7-7700k match with my gtx 1070 better than the i5-6600k
> 
> So here is my question. I enjoy gaming and actively play The Division and other open-world games. My i5-6600k (OC'd to 4.6, stable) runs at 100% in these games, and during busy sections with NPCs the bottleneck drops my GPU (GTX 1070 MSI Gaming Z) down to 75-80% usage. I am running a 1080p ASUS gaming monitor with a 144Hz refresh rate, and my fps drops from 120 down to almost 60-70 during these moments. My question is this: should I upgrade to the i7-7700k, or wait for Zen info, which seems to be absent from CES? I have around $500 for upgrades.
> 
> Also would running sli make the problem worse?


Your setup isn't getting bottlenecked anywhere.


----------



## philhalo66

Quote:


> Originally Posted by *khanmein*
> 
> I've read a lot of news about the audible coil whine & fan rattle, but try tapping softly on the middle hub of each of the 3 fans. By the way, Power Logic replacement fans are very easy to find, around USD 10.


I already checked; it's not rubbing against the wires. What it sounds like is one of the fans vibrating badly and rattling the shroud. I've never heard of Power Logic before. How would I even go about finding fans to replace the three on my card? It's 3 fans connected to one piece.

this is exactly what it sounds like


----------



## gtbtk

Quote:


> Originally Posted by *philhalo66*
> 
> Quote:
> 
> 
> 
> Originally Posted by *khanmein*
> 
> I've read a lot of news about the audible coil whine & fan rattle, but try tapping softly on the middle hub of each of the 3 fans. By the way, Power Logic replacement fans are very easy to find, around USD 10.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I already checked; it's not rubbing against the wires. What it sounds like is one of the fans vibrating badly and rattling the shroud. I've never heard of Power Logic before. How would I even go about finding fans to replace the three on my card? It's 3 fans connected to one piece.
> 
> this is exactly what it sounds like

I just saw this; it may help.


----------



## Quadrider10

What are the compatible BIOSes to flash on the Gigabyte G1 Gaming with Samsung memory? I'm trying to get a higher power target than 111%.


----------



## khanmein

Quote:


> Originally Posted by *philhalo66*
> 
> I already checked; it's not rubbing against the wires. What it sounds like is one of the fans vibrating badly and rattling the shroud. I've never heard of Power Logic before. How would I even go about finding fans to replace the three on my card? It's 3 fans connected to one piece.
> 
> this is exactly what it sounds like


https://www.aliexpress.com/wholesale?catId=0&initiative_id=AS_20170106043103&SearchText=power+logic+fans

MSI, Gigabyte, etc. state they are using double ball bearings but are actually using long sleeve bearings.


----------



## ZakZakXxX

https://www.techpowerup.com/gpuz/details/kkn8d

My GPU-Z validation:

GPU 1: Micron memory

GPU 2: Samsung memory

CPU-Z validations:

4.4GHz
http://valid.x86.fr/t3n75j

4.5GHz
http://valid.x86.fr/emk05r


----------



## ZakZakXxX

Micron memory BIOS fix for Zotac GTX 1070 cards

Hi,

Here are the temporary links to our FTP for the respective GTX 1070 cards affected by the "Micron RAM" issue.

Please select the correct BIOS for your card based on SKU. The ZIP filenames are self-explanatory, and the updaters are in EXE format, so they execute on double-click. These BIOS updaters only support Windows.

SKU : ZT-P10700E-10S
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_(ZT-P10700E-10S__288-1N435-200Z8-201Z8)_Micron_RAM_(2016_11).zip

SKU : ZT-P10700F-10P
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_(ZT-P10700F-10P__288-1N424-200Z8)_Micron_RAM_(2016_11).zip

SKU : ZT-P10700I-10P
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_(ZT-P10700I-10P__299-1N424-300Z8)_Micron_RAM_(2016_11).zip

SKU : ZT-P10700C-10P
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_AMP_(ZT-P10700C-10P__288-1N435-100Z8-101Z8)_Micron_RAM_(2016_11).zip

SKU : ZT-P10700B-10P
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_AMP_Extreme_(ZT-P10700B-10P__288-1N435-000Z8-001Z8)_Micron_RAM_(2016_11).zip

SKU : ZT-P10700A-10P
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_Founders_(ZT-P10700A-10P__288-1N424-000Z8)_Micron_RAM_(2016_11).zip

SKU : ZT-P10700G-10M
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_Mini_(ZT-P10700G-10M__288-1N445-030Z8)_Micron_RAM_(2016_11).zip

SKU : ZT-P10700K-10M
http://support.pcpartner.com/support/temp/VBIOS_GTX1070_Mini_(ZT-P10700K-10M__288-1N445-130Z8)_Micron_RAM_(2016_11).zip

+++++++++++++++
Important Remark:
+++++++++++++++
- The VBIOS files are packaged as .EXE files for 32-bit and 64-bit Windows respectively. Run the .EXE that matches your Windows type.
- Check that the card is really built with "Micron RAM" before starting the VBIOS change!!!
- These VBIOS changes are one-way only; there is no return path.

Yours,


----------



## gtbtk

Quote:


> Originally Posted by *Quadrider10*
> 
> What are the compatible BIOSes to flash on the Gigabyte G1 Gaming with Samsung memory? I'm trying to get a higher power target than 111%.


You could try an Asus Strix OC BIOS. That will also give you a core clock of 1633MHz and a power slider up to 120%. Both cards have the same 200W power limit, but remember that different cards have different VRM designs, so pay close attention to temps, etc.


----------



## Quadrider10

Quote:


> Originally Posted by *gtbtk*
> 
> You could try an Asus Strix OC BIOS. That will also give you a core clock of 1633MHz and a power slider up to 120%. Both cards have the same 200W power limit, but remember that different cards have different VRM designs, so pay close attention to temps, etc.


I wanna do it, but I feel like it's way too risky. Are we ever going to see BIOS editors or custom BIOSes?


----------



## zipper17

Quote:


> Originally Posted by *BroPhilip*
> 
> Will the i7-7700k match with my gtx 1070 better than the i5-6600k
> 
> So here is my question. I enjoy gaming and actively play The Division and other open-world games. My i5-6600k (OC'd to 4.6, stable) runs at 100% in these games, and during busy sections with NPCs the bottleneck drops my GPU (GTX 1070 MSI Gaming Z) down to 75-80% usage. I am running a 1080p ASUS gaming monitor with a 144Hz refresh rate, and my fps drops from 120 down to almost 60-70 during these moments. My question is this: should I upgrade to the i7-7700k, or wait for Zen info, which seems to be absent from CES? I have around $500 for upgrades.
> 
> Also would running sli make the problem worse?


Try searching YouTube for videos of people with an i7 and a 1070 playing the exact same games as you, and note their framerates. Compare them with yours, and you can judge whether your i5 + 1070 is bottlenecking a lot or not.


----------



## Quadrider10

Same issue here with my 6600K at 4.6. Some CPU bottlenecks in some games, but I don't know if it justifies the extra $100 for, on average, 3fps across many games.


----------



## RyanRazer

Quote:


> Originally Posted by *philhalo66*
> 
> I already checked; it's not rubbing against the wires. What it sounds like is one of the fans vibrating badly and rattling the shroud. I've never heard of Power Logic before. How would I even go about finding fans to replace the three on my card? It's 3 fans connected to one piece.
> 
> this is exactly what it sounds like


I had a 1070 G1 Gaming myself; it had coil whine like the one in the video. I replaced it with a Zotac AMP Extreme, which is almost dead silent.


----------



## khanmein

Quote:


> Originally Posted by *RyanRazer*
> 
> I had a 1070 G1 Gaming myself; it had coil whine like the one in the video. I replaced it with a Zotac AMP Extreme, which is almost dead silent.


Magic R15 chokes often cause more noticeable coil whine compared with the stock R22 chokes (http://www.overclock.net/t/1558645/official-nvidia-gtx-980-ti-owners-club/920)

Gigabyte is using AON6414A & AON6508 MOSFETs for Pascal & Polaris ("worse than the Founders Edition PCB")






GIGA G1 GAMING @ USD ~390

EVGA SuperClocked ACX 3.0 (Standard Silver/Black Edition) @ USD ~400

EVGA FTW @ USD ~446

That's the cheapest I can get in my country, and the price gap is around USD 10.


----------



## BroPhilip

Quote:


> Originally Posted by *gtbtk*
> 
> A 7700K will give you about 150-160% of the computing power of the 6600K. The upgrade can be a direct chip swap, assuming there is a Kaby Lake BIOS update for your motherboard.
> 
> You will certainly see a small improvement in frame rates with the 7700K overclocked to 4.9-5.0GHz in games that are CPU-bound or physics-intensive. While we are all excited about Ryzen, unfortunately we don't have any concrete knowledge of what the production Ryzen chip can actually do or how it will overclock. If it only overclocks to 4GHz, the Ryzen chip could well perform worse in gaming than the 7700K.
> 
> When the GPU drops to 75%, is the CPU or any of the cores running at 100%?


Hey man, it's good to see you still on here. I have been too busy with a newborn to be on much at all. My board is an Asus Z170-A and it already has the BIOS update for Kaby Lake. I am really struggling with this, as I have $500 for upgrades and want to make the best long-term investment. I would never have seen the bottleneck if not for a 144Hz monitor lol. I have seen some of the leaked benchmarks for Ryzen, and while everyone is looking at it beating an i7-6900K, the gaming benchmarks are showing 6700K-level performance; but that is a developer preview at lower clock speeds with no boost. The biggest thing I wish we had a crystal ball for is how developers are going to handle multithreading in future games. If the trend of 4-core usage stays in play for a while, I'll wait for an architecture overhaul, as Intel's processors have made little advance and it should be getting close to a cycle. If six cores become standard in the next generation, then I'll be kicking myself for not waiting. lol
Quote:


> Originally Posted by *zipper17*
> 
> Try searching YouTube for videos of people with an i7 and a 1070 playing the exact same games as you, and note their framerates. Compare them with yours, and you can judge whether your i5 + 1070 is bottlenecking a lot or not.


Thank you... I have looked, and the difference is around 10-15 frames, as it doesn't seem to scale much beyond 4 cores. With the CPU at 100%, The Division uses around 85-95% of it, with the system and everything else fighting for the rest. If available, it would eat all 4 cores. It literally sits at 100% from the moment it stops loading to the moment you log out. My main question is what the future holds and whether it will only get worse...

Quote:


> Originally Posted by *Quadrider10*
> 
> Same issue here with my 6600K at 4.6. Some CPU bottlenecks in some games, but I don't know if it justifies the extra $100 for, on average, 3fps across many games.


I agree, especially since it is a 6-month-old build... I wish I had sprung for the i7, but everything I read swore it would never bottleneck for years... lol, and people will still say we are crazy and it doesn't bottleneck.


----------



## BroPhilip

Quote:


> Originally Posted by *Gurkburk*
> 
> Your setup isnt getting bottlenecked anywhere.


WOW... that was neither helpful nor informative, just a simple declaration.

However, you are partly right. At 1440p, or limited to 60fps with vsync, there is no bottleneck at the CPU, as the load pushes the GPU more than the CPU. The issue comes when running 1080p at 144Hz, where the CPU instructions can outpace the GPU. (After going 144Hz I can't imagine going back to 60Hz, but for it to be good you need to run 80-144 fps.) The only place this causes problems is open-world games. While 60fps on ultra in open-world games such as GTA V or Assassin's Creed Syndicate is nothing to laugh at, in an open-world shooter like The Division it is important to have high frames and low latency in multiplayer situations. So when my CPU runs at 100% the entire game, and when I encounter NPCs (CPU-intensive) my GPU usage drops from 98% to 70% and my frames drop from 120 to 60-75fps, you don't call that a bottleneck? You can Google this issue and find that it is common among i5s but disappears with an i7, as does the stuttering from a maxed-out CPU.

Also, *ALL* computers bottleneck somewhere; it is the nature of electronics. No system is evenly matched across all parts!

I do appreciate your willingness to help, but arbitrary declarations are not helpful. If you don't believe I am having a CPU issue, perhaps you can enlighten us with another possibility or solution...


----------



## philhalo66

Quote:


> Originally Posted by *RyanRazer*
> 
> I had a 1070 G1 Gaming myself; it had coil whine like the one in the video. I replaced it with a Zotac AMP Extreme, which is almost dead silent.


That's not coil whine; that's a faulty fan. Coil whine is a high-pitched squealing noise.


----------



## Quadrider10

Quote:


> Originally Posted by *BroPhilip*
> 
> I agree, especially since it is a 6-month-old build... I wish I had sprung for the i7, but everything I read swore it would never bottleneck for years... lol, and people will still say we are crazy and it doesn't bottleneck.


***** I read exactly the same things from multiple reliable sources.


----------



## gtbtk

Quote:


> Originally Posted by *Quadrider10*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You could try an Asus Strix OC BIOS. That will also give you a core clock of 1633MHz and a power slider up to 120%. Both cards have the same 200W power limit, but remember that different cards have different VRM designs, so pay close attention to temps, etc.
> 
> 
> 
> I wanna do it, but I feel like it's way too risky. Are we ever going to see BIOS editors or custom BIOSes?

On my MSI Gaming X I have run just about every BIOS available to see what it did. The only BIOSes you should avoid are the Galax/KFA2 HOF BIOSes, which use different voltage controllers. I am currently running the Gaming Z BIOS on my card.

All the other cards use the same voltage controller, but the MOSFET/VRM implementation differs, so they will all work; some will be more in sync with your hardware than others and perform better, though. In my case, I found that the Asus Strix OC BIOS was well matched to the MSI hardware and performed well. Of course that may be different with a G1. I did discover that the Giga Extreme BIOS changes the display-port configuration compared to standard cards, and I could only get output from display port 3. I also discovered that the voltage controller itself configures the number of channels it uses; the BIOS just tells the controller to power up or power down, and the controller silicon is left to work it out itself.

As long as you have another GPU you can boot from (an iGPU makes it very easy), you can recover the card if you make a mistake and brick your 1070. Flashing won't kill the hardware; it can mess with the way the components communicate with each other in software because of the different brands' hardware designs (a different voltage controller, for example). To recover the card, you simply boot from the iGPU and reflash the Nvidia card with the default BIOS, and it will come back to life.

You have just as much information about a BIOS editor as anyone else.
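For the nervous: the recovery procedure is much less scary if you dump the stock BIOS first. Here's a minimal sketch of the backup-then-flash sequence, assuming the familiar nvflash CLI (`--save` to dump the current ROM, `-6` to acknowledge the subsystem-ID mismatch when cross-flashing another vendor's BIOS); treat the exact flags as something to double-check against your nvflash version:

```python
# Sketch: build the nvflash command lines for a safer cross-flash:
# back up the stock BIOS first, then flash the new ROM.
# Assumes nvflash is on the PATH and you run from an elevated prompt.
import subprocess


def backup_cmd(dest="stock_backup.rom"):
    """Command to dump the card's current BIOS to a file."""
    return ["nvflash", "--save", dest]


def flash_cmd(rom_path):
    # -6 overrides the PCI subsystem ID mismatch check when flashing
    # a BIOS built for a different board vendor (verify for your version).
    return ["nvflash", "-6", rom_path]


def cross_flash(rom_path, dry_run=True):
    """Print (dry run) or execute the backup + flash sequence."""
    for cmd in (backup_cmd(), flash_cmd(rom_path)):
        if dry_run:
            print(" ".join(cmd))
        else:
            subprocess.check_call(cmd)
```

With `dry_run=True` it only prints the commands, so you can sanity-check them before letting anything touch the card.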


----------



## Onlygear

Hi, I have found the issue: I plugged the GPU into the second PCIe slot and it worked like a charm. After cleaning the first PCIe slot, the GPU core runs fine at default frequency and is still stable up to 2240MHz; it looks like it was making bad contact : )
Regarding the Micron memory, no improvements: I can only add +340MHz in Afterburner or my PC crashes during the Valley benchmark and artifacts appear in Time Spy. GPU-Z shows the BIOS is updated to the last version available on the ASUS website, 86.04.50.00.XX.
My system specs are: Core i7-3770K @ 5GHz, Asus Maximus V Extreme, Corsair AX1200i 1200W 80 Plus Platinum modular, 16GB G.Skill RipjawsZ 2400MHz CL9.
Thanks for your advice. I'm going to run my PC at default speeds, check whether that adds some stability to the Micron memory, and update with the results.


----------



## gtbtk

Quote:


> Originally Posted by *Onlygear*
> 
> Hi, I have found the issue: I plugged the GPU into the second PCIe slot and it worked like a charm. After cleaning the first PCIe slot, the GPU core runs fine at default frequency and is still stable up to 2240MHz; it looks like it was making bad contact : )
> Regarding the Micron memory, no improvements: I can only add +340MHz in Afterburner or my PC crashes during the Valley benchmark and artifacts appear in Time Spy. GPU-Z shows the BIOS is updated to the last version available on the ASUS website, 86.04.50.00.XX.
> My system specs are: Core i7-3770K @ 5GHz, Asus Maximus V Extreme, Corsair AX1200i 1200W 80 Plus Platinum modular, 16GB G.Skill RipjawsZ 2400MHz CL9.
> Thanks for your advice. I'm going to run my PC at default speeds, check whether that adds some stability to the Micron memory, and update with the results.


Are you at 5GHz with 100 BCLK? It is possible that a higher BCLK reduces your PCIe/memory OC headroom. You might find that increasing your VCCIO and system agent voltages helps a bit. I am on Z68, running VCCIO at 1.15V, and I find it helps overclock stability.

What I have also found is that the cards operate in a performance envelope when you get to the ragged edge: high core clocks take away from high memory clocks. I have also found that 1070s love as much memory bandwidth as they can get. The 1080 seems to stop improving at 11GHz; given that we are using basically the same chip as a 1080, 8-9GHz is restrictive compared to what the chip can actually handle. I get better benchmark scores with the GPU at 2088MHz and memory at 9100+MHz than with core clocks above 2100MHz and a lower memory OC.
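That "memory hole" behavior (scores improving, then quietly dropping past some offset even though nothing crashes, likely GDDR5 error detection silently retrying transfers) can be hunted with a simple sweep. A minimal sketch, where `apply_mem_offset` and `run_benchmark` are placeholder hooks you would wire to your own tools (e.g. Afterburner or nvidia-settings plus a Heaven run):

```python
# Sketch: sweep memory offsets in fixed steps and report the offset
# that gave the best benchmark score. Past the "memory hole", the card
# stays "stable" but the score drops, so best-score beats highest-offset.
# `apply_mem_offset` and `run_benchmark` are placeholders: wire them to
# your own overclocking tool and benchmark of choice.

def find_best_offset(apply_mem_offset, run_benchmark,
                     start=0, stop=700, step=50):
    """Return (best_offset, best_score) over the sweep range."""
    best_offset, best_score = start, float("-inf")
    for offset in range(start, stop + 1, step):
        apply_mem_offset(offset)
        score = run_benchmark()
        print(f"+{offset} MHz -> score {score}")
        if score > best_score:
            best_offset, best_score = offset, score
    return best_offset, best_score
```

The point of keeping the best score rather than the highest "stable" offset is exactly the hole: +500 or +600 may run without artifacts yet score lower than +400.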


----------



## Onlygear

Quote:


> Originally Posted by *gtbtk*
> 
> Are you at 5GHz with 100 BCLK? It is possible that a higher BCLK reduces your PCIe/memory OC headroom. You might find that increasing your VCCIO and system agent voltages helps a bit. I am on Z68, running VCCIO at 1.15V, and I find it helps overclock stability.
> 
> What I have also found is that the cards operate in a performance envelope when you get to the ragged edge: high core clocks take away from high memory clocks. I have also found that 1070s love as much memory bandwidth as they can get. The 1080 seems to stop improving at 11GHz; given that we are using basically the same chip as a 1080, 8-9GHz is restrictive compared to what the chip can actually handle. I get better benchmark scores with the GPU at 2088MHz and memory at 9100+MHz than with core clocks above 2100MHz and a lower memory OC.


Hi there, yep, 5GHz with 100 BCLK; I have been using default settings on my motherboard for a while. I also raised VCCIO from 1.2 to 1.5V and VCCSA to 1.1V, and underclocked my RAM modules to 1600MHz; I also removed two sticks and ran with the other two to make sure it is not a problem related to RAM voltage. Unfortunately, in my case GPU core speed doesn't have any impact on memory stability, and the GPU was crashing at the same point during all tests. I am starting to believe the Micron memory doesn't get enough voltage for a higher OC under heavy loads (VRM temperatures are ridiculously low, 62 degrees), and the only way to fix this is going to be cooking a custom BIOS or waiting for another official release...


----------



## BroPhilip

Quote:


> Originally Posted by *Quadrider10*
> 
> ***** I read exactly the same things from multiple reliable sources.


Now it is listed as the minimum spec for Battlefield 1 lol. The irony.


----------



## gtbtk

Quote:


> Originally Posted by *Onlygear*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Are you at 5GHz with 100 BCLK? It is possible that a higher BCLK reduces your PCIe/memory OC headroom. You might find that increasing your VCCIO and system agent voltages helps a bit. I am on Z68, running VCCIO at 1.15V, and I find it helps overclock stability.
> 
> What I have also found is that the cards operate in a performance envelope when you get to the ragged edge: high core clocks take away from high memory clocks. I have also found that 1070s love as much memory bandwidth as they can get. The 1080 seems to stop improving at 11GHz; given that we are using basically the same chip as a 1080, 8-9GHz is restrictive compared to what the chip can actually handle. I get better benchmark scores with the GPU at 2088MHz and memory at 9100+MHz than with core clocks above 2100MHz and a lower memory OC.
> 
> 
> 
> Hi there, yep, 5GHz with 100 BCLK; I have been using default settings on my motherboard for a while. I also raised VCCIO from 1.2 to 1.5V and VCCSA to 1.1V, and underclocked my RAM modules to 1600MHz; I also removed two sticks and ran with the other two to make sure it is not a problem related to RAM voltage. Unfortunately, in my case GPU core speed doesn't have any impact on memory stability, and the GPU was crashing at the same point during all tests. I am starting to believe the Micron memory doesn't get enough voltage for a higher OC under heavy loads (VRM temperatures are ridiculously low, 62 degrees), and the only way to fix this is going to be cooking a custom BIOS or waiting for another official release...

What model 1070 are you running?

I found a post a while back, while I was looking to fiddle with my motherboard voltages for better performance, where a guy claimed he found a performance boost by upping the PCH voltage. I'm not sure how that could work, as the GPU doesn't use the PCH, but I guess there are connections from the PCH back to the CPU that the voltage must have some effect on.

Maybe that is worth a try?


----------



## icold

After many days of testing my GPU in game, I also noticed instability in my ****ty Micron memory at 4361MHz; I dropped to 4333MHz and now it looks perfectly stable. You get half the OC with Micron relative to Samsung, and my BIOS is updated.


----------



## c0nsistent

Quote:


> Originally Posted by *icold*
> 
> After many days of testing my GPU in game, I also noticed instability in my ****ty Micron memory at 4361MHz; I dropped to 4333MHz and now it looks perfectly stable. You get half the OC with Micron relative to Samsung, and my BIOS is updated.


I'm able to do 4475 with my Micron but no more than that... nothing like the +700-800 the Samsung users are achieving.


----------



## BroPhilip

Quote:


> Originally Posted by *c0nsistent*
> 
> I'm able to do 4475 with my Micron but no more than that... nothing like the +700-800 the Samsung users are achieving.


Exact same thing here....msi gaming z with micron memory


----------



## vfrmaverick

Quote:


> Originally Posted by *gtbtk*
> 
> You could try an Asus Strix OC BIOS. That will also give you a core clock of 1633MHz and a power slider up to 120%. Both cards have the same 200W power limit, but remember that different cards have different VRM designs, so pay close attention to temps, etc.


I did this on my EVGA SC and it worked fine. I got a slight bump to 2108, which would throttle to 2079 with the power maxed out. Then I installed the updated EVGA BIOS for the pad fix and applied the fix myself; now it's stable and rock solid at 2164 and doesn't power- or temp-throttle. And I did get that overclock with just the BIOS while the pads arrived in the mail.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> After many days of testing my GPU in game, I also noticed instability in my ****ty Micron memory at 4361MHz; I dropped to 4333MHz and now it looks perfectly stable. You get half the OC with Micron relative to Samsung, and my BIOS is updated.


My Micron memory starts to artifact at about 4580-4600MHz.

Maybe you are a silicon lottery loser? Don't feel too bad; the high OCs with Samsung memory are not universal, and there are Samsung cards that can only reach what you are getting too. Many users have also reported that performance drops off with Samsung overclocks much above 4500.


----------



## icold

Quote:


> Originally Posted by *gtbtk*
> 
> My Micron memory starts to artifact at about 4580-4600MHz.
> 
> Maybe you are a silicon lottery loser? Don't feel too bad; the high OCs with Samsung memory are not universal, and there are Samsung cards that can only reach what you are getting too. Many users have also reported that performance drops off with Samsung overclocks much above 4500.


It's not artifacting, it's a freeze.


----------



## Quadrider10

Quote:


> Originally Posted by *BroPhilip*
> 
> Now it is listed as the minimum spec for Battlefield 1 lol. The irony.


Lol idk, still not worth the upgrade. Maybe from an i5-6600K to an i7-7700K; otherwise I don't see it. But in a lot of my games I see 50+% usage.


----------



## Onlygear

Quote:


> Originally Posted by *Onlygear*
> 
> Hi there, yep, 5GHz with 100 BCLK; I have been using default settings on my motherboard for a while. I also raised VCCIO from 1.2 to 1.5V and VCCSA to 1.1V, and underclocked my RAM modules to 1600MHz; I also removed two sticks and ran with the other two to make sure it is not a problem related to RAM voltage. Unfortunately, in my case GPU core speed doesn't have any impact on memory stability, and the GPU was crashing at the same point during all tests. I am starting to believe the Micron memory doesn't get enough voltage for a higher OC under heavy loads (VRM temperatures are ridiculously low, 62 degrees), and the only way to fix this is going to be cooking a custom BIOS or waiting for another official release...


Quote:


> Originally Posted by *gtbtk*
> 
> What model 1070 are you running?
> 
> I found a post a while back, while I was looking to fiddle with my motherboard voltages for better performance, where a guy claimed he found a performance boost by upping the PCH voltage. I'm not sure how that could work, as the GPU doesn't use the PCH, but I guess there are connections from the PCH back to the CPU that the voltage must have some effect on.
> 
> Maybe that is worth a try?


Hello!! I was looking into this, and I left the PCH at default values (1.053V) for all overclocking; I have not observed any relationship between this voltage rail and GPU stability in my tests to date. By the way, I have the Asus ROG Strix GeForce GTX 1070 Gaming OC 8GB GDDR5, P/N: 90YV09N0-M0NA00.
I have also observed that some games are more unstable than benchmarks, such as Shadow Warrior 2, so my real stable core with maximum performance is 2140MHz; if I push more than this I get worse results in the DX12 bench and some random crashes in GTA V and Shadow Warrior 2.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> My Micron memory starts to artifact at about 4580-4600MHz.
> 
> Maybe you are a silicon lottery loser? Don't feel too bad; the high OCs with Samsung memory are not universal, and there are Samsung cards that can only reach what you are getting too. Many users have also reported that performance drops off with Samsung overclocks much above 4500.
> 
> 
> 
> It's not artifacting, it's a freeze.

I get that about 1 in 3 benchmark runs if I don't have enough vcore and the VCCIO voltage is increased above the default.


----------



## gtbtk

Quote:


> Originally Posted by *Onlygear*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Onlygear*
> 
> Hi there, yep, 5GHz with 100 BCLK; I have been using default settings on my motherboard for a while. I also raised VCCIO from 1.2 to 1.5V and VCCSA to 1.1V, and underclocked my RAM modules to 1600MHz; I also removed two sticks and ran with the other two to make sure it is not a problem related to RAM voltage. Unfortunately, in my case GPU core speed doesn't have any impact on memory stability, and the GPU was crashing at the same point during all tests. I am starting to believe the Micron memory doesn't get enough voltage for a higher OC under heavy loads (VRM temperatures are ridiculously low, 62 degrees), and the only way to fix this is going to be cooking a custom BIOS or waiting for another official release...
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> what model 1070 are you running?
> 
> I found a post somewhere a while back, while I was looking to fiddle with my motherboard voltages for better performance, where a guy claimed he found a performance boost by upping the PCH voltage. Not sure how that could work, as the GPU doesn't use the PCH, but I guess there are connections from the PCH back to the CPU that the voltage must have some effect on.
> 
> Maybe that is worth a try?
> 
> Click to expand...
> 
> Hello!! I was looking into this and I left PCH at its default value (1.053V) for all overclocking; I have not observed any relationship between this voltage rail and GPU stability in my tests to date. By the way, I have the Asus ROG Strix GeForce GTX 1070 Gaming OC 8GB GDDR5, P/N: 90YV09N0-M0NA00.
> I have also observed that some games, such as Shadow Warrior 2, are less stable than benchmarks, so my real stable core with maximum performance is 2140MHz; if I push beyond this I get worse results in the DX12 benchmark and random crashes in GTA V and Shadow Warrior 2.
Click to expand...

That card is a good one. The Asus OC BIOS matches my MSI Gaming X hardware rather well.

All of these cards are slightly different; we are only going to fine-tune things through trial and error until we find something that works for the individual PC and graphics card. I am of the opinion that bouncing ideas around like this is ultimately helpful.

I know most sites say to leave the PCH alone, but I found a forum post that claimed an increase in stability with a PCH voltage increase, so I tried it. While not a miracle OC-to-the-moon fix, it seems to help a bit, though I cannot explain why: the GPU's PCIe lanes are directly connected to the CPU, whereas the PCH is connected via the DMI bus.

I agree that not every application places an equal load on overclocked graphics cards. Heaven and Valley will take higher overclocks than Firestrike, which will take a higher memory OC than Time Spy. ROTR gives me big black squares over the screen if I use an OC setting that won't pass Time Spy.


----------



## Onlygear

Quote:


> Originally Posted by *Quadrider10*
> 
> ***** I read exactly the same things from multiple reliable sources.


Quote:


> Originally Posted by *Quadrider10*
> 
> Same issue here with my 6600K at 4.6. There are some CPU bottlenecks in some games, but I don't know if that justifies the extra $100 for an average of 3fps across many games.


In my opinion, don't worry so much about this; you are probably losing around 3-4fps on average. I did some tests a few months ago with a 6700K, 3770K, and 6600K overclocked to the maximum, and I'm still on my "old" Core i7 3770K. Using a GTX 980 Ti, this is what I found:

-Test 1080p: when you are getting a high framerate in a game, more than 140fps, you can find big differences between them, around 20fps+ in favour of the 6700K.
When you are getting around 100fps, the differences are smaller, usually no more than 10fps; it also depends on the OC, and in some games there is no difference.
Other games, such as Fallout 4, don't really care about the processor and depend largely on the RAM: you can lose around 30fps with RAM at 1600MHz and gain 30fps with your RAM at 2400MHz.

-Test 1440p: clearly the GPU is the limiting factor here; repeating the same tests, the differences were around 1fps between the 3770K, 6700K, and 6600K.


----------



## Quadrider10

Quote:


> Originally Posted by *Onlygear*
> 
> In my opinion, don't worry so much about this; you are probably losing around 3-4fps on average. I did some tests a few months ago with a 6700K, 3770K, and 6600K overclocked to the maximum, and I'm still on my "old" Core i7 3770K. Using a GTX 980 Ti, this is what I found:
> 
> -Test 1080p: when you are getting a high framerate in a game, more than 140fps, you can find big differences between them, around 20fps+ in favour of the 6700K.
> When you are getting around 100fps, the differences are smaller, usually no more than 10fps; it also depends on the OC, and in some games there is no difference.
> Other games, such as Fallout 4, don't really care about the processor and depend largely on the RAM: you can lose around 30fps with RAM at 1600MHz and gain 30fps with your RAM at 2400MHz.
> 
> -Test 1440p: clearly the GPU is the limiting factor here; repeating the same tests, the differences were around 1fps between the 3770K, 6700K, and 6600K.


Interesting. I run my games at 1080p, but if there are resolution-scaling options I'll use those to increase it. Really, I only notice the difference when things get heavy on the CPU, such as physics or smoke. We'll see how things go, especially with the new games coming out in 2017. My next step would be a 1440p monitor and a GTX 1080, but I have a long way to go until I can justify that, and by then new CPUs and GPUs will be out. So in reality my next step would be to OC to 4.8GHz if games demand it. Honestly, I feel my 6600K is the bottleneck in pretty much everything.

I only have a 60Hz monitor, so I set an FPS limiter to 60fps and never see frames over that. It keeps things cooler and is less work for the components.

My RAM is at 3200MHz, so that definitely isn't a bottleneck.


----------



## icold

Quote:


> Originally Posted by *gtbtk*
> 
> My micron memory starts to artifact at about 4580-4600mhz.
> 
> Maybe you are a silicon lottery loser? Don't feel too bad, the high OCs with Samsung memory are not universal, there are Samsung cards that can only reach what you are getting too. Many users have also reported that performance drops off with samsung overclocks much above 4500.


I know the memory clock matters little for FPS gains, but it's still frustrating. I also had a bad experience with a GTX 780 DCUII with Elpida VRAM. That was worse; it couldn't go up even 5MHz without freezing.


----------



## Exenth

Just tried to undervolt my Palit 1070 JetStream and here are the results.
It is stable in every benchmark I throw at it except 3DMark, but that even crashes without an OC, so I don't know what's wrong there.

In some games it stays at 2050MHz, but in Heaven and Valley it drops to 2025MHz.


----------



## R432

Quote:


> Originally Posted by *R432*
> 
> Got a really bad MSI GTX 1070 Gaming X; it only goes 2000 core and 8500 memory (Micron + stock voltage) even with the BIOS update... meh.
> 
> Should I return it and try for better?


Quoting my own post.

In the end, the Micron-chip card was stable at 1960/8400 (after the BIOS update), which was horribly bad.

Gladly it only cost effort, not money, so I ordered another and returned that one. It paid off, since I got a card with Samsung memory that runs 2050/9300 stable.


----------



## computerfreak09

Been wanting to see if I can OC my GPU to its maximum performance (it's an ASUS GTX 1070 Strix). Flashed my BIOS with the OC version; it took it well and is stable. Did some minor OC'ing, still stable. Clocks right now (per GPU-Z): GPU clock 1706MHz, memory clock 2102MHz, boost clock 1908MHz. Those are some strange numbers, but that's what GPU-Z tells me, lol.

Also realized I had Micron memory. RIP memory OC'ing; no wonder I couldn't do much with it in the first place, even with the fix from ASUS.


----------



## zipper17

Quote:


> Originally Posted by *Onlygear*
> 
> In my opinion, don't worry so much about this probably you are loosing around 3-4fps average. I did some test few months ago with 6700k,3770k and 6600k overclocked to maximum, and I still with my ''old'' core i7 3770k. Using a GTX 980 ti this is what I found:
> 
> -Test 1080p- when you are getting a high amount of fps on game, more than 140 you can find big differences between them around 20fps+ on 6700k
> When you are getting around 100 fps differences are smaller no more than 10 usually also depends the OC , in some game no differences.
> Other games as Fallout 4 don't really take care about processor and depends largely on the ram, you can lost around 30fps with ram to 1600MHz and get 30 fps more with your ram at 2400MHz.
> 
> -Test 1440p: Clearly the gpu is the delimiting factor here, repeating the same tests differences was around 1fps between 3770k-6700k-6600k.


Quote:


> Originally Posted by *Quadrider10*
> 
> Interesting. I run my games at 1080p, but if there are resolution-scaling options I'll use those to increase it. Really, I only notice the difference when things get heavy on the CPU, such as physics or smoke. We'll see how things go, especially with the new games coming out in 2017. My next step would be a 1440p monitor and a GTX 1080, but I have a long way to go until I can justify that, and by then new CPUs and GPUs will be out. So in reality my next step would be to OC to 4.8GHz if games demand it. Honestly, I feel my 6600K is the bottleneck in pretty much everything.
> 
> I only have a 60Hz monitor, so I set an FPS limiter to 60fps and never see frames over that. It keeps things cooler and is less work for the components.
> 
> My RAM is at 3200MHz, so that definitely isn't a bottleneck.


Generations before Sandy Bridge (or worse) are where bottlenecks are most severe.

With any i5/i7 K series from Sandy Bridge to the latest generation, bottlenecks should no longer be severe.

I remember my PC bottlenecking heavily with an Athlon II X2 245, GTX 560 Ti, and 2GB of DDR2. It lagged in heavy games back in the 2011 era; that's what a real bottleneck feels like. My oldest PC ran a Pentium 4, lol, that's even worse than anything.

If you can still maintain a good framerate, 60FPS+ without lagging, games will still feel smooth even with the CPU bottlenecking by a small margin.

You would probably need Intel's high-end desktop platform to avoid bottlenecks at all costs. But the technology never stops; it always gets better. Intel HEDT has 40 lanes, which makes an SLI setup even better with a full 16x/16x (32 lanes).


----------



## icold

Quote:


> Originally Posted by *R432*
> 
> Quoting my own post.
> 
> At the end Micron chip card was stable 1960/8400 (after bios update) which was horribly bad.
> 
> Gladly it only costed effort not money so i ordered other and returned that and it succeeded since i got card with samsung memory which went 2050/9300 stable.


Micron is terrible, like Elpida. Samsung makes awesome memory chips. Sad also for those who have a GTX 1080, which only comes with Micron chips.


----------



## ucode

Quote:


> Originally Posted by *icold*
> 
> Sad also for those who have gtx 1080 that has only micron chip


Err, for 1080 owners that's everyone.


----------



## DeathAngel74

I thought micron=elpida....Maybe I'm wrong...


----------



## syl1979

Quote:


> Originally Posted by *icold*
> 
> Micron is terrible, like Elpida. Samsung makes awesome memory chips. Sad also for those who have a GTX 1080, which only comes with Micron chips.


Micron is the only manufacturer of the GDDR5X, the fastest memory, that is found on the GTX 1080...


----------



## ucode

Quote:


> Originally Posted by *gtbtk*
> 
> That table was very interesting. Do you know what he used to extract and convert the values?


Not offhand, but they appear to just be offsets in the VBIOS, so it probably needs a little more work to be universal. Since no one is requesting Hulk certs from the manufacturers (AFAIK), there's little point taking it further, I guess, because without a Hulk cert no one will be able to run their modified VBIOS. Hulk certs are required for applying "tweaked" VBIOS images. Request files can be generated using nvflash.


----------



## OxygeenHD

Hi guys, I recently upgraded my setup to Skylake's 6700K, redoing my entire system cooling; I bought a Phanteks PH-TC14PE with four PH-F140SPs.

I was wondering which 1070 is "the best" between the Gigabyte G1 (3 fans, 474€), the EVGA SC (489€) or FTW (499€), and the ASUS Strix (470€). Given that the 1080Ti is confirmed, is it worth upgrading now, and for which card? Or should I wait till the 1080Ti comes out and jump on a price drop?


----------



## khanmein

Quote:


> Originally Posted by *OxygeenHD*
> 
> Hi guys, I recently upgraded my setup to Skylake's 6700K, redoing my entire system cooling; I bought a Phanteks PH-TC14PE with four PH-F140SPs.
> 
> I was wondering which 1070 is "the best" between the Gigabyte G1 (3 fans, 474€), the EVGA SC (489€) or FTW (499€), and the ASUS Strix (470€). Given that the 1080Ti is confirmed, is it worth upgrading now, and for which card? Or should I wait till the 1080Ti comes out and jump on a price drop?


The Strix is the best, but where I live ASUS is way more expensive than the FTW, so I ended up with the SC. I didn't pick the Gigabyte due to coil whine and fan noise issues, but the Gigabyte is confirmed to come with Samsung VRAM chips.

The latest batch of SC cards came with the latest BIOS, and the core clock can hit 1999MHz by default without touching anything.


----------



## asdkj1740

Quote:


> Originally Posted by *OxygeenHD*
> 
> Hi guys, I recently upgraded my setup to Skylake's 6700K, redoing my entire system cooling; I bought a Phanteks PH-TC14PE with four PH-F140SPs.
> 
> I was wondering which 1070 is "the best" between the Gigabyte G1 (3 fans, 474€), the EVGA SC (489€) or FTW (499€), and the ASUS Strix (470€). Given that the 1080Ti is confirmed, is it worth upgrading now, and for which card? Or should I wait till the 1080Ti comes out and jump on a price drop?


Don't buy the EVGA ACX anymore; go for the Hybrid or wait for ICX/ACX2, although I don't think ICX is good.

The Asus Strix temps look good, but if you also look at the fan RPM and noise, you can see the Strix's low temps rely on high fan speeds and noise. The Gigabyte G1 is the same as the Asus Strix here.
ACX 3.0 is bad, especially on the SC version, which has a downgraded ACX 3.0 design, but I don't think the FTW's ACX 3.0 is good either.

Buying a 1070 is fine now; the 1080Ti should launch at Computex in June to compete with Vega 10.

Just remember one thing: Pascal GPUs need to run super cool. The MSI Gaming series and the Palit JetStream/GameRock or Gainward Phoenix are a few cards with good stock cooling.
The Gigabyte Xtreme and Zotac AMP Extreme are good too, but those two are not cheap.
Avoid the MSI Seahawk; go for the EVGA Hybrid if you have a ~400 USD budget.

Of course, if you don't care about fan noise and you have a ~20C ambient temp 24/7, then don't give a **** about my two cents above.


----------



## OxygeenHD

Quote:


> Originally Posted by *asdkj1740*
> 
> Don't buy the EVGA ACX anymore; go for the Hybrid or wait for ICX/ACX2, although I don't think ICX is good.
> 
> The Asus Strix temps look good, but if you also look at the fan RPM and noise, you can see the Strix's low temps rely on high fan speeds and noise. The Gigabyte G1 is the same as the Asus Strix here.
> ACX 3.0 is bad, especially on the SC version, which has a downgraded ACX 3.0 design, but I don't think the FTW's ACX 3.0 is good either.
> 
> Buying a 1070 is fine now; the 1080Ti should launch at Computex in June to compete with Vega 10.
> 
> Go for the EVGA Hybrid if you have a ~400 USD budget.
> 
> Of course, if you don't care about fan noise and you have a ~20C ambient temp 24/7, then don't give a **** about my two cents above.


I care A LOT about fan noise; that's why I'm upgrading my cooling fans and heatsink. ^^

I wish I could go Hybrid, but in France the 1070 Hybrid is 600€, which is a lot more than in the US and a lot more than other 1070s here in France.


----------



## gtbtk

Quote:


> Originally Posted by *OxygeenHD*
> 
> Quote:
> 
> 
> 
> Originally Posted by *asdkj1740*
> 
> Don't buy the EVGA ACX anymore; go for the Hybrid or wait for ICX/ACX2, although I don't think ICX is good.
> 
> The Asus Strix temps look good, but if you also look at the fan RPM and noise, you can see the Strix's low temps rely on high fan speeds and noise. The Gigabyte G1 is the same as the Asus Strix here.
> ACX 3.0 is bad, especially on the SC version, which has a downgraded ACX 3.0 design, but I don't think the FTW's ACX 3.0 is good either.
> 
> Buying a 1070 is fine now; the 1080Ti should launch at Computex in June to compete with Vega 10.
> 
> Go for the EVGA Hybrid if you have a ~400 USD budget.
> 
> Of course, if you don't care about fan noise and you have a ~20C ambient temp 24/7, then don't give a **** about my two cents above.
> 
> 
> 
> I care A LOT about fan noise; that's why I'm upgrading my cooling fans and heatsink. ^^
> 
> I wish I could go Hybrid, but in France the 1070 Hybrid is 600€, which is a lot more than in the US and a lot more than other 1070s here in France.
Click to expand...

If you care about noise and like low temps with good performance on air, then look at the MSI Gaming X/Quicksilver cards (same card, just a different colour fan shroud and backplate).

The Gigabyte G1 has quality issues with the fans hitting the cooler. EVGA hits power limits and power-throttles easily. Asus Strix cards are pretty good but not as quiet as the MSI cards.


----------



## khanmein

Quote:


> Originally Posted by *OxygeenHD*
> 
> I care A LOT about fan noise; that's why I'm upgrading my cooling fans and heatsink. ^^
> 
> I wish I could go Hybrid, but in France the 1070 Hybrid is 600€, which is a lot more than in the US and a lot more than other 1070s here in France.


MSI, Asus & EVGA fans are pretty quiet; I'm very sensitive to noise. Giga & Zotac are pretty noisy.

Get the cheapest!


----------



## asdkj1740

Quote:


> Originally Posted by *OxygeenHD*
> 
> I care A LOT about fan noise; that's why I'm upgrading my cooling fans and heatsink. ^^
> 
> I wish I could go Hybrid, but in France the 1070 Hybrid is 600€, which is a lot more than in the US and a lot more than other 1070s here in France.


EVGA has global RMA; you may consider buying it from US Amazon.
Palit and Gainward are good too.


----------



## icold

Quote:


> Originally Posted by *DeathAngel74*
> 
> I thought micron=elpida....Maybe I'm wrong...


My old GTX 780 DCUII had Elpida VRAM and it was terrible; I had to run it at stock. If you increased it even 5MHz the GPU would freeze. It's unbelievable for VRAM to take no OC at all. Elpida/Micron make terrible chips.


----------



## RyanRazer

Quote:


> Originally Posted by *khanmein*
> 
> MSI, Asus & EVGA fans are pretty quiet; I'm very sensitive to noise. Giga & Zotac are pretty noisy.
> 
> Get the cheapest!


Hmm, my Zotac AMP Extreme is pretty quiet. I've had a Giga G1 Gaming, which was way louder. Can't say about MSI and Asus, but that gigantic heatsink on the Zotac is pretty good at dissipating heat.


----------



## Nebulous

Quote:


> Originally Posted by *RyanRazer*
> 
> Hmm, my Zotac AMP Extreme is pretty quiet. I've had a Giga G1 Gaming, which was way louder. Can't say about MSI and Asus, but that gigantic heatsink on the Zotac is pretty good at dissipating heat.


I agree! Mine is extremely quiet even with an aggressive fan curve. That massive sink sure does a really good job and looks good doing it


----------



## SalamiBoi69

Hey, what model did you get? I have the Gaming X and have OCed to 2.1GHz, with a boost of 1710MHz.


----------



## duganator

Quote:


> Originally Posted by *Nebulous*
> 
> I agree! Mine is extremely quiet even with an aggressive fan curve. That massive sink sure does a really good job and looks good doing it


Do you have the AMP Extreme? My AMP card gets really hot, but I guess it stays fairly quiet. I see 70C+ under load pretty regularly.


----------



## assface

Any known brands that still use Samsung chips instead of Micron? Or any known stock that carries a batch of 1070s with Samsung chips? I really want a 1070 with good OC abilities.


----------



## cyronn

I'm loving my Palit GTX 1070 JetStream; it's really quiet, and it handles a mid overclock well. 2050/8192 is the current OC, and temps are below 60C as well.


----------



## khanmein

Quote:


> Originally Posted by *assface*
> 
> Any known brands that still use Samsung chips instead of Micron? Or any known stock that carry a batch of 1070s with Samsung chips? I really want a 1070 with good OC abilities.


Grab a Giga; you've got a higher chance of receiving Samsung, because the rest are Micron, aka Elpida.


----------



## g-lad21

Hey guys, moving from my Sapphire R9 Fury to a 1070. Can someone recommend a quiet one with some fancy LEDs? Not too many, but controllable like the Strix. I heard MSI cards are very quiet, but do they have LEDs? Thanks!

EDIT: Also, do we know which cards have coil whine? Thanks for any help.


----------



## blued

Quote:


> Originally Posted by *RyanRazer*
> 
> Hmm, my Zotac AMP Extreme is pretty quiet. I've had a Giga G1 Gaming, which was way louder. Can't say about MSI and Asus, but that gigantic heatsink on the Zotac is pretty good at dissipating heat.


Same here. Max 70C and the fan is very quiet. Very happy with my Zotac AMP. Glad I went for it instead of EVGA, who are known to be cheap with parts and build quality.


----------



## MEC-777

Quote:


> Originally Posted by *assface*
> 
> Any known brands that still use Samsung chips instead of Micron? Or any known stock that carry a batch of 1070s with Samsung chips? I really want a 1070 with good OC abilities.


Quote:


> Originally Posted by *khanmein*
> 
> Grab a Giga; you've got a higher chance of receiving Samsung, because the rest are Micron, aka Elpida.


Nope. My Zotac FE is Samsung.

It's kind of the luck of the draw, but even if you get a card with Micron, you just have to flash it to the latest BIOS and it should be fine. Micron should OC about as well as Samsung, give or take.
Quote:


> Originally Posted by *g-lad21*
> 
> Hey guys, moving from my Sapphire R9 Fury to a 1070. Can someone recommend a quiet one with some fancy LEDs? Not too many, but controllable like the Strix. I heard MSI cards are very quiet, but do they have LEDs? Thanks!
> 
> EDIT: Also, do we know which cards have coil whine? Thanks for any help.


Why not get the Strix then?


----------



## g-lad21

Quote:


> Originally Posted by *MEC-777*
> 
> Why not get the Strix then?


I FEAR COIL WHINE! It's like the most important thing, because the case is near me and I like staring at it!
Please help me.


----------



## icold

GTX 1070 ROG @ 2126MHz. At least the chip went up reasonably. Memory maxed at 4363MHz ¬¬. Whoever here has Micron VRAM, post your OCs please. We really need a Pascal BIOS tweaker.


----------



## khanmein

Quote:


> Originally Posted by *MEC-777*
> 
> Nope. My Zotac FE is Samsung.
> 
> It's kind of the luck of the draw, but even if you get a card with Micron, you just have to flash it to the latest BIOS and it should be fine. Micron should OC as well as Samsung +/-.
> Why not get the Strix then?


Mostly the FE came with Samsung, but the latest batch might be Micron. I had no luck with Zotac, MSI & Giga.


----------



## DeathAngel74

Quote:


> Originally Posted by *icold*
> 
> GTX 1070 ROG @ 2126MHz. At least the chip went up reasonably. Memory maxed at 4363MHz ¬¬. Whoever here has Micron VRAM, post your OCs please. We really need a Pascal BIOS tweaker.


eVGA GTX 1070 SC(08G-P4-6173-KB):
2101/4363- SW:BF 2015 (24/7 usage), dips down to 2088-2050 during TW3 and BM:AK


----------



## Spin Cykle

Quote:


> GTX 1070 ROG @ 2126MHz. At least the chip went up reasonably. Memory maxed at 4363MHz ¬¬. Whoever here has Micron VRAM, post your OCs please. We really need a Pascal BIOS tweaker.


I have Micron VRAM chips on my EVGA 1070 SC Black Edition and it will do +550 during gaming with no artifacts at all. The core is at +125, stable in 3DMark and gaming. Air-cooled with a pretty aggressive fan curve, the gaming core boosts to 2113MHz stable. Temps around 65-70C.
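For anyone puzzling over how offsets like "+550" relate to the memory clocks quoted around the thread, here is a rough sketch of the arithmetic. It is my own illustration, under two assumptions not stated in the posts: the overclocking tool adds its MHz offset to the double-data-rate clock (roughly 4004MHz stock on a 1070), and the "effective" rate quoted in reviews is twice that figure.

```python
# Sketch: how a GDDR5 memory-offset number maps to reported clocks.
# Assumptions (not from this thread): the OC tool applies its MHz offset
# to the double-data-rate clock (~4004 MHz stock on a GTX 1070), and the
# "effective" rate quoted in reviews is twice that figure.

STOCK_DDR_MHZ = 4004  # reference GTX 1070; real stock clocks vary slightly

def memory_clocks(offset_mhz, stock_ddr=STOCK_DDR_MHZ):
    """Return (double-data-rate clock, effective data rate) in MHz."""
    ddr = stock_ddr + offset_mhz
    return ddr, ddr * 2

print(memory_clocks(550))  # +550 offset -> (4554, 9108)
print(memory_clocks(600))  # +600 offset -> (4604, 9208)
```

This is why a +600 offset gets reported as roughly 4600MHz, or 9200MHz effective.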


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> eVGA GTX 1070 SC(08G-P4-6173-KB):
> 2101/4363- SW:BF 2015 (24/7 usage), dips down to 2088-2050 during TW3 and BM:AK


> Nice one, but Micron still can't achieve 9216MHz like Samsung: http://www.guru3d.com/articles_pages/evga_geforce_gtx_1070_sc_superclocked_gaming_review,29.html

> I suggest trying out the .72 BIOS & the GeForce hotfix driver version 376.60: http://nvidia.custhelp.com/app/answers/detail/a_id/4293


----------



## Avendor

According to Guru3D, all charts were updated. Guys, is this legit? There's a big gap between the 1070 and 980 Ti. What do you think?


----------



## shadowrain

All this Micron mumbo jumbo again. It's already [current year]! Fact: not all Samsungs OC high either. There are a lot of well-documented Samsung cards in this thread with lower OCs than the BIOS-updated Microns. If it doesn't perform at stock/OOTB clocks, RMA. If it does, it's working as intended. Extra OC has always been the "silicon lottery."

All this talk makes me think MyNewRig or his alt account is back here again, even after gtbtk and Nvidia already debunked the Micron issues with the BIOS updates.

If you still want Samsung, all Zotac AMP and AMP Extremes in the Philippines are Samsung to this day, thanks to ignorance of the fact that this generation Zotac has equalled or eclipsed the other brands.


----------



## icold

Quote:


> Originally Posted by *khanmein*
> 
> nice one but micron still can't achieve 9216 MHz like samsung http://www.guru3d.com/articles_pages/evga_geforce_gtx_1070_sc_superclocked_gaming_review,29.html
> 
> i suggest try out .72 bios & GeForce Hot Fix driver version 376.60 http://nvidia.custhelp.com/app/answers/detail/a_id/4293


There is no .72 BIOS for the Strix.


----------



## Onlygear

Quote:


> Originally Posted by *icold*
> 
> For strix has no .72 bios


Yeah, same for me; Asus is still stuck on 86.04.50.00.63 for the Strix. I'm going to install the Zotac AMP! Extreme BIOS 86.04.50.00.98 to check how the Micron memory works... fingers crossed : )

Update: Bricked. After re-flashing my BIOS backup, all is working fine. As I expected, the GPU survived without damage, since the power delivery is lower on the Zotac BIOS.


----------



## zipper17

Quote:


> Originally Posted by *Spin Cykle*
> 
> I have Micron VRAM chips on my EVGA 1070 SC Black Edition and it will do +550 during gaming with no artifacts at all. The core is at +125, stable in 3DMark and gaming. Air-cooled with a pretty aggressive fan curve, the gaming core boosts to 2113MHz stable. Temps around 65-70C.


Quote:


> Originally Posted by *icold*
> 
> GTX 1070 ROG @ 2126MHz. At least the chip went up reasonably. Memory maxed at 4363MHz ¬¬. Whoever here has Micron VRAM, post your OCs please. We really need a Pascal BIOS tweaker.


What are your Firestrike graphics scores?

My most stable score is around ~20.8xx.

I can reach 21k+ but it's not 100% stable.

Factory stock = 19k+ graphics score.

Honestly, the overclocking gains are pretty disappointing; even though I have Samsung memory, the final result still isn't much higher than other 1070s.
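To put those scores in perspective, the relative gain is easy to compute. A quick sketch of my own, using the approximate figures quoted in the post (~19k stock vs ~20.8k overclocked):

```python
# Percentage gain of an overclocked Firestrike graphics score over stock.
def oc_gain_percent(stock_score, oc_score):
    return (oc_score - stock_score) / stock_score * 100

# Approximate scores from the post above: ~19,000 stock vs ~20,800 OC.
print(round(oc_gain_percent(19000, 20800), 1))  # -> 9.5
```

So even a good Samsung-memory overclock here works out to under 10% in graphics score, which matches the "disappointing" verdict.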


----------



## khanmein

Quote:


> Originally Posted by *Onlygear*
> 
> Yeah, same for me; Asus is still stuck on 86.04.50.00.63 for the Strix. I'm going to install the Zotac AMP! Extreme BIOS 86.04.50.00.98 to check how the Micron memory works... fingers crossed : )
> 
> Update: Bricked. After re-flashing my BIOS backup, all is working fine. As I expected, the GPU survived without damage, since the power delivery is lower on the Zotac BIOS.


I suggest sticking with the latest vBIOS from ASUS, since it's not too shabby at all.


----------



## Dan-H

MSI Gaming cards are very quiet; the fans are off at idle.
Quote:


> Originally Posted by *g-lad21*
> 
> Hey guys, moving from my Sapphire R9 Fury to a 1070. Can someone recommend a quiet one with some fancy LEDs? Not too many, but controllable like the Strix. I heard MSI cards are very quiet, but do they have LEDs? Thanks!
> 
> EDIT: Also, do we know which cards have coil whine? Thanks for any help.


I bought two MSI Gaming X cards for the "wish it was mine" builds I'm working on. One is the normal red Gaming X 8GB and the other is the recently released Quicksilver, which is simply a silver version of the Gaming X. It is very sharp looking.

I haven't done much with the LEDs on either as I'm still getting the builds finished.

https://www.msi.com/Graphics-card/GeForce-GTX-1070-Quick-Silver-8G-OC.html#hero-overview

https://us.msi.com/Graphics-card/GEFORCE-GTX-1070-GAMING-X-8G.html#hero-overview

here are both side by side early on in the assembly.



and a better shot of the backing plate on the quick silver.



Super quiet. No coil whine on mine.


----------



## JoeUbi

Really starting to like my Zotac Amp Extreme card... The Samsung memory overclocks like crazy!! http://www.3dmark.com/spy/1040198


----------



## zipper17

Quote:


> Originally Posted by *JoeUbi*
> 
> Really starting to like my Zotac Amp Extreme card... The Samsung memory overclocks like crazy!! http://www.3dmark.com/spy/1040198


Make sure there are no shiny-spark mini-artifacts (lasting milliseconds).

My Samsung card can only do +600 (4600MHz, 9200MHz effective).

Anything higher and my card produces shiny green sparks during a looping Firestrike Extreme stress test. It's probably not noticeable if you only run the benchmark once or so. +700 is much worse.

They randomly appear on the screen during rendering, something like this:


----------



## bust3r

Hello.
I have a Galax 1070 HOF with Micron memory; it can do +700, but I run +650 for stable 24/7.
http://www.3dmark.com/3dm/17298438

I think the Micron memory is not bad at all.


----------



## zipper17

Quote:


> Originally Posted by *bust3r*
> 
> Hello.
> I have a Galax 1070 HOF with Micron memory; it can do +700, but I run +650 for stable 24/7.
> http://www.3dmark.com/3dm/17298438
> 
> I think the Micron memory is not bad at all.


It's the silicon lottery, but Samsung generally has better odds.
Any artifacts, though?

Btw, damn, the 1080Ti is rumored to be at PAX East on March 10.


----------



## icold

Quote:


> Originally Posted by *Onlygear*
> 
> Yeah, same for me; Asus is still stuck on 86.04.50.00.63 for the Strix. I'm going to install the Zotac AMP! Extreme BIOS 86.04.50.00.98 to check how the Micron memory works... fingers crossed : )
> 
> Update: Bricked. After re-flashing my BIOS backup, all is working fine. As I expected, the GPU survived without damage, since the power delivery is lower on the Zotac BIOS.


The non-OC BIOS stopped at .64. It is not advisable to flash a BIOS from another model, even more so the AMP Extreme, which has two 8-pin connectors. What will solve our problem is a Pascal BIOS tweaker.


----------



## gtbtk

Quote:


> Originally Posted by *Onlygear*
> 
> Quote:
> 
> 
> 
> Originally Posted by *icold*
> 
> There is no .72 BIOS for the Strix.
> 
> 
> 
> Yeah, same for me; Asus is still stuck on 86.04.50.00.63 for the Strix. I'm going to install the Zotac AMP! Extreme BIOS 86.04.50.00.98 to check how the Micron memory works... fingers crossed : )
> 
> Update: Bricked. After re-flashing my BIOS backup, all is working fine. As I expected, the GPU survived without damage, since the power delivery is lower on the Zotac BIOS.
Click to expand...

The 86.04.50.00.xx BIOS is the one that fixed the Micron memory-controller bug. The xx part of the version number is only relevant to the manufacturer's own product range: a BIOS that ends in 72 from one vendor has no relevance to a BIOS that ends in 63 from Asus. The core elements, like the code for the memory controller, come in the base BIOS code direct from Nvidia, and the version of that core code is identified by the ".50" part of the version number. That code is identical between brands; each vendor only changes settings like core clocks for different models, fan curves, power limits, branding, and other things to differentiate their range of models. The BIOS version that caused issues with the Micron memory was 86.04.26.00.xx.

The Strix OC BIOS is one of the better BIOS files around for single-8-pin 1070s: a base clock of 1633MHz and a max power draw of 200W. That BIOS works well on MSI Gaming X cards, offering about the same performance as the original MSI BIOS even though it is supposedly set with a lower TDP. With only a single 8-pin power connection, the max rating for your card's power supply is only 225W.

The Zotac AMP Extreme BIOS is set to pull up to 300W, so you put the card at risk of getting fried if you install it and get it to work, as the installed VRM does not have the capacity to cope with that amount of power.

If you want to try an extreme BIOS with a higher factory core clock (1671MHz) and a 250MHz memory OC that is power-level safe for your card, have a look at the Gainward Phoenix GLH BIOS: https://www.techpowerup.com/vgabios/187062/187062 .

That card has the same 8-pin power supply as your Asus, and I have previously flashed my MSI with both the Asus and the Gainward BIOSes; both worked fine, indicating they share a common voltage controller.


----------



## RyanRazer

Hey guys, just dropping my baby here.

I've been putting it together piece by piece for some time now; the case came last. Never thought I'd end up like this.

It's funny how I wasn't a gamer or even interested in computers two years ago. I just had my crappy laptop for browsing and videos, and that was it. Then I decided to get a decent computer and got one with an i7 4790, which was top of the line at the time. Then I decided to get myself a GPU just for fun, for the first time in my life. It was an R7 270.

It was an OK GPU, good enough that I started playing games.

Soon the 270 just wouldn't cut it and I replaced it with an R9 290, but that was a hungry beast so I had to get myself a proper PSU (I had a crappy 500W no-name thing). Of course I had to add another 8GB of RAM too, and here I am today with a 1070 and a case with a side window. Considering that half a year ago I was rocking the R9 290 with the old case's side panel off, the PC shoved in a corner of the room, caring only about performance (not aesthetics), and two years ago I didn't even own a decent computer...

From not caring at all, to a bit of gaming, then performance only, and finally performance and aesthetics.

Much has changed. I got hooked on upgrading my PC. Great hobby. A bit expensive, yes, but better than smoking my brains out with weed.

Here she is


----------



## MEC-777

Quote:


> Originally Posted by *RyanRazer*
> 
> Hey guys. Just dropping my baby here.
> 
> Was putting it piece by piece together for some time now. The case came last. never thought i'd end up like this
> 
> It's funny how i wasn't even a gamer and interested in computers 2 years ago. I just had my crappy laptop for browsing and videos and that was it. Then i decided to get a decent computer. Got one with i7 4790 which was top at that time. Then i decided to get myself a GPU just for fun, for the first time in my life. It was an R7 270
> 
> Was an OK GPU, good enough so that i started playing games.
> 
> Soon 270 just wouldn't cut it and i replaced it with r9 290 but that was a hungry beast so i had to get myself a proper PSU (had a crappy 500W no-name thing). Surely i had to add another 8GBs of RAM and here i am today with 1070 and a case with side window. Considering i was rocking a R9 290 just half a year ago with old case side-panel opened, PC put in a corner of a room, only cared about performance (not aesthetics) and 2 years ago didn't even own a decent computer...
> 
> From not caring at all, to a bit into gaming, performance only, and finally performance and aesthetics
> 
> Much has changed. Got hooked in upgrading my PC. Great hobby. A bit expensive, yes, but better than smoking brains out with weed
> 
> Here she is


Build looks great.

Just curious why you have the CPU cooler facing down instead of blowing across towards the back?


----------



## Nebulous

Yup looks good. Congrats on the 1070!


----------



## RyanRazer

Quote:


> Originally Posted by *Nebulous*
> 
> Yup looks good. Congrats on the 1070!


Tnx

Quote:


> Originally Posted by *MEC-777*
> 
> Build looks great.
> 
> Just curious why you have the CPU cooler facing down instead of blowing across towards the back?


I'd interfere with GPU. It is a temporary MoBo as i've fried Giga Z97 and waiting for new one. This is mATX so there is not much space. The one that i've RMAd (hope i get replacement) was a regular ATX, much more space to play around.

Btw, great community. Glad to see none of the toxic comments you get under most YouTube videos and in comment sections like Wccftech etc...


----------



## Onlygear

Quote:


> Originally Posted by *gtbtk*
> 
> The 86.04.50.00.xx Bios is the bios that fixed the Micron memory controller bug. The XX part of the version number is only relevant to the manufacturer's own product range. a bios that ends in 72 from one vendor has no relevance to a bios that ends in 63 from Asus. The core elements like the code for the memory controller comes in the base bios code direct from Nvidia, the version of the core code is identified by the ".50" part of the version number. That code is identical between brands, each vendor only makes changes to settings like core clocks for different models, fan curves, power limits, branding and other things to differentiate their range of models. The Bios version that caused issues with the micron memory was 86.04.26.00.xx
> 
> The Strix OC bios is one of the better bios files around for single 8 pin 1070s. Base Clock of 1633Mhz and a max power draw of 200W. That bios works well on a MSI Gaming X cards, offering about the same performance as the original MSI bios even though it is supposedly set with a lower TDP. With only a single 8 pin power connection the max rating for the power supply of your card is only 225W.
> 
> The Zotac Amp Extreme Bios is set to pull up to 300W so you do put the card at risk of getting fried if you install it and get it to work as the installed VRM does not have the capacity to cope with that amount of power.
> 
> If you want to try an extreme bios with higher factory 1671Mhz core clock OC and a 250Mhz memory OC that is power level safe for your card, you should have a look at the Gainward Pheonix GLH bios https://www.techpowerup.com/vgabios/187062/187062 .
> 
> That card has the same 8 pin power supply as your Asus and I have previously flashed my MSI with both the Asus and the Gainward bioses and they both worked fine indocating a shared voltage controller comonality.


Hello! Thanks for the clarification. I thought the Zotac BIOS, built for a card with two 8-pin plugs, would just draw 200 watts through the first and 100 watts through the second, so that on the Asus Strix it would not exceed 200 watts. :(
After running my card with the Gainward BIOS I got a lower OC on the Micron memory.


----------



## MEC-777

Quote:


> Originally Posted by *RyanRazer*
> 
> I'd interfere with GPU. It is a temporary MoBo as i've fried Giga Z97 and waiting for new one. This is mATX so there is not much space. The one that i've RMAd (hope i get replacement) was a regular ATX, much more space to play around.


Ah, understood. Yeah, I have a huge CPU cooler now and if the top PCIe slot on my mobo wasn't one slot down, it would interfere like yours.


----------



## Dan-H

Quote:


> Originally Posted by *MEC-777*
> 
> Build looks great.
> 
> Just curious why you have the CPU cooler facing down instead of blowing across towards the back?


@RyanRazer I'm curious also.

It looks like a Fractal R5 case.

Here is how the last build I did in that case ended up. (Wish it was mine, unlike the build in my sig.)

Four fans are intakes and there is lots of airflow out the back.


----------



## JoeUbi

No artifacts, but unfortunately whenever I mess with the core voltage it becomes unstable. Same with the core clock: set it anywhere above 2000 and it doesn't want to work.


----------



## RyanRazer

Quote:


> Originally Posted by *MEC-777*
> 
> Ah, understood. Yeah, I have a huge CPU cooler now and if the top PCIe slot on my mobo wasn't one slot down, it would interfere like yours.


Woah, holy s*, that is one big cooler. Why not go for liquid cooling if you need low temps? I guess for a high OC?

What are those at the top? They look like fans for a radiator, but I see no water block (CPU or GPU). Or are those exhausts through the side of the case? Confused.

Quote:


> Originally Posted by *Dan-H*
> 
> @RyanRazer I'm curious also.
> 
> It looks like a Fractal R5 case.
> 
> Here is how the last build I did in that case ended up. (Wish it was mine build in my sig)
> 
> four fans are intakes and there is lots of air flow out the back.


It's actually a Corsair C400. I'll only add one more 140mm fan in front, and probably put two 140mm fans at the top as exhaust and remove the 120mm one at the back. My idea is that since hot air rises on its own, I'll just help it along with fans on top. And I'll remove the rear 120mm because it gets loud; smaller fans have to spin faster to compensate for their smaller blades compared to big ones, and so tend to be louder. I like my computer to be as quiet as possible.

Is this a healthy idea?


----------



## WillG027

Micron memory is fine after the .50 BIOS flash (and it most certainly does not equal Elpida, as I've seen written here).

I have an MSI Gaming X with Micron, good up to +800 on memory with no artifacts.


----------



## Arturo.Zise

I have a Gainward Phoenix with the Golden Sample BIOS flashed and it runs well, but I noticed the standard BIOS my card came with has a 170W max board rating, while the GS BIOS has a 195W max board rating. My card has a single 8-pin, so I should be safe up to 225W max, no? Not sure if I'm harming my card at all.


----------



## Dan-H

Quote:


> Originally Posted by *RyanRazer*
> 
> It's actually Corsair C400. I'll only add one more 140mm fan in front. And probably put 2 140mm at top as exhaust and remove the 120 mm one at the back. My idea is as hot air automatically rises, I'll just help him do that with fans on top. And I'll remove the back 120 one as it gets loud. Usually smaller fans have to spin faster to compensate for smaller blades as oposed to big ones and tend to get louder. I like my comp to be as quiet as possible.
> 
> Is this a healthy idea?


My $0.02 is to stop by the air cooling forum; I've received some excellent advice there, in this thread in particular. I think they will say that if you are exhausting air up and out the top, you will pull heat from the graphics card upward, and the intake of the CPU cooler will be getting warm air instead of cool air.

If you went front-to-back cooling you would get the air from the graphics card out, and the air temp at the intake of the CPU cooler would be lower.

I also remember a post where they tested the CPU cooler orientation but can't recall the outcome.


----------



## MEC-777

Quote:


> Originally Posted by *RyanRazer*
> 
> Woow, holly s*, this is one big cooler. Why not go for liquid if need low temps. I guess high OC?
> 
> What are those at the top? They look like fans for radiator but i see no watter block (cpu or GPU). Or are those out take through the side of the case? Confused


Yeah, it's pretty big, lol. That's the Deepcool Lucifer V2 with a GF120 fan. With the 4770k @ 4.2 it runs fully passive 90% of the time (fan off). Occasionally it will come on during gaming, depending on the load.

I favor air cooling over water because I don't need absolute lowest temps, but I want complete silence at system idle and low loads. So all the fans you see are off until the GPU hits about 55C, then the bottom 3 turn on. At 67C+ the top 3 and rear fans also come to life. They all run at max 40% so while I can hear it, it's still very quiet and everything is efficiently cooled. I just assembled this and started it up for the first time yesterday, so I'm still tweaking things like the GPU fan speed for optimal temps/noise, but it's working more or less just the way I want it.

The case fans are all Deepcool TF120s with white LEDs. It's pretty cool watching them all start to glow and come to life as the GPU load increases. They are hybrid fans, so they work well both as rad fans and as airflow fans. They also have surprisingly low power draw, only 0.18A at 12V. That's another reason I prefer air cooling: I can achieve very low system power draw at idle by shutting all the fans off and having no pumps running.

Don't get me wrong though. I did think about doing a custom loop.


----------



## gtbtk

Quote:


> Originally Posted by *Arturo.Zise*
> 
> I have a Gainward Phoenix with the Golden Sample bios flashed and it runs well, but I noticed the standard bios my card came with has a 170w max board rating and my GS bios is 195w max board rating. My card has a single 8 pin so I should be safe up to 225w max no? Not sure if I'm harming my card at all.


Yes you are fine. Are you running the 86.04.3B.00.xx bios on the card? I think that the .50 bios pushes max power to 225W to help keep your over clocks at higher stable levels.

The 8 pin power cable is officially rated at 150 watts but will supply higher than that and the PCIe slot is rated at 75 Watts but can also provide more power than specified.
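Putting those spec figures together: a card's official power envelope is just the slot rating plus the sum of its auxiliary plug ratings. A rough sketch of that arithmetic:

```python
# Official power envelope from the spec figures above: the PCIe slot
# is rated 75 W, a 6-pin plug 75 W, and an 8-pin plug 150 W. Real
# hardware can deliver more than this, but a BIOS power limit should
# stay inside the envelope.
PCIE_SLOT_W = 75
PLUG_RATING_W = {"6pin": 75, "8pin": 150}

def board_power_budget(plugs):
    """Slot rating plus the sum of the auxiliary plug ratings."""
    return PCIE_SLOT_W + sum(PLUG_RATING_W[p] for p in plugs)

def bios_limit_in_spec(limit_w, plugs):
    return limit_w <= board_power_budget(plugs)
```

A single-8-pin card comes out at 225W, which is why a 225W BIOS limit is right at spec for these cards, while the 300W Zotac AMP Extreme BIOS discussed earlier is not.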


----------



## Blackfirehawk

Quote:


> Originally Posted by *gtbtk*
> 
> Yes you are fine. Are you running the 86.04.3B.00.xx bios on the card? I think that the .50 bios pushes max power to 225W to help keep your over clocks at higher stable levels.
> 
> The 8 pin power cable is officially rated at 150 watts but will supply higher than that and the PCIe slot is rated at 75 Watts but can also provide more power than specified.


You seem to know a lot about the different BIOSes...

I have a Gainward GTX 1070 (not the Phoenix or Golden Sample edition):
http://www.gainward.com/main/vgapro.php?id=987&lang=en
https://www.techpowerup.com/gpudb/b3715/gainward-gtx-1070

At the moment I have the BIOS from the Palit Super Jetstream running. TechPowerUp shows it's compatible and reports:

Board power limit
Target: 195.0 W
Limit: 225.0 W

https://www.techpowerup.com/vgabios/187001/palit-gtx1070-8192-161021

Running a boost of about 2050MHz core clock and +600MHz RAM clock (Micron), stable.

What do you think would be the best BIOS for me? The card has only an 8-pin connector.


----------



## RyanRazer

Quote:


> Originally Posted by *Dan-H*
> 
> my .02 is stop by the air cooling forum. I've received some excellent advice there, this thread in particular. I think they will say if you are exhausting air up and out the top, you will pull heat from the graphics card up and the intake of the CPU cooler will be getting warm air vs cool air.
> 
> If you went front-to-back cooling you would get the air from the graphics card out, and the air temp at the intake of the CPU cooler would be lower.
> 
> I also remember a post where they tested the CPU cooler orientation but can't recall the outcome.


Thanx man, I'll do that!

Quote:


> Originally Posted by *MEC-777*
> 
> Yeah, it's pretty big, lol. That's the Deepcool Lucifer V2 with a GF120 fan. With the 4770k @ 4.2 it runs fully passive 90% of the time (fan off). Occasionally it will come on during gaming, depending on the load.
> 
> I favor air cooling over water because I don't need absolute lowest temps, but I want complete silence at system idle and low loads. So all the fans you see are off until the GPU hits about 55C, then the bottom 3 turn on. At 67C+ the top 3 and rear fans also come to life. They all run at max 40% so while I can hear it, it's still very quiet and everything is efficiently cooled. I just assembled this and started it up for the first time yesterday, so I'm still tweaking things like the GPU fan speed for optimal temps/noise, but it's working more or less just the way I want it.
> 
> The case fans are all Deepcool TF120's and they are white LEDs. It's pretty cool watching them all start to glow and come to life as the GPU load increases. They are hybrid fans, so they work well as rad fans and air flow fans. They also have surprisingly low power draw at only 0.18A at 12v. That's another reason I prefer air cooling - can achieve very low system power draw at system idle by being able to shut all the fans off and having no pumps running.
> 
> Don't get me wrong though. I did think about doing a custom loop.


Oh man, I loved my PC when it was silent, before I got this case (open case, no case fans, semi-passive CPU cooler, semi-passive GPU fans too). Now these case fans are pretty loud, so I will get new ones. I'll take a look; definitely going for PWM ones. Do you plug them all into 4-pin headers on the MoBo, or do you have a controller? I'd like to set them up so they spin up at a certain load.


----------



## MEC-777

Quote:


> Originally Posted by *RyanRazer*
> 
> Oh man, i loved my PC when it was silent before i got this case (had open case, no case fans, cpu cooler semi-pasive, gpu fans also semi pasive). Now these case fans are pretty loud, will get new ones. Will take a look, definitely going for PWM ones. Do you plug them all to 4 pin header on MoBo or do you have a controller? i'd like to set them up so they spin at certain load.


All fans run off the motherboard headers. The top three are on a 3-pin splitter on one header, the bottom three on their own splitter on another header. The rear fan has its own header, as does the CPU cooler fan.

I put all the fan headers into voltage mode in the BIOS and use a program called Speedfan to control each header individually. The case fans are all set up with custom fan curves based on GPU temps, and the CPU fan has its own curve based on just the CPU temp.

Speedfan is an amazing fan control program with a lot of flexibility, but it is a bit tricky to set up. Once you get past the setup, it's a breeze.

I found I have much better control over the fans in voltage mode instead of PWM. Just a simpler way of doing things, I guess.

It's hilarious though, Asus motherboards like to crank all the fans to over 9000 when you first boot them up. So until windows boots and Speedfan takes control, my PC sounds like a 747 on take-off. Lol


----------



## gtbtk

Quote:


> Originally Posted by *Blackfirehawk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Yes you are fine. Are you running the 86.04.3B.00.xx bios on the card? I think that the .50 bios pushes max power to 225W to help keep your over clocks at higher stable levels.
> 
> The 8 pin power cable is officially rated at 150 watts but will supply higher than that and the PCIe slot is rated at 75 Watts but can also provide more power than specified.
> 
> 
> 
> you seem to know a lot about the different Bioses..
> 
> i have a Gainwand GTX 1070 (no Phoenix or GLS edition)
> http://www.gainward.com/main/vgapro.php?id=987&lang=en
> https://www.techpowerup.com/gpudb/b3715/gainward-gtx-1070
> 
> @ the moment i have the Bios from the Palit Superjetstream running, Techpowerup shows me its compatible and techpowerup tells me
> 
> Board power limit
> Target: 195.0 W
> Limit: 225.0 W
> 
> https://www.techpowerup.com/vgabios/187001/palit-gtx1070-8192-161021
> 
> Running a Boost of about 2050Mhz core Clock and + 600mhz Ram clock (micron) stable
> 
> What do you think would be the Best Bios for me?
> Card has only a 8 pin connector

That BIOS you are running has a 225W limit; they increased it from the original stock rating, so you should be fine with that one. You got a nice OC from 1506MHz to 1633MHz. The Gainward version is still listed at 195W.

If you want to go extreme, you could try the Super Jetstream/Phoenix GLH BIOS, which also has a 225W limit; the factory clock is 1671MHz and the memory gets a 250MHz OC as well.

Palit Superjetstream 1671Mhz bios https://www.techpowerup.com/vgabios/187013/187013

Gainward GLH 1671Mhz bios https://www.techpowerup.com/vgabios/187062/187062

I think they are doing some binning, and your card, being a base model, may not have a chip binned well enough to cope with the extreme overclock. You may find that it crashes at those stock clocks under a 3D load. You will only know by trying, but it will not kill your card. You can always flash back to the BIOS you are running now if it doesn't work out for you.


----------



## gtbtk

Quote:


> Originally Posted by *Arturo.Zise*
> 
> I have a Gainward Phoenix with the Golden Sample bios flashed and it runs well, but I noticed the standard bios my card came with has a 170w max board rating and my GS bios is 195w max board rating. My card has a single 8 pin so I should be safe up to 225w max no? Not sure if I'm harming my card at all.


Even the founders edition card has a VRM that will cope with a max power draw of 250W. You should be fine


----------



## kdchan

Hi guys. I bought a Dual GTX 1070 08G card about 4 days ago. The card is amazing and already OCed very well (1582/2002/1772). Unfortunately I noticed I'm one of the people with Micron GDDR5 VRAM, so I upgraded the VBIOS from the stock 86.04.26.00.80 to the latest, 86.04.50.00.46, using GTX1070updatebios.exe.
All worked great, but I noticed the card now reaches 70+°C and the whole case temperature rises by a 24°C margin, something that never happened with my old GTX 670.
So I read the BIOS patch notes more carefully, and they say:

GTX1070updatebios
1. DUAL GTX 1070 series now supports the 0dB function.
2. Improves Micron memory overclock stability.

The 0dB function is the root cause.

Can someone please send me the previous GTX1070updatebios.exe or the .As03.rom file so I can downgrade to the old VBIOS? I don't want to use a third-party software tool to tweak the fans; I prefer them to stay always active, and I never had a crash caused by the Micron VRAM with the old BIOS.

Thanks.


----------



## gtbtk

Quote:


> Originally Posted by *Onlygear*
> 
> Hello! thanks for the clarification, I thought Zotac bios with two 8 pin power just delivery 200watts and 100 watts on the second one and working on asus STRIX will not exceed 200watts : (
> After runing my card with Gainward bios I had a lower oc on micron memory.


Absolute performance is all about finding the right compromise within the available power and temperature envelope. Once you get to the limits of the card, what you give to the core has to be taken away from the memory.

This cross-flashing caper is certainly educational, but there are no guarantees that a BIOS for one card will perform well on another brand. I can't get the Giga Extreme or Gainward GLH BIOSes to run stable on my MSI card even at their stock clocks, but you don't know if you don't try.


----------



## gtbtk

Quote:


> Originally Posted by *WillG027*
> 
> Micron memory is fine after the .50 Bios flash.
> (and most certainly does not = Elpida as I've seen written here)
> 
> Have a MSI Gaming X with Micron good upto +800 on memory no artifacts.


You are right, after the bug fix was released there are no unique issues with Micron memory.

My MSI Gaming X is OK up to just under +600, but I have to run a BCLK overclock and I think that may have reduced the VRAM OC headroom I have available.


----------



## asdkj1740

A higher-power BIOS gives you a more stable GPU clock. ~225W on a 1070 probably gives you a steady, locked GPU clock; go check the Tom's Hardware review.
Factory overclocking can be dialed back by setting a negative core offset in MSI AB. What you need is a high-power BIOS and good cooling.

No need to worry about a high-power BIOS hurting your motherboard or power supply, unless you have an extremely cost-cut motherboard and an overrated power supply.


----------



## WillG027

Quote:


> Originally Posted by *gtbtk*
> 
> You are right, after the bug fix was released there are no unique issues with Micron memory.
> 
> My MSI gaming X is OK up to just under +600 but I have to run a BCLK overclock and I think that may have reduced the Vram OC headroom I have available


Absolutely. The Micron memory is working properly; it's just down to the silicon lottery how well your particular sample OCs. +600 is quite a decent OC on the memory, though.

How does the BCLK affect the graphics card's memory system?


----------



## asdkj1740

Quote:


> Originally Posted by *WillG027*
> 
> Absolutely. The Micron is working properly, it's just down to silicon lottery how well your particular example OC's.
> + 600 is quite a decent OC on the memory though.
> 
> How does the BCLK affect the graphics card memory system?


There is no doubt that on average Samsung VRAM is better. Samsung runs up to 1.6V stock while Micron maxes out around ~1.5V, and Samsung scales better with voltage, meaning you will probably get a higher VRAM clock at the same voltage than with Micron.
Its lower latencies/timings may affect benchmarks, but it is hard to see the difference during gaming.

If you had the chance to choose between Samsung and Micron VRAM at the same cost, what would you choose? The answer is simple.

Micron VRAM does the job well at the stock 8000MHz effective, and anything above 8000MHz is not guaranteed, so if you don't plan to OC your VRAM, Micron is perfectly fine.

Samsung VRAM being good doesn't mean Micron is bad. My Micron VRAM can OC to 2300MHz for gaming and 2350MHz for benchmarks.

https://xdevs.com/evga/980ti/Ti_KPE_OC_Guide.pdf
Two main reasons:
1. Samsung is faster clock-per-clock against Hynix due to some tighter latencies and the
ability to run higher voltage. This improves the performance a bit over that of the
reference 980Ti.

2. Samsung memory can scale nicely with memory voltage. This means that if you
provide more voltage to memory than stock 1.60V, you will very likely get higher
overclocking on memory. ...


----------



## Blackfirehawk

Quote:


> Originally Posted by *gtbtk*
> 
> That Bios you are running has a 225W limit, they increased it from the original stock rating. You should be fine with that one. You got a nice OC from 1506Mhz to 1633Mhz. The Gainward version is still listed as 195W.
> 
> If you want to go extreme, you could try a Super Jetstream/Pheonix GLH bios, that also has a 225W limit and the the factory clock is 1671Mhz and the memory gets a 250MhzOC as well.
> 
> Palit Superjetstream 1671Mhz bios https://www.techpowerup.com/vgabios/187013/187013
> 
> Gainward GLH 1671Mhz bios https://www.techpowerup.com/vgabios/187062/187062
> 
> I think that they are doing some binning and your card, being a base model may not have a chip binned well enough to cope with the extreme overclock. You may find that it will crash at the stock clocks under a 3d load. You will only know by trying it but I will not kill your card. You can always flash it back to the bios you are running on now if it doesn't work out for you.


I have tested the GLH BIOS.

After an hour of running the Heaven benchmark:

Core clock: 2050MHz (can't OC higher without a crash; even at +100% core voltage it crashes above 2050)
Memory clock: 4556MHz in Afterburner / 9112MHz effective (can't go higher without artifacts)
100% fan (custom curve) = 67°C
VDDC = 1050mV

Maybe not perfect, but I think it's okay for a cheap GTX 1070.

Thanks for the help.
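As an aside on those memory numbers: the three GDDR5 clock figures quoted in this thread (true memory clock, the Afterburner reading, and the "effective" marketing rate) differ by factors of two. A small converter, on the assumption that Afterburner shows the double-data-rate figure:

```python
def gddr5_rates(afterburner_mhz):
    """Relate the three GDDR5 clock figures quoted in this thread.

    Assumes MSI Afterburner reports the double-data-rate clock: half
    of it is the true memory clock, and doubling it again gives the
    quad-pumped "effective" rate used in marketing.
    """
    return {
        "memory_clock_mhz": afterburner_mhz / 2,
        "afterburner_mhz": afterburner_mhz,
        "effective_mhz": afterburner_mhz * 2,
    }
```

So 4556MHz in Afterburner is the same thing as 9112MHz effective, and the stock 4004MHz reading corresponds to the advertised 8008MHz.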


----------



## gtbtk

Quote:


> Originally Posted by *WillG027*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You are right, after the bug fix was released there are no unique issues with Micron memory.
> 
> My MSI gaming X is OK up to just under +600 but I have to run a BCLK overclock and I think that may have reduced the Vram OC headroom I have available
> 
> 
> 
> Absolutely. The Micron is working properly, it's just down to silicon lottery how well your particular example OC's.
> + 600 is quite a decent OC on the memory though.
> 
> How does the BCLK affect the graphics card memory system?

I have a Sandy Bridge i7-2600 non-K CPU. Overclocking the BCLK to get a CPU overclock also overclocks the PCIe lanes, the base clock in the PCH, and the system memory.

Reference clocks are signaled over the PCIe bus as electrical waveforms that have to stay in phase within a certain tolerance. If everything is pushed too far, the waves get out of whack.

You may have seen Nvidia's "eye" image when they were introducing GDDR5X memory with the 1080. They never explained what the "eye" actually was: it is an oscilloscope capture of many passes of the two clock signals that GDDR5 and 5X memory use, taken while the timings are correct. The traces have width because no pass follows exactly the same track; there are minute voltage variations each time, but as long as the signal stays within range, the open "eye" is produced and the memory can use it. Overclocking the PCIe bus introduces bigger variations than stock, so the traces can drift outside the range the silicon tolerates, everything gets out of whack, and the card crashes.
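To make the knock-on effect concrete, here's a toy sketch of what a BCLK bump drags along on Sandy Bridge (the multipliers are typical stock values, not figures from the post):

```python
def derived_clocks(bclk_mhz, cpu_mult=34, ram_mult=13.33):
    """Everything hanging off the base clock scales with it, which is
    why a BCLK overclock also overclocks the PCIe reference clock and
    system memory, not just the CPU."""
    return {
        "cpu_mhz": bclk_mhz * cpu_mult,
        "pcie_ref_mhz": bclk_mhz,   # PCIe reference tracks BCLK 1:1
        "ram_mhz": bclk_mhz * ram_mult,
    }
```

A 3% BCLK bump (100 to 103MHz) therefore pushes the PCIe reference clock 3% out of spec as well, which is where the extra signal variation comes from.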


----------



## WillG027

Quote:


> Originally Posted by *gtbtk*
> 
> I have a sandy Bridge i7-2600 non K CPU. Overclocking the BCLK to get an overclock on the CPU also over clocks the PCIe lanes, the Computer base clock in the PCH and the system memory.
> 
> Reference clocks are signaled over the PCIE bus in the form of electrical sine waves that require a certain level of being in "phase". If everything is pushed too far, the waves get out of whack.
> 
> -SNIP-


Nice, thanks for explaining - perfectly understandable.


----------



## gtbtk

Quote:


> Originally Posted by *Blackfirehawk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> That Bios you are running has a 225W limit, they increased it from the original stock rating. You should be fine with that one. You got a nice OC from 1506Mhz to 1633Mhz. The Gainward version is still listed as 195W.
> 
> If you want to go extreme, you could try a Super Jetstream/Pheonix GLH bios, that also has a 225W limit and the the factory clock is 1671Mhz and the memory gets a 250MhzOC as well.
> 
> Palit Superjetstream 1671Mhz bios https://www.techpowerup.com/vgabios/187013/187013
> 
> Gainward GLH 1671Mhz bios https://www.techpowerup.com/vgabios/187062/187062
> 
> I think that they are doing some binning and your card, being a base model may not have a chip binned well enough to cope with the extreme overclock. You may find that it will crash at the stock clocks under a 3d load. You will only know by trying it but I will not kill your card. You can always flash it back to the bios you are running on now if it doesn't work out for you.
> 
> 
> 
> i have Testet the GLH Bios
> 
> after a Hour of running Heavens benchmark
> 
> Core Clock : 2050mhz (can´t oc higher without Crash) even on + 100% Core voltage its crash over 2050
> Memory Clock: 4556mhz in Afterburner.. /9112mhz Vram (can´t go higher without artefacts)
> 100% Fan (costum Curve) = 67c degree
> VDDC = 1050 mv
> 
> maybe not Perfekt, but i think its okay for a Cheap GTX1070
> 
> Thanks for 4 Help

You will get more impressive core clock numbers, and potentially better performance, if you use the curve (press Ctrl-F in Afterburner to access it) instead of the core clock slider. Each voltage point along that curve has a different limit on how much overclock headroom it can cope with; if you use the slider, you are basically limited by the voltage point with the lowest headroom along the curve.

You could try just pulling the 1.093V point up to, say, 2100 while leaving the rest of the curve at stock, and test. Without a memory OC, I can use that to run the card at 2164MHz in Heaven. As memory speeds come up, though, core clocks have to come down.
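The slider-vs-curve point can be illustrated numerically. Suppose each voltage point had its own stable headroom (these numbers are invented purely for illustration):

```python
# Hypothetical stable +MHz headroom at each voltage point of a V/F curve.
stable_headroom = {0.850: 180, 0.950: 150, 1.043: 120, 1.093: 100}

def max_slider_offset(headroom):
    # The slider shifts every point together, so the weakest point
    # caps the whole curve.
    return min(headroom.values())

def max_point_offset(headroom, voltage):
    # Editing a single point (Ctrl-F) only requires that one point
    # to be stable.
    return headroom[voltage]
```

Here the slider tops out at +100, while the 0.850V point alone could take +180; that is why per-point edits post more impressive clocks.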

Remember that Heaven is a lot more forgiving of overclocks than Firestrike, which is more forgiving than Time Spy, and real games like Rise of the Tomb Raider or The Witcher 3 are stricter still.

You and I get about the same memory OC in Heaven and Firestrike. In my case, though, that is too high for Time Spy; it will artifact and crash at those speeds. I have found that 4400-4450MHz on my card is about the limit of what is universally stable.


----------



## JoeUbi

I'm able to hit 4900 on my memory. Any higher and I get instability, though no artifacts, which makes me think that if I could get more power to the card I could hit higher clocks. For some reason I can't get my card to draw more than 200W. Not really sure what the 2x 8-pin PCIe connectors are for if they can't give me more juice!

https://www.techpowerup.com/gpuz/details/q2ka

Some very good scores on benchmarks as well.
http://www.3dmark.com/fs/11409542
http://www.3dmark.com/fs/11409449
http://www.3dmark.com/spy/1055431


----------



## EDK-TheONE

Zotac AMP Extreme

Graphics Score: 21,836
Physics Score: 9,923
Combined Score: 8,806
http://www.3dmark.com/3dm/17372344


----------



## zipper17

Wow, I should have picked the Zotac AMP Extreme, to be honest lol.

That's the kind of OC performance a 1070 should deliver, but it seems that on average most 1070 cards can't even break 21K, imho.

It's like the 980 Ti, which also performs better than average once overclocked, in general.


----------



## dasa94

Hey guys. I wanted to know which 1070 I should go for. Which runs cooler: the Zotac AMP Extreme or the Asus Strix version? I don't care about aesthetics that much; I want to pick the card that runs cooler than the other. Thanks.

Sent from my D6503 using Tapatalk


----------



## khanmein

Quote:


> Originally Posted by *dasa94*
> 
> Hey guys. I wanted to know which 1070 should I go for? Which runs cooler? The zotac amp extreme or the Asus strix version? Doesn't care about aesthetics that much. I want to pick the card than runs coller than the other. Thanks.
> 
> Sent from my D6503 using Tapatalk


pick asus (micron) or zotac (samsung)


----------



## dasa94

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *dasa94*
> 
> Hey guys. I wanted to know which 1070 should I go for? Which runs cooler? The zotac amp extreme or the Asus strix version? Doesn't care about aesthetics that much. I want to pick the card than runs coller than the other. Thanks.
> 
> Sent from my D6503 using Tapatalk
> 
> 
> 
> pick asus (micron) or zotac (samsung)
Click to expand...

I'm sorry, but I don't know what that means. Let's just say I'm a noob, so can you please explain what you mean by that? Thanks.

Sent from my D6503 using Tapatalk


----------



## khanmein

Quote:


> Originally Posted by *dasa94*
> 
> I'm sorry but I know what that means? Let's just say I'm a noob so can you please explain what you mean by that? Thanks.
> 
> Sent from my D6503 using Tapatalk


just pick the cheapest one u can afford.


----------



## dasa94

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *dasa94*
> 
> I'm sorry but I know what that means? Let's just say I'm a noob so can you please explain what you mean by that? Thanks.
> 
> Sent from my D6503 using Tapatalk
> 
> 
> 
> just pick the cheapest one u can afford.
Click to expand...

I can afford both of them, but I want the one which runs cooler.

Sent from my D6503 using Tapatalk


----------



## khanmein

Quote:


> Originally Posted by *dasa94*
> 
> Can afford both of then but I want the one which runs cooler.
> 
> Sent from my D6503 using Tapatalk


pick asus!!


----------



## cyronn

Here's my first OC, just over 2100MHz on the core.

http://s1355.photobucket.com/user/cyronn/media/3dmark 1st run_zpslxf6oddf.jpg.html

http://s1355.photobucket.com/user/cyronn/media/2126_zpsmfgbafst.jpg.html


----------



## asdkj1740

Quote:


> Originally Posted by *dasa94*
> 
> Can afford both of then but I want the one which runs cooler.
> 
> Sent from my D6503 using Tapatalk


The EVGA FTW Hybrid on US Amazon is on sale, very appealing, just $450.


----------



## H4mm3R2

Hi,
If I put an EK waterblock on my Asus 1070 Strix, do I lose the warranty?


----------



## asdkj1740

Quote:


> Originally Posted by *H4mm3R2*
> 
> Hi,
> If I throw EK waterblock on my Asus 1070 Strix, I lose the Warranty?


You will, unless you can keep the sticker untouched.


----------



## H4mm3R2

I'll be careful, I'll try not to damage the sticker. Thanks


----------



## mrtbahgs

I haven't been keeping up with the thread so I apologize if this has been asked in the past week or so.

What fan curve setups do you run on the bottom end?
Perhaps it has always been like this for me, but I am just now noticing it due to a minor grind-like noise at initial spin-up: my GPU keeps bouncing between fan stop and a few percent on, over and over, while idle.
I have tried raising the 0% threshold to 48C or so, but slowly the GPU heats up and then goes back into the on/off fan cycle.

Right now I went back to 0% at 40C and then 12% at 41C, so if the fans need to come on, they actually come on.
I didn't like the idea of something like 3% fan speed on a low-slope line for the first few degrees if I had 40C/0% and then maybe 45C/12%.

Do most of you try and utilize the fan stop feature or do you just run like 10% fan 24/7 as your minimum speed no matter the temp?

I worry that this on/off repetitiveness will cause early failure of the fans or louder noises down the line so I think I need to dial in a better curve for the low end.


----------



## gtbtk

Quote:


> Originally Posted by *cyronn*
> 
> Heres my first oc just over 2100mhz on the core


Overclocking the memory gives you great scores in Firestrike.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *dasa94*
> 
> Can afford both of then but I want the one which runs cooler.
> 
> Sent from my D6503 using Tapatalk
> 
> 
> 
> evga ftw hybrid on us amazon is on sales, very appealing, just 450usd.
Click to expand...

You thinking about SLI?


----------



## lanofsong

Hello GTX 1070 owners,

We are having our monthly Foldathon from Monday 16th - 18th - 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

January 2017 Foldathon

To get started:

1.Get a passkey (allows for speed bonus) - need a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2.Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## zipper17

Quote:


> Originally Posted by *mrtbahgs*
> 
> I haven't been keeping up with the thread so I apologize if this has been asked in the past week or so.
> 
> What fan curve setups do you run on the bottom end?
> Perhaps it has always been like this for me, but I am just now noticing it due to a minor grind like noise from initial spin up, but my GPU keeps bumping between Fan Stop and a few % on and repeats it over and over while idle.
> I have tried raising the threshold for 0% to like 48C or something, but slowly the GPU heats up and then goes back into the on/off fan cycle.
> 
> Right now I went back to 40C 0% and then 41C 12% so if they need to come on, they actually come on.
> I didn't like the idea of like 3% fan speed or something on a low slope line for the first few degrees if I had 40C/0% and then maybe 45C/12%.
> 
> Do most of you try and utilize the fan stop feature or do you just run like 10% fan 24/7 as your minimum speed no matter the temp?
> 
> I worry that this on/off repetitiveness will cause early failure of the fans or louder noises down the line so I think I need to dial in a better curve for the low end.


Maybe you should try the "temperature hysteresis" fan setting in MSI AB.

Temperature hysteresis is pretty important to add, imo, especially in a game such as The Witcher 3.
When you play The Witcher 3, you'll notice that every time you go to the pause screen, inventory, map, etc., the GPU load, heat, and fans all immediately drop because the GPU stops rendering. With fan temperature hysteresis your fan speed stays stable.

For example, if your full-load temp during gaming is 50C, a temp hysteresis of 11C means 50 - 11 = 39C: the fan will only start to slow down once the GPU drops to 39C. CMIIW.
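A rough sketch of how that hysteresis gating behaves, using the 50C/11C numbers from the example above. The fan curve itself is invented for illustration, and this is not Afterburner's actual algorithm, just the general idea of holding speed until the GPU has cooled far enough below its last peak:

```python
# Minimal sketch of hysteresis-gated fan control (illustrative numbers only).
# Fan speed follows the curve while temperature rises, but only steps down
# once the GPU has cooled `hysteresis` degrees below the last tracked peak.

def fan_from_curve(temp_c):
    """Plain fan curve: fan stop below 40C, then a simple linear ramp."""
    if temp_c < 40:
        return 0
    return min(100, 12 + 2 * (temp_c - 40))

def fan_with_hysteresis(temps, hysteresis=11):
    """Apply the curve, holding speed until temp falls `hysteresis` below peak."""
    speeds, peak = [], None
    for t in temps:
        if peak is None or t >= peak:
            peak = t                 # rising: track the peak, follow the curve
        elif t <= peak - hysteresis:
            peak = t                 # cooled far enough: let the fan slow down
        speeds.append(fan_from_curve(peak))
    return speeds

# Gaming at 50C, then a pause menu drops the GPU to 45C: the fan holds steady
# instead of cycling, and only slows once the card falls to 39C or below.
print(fan_with_hysteresis([50, 50, 45, 45, 38]))
```

The 45C samples keep the fan at its 50C speed because 45 is still above the 39C release point; only the 38C sample lets it drop.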


----------



## Janes360

Hi, is this where I complain?? Fan noise at 100 percent, zzzzzzz. It starts at 1600 RPM. The GPU is 1 month old.


----------



## icold

I really like Pascal temps; 10 points to Nvidia on this front. My STRIX doesn't pass 60C at full load @2126MHz, which is why I believe we could get much more clock out of a Pascal BIOS tweaker. :thumb:


----------



## GeneO

Quote:


> Originally Posted by *mrtbahgs*
> 
> I haven't been keeping up with the thread so I apologize if this has been asked in the past week or so.
> 
> What fan curve setups do you run on the bottom end?
> Perhaps it has always been like this for me, but I am just now noticing it due to a minor grind like noise from initial spin up, but my GPU keeps bumping between Fan Stop and a few % on and repeats it over and over while idle.
> I have tried raising the threshold for 0% to like 48C or something, but slowly the GPU heats up and then goes back into the on/off fan cycle.
> 
> Right now I went back to 40C 0% and then 41C 12% so if they need to come on, they actually come on.
> I didn't like the idea of like 3% fan speed or something on a low slope line for the first few degrees if I had 40C/0% and then maybe 45C/12%.
> 
> Do most of you try and utilize the fan stop feature or do you just run like 10% fan 24/7 as your minimum speed no matter the temp?
> 
> I worry that this on/off repetitiveness will cause early failure of the fans or louder noises down the line so I think I need to dial in a better curve for the low end.


Fans will not be able to start up and run at 10%; the ones on mine require at least 20%. Here is my fan curve. I have it steep so I can keep temps under 60C, and stop the fans below 45C. Hysteresis is set to 3.


----------



## mrtbahgs

Quote:


> Originally Posted by *GeneO*
> 
> Fans will not be able to startup and run at 10%. The ones on mine require at least 20%. Here is my fan curve, I have it steep so I can keep the temps under 60c, and stop the fans below 45. Hysteresis is set to 3.


Thanks for sharing, that's more what I was looking for.
Mine claim to be on and running at 12 or 15%; I purposely set the curve to jump to 12% right after 0% rather than have it attempt to spin at something like 4%.
I am not sure what my actual minimum speed is, or whether the reading is accurate when it claims 12%.

For your setup, does having it jump to 25% at initial start-up keep it from bouncing between 0 and 25% at idle, though?
I could see it hitting 46C, cooling quickly to 43C, and then going back and forth, which is what I am trying to avoid.

I am running EVGA Precision XOC btw, not MSI AB, but I see that I also have a hysteresis option; I had no idea what it was, so it is still at 0C.
Zipper17 gave a nice little description though, so thank you for that.

Does hysteresis apply at idle too, or when does it ignore the temp setting I put in?
That is what I am trying to fix: the idle on/off cycling. In-game I have not had a problem with temps.

Here is what I have currently just to show where I last left it:


----------



## xGeNeSisx

Going to experiment with flashing the Asus Strix OC BIOS to my Gigabyte G1; I'm interested to see if performance increases versus the current EVGA BIOS. As previously mentioned, it's very important to monitor temperatures when crossflashing. Luckily that shouldn't be a problem here: my G1 has a Corsair H55 mounted with a Kraken G10. I put copper heatsinks on all of the VRAM chips (not entirely necessary, but might as well go all out) in addition to the nice setup I put on the VRM. I have not hit a temperature over 41C under any benchmark or game; it really is quite amazing. I replaced the default Kraken G10 92mm fan with a Noctua, which provides a great deal more airflow to the VRM. Even overclocked at max voltage while screwing around with benching, the VRMs have never exceeded 55C under full stress, even while running the x264 encoder CPU stability test, essentially pushing my entire system to max.

Oh, and something that may be of interest to those of you suffering from coil whine. My G1 was so damn loud on the stock cooler that the sound drove me crazy in the DOOM startup loading menus. On the G1 backplate is a sheet of plastic insulator which prevents the PCB from shorting against the backplate. To get better temps when I was experimenting with my G1 mod, I cut away the plastic insulator in a section right under the VRM. In its place I put Fujipoly thermal pads of the correct height to facilitate heat transfer from PCB to backplate. The cooling on the card improved (the G1 backplate otherwise traps hot air in that gap and holds it there), and the coil whine from the VRMs was eliminated entirely.

From my understanding, coil whine occurs when the VRMs enter a high-energy state; the padding served as a bit of sound dampening as well as a nice way to stop the backplate from trapping hot air against the board.


----------



## GeneO

Quote:


> Originally Posted by *mrtbahgs*
> 
> I haven't been keeping up with the thread so I apologize if this has been asked in the past week or so.
> 
> What fan curve setups do you run on the bottom end?
> Perhaps it has always been like this for me, but I am just now noticing it due to a minor grind like noise from initial spin up, but my GPU keeps bumping between Fan Stop and a few % on and repeats it over and over while idle.
> I have tried raising the threshold for 0% to like 48C or something, but slowly the GPU heats up and then goes back into the on/off fan cycle.
> 
> Right now I went back to 40C 0% and then 41C 12% so if they need to come on, they actually come on.
> I didn't like the idea of like 3% fan speed or something on a low slope line for the first few degrees if I had 40C/0% and then maybe 45C/12%.
> 
> Do most of you try and utilize the fan stop feature or do you just run like 10% fan 24/7 as your minimum speed no matter the temp?
> 
> I worry that this on/off repetitiveness will cause early failure of the fans or louder noises down the line so I think I need to dial in a better curve for the low end.


Fans will not be able to startup and run at 10%. The ones on mine require at least 20%. Here is my fan curve, I have it steep so I can keep the temps under 60c, and stop the fans below 45.
Quote:


> Originally Posted by *mrtbahgs*
> 
> Thanks for sharing, that's more what I was looking for.
> Mine claim to be on and running at like 12 or 15%, I purposely set the temp jump to 12% right after 0% and not make it attempt to spin at like 4%.
> I am not sure what my minimum speed is or if the reading is accurate when it claims 12%.
> 
> For your set up does having it jump to 25% for initial start up keep it from bouncing between 0 and 25% at idle though?
> I could see it hitting 46C and cooling quick to like 43C and then back and forth which is what I am trying to avoid.
> 
> I am running EVGA Precision XOC btw and not MSI AB, but I see that I also have a hysteresis option, but had no idea what it was so it is still at 0C.
> Zipper17 gave a nice little description though so thank you for that.
> 
> Will Hysteresis apply to idle loads too or when does it not consider the temp setting i put in?
> That is what I am trying to fix, idle on and off, in game I have not had a problem with temps.
> 
> Here is what I have currently just to show where I last left it:


At ambient temperature, outside gameplay, my card stays below 45C with no fan. I selected that curve so I would get no bouncing; gaming is always above 45.

The hysteresis applies when lowering fan speed (not when increasing it). It helps some, but it will just lengthen the period of the bouncing if you don't have a step up from 0% that sits above your idle temperature.

Same concern as you (on/off cycling decreasing fan life).


----------



## UZ7

http://www.3dmark.com/fs/11421484

Still trying to push for that 16K mark...

19K on the laptop 1070









Wish I had control over the power limit / voltages.


----------



## ranillo

Going to try the KFA2 1070 EX OC BIOS on my EVGA. Hopefully better overclocking ability, a higher power limit, and better stability.


----------



## gtbtk

Quote:


> Originally Posted by *xGeNeSisx*
> 
> Going to experiment with flashing the Asus Strix OC bios to my Gigabyte G1, I'm interested to see if performance increases versus current EVGA bios. As previously mentioned, it's very important to monitor temperatures when crossflashing. Luckily I don't seem to have a problem here, my G1 has a Corsair H55 mounted on it with the Kraken G10. Put copper heatsinks on all of the VRAM chips (not entirely necessary, but might as well go all out) in addition to the nice setup I put on the VRM. I have not hit a temperature over 41 C under any benchmark or game, it really is quite amazing. Replaced the default Kraken G10 92mm fan with a Noctua which provides a great deal more airflow to VRM. Even overclocked with max voltage just screwing around with benching, I have not ever had the VRMs exceed 55 C under full stress. This is even whilst running x264 encoder CPU stability test, essentially pushing my entire system to max to test stability.
> 
> Oh, and something that may be of interest to those of you suffering from coil whine. My G1 was so damn loud on stock cooler, the sound drove me crazy while in DOOM startup loading menus. On the G1 backplate is a sheet of plastic insulator which prevents the PCB from shorting itself on the backplate. In order to achieve better temps when I was experimenting with my G1 mod, I cut away the plastic insulator in a section right under the VRM. In it's place I put Fujipoly thermal pads of the correct height to facilitate heat transfer from PCB to backplate. The cooling on the card improved (the G1 backplate traps hot air under it and holds it there with that gap), the coil whine noise from the VRMs was eliminated entirely.
> 
> From my understanding coil whine comes from when the VRMS enter high energy state, the padding served as a bit of sound dampering as well as a nice way to eliminate the backplate from trapping hot air against the board.


I found that the EVGA BIOS liked to hit its maximum power limit quickly and start bouncing the core clock frequency around a lot. The Asus BIOS on my MSI worked really well and was quite stable.


----------



## DeathAngel74

Mine bounces a lot: 2101, 2050, 2075, 2025.







damn you gpu boost 3.0


----------



## khanmein

Quote:


> Originally Posted by *Janes360*
> 
> Hi. this is to complain ?? Fan noise on 100 percent zzzzzzz It starts at 1600 RMP GPU is 1 Month


Well-known issue with Zotac.


----------



## khanmein

Quote:


> Originally Posted by *ranillo*
> 
> Going to try KFA2 1070 ex oc on my evga. Surely better ability to overclock, more power limit and better stability.


Try the Asus one on the EVGA; it's way better.


----------



## DeathAngel74

Which BIOS do you suggest for the EVGA 1070 SC? 6173-kb/kr


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> which bios do you suggest for evga 1070 sc? 6173-kb/kr


I'm sticking with the original default stock 86.04.50.00.*72*


----------



## UZ7

http://www.3dmark.com/fs/11433632

Alright I'm done for now lol.. 16K was my goal.


----------



## asdkj1740

Quote:


> Originally Posted by *DeathAngel74*
> 
> Mine bounces alot, 2101, 2050, 2075, 2025
> 
> 
> 
> 
> 
> 
> 
> damn you gpu boost 3.0


It's all about the ACX 3.0 cooler. It just sucks.


----------



## MEC-777

http://www.3dmark.com/fs/11389532

That's more like it.







Just upgraded to a 4770K from an i5-4570. Didn't realize how much the i5 was holding this card back. Pretty much all my gaming benchmarks increased as well, some by significant amounts.


----------



## saunupe1911

Man, I remember months ago when everybody was going MSI, Zotac, and FTW while I kept telling them how good the Asus Strix OC was. Now the Strix OC BIOS, and the card itself, is THE card to have right now. Plus retailers are running good deals on them.


----------



## ranillo

Quote:


> Originally Posted by *khanmein*
> 
> try asus one on evga is way better.


Thanks, I'll give it a try.


----------



## zipper17

Quote:


> Originally Posted by *MEC-777*
> 
> http://www.3dmark.com/fs/11389532
> 
> That's more like it.
> 
> 
> 
> 
> 
> 
> 
> Just upgraded to a 4770K from an i5-4570. Didn't realize how much the i5 was holding this card back. Pretty much all my gaming benchmarks increased as well, some by significant amounts.


Your 3DMark Firestrike run with the 4770K only improved the physics score, combined test, and total score.

AFAIK the CPU doesn't affect the graphics score in Firestrike, so your 1070 at 19K is probably performing the same as it did with the 4570. Try overclocking the 1070 to reach a 20-21K graphics score.

Yes, in games, going from the i5-4570 to the i7-4770K will definitely raise your minimum framerates. Try overclocking the 4770K to 4.5GHz or beyond to squeeze out even more performance.

You could also upgrade your RAM to 2400MHz or so, which will also help minimum framerates.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> Mine bounces alot, 2101, 2050, 2075, 2025
> 
> 
> 
> 
> 
> 
> 
> damn you gpu boost 3.0


That is a function of the EVGA BIOS hitting the power limit set by the BIOS, not GPU Boost 3.0 as such.

The MSI Gaming and Asus Strix BIOSes do not behave that way. The MSI BIOS needs a heavy 4K load like Firestrike Ultra to exceed 100% power.


----------



## MEC-777

Quote:


> Originally Posted by *zipper17*
> 
> your 3dmark firestrike with 4770k it is only increased the Physic scores, Combined test, & Total Scores.
> 
> AFAIK CPU didn't increase/affect GPU scores in Firestrike, your 1070 at 19k that probably still the same as previous performance with 4570, try overclock the 1070 to reach 20-21k Graphic scores.
> 
> yes in games i5 4570 to i7 4770k, with 4770k your minimum framerates definitely will increased. Try Overclock your 4770k into even 4,5ghz or beyond to squeeze even more performances.
> 
> You can also upgrade your RAM into 2400mhz or so, that will also helps increase minimum framerates.


Yeah, I'm aware of all that.







I just meant that before, with the 4570, my Firestrike scores were well behind what most people were getting with 1070s (just over 13K overall). It's nice to be up with the majority of scores, is all.

It's a Founders Edition 1070, so overclocking is limited. I don't think I can hit 20-21K in graphics, because the score you saw was already with my max stable OC. It's plenty fast enough anyway; it runs any game maxed at 1080p, which is all I need it to do.









I'm still experimenting with overclocking the 4770k though. I'm not familiar with OCing CPUs yet.


----------



## skupples

Spoiler: Warning: precision x still sucks!



dl afterburner


----------



## syl1979

Quote:


> Originally Posted by *MEC-777*
> 
> Yeah, I'm aware of all that.
> 
> 
> 
> 
> 
> 
> 
> I just meant that before with the 4570, my firestrike scores were well behind what most people were getting with 1070's (was just over 13k overall score). It's nice to be up in the scores with the majority is all.
> 
> It's a founders edition 1070, so limited overclocking. Don't think I can hit 20-21k in graphics because the score you saw was with my max stable OC already. It plenty fast enough anyways. Runs any game maxed at 1080 which is all I need it to do.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm still experimenting with overclocking the 4770k though. I'm not familiar with OCing CPUs yet.


For the 1070 FE, overclocking + undervolting should work well. 2000MHz can be reached at 1.025v (or even lower), reducing power draw and temperature.


----------



## MEC-777

Quote:


> Originally Posted by *syl1979*
> 
> For the 1070FE, overclocking + undervolting should work well. 2000mhz can be reached at 1.025v (even lower), while reducing power need and temperature.


Yeah, I have mine running at around 2000MHz core and 4500 memory, perfectly stable.

I know how to increase the voltage, but how do you undervolt below stock? I'm using Precision XOC.


----------



## syl1979

Quote:


> Originally Posted by *MEC-777*
> 
> Yeah. I have mine running at around 2000mhz core and 4500 memory perfectly stable.
> 
> I know how to increase the voltage, but how do you undervolt lower than stock? I'm using precision XOC.


If it's the same as Afterburner, use the curve function. Set your desired working point (for instance 2025MHz @ 1.0125v), and for every voltage point above that one, also set 2025MHz or lower (in Afterburner you only need to set the first point if it's higher than the max stock frequency).


----------



## RaleighStClair

I have a MSI Seahawk X 1070 and I seem to hit the pwr limit in games (according to Afterburner overlay) and apparently 110% Power Limit is the max for this model (on this BIOS), is there another BIOS I could flash to raise that limit to 120-130%?

I never hit over 50C @ 2076mhz/500mem.

Thanks.


----------



## gtbtk

Quote:


> Originally Posted by *RaleighStClair*
> 
> I have a MSI Seahawk X 1070 and I seem to hit the pwr limit in games (according to Afterburner overlay) and apparently 110% Power Limit is the max for this model (on this BIOS), is there another BIOS I could flash to raise that limit to 120-130%?
> 
> I never hit over 50C @ 2076mhz/500mem.
> 
> Thanks.


You need to realize that the power slider values are not absolute wattages but a percentage of the board's power limit. The important number to pay attention to is the card's actual power draw, which you can monitor with HWiNFO64.

The Seahawk uses a reference board and I think is limited to 180W. You could try the Asus Strix OC BIOS, which seems well balanced: an 8-pin power input, a 200W power limit, and a slider that goes to 120%. You could also try the MSI Gaming X BIOS (keep an eye on your VRM temperatures, though). That card is 6+8 pin, but the extra connector is mostly for show and marketing; the card never draws anywhere near the maximum possible. The most I have managed is about 230W under a 4K load, and the reference VRM has a max capacity of 250W. Its power limit slider goes to 126%.


----------



## MEC-777

Quote:


> Originally Posted by *syl1979*
> 
> If same as afterburner use the curv function. Set your desired working point (for instance V=1.0125v @ 2025mhz), and for points higher than V set also 2025 or lower (in afterburner you need only to set the first point if higher than max stock frequency)


How is that undervolting, though? What you described is just overclocking while leaving the voltage at stock. Unless that's what you meant? Sorry, I don't quite understand.


----------



## syl1979

Quote:


> Originally Posted by *MEC-777*
> 
> How is that undervolting though? What you described is just overclocking but leaving the voltage stock. Unless that's what you meant? Sorry, I don't quite understand.


It seems the GPU will apply the lowest voltage defined for a given frequency in the curve (unless you apply overvolting in software). By setting the same frequency from your desired voltage point upward, the GPU will remain at that voltage unless it reaches the power limit.

The stock voltage should be 1.06v (the highest voltage seen at stock).


----------



## XRogerX

Quote:


> Originally Posted by *RaleighStClair*
> 
> I have a MSI Seahawk X 1070 and I seem to hit the pwr limit in games (according to Afterburner overlay) and apparently 110% Power Limit is the max for this model (on this BIOS), is there another BIOS I could flash to raise that limit to 120-130%?
> 
> I never hit over 50C @ 2076mhz/500mem.
> 
> Thanks.


I got an EVGA 1070 SC Black Edition and I can get over 2100, and my power limit is a little higher than yours at 112%. When I raise my voltage to the max I get over 2100, 2126 or something like that. GPU temps are around 46C, but of course that's on water.


----------



## MEC-777

Quote:


> Originally Posted by *syl1979*
> 
> It seems the gpu will apply the lowest voltage defined for a given frequency in the curv (unless you apply overvolting in software). By setting same frequency starting from one desired voltage, the gpu will remains at that voltage unless it reaches power limit.
> 
> The stock voltage should be 1.06v (highset voltage seen on stock).


The curve represents an offset at the various voltage points though. +150 at 1.06, for example. We can't change the frequency, only the target offset and GPUboost 3.0 still decides if conditions (temps, power limit etc.) are ideal to allow the target offset to be achieved and what the actual frequency it runs at will be.

So forgive me, but I'm still not understanding what you're saying. Because you can't set a frequency with these GPUs, only a target offset. Are you basically saying to set one point on the curve (1.06v) at whatever offset I choose (say +150 for example) and leave all the other points at stock values?


----------



## syl1979

Quote:


> Originally Posted by *MEC-777*
> 
> The curve represents an offset at the various voltage points though. +150 at 1.06, for example. We can't change the frequency, only the target offset and GPUboost 3.0 still decides if conditions (temps, power limit etc.) are ideal to allow the target offset to be achieved and what the actual frequency it runs at will be.
> 
> So forgive me, but I'm still not understanding what you're saying. Because you can't set a frequency with these GPUs, only a target offset. Are you basically saying to set one point on the curve (1.06v) at whatever offset I choose (say +150 for example) and leave all the other points at stock values?



In Afterburner, the curve represents the target voltage/frequency points.

A picture may make it easier to understand: undervolt to 1.012v at 2050MHz max (the curve is flat starting from 1.012v).
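The flattening syl1979 describes can be sketched like this. The voltage/frequency pairs are invented stand-ins for a real Afterburner curve, not actual values; the point is only the construction (clamp every point at or above the target voltage to the target clock) and why the card then settles at the target voltage:

```python
# Sketch of the flat-curve undervolt: pick a target point (here 2050MHz at
# 1.012v) and clamp every point at or above that voltage to the target
# frequency. Since GPU Boost uses the lowest voltage that delivers the
# requested clock, the card never needs to climb above the target voltage.
# All voltage/frequency pairs below are illustrative, not real curve data.

TARGET_V, TARGET_MHZ = 1.012, 2050

# (voltage, stock boost clock) pairs -- made-up stand-ins for the AB curve
stock_curve = [(0.950, 1898), (1.012, 1962), (1.043, 1987),
               (1.062, 1999), (1.093, 2025)]

flat_curve = [
    (v, TARGET_MHZ if v >= TARGET_V else mhz)   # flatten from the target up
    for v, mhz in stock_curve
]

# Lowest voltage offering the target clock -- where the GPU should settle
settled_v = min(v for v, mhz in flat_curve if mhz >= TARGET_MHZ)
print(settled_v)  # 1.012
```

Because every higher-voltage point offers no extra frequency, boosting past 1.012v buys nothing, which is what keeps the card at the undervolted point until it hits the power limit.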


----------



## MEC-777

Quote:


> Originally Posted by *syl1979*
> 
> In Afterburner the curv represents the target voltage / frequency points.
> 
> With a picture may be easier to understand : Undervolt 1.012v 2050 Mhz max ( curv is flat starting 1.012v)


Ok, now I understand. Precision XOC does the graph/curve differently. I already have Afterburner installed, so I'll give this a try. Afterburner looks like it has a much more granular and accurate curve tool than Precision.

This is my current curve in Precision:



Each of those points on the green line are +150 offset.


----------



## ErrorFile

Got my Palit GTX 1070 Game Rock today. My heart stopped at the post office when I went to pick it up... There was a new guy working and handing out all the packages for customers. The package slipped from his hands and I was ready to lose all hope - but the guy was fast and caught the package mid-air, no damage caused at all.

The card seems to be a really good choice: 63C while playing Assetto Corsa, and I simply couldn't tell whether the fans were even spinning. They were at 1187 RPM, but my whole PC is simply impossible for me to hear, and I do have somewhat oversensitive hearing. My 980 STRIX was a bit loud when its fans were spinning. Couldn't be happier with my new GPU!


----------



## RyanRazer

Quote:


> Originally Posted by *ErrorFile*
> 
> Got my Palit GTX 1070 Game Rock today. My heart stopped at the post office when I went to pick it up... There was a new guy working and handing out all the packages for customers. The package slipped from his hands and I was ready to lose all hope - but the guy was fast and caught the package mid-air, no damage caused at all.
> 
> The card seems to be a really good choice, 63c while playing Assetto Corsa and I simply couldn't hear if the card was even making the fans spin. They had spun at 1187/rpm, but simply impossible for me to hear my whole PC. And I do have a bit oversensitive hearing. My 980 STRIX was a bit loud when the fans were spinning. Couldn't be happier with my new GPU!


That "phieew" moment







congratz mate


----------



## kalidae

Hey guys, I picked up a Strix 1070 today. It's the first NVIDIA card I have owned since the 9800, and that card was faulty, so I took it back, got a 4870, and have used AMD cards ever since. I haven't had a chance to play with the 1070 yet due to work, but I love it so far. Looks like overclocking is very different to the AMD cards as well. Good times ahead


----------



## asdkj1740

Quote:


> Originally Posted by *ErrorFile*
> 
> Got my Palit GTX 1070 Game Rock today. My heart stopped at the post office when I went to pick it up... There was a new guy working and handing out all the packages for customers. The package slipped from his hands and I was ready to lose all hope - but the guy was fast and caught the package mid-air, no damage caused at all.
> 
> The card seems to be a really good choice, 63c while playing Assetto Corsa and I simply couldn't hear if the card was even making the fans spin. They had spun at 1187/rpm, but simply impossible for me to hear my whole PC. And I do have a bit oversensitive hearing. My 980 STRIX was a bit loud when the fans were spinning. Couldn't be happier with my new GPU!


Update to the latest BIOS if yours is a Micron card on an old BIOS; check it in GPU-Z.
Cards with Samsung VRAM should update the BIOS too, as it is a higher-power BIOS with a fan curve tuned to avoid the fans repeatedly switching on and off.

I'd personally suggest flashing the GameRock Premium BIOS, as the latest GameRock Premium BIOS has an even higher power limit than the latest regular GameRock BIOS. The latest BIOSes are compatible with Micron as well as Samsung VRAM cards.

In my opinion this is the best card at a competitive price. It is a shame that Palit offers only a two-year warranty.


----------



## ErrorFile

Quote:


> Originally Posted by *asdkj1740*
> 
> update to latest bios if yours is micron with old bios. check it on gpuz.
> samsung vrams' cards should update the bios too as it is a higher power bios with fan curve tuned for avoiding fans repeatedly on and off problem.
> 
> i personally suggest you to flash the gamerock premium bios as the latest gamerock premium bios has even higher power limit than latest gamerock bios. the latest bios are compatible for micron as well as samsung vram card.
> 
> in my opinion this is the best card with competitive pricing. it is shame that palit offers only two years warranty.


Yeah, it seems to be the Micron one... Do I really have to flash a newer BIOS if I don't have any problems with this card? It's really damn quiet. I played games for hours last night, and in a silent room (with game sounds even muted) my PC made no extra noise compared to idling. I'm truly amazed at how this card seems to be totally silent, though surely the Mini C's dampening materials help a bit.


----------



## asdkj1740

Quote:


> Originally Posted by *ErrorFile*
> 
> Yeah, it seems to be the Micron-one... Do I really have to flash a newer BIOS if I don't have any problems with this card? It's really damn quiet. I played games for hours last night and in a silent room (even muted game sounds), my PC wouldn't make any extra noise compared to idling. Truly amazed how this card seems to be totally silent, but surely the Mini C's dampening materials help a bit.


Oh sorry, I mixed up the GameRock BIOS with the JetStream BIOS.
The old GameRock BIOS already comes with 195 W and the latest one is the same 195 W; there is no power-limit change between the old and new GameRock BIOSes.
So if the fans don't misbehave at idle and the Micron problem doesn't trouble you, then you don't need to update the BIOS.


----------



## ErrorFile

Quote:


> Originally Posted by *asdkj1740*
> 
> oh sorry i messed up the gamerock bios with jetstream bios.
> gamerock old bios has already come with 195w and the latest one is the same as 195w. there is no power limit change on the old and new gamerock bios.
> so if the fans dont behave wrong at idle and the micron problem does not trouble you then you do not need to update the bios.


It's okay.







Yeah, I'll just keep using it as it is, all seems to be just fine. Thanks for the blazingly fast reply, though.


----------



## kalidae

Does the max core clock and memory clock look right? I thought I'd be able to slide the bars further than that.


----------



## MEC-777

Quote:


> Originally Posted by *kalidae*
> 
> 
> 
> Does the max core clock and memory clock look right? I thought I'd be able to slide the bars further than that.


I don't recommend using Asus GPU Tweak. I don't have much experience with it, but I know MSI Afterburner will give you more/better control. You should definitely be able to target higher core and memory clocks than that. I'm running around 2000 MHz core and 9000 MHz memory on my Founders 1070.

What card do you have?


----------



## RogueVariable

Hi, all. I've ghosted these forums for years, but I'm not one to rehash old subjects & tend to only create an account and post if I feel there's something important to be shared.

Out of all the forum threads on the internet, this one seems to be filled with the most knowledge shared by the most knowledgeable people, so I'm going to post this here, especially since more than a couple of people in here like to experiment with cross-flashing BIOS images.

I received a Gigabyte GTX 1070 Xtreme on December 9, 2016. For those who care, it came with Samsung VRAM. I was originally going to order mid-November from B&H Photo&Video, but they went out of stock before I could pull the trigger. They were without stock for 2+ weeks before coming back in stock on December 6. I ordered as soon as the in-stock notification email hit my mailbox.

I post the preceding so it's known that Samsung samples are still alive and well out there & it wasn't old stock sitting around on a store shelf that I received.

(Yes, I know the Micron timing issue has been resolved, but I have read the entire thread (Yes, every post) so I know some people are still keeping track of this.)

Now that prologue is out of the way, I'm here to say that my video card is different from all the other Gigabyte Xtreme 1070 cards in this thread. Gigabyte must have silently revised their design.

Everywhere I see reference that this video card comes with 8+6 power pin connectors on the PCB. My card has 8+8 power pin connectors. If one visits Gigabyte's Xtreme Gaming web site and views the product pics an 8+6 pin configuration is clearly shown & listed in the specs. Imagine my surprise when I went to install it.

Also, lots of user reviews have expressed disappointment in the fan shroud in regards to what is usually referred to as flimsy plastic. My fan shroud is metal. Thick, heavy, unbendable metal.

It's a true shame that Gigabyte cut so many corners on the Gaming G1 model as it seems to have colored and tarnished their entire line-up of cards this generation. Words cannot describe what went through my mind as I unpackaged it. Closest thing I can think of is the opening sequence of 2001: A Space Odyssey when the monolith first shows up in front of the primitives. It's truly a work of art & industrial design. I've seen and experienced a lot of video cards & hardware in general since I got my first computer in 1983. This card takes the cake. No pictures can ever do it justice and convey its awesomeness.

Noise was a very valid concern, what with rumors of Nvidia coil whine, and this monster has no less than *three* 100mm fans. After I installed it I left the PC case open and booted into Windows so I could jack the fan speed up to 100%. Let me tell you, in my quiet computer room it was a whisper! A soft whisper, I tell you. I had to put my ear down by the PC case to barely hear it. And since I really haven't had the inclination to upgrade my PC for years, I could only compare it to my last video card (an XFX Radeon HD 5850), and of course there is only one explanation for such behavior... sorcery and witchcraft. And once I put the side back on my Fractal Design R5, even when the card is going full bore one cannot even tell the computer is turned on. At all. Complete silence.

Sweet witchcraft indeed.

Sorry to ramble...short version is be aware that there's a new hardware revision of the Gigabyte GTX Xtreme Gaming out in the wild. BIOS cross-flashing cowboys, let's be safe out there.


----------



## gtbtk

Quote:


> Originally Posted by *ErrorFile*
> 
> Quote:
> 
> 
> 
> Originally Posted by *asdkj1740*
> 
> oh sorry i messed up the gamerock bios with jetstream bios.
> gamerock old bios has already come with 195w and the latest one is the same as 195w. there is no power limit change on the old and new gamerock bios.
> so if the fans dont behave wrong at idle and the micron problem does not trouble you then you do not need to update the bios.
> 
> 
> 
> It's okay.
> 
> 
> 
> 
> 
> 
> 
> Yeah, I'll just keep using it as it is, all seems to be just fine. Thanks for the blazingly fast reply, though.
Click to expand...

Just check your BIOS version in GPU-Z.

If it is 86.04.50.00.xx then you don't need the update. If it is running an 86.04.26.00.xx or 86.04.3B.00.xx BIOS, there is a bug with the memory controller that will cause BSODs and artifacts if you overclock the VRAM too much. The .50 BIOS also helps with general card performance, so it is worth doing, and it's very easy to do as the distribution utility is completely automated.
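
For anyone scripting this check, the version rule above can be sketched in a few lines of Python. The helper name and the build-number parsing are my own illustration; the only facts taken from the post are that 86.04.26.xx and 86.04.3B.xx builds carry the memory-controller bug while 86.04.50.xx does not.

```python
# Hypothetical helper: classify a Palit 1070 BIOS version string
# (the string is whatever GPU-Z reports in its "BIOS Version" field).

BUGGY_BUILDS = {"26", "3B"}  # builds known to have the VRAM-OC bug
FIXED_BUILD = 0x50           # assume builds .50 and later are fixed

def needs_bios_update(version: str) -> bool:
    """Return True if this BIOS build predates the memory-controller fix."""
    parts = version.split(".")           # e.g. "86.04.26.00.29"
    build = parts[2].upper()             # third field is the build number
    if build in BUGGY_BUILDS:
        return True
    return int(build, 16) < FIXED_BUILD  # anything older than .50

print(needs_bios_update("86.04.26.00.29"))  # True  -> update
print(needs_bios_update("86.04.50.00.44"))  # False -> already fixed
```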


----------



## ErrorFile

Quote:


> Originally Posted by *gtbtk*
> 
> Just Check your Bios Version in GPU-Z.
> 
> If it is 86.04.50.00.xx then you don't need the update. If it is running 86.04.26.00.xx or 86.04.3B.00.xx bios then there is a bug with the memory controller that will cause BSOD and artifacts if you try overclocking the vram too much. the .50 bios also helps with general card performance so it is worth doing and very easy to do as the distribution utility is completely automated .


It has 86.04.26.00.29. Okay, I'll take a look at the procedure.










EDIT: Now I'm on the .50 BIOS, thanks for the tip! It was very easy to do.


----------



## kalidae

Quote:


> Originally Posted by *MEC-777*
> 
> I don't recommend using Asus GPU tweak. I don't really have any experience with it, but I know MSI Afterburner will probably give you more/better control. Should definitely be able to target higher core and memory clocks than that. I'm running around 2000mhz core and 9000 memory on my Founders 1070.
> 
> What card do you have?


I have the Strix OC. I had a look at MSI Afterburner and I can't increase the voltage at all, even though I have it ticked in the settings; there is also no power limit slider. Is that right?


----------



## dboythagr8

Thinking of getting a 1070 to hold me over until the release of the 1080ti or later. What's the best model? Are the EVGA boards still having problems?


----------



## gtbtk

Quote:


> Originally Posted by *RogueVariable*
> 
> Hi, all. I've ghosted these forums for years, but I'm not one to rehash old subjects & tend to only create an account and post if I feel there's something important to be shared.
> 
> Out of all the forum threads on the internet, this one seems to be filled with the most knowledge shared by the most knowledgeable people so I'm going to post this here, especially since there's more than a couple people in here who like to experiment with cross-posting BIOS images.
> 
> I received a Gigabyte GTX 1070 Xtreme on December 9, 2016. For those who care, it came with Samsung VRAM. I was originally going to order mid-November from B&H Photo&Video, but they went out of stock before I could pull the trigger. They were without stock for 2+ weeks before coming back in stock on December 6. I ordered as soon as the in-stock notification email hit my mailbox.
> 
> I post the preceding so it's known that Samsung samples are still alive and well out there & it wasn't old stock sitting around on a store shelf that I received.
> 
> (Yes, I know the Micron timing issue has been resolved, but I have read the entire thread (Yes, every post) so I know some people are still keeping track of this.)
> 
> Now that prologue is out of the way, I'm here to say that my video card is different from all the other Gigabyte Xtreme 1070 cards in this thread. Gigabyte must have silently revised their design.
> 
> Everywhere I see reference that this video card comes with 8+6 power pin connectors on the PCB. My card has 8+8 power pin connectors. If one visits Gigabyte's Xtreme Gaming web site and views the product pics an 8+6 pin configuration is clearly shown & listed in the specs. Imagine my surprise when I went to install it.
> 
> Also, lots of user reviews have expressed disappointment in the fan shroud in regards to what is usually referred to as flimsy plastic. My fan shroud is metal. Thick, heavy, unbendable metal.
> 
> It's a true shame that Gigabyte cut so many corners on the Gaming G1 model as it seems to have colored and tarnished their entire line-up of cards this generation. Words cannot describe what went through my mind as I unpackaged it. Closest thing I can think of is the opening sequence of 2001: A Space Odyssey when the monolith first shows up in front of the primitives. It's truly a work of art & industrial design. I've seen and experienced a lot of video cards & hardware in general since I got my first computer in 1983. This card takes the cake. No pictures can ever do it justice and convey its awesomeness.
> 
> Noise was a very valid concern what with rumors of Nvidia coil whine & this monster has no less than *three* 100mm fans. After I installed it I left the PC case open and booted into Windows so I could jack the fan speed up to 100%. Let me tell you, in my quiet computer room it was a whisper! A soft whisper I tell you. I had to put my ear down by the PC case to barely hear it. And since I really haven't had the inclination to upgrade my PC for years I could only compare it to my last video card (an XFX Radeon HD 5850) and of course is only one explanation for such behavior...sorcery and witchcraft. And once I put the side back on my Fractal Design R5 even when the card is going full bore one can not even tell the computer is turned on. At all. Complete silence.
> 
> Sweet witchcraft indeed.
> 
> Sorry to ramble...short version is be aware that there's a new hardware revision of the Gigabyte GTX Xtreme Gaming out in the wild. BIOS cross-flashing cowboys, let's be safe out there.


The Giga Xtreme has always been 8+8 and has a many-phase VRM that spreads the load, which certainly helps with coil whine. I have not heard of the Xtreme card being criticized for lacking quality. I think I saw one YouTube video with a broken shroud because the card had been dropped, but that reflects mishandling, not build quality. The G1 certainly has a reputation for fan blades fouling on the heatsink and making noise, but that is at the complete opposite end of the market.

The MSI Gaming and some of the Galax cards are the only 6+8 cards on the market that I can think of off the top of my head.

If you try overclocking your Xtreme, just remember that the core is already overclocked to 1671 MHz, about +165 compared to the Founders card, so there is not likely to be much headroom left. The memory is also already overclocked by about +150 when you get it, so take that into consideration when comparing your values to other people's here on the forum (you will probably be OK adding 450 or 550 in Afterburner, which is equivalent to 600-800 for cards with stock VRAM clocks).
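
That offset bookkeeping can be sketched in Python. This is a rough illustration only: the Founders reference clocks and the Xtreme factory memory clock below are my assumptions, reconstructed from the roughly +165 core / +150 memory figures in the post.

```python
# Normalize a factory-overclocked card's Afterburner offset against
# reference clocks so it can be compared with stock-clocked cards.
# All MHz figures are assumptions taken from the post, not authoritative.

REFERENCE = {"core": 1506, "mem": 4004}  # assumed Founders base clocks (MHz)
FACTORY   = {"core": 1671, "mem": 4154}  # assumed Xtreme factory clocks (MHz)

def effective_offset(domain: str, ab_offset: int) -> int:
    """Afterburner offset plus the factory OC, relative to reference."""
    return (FACTORY[domain] - REFERENCE[domain]) + ab_offset

# +500 memory in Afterburner on the Xtreme is comparable to roughly
# +650 on a stock-clocked card:
print(effective_offset("core", 100))  # 265
print(effective_offset("mem", 500))   # 650
```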


----------



## gtbtk

Quote:


> Originally Posted by *dboythagr8*
> 
> Thinking of getting a 1070 to hold me over until the release of the 1080ti or later. What's the best model? Are the EVGA boards still having problems?


Define "best".

The reality is that there is not a huge difference between the lowest and highest 1070 models.

EVGA's dramas have been "resolved", but they are coming out with a new cooler design, which suggests they either took another look and discovered design issues, or are just changing the cosmetics to make the perceived problem go away. I have found that the EVGA BIOSes are set to hit the power limits under fairly low loads, so they will downclock themselves even under a 1080p load. An MSI Gaming card will only be pulling 85% with the same load, so it ends up with more stable clock speeds.

Fastest card out of the box is the Palit GameRock/Gainward GLH, but it is a 3-slot card.

The Zotac AMP Extreme is also a great 3-slot card.

MSI Gaming and Asus Strix are both great mainstream cards.

Having said that, if the card is only a temporary thing, you should probably look at the second-tier ranges such as the MSI Armor or Asus Dual. You will save some money and the performance is almost the same as the top tier. Cross-flash your cheaper model with, say, a Strix BIOS and you get the same performance as the more expensive card without the cost.


----------



## gtbtk

Quote:


> Originally Posted by *ErrorFile*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Just Check your Bios Version in GPU-Z.
> 
> If it is 86.04.50.00.xx then you don't need the update. If it is running 86.04.26.00.xx or 86.04.3B.00.xx bios then there is a bug with the memory controller that will cause BSOD and artifacts if you try overclocking the vram too much. the .50 bios also helps with general card performance so it is worth doing and very easy to do as the distribution utility is completely automated .
> 
> 
> 
> It has 86.04.26.00.29. Okay, I'll take a look at the procedure.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Now I'm on the .50 BIOS, thanks for the tip! It was very easy to do.
Click to expand...

Good.

Now you can overclock the memory to 9100-9200 MHz, get more FPS, and it won't crash your PC.
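
As a sanity check on those numbers, here is a small sketch assuming the usual GTX 1070 GDDR5 convention (an assumption on my part, not stated in the post): Afterburner displays half the effective data rate, so 4004 MHz shown at stock corresponds to 8008 MHz effective.

```python
# Convert between Afterburner memory offsets and effective GDDR5 speed,
# assuming Afterburner shows half the effective data rate on the 1070.

STOCK_AB_CLOCK = 4004  # MHz shown in Afterburner at stock (assumed)

def effective_mhz(ab_offset: int) -> int:
    """Effective memory speed for a given Afterburner offset."""
    return 2 * (STOCK_AB_CLOCK + ab_offset)

def offset_for(target_effective: int) -> int:
    """Afterburner offset needed to reach a target effective speed."""
    return target_effective // 2 - STOCK_AB_CLOCK

print(effective_mhz(550))  # 9108 -> inside the 9100-9200 range
print(offset_for(9200))    # 596
```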


----------



## RogueVariable

Quote:


> Giga Extreme has always been 8+8 and has a many phase VRM that spreads the load and certainly helps with coil whine. I have not heard the the extreme card being criticized for lacking in quality. I think I saw one youtube video that had a broken shrouid because the card had been dropped but that doesnt reflect on quality, more on mishandling. The G1 certainly has a reputation for fan blades fouling on the heat sync and making noise but that is at the complete opposite end of the market


The official product page states 8+6 and numerous posters in this thread have repeated the misinformation. Heck, official product shots have shown an 8+6 arrangement. Just doing my part to clear things up.
Quote:


> I think I saw one youtube video that had a broken shrouid because the card had been dropped but that doesnt reflect on quality, more on mishandling.


I haven't seen any YouTube videos. Only some early reviews on Amazon and NewEgg. Not sure if early versions were really that bad or someone just posted their review under the wrong model number. Again, just trying to make sure people know what the card is really like.
Quote:


> If you try overclocking your xtreme, just remember that the Core is already overclocked to 1671Mhz which is already about +165 overclock compared to the founders card so there is not likely to have too much headroom left and the memory is already overclocked by about +150 when you get it so remember to take that into consideration when comparing your values (you will probably be ok to add 450 or 550 in afterburner which is equivalent to 600-800 for cards with stock Vram clocks) to other people here on the forum


The original plan was to just buy the card, assume I was getting Micron RAM and just let the built-in OC profiles do their job. Curiosity got the better of me so I ran GPU-Z and found I had Samsung VRAM. Loaded Afterburner and did a 'lazy' overclock of 200 on the core and 500 on the memory. Heaven, Valley, Firestrike, TimeSpy, Firestrike Stress Test, and the RotTR benchmark showed no artifacts or crashes so I'm calling it good.

As for coil whine, knock wood, I've had a bajillion video cards in the last 30-something years and haven't had a card with coil whine yet....hope my streak continues.

BTW...lots of respect for you, gtbtk. You really went above and beyond to do your part during the Micron 'incident'....and the part you're playing in this thread keeping the "knowledgebase" accurate.
I owe my success with my card to people like you who jumped into the trenches first, because without people willing to be guinea pigs and share their experiences, those of us who hang back and research our purchases to death before jumping in would have no data to base our decisions on.


----------



## dboythagr8

Quote:


> Originally Posted by *gtbtk*
> 
> define best.
> 
> The reality is that there is not a huge difference between the lowest and highest 1070 models
> 
> EVGAs dramas have been "resolved" but they are coming out with a new cooler design so that suggests that they took another look and discovered design issues or they are just changing the decore to make the perceived problem go away. I have found that the EVGA bioses are set to hit the power limits under fairly low loads so they will down clock themselves even under a 1080p load. An MSI Gaming card will only be pulling 85% with the same load so it end up wil more stable clock speeds
> 
> Fastest card out of the box is the Palit Gamerock/Gainward GLH but it is a 3 slot card.
> 
> Zotac Amp Extreme is also a great 3 slot card.
> 
> MSI Gaming and Asus strix are both great mainstream cards.
> 
> Having said that, If the card is only a temporary thing, you should probably look at the second tier ranges such as the MSI Armor or Asus Dual. You will save some money and the performance is almost the same as the top tier. Cross flash your cheaper model with say a Strix bios and you get the same performance as the more expensive card without the cost


Thanks. Since this will be a temporary card, I'm even thinking of going with a 1060 until the 1080ti or later. Whatever card I get will become the backup to the newer card.


----------



## RyanRazer

Quote:


> Originally Posted by *RogueVariable*
> 
> The official product page states 8+6 and numerous posters in this thread have repeated the misinformation. Heck, official product shots have shown an 8+6 arrangement. Just doing my part to clear things up....


Correct. http://www.gigabyte.com/products/product-page.aspx?pid=5921#sp

Woot :O


----------



## zipper17

Quote:


> Originally Posted by *kalidae*
> 
> I have the strixx oc. I had a look at MSI afterburner and I can't increase the voltage at all, even though I have it ticked in the settings, there is also no power limit slider. Is that right?


You probably need to download the latest full MSI AB 4.3.0 version; your version is 4.1.1.
https://www.msi.com/page/afterburner

Usually you need to go to Settings --> General tab --> unlock voltage control.


----------



## khanmein

Quote:


> Originally Posted by *dboythagr8*
> 
> Thinking of getting a 1070 to hold me over until the release of the 1080ti or later. What's the best model? Are the EVGA boards still having problems?


The new EVGA iCX cooler comes with a slightly lower base core clock compared with ACX 3.0.

I'm currently using the EVGA SC version; so far it's still good, whisper quiet, and the fans never spin above 1K RPM during intensive gaming.

The best model for Pascal is the ASUS Strix, but my suggestion is to grab the cheapest GTX 1070 available where you are.

FYI, an EVGA card that ships with the latest BIOS most likely has the new thermal pads applied.

P.S. If going by Amazon US prices, then I suggest this:

https://www.amazon.com/PNY-GeForce-Overclocked-Graphic-VCGGTX10708XGPB-OC/dp/B01JOF81BG/ref=sr_1_6?s=pc&ie=UTF8&qid=1484884073&sr=1-6&keywords=gtx+1070


----------



## XRogerX

This is the one I have, without the backplate; that didn't matter to me as I went water cooling:

https://www.newegg.com/Product/Product.aspx?item=N82E16814487265

But if you want the backplate it will cost you $30 more, which is about the right price for a backplate:

https://www.newegg.com/Product/Product.aspx?item=N82E16814487248

That's if you want to go with EVGA.

Just my 2 cents.


----------



## MEC-777

Quote:


> Originally Posted by *kalidae*
> 
> I have the strixx oc. I had a look at MSI afterburner and I can't increase the voltage at all, even though I have it ticked in the settings, there is also no power limit slider. Is that right?


Yeah, you're missing a lot of options. I would suggest uninstalling it (completely, don't save settings when it prompts you), then download the latest version and reinstall.









Just today I had to nuke afterburner, and DDU the drivers, then reinstall geforce drivers and reinstall afterburner because it was doing strange things. Sometimes you just have to start clean.


----------



## dboythagr8

Quote:


> Originally Posted by *XRogerX*
> 
> this is the one i have without the back plate , so it didn't matter to me as i went water cooling
> 
> https://www.newegg.com/Product/Product.aspx?item=N82E16814487265
> 
> but if you was to want the back plate it would cost you $30 more and that's about right price for a back plate
> 
> https://www.newegg.com/Product/Product.aspx?item=N82E16814487248
> 
> that's if you want to go with evga
> 
> just my 2cents


My main concern about EVGA is the thermal pad issue. Is there any way to know whether a given card already has the fix in place?


----------



## gtbtk

Quote:


> Originally Posted by *RogueVariable*
> 
> Quote:
> 
> 
> 
> Giga Extreme has always been 8+8 and has a many phase VRM that spreads the load and certainly helps with coil whine. I have not heard the the extreme card being criticized for lacking in quality. I think I saw one youtube video that had a broken shrouid because the card had been dropped but that doesnt reflect on quality, more on mishandling. The G1 certainly has a reputation for fan blades fouling on the heat sync and making noise but that is at the complete opposite end of the market
> 
> 
> 
> The official product page states 8+6 and numerous posters in this thread have repeated the misinformation. Heck, official product shots have shown an 8+6 arrangement. Just doing my part to clear things up.
> Quote:
> 
> 
> 
> I think I saw one youtube video that had a broken shrouid because the card had been dropped but that doesnt reflect on quality, more on mishandling.
> 
> Click to expand...
> 
> I haven't seen any YouTube videos. Only some early reviews on Amazon and NewEgg. Not sure if early versions were really that bad or someone just posted their review under the wrong model number. Again, just trying to make sure people know what the card is really like.
> Quote:
> 
> 
> 
> If you try overclocking your xtreme, just remember that the Core is already overclocked to 1671Mhz which is already about +165 overclock compared to the founders card so there is not likely to have too much headroom left and the memory is already overclocked by about +150 when you get it so remember to take that into consideration when comparing your values (you will probably be ok to add 450 or 550 in afterburner which is equivalent to 600-800 for cards with stock Vram clocks) to other people here on the forum
> 
> Click to expand...
> 
> The original plan was to just buy the card, assume I was getting Micron RAM and just let the built-in OC profiles do their job. Curiosity got the better of me so I ran GPU-Z and found I had Samsung VRAM. Loaded Afterburner and did a 'lazy' overclock of 200 on the core and 500 on the memory. Heaven, Valley, Firestrike, TimeSpy, Firestrike Stress Test, and the RotTR benchmark showed no artifacts or crashes so I'm calling it good.
> 
> As for coil whine, knock wood, I've had a bajillion video cards in the last 30-something years and haven't had a card with coil whine yet....hope my streak continues.
> 
> BTW...lots of respect for you, gtbtk. You really went above and beyond to do your part during the Micron 'incident'....and the part you're playing in this thread keeping the "knowledgebase" accurate.
> I owe my success with my card to people like you who jumped in the trenches first because without people willing to be guinea pigs and willing to share your experiences, those of us who hang back and research our purchases to death before jumping in would have no data to base our decisions on.
Click to expand...

You are welcome.

I just took another look at a TechPowerUp review and indeed you are right: it shows 8+6. I am sure I looked it up at one stage and saw 8+8; maybe wherever I looked was showing a 1080 and claiming it was a 1070? The PCBs are pretty much identical; the only difference is an extra 1.8 V VRM for the GDDR5X memory on the 1080 that is not needed on the 1070 board. Maybe the factory that made your card used a bare 1080 PCB with 8+8 in the 1070 production run? Mind you, with a BIOS-rated 240 W power limit, the card could probably get away with a single 8-pin and be fine.

The MSI cards are also 8+6 and have a max BIOS-rated power limit of 291 W, but I have not been able to get mine to pull more than about 230 W before it starts dropping clock speed due to power limits, and even then it takes a 4K Firestrike Ultra load to do that. It does, however, allow the card to retain more stable clocks at 1080p/1440p, pulling only 80-90% of its power limit, compared to, say, an EVGA card that bounces off its power limits the moment you look at it.
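
The recurring point in these posts is that the power slider is a percentage of each BIOS's base power target, not an absolute figure, so the same slider position means different watts on different cards. A minimal sketch of that arithmetic, using the 200 W Strix OC and 180 W reference figures quoted earlier in the thread (treat them as approximate):

```python
# Convert a power-limit slider position into an absolute power target.
# Base wattages are approximate figures quoted in this thread.

def slider_watts(base_watts: float, slider_pct: float) -> float:
    """Absolute power target for a given slider position (percent)."""
    return base_watts * slider_pct / 100.0

# The same 120% slider position on two different BIOSes:
print(slider_watts(200, 120))  # 240.0 W (Strix OC BIOS)
print(slider_watts(180, 120))  # 216.0 W (reference BIOS)
```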


----------



## RaleighStClair

Quote:


> Originally Posted by *gtbtk*
> 
> You need to realize that the power slider values are not absolute microvolts but a percentage value. the important number that you should pay attention to is the card power draw that you can monitor with HWinfo64.
> 
> The Seahawk uses a reference board and I think is limited to 180W. You could try the Asus Strix OC bios that seems to be well balanced with an 8 pin power supply and 200W power limit with a 120% slider. You could also try the MSI Gaming X bios, be aware of your VRM temperatures though as the card is 6+8pin, however, it is mostly for show and marketing. The card never draws anywhere near the maximum possible, the most I have managed is about 230W under a 4K load. The reference VRM has a max capacity of 250W. The Power limit slider goes to 126%.


Thanks. So are these other BIOSes compatible with the MSI Seahawk? Or is it a "let's hope this works..." situation?

I ask because this card doesn't have a BIOS switch.

Thanks.


----------



## gtbtk

Quote:


> Originally Posted by *RaleighStClair*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You need to realize that the power slider values are not absolute microvolts but a percentage value. the important number that you should pay attention to is the card power draw that you can monitor with HWinfo64.
> 
> The Seahawk uses a reference board and I think is limited to 180W. You could try the Asus Strix OC bios that seems to be well balanced with an 8 pin power supply and 200W power limit with a 120% slider. You could also try the MSI Gaming X bios, be aware of your VRM temperatures though as the card is 6+8pin, however, it is mostly for show and marketing. The card never draws anywhere near the maximum possible, the most I have managed is about 230W under a 4K load. The reference VRM has a max capacity of 250W. The Power limit slider goes to 126%.
> 
> 
> 
> Thanks. So are these other BIOS compatible with the MSI Seahawk? Or is it a ''lets hope this works..." situation?
> 
> I ask because this card doesn't have a BIOS switch.
> 
> Thanks.
Click to expand...

The seahawk uses a reference PCB. The voltage controller is the standard controller used on the majority of other boards, that is the part that interfaces the VRM circuitry and manages whoever many phases are available. The Bios only says stop pulling power when you reach X Watts. The BIOS does not know how many power cables are connected to the card.

None of the MSI cards have a BIOS switch. As long as you can boot your PC from another GPU, even the iGPU, it is easy to recover a "bricked" GPU. Even if you made a mistake and flashed a 1080 BIOS to the card, it would not kill the card physically, but it could change settings enough to make the card unbootable. The PCIe/flash storage connection on the card still works fine, so if you boot from your iGPU and flash the stock BIOS back, the card will come back to life. The Galax HOF cards are the only ones I know of that do not use a standard power controller and won't work. The only other gotcha I found was with the Gigabyte Xtreme BIOSes: I had to change the DisplayPort I was using, because DP port 1 on the MSI became inactive (I guess because the Xtreme has extra HDMI ports out the back of the card), but DP port 3 worked fine.

Cross flashing is always experimental in nature, but it will not destroy the hardware, and it is quite straightforward to recover the card if you do mess up. Just be aware of what the power supply environment is for the original card. A Zotac AMP Extreme card, for example, can pull 300W, which is probably too much for the VRMs on the Seahawk to cope with, but even then it is not an issue unless you are running really high loads on the card. At idle it is still within spec, so it will boot and work fine on the Windows desktop. A firmware flash is not really much different from copying a file to a USB thumb drive; the file is simply read by the hardware at boot time to set its config settings.

I have not tested against a Seahawk card; however, the Asus BIOS (also an 8-pin supply) works fine on a Gaming X (as do the EVGA, Gigabyte, Zotac and most of the Galax/KFA2 BIOSes). The VRM you have has a max power limitation of 250W due to the MOSFETs that have been used, so you are within the power hardware limits and there is no reason why it should not work. Even a Gaming X/Z BIOS should work OK in spite of the 8+6 pin power arrangement.

Understand that to do this, you will have to use NVFLASH manually with a -6 flag, because the utility checks the model device IDs of the card against the BIOS file and by default won't flash the card if there is a mismatch. The -6 flag forces the utility to ignore the difference in device IDs and flash the BIOS regardless of what card is in the machine, so you do need to pay attention to what you are doing and keep track of which BIOS files came from which card. Rename BIOS files to identify what card they are from when you obtain them, keep your master copies in a separate directory from the flash utility, and only copy the BIOS you want to try into the flash utility directory as you need it.
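The bookkeeping described above can be sketched in Python. The helper names below are made up for illustration; the nvflash flags (`--save` to back up, `-6` to override the device-ID check) are just the ones mentioned in this post:

```python
def flash_commands(bios_file, backup_name="original_backup.rom"):
    """Return the nvflash command sequence for a cross-flash attempt."""
    return [
        f"nvflash --save {backup_name}",  # always back up the stock BIOS first
        f"nvflash -6 {bios_file}",        # -6 skips the device-ID mismatch check
    ]

def master_copy_name(vendor, model, version):
    """Rename downloaded BIOS files so you can tell which card they came from."""
    return f"{vendor}_{model}_{version}.rom".replace(" ", "-")

# Example: build the commands for a renamed master copy kept in a separate folder.
cmd_list = flash_commands(master_copy_name("Palit", "SuperJetstream", "86.04.50.00.xx"))
```

Run the printed commands from an elevated prompt in the flash utility directory, with only the one BIOS file you intend to try copied in.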


----------



## gtbtk

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kalidae*
> 
> I have the strixx oc. I had a look at MSI afterburner and I can't increase the voltage at all, even though I have it ticked in the settings, there is also no power limit slider. Is that right?
> 
> 
> 
> 
> 
> Yeah, you're missing a lot of options. I would suggest uninstalling it (completely, don't save settings when it prompts you), then download the latest version and reinstall.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just today I had to nuke afterburner, and DDU the drivers, then reinstall geforce drivers and reinstall afterburner because it was doing strange things. Sometimes you just have to start clean.

You need to download Afterburner 4.3. That is version 4.1.1 and it doesn't support Pascal cards


----------



## RaleighStClair

Quote:


> Originally Posted by *gtbtk*
> 
> The Seahawk uses a reference PCB. The voltage controller is the standard controller used on the majority of other boards; it is the part that interfaces with the VRM circuitry and manages however many phases are available. The BIOS only says "stop pulling power when you reach X watts." The BIOS does not know how many power cables are connected to the card.
> 
> ....


Thanks!


----------



## kalidae

Thanks everyone! I ended up working it out. For some reason the version I was on would check for updates but wouldn't find any, so I just assumed I had the latest version. I had a look and sure enough I found 4.3.







It's all working fine now.


----------



## Wyllliam

Hi,
Anybody here feel like putting their 1070s to use for a good cause?
Join the forum folding war!
Team Intel could use some people with 1070s.
For more info, follow the link:

Team Intel FFW


----------



## philhalo66

Any recommendations on replacement fans for my 1070 G1 Gaming? One of them is rattling and driving me insane. I think they're 92mm fans. It only needs to last a few months; I'm going to replace the card with an EVGA 1080, hopefully before July.


----------



## khanmein

Quote:


> Originally Posted by *philhalo66*
> 
> Any recommendations on replacement fans for my 1070 G1 Gaming? One of them is rattling and driving me insane. I think they're 92mm fans. It only needs to last a few months; I'm going to replace the card with an EVGA 1080, hopefully before July.


G1 Gaming replacement fans are easy to find.

They use Power Logic sleeve-bearing fans, not ball bearing.

Alternatively, try the new EVGA iCX or another brand like ASUS, etc.


----------



## Pesodev

*Here some NVFLASH for Asus 1070 Series [64Bit]*









Originally I had the ROG STRIX-GTX1070-8G-GAMING and now it's flashed to the
GTX 1070 Dragon OC.













Guide for NvFlashing:
http://www.overclock.net/t/1523391/easy-nvflash-guide-with-pictures-for-gtx-970-980

Asus GTX 1070 Dragon OC VBIOS ROM and NVFlash, for 64-bit operating systems only:
https://drive.google.com/file/d/0B3Eoti2oFBh4dnRSY1ByRURWR1k/view?usp=sharing



Tested stable at a GPU Boost clock of 1900MHz and a memory clock of 8450MHz.



*FLASH AT YOUR OWN RISK!
I AM NOT RESPONSIBLE FOR ANY DAMAGES THAT MAY OCCUR WHILE ATTEMPTING THIS!*


----------



## MEC-777

Did a little experiment last night. Results were rather interesting, so thought I'd share.









I set up two OC profiles (using AB this time instead of Precision XOC). Both had temp and power limit targets maxed and un-linked, both with voltage % at 0 (stock), and both with memory at +500. The only difference between the two was that on one I set the clock slider to +150, and on the other I did the same but then modified the curve so that below 975mV the offset was basically stock, and anything above 1031mV was no higher than just over 2000MHz (the max stable clocks I can achieve on my card).

Then I ran several passes of the Heaven benchmark and the results were well within the margin of error. They performed virtually the same. The only difference was that the profile using the custom curve ran several degrees cooler, producing less heat, 3-4 degrees cooler on average. So @syl1979 was correct; you can get the same performance and produce less heat by creating a custom offset curve like this.
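For anyone wanting to reproduce this, the curve shape described above can be sketched roughly like so. The point values and thresholds below are illustrative only (Afterburner's actual curve editor is graphical):

```python
def custom_curve(points, offset=150, low_mv=975, high_mv=1031, cap=2000):
    """Apply an offset curve: stock clocks below low_mv, offset clocks in the
    middle band, and offset clocks capped at the max stable value above high_mv.
    `points` is a list of (millivolts, stock_clock_mhz) tuples."""
    out = []
    for mv, clock in points:
        if mv < low_mv:
            out.append((mv, clock))                      # leave low-voltage points stock
        elif mv > high_mv:
            out.append((mv, min(clock + offset, cap)))   # never exceed max stable clock
        else:
            out.append((mv, clock + offset))             # plain offset in between
    return out

# Example stock curve points (made up), flattened above the cap voltage:
curve = custom_curve([(950, 1700), (1000, 1850), (1062, 1900)])
```

The flat top of the resulting curve is what keeps the card from chasing unstable clocks at high voltage, which is where the heat savings come from.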









Another interesting thing to note: I had Heaven running in windowed mode because I wanted to see if there were any immediate differences from increasing the voltage % and hitting apply while the benchmark was running. To my surprise... nothing. No difference at all. It seems the max OC on my Founders card doesn't require adding any voltage, or at least there was no benefit from doing so.


----------



## chispy

Thank you guys for this well put together and very informative thread. I have read close to 300 pages of it. I need some help and guidance. I have just bought this EVGA GTX 1070 SC Gaming ACX 3.0 Black Edition: https://www.newegg.com/Product/Product.aspx?Item=N82E16814487265

EVGA GeForce GTX 1070 SC GAMING ACX 3.0 Black Edition, 08G-P4-5173-KR. It has a single 8-pin power connector, and I believe it's a reference PCB because it's on the compatibility list of the EK full cover water block for the GTX 1070 FE. I run a custom water loop for my CPU and GPU, and I also have here the EK GTX 1070 water block that will go on it today: https://www.ekwb.com/configurator/waterblock/3831109831472

So temperatures will not be a problem. My question is: which of the BIOSes compatible with this video card is the best one in terms of the highest stable overclock for safe gaming and 24/7 operation without blowing the VRM? Highest power limit and voltage limit? I want to squeeze every last MHz out of this card to hold me over until the GTX 1080 Ti and Vega 10 are out, since I game at 4K resolution. Thank you guys, and I really do appreciate all the time and effort put into this thread.


----------



## dlewbell

Quote:


> Originally Posted by *dboythagr8*
> 
> my main concern about evga is the thermal pad issue. is there any way to know if these already have the fix or whatever in place?


If you buy direct from EVGA, it should already have the new thermal pads installed. If you're buying in person, you can run the serial number through the thermal mod request page before buying to find out. If you buy online through someone other than EVGA, you're just going to have to hope you get lucky. I purchased my EVGA GTX 1070 FTW through Amazon on 12/27/16 and received one from older stock, so I had to request and install the thermal pads myself.


----------



## dboythagr8

Quote:


> Originally Posted by *dlewbell*
> 
> If you buy direct from EVGA, it should already have the new thermal pads installed. If you're buying in person, you can run the serial number through the thermal mod request page before buying to find out. If you buy online through someone other than EVGA, you're just going to have to hope you get lucky. I purchased my EVGA GTX 1070 FTW through Amazon on 12/27/16, & received one from older stock, so I had to request & install the thermal pads myself.


Do we have any idea when the icx models will ship?


----------



## asdkj1740

Quote:


> Originally Posted by *dboythagr8*
> 
> Do we have any idea when the icx models will ship?


Feb~Mar, according to Jacob at CES. Check the videos on YouTube.

When he said there is a lot of tech under the new cooler, I couldn't stop laughing.


----------



## gtbtk

Quote:


> Originally Posted by *Pesodev*
> 
> *Here some NVFLASH for Asus 1070 Series [64Bit]*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Guide for NvFlashing:
> http://www.overclock.net/t/1523391/easy-nvflash-guide-with-pictures-for-gtx-970-980
> 
> Asus Gtx 1070 Dragon OC Vbios rom and NVFlash only for 64X operating systems
> https://drive.google.com/file/d/0B3Eoti2oFBh4dnRSY1ByRURWR1k/view?usp=sharing
> 
> 
> 
> Stably Tested GPU Boost Clock 1900MHz Memory Clock 8450MHz
> 
> *FLASH AT YOUR OWN RISK!
> I AM NOT RESPONSIBLE FOR ANY DAMAGES THAT MAY OCCUR WHILE ATTEMPTING THIS!*


Do you have a Native Asus 1070 Dragon card or have you only flashed the bios? If you flashed the bios what card did you flash it to?

I ask because the 1657MHz core clock matches one from a Chinese Galax 1070 SNPR model card, and that is the only BIOS that bricked my GPU during my cross flashing adventures, because it and the HOF cards use a different voltage controller from all the other 1070 cards. Your card may be using the same PCB as the Galax model.


----------



## skupples

Quote:


> Originally Posted by *gtbtk*
> 
> define best.
> 
> The reality is that there is not a huge difference between the lowest and highest 1070 models
> 
> EVGA's dramas have been "resolved", but they are coming out with a new cooler design, so that suggests either that they took another look and discovered design issues, or that they are just changing the decor to make the perceived problem go away. I have found that the EVGA BIOSes are set to hit the power limits under fairly low loads, so they will down clock themselves even under a 1080p load. An MSI Gaming card will only be pulling 85% with the same load, so it ends up with more stable clock speeds.
> 
> Fastest card out of the box is the Palit Gamerock/Gainward GLH but it is a 3 slot card.
> 
> Zotac Amp Extreme is also a great 3 slot card.
> 
> MSI Gaming and Asus strix are both great mainstream cards.
> 
> Having said that, If the card is only a temporary thing, you should probably look at the second tier ranges such as the MSI Armor or Asus Dual. You will save some money and the performance is almost the same as the top tier. Cross flash your cheaper model with say a Strix bios and you get the same performance as the more expensive card without the cost


Down clocking, you say. Can you suggest a BIOS that eliminates this soft wall?

Is a Pascal BIOS tweaker in the works? Does anyone even care enough about the 1070 to do the work this time around?

Thx


----------



## rfarmer

Anyone else get one of these from nVidia? I was mad I didn't get a game when I bought my 1070, feel better now.


----------



## MEC-777

Quote:


> Originally Posted by *rfarmer*
> 
> 
> 
> Anyone else get one of these from nVidia? I was mad I didn't get a game when I bought my 1070, feel better now.


I got Watch Dogs 2 with mine. Glad I didn't pay for it because it doesn't run as well as it should, IMO. Still somewhat unoptimized. I get frame rate drops and dips when neither the CPU nor the GPU is at 100% (so theoretically no bottleneck), depending on where I am in the map. Plus I'm not really into the whole premise of the game either.

If I had the choice, I would have taken The Division instead without hesitation.


----------



## Pesodev

I have the "normal" GTX 1070 Strix and I flashed the BIOS to the GTX 1070 Dragon OC.


----------



## Pesodev

Quote:


> Originally Posted by *gtbtk*
> 
> Do you have a Native Asus 1070 Dragon card or have you only flashed the bios? If you flashed the bios what card did you flash it to?
> 
> I ask because the 1657Mhz core clock matches one from a Chinese Galax 1070 SNPR model card and that is the only bios that bricked my GPU during my cross flashing adventures because it and the HOF cards use a different voltage controller to all the other 1070 cards and this may be using the same PCB as the Galax model


I have the Asus GeForce GTX 1070 8GB STRIX and I flashed it to the GTX 1070 Dragon OC, and it works great!

The Asus GTX 1070 Dragon OC has the same clocks and voltages as the Asus GeForce GTX 1070 8GB STRIX OC.

The board is the same, but the outward appearance and the coolers may be a little bit different.


----------



## XRogerX

Quote:


> Originally Posted by *dboythagr8*
> 
> my main concern about evga is the thermal pad issue. is there any way to know if these already have the fix or whatever in place?


I think they did fix it, but I can ask EVGA by giving them a call.

I did call EVGA, and the issue has been addressed as of Nov 1st.

But there are some cards out there that haven't been addressed. If you do order a 1070, you can check it here to see if you need the thermal pads or the BIOS; all you have to do is put the serial number of your card into the site.

Also, if you don't feel comfortable doing any of this, EVGA will work with you on replacing your card with a new one.

http://www.evga.com/thermalmod/

So your best bet is to order from EVGA, as their stock is 100% resolved; give them a call if you'd like to verify this:

888.881.3842

Remember: if ordered from EVGA, the stock is 100% resolved. Newegg and Amazon might still have older cards in stock.


----------



## Avendor

From what I know, the GTX 1070 Xtreme Gaming comes with 8+6 pin power, while the GTX 1080 Xtreme Gaming has 8+8 pin.


----------



## skupples

Quote:


> Originally Posted by *rfarmer*
> 
> 
> 
> Anyone else get one of these from nVidia? I was mad I didn't get a game when I bought my 1070, feel better now.










My EVGA didn't come with a game.

Also, it would be awesome if someone could recommend a BIOS for this EVGA SC, and where to get it. I'm trying to knock the rust off; my CaseLabs has been off for almost two years XD


----------



## DeathAngel74

I got a gow4 code with my 1070 sc.


----------



## khanmein

Quote:


> Originally Posted by *skupples*
> 
> 
> 
> 
> 
> 
> 
> 
> my evga didn't come with a game.
> 
> also, it would be awesome if someone could recommend a bios for this evga SC, & where to get it. I'm trying to knock the rust off. my caselabs has been off for almost two years XD


The latest BIOS for the EVGA SC is 86.04.50.00.*72*

http://forums.evga.com/Update-11916-with-NEW-BIOS-EVGA-GeForce-GTX-108010701060-PWM-Temperature-Update-m2573491.aspx

I also didn't receive any games.


----------



## gtbtk

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> define best.
> 
> The reality is that there is not a huge difference between the lowest and highest 1070 models
> 
> EVGA's dramas have been "resolved", but they are coming out with a new cooler design, so that suggests either that they took another look and discovered design issues, or that they are just changing the decor to make the perceived problem go away. I have found that the EVGA BIOSes are set to hit the power limits under fairly low loads, so they will down clock themselves even under a 1080p load. An MSI Gaming card will only be pulling 85% with the same load, so it ends up with more stable clock speeds.
> 
> Fastest card out of the box is the Palit Gamerock/Gainward GLH but it is a 3 slot card.
> 
> Zotac Amp Extreme is also a great 3 slot card.
> 
> MSI Gaming and Asus strix are both great mainstream cards.
> 
> Having said that, If the card is only a temporary thing, you should probably look at the second tier ranges such as the MSI Armor or Asus Dual. You will save some money and the performance is almost the same as the top tier. Cross flash your cheaper model with say a Strix bios and you get the same performance as the more expensive card without the cost
> 
> 
> 
> down clocking you say, can you suggest a bios that eliminates this soft wall?
> 
> is a pascal bios tweaker in the works? Does anyone even care enough about 1070 to do the work this time around?
> 
> thx

Well, the down clocking is a design element of GPU Boost. Once the card hits the power limit, it will reduce the clocks to keep the card under that limit. The way to avoid it is to have a BIOS that sets a power limit high enough that the card never reaches that figure unless it is under really heavy load. That is what MSI has done with the Gaming 8G/X/Z/Quicksilver cards. They will reduce clocks when the power limit hits about 106%, but you have to be under a Firestrike Ultra-like 4K load to see that happen; given that the 1070 is not strong enough to output a decent frame rate at that resolution, you are not likely to be using 4K for gaming anyway.

A Pascal tweaker is not available because no-one has cracked the encryption that is used to sign the BIOS files. I am sure that one day the software that vendors use will leak and something will happen. It is not that no-one is interested.
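The GPU Boost down clocking described above can be modeled as a toy loop: when estimated power exceeds the BIOS limit, the card steps the clock down one bin at a time until it is back under the limit. All numbers here are invented for illustration (Pascal bins are roughly 13MHz):

```python
def settle_clock(clock_mhz, watts_per_mhz, power_limit_w, step=13):
    """Step the clock down in fixed bins until the (crudely linear)
    power estimate falls under the BIOS power limit."""
    while clock_mhz * watts_per_mhz > power_limit_w:
        clock_mhz -= step
    return clock_mhz

# A card boosting to 2000 MHz with a 190 W limit settles several bins lower:
settled = settle_clock(2000, 0.1, 190)
```

This is why a BIOS with a higher wattage ceiling holds its boost clock steadier: the loop simply never triggers under the same load.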


----------



## gtbtk

Quote:


> Originally Posted by *Pesodev*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Do you have a Native Asus 1070 Dragon card or have you only flashed the bios? If you flashed the bios what card did you flash it to?
> 
> I ask because the 1657Mhz core clock matches one from a Chinese Galax 1070 SNPR model card and that is the only bios that bricked my GPU during my cross flashing adventures because it and the HOF cards use a different voltage controller to all the other 1070 cards and this may be using the same PCB as the Galax model
> 
> 
> 
> i have Asus GeForce GTX 1070 8Gb STRIX and i flashed it GTX 1070 Dragon OC and its works great!
> 
> and Asus NVIDIA GTX 1070 Dragon OC have sama clocks and voltages whit Asus GeForce GTX 1070 8Gb STRIX OC
> 
> And board is same but out looking and coolers maybe litle bit diffrent

So the default core clock is 1633MHz? The Videocardz page shows it as 1657MHz, which is faster than the Strix OC card.

The Dragon TOP version is listed as 1670MHz, and that matches the Gigabyte Xtreme and Gainward GLH/Palit Gamerock default core clock.

Could you use GPU-Z to upload the flashed BIOS from your card to the techpowerup.com web site? The site will read the information in the BIOS so people can see what they are getting. The BIOS should list itself on the site as 86.04.50.00.6C.


----------



## gtbtk

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rfarmer*
> 
> 
> 
> Anyone else get one of these from nVidia? I was mad I didn't get a game when I bought my 1070, feel better now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> my evga didn't come with a game.
> 
> also, it would be awesome if someone could recommend a bios for this evga SC, & where to get it. I'm trying to knock the rust off. my caselabs has been off for almost two years XD

You could try the 2nd bios from a FTW card or the asus strix oc bios


----------



## ucode

Quote:


> Originally Posted by *gtbtk*
> 
> Pascal tweaker is not available because no-one has cracked the encryption that is used to sign the bios files. I am sure that one day, the software that vendors use will leak and something will happen. It is not that no-one is interested


People can apply for a Hulk key from the manufacturers to enable tweaking of a Pascal VBIOS, but AFAIK no one has. Maybe it needs a little effort rather than just interest, and maybe it will not be free.


----------



## syl1979

Quote:


> Originally Posted by *gtbtk*
> 
> Well, the down clocking is a design element of GPU Boost. Once the card hits the power limit, it will reduce the clocks to keep the card under that limit. The way to avoid it is to have a BIOS that sets a power limit high enough that the card never reaches that figure unless it is under really heavy load. That is what MSI has done with the Gaming 8G/X/Z/Quicksilver cards. They will reduce clocks when the power limit hits about 106%, but you have to be under a Firestrike Ultra-like 4K load to see that happen; given that the 1070 is not strong enough to output a decent frame rate at that resolution, you are not likely to be using 4K for gaming anyway.
> 
> Pascal tweaker is not available because no-one has cracked the encryption that is used to sign the bios files. I am sure that one day, the software that vendors use will leak and something will happen. It is not that no-one is interested


I think it will be very difficult to remove the thermal throttling, which starts as low as 30°C. On air there is little point in pushing the power limit over 250W.


----------



## Quadrider10

So what's the best BIOS to be running on these cards as far as efficiency and stable clocks? More specifically, G1 Gaming?


----------



## roccotheoc

So I've had my MSI 1070 Gaming X for half a year-ish, and at the beginning my temps were around 68-70°C; now they are more like 73-75°C!
Why would this happen? The ambient temp is about the same in both cases.


----------



## rfarmer

Quote:


> Originally Posted by *roccotheoc*
> 
> So I've had my msi 1070 gaming x for half a year ish and at the beginning my temps where around 68-70 now they are more like 73-75!
> Why would this happen? The ambient temp is about the same in both.


Are CPU temps the same or higher too?


----------



## Pesodev

Quote:


> Originally Posted by *gtbtk*
> 
> so the default core clock is 1633Mhz? The Videocardz page shows it as 1657Mhz which is faster than the strix OC card.
> 
> The Dragon TOP version is listed as 1670Mhz and that matches the Gigabyte Xtreme and Gainward GLH/Palit Gamerock default coreclock.
> 
> Could you use GPU-Z to upload the flashed bios from your card to the techpowerup.com web site? The site will read the information in the bios so the people can see what they are getting. The bios should list itself on the site as 86.04.50.00.6C


It's 86.04.50.00.AS09; you can see it in the GPU Tweak II info screenshot, but yeah, I can get a GPU-Z screenshot too.


----------



## roccotheoc

Quote:


> Originally Posted by *rfarmer*
> 
> Are CPU temps the same or higher too?


No, they are the same!


----------



## MEC-777

Quote:


> Originally Posted by *roccotheoc*
> 
> So I've had my msi 1070 gaming x for half a year ish and at the beginning my temps where around 68-70 now they are more like 73-75!
> Why would this happen? The ambient temp is about the same in both.


Have you overclocked it at all?

Do you have dust filters on your case and if so, when was the last time you cleaned them? When was the last time you blew the dust out of your PC and cleaned it?


----------



## skupples

Quote:


> Originally Posted by *khanmein*
> 
> latest bios for EVGA SC is 86.04.50.00.*72*
> 
> http://forums.evga.com/Update-11916-with-NEW-BIOS-EVGA-GeForce-GTX-108010701060-PWM-Temperature-Update-m2573491.aspx
> 
> i also don't receive any games.


thank you sir!

+1

My Amazon card shipped with the 00.72 BIOS.









Quote:


> Originally Posted by *gtbtk*
> 
> Well the down clocking is a design element of GPU Boost. Once the car hits the power limit, it will reduce the clocks to keep the card under that power limit. The way to avoid it is to have a bios that sets a power limit high enough that the card never reaches that figure unless it us under really heavy load. That is what MSI has done with the Gaming 8G/X/Z/Quicksilver cards. They will reduce clocks when the power limit hits about 106% but you have to be under a Firestrike Ultra like 4K load to see that happen, given that the 1070 is not strong enough to output a decent frame rate at that resolution, you are not likely to be using 4K for gaming anyway..
> 
> Pascal tweaker is not available because no-one has cracked the encryption that is used to sign the bios files. I am sure that one day, the software that vendors use will leak and something will happen. It is not that no-one is interested


Thank you, gentlemen. I see the problem has only gotten worse since Kepler. Nvidia was pissed and even sent reps here to discredit the buck controller crack on GK110.









So basically the name of the game is to figure out which manufacturer's BIOS has the most oomph for the different cards we're using.


----------



## gtbtk

Quote:


> Originally Posted by *ucode*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Pascal tweaker is not available because no-one has cracked the encryption that is used to sign the bios files. I am sure that one day, the software that vendors use will leak and something will happen. It is not that no-one is interested
> 
> 
> 
> People can apply for a Hulk key from the manufacturers to enable tweaking of Pascal VBIOS, but AFAIK no one has. Maybe needs a little effort rather than just interest and maybe it will not be free

I have not heard about that.

I know there was an Asus 1080 BIOS with higher voltage limits that Dancop released to the internet.


----------



## gtbtk

Quote:


> Originally Posted by *syl1979*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Well, the down clocking is a design element of GPU Boost. Once the card hits the power limit, it will reduce the clocks to keep the card under that limit. The way to avoid it is to have a BIOS that sets a power limit high enough that the card never reaches that figure unless it is under really heavy load. That is what MSI has done with the Gaming 8G/X/Z/Quicksilver cards. They will reduce clocks when the power limit hits about 106%, but you have to be under a Firestrike Ultra-like 4K load to see that happen; given that the 1070 is not strong enough to output a decent frame rate at that resolution, you are not likely to be using 4K for gaming anyway.
> 
> Pascal tweaker is not available because no-one has cracked the encryption that is used to sign the bios files. I am sure that one day, the software that vendors use will leak and something will happen. It is not that no-one is interested
> 
> 
> 
> I think it will be very difficult to remove the thermal throttling starting as low as 30degc. If on air there is little interest of pushing power limit over 250w.

Given that the card is running hundreds of MHz above its rated speed, it is not exactly thermal throttling. Yes, it reduces clock speed from a peak value to manage temperatures, but CPUs also do similar things.

The Zotac AMP Extreme is a 300W card but also needs a 3-slot cooler. I do agree that there is not much point going above that without water or even sub-zero cooling.


----------



## gtbtk

Quote:


> Originally Posted by *skupples*
> 
> thank you gentlemen. I see the problem has only gotten worse since Kepler. Nvidia was pissed & even sent reps here to discredit the buck controller crack on gk110
> 
> 
> 
> 
> 
> 
> 
> 
> 
> so basically the name of the game is figure out which manufacturer bios has the most oomf for the different cards we're using


Yes, that's about it. Just remember to take your card's VRM into consideration as well.


----------



## gtbtk

Quote:


> Originally Posted by *Pesodev*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> so the default core clock is 1633Mhz? The Videocardz page shows it as 1657Mhz which is faster than the strix OC card.
> 
> The Dragon TOP version is listed as 1670Mhz and that matches the Gigabyte Xtreme and Gainward GLH/Palit Gamerock default coreclock.
> 
> Could you use GPU-Z to upload the flashed bios from your card to the techpowerup.com web site? The site will read the information in the bios so the people can see what they are getting. The bios should list itself on the site as 86.04.50.00.6C
> 
> 
> 
> Its 86.04.50.00.AS09 you can see it GPU Tweak II Info screenshot but yeah i can get Gpu-z screenshot too

I had a look at the BIOS with a hex editor. The only thing that reads the AS09 part of the BIOS is Asus software; the rest of the world looks at the part that lists the BIOS as ending in 6C, which is the Nvidia version number.

It seems that Asus is just re-skinning the Strix OC card with a different cooler but basically the same BIOS. MSI re-skinned the Gaming X to become the Quicksilver card and did the same thing.

The really interesting BIOS to get hold of would be the TOP version of the card, with its 1670MHz core clock.


----------



## gtbtk

Quote:


> Originally Posted by *roccotheoc*
> 
> So I've had my msi 1070 gaming x for half a year ish and at the beginning my temps where around 68-70 now they are more like 73-75!
> Why would this happen? The ambient temp is about the same in both.


I see you are in Canada. It is winter and really cold and snowing there right now, right?

Does your house have central heating by any chance? Is there a central heating outlet near where you have your PC set up, maybe blowing on the PC, or does the warm air collect in the corner of the room where the PC is installed?

If your GPU is not full of dust, which after only 6 months is unlikely, my guess would be that in heating your home, even though the average air temperature is constant, you have increased the localized ambient air temperature inside your computer case.
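The arithmetic behind that guess is simple: with a roughly constant fan curve and load, core temp is approximately local ambient plus a fixed delta, so any rise in the air feeding the case shows up one-for-one on the GPU. The numbers below are illustrative:

```python
def core_temp(local_ambient_c, delta_t_c):
    """Crude model: core temperature = intake air temperature + a fixed
    delta determined by the load and fan curve."""
    return local_ambient_c + delta_t_c

# With an assumed 46 C delta under load, intake air going from 22 C to 27 C
# moves the core from 68 C to 73 C, matching the ~5 C rise reported above.
before = core_temp(22, 46)
after = core_temp(27, 46)
```

In other words, a heating vent warming the corner by ~5°C fully accounts for the 68-70°C to 73-75°C shift without any dust or paste degradation.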


----------



## gtbtk

Quote:


> Originally Posted by *Quadrider10*
> 
> So what's the best BIOS to be running on these cards as far as efficiency and stable clocks? More specifically, G1 Gaming?


You are going to have to do some experimentation to get definitive answers, I'm afraid.

Your existing BIOS has a core clock of 1595MHz, reference-clocked memory, and a power limit of 200W with a single 8-pin power supply. I am only looking at worthwhile 8-pin-only cards; you may be able to get away with using a BIOS from a card with an extra PCIe power cable, but I don't want to be the one suggesting that.

The Asus Strix OC BIOS is basically the same, just with a 1633MHz core clock, so probably no value there. Maybe worth trying though.

The EVGA SC BIOS is not one that I would recommend, as those cards seem to bounce off the power limit easily. It could be interesting for you, though, to try the EVGA BIOS and the automatic overclocking utility that is in Precision XOC. It is a simple way to see roughly where the limits for each voltage point on your card are.

These two might be worth trying out:

The Palit Gamerock Premium has an 8 pin supply with 225W power limit, 1671Mhz base clock and +250Mhz overclocked memory.

https://www.techpowerup.com/vgabios/187013/187013

Or the Palit SuperJetstream with 225W Power limit and 1633Mhz core clock and no overclocked memory

https://www.techpowerup.com/vgabios/187001/palit-gtx1070-8192-161021


----------



## zipzop

Anyone know if there is an 8-pin bios that has a higher power limit than the reference 112%? Specifically, my card is the 1070 SC by EVGA.


----------



## .Sup

Just ordered the Zotac Amp card


----------



## gtbtk

Quote:


> Originally Posted by *zipzop*
> 
> Anyone know if there is an 8-pin BIOS that has higher power limit than the reference 112%? Specifically my card is 1070 SC by EVGA


The percentage is pretty meaningless for comparison with other cards. You need to find out the bios power limit range in watts so you can compare like with like.

The SC power limit is 151-170W. Just keep in mind the EVGA overheating dramas that have only just settled down, together with the fact that more watts means more heat, so set a fan curve to offset the higher power consumption and temps if you do cross-flash the card.

To answer your question, the Asus Strix OC bios has a 1633MHz core clock and a 180-200W limit, while the Palit Gamerock Premium (1670MHz core clock and 2127MHz memory OC) and SuperJetstream (1633MHz core clock) both have 8-pin power and 195-225W power limits.

You need to make sure that you download bioses in the 86.04.50.00.xx range. Do not download the 86.04.1E, 86.04.26, or 86.04.3B bioses. You can get them all from techpowerup VGA bios database.

Those 3 bioses will flash to your card and should work with your EVGA SC card, although sometimes cross-flashed bioses don't complement the non-standard hardware as well as a native bios. All of the cards I mentioned above use the same voltage controller, but they have differently configured VRMs on the card. The bios code only needs to know the standard commands that the controller uses; it doesn't need to know about the VRM configuration, as the controller manages that.

You may find that one of the bioses causes the card to not perform very well, but you won't know unless you actually try it.
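To make the watts-versus-percent point concrete, here is a small sketch using the figures quoted in this post (base and max wattages per bios); the slider's percent ceiling is just max divided by base:

```shell
#!/bin/sh
# Convert each bios's watt range into the percent ceiling its power slider
# shows, so different cards can be compared like with like.
# Wattage figures are the ones quoted above; the percentage is floored.
for card in "EVGA_SC:151:170" "Asus_Strix_OC:180:200" "Palit_SuperJetstream:195:225"; do
    name=${card%%:*}; rest=${card#*:}
    base=${rest%%:*}; max=${rest#*:}
    awk -v n="$name" -v b="$base" -v m="$max" \
        'BEGIN { printf "%-22s %3d-%3dW  slider ceiling %d%%\n", n, b, m, int(m / b * 100) }'
done
```

Run it and the EVGA SC's familiar 112% falls straight out of 170/151, which is why the same slider percentage means very different watts on different cards.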


----------



## Nebulous

Quote:


> Originally Posted by *.Sup*
> 
> Just ordered the Zotac Amp card


If it's the amp extreme you won't be disappointed


----------



## syl1979

Quote:


> Originally Posted by *gtbtk*
> 
> Given that the card is running 100's of Mhz above its rated speed, it is not exactly thermal throttling. Yes it reduces clock speed from a peak value to manage temperatures but CPUs also do similar things.


It makes the overclock tricky to manage when running with an air cooler.

Here is a graph done by the French site hardware.fr:
http://www.hardware.fr/medias/photos_news/00/50/IMG0050557.png

On my side, ambient temp is low at the moment, and at idle the GPU goes below 30°C. If I set a max of 2050 on the Afterburner curve at 29°C, then as soon as I launch a 3D load the GPU temp passes above 30°C and the frequency goes down one step to 2038. I can even see the curve moving in Afterburner!
It is very tricky, because if I set the same frequency on the curve at 29°C or at 40°C I will in reality get two totally different settings.
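The behaviour described above (the clock dropping one bin as the GPU warms past a temperature breakpoint) can be sketched roughly like this. The breakpoints and bin values here are illustrative guesses at how GPU Boost 3.0 behaves, taken from the numbers in this post, not NVIDIA's published table, and `effective_clock` is a made-up helper:

```shell
#!/bin/sh
# Rough sketch of Pascal's temperature-dependent boost: crossing a
# temperature breakpoint drops the effective clock by one bin, which is
# why a curve point set at 29 degC behaves differently once the GPU
# passes 30 degC under load.
effective_clock() {  # arg: GPU temperature in degC
    awk -v t="$1" 'BEGIN {
        clock = 2050                 # curve point set while idling below 30 degC
        if (t >= 30) clock = 2038    # one bin down past the first breakpoint
        if (t >= 40) clock = 2025    # and another past the next
        print clock
    }'
}
effective_clock 29   # prints 2050
effective_clock 35   # prints 2038
effective_clock 45   # prints 2025
```

The practical consequence is exactly what syl1979 says: the same curve point produces different real clocks depending on which temperature band the card lands in under load.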


----------



## ErrorFile

So last night it happened: I noticed my 1070 Game Rock's fan going on and off every other second. Is there any real fix besides creating a new fan curve? I already did that, but I thought this was fixed by installing the latest bios on my card.


----------



## roccotheoc

Quote:


> Originally Posted by *gtbtk*
> 
> If your GPU is not full of dust, which after only 6 months is unlikely, My guess would be that in heating your home, even though the average air temperature is constant, you have increased the localized ambient air temperature inside your computer case.


Well, that would make sense I guess! I was thinking about replacing the thermal paste like I did on all my past GPUs, but the paste on this MSI card is supposed to be pretty good. What are your thoughts on that?


----------



## gtbtk

Quote:


> Originally Posted by *syl1979*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Given that the card is running 100's of Mhz above its rated speed, it is not exactly thermal throttling. Yes it reduces clock speed from a peak value to manage temperatures but CPUs also do similar things.
> 
> 
> 
> It makes the overclock tricky to manage when running with an air cooler.
> 
> Here is a graph done by the French site hardware.fr:
> http://www.hardware.fr/medias/photos_news/00/50/IMG0050557.png
> 
> On my side, ambient temp is low at the moment, and at idle the GPU goes below 30°C. If I set a max of 2050 on the Afterburner curve at 29°C, then as soon as I launch a 3D load the GPU temp passes above 30°C and the frequency goes down one step to 2038. I can even see the curve moving in Afterburner!
> It is very tricky, because if I set the same frequency on the curve at 29°C or at 40°C I will in reality get two totally different settings.
Click to expand...

That is true, but on the other hand you are getting the extra performance for free.


----------



## gtbtk

Quote:


> Originally Posted by *roccotheoc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> If your GPU is not full of dust, which after only 6 months is unlikely, My guess would be that in heating your home, even though the average air temperature is constant, you have increased the localized ambient air temperature inside your computer case.
> 
> 
> 
> Well, that would make sense I guess! I was thinking about replacing the thermal paste like I did on all my past GPUs, but the paste on this MSI card is supposed to be pretty good. What are your thoughts on that?
Click to expand...

I would try moving the case somewhere else in the room and see if your temps change.

The MSI Paste is supposed to be of good quality. As you were getting good temps for the first couple of months, I don't think that the paste is likely to have gone off in the last month or so. If it was bad to start with, then you are not likely to have got good temps when the card was new either. That being the case, I doubt that new paste would make any difference for you.


----------



## zipzop

Quote:


> Originally Posted by *gtbtk*
> 
> The percentage is pretty meaningless for comparison with other cards. You need to find out the bios power limit range in watts so you can compare like with like.
> 
> The SC power limit is 151-170W. Just keep in mind the EVGA overheating dramas that have just settled down together with the fact that more watts means more heat so make a fan curve to offset the higher power consumption and temps if you do cross flash the card.
> 
> To answer your question, The Asus Strix OC bios has 1633Mhz core clock and has a 180-200W limit, the Palit Gamerock premium (1670Mhz Core Clock and 2127Mhz Memory oc) and SuperJetstream (1633Mhz core clock) both have 8 pin power and 195-225W power limits.
> 
> You need to make sure that you download bioses in the 86.04.50.00.xx range. Do not download the 86.04.1E, 86.04.26, or 86.04.3B bioses. You can get them all from techpowerup VGA bios database.
> 
> Those 3 bioses will flash to your card and should work with your EVGA SC card, although sometimes cross-flashed bioses don't complement the non-standard hardware as well as a native bios. All of the cards I mentioned above use the same voltage controller, but they have differently configured VRMs on the card. The bios code only needs to know the standard commands that the controller uses; it doesn't need to know about the VRM configuration, as the controller manages that.
> 
> You may find that one of the bioses causes the card to not perform very well, but you won't know unless you actually try it.


Nice. Should I try the one third from the bottom here (86.04.50.00.63)? It seems to be the only one with a 200W power limit; the rest before it are 150W-170W. What was their reason for releasing a bios in October with a higher power limit?

Anyway, my current bios is 86.04.50.00.70, which was the Micron update for the 1070 SC. I didn't bother with EVGA's fan-profile bios update because I do that myself in Afterburner. The original bios was 86.04.26.00.70 (pre-Micron update). So that leaves me wondering: what is different in the .1E bios? I didn't see a .3B, though I did not look through the Palit or SuperJetstream lists.

As for the overheating thing, I thought it was concluded that the hardware was at fault there: VRM components on the FTW's 10-phase design were out of spec. The SC is a reference board, and reference boards have so far proven pretty solid.


----------



## skupples

Quote:


> Originally Posted by *gtbtk*
> 
> Yes, that's about it. Just remember to take your card's VRM into consideration as well.


Really? Hmm, it seemed like they built them safe enough to never have to worry, even on air.


----------



## gtbtk

Quote:


> Originally Posted by *zipzop*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The percentage is pretty meaningless for comparison with other cards. You need to find out the bios power limit range in watts so you can compare like with like.
> 
> The SC power limit is 151-170W. Just keep in mind the EVGA overheating dramas that have just settled down together with the fact that more watts means more heat so make a fan curve to offset the higher power consumption and temps if you do cross flash the card.
> 
> To answer your question, The Asus Strix OC bios has 1633Mhz core clock and has a 180-200W limit, the Palit Gamerock premium (1670Mhz Core Clock and 2127Mhz Memory oc) and SuperJetstream (1633Mhz core clock) both have 8 pin power and 195-225W power limits.
> 
> You need to make sure that you download bioses in the 86.04.50.00.xx range. Do not download the 86.04.1E, 86.04.26, or 86.04.3B bioses. You can get them all from techpowerup VGA bios database.
> 
> Those 3 bioses will flash to your card and should work with your EVGA SC card, although sometimes cross-flashed bioses don't complement the non-standard hardware as well as a native bios. All of the cards I mentioned above use the same voltage controller, but they have differently configured VRMs on the card. The bios code only needs to know the standard commands that the controller uses; it doesn't need to know about the VRM configuration, as the controller manages that.
> 
> You may find that one of the bioses causes the card to not perform very well, but you won't know unless you actually try it.
> 
> 
> 
> Nice. Should I try the one third from the bottom here (86.04.50.00.63)? It seems to be the only one with a 200W power target; the rest before it are 150W-170W. What was their reason for releasing a bios in October with a higher power limit?
> 
> Anyway, my current bios is 86.04.50.00.70, which was the Micron update for the 1070 SC. I didn't bother with EVGA's fan-profile bios update because I do that myself in Afterburner. The original bios was 86.04.26.00.70 (pre-Micron update). So that leaves me wondering: what is different in the .1E bios? I didn't see a .3B, though I did not look through the Palit or SuperJetstream lists.
> 
> As for the overheating thing, I thought it was concluded that the hardware was at fault there: VRM components on the FTW's 10-phase design were out of spec. The SC is a reference board, and reference boards have so far proven pretty solid.
Click to expand...

You can try any Asus 1070 bios that is version 86.04.50.00.xx. The one you pointed out is the OC version of the bios, so yes, give it a try. Remember to make a backup of your original bios before you flash a new one. The only brand with .3B bioses is Palit/Gainward, but if you want to play, you should take a look at the Palit ones I suggested as well as the Asus one.

The .1E bios came with the original Samsung-memory cards that were made in the first batch in May/June 2016. The 2nd batch of cards, which they started making at the end of June, had Micron memory with the .26 bios.

As for the overheating, Gamers Nexus found faulty capacitors were the cause. However, EVGA was quick to send out thermal pads, they have released a new fan-curve bios, and they have announced new version 2 cards with an iCX cooler design. If the "overheating" issues had no substance at all, a company would not spend money on those new things; the cooler design must have been of at least some concern. I suspect that there is a slight design flaw that has been patched up with the thermal pads, which is probably not the most ideal solution.

The VRMs in the reference design are OK supporting up to 250W. However, given that you are looking at putting more power through the card than it (the cooler/card combination) was originally designed for, and given the history, it is prudent to keep an eye on things more than you would with the stock bios, at least until you see how the card behaves under load. Put an extra fan on the card; if it gets red hot, flash the original bios back again.
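For anyone following along, a typical cross-flash session looks something like this. This is a sketch, not a guaranteed procedure: the filenames are placeholders, and the `-6` switch lets nvflash proceed past the PCI subsystem ID mismatch warning when flashing another vendor's bios. Run it from an elevated prompt with the display driver idle.

```shell
# Typical nvflash cross-flash workflow (filenames are placeholders).
nvflash --save backup.rom      # 1. back up the card's current bios first
nvflash --protectoff           # 2. disable the EEPROM write protection
nvflash -6 newbios.rom         # 3. flash; -6 overrides the subsystem ID mismatch check
# If the card misbehaves afterwards, flash the backup:
# nvflash -6 backup.rom
```

Keeping `backup.rom` somewhere safe is the whole escape hatch here; without it there is no easy way back to the card's native bios.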


----------



## gtbtk

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Yes, that's about it. Just remember to take your card's VRM into consideration as well.
> 
> 
> 
> Really? Hmm, it seemed like they built them safe enough to never have to worry, even on air.
Click to expand...

Isn't it better to be safe than sorry?

If you have a reference card whose VRM is limited by the mosfets to 250W, loading a Zotac Amp Extreme bios that can pull up to 300W can potentially do some damage.

If you have a card with 8+8-pin power, you can pretty much flash anything as long as it uses a common voltage controller (the Galax HOF is the exception). If you have an 8-pin-only card, you need to be aware of what you are putting on the card, because you create the potential to overload the VRM if you blindly flash a high power limit bios.


----------



## zipzop

Quote:


> Originally Posted by *gtbtk*
> 
> You can try any Asus 1070 bios that is version 86.04.50.00.xx. The one you pointed out is the OC version of the bios, so yes, give it a try. Remember to make a backup of your original bios before you flash a new one. The only brand with .3B bioses is Palit/Gainward, but if you want to play, you should take a look at the Palit ones I suggested as well as the Asus one.
> 
> The .1E bios came with the original Samsung-memory cards that were made in the first batch in May/June 2016. The 2nd batch of cards, which they started making at the end of June, had Micron memory with the .26 bios.
> 
> As for the overheating, Gamers Nexus found faulty capacitors were the cause. However, EVGA was quick to send out thermal pads, they have released a new fan-curve bios, and they have announced new version 2 cards with an iCX cooler design. If the "overheating" issues had no substance at all, a company would not spend money on those new things; the cooler design must have been of at least some concern. I suspect that there is a slight design flaw that has been patched up with the thermal pads, which is probably not the most ideal solution.
> 
> The VRMs in the reference design are OK supporting up to 250W. However, given that you are looking at putting more power through the card than it (the cooler/card combination) was originally designed for, and given the history, it is prudent to keep an eye on things more than you would with the stock bios, at least until you see how the card behaves under load. Put an extra fan on the card; if it gets red hot, flash the original bios back again.


Yeah, if anything I'm worried about VRM temps, but ehh. It does have the thermal-pad mod; I use CLU on the die (max 56°C under absolute full load) and 70% fans with the side panel removed.

One other thing before I flash: the Asus has 2 x DP, 2 x HDMI, and 1 x DVI-D, but my EVGA has 3 x DP, 1 x HDMI, and 1 x DVI-D. Is that going to royally screw anything up? I use DP for my XB270HU and one HDMI for the TV beside my desk as a secondary sometimes.


----------



## gtbtk

With your fan curve, try setting the temperature hysteresis to a value like 5 degrees. That adds a delay to the fan cut-off after the temperature hits the curve's base temp, and gives you a larger gap between fan shut-off and fan start-up.
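A minimal sketch of what that hysteresis does, with illustrative numbers (base temp 55°C, hysteresis 5°C); `fan_state` is a made-up helper, not Afterburner's actual implementation:

```shell
#!/bin/sh
# Why hysteresis stops fan on/off cycling: the fan starts at the curve's
# base temp but only stops once the GPU has cooled `hyst` degrees below it,
# so small temperature wobbles around the base temp no longer toggle it.
fan_state() {  # args: current temp (degC), previous state (1=on, 0=off)
    awk -v t="$1" -v prev="$2" 'BEGIN {
        base = 55; hyst = 5
        if (t >= base)             print 1     # at/above base temp: fan on
        else if (t <= base - hyst) print 0     # cooled past the gap: fan off
        else                       print prev  # inside the gap: keep last state
    }'
}
fan_state 56 0   # prints 1 (fan turns on)
fan_state 53 1   # prints 1 (stays on inside the 50-55 band)
fan_state 49 1   # prints 0 (finally shuts off)
```

Without the middle "keep last state" band, a card idling right at the base temp flips the fan on and off every time the reading wobbles a degree, which is exactly the on-off cycling described above.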


----------



## gtbtk

Quote:


> Originally Posted by *zipzop*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You can try any Asus 1070 bios that is version 86.04.50.00.xx. The one you pointed out is the OC version of the bios, so yes, give it a try. Remember to make a backup of your original bios before you flash a new one. The only brand with .3B bioses is Palit/Gainward, but if you want to play, you should take a look at the Palit ones I suggested as well as the Asus one.
> 
> The .1E bios came with the original Samsung-memory cards that were made in the first batch in May/June 2016. The 2nd batch of cards, which they started making at the end of June, had Micron memory with the .26 bios.
> 
> As for the overheating, Gamers Nexus found faulty capacitors were the cause. However, EVGA was quick to send out thermal pads, they have released a new fan-curve bios, and they have announced new version 2 cards with an iCX cooler design. If the "overheating" issues had no substance at all, a company would not spend money on those new things; the cooler design must have been of at least some concern. I suspect that there is a slight design flaw that has been patched up with the thermal pads, which is probably not the most ideal solution.
> 
> The VRMs in the reference design are OK supporting up to 250W. However, given that you are looking at putting more power through the card than it (the cooler/card combination) was originally designed for, and given the history, it is prudent to keep an eye on things more than you would with the stock bios, at least until you see how the card behaves under load. Put an extra fan on the card; if it gets red hot, flash the original bios back again.
> 
> 
> 
> Yeah, if anything I'm worried about VRM temps, but ehh. It does have the thermal-pad mod; I use CLU on the die (max 56°C under absolute full load) and 70% fans with the side panel removed.
> 
> One other thing before I flash: the Asus has 2 x DP, 2 x HDMI, and 1 x DVI-D, but my EVGA has 3 x DP, 1 x HDMI, and 1 x DVI-D. Is that going to royally screw anything up? I use DP for my XB270HU and one HDMI for the TV beside my desk as a secondary sometimes.
Click to expand...

The VRMs are OK up to 150°C. If you are not trying to overload them, you should be fine.

As regards all the ports on the card, I honestly don't know.

I generally use a single DisplayPort cable to a single monitor. I do know that the DP1 port works with both my original MSI bios and the Asus one. I did have to swap to the DP3 port when I tried the Gigabyte Xtreme bios. I don't remember if I ever tried plugging in an HDMI or DVI cable when I was running the Asus bios on mine. I would guess that the 2 ports you use should be OK; if one DisplayPort stops working, try another port. There could be an issue with the DisplayPort (is it DP2?) that is replaced by the 2nd HDMI port on the Asus, but you will only know if you try it.

Both DP and HDMI are digital, and I think they both require a chip to manage the protocols. It is also possible that the driver chip takes care of the physical translation and the bios does not need any special programming to use it.


----------



## zipzop

The Asus 1070 OC bios flashed to my EVGA 1070 SC and works. I just had to bypass the PCI subsystem ID mismatch error with the "-6" switch in nvflash. Both DP and HDMI displays are still working in the ports they were in. And no more power limit throttling, yay!
120% is now the max power limit in Afterburner instead of 112%. Playing DOOM with a solid 2152MHz OC and only around 80%-90% power.


----------



## ucode

Quote:


> Originally Posted by *gtbtk*
> 
> I have not heard about that.
> 
> I know there was an Asus 1080 bios that had higher voltage limits that Dancop released to the internet


Just run nvflash -? and read to the end:

Code:

  Create unlock license request file.
  > nvflash --licreq=LicenseRequest.bin USER_FW_MOD

  Install unlock license objects.
  > nvflash --wrhlk=License.hulk

  Flash tweaked firmware.
  > nvflash --license=License.hulk vbios.rom


----------



## Quadrider10

What's the best bios for the Gigabyte G1 with Samsung memory?


----------



## gtbtk

Quote:


> Originally Posted by *Quadrider10*
> 
> What's the best bios for the gigabyte g1 with Samsung memory?


The same .50 one that you can use with the Micron memory.


----------



## dboythagr8

Ended up getting the ASUS RoG Strix model.

Ran a quick-and-dirty Heaven benchmark. I got *55.4* fps with the 1070, a 4930K @ 4.5GHz, 1440p, Extreme settings. The card was running at its stock OC of 1638MHz.

Just curious if my results are about right for the resolution and settings.

Also are there any suggested OC tips for Pascal?


----------



## GeneO

Quote:


> Originally Posted by *dboythagr8*
> 
> Ended up getting the ASUS RoG Strix model.
> 
> Ran a quick-and-dirty Heaven benchmark. I got *55.4* fps with the 1070, a 4930K @ 4.5GHz, 1440p, Extreme settings. The card was running at its stock OC of 1638MHz.
> 
> Just curious if my results are about right for the resolution and settings.
> 
> Also are there any suggested OC tips for Pascal?


I currently only have a 1080p monitor, but through the magic of DSR I can test it at 1440 and I get 55.2 fps on stock.
Lots of info and tips in this thread about overclocking.


----------



## dboythagr8

Quote:


> Originally Posted by *GeneO*
> 
> I currently only have a 1080p monitor, but through the magic of DSR I can test it at 1440 and I get 55.2 fps on stock.
> Lots of info and tips in this thread about overclocking.


Appreciate you checking. What speed was your card clocked at?


----------



## xGeNeSisx

Quote:


> Originally Posted by *gtbtk*
> 
> the same .50 one that you can use with the micron memory


Hey dude, the ASUS Strix OC bios is working out great. I've managed to clock it to 2100MHz @ 1.063V with memory at 8700MHz. At default clocks, the card automatically boosts to 2038MHz. There is no clock speed reduction or attempted throttling under any circumstances; it is completely smooth. That is absolutely insane compared to the limitations imposed by the stock G1 bios as well as the EVGA bios. The increased power limit obviously causes a bit more heat, but I think I'm alright at 43°C instead of 40°C.

I have not found increasing the voltage over 1.063V to be beneficial. We know Pascal likes lower voltage, and from my observations putting more voltage in can actually degrade performance. I've been trying to test whether raising the voltage slider +50 and getting the card to 1.081V would allow me more room to increase the clock speed, but I have been unsuccessful. Still, I am ecstatic with what the Strix bios has allowed me to achieve.

With the default G1 bios, overclocking past 2025 was a crapshoot. I can't factor memory clock into the equation here, as the Micron bios had not been released at the time. The EVGA bios allowed me to reach 2050-2063MHz core, at the cost of a reduced memory overclock due to power limitations and somewhat aggressive clock speed reductions. Could not be happier with the Strix bios!

I have attached results of running a quick Heaven benchmark with information from HWInfo64, Afterburner, and the curve I have set


----------



## RyanRazer

Quote:


> Originally Posted by *xGeNeSisx*
> 
> Hey dude, the ASUS Strix OC bios is working out great. I've managed to clock it to 2100mhz @ 1.063v and memory at 8700mhz. At default clocks, the card automatically boosts to 2038mhz. There is no clock speed reduction or attempted throttling in any circumstances, it is completely smooth. That is absolutely insane compared to the limitations imposed by the stock G1 bios as well as the EVGA bios. The increased power limit obviously causes a bit more heat production, but I think I'm alright at 43 C instead of 40 C
> 
> I have not found increasing the voltage over 1.063v to be beneficial. We know Pascal like less voltage, and putting more voltage in can actually degrade performance from my observations. I've been trying to test if raising the volt slider +50 and getting the card to 1.081v would allow me more room to increase clock speed, but I have been unsuccessful. Still, I am ecstatic with what the Strix bios has allowed me to achieve.
> 
> With default G1 bios, overclocking past 2025 was a crapshoot. I can't factor memory clock into the equation here as the Micron bios was not released this time. EVGA allowed me to reach 2050-2063mhz core, at the cost of reduced memory overclock due to power limitations and somewhat aggressive clock speed reductions. Could not be happier with the Strix Bios!
> 
> I have attached results of running a quick Heaven benchmark with information from HWInfo64, Afterburner, and the curve I have set


Cool! Can I ask you for a favour? Could you test any game at 2000MHz and 2100MHz and tell me the fps difference? I am really interested in how well it scales. Using Fraps's benchmark, running the same sequence in game (single player is probably better, as runs are more consistent), gives a reasonable approximation of performance.

Is that too much to ask?


----------



## EDK-TheONE

Quote:


> Originally Posted by *xGeNeSisx*
> 
> Hey dude, the ASUS Strix OC bios is working out great. I've managed to clock it to 2100mhz @ 1.063v and memory at 8700mhz. At default clocks, the card automatically boosts to 2038mhz. There is no clock speed reduction or attempted throttling in any circumstances, it is completely smooth. That is absolutely insane compared to the limitations imposed by the stock G1 bios as well as the EVGA bios. The increased power limit obviously causes a bit more heat production, but I think I'm alright at 43 C instead of 40 C
> 
> I have not found increasing the voltage over 1.063v to be beneficial. We know Pascal like less voltage, and putting more voltage in can actually degrade performance from my observations. I've been trying to test if raising the volt slider +50 and getting the card to 1.081v would allow me more room to increase clock speed, but I have been unsuccessful. Still, I am ecstatic with what the Strix bios has allowed me to achieve.
> 
> With default G1 bios, overclocking past 2025 was a crapshoot. I can't factor memory clock into the equation here as the Micron bios was not released this time. EVGA allowed me to reach 2050-2063mhz core, at the cost of reduced memory overclock due to power limitations and somewhat aggressive clock speed reductions. Could not be happier with the Strix Bios!
> 
> I have attached results of running a quick Heaven benchmark with information from HWInfo64, Afterburner, and the curve I have set


Is it watercooled?


----------



## RyanRazer

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Is it watercooled?


Hmm weird, I've put a bucket of water above my rig, doesn't seem to run any cooler?


----------



## xGeNeSisx

Quote:


> Originally Posted by *RyanRazer*
> 
> Cool! Can I ask you for a favour? Could you test any game at 2000MHz and 2100MHz and tell me the fps difference? I am really interested in how well it scales. Using Fraps's benchmark, running the same sequence in game (single player is probably better, as runs are more consistent), gives a reasonable approximation of performance.
> 
> Is that too much to ask?


I have the day off and I will absolutely do this as soon as I have verified my new CPU volts as stable, since I was able to reduce the IMC and PCH voltages almost back to stock, a significant decrease. I just want to briefly verify stability with AIDA, x264, and HCI Memtest. I haven't ever used Fraps, so it may take me a bit to figure it out.

I am definitely up for this though.

I was thinking of the following test setups:
- Compare 2000 to 2100MHz core at stock memory using ROTTR at 1080p
- Do the same with the memory clock at 8700MHz
- Then do the same tests using DSR at a resolution of 2180x1620
Quote:


> Originally Posted by *EDK-TheONE*
> 
> Is it watercooled?


Yes, I ripped the stock G1 cooler off almost as soon as I had it up and running. Besides hitting the heatsink, the three ~80mm fans produced the most irritating high-pitched buzzing and subpar cooling performance. My G1 has a Corsair H55 attached with a Kraken G10 bracket. I switched the 92mm stock fan out for a Noctua to keep VRM temps even lower. Even at full load on the system and card, I have never seen temperatures above 41°C.

I really would love to order an EK kit and do a custom loop, but it's hard to justify the cost and effort at the moment. The Corsair H105 240mm rad at the front of my Fractal R5 keeps my 6700K @ 4.7 below 55°C under any stress test, and the GPU temps are exceptional with the H55. I use the front 240 in push/pull (SP120s and Deltas are quite the combo) for intake, along with 2 bottom 140mm Cougars, with the 120mm GPU rad on the back as the sole exhaust. With this setup cooling performance is optimal and I can keep the top of the R5 closed to eliminate noise. I couldn't be happier with the setup.


----------



## RyanRazer

Hey guys, take a look at this: a 368.69 beta driver with full Vulkan support. Will try this with Doom.

https://developer.nvidia.com/vulkan-driver


----------



## xGeNeSisx

Quote:


> Originally Posted by *RyanRazer*
> 
> Huh guys, take a look a this. 368.69 beta driver with full vulkan support. Will try this with doom.
> 
> https://developer.nvidia.com/vulkan-driver




A post from Reddit about the beta driver 376.66:
https://www.reddit.com/r/5npabn/geforce_beta_driver_37666_nv_vulkan_api_update/

There were 1 or 2 posts where people reported an fps drop in Doom, but the drivers likely have debug mode enabled, which causes a performance decrease. Results could be interesting though; please report what you find out.


----------



## RyanRazer

Quote:


> Originally Posted by *xGeNeSisx*
> 
> I have the day off and I will absolutely do this as soon as I have verified my new CPU volts as stable, since I was able to reduce the IMC and PCH voltages almost back to stock, a significant decrease. I just want to briefly verify stability with AIDA, x264, and HCI Memtest. I haven't ever used Fraps, so it may take me a bit to figure it out.
> 
> I am definitely up for this though
> 
> I was thinking of following test setups:
> Compare 2000 to 2100mhz core at stock memory using ROTTR at 1080p
> - do the same with memory clock at 8700mhz
> Then doing the same tests using DSR to compare the


Great, thanks mate. It would be interesting to see. Meanwhile I'll test at 1900MHz and 2000MHz, and at 1800 and 1900MHz. I'd like to see how linear the scaling is; I guess the fps gain drops with frequency.


----------



## GeneO

Quote:


> Originally Posted by *dboythagr8*
> 
> Appreciate you checking. What speed was your card clocked at?


Default is 1582/1772


----------



## dboythagr8

I downloaded GPU Tweak II for use with my Strix 1070. Can you not enable overlay monitoring à la Precision X and Afterburner?


----------



## xGeNeSisx

Quote:


> Originally Posted by *RyanRazer*
> 
> Great, tnx mate. Would be interesting to see. Meanwhile I'll do at 1900mhz and 2000mhz. And 1800 and 1900mhz. I'd like to see how linear the scaling is. I guess fps gain drops with frequency.


I did a comparison of my best overclock profile so far, the stock ASUS Strix bios settings, and a forced downclock.

All tests were performed in Rise of the Tomb Raider at 2180x1620; an image of the in-game settings is attached below. Each benchmark was started after the level was restarted, and the same sequence was played for 60s.

2100mhz core clock
8700mhz memory clock
Power slider 120% 1.063v
Frames: 3561 - Time: 60000ms - Avg: 59.350 - Min: 52 - Max: 78

2000mhz core clock (forced downclock)
8700mhz memory
Power slider 120%
Frames: 3420 - Time: 60000ms - Avg: 57.000 - Min: 49 - Max: 72

Stock settings: 2038 core clock with GPU Boost
8014mhz memory clock
Default power limit
Frames: 3318 - Time: 60000ms - Avg: 55.300 - Min: 48 - Max: 68

2000mhz core clock (forced downclock, the default boost clock of my G1 1070)
8014mhz memory clock
Default power limit
Frames: 3373 - Time: 60000ms - Avg: 56.217 - Min: 47 - Max: 68



i7-6700k @ 4.7ghz core, 4.6ghz cache
16gb G.Skill Ripjaws @ 3100mhz 14-16-16-36
Gigabyte 1070 G1 flashed with ASUS Strix OC bios
Game run from Crucial MX300 SSD
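
For anyone wanting to sanity-check runs like these, here's a quick Python sketch (numbers taken from the 2000MHz and 2100MHz runs above) comparing the core-clock increase to the FPS gain:

```python
# Sketch: compare clock-speed increase vs. FPS gain from two 60 s runs.
def avg_fps(frames, ms):
    """Average FPS from a frame count over a run lasting `ms` milliseconds."""
    return frames / (ms / 1000.0)

# Frame counts from the 2000 MHz and 2100 MHz runs posted above.
fps_2000 = avg_fps(3420, 60000)              # 57.0 avg
fps_2100 = avg_fps(3561, 60000)              # 59.35 avg
clock_gain = (2100 - 2000) / 2000            # 5% core clock increase
fps_gain = (fps_2100 - fps_2000) / fps_2000  # ~4.1% FPS increase
print(f"clock +{clock_gain:.1%}, fps +{fps_gain:.1%}, "
      f"scaling efficiency {fps_gain / clock_gain:.0%}")
```

So roughly 80% of the clock increase shows up as framerate here, consistent with the game not being purely core-bound at these settings.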


----------



## RyanRazer

Quote:


> Originally Posted by *xGeNeSisx*
> 
> Did a comparison of best overclock profile so far, stock ASUS strix bios settings, and forced downclock
> 
> All tests were performed in Rise of the Tomb Raider at 2180x1620, image of ingame settings attached below. Each benchmark was started after a level was restarted and the same sequence was played for 60s
> 
> 2100mhz core clock
> 8700mhz memory clock
> Power slider 120% 1.063v
> Frames: 3561 - Time: 60000ms - Avg: 59.350 - Min: 52 - Max: 78
> 
> 2000mhz core clock (forced downclock)
> 8700mhz memory
> Power slider 120%
> Frames: 3420 - Time: 60000ms - Avg: 57.000 - Min: 49 - Max: 72
> 
> Stock settings: 2038 core clock with GPU Boost
> 8014mhz memory clock
> Default power limit
> Frames: 3318 - Time: 60000ms - Avg: 55.300 - Min: 48 - Max: 68
> 
> 2000mhz core clock (forced downclock, the default boost clock of my G1 1070)
> 8014mhz memory clock
> Default power limit
> Frames: 3373 - Time: 60000ms - Avg: 56.217 - Min: 47 - Max: 68
> 
> 
> 
> i7-6700k @ 4.7ghz core, 4.6ghz cache
> 16gb G.Skill Ripjaws @ 3100mhz 14-16-16-36
> Gigabyte 1070 G1 flashed with ASUS Strix OC bios
> Game run from Crucial MX300 SSD


Great, thanks man. So from 2000 to 2100MHz = a 5% clock increase for a 4% FPS increase (avg). These numbers (fps) are small, so there's room for small errors, but it looks to scale quite well. Tomorrow I'll do some tests myself as well; my cousin had a birthday today, so I was busy entertaining her.


----------



## gtbtk

Quote:





> Originally Posted by *dboythagr8*
> 
> Ended up getting the ASUS RoG Strix model.
> 
> Ran a quick and dirty Heaven benchmark. I got *55.4* fps with the 1070, 4930k @ 4.5ghz, 1440p, Extreme settings. Card was running at it's stock OC of 1638mhz.
> 
> Just curious if my results are about right for the resolution and settings.
> 
> Also are there any suggested OC tips for Pascal?


I used DSR to test as well on a 1920x1200 monitor. Overclocked, with a boost frequency that started at 2114MHz and dropped to 2100, and memory running at 9100MHz, I can get 59fps with an i7-2600 @ 4.4GHz.


----------



## gtbtk

Quote:


> Originally Posted by *RyanRazer*
> 
> Huh guys, take a look a this. 368.69 beta driver with full vulkan support. Will try this with doom.
> 
> https://developer.nvidia.com/vulkan-driver


We are up to the 376.33 drivers now. All drivers after the one you mentioned support Vulkan.


----------



## gtbtk

Quote:


> Originally Posted by *RyanRazer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xGeNeSisx*
> 
> Did a comparison of best overclock profile so far, stock ASUS strix bios settings, and forced downclock
> 
> All tests were performed in Rise of the Tomb Raider at 2180x1620, image of ingame settings attached below. Each benchmark was started after a level was restarted and the same sequence was played for 60s
> 
> 2100mhz core clock
> 8700mhz memory clock
> Power slider 120% 1.063v
> Frames: 3561 - Time: 60000ms - Avg: 59.350 - Min: 52 - Max: 78
> 
> 2000mhz core clock (forced downclock)
> 8700mhz memory
> Power slider 120%
> Frames: 3420 - Time: 60000ms - Avg: 57.000 - Min: 49 - Max: 72
> 
> Stock settings: 2038 core clock with GPU Boost
> 8014mhz memory clock
> Default power limit
> Frames: 3318 - Time: 60000ms - Avg: 55.300 - Min: 48 - Max: 68
> 
> 2000mhz core clock (forced downclock, the default boost clock of my G1 1070)
> 8014mhz memory clock
> Default power limit
> Frames: 3373 - Time: 60000ms - Avg: 56.217 - Min: 47 - Max: 68
> 
> 
> 
> i7-6700k @ 4.7ghz core, 4.6ghz cache
> 16gb G.Skill Ripjaws @ 3100mhz 14-16-16-36
> Gigabyte 1070 G1 flashed with ASUS Strix OC bios
> Game run from Crucial MX300 SSD
> 
> 
> 
> GReat tnx man. So from 2000 to 2100 = 5% increase and 4% FPS increase (avg). This numbers (fps) are small so room for small errors. But it looks to scale quite good. toworrow i'll do some tests myself as well, my cousin had a BD today so i was busy entertaining her
Click to expand...

In my experience, the 1070 is memory-bandwidth constrained at stock speeds compared to the 1080. Memory overclocking, for the most part, gives good improvements, and in certain situations sacrificing the absolute highest core frequency to allow more memory bandwidth gives you better framerates. The 1070 is about 20% slower than a 1080 despite having 25% fewer cores. Tests have also shown that a 1080 stops gaining performance once the stock 10GHz GDDR5X memory is overclocked past 11GHz, whereas GDDR5 cards continue to see improvements with memory overclocks above 9GHz, which is a much bigger increase in percentage terms.

You may want to compare, say, 2050MHz core / 9000MHz memory to 2100/8500 and see which performs better in terms of framerates. Obviously there is a balance point somewhere, perhaps at 2076 or 2088MHz.

If your memory craps out at not much more than stock speeds, try having a look at the VRM frequency and the VCCIO, VCCSA and CPU PLL voltages on your motherboard, and increase them a bit if they are still on auto or set really low in the acceptable range. You will need to research what the safe range for your motherboard is, but I am sure the 1070 is more capable than your old GPU of stressing things enough to expose new instabilities in motherboard overclock settings you may not have looked at.

On my Z68 board, I found that this stopped my card from randomly crashing under load at certain higher overclocks that worked sometimes but not others, and it improved my memory overclock by another 100MHz before the card started throwing artifacts. GPU performance has also improved: I gained another 150 points on my Firestrike graphics score after hitting a wall at 20500 with the original settings I was using. I hope that with more experimentation I can improve it even more.

I have not seen any mention or discussion of this before; people seem to have accepted silicon lottery as the cause of low overclocks in every situation. Since those voltage settings all play a part in fine-tuning the quality of the signalling sent over the PCIe bus, settings that are not tuned to each other properly will result in signals that are weaker and less well defined, placing lower limits on your ultimate performance potential. I suspect those variations between individual boards play a role in producing much wider overclocking discrepancies than we would see if everything else were equal.
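
The back-of-the-envelope math behind the bandwidth argument can be sketched as follows. The 1070's 256-bit bus and ~8008MHz effective stock data rate are taken from public specs; treat the numbers as approximate:

```python
# Sketch: raw memory bandwidth for a GDDR5 card such as the GTX 1070
# (256-bit bus assumed). Peak GB/s = effective data rate * bus width in bytes.
def bandwidth_gbps(effective_mhz, bus_width_bits=256):
    """Peak theoretical memory bandwidth in GB/s."""
    return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

stock = bandwidth_gbps(8008)  # ~256 GB/s at the stock 8008 MHz effective
oc = bandwidth_gbps(9000)     # ~288 GB/s at a 9000 MHz effective overclock
print(f"stock {stock:.0f} GB/s -> OC {oc:.0f} GB/s (+{oc / stock - 1:.1%})")
```

A ~12% bandwidth bump is why memory overclocks can be worth trading a core bin or two on this card.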


----------



## RyanRazer

Quote:


> Originally Posted by *gtbtk*
> 
> We are up to the 376.33 drivers now. All drivers after the one you mentioned support Vulkan.


Oops, I meant:

Developer Beta Driver Downloads

"Windows driver version 376.71 and Linux driver version 375.27.07 provide new features for Vulkan developers to test their upcoming Vulkan applications."

Understand these are *beta* drivers aimed at *developers*, so probably not the best option to run as a daily driver.


----------



## Mazda6i07

Kind of off topic; Do you think at this point in the year it would be smarter to just wait for the next release of cards (2070?)


----------



## pez

Quote:


> Originally Posted by *Mazda6i07*
> 
> Kind of off topic; Do you think at this point in the year it would be smarter to just wait for the next release of cards (2070?)


I think the better question(s) is/are:
-Do you need a new card right now?
-Do you play something daily that doesn't give the performance you want/need/desire? Will the 1070 give said performance?

Otherwise, you can get yourself stuck in the loop of 'but something better will be out in XX months'.


----------



## Mazda6i07

Quote:


> Originally Posted by *pez*
> 
> I think the better question(s) is/are:
> -Do you need a new card right now?
> -Do you play something daily that doesn't give the performance you want/need/desire? Will the 1070 give said performance?
> 
> Otherwise, you can get yourself stuck in the loop of 'but something better will be out in XX months'.


I currently have an R9 390, which doesn't work well in Premiere / the Adobe suite and, in my opinion, can't handle BF1 very well. Should I wait a few months or just wing it and scoop up a 1070?


----------



## Awsan

People, I would like to know what a system with

i7 7700k
1070
16gb
500gb ssd
H100i

Will consume from the wall @

1-Idle
2-Low usage (Browsing and movie watching)
3-100% load
4-100% load OC


----------



## gtbtk

Quote:


> Originally Posted by *RyanRazer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> We are up to the 376.33 drivers now. All drivers after the one you mentioned support Vulkan.
> 
> 
> 
> Ups, ment:
> 
> Developer Beta Driver Downloads
> 
> "Windows driver version 376.71 and Linux driver version 375.27.07 provide new features for Vulkan developers to test their upcoming Vulkan applications."
> 
> Understand these are *beta* and *developers*
> 
> 
> 
> 
> 
> 
> 
> probably not the best option to run as daily driver.
Click to expand...

376.71 is a beta for the next release, but you didn't mention that one in your last post; you mentioned a 368 driver.


----------



## MEC-777

Quote:


> Originally Posted by *Awsan*
> 
> People i would like to know what a system with
> 
> i7 7700k
> 1070
> 16gb
> 500gb ssd
> H100i
> 
> Will consume from the wall @
> 
> 1-Idle
> 2-Low usage (Browsing and movie watching)
> 3-100% load
> 4-100% load OC


Try this. http://outervision.com/power-supply-calculator

Just input all your components. I've found this calculator to be fairly accurate.


----------



## gtbtk

Quote:


> Originally Posted by *Awsan*
> 
> People i would like to know what a system with
> 
> i7 7700k
> 1070
> 16gb
> 500gb ssd
> H100i
> 
> Will consume from the wall @
> 
> 1-Idle
> 2-Low usage (Browsing and movie watching)
> 3-100% load
> 4-100% load OC


It will depend on which 1070 you install. A Founders card can pull up to about 170W just for the graphics card, and a Zotac AMP Extreme can pull up to about 300W; most AIB custom-cooler cards will pull between 180W and 250W depending on the model.

Having said that, installing a Zotac AMP Extreme card in that rig could pull up to about 450 watts from the wall, or maybe a touch more. A Founders card could pull about 130W less.

At idle you are looking at about 100W with a web browser open, and about 110W with a YouTube video playing.
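
A rough way to ballpark it yourself; the component wattages below are loose assumptions, not measurements, and wall draw is DC load divided by PSU efficiency:

```python
# Sketch: rough wall-power estimate. Component figures are ballpark
# assumptions; wall draw = DC load / PSU efficiency (90% assumed).
def wall_draw(dc_watts, psu_efficiency=0.90):
    """Watts pulled from the wall for a given DC load."""
    return dc_watts / psu_efficiency

# Assumed full-load figures: overclocked 7700K ~120 W, AIB 1070 ~200 W,
# motherboard/RAM/SSD/pump/fans ~50 W.
load_dc = 120 + 200 + 50
print(f"~{wall_draw(load_dc):.0f} W at the wall under full OC load")
```

That lands in the same ~400W region the estimates above suggest; idle is dominated by the platform, not the GPU.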


----------



## pez

Quote:


> Originally Posted by *Mazda6i07*
> 
> Currently an r9 390, which doesnt work in premiere / adobe suite as well as cannot handle BF1 very well, in my opinion, but should i wait a few months or just wing it and scoop a 1070


Yeah, depending on the resolution, a 1070 is a great pick, IMO. I've been seeing deals pop up here and there where you can get a 1070 for around $350-370, which is a great price-to-performance ratio as well.


----------



## Mazda6i07

Quote:


> Originally Posted by *pez*
> 
> Yeah, depending on the resolution, a 1070 is a great pick, IMO. I've been seeing deals pop up here and there where you can get a 1070 for around $350-370, which is a great price-to-performance ratio as well.


Yeah, I can never decide if I should wait, but I think I'm going to get it and put it on water.


----------



## Awsan

So I can expect under 300W for a fully stock system, or under 350W when overclocked?


----------



## GeneO

Quote:


> Originally Posted by *gtbtk*
> 
> The 376.71 is a beta for the next release but you didnt mention that one in your last post, you mentioned a 368 driver


378 drivers are out today


----------



## DeathAngel74

Is that the RE7 Game Ready driver?


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> Is that the RE7 Game Ready driver?


Yeah, but I saw the Guru3D benchmark and it's pretty ridiculous, with the GTX 1070 under-performing; it included all the NV GPUs.


----------



## skupples

Quote:


> Originally Posted by *dboythagr8*
> 
> I downloaded GPU Tweak II for use with my Strix 1070. Can you not enable overlay monitoring ala Precision X and Afterburner?


I've had issues with MSI AB & PrecX overlays since returning to gaming.

Toggling them causes the game to skip a beat, but the overlay never shows up.

in short, idk.








Quote:


> Originally Posted by *pez*
> 
> Yeah, depending on the resolution, a 1070 is a great pick, IMO. I've been seeing deals pop up here and there where you can get a 1070 for around $350-370, which is a great price-to-performance ratio as well.


They were down to $329 after Christmas... Or was it $349 with a $20 discount? I forget.


----------



## RyanRazer

Quote:


> Originally Posted by *gtbtk*
> 
> The 1070, in my experience, is memory bandwidth constrained at stock speeds compared to the 1080. Memory overclocking seems for the most part, give good improvements and in certain situations sacrificing the absolute highest core frequency to allow more memory bandwidth often gives you better framerates. The 1070 is 20% slower than a 1080 with 25% less cores. Tests have also shown that a 1080 does not increase its performance after the stock 10Ghz GDDR5X memory overclock gets the memory to 11Ghz where we do continue to see improvements with GDDR5 ram overclocks that are above 9Ghz which is a much bigger increase in percentage terms
> 
> You may want to compare say, 2050Mhz Core and 9000 Mhz memory to 2100/8500 and see which one performs better in terms of framerates. Obliviously there is a balance point somewhere that may be 2076 or 2088Mhz.
> 
> If your memory craps out at not much more than the stock speeds, try having a look at the VRM frequency, VCCIO VCCSA and CPU PLL voltages on your motherboard and increase them a bit if they are still on auto or set really low on the acceptable range. You will need to research what the safe range for your motherboard is. But I am sure that the 1070 is more capable than your old GPU to stress things enough to find new instabilities in your Motherboard overclock settings that maybe you have not looked it
> 
> On my Z68 board, I found that it stopped my card from randomly crashing under load at certain higher overclocks that worked sometimes but not at others and improved Memory Overclock frequency another 100Mhz before it started throwing artifacts. GPU performance has also improved. I have improved my firestrike graphics score me another 150 points after hitting a wall at 20500 with the original settings I was using. I hope with more experimentation, I can improve it even more.
> 
> I have not seen any mention or discussion or looks into this before and people seem to have accepted the idea of silicon lottery being the cause of low overclocks in every situation. As those voltage settings all play a part in fine tuning the quality of the signalling being sent over the PCIe bus, having settings that do tuned to each other properly will result in signals that are less powerful and less well defined, placing lower limits on your ultimate performance potential, I suspect that those variations on different individual boards play a role in giving much wider Overclocking result discrepancies than we would be seeing if everything else was equal.


Hmm, didn't think of that. Considering what you've just written, it makes sense, as I run my 1070 on an MSI H81M-P33 basic mobo. I hope I get the RMA for the Gigabyte Z97 I sent back a while ago. You think my crappy mobo is to blame?

+ Do you guys use DDU when upgrading drivers, or only when experiencing issues? I would do a DDU cleaning only when experiencing issues, but what about the times I don't notice the benefits? How do you know you're missing something if you've never had it?







paradox much?


----------



## DeathAngel74

Quote:


> Originally Posted by *khanmein*
> 
> yeah but saw GURU3D benchmark is pretty ridiculous with GTX 1070 under-performing & included all NV GPU's


Meh....Oh well, just another slowly polished turd passed off as gold again....
Hopefully, something will drop again before the weekend. Tnx


----------



## zipzop

http://www.guru3d.com/articles-pages/resident-evil-7-pc-graphics-performance-benchmark-review,1.html

I don't know why those RE7 benchmarks show such low scores. It was the same story with the demo. 75 FPS for a GTX 1070 at 1440p? Well, those are my specs, and when I match those graphics settings I get between 100 and 120 FPS. #fakenews


----------



## gtbtk

Quote:


> Originally Posted by *RyanRazer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The 1070, in my experience, is memory bandwidth constrained at stock speeds compared to the 1080. Memory overclocking seems for the most part, give good improvements and in certain situations sacrificing the absolute highest core frequency to allow more memory bandwidth often gives you better framerates. The 1070 is 20% slower than a 1080 with 25% less cores. Tests have also shown that a 1080 does not increase its performance after the stock 10Ghz GDDR5X memory overclock gets the memory to 11Ghz where we do continue to see improvements with GDDR5 ram overclocks that are above 9Ghz which is a much bigger increase in percentage terms
> 
> You may want to compare say, 2050Mhz Core and 9000 Mhz memory to 2100/8500 and see which one performs better in terms of framerates. Obliviously there is a balance point somewhere that may be 2076 or 2088Mhz.
> 
> If your memory craps out at not much more than the stock speeds, try having a look at the VRM frequency, VCCIO VCCSA and CPU PLL voltages on your motherboard and increase them a bit if they are still on auto or set really low on the acceptable range. You will need to research what the safe range for your motherboard is. But I am sure that the 1070 is more capable than your old GPU to stress things enough to find new instabilities in your Motherboard overclock settings that maybe you have not looked it
> 
> On my Z68 board, I found that it stopped my card from randomly crashing under load at certain higher overclocks that worked sometimes but not at others and improved Memory Overclock frequency another 100Mhz before it started throwing artifacts. GPU performance has also improved. I have improved my firestrike graphics score me another 150 points after hitting a wall at 20500 with the original settings I was using. I hope with more experimentation, I can improve it even more.
> 
> I have not seen any mention or discussion or looks into this before and people seem to have accepted the idea of silicon lottery being the cause of low overclocks in every situation. As those voltage settings all play a part in fine tuning the quality of the signalling being sent over the PCIe bus, having settings that do tuned to each other properly will result in signals that are less powerful and less well defined, placing lower limits on your ultimate performance potential, I suspect that those variations on different individual boards play a role in giving much wider Overclocking result discrepancies than we would be seeing if everything else was equal.
> 
> 
> 
> Hmm. Didn't think of that. Considering what you've just wrote it makes sense as i run my 1070 on a msi msi h81m-p33 basic mobo. I hope i get RMA from z97 giga i sent back a while ago. You think my crappy mobo is to blame?
> 
> + do you guys use DDU when upgrading drivers? Or only when experiencing issues? I would do a ddu cleaning only when experiencing issues but what about those times i don't experience benefits? How do you know that you don't have something, if you don't have it
> 
> 
> 
> 
> 
> 
> 
> paradox much?
Click to expand...

It is possible, I guess; H81 boards are pretty old and they don't have the greatest VRMs.

These Pascal cards, when overclocked, stress older boards in terms of data transfer more than previous generations did. I think we are starting to see instability now because we could never stress the PCIe bus this much before. I don't have a Z170 or Z270 board to test against, but my guess is that those boards would not need the same level of fine-tuning that, say, a Z68 board needs.

I only use DDU if I have a problem; normally I just do a custom clean install.


----------



## Mazda6i07

So I scooped this up today; did a quick & dirty oc on it.


__
https://flic.kr/p/Rt5Jm7


__
https://flic.kr/p/Rt5Jm7


----------



## GeneO

Quote:


> Originally Posted by *Mazda6i07*
> 
> So I scooped this up today; did a quick & dirty oc on it.
> 
> 
> __
> https://flic.kr/p/Rt5Jm7
> 
> 
> __
> https://flic.kr/p/Rt5Jm7


Yeah, I can get 133 fps (3356 score) on those settings. That benchmark is sensitive to VRAM speed; I am running a 9137 MHz effective memory clock for that.


----------



## Mazda6i07

Okay, I need to get a better OC setup, but I guess I'm not terribly far off then.


----------



## MEC-777

Quote:


> Originally Posted by *Mazda6i07*
> 
> So I scooped this up today; did a quick & dirty oc on it.
> 
> No idea if my heaven 4 score is good; any insight from you guys? Thanks


I scored 3233 (using same settings) on my Founders Edition with a custom curve OC and Vram at 9000 (+500). 128.3 fps avg.

What are you using to OC and what are your settings?


----------



## pez

Quote:


> Originally Posted by *Mazda6i07*
> 
> Yeah, I just always can't decide if i should wait, but i think im going to get it and put it on water.


Quote:


> Originally Posted by *skupples*
> 
> I've had issues with MSI AB & PrecX overlays since returning to gaming.
> 
> Toggling them causes the game to skip a beat, but the overlay never shows up.
> 
> in short, idk.
> 
> 
> 
> 
> 
> 
> 
> 
> they were down to $329 after christmas... Or was it $349 & I had a #20 discount? I forget.


You could be right. The closer to $300 the AIB cards become, the better!
Quote:


> Originally Posted by *Mazda6i07*
> 
> So I scooped this up today; did a quick & dirty oc on it.
> 
> 
> __
> https://flic.kr/p/Rt5Jm7
> 
> 
> __
> https://flic.kr/p/Rt5Jm7
> 
> 
> 
> Spoiler: Warning: Spoiler!


I see you made a decision! Nice!

I have to say, the EVGA cooler looks a million times better in person than in pictures. I was really hesitant about it and criticized it heavily when I first saw it. But I picked up a 1080 SC last week and have to say I'm eating my words.


----------



## gtbtk

Quote:


> Originally Posted by *Mazda6i07*
> 
> So I scooped this up today; did a quick & dirty oc on it.
> 
> 
> 
> 
> __
> https://flic.kr/p/Rt5Jm7


That is a great start. With a bit more time tuning, you have the potential to get that up to about 130fps.


----------



## zipper17

Quote:


> Originally Posted by *zipzop*
> 
> http://www.guru3d.com/articles-pages/resident-evil-7-pc-graphics-performance-benchmark-review,1.html
> 
> I don't know why those re7 benchmarks show such low score. It was the same story with the demo. 75 FPS for GTX 1070 in 1440p? Well thats my specs and I match those graphics settings and get between 100 and 120FPS. #fakenews


That's why I prefer benchmark videos that show entire runs, frame by frame and scene by scene; it's more accurate.

Mainstream benchmarks only show one number, which is of course ridiculous; we don't even know in which scene the 1070 was at 74 FPS.


----------



## blued

The Guru3D benchmarks have been updated with explanations. Apparently the latest NV drivers impact performance, and there are other inconsistencies that make this game difficult to bench.


----------



## shadowrain

PCgameshardware.de has different results though.



http://www.pcgameshardware.de/Resident-Evil-7-Biohazard-Spiel-57353/Specials/Benchmark-PC-Anforderungen-1219005/


----------



## khanmein

Quote:


> Originally Posted by *shadowrain*
> 
> PCgameshardware.de has different results though.
> 
> 
> 
> http://www.pcgameshardware.de/Resident-Evil-7-Biohazard-Spiel-57353/Specials/Benchmark-PC-Anforderungen-1219005/


Once you turn on shadow cache, the NV cards start to get crippled!


----------



## dboythagr8

Quote:


> Originally Posted by *RyanRazer*
> 
> + do you guys use DDU when upgrading drivers? Or only when experiencing issues? I would do a ddu cleaning only when experiencing issues but what about those times i don't experience benefits? How do you know that you don't have something, if you don't have it
> 
> 
> 
> 
> 
> 
> 
> paradox much?


I usually just do the Express install when a new set comes out. I never ran into any problems. I did use DDU a few days ago, but that was due to my system being out of commission for 4 months, and I just got a 1070. Used DDU to do a clean wipe and upgrade to the, at the time, newest driver set.

I think most folks here would tell you they use DDU with every driver install though.


----------



## zipper17

Anyone try latest driver?

Looks like Performances Improved on some Synthetic Bench and games.

__
https://www.reddit.com/r/5pwfnh/driver_37849_faqdiscussion_thread/

Will try this driver, let see if there is a gain on Firestrike Graphics.


----------



## RyanRazer

Quote:


> Originally Posted by *dboythagr8*
> 
> I usually just do the Express install when a new set comes out. I never ran into any problems. I did use DDU a few days ago, but that was due to my system being out of commission for 4 months, and I just got a 1070. Used DDU to do a clean wipe and upgrade to the, at the time, newest driver set.
> 
> I think most folks here would tell you they use DDU with every driver install though.


Tnx for input.


----------



## zipper17

Definitely there's some improvement over the previous driver.

20,7XX to 20,9XX Firestrike graphics, +100-200 pts, with the exact same OC settings.

Will try to overclock again to break 21K lol!?


----------



## skupples

Quote:


> Originally Posted by *RyanRazer*
> 
> Hmm. Didn't think of that. Considering what you've just wrote it makes sense as i run my 1070 on a msi msi h81m-p33 basic mobo. I hope i get RMA from z97 giga i sent back a while ago. You think my crappy mobo is to blame?
> 
> + do you guys use DDU when upgrading drivers? Or only when experiencing issues? I would do a ddu cleaning only when experiencing issues but what about those times i don't experience benefits? How do you know that you don't have something, if you don't have it
> 
> 
> 
> 
> 
> 
> 
> paradox much?


I pretty much always use the DDU / clean-install feature.

It doesn't seem as pertinent these days, but I also don't spend time pinching percentages anymore.

I still don't get the point of the NV shader cache. It STILL seems to affect performance with zero visual benefit.


----------



## MEC-777

I only use DDU maybe once a year, or if I'm experiencing driver issues, which seem to be very few and far between these days. Nvidia drivers usually update very smoothly for me.


----------



## RyanRazer

Quote:


> Originally Posted by *skupples*
> 
> I pretty much always use DDU / clean install feature.
> 
> It doesn't seem as pertinent these days, but I also don't spend time pinching %s anymore.
> 
> I still don't get the point of the NV shader cache. STILL seems to affect performance with zero visual benefit.


I just went into the Nvidia Control Panel and found that cache you are talking about. What is it? Do you have it off?


----------



## skupples

Quote:


> Originally Posted by *RyanRazer*
> 
> I just went into nvidia controle panel and found that cache you are talking about. what is that? do you have it off?


I've been turning it off since it came out, which I think was late in the Kepler days.

I don't remember its purpose, but I do remember it hindering performance for anyone not running a $50 GPU at release.

Heeey, look! OCN is the top result in my Google bubble when googling "purpose of NV shader cache":

http://www.overclock.net/t/1553142/shader-cache-setting/0_50

This quote intrigues me.
Quote:


> Originally Posted by *jlhawn*
> 
> I keep it disabled under global settings as I see no difference with it enabled except my ssd space gets smaller due to it storing info in the shader cache folder. it you disable it don't forget to delete the files it created, you have to reboot before you can delete the files though after disabling it.


Too bad he didn't mention where the cache stash is typically located.

OK, so after 2-3 minutes of reading: it does what it's called. It stashes compiled shaders so the CPU doesn't have to recompile them later on.

Hmmm... gonna keep reading. Seems like it would be more (or only) beneficial on slower/older systems.

Also, does it ever purge this cache, or does it keep it around for later use?

It seems like a slow CPU and storage matched with a fast GPU could possibly be hurt by this setting, whereas a fast main system with a slow/old GPU could benefit.
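
If you want to see how much disk the cache actually eats, a quick sketch; the path is the commonly reported default location under the user's temp folder, not something I've verified on every driver version:

```python
# Sketch: measure how much disk the NVIDIA shader cache is using.
# The NV_Cache path below is the commonly reported default, assumed here.
import os

def dir_size_mb(path):
    """Total size of all files under `path`, in MB (0 if the path is missing)."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass  # file vanished mid-walk; skip it
    return total / (1024 * 1024)

cache = os.path.expandvars(r"%LOCALAPPDATA%\Temp\NVIDIA Corporation\NV_Cache")
print(f"NV_Cache: {dir_size_mb(cache):.1f} MB")
```

Handy for checking whether disabling the setting (and rebooting) actually freed the space.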


----------



## ErrorFile

So my 1070 Game Rock still turns the fans on and off unless I keep them at at least 25% speed. I'm on the latest BIOS version. I noticed that the Premium version got a fan-curve fix in its BIOS update, while the normal version didn't have the same thing mentioned in the change list. Should I flash the Premium BIOS, or what should I do? I'd like to keep the semi-passive mode on.


----------



## GeneO

Quote:


> Originally Posted by *ErrorFile*
> 
> So my 1070 Game Rock still on-off's the fans, if I don't keep at least 25 % -speed on the fans. I'm on the latest BIOS-version. I noticed that the Premium-version has gotten a fan-curve fix on its BIOS-update, while this normal version didn't have the same thing mentioned in the change list. Should I flash the Premium-BIOS or what to do? I'd like to keep the semipassive-mode on.


The fan can only spin at 25% or above; below that it stops. It's nothing to do with the BIOS - fans have a minimum RPM. The curve you want should sit at zero percent for most tasks, then step up to over 25% for gaming and the like, with a hysteresis of a few degrees if you want to avoid the start/stop.
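
The hysteresis idea can be sketched in a few lines; the temperatures and duty values here are illustrative, not any card's actual curve:

```python
# Sketch: fan control with hysteresis - fans stay off below a start
# temperature and, once spinning, only stop again a few degrees lower.
# Thresholds are illustrative; min_duty models the fan's minimum spin speed.
def fan_speed(temp_c, fan_on, start_at=60, stop_at=55, min_duty=25):
    """Return (duty %, new fan state) for the current temperature."""
    if not fan_on and temp_c >= start_at:
        fan_on = True
    elif fan_on and temp_c <= stop_at:
        fan_on = False
    if not fan_on:
        return 0, fan_on
    # Linear ramp from min_duty at start_at up to 100% at 85 C.
    duty = min_duty + (100 - min_duty) * max(0, temp_c - start_at) / (85 - start_at)
    return min(100, round(duty)), fan_on

state = False
for t in (44, 61, 58, 57, 54):  # fans kick in at 61 C, stay on until 54 C
    duty, state = fan_speed(t, state)
    print(f"{t} C -> {duty}%")
```

Note how 58C gives 25% (fan already on) while 44C gives 0%: the gap between `start_at` and `stop_at` is exactly what prevents the on/off flutter described above.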


----------



## GeneO

Quote:


> Originally Posted by *skupples*
> 
> I've been turning it off since it came out... Which I think was late in the Kepler days.
> 
> I don't remember its purpose, but I do remember it hindering performance for anyone not running a $50 GPU @ release.
> 
> heeey look! OCN is top result in my google bubble when googling "purpose of NV Shader cache"
> 
> http://www.overclock.net/t/1553142/shader-cache-setting/0_50
> 
> this quote intrigues me.
> too bad he didn't mention where the cache stash is typically located.
> 
> OK - sooo after 2-3 minutes of reading. It does what its called. it stashes compiled shaders so the CPU doesn't have to recompile them later on.
> 
> hmmm.... gonna keep reading. Seems like it would be more (or only) beneficial on slower/older systems.
> 
> also, does it purge this cache? Or does it save it for later use?
> 
> seems like a slow CPU n storage matched with a fast GPU could possible by affected by this setting. Whereas, a fast main system w/ slow/old GPU could benefit.


It is located in C:\Users\<username>\AppData\Local\Temp\NVIDIA Corporation\NV_Cache


----------



## asdkj1740

Quote:


> Originally Posted by *zipper17*
> 
> Anyone try latest driver?
> 
> Looks like Performances Improved on some Synthetic Bench and games.
> 
> __
> https://www.reddit.com/r/5pwfnh/driver_37849_faqdiscussion_thread/
> 
> Will try this driver, let see if there is a gain on Firestrike Graphics.


Watch Dogs 2 has a significant improvement on the stuttering issue.


----------



## icold

We need a Pascal BIOS tweaker. Someone has unlocked the 1080 Strix BIOS and pushed [email protected]; the voltage can be increased up to 1200mV.

http://forums.evga.com/GTX-1080-Unlocked-Bios-Voltagefan-limit-increased-NEW-possible-voltage-tool-m2548379.aspx


----------



## ErrorFile

Quote:


> Originally Posted by *GeneO*
> 
> The fan can only spin up at 25%, below that it stops, Nothing to do with the bios - fans have a minimum rpm. The curve you want should have it zero % for most tasks, then step up to over 25% for gaming like above, with a hysteresis of a few degrees, if you want to avoid the start/stop.


Well, of course the fans do have their minimum speed.







But that's the problem: I tried to create a fan curve where the fans wouldn't spin below 60c, and they still keep going on-off while the GPU temperature is now 44c on the desktop. If I turn auto mode on or use my own curve, the fans start doing this immediately. It stops happening if I set a fixed speed, like 25%. I contacted Palit and they told me to flash a new BIOS version, but I already have the latest version, so I only got an error message saying the following:

"No need to update your VGA BIOS! (or not support for your VGA card)"

In the BIOS file they sent me, the only differences seem to be the file name and the fan-curve changes listed in the included change list. VGA_BIOS_Upgrade_1027-AF.exe is the file name, while the one I flashed last week was VGA_BIOS_Upgrade_1027-A.exe.

It seems they asked me to flash the Premium-version BIOS, as I downloaded that file as well and it seems to be the same one the guy from Palit told me to flash. I just emailed Palit and asked how I can flash that fixed BIOS version now. Hopefully I'm being clear enough, typing this without my morning coffee.


----------



## GeneO

Quote:


> Originally Posted by *ErrorFile*
> 
> Well, of course the fans do have their minimum speed.
> 
> 
> 
> 
> 
> 
> 
> But that's the problem - I tried to create such a fan-curve that below 60c fans wouldn't spin and they still keep going on-off while the GPU-temperature is now 44c on desktop and if I put the automode on or use my own curve, the fans will start to do this immediately. It stops happening if I set a fixed speed, like the 25 %. I contacted Palit and they told to me to flash a new BIOS-version, but I already have the latest version and so I only got a error-message saying the following:
> 
> "No need to update your VGA BIOS! (or not support for your VGA card)"
> 
> The BIOS-file that they sent to me, the only difference seems to be the file name and fan-curve changes listed in the included change-list. VGA_BIOS_Upgrade_1027-AF.exe is the file name, while the one I've flashed last week was this: VGA_BIOS_Upgrade_1027-A.exe
> 
> It seems like they asked me to flash the Premium-version BIOS, as I downloaded that file as well and it seems to be the same as the guy from Palit told me to flash. I just emailed Palit and asked how can I flash that fixed BIOS-version now. Hopefully I'm being clear enough, typing this without my morning coffee.


Nope. clear. Good luck.


----------



## RyanRazer

Quote:


> Originally Posted by *skupples*
> 
> I've been turning it off since it came out... Which I think was late in the Kepler days.
> 
> I don't remember its purpose, but I do remember it hindering performance for anyone not running a $50 GPU @ release.
> 
> heeey look! OCN is top result in my google bubble when googling "purpose of NV Shader cache"
> 
> http://www.overclock.net/t/1553142/shader-cache-setting/0_50
> 
> this quote intrigues me.
> too bad he didn't mention where the cache stash is typically located.
> 
> OK - sooo after 2-3 minutes of reading. It does what its called. it stashes compiled shaders so the CPU doesn't have to recompile them later on.
> 
> hmmm.... gonna keep reading. Seems like it would be more (or only) beneficial on slower/older systems.
> 
> also, does it purge this cache? Or does it save it for later use?
> 
> seems like a slow CPU n storage matched with a fast GPU could possible by affected by this setting. Whereas, a fast main system w/ slow/old GPU could benefit.


In my experience (I only tested with BF1) performance is roughly the same (hard to test fps accurately in MP), but it lags and stutters for the first 30 seconds; then it's the same. Considering my Nvidia temp files take up only 50MB of space, I'll keep it on.


----------



## skupples

Quote:


> Originally Posted by *icold*
> 
> We need Pascal bios tweaker. Someone has unlocked 1080 strix bios and push [email protected] the voltage can increse until 1200mv
> 
> http://forums.evga.com/GTX-1080-Unlocked-Bios-Voltagefan-limit-increased-NEW-possible-voltage-tool-m2548379.aspx












hmm... n I'd guess the strix & FTW have different voltage controllers?
Quote:


> Originally Posted by *RyanRazer*
> 
> In my experience (i only tested with BF1) performance is roughly the same (hard to test fps accurately in MP) but it lags and stutters for first 30 seconds. then, it's the same. So considering my Nvidia temp files take up only 50MB of space, i'll keep that on.


I'd like to test it in titles like Watch Dogs... Ones that're known to have issues with stutter due to texture streaming.


----------



## icold

Quote:


> Originally Posted by *skupples*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> hmm... n I'd guess the strix & FTW have different voltage controllers?
> I'd like to test it in titles like Watch Dogs... Ones that're known to have issues with stutter due to texture streaming.


I have 1070 strix, and no


----------



## skupples

Quote:


> Originally Posted by *icold*
> 
> I have 1070 strix, and no


then there's hope after all.

I'm surprised no one's jumped to testing it yet.

It'll be the first thing I derp with when I get my new office setup this weekend. 1.2V isn't much, but its better than nothing & I'm willing to bet most of us will be able to push it 24/7 without causing damage to our cards.


----------



## icold

Quote:


> Originally Posted by *skupples*
> 
> then there's hope after all.
> 
> I'm surprised no one's jumped to testing it yet.
> 
> It'll be the first thing I derp with when I get my new office setup this weekend. 1.2V isn't much, but its better than nothing & I'm willing to bet most of us will be able to push it 24/7 without causing damage to our cards.


Pascal custom cards run very cool, and their BIOSes are heavily capped. With a custom BIOS we could push the overclock much further and keep it fixed, gaining much more performance; around 2250-2300MHz in the 70s °C is very safe.


----------



## RyanRazer

Quote:


> Originally Posted by *icold*
> 
> Pascal custom cards are very cold, And their bios are very capped. With custom bios we could increase the overclock much more and keep it fixed, gaining much more performance, around 2250 - 2300mhz at 70´s °C its very safe.


That is if temps rise linearly with frequency... It could be that heat starts to rise dramatically with freq.


----------



## dboythagr8

My 1070 Strix has yet to break 55c with the very small fan curve I've given it. I've benchmarked and gamed as well.

Different world from the Titan and its blower-style cooling that I'm used to


----------



## xGeNeSisx

Anyone used the ASUS GPU Tweak utility? Wondering how useful it is; might try it later, but not sure how it will work on a G1 w/ Strix BIOS


----------



## mrzoo

Just got my EVGA GTX 1070 Superclocked the other day, and I noticed that when I turn the PC on, the monitor says no signal and I have to reset the PC to get a picture. Any idea why this is happening? I have it connected with DisplayPort; this never happened with my 970 G1.


----------



## SuperZan

Quote:


> Originally Posted by *mrzoo*
> 
> Just got my EVGA GTX 1070 superclocked the other day and I noticed when I turn pc on monitor says no signal and I have to reset pc in order to get picture on monitor. Any idea why this is happening in have it connected with displayport, this never happened when I had my 970 g1.


 I had weird DP problems with my Furies and Fury X, but I haven't used DP with the 1070 so I'm not sure if it's just a generic DP issue or specific to the card. https://rejzor.wordpress.com/2016/03/12/fix-displayport-not-working-no-signal/

I ultimately solved the issue with my Furies by purchasing a VESA-compliant DP cable rather than the one supplied with my Fury X.

Here is some additional info that may be of use to you:

https://www.reddit.com/r/4vrjxv/question_just_got_evga_gtx_1070_ftw_display_ports/


----------



## mrzoo

Quote:


> Originally Posted by *SuperZan*
> 
> I had weird DP problems with my Furies and Fury X, but I haven't used DP with the 1070 so I'm not sure if it's just a generic DP issue or specific to the card. https://rejzor.wordpress.com/2016/03/12/fix-displayport-not-working-no-signal/
> 
> I ultimately solved the issue with my Furies by purchasing a VESA-compliant DP cable rather than the one supplied with my Fury X.
> 
> Here is some additional info that may be of use to you:
> 
> __
> https://www.reddit.com/r/4vrjxv/question_just_got_evga_gtx_1070_ftw_display_ports/


Thanks, I'll try these out and see before getting a new cable. My monitor is an ASUS VG248QE (144Hz gaming monitor)


----------



## icold

Quote:


> Originally Posted by *xGeNeSisx*
> 
> Anyone used the ASUS Gpu Tweak utility? Wondering how useful it is, might try it later but not sure how it will work on G1 w/ Strix bios


Not worth it; MSI Afterburner is much better


----------



## dboythagr8

Quote:


> Originally Posted by *xGeNeSisx*
> 
> Anyone used the ASUS Gpu Tweak utility? Wondering how useful it is, might try it later but not sure how it will work on G1 w/ Strix bios


It's trash. I tried putting up with it for a few days when I got my 1070 Strix, and then decided to just go back to Afterburner.


----------



## xGeNeSisx

Thanks guys, I'll stay away. I feared it would be like most Asus software. This is my first time using Asus components in a build, and as much as I respect their hardware quality most of their software is trash


----------



## MEC-777

Quote:


> Originally Posted by *xGeNeSisx*
> 
> Thanks guys, I'll stay away. I feared it would be like most Asus software. This is my first time using Asus components in a build, and as much as I respect their hardware quality most of their software is trash


Yep. Great hardware, horrible software. lol. I had to uninstall the Asus AI Suite for my Z97-E motherboard because it was interfering with allowing Speedfan full control of the fan headers.


----------



## icold

Quote:


> Originally Posted by *MEC-777*
> 
> Yep. Great hardware, horrible software. lol. I had to uninstall the Asus AI Suite for my Z97-E motherboard because it was interfering with allowing Speedfan full control of the fan headers.


I used AI Suite just to test the OC; after that, uninstalled...


----------



## VladimirAG

Hi all... can I join?









  

…and first Q is… where can I see the overclock potential of this beauty?

_PS: Samsung RAM._


----------



## RyanRazer

Quote:


> Originally Posted by *VladimirAG*
> 
> Hi all... can I joing?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> &#8230;and first Q is&#8230; where can I see overclock potential of this beauty?
> 
> _PS: Samsung RAM._


wow, nice card!


----------



## rfarmer

Quote:


> Originally Posted by *VladimirAG*
> 
> Hi all... can I joing?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> &#8230;and first Q is&#8230; where can I see overclock potential of this beauty?
> 
> _PS: Samsung RAM._


Yeah I would have loved to get one of those but I have a Ncase with only 5.5" GPU width, so ended up putting a block on a FE. That is a great looking card.


----------



## icold

Quote:


> Originally Posted by *VladimirAG*
> 
> Hi all... can I joing?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> &#8230;and first Q is&#8230; where can I see overclock potential of this beauty?
> 
> _PS: Samsung RAM._


How high does it go?


----------



## VladimirAG

*RyanRazer
rfarmer*
Thanx









*icold*
At the moment




…if someone could suggest overclocking nuances for the 10xx series, or links to a good guide (step by step on power limits, curves, etc.), I'd be grateful


----------



## icold

Quote:


> Originally Posted by *VladimirAG*
> 
> *RyanRazer
> rfarmer*
> Thanx
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *icold*
> On this moment
> 
> 
> 
> &#8230;if someone suggested overclock nuances 10xx series or links on good manual&#8230; step by step about power limits, qurves etc., I would be grateful


I have problems with the curves, they always crash... I use the traditional +226. Do you get better results using the curve?


----------



## gtbtk

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *icold*
> 
> I have 1070 strix, and no
> 
> 
> 
> then there's hope after all.
> 
> I'm surprised no one's jumped to testing it yet.
> 
> It'll be the first thing I derp with when I get my new office setup this weekend. 1.2V isn't much, but its better than nothing & I'm willing to bet most of us will be able to push it 24/7 without causing damage to our cards.

I have tried just about all the different BIOSes on my MSI Gaming X. Both the EVGA and Strix BIOSes work fine on the MSI card, but the EVGA hits its power limit very easily, so the boost clock jumps around a lot. Make sure you test your FTW in game using the 2nd BIOS, as it is the more powerful one with a higher power limit.

The Asus BIOS maintains more stable clocks, but the Asus OC BIOS has a lower power limit than the 2nd BIOS on the FTW (200W vs 226W). Don't waste your time with the non-OC BIOS.

To address stuttering, you could try disabling the Shader Cache in the Nvidia control panel. The cache really only helps older, lower-powered cards. You may find that helps with stuttering.


----------



## gtbtk

Quote:


> Originally Posted by *xGeNeSisx*
> 
> Anyone used the ASUS Gpu Tweak utility? Wondering how useful it is, might try it later but not sure how it will work on G1 w/ Strix bios


It will work to overclock the card, but don't waste your time. Afterburner is a superior utility.


----------



## gtbtk

Quote:


> Originally Posted by *mrzoo*
> 
> Just got my EVGA GTX 1070 superclocked the other day and I noticed when I turn pc on monitor says no signal and I have to reset pc in order to get picture on monitor. Any idea why this is happening in have it connected with displayport, this never happened when I had my 970 g1.


Have you tried changing to a different DP port on the 1070 SC?


----------



## gtbtk

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xGeNeSisx*
> 
> Thanks guys, I'll stay away. I feared it would be like most Asus software. This is my first time using Asus components in a build, and as much as I respect their hardware quality most of their software is trash
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yep. Great hardware, horrible software. lol. I had to uninstall the Asus AI Suite for my Z97-E motherboard because it was interfering with allowing Speedfan full control of the fan headers.

The Asus software services also kill CPU performance in benchmarks.


----------



## gtbtk

Quote:


> Originally Posted by *VladimirAG*
> 
> *RyanRazer
> rfarmer*
> Thanx
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *icold*
> On this moment
> 
> 
> 
> &#8230;if someone suggested overclock nuances 10xx series or links on good manual&#8230; step by step about power limits, qurves etc., I would be grateful


You should be fine with the voltage, power limit and temperature sliders all the way to maximum.

Your memory should overclock to somewhere between +500 and +800 with good performance increases.

Depending on your card and your rig, the card may overclock easily with the slider, or you may find you get the best results leaving the core clock slider at 0, pulling the 1.093V point on the curve up to somewhere above 2100-2200, and seeing how it goes. Some water-cooled cards I believe are getting 2300MHz. You are going to have to experiment to find out what you can do with yours.
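That "experiment" step is really just a search for the highest offset that survives your stress test. A rough sketch of the loop most of us do by hand, where `is_stable` is a stand-in for an actual Heaven/Firestrike/gaming run, not a real API:

```python
# Sketch of the usual offset search: step the clock offset up until the
# stress test fails, then back off a step as a safety margin.
# `is_stable` is a placeholder for a real stress-test session.

def find_max_offset(is_stable, start=0, step=25, limit=300, backoff=1):
    """Raise the offset in `step` MHz increments while `is_stable` passes,
    then retreat `backoff` steps as a safety margin."""
    offset = start
    while offset + step <= limit and is_stable(offset + step):
        offset += step
    return max(start, offset - backoff * step)

# Pretend this card artifacts above +190 MHz (as reported earlier in the thread):
stable_up_to_190 = lambda mhz: mhz <= 190
print(find_max_offset(stable_up_to_190))  # -> 150, one step below the last stable +175
```

In practice each `is_stable` check is minutes of benchmarking, which is why people jump in coarse steps first and refine afterwards.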


----------



## mrzoo

Quote:


> Originally Posted by *gtbtk*
> 
> Have you tried changing to a different DP port on the 1070 SC?


I have. I'm gonna just take the card out and reseat it, make sure it's sitting correctly on the mobo. I even went into the BIOS and made sure PCIe 1 was first in the boot order.


----------



## icold

Quote:


> Originally Posted by *gtbtk*
> 
> I have tried just about all the different bioses on my MSI Gaming X. Both EVGA and Strix bioses work fine on the MSI card but the EVGA does hit its power limit very easily so the boost clock jumps around a lot. Make sure that you test your FTW in game using the 2nd bios as it is the more powerful one with a higher power limit.
> 
> The Asus bios maintains more stable clocks but, the Asus OC bios has a lower voltage limit than the 2nd bios on the FTW (200w vs 226W). Dont waste your time with the non OC bios.
> 
> To address stuttering, you could try is to disable the Shader Cache in Nvidia control panel. The Cache really only helps older lower powered cards. You may find that helps with stuttering.


Forget it, we need a Pascal BIOS tweaker


----------



## skupples

Quote:


> Originally Posted by *icold*
> 
> I have problem with the curves, ever crash... i use traditional + 226. Have you better result use curve?


lol, this is all I did with my card.

max volts, +226, +500, max fan (I don't really care about fan noise I have 40 gentle typhoons in my case) & I just observe the power target to see if she needs more or less.

worst case scenario so far, I have to go +226 +400.


----------



## icold

Quote:


> Originally Posted by *skupples*
> 
> lol, this is all I did with my card.
> 
> max volts, +226, +500, max fan (I don't really care about fan noise I have 40 gentle typhoons in my case) & I just observe the power target to see if she needs more or less.
> 
> worst case scenario so far, I have to go +226 +400.


You got 2126MHz too? My memory is Micron, I only get 4374MHz. If you want lower temps, try changing the thermal paste to Coollaboratory Liquid Pro, the best paste on the market.


----------



## skupples

Quote:


> Originally Posted by *icold*
> 
> you grab 2.126mhz too, my memory is micron i grab only 4374mhz. If you want - temp, try change thermal past to Coolaboratory Liquid Pro
> , the best paste of the market.


Ultra is the good stuff, pro is harder to work with. I might even have a stick of ultra stashed somewhere...

You know, I haven't even checked to be honest. I just set it & went about my business.


----------



## icold

Quote:


> Originally Posted by *skupples*
> 
> Ultra is the good stuff, pro is harder to work with. I might even have a stick of ultra stashed somewhere...
> 
> You know, I haven't even checked to be honest. I just set it & went about my business.


I just saw a comparison test against MX-4; the two have the same performance. Better to grab MX-4: same performance as CLP, non-conductive, easy to apply, easy to clean, and cheaper.







But if you have CLP, use it.


----------



## gtbtk

Quote:


> Originally Posted by *mrzoo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Have you tried changing to a different DP port on the 1070 SC?
> 
> 
> 
> I have I'm gonna just take the card out and reseat it make sure it's sitting correctly on mobo. I even went into bios and made sure my pcie 1 was first boot up

You should also try one of the other DisplayPorts on the back of the card. You should have three of them, along with an HDMI and a DVI-D connector. It is possible you have used a port that thinks it is a secondary port and not the master one.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Quote:
> 
> 
> 
> Originally Posted by *skupples*
> 
> lol, this is all I did with my card.
> 
> max volts, +226, +500, max fan (I don't really care about fan noise I have 40 gentle typhoons in my case) & I just observe the power target to see if she needs more or less.
> 
> worst case scenario so far, I have to go +226 +400.
> 
> 
> 
> you grab 2.126mhz too, my memory is micron i grab only 4374mhz. If you want - temp, try change thermal past to Coolaboratory Liquid Pro
> , the best paste of the market.

Increase your Vcore a bit, increase VCCIO just a bit, possibly VCCSA (I can't adjust the SA voltage on my Z68) and CPU PLL voltages a bit, and you might find that your memory overclocks a whole lot better. By playing with my voltages, I have gone from a +490 to a +650 memory overclock.

I have come to the conclusion that things such as vdroop are caused by the added load the new higher-powered GPUs are putting on the PCIe connections that run directly between the GPU and the CPU. When it gets to be too much, the driver crashes.


----------



## skupples

Quote:


> Originally Posted by *gtbtk*
> 
> Increase your Vcore a bit, increase VCCIO just a bit, possibly VCCSA ( I cant adjust the SA voltage on my Z68) and CPU PLL voltages a bit and you might find that your memory overclocks a whole lot better. By playing with my voltages, I have gone from a +490 to a +650 memory overclock.
> 
> I have come to the conclusion that things such as vdroop are caused by the added loads the new higher powered GPUs are putting on the PCIE connections that are directly connected between the GPU and the CPU. When it gets too much, the driver crashes.


I see... n all that would apply to x79 as well. I'll derp around.

my 4930k has been at stock for at least 6 months now, I need to get it back to 4.5+ giggles anyways


----------



## icold

Quote:


> Originally Posted by *gtbtk*
> 
> Increase your Vcore a bit, increase VCCIO just a bit, possibly VCCSA ( I cant adjust the SA voltage on my Z68) and CPU PLL voltages a bit and you might find that your memory overclocks a whole lot better. By playing with my voltages, I have gone from a +490 to a +650 memory overclock.
> 
> I have come to the conclusion that things such as vdroop are caused by the added loads the new higher powered GPUs are putting on the PCIE connections that are directly connected between the GPU and the CPU. When it gets too much, the driver crashes.


My H60 doesn't hold my CPU very well (maybe I need to delid; the paste under the IHS must be only a shell). I use an i7 3770 at almost 4.3GHz with 1.065 vcore, yes, a golden CPU. But my temps in AIDA extreme reach around 75°C at full load, so it's not a good idea to increase vcore. I can try with PLL and VCCSA. I put VCCSA from 1.050v to 1.055v and +10 on the PLL; my mobo has no VCCIO adjustment.


----------



## Nukemaster

75c under a heavy stress test does not seem bad(very few tasks will push it that hard). Very nice voltage.


----------



## icold

Quote:


> Originally Posted by *Nukemaster*
> 
> 75c under a heavy stress test does not seem bad(very few tasks will push it that hard). Very nice voltage.


The problem is: Locked CPU







. A 3770K like this could do 5GHz easily. I don't know if my H60 is broken. The pump looks like it works at 4700+ rpm and the fan at 1900+, but the temps are very ****ty.


----------



## dlewbell

Quote:


> Originally Posted by *icold*
> 
> My h60 dont hold very good my cpu(maybe i need delid, the ihs thermal paste must be only the shell), i use i7 [email protected] almost 4.3ghz with 1.065vcore, yes cpu golden. But my temps at aida extreme became around 75 °C FULLOAD, is not good idea increase vcore. I can try with pll and vccsa. I put Vcssa 1.050v to 1.055v and + 10v pll, my mobo have no VCCIO.


The issue is that you're using an H60. I had one a year ago, & it was quite disappointing. I replaced it with the PH-TC14PE. I never overclocked with the H60, but with an i5-6600K at stock, the PH-TC14PE reduced temperatures under load (Prime95) by 13°C & was much quieter to boot. After that experience I would never consider an H60 again. I sold mine for $30 on Craigslist & never looked back.


----------



## VladimirAG

Quote:


> Originally Posted by *gtbtk*
> 
> You are going to have to experiment to find out what you can do with yours.


Thank you for the help!

 

Obviously this is the maximum









_PS: Fire Strike stresstest._


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Increase your Vcore a bit, increase VCCIO just a bit, possibly VCCSA ( I cant adjust the SA voltage on my Z68) and CPU PLL voltages a bit and you might find that your memory overclocks a whole lot better. By playing with my voltages, I have gone from a +490 to a +650 memory overclock.
> 
> I have come to the conclusion that things such as vdroop are caused by the added loads the new higher powered GPUs are putting on the PCIE connections that are directly connected between the GPU and the CPU. When it gets too much, the driver crashes.
> 
> 
> 
> My h60 dont hold very good my cpu(maybe i need delid, the ihs thermal paste must be only the shell), i use i7 [email protected] almost 4.3ghz with 1.065vcore, yes cpu golden. But my temps at aida extreme became around 75 °C FULLOAD, is not good idea increase vcore. I can try with pll and vccsa. I put Vcssa 1.050v to 1.055v and + 10v pll, my mobo have no VCCIO.

It does become a problem if your cooling is marginal. I have my Z68 with an i7-2600 at 4443MHz, but if I want to run high 1070 overclocks, I have to set vcore to 1.34V, VCCIO to 1.1 and CPU PLL to 1.8185. I can now get 20700 in Firestrike routinely, where before I discovered this I was topping out at 20500 on a good run. Memory OC has also gone from +500 to +650.


----------



## gtbtk

Quote:


> Originally Posted by *VladimirAG*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You are going to have to experiment to find out what you can do with yours.
> 
> 
> 
> Thank you for help!
> 
> 
> 
> Obviously this is the maximum
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PS: Fire Strike stresstest.

In Afterburner, in the monitoring options tab, you can increase the graph maximum to a higher limit if you want. The image was hard for me to read because it has been size-reduced. Did I see that you were running at higher than 2500MHz? If so, that is very impressive. What Firestrike scores are you getting with that?


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I have tried just about all the different bioses on my MSI Gaming X. Both EVGA and Strix bioses work fine on the MSI card but the EVGA does hit its power limit very easily so the boost clock jumps around a lot. Make sure that you test your FTW in game using the 2nd bios as it is the more powerful one with a higher power limit.
> 
> The Asus bios maintains more stable clocks but, the Asus OC bios has a lower voltage limit than the 2nd bios on the FTW (200w vs 226W). Dont waste your time with the non OC bios.
> 
> To address stuttering, you could try is to disable the Shader Cache in Nvidia control panel. The Cache really only helps older lower powered cards. You may find that helps with stuttering.
> 
> 
> 
> Forget, we need Pascal bios tweaker

Did you try disabling the shader cache?


----------



## gtbtk

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Increase your Vcore a bit, increase VCCIO just a bit, possibly VCCSA ( I cant adjust the SA voltage on my Z68) and CPU PLL voltages a bit and you might find that your memory overclocks a whole lot better. By playing with my voltages, I have gone from a +490 to a +650 memory overclock.
> 
> I have come to the conclusion that things such as vdroop are caused by the added loads the new higher powered GPUs are putting on the PCIE connections that are directly connected between the GPU and the CPU. When it gets too much, the driver crashes.
> 
> 
> 
> I see... n all that would apply to x79 as well. I'll derp around.
> 
> my 4930k has been at stock for at least 6 months now, I need to get it back to 4.5+ giggles anyways

I have never tried it on an X79, but you still have PCIe lanes directly connected to the CPU, so it would seem logical that the theory should also hold true. It doesn't cost you anything to try it out. Just save an OC profile of your current settings so you can revert if needed.


----------



## icold

Quote:


> Originally Posted by *gtbtk*
> 
> it does become a problem if your cooling is marginal. I have my Z68 with i7-2600 at 4443Mhz but if I want to run hugh 1070 Overclocks, I have to set vcore to 1.34V, vccio to 1.1 and cpu pll to 1.8185. I can now get 20700 in firestrike routinely where before I discovered it, I was topping out at 20500 on a good run. Memory OC has also gone from +500 to +650 as well.


I do not understand the CPU vcore's correlation with the GPU. How did you get 4.4GHz on a non-K 2600? My BCLK only goes to 104.5; any more freezes my PC. I'm trying more VCCSA to push my GPU higher.


----------



## madweazl

I maxed out at 2152 core and 2430 mem (+252/843 IIRC, but may have been 842 on the mem) the other night benchmarking. For daily use I've left it at 200/800. Been very happy with the Founders Edition thus far









Time Spy validation link.


Spoiler: Warning: Spoiler!





https://flic.kr/p/RtdBkh


----------



## icold

Quote:


> Originally Posted by *madweazl*
> 
> I maxed out at 2152 core and 2430 mem (+252/843 IIRC but may have been 842 on the mem) the other night benchmarking. For daily use I've left it at 200/800. Been very happy with the Founder Edition thus far
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Time Spy validation link.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> __
> https://flic.kr/p/RtdBkh


Very nice clocks; the problem with reference GPUs is they run very hot.


----------



## madweazl

Quote:


> Originally Posted by *icold*
> 
> Very nice clock, the problem with reference GPUS is works very hot.


I'm on water


----------



## RyanRazer

Quote:


> Originally Posted by *gtbtk*


That is more likely close to 2200MHz; hard to tell. 2500 is the maximum on that graph, while he's just below the second-to-last line. There are 10 lines, each incrementing 250MHz, so just below the 9th line would make it just below 2250MHz.
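The gridline arithmetic, spelled out:

```python
# Reading the Afterburner graph: 10 gridlines up to a 2500 MHz maximum
# means each line is worth 250 MHz, and the 9th line sits at 2250 MHz.
grid_max, lines = 2500, 10
per_line = grid_max // lines   # 250
ninth_line = 9 * per_line      # 2250
print(per_line, ninth_line)    # -> 250 2250
```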


----------



## VladimirAG

Quote:


> Originally Posted by *gtbtk*
> What firestrike scores are you getting with that?





Spoiler: Warning: Spoiler!






Quote:


> Originally Posted by *gtbtk*
> did I see that you were running at higher than 2500Mhz


No







Look at the white figures.

The tests fail if I increase core or memory.


----------



## RyanRazer

Quote:


> Originally Posted by *VladimirAG*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> No
> 
> 
> 
> 
> 
> 
> 
> look at the white figures.
> 
> The tests fails if I increase core or memory.


Dark blue







great OC man


----------



## icold

Increasing VCCSA and PLL here doesn't change anything about my GPU clocks


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> it does become a problem if your cooling is marginal. I have my Z68 with i7-2600 at 4443Mhz but if I want to run hugh 1070 Overclocks, I have to set vcore to 1.34V, vccio to 1.1 and cpu pll to 1.8185. I can now get 20700 in firestrike routinely where before I discovered it, I was topping out at 20500 on a good run. Memory OC has also gone from +500 to +650 as well.
> 
> 
> 
> I do not understand vcores cpu correlaction with GPU. How you put 4.4ghz on 2600 cpu non k? My BCLK go only 104.5, more freeze my PC. I trying put more VCCSA to up more my GPU

The GPU is using the PCIe lanes direct to the CPU. The conclusion I have come to is that the higher load the 1070/1080-level cards can now put on the signalling of those lanes is overwhelming the CPU's ability to stay stable without crashing the driver. I think the extra vcore is helping stabilize the CPU-connected PCIe signals.

I can set the 4 extra multiplier bins to 42 and BCLK to 105.7MHz, with vcore at 1.34v (I was running vcore at 1.3 before I started playing with this; I have a Sandy Bridge, which is a 95W chip, and your Ivy would be lower as they have lower TDP).

I have HyperX Fury DDR3 2x8GB 1600MHz CL10 overclocked to 1972MHz with timings of 11-12-11-33.

I have VCCIO set to 1.1; the Asus Z68 ties SA voltage to VCCIO and I cannot change that.

I set CPU PLL to 1.8125v; if I go much higher than that, the USB keyboard and mouse start getting laggy.

I also had the VRM frequency set to a fixed 350MHz with extreme phase control, but I discovered that changing the VRM frequency to auto with VRM spread spectrum also gave me better stability. Phase control set to optimized.

I had to experiment with the balance between VCCIO and CPU PLL.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Increase VCCSA and PLL here dont change anything about GPU clocks


Combined with extra vcore it provides stability at higher overclocks. It doesn't make the GPU automatically faster.


----------



## iluvkfc

Anyone know where are the shunt resistors on 1070 G1 Gaming? Want to remove power limit once and for all and I just got myself some CLU.


----------



## icold

Quote:


> Originally Posted by *gtbtk*
> 
> combined with extra vcore it provides stability at higher overclocks . it doesn't make the gpu automatically faster


I tried; it doesn't work. It's not possible to run more than 4374MHz stable.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> combined with extra vcore it provides stability at higher overclocks . it doesn't make the gpu automatically faster
> 
> 
> 
> i tried, dont work, is no possible use more than 4374 stable

Well, you didn't lose anything.

Maybe what I am seeing is a PCIe 2.0 thing?


----------



## icold

Quote:


> Originally Posted by *gtbtk*
> 
> Well you didn't lose anything.
> 
> Maybe What I am seeing is a pcie 2.0 thing?


Mine is PCIe 3.0.


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> The GPU is using the PCIe Lanes direct to the CPU. The conclusion I have come to is that the higher loads on the pcie lanes that the 1070/1080 level cards can now put on the signalling of those lanes is overwelming the cpus ability to stay stable and not crash the driver. I think the extra vcore is helping stablize the CPU connected PCIe signals
> 
> I can set the 4 extra multiplier bins to 42. BCLK to 105.7mhz, (I was running vcore at 1.3 before I started playing with this) vcore to 1.34v (I have a sandy bridge which is a 95W chip, your Ivy would be lower as they have lower TDP).
> I have HyperX Fury DDR3 2x8Gb 1600Mhz CL10 overclocked to 1972Mhz with timings of 11-12-11-33
> I have VCCIO set to 1.1 - Asus Z68 ties SA voltage to VCCIO and I cannot change that.
> I set CPU PLL set to 1.8125v - If I go much higher than that. USB Keyboard and mouse start getting laggy
> 
> I also had the VRM frequency set to a fixed 350Mhz with extreme phase control but I discovered that changing the VRM frequency to auto with VRM spread spectrum also gave me better stability. Phase control set to optimized.
> 
> I had to experiment with the balance between VCCIO and CPU PLL


As far as I'm aware, VCCSA/VCCIO voltage is for better RAM overclocking, isn't it?
The CPU's IMC (integrated memory controller) also plays a role in how far the RAM will OC.
CPU PLL is more like overclocking the USB voltage, etc. Vcore is mainly for stabilizing the CPU overclock.
I would think GPU overclocking has no correlation with those motherboard BIOS settings, does it? Still, if there is some, I'd want to look into it.

I think you still have room up to about 2133MHz RAM speed - the maximum you can get with a Sandy Bridge CPU + Z68 + BCLK OC. The Sandy Bridge IMC does not support the 2400MHz divider; from Ivy Bridge onward Intel started improving the IMC.

My Ivy is at 4.7GHz @1.3xxV, with RAM @2400MHz CL11.
My GTX 1070 still scores around the same as yours, 20,6XX-20,9XX; 21K is not stable.
I guess it's the 1070 silicon lottery again...

maybe this will help gain some points on FS graphics:
Quote:


> Originally Posted by *zipper17*
> 
> Also another little method how to get a few hundred points on Firestrike CPU/GPU scores:
> - make your GPU running on prefer max performances
> - go to power settings, change your preferred plan to High Performances. (Disable PCIE & CPU idle state)
> - close any monitoring software while benching, Hwmonitor, msi afterburner,gpuz, etc.
> - lower your temperature ambient and go for 100% fan speed to maximum your cooling potential.
> 
> In my experience i got a few hundred points on GPU scores from around 20.800 into 20.900ish, and a few hundred gain points on CPU scores also.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Well you didn't lose anything.
> 
> Maybe What I am seeing is a pcie 2.0 thing?
> 
> 
> 
> is PCIE 3.0.

Yes, but mine isn't - Sandy Bridge only supports PCIe 2.0.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The GPU is using the PCIe Lanes direct to the CPU. The conclusion I have come to is that the higher loads on the pcie lanes that the 1070/1080 level cards can now put on the signalling of those lanes is overwelming the cpus ability to stay stable and not crash the driver. I think the extra vcore is helping stablize the CPU connected PCIe signals
> 
> I can set the 4 extra multiplier bins to 42. BCLK to 105.7mhz, (I was running vcore at 1.3 before I started playing with this) vcore to 1.34v (I have a sandy bridge which is a 95W chip, your Ivy would be lower as they have lower TDP).
> I have HyperX Fury DDR3 2x8Gb 1600Mhz CL10 overclocked to 1972Mhz with timings of 11-12-11-33
> I have VCCIO set to 1.1 - Asus Z68 ties SA voltage to VCCIO and I cannot change that.
> I set CPU PLL set to 1.8125v - If I go much higher than that. USB Keyboard and mouse start getting laggy
> 
> I also had the VRM frequency set to a fixed 350Mhz with extreme phase control but I discovered that changing the VRM frequency to auto with VRM spread spectrum also gave me better stability. Phase control set to optimized.
> 
> I had to experiment with the balance between VCCIO and CPU PLL
> 
> 
> 
> as far as I concern VCSSA/VCCIO voltage is for better Memory RAM overclocking isn't it?.
> CPU's IMC (integrated Memory Controller) also play role for better Memory RAM OC.
> CPU PLL is like overclocking USB voltage, etc. Cpu vcores majority for stabilize cpu overclocking.
> I think GPu's overclocking won't have any correlation if you modify motherboard bios isn't it?, well still if there is some i would still want to look into though.
> 
> I think you still have room for about +2133mhz memory ram speed for maximum you can get with Sandybridge cpu+z68+bclk oc, sandy bridge IMC does not support 2400MHz divider, ivybridge and so on Intel start start to improve the IMC.
> 
> my ivy 4.7ghz @1.3xxV, @2400mhz CL11
> GTX 1070 still scores around as yours 20,6XX-20,9XX, 21K is not stable.
> I quess it's 1070 silicon lottery again ...
> 
> maybe this will help gain some points on FS graphics:
> Quote:
> 
> 
> 
> Originally Posted by *zipper17*
> 
> Also another little method how to get a few hundred points on Firestrike CPU/GPU scores:
> - make your GPU running on prefer max performances
> - go to power settings, change your preferred plan to High Performances. (Disable PCIE & CPU idle state)
> - close any monitoring software while benching, Hwmonitor, msi afterburner,gpuz, etc.
> - lower your temperature ambient and go for 100% fan speed to maximum your cooling potential.
> 
> In my experience i got a few hundred points on GPU scores from around 20.800 into 20.900ish, and a few hundred gain points on CPU scores also.
> 

All those things you mentioned are exactly what the overclocking tutorials tell you. VCCIO/SA voltages do help with memory overclocks and the on-CPU memory controller. However, I am seeing increases and decreases in GPU OC stability and benchmark performance when I change those voltages. That would tend to suggest there is more going on than the common wisdom would have us believe.

Following a tutorial will only get you maybe 80-90% of the way to the absolute maximum performance from overclocking our hardware.

What we are discussing here is just as much art as it is science, mostly because we don't have a full understanding of what is going on when we make those adjustments. Intel is certainly not here telling us all their dirty little secrets.

In general terms what you said above is correct. However, CPUs have to interact with memory, their own x16 PCIe lanes to the GPU, the DMI channel, SATA, USB controllers, etc. They don't operate in isolation; they signal each other with electrical waveforms. High performance comes from tuning each of those waveforms exactly with the others. The further they drift from being perfectly in tune, the slower the communication between the parts gets, until the waveform variation is so far out of whack that communication breaks down and the computer crashes. Those waveforms are tuned by changing different combinations of the voltage levels that can be set in the UEFI.

Rather than argue with me over what the tutorials tell you, why don't you try experimenting with your own hardware and come back and tell us what you found, so we can compare notes?


----------



## msigtx760tf4

my new, old i7 3770k

http://www.3dmark.com/3dm/17680083


----------



## icold

Quote:


> Originally Posted by *msigtx760tf4*
> 
> my new, old i7 3770k
> 
> http://www.3dmark.com/3dm/17680083


Very nice clock. I thought of switching to a 3770K, but it's not worth it just to go from 4.28GHz to 4.5GHz, and this CPU is very expensive here.


----------



## g-lad21

Decided today to get a new EVGA 1070! I went with this card because I really like its look and I love how it fits my build - clean and minimalistic.

photo: https://goo.gl/photos/mrknECWRBn1TRbAp7

Unfortunately my coil whine bad luck stayed with me, but in a better way:

no coil whine while v-sync is on, so I guess I'm happy! My rig is super quiet!
Rocking a 6700k and 16GB RAM. I got this new baby for playing Dota 2 lol, but for 4K resolution I guess it's a legit purchase.
Happy to join the club!


----------



## icold

Quote:


> Originally Posted by *g-lad21*
> 
> Decided today to get a new evga 1070! decided to go with this card because i really like its look and i really love how it fits my build, clean and minimalistic.
> 
> photo: https://goo.gl/photos/mrknECWRBn1TRbAp7
> 
> unfortunately my coil whine bad luck stayed with me, but in a better way
> 
> no coil whine while v-sync is on, so i guess i'm happy! my rig is super quiet!
> rocking a 6700k and 16gb ram i got this new baby for playing dota 2 lol, but for 4k resolution i guess its a legit purchase,
> happy to join the club!


You need to update the BIOS and install thermal pads. The EVGA card runs its VRM very hot.


----------



## g-lad21

Quote:


> Originally Posted by *icold*
> 
> You need update bios and put thermal pads. EVGA card runs very hot vrm


Thanks for the tips, just updated the BIOS.

Do you know if the pads lower the coil whine? So far temps are fine.

Thanks


----------



## icold

Quote:


> Originally Posted by *g-lad21*
> 
> Thanks for the tips, just updated the bios.
> 
> do you know if the pads lower the coil whine? so far temps are fine..
> 
> thanks


Check VRM temps - some EVGA cards literally exploded. It's a design defect.


----------



## madweazl

Quote:


> Originally Posted by *msigtx760tf4*
> 
> my new, old i7 3770k
> 
> http://www.3dmark.com/3dm/17680083


10k on the memory, nice!

I spent all day working on my score too. Made a move toward the top and ended up 8th overall, but with only 3 users above me (6600k with 1070).

16743


----------



## gtbtk

Quote:


> Originally Posted by *g-lad21*
> 
> Decided today to get a new evga 1070! decided to go with this card because i really like its look and i really love how it fits my build, clean and minimalistic.
> 
> photo: https://goo.gl/photos/mrknECWRBn1TRbAp7
> 
> unfortunately my coil whine bad luck stayed with me, but in a better way
> 
> no coil whine while v-sync is on, so i guess i'm happy! my rig is super quiet!
> rocking a 6700k and 16gb ram i got this new baby for playing dota 2 lol, but for 4k resolution i guess its a legit purchase,
> happy to join the club!


Fast sync is a much better option than v-sync, with much lower latency. It lets the card render at whatever speed it can while the frame buffer manages sending frames out to your monitor, matching the monitor's refresh rate.

You need to set that in the Nvidia Control panel.
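As a rough mental model of how fast sync differs from v-sync: the GPU renders unthrottled into spare buffers, and at each refresh the display scans out the newest completed frame, discarding the rest. A toy sketch of that selection logic (this is only an illustration, not NVIDIA's actual implementation):

```python
# Toy model of fast-sync frame selection: at each monitor refresh, present
# the newest frame that finished since the last refresh; older completed
# frames are simply dropped instead of queued (which is where v-sync's
# latency comes from).

def fast_sync_present(completed_frames):
    """completed_frames: frame ids finished since the last refresh, in order."""
    if not completed_frames:
        return None              # nothing new: the previous frame is repeated
    return completed_frames[-1]  # newest frame wins; the rest are discarded

print(fast_sync_present([101, 102, 103]))  # -> 103 (frames 101/102 dropped)
```

With v-sync, by contrast, every rendered frame waits its turn in the queue, so input latency grows with the queue depth.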


----------



## g-lad21

Quote:


> Originally Posted by *gtbtk*
> 
> Fast sync is a much better option than v-sync with much lower latency. It will allow the card to render at what ever speed that it can go and the frame buffer manages sending the frames out to your monitor while matching the Monitor refresh rate.
> 
> You need to set that in the Nvidia Control panel.


Thanks for the tip. Does this force fast sync even if v-sync is on? I set fast sync in the global settings.
Also, I have a FreeSync monitor - I know it's AMD only, but I saw there are options for adaptive in the Nvidia control panel. Did anyone test this? Thanks.

EDIT: tried fast sync, and it causes a coil whine that I hate dearly, so I switched back to normal v-sync. I haven't noticed the difference, plus I'm not a hardcore gamer


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Quote:
> 
> 
> 
> Originally Posted by *g-lad21*
> 
> Thanks for the tips, just updated the bios.
> 
> do you know if the pads lower the coil whine? so far temps are fine..
> 
> thanks
> 
> 
> 
> Check vrm temps, some EVGA cards Literally exploded, its project defect.

The explosions were coming from faulty capacitors installed in an earlier production batch. The faulty capacitor issue has been resolved; unless the EVGA card has been sitting in a warehouse somewhere for the last 6 months, you should be fine. There are also the thermal pads, which the card may already have installed, and a BIOS update to increase the fan curve. (If the default vBIOS is version 86.04.50.00.XX it already has the latest BIOS, and the thermal pads were installed at the factory.)


----------



## icold

Quote:


> Originally Posted by *gtbtk*
> 
> The explosions were coming from faulty capacitors that were installed in a earlier production batch. The faulty capacitor issue has been resolved. Unless the EVGA card has been sitting in a warehouse somewhere for the last 6 months you should be fine. There is also the thermal pads that the card may have already installed and bios update to increase the fan curve. (if the default Vbios is version 86.04.50.00.XX it should already have the latest bios and thermal pads installed in the factory)


Man, is there any way to push BCLK higher? Mine only goes to 104.5.


----------



## khanmein

Quote:


> Originally Posted by *icold*
> 
> Check vrm temps, some EVGA cards Literally exploded, its project defect.


Seriously, no issue here - my card is a Dec '16 batch that came straight away with the pre-installed thermal pads + updated vBIOS.

No coil whine, and whisper quiet.


----------



## gtbtk

Quote:


> Originally Posted by *g-lad21*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Fast sync is a much better option than v-sync with much lower latency. It will allow the card to render at what ever speed that it can go and the frame buffer manages sending the frames out to your monitor while matching the Monitor refresh rate.
> 
> You need to set that in the Nvidia Control panel.
> 
> 
> 
> thanks for the tip, does this force fast sync even if v-sync is on? i set fast sync on global settings.
> also, i have a freesync monitor, i know its for AMD only, but i saw there are options for adaptive in the nvidia control panel, did anyone test this? thanks.
> 
> EDIT: tried fast sync, it causes a coil whine that i hate dearly, so i switched back to normal v-sync, i havnent noticed the change, plus im not a hardcore gamer

Shame about the coil whine. It was worth trying out though. FreeSync is not going to help you without an AMD card. I have never bothered trying adaptive, as I don't use v-sync anyway. I assume it either turns v-sync off if the framerate drops below 60fps, or swaps to triple-buffered v-sync, which I think works at 30fps. Best thing to do if you are curious is try it out and see what happens; you can't break anything.

The problem with v-sync is the high latency - move the mouse and there's a delay before the cursor moves. If you are not running around shooting people, whipping the mouse around, you probably won't notice it too much.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The explosions were coming from faulty capacitors that were installed in a earlier production batch. The faulty capacitor issue has been resolved. Unless the EVGA card has been sitting in a warehouse somewhere for the last 6 months you should be fine. There is also the thermal pads that the card may have already installed and bios update to increase the fan curve. (if the default Vbios is version 86.04.50.00.XX it should already have the latest bios and thermal pads installed in the factory)
> 
> 
> 
> Man, Is there any way to up more bclk? My going only 104.5.

OC potential is chip dependent; it is possible that is all your chip can cope with at sensible voltages. I hit a wall at 105.8MHz, and I am not prepared to push voltages really high to keep trying for more, because I am only using air cooling.

If you want to try, I suggest you use fixed voltages and try increasing vcore up from the defaults. VRM spread spectrum with optimized phase control can help with stability. Adjust the load-line calibration so that vcore remains stable under load. HWiNFO64 is a great utility to keep track of everything.

0x124 BSOD crashes indicate vcore is too low.

Remember that BCLK will automatically overclock your memory, so while you are getting things stable, try dropping your RAM speed back to 1333MHz until you find stable settings for the CPU. After that, start bringing your RAM speed back up and adjust voltage and timings to keep things stable with the higher BCLK setting.
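To see why BCLK drags everything with it, here is a quick sketch of the derived clocks. The 42x multiplier, 105.7 BCLK and the 18.66x DDR3 strap are taken from the settings reported earlier in the thread; the helper itself is just illustrative:

```python
# Sketch: on Sandy Bridge, the CPU core clock and the DDR3 data rate are both
# multiples of BCLK, so raising BCLK overclocks the RAM along with the CPU.

def derived_clocks(bclk_mhz: float, cpu_mult: int, dram_ratio: float) -> dict:
    """Return the clocks derived from a given base clock (BCLK)."""
    return {
        "cpu_mhz": round(bclk_mhz * cpu_mult, 1),     # core frequency
        "dram_mhz": round(bclk_mhz * dram_ratio, 1),  # DDR3 data rate
    }

# 42x at 105.7 BCLK with the 18.66x (nominal DDR3-1866) memory strap gives
# roughly a 4439MHz core and 1972MHz DDR3 - the figures quoted above:
print(derived_clocks(105.7, 42, 18.66))
```

This is also why the post suggests dropping to the 1333 strap first: at 105.7 BCLK the 13.33x strap lands around 1409MHz, well within spec for 1600MHz-rated sticks.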


----------



## gtbtk

Quote:


> Originally Posted by *madweazl*
> 
> Quote:
> 
> 
> 
> Originally Posted by *msigtx760tf4*
> 
> my new, old i7 3770k
> 
> http://www.3dmark.com/3dm/17680083
> 
> 
> 
> 10k on the memory, nice!
> 
> I spent all day working on my score too. Made a move toward the top but ended up 8th over all but only 3 users above me (6600k with 1070).
> 
> 16743

You have some great-overclocking VRAM there. With your overclock you are matching a 1080 in VRAM bandwidth. I'm envious.

As a matter of interest, I see you are running your 3770K at 4.8GHz. What voltages (vcore, VCCIO, VCCSA, CPU PLL, etc.), load-line calibration and VRM phase control settings are you running on your motherboard?


----------



## Curseair

Quote:


> Originally Posted by *msigtx760tf4*
> 
> my new, old i7 3770k
> 
> http://www.3dmark.com/3dm/17680083


I guess that's Samsung memory? I recently sent back a Gigabyte Xtreme 1070 - it could not even keep its own advertised clock speeds.


----------



## skupples

Quote:


> Originally Posted by *gtbtk*
> 
> I have never tried it on an x79 but you still have PCIe lanes directly connected to the CPU so it would seem logical that the theory should also hold true. It doesnt cost you anything to try it out. Just save an OC profile of your current settings so you can revert if needed


i actually spent some time watching the numbers last night.

I definitely see more benefit from OC'ing memory than OC'ing core. More so than any other GPU I can think of.

Oh, I'm also running NV Surround, which is likely a contributing factor.

Core pegs out @ just under 2giggles, on default, max temp is ~70 (even though it was 55f in my room last night)

basically, just another power starved chip


----------



## Curseair

Which 1070 should I go for out of these 3 guys?

Zotac Amp Extreme

Palit Super Jetstream

Asus Strix OC

Looking for one with the best Overclocking potential and build quality.


----------



## icold

Quote:


> Originally Posted by *Curseair*
> 
> Which 1070 should I go for out of these 3 guys?
> 
> Zotac Amp Extreme
> 
> Palit Super Jetstream
> 
> Asus Strix OC
> 
> Looking for one with the best Overclocking potential and build quality.


I vote to amp extreme


----------



## JoeUbi

My AMP Extreme gets 2100MHz on the core and 9600+MHz with Samsung memory. However, I just leave mine at stock; even at stock I've seen the core boost itself all the way up to 2025MHz.


----------



## icold

I think OC is a lottery; 2 power connectors don't help you OC more.


----------



## skupples

Is it just me or.....

I'm seeing almost no point in buying the triple-deckers for this series due to the restrictions. Everything seems to OC damn near the same. I know, I know, it becomes more and more like this each cycle; it's just that the 1070 is so damn set-it-and-forget-it. Every post I read says basically the same thing. The majority of cards, from the majority of manufacturers, are within a 50MHz core range. Waterblocks still make sense due to noise.

Hmm, I wanna go see what this chip is doing when it gets zombified.


----------



## JoeUbi

I'm not exactly sure why they even included two power connectors. The most watts I've seen my card pull according to HWInfo64 is 230W. I wish I could give it more juice....


----------



## skupples

Quote:


> Originally Posted by *JoeUbi*
> 
> I'm not exactly sure why they even included two power connectors. The most watts I've seen my card pull according to HWInfo64 is 230W. I wish I could give it more juice....


i gotta check that 1080 bios thread, see how the progress is going. Someone found the good old 1.212 loophole.


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> All those things you mentioned are exactly what the Overclocking tutorials tell you it is. VCCIO/SA voltages do help with Memory Overclocks and the on cpu memory controller. However, I am seeing increases and decreases in GPU OC stability and benchmark performance when I make changes in those voltages. That would tend to suggest that there is more going on than what the common wisdom would have us believe.
> 
> Following a tutorial will only get you maybe 80-90% of the way to getting the absolute maximum performance from overclocking our hardware.
> 
> What we are discussing here is just as much Art as it is science, mostly because we dont have a full understanding of what is going on when me make those adjustments. Intel is certainly not here telling us all their dirty little sectrets.
> 
> In general terms what you said above is correct, however, CPUs have to interact with memory, their own x16 PCIe lanes to GPUs and DMI channels, Sata, USB controllers etc. They don't all operate in isolation, they signal each other with electrical wave forms. High performance comes from tuning each of those waveforms exactly with each other. The more out of being perfectly in tune, the communication between the parts gets slower and slower until the Waveform variation is so far out of whack that the communications break down and the computer crashes. Those waveforms are tuned by changing different combinations of voltage levels that can be set in the UEFI.
> 
> Rather than argue with me over what the tutorials tell you, why don't you try experimenting with your own hardware and come back and tell us what you found so we can compare notes?


I did this overclock to 4.7GHz several months ago, and I still have the notes:

Test: 3570K @4.7GHz, LLC at 50%, not delidded
Prime95 custom test, min/max 864K FFT, run FFT in place, 15 minutes each, running for 105+ minutes: no errors/crashes/BSODs/WHEA errors.
CPU-Z reports 4.7GHz at full load during Prime95 @1.308V (in games it can be a little higher, ~1.320V)
VCCSA/VCCIO at default (0.930V, 1.068V)
Internal PLL overvoltage disabled
2x8GB RAM @2400MHz CL 11-13-13-35 2T @1.65V
Secondary timings: 16-255-10-7-10-31-12 (tWR-tRFC-tWTR-tRRD-tRTP-tFAW-CWL)
Spread spectrum off, at a constant 100MHz.

I would probably need watercooling, and to delid my processor, if I want to break past the 4.7-5GHz range.

GPU (MSI AB):
Voltage 100%
Power limit 125% (250W max)
Temp limit 92C
Core clock +75 (hovering around 2076, 2063, 2050, 2038MHz) (not using the curve this time)
Mem clock +600 (2302*4 = effective 9208MHz)
Fan speed user-defined (100% fan speed automatically at 50C)
Hottest temp while playing Witcher 3 at max settings @1440p = 65-70C (at 1080p temps would be lower)

The Firestrike Extreme stress test passed at 99% with these settings, and there are no crashes/artifacts whatsoever while playing games.
http://www.3dmark.com/fsst/254527

Latest driver WHQL 378.49
20,9XX graphics score
http://www.3dmark.com/fs/11523829

I believe the latest driver gives an improvement of over +100 points on the graphics score, imho.

I will probably start over and OC again to try to break 21K; I may have already hit the limit of my 1070's silicon lottery chip several times.
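For anyone checking the memory arithmetic in posts like the one above: GPU-Z reports a stock 1070 memory clock of 2002MHz (8008MHz effective), and Afterburner's offset applies to the double-data-rate (x2) figure, so +600 lands at 2302MHz actual. A small sketch with those assumptions baked in:

```python
# Sketch: how an Afterburner memory offset maps onto GDDR5 clocks for a
# GTX 1070. Assumes the stock actual clock is 2002 MHz (8008 MHz effective)
# and that the Afterburner offset applies to the DDR (x2) clock.

STOCK_ACTUAL_MHZ = 2002  # GPU-Z "memory clock" for a stock GTX 1070

def memory_clocks(ab_offset_mhz: int) -> dict:
    """Return actual, DDR, and effective (quad-pumped) clocks for an offset."""
    actual = STOCK_ACTUAL_MHZ + ab_offset_mhz / 2  # offset is on the x2 clock
    return {
        "actual_mhz": actual,
        "ddr_mhz": actual * 2,
        "effective_mhz": actual * 4,  # GDDR5 transfers 4x per actual clock
    }

print(memory_clocks(600))  # +600 -> 2302.0 actual, 9208.0 effective
```

Note that 2302 x 4 is 9208, not 9200 - a small rounding slip that shows up in a few posts in this thread.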


----------



## zipper17

Quote:


> Originally Posted by *JoeUbi*
> 
> I'm not exactly sure why they even included two power connectors. The most watts I've seen my card pull according to HWInfo64 is 230W. I wish I could give it more juice....


Same here - my card also never seems to hit the max power limit while playing games (6-pin+8-pin, 125% = 250W max).

The highest was around 200-214W, iirc.


----------



## colabang

Hi guys, is it possible now to edit the BIOS of the GTX 1070 to get more stable boost frequencies?


----------



## zipper17

Quote:


> Originally Posted by *colabang*
> 
> hi guys is it possible now to edit the bios of the gtx 1070 to get more stable boost frequencies?


Currently there is no BIOS editor for Pascal yet.

The best options for now are an extreme BIOS or extreme circuit-board modification.

https://xdevs.com/guide/pascal_oc/


----------



## gtbtk

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I have never tried it on an x79 but you still have PCIe lanes directly connected to the CPU so it would seem logical that the theory should also hold true. It doesnt cost you anything to try it out. Just save an OC profile of your current settings so you can revert if needed
> 
> 
> 
> i actually spent some time watching the numbers last night.
> 
> I definitely see more benefit from OC'ing memory than OC'ing core. More so than any other GPU I can think of.
> 
> Oh, I'm also running NV Surround, which is likely a contributing factor.
> 
> Core pegs out @ just under 2giggles, on default, max temp is ~70 (even though it was 55f in my room last night)
> 
> basically, just another power starved chip

I have concluded that the 1070 is bottlenecked by its available memory bandwidth. The card loves memory clocks. Not surprising, given that the basically identical chip in the 1080 benefits from memory OC up to about 11Gbps.

Don't use the slider for the core clock: you end up overclocking many voltage points that you will not be using under load, and it takes "resources", for want of a better word, away from the part of the GPU you want to be using.

Use the curve in Afterburner and only pull up the point at 1.093V if you have the voltage set to +100. 2100MHz should be dead easy.

Set a fan curve. If noise is not an issue, you should be able to keep the chip at 60C or less with the fan at 100%. The chip will reduce clock speeds as temps increase.
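As a mental model of the curve trick (this is only a toy - Afterburner exposes no such API, and the stock frequency points below are made-up examples): you raise just the voltage point you intend to run under load, and everything at or above it is flattened to that frequency, instead of shifting the whole curve up with the slider.

```python
# Toy model of editing the Afterburner voltage/frequency curve: pin one
# voltage point (e.g. 1.093 V) to the target frequency and flatten everything
# above it, leaving the lower points untouched.

def pin_curve(curve, pin_voltage, target_mhz):
    """curve: list of (voltage, mhz) pairs sorted by ascending voltage."""
    out = []
    for v, mhz in curve:
        if v < pin_voltage:
            out.append((v, mhz))          # lower points left alone
        else:
            out.append((v, target_mhz))   # pinned point and above flattened
    return out

# Hypothetical stock points - real values vary per card:
stock = [(1.000, 1911), (1.043, 1949), (1.062, 1962), (1.093, 1987)]
print(pin_curve(stock, 1.093, 2100))
```

The contrast with the slider: an offset shifts *every* point up by the same amount, so you are also overclocking voltage states the card never uses under load.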


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *colabang*
> 
> hi guys is it possible now to edit the bios of the gtx 1070 to get more stable boost frequencies?
> 
> 
> 
> currently there is no existence of bios editor yet for pascal.
> 
> best possible for now is use; extreme bios or extreme circuit board modification.
> 
> https://xdevs.com/guide/pascal_oc/

Where is a 1070 extreme OC BIOS available? I have seen an Asus one for a 1080 that allows 1.2V, but not one for a 1070.

The King Pin mod is designed to trick the standard BIOS into thinking the card is pulling less power than it really is, so the downclocking features do not kick in.
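The trick works because the power controller never measures resistance directly - it measures the voltage drop across a shunt it assumes has a known value, and computes current from that. A back-of-the-envelope sketch (the resistor values are illustrative, not the 1070's actual shunt spec):

```python
# Sketch of why a shunt mod fools the power limit: the controller infers
# current as I = V_drop / R_assumed. Paralleling another resistor (or
# bridging the shunt with liquid metal) lowers the true resistance, so the
# same real current produces a smaller drop and reads lower.

def reported_current(real_amps, true_shunt_ohms, assumed_shunt_ohms):
    v_drop = real_amps * true_shunt_ohms   # what the sense line actually sees
    return v_drop / assumed_shunt_ohms     # what the controller computes

# A 5 mOhm shunt paralleled with another 5 mOhm gives 2.5 mOhm effective,
# so 30 A of real draw reads as roughly half:
print(reported_current(30, 0.0025, 0.005))
```

Since reported power scales the same way as reported current, halving the effective shunt resistance roughly halves the power the BIOS thinks the card is drawing.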


----------



## skupples

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *gtbtk*
> 
> I have concluded that the 1070 bottlenecked by the available memory bandwidth. The card loves memory clocks. Not surprising given the basically same chip in the 1080 benefits from memory oc up to about 11Gbs .
> 
> Don't use the slider for core clock, you are overclocking many voltage points that you will not be using under load and it takes "resources" for the want of a better word, away from the the bit of the GPU you want to be using.
> 
> Use the curve in afterburner and only pull up the point at 1.093 if you have the voltage set to +100. 2100Mhz should be dead easy.
> 
> Set a fan curve. If noise is not an issue, you should be able to keep the chip at 60 deg or less with the fan at 100%. The chip will reduce clock speeds as temps increase






thx... My card is inside a mostly sealed CaseLabs STH10 - inner airflow wasn't really a thought in the original 100% all-water-everything build, LOL! I also live in Florida, and my office is an add-on room with no ducted AC and only a layer of cinder block between me and the outside world. Match that with crap single-sheet aluminum windows from the '80s and you have what's essentially an unairconditioned room, so my temps are always high. I'm also loading the card at 5760x1080, which seems to take just about all she's got.

I'll definitely play with that voltage tip.


----------



## Nebulous

Quote:


> Originally Posted by *gtbtk*
> 
> I have concluded that the 1070 bottlenecked by the available memory bandwidth. The card loves memory clocks. Not surprising given the basically same chip in the 1080 benefits from memory oc up to about 11Gbs .
> 
> Don't use the slider for core clock, you are overclocking many voltage points that you will not be using under load and it takes "resources" for the want of a better word, away from the the bit of the GPU you want to be using.
> 
> Use the curve in afterburner and only pull up the point at 1.093 if you have the voltage set to +100. 2100Mhz should be dead easy.
> 
> Set a fan curve. If noise is not an issue, you should be able to keep the chip at 60 deg or less with the fan at 100%. The chip will reduce clock speeds as temps increase


I was having issues keeping 2100MHz with my 1070, and this tiny little tidbit saved me more headaches. I did exactly this, without so much as touching the core speed slider, and bam - 2100.









Rep added!


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> Where is there a 1070 extreme OC bios available? I have seen an Asus one for a 1080 that will allow 1.2v but not a 1070.
> 
> The king pin mod is designed to trick the standard bios into thinking that it is pulling less power than it really is so the downclock features do not kick in


Correction - I meant cross-flashing to another BIOS (e.g., an Asus BIOS onto an MSI card).

Currently only the 1080 has an XOC BIOS.


----------



## icold

Why don't we have a Pascal BIOS tweaker yet?


----------



## gtbtk

Quote:


> Originally Posted by *skupples*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I have concluded that the 1070 is bottlenecked by the available memory bandwidth. The card loves memory clocks. Not surprising, given that the basically identical chip in the 1080 benefits from memory OC up to about 11Gbps.
> 
> Don't use the slider for core clock; you are overclocking many voltage points that you will not be using under load, and it takes resources, for want of a better word, away from the part of the GPU you want to be using.
> 
> Use the curve in Afterburner and only pull up the point at 1.093V if you have the voltage set to +100. 2100MHz should be dead easy.
> 
> Set a fan curve. If noise is not an issue, you should be able to keep the chip at 60°C or less with the fan at 100%. The chip will reduce clock speeds as temps increase.
> 
> 
> 
> 
> 
> 
> 
> thx... My card is inside a mostly sealed caselabs STH10. Inner airflow wasn't really a thought in the original 100% all water everything build LOL! I also live in florida, & my office is the addon room with no routed AC & only a layer of cinder between me & the outerworld. Match that with crap aluminum single sheet windows from the 80s & you have what's essentially an unairconditioned room, so my temps are always high. i'm also loading the card in 5760x1080P, which seems to take just about all she's got.
> 
> I'll definitely play with that voltage tip.
Click to expand...

If you leave the voltage slider at stock, you should adjust the 1.063V point instead of the 1.093V point. Given your cooling challenges, running the card at the lower voltage will allow it to run slightly cooler.

You can try 2100MHz, but it's 50/50 whether it will run stable at that frequency/voltage combination.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Where is there a 1070 extreme OC bios available? I have seen an Asus one for a 1080 that will allow 1.2v but not a 1070.
> 
> The king pin mod is designed to trick the standard bios into thinking that it is pulling less power than it really is so the downclock features do not kick in
> 
> 
> 
> corrected, I mean cross flash to other Bios. (Ex: like asus bios to msi bios.)
> 
> currently only 1080 has XOC bios.
Click to expand...

Your Galax 1070 EXOC card already has a 250W limit and a 6-phase VRM that should be good up to 300W. 250W is on the high end of what's available on 1070s.

You could try the Gigabyte Xtreme, MSI Gaming Z, or Gainward GLH BIOSes. For you there is probably no point trying the Asus Strix OC BIOS, as it only has a 200W power limit. You could also try the Zotac AMP Extreme BIOS, but it has a 300W limit; it should run OK, but keep an eye on card temps, as it can potentially power up close to the limits of your VRM.

The EVGA FTW BIOS is interesting because you can play with the Precision XOC auto-overclock utility, but I don't like using it long term, as it continuously hits the power limit and reduces its clocks all the time, at least when running on Gaming X hardware. The MSI BIOSes are really good at keeping the overclock frequency reasonably stable.

I have tried all of them on my MSI Gaming X and ended up settling on the Gaming Z BIOS.

I'll warn you now that you are not likely to see much in the way of performance improvement if you are already overclocking.


----------



## gtbtk

Quote:


> Originally Posted by *Nebulous*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I have concluded that the 1070 bottlenecked by the available memory bandwidth. The card loves memory clocks. Not surprising given the basically same chip in the 1080 benefits from memory oc up to about 11Gbs .
> 
> Don't use the slider for core clock, you are overclocking many voltage points that you will not be using under load and it takes "resources" for the want of a better word, away from the the bit of the GPU you want to be using.
> 
> Use the curve in afterburner and only pull up the point at 1.093 if you have the voltage set to +100. 2100Mhz should be dead easy.
> 
> Set a fan curve. If noise is not an issue, you should be able to keep the chip at 60 deg or less with the fan at 100%. The chip will reduce clock speeds as temps increase
> 
> 
> 
> I was having issues keeping 2100 with my 1070 and this tiny little tidbit saved me more headaches. I did exactly this without so much as touching the slider for the core speed and bam got 2100
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Rep added!
Click to expand...

Thanks.

Glad I could help


----------



## colabang

Quote:


> Originally Posted by *gtbtk*
> 
> Where is there a 1070 extreme OC bios available? I have seen an Asus one for a 1080 that will allow 1.2v but not a 1070.
> 
> The king pin mod is designed to trick the standard bios into thinking that it is pulling less power than it really is so the downclock features do not kick in


thx for the answer. I miss the days when we could mod the BIOS of the GTX 770. ^^ It's really annoying that these cards start throttling at 55°C or less. With a manual fan curve my card doesn't get hotter than 62°C. It's a shame that I can't overclock more because of the voltage limit, power limit, etc. I hope someday we can modify the BIOS.


----------



## colabang

Quote:


> Originally Posted by *gtbtk*
> 
> Thank.
> 
> Glad I could help


I don't get it. I have mine at +115 core and +500 memory with MSI Afterburner, so I get something over 2100, but it downclocks to 2088MHz. So you mean we should leave the voltage slider at stock? Because I have it on max, like the power target and temp target.


----------



## gtbtk

Quote:


> Originally Posted by *colabang*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Thank.
> 
> Glad I could help
> 
> 
> 
> i dont get it i have mine plus 115 core and plus 500 on memory with msi afterburner. So i get something over 2100 but it downclocks to 2088mhz. so you mean we should leave the voltage slider at stock, bceause i have it on max like power target and temp target?
Click to expand...

The higher voltage gives you the ability to run faster clocks and better performance at the expense of more heat. Lower voltage means less heat and slightly lower frequencies.

By design, GPU Boost 3.0 drops frequency as temperature rises, so it is in your interest to find the best balance between heat, voltage, and core clock. You will never hold the initial frequency you see on a cold GPU once it is up to temperature. Performance is about finding the best balance.

Pascal, due to the 16nm FinFET process, does not cope with high voltages the way Maxwell did: the smaller transistors on a smaller die don't have the quantity of material to cope the same way. On the other hand, you get significantly higher clocks at standard voltages than you would with Maxwell cards, even though Pascal is really just a refinement and die shrink of Maxwell.
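To make the temperature/frequency trade-off concrete, here is a toy model of the boost behaviour described above. The thresholds and the ~13MHz step size are illustrative guesses, not NVIDIA's actual tables:

```python
# Illustrative sketch of GPU Boost 3.0 thermal behaviour: the boost
# clock steps down in roughly 13 MHz bins as temperature thresholds
# are crossed. All numbers here are invented for illustration.

def effective_clock(base_boost_mhz, temp_c, thresholds=(38, 45, 52, 60, 68)):
    """Return the boost clock after thermal step-downs.

    base_boost_mhz -- clock the card holds when cold
    temp_c         -- current GPU temperature in Celsius
    thresholds     -- hypothetical temperatures at which one bin drops
    """
    bins_dropped = sum(1 for t in thresholds if temp_c >= t)
    return base_boost_mhz - 13 * bins_dropped

# A card that boosts to 2100 MHz cold settles lower once warm:
print(effective_clock(2100, 35))  # -> 2100 (cold, full boost)
print(effective_clock(2100, 62))  # -> 2048 (four bins dropped)
```

This is why a cold benchmark run starts faster than it finishes: the clock you tune for has to be the warm one, not the cold one.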


----------



## Handrox

Good afternoon guys, don't we have an unlocked BIOS for the GTX 1070?


----------



## icold

Quote:


> Originally Posted by *Handrox*
> 
> Good afternoon guys, do not we have a BIOS unlocked for GTX 1070?


no


----------



## Handrox

Ok...









It's just that I've been reading that they unlocked a BIOS for the GTX 1080, so I thought they had achieved the same for the GTX 1070.


----------



## madweazl

Made another run for the top in Fire Strike (6600k/1070) last night and pulled it off. Last time I made it to the top was with a Celeron 333/TNT 2 combo back in the Mad Onion days LOL.

2164/2430


Spoiler: Warning: Spoiler!





__
https://flic.kr/p/Rt7Z3U


----------



## Handrox

GPU Score 22510
http://www.3dmark.com/fs/11483651


Spoiler: Warning: Spoiler!


----------



## madweazl

I7 makes a huge difference in the physics based runs.


----------



## MEC-777

Quote:


> Originally Posted by *madweazl*
> 
> I7 makes a huge difference in the physics based runs.


That's because the physics test is primarily CPU-bound.









I too saw a massive jump in scores going from an i5-4570 to an i7-4770K @ 4.4.


----------



## skupples

Quote:


> Originally Posted by *gtbtk*
> 
> If you leave the voltage slider at stock, you should adjust the 1.063V point instead of the 1.093v point. Given your Cooling challenges, running the card at the lower voltage will allow it to run slightly cooler.
> 
> You can try 2100 Mhz, but it is 50/50 if you can run stable at that frequency/voltage combination


i couldn't even find the voltage curve in the settings... I'm not worried about temps, outside of instability. I have a huge window AC in the room. I'm also not worried about the card popping, as it's not being taken outside of spec in any way, shape, or form, and it's EVGA. Quality might be lacking, but their service is still spot on.


----------



## colabang

Quote:


> Originally Posted by *gtbtk*
> 
> The higher voltage gives you the ability to run faster clocks and better performance at the expense of more heat. Lower voltage means less heat and slightly lower frequencies.
> 
> By design, GPU Boost 3.0 drops frequency as temperature rises, so it is in your interest to find the best balance between heat, voltage, and core clock. You will never hold the initial frequency you see on a cold GPU once it is up to temperature. Performance is about finding the best balance.
> 
> Pascal, due to the 16nm FinFET process, does not cope with high voltages the way Maxwell did: the smaller transistors on a smaller die don't have the quantity of material to cope the same way. On the other hand, you get significantly higher clocks at standard voltages than you would with Maxwell cards, even though Pascal is really just a refinement and die shrink of Maxwell.


thx for your answer, but I don't think this will help reach higher frequencies. For example, when I hit 55°C my card downclocks from 2100 to 2077, but the voltage also decreases, from 1.093V to 1.075V. So if you set 1.075V from the start, you'll only achieve 2077. Maybe you won't see any more downclocking after hitting 55°C, but you also start from a lower core clock. What I want to say is that it doesn't seem to matter how you set the voltage in MSI Afterburner; the result is the same. Or did I overlook something?

What if we turn on the constant-voltage function in MSI Afterburner? Will that keep the card from downclocking?


----------



## kignt

The latest afterburner revision notes the hotkeys for voltage/frequency curve http://www.guru3d.com/files-details/msi-afterburner-beta-download.html


----------



## gtbtk

Quote:


> Originally Posted by *Handrox*
> 
> Ok...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is why I've been reading that they unlocked a bios for GTX 1080, then I thought that for the GTX 1070 they had also achieved.


Asus supplied an unlocked XOC BIOS on a 1080 Strix card to an overclocker called Dancop, who shared it with the world. No 1070 version has been released into the wild.


----------



## gtbtk

Quote:


> Originally Posted by *kignt*
> 
> The latest afterburner revision notes the hotkeys for voltage/frequency curve http://www.guru3d.com/files-details/msi-afterburner-beta-download.html


Ctrl+F brings up the curve (also the little bar-graph icon).

Ctrl+D resets the card to defaults.

Ctrl+L on a selected point locks the voltage at that point.


----------



## gtbtk

Quote:


> Originally Posted by *colabang*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The higher voltage gives you the ability to run faster clocks and better performance at the expense of more heat. Lower voltage means less heat and slightly lower frequencies.
> 
> By design, GPU Boost 3.0 drops frequency as temperature rises, so it is in your interest to find the best balance between heat, voltage, and core clock. You will never hold the initial frequency you see on a cold GPU once it is up to temperature. Performance is about finding the best balance.
> 
> Pascal, due to the 16nm FinFET process, does not cope with high voltages the way Maxwell did: the smaller transistors on a smaller die don't have the quantity of material to cope the same way. On the other hand, you get significantly higher clocks at standard voltages than you would with Maxwell cards, even though Pascal is really just a refinement and die shrink of Maxwell.
> 
> 
> 
> thx for your answer. But i think this will not help to get higher frequencies. Because for example if i hit 55C my gtx downclocks from 2100 to 2077. But also the Voltage decrease from 1.093V to 1.075V. So if you give 1.075 from the start youll only achieve 2077. So maybe you dont have no downclocks anymore after hitting 55C but it also have lower Coreclock from the begining. So what i want to say is it desnt matter how you set up the the voltage in msi afterburner itll be the same. Or did i overlooked something?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What if we turn on the constant voltage function in msi afterburner. Will that keep the card from downclocking?
Click to expand...

Try it and see what happens. It won't hurt your card.


----------



## colabang

Quote:


> Originally Posted by *gtbtk*
> 
> Try it and see what happens. It wont hurt your card


Yeah, I'll try it. I'm on my laptop right now; I'll report back after I've tried it.


----------



## skupples

Quote:


> Originally Posted by *kignt*
> 
> The latest afterburner revision notes the hotkeys for voltage/frequency curve http://www.guru3d.com/files-details/msi-afterburner-beta-download.html


hey thanks!

+1


----------



## EDK-TheONE

Quote:


> Originally Posted by *skupples*
> 
> hey thanks!
> 
> +1


----------



## EDK-TheONE

Quote:


> Originally Posted by *Handrox*
> 
> GPU Score 22510
> http://www.3dmark.com/fs/11483651
> 
> 
> Spoiler: Warning: Spoiler!


Nice score. Can you send a screenshot of your curve? Which model do you have?


----------



## gtbtk

Not quite sure how I did this.

http://www.3dmark.com/fs/10295423


----------



## GeneO

Quote:


> Originally Posted by *gtbtk*
> 
> Not quite sure how I did this.
> 
> http://www.3dmark.com/fs/10295423


That's stable right? LOL


----------



## Dude970

Quote:


> Originally Posted by *gtbtk*
> 
> Not quite sure how I did this.
> 
> http://www.3dmark.com/fs/10295423


Somehow Graphics Test 2 ran at about double the FPS it should have based on your specs.


----------



## zipper17

I also found someone with a 24,105 GPU score on a single 1070:
http://www.3dmark.com/fs/9740626

It's a glitch, isn't it? I'd be surprised if a 1070 could really reach that level of performance (GTX 1080 territory).


----------



## gtbtk

Quote:


> Originally Posted by *GeneO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Not quite sure how I did this.
> 
> http://www.3dmark.com/fs/10295423
> 
> 
> 
> 
> 
> That's stable right? LOL
Click to expand...

I have never been able to replicate that score. The next best result I have had is 15244. I think it is what's referred to as an outlier.


----------



## gtbtk

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> If you leave the voltage slider at stock, you should adjust the 1.063V point instead of the 1.093v point. Given your Cooling challenges, running the card at the lower voltage will allow it to run slightly cooler.
> 
> You can try 2100 Mhz, but it is 50/50 if you can run stable at that frequency/voltage combination
> 
> 
> 
> i couldn't even find the voltage curve in the settings... I'm not worried about temp, outside of instability. I have a huge window AC in the room. I'm also not worried about the card popping as its not being taken outside of spec in anyway shape or form, & it's EVGA. Quality might be lacking, but their service is still spot on.
Click to expand...

In Afterburner, press Ctrl+F and the curve will open in a new box on screen.

In Precision XOC, press the right-side triangle button twice. You can only adjust the curve there in 25MHz steps, so adjustments are much coarser.


----------



## Handrox

Quote:


> Originally Posted by *gtbtk*
> 
> Asus supplied an unlocked XOC bios 1080 Strix card to an overclocker called Dancop who shared it with the world. There has been not 1070 version released into the wild


Oh man, thanks for the info!
Quote:


> Originally Posted by *EDK-TheONE*
> 
> Nice score. can you send screenshot from curve? which model do you have?


The curve is default, the model is an EVGA GTX 1070 FTW.


----------



## equlix

A quick question.
I just picked up an MSI 1070 Armor OC a few days ago, but after hearing about the For Honor and Wildlands bundle I'm considering upgrading to a different 1070 with the game bundle and possibly a better OC. My Armor can boost to 1979MHz and OC to about 2070~2100MHz before artifacts start showing up. So should I keep what I have or upgrade to a different 1070? And if I upgrade, which one should I get? I was thinking the Asus Strix OC or the MSI Sea Hawk.


----------



## GeneO

Quote:


> Originally Posted by *zipper17*
> 
> I also find people with 24,105k GPU Scores on single 1070
> http://www.3dmark.com/fs/9740626
> 
> it's a glitch isnt it? i would be surprised if 1070 can really reach that performances (gtx 1080 zone).


That is an i7-6850K, so they have much higher physics and combined scores.


----------



## ucode

Quote:


> Originally Posted by *GeneO*
> 
> That is a i7-6850K, so they have a much higher physics and combined score.


The graphics score is too high, and the linked result says "Time measurement data not available," so it's a bogus score.


----------



## gtbtk

Quote:


> Originally Posted by *equlix*
> 
> A quick question.
> I just picked up an msi 1070 armor OC a few days ago but after hearing about the for honor and wild lands bundle i'm considering upgrading to different 1070 with the game bundle and possibly a better oc. My armor can boost to 1979 Mhz and oc to about 2070~2100 Mhz before artifacts start showing up. So should I keep what I have or upgrade to different 1070. And if I upgrade which 1070 should I get? I was thinking the asus oc strix or the msi sea hawk.


You are getting pretty much what everyone gets with an overclock, give or take. If you have Micron memory, make sure that you have an 86.04.50.00.xx version of the vBIOS installed; it fixes a bug in the memory controller.

Clocks drop as temps rise, so a Sea Hawk is likely to give you a more stable OC than an air-cooled card. Other than the free games, if you plan on overclocking you will not really get any more performance from a switch to the Strix, even though it has higher default clock speeds. If you really want that, you can always cross-flash a different BIOS onto your card and get it anyway.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> I also find people with 24,105k GPU Scores on single 1070
> http://www.3dmark.com/fs/9740626
> 
> it's a glitch isnt it? i would be surprised if 1070 can really reach that performances (gtx 1080 zone).


Yes, like the 17000 score I posted this morning, it's a glitch.


----------



## RaleighStClair

Quote:


> Originally Posted by *equlix*
> 
> A quick question.
> I just picked up an msi 1070 armor OC a few days ago but after hearing about the for honor and wild lands bundle i'm considering upgrading to different 1070 with the game bundle and possibly a better oc. My armor can boost to 1979 Mhz and oc to about 2070~2100 Mhz before artifacts start showing up. So should I keep what I have or upgrade to different 1070. And if I upgrade which 1070 should I get? I was thinking the asus oc strix or the msi sea hawk.


I was able to flash my MSI Sea Hawk X to a Gaming X (thanks to the suggestions in this subforum), which in turn gave me a much higher power limit and a locked voltage. I could get my core to 2200MHz with +400 on the memory, but the 1070 doesn't really seem to scale well past 2000MHz core. I think a higher memory OC with a lower core clock shows better gains, IMO, so I went back to my 2050/+500 at 1.043V on the original BIOS.

My point is that I don't really see the point of going through the hassle of getting a new card for the sake of a +200 core overclock when we don't see any gains past 2000 on the core. Now, if you were not able to get to 1900MHz, then I would consider it.


----------



## dboythagr8

Which cards did the RAM issue impact (Micron vs. Samsung)? Was it limited to a certain vendor? Just checked GPU-Z, and I've got Samsung. I only got my 1070 about 8 days ago, so I completely missed that discussion.


----------



## ucode

Micron


----------



## dboythagr8

Quote:


> Originally Posted by *ucode*
> 
> Micron


Did it only impact cards from Asus, EVGA, MSI, etc or was it random? Guess I got lucky either way.


----------



## EDK-TheONE

Quote:


> Originally Posted by *Handrox*
> 
> Oh man, thanks for the info!
> The curve is default, the model is an EVGA GTX 1070 FTW.


Can you send screenshot from default curve in after burner?


----------



## Handrox

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Can you send screenshot from default curve in after burner?


I took the capture with the overclock already applied, so you can see how it ended up.


Spoiler: Warning: Spoiler!


----------



## zipper17

Quote:


> Originally Posted by *Handrox*
> 
> I leave the capture already with the overclock, so you can see how it stayed.
> 
> 
> Spoiler: Warning: Spoiler!


No crashes or artifacts at all in games/Fire Strike? You won the silicon lottery.

Do you have a Fire Strike Extreme stress test result with those settings?


----------



## gtbtk

Quote:


> Originally Posted by *dboythagr8*
> 
> Who did the RAM issue impact (Micron v Samsung)? Was it a certain vendor? Just checked GPU-Z, and I've Samsung. I just got my 1070 about 8 days ago, so I completely missed that discussion.


The memory controller bug affected Micron-equipped cards sold by all vendors. Manufacturing occurs in batches that seem to alternate between the two memory suppliers.

With Samsung memory you have nothing to worry about; it generally overclocks really well.


----------



## gtbtk

Quote:


> Originally Posted by *Handrox*
> 
> Quote:
> 
> 
> 
> Originally Posted by *EDK-TheONE*
> 
> Can you send screenshot from default curve in after burner?
> 
> 
> 
> I leave the capture already with the overclock, so you can see how it stayed.
> 
> 
> Spoiler: Warning: Spoiler!
Click to expand...

If that runs without artifacts at those settings, you have a great card.


----------



## gtbtk

Quote:


> Originally Posted by *dboythagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ucode*
> 
> Micron
> 
> 
> 
> Did it only impact cards from Asus, EVGA, MSI, etc or was it random? Guess I got lucky either way.
Click to expand...

all vendors.


----------



## gtbtk

Quote:


> Originally Posted by *RaleighStClair*
> 
> Quote:
> 
> 
> 
> Originally Posted by *equlix*
> 
> A quick question.
> I just picked up an msi 1070 armor OC a few days ago but after hearing about the for honor and wild lands bundle i'm considering upgrading to different 1070 with the game bundle and possibly a better oc. My armor can boost to 1979 Mhz and oc to about 2070~2100 Mhz before artifacts start showing up. So should I keep what I have or upgrade to different 1070. And if I upgrade which 1070 should I get? I was thinking the asus oc strix or the msi sea hawk.
> 
> 
> 
> I was able to flash my MSI SeahawkX to an GamingX - thanks to the suggestions in this subforum - and that in turn allowed me to get a much higher power limit and voltage (locked), where I could get my core to 2200 mhz/ 400 mem but the 1070 doesn't really seem to scale that well past 2000 mhz core. I think a higher memory OC with a lower core clock shows better gains, IMO. So i went back to my 2050/500 @ 1.043 and original BIOS.
> 
> My point is that I don't really see a point in going through the hassle of getting new cards for the sake of a +200 core overclock, when we really don't see any gains past 2000 core. Now if you were not able to get to 1900mhz, then I would consider it.
Click to expand...

Cross-flashing is not going to magically turn your 1070 into a pseudo-1080. A Gaming X BIOS will give you more boost-frequency stability under a 1080p/1440p load than an EVGA BIOS, because the Gaming X card will not hit its 100% power limit while the EVGA card will, thanks to the way the power limit is configured in the different vendor-specific BIOS versions. In some cases it can also give you access to higher default core and memory clocks, a higher power limit, or a unique feature, like the auto-overclock feature that EVGA cards have with Precision XOC.

I ended up becoming an expert at it because I was trying everything while diagnosing the Micron memory bug.


----------



## Kamikaze-X

I get 15800 in Fire Strike with my 1070 and a 4690K, but that's overclocking the nuts off the card.

http://www.3dmark.com/fs/9708102


----------



## mrzoo

Where can I find these BIOS updates for the memory issue? And how do I flash the BIOS? I have an EVGA GTX 1070 SC.


----------



## zipper17

Has this been posted before?



Use liquid metal on those small shunt resistors and, boom, a pretty easy and interesting result:

http://overclocking.guide/increase-the-nvidia-power-limit-all-cards/4/

I think this is the only way right now to push my card further...


----------



## khanmein

Quote:


> Originally Posted by *mrzoo*
> 
> Where can I find these bios updates for memory issues? And how do I flash bios I have evga gtx 1070 sc


http://forums.evga.com/Update-11916-with-NEW-BIOS-EVGA-GeForce-GTX-108010701060-PWM-Temperature-Update-m2573491.aspx


----------



## mrzoo

Quote:


> Originally Posted by *khanmein*
> 
> http://forums.evga.com/Update-11916-with-NEW-BIOS-EVGA-GeForce-GTX-108010701060-PWM-Temperature-Update-m2573491.aspx


Thanks gonna check it out once I get home


----------



## icold

I managed to pull the memory from 4374 to 4392MHz; that's the max, any more and everything freezes. Micron/Elpida are the ****test chips ever.

What comforts me is that the GPU core clock went up reasonably.


----------



## beans444

Love this MSI GTX 1070 Gaming X! I got it on sale for $559 CAD, which is cool; the price was fluctuating every day afterward and is now at $630 with crap games.

It does have Micron chips, but I probably won't overclock it. I had to update the BIOS first thing (screens were blacking out).
Out of curiosity though, wouldn't a great overclock only add about a 10% increase in performance (on air)? And is the slowdown when the card gets too hot just baked in, or is it something you can disable on an OC-edition card like mine?

P.S. I feel like this card is too much tech for my Corsair TX 850W PSU, as I had to use an 8-pin converter/adapter, which looks like crap. Ahh, the last upgrade before a totally new system I guess.


----------



## skupples

Quote:


> Originally Posted by *dboythagr8*
> 
> Who did the RAM issue impact (Micron v Samsung)? Was it a certain vendor? Just checked GPU-Z, and I've Samsung. I just got my 1070 about 8 days ago, so I completely missed that discussion.


I can run my Micron memory at damn near +1000.

I'm also in a higher-resolution scenario where the memory clock benefits me more than the core clock.


----------



## mrzoo

What benefits FPS more, OCing the core clock or the memory? (EVGA GTX 1070 SC.) And how do I know which chips I have, Micron or Samsung?


----------



## khanmein

Quote:


> Originally Posted by *mrzoo*
> 
> What benefits fps more Oc core clock or Oc memory? Evga gtx 1070 sc and how do I know what chips I have? Micron or Samsung


OC the core first, then the memory; FYI, both are equally important. GPU-Z will show you which memory vendor you have:

https://www.techpowerup.com/230235/techpowerup-announces-gpu-z-1-17-0


----------



## icold

Check out the new hotfix; it gives me much more stability in Killing Floor 2 + Flex, and I'm getting to 2139MHz.

http://www.guru3d.com/files-details/geforce-378-57-hotfix-driver-download.html


----------



## skupples

Quote:


> Originally Posted by *mrzoo*
> 
> What benefits fps more Oc core clock or Oc memory? Evga gtx 1070 sc and how do I know what chips I have? Micron or Samsung


Quote:


> Originally Posted by *khanmein*
> 
> OC core then memory & FYI, both equally important.
> 
> https://www.techpowerup.com/230235/techpowerup-announces-gpu-z-1-17-0


This is the normal advice; however, there are situations like mine (NV Surround) where memory benefits more than core. The card already boosts itself to the high 1900s as well.


----------



## Handrox

Quote:


> Originally Posted by *zipper17*
> 
> No crash/artifact whatsoever on games/Firestrike? u got better silicon chip lottery.
> 
> Did you have Firestrike Extreme Stress Test result with that setting?


Quote:


> Originally Posted by *gtbtk*
> 
> It that runs without artifacts with those settings, you have a great card


Not quite; the GPU clock works without problems in games, but the VRAM clock is practically at its limit. It can reach up to +900MHz but is very unstable; +850MHz passes some benchmarks but with artifacts. +750MHz is fully stable in any game, 24/7.


----------



## zipper17

Quote:


> Originally Posted by *mrzoo*
> 
> What benefits fps more Oc core clock or Oc memory? Evga gtx 1070 sc and how do I know what chips I have? Micron or Samsung


Both are important AFAIK, and both have benefits. They should always be overclocked together and combined to get the best result.

To unlock the full potential of the GPU core, the best you can do right now is a power-limit mod (shunt-resistor mod), a voltage mod (extreme PCB mod), powerful cooling, and probably a BIOS mod/editor; or maybe NVIDIA will be kind enough to remove the limits completely in the future. The core clock will always throttle down because of three factors: the power, voltage, and temperature limits.

Memory is much easier to overclock; it doesn't throttle down, but if it's not stable it will produce errors such as artifacts, crashes, or performance loss.

The silicon lottery also plays a role in how high you can go.
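The three limiters mentioned above interact in a simple way: whichever one permits the lowest clock is the one that caps the card. A trivial sketch of that idea (the clock caps are invented numbers, not real card values):

```python
# Hedged sketch of the "three limiters" idea: power, voltage headroom,
# and temperature each allow some maximum clock, and the effective
# boost clock is set by the most restrictive of them.

def boost_clock(power_cap_mhz, voltage_cap_mhz, thermal_cap_mhz):
    """The effective clock is whatever the tightest limiter allows."""
    return min(power_cap_mhz, voltage_cap_mhz, thermal_cap_mhz)

# Example: a card whose thermal limit is the binding one.
print(boost_clock(2114, 2100, 2050))  # -> 2050
```

This is why raising only one limit (say, the power slider) often changes nothing: another limiter is already the binding one.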


----------



## mrzoo

Quote:


> Originally Posted by *zipper17*
> 
> Both are important afaik and has benefits. Both of them always overclocked together and combined to get the best result.
> 
> To Unlock Full Potential Power of GPU Core, best possible for right now you need Power Limit Mod (Shunt resistors mod), Voltage Mod (extreme PCB mod), Powerful Cooling, and probably BIOS mod/Editor. Or maybe Nvidia kind enough will completely remove them in future. GPU Core clock will throttle down every time because a factors from this guy: Power, Voltage, Temperature limit.
> 
> For Memory, much easier to overclock they do not throttle down but they will just produce errors such artifacts, crash, or performances loss? if not stable.
> 
> Silicon lottery also play roles for better luck to get higher overclocking.


What steps would you take to overclock? Start with the core and find the max OC, then set it back to default, then work on the memory, then combine both and back off until stable? I've always tried but never got the hang of it on any card I've had; same with CPUs.


----------



## zipper17

Quote:


> Originally Posted by *mrzoo*
> 
> What steps would you take to overclock? Start with the core and find the max OC, then set it back to default, then work on the memory, then combine both and back off until stable? I've always tried but never got the hang of it on any card I've had; same with CPUs.


Core clock and memory are tested individually, then combined.

I overclock the core clock until it crashes in the Fire Strike Extreme stress test, games, etc.

Then I overclock the memory until I get artifacting (green sparkles).

Then I combine the most stable core clock and memory clock and test again in the Fire Strike Extreme stress test, the Time Spy stress test, games, etc. until everything is fully stable with no crashes or artifacts.
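The loop described above amounts to a simple search: raise one offset until the stress test fails, then back off a step, and repeat for the other offset. A minimal sketch in Python; `is_stable` is a hypothetical stand-in for actually running Fire Strike or a game for a while, and the step/limit values are just plausible defaults:

```python
# Sketch of the core-then-memory tuning approach. The stability check
# is a placeholder for a real stress test; step sizes are assumptions.

def find_max_offset(is_stable, start=0, limit=400, step=25):
    """Raise the offset in `step` MHz increments until `is_stable`
    reports a failure, then return the last passing offset."""
    offset = start
    while offset + step <= limit and is_stable(offset + step):
        offset += step
    return offset

# Example with a fake stability check that fails above +150 MHz:
fake_card = lambda off: off <= 150
print(find_max_offset(fake_card))  # -> 150
```

In practice you would run this once for the core offset, once for the memory offset, then re-test the combination, since the two can interact.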


----------



## MEC-777

Quote:


> Originally Posted by *mrzoo*
> 
> What step would you take to overclocking? Start with core and get Max October, then set back to default then work on memory then combine both and turn down until stable? I've always tried and never got he hang of it on any card I've had same with cpu


Personally, I start with the core clock and bring it up until it crashes Firestrike, then dial it back until it's stable. Then I bring up the memory (leaving the core clock at its max stable OC) until, again, it crashes or artifacts. You get more of a gain from maxing the core clock than from maxing the memory, which is why I do it in this order.

Then, if it's stable in Firestrike, I run other benchmarks like Heaven and Valley plus some of the most demanding games I have, like The Witcher 3, and let them run for about an hour. If there are no issues, I consider it stable.

With my particular card, it doesn't matter if I only OC the core or the memory, I can't get more out of the core and memory separately, so both are at their absolute max stable as I have it. I'm running a Founders, so that might have something to do with it.

I have also discovered, with the Founders anyways, that increasing the voltage % slider did nothing to improve performance and only creates more heat, so I have that at default to keep temps down.


If you really want to get every last drop of performance, learn how to create a custom core clock curve.
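A custom curve amounts to editing the list of (voltage, frequency) points GPU Boost 3.0 walks. The usual trick is to flatten the curve above a chosen voltage; here's a rough sketch, where all the point values are made up for illustration and are not real 1070 bins:

```python
# Flattening a V/F curve: every point at or above the chosen voltage is
# clamped to one frequency, so the card holds that clock instead of
# riding up into unstable high-voltage bins.
def flatten_curve(curve, lock_mv, lock_mhz):
    """curve: list of (voltage_mV, freq_MHz) pairs in ascending voltage."""
    return [(v, f if v < lock_mv else lock_mhz) for v, f in curve]

# Illustrative points only, not a real card's curve:
points = [(800, 1700), (900, 1850), (1000, 1975), (1075, 2050), (1093, 2088)]
locked = flatten_curve(points, lock_mv=1000, lock_mhz=2025)
```

The payoff is the same one people report from the Afterburner curve editor: the clock no longer chases the highest voltage bin, which keeps heat down and stability up.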


----------



## madweazl

Quote:


> Originally Posted by *MEC-777*
> 
> Personally, I start with the core clock and bring it up until it crashes Firestrike, then dial it back until it's stable. Then I bring up the memory (leaving the core clock at its max stable OC) until, again, it crashes or artifacts. You get more of a gain from maxing the core clock than from maxing the memory, which is why I do it in this order.
> 
> Then, if it's stable in Firestrike, I run other benchmarks like Heaven and Valley plus some of the most demanding games I have, like The Witcher 3, and let them run for about an hour. If there are no issues, I consider it stable.
> 
> 
> With my particular card, it doesn't matter if I only OC the core or the memory, I can't get more out of the core and memory separately, so both are at their absolute max stable as I have it. I'm running a Founders, so that might have something to do with it.
> 
> I have also discovered, with the Founders anyways, that increasing the voltage % slider did nothing to improve performance and only creates more heat, so I have that at default to keep temps down.
> 
> 
> If you really want to get every last drop of performance, learn how to create a custom core clock curve.


Upping the voltage was worth about 20mhz on the core and 100mhz on the memory once I moved over to water cooling. On air I was maxing out around 234 and now I'm sitting at 253; with more voltage I'm fairly certain it would pick up quite a bit more but I havent investigated a BIOS solution that would provide that yet. Memory max was around 725-735 and now it maxes at 843. My air cooled maxes are now pretty close to my fully stable settings of 200/750 and I bet both could still be increased a fair bit, I just havent taken the time to run the tests.


----------



## mrzoo

Quote:


> Originally Posted by *madweazl*
> 
> Upping the voltage was worth about 20mhz on the core and 100mhz on the memory once I moved over to water cooling. On air I was maxing out around 234 and now I'm sitting at 253; with more voltage I'm fairly certain it would pick up quite a bit more but I havent investigated a BIOS solution that would provide that yet. Memory max was around 725-735 and now it maxes at 843. My air cooled maxes are now pretty close to my fully stable settings of 200/750 and I bet both could still be increased a fair bit, I just havent taken the time to run the tests.


I was just about to ask about water cooling I'm going to also use the kraken g10 with my card so maybe I'll have a chance of ocing a little higher than I would on air.


----------



## JoeUbi

Quote:


> Originally Posted by *madweazl*
> 
> Upping the voltage was worth about 20mhz on the core and 100mhz on the memory once I moved over to water cooling. On air I was maxing out around 234 and now I'm sitting at 253; with more voltage I'm fairly certain it would pick up quite a bit more but I havent investigated a BIOS solution that would provide that yet. Memory max was around 725-735 and now it maxes at 843. My air cooled maxes are now pretty close to my fully stable settings of 200/750 and I bet both could still be increased a fair bit, I just havent taken the time to run the tests.


+xxx means nothing. It's so annoying that people keep posting it......

So what are your actual clock speeds? Or posted GPU-Z verification?


----------



## MEC-777

Quote:


> Originally Posted by *JoeUbi*
> 
> +xxx means nothing. It's so annoying that people keep posting it......
> 
> So what are your actual clock speeds? Or posted GPU-Z verification?


Yeah, I have to agree. With GPU Boost 3.0 it doesn't actually apply +___ (whatever you set it to). That's why I found I got better results with a custom curve vs. just moving the slider. The actual clocks jump all over depending on a wide range of factors. One major factor is temps: the lower the temps, the higher the sustained boost clocks.


----------



## madweazl

Quote:


> Originally Posted by *JoeUbi*
> 
> +xxx means nothing. It's so annoying that people keep posting it......
> 
> So what are your actual clock speeds? Or posted GPU-Z verification?


The value is that I gained roughly 20/100mhz on the clocks by upping the voltage; the +xxx was there to show the increase achieved. The actual clocks are irrelevant to the conversation.

Edit: You can however back up a couple pages and find my final clocks in the Fire Strike results I posted previously. If you dont trust those results you can look in the Fire Strike and Time Spy threads for my results that have the GPU-Z data too. Annoying is people questioning everything for no reason.


----------



## JoeUbi

Quote:


> Originally Posted by *madweazl*
> 
> The value is that I gained roughly 20/100mhz on the clocks by upping the voltage; the +xxx was there to show the increase achieved. The actual clocks are irrelevant to the conversation.
> 
> Edit: You can however back up a couple pages and find my final clocks in the Fire Strike results I posted previously. If you dont trust those results you can look in the Fire Strike and Time Spy threads for my results that have the GPU-Z data too. Annoying is people questioning everything for no reason.


Nobody is questioning anything. I am just asking that people post their actual clock speeds. How can one compare when the result of +xxx changes based on what the manufacturer/model chooses to spec the card at? You could have posted the actual clock speeds to show the increase; instead you showed +xxx compared to +xxx, contributing to the problem.


----------



## madweazl

Quote:


> Originally Posted by *JoeUbi*
> 
> Nobody is questioning anything. I am just asking that people post their actual clock speeds. How can one compare when the result of +xxx changes based on what the manufacturer/model chooses to spec the card at? You could have posted the actual clock speeds to show the increase; instead you showed +xxx compared to +xxx, contributing to the problem.


The increase of 20MHz on the core and 100MHz on the memory is absolute. The final max OC was 2164/2430, but again, it has no relevance to the original statement. If gaining an additional 20/100MHz by increasing the voltage is too difficult to comprehend, I don't even know where to begin.


----------



## WillG027

Quote:


> Originally Posted by *madweazl*
> 
> The increase of 20MHz on the core and 100MHz on the memory is absolute. The final max OC was *2164/2430*, but again, it has no relevance to the original statement. If gaining an additional 20/100MHz by increasing the voltage is too difficult to comprehend, I don't even know where to begin.


That's all people need to post.

+xxx Mhz doesn't mean a thing when the person reading doesn't know the base boost of the card in question - and every card has a different baseline regardless of what the factory BIOS sets.
Avoid confusion and just post the final, total boost and memory figures.

(Not aimed at you specifically - just a comment in general)


----------



## madweazl

Quote:


> Originally Posted by *WillG027*
> 
> That's all people need to post.
> 
> +xxx Mhz doesn't mean a thing when the person reading doesn't know the base boost of the card in question - and every card has a different baseline regardless of what the factory BIOS sets.
> Avoid confusion and just post the final, total boost and memory figures.
> 
> (Not aimed at you specifically - just a comment in general)


The original post had absolutely nothing to do with max clocks, that information was completely irrelevant to the discussion. If I were in a max clocks discussion, I would have posted that information but I was just showing the delta in regard to the voltage increase.


----------



## HOODedDutchman

Every time I check this thread people are fighting lol.


----------



## HOODedDutchman

Quote:


> Originally Posted by *beans444*
> 
> Love this MSI GTX 1070 Gaming X! I got it on sale for $559 CAD, which is cool; the price fluctuated every day after and is now at $630 with crap games..
> 
> It does have Micron chips.. but I probably won't plan on overclocking it. Had to update the BIOS first thing (screens blacking out).
> Out of curiosity though, wouldn't a great overclock only add a 10% increase in performance (on air)? And is the slowdown when and if the card gets too hot just baked in, or something you can disable with my OC edition card?
> 
> PS. I feel like this card is too much tech for my Corsair TX 850W PSU, as I had to use an 8-pin converter/adapter which looks like crap. Ahh, the last upgrade before a totally new system I guess.


Pretty sure the TX 850 has 2x 8-pin and 2x 6-pin...

Edit: checked the specs; both the original TX850 and the newer TX850 have four 6+2 (8-pin) connectors. You must be blind if you don't see the extra 2 pins hanging off. No reason at all to use an adapter.


----------



## zipper17

My real clock is only fully stable hovering around 2076, 2063, 2050, 2038MHz. This is with 100% fan speed, 125% power and 100% max voltage. It's +75 on the clock slider in MSI Afterburner.
Weirdly, if I put +88 it will sometimes pass the FS Extreme/Timespy stress tests, but other times it crashes.

The Samsung memory is only fully stable at +600, a real effective clock of 2302*4 = 9208MHz.
Higher than +600 the memory produces mini green-spark artifacts.

This is all my silicon lottery can do; I've pushed my card to the limit. The overclocking ability is pretty disappointing to be honest..

I'd need powerful cooling, a shunt resistor mod, a BIOS mod, and a voltage mod to push further.

If I put core +100, mem +650-700, I can easily break 21K FS graphics score, but it's not stable ***.
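For anyone converting Afterburner memory offsets into the effective (quad-pumped) GDDR5 rate quoted above, a quick sketch. Two assumptions here: a GTX 1070 stock effective rate of 8008 MHz, and Afterburner counting its offset on the double-data-rate scale (so each +1 in AB is +2 effective).

```python
# Convert an Afterburner memory offset to the effective (quad-pumped)
# GDDR5 rate. Assumed: GTX 1070 stock effective rate = 8008 MHz, and
# Afterburner offsets counted on the double-data-rate scale (x2 effective).
STOCK_EFFECTIVE_MHZ = 8008

def effective_mem_clock(ab_offset_mhz):
    return STOCK_EFFECTIVE_MHZ + 2 * ab_offset_mhz

# Under these assumptions, +600 in Afterburner works out to
# 9208 MHz effective, i.e. a 2302 MHz command clock times 4.
```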


----------



## gtbtk

Quote:


> Originally Posted by *JoeUbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *madweazl*
> 
> Upping the voltage was worth about 20mhz on the core and 100mhz on the memory once I moved over to water cooling. On air I was maxing out around 234 and now I'm sitting at 253; with more voltage I'm fairly certain it would pick up quite a bit more but I havent investigated a BIOS solution that would provide that yet. Memory max was around 725-735 and now it maxes at 843. My air cooled maxes are now pretty close to my fully stable settings of 200/750 and I bet both could still be increased a fair bit, I just havent taken the time to run the tests.
> 
> 
> 
> +xxx means nothing. It's so annoying that people keep posting it......
> 
> So what are your actual clock speeds? Or posted GPU-Z verification?
Click to expand...

While you are correct that +xxx depends on the factory overclock level of a specific card and isn't tied to a consistent baseline, it does mean something if you are comparing two copies of the same card, or changes in the limits of the same card.

I am running the Gaming Z BIOS on my Gaming X card. It comes with a +100MHz factory overclock (as per the spec sheet) on the vRAM. But even speaking in MHz, GPU-Z tells me it is a 25MHz overclock, Afterburner tells me it is 50MHz, and the full DDR frequency quoted by MSI/Nvidia is 100MHz, so which one do we suggest the beginners asking the questions use?

On the other hand, overclocking with the voltage curve doesn't end up being reflected properly in GPU-Z. It only reports an overclock correctly when you use the traditional core clock slider, so the verification page as it stands now is obsolete and just as meaningless. My card is running 2126MHz with a curve overclock, but GPU-Z tells me that the core clock is stock and the boost clock is 1MHz above stock.

When I am trying to describe something I try to be as precise as possible because I am aware of the differences, but the vocabulary for describing this stuff tends to be drawn from the tools we have. Right now we are at a point in GPU development history where the new functionality of the GPUs has outstripped the tools we use to measure and manage them. The additions bolted onto the tools to cope are expedient kludges rather than an elegant integration of the new features. I don't have the answer to the conundrum other than to be precise with the flawed vocabulary, but I am confident someone will work out a better way of presenting and communicating this sort of thing.
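The 25/50/100 MHz confusion is just the same offset quoted at three scales: GPU-Z's command clock, Afterburner's DDR rate, and the quad-pumped effective rate on the spec sheet. A quick sketch, assuming the 1070's stock command clock is 2002 MHz (these exact ratios are my reading of the post above, not an official formula):

```python
# One memory overclock, three reported numbers. Spec sheets quote the
# effective (quad-pumped) figure, Afterburner the DDR figure, and GPU-Z's
# main tab the command clock. Stock command clock assumed to be 2002 MHz.
STOCK_CMD_MHZ = 2002

def reported_figures(spec_sheet_offset_mhz):
    gpuz = spec_sheet_offset_mhz / 4         # command-clock scale
    afterburner = spec_sheet_offset_mhz / 2  # DDR scale
    effective = (STOCK_CMD_MHZ + gpuz) * 4   # resulting quad-pumped rate
    return gpuz, afterburner, effective

# A spec-sheet +100 MHz shows up as +25 in GPU-Z and +50 in Afterburner.
```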


----------



## icold

I need to point out here: the core clock is much more important for gaming than the memory. A memory OC improves almost nothing in gaming; it's around +1 fps per +100. If you have Samsung memory you gain around 4 fps more than Micron, because Micron tends to OC around the +400s and Samsung the +800s.


----------



## DeathAngel74

barely...on Micron I get +363 before things get wonky.


----------



## ChronoBodi

Quote:


> Originally Posted by *gtbtk*
> 
> While you are correct that+xxx is dependent on the factory overclock levels of a specific card and not tied to a consistent baseline. it does mean something if you are comparing two copies of the same card or changes in limits of the same card.
> 
> I am running Gaming Z bios on my Gaming X card. It comes with a +100Mhz factory overclock (as per the spec sheet) on the vRAM. But even speaking about Mhz, GPU-Z tells me it is a 25Mhz, Afterburner tells me it is a 50Mhz overclock and the full DDR frequency quoted by MSI/Nvidia is 100Mhz so which one do we suggest the beginners asking the questions use?
> 
> On the other hand, Overclocking with the Voltage curve doesn't end up being reflected properly in GPU-Z. It only reflects an overclock correctly using the traditional core clock slider so the verification page as it stands now is obsolete and just as meaningless. My card is running 2126Mhz with a curve overclock but GPU-Z tells me thet the Core clock is stock and boost clock is 1Mhz above stock.
> 
> When I am trying to describe something, I try and be as precise as possible because I am aware of the differences but Vocabulary for describing this stuff tends to be drawn from the tools we have, Right now, we are at a point in GPU development history where the new functionality of the GPUs have outstripped the tools that we are using to measure and manage them. The Tools additions to cope are really expedient stuck on kludges rather than an elegant way of integrating the new features. I don't have the answer to the conundrum other than to be precise with the flawed vocab, but I am confident that someone will work out a better way of presenting and communicating this sort of stuff.


The only way to get the real clock speed is from the Sensors tab in GPU-Z while running a game or benchmark.

I do have Pascals; on my TXP I added +200 to the core and my actual clock speed is between 1930-2012MHz depending on temperatures and fan speed.

As for my 1080, it's pretty much +180 to the core, but that gets me to 2050-2100MHz actual clock speed.

As I recall, on the 1070 I had before I sold it, I could only add +100 core to hit the same Pascal wall of 2050-2100MHz because it was already a factory-overclocked MSI, whereas my current Titan XP and GTX 1080 are at stock reference clocks out of the box. So one cannot say they added +xxx without listing what factory OC was already applied to begin with.


----------



## icold

Quote:


> Originally Posted by *DeathAngel74*
> 
> barely...on Micron I get +363 before things get wonky.


I get +374, but around +400 seems to be the average for the people here.


----------



## DeathAngel74

Yeah the lucky ones, lol. Cheers


----------



## ucode

Quote:


> Originally Posted by *ChronoBodi*
> 
> Only way to get real clock speed is from Sensor Tab in GPU-Z while running a game or benchmark.


You're relying on the driver to be correct.

As for offsets, here's a GTX 1050 Ti with a +1180MHz offset, a bit too much for AB

Actual GPU clock is reported to be limited to 1911MHz.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> I need to point out here: the core clock is much more important for gaming than the memory. A memory OC improves almost nothing in gaming; it's around +1 fps per +100. If you have Samsung memory you gain around 4 fps more than Micron, because Micron tends to OC around the +400s and Samsung the +800s.


I can run the Micron memory on my MSI Gaming X at more than 9300MHz, which is +650 above reference speed. There are Micron cards belonging to members here that clock to +800, and there are Samsung cards that won't clock past +400.

While it was true 4 months ago that the Micron memory had controller timing issues before the bug-fix BIOS was released, it is not true now if you keep your hardware up to date. I wish you would stop repeating that misinformation.


----------



## kevindd992002

I'm trying to install the Accelero Hybrid cooler on my Zotac GTX 1070 AMP! Extreme card and I noticed that the pump interferes with a single component on the lower right side of the GPU die area. This cooler has an Asetek pump, so it's pretty much compatible with all Pascal cards.

The component I'm talking about is the one located on the left side of the bottom vRAM chip that is on the right side of the GPU die. Sorry for the confusing description, but I hope you get what I mean.

The lower right corner of the pump touches around 1/4 of that component. Now, I see a lot of people using a Kraken G10 with an Asetek pump as well, but I'm not sure if they've noticed this problem.

The G10+Asetek combo is pretty much the same as the Accelero Hybrid cooler. I'm just not sure about the spacing that the G10 gives.

Can anyone comment on this please? Thanks.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DeathAngel74*
> 
> barely...on Micron I get +363 before things get wonky.
> 
> 
> 
> I get +374, but 400 average of the people up
Click to expand...

I am assuming that your vBIOS is a .50 version. Are you finding that you start getting small blue or cyan rectangles over parts of the screen if you try to OC your memory too much?

Try resetting your motherboard to optimized defaults and redo your BCLK overclock from scratch. I had been using the OC settings I had set before I bought my 1070. You can try either a fixed-frequency VRM or enabling VRM spread spectrum with optimized phase management; I found that spread spectrum helped with stability when I was experimenting and tuning other things.

Undo any memory overclocks you may have until you get the CPU stable again. Then start from scratch with the memory if you are running any custom memory overclocks. I ended up loosening my RAM timings slightly but dropped my memory OC voltage back from the 1.65v I had been using to 1.5v, and that seems to have helped stability as well.

I have also found that reducing CPU PLL slightly from the 1.8v default helps with higher memory clock speeds, but I am still experimenting with that to nail the best value on my motherboard. CPU PLL changes the voltage to an oscillator that helps keep things in sync with the system clock. I am pretty sure the reason your GPU memory is getting wonky is that the different signalling waves going over the PCIe bus to the vRAM are getting out of phase because the clocks are not exactly in sync.


----------



## gtbtk

Quote:


> Originally Posted by *kevindd992002*
> 
> I'm trying to install the Accelero Hybrid cooler on my Zotac GTX 1070 AMP! Extreme card and I noticed that the pump interferes with a single component on the lower right side of the GPU die area. This cooler has an Asetek pump, so it's pretty much compatible with all Pascal cards.
> 
> The component I'm talking about is the one located on the left side of the bottom vRAM chip that is on the right side of the GPU die. Sorry for the confusing description, but I hope you get what I mean.
> 
> The lower right corner of the pump touches around 1/4 of that component. Now, I see a lot of people using a Kraken G10 with an Asetek pump as well, but I'm not sure if they've noticed this problem.
> 
> The G10+Asetek combo is pretty much the same as the Accelero Hybrid cooler. I'm just not sure about the spacing that the G10 gives.
> 
> Can anyone comment on this please? Thanks.




Do you mean the 4 silver things just next to the bottom right screw hole that surround the GPU?

They look like a pair of resistors to me. I would suggest that if you are getting contact, you get a small piece of electrical tape and put it between the components and the pump plate, as metal-to-metal contact will probably cause a short.


----------



## ChronoBodi

Quote:


> Originally Posted by *ucode*
> 
> Your relying on the driver to be correct.
> 
> As for offsets here's a GTX 1050ti with +1180MHz offset, a bit too much for AB
> 
> 
> Actual GPU clock is reported to be limited to 1911MHz.


How does that even..... oh.

Um, just for reference, I tried to run a 2150MHz clock speed on my TXP. Sure, it applied, but in no way was it stable; it crashed on anything 3D-intensive.

So..... what's the difference between you applying your insane +1180 offset and being capped to a boost speed of 1911MHz, whereas mine can go higher the more core offset I apply, but obviously crashes at the Pascal wall limit?


----------



## DeathAngel74

I know this is a little off-topic, but what drivers work best in general for the 1070? I'm currently on 373.06 and have been thinking of upgrading. From what I've read, 376.60 is pretty stable? TIA; I just don't want to break something that is running perfectly fine so far, lol.


----------



## Dude970

I'm using the latest WHQL, 378.49. It runs stable and games are great. If you want the best benchmark scores, though, some older drivers seem to score better.


----------



## DeathAngel74

Nah, I just want stability in game. I'm all benchmarked out. No time, lol.


----------



## Dude970

What games are you currently playing? I'm pretty much on BF1 exclusively, and the new drivers are great.


----------



## DeathAngel74

Star Wars Battlefront 2015, The Witcher 3 and Batman: Arkham Knight. All maxed out, max stable OC is 2101/8726mhz 1.075v. Windows 7 Ultimate x64


----------



## kevindd992002

Quote:


> Originally Posted by *gtbtk*
> 
> 
> 
> Do you mean the 4 silver things just next to the bottom right screw hole that surround the GPU?
> 
> They look like a pair of resistors to me. I would suggest that if you are getting contact, you get a small piece of electrical tape and use that between the components and the pump plate as metal to metal contact will probably cause a short


Nope, not those. On the right side of those 4 silver things, right next to the bottom vRAM chip, is a small rectangular component. That's the one I'm referring to. It's not the potential short I'm worried about; it's the pressure the pump puts on that component when it's screwed down.

What I really want to know is whether the Kraken G10 users are also seeing this kind of problem.


----------



## iluvkfc

Anyone else have their max voltage go down from 1.093 to 1.081 in one of the latest drivers? I was always getting 1.093 before, now it's stuck at 1.081 even if I lock a higher voltage point in Afterburner. Voltage slider also maxed out.


----------



## madweazl

Quote:


> Originally Posted by *Dude970*
> 
> I'm using the latest WHQL 378.49. Runs stable and games are great. If you are wanting the best for benchmarks it seems some older drivers score better


I'm actually on the latest drivers and still have the top benchmark


----------



## GeneO

Been playing around with benchmarking some. On the left are results that are stable for normal use: low and quiet fan speed, normal ambient temperature. On the right, fans at full tilt and a cooler ambient (20C). My weak points are 1.081v (2037MHz) and 1.075v (2012MHz). Done with this I think, though maybe someone here will inspire me.


----------



## gtbtk

Quote:


> Originally Posted by *Dude970*
> 
> I'm using the latest WHQL 378.49. Runs stable and games are great. If you are wanting the best for benchmarks it seems some older drivers score better


The hotfix driver 378.57, which fixes the bug that made the 378.49 drivers run in debug mode all the time, actually seems pretty good.

Available here http://nvidia.custhelp.com/app/answers/detail/a_id/4378


----------



## GeneO

That is what I am running. Seems pretty solid.


----------



## gtbtk

Quote:


> Originally Posted by *GeneO*
> 
> That is what I am running. Seems pretty solid.


My micron memory on both this driver and the .49 version is running pretty stable at 9288Mhz which is the best I have ever got it up to. I have just redone my overclock on the motherboard so I am not sure if it is the new voltage settings I am running or the drivers or both but I am happy to take it.


----------



## GeneO

Quote:


> Originally Posted by *gtbtk*
> 
> My micron memory on both this driver and the .49 version is running pretty stable at 9288Mhz which is the best I have ever got it up to. I have just redone my overclock on the motherboard so I am not sure if it is the new voltage settings I am running or the drivers or both but I am happy to take it.


Better than I can do stable with Samsung - 9136 MHz, though I haven't tried to bump it up with this release - I expect I can't. Gonna try now


----------



## GeneO

Quote:


> Originally Posted by *gtbtk*
> 
> My micron memory on both this driver and the .49 version is running pretty stable at 9288Mhz which is the best I have ever got it up to. I have just redone my overclock on the motherboard so I am not sure if it is the new voltage settings I am running or the drivers or both but I am happy to take it.


Nah. I just tried and I can push +600 on the memory in Firestrike without an issue, but above +560 in Valley I get artifacts. I can't think of any way a driver could affect memory clock stability, so I expect it is the CPU OC voltage settings.


----------



## ucode

Quote:


> Originally Posted by *ChronoBodi*
> 
> So..... what's the difference of you applying your insane +1150 offset and you get a boost speed of 1911 mhz or otherwise capped to that speed, whereas mine could go higher the more i apply core offset, but obviously to Pascal wall limit it crashes.


When the 1050 Ti first came out, the Nvidia drivers capped both the GPU clock and the memory clock on it; it wasn't until a couple of months later (IIRC) that the cap was removed with a driver update.

A side effect of this was that by using a higher offset than the 1911MHz cap, one could eliminate frequency/temperature changes. Over a 40C range, the 1911MHz clock would remain 1911MHz. If the clock offset was set to only just achieve 1911MHz, then one would see the usual temperature-related frequency drops. So it looks like the driver itself, rather than the vBIOS, needs looking at to change the way Boost 3.0 reacts to temperature, if someone was trying to achieve that.
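That flat-clock side effect is easy to model: once the offset pushes every temperature bin's target past the cap, the min() against the cap wins everywhere. The base boost and per-bin penalty values below are made up purely for illustration, not real 1050 Ti figures:

```python
# Toy model of the capped-driver behavior described above: with a big
# enough offset, min(target, CAP) returns the cap at every temperature,
# so the delivered clock is flat across the whole temp range.
CAP_MHZ = 1911  # driver-imposed limit reported for the early 1050 Ti

def delivered_clock(base_boost, offset, temp_c):
    # illustrative assumption: ~13 MHz shed per 5C step above 40C
    penalty = max(0, (temp_c - 40) // 5) * 13
    return min(base_boost + offset - penalty, CAP_MHZ)
```

With a huge offset every temperature produces the cap; with an offset that only just reaches the cap at low temps, the usual stepped drops reappear as the card warms up.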


----------



## ChronoBodi

Quote:


> Originally Posted by *ucode*
> 
> When the 1050 Ti first came out, the Nvidia drivers capped both the GPU clock and the memory clock on it; it wasn't until a couple of months later (IIRC) that the cap was removed with a driver update.
> 
> A side effect of this was that by using a higher offset than the 1911MHz cap, one could eliminate frequency/temperature changes. Over a 40C range, the 1911MHz clock would remain 1911MHz. If the clock offset was set to only just achieve 1911MHz, then one would see the usual temperature-related frequency drops. So it looks like the driver itself, rather than the vBIOS, needs looking at to change the way Boost 3.0 reacts to temperature, if someone was trying to achieve that.


I admit GPU Boost 3.0 is annoying in that it doesn't hold a constant boost value like previous generations.

I can say for sure that my previous 980 Ti always did 1450MHz, no matter what; that was my OC.

Now, with the TXP and 1080, I have to say "yeah, it's 2038MHz, but really it's between 1930-1999MHz most of the time."

The question is whether this 1050 Ti cap could somehow be hacked in for other Pascal cards to achieve a constant value with no fluctuations.

That would require custom drivers or whatever it is.


----------



## ucode

It's a big driver; Boost 3.0 would have to annoy someone a lot for them to try to change its behavior.


----------



## asdkj1740

Quote:


> Originally Posted by *kevindd992002*
> 
> I'm trying to install the Accelero Hybrid cooler on my Zotac GTX 1070 AMP! Extreme card and I noticed that the pump interferes with a single component on the lower right side of the GPU die area. This cooler has an Asetek pump, so it's pretty much compatible with all Pascal cards.
> 
> The component I'm talking about is the one located on the left side of the bottom vRAM chip that is on the right side of the GPU die. Sorry for the confusing description, but I hope you get what I mean.
> 
> The lower right corner of the pump touches around 1/4 of that component. Now, I see a lot of people using a Kraken G10 with an Asetek pump as well, but I'm not sure if they've noticed this problem.
> 
> The G10+Asetek combo is pretty much the same as the Accelero Hybrid cooler. I'm just not sure about the spacing that the G10 gives.
> 
> Can anyone comment on this please? Thanks.



Can you point out which part you are talking about in this pic?

I am using the Arctic AIO cooler on my 1070. One thing you need to know is that the AIO pump does not need to be screwed down hard; once you feel some resistance you can stop tightening.

If you are talking about the PLL VRM, the inductor/choke really has some height that could block the Arctic cooler mounting.


----------



## kevindd992002

Quote:


> Originally Posted by *asdkj1740*
> 
> 
> Can you point out which part you are talking about in this pic?
> 
> I am using the Arctic AIO cooler on my 1070. One thing you need to know is that the AIO pump does not need to be screwed down hard; once you feel some resistance you can stop tightening.
> 
> If you are talking about the PLL VRM, the inductor/choke really has some height that could block the Arctic cooler mounting.




This is the exact board that I have. The one in the green box is what I'm referring to. It's also interesting that the red-circled area in my pic is empty, missing the two capacitors you have in the pic you posted above my post. It looks like there are two resistors there, but the black shade is the board itself, not components. Could that be an intentional design by Zotac? I got the pic from gtbtk's post and I have the same exact board. The PLL VRM area that you pointed out does not interfere with the cooler.

Which AC Hybrid cooler are you using? I have the 1st iteration of the Hybrid cooler, if that matters.

All along (with my GTX 670), I was tightening the 4 screws until the spacers wouldn't let me go any further. Is there a disadvantage to that? Could it be the cause of the GPU die contact problem in my other thread?


----------



## asdkj1740

Quote:


> Originally Posted by *kevindd992002*
> 
> 
> 
> This is the exact board that I have. The one in the green box is what I'm referring to. It's also interesting that the red-encircled area in my pic is empty and is missing the two capacitors that you have in the pic you posted above my post. It looks like there are two resistors there, but the black shade is the board itself, not components. Could that be an intentional design choice by Zotac? I got the pic from gtbtk's post and I have the same exact board. The PLL VRM area that you pointed out does not interfere with the cooler.
>
> Which AC Hybrid Cooler are you using? I have the 1st iteration of the Hybrid Cooler, if that matters.
>
> All along (with my GTX 670), I tightened the 4 screws all the way until the spacers wouldn't let me go further. Is there a disadvantage to that? Could it be the cause that led to my GPU die contact problem in my other thread?


the pic i posted above is the zotac 1080 amp extreme; yours should be the 1070 amp extreme. these two are pretty much the same.
and you can see that the two missing components on your 1070 should be sp caps, just like the common ones.

i would suggest mounting your arctic aio on your 1070 anyway, to see how the gpu temp performs. if it is poor then you can buy a nzxt g10 to see whether the problem can be avoided with the g10.

i have remounted my hybrid 140mm aio several times and found there is no need to screw it down too tight. you can also check the gamersnexus rx480 hybrid video, where they also say there is no need to screw too tightly.


----------



## asdkj1740

but your case seems to fail no matter how you mount your aio on the gpu, as the cap simply blocks your aio.

a few methods you can consider:
1. buy a copper shim to compensate for the height of the cap, so that your aio can make full contact with the gpu.
2. buy a nzxt g10 to see whether the problem is solved, as the g10 uses a different mounting method, but i doubt this will work.
3. cut off the affected corner of the aio. first check which mounting hole on the aio you need, and where that hole sits relative to the cap. don't cut right away, because you may cut out a hole you still need. this will void the warranty on your aio, so do it at your own risk.

maybe you should go back to the amp extreme cooler, as it is very good compared to other 1070 coolers. or you could sell yours and buy an old-gen asetek aio with a nzxt g10.


----------



## asdkj1740

this is a design fault by arctic; arctic does not leave enough space for the potential components around the gpu socket area...
maybe you should send an email to arctic to see what they recommend you do


----------



## matti2

Still getting DX errors in BF1. I know there's a thread for this, but anyway..
Default clock speeds and full fan speed, but no help.


----------



## kevindd992002

Quote:


> Originally Posted by *asdkj1740*
> 
> but your case seems to fail no matter how you mount your aio on the gpu, as the cap simply blocks your aio.
>
> a few methods you can consider:
> 1. buy a copper shim to compensate for the height of the cap, so that your aio can make full contact with the gpu.
> 2. buy a nzxt g10 to see whether the problem is solved, as the g10 uses a different mounting method, but i doubt this will work.
> 3. cut off the affected corner of the aio. first check which mounting hole on the aio you need, and where that hole sits relative to the cap. don't cut right away, because you may cut out a hole you still need. this will void the warranty on your aio, so do it at your own risk.
>
> maybe you should go back to the amp extreme cooler, as it is very good compared to other 1070 coolers. or you could sell yours and buy an old-gen asetek aio with a nzxt g10.


Actually I was experimenting with it yesterday and I was able to mount it even though there is an interfering cap. I even tightened the screws all the way down until the spacers wouldn't let me go further. It seems to be a snug fit, and I'm thinking the cap is receiving pressure from the mounting holes, which is why I asked the question in the first place. But if I don't screw the screws all the way down (as you said, only until I feel resistance), I would bet there would be enough space between the cap and the mounting hole of the pump. I'll have to try this on Monday and report back.

I was about to consider cutting the affected corner area of the AIO, but I figured that would be my last resort. I also don't think the G10 will solve it, as the mounting hole will still interfere no matter what. BUT if the mounting method of the G10 produces the same pressure as Arctic's when not fully tightened, then that would do it. But then I wouldn't need the G10 at all and could just stick with the Arctic method of not fully tightening it.

Let me check the gamersnexus rx480 video.

A couple of questions:

1.) What Arctic AIO do you use? Why is yours not interfering with the cap in question?

2.) Also before I even contact Arctic regarding this, can you confirm if the Zotac GTX 1070 AMP! Extreme uses a reference board design or not? This is because Arctic's fine print only caters to reference boards.

3.) Can you comment on my other thread regarding my GTX 670 issue?


----------



## EDK-TheONE

Quote:


> Originally Posted by *matti2*
> 
> Still getting DX errors in BF1. I know there's a thread for this, but anyway..
> Default clock speeds and full fan speed, but no help.


To fix this issue, change post-processing to medium and effects to high.


----------



## asdkj1740

Quote:


> Originally Posted by *kevindd992002*
> 
> Actually I was experimenting with it yesterday and I was able to mount it even though there is an interfering cap. I even tightened the screws all the way down until the spacers wouldn't let me go further. It seems to be a snug fit, and I'm thinking the cap is receiving pressure from the mounting holes, which is why I asked the question in the first place. But if I don't screw the screws all the way down (as you said, only until I feel resistance), I would bet there would be enough space between the cap and the mounting hole of the pump. I'll have to try this on Monday and report back.
>
> I was about to consider cutting the affected corner area of the AIO, but I figured that would be my last resort. I also don't think the G10 will solve it, as the mounting hole will still interfere no matter what. BUT if the mounting method of the G10 produces the same pressure as Arctic's when not fully tightened, then that would do it. But then I wouldn't need the G10 at all and could just stick with the Arctic method of not fully tightening it.
> 
> Let me check the gamersnexus rx480 video.
> 
> A couple of questions:
> 
> 1.) What Arctic AIO do you use? Why is yours not interfering with the cap in question?
> 
> 2.) Also before I even contact Arctic regarding this, can you confirm if the Zotac GTX 1070 AMP! Extreme uses a reference board design or not? This is because Arctic's fine print only caters to reference boards.
> 
> 3.) Can you comment on my other thread regarding my GTX 670 issue?


what i mean is that in the normal case you don't need to screw all the way down. i have tried both and the temps are the same. some resistance is okay; it doesn't need to be fully tight.

oh sorry, i just checked my ftw 1070 and fe 1070; the pll vrm components (mosfet, choke, sp cap) are the same, placed in the same direction, and located in the same area. so if yours is affected by that sp cap then mine should have the same problem. but my ftw 1070 with the arctic hybrid 140 iii is perfectly fine with this. the temp is very good at ~50c.

did you mount the four screws evenly using the cross method? could you take some photos of that?

if you screw all the way down then the contact of the aio copper base with the gpu die can't be even, because of that sp cap. but i have tried this too and the temp also looks good at 50c. i don't really understand your situation...


----------



## kevindd992002

Quote:


> Originally Posted by *asdkj1740*
> 
> what i mean is that in the normal case you don't need to screw all the way down. i have tried both and the temps are the same. some resistance is okay; it doesn't need to be fully tight.
>
> oh sorry, i just checked my ftw 1070 and fe 1070; the pll vrm components (mosfet, choke, sp cap) are the same, placed in the same direction, and located in the same area. so if yours is affected by that sp cap then mine should have the same problem. but my ftw 1070 with the arctic hybrid 140 iii is perfectly fine with this. the temp is very good at ~50c.
>
> did you mount the four screws evenly using the cross method? could you take some photos of that?
>
> if you screw all the way down then the contact of the aio copper base with the gpu die can't be even, because of that sp cap. but i have tried this too and the temp also looks good at 50c. i don't really understand your situation...


I see. Yes, I did use the cross method to mount the pump. I'll take a photo on Monday and post it.

I asked the question because I'm worried about the pressure it could put on that single cap. I was experimenting with different tightening pressures yesterday but never got to actually install the card in my system, because of the interference issue I encountered. I think over-tightening the screws until they won't turn anymore tends to warp the GPU board and could lead to other problems like the one in my other thread.


----------



## asdkj1740

Quote:


> Originally Posted by *kevindd992002*
> 
> I see. Yes, I did use the cross method to mount the pump. I'll take a photo on Monday and post it.
>
> I asked the question because I'm worried about the pressure it could put on that single cap. I was experimenting with different tightening pressures yesterday but never got to actually install the card in my system, because of the interference issue I encountered. I think over-tightening the screws until they won't turn anymore tends to warp the GPU board and could lead to other problems like the one in my other thread.


have you ever tried putting the card in your rig to see what the temps are?


----------



## b0uncyfr0

Hi guys; thinking of grabbing a 1070 Gaming X but i'm worried about a few things:

Cooling/fan noise in an FT02 (vertical position + vertical heatpipes in the Frozr aren't the ideal setup, i assume)
Any coil whine? - i hate it.
I'm aware of the Micron debacle - so i'm prepared for it if the memory is Micron.
What are the chances that i can't hit a 1900 clock? Ideally i'm aiming for 2000.

Thanks.


----------



## xGeNeSisx

Quote:


> Originally Posted by *gtbtk*
> 
> My micron memory on both this driver and the .49 version is running pretty stable at 9288Mhz which is the best I have ever got it up to. I have just redone my overclock on the motherboard so I am not sure if it is the new voltage settings I am running or the drivers or both but I am happy to take it.


I have noticed this also! I have now achieved my best performance to date at 9000MHz on memory and a 2038MHz core @ 1.043V. The new drivers are playing a large role in this imo. The difference in the memory OC is especially noticeable when using Heaven with DSR at 2160x1620. Still on the OC Strix bios on my G1 and loving it









I did screw around with PLL volts, but it didn't improve performance. In fact I feel something akin to latency when upping PLL past 1 or 2 steps. It completely changed how the mouse handles on the desktop in a very strange way. I think it's introducing jitter in the frequency and causing the odd behavior.

One question I do have is the voltage on DMI. Couldn't find much info; direct media interface (the CPU/PCH link), I'm guessing? Any info would be greatly appreciated.

Also wondering if dropping the multiplier and using a higher BCLK to achieve better performance would be worth it. A system agent clock at 1GHz improves PCIe performance vs the 800MHz when Skylake first launched. It is a small difference, but it could add up if clocked to 1400MHz SA (I had a 6400 that I hit this on, clocking to 4.4GHz with BCLK).

Thanks again!

BTW guys, the Micron volt-controller bug was *fixed*. Can we stop rehashing this over and over?

Also, posts stating "I got +100 on the core/memory" mean absolutely nothing. They give us no information, since boost clocks vary and cards also come overclocked at stock. It's a bit irritating, as it indicates the poster doesn't understand how Pascal works.


----------



## kevindd992002

Quote:


> Originally Posted by *asdkj1740*
> 
> have you ever tried putting the card in your rig to see what the temps are?


Nope, not yet. I'm planning to do so this coming week though.


----------



## Blackfyre

Sorry ladies and gents. I haven't been here since around page 370, up until then I was up to date.

I have an *MSI GTX 1070 Gaming X* (_one of the early versions with the Samsung memory_).

Any reason why I should update to the latest BIOS? I remember early on I did not update, and I am still using the ORIGINAL BIOS that came with my video card, which allows a +126% power limit in MSI Afterburner. I believe after that they decreased it to around 114% maximum with a BIOS update, which I never did.

Anyway, the official MSI site has a BIOS from November 2016. Should I update to that? Or any other one, in fact? Any benefits? Did they fix or add anything? There's no changelog or details on their site.

Also, have custom BIOSes been developed yet?

Sorry for these questions if they've been answered recently, and thank you to anyone who replies.


----------



## DeathAngel74

I need some advice. Let's say I want to cross-flash my eVGA 1070 SC 6173-KB/KR w/ Micron VRAM for a higher power limit/TDP. What options do I have? TIA


----------



## DeathAngel74

pointless rambling


----------



## jlhawn

Quote:


> Originally Posted by *Blackfyre*
> 
> Sorry ladies and gents. I haven't been here since around page 370, up until then I was up to date.
> 
> I have an *MSI GTX 1070 Gaming X* (_one of the early versions with the Samsung memory_).
> 
> Any reason why I should update to the latest BIOS? I remember early on I did not update, and I am still using the ORIGINAL BIOS that came with my video card, which allows a +126% power limit in MSI Afterburner. I believe after that they decreased it to around 114% maximum with a BIOS update, which I never did.
>
> Anyway, the official MSI site has a BIOS from November 2016. Should I update to that? Or any other one, in fact? Any benefits? Did they fix or add anything? There's no changelog or details on their site.
>
> Also, have custom BIOSes been developed yet?
> 
> Sorry for these questions if they've been answered recently, and thank you to anyone who replies.


Keep your current bios; I still have mine from when I bought my 1070 in the first week of release.
I too have Samsung memory; the bios update is for the Micron memory only.
http://www.pcgamer.com/msi-releases-geforce-gtx-1070-bios-update-to-fix-micron-memory-issue/


----------



## DeathAngel74

pointless rambling


----------



## ucode

GPU 1633MHz 800mV or GPU 1633MHz and Memory 800MHz?


----------



## DeathAngel74

syntax error


----------



## DeathAngel74

meh


----------



## spddmn24

Anyone ever figure out the power limit throttling on the MSI Gaming cards? 241 watts max in HWiNFO. Does the core only get its power from the 6- and 8-pin connectors, so it throttles that to 225 watts, and the ram + other stuff uses the PCIe slot?


----------



## MEC-777

Quote:


> Originally Posted by *spddmn24*
> 
> Anyone ever figure out the power limit throttling on the MSI Gaming cards? 241 watts max in HWiNFO. Does the core only get its power from the 6- and 8-pin connectors, so it throttles that to 225 watts, and the ram + other stuff uses the PCIe slot?


The whole card can pull more than what the PCIe connectors are rated for, and don't forget it also pulls up to 75W from the PCIe slot.









PCIe slot: +75W
6-pin: +75W
8-pin: +150W

Theoretical max = 300w.
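The budget above can be sketched as a quick sanity check. These are the PCIe spec ratings for each source, not what a given card will actually draw:

```python
# Theoretical PCIe power budget for a card with one 6-pin and one 8-pin connector.
# Values are the spec ratings quoted above; real cards can exceed them.
PCIE_SLOT_W = 75   # PCIe x16 slot
PIN6_W = 75        # 6-pin PEG connector
PIN8_W = 150       # 8-pin PEG connector

theoretical_max_w = PCIE_SLOT_W + PIN6_W + PIN8_W
print(theoretical_max_w)  # 300
```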


----------



## skupples

and the theories are quite conservative.


----------



## DeathAngel74

This is all quite confusing


----------



## khanmein

@DeathAngel74 *** 7211 RPM?


----------



## DeathAngel74

ROFL it was a glitch. Flashed the 1070 STRIX OC bios to 1070 SC


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> ROFL it was a glitch. Flashed the 1070 STRIX OC bios to 1070 SC


i haven't flashed & OC'd yet cos my i5-4460 would be a serious bottleneck! it's still acceptable for now at 1440p60 & i always make sure to turn on every AA.


----------



## spddmn24

Quote:


> Originally Posted by *MEC-777*
> 
> The whole card can pull more than what the PCIe connectors are rated for and don't forget it also pulls up to 75w from the PCIe slot.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PCIe slot: +75W
> 6-pin: +75W
> 8-pin: +150W
> 
> Theoretical max = 300w.


Yeah I know it should be able to pull 300 within the ratings of the connectors, but it seems to hit the cap around 225 watts. Just a random theory with no basis that the core only pulls from the 2 power connectors = 225 watts. I'm sure someone smarter than me can debunk that from the board layout if it's the case.


----------



## gtbtk

Quote:


> Originally Posted by *GeneO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> My micron memory on both this driver and the .49 version is running pretty stable at 9288Mhz which is the best I have ever got it up to. I have just redone my overclock on the motherboard so I am not sure if it is the new voltage settings I am running or the drivers or both but I am happy to take it.
> 
> 
> 
> Nah. I just tried and I can push +600 on memory in Firestrike without an issue, but above +560 in Valley and I get artifacts. I can't think of any way a driver could affect memory clocking stability, so I expect it is CPU OC voltage settings.

Have a play with your CPU overclock. I just redid mine from scratch and tried different settings and features, such as VRM spread spectrum instead of a fixed frequency, to see what would happen. I tried it with my old overclock settings and my vram memory performance improved. Further tuning got it to a point where I reached the limits of VRM spread spectrum, and it ended up being better to go back to a fixed-frequency VRM but with new vcore, VCCIO and CPU PLL voltage settings. I am still fine tuning things, as I think there is still a little more performance I can find.

Problem is that the OC guides are just step 1, step 2 etc, and none of them go into the details of what you are actually changing when making a particular voltage adjustment. I suspect it is because most people writing the guides don't really know themselves.

My hypothesis is that the different clocks - CPU, PCH etc - are all complementary with each other, and the components communicate by sending waveforms over the various IO buses that computers use between components. They all need to stay completely in phase with each other to get the best performance from GPUs, vram etc. The components do have a certain tolerance for variations in signalling, at the cost of performance. If one is off enough, the communication can drift out of phase far enough to exceed the tolerance needed for highly overclocked vram to stay stable, for example. I also suspect that the Micron memory may be less tolerant of out-of-phase signals than Samsung memory, which may explain why it can start off stable but begin showing artifacts after a period of time under load.

I think that because the vram and GPU are operating at such high frequencies compared to Maxwell and Kepler, the signalling phase tolerances have been reduced, and what we used to consider a stable overclock is not stable enough to cope with the smaller tolerances these Pascal cards operate at. I think the people who have "golden samples" that run at +800 memory overclocks, in addition to having a well made card, probably also have really well tuned CPU/memory overclocks and motherboards, rather than just a lucky card - but no-one seems to consider that, due to the "magical black box" nature of these devices and general ignorance of what is actually happening inside. The tendency is to just blame the "silicon lottery", "Micron memory", EVGA overheating, or whatever else is the excuse of the month, rather than delve deeper.


----------



## DeathAngel74

I figured everything out. Thank you to everyone who posted about cross-flashing these cards. My GTX 1070 SC now thinks it's an ASUS GTX 1070 OC. I was spoiled using PXOC; had to get used to MSI AB again.





Here's the skin if anyone wants it

msi_afterburner_nvidia_flat_skin__big_edition__by_grum_d-d8e.zip 225k .zip file


----------



## gtbtk

Quote:


> Originally Posted by *matti2*
> 
> Still getting DX errors in BF1. I know there's a thread for this, but anyway..
> Default clock speeds and full fan speed, but no help.


Motherboard vcore/overclock settings can cause driver-crash instability if they are not quite right. You can have a situation where it is not far enough out to BSOD, but enough to mess with the GPU. If you get the occasional 0x124 BSOD, too-low vcore is definitely causing your problem.

Firstly, I would suggest using DDU to completely remove the Nvidia drivers from your system and then doing a clean reinstall, if you have not done that recently.

Update the motherboard to the latest bios.

Try resetting your CPU/memory overclock and motherboard settings to absolute defaults and start tuning from scratch. Getting the vcore settings right helps stability and GPU performance. Readjust the load-line calibration so that vcore does not droop under load. If you are running in auto/offset CPU voltage mode, try setting a +0.010 offset to keep vcore and CPU VID voltages close, ideally avoiding vcore dropping below the VID voltage. If that doesn't help, keep trying up to a +0.050 offset in +0.010v steps. Keep an eye on your voltages and temperatures under load; bursts into the high 1.3v range for short periods should be OK if you have a half-decent cooler on your CPU. Idling or lower loads will show much lower vcore voltages.

HWiNFO64 is a great tool to monitor what is going on with temps, frequencies, voltages etc, and it will even integrate with the Afterburner on-screen display to present extra information during benchmarks.

These Pascal cards, due to their higher frequencies, seem to have tighter signalling tolerances than previous model cards, which can make what was considered a stable overclock before unstable now. I am also running z68; I just redid my overclock, found an extra 100MHz of vram overclock, and stopped a load of driver crashes.


----------



## DeathAngel74

Thank you for posting your findings @gtbtk and posting your results @xGeNeSisx


----------



## ucode

Quote:


> Originally Posted by *MEC-777*
> 
> The whole card can pull more than what the PCIe connectors are rated for and don't forget it also pulls up to 75w from the PCIe slot.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PCie slot +75w
> 6-pin +75w
> 8-pin +150w
> 
> Theoretical max = 300w.


The power split depends on the source resistances, so if the 6-pin and the PCIe slot are both pulling 75W each, the 8-pin is pulling 225W. Any one of these can be firmware limited, which in turn has a knock-on effect on the others, i.e. if the 8-pin were limited to 150W, the PCIe slot and 6-pin would only be able to draw 50W each.


----------



## DeathAngel74

I went from an 8726MHz memory overclock (eVGA bios) to an 8926MHz memory overclock (ASUS bios) - from +363MHz to +463MHz on Micron VRAM. Dropping the core -26MHz made it possible to overclock the VRAM a little more.
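For anyone converting offsets, here is a sketch of the arithmetic these numbers imply. The assumption (which matches the +363 -> 8726MHz and +463 -> 8926MHz figures above) is that the offset is applied to the ~4000MHz DDR clock, so the effective quad-pumped rate moves by twice the offset:

```python
def effective_mem_clock(offset_mhz, base_ddr_clock=4000):
    """GDDR5 effective ("quad-pumped") rate from an Afterburner-style offset.

    Assumes the offset is applied to the ~4000MHz DDR clock, consistent
    with the figures reported in the post above.
    """
    return 2 * (base_ddr_clock + offset_mhz)

print(effective_mem_clock(363))  # 8726
print(effective_mem_clock(463))  # 8926
```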


----------



## gtbtk

Quote:


> Originally Posted by *spddmn24*
> 
> Anyone ever figure out the power limit throttling on the MSI Gaming cards? 241 watts max in HWiNFO. Does the core only get its power from the 6- and 8-pin connectors, so it throttles that to 225 watts, and the ram + other stuff uses the PCIe slot?


The bios is coded with a max power limit of 289W at the 126% power limit setting in AB. That can be confirmed with the nvidia-smi.exe utility.

Having said that, I cannot get my card to pull more than about 106% (about 230W) before it starts to downclock, and I only see that with a 4K load (Firestrike Ultra). I have had it drawing 300W when I installed a Zotac Amp Extreme bios, but I didn't get any better performance from the card.

As I only have a 1920x1200 monitor, I have not spent time trying to figure out why it does that, because in my usage I never get up to 100% power load at 1200p. I suspect the actual power-limit downclock level is controlled by a particular voltage point on the curve. Not sure which point that may be, but it is probably one of the points around 0.800v, and it may vary depending on how the rest of the curve has been set.
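The slider percentage maps to an absolute limit by scaling the base TDP baked into the vbios. A minimal sketch, assuming a ~229W base TDP (the value that makes the 126% cap come out at the 289W figure above):

```python
def abs_power_limit(base_tdp_w, slider_percent):
    """Absolute power limit implied by an Afterburner-style slider setting.

    The base TDP is an assumption here: ~229.4W is what makes the 126%
    cap land on the 289W figure reported above (229.4 * 1.26 = 289.044).
    """
    return base_tdp_w * slider_percent / 100.0

print(round(abs_power_limit(229.4, 126)))  # 289
```

On a live system, `nvidia-smi -q -d POWER` reports the enforced, default, and maximum power limits directly, which is the confirmation method mentioned above.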


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> This is all quite confusing


You can still use Precision XOC if you want to. You will lose the kboost and auto-overclock features, but everything else is the same.

If you want Afterburner's functionality, you can run the Precision 16 skin on it, get kboost back via the skin's kboost button, and keep a familiar interface. You also get the benefit of the better curve editor that comes with Afterburner.


----------



## DeathAngel74

I like Afterburner much better. I just had to get used to the interface again.

Plus the skin matches my nVidia Desktop theme.


----------



## GeneO

Quote:


> Originally Posted by *gtbtk*
> 
> Have a play with your CPU overclock. I just redid mine from scratch and tried different settings and features, such as VRM spread spectrum instead of a fixed frequency, to see what would happen. I tried it with my old overclock settings and my vram memory performance improved. Further tuning got it to a point where I reached the limits of VRM spread spectrum, and it ended up being better to go back to a fixed-frequency VRM but with new vcore, VCCIO and CPU PLL voltage settings. I am still fine tuning things, as I think there is still a little more performance I can find.
> 
> Problem is that the OC guides are just step 1, step 2 etc and none of them go into the details of what you are actually changing making a particular voltage adjustment. I suspect it is because most people writing the guides dont really know themselves.
> 
> My hypothesis is that the different clocks - CPU, PCH etc - are all complementary with each other, and the components communicate by sending waveforms over the various IO buses that computers use between components. They all need to stay completely in phase with each other to get the best performance from GPUs, vram etc. The components do have a certain tolerance for variations in signalling, at the cost of performance. If one is off enough, the communication can drift out of phase far enough to exceed the tolerance needed for highly overclocked vram to stay stable, for example. I also suspect that the Micron memory may be less tolerant of out-of-phase signals than Samsung memory, which may explain why it can start off stable but begin showing artifacts after a period of time under load.
> 
> I think that because the vram and GPU are operating at such high frequency levels compared to Maxwell and Kepler, the signalling phase tolerances have been reduced and what we used to consider a stable overclock is not stable enough to cope with the smaller tolerances these pascal cards are operating at. I think that the people who have "golden samples" that run at +800 memory over clocks, in addition to having a well made card, probably also have really well tuned CPU/memory over clocks and motherboards rather than just getting a lucky card but no-one seems to consider that due to the "magical black box" nature of these devices and general ignorance as to what is actually happening inside. I think the tendency is to just blame the "silicon lottery", "Micron memory", EVGA overheating or whatever else is excuse of the month rather than delve deeper.


Why would I? I OC by multiplier, not by BCLK. That kind of CPU OC should have no influence on the GPU OC, and I have never seen any evidence that it does.

On my MB (and it is not just mine), VRM spread spectrum does not behave well - waking from sleep hangs the computer and it takes a complete power cycle to recover.

PCI-E is a separate bus from the internal ones used by the CPU, but it runs off the same clock, so there should be no phase differences. The only influence they could have on each other would be introducing common noise, or introducing power fluctuations or droops on the PCI-E bus. I guess I just don't think your conjectures necessarily represent the way the computer works.


----------



## gtbtk

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Hi guys; thinking of grabbing a 1070 Gaming X but i'm worried about a few things:
>
> Cooling/fan noise in an FT02 (vertical position + vertical heatpipes in the Frozr aren't the ideal setup, i assume)
> Any coil whine? - i hate it.
> I'm aware of the Micron debacle - so i'm prepared for it if the memory is Micron.
> What are the chances that i can't hit a 1900 clock? Ideally i'm aiming for 2000.
>
> Thanks.


Vertical position should not be an issue; heat pipes will still conduct heat through the pipe and cause a phase change in the contained fluid. The fans are very quiet.

Mine does not have coil whine and the MSI cards do not have a reputation for it like the Gigabyte G1 cards do.

There is no Micron debacle. There was a memory-controller bug in the original core vbios code, written by nvidia and then modified and distributed by the vendors, that affected Micron memory's ability to overclock, but that was resolved months ago. As long as you have the updated bios, it is a non-issue.

Running at 2050-2100 with a +400 to +500 memory overclock is pretty standard. You can get higher if you overclock only on the voltage curve.

Really poor performance is more often related to poor motherboard config settings than to the "silicon lottery".


----------



## gtbtk

Quote:


> Originally Posted by *GeneO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Have a play with your CPU overclock. I just redid mine from scratch and tried different settings and features, such as VRM spread spectrum instead of a fixed frequency, to see what would happen. I tried it with my old overclock settings and my vram memory performance improved. Further tuning got it to a point where I reached the limits of VRM spread spectrum, and it ended up being better to go back to a fixed-frequency VRM but with new vcore, VCCIO and CPU PLL voltage settings. I am still fine tuning things, as I think there is still a little more performance I can find.
> 
> Problem is that the OC guides are just step 1, step 2 etc and none of them go into the details of what you are actually changing making a particular voltage adjustment. I suspect it is because most people writing the guides dont really know themselves.
> 
> My hypothesis is that the different clocks - CPU, PCH etc, are all complimentary with each other and the communicate by signalling by sending a wave form over the various IO buses that computers use to communicate between components. They all need to stay completely in phase with each other to get the best performance from GPUs and Vram etc. The components do have a certain level of tolerance for the variations in signalling at the cost of performance. If one is off enough, the communication can drift out of phase enough to exceed the tolerance levels needed for high overclocked vram to stay stable for example. I also suspect that the Micron memory may be less tolerant to out of phase signals than Samsung memory and may explain why it can start off stable but start showing artifacts after a period of time under load.
> 
> I think that because the vram and GPU are operating at such high frequency levels compared to Maxwell and Kepler, the signalling phase tolerances have been reduced and what we used to consider a stable overclock is not stable enough to cope with the smaller tolerances these pascal cards are operating at. I think that the people who have "golden samples" that run at +800 memory over clocks, in addition to having a well made card, probably also have really well tuned CPU/memory over clocks and motherboards rather than just getting a lucky card but no-one seems to consider that due to the "magical black box" nature of these devices and general ignorance as to what is actually happening inside. I think the tendency is to just blame the "silicon lottery", "Micron memory", EVGA overheating or whatever else is excuse of the month rather than delve deeper.
> 
> 
> 
> Why would I? I OC by multiplier, not by BCLK. That kind of CPU OC should have no influence on the GPU OC, and I have never seen any evidence of such.
> 
> On my MB (and it is not just mine), VRM spread spectrum does not behave well - waking from sleep hangs the computer and it takes a complete power cycle to recover.
> 
> PCI-E is a separate bus from the internal ones used by the CPU, but it runs off the same clock, so there should be no phase differences. The only influence they could have over each other would be introducing common noise or power fluctuations or drops on the PCI-E bus. I guess I just don't think your conjectures necessarily represent the way the computer works.

You don't have to do anything; I only suggested that you might like to try it. I just related my findings and the related performance improvement I got when I rethought my overclocking approach, which had not changed since I was using a GTX 660. If you want to try experimenting you may see a benefit; then again, you may not. My hypothesis is only that: a reasoned explanation based on personal observation and what I know about this kind of electronics. I only have physical experimentation and observation; I don't have a scope or other test equipment available to actually test the theory or produce detailed reference data for a technical white paper. I am trying to share a discovery to give you an opportunity to possibly get more performance from your hardware if you want it.

CPU spread spectrum causes instability on my motherboard. VRM spread spectrum does not seem to do so. The bios notes for that VRM setting do say it aids stability, whereas the notes for the CPU spread spectrum setting do say it is not good for overclocking. They are two different settings. The VRM spread spectrum option is not available/visible on my motherboard if you have the extreme phase setting enabled. It had been so long that I had forgotten it existed.

CPU PLL voltage controls voltage drift and the synchronization of the CPU clock with the central base clock that resides in the PCH. DDR memory has two clock rates, one double the other, that are supposed to sync at start-up, but if voltages are too high or too low they can affect that synchronization. Voltages that are too high can, in some settings, create "noise" that interferes with component signalling and reduces performance. How, without trying various setting combinations, do you know for sure that the motherboard vendor has set the default values (which by definition are chosen to match the lowest common denominator of hardware tolerances) to exactly match the variable tolerances of the components on your motherboard?


----------



## GeneO

Quote:


> Originally Posted by *gtbtk*
> 
> You don't have to do anything; I only suggested that you might like to try it. I just related my findings and the related performance improvement I got when I rethought my overclocking approach, which had not changed since I was using a GTX 660. If you want to try experimenting you may see a benefit; then again, you may not. My hypothesis is only that: a reasoned explanation based on personal observation and what I know about this kind of electronics. I only have physical experimentation and observation; I don't have a scope or other test equipment available to actually test the theory or produce detailed reference data for a technical white paper. I am trying to share a discovery to give you an opportunity to possibly get more performance from your hardware if you want it.
> 
> CPU spread spectrum causes instability on my motherboard. VRM spread spectrum does not seem to do so. The bios notes for that VRM setting do say it aids stability, whereas the notes for the CPU spread spectrum setting do say it is not good for overclocking. They are two different settings. The VRM spread spectrum option is not available/visible on my motherboard if you have the extreme phase setting enabled. It had been so long that I had forgotten it existed.
> 
> CPU PLL voltage controls voltage drift and the synchronization of the CPU clock with the central base clock that resides in the PCH. DDR memory has two clock rates, one double the other, that are supposed to sync at start-up, but if voltages are too high or too low they can affect that synchronization. Voltages that are too high can, in some settings, create "noise" that interferes with component signalling and reduces performance. How, without trying various setting combinations, do you know for sure that the motherboard vendor has set the default values (which by definition are chosen to match the lowest common denominator of hardware tolerances) to exactly match the variable tolerances of the components on your motherboard?


Again, you are doing a BCLK overclock, which will affect the PCI-E bus and the GPU and may require diddling with the PLL and other things. A BCLK OC affects many more things than a multiplier OC does.

The Asus Hero motherboard's VRM spread spectrum does not function correctly.

The PCH has nothing to do with PCI-E, which is on-die.

PLL is relevant to PCI-E only if you are overclocking BCLK, which PCI-E uses as its reference clock. If you don't change the BCLK, then the PLL for the PCI-E should be the same (at any multiplier).

You are asking me about the tolerances of something that, as far as I can see, you have no evidence has anything to do with GPU OC instability. Without some kind of explanation other than hand-waving with technical terms, how can I answer that?

Look, all I am saying is that the whole reason Intel introduced multipliers was so they could change core frequencies (turbo) without affecting peripherals and other components.


----------



## tranceaddict

I recently upgraded from a GTX 780 to a Gigabyte GTX 1070 G1 and am having major issues playing BF1. Running a 3570K @ 4.6GHz and 8 GB of 2133MHz RAM, latest Nvidia drivers. The game keeps crashing after 1 to 3 rounds, and this is the message I get.

(attached screenshot: 20170205_041930.jpg)

Please help me, as I am so fed up that I'm about to get rid of the card.


----------



## asdkj1740

Quote:


> Originally Posted by *DeathAngel74*
> 
> I like Afterburner much better. I just had to get used to the interface again.
> 
> Plus the skin matches my nVidia Desktop theme.


I hate Nvidia a lot; Nvidia has started stopping us from reselling the bundled game...


----------



## EDK-TheONE

Quote:


> Originally Posted by *tranceaddict*
> 
> I recently upgraded from a GTX 780 to a Gigabyte GTX 1070 G1 and am having major issues playing BF1. Running a 3570K @ 4.6GHz and 8 GB of 2133MHz RAM, latest Nvidia drivers. The game keeps crashing after 1 to 3 rounds, and this is the message I get.
> 
> (attached screenshot: 20170205_041930.jpg)
> 
> 
> Please help me, as I am so fed up that I'm about to get rid of the card.


Set post processing to medium and effects to high.


----------



## icold

Try this: https://www.microsoft.com/pt-br/download/details.aspx?id=35


----------



## tranceaddict

Thanks guys, but I figured it out. When I got the new card I bumped the settings to Ultra and became starved for RAM, having only 8 GB. The game then started using more virtual memory, and I had my pagefile set to only 1 GB. I increased it to 12 GB and all is well. Played hours straight without a single crash.
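For anyone else hitting this: a rough way to sanity-check pagefile size is peak commit demand minus physical RAM, plus some headroom. A quick sketch; the numbers are just this thread's example, not a rule, and `peak_commit_gb` is whatever Task Manager's commit charge peaks at for you.

```python
def suggested_pagefile_gb(peak_commit_gb: float, ram_gb: float,
                          headroom_gb: float = 4.0) -> float:
    """Pagefile needed to back peak commit charge beyond physical RAM.

    Commit charge must fit within RAM + pagefile, so size the pagefile
    to cover the shortfall plus headroom. Figures are illustrative.
    """
    return max(0.0, peak_commit_gb - ram_gb) + headroom_gb

# BF1 at Ultra on an 8 GB system: a ~16 GB peak commit charge needs
# roughly a 12 GB pagefile, which matches the fix above; a 1 GB
# pagefile leaves the commit limit far too low and the game crashes.
print(suggested_pagefile_gb(16, 8))
```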


----------



## b0uncyfr0

Quote:


> Originally Posted by *gtbtk*
> 
> Vertical position should not be an issue, heat pipes will still conduct heat through the pipe and cause a phase change in the contained fluid. Fans are very quiet
> 
> Mine does not have coil whine and the MSI cards do not have a reputation for it like the Gigabyte G1 cards do.
> 
> There is no Micron debacle. There was a memory controller bug in the original core vbios code that was written by nvidia and then modified and distributed by the vendors that had an effect on Micron memory ability to overclock, but that was resolved months ago. As long as you have the updated bios, it is a non issue.
> 
> Running at 2050 - 2100 with a +400 to +500 memory overclock is pretty standard. you can get higher if you overclock only on the voltage curve.
> 
> really poor performance is more related to poor motherboard config settings than it is "silicon lottery"


Thanks, mate. I've been playing with the card for a few hours now and so far it's alright. I still have a few issues I'm worried about, though:

1) What is up with the Manage 3D section? Adding a program crashes NV settings. Is this a known bug?
2) Here's a quick Firestrike run. How's it looking? http://www.3dmark.com/3dm/17812201?
3) I've applied a small overclock (+150 on the core without any power/voltage adjustments) and can't see any artifacts. Think I might have a good card on my hands.
4) GPU-Z reports the boost with the OC as 1922, but it's around 2088 according to Firestrike. Which one should I trust?

Apologies for the million questions, but I've never had an NV card with the boost tech. My last NV card was a 570.


----------



## outofmyheadyo

nevermind found it !


----------



## ChronoBodi

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Thanks, mate. I've been playing with the card for a few hours now and so far it's alright. I still have a few issues I'm worried about, though:
> 
> 1) What is up with the Manage 3D section? Adding a program crashes NV settings. Is this a known bug?
> 2) Here's a quick Firestrike run. How's it looking? http://www.3dmark.com/3dm/17812201?
> 3) I've applied a small overclock (+150 on the core without any power/voltage adjustments) and can't see any artifacts. Think I might have a good card on my hands.
> 4) GPU-Z reports the boost with the OC as 1922, but it's around 2088 according to Firestrike. Which one should I trust?
> 
> Apologies for the million questions, but I've never had an NV card with the boost tech. My last NV card was a 570.


What GPU-Z reports and what you actually get are different; this has been the case since Kepler.

2088 is a good boost speed for a 1070. The 1922 boost reported on GPU-Z's main page is a worst-case scenario for when thermals and workloads are really harsh, forcing the card to downclock to stay within its thermal or power limits.


----------



## gtbtk

Quote:


> Originally Posted by *GeneO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You don't have to do anything; I only suggested that you might like to try it. I just related my findings and the related performance improvement I got when I rethought my overclocking approach, which had not changed since I was using a GTX 660. If you want to try experimenting you may see a benefit; then again, you may not. My hypothesis is only that: a reasoned explanation based on personal observation and what I know about this kind of electronics. I only have physical experimentation and observation; I don't have a scope or other test equipment available to actually test the theory or produce detailed reference data for a technical white paper. I am trying to share a discovery to give you an opportunity to possibly get more performance from your hardware if you want it.
> 
> CPU spread spectrum causes instability on my motherboard. VRM spread spectrum does not seem to do so. The bios notes for that VRM setting do say it aids stability, whereas the notes for the CPU spread spectrum setting do say it is not good for overclocking. They are two different settings. The VRM spread spectrum option is not available/visible on my motherboard if you have the extreme phase setting enabled. It had been so long that I had forgotten it existed.
> 
> CPU PLL voltage controls voltage drift and the synchronization of the CPU clock with the central base clock that resides in the PCH. DDR memory has two clock rates, one double the other, that are supposed to sync at start-up, but if voltages are too high or too low they can affect that synchronization. Voltages that are too high can, in some settings, create "noise" that interferes with component signalling and reduces performance. How, without trying various setting combinations, do you know for sure that the motherboard vendor has set the default values (which by definition are chosen to match the lowest common denominator of hardware tolerances) to exactly match the variable tolerances of the components on your motherboard?
> 
> 
> 
> Again, you are doing a BCLK overclock, which will affect the PCI-E bus and the GPU and may require diddling with the PLL and other things. A BCLK OC affects many more things than a multiplier OC does.
> 
> The Asus Hero motherboard's VRM spread spectrum does not function correctly.
> 
> The PCH has nothing to do with PCI-E, which is on-die.
> 
> PLL is relevant to PCI-E only if you are overclocking BCLK, which PCI-E uses as its reference clock. If you don't change the BCLK, then the PLL for the PCI-E should be the same (at any multiplier).
> 
> You are asking me about the tolerances of something that, as far as I can see, you have no evidence has anything to do with GPU OC instability. Without some kind of explanation other than hand-waving with technical terms, how can I answer that?
> 
> Look, all I am saying is that the whole reason Intel introduced multipliers was so they could change core frequencies (turbo) without affecting peripherals and other components.

I am not asking you anything. I gave you some suggestions to try, and I was telling you my observations and the reasoning behind my theory. You have been arguing with me about it because what I have observed does not seem to match the "overclocking rule book".

I understand multipliers, and I understand BCLK overclocking. I also understand that manufacturing tolerances in electronics can cause variations in signalling between devices, even at a 100MHz BCLK, that are far enough apart to reduce performance but not far enough apart to make things cease to function.

I always had the understanding that spread spectrum was a bad thing when overclocking as well, and I never bothered to try it until the other day. When I did, I discovered that it didn't cause any of the problems I had been led to believe it would, and I was surprised. I have no idea whether it is stable on a Hero motherboard, because I don't have one, and no one ever tries it because the book says it is not a good thing to use.

You were complaining about limited vram overclocks, and I made some suggestions to try, with some reasoning behind them, that may help you if you choose to do some experimentation. I didn't guarantee you a fix or massive improvements. If you don't want to do anything, you don't have to, and your memory OC speed will stay as it is.

The overclocking tutorials and definitions you are quoting, while generally correct in describing the major functionality of the various voltages and settings, don't always give you the entire story, because I don't think anyone outside the engineers at the companies that designed the chips really has the whole story on the more obscure operations of computer components.


----------



## gtbtk

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Vertical position should not be an issue; the heat pipes will still conduct heat and drive the phase change in the contained fluid. The fans are very quiet.
> 
> Mine does not have coil whine, and the MSI cards do not have a reputation for it the way the Gigabyte G1 cards do.
> 
> There is no Micron debacle. There was a memory controller bug in the original core vbios code, written by Nvidia and then modified and distributed by the vendors, that hurt Micron memory's ability to overclock, but that was resolved months ago. As long as you have the updated bios, it is a non-issue.
> 
> Running at 2050-2100 with a +400 to +500 memory overclock is pretty standard. You can get higher if you overclock only on the voltage curve.
> 
> Really poor performance is more often related to poor motherboard config settings than to the "silicon lottery".
> 
> 
> 
> Thanks, mate. I've been playing with the card for a few hours now and so far it's alright. I still have a few issues I'm worried about, though:
> 
> 1) What is up with the Manage 3D section? Adding a program crashes NV settings. Is this a known bug?
> 2) Here's a quick Firestrike run. How's it looking? http://www.3dmark.com/3dm/17812201?
> 3) I've applied a small overclock (+150 on the core without any power/voltage adjustments) and can't see any artifacts. Think I might have a good card on my hands.
> 4) GPU-Z reports the boost with the OC as 1922, but it's around 2088 according to Firestrike. Which one should I trust?
> 
> Apologies for the million questions, but I've never had an NV card with the boost tech. My last NV card was a 570.

You should be able to find at least another 800-1000 points in the FS graphics score if you overclock your vram. +400 should be a stable starting point, and you could quite possibly manage +800 depending on the card.

You will get better core OC performance using the curve to overclock single points at the high end of the voltage curve rather than the core slider. 2100 should be achievable; 2152 is possible on some cards.
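If you want to hunt for your vram limit methodically rather than guessing, the usual loop looks like this sketch. `stability_test` is a placeholder for your real benchmark pass (Heaven, Firestrike, etc., checking for artifacts), not any real API.

```python
from typing import Callable

def find_max_offset(stability_test: Callable[[int], bool],
                    start: int = 400, step: int = 50,
                    limit: int = 800) -> int:
    """Walk the memory offset upward until a run fails, then back off.

    stability_test(offset) should run your real benchmark loop at that
    offset and return True only if it completes artifact-free.
    """
    best = 0
    offset = start
    while offset <= limit:
        if stability_test(offset):
            best = offset       # this offset passed; remember it
            offset += step      # and try the next step up
        else:
            break               # first failure: stop and keep `best`
    return best

# Pretend this card starts artifacting above +550:
print(find_max_offset(lambda off: off <= 550))
```

Run each step long enough to catch the delayed artifacts people report; a 30-second pass is not a stability test.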


----------



## GeneO

Quote:


> Originally Posted by *gtbtk*
> 
> You were complaining about limited vram overclocks, and I made some suggestions to try, with some reasoning behind them, that may help you if you choose to do some experimentation. I didn't guarantee you a fix or massive improvements. If you don't want to do anything, you don't have to, and your memory OC speed will stay as it is.
> 
> The overclocking tutorials and definitions you are quoting, while generally correct in describing the major functionality of the various voltages and settings, don't always give you the entire story, because I don't think anyone outside the engineers at the companies that designed the chips really has the whole story on the more obscure operations of computer components.


I wasn't complaining about anything, and I wasn't quoting any tutorials, only information from Intel on how the architecture is designed. Whatever, believe what you will.


----------



## outofmyheadyo

*Score: 6850 Graphics: 6901 CPU: 6577*
[email protected]
GTX 1070 @ +100 core / +800 mem
2x8GB 3200MHz RAM @ 14-14-14-34-2T

Ran a Time Spy; not too bad for the 1070.


----------



## DeathAngel74

Can I run the Palit Jetstream 1070 bios on my eVGA 1070 SC? The ASUS 1070 STRIX OC bios flashed fine; 24 hours, no adverse effects. I have one 8-pin and the max power is 225W, which makes sense: 75W from the PCIe slot and 150W from the 8-pin?


----------



## zipper17

Quote:


> Originally Posted by *tranceaddict*
> 
> Thanks guys, but I figured it out. When I got the new card I bumped the settings to Ultra and became starved for RAM, having only 8 GB. The game then started using more virtual memory, and I had my pagefile set to only 1 GB. I increased it to 12 GB and all is well. Played hours straight without a single crash.


The pagefile is important; I set mine to 15 GB on my SSD.

Hitman 2016 at 1440p max settings always peaked my pagefile at 12-13 GB.

I also upgraded my 8 GB to 16 GB; it's a lot better when you alt-tab from a game into the browser.

With only 8 GB, playing Hitman at max settings plus a browser with several tabs = out-of-memory error, lol.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> Can I run the Palit Jetstream 1070 bios on my eVGA 1070 SC? The ASUS 1070 STRIX OC bios flashed fine; 24 hours, no adverse effects. I have one 8-pin and the max power is 225W, which makes sense: 75W from the PCIe slot and 150W from the 8-pin?


Yes, that will work. Gainward and Palit are basically the same card, so you can use the Gainward equivalent.

The physical power limitations of the card are a function of the number of VRM phases and the associated MOSFETs. The SC card is basically a reference board with a 5-phase VRM that, with the reference MOSFETs, will supply up to about 250W.

A 6-pin will actually provide much more power than 150W, and the PCI-e slot can also provide more than 75W to the card. I know the Palit and Gainward 1080 cards are configured to pull 99W from the PCI-e slot; I'm not sure about the 1070 cards. The PCI-E power limit there is actually managed by a hidden bios setting.
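For reference, the nominal spec numbers behind that 225W figure add up like this. The 75W slot / 75W six-pin / 150W eight-pin values are the PCIe spec ratings; as the post above notes, real boards can be configured to pull more.

```python
# Nominal PCIe power budget per the spec ratings: 75W from the slot,
# 75W per 6-pin and 150W per 8-pin auxiliary connector. Actual boards
# can exceed these, as discussed above.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def nominal_board_power(six_pins: int = 0, eight_pins: int = 0) -> int:
    """Spec-sheet power budget for a card with the given connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(nominal_board_power(eight_pins=1))      # 1070 SC, one 8-pin
print(nominal_board_power(six_pins=1, eight_pins=1))  # e.g. a big 1080
```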


----------



## Quadrider10

Do u guys use shader cache? Is it enabled or disabled in global settings of Nvidia control panel?


----------



## khanmein

Quote:


> Originally Posted by *Quadrider10*
> 
> Do u guys use shader cache? Is it enabled or disabled in global settings of Nvidia control panel?


I always have it turned on.


----------



## gtbtk

Quote:


> Originally Posted by *Quadrider10*
> 
> Do u guys use shader cache? Is it enabled or disabled in global settings of Nvidia control panel?


The shader cache can help some, but not all, applications. It helps performance in 3DMark, but I believe disabling the shader cache for GTA V can help reduce stuttering.

It is on by default.


----------



## b0uncyfr0

Quote:


> Originally Posted by *gtbtk*
> 
> You will get better core OC performance using the curve to overclock single points at the high end of the voltage curve rather than the core slider. 2100 should be achievable; 2152 is possible on some cards.


Sorry, I don't understand this: using the curve to overclock single points at the high end of the voltage curve? This is not done through AB, is it? We really should have a 1070 overclocking guide for newbies like me.


----------



## gtbtk

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You will get better core OC performance using the curve to overclock single points at the high end of the voltage curve rather than the core slider. 2100 should be achievable; 2152 is possible on some cards.
> 
> 
> 
> Sorry, I don't understand this: using the curve to overclock single points at the high end of the voltage curve? This is not done through AB, is it? We really should have a 1070 overclocking guide for newbies like me.

If you read through this thread from the beginning, there are a number of posts that describe it.

Afterburner 4.3 gives you access to the new voltage curve feature that arrived with the Pascal cards. If you are running an earlier version, you need to upgrade.

If you open Afterburner and press CTRL-F, a new graph window will pop up on screen. The X axis is voltage and the Y axis is core frequency.

Click on any point along the curve, drag it to a new position, and then click Apply in the main window of AB to apply the overclock.
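Conceptually, that curve is just a set of (voltage, frequency) points, and the single-point trick flattens everything above your chosen point so the card holds that clock at that voltage. A toy model of the idea (not Afterburner's actual internals, and the sample numbers are made up):

```python
# Toy model of a Pascal voltage/frequency curve: a sorted list of
# (millivolts, MHz) points. Raising one point and clamping every
# higher-voltage point to the same frequency mimics the flattened
# curve a single-point overclock produces in Afterburner.
def pin_point(curve, mv, mhz):
    """Set the point at `mv` to `mhz` and flatten all points above it."""
    return [(v, mhz if v >= mv else f) for v, f in curve]

# Illustrative stock curve for a 1070 (values are made up):
stock = [(900, 1860), (1000, 1936), (1050, 1962), (1093, 1987)]
print(pin_point(stock, 1050, 2100))
```

The follow-up post about a card crashing at the 1.093V point is exactly this: the flattened clock was higher than that voltage could sustain.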


----------



## shilka

I finally got around to ordering a new video card after being forced to push it back time and time again.
I ended up ordering the Gigabyte GTX 1070 Extreme Gaming after I looked at and dismissed all the other cards.

Almost ordered an EVGA GTX 1070 FTW, but the VRM overheating problems turned me off that card.

Since I plan on testing the card anyway, would anyone want to know the results?
Not sure which games I plan on testing, but I think I have somewhere around 35 games I can test in total.

Some of these games are a bit old, but since I own a 1440p 165Hz monitor I really want to see if those old games can do 165 FPS at 1440p.
Probably not, but I plan on testing anyway.

I really should have bought a GTX 1070 half a year ago, but life got in the way, so better late than never I suppose.


----------



## icold

The curve didn't help in my case; I tried up to 2139MHz at the 1.093V point and it crashed. My card will only go up to 2126 with the stock bios. OC with the slider is quicker to set up, with the same result.


----------



## gtbtk

Quote:


> Originally Posted by *shilka*
> 
> I finally got around to ordering a new video card after being forced to push it back time and time again.
> I ended up ordering the Gigabyte GTX 1070 Extreme Gaming after I looked at and dismissed all the other cards.
> 
> Almost ordered an EVGA GTX 1070 FTW, but the VRM overheating problems turned me off that card.
> 
> Since I plan on testing the card anyway, would anyone want to know the results?
> Not sure which games I plan on testing, but I think I have somewhere around 35 games I can test in total.
> 
> Some of these games are a bit old, but since I own a 1440p 165Hz monitor I really want to see if those old games can do 165 FPS at 1440p.
> Probably not, but I plan on testing anyway.
> 
> I really should have bought a GTX 1070 half a year ago, but life got in the way, so better late than never I suppose.


Congrats. I think we all know about life getting in the way of things.

Why was only the Giga Xtreme acceptable? The EVGA "overheating problem" was a batch of faulty capacitors and had nothing to do with overheating. Having said that, the EVGA cooler is not the best design around, but it is still good enough to keep the temps below combustion levels.

It is always interesting to see what a card can do and compare notes. Heaven, Valley and the 3DMark benchmarks are probably the lowest common denominator, but game titles are also interesting, especially in a quest for 165fps at 1440p.

There is lots of information in this thread that can help with overclocking 1070s. I think that in your case the main area for finding more performance will be overclocking your vram; the high factory overclock does not leave that much headroom to push the core much further.

Make sure that your card has been updated with the latest vbios. It should have a version that starts with 86.04.50.00.XX, which you can check in GPU-Z. If it is running an 86.04.26.xx.xx or 86.00.1e.xx.xx bios, you can get an update from Gigabyte.


----------



## TUFinside

I thought I could post this here as well. I need some help picking thermal pads for my GPU; the card is a Zotac GTX 1070 Mini and I need the perfect thickness for the replacement pads. Best brand? I was thinking about Thermal Grizzly, whaddya think?


----------



## Inelastic

I'd be interested in seeing the kinds of gains in games people are getting by overclocking their cards. I spent a few hours fiddling with my card when I got it and got about a 9% increase in Heaven, but I never tried it out in the games I play; I just set it back to default and forgot about it. I currently game at 1080p and cap my games at 60fps. Even in The Witcher 3 my card doesn't get above 36C (custom water cooled), so my fans are always at low rpm, which is why I don't bother with the overclock right now.


----------



## shilka

Quote:


> Originally Posted by *gtbtk*
> 
> Congrats. I think we all know about life getting in the way of things.
> 
> Why was only the Giga Xtreme acceptable? The EVGA "overheating problem" was a batch of faulty capacitors and had nothing to do with overheating. Having said that, the EVGA cooler is not the best design around, but it is still good enough to keep the temps below combustion levels.
> 
> It is always interesting to see what a card can do and compare notes. Heaven, Valley and the 3DMark benchmarks are probably the lowest common denominator, but game titles are also interesting, especially in a quest for 165fps at 1440p.
> 
> There is lots of information in this thread that can help with overclocking 1070s. I think that in your case the main area for finding more performance will be overclocking your vram; the high factory overclock does not leave that much headroom to push the core much further.
> 
> Make sure that your card has been updated with the latest vbios. It should have a version that starts with 86.04.50.00.XX, which you can check in GPU-Z. If it is running an 86.04.26.xx.xx or 86.00.1e.xx.xx bios, you can get an update from Gigabyte.





Spoiler: List of games with built in benchmarks that i own



All of these games are in no particular order

Street Fighter IV
Lost Planet 2 (test A takes too long to run)
Resident Evil 5 (takes too long to run)
Resident Evil 6 (only score based)
Devil May Cry 4 (takes too long to run)
S.T.A.L.K.E.R.: Call of Pripyat and Clear Sky (takes too long to run)
Batman: Arkham City and Arkham Origins (don't own Arkham Knight)
Middle-earth: Shadow of Mordor
Thief
Bioshock Infinite
Crysis 1
Deus Ex: Mankind Divided
F.E.A.R. 1 (oldest game on the list, but hey, could be funny to test anyway)
Far Cry 2
Hard Reset
Metro 2033 and Last Light (not the Redux versions)
Section 8
Tom Clancy's H.A.W.X 1 and 2
Dirt 2 and 3 (don't own Dirt Rally)
Grid 1/2 and Autosport
Ashes of the Singularity
Company of Heroes 1 and 2
Total War: Shogun 2 (don't own Rome II, Attila or the new Warhammer Total War games)
World in Conflict
GTA IV (don't own GTA V)
Hitman Absolution and Hitman 2016
Just Cause 2
Sleeping Dogs
Tomb Raider (don't own Rise of the Tomb Raider)

And then there are Unigine Heaven 4.0 and Unigine Valley 1.0, and there is a new Unigine benchmark on the way.

Besides all of those games with built-in benchmarks, I also have these games on my wish list:

Mafia II (never got around to buying it)
Far Cry Primal
Rainbow Six Siege
Tom Clancy The Division


----------



## madweazl

In Black Desert with everything but the distances maxed out, I average an extra 4fps. The difference for me was occasionally dipping into the high 20s; with the overclock it is generally 31-34fps in busy areas, though I have seen 30 a few times. I haven't played anything else nearly as demanding, except maybe Deus Ex: Mankind Divided; I can't play that over 30fps with MSAA fully enabled (I think 4x is the max I can do, which gives about 45fps, but anything more and frame rates plummet with the rest of the settings maxed).


----------



## Curseair

Quote:


> Originally Posted by *shilka*
> 
> I finally got around to ordering a new video card after being forced to push it back time and time again.
> I ended up ordering the Gigabyte GTX 1070 Extreme Gaming after I looked at and dismissed all the other cards.
> 
> Almost ordered an EVGA GTX 1070 FTW, but the VRM overheating problems turned me off that card.
> 
> Since I plan on testing the card anyway, would anyone want to know the results?
> Not sure which games I plan on testing, but I think I have somewhere around 35 games I can test in total.
> 
> Some of these games are a bit old, but since I own a 1440p 165Hz monitor I really want to see if those old games can do 165 FPS at 1440p.
> Probably not, but I plan on testing anyway.
> 
> I really should have bought a GTX 1070 half a year ago, but life got in the way, so better late than never I suppose.


The Gigabyte Xtreme 1070 that I had not long ago could not even run at stock clocks, and it also ran hot for some reason compared to my previous EVGAs. I now have an Asus Strix OC 1070, and once above 65°C its fans sound like a vacuum cleaner. It can't overclock much either. I'm trying to hold out for the EVGA FTW2, but it's taking forever.

Which monitor have you got? Is it an IPS one?


----------



## shilka

Quote:


> Originally Posted by *Curseair*
> 
> The Gigabyte Xtreme 1070 that I had not long ago could not even run at stock clocks, and it also ran hot for some reason compared to my previous EVGAs. I now have an Asus Strix OC 1070, and once above 65°C its fans sound like a vacuum cleaner. It can't overclock much either. I'm trying to hold out for the EVGA FTW2, but it's taking forever.
> 
> Which monitor have you got? Is it an IPS one?


An Asus PG279Q, which is indeed IPS.

Edit: sounds like your Xtreme Gaming was broken or defective.


----------



## skupples

some of the modern MSI AB skins -

MSI MSSII MSSSIII - It really kicks the Llama's butt!


----------



## kevindd992002

What does the SP-Cap on the Zotac GTX 1070 AMP! Extreme, specifically the one that interferes with my AIO pump, actually stand for? What is its main purpose?

Also, is there still no way to increase the voltage on these Pascal cards like we could with Kepler?


----------



## abiliocdf

I found a bug with the bios from Galax/KFA2.


----------



## ucode

^^Which is?

Any reason the Unigine GPU OSD is covered by the AB control panel?


----------



## gtbtk

Quote:


> Originally Posted by *Curseair*
> 
> Quote:
> 
> 
> 
> Originally Posted by *shilka*
> 
> I finally got around to ordering a new video card after I have been forced to push it back time and time again.
> Ended up ordering the Gigabyte GTX 1070 Xtreme Gaming after I looked at and dismissed all the other cards.
> 
> Almost ordered an EVGA GTX 1070 FTW but the VRM overheating problems turned me off that card.
> 
> Since I plan on testing the card anyway, would anyone want to know the results?
> Not sure which games I plan on testing, but I think I have somewhere around 35 games I can test in total.
> 
> Some of these games are a bit old, but since I own a 1440p 165Hz monitor I really want to see if those old games can do 165fps at 1440p.
> Probably not, but I plan on testing anyway.
> 
> Really should have bought a GTX 1070 half a year ago, but life got in the way, so better late than never I suppose.
> 
> 
> 
> The Gigabyte Xtreme 1070 that I had not long ago could not even run at stock clocks, and it also ran hot for some reason compared to my previous EVGAs. I now have an Asus Strix OC 1070, and once above 65°C its fans sound like a vacuum cleaner. It can't overclock much either. I'm trying to hold out for the EVGA FTW2, but it's taking forever.
> 
> Which monitor have you got? Is it an IPS one?

EVGA 1070 cards, due to the way the power settings are configured in the BIOS, hit even the extended power limit quite easily, even at 1080p, and the core clock bounces around all over the place.

I am really quite happy with my MSI Gaming X/Quicksilver (same card, different colour shell), which I have flashed with the Gaming Z BIOS. The fans at 100% keep my card under 60°C at a heavy 1200p load and, while not silent, are reasonably quiet even at full speed.

You need to load the overclocked card with a 4K workload (Fire Strike Ultra) to get it to intermittently hit power limits and temporarily reduce clock speeds.


----------



## DeathAngel74

New hotfix driver 378.57 killed my overclock. It used to be 2101/4463(8926 effective).
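On the two memory numbers quoted above: GDDR5 transfers data on both edges of its I/O clock, so the "effective" figure is simply double the data-rate clock that monitoring tools in this thread report. A minimal sketch of that arithmetic (the doubling factor is the only assumption here; some tools report the quarter-rate memory clock instead):

```python
# GDDR5 is double-pumped on its I/O clock: the "effective" (marketing)
# rate is 2x the data-rate clock shown by the monitoring tool.
def effective_rate_mhz(reported_mhz):
    """Effective GDDR5 rate from a reported data-rate clock."""
    return reported_mhz * 2

print(effective_rate_mhz(4463))  # 8926, matching the "(8926 effective)" above
```

The stock 1070 figure works the same way: a reported 4004MHz is the advertised 8008MHz effective (8Gbps).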


----------



## pez

Quote:


> Originally Posted by *shilka*
> 
> I finally got around to ordering a new video card after I have been forced to push it back time and time again.
> Ended up ordering the Gigabyte GTX 1070 Xtreme Gaming after I looked at and dismissed all the other cards.
> 
> Almost ordered an EVGA GTX 1070 FTW but the VRM overheating problems turned me off that card.
> 
> Since I plan on testing the card anyway, would anyone want to know the results?
> Not sure which games I plan on testing, but I think I have somewhere around 35 games I can test in total.
> 
> Some of these games are a bit old, but since I own a 1440p 165Hz monitor I really want to see if those old games can do 165fps at 1440p.
> Probably not, but I plan on testing anyway.
> 
> Really should have bought a GTX 1070 half a year ago, but life got in the way, so better late than never I suppose.


A 970 -> 1070 is a very nice upgrade indeed. Similarly, my GF's setup is a 1070 pushing a 1440p display (only 144Hz, though). Congrats on the upgrade.
Quote:


> Originally Posted by *Curseair*
> 
> The Gigabyte Xtreme 1070 that I had not long ago could not even run at stock clocks, and it also ran hot for some reason compared to my previous EVGAs. I now have an Asus Strix OC 1070, and once above 65°C its fans sound like a vacuum cleaner. It can't overclock much either. I'm trying to hold out for the EVGA FTW2, but it's taking forever.
> 
> Which monitor have you got? Is it an IPS one?


Yeah, sounds like a DOA card. The early Xtreme Gaming cards had some serious QC issues that even a couple reviewers came across on their review samples.


----------



## b0uncyfr0

Hmm, should I flash the Gaming Z BIOS on my Gaming X? I'm purely interested in the OC potential.


----------



## DeathAngel74

Here you go


Pascal_flash.zip 2714k .zip file


----------



## tiramoko

What 1070 do you recommend?


----------



## pez

From what I recall when heavily researching at first, EVGA and MSI are the best as far as stock coolers go. However, if you plan on going under water, there's not much reason to do more than choose a well-performing card with block support. EVGA cards seem to be OK, but hit a speed bump because of the overheating issue. Apparently there's an 'ICX' cooler releasing very soon, so it might be worth waiting on those. ASUS cards are great, but the biggest criticism of them is that they recycled the cooler, and one of the direct heatpipes doesn't really make contact with the chip itself. Hell, if I could have fit it in my case, I would have gone with an MSI card.


----------



## EDK-TheONE

Quote:


> Originally Posted by *DeathAngel74*
> 
> Here you go
> 
> 
> Pascal_flash.zip 2714k .zip file


Could you provide me your Windows skin?


----------



## DeathAngel74

you will need some sort of uxtheme patcher for it to work though

https://drive.google.com/open?id=0B007JgCLgXQLbGQ4aENLTld0RFE

MSI AB skin

nvidiaflat_big.zip 226k .zip file


----------



## EDK-TheONE

Quote:


> Originally Posted by *DeathAngel74*
> 
> you will need some sort of uxtheme patcher for it to work though
> 
> https://drive.google.com/open?id=0B007JgCLgXQLbGQ4aENLTld0RFE
> 
> MSI AB skin
> 
> nvidiaflat_big.zip 226k .zip file


Thank you.


----------



## DeathAngel74

you're welcome


----------



## WillG027

Was tossing up between the MSI and the Strix.

I went with the MSI Gaming X.
If the BIOS editor ever comes out, it has two PCIe power connectors (8+6-pin, compared to the Strix's single connector).

Very quiet and keeps cool.


----------



## kevindd992002

Quote:


> Originally Posted by *kevindd992002*
> 
> What does the SP-Cap on the Zotac GTX 1070 AMP! Extreme, specifically the one that interferes with my AIO pump, actually stand for? What is its main purpose?
> 
> Also, is there still no way to increase the voltage on these Pascal cards like we could with Kepler?


Bump!


----------



## ucode

Google?
Quote:


> Ultra-low ESR, High Voltage Capacitors
> 
> Panasonic's industry leading SP-Cap™ Polymer Aluminum Capacitors, are surface mount (SMT) capacitors that utilize a conductive polymer as their electrolyte material in a layered aluminum design. SP-Cap™ capacitors are primarily used as input and output capacitors for DC/DC converters due to ultra-low ESR values, high voltage options, and the ability to withstand high reflow temperatures. Panasonic SP-Cap™ Polymer Aluminum Capacitors offer capacitance values up to 560µF, voltage values ranging from 2V to 25V, and are free from temperature drift. Discover all the benefits of our SP-Cap series.


As for Voltage mods, they have been detailed since practically the beginning of Pascal.


----------



## kevindd992002

Quote:


> Originally Posted by *ucode*
> 
> Google?
> As for Voltage mods, they have been detailed since practically the beginning of Pascal.


Thanks.

Yeah, I know about voltage mods but why didn't anyone care to create a Pascal BIOS Editor in the first place? Or is it just that the voltage is really locked?


----------



## ucode

Perhaps nobody wanted to pay for a Hulk certificate (a user key to flash a tweaked VBIOS). Rumor was they cost $50 apiece.

FWIW, I did voltage-mod a GTX 1050Ti to run a little over 1.3V. It didn't help a great deal with the OC, but it may be YMMV, with some chips doing better than others.


----------



## DanielB123

Just out of curiosity, why doesn't anyone recommend or mention the Zotac Amp Extreme? It has good cooling as far as I'm aware, is it just because it looks bonkers?


----------



## madmeatballs

Quote:


> Originally Posted by *DanielB123*
> 
> Just out of curiosity, why doesn't anyone recommend or mention the Zotac Amp Extreme? It has good cooling as far as I'm aware, is it just because it looks bonkers?


The 1070 AMP Extreme is a good card, but it isn't far from how the others perform; the Extreme will just give you the best out-of-the-box overclock. I have an AMP Extreme myself but decided to put a waterblock on it. It is happily stable at 2125MHz core and 9400MHz memory with no throttling, thanks to watercooling.


----------



## Handrox

Please, does someone have an nvflash that works with Pascal? Thank you.


----------



## DanielB123

2125 core seems nice; my card doesn't really like going higher on the core (at least by just increasing the slider in FireStorm) and will run at around 2025-2050MHz. On memory I can do an easy 9400-9600MHz without any issues.


----------



## madweazl

Not too bad for a Gigabyte motherboard that is reportedly not so good at RAM overclocking, paired with a Founders Edition.

Time Spy


Spoiler: Warning: Spoiler!





https://flic.kr/p/QH1mdD




Fire Strike


Spoiler: Warning: Spoiler!





https://flic.kr/p/RWXBFi


----------



## b0uncyfr0

My latest run with +150 core, +600 memory, 125% power, and stock voltage. I've adjusted the fan to hit around 75% when I'm in the mid 50s to prevent it from heating up any higher; I noticed the clock always drops as it goes past 57 degrees.

http://www.3dmark.com/3dm/17862225?

Good?


----------



## icold

Quote:


> Originally Posted by *b0uncyfr0*
> 
> My latest run with +200 core, +600 memory, 125% power, and stock voltage. I've adjusted the fan to hit around 75% when I'm in the mid 50s to prevent it from heating up any higher; I noticed the clock always drops as it goes past 57 degrees.
> 
> http://www.3dmark.com/3dm/17862225?
> 
> Good?


Put voltage on the max to hit 1.093v


----------



## b0uncyfr0

Quote:


> Originally Posted by *icold*
> 
> Put voltage on the max to hit 1.093v


The last time I tried this I got no benefit whatsoever, but I will try again.

Edit: Yep, 1.093v does not help me at all. Pushing +151 core takes my boost to 2125 and it simply can't handle it. It crashes every time; I thought it was the memory OC, but backing that down to 500 didn't make a difference. Initial clock was +150, not +200!


----------



## icold

Try +226 and max voltage; I think you can. That is my clock, @2126MHz.


----------



## madweazl

Quote:


> Originally Posted by *icold*
> 
> Try +226 and max voltage; I think you can. That is my clock, @2126MHz.


Wouldn't his equivalent be +176? If he can't do 151, why would he bump to 176?


----------



## icold

Quote:


> Originally Posted by *madweazl*
> 
> Wouldnt his equivalent be +176? If he cant do 151, why would he bump to 176?


Nope, my Strix is the non-OC version; it's equivalent to a Founders Edition.


----------



## madweazl

I must be missing something; assuming his Fire Strike results are correct, it ran at 2101mhz which I'm assuming was +150 for him. You're suggesting to increase that to +226 (an increase of 76mhz) which would place his core at 2177mhz, correct?


----------



## icold

Nope, on my GPU +226 = 2126MHz; 2177MHz is if your card has a stock OC.


----------



## b0uncyfr0

Quote:


> Originally Posted by *icold*
> 
> nope, in my GPU +226 = 2126MHZ, 2177mhz is if your card have stock OC.


You're not understanding. I don't have your card, and you can't generalize that all 1070s will have the same boost as yours with +226 in Afterburner. Why? Because your ROG has a base clock of 1687 (1860 boost) and my Gaming X has a default base of 1607 (1797).

See the difference...
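To put numbers on that point: the Afterburner offset is added to each card's own factory boost clock, so the same offset lands at different absolute clocks on different cards. A quick sketch using the rated boost clocks from the post above (real running clocks will differ further, since GPU Boost 3.0 adds its own temperature-dependent bins on top):

```python
# Same Afterburner offset, different cards: the offset is relative to
# each card's factory boost clock, not an absolute target.
cards = {
    "Asus Strix (1687 base)": 1860,    # rated boost, MHz
    "MSI Gaming X (1607 base)": 1797,  # rated boost, MHz
}
offset = 226  # MHz, the offset suggested above

for name, rated_boost in cards.items():
    print(f"{name}: {rated_boost} + {offset} = {rated_boost + offset} MHz")
```

So +226 puts the Strix roughly 63MHz higher in absolute terms than the Gaming X at the same slider setting.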


----------



## gtbtk

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Hmm, should I flash the Gaming Z BIOS on my Gaming X? I'm purely interested in the OC potential.


The only benefit is a small overclock to core and memory at boot, without having to load Afterburner. If you plan on permanently running an Afterburner OC profile 100% of the time, it won't make any difference.

The Gaming X and Z power limits are the same.


----------



## gtbtk

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ucode*
> 
> Google?
> As for Voltage mods, they have been detailed since practically the beginning of Pascal.
> 
> 
> 
> Thanks.
> 
> Yeah, I know about voltage mods but why didn't anyone care to create a Pascal BIOS Editor in the first place? Or is it just that the voltage is really locked?

Pascal doesn't really benefit from extra voltage the way Maxwell did. You can read this if you like: https://xdevs.com/guide/pascal_oc/#intro

Also, no one has cracked the encryption needed to sign a modified BIOS so that it can be flashed.


----------



## gtbtk

Quote:


> Originally Posted by *Handrox*
> 
> Please, someone has a nvflash that works with Pascal. Thank you


there you go https://1drv.ms/f/s!AplTNK-q9mm4iKlEeAoHeULWcLrWBw


----------



## Putzlappen

Love this card.

Cheap and powerful.


----------



## kevindd992002

Quote:


> Originally Posted by *gtbtk*
> 
> Pascal doesn't really benefit from extra voltage the way Maxwell did. You can read this if you like: https://xdevs.com/guide/pascal_oc/#intro
> 
> Also, no one has cracked the encryption needed to sign a modified BIOS so that it can be flashed.


Just read the last bit of that article. Does that mean that modifying the core voltage slider bar in MSI AB really doesn't do anything to OC potential?


----------



## GeneO

Quote:


> Originally Posted by *kevindd992002*
> 
> Just read the last bit of that article. Does that mean that modifying the core voltage slider bar in MSI AB really doesn't do anything to OC potential?


Yes, to a good extent, unless you have the cooling capacity to counteract the rise in temperature from the added voltage, i.e. water cooling. These chips will clock down with temperature; the article claims 2MHz per degree C on average.
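If that ~2MHz/°C figure holds, the downclocking can be sketched as stepping through the ~13MHz clock bins people report later in this thread. A rough model (the bin size and the starting temperature are assumptions for illustration, not Nvidia-documented values):

```python
# Back-of-the-envelope model of Pascal temperature compensation using
# the ~2 MHz/degC figure from the article. Bin size and start
# temperature are assumed, for illustration only.
BIN_MHZ = 13      # approximate GPU Boost 3.0 clock step (assumed)
MHZ_PER_C = 2.0   # average loss per degC, per the article
T_START_C = 37    # assumed temperature where compensation begins

def estimated_clock(cold_clock_mhz, temp_c):
    """Estimate the boost clock at temp_c, dropping whole bins only."""
    loss = max(0.0, temp_c - T_START_C) * MHZ_PER_C
    bins_dropped = int(loss // BIN_MHZ)
    return cold_clock_mhz - bins_dropped * BIN_MHZ

print(estimated_clock(2101, 35))  # 2101: below the threshold, no drop
print(estimated_clock(2101, 55))  # 2075: (55-37)*2 = 36 MHz -> 2 bins
```

This matches the shape of what owners describe (holding one clock when cool, dropping a bin or two in the 50s), but the exact thresholds clearly vary per card and BIOS.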


----------



## madmeatballs

Quote:


> Originally Posted by *GeneO*
> 
> Yes, to a good extent, unless you have the cooling capacity to counteract the rise in temperature from the added voltage, i.e. water cooling. These chips will clock down with temperature; the article claims 2MHz per degree C on average.


True, it downclocks as temperature increases.


----------



## kevindd992002

Quote:


> Originally Posted by *GeneO*
> 
> Yes, to a good extent, unless you have the cooling capacity to counteract the rise in temperature from the added voltage, i.e. water cooling. These chips will clock down with temperature; the article claims 2MHz per degree C on average.


Is there like a threshold temp when it starts to downclock? At least there was one (and it was very evident) with Keplers.


----------



## GeneO

See temperature compensation plot in this review:

http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/15

.


----------



## madmeatballs

Quote:


> Originally Posted by *kevindd992002*
> 
> Is there like a threshold temp when it starts to downclock? At least there was one (and it was very evident) with Keplers.


From what I've experienced my card starts to downclock at 42C.


----------



## DeathAngel74

My card drops one bin @42C from 2075mhz to 2063mhz. Not too bad I guess


----------



## Dude970

If you have idle temps in the 20s you will see a drop in the 30s too


----------



## GP104

How's this clock, can I be part of the club?

1070 FE / Samsung memory (purchased from nvidia on release day) +230 Core / +485 memory OC / stock cooler.


----------



## madweazl

The memory won't go higher than that? I was able to hit around 720 on the memory before going on water. The core looks about normal, but the memory typically goes quite a bit higher. Maybe try dropping the core down to 220 and pushing the memory higher, if you haven't tried yet?


----------



## GP104

I will try and report back; I haven't tweaked the OC on the card for a few months. I do remember last year I had memory at 600 but had to lower it to 485 to get rid of artifacts in Dying Light, even though I wasn't getting any artifacts in Heaven or 3DMark. I may play with it a little tomorrow.


----------



## ucode

Quote:


> Originally Posted by *kevindd992002*
> 
> Is there like a threshold temp when it starts to downclock? At least there was one (and it was very evident) with Keplers.


There's still a hard limit, but how much the clock scales down over a temperature range depends on how things are set up.

This is a 1050Ti with a constant 1911MHz GPU clock over a temperature range of 33°C to 74°C that was posted before. No downclocking at all.


And here's one of a 1080 with the clock increasing with temperature rather than decreasing. Not very practical, just for reference.


Pascal does clock better at lower temperatures, so one can see the logic in reducing the GPU clock as temperature rises. Whether it reduces too much is another question and will vary across chips.


----------



## JoeUbi

Just upgraded to a 7700k and scored 18194 on Firestrike. http://www.3dmark.com/fs/11652759 These Zotac cards are so beast-like.


----------



## ucode

Nice. What processor were you using before (a 6700k?), and how do the graphics scores compare between them (assuming the same GPU clocks)?


----------



## tiramoko

I'm just wondering if the EVGA SC Black Edition is a good 1070? I've never had an EVGA card before. I found one for $370 including tax (the tax kills it). How is this card for overclocking? I was going to get a 1080, but it's way too expensive for 5-10% more performance at 1440p.


----------



## pez

Quote:


> Originally Posted by *tiramoko*
> 
> I'm just wondering if the EVGA SC Black Edition is a good 1070? I've never had an EVGA card before. I found one for $370 including tax (the tax kills it). How is this card for overclocking? I was going to get a 1080, but it's way too expensive for 5-10% more performance at 1440p.


Not sure what you're playing exactly, but the 1080 is much better than '5-10%' over a 1070 at 1440p.


----------



## khanmein

Quote:


> Originally Posted by *tiramoko*
> 
> I'm just wondering if the EVGA SC Black Edition is a good 1070? I've never had an EVGA card before. I found one for $370 including tax (the tax kills it). How is this card for overclocking? I was going to get a 1080, but it's way too expensive for 5-10% more performance at 1440p.


The new EVGA SuperClocked 2 with the ICX cooler is releasing soon, within this month; it's a slight improvement but comes with a lower default core clock compared with the previous ACX 3.0.


----------



## headd

Quote:


> Originally Posted by *pez*
> 
> Not sure what you're playing exactly, but the 1080 is much better than '5-10%' over a 1070 at 1440p.


25-30% in most modern games released in the last 5-6 months.


----------



## tiramoko

Quote:


> Originally Posted by *pez*
> 
> Not sure what you're playing exactly, but the 1080 is much better than '5-10%' over a 1070 at 1440p.


My bad, I just read that in someone's thread; maybe it's much more.
Quote:


> Originally Posted by *khanmein*
> 
> The new EVGA SuperClocked 2 with the ICX cooler is releasing soon, within this month; it's a slight improvement but comes with a lower default core clock compared with the previous ACX 3.0.


I just really want to get a 1070 right now to upgrade from a 7950. I can't wait any longer for a newer card that's going to be released in a few months. I will either get the EVGA SC or the Gigabyte G1.


----------



## zipper17

Quote:


> Originally Posted by *JoeUbi*
> 
> Just upgraded to a 7700k and scored 18194 on Firestrike. http://www.3dmark.com/fs/11652759 These Zotac cards are so beast-like.


That's the total combined score; the raw graphics score is much more interesting.

I don't care much about the total score when I run 3DMark; I mainly focus on GPU overclocking and the graphics score.

You could try for 22k, btw.


----------



## khanmein

EVGA is going to release this Friday, so just wait a few days, not a month. I don't recommend the Giga G1 because of the dual heatpipes, coil whine, and fan rattling issues, but it's your money, so it's up to you.


----------



## zipper17

Quote:


> Originally Posted by *GeneO*
> 
> See temperature compensation plot in this review:
> 
> http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/15
> 
> .


Even if you set a fixed voltage and frequency on the curve, the core clock will still drop because of temperature, even when it doesn't exceed 60-65°C, still far from 83°C.

I miss the old GTX 500 series; I could run the core and memory clocks at a fixed frequency with no drop at all, no matter how hot the card got.


----------



## b0uncyfr0

I hope there will be custom BIOSes soon because these 1070s are severely crippled atm. It would be interesting to see how disabling Boost 3.0 would go too; maybe it's not doing as great a job as we think. Regarding my card, I've noticed it stays around 2088 at 55 degrees but sometimes drops to 2050/2063 when it gets really close to 60 degrees. I'm trying to set up my fans so the card never hits 60. I can't imagine how good the boost would be on rigs with water; it probably wouldn't even go past 40 degrees.


----------



## pez

Quote:


> Originally Posted by *tiramoko*
> 
> my bad i just read this from someone's thread maybe much more.
> 
> i just really wanted to get 1070 right now to upgrade from 7950. i cant wait any longer for a newer card thats going to be released in few months. i will either get Evga SC or G1 gigabyte


All good, it just kinda made me go '???'. The 1070 is still a great card, especially for 1440p. Is the MSI Gaming (X) not an option for you? It's also a very good card now that they're more reasonably priced (or were somewhat recently).


----------



## gtbtk

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Pascal doesn't really benefit from extra voltage the way Maxwell did. You can read this if you like: https://xdevs.com/guide/pascal_oc/#intro
> 
> Also, no one has cracked the encryption needed to sign a modified BIOS so that it can be flashed.
> 
> 
> 
> Just read the last bit of that article. Does that mean that modifying the core voltage slider bar in MSI AB really doesn't do anything to OC potential?

It gives you a bit of flexibility to move things up and down a little, but I can get the same Fire Strike scores at 1.062V, with lower temps, as I can at 1.093V.

I have come to the conclusion that the core clock speed AB reports doesn't actually have much relevance when it comes to frames per second. The levels around 0.975v seem to have much more impact on GPU performance.


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> It gives you a bit of flexibility to move things up and down a little but I can get the same scores in Firestrike at 1.062V with lower temps as I can at 1.093V.
> 
> I have come to the conclusion that the core clock speed that AB reports doesn't actually have that much relevance when it comes to frames per second. the levels around .975v seem to have much more impact on GPU performance


I don't know why mine running at 1.05v resulted in the same GPU temp and power draw as at 1.09v, with the same core and VRAM clock settings.
Indeed, Pascal does not really scale with voltage, but I have been thinking this is all about GPU Boost and the driver.


----------



## kevindd992002

Quote:


> Originally Posted by *gtbtk*
> 
> It gives you a bit of flexibility to move things up and down a little but I can get the same scores in Firestrike at 1.062V with lower temps as I can at 1.093V.
> 
> I have come to the conclusion that the core clock speed that AB reports doesn't actually have that much relevance when it comes to frames per second. the levels around .975v seem to have much more impact on GPU performance


But you still adjust the core frequency when you adjust the voltage, right? Or is it that you vary the voltage, keep the core freq constant, and then test with Firestrike?


----------



## abiliocdf

Look at this bug, it's crazy, and it only happens with Galax/KFA2:



http://www.3dmark.com/compare/fs/11656473/fs/11656535

https://lockgamer.com/2017/02/09/galaxkfa2-gtx-1070-ex-fakes-2-5ghz-gpu-clock/


----------



## madweazl

Quote:


> Originally Posted by *b0uncyfr0*
> 
> I hope there will be custom BIOSes soon because these 1070s are severely crippled atm. It would be interesting to see how disabling Boost 3.0 would go too; maybe it's not doing as great a job as we think. Regarding my card, I've noticed it stays around 2088 at 55 degrees but sometimes drops to 2050/2063 when it gets really close to 60 degrees. I'm trying to set up my fans so the card never hits 60. *I can't imagine how good the boost would be on rigs with water.* It probably wouldn't even go past 40 degrees.


This only represents about 12 minutes in Black Desert (all settings maxed with the exception of some viewing distances) but I've never had it go over 41° in any game or benchmark that I recall.


Spoiler: Warning: Spoiler!





https://flic.kr/p/RMRKZz


----------



## b0uncyfr0

@madweazl

Still kinda low for water though no? 2101 isn't that high.


----------



## zipper17

Quote:


> Originally Posted by *abiliocdf*
> 
> Look at this bug, it's crazy, and it only happens with Galax/KFA2:
> 
> 
> 
> http://www.3dmark.com/compare/fs/11656473/fs/11656535
> 
> https://lockgamer.com/2017/02/09/galaxkfa2-gtx-1070-ex-fakes-2-5ghz-gpu-clock/


Interesting

This is pretty close to what I experienced before...
http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/6540#post_25703006
http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/6540#post_25703710
http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/6570#post_25705184

I can run a 2151MHz core clock seemingly stable, but it's a fake clock and fake 2151MHz performance...


----------



## EDK-TheONE

Quote:


> Originally Posted by *zipper17*
> 
> This is pretty close to what I experienced before...
> http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/6540#post_25703006
> http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/6540#post_25703710
> http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/6570#post_25705184
> 
> I can run a 2151MHz core clock seemingly stable, but it's a fake clock and fake 2151MHz performance...


It's not a fake clock; it's the smart boost behavior Nvidia engineered into the chip, so you cannot actually get better performance out of it.


----------



## madweazl

Quote:


> Originally Posted by *b0uncyfr0*
> 
> @madweazl
> 
> Still kinda low for water though no? 2101 isn't that high.


Those are the settings I roll with for daily operations (200/800 over reference). The plots don't show any downclocking at 40° until I played with DSR.

Edit: as stated earlier in this thread, the move to water only netted an additional 20/100 but reduced temps by roughly 23°.

I played around with DSR and was able to get considerably higher GPU usage in game with it set to 4K. That got temps up to 41°. With DSR set to 1440 it didn't operate much differently than my native 1080, but it did drop about 3fps. With DSR set to 4K I was getting mid-teens frame rates with everything maxed out.


Spoiler: Warning: Spoiler!





https://flic.kr/p/RVbjz1


----------



## asdkj1740

Bitwik uploaded the new EVGA ICX 1080 FTW2 review video, but after <30 minutes the video was set to private.

The temp test: one old FTW 1080 vs one new ICX 1080 FTW2, 2000rpm on both cards, 25°C ambient, open test bench; the GPU temp of the new ICX was 2°C higher, around 71°C.


----------



## gtbtk

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> It gives you a bit of flexibility to move things up and down a little but I can get the same scores in Firestrike at 1.062V with lower temps as I can at 1.093V.
> 
> I have come to the conclusion that the core clock speed that AB reports doesn't actually have that much relevance when it comes to frames per second. the levels around .975v seem to have much more impact on GPU performance
> 
> 
> 
> But you still adjust the core frequency when you adjust the voltage, right? Or is it that you vary the voltage, keep the core freq constant, and then test with Firestrike?

I overclock the core clock with points on the curve; I do not use the core clock slider. Remember there are two things we are talking about that relate to voltage. One is the voltage slider, which raises the upper voltage limit that the board can operate at. The other is the set of points on the voltage/frequency curve, which set the core clock speed at each voltage level from 0.800V up to 1.062-1.093V depending on the voltage slider setting.

In very simple terms, if you have the voltage slider at "0" then the points above 1.062V on the curve don't do anything. If you increase the voltage slider to "100" then the points to the right of 1.062V, up to 1.093V, become active in setting the core clock frequency.

Each voltage point along the curve independently has a level above the stock frequency where it will go unstable if you push too far. Some voltage points may take 100 or more above stock, but there may be a single point that can only go 50 above stock. If you use the slider, all the voltage levels on the curve are increased together by the same offset, so even if only one voltage level is limited to +50, it will limit the overclock/performance for the entire curve to +50 above stock. If you adjust the points on the curve independently, you can hold just that single weak point to +50 and go higher at the other points along the curve.

After months of experimentation, I have found that with the memory highly overclocked, the area of the curve that seems to do the most work is the points around 0.950-0.975V. If you find a good level at 0.975V, then setting the 1.062V point to 2088MHz with the voltage slider at +0, or 2114MHz at 1.093V with the slider at +100, really does not make much difference in performance. I score around 20700 graphics either way.

I have also found that there is a certain frequency at the 0.975V point that changes the balance between CPU/GPU performance. I can demonstrate it in Firestrike with either a higher graphics score and lower physics score, or vice versa, just by moving the level above or below about 2000MHz at 0.975V. For Firestrike, the slightly lower graphics with higher physics/combined scores gives a better total score.

The lower voltage levels do make a difference to card temps and will improve core clock stability under load. That could benefit owners of cards that run hot, like the Founders Edition or other blower style cards.
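To make the slider-versus-curve point concrete, here is a toy Python sketch of that "weakest point" effect. The voltage points and headroom numbers are made up for illustration; real values come from testing your own card in Afterburner.

```python
# Toy model of the Pascal voltage/frequency curve discussion above.
# Assumption: illustrative numbers only; nothing here reads a real card.

# stable headroom above stock (MHz) at a handful of curve voltage points
headroom = {0.800: 120, 0.950: 110, 0.975: 100, 1.000: 50, 1.062: 100}

def max_global_offset(headroom):
    """A slider offset shifts every point together, so the weakest
    point on the curve caps the offset for the whole card."""
    return min(headroom.values())

def per_point_offsets(headroom):
    """Editing points independently lets each voltage level run at
    its own limit instead of being dragged down to the weakest one."""
    return dict(headroom)

print(max_global_offset(headroom))         # 50: limited by the 1.000V point
print(per_point_offsets(headroom)[0.975])  # 100: other points keep their headroom
```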


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> 
> 
> 
> 
> 
> bitwik uploaded the new EVGA iCX 1080 FTW2 review video, but within 30 mins the video was set to private.
> 
> The temp test was: one old FTW 1080, one new iCX 1080 FTW2, 2000rpm on both cards, 25C ambient, open test bench; the GPU temp of the new iCX was 2C higher, around 71C.


I think he may have just broken his NDA...oops

The 2C increase could be because the new cooler is drawing more heat load from the VRM and memory than the old cooler was. The end result would be that the VRMs and memory chips run cooler than in the original, at the expense of an extra 2C for the GPU.


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> I think he may have just broken his NDA...oops
> 
> The 2C increase could be because the new cooler is drawing more heat load from the VRM and memory than the old cooler was. The end result would be that the VRMs and memory chips run cooler than in the original, at the expense of an extra 2C for the GPU.


https://videocardz.com/65789/evga-icx-technology-detailed


----------



## Sptz

Hi everyone,

I just built an mATX rig with an i7 7700k and an EVGA ACX 3.0 1070. CPU is running at 4.5 and no OC on GPU.

Is this Firestrike score normal? I'm using the latest nvidia drivers (the hotfix one due to the debug issue on pascal).

http://www.3dmark.com/fs/11657859

I've seen "default" reviews getting 16 or 17k, just wondering if this is to be expected.

Cheers!


----------



## asdkj1740

Quote:


> Originally Posted by *Sptz*
> 
> Hi everyone,
> 
> I just built an mATX rig with an i7 7700k and an EVGA ACX 3.0 1070. CPU is running at 4.5 and no OC on GPU.
> 
> Is this Firestrike score normal? I'm using the latest nvidia drivers (the hotfix one due to the debug issue on pascal).
> 
> http://www.3dmark.com/fs/11657859
> 
> I've seen "default" reviews getting 16 or 17k, just wondering if this is to be expected.
> 
> Cheers!


I really have no idea why people still buy the EVGA ACX 3.0......
Did your card come with the latest BIOS and thermal pads preinstalled?


----------



## Sptz

Quote:


> Originally Posted by *asdkj1740*
> 
> I really have no idea why people still buy the EVGA ACX 3.0......
> Did your card come with the latest BIOS and thermal pads preinstalled?


I was unaware of the issues when I bought it. It has the latest BIOS (86.04.50.00.72), so I'm assuming it comes with the thermal pads.

EDIT: Just checked EVGA's serial number check and it does.


----------



## asdkj1740

Quote:


> Originally Posted by *Sptz*
> 
> I was unaware of the issues when I bought it. It has the latest BIOS (86.04.50.00.72) , so I'm assuming it comes with the thermal pads.
> 
> EDIT: Just checked EVGA's serial number check and it does.


The thermal pads can be seen easily if you look.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> It gives you a bit of flexibility to move things up and down a little, but I can get the same scores in Firestrike at 1.062V, with lower temps, as I can at 1.093V.
> 
> I have come to the conclusion that the core clock speed that AB reports doesn't actually have much relevance when it comes to frames per second. The levels around 0.975V seem to have much more impact on GPU performance.
> 
> 
> 
> I don't know why mine running at 1.05V resulted in the same GPU temp and GPU power draw as 1.09V, with the same core and VRAM clock settings.
> Indeed, Pascal does not really scale with voltage. But I have been thinking this is all about GPU Boost and the driver.

Given that these cards have about 20 different things that are all dependent on each other, some of which depend on external factors like load on the CPU, it certainly makes it interesting trying to work out what affects what and get some consistency.

The driver version can certainly have an effect on overall performance. GPU Boost is officially affected by the power limit, temperature and curve settings. What they don't tell you is that the relative differences between points on the curve also affect how the card performs, while the reporting tools mislead you by only telling you what frequency the highest accessible point on the curve is operating at.
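As a rough mental model of those limiters (my own simplification for illustration, not Nvidia's documented algorithm), GPU Boost can be pictured as starting from the curve frequency and binning down roughly 13MHz at a time as the power and thermal limiters kick in:

```python
# Simplified sketch of GPU Boost limiter behaviour on Pascal.
# The bin size (13MHz) is the real Pascal clock granularity, but the
# thresholds and deduction rules here are invented for illustration.

def boost_clock(curve_mhz, power_draw_w, power_limit_w, temp_c):
    clock = curve_mhz                      # what the V/F curve asks for
    if power_draw_w > power_limit_w:       # power limiter bins the clock down
        clock -= 13 * ((power_draw_w - power_limit_w) // 5 + 1)
    if temp_c >= 60:                       # thermal bins start around 60C
        clock -= 13 * ((temp_c - 60) // 8 + 1)
    return clock

print(boost_clock(2088, 160, 170, 55))  # 2088: no limiter active
print(boost_clock(2088, 180, 170, 72))  # 2023: power and thermal both bin it down
```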


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I think he may have just broken his NDA...oops
> 
> The 2C increase could be because the new cooler is drawing more heat load from the VRM and memory than the old cooler was. The end result would be that the VRMs and memory chips run cooler than in the original, at the expense of an extra 2C for the GPU.
> 
> 
> 
> https://videocardz.com/65789/evga-icx-technology-detailed

That is the first time I have seen the details of the FTW2. The recent dramas have certainly encouraged EVGA to go overboard with sensors.


----------



## gtbtk

Quote:


> Originally Posted by *Sptz*
> 
> Hi everyone,
> 
> I just built an mATX rig with an i7 7700k and an EVGA ACX 3.0 1070. CPU is running at 4.5 and no OC on GPU.
> 
> Is this Firestrike score normal? I'm using the latest nvidia drivers (the hotfix one due to the debug issue on pascal).
> 
> http://www.3dmark.com/fs/11657859
> 
> I've seen "default" reviews getting 16 or 17k, just wondering if this is to be expected.
> 
> Cheers!


The total score is highly dependent on CPU performance and capacity as well. The graphics score at stock clocks is usually around 19000 or so. In Firestrike you will benefit from setting a fan curve that is more aggressive than the stock one, which does not spin the fans up until the card reaches 60C.

With the fans at 100%, there is no reason why you can't run the entire benchmark with temps below 60C. That will help GPU Boost keep the clocks higher.

You will see the biggest gains in FS scores by overclocking the memory as high as you can get it - expect a minimum of +500, but maybe as high as +800 or so depending on your card. Core clock overclocking will get you an increase as well, but not to the same extent as a memory OC. Of course, other 3D applications will behave differently.
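For anyone scripting their own fan curve, the idea is just linear interpolation between (temperature, fan %) breakpoints. A minimal Python sketch; the breakpoints are illustrative, not a recommendation for any particular card:

```python
# Sketch of an aggressive fan curve like the one suggested above:
# instead of waiting for 60C before the fans spin, ramp early so the
# card stays below the first thermal limit during a benchmark run.

def fan_percent(temp_c, curve=((30, 40), (45, 70), (55, 90), (60, 100))):
    """Linear interpolation between (temperature C, fan %) breakpoints."""
    if temp_c <= curve[0][0]:
        return curve[0][1]                       # floor below first point
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return 100                                   # ceiling past last point

print(fan_percent(25))   # 40 (floor)
print(fan_percent(50))   # 80.0, halfway between the 45C and 55C points
print(fan_percent(70))   # 100 (ceiling)
```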


----------



## gtbtk

Quote:


> Originally Posted by *madweazl*
> 
> Quote:
> 
> 
> 
> Originally Posted by *b0uncyfr0*
> 
> @madweazl
> 
> Still kinda low for water though no? 2101 isn't that high.
> 
> 
> 
> Those are the settings I roll with for daily operations (200/800 over reference). The plots don't show any downclocking at 40° until I played with DSR.
> 
> Edit: as stated earlier in this thread, the move to water only netted an additional 20/100 but reduced the temps by roughly 23°.
> 
> I played around with DSR and was able to get considerably higher GPU usage in game with it set to 4K. This got the temps up to 41°. With DSR set to 1440 it didn't operate much differently than my native 1080 but did drop about 3fps. With DSR set to 4K I was getting mid-teens frame rates with everything maxed out.

As you have water cooling, your biggest challenge will be managing the power limits. They become more problematic as resolution and load increase.

The Founders Edition only has a maximum 170W power limit, so you don't have a lot of headroom to begin with. As long as you are cooling the VRM and memory, you could benefit from cross flashing your card with an ASUS Strix OC BIOS. That will give you a 200W power limit.


----------



## tiramoko

Quote:


> Originally Posted by *pez*
> 
> All good. Just kinda made me go '???'. The 1070 is still a great card; especially for 1440p. Is the MSI Gaming (X) not an option for you? Also a very good card now that they are more reasonably priced (or they were somewhat recently).


The MSI Gaming X 1070 is over my budget. If I had not bought a new monitor, maybe I could have afforded a 1080.


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> I overclock the core clock with points on the curve; I do not use the core clock slider. Remember there are two things we are talking about that relate to voltage. One is the voltage slider, which raises the upper voltage limit that the board can operate at. The other is the set of points on the voltage/frequency curve, which set the core clock speed at each voltage level from 0.800V up to 1.062-1.093V depending on the voltage slider setting.
> 
> In very simple terms, if you have the voltage slider at "0" then the points above 1.062V on the curve don't do anything. If you increase the voltage slider to "100" then the points to the right of 1.062V, up to 1.093V, become active in setting the core clock frequency.
> 
> Each voltage point along the curve independently has a level above the stock frequency where it will go unstable if you push too far. Some voltage points may take 100 or more above stock, but there may be a single point that can only go 50 above stock. If you use the slider, all the voltage levels on the curve are increased together by the same offset, so even if only one voltage level is limited to +50, it will limit the overclock/performance for the entire curve to +50 above stock. If you adjust the points on the curve independently, you can hold just that single weak point to +50 and go higher at the other points along the curve.
> 
> After months of experimentation, I have found that with the memory highly overclocked, the area of the curve that seems to do the most work is the points around 0.950-0.975V. If you find a good level at 0.975V, then setting the 1.062V point to 2088MHz with the voltage slider at +0, or 2114MHz at 1.093V with the slider at +100, really does not make much difference in performance. I score around 20700 graphics either way.
> 
> I have also found that there is a certain frequency at the 0.975V point that changes the balance between CPU/GPU performance. I can demonstrate it in Firestrike with either a higher graphics score and lower physics score, or vice versa, just by moving the level above or below about 2000MHz at 0.975V. For Firestrike, the slightly lower graphics with higher physics/combined scores gives a better total score.
> 
> *The lower voltage levels do make a difference to card temps and will improve core clock stability under load. That could benefit owners of cards that run hot, like the Founders Edition or other blower style cards.*


With my Galax card, if I set the max voltage to 1.062, 1.075, or 1.081V I get lower graphics scores (20,7xx-20,8xx);
with a max of 1.093V (100% voltage) I get higher graphics scores (20,9xx). I just tested it several times (same settings config).

Higher voltages can stabilize the core clock, but ironically they can also cause the core clock to throttle down if the card gets too hot.

I think above 2050MHz it starts to use more than 1.062V; if you set the voltage too low, it will not even boost to the higher core clock frequencies.

I'm using the basic slider, not the curve, btw. I will try OCing with the curve settings again when I'm in a better mood for overclocking.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I overclock the core clock with points on the curve; I do not use the core clock slider. Remember there are two things we are talking about that relate to voltage. One is the voltage slider, which raises the upper voltage limit that the board can operate at. The other is the set of points on the voltage/frequency curve, which set the core clock speed at each voltage level from 0.800V up to 1.062-1.093V depending on the voltage slider setting.
> 
> In very simple terms, if you have the voltage slider at "0" then the points above 1.062V on the curve don't do anything. If you increase the voltage slider to "100" then the points to the right of 1.062V, up to 1.093V, become active in setting the core clock frequency.
> 
> Each voltage point along the curve independently has a level above the stock frequency where it will go unstable if you push too far. Some voltage points may take 100 or more above stock, but there may be a single point that can only go 50 above stock. If you use the slider, all the voltage levels on the curve are increased together by the same offset, so even if only one voltage level is limited to +50, it will limit the overclock/performance for the entire curve to +50 above stock. If you adjust the points on the curve independently, you can hold just that single weak point to +50 and go higher at the other points along the curve.
> 
> After months of experimentation, I have found that with the memory highly overclocked, the area of the curve that seems to do the most work is the points around 0.950-0.975V. If you find a good level at 0.975V, then setting the 1.062V point to 2088MHz with the voltage slider at +0, or 2114MHz at 1.093V with the slider at +100, really does not make much difference in performance. I score around 20700 graphics either way.
> 
> I have also found that there is a certain frequency at the 0.975V point that changes the balance between CPU/GPU performance. I can demonstrate it in Firestrike with either a higher graphics score and lower physics score, or vice versa, just by moving the level above or below about 2000MHz at 0.975V. For Firestrike, the slightly lower graphics with higher physics/combined scores gives a better total score.
> 
> *The lower voltage levels do make a difference to card temps and will improve core clock stability under load. That could benefit owners of cards that run hot, like the Founders Edition or other blower style cards.*
> 
> 
> 
> With my Galax card, if I set the max voltage to 1.062, 1.075, or 1.081V I get lower graphics scores (20,7xx-20,8xx);
> with a max of 1.093V (100% voltage) I get higher graphics scores (20,9xx). I just tested it several times (same settings config).
> 
> Higher voltages can stabilize the core clock, but ironically they can also cause the core clock to throttle down if the card gets too hot.
> 
> I think above 2050MHz it starts to use more than 1.062V; if you set the voltage too low, it will not even boost to the higher core clock frequencies.
> 
> I'm using the basic slider, not the curve, btw. I will try OCing with the curve settings again when I'm in a better mood for overclocking.

Yes, the higher voltages can stabilize things and give you more flexibility in overclocking. I am not suggesting you don't use them.

Have you tried increasing the overclock at the lower voltage? I can run my card at 0.975V at about 2050.

Left score is 1.062V, right score is 1.093V. Note the significant differences in frequencies shown in the details sections, but the differences in scores are pretty much within the margin of error of running these sorts of benchmarks.

http://www.3dmark.com/compare/fs/11643842/fs/11633012

The curve is more work to get right, but it is also much more educational, because you start seeing that the performance does not all come at the top of the curve and that the different voltage points change the behaviour of the card.
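If you want to be more rigorous about "within the margin of error", a quick sketch of the idea: run each setting a few times and compare the gap between the means to the run-to-run spread. The scores below are hypothetical examples, not my actual results:

```python
# Quick sanity check for "within the margin of error" between two
# overclock settings. The run scores are made-up examples.
from statistics import mean, stdev

runs_1062v = [20650, 20720, 20690]   # hypothetical scores at 1.062V
runs_1093v = [20700, 20760, 20680]   # hypothetical scores at 1.093V

def within_noise(a, b):
    """Treat the difference as noise if the gap between the means is
    smaller than the combined run-to-run standard deviation."""
    return abs(mean(a) - mean(b)) < stdev(a) + stdev(b)

print(within_noise(runs_1062v, runs_1093v))  # True: the gap is just noise
```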


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> Yes, the higher voltages can stabilize things and give you more flexibility in overclocking. I am not suggesting you don't use them.
> 
> *Have you tried increasing the overclock at the lower voltage? I can run my card at 0.975V at about 2050.*
> 
> Left score is 1.062V, right score is 1.093V. Note the significant differences in frequencies shown in the details sections, but the differences in scores are pretty much within the margin of error of running these sorts of benchmarks.
> 
> http://www.3dmark.com/compare/fs/11643842/fs/11633012
> 
> The curve is more work to get right, but it is also much more educational, because you start seeing that the performance does not all come at the top of the curve and that the different voltage points change the behaviour of the card.


I think I already tried that in the past, IIRC, but I don't remember exactly. Some guys here also suggested setting 20xx MHz at 0.9xxV, but the performance result was lower even though it ran at the same core clock; at normal voltage it gets a higher result.

Do you mean you set 2050MHz at every curve point from 0.975V up to the max voltage? Doesn't that mean the card is undervolting? The card would then sit at a fixed 0.975V, and the 2050MHz would probably not give real 2050MHz performance...

By the way, do you run monitoring software at the same time as the benchmark? Maybe you should try running the bench without monitoring software; IMO it gives me more points on the graphics score. Also try setting everything to default in NVCP, set the GPU to prefer max performance, and set the Windows power plan to High Performance (removing the PCIe and CPU low-power states); a cooler ambient and 100% fan speed will also help.


----------



## madweazl

Quote:


> Originally Posted by *gtbtk*
> 
> As you have water cooling, your biggest challenge will be managing the power limits. They become more problematic as resolution and load increase.
> 
> The Founders Edition only has a maximum 170W power limit, so you don't have a lot of headroom to begin with. As long as you are cooling the VRM and memory, you could benefit from cross flashing your card with an ASUS Strix OC BIOS. That will give you a 200W power limit.


I may give that a shot when somebody tops my scores but for now, I enjoy having it show as NVidia in the details /snicker


----------



## owikhan

Is there a BIOS update for the ASUS GTX 1070 Dual OC?

Please share a link.


----------



## icold

Quote:


> Originally Posted by *owikhan*
> 
> update bios for asus gtx 1070 dual oc???
> 
> please share link


Use this:

http://dlcdnet.asus.com/pub/ASUS/vga/app/GTX1070updatebios.rar?_ga=1.234835933.274823173.1468605128


----------



## Nukemaster

Quote:


> Originally Posted by *icold*
> 
> Use this:
> 
> http://dlcdnet.asus.com/pub/ASUS/vga/app/GTX1070updatebios.rar?_ga=1.234835933.274823173.1468605128


It adds fanless idle mode too


----------



## DeathAngel74

What is the difference in the power limits between the Dual OC and Strix OC? Would I gain anything with this over the Strix OC BIOS? Nvm, Google is my friend... Strix 8-pin, Dual 6-pin.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> I think I already tried that in the past, IIRC, but I don't remember exactly. Some guys here also suggested setting 20xx MHz at 0.9xxV, but the performance result was lower even though it ran at the same core clock; at normal voltage it gets a higher result.
> 
> Do you mean you set 2050MHz at every curve point from 0.975V up to the max voltage? Doesn't that mean the card is undervolting? The card would then sit at a fixed 0.975V, and the 2050MHz would probably not give real 2050MHz performance...
> 
> By the way, do you run monitoring software at the same time as the benchmark? Maybe you should try running the bench without monitoring software; IMO it gives me more points on the graphics score. Also try setting everything to default in NVCP, set the GPU to prefer max performance, and set the Windows power plan to High Performance (removing the PCIe and CPU low-power states); a cooler ambient and 100% fan speed will also help.


No. On my card, the sweet spot for the 0.975V point is at 1999 or 2025. The curve then runs flat until the 1.062V point, which I pull up to 2088 if I have not turned up the voltage, or up to the 1.093V point if the slider is at +100.

I do both. I run Afterburner with HWiNFO feeding some of its info into the OSD while I am experimenting, to hopefully better understand why things crash or don't. If I want to really test the settings, I shut everything down and see how it goes without any resource contention, because it does impact ultimate performance. If you can test in a clean mode with the minimum of services running, you will get the best scores. Having said that, settings that survive the synthetic benchmarks, while fun to push, will generally crash out when you are playing a game.

The comparison results that I posted earlier were done with AB and HWiNFO running.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> What is the difference in the power limits between the dual oc and strix oc? Would I gain anything with this over the strix oc bios? Nvm Google is my friend...strix 8pin, dual 6pin.


Yes, there is a difference: the Dual is 170W, the non-OC Strix is 180W and the OC Strix is 200W.


----------



## DeathAngel74

Ok thanks @gtbtk


----------



## icold

Quote:


> Originally Posted by *DeathAngel74*
> 
> What is the difference in the power limits between the dual oc and strix oc? Would I gain anything with this over the strix oc bios? Nvm Google is my friend...strix 8pin, dual 6pin.


The 1070 Dual is a very ugly GPU (the color).


----------



## DeathAngel74

The Dragon OC also has a low power limit, correct?


----------



## khanmein

Quote:


> Originally Posted by *asdkj1740*
> 
> I really have no idea why people still buy the EVGA ACX 3.0......
> Did your card come with the latest BIOS and thermal pads preinstalled?


No issue with the VRM temps, plus my batch came with the thermal pads and VBIOS pre-applied.


----------



## GeneO

Nm


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> The dragon oc also has a low power limit correct?


I think the Dragon OC card is a Strix OC PCB with a different shroud and the original reviewer BIOS installed, which changes the base clock to run in Strix "OC" mode by default. The PCB and the back end of the cooler look the same.

If that is the case, the Dragon should have a limit of 200W.


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> I think the dragon OC card is a Strix OC PCB with a different shroud and the original reviewer bios installed that changes the base clock to run in strix "OC" mode by default. The PCB and the back end of the cooler looks the same.
> 
> If that is the case, the dragon should have a limit of 200W


http://www.pcpop.com/doc/3/3402/3402325_all.shtml


----------



## EDK-TheONE

I found new information about the Zotac AMP Extreme (Samsung memory). It has two VBIOS versions, *86.04.1E.00.00* and *86.04.1E.00.89*. The first one was sent to editors; it has a better curve and overclocks better. Look at the pic below: a 22700 Fire Strike graphics score @ 2151MHz on air! The second VBIOS is what consumers get out of the box.


----------



## asdkj1740

Quote:


> Originally Posted by *EDK-TheONE*
> 
> I found new information about the Zotac AMP Extreme (Samsung memory). It has two VBIOS versions, *86.04.1E.00.00* and *86.04.1E.00.89*. The first one was sent to editors; it has a better curve and overclocks better. Look at the pic below: a 22700 Fire Strike graphics score @ 2151MHz on air! The second VBIOS is what consumers get out of the box.


Run FSU instead of the other 3DMark tests for GPU testing.

Zotac is insane. Hong Kong number one


----------



## Dude970

Upgraded my MB, RAM, and CPU - broke 18K on FS


----------



## asdkj1740

$100 for existing owners to upgrade to iCX. Yeah, EVGA is the best. After-sales service is the best.

f**k you.


----------



## DeathAngel74

I called this morning about that; they said they'd have more info next week. $99 + sales tax isn't bad. They pay for return shipping and you get an iCX t-shirt!


----------



## RyanRazer

Quote:


> Originally Posted by *EDK-TheONE*
> 
> I found new information about the Zotac AMP Extreme (Samsung memory). It has two VBIOS versions, *86.04.1E.00.00* and *86.04.1E.00.89*. The first one was sent to editors; it has a better curve and overclocks better. Look at the pic below: a 22700 Fire Strike graphics score @ 2151MHz on air! The second VBIOS is what consumers get out of the box.


Where can i get one? (bios)


----------



## gtbtk

Quote:


> Originally Posted by *EDK-TheONE*
> 
> I found new information about the Zotac AMP Extreme (Samsung memory). It has two VBIOS versions, *86.04.1E.00.00* and *86.04.1E.00.89*. The first one was sent to editors; it has a better curve and overclocks better. Look at the pic below: a 22700 Fire Strike graphics score @ 2151MHz on air! The second VBIOS is what consumers get out of the box.


So Zotac was also a member of the "let's deceive the public" club along with Asus and MSI.

The stupid thing is that the VBIOS exists and costs Zotac nothing, so why the hell did they not just put it on the card as standard?


----------



## gtbtk

Quote:


> Originally Posted by *Dude970*
> 
> Upgraded my MB, RAM, and CPU broke 18K on FS


Did your graphics score get much of an increase, or were the gains just from physics/combined?


----------



## Dude970

Quote:


> Originally Posted by *gtbtk*
> 
> Did your graphics score get much of an increase ir were the increases just from physics/combined?


Mostly physics/combined. I have gotten a 22k graphics score before, so I will keep testing.


----------



## ranillo

Quote:


> Originally Posted by *EDK-TheONE*
> 
> I found new information about the Zotac AMP Extreme (Samsung memory). It has two VBIOS versions, *86.04.1E.00.00* and *86.04.1E.00.89*. The first one was sent to editors; it has a better curve and overclocks better. Look at the pic below: a 22700 Fire Strike graphics score @ 2151MHz on air! The second VBIOS is what consumers get out of the box.


Can you share it, pls?


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *EDK-TheONE*
> 
> I found new information about the Zotac AMP Extreme (Samsung memory). It has two VBIOS versions, *86.04.1E.00.00* and *86.04.1E.00.89*. The first one was sent to editors; it has a better curve and overclocks better. Look at the pic below: a 22700 Fire Strike graphics score @ 2151MHz on air! The second VBIOS is what consumers get out of the box.
> 
> 
> 
> 
> 
> 
> 
> 
> Run FSU instead of the other 3DMark tests for GPU testing.
> 
> Zotac is insane. Hong Kong number one

I considered the AMP Extreme, but it would not fit in my rig.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I think the dragon OC card is a Strix OC PCB with a different shroud and the original reviewer bios installed that changes the base clock to run in strix "OC" mode by default. The PCB and the back end of the cooler looks the same.
> 
> If that is the case, the dragon should have a limit of 200W
> 
> 
> 
> http://www.pcpop.com/doc/3/3402/3402325_all.shtml

Thanks Dude!


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> Thanks Dude!


Actually, this one is better, with the DIY kit included and the ASUS direct power feature to provide cleaner power to the GPU.


----------



## zipzop

If the 1070 iCX SC2 ships with Samsung GDDR5 (which is unlikely) then I'm all in for the upgrade... especially considering I have EVGA Bucks saved up from Folding.

I'd be worried about getting a card with a worse core though (mine OCs nicely to 2139MHz stable)... but my memory is sh** even with the Micron BIOS update.


----------



## AliasOfMyself

Sorry for the crappy photo; I couldn't get this to show up properly in a screenshot - it was there, but it was cut off. I just wondered if anyone knows what the hell that is at the bottom of the display??


----------



## EDK-TheONE

Quote:


> Originally Posted by *ranillo*
> 
> Can you share it, pls?


I don't have this version of the BIOS yet. I contacted Zotac support; they may provide it to me.


----------



## EDK-TheONE

Quote:


> Originally Posted by *RyanRazer*
> 
> Where can i get one? (bios)



There is no link yet. I contacted Zotac; they may provide it! I'm still waiting...


----------



## DeathAngel74

Quote:


> Originally Posted by *zipzop*
> 
> If the 1070 iCX SC2 ships with Samsung GDDR5 (which is unlikely) then I'm all in for the upgrade... especially considering I have EVGA Bucks saved up from Folding.
> 
> I'd be worried about getting a card with a worse core though (mine OCs nicely to 2139MHz stable)... but my memory is sh** even with the Micron BIOS update.


The swirling vortex of silicon lottery terror?


----------



## khanmein

Quote:


> Originally Posted by *zipzop*
> 
> If the 1070 iCX SC2 ships with Samsung GDDR5 (which is unlikely) then I'm all in for the upgrade... especially considering I have EVGA Bucks saved up from Folding.
> 
> I'd be worried about getting a card with a worse core though (mine OCs nicely to 2139MHz stable)... but my memory is sh** even with the Micron BIOS update.


All the reviewers just show the GTX 1080, and obviously the GDDR5X is from Micron, so I doubt the 1070 will come with Samsung.

If I knew it was Samsung, then I would apply for the Step-Up program.


----------



## anthonyg45157

Quote:


> Originally Posted by *EDK-TheONE*
> 
> There is no link yet. I contacted Zotac; they may provide it! I'm still waiting...


If the BIOS is on your card, can't you save the BIOS using GPU-Z and then upload it?


----------



## EDK-TheONE

Quote:


> Originally Posted by *anthonyg45157*
> 
> If the BIOS is on your card, can't you save the BIOS using GPU-Z and then upload it?


I don't have this version of the BIOS yet; it's not mine. I contacted Zotac support; they may provide it to me.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Thanks Dude!
> 
> 
> 
> Actually, this one is better, with the DIY kit included and the ASUS direct power feature to provide cleaner power to the GPU.

I have a feeling that, like the extra 6 or 8 pin connectors some cards have installed, it is more for marketing and show than actually required for good operation.


----------



## gtbtk

Quote:


> Originally Posted by *zipzop*
> 
> If the 1070 ICX SC2 ships with Samsung GDDR5(which is unlikely) then I'm all in for the upgrade...especially considering I have EVGA bucks saved up from Folding.
> 
> I'd be worried about getting a card with a worse core though(mine OC's nicely to 2139mhz stable)...but my memory is sh** even with the micron BIOS update


Why is it unlikely? 1070 cards from all brands are manufactured in batches, and they alternate Samsung with Micron on a batch-by-batch basis.

Even if you get a card with Micron memory, there is absolutely nothing wrong with it as long as it is running the .50 BIOS update that resolves a bug in the memory controller built into the Nvidia GPU silicon.

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RyanRazer*
> 
> Where can i get one? (bios)
> 
> 
> 
> 
> There is no link yet. I contacted Zotac; they may provide it! I'm still waiting...

That BIOS is a very old version that was current at the release of the 1070 in June last year.


----------



## gtbtk

Quote:


> Originally Posted by *AliasOfMyself*
> 
> Sorry for the crappy photo; I couldn't get this to show up properly in a screenshot - it was there, but it was cut off. I just wondered if anyone knows what the hell that is at the bottom of the display??


Looks like you have overclocked your VRAM slightly too much.


----------



## DoubleNorm

Quote:


> Originally Posted by *AliasOfMyself*
> 
> Sorry for the crappy photo; I couldn't get this to show up properly in a screenshot - it was there, but it was cut off. I just wondered if anyone knows what the hell that is at the bottom of the display??


It's just a bug in the game, man. Relax, tap Alt+Enter, then Alt+Enter again. From Ubisoft with love (and bugs).


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> I have a feeling that, like the extra 6 or 8 pin conectors some cards have installed, is more for marketing and show than actually required for good operation.


adding more capacitors should help overclocking.
http://www.overclock.net/t/1616678/gtx1060-extreme-mod-log-come-watch-me-bench-the-blue-smoke-out-of-a-gtx1060


----------



## AliasOfMyself

Quote:


> Originally Posted by *gtbtk*
> 
> Looks like you have overclocked your vram slightly too much.


Nah, I figured out the exact cause this morning. It's DirectX 12: when I have it enabled, I get the garbage at the bottom of the display; when it's turned off, there's no random garbage. This only happens in the Underground, though; the rest of the game appears to be fine with DirectX 12 enabled.

Quote:


> Originally Posted by *DoubleNorm*
> 
> It`s just a bug of game man, relax, tap alt+ener then alt+enter again. From ubisoft with love bugs


Don't I know it.

Leave it to Ubisoft to have a bug in a game that made me think my 1070 had a problem.


----------



## khanmein

Quote:


> Originally Posted by *AliasOfMyself*
> 
> Nah, figured out the exact cause this morning. It's Directx 12, when i have it enabled, i get the crap on the bottom of the display when i have it turned off, there's no random crap lol. This is only when i'm in the underground though, the rest of the game appears to be fine wiith Directx 12 enabled.
> Don't i know it
> 
> 
> 
> 
> 
> 
> 
> leave it to ubisoft to have a bug in a game that made me think my 1070 had a problem


Like I said, Ubisoft games are crap.


----------



## DoubleNorm

Quote:


> Originally Posted by *AliasOfMyself*
> 
> Nah, figured out the exact cause this morning. It's Directx 12, when i have it enabled, i get the crap on the bottom of the display when i have it turned off, there's no random crap lol. This is only when i'm in the underground though, the rest of the game appears to be fine wiith Directx 12 enabled.
> Don't i know it
> 
> 
> 
> 
> 
> 
> 
> leave it to ubisoft to have a bug in a game that made me think my 1070 had a problem


Sometimes it may happen without DX12 too. I think it happens because of an issue with windowed mode. In DX12 this game can ONLY run in windowed mode; the settings may say fullscreen, but that's a lie. Pressing Alt+Enter a few times always helps.


----------



## DeathAngel74

Quote:


> Originally Posted by *gtbtk*
> 
> Why is it unlikely? 1070 cards from all brands are manufactured in batches. They are alternating Samsung with micron on a batch by batch basis.
> 
> Even if you get a card with micron memory, there is absolutely nothing wrong with it as long as it is running the .50 bios update to resolve a bug with the memory controller that is built into the Nvidia GPU silicon.
> that bios is a very old version that was current at the release of the 1070 in June last year.


Although I completely agree with you, with a newly designed PCB and fancy new sensors, there should be no excuse at this point. If anything is wrong with this generation(SC2), I'll look to MSI for my next GPU.


----------



## AliasOfMyself

Quote:


> Originally Posted by *khanmein*
> 
> like i said ubisoft games is crap.


They're not crap, just poorly made. I've never seen a Ubisoft game released without a load of bugs, though.

Quote:


> Originally Posted by *DoubleNorm*
> 
> Sometimes it`s may happen without DX12)) I think it`s happen because some issue with windowed mode. In DX12 in this game ONLY windowed mode can be, settings may say that fullscreen mode, but it`s lie. Alt+enter for a few times always help.


That's the one thing I can't do in DirectX 12 mode; when I try to use Alt+Enter the game crashes, and it also crashes when I Alt+Tab. DirectX 12 can look very nice (Gears of War 4 was gorgeous); it's just that Ubisoft and Massive have implemented it very badly.


----------



## gtbtk

Quote:


> Originally Posted by *AliasOfMyself*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Looks like you have overclocked your vram slightly too much.
> 
> 
> 
> Nah, figured out the exact cause this morning. It's Directx 12, when i have it enabled, i get the crap on the bottom of the display when i have it turned off, there's no random crap lol. This is only when i'm in the underground though, the rest of the game appears to be fine wiith Directx 12 enabled.
> 
> Quote:
> 
> 
> 
> Originally Posted by *DoubleNorm*
> 
> It`s just a bug of game man, relax, tap alt+ener then alt+enter again. From ubisoft with love bugs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Don't i know it
> 
> 
> 
> 
> 
> 
> 
> leave it to ubisoft to have a bug in a game that made me think my 1070 had a problem

DX12 does not cope with VRAM overclocks as high as DX11 does. Time Spy won't run with memory overclocks much above about 9000MHz, while Fire Strike will run on some cards at 9400 without issue.


----------



## madweazl

Quote:


> Originally Posted by *gtbtk*
> 
> DX12 does not cope with as high VRAM overclocks as dx11. Timespy wont run with memory overclocks much more than about 9000mhz, Firestrike will run on some cards at 9400 without issue.


I run the memory at 9720 in both without issue. I do have to drop to 2152 on the core for Time Spy, whereas I can run 2164 in Fire Strike.


----------



## gtbtk

Quote:


> Originally Posted by *madweazl*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> DX12 does not cope with as high VRAM overclocks as dx11. Timespy wont run with memory overclocks much more than about 9000mhz, Firestrike will run on some cards at 9400 without issue.
> 
> 
> 
> I run the memory at 9720 in both without issue. I do have to drop to 2152 on the core for Time Spy when I can run 2164 in Fire Strike.

That is interesting. I can't get anything above 9400, and even then it artifacts all over the place. The general consensus around the time of the BIOS update was that such clocks were not possible. Maybe you lucked out with a magic card, it is down to some difference in the BIOS/default clocks of the card, or you hit exactly the right combination of a couple of different variables that work in harmony.

The other thing that is possible is a difference in PCIe communication/motherboard voltage delivery between Z68 PCIe 2.0 and Z170 PCIe 3.0. I know that after the BIOS update and some motherboard tweaks I have improved my VRAM overclockability by about 50% since my card was new.

As a matter of interest, what exact settings are you running on your card, and what voltage settings have you got on your motherboard?


----------



## madweazl

Quote:


> Originally Posted by *gtbtk*
> 
> That is interesting. I cant get anything above 9400. even then it is artifacting all over the place. The general consensus at about time of the bios update, that was not the case. Maybe you lucked out wit a magic card, it is down to some difference in the bios/default clocks of the card, or possibly getting the exact right combination of a couple of different variables that work in harmony.
> 
> The other thing that is possible is in the differences in PCI-e communication/motherboard voltage delivery between z68 PCIe 2.0 and Z170 PCIe 3.0. I know that after the bios update and some motherboard tweaks I have improved my vram overclockability by about 50% since my card was new.
> 
> As a matter of interest, what exact settings are you running on your card and what voltage settings have you got running on your motherboard


I have an untouched Founders aside from the water (voltage slider maxed in AB). The motherboard settings are all set to Auto for PCIe (FCLK is set to 1000MHz instead of the default 800MHz, but that shouldn't make any difference for the OC).


----------



## icold

On high overclocks my GPU freezes before it shows artifacts. Is this a problem?


----------



## zipper17

Quote:


> Originally Posted by *madweazl*
> 
> I run the memory at 9720 in both without issue. I do have to drop to 2152 on the core for Time Spy when I can run 2164 in Fire Strike.


Is your card air-cooled or water-cooled?

I'm curious whether 21-22k Fire Strike graphics scores actually require water cooling or not.


----------



## gtbtk

Quote:


> Originally Posted by *madweazl*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> That is interesting. I cant get anything above 9400. even then it is artifacting all over the place. The general consensus at about time of the bios update, that was not the case. Maybe you lucked out wit a magic card, it is down to some difference in the bios/default clocks of the card, or possibly getting the exact right combination of a couple of different variables that work in harmony.
> 
> The other thing that is possible is in the differences in PCI-e communication/motherboard voltage delivery between z68 PCIe 2.0 and Z170 PCIe 3.0. I know that after the bios update and some motherboard tweaks I have improved my vram overclockability by about 50% since my card was new.
> 
> As a matter of interest, what exact settings are you running on your card and what voltage settings have you got running on your motherboard
> 
> 
> 
> I have an untouched Founder's aside from the water (voltage slider maxed in AB). The motherboard settings are all set to auto for PCIe (FCLK is set to 1000mhz vice the default of 800mhz but that shouldnt make any difference for the OC).

thanks.


----------



## owikhan

Please guide me on curve settings for my ASUS Dual OC 1070 GPU.

A picture would make it much easier for me to follow.

Thanks in advance


----------



## madweazl

Quote:


> Originally Posted by *zipper17*
> 
> Your card is it on Aircooled or Watercooled?
> 
> I'm curious does 21-22k FS graphic scores actually need watercooled or not?


It is on water; I didn't pick up much on the core, but I did pick up roughly 110MHz on the memory slider after the switch (max temps dropped from 64°C to 41°C). I may try the BIOS flash recommended earlier to see if I can squeeze out a little more.


----------



## gtbtk

Quote:


> Originally Posted by *owikhan*
> 
> Please guide me curve settings for my ASUS DUAL OC 1070 Gpu
> 
> if with picture then too much easy for me to do
> 
> Thanks in advance


It will be unique to your card.

Try the slider first and see how high you can get it; that will give you a baseline to start playing from. From that high point, you can start pulling up individual curve points, one at a time in 12MHz steps, and see whether it crashes.

I have found that you end up with the best performance if you concentrate on the 0.975v point and the highest voltage point you have enabled with the voltage slider.
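The trial-and-error loop described above can be sketched in Python. Everything here is illustrative: the baseline, the step count, and the pass/fail check are made-up stand-ins for real Heaven/Fire Strike runs on your own card.

```python
# Hypothetical sketch of the curve-tuning search described above:
# start from a slider-stable baseline, then raise the target clock in
# 12 MHz steps until a (simulated) stability test fails.

def candidate_clocks(baseline_mhz, steps=5, step_mhz=12):
    """Clocks to try above the slider-stable baseline, in 12 MHz steps."""
    return [baseline_mhz + step_mhz * i for i in range(1, steps + 1)]

def highest_stable(baseline_mhz, is_stable, steps=5):
    """Walk upward in 12 MHz steps; return the last clock that passed."""
    best = baseline_mhz
    for clock in candidate_clocks(baseline_mhz, steps):
        if not is_stable(clock):  # in reality: run a benchmark, watch for artifacts
            break
        best = clock
    return best

# Pretend this particular card artifacts above 2126 MHz:
print(highest_stable(2088, lambda c: c <= 2126))  # prints 2124
```

In practice `is_stable` is you running a benchmark pass and watching for artifacts or a crash; the script only captures the bookkeeping.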


----------



## icold

I'm trying to edit the curve again to refine my OC; my last attempt ended in failure and annoyed me.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> I trying editing curve again to refine OC, in the last trying I ended up failing and annoying me


If you edit a curve, start from the default curve each time. The fine line that appears alongside the heavy line shows the offset applied at each voltage point, and the offsets are additive if you keep tweaking. Fine line above the heavy line = negative offset, no fine line = no offset from the stock level, and fine line below = positive offset.

The end result can be confusing, because the heavy line that looks like the actual curve is not actually the curve unless you take the fine-line offsets into account. If you don't understand the meaning of the fine line on the graph, two diagrams can look the same yet actually be completely different.
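The additive behavior can be modelled in a few lines of Python. The stock curve values here are invented purely for illustration, not read from any real card:

```python
# Toy model of Afterburner's additive curve offsets: the heavy line you
# see is stock + accumulated per-point offsets, so repeated tweaks stack
# unless you reset to the default curve first.
stock = {0.975: 1900, 1.043: 2000, 1.093: 2050}  # invented stock curve (V -> MHz)

def apply_offset(offsets, voltage, delta_mhz):
    """Each tweak adds to whatever offset is already stored at that point."""
    offsets[voltage] = offsets.get(voltage, 0) + delta_mhz
    return offsets

offsets = {}
apply_offset(offsets, 1.093, 25)
apply_offset(offsets, 1.093, 25)  # a second +25 stacks on the first
effective = {v: mhz + offsets.get(v, 0) for v, mhz in stock.items()}
print(effective[1.093])  # prints 2100, not 2075
```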


----------



## xxpantherrrxx

Having a bit of an issue. Recently my GTX 1070 FE would not idle; it stayed at 1506MHz at all times. I've tried the latest drivers, changing power modes, and rebooting several times. Nothing. It reports 33% power usage and stays clocked at 1506MHz. Any ideas?

EDIT: I found the issue: Desktop Instant Replay was enabled. As soon as I disabled it, the card idled as normal.


----------



## shilka

Got my Gigabyte GTX 1070 Xtreme Gaming today, and so far during my testing I am very happy with the card.
It's quieter than the old Gigabyte GTX 970 G1 Gaming I had before, both at idle thanks to the fan-off mode and under load thanks to the stacked triple-fan design.

So far the only games I have tested at 1440p that can't run at 60 FPS or higher are Deus Ex: Mankind Divided and Ashes of the Singularity.
Turn a few settings down from the highest to the second highest and those games run much better.

So far I am seeing double the FPS in some games going from a GTX 970 to a GTX 1070,
and I'm getting 22-23 FPS more in Heaven 4.0 and Valley 1.0.

I moved the new card to the second PCIe slot instead of the top one, as it's easier to reach the PCIe port lock in the second slot.


----------



## gerardfraser

Quote:


> Originally Posted by *owikhan*
> 
> Please guide me curve settings for my ASUS DUAL OC 1070 Gpu
> 
> if with picture then too much easy for me to do
> 
> Thanks in advance


Here is a curve I have been using that works fine.

Voltage set to 1.02v
Core clock settles at 2038MHz
Micron memory set to 9000MHz

Game ran for 30 minutes with a max temp of 54°C and a max fan speed of 58%.

https://postimg.org/image/lj910tz7f/


----------



## khanmein

@shilka I wonder why you didn't go for a Titan X, since I bet you could afford it with all the stuff you have.


----------



## pez

Quote:


> Originally Posted by *shilka*
> 
> Got my Gigabyte GTX 1070 Xtreme Gaming today and so far duing my testing i am very happy with my card
> Its more quiet then the old Gigabyte GTX 970 G1 Gaming i had before both during ilde due to the fan off mode and during loads due to the stacked triple fan design
> 
> So far the only games i have tested in 1440P that cant be run at 60 FPS or higher is Deus Ex Mankind Divided and Ashes Of The Singularity
> Turn a few settings down from the the highest to the second highest and the games run much better
> 
> So far i am seeing double the FPS in some games going from a GTX 970 to a GTX 1070
> Am getting 22-23 FPS more in Heaven 4,0 and Valley 1,0
> 
> Moved my new video card to the second PCI-E slot instead of the top one as its easier to get to the PCI-E port lock in the second slot


Well, I wouldn't feel bad about either of those games not hitting 60+ FPS. Also, do you actually play AotS, or do you just use it as a benchmark?


----------



## shilka

Quote:


> Originally Posted by *khanmein*
> 
> @shilka i wonder y u didn't go for Titan X since i bet u can afford it with this all stuff u had with u.


Here where I live we have 25% VAT, so a Titan X is $1700 vs $577 for a Gigabyte GTX 1070 Xtreme.
Even if I had $1700, which I don't, I would never spend that much on a video card; a Gigabyte GTX 1080 Xtreme would have been $858 by comparison, and that's too high as well.

I don't have a ton of money; what I have is sky-high prices, and it took me almost half a year to save up for a GTX 1070.
I could have waited another 2-3 months and gotten a GTX 1080 or GTX 1080 Ti, but I got fed up with both my GTX 970 and the waiting, so I just bought a GTX 1070.

Pretty happy with it so far
Quote:


> Originally Posted by *pez*
> 
> Well I wouldn't feel bad for neither of those games running at 60+ FPS. Also, do you actually play AotS or use it for a benchmark?


I do plan on playing it, not just using the benchmark.
With a GTX 970 the game ran too poorly, so I decided to hold off on playing it until I got a better video card.


----------



## pez

Quote:


> Originally Posted by *shilka*
> 
> Here where i live we have 25% VAT so a Titan X is $1700 vs $577 for a Gigabyte GTX 1070 Xtreme
> Even if i had $1700 which i dont i would never spend that much on a video card, a Gigabyte GTX 1080 Xtreme would have been $858 compared and thats too high as well
> 
> I dont have a ton of money what i have is skyhigh prices as it took me almost half a year to save up to a GTX 1070
> Could have waited another 2-3 months and gotten a GTX 1080 or GTX 1080 Ti but i got fed up with both my GTX 970 and wating so i just bought a GTX 1070
> 
> Pretty happy with it so far
> I do plan on playing it and not just use the benchmark
> With a GTX 970 the game ran too poorly so decided to wait with playing it untill i got a better video card.


Nice. You might be the first person I've ever seen admit that they were going to play it.


----------



## shilka

Quote:


> Originally Posted by *pez*
> 
> Nice
> 
> 
> 
> 
> 
> 
> 
> . You might be the first person I've ever seen admit that they were going to play it
> 
> 
> 
> 
> 
> 
> 
> .


If not for the fact that I need to pay $800 to the IRS here in March, I would have gotten a GTX 1080 instead.
I need to get a 1TB SSD and a copy of Windows 10 as well, since I am still using Windows 7, and those $800 were meant for that.

Damn the IRS to hell; after almost two years I hear from them that since I had extra work back in 2015 I did not pay enough tax and need to pay them back.
$800 out the window, which in itself is annoying, but two years later? Really fast work right there!

Now I need to start from scratch saving up for a new SSD and a copy of Windows 10.


----------



## pez

Quote:


> Originally Posted by *shilka*
> 
> If not for the fact that i need to pay $800 to the IRS here in march i would have gotten a GTX 1080 instead
> Need to get a 1 TB SSD and a copy of Windows 10 as well since i am still using Windows 7 and those $800 where meant for that
> 
> Damm the IRS to hell after almost two years i hear from them that since i had extra work back in 2015 i did not pay enough tax and i need to pay them back
> $800 out the window which in itself is annoying but two years later? really fast work right there!
> 
> Need to start from skratch saving up for a new SSD and a copy of Windows 10.


That does suck indeed. Although, don't feel bad at all for getting a 1070. A 1070 handles its business on pretty much anything at non-4K; 21:9 is a bit sketchy with it, but still doable. Overall a solid choice if you ask me. Just like the 970, the 1070 can give 99% of the people out there the performance they are looking for.

Also, I'm sending you a PM.


----------



## dallemon

Quote:


> Originally Posted by *shilka*
> 
> If not for the fact that i need to pay $800 to the IRS here in march i would have gotten a GTX 1080 instead
> Need to get a 1 TB SSD and a copy of Windows 10 as well since i am still using Windows 7 and those $800 where meant for that
> 
> Damm the IRS to hell after almost two years i hear from them that since i had extra work back in 2015 i did not pay enough tax and i need to pay them back
> $800 out the window which in itself is annoying but two years later? really fast work right there!
> 
> Need to start from skratch saving up for a new SSD and a copy of Windows 10.


Just wanted to say that if you have a license for Win 7, you can most likely use it for an install of Win 10.
And also, yeah, the tax office is always slow like that, except when they're missing several million; then they tend not to say anything unless they get caught :-D


----------



## rfarmer

https://www.humblebundle.com/pc-lovers-software-bundle

Humble has a PC lovers bundle, you can get 3DMark for $5.33.


----------



## shilka

Quote:


> Originally Posted by *dallemon*
> 
> Just wanted to say that if you have a license for Win 7 you can most likely use it for an install of Win 10.
> And also yeah, the tax office is always slow like that, except for when they are missing several million then they tend to not say anything unless they get caught :-D


As far as I am aware you can't with Win 7,
but I do think you can with Win 8, which I skipped because I despise it so much.

Is Win 10 better, or is it just hype? I really don't want to move from Win 7, but I can't keep using it forever like an old dinosaur.
I would like to use DX12 now that I have games that support it.

Edit: on another note, I am almost done with all my testing and benchmarks, so does anyone know how to make graphs?
And no, I don't have Excel, nor would I know how to use it if I had it. Could someone help me out with making graphs?

Thanks


----------



## zipper17

I've used both Win 7 and then Win 10; pretty much the same, I think. Got it via the free upgrade. The Start Menu UI and navigation seem a little easier on Win 7.

Win 10 has a fancy auto-changing wallpaper on the login screen, lol.

Btw, just installed the latest driver, 378.66; Fire Strike graphics scores are slightly slower than on 378.49, or maybe it's just margin of error.


----------



## the.hollow

Quick question: can you still enable the LED breathing effects on the 1070 Founders Edition? I know I used to be able to do it, but I haven't been able to find the option since the card first came out, basically. Maybe I'm missing something, or it was just done away with, lol.


----------



## xGeNeSisx

Quote:


> Originally Posted by *the.hollow*
> 
> Quick question, can you still enable the LED breathing effects on the 1070 founders edition. I know I used to be able to do it, but haven't been able to find the option since card first came out basically. Maybe I'm missing something or it was just done away with lol.


When GFE was updated from version 2, the option to change LED lighting was removed and never added back into the control panel. Nvidia released a standalone package to change the LED lighting. I do not have a link, but I remember the package being linked on the /r/nvidia subreddit.


----------



## the.hollow

Thanks, I'll look in to that then.


----------



## xGeNeSisx

Quote:


> Originally Posted by *dallemon*
> 
> Just wanted to say that if you have a license for Win 7 you can most likely use it for an install of Win 10.
> And also yeah, the tax office is always slow like that, except for when they are missing several million then they tend to not say anything unless they get caught :-D


Quote:


> Originally Posted by *rfarmer*
> 
> https://www.humblebundle.com/pc-lovers-software-bundle
> 
> Humble has a PC lovers bundle, you can get 3DMark for $5.33.




Thanks for the link! Missed the Steam sale when it was ~$5. Ended up going for the $12 deal as I would like to compare Dashlane to Lastpass


----------



## asfaR

Hi,
I have an ASUS Strix OC and this is as far as I could reach.

+50 core
+800 mem










I guess this is limited by the silicon, but I would like to reach at least 50MHz more on the core clock.


----------



## pez

I think it'd be worth it to not OC your memory and start with your GPU core (unless you did that already). You'll see more benefit from higher GPU core clocks than mem clocks.


----------



## tensoisso

This guy on YouTube (Nicholas Peyton) found a way to overvolt a GTX 1080 to 1.2v. Can anyone identify what software he used to do so?


----------



## DeathAngel74

eVGA Classy voltage tool on a 1080 Classified....


----------



## asfaR

Quote:


> Originally Posted by *pez*
> 
> I think it'd be worth it to not OC your memory and start with your GPU core (unless you did that already). You'll see more benefit from higher GPU core clocks than mem clocks.


Yes, that's where I started: +100 on the core with +0 on memory, and artifacts and power cuts appeared. Even +75 is not stable in 3DMark.

I will try to increase the core a little while reducing the memory frequency.


----------



## dallemon

Quote:


> Originally Posted by *shilka*
> 
> As far as i am aware you cant with Win 7
> But i do think you can for Win 8 which i skipped because i despise it so much
> 
> Is Win 10 better or is it just hype? really dont want to move from Win 7 but i cant keep using it forever like an old dinosaur
> Would like to use DX12 now that i have games that work with it
> 
> Edit: on another note i am almost done doing all my testing and benchmarks so does anyone know how to make graphs?
> And no i dont have Excel or know how to use it if i had it, could use some help making graphs if anyone can help me out?
> 
> Thanks


I do it for customers all the time at work; it works without issue almost every time. Just do a clean install and type in your key when it asks for one, and make sure you are using the latest ISO from the Media Creation Tool.


----------



## dallemon

BTW, still rocking that Zotac AMP Extreme BIOS on my Strix OC, but it hits a limit at 240W+ and starts throttling. So I need to find a good balance between memory OC and core OC, as a high memory OC will sometimes push FPS so high that it increases power usage, which throttles the core and then reduces FPS.


----------



## gtbtk

Quote:


> Originally Posted by *asfaR*
> 
> Hi,
> I have an ASUS Strix OC and this is as far as I could reach.
> 
> +50 core
> +800 mem
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I guess this is limited by silicon but I would like to reach at least 50 mhz more in core clock.


Remember that you are already starting with a +127 overclock over the reference clock; your +50 is already +177 over reference.

Use the curve rather than the slider and it will get your boost clock significantly higher. You should not have any problem running in the 2100-2126MHz range. You can try leaving the +50, then opening the curve window and raising the 1.093v point up to 2114 or 2126 and seeing how you go (assuming you have increased the voltage slider by +100).


----------



## gtbtk

Quote:


> Originally Posted by *shilka*
> 
> Quote:
> 
> 
> 
> Originally Posted by *dallemon*
> 
> Just wanted to say that if you have a license for Win 7 you can most likely use it for an install of Win 10.
> And also yeah, the tax office is always slow like that, except for when they are missing several million then they tend to not say anything unless they get caught :-D
> 
> 
> 
> As far as i am aware you cant with Win 7
> But i do think you can for Win 8 which i skipped because i despise it so much
> 
> Is Win 10 better or is it just hype? really dont want to move from Win 7 but i cant keep using it forever like an old dinosaur
> Would like to use DX12 now that i have games that work with it
> 
> Edit: on another note i am almost done doing all my testing and benchmarks so does anyone know how to make graphs?
> And no i dont have Excel or know how to use it if i had it, could use some help making graphs if anyone can help me out?
> 
> Thanks

What are you trying to graph?

If you are using AIDA, HWiNFO or GPU-Z to log the activity, you can use Generic Log Viewer and it will create the graphs with a few clicks.

User guide here:

https://www.hwinfo.com/forum/Thread-LogViewer-for-HWINFO-is-available

Download from here:

https://www.hwinfo.com/files/GenericLogViewer/GenericLogViewer_v3.1.zip
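If you would rather avoid Excel and viewers altogether, a CSV log like the ones those tools produce can be summarized in a few lines of Python. The header strings below are assumptions; real GPU-Z/HWiNFO logs name their columns slightly differently, so adjust them to match your file:

```python
# Minimal sketch: summarize one column of a GPU-Z/HWiNFO-style CSV log.
import csv
import io

# Stand-in for a real log file; the column names are assumptions.
sample_log = """Date,GPU Core Clock [MHz],GPU Temperature [C]
2017-02-15 20:00:01,2012.0,41
2017-02-15 20:00:02,2025.5,43
2017-02-15 20:00:03,2012.0,44
"""

def summarize(log_text, column):
    """Return (min, max, average) for one numeric column of the log."""
    rows = csv.DictReader(io.StringIO(log_text))
    values = [float(row[column]) for row in rows]
    return min(values), max(values), sum(values) / len(values)

lo, hi, avg = summarize(sample_log, "GPU Core Clock [MHz]")
print(lo, hi, avg)  # prints 2012.0 2025.5 2016.5
```

For a real run, replace `sample_log` with the contents of your actual log file and feed the numbers to whatever plotting tool you like.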


----------



## asfaR

Quote:


> Originally Posted by *gtbtk*
> 
> remember that you are already starting with a +127 overclock over the reference clock. your +50 is already +177 over reference
> 
> Use the curve and not the slider and it will get your boost clock significantly higher. You should not have any problem running in the range 2100-2126Mhz. you can try leaving the +50 then opening the curve windows and increase the 1.093v point up to 2114 or 2126 and see how you go. (assuming you have increased the voltage +100)


Thank you for your reply.

I tried, but Fire Strike crashes with this config. Should I change anything in Afterburner?


----------



## shilka

Quote:


> Originally Posted by *gtbtk*
> 
> What are you trying to graph?
> 
> If you are using Aida, HWinfo or GPU-z to log the activity, you can use generic log viewer and it will create the graphs with a few clicks
> 
> User guide here
> 
> https://www.hwinfo.com/forum/Thread-LogViewer-for-HWINFO-is-available
> 
> download from here
> 
> https://www.hwinfo.com/files/GenericLogViewer/GenericLogViewer_v3.1.zip


I have the FPS numbers written down in a Notepad doc, and that's all I have right now.


----------



## ranillo

Here's mine, if it can help you.

setting1.png 543k .png file


----------



## gtbtk

Quote:


> Originally Posted by *asfaR*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> remember that you are already starting with a +127 overclock over the reference clock. your +50 is already +177 over reference
> 
> Use the curve and not the slider and it will get your boost clock significantly higher. You should not have any problem running in the range 2100-2126Mhz. you can try leaving the +50 then opening the curve windows and increase the 1.093v point up to 2114 or 2126 and see how you go. (assuming you have increased the voltage +100)
> 
> 
> 
> Thank you for your reply.
> 
> I try but testing with Fire Strike crash with this config. Should I change anything in afterburner?

Set the Nvidia Control Panel to high performance mode for the 3DMark application. Try disabling the Nvidia audio device in Device Manager if you are not using it.

HWiNFO64 is a good tool for monitoring everything in your PC; you can also set it to feed extra monitoring values into the Afterburner OSD.

All cards/PCs behave differently, so you have to experiment to find what is right for your rig. The only way you will get the right answer is to experiment.

Try reducing the slider from +50 to +25 and then adjusting the curve at 1.093v the way you have here. If that doesn't work, leave the slider at 0 and adjust the curve. Try the same thing with the curve adjusted to 2088 or 2075 and see if that is stable.

You can also try leaving the voltage slider at 0 and adjusting the 1.063v point instead of 1.093v, or setting the voltage slider to about +30 and adjusting the 1.075v point.

I have also found that the curve method gives me a benefit if I raise the 0.975v point on the curve to 1999-2025 as well as raising the 1.093/1.063v point.

Firestrike loves a memory overclock; Timespy not as much. A +500 memory OC is good, +600 really good, +700 great, +800 fantastic.

Motherboard settings can also have an effect. If you are getting BSODs, either 0x124 WHEA or Watchdog Timeout, the VCore (CPU voltage) probably needs a slight increase, or the load-line calibration may need adjustment to stop voltage droop.

Also look at the VRM frequency and phase control settings. It's best to get advice specific to your board from an OC guide, as I don't know your motherboard and cannot give you good advice on exact settings. When I first got my 1070, my VRM settings were set to Extreme; turning them down to the Optimized setting, and even trying VRM spread spectrum mode (not CPU spread spectrum), helped with stability.

If you have VCCIO and/or System Agent (VCCSA) voltages on Auto, you may find improvements if you adjust them a little above the default settings. Don't make big adjustments; one small step at a time, then test.


----------



## gtbtk

Quote:


> Originally Posted by *shilka*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> What are you trying to graph?
> 
> If you are using Aida, HWinfo or GPU-z to log the activity, you can use generic log viewer and it will create the graphs with a few clicks
> 
> User guide here
> 
> https://www.hwinfo.com/forum/Thread-LogViewer-for-HWINFO-is-available
> 
> download from here
> 
> https://www.hwinfo.com/files/GenericLogViewer/GenericLogViewer_v3.1.zip
> 
> 
> 
> I have the FPS numbers written down in a notepad doc and thats all i have right now.

GPU-Z can be used to log all the different sensor values that you probably want to look at to start with. Try using that with the log viewer I linked to above.

Enable logging, do a benchmark or gaming run or what you are trying to test. stop the logging, open the log file in the viewer and it will graph all diffferent sensor setting over the time of the run.

That viewer will allow you to open 2 different log files for comparison if you want.

The following is the GPU-Z log for the firestrike run you can see here http://www.3dmark.com/3dm/18006945?
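If you'd rather pull numbers out of a log without the viewer, a few lines of Python will do it. The excerpt below is hypothetical and only approximates GPU-Z's comma-separated log style; real logs carry many more sensor columns, and the exact headers depend on your card.

```python
import csv
import io

# Tiny made-up excerpt in a GPU-Z-like CSV layout (real logs differ).
log = """Date , GPU Core Clock [MHz] , GPU Temperature [C]
2017-02-20 10:00:01 , 1999.0 , 41.0
2017-02-20 10:00:02 , 2012.5 , 43.0
2017-02-20 10:00:03 , 1987.0 , 45.0
"""

reader = csv.reader(io.StringIO(log))
header = [h.strip() for h in next(reader)]
rows = [dict(zip(header, (v.strip() for v in row))) for row in reader if row]

# Summarize one sensor column over the run:
clocks = [float(r["GPU Core Clock [MHz]"]) for r in rows]
print("peak:", max(clocks), "floor:", min(clocks))
```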


----------



## npcizzy

MSI swapped out my GTX 970 for a GTX 1070 (cool!). It's an ARMOR OC edition, but I had the "TIGER" edition of the 970 before, so all is good. I can only get my 1070 to run at 2100MHz/1.093V for 10 seconds, LOL. It settles around 2025MHz at stock voltage. Has anyone with the same card had better luck? Looks like I'm going to have to scour over 700 pages.


----------



## pez

Quote:


> Originally Posted by *asfaR*
> 
> Yes, that's where I started, +100 on core with +0 on memory and artifacts and power cuts appeared. With +75 is not stable in 3D Mark.
> 
> I will try to increase a little bit the core reducing memory frecuency.


It looks like your card is auto-boosting and holding a constant 2000Mhz+ clock, so in the end you're still seeing a pretty 'normal' OC for your card.


----------



## kignt

I'm almost certain the 1070 downclocks above 54°C in steps of 12/13 MHz.
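Those alternating 12/13 MHz steps are what you get if the boost table is spaced in roughly 12.5 MHz bins rounded to whole MHz (the 12.5 MHz spacing is a community observation about Pascal, not an NVIDIA-documented figure). A quick sketch:

```python
def boost_ladder(top_clock_mhz, bins=6, bin_size=12.5):
    """Clock bins below a top boost clock, spaced ~12.5 MHz apart and
    rounded to whole MHz, which yields alternating 12/13 MHz drops."""
    return [int(top_clock_mhz - i * bin_size + 0.5) for i in range(bins)]

ladder = boost_ladder(2012)
print(ladder)                                       # [2012, 2000, 1987, 1975, 1962, 1950]
print([a - b for a, b in zip(ladder, ladder[1:])])  # [12, 13, 12, 13, 12]
```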


----------



## pez

Users that are on water have reported that the 10-series cards start to throttle as early as the low 40s.


----------



## syl1979

I am on air and I can tell you there is also one step of frequency reduction as low as 30°C.


----------



## g-lad21

Hello fellow 1070'ers,
I have a stutter every time I open Task View (Win+Tab). I've come to notice it's because of the idle clock of my EVGA 1070, which is too low, I guess.
If I put the power mode to performance mode in the NVIDIA Control Panel, the stutter is gone, but the clock is also higher than it should be at idle.

Does anyone know a way to change the idle clock without pushing it to 1500+ for no reason?
Thanks!


----------



## RyanRazer

Quote:


> Originally Posted by *g-lad21*
> 
> Hello fellow 1070'ers
> I have a stutter everytime open task view (Winkey+Tab), i came to notice its because of the idle clock of my evga 1070, which is too low i guess.
> if i put the power mode to performance mode on nvidia control panel, the stutter is gone, but the clock is also higher than it should be on idle.
> 
> does anyone know a way to change idle clock? without getting it to 1500+ for no reason?
> Thanks!


Sorry for not answering your question, but I can say that it's weird that stuttering occurs even at idle clocks. My Intel iGPU runs Task View just fine; I can't imagine a 1070 at idle being worse than that. I'm sure my 1070 is also at idle clocks when I open Task View, but I see no such stuttering.
I will test that for a fact when I get home: I'll make sure the GPU runs at its lowest clocks at idle and try switching to Task View.


----------



## g-lad21

Quote:


> Originally Posted by *RyanRazer*
> 
> Sorry for not answering your question but i can say that it's weird that stuttering occurs even at idle clocks. My iGPU from intel runs task view just fine. I can't imagine 1070 at idle being worse than that... I am sure my 1070 is also at idle clocks when i task view but i see no such stuttering.


It's a very small stutter, but I'm very sensitive to details.
The 1070 idle clock is around 200MHz, which is low. Task View isn't Battlefield, but it does require some GPU power; when the clock is higher than 1000MHz there is no stutter.
Thanks anyway.


----------



## madweazl

Quote:


> Originally Posted by *syl1979*
> 
> I am on air and i can tell you there is also one step of frequency reduction as low as 30degc


Not sure what is going on with yours, but I can confirm that nothing changes at or below 41°C because of temperature. It's very rare that my card goes above that (I can't recall it happening), so I can't say at what temperature throttling occurs. I was, however, playing with some settings last week and was able to observe some throttling with DSR turned up to 1440p and 4K resolutions. I suspect these are caused by power limits, as the same temps do not produce the throttling in other scenarios. In the picture below, the flat core clock was gaming at 1080p (Black Desert, all settings maxed with the exception of distances, which were left at default). The short section with the throttling core clocks (right of center) was when I switched DSR to 1440p, and the next one (furthest right) was when I switched DSR to 4K. The fluctuation was very small but present.


Spoiler: Warning: Spoiler!





__
https://flic.kr/p/RVbjz1


----------



## gtbtk

Quote:


> Originally Posted by *npcizzy*
> 
> MSI swapped out my GTX 970 for GTX 1070 (Cool!). It's an ARMOR OC edition but I had the "TIGER" edition of the 970 before so all is good. I can only get my 1070 to run at 2100Mhz/1.093Vcore for 10 seconds LOL. It settles around 2025 at stock voltage. Anyone with the same card had better luck? Looks like I'm going to have to scour over 700 pages.


For benchmarks, run the fan at 100%. For other high-performance usage, create a fan curve and keep the GPU temp as low as you can. Using the default silent curve will idle the GPU at 50-60°C and allow temps to get into the 70s. As temps rise, the GPU will downclock itself every couple of degrees.

With the fan curve I created, my card idles with a low fan speed and sits at about 30-32°C; the fan starts ramping at about 37°C and hits 100% at 57°C. Under full load I rarely, if ever, get over 60°C.
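A fan curve like that (quiet at idle, ramping from about 37°C, 100% at 57°C) amounts to simple linear interpolation between two points. The 25% idle duty cycle below is an assumption for illustration; the post doesn't give the actual idle fan speed.

```python
def fan_percent(temp_c, idle_pct=25, ramp_start=37, full_speed=57):
    """Linear fan curve: flat idle speed up to ramp_start, then a straight
    ramp reaching 100% at full_speed and staying there above it."""
    if temp_c <= ramp_start:
        return idle_pct
    if temp_c >= full_speed:
        return 100
    frac = (temp_c - ramp_start) / (full_speed - ramp_start)
    return int(idle_pct + frac * (100 - idle_pct) + 0.5)

for t in (30, 47, 60):
    print(t, fan_percent(t))  # 30 -> 25, 47 -> 63, 60 -> 100
```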

I can run stable at 2126-2114MHz through an entire Firestrike run with voltage at 1.093v. At 1.063v my card likes to settle at about 2088-2076MHz. Both of those are with the card overclocked using the voltage curve rather than the traditional slider.


----------



## gtbtk

Quote:


> Originally Posted by *g-lad21*
> 
> Hello fellow 1070'ers
> I have a stutter everytime open task view (Winkey+Tab), i came to notice its because of the idle clock of my evga 1070, which is too low i guess.
> if i put the power mode to performance mode on nvidia control panel, the stutter is gone, but the clock is also higher than it should be on idle.
> 
> does anyone know a way to change idle clock? without getting it to 1500+ for no reason?
> Thanks!


Unless you really need it, disable the Windows Game DVR if you are using Windows 10.

You can try using adaptive power mode in the NVIDIA Control Panel. It will boost up under a 3D load (in game, Heaven or 3DMark) the same way it does under high performance mode, but will stay at low idle clocks under an accelerated 2D load such as Chrome.


----------



## asfaR

Quote:


> Originally Posted by *gtbtk*
> 
> Set Nvidia control panel to high performance mode for 3dmark application. try disabling the Nvidia Audio component in device manager if you are not using it.
> 
> HWinfo64 is a good tool to monitor everything in your PC. you can also set it to include extra monitoring values in the Afterburner OSD
> 
> All cards/pcs behave differently so you have to experiment to find what is right for your rig. The only way you will get the right answer is to experiment
> 
> Try reduce the slider from +50 to +25 and then adjust the curve the at 1.093 the way you have here. If that doesnt work leave slider at 0 and adjust curve. Try the same thing and adjust curve to 2088 or 2075 and see if that is stable
> 
> You can also try leaving the voltage slider at 0 and adjusting the 1.063v point instead of 1.093v or adjust the voltage to about 30 and try adjusting the 1.075V point
> 
> I have also found the using the curve method gets a benefit for me if I increase the .975v point on the curve to 1999-2025 as well as increasing the 1.093/1.063v point
> 
> Firestrike loves memory overclock, Timespy not as much. +500 memory OC is good +600 really good, +700 Great, +800 fantastic
> 
> Motherboard settings can also have an effect. If you are getting BSOD, either 0x124 WHEA or Watchdog Timeout, the VCore (CPU Voltage) probably needs a slight increase or the load line calibration may need adjustment to stop voltage droop.
> 
> Also look at the VRM frequency and Phase control settings. Best to get specific advice on your board from an OC guide as I dont know your motherboard. I cannot give you good advice on the exact settings. When I first got my 1070, my VRM settings were set to extreme, turning them down to the optimized setting and even trying VRM spread spectum mode (Not CPU Spread Spectrum) helped with stability.
> 
> If you have VCCIO and or System Agent (VCCSA) voltages at auto, you may find improvements if you try and adjust them a little above the default settings. dont make big adjustments. one small step at a time and test


OK, I could reach 2113MHz at 1.093v (I had the slider on 55 and could reduce it to 50 while raising the 1.093v point to 2113MHz); at 2126MHz it crashes, so that's the limit for that voltage. What now? Should I test the other voltage points individually? I see some improvements, but it looks like I'm reaching the limit. I get more benefit from OCing the memory than the core (this is normal; the memory is completely stable at +770 (9548MHz)).

Something I've seen is that the frequency at 1.093v is not always 2113MHz but sometimes 2088 or 2101; I guess that fluctuation is normal.

The CPU is an i7 7700K at default. The only change I made to the motherboard (Maximus IX Formula) was raising the RAM frequency and latency to their rated values (3200 CL14), that's all. I'm not getting any BSODs; the tests I'm running are Heaven and 3DMark (Fire Strike, Fire Strike Ultra, Time Spy...) and they just crash or simply draw some artifacts.

I'll keep testing.


----------



## KedarWolf

Quote:


> Originally Posted by *xGeNeSisx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *dallemon*
> 
> Just wanted to say that if you have a license for Win 7 you can most likely use it for an install of Win 10.
> And also yeah, the tax office is always slow like that, except for when they are missing several million then they tend to not say anything unless they get caught :-D
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *rfarmer*
> 
> https://www.humblebundle.com/pc-lovers-software-bundle
> 
> Humble has a PC lovers bundle, you can get 3DMark for $5.33.
> 
> Click to expand...
> 
> Quote:
> 
> 
> 
> Originally Posted by *rfarmer*
> 
> https://www.humblebundle.com/pc-lovers-software-bundle
> 
> Humble has a PC lovers bundle, you can get 3DMark for $5.33.
> 
> Click to expand...
> 
> Thanks for the link! Missed the Steam sale when it was ~$5. Ended up going for the $12 deal as I would like to compare Dashlane to Lastpass
Click to expand...

I went for the $12 bundle; I was looking for PCMark 8 cheap, and now am using Dashlane as well.

Have 3DMark already, so I'm giving the key away.

*Looks around for a peep who likes free stuff.*


----------



## b0uncyfr0

I'd love to take PCMark off your hands if you're feeling generous, KedarWolf.


----------



## rfarmer

Free is good, I paid $20 for mine.


----------



## KedarWolf

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Id love to take PCMark off your hands if you're feeling generous KedarWolf.


Not PCMark; I needed that. It's the 3DMark key I don't need.


----------



## pez

Quote:


> Originally Posted by *KedarWolf*
> 
> Not PCMark, I needed that, 3DMark I don't need the key.


I'd be happy to take 3DMark off your hands if that's the case and no one else has claimed it.


----------



## KedarWolf

Quote:


> Originally Posted by *pez*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Not PCMark, I needed that, 3DMark I don't need the key.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'd be happy to take 3DMark off your hands if that's the case and no one else claimed it
> 
> 
> 
> 
> 
> 
> 
> .
Click to expand...

I'll PM you.


----------



## gtbtk

Quote:


> Originally Posted by *asfaR*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Set Nvidia control panel to high performance mode for 3dmark application. try disabling the Nvidia Audio component in device manager if you are not using it.
> 
> HWinfo64 is a good tool to monitor everything in your PC. you can also set it to include extra monitoring values in the Afterburner OSD
> 
> All cards/pcs behave differently so you have to experiment to find what is right for your rig. The only way you will get the right answer is to experiment
> 
> Try reduce the slider from +50 to +25 and then adjust the curve the at 1.093 the way you have here. If that doesnt work leave slider at 0 and adjust curve. Try the same thing and adjust curve to 2088 or 2075 and see if that is stable
> 
> You can also try leaving the voltage slider at 0 and adjusting the 1.063v point instead of 1.093v or adjust the voltage to about 30 and try adjusting the 1.075V point
> 
> I have also found the using the curve method gets a benefit for me if I increase the .975v point on the curve to 1999-2025 as well as increasing the 1.093/1.063v point
> 
> Firestrike loves memory overclock, Timespy not as much. +500 memory OC is good +600 really good, +700 Great, +800 fantastic
> 
> Motherboard settings can also have an effect. If you are getting BSOD, either 0x124 WHEA or Watchdog Timeout, the VCore (CPU Voltage) probably needs a slight increase or the load line calibration may need adjustment to stop voltage droop.
> 
> Also look at the VRM frequency and Phase control settings. Best to get specific advice on your board from an OC guide as I dont know your motherboard. I cannot give you good advice on the exact settings. When I first got my 1070, my VRM settings were set to extreme, turning them down to the optimized setting and even trying VRM spread spectum mode (Not CPU Spread Spectrum) helped with stability.
> 
> If you have VCCIO and or System Agent (VCCSA) voltages at auto, you may find improvements if you try and adjust them a little above the default settings. dont make big adjustments. one small step at a time and test
> 
> 
> 
> Ok, I could reach 2113mhz on 1093 (I had the slider on 55 and I could reduce it to 50 and increase 1093 voltage to 2113mhz), on 2126 crash so that's the limit for that voltage. What now? Should I test the others voltages individualy? I see some improvements but looks like I'm reaching the limit, I have more benefits ocing the memory than core (this is normall, the memory is complete stable at +770 (9548mhz))
> 
> Something that I've seen is that the frecuency in 1093 is not 2113mhz but sometimes 2088, 2101, I guess that fluctuation is normal.
> 
> The CPU is an i7 7700k default, the only change that I did to the motherboard (Maximus IX Formula) is to increase the RAM frecuency and latency to its default (3200 CL14) that's all. I'm not getting any BSOD, the test that I'm making is on Heaven and 3D Mark (Fire Strike, Fire Strike Ultra, Game Spy...) and they just crash or simply draw some artifacts.
> 
> I keep going testing.
Click to expand...

There is no such thing as an absolute answer; every computer is different. Try them all.

3DMark does love memory overclocks, and +770 is very good, but you may want to install OCCT and run the GPU test with error checking. The memory does have some error correction, but at those speeds it is possible that errors are outweighing the performance improvements. I know some guys can run +800 on the memory but actually get better performance at +600 due to errors. The errors can also be the reason for intermittent benchmark failures that you can't otherwise explain.

I do not have any direct experience with Z270 so these suggestions are a bit generic but may inspire you.

Make sure that any CPU and memory overclocks are stable, because these will have an effect on GPU-at-2100MHz benchmark runs as well. Asus boards do overclock the CPU by default by setting an all-core turbo multiplier, and XMP can also change BCLK. I'm not sure about Z270, but raising BCLK generally overclocks the PCIe bus, which can cause instability. These Pascal cards put a lot more stress on the PCIe bus than even a Maxwell card, so they will more easily expose areas that are slightly out of tune.

The motherboard settings can be off by enough not to BSOD, yet under a heavy graphics load be enough for the graphics or a DX driver to crash. A small increase in VCCIO or VCCSA voltage may help stability with a high-speed memory installation like yours. Also pay attention to the VRM and load line calibration settings; turning things up to extreme doesn't always mean better, and having a large difference between Vcore and CPU VID, while it still runs, can impact performance of the GPU.

Did you see this? http://edgeup.asus.com/2017/01/31/kaby-lake-overclocking-guide/

I am running on Z68 with an i7-2600. When I first got my card in July last year, I got one of the first Micron cards on the market, and it came with the bonus Micron memory controller bug. The max I could do was a +400 memory OC before it crashed, and the best I could manage was about 14300 in Firestrike, with 20200 graphics and about 9500 physics. After I managed to get Nvidia to produce the bug-fix BIOS, and after a whole lot of motherboard setting tweaks, I can now get 15200 in Firestrike with 21000 graphics and 10500 physics, which is on par with an i5-6600. So whatever you do, don't just write everything off to the "silicon lottery".


----------



## JoeUbi

I really wish there was some sort of custom BIOS for this card. I am able to push 2164 Mhz stable on the core and 9840 Mhz on that sweet sweet Samsung memory. When I go higher I don't get any artifacting, just crashing.

http://www.3dmark.com/fs/11726453
Fire Strike 1.1
3DMark Score
18432
Graphics Score
21728

http://www.3dmark.com/spy/1231176
Time Spy 1.0
3DMark Score
6900
Graphics Score
6995


----------



## RyanRazer

Quote:


> Originally Posted by *JoeUbi*
> 
> I really wish there was some sort of custom BIOS for this card. I am able to push 2164 Mhz stable on the core and 9840 Mhz on that sweet sweet Samsung memory. When I go higher I don't get any artifacting, just crashing.
> 
> http://www.3dmark.com/fs/11726453
> Fire Strike 1.1
> 3DMark Score
> 18432
> Graphics Score
> 21728
> 
> http://www.3dmark.com/spy/1231176
> Time Spy 1.0
> 3DMark Score
> 6900
> Graphics Score
> 6995


Great OC, man.


----------



## zipper17

Quote:


> Originally Posted by *JoeUbi*
> 
> I really wish there was some sort of custom BIOS for this card. I am able to push 2164 Mhz stable on the core and 9840 Mhz on that sweet sweet Samsung memory. When I go higher I don't get any artifacting, just crashing.
> 
> http://www.3dmark.com/fs/11726453
> Fire Strike 1.1
> 3DMark Score
> 18432
> Graphics Score
> 21728
> 
> http://www.3dmark.com/spy/1231176
> Time Spy 1.0
> 3DMark Score
> 6900
> Graphics Score
> 6995


Is that still on air cooling, or on water?
Nice silicon chip, BTW.


----------



## EDK-TheONE

Quote:


> Originally Posted by *JoeUbi*
> 
> I really wish there was some sort of custom BIOS for this card. I am able to push 2164 Mhz stable on the core and 9840 Mhz on that sweet sweet Samsung memory. When I go higher I don't get any artifacting, just crashing.
> 
> http://www.3dmark.com/fs/11726453
> Fire Strike 1.1
> 3DMark Score
> 18432
> Graphics Score
> 21728
> 
> http://www.3dmark.com/spy/1231176
> Time Spy 1.0
> 3DMark Score
> 6900
> Graphics Score
> 6995


I have a Zotac also. Here's my result: http://www.3dmark.com/fs/11464285

SCORE
16 364 with NVIDIA GeForce GTX 1070(1x) and Intel Core i5-6600K
Graphics Score 21 859
Physics Score 9 735
Combined Score 8 781


----------



## dlewbell

Well, I've had my EVGA GTX 1070 FTW for less than 2 months, & I'm sending it in for warranty replacement. Thankfully nothing major, & it's usable while I wait for a cross-ship. My front fan appears to have a bearing issue; I get ticking noises when it spins. This was older stock with Micron memory. I did the thermal mod but never updated the BIOS. If I have time over the weekend, I may try overclocking as-is, run the BIOS update (just the one from EVGA), & try overclocking again to see what kind of difference it would have made. Before I do (if I do), I have a couple quick questions: are the master & slave BIOSes identical from the factory? Also, I assume I can only switch between the master & slave BIOSes with the PC shut down. Is this correct?


----------



## zipper17

Quote:


> Originally Posted by *EDK-TheONE*
> 
> I have zotac also. here my result: http://www.3dmark.com/fs/11464285
> 
> SCORE
> 16 364 with NVIDIA GeForce GTX 1070(1x) and Intel Core i5-6600K
> Graphics Score 21 859
> Physics Score 9 735
> Combined Score 8 781


Is that still on the air cooler, or already modified with a water cooler? I'm just curious. What's the temperature at full load?

By the way, you guys both have great silicon.


----------



## Pcnewbielol

Yo, I recently bought a GTX 1070 Strix OC graphics card.

I have a 144Hz 1920x1080 (1080p) monitor, and I see the card supports DVI-D, DisplayPort & 2x HDMI 2.0.

So how would I connect this card to my monitor to support 1080p @ 144Hz? I will be changing the refresh rate down to 120Hz via the NVIDIA Control Panel, though.

Also, I have no speakers, but the monitor has built-in ones, so sound would be good too, lol.

Would someone be kind enough to link all the necessary cables I would need for the picture/sound via Amazon or Newegg, please?

PS: big noob


----------



## Nukemaster

HDMI and DP both carry sound and picture at the same time.

With the latest version of HDMI you should be able to send 1920x1080 @ 120Hz.

HDMI cables are the most cost-effective and would be a good start, but not all monitors will accept high-refresh 1920x1080 over HDMI (my BenQ, for instance, will not, but some others will, and some will even do 2560x1440 over HDMI without issues). It is best to check the monitor specs.

I personally still use dual-link DVI, but I have no speakers and cannot stand the re-detect my monitor causes with DP (it also auto-switches source as soon as the computer sleeps; this is a monitor issue, not a DP issue).
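To see why the HDMI version matters for 1080p at high refresh, you can ballpark the required pixel clock. This is only a sketch: the 20% blanking overhead is an assumption (real timings such as CVT-RB land somewhat lower), while the ~340 MHz HDMI 1.4 TMDS limit and ~600 MHz HDMI 2.0 limit are the spec figures.

```python
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.2):
    """Very rough required pixel clock in MHz. Real blanking depends on the
    timing standard (CVT, CVT-RB, ...); 20% overhead is only a ballpark."""
    return width * height * refresh_hz * blanking_overhead / 1e6

clk = approx_pixel_clock_mhz(1920, 1080, 144)
print(round(clk), "MHz needed (approx)")
print("fits HDMI 1.4 (~340 MHz TMDS)?", clk <= 340)  # too tight at 144Hz
print("fits HDMI 2.0 (~600 MHz)?", clk <= 600)       # yes, as does DP 1.2
```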


----------



## JoeUbi

Quote:


> Originally Posted by *zipper17*
> 
> is that still on aircooling? or on water?
> nice silicon chip btw.


I'm still using stock cooling. The AMP Extreme's stock cooler is massive. I'm not home now, but if I remember correctly my temps don't go over 55°C at 100% fan.


----------



## EDK-TheONE

Quote:


> Originally Posted by *zipper17*
> 
> is that still on aircooler, or already modified with watercool?, im just curious. Temperature during Fullload?
> 
> btw you guys both have great silicon.


It's on air. I don't remember the temperature, but I am sure it's below 50°C (my room is cold).


----------



## DeathAngel74

I pulled the trigger... paid for my 1070 SC2 iCX upgrade and shipped the 1070 SC ACX back to EVGA today. Hopefully, if they get it by Tuesday, I'll have the new card by Friday or Monday. Paid for next-day air...


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> I pulled the trigger.....Paid for my 1070 SC2 iCX upgrade and shipped the 1070 SC ACX back to eVGA today. Hopefully, if they get it by Tuesday, I'll have the new card by Friday or Monday....Paid for next day air....










Are you serious? FYI, the iCX also uses Micron for its GDDR5. Can I ask whether it comes with free For Honor / Ghost Recon or not? Cheers.


----------



## DeathAngel74

I doubt it... we get a free shirt. I think the free game(s) are for people who buy iCX directly from evga.com; I don't think we are entitled to promotions with the iCX upgrade. The new warranty and eligibility for a new step-up are nice, though. Not sure if we can step up to other iCX cards.


----------



## gtbtk

Quote:


> Originally Posted by *JoeUbi*
> 
> I really wish there was some sort of custom BIOS for this card. I am able to push 2164 Mhz stable on the core and 9840 Mhz on that sweet sweet Samsung memory. When I go higher I don't get any artifacting, just crashing.
> 
> http://www.3dmark.com/fs/11726453
> Fire Strike 1.1
> 3DMark Score
> 18432
> Graphics Score
> 21728
> 
> http://www.3dmark.com/spy/1231176
> Time Spy 1.0
> 3DMark Score
> 6900
> Graphics Score
> 6995


Take a fresh look at your motherboard voltages, particularly the VRM frequency settings and load line calibration/Vcore. Vdroop and/or incorrect VRM settings can cause those symptoms; don't assume that "extreme" settings are always the best.

If that has not helped, also take a look at VCCIO and VCCSA and experiment with varying the values slightly. On my motherboard, increasing from my default of 1.05v to 1.075v helped my overclock settings.

You can also try disabling the NVIDIA audio controller in Device Manager if you are not using the audio.

The scores you posted are in the top 1% or so of all 1070 cards, so you don't have much room for improvement anyway.


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> I doubt it....we get a free shirt. I think the free game/s are for people that buy iCX directly from evga.com. I don't think we are entitled to promotions with the iCX upgrade. New warranty and eligibility for new step-up are nice though. Not sure if we can step-up to other iCX cards though.


Damn, I'm not in the US, but my lil bro is over there.


----------



## b0uncyfr0

So to test the memory for artifacts, we should run the OCCT benchmark and tick the option for errors. How long, though? The last time I ran OCCT on my 5850, it killed it.


----------



## JoeUbi

Quote:


> Originally Posted by *gtbtk*
> 
> take a fresh look at your motherboard voltages. Particularly the vrm frequency settings, load line calibration/vcore. vdroop and or incorrect VRM settings can cause those symptoms. dont assume that "extreme" settings are always the best.
> 
> If that has not helped, also take a look at vccio and vccsa and experiment with varying values slightly. on my motherboard, increasing from my default of 1.05v to 1.075v helped my overclock settings
> 
> You can also try disabling the nvidia audio controller in device manager if you are not using the audio.
> 
> the scores you posted are in the top 1% or so of all 1070 cards so you don't have much room for improvement anyway


Good tip, man. With your suggestion I was able to push the core clock to 2202MHz and the memory to 9868MHz! And completed each 3DMark test, no problemo.

Time Spy:
http://www.3dmark.com/spy/1237557
3DMark Score
6947
Graphics Score
7031

Fire Strike:
http://www.3dmark.com/fs/11747183
3DMark Score
18572
Graphics Score
21836

Fire Strike Extreme:
http://www.3dmark.com/fs/11747222
3DMark Score
9627
Graphics Score
10331

Fire Strike Ultra:
http://www.3dmark.com/fs/11747268
3DMark Score
5151
Graphics Score
5106


----------



## zipper17

Quote:


> Originally Posted by *JoeUbi*
> 
> Good tip man. With your suggestion I was able to push the core voltage to 2202 Mhz and the memory to 9868! And completed each 3dmark test no problemo.
> 
> Time Spy:
> http://www.3dmark.com/spy/1237557
> 3DMark Score
> 6947
> Graphics Score
> 7031
> 
> Fire Strike:
> http://www.3dmark.com/fs/11747183
> 3DMark Score
> 18572
> Graphics Score
> 21836
> 
> Fire Strike Extreme:
> http://www.3dmark.com/fs/11747222
> 3DMark Score
> 9627
> Graphics Score
> 10331
> 
> Fire Strike Ultra:
> http://www.3dmark.com/fs/11747268
> 3DMark Score
> 5151
> Graphics Score
> 5106


Do you pass each 3DMark stress test as well?

Performance needs to be matched by stability, to be really sure.


----------



## JoeUbi

I'm sure it's not 100% stable; however, I didn't notice any artifacts or weird stuff. Earlier in the day the temps peaked at 45°C; now they are around 60°C. So I might have to wait... or just water cool it, lol.


----------



## zipper17

Quote:


> Originally Posted by *JoeUbi*
> 
> I'm sure its not 100% stable, however I didn't notice any artifact or weird stuff. Earlier in the day the temps peaked at 45 C now they are around 60 C. So I might have to wait.... Or just water cool it. lol


FYI, I mean the 3DMark stress test, which is different from the 3DMark benchmark. If you have 3DMark Advanced, you get the stress test (you should pass with at least 97% frame rate stability).

I would have picked the Zotac AMP Extreme instead, but it would not fit in my case, because it is 12.8 inches long.

Galax EXOC = 11.65 inches / 296 mm
Zotac AMP Extreme = 12.8 inches / 325 mm

My Galax already barely fits in my case. I was lucky, lol.


----------



## owikhan

Which is the best NVIDIA driver for getting a good score in 3DMark Time Spy?

Windows 10 64-bit Pro edition

ASUS GTX 1070 Dual OC GPU


----------



## khanmein

Quote:


> Originally Posted by *owikhan*
> 
> which is best nvidia driver to get good score in 3d mark time spy?
> 
> windows 10 64-bit pro edition
> 
> Asus Gtx 1070 Dual Oc Gpu


378.72


----------



## owikhan

Quote:


> Originally Posted by *khanmein*
> 
> 378.72


Thanks for your quick response


----------



## DeathAngel74

Which is the best nVidia driver for overclocking/gaming stability?


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> Which is the best nVidia driver for overclocking/gaming stability?


I'm not sure about this one, but I think 368.69.


----------



## DeathAngel74

Thanks. Just wondering, since I DDU'd whatever was on the PC with the 1070 SC. Any other opinions?


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> thanks. just wondering since i ddu'd whatever was on the pc with the 1070sc. Any other opinions?


I always use DDU before a new WHQL. Currently I'm using 378.72, installed on top of 378.66. Seriously, for Pascal users, R378 is still the best as of Q1 2017; let's hope the next WHQL will be even better.


----------



## DeathAngel74

Can't I just install 378.72 and be done, since I already used DDU? No need to install on top of 378.66, right? Assuming another driver doesn't drop before I get the new 1070 SC2 iCX, lol.


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> can't I just install 378.72 and be done, since i already used ddu? no need to install on top of 378.66 right? assuming another driver doesn't drop before I get the new 1070 sc2 iCX, lol.


Of course you can, but I highly recommend a clean install when you get a new graphics card.

Yeah, you're right, and most likely 7th March for Tom Clancy's Ghost Recon: Wildlands.

I'm sticking with my ACX 3.0 since everything is working flawlessly and the temperature is also decent (my highest ambient temp is around 30-33°C).


----------



## DeathAngel74

thank you again


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> thank you again


You're welcome.

P.S "Several users mention issues in Google's Chrome web browser, among them a crash issue when skipping videos on YouTube. Text may also be bold on Google sites such as YouTube or Google Search after the update."

https://www.ghacks.net/2017/02/18/nvidia-releases-geforce-hotfix-driver-378-72/


----------



## DeathAngel74

Maybe I'll stick with 378.57 until it's fixed, then.


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> maybe i'll stick with 378.57 until its fixed then


The YouTube bugs have existed ever since I switched to Pascal, so every driver has the issue, but I've noticed the 378.xx series reduced it a lot.

So far I've had three freeze/unresponsive episodes when watching vids on YouTube, and a BSOD again.


----------



## gtbtk

Quote:


> Originally Posted by *b0uncyfr0*
> 
> So to test the memory for artifacts - we should run the OCCT benchmark and tick the option for errors. How long though? The last time i ran OCCT on my 5850 ; it killed it.


OCCT will start to show memory errors pretty quickly. The hardware will cope with a few errors, as it applies its own error correction. This is just from eyeballing the progression, but an error count that increases periodically one at a time is nothing to worry too much about; the hardware error correction can deal with that.

As you overclock higher, the number of errors per second increases, and you will start seeing the count jump by multiple errors at a time. That is when the error correction starts getting overwhelmed. It may still continue to run and appear OK during the benchmark, but you may find that at that point the many errors increase latency to such a degree that a slightly slower memory overclock actually gives you higher performance.
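In practice that means picking the offset with the best benchmark score, not the highest clock that survives. A minimal sketch with made-up numbers (hypothetical offset/score pairs, chosen only to illustrate the score falling off past the sweet spot):

```python
# Hypothetical (memory offset, graphics score) pairs: past some offset,
# error-correction retries eat the raw bandwidth gain and the score drops.
runs = [(400, 20900), (500, 21100), (600, 21300), (700, 21250), (800, 21050)]

best_offset, best_score = max(runs, key=lambda r: r[1])
print(best_offset, best_score)  # sweet spot here is +600, not the max offset
```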


----------



## owikhan

@JoeUbi

which nvidia driver u use for those benchmarks?


----------



## JoeUbi

Quote:


> Originally Posted by *owikhan*
> 
> @JoeUbi
> 
> which nvidia driver u use for those benchmarks?




378.49, I haven't updated to the latest yet.


----------



## kevindd992002

What does the "temp target" do in MSI AB?


----------



## blued

Temp target is the temperature at which throttling begins to occur.


----------



## kevindd992002

Quote:


> Originally Posted by *blued*
> 
> Temp target is when throttling begins to occur.


Why would one want to ever lower that value?


----------



## RyanRazer

Quote:


> Originally Posted by *kevindd992002*
> 
> Why would one want to ever lower that value?


Usually people increase that one for benchmarking or even gaming. With pascal i just leave it at stock; my GPU never gets near that temp.

You could lower it, however, if you wanted to run the card cooler for prolonged periods of time, for example for folding. If you were about to fold for a week or a month, you might want to keep it nice and cool. Again, pascal runs so cool i don't think you have to touch that. Mine was folding at 2000mhz and temps were circa 57C, which is far from the 80C or so that is set by default in AB.

You can also undervolt for temps ofc...


----------



## kevindd992002

Quote:


> Originally Posted by *RyanRazer*
> 
> Usually people increase that one for benchmarking or even gaming. With pascal i just leave it at stock, my GPU never reaches near that temp.
> 
> You could lower however if you wanted to run it cooler for prolonged periods of time, for example for folding. If you were about to fold for a week or month, you might want to keep it nice and cool. Again pascal runs so cool i don't think you have to touch that. Mine was folding at 2000mhz and temps were 57C cca, which ia far from 80C or something that is set by default in AB.
> 
> You can also undervolt for temps ofc...


Gotcha, thanks!


----------



## ahmedmo1

Anyone else OC the 1070 on their laptop? I dropped +200 MHz on the core and left the memory alone. Hope that's stable.


----------



## syl1979

Quote:


> Originally Posted by *ahmedmo1*
> 
> Anyone else OC the 1070 on their laptop? I dropped +200 MHz on the core and left the memory alone. Hope that's stable.


What is the effective boost you get under load? Can you show a picture of the gpuz sensor tab under load?

You may also try to undervolt the card, it should help on mobile


----------



## gtbtk

Quote:


> Originally Posted by *JoeUbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> take a fresh look at your motherboard voltages. Particularly the vrm frequency settings, load line calibration/vcore. vdroop and or incorrect VRM settings can cause those symptoms. dont assume that "extreme" settings are always the best.
> 
> If that has not helped, also take a look at vccio and vccsa and experiment with varying values slightly. on my motherboard, increasing from my default of 1.05v to 1.075v helped my overclock settings
> 
> You can also try disabling the nvidia audio controller in device manager if you are not using the audio.
> 
> the scores you posted are in the top 1% or so of all 1070 cards so you don't have much room for improvement anyway
> 
> 
> 
> Good tip man. With your suggestion I was able to push the core voltage to 2202 Mhz and the memory to 9868! And completed each 3dmark test no problemo.
> 
> Time Spy:
> http://www.3dmark.com/spy/1237557
> 3DMark Score
> 6947
> Graphics Score
> 7031
> 
> Fire Strike:
> http://www.3dmark.com/fs/11747183
> 3DMark Score
> 18572
> Graphics Score
> 21836
> 
> Fire Strike Extreme:
> http://www.3dmark.com/fs/11747222
> 3DMark Score
> 9627
> Graphics Score
> 10331
> 
> Fire Strike Ultra:
> http://www.3dmark.com/fs/11747268
> 3DMark Score
> 5151
> Graphics Score
> 5106
Click to expand...

Glad it helped you out.

While there is some minor variation in tolerances between different copies of the same hardware component, and that can be compounded by interactions with other components that have their own tolerance ranges, the Silicon Lottery gets such a bad rap when it is often actually motherboard voltage and VRM settings holding people back from reaching the peak performance potential of their hardware.

Unfortunately there is so much diversity in PC hardware, especially when we are talking about hardware that ranges over 5 generations back to Sandy Bridge or even earlier, that it is difficult, if not impossible, to give advice in absolute terms.

These pascal cards are such high performance devices that, particularly on older platforms, the motherboard infrastructure is being stressed at levels it never experienced with GTX 600, 700 and 900 level GPUs, so we are seeing behavior that sloppy settings never exposed under the lower loads. Some of the things I am discovering now seem to contradict some of the "facts" in the overclock guides that have been published over the years, even by vendors like ASUS, so it requires someone with an open mind, a willingness to experiment, and the realization that settings can always be reverted if an experiment doesn't work, to actually try it rather than argue with me. It is refreshing to see. After all these years of doing this sort of stuff, I should be used to it i guess.

I only have a z68 MB with an i7-2600 here to experiment with, so I am in need of all the help that overclocking can give me to hold me over until my next CPU/Motherboard upgrade. The conceptual design and function of the different voltage controls on these boards has not changed much between generations. As a matter of interest, what were your before and after bios settings? How have you clocked your CPU (multiplier, or multiplier and BCLK), and are you doing any custom memory overclocking etc?


----------



## gtbtk

Quote:


> Originally Posted by *kevindd992002*
> 
> What does the "temp target" do in MSI AB?


As mentioned above, it sets the point where it starts throttling the GPU.

The temp target is not so relevant on the partner cards with the massive multi-fan heatsinks. It is something that is useful if you are running a blower style card that tends to run hotter in the first place.


----------



## gtbtk

Quote:


> Originally Posted by *asfaR*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Set Nvidia control panel to high performance mode for 3dmark application. try disabling the Nvidia Audio component in device manager if you are not using it.
> 
> HWinfo64 is a good tool to monitor everything in your PC. you can also set it to include extra monitoring values in the Afterburner OSD
> 
> All cards/pcs behave differently so you have to experiment to find what is right for your rig. The only way you will get the right answer is to experiment
> 
> Try reduce the slider from +50 to +25 and then adjust the curve the at 1.093 the way you have here. If that doesnt work leave slider at 0 and adjust curve. Try the same thing and adjust curve to 2088 or 2075 and see if that is stable
> 
> You can also try leaving the voltage slider at 0 and adjusting the 1.063v point instead of 1.093v or adjust the voltage to about 30 and try adjusting the 1.075V point
> 
> I have also found the using the curve method gets a benefit for me if I increase the .975v point on the curve to 1999-2025 as well as increasing the 1.093/1.063v point
> 
> Firestrike loves memory overclock, Timespy not as much. +500 memory OC is good +600 really good, +700 Great, +800 fantastic
> 
> Motherboard settings can also have an effect. If you are getting BSOD, either 0x124 WHEA or Watchdog Timeout, the VCore (CPU Voltage) probably needs a slight increase or the load line calibration may need adjustment to stop voltage droop.
> 
> Also look at the VRM frequency and Phase control settings. Best to get specific advice on your board from an OC guide as I dont know your motherboard. I cannot give you good advice on the exact settings. When I first got my 1070, my VRM settings were set to extreme, turning them down to the optimized setting and even trying VRM spread spectum mode (Not CPU Spread Spectrum) helped with stability.
> 
> If you have VCCIO and or System Agent (VCCSA) voltages at auto, you may find improvements if you try and adjust them a little above the default settings. dont make big adjustments. one small step at a time and test
> 
> 
> 
> Ok, I could reach 2113mhz on 1093 (I had the slider on 55 and I could reduce it to 50 and increase 1093 voltage to 2113mhz), on 2126 crash so that's the limit for that voltage. What now? Should I test the others voltages individualy? I see some improvements but looks like I'm reaching the limit, I have more benefits ocing the memory than core (this is normall, the memory is complete stable at +770 (9548mhz))
> 
> Something that I've seen is that the frecuency in 1093 is not 2113mhz but sometimes 2088, 2101, I guess that fluctuation is normal.
> 
> The CPU is an i7 7700k default, the only change that I did to the motherboard (Maximus IX Formula) is to increase the RAM frecuency and latency to its default (3200 CL14) that's all. I'm not getting any BSOD, the test that I'm making is on Heaven and 3D Mark (Fire Strike, Fire Strike Ultra, Game Spy...) and they just crash or simply draw some artifacts.
> 
> I keep going testing.
Click to expand...

Sorry, I missed this.

There is nothing you can do with a stock 1070 that will break your video card. Try everything and see what happens. Just do not set afterburner to apply your overclock settings at login because it can create a boot loop if your GPU OC settings are out enough to cause a crash.

Pascal will down clock as temperatures increase. That is the nature of GPU Boost 3.0. If you are finding that the GPU will crash or give you watchdog Bluescreens of death, maybe have a look at tweaking motherboard voltages. A slight vcore voltage increase can someimes help and small Vccio and system agent voltage increased settings may also help with stability running the GPU faster.

I suggest that while high frequencies do relate to performance you look at primarily frame rates rather than maximum frequencies. You may find that in some cases, a slightly lower core clock with higher memory speeds give you better performance. You should also treat this as a performance journey, There are so many different variations that it is doubtful that anyone has actually found the absolute limit of what they can do. Next week or next month you may try something completely different and find a performance boost so dont give up trying new things.


----------



## asfaR

Quote:


> Originally Posted by *gtbtk*
> 
> Sorry, I missed this.
> 
> There is nothing you can do with a stock 1070 that will break your video card. Try everything and see what happens. Just do not set afterburner to apply your overclock settings at login because it can create a boot loop if your GPU OC settings are out enough to cause a crash.
> 
> Pascal will down clock as temperatures increase. That is the nature of GPU Boost 3.0. If you are finding that the GPU will crash or give you watchdog Bluescreens of death, maybe have a look at tweaking motherboard voltages. A slight vcore voltage increase can someimes help and small Vccio and system agent voltage increased settings may also help with stability running the GPU faster.
> 
> I suggest that while high frequencies do relate to performance you look at primarily frame rates rather than maximum frequencies. You may find that in some cases, a slightly lower core clock with higher memory speeds give you better performance. You should also treat this as a performance journey, There are so many different variations that it is doubtful that anyone has actually found the absolute limit of what they can do. Next week or next month you may try something completely different and find a performance boost so dont give up trying new things.


I'm currently playing Metro Last Light and while it's stable in Fire Strike and Heaven, in Metro it isn't. I did some tests with the GPU fans at 100% and I can reach a higher oc, that's true. I'm still looking for the best settings but the performance impact (even being unstable) is not very noticeable (taking into account that I have a factory-OCed card and I can reach +55 on the core clock (without considering the curve) and +750-800 on the mem clock).


----------



## yevonxxx

hi guys,
i've tried searching a bit but a direct question is probably better:

i've seen there are some 1070s with overheating problems,
some with memory issues,
some with coil whine problems, and so on.

which is a reliable brand/model of gtx 1070,
known for NOT giving problems?

my plan is also to oc it as much as possible,
probably watercooling in the near future but not now (got no time to rebuild my custom loop, i'm busy with work).

thank you.


----------



## pez

EVGA had some issues more recently, but they are resolving this with the ICX cooler. However, I have not seen those on sale at retailers yet. Regardless, the ACX 3.0 issues have been addressed.

I haven't seen too many complaints about the MSI cards (outside of their asking prices early on).

My 1080 G1 was solid, though one card did have some coil whine. Coil whine is simply going to be the luck of the draw on a lot of cards. As for other cards, I can't really comment.


----------



## caenlen

For anyone wondering, the gtx 1070 in laptops is just as amazing as the desktop version. I overclocked +500 vram and +150 core, I hit 2012 core in most games and it never budges over 70 Celsius.... its insane ^^


----------



## guttheslayer

Quote:


> Originally Posted by *caenlen*
> 
> Anyone wondering, the gtx 1070 in laptops is just as amazing as desktop version, I overclocked +500 vram and +150 core, I hit 2012 core in most games and never budge over 70 celsius.... its insane ^^


The laptop variant has more cores than the desktop one.


----------



## zipper17

Quote:


> Originally Posted by *asfaR*
> 
> I'm currently playing Metro Last Light and while in *Fire Strike* and Heaven is *stable, in Metro doesn't*. I did some test with the GPU fans at 100% and I can reach a higher oc, that's true. I'm still looking the best settings but the performance impact (even beign unstable) is not very noticeable (taking into account that I have an OCed card by default and I can reach +55 in core clock (without considering the curve) and +750-800 in mem clock).


When you say Firestrike is stable, is that the 3dmark benchmark or the 3dmark stress test? How many times did you run the test?
The 3dmark benchmark and the 3dmark stress test are different...
For a stability test use the 3dmark stress test (pass 97% at least, 20 loops), or you can use custom loops (infinite loops).

I can pass the 3dmark benchmark with higher graphics scores & overclocked settings, but in the 3dmark stress test it will crash randomly pretty easily (if overclocked too much)..
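For reference, 3DMark's pass/fail comes from its Frame Rate Stability figure: the worst loop's average FPS as a percentage of the best loop's, with 97% as the pass threshold. A minimal sketch of that check (the loop FPS values here are invented):

```python
# Rough sketch of 3DMark's Frame Rate Stability criterion: the worst loop's
# average FPS as a percentage of the best loop's; pass threshold is 97%.
def frame_rate_stability(loop_avg_fps):
    return 100.0 * min(loop_avg_fps) / max(loop_avg_fps)

# Hypothetical 20-loop run that degrades slightly as the card heats up.
loops = [120.4, 120.1, 119.9, 119.6] + [119.2] * 16
frs = frame_rate_stability(loops)
print(f"{frs:.1f}% -> {'PASS' if frs >= 97.0 else 'FAIL'}")  # prints "99.0% -> PASS"
```

An overclock that drops even a couple of loops well below the best one fails the run, which is why a card can post a great benchmark score yet still fail the stress test.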


----------



## gtbtk

Quote:


> Originally Posted by *asfaR*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Sorry, I missed this.
> 
> There is nothing you can do with a stock 1070 that will break your video card. Try everything and see what happens. Just do not set afterburner to apply your overclock settings at login because it can create a boot loop if your GPU OC settings are out enough to cause a crash.
> 
> Pascal will down clock as temperatures increase. That is the nature of GPU Boost 3.0. If you are finding that the GPU will crash or give you watchdog Bluescreens of death, maybe have a look at tweaking motherboard voltages. A slight vcore voltage increase can someimes help and small Vccio and system agent voltage increased settings may also help with stability running the GPU faster.
> 
> I suggest that while high frequencies do relate to performance you look at primarily frame rates rather than maximum frequencies. You may find that in some cases, a slightly lower core clock with higher memory speeds give you better performance. You should also treat this as a performance journey, There are so many different variations that it is doubtful that anyone has actually found the absolute limit of what they can do. Next week or next month you may try something completely different and find a performance boost so dont give up trying new things.
> 
> 
> 
> I'm currently playing Metro Last Light and while in Fire Strike and Heaven is stable, in Metro doesn't. I did some test with the GPU fans at 100% and I can reach a higher oc, that's true. I'm still looking the best settings but the performance impact (even beign unstable) is not very noticeable (taking into account that I have an OCed card by default and I can reach +55 in core clock (without considering the curve) and +750-800 in mem clock).
Click to expand...

did you take another look at refining your motherboard settings? If they are a little off from optimal, what was stable with a maxwell card can fall over with pascal, because the maxwell card could never stress some of the components enough for the sloppy settings to matter.

Talking about +55 is not very helpful, as everyone has different cards with different base clocks; better to talk about actual clock frequencies. The same goes for memory overclocks.

Having said that, if you are running +800 memory overclocks, they are probably producing errors that, while not enough to crash the benchmarks, the memory error correction is having to deal with, and it is failing in metro last light. try dropping your memory overclock down to say +500 and see how that goes for stability. vccio voltage may also help with stability. did you try looking at that in your bios?


----------



## gtbtk

Quote:


> Originally Posted by *caenlen*
> 
> Anyone wondering, the gtx 1070 in laptops is just as amazing as desktop version, I overclocked +500 vram and +150 core, I hit 2012 core in most games and never budge over 70 celsius.... its insane ^^


how does it go in firestrike?


----------



## asfaR

Quote:


> Originally Posted by *zipper17*
> 
> By you mean Firestrike is stable, is it on the 3dmark Benchmark or 3dmark Stress test one? How many time did you run the test?
> 3d mark benchmark and 3dmark Stress Test is different ...
> For Stability Test use 3dmark Stress test (Pass 97 % at least, 20 loops), or you can use a Customization Loops (Infinite loops).
> 
> I can pass 3dmark benchmark with higher Graphic scores & OverClocked settings, but once in 3dmark Stress test, it will crash randomly pretty easy (if overclocked too much)..


I mean the benchmark test, not the stress one. Why didn't I use the stress test? Because I used to use furmark for stability testing, but I think those stress tests are extremely heavy compared to what a graphics card actually needs. I haven't tested with anything other than furmark, though, so I will give it a try.
Quote:


> Originally Posted by *gtbtk*
> 
> did you take another look at refining your motherboard settings? If they are a little bit off from being optimal, what was stable with a maxwell card can fall over with pascal because the maxwell card could never stress some of the components enough for the sloppy settings to matter.
> 
> Talking about +55 is not very beneficial as everyone has different cards with different base clocks, better to talk about clock frequencies. The same goes for memory overclocks.
> 
> Having said that, If you are running +800 memory overclocks, while not enough to crash the benchmarks, are probably giving you errors that the memory error correction is having to deal with and failing in metro last light. try dropping your memory overclock down to say +500 and see how that goes for stability. vccio voltage may also help with stability. did you try looking at that in your bios?


You are right, I should talk in clock speeds, not increments.

The only setting I changed in the bios was the VCCIO, which I set one "point" higher than auto (apart from the RAM settings: timings, speed and voltage).

In GPUZ I have the following:

Default core clock: 1633
Default boost clock: 1835
Default mem clock: 2002

Current core clock: 1688
Current boost clock: 1890
Current mem clock: 2387

While playing, I usually test either core or mem, not both at the same time, and it crashes with core clock increases. But all these comments help to find the best settings and to learn how this generation works.
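Those GPU-Z numbers line up with the offsets mentioned earlier. A quick sketch of the arithmetic, assuming GPU-Z reports the real GDDR5 command clock and Afterburner's memory offset counts in the double-data-rate domain (which is why a 385MHz real increase shows up as roughly +770 in AB):

```python
# Relating GPU-Z readings to Afterburner-style offsets for this card.
default_core, current_core = 1633, 1688
default_mem, current_mem = 2002, 2387

core_offset = current_core - default_core        # what "+55" refers to
mem_offset_ab = 2 * (current_mem - default_mem)  # AB offset, double-data-rate domain
effective_mem = 4 * current_mem                  # GDDR5 transfers 4x per command clock

print(core_offset, mem_offset_ab, effective_mem)  # 55 770 9548
```

The 9548 figure matches the "+770 (9548mhz)" quoted earlier in the thread, which is a good sanity check that the two tools are just reporting the same clock in different domains.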


----------



## gtbtk

Quote:


> Originally Posted by *yevonxxx*
> 
> hi guys,
> i've tried searching a bit but direct question probably is better:
> 
> i've seen there are some 1070 with overheating problem,
> some other with memory issues,
> some other with coil wine problems,and so on.
> 
> which is a reliable brand/model for gtx 1070,
> known for NOT giving problems?
> 
> my plan is also to oc it as much as possible,
> probably watercooling in the near future but not now(got no time to rebuild my custom loop i'm busy with work).
> 
> thank you.


EVGA cards did have some faulty capacitor issues that were erroneously blamed on VRM overheating. EVGA have provided extra thermal pads to mitigate this. They have also announced a new model with an upgraded cooler design and sensors everywhere, to try and quiet all the people who are fixated on "overheating" problems. EVGA card bioses are set up to hit power limits very quickly, so they are probably not the best choice for water cooling.

There are no 1070 cards with memory problems as such. Cards that came with Micron chips, for the first 4-5 months of 1070 production last year, did have a bug in the bios that affected the memory controller and caused blue screen crashes if you overclocked the memory more than about 400MHz above stock. Nvidia pushed bios updates through all the vendors last November and that resolved the issue. All brands have sold cards with both micron and samsung memory installed, so what you get is luck of the draw.

Gigabyte cards do have a reputation for coil whine, and also for fans hitting the heatsink fins and making noise. There are other mentions of cards with coil whine but they do not seem to be as prevalent. it just seems to be a bit of a lottery.

The very highest factory-overclocked 1070 you can buy is probably the Gainward GLH/Palit GameRock, which comes with a 1671 core clock and a 250MHz memory OC from the factory. They also have a 250W power limit.

The Zotac AMP Extreme is a very good card and has the highest power limit at 300W.

In real terms, having the 2nd pcie power cable is more for marketing than it is for performance.

You can always cross flash your card with a bios from another vendor, so you get little value buying the card with the highest factory overclock. With the exception of the Galax HOF cards, you can pretty much flash any bios you like to any other card.


----------



## gtbtk

Quote:


> Originally Posted by *asfaR*
> 
> Quote:
> 
> 
> 
> Originally Posted by *zipper17*
> 
> By you mean Firestrike is stable, is it on the 3dmark Benchmark or 3dmark Stress test one? How many time did you run the test?
> 3d mark benchmark and 3dmark Stress Test is different ...
> For Stability Test use 3dmark Stress test (Pass 97 % at least, 20 loops), or you can use a Customization Loops (Infinite loops).
> 
> I can pass 3dmark benchmark with higher Graphic scores & OverClocked settings, but once in 3dmark Stress test, it will crash randomly pretty easy (if overclocked too much)..
> 
> 
> 
> I mean the benchmark test not the stress one. Why I didn't use the stress test? Because I used to use furmark for stability test but I think that the stress test are extremly heavy for what a graphic card needs. But I don't test any other than furmark, I will give it a try.
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> did you take another look at refining your motherboard settings? If they are a little bit off from being optimal, what was stable with a maxwell card can fall over with pascal because the maxwell card could never stress some of the components enough for the sloppy settings to matter.
> 
> Talking about +55 is not very beneficial as everyone has different cards with different base clocks, better to talk about clock frequencies. The same goes for memory overclocks.
> 
> Having said that, If you are running +800 memory overclocks, while not enough to crash the benchmarks, are probably giving you errors that the memory error correction is having to deal with and failing in metro last light. try dropping your memory overclock down to say +500 and see how that goes for stability. vccio voltage may also help with stability. did you try looking at that in your bios?
> 
> Click to expand...
> 
> You are right, I would have to talk with clocks speeds not the increments.
> 
> The only setting that I made in the bios was the VCCIO which I set one "point" higher than auto (apart of the RAM settings, timings, speed and voltage).
> 
> In GPUZ I have the following:
> 
> Default core clock: 1633
> Default boost clock: 1835
> Default mem clock: 2002
> 
> Current core clock: 1688
> Current boost clock: 1890
> Current mem clock: 2387
> 
> While playing, I usually make test with core or mem, not both at the same time, and crash with core clock increases but all this comments help to find the best settings and to learn how this generation works.
Click to expand...

you may want to increase vccio a few more "points".

A fun trick that I found: I cross flashed my card with an evga ftw bios. I then used the Precision XOC automatic OC utility, and you can see where the different voltage levels start to hit their limits during the test. If you make a vccio adjustment and rerun the OC utility, you can see any improvements you get, so it makes it easy to tune. It also helps you identify which voltage levels overclock more strongly than the others.

After you finish tuning vccio and cpu pll voltages, you can flash your card back to its original bios and, if you feel like it, use the insight you have gained and try overclocking with the curve, which will ultimately allow you to gain much more performance than you can with only the slider, as the slider is limited by the voltage point with the lowest oc potential.


----------



## asfaR

Quote:


> Originally Posted by *gtbtk*
> 
> you may want to increase vccio up a few more "points".
> 
> A fun trick that I found is that i cross flashed my card with an evga ftw bios. I then used Precision XOC automatic OC utility and you can see where the different voltage levels start to hit their limits during the test. If you go and mak a vccio adjustment and rerun the OC utility you can see any improvements that you get so it makes it easy to tune. It also helps you identify which voltage levels overclock more strongly than the others.
> 
> After you finish tuning vccio and cpu pll voltages, you can flash your card back to its original bios and if you feel like it, use the insite you have learned and try overclocking with the curve which will ultimately allow you to gain much more performance than you can with only the slider as that is limited by the voltage point with the lowest oc potential.


I will take that into account.

I have a question concerning the curve. When I increase a value (for example at the 1.093V point, from 0 to +100), what should I look at: the resulting core clock, or the MHz added? I ask because if, for example, I increase it from 0 to +100, the core clock established is sometimes 1013 and sometimes 1026 with the same settings. This is quite confusing because I'm not sure whether the value I enter is correct, or whether I should be looking at the core clock or at the amount I add.

I hope you understand what I mean.


----------



## ColdDeckEd

Hi guys, I found a great deal on a new 1070 SC that I couldn't pass up, so now I'm part of the 1070 club lol.

Anyways, thanks to this thread, I was able to flash the asus oc bios onto my evga sc (micron), and it worked great! Temps have gone up a bit, but it is worth it to get rid of the constant hitting of the power limit with the evga bios.

I'm wondering, has anyone flashed the palit Super JetStream bios onto the evga sc? I was going to do it, but chickened out at the "press y to continue" prompt. I expect that if it works correctly, my temps would pass the 70 degree threshold and instead of power throttling i'd end up with temp throttling. I also have some CLU on order, but i'm worried about temps if I do the power mod as well. I'm definitely happy with the free perf boost the asus bios gave me though, so I will probably end up flashing the palit bios unless someone tells me otherwise.


----------



## gtbtk

Quote:


> Originally Posted by *asfaR*
> 
> I will take in account.
> 
> I have a question concerning the curve. When I increase a value (for example the voltage 1093 from 0 to 100) what do I have to consider?, the core clock or the mhz increased. I say this because if for example I increase from 0 to 100 the core clock established is 1013 and sometimes 1026 with the same settings. This is quite confusing because I'm not sure if the value that I introduce is correct or not, if I have to look the core clock or the value that I can increase.
> 
> I hope you understand what I mean.


All of the changes you make, be it on the slider or the curve, are offset values and not absolute values. A +100 increase on a point could roughly equate to, say, 2100mhz; however, if the card has temperature and power headroom, it can boost higher than that initial approximation.

I suggest that the best way forward is to set your voltage level first and then adjust the highest single point on the curve supported by that voltage (+100 voltage = 1.093v) to say +50, then test. If it passes, increase it to +75 and test again; keep going until it goes unstable, then back off to the last stable point. After you have found the high point for 1.093V, try adjusting a mid point on the curve. I have found that .975v is a good point to try. Keep increasing that .975v point and check whether frame rates improve, until it goes unstable. If framerates do not change at all, try a different point, say .950v.

After you have settled on the 2 points you are adjusting, increase memory as high as you can before it goes unstable. When you have found that point, save the profile.

You can then try reducing the core offset points a little, trying higher memory overclocks, and checking if that improves performance. You may find that different games react differently to different styles of overclock adjustments.

The afterburner curve can actually be quite confusing because you need to read both the thick line and the thin line together. The thin line actually shows the amount of offset that each point is currently using. In a logical world, the thin line should, in my opinion, be a fixed baseline matching the default 0-offset curve, with the adjusted curve points showing offsets above or below that baseline, but it does not display that way.

To illustrate what I mean, see the following diagrams. These curves are only to illustrate offsets and will hopefully help you understand what you are actually seeing; they are not a curve that I would recommend anyone actually use.

This is an image of adjustments that could be made before clicking the apply button. The thin line is the curve with all offsets at 0. The thick line shows the adjusted points before you apply the overclock. The yellow labels show how much offset I have given each point (each group of points in this case, just to make it easier to see what I am doing). This is the way I think Afterburner should display the voltage curve after you apply the changes, but it does not.



This is how the curve displays after you apply it in afterburner:



You can see that in some cases, the thin line has moved to be above or below the thick line and the thick line has stayed in the same position on the screen. in other places, the thick line has moved and the thin line has stayed in its original position. If you continue making tweaks, you can end up with many zigzag patterns in the thin line that makes it really difficult to compare the current curve to what you had in the last experiment.

Most people ignore the thin line because the documentation does not explain what it represents, and that makes it possible to show two curves that look exactly the same. If you take the thin line into account, however, you can see that while the curve with the -100 points looks flat, it actually has a big dip in it that is not represented on screen.

The best advice I can give you when using the curve is to limit the adjustments from default to a few changes per curve (say three). If you plan to tweak further, reset the curve to default, then try the tweak on one of the adjustments and put the rest back where they were on the last curve.
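If it helps, here is a rough Python sketch of the baseline-versus-offset idea. The frequency points and offsets are made up purely for illustration, not taken from a real card; the "thin line" in my description is the zero-offset baseline and the "thick line" is the adjusted result.

```python
# Hypothetical default voltage/frequency curve as (millivolts, MHz) pairs.
# Values are invented for illustration, not measured from a real 1070.
DEFAULT_CURVE = [(800, 1607), (900, 1800), (1000, 1936), (1062, 2000)]

def apply_offsets(curve, offsets):
    """Return the adjusted curve for a per-point MHz offset table.

    The input curve is the zero-offset baseline ("thin line");
    the returned list is the adjusted curve ("thick line").
    Points not present in the offset table keep a 0 offset.
    """
    return [(mv, mhz + offsets.get(mv, 0)) for mv, mhz in curve]

# Tweak two points, leave the rest at 0 offset.
offsets = {900: +100, 1000: -100}
adjusted = apply_offsets(DEFAULT_CURVE, offsets)

for (mv, base), (_, new) in zip(DEFAULT_CURVE, adjusted):
    print(f"{mv} mV: baseline {base} MHz -> adjusted {new} MHz (offset {new - base:+d})")
```

Reading the two lists side by side like this is exactly the comparison that gets hard to do in the Afterburner UI once the thin line starts zigzagging.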


----------



## gtbtk

Quote:


> Originally Posted by *ColdDeckEd*
> 
> Hi guys I found a great deal on a new 1070 sc, that I couldn't pass up, so now I'm part of the 1070 club lol.
> 
> Anyways, thanks to this thread, I was able to flash the Asus OC BIOS on my EVGA SC (Micron), and it worked great! Temps have gone up a bit, but it is worth it to get rid of the constant hitting of the power limit with the EVGA BIOS.
> 
> I'm wondering, has anyone flashed the Palit Super JetStream BIOS onto the EVGA SC? I was going to do it, but chickened out at the press-Y-to-continue prompt. I expect that if it works correctly, my temps would pass the 70-degree threshold and instead of power throttling I'd end up with temp throttling. I also have some CLU on order, but I'm worried about temps if I do the power mod as well. I'm definitely happy with the free perf boost the Asus BIOS gave me though, so I will probably end up flashing the Palit BIOS unless someone tells me otherwise.


You should think about using a custom fan curve with the Asus BIOS. Different BIOSes set max fans to different speeds; the Asus card has 3 fans, so it is probably using a slower max fan speed than the dual fans on the EVGA card. The Asus BIOS also has a 200W limit compared to the 170W limit of the EVGA stock BIOS, so you can expect it to generate more heat.

I have flashed the Palit BIOSes to my MSI Gaming X and they work fine. The EVGA BIOSes also flashed fine to my MSI card.

Unless you are hitting power limits with the Asus BIOS, I doubt you will see any improved performance from the Palit BIOS, as they are both clocked the same. If you do flash the Palit BIOS, max power is up to 225W for that card, so it could run hotter still. The SC has a reference PCB, so the VRM should be able to handle up to 250W, but keep a very close eye on temps if you want to try it out. It may be fine, but what you are doing is experimental.


----------



## zipzop

Quote:


> Originally Posted by *gtbtk*
> 
> The Asus card has 3 fans so it is probably using a slower max fan speed than the dual fans on the evga card.


Surprisingly, it's the opposite. I'm also running the Strix OC BIOS on my EVGA SC (from some weeks ago, when you recommended this BIOS for the 8-pin). It seems the Strix BIOS supplies a bit more power to the fans at the same % setting in AB. I had to lower my fan speed profile after flashing the Strix BIOS because the fans were way louder and faster... wish I had taken note of the RPM before, but I'm too lazy to switch back again.

Come to think of it, maybe because they are wired in parallel or series, more voltage may be required to run three fans simultaneously.


----------



## gtbtk

Quote:


> Originally Posted by *zipzop*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The Asus card has 3 fans so it is probably using a slower max fan speed than the dual fans on the evga card.
> 
> 
> 
> Surprisingly, the opposite. Also running the Strix OC Bios on my EVGA SC(from some weeks ago, you recommended this BIOS for the 8-pin). Seems STrix BIOS supplies a bit more power to the fans under same % setting in AB. Had to lower my fan speed profile after flashing the STrix BIOS cause the fans were way louder and faster...wish I had took note of the RPM before but I'm too lazy to switch back again
> 
> ..Come to think, maybe because they are wired in parallel or series, more voltage may be required to run three 3 fans simultaneously

I like the Asus BIOS. It ran well on the MSI card, but in your case the extra power will translate to extra heat. I did notice variations in 100% fan speed that ranged between 2350 and about 2500 RPM, but I didn't pay much attention to it until now.

The Strix has 3 fans that I guess are wired with a splitter, so the BIOS is probably pushing enough power to support 3 fans while your card only needs enough to drive its two fans at speed. As long as you are aware of what is happening, you can deal with it, so it should not be a problem.


----------



## Nukemaster

Newer Nvidia cards adjust the PWM to hit a certain fan speed (set by the card maker).

This makes fan swaps act strange on some cards. My Asus 1070 with a Mono Plus cooler has almost no fan control (I could get some, but only over a very short PWM range) because the fan spins slowly and the card gets confused. With the Micron BIOS update (I have Samsung memory anyway), they added a 0 RPM mode, and that is even worse because it somehow forces my Mono Plus fan to 100%. I just connected the fan to my motherboard's case fan header (even at the lowest speed the card would feed it, I was in no danger of overheating).


----------



## djriful

Can someone tell me if this is a worthwhile upgrade over my vanilla GTX TITAN (not the Black)?

I'm looking at the Zotac GTX 1070 Mini.


----------



## icold

Quote:


> Originally Posted by *djriful*
> 
> Can someone tell me if this is worth upgrade over my vanilla GTX TITAN... (not the black)
> 
> I'm looking at the Zotac GTX 1070 Mini.


Choose the 1070 AMP Extreme or the 1070 Strix (it has an awesome cooler: 59°C at full load). The GTX Titan is roughly a GTX 780, and the GTX 1070 has DOUBLE the performance. My previous card was a GTX 780 DirectCU II.


----------



## dboythagr8

I've been really impressed with the Strix cooler. It's my first non-blower card in some time, as I'd been a Titan customer since the original card. I don't think I'll ever go back to the blower style unless I decide to return to SLI for some reason.


----------



## rfarmer

https://www.newegg.com/Product/Product.aspx?Item=N82E16814126109

Asus Strix is also on sale for $399 after rebate.


----------



## KedarWolf

Soon I'm getting two ZOTAC GeForce GTX 1070 AMP! Extremes!









I contacted the seller, though, to make sure they are from the same batch number; I want them both to have the same type of memory, preferably Samsung, but as long as it's the same I'm not that fussy.









I asked them if they have any open-box units they can check in GPU-Z, or to at least send me the same batch number like I said.


----------



## gtbtk

Quote:


> Originally Posted by *djriful*
> 
> Can someone tell me if this is worth upgrade over my vanilla GTX TITAN... (not the black)
> 
> I'm looking at the Zotac GTX 1070 Mini.


A 1070 has about 120% the performance of the original 7-series Titan. All 1070s, with some overclocking, generally perform in roughly the same range. Water cooling and the enormous air coolers do help a little with reaching higher frequencies, but that really doesn't give you all that much framerate improvement in games anyway. 1070 cards really benefit from memory overclocks.
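A quick back-of-the-envelope calculation shows why memory overclocks matter. This sketch uses the 1070's published 256-bit bus and 8000 MT/s stock GDDR5 rate, and assumes the common GDDR5 convention that an Afterburner offset of +N MHz adds 2×N MT/s to the effective data rate (check your own tool's convention):

```python
BUS_WIDTH_BITS = 256    # GTX 1070 memory bus width
STOCK_RATE_MTPS = 8000  # stock GDDR5 effective data rate, MT/s

def bandwidth_gbps(offset_mhz):
    """Approximate memory bandwidth in GB/s for a given memory offset.

    Assumes (GDDR5 convention) that a +N MHz Afterburner offset adds
    2*N MT/s to the effective data rate.
    """
    rate_mtps = STOCK_RATE_MTPS + 2 * offset_mhz
    # MT/s * (bus width in bytes) -> MB/s, then /1000 -> GB/s
    return rate_mtps * BUS_WIDTH_BITS / 8 / 1000

for off in (0, 400, 700):
    print(f"+{off} MHz offset -> {bandwidth_gbps(off):.1f} GB/s")
```

So a +400 offset is roughly a 10% bandwidth increase over stock, which is why memory clocks tend to move game framerates more than squeezing a few extra MHz out of the core.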

That Zotac card has a twin-fan cooler, so it should be OK and able to run without too much noise. Pascal does not run that hot, so you can keep fan speeds lower than on earlier model cards. You can easily cross-flash the card with a higher-overclocked model's BIOS if you want higher default clocks.


----------



## gtbtk

Quote:


> Originally Posted by *KedarWolf*
> 
> Soon I'm getting two ZOTAC GeForce GTX 1070 AMP! Extremes!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I contacted the seller though to make sure they are from the same batch number, I want them both to have the same type of memory, preferably Samsung but as long as it's the same I'm not that fussy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I asked them if they have any open box they can check in GPU-Z or at least send me the same batch number like I said.


If you plan to SLI the Zotac AMP Extreme cards, remember that they are enormous 3-slot cards, so you may find it a challenge to find a motherboard with enough spacing between the PCIe slots to allow the installation.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Quote:
> 
> 
> 
> Originally Posted by *djriful*
> 
> Can someone tell me if this is worth upgrade over my vanilla GTX TITAN... (not the black)
> 
> I'm looking at the Zotac GTX 1070 Mini.
> 
> 
> 
> Choose 1070 amp extreme or 1070 strix ( have a awsome cooler: 59C fullload). GTX titan = gtx 780, GTX 1070 HAVE DOUBLE performance. My previous VGA was GTX 780 DIRECTCU II.

My MSI Gaming cooler rarely gets above 50°C at full load.


----------



## KedarWolf

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Soon I'm getting two ZOTAC GeForce GTX 1070 AMP! Extremes!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I contacted the seller though to make sure they are from the same batch number, I want them both to have the same type of memory, preferably Samsung but as long as it's the same I'm not that fussy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I asked them if they have any open box they can check in GPU-Z or at least send me the same batch number like I said.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you plan to SLI the Zotac Amp extreme cards, remember that they are enormous 3 slot cards so you may find it a challenge to find a motherboard that has enough spacing between the PCIe slots to allow for the installation.

It's a Sabertooth X99 motherboard and has one extra PCIe slot between the two video card slots. I know thermal throttling is an issue with 1070s. Would I be better off going with another card, and if so which card should I get? I'd want one with an extra PCIe power connector, not just a single eight-pin. :/

Edit: I can get an MSI Armor or a G1 Gaming; is either the better option?

Second edit: What about the regular Zotac AMP, not the Extreme? I want the best two-slot cooling, with two power connectors on each card.


----------



## gtbtk

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Soon I'm getting two ZOTAC GeForce GTX 1070 AMP! Extremes!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I contacted the seller though to make sure they are from the same batch number, I want them both to have the same type of memory, preferably Samsung but as long as it's the same I'm not that fussy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I asked them if they have any open box they can check in GPU-Z or at least send me the same batch number like I said.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you plan to SLI the Zotac Amp extreme cards, remember that they are enormous 3 slot cards so you may find it a challenge to find a motherboard that has enough spacing between the PCIe slots to allow for the installation.
> 
> 
> Is a Sabertooth X99 motherboard and has one extra pci-e slot between the two video card pci-e slots. I know thermal throttling is an issue with 1070s. Would I be better going with another card and which card should I get? I'd want one with the extra pci-e power connector, not just one eight pin power connector. :/

There is no thermal throttling with 1070s unless you are thrashing a Founders Edition blower-style card. Pascal GPU Boost does move the clock speeds around a bit under load as temps increase, but that starts at about 40°C, and the clock speed starts higher than the level you set it to anyway, so it ends up a wash.

Thermal throttling doesn't start happening until 82°C, or 92°C if you raise the temp limit in Afterburner. Even if you don't create a fan curve, any non-blower card will be hard pressed to hit 70°C unless there is no airflow in your case.
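The boost behaviour described above can be sketched as a toy model. The ~13 MHz bin size and the temperature breakpoints below are rough approximations people have observed on Pascal, not published Nvidia numbers:

```python
BIN_MHZ = 13  # approximate size of one GPU Boost clock bin

# Rough temperature breakpoints (deg C) at which GPU Boost 3.0 is
# commonly observed to drop one clock bin; approximations only.
TEMP_STEPS = [40, 48, 56, 63, 68, 74, 79]

def boosted_clock(max_boost_mhz, temp_c):
    """Estimate the sustained core clock at a given GPU temperature."""
    bins_dropped = sum(1 for t in TEMP_STEPS if temp_c >= t)
    return max_boost_mhz - bins_dropped * BIN_MHZ

for temp in (35, 50, 65, 80):
    print(f"{temp} C -> ~{boosted_clock(2100, temp)} MHz")
```

The point of the model is that even worst-case, the clock only steps down a few bins; it is gentle frequency management, not the hard throttle that kicks in at the 82°C limit.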

For an SLI install, I really recommend that you rethink and choose 2-slot cards, taking advantage of the extra space for better airflow to each of the cards. If you are planning to overclock, all 1070s will reach fairly similar levels of performance.

The extra power connector is more a marketing gimmick than a performance enhancer. Of all the cards, the EVGA ones are the only ones that have real issues with hitting power limits. They hit the limit even at 1080p, and that is because of the way the BIOS has been configured, not the power supply. The FTW card has 8+8 pins, I think, and it hits power limits just the same as the SC card.

If you really want a great option with an excellent, quiet 2-slot cooler, a 291W total power limit and 6+8 pin power connectors, take a look at the MSI Gaming 8G cards and save a bit of money, or the Quick Silver card if you don't want red in your case.

You can cross flash the Gaming Z bios to them when you get them if you want a higher default base clock.


----------



## spddmn24

How much better do these overclock under water? My MSI Quick Silver 1070 does 2126 max and settles around 2101-2113 under load at 66°C max. I kind of impulse-bought an EK waterblock off eBay







, and if I can't get much more out of it I'll probably just sell it.


----------



## DeathAngel74

Finally...only took 6 days in total, including 2 way shipping:


----------



## djriful

I think I'll wait for GTX 1170.


----------



## DeathAngel74

Dang it! @khanmein was right...


----------



## DeathAngel74

Oh well, no matter....Micron can now overclock just as well as Samsung w/iCX cooler:


----------



## SuperZan

Quote:


> Originally Posted by *DeathAngel74*
> 
> Oh well, no matter....Micron can now overclock just as well as Samsung w/iCX cooler:
> 
> 
> Spoiler: Warning: Spoiler!


Cheers! I've sent mine in for the iCX upgrade as my curiosity often gets the better of me. I'm glad to see that sacrificing the Sam-ram wasn't as critical as I'd thought.


----------



## DeathAngel74

Yeah, my old card used to crash past +415MHz on the old VRAM.


----------



## khanmein

@DeathAngel74 congrats, I'm happy for you. My ACX 3.0 is also not too shabby: no overheating and no coil whine either.


----------



## DeathAngel74

I forgot about coil whine... don't have it though. I was more worried about artifacts and a bad VRAM overclock.


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> i forgot about coil whine.......don't have it tho. I was more worried about artifacts and bad VRAM overclock.


How come your Micron OCs better than Samsung? Impossible...


----------



## DeathAngel74

rofl, i'm going to bed, lol


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> rofl, i'm going to bed, lol


Good night. I also just received an internet upgrade.


----------



## gtbtk

Quote:


> Originally Posted by *spddmn24*
> 
> How much better do these overclock under water? My MSI quicksilver 1070 does 2126 max and settles around 2101-2113 under load at 66 max temp. Kind of impulse bought an ek waterblock off ebay
> 
> 
> 
> 
> 
> 
> 
> , and if I won't get much more out of it I'll probably just sell it.


You can probably get 2200 MHz or even more out of the watercooled card.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> Oh well, no matter....Micron can now overclock just as well as Samsung w/iCX cooler:


There has never been anything wrong with the Micron memory itself. The problem was buggy BIOS firmware driving the memory controller, and that has been fixed.

If there are still stability problems after the BIOS upgrade, that is down to motherboard voltage settings not being optimal.


----------



## DeathAngel74

vccio is set to 1.208v auto
vccsa is set to 1.216v auto
cpu standby is set to 1.237v auto


----------



## zipper17

Quote:


> Originally Posted by *DeathAngel74*
> 
> Oh well, no matter....Micron can now overclock just as well as Samsung w/iCX cooler:


Have you tried overclocking your VRAM higher, or is this already the limit?

Use the 3DMark stress test, custom loops, or other tools; every benchmark and game produces a different kind of load.

Make sure there are no artifacts; even the _smallest artifact_ means it is not good.


----------



## DeathAngel74

Quote:


> Originally Posted by *khanmein*
> 
> happy night. just now i also received internet upgrade.


Must be nice. I only get 50Mbps down/6Mbps up


----------



## DeathAngel74

Quote:


> Originally Posted by *zipper17*
> 
> Try overclock your vram higher? or this already on limit?
> 
> Use 3dmark stress test/custom loops, or others. Every bench/games has different kind of load.
> 
> make sure there is no artifacts, even _smallest artifact_ is mean not good.


It's already at the limit. +500-505 means a crash and a loud buzzing noise, then I have to hard reset the PC, lol.


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> Must be nice. I only get 50Mbps down/6Mbps up


previously 1 Mbps down @ 384 Kbps up for more than 10 years.









last year Nov 28 Mbps down @ 10 Mbps up.









today 47 Mbps down @ 19 Mbps up.









oh yeah any free games for your step up??


----------



## DeathAngel74

No... because it is technically an RMA upgrade...


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> vccio is set to 1.208v auto
> vccsa is set to 1.216v auto
> cpu standby is set to 1.237v auto


Have a try experimenting with the VCCIO and VCCSA voltages. Try setting them both to 1.22V as a starting point, test the memory OC limits, and step them up a bit at a time (1.35V is the absolute limit, but I would keep them well below that level as they will impact thermals). Those settings provide voltage that helps strengthen memory and PCIe controller stability.

You might just find some extra OC headroom for your VRAM.


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> no....because it is technically and RMA upgrade...


Is the ACX 3.0 heavier, or the new iCX? I'm envious of you.


----------



## DeathAngel74

I will when I have more time, lol. Thank you for your advice as always.


----------



## DeathAngel74

iCX is heavier and more clunky... I had to ask my wife to install the card. The rear mounting bracket wasn't playing nice with my case; I needed smaller hands to get it right... AND thermal pads were everywhere. I guess they were like, "Here's your damned thermal pads!"


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> iCX is heavier, and more clunky....Had to ask my wife to install the card. The rear mounting bracket wasn't playing nice with my case. Needed smaller hands to get it right....AND...thermal pads were everywhere. I guess they were like "Here's your damned thermal pads!"


One thing I'm kinda disappointed with on the iCX is the quality of the thermal pads they used.


----------



## DeathAngel74

When I have more time and some extra cash, I'll get some TG Kryonaut and 17 W/mK thermal pads to replace the stock pads and TIM. I wonder if buying my old 1070 SC from Best Buy made a difference in overclocking (6173-KB) compared to the SC2 (6573-KR); I wonder if they used different components... meh... gotta finish my coffee, put on my blue shirt and go to work...


----------



## KedarWolf

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DeathAngel74*
> 
> Must be nice. I only get 50Mbps down/6Mbps up
> 
> 
> 
> previously 1 Mbps down @ 384 Kbps up for more than 10 years.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> last year Nov 28 Mbps down @ 10 Mbps up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> today 47 Mbps down @ 19 Mbps up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> oh yeah any free games for your step up??

Have Fibre Interwebs here, 950 down, 115 up, 2 ms pings..


----------



## DeathAngel74

I get 24 ms pings on U-verse... now I really have to go, lol.


----------



## KedarWolf

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Soon I'm getting two ZOTAC GeForce GTX 1070 AMP! Extremes!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I contacted the seller though to make sure they are from the same batch number, I want them both to have the same type of memory, preferably Samsung but as long as it's the same I'm not that fussy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I asked them if they have any open box they can check in GPU-Z or at least send me the same batch number like I said.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you plan to SLI the Zotac Amp extreme cards, remember that they are enormous 3 slot cards so you may find it a challenge to find a motherboard that has enough spacing between the PCIe slots to allow for the installation.

Going with MSI Gaming X, same as the regular Gaming, but higher binned.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> iCX is heavier, and more clunky....Had to ask my wife to install the card. The rear mounting bracket wasn't playing nice with my case. Needed smaller hands to get it right....AND...thermal pads were everywhere. I guess they were like "Here's your damned thermal pads!"


It is certainly a knee-jerk response product.

With the heavier cooler, though, you should find that temps stay a bit lower than with the older card.


----------



## gtbtk

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Soon I'm getting two ZOTAC GeForce GTX 1070 AMP! Extremes!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I contacted the seller though to make sure they are from the same batch number, I want them both to have the same type of memory, preferably Samsung but as long as it's the same I'm not that fussy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I asked them if they have any open box they can check in GPU-Z or at least send me the same batch number like I said.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you plan to SLI the Zotac Amp extreme cards, remember that they are enormous 3 slot cards so you may find it a challenge to find a motherboard that has enough spacing between the PCIe slots to allow for the installation.
> 
> 
> Going with MSI Gaming X, same as the regular Gaming, but higher binned.

That is what I am running, but I have installed the Gaming Z BIOS.


----------



## gtbtk

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DeathAngel74*
> 
> Must be nice. I only get 50Mbps down/6Mbps up
> 
> 
> 
> previously 1 Mbps down @ 384 Kbps up for more than 10 years.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> last year Nov 28 Mbps down @ 10 Mbps up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> today 47 Mbps down @ 19 Mbps up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> oh yeah any free games for your step up??
> 
> 
> Have Fibre Interwebs here, 950 down, 115 up, 2 ms pings..

I have 500 down and 500 up with 2ms pings


----------



## KedarWolf

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Soon I'm getting two ZOTAC GeForce GTX 1070 AMP! Extremes!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I contacted the seller though to make sure they are from the same batch number, I want them both to have the same type of memory, preferably Samsung but as long as it's the same I'm not that fussy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I asked them if they have any open box they can check in GPU-Z or at least send me the same batch number like I said.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you plan to SLI the Zotac Amp extreme cards, remember that they are enormous 3 slot cards so you may find it a challenge to find a motherboard that has enough spacing between the PCIe slots to allow for the installation.
> 
> 
> Going with MSI Gaming X, same as the regular Gaming, but higher binned.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That is what I am running but I have installed the Gaming Z bios.

Yeah, I was going to do the same, are yours Samsung or Micron?


----------



## Blackfirehawk

Which BIOS runs the fans at the highest RPM? I currently have the Gainward GLH Phoenix BIOS, but it feels like the fans are slower... max 2350 RPM on my Gainward GTX 1070.


----------



## GnarlyCharlie

I saw a press release where MSI announced a new 1070 mini card.

https://www.msi.com/Graphics-card/GeForce-GTX-1070-AERO-ITX-8G-OC.html#hero-overview

I'm looking forward to an actual release with pricing, I might snag one for the HTPC.


----------



## khanmein

Quote:


> Originally Posted by *KedarWolf*
> 
> Have Fibre Interwebs here, 950 down, 115 up, 2 ms pings..


Don't compare with your country, because mine is for #poorpeople. Average 12 ms ping.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> It is certainly a knee jerk response product.
> 
> With a heavier cooler though, you should find that temps stay a bit lower than with the older card


They cut some holes in the heatsink, so it should be slightly lighter, right?

By the way, can you try watching any 4K60 video on YouTube using the latest stable Chrome, right-click for "Stats for nerds", and check whether there are any dropped frames in theater mode? Thanks.

e.g 




Can anyone give me feedback? Greatly appreciated...


----------



## gtbtk

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Soon I'm getting two ZOTAC GeForce GTX 1070 AMP! Extremes!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I contacted the seller though to make sure they are from the same batch number, I want them both to have the same type of memory, preferably Samsung but as long as it's the same I'm not that fussy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I asked them if they have any open box they can check in GPU-Z or at least send me the same batch number like I said.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you plan to SLI the Zotac Amp extreme cards, remember that they are enormous 3 slot cards so you may find it a challenge to find a motherboard that has enough spacing between the PCIe slots to allow for the installation.
> 
> 
> Going with MSI Gaming X, same as the regular Gaming, but higher binned.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That is what I am running but I have installed the Gaming Z bios.
> 
> 
> Yeah, I was going to do the same, are yours Samsung or Micron?

I got one of the very first Micron cards they ever produced, in early July last year, and went through all the checkerboard artifact/BSOD pain if I tried to overclock the VRAM to +400MHz. Now, with the same card, I can run at +700MHz, though it is not quite 100% stable at that speed; it is more comfortable at +680MHz.

I am the guy who finally worked out what the problem was, realized that it was software related, and lobbied Nvidia to fix it. That is why there was the Micron .50 BIOS update. The last time I looked, my thread over at Nvidia.com ran to about 55 pages. There were a hell of a lot of "forum police" who were adamant that there was no problem, that I was an idiot, that Micron was obviously inferior and I should just accept it, and how dare I post that there was a problem with an Nvidia product.

Fortunately, I was right and all the naysayers were wrong, but the total absence of Micron card reviews until after the BIOS fix became available did nothing to clear up the "Micron is bad" myth, which has taken on a life of its own.

Most of the limitations people are seeing now usually come down to auto-configured motherboard voltage settings that are a bit too low to support the higher loads Pascal can put on the PCIe controller in the CPU.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> It is certainly a knee jerk response product.
> 
> With a heavier cooler though, you should find that temps stay a bit lower than with the older card
> 
> 
> 
> they dug some holes to the heat-sink & should be slightly lighter right?
> 
> by the way, can u try watching any 4K60 from youtube by using chrome latest stable version, right-click stats for nerds, check whether any dropped frames by viewing theater mode? thanks.
> 
> e.g
> 
> 
> 
> 
> anyone can feedback to me? greatly appreciated..

Chrome 56 seems to be broken in a number of areas; I am getting a lot of video freezes. It helps if you disable IPv6 on your network card, then go into the chrome://flags page and enable the prefer-HTML5-over-Flash setting.

Chrome 56 also messes with the gamma of images, making them look really dark if you have a monitor ICC calibration profile installed.


----------



## gtbtk

Quote:


> Originally Posted by *Blackfirehawk*
> 
> wich bios does the highest RPM for fans? actually i have the gainwand glh phoenix bios but it feels like the fans are slower.. max 2350 rpm on my gainwand gtx 1070


With Gigabyte BIOSes installed on my MSI card, the fans would spin at 2500+ RPM at 100%; the stock MSI BIOS only spins the fans at 2400 RPM.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> Chrome 56 seems to be broken in a number of areas. I am getting a lot of video freezes. If you disable IPV6 on your network card and go into the chrome://flags page and enable the prefer HTML5 over flash setting and it helps.
> 
> Chrome 56 also messes with the gamma of images, making them look really dark if you have a monitor icc calibration profile installed


Yeah, I'm using an .icc calibration profile and IPv6, but I don't think it's related to that.

He gets zero dropped frames, so is it related to the CPU, GPU, internet connection speed, or...?

His spec: i7-4790K (4.6GHz) + 1080p 144Hz TN panel + GTX 970 + 4x4GB DDR3 1866MHz



http://imgur.com/ezavv4l


Furthermore, I noticed YouTube is not using my GPU but my CPU instead. Why can't I get 0 dropped frames like him?

FYI, my friend's Xeon E3-1231 v3 + 8GB DDR3 1600MHz + R9 290 + 1440p 60Hz IPS panel also drops some frames (30 Mbps down, 10 Mbps up).

Oh, regarding the freezes: they're due to the VP9 thing. The GTX 970 doesn't have this issue because VP9 isn't supported? I'm not really sure, just guessing.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Chrome 56 seems to be broken in a number of areas. I am getting a lot of video freezes. If you disable IPV6 on your network card and go into the chrome://flags page and enable the prefer HTML5 over flash setting and it helps.
> 
> Chrome 56 also messes with the gamma of images, making them look really dark if you have a monitor icc calibration profile installed
> 
> 
> 
> yeah i'm using .icc calibration profile & IPV6 but i don't think it related with that.
> 
> w/o a single dropped frames so it related to CPU, GPU, internet connection speed or???
> 
> his spec i7-4790K (4.6GHz) + 1080p144Hz TN panel + GTX 970 + 4x4 DDR3 1866MHz
> 
> 
> 
> http://imgur.com/ezavv4l
> 
> 
> furthermore, i noticed youtube is not using my GPU juice but instead CPU. y i can't get 0 dropped frames like him?
> 
> FYI, my friend Xeon E3-1231 V3 + 8GB DDR3 1600MHz + R9 290 + 1440p60Hz IPS panel also had some dropped frames too. (30 Mbps down & 10 Mbps up)
> 
> oh regarding the freeze is due to VP9 thang. GTX 970 don't have this issue cos not supported?? i'm not really sure cos just guessing.

Chrome, by default, uses the GPU to decode YouTube videos. However, the 4K stream uses the VP9 codec, and I do not think VP9 hardware decode is supported by Maxwell; I know Pascal can decode it. That may explain why your CPU is doing all the work with a 970 installed.

Have you tried a 1080p stream and looked at what is doing the decoding work then?

One more thing: you can set a flag to prefer HTML5 over the built-in Flash plugin, and YouTube may then serve mp4 (AVC) streams instead, which Maxwell can decode in hardware. That may make a difference for you
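To make the decode paths above concrete, here is a rough sketch of which codecs get hardware decode on these cards. The support matrix is a simplification (GM206 Maxwell cards like the 950/960 are exceptions with partial HEVC/VP9 support), and the names `HW_DECODE` and `can_hw_decode` are just illustrative:

```python
# Simplified per-architecture hardware video decode support.
# Illustrative only; check your exact chip before relying on it.
HW_DECODE = {
    "maxwell_gm204": {"h264"},                  # GTX 970/980: H.264 only
    "pascal":        {"h264", "hevc", "vp9"},   # GTX 10xx: adds HEVC and VP9
}

def can_hw_decode(gpu: str, codec: str) -> bool:
    """True if the named GPU generation can decode the codec in hardware."""
    return codec in HW_DECODE.get(gpu, set())

# A 970 falls back to CPU (software) decode for YouTube's VP9 streams:
print(can_hw_decode("maxwell_gm204", "vp9"))  # False
print(can_hw_decode("pascal", "vp9"))         # True
```

which is why forcing the avc1 (mp4) stream can offload the work back to a Maxwell GPU.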


----------



## DeathAngel74

@khanmein,
Screenshots don't lie, lol. Check out the memory









J/K









Ghetto cooling FTW


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> Chrome, by default, uses the GPU to decode youtube videos. However, the 4K stream is using the VP9 codec. I do not think VP9 video decode is supported by Maxwell; I know Pascal can decode VP9 in hardware. That may explain why your CPU is doing all the work with a 970 installed.
>
> Have you tried a 1080p stream and looked at what is doing the decoding work then?
>
> One more thing: you can set a flag to prefer HTML5 over the built-in Flash plugin, and YouTube may then serve mp4 (AVC) streams instead, which Maxwell can decode in hardware. That may make a difference for you


thanks a lot. are u still living in HK? symmetrical speed is the best. my country is good but the government sux. my monthly internet payment is around USD 45.

i can say the majority of 1080p vids i watch on youtube are running vp9, except some running mp4 @ avc1.640028.

i'm using a few extensions like Adblock Plus, Disable HTML5 Auto-play & Speed Dial 2.


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> @khanmein,
> Screenshots don't lie, lol. Check out the memory
> 
> 
> 
> 
> 
> 
> 
> 
> 
> J/K
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ghetto cooling FTW


i didn't say u lied, but FYI i didn't OC cos my CPU is damn weak, so there's no point.


----------



## DeathAngel74

Quote:


> Originally Posted by *khanmein*
> 
> how come your micron OC better than samsung? impossible..


Quote:


> Originally Posted by *khanmein*
> 
> i didn't said u lie but FYI, i didn't OC cos my CPU is damn weak so no point.


I know, I was just kidding. I was referring to that though ^^^ I used to have a 4460; it was not fun.


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> I know I was just kidding. I was referring to that though ^^^ I used to have a 4460, it was not fun.


seriously, i don't like to OC cos my country's weather is above 30°C ambient temp.

+500 MHz on the memory is good, & i saw your core hit 2K for 24/7 usage + fan speed maxing at 38xx RPM, which is damn loud.

going water-cooling is the wiser choice if u wanna hit 2K core @ 24/7


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Chrome, by default, uses the GPU to decode youtube videos. However, the 4K stream is using the VP9 codec. I do not think VP9 video decode is supported by Maxwell; I know Pascal can decode VP9 in hardware. That may explain why your CPU is doing all the work with a 970 installed.
>
> Have you tried a 1080p stream and looked at what is doing the decoding work then?
>
> One more thing: you can set a flag to prefer HTML5 over the built-in Flash plugin, and YouTube may then serve mp4 (AVC) streams instead, which Maxwell can decode in hardware. That may make a difference for you
> 
> 
> 
> thanks a lot. now are u still living at HK? symmetrical speed is the best. my country is good but government sux. my monthly internet payment is around USD 45.
> 
> i can said majority 1080p vids that i watching on youtube is running vp9 expect some is running mp4 @ avc1.640028 but the mp4.
> 
> i'm using few extension like adblock plus, disable HTML5 auto-play & speed dial 2.
Click to expand...

Yes, I am still here. HK does have the benefit of being small, so it is not that hard to roll fiber out everywhere. My internet is about US$28 a month.

Certainly try turning the extensions off and testing.

Sometimes the local caching gets overloaded. You can block the following IP ranges with your firewall to disable access to the local distributed cache: 206.111.0.0/16 and 173.194.55.0/24
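If you want to double-check whether a given server address actually falls inside those ranges before adding a firewall rule, Python's standard `ipaddress` module makes it trivial (a quick sketch; `is_cache_ip` is a made-up helper name, and the ranges can of course change over time):

```python
import ipaddress

# The cache ranges quoted above; verify they are still current before blocking.
CACHE_RANGES = [ipaddress.ip_network(r) for r in ("206.111.0.0/16", "173.194.55.0/24")]

def is_cache_ip(addr: str) -> bool:
    """True if addr belongs to one of the distributed-cache ranges to block."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in CACHE_RANGES)

print(is_cache_ip("173.194.55.10"))  # True  -> would be blocked
print(is_cache_ip("8.8.8.8"))        # False -> left alone
```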


----------



## ucode

Quote:


> Originally Posted by *khanmein*
> 
> my monthly internet payment is around USD 45.


WTH, 50Mbps / 5Mbps costs USD 80 per month here, so I'm stuck with 15 / 1.5, and don't even get me started on ping times.


----------



## DeathAngel74

Meh! PXOC is a PITA! Games kept crashing the driver to the desktop....DXGI_ERROR_DEVICE_REMOVED_blahblahblah.
Obviously, my previous overclock was not stable. It was the core, not the memory. I set vccio to 1.225v, vccsa to 1.225v, pch core voltage to 1.0v, and cpu standby to 1.243v. The core was unstable at 2113/2100/2088. Finally stable at 2062-2075MHz/8996MHz.
I just got done playing Star Wars Battlefront 2015 @1440p/AF x16, so yay! time for bed.
Final overclock settings:
+100 voltage
120% power
92C temp
+88 core (2062-2075MHz) 1.081-1.093V
+501 memory (4498MHz, 8996MHz effective)
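As a sanity check on those memory numbers: the offset is applied to the double-data-rate clock the overclocking tools report, and GDDR5 is quoted at twice that again as the "effective" rate. The ~3997 MHz base below is inferred from the figures quoted, not a measured value:

```python
# Effective GDDR5 data rate from a reported base clock plus an offset.
# GDDR5 transfers data on both clock edges, hence the factor of 2.
def effective_mem_clock(reported_base_mhz: int, offset_mhz: int) -> int:
    """Effective GDDR5 data rate in MHz for a given reported clock + offset."""
    return (reported_base_mhz + offset_mhz) * 2

print(effective_mem_clock(3997, 501))  # 8996 -> matches the figure quoted above
```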


----------



## khanmein

Quote:


> Originally Posted by *ucode*
> 
> WTH, for 50Mbps / 5Mbps costs USD80 per month here so am stuck with 15 / 1.5 and don't even mention ping times.


which country? i thought my country was the most expensive one! in singapore you can get 1Gbps down & 500 Mbps up for that kind of payment.

my package comes with 30 Mbps down / 10 Mbps up, which included a free Archer C5 C2 & a Motorola digital cordless telephone C1001LA, but the speed is capped at 28.xx Mbps down & 10/11 Mbps up with ping ~12 ms.

after receiving the free upgrade, my average ping is ~12 ms with a cap of 47.xx Mbps down & 19.xx Mbps up. FYI, any calls will be charged accordingly.

my government offered upgrades for 30 > 50, 50 > 100, & 10 or 20 > 30 (https://www.tm.com.my/speedupgrade/Pages/index.html)

previously, i lived with 1 Mbps / 384 Kbps for more than 10 years until Nov 2016 @ USD 26/month, with unlimited local calls, a lousy modem router (with a known back-door) & one cordless phone.


----------



## comanzo

So, is there any bios for the 1070 that pushes voltage above the 1.093 volts? I am using a 1070 FTW.


----------



## icold

Quote:


> Originally Posted by *comanzo*
> 
> So, is there any bios for the 1070 that pushes voltage above the 1.093 volts? I am using a 1070 FTW.


No, because we don't have a Pascal BIOS tweaker.


----------



## comanzo

Isn't there a strix bios for the 1070 that pushes voltage above 1.25 volts? I see it all around the internet, but the problem is that I only see it for the 1080. Thanks for replying so quickly.


----------



## icold

Quote:


> Originally Posted by *comanzo*
> 
> Isn't there a strix bios for the 1070 that pushes voltage above 1.25 volts? I see it all around the internet, but the problem is that I only see it for the 1080. Thanks for replying so quickly.


That was a BIOS made by Asus specially for an overclocker. He decided to share it with the world.


----------



## comanzo

So was that bios the overclocker shared only for the 1080? Or can I find it for the 1070? Sorry for all these questions, but I am really trying to overclock my 1070 to 1080 performance. Since the 1070 has the same layout as the 1080, just with some SMs/shader cores disabled, do you think I can use the 1080 Strix bios?


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Isn't there a strix bios for the 1070 that pushes voltage above 1.25 volts? I see it all around the internet, but the problem is that I only see it for the 1080. Thanks for replying so quickly.


1080 only


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> Meh! PXOC is a PITA! Games kept crashing the driver to the desktop....DXGI_ERROR_DEVICE_REMOVED_blahblahblah.
> Obviously, my previous overclock was not stable. It was the core, not the memory. I set vccio to 1.225v, vccsa to 1.225v, pch core voltage to 1.0v, and cpu standby to 1.243v.The core was unstable at 2113/2100/2088. Finally stable at 2062-2075MHz/8996Mhz.
> I just got done playing Star Wars Battlefront 2015 @1440p/AF x16, so yay! time for bed.
> Final overclock settings:
> +100 voltage
> 120% power
> 92C temp
> +88 core (2062-2075MHz) 1.081-1.093V
> +501 memory (4498MHz, 8996MHz effective)


I don't like the EVGA software much either. The auto overclock tool, while not producing stable overclocks, does give you an easy way to see whether the voltage adjustments change and improve things. The MSI Afterburner software is generally better, but the user interface logic in the voltage curve is pretty confusing.

Every board is different; adjusting these voltages is a fine-tuning exercise. Did you try a variety of voltage levels?

If it were me, I would also try adjusting only vccio while leaving SA voltage at stock, and vice versa, and see what gives you the best performance. Obviously, if you get the best performance at auto, set it back again.


----------



## comanzo

So if the Strix bios is only for the 1080, then there's no other bios for the 1070 that unlocks voltages, correct? If not, is there at least a bios for the 1070 that has given the best clock offsets for most people? Or am I better off just staying with my own bios? Since I have a dual-bios switch, I don't mind the risks of switching bios.


----------



## DeathAngel74

Quote:


> Originally Posted by *gtbtk*
> 
> I don't like the EVGA software much either. the auto overclock tool, while not producing stable overclocks, does give you an easy way to see if the voltage adjustments change and improve things though. The MSI Afterburner software is better generally, but the user interface logic in the voltage curve is pretty confusing.
> 
> Every board is different. adjusting these voltages is a fine tuning exercise. Did you try a variety of voltage levels?
> 
> If it was me. I would also try adjusting only vccio and leaving SA voltage at stock and vis-versa and see what gives you the best performance. Obviously if you get best performance at auto set it back again.


I wasn't able to get anywhere near +495-+501 before, so I'm happy. Going to try ASUS M8H 3201 BIOS today and see how it goes. Thanks for your advice


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> So if the strix bios is only for the 1080, then there's no other bios for the 1070 to unlock voltages correct? If there is no other bios to unlock voltage for the 1070, is there at least a bios for the 1070 that provided the best clock offsets for most people? Or am I better off just staying with my bios? Since I have a dual bios switch, I don't mind the risks of switching bios.


the only Strix 1.2v bios is for the 1080; there is no 1.2v bios for the 1070. Having said that, unless you are using specialized sub-zero cooling, Pascal doesn't really benefit much from extra voltage anyway.

One of the other guys here has an FTW with a hybrid water cooler, and he is running the Zotac Amp Extreme bios; it improved his performance a little. That bios has a 300W power limit, which is higher than anything else.

If you try the 1080 1.2v bios, make sure you come back and let us know how it worked out


----------



## comanzo

Sure. Will definitely try it. Can you provide me the name of the person who used the zotac amp extreme bios? Also, can you provide the strix bios link? There are several different versions of the bios, and I want to make sure I get the latest one. I will definitely report back on my results later on. Thanks.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I don't like the EVGA software much either. the auto overclock tool, while not producing stable overclocks, does give you an easy way to see if the voltage adjustments change and improve things though. The MSI Afterburner software is better generally, but the user interface logic in the voltage curve is pretty confusing.
> 
> Every board is different. adjusting these voltages is a fine tuning exercise. Did you try a variety of voltage levels?
> 
> If it was me. I would also try adjusting only vccio and leaving SA voltage at stock and vis-versa and see what gives you the best performance. Obviously if you get best performance at auto set it back again.
> 
> 
> 
> I wasn't able to get anywhere near +495-+501 before, so I'm happy. Going to try ASUS M8H 3201 BIOS today and see how it goes. Thanks for your advice
Click to expand...

I am not familiar with the manufacturer-specific version numbers. To keep the core functionality and bug fixes that come from Nvidia, you are better served tracking the Nvidia version numbers (86.04.50.00.XX), because then you can compare which generation of bios it is.

you can search for all of them here https://www.techpowerup.com/vgabios/
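For comparing those Nvidia-style version strings, splitting on the dots and comparing the fields numerically is enough. A minimal sketch, assuming all fields are decimal (which holds for the versions quoted in this thread); `parse_vbios` and `newer` are just illustrative names:

```python
def parse_vbios(version: str) -> tuple:
    """Split an Nvidia vBIOS version like '86.04.50.00.70' into integer fields."""
    return tuple(int(field) for field in version.split("."))

def newer(a: str, b: str) -> str:
    """Return whichever of two version strings is the later build."""
    return a if parse_vbios(a) >= parse_vbios(b) else b

# The two EVGA Micron-fix bios versions mentioned in this thread:
print(newer("86.04.50.00.70", "86.04.50.00.72"))  # 86.04.50.00.72
```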



http://www.3dmark.com/fs/11821805


----------



## DeathAngel74

Oh, I'm still on the eVGA stock 86.04.50.00.70. Sorry, I meant ASUS motherboard bios ASUS MAXIMUS VIII HERO.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> Oh, I'm still on the eVGA stock 86.04.50.00.70. Sorry, I meant ASUS motherboard bios ASUS MAXIMUS VIII HERO.


That may well help things along.

The thing I have concluded is that Pascal is loading up the inter-component motherboard communications more than anything that has gone before. Where we used to have systems that appeared stable under the lower loads of Maxwell GPUs, we are now bumping into the limits of stability, so a better-tuned bios may help.


----------



## ucode

@khanmein next door bro

@comanzo the T4 VBIOS was provided by Elmor, who works for Asus as part of the R&D team. He did say he was going to put together a 1070 VBIOS, but perhaps some entity was worried the 1070's performance would get too close to the 1080's and gave him a slap on the wrist, and the end result is no special 1070 VBIOS.


----------



## khanmein

@DeathAngel74 your iCX's default stock bios is 86.04.50.00.*70*??

my ACX 3.0's default is *.72*


----------



## DeathAngel74

*.70* was the micron fix, .*72* was the micron fix + higher fan curve to compensate for heat on VRAM and VRM. It was before they issued the thermal pads to everyone.


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> *.70* was the micron fix, .*72* was the micron fix + higher fan curve to compensate for heat on VRAM and VRM. It was before they issued the thermal pads to everyone.


yeah i know so your iCX came with *.70* ??


----------



## DeathAngel74

yeah. I can't believe the VRAM is at 9010-9012MHz. I think upping VCCIO/VCCSA to 1.2250v, RAM to 1.3650v and CPU standby to 1.24375v helped. Core is kinda average...2088-->2075-->2063-->sometimes 2050.


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> yeah. I can't believe the VRAM is at 9010-9012MHz. I think upping the VCCIO/VCCSA to 1.2250v, RAM to 1.3650v and CPU standyby to 1.24375v helped. Core is kinda average...2088-->2075-->2063-->sometimes 2050.


once u go water-cooling, the core clock will be slightly higher than that, e.g. 2100


----------



## DeathAngel74

load temps would have to be 25-30C though. The card runs at 47-49C (gpu) / 37-40C (power+memory) max during extended gaming sessions on air. Remember I have a loud family, so fans at 100% don't bother me.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> *.70* was the micron fix, .*72* was the micron fix + higher fan curve to compensate for heat on VRAM and VRM. It was before they issued the thermal pads to everyone.


EVGA is the only vendor who does this, but they use version number 86.04.50.00.70 for both the FTW and the SC Micron-fix bioses even though they are not the same bios file.


----------



## DeathAngel74

I wish an nVidia or AIB vendor employee would leak how they edit those vBIOS files. I know it's highly unlikely, but come on! They have to have a tool somewhere if they're making "fixed" files. dirty bastids


----------



## DeathAngel74

http://www.3dmark.com/3dm/18245881?


----------



## TheBoom

How do you guys tell if the motherboard voltages are enough?

This is the first time I'm hearing that CPU and motherboard related voltages are affecting GPU overclocks.


----------



## Psilosoph8

anyone have the 1070 hybrid with the AIO it comes with?


----------



## Mr-Dark

Quote:


> Originally Posted by *Psilosoph8*
> 
> anyone have the 1070 hybrid with the AIO it comes with?


Hello

Yes, I have one here, what you need ?


----------



## Psilosoph8

Quote:


> Originally Posted by *Mr-Dark*
> 
> Hello
> 
> Yes, I have one here, what you need ?


do you think it's possible to replace the stock 120mm AIO it comes with for a corsair h100i?

maybe a better question is, how does the stock AIO mount onto the gpu?


----------



## Mr-Dark

Quote:


> Originally Posted by *Psilosoph8*
> 
> do you think it's possible to replace the stock 120mm AIO it comes with for a corsair h100i?


Hello

No, you can't.. check this




as you can see, EVGA uses a circular block while the H100i is a square block..

but why do you want to change the AIO? the performance is amazing.. the card barely breaks 40C under full load...


----------



## Psilosoph8

Quote:


> Originally Posted by *Mr-Dark*
> 
> Hello
> 
> No, you can't.. check this
> 
> 
> 
> as you can see, evga use circular block while the H100i is Square block..
> 
> but why you want to change the AIO ? the performance is amazing.. the card barely break 40C under full load...


just figured a 240mm rad would be even better haha. by 'block' do you mean the entire shape of the pump or the copper plate that contacts the GPU?
the plate on my h100i v2 is also circular, although the top portion of the pump is square.

will the h100i v2 pump physically not fit into/under the block even if i can mount it?


----------



## DeathAngel74

Mr dark? How u been? It's been a while...


----------



## Mr-Dark

Quote:


> Originally Posted by *Psilosoph8*
> 
> just figured a 240mm rad would be even better haha. by 'block' do you mean the entire shape of the pump or the copper plate that contacts the GPU?
> the plate on my h100i v2 is also circular, although the top portion of the pump is square.
> 
> will the h100i v2 pump physically not fit into/under the block even if i can mount it?


The whole copper plate is different.. also EVGA changed the base of the copper for better contact between the gpu and the AIO..



The GTX 1070 is a 170W gpu.. so 120mm vs 240mm rad means almost nothing.. maybe 2-3C, and you can keep the fans at low speed on the 240mm rad..
Quote:


> Originally Posted by *DeathAngel74*
> 
> Mr dark? How u been? It's been a while...


Hey bro, everything fine how are you ?

Girls and this life make me super busy


----------



## DeathAngel74

I thought you got lost playing WoW, lol. Happy new year. Nothing new except a delidded 6700k and an evga 1070 SC2 iCX. 20 degrees cooler and 100MHz more, to 4700MHz.


----------



## gtbtk

Quote:


> Originally Posted by *TheBoom*
> 
> How do you guys tell if the motherboard voltages are enough?
> 
> This is the first time I'm hearing that CPU and motherboard related voltages are affecting GPU overclocks.


It is certainly an area that gets overlooked a lot. The industry and the media have a habit of thinking in isolated "boxes" and lose sight of the fact that everything is interdependent on everything else. Mind you, we are talking about tiny fractions of a volt.

The memory controller and PCIe controller that drive the communication to the graphics card and back are on-die on the CPU. The only way of managing the stability of the controller, and the quality of the resulting signalling across the bus, is with the appropriate voltages. This is more relevant if you are overclocking than if you run everything at stock speeds, but then again, you would not be trying to score 21000 graphics scores in Firestrike if you were only interested in leaving things at stock.








Vcore levels will also have an impact on the CPU's ability to reach its maximum turbo frequencies. The "just enough so it doesn't BSOD" voltage, while a safe rule of thumb, will not give you the absolute best performance from your CPU: it is usually not enough to let the chip hold its maximum turbo frequencies. If the CPU runs slower, it cannot feed as much data to the GPU to process, and the result is lower framerates. Obviously, voltage levels also need to be balanced against thermals and kept below levels where they become immediately destructive to the silicon. It is all about the best compromise.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> http://www.3dmark.com/3dm/18245881?


I reckon you can still tweak another 500-1000 out of that graphics score if you wanted to.


----------



## DeathAngel74

Funny story: right after the run it bsod'd, lol.


----------



## Mr-Dark

Quote:


> Originally Posted by *DeathAngel74*
> 
> I thought you got lost playing WoW, lol. Happy new year. Nothing new except delid 6700k and evga 1070 sc2 icx. 20 degrees cooler and 100mhz more to 4700mhz.


Hahah, No games at all from last year almost







but now its the time for new PC again..

I have 2 GTX 1070 hybrid here and some 7700k's.. but will pickup some Ryzen once they hit the market on 3/Mar









I just upgraded the GTX 1070 SC to Hybrid on my Sister pc.. the Hybrid is awesome card.. barely break 40c under heavy load


----------



## HowYesNo

guys, i have this gainward gtx1070 and it runs quite well. got it oc'd +99/196 core/mem, which gives 1999/4201. temps are 75C-ish at load.
so i decided to replace the thermal paste on it, and got a good result, the core now running around 66C. the problem is the vrm area.
before i disassembled the cooler, the vrm did get quite hot but i could keep my finger on it. now, with the lower core temp, the vrm area (backside) is hotter; i can't keep my finger on it as long as before. i cranked up the fan curve, no help. i believe this happened because the thermal pad is no longer sitting properly nor as clean as before, so i ordered a pad from phobya.
i'm interested whether there is a sort of heatsink that would go on the backside and cover only the vrm area, not a full back plate.
sorry if this has been answered before, i didn't go through much of this topic.
this is my card

reference with the hot area in red

and something like this to mount using the holes marked green


----------



## KedarWolf

I'm pretty sure the MSI Gaming X and Gaming Z 1070s are the same cards, re: same binning etc. Even though the Z's clocks are a bit higher, with the Z you are just paying extra for the RGB and such. Flashing the Z BIOS would then essentially make it a Z card. Can anyone confirm?

I'm buying two for SLI, likely from Newegg.ca, so if one is Samsung and one is Micron I can return one as incompatible.


----------



## asdkj1740

Quote:


> Originally Posted by *Mr-Dark*
> 
> Hello
> 
> Yes, I have one here, what you need ?


long time no see, how are you


----------



## asdkj1740

Quote:


> Originally Posted by *Mr-Dark*
> 
> The whole cooper plate is different.. also Evga change the base of the copper for better contact between the gpu and the AIO..
> 
> 
> 
> The GTX 1070 is 170W gpu.. so 120mm vs 240mm Rad mean nothing.. maybe 2-3c and you can keep the fan's at low speed on the 240m rad..
> Hey bro, everything fine how are you ?
> 
> Girls and this life make me super busy


EVGA's vrm cooling plate blocks any flat-copper-base AIO; only the EVGA gpu AIO can be mounted on EVGA cards unless you take the vrm cooling plate off.
an extremely annoying design from EVGA that effectively blocks changing the cooler.


----------



## asdkj1740

Quote:


> Originally Posted by *HowYesNo*
> 
> guys, i have this gainward gtx1070, and it runs quite good. got it oc +99/196 core/mem that gives 1999/4201. temps are at 75C-ish at load.
> so i decided to replace thermal paste on it, and got good result on core going around 66C. the problem is the vrm area.
> before i disassembled the cooler vrm did get quite hot but i could keep my finger on it. now with lower core temp the vrm area (backside) is hotter, can't keep my finger long as before. i did crank up fan curve, no help. i believe this happened due to thermal pad now not sitting properly nor being as clean so i ordered pad from phobya.
> i am interested is there a sort of heatsink that would go on the backside and cover only vrm area, not the full back plate.
> sory if this has been answered before, didn't go much through this topic.
> this is my card
> 
> reference with hot area in red
> 
> and something like this to mount using holes marked green


this card has a reference-level power limit setting, so the vrm temp should not be a big concern.

you can check the marks on the stock thermal pads to see whether they make tight contact. if not, you can replace them with thicker ones.

if you wish to add heatsinks to the back of the pcb over the vrm area, you need thermal pads thick enough to completely cover all the components and sticky enough to make proper contact. otherwise, even with thick pads, the contact won't be good enough, and very sticky pads are hard to clean up if you later want to take the heatsink off...

the simpler and more effective way is to add a little fan blowing directly on the back of the pcb, as long as you can figure out how to mount it.


----------



## Mr-Dark

Quote:


> Originally Posted by *asdkj1740*
> 
> long time no see, how are you


Hey bro

All fine, this life make me super busy








Quote:


> Originally Posted by *asdkj1740*
> 
> the vrm cooling plate of evga blocks all flat copper base aio. only evga gpu aio can be mounted on evga cards unless taking off the vrm cooling plate.
> an extremely disgusting design from evga tries to block changing cooler.


I think that's fine since this is the Hybrid card.. if you want to change the cooling, why not buy a cheap blower design and mod it as you like?


----------



## asdkj1740

Quote:


> Originally Posted by *Mr-Dark*
> 
> Hey bro
> 
> All fine, this life make me super busy
> 
> 
> 
> 
> 
> 
> 
> 
> I think that's fine as this Hybrid card.. if you want to change the cooling why not buying cheap blower design and mod it as you like ?


you have a lot of requests in the other thread. your life will be even busier lol.

the evga hybrid card is good, extremely good. but they were too late to be available on the market....
and if you had already bought the air-cooled evga card, you will suffer a lot when changing to an aio cooler. very very painful...


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> Funny story, after the run bsod, lol.


what BSOD was it?

the error can hint at what needs adjusting


----------



## DeathAngel74

@gtbtk
combination of gpu and cpu overclocking....
4.9 @ 1.5v, lol. loud buzzing, black-screen gpu driver crash, then bsod, which pointed to not enough voltage. Which driver should i be using? 378.77 and 357.49 both crash to a black screen since i got the iCX on Thursday. 3DMark says my graphics card is not recognized ("Mystery Machine"). I don't care about BM scores, just game-stable ATM. Any suggestions are welcome, please.


----------



## Mr-Dark

Quote:


> Originally Posted by *asdkj1740*
> 
> you have a lot of requests at another post. your life will be super more busier lol.
> 
> evga hybrid card is good, extremely good. but these are too late to be available on the market....
> but if you have bought the air cooling version evga card in advance, then you will suffer a lot when changing to aio cooler. very very painful...


Yeah, I will check the request thread soon; that needs so much time









that's true.. the Hybrid cards hit the market 2-4 months after release... but they're worth it; I used the 980 Ti Hybrid in SLI and now the 1070..


----------



## gtbtk

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm pretty sure the MSI Gaming X and Gaming Z 1070s are the same cards, Re: same binning etc. even though the Z the clocks are a bit higher, you just are paying with the Z for the extra RGB and stuff. Flashing the Z BIOS would essentially make it a Z card then. Anyone confirm?
> 
> I'm buying two for SLI, likely Newegg.ca so if one is Samsung and one is Micron I can return one as incompatible


With the exception of the back LED badge, you are right. I am running my Gaming X with a Gaming Z bios and it works great.

If the two cards come with different memory brands, there should not be an issue. After the bios update, there is nothing wrong with either brand of memory.


----------



## DeathAngel74

Hey Dark, what temps did you have after the 6700k delid? My previous were 78-86C max, after delid 57-64C max. I used Gelid GC Extreme on the die and IHS, no reseal, H100i v2 and 4.7 @ 1.4v


----------



## gtbtk

Quote:


> Originally Posted by *HowYesNo*
> 
> guys, i have this gainward gtx1070, and it runs quite good. got it oc +99/196 core/mem that gives 1999/4201. temps are at 75C-ish at load.
> so i decided to replace thermal paste on it, and got good result on core going around 66C. the problem is the vrm area.
> before i disassembled the cooler vrm did get quite hot but i could keep my finger on it. now with lower core temp the vrm area (backside) is hotter, can't keep my finger long as before. i did crank up fan curve, no help. i believe this happened due to thermal pad now not sitting properly nor being as clean so i ordered pad from phobya.
> i am interested is there a sort of heatsink that would go on the backside and cover only vrm area, not the full back plate.
> sory if this has been answered before, didn't go much through this topic.
> this is my card
> 
> reference with hot area in red
> 
> and something like this to mount using holes marked green


If you put thermal pads between the VRM and the backplate and it is now hotter to the touch, it means the thermal pads are doing exactly what they are supposed to do: transferring the heat in the vrms AWAY into the back plate, which radiates it to the atmosphere. That is a good thing, not something to worry about


----------



## asdkj1740

Quote:


> Originally Posted by *Mr-Dark*
> 
> Yeah, I will check the request thread soon as that need so much time
> 
> 
> 
> 
> 
> 
> 
> 
> 
> that's true.. Hybrid card hit the market after 2-4 month after the release...but they worth it as one used the 980 Ti Hybrid in SLI and now the 1070..


the gpu mounting-hole spacing from the 980 Ti to the 1080 is the same, so the evga hybrid 980 Ti aio can be reused on pascal cards, but the vrm cooling part is a big concern...


----------



## Mr-Dark

Quote:


> Originally Posted by *DeathAngel74*
> 
> Hey Dark, what temps you had after 6700k delid? My previous were 78-86 max, after delid 57-64C max. I used gelid gc extreme on the die and IHS. no reseal, H100i v2 and 4.7 @ 1.4v


Mine isn't delidded yet, and I won't delid this one







maybe a 7700k if Ryzen fails


----------



## Mr-Dark

Quote:


> Originally Posted by *asdkj1740*
> 
> the gpu socket size from 980ti to 1080 are all the same so the hybrid 980ti evga aio can be reused on pascal cards, but the vrm cooling part is a big concern...


I know both are the same; also, you may check the 1070 FTW Hybrid.. that's better than the normal Hybrid


----------



## HowYesNo

Quote:


> Originally Posted by *gtbtk*
> 
> If you put thermal pads between VRM and backplate and it is now hotter to touch, it means the thermal pads are doing exactly what they are supposed to be doing and transferring the heat in the vrms AWAY and into the back plate to radiate away to the atmosphere. That is a good thing, not something to worry about


my card doesn't have a backplate.


----------



## DeathAngel74

What driver is everyone using? SWBF, TW3, Batman AK: all crashing...... PLZ HELP MAH!


----------



## asdkj1740

Quote:


> Originally Posted by *Mr-Dark*
> 
> I know both is same, also you may check the 1070 FTW Hybrid.. that better than the normal Hybrid


the ftw card is $460, the ftw hybrid card is $490, the ftw hybrid kit is $120.
can't you see how stupid i was to buy the ftw too early?
evga treats its old loyal customers like shi-t.

not to mention the icx upgrade; evga totally forgets and forgoes its old loyal customers.


----------



## asdkj1740

Quote:


> Originally Posted by *HowYesNo*
> 
> my card doesn't have a backplate.


it is hard to find the right heatsink to mount on your card. you can only go for sticky thermal pads with separate heatsinks.


----------



## Mr-Dark

Quote:


> Originally Posted by *asdkj1740*
> 
> ftw card is $460, ftw hyrbid card is $490, ftw hybrid kit is $120.
> cant you see how stupid is me to buy the ftw too eariler
> evga serves the old loyal customers like shi-t.
> 
> not to mention the icx upgrade, evga totally forget and forgo the old loyal customers.


The FTW is $414, the normal Hybrid is $429, and the FTW Hybrid is $459.

The best is the FTW Hybrid... 215W power limit, higher boost clock, RGB, and a quiet blower fan


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm pretty sure the MSI Gaming X and Gaming Z 1070s are the same cards, Re: same binning etc. even though the Z the clocks are a bit higher, you just are paying with the Z for the extra RGB and stuff. Flashing the Z BIOS would essentially make it a Z card then. Anyone confirm?
> 
> I'm buying two for SLI, likely Newegg.ca so if one is Samsung and one is Micron I can return one as incompatible


To answer my own question: they bin for the Gaming Z; if a chip fails that bin it becomes a Gaming X, and if it fails the Gaming X bin it becomes a regular Gaming. Only $10 more for the Gaming Z over the X.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> @gtbtk
> combination of gpu and cpu overclocking....
> 4.9 @ 1.5v, lol. loud buzzing, black screen gpu driver crash, then bsod, which pointed to not enough voltage. Which driver should i be using? 378.77 and 357.49 both crash to black-screen since i got the iCX on Thursday. 3DMark says my graphic card is not recognized "Mystery Machine". I don't care about BM scores, just game stable ATM. Any suggestions are welcome, please.


Not surprising that 3dmark has not caught up with the ICX yet. The results will update when their database catches up.

I have been using the .77 drivers

Maybe you need to drop the vcore voltage back a notch or two. 1.5v is pretty high. Did you tune the bios voltages with this new card or the old one?


----------



## asdkj1740

Quote:


> Originally Posted by *Mr-Dark*
> 
> The FTW is 414$ -- Normal Hybrid -- 429$--FTW Hybrid - 459$
> 
> 
> 
> 
> 
> 
> 
> 
> 
> the best is the FTW Hybrid... 215W power limit- Higher boost clock-- RGB--Quiet blower fan


my above stated prices are msrp.
so i won't buy an evga air-cooled card anymore.
215w is extremely low with respect to the ftw's pcb, and compared to other 1070s. this is the other thing i dislike about evga: too many marketing gimmicks...


----------



## Mr-Dark

Quote:


> Originally Posted by *KedarWolf*
> 
> To answer my own question they bin the Gaming Z, if it fails it becomes a Gaming X, if Gaming X fails bin it becomes a regular Gaming. Only $10 more for Gaming Z.over X.


That's correct bro

All use the same PCB but each comes with a different bios... Should note that card is the best for an SLI setup.. that cooler is a beast..

I had a pair in SLI and even at the stock fan curve (super silent) the max temp was around 70c.. also the power limit is very high.. the cards barely break 75% power usage under full load


----------



## Mr-Dark

Quote:


> Originally Posted by *asdkj1740*
> 
> my above stated prices are msrp.
> so i wont buy evga air cooling card anymore.
> 215w is extremely low, with respect to the pcb of ftw, and compared to other 1070s. this is what i dislike evga on the other hand. too many marketing gimmicks...


The evga air cooler isn't good.. MSI's is way better..

the power limit on the SC and Hybrid is around 190W when you max the power slider, while the FTW goes to 215W..

MSI power limit is 315W...


----------



## asdkj1740

Quote:


> Originally Posted by *Mr-Dark*
> 
> Evga air cooler isn't good.. MSI is way better..
> 
> the power limit on the SC and Hybrid is around 190W when you max the power slider while FTW go to 215W..
> 
> MSI power limit is 315W...


i have flashed to the 300w max zotac amp extreme bios now.
the ftw's 215w is weaker than the galax ex, which gets 250w with only 5+2 phases.

the msi one has some power issues, but overall it is a good card, just overpriced.


----------



## asdkj1740

the galax ex pcb is the same as a gainward china-exclusive card, which has a nickname: the beggar card. it's very cheap, in both its price and its pcb.


----------



## DeathAngel74

@gtbtk,
Nope, left vccio and vccsa on auto. Just on the new card, reverted this morning. Set everything back to 4.7ghz @ 1.417-1.419v. 59-66c at 100% load @ 15 minutes.


----------



## DeathAngel74

Whoops, wrong thread


----------



## Mr-Dark

Quote:


> Originally Posted by *asdkj1740*
> 
> i have flashed to 300w max zotac amp extreme bios now.
> ftw 215w is weaker than the galax ex which has 250w with 5+2phases only.
> 
> msi one has some issue to the power, but overall it is a good card, just overpriced.


Issues with power? Mine were super smooth in SLI..


----------



## MR-e

Hi guys, I sold my EVGA Titan X (Maxwell) card and am looking for help choosing GTX 1070s for SLI.

I will watercool both GPUs with the Aquacomputer Kryographics Pascal for GTX 1070 waterblocks + active XCS backplates. What is the best 1070 for SLI and watercooling with the Aquacomputer waterblocks I chose?

Thank you!


----------



## DeathAngel74

Msi gaming x and z are what I would choose. I have an evga 1070 sc2 though.


----------



## DeathAngel74

Are these acceptable?


or just average?


----------



## KedarWolf

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mr-Dark*
> 
> Evga air cooler isn't good.. MSI is way better..
> 
> the power limit on the SC and Hybrid is around 190W when you max the power slider while FTW go to 215W..
> 
> MSI power limit is 315W...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i have flashed to 300w max zotac amp extreme bios now.
> ftw 215w is weaker than the galax ex which has 250w with 5+2phases only.
> 
> msi one has some issue to the power, but overall it is a good card, just overpriced.

MSI Gaming Z 1070. 10 phase power design.

As with other Pascal based cards we've reviewed, the PWM control for the GPU side is the UPI semiconductor up9511. In this configuration, MSI is using the full 8-phase capability of the uP9511.

A pair of Sinopower SM7320 Dual Channel MOSFETS in conjunction with eight ON Semiconductor 4C86N PowerPhase MOSFETS make up the overall 10-phase power design of the MSI Geforce GTX 1070 Gaming Z. Output filtering uses 10 0.20uH Super Ferrite Choke (SFC) inductors and 10 820uF solid capacitors.

Memory VRM is controlled by the UPI Semiconductor uP1641. The Sinopower SM7320 Dual Channel MOSFETS help make up the 2-phase memory power delivery. The uP1641 includes two integrated MOSFET drivers (hence no external drivers for the memory MOSFETS) and is capable of driving two phases. The 10-phase power of this card is 8+2, where 8 phases are used for the GPU and 2 phases are for the memory.


----------



## MR-e

Quote:


> Originally Posted by *DeathAngel74*
> 
> Msi gaming x and z is what I would choose. I have evga 1070 sc2 though.


Is the MSI Gaming Z compatible with the Aquacomputer Kryographics Pascal for GTX 1070 Waterblocks + Active XCS Backplates?

Edit - doesn't look like that's possible. The Aquacomputer Kryographics block looks to only be compatible with FE cards. Are there any FE cards that are better than the others when it comes to overclocking and watercooling?


----------



## rfarmer

Quote:


> Originally Posted by *MR-e*
> 
> Is the MSI Gaming Z compatible with the Aquacomputer Kryographics Pascal for GTX 1070 Waterblocks + Active XCS Backplates?
> 
> Edit - doesn't look like that's possible. The Aquacomputer Kryographics block looks to only be compatible with FE cards. Is there any FE cards that are better than the other when it comes to overclocking and watercooling?


I have a FE and was wondering the same thing when I bought it. All FE cards are the same, manufactured by nVidia themselves, not the individual vendors. Get whichever one you can get the best deal on.

Btw I have that block and backplate, they are very well made and cool well.


----------



## MR-e

Quote:


> Originally Posted by *rfarmer*
> 
> I have a FE and was wondering the same thing when I bought it. All FE are the same and manufactured by nVidia themselves not the individual vendors. Get whichever one you can get the best deal on.
> 
> Btw I have that block and backplate, they are very well made and cool well.


Thank you, will see what I can get from EVGA as they are waterblock friendly


----------



## KedarWolf

Quote:


> Originally Posted by *MR-e*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DeathAngel74*
> 
> Msi gaming x and z is what I would choose. I have evga 1070 sc2 though.
> 
> 
> 
> Is the MSI Gaming Z compatible with the Aquacomputer Kryographics Pascal for GTX 1070 Waterblocks + Active XCS Backplates?
> 
> Edit - doesn't look like that's possible. The Aquacomputer Kryographics block looks to only be compatible with FE cards. Is there any FE cards that are better than the other when it comes to overclocking and watercooling?

https://www.ekwb.com/news/ek-releases-msi-gtx-1080-tf6-full-cover-water-block/

Works with the Gaming X 1070, has a compatible passive backplate, and should work with the Gaming Z, since it's the same card in the guts.


----------



## rfarmer

Quote:


> Originally Posted by *KedarWolf*
> 
> https://www.ekwb.com/news/ek-releases-msi-gtx-1080-tf6-full-cover-water-block/
> 
> Works with Gaming X 1070, has a passive compatible backplate, should work with Gaming Z, same card in the guts.


Yeah EK is the way to go for non reference blocks, they have pretty much all the major ones covered.


----------



## KedarWolf

Quote:


> Originally Posted by *rfarmer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> https://www.ekwb.com/news/ek-releases-msi-gtx-1080-tf6-full-cover-water-block/
> 
> Works with Gaming X 1070, has a passive compatible backplate, should work with Gaming Z, same card in the guts.
> 
> 
> 
> Yeah EK is the way to go for non reference blocks, they have pretty much all the major ones covered.

And it works with the Gaming Z, I checked on their site.


----------



## MR-e

Thanks for the info on the EKWB option guys. I checked the EK configurator and came to the same conclusion, but all my past GPUs were watercooled with EKWB. I want to try Aquacomputer this time due to the amazing aesthetics of the Kryographics + active XCS backplate. I am currently debating between a 1070 & 1080 FE + the 1080ti tomorrow.


----------



## rfarmer

Quote:


> Originally Posted by *MR-e*
> 
> Thanks for the info with the EKWB option guys. I checked the EK configurator and was able to come to the same conclusion. But all my past GPU's were watercooled with EKWB. I want to try Aquacomputer this time due to the amazing aesthetics of the Kyrographics + Active XCS Backplate. I am currently debating between 1070 & 1080 FE + 1080ti tomorrow.




It's a good looking block.


----------



## RyanRazer

Quote:


> Originally Posted by *KedarWolf*
> 
> MSI Gaming Z 1070. 10 phase power design.
> 
> As with other Pascal based cards we've reviewed, the PWM control for the GPU side is the UPI semiconductor up9511. In this configuration, MSI is using the full 8-phase capability of the uP9511.
> 
> A pair of Sinopower SM7320 Dual Channel MOSFETS in conjunction with eight ON Semiconductor 4C86N PowerPhase MOSFETS make up the overall 10-phase power design of the MSI Geforce GTX 1070 Gaming Z. Output filtering uses 10 0.20uH Super Ferrite Choke (SFC) inductors and 10 820uF solid capacitors.
> 
> Memory VRM is controlled by the UPI Semiconductor uP1641. The Sinopower SM7320 Dual Channel MOSFETS help make up the 2-phase memory power delivery. The uP1641 includes two integrates MOSFET drivers (hence no external drivers for the memory MOSFETS) and is capable of two phase. The 10 phase power of this card is 8+2 where 8 phases is used for the GPU and 2 phases are for the memory.


I only understood "two, gpu, 10, used, memory, for, the and uses".


----------



## khanmein

Quote:


> Originally Posted by *KedarWolf*
> 
> MSI Gaming Z 1070. 10 phase power design.
> 
> As with other Pascal based cards we've reviewed, the PWM control for the GPU side is the UPI semiconductor up9511. In this configuration, MSI is using the full 8-phase capability of the uP9511.
> 
> A pair of Sinopower SM7320 Dual Channel MOSFETS in conjunction with eight ON Semiconductor 4C86N PowerPhase MOSFETS make up the overall 10-phase power design of the MSI Geforce GTX 1070 Gaming Z. Output filtering uses 10 0.20uH Super Ferrite Choke (SFC) inductors and 10 820uF solid capacitors.
> 
> Memory VRM is controlled by the UPI Semiconductor uP1641. The Sinopower SM7320 Dual Channel MOSFETS help make up the 2-phase memory power delivery. The uP1641 includes two integrates MOSFET drivers (hence no external drivers for the memory MOSFETS) and is capable of two phase. The 10 phase power of this card is 8+2 where 8 phases is used for the GPU and 2 phases are for the memory.


what he said might be related to vbios power.


----------



## gtbtk

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mr-Dark*
> 
> Evga air cooler isn't good.. MSI is way better..
> 
> the power limit on the SC and Hybrid is around 190W when you max the power slider while FTW go to 215W..
> 
> MSI power limit is 315W...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i have flashed to 300w max zotac amp extreme bios now.
> ftw 215w is weaker than the galax ex which has 250w with 5+2phases only.
> 
> msi one has some issue to the power, but overall it is a good card, just overpriced.
> 
> 
> MSI Gaming Z 1070. 10 phase power design.
> 
> As with other Pascal based cards we've reviewed, the PWM control for the GPU side is the UPI semiconductor up9511. In this configuration, MSI is using the full 8-phase capability of the uP9511.
> 
> A pair of Sinopower SM7320 Dual Channel MOSFETS in conjunction with eight ON Semiconductor 4C86N PowerPhase MOSFETS make up the overall 10-phase power design of the MSI Geforce GTX 1070 Gaming Z. Output filtering uses 10 0.20uH Super Ferrite Choke (SFC) inductors and 10 820uF solid capacitors.
> 
> Memory VRM is controlled by the UPI Semiconductor uP1641. The Sinopower SM7320 Dual Channel MOSFETS help make up the 2-phase memory power delivery. The uP1641 includes two integrates MOSFET drivers (hence no external drivers for the memory MOSFETS) and is capable of two phase. The 10 phase power of this card is 8+2 where 8 phases is used for the GPU and 2 phases are for the memory.

there is something strange going on with MSI bioses. The bios specs say the max is 291W and nvidia-smi reports 289W, but the most I have ever been able to pull from my card with MSI bioses installed is about 230W (about 107% reported power limit) before it started to power-limit throttle and drop the clocks back. It does not keep pulling power up to the published limit. It is only an issue at 4K though.

I have not worked out what they are doing exactly, but it looks like they have done something to fool the GPU logic to keep power-limit throttling at bay in 1080p and 1440p loads. Something similar to, but not the same as, what the power mod does when you short the shunt resistors.

The Zotac extreme bios allowed me to pull 300W, but I did not see any performance increase come with the extra power draw, so it leads me to believe that MORE voltage and MORE power are not necessarily the solution to faster framerates. I have gotten to the point that my highest average Firestrike graphics scores of 21000+ are coming from settings at 1.063V.
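A quick way to see this for yourself is to log power draw while benchmarking and compare it against both limits. A minimal parsing sketch, assuming nvidia-smi's `power.draw` CSV field format; the 291 W and ~230 W figures are the ones quoted above, not official specs:

```python
# Watch for power-limit throttling by parsing the kind of field nvidia-smi
# prints for `--query-gpu=power.draw --format=csv,noheader` (e.g. "228.41 W").
BIOS_LIMIT_W = 291.0        # what the bios specs claim (per the post above)
OBSERVED_CEILING_W = 230.0  # where the card actually starts to throttle

def parse_power_draw(csv_field: str) -> float:
    """Parse a power.draw CSV field like '228.41 W' into watts."""
    return float(csv_field.strip().split()[0])

def throttle_headroom(draw_w: float) -> float:
    """Watts left before the observed throttle point is reached."""
    return OBSERVED_CEILING_W - draw_w

draw = parse_power_draw("228.41 W")
print(f"{draw:.1f} W drawn, {throttle_headroom(draw):.2f} W headroom, "
      f"{draw / BIOS_LIMIT_W:.0%} of the published limit")
```

Running this alongside a 4K load makes the gap obvious: the reported percentage sits well under 100% of the published limit even as clocks start dropping.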


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> there is something strange going on with MSI bioses. The bios specs say max is 291W and Nvidia-smi reports 289W but, the most I have ever been able to pull from my card with MSI bioses installed is about 230w (about 107% reported power limit) before it started to power limit throttle and drop the clocks back. It does not keep pulling power up to the published limit. It is only an issue at 4K Though.
> 
> I have not worked out what they are doing exactly but it looks like they have done something to fool the GPU logic to keep power limit throttling at bay in 1080 and 1440p loads. Something similar to but not the same as what the power mod does when you short the shunt resistors.
> 
> The Zotac extreme bios allowed me to pull 300W but, I did not see any performance increase come with the extra power draw so it leads me to believe that MORE Voltage and MORE power are not necessarily the solution to faster framerates. I have gotten to the point that my highest average Firestrike scores of 21000+ graphics scores are coming from settings at 1.063V.


more voltage, like the evga classified, which can be raised to 1.2v. it pushes the gpu to 2.3ghz easily.
i guess 1.3v with good air cooling (definitely not the evga stock cooler), an aio, or water cooling could reach 2.4~2.5ghz.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> there is something strange going on with MSI bioses. The bios specs say max is 291W and Nvidia-smi reports 289W but, the most I have ever been able to pull from my card with MSI bioses installed is about 230w (about 107% reported power limit) before it started to power limit throttle and drop the clocks back. It does not keep pulling power up to the published limit. It is only an issue at 4K Though.
> 
> I have not worked out what they are doing exactly but it looks like they have done something to fool the GPU logic to keep power limit throttling at bay in 1080 and 1440p loads. Something similar to but not the same as what the power mod does when you short the shunt resistors.
> 
> The Zotac extreme bios allowed me to pull 300W but, I did not see any performance increase come with the extra power draw so it leads me to believe that MORE Voltage and MORE power are not necessarily the solution to faster framerates. I have gotten to the point that my highest average Firestrike scores of 21000+ graphics scores are coming from settings at 1.063V.
> 
> 
> 
> more voltage, like evga classified that can be raised to 1.2v. it pushes the gpu to 2.3g easily.
> i guess 1.3v with good air cooling (definitely not the evga stock one) or aio or water cooling, can reach 2.4g~2.5g.

but does 2.3G actually give you better frame rates? I am getting better frame rates at 2088/2076 than i am at 2126 and I can run at both


----------



## Darkermanz099

hi all, i just plasti dipped my Gigabyte gtx 1070 g1 and i am going to mount it horizontally in my nzxt s340 elite case.

i was wondering how i could make the fans illuminate, or light up the card's shroud with rgb lighting.
anybody here got any experience with that, or any suggestions on how to do it or what to use?


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> but does 2.3G actually give you better frame rates? I am getting better frame rates at 2088/2076 than i am at 2126 and I can run at both


i think the nvidia driver only treats traditional offset oc as "real overclocking". playing with the voltage curve seems to be an ineffective cheat...
the one with the zotac amp extreme above got insanely high scores... at only 2164mhz...


----------



## KedarWolf

I'm buying two MSI Gaming Z 1070s. NewEgg says I can send one back if I don't open the box. I want to make sure they both have Samsung or both have Micron memory, and NewEgg can't guarantee they are from the same batch number.

I know they'll still work in SLI with different memory types, but it's preferable if they are the same.

Question is, how can I check the memory type by serial or batch number without opening the box? Maybe if I call MSI support?


----------



## TheBoom

Quote:


> Originally Posted by *RyanRazer*
> 
> I only understood "two, gpu, 10, used, memory, for, the and uses".


How about we've, pair, cards, is, and i can go on lol.


----------



## ColdDeckEd

Quote:


> Originally Posted by *Darkermanz099*
> 
> hi all, i just plasti dipt my Gigabyte gtx 1070 g1 and i am gone mount it horizontalie in my nzxt s340 elite case.
> 
> i was wondering how i could make the fans eluminate or make it in the case of the card light up with rgb lighting,
> anybody here got any experiance with that, or any sugestions how to or what to use?


You will have to use the Gigabyte Gaming Xtreme app to control the LEDs.


----------



## Darkermanz099

Quote:


> Originally Posted by *ColdDeckEd*
> 
> You will have to use the Gigabyte Gaming Xtreme app to control the LEDs.


i know that i can use that to change the rgb lighting that's inside the gigabyte sign and the fan stop sign.
but i want to add more rgb lighting myself inside the housing of the fans; that is what my question was about.
as in, what type of rgb leds would you guys recommend?


----------



## lulavc

Hi guys. I'm gonna buy a 1070 this week and i would like to know which ones are currently giving the best OC records.

What should I look for?

Sent from my GT-N7100 using Tapatalk


----------



## KedarWolf

Quote:


> Originally Posted by *lulavc*
> 
> Hi guys. Im Gonna buy one 1070 this week and i would like to know the ones that are currently giving the best OC records.
> 
> What should I look for?
> 
> Enviado de meu GT-N7100 usando Tapatalk


https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1070-amp-extreme

Peeps have good results with these, but they take three card slots; if you're going to get a second one at some point for SLI, you'd be better off getting a two-slot MSI Gaming X or Gaming Z.









https://msi.com/Graphics-card/GeForce-GTX-1070-GAMING-Z-8G.html#hero-overview


----------



## lulavc

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lulavc*
> 
> Hi guys. Im Gonna buy one 1070 this week and i would like to know the ones that are currently giving the best OC records.
> 
> What should I look for?
> 
> Enviado de meu GT-N7100 usando Tapatalk
> 
> 
> 
> https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1070-amp-extreme
> 
> Peeps have good results with these but if you're going to get a second one at one point for SLI they are three card slots, you'd be better off getting a two slot MSI Gaming X or Gaming Z.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://msi.com/Graphics-card/GeForce-GTX-1070-GAMING-Z-8G.html#hero-overview

The AMP Extreme is the only one unavailable where I'm gonna buy.

Is there any other?

I'm not gonna do SLI.

Sent from my GT-N7100 using Tapatalk


----------



## KedarWolf

Quote:


> Originally Posted by *lulavc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lulavc*
> 
> Hi guys. Im Gonna buy one 1070 this week and i would like to know the ones that are currently giving the best OC records.
> 
> What should I look for?
> 
> Enviado de meu GT-N7100 usando Tapatalk
> 
> 
> 
> https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1070-amp-extreme
> 
> Peeps have good results with these but if you're going to get a second one at one point for SLI they are three card slots, you'd be better off getting a two slot MSI Gaming X or Gaming Z.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://msi.com/Graphics-card/GeForce-GTX-1070-GAMING-Z-8G.html#hero-overview
> 
> 
> The AMP extreme is the only one unavailable where I Gonna buy.
> 
> There is any other?
> 
> In not gonna do SLI.
> 
> Enviado de meu GT-N7100 usando Tapatalk

The MSI Gaming Z is a really nice card, or even the Gaming X.


----------



## lulavc

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lulavc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lulavc*
> 
> Hi guys. Im Gonna buy one 1070 this week and i would like to know the ones that are currently giving the best OC records.
> 
> What should I look for?
> 
> Enviado de meu GT-N7100 usando Tapatalk
> 
> 
> 
> https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1070-amp-extreme
> 
> Peeps have good results with these but if you're going to get a second one at one point for SLI they are three card slots, you'd be better off getting a two slot MSI Gaming X or Gaming Z.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://msi.com/Graphics-card/GeForce-GTX-1070-GAMING-Z-8G.html#hero-overview
> 
> 
> The AMP extreme is the only one unavailable where I Gonna buy.
> 
> There is any other?
> 
> In not gonna do SLI.
> 
> Enviado de meu GT-N7100 usando Tapatalk
> 
> 
> Gaming Z a really nice card, or even a Gaming X, MSI.

How about this one?

Zotac Geforce GTX 1070 AMP! Edition 8GB GDDR5 256-bit, ZT-P10700C-10P

Sent from my GT-N7100 using Tapatalk


----------



## RyanRazer

Quote:


> Originally Posted by *TheBoom*
> 
> How about we've, pair, cards, is, and i can go on lol.


Some do ring a bell, yes.


----------



## gtbtk

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm buying two MSI Gaming Z 1070s. NewEgg says I can send one back if I don't open the box. I want to make sure they are both Samsung or Micron. NewEgg can't make sure they are both from the same batch number.
> 
> I know they'll still work in SLI with different memory types but is preferable if they are the same.
> 
> Question is, how can I check the memory type by serial or batch number without opening the box? Maybe if I call MSI support?


The easiest way to check if they have the same memory is to look at the serial numbers of both cards. You will have serial numbers similar to 602-V330-43SBxxxxxxxxx. Yours won't be 43SB because that was from the June 2016 batch of Micron cards, but the format should be similar. If the first part of the 3rd section (the 43SB part) is the same, then the cards both have the same memory type.

As the cards are manufactured in batches that all use the same hardware components, the odds are pretty high that you will get two cards from the same batch.
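That comparison can be sketched in a few lines. The 602-V330-43SBxxxxxxxxx layout is an assumption taken from the example above; real MSI serials may differ:

```python
# Compare the batch code (first four characters of the 3rd dash-separated
# section) of two MSI-style serial numbers. The serial format here is an
# assumption based on the 602-V330-43SBxxxxxxxxx example above.
def batch_code(serial: str) -> str:
    """Return the batch portion of an MSI-style serial number."""
    sections = serial.split("-")
    if len(sections) < 3:
        raise ValueError(f"unexpected serial format: {serial!r}")
    return sections[2][:4]

def same_memory_batch(a: str, b: str) -> bool:
    """True if both cards appear to come from the same batch."""
    return batch_code(a) == batch_code(b)

print(same_memory_batch("602-V330-43SB000000001",
                        "602-V330-43SB000000002"))  # True
print(same_memory_batch("602-V330-43SB000000001",
                        "602-V330-51SM000000007"))  # False
```

Since the serial is printed on the outer box, you can run this check without breaking the seal.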


----------



## gtbtk

Quote:


> Originally Posted by *lulavc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lulavc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lulavc*
> 
> Hi guys. Im Gonna buy one 1070 this week and i would like to know the ones that are currently giving the best OC records.
> 
> What should I look for?
> 
> Enviado de meu GT-N7100 usando Tapatalk
> 
> 
> 
> https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1070-amp-extreme
> 
> Peeps have good results with these but if you're going to get a second one at one point for SLI they are three card slots, you'd be better off getting a two slot MSI Gaming X or Gaming Z.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://msi.com/Graphics-card/GeForce-GTX-1070-GAMING-Z-8G.html#hero-overview
> 
> 
> The AMP extreme is the only one unavailable where I Gonna buy.
> 
> There is any other?
> 
> In not gonna do SLI.
> 
> Enviado de meu GT-N7100 usando Tapatalk
> 
> 
> Gaming Z a really nice card, or even a Gaming X, MSI.
> 
> 
> Howard about this one?
> 
> Zotac Geforce GTX 1070 AMP! Edition 8GB GDDR5 256Bit, ZT-P10700C-10P
> 
> Enviado de meu GT-N7100 usando Tapatalk

If you take a look at the 3dmark website and do a search in the results section for Firestrike and GTX 1070, you will need to click through about the first 5 pages of SLI results until you get to the single-card benchmarks; there you can see which model cards were used to make the highest scores. It doesn't tell you if the cards were water or air cooled though.

I just had a look at the top of the leaderboards; it seems to have a mix of pretty much all the different brands. The reality is that as long as you can keep the card relatively cool, all the 1070 cards will perform about the same.

I really like my MSI Gaming X with the Z bios installed. It has enough power/VRM that you can run just about any 1070 bios you want to with it. The fans are really quiet and the cooler is effective. With the MSI bios under 1080 and 1440 loads, I have never hit the power limit. I can pull 21000 graphics scores in Firestrike with an ancient i7-2600 and it would go even faster under water.

Evga cards hit power limits easily and can be frustrating.


----------



## KedarWolf

Next question.

Two 1070s when they drop in price, or one 1080 Ti.


----------



## rfarmer

Quote:


> Originally Posted by *KedarWolf*
> 
> Next question.
> 
> Two 1070s when they drop in price, or one 1080 Ti.


One 1080 Ti. There will always be games that are not SLI compatible. You will get better performance from a single GPU.


----------



## AuraNova

Quote:


> Originally Posted by *KedarWolf*
> 
> Next question.
> 
> Two 1070s when they drop in price, or one 1080 Ti.


With the pricing of the 1080Ti, you're much better off selling your 1070 and getting a 1080Ti.


----------



## madweazl

Quote:


> Originally Posted by *gtbtk*
> 
> If you take a look at the 3dmark website and do a search in the results section for firestrike and GTX 1070, you will need to click through about the first 5 pages ...


You can also filter by number of GPUs in the drop-down menu so you don't need to click through the pages.

Looks like I've been bested in Firestrike and Time Spy; got some work to do when I get back in town


----------



## gtbtk

Quote:


> Originally Posted by *madweazl*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> If you take a look at the 3dmark website and do a search in the results section for firestrike and GTX 1070, you will need to click through about the first 5 pages ...
> 
> 
> 
> You can also filter by number of GPUs in the drop down menu so you dont need to click through the pages.
> 
> Looks like I've been bested in Firestrike and Time Spy; got some work to do when I get back in town

the search filter there is not particularly good. The number-of-GPUs filter only works on the search results that are displayed on the page. If you search 1070, you only get 2x SLI results unless you go to the advanced tab and limit the maximum result to about 21000. If not, you need to page through to page 5 or 6 before you get any single-card results


----------



## madweazl

Quote:


> Originally Posted by *gtbtk*
> 
> the search filter there is not particularly good. The number of GPU filter only works on the search results that are displayed on the page. if you search 1070, you only get 2x sli results unless you go to advanced tab and limit the maximum result to about 21000. If not you need to page through to page 5 or 6 before you get and single results


I don't change the number of results (advanced search by GPU) and when I select 1 GPU, I get results with only 1 GPU. Not sure what is going on with your results, but it works properly for me unless I hit the back button; then I have to select 1 GPU again.


----------



## asdkj1740

samsung vram master race, vram clock as high as possible


----------



## asdkj1740

it is said that the 1070 msrp has dropped to 350usd now, so don't buy a 1070 right now; wait for the price adjustment.


----------



## khanmein

Quote:


> Originally Posted by *asdkj1740*
> 
> 
> 
> 
> 
> 
> samsung vram master race, vram clock as high as possible


Are you MindBlank Tech?


----------



## asdkj1740

Quote:


> Originally Posted by *khanmein*
> 
> are u MindBlank Tech?


No, but I like his videos; they're much better than traditional ones.


----------



## khanmein

Quote:


> Originally Posted by *asdkj1740*
> 
> no, but i like his videos, much more better than traditional ones.


Actually I like his content, but his voice really annoys me. No offense.


----------



## asdkj1740

Quote:


> Originally Posted by *khanmein*
> 
> actually i like his content but his voice really annoyed me. no offense.


lol, I thought it was my bad English. I have to enable the English subtitles to watch his videos lol.


----------



## asdkj1740

The 1080 Ti is the best 80 Ti ever... quite appealing.


----------



## khanmein

Quote:


> Originally Posted by *asdkj1740*
> 
> lol i though this is my bad eng problem. i have to enable the eng sub to see his video lol.


His accent is weird to me, but no doubt he speaks more fluently than I do. Then again, I have an Asian accent too.

The video doesn't show the RX 480 GPU temp because he's using an AIO cooler???


----------



## khanmein

Quote:


> Originally Posted by *asdkj1740*
> 
> the 1080ti is the best 80ti ever...quiet appealing


Yeah, the CUDA core count remains the same as the Titan XP, but the 352-bit bus and 88 ROPs are pretty weird.

Previously they reduced the CUDA cores but kept the bus width and ROPs the same.

This sounds like VEGA might be forced to sell at around USD 650 in Q2 2017.

FYI, it's 11 GB of GDDR5X at 11 Gbps (an OC can surely hit 12 Gbps).

Great timing against RYZEN too, and HBM2 is expensive.


----------



## ucode

Quote:


> Originally Posted by *gtbtk*
> 
> If you take a look at the 3dmark website and do a search in the results section for firestrike and GTX 1070, you will need to click through about the first 5 pages of SLI results until you get to the single card benchmarks, you can see what model cards were used to make the highest scores. It doesn't tell you if the cards were water or air cooled though.


It doesn't tell you if the card was cross-flashed either, unless noted by the bencher, so some results may be misleading.

@madweazl I get the same problem as gtbtk when using the search filter: when selecting search by GPU and then showing 1 GPU only, the search engine doesn't seem intelligent enough to skip the other results; it only looks at the first 1000. If there aren't any in that first 1000, it asks to check the next 1000, and so on until records with just 1 GPU are located in that block of 1000.


----------



## asdkj1740

Quote:


> Originally Posted by *khanmein*
> 
> his slang is weird to me but no doubt he spoke very fluent than me. hence, i also have asian slang too.
> 
> the vid don't show the RX 480 GPU temp cos he's using AIO cooler???


Yes, this dude dares to play with a modded BIOS. But he said some throttling is still there; that's probably because the modded BIOS has only a 225 W power limit, and a 480 can eat more than 300 W.

His card is even faster than "ArT1S's", and his result also matches ArT1S's finding that a 480 at 1460/2250 equals a 970 at 1550/2000.

https://www.youtube.com/user/ArT1S


----------



## khanmein

Quote:


> Originally Posted by *asdkj1740*
> 
> yes, this dude dares to play with modded bios. but he said some throttlings are still there, its probably because the modded bios has only 225w power. 480 is capable to eat more than 300w.
> 
> his card is even faster than "artis" one, and the result also matches with "artis" which is 1460/2250 480 = 1550/2000 970.
> 
> https://www.youtube.com/user/ArT1S


Weird that his PSU is just 550 W, from Super Flower, one of my favorites. I personally don't like to OC too extremely.

A mild OC is more than enough, because I'm sensitive to noise.


----------



## asdkj1740

Quote:


> Originally Posted by *khanmein*
> 
> weird his PSU just 550W from SuperFlower one of my fav. i personally don't like to OC too extreme.
> 
> mild OC is more than enough cos i'm sensitive with noise.


Even if his 480 draws 300 W, 550 W is still enough to feed the system.


----------



## DeathAngel74

Quote:


> Originally Posted by *khanmein*
> 
> his slang is weird to me but no doubt he spoke very fluent than me. hence, i also have asian slang too.
> 
> the vid don't show the RX 480 GPU temp cos he's using AIO cooler???


heehee! What you mean Bruddah? funny combination is japanese/hawaiian slang. Sometimes on vacation back home, funny stuff comes out. Sometimes my wife and I end up speaking pidgin(hawaiian slang).


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> heehee! What you mean Bruddah? funny combination is japanese/hawaiian slang. Some times on vacation back home, funny stuff comes out. Sometimes my wife and I end up speaking pidgin(hawaiian slang).


ROFL, now the GTX 1080/1070 have dropped in price to $449/$349.


----------



## kevindd992002

Quote:


> Originally Posted by *asdkj1740*
> 
> it is said that 1070 msrp has dropped to 350usd now. so dont buy 1070 right now and wait for the price adjustment.


Quote:


> Originally Posted by *khanmein*
> 
> ROFL now GTX 1080/1070 dropped price to $449/$349.


Where do you see that the 1070 dropped in price as well?


----------



## pez

The FE is $399 on the NVIDIA site--however, they mentioned they were getting rid of the FE 'premium pricing' scheme, so I don't think this will affect AIB partners too much. It just means anyone who wants an FE for easy block compatibility will now pay less.


----------



## khanmein

Quote:


> Originally Posted by *kevindd992002*
> 
> Where do you see that the 1070 dropped price also?


https://www.techpowerup.com/231114/nvidia-cuts-price-of-its-geforce-gtx-1080-graphics-card-usd-499

Don't blame me; blame one of your favorite tech review sites, TechPowerUp.


----------



## kevindd992002

Quote:


> Originally Posted by *khanmein*
> 
> https://www.techpowerup.com/231114/nvidia-cuts-price-of-its-geforce-gtx-1080-graphics-card-usd-499
> 
> don't blame me & blame one of your fav tech review side techpowerup.


Thanks.

Also, I wasn't blaming anyone. I just wanted to read the article myself.


----------



## gtbtk

Quote:



> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *khanmein*
> 
> actually i like his content but his voice really annoyed me. no offense.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> lol i though this is my bad eng problem. i have to enable the eng sub to see his video lol.
Click to expand...

He has an Eastern European/Russian accent; that is why he sounds a bit strange.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *asdkj1740*
> 
> the 1080ti is the best 80ti ever...quiet appealing
> 
> 
> 
> yeah cuda cores remained the same with Titan XP but 352-bit bus & 88 ROPs are pretty weird.
> 
> previously the reduced cuda cores but the bit bus & ROPs remained the same.
> 
> this sound like VEGA might force to sell around USD 650 in Q2 2017.
> 
> FYI, 11 GB GDDR5X (11Gbps OC sure can hit 12Gbps)
> 
> great timing with RYZEN too & HBM2 is expensive.
Click to expand...

The reduced ROPs will have some impact on performance, but I am guessing it will be offset by the increased clock speeds compared to the Titan. The bus width is determined by the number of memory chips: each chip has a fixed 32-bit bus, and multiplying that by 11 chips for 11 GB of VRAM gives you 352 bits. I suspect the Ti will outperform the Titan cards. I feel sorry for Titan XP owners; the resale value of their current-model $1200 cards has just been cut by more than 50%.

Micron has released 11 Gbps GDDR5X SKUs now. I wonder if that has been achieved with different manufacturing techniques, or if they have just set the clock speeds higher than the 10 Gbps SKUs using the same but higher-binned chips. If it is basically overclocked 10 Gbps VRAM, the overclocking headroom may not be as good as the 1080's. I know that the 1080 does not really gain any more performance once memory speed gets past 11 Gbps.
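
The bus-width arithmetic above can be sanity-checked with a quick sketch (the 32-bit-per-chip figure is from the post; the chip counts per card are illustrative assumptions, not a spec sheet):

```python
# GDDR5/GDDR5X chips each expose a fixed 32-bit interface, so the total
# bus width is just 32 bits times the number of memory chips on the card.
BITS_PER_CHIP = 32

def bus_width(num_chips: int) -> int:
    """Total memory bus width in bits for a card with `num_chips` chips."""
    return BITS_PER_CHIP * num_chips

print(bus_width(11))  # 11 chips (1080 Ti style): 352-bit bus
print(bus_width(8))   # 8 chips (1070/1080 style): 256-bit bus
```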


----------



## gtbtk

Quote:


> Originally Posted by *ucode*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> If you take a look at the 3dmark website and do a search in the results section for firestrike and GTX 1070, you will need to click through about the first 5 pages of SLI results until you get to the single card benchmarks, you can see what model cards were used to make the highest scores. It doesn't tell you if the cards were water or air cooled though.
> 
> 
> 
> Doesn't tell if cross flashed either unless noted by bencher so some results may be misleading.
> 
> @madweazl I get the same problem as gtbtk when using the search filter in that when selecting search GPU only and then use show 1 GPU only, the search engine doesn't seem intelligent enough to skip all other results but only looks at the first 1000. If there aren't any in that first 1000 then it asks to check the next 1000 and so on until a records with just 1 GPU are located in that 1000 block.
Click to expand...

That is true; however, the point I was trying to make was that the brand really doesn't matter that much. No single brand/model dominated the others. The big differences come from better temperature control through better cooling, be it water or the best air coolers.


----------



## kevindd992002

The article says "The GTX 1070 is unflinched for now, from its $349 baseline pricing." I'm not sure how much the 1070 went for at launch, but that statement doesn't seem to suggest that it had a price drop, does it?


----------



## rfarmer

Quote:


> Originally Posted by *kevindd992002*
> 
> The article says "The GTX 1070 is unflinched for now, from its $349 baseline pricing." I'm not sure how much did the 1070 go for at launch but that statement doesn't seem to suggest that it had a price drop, does it?


When released, the FE was $450 and other 1070s were supposed to start at $380, but it was several months before you saw any of them under $400. Most of the vendor cards were in the low $400s at launch.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> The reduced Rops will have some impact on performance but I am guessing that it will be offset by the increased clock speeds compared to the titan. The bus width is determined by the number of memory chips. Each Chip has a fixrd 32-bit bus. Times that by 11 chips for 11GB Vram and you get 352bits. I suspect that the TI will out perform Titan cards. I feel sorry for titan XP owners, the resale value of their current model $1200 cards has just been cut by more than 50%
> 
> Micron has released 11GB GDDR5X skus now. I wonder if that has been achieved by using different manufacturing techniques or if they have just set the clock speeds higher than the 10GB skus using the same but higher binned chips? If it is just basically overclocked 10GB vram, the overclocking overhead may not actually be as good as the 1080. I know that the 1080 does not really get any more improvements in performance once bandwidth gets past 11GB


Do the new GTX 1080 (11 Gbps) and GTX 1060 (9 Gbps) mean no more Samsung VRAM?

The GTX 1070 is being totally ignored.

The Titan XP still has its own value, and the guys who bought a Titan don't care about the money.

Micron has now confirmed some issues with the GDDR5X on the GTX 1080 and Titan XP.

Hence, Micron managed to push an extra 1 Gbps and fixed some bugs/stability issues.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The reduced Rops will have some impact on performance but I am guessing that it will be offset by the increased clock speeds compared to the titan. The bus width is determined by the number of memory chips. Each Chip has a fixrd 32-bit bus. Times that by 11 chips for 11GB Vram and you get 352bits. I suspect that the TI will out perform Titan cards. I feel sorry for titan XP owners, the resale value of their current model $1200 cards has just been cut by more than 50%
> 
> Micron has released 11GB GDDR5X skus now. I wonder if that has been achieved by using different manufacturing techniques or if they have just set the clock speeds higher than the 10GB skus using the same but higher binned chips? If it is just basically overclocked 10GB vram, the overclocking overhead may not actually be as good as the 1080. I know that the 1080 does not really get any more improvements in performance once bandwidth gets past 11GB
> 
> 
> 
> new GTX 1080 (11 Gbps) & GTX 1060 (9 Gbps) mean no more Samsung VRAM?
> 
> GTX 1070 like totally ignored.
> 
> Titan XP still have it own value & those guys who bought Titan don't care about the money.
> 
> now Micron confirm have some issue with GDDR5X on GTX 1080 & Titan XP.
> 
> hence, Micron manage to push extra 1 Gbps & fixed some bugs/stability.
Click to expand...

The 1080 never had Samsung RAM.

The 1060 was being beaten by the RX 480 after AMD's driver improvements, so I am not surprised they are trying to get in front again. I'm not sure how they will get 9 Gbps GDDR5 unless they just overclock the 8 Gbps memory.

The 1070 doesn't have any competition, so I guess NVIDIA doesn't think investing resources in upgrading it gives them any benefit.

I can't imagine why anyone would buy a Titan now unless the price was less than a Ti. The guys who bought Titans wanted the fastest card. If they didn't care about money, they would not be rich enough to buy a Titan card in the first place; you don't get rich by throwing money away.

What confirmation has Micron given about GDDR5X problems? I have not heard anything. The only thing I have seen is that they have released 11 Gbps SKUs, which is a natural development on the way to the 16 Gbps maximum they are ultimately aiming for in about five years. We will probably see 12 Gbps GDDR5X memory next year.


----------



## gtbtk

Quote:


> Originally Posted by *exzacklyright*
> 
> IDK why but I get a lot more artifacts on this scene than any other in the heaven benchmark
> 
> 
> 
> I was only able to get to 190MHz with no artifacts.
> 
> As for memory.. I'm not sure how to figure out how high to go.
> 
> One review said this, "Our GTX 1080's GDDR5X memory also took a final offset +400MHz to achieve a 5400MHz final stable memory clock. We noticed that there is a memory hole with instability around +500MHz offset, and that we could then push it even higher to +600MHz, but we lost some performance compared with a +400MHz offset."
> 
> I'm not sure how they found it it had a memory hole? Or that they lost performance? I'm guessing FPS?
> 
> HARDOCP: http://www.hardocp.com/article/2016/06/13/geforce_gtx_1070_founders_edition_overclocking_review#.V2C3HfkrIuU they got to +230 with no issues somehow


That is a 1080, not a 1070; they have different memory types. I think HardOCP is the only person who found that card behavior as well.

For the memory OC you can try starting at +500 MHz. If it crashes or shows funky patterns or flashing lights on the screen, you have gone too far; reduce it by 25 MHz and try again. If it passes, increase by 25 MHz and test again.
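
The stepping procedure described above amounts to a simple search loop. A minimal sketch, where the hypothetical `is_stable(offset)` stands in for applying the offset and running Heaven or 3DMark while watching for crashes and artifacts:

```python
def find_max_memory_offset(is_stable, start=500, step=25, floor=0, ceiling=1000):
    """Find the highest stable memory offset (MHz) by stepping in `step`
    increments. `is_stable(offset)` is a hypothetical stand-in for applying
    the offset and running a benchmark pass without crashes or artifacts."""
    offset = start
    # Back off until the current offset passes.
    while offset > floor and not is_stable(offset):
        offset -= step
    # Push upward while the next step still passes.
    while offset + step <= ceiling and is_stable(offset + step):
        offset += step
    return offset

# Example: pretend the card happens to be stable up to +575 MHz.
print(find_max_memory_offset(lambda o: o <= 575))  # 575
```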


----------



## spddmn24

Put my Quicksilver 1070 under water yesterday. Max load temps dropped from ~68°C to 40°C, usually in the 30s while gaming. Preliminary boost clock is 2163 MHz, RAM is at 8950 MHz. It boosted to 2101-2113 and 8800 on air, so small gains, but gains nonetheless. The loop is a 7700K and GTX 1070 on a single EK CoolStream XE360 radiator. The max coolant temp I've seen so far is 32°C at 26-27°C ambient.

http://www.3dmark.com/fs/11848879

http://www.3dmark.com/spy/1292083


----------



## gtbtk

Quote:


> Originally Posted by *spddmn24*
> 
> Put my quicksilver 1070 under water yesterday. Max load temps dropped from ~68 to 40, usually in the 30's gaming. Prelim boost clock is 2163, ram is 8950mhz. It boosted to 2101-2113 and 8800 on air so small gains, but gains none the less. Loop is a 7700k and gtx 1070 on a single ek coolstream xe360 radiator. Max coolant temp I saw so far was 32c in 26-27 ambient.
> 
> http://www.3dmark.com/fs/11848879
> 
> http://www.3dmark.com/spy/1292083


That FS score is about what I can manage on air with my Gaming X running the Gaming Z BIOS (graphics score; your physics scores are obviously much better than mine). You will get better scores if you can increase the memory overclock, even if you have to sacrifice a little core clock frequency.

These two results were obtained at 1.063 V with a max core clock of 2088 MHz and memory at about 9330 MHz on the Firestrike run. Try as I might, I can't get a successful Timespy run with memory faster than 9000 MHz. The results are generally better than what I can do at 1.093 V, 2126 MHz, 9280 MHz.

http://www.3dmark.com/fs/11822144

http://www.3dmark.com/spy/1262929

At least on my rig, the higher-voltage runs seem to take resources away from the CPU, and physics scores usually drop a couple of hundred points. Not sure if that also applies to Kaby Lake rigs.

If you have hit a ceiling with VRAM clocks, you may want to try increasing VCCIO voltage slightly to fortify your PCIe controller a bit more.


----------



## spddmn24

Quote:


> Originally Posted by *gtbtk*
> 
> That FS score is about what I can manage on Air with my Gaming X (Graphics scores, you physics scores are obviously much better than mine) running the Gaming Z bios. You will get better scores if you can increase the memory overclock even if you have to sacrifice a little bit of core clock frequency.
> 
> These two results were obtained at 1.063v with a max core clock of 2088 and memory clocked at about 9330Mhz on the firestrike run. Try as I might, I cant get a successful run in Timespy with memory faster than 9000Mhz The results are generally better than what I can do at 1.093V 2126Mhz, 9280Mhz .
> 
> http://www.3dmark.com/fs/11822144
> http://www.3dmark.com/spy/1262929
> 
> At least on my rig, the higher voltage runs seem to take resources away from the CPU and Physics scores usually drop a couple of hundred points. Not sure if that may also apply to Kaby Lake rigs.
> 
> If you have hit a ceiling with vRam clocks, you may like to try increasing VCCIO voltage slightly to fortify your PCIe controller a bit more


That's all my VRAM will do; +500 crashes regardless of the GPU core OC. The Timespy stress test is the best I've found for detecting VRAM instability. VCCIO is already at 1.24 V, I believe, for my RAM running at 3866 C17.


----------



## gtbtk

Quote:


> Originally Posted by *spddmn24*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> That FS score is about what I can manage on Air with my Gaming X (Graphics scores, you physics scores are obviously much better than mine) running the Gaming Z bios. You will get better scores if you can increase the memory overclock even if you have to sacrifice a little bit of core clock frequency.
> 
> These two results were obtained at 1.063v with a max core clock of 2088 and memory clocked at about 9330Mhz on the firestrike run. Try as I might, I cant get a successful run in Timespy with memory faster than 9000Mhz The results are generally better than what I can do at 1.093V 2126Mhz, 9280Mhz .
> 
> http://www.3dmark.com/fs/11822144
> http://www.3dmark.com/spy/1262929
> 
> At least on my rig, the higher voltage runs seem to take resources away from the CPU and Physics scores usually drop a couple of hundred points. Not sure if that may also apply to Kaby Lake rigs.
> 
> If you have hit a ceiling with vRam clocks, you may like to try increasing VCCIO voltage slightly to fortify your PCIe controller a bit more
> 
> 
> 
> That's all my vram will do, +500 crashes regardless of gpu core oc. Timespy stress test is the best I found for finding vram instability. VCCIO is already at 1.24v i believe for my ram running at 3866c17.
Click to expand...

VCCIO and VCCSA fortify both the on-die memory controller and the PCIe controller for the CPU PCIe lanes used by the GPU. I struggled with VRAM clocks that would start putting blue artifacts all over the screen at 2D clocks whenever the VRAM ran much higher than +500, until I discovered the correlation. That was with my Sandy Bridge chip, but the theory should hold true on Z270 as well. My RAM has been overclocked from 1600 MHz to 1927 MHz and ran fine at stock IO voltages, so I had never tried adjusting them before putting a 1070 in my rig. After a small bump in VCCIO voltage, the VRAM will now clock to just about +650 stable, and just under +700 MHz above reference before I start seeing major weirdness.

Maybe try increasing VCCIO to 1.25 V and see how it goes. You might also try a small increase in VCCSA voltage; the safe limit for that is said to be 1.3 V. Try different combinations and see how it goes. If you don't get any improvement from tuning those two voltages, you can always put them back where they are now.


----------



## KedarWolf

Welp, getting one 1080 Ti instead of two 1070s.









Waits for 1080 Ti Owner's thread.


----------



## gtbtk

Quote:


> Originally Posted by *KedarWolf*
> 
> Welp, getting one 1080 Ti instead of two 1070s.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Waits for 1080 Ti Owner's thread.


If you sell the 1070, that makes a lot of sense.


----------



## KedarWolf

Quote:


> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Welp, getting one 1080 Ti instead of two 1070s.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Waits for 1080 Ti Owner's thread.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you sell the 1070 that makes a lot of sense
Click to expand...

I don't have a 1070, I have an older Maxwell Titan X.


----------



## gtbtk

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Welp, getting one 1080 Ti instead of two 1070s.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Waits for 1080 Ti Owner's thread.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you sell the 1070 that makes a lot of sense
> 
> Click to expand...
> 
> I don't have a 1070, I have an older Maxwell Titan X.
Click to expand...

Trading the old Titan for the 1080 Ti still makes a lot of sense.


----------



## asdkj1740

I didn't apply the ICX upgrade because the cost is unreasonable.
But these past two days people have been discussing whether the ICX upgrade program can be taken a step further to the 1080 Ti (with the price difference paid, of course).


----------



## pez

My step up should fit around the Ti release, so I didn't even bother with the ICX 'upgrade'. I got the thermal pad mod from them, but haven't even applied it yet as I'm not going to waste my time if I'm going to step it up in a few weeks.


----------



## DeathAngel74

http://www.3dmark.com/compare/fs/11863110/fs/11863659#
http://www.3dmark.com/compare/fs/11863806/fs/11863659
http://www.3dmark.com/compare/fs/11864174/fs/11863806


----------



## KedarWolf

I'm on the NVIDIA website: I hit the 1080 Ti 'Pre-order Now' button, get to checkout, and it says 'Not Available'. I go back to the NVIDIA site, and 'Pre-order Now' has changed to 'Notify Me When Available' between the time I started the order and the time I got to checkout.









Anyone know how long until aftermarket cards will be available, even pre-orders?


----------



## icold

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm on the Nvidia website, hit the 1080 Ti 'Pre-order Now' button, get to checkout, says 'Not Available', go back to Nvidia website, 'Pre-order Now' has changed to 'Notify Me When Available' between the time I started the order and the time I got to check out.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone know how long until aftermarket cards will be available, even pre-orders?


I hate reference models


----------



## DeathAngel74

I'll just wait...hopefully I can step-up to 1080Ti before 90 days runs out.


----------



## pez

Since EVGA ran the ICX program specifically through the end date of the Ti announcement, I imagine we should at least see custom cards before then. Even then, I'd imagine it'll be sooner than that, or you'd be able to claim a step-up within reason, but I could be wrong.


----------



## DeathAngel74

First iCX card was a dud. Just got the 2nd one yesterday. It seems better so far. Maybe I'll tinker with SLI before I get the other back to them. Just not motivated ATM, lol.


----------



## Star Forge

Quote:


> Originally Posted by *asdkj1740*
> 
> didnt apply the icx upgrade because the cost is unreasonable.
> but these two days ppl have been discussing whether the icx upgrade problem can further step up to 1080ti (of course the price differential needed to be paid).


I did the math.

To get the 1080Ti (probably Founder's Edition) Step-Up you need to pay the following extra:

$99 for the ICX upgrade + $230 from the ICX FTW2 to the 1080Ti FE.

So +$329 dollars to Step-Up from whatever price you paid for your original card excluding shipping fees (assuming you had the 1070 FTW to start with).
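
As a quick sanity check of the math above (figures as quoted in this post; actual Step-Up pricing may differ):

```python
# Hypothetical cost figures taken from the post above.
icx_upgrade = 99           # 1070 FTW -> ICX FTW2 upgrade fee
ftw2_to_1080ti_fe = 230    # ICX FTW2 -> 1080 Ti FE Step-Up difference

total_extra = icx_upgrade + ftw2_to_1080ti_fe
print(total_extra)  # 329, excluding shipping fees
```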

Quote:


> Originally Posted by *pez*
> 
> Since EVGA ran the ICX program specifically through the end date of the Ti announce, I imagine we should at least see custom cards before then. Even then I'd imagine sooner than that or you'd be able to claim a step up within reason, but I could be wrong.


EVGA's Step-Up often only allows you to Step-Up to the Founder's Edition or a near equivalent reference edition. You won't be able to Step-Up to an SC or above grade card.


----------



## DeathAngel74

http://www.3dmark.com/fs/11831568



I'm tired of watching the FS demo, I'm done, lol.


----------



## pez

Quote:


> Originally Posted by *Star Forge*
> 
> I did the math.
> 
> To get the 1080Ti (probably Founder's Edition) Step-Up you need to pay the following extra:
> 
> $99 for the ICX upgrade + $230 from the ICX FTW2 to the 1080Ti FE.
> 
> So +$329 dollars to Step-Up from whatever price you paid for your original card excluding shipping fees (assuming you had the 1070 FTW to start with).
> 
> EVGA's Step-Up often only allows you to Step-Up to the Founder's Edition or a near equivalent reference edition. You won't be able to Step-Up to an SC or above grade card.


Hmmm, that must have changed. It used to be that you could step up to the equivalent of your card (i.e., SC to SC). I remember that being the case with the 780 to 780 Ti.

It looks like you at least have the option to step up to an ACX 3.0 for the current 1080, so I assume the same will be true for the Ti. I don't mind not getting an SC; I only got this SC because it was $5 cheaper than the non-SC at the time.


----------



## Star Forge

Quote:


> Originally Posted by *pez*
> 
> Hmmm that must have changed. It used to at least be you could step up to the equivalent of your card (I.e SC to SC). I remember that being the case with the 780 to 780Ti.
> 
> It looks like you at least have the option to step up to a ACX 3.0 for the current 1080, so I assume that much will be true for the Ti. I don't mind not getting a SC. I only got this SC because it was $5 cheaper than the non-SC at the time.


I mean, if you look at the current list of Step-Up cards on EVGA's site, none of them are SC-or-above variants. Hell, you can't even get a 1080 FE if you wanted to. So I am assuming it is only going to be base models of the 1080 Ti.


----------



## RyanRazer

Wow, they have a step-up program like that? How cool. I wish others had that.
Plus, I wonder how that works in Europe; I'll check that out.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> http://www.3dmark.com/fs/11831568
> 
> 
> 
> I'm tired of watching the FS demo, I'm done, lol.


you need to keep an eye out for a steam sale


----------



## DeathAngel74

I never have the money when there's a sale.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> I never have the money when there's a sale.


I bought my copy of 3DMark on sale on Steam for US$4.50 about 8 months ago.

I don't think I'll miss the half a beer at the pub that it cost me.


----------



## dlewbell

Too bad that Humble Bundle is over. It was available for pretty cheap a week or 2 ago.


----------



## paulclift

I've never overclocked a graphics card before.

I got +100 core and +500 memory before Unigine started acting a bit squiffy.

2075.5 MHz core
4498.2 MHz memory

Is that any good?

My score was 2606.


----------



## gtbtk

Quote:


> Originally Posted by *paulclift*
> 
> I've never overclocked a graphics card before.
> 
> I got +100 core and +500 memory before unigine started acting a bit squiffy.
> 
> 2075.5Mhz Core
> 4498.2 Mhz Memory
> 
> Is that any good?
> 
> My score was 2606.


That is middle of the pack.

Did you get to that setting by doing core and memory separately, or all in one hit?

Define squiffy: application crash, artifacts on the screen (in 2D or 3D graphics?), or psychedelic vomit all over the screen?


----------



## ColdDeckEd

Here's my highest FS score, EVGA SC with palit bios

http://www.3dmark.com/fs/11869342

I've found there's less hitting of the power limit with this BIOS, but the fans don't spin as fast? The temps are pretty much the same as with the Asus BIOS, though. I bought CLU in order to do the power-shunt mod, but I don't think I need to attempt it anymore; there's no point without additional voltage.

+25% voltage, 115% power limit, +113 core, +545 memory (Micron).

I'll try messing around with the curve when I have more time.


----------



## asdkj1740

Quote:


> Originally Posted by *ColdDeckEd*
> 
> Here's my highest FS score, EVGA SC with palit bios
> 
> http://www.3dmark.com/fs/11869342
> 
> I've found there's less hitting of powerlimit with this bios, but the fans don't spin as fast? But the temps are pretty much the same the asus bios. I bought CLU in order to do the power shunt mod, but don't think I need to attempt it anymore, there's no point without additional voltage.
> 
> +25% voltage, 115 power limit, +113 core, +545 memory (micron).
> 
> I'll try messing around with the curve when I have more time.


Which Palit BIOS? Palit has a few BIOSes with different power settings.
I flashed the 225 W Palit one to my EVGA FTW and my card seemed unable to run properly: all the reported stats were strange. FPS read at half of what it was before, but the smoothness in game was the same, and so was the power draw.

I'm really happy to hear you've found the right BIOS for your card. I tried many times with many BIOSes to find the most suitable one for my FTW.


----------



## gtbtk

Quote:


> Originally Posted by *ColdDeckEd*
> 
> Here's my highest FS score, EVGA SC with palit bios
> 
> http://www.3dmark.com/fs/11869342
> 
> I've found there's less hitting of powerlimit with this bios, but the fans don't spin as fast? But the temps are pretty much the same the asus bios. I bought CLU in order to do the power shunt mod, but don't think I need to attempt it anymore, there's no point without additional voltage.
> 
> +25% voltage, 115 power limit, +113 core, +545 memory (micron).
> 
> I'll try messing around with the curve when I have more time.


I have been telling everyone here that the EVGA bioses are configured to power limit more quickly than the other bioses for a while now, that includes the FTW card bioses as well. The MSI bioses seem the best suited to never hitting power limits unless they are under a 4K load but when they do, they start to drop clocks at about 106% even though the slider says power limit is 126% and I have not fugured out why that is yet. Not sure that the Gaming bios would be good on an SC card though due to the 6+8 power available on the msi gaming cards

What are you seeing if you push the memory to say +600? Particularly if you are running chrome or something else that uses 2d acceleration, do you get blue or cyan artifacts all over the screen? does it just crash out of the 3d application?

I have discovered with my card, that fps performance at +0Mv +100 on the voltage slider is about the same even though the core clock doesn't run at as high a frequency as it does at 1.093v. The lower voltage generates lower temps and allows clocks to stay higher longer so unless you are using non stock cooling, the power shunt is not really going to benefit you much anyway.

This is my most recently saved Firestrike score at +0 volts with memory (Micron) running at +650 over reference. I have a non-K i7-2600, so the best I can run my CPU at is 4440 MHz, and that shows in the lower physics and combined scores I can achieve, but the graphics score is just over 21000. The absolute best graphics score I have managed was about 21600, but the physics score was about 600 points lower than normal; all scores were achieved with a curve.

http://www.3dmark.com/fs/11822144
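For anyone converting between these numbers: a sketch of how an Afterburner memory offset maps to the effective (double data rate) GDDR5 clock on a 1070, assuming the stock Afterburner readout of 4004 MHz (8008 MHz effective); the function name and constant are mine, and actual cards may read slightly differently.

```python
# Sketch: convert an MSI Afterburner memory offset to the effective
# GDDR5 clock on a GTX 1070. Assumes the stock Afterburner readout of
# 4004 MHz (8008 MHz effective); real cards may differ slightly.

STOCK_AB_CLOCK = 4004  # MHz, as Afterburner shows for a stock 1070

def effective_mem_clock(offset_mhz):
    """Return the effective (double data rate) memory clock in MHz."""
    return 2 * (STOCK_AB_CLOCK + offset_mhz)

if __name__ == "__main__":
    for offset in (0, 400, 650, 746):
        print(f"+{offset} -> {effective_mem_clock(offset)} MHz effective")
```

By this arithmetic, +650 works out to 9308 MHz effective, and a 4750 MHz Afterburner readout (9500 MHz effective) corresponds to roughly +746.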


----------



## lexer

Guys, I have an EVGA GTX 1070 Founders Edition with an EKWB EK-FC1070 GTX; everything is good, temps below 45 °C at full load. The card overclocks just up to 2080 MHz, but during heavy loads it downclocks to 1980-2000 MHz, making the OC totally pointless. Is there a BIOS that is compatible with my card? Thanks


----------



## gtbtk

Quote:


> Originally Posted by *lexer*
> 
> Guys i have a EVGA GTX 1070 Founders Edition with a EKWB EK-FC1070 GTX, everything is good temps below 45°c full load. The card overclock just up to 2080Mhz but during heavy loads downclocks to 1980-2000Mhz making the O.C totally pointless. Is there a BIOS that is compatible with my card? Thanks


take a look at the asus strix OC bios


----------



## asdkj1740

https://videocardz.com/66972/asus-rog-strix-geforce-gtx-1080-ti-and-turbo-available-mid-march

The Asus Strix is crazy... probably 10 phases of IR3555 60 A power stages.
The Turbo version may share the same PCB, or just cut down two phases without changing the MOSFETs.

There should be a T4 BIOS again.


----------



## paulclift

Quote:


> Originally Posted by *gtbtk*
> 
> that is middle of the pack.
> 
> did you get to that setting by doing core and memory separately or all in one hit?
> 
> define squiffy - application crash, artifacts on the screen (in 2d or 3d graphics?) or psychodelic vomit on the screen?


Application crash.

I did memory first, then voltage, but with the memory at the highest setting I had, I got the crash.


----------



## shilka

I have a really strange problem with the GTX 1070 I just bought.

Sometimes at random, but typically when there is a big explosion in a game, the game will freeze, the screen will turn black with the message "DisplayPort no signal", and after a few seconds I get thrown to the desktop with the message that the Nvidia driver has stopped working.

It also happens when I turn off my TV (which is also hooked up to my GTX 1070) and the signal to the HDMI port is cut.
Same thing: the PC will freeze, the screen will turn black with the message "DisplayPort no signal", and after a few seconds I get thrown to the desktop with the message that the Nvidia driver has stopped working.

I have tried every GTX 1070 driver Nvidia has, I have tried underclocking the card, I have tried cleaning out all remains of Nvidia drivers, and I have tried everything short of reinstalling Windows.

Next week I am going to upgrade to Windows 10, and if I still have the problem on a brand new Windows install, should I assume the card is broken?
Anyone have any idea why the card is doing this? I can't play any games other than games without big explosions, as the game will just crash, and it's super annoying.


----------



## gtbtk

Quote:


> Originally Posted by *paulclift*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> that is middle of the pack.
> 
> did you get to that setting by doing core and memory separately or all in one hit?
> 
> define squiffy - application crash, artifacts on the screen (in 2d or 3d graphics?) or psychodelic vomit on the screen?
> 
> 
> 
> Application crash.
> 
> I did memory 1st, then voltage but with the memory at the highest I had I got it.
Click to expand...

In your BIOS, try increasing the IO and/or System Agent voltages slightly (usually called VCCIO and VCCSA, though different vendors use different names).

What motherboard/CPU are you running?


----------



## paulclift

Quote:


> Originally Posted by *gtbtk*
> 
> In your bios, try increasing the IO and/or System Agent voltages slightly. (usually called VCCIO and VCCSA but different vendors call it different names)
> 
> What Motherboard/CPU are u running?


Asus z97-p and a 4790k


----------



## gtbtk

Quote:


> Originally Posted by *shilka*
> 
> I have a really strange problem with the GTX 1070 i just bought
> 
> Sometimes at random but typically when there is a big explosion in a game the game will freeze the screen will turn black with the message display port no signal and after a few seconds i get thrown to the desktop with the message that the Nvidia driver has stopped working
> 
> It also happens when i turn off my TV (which is also hooked up to my GTX 1070) and the signal to the HDMI port is cut
> Same thing PC will freeze the screen will turn black with the message display port no signal and after a few seconds i get thrown to the desktop with the message that the Nvidia driver has stopped working
> 
> I have tried every GTX 1070 driver Nvidia has and i have tried underclocking the card i have tried cleaning out all remains of Nvidia drivers and i have tried everything short of reinstalling Windows
> 
> Next week i am going to upgrade to Windows 10 and if i still have the problem on a brand new Windows install i would assume the card is broken?
> Anyone have any idea why the card is doing this? cant play any games other then games without big explosions as the game will just crash and its super annoying.


The black screen is the driver crashing; it is recovering itself as designed. I don't think the card is broken. I think your BIOS settings are off slightly. Explosions tend to really load up the graphics card, and that in turn loads up the integrated PCIe controller on the CPU.

The settings I would look at are the VCCIO and VCCSA voltages: increase one or both slightly, one or two levels at a time, then test. I think the X99 maximums are 1.25 V and 1.3 V respectively, but you should confirm that.

You may also want to take a look at vcore and your load line calibration settings. Those may be off slightly, though not enough to cause a blue screen.

Windows 10 uses a different driver version than Windows 7/8; maybe the new Win 10 drivers will resolve the problem as well.


----------



## shilka

Quote:


> Originally Posted by *gtbtk*
> 
> The black screen is the driver crashing. It is recovering itself as designed. I don't think the card is broken, I think that your bios settings are off slightly. Explosions tend to really load up the Graphics cars and that in turn loads up the integrated PCIe controller on the CPU
> 
> The settings I would look at are VCCIO and VCCSA voltages and increase them slightly. Increase one or both one or two levels at a time then test. I think X99 maximums are 1.25 and 1.3V respectively but you should confirm that.
> 
> You may also want to take a look at vcore and your load line calibration settings. That may be off slightly but not enough to cause a blue screen
> 
> Windows 10 uses a different version of the driver to windows 7/8. Maybe the new win 10 drivers will resolve the problem as well


Can I send you a PM and ask for help if moving to Win 10 does not fix it?
Not sure I understood what you were saying, so let's see if moving to Win 10 helps or not.

Does the card need to have its BIOS updated?
Not sure if there even is a BIOS update for the Gigabyte GTX 1070 Xtreme Gaming.


----------



## DeathAngel74

I would update your motherboard and video card BIOSes if updates are available.


----------



## gtbtk

Quote:


> Originally Posted by *paulclift*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> In your bios, try increasing the IO and/or System Agent voltages slightly. (usually called VCCIO and VCCSA but different vendors call it different names)
> 
> What Motherboard/CPU are u running?
> 
> 
> 
> Asus z97-p and a 4790k
Click to expand...

OK, the settings to look at are CPU SA, CPU Analog IO and CPU Digital IO.

The safe range is said to be 1.15-1.3 V for all three. I would try starting at, say, 1.1 V for each and see if it helps. I don't think you need maximum voltage, just a bit above where you are now, which I assume is on Auto?


----------



## shilka

Found out I am still using the F1 BIOS on my motherboard, so that might be why it's crashing.
Can't find out what BIOS the GTX 1070 is on, but I think I had better update that as well.


----------



## gtbtk

Quote:


> Originally Posted by *shilka*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The black screen is the driver crashing. It is recovering itself as designed. I don't think the card is broken, I think that your bios settings are off slightly. Explosions tend to really load up the Graphics cars and that in turn loads up the integrated PCIe controller on the CPU
> 
> The settings I would look at are VCCIO and VCCSA voltages and increase them slightly. Increase one or both one or two levels at a time then test. I think X99 maximums are 1.25 and 1.3V respectively but you should confirm that.
> 
> You may also want to take a look at vcore and your load line calibration settings. That may be off slightly but not enough to cause a blue screen
> 
> Windows 10 uses a different version of the driver to windows 7/8. Maybe the new win 10 drivers will resolve the problem as well
> 
> 
> 
> Can i send you a PM and ask for help if moving to Win 10 does not help
> Not sure i understood what you where saying so lets see if moving to Win 10 helps or not
> 
> Does the card need to have its BIOS undated?
> Not sure if there even is a BIOS update for the Gigabyte GTX 1070 Xtreme Gaming?
Click to expand...

Yes, send me a PM and I will help you if I can. I am in no way an X99 expert, though.

The latest Gigabyte Xtreme 1070 BIOSes are 86.04.50.00.7C, and 7D for the second BIOS. Check using GPU-Z. If you have those BIOSes you are fine; if not, you can get them from the Gigabyte web site here: http://www.gigabyte.com/Graphics-Card/GV-N1070XTREME-8GD#support-dl

You should probably also check the motherboard BIOS and update that to the latest version. F6A says it is beta, but it has been there since August last year. All of the BIOS updates pretty much talk about DDR stability, and the voltage that helps that also helps the integrated PCIe controller, so the BIOS updates may be enough to solve your problem.

These threads may also be more help than I can be for specifics: http://www.overclock.net/t/1606212/gigabyte-x99-ultra-gaming-owners-thread/0_20 and this one http://www.overclock.net/t/1510355/gigabyte-x99-motherboard-discussion-club/0_20
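Since these dotted vBIOS strings come up a lot in the thread: the fields appear to be hexadecimal (e.g. the "7C"/"7D" suffixes), so a plain string compare can mislead. A small sketch for comparing them numerically; the function names are mine, and the hex-field assumption should be verified against your own card's GPU-Z readout.

```python
# Sketch: compare NVIDIA vBIOS version strings such as "86.04.50.00.7C".
# Each dotted field is treated as a hexadecimal byte, so versions are
# compared numerically rather than as plain strings.

def parse_vbios(version):
    """Split a dotted vBIOS string into a tuple of ints (hex fields)."""
    return tuple(int(field, 16) for field in version.split("."))

def is_newer(a, b):
    """True if vBIOS version a is newer than vBIOS version b."""
    return parse_vbios(a) > parse_vbios(b)

if __name__ == "__main__":
    print(is_newer("86.04.50.00.7D", "86.04.26.00.22"))
```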


----------



## shilka

I have the 86.04.50.00.7D BIOS on my video card, and that's the newest one as far as I can tell.
It's probably the motherboard then, because I am still using the F1 BIOS.

Going to update the motherboard BIOS and install Windows 10, and if that does not help either, then the card must be broken.


----------



## kevindd992002

Quote:


> Originally Posted by *gtbtk*
> 
> I have been telling everyone here that the EVGA bioses are configured to power limit more quickly than the other bioses for a while now, that includes the FTW card bioses as well.


Have there already been instances of the FTW's power limit being hit? I read in the 1080 thread that FTWs are "enough" if you're concerned about power limits.


----------



## gtbtk

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I have been telling everyone here that the EVGA bioses are configured to power limit more quickly than the other bioses for a while now, that includes the FTW card bioses as well.
> 
> 
> 
> Were there already instances that the power limit of the FTW was hit? I read in the 1080 thread that FTW's are "enough" if you're concerned about power limits.
Click to expand...

1070 EVGA cards bounce off their power limit at 1080p. That applies to both the SC and FTW cards, and it is the reason I don't recommend EVGA Pascal cards. I cannot comment on what the 1080 cards do, but on the 1070 it is directly related to the way EVGA has customized the power delivery in their BIOSes, pulling higher wattages sooner than other manufacturers' cards.

MSI Gaming cards have gone the other way and struggle to reach much higher than 80-85% of their power limit running the same game or benchmark, like Firestrike. However, in spite of a power slider that goes to 126%, the MSI cards will downclock as though they have hit their power limit in 4K loads at about 106%.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> 1070 evga cards bounce off their power limit at 1080p. That applies to both the SC and FTW cards and is the reason why I don't recommend EVGA pascal cards. I cannot comment on what 1080 cards do but in 1070, it is directly related to the way EVGA has customized the power delivery in their bioses, pulling higher wattages sooner than other manufacturers cards.
> 
> MSI Gaming cards have gone the other way and stuggle to reach much higher than 80-85%% or their power limit running the same game of benchmark like Firestrike. However, in spite of a power slider that goes to 126%, the MSI cards will downclock as though they have hit their power limit in 4K loads at about 106%


no issue with my EVGA at 1440p.


----------



## KedarWolf

Quote:


> Originally Posted by *shilka*
> 
> I have a really strange problem with the GTX 1070 i just bought
> 
> Sometimes at random but typically when there is a big explosion in a game the game will freeze the screen will turn black with the message display port no signal and after a few seconds i get thrown to the desktop with the message that the Nvidia driver has stopped working
> 
> It also happens when i turn off my TV (which is also hooked up to my GTX 1070) and the signal to the HDMI port is cut
> Same thing PC will freeze the screen will turn black with the message display port no signal and after a few seconds i get thrown to the desktop with the message that the Nvidia driver has stopped working
> 
> I have tried every GTX 1070 driver Nvidia has and i have tried underclocking the card i have tried cleaning out all remains of Nvidia drivers and i have tried everything short of reinstalling Windows
> 
> Next week i am going to upgrade to Windows 10 and if i still have the problem on a brand new Windows install i would assume the card is broken?
> Anyone have any idea why the card is doing this? cant play any games other then games without big explosions as the game will just crash and its super annoying.


https://support.microsoft.com/en-us/help/2665946/-display-driver-stopped-responding-and-has-recovered-error-in-windows-7-or-windows-vista

Works in Windows 10 too. Use a QWORD value on 64-bit Windows 10; I had the same issue.


----------



## khanmein

Quote:


> Originally Posted by *KedarWolf*
> 
> https://support.microsoft.com/en-us/help/2665946/-display-driver-stopped-responding-and-has-recovered-error-in-windows-7-or-windows-vista
> 
> Works in Windows 10 too. Qword with 64 bit Windows 10, had the same issue.


Previously, I had a TDR issue with my GTX 970, around the time of the first GTA V Game Ready driver and for the next 4 WHQL drivers.

FYI, I have never faced any flickering or sudden black screens for a few seconds over DVI, HDMI or DP/mDP on my GTX 970 or 1070.

Literally, TDR has never appeared on my GTX 1070.

@shilka should try TDR Manipulator V1.2


----------



## shilka

I never had any problems with my old GTX 970; it's only after I upgraded to the GTX 1070 that I have had problems.


----------



## DeathAngel74

Try chkdsk /f /r on all drives, plus sfc /scannow a couple of times.
I had the same issue, but going from a 1070 SC to a 1070 SC2 iCX. I did an advanced RMA for a second SC2 iCX card. This one is fine, no crashes. Sent the first one back yesterday.


----------



## shilka

I am upgrading to Windows 10 next week, so I am not going to try to fix it before then.
Windows 10 might help; if not, I will return the card and try something else.


----------



## Blackfirehawk

I tried the Palit GameRock BIOS with my Gainward GTX (non-GLH).
The BIOS version is 86.04.3B.00.72.

The core is stable at a 2050 MHz boost at 1.050 V at 68 °C; if I go higher I get artifacts and Heaven crashes.

Memory (Micron) is stable at 4750 MHz in Afterburner (9500 MHz effective).
4780 MHz seems stable too, but any higher and it crashes Heaven.

I think the core is at the lower end of the silicon lottery, or I need a better cooler, I don't know,
but the memory OC is fine.


----------



## spddmn24

My 1070 Quick Silver isn't overvolting properly after putting it under water. It would do 1.092 V under air, and now the highest I have seen under water is 1.062 V, I think. It only hit 1.05 V on this Firestrike run. Voltage/power/temp limits are all maxed out in Afterburner.


----------



## gtbtk

Quote:


> Originally Posted by *Blackfirehawk*
> 
> i tryed the Palit Gamerock bios with my Gainwand GTX (no glh)
> bios version is 86.04.3B.00.72
> 
> core is Stable @ 2050 mhz Boost @ 1.05MV @ 68 degree Celsius.. if i go higher i get artefacts and heaven crash
> 
> Memory (micron) is Stable @ 4750 MHZ in Afterburner /9500Mhz
> 4780mhz seems stable too.. but higher and it gonna Crash heaven
> 
> i think Core is lower end on Silicon lottery.. or i need a better Cooler.. don´t know
> but Memory OC is fine


The memory overclock you have is excellent: 9500 MHz, 1500 MHz above stock. That is 200 MHz higher than I can currently get with mine and still run stable. Memory performance is the thing that will give you the biggest boost in framerates. Isn't it funny how many people still believe that Micron memory cannot overclock?

2050 is not a terrible core clock rate. If you experiment with the curve, you may find the slider overclock has hit one of the points that the card doesn't want to go higher on; the other points still have some headroom to overclock more. You will probably find you can pull the 1.050 or 1.063 point up to about 2088 or 2100. You can also try pulling the 0.950 point up to about 2025. See how that helps your framerates.
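The curve-editing idea can be sketched as a table of voltage/frequency points where you raise one point and everything at or above that voltage flattens to its frequency, which is effectively what the Afterburner curve editor does when you cap the boost. The point values below are illustrative, not measured, and the function name is mine.

```python
# Sketch: Afterburner-style voltage/frequency curve editing.
# A curve is a list of (voltage, frequency) points. Raising one point
# and flattening all higher-voltage points to the same frequency caps
# the boost clock at that voltage. Example points are illustrative.

def flatten_above(curve, voltage, frequency):
    """Pin `voltage` to `frequency` and clamp every higher-voltage
    point to the same frequency, as the AB curve editor does."""
    return [(v, frequency if v >= voltage else f) for v, f in curve]

if __name__ == "__main__":
    curve = [(0.950, 1987), (1.050, 2050), (1.063, 2063), (1.093, 2076)]
    print(flatten_above(curve, 1.050, 2088))
```

With the example curve, pulling the 1.050 point to 2088 also pins the 1.063 and 1.093 points to 2088, so the card never requests more voltage for a higher clock.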


----------



## gtbtk

Quote:


> Originally Posted by *spddmn24*
> 
> My 1070 quicksilver isn't over volting properly after putting it under water. It would do 1.092 under air, and now the highest I have seen under water is 1.062 I think. Only hit 1.05 on this firestrike run. voltage/power/temp limit are all maxed out in afterburner.


What version of Afterburner are you using?

Can you post a screenshot of the settings screen that has the voltage control options?


----------



## spddmn24

Quote:


> Originally Posted by *gtbtk*
> 
> What version Afterburner are you using?
> 
> Can you post a screenshot of the setting screen that has the voltage control options.


4.3.0; uninstalled and reinstalled with no change.


----------



## gtbtk

Quote:


> Originally Posted by *spddmn24*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> What version Afterburner are you using?
> 
> Can you post a screenshot of the setting screen that has the voltage control options.
> 
> 
> 
> 4.3.0, unintalled and reinstalled with no change.
Click to expand...

The AB version is correct. Can you please click the cog button in Afterburner and take a screenshot of the first settings tab, which shows how you have set up AB to do power adjustments, so I can see your configuration? Sorry, I don't have access to my PC with Afterburner right now to show you the screen I need.

I would recommend that you turn "start Afterburner at startup" OFF. If you end up autoloading a profile that crashes your PC at boot time, it is a major hassle to fix.


----------



## DeathAngel74

Heh. I remember having to boot into safe mode to disable AB w/ windows startup.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> Heh. I remember having to boot into safe mode to disable AB w/ windows startup.


Yep, that's the hassle. I bet you have not made that mistake a second time.


----------



## spddmn24

Quote:


> Originally Posted by *gtbtk*
> 
> The AB version is correct. Can you please Click the cog button in afterburner and do a screen shot of the first settings tab that shows how you have set up AB to do power adjustments so I can see how you have it set up?. Sorry I dont have access to my PC with afterburner right now to show you the screen I need
> 
> I would recommend that you turn start afterburner at startup OFF. If you end up autoloading loading a profile that crashes your PC at boot time, it is a major hassle to fix it.




Everything worked fine when it was on air. I'm not sure if the BIOS just doesn't let it go up in voltage unless the temp goes up, for some reason?


----------



## gtbtk

Quote:


> Originally Posted by *spddmn24*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The AB version is correct. Can you please Click the cog button in afterburner and do a screen shot of the first settings tab that shows how you have set up AB to do power adjustments so I can see how you have it set up?. Sorry I dont have access to my PC with afterburner right now to show you the screen I need
> 
> I would recommend that you turn start afterburner at startup OFF. If you end up autoloading loading a profile that crashes your PC at boot time, it is a major hassle to fix it.
> 
> 
> 
> 
> 
> Everything worked fine when it was on air. I'm not sure if the bios just doesn't let it go up in voltage unless the temp goes up for some reason?
Click to expand...

First, try right-clicking on the monitoring graph window and check whether the "pause monitoring" option has a check mark against it. If yes, unpause the monitoring there.

If the monitoring is not paused in the main graph window, you need to check "Unlock voltage monitoring" (this is why the voltage shows up as 0 on the main screen).

You can also uncheck "Enable low level I/O driver" and "Enable low level hardware access interface", as these impact performance. They are more useful if you are debugging issues or running a game that uses the PunkBuster service or other applications that do not behave well.


----------



## spddmn24

Quote:


> Originally Posted by *gtbtk*
> 
> First try right clicking on the monitoring graph windows and check to see if the "pause monitoring" option has a check mark against it. If yes, unpause the monitoring there.
> 
> If the monitoring is not paused in the main graph window, you need to check "Unlock voltage monitoring" (this is why the voltage shows up as 0 on the main screen).
> 
> You can also uncheck "Enable low level I/O driver" and "Enable Low level hardware access interface" as these impact performance. They become more useful if you are debugging issues or if you are running a game that has the punkbuster service or other applicatoins that do not behave well.


I just ran the Firestrike Ultra stability test to try to get the temp up a bit, and the voltage increased with the temp. So it looks like the BIOS does some funky stuff under 40 °C.


----------



## gtbtk

I see you got your voltage control working.

The voltage and frequency you get are dependent on the curve that you set. Using the slider for core frequency and 100% voltage, the 1.081 and 1.093 points will often offset to the same frequency. The card will start out at the lowest voltage it can to run at the desired frequency set by the curve. As temps increase, either the voltage will increase to keep a steady frequency, or the frequency will drop at a steady voltage.


----------



## spddmn24

Quote:


> Originally Posted by *gtbtk*
> 
> I see you got your voltage control working.
> 
> The voltage and frequency you get to is dependent on the curve that you set. Using the slider for core frequency and 100% voltage, many times the 1.081 and 1.093 point will offset to the same frequency. the card will start out at the lowest voltage it can to run at the desired frequency set by the curve. As temps increase, the voltage will increase and keep a steady frequency or the frequency will drop and keep a steady voltage.


It was always working; I just assumed it ran at 1.093 V and downclocked as the temp rose. This is my first water-cooled GPU, so it's still a learning experience.


----------



## pez

Quote:


> Originally Posted by *Star Forge*
> 
> I mean if you look at the current list of Step-Up cards on EVGA's list, none of them are SC and above variants. Hell, you can't even get a 1080 FE if you wanted to. So I am assuming it is only going to be base models of the 1080Ti.


Indeed. I'll just be happy to get a non-FE cooler, honestly. I mean, I'm not going to be upset with an 'average' card considering that Pascal kinda has a trend going on with the OCs. I'd be surprised if the Ti is any different.


----------



## w-moffatt

Hi Guys,

Adding my GTX1070 and subbing this forum









- MSI GTX1070 8GB Gaming X edition

First time I've spent big money on a card (big money for me, anyway). Couldn't be happier. The only downside is that the LEDs on the card are a bit dull in person (the photo makes them look a lot brighter); a very minor, not really relevant complaint, but still...

Photos below











Cheers,
Will


----------



## khanmein

Quote:


> Originally Posted by *w-moffatt*
> 
> Hi Guys,
> 
> Adding my GTX1070 and subbing this forum
> 
> 
> 
> 
> 
> 
> 
> 
> 
> - MSI GTX1070 8GB Gaming X edition
> 
> First time ive spent big money on a card (big money for me anyway). Couldnt be happier. Only downside is the LEDs on the card are a bit dull outside (photo makes them look alot brighter, a very minor not really relevant complaint but still...
> 
> Photos below
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers,
> Will


Nice rig, but I suggest covering up the top area to prevent dust.


----------



## w-moffatt

Quote:


> Originally Posted by *khanmein*
> 
> nice rig but i suggest to cover up the top area to prevent dust.


Sorry for the potato-quality photos; there is a cover on top, my phone camera is junk.


----------



## gtbtk

Quote:


> Originally Posted by *spddmn24*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I see you got your voltage control working.
> 
> The voltage and frequency you get to is dependent on the curve that you set. Using the slider for core frequency and 100% voltage, many times the 1.081 and 1.093 point will offset to the same frequency. the card will start out at the lowest voltage it can to run at the desired frequency set by the curve. As temps increase, the voltage will increase and keep a steady frequency or the frequency will drop and keep a steady voltage.
> 
> 
> 
> It was always working, I just assumed it ran at 1.093v and Downclocked as the temp rose. This is my first water cooled gpu so it's still a learning experience.
Click to expand...

It will only run at 1.093 V if you have the voltage slider at 100%. At 0%, the card will run at a max of 1.050-1.063 V.


----------



## spddmn24

Quote:


> Originally Posted by *gtbtk*
> 
> It will only run at 1.093 if you have the voltage slider at 100%. At 0% the card will run at a max of 1.050 - 1.063v


The voltage slider has always been at 100%, and the voltage control worked fine on air when it hit 60-68 °C; it had no problem hitting 1.093 V. What it appears to be doing is running at 1.050 V below ~33 °C, 1.063 V at ~35 °C, 1.075 V at ~37 °C, and I'm guessing 1.093 V at ~39-40 °C, all at the same clock speed. I thought it would run at 1.093 V at all temps and then downclock as the temp went up, but that isn't the case.
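That behavior is consistent with GPU Boost stepping voltage in temperature bins rather than starting at maximum voltage. A rough sketch of the thresholds reported; the bin boundaries are guesses inferred from the observed readings, not documented values, and the names are mine.

```python
# Sketch: GPU Boost-style temperature-dependent voltage bins, using the
# rough thresholds reported above. The exact boundaries are guesses
# inferred from the observed readings, not documented NVIDIA values.

BINS = [  # (max_temp_c, voltage)
    (33, 1.050),
    (35, 1.063),
    (37, 1.075),
]
MAX_VOLTAGE = 1.093  # applied above the last threshold

def voltage_for_temp(temp_c):
    """Return the voltage bin the card appears to use at temp_c."""
    for max_temp, voltage in BINS:
        if temp_c <= max_temp:
            return voltage
    return MAX_VOLTAGE
```

Under water the card may simply never get warm enough to leave the lower bins, which would explain never seeing 1.093 V despite the slider being at 100%.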


----------



## gtbtk

Quote:


> Originally Posted by *spddmn24*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> It will only run at 1.093 if you have the voltage slider at 100%. At 0% the card will run at a max of 1.050 - 1.063v
> 
> 
> 
> The voltage slider has always been at 100%, and the voltage control worked fine on air when it hit 60-68c and had no problem hitting 1.093v. . What it appears to be doing is running at 1.050v @ ~<33c, 1.063 at ~35c, 1.075v at ~37c, and I'm guessing 1.093v @ ~39-40c, all at the same clock speed. I thought it would run at 1.093v at all temps then downclock as the temp went up, but that isn't the case.
Click to expand...

I don't have water, so I have not seen that.

I have found, though, that under air I run at about 50-53 °C with a 2088-2076 MHz core clock at 1.063 V when I leave the voltage at 0, and my performance is about the same as at 1.093 V with a 2126 MHz core and temps at about 56-58 °C (from memory).

Unfortunately my rig just died with a bang, frying the motherboard, so I can't get in to double-check it.


----------



## DeathAngel74

What the...sorry for your loss sir


----------



## dmnclocker

Hello all. I just purchased the Asus Strix GTX 1070 and was messing around with overclocking it. I was able to get it to +125 on core and +275 on memory; is that good? Also, can someone point me to a really good, detailed guide on overclocking the GPU? I have watched tons of videos and read a lot, but was wondering if there was a good tutorial I missed. I tried adding voltage to overclock further, but it didn't help the OC any. What were you all able to get on yours?


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> I don't have water so I have not seen that.
> 
> I have found though, that under air, I run at about 50-53 deg with a 2088-2076Mhz core clock at 1.063V when I leave the voltage at 0 and my performance is about the same as it is at 1.093v with a 2126Mhz core with temps at about 56-58 (from memory).
> 
> Unfortunately my rig had just died with a bang, frying the motherboard so I cant get in to double check it.


omg time to buy ryzen 7 1700 with x370.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> What the...sorry for your loss sir


Thanks for the condolences. Not sure if the 1070 survived unscathed.


----------



## CriZ93

A question: on TechPowerUp there are 3 BIOSes for the Zotac AMP Extreme.
Which is the best for Micron RAM?

86.04.1E.00.89

86.04.26.00.22

86.04.50.00.98 -- this is the one I have installed now

Thanks


----------



## gtbtk

Quote:


> Originally Posted by *dmnclocker*
> 
> Hello all. I just purchased the Asus Strix Gtx 1070. I was messing around with o.c. it. I was able to get it at +125 on core and +275 on memory, is that good? Also can someone point me to a really good detailed guide on o.c. the gpu, i have watched tons of videos and read a lot, but was wondering if there was a good tutorial i missed. I tried adding volts to o.c. further, but it didn't help with the oc any. What were you all able to get on yours?


I am assuming that is the non-OC version?

+125 on the core is not too bad. You should be looking for something in the +500 to +800 range on the memory. High memory clocks will give you more of a performance increase than extremely high core clocks.

An overclock with memory at +500 or above and the core slider at +75 is better than a high core OC with a low memory OC.

Make sure that you have the most up-to-date vBIOS installed on your card; you can check using GPU-Z. Some cards with Micron memory suffered from a bug in the earlier BIOSes that caused the PC to crash if you increased the memory clock above about +400.


----------



## gtbtk

Quote:


> Originally Posted by *CriZ93*
> 
> a question. In techpowerup there are 3 bios for zotac amp extreme.
> Which is the best for micron ram?
> 
> 86.04.1E.00.89
> 
> 86.04.26.00.22
> 
> 86.04.50.00.98 --This is the one I have now installed
> 
> thanks


.1E shipped with Samsung memory in the first batch of cards.

.26 shipped with Micron memory and has the bug.

.50 fixes the Micron bug, so you do not need to do anything.


----------



## CriZ93

Quote:


> Originally Posted by *gtbtk*
> 
> .1E came with samsung in the first batch of cards.
> .26 came with micron with the bug
> .50 bios solves the micron bug so you do not need to do anything


ok. thank you


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I don't have water so I have not seen that.
> 
> I have found though, that under air, I run at about 50-53 deg with a 2088-2076Mhz core clock at 1.063V when I leave the voltage at 0 and my performance is about the same as it is at 1.093v with a 2126Mhz core with temps at about 56-58 (from memory).
> 
> Unfortunately my rig had just died with a bang, frying the motherboard so I cant get in to double check it.
> 
> 
> 
> omg time to buy ryzen 7 1700 with x370.

Considering that. But it feels like Micron memory all over again.

To me at least, it is obvious that the SOC parts of the chip (the memory and PCIe controllers) are underperforming and are the cause of the SMT, gaming, and memory-speed bottlenecks.

No one in the media or at AMD is mentioning it at all. Just like with some Nvidia cards we are familiar with, they are instead trying to blame Windows scheduling and everything else. None of which makes any sense, because if they were correct it would also affect Cinebench scores and other non-gaming tests, and it obviously doesn't.

Without one to experiment with, I don't know if that can be solved by tuning voltages, or if AMD screwed up in designing that part of the chip so it can never run fast enough to support the communication between CPU and GPU and between CPU and RAM.

If it is a tuning issue, then this is a teething problem. If the design is flawed, they will never fix it in this generation of chips. Given that they only recently enabled turbo boost, there must be a reason for that. I fear the design may be flawed, so they may be able to improve things a bit, but it will always be somewhat limited.


----------



## shilka

Updated my motherboard BIOS from the old F2 BIOS to the newest F6a BIOS.
Let's see if that fixes the Nvidia driver crashes or not.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> Considering that. But it feels like Micron memory all over again.
>
> To me at least, it is obvious that the SOC parts of the chip (the memory and PCIe controllers) are underperforming and are the cause of the SMT, gaming, and memory-speed bottlenecks.
>
> No one in the media or at AMD is mentioning it at all. Just like with some Nvidia cards we are familiar with, they are instead trying to blame Windows scheduling and everything else. None of which makes any sense, because if they were correct it would also affect Cinebench scores and other non-gaming tests, and it obviously doesn't.
>
> Without one to experiment with, I don't know if that can be solved by tuning voltages, or if AMD screwed up in designing that part of the chip so it can never run fast enough to support the communication between CPU and GPU and between CPU and RAM.
>
> If it is a tuning issue, then this is a teething problem. If the design is flawed, they will never fix it in this generation of chips. Given that they only recently enabled turbo boost, there must be a reason for that. I fear the design may be flawed, so they may be able to improve things a bit, but it will always be somewhat limited.


Seems you have studied this quite a lot too. Yeah, the chipset and everything is indeed new, but FYI there's no Intel 8C/16T at 4.0 GHz, so I'm saving my money, because 4C/4T is really slow now.

The AMD Athlon 2500 served me really well back in the day on an Asus mobo. AMD's SMT > Intel's HT. I'll wait for Windows to fix the kernel.

Regarding Cinebench, Wendell mentioned that running it a few times will increase the score (https://twitter.com/i/web/status/838617988920184832), and he also said Intel will copy the approach.

My own take: if your resolution is 1080p, don't go for AMD; at 1440p/4K the difference is minimal.

For DDR4, 2666 MHz is the sweet spot with 2x8GB, since AM4 supports dual channel only. I heard Hynix has some issues and Samsung is still superior.

Woot, AMD also has a Micron-style fiasco like NV?


----------



## dmnclocker

Quote:


> Originally Posted by *gtbtk*
> 
> I am assuming that is the non-OC version?
>
> +125 on the core is not too bad. You should be looking for something in the +500 to +800 range on the memory. High memory clocks will give you more of a performance increase than extremely high core clocks.
>
> An overclock with memory at +500 or above and the core slider at +75 is better than a high core OC with a low memory OC.
>
> Make sure that you have the most up-to-date vBIOS installed on your card. You can check using GPU-Z. Some cards with Micron memory suffered from a bug in the earlier BIOSes that caused the PC to crash if you increased the memory clock above about +400.


I do have the OC edition. I will have to go check the BIOS version. I've never updated the BIOS on a GPU before. Is it as scary a thing to do as updating a motherboard BIOS?


----------



## khanmein

Quote:


> Originally Posted by *dmnclocker*
> 
> I do have the OC edition. I will have to go check the BIOS version. I've never updated the BIOS on a GPU before. Is it as scary a thing to do as updating a motherboard BIOS?


Not really, and it's almost the same as updating a mobo BIOS. If you mess it up, just boot with another GPU, or with the integrated graphics from Intel, and reflash.

I did it a few times on my previous GTX 970; no issues so far.


----------



## RyzenChrist

Looking at buying a 1070 SC. How well are they clocking guys?


----------



## HowYesNo

Quote:


> Originally Posted by *HowYesNo*
> 
> guys, I have this Gainward GTX 1070, and it runs quite well. Got it OC'd +99/+196 core/mem, which gives 1999/4201. Temps are at 75C-ish under load.
> So I decided to replace the thermal paste on it, and got a good result, with the core going down to around 66C. The problem is the VRM area.
> Before I disassembled the cooler, the VRM did get quite hot, but I could keep my finger on it. Now, with the lower core temp, the VRM area (backside) is hotter; I can't keep my finger on it as long as before. I cranked up the fan curve; no help. I believe this happened because the thermal pad is no longer sitting properly or as cleanly, so I ordered a pad from Phobya.
> I am interested whether there is a sort of heatsink that would go on the backside and cover only the VRM area, not the full backplate.
> Sorry if this has been answered before; I didn't go through much of this topic.
> this is my card
>
> reference with hot area in red
>
> and something like this to mount using holes marked green

Does anyone else have this same card? How thick are the thermal pads? I replaced the stock pads with 1.5 mm pads from Phobya; next to each other they seemed equal, but with the cooler back on the card they don't seem to press well against the heat spreader. As mentioned in my post above, the back of the card gets very hot after 5-6 minutes of GTA 5, hotter than with the stock pads.
Waiting for 2 mm Thermal Grizzly pads to arrive as replacements.
Any ideas? Thanks.


----------



## aimidin

Quote:


> Originally Posted by *w-moffatt*
> 
> Hi Guys,
> 
> Adding my GTX1070 and subbing this forum
> 
> 
> 
> 
> 
> 
> 
> 
> 
> - MSI GTX1070 8GB Gaming X edition
> 
> First time I've spent big money on a card (big money for me, anyway). Couldn't be happier. The only downside is the LEDs on the card are a bit dull in person (the photo makes them look a lot brighter); a very minor, not really relevant complaint, but still...
> 
> Photos below
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers,
> Will


Now just flash the Gaming Z BIOS and you will be set.







I have the same GPU. Before the flash, I pushed it to 2050 MHz core (unstable in some benchmarks, but not in games) and a stable 9400 MHz memory, *with a quiet fan profile up to 65C and an aggressive ramp to 100% from 65 to 75C*. After the Z BIOS I got a stable 2050 MHz core and 9500 MHz memory with the same profile, though when I set the fan to 100% all the time it boosts to 2070-2080 MHz core.
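A fan profile like that is easy to write down concretely. A tiny sketch of the curve described above (the 40% quiet floor is my assumption; set your own points in Afterburner's fan curve editor):

```python
def fan_percent(temp_c):
    """Sketch of the profile above: quiet up to 65C, then an aggressive
    linear ramp so the fan hits 100% by 75C."""
    if temp_c <= 65:
        return 40                      # assumed quiet floor
    if temp_c >= 75:
        return 100
    # linear ramp between 65C (40%) and 75C (100%): +6% per degree
    return 40 + (temp_c - 65) * 6

# e.g. fan_percent(70) -> 70
```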


----------



## dmnclocker

I noticed that I do have Micron memory. I went to the Asus site and downloaded the BIOS; it gave me gtx1070updatebios.exe. When I ran it, it said there was no need to update the BIOS?


----------



## Blackfirehawk

Quote:


> Originally Posted by *HowYesNo*
> 
> Does anyone else have this same card? How thick are the thermal pads? I replaced the stock pads with 1.5 mm pads from Phobya; next to each other they seemed equal, but with the cooler back on the card they don't seem to press well against the heat spreader. As mentioned in my post above, the back of the card gets very hot after 5-6 minutes of GTA 5, hotter than with the stock pads.
> Waiting for 2 mm Thermal Grizzly pads to arrive as replacements.
> Any ideas? Thanks.


I have the same card, together with a Palit GameRock BIOS, getting 2050 MHz on the core and 4750 MHz on the memory.

100% fan speed at about 68 degrees Celsius core temp.

But I haven't changed the thermal pads or tried new thermal paste on it.

I am very interested in this too, whether I can improve some temps with other thermal pads/paste.

I would maybe order 1.5 mm Arctic thermal pads and Arctic Silver 5 thermal paste for the core, if others see an improvement with this.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Considering that. But I feels like Micron Memory all over again.
> 
> To me at least, It is obvious that the SOC parts of the chip (Memory and PCIe controllers) is under performing and the cause of the SMT, gaming and memory speed bottlenecks.
> 
> No one in the media or from AMD is mentioning it at all. Just like some nVidia cards we are familiar with, instead they are trying to blame windows scheduling and everything else. None of which makes any sense because it they were correct it would also effect cinebench scores and other non gaming tests and it obviously doesn't.
> 
> Without one to experiment with, I'm don't know if that can be solved with tuning voltages or if AMD screwed up in designing that part of the chip so it can never run fast enough to support the communication between CPU - GPU and CPU - RAM.
> 
> If it is a tuning issue then this is a teething problem. If the design is flawed they will never fix it in this generation of chips. Given that they only recently enabled turbo boost, There must be a reason for that. I fear that the design may be flawed so they may be able to improve things a bit but it will always be somewhat limited.
> 
> 
> 
> seem u studied quite a lot too. yeah the chipset & everything is indeed new but FYI, there's no Intel 8C/16T 4.0 GHz so i'm saving money cos 4C/4T really slow.
> 
> AMD Athlon 2500 really served me quite well that time with Asus mobo. SMT > HT by Intel. wait windows fix the kernel.
> 
> regarding the cinebench, wendell mentioned run it few times will increase the score (
> 
> https://twitter.com/i/web/status/838617988920184832 ) & he also said Intel will copy too.

No, the fiasco I am talking about is the underperforming PCIe bus that the media is not mentioning or asking about ("WHY are games performing slower than on Intel boxes?"). It just reminds me of how they ignored the Micron memory issue we had on 1070s and hoped it would just go away.

The gaming tests are not being run to demonstrate gaming prowess; they are being run to demonstrate CPU performance relative to Intel chips, but they are actually demonstrating the bottleneck of the PCIe controller under load. Slow memory is also being noticed, and even though the memory controller and the PCIe controller are managed by the same part of the chip, no one seems to have worked out that there is some sort of connection. Of course, just like with the 1070, everyone jumps to the wrong conclusion that the memory is at fault when it is the controller that is actually causing the problem. I think memory support will improve, but I'm not so sure the Ryzen PCIe bandwidth issue is fixable.

I was watching this video tonight; the bug is interesting, but his observations are not quite complete. He does not notice that in overclock mode all cores run at the same speed with no turbo, yet his PC is running with 2 cores showing 4.2GHz and the other 6 cores at 3.6. He does discover that the bug actually makes the core clock run about 15% slower than it is supposed to.

I did notice something about Ryzen that makes me think they will never be able to improve the PCIe bus to the GPU to the same level as Intel.

This is what I am thinking. The Ryzen chips are made at a fab that usually makes ARM phone CPUs, and the Ryzen chip has been marketed with SOC elements that AMD have not really explained very well. The bug in that video showed the PC waking up back in the default mode that allows 2 cores to turbo up. AMD talk about changing to "overclock mode", which only allows all-core overclocks and is done purely in software. You can turn cores off in software. AMD needs Windows to have HPET enabled via a software command to work properly, just like VMware Tools on VMware.

Qualcomm also made a big deal about running native Windows on an ARM processor about 6 months ago.

When I put that all together, it says to me that a Ryzen chip is actually a really big ARM mobile phone chip running an 8-core, 64-bit x86 virtual machine on top, and the EFI is basically a hypervisor. There is nothing intrinsically wrong with that concept; however, with a virtual machine there is always a small overhead compared to running the code on bare metal.

We cannot tell whether there is a processing performance difference on the Ryzen 8-core because we have no non-VM Ryzen to compare it to directly, so that doesn't really matter; only how it compares to other Intel or AMD FX chips does. We can, however, compare the performance of Windows memory access and PCIe 3.0 access, because those are in common with Intel boxes that don't have the virtualization overhead. The performance limitations we are seeing with memory and PCIe 3.0 are, I think, the same sort of virtualization overhead you see running a VM on VMware. If I am correct, Windows memory performance and PCIe performance will never reach the same levels as Intel machines.

----------



## asdkj1740

Quote:


> Originally Posted by *HowYesNo*
> 
> does anyone else has this same card. how thick are the thermal pads. i replaced stock with 1.5mm from Phobya, looking at them next to each other they seemed equal. but when in place with cooler back on card it doesn't seem they are pressed good on heat spreader. as mentioned in previous post above back of the card gets very hot after a 5-6 of GTA 5, hotter than with stock pads.
> waiting for 2mm Thermal grizly pads to arrive to replace.
> any ideas? thanks.


You can simply disassemble the card again and check whether there are clear contact marks on the pads showing good contact between pad, heatsink, and MOSFETs.

It is true that if your GPU core has gotten ~5C cooler, your MOSFET cooling could be worse than before. I had the same situation, but I just let it be, as it is really frustrating to solve and takes a lot of effort and time to fix.

It is wise to use a thicker pad to replace the original, as thermal pads are soft enough to be pressed down by ~0.5 mm.

If none of the above is relevant, then maybe the pad you bought is just bad, but it is said that using different grades of thermal pad on MOSFETs makes an insignificant cooling difference, because the MOSFET temperature is largely limited by the package itself.
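Following the ~0.5 mm compressibility point, pad choice is basically a gap calculation. A small sketch (the stock sizes and the compression margin are assumptions; measure your own card):

```python
def pick_pad_thickness(gap_mm, stock_sizes=(0.5, 1.0, 1.5, 2.0, 3.0),
                       max_compression_mm=0.5):
    """Pick the thinnest stock pad that still fills the measured gap.

    A pad only conducts well if it is slightly compressed, so we want
    thickness >= gap, but not so thick that it must squeeze down by
    more than ~0.5 mm (the compressibility figure mentioned above).
    """
    for t in stock_sizes:
        if gap_mm <= t <= gap_mm + max_compression_mm:
            return t
    return None  # no stock size fits; re-measure or stack pads

# e.g. a measured 1.7 mm gap -> a 2.0 mm pad (compressed by 0.3 mm)
```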


----------



## gtbtk

Quote:


> Originally Posted by *dmnclocker*
> 
> I notice that i do have the micron memory. I went to the asus site and downloaded the bios. It gave me gtx1070updatebios.exe. When i ran it said no need to update bios?


The BIOS update has been available since last November. Some cards were warehoused or sat on shop shelves for some time and still have the old BIOS installed.

Anything made since last November runs the new BIOS, so yours must have been manufactured after that. Running the BIOS update utility will not have hurt your card.


----------



## dmnclocker

So the Micron memory is bad? Should I try my luck taking it back to the store and seeing if I can get one with Samsung memory?


----------



## blued

In response to someone (non-1070 owner) who posted in monitors section that we 1070 owners are suffering from "absurd latency problems"...

http://www.overclock.net/t/1384767/official-the-qnix-x-star-1440p-monitor-club/25650#post_25904309

Well apparently it seems that this was indeed an issue when cards were released (June 2016), but as I recall was quickly fixed with a driver update.

So... is anyone still experiencing 'latency problems' with their 1070s? I never noticed anything of the sort in my 6 months of ownership, and as a frequent visitor to other 1070 forums, I haven't heard of it. I know there was a thread about it on the GeForce forums, but outside of that it seems almost non-existent. Yet the gentleman non-owner suggests that others should not make the same mistake we did in buying a 1070, and that they should really go for an RX 480 or a GTX 1080.

Btw, I would think latency problems would be easily detected in the FCAT measurements in reviews. No reviewer has reported anything of the sort to my knowledge.


----------



## gtbtk

Quote:


> Originally Posted by *dmnclocker*
> 
> So the micron for memory is bad. Should I try my luck on taking it back to store and seeing if I can get one with Samsung memory?


No, Micron memory is not bad at all. Why would you say that?

The firmware Nvidia wrote when the cards first got Micron chips last June had a mistake in it that caused some issues. The BIOS update in November fixed that. Your card has the new BIOS, so there is no issue at all.


----------



## zipper17

@blued Saying Nvidia doesn't have async compute hardware support, or that it cannot take full advantage of DX12/Vulkan, is completely blind...

https://www.reddit.com/r/50dqd5/demystifying_asynchronous_compute/


----------



## gtbtk

Quote:


> Originally Posted by *blued*
> 
> In response to someone (non-1070 owner) who posted in monitors section that we 1070 owners are suffering from "absurd latency problems"...
> 
> http://www.overclock.net/t/1384767/official-the-qnix-x-star-1440p-monitor-club/25650#post_25904309
> 
> Well apparently it seems that this was indeed an issue when cards were released (June 2016), but as I recall was quickly fixed with a driver update.
> 
> So... is anyone still experiencing 'latency problems' with their 1070s? I never noticed anything of the sort in my 6 months of ownership, and as a frequent visitor to other 1070 forums, I haven't heard of it. I know there was a thread about it on the GeForce forums, but outside of that it seems almost non-existent. Yet the gentleman non-owner suggests that others should not make the same mistake we did in buying a 1070, and that they should really go for an RX 480 or a GTX 1080.
>
> Btw, I would think latency problems would be easily detected in the FCAT measurements in reviews. No reviewer has reported anything of the sort to my knowledge.


When the card was first released, one or two versions of the Nvidia drivers had some DPC latency issues, but they were hardly absurd, and a driver update resolved what was a relatively minor issue. There has been no mention of DPC latency by anyone since about last August. You can measure it with a tool such as DPC Latency Checker, and you could work around it by configuring Message Signaled Interrupts in the registry, which solved the problem. The issues caused by the firmware bug behind the Micron memory problems were being blamed on DPC latency at that stage, so I guess the impact here looked a little worse than it was. The 1080 used the same drivers and saw the same DPC latency issues.

The 1070 performs better in Doom with Vulkan than with OpenGL, so that statement is untrue. AMD cards certainly improved more, but that is more a comment on how bad their drivers were before Vulkan came along than on the brilliance of AMD.

There are no Vulkan performance benefits with Maxwell cards at all, and Vulkan has not really made much impact in anything other than Doom.

DX12 does improve ROTR, but only slightly. The thing with DX12 is that it relies on the developer writing good code and understanding what they are doing. A number of DX12-patched titles have been shown to give no benefit to either graphics card brand.

There is nothing wrong with an RX 480 if you like 1060-level performance. The AMD drivers certainly seem to have improved recently, but it has taken them long enough.
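For anyone who wants the MSI workaround mentioned above: it is the standard `MSISupported` registry value under the GPU's device instance path. The path below is a made-up placeholder; look up your card's actual "Device instance path" in Device Manager first, and treat this as a sketch rather than a drop-in file:

```
Windows Registry Editor Version 5.00

; Replace the PCI path below with your GPU's "Device instance path"
; from Device Manager (this one is an example placeholder).
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\PCI\VEN_10DE&DEV_1B81&SUBSYS_00000000&REV_A1\4&00000000&0&0000\Device Parameters\Interrupt Management\MessageSignaledInterruptProperties]
"MSISupported"=dword:00000001
```

After importing and rebooting, you can confirm the card is using message-signaled interrupts in Device Manager's resources view; deleting the value reverts to line-based interrupts.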


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> @blued Saying Nvidia doesn't have Async hardware support or cannot take full advantage of DX12/Vulkan is completely blind...
> 
> 
> __
> https://www.reddit.com/r/50dqd5/demystifying_asynchronous_compute/%5B/URL


Nvidia did certainly take a different approach to Async compute than AMD have with GCN but up until Vulkan and DX12, the hardware based solution that AMD implemented went completely unused unless Mantle was implemented and I don't think that was very widespread.

Nvidia has taken a software based scheduler approach with Pascal. 1070 and above seem to benefit. I don't think 1060 and below show much benefit one way or the other. 1070 cards still blitz RX480 cards in Doom even when using Vulkan so having a hardware scheduler has still, at least here, not shown itself to be very relevant anyway.


----------



## dmnclocker

Quote:


> Originally Posted by *gtbtk*
> 
> No, Micron memory is not bad at all. Why would you say that?
>
> The firmware Nvidia wrote when the cards first got Micron chips last June had a mistake in it that caused some issues. The BIOS update in November fixed that. Your card has the new BIOS, so there is no issue at all.


I was just referring to the issue that had the BIOS fix for Micron. I was reading that if you had Samsung you didn't need the BIOS fix; I don't know anything about it beyond that. I did buy my card last week though, so I'm still in the return window. I just saw that I could get the Asus Strix GTX 1080 non-OC edition for only $100 more. Is it worth it, given that the 1070 I have now is the OC edition?


----------



## blued

Quote:


> Originally Posted by *gtbtk*
> 
> When the card was first released, one or two versions of Nvidia drivers had some DPC latency issues but they were hardly absurd and a driver update resolved what was a relatively minor issue. There has been no mention of DPC latency by anyone since about last August. ...


Yeah, figured as much. The 1070 is a popular, high-selling card. Whenever an NV card has minor issues that affect very few people, the GeForce forums are usually the main place they go. Even if 1% were affected out of tens of thousands sold, that can still amount to a big thread there. All it takes is for anyone to google any problem concerning any card and they can come up with a big number of hits (2,000,000 for the latency thing). But google the AMD RX 480 power draw problem and you get over 6,000,000 hits, never mind that it was also fixed very early after release. Still annoying when poorly informed people try to peddle crap like this as a big ongoing problem long after it has been solved.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> No, the fiasco I am talking about is the underperforming PCIe bus that the media is not mentioning or asking about ("WHY are games performing slower than on Intel boxes?"). It just reminds me of how they ignored the Micron memory issue we had on 1070s and hoped it would just go away.
>
> The gaming tests are not being run to demonstrate gaming prowess; they are being run to demonstrate CPU performance relative to Intel chips, but they are actually demonstrating the bottleneck of the PCIe controller under load. Slow memory is also being noticed, and even though the memory controller and the PCIe controller are managed by the same part of the chip, no one seems to have worked out that there is some sort of connection. Of course, just like with the 1070, everyone jumps to the wrong conclusion that the memory is at fault when it is the controller that is actually causing the problem. I think memory support will improve, but I'm not so sure the Ryzen PCIe bandwidth issue is fixable.
> 
> I was watching this video tonight
> 
> 
> 
> the bug is interesting, but his observations are not quite complete. He does not notice that in overclock mode all cores run at the same speed with no turbo, yet his PC is running with 2 cores showing 4.2GHz and the other 6 cores at 3.6. He does discover that the bug actually makes the core clock run about 15% slower than it is supposed to.
>
> I did notice something about Ryzen that makes me think they will never be able to improve the PCIe bus to the GPU to the same level as Intel.
>
> This is what I am thinking. The Ryzen chips are made at a fab that usually makes ARM phone CPUs, and the Ryzen chip has been marketed with SOC elements that AMD have not really explained very well. The bug in that video showed the PC waking up back in the default mode that allows 2 cores to turbo up. AMD talk about changing to "overclock mode", which only allows all-core overclocks and is done purely in software. You can turn cores off in software. AMD needs Windows to have HPET enabled via a software command to work properly, just like VMware Tools on VMware.
>
> Qualcomm also made a big deal about running native Windows on an ARM processor about 6 months ago.
>
> When I put that all together, it says to me that a Ryzen chip is actually a really big ARM mobile phone chip running an 8-core, 64-bit x86 virtual machine on top, and the EFI is basically a hypervisor. There is nothing intrinsically wrong with that concept; however, with a virtual machine there is always a small overhead compared to running the code on bare metal.
>
> We cannot tell whether there is a processing performance difference on the Ryzen 8-core because we have no non-VM Ryzen to compare it to directly, so that doesn't really matter; only how it compares to other Intel or AMD FX chips does. We can, however, compare the performance of Windows memory access and PCIe 3.0 access, because those are in common with Intel boxes that don't have the virtualization overhead. The performance limitations we are seeing with memory and PCIe 3.0 are, I think, the same sort of virtualization overhead you see running a VM on VMware. If I am correct, Windows memory performance and PCIe performance will never reach the same levels as Intel machines.


His TMPIN3 temp is hot on first boot, and after waking from sleep the temp is slightly lower. Confirmed bugs; it's like the early days of X99, which also had plenty of issues.

Like I said, if you use a higher resolution, 1440p and above, and turn every setting plus AA to the max, you can't notice any huge difference.

Did you know about SATA3 on AM4 mobos? If I use 2 SSDs + 2 HDDs, will it cause any bottleneck? What I know is that if you use 2 M.2 NVMe drives, the PCIe runs at x2.

I agree with what the guy said about the Gamers Nexus conclusion. I'm not sure about PCIe, but I think you have a point there.


----------



## JoeUbi

Anyone else pick up Ghost Recon Wildlands? I'm able to run it at 2164/9840 and it looks AMAZING and runs GREAT. I love the in-game benchmark too.


----------



## pez

I ran the open beta on my 1080 at 3440x1440 and it was a pretty constant 60FPS even with a lot going on. Was super happy about that. Thinking I may actually pick this up soon.


----------



## w-moffatt

Quote:


> Originally Posted by *JoeUbi*
> 
> Anyone else pick up Ghost Recon Wildlands? I'm able to run it at 2164/9840 and it looks AMAZING and runs GREAT. I love the in-game benchmark too.


Got a copy free with my GPU. Running at 1440p on high at 60fps. The game is super buggy though: FPS drops in major towns, lock-up errors, etc. All have been reported. The game devs have just come back and said to wait, they will fix it eventually...








Quote:


> Originally Posted by *aimidin*
> 
> Now just put Gaming Z BIOS and you will be perfect
> 
> 
> 
> 
> 
> 
> 
> I have the same GPU. Before the flash, I pushed it to 2050 MHz core (unstable in some benchmarks, but not in games) and a stable 9400 MHz memory, *with a quiet fan profile up to 65C and an aggressive ramp to 100% from 65 to 75C*. After the Z BIOS I got a stable 2050 MHz core and 9500 MHz memory with the same profile, though when I set the fan to 100% all the time it boosts to 2070-2080 MHz core.


Gaming Z BIOS? Where do I get this from? I'm getting around 1935 MHz core in OC mode when in game as it stands; will the extra 100 MHz actually make a difference?


----------



## w-moffatt

deleted.


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> Nvidia did certainly take a different approach to Async compute than AMD have with GCN but up until Vulkan and DX12, the hardware based solution that AMD implemented went completely unused unless Mantle was implemented and I don't think that was very widespread.
> 
> Nvidia has taken a software based scheduler approach with Pascal. 1070 and above seem to benefit. I don't think 1060 and below show much benefit one way or the other. 1070 cards still blitz RX480 cards in Doom even when using Vulkan so having a hardware scheduler has still, at least here, not shown itself to be very relevant anyway.


I think GPU architecture IPC is still the number one factor in dominating performance. No matter how optimized your async compute is, "close to the metal" or not, it is not going to beat core architecture IPC.

AMD needs to develop and improve their GPU architecture's IPC to get back into the competition. Looking forward to their Vega generation.

Btw, it seems this Ryzen only supports up to 24 lanes; that means you only get x8/x8 for dual GPU, while Intel HEDT has 40 lanes and can do a full x16/x16. I think Intel will still hold its prices for HEDT because of that.
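For a sense of scale on x8/x8 vs x16/x16, per-lane PCIe 3.0 bandwidth is easy to compute (raw link rate with 128b/130b encoding; real transfers lose a bit more to packet overhead):

```python
def pcie3_bandwidth_gbps(lanes):
    """Approximate PCIe 3.0 bandwidth in GB/s for a given lane count.

    Each PCIe 3.0 lane signals at 8 GT/s with 128b/130b encoding,
    so payload throughput is 8e9 * 128/130 bits/s, ~0.985 GB/s per lane.
    """
    per_lane_gbs = 8 * 128 / 130 / 8  # GB/s per lane
    return lanes * per_lane_gbs

# x16 ~ 15.75 GB/s, x8 ~ 7.88 GB/s: halving the link halves the ceiling,
# though games rarely saturate even x8 in practice.
```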


----------



## gtbtk

Quote:


> Originally Posted by *dmnclocker*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> no. micron memory is not bad at all. Why would you say that?
> 
> The firmware that Nvidia wrote when the cards first got Micron chips last June had a mistake in it that caused some issues. The bios update in november fixed that. Your card has the new bios so there is no issue at all
> 
> 
> 
> I was just referring to the issue that had the bios fix for micron. Was reading that if you had Samsung you didn't need Bios fix. I don't know anything about it besides that. I did buy my card last week though, so I'm still In the window frame for return. Just seen that I could get the asus strix gtx 1080 non oc edition for only $100 more. Is it worth it, because the 1070 I have now is oc.

Nothing to worry about. You are right, the Samsung memory on the very first cards did not have the issue.

Any 1080 will beat a 1070 performance-wise, even an overclocked one. You need to decide if the extra $100 is worth it to you, as you can play 1440p games on both. A 1080 will last longer before its performance drops to a level that is too slow for games, if you plan to keep it for a number of years.

The Asus non-OC Strix models are handy because they have the same PCB, so you can cross-flash them with the OC BIOS and basically get an OC model for free.


----------



## gtbtk

Quote:


> Originally Posted by *blued*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> When the card was first released, one or two versions of Nvidia drivers had some DPC latency issues but they were hardly absurd and a driver update resolved what was a relatively minor issue. There has been no mention of DPC latency by anyone since about last August. ...
> 
> 
> 
> Yeah, figured as much. 1070 is a popular, high selling card. Whenever an Nv card has minor issues that affect very few, geforce forums is usually the main place they go to. Even if 1% were affected out of tens of thousands sold, that can still amount to a big thread there. All it takes is for anyone to google any problem concerning any card and they can come with big number of hits (2,000,000 for the latency thing). But google amd rx 480 power draw problem and you get over 6,000,000 hits. Never mind that was also fixed very early on after release
> 
> 
> 
> 
> 
> 
> 
> . Still annoying when poorly informed people try to peddle crap like this as big ongoing problems long after they've been solved.

The big micron memory thread over at nvidia was my thread. At the time though I was the one working out what the problem was and there was no google searches to look up. Remember that the internet is pretty much about 90% overreaction certainly not about considered facts.

Latency did cause some stuttering but sometimes drivers are buggy and do that. an update fixes it. Unfortunately it was at the time of the micron memory issues and no one had worked out what was causing it so the two things got confused.

RX480 power draw was a panic over nothing based on wrong assumptions by journos. There are cards, including a couple of 1080 models from gainward/Palit and possibly 1070 models are configured to draw up to 95 watts over the PCI bus. No panics going on there. PCs are not blowing up either

EVGA blowing up VRMs because of thermals was a panic and wrong based on wrong assumptions by journos. cause was a batch of faulty capacitors. Still caused the ICX upgrade to be produced to recover reputation

We are starting to see the misinformation about how Ryzen can't play games spread now, also blaming the CPU itself, which is a wrong assumption because these guys don't understand what is actually on a Ryzen chip and how it works. AMD are not exactly educating them either, so that doesn't help.

oh well


----------



## zipper17

New driver 378.78 for the 1080 Ti launch.

It seems to have some improvement in some DX12 games.


Vulkan performances

http://www.geforce.com/whats-new/articles/tom-clancys-ghost-recon-wildlands-game-ready-driver
http://www.anandtech.com/show/11180/the-nvidia-geforce-gtx-1080-ti-review/4


----------



## aimidin

Quote:


> Originally Posted by *w-moffatt*
> Gaming Z BIOS? Where do I get this from? I'm getting around 1935MHz core in OC mode in game as it stands; will the extra 100MHz actually make a difference?


First, check whether your card has Samsung or Micron memory.
This one is for Samsung: https://www.techpowerup.com/vgabios/185888/msi-gtx1070-8192-160608-2
and this one is for Micron : https://www.techpowerup.com/vgabios/187155/msi-gtx1070-8192-161024-3

Be aware that the stock clocks will be higher; my GPU boosted to 2000MHz without any overclock after the BIOS flash, so test your GPU manually to see how far it can go. As for the 100MHz difference, I never tested it; when I bought the GPU I overclocked it straight away and have always run it overclocked.

These are my MSI Afterburner settings with the Z BIOS:

(attached: Screenshot_4.jpg)


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Nvidia did certainly take a different approach to Async compute than AMD have with GCN but up until Vulkan and DX12, the hardware based solution that AMD implemented went completely unused unless Mantle was implemented and I don't think that was very widespread.
> 
> Nvidia has taken a software based scheduler approach with Pascal. 1070 and above seem to benefit. I don't think 1060 and below show much benefit one way or the other. 1070 cards still blitz RX480 cards in Doom even when using Vulkan so having a hardware scheduler has still, at least here, not shown itself to be very relevant anyway.
> 
> 
> 
> I think GPU architecture IPC is still the number-one priority for performance. No matter how optimized your async compute is, "close to the metal" or not, it is not going to beat the GPU core architecture's IPC.
> 
> AMD needs to develop and improve their GPU IPC to get back into the competition. Looking forward to their Vega generation.
> 
> Btw, it seems Ryzen only supports up to 24 lanes, which means you only get x8/x8 for dual GPU; Intel HEDT has 40 lanes and can do a full x16/x16. I think Intel will still hold its prices for HEDT because of that.

Async is one of the nice-to-haves around the edges, as it were. I guess Volta will probably get hardware acceleration for async, but the performance will not improve that much. The software approach actually helps as you get faster CPUs.

AMD's problems are not related to GPU processing power. In some cases, like the Fury X, it was more powerful in TFLOPS than anything from Nvidia at the time. Their problem has always been driver optimization. The RX 480 was behind the 1060 on release, but now it is generally a better card because they have spent time improving the driver support. You might be amazed at how much of the graphics performance in games actually comes from the driver intercepting terrible commands from the application and translating them into something the hardware makes best use of. Nvidia has had far more resources to do those optimizations.

x8 PCIe 3.0 (roughly x16 PCIe 2.0) is enough for a 1070 to hit a 21500 graphics score in Firestrike, which is better than the majority of PCIe 3.0 1070 systems are getting. A graphics card cannot use much more than x8 PCIe 3.0 at the best of times, and x8/x8 is exactly what you get with Z270. True, you can do x16/x16 with X99 and some CPUs, which start at prices higher than a Ryzen 1800X; it will maybe give you an extra 3-4 fps compared to a Z270 (apples with apples). Ryzen is slower, but it is still new and needs further tuning and stable BIOSes to improve the performance of the PCIe controller on the chip. I am not so sure they will be able to match Intel in that area because of the way Ryzen has been architected, though.

If you are only about gaming, Z270 with x8/x8 is the platform of choice, because the 7700K beats most X99 or Ryzen based rigs in most gaming scenarios thanks to higher clock speeds. The 8-core chips will still allow acceptable gaming, and will kill it in rendering or server-type loads.
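As a back-of-the-envelope check on the x8 vs x16 point: PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding, while PCIe 2.0 runs 5 GT/s with 8b/10b, which is why x8 PCIe 3.0 and x16 PCIe 2.0 land in the same ballpark. A quick sketch of the arithmetic (ignoring protocol overhead beyond line encoding):

```python
# Rough per-direction PCIe bandwidth from transfer rate and line encoding.
def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Approximate usable bandwidth in GB/s for one direction."""
    rates = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}  # GT/s, encoding efficiency
    gt_per_s, efficiency = rates[gen]
    return gt_per_s * efficiency * lanes / 8  # bits -> bytes

print(f"PCIe 3.0 x8 : {pcie_bandwidth_gbps(3, 8):.2f} GB/s")   # 7.88
print(f"PCIe 3.0 x16: {pcie_bandwidth_gbps(3, 16):.2f} GB/s")  # 15.75
print(f"PCIe 2.0 x16: {pcie_bandwidth_gbps(2, 16):.2f} GB/s")  # 8.00
```

So an x8 Gen3 slot carries about as much as an x16 Gen2 slot, and a single card that cannot saturate x8 Gen3 gains almost nothing from x16.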


----------



## gtbtk

Quote:


> Originally Posted by *aimidin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *w-moffatt*
> Gaming Z BIOS? Where do I get this from? I'm getting around 1935MHz core in OC mode in game as it stands; will the extra 100MHz actually make a difference?
> 
> 
> 
> First, check whether your card has Samsung or Micron memory.
> This one is for Samsung: https://www.techpowerup.com/vgabios/185888/msi-gtx1070-8192-160608-2
> and this one is for Micron: https://www.techpowerup.com/vgabios/187155/msi-gtx1070-8192-161024-3
> 
> Be aware that the stock clocks will be higher; my GPU boosted to 2000MHz without any overclock after the BIOS flash, so test your GPU manually to see how far it can go. As for the 100MHz difference, I never tested it; when I bought the GPU I overclocked it straight away and have always run it overclocked.
> 
> These are my MSI Afterburner settings with the Z BIOS:
> 
> (attached: Screenshot_4.jpg)

The 86.04.50.00 BIOS will actually run on both Micron and Samsung cards.

When he says the stock MHz are higher: the Gaming Z has a 1633MHz core instead of 1582MHz, and 4050MHz VRAM instead of the Gaming X's 4008MHz, so you need to remember that with your overclocks. A +600 memory OC on the Gaming X now equals roughly +550 on the Gaming Z, for example.

The new BIOS does not change the maximum number of MHz the card can cope with; it just moves the defaults closer to the upper limits of the hardware.
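Since the two BIOSes differ only in their default clocks, converting an offset is just subtracting the difference in stock clocks. A small sketch of that arithmetic, using the Gaming X/Z defaults quoted above:

```python
# Stock clocks (MHz): MSI GTX 1070 Gaming X vs Gaming Z BIOS defaults.
GAMING_X = {"core": 1582, "mem": 4008}
GAMING_Z = {"core": 1633, "mem": 4050}

def convert_offset(offset_mhz: int, domain: str) -> int:
    """Express a Gaming X offset as the Gaming Z offset hitting the same absolute clock."""
    target = GAMING_X[domain] + offset_mhz
    return target - GAMING_Z[domain]

print(convert_offset(600, "mem"))   # 558, i.e. roughly the "+550" above
print(convert_offset(100, "core"))  # 49: +100 core on the X is only +49 on the Z
```

The point is that a stable absolute clock stays stable; only the number you type into Afterburner changes after the flash.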


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> His TMPIN3 temp is hot on first boot, and after waking from sleep the temp is slightly lower. Confirmed there are bugs, like the old days of the X99 release, which also had plenty of issues.
> 
> Like I said, if you use a higher resolution, above 1440p, and turn every setting plus AA to the max, we can't notice any huge difference.
> 
> Did you know about SATA3 on AM4 boards? If I use 2 SSDs + 2 HDDs, will it cause any bottleneck? What I know is that if you use 2 M.2 NVMe drives, the PCIe runs at x2.
> 
> I agree with what the guy said about the GamersNexus conclusion. I'm not sure about PCIe, but I think you've got a point there.


I have not seen a single thing about SATA performance, but it is controlled from the SOC part of the chip.


----------



## aimidin

Quote:


> Originally Posted by *gtbtk*
> The 86.04.50.00 BIOS will actually run on both Micron and Samsung cards.
> 
> When he says the stock MHz are higher: the Gaming Z has a 1633MHz core instead of 1582MHz, and 4050MHz VRAM instead of the Gaming X's 4008MHz, so you need to remember that with your overclocks. A +600 memory OC on the Gaming X now equals roughly +550 on the Gaming Z, for example.
> 
> The new BIOS does not change the maximum number of MHz the card can cope with; it just moves the defaults closer to the upper limits of the hardware.


True, but I really got better stability with the Gaming Z BIOS. Before, at the same 2050MHz, I couldn't pass the Firestrike benchmark even once, though in games I only had problems with DirectX 12 titles, so I needed to lower the overclock by 20MHz. Now everything works properly in every benchmark.

One strange thing happened when I was testing the new BIOS. In Firestrike, with a custom voltage curve, I went to 2126MHz on the core and passed the test, but when I tried games and other benchmarks I immediately got artifacts and then a driver crash. I think I found out how I did it, because I replicated it with the Time Spy benchmark. I used the Steam version of 3DMark: I apply the 2126MHz overclock profile and run the benchmark; when it crashes, I don't close the program, just start the test again. After 2-3 tries, I can pass the whole test without any problems. I don't know why it does that; it's really strange, like it can do it but at the same time it can't.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> I have not seen a single thing about sata performance but it is controlled from the SOC part of the chip.


Good, if that's the case. By the way, 378.78 really improved things, and the colors are more vibrant (they pop).

for pure entertainment between raja koduri & ryan shrout >>

https://twitter.com/i/web/status/839566712236298240


----------



## gtbtk

Quote:


> Originally Posted by *aimidin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> The 86.04.50.00 BIOS will actually run on both Micron and Samsung cards.
> 
> When he says the stock MHz are higher: the Gaming Z has a 1633MHz core instead of 1582MHz, and 4050MHz VRAM instead of the Gaming X's 4008MHz, so you need to remember that with your overclocks. A +600 memory OC on the Gaming X now equals roughly +550 on the Gaming Z, for example.
> 
> The new BIOS does not change the maximum number of MHz the card can cope with; it just moves the defaults closer to the upper limits of the hardware.
> 
> 
> 
> True, but I really got better stability with the Gaming Z BIOS. Before, at the same 2050MHz, I couldn't pass the Firestrike benchmark even once, though in games I only had problems with DirectX 12 titles, so I needed to lower the overclock by 20MHz. Now everything works properly in every benchmark.
> 
> One strange thing happened when I was testing the new BIOS. In Firestrike, with a custom voltage curve, I went to 2126MHz on the core and passed the test, but when I tried games and other benchmarks I immediately got artifacts and then a driver crash. I think I found out how I did it, because I replicated it with the Time Spy benchmark. I used the Steam version of 3DMark: I apply the 2126MHz profile and run the benchmark; when it crashes, I don't close the program, just start the test again. After 2-3 tries, I can pass the whole test without any problems. I don't know why it does that; it's really strange, like it can do it but at the same time it can't.

I have the Gaming Z BIOS on my Gaming X card and I like it too.

Have a fresh look at setting up your vcore and load-line calibration again. I found that with my 1070 installed, my CPU liked a little extra voltage compared to when I had a GTX 660 installed. It improved overclocks and stopped application crashes.

Windows randomly starting background tasks would often be enough to reduce the CPU voltage supply just enough to crash Firestrike, but make it intermittent and difficult to work out what was causing it.

An increase in VCCIO helped eliminate artifacts that would pop up in 2D graphics loads (Chrome) at memory overclocks above +500MHz.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I have not seen a single thing about sata performance but it is controlled from the SOC part of the chip.
> 
> 
> 
> Good, if that's the case. By the way, 378.78 really improved things, and the colors are more vibrant (they pop).
> 
> for pure entertainment between raja koduri & ryan shrout >>
> 
> https://twitter.com/i/web/status/839566712236298240

Can't test the new driver right now. :-(


----------



## shilka

So updating the BIOS on both the video card and the motherboard did NOT help whatsoever; the driver still crashes when I turn my TV off and the power for the HDMI is cut.
Someone said something about bumping the voltage, and others mentioned something called TDR Manipulator?

Edit: someone (can't recall who) sent me this link, so I tried doing what it said:
https://support.microsoft.com/en-us/help/2665946/-display-driver-stopped-responding-and-has-recovered-error-in-windows-7-or-windows-vista

Was that you gtbtk?
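For anyone following along: the workaround that Microsoft KB (and tools like TDR Manipulator) applies is raising the driver's Timeout Detection and Recovery delay in the registry. A minimal .reg sketch of the documented key looks like this; note it only makes Windows wait longer before declaring the driver hung, it does not fix the underlying instability, and it needs a reboot to take effect:

```reg
Windows Registry Editor Version 5.00

; Raise the GPU watchdog timeout from the default 2 seconds to 8 seconds
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers]
"TdrDelay"=dword:00000008
```

Delete the value again to restore the default behavior.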


----------



## GeneO

Quote:


> Originally Posted by *zipper17*
> 
> New Driver 378.78 1080Ti launch
> 
> It seems to have some improvement in some DX12 games.
> 
> Vulkan performances
> 
> http://www.geforce.com/whats-new/articles/tom-clancys-ghost-recon-wildlands-game-ready-driver
> http://www.anandtech.com/show/11180/the-nvidia-geforce-gtx-1080-ti-review/4


I saw no improvement in DX12 or 11, my computer would no longer sleep automatically, and the driver screwed up my MSI Afterburner curves. I reverted by restoring from a backup.

Vulkan is at the same revision, 37. Hiss, boo!


----------



## khanmein

Quote:


> Originally Posted by *GeneO*
> 
> I saw no improvement in DX12 or 11, my computer would no longer sleep automatically, and the driver screwed up my MSI Afterburner curves. I reverted by restoring from a backup.
> 
> Vulkan is at the same revision, 37. Hiss, boo!


Manuel Guzman stated;

"With optimizations, GPU loads tend to increase so your overclocks may need to be turned down a bit."

https://forums.geforce.com/default/topic/997925/official-378-78-game-ready-whql-display-driver-feedback-thread-released-3-9-17-/?offset=22#5103031

I noticed something different with the color gamut and gamma too (looks nicer, more clarity).

Everything is working great except Dota 2's Vulkan mode, which has glitches everywhere, especially the item slots.

(attached: itemslotsglitches.png)


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> cant test the new driver right now. :-(


Oh, you really fried your mobo? What processor & mobo are you going to buy?


----------



## zipper17

I haven't tested 378.78 yet, but I often play Hitman in DX12, so this had better be good.


----------



## gtbtk

Quote:


> Originally Posted by *shilka*
> 
> So updating the BIOS on both the video card and the motherboard did NOT help whatsoever; the driver still crashes when I turn my TV off and the power for the HDMI is cut.
> Someone said something about bumping the voltage, and others mentioned something called TDR Manipulator?
> 
> Edit: someone (can't recall who) sent me this link, so I tried doing what it said:
> https://support.microsoft.com/en-us/help/2665946/-display-driver-stopped-responding-and-has-recovered-error-in-windows-7-or-windows-vista
> 
> Was that you gtbtk?


I didn't send you the MS link, as that really only masks the real problem. I did have timeouts and driver crashes, but I was not using an HDMI-connected monitor; I found that increasing vcore solved my problems. Having said that, I note that the problem seems quite common.

Before you give up, I suggest you use DDU to completely remove all Nvidia drivers and do a clean install. Nvidia drivers can sometimes get messed up and make PCs do strange things; that could be causing this issue.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> cant test the new driver right now. :-(
> 
> 
> 
> Oh, you really fried your mobo? What processor & mobo are you going to buy?

Not 100% sure. I have my fingers crossed that it and my 1070 are not dead, but I can't be sure. It is 6 years old and still performs like a Skylake i5, so it has had a pretty good run. I am up to dead Corsair HX850i power supply number 3 right now (2 have been brand new and both blew up), and I thought I would give the distributor a week's break before showing up again. The last one went bang and even tripped a power breaker in my house. Not a bad effort for an 80+ Platinum PSU that is supposed to have overcurrent and short-circuit protection built in. I think I know where the problem is originating from now, though.

I have been considering Ryzen as an upgrade anyway, but the memory and PCIe performance bottlenecks are disappointing. After looking at their architecture, I'm not so sure that they can fix it up to Intel-like levels. The last AMD PC I owned, about 15 years ago, ran great, but the Microsoft Windows driver support was never quite 100%, so it makes me wonder if Ryzen will have the same future of mostly working but missing the next best thing in a year's time. If I don't go Ryzen, X99 is expensive, so I would probably go for a 7700K. But I have been thinking that going quad core now will probably limit the useful life of the PC as the industry moves towards 6 and 8 cores as standard over the next couple of years.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> No, the fiasco I am talking about is the under performing PCIe bus that the media is not mentioning or asking the question about "WHY games are performing slower than Intel boxes?". It just reminds me of how they ignored the Micron memory issue we had on 1070s and hoped it would just go away.
> 
> The gaming tests are not being run to demonstrate the games' prowess; they are being run to demonstrate CPU performance relative to Intel chips, but they are actually demonstrating the bottleneck of the PCIe controller under load. Slow memory is also being noticed, and even though the memory controller and the PCIe controller are managed by the same part of the chip, no one seems to have worked out that there is some sort of connection. Of course, just like with the 1070, everyone jumps to the wrong conclusion that the memory is at fault when it is the controller that is actually causing the problem. I think memory support will improve, but I'm not so sure that the Ryzen PCIe bandwidth issue is fixable.
> 
> I was watching this video tonight
> 
> 
> 
> the bug is interesting but his observations are not quite complete. He does not notice that in overclock mode, all cores run at the same speed with no turbo, but his pc is running with 2 cores that show 4.2Ghz and the other 6 cores at 3.6. He does make a discovery that the bug actually makes the core clock run about 15% slower than it is supposed to be.
> 
> I did notice something about Ryzen that makes me think they will never be able to improve the PCIE bus to the GPU to the same levels as intel.
> 
> This is what I am thinking. The Ryzen chips are made on a fab that usually makes ARM phone CPUs, The Ryzen chip has been marketed with SOC elements that AMD have not really explained very well. The bug in that video showed the PC waking up and it back in the default mode that allows 2 cores to turbo up. AMD do talk about changing things to "Overclock mode" that only allows all core overclocks and is only done with software. You can turn cores off in software. AMD needs Windows to have the HPET enabled with a software command to work properly, just like vmtools on vmware.
> 
> Qualcomm also made a big deal about running Native windows on an ARM processor about 6 months ago.
> 
> When I put that all together, it says to me that a Ryzen chip is actually really big ARM mobile phone chip that is running an 8 core 64 bit x86 Virtual Machine on top. The EFI is basically a hypervisor. There is nothing intrinsically wrong with that concept, however, if you have a virtual machine there is always a small overhead involved compared to being able to run the code on bare metal.
> 
> We cannot tell there is a performance difference with the Ryzen 8 core processing performance because we do not have a non VM ryzen to compare it to directly so that doesn't really matter, only how it compares to other Intel or AMD FX chips. We can compare the performance of windows memory access and PCIe 3.0 access though, because it is in common with Intel boxes that don't have the virtualization overhead. The performance limitations we are seeing with memory and PCIe 3.0, i think, is the same sort of virtualization overhead you see if you run a VM on VMware. If i am correct, the windows memory performance and PCIe performance will never get to the same levels as Intel Machines.
> 
> 
> 
> his temp for TMPIN3 is hot for 1st boot up & after wake up from sleep, the temp is slightly lower. confirm have bugs & like old days release of x99 also plenty issue too.
> 
> like i said u use higher resolution like above 1440p & turn every settings + AA to the max we can't notice any huge different.
> 
> did u know about the SATA3 on AM4 mobo? if i use 2 SSD + 2 HDD will it cause any bottle-neck? what i know is that if u use 2 units of M.2 NVMe then PCIe running x2.
> 
> i agreed the guy said about the gamer nexus conclusion. i'm not sure bout PCIe but i think u got some point there.

You have seen that Ryzen seems to have a wake-from-sleep bug that slows the BCLK by about 15%, makes Windows think it is boosting at a higher frequency, and disables any overclock you have. That could be why his temps reduce: the PC is back at stock-clock XFR mode, and the P-state auto voltage controls are enabled and can work, reducing power draw and temps.

1440p and above reduces the load on the PCIe controller, because reduced frame rates require fewer draw commands to the GPU. Strangely, none of the reviews I have seen mention NVMe performance that I remember. As it uses the same fabric as the memory and PCIe, I suspect it will have the same limitations. If you use 2 NVMe drives, the second one would have to use the PCH lanes, or both drives would have to share the x4 lanes from the CPU. Either way, not ideal.

I think the SATA3 is off the PCH, so that probably isn't affected by what I am talking about.


----------



## DeathAngel74

Meh, a little off topic: the 1080 Ti Founders Edition just got added to the Step-Up list, $250+tax+shipping. IDK if it's worth it for the 1480MHz base clock / 5500MHz GDDR5X memory... probably Micron. I have 75 days left to decide. What would you all do? 8-pin + 6-pin would be nice, though.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> Meh. A little off topic..1080ti founders edition just got added to step up list. $250+tax+shipping. IDK if it's worth it for 1480 base clock/5500 memory ddr5x...probably Micron. I have 75 days left to decide. What would you all do? 8pin+6pin would be nice though.


The 1080 Ti uses GDDR5X memory, not the GDDR5 that the 1070 uses.

It can only be Micron memory, because no one else makes GDDR5X.

Unless you are planning to water-cool the card, I would wait to see what non-reference cards hit the market and then decide. The Founders Edition coolers are pretty crappy.


----------



## shilka

Quote:


> Originally Posted by *gtbtk*
> 
> I didn't send you the MS link as that really only masks the real problem. I did have timeout and driver crashes but I was not using an HDMI connected monitor. I found that increasing vcore solved my problems. Having said that, I note that the problem seems quite common.
> 
> Before you give up, I suggest you try is to use DDU to completely remove all Nvidia drivers and do a clean install. nvidia drivers can sometimes get messed up and make PCs do strange things. could be causing this issue??


I am going to try that.


----------



## DeathAngel74

Yeah, and on top of that I just spent 99 bucks on the ICX upgrade. I have a while to decide... will have to see what's left after taxes and booking the trip to Hawaii. An i7 7700K Kaby Lake + Z270 MSI XPower Titanium + 1080 Ti + water loop may have to wait for my b-day or Xmas.


----------



## Quadrider10

How are these 1070s with games at 1440p?


----------



## syl1979

Quote:


> Originally Posted by *Quadrider10*
> 
> How are these 1070s with games at 1440p?


Fine most of the time....


----------



## zipper17

Quote:


> Originally Posted by *Quadrider10*
> 
> How are these 1070s with games at 1440p?


I have a single 1070, overclocked as far as I can to its limit.

I play at 1440p locked to 60Hz/60fps, and prefer settings as close to max as possible, PhysX included, with normal AA (nothing exaggerated).

For demanding games I mostly play Witcher 3, GTA 5 and Hitman (2016), and so far it is still sufficient and enjoyable.

It still dips to around 50fps on some frames, so it's not a perfect 60fps all the time, but that may also depend on the CPU/RAM.

If I upgraded to a 1080 Ti, I think 1440p @ 60Hz/60fps would be overkill.

The GTX 1070 is still solid for its price and performance.


----------



## khanmein

@gtbtk;

I personally don't like Corsair stuff except their fans, and @shilka certainly wouldn't recommend you grab an HX model from Corsair. Try EVGA or Super Flower. By the way, he's a pro on PSUs.

Previously my Seasonic G-550 (SSR-550RM-V2), Haswell-ready, caused my PC to auto-reboot during intensive gaming (except when playing Dota 2 or FIFA 15) after only 4 months of use.

Hence I sold it and bought a Seasonic X-750 KM3. FYI, the guy who bought my PSU is still using it now without a single issue, with around 2 years of warranty left.

Ryzen indeed has some bottleneck with storage too. Yeah, going forward, more cores and threads is better.

After watching PCPer with NV's Tom Petersen: new G-SYNC for HDR is coming, and he mentioned games still heavily favor single-threaded performance because of how developers write their engines.

Let's see; 2017 is shaping up to be very interesting for tech.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> @gtbtk;
> 
> i personally don't like Corsair stuff expect their fans but @shilka won't recommend u grab HX model from Corsair for sure. try out EVGA or SuperFlower. by the way, he's pro in PSU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> previously my Seasonic G-550 (SSR-550RM-V2) Haswell ready cause my PC auto reboot during intensive gaming expect playing Dota 2 or Fifa 15 after 4 month usage only.
> 
> hence, i sold it & bought Seasonic X-750 KM3. FYI, the guy who bought my PSU still using now w/o single issue & left around 2 yrs warranty.
> 
> Ryzen indeed have some bottle-neck with storage too. yeah future more cores & threads is more better.
> 
> after watched PCPER w/ NV Tom Peterson new G-SYNC for HDR is coming & he mentioned games still heavily favors single threaded due to the developer writing the engine s/w.
> 
> let's see so far 2017 is very interesting for tech.


I don't think I would buy another Corsair, but I still have another 7 years of warranty left on it. Maybe I will just sell the next replacement unit and buy a new one.

I believe the Cooler Master V series PSUs are supposed to be pretty good, as are the EVGA G3s. Seasonic is not popular in HK.

The CPU has 4x SATA and NVMe in the same area as the memory and PCIe controllers. Everything seems to have high latency issues there at the moment. I think as fast memory support improves it will get better, I hope.


----------



## zipper17

It looks like Ryzen doesn't really have big overclocking headroom in general... commonly only around 4.0-4.1GHz?

https://www.reddit.com/r/5xybp7/silicon_lottery_ryzen_overclock_statistics/

But an Intel chip will likely overclock easily to 4.5GHz and beyond.

And der8auer broke the world record with a Ryzen 1800X overclocked to 5.8GHz on LN2:
http://wccftech.com/ryzen-7-1800x-overclocked-58ghz-ln2/


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Looks like Ryzen as I'm aware not really have big overclock headroom in general...commonly only around 4.0-4.1GHZ ?
> 
> __
> https://www.reddit.com/r/5xybp7/silicon_lottery_ryzen_overclock_statistics/%5B/URL
> but Intel chip will likely easy overclocked ahead to 4.5ghz and beyond.
> 
> and Der8auer break World Recod Ryzen 1800X overclocked to 5.8GHZ
> http://wccftech.com/ryzen-7-1800x-overclocked-58ghz-ln2/


Yes, 4.1 is about the limit. The 1700 runs coolest and can be overclocked to run the same as, or slightly faster than, the 1800X.

Yes, the 7700K has better single-core performance than Ryzen when run at stock with turbo. The 7700K is destroyed in multicore performance, though. The differences are similar to Broadwell-E vs Kaby Lake. That article makes it easy to see the PCIe bottleneck that is causing the slower Ryzen gaming performance; you just need to look at the Firestrike graphs.

LN2 and 1.9V is a wonderful thing for high frequencies


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> I don't think I would buy another Corsair but I still have another 7 years warranty left on it. Maybe I will just sell the next replacement unit and buy a new one.
> 
> I believe the cooler Master V series PSUs are supposed to be pretty good as are the EVGA G3. Seasonic is not popular in HK.
> 
> the CPU has 4 x sata and NVME in the same area as the memory and pcie controllers. I everything seems to have high latency issues there at the moment. I think as fast memory support improves it will get better i hope


I wouldn't recommend you grab a Seasonic. Under load my PSU has a high-pitched noise, sort of like coil whine, but after a while it disappears; maybe it's down to my power delivery/electricity (my Blu-ray player, GoodTV decoder, Sony TV and air conditioner have all died).

Frankly speaking, I'm an anti-CM guy, so no comment, ROFL. CM, Corsair and Seasonic are very popular in Malaysia. If there's seriously no other choice, the Corsair RM/RMi is the way to go.

If you fully occupy all 4 slots on an AM4 board, huge latency can't be avoided and there's no solution either. If you watched it, Tom Petersen said they won't allow users to mod the vBIOS for voltage.

Now I'm thinking of grabbing a GTX 1080.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> Yes 4.1 is about the limit. The 1700 runs coolest and can be overclocked to run the same or slightly faster than 1800.
> 
> yes, the 7700k has a better single core performance than Ryzen when run at stock with turbo. The 7700k is destroyed in multicore performance though. The differences are similar to broadwell-E vs Kaby Lake. That article makes it easy to see the pcie bottleneck that is causing the slower ryzen gaming performance. You just need to look at the firestrike graphs.
> 
> LN2 and 1.9V is a wonderful thing for high frequencies


http://www.linleygroup.com/mpr/article.php?id=11666

https://www.pcper.com/reviews/Processors/AMD-Ryzen-and-Windows-10-Scheduler-No-Silver-Bullet


----------



## shilka

Quote:


> Originally Posted by *Quadrider10*
> 
> How are these 1070s with games at 1440p?


I tested my GTX 1070 with about 40 games at 1440p, and out of all those only 3 could not run at 60fps with everything cranked to max, and that was with 8x AA.

Metro 2033, Ashes of the Singularity and Deus Ex: Mankind Divided were the only games that could not run at 60fps at 1440p, but if you turn off AA and drop a few settings to the second-highest, those games run just fine.
Quote:


> Originally Posted by *gtbtk*
> 
> I don't think I would buy another Corsair but I still have another 7 years warranty left on it. Maybe I will just sell the next replacement unit and buy a new one.
> 
> I believe the cooler Master V series PSUs are supposed to be pretty good as are the EVGA G3. Seasonic is not popular in HK.
> 
> the CPU has 4 x sata and NVME in the same area as the memory and pcie controllers. I everything seems to have high latency issues there at the moment. I think as fast memory support improves it will get better i hope


The Cooler Master V series is nowhere near as good as the EVGA SuperNova G2, let alone the G3.
The V850 is based on the old Seasonic KM3 or KM3S, which is what the newest Seasonic X uses.

They are not bad or anything, just overpriced and inferior compared to the EVGA SuperNova G2 and G3.


----------



## khanmein

Quote:


> Originally Posted by *shilka*
> 
> I tested my GTX 1070 with about 40 games at 1440p, and out of all those only 3 could not run at 60fps with everything cranked to max, and that was with 8x AA.
> The Cooler Master V series is nowhere near as good as the EVGA SuperNova G2, let alone the G3.
> The V850 is based on the old Seasonic KM3 or KM3S, which is what the newest Seasonic X uses.
> 
> They are not bad or anything, just overpriced and inferior compared to the EVGA SuperNova G2 and G3.


i'm using a KM3 but i don't think it's that good, so i never recommend it to others.


----------



## shilka

Quote:


> Originally Posted by *khanmein*
> 
> i'm using a KM3 but i don't think it's that good, so i never recommend it to others.


Sorry, I did not understand?
Another thing I forgot to mention is the KM3 and KM3S are multi rail, despite the fact they are listed, branded and sold as single rail units

Everyone thinks most Seasonic units are single rail due to some clever marketing, but they are in fact multi rail.


----------



## khanmein

Quote:


> Originally Posted by *shilka*
> 
> Sorry, I did not understand?
> Another thing I forgot to mention is the KM3 and KM3S are multi rail, despite the fact they are listed, branded and sold as single rail units
> 
> Everyone thinks most Seasonic units are single rail due to some clever marketing, but they are in fact multi rail.


seriously, single vs multi-rail was discussed a long time ago. at the end of the day, which one is better? (i don't understand)

what i mean is, is the KM3 really good? i think it's just decent.


----------



## shilka

Quote:


> Originally Posted by *khanmein*
> 
> seriously, single vs multi-rail was discussed a long time ago. at the end of the day, which one is better? (i don't understand)
> 
> what i mean is, is the KM3 really good? i think it's just decent.


Multi rail is safer because if something goes wrong, only the rail it is hooked up to is affected and all the other rails are not
That means if your video card short circuits and kills itself, only the rail the video card is hooked up to has damage done to it

Let's say you have a $5000 system with 3-4 video cards and one of those cards blows up; the only damage is to that rail and the rest of the system is fine
That's the upside to multi rail. The downside is that it is much easier to hook too much up to the same rail and overload it, which will cause OCP to kick in and shut the PSU off

It happens fairly regularly that someone with a multi rail PSU will post in the PSU section asking why their PSU shuts off
Or how much more wattage they need, when they in fact don't need more wattage

As for the KM3 and KM3S, they were once very high end platforms, but with newer and better platforms out there they have started to become a bit long in the tooth
I would still consider them high end, but they are a lower tier compared to say the Leadex II.
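To make the OCP point concrete, here's a toy sketch (Python, with made-up wattages and a hypothetical 40A per-rail limit; real units vary) of why a multi-rail unit can shut off while the total load is nowhere near the label rating:

```python
# Toy model: a multi-rail PSU trips per-rail OCP, not total wattage.
# All numbers below are hypothetical; check your own unit's label.

RAIL_OCP_AMPS = 40  # assumed per-rail 12V OCP limit

def rail_ok(loads_watts, volts=12.0):
    """True if the combined load on one 12V rail stays under its OCP limit."""
    amps = sum(loads_watts) / volts
    return amps <= RAIL_OCP_AMPS

# 850W unit, but a 300W card's PCIe plugs plus the CPU all land on one rail:
one_rail = [300, 140, 75]            # ~515W on a single rail = ~43A
print(rail_ok(one_rail))             # False -> PSU shuts off despite low total load

# Same loads split across two rails: each stays well under the limit.
print(rail_ok([300]) and rail_ok([140, 75]))  # True
```

So the "why does my PSU shut off" posts usually come down to cabling, not wattage.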


----------



## khanmein

Quote:


> Originally Posted by *shilka*
> 
> Multi rail is safer because if something goes wrong, only the rail it is hooked up to is affected and all the other rails are not
> That means if your video card short circuits and kills itself, only the rail the video card is hooked up to has damage done to it
> 
> Let's say you have a $5000 system with 3-4 video cards and one of those cards blows up; the only damage is to that rail and the rest of the system is fine
> That's the upside to multi rail. The downside is that it is much easier to hook too much up to the same rail and overload it, which will cause OCP to kick in and shut the PSU off
> 
> It happens fairly regularly that someone with a multi rail PSU will post in the PSU section asking why their PSU shuts off
> Or how much more wattage they need, when they in fact don't need more wattage
> 
> As for the KM3 and KM3S, they were once very high end platforms, but with newer and better platforms out there they have started to become a bit long in the tooth
> I would still consider them high end, but they are a lower tier compared to say the Leadex II.


noted & understood. thanks for the clarification. cheers.


----------



## Quadrider10

Sweet! Thanks guys! I just ordered a 1440p monitor. Was worried about the 1070 keeping up.


----------



## khanmein

Quote:


> Originally Posted by *Quadrider10*
> 
> Sweet! Thanks guys! I just ordered a 1440p monitor. Was worried about the 1070 keeping up.


i7-6700K can handle 1080p with higher refresh rate w/o any hassle. congrats, what's your monitor model?


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I don't think I would buy another Corsair but I still have another 7 years warranty left on it. Maybe I will just sell the next replacement unit and buy a new one.
> 
> I believe the cooler Master V series PSUs are supposed to be pretty good as are the EVGA G3. Seasonic is not popular in HK.
> 
> the CPU has 4 x SATA and NVMe in the same area as the memory and PCIe controllers. Everything seems to have high latency issues there at the moment. I think as fast memory support improves it will get better, i hope
> 
> 
> 
> i won't recommend u grab Seasonic. under load my PSU has a high pitch noise, sorta like coil whine, but after a while it disappears & maybe it's due to my power delivery electricity. (blu-ray player, goodtv decoder, sony tv & air cond spoiled)
> 
> frankly speaking, i'm an anti-CM guy so no comment ROFL. CM, Corsair & Seasonic are very popular at M'sia. if seriously no choice, Corsair RM/RMi is the way to go.
> 
> if u fully occupy all 4 slots on an AM4 board, huge latency can't be avoided & there's no solution either. if u watched the interview, Tom Petersen said they won't allow us to mod the vbios for voltage.
> 
> now i'm thinking to grab a GTX 1080.

I saw the PCPer interview. Of course NV are not going to allow voltage mods; they and the partners offer warranties on the product. If you put 1.5v through the chip, the life expectancy of the chip is measured in weeks, not years.

The architecture of the Ryzen Data Fabric bus, sharing bandwidth between on-chip SATA, NVMe, USB, memory access and PCIe, is head scratching. Finding a way to support high speed memory will help alleviate some of the issues, but 4 sticks will always have a performance penalty over 2 sticks. DF clocks are currently set in a 1:2 ratio of the memory frequency. If that could be changed to say 1:1, or even if it could only go to 1:1.5 (and I don't know if that is possible given hardware sync timings etc.), it would instantly increase the available bandwidth.
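As a back-of-envelope illustration of that 1:2 ratio (the exact Data Fabric behaviour is an assumption here; bus widths and efficiency are not modelled):

```python
# Illustrative only: the fabric clock as a ratio of the effective DDR data
# rate. Zen ships at 1:2 (ratio 0.5), which is why faster RAM helps Ryzen.

def fabric_clock_mhz(ddr_rating, ratio=0.5):
    """Fabric clock in MHz for a DDR4-<ddr_rating> kit at a given ratio."""
    return ddr_rating * ratio

print(fabric_clock_mhz(2400))        # 1200.0 MHz at the stock 1:2 ratio
print(fabric_clock_mhz(3200))        # 1600.0 MHz -- fast RAM widens the fabric
print(fabric_clock_mhz(2400, 1.0))   # 2400.0 MHz if a 1:1 ratio were possible
```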


----------



## gtbtk

Quote:


> Originally Posted by *shilka*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Quadrider10*
> 
> How are these 1070s with games at 1440p?
> 
> 
> 
> I tested my GTX 1070 with about 40 games in 1440p, and out of all of those only 3 could not run at 60 FPS with everything cranked to max, and that was with 8x AA
> 
> Metro 2033, Ashes of the Singularity and Deus Ex: Mankind Divided were the only games that could not run at 60 FPS in 1440p, but if you turn off AA and drop a few settings to the second highest those games run just fine
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I don't think I would buy another Corsair but I still have another 7 years warranty left on it. Maybe I will just sell the next replacement unit and buy a new one.
> 
> I believe the cooler Master V series PSUs are supposed to be pretty good as are the EVGA G3. Seasonic is not popular in HK.
> 
> the CPU has 4 x SATA and NVMe in the same area as the memory and PCIe controllers. Everything seems to have high latency issues there at the moment. I think as fast memory support improves it will get better, i hope
> 
> 
> The Cooler Master V series are nowhere near as good as the EVGA SuperNova G2, let alone the G3
> The V850 is based on the old Seasonic KM3 or KM3S platform, which is what the newest Seasonic X uses
> 
> They are not bad or anything, just overpriced and inferior compared to the EVGA SuperNova G2 and G3.

If I do replace my PSU, I would probably go for a 650 unit. I certainly don't need another 850 right now.

Cooler Master prices are almost 20% cheaper in HK than the EVGA G2 units, which are about the same as Corsair RM units. I know the SuperNovas have a good reputation, and I do know you end up getting what you pay for.


----------



## shilka

Quote:


> Originally Posted by *gtbtk*
> 
> If I do replace my PSU, I would probably go for a 650 unit. I certainly don't need another 850 right now.
> 
> Cooler Master prices are almost 20% cheaper in HK than the EVGA G2 units, which are about the same as Corsair RM units. I know the SuperNovas have a good reputation, and I do know you end up getting what you pay for.


If the V is that much cheaper then there is no reason not to get one
They are much better than the old Corsair RM ever was

Sure, the KM3/KM3S are not the latest and greatest anymore, but they are still very good.


----------



## Motley01

Hey guys,

If I buy the Zotac AMP 1070.

Is it possible to use the BIOS from the AMP Extreme?


----------



## gtbtk

Quote:


> Originally Posted by *Motley01*
> 
> Hey guys,
> 
> If I buy the Zotac AMP 1070.
> 
> Is it possible to use the BIOS from the AMP Extreme?


Yes, you can flash it and it will run.

Make sure you keep a close eye on temps though, as the AMP Extreme BIOS can pull 300W.


----------



## Inelastic

Quote:


> Originally Posted by *gtbtk*
> 
> Not 100% sure. I have my fingers crossed that it and my 1070 are not dead, but I can't be sure. It is 6 years old and still performs like a Skylake i5, so it has had a pretty good run. I am up to dead Corsair HX850i power supply number 3 right now (2 have been brand new and both blew up) and thought I would give the distributor a week's break before showing up again. The last one went bang and even tripped a power breaker in my house. Not a bad effort for an 80+ Platinum PSU that is supposed to have overcurrent and short circuit protection built in. I think I know where the problem is originating from now, though.
> 
> I have been considering Ryzen as an upgrade anyway, but the memory and PCIe performance bottlenecks are disappointing. After looking at their architecture, I'm not so sure that they can fix it up to Intel-like levels. The last AMD PC I owned about 15 years ago ran great, but the Microsoft Windows driver support was never quite 100%, so it makes me wonder if Ryzen will have the same future of mostly working but missing the next best thing in a year's time. If I don't get Ryzen, X99 is expensive, so I would probably go for a 7700K. But I have been thinking that going quad core now will probably limit the useful life of the PC as the industry moves towards 6 and 8 cores as a standard over the next couple of years.


Ugh, that sucks. I've had my Corsair AX1200 for over 6 years with no issue; well other than it's kind of large for the case I have now. I just upgraded my system due to my mobo dying. I was thinking the same thing as you about the cores, but I wound up going with a 7700K. I didn't want to concern myself too much about the future. I'd rather purchase something that works at its full potential now than something that has issues that may or may not be worked out in the coming months. The same thing was said 4 years ago when AMD first released their 8 core processors, and the 8 core Intel processors still don't give that much of an advantage over the 7700K right now (in gaming). Who knows how long it'll be until they do. I'll just wind up upgrading again if the 7700K is insufficient by then.


----------



## RyanRazer

All are great PSUs.
I'd prefer the first two (G2 and RM750x) because of the semi-fanless mode, but power efficiency and power delivery are excellent on all of them. I personally can vouch for the RM750x as I have one. I had an XFX one before, but I sold it after a month because it was extremely loud. This one, the fan never even turns on. It actually stays off the whole time, gaming included. Extremely happy with this PSU.

http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story6&reid=380 - EVGA Supernova G2 750w

http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story4&reid=452 - Corsair RM 750x

http://www.eteknix.com/cooler-master-v750-semi-modular-power-supply-review/all/1/ Coolermaster V750


----------



## H4mm3R2

Hi
Can you look at my problem?
http://www.overclock.net/t/1625355/bitspower-1080-strix-problem-with-temp-and-rgb-on-1070-strix


----------



## gtbtk

Quote:


> Originally Posted by *Inelastic*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Not 100% sure. I have my fingers crossed that it and my 1070 is not dead but I cant be sure. It is 6 years old and still performs like a skylake i5 so it has had a pretty good run. I am up to dead Corsair HX850i power supply number 3 right now (2 have been brand new and both blew up) and though I would give the distributor a weeks break before showing up again. The last one went bang and even tripped a power breaker in my house. Not a bad effort for a 80+ Platinum PSU that is supposed to have overcurrent and short circuit protection built in. I think I know where the problem is originating from now though.
> 
> I have been considering Ryzen as an upgrade anyway but the Memory and PCIe performance bottlenecks are disappointing. After looking at their architecture, I'm not so sure that they can fix it up to Intel like levels. The last AMD PC I owned about 15 years ago ran great but the Microsoft windows driver support was never quite 100% so It makes me wonder if Ryzen will have the same future of mostly working but missing the next best thing in a years time. If I don't get Ryzen, X99 is expensive so I would probably go for a 7700K. But, I have been thinking that going quad core now will probably limit the useful life of the PC as the industry moves towards 6 and 8 cores as a standard over the next couple of years.
> 
> 
> 
> Ugh, that sucks. I've had my Corsair AX1200 for over 6 years with no issue; well other than it's kind of large for the case I have now. I just upgraded my system due to my mobo dying. I was thinking the same thing as you about the cores, but I wound up going with a 7700K. I didn't want to concern myself too much about the future. I'd rather purchase something that works at its full potential now than something that has issues that may or may not be worked out in the coming months. The same thing was said 4 years ago when AMD first released their 8 core processors, and the 8 core Intel processors still don't give that much of an advantage over the 7700K right now (in gaming). Who knows how long it'll be until they do. I'll just wind up upgrading again if the 7700K is insufficient by then.

For just gaming you are right, 8 cores at this point in time don't give that much advantage, but I do use this for things other than gaming, so I was hoping that Ryzen might be a CPU that both performs and has longevity. Right now I am starting to think that the interconnect that ties memory, CPU and PCIe together is sadly under-engineered and, at this point in time, seems to offer about the same bandwidth as a Sandy Bridge CPU. So gaming wise, I would be no better off than I already am. Now I am leaning back towards a 7700K as well.


----------



## gtbtk

Quote:


> Originally Posted by *RyanRazer*
> 
> All are great PSUs.
> I'd prefer the first two (G2 and RM750x) because of the semi-fanless mode, but power efficiency and power delivery are excellent on all of them. I personally can vouch for the RM750x as I have one. I had an XFX one before, but I sold it after a month because it was extremely loud. This one, the fan never even turns on. It actually stays off the whole time, gaming included. Extremely happy with this PSU.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story6&reid=380 - EVGA Supernova G2 750w
> 
> http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story4&reid=452 - Corsair RM 750x
> 
> http://www.eteknix.com/cooler-master-v750-semi-modular-power-supply-review/all/1/ Coolermaster V750


I do like the zero fan mode on the HX850i; I don't think my fan ever turned on either. If 3 hadn't blown up on me in the space of a week I would still be a happy camper.

I live in a 50 year old building, built when one power socket per room was all you would ever need. My computer has to run off a power bar and I am starting to think that maybe it has developed a short that is causing the problem. PSUs are supposed to have short circuit protection, though, which I would think does not include exploding.


----------



## Munross88

Hi all,



I recently upgraded from a Galax GTX 970 to an MSI GTX 1070, and I couldn't be happier! Gaming on my Acer 1080p 144Hz monitor is so much more enjoyable now. In some games my GTX 970 would use 100%+ power and temps would be 70ºC - 75ºC... My new GTX 1070 on the other hand breezes through the same games (with higher settings in some cases) without breaking a sweat, not even coming close to 70ºC

Will definitely be reading through the forums for tips on overclocking my 1070.

Cheers.


----------



## gtbtk

welcome, plenty of good info here


----------



## HAL900




----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> For just gaming you are right, 8 cores at this point in time don't give that much advantage, but I do use this for things other than gaming, so I was hoping that Ryzen might be a CPU that both performs and has longevity. Right now I am starting to think that the interconnect that ties memory, CPU and PCIe together is sadly under-engineered and, at this point in time, seems to offer about the same bandwidth as a Sandy Bridge CPU. So gaming wise, I would be no better off than I already am. Now I am leaning back towards a 7700K as well.


how about this upcoming game?

*use 1.5x speed on this

100k units on one screen; a 6700K 4c/8t struggles.
Damn, as big as a Lord of the Rings massive war


----------



## gtbtk

The extra cores would certainly help in processing that. As games develop they will absolutely become better at using more threads. That is why I do not think Kaby Lake will have much longevity, and I was hopeful that Ryzen might be a good thing; I guess if you can limit the traffic through the SOC with a 4K monitor it is an OK solution. Tuning may help, but no-one has done anything that will address this issue that I am aware of.

At 4K, the reduced frame rates generate less traffic to the PCIe/GPU, so the Data Fabric will have more bandwidth available for thread switching, memory access, disk access etc. and will not create the bottlenecks to the same extent that cause the performance drops that all the YouTubers are talking about but don't understand properly. Looking at the combined results in FS Ultra still shows that Ryzen is restricted at 4K compared to even a 7700K, but not quite as severely as it is at 1080p.

If you run a 1060 with Ryzen, the performance is going to be pretty much the same as anything from Intel at any resolution. As an RX 480 is about the same as a 1060, I wonder if that is the only thing AMD used to design the capacity of the SOC Data Fabric.


----------



## w-moffatt

Quote:


> Originally Posted by *Munross88*
> 
> Hi all,
> 
> 
> 
> I recently upgraded from a Galax GTX 970 to a MSI GTX 1070, and I couldn't be happier! Gaming on my Acer 1080p 144Hz monitor now is so much more enjoyable. With some games my GTX 970 would use 100%+ power and temps would be 70ºC - 75ºC... My new GTX 1070 on the other hand breezes through the same games (with higher settings in some cases) without breaking a sweat and not even coming close to 70ºC
> 
> Will definitely be reading through the forums for tips on overclocking my 1070.
> 
> Cheers.


great minds think alike, love this card as well! Welcome to the club.


----------



## w-moffatt

Quote:


> Originally Posted by *gtbtk*
> 
> The extra cores would certainly help in processing that. As games develop they will absolutely become better at using more threads. That is why I do not think Kaby Lake will have much longevity, and I was hopeful that Ryzen might be a good thing; I guess if you can limit the traffic through the SOC with a 4K monitor it is an OK solution. Tuning may help, but no-one has done anything that will address this issue that I am aware of.
> 
> At 4K, the reduced frame rates generate less traffic to the PCIe/GPU, so the Data Fabric will have more bandwidth available for thread switching, memory access, disk access etc. and will not create the bottlenecks to the same extent that cause the performance drops that all the YouTubers are talking about but don't understand properly. Looking at the combined results in FS Ultra still shows that Ryzen is restricted at 4K compared to even a 7700K, but not quite as severely as it is at 1080p.
> 
> If you run a 1060 with Ryzen, the performance is going to be pretty much the same as anything from Intel at any resolution. As an RX 480 is about the same as a 1060, I wonder if that is the only thing AMD used to design the capacity of the SOC Data Fabric.


Interesting you say this. I was just looking at high CPU usage the other day. Currently I have an i5-7400 playing games at 1440p @ 60Hz on high-ultra detail (depending on the game), and as it stands, current AAA titles (BF1, Ghost Recon Wildlands) use around 80%-90% of my CPU as well as 100% GPU usage. I'm unclear as to whether this is a good/bad thing. Temps don't get high (max around 45C CPU and 71C GPU); I do suffer from a subtle FPS drop, 60-56 and then back up again, so not really noticeable.

The big issue is almost like a "lag" type effect for a brief second in high density areas or at long distance, e.g. the valleys in Wildlands. I'm assuming these newer titles are more CPU dependent, hence the additional resources, but I didn't think they would need to utilise SO much of the CPU.


----------



## khanmein

Quote:


> Originally Posted by *w-moffatt*
> 
> Interesting you say this. I was just looking at high CPU usage the other day. Currently I have an i5-7400 playing games at 1440p @ 60Hz on high-ultra detail (depending on the game), and as it stands, current AAA titles (BF1, Ghost Recon Wildlands) use around 80%-90% of my CPU as well as 100% GPU usage. I'm unclear as to whether this is a good/bad thing. Temps don't get high (max around 45C CPU and 71C GPU); I do suffer from a subtle FPS drop, 60-56 and then back up again, so not really noticeable.
> 
> The big issue is almost like a "lag" type effect for a brief second in high density areas or at long distance, e.g. the valleys in Wildlands. I'm assuming these newer titles are more CPU dependent, hence the additional resources, but I didn't think they would need to utilise SO much of the CPU.


very simple. what i know is that the lower the CPU usage the better, & the higher the GPU usage the better, but if CPU usage is high, like hitting 100% all the time while playing games, w/o any stuttering/jittering then it's acceptable for me.
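For what it's worth, that rule of thumb can be written down as a crude heuristic (my own sketch, not any standard tool; the thresholds are arbitrary):

```python
# Crude bottleneck guesser from sampled utilisation percentages.
# Thresholds (95/90) are arbitrary assumptions, not measured values.

def likely_bottleneck(cpu_pct, gpu_pct):
    """Label which side is probably limiting frame rate."""
    if gpu_pct >= 95 and cpu_pct < 90:
        return "GPU-bound (the ideal case for a gaming rig)"
    if cpu_pct >= 95 and gpu_pct < 90:
        return "CPU-bound (expect stutter in crowded scenes)"
    return "balanced / no clear bottleneck"

print(likely_bottleneck(cpu_pct=85, gpu_pct=100))  # w-moffatt's numbers
print(likely_bottleneck(cpu_pct=100, gpu_pct=70))
```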


----------



## TheBoom

Quote:


> Originally Posted by *w-moffatt*
> 
> Interesting you say this. I was just looking at high CPU usage the other day. Currently I have an i5-7400 playing games at 1440p @ 60Hz on high-ultra detail (depending on the game), and as it stands, current AAA titles (BF1, Ghost Recon Wildlands) use around 80%-90% of my CPU as well as 100% GPU usage. I'm unclear as to whether this is a good/bad thing. Temps don't get high (max around 45C CPU and 71C GPU); I do suffer from a subtle FPS drop, 60-56 and then back up again, so not really noticeable.
> 
> The big issue is almost like a "lag" type effect for a brief second in high density areas or at long distance, e.g. the valleys in Wildlands. I'm assuming these newer titles are more CPU dependent, hence the additional resources, but I didn't think they would need to utilise SO much of the CPU.


Unless you're pushing over 100 fps at least, I don't see how they'd use so much of your CPU though.

Edit: Oops, nvm, didn't realize you were using an i5. That seems about right.


----------



## gtbtk

Quote:


> Originally Posted by *w-moffatt*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The extra cores would certainly help in the processing of that. As games develop they will absolutely become better at using more threads. That is why I do not think Kaby Lake will have much longevity and I was hopeful that Ryzen might be a good thing and I guess, if you can limit the traffic through the SOC with a 4K monitor it is an OK solution. Tuning may help but no-one has done anything that will address this issue that I am aware of.
> 
> At 4K, the reduced frame rates would generate less traffic to the PCIe/GPU so the Data fabric will have more bandwidth available for thread switching, memory access disk access etc. and not generate the bottlenecks to the same extent that cause the performance drops that all the youtubers are talking about but don't understand properly. Looking at the combined results in FS Ultra still shows that Ryzen is restricted at 4K compared to even a 7700K but not quite as severely as it is at 1080p.
> 
> If you run a 1060 with Ryzen, the performance is going to be pretty much the same as anything from Intel at any resolution. As an RX480 is about the same as a 1060, I wonder if that is the only thing that AMD used to design the capacity of the SOC data Fabric?.
> 
> 
> 
> Interesting you say this. I was just looking at high cpu usage the other day. Currently i have a i5-7400 playing games on 1440p @60Hz high-ultra detail (depending on game) and As it stands current AAA titles (BF!, Ghost recon wildlands) use around 80%-90% of my cpu as well as 100% gpu usage. Im unclear as to whether this is a good/bad thing. temps dont get high (max around 45C cpu and GPU 71C), i do suffer from a subtle FPS drop 60-56 and then back up again so not really noticeable.
> 
> the big issue is almost like a "lag" type effect for a brief second when in high density areas or long distance e.g. the valleys in wildlands. I'm assuming these newer titles are more CPU dependant hence the additional resources but i didnt think they would need to utilise SO much of the cpu.

I have been running an i7-2600. It is overclocked to 4.4GHz, but it only has about 60% of the power of a 7700K. Unless you need it, make sure that you turn the Windows 10 GameDVR off.

I have not played either of those games, but I did find that the Windows GameDVR adds additional load to the CPU even if you are not using it. Disabling it helped performance and reduced stutters for me.


----------



## Munross88

Successfully flashed the Gaming Z BIOS onto my MSI 1070 Gaming X

*Before Flash:*


*After Flash:*


----------



## w-moffatt

Quote:


> Originally Posted by *gtbtk*
> 
> I have been running an i7-2600. It is overclocked to 4.4GHz, but it only has about 60% of the power of a 7700K. Unless you need it, make sure that you turn the Windows 10 GameDVR off.
> 
> I have not played either of those games, but I did find that the Windows GameDVR adds additional load to the CPU even if you are not using it. Disabling it helped performance and reduced stutters for me.


Yep, already disabled. Don't know why it does this, but as there is no FPS drop I'm not overly concerned


----------



## w-moffatt

Quote:


> Originally Posted by *TheBoom*
> 
> Unless you're pushing over 100 fps at least, I don't see how they'd use so much of your CPU though.
> 
> Edit: Oops, nvm, didn't realize you were using an i5. That seems about right.


I thought the same, but there are a number of people with Core i7s, also overclocked, having the same issue. I'm thinking it could also be game optimisation.


----------



## muzammil84

hello everyone. I've owned an Inno3D iChill X4 1070 for a while now and it's been ok, but i wonder if there's any way to squeeze a bit more out of it. i mostly mean the BIOS; is there any BIOS that will allow for a higher voltage/overclock? The GPU has a reference PCB. max OC seems to be just under 2100MHz on water; i wonder if a different BIOS would allow me to push it further?


----------



## gtbtk

you could try

Quote:


> Originally Posted by *muzammil84*
> 
> hello everyone. I've owned a Inno3d iChill x4 1070 for a while now and it's been ok but i wonder if there's any way to squeeze a bit more out of it, i mostly mean BIOS, is there any bios that will allow for higher voltage/overclock? Gpu has reference pcb. max oc seems to be just under 2100mhz on water, i wonder if different bios would allow me to push it further?


You can't do anything about voltages. Higher voltages do not help Pascal all that much anyway.

A Palit Super JetStream BIOS will give you an extra 10W power limit and about 13 extra MHz base clock, but I would think overclocking with the Afterburner curve will give you the most benefit.

I am assuming you are running 1.093v and using the slider to OC the card. To get you started, set AB to your max stable overclock, then open the AB voltage curve window (Ctrl-F), click on only the voltage point that matches 0.950v, drag that up to 2025MHz, click apply and test performance. If it is stable, increase by another 25MHz and test again. If the test crashes, reduce the last setting by 13MHz and test again, until you get that point stable.

After you have got that point stable at a higher clock speed, select the 1.093v point and start increasing that point by 25MHz at a time, and test until you find the max stable frequency.

Those 2 points should give you a bump in FPS. Experiment with other points on the curve and see if they help you as well. You will discover that at least one point will be unstable if you increase it at all, because when you use the slider to OC, the whole curve moves up together. When you are running, the GPU will try to use that one voltage level with the lower headroom and crash.

I have actually found on my card that I get better performance with the absolute highest 0.950v point, even if I have to use a slightly lower level at 1.093v.

If you discover another point that gives you a good boost over what I have described, let us know back here.
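The trial-and-error loop above can be sketched like this (Python; `run_benchmark` is a stand-in for a real Heaven/Firestrike run, and the stability ceiling is completely made up so the search logic itself can be followed):

```python
# Sketch of the per-voltage-point tuning loop: push +25 MHz while stable,
# then back off one ~13 MHz clock bin after the first crash.

TRUE_LIMIT_MHZ = 2063  # hypothetical stability ceiling for the 0.950v point

def run_benchmark(clock_mhz):
    """Placeholder for a real benchmark run: True if it didn't crash."""
    return clock_mhz <= TRUE_LIMIT_MHZ

def tune_point(start_mhz=2025, step=25, backoff=13):
    clock = start_mhz
    while run_benchmark(clock + step):   # stable? push another +25 MHz
        clock += step
    # the next +25 MHz step crashed, so step back one ~13 MHz bin and re-test
    candidate = clock + step - backoff
    return candidate if run_benchmark(candidate) else clock

print(tune_point())  # highest clock the fake benchmark accepted
```

You would repeat the same loop for the 1.093v point, with a real benchmark in place of the fake one.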


----------



## muzammil84

Quote:


> Originally Posted by *gtbtk*
> 
> you could try
> You can't do anything about voltages. Higher voltages do not help Pascal all that much anyway.
> 
> A Palit Super JetStream BIOS will give you an extra 10W power limit and about 13 extra MHz base clock, but I would think overclocking with the Afterburner curve will give you the most benefit.
> 
> I am assuming you are running 1.093v and using the slider to OC the card. To get you started, set AB to your max stable overclock, then open the AB voltage curve window (Ctrl-F), click on only the voltage point that matches 0.950v, drag that up to 2025MHz, click apply and test performance. If it is stable, increase by another 25MHz and test again. If the test crashes, reduce the last setting by 13MHz and test again, until you get that point stable.
> 
> After you have got that point stable at a higher clock speed, select the 1.093v point and start increasing that point by 25MHz at a time, and test until you find the max stable frequency.
> 
> Those 2 points should give you a bump in FPS. Experiment with other points on the curve and see if they help you as well. You will discover that at least one point will be unstable if you increase it at all, because when you use the slider to OC, the whole curve moves up together. When you are running, the GPU will try to use that one voltage level with the lower headroom and crash.
> 
> I have actually found on my card that I get better performance with the absolute highest 0.950v point, even if I have to use a slightly lower level at 1.093v.
> 
> If you discover another point that gives you a good boost over what I have described, let us know back here.


awesome, this is extremely helpful, i appreciate it a lot mate. I've read somewhere some time ago that ppl tried working with that curve and locked voltages, and it wouldn't idle even in 2D (desktop, browser etc). not sure if that applies to what you said, but i guess the curve means there's variable voltage and core speed which you can manually edit (as opposed to the slider, which keeps the same curve shape all the time and just bumps it up?).


----------



## DeathAngel74

Stupid luxmark64.exe keeps crashing. Even tried upping VCCIO and VCCSA and running -100 under the factory overclock.


----------



## DeathAngel74

Never mind, the problem was caused by PrecisionX OC running at the same time ;[


----------



## icold

I flashed my 1070 Strix to the Strix OC BIOS; now the power target is 120% and the clock cap caused by the stock BIOS is fixed. Now my GPU can run at a fixed 2138MHz until [email protected] 1.093mV. Flashing to the OC card's BIOS was well worth it.


----------



## gtbtk

Quote:


> Originally Posted by *muzammil84*
> 
> Awesome, this is extremely helpful, I appreciate it a lot, mate. I've read somewhere some time ago that people tried working with that curve and locked voltages, and the card wouldn't idle even in 2D (desktop, browser, etc.). Not sure if that applies to what you said, but I guess "curve" means there is a variable voltage/core-speed relationship you can manually edit (as opposed to the slider, which keeps the same curve shape all the time and just bumps it up?).


No worries.

The Curve adjustments I am talking about do not involve trying to lock the voltage high.

The locked-voltage curve trick was a method to work around what turned out to be the Micron memory controller bug, and stop the card from crashing the PC if you tried to overclock the memory on cards with Micron VRAM. If the card jumped from an idle P-state up to P-state 0 when the GPU came under load, it would artifact the screen and then BSOD. Chrome uses 2D acceleration, and starting it up would trigger crashes as well. Since the BIOS update that resolved that bug was released last November, it is not relevant any more.

Adjusting the curve is a feature that opened up to end users with Pascal cards. Default curves are set up as a smooth arc, and the slider just moves that entire arc up and down when you overclock the card. That was also true for Maxwell, Kepler, etc.

Because silicon is not perfect, you may be stable with an offset of +100 at every voltage point below, say, 1.000V, while 1.013V may only be stable offset up by +75, 1.025V may only be stable up to +50 above stock, and then every point above 1.025V may be OK offset by +125.

Using my example, overclocking with the slider will only be stable up to +50, because when you are running something on the card it accesses all the points along the curve at some stage, and at some point it will try to use the 1.025V level and go unstable if you have overclocked above +50. The problem is that this leaves all the higher-offset OC potential at every point that is not 1.025V unused.

Access to the curve gives us the chance to find any of those bits that we could not reach before, while leaving the point with low OC potential at its low level. The challenge is finding exactly what your card's limit is at each point. Most people don't bother, but I found it really cool.

I also discovered that Nvidia cards seem to use voltage levels to control different functional aspects of the card. The 0.950V point is the one that adjusts what HWiNFO64 refers to as the "video clock", and I found that I could get the best performance by setting that point as high as possible, even if the top frequency reported by Afterburner was not as high as it could be with a lower 0.950V setting.

I can run my card at 2164MHz, but the performance is not as good as I get with 0.950V at 2037 and the 1.063V point at 2076 (voltage +0), or 1.093V at 2144 (voltage +100). Of course your card may be different, and you are water cooling, which I am not. The +0 voltage does let you run a bit cooler, but that is probably not that relevant to you.
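The slider-vs-curve argument above can be shown with a toy model. The per-point headroom numbers below are made up purely to mirror the example in the post, not measurements from any card:

```python
# Illustrative per-voltage-point stable offsets (MHz) from the example above.
# These numbers are hypothetical, chosen only to match the post's scenario.
headroom = {0.950: 100, 1.000: 100, 1.013: 75, 1.025: 50, 1.093: 125}

# The slider shifts the whole curve at once, so its stable range is capped
# by the weakest point; editing points individually uses each one's headroom.
slider_max = min(headroom.values())
wasted = sum(h - slider_max for h in headroom.values())

print(slider_max)  # the 1.025V point caps the slider OC at +50
print(wasted)      # offset potential (MHz-points) left unused by slider-only OC
```

This is why per-point curve editing can beat a plain slider offset even on the same silicon: the one weak voltage point no longer drags every other point down with it.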


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> I flashed my 1070 Strix to the Strix OC BIOS; now the power target is 120% and the clock cap caused by the stock BIOS is fixed. Now my GPU can run at a fixed 2138MHz until [email protected] 1.093mV. Flashing to the OC card's BIOS was well worth it.


You also went from a 170W to a 200W power limit. Certainly helps with performance, doesn't it?


----------



## gtbtk

Quote:


> Originally Posted by *Munross88*
> 
> Successfully flashed the Gaming Z bios onto my Gaming X MSI 1070
> 
> 
> *Before Flash:*
> 
> 
> *After Flash:*


I am doing the same with mine. I like the Z BIOS.


----------



## gtbtk

Quote:


> Originally Posted by *w-moffatt*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I have been running an i7-2600. It is overclocked to 4.4GHz, but it only has about 60% of the power of a 7700K. Unless you need it, make sure that you turn the Windows 10 GameDVR off.
> 
> I have not played either of those games, but I did find that the Windows GameDVR adds additional load to the CPU even if you are not using it. Disabling it helped performance for me and reduced stutters.
> 
> 
> 
> Yep, already disabled. Don't know why it does this, but as there is no FPS drop I'm not overly concerned.

You said that the lag happens during scenes with heavy build-up and high detail. That does require more rendering, and as a result more calculation, than wide-open places. It does sound like you are hitting a CPU bottleneck at those odd times. I do think that non-top-of-the-line i5 CPUs are getting to the point in history where they will be left behind by new-generation games. Not there yet, but it is coming, I think.

Windows 10 has a habit of running background tasks at various times that, if you are under a heavy gaming load, may be enough to hit the limits of the CPU. Have you tried shutting off services that you don't need?

I have found that things like Asus AI Suite also put a load on the hardware that has an impact on performance. I don't know about other manufacturers' utilities, but the same may be true there as well.


----------



## icold

Looks like the 170W BIOS throttles a lot: the clocks drop to around 2050MHz and max out at 2126MHz @ 1.093mV. Now the clocks only drop to 2138MHz and max out at [email protected]; max temps are 60C at 80% fan. I'm going to try changing the stock thermal paste to a top paste to drop a few more degrees and try to fix the clock at 2151MHz, after the warranty finishes.


----------



## zeeee4

Any advancements in the overclocking department? Any BIOS tweaker yet? Can we even change the voltage on these cards in MSI Afterburner, and does that actually change the voltage or not? It didn't with my last card, a GTX 970.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Looks like the 170W BIOS throttles a lot: the clocks drop to around 2050MHz and max out at 2126MHz @ 1.093mV. Now the clocks only drop to 2138MHz and max out at [email protected]; max temps are 60C at 80% fan. I'm going to try changing the stock thermal paste to a top paste to drop a few more degrees and try to fix the clock at 2151MHz, after the warranty finishes.


If you are using the slider to OC, it will always move the clocks around. If you are using curves, at +100 voltage use the 1.081V point instead of the 1.093V point; that leaves voltage headroom to maintain frequency levels longer.

You should maybe also try an overclock at 1.05V to 1.063V. I found that I can get about the same performance as at 1.093V, but the lower voltage gives lower temps.


----------



## gtbtk

Quote:


> Originally Posted by *zeeee4*
> 
> Any advancements in the overclocking department? Any BIOS tweaker yet? Can we even change the voltage on these cards in MSI Afterburner, and does that actually change the voltage or not? It didn't with my last card, a GTX 970.


Curves are fun

no

yes and yes - if your card has a voltage controller that MSI AB can manage.

There is a way to add your voltage controller's support to AB manually. The PDF that comes in one of the application directories has some instructions about the database, and the Guru3D forums have a thread with details as well.


----------



## TheBoom

Anyone know how to get Just Cause 3 working with the 1070?

I'm getting DXGI crashes even with the card downclocked to 1900MHz.


----------



## icold

For me Just Cause 3 works fine with an OC. Try updating your BIOS.


----------



## TheBoom

Mine's an Asus Strix OCG. Samsung memory. There isn't a BIOS update for it.


----------



## gtbtk

Quote:


> Originally Posted by *TheBoom*
> 
> Anyone know how to get Just Cause 3 working with the 1070?
> 
> I'm getting DXGI crashes even with the card downclocked to 1900MHz


My first suggestion would be to use DDU to completely remove the Nvidia drivers and do a fresh, clean install.


----------



## Quadrider10

Anyone know if I can flash the Gigabyte Xtreme Gaming BIOS onto a Gigabyte G1 Gaming 8GB with Samsung memory?


----------



## gtbtk

Quote:


> Originally Posted by *Quadrider10*
> 
> Anyone know if I can flash the Gigabyte Xtreme Gaming BIOS onto a Gigabyte G1 Gaming 8GB with Samsung memory?


It should work. I flashed a copy of it to my Gaming X and it ran. I have also flashed the G1 BIOS in the past, and it ran fine. It is a 240W power limit BIOS, so it might be worth not turning the power limit slider all the way up to 100%, lest it burn up your VRM; make sure you keep an eye on temps.

YMMV, but the factory overclock was at about the limit of what my card could cope with, so I could never get it really stable. My card was restricted in overclocking around the 1V-1.025V level; I have tuned that out for the most part now, but I have not tried flashing it again recently.


----------



## HAL900

You can, but it does not make much sense, because you lose the Xtreme Gaming program and silent mode.


----------



## icold

Quote:


> Originally Posted by *gtbtk*
> 
> It should work. I flashed a copy of it to my Gaming X and it ran. I have also flashed the G1 BIOS in the past, and it ran fine. It is a 240W power limit BIOS, so it might be worth not turning the power limit slider all the way up to 100%, lest it burn up your VRM; make sure you keep an eye on temps.
> 
> YMMV, but the factory overclock was at about the limit of what my card could cope with, so I could never get it really stable. My card was restricted in overclocking around the 1V-1.025V level; I have tuned that out for the most part now, but I have not tried flashing it again recently.


What is the best 1070 bios?


----------



## Quadrider10

So what you are saying is that the fans will not shut off? I'm just looking for a higher-wattage card's BIOS to flash mine to. The G1 throttles itself like hell because it always surpasses the power limit.


----------



## HAL900

Why do you need a bigger limit? Even FurMark does not touch it.

Do you have the G1 or the Xtreme?
The BIOSes are interchangeable, and you can even use ones from other cards.


----------



## Quadrider10

Quote:


> Originally Posted by *HAL900*
> 
> Why do you need a bigger limit? Even FurMark does not touch it.
> 
> Do you have the G1 or the Xtreme?
> The BIOSes are interchangeable, and you can even use ones from other cards.


Nope. I have a G1 gaming.


----------



## HAL900

You can flash either BIOS then. The F2 and F3 versions exist for both Samsung and Micron memory.


----------



## shilka

Seems like nothing I do stops my GTX 1070 drivers from crashing.
Messing around with Windows settings did jack to help the problem.

Tried bumping the vcore in the Xtreme software, but at this point I don't think it will help either.
The only option I seem to have left is to move to Windows 10, but I want a new SSD before I install a new Windows.

What's most annoying is that the driver crashes are so random.
Sometimes days will go by without a single crash, and other days it will crash multiple times.

If moving to Windows 10 does not help, then the card must be broken, and at this point I am fed up with Gigabyte.
I am actually thinking about selling my GTX 1070 Xtreme Gaming and buying an Asus GTX 1080 or GTX 1080 Ti Strix instead!


----------



## khanmein

Quote:


> Originally Posted by *shilka*
> 
> Seems like nothing I do stops my GTX 1070 drivers from crashing.
> Messing around with Windows settings did jack to help the problem.
> 
> Tried bumping the vcore in the Xtreme software, but at this point I don't think it will help either.
> The only option I seem to have left is to move to Windows 10, but I want a new SSD before I install a new Windows.
> 
> What's most annoying is that the driver crashes are so random.
> Sometimes days will go by without a single crash, and other days it will crash multiple times.
> 
> If moving to Windows 10 does not help, then the card must be broken, and at this point I am fed up with Gigabyte.
> I am actually thinking about selling my GTX 1070 Xtreme Gaming and buying an Asus GTX 1080 or GTX 1080 Ti Strix instead!


Can you confirm your Gigabyte X99 board has no issues? Latest BIOS?


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> It should work. I flashed a copy of it to my Gaming X and it ran. I have also flashed the G1 BIOS in the past, and it ran fine. It is a 240W power limit BIOS, so it might be worth not turning the power limit slider all the way up to 100%, lest it burn up your VRM; make sure you keep an eye on temps.
> 
> YMMV, but the factory overclock was at about the limit of what my card could cope with, so I could never get it really stable. My card was restricted in overclocking around the 1V-1.025V level; I have tuned that out for the most part now, but I have not tried flashing it again recently.
> 
> 
> 
> What is the best 1070 bios?

I think it depends on the hardware you are using.

I quite liked the Strix OC BIOS on my MSI Gaming. For a little while I was getting more performance with that BIOS than with the original MSI one. I have gone back to the Gaming Z BIOS now, and that is the one I have had the most success with. The Gigabyte Xtreme Gaming BIOS was not very stable on my hardware, but it may work well on a different card. You only know if you try it and see. Do your research first and experiment after making informed decisions; don't just flash anything, because that will ultimately give you a headache.

The MSI card does have an advantage over the Strix card for cross-flashing. The MSI card has 6+8-pin power and a higher-phase VRM with a 291W max power limit, which helps spread the heat load further. The Strix is only 8-pin with a 6-phase VRM (I think), so using, say, a Zotac Extreme 300W BIOS is probably not a good idea on a Strix card: even though the VRM can deliver that amount of power, cooling may become a problem. I can get away with it on the MSI cards because they are designed to deal with 291W loads anyway. You probably only want to look at BIOSes that limit power draw to a max of 240-250W, which is most of them anyway. Do not try the HOF BIOS, because it is not compatible, and be very careful with BIOSes from China-only model cards, because some of them also have different voltage regulators, like the HOF models. A number of China models put a different-coloured/styled cooler on a standardized board that may be sold outside China as a Galax or PNY, so you need to research to know what you are getting.


----------



## gtbtk

Quote:


> Originally Posted by *shilka*
> 
> Seems like nothing I do stops my GTX 1070 drivers from crashing.
> Messing around with Windows settings did jack to help the problem.
> 
> Tried bumping the vcore in the Xtreme software, but at this point I don't think it will help either.
> The only option I seem to have left is to move to Windows 10, but I want a new SSD before I install a new Windows.
> 
> What's most annoying is that the driver crashes are so random.
> Sometimes days will go by without a single crash, and other days it will crash multiple times.
> 
> If moving to Windows 10 does not help, then the card must be broken, and at this point I am fed up with Gigabyte.
> I am actually thinking about selling my GTX 1070 Xtreme Gaming and buying an Asus GTX 1080 or GTX 1080 Ti Strix instead!


Have you tried uninstalling drivers with DDU and done a totally clean install of the latest driver package?


----------



## gtbtk

Quote:


> Originally Posted by *TheBoom*
> 
> Anyone know how to get Just Cause 3 working with the 1070?
> 
> I'm getting DXGI crashes even with the card downclocked to 1900MHz


Try using DDU and doing a completely clean install of new drivers


----------



## gtbtk

Quote:


> Originally Posted by *Quadrider10*
> 
> So what you are saying is that the fans will not shut off? I'm just looking for a higher-wattage card's BIOS to flash mine to. The G1 throttles itself like hell because it always surpasses the power limit.


I'm saying the Xtreme card has 6+8-pin power input and a many-phase VRM, while the G1 only has 8-pin power and a lower-end VRM.

The Xtreme BIOS will pull more power and create more heat doing it. It is best to be aware of the hardware limitations the G1 has before you try it out. The flash itself should work. The VRM phase/voltage controller configuration is automatic and done by the hardware; the BIOS has only been configured to say "my limit is 200W", or whatever the hardware is sold as. A cross-flashed BIOS could potentially ask for more than the available hardware can deliver. If you pay attention, that is not a problem: you can monitor it, and if it goes crazy, shut it down and flash back. If you are not aware and don't pay attention, you could flash the card and miss something that kills it, through overheating for example. Some cross-flashed BIOSes do strange things with the card fan, and the max speed ends up quite low. If you know to check, you'll see that it is not working as expected and can reflash.

I don't have a G1 and I don't have any past experience with that combination, so I can't tell you "sure, it works 100% fine for sure".


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> No worries.
> 
> The Curve adjustments I am talking about do not involve trying to lock the voltage high.
> 
> The locked voltage high using the curve was a method to work around what turned out to be the micron controller bug and stop the card from crashing the PC if you tried to overclock the memory on cards that had Micron Vram. If the card jumped from an Idle p-state up to the p-state 0 when the GPU was under load, it would artifact the screen and then BSOD. Chrome uses 2d Acceleration and starting that up would also trigger crashes as well. Since the bios update that resolved that bug was released last November, it is not relevant any more.
> 
> Adjusting the curve is a feature that opened up to end users with pascal cards. Default curves are set up as a smooth arc and the slider just moves that entire arc up and down when you Overclock the card. That was also true for Maxwell and kepler etc.
> 
> Because silicon is not perfect, you may be stable with an offset of +100 at every voltage point below, say, 1.000V, while 1.013V may only be stable offset up by +75, 1.025V may only be stable up to +50 above stock, and then every point above 1.025V may be OK offset by +125.
> 
> Using my example, overclocking with the slider will only be stable up to +50, because when you are running something on the card it accesses all the points along the curve at some stage, and at some point it will try to use the 1.025V level and go unstable if you have overclocked above +50. The problem is that this leaves all the higher-offset OC potential at every point that is not 1.025V unused.
> 
> Access to the curve gives us the chance to find any of those bits that we could not reach before, while leaving the point with low OC potential at its low level. The challenge is finding exactly what your card's limit is at each point. Most people don't bother, but I found it really cool.
> 
> _I also discovered that Nvidia cards seem to use voltage levels to control different functional aspects of the card. the .950 point is the one that will adjust what is referred to a the "video Clock" by HWinfo64 and I found that I could get the best performance by setting that point as High as possible even if the top frequency reported by Afterburner was not as high as it could be with a lower .950 voltage.
> _
> I can run my card at 2164Mhz but the performance is not as good as I can get with .950 at 2037 and 1.063 point 2076 (voltage 0) or 1.093v 2144 (voltage 100). Of course your card may be different and you are water cooling which I am not. The +0 voltage does allow you to run a bit cooler but that is probably not that relevant to you.


On my card the video clock seems to max out at 1771MHz at 0.850V with my current settings.



What does this "video clock" actually mean? The curve method is still pretty much a mystery to me.

So far, whether overclocking beyond the traditional +75MHz or using various curve methods, my card simply crashes a lot in 3DMark/stress tests.

I tried increasing VCCSA/VCCIO like you suggested, up to the safe maximum recommended for Ivy Bridge; I think it is still not helping. I still get the exact same crash, and memory artifacts beyond +600MHz.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> On my card the video clock seems to max out at 1771MHz at 0.850V with my current settings.
> 
> What does this "video clock" actually mean? The curve method is still pretty much a mystery to me.
> 
> So far, whether overclocking beyond the traditional +75MHz or using various curve methods, my card simply crashes a lot in 3DMark/stress tests.
> 
> I tried increasing VCCSA/VCCIO like you suggested, up to the safe maximum recommended for Ivy Bridge; I think it is still not helping. I still get the exact same crash, and memory artifacts beyond +600MHz.


You can also try making small adjustments to CPU PLL and/or increasing vcore very slightly. All of those improved performance on my Z68 Sandy Bridge rig. I am not near it now to check, but I think my VCCIO is at 1.185V (1.05V default) and PLL at 1.8125V (1.8V default), so the adjustments were not large on my PC. I also changed from fixed voltage to offset voltage, and I am letting the VID jump up to 1.365V. All of those things together helped improve my stability and performance. Maybe the benefit comes not from hitting maximums but from increasing both a little bit. I am not familiar with Ivy Bridge, so I don't know how much changed between the two chips, but every time you experiment you learn something new.

The curve does not match or report the video clock speed. HWiNFO does report it, and it existed even with Kepler, but you could not directly change it back then.

I have no idea what the official definition is; Nvidia never mentions it in any doc that I have seen. I have just found that gaming graphics performance improves the faster I can get it running, by increasing the 0.950V voltage point.

I know that these cards have a number of different P-states that they run at depending on what type of load they are under. P-state 0 is the 3D gaming P-state; P-state 2 is the state the card goes to if you run an OpenCL renderer like LuxMark. Adjusting the curve relates directly to P-state 0, but the lower states seem to follow along and end up overclocked as well. I have never tried to match up the video clock to an actual P-state, because until your question it had never occurred to me. I suspect that the video clock is actually related to one of the other P-states the card can operate at. If you want to take a look at the different P-states, you can use Nvidia Inspector, a free overclocking utility. It is a bit clunky and I would not use it for overclocking, but it is handy for looking at P-states. It also comes with a utility to change the hidden settings that Nvidia Control Panel doesn't show you.

I just know from experience that if I set my 0.950V point to 2037MHz, which is as high as I can go and stay stable, HWiNFO reports that the video clock boosts up to about 1840MHz, and with the 1.063V point at 2076 and memory at +630MHz, I score about 21000 in the Firestrike graphics score.


----------



## TheBoom

Quote:


> Originally Posted by *gtbtk*
> 
> Try using DDU and doing a completely clean install of new drivers


Tried that already.

It seems to be a JC3 issue that happens on some systems only.


----------



## gtbtk

Quote:


> Originally Posted by *TheBoom*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Try using DDU and doing a completely clean install of new drivers
> 
> 
> 
> Tried that already.
> 
> Seems that its a JC3 issue that happens with some systems only.

I have never played JC3.

One thing that can cause issues that are hard to track down is mismatched versions of the Visual C++ redistributable installed on your rig. You can get the latest versions of all of them from Microsoft.

Alternatively, look in the JC3 install files: if the redistributable installer is there, see if that version is the same as the one you have installed already. You may only need to do a repair on the existing ones.


----------



## shilka

Quote:


> Originally Posted by *khanmein*
> 
> confirm your x99 giga no issue? latest bios?


Yes, it's the newest BIOS.
Quote:


> Originally Posted by *gtbtk*
> 
> Have you tried uninstalling drivers with DDU and done a totally clean install of the latest driver package?


I already tried that.

I have tried pretty much all the drivers Nvidia has, I have tried using DDU, and nothing helps.
I tried using multiple PCI-E slots, and I have tried both overclocking and underclocking the card.

I also tried messing around with Windows settings:
https://support.microsoft.com/en-us/help/2665946/-display-driver-stopped-responding-and-has-recovered-error-in-windows-7-or-windows-vista

NOTHING HELPS!!! I tried overvolting the card by 3%, but I think it will do jack.
The only thing I have not tried yet is moving to Windows 10.

Should I just RMA this piece of crap card?


----------



## gtbtk

Quote:


> Originally Posted by *shilka*
> 
> Quote:
> 
> 
> 
> Originally Posted by *khanmein*
> 
> confirm your x99 giga no issue? latest bios?
> 
> 
> 
> Yes its the newest BIOS
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Have you tried uninstalling drivers with DDU and done a totally clean install of the latest driver package?
> 
> 
> I already tried that.
> 
> I have tried pretty much all the drivers Nvidia has, I have tried using DDU, and nothing helps.
> I tried using multiple PCI-E slots, and I have tried both overclocking and underclocking the card.
> 
> I also tried messing around with Windows settings:
> https://support.microsoft.com/en-us/help/2665946/-display-driver-stopped-responding-and-has-recovered-error-in-windows-7-or-windows-vista
> 
> NOTHING HELPS!!! I tried overvolting the card by 3%, but I think it will do jack.
> The only thing I have not tried yet is moving to Windows 10.
> 
> Should I just RMA this piece of crap card?
Click to expand...

Have you ever checked DPC latency? You can download LatencyMon, and it may tell you what is causing the timeouts through high latency.

Do you use the Nvidia audio out over HDMI or DP? If not, disable Nvidia audio in Device Manager.

You could try enabling message-signaled interrupts, i.e. switching the device to MSI mode. You must edit your graphics card's registry key. You can follow these steps:

Backup or at least create a system restore point before making changes to be safe.

Open Device manager

find the listing for your Graphics adapter

Invoke device properties dialog.

Switch to "Details" tab.

Select "Device Instance Path" in "Properties" dropdown box.

Write down "Value" (for example "PCI\VEN_1002&DEV_4397&SUBSYS_1609103C&REV_00\3&11583659&0&B0").

This is a relative registry path under the key "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\PCI\VEN.......".

Go to that device's registry key ("HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\PCI\VEN_1002&DEV_4397&SUBSYS_1609103C&REV_00\3&11583659&0&B0") and locate the subkey "Device Parameters\Interrupt Management".

For devices working in MSI mode there will be a subkey "Device Parameters\Interrupt Management\*MessageSignaledInterruptProperties*", and in that subkey there will be a DWORD value "*MSISupported*" equal to "0x00000001".

If the MSISupported value is missing, add it manually.

You can export a copy of the key you just created to a .reg file, so it is easy to reapply after your next driver update by double-clicking the file and rebooting.

Reboot the PC.
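The path construction in the steps above can be sketched as a small helper that turns a Device Instance Path into the full registry key and a .reg file that sets MSISupported. This is illustrative only: the instance path is the example value from the steps above, not a real GPU's.

```python
# Sketch: build the MSI-mode registry key and a .reg export for it.
# The instance path below is the example from the steps above (hypothetical).
BASE = r"HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum"

def msi_mode_reg(instance_path):
    """Return .reg file text that sets MSISupported=1 for the given device."""
    key = (BASE + "\\" + instance_path +
           r"\Device Parameters\Interrupt Management\MessageSignaledInterruptProperties")
    return (
        "Windows Registry Editor Version 5.00\r\n\r\n"
        "[" + key + "]\r\n"
        '"MSISupported"=dword:00000001\r\n'
    )

example = r"PCI\VEN_1002&DEV_4397&SUBSYS_1609103C&REV_00\3&11583659&0&B0"
print(msi_mode_reg(example))
```

Saving that output as a .reg file and double-clicking it is the "export a copy of the key" shortcut mentioned above, so you can reapply the setting after a driver update.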


----------



## khanmein

Visual C++ Runtime Installer credit to thatguy91
Quote:


> Originally Posted by *shilka*
> 
> Yes, it's the newest BIOS.
> I already tried that.
> 
> I have tried pretty much all the drivers Nvidia has, I have tried using DDU, and nothing helps.
> I tried using multiple PCI-E slots, and I have tried both overclocking and underclocking the card.
> 
> I also tried messing around with Windows settings:
> https://support.microsoft.com/en-us/help/2665946/-display-driver-stopped-responding-and-has-recovered-error-in-windows-7-or-windows-vista
> 
> NOTHING HELPS!!! I tried overvolting the card by 3%, but I think it will do jack.
> The only thing I have not tried yet is moving to Windows 10.
> 
> Should I just RMA this piece of crap card?


last resort RMA & don't pick GIGA anymore.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *shilka*
> 
> Yes, it's the newest BIOS.
> I already tried that.
> 
> I have tried pretty much all the drivers Nvidia has, I have tried using DDU, and nothing helps.
> I tried using multiple PCI-E slots, and I have tried both overclocking and underclocking the card.
> 
> I also tried messing around with Windows settings:
> https://support.microsoft.com/en-us/help/2665946/-display-driver-stopped-responding-and-has-recovered-error-in-windows-7-or-windows-vista
> 
> NOTHING HELPS!!! I tried overvolting the card by 3%, but I think it will do jack.
> The only thing I have not tried yet is moving to Windows 10.
> 
> Should I just RMA this piece of crap card?


Do you have any Gigabyte mobo CD utilities installed and/or running?
If so, ditch them all.
(My X99/6850K crashed randomly in games like yours did, until I removed all the Gigabyte utilities.)


----------



## zipper17

Quote:


> Originally Posted by *shilka*
> 
> Seems like nothing I do stops my GTX 1070 drivers from crashing.
> Messing around with Windows settings did jack to help the problem.
> 
> Tried bumping the vcore in the Xtreme software, but at this point I don't think it will help either.
> The only option I seem to have left is to move to Windows 10, but I want a new SSD before I install a new Windows.
> 
> What's most annoying is that the driver crashes are so random.
> Sometimes days will go by without a single crash, and other days it will crash multiple times.
> 
> If moving to Windows 10 does not help, then the card must be broken, and at this point I am fed up with Gigabyte.
> I am actually thinking about selling my GTX 1070 Xtreme Gaming and buying an Asus GTX 1080 or GTX 1080 Ti Strix instead!


The driver crashes without the GPU doing anything at all? No gaming, etc.? Idling and crashing? That's weird.

To be sure, double-check Event Viewer and see what the system reports during the crash.


----------



## shilka

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> do u have any Gigabyte MOBO CD utilities installed and/or running?
> if so ditch them all.
> (my x99/6850k crashed in games randomly like urs until i vanquished all the Gigabyte utilities.)


I could do that, but I doubt it will help, as I believe I have already tried that.
Quote:


> Originally Posted by *zipper17*
> 
> The driver crashes without the GPU doing anything at all? No gaming, etc.? Idling and crashing? That's weird.
> 
> To be sure, double-check Event Viewer and see what the system reports during the crash.


It crashes at random in games, almost always when I create a big explosion in the game.
It also crashes at random when I turn my TV off and the signal to the HDMI port is cut.

Most of the time when I turn the TV off, the monitor just blinks and goes back to normal in 1-2 seconds.

But sometimes when I turn the TV off, the monitor will freeze, then go black with the message "DisplayPort no signal", and then it goes back to Windows with the message that the Nvidia driver has crashed and recovered.

All of this has been going on since day one, at random.

I am sick to death of this problem, and the only thing I have not tried yet is installing Windows 10, which I can't, since I don't have a USB stick or DVD to install from.
I do own a legal Windows 10 code someone I know gave me, but I don't have a disk or USB.

If moving to Windows 10 does not help, then the card must be broken, in which case I will RMA it and sell it the second I get it back fixed, or get a new card.
This is the last time I will ever buy a Gigabyte video card.


----------



## zipper17

Do you overclock your 1070?

Try putting your 1070 into a different computer. If the symptoms still occur, then it's probably your GPU.


----------



## shilka

Quote:


> Originally Posted by *zipper17*
> 
> Do you overclock your 1070?
> 
> Try putting your 1070 into a different computer. If the symptoms still occur, then it's probably your GPU.


I have tried both overclocking AND underclocking the card, as I said before, and no matter what I do or try, nothing helps in any way
I was thinking about taking the card out and letting my friend mess around with it as a way to find out if the card or the system is the problem

In the end I think I am going to sell it and buy a GTX 1080 or a GTX 1080 Ti from Asus or EVGA instead, as my patience with this card has run out
Never going to buy another Gigabyte card after this, just like I will never buy a card from XFX after all the problems I had with the old Radeon 7950 DD I had from them.


----------



## DeathAngel74

For me, if my CPU was not stable, any big explosion would cause the PC to crash. I thought it was my GPU, but it was not enough vcore.


----------



## shilka

Quote:


> Originally Posted by *DeathAngel74*
> 
> For me, if my CPU was not stable, any big explosion would cause the PC to crash. I thought it was my GPU, but it was not enough vcore.


I did bump the vcore on the GPU a bit, but that was before I uninstalled the Gigabyte software
I had a GTX 970 installed in this system since last summer and never had a single problem with it; it was only after I upgraded to the GTX 1070 that the problems started

I still think it's the card, and I am considering just giving up and taking the card to the shop I bought it from for an RMA
I would much rather have my old GTX 970 back than deal with all this crap.


----------



## shilka

The driver just crashed again, so uninstalling the Gigabyte software accomplished nothing
Damn this card; that was the last straw. I am going to RMA this son of you-know-what on Monday!

Edit: since it's Friday night here, I can't do anything for another 3 days, so I have tried yet again with a display driver uninstaller, and this time I made sure I used Safe Mode
It's probably not going to help a damn thing, but I can't do anything else

I could throw in my old GTX 970 right now, but it's late and I am too tired and annoyed to do that right now.


----------



## gtbtk

Quote:


> Originally Posted by *shilka*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DeathAngel74*
> 
> For me, if my CPU was not stable, any big explosion would cause the PC to crash. I thought it was my GPU, but it was not enough vcore.
> 
> 
> 
> I did bump the vcore on the GPU a bit, but that was before I uninstalled the Gigabyte software
> I had a GTX 970 installed in this system since last summer and never had a single problem with it; it was only after I upgraded to the GTX 1070 that the problems started
> 
> I still think it's the card, and I am considering just giving up and taking the card to the shop I bought it from for an RMA
> I would much rather have my old GTX 970 back than deal with all this crap.

In my experience, the 1070 puts loads on components that older cards were not powerful enough to expose. When I first got mine, I was getting watchdog errors and loads of crashes as well. Plus I was also discovering the wonders of Micron memory with the old BIOS.

Maybe it is worth swapping the card you have for a new one. If the new one still does the same things, you then know it is the PC configuration and not the card.


----------



## shilka

Quote:


> Originally Posted by *gtbtk*
> 
> In my experience, the 1070 puts loads on components that older cards were not powerful enough to expose. When I first got mine, I was getting watchdog errors and loads of crashes as well. Plus I was also discovering the wonders of Micron memory with the old BIOS.
> 
> Maybe it is worth swapping the card you have for a new one. If the new one still does the same things, you then know it is the PC configuration and not the card.


If I get a new card and the new card does the same thing, then what could the problem be?
My CPU and everything else in the system is at stock clock speeds and vcore, because I don't feel like overclocking my system or messing around in the BIOS

A system at out-of-the-box clock speeds and settings should not be the problem; yes, I realize that I am the biggest noob when it comes to overclocking
Never had any problems with my old GTX 970, which I took out of my secondary system and have ready to be installed in this system.

Edit: my gut feeling is there is something wrong with the power delivery on my GTX 1070, because it always crashes when power is suddenly cut or increased
It's not the PCI-E slots, it's not the motherboard BIOS or the Gigabyte software, it's not the drivers, and it's not because the card is clocked too low or too high

I can only think of two things that can be wrong: either my Windows is messed up or the card is broken
If I throw in my old card and it still crashes, then I know for a fact it must be Windows or the motherboard itself, which I doubt

This problem is so frustrating!

Edit 2: @Bee Dee 3 Dee you have the same motherboard and CPU that I have, so is there anything you have done to your system that would make it run a GTX 1070 better?


----------



## gtbtk

Quote:


> Originally Posted by *shilka*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> In my experience, the 1070 puts loads on components that older cards were not powerful enough to expose. When I first got mine, I was getting watchdog errors and loads of crashes as well. Plus I was also discovering the wonders of Micron memory with the old BIOS.
> 
> Maybe it is worth swapping the card you have for a new one. If the new one still does the same things, you then know it is the PC configuration and not the card.
> 
> 
> 
> If I get a new card and the new card does the same thing, then what could the problem be?
> My CPU and everything else in the system is at stock clock speeds and vcore, because I don't feel like overclocking my system or messing around in the BIOS
> 
> A system at out-of-the-box clock speeds and settings should not be the problem; yes, I realize that I am the biggest noob when it comes to overclocking
> Never had any problems with my old GTX 970, which I took out of my secondary system and have ready to be installed in this system.
> 
> Edit: my gut feeling is there is something wrong with the power delivery on my GTX 1070, because it always crashes when power is suddenly cut or increased
> It's not the PCI-E slots, it's not the motherboard BIOS or the Gigabyte software, it's not the drivers, and it's not because the card is clocked too low or too high
> 
> I can only think of two things that can be wrong: either my Windows is messed up or the card is broken
> If I throw in my old card and it still crashes, then I know for a fact it must be Windows or the motherboard itself, which I doubt
> 
> This problem is so frustrating!
> 
> Edit 2: @Bee Dee 3 Dee you have the same motherboard and CPU that I have, so is there anything you have done to your system that would make it run a GTX 1070 better?

You are aware that not all chips overclock the same, right? Some chips want a bit more voltage than others. That applies equally at stock as it does at overclocked frequencies. Motherboards are configured to be within a range that will support most CPUs. Yours may be well inside that range, but it may also be right on the fringe of the settings envelope. So assuming the vcore is perfect, particularly when you have one of these unknown-cause problems, is not a good assumption. I am not suggesting you worry, just don't exclude a possible cause because of a wrong assumption.

Kepler and Maxwell graphics cards do not push hardware to the same limits that a Pascal card can. After all, a 970 cannot do 21000 graphics scores in Fire Strike. All those frames put more data load on the CPU, memory, etc., so the system needs to be in a finer state of tune to meet requirements it has never had to deal with before. If the tune is out a bit, the 970 simply never pushed hard enough for you to notice the issue.

To solve your problem we need to do a process of elimination, because I have no idea what your exact problem is off the top of my head. You have described things, but maybe there are things that happened that you didn't notice or that seemed unimportant and went unmentioned; no big deal, we just need to work the problem.

My first instinct was to work on the premise that the card is OK and there is a misconfiguration somewhere. That is a slight risk, because if the card is duff we wasted time anyway, but the reason I went with software first is that a bad card means a trip to the shop and a software fix doesn't. Having said that, if you do replace the card and it is doing exactly the same thing, we can be pretty sure it is the software or the CPU/motherboard.

This thread has jumped around a bit, and some posts just say the graphics card crashed, which doesn't help me much. Could you please go through the Event Viewer Administrative, Application, and System logs and look at the time around the last crash? Take screenshots of the lists and post them back here so I can get a look at what is going on.
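As a faster alternative to screenshots, Windows' built-in `wevtutil` can dump the most recent display-driver events from the System log as plain text. This is only a sketch: `nvlddmkm` is the Nvidia kernel-mode driver (TDR recoveries often log under it, or under the generic "Display" provider), but the exact provider names are worth confirming in Event Viewer first.

```shell
# Sketch: build the wevtutil command that pulls recent display-driver
# events from the Windows System log, instead of screenshotting Event
# Viewer. Paste the echoed line into an elevated cmd prompt on the
# affected machine; 'nvlddmkm' is the Nvidia kernel-mode driver.
build_query() {
  provider=$1   # event provider name, e.g. nvlddmkm
  count=$2      # how many of the newest events to fetch
  echo "wevtutil qe System /q:\"*[System[Provider[@Name='$provider']]]\" /c:$count /rd:true /f:text"
}

build_query nvlddmkm 10   # newest 10 Nvidia driver events
build_query Display 10    # generic display/TDR events
```

Running the first echoed command prints the ten newest nvlddmkm entries, newest first (`/rd:true`), which should be enough to see whether the crash timestamps line up with the HDMI signal drops.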


----------



## kignt

@Shilka Have you tested your RAM? Perhaps a memory chip has gone bad over the years.

I recently had to replace the RAM in an i7-950 system because some of it went bad. It would crash only during Overwatch competitive, while quickplay was fine; weird. I tried a few things first, like changing the CMOS battery, reseating some RAM and the GPU, and backing off the overclock a bit. Sometimes the system wouldn't even boot to BIOS. When it did, I decided to test the RAM and found errors. Got some new sticks, issue solved; it has been stable for a week now.


----------



## khanmein

Quote:


> Originally Posted by *kignt*
> 
> @Shilka Have you tested your RAM? Perhaps a memory chip has gone bad over the years.
> 
> I recently had to replace the RAM in an i7-950 system because some of it went bad. It would crash only during Overwatch competitive, while quickplay was fine; weird. I tried a few things first, like changing the CMOS battery, reseating some RAM and the GPU, and backing off the overclock a bit. Sometimes the system wouldn't even boot to BIOS. When it did, I decided to test the RAM and found errors. Got some new sticks, issue solved; it has been stable for a week now.


Memtest86


----------



## Quadrider10

What version are you guys using, and how are you flashing these cards? The latest nvflash has not been working for me.


----------



## gtbtk

Quote:


> Originally Posted by *Quadrider10*
> 
> What version are you guys using, and how are you flashing these cards? The latest nvflash has not been working for me.


I have been using 5.83.

If you download the Asus Strix 1070 BIOS update utility, you can use 7-Zip to extract the RAR, and then use 7-Zip again to extract the contents of the EXE file. Inside the directory created from the EXE there is a folder called 64, and you will find a copy of nvflash there that should work.

You need to use the command "nvflash -6 yourfilename.rom" for it to cross-flash to a different model card.
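The steps above, sketched as a dry-run script. The archive and ROM file names are placeholders, and a backup step has been added; cross-flashing a different model's BIOS can brick a card, so treat this as an outline of the procedure rather than a recipe.

```shell
# Dry-run sketch of the extraction and flash steps described above.
# Set DRY_RUN=0 to actually execute (needs 7z and nvflash on PATH,
# admin rights, and a ROM you trust). File names are placeholders.
DRY_RUN=1
run() { if [ "$DRY_RUN" = 1 ]; then echo "+ $*"; else "$@"; fi; }

run 7z x StrixBiosUpdate.rar          # unpack the downloaded RAR
run 7z x UpdateTool.exe -oextracted   # 7-Zip can unpack the EXE too
run cd extracted/64                   # 64-bit nvflash lives in "64"
run nvflash --save backup.rom         # back up the current BIOS first
run nvflash --protectoff              # lift the EEPROM write protect
run nvflash -6 yourfilename.rom       # -6 overrides the board-ID mismatch
```

With `DRY_RUN=1` the script only prints each command, so you can review the exact sequence before running it for real.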


----------



## zipper17

Quote:


> Originally Posted by *shilka*
> 
> I could do that, but I doubt it will help, as I believe I have already tried that
> It crashes at random in games, and almost always when I create a big explosion in the game
> It also crashes at random when I turn my TV off and the signal to the HDMI port is cut
> 
> Most of the time when I turn the TV off, the monitor just blinks and goes back to normal in 1-2 seconds
> 
> But sometimes when I turn the TV off, the monitor will freeze, then go black with the message "DisplayPort no signal," and then go back to Windows with the message that the Nvidia driver has crashed and recovered
> 
> All of this has been going on since day one, at random
> 
> I am sick to death of this problem, and the only thing I have not tried yet is installing Windows 10, which I can't since I don't have a USB drive or DVD to install from
> I do own a legal Windows 10 key someone I know gave me, but I don't have a disk or USB drive
> 
> If moving to Windows 10 does not help, then the card must be broken, in which case I will RMA it and sell it the second I get it back fixed or get a new card
> This is the last time I will ever buy a Gigabyte video card.


Nvidia driver crashes, I'd say, point to a GPU or power supply problem. Driver crashes are usually caused by a defective GPU, artifacting, overheating, or too much overclocking.

The power supply can be defective too. Even a well-regarded power supply is an electronic component; who knows, you may have gotten a bad unit. A defective PSU may not provide enough power, or stable enough voltage ripple, to the GPU. Worst case, both components are defective, with the bad PSU having caused the GPU to fail as well.

Try reseating the GPU and the PCIe cables. Or maybe try another monitor, and check or change your monitor cables.


----------



## khanmein

Zotac ZT-P10700I-10P @ $370


----------



## shilka

Quote:


> Originally Posted by *kignt*
> 
> @Shilka Have you tested your RAM? Perhaps a memory chip has gone bad over the years.
> 
> I recently had to replace the RAM in an i7-950 system because some of it went bad. It would crash only during Overwatch competitive, while quickplay was fine; weird. I tried a few things first, like changing the CMOS battery, reseating some RAM and the GPU, and backing off the overclock a bit. Sometimes the system wouldn't even boot to BIOS. When it did, I decided to test the RAM and found errors. Got some new sticks, issue solved; it has been stable for a week now.


The system is not even a year old yet, and I had a GTX 970 installed until a few weeks ago and never had a problem there

I am going to move my GTX 1070 over to my secondary PC, which is an old X79 rig with an old 3820 and 12 GB of RAM (12 GB because the 4th stick is broken)
If the card crashes on that PC as well, then it's the card

Only problem is there is nothing installed on that system, which means the fastest way to get it to crash is to turn my TV on and off until it crashes.


----------



## shilka

Never mind; the monitor came back to life, and the problem came back using HDMI.


----------



## lanofsong

Hey GTX 1070 owners,

We are having our monthly Foldathon from Monday the 20th to Wednesday the 22nd, starting at 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us; see the attached link.

March 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus); you need a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter the Team OCN number - 37726
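For anyone who prefers the command line, the three settings above map onto FAHClient options. This is a sketch from memory; confirm the flag names with `FAHClient --help` on your install, and substitute your own folding name and passkey for the placeholders.

```shell
# Sketch: the sign-up steps above as a single FAHClient invocation.
# YOUR_NAME and YOUR_PASSKEY are placeholders; 37726 is Team OCN.
fah_cmd() {
  name=$1; passkey=$2; team=$3
  echo "FAHClient --user=$name --passkey=$passkey --team=$team"
}

fah_cmd YOUR_NAME YOUR_PASSKEY 37726
```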

later
lanofsong


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *shilka*
> 
> If I get a new card and the new card does the same thing, then what could the problem be?
> My CPU and everything else in the system is at stock clock speeds and vcore, because I don't feel like overclocking my system or messing around in the BIOS
> 
> A system at out-of-the-box clock speeds and settings should not be the problem; yes, I realize that I am the biggest noob when it comes to overclocking
> Never had any problems with my old GTX 970, which I took out of my secondary system and have ready to be installed in this system.
> 
> Edit: my gut feeling is there is something wrong with the power delivery on my GTX 1070, because it always crashes when power is suddenly cut or increased
> It's not the PCI-E slots, it's not the motherboard BIOS or the Gigabyte software, it's not the drivers, and it's not because the card is clocked too low or too high
> 
> I can only think of two things that can be wrong: either my Windows is messed up or the card is broken
> If I throw in my old card and it still crashes, then I know for a fact it must be Windows or the motherboard itself, which I doubt
> 
> This problem is so frustrating!
> 
> Edit 2: @Bee Dee 3 Dee you have the same motherboard and CPU that I have, so is there anything you have done to your system that would make it run a GTX 1070 better?


No, I left all default settings; no OC of anything. After enabling the XMP setting for the RAM I did change the command rate from 2T to 1T. (Ironically, I've got some of the best RAM on the planet for OCing, but I just use XMP and a 1T command rate.)

The utilities that come with this mobo may work with pre-1000-series GeForce cards, but I believe that starting with the 1000-series cards it is a must that you do not use any Gigabyte utilities from the included motherboard CD. (They were designed too long ago and not updated, if at all.)

And yes, the bigger the eye candy in games, the more likely the random crash was, for me. (I didn't mind it in Doom 4, because it was better than _Hell_, lol. But it sucked in games like GTA V; I was cursing whenever it happened.) But remember, I have SLI and it performs very well, and as a result it was hard to figure out. As a perfectionist, I found that the Gigabyte utilities were the cause, and not just maybe. I never crash now, for any reason. After fixing your first card, you should add a second card.

(Or call Gigabyte and/or the place you bought the card, tell them you need to spend more money, and ask if you could trade in and split the difference on a single GTX 1080 Ti. That would be cool as blank.)

GL

PS: I should mention that I've used driver "*376.19-desktop-win8-win7-64bit-international-whql*" ever since it came out ages ago (I downloaded and installed it December 6th). I have never needed to uninstall or reinstall it. I built the PC October 7, 2016. It probably took a full month to nail down that avoiding the Gigabyte utilities was the way to stop games crashing. And my *BIOS is version F5 08/29/2016* (the one it came with).

PPS: Did you figure out the extra power adapter included with the X99 Ultra Gaming mobo? The instructions were so vague that I skipped researching it. I can't recall for sure, but I think an additional PCI-E power plug is used with it, and it then goes into the 24-pin on the mobo along with the PSU's 24-pin; it's for the sake of boosting power to a six-core CPU. But without a certified Gigabyte employee on the phone I wouldn't allow myself to touch it. I wish they had included extensive instructions, but oh well, all else seems fine. I love my AIO cooler too. My max CPU temp is 56 C with a room temp of 70 F.

I haven't re-researched anything I posted in this thread back in October or after, but you might peek to see what, if anything, I mentioned back then about this mobo. And you know the official Gigabyte forum is located on the TweakTown site, right? https://forums.tweaktown.com/gigabyte/


----------



## shilka

I am now 100% sure I have finally found the problem, and it is indeed my GTX 1070 that is defective
I moved it to my second PC, and in less than 30 minutes of playing Just Cause 3 I had a number of crashes, just like on my main PC

So yes, the card is defective and is going in for RMA on Monday
I am going to talk to the shop and ask if I can get an Asus GTX 1070 Strix instead of this god-awful Gigabyte Xtreme card

Not sure if I want to keep whatever GTX 1070 I end up with, so I might sell it without opening the box.


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *shilka*
> 
> I am now 100% sure I have finally found the problem, and it is indeed my GTX 1070 that is defective
> I moved it to my second PC, and in less than 30 minutes of playing Just Cause 3 I had a number of crashes, just like on my main PC
> 
> So yes, the card is defective and is going in for RMA on Monday
> I am going to talk to the shop and ask if I can get an Asus GTX 1070 Strix instead of this god-awful Gigabyte Xtreme card
> 
> Not sure if I want to keep whatever GTX 1070 I end up with, so I might sell it without opening the box.


Cool.

I forgot to mention that both of my Asus OC 1070 cards worked in my LGA1366 system before I got the X99 mobo. I got the new mobo because performance was only 40% of what it should have been with LGA1366.

Go for a single GTX 1080 Ti now that they exist. I wish I had one.

Edit 1: if someone here has the same 1070 as the one you RMA, they can SLI it with the unopened replacement you get. Sell them at a discount and go single 1080 Ti!

(It is a waste of the 40 lanes not to SLI, but the 1080 Ti rocks, doesn't it? I haven't read reviews of the Founders Edition yet, but I bet you can get in on the Asus OC 1080 Ti... be patient.)


----------



## shilka

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> Cool.
> 
> I forgot to mention that both of my Asus OC 1070 cards worked in my LGA1366 system before I got the X99 mobo. I got the new mobo because performance was only 40% of what it should have been with LGA1366.
> 
> Go for a single GTX 1080 Ti now that they exist. I wish I had one.


If I get another Gigabyte card back, I am going to sell my GTX 1070 to my friend and borrow his GTX 970, which was my old GTX 970 before I sold it to him
I feel like the GTX 1080 Ti is a bit too much, price-wise and for my usage; the GTX 1080 seems like a better option now that it has dropped in price

I have been looking at the EVGA GTX 1080 FTW2, but I feel like it's a bit overpriced here, so the Asus Strix seems like the best option?

https://www.computersalg.dk/i/1911166/asus-geforce-gtx-1080-a8g-gaming
https://www.computersalg.dk/i/3326824/evga-geforce-gtx-1080-ftw2-gaming?sq=EVGA+GTX+1080FTW2

Not going to touch Gigabyte with a 10-foot barge pole ever again!

https://www.computersalg.dk/i/1893204/gigabyte-geforce-gtx-1080-gv
https://www.computersalg.dk/i/3306289/gigabyte-aorus-geforce-gtx-1080

Edit: unless I can talk the shop into letting me get another brand, I am not going to keep the card if I get a Gigabyte card back
I really like the GTX 1070, and I don't mind getting an Asus or EVGA card, but I refuse to keep a Gigabyte card, and if I am going to sell it anyway I might as well upgrade


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *shilka*
> 
> If I get another Gigabyte card back, I am going to sell my GTX 1070 to my friend and borrow his GTX 970, which was my old GTX 970 before I sold it to him
> I feel like the GTX 1080 Ti is a bit too much, price-wise and for my usage; the GTX 1080 seems like a better option now that it has dropped in price
> 
> I have been looking at the EVGA GTX 1080 FTW2, but I feel like it's a bit overpriced here, so the Asus Strix seems like the best option?
> 
> https://www.computersalg.dk/i/1911166/asus-geforce-gtx-1080-a8g-gaming
> https://www.computersalg.dk/i/3326824/evga-geforce-gtx-1080-ftw2-gaming?sq=EVGA+GTX+1080FTW2
> 
> Not going to touch Gigabyte with a 10-foot barge pole ever again!
> 
> https://www.computersalg.dk/i/1893204/gigabyte-geforce-gtx-1080-gv
> https://www.computersalg.dk/i/3306289/gigabyte-aorus-geforce-gtx-1080
> 
> Edit: unless I can talk the shop into letting me get another brand, I am not going to keep the card if I get a Gigabyte card back
> I really like the GTX 1070, and I don't mind getting an Asus or EVGA card, but I refuse to keep a Gigabyte card, and if I am going to sell it anyway I might as well upgrade


Yet. I agree; the Strix cooling beat everyone else's last July (whenever they came out, I read every review).

I had Gigabyte Windforce 760s in SLI that were fine for several years, and I've had nothing but their mobos for over a decade, and those still rock. But Asus Strix video cards are great. (And if one breaks, the Asus video card/mobo RMA location is a two-hour drive from me. w00t!)

GL with the shop.


----------



## shilka

The only reason I got an Xtreme Gaming card was that I was so satisfied with my old GTX 970 G1 Gaming cards
I should have spent more time looking at all the options and not just bought the first thing that looked good

Before the two GTX 970 G1 cards from Gigabyte, I had two of the old Asus GTX 680 4 GB CU II cards in SLI, and those were damn good cards back in the day
I hope the new Asus Strix cards are just as good or better

Never owned an EVGA card before, so I am thinking about getting one of those, but I don't know?


----------



## cutty1998

Quote:


> Originally Posted by *gtbtk*
> 
> There is no way to tell ASIC quality on Pascal. You must have a very old version of GPU-Z that is trying to read the chip as Maxwell; the number is inaccurate.
> 
> You can only access 1.093 V if you use an overclocking utility. Stock for all cards is 1.063 V as far as I am aware. I am not aware of any cards on which you cannot increase the voltage. 1.093 V is fine, but it will increase temps over stock voltages, so you may find that it starts high but drops off more quickly as the temps rise. The key is finding the best balance between temps, frequency, and voltage to maximize framerates.


Everything I do, I do for my 18-year-old son (cutty1998). I had been a bit depressed of late, as he seemed to have lost all interest in PC builds, overclocking, and even PC gaming, but he just got back from a programming/coding competition at UCF (Fl.), and his interest in hardware and software seems to have been re-ignited, so I am very happy about that! He was asking me about the 1080 Ti, Ryzen, and the new chipsets. Blew my mind! We have gone from 2-way GTX 680 SLI, to a single Gigabyte GTX 980, to an Asus Strix GTX 1070 OC on the Ivy Bridge platform, and due to our insanely high upcoming college costs, we will be lucky if we can add a second 1070 for SLI. I have another kid right behind him headed for college, and a third behind that one! At this point I would be more than happy with a 6700K and a Z170 board at a discounted price to do a new build for my son. He has maintained a 4.2 GPA for his senior year in high school and got accepted to UF, where he will major in computer engineering, so I am very excited for him! He is also very involved in jazz, concert, and marching band, and plays 4 different brass instruments!


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *shilka*
> 
> The only reason I got an Xtreme Gaming card was that I was so satisfied with my old GTX 970 G1 Gaming cards
> I should have spent more time looking at all the options and not just bought the first thing that looked good
> 
> Before the two GTX 970 G1 cards from Gigabyte, I had two of the old Asus GTX 680 4 GB CU II cards in SLI, and those were damn good cards back in the day
> I hope the new Asus Strix cards are just as good or better
> 
> Never owned an EVGA card before, so I am thinking about getting one of those, but I don't know?


The Strix blew away all competition last July. I thought page 1 had links to many reviews, but I don't see any; it must be a thread dedicated to 1070 reviews that had them on page 1. Anyway, Strix all the way. BUT, at least with EVGA you get a trade-up program, unlike almost all other brands (I wish Asus did), and a trade-up program can be the better option, like with the possibility of a 1080 Ti. Sorry, I just love mentioning it, lol; having the best is always fun to dream of.


----------



## WyreTheWolf

Pulled my old Gigabyte Radeon R9 290X to install a Zotac GTX 1070... So far so good; easy overclocking above 2 GHz.


----------



## ColdDeckEd

I saw a Hybrid cooler on sale on Amazon (Amazon Warehouse, so it was used), so I picked it up and finished installing it the other day. The repackaging job was terrible and messed up the included thermal pads, but I was able to reuse the ones installed by EVGA. I was able to hit 21k in Fire Strike like I've wanted. I will reuse it if I ever get a 1080 Ti.


----------



## Quadrider10

So I flashed the Gigabyte Xtreme BIOS onto my G1. It seems to have stabilized the clocks, since the card is no longer bouncing off the power limit, but temps got a bit hotter with no extra voltage or clock speed applied.


----------



## shilka

Would anyone in the club be interested in buying a brand-new, unopened Gigabyte GTX 1070 Xtreme Gaming?

The reason I ask is that I don't want another Gigabyte Xtreme Gaming card, so if I can't get a refund or a different card after my RMA is done, I am sitting on a brand-new, unopened card I don't want

The normal price would be 4000 kr, which is about $580 US locally, but since the card was bought with a VAT exemption I only paid 3200 kr, or about $460, for it
I would be willing to take a little bit off the price, so let's say 3000 kr, or $430 US

And before anyone complains about those prices, you should know I live in Denmark, which has 25% VAT and 39% or higher income tax, which makes our prices some of the highest in the world

3000 kr would be a 25% discount from new.


----------



## Blackfirehawk

Quote:


> Originally Posted by *shilka*
> 
> Would anyone in the club be interested in buying a brand-new, unopened Gigabyte GTX 1070 Xtreme Gaming?
> 
> The reason I ask is that I don't want another Gigabyte Xtreme Gaming card, so if I can't get a refund or a different card after my RMA is done, I am sitting on a brand-new, unopened card I don't want
> 
> The normal price would be 4000 kr, which is about $580 US locally, but since the card was bought with a VAT exemption I only paid 3200 kr, or about $460, for it
> I would be willing to take a little bit off the price, so let's say 3000 kr, or $430 US
> 
> And before anyone complains about those prices, you should know I live in Denmark, which has 25% VAT and 39% or higher income tax, which makes our prices some of the highest in the world
> 
> 3000 kr would be a 25% discount from new.


Wouldn't it be easier for you to drive over to Germany for new hardware? For a new GTX 1070 you would spend around 400-450 € ($430-500).

You can get the cheapest GTX 1070 for about 370 € ($400).


----------



## shilka

Quote:


> Originally Posted by *Blackfirehawk*
> 
> Wouldn't it be easier for you to drive over to Germany for new hardware? For a new GTX 1070 you would spend around 400-450 € ($430-500).
> 
> You can get the cheapest GTX 1070 for about 370 € ($400).


No, it's not that much cheaper when you take shipping costs into account

Germany is not near where I am, and it's even further when you don't own a car
It's also a pain in the butt to return stuff all the way to Germany, which is the reason I don't order from there


----------



## gtbtk

Quote:


> Originally Posted by *Quadrider10*
> 
> So I flashed the Gigabyte Xtreme BIOS onto my G1. It seems to have stabilized the clocks, since the card is no longer bouncing off the power limit, but temps got a bit hotter with no extra voltage or clock speed applied.


That is to be expected; the Xtreme is a 1671 MHz, 210 W graphics card (240 W if you bump the power limit), while the G1 is only a 1595 MHz, 180-200 W card. If you are under 80 degrees then you are fine.

I suggest that you create a fan curve if the temps worry you.
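The kind of curve being suggested, sketched as a simple step function. The actual curve would be set in MSI Afterburner or a similar tool; the temperature breakpoints here are illustrative assumptions, chosen to hold the card in the mid-60s rather than letting it drift past 70.

```shell
# Illustrative fan curve only: maps a GPU core temp (deg C) to a fan
# duty cycle (%). Real curves are configured in Afterburner or similar;
# these breakpoints are assumptions, tuned to keep the card in the
# mid-60s for better sustained boost clocks.
fan_pct() {
  t=$1
  if   [ "$t" -lt 40 ]; then echo 30
  elif [ "$t" -lt 55 ]; then echo 50
  elif [ "$t" -lt 65 ]; then echo 70
  elif [ "$t" -lt 75 ]; then echo 85
  else                       echo 100
  fi
}

fan_pct 62   # prints 70: a card sitting at 62 C gets 70% fan
```

The idea is just to ramp earlier than the stock auto curve does, trading a little noise for a few degrees of headroom.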


----------



## Quadrider10

Quote:


> Originally Posted by *gtbtk*
> 
> That is to be expected; the Xtreme is a 1671 MHz, 210 W graphics card (240 W if you bump the power limit), while the G1 is only a 1595 MHz, 180-200 W card. If you are under 80 degrees then you are fine.
> 
> I suggest that you create a fan curve if the temps worry you.


My G1 hits 70 C at 2100 MHz at 1.083 V on auto fan. With the Xtreme BIOS, it hit 72 C at 2000 MHz at stock voltage and fan.


----------



## gtbtk

Quote:


> Originally Posted by *Quadrider10*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> That is to be expected; the Xtreme is a 1671 MHz, 210 W graphics card (240 W if you bump the power limit), while the G1 is only a 1595 MHz, 180-200 W card. If you are under 80 degrees then you are fine.
> 
> I suggest that you create a fan curve if the temps worry you.
> 
> 
> 
> My G1 hits 70 C at 2100 MHz at 1.083 V on auto fan. With the Xtreme BIOS, it hit 72 C at 2000 MHz at stock voltage and fan.
Click to expand...

72 °C is still fine, but I think your performance will probably benefit from a fan curve. It doesn't need to kick the fan to 100% straight away, but if you keep the card in the mid 60s it will run faster.

When I am benchmarking my MSI Gaming X with the Z BIOS, 100% fan keeps mine at 53-54 °C at 1.063 V and 58-59 °C at 1.093 V. I have moved away from using +100 voltage; in my case I get about the same performance at 1.063 V with lower temps.

http://www.3dmark.com/fs/11822144

This is on an i7-2600 with only PCIe 2.0, and I'm still getting a graphics score of over 21,000 at 1.063 V.
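For context on how the sub-scores relate to the total: the overall Fire Strike score is a weighted harmonic mean of the graphics, physics, and combined scores. A quick sketch, using weights I believe come from the 3DMark technical guide (0.75/0.15/0.1 — treat them as assumed and verify against the guide):

```python
# Overall Fire Strike score as a weighted harmonic mean of sub-scores.
# The weights (0.75 graphics, 0.15 physics, 0.1 combined) are assumed
# from memory of the 3DMark technical guide -- verify before relying
# on them. The harmonic mean means a weak sub-score drags the total
# down more than an arithmetic average would.

def fire_strike_overall(graphics, physics, combined,
                        weights=(0.75, 0.15, 0.1)):
    wg, wp, wc = weights
    return (wg + wp + wc) / (wg / graphics + wp / physics + wc / combined)

# Sub-scores in the ballpark of the run discussed above:
print(round(fire_strike_overall(21000, 10441, 6260)))
```

This is also why the combined score matters more than its 0.1 weight suggests: being the smallest sub-score, it contributes the largest reciprocal term.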


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> 72 deg is still fine but I think your performance will probably benefit from a fan curve. It doesn't need to kick the fan to 100% straight away but if you keep the card in the mid 60s it will run faster.
> 
> If i am bench marking with my MSI Gaming X with Z bios, 100% fan will keep mine at 53-54 deg at 1.063 and 58-59 at 1.093v. I have moved away from using +100 voltage. In my case, I am getting about the same performance at 1.063v with lower temps.
> 
> http://www.3dmark.com/fs/11822144
> 
> This is on an i7-2600 with only PCIe 2.0. I'm still getting over 21000 graphics score at 1.063V


Your physics score is 10,441,
combined score 6,260.

My overclocked 3570K scored 8,923 in physics,
but my combined score is 8,143.

Why is the combined score lower on the i7, hmm?

This is my latest run:
http://www.3dmark.com/fs/11945625
FS GS 20,7XX ;S

I can hit 21k+, but it's not perfectly stable in the 3DMark stress test; it crashes on the GPU core or shows small artifacts when the memory is overclocked too far. I guess I have a lower-binned 1070 chip.
http://www.3dmark.com/fs/11588417


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> 72 deg is still fine but I think your performance will probably benefit from a fan curve. It doesn't need to kick the fan to 100% straight away but if you keep the card in the mid 60s it will run faster.
> 
> If i am bench marking with my MSI Gaming X with Z bios, 100% fan will keep mine at 53-54 deg at 1.063 and 58-59 at 1.093v. I have moved away from using +100 voltage. In my case, I am getting about the same performance at 1.063v with lower temps.
> 
> http://www.3dmark.com/fs/11822144
> 
> This is on an i7-2600 with only PCIe 2.0. I'm still getting over 21000 graphics score at 1.063V
> 
> 
> 
> your physic scores 10,441
> combined scores 6,260
> 
> my 3570k oc physic scores scored 8,923
> but my combined scores 8,143
> 
> why combined scores on i7 is lower, hmm?
> 
> this is my latest run
> http://www.3dmark.com/fs/11945625
> FS GS 20,7XX ;S
> 
> I can hit +21k but not perfect stable in 3d mark Stress Test
> http://www.3dmark.com/fs/11588417
Click to expand...

That is a good question that I have not yet found an answer for. I am seeing behavior similar to the Ryzen chips.

I suspect it might have something to do with PCIe 2.0 versus the PCIe 3.0 you have on Z77. It is still fine with load on one end or the other, but load on both ends causes contention somewhere.

I am wringing the neck of my CPU, which is a non-K chip, so I'm open to suggestions.

i7-2600 @ 4440 MHz with BCLK at 105.8 MHz, DDR3 @ 1972 MHz. As a matter of interest, what speed memory are you running?

Memory latency is about 57 ns.

I am using offset voltage, and VID peaks at 1.364 V.

I have discovered that raising VCCIO to 1.183 V increased my graphics performance, as it seems to fortify the PCIe and memory controllers. If you try it, go up one adjustment step at a time and test.

I have also discovered that adjusting CPU_PLL shifts the balance between the graphics score and the physics score. The default is 1.8 V and I am running at 1.8183 V; that is where things peak.

The highest graphics score I remember is 21,600, but the physics score was about 9,900.

Unfortunately my PSU went bang last week and I don't have a working one just now to log in and check.


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> That is a good question that I have not yet found an answer for. I am seeing similar behavior to the Ryzen chips.
> 
> I suspect that It might have something to do with PCIe 2.0 vs PCIE 3.0 that you have on z77. It is still fine with load on one end or the other but load on both ends is causing contention somewhere.
> 
> I am wringing the neck of my CPU which is a Non K chip. If you have any suggestions
> 
> I7-2600 @ 4440 Mhz with BCLK at 105.8Mhz. DDR3 @1972Mhz. As a matter of interest. What speed memory are you running?
> 
> memory latency is about 57ns.
> 
> I am using offset Voltage and VID peaks at 1.364
> 
> I have discovered that tuning vccio up to 1.183v increased my Graphics performance as it seems to have fortified the PCIe and memory controllers. If you try it, do it one adjustment value up at a time and test
> 
> I have discovered that if i adjust CPU_PLL it adjusts the balance between graphics score and Physics score. Default is 1.8 and I am running at 1.8183V that is where things peak.
> 
> The highest Graphics score I remember is 21600 but the physics score was about 9900
> 
> Unfortunately my PSU went bang last week and I don't have a working PSU just now to log in and check


Or maybe the Physics test stresses the whole CPU, while the Combined test is a different workload.

16 GB dual channel @ 2400 MHz, CL 11-13-13-35;
CPU-Z shows it at 1200 MHz.

3570K at 4.7 GHz @ 1.32-1.34 V vcore (in CPU-Z under full load in Prime95 and games)
3570K at 4.6 GHz @ 1.28 V (in CPU-Z under full load in Prime95 and games)

My motherboard's BIOS doesn't seem to expose CPU PLL; there's no option for it, only others like PCH voltage.
I've already tried increasing VCCSA/VCCIO; it didn't help the FS graphics score, but maybe I'll push it further.

My card crashes badly whenever it goes over a 21,000 graphics score, which makes me think my overclocking results are pretty disappointing.
Then again, a 1,000-point difference in graphics score might only be 1-3 FPS.
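On the CPU-Z reading: 1200 MHz is normal. DDR memory transfers on both clock edges, so CPU-Z reports the real (base) clock, which is half the effective DDR rating:

```python
# DDR ("double data rate") memory transfers on both clock edges, so
# the effective rating is twice the base clock that tools like CPU-Z
# report. DDR3-2400 therefore shows up as 1200 MHz.

def ddr_effective(base_mhz):
    """Effective DDR transfer rate from the reported base clock."""
    return base_mhz * 2

print(ddr_effective(1200))  # -> 2400
```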


----------



## pez

I OC'ed the 1070 in the secondary rig this weekend and couldn't get it much more stable than 1911 after its clock plateau. I'm not done with it, however, as the memory seems like it will OC pretty nicely.

This is the first 10-series card I've played with that just didn't want to boost to 2 GHz or so. Has anyone else had this kind of luck? I'm not heartbroken, as it's running extremely well.


----------



## asdkj1740

Quote:


> Originally Posted by *ColdDeckEd*
> 
> 
> 
> 
> 
> I saw a Hybrid cooler on sale on Amazon (amazon warehouse so it was used), so I picked it up, finished installing it the other day. The repackaging job was terrible and messed up the included thermal pads but was able to reuse the ones installed by evga. Was able to hit 21k in FS like I've wanted. Will reuse it if I ever get a 1080ti.


Dude, can you zoom in on the stock thermal pad to check whether there are any noticeable contact marks? lol


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> That is a good question that I have not yet found an answer for. I am seeing similar behavior to the Ryzen chips.
> 
> I suspect that It might have something to do with PCIe 2.0 vs PCIE 3.0 that you have on z77. It is still fine with load on one end or the other but load on both ends is causing contention somewhere.
> 
> I am wringing the neck of my CPU which is a Non K chip. If you have any suggestions
> 
> I7-2600 @ 4440 Mhz with BCLK at 105.8Mhz. DDR3 @1972Mhz. As a matter of interest. What speed memory are you running?
> 
> memory latency is about 57ns.
> 
> I am using offset Voltage and VID peaks at 1.364
> 
> I have discovered that tuning vccio up to 1.183v increased my Graphics performance as it seems to have fortified the PCIe and memory controllers. If you try it, do it one adjustment value up at a time and test
> 
> I have discovered that if i adjust CPU_PLL it adjusts the balance between graphics score and Physics score. Default is 1.8 and I am running at 1.8183V that is where things peak.
> 
> The highest Graphics score I remember is 21600 but the physics score was about 9900
> 
> Unfortunately my PSU went bang last week and I don't have a working PSU just now to log in and check
> 
> 
> 
> or maybe Cpu Test is testing whole cpu, when Combined Test maybe it's a different workload.
> 
> 16 Gb dual channel @2400mhz CL 11-13-13-35,
> in CPU-z it shows @1200mhz.
> 
> at 3570K 4.7ghz @1.32-1.34 vcore (in CPU-z fullload while prime95 & games)
> at 3570K 4.6ghz @1.28V (in CPU-z fullload while prime95 & games)
> 
> on my motherboard's BIOS it seems CPU PLL is not adjustable, there's no option for it. only has other like PCH voltage.
> Ive already tried increased vccsa/vccio didn't help gain the FS graphic scores, maybe i will try increase it further.
> 
> my card will crashes so badly whenever over +21000 Graphic scores. Make me think my overclock performances is pretty much disappointing.
> But +1000 graphic scores different might be only 1-3 FPS difference.
Click to expand...

You are running memory 450 MHz (effective) faster than what I have installed. My timings are looser at 12-12-12-34; I can tighten them, but it introduces a bit of instability. You also have an extra 300 MHz and better IPC, but with fewer cores.

21,000 is pretty tough to get with a 1070. I was stuck at 20,500 for ages.

Are you using the curve to OC?

Have you enabled message signaled interrupts on your card? It can help to reduce latency.

My board is an ASUS. I don't know Biostar, so I don't know what else to suggest short of continuing to experiment. Are there any Biostar OC guide sites?

From the 3dmark tech guide

Graphics 1

3DMark Fire Strike Graphics test 1 focuses on geometry and illumination. Particles are drawn at half resolution and dynamic particle illumination is disabled. There are 100 shadow casting spot lights and 140 non-shadow casting point lights in the scene. Compute shaders are used for particle simulations and post processing. Pixel processing is lower than in Graphics test 2 as there is no depth of field effect.

graphics 2

3DMark Fire Strike Graphics test 2 focuses on particles and GPU simulations. Particles are drawn at full resolution and dynamic particle illumination is enabled. There are two smoke fields simulated on GPU. Six shadow casting spot lights and 65 non-shadow casting point lights are present. Compute shaders are used for particle and fluid simulations and for post processing steps. Post processing includes a depth of field effect.

Physics

3DMark Fire Strike Physics test benchmarks the hardware's ability to run gameplay physics simulations on the CPU. The GPU load is kept as low as possible to ensure that only the CPU is stressed. The Bullet Open Source Physics Library is used as the physics library for the test.
The test has 32 simulated worlds. One thread per available CPU core is used to run simulations. All physics are computed on CPU with soft body vertex data updated to GPU each frame.

Combined

3DMark Fire Strike Combined test stresses both the GPU and CPU simultaneously. The GPU load combines elements from Graphics test 1 and 2 using tessellation, volumetric illumination, fluid simulation, particle simulation, FFT based bloom and depth of field. 
The CPU load comes from the rigid body physics of the breaking statues in the background. There are 32 simulation worlds running in separate threads each containing one statue decomposing into 113 parts. Additionally there are 16 invisible rigid bodies in each world except the one closest to camera to push the decomposed elements apart. The simulations run on one thread per available CPU core. 
The 3DMark Fire Strike Combined test uses the Bullet Open Source Physics Library.


----------



## gtbtk

Quote:


> Originally Posted by *pez*
> 
> I OC'ed the 1070 in the secondary rig this weekend and couldn't get much more stable out of it than 1911 after it's clock plateau. I'm not done with it as the memory seems like it will OC pretty nicely, however.
> 
> This is the first 10-series cards I've played with that just didn't want to boost to 2GHz or so. Anyone had this kind of luck? I'm not heartbroken as it's running extremely well.


What motherboard/CPU/memory and which 1070 model card are you running?

Is 1911 the clock after it was hot under load? What clock did it start out at, and how did you overclock it?

With a fan curve, most 1070s can run in the 60s without any problems. The upmarket ones have no problem staying under 60.

Check whether you have Micron or Samsung memory on the card. If Micron, you should have BIOS version 86.04.50.00.xx installed. If it is a .26 BIOS, you should get the BIOS update from your vendor, as it solves the Micron memory bug.

1070 VRAM should OC to somewhere between +500 and +800.
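For reference, here is how those offsets map to effective memory speed, assuming Afterburner's usual convention for GDDR5 on the 1070 (it displays roughly 4004 MHz, half the 8008 MT/s effective rate, and applies the offset there — treat the constants and convention as assumptions and check GPU-Z on your own card):

```python
# Sketch of how an Afterburner memory offset maps to effective GDDR5
# speed on a GTX 1070. Afterburner shows the 1070's memory as ~4004 MHz
# (half the 8008 MT/s effective rate) and applies the offset to that
# figure, so each +1 MHz of offset is +2 MT/s effective. Constants are
# assumptions -- verify against GPU-Z on your own card.

STOCK_AB_CLOCK = 4004  # MHz as displayed in Afterburner at stock

def effective_mts(offset_mhz, stock=STOCK_AB_CLOCK):
    """Effective memory transfer rate (MT/s) for a given offset."""
    return (stock + offset_mhz) * 2

print(effective_mts(0))    # stock: 8008 MT/s
print(effective_mts(500))  # +500 offset: 9008 MT/s
```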


----------



## icold

My BCLK only goes to around 104.4.


----------



## ColdDeckEd

Quote:


> Originally Posted by *asdkj1740*
> 
> dude can you zoom in to the stock thermal pad to check whether there are remarkable marks, lol


There were marks on all the thermal pads.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> My Bclk only go around 104.4


Have you tried a little extra vcore? I am unstable at 105.9.


----------



## icold

I'm running 4.28 GHz at 1.050 V, pretty stable. I tried 104.9 with up to 1.200 V and it froze. I reduced my VCCSA from 1.050 to 1.000 and now 104.4 looks stable, but only that...


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Im use 4.28ghz with 1.050v pretty stable i tried 104.9 until 1.200 and freeze. I reduce my VCCSA 1.050 to 1.000 and now looks stable 104.4, only...


As long as you have adequate cooling, 1.3-1.45 V is within the acceptable vcore range for Ivy Bridge on air, I believe.

Also, if the vcore is only just high enough for stability, you might find that the chip does not turbo up all that much; a bit more vcore will encourage the turbo to get up there. Offset voltage will also let the chip idle really low and boost to levels higher than you would use at a fixed voltage.


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> You are running memory 450Mhz (effective) faster than what I have installed. my timings are looser 12-12-12-34, I can tighten them but it does introduce a bit of instability. You also have an extra 300Mhz, better IPC but with less cores.
> 
> 21000 is pretty tough to get with a 1070. I was stuck at 20500 for ages.
> 
> _you using curve to oc?
> 
> have you enabled message signaled interrupts on your card? It can help to reduce latency_
> 
> My board is an Asus. I dont know biostar so I dont know what else I can suggest short of keep experimenting. Are there any biostar OC guide sites?
> 
> From the 3dmark tech guide
> 
> Graphics 1
> 3DMark Fire Strike Graphics test 1 focuses on geometry and illumination. Particles are drawn at half resolution and dynamic particle illumination is disabled. There are 100 shadow casting spot lights and 140 non-shadow casting point lights in the scene. Compute shaders are used for particle simulations and post processing. Pixel processing is lower than in Graphics test 2 as there is no depth of field effect.
> 
> graphics 2
> 3DMark Fire Strike Graphics test 2 focuses on particles and GPU simulations. Particles are drawn at full resolution and dynamic particle illumination is enabled. There are two smoke fields simulated on GPU. Six shadow casting spot lights and 65 non-shadow casting point lights are present. Compute shaders are used for particle and fluid simulations and for post processing steps. Post processing includes a depth of field effect.
> 
> Physics
> 
> "3DMark Fire Strike Physics test benchmarks the hardware's ability to run gameplay physics simulations on the CPU. The GPU load is kept as low as possible to ensure that only the CPU is stressed. The Bullet Open Source Physics Library is used as the physics library for the test.
> 
> The test has 32 simulated worlds. One thread per available CPU core is used to run simulations. All physics are computed on CPU with soft body vertex data updated to GPU each frame.
> 
> Combined
> 3DMark Fire Strike Combined test stresses both the GPU and CPU simultaneously. The GPU load combines elements from Graphics test 1 and 2 using tessellation, volumetric illumination, fluid simulation, particle simulation, FFT based bloom and depth of field.
> 
> The CPU load comes from the rigid body physics of the breaking statues in the background. There are 32 simulation worlds running in separate threads each containing one statue decomposing into 113 parts. Additionally there are 16 invisible rigid bodies in each world except the one closest to camera to push the decomposed elements apart. The simulations run on one thread per available CPU core.
> 
> The 3DMark Fire Strike Combined test uses the Bullet Open Source Physics Library.


I have both profiles, the traditional slider and the curve method.
I think both give pretty much the same results in Fire Strike graphics scores.
Over a 21,000 graphics score I get many crashes and small artifacts, and it won't stay stable in the Fire Strike Extreme stress test, but I'm still trying...

I think my card runs at IRQ 16 (a positive value), not in MSI mode (a negative value). Does that have any effect on benchmark scores?
The NVCP system information says the IRQ is not used, though.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You are running memory 450Mhz (effective) faster than what I have installed. my timings are looser 12-12-12-34, I can tighten them but it does introduce a bit of instability. You also have an extra 300Mhz, better IPC but with less cores.
> 
> 21000 is pretty tough to get with a 1070. I was stuck at 20500 for ages.
> 
> you using curve to oc?
> 
> have you enabled message signaled interrupts on your card? It can help to reduce latency
> 
> My board is an Asus. I dont know biostar so I dont know what else I can suggest short of keep experimenting. Are there any biostar OC guide sites?
> 
> From the 3dmark tech guide
> 
> Graphics 1
> 3DMark Fire Strike Graphics test 1 focuses on geometry and illumination. Particles are drawn at half resolution and dynamic particle illumination is disabled. There are 100 shadow casting spot lights and 140 non-shadow casting point lights in the scene. Compute shaders are used for particle simulations and post processing. Pixel processing is lower than in Graphics test 2 as there is no depth of field effect.
> 
> graphics 2
> 3DMark Fire Strike Graphics test 2 focuses on particles and GPU simulations. Particles are drawn at full resolution and dynamic particle illumination is enabled. There are two smoke fields simulated on GPU. Six shadow casting spot lights and 65 non-shadow casting point lights are present. Compute shaders are used for particle and fluid simulations and for post processing steps. Post processing includes a depth of field effect.
> 
> Physics
> 
> "3DMark Fire Strike Physics test benchmarks the hardware's ability to run gameplay physics simulations on the CPU. The GPU load is kept as low as possible to ensure that only the CPU is stressed. The Bullet Open Source Physics Library is used as the physics library for the test.
> 
> The test has 32 simulated worlds. One thread per available CPU core is used to run simulations. All physics are computed on CPU with soft body vertex data updated to GPU each frame.
> 
> Combined
> 3DMark Fire Strike Combined test stresses both the GPU and CPU simultaneously. The GPU load combines elements from Graphics test 1 and 2 using tessellation, volumetric illumination, fluid simulation, particle simulation, FFT based bloom and depth of field.
> 
> The CPU load comes from the rigid body physics of the breaking statues in the background. There are 32 simulation worlds running in separate threads each containing one statue decomposing into 113 parts. Additionally there are 16 invisible rigid bodies in each world except the one closest to camera to push the decomposed elements apart. The simulations run on one thread per available CPU core.
> 
> The 3DMark Fire Strike Combined test uses the Bullet Open Source Physics Library.
> 
> 
> 
> I have both profile, traditional slider & curve method.
> I think both are pretty much has the same results in 3dmark firestrike graphic scores.
> over +21000 graphic i have many crashes/small artifact, will not stable in 3dmark Firestrike Extreme-Stress test all the time. but I'm still keep trying though...
> 
> I think my card runs at IRQ 16(positive value) not MSI method (negative value). do they has some effect on benchmark scores?
> but in nvcp system information said IRQ is not used.
Click to expand...

I would suggest that you try enabling MSI. It can't hurt, and it means the card is not sharing an IRQ but using inline DMA management instead. Whether you see any improvement will depend on your rig; really fast RAM probably helps. Z170/Z270 don't usually see much, but older rigs seem to. No idea about Z77. The performance issues I have been seeing seem connected to IRQ/DMA at the memory controllers fighting with the CPU for access: they are used to manage memory I/O, and because IRQ is out of band, it can get a bit out of sync and cause memory data to become stale.

Each driver update will reset it back to the default. Export the registry key to a .reg file and it is easy to reapply later.

Wendell from Level1Techs and I just discovered that MSI gives a Ryzen 1800X with 1080 Ti cards a 5% boost in combined performance in Fire Strike.

https://level1techs.com/article/fastest-ryzen-1080x-system-world-2017-03-20
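For anyone wanting to try it, the usual way to enable MSI is a DWORD under the GPU's device key. A sketch of the command (Windows, elevated prompt; the device instance path is a placeholder you must replace with your own card's path from Device Manager → Details → Device instance path):

```shell
:: Enable message signaled interrupts for a GPU (Windows, run as admin).
:: <device-instance-path> is a placeholder for your card's own path.
:: Re-apply after driver updates, or export the key to a .reg file first.
reg add "HKLM\SYSTEM\CurrentControlSet\Enum\<device-instance-path>\Device Parameters\Interrupt Management\MessageSignaledInterruptProperties" /v MSISupported /t REG_DWORD /d 1 /f
```

After a reboot, a negative IRQ number for the card in Device Manager (View → Resources by type) indicates MSI mode is active.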


----------



## pez

Quote:


> Originally Posted by *gtbtk*
> 
> What motherboard/cpu/memory and 1070 model card are you running?
> 
> Is 1911 the clock after it was hot under load? What clock did it start out at and how did you overclock it?
> 
> With a fan curve, most 1070s can run in the 60s without any problems. The upmarket ones have no problem staying at less than 60.
> 
> Check to see if you have Micron or samsung memory on the card. If Micron, you should have bios version 86.04.50.00.xx installed. If it is a .26 bios you should get the bios update from your vendor as it solves the micron memory bug.
> 
> 1070 vram should OC to about +500 up to +800


It's an FE 1070 from ASUS, bought fairly early on. I'll have to check GPU-Z to see which memory it has. The memory wasn't causing instability so far, though, so I'm happy about that. And yeah, after some heat (I set the power target to 112% and the temp limit to 85 °C) it would hover between 1888 and 1911.


----------



## icold

My Strix runs at 2152 MHz after changing the BIOS to the OC edition, throttling to 2139 MHz max; with the original BIOS it ran at 2126 MHz max and throttled to 2050 MHz. The 170 W BIOS throttles a lot, so I changed to the 200 W one.


----------



## gtbtk

Quote:


> Originally Posted by *pez*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> What motherboard/cpu/memory and 1070 model card are you running?
> 
> Is 1911 the clock after it was hot under load? What clock did it start out at and how did you overclock it?
> 
> With a fan curve, most 1070s can run in the 60s without any problems. The upmarket ones have no problem staying at less than 60.
> 
> Check to see if you have Micron or samsung memory on the card. If Micron, you should have bios version 86.04.50.00.xx installed. If it is a .26 bios you should get the bios update from your vendor as it solves the micron memory bug.
> 
> 1070 vram should OC to about +500 up to +800
> 
> 
> 
> It's a FE 1070 from Asus. It was bought fairly early on. I'll have to check in GPU-z to see the memory. The memory wasn't causing instability so far, though, so I'm happy about that. And yeah, after some heat (I set power target to 112% and temp limit to 85C) it would hover between 1888-1911.
Click to expand...

Early FE cards will be Samsung. Given that temps become an issue with blower cards, maybe try undervolting your overclock.

What I would try, using Afterburner:

Leave voltage at 0.

Increase the power and temp sliders to max.

Open the curve and pull the 0.950 V point up to 2050. My card gives up at 2037, so you may or may not be stable. The frequency steps are every 12.5 MHz, so if 2050 is unstable, try 2037, 2025, 2012, 2000, etc. The curve to the right will stay flat.

Set memory to +500 (also try higher if you stay stable).

Set a fan curve to something tolerable based on the temps you observed.

See how that goes for you. Only going to 0.950 V should help keep temps lower while the performance should be pretty good. For some reason 0.950 V is a voltage point that adjusts some undocumented behavior that these cards like for good performance.

After you stabilize the 0.950 V point, if you want, you can try grabbing a second point to the right of the curve and increasing the clocks at a higher voltage value. You may get 2100 if you try, but you will also get more heat and only a small performance improvement. A memory OC usually gives you more FPS.
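On those 12.5 MHz steps: since Pascal boost clocks sit on discrete bins, the fallback ladder from any target is just successive 12.5 MHz decrements (the post's 2037/2025/2012/2000 are these values rounded down):

```python
# Pascal boost clocks land on ~12.5 MHz steps, so when a target clock
# is unstable, the next candidates to try are successive 12.5 MHz
# decrements. 2050 MHz is the example target from the post above.

def fallback_ladder(target_mhz, steps=5, bin_mhz=12.5):
    """List the target clock and the next lower bins to try."""
    return [target_mhz - bin_mhz * i for i in range(steps)]

print(fallback_ladder(2050))  # [2050.0, 2037.5, 2025.0, 2012.5, 2000.0]
```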


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> My strix run at 2152mhz after change bios to OC edition and throttling at 2139mhz max, with original bios works at 2126 mhz max and throttling at 2050mhz ¬¬''. Bios with 170w have much throttling, change to 200W.


Good result.


----------



## pez

Quote:


> Originally Posted by *gtbtk*
> 
> Early FE will be Samsung. Given that temps become an issue with blower cards maybe try under volting your overclock
> 
> what I would try is to use Afterburner,
> leave voltage at 0
> increase power and temp slider to max
> open the curve and pull the 0.950v point up to 2050. My card gives up at 2037 so you may or may not be stable. The frequency dividers are every 12.5Mhz. If 2050 is unstable, try 2037, 2025, 2012, 2000 etc The curve to the right will stay flat
> set memory to +500 (also try higher if you stay stable)
> set a fan curve to something that is tolerable based on the temps you observed.
> 
> See how that goes for you. Only going to .950 volts should help keep temps lower while the performance should be pretty good. For some reason .950v is a voltage point that adjusts some undocumented feature that these cards like for good performance.
> 
> After you stabilize the .950 point, if you want, you can try grabbing a 2nd point over to the right of the curve and increase the clocks at a higher voltage value on the curve. You mat get 2100 if you try but you will also get more heat and only a small performance improvement. Memory OC usually gives you more FPS.


I had the power target maxed (I thought it was 112%), but I'll have to try temps. I have a custom fan curve, so that should be fine. I don't know if I'll play with voltage at all, as the system is really meant to just be rock-solid stable (the same rules don't apply for mine). I do appreciate your help so far.


----------



## Darkermanz099

Here is my little build. I Plasti Dipped my Gigabyte GTX 1070 G1 and mounted it with the new Cooler Master PCI mount that came out about four days ago, using a PCI Express extender cable to mount it in this position, and I did not have any performance drops at all at 4K on my 43" LG TV. I love the look of the card mounted this way; the only thing I would still like to do is put some RGB lighting behind the fans inside the shroud.


----------



## icold

You can't play at 4K with a GTX 1070; you need 1080 SLI (minimum).


----------



## Darkermanz099

Quote:


> Originally Posted by *icold*
> 
> You cant play at 4k with gtx 1070, you need a 1080 SLI ( minimum)


Uh, no, since when? I could even play at 4K with my Gigabyte GTX 960 G1; only the FPS was crap with that card. I get at least 70+ FPS in almost every game at ultra settings, so no, you don't need a GTX 1080 at minimum.


----------



## icold

People who play want fluidity, not just resolution. If you want 4K at 60 FPS minimum, you need 1080 SLI, or better yet 1080 Ti SLI.


----------



## gtbtk

Quote:


> Originally Posted by *pez*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Early FE will be Samsung. Given that temps become an issue with blower cards maybe try under volting your overclock
> 
> what I would try is to use Afterburner,
> leave voltage at 0
> increase power and temp slider to max
> open the curve and pull the 0.950v point up to 2050. My card gives up at 2037 so you may or may not be stable. The frequency dividers are every 12.5Mhz. If 2050 is unstable, try 2037, 2025, 2012, 2000 etc The curve to the right will stay flat
> set memory to +500 (also try higher if you stay stable)
> set a fan curve to something that is tolerable based on the temps you observed.
> 
> See how that goes for you. Only going to .950 volts should help keep temps lower while the performance should be pretty good. For some reason .950v is a voltage point that adjusts some undocumented feature that these cards like for good performance.
> 
> After you stabilize the .950 point, if you want, you can try grabbing a 2nd point over to the right of the curve and increase the clocks at a higher voltage value on the curve. You mat get 2100 if you try but you will also get more heat and only a small performance improvement. Memory OC usually gives you more FPS.
> 
> 
> 
> I had power target maxed (I thought it was 112%), but I'll have to try temps. I have a custom fan curve, so that should be fine. I don't know if I'll play with voltage any as the system is really meant to just be rock solid stable (same rules don't apply for mine
> 
> 
> 
> 
> 
> 
> 
> ). I do appreciate your help so far
> 
> 
> 
> 
> 
> 
> 
> .
Click to expand...

Voltage adds temperature, so it's probably not ideal with a Founders card; with a big cooler or under water, it's a different story.

Obviously be sensible with temps. I always turn the limits up out of habit. If you run it at a low voltage, the temps may stay under control anyway; my MSI Gaming sits at about 48-49 °C at 0.950 V.


----------



## pez

Quote:


> Originally Posted by *icold*
> 
> You cant play at 4k with gtx 1070, you need a 1080 SLI ( minimum)


Quote:


> Originally Posted by *Darkermanz099*
> 
> euh no sins when, i could even play 4k quality with my gigabyte gtx 960 g1. only the fps was crap with that card but i get at least 70 fps+ in almost every game at ultra settings so no you dont need a gtx 1080 at minimum


Quote:


> Originally Posted by *icold*
> 
> People who play want fluidity, not just resolution. If you want 4K at 60 FPS minimum, you need 1080 SLI, or better yet 1080 Ti SLI.


Depends on what you play. You will run some things great at lower settings, but you won't run (most) modern triple-A titles at 4K 60+ with a single 1070, no.
Quote:


> Originally Posted by *gtbtk*
> 
> voltage adds temp so probably not ideal with a founders. big cooler or under water, different story.
> 
> Obviously be sensible with temps. I always turn them up out of habit. If you run it with a low voltage, the temps may stay under control anyway. my MSI Gaming is at about 48-49 deg at .950


Yeah, I hadn't thought about it, but setting the temp limit to 85 °C probably just makes the clocks drop a bit faster than normal, I assume. Nonetheless, it's worth a shot to see what it does.


----------



## HowYesNo

Hey guys. Well, my Gainward 1070 died and I got a new one as a replacement, same model.
I inspected the card and noticed something bothering me: the thermal pad at the VRM is crumpled and not contacting the one VRM at the edge of the PCB.
It is folded toward the inside of the cooler. The previous card that died got quite hot in that area (close to the power connector), and now this one looks semi-defective.
I will contact Gainward directly about this. Do I return it to the store I got it from? That's going to be a hassle, as the card is running fine (no gaming yet).
Summer is here and temps are going up; it will definitely die after a few hours of gaming at a room temperature of 30 °C.
It is this model of GTX 1070.
Some photos taken with my phone.


----------



## Nukemaster

The VRM pad on my Asus is too small to cover the full chip (75-80%), but your issue looks worse for sure.

The board will take quite a bit of the heat, but a folded thermal pad is an issue for sure. Maybe they will be able to just send you a new one so you do not have to send the card back.


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> I would suggest that you try enabling MSI. It cant hurt and it will mean it is not sharing an IRQ but using inline dma management. It will depend on your rig if you will see any improvements or not. really fast ram probably helps. z170/270 don't usually see much but older rigs seem to. No idea about z77. The performance issues I have been seeing seem to have a connection to IRQ/DMA at the memory controllers fighting with the CPU for access. They are used to manage memory IO and because IRQ is out of band, it can get a bit out of sync and cause memory data to become stale.
> 
> Each time you update drivers it will reset it back to the default. Export the registry key to a reg file and it is easy to reapply later.
> 
> Wendel from Level1techs and I just discovered that MSI gives Ryzen 1800X with 1080Ti cards a 5% boost in combined performance in Firestrike.
> https://level1techs.com/article/fastest-ryzen-1800x-system-world-2017-03-20


Tried switching to MSI. I'm not sure if there's a gain; I think it still doesn't get me a perfectly stable 21,000+ graphics score in Firestrike, but thanks for the tips anyway.

I can get a constant graphics score of 21,000+ (9,900+ Extreme, 6,700+ Time Spy)
if I put the core at +88 and +645MHz on the memory, but in the Firestrike stress test it crashes and the memory shows small artifacts.

Btw, do you run the 3DMark Firestrike Extreme stress test to at least 97%?
It's useful not only for detecting crashes but also for detecting memory artifacts.
You need to watch the whole run from start to end for any small artifacting;
usually it will pop out some weird graphical glitch, such as green sparkle spots or a flashing screen.

Im testing with newest driver btw 378.92
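For anyone else wanting to try the MSI-mode tweak discussed above, it comes down to a single registry value under the GPU's device instance. This is only a sketch: the `VEN_10DE&DEV_1B81...` instance path below is a placeholder, so find your own card's path in Device Manager (Details tab, "Device instance path") before building a .reg file like this, and export the key first so you can reapply it after a driver update resets it.

```reg
Windows Registry Editor Version 5.00

; PLACEHOLDER instance path -- replace everything after \PCI\ with your
; GPU's actual device instance path from Device Manager.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\PCI\VEN_10DE&DEV_1B81&SUBSYS_00000000&REV_A1\0000000000000000\Device Parameters\Interrupt Management\MessageSignaledInterruptProperties]
; 1 = message-signaled interrupts, 0 = legacy line-based interrupts
"MSISupported"=dword:00000001
```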


----------



## gtbtk

Quote:


> Originally Posted by *gtbtk*
> 
> voltage adds temp so probably not ideal with a founders. big cooler or under water, different story.
> 
> Obviously be sensible with temps. I always turn them up out of habit. If you run it with a low voltage, the temps may stay under control anyway. my MSI Gaming is at about 48-49 deg at .950
> 
> 
> 
> Yeah, I didn't think about it, but setting the temp limit to 85C only probably influences the clocks to drop a bit faster then normal I assume. Nonetheless it's worth a shot to see what it does
> 
> 
> 
> 
> 
> 
> 
> .

You have nothing to lose.


----------



## Blackfirehawk

Quote:


> Originally Posted by *HowYesNo*
> 
> hey guys. well my gainward 1070 died, i got new as replacement, same model.
> i inspected the card and notice something bothering. seems that the thermal pad one at the VRM is crumpled, and not contacting the 1 VRM at the edge of the pcb.
> it is folded toward inside of cooler. previous card that died got quite hot at that area (close to power connector), and now it looks semi defective.
> will contact gainward directly on this, do i return it to the store i got it? that's going to be a hassle as the card is running fine (didn't do gaming yet).
> summer is here temps are going up, it will definitely die after a few hours of gaming with room temp of 30C.
> it is this model GTX 1070
> some photos taken wit mobile.


I have the same card,
but I have changed the thermal pads and the thermal compound on the chip.

The card runs about 7-8 degrees Celsius cooler than with Gainward's original thermal compound.
The thermal pads on the VRM are 2mm.

You can flash the Palit GTX 1070 GameRock Premium Edition bios to it without problems.

Getting a stable 2050MHz core boost / 9500MHz memory (Micron).


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I would suggest that you try enabling MSI. It cant hurt and it will mean it is not sharing an IRQ but using inline dma management. It will depend on your rig if you will see any improvements or not. really fast ram probably helps. z170/270 don't usually see much but older rigs seem to. No idea about z77. The performance issues I have been seeing seem to have a connection to IRQ/DMA at the memory controllers fighting with the CPU for access. They are used to manage memory IO and because IRQ is out of band, it can get a bit out of sync and cause memory data to become stale.
> 
> Each time you update drivers it will reset it back to the default. Export the registry key to a reg file and it is easy to reapply later.
> 
> Wendel from Level1techs and I just discovered that MSI gives Ryzen 1800X with 1080Ti cards a 5% boost in combined performance in Firestrike.
> https://level1techs.com/article/fastest-ryzen-1800x-system-world-2017-03-20
> 
> 
> 
> Tried switched to MSI, im not sure if there's a gain, but i think still not help me gain a perfect stable +21,000 graphics scores FS, but thx for the tips anyway.
> 
> I can get a constant graphic scores +21000, +9900 extreme, +6700 Timespy
> if I put core +88 and +645mhz to the memory, but in firestrike stress test, it would crash and memory get a small artifacts.
> 
> Btw, do you use 3d mark Firestrike Extreme Stress test at least 97%?
> it's not only useful for detect crashes, but also to detect a memory artifacts
> But you need to starring the whole run from start to end if there's any small artifacting,
> Usually it will popout some weird graphical glitch such as green sparkles spot pop out, or flashing screen, etc.
> 
> Im testing with newest driver btw 378.92

Try switching between balanced and performance power modes and back again in Windows and see what happens.

I have used the stress tests in the past, but I am usually playing around at the ragged edge to see what I can wring out of the board and work out why Pascal behaves the way it does, so on most stress tests my OCs usually fail after a couple of loops.

Firestrike is more forgiving of vram errors than Time Spy is, in my experience.


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> try switching between balanced and performance power modes and back again in windows see what happens.
> 
> I have used the stress tests in the past but I am usually playing around at the ragged edge to see what I can wring out of the board and work out why Pascal behaves the way it does so on most stress tests my OCs I usually fail after a couple of loops
> 
> Firestrike is more forgiving on vram errors than time spy is in my experience.


I run 3DMark with the CPU power plan at High Performance and the GPU at Prefer Maximum Performance.

Graphics scores of 20.8xx-20.9xx in Firestrike, 98xx Extreme, and 66xx Time Spy are the most stable I can get currently in both the benchmark and the stress test.

A 20.8xx graphics score is only about 200 points short of 21,000, which is probably only 1-2 FPS.

But I guess overall it's not bad, because that is about the most a mainstream 1070 can do.
A perfectly stable 21,000-22,000 probably takes a lucky silicon chip; probably only 10-20% of chips can do it.


----------



## HowYesNo

Quote:


> Originally Posted by *Blackfirehawk*
> 
> i have the same card..
> but i have changes the Thermal Pads and the Thermal Compound on the Chip..
> 
> Cards runs about 7-8 degree celsius cooler as with gainwands original thermal compount..
> the thermal Pads on the VRM are 2mm
> 
> you can Flash the Palit GTX 1070 GameRock Premium Edition Bios to it without problems
> 
> getting stable 2050mhz core Boost /9500 mhz memory (micron)


Are all thermal pads on this card 2mm, or only the ones on the VRM? Which ones did you use? Can I get a link?


----------



## RyanRazer

Quote:


> Originally Posted by *Darkermanz099*
> 
> here is my little build, plasti dipt my gigabyte gtx 1070 g1 and mounted it with the new coorer maaster pci mount that came out like 4 days ago, i used a pci express extender cable to mount it in this position and did not have any preformance drops at all at 4k quality on my 43" lg tv. love the looks of the card mounted this way but the only thing that i would like to do is put some rgb lighting inside the gpu behiend the fans inside the casing.


That vertical GPU looks sick! GJ


----------



## icold

Quote:


> Originally Posted by *RyanRazer*
> 
> That vertical gpu looks sick ! GJ


Put this RAM in quad channel.


----------



## Darkermanz099

Quote:


> Originally Posted by *icold*
> 
> PUT this ram on quad channel


I did try that, but I think one of my RAM sockets is broken or not functioning right.
My board doesn't detect a RAM stick in one of the right-hand slots, so I run my RAM in 2x dual channel.


----------



## Blackfirehawk

Quote:


> Originally Posted by *HowYesNo*
> 
> are all thermal pads on this card 2mm or only on VRM. which ones did u use, ca i get a link?


Cooler Master MasterGel Maker for the GPU,
and all the thermal pads are from Thermal Grizzly.

I bought them on Amazon.


----------



## HowYesNo

Quote:


> Originally Posted by *Blackfirehawk*
> 
> Cooler Master MasterGel Maker for the GPU
> and all Thermal Pads are from Thermal Grizzly
> 
> i bought them on Amazon..


Are all the pads you used 2mm thick? Both on the memory and the VRM?


----------



## xixou

3 way sli gtx 1070 ^^

http://users.skynet.be/xixou/3_way_msi_bridge.jpg


----------



## patriotaki

I bought the 1070 G1.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> try switching between balanced and performance power modes and back again in windows see what happens.
> 
> I have used the stress tests in the past but I am usually playing around at the ragged edge to see what I can wring out of the board and work out why Pascal behaves the way it does so on most stress tests my OCs I usually fail after a couple of loops
> 
> Firestrike is more forgiving on vram errors than time spy is in my experience.
> 
> 
> 
> I run 3d mark when CPU Power at High performances mode, and GPU at prefer max performances.
> 
> Graphics score 20.8xx-20.9xx Firestrike, 98XX Extreme, 66XX Timespy, that's the most stable I can get currently on benchmark & stress test..
> 
> Graphics scores 20.8xx, minus +200 points to +21000, it is probably only 1-2 FPS.
> 
> But I guess overall it's not bad because that is the most mainstream 1070 can get.
> +21000-22000 perfect stable probably luck silicon chip, probably only 10-20% chip can do.

So, did you try running balanced power mode to see what happens?

I am not suggesting that you always leave it there. I am suggesting taking the view that if you experiment with things, you learn something new. Pay attention to what happens to the Physics and Combined scores as well.

For day-to-day use, there is little point tuning a 1070 to a 21,500 graphics score if it kills the Physics and Combined scores. It may increase max frame rates in game in the areas that don't have much going on, but it will also reduce the minimum frame rate, creating a stuttery mess in the heavy action scenes where you really want the card to perform. The trick is to understand why a setting behaves in a particular way; then you can tune your rig in different ways depending on what you want to use it for.

Firestrike can be played like a video game to get a high score, but it can also be used as a tool to understand why changes in config behave the way they do.

If you understand that, you can tune your rig to do anything you like within the limitations of the hardware. I am pretty sure that a 22,000 graphics score will require sub-zero cooling.

All the Ryzen tests are actually proving right now that things about CPUs and GPUs and how they affect one another, assumed to be fact for years, are in actual fact not correct. All the talk of CCX thread switching and "Ryzen bad for gaming" is just plain wrong, but we are only just starting to see it now because 1080 Ti-level cards are finally so powerful that they are starting to exceed the I/O limits of current chips. It also shows that the "trusted" tech media are not the tech gurus they would like you to believe.

Having x16 PCIe 3.0 sounds great, with roughly 15.75GB/s of theoretical bandwidth. What we are seeing now is that CPUs cannot output data to the graphics card at anything like that rate; in fact, CPUs have trouble writing at much more than 8GB/s. That is why PCIe 2 and PCIe 3 are not really producing any major performance differences. There is an I/O ceiling for all CPUs. The current difference between Ryzen and Intel is that the Intel chips can output fractionally more data to feed the GPU than Ryzen chips can right now.

I don't know the exact numbers, but it is probably something like 8GB/s vs 7.75GB/s at room temperature with the current tuning of each platform. Can it be tuned out to balance the Ryzen? Possibly. Bios and microcode are being updated at a rapid pace. We won't really know unless someone actually tries looking in that area and experimenting, because there are no instruction books. That requires the same inquiring mindset as the one I mention above: try a setting and see what effect it has on the entire environment.
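For reference, the theoretical per-direction ceiling of a PCIe 3.0 x16 link that the CPU output figures above are being compared against works out as (8 GT/s per lane, 128b/130b encoding, 8 bits per byte):

```latex
16~\text{lanes} \times 8\,\frac{\text{GT}}{\text{s}} \times \frac{128}{130} \times \frac{1~\text{B}}{8~\text{b}} \approx 15.75\,\frac{\text{GB}}{\text{s}}~\text{per direction}
```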


----------



## gtbtk

Quote:


> Originally Posted by *xixou*
> 
> 3 way sli gtx 1070 ^^
> 
> http://users.skynet.be/xixou/3_way_msi_bridge.jpg


Have you tried to play any games with that yet?


----------



## xixou

Quote:


> Originally Posted by *gtbtk*
> 
> Have you tried to play any games with that yet?


Yes for months ^^

I explain how to enable tri way SLI on Pascal for games in my video over there:






The trick is to make the Nvidia drivers think that the game is actually enabled for 3- or 4-way SLI,
like the benchmarks are (Catzilla, Unigine, 3DMark, ...).

So open nvidiaProfileInspector.exe, pick your game (or the general entry),
click the button to show unknown settings (the magnifying-glass icon),
go to the entry 0x10FD4C5F and select the Catzilla value (data is 0x01F296C1).
Apply the changes (top right icon), then play your game in 3- or 4-way SLI ^^


----------



## gtbtk

Quote:


> Originally Posted by *xixou*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Have you tried to play any games with that yet?
> 
> 
> 
> Yes for months ^^
> 
> I explain how to enable tri way SLI on Pascal for games in my video over there:
> 
> 
> 
> 
> 
> 
> The trick is to make think nvidia drivers that the game is actually enabled for 3 or 4 way sli,
> meaning like for the benchmark (catzilla, uningine, 3dmark, ...).
> 
> So open nvidiaProfileInspector.exe, pick your game (or the general entry),
> click the right button show unknown settings (glass icon),
> go the entry 0x10FD4C5F and select catzilla,... (data is 0x01F296C1).
> Apply changes (top right icon) then play your game in 3 or 4 way sli ^^

That is really interesting.

I have Profile Inspector and I have noticed the settings for SLI in there but I never even considered you could use it for that. Great Discovery!

Do you get proper SLI performance boosts for the games that traditionally do not seem to scale or negatively scale? Any other catches or disadvantages?


----------



## zipper17

@gtbtk

Other methods, such as a shunt resistor mod, voltage modding, and a more powerful cooling mod, may also help as a last resort for overclocking.
Quote:


> Originally Posted by *gtbtk*
> 
> That is really interesting.
> 
> I have Profile Inspector and I have noticed the settings for SLI in there but I never even considered you could use it for that. Great Discovery!
> 
> Do you get proper SLI performance boosts for the games that traditionally do not seem to scale or negatively scale? Any other catches or disadvantages?


Pascal 4way Functional SLI
http://forums.guru3d.com/showthread.php?t=409468

You need to edit the profile for each game, and use a special type of SLI bridge to avoid the warning message in NVCP.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> @gtbk
> 
> Some other method such Shunt mod resistor, voltage modding, & powerful cooling mod, that maybe also will help for last effort overclocking..
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> That is really interesting.
> 
> I have Profile Inspector and I have noticed the settings for SLI in there but I never even considered you could use it for that. Great Discovery!
> 
> Do you get proper SLI performance boosts for the games that traditionally do not seem to scale or negatively scale? Any other catches or disadvantages?
> 
> 
> 
> Pascal 4way Functional SLI
> http://forums.guru3d.com/showthread.php?t=409468
> 
> you need to edit the profile for each game, and use special type of SLI bridge to avoid warning message in nvcp.

You can certainly try all of those modifications. As I said, there is no instruction book, only details that have been discovered because someone tried something new at some stage.

You can water cool the card; that will help keep temps lower and let it run faster. You will get a little more performance and more stable high clocks, but it will not be huge. You can also use something like dry ice or LN2 to cool the card, with voltage mods and condensation protection, but if you do that, the only thing you can really use it for is benchmarks. You won't be able to use it to play games or as a daily driver.

I understand how the SLI profile/Nvidia Inspector thing works; I just don't use SLI and as a result had never given it any thought. I had also not heard anyone discussing it before, and wondered if the "non-scaling" SLI games actually start scaling after changing a profile setting that is normally hidden.


----------



## xixou

Have a look at my YouTube channel; I tested a few games. Tonight I will test Mass Effect Andromeda. I use a metal bridge from MSI to increase the SLI bandwidth; the frequency is increased.


----------



## orbitalwalsh

Phanteks Strix Block loaded



ran this under air

http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/spy/P/2191/1090/7199?minScore=6400&cpuName=Intel Core i5-7600K&gpuName=NVIDIA GeForce GTX 1070

http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/fs/X/2191/1090/9899?minScore=8800&cpuName=Intel Core i5-7600K&gpuName=NVIDIA GeForce GTX 1070

need to regain top spot in TimeSpy!!!!


----------



## Faded

I just ordered an MSI GTX 1070 Sea Hawk EK X to replace a pair of 7970s... I can't wait to add this to my loop!

https://www.newegg.com/Product/Product.aspx?Item=N82E16814127956&ignorebbr=1


----------



## spin5000

Quote:


> Originally Posted by *xixou*
> 
> 3 way sli gtx 1070 ^^
> 
> http://users.skynet.be/xixou/3_way_msi_bridge.jpg


Nice. I wish games/hardware/APIs were designed in a way where multi-GPU scaling just worked, so it didn't need to be explicitly coded in by the game devs and/or need Nvidia to create a profile; sort of like how games in general work with a single GPU even without a profile (not always optimized, but you know what I mean). It baffles me even more for VR / 3D Vision use, where there are two screens / two images and you'd expect almost perfect GPU scaling with each GPU driving one screen. So much potential...


----------



## b0uncyfr0

I'm averaging 2050-2073 on the core at default voltage and +150. I mostly achieved this by setting a custom fan curve to stay under 60 degrees, and in my FT02 case it doesn't get too loud either. But ideally I would like to hit 2100 core. What should I try first:

1) Gaming Z bios (also, which one is newer on GPU-Z? Hard to tell from the upload date)
2) Start playing with the OC slider
3) Something I missed?


----------



## xixou

Mass Effect Andromeda is set to 2 GPUs by default, but changing the settings with nvidiaProfileInspector can load my 3 GTX 1070s ^^

Will now start the game to see if there is any flickering or other issues.

http://users.skynet.be/xixou/andromeda_a.jpg

http://users.skynet.be/xixou/andromeda_b.jpg

http://users.skynet.be/xixou/andromeda_c.jpg


----------



## gtbtk

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Im averaging 2050 -2073 on the core at default voltage and +150. Mostly achieved this by setting a custom fan curve to stay under 60 degree's and in my FT02 case it doesnt get too loud either. But ideally i would like to hit 2100 core. What should i try first:
> 
> 1) Gaming Z bios ( also which one is newer on gpu-z - hard to tell from uploaded date)
> 2) Start playing with the OC slider
> 3) Something i missed?


You don't really need the Gaming Z bios. If you are overclocking a Gaming X using +150, then with the Z bios you will not be able to get past about +100 on the core slider. It won't actually make the card overclock higher (assuming you have a Gaming X?); it will just run the card a bit faster at stock settings when you don't play with overclock settings.

Leave the core slider at +150 and try increasing the 1.063v point on the curve to 2100 or 2114MHz.

You will actually get better performance if you OC the memory as much as you can while keeping it stable, even if you have to sacrifice a few MHz at the high end of the core.

If you have a card with Micron memory, you need to be using an 86.04.50.00.xx version of the bios.

If you have Samsung memory, the 86.04.1E.00.xx version is fine. You even have the option of grabbing the "reviewer bios", which is clocked even higher and runs in OC mode by default.

The regular Z bios default clock is 1633MHz, with the memory getting a default +50 OC compared to the X; the reviewer bios is 1658MHz.


----------



## gtbtk

Quote:


> Originally Posted by *xixou*
> 
> Mass effect andromeda is set to 2 GPU by default but changing the settings with nvidiaprofileinspector can load my 3 gtx 1070 ^^
> 
> Will now start the game to see if there is no flickering or other stuff.
> 
> http://users.skynet.be/xixou/andromeda_a.jpg
> 
> http://users.skynet.be/xixou/andromeda_b.jpg
> 
> http://users.skynet.be/xixou/andromeda_c.jpg


I was actually curious about GTA V and tomb raider as per this, 




which is all I have seen up until your post.


----------



## icold

Does anyone know about a bios mod for Z77 mobos to raise the BCLK further?


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> I was actually curious about GTA V and tomb raider as per this,
> 
> 
> 
> 
> which is all I have seen up until your post.


check this dude channel:
https://www.youtube.com/user/ThirtyIR/videos

GTA5 4 way TitanXP




ROTR 4way TitanXP




It's no longer officially supported, but yeah, I think it still works. You need to tweak game profiles, etc.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I was actually curious about GTA V and tomb raider as per this,
> 
> 
> 
> 
> which is all I have seen up until your post.
> 
> 
> 
> check this dude channel:
> https://www.youtube.com/user/ThirtyIR/videos
> 
> GTA5 4 way TitanXP
> 
> 
> 
> 
> ROTR 4way TitanXP
> 
> 
> 
> 
> It's no longer officially supported but yeah i think it still works. You need to tweaks game profiles, etc.

I saw the earlier post about the tweak, and I am familiar with Nvidia Profile Inspector to do it. It's just that the last I heard, Tomb Raider and GTA gave a black screen with more than 2 cards. Glad those people who like to play with SLI found a workaround.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Someone know about bios mod for Z77 mobos to up more BCLK?


It is unlikely you can mod the bios for that. You are limited by the frequency that the memory and PCIe bus can handle.


----------



## jdj9

So... I got 2 x 1070 in SLI... one uses Micron and the other uses Samsung!!! I bought the 2nd one afterwards...

Let's say I flash the bios and it works...
Quote:


> Originally Posted by *RJTablante*
> 
> Lo and behold, Zotac has pulled through! I'm passing on the email that was sent to me to you all so that you can also reap the benefits of the updated Zotac 1070 series bios to address the Micron memory issue.
> 
> I flashed it already and honestly it's like I have a new card!
> 
> As this is straight from Zotac, I take no responsibility for any damage done, so please follow their instructions, ensure you flash the proper BIOS meant for your card, and get ready for the 1070 you've always wanted
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _Hi,
> 
> Here are the temporary links to our FTP, for the respective GTX1070 Cards affected by the "Micron RAM " issue.
> 
> Please select the correct BIOS for your card based on SKU. The ZIP filenames are self-explaining and in EXE format so this will execute when you double click. This bios only supports windows operating system.
> 
> SKU : ZT-P10700E-10S
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_(ZT-P10700E-10S__288-1N435-200Z8-201Z8)_Micron_RAM_(2016_11).zip
> 
> SKU : ZT-P10700F-10P
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_(ZT-P10700F-10P__288-1N424-200Z8)_Micron_RAM_(2016_11).zip
> 
> SKU : ZT-P10700I-10P
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_(ZT-P10700I-10P__299-1N424-300Z8)_Micron_RAM_(2016_11).zip
> 
> SKU : ZT-P10700C-10P
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_AMP_(ZT-P10700C-10P__288-1N435-100Z8-101Z8)_Micron_RAM_(2016_11).zip
> 
> SKU :ZT-P10700B-10P
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_AMP_Extreme_(ZT-P10700B-10P__288-1N435-000Z8-001Z8)_Micron_RAM_(2016_11).zip
> 
> SKU :ZT-P10700A-10P
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_Founders_(ZT-P10700A-10P__288-1N424-000Z8)_Micron_RAM_(2016_11).zip
> 
> SKU : ZT-P10700G-10M
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_Mini_(ZT-P10700G-10M__288-1N445-030Z8)_Micron_RAM_(2016_11).zip
> 
> SKU : ZT-P10700K-10M
> http://support.pcpartner.com/support/temp/VBIOS_GTX1070_Mini_(ZT-P10700K-10M__288-1N445-130Z8)_Micron_RAM_(2016_11).zip
> 
> +++++++++++++++
> Important Remark :
> +++++++++++++++
> - The VBIOS files are made into .EXE files, for 32bit and 64bit Windows respectively. Run the .EXE file in the suitable Windows type.
> - Check to make sure the card is really built from "Micron RAM" before starting to do the VBIOS change !!!
> - These VBIOS changes are "One-Way" only, no return path to go back.
> 
> Yours,
> 
> Fred
> ZOTAC Technical Support_


I haven't yet tried OC'ing, but I was about to until I discovered this issue with Micron vram. If I update and it works... should I still SLI after, or am I better off without it?


----------



## jdj9

Quote:


> Originally Posted by *asdkj1740*
> 
> try the latest nvidia driver if you are using .50 micron fixed bios


Wait, you are saying that this issue could have been resolved by Nvidia drivers without having to update the GPU bios? I have 2 x 1070, and one of them uses Samsung vram and the other Micron vram...


----------



## jdj9

Quote:


> Originally Posted by *gtbtk*
> 
> Yes. The micron memory, when ramping up from a low voltage sleep state to the OC frequency would try to operate before the memory VRMs could increase the voltage to the chips leading to the chips trying to operate at high frequency without the required voltage to support it and the memory would basically give up with white checkerboard patterns over the screen before BSODing the PC.


I have 2 x 1070 in SLI (not overclocked); one uses Samsung vram, the other Micron. I was about to try the update before proceeding to overclock... Considering what you are saying, should I not patch the bios? Is this fixed?


----------



## jdj9

Quote:


> Originally Posted by *Vitto*
> 
> Thank you! I built it a few months ago and just last week added the 2nd 1070 for SLI, now, the original came with samsung memory, this one with micron (hope that's ok and i don't have to go look for a different one). i updated the vbios on them to the newest one which is suppose to improve cards with micron memory. Here are some screenshots.


Hi!

I am also using 1070s in 2-way SLI, one with Samsung vram and the other with Micron vram. Should I patch the bios of the Micron one? I still haven't OC'ed, but I was about to...


----------



## jdj9

Quote:


> Originally Posted by *Vitto*
> 
> Thank you! I built it a few months ago and just last week added the 2nd 1070 for SLI, now, the original came with samsung memory, this one with micron (hope that's ok and i don't have to go look for a different one). i updated the vbios on them to the newest one which is suppose to improve cards with micron memory. Here are some screenshots.


I also have 2 x 1070 in SLI, Zotac Founders Edition; one uses Samsung vram, the other Micron. I was about to OC, but I wanted to make sure the "unofficial" patch is stable. If something goes wrong... there is no turning back. Did you have any issues?


----------



## Vitto

I definitely would before any OCing; it takes the Micron vram one close to the stability and performance of the Samsung one.


----------



## zipper17

Quote:


> Originally Posted by *jdj9*
> 
> So... i got in SLI 2 x 1070 .... one uses Micron.. and the other uses Samsung!!! I bought the 2nd afterwards....
> 
> Let's say i flash the bios and it works....
> \
> 
> I got 2 x GTX 1070 in SLI.... one uses Micron and the other Samsung! ffs... im so unlucky. I haven't yet tried OC'ing but i was about to until i discovered this issue with Vram Mircon. If i update and it works... should i still SLI after or am i better off without it?


The Micron memory bug on the 1070 was a problem in the early months, but since there is a bios update for it, Micron memory is not a problem anymore, cmiiw.

Try updating your Micron card to the latest bios.

Run the 3DMark Firestrike Extreme/Time Spy stress test after overclocking both cards.
Your goal: find your best possible overclock settings with no crashes, no memory artifacts, and frame-rate stability of at least 97%.

Even the smallest artifact is not good; you need to watch the entire run to make sure no frame has artifacts.

Usually a 1070 can do +500MHz on the memory clock. See what happens, and keep up the trial and error until you find the best settings for core clock, memory clock, voltages, and fan curve.

Or you can just test them individually to see the limits of the Micron and Samsung cards before you SLI them.


----------



## Vitto

I do +100 core, +600 memory, and 112% power on my SLI 1070 setup (Samsung/Micron) with the latest bios. I could push it to maybe +150, +750, and 120% power, but I noticed that after +600 you don't get much gain and temps start to get much higher, so there's really no point. With this I have my GPUs at maybe 65-68° air-cooled at max load.


----------



## Vitto

Quote:


> Originally Posted by *xixou*
> 
> 3 way sli gtx 1070 ^^
> 
> http://users.skynet.be/xixou/3_way_msi_bridge.jpg


Xixou what liquid coolers are you running on your 1070s?


----------



## gtbtk

Quote:


> Originally Posted by *jdj9*
> 
> So... i got in SLI 2 x 1070 .... one uses Micron.. and the other uses Samsung!!! I bought the 2nd afterwards....
> 
> Let's say i flash the bios and it works....
> 
> \
> 
> I got 2 x GTX 1070 in SLI.... one uses Micron and the other Samsung! ffs... im so unlucky. I haven't yet tried OC'ing but i was about to until i discovered this issue with Vram Mircon. If i update and it works... should i still SLI after or am i better off without it?


There is nothing wrong with Micron memory on 1070 cards. There was a bug that affected cards with Micron memory on the 86.04.26.00.xx BIOS version family, across all brands of cards.

All vendors have released the 86.04.50.00.xx BIOS version as an update that fixes the bug that caused that issue. At stock speeds, 95% of cards never experienced the bug anyway, as most cards needed the memory to be overclocked to around +400 before hitting the crashing problem.

If the Micron memory card still has the .26 version BIOS installed, get the update and take it to the .50 version; then there are no issues at all and you can SLI the cards all day long.


----------



## gtbtk

Quote:


> Originally Posted by *jdj9*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Yes. The micron memory, when ramping up from a low voltage sleep state to the OC frequency would try to operate before the memory VRMs could increase the voltage to the chips leading to the chips trying to operate at high frequency without the required voltage to support it and the memory would basically give up with white checkerboard patterns over the screen before BSODing the PC.
> 
> 
> 
> Got 2 x 1070 in SLI (not overclocked) one uses samsung vram the other micron vram. I was about to try the update before proceeding to overclock.... Considering what you are saying... i should not patch the bios? is this fixed?
Click to expand...

It was fixed last November when the BIOS update was released.

Updating the card is trivial; the BIOS update utility automates everything if you actually have to take the .26 BIOS to the .50 BIOS. New cards manufactured since then all ship ex-factory with the new BIOS installed. You just need to check which BIOS version you have on your Micron memory card with GPU-Z.

If you do need to update the BIOS, just don't turn your computer off during the 20-30 seconds it takes to run the update and write the new BIOS to the card. It will tell you whether it was successful. If it tells you it failed, do not reboot your PC; run the update utility again. Make sure that you run the utility as administrator.


----------



## bigmac121

Does it run cooler this way?


----------



## gtbtk

Quote:


> Originally Posted by *bigmac121*
> 
> Does it run cooler this way?


Which way? By not increasing the voltage slider in AB, or by undervolting with the curve? Yes, it will run a bit cooler than it does with everything on maximum.


----------



## Nawafwabs

I have a 1070 O8G.

The fan doesn't work if ASUS GPU Tweak isn't running,

so I get FPS drops until I run ASUS GPU Tweak;

after that I hear the fan go to high speed.

I don't know what I should do.


----------



## icold

Quote:


> Originally Posted by *Nawafwabs*
> 
> I have 1070 o8g
> 
> Fan doesn't work if Asus GPU tweak not running
> 
> so I got drop fps until I run Asus GPU tweak
> 
> after that I hear fan go high speed
> 
> I don't know what should I do


MSI Afterburner is much better than ASUS GPU Tweak.


----------



## gtbtk

Quote:


> Originally Posted by *Nawafwabs*
> 
> I have 1070 o8g
> 
> Fan doesn't work if Asus GPU tweak not running
> 
> so I got drop fps until I run Asus GPU tweak
> 
> after that I hear fan go high speed
> 
> I don't know what should I do


You know that with the default fan curve built into the card, the fans will not spin until the card heats up to 60 degrees? It is designed that way for silent operation.

GPU Tweak or Afterburner can be used to manually set the fans to a fixed speed, or to set a custom curve so that the fans spin slowly at low temps and progressively get faster as the card gets hotter.

And yes, Afterburner is better than GPU Tweak. They are both compatible with ASUS Strix cards, but only run one at a time.
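A custom curve like the one described is just a piecewise mapping from temperature to fan duty, with a zero-RPM zone at the bottom. A minimal sketch of the interpolation, with made-up curve points (the real values come from whatever you set in Afterburner or GPU Tweak):

```python
# Hypothetical curve points: (temp °C, fan %). Below the first point
# the fans stay off, mimicking the stock zero-RPM behavior.
CURVE = [(40, 0), (50, 30), (60, 50), (70, 70), (80, 100)]

def fan_speed(temp_c, curve=CURVE):
    """Linearly interpolate fan duty (%) from a list of curve points."""
    if temp_c <= curve[0][0]:
        return 0                 # zero-RPM zone
    if temp_c >= curve[-1][0]:
        return curve[-1][1]      # pinned at the top of the curve
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(35))  # 0
print(fan_speed(55))  # 40.0
print(fan_speed(90))  # 100
```

The stock behavior complained about above is just this function with the zero zone extending all the way to ~60°C.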


----------



## Nawafwabs

Quote:


> Originally Posted by *gtbtk*
> 
> You know that with the default fan curve built into the card, the fans will not spin until the card heats up to 60 degrees? It is designed that way for silent operation.
> 
> GPU tweak or afterburner can be used to manually turn on fan to fixed speed or be used to set a custom curve so that the fans spinn slowly at low temps and progressively get faster as the card gets hotter.
> 
> And Yes Afterburner is better than GPU tweak. they are both compatible with Asus Strix cards but only run one at a time


A custom BIOS doesn't solve it?


----------



## gtbtk

Quote:


> Originally Posted by *Nawafwabs*
> 
> custom bios don't solve it?


There is no BIOS editor for Pascal cards.

You can try cross-flashing with a BIOS from another brand, but they all have zero-speed fans as well. It is a feature of the Pascal cards.


----------



## raisethe3

I have a question for all you owners: is this card the best value for the money, especially when compared to the GTX 1080, 1080 Ti, and GTX Titan X, for someone like me who games on a 1680x1050 monitor? In doing my own research, I find that the only things that separate them are the core clock, boost clock, TFLOPS, and memory bandwidth. According to results, I see they are separated by what, about 30 watts of power? I have about a $400 budget, so this might be the card? Or should I save up more? As for how many FPS apart they are, I don't think it's a wide margin. Unless someone wants to correct me on this one.

I've got games that I want to play such as Resident Evil 7, Rise of the Tomb Raider, and Battlefield 1, which I don't think my current graphics card can handle (look at my sig rig). I also need to catch up on older games such as Fallout: New Vegas (never played it), Alice: Madness Returns, Splinter Cell: Conviction, Borderlands 1 & 2, and Ghost Recon.

Any thoughts/inputs would be deeply appreciated.


----------



## zipper17

Quote:


> Originally Posted by *raisethe3*
> 
> I have a question for you all owners, is this card the best value card for the money? Especially when compared to the GTX 1080, 1080Ti, GTX Titan X? For someone like me who game on 1680x1050 resolution monitor. In doing my own research, I find that they only thing that separates them are the core clock, boost clock, TFLOPS, memory bandwith. According to results, I see they are separated by like what, 30 watts of power apart? I have about a $400 budget, so this might be the card? Or should I save up more? As for how many fps apart, I don't think its a wide margin. Unless someone wants to correct me on this one.
> 
> I got games that I want to play such as Resident Evil 7 and Rise of the Tomb Raider, Battlefield 1 which I don't think my current video graphics card can handle (look at my sig rig). I also need to catch up on older games such as Fallout New Vegas (never played it), Alic Madness Returns, Splinter Cell: Conviction, Borderlands 1 & 2, Splinter Cell: Ghost Recon.
> 
> Any thoughts/inputs would be deeply appreciated.


Yes, IMHO the 1070 is still solid for its price and performance.

A 1070 is definitely more than capable at 1680x1050 in those games, or even at 1920x1080.
Even 2560x1440/4K is doable, though not a perfectly stable 60 FPS in all games at max settings (it depends on the game and settings).

1680x1050 is no problem at all,
or if you want to spend less money, go for the GTX 1060 6 GB; it's also still very capable at mainstream 1680x1050/1920x1080.

Going from an 8800 GT to a GTX 1070 will be a hell of a performance gain.


----------



## gtbtk

Quote:


> Originally Posted by *raisethe3*
> 
> I have a question for you all owners, is this card the best value card for the money? Especially when compared to the GTX 1080, 1080Ti, GTX Titan X? For someone like me who game on 1680x1050 resolution monitor. In doing my own research, I find that they only thing that separates them are the core clock, boost clock, TFLOPS, memory bandwith. According to results, I see they are separated by like what, 30 watts of power apart? I have about a $400 budget, so this might be the card? Or should I save up more? As for how many fps apart, I don't think its a wide margin. Unless someone wants to correct me on this one.
> 
> I got games that I want to play such as Resident Evil 7 and Rise of the Tomb Raider, Battlefield 1 which I don't think my current video graphics card can handle (look at my sig rig). I also need to catch up on older games such as Fallout New Vegas (never played it), Alic Madness Returns, Splinter Cell: Conviction, Borderlands 1 & 2, Splinter Cell: Ghost Recon.
> 
> Any thoughts/inputs would be deeply appreciated.


For 1050p gaming at 60 frames per second with an i5-2550K, you should not really need more than a 1060. Even the 1050 cards should be powerful enough to give you a good gaming experience with the monitor you have. If you plan to upgrade to a multi-monitor setup or to a 1080p/144Hz or 1440p monitor in the near future, you could consider a 1070. If you have no upgrade plans for the next year, even a 1070 is probably overkill.

Anything higher up the range and you won't really get value for money unless you upgrade both the monitor and the CPU you are using. Spending on the highest-range cards thinking they will be good for a couple of years is usually poor economy, because the next generation of cards will leap in performance and be priced at about the same levels as the current stack; by the time you get to use the extra performance you paid for, the card is obsolete anyway.


----------



## crastakippers

Hi Guys,

I just picked up an MSI Gaming X. It has the Micron memory, and GPU-Z shows the release date as May 30, 2016. I would like to update the BIOS, as the latest from MSI is from November 2016.

Can I download the MSI Gaming Z BIOS and apply that to my Gaming X? And will I see a benefit?

thanks.

EDIT: Found the answer and this guide. Thanks.

http://www.overclock.net/t/1617207/noob-gtx1080-safely-flashing-gaming-x-to-gaming-z#post_25685620


----------



## crastakippers

That was easy. I just downloaded the Z bios and it flashed perfectly using the MSI batch file. No need for the guide.

Cool.


----------



## raisethe3

Thank you, yeah, I am leaning real hard on this card.

I thought about 1060, but I have money for the 1070. I have considered the EVGA ACX 3.0 or the ASUS Strix to be the card for me. (I am kinda loyal)
Quote:


> Originally Posted by *zipper17*
> 
> Yes, imho 1070 still solid for its price and performances.
> 
> 1070 definitely more than capable for 1680x1050p at those games or even 1920x1080P.
> Even 2560x1440P/4K still capable but not perfect stable 60FPS at all games max settings (depends what games & settings.)
> 
> 1680x1050 is not problem at all,
> or If you want more a less cost of money, go GTX 1060 6 GB, its also still very capable at mainstream 1680x1050/1920x1080P.
> 
> From 8800GT to GTX 1070, that will be a hell lot of Gain performances.


Quote:


> Originally Posted by *gtbtk*
> 
> For 1050p at 60 frames per second gaming with an i5-2550K, you should not really need more than a 1060. Even the 1050 cards should be powerful enough to give u a good gaming experience with the monitor you have. If you plan to upgrade monitors to a multi monitor setup or to a 1080p/144hz or 1440p monitor in the near future, you could consider a 1070. If you have no upgrade plans for the next year, even a 1070 is probably overkill
> 
> Anything higher up the range and you won't really get value for money unless you upgrade both the monitor and CPU you are using. Spending for the highest range cards thinking it will be good for a couple of years is usually not good economy because the next generation of cards will leap in performance and be priced at about the same levels as the current stack so by the time you get to use the extra performance you paid for, the card is obsolete anyway.


I might upgrade my monitor in the future. But that's going to be a while. In the meantime, how do you know that I would be getting 60fps with my i5 2550k in gaming? Have you had this experience before?

Rep+ you guys.


----------



## gtbtk

Quote:


> Originally Posted by *raisethe3*
> 
> Thank you, yeah, I am leaning real hard on this card.
> 
> I thought about 1060, but I have money for the 1070. I have considered the EVGA ACX 3.0 or the ASUS Strix to be the card for me. (I am kinda loyal)
> 
> I might upgrade my monitor in the future. But that's going to be a while. In the meantime, how do you know that I would be getting 60fps with my i5 2550k in gaming? Have you had this experience before?
> 
> Rep+ you guys.


I am running mine with an i7-2600 at 4.4 GHz on a Z68 motherboard. With an overclock, you are not that far behind what I can manage. I can get 108 FPS in ROTTR, about 90-100 in GTA V, and can pull a 15200 in Fire Strike (21000 graphics score and 10400 physics); you should be able to do about 14000 with an 8000 physics score in Fire Strike.

i5 Sandy Bridge CPUs are getting close to the end of their useful mainstream lives, an unfortunate fact of life for a six-year-old CPU. Most other CPUs would have been retired long ago. Watch Dogs 2 and some of the other newer titles are now starting to stress the older i5, but if you adjust some of the settings you can still get good frame rates.

You should also look at the MSI Gaming X/Quick Silver card. A very quiet and efficient cooler and a very high 291 W power limit let you keep clocks high without downclocking at 1080p. I have one of these and I am very pleased with it.

Gainward/Palit have the absolute fastest out-of-the-box models. The Zotac AMP Extreme is a monster card that performs really well too.

Cards that use the reference board are easier to get aftermarket water blocks for if you ever want to water-cool.

They are all pretty reasonable and all perform roughly the same; water-cooled cards have a bit more OC potential than air-cooled cards. If you end up liking two or three different models, get whichever of your favorites is cheapest.


----------



## raisethe3

Quote:


> Originally Posted by *gtbtk*
> 
> i am running mine with an i7-2600 at 4.4Ghz on a z68 MB. With an overclock you are not that far behind what I can manage. I can get 108fps in ROTR, about 90-100 in GTA V and can pull a 15200 in Firestrike (21000 graphics score and 10400 physics) you should be able to do about 14000 with an 8000 physics score in firestrike.
> 
> i5 Sandy bridge CPUs are getting close to the end of their useful main stream lives, an unfortunate fact of life for a 6yo CPU. Most other CPUs would have been retired long ago. Watchdogs 2 and some of the other newer titles are now starting to stress the older i5 but if you adjust some of the settings you can still get good frame rates.
> 
> You should also look at the MSI gaming X/Quicksilver card. Very quiet and efficient cooler and a very high 291W power limit lets you keep clocks high without down clocking at 1080p. I have one of these and I am very pleased with it.
> 
> Gainward/Palit have models that are the absolutely fastest out of the box cards. The Zotac Amp extreme is a monster card that performs really well too.
> 
> Cards that have the reference boards are easier to get after market water blocks for if you ever want to water cool it.
> 
> They are all pretty reasonable, they all perform roughly about the same. water cooled cards have a bit more OC potential than aircooled cards. If you end up liking 2 or 3 different models get which ever one is the cheapest of you favorites.


None of those cards appeal to me. As you may not know, I don't ever plan on overclocking my GPU, nor do I know how. I also don't have the skills/money to water-cool, lol. (You geeks.) I have been looking at the EVGA ACX version, the EVGA Superclocked ACX, and the ASUS ROG Strix.

Thank you so much for your input and help, greatly appreciated.


----------



## FitNerdPilot

Quote:


> Originally Posted by *raisethe3*
> 
> None of those cards appeal to me. As you may not know, I don't ever plan on overclocking my GPU nor do I know how to. I also don't have the skills/money to even watercool, lol. (You geeks) I have been looking at the EVGA ACX version, EVGA Superclocked ACX, and the ASUS ROG Strix.
> 
> Thank you so much for your input and help, great appreciated.


A lot of the cards come with software that allows you to one-click OC. It's simple and safe. If you're not manually OCing, just research which software is best and go with that card. Or just buy the one you like best (looks, cooling, etc.).


----------



## Madmaxneo

It looks as though EVGA is sending me a GTX 1070 SC to replace the 980 I sent in for RMA. I had originally planned to save up for a 1080Ti but now that I have this I was thinking of going SLI with another 1070.

But is SLI still a viable option? I have heard that it is not as beneficial as it once was.
Is it better that I save up for the 1080Ti instead? This would take a while and much longer than saving for another 1070.
My current rig (minus the 1070) is in my signature with PSU and all.


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> It looks as though EVGA is sending me a GTX 1070 SC to replace the 980 I sent in for RMA. I had originally planned to save up for a 1080Ti but now that I have this I was thinking of going SLI with another 1070.
> 
> But is SLI still a viable option? I have heard that it is not as beneficial as it once was.
> Is it better that I save up for the 1080Ti instead? This would take a while and much longer than saving for another 1070.
> My current rig (minus the 1070) is in my signature with PSU and all.


2-way SLI still works as long as the game supports SLI, and not all do. There are of course issues that go with it, like micro-stutter, that you don't need to worry about with a single card, but those have always been around. One thing that may help with SLI that I have never heard anyone mention is Fast Sync. It is sort of like V-sync, except without the latency: the cards generate frames as fast as they usually do, but the output to the screen is managed at the monitor's frame rate. That may well balance out the frame pacing and prevent the stutter.

If you have $400 saved up, you could sell the 1070 SC still boxed up from EVGA for $300-350 and buy a 1080 Ti now if you wanted.


----------



## Madmaxneo

Quote:


> Originally Posted by *gtbtk*
> 
> 2 way SLI still works as long as the game supports SLI and not all do. There are of course issues that go with it like micro stutter that you don't need to worry about with a single card but they have always been around. One thing that may help with SLI that I have never heard anyone talk about is fast sync. It is sort of like vsync except without the latency. The cards generate frames as fast as they usually do but the output to the screen is managed at the monitor frame rate. that may well balance out the frame pacing and prevent the stutter.
> 
> If you have $400 saved up, you could sell the 1070SC still boxed up from evga for $300-350 and buy a 1080TI now if you wanted


Don't quite have that much saved up right now, not even enough to get a second 1070. I have an AOC G-sync monitor with a refresh rate of 144 Hz, so that would probably help. In fact, this Fast Sync sounds a lot like G-sync.....


----------



## Bee Dee 3 Dee

Quote:


> Originally Posted by *Madmaxneo*
> 
> Don't quite have that much saved up right now, not even enough to get a second 1070. I have an AOC gsync monitor with a refresh rate of 144mhz so that would probably help. In fact this fast sync sounds a lot like gsync.....


G-sync is the best investment.
I used it for a year before my last PC and video card upgrades, and it extended the usefulness of my old stuff just long enough to end up with a rig with the "best bang for the money+".


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> 2 way SLI still works as long as the game supports SLI and not all do. There are of course issues that go with it like micro stutter that you don't need to worry about with a single card but they have always been around. One thing that may help with SLI that I have never heard anyone talk about is fast sync. It is sort of like vsync except without the latency. The cards generate frames as fast as they usually do but the output to the screen is managed at the monitor frame rate. that may well balance out the frame pacing and prevent the stutter.
> 
> If you have $400 saved up, you could sell the 1070SC still boxed up from evga for $300-350 and buy a 1080TI now if you wanted
> 
> 
> 
> Don't quite have that much saved up right now, not even enough to get a second 1070. I have an AOC gsync monitor with a refresh rate of 144mhz so that would probably help. In fact this fast sync sounds a lot like gsync.....
Click to expand...

The benefit of Fast Sync is that it does not require a G-sync monitor, and I think it is supposed to complement G-sync. A second 1070 will help in a number of scenarios; you may even be able to use Nvidia Profile Inspector to add your own SLI profiles to games that don't work in SLI out of the box. It does mean, though, that your investment in 2x 1070 GPUs will end up higher than what you would have spent if you sold the 1070 and bought a single 1080 Ti.

Maybe the best answer is to not let impatience sway your decision: continue using the single 1070 SC and keep saving. Sell the 1070 when you have enough money and get the card that you really want.


----------



## Madmaxneo

My 1070 came today and this thing is a beast compared to my GTX 980! Ashes of the Singularity looks stunning with the recommended settings.


----------



## pez

Yeah, the 1070 is honestly my favorite card of the lineup so far. This is after dealing with three different 1080s, my GF's 1070, and my TXP. I've yet to get my hands on a Ti just yet. I'm waiting until the last minute to step up in hopes that EVGA does an iCX version for step-up... but that doesn't look likely... so....

But yeah, the 1070 is a great bang-for-the-buck card. I've seen several go for sub-$350 at this point, and frankly, the rather reasonable power it uses for its performance is impressive. A GTX 1070 + G-sync 144 Hz panel is a gorgeous combo to have.


----------



## Quadrider10

You guys with G-sync, what are your settings? Are you running G-sync and V-sync with an FPS cap or without one, or G-sync enabled without V-sync and with an FPS cap?


----------



## Vitto

Guys I want your opinion, I'm running strix 1070s in SLI, I was wondering if maybe I should try to sell those and go for a single strix 1080ti when it's out. Your thoughts?


----------



## khanmein

Quote:


> Originally Posted by *Vitto*
> 
> Guys I want your opinion, I'm running strix 1070s in SLI, I was wondering if maybe I should try to sell those and go for a single strix 1080ti when it's out. Your thoughts?


Don't go for SLI; stick with a single card!


----------



## zipzop

Quote:


> Originally Posted by *Quadrider10*
> 
> U guys with gsync, what are your settings? Are u running with gsync and vsync with fps cap or without fps cap, or gsync enabled without vsync and an fps cap?


XB270HU (1440p 144 Hz); I cap at 140 FPS with RivaTuner, no V-sync.


----------



## syl1979

Quote:


> Originally Posted by *Quadrider10*
> 
> U guys with gsync, what are your settings? Are u running with gsync and vsync with fps cap or without fps cap, or gsync enabled without vsync and an fps cap?


Gsync + fast sync


----------



## Quadrider10

Quote:


> Originally Posted by *zipzop*
> 
> XB270HU(1440p 144hz) I cap at 140fps with Rivatuner, no v-sync.


You don't get stutters or tearing?

Quote:


> Originally Posted by *syl1979*
> 
> Gsync + fast sync


I can never use Fast Sync; I always get micro-stutter.


----------



## khanmein

fast sync works well on Dota 2, CS:GO, Torchlight II, Mighty No. 9, Diablo 3, Starcraft 2, SFV, USFIV, Titan Quest :AE, Doom, NSUS4 etc. (locked 60 fps)


----------



## zipzop

Quote:


> Originally Posted by *Quadrider10*
> 
> u dont get stutters or tearing?
> i can never use fast sync. i always get micro stutter


Nope. Use the latest Nvidia drivers and the latest MSI Afterburner/RivaTuner revisions. The FPS cap works perfectly for me now.


----------



## Madmaxneo

Does anyone here use the EVGA powerlink? If so, or even if not, are there any reported issues with it?


----------



## pez

Quote:


> Originally Posted by *Madmaxneo*
> 
> Does anyone here use the EVGA powerlink? If so, or even if not, are there any reported issues with it?


I got one free with my 1080, but never bothered to use it. When my Ti and new case come in, I can give it a try for you if you're interested. What kind of issues are you looking for?


----------



## Madmaxneo

Quote:


> Originally Posted by *pez*
> 
> I got one free with my 1080, but never bothered to use it. When my Ti and new case comes in , I can give it a try for you if you're interested. What kinda issues are you looking for?


Issues in general. I'm just being cautious.


----------



## pez

Quote:


> Originally Posted by *Madmaxneo*
> 
> Issues in general. I'm just being cautious.


Ah, I haven't been on the lookout for issues. I guess the one thing I'll say for it is that it's pretty large. Larger than I felt it was going to be (even though I had seen pictures previously).


----------



## sblocc10

Need your opinions... is THIS ok??

PS:
MSI 1070 Armor 8G OC, air / 2050 core - 9000 memory
ITX mini + vent *lol* my room vent..
Intel i5 4460, stock
16 GB RAM

-> + 2x shunt mod -> no power limit (<100%)
+ 2x power cap mod -> no total power limit anymore!! (rated ~70%) at 100% usage

Running P.A.M.E.L.A. early access at high detail in 4K..










----------



## Vitto

Quote:


> Originally Posted by *raisethe3*
> 
> I have a question for you all owners, is this card the best value card for the money? Especially when compared to the GTX 1080, 1080Ti, GTX Titan X? For someone like me who game on 1680x1050 resolution monitor. In doing my own research, I find that they only thing that separates them are the core clock, boost clock, TFLOPS, memory bandwith. According to results, I see they are separated by like what, 30 watts of power apart? I have about a $400 budget, so this might be the card? Or should I save up more? As for how many fps apart, I don't think its a wide margin. Unless someone wants to correct me on this one.
> 
> I got games that I want to play such as Resident Evil 7 and Rise of the Tomb Raider, Battlefield 1 which I don't think my current video graphics card can handle (look at my sig rig). I also need to catch up on older games such as Fallout New Vegas (never played it), Alic Madness Returns, Splinter Cell: Conviction, Borderlands 1 & 2, Splinter Cell: Ghost Recon.
> 
> Any thoughts/inputs would be deeply appreciated.


I'm thinking you could go with the 1060, put the price difference between the 1070 and 1060 toward a slightly better monitor with a higher refresh rate, and maybe save up a bit. The 1060 should still be plenty capable.


----------



## MrGreaseMonkkey

Hi guys, is there a custom bios out there that can disable GPU Boost on the 1070 FE from EVGA? I've googled but can't find anything.


----------



## Archdregs

Do you guys bump the voltage when OCing for daily use?


----------



## kignt

Since I saw someone else underclock/undervolt, I gave it a try and have kept it since.


----------



## Gurkburk

Quote:


> Originally Posted by *kignt*
> 
> Since I saw someone else underclock/undervolt, I gave it a try and have kept it since.


Why underclock?


----------



## gtbtk

Quote:


> Originally Posted by *MrGreaseMonkkey*
> 
> Hi guys, is there a custom bios out there that can disable GPU Boost on the 1070 FE from EVGA? I've googled but can't find anything.


No, there are no custom BIOSes available for 1070 cards.


----------



## gtbtk

Quote:


> Originally Posted by *kignt*
> 
> Since I saw someone else underclock/undervolt, I gave it a try and have kept it since.


You will get better performance with similar temps if you use the 0.950 V point instead of the 0.900 V point.


----------



## Madmaxneo

Quote:


> Originally Posted by *pez*
> 
> Ah, I haven't been on the lookout for issues. I guess the one thing I'll say for it is that it's pretty large. Larger then I felt like it was going to be (even though I had seen pictures previously).


I got mine today and it is about the size I imagined it would be. It looks way better than just having the cable.
I may add some UV-reactive paint to the EVGA symbol though....


----------



## fyzzz

I've been reading a lot here about tweaking the 1070; I really like tweaking my hardware to get the max out of it. I bought a Palit 1070 SJS back in November 2016 and have been really happy with this card. At first, max stable was 2088 MHz on the core and +405 on the memory (Micron) at 1.081 V. Max stable is still 2088 MHz at 1.081 V, but I've got the VRAM stable at +450 instead (no artifacts in Witcher 3 or crashes). I'm using the curve to overclock: +55 MHz, then I tweak the voltage points so that it runs at 2088 MHz at 1.081 V (it doesn't throttle if the card is kept below 55°C) and the 0.950 V point at 2012 MHz. A few 3DMark benches with current stable settings: Ultra: http://www.3dmark.com/fs/12233867 Extreme: http://www.3dmark.com/fs/12233718 Normal: http://www.3dmark.com/fs/12233773 Time Spy: http://www.3dmark.com/spy/1509028
And currently I'm testing what max clock I can get at the 0.950 V point.


Spoiler: Warning: Spoiler!






After 20 minutes of Witcher 3, it seems to bounce between 2050 and 2038 MHz. I will continue testing.
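For anyone curious how the curve method works under the hood: Afterburner's editor is effectively a table of (voltage, frequency) points, and "tweaking the voltage points" means raising the clock at a chosen voltage and flattening every higher-voltage point to the same clock, so boost stops there. A rough sketch of that flattening, with invented numbers (not measured from any real card):

```python
# Hypothetical slice of a Pascal V/F curve: voltage (V) -> clock (MHz).
# These numbers are invented for illustration.
vf_curve = {0.900: 1911, 0.950: 1974, 1.000: 2012, 1.050: 2050, 1.081: 2076}

def lock_curve(curve, target_v, target_mhz):
    """Set the target voltage point to the desired clock and flatten every
    higher-voltage point to the same clock, so boost never goes past it."""
    return {v: (target_mhz if v >= target_v else mhz)
            for v, mhz in curve.items()}

# An undervolt like the one described: run 2012 MHz at the 0.950 V point.
locked = lock_curve(vf_curve, 0.950, 2012)
print(locked[0.950], locked[1.081])  # 2012 2012
```

This is why the undervolted card holds a fixed clock: with the curve flat past the target voltage, GPU Boost has nowhere higher to go.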


----------



## pez

Quote:


> Originally Posted by *Madmaxneo*
> 
> I got mine today and it is about the size I imagined it would be. It looks way better than having the just the cable.
> I may add some UV reactive paint to the EVGA symbol though....


Yeah, I don't think it would actually fit correctly in my current case. Show us some pics when you get a chance.


----------



## raisethe3

Quote:


> Originally Posted by *Vitto*
> 
> I'm thinking you could go with 1060 and maybe save up a bit and add that to the difference between 1070 and 1060 and get a slightly better monitor with higher refresh rate. 1060 should still be plenty capable.


Thanks for the late reply! But I'm already set on getting the GTX 1070.

Thank you!!!


----------



## Madmaxneo

Quote:


> Originally Posted by *pez*
> 
> Yeah, I don't think it would actually fit correctly in my current case. Show us some pics when you get a chance.


OK. It's an old but decent camera, as my phone camera is busted... lol.


Spoiler: Warning: Spoiler!








I am looking to get a backplate that has the Batman symbol on it (maybe in UV) if I can find one. I will also be installing a Heatkiller waterblock with a Swiftech H140x to cool it.


----------



## gtbtk

Quote:


> Originally Posted by *fyzzz*
> 
> Been reading alot here about tweaking the 1070, I really like tweaking my hardware to get max out of it. Bought a Palit 1070 SJS back in november 2016, been really happy with this card. At first max stable was 2088 mhz and +405 on the memory (micron) 1.081v. Max stable is still [email protected], but i've got the vram stable at +450 instead (no artifacts in witcher 3 or crashes). I'm using the curve to overclock, +55 mhz and then i tweak the voltage points so that it runs at 2088 1.081v (doesn't throttle if the card is kept below 55c) and the .950v point at 2012 mhz. A few 3dmark benches with current stable settings: Ultra: http://www.3dmark.com/fs/12233867 Extreme: http://www.3dmark.com/fs/12233718 Normal: http://www.3dmark.com/fs/12233773 Time spy: http://www.3dmark.com/spy/1509028
> And currently i'm testing what max clock i can get a the .950v point
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> After 20 minutes of witcher 3, Seems to bounce between 2050 and 2038 mhz. I will continue testing.


Make sure that your card is running the 86.04.50.00 series of vbios. It resolves a bug that caused Micron memory to crash you PC if you overclock it too far.

I think the Palit cards had the 86.04.3b.00 BIOS before the bugfix upgrade was released in November last year. You should be able to get the update from the Palit website.

If you are overclocking, use a custom fan curve. As temperatures rise, clocks progressively drop, so the cooler you can keep your card, the better it will maintain clock speed.


----------



## fyzzz

Quote:


> Originally Posted by *gtbtk*
> 
> Make sure that your card is running the 86.04.50.00 series of vbios. It resolves a bug that caused Micron memory to crash you PC if you overclock it too far.
> 
> I think the Palit cards had 86.04.3b.00 bioses before the bugfix bios upgrade was released in November last year. You should be able to get the upgrade from the Palit website
> 
> If you are overclocking, use a custom fan curve. As temperatures rise, clocks start to progressively drop so the cooler you can keep your card, the faster it will maintain clock speed


Yeah, I already knew about the Micron bug; I'm running the latest BIOS. +430 to +450 seems to be the max before artifacts appear. I always use a custom fan curve with air-cooled cards; even at 65% the noise doesn't bother me much. But it's hard to keep this card under 55C, and that's when it increases the voltage or adjusts the clock.


----------



## zipper17

I set my custom fan curve to 50C = 100% fan speed, with temperature hysteresis = 11C, which means the fans drop back to normal speed once the card cools to 39C. In light gaming, as long as the temperature never hits 50C, the fans never go to 100%, so it still depends on the game and graphics settings; the aggressive part of the curve only applies in heavy gaming scenarios. I also mostly cap my framerate with adaptive vsync at [email protected] 2560x1440p. I don't find 100% fan speed very loud, so basically I'm fine with it. Btw, for benchmarks I always set the fans to 100% manually so they never drop.
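The hysteresis behavior described above can be sketched as a tiny control loop. This is a toy model, not real fan-control code: the trigger and hysteresis values match the post, but the `normal_curve` fallback is a made-up placeholder.

```python
# Sketch of hysteresis-based fan control: the fan jumps to 100% at the
# trigger temperature and only drops back once the GPU has cooled by the
# hysteresis amount (50C - 11C = 39C, as in the post above).
TRIGGER_C = 50      # temperature that forces 100% fan speed
HYSTERESIS_C = 11   # fan holds 100% until temp <= TRIGGER_C - HYSTERESIS_C

def fan_speed(temp_c, currently_maxed,
              normal_curve=lambda t: min(100, max(30, t))):
    """Return (fan_percent, maxed_flag) for one polling step."""
    if temp_c >= TRIGGER_C:
        return 100, True
    if currently_maxed and temp_c > TRIGGER_C - HYSTERESIS_C:
        return 100, True  # still inside the hysteresis band: hold 100%
    return normal_curve(temp_c), False

# Walking through a heat-up/cool-down cycle: 45 -> 50 -> 48 -> 40 -> 39
maxed = False
for t in (45, 50, 48, 40, 39):
    speed, maxed = fan_speed(t, maxed)
    print(t, speed)
```

The point of the hysteresis band is visible in the trace: at 48C and 40C the fan stays pinned at 100% because it was already maxed, and only the drop to 39C releases it.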


----------



## zipper17

Quote:


> Originally Posted by *fyzzz*
> 
> Been reading alot here about tweaking the 1070, I really like tweaking my hardware to get max out of it. Bought a Palit 1070 SJS back in november 2016, been really happy with this card. At first max stable was 2088 mhz and +405 on the memory (micron) 1.081v. Max stable is still [email protected], but i've got the vram stable at +450 instead (no artifacts in witcher 3 or crashes). I'm using the curve to overclock, +55 mhz and then i tweak the voltage points so that it runs at 2088 1.081v (doesn't throttle if the card is kept below 55c) and the .950v point at 2012 mhz. A few 3dmark benches with current stable settings: Ultra: http://www.3dmark.com/fs/12233867 Extreme: http://www.3dmark.com/fs/12233718 Normal: http://www.3dmark.com/fs/12233773 Time spy: http://www.3dmark.com/spy/1509028
> And currently i'm testing what max clock i can get a the .950v point
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> After 20 minutes of witcher 3, Seems to bounce between 2050 and 2038 mhz. I will continue testing.


Mine is much the same in Witcher 3; after a long session it stays around 2038-2050 MHz. That comes down to temperature throttling and how capable the cooler really is.

Try 100% fan speed.

Witcher 3 at max settings is very GPU-demanding and will put any GPU at 99-100% load most of the time, which means heat. (I play at 2560x1440p.)

More powerful cooling will definitely keep the clock speed more stable.


----------



## Madmaxneo

Quote:


> Originally Posted by *fyzzz*
> 
> Yeah i know about the micron bug before, i'm running the latest bios. +430 - 450 seems to be the max before artifacts appear. I always use a custom fan curve with air cooled cards, even at 65% the noise doesn't bother me much. But it's hard to keep this card under 55c and thats when it increases the voltage or adjusts the clock.


You should try watercooling your card. I am getting an H140-X to pair with a Heatkiller IV waterblock for my GPU. Watercooling a GPU has a much greater impact on performance than watercooling the CPU.


----------



## Gurkburk

Reached my limit i believe.


----------



## fyzzz

Quote:


> Originally Posted by *Madmaxneo*
> 
> You should try watercooling your card. I am getting an H140-X to pair with a Heatkiller IV waterblock for my GPU. Watercooling a GPU has a much greater impact on performance than watercooling the CPU.


I have actually thought about doing that. I only need to buy a block for the gpu if i decide to go that route, since I already have a custom loop.


----------



## Madmaxneo

Quote:


> Originally Posted by *fyzzz*
> 
> I have actually thought about doing that. I only need to buy a block for the gpu if i decide to go that route, since I already have a custom loop.


Awesome! I also already have a Swiftech H240-X in my system that I could add my GPU to, but I prefer to keep the loops separate: I may add a second GPU to that loop, or add my motherboard to my CPU loop if the opportunity arises, and I do not want to add an additional res.


----------



## gtbtk

Quote:


> Originally Posted by *fyzzz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Make sure that your card is running the 86.04.50.00 series of vbios. It resolves a bug that caused Micron memory to crash you PC if you overclock it too far.
> 
> I think the Palit cards had 86.04.3b.00 bioses before the bugfix bios upgrade was released in November last year. You should be able to get the upgrade from the Palit website
> 
> If you are overclocking, use a custom fan curve. As temperatures rise, clocks start to progressively drop so the cooler you can keep your card, the faster it will maintain clock speed
> 
> 
> 
> Yeah i know about the micron bug before, i'm running the latest bios. +430 - 450 seems to be the max before artifacts appear. I always use a custom fan curve with air cooled cards, even at 65% the noise doesn't bother me much. But it's hard to keep this card under 55c and thats when it increases the voltage or adjusts the clock.
Click to expand...

+400 was about the limit I got to before the BIOS update. Above that, it would put white checkerboards all over the screen and then BSOD.

After the BIOS update I can do +650 to +690, although around +500 to +550 gives me better performance, as errors/error correction start creeping in above that. Don't worry too much about max core frequency; focus on maximizing the VRAM overclock, even if you have to reduce the core frequency by 25 MHz to get there. You should find you get better performance.


----------



## gtbtk

Quote:


> Originally Posted by *Gurkburk*
> 
> 
> 
> Reached my limit i believe.


From that profile you are running in AB: press Ctrl-F to open the curve window, drag the .950V point up to 2025 or 2037 MHz, hit Apply, and try Firestrike again. You might be able to get up to a 21000 graphics score.


----------



## Madmaxneo

Quote:


> Originally Posted by *gtbtk*
> 
> +400 was about the limit I got to before the bios update. Above that, It would put white checkerboards all over the screen and then BSOD.
> 
> After the Bios Update I can do +650 - +690 although about +500 - +550 give me better performance as errors/error correction starts creeping in above that. Don't worry too much about max core frequency, Focus on maximizing Vram overclock even if you have to reduce the core frequency by 25 Mhz to get there. you should find you get better performance.


Do you see any improvements in games with those numbers?


----------



## Gurkburk

Quote:


> Originally Posted by *gtbtk*
> 
> from that profile you are running in AB. Do a CTRL-F to open the curve window. Drag the .950v point up to 2025 or 2037Mhz, hit apply and try firestrike again. you might be able to get it up to a 21000 graphics score


Just that .950 one, or the entire row?

Edit: Changed it and the entire row fixed itself









But 3DMark keeps crashing when I'm running it now. The driver doesn't crash, just 3DMark. What can that be?

Unigine Heaven crashes as well, but the driver stays intact.


----------



## Mad Pistol

Alright 1070 warriors. I need your help.

I have two GTX 1070 FEs. One was purchased directly from Nvidia (Digital River), and the other was purchased from MSI (newegg).

I noticed that while running SLI, the MSI card is triggering a VREL limit in GPU-Z (the blue indication is VREL, yellow is SLI). Notice on the graph below (the Nvidia-sourced card) that VREL is not present; it's just limited by SLI.

I was wondering if anyone else here has a 1070 Founders Edition not bought from Nvidia that exhibits the same behavior. I only bring it up because it appears that my 1070 FE from Nvidia was binned better than the one from MSI.

Here's a pic for reference.


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> +400 was about the limit I got to before the bios update. Above that, It would put white checkerboards all over the screen and then BSOD.
> 
> After the Bios Update I can do +650 - +690 although about +500 - +550 give me better performance as errors/error correction starts creeping in above that. Don't worry too much about max core frequency, Focus on maximizing Vram overclock even if you have to reduce the core frequency by 25 Mhz to get there. you should find you get better performance.
> 
> 
> 
> Do you see any improvements in games with those numbers?
Click to expand...

Yes.

these are early and current results:

http://www.3dmark.com/compare/fs/9470128/fs/11822144#


----------



## gtbtk

Quote:


> Originally Posted by *Gurkburk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> from that profile you are running in AB. Do a CTRL-F to open the curve window. Drag the .950v point up to 2025 or 2037Mhz, hit apply and try firestrike again. you might be able to get it up to a 21000 graphics score
> 
> 
> 
> Just that .950 one, or the entire row ?
> 
> Edit: Changed it and the entire row fixed itself
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But 3Dmark keeps crashing when I'm running it now. The driver doesnt crash, Just 3dmark.. What can that be?
> 
> & Heaven unigine crashes as well. But driver stays intact..
Click to expand...

You have the slider set to 110; now adjust that single point.

You could try dropping the slider back down to +100 with the point at 2025 and see how that goes; otherwise try 2012 MHz, then 2000, dropping in 12-13 MHz steps until it doesn't crash any more. I am running an MSI Gaming X with a Gaming Z BIOS installed. I can only get my slider up to +50 without crashing, but my default core clock is 1633 MHz. I bump the .950V point up to 2037, run the 1.063V point at 2088, and leave the voltage slider at 0. I found that the frame rates I was getting were about the same at +0 and +100 on the voltage slider, and +0 gives me lower temps.


----------



## Madmaxneo

Quote:


> Originally Posted by *gtbtk*
> 
> Yes.
> 
> these are early and current results:
> 
> http://www.3dmark.com/compare/fs/9470128/fs/11822144#


That is Firestrike.
I am asking if you noticed any improvement in games? How much of a framerate difference did you get?


----------



## pez

Quote:


> Originally Posted by *Madmaxneo*
> 
> Ok. It is an old but decent camera as my phone camera is busted...lol.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> I am looking to get a backplate that has the batman symbol on it (maybe in UV) if I can find one. I will also be installing a Heatkiller waterblock with a Swiftech H140x to cool it..


Looks nice and snazzy! Congrats







.


----------



## khanmein

@Mad Pistol, this is very common. I've never owned an FE myself, but it's pretty obvious NV binned the chips. Sell off the MSI and stick with a single GPU, or get rid of both and grab a 1080 Ti.


----------



## kignt

Quote:


> Originally Posted by *gtbtk*
> 
> You will get better performance with similar temps if you use the .950v point instread of the .9v point


@gtbtk Thanks for pointing this out. Definitely noticeable stability with 0.950 instead of 0.900. I noticed it in Tera: I thought the choppiness was due to a poorly optimized game, but once I changed to .950 it feels smoother (the game is still poorly optimized, imo). Also, I agree with your point about not worrying about max clock, since Boost 3.0 will handle it.


----------



## khanmein

anyone tested MSI:AB 4.4.0 beta 6?


----------



## zipzop

Quote:


> Originally Posted by *khanmein*
> 
> anyone tested MSI:AB 4.4.0 beta 6?


Not much of a changelog from Unwinder, other than some change to the way the OSD text looks (an adjustable size option?).


----------



## Archdregs

Quote:


> Originally Posted by *zipzop*
> 
> Not much way of a change log from Unwinder, other than just some change to the way the OSD text looks, adjustable size option?


Where can I get that beta version?


----------



## zipzop

Quote:


> Originally Posted by *Archdregs*
> 
> Where can I get that beta version?


http://forums.guru3d.com/showthread.php?t=412822&page=9

Post #216

Note: that post includes RTSS 7.0.0 beta 15, but he has released RTSS beta19 standalone on the last page of the thread(page 12).


----------



## zipzop

edit: duplicate post.


----------



## khanmein

Quote:


> Originally Posted by *zipzop*
> 
> http://forums.guru3d.com/showthread.php?t=412822&page=9
> 
> Post #216
> 
> Note: that post includes RTSS beta 15, but he has released RTSS beta19 standalone on the last page of the thread(page 12).


Thanks, I updated to RTSS Beta 19 last night.


----------



## Madmaxneo

Quote:


> Originally Posted by *pez*
> 
> Looks nice and snazzy! Congrats
> 
> 
> 
> 
> 
> 
> 
> .


Thanks, though it is not as nice as I want it to be just yet. I am getting a watercooling setup for my GPU. Once that is done I may do some work with acrylic to cover some things up.


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Yes.
> 
> these are early and current results:
> 
> http://www.3dmark.com/compare/fs/9470128/fs/11822144#
> 
> 
> 
> That is Firestrike.
> I am asking if you noticed any improvement in games? How much of a framerate difference did you get?
Click to expand...

I am in between power supplies at the moment, so I can't access the rig. Rise of the Tomb Raider, from memory, improved from the mid 90s to 110-ish fps.


----------



## gtbtk

Quote:


> Originally Posted by *kignt*
> 
> @gtbtk Thanks for pointing this out. Definite noticeable stability with 0.950 instead of .900 . I noticed while on Tera, thought the choppiness was due to poorly optimized game, but once changed to .950, it feels smoother, (game is still poorly optimized imo). Also, I agree with your point of not to worry about max clock, since boost 3.0 will handle it.


Obviously 2088 will be better than 1900. 

The other thing you should be aware of is that the fine line is actually "0". For the first couple of months I had the card, I didn't work that out and assumed the curve's baseline was fixed, but it isn't. It represents an offset from a baseline and is cumulative if you are fiddling with and tweaking the curve. You can end up with two curves that look the same if you only pay attention to the red line with the adjustment points but are actually totally different, which leads to confusion. To keep track of what you have changed, you are usually better off resetting the curve to default and building a new curve up from scratch while you are learning the quirks.

As an example, you can go wild and create a curve that looks like the top one. What you have actually done in that admittedly extreme example, is created a curve like the bottom one. The whole thing works fine as long as you understand what both lines actually mean.


----------



## Madmaxneo

Quote:


> Originally Posted by *gtbtk*
> 
> I am in between power supplies at the moment, so I can't access the rig. Rise of the Tomb Raider, from memory, improved from the mid 90s to 110-ish fps.


That's pretty good.
Now you have me convinced to go ahead and OC this card now, before I get the card set up for water cooling.

You are using MSI AB, right? If so, what initial settings do you use, or at least what's a starting point for me?

Are the core and memory clocks still done in steps of 13?


----------



## rfarmer

Any of you guys try the new Unigine Superposition?


----------



## zipzop

Quote:


> Originally Posted by *rfarmer*
> 
> 
> 
> Any of you guys try the new Unigine Superposition?


It's out!??









downloading now

edit: made a successful run at 2164 MHz, though that speed isn't stable in most games. There were a few red-dot artifact thingies.


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I am in between power supplies at the moment, so I can't access the rig. Rise of the Tomb Raider, from memory, improved from the mid 90s to 110-ish fps.
> 
> 
> 
> That's pretty good.
> Now you have me convinced to go ahead and OC this card now, before I get the card set up for water cooling.
> 
> You are using MSI ab right? If so what are the initial settings you use, or at least as a starting point for me.
> 
> Are the core and memory clocks still done in steps of 13?
Click to expand...

What I would suggest that you do is install Precision XOC and use that to start out. That has an Automatic Overclock utility that you can run against your EVGA card.

It is not the most accurate tool in the world, and I would not assume the curve it sets will be completely stable, but it will let you see roughly how the different voltage points on the curve overclock. After you have played with the auto OC, you can use Afterburner or Precision, whichever you like better. Personally I prefer Afterburner because I can make finer adjustments on the curve, though Precision does offer more features for EVGA cards than it does for other brands.

On my MSI Card, I flashed an EVGA bios to try it out and ran the auto tune against my "EVGA" card and found that the voltage points around 1.0V are the weakest points and will only support an increase of around +50 over the base clock of my card. The points at the low end and high end of my card's curve around 0.800V are quite happy to run at +125 to +150.

Once you get a reasonable idea of the point with the lowest OC headroom, it is easy to set an OC with the slider and gives you a starting point if you want to try overclocking using the Curve feature in either AB or Precision XOC.

The steps in Afterburner with Pascal are approx 12.5 MHz apart; start out making +25 adjustments, then come back down 12 or 13 MHz to fine-tune if the +25 step is too high.

The 1070 loves a memory overclock. I would try setting it to +500 to +600 to start with and increase as far as you can while remaining stable. Some people can run stable at +800, but not everyone. My card will run with memory at +700 but is not completely stable there, so I tend to run it at about +675 over reference.
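The coarse-then-fine tuning described above (raise the offset in +25 steps, then back off in ~12-13 MHz increments until stable) amounts to a simple search loop. A sketch under stated assumptions: `run_benchmark` is a stand-in for a real stability test (e.g. a Heaven loop watched for artifacts and crashes), not an actual API.

```python
# Sketch of the coarse/fine core-offset search described above.
STEP_COARSE = 25  # first pass: +25 MHz jumps
STEP_FINE = 13    # Pascal clock bins are ~12.5 MHz apart

def find_max_offset(run_benchmark, start=0, limit=200):
    """Raise the core offset in coarse steps, then back off in fine steps.

    run_benchmark(offset) should return True if the card is stable
    at that offset (in reality: a benchmark run with no artifacts).
    """
    offset = start
    # Coarse pass: climb in +25 steps while the card stays stable.
    while offset + STEP_COARSE <= limit and run_benchmark(offset + STEP_COARSE):
        offset += STEP_COARSE
    # Fine pass: probe one coarse step up, then walk back down.
    probe = offset + STEP_COARSE
    while probe > offset:
        if run_benchmark(probe):
            return probe
        probe -= STEP_FINE
    return offset

# Example with a fake card that is stable up to +112 MHz:
print(find_max_offset(lambda off: off <= 112))  # finds 112
```

The same shape of loop works for the memory offset; only the step sizes and the starting point (+500 or so, per the post) would change.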


----------



## Madmaxneo

Quote:


> Originally Posted by *gtbtk*
> 
> What I would suggest that you do is install Precision XOC and use that to start out. That has an Automatic Overclock utility that you can run against your EVGA card.
> 
> It is not the most accurate tool in the world and I would not assume that the curve it sets will be completely stable, but what it will do is let you see roughly how the different voltage levels on the curve overclock. After you have played with the auto OC, you can use Afterburner or precision. What ever you like better. Personally I prefer Afterburner because I can make finer adjustments on the curve. Precision does offer more features for EVGA cards than it does for other brands though.
> 
> On my MSI Card, I flashed an EVGA bios to try it out and ran the auto tune against my "EVGA" card and found that the voltage points around 1.0V are the weakest points and will only support an increase of around +50 over the base clock of my card. The points at the low end and high end of my card's curve around 0.800V are quite happy to run at +125 to +150.
> 
> Once you get a reasonable idea of the point with the lowest OC headroom, it is easy to set an OC with the slider and gives you a starting point if you want to try overclocking using the Curve feature in either AB or Precision XOC.
> 
> The steps in afterburner with Pascal are approx 12.5 apart, start out making +25 adjustments and then reduce 12 or 13 points to fine tune if the +25 score is too high.
> 
> 1070 Loves memory overclock. I would try setting that to +500 to +600 to start with and increase as far as you can go while remaining stable. Some people can run stable at +800 but that is not everyone. My card will run with memory at +700 but it is not completely stable, I tend to run it at about +675 over reference.


I already had my encounters with Precision X and I prefer MSI AB, which is much more stable than anything EVGA has software-wise. I'll stick with AB for now.

When I was OCing my 980 (which had a modded BIOS) I just used the sliders in AB and never used the curve feature you mention, so I am completely new to that. With my 980 I hit some really great numbers even without watercooling the card. This was the best OC and subsequent Firestrike result I could get: http://www.3dmark.com/fs/7868284, which gave me the best overall score (for a single 980) in the 2016 AMD vs Nvidia competition.


----------



## zipper17

Just tried a new driver whql 381.65 and Windows 10 Creator Update (1703)

3d mark firestrike result
http://www.3dmark.com/fs/12307496

+75 core clock, +600 memory (9200 MHz effective), max voltage, max power, 100% fan speed.
The highest core clock point is 2076 MHz as seen in the curve table. Custom curve: no.
Overall and graphics scores are pretty much the same as before; I didn't see any improvement.

Why is reddit saying there is an improvement?

https://www.reddit.com/r/63t2bg/driver_38165_faqdiscussion_thread/
As you can see, 381.65 + W10 CU gained about +500 points in 3DMark Firestrike graphics score compared with Windows 10 AU (the old Anniversary Update). I didn't see any improvement on my system.


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> What I would suggest that you do is install Precision XOC and use that to start out. That has an Automatic Overclock utility that you can run against your EVGA card.
> 
> It is not the most accurate tool in the world and I would not assume that the curve it sets will be completely stable, but what it will do is let you see roughly how the different voltage levels on the curve overclock. After you have played with the auto OC, you can use Afterburner or precision. What ever you like better. Personally I prefer Afterburner because I can make finer adjustments on the curve. Precision does offer more features for EVGA cards than it does for other brands though.
> 
> On my MSI Card, I flashed an EVGA bios to try it out and ran the auto tune against my "EVGA" card and found that the voltage points around 1.0V are the weakest points and will only support an increase of around +50 over the base clock of my card. The points at the low end and high end of my card's curve around 0.800V are quite happy to run at +125 to +150.
> 
> Once you get a reasonable idea of the point with the lowest OC headroom, it is easy to set an OC with the slider and gives you a starting point if you want to try overclocking using the Curve feature in either AB or Precision XOC.
> 
> The steps in afterburner with Pascal are approx 12.5 apart, start out making +25 adjustments and then reduce 12 or 13 points to fine tune if the +25 score is too high.
> 
> 1070 Loves memory overclock. I would try setting that to +500 to +600 to start with and increase as far as you can go while remaining stable. Some people can run stable at +800 but that is not everyone. My card will run with memory at +700 but it is not completely stable, I tend to run it at about +675 over reference.
> 
> 
> 
> I already had my encounters with precision X and I prefer using MSI AB which is much more stable than anything EVGA software wise. I'll stick with AB for now.
> 
> When I was OCing my 980 (of which had a modded bios) I just used the sliders in AB and have never used the curve feature you mention. So I am completely new to that. With my 980 I hit some really great numbers even without watercooling the card. This was the best OC and subsequent Firestrike test I could get with the card: http://www.3dmark.com/fs/7868284, which gave me the best overall score (for a single 980) in in the 2016 AMD vrs Nvida competition.
Click to expand...

I am not suggesting you use precision long term. The auto OC feature is a handy tool to get a rough profile of your card.

You can have both installed at the same time, just don't run them at the same time

The 980 didn't have a curve to edit. It is a new feature with Pascal cards that lets you eke out the last bit of performance from your card.

With the slider, it might go unstable at, say, +100. The instability is because one of the voltage points is unstable at +100, and the slider moves the entire fixed-shape curve up and down. The weakest point could be, say, 1.025V, while all the other points might be quite happy at +150.

With the slider only, you waste the extra 50 points of OC potential on every point other than the 1.025V one.

Using the curve allows you, once you have identified which point is the weak one, to increase all the others up to their +150 max potential while leaving the 1.025 point at +100. The auto OC thing in Precision helps you identify the weak points on the curve more easily than starting from scratch.
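The slider-vs-curve distinction can be illustrated with a toy model. These clock and headroom numbers are hypothetical, picked only to mirror the +100/+150 example above; real curves have many more points.

```python
# Toy model of a Pascal voltage/frequency curve (hypothetical numbers).
# base_curve maps voltage points (V) to stock boost clocks (MHz).
base_curve = {0.800: 1750, 0.950: 1850, 1.025: 1900, 1.062: 1950}

# Per-point headroom found by testing: 1.025V is the weak point (+100),
# everything else is happy at +150.
max_offset = {0.800: 150, 0.950: 150, 1.025: 100, 1.062: 150}

def slider_oc(offset):
    """Global slider: every point shifts by the same amount."""
    return {v: clk + offset for v, clk in base_curve.items()}

def curve_oc():
    """Per-point curve editing: each point runs at its own maximum."""
    return {v: clk + max_offset[v] for v, clk in base_curve.items()}

# The slider is capped by the weakest point, so every other point
# leaves 50 MHz on the table compared with editing the curve.
slider_limit = min(max_offset.values())   # 100
print(slider_oc(slider_limit))
print(curve_oc())
```

Comparing the two printed curves makes the post's point concrete: both run the weak 1.025V point at the same clock, but curve editing recovers the extra headroom everywhere else.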


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Just tried a new driver whql 381.65 and Windows 10 Creator Update (1703)
> 
> 3d mark firestrike result
> http://www.3dmark.com/fs/12307496
> 
> +75 core clock, +600memory (9200mhz effective), max voltage, max power, 100% fanspeed.
> highest point core clock is at 2076mhz seen in curve table. Custom curve = no.
> Overall & Graphic scores pretty much the same as before,i didn't see any improvement.
> 
> Why reddit saying there is an improvement,
> 
> __
> https://www.reddit.com/r/63t2bg/driver_38165_faqdiscussion_thread/
> AS you can see 381.65+w10 CU got +500points graphic scores on 3d mark firestrike compared with windows 10 AU (anniversary update old version). i didn't see any improvement on my system.


Did you enable game mode for 3dmark?

Pressing Windows+G while you are on the 3DMark lobby page opens the Game Bar and lets you check a box to enable Game Mode for the application. Make sure you disable GameDVR in the Xbox settings.


----------



## crastakippers

Quote:


> Originally Posted by *gtbtk*
> 
> Using the curve allows you, once you have identified which point is the weak one, to increase all the others up to their +150 max potential while leaving the 1.025 point at +100. The auto OC thing in Precision helps you identify the weak points on the curve more easily than starting from scratch.


Thanks for all the helpful information I have read since I bought my 1070 recently. Plus rep.

Is there usually only one weak point?

So the use of XOC is to identify where the peaks and dips are likely to be if I set up a manual curve in AB. That's brilliant.


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> Did you enable game mode for 3dmark?
> 
> windows+G key while you are in 3dmark lobby page opens the game bar and will allow you to check a box to enable game mode for the application. Make sure you disable GameDVR in xbox settings


Yeah, I re-ran the benchmark with GameDVR disabled; graphics score is still around 20.8-20.9K, just margin of error.

Still didn't quite break 21K.

I want to test with a custom curve.

What is the lowest voltage point to increase in a custom curve? Start from 0.950V or 1.025V?


----------



## zipper17

Just a little clue from what I observed: in Graphics Test 2 you can guess the graphics score from the framerate.

At the very end of Graphics Test 2, if you hit 90+ FPS, your graphics score will likely break 21K.









You can run Graphics Tests 1 & 2 only to observe the graphics score behavior; you don't need to run the full test every time.


----------



## gtbtk

Quote:


> Originally Posted by *crastakippers*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Using the curve allows you, once you have identified which point is the weak one, to increase all the others up to their +150 max potential while leaving the 1.025 point at +100. The auto OC thing in Precision helps you identify the weak points on the curve more easily than starting from scratch.
> 
> 
> 
> Thanks for all the helpful information I have read since I bought my 1070 recently. Plus Rep.
> 
> Is there usually only one week point?
> 
> So the use of XOC is to identify where the peaks and the dips are likely to be if I set up a manual curve in AB. Thats brilliant.
Click to expand...

Every card in every motherboard seems to be different. On my motherboard, I can tune CPU PLL and VCCIO voltages and it can improve performance. The XOC autotune test can be handy for seeing the effects of motherboard voltage adjustments as well.

My card, at default motherboard voltages on the Z68 board I am using, dips over about 4 points; the worst is around 1.0V. I can improve things by increasing my CPU PLL voltage slightly. I suspect the dip may be a ceiling caused by the PCIe 2.0 bus I am forced to use. Newer boards with PCIe 3.0 may be better, but I have not tested my card in another board.

If it were me using the XOC autotune to profile my card, I would run the test between +75 and +150 MHz in 12.5 MHz increments to start with. If it crashes straight away, drop the test to run +50 to +125. If everything passes at +150, run the autotune again between +150 and +200. I suggest breaking it down because the whole test takes a while to run, and it gets boring waiting for it to finish. If the app crashes during the test, just restart it and it will pick up testing at the next higher voltage point from where it crashed.

If you apply the curve the autotune creates to the card, you can save it to a profile in XOC if you want, then shut down Precision XOC and start up Afterburner. Open the AB curve window and you will see the curve XOC created, so you can save it in an AB profile slot if you want. I personally would not expect the XOC curve to be truly stable, but it is a starting point. If it causes crashes, try dropping each point by 12 or 25 MHz and see if that is stable. If it is, you can try bumping individual points back up and testing again.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Did you enable game mode for 3dmark?
> 
> windows+G key while you are in 3dmark lobby page opens the game bar and will allow you to check a box to enable game mode for the application. Make sure you disable GameDVR in xbox settings
> 
> 
> 
> yeah re-run the benchmark with GameDVR disabled, graphic scores still around 20,8-20,9 K just a margin error
> 
> still didnt break 21K+ perfectly
> 
> i would want to test with custom curve
> 
> what is the lowest point voltage to be increased in custom curve? start from 0.950v or 1.025v?
Click to expand...

I would leave everything else as you have it to get 20900 and just try increasing the .950V point up to 2000 or 2025. If you use HWiNFO, the .950V point seems to be the one that controls the "video clock" result. I have no idea what the video clock is actually supposed to do, and the reported frequency does not seem to relate to anything on the curve, but I have found I get my best performance if I can run it as high as possible. It may well be that it affects the card's P-states, but none of that information is reported in AB or HWiNFO.

As far as I have been able to work out, Nvidia actually uses different voltage points along the curve to adjust settings for different features of the GPU. Unfortunately, Nvidia doesn't bother telling us what they actually are.

I noticed that if I get better than 118 at the end of Graphics Test 1, I usually get a great score.


----------



## Madmaxneo

Quote:


> Originally Posted by *zipper17*
> 
> Just tried a new driver whql 381.65 and Windows 10 Creator Update (1703)
> 
> 3d mark firestrike result
> http://www.3dmark.com/fs/12307496
> 
> +75 core clock, +600memory (9200mhz effective), max voltage, max power, 100% fanspeed.
> highest point core clock is at 2076mhz seen in curve table. Custom curve = no.
> Overall & Graphic scores pretty much the same as before,i didn't see any improvement.
> 
> Why reddit saying there is an improvement,
> 
> https://www.reddit.com/r/63t2bg/driver_38165_faqdiscussion_thread/
> 
> I am not suggesting you use precision long term. The auto OC feature is a handy tool to get a rough profile of your card.
> 
> You can have both installed at the same time, just don't run them at the same time
> 
> 980 didn't have the curve to edit. It is a new feature with pascal cards that lets you eek out the last bit of performance from your card.
> 
> With the slider, it might go unstable at say, +100 if you are only using the slider. The instability is because one of the voltage points is unstable at +100 and the slider moves the entire fixed shape curve up and down. The weakest point could be say, 1.025V, All the other points might be quite happy at +150.
> 
> With the slider only, you waste all the OC potential of that extra 50 points for all the rest of the points other than the 1.025V.
> 
> Using the curve allows you, once you have identified which point is the weak one, to increase all the others up to their +150 max potential while leaving the 1.025 point at +100. The auto OC thing in Precision helps you identify the weak points on the curve more easily than starting from scratch.


The last time I tried Precision XOC I started having some weird issues. Even now the Precision forums have people complaining about issues that disappear when XOC is uninstalled. I may give it another try regardless. But how do you get to the curve feature in MSI AB?
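The slider-vs-curve point quoted above can be illustrated with a toy calculation. The voltage points and per-point limits here are invented purely for illustration; real curves have many more points:

```python
# Toy illustration of slider vs per-point curve offsets (values are invented).
# Each voltage point has its own maximum stable offset in MHz.
max_stable = {0.950: 150, 1.000: 150, 1.025: 100, 1.050: 150}

# Slider: one offset applied to every point, so the weakest point caps it.
slider_offset = min(max_stable.values())

# Curve editor: each point can run at its own maximum instead.
curve_offsets = dict(max_stable)

print(slider_offset)          # -> 100 (the 1.025 V point holds everything back)
print(curve_offsets[0.950])   # -> 150 (the other points keep their headroom)
```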


----------



## Imprezzion

I have 2 cards here. An MSI Gaming Z with Micron that runs 2114-2128 MHz core and 4404 MHz VRAM max stable on max voltage and power limits in MSI AB. It runs incredibly cool (never hits 60°C on my custom fan profile) and is very stable; clocks don't vary at all and it never hits the power limit.

I also have a Gigabyte G1 Gaming. It clocks better: it has Samsung memory that will do 4700-4800 MHz, and the core looks good for 2128+, but even at stock it bounces all over the power limit.. Core clocks are all over the place with the G1 and I can't get it to stabilize at all..

Is there a BIOS or tweak or whatever I can do to get a better power limit on the G1? I mean, it's (much) louder and hotter than the Gaming Z, but hey, if it clocks better..


----------



## Gurkburk

My performance dropped dramatically after updating to the new Windows 10 Creators Update.

Getting around ~11500 physics score in 3DMark Firestrike now. Below 16k overall score..


----------



## Bold Eagle

Quote:


> Originally Posted by *Gurkburk*
> 
> My performance dropped dramatically after updating to the new windows 10 creators.
> 
> Getting around 11500~ physics score in Firestrike 3dmark now. Below 16k overall score..


Did you update the Video Driver *after* the Win 10 update?


----------



## Gurkburk

Quote:


> Originally Posted by *Bold Eagle*
> 
> Did you update the Video Driver *after* the Win 10 update?


I updated to the latest driver a few days ago, so before updating to the new Win10.


----------



## Imprezzion

My performance only improved after updating to Creators with 381.65. I was on Anniversary with 378.xx and I clearly notice improvement in both games and benchmarks.

EDIT: I got so annoyed by the terrible power limit on the G1 Gaming that I decided to just YOLO-flash a BIOS from a different card onto it.. And it worked.. That was very unexpected.. The G1 is now running my other card's BIOS, an MSI Gaming Z. It booted, drivers recognized the card and all outputs, and it shows in GPU-Z with MSI branding and clocks.

I'm going to see if the power limit is actually higher now and if it can OC stable.. Will report back. This BIOS should have a 291 W power limit instead of the G1's 200 W.

EDIT2: Not stable with the Gaming Z BIOS. Artifacts like mad under load.

Flashed a Gigabyte XOC BIOS now. It also boots and idles fine. Testing load now (240 W power limit).

EDIT3: So far no artifacting or weird behavior on the XOC BIOS. Runs Firestrike Ultra looped fine so far.

EDIT4: OK, I can say it's "safe" to flash a G1 Gaming with the XOC BIOS if it has Samsung memory. Runs fine. Fan speed in RPM is a LOT lower, so you need 80% PWM to run 2200 RPM. But it's stable and so far runs Firestrike at 2114 MHz core / 4700 MHz VRAM fine without hitting the power limiter. Barely.


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *zipper17*
> 
> Just tried a new driver whql 381.65 and Windows 10 Creator Update (1703)
> 
> 3d mark firestrike result
> http://www.3dmark.com/fs/12307496
> 
> +75 core clock, +600memory (9200mhz effective), max voltage, max power, 100% fanspeed.
> highest point core clock is at 2076mhz seen in curve table. Custom curve = no.
> Overall & Graphic scores pretty much the same as before,i didn't see any improvement.
> 
> Why reddit saying there is an improvement,
> 
> https://www.reddit.com/r/63t2bg/driver_38165_faqdiscussion_thread/
> 
> I am not suggesting you use precision long term. The auto OC feature is a handy tool to get a rough profile of your card.
> 
> You can have both installed at the same time, just don't run them at the same time
> 
> 980 didn't have the curve to edit. It is a new feature with pascal cards that lets you eek out the last bit of performance from your card.
> 
> With the slider, it might go unstable at say, +100 if you are only using the slider. The instability is because one of the voltage points is unstable at +100 and the slider moves the entire fixed shape curve up and down. The weakest point could be say, 1.025V, All the other points might be quite happy at +150.
> 
> With the slider only, you waste all the OC potential of that extra 50 points for all the rest of the points other than the 1.025V.
> 
> Using the curve allows you, once you have identified which point is the weak one, to increase all the others up to their +150 max potential while leaving the 1.025 point at +100. The auto OC thing in Precision helps you identify the weak points on the curve more easily than starting from scratch.
> 
> 
> 
> The last time I tried Precision XOC I started having some weird issues. Even now the precision forums has people complaining about issues that disappear when XOC is uninstalled. I may try it otherwise. But how do you get to the curve feature in MS AB?
Click to expand...

TimeSpy doesn't seem to like memory overclocked more than about +400 to +450.

XOC and Afterburner only have code in memory when they are running. The XOC server / RivaTuner processes that also start up should close automatically, but it is worth checking the task bar and closing them manually if they are still present. I have had both installed on my PC and never had issues, but I do make sure not to run them concurrently.


----------



## Imprezzion

Conclusion:

- Gigabyte G1 Gaming can be flashed with the XOC BIOS. You gain +40 W of power limit.
Note: Fan speed in RPM is halved; you need a custom curve to compensate (~80% on XOC is ~50% on the G1).

- MSI Armor and Gaming X can be flashed to Gaming Z.
Note: All 3 versions have a VERY high power limit of 291 W.

- Crossflashing with the -6 flag in NVFlash to override PCI-E subsystem ID mismatches works and is relatively safe.
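As a sketch of the crossflash procedure above (commands written from memory, the BIOS filename is a placeholder; check your nvflash version's help output, back up your original BIOS first, and flash at your own risk):

```shell
# Back up the card's current BIOS before anything else.
nvflash --save backup.rom

# Cross-flash a BIOS from a different board; -6 overrides the
# PCI subsystem ID mismatch check that would otherwise block it.
nvflash -6 target_bios.rom
```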


----------



## gtbtk

Quote:


> Originally Posted by *Imprezzion*
> 
> I have 2 cards here. A MSI Gaming Z with Micron that runs 2114-2128Mhz core and 4404Mhz VRAM max stable on max volt and power limits in MSI AB. Runs incredibly cool (never hits 60c on my custom fan profile) and is very stable. Clocks don't vary at all and never hits power limit.
> 
> Now, I also have a Gigabyte G1 Gaming. It clocks better, has Samsung that will do like 4700-4800 MHz and the core looks like it's good for like, 2128+ only even on stock it bounces all over the power limit.. Core clocks are all over the place with the G1 and I can't get it to just stabilize at all..
> 
> Is there like, a BIOS or tweak or whatever I can do to get a better power limit on the G1? I mean, it's (much) louder and hotter than the Gaming Z but hey if it clocks better..


Keep in mind that even though a BIOS will install and run, it doesn't always match the hardware that well.

When I was diagnosing the Micron bug, I cross-flashed just about all the BIOSes to my MSI card.

For a G1, if memory serves me correctly, it is a single 8-pin power card.

You could try the following:

Palit Super Jetstream, 1633 MHz base clock, 225 W

https://www.techpowerup.com/vgabios/187001/palit-gtx1070-8192-161021

Palit Gamerock Premium / Gainward GLH, 1671 MHz base clock, 2127 MHz memory, 225 W (basically the same card with a different shroud)

https://www.techpowerup.com/vgabios/187013/palit-gtx1070-8192-161026

https://www.techpowerup.com/vgabios/187062/187062

Asus Strix OC BIOS, 1633 MHz core clock, 200 W. That is the same power limit, but the power usage settings in the BIOS may not be as aggressive. This worked well when I flashed my MSI Gaming X card with the Asus BIOS, and it is my favorite non-MSI BIOS for my card. I am currently running the Gaming Z BIOS.

https://www.techpowerup.com/vgabios/187005/asus-gtx1070-8192-161020

The EVGA FTW BIOS with its 226 W limit constantly bounces off the limit too, so it has something to do with how aggressively the BIOS is configured to draw power. I am not really a fan of that BIOS.

You could also try the MSI Gaming Z BIOS on the G1 as well. The MSI, as you know, is 6+8 pin, but it doesn't really draw more than 230 W, so it is within the limits of the G1 hardware.

https://www.techpowerup.com/vgabios/187155/msi-gtx1070-8192-161024-3

https://www.techpowerup.com/vgabios/187155/msi-gtx1070-8192-161024-3


----------



## Imprezzion

I tried the Gaming Z BIOS on the G1 and it sort of works, but it gives very serious memory-related artifacts and crashes even at stock VRAM speed.

The Palit BIOS works fine, but it offers no real advantage over the XOC BIOS, as that one has 240 W. I still see 105-108% (~230 W) on that BIOS when clocked to 2128 MHz core, 4700 VRAM and +100 core volts. But at least it isn't throttling like with the original G1 BIOS.

EVGA, well, I used to run an EVGA FTW BIOS on my G1 Gaming 980 Ti.. Hehe. Haven't tested it on the 1070 yet.

I am selling my G1 anyway in favor of the Gaming Z, as it simply looks better, is MUCH cooler and quieter, and has the best power limit plus an actual 6+8 pin to feed it. And last but not least, a vented backplate, which is way better than the G1's closed backplate.

The only major downside to my particular Gaming Z is that it has Micron VRAM and not Samsung. It's good Micron nonetheless, does 4400-4450 MHz perfectly stable, but it's no 4700+ Samsung..


----------



## gtbtk

Quote:


> Originally Posted by *Imprezzion*
> 
> I tried the Gaming Z BIOS on the G1 and it works sort of. But gives very serious memory relate artifacts and crashes even on stock vram speed.
> 
> The Palit BIOS works fine, but it offers no real advantages over the XOC BIOS as that has 240w. I still see 105-108% (~230w) on that BIOS when clocked to 2128mhz core, 4700 vram and +100 core volts. But at least it isn't throttling like with the original G1 BIOS.
> 
> EVGA, well, I used to run a EVGA FTW BIOS on my G1 Gaming 980 Ti.. Hehe. Haven't tested in the 1070 yet.
> 
> I am selling my G1 anyway in favor of the Gaming Z as it simply looks better, is MUCH cooler and quieter and has the best power limit and actual 6+8 to run it. And last but not least, a vented backplate which is way better than the G1's closed backplate.
> 
> The only major downside to my particular Gaming Z is the fact it has Micron VRAM and not Samsung. It's good Micron non the less, does 4400-4450Mhz perfectly stable, but it's no 4700+ Samsung..


There is nothing generally wrong with Micron memory. The MSI Gaming cooler, in my opinion, is one of the best around. I like mine too.

The EVGA BIOSes bounce around everywhere, even at 1080p, due to hitting the power limit. I would not bother wasting your time unless you want to play with Precision XOC autotuning.

Make sure any BIOS you test is an 86.04.50.00.xx version. That is the family of BIOSes that solves the memory controller bug that affected Micron memory. After the update, I could run Firestrike at memory clocks just under 9400 MHz; TimeSpy is stable at 9000 MHz. I was getting the speeds you are getting now with the 86.04.26.00.xx version BIOSes.

Until my PSU blew it up the other day, I was running on an Asus Z68 board, and I found that tweaking the VCCIO and CPU PLL voltages a little helped improve the graphics memory OC performance and stopped blue-line artifacts appearing on screen with accelerated 2D running when I was at +600 over reference memory clocks. Z77 may be similar?

Remember that the high-end cards tend to have beefier VRMs and dual power connectors that the lower-range cards don't have. You can certainly try flashing a Zotac AMP Extreme BIOS as well, but it will pull 300 W, and you don't really get a performance increase in line with the amount of extra power you are drawing.


----------



## Imprezzion

I run the Micron update BIOS on my Gaming Z with Micron, but anything over 4450 MHz gives bad artifacting (black blocks and stripes) in any 3D load. That's kind of why I wanted a Samsung card.

Did they sell the Gaming X / Z with Samsung at all?


----------



## zipper17

2138 MHz is by far the highest core clock at which my card can at least pass a benchmark, but I guess that was just a lucky run. In a 3DMark loop stress test it will likely crash.

And even at 2138 MHz my card still can't manage a solid 21,000+ graphics score.
Memory at +600 = 4600 (9200 MHz effective).

It's Samsung VRAM, but beyond +600 I get small artifacts during the 3DMark loop stress test,
so +600 is by far the most stable VRAM OC I can achieve.
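For anyone puzzled by offsets like "+600 = 9200 MHz effective": monitoring tools report a GDDR5 "memory clock" that is half the effective (double-data-rate) figure, and the 1070's stock reported clock is 4000 MHz. A quick sanity check of the arithmetic:

```python
# GDDR5 effective clock from an Afterburner-style memory offset.
# Afterburner/GPU-Z report a clock that is half the effective DDR figure.
stock_reported = 4000    # MHz, 1070 stock as shown in Afterburner
offset = 600             # MHz offset applied in Afterburner

reported = stock_reported + offset
effective = reported * 2

print(reported)     # -> 4600
print(effective)    # -> 9200
```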


----------



## icold

Quote:


> Originally Posted by *zipper17*
> 
> 2138mhz coreclock by far is the highest coreclock that my card can at least pass a benchmark, but I guess it's just a lucky run. On 3d mark loop stress test it will likely crashes.
> 
> and with that 2138mhz my card still cant even manage solid +21.000 graphic scores,
> Memory at +600 = 4600(9200mhz effective)
> 
> It's Samsung vram, but beyond +600 it will got a small artifacts during 3d mark loop stress test.
> So +600 is by far most stable i can achieve for vram oc.


You mean 2139 MHz? My card too. But to reduce throttling you need a 200 W BIOS.


----------



## Quadrider10

So I've tried the Gigabyte Xtreme BIOS on my G1, but the fan profile is too low and the card gets hot. What would be the next best BIOS without having to make a custom fan curve?


----------



## Imprezzion

Probably the Palit one with 240 W power, if the fan speeds are correct with that one. I'd just use a custom curve..


----------



## gtbtk

Quote:


> Originally Posted by *Imprezzion*
> 
> I run the Micron update BIOS on my Gaming Z with Micron but anything over 4450Mhz will give bad artifacting (black blocks and stripes) in any 3d load. That's kind of why i wanted a Samsung card.
> 
> Did they sell the Gaming X / Z with Samsung at all?


Yes, they have sold both models with both Samsung (86.04.1E.00.xx original default BIOS) and Micron (86.04.26.00.xx original default BIOS, cards manufactured before Nov 2016) memory installed.

The other thing you should try is to increase your VCCIO voltage from the default of 1.05 V to somewhere between 1.1 and 1.2 V. That solved a lines-and-squares artifact problem I had that was triggered by accelerated 2D canvas (as used by Chrome) with a high memory OC, and the blue-line artifacts I got on the later BIOS after the update.


----------



## gtbtk

Quote:


> Originally Posted by *Quadrider10*
> 
> So I've tried the gigabyte extreme bios on my G1 but the San profile is too low and card gets hot, what would be the next best bios without having to make a custom fan curve?


I would try the Asus Strix OC BIOS. It is a good one. The FPS performance of the card is not directly related to how much power it is pulling.

The MSI Gaming Z BIOS is good too. MSI have done some strange things to stabilize the frequency at OC. The card reports that it can support 291 W with the 126% power limit, but it will start throttling at about 106%; to get there, though, you need a 4K graphics load.

The Palit/Gainward BIOSes could also be worth trying.


----------



## Quadrider10

I'll give those a shot, thanks!


----------



## Quadrider10

The MSI one held up the OCs well without downclocking, but the fan profiles on both BIOSes are terrible: mine only runs at 20-40% and the card hits 75°C. Plus I really don't want yet another program running all the time just for a fan curve.

Oh well, sticking to the stock BIOS.

I really want a 1080 or 1080 Ti now that I've upgraded to 1440p, but I just can't justify buying one of those cards without being able to mod the BIOS.


----------



## Imprezzion

Quote:


> Originally Posted by *gtbtk*
> 
> yes, they have sold both models with both Samsung (86.04.1E.00.xx original default bios) and Micron (86.04.26.00.xx original default bios cards manufactured before Nov 2016) memory installed.
> 
> The other thing you should try is to increase your VCCIO voltage from the default of 1.05v to somewhere between 1.1 and 1.2v, That solved anlines and squares artiface problem i had that was triggered by accelerated 2d canvas as used by chrome, with high mem oc after the update for me blue lines artifacts on the later bios


Hmm. Might be worth trying to trade it for a Samsung model (either X or Z, I don't really care for the RGB on the Z and the PCB is the same anyway).

I'm already running VCCIO on 1.15 and VCCSA on 1.05 because my 3770K has a terrible memory controller and needs this to even run 4 sticks on 2133 cl9.


----------



## Gurkburk

Is anyone else here also finding that 3DMark has become extremely hard to pass? The new Heaven benchmark gets my 1070 through at 2126 MHz and +545 memory clock, scoring 20088 with my 4770K.

Meanwhile, in 3DMark it struggles at +115 on the core clock.. Can't make it through Firestrike..


----------



## Mad Pistol

Quote:


> Originally Posted by *Gurkburk*
> 
> Is anyone here also thinking that 3DMark has become extremely bad for everything? The new Heaven benchmark can get my 1070 through with 2126mhz and +545 memory clock, and score 20088 with my 4770k.
> 
> Meanwhile, 3Dmark struggles +115 on Coreclock.. Can't make that through Firestrike..


I wouldn't say "bad". Just more demanding.

What I have noticed about Nvidia cards is that when the workload is more intense, they tend to be much more finicky about clock speeds and performance. Mainly this can be seen with increasing resolution: if you run the same settings in BF1, for instance, and compare 1080p and 4K, you will notice that the card bounces off the power limiter a lot more at 4K. I assume this is because the card's resources are more heavily saturated than at 1080p.

We're probably seeing the same phenomenon in Firestrike.


----------



## lanofsong

Hey GTX 1070 owners,

We are having our monthly Foldathon from Monday 17th - Wednesday 19th - 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

April 2017 Foldathon

BTW - make sure you sign up









To get started:

1. Get a passkey (allows for the speed bonus) - needs a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter the Team OCN number - 37726

later
lanofsong


----------



## ParlyShary

Hi, I'm about to receive my new MSI 1070 Gaming X.
I'd like to know what I should look out for when I get my hands on it, as I've read about several issues like coil whine and VBIOS problems.
Also, what should I expect coming from an OC'd 970? Maybe 50% more FPS? And what about temps (my 970 reaches 75°C in certain games)?


----------



## Gurkburk

Quote:


> Originally Posted by *ParlyShary*
> 
> Hi, I'm about to receive my new MSI 1070 Gaming X.
> I would want to know what I should look for when getting my hand on this, as I read several issues like coil whining or VBios issues.
> Also what should I expect coming from an OC'd 970? I think 50% more fps? and for temp (my 970 reach 75°C on certain game)?


Your 1070 will be around 60-70°C in really heavy games. I think 50% more FPS is a bit high







But it's def better.


----------



## w-moffatt

Quote:


> Originally Posted by *Gurkburk*
> 
> Your 1070 will be around 60-70*C in really heavy games. Think 50% more fps is a bit high
> 
> 
> 
> 
> 
> 
> 
> But it's def better.


Seconded; my 1070 hits a max temp of 72°C under load in game.


----------



## zipper17

Quote:


> Originally Posted by *Gurkburk*
> 
> Is anyone here also thinking that 3DMark has become extremely bad for everything? The new Heaven benchmark can get my 1070 through with 2126mhz and +545 memory clock, and score 20088 with my 4770k.
> 
> Meanwhile, 3Dmark struggles +115 on Coreclock.. Can't make that through Firestrike..


I've noticed it ever since I bought the 1070, whenever I try to overclock it:

in Unigine Valley I can run very high core and memory clocks stable,

but in Firestrike I get crashes/artifacts during the benchmark or stress-test loops.

I guess 3DMark Firestrike is much more reliable; in Witcher 3 I also get the same crashes as in Firestrike.


----------



## Imprezzion

Quote:


> Originally Posted by *ParlyShary*
> 
> Hi, I'm about to receive my new MSI 1070 Gaming X.
> I would want to know what I should look for when getting my hand on this, as I read several issues like coil whining or VBios issues.
> Also what should I expect coming from an OC'd 970? I think 50% more fps? and for temp (my 970 reach 75°C on certain game)?


Check with GPU-Z whether the card has Micron or Samsung memory. If it's Micron, check whether the BIOS version is the Micron bugfix BIOS.

The Gaming X BIOS is solid. An upgrade to the Gaming Z BIOS is possible for higher base clocks.

+50% FPS isn't far off, I guess. At 1080p the 970 has about 61% of the performance of a 1070 Gaming X. So yeah.

Temps? Expect 55-60°C load at stock. OC'd to max volts and power with 2128 core and 4400 mem, mine hits about 62°C at 80% fan speed, which is still very quiet - only 2000 RPM.


----------



## Quadrider10

Can I get a link to the gaming x bios?


----------



## gtbtk

Quote:


> Originally Posted by *Imprezzion*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> yes, they have sold both models with both Samsung (86.04.1E.00.xx original default bios) and Micron (86.04.26.00.xx original default bios cards manufactured before Nov 2016) memory installed.
> 
> The other thing you should try is to increase your VCCIO voltage from the default of 1.05v to somewhere between 1.1 and 1.2v, That solved anlines and squares artiface problem i had that was triggered by accelerated 2d canvas as used by chrome, with high mem oc after the update for me blue lines artifacts on the later bios
> 
> 
> 
> Hmm. Might be worth trying to trade it for a Samsung model (either X or Z, I don't really care for the RGB on the Z and the PCB is the same anyway).
> 
> I'm already running VCCIO on 1.15 and VCCSA on 1.05 because my 3770K has a terrible memory controller and needs this to even run 4 sticks on 2133 cl9.
Click to expand...

VCCIO feeds the PCIe controller as well as the memory controller. Maybe try it at 1.2 V with a touch of extra SA voltage?


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gurkburk*
> 
> Is anyone here also thinking that 3DMark has become extremely bad for everything? The new Heaven benchmark can get my 1070 through with 2126mhz and +545 memory clock, and score 20088 with my 4770k.
> 
> Meanwhile, 3Dmark struggles +115 on Coreclock.. Can't make that through Firestrike..
> 
> 
> 
> i noticed that since bought 1070 and whenever I try to overclock it
> 
> on valley unigine i can run very high clockspeed on core & memory stable
> 
> but on firestrike i got crashes/artifacts in benchmark or stress test loops.
> 
> i guess firetrike 3dmark is much more reliable. in witcher 3 i also got the same crashes as on the firestrike.
Click to expand...

Firestrike is much harder on the hardware than Heaven and Valley. TimeSpy is even more brutal, in a similar way to Witcher 3.


----------



## gtbtk

Quote:


> Originally Posted by *Quadrider10*
> 
> Can I get a link to the gaming x bios?


https://www.techpowerup.com/vgabios/187158/msi-gtx1070-8192-161024

They are all available on techpowerup


----------



## JooTheNoo

Hi all, I have a big concern about my 1070...
I have the reference model with 4 phases. Is that enough? The card draws around 160 W under gaming stress (card alone)...
Or is it just a silly worry of mine?

I want to use it for around 2 years, 24/7, with a light constant load (CUDA) and a little gaming part-time...


----------



## Nukemaster

I do not think they would design a card to fail.

I have had reference cards last through years of heavy gaming (though not 24/7, mind you).


----------



## JooTheNoo

Because when I opened up the 1070, it looked like just 25 A for each phase... so when the card draws 160-170 W, it feels designed to fail to me. That's why I asked.

If they had used 50 A parts I would be relaxed... my knowledge here is limited. I remember dual-GPU cards - most of them were designed to fail (overheating, not enough power phases).


----------



## zipzop

Quote:


> Originally Posted by *JooTheNoo*
> 
> Becouse when I open 1070 it's just 25A for each phase... so when card draw 160W-170W-> it's feal like design to fail for me. That's why I asked
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If they put 50A I would be relax... my knowlage here is limited. I rember dual GPU cards - most of them they were design to fail (overheat, not enough power phases).


Each phase is 50 A, pretty sure. At max voltage and power draw that's around 220 W of potential. I've been running the 200 W Strix OC BIOS on my stock 4-phase model (EVGA SC) for a few months now. No issues.
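A rough sanity check of those numbers, taking the figures claimed in the thread (4 phases at 50 A each, ~1.09 V max Pascal core voltage) rather than any datasheet:

```python
# Rough VRM headroom estimate for a 4-phase reference 1070.
phases = 4
amps_per_phase = 50      # A, continuous rating claimed in the thread
vcore_max = 1.093        # V, typical Pascal max core voltage

max_current = phases * amps_per_phase    # 200 A total
max_power = max_current * vcore_max      # ~218.6 W

print(round(max_power))    # -> 219
```

That lands right around the "220 W potential" quoted above, comfortably over the ~160-170 W a reference 1070 draws.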


----------



## JooTheNoo

Maybe I've got something wrong.

50 A would be great.
Can I put the Strix OC BIOS on a reference card?


----------



## zipzop

Quote:


> Originally Posted by *JooTheNoo*
> 
> Maybe I get something wrong
> 
> 
> 
> 
> 
> 
> 
> 50A would be great.
> Can I put Strix OC Bios to reference card?


I don't know, can you?









Yes, it's possible, though you'll need to bypass the hardware subsystem ID mismatch check in NvFlash. That takes an extra switch, I think a "-6".

The correct one is STRIX-GTX1070-O8G-GAMING, BIOS v. *86.04.50.00.63*


----------



## gtbtk

Quote:


> Originally Posted by *JooTheNoo*
> 
> Hi all I have big concern about my 1070...
> I have reference model with 4phases. Is it enough? Since the card taking around 160Wats in gaming stress (card alone)....
> Or it's just my stupid concering.
> 
> I want to use it around 2 years 24h/7 with a little of stress (CUDA) all the time and a little of gaming part time...


You should be fine. The MOSFETs that Nvidia uses on the Founders cards are 4C85N models, I think; each can handle 50 A of continuous current and up to 300 A pulsed.


----------



## ParlyShary

Got my MSI 1070 Gaming X today.
First: this thing is LARGE! I had to bend the power cables to make them fit in my full-tower case.
The card's firmware is already the latest (Micron fix), so I don't have to flash it myself, which is appreciated.
I use Afterburner to edit the fan curve so the card doesn't hit ~50°C when I'm just browsing the web and watching videos.
Max temperature under load = 60°C, so about 15°C less than my GTX 970.
Otherwise, my 2500K is in trouble; fortunately I play at 1440p. I think my next upgrade will be next year with the new Intel CPUs.
Oh, and when I turned on my PC today I got a weird artifact and then no signal on startup. After a restart it worked normally, but it's a bit annoying; I'll see if the problem occurs again, but that's weird.
Edit: sorry for the bad English, I hope I'll improve by posting some feedback here


----------



## gtbtk

Quote:


> Originally Posted by *ParlyShary*
> 
> Got my MSI 1070 Gaming X today.
> 1st: This thing is LARGE! I had to bend the power cable to make them fit in my full tower case.
> The card's firmware is the latest (Micron), so I don't have to flash it myself, which is appreciable.
> I use Afterburner to edit the fan curve, so the card don't hit like 50°C when I'm just browsing the web and watching videos.
> Max temperature on load=60°C so about 15°C less than my GTX970.
> Otherwise my 2500k is in trouble, fortunately I play at 1440p. I think my next upgrade will be next year with the new Intel CPU I think.
> Oh, and when I turn on my PC today I got a weird artifact ans then no signal on startup. After a restart it works normally but it's a bit annoying, I will see If the problem occurs again, but that's weird.
> Edit: sry for bad English, I hope I will progress posting some feedbacks here


Try increasing vccio voltage slightly to about 1.1v


----------



## Jpmboy

Hey Guys - Team OCN needs your help. We need a 1070 to ring up a score in *TimeSpy for this*.
MB must be an ASUS board, if you do compete, please follow HWBOT rules for subs.


----------



## Blatsz32

Hello, it's been a long time since I've posted, but I recently upgraded my rig and I've run into some issues.
While playing PlayerUnknown's Battlegrounds I've started getting crashes with a pop-up window stating that I've run out of GPU memory. I'm running 2 MSI EK Seahawks in SLI; I thought the cards would handle that game with no problems. I do have the game set to Ultra, but I don't think that should be a problem. I've played The Division with no issues, so this is a bit puzzling.

Could I have sensitive cards? Wonky memory? Do I need to upgrade the BIOS? I've tried overclocking but I can't get anything above 200 MHz over stock. Reading through the forums I've seen numbers that surpass that, and they are using stock coolers while mine is under water. I'm debating unplugging the cards and changing out my thermal paste. The strange thing is, my temps are good.

I'm kind of bummed out with my cards at the moment. Any guidance would be greatly appreciated. Ty.


----------



## ParlyShary

Quote:


> Originally Posted by *gtbtk*
> 
> Try increasing vccio voltage slightly to about 1.1v


Is this a known issue? It seems to happen only on the first boot of the day.


----------



## rfarmer

Quote:


> Originally Posted by *Blatsz32*
> 
> Hello, its been a long time since I've posted, but I recently upgraded my rig and i've run into some issues.
> While Playing PlayerUnknowns battle Grounds Ive started getting crashes with a window that pops up stating that Ive run out of GPU memory. I'm running 2 MSI EK Seahawks in SLI, i thought the cards would be able to handle that game with no problems. I do have the game set to Ultra but I don't think that it should be a problem. I've played the Division with no issues so this is a bit puzzling.
> 
> Could I have sensitive cards? Wonky memory? Do I need to upgrade Bios? Ive tried overclocking but I can't get anything above 200mhz over stock. readign through the forums ive seen numbers that surpass that they are using stock coolers whiile mine is under water. I'm debating on unplugging the cards and changing out my thermal paste. Strange thing is, my temps are good.
> 
> I'm kind of bummed out with my cards at the moment. Any guidance would be greatly appreciated. ty.


Google "playerunknown's battlegrounds memory error" and you will get pages of results showing memory problems after the recent update. http://forums.playbattlegrounds.com/topic/5419-memory-error-please-post-specifics/ I would start in their forum.


----------



## gtbtk

Quote:


> Originally Posted by *ParlyShary*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Try increasing vccio voltage slightly to about 1.1v
> 
> 
> 
> Is this a known issue? It seems to happen only on the first boot of the day.

Specifically, no. However, intermittent problems like that tend to be indicative of voltage levels hovering right at the edge of failure. As the GPU is new, I am guessing it is the unaccustomed load your 2500K rig now finds itself under that is pushing a requirement to the very edge of stability. Increasing VCCIO will help the PCIe and memory controllers.

If it is not VCCIO and it still keeps happening, it could be the RAM voltage or vcore. Try one at a time; you should not need to increase any of them by very much.


----------



## ParlyShary

Ok, thanks for the answer. I put the VCCIO at 1.075.
I think my MB is pretty weak (P8Z77-V LX), so it could be faulty.
If the problem happens again, I will push the VCCIO higher.


----------



## TheBoom

Quote:


> Originally Posted by *Jpmboy*
> 
> Hey Guys - Team OCN needs your help. We need a 1070 to ring up a score in *TimeSpy for this*.
> MB must be an ASUS board, if you do compete, please follow HWBOT rules for subs.


The link you provided doesn't seem to load


----------



## ParlyShary

Raising the VCCIO to 1.10 didn't change anything; the PC still fails to boot on the first try.
I also tried with no CPU OC, without success; the problem still occurs after the PC has been off for about 10 minutes.
I see this after the Windows logo; it stays for 5 seconds and then the screen turns completely black:


I have a SeaSonic 620GM2.

Edit: Disabling Fast Boot in Windows 10 seems to work for now, but I wonder if the problem comes from the MB, GPU or PSU


----------



## JooTheNoo

Quote:


> Originally Posted by *zipzop*
> 
> I don't know, can you?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes it's possible, though you'll need to bypass the hardware subsystem ID mismatch in NvFlash. Takes some extra command I think a "-6" at the end
> 
> The correct one is STRIX-GTX1070-O8G-GAMING, BIOS v. *86.04.50.00.63*


Isn't it a problem that they have different power phases, layout, etc.?

If it's safe...
I have Micron memory; wouldn't it be better to install something newer?


----------



## Minium

Hello guys,
I'm pretty new to overclocking and just wanted to know if this GPU/overall score is good.


----------



## EDK-TheONE

Nice score!

What are your GPU and settings?


----------



## Minium

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Nice Score!
> 
> What is your gpu and settings?


I'll wait for some other opinions and tell you guys my settings then. I haven't seen any 1070 near a 22800 GPU score; do you guys know of any?


----------



## EDK-TheONE

Quote:


> Originally Posted by *Minium*
> 
> I´ll wait for some other opinions and tell you guys my settings then.I havent seen any 1070 near 22800 GPU score dou you guys know any ?


http://cdn.overclock.net/a/a8/a8adde77_zotac1070amp.png


----------



## Minium

Quote:


> Originally Posted by *EDK-TheONE*
> 
> http://cdn.overclock.net/a/a8/a8adde77_zotac1070amp.png


Wow, that's some SERIOUS GDDR5 memory OC, holy beeeeep. Is that the best 1070 GPU score you know of? It looks like a HWBOT submission screenshot or something like that.


----------



## gtbtk

Quote:


> Originally Posted by *ParlyShary*
> 
> Ok thanks for the answer, I put the VCCIO at 1.075.
> I think my MB is pretty weak (P8Z77-V LX), so it could be faulty.
> If the problem happens again I will push the VCCIO again.


The memory and PCIe controllers are on the CPU chip.

I had the P8Z68-V until it fried itself the other day while I was running Watch Dogs 2.


----------



## gtbtk

Quote:


> Originally Posted by *ParlyShary*
> 
> Raising the VCCIO to 1.10 didn't change anything, PC still fail to boot on the first try.
> Try with no CPU OC, with no success, the problem still occurs after the PC is off for about 10 minutes.
> I see this after the windows logo, stay for 5 seconds and then the screen turns completly black:
> 
> 
> I've a SeaSonic 620GM2
> 
> Edit: Disable Fastboot in Windows 10 seems to work for now, but I wonder if the problem comes from the MB, GPU or PSU


Fastboot can be problematic for many things. It is also possible that your PSU is past its prime. Check that all the power cables are firmly in place.


----------



## Madmaxneo

Quote:


> Originally Posted by *Jpmboy*
> 
> Hey Guys - Team OCN needs your help. We need a 1070 to ring up a score in *TimeSpy for this*.
> MB must be an ASUS board, if you do compete, please follow HWBOT rules for subs.


I am interested in this, but my Rampage IV Black Edition is starting to crap out.
I recently got my 4930K replaced under the Intel tuning plan, and it just came in the mail today. I will hopefully get the chance to put that chip in this weekend and join the contest. I am just not sure how well this board will handle some OC, but hopefully this chip is at least an average overclocker (which would be better than my old 4930K). We will see, though. FYI, I know to limit my CPU to using only 4 cores, so I will have to turn two off, which will help with heat and possibly with OCing.


----------



## zipper17

A 22k graphics score, but is it 100% stable, with no crashes/artifacts in all programs, games, stress tests and benchmarks?

After overclocking, your components also need a serious, proper stress test. If you're crashing/artifacting in certain applications, you've merely passed that one benchmark.

It's just like a CPU: you can boot at 5GHz, but certain stress tests/programs will still BSOD or crash.

Just FYI, because some people are not aware of this.
A proper stress test is described here: https://steamcommunity.com/discussions/forum/11/215439774868691574/

Part B, point 4: using the 3DMark stress tests (Fire Strike/FSX/FSU/Time Spy)
Quote:


> At each OC level start by running the 3DMark Fire Strike Stress Test. This point is passed only if this stress test ends with a result of at least 97% in its Frame Stability Rate (FSR) AND without noticing artifacts during the test.
> Recommendation: during this test look at the screen almost all the time to detect possible artifacts.
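The "raise a step, stress, back off on failure" procedure quoted above can be sketched as a simple loop. This is illustrative only: `run_stress` is a hypothetical stand-in for launching something like the 3DMark Fire Strike stress test and reading back its Frame Stability Rate, since 3DMark exposes no scripting API for this.

```python
def find_stable_offset(run_stress, start=0, step=25, limit=300):
    """Raise the core offset in small steps; back off one step on the first failure.

    run_stress(offset) -> (fsr_percent, saw_artifacts) is assumed to apply the
    offset, run the stress test, and report the result.
    """
    offset = start
    while offset <= limit:
        fsr, artifacts = run_stress(offset)
        # The pass bar quoted above: at least 97% FSR and no visible artifacts.
        if fsr < 97.0 or artifacts:
            return offset - step  # last known-good offset
        offset += step
    return limit
```

In practice you apply each offset by hand in Afterburner or a similar tool and keep your own notes; the loop just makes the pass/fail logic explicit.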


----------



## Minium

Quote:


> Originally Posted by *zipper17*
> 
> 22k Graphic scores but is it 100% stable no crash/artifacts in all programs, games, stress test, benchmark?
> 
> After overclocking progress it also needed a serious & proper stress test to your components, If you're crashing/artifacting at certain applications, you're just passed that benchmark.
> 
> Just like example CPU, you can boot with 5GHZ, but in certain stress Test/programs you'll BSOD or crash.
> 
> Just FYI because Some people are just not aware of this.
> A proper stress test like this https://steamcommunity.com/discussions/forum/11/215439774868691574/
> 
> Part B point 4; using 3dmark Stress test Firestrike/FSX/FSU/Timespy


I'm running my daily OC with a few MHz less on core and memory. With that OC I get a 226xx GPU score, completely stable in BF1, several 3DMark benchmarks, Unigine Superposition/Heaven/Valley and so on. I tested something: I don't actually have a 1070, it's a 980Ti. I have never seen a 1070 that could beat my score, not even the highest-overclocked ones.
http://www.3dmark.com/3dm/19417680
My daily clocks are 1600/2000, but even at these clocks I have only seen a few 1070s barely beating my GPU score. I have seen people upgrade from an OC'd 980Ti to a 1070 (also OC'd) and actually lose FPS and graphics score. My 980Ti is pretty rare (in terms of OC potential), outscoring all but maybe 1 in 10,000 1070s in Fire Strike.


----------



## ColdDeckEd

Quote:


> Originally Posted by *Minium*
> 
> I´m running my daily OC with a few mhz less on core and memory.With that OC I get 226xx GPU score.Completely stable in BF1,several 3DMark benchmarks,Unigene Superposition/Heaven/Valley and so on.I tested something. I dont actually have a 1070.Its a 980Ti.I have never seen a 1070 that could beat my score , not even the highest overclocked ones.
> http://www.3dmark.com/3dm/19417680
> My daily clocks are 1600/2000 but even with these clocks I only saw a few 1070 barely beating my GPU score.I saw some people upgrading from a OC´d 980Ti to a 1070(also OC´d) and actually losing FPS and graphics score.My 980Ti is pretty rare (in terms of OC potential) and outscoring every 1070 except 1/10000 in every FireStrike.


Must be nice to win the silicon lottery twice in a row! 1600 is just insane on a 980Ti.

My old 980Ti was also faster in FS than my 1070: my best score on the 980Ti was 21260, while on my 1070 it was 21000. The positions switch in Time Spy.


----------



## JooTheNoo

@*gtbtk*
You are amazing. So if I understand correctly, increasing the VCCIO and PCH(?) voltages can increase the graphics OC?

One more thing: is it possible to flash the 1.24V Strix BIOS to an MSI Gaming X?


----------



## Minium

Quote:


> Originally Posted by *ColdDeckEd*
> 
> Must be nice to win the silicon lottery 2 times in a row! 1600 is just insane on a 980ti.
> 
> My old 980ti was also faster in FS than my 1070. My best score on my 980ti was 21260, while on my 1070 it was 21000. The positions switch in TimeSpy.


Maybe Pascal is better at DX12 things.


----------



## zipper17

Quote:


> Originally Posted by *Minium*
> 
> I´m running my daily OC with a few mhz less on core and memory.With that OC I get 226xx GPU score.Completely stable in BF1,several 3DMark benchmarks,Unigene Superposition/Heaven/Valley and so on.I tested something. I dont actually have a 1070.Its a 980Ti.I have never seen a 1070 that could beat my score , not even the highest overclocked ones.
> http://www.3dmark.com/3dm/19417680
> My daily clocks are 1600/2000 but even with these clocks I only saw a few 1070 barely beating my GPU score.I saw some people upgrading from a OC´d 980Ti to a 1070(also OC´d) and actually losing FPS and graphics score.My 980Ti is pretty rare (in terms of OC potential) and outscoring every 1070 except 1/10000 in every FireStrike.


Yup, I'm aware of that; the 980Ti is generally still better in the overclocking/silicon lottery.

A 980Ti OC from a _pro overclocker_ can be on the same level as a 2016 Titan X (Pascal), AFAIK.

So far the 1070 is not really favored by _pro overclockers_.

The only place the 1070 wins against the 980Ti is performance per watt. You really need a lucky 1070 sample to pass a 22k graphics score, IMO;

there is very little chance of getting that kind of 1070, I believe.

20k+ is about the most for a 1070 OC; 21k+ can already be considered a good silicon chip.


----------



## rfarmer

Quote:


> Originally Posted by *zipper17*
> 
> Yup I'm aware of that, 980Ti generally still better in overclocking/silicon lottery.
> 
> I can see 980Ti OC from _pro overclocker_ can be in the same level as Titan X pascal 2016 Afaik.
> 
> So far 1070 is not really favored by _pro overclocker._
> 
> The only 1070 wins against 980ti is performance per watt. You really need a lucky sample of 1070 to pass 22k Graphic scores imo,
> 
> I mean very less opportunity to get that kind of 1070 I believe.
> 
> +20k is the most for 1070 OC. +21K also can be considered good silicon chip.


I'm actually a bit curious why you are posting in this thread when you don't own a 1070; if you just want to brag on your 980Ti, I am sure there is a thread for that.

Not many, if any, people in this thread came from a 980Ti; unless you were looking for lower power usage, the performance difference between them was too close to justify it. A 1080 I could see.

You get very good overclocks, it is true, but you also have the ability to edit the BIOS and don't have GPU Boost 3.0 to deal with; if that were true of the 1070, you would see some better numbers.

And if we are bragging on cards, my 1070 was probably at least $200 cheaper than your 980Ti.


----------



## TUFinside

I just ordered this: EVGA GeForce GTX 1070 FTW HYBRID GAMING


----------



## zipper17

Quote:


> Originally Posted by *rfarmer*
> 
> I'm actually a bit curious why you are posting in this thread when you don't own a 1070, if you just want to brag on your 980Ti I am sure there is a thread for that.
> 
> Not many, if any, people in this thread came from a 980Ti, unless you were looking for lower power usage the performance difference between them was too close to justify it. A 1080 I could see.
> 
> You get very good overclocks it is true, but you also have the ability to edit the bios and don't have gpu boost 3.0 to deal with, if that was true of the 1070 you would see some better numbers.
> 
> And if we are bragging on cards, my 1070 was probably at least $200 cheaper than your 980Ti.


Hell yeah, the 1070 is still solid for its price and performance. I own a 1070; I don't even own a 980Ti.
BTW, did you quote the wrong person?
Quote:


> Originally Posted by *Minium*
> 
> I´m running my daily OC with a few mhz less on core and memory.With that OC I get 226xx GPU score.Completely stable in BF1,several 3DMark benchmarks,Unigene Superposition/Heaven/Valley and so on.I tested something. I dont actually have a 1070.Its a 980Ti.I have never seen a 1070 that could beat my score , not even the highest overclocked ones.
> http://www.3dmark.com/3dm/19417680
> My daily clocks are 1600/2000 but even with these clocks I only saw a few 1070 barely beating my GPU score.I saw some people upgrading from a OC´d 980Ti to a 1070(also OC´d) and actually losing FPS and graphics score.My 980Ti is pretty rare (in terms of OC potential) and outscoring every 1070 except 1/10000 in every FireStrike.


----------



## TUFinside

Quote:


> Originally Posted by *Minium*
> 
> I´m running my daily OC with a few mhz less on core and memory.With that OC I get 226xx GPU score.Completely stable in BF1,several 3DMark benchmarks,Unigene Superposition/Heaven/Valley and so on.I tested something. I dont actually have a 1070.Its a 980Ti.I have never seen a 1070 that could beat my score , not even the highest overclocked ones.
> http://www.3dmark.com/3dm/19417680
> My daily clocks are 1600/2000 but even with these clocks I only saw a few 1070 barely beating my GPU score.I saw some people upgrading from a OC´d 980Ti to a 1070(also OC´d) and actually losing FPS and graphics score.My 980Ti is pretty rare (in terms of OC potential) and outscoring every 1070 except 1/10000 in every FireStrike.


OOO Strong ! the bragging creme !


----------



## rfarmer

Quote:


> Originally Posted by *zipper17*
> 
> Hell yeah 1070 still solid for it's price & performances, I own 1070, and I don't even own 980ti.
> btw are you quoted a wrong person?


Yeah, thanks, I meant to quote Minium. Sorry about that, zipper17.


----------



## Dude970

Quote:


> Originally Posted by *TUFinside*
> 
> OOO Strong ! the bragging creme !


That is an impressive FireStrike run he has. I came close to him
http://www.3dmark.com/fs/11972919


----------



## Minium

I never wanted to brag. I just wanted to find out if there is any highly OC'd 1070, air or water cooled, that could beat my 980Ti, and I guess a 1070 owners club is a good place for that. In every comparison video between the 980Ti/1070/1080 they use the worst reference 980Ti, sleeping at stupidly low clocks, so that information is completely useless.


----------



## Dude970

Quote:


> Originally Posted by *Minium*
> 
> I never wanted to brag.I just wanted to find out if theres any 1070 air/watercooled highly oc´d that could beat my 980Ti and i guess a 1070 owners club is a good place for that.In every comparison video between 980ti/1070/1080 they use like the worst reference 980ti sleeping at stupid low clocks so this information is completely useless.


How is your Timespy benchmark?


----------



## Minium

Quote:


> Originally Posted by *Dude970*
> 
> How is your Timespy benchmark?


That Fire Strike was the only run I did that day. I didn't run Extreme/Ultra/Time Spy yet; maybe I will run them tomorrow.


----------



## JooTheNoo

Quote:


> Originally Posted by *Minium*
> 
> I never wanted to brag.I just wanted to find out if theres any 1070 air/watercooled highly oc´d that could beat my 980Ti and i guess a 1070 owners club is a good place for that.In every comparison video between 980ti/1070/1080 they use like the worst reference 980ti sleeping at stupid low clocks so this information is completely useless.


Rarely can a 1070 be overclocked to 2.3GHz...
1920 SP @ 2.3GHz (but you need to mod the cooling to water)
2816 SP @ 1.5GHz

So the shader power is almost the same...
But the 980Ti's memory bandwidth is much better, and that is the reason the 980Ti is more powerful.
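The shader comparison above is just core count times clock. A quick sketch, assuming the usual peak-FP32 convention of 2 ops (one FMA) per shader core per clock:

```python
def fp32_gflops(shader_cores, clock_ghz):
    """Peak FP32 throughput: each shader core does one FMA (2 ops) per clock."""
    return shader_cores * clock_ghz * 2

# GTX 1070 at a rare 2.3 GHz OC vs. a GTX 980 Ti at 1.5 GHz
gtx1070 = fp32_gflops(1920, 2.3)   # ~8832 GFLOPS
gtx980ti = fp32_gflops(2816, 1.5)  # ~8448 GFLOPS
print(gtx1070, gtx980ti)
```

Compute throughput is nearly a wash, which is the point being made; the bandwidth gap (the 980 Ti's 384-bit bus against the 1070's 256-bit bus) is what separates them.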


----------



## TUFinside

I was kidding; indeed the GTX 980Ti is a great card and still relevant for today's applications and games, and that is what matters in the end.


----------



## Marshock

Quote:


> Originally Posted by *TUFinside*
> 
> I was kidding, indeed the GTX 980Ti is a great card and still relevant for today applications and games and that is what matter in the end.


Perhaps you should check this out if you are comparing the GTX 980 Ti to the GTX 1070. _It's right under your noses_:

http://www.overclock.net/t/1628699/sli-gtx970-g1-gaming-vs-gtx980-ti-strix-oc-vs-gtx1070-dual-benchmarks

The GTX 1070 Dual does not boost higher than 1770 MHz, while the GTX 980 Ti Strix OC goes up to 1550 MHz. Yes, had the GTX 1070 been a Strix OC it could have reached 2+ GHz, as proven elsewhere, but I agree that the GTX 980 Ti does better in the silicon lottery.


----------



## ezveedub

Just installed an Nvidia GTX 1070 Founders a few days ago... will add a waterblock to it soon.


----------



## gtbtk

Quote:


> Originally Posted by *Minium*
> 
> Quote:
> 
> 
> 
> Originally Posted by *zipper17*
> 
> 22k Graphic scores but is it 100% stable no crash/artifacts in all programs, games, stress test, benchmark?
> 
> After overclocking progress it also needed a serious & proper stress test to your components, If you're crashing/artifacting at certain applications, you're just passed that benchmark.
> 
> Just like example CPU, you can boot with 5GHZ, but in certain stress Test/programs you'll BSOD or crash.
> 
> Just FYI because Some people are just not aware of this.
> A proper stress test like this https://steamcommunity.com/discussions/forum/11/215439774868691574/
> 
> Part B point 4; using 3dmark Stress test Firestrike/FSX/FSU/Timespy
> 
> 
> 
> I´m running my daily OC with a few mhz less on core and memory.With that OC I get 226xx GPU score.Completely stable in BF1,several 3DMark benchmarks,Unigene Superposition/Heaven/Valley and so on.I tested something. I dont actually have a 1070.Its a 980Ti.I have never seen a 1070 that could beat my score , not even the highest overclocked ones.
> http://www.3dmark.com/3dm/19417680
> My daily clocks are 1600/2000 but even with these clocks I only saw a few 1070 barely beating my GPU score.I saw some people upgrading from a OC´d 980Ti to a 1070(also OC´d) and actually losing FPS and graphics score.My 980Ti is pretty rare (in terms of OC potential) and outscoring every 1070 except 1/10000 in every FireStrike.

There are not many 980Ti cards that run better than 1070s; you have a great card. The best graphics score I could manage on my i7-2600 rig was about 21500. Unfortunately my rig died the other day and I have not yet gotten around to deciding what new one to get.

As a matter of interest, how are you overclocking your Skylake rig? Are you using just the multiplier, or a combination of BCLK and multiplier, to get the CPU/RAM to 4.7/3466? If BCLK, what straps are you using for the memory, what RAM timings are you running, and what latency are you getting with your RAM?


----------



## gtbtk

Quote:


> Originally Posted by *JooTheNoo*
> 
> @*gtbtk*
> You are amazing. So if I understand increase VCCIO and PCH(?) voltage can increace OC of graphics?
> 
> One more thing is it possible to flash strix bios 1,24v to MSI Gaming X?


VCCIO and CPU PLL voltage, not PCH; at least on my Z68 motherboard. I am using BCLK overclocking as well, and I only needed a small increase.

Z77 is fairly similar, so I assume it may also have an impact, but I don't know how well it would work on more recent boards.

It worked for me, primarily for GDDR5 memory stability.

This is one of the best runs I got: http://www.3dmark.com/fs/11784694

As far as I know, the 1.2V BIOS is only for the 1080. If one does exist for the 1070 and I don't know about it, please let me know where I can get it. If it does exist, then you should be able to flash it to a Gaming X; the regular Strix BIOSes work on the MSI cards.


----------



## gtbtk

Quote:


> Originally Posted by *Minium*
> 
> I never wanted to brag.I just wanted to find out if theres any 1070 air/watercooled highly oc´d that could beat my 980Ti and i guess a 1070 owners club is a good place for that.In every comparison video between 980ti/1070/1080 they use like the worst reference 980ti sleeping at stupid low clocks so this information is completely useless.


Believe me, it is not just 980Ti comparisons; the tech media is pretty average at most things other than reading off review sheets. With all the Ryzen coverage, not a single one of the big sites has ever pondered the question of why Ryzen is slower than an Intel platform.


----------



## TUFinside

Quote:


> Originally Posted by *ezveedub*
> 
> Just installed a Nvidia GTX-1070 Founders a few days ago......will add a waterblock to it soon.....


Gratz ! Enjoy your new toy !


----------



## Minium

Quote:


> Originally Posted by *gtbtk*
> 
> There are not many 980TI that run better than 1070s. You have a great card. Best Graphics score I could manage on my i7-2600 rig was about 21500.
> 
> As a matter of interest, how are you overclocking your skylake rig? Are you just using multiplier or a combination of BCLK and multi to get the cpu/ram to 4.7/3466? If BCLK what straps are you using for the memory, what ram timings are you running with your memory and what latency are you getting with your ram?


Hello,
my CPU is running at 4.8GHz @1.38V (for benchmarks only; 24/7 is [email protected]) using the multiplier only. My RAM is at 3466MHz with CL16-16-16-36 at 1.55V (for benchmarks only; 24/7 is 3200MHz [email protected]). I can just set the memory frequency in the BIOS. I could run lower timings and a higher memory frequency, but I don't want to push the voltage higher (it's safe for benchmarks, I just don't like it).

I think my 6700K could do 4.9GHz, but at around 1.45V, which is pretty high, and my current CPU block is complete garbage (4.8GHz -> 80C at 100%). The memory could be better too, but I don't like volts higher than 1.5-1.6. I didn't put much work into RAM OC; maybe I could get better timings and frequencies if I adjusted some other settings.


----------



## gtbtk

Quote:


> Originally Posted by *Minium*
> 
> 
> 
> 
> Hello,
> my CPU is running 4.8GHZ @1.38V(for benchmarks ONLY -24/7 is [email protected]) using multiplier only.My ram is 3466MHZ with CL16-16-16-36 running at 1.55V(for benchmarks ONLY -24/7 is 3200MHZ [email protected]).I can just set the mem freq. in the bios.I could run lower timings and higher mem freq. but I dont want to push higher voltage (its safe for benchmarks but I just dont like it).
> 
> I think my 6700k could do 4.9GHZ but at around 1.45V which is pretty high and my current cpu block is complete garbage (4.8GHZ ->80C at 100%).Memory could be better too but I dont like higher volts than 1.5-1.6.I didnt put much work into ram oc.Maybe I could get better timings and freq. if I would adjust some other settings.

4.8GHz for a 6700K is great going. Did you delid the chip?

45ns is nice low memory latency. Rather than chasing the fastest memory clocks, you should tune for the best balance of frequency and lowest latency, if you are not already doing so. Lower latency directly relates to the balance of CPU/GPU performance; the FS combined score and gaming FPS improve in line with reduced memory latency. Using a lower memory strap @40ns could actually give you better scores than [email protected], for example. You may even find that with even lower latency you can clock the CPU at 4.7GHz, reduce the voltage and temps slightly, and get similar scores.

Higher BCLK / lower multiplier adjustments may also be worth experimenting with to improve latency/memory performance. BCLK up to 104-105MHz should not hurt anything.

The higher memory latency and immature memory support are the reason Ryzen underperforms Intel in 3D graphics benchmarks.
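The frequency-vs-timings trade-off here is easy to put in numbers. First-word CAS latency in nanoseconds is CL cycles at the memory clock (half the MT/s data rate); this is not the same thing as the ~45ns full round-trip latency an AIDA-style benchmark reports, but it moves in the same direction when you change strap and timings:

```python
def cas_ns(ddr_rate_mts, cl):
    """First-word CAS latency in ns: CL clock cycles at half the MT/s data rate."""
    return cl * 2000.0 / ddr_rate_mts

# The trade-off: a slower strap with tighter timings can have lower latency.
print(round(cas_ns(3466, 16), 2))  # 9.23 ns
print(round(cas_ns(3200, 14), 2))  # 8.75 ns
```

So a hypothetical 3200 CL14 kit would beat 3466 CL16 on latency despite the lower clock, which is the kind of balance being suggested above.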


----------



## Minium

Quote:


> Originally Posted by *gtbtk*
> 
> 4.8Ghz for a 6700K is great going. did you delid the chip?
> 
> 45ns is nice low memory latency. Rather than looking for fastest memory clocks, you should try tuning for the best balance of frequency and lowest latency, if you are not already doing it. Lower latencies directly relates to the the balance of CPU/GPU performance. FS combined score and gaming FPS performance improve in line with reduced memory latency. Using a lower memory strap @40ns could actually give you better scores that [email protected] for example. You may even find that with even lower latency, you can clock the CPU at 4.7Ghz, reduce the voltage and temps slightly and get similar scores
> 
> higher Bclk/ lower multiplier adjustments may also be worthg experimenting with to help with improving latency/memory performance. BCLK up to 104-105Mhz should not hurt anything.
> 
> The higher memory latency and immature memory support is the reason that Ryzen is under performing Intel in the 3D graphics benchmarks.


Not delidded; will do it soon.


----------



## Tennobanzai

Has anyone here installed a 1080 FE cooler on a 1070? If so, what was the temps like after?


----------



## Minium

Quote:


> Originally Posted by *Tennobanzai*
> 
> Has anyone here installed a 1080 FE cooler on a 1070? If so, what was the temps like after?


The 1070 FE and 1080 FE coolers are exactly the same?!


----------



## Tennobanzai

Quote:


> Originally Posted by *Minium*
> 
> 1070 FE and 1080 FE coolers are the exact same ?!


AFAIK they're different; the 1080/1080 Ti has a vapor chamber.


----------



## gtbtk

Quote:


> Originally Posted by *Tennobanzai*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Minium*
> 
> 1070 FE and 1080 FE coolers are the exact same ?!
> 
> 
> 
> AFAIK, they're different? 1080/1080 Ti has vapor chamber

The Ti has an improved vapor chamber over the 1080's vapor chamber.

The 1070 has a copper core.

http://www.pcgameshardware.de/Nvidia-Geforce-Grafikkarte-255598/Tests/GTX-1070-Benchmarks-Test-Preis-1196360/galerie/2582458/

The 1080 cooler should be a little more efficient on a 1070 than the original cooler.


----------



## Tennobanzai

Quote:


> Originally Posted by *gtbtk*
> 
> the Ti has an improved vapor chamber over the 1080 Vapor chamber
> 
> 1070 has a copper core
> 
> http://www.pcgameshardware.de/Nvidia-Geforce-Grafikkarte-255598/Tests/GTX-1070-Benchmarks-Test-Preis-1196360/galerie/2582458/
> 
> the 1080 cooler should be a little more efficient on a 1070 than the original cooler


Do you know if a 1080 Ti cooler fits a 1070 perfectly?


----------



## Mad Pistol

Quote:


> Originally Posted by *Tennobanzai*
> 
> Do you know if a 1080 Ti cooler fits a 1070 perfectly?


It probably won't, because the 1080 Ti has 11 memory chips while the 1070 has 8. Beyond that, I don't know if the shrouds have the same mounting holes.


----------



## zipzop

Quote:


> Originally Posted by *Tennobanzai*
> 
> Do you know if a 1080 Ti cooler fits a 1070 perfectly?


No, that definitely won't fit. If you're looking for an alternative cooling solution for your 1070 FE, go with either an EVGA 1070 or 1080 *SC* cooler assembly/heatsink (if you can find one online, e.g. on eBay).

There's also the Arctic Accelero Xtreme IV or the EVGA Hybrid kits.


----------



## gtbtk

Quote:


> Originally Posted by *Tennobanzai*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> the Ti has an improved vapor chamber over the 1080 Vapor chamber
> 
> 1070 has a copper core
> 
> http://www.pcgameshardware.de/Nvidia-Geforce-Grafikkarte-255598/Tests/GTX-1070-Benchmarks-Test-Preis-1196360/galerie/2582458/
> 
> the 1080 cooler should be a little more efficient on a 1070 than the original cooler
> 
> 
> 
> Do you know if a 1080 Ti cooler fits a 1070 perfectly?

I honestly don't know. I know that waterblocks for a 1080 will fit on a 1070.

These may help; the GPU, the memory chips and the chokes are all located in the same areas of the PCB.

1080 Ti Founders Edition



1070 Founders Edition


----------



## Minium

Temps would still suck; not worth it.


----------



## Mad Pistol

Quote:


> Originally Posted by *Minium*
> 
> Temps would still suck as hell -not worth it


Considering how much more robust the cooler on the 1080 Ti is compared to the 1070's, the temps would actually be pretty good.


----------



## HowYesNo

Hello guys.
I got this Gainward GTX 1070 and noticed that the fins on the cooler are parallel to the main board, unlike the Phoenix version, where the fins are perpendicular. I am guessing this lets the air exit the case better than the other orientation, although the Phoenix cooler is beefier and thus better.
Check these photos for comparison.

 

So I got the idea to buy a laptop exhaust cooler to pull air out, since the hot air is not blown out of the case like with the reference blower cooler from Nvidia.
I ordered this from eBay, as it seems the mounting mechanism can attach to the rear of the card. I have not received it yet; I will post results when it arrives.


----------



## rfarmer

https://www.newegg.com/Product/Product.aspx?Item=N82E16814487265&utm_medium=Email&utm_source=EXPRESS042917&cm_mmc=EMC-EXPRESS042917-_-EMC-042917-Index-_-DesktopGraphicsCards-_-14487265-S1A1A&ignorebbr=1

EVGA GeForce GTX 1070 SC GAMING ACX 3.0 Black Edition on sale for $359 after rebate if anyone is interested.


----------



## Madmaxneo

Quote:


> Originally Posted by *rfarmer*
> 
> https://www.newegg.com/Product/Product.aspx?Item=N82E16814487265&utm_medium=Email&utm_source=EXPRESS042917&cm_mmc=EMC-EXPRESS042917-_-EMC-042917-Index-_-DesktopGraphicsCards-_-14487265-S1A1A&ignorebbr=1
> 
> EVGA GeForce GTX 1070 SC GAMING ACX 3.0 Black Edition on sale for $359 after rebate if anyone is interested.


That is an awesome price, but what is the difference between that one and the non-Black Edition, other than the current sale price?

I have checked and they look the same.

I currently have the non-Black version of the same card, and if I were to go SLI that is a very tempting price.


----------



## rfarmer

Quote:


> Originally Posted by *Madmaxneo*
> 
> That is an awesome price but what is the difference between that one and the non black edition, other then the current sale price that is?
> 
> I have checked and they look the same.
> 
> I currently have a non black version of the same card and if I were to go SLI that is a very tempting price.


The only difference I see on EVGA's site is the lack of a backplate on the Black Edition; the base and boost clocks are the same.


----------



## Madmaxneo

Quote:


> Originally Posted by *rfarmer*
> 
> Only difference I see on EVGA's site is the lack of a backplate on the Black Edition, base clock/boost clock are the same.


Mine does not have a backplate either...


----------



## gtbtk

Quote:


> Originally Posted by *HowYesNo*
> 
> hello guys.
> i got this Gainward GTX 1070 and i noticed that the fins on the cooler are parallel to the main board, unlike the Phoenix version where the fins are perpendicular. i am guessing this lets air exit the case more easily than in the other orientation, although the Phoenix is beefier and thus better.
> check these photos as comparison.
> 
> 
> 
> so i got the idea to buy an exhaust cooler for laptops to pull the air out, as the hot air is not blown out like with the reference blower cooler from nvidia.
> i ordered this from ebay as it seems the mounting mechanism can attach to the rear of the card. i have not received it yet, will post results when it arrives.


I would think that the fins on the black card would exhaust the air out both ends of the card.

How many fans do you have forcing air into the case? If the intake fans are filtered and push in more air than the exhaust fans pull out, the positive pressure inside the case will help keep dust buildup down. I am not sure the mini fan will help much.


----------



## Yukss

i don't know what to do: keep my gtx 1070 or upgrade to a 1080? the 1080 Ti is too expensive for me atm


----------



## Minium

Quote:


> Originally Posted by *Yukss*
> 
> i don't know what to do: keep my gtx 1070 or upgrade to a 1080? the 1080 Ti is too expensive for me atm


Going from a 1070 to a 1080 is a complete waste of money. Imagine you did this every time: from 1070 to 1080, then the next gen launches and you upgrade to the 1080 Ti, then the next gen of Ti cards comes and you upgrade to the non-Ti. Just don't do it... it's the same thing with every PC part.


----------



## zipper17

Quote:


> Originally Posted by *Yukss*
> 
> i don't know what to do: keep my gtx 1070 or upgrade to a 1080? the 1080 Ti is too expensive for me atm


If you can sell the 1070 for a good amount of money, it's fine to upgrade to a 1080.

One thing to consider is your current CPU/RAM; make sure they're not bottlenecking your GPU.

In my case an i5 3570K would bottleneck a 1080 Ti a lot, so upgrading to a 1080 Ti would be kind of a waste of performance/money unless I upgrade my CPU too.
Quote:


> Originally Posted by *Minium*
> 
> Going from a 1070 to a 1080 is a complete waste of money. Imagine you did this every time: from 1070 to 1080, then the next gen launches and you upgrade to the 1080 Ti, then the next gen of Ti cards comes and you upgrade to the non-Ti. Just don't do it... it's the same thing with every PC part.


Some people do that as a hobby, though; it's GPU addiction. It's not wrong either: compared to many bigger hobbies, GPU addiction is still cheaper.


----------



## striker3

guys, i have a question. maybe it is related to this thread, but i'll try here.

i have a gtx 1070 and a benq xl2720z. when i play any video, the video judders. i tried everything without any luck. if i set the resolution to 60hz the movie becomes choppy too. any way to fix it?


----------



## zipper17

Quote:


> Originally Posted by *striker3*
> 
> guys, i have a question. maybe it is related to this thread, but i'll try here.
> 
> i have a gtx 1070 and a benq xl2720z. when i play any video, the video judders. i tried everything without any luck. if i set the resolution to 60hz the movie becomes choppy too. any way to fix it?


Do you mean screen tearing?

Try enabling adaptive V-Sync in the control panel.


----------



## Skyblaze

So because I was a bit fed up with the fan-noise of my PALIT 1070 DUAL I went and attempted to undervolt it manually now. It seems to be stable at 1848mhz with 0.825v, how is this in comparison with other 1070's? At 0.825v it keeps the fan at 69% and stays at 75°C under full load which makes it reasonably quiet.


----------



## gtbtk

Quote:


> Originally Posted by *Yukss*
> 
> i don't know what to do: keep my gtx 1070 or upgrade to a 1080? the 1080 Ti is too expensive for me atm


1070 performance is adequate for most things, even if it requires you to reduce some settings from ultra to high on some titles, it is still a good experience.

I would wait 6 months and look at the GTX 2070, which should perform about the same as a 1080 Ti does now, or a new 2080 that should beat it if Nvidia stays true to form.


----------



## gtbtk

Quote:


> Originally Posted by *Skyblaze*
> 
> So because I was a bit fed up with the fan-noise of my PALIT 1070 DUAL I went and attempted to undervolt it manually now. It seems to be stable at 1848mhz with 0.825v, how is this in comparison with other 1070's? At 0.825v it keeps the fan at 69% and stays at 75°C under full load which makes it reasonably quiet.


I would suggest that you try the card at 0.950v; you should be able to run it cooler, still clock it to 2000-2025MHz, and get significantly better performance.

I have the MSI Gaming X (different cooler, I know), but running with 100% fan I can stay well under 50 deg. Running a more reasonable fan curve that keeps fan speeds low, you should still be able to run the card at slightly higher temps in the 70s range with much better performance.


----------



## Skyblaze

Quote:


> Originally Posted by *gtbtk*
> 
> I would suggest that you try the card at .950v, you should be able to run it cooler and still clock it to 2000-2025Mhz and get significantly better performance.
> 
> I have the MSI Gaming x (different cooler I know) but running with 100% fan, I can stay well under 50 deg. Running a more reasonable fan curve that keeps fan speeds low you should still be able to run the card in the slightly higher 70s range with much better performance


Thanks for the advice! While I would love to run the card at 2ghz and I'm sure it could handle it based on my undervolting experiences, I'm not sure if the fan is up for it. I guess now it bites me back that I only picked up the Palit DUAL for 420€ but to be honest that was already out of my price-range. Before I started undervolting yesterday I ran the card with a temp-limit of 75°C which kept the fan at 70% but also limited me to 1600-1700mhz which dampened performance a bit. At pure stock the card would reach 84°C with 80% fan-speed at 1.032v and never boost beyond 1850mhz anyway so I'm more than happy with my results.

To get the card to run higher I pretty much would have to modify the fan curve, yeah. I noticed that it starts to crash if I move beyond 1855mhz at 0.825v, but I'm not sure I want higher voltages. 0.850v is the last point where the fans keep themselves in check at 71-72%. Anything upward of 0.850v makes the fans eventually creep up to 80%, which just gets annoyingly loud, and I'm not sure how much an adjusted fan curve would help here. Especially since I need a bit of headroom, because I live in an attic-style flat where 25-32°C room temperature can be the norm in summer









But there's no harm in trying I suppose, could you suggest me a fan curve for 0.950v? I'm sure I could perhaps even reach 2ghz at 0.900v but I would have to test. I found that Playerunknown's Battlegrounds seems to be a good test case, as it pushes the card to 99% usage in the open field and makes my card generally work harder than, let's say, Overwatch at 1440p.


----------



## gtbtk

Quote:


> Originally Posted by *Skyblaze*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I would suggest that you try the card at .950v, you should be able to run it cooler and still clock it to 2000-2025Mhz and get significantly better performance.
> 
> I have the MSI Gaming x (different cooler I know) but running with 100% fan, I can stay well under 50 deg. Running a more reasonable fan curve that keeps fan speeds low you should still be able to run the card in the slightly higher 70s range with much better performance
> 
> 
> 
> Thanks for the advice! While I would love to run the card at 2ghz and I'm sure it could handle it based on my undervolting experiences, I'm not sure if the fan is up for it. I guess now it bites me back that I only picked up the Palit DUAL for 420€ but to be honest that was already out of my price-range. Before I started undervolting yesterday I ran the card with a temp-limit of 75°C which kept the fan at 70% but also limited me to 1600-1700mhz which dampened performance a bit. At pure stock the card would reach 84°C with 80% fan-speed at 1.032v and never boost beyond 1850mhz anyway so I'm more than happy with my results.
> 
> To get the card to run higher I pretty much would have to modify the fan curve, yeah. I noticed that it starts to crash if I move beyond 1855mhz at 0.825v, but I'm not sure I want higher voltages. 0.850v is the last point where the fans keep themselves in check at 71-72%. Anything upward of 0.850v makes the fans eventually creep up to 80%, which just gets annoyingly loud, and I'm not sure how much an adjusted fan curve would help here. Especially since I need a bit of headroom, because I live in an attic-style flat where 25-32°C room temperature can be the norm in summer
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But there's no harm in trying I suppose, could you suggest me a fan-curve for 0.950v? I'm sure I could perhaps even reach 2ghz at 0.900v but I would have to test. I found out that Playerunknown's Battlegrounds seems to be a good test-case in the open field as it pushes the card to 99% usage in the open field and makes my card generally work harder than, let's say Overwatch at 1440p.

The Palit Dual is still a dual-fan card that should run better than 84 deg; the 1070 is a very power-efficient chip. With mine, at 1.093v and 80% fans, I only just hit 62 deg in a 24 deg room. Granted, my cooler is a higher quality one than the Palit's, but it shouldn't be 20 deg different. I would expect more like 10-15.

If your card is running hot like that, it may be that your case has no ventilation at all and the hot air is just going around and around. You may benefit from installing another fan or two in the case. You can do a quick test before spending any money by taking the case door off, running it open, and seeing how the temps go when you know the card can get some fresh air. A fan in the case door would be useful if your case supports it.

The other thing it could be, if it still runs at 84 deg with an open case, is the thermal paste. The factory application of thermal interface material may be faulty. If you don't want to (or can't) replace the thermal paste yourself because of warranty stickers etc., RMA the card and get another one, because it is faulty.

If you are undervolting, you already have Afterburner installed. You can set a custom fan curve in AB so that the fan spins at 20-25% at idle (I'm not sure if the Dual has the 0-fan mode below 60 deg by default). That way it is still quiet, but it will idle at 35 deg instead of the high 50s. Then start increasing the curve at, say, 40 or 45 deg, and set the max speed to 75% if that is what is tolerable from around 60 deg up.
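To put that kind of curve in concrete terms, here is a minimal sketch (in Python) of the temperature-to-fan-speed mapping described above. The breakpoints are just the example values from this post, not anything read from a real card or from Afterburner:

```python
# Sketch of a custom fan curve: linearly interpolate fan % between breakpoints.
# Breakpoints follow the suggestion above: ~20% at idle, ramp from 40 C,
# capped at 75% from 60 C up. All values are illustrative.
BREAKPOINTS = [(30, 20), (40, 25), (60, 75), (95, 75)]  # (temp C, fan %)

def fan_speed(temp_c):
    """Return the fan duty cycle (%) for a given GPU temperature."""
    if temp_c <= BREAKPOINTS[0][0]:
        return BREAKPOINTS[0][1]
    if temp_c >= BREAKPOINTS[-1][0]:
        return BREAKPOINTS[-1][1]
    for (t0, s0), (t1, s1) in zip(BREAKPOINTS, BREAKPOINTS[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two surrounding breakpoints
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return BREAKPOINTS[-1][1]

print(fan_speed(35))   # halfway between 20% and 25% -> 22.5
print(fan_speed(70))   # flat at the 75% cap
```

In Afterburner you would set the equivalent points by dragging the fan curve in Settings > Fan; the sketch just makes explicit what the curve editor interpolates for you.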


----------



## Minium

Quote:


> Originally Posted by *striker3*
> 
> guys, i have a question. maybe it is related to this thread, but i'll try here.
> 
> i have a gtx 1070 and a benq xl2720z. when i play any video, the video judders. i tried everything without any luck. if i set the resolution to 60hz the movie becomes choppy too. any way to fix it?


Internet connection? Try another browser. Use the iGPU to test whether it's the 1070's fault.


----------



## Gurkburk

Just installed my Arctic Accelero Xtreme IV and lowered the temps by around 15°C, and the fan noise at 100% (I had the Gigabyte Windforce cooler before) by a ton. I barely hear these new fans even at 100%.

Edit: It seems, with this new cooler, I can bump up the clocks slightly.

From +110 core clock to +135 atm.

From +510 mem to +545 atm.

Still testing.


----------



## gtbtk

Quote:


> Originally Posted by *Gurkburk*
> 
> Just installed my Arctic Accelero Xtreme IV & lowered the temps by around 15*C & volume from the fans (had gigabyte windforce) at 100% by a ton. I barely hear these new fans when on 100%.
> 
> Edit: It seems, with this new cooler, i can bump up the clocks slightly.
> 
> From +110 CC to 135 Atm.
> 
> From 510 Mem to 545 Atm
> 
> Still in progress testing.


isn't it amazing what a decent cooler and thermal paste can do for you?


----------



## Gurkburk

Quote:


> Originally Posted by *gtbtk*
> 
> isn't it amazing what a decent cooler and thermal paste can do for you?


Well, tbh the Gigabyte Windforce may be noisy, but it's definitely one of the best pre-installed coolers there is









I had an Accelero on my 780 as well. It was just as insane on that card


----------



## Skyblaze

Quote:


> Originally Posted by *gtbtk*
> 
> The Palit Dual is still a dual fan card that should run better than 84 deg. the 1070 is a very power efficient chip. With mine, at 1.093v and 80% fans I only just hit 62 deg in a 24 deg room. Granted my cooler is a higher quality one than the palit, but it shouldn't be 20 deg different. I would expect more like 10-15.
> 
> If your card is running hot like that, either your case has no ventilation at all and the hot air is just going around and around. You may benefit by installing another fan or two to the case. You can do a quick test before spending any money by taking the case door off and running it open and seeing how the temps go when you know the card can get some fresh air. a fan in the case door would be useful if your case supports it
> 
> Or the other thing it could be, If it still runs at 84 deg with an open case, is the thermal paste. The factory application of thermal interface material may be faulty. If you dont want/cant replace the thermal paste yourself because of warranty stickers etc, RMA the card and get another one because it is faulty.
> 
> If you are undervolting, you already have afterburner installed. You can set a custom fan curve in AB so that it spins at 20-25% at idle. (Im not sure if the dual has the 0 fan below 60 deg by default) That way it is still quiet but it will idle at 35deg instead of hi 50s and then start increasing the curve at say 40 or 45 deg set the max speed to 75% if that is what is tolerable from say, 60 deg up.


You were right, my undervolting temps from yesterday were completely bogus. Since the outside temperatures here are still below 10°C, like they have been the whole winter, I had set my case fans to Level 1 and hadn't cleaned my dust filters in a while. (I tend to rarely use heaters even in winter, so my room temperatures are below average during this time.) After I cleaned the filters and set the fans back to Level 2, I'm now running 2000mhz at 0.9v @ 68°C with the fans barely scratching 60-65%. I was shocked how quiet the card is now when I tested it a while longer with BF1. I guess since I went the lazy route over winter (I bought my 1070 in October, I think, when it was already pretty cold) and used the thermal-limit option, I never noticed how much my case airflow must have been throttling my GTX 1070, because I rarely play games that push it







So I guess I'm happy!









One question though, did I undervolt the right way? I followed a YouTube tutorial and set my Afterburner curve like this:



but I read on another forum that doing it that way can actually cause FPS loss, because it doesn't properly raise another hidden clock called the video clock? The right way is supposedly to offset the whole curve first and then flatten it beyond the desired max voltage, but that made my drivers crash instantly. I'm not sure if I did it right though, because the instructions were a bit confusing. Did I do things right or wrong?

And thanks for your suggestions, they led me to finding my issue.


----------



## Yukss

Quote:


> Going from a 1070 to a 1080 is a complete waste of money. Imagine you did this every time: from 1070 to 1080, then the next gen launches and you upgrade to the 1080 Ti, then the next gen of Ti cards comes and you upgrade to the non-Ti. Just don't do it... it's the same thing with every PC part.


well, it's like 20% more powerful. i still have this big doubt, thanks for your advice









Quote:


> If you can sell 1070 with good amount of money, it's fine to upgrade to 1080.
> 
> what thing to be considered is, what is your current cpu/ram? make sure they're not bottlenecking your GPU.
> 
> in my case i5 3570k will be bottlenecking 1080Ti a lot. So upgrade into 1080ti is kind of waste performances/money unless I upgrade my cpu too.


Hi, it's hard to believe that a 6-core CPU at 4.6GHz bottlenecks any current GPU, what do you think? ... thanks for your advice
Quote:


> 1070 performance is adequate for most things, even if it requires you to reduce some settings from ultra to high on some titles, it is still a good experience.
> 
> I would wait 6 months and look at the gtx2070 that will perform about the same as a 1080ti does now or a new 2080 that should beat it if nvidia stay true to form.


yes, i guess we can always wait for the next gen of cards, but the deal here is to get a 1080 Founders Edition with a waterblock for $450 and try to sell mine for at least $380 (it cost me $450 less than a month ago)


----------



## ParlyShary

Even the best CPU occasionally bottlenecks a 1080 Ti. Intel has been so lazy these days


----------



## spieluhr

It seems to me that no matter what 1070 version you have, one can easily clock it to around 2000MHz core + 4500MHz memory.
What is the point of having so many different versions on the market then? And why the different prices?


----------



## gtbtk

Quote:


> Originally Posted by *Gurkburk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> isn't it amazing what a decent cooler and thermal paste can do for you?
> 
> 
> 
> Well. Tbh the gigabyte windforce may be noisy, but it's definitely the better pre-installed cooler there is
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I had a accelero on my 780 as well. Was just as insane on that card

hahahaha, you're funny


----------



## headd

Quote:


> Originally Posted by *Yukss*
> 
> well is like 20% more powerfull still have this big doubt, thanks for you advise


It's more than 20%. In the latest games the GTX 1080 is on average almost 27% faster than the GTX 1070:
https://www.techpowerup.com/reviews/EVGA/GTX_1080_Ti_SC2/31.html
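Part of the 20%-vs-27% confusion is just which card you pick as the baseline; a quick sketch with made-up frame rates (not TechPowerUp's actual data) shows both directions:

```python
# Relative performance depends on the baseline. The frame rates below are
# illustrative placeholders, not measured numbers.
fps_1070 = 100.0
fps_1080 = 127.0  # i.e. the 1080 renders 27% more frames per second

faster = (fps_1080 / fps_1070 - 1) * 100   # 1080 measured against the 1070
slower = (1 - fps_1070 / fps_1080) * 100   # 1070 measured against the 1080

print(f"1080 is {faster:.0f}% faster than the 1070")
print(f"1070 is {slower:.1f}% slower than the 1080")
```

So "the 1080 is 27% faster" and "the 1070 is about 21% slower" describe the exact same gap, which is why quoted percentages for the same pair of cards can differ.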


----------



## gtbtk

Quote:


> Originally Posted by *spieluhr*
> 
> It seems to me that no matter what 1070 version you have, one can easily clock it to around 2000MHz core + 4500MHz memory.
> What is the point of having so many different versions on the market then? And why the different prices?


Better coolers do give better frequency stability and will keep frequency and performance higher over time; GPU Boost 3.0 drops clocks (and voltage with them) as temps rise. Better coolers can also run lower fan speeds and give you a quieter experience.

The top/largest coolers matter most in tropical areas with high ambient temps. If you live in a temperate place like Northern Europe, it really doesn't matter anywhere near as much.


----------



## Yukss

Quote:


> Originally Posted by *headd*
> 
> its more than 20%.In latest games GTX1080 is almost 27% average faster than GTX1070
> https://www.techpowerup.com/reviews/EVGA/GTX_1080_Ti_SC2/31.html


thanks for your comment. hopefully i get this deal on the 1080 FE with the EK waterblock already installed (i could do it myself btw







) for 450$

ps. what would be a good price for an unused asus strix 1070 oc? it was for another build that i cancelled.

http://www.overclock.net/t/1629324/asus-geforce-gtx-1070-strix-oc/0_40


----------



## Yukss

edit double post


----------



## gtbtk

Quote:


> Originally Posted by *Skyblaze*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The Palit Dual is still a dual fan card that should run better than 84 deg. the 1070 is a very power efficient chip. With mine, at 1.093v and 80% fans I only just hit 62 deg in a 24 deg room. Granted my cooler is a higher quality one than the palit, but it shouldn't be 20 deg different. I would expect more like 10-15.
> 
> If your card is running hot like that, either your case has no ventilation at all and the hot air is just going around and around. You may benefit by installing another fan or two to the case. You can do a quick test before spending any money by taking the case door off and running it open and seeing how the temps go when you know the card can get some fresh air. a fan in the case door would be useful if your case supports it
> 
> Or the other thing it could be, If it still runs at 84 deg with an open case, is the thermal paste. The factory application of thermal interface material may be faulty. If you dont want/cant replace the thermal paste yourself because of warranty stickers etc, RMA the card and get another one because it is faulty.
> 
> If you are undervolting, you already have afterburner installed. You can set a custom fan curve in AB so that it spins at 20-25% at idle. (Im not sure if the dual has the 0 fan below 60 deg by default) That way it is still quiet but it will idle at 35deg instead of hi 50s and then start increasing the curve at say 40 or 45 deg set the max speed to 75% if that is what is tolerable from say, 60 deg up.
> 
> 
> 
> You were right my temps in terms of undervolting from yesterday were complete bogus. Since the outside temperatures are still below 10°C here like they have been the whole winter I set my case fans to Level 1 and didn't clean my dust-filters in a while. (I tend to rarely use heaters even in winter so my room-temperatures are below average during this time) After I cleaned the filters and set the fans back to Level 2 I'm now running 2000mhz at 0.9v @ 68°C with the fans barely scratching 60-65%. I was shocked how quiet the card is now when I tested it a while longer with BF1. I guess since I went the lazy route over winter (I bought my 1070 in October I think where it was already pretty cold) and used the thermal-limit option I never noticed how much my ambience airflow must have throttled my GTX 1070 because I rarely play a game that push it
> 
> 
> 
> 
> 
> 
> 
> So I guess I'm happy!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One question though, did I undervolt the right way? I followed a YouTube tutorial and set my Afterburner curve like this:
> 
> 
> 
> but I read on another forum that doing it that way actually can cause FPS loss because it doesn't raise another hidden clock called Videoclock properly? The right way is supposedly to offset the curve first and then flatten it beyond the desired max-voltage but that caused me to instantly crash my drivers. I'm not sure if I did it right though because the instructions were a bit confusing. Did I do things right or wrong?
> 
> And thanks for your suggestions, they lead me to finding my issue.

The 0.950v point is the voltage point that also controls the video clock speed. That is the reason why I said to use that point.

One thing that maybe you haven't paid attention to is the thin line that pops out under the main curve. Understanding what it means will save you ages of head scratching over why the curve never seems consistent. Keep in mind that the graphics card actually uses all of the voltage points up to the level where the curve stops increasing, not just the highest point on the curve.

The way the AB curve works is that a default set of voltage/frequency values is fixed in the BIOS (OC versions of the card have higher values baked in), and the card uses those fixed values as the default curve. Any time you overclock, you are not actually changing those values; instead, you are telling the card to apply an offset to each of the fixed values that you adjust. Using the slider and adding +100 in AB tells the card to offset every value in the range by an extra 100 on the y-axis of the graph. If you use the slider and look at the graph, you will see that the thin line stays where the default curve used to be.

If you use the individual voltage points on the curve to overclock, this is where it can get confusing. Even though all the points on the curve appear to move around the one you just moved, they actually do not. Even though the thick curve appears to move, Afterburner is just relocating where it represents the 0 offset on screen. Intuitively, I think most people would assume the 0-offset point would remain fixed on the screen, but it doesn't.

The thin line is there to indicate how much offset any given voltage point has from that 0 level that keeps moving around the screen. If the thin line ends up above the main curve line, it is showing you that the offset at that voltage point is negative. Sometimes you can move a point, hit apply, and it will look like the curve didn't move, but if you pay attention to the thin line, it did move, and it shows that the offset was actually changed.

This curve is only here to illustrate what I am saying and is not a usable curve for your graphics card. It took me more than 6 months of head scratching before I worked out what the curve was actually telling me and understood why it all seemed so inconsistent.

This is what a curve would look like if the 0 offset remained fixed on the screen and you made positive and negative adjustments. It's the way I think everyone assumes the curve works, but that isn't how Afterburner does it. The curve adjustment in Precision XOC is ugly and clunky, but its 0 baseline is actually flat along the bottom of the histogram, making it much easier to understand what you are doing when you manually adjust curves.



This is actually the exact same curve after you hit the apply button. See how it looks nothing like what a normal person would assume it should, given the settings in the first screencap? The thin line, though, being above and below the thick line, does show you where points have been offset in a positive direction and where in a negative direction. The thin line is actually 0.



If you look at your curve, the only points getting any offset from default are the ones between 0.900 and 0.950. Everything else is running at stock. There is more performance you can get out of your card while still running at a lower voltage than default.

What I would suggest is that you increase every voltage point from 0.800v up to 0.950v. If you want to limit the card's voltage to 0.950v, don't adjust anything above that point. The highest level for each point will depend on your card, but I would try adding +200MHz to each of those low-range points first and test for stability. If it is stable, increase them all by +25 and test again; if not, reduce each point by 25 and test. When you have the 0.8 to 0.95v points adjusted and stable, then increase the 0.950v point up to 2025MHz if you are still below that level on the curve, and see how that works for you.

If you are really keen, you can adjust each point individually, as they are all likely to have different maximum offsets, but the gains are probably not worth the extra time to fine-tune it for maybe a 1fps improvement in some games.
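As a purely illustrative model of those offset mechanics (the voltage points and clocks below are invented, not read from any real 1070 BIOS):

```python
# Model of the behaviour described above: the BIOS voltage/frequency curve is
# fixed, and overclocking tools only store a per-point MHz offset on top of it.
# All numbers are illustrative placeholders.
bios_curve = {0.800: 1607, 0.850: 1721, 0.900: 1823, 0.950: 1911, 1.000: 1962}
offsets = {v: 0 for v in bios_curve}      # the "thin line" starts at 0 offset

def apply_slider(offset_mhz):
    """The core-clock slider offsets every point by the same amount."""
    for v in offsets:
        offsets[v] += offset_mhz

def effective_curve():
    """What the card actually runs: BIOS defaults plus accumulated offsets."""
    return {v: bios_curve[v] + offsets[v] for v in bios_curve}

apply_slider(100)        # +100 on the slider shifts every point up by 100
offsets[0.950] += 25     # dragging one curve point offsets only that point

curve = effective_curve()
print(curve[0.800])      # 1607 default + 100 slider
print(curve[0.950])      # 1911 default + 100 slider + 25 point offset
```

The key point the model captures is that the BIOS curve itself never changes; the slider and the individual curve points only accumulate offsets on top of it, which is exactly what the thin line in Afterburner is reporting.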


----------



## gtbtk

Quote:


> Originally Posted by *Yukss*
> 
> Quote:
> 
> 
> 
> 1070 performance is adequate for most things, even if it requires you to reduce some settings from ultra to high on some titles, it is still a good experience.
> 
> I would wait 6 months and look at the gtx2070 that will perform about the same as a 1080ti does now or a new 2080 that should beat it if nvidia stay true to form.
> 
> 
> 
> yes, i guess we can always wait until next gens of cards but here the deals is to get a 1080 founders edition with waterblock at 450$ and try to sell mine at 380$ at least (it cost me 450$ less than a month ago)

3 months ago I would have said go for it. Now though, if it were me, I would be really disappointed if, having just spent $450, I could soon get a new card for $400 that blows the card I just bought into the weeds.

You also mentioned that you have another spare 1070. If you did SLI with your existing 1070 you would get 1080 Ti performance, for the most part, and it would cost you nothing extra.


----------



## Yukss

Quote:


> Originally Posted by *gtbtk*
> 
> 3 months ago I would have said go for it. Now though, if it was me, I would be really disappointed if having spent just $450, I could get a new card for $400 that blows the card I just bought to the weeds.
> 
> You also mentioned that you have another spare 1070. If you did SLI with your existing 1070 you get 1080TI performance, for the most part and it costs you nothing extra


thanks again for your time. yes, i have an evga 1070 sc and a brand new asus 1070 strix oc. i did run them in sli for one day to test, but i did not like it, mismatched cards lol. performance-wise they are monsters, but i'm running a 27" 1080p monitor while waiting for the 2k monitor that i bought. sli is not an option because i'm more into single-gpu setups. the "spare" gtx 1070 was for a second build, and i cannot sell both cards because i promised the evga to a friend as a gift, so basically i have only one 1070 (the asus strix) to sell in order to get a 1080 or god knows what (no amd)


----------



## Skyblaze

Quote:


> Originally Posted by *gtbtk*
> 
> The .950v point is the voltage point that also controls the video clock speed. That is the reason why I said use that point.
> 
> One thing that maybe you haven't paid any attention to is that thin line you can see on that pops out under the main curve. Understanding what that means will help you not spend ages scratching your head wondering why the curve never seems to be consistent. Keep in mind that the graphics card actually uses all the voltage points up to the level where it stops increasing, not just the highest point on the curve.
> 
> The way the AB curve works, is that it has a default set of voltage values are fixed in the bios. OC versions of the card have higher values baked in to the bios. The card uses those fixed values as the default curve. Any time you do any overclocking, you are not actually changing those values, instead, you are telling the card to apply an offset to each of the fixed values that you adjust. Using the slider and adding +100 in AB tells the card to offset every value in the range by and extra 100 on the y axis if the graph. If you use the slider and look at the graph, you will see that the thin line stays where the default curve used to be.
> 
> If you use the individual voltage points on the curve to overclock, this is where it can get confusing. Even though all the points on the curve appears to move around the one you just moved, they actually are not. Even though the thick curve appears to move, the Afterburner application is just relocating the where it is representing the 0 offset on screen. Intuitively, I think most people would assume that the 0 offset point would remain fixed on the screen but it doesn't.
> 
> The thin line that you can see is actually there to indicate how much offset that any given voltage point has from the 0 level that is moving all around the screen. If the thin line ends up above the main curve line, it is showing you that at that voltage point the offset is negative. Sometimes you can move the point and hit apply and it will look like the curve didnt move. but if you pay attention to the thin line, that did move and it shows that the offset was actually changed.
> 
> This curve is only there to illustrate what I am saying and is not a usable curve for your graphics card. It took me more than 6 months of head scratching before I worked out what the curve was actually telling me and understand why it all seemed so inconsistent.
> 
> This is what a curve should look like if the 0 offset remained fixed on the screen and you made positive and negative adjustments. It the way I think everyone assumes the curve works, but it doesn't in Afterburner. The curve adjustment in precision XOC is ugly and clunky, but, the 0 baseline is actually flat along the bottom of the histogram, making it much easier to understand what you are doing if you manually adjust curves.
> 
> 
> 
> 
> This is actually the exact same curve after you hit the apply button. See how it looks nothing like what a normal person would assume it should look like given the settings in the first screencap? The thin line though, being above and below the thick line does actually show you where points have been offset in a positive direction and where points have been offset in a negative direction. The thin line is actually 0.
> 
> 
> 
> 
> If you look at your curve, the only points that are getting any offset from default are the ones between .900 and .950. Everything else is running at stock. There is more performance you can get out of your card while still running at a lower voltage than default.
> 
> What I would suggest is that you increase every voltage point from the 0.800v point up to 0.950v. If you want to limit the card's voltage to .950v, don't adjust anything above that point. The highest level for each point will depend on your card, but I would try adding +200MHz to each of those low-range points first and test for stability. If it is stable, increase them all by +25 and test again; if not, reduce each point by 25 and test. When you get the .8 to .95v points adjusted and stable, then increase the .950v point up to 2025MHz if you are still below that level on the curve and see how that works for you.
> 
> If you are really keen, you can adjust each point individually, as they are all likely to have different maximum offset values, but the gains are probably not worth the extra time to fine tune it and maybe see a 1fps improvement in some games.


Thanks for the detailed explanation of the curve, things indeed make quite a bit more sense now! I ran a few tests with 2000MHz @ 0.900v and 2000MHz @ 0.950v and I indeed lost quite a bit of performance:

FFXIV Stormblood Benchmark:

Stock:

Score: 12935
Average Frame Rate: 91.601

0.900v:

Score: 12987
Average Frame Rate: 91.750

0.950v:

Score: 12971
Average Frame Rate: 91.861

Unigine Heaven Benchmark 4.0:

Stock:

FPS:
91.2

Score:
2296

Min FPS:
9.0

Max FPS:
192.1

0.900v:

FPS:
91.3

Score:
2300

Min FPS:
31.8

Max FPS:
188.6

0.950v:

FPS:
95.5

Score:
2405

Min FPS:
31.5

Max FPS:
196.4

3DMark Firestrike:

Stock: 13364
0.900v: 13247
0.950v: 13720

Those odd results are from some benchmarks jumping all over the clock states at times, so I have to figure out the lower clocks individually. Setting the lower states to +200 across the board sadly doesn't work for me, as some games start to crash when they land on some lower states as soon as videos play or I'm in a menu. I think I'll keep the 0.900v curve in a separate profile though, as it gives me around stock performance with vastly lower temps, which will be useful for summer, when I feel 0.950v will also start to creep up a bit more. For now 0.950v never went higher than 74°C for me at good fan speeds, but I like to have headroom for those 30°C+ days.
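For reference, the relative deltas in those scores are easier to judge as percentages; a quick sketch computing them from the numbers above:

```python
# Percent change vs. stock for the two undervolt profiles,
# using the benchmark scores posted above.
scores = {
    "FFXIV Stormblood": {"stock": 12935, "0.900v": 12987, "0.950v": 12971},
    "Heaven 4.0":       {"stock": 2296,  "0.900v": 2300,  "0.950v": 2405},
    "Firestrike":       {"stock": 13364, "0.900v": 13247, "0.950v": 13720},
}

def delta(bench, profile):
    s = scores[bench]
    return round((s[profile] - s["stock"]) / s["stock"] * 100, 1)

for bench in scores:
    print(bench, {p: delta(bench, p) for p in ("0.900v", "0.950v")})
```

So 0.950v works out to roughly +2.7% in Firestrike and +4.7% in Heaven over stock here, while 0.900v is basically stock performance.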


----------



## Minium

Broke my FireStrike record again









http://www.3dmark.com/3dm/19701453

Any beastly 1070´s around ?


----------



## gtbtk

Quote:


> Originally Posted by *Skyblaze*
> 
> Thanks for the detailed explanation of the curve, things indeed make quite a bit more sense now! I ran a few tests with 2000MHz @ 0.900v and 2000MHz @ 0.950v and I indeed lost quite a bit of performance:
> 
> FFXIV Stormblood Benchmark:
> 
> Stock:
> 
> Score: 12935
> Average Frame Rate: 91.601
> 
> 0.900v:
> 
> Score: 12987
> Average Frame Rate: 91.750
> 
> 0.950v:
> 
> Score: 12971
> Average Frame Rate: 91.861
> 
> Unigine Heaven Benchmark 4.0:
> 
> Stock:
> 
> FPS:
> 91.2
> 
> Score:
> 2296
> 
> Min FPS:
> 9.0
> 
> Max FPS:
> 192.1
> 
> 0.900v:
> 
> FPS:
> 91.3
> 
> Score:
> 2300
> 
> Min FPS:
> 31.8
> 
> Max FPS:
> 188.6
> 
> 0.950v:
> 
> FPS:
> 95.5
> 
> Score:
> 2405
> 
> Min FPS:
> 31.5
> 
> Max FPS:
> 196.4
> 
> 3DMark Firestrike:
> 
> Stock: 13364
> 0.900v: 13247
> 0.950v: 13720
> 
> Those odd results are from some benchmarks jumping all over the clock states at times, so I have to figure out the lower clocks individually. Setting the lower states to +200 across the board sadly doesn't work for me, as some games start to crash when they land on some lower states as soon as videos play or I'm in a menu. I think I'll keep the 0.900v curve in a separate profile though, as it gives me around stock performance with vastly lower temps, which will be useful for summer, when I feel 0.950v will also start to creep up a bit more. For now 0.950v never went higher than 74°C for me at good fan speeds, but I like to have headroom for those 30°C+ days.


I only suggested 200 as a starting point. The amount that you can increase these cards' clock speeds depends on the silicon itself and on what the factory set the default clocks at. A reference-clocked card can add many more points to an overclock than, say, a Gigabyte Xtreme Gaming card can, simply because the Gigabyte card already had an overclock applied at the factory that is close to the silicon's limits.

On my GamingX I can take the lower points up to about +150, but the points around 1.0-1.025v don't really like being increased more than about +75, and then from 1.043v it is ok with about an extra 100. If I tune CPU PLL voltages, I can tune that dip in performance out, but CPU temps start increasing. While there is certainly variation in silicon chips' performance, I really think the silicon lottery gets blamed more than it should, because motherboard voltage combinations, RAM speed and latency, and CPU clocks all impact how well your GPU performs.

I have experimented with it, but I do not run my card undervolted as a general rule. I have settled on running my card at the default voltage and overclocking with a curve that has a hump at .950v, dips after that, and then boosts up again to peak at 1.063v.

When I first got the card, my first instinct was to add +100 to the voltage slider and run the card at 1.093V all the time, and the card performs OK like that. It will let me run the card stable at 2126MHz. At default voltages, I can only run the card stable at 2076MHz, but I discovered that it runs almost 10 deg cooler and I get similar Firestrike and Heaven results. http://www.3dmark.com/fs/11822144 is a recent FS result running on an i7-2600. Unfortunately the Z68 MB died and I have not found a reasonably priced replacement yet.

1070s love memory bandwidth, and it will increase your benchmark scores significantly. Clock your memory as fast as you stably can; that should probably be somewhere around +500 to +600. The card may run the memory faster, but GPU memory errors start creeping in and performance drops off again. OCCT is a good tool for testing graphics card memory overclocks.
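That "push until performance drops off" search can be sketched like so; `run_benchmark` here is a hypothetical stand-in for whatever benchmark or stress tool you actually use (Heaven, OCCT, etc.), returning a score, or None on a crash or visible artifacts:

```python
# Sketch of a memory-offset search: step up until the benchmark score
# stops improving (error correction eating the gains) or the run fails.
# run_benchmark() is hypothetical -- wire it to your own tooling.
def find_memory_offset(run_benchmark, start=400, step=50, stop=800):
    best_offset, best_score = 0, run_benchmark(0)
    for offset in range(start, stop + step, step):
        score = run_benchmark(offset)
        if score is None:           # crash or visible artifacts
            break
        if score <= best_score:     # errors creeping in, perf drops off
            break
        best_offset, best_score = offset, score
    return best_offset

# Toy example: score peaks at +500, falls away at +550, dies at +600.
curve = {0: 100.0, 400: 104.0, 450: 104.5, 500: 105.0, 550: 104.2, 600: None}
print(find_memory_offset(lambda off: curve[off]))  # -> 500
```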

Make sure you also disable the Windows GameDVR service, which runs on Win 10 by default, unless you actually use it. It kills GPU benchmark performance.


----------



## ParlyShary

So what are the recommended, stable OC settings on the Gaming X (base voltage)?
I ask because I want a starting point for OC'ing mine.


----------



## gtbtk

Quote:


> Originally Posted by *ParlyShary*
> 
> So what are the recommended, stable OC settings on the Gaming X (base voltage)?
> I ask because I want a starting point for OC'ing mine.


You could try starting at:

Voltage 0

Powerlimit 126

Temp limit 93

core clock +100 or +75

memory clock +500

Adjust the core clock in + or - 25 increments to get close, then dial it in with +/- 12 increments to fine tune. Smaller adjustments won't make any changes to the reported values.
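The reason smaller steps don't show up: Pascal moves its clocks in discrete bins of roughly 13MHz, so the reported clock snaps to a bin. A toy sketch of that snapping (the bin size and alignment here are approximations for illustration, and 1923 is just an example base clock, not your card's):

```python
# Pascal adjusts clocks in discrete steps ("bins") of roughly 13 MHz,
# so small offset changes can land in the same bin and report the
# same clock. Bin size and alignment are assumptions for illustration.
BIN_MHZ = 13

def reported_clock(base_mhz, offset_mhz):
    total = base_mhz + offset_mhz
    return (total // BIN_MHZ) * BIN_MHZ  # snap down to the bin

print(reported_clock(1923, 75))  # -> 1989
print(reported_clock(1923, 77))  # +2 more: still 1989, no visible change
```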

Create a fan curve. I don't mind the 100% fan noise from my GamingX, so for benchmarking my fans sit at 30% at idle, keeping temps at around 30-35 deg, start ramping at 40 deg, and hit 100% at about 50. That will keep temps about as low as they can go and keep your clocks up. If you use the zero-fan option, it will idle at 59 deg and get hotter from there.
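That curve, roughly, as a sketch; the breakpoints are the ones described above, and the linear ramp between them is an assumption:

```python
# Fan curve like the one described: hold 30% up to 40 C, then ramp
# to 100% by 50 C. Linear interpolation in between is an assumption.
def fan_percent(temp_c, idle=30, ramp_start=40, full=50):
    if temp_c <= ramp_start:
        return idle
    if temp_c >= full:
        return 100
    frac = (temp_c - ramp_start) / (full - ramp_start)
    return round(idle + frac * (100 - idle))

print(fan_percent(35), fan_percent(45), fan_percent(55))  # 30 65 100
```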

Don't be afraid of increasing the voltage slider, it won't hurt the card. The 100 value is not mV; it is a percentage offset that adjusts the highest point in the curve between 1063mV and 1093mV. The extra voltage will let you run the card faster.
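In other words, the slider maps a percentage onto the 1063-1093mV range, something like this (a linear mapping is assumed here for illustration):

```python
# The voltage slider on these cards is a percentage, not mV:
# 0 leaves the top of the curve at ~1063 mV, 100 pushes it to ~1093 mV.
# The linear mapping in between is an assumption for illustration.
def peak_voltage_mv(slider_pct, base=1063, maxv=1093):
    return base + slider_pct / 100 * (maxv - base)

print(peak_voltage_mv(0))    # 1063.0
print(peak_voltage_mv(50))   # 1078.0
print(peak_voltage_mv(100))  # 1093.0
```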

Once you have the slider overclock dialed in, there is probably more performance to be found using the curve to overclock, but you are best off getting the slider OC sorted first.


----------



## gtbtk

Quote:


> Originally Posted by *Minium*
> 
> Broke my FireStrike record again
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/19701453
> 
> Any beastly 1070´s around ?


That's a great Graphics score.









I'm tempted to point you to this one http://www.3dmark.com/fs/10295423







but it was a borked driver run.

There are some 1070s around that will do 22000+ graphics score. Unfortunately I haven't found a way to get mine there yet.

This is one of the best I have managed with an i7-2600 http://www.3dmark.com/fs/11532231

The best I have managed to get mine to is 21500.


----------



## zipper17

Quote:


> Originally Posted by *Minium*
> 
> Broke my FireStrike record again
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/19701453
> 
> Any beastly 1070´s around ?


You have a Galax/KFA HOF 980 Ti card running at 1636MHz core / 8280MHz effective memory; not even 980 Tis in general can do that.
However, the current best from a pro overclocker on a 980 Ti under LN2 is over 1900MHz core speed: http://www.3dmark.com/fs/6825897 28,829 graphics score.

Maybe a 1070 HOF owner could reach yours, or some lucky person out there, who knows; not every 1070 owner is on OCN.

You can search results at 3dmark.com; the best single 1070 I can find is about 22.4xx graphics score.

Btw, did you only run the benchmark? No stability tests/stress tests/games?

I can reach my highest scores above 21k, but I turn it down; a core clock that high results in crashes during stress tests, and I also turn the memory down to +600MHz because it starts artifacting at +700MHz in stress tests. Achieving high scores/clocks without perfect stability is not my goal in overclocking.


----------



## Minium

Quote:


> Originally Posted by *zipper17*
> 
> You have a Galax/KFA HOF 980 Ti card running at 1636MHz core / 8280MHz effective memory; not even 980 Tis in general can do that.
> However, the current best from a pro overclocker on a 980 Ti under LN2 is over 1900MHz core speed: http://www.3dmark.com/fs/6825897 28,829 graphics score.
> 
> Maybe a 1070 HOF owner could reach yours, or some lucky person out there, who knows; not every 1070 owner is on OCN.
> 
> You can search results at 3dmark.com; the best single 1070 I can find is about 22.4xx graphics score.
> 
> Btw, did you only run the benchmark? No stability tests/stress tests/games?
> 
> I can reach my highest scores above 21k, but I turn it down; a core clock that high results in crashes during stress tests, and I also turn the memory down to +600MHz because it starts artifacting at +700MHz in stress tests. Achieving high scores/clocks without perfect stability is not my goal in overclocking.


It's not even a Galax, it's a Gigabyte 980 Ti Xtreme Gaming.


----------



## xGeNeSisx

I have a Kraken G10 with an H55 cooling my Gigabyte G1 Gaming 1070 currently, although I am now building a custom loop. I was wondering if anyone had any suggestions for a waterblock. I am quite alright with using the stock backplate in order to reduce costs. It's hard to find reliable information about compatibility due to the non-reference PCB. Thanks guys!


----------



## rfarmer

Quote:


> Originally Posted by *xGeNeSisx*
> 
> I have a Kraken G10 with an H55 cooling my Gigabyte G1 Gaming 1070 currently, although I am now building a custom loop. I was wondering if anyone had any suggestions for a waterblock. I am quite alright with using the stock backplate in order to reduce costs. It's hard to find reliable information about compatibility due to the non-reference PCB. Thanks guys!


I looked at the G1 Gaming but ended up with the FE. I would have gone with the EK block, they make one specifically for the G1.


----------



## Minium

Quote:


> Originally Posted by *xGeNeSisx*
> 
> I have a Kraken G10 with an H55 cooling my Gigabyte G1 Gaming 1070 currently, although I am now building a custom loop. I was wondering if anyone had any suggestions for a waterblock. I am quite alright with using the stock backplate in order to reduce costs. It's hard to find reliable information about compatibility due to the non-reference PCB. Thanks guys!


Go with EK. They make the best ones you can get.
https://www.caseking.de/ek-water-blocks-ek-fc-1080-1070-gtx-g1-nickel-waek-1284.html?gclid=CKeggoGg1tMCFU0Q0wody7QLEg
That's a block that works with the G1 1070/1080.
You should find it in your country too.


----------



## xGeNeSisx

Tempted to get the EK block and backplate; I was checking out Bitspower at first. Decided to go with the black EK waterblock + backplate, which will match my build's theme nicely. Should look great in a blacked-out windowed Fractal R5 with a 2m white LED string. Everything except the rads is EK in this setup. Just sticking with clear soft tubing and coolant for now as it's my first loop. I have some white dye which I may add after setting everything up.







Just going to set up the CPU loop with both rads for the time being, and will add the 1070 waterblock in a few days when it arrives.

EK Supreme MX Uni waterblock
Alphacool NexXxoS XT45 240mm & NexXxoS ST30 120mm rads
EK-XRES 100 Revo D5 PWM w/ pump
EK-DuraClear tubing and all EK ACF fittings


----------



## rfarmer

Quote:


> Originally Posted by *xGeNeSisx*
> 
> Tempted to get the EK block and backplate; I was checking out Bitspower at first. Decided to go with the black EK waterblock + backplate, which will match my build's theme nicely. Should look great in a blacked-out windowed Fractal R5 with a 2m white LED string. Everything except the rads is EK in this setup. Just sticking with clear soft tubing and coolant for now as it's my first loop. I have some white dye which I may add after setting everything up.
> 
> 
> 
> 
> 
> 
> 
> Just going to set up the CPU loop with both rads for the time being, and will add the 1070 waterblock in a few days when it arrives.
> 
> EK Supreme MX Uni waterblock
> Alphacool NexXxoS XT45 240mm & NexXxoS ST30 120mm rads
> EK-XRES 100 Revo D5 PWM w/ pump
> EK-DuraClear tubing and all EK ACF fittings


I had a Bitspower on my 970; the included backplate was nice and it was a decent block. I was always nervous about the threaded acrylic for the fittings though, metal is much more durable.

EK make excellent blocks and will go with your other components.


----------



## zipper17

Quote:


> Originally Posted by *Minium*
> 
> It's not even a Galax, it's a Gigabyte 980 Ti Xtreme Gaming.


I looked into your posts, I thought you had mentioned your card is a HOF. Nvm.
Quote:


> Originally Posted by *Minium*
> 
> Broke my FireStrike record again
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/19701453
> Position 19 on I7 6700k + 980Ti HOF


Btw, I play Ultimate Epic Battle Simulator; that game easily eats 7-8GB of VRAM at ultra settings, 1440p, with about 10k units.


----------



## Skyblaze

Quote:


> Originally Posted by *gtbtk*
> 
> I only suggested 200 as a starting point. The amount that you can increase these cards' clock speeds depends on the silicon itself and on what the factory set the default clocks at. A reference-clocked card can add many more points to an overclock than, say, a Gigabyte Xtreme Gaming card can, simply because the Gigabyte card already had an overclock applied at the factory that is close to the silicon's limits.
> 
> On my GamingX I can take the lower points up to about +150, but the points around 1.0-1.025v don't really like being increased more than about +75, and then from 1.043v it is ok with about an extra 100. If I tune CPU PLL voltages, I can tune that dip in performance out, but CPU temps start increasing. While there is certainly variation in silicon chips' performance, I really think the silicon lottery gets blamed more than it should, because motherboard voltage combinations, RAM speed and latency, and CPU clocks all impact how well your GPU performs.
> 
> I have experimented with it, but I do not run my card undervolted as a general rule. I have settled on running my card at the default voltage and overclocking with a curve that has a hump at .950v, dips after that, and then boosts up again to peak at 1.063v.
> 
> When I first got the card, my first instinct was to add +100 to the voltage slider and run the card at 1.093V all the time, and the card performs OK like that. It will let me run the card stable at 2126MHz. At default voltages, I can only run the card stable at 2076MHz, but I discovered that it runs almost 10 deg cooler and I get similar Firestrike and Heaven results. http://www.3dmark.com/fs/11822144 is a recent FS result running on an i7-2600. Unfortunately the Z68 MB died and I have not found a reasonably priced replacement yet.
> 
> 1070s love memory bandwidth, and it will increase your benchmark scores significantly. Clock your memory as fast as you stably can; that should probably be somewhere around +500 to +600. The card may run the memory faster, but GPU memory errors start creeping in and performance drops off again. OCCT is a good tool for testing graphics card memory overclocks.
> 
> Make sure you also disable the Windows GameDVR service, which runs on Win 10 by default, unless you actually use it. It kills GPU benchmark performance.


I'm aware, I only said it because that means I have to fiddle a bit more; I'll sit down on the weekend and try to find a reasonable curve for me. Sounds like you managed to add a nice little OC on top of what you already had without getting too unreasonable on the voltages.

As far as the mainboard goes, I have to replace mine anyway by the end of the year. My 3570K is getting long in the tooth even at 4.4GHz, and my ASRock Z77 Pro3 has had a few quirks from the start even though it runs solid and stable, so I'm looking forward to having a new system. I agree though, many factors can influence a GPU, and it's always helpful to have a solid, well-working base system before overclocking, to minimize causes for errors. And I would say that 10 degrees cooler for near-similar performance is more than adequate as a trade-off.

Regarding my memory clocks, I'll have to see if I can push them much, as I have a near-launch card with the infamous Micron memory and even a simple +50 gave me checkerboard crashes. That was before Palit released their new BIOS though; I haven't tried again since I flashed it, so I'll have to see how it goes. I admit I'm not well versed in GPU overclocking yet: will the memory clock mainly help me in benchmarks, or will I see real-world improvements in actual games at 1080p-1440p resolutions?

Also I had that already disabled but thanks for the tip!


----------



## gtbtk

Quote:


> Originally Posted by *xGeNeSisx*
> 
> I have a Kraken G10 with an H55 cooling my Gigabyte G1 Gaming 1070 currently, although I am now building a custom loop. I was wondering if anyone had any suggestions for a waterblock. I am quite alright with using the stock backplate in order to reduce costs. It's hard to find reliable information about compatibility due to the non-reference PCB. Thanks guys!


EK model 1080 and 1070 use the same block

https://www.ekwb.com/shop/ek-fc1080-gtx-g1-nickel

I think Bykski are the OEM manufacturer for EK waterblocks. About $30 cheaper than the EK branded one.

http://www.bykski.com/poce12?product_id=398&product_category=6

there is also one branded barrow that also looks the same

http://www.barrowint.com/index.php/article/623.html

here is a bitspower model

http://www.performance-pcs.com/bitspower-gigabyte-geforce-gtx-1080-g1-gaming-clear-acrylic-limited-edition-for-gigabyte-geforce-gtx-1080-1070-1060-g1.html


----------



## gtbtk

Quote:


> Originally Posted by *Skyblaze*
> 
> I'm aware, I only said it because that means I have to fiddle a bit more; I'll sit down on the weekend and try to find a reasonable curve for me. Sounds like you managed to add a nice little OC on top of what you already had without getting too unreasonable on the voltages.
> 
> As far as the mainboard goes, I have to replace mine anyway by the end of the year. My 3570K is getting long in the tooth even at 4.4GHz, and my ASRock Z77 Pro3 has had a few quirks from the start even though it runs solid and stable, so I'm looking forward to having a new system. I agree though, many factors can influence a GPU, and it's always helpful to have a solid, well-working base system before overclocking, to minimize causes for errors. And I would say that 10 degrees cooler for near-similar performance is more than adequate as a trade-off.
> 
> Regarding my memory clocks, I'll have to see if I can push them much, as I have a near-launch card with the infamous Micron memory and even a simple +50 gave me checkerboard crashes. That was before Palit released their new BIOS though; I haven't tried again since I flashed it, so I'll have to see how it goes. I admit I'm not well versed in GPU overclocking yet: will the memory clock mainly help me in benchmarks, or will I see real-world improvements in actual games at 1080p-1440p resolutions?
> 
> Also I had that already disabled but thanks for the tip!


My card is a June 2016 Micron card as well. Before the BIOS update, if I stopped the card from dropping to a lower P-state by running something like Chrome, I could get it to +400. After the update, with a little extra VCCIO and CPU PLL on my i7-2600 @ 4.4GHz, I could run it up to about +700, but best performance was at about +500-600.

The 1070 is really a 1080 with slower memory and a few cores disabled. The chip itself would still be happy running with GDDR5X at 1080 speeds. I found that it did help framerates in witcher 3, tomb raider and gta v.


----------



## Skyblaze

Quote:


> Originally Posted by *gtbtk*
> 
> My card is a June 2016 micron card as well. Before the update, if I stopped the card from dropping to a lower pstate by running something like chrome, I could get it to +400. After the update, with a little extra vccio and cpu pll on my i7-2600 @4.4ghz I could run it up to about +700 but best performance was at about +500 - 600.
> 
> The 1070 is really a 1080 with slower memory and a few cores disabled. The chip itself would still be happy running with GDDR5X at 1080 speeds. I found that it did help framerates in witcher 3, tomb raider and gta v.


Wow that sounds like really great results, nice!







I guess I'll start with +200 and then work my way up when I test my core-clocks over the weekend.

And I know, it's too bad that the extra cores are cut off; unlocking a 1070 into a quasi-1080 would have been great, but it's not surprising Nvidia wanted to prevent that. Good to hear about the performance improvements though.


----------



## gtbtk

Quote:


> Originally Posted by *Skyblaze*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> My card is a June 2016 micron card as well. Before the update, if I stopped the card from dropping to a lower pstate by running something like chrome, I could get it to +400. After the update, with a little extra vccio and cpu pll on my i7-2600 @4.4ghz I could run it up to about +700 but best performance was at about +500 - 600.
> 
> The 1070 is really a 1080 with slower memory and a few cores disabled. The chip itself would still be happy running with GDDR5X at 1080 speeds. I found that it did help framerates in witcher 3, tomb raider and gta v.
> 
> 
> 
> Wow that sounds like really great results, nice!
> 
> 
> 
> 
> 
> 
> 
> I guess I'll start with +200 and then work my way up when I test my core-clocks over the weekend.
> 
> And I know, it's too bad that the extra cores are cut off; unlocking a 1070 into a quasi-1080 would have been great, but it's not surprising Nvidia wanted to prevent that. Good to hear about the performance improvements though.

Live on the edge. Start at +500.

You can always drop it back if it ends up being too much. I think that you will find the right value quicker that way.


----------



## Skyblaze

Quote:


> Originally Posted by *gtbtk*
> 
> Live on the edge. Start at +500.
> 
> You can always drop it back if it ends up being too much. I think that you will find the right value quicker that way.


Well, I had a bit of time now and fired up a round of BF1 with my 0.95v profile and +500 memory. So far so good, but I have to do more testing!









But I really need a new CPU, because I still only get 40-50fps when a lot of stuff happens and my CPU is pegged at 99.9% usage.


----------



## gtbtk

Quote:


> Originally Posted by *Skyblaze*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Live on the edge. Start at +500.
> 
> You can always drop it back if it ends up being too much. I think that you will find the right value quicker that way.
> 
> 
> 
> Well, I had a bit of time now and fired up a round of BF1 with my 0.95v profile and +500 memory. So far so good, but I have to do more testing!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *But I really need a new CPU, because I still only get 40-50fps when a lot of stuff happens and my CPU is pegged at 99.9% usage.*

You have a 3570K which, I agree, is getting near the end of its AAA gaming life now, but it should still have a bit left in the tank. Have you tried overclocking your RAM?

You might find that reducing the multiplier and increasing BCLK a bit, while tuning memory timings, helps with the fps. I was running my i7-2600 with a 42 multiplier and a 105.8 BCLK, with 1600MHz RAM OC'd to 1972MHz, and it was performing like an i5 6600. It would have been even better if I had 2133MHz RAM installed.


----------



## Skyblaze

Quote:


> Originally Posted by *gtbtk*
> 
> You have a 3570K which, I agree, is getting near the end of its AAA gaming life now, but it should still have a bit left in the tank. Have you tried overclocking your RAM?
> 
> You might find that reducing the multiplier and increasing BCLK a bit, while tuning memory timings, helps with the fps. I was running my i7-2600 with a 42 multiplier and a 105.8 BCLK, with 1600MHz RAM OC'd to 1972MHz, and it was performing like an i5 6600. It would have been even better if I had 2133MHz RAM installed.


Yeah, my current RAM runs at 2000MHz via XMP (it's rated for 2400MHz), but anything higher and my system refuses to POST. My old 8GB set ran up to 2133MHz just fine, so I guess I would have to manually dig into the settings, but I'm not sure it's worth the effort, or how much difference the 133MHz really makes; of course, if I could make it run at the advertised 2400MHz, that might be a different story.

That's one of the quirks of my mainboard that I mentioned. The A-channel memory slots don't work at all; whenever I put RAM into them the system is thrown into a POST loop, trying to detect the RAM settings and failing, while the B-channel works without issues. I read that ASRock boards had a lot of these issues during the Z77 era, and with this being my first self-built Intel system, I probably just have some pins to the memory controller not making proper contact, but I've been afraid of trying to fix this, as I know how delicate those pins are and I don't want to make matters worse. By the time I built this system I had already boxed up my old one to send to a friend, and after I figured out the B-channel works I didn't want to RMA the board and be 1-2 weeks without any PC. It was a bit risky, since I couldn't know if the board had other issues, but it's been running for years now in single-channel mode without problems, and with me wanting to replace it this year anyway, I don't want to risk anything on the finish line.

Ideally I would like to buy a true six-core CPU from Intel, but the difference between those and the quad-cores is 200€, and I really don't have 400€ to spend on a CPU alone, so I might have to bite the bullet and go for an i5 quad-core again unless prices drop soon.

And thanks for the tip with the base clock, I could look into that. How much difference would it make though, when I currently run a x44 multiplier with 2000MHz RAM?


----------



## gtbtk

Quote:


> Originally Posted by *Skyblaze*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You have a 3570K which, I agree, is getting near the end of its AAA gaming life now, but it should still have a bit left in the tank. Have you tried overclocking your RAM?
> 
> You might find that reducing the multiplier and increasing BCLK a bit, while tuning memory timings, helps with the fps. I was running my i7-2600 with a 42 multiplier and a 105.8 BCLK, with 1600MHz RAM OC'd to 1972MHz, and it was performing like an i5 6600. It would have been even better if I had 2133MHz RAM installed.
> 
> 
> 
> Yeah, my current RAM runs at 2000MHz via XMP (it's rated for 2400MHz), but anything higher and my system refuses to POST. My old 8GB set ran up to 2133MHz just fine, so I guess I would have to manually dig into the settings, but I'm not sure it's worth the effort, or how much difference the 133MHz really makes; of course, if I could make it run at the advertised 2400MHz, that might be a different story.
> 
> That's one of the quirks of my mainboard that I mentioned. The A-channel memory slots don't work at all; whenever I put RAM into them the system is thrown into a POST loop, trying to detect the RAM settings and failing, while the B-channel works without issues. I read that ASRock boards had a lot of these issues during the Z77 era, and with this being my first self-built Intel system, I probably just have some pins to the memory controller not making proper contact, but I've been afraid of trying to fix this, as I know how delicate those pins are and I don't want to make matters worse. By the time I built this system I had already boxed up my old one to send to a friend, and after I figured out the B-channel works I didn't want to RMA the board and be 1-2 weeks without any PC. It was a bit risky, since I couldn't know if the board had other issues, but it's been running for years now in single-channel mode without problems, and with me wanting to replace it this year anyway, I don't want to risk anything on the finish line.
> 
> Ideally I would like to buy a true six-core CPU from Intel, but the difference between those and the quad-cores is 200€, and I really don't have 400€ to spend on a CPU alone, so I might have to bite the bullet and go for an i5 quad-core again unless prices drop soon.
> 
> And thanks for the tip with the base clock, I could look into that. How much difference would it make though, when I currently run a x44 multiplier with 2000MHz RAM?

The memory controller is actually on the CPU die; the MB only provides the electrical connections between the CPU socket and the RAM, so maybe the pins in the socket are bent or there is some dust/debris in the slot. A can of compressed air may be able to blow it out. Dual channel will give you better performance than single channel.

I have never used an ASRock board, so I am not familiar with its specific quirks. The VCCIO voltage in the BIOS is the one that helps fortify the memory controller; increasing it from 1.05 to 1.1 may help you get the RAM running faster. It also helps improve the stability of the PCIe controller when overclocking GPUs.

You should be able to increase BCLK to 103-104MHz and not change anything else. That will give you 4532MHz on the CPU and 2060MHz memory at 103 BCLK. If it won't boot, try dropping the multiplier to 43 and trying again. A x44 multi on those chips is not right at the ragged edge, so I would be surprised if it gives you too many problems. If it starts giving WHEA BSODs, increase the vcore voltage slightly.
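The arithmetic, if you want to sanity check it, is just everything scaling with BCLK:

```python
# CPU clock = BCLK * multiplier, and the RAM clock scales by the same
# BCLK ratio. Numbers below match the 103 BCLK example in the post.
def clocks(bclk, multiplier, ram_mhz_at_100):
    cpu = bclk * multiplier
    ram = ram_mhz_at_100 * bclk / 100
    return cpu, ram

cpu, ram = clocks(103, 44, 2000)
print(cpu, ram)  # 4532 2060.0
```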

I have been wanting to upgrade mine too, and the fact that my system now makes PSUs go bang when you turn it on has forced the matter. I have been seriously considering a Ryzen system. I was originally thinking R7, but now I think an R5 6-core would be a better buy for my needs. The current Intel X99 Broadwell-E products are overpriced for the performance you can get out of them, and I think that 4-core chips, even though they offer better single-core performance, will be pushed aside by 6-core chips in the next year anyway.


----------



## zipper17

Quote:


> Originally Posted by *Skyblaze*
> 
> Yeah my current RAM runs at 2000mhz via XMP (it's rated for 2400mhz) but anything higher and my system refuses to POST. My old 8gb set ran up to 2133mhz just fine so I guess I would have to manually dig in the settings but I'm not sure if it's worth the effort and how much difference the 133mhz really will make, of course if I could make them run at the advertised 2400mhz that might be a different story. That's one of the quirks of my mainboard that I mentioned. The A-channel memory-slots don't work at all, whenever I put some RAM into it the system is thrown into a POST-loop trying to detect the RAM settings and failing, the B-channel works without issues. I read that ASRock boards had alot of these issues during the Z77 era and with this being my first self-build Intel-system I probably just have some pins to the memory-controller not making proper contact but I've been afraid trying to fix this as I know how delicate these pins are and I don't want to make matters worse. At the time I build this build I already boxed up my old system to sent to a friend and after I figured the B-channel works I didn't want to RMA it and be 1-2 weeks without any PC. It was a bit risky since I couldn't know if the board had some other issues but it's been running for years now in Single-Channel Mode without issues and with me wanting to replace it this year anyway I don't want to risk anything on the finish line.
> 
> Ideally I would like to buy a true six-core CPU from Intel but the difference between these and quad-cores are 200€ and I really don't have 400€ I could spend on a CPU alone so I might have to bite the bullet and go for a i5 quad-core again unless prices are dropping soon.
> 
> And thanks for the tip with the baseclock, that I could look into. How much difference would it make though when I currently run at a x44 multiplier with 2000mhz RAM?


You have a pretty similar setup to mine, and a similar problem to the one I had with 2400MHz memory.

At first my 16GB 2400MHz RAM also simply wouldn't boot at 2400MHz with XMP. I tried manual settings (timings, voltages, increased VCCSA/VCCIO) and it still wouldn't boot at all.

But I finally got it running at 2400MHz by lowering the secondary timings tRRD and CWL. (It took me many weeks to figure this out myself; there was zero information on the internet.)
I don't know if this will work for you, but it works on my system. Afterwards I tested it with Memtest86 for 3 loops (9-10 hours) with no errors, and it seems fine so far.

This is probably caused by a compatibility problem (motherboard, RAM, CPU).


----------



## zipper17

Quote:


> Originally Posted by *Skyblaze*
> 
> Well, I had a bit of time now and fired up a round of BF1 with my 0.95v profile and +500 memory; so far so good, but I have to do more testing!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *But I really need a new CPU because I still only get 40-50fps when a lot of stuff happens and my CPU is pegged at 99.9% usage*.


What game is that, BF1? Yeah, BF1 seems really CPU intensive.

Bottlenecking depends on what games you play.

If you play games that aren't really CPU intensive, a 3570K is still fine.

At higher resolutions, 1440p or so, the CPU bottleneck becomes less of an issue.




This video of BF1 at Ultra 1440p looks just fine with a 3570K @ 4.7GHz + a 1070.

As long as the bottleneck doesn't make the game unplayable or cause very bad lag (under 40-50FPS), it's probably still fine. But upgrading the CPU would definitely be better.


----------



## Skyblaze

Quote:


> Originally Posted by *gtbtk*
> 
> The memory controller is actually on the CPU die. The MB only has the electrical connections between the CPU socket and the RAM, so maybe the pins in the socket are bent or there is some dust/debris in the slot. A can of compressed air may be able to blow it out. Dual channel will give you better performance than single channel.
> 
> I have never used an Asrock board so I am not familiar with the specific quirks. The VCCIO voltage in the bios is the one that helps fortify the memory controller. Increasing that from 1.05 to 1.1 may help you get the ram running faster. It also helps improve stability of the PCIe controller for overclocking GPUs.
> 
> You should be able to increase bclk to 103-104MHz and not change anything else. That will give you 4532MHz on the CPU and 2060MHz memory at 103 BCLK. If it won't boot, try dropping the multiplier to 43 and trying again. A x44 multi on those chips is not right at the ragged edge, so I would be surprised if it has too many problems. If it starts giving WHEA BSODs, increase vcore voltage slightly.
> 
> I have been wanting to upgrade mine too. The fact that my system now makes PSUs go bang when you turn it on has forced the matter. I have been really considering a Ryzen system. I was originally thinking R7 but now I think an R5 6 core would be a better buy for my needs. The current Intel x99 broadwell-e products are now overpriced for the performance you can get out of them and I think that 4 core chips, even though they offer better single core performance, will be pushed aside by 6 core chips in the next year anyway.
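As a side note, the quoted bclk figures check out with simple multiplication. A quick sketch (assuming, as in the quoted setup, a x44 CPU multiplier and a x20 memory ratio, i.e. 2000MHz RAM at the stock 100MHz bclk):

```python
# Effective clocks scale linearly with bclk: clock = bclk x multiplier.
# Assumes a x44 CPU multiplier and a x20 memory ratio (2000MHz at 100MHz bclk).
def effective_clocks(bclk_mhz, cpu_mult=44, mem_ratio=20):
    cpu_mhz = bclk_mhz * cpu_mult   # CPU core clock
    mem_mhz = bclk_mhz * mem_ratio  # effective memory clock
    return cpu_mhz, mem_mhz

print(effective_clocks(103))  # (4532, 2060), matching the quoted numbers
print(effective_clocks(104))  # (4576, 2080)
```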


I know, I looked extensively into the issue, but I was and still am afraid to actually pull the CPU out of the socket for fear of making the problem worse, as I have no idea what state the pins are in, and I'd rather have a working single-channel system than try to fix it and have to buy a new system right now.

Thanks for the tip; first I'll try what Zipper17 suggested, though, since his problem sounds quite similar to mine. The baseclock idea is interesting, I'll definitely try that; I have more than enough experience with Ivy Bridge stability testing by now.

That doesn't sound good for your system; I hope it keeps working for you for a while. I thought about Ryzen too, but I'm not sure it's for me. I definitely want an improvement in single-core IPC along with more cores, and from what I read AMD made huge strides there, but in the end that's still an area where Intel is better. Does Ryzen beat Ivy Bridge on IPC? I really hope Intel is forced to drop prices on their six-cores; around 300€ by the end of the year and I would bite, although I doubt that will happen.
Quote:


> Originally Posted by *zipper17*
> 
> You have a pretty similar setup to mine, and a similar problem to the one I had with 2400MHz memory.
> 
> At first my 16GB 2400MHz RAM also simply wouldn't boot at 2400MHz with XMP. I tried manual settings (timings, voltages, increased VCCSA/VCCIO) and it still wouldn't boot at all.
> 
> But I finally got it running at 2400MHz by lowering the secondary timings tRRD and CWL. (It took me many weeks to figure this out myself; there was zero information on the internet.)
> I don't know if this will work for you, but it works on my system. Afterwards I tested it with Memtest86 for 3 loops (9-10 hours) with no errors, and it seems fine so far.
> 
> This is probably caused by a compatibility problem (motherboard, RAM, CPU).


Huh, interesting, that sounds indeed very similar to my problem. Could you lend me a hand in lowering the tRRD and CWL timings? I've literally never messed with RAM settings before; would I still need to set XMP afterwards, or how would I go about this? Thanks in advance!
Quote:


> Originally Posted by *zipper17*
> 
> What game is that, BF1? Yeah, BF1 seems really CPU intensive.
> 
> Bottlenecking depends on what games you play.
> 
> If you play games that aren't really CPU intensive, a 3570K is still fine.
> 
> At higher resolutions, 1440p or so, the CPU bottleneck becomes less of an issue.
> 
> 
> 
> 
> This video of BF1 at Ultra 1440p looks just fine with a 3570K @ 4.7GHz + a 1070.
> 
> As long as the bottleneck doesn't make the game unplayable or cause very bad lag (under 40-50FPS), I think it's still fine.


Well, on the CPU-intensive side I do play BF4, BF1 and plenty of emulators, where I already see my friends with Skylake pulling ahead. I know my 3570k will still be useful for a while, which is why I only want to upgrade by the end of the year. I'm not too bothered by the dips, but in the end I know it bottlenecks my GTX 1070, and if I've already spent 430€ on a GPU I want to make full use of it while future-proofing myself at the same time. My 3570k has lasted me around 4 years now, so I'm happy with the time it lasted me.

EDIT: Looking at the video, the person only plays Team Deathmatch, which is much easier on the CPU than Conquest Large.


----------



## zipper17

Quote:


> Originally Posted by *Skyblaze*
> 
> I know I looked extensively into the issue but I was and still am afraid to actually pull the CPU out of the socket now in fear to make the problem worse as I have no idea how the state of the pins is and I rather have a working Single Channel system than trying to fix it and having to buy a new system right now.
> 
> Thanks for the tip, first I'll try what Zipper17 suggested though, it sounds like the problem is quite similar to mine. Interesting on the baseclock though, I'll definitely try that, I have more than enough experience with Ivy Bridge stability-testing by now.
> 
> That doesn't sound good on your system alright, I hope it still works for you for a while. I thought about Ryzen too but I'm not sure if they are for me, I definitely want a improvement on IPC single-clock performance along with more cores and from what I read AMD made huge strides there but in the end that's an area where Intel is still better at. Is Ryzen beating Ivy Bridge on IPC performance? I really hope Intel is forced to drop prices on their six-cores, around 300€ by the end of the year and I would bite, although I doubt that will happen.
> 
> Huh, interesting, that sounds indeed very similar to my problem. Could you lend me a hand in lowering the tRRD and CWL timings? I've literally never messed with RAM settings before; would I still need to set XMP afterwards, or how would I go about this? Thanks in advance!
> Well, on the CPU-intensive side I do play BF4, BF1 and plenty of emulators, where I already see my friends with Skylake pulling ahead. I know my 3570k will still be useful for a while, which is why I only want to upgrade by the end of the year. I'm not too bothered by the dips, but in the end I know it bottlenecks my GTX 1070, and if I've already spent 430€ on a GPU I want to make full use of it while future-proofing myself at the same time. My 3570k has lasted me around 4 years now, so I'm happy with the time it lasted me.
> 
> EDIT: Looking at the video the person only plays Team Deathmatch which is much easier on the CPU than Conquest Large.


Don't use XMP; you need to manually input all the memory settings in the BIOS.
The primary timings are usually the first four you see on the specs (e.g. 11-13-13-35), but the secondary timings are usually hidden.

Download AIDA64; there you can see the complete SPD memory timings and voltages.
Write everything down on paper.

Go into the BIOS:
manually set the DRAM voltage your RAM requires to run at 2400MHz (1.65V or 1.5V?)
set the memory speed to 2400MHz
enter every timing you noted into the BIOS (the timing names could be slightly different in your BIOS)

As an example, mine looks like this in AIDA64:
@ 1200MHz 11-13-13-35 (CL-RCD-RP-RAS) / 48-314-3-*8*-18-10-10-31-*13* (RC-RFC-CR-*RRD*-WR-WTR-RTP-FAW-*WCL*)
In my case I needed to lower tRRD and CWL:
tRRD from 8 down to 7
CWL from 13 down to 12

Boom, it booted instantly at 2400MHz (shown as 1200MHz in CPU-Z / PC3-19200).
In my case I didn't even need to increase VCCSA/VCCIO; they are still at default voltages running at 2400MHz.
But I don't know about your case.
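For context (my own illustration, not from the post above): a BIOS timing is a cycle count, so its real latency depends on the memory clock, which is why a kit can need different cycle counts at a different speed. A quick sketch:

```python
# Convert a memory timing from clock cycles to nanoseconds:
# latency_ns = cycles / memory_clock_MHz * 1000
def timing_ns(cycles, mem_clock_mhz):
    return cycles / mem_clock_mhz * 1000.0

# The same CL11 cycle count at 1200MHz (DDR3-2400) vs 1000MHz (DDR3-2000):
print(round(timing_ns(11, 1200), 2))  # 9.17
print(round(timing_ns(11, 1000), 2))  # 11.0
```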


----------



## TerafloppinDatP

Quote:


> Originally Posted by *gtbtk*
> 
> I have been really considering a Ryzen system. I was originally thinking R7 but now I think an R5 6 core would be a better buy for my needs. The current Intel x99 broadwell-e products are now overpriced for the performance you can get out of them and I think that 4 core chips, even though they offer better single core performance, will be pushed aside by 6 core chips in the next year anyway.


Ryzen 5 1600 + GTX 1070 right here. Smooth like buttah! Really happy with the combo.


----------



## Madmaxneo

Ok, found some limits on basic ocing this 1070. So I am now wanting to look more into the "Power Curves" some have mentioned for this card. Where can I find a guide on this?


----------



## Dude970

Quote:


> Originally Posted by *Madmaxneo*
> 
> Ok, found some limits on basic ocing this 1070. So I am now wanting to look more into the "Power Curves" some have mentioned for this card. Where can I find a guide on this?


This will get you started
http://www.guru3d.com/articles-pages/geforce-gtx-1080-overclocking-guide-with-afterburner-4-3,2.html


----------



## Madmaxneo

Quote:


> Originally Posted by *Dude970*
> 
> This will get you started
> http://www.guru3d.com/articles-pages/geforce-gtx-1080-overclocking-guide-with-afterburner-4-3,2.html


Thanks!
I had something like this up when 3DMark froze and I was unable to get it to completely shut down, so I had to restart my PC....


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> I have been wanting to upgrade mine too. The fact that my system it now makes PSUs go bang when you turn it on has forced the matter. I have been really considering a Ryzen system. I was originally thinking R7 but now I think an R5 6 core would be a better buy for my needs. The current Intel x99 broadwell-e products are now overpriced for the performance you can get out of them and I think that 4 core chips, even though they offer better single core performance, will be pushed aside by 6 core chips in the next year anyway.


Just for reference, would you take a look at this:


http://www.legitreviews.com/cpu-bottleneck-geforce-gtx-1080-ti-tested-on-amd-ryzen-versus-intel-kaby-lake_192585/3

7700k stock + 1080Ti: graphics score 28.6k
[email protected] + 1080Ti: graphics score 27.5k
Strangely, the physics scores are the same.

Gta5









I don't know, but in gaming Ryzen seems to slightly underperform.


----------



## Skyblaze

Quote:


> Originally Posted by *zipper17*
> 
> Don't use XMP; you need to manually input all the memory settings in the BIOS.
> The primary timings are usually the first four you see on the specs (e.g. 11-13-13-35), but the secondary timings are usually hidden.
> 
> Download AIDA64; there you can see the complete SPD memory timings and voltages.
> Write everything down on paper.
> 
> Go into the BIOS:
> manually set the DRAM voltage your RAM requires to run at 2400MHz (1.65V or 1.5V?)
> set the memory speed to 2400MHz
> enter every timing you noted into the BIOS (the timing names could be slightly different in your BIOS)
> 
> As an example, mine looks like this in AIDA64:
> @ 1200MHz 11-13-13-35 (CL-RCD-RP-RAS) / 48-314-3-*8*-18-10-10-31-*13* (RC-RFC-CR-*RRD*-WR-WTR-RTP-FAW-*WCL*)
> In my case I needed to lower tRRD and CWL:
> tRRD from 8 down to 7
> CWL from 13 down to 12
> 
> Boom, it booted instantly at 2400MHz (shown as 1200MHz in CPU-Z / PC3-19200).
> In my case I didn't even need to increase VCCSA/VCCIO; they are still at default voltages running at 2400MHz.
> But I don't know about your case.


Hmm, alright, that seems easy enough; thanks for the explanation. In the worst case, if I can't boot anymore, I'd just do a CMOS reset, right?
Quote:


> Originally Posted by *zipper17*
> 
> Just for reference, would you take a look at this:
> 
> 
> http://www.legitreviews.com/cpu-bottleneck-geforce-gtx-1080-ti-tested-on-amd-ryzen-versus-intel-kaby-lake_192585/3
> 
> 7700k stock + 1080Ti: graphics score 28.6k
> [email protected] + 1080Ti: graphics score 27.5k
> Strangely, the physics scores are the same.
> 
> Gta5
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't know, but in gaming Ryzen seems to slightly underperform.


That's exactly why I'm careful with Ryzen :/


----------



## gtbtk

Quote:


> Originally Posted by *Skyblaze*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> the memory controller is actually on the CPU die. the MB only has the electrical connections between the CPU socket to the ram so maybe the pins in the socket are bent or there is some dust/debris in the slot. a can of compressed air may be able to blow them out. Dual channel will give you better performance than single channel
> 
> I have never used an Asrock board so I am not familiar with the specific quirks. The VCCIO voltage in the bios is the one that helps fortify the memory controller. Increasing that from 1.05 to 1.1 may help you get the ram running faster. It also helps improve stability of the PCIe controller for overclocking GPUs.
> 
> You should be able to increase bclk to 103-104MHz and not change anything else. That will give you 4532MHz on the CPU and 2060MHz memory at 103 BCLK. If it won't boot, try dropping the multiplier to 43 and trying again. A x44 multi on those chips is not right at the ragged edge, so I would be surprised if it has too many problems. If it starts giving WHEA BSODs, increase vcore voltage slightly.
> 
> I have been wanting to upgrade mine too. The fact that my system now makes PSUs go bang when you turn it on has forced the matter. I have been really considering a Ryzen system. I was originally thinking R7 but now I think an R5 6 core would be a better buy for my needs. The current Intel x99 broadwell-e products are now overpriced for the performance you can get out of them and I think that 4 core chips, even though they offer better single core performance, will be pushed aside by 6 core chips in the next year anyway.
> 
> 
> 
> I know I looked extensively into the issue but I was and still am afraid to actually pull the CPU out of the socket now in fear to make the problem worse as I have no idea how the state of the pins is and I rather have a working Single Channel system than trying to fix it and having to buy a new system right now.
> 
> Thanks for the tip, first I'll try what Zipper17 suggested though, it sounds like the problem is quite similar to mine. Interesting on the baseclock though, I'll definitely try that, I have more than enough experience with Ivy Bridge stability-testing by now.
> 
> That doesn't sound good on your system alright, I hope it still works for you for a while. I thought about Ryzen too but I'm not sure if they are for me, I definitely want a improvement on IPC single-clock performance along with more cores and from what I read AMD made huge strides there but in the end that's an area where Intel is still better at. Is Ryzen beating Ivy Bridge on IPC performance? I really hope Intel is forced to drop prices on their six-cores, around 300€ by the end of the year and I would bite, although I doubt that will happen.
Click to expand...

I wasn't talking about the CPU socket having dust in it; I assume the CPU went in when the board was new, so there's no way for dust to get in there. I was talking about the memory slots themselves.

If you are going to make changes in the BIOS and haven't already, save your current settings as a profile. If things mess up, you can just reload the original settings.

My system is dead. The board has developed a short circuit for some reason. It has now killed 4 platinum PSUs, so I won't try again without a different motherboard in it. If I can find a cheap used Z68 or Z77 board, I might just swap the CPU, memory and cooler over and run that for a while. Fingers crossed that the board dying did not fry my CPU and GPU.

I am torn on Ryzen. I owned an AMD PC about 15 years ago and the performance was better than Intel's in those days. The problem then was that Windows would add a feature the AMD chip didn't support, so it always felt like a compromise, and I am worried it may be the same now. The Ryzen platform really needs to mature a bit more, but I do get my jollies solving tech problems. Ryzen's problem currently is memory throughput limitations that are impacting gaming performance.

Ryzen actually performs better than the equivalent Intel CPU in areas that don't hammer the memory subsystem. It is having similar challenges getting DDR4 working well that X99 had when it launched. Latency is a bit high and some people are having trouble getting memory trained to run at 3200MHz, but that is more about the immaturity of the platform's BIOS revisions. There are guys with 3600MHz RAM running, and memory latency is down from 100ns at launch to 60ns, but there are still another 20ns to go to be comparable with Intel's memory latency.

The Ryzen SoC architecture is much more sensitive to memory speed than the Intel architecture, so it will take a while, but I am sure that performance will improve further. A new microcode/BIOS is due out soon that is supposed to improve memory support significantly, hopefully allowing tighter RAM timings. I'm sure it will get there eventually. I was planning on waiting for it to mature another couple of months before deciding whether to jump in, but this hardware failure may have forced my hand.


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> Ok, found some limits on basic ocing this 1070. So I am now wanting to look more into the "Power Curves" some have mentioned for this card. Where can I find a guide on this?


You mean the voltage curve?

The best place to get educated on curves is probably reading back through this thread. I have written a few mini tutorials, most recently the other day.

There is no official documentation that I know of.

What model card are you running?


----------



## Madmaxneo

Quote:


> Originally Posted by *gtbtk*
> 
> You mean the voltage curve?
> 
> The best place to get educated on curves is probably reading back through this thread. I have written a few mini tutorials, most recently the other day.
> 
> There is no official documentation that I know of.
> 
> What model card are you running?


I need to update my profile with this card.
It is an EVGA GTX 1070 SC ACX 3.0.

I am waiting to order a water block for this card, but the ACX 3.0 does a great job keeping it cool, much better than the ACX 2.0 did on my 980.

Will a waterblock improve OC capability like it did for the 900 series and earlier cards?

EDIT: Where are the LED controls for the cards now in GeForce Experience?


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I have been wanting to upgrade mine too. The fact that my system now makes PSUs go bang when you turn it on has forced the matter. I have been really considering a Ryzen system. I was originally thinking R7 but now I think an R5 6 core would be a better buy for my needs. The current Intel x99 broadwell-e products are now overpriced for the performance you can get out of them and I think that 4 core chips, even though they offer better single core performance, will be pushed aside by 6 core chips in the next year anyway.
> 
> 
> 
> Just for reference, would you take a look at this:
> 
> 
> http://www.legitreviews.com/cpu-bottleneck-geforce-gtx-1080-ti-tested-on-amd-ryzen-versus-intel-kaby-lake_192585/3
> 
> 7700k stock + 1080Ti: graphics score 28.6k
> [email protected] + 1080Ti: graphics score 27.5k
> Strangely, the physics scores are the same.
> 
> Gta5
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't know, but in gaming Ryzen seems to slightly underperform.
Click to expand...

You are right; it is underperforming in gaming workloads and in other workloads that require high memory throughput, such as 7-Zip compression. Memory support is improving every day, and the deficit relative to an equivalent Intel platform like the 6900K is progressively being reduced.

For some reason the media like to avoid mentioning the combined score, the most telling part of the Firestrike benchmark, which shows the bottleneck in the uncore/Infinity Fabric part of the Ryzen chip. Then again, everything about the Ryzen launch has been a bit strange, with most of the industry suffering a suspension of common sense.

The Ryzen CPU is a system-on-a-chip. The processor is made up of two 4-core modules that each contain their own cache memory. The Infinity Fabric is effectively a network that connects the two 4-core modules to the memory controller and to the I/O block containing the PCIe, on-chip SATA and USB controllers built into the Ryzen chip. The Infinity Fabric clock speed is tied to the memory frequency, and like any network its performance is affected by latency. Right now that latency is quite high, but improving. The end result is that gaming workloads, and things like 7-Zip compression that need lots of throughput between the cores and the GPU, tend to underperform because the higher latencies leave the CPU sitting and waiting for data to arrive. The CPU actually works really well when it does have data to work on, though.
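To make the fabric/memory tie concrete (my own sketch based on my understanding, not something stated above): on first-generation Ryzen the Infinity Fabric clock runs at the memory clock, i.e. half the DDR4 transfer rate, so faster RAM directly speeds up the link between the two core modules:

```python
# First-gen Ryzen (assumption): fabric clock == memory clock == DDR transfer rate / 2.
def fabric_clock_mhz(ddr_transfer_rate):
    return ddr_transfer_rate // 2

print(fabric_clock_mhz(2400))  # 1200
print(fabric_clock_mhz(3200))  # 1600
```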

In single-core performance a 7700K will always win; it is clocked 25% faster and has higher IPC. Ryzen pretty much matches Broadwell-E IPC in single-core tasks but tends to be more efficient in multi-core workloads. If you compare a 7700K, a Ryzen R7 and an i7-6900K in Cinebench, you can see how they all relate to each other in pure processing. Ryzen will win most multithreaded tasks unless the Infinity Fabric is heavily loaded, as when a AAA game is running.

If gaming is the only thing that you really do, a kaby lake is probably a better buy right now.


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You mean the voltage curve?
> 
> The best place to get educated on curves is probably reading back through this thread. I have written a few mini tutorials, most recently the other day.
> 
> There is no official documentation that I know of.
> 
> What model card are you running?
> 
> 
> 
> I need to update my profile with this card.
> It is an EVGA GTX 1070 SC ACX 3.0
> 
> I am waiting to order a water block for this card but the ACX 3,0 does a great job keeping it cool, much better than the ACX 2.0 did on my 980.
> 
> Will a waterblock improve OC capability like it did for the 900 series and earlier cards?
> 
> EDIT: Where are the LED controls for the cards now in Geforce Experience?
Click to expand...

That makes it easy then.

Personally I like Afterburner better than Precision XOC if you want to play with the curves; the controls are more fine-grained.

For you, though, Precision gives you a handy tool that can get you close to the best curve for your card. The auto-overclock tool Precision has for EVGA cards doesn't actually work that well for overclocking outright, but what it is pretty good at is finding the shape of your card's best curve: it will show you the areas of the curve that overclock better than others. So what you do is run the utility and scan each point between, say, +50 and +150 in 12.5MHz steps. When it is finished, close Precision XOC, then open Afterburner and its curve window. You can save the curve Precision created and then tweak the points to get it stable.

You should probably drop each point down by 12-25MHz and then see if it runs stable. You can then try adjusting each point one at a time, starting at the highest point and working your way down the curve, staying reasonably close to the curve Precision found for you. If there is a dip in the curve, keep the dip in that area, because those points don't overclock as well.

A water block will help get clocks slightly higher and keep them a bit higher and more stable. You won't actually gain that much extra performance, though.

LED controls are now in the manufacturers software. Precision XOC has them for EVGA.


----------



## Madmaxneo

Quote:


> Originally Posted by *gtbtk*
> 
> That makes it easy then.
> 
> Personally I like After Burner better than Precision XOC If you want to play with the curves. The controls are more fine grained.
> 
> For you though, Precision gives you a handy tool that can help get you close to the best curve for your card. The auto overclock tool that Precision has for EVGA cards doesn't actually work that well for overclocking the card but what it is pretty good at doing is getting you close to the best curve for your card. It will show you the areas of the curve that overclock better than the other parts. So what you do is run the utility and scan each point between say 50 and 150 with 12.5mhz steps. When it is finished, close precision XOC and then open Afterburner and that curve window. You can save that curve that Precision created and then tweak the points to get it stable.
> 
> You should probably drop each point down by 12-25 mhz and then see if it will run stable. You can then try adjusting each point one at a time starting at the highest point and work your way down the curve staying reasonably close to the curve that Precision found for you. If there is a dip in the curve, keep the dip in that area because those points dont overclock as well.
> 
> A Water block will help get clocks slightly higher and keep them a bit higher and more stable. You wont actually gain that much extra performance though.
> 
> LED controls are now in the manufacturers software. Precision XOC has them for EVGA.


OK, I normally use and much prefer AB, but I went and installed XOC. First it took me a while to figure out that I needed to use the "Linear" curve, then a bit longer to realize I had to click on specific spots in the graph to increase it. I still don't completely understand what you mean by "run the utility and scan each point between say 50 and 150 with 12.5mhz steps".

Is there a guide for this somewhere? I looked and found a bunch of forum threads on this, but I'd prefer an official guide, at least for the basics, so I can better follow your directions.

OK, I think I found out how to do it, but I am not sure I did anything right...lol.

First you click on a curve in Linear, hit apply, then go to Manual and hit run. If that is correct, does it just keep doing the same thing over and over until I stop it, or does it stop automatically?
When I go to Manual and hit run, is it still running on the Linear curve I selected?

EDIT: After some more senseless button pushing I think I figured it out: I click on a slider in Manual then click run. But no matter what settings I use, the program crashes. I have not been able to complete a run yet.


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> That makes it easy then.
> 
> Personally I like After Burner better than Precision XOC If you want to play with the curves. The controls are more fine grained.
> 
> For you though, Precision gives you a handy tool that can help get you close to the best curve for your card. The auto overclock tool that Precision has for EVGA cards doesn't actually work that well for overclocking the card but what it is pretty good at doing is getting you close to the best curve for your card. It will show you the areas of the curve that overclock better than the other parts. So what you do is run the utility and scan each point between say 50 and 150 with 12.5mhz steps. When it is finished, close precision XOC and then open Afterburner and that curve window. You can save that curve that Precision created and then tweak the points to get it stable.
> 
> You should probably drop each point down by 12-25 mhz and then see if it will run stable. You can then try adjusting each point one at a time starting at the highest point and work your way down the curve staying reasonably close to the curve that Precision found for you. If there is a dip in the curve, keep the dip in that area because those points dont overclock as well.
> 
> A Water block will help get clocks slightly higher and keep them a bit higher and more stable. You wont actually gain that much extra performance though.
> 
> LED controls are now in the manufacturers software. Precision XOC has them for EVGA.
> 
> 
> 
> Ok, I normally use and much prefer AB but I went and installed XOC. First it took me awhile to figure out that I needed to use the "Linear" curve then a bit longer to realize I had to click in specific spots in the graph to increase it. I still don't completely understand what you mean by "So what you do is run the utility and scan each point between say 50 and 150 with 12.5mhz steps".
> 
> Is there a guide for this somewhere? I looked and found a bunch of forums on on this but prefer an official guide. at least for the basics so I can better follow your directions.
> 
> Ok I think I found out how to do it. But I am not sure I did anything right...lol.
> 
> First you click on a curve in Linear, hit apply then go to manual and hit run. If that is correct, does it just keep doing the same thing over and over until I stop it or does it stop automatically?
> When I go to manual and hit run is it still running on the Linear curve I selected?
> 
> EDIT: After some more senseless button pushing I think I figured it out. I click on a slider in manual then click run. But no matter what settings I set it at the program crashes. I have not been able to complete a run yet.
Click to expand...

I am not suggesting that you use Precision for manual curve adjustment; it is really clunky.

If you are running an EVGA card, or a card with an EVGA BIOS installed, when you open the curve window there are a couple of different options (this is from memory, I am not near a PC that has Precision installed): one is Linear and the other is Manual, I think.

On the manual curve page there is a button that says "run" that you have already found.

Directly below that run button there is a cog icon. Clicking it gives you options for what resolution you want to run the test at, how many seconds it should run at each level, the max and min offset limits the test will run between, and the amount that the test will step up for each level.

In the cog settings window that pops up, I would suggest that you use:


1920x1080 resolution,
7 seconds,
high of 150,
low of 50 - you can adjust this to be higher if you already know the limit from using the slider to overclock; if you can OC with a +110, set the minimum to 100 instead of 50. It will make the test faster, as it has fewer points to test.
a step level of 12.5

Adjust the settings and close the settings window.

Click on the run button and a Furmark-like radar screen will pop up and start running.

It will run for the 7 seconds you set, then restart and do it all over again, continuing like that for some time; each test checks the stability of a specific voltage point, at offsets 12.5 MHz apart between +50 and +150. Go and get a coffee or have a meal, because it will run for the next 20-30 minutes and you can't really use the PC for anything else while it is running. If you can see the Precision screen, you will see the curve being updated with which point and which voltage level is being tested as it goes along.

If it crashes along the way, whether just the app or even a bluescreen, get back to Precision and restart the test. It will probably tell you that the test was interrupted and ask if you want to continue. Say yes and it will continue where it left off before the crash. You can tell it has resumed correctly if the first part of the curve shows increased levels while the right side is still at default.
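To make the sweep's arithmetic concrete, here is a rough Python sketch of what those cog settings add up to. This is only an illustration, not XOC's actual code; the numbers are the suggested settings from above:

```python
# Illustration only - not EVGA's code. Models one sweep of the XOC
# auto-tune with the suggested cog settings: every offset between the
# low and high limits, 12.5 MHz apart, held for 7 seconds each.
low_mhz, high_mhz = 50, 150   # min/max offset limits
step_mhz = 12.5               # step level
dwell_s = 7                   # seconds per test point

offsets = []
mhz = float(low_mhz)
while mhz <= high_mhz:
    offsets.append(mhz)
    mhz += step_mhz

print(len(offsets), "points,", len(offsets) * dwell_s, "seconds per sweep")
# XOC repeats a sweep like this across many voltage points on the curve
# (restarting after any crash), which is why a full run takes 20-30 minutes.
```

Raising the low limit to 100, as suggested when you already know +110 works, roughly halves the points per sweep.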


----------



## Gurkburk

My 3dmark score somehow dropped by 5-600 points after reinstalling windows. ***?!


----------



## gtbtk

Quote:


> Originally Posted by *Gurkburk*
> 
> My 3dmark score somehow dropped by 5-600 points after reinstalling windows. ***?!


Did you remember to turn off Game DVR?


----------



## Gurkburk

Quote:


> Originally Posted by *gtbtk*
> 
> did you remember to turn off gamedvr?


Game DVR? I DDU'd the Windows-installed driver and disabled Game Mode in Windows 10.

Edit: Yes, I disabled that.


----------



## Madmaxneo

Quote:


> Originally Posted by *gtbtk*
> 
> I am not suggesting that you use precision for manual curve adjustment. It is really clunky.
> 
> If you are running an EVGA card, or a card with an EVGA bios installed, if you open the curve window, There are a couple of different options, (this is from memory, I am not near a PC that has precision installed) one is linear and the other one is Manual I think.
> 
> On the manual curve page there is a button that says "run" that you have already found.
> 
> Directly below that run button, there is a cog icon that if you click, it will give you an option of what resolution you want to run this test on, how many seconds it should run at each level, the max and min offset limits the test will run between and the amount that the test will step up for each level of test.
> 
> in the cog settings windows that pops up, I would suggest that you use:
> 
> 1920x1080p resolution,
> 7 seconds,
> high of 150,
> low of 50 - you can adjust this to be higher if you already know what the limit is from using the slider to overclock, if you can OC with a +110, set the minimum to 100 instead of 50. It will make the test faster as it has less points to test.
> the step level of 12.5
> Adjust the settings and close the settings window.
> 
> Click on the run button and a furmark like radar screen will pop up on the screen and start running.
> 
> it will run for the 7 seconds you set and then restart and do it all over again, it will continue doing that over and over for some time, each test is checking stability of a specific voltage level 12.5 mhz apart between +50 and +150. Go and get a coffee or have a meal because it will run for the next 20-30 minutes and you cant really use the PC for anything else while it is running. If you can see the precision screen, you will see the curve being updated with what point is being tested and what voltage level is being tested as it goes along.
> 
> If it crashes along the way, either the app or even a bluescreen, get back to precision and restart the test again. It will probably tell you that the test was interrupted and ask you do you want to continue. Say yes and it will continue where it left off before the crash. You can tell if the curve on screen has increased levels on the first part of the curve but the right side is all still at default.


It did crash and there is no continue button. It basically says there was an issue with the program and it stopped running. The only button is an OK button. This happens every time.
There is also an antialiasing option, but I leave that off.


----------



## Gurkburk

I've noticed my GPU usage drops to 70% in the last test in 3dmark for some reason...

Edit: This seems to be intentional..

But I fixed the loss of score; the newest driver was causing it. Reverting to the previous one bumped me up to 16400 again.

Edit2:

16438 score now. Getting +110 on the core (can't seem to get higher without a crash) & a whopping +720 memory clock! Holy cow..

Edit 3: Somehow those clocks made it through 3DMark without showing any sign of flickering, etc. But in games it flickers a lot..


----------



## lanofsong

Hey there GTX 1070 owners,

Would you consider signing up with Team OCN for the 2017 Pentathlon (*May 5th through May 19th*)? There is still plenty of time left, and we really could use your help.

This event is truly a GLOBAL battle, with your team OCN going up against many teams from across the world. While we put in a good showing at last year's event by finishing 6th, we could do with a lot more CPU/GPU compute power. All you need to do is sign up and crunch on any available hardware that you can spare.

The cool thing about this event is that it is spread over 5 disciplines of *varying lengths of time* (different projects), so there is a lot of *strategy/tactics* involved.

We look forward to having you and your hardware on our team. Again, this event lasts for two weeks and takes place May 5th through the 19th.


Download the software here.

https://boinc.berkeley.edu/download.php

Presently we really would like some help with the following project:

Add the following *GPU* project - *Einsteinathome.org*



Note: For every project you crunch on, you will be asked if you want to join a team - type in overclock.net (enter) then JOIN team.


Remember to sign up for the BOINC team by going here. You can also post any questions that you may have - this group is very helpful.









8th BOINC Pentathlon thread

To find your Cross Project ID# - sign into your account and it will be located under Computing and Credit


Please check out the GUIDE - How to add BOINC Projects page for more information about running different projects:

This really is an exciting and fun event. I look forward to it every year, and I am hoping you will join us and participate.









BTW - There is an awesome BOINC Pentathlon badge for those who participate










lanofsong

OCN - FTW


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I am not suggesting that you use precision for manual curve adjustment. It is really clunky.
> 
> If you are running an EVGA card, or a card with an EVGA bios installed, if you open the curve window, There are a couple of different options, (this is from memory, I am not near a PC that has precision installed) one is linear and the other one is Manual I think.
> 
> On the manual curve page there is a button that says "run" that you have already found.
> 
> Directly below that run button, there is a cog icon that if you click, it will give you an option of what resolution you want to run this test on, how many seconds it should run at each level, the max and min offset limits the test will run between and the amount that the test will step up for each level of test.
> 
> in the cog settings windows that pops up, I would suggest that you use:
> 
> 1920x1080p resolution,
> 7 seconds,
> high of 150,
> low of 50 - you can adjust this to be higher if you already know what the limit is from using the slider to overclock, if you can OC with a +110, set the minimum to 100 instead of 50. It will make the test faster as it has less points to test.
> the step level of 12.5
> Adjust the settings and close the settings window.
> 
> Click on the run button and a furmark like radar screen will pop up on the screen and start running.
> 
> it will run for the 7 seconds you set and then restart and do it all over again, it will continue doing that over and over for some time, each test is checking stability of a specific voltage level 12.5 mhz apart between +50 and +150. Go and get a coffee or have a meal because it will run for the next 20-30 minutes and you cant really use the PC for anything else while it is running. If you can see the precision screen, you will see the curve being updated with what point is being tested and what voltage level is being tested as it goes along.
> 
> If it crashes along the way, either the app or even a bluescreen, get back to precision and restart the test again. It will probably tell you that the test was interrupted and ask you do you want to continue. Say yes and it will continue where it left off before the crash. You can tell if the curve on screen has increased levels on the first part of the curve but the right side is all still at default.
> 
> 
> 
> It did crash and there is no continue button. It basically says there was an issue with the program and it stopped running. The only button is an OK button. This happens every time.
> There is also an Antialiasing option but I leave that off.
Click to expand...

Restart the program and start the auto tuning again. It will pick up where it left off.


----------



## gtbtk

Quote:


> Originally Posted by *Gurkburk*
> 
> I've noticed my GPU usage drops to 70% in the last test in 3dmark for some reason...
> 
> Edit: This seems to be intentional..
> 
> But i fixed the loss of score, newest driver was causing it. Reverting to the previous bumped me up to 16400 again.
> 
> Edit2:
> 
> 16438 score now. Getting 110CC (Can't seem to get higher without crash) & a wopping 720 Memory clock! Holy cow..
> 
> Edit 3. Somehow those clocks managed it through 3dmark without showing any sign of flickering etc. But in games it flickers a lot..


The combined test runs physics on the CPU while also sending draw calls to the GPU. The card is slowing down so the CPU can keep up.

Memory at +720 may crash in Time Spy as well. Somewhere around +500 is probably where it will be happy in games.


----------



## Halseluk

Hello, guys!

I now have a 1070 HOF, and I'm having some trouble overclocking it. I couldn't get 2050 MHz stable yet. Micron memory, latest bios installed.

Besides that, it boosts to 1949~1962 at stock, and it gets an 18700 graphics score in Firestrike. I'm sure when I had a 1070 FTW it could score 19000 at the same clocks. Is it a big deal?


----------



## lanofsong

Hey there GTX 1070 owners,

We truly could use your help here. Presently we are #1, just ahead of two of the great TITANs when it comes to distributed computing, and to stay there we could use help from you and your BOSS GPUs. Only 4 days left.




Download the software here.

https://boinc.berkeley.edu/download.php

Add the following *GPU* project - *Einsteinathome.org*



Note: For every project you fold on, you will be offered if you want to join a team - type in overclock.net (enter) then JOIN team.


Remember to sign up for the Boinc team by going here: You can also post any questions that your may have - this group is very helpful









8th BOINC Pentathlon thread

Thanks in advance.

lanofsong

OCN - FTW


----------



## TerafloppinDatP

My 1070 will be crushing some Einstein@Home for the next 4 days! As much as a 1070 can in the face of Titans and Quadros, at least


----------



## gtbtk

Quote:


> Originally Posted by *Halseluk*
> 
> Hello, guys!
> 
> I have now a 1070 HOF, and I'm having some trouble overclocking it. I couldn't get 2050 MHz stable yet. Micron memory, latest bios installed.
> 
> Beside that, it boosts to 1949 ~1962 at stock, and it gets 18700 graphics score in Firestrike. I'm sure when I had a 1070 FTW it could score 19000 with same clocks. Is it a big deal?


Try using Afterburner rather than the Galax software.

You can try using the curve to overclock instead of the slider; there is probably only one point along the curve that is becoming unstable and messing up the overclock. How much can you overclock the card with the slider?

I found that my Sandy Bridge system had limitations at about 1.0v on the curve. I also found that small adjustments to VCCIO and CPU PLL (I am using bclk overclocking) on the motherboard improved the overhead problem on the card.

If you want to start trying the curve at the highest voltage, just try adjusting the 1.093v point to 2100 MHz and the .950v point to 2025. Memory should be ok with a +400 to +500 increase.

If that runs stable, great; experiment by increasing a single point at a time at other voltage points and see if it improves things. If it is not stable, reduce the 1.093v point by 12 MHz first and try again. If it doesn't like that, drop the voltage back to stock level and use the 1.063v point instead of the 1.093v point. The performance is almost the same anyway, and the lower voltage helps keep the card cooler. Once those two points are good, you can try experimenting with other points along the curve and see how that goes.
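The reduce-by-12-and-retest routine above is mechanical enough to sketch in Python. `is_stable` here is a hypothetical stand-in for actually running Heaven or Firestrike at that setting; it is not a real API:

```python
# Hypothetical sketch of the retest loop described above: start a curve
# point at an ambitious clock and walk it down in ~12 MHz steps until a
# stability check passes. is_stable() stands in for a manual benchmark run.
def find_stable_clock(start_mhz, is_stable, step_mhz=12, floor_mhz=1900):
    """Walk one curve point down from start_mhz until is_stable() passes."""
    clock = start_mhz
    while clock > floor_mhz:
        if is_stable(clock):
            return clock
        clock -= step_mhz
    return floor_mhz

# Example with a made-up card that is stable at 2050 MHz or below:
print(find_stable_clock(2100, lambda mhz: mhz <= 2050))  # prints 2040
```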

My Gaming X can do a 21200 graphics score, so 18700 is a little low. Memory overclocks help Pascal 1070 cards more than a little extra core frequency does.


----------



## Halseluk

Quote:


> Originally Posted by *gtbtk*
> 
> try using Afterburner rather than the Galax software.
> 
> You can try to use the curve to overclock in stead of the slider, there is probably only one point that is becoming unstable along the curve and that is messing up the overclock. How much can you overclock the card with the slider?
> 
> I found that on my sandy bridge system, it had limitations at about 1.0v on the curve. I also found that small adjustments to vccio and CPU PLL (an am using bclk overclocking) on a the motherboard improved the overhead problem on the card.
> 
> If you want to start trying the curve, if you want to run at the highest voltage, just try adjusting the 1.093v to 2100Mhz and the .950V point to 2025. Memory should be ok with a +400 to +500 increase.
> 
> If that runs stable, great, experiment by increasing a single point at a time at other voltage points and see if it improves things. If it is not stable, reduce the 1.093v point by 12 mhz first and try again. If it doesnt like it, drop the voltage back to stock level and use the 1.06v point instead of the 1.093v point. The performance is almost the same anyway and the lower voltage helps keep the card cooler. once the two points are good, you can try experimenting other points along the curve and see how that goes.
> 
> my Gaming X can do a 21200 gaming score, so 18000 is a little low. memory overclocks helps Pascal 1070 cards more than a little extra frequency


I keep the Galax software only for the RGB lighting.

In AB, I can overclock about 50 MHz with the slider, and it doesn't give much performance. It runs fine with +500 MHz on memory. My 2700K is @ 4.4 GHz just by raising the multiplier, 1.23V.

I did what you told me:

The .950v point stabilizes at 1936 or 1949 (I don't remember right now)
The 1.093v point stabilizes at 2037

I've tweaked the points between them; core clock stays at 2012~2025, 9000 MHz memory. Got a 20100 graphics score. I don't think this card will oc further, and I'll test stability with some games.

I've read several pages in this thread and did some monitoring with HWiNFO. Max power draw for my card was 172W. Usually it draws 162W, and the power target slider doesn't seem to make any difference.

And thank you very much for the tips!


----------



## gtbtk

Quote:


> Originally Posted by *Halseluk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> try using Afterburner rather than the Galax software.
> 
> You can try to use the curve to overclock in stead of the slider, there is probably only one point that is becoming unstable along the curve and that is messing up the overclock. How much can you overclock the card with the slider?
> 
> I found that on my sandy bridge system, it had limitations at about 1.0v on the curve. I also found that small adjustments to vccio and CPU PLL (an am using bclk overclocking) on a the motherboard improved the overhead problem on the card.
> 
> If you want to start trying the curve, if you want to run at the highest voltage, just try adjusting the 1.093v to 2100Mhz and the .950V point to 2025. Memory should be ok with a +400 to +500 increase.
> 
> If that runs stable, great, experiment by increasing a single point at a time at other voltage points and see if it improves things. If it is not stable, reduce the 1.093v point by 12 mhz first and try again. If it doesnt like it, drop the voltage back to stock level and use the 1.06v point instead of the 1.093v point. The performance is almost the same anyway and the lower voltage helps keep the card cooler. once the two points are good, you can try experimenting other points along the curve and see how that goes.
> 
> my Gaming X can do a 21200 gaming score, so 18000 is a little low. memory overclocks helps Pascal 1070 cards more than a little extra frequency
> 
> 
> 
> I keep Galax software only for the rgb ligthing.
> 
> In AB, I can overclock about 50 MHz with the slider, and it doesn't give much performance. Runs fine with +500MHz in memory. My 2700K is @ 4.4 GHz just by raising multiplier, 1.23V.
> 
> I did what you told:
> 
> .950 point stabilizes in 1936 or 1949 (I don't remember right now)
> 1.093 point stabilizes in 2037
> 
> I've tweaked the points between them, core clock stays in 2012 ~ 2025, 9000 MHz memory. Got 20100 graphics score. I don't think this card will oc further, and I'll test stability with some games.
> 
> I've read several pages in this thread, and did some monitoring with HWINFO. Max power draw for my card was 172W. Usually it draws 162W, and the power target slider doesn't seem to make any difference.
> 
> And thank you very much for the tips!
Click to expand...

You have a 2700K. Great, my experience is with a 2600.

I have been running an Asus Z68 motherboard; other manufacturers may call the IO voltage something slightly different. I found that increasing the VCCIO voltage from the default of 1.05 to just over 1.1v improved GPU performance and stability. I also found stability benefits from increasing CPU PLL voltage slightly from 1.8 to 1.8138v. You may want to do a bit of experimentation there.

The HOF cards use an International Rectifier IR3595A voltage controller, which is different to most of the other 1070s. Afterburner does support the controller, but you may actually need to go through the steps to properly configure the software for voltage control with the instructions here: http://forums.guru3d.com/showthread.php?t=399542. Having said that, if your card is running 1.093v then voltage control works, as the limit at default voltage is 1.063v.

The voltage control slider on Pascal does not actually increase the voltage. It adjusts the card, through the driver, to access the higher voltage points on the curve. The Precision XOC auto overclock utility with an EVGA bios card can access 1.1v and above when it is working. That being said, the extra voltage does help get higher frequency readings at the cost of higher temps; it doesn't really give hugely more FPS, so not getting crazy high core frequency results is not actually the end of the world.

In the meantime, I suggest that you leave the voltage slider at 0 for the time being. Try setting the core slider to +50 and use that as your starting point (that is also the limit with my card running the Gaming Z bios). Open the curve window. Now, using the +50 as a starting point, try increasing the .950v point as high as it will go; if it is 1949, that's fine. Then try the 1.050v or the 1.063v point. I can get mine to 2076 at that voltage level with +500 memory. The 1.050v point will keep the clock speeds higher as temps increase compared to the 1.063v point.


----------



## lanofsong

Quote:


> Originally Posted by *TerafloppinDatP*
> 
> My 1070 will be crushing some Einstein@Home for the next 4 days! As much as a 1070 can in the face of Titans and Quadros, at least


Crushing Like a Boss


----------



## Halseluk

Quote:


> Originally Posted by *gtbtk*
> 
> You have a 2700K. Great my experience is with a 2600.
> 
> I have been running an Asus Z68 motherboard, other manufacturers may call the IO voltage something slightly different. I found the increasing the VCCIO voltage from the default of 1.05 to just over 1.1v improved GPU performance and stability. I also found stability benefits from increasing CPU PLL voltage slightly from 1.8 to 1.8138v. You may want to do a bit of experimentation there.


My board is a P8Z77-V; I didn't find a VCCIO voltage. There is VCCSA, CPU PLL and PCH voltage.
Quote:


> Originally Posted by *gtbtk*
> 
> The HOF cards use an International Rectifier IR3595A voltage controller which is different to most of the other 1070s. After burner does support the controller but You may actually need to go through the steps to properly configure the software for voltage control with the instructions here http://forums.guru3d.com/showthread.php?t=399542. having said that. If you card is running 1.093v then the voltage control works as the limit at default voltage is 1.063v


Should I follow those steps or just set the voltage control option to "third-party"? AB installs with the latest list from that thread, which includes the 980Ti HOF.

BTW, I found that the voltage slider in Xtreme Tuner really does affect the card, but the voltage readings in Afterburner and HWiNFO don't show the actual voltage, while power draw can go beyond 200W depending on the setting. The slider goes from 0.8 to 1.3.

At stock frequencies, increasing the voltage with this slider made the card get 19000 in Firestrike. Then I put +25 MHz core (actual increase was +50) and +500 MHz memory, and got 20100. I tried +6 MHz more on the core and increased voltage further; it doesn't get stable at any voltage.

So, I rolled back to +25 core, +500 MHz memory, 1.1V slider in Xtreme Tuner, and ran Firestrike again. Got 17000 in graphics. Reinstalled Xtreme Tuner, AB and the Nvidia drivers (using DDU), and the performance was still down. The card clocks normally, but power draw is around 82W. I'm afraid I damaged my card somehow.
Quote:


> Originally Posted by *gtbtk*
> 
> In the mean time, I suggest that you leave voltage slider at 0 for the time being. Try setting the slider to +50 and use that as your starting point (that is also the limit with my card running the Gaming Z bios). Open the curve window. now using the +50 as a starting point, try increasing the .950v point as high as it will go, if it is 1949 that's fine. Then try the 1.050 or the 1.063v point. I can get mine to 2076 on that voltage level with +500 memory. the 1.050v point will keep the clock speeds higher as temps increase compared to the 1.063v point.


I will try this as soon as I fix my card. Hope I didn't screw up...

EDIT: Reseated the card and it's back to normal. I'm not touching that voltage slider in Xtreme Tuner again.


----------



## Madmaxneo

Does anyone know if the EVGA GTX 1070 SC ACX 3.0 Black edition has the 4-pin mini PWM connection? I am going to connect a Swiftech H140-X AIO to my card for watercooling. After a bit of research I found I can connect both the pump and the fan using a splitter with a mini connector on one end that splits into two regular PWM connections.
I also want to make sure I can access this connector on the card with a waterblock and a backplate installed.

I have been looking for images, but most of what I find deals with other manufacturers' video cards, so nothing accurate so far.

Thanks in advance.


----------



## a doorway

Hey guys, hoping for a little help.

Had my 8GB MSI Gaming X 1070 for about a year now and decided to try overclocking it recently, only to end up with quite poor results. I'm not sure if I've lost out in the silicon lottery or something else is the issue. Using MSI Afterburner, I can't even get my mem clock above about +200 without losing stability in games. I see plenty of people overclock theirs to at least 400, up to 700, so surely something is amiss. I've also already installed the MSI Micron BIOS update. Could it have something to do with my card not getting enough juice from my PSU? RAM? The motherboard I'm using, by the way, is a Gigabyte Z170X Gaming 3.

Thanks


----------



## DoubleNorm

Quote:


> Originally Posted by *a doorway*
> 
> Hey guys, hoping for a little help.
> 
> Had my 8GB MSI Gaming X 1070 card for about a year now and decided to try to overclock it recently, only to end up with quite poor results. I'm not sure if I've lucked out on the silicon lottery or something else is the issue. Using MSI Afterburner, I can't even get my Mem Clock above about 200 without losing stability in games. I see plenty of people overclock theirs to at least 400 up to 700 so surely something is a miss there. I've also already installed the MSI Micron BIOS update already. Could it have something to do with my card not getting enough juice from my PSU? RAM? The motherboard I'm using by the way is a Gigabyte z170x Gaming 3.
> 
> Thanks


No man, if you already installed the MSI Micron BIOS update, there is nothing more you can do.


----------



## a doorway

Quote:


> Originally Posted by *DoubleNorm*
> 
> No man, if you already installed the MSI Micron BIOS update, there is nothing more you can do.


Thanks, looks like it could actually be the game I'm playing causing the crashes... in Unigine I can get to like +400 mem no probs.


----------



## gtbtk

Quote:


> Originally Posted by *Halseluk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You have a 2700K. Great my experience is with a 2600.
> 
> I have been running an Asus Z68 motherboard, other manufacturers may call the IO voltage something slightly different. I found the increasing the VCCIO voltage from the default of 1.05 to just over 1.1v improved GPU performance and stability. I also found stability benefits from increasing CPU PLL voltage slightly from 1.8 to 1.8138v. You may want to do a bit of experimentation there.
> 
> 
> 
> My board is a P8Z77-V, I didn't find VCCIO voltage. There is VCCSA, CPU PLL and PCH voltage.
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The HOF cards use an International Rectifier IR3595A voltage controller which is different to most of the other 1070s. After burner does support the controller but You may actually need to go through the steps to properly configure the software for voltage control with the instructions here http://forums.guru3d.com/showthread.php?t=399542. having said that. If you card is running 1.093v then the voltage control works as the limit at default voltage is 1.063v
> 
> Click to expand...
> 
> Should I follow these steps or just set voltage control option to "third-party"? AB installs with the latest list from that thread, which includes 980Ti HOF.
> 
> BTW, I found that the voltage slider in Xtreme Tuner really affects the card, but the voltage readings in Afterburner and Hwinfo doesn't show the actual voltage, while the power draw can go beyond 200W depending on the setting. The slider goes from 0.8 to 1.3.
> 
> At stock frequencies and increasing voltage in this slider made the card get 19000 in Firestrike. Then I put + 25 MHz core (actual increase was + 50) and +500 MHz memory, got 20100. I tried + 6 MHz core and increased voltage further, doesn't get stable in any voltage.
> 
> So, I rolled back to +25 core, +500 MHz, 1.1V slider in Xtreme Tuner and ran Firestrike again. Got 17000 in graphics. Reinstalled Xtreme Tuner, AB and Nvidia drivers (using DDU), and the performance still dropped. The card clocks normally, but power draw is around 82W. I'm afraid I damaged my card somehow.
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> In the mean time, I suggest that you leave voltage slider at 0 for the time being. Try setting the slider to +50 and use that as your starting point (that is also the limit with my card running the Gaming Z bios). Open the curve window. now using the +50 as a starting point, try increasing the .950v point as high as it will go, if it is 1949 that's fine. Then try the 1.050 or the 1.063v point. I can get mine to 2076 on that voltage level with +500 memory. the 1.050v point will keep the clock speeds higher as temps increase compared to the 1.063v point.
> 
> Click to expand...
> 
> I will try this as soon as I fix my card. Hope I didn't screw up...
> 
> EDIT: Reseated the card and it's back to normal. I'm not touching that voltage slider in Xtreme Tuner again.
Click to expand...

You could try adjusting the SA.

Does your UEFI look like this? The control is directly under the VCCSA voltage (the iGPU section may not appear if you have it disabled). You do not need to touch the PCH voltage.



With regard to AB and voltage control: if the card runs at 1.081 or 1.093v when the slider is at 100%, it is working. If it stays at 1.050 or 1.063v, it isn't, and you will need to go through the procedure to make it work. You can check in AB by setting the OSD to display the voltage on screen. AB knows how to talk to the controller, but it sounds like it may not know the controller is there, as this model card may not be referenced in its database. There are further PDF doc files in the install directory that you can read in addition to the forum posts.

20100 sounds more like what I would expect you to get, and is about what I got when I first got my card. With tuning you may find you can get it to 21000 or more.

How did you go playing with the curve?


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> Does anyone if the EVGA GTX 1070 SC ACX 3.0 Black ed has the 4 pin mini PWM connection? I am going to connect the Swiftech H140-X AIO to my card for watercooling. After a bit of research I have found out I can connect both the pump and the fan using a splitter with a mini port on one end that splits into two regular PWM connections.
> I also want to make sure I can access this connector on the card with a waterblock and a backplate installed.....
> 
> I have been looking for images and most of what I find deals with other manufacturers video cards. So nothing that accurate so far.
> 
> Thanks in advance.


I think it might.

See if Techpowerup.com has a review of the card. They have photos of the review sample's PCB, so you can check for yourself. From memory, the SC card is a reference board, and I think that runs 4-pin PWM for the fans.

Just make sure the pump and the fan don't exceed the power specs of the 4-pin header if you do try it.


----------



## gtbtk

Quote:


> Originally Posted by *a doorway*
> 
> Hey guys, hoping for a little help.
> 
> Had my 8GB MSI Gaming X 1070 card for about a year now and decided to try to overclock it recently, only to end up with quite poor results. I'm not sure if I've lucked out on the silicon lottery or something else is the issue. Using MSI Afterburner, I can't even get my Mem Clock above about 200 without losing stability in games. I see plenty of people overclock theirs to at least 400 up to 700 so surely something is a miss there. I've also already installed the MSI Micron BIOS update already. Could it have something to do with my card not getting enough juice from my PSU? RAM? The motherboard I'm using by the way is a Gigabyte z170x Gaming 3.
> 
> Thanks


I would suggest that you look at tweaking the VCCIO and VCCSA voltages; the recommended maximum is 1.25 V for both. If you do adjust them, increase them one step at a time. All the guides say this is for memory overclocking, and it does help there, but I have also found that it helps with the PCIe controller.


----------



## Madmaxneo

Quote:


> Originally Posted by *gtbtk*
> 
> I think it might.
> 
> See if Techpowerup.com has a review on the card. They have photos of the review samples PCB so you can check yourself. From memory the SC card is a reference board and I think that runs 4 pin PWM for the fans.
> 
> Just make sure that the pump and the fan don't exceed the power specs of the 4 pin if you do try it.


Thanks.

I finally got a response back from EVGA: it is a mini 4-pin header, and according to them the EVGA PowerLink should not interfere. They sent me links to their hybrid cooling unit to help with connecting the AIO to my GPU, though they also recommended contacting Swiftech support, which I did more than a week ago with still no response.


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I think it might.
> 
> See if Techpowerup.com has a review on the card. They have photos of the review samples PCB so you can check yourself. From memory the SC card is a reference board and I think that runs 4 pin PWM for the fans.
> 
> Just make sure that the pump and the fan don't exceed the power specs of the 4 pin if you do try it.
> 
> 
> 
> Thanks.
> 
> I finally got a response back from EVGA and it is a mini 4pin header and according to them the EVGA power link should not interfere. They sent me links to their hybrid cooling unit to help me with connecting the AIO to my GPU though they also recommend contacting Swiftech support, which I have more than a week ago and still no response.
Click to expand...

The reason I mentioned it is that Swiftech's MCP30 is a 6W pump, a bit more powerful than the 2.1W Asetek pump built into the blocks. Good to see EVGA has a healthy attitude to this and is not all about "warranty void if removed" stickers everywhere.

The pump is SATA powered, isn't it?


----------



## w-moffatt

Hey Guys,

N00b question here, but can I run SLI 1070s on a 750W PSU? Specs in sig, but I'm looking to jump to an i7-7700K, an extra 16GB of DDR4 RAM, and a second 1070. Just need to know if my 750W will be enough.


----------



## Madmaxneo

Quote:


> Originally Posted by *gtbtk*
> 
> The reason why I mentioned it is that swifttech has a 6W mcp30 pump that is a bit more powerful than the asetek 2.1W in the block pumps. Good to see EVGA has a healthy attitude to it and is not all about warranty void if removed stickers everywhere.
> 
> the pump is sata powered isnt it?


Yes, the pump is SATA powered, and I have everything working normally for now. I connected only the fan to the GPU and the pump directly to my MB.
I did not remove any stickers to void any kind of warranty on my card. EVGA does not consider adding a waterblock to one of their graphics cards something that voids the warranty. They also allow you to change the TIM on the GPU core if you prefer, which likewise does not void the warranty.


----------



## Mad Pistol

Quote:


> Originally Posted by *w-moffatt*
> 
> Hey Guys,
> 
> n00b question here, but - Can i run an SLI 1070 on a 750w PSU. Specs in sig but im looking to jump to an i7-7700k an extra 16gb ddr4 ram and a second 1070. Just need to know if my 750w will be enough


Yes, it's plenty. I was able to run dual GTX 780's on my 750w eVGA PSU, and I currently run dual GTX 1070s. Solid as a rock.


----------



## ezveedub

Quote:


> Originally Posted by *w-moffatt*
> 
> Hey Guys,
> 
> n00b question here, but - Can i run an SLI 1070 on a 750w PSU. Specs in sig but im looking to jump to an i7-7700k an extra 16gb ddr4 ram and a second 1070. Just need to know if my 750w will be enough


I would say you're at the limit with almost no headroom if power consumption goes high or spikes; the PSU would have to be a very top-of-the-line unit to handle it. Nvidia rates a single GTX 1070 system for a minimum of 500 watts:

http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-1070/specifications


----------



## w-moffatt

Quote:


> Originally Posted by *ezveedub*
> 
> I would say you're at limit with almost no head room if power consumption goes high or spikes. PSU would have to be a very top of line unit if it does. Nvidia has one GTX1070 system rated for minimum 500watts,
> 
> http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-1070/specifications


Having the MSI Gaming X edition, would that change anything? I thought it might be a bit tight, especially if there are any power spikes. I may hold off on the second 1070 and assess Vega when it drops.


----------



## Mad Pistol

Quote:


> Originally Posted by *ezveedub*
> 
> I would say you're at limit with almost no head room if power consumption goes high or spikes. PSU would have to be a very top of line unit if it does. Nvidia has one GTX1070 system rated for minimum 500watts,
> 
> http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-1070/specifications


I'm telling you, it's not a problem. I have 3 hard drives, an M.2 SSD, a 280mm AIO water cooler, and 16GB of 1866MHz RAM; I can overclock everything as far as it will go, and the system is rock solid. In fact, I was able to do the same thing with SLI GTX 780s, which used a lot more power than SLI 1070s.

750 watts is plenty for an SLI GTX 1070 setup. His current PSU is fine.
Quote:


> Originally Posted by *w-moffatt*
> 
> Having the MSI Gaming X edition, would that change anything? I thought it might be a bit tight especially if there are any power spikes. I may hold off on the second 1070 and assess vega when it drops.


Your current PSU is fine. You have nothing to worry about.


----------



## gtbtk

Quote:


> Originally Posted by *w-moffatt*
> 
> Hey Guys,
> 
> n00b question here, but - Can i run an SLI 1070 on a 750w PSU. Specs in sig but im looking to jump to an i7-7700k an extra 16gb ddr4 ram and a second 1070. Just need to know if my 750w will be enough


With Founders Edition/reference board cards (180W cards), 750W should be just enough. An 850W unit would be a more comfortable choice, though, and would run at a higher efficiency level.

You would be highly marginal with, say, two Zotac AMP Extreme cards, as they are 300W cards.

If you allocate 200W for the motherboard/CPU/RAM/storage and then find the max power draw of each graphics card, you just need to add everything together. PSUs run most efficiently when they spend most of their lives at around 50% load. If it bumps up under a heavy load that is fine, but high loads make heat, and that will force the PSU fan to high speed to cool it, which also increases noise.
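
The rule of thumb above is just addition, but it's easy to sanity-check with a few lines of Python. The 200W system allowance, the 180W/300W card figures, and the ~50% efficiency sweet spot all come from the post; the helper name is made up for illustration:

```python
# Rough PSU budget check following the rule of thumb above:
# allocate ~200 W for motherboard/CPU/RAM/storage, add each card's
# max board power, then compare against the PSU rating.

def psu_load(psu_watts, card_watts, n_cards=2, system_watts=200):
    """Return (total draw in watts, load as a fraction of PSU capacity)."""
    total = system_watts + n_cards * card_watts
    return total, total / psu_watts

# Two reference-board 180 W GTX 1070s on a 750 W unit:
total, load = psu_load(750, 180)
print(total, round(load, 2))   # 560 W total, ~0.75 load

# Two 300 W AMP Extreme-class cards would blow the budget:
total, load = psu_load(750, 300)
print(total, round(load, 2))   # 800 W total, over the rating
```

At ~75% load the 750W unit works but is well past the 50% sweet spot, which matches the "just enough, 850W would be more comfortable" call above.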


----------



## w-moffatt

Quote:


> Originally Posted by *gtbtk*
> 
> With Founders edition cards/reference board cards (180W cards), 750W should be just enough. An 850W would be a more comfortable choice and run at a higher efficiency level though
> 
> You would be highly marginal with say, two Zotac amp extreme cards as they are 300W cards.
> 
> If you allocate 200W for the motherboard/cpu/ram/storage and then find out the max power draw of the graphics card, you just need to add everything together. PSUs will run most efficiently if they spend most of their lives at 50% load. If it bumps up under a heavy load that is fine but high loads make heat and that will force the PSU fan on high to cool it so it will also increase noise.


Thanks for the input gtbtk. Noise doesn't concern me at all; I wear G933 headphones while playing, which cancel out 90% of room noise at full volume. It's for the occasional game session (the 11-hour LAN days are gone with kids). As long as it runs the GPUs, the rest I'm not overly concerned about. I only have 2 HDDs + an SSD in the case; the other 2 HDDs are in an enclosure connected via SATA. Thanks for the info guys!


----------



## Nawafwabs

I can't overclock my 1070.

Model: Asus 1070 O8G

Every time I try to OC, the GPU crashes and I get green colours on screen.


----------



## AngryGoldfish

Can folks do me a favour, please, and post any results they have (personal findings ideally) of an overclocked 1070 beating a stock 1080 FE? I've always seen the 1070 falling short of the 1080 by a few percent, but I have just noticed that TechPowerUp consistently reach 1080 levels with the 1070s in their reviews, and even beat it.


----------



## gtbtk

Quote:


> Originally Posted by *w-moffatt*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> With Founders edition cards/reference board cards (180W cards), 750W should be just enough. An 850W would be a more comfortable choice and run at a higher efficiency level though
> 
> You would be highly marginal with say, two Zotac amp extreme cards as they are 300W cards.
> 
> If you allocate 200W for the motherboard/cpu/ram/storage and then find out the max power draw of the graphics card, you just need to add everything together. PSUs will run most efficiently if they spend most of their lives at 50% load. If it bumps up under a heavy load that is fine but high loads make heat and that will force the PSU fan on high to cool it so it will also increase noise.
> 
> 
> 
> thanks for the input GTBTK, Noise doesnt concern me at all i wear G933 headphones while playing which cancel out 90% of room noise on full volume. Its for the occasional game session (the 11 hour lan days are gone with kids
> 
> 
> 
> 
> 
> 
> 
> ). As long as it runs the gpus the rest im not overly concerned about. I only have 2 HDDs + ssd in the case, the other 2 HDDs are in an enclosure connected via SATA. Thanks for the info guys!
Click to expand...

The single pcie power cable cards should be fine.


----------



## gtbtk

Quote:


> Originally Posted by *Nawafwabs*
> 
> I can't overclock my 1070
> 
> Model: Asus 1070 o8g
> 
> 
> every time I want to oc my gpu crashed + green color on screen


Your card is already overclocked; the reference base clock is 1506MHz. It is possible that 1633MHz is as high as your chip can cope with using the slider.

You could try using the curve to overclock. Leave the voltage at stock, increase the power and temp targets, and create a fan curve. Start by raising the 1.050V point up to 2050MHz and see how that goes. When you find the limit at that voltage point, also try raising the 0.950V point until you find its limit.

Then increase the memory. Try +500; that is around most cards' sweet spot to start. You may find that it goes above +600, but by then you will be getting lots of memory errors that the card will try to error-correct, and you will probably get random application crashes.
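
For reference, here is how an Afterburner memory offset maps to the effective data rate. This is a sketch under the assumption that the offset is applied to the double-rate clock Afterburner displays (roughly 4004MHz on a stock 1070) and that GDDR5's effective rate is twice that figure; treat the numbers as illustrative:

```python
# Illustrative only: mapping an Afterburner memory offset to the
# effective GDDR5 data rate, assuming the offset applies to the
# double-rate clock AB shows (~4004 MHz stock on a GTX 1070).

STOCK_AB_CLOCK = 4004  # MHz as displayed by Afterburner (assumption)

def effective_memory_mhz(offset_mhz, base=STOCK_AB_CLOCK):
    """Effective GDDR5 data rate = 2x the AB-displayed clock."""
    return 2 * (base + offset_mhz)

print(effective_memory_mhz(0))     # 8008 -> the stock "8 GHz effective"
print(effective_memory_mhz(500))   # 9008 -> the +500 sweet spot above
```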


----------



## gtbtk

Quote:


> Originally Posted by *AngryGoldfish*
> 
> Can folks do me a favour, please, and post any results they have (personal findings ideally) of an overclocked 1070 beating a stock 1080 FE? I've always seen the 1070 falling short of the 1080 by a few percent, but I have just noticed that TechPowerUp can consistently reach 1080 levels with their 1070's in their reviews, and even beat it.


It is possible that their 1080 was crap. The 1080 result is always the same, so it is likely the first card that they tested. It also takes a while to work out the best approach to overclocking a new architecture, and driver updates could have helped the 1070s as well.


----------



## pez

Quote:


> Originally Posted by *ezveedub*
> 
> I would say you're at limit with almost no head room if power consumption goes high or spikes. PSU would have to be a very top of line unit if it does. Nvidia has one GTX1070 system rated for minimum 500watts,
> 
> http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-1070/specifications


Quote:


> Originally Posted by *Mad Pistol*
> 
> I'm telling you, it's not a problem. I have 3 Hard Drives, an M.2 SSD, a 280mm AIO watercooler, 16GB 1866mhz ram, and I can overclock everything as far as it will go, and the system is rock solid. In fact, I was able to do the same thing with SLI GTX 780's, which used a lot more power than SLI 1070's.
> 
> 750-watts is plenty for an SLI GTX 1070 setup. His current PSU is fine.
> Your current PSU is fine. You have nothing to worry about.


As Mad Pistol said, 750W is plenty. My system is pushing <400W at full tilt; you're well within your means, with room to OC.


----------



## AngryGoldfish

Quote:


> Originally Posted by *gtbtk*
> 
> it is possible that their 1080 was crap. The 1080 result is always the same so likely to be the first card that they tested. It takes a while to work out the best approach to improving an architectures overclock as well. Updates in drivers could also have helped the 1070s


Yeah, I figured this might be the case. The problem is, TPU don't benchmark Battlefield 3 as part of their usual suite. They only benchmark it for the overclocks. That means that any new 1080 reviews cannot be extrapolated on. So, say for instance the new revised EVGA 1080 FTW2 is reviewed, if it hit 145 FPS in BF3, we could extrapolate on that by deducting a few FPS to factor in the non-reference PCB overclocks over the Founders Edition. But since TPU don't use BF3 in their benchmarking suite, I have no way of knowing an up-to-date benchmark result for the stock 1080 FE with its newer drivers.

Thanks all the same.


----------



## Halseluk

Quote:


> Originally Posted by *gtbtk*
> 
> You could try adjusting the SA.
> 
> Does your UEFI look like this? the control is directly under the VCCSA voltage (the igpu section may not appear if you have it disabled) You do not need to touch the pch voltage.
> 
> 
> 
> With regards AB and voltage control, if the card runs at 1.081 or 1.093v when the slider is at 100% it is working. If it stays at 1.050 or 1.063 it isn't so you will need to go through the procedure to make it work. You can check in AB by setting the OSD to display the voltage on screen. AB knows how to talk to the controller, but it sounds like it may not know the controller is there as this model card may not be referenced in its database. There is further pdf doc files in the install directory that you can read in addition to the forum posts.
> 
> 20100 sounds more like what I would expect you to get and is about what I got when I first got my card. With tuning you may find you can get it to 21000 of more.
> 
> How did you go playing with the curve?




This photo isn't mine, but my mobo is like this.

Voltage control works in AB, the card can run at 1.081 and 1.093V. It is just that whatever Xtreme Tuner does, the only thing I see is the power draw increase in hwinfo and some points more in graphics score.

Playing with the curve, I'm getting 19900 graphics score, core 2000 ~ 2012 MHz / 9000 MHz. Seems pretty stable.


----------



## gtbtk

Quote:


> Originally Posted by *AngryGoldfish*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> it is possible that their 1080 was crap. The 1080 result is always the same so likely to be the first card that they tested. It takes a while to work out the best approach to improving an architectures overclock as well. Updates in drivers could also have helped the 1070s
> 
> 
> 
> Yeah, I figured this might be the case. The problem is, TPU don't benchmark Battlefield 3 as part of their usual suite. They only benchmark it for the overclocks. That means that any new 1080 reviews cannot be extrapolated on. So, say for instance the new revised EVGA 1080 FTW2 is reviewed, if it hit 145 FPS in BF3, we could extrapolate on that by deducting a few FPS to factor in the non-reference PCB overclocks over the Founders Edition. But since TPU don't use BF3 in their benchmarking suite, I have no way of knowing an up-to-date benchmark result for the stock 1080 FE with its newer drivers.
> 
> Thanks all the same.
Click to expand...

Since the launch of the 1070 I have been paying more attention to the media than I used to, and I have discovered that all of them are pretty poor. Most gloss over things that the vendor may not want mentioned and, at times, play dumb like a fox. Very few appear to have a deep understanding of what they would have you believe they are experts in, and almost none of them these days have an inquiring mind.

Battlefield 3 I can't help you with. This is a handy place for real comparisons on a level playing field; use the advanced tab and you can compare a range of results in just about any combination: http://www.3dmark.com/search


----------



## gtbtk

Quote:


> Originally Posted by *Halseluk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You could try adjusting the SA.
> 
> Does your UEFI look like this? the control is directly under the VCCSA voltage (the igpu section may not appear if you have it disabled) You do not need to touch the pch voltage.
> 
> 
> 
> With regards AB and voltage control, if the card runs at 1.081 or 1.093v when the slider is at 100% it is working. If it stays at 1.050 or 1.063 it isn't so you will need to go through the procedure to make it work. You can check in AB by setting the OSD to display the voltage on screen. AB knows how to talk to the controller, but it sounds like it may not know the controller is there as this model card may not be referenced in its database. There is further pdf doc files in the install directory that you can read in addition to the forum posts.
> 
> 20100 sounds more like what I would expect you to get and is about what I got when I first got my card. With tuning you may find you can get it to 21000 of more.
> 
> How did you go playing with the curve?
> 
> 
> 
> 
> 
> This photo isn't mine, but my mobo is like this.
> 
> Voltage control works in AB, the card can run at 1.081 and 1.093V. It is just that whatever Xtreme Tuner does, the only thing I see is the power draw increase in hwinfo and some points more in graphics score.
> 
> Playing with the curve, I'm getting 19900 graphics score, core 2000 ~ 2012 MHz / 9000 MHz. Seems pretty stable.
Click to expand...

The page scrolls down further, below Spread Spectrum; also look in the Advanced tab.


----------



## tpn555

Hi there, I'm a Palit Dual 1070 owner. I managed to OC it successfully with tips from this thread, but now I have a problem with nvflash. I flashed my card with the ASUS Strix OC BIOS recommended here and it ran flawlessly, but I wanted more, and here I am with a bricked (?) card. nvflash with the -6 flag doesn't want to flash the card any more; it gives "ERROR: PCI Block chain corrupted - PCI block search failed". The card won't boot, and I've had no success flashing any BIOS. Do you guys know how to overcome this error?


----------



## Minium

Quote:


> Originally Posted by *tpn555*
> 
> Hi there, I'm a Palit Dual 1070 owner that I managed to successfully oc with tips from this thread, but i have problem now with nvflash. I flashed my card with recommended here ASUS strix OC bios and it run flawlessly, but i wanted more, and here i am with bricked (?) card. Nvflash with -6 flag doesnt want to flash the card any more, it gives ERROR: PCI Block chain corrupted - PCI block search failed. Card won't boot, no success with flashing any bios. Do you guys know how to overcome this error?


Non-booting cards are normally not a problem, but since it says "PCI block search failed" it looks like your card isn't recognized any more, which means flashing it with a PC is impossible; nvflash won't "find" the card. I've never had that error, but from what you told us (an ASUS BIOS on a Palit card + more mods), it looks like it's bricked...


----------



## tpn555

**** happened when I tried to flash the MSI Gaming Z BIOS. Well, GPU-Z and nvflash still see the card, so I still have hope.
It is recognized as MSI.


----------



## Minium

Quote:


> Originally Posted by *tpn555*
> 
> **** happened when i tried to flash MSI gaming z bios, well GPU-Z and nvflash still see the card so i still have hopes.
> It is recognized as MSI.


Does Windows Device Manager see the card?


----------



## tpn555

Yes it does.


----------



## Minium

Quote:


> Originally Posted by *tpn555*
> 
> Yes it does.


Maybe you can get it flashed with like 10 different nvflash override commands, but I don't know how ^^


----------



## xGeNeSisx

Really awful about the bricked card, dude. Hopefully it can be forced to flash; I'm not very experienced with nvflash, so I'm not much help :/

Still running the ASUS Strix OC BIOS on my Gigabyte G1 flawlessly. Out of curiosity, have any new BIOS releases permitted higher power limits? As far as I am aware, the Strix OC BIOS is the best BIOS for a high base clock and a 120% power limit on my card.

Thanks


----------



## TerafloppinDatP

Deleted. Realized my question wasn't OC related in any way and got my answer in the EVGA website forums.


----------



## gtbtk

Quote:


> Originally Posted by *tpn555*
> 
> Hi there, I'm a Palit Dual 1070 owner that I managed to successfully oc with tips from this thread, but i have problem now with nvflash. I flashed my card with recommended here ASUS strix OC bios and it run flawlessly, but i wanted more, and here i am with bricked (?) card. Nvflash with -6 flag doesnt want to flash the card any more, it gives ERROR: PCI Block chain corrupted - PCI block search failed. Card won't boot, no success with flashing any bios. Do you guys know how to overcome this error?


I would try to flash your original Palit backup if you made one. If you didn't, try getting it from TechPowerUp. It is likely the Gaming Z BIOS file that you have is corrupted.


----------



## gtbtk

Quote:


> Originally Posted by *xGeNeSisx*
> 
> Really awful about the bricked card dude
> 
> 
> 
> 
> 
> 
> 
> Hopefully it can be forced to flash, not very experienced with nvflash so I'm not of much help :/
> 
> Still running ASUS strix OC bios on my Gigabyte G1 flawlessly. Have any new bios releases permitted higher power limits out of curiosity? As far as I am aware the strix oc bios is the best bios for high base clock and a 120% power limit for my card
> 
> Thanks


On my Gaming X, after testing just about all of them, the best non-MSI BIOS I found was the ASUS OC BIOS. The Gaming Z BIOS, which I ended up settling on and which was written for my PCB in the first place, is only slightly ahead; the difference was so small I wouldn't notice it unless I was running benchmarks.

The hyper-clocked BIOSes from Palit and Gigabyte would not overclock at all on my card and performed worse.


----------



## tpn555

I tried to flash about 10 different BIOSes, including the original one. No go, unfortunately; it's the same nvflash error every time.


----------



## Minium

Quote:


> Originally Posted by *tpn555*
> 
> I tried to flash about 10 different bioses, including the original one, no go unfortunately. Its the same nvflash error every time.


Try other nvflash versions. If it still doesn't work, I guess it's bricked.


----------



## Blackfirehawk

Currently I have a GTX 1070 from Gainward with a Palit GameRock BIOS.
1x 8-pin power, 195W max.

Can I flash an MSI Gaming Z BIOS to it (1x 6-pin + 1x 8-pin, 230W)?


----------



## Vitto

Quote:


> Originally Posted by *gtbtk*
> 
> 1070 performance is adequate for most things, even if it requires you to reduce some settings from ultra to high on some titles, it is still a good experience.
> 
> I would wait 6 months and look at the gtx2070 that will perform about the same as a 1080ti does now or a new 2080 that should beat it if nvidia stay true to form.


Or 2080ti


----------



## Madmaxneo

Quote:


> Originally Posted by *Vitto*
> 
> Or 2080ti


More than likely they will go with the 1170/80 or the 1270/80 series.....


----------



## gtbtk

Quote:


> Originally Posted by *Vitto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> 1070 performance is adequate for most things, even if it requires you to reduce some settings from ultra to high on some titles, it is still a good experience.
> 
> I would wait 6 months and look at the gtx2070 that will perform about the same as a 1080ti does now or a new 2080 that should beat it if nvidia stay true to form.
> 
> 
> 
> Or 2080ti
Click to expand...

That is probably a year away though


----------



## gtbtk

Quote:


> Originally Posted by *Blackfirehawk*
> 
> corrently i have a GTX 1070 from Gainwand with a Palit Gamerock Bios..
> 1x8 Pin Power supply.. 195w max
> 
> can i Flash a MSI Gaming Z bios to it ? (1x6 Pin + 1x8 Pin ) 230w


Gainward and Palit are the same company, and the cards are the same with different plastic shrouds. They do have 225W BIOSes for a couple of models that use the same PCB you have; I think it is the GameRock Premium or the Phoenix GLH BIOS, from memory.

Yes, you can flash the Gaming Z BIOS and it will work, but it will draw up to 230-250W, and the MSI card has a VRM that spreads the heat out over a wider area, so you need to be careful of power draw. I would suggest that you install the 225W Gainward/Palit BIOS first, as you won't see much if any difference with the MSI BIOS, and the 225W BIOS was designed with your power supply and VRM in mind.
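
The connector arithmetic behind this caution is easy to sketch. Per the PCIe spec, the slot supplies up to 75W, a 6-pin up to 75W, and an 8-pin up to 150W; real cards can spike past these briefly, so treat the sums as nominal guidelines rather than hard limits:

```python
# Nominal PCIe power budgets: the slot itself plus each auxiliary
# connector (spec figures: slot 75 W, 6-pin 75 W, 8-pin 150 W).
# Real cards can pull transient spikes above these, so the sums
# are guidelines only.

SLOT_W = 75
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def board_power_budget(connectors):
    """Sum the nominal power budget for a card with the given aux connectors."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(board_power_budget(["8-pin"]))           # 225 -> the 1x 8-pin Gainward/Palit layout
print(board_power_budget(["6-pin", "8-pin"]))  # 300 -> the Gaming Z's 6+8-pin layout
```

This is why a 230-250W BIOS sits comfortably inside a 6+8-pin board's 300W headroom but pushes a single-8-pin board past its nominal 225W.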


----------



## gtbtk

Quote:


> Originally Posted by *tpn555*
> 
> I tried to flash about 10 different bioses, including the original one, no go unfortunately. Its the same nvflash error every time.


Where did you get the BIOSes that you flashed?

The messages in the screenshot indicate that a section of the InfoROM holding license/serial-number information appears to have been corrupted, and the card is not registering its serial numbers with the Nvidia software. As a result, the drivers are refusing to load.

nvflash has a number of options built in to do things other than just flash BIOS files, and one I noticed may possibly help you out.

I have never had to use it, so I don't know whether it will actually do the trick. The only reference I can find is talking about some sort of tablet, so maybe you flashed a wrong BIOS along the way?

If you type nvflash /? you get a list, and it includes "Restore InfoROM from embedded IB: nvflash [options] --recoverinforom".

The message suggests that the OBD data is stored in the InfoROM, so I would try it without any options first up.


----------



## tpn555

The MSI Gaming Z BIOS I obtained is from the TechPowerUp database; there was no mistake with that. I tried to find an nvflash error guide on the internet, but with no luck. This thing you found is a nice lead; I'm gonna try it right away.

EDIT:


----------



## zipper17

Quote:


> Originally Posted by *Halseluk*
> 
> Hello, guys!
> 
> I have now a 1070 HOF, and I'm having some trouble overclocking it. I couldn't get 2050 MHz stable yet. Micron memory, latest bios installed.
> 
> Beside that, it boosts to 1949 ~1962 at stock, and it gets 18700 graphics score in Firestrike. I'm sure when I had a 1070 FTW it could score 19000 with same clocks. Is it a big deal?


The Galax HOF should be able to overclock better because it has better-quality VRM/MOSFET phases and better overall cooling, and it should be able to get more than an 18700 graphics score. Try to reach a Firestrike graphics score of about 20k, then 21k; you're lucky if you can reach 22k.

Put voltage at max, power target at max, and fan speed at 100% (during the bench).
In general for a Pascal card, a first try is to bump the core clock about +100 and the memory +500 or so.
Additionally, on Pascal cards you can also overclock the core clock via the custom curve (MSI Afterburner, Ctrl+F).

If your card crashes, lower the clock by 13MHz or so.
If you get artifacts, in my experience that is usually a symptom of the memory; lower your memory clock by 25-50MHz (trial and error).
CMIIW on this part.

Run the 3DMark benchmarks and stress tests (Firestrike, Firestrike Extreme, Time Spy).
During a stress test it is recommended to monitor the whole run to the end;
make sure there is no crash, artifacting, or anything else during benching or gaming. (Note: even the smallest artifact is not good.)
Keep adjusting your core/memory until you find your best stable overclock.

Other ways to improve your overclock even further are a voltage mod (expert), a shunt-mod resistor (liquid metal applied), or a custom water loop/watercooling.
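
The "lower by 13MHz" tip reflects the roughly 13MHz steps that Pascal's boost clock moves in, so there is no point in retesting between bins. A hypothetical helper (names are made up) that lists the next candidate clocks when a given clock fails:

```python
# Pascal boost clocks move in roughly 13 MHz bins, which is why the
# advice above says to back off a failing clock ~13 MHz at a time.
# Hypothetical helper: list the candidate clocks to retest, walking
# down one bin per failed run.

BIN_MHZ = 13  # approximate Pascal clock granularity

def backoff_candidates(failing_mhz, steps=4, bin_mhz=BIN_MHZ):
    """Return the clocks to try next, one bin lower per step."""
    return [failing_mhz - bin_mhz * i for i in range(1, steps + 1)]

print(backoff_candidates(2114))  # [2101, 2088, 2075, 2062]
```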


----------



## lanofsong

Hey GTX 1070 owners,

We are having our monthly Foldathon from Monday 22nd - Wednesday 24th - 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

May 2017 Foldathon

To get started:

1.Get a passkey (allows for speed bonus) - need a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2.Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## Halseluk

Quote:


> Originally Posted by *zipper17*
> 
> Galax HOF should be able to overclock better because they have better quality of VRM/mosfet phase & overall cooling. And it should be able to get more than 18700 graphic scores. Try to reach Firestrike Graphic Scores about 20k, then 21k, lucky if you can reach 22k graphic scores.
> 
> Just put voltage max, power target max, fan speed at 100% (during bench).
> In general for pascal card, first try is bump Coreclock About +100, Memory +500 or so.
> Additionally on pascal card you can also overclock the Coreclock via custom curves (MSI afterburner CTRL+F) .
> 
> if your card crashes, lower the clock per 13mhz or so.
> if artifacts In my experience usually it is a symptoms from memory, lower your memory clock per 25-50mhz (trial & error).
> cmiiw in this part.
> 
> Do 3dmark benchmark & 3dmark Stress test (Firestrike,Firestrike Extreme, Timespy),
> During Stress test is recommended to monitoring the whole Stress test process till the end,
> make sure until there is no crash, artifacts or whatsoever during benching or gaming. (Note: even smallest artifacts is not good.)
> Keep adjust your core/memory until you find the best of your stable overclock.
> 
> The other way to improve your overclock even more are; use voltage Mod (expert), Shunt mod resistor(liquid metal apply), custom waterloop/watercooling.


The HOF has a really great PCB, but it doesn't make any difference if you're limited by the GPU. I have seen a simple EVGA 1070 SC smoking my card in OC, reaching 2114MHz stable.

I have already tried everything you said, including flashing the HOF Limited Edition BIOS. Nothing works; my card crashes if I go beyond +50MHz on the core. I even used Xtreme Tuner Plus for overvolting.

Now I've optimized a curve in Afterburner: 2000 ~ 2025 core, 9000MHz memory. That's as far as my card can go. As for a voltage mod, shunt mod, or watercooling, I don't think it's worth the trouble. My only hope would be a BIOS editor.


----------



## taowulf

New card just arrived, Zotac 1070 AMP!Extreme.

Now I just have to wait another six and a half hours till I can install it. Stupid work.


----------



## gtbtk

Quote:


> Originally Posted by *tpn555*
> 
> The MSI gaming z bios I obtained is from techpowerup database, there was no mistake with that. I tried to find any nvflash error guide on the internet but with no luck. This thing you found in a nice lead, I'm gonna try it right away.
> 
> EDIT:
> 
> Seems like another obstacle on the way, nvflash app crashes as a whole while trying to run recover command. Anyways it says that inforom for my chip is not present so i cant backup/read it. The idea is to get inforom file from someone else and then try to flash it. gtbtk would you be so kind and send me yours? You are running now gaming Z bios right?


I would if I could.

My rig unfortunately self-destructed, so my 1070 is sitting on a table with nothing to plug it into, and I am currently on a MacBook Pro with an iGPU running Windows.

Have you looked in the event viewer to see if the nvidia software is throwing event errors?

The original message said that it could not register the OBD serial number with the nvidia service. It may be crashing because the service is locking the InfoROM as it keeps trying to read the serial number. The first thing I would suggest is stopping all the Nvidia services running under Windows and trying it again that way.

If that doesn't do it, the next thing is to try doing it in windows safe mode and see how that goes.

If that is still a problem: I believe nvflash (maybe not the 64-bit version) can also be run under DOS. At least it used to; not sure now. You may need to boot from a DOS USB and try it that way.


----------



## gtbtk

Quote:


> Originally Posted by *Halseluk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *zipper17*
> 
> The Galax HOF should overclock better because it has better-quality VRM/MOSFET phases and overall cooling, and it should manage a graphics score above 18,700. Try to reach a Fire Strike graphics score of about 20k, then 21k; you're lucky if you can reach 22k.
> 
> Just set voltage to max and power target to max, with fan speed at 100% (during the bench).
> In general on Pascal cards, the first try is to bump the core clock about +100 and memory +500 or so.
> On Pascal you can also overclock the core via a custom curve (MSI Afterburner, Ctrl+F).
> 
> If your card crashes, lower the core clock by about 13 MHz at a time.
> Artifacts, in my experience, are usually a symptom of the memory; lower your memory clock by 25-50 MHz at a time (trial and error).
> CMIIW on this part.
> 
> Run the 3DMark benchmarks and stress tests (Fire Strike, Fire Strike Extreme, Time Spy).
> During a stress test it is recommended to monitor the whole run to the end,
> and make sure there are no crashes, artifacts, or anything of the sort while benching or gaming. (Note: even the smallest artifact is a bad sign.)
> Keep adjusting your core/memory until you find your best stable overclock.
> 
> Other ways to push your overclock even further are a voltage mod (expert), a shunt-resistor mod (applying liquid metal), or a custom water-cooling loop.
> 
> 
> 
> The HOF has a really great PCB, but it doesn't make any difference if you're limited by the GPU. I have seen a simple EVGA 1070 SC beat my card at overclocking, reaching 2114 MHz stable.
> 
> I have already tried everything you said, including flashing the HOF Limited Edition BIOS. Nothing works. My card crashes if I go beyond +50 MHz on the core. I even used Xtreme Tuner Plus for overvolting.
> 
> Now I've optimized a curve in Afterburner: 2000-2025 core, 9000 MHz memory. That's as far as my card can go. As for a voltage mod, shunt mod, or watercooling, I don't think it's worth the trouble. My only hope would be a BIOS editor.
Click to expand...

Be really careful flashing non-HOF BIOSes to HOF cards. Within the same family you should be fine, but the HOF uses a different voltage controller (International Rectifier instead of ON Semi) from all the other Pascal cards, and a flash from a different brand will brick the card.

As temps increase, GPU Boost will increase the voltage if it can, or reduce the clock if it can't. Make sure you use a custom fan curve that is more aggressive than stock to manage temps, and keep them as low as you can within your own noise limits.
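The trial-and-error procedure quoted above (back the core off roughly 13 MHz per crash, the memory 25-50 MHz per artifacting run) amounts to a simple loop. A minimal sketch, where `run_stress_test` is a hypothetical stand-in for launching a real benchmark (Fire Strike etc.) and judging crashes or artifacts by eye:

```python
# Sketch of the back-off loop from the post above. The step sizes match
# the rule of thumb quoted there: ~13 MHz core bins, 25 MHz memory steps.

CORE_STEP = 13   # MHz per step for the core clock
MEM_STEP = 25    # MHz per step for the memory clock

def find_stable_offsets(run_stress_test, core=100, mem=500):
    """Lower core on crashes, memory on artifacts, until a clean run.

    run_stress_test(core, mem) must return 'ok', 'crash', or 'artifacts'.
    """
    while core > 0 or mem > 0:
        result = run_stress_test(core, mem)
        if result == "ok":
            return core, mem
        if result == "crash":
            core = max(0, core - CORE_STEP)      # crashes point at the core
        else:
            mem = max(0, mem - MEM_STEP)         # artifacts point at memory
    return core, mem
```

In practice you are the stress-test oracle: run the bench, watch the whole run, and only call a setting "ok" after repeated clean passes.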


----------



## ravihpa

Bought *Zotac GTX1070 AMP! Extreme* last week in Amazon sale. Just got it this Sunday.



It's an old case, NZXT Gamma. Had to cut half of the HDD cage to make room for the card







Will upgrade to a decent cabinet later down the road









I don't have an air conditioner, and the idle temps hover around 48 degrees. Need to set up a good fan curve









Good to be a part of this community


----------



## kliklakloe

I was wondering if someone could help me with this. Yesterday I was installing an EK waterblock on my GTX 1070 reference PCB, but when I wanted to install the backplate I discovered that all the standoffs were loose! I think I overtightened them, so I cannot get my waterblock off the PCB anymore. I still wanted to install the backplate, so I used small pliers to unscrew the screws meant for the backplate. In the process I accidentally damaged the PCB: a component has come off. I think it is a conductor; it's rather small and seems insignificant. Is it okay to run the GPU without this component, or should I get it repaired first?

I have a picture to show where im talking about


----------



## pez

Someone did something similar to a TXP recently. I'm personally not sure if it's safe to try to run the card, but I must stress that it's better to buy the proper tools for the job than to damage a GPU worth a few hundred dollars.


----------



## kliklakloe

Can you tell me where I can find the post about this damaged Titan XP? I know it's a stupid mistake; I have been watercooling and whatnot for years and never made a mistake like this.


----------



## pez

Quote:


> Originally Posted by *kliklakloe*
> 
> Can you tell me where i can find the post about this damaged titan xp? I know its a stupid mistake, i have been watercooling and whatnot for years and never made a mistake like this.


It's buried in the thread, but I believe it was a different piece and it needed to be soldered back on.


----------



## Halseluk

Quote:


> Originally Posted by *gtbtk*
> 
> Be really careful flashing non HOF bioses to HOF cards. Same family should be fine but it uses a different voltage controller (International rectifier instead of Onsemi) to all the other pascal cards and a flash from a different brand will brick the card.
> 
> As temps increase, GPUBoost will increase the voltage if it can or reduce the clock if it cant. Make sure you use a custom fan curve that is more aggressive than stock to manage temps and keep them as low as you can within you own noise limits


It's OK, I know I can't cross-flash my HOF. I just flashed the 1070 HOF Limited Edition BIOS, did some testing, and flashed back the original. I don't need a better curve because the card runs really cool: below 67°C with 50% fan speed and low noise.


----------



## Sn4k3

May I join in?


----------



## gtbtk

Quote:


> Originally Posted by *ravihpa*
> 
> Bought *Zotac GTX1070 AMP! Extreme* last week in Amazon sale. Just got it this Sunday.
> 
> 
> 
> It's an old case, NZXT Gamma. Had to cut half of the HDD cage to make room for the card
> 
> 
> 
> 
> 
> 
> 
> Will upgrade to a decent cabinet later down the road
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't have an air conditioner, and the idle temps hover around 48 degrees. Need to set up a good fan curve
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Good to be a part of this community


Welcome. This is a good community, with none of the rudeness and condescension you see in some of the other forums here.

At the stock fan setting (0% fan until the card hits 60 deg), my MSI Gaming X idles at 59 deg. I run a custom curve and the card rarely hits 60 under full load.

Idling at 48 is not terrible, but it means that when you put the card under load it starts closer to the max load temp, so the clocks drop sooner.


----------



## gtbtk

Quote:


> Originally Posted by *kliklakloe*
> 
> I was wondering if someone could help me with this, Yesterday i was installing an EK waterblock on my gtx 1070 reference pcb. but when i wanted to install the backplate i discovered that all the standoffs were loose! i think i overtightened them. So i cannot get my waterblock from the pcb anymore. I still wanted to install the backplate so i used a small pliers to unscrew the screws that had to be used by the backplate. in this process i accidentaly damaged the pcb, a component has come off i think it is a conductor. its rather small and seems insignificant. is it ok to run the GPU without this component? or should i get it repaired first?
> 
> I have a picture to show where im talking about


That looks like a shunt resistor. The card uses shunt resistors for current sensing in its voltage regulation, but that one is not one you would want to touch even if you were overclocking. I think it belongs to the low-voltage VRM that feeds the memory and doesn't need to be touched.

If you can find a shop that does board-level PCB repairs, they may be able to reattach the resistor for you. I would not try running the card without the component, unless you like the magic blue smoke and the smell of burning electronics.


----------



## taowulf

Quote:


> Originally Posted by *gtbtk*
> 
> Welcome. This is a good community, none of the rudeness and condescension that you see in some of the other forums here
> 
> At stock fan setting (0 fan until the card hits 60 deg), my MSI Gaming X idles at 59deg. I run a custom curve and the card rarely hits 60 under full load.
> 
> Idle at 48 is not terrible but it means when you put the card under load, it is closer to the max load temp so the clocks slow down faster.


This is why I love watercooling. At 93% load (running [email protected] right now), I am at 47C. Idle is at 39C.


----------



## Inelastic

Quote:


> Originally Posted by *Sn4k3*
> 
> May I join in?
> 
> 
> Spoiler: Warning: Spoiler!


Welcome to the club







Quick question, are the fans on your cpu cooler suppose to be pointing towards each other like that?


----------



## muzammil84

Quote:


> Originally Posted by *taowulf*
> 
> This is why I love watercooling. At 93% load (running [email protected] right now), I am at 47C. Idle is at 39C.


Those are high temps! Your loop must not be very efficient; around 40°C for the GPU after a couple of hours of gaming is what you should be getting on a custom loop.


----------



## ravihpa

Quote:


> Originally Posted by *gtbtk*
> 
> Welcome. This is a good community, none of the rudeness and condescension that you see in some of the other forums here
> 
> At stock fan setting (0 fan until the card hits 60 deg), my MSI Gaming X idles at 59deg. I run a custom curve and the card rarely hits 60 under full load.
> 
> Idle at 48 is not terrible but it means when you put the card under load, it is closer to the max load temp so the clocks slow down faster.


Thank you. Wow, 60 deg at full load. That's great. Can you share your fan curve with me, please









As of right now, this is what I have set up...


FYI, I am a noob at this and this is the first time I've ever set up a fan curve







I actually watched a few YouTube videos, and this is my first setup. With this fan curve, my card reaches 70 deg under full load over 4 to 5 hours of gaming. It has NEVER gone above 70 deg.

Is that good? Would love to hear your thoughts on this. Any and all suggestions and recommendations are welcome









Thanx in advance









PS: Ambient city temperature (on google) shows 36 degrees


----------



## Bee Dee 3 Dee

The fan curve I've liked for the past 10 months:


----------



## taowulf

Quote:


> Originally Posted by *muzammil84*
> 
> These are high temps! Your loop must be not very efficient, 40°C for gpu after couple of hrs of gaming is what you should be getting on custom loop.


It was hot in here


----------



## 303869

Quote:


> Originally Posted by *Inelastic*
> 
> Welcome to the club
> 
> 
> 
> 
> 
> 
> 
> Quick question, are the fans on your cpu cooler suppose to be pointing towards each other like that?


Haha, yeah, what's going on there?


----------



## gtbtk

Quote:


> Originally Posted by *taowulf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Welcome. This is a good community, none of the rudeness and condescension that you see in some of the other forums here
> 
> At stock fan setting (0 fan until the card hits 60 deg), my MSI Gaming X idles at 59deg. I run a custom curve and the card rarely hits 60 under full load.
> 
> Idle at 48 is not terrible but it means when you put the card under load, it is closer to the max load temp so the clocks slow down faster.
> 
> 
> 
> This is why I love watercooling. At 93% load (running [email protected] right now), I am at 47C. Idle is at 39C.
Click to expand...

That's all great while the VRMs are within their limits. With the shunt removed, nothing restricts the VRM's output to the chip, and it can blow it up. Doesn't matter if it is under water or not.


----------



## pez

Quote:


> Originally Posted by *Inelastic*
> 
> Welcome to the club
> 
> 
> 
> 
> 
> 
> 
> Quick question, are the fans on your cpu cooler suppose to be pointing towards each other like that?


Good catch on that lol. Talk about some positive pressure


----------



## taowulf

Quote:


> Originally Posted by *gtbtk*
> 
> That's all great when the VRMs are within their voltage limits. The shunt removal, will not restrict the vrms voltage to the chip and plow it up. doesn't matter if it is under water or not


I think you got mixed up, I am not the one that knocked the shunt off of the card.


----------



## kliklakloe

Quote:


> Originally Posted by *gtbtk*
> 
> That looks like a shunt resistor. The card uses shunt resistors to manage for its voltage regulation but that one is not one that you want to touch if you are overclocking. I think that is the low voltage VRM that supports the memory and doesn't need to be touched.
> 
> If you can find a shop that can do PCB board level repairs they may be able to reattach the resistor for you. I would not try running the card without the component unless you like the magic blue smoke and smell of burning electronics


Thanks for your reply, but that is NOT a shunt resistor. I know about shunts and how to mod them. These little blocks, like the one I killed, are scattered all over PCBs; I just don't know how important they are. Waiting for a reply from an EVGA specialist.


----------



## kliklakloe

Quote:


> Originally Posted by *pez*
> 
> It's buried in the thread, but I believe it was a different piece and it needed to be soldered back on.


Found a guy on Reddit who has done this to his TXP; not the exact same component of course, but it looks the same. He says his Titan runs like a dream with the waterblock on.


----------



## Sn4k3

Quote:


> Originally Posted by *Inelastic*
> 
> Welcome to the club
> 
> 
> 
> 
> 
> 
> 
> Quick question, are the fans on your cpu cooler suppose to be pointing towards each other like that?


Thank you! They are in a push/pull configuration.


----------



## zipper17

Just FYI, don't hesitate to push your fan speed higher: the fans don't cool only the GPU core; the VRM and memory components are also cooled by them.


----------



## ravihpa

Quote:


> Originally Posted by *Bee Dee 3 Dee*
> 
> i've liked for past 10 months:


Thanx for sharing. Does that mean your fans are ALWAYS running and never idle? Is that good? Over here, my idle temp with 0% fans is 48 degrees. When I implemented your curve, my idle temps dropped down to 43 deg, which means my fans are still running and will always keep running at 15% to 20% because my idle temps will NEVER go below 40 degrees for the fans to stop spinning.

Another thing is, this PC is also my workstation, as in I work on this PC for 6 to 8 hours every day. During these hours, of course the card idles at 48 degrees (0% fans), but with your curve, they're running at 15% to 20% ALL THE TIME. I want to know, is that healthy? Do I keep them running all the time and maintain idle temps at 43 deg? Or revert back to 48 deg and 0% fans spinning?

What would you guys suggest?

Thanx a lot in advance


----------



## Inelastic

Quote:


> Originally Posted by *Sn4k3*
> 
> Thank you!. They are in push/pull configuration


A push/pull configuration would mean one fan is pushing air into the heatsink while the other is pulling air out. By the looks of it (judging by the position of the red plastic rings on your SP120s), both fans are pushing air into the heatsink, fighting each other for airflow direction. You would need to flip one of the fans around (usually the fan on the left side in that pic) so the fans are facing the same direction for it to be push/pull.


----------



## Sn4k3

Quote:


> Originally Posted by *Inelastic*
> 
> A push/pull configuration would mean one fan is pushing air into the heatsink while the other is pulling air out. By the looks of it (judging by the position of the red plastic rings on your SP120s), both fans are pushing air into the heatsink, fighting each other for airflow direction. You would need to flip one of the fans around (usually the fan on the left side in that pic) so the fans are facing the same direction for it to be push/pull.


Yes I know how it works, and I'm pretty sure that's how I have it set up. Dang you, now you have made me want to check and I'm not at home.


----------



## 303869

Quote:


> Originally Posted by *Sn4k3*
> 
> Yes I know how it works, and I'm pretty sure that's how I have it set up. Dang you, now you have made me want to check and I'm not at home.


It does look like both fans are in push config due to the orientation of the red rings.


----------



## rfarmer

Quote:


> Originally Posted by *Sn4k3*
> 
> Yes I know how it works, and I'm pretty sure that's how I have it set up. Dang you, now you have made me want to check and I'm not at home.


I don't own any Corsair fans so I can't check for sure, but it looks like the rings only mount on one side; if that's true, the fans will be blowing toward each other.


----------



## Sn4k3

Quote:


> Originally Posted by *rfarmer*
> 
> I don't own any Corsair fans so can't check for sure but it looks like the rings only mount on the one side, if true they will be blowing toward each other.


Yes you guys were correct, they're both pushing air towards the cooler. I didn't change it though because I'm about to buy a new cooler anyways and besides, it looks awful that way


----------



## Madmaxneo

Quote:


> Originally Posted by *Sn4k3*
> 
> Yes you guys were correct, they're both pushing air towards the cooler. I didn't change it though because I'm about to buy a new cooler anyways and besides, it looks awful that way


Changing it should be quick and easy, plus you will see an improvement in temps....


----------



## gtbtk

Quote:


> Originally Posted by *ravihpa*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Welcome. This is a good community, none of the rudeness and condescension that you see in some of the other forums here
> 
> At stock fan setting (0 fan until the card hits 60 deg), my MSI Gaming X idles at 59deg. I run a custom curve and the card rarely hits 60 under full load.
> 
> Idle at 48 is not terrible but it means when you put the card under load, it is closer to the max load temp so the clocks slow down faster.
> 
> 
> 
> Thank you. Wow, 60 deg at full load. That's great. Can you share your fan curve with me, please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As of right now, this is what I have set up...
> 
> 
> FYI, I am a noob at this and this is the first time I've ever set up a fan curve
> 
> 
> 
> 
> 
> 
> 
> Actually saw a few YouTube videos and this is my first setup. With this fan curve, my card reaches 70 deg with full load and 4 to 5 hours of gaming. It has NEVER gone above 70 deg.
> 
> Is that good? Would love to hear your thoughts on this. Any and all suggestions and recommendations are welcome
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanx in advance
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PS: Ambient city temperature (on google) shows 36 degrees
Click to expand...

My rig is currently in pieces, but I can describe it.

From 0-30 deg I have the fans at a constant 25 percent or so; that airflow keeps the card in the low 30s at idle. From about 35 deg I start raising the curve toward 40 percent, then ramp it more steeply to 100% at 50 deg, and I can run everything at 1.050 V at around 50 deg. The noise with my AC going is not really an issue. My ambient temp is about 20-22 deg for those numbers. If your ambient is 36, you will never get the GPU below that temp, and load temps will be about 14 deg higher than mine if everything else is equal.

The thing to understand is that this is a balancing act between performance, with its associated fan noise, and a quieter system. You need to decide what noise level is acceptable to you and build your curve around that. A card at 50 deg and a card at 60 deg will perform differently, but the difference is not huge. If you can run the card in the 60s you should be fine. The best way to find your settings is to experiment.
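A curve like the one described above is just a handful of (temperature, fan %) points with linear interpolation between them, which is essentially what Afterburner does between the dots you drag on its graph. A rough sketch; the breakpoints below are my reading of the post (25% up to 30 deg, 40% at 35, 100% at 50), not exact values:

```python
# Piecewise-linear fan curve: (temp in deg C, fan speed in %) breakpoints,
# linearly interpolated between adjacent points, flat beyond the ends.

CURVE = [(0, 25), (30, 25), (35, 40), (50, 100)]  # assumed breakpoints

def fan_percent(temp, curve=CURVE):
    """Return the fan duty (%) for a given GPU temperature."""
    if temp <= curve[0][0]:
        return curve[0][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp - t0) / (t1 - t0)
    return curve[-1][1]  # past the last point: stay at max
```

For example, at 42.5 deg this curve would sit at 70%, halfway up the steep 35-to-50 ramp.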


----------



## gtbtk

Quote:


> Originally Posted by *rfarmer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sn4k3*
> 
> Yes I know how it works, and I'm pretty sure that's how I have it set up. Dang you, now you have made me want to check and I'm not at home.
> 
> 
> 
> I don't own any Corsair fans so can't check for sure but it looks like the rings only mount on the one side, if true they will be blowing toward each other.
Click to expand...

You can always tell which way a fan blows without the dress rings: look for the side with the arms that hold the motor in place; that is the side the air exits from.


----------



## muzammil84

Hi guys, I'm tempted to try a different BIOS on my Inno3D iChill X4. It's a reference PCB with quite high clocks out of the box; my BIOS version is 86.04.1E.00.77. I have heard that I need to look for a BIOS with the same first 8 digits (the last two can be different). I'm not sure how true that is, but it would mean quite a few BIOSes should work (including the Gigabyte Xtreme Gaming, which has 86.04.1E.00.AA; the KFA2 EXOC has the highest power limit at 250 W and its BIOS version looks similar too: 86.04.1E.00.8F).
Please enlighten me before I brick my GPU :]
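The "same first 8 digits" rule of thumb above amounts to comparing everything before the last dotted group of the two version strings. A quick sketch of that check; note this is only the community heuristic, not a guarantee that a flash is safe (the voltage controller still has to match):

```python
# Compare NVIDIA VBIOS version strings of the form "86.04.1E.00.77".
# Two BIOSes are considered the same "family" if everything before the
# final dotted group matches (i.e. the first 8 hex digits).

def same_bios_family(version_a, version_b):
    """True if the versions differ only in their last dotted group."""
    return version_a.split(".")[:-1] == version_b.split(".")[:-1]
```

So 86.04.1E.00.77 and 86.04.1E.00.AA would pass the check, while a BIOS starting 86.04.50 would not.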


----------



## Stupidfastwagon

Quote:


> Originally Posted by *Sn4k3*
> 
> May I join in?


Do you realize that both of your CPU fans are blowing against one another?


----------



## taowulf

Quote:


> Originally Posted by *Stupidfastwagon*
> 
> Do you realize that both of your CPU fans are blowing against one another?


Welcome to last week.


----------



## microchidism

I needed some clarification on how to go about fine tuning an OC on my 1070,

From what I have read, even if your temp is in the 50s or 60s your GPU will reduce its clock speeds as it warms up. So the problem I have is: how do you tune it so that when you first start an application your clocks don't spike too high before settling down to your "stable clock"?


----------



## rfarmer

Quote:


> Originally Posted by *taowulf*
> 
> Welcome to last week.


rofl


----------



## gtbtk

Quote:


> Originally Posted by *muzammil84*
> 
> hi guys i'm tempted to try a different bios on my inno3d iChill x4. It's a reference pcb with quite high clocks out of the box, my bios version is 86.04.1E.00.77. I have heard that i need to look for the bios with the same first 8 digits(last two can be different) - not sure how true it is, but that would mean quite a few bioses would work(including Gigabyte extreme gaming which has got 86.04.1E.00.AA, KFA2 EXOC has got highest power limit of 250W and bios looks similar too: 86.04.1E.00.8F).
> Please enlighten me before i brick my gpu :]


Make sure you back up your original BIOS before doing anything else.

The 86.04.50.00.xx BIOSes are fine as long as you take note of which voltage controller the donor BIOS is built for. EVGA, Asus, MSI, and Palit/Gainward are all fine, as they use the same voltage controller as reference.

Steer away from the KFA2 or Galax BIOSes, particularly the HOF or SNPR ones, as some of those cards use a different voltage controller and will brick your card.

Your card also has an unusual fan layout; donor BIOSes may mess with the fan control, so watch out for that too.

Palit/Gainward are the ones you should probably look at first.

Having said that, the highest power limit does not guarantee the best performance. The Asus OC BIOS is 200 W and performs about the same on my Gaming X as the MSI BIOSes do with a higher power draw.


----------



## gtbtk

Quote:


> Originally Posted by *microchidism*
> 
> I needed some clarification on how to go about fine tuning an OC on my 1070,
> 
> From what I have read, even if your temp is in the 50s or 60s your gpu will reduce its clock speeds. So the problem I have is how do you adjust so that when you first start an application your clocks don't spike too high before returning down to your "stable clock."


Use the curve.

The GPU will either raise the voltage to the next higher level if it has headroom, or reduce the clock speed by one level as temps increase in 3-degree steps if it is already at its voltage limit.

If you leave the voltage slider at 0, set the 1.050 V point to the highest frequency on the curve, with the voltage points above that level kept flat. That gives you voltage headroom up to 1.063 V before the frequency drops: a wider temp range of stable frequencies.

At +100 on the voltage slider, do the same thing but use the 1.081 V point.
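The flattening trick described above can be sketched as a small transform on the curve, treating it as a voltage-to-frequency map. The voltage points below are illustrative placeholders, not the exact set Afterburner exposes:

```python
# "Flatten the curve": pin your peak frequency at one voltage point and
# clamp every higher-voltage point to that same frequency, so GPU Boost
# can step up in voltage (as temps rise) without dropping the clock.

def flatten_curve(curve, pin_voltage, peak_mhz):
    """curve: {voltage: MHz}. Returns a copy flattened at/above pin_voltage."""
    return {v: (peak_mhz if v >= pin_voltage else mhz)
            for v, mhz in sorted(curve.items())}
```

For example, pinning 2063 MHz at the 1.050 V point leaves the lower points alone and sets every point from 1.050 V up to 2063 MHz, mirroring what you would do by dragging the dots in Afterburner's Ctrl+F window.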


----------



## muzammil84

Quote:


> Originally Posted by *gtbtk*
> 
> Make sure you back up your original bios before doing anything else.
> 
> the 86.04.50.00.xx bioses are fine as long as you take note of what voltage controller the donor bios is using. EVGA,ASUS,MSI,palit/gainward are all fine as they use the same voltage controller as reference
> 
> Steer away from the KAF or Galax bioses, particularly the HOF or SNPR ones as some of those cards use a different voltage controller and will brick your card.
> 
> Your card also has an unusual fan layout, donor bioses may mess with the fan control so watch out for that too.
> 
> Palit/gainward are the ones that you should probably take a look at first.
> 
> Having said that, highest power draw does not guarantee best performance. Asus OC bios is 200w and performs about the same on my gaming x as the msi bioses do with a higher power draw.


Great help, sir. Regarding fan profiles, I'm not worried about that at all, as I'm not using any fans to cool my GPU; it's cooled by a block. I flashed another official iChill BIOS with a higher power limit, and the draw does go higher, but it didn't do much for performance. My GPU still seems to max out at 105%, not higher. It might just be its limit, and not much can be done about it.


----------



## turrican9

Sign me up


----------



## gtbtk

Quote:


> Originally Posted by *muzammil84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Make sure you back up your original bios before doing anything else.
> 
> the 86.04.50.00.xx bioses are fine as long as you take note of what voltage controller the donor bios is using. EVGA,ASUS,MSI,palit/gainward are all fine as they use the same voltage controller as reference
> 
> Steer away from the KAF or Galax bioses, particularly the HOF or SNPR ones as some of those cards use a different voltage controller and will brick your card.
> 
> Your card also has an unusual fan layout, donor bioses may mess with the fan control so watch out for that too.
> 
> Palit/gainward are the ones that you should probably take a look at first.
> 
> Having said that, highest power draw does not guarantee best performance. Asus OC bios is 200w and performs about the same on my gaming x as the msi bioses do with a higher power draw.
> 
> 
> 
> great help sir. regarding fan profiles, I'm not worried about that at all as I'm not using any fans to cool my gpu, it's cooled by a block. I flashed other ichill official bios with higher power draw and it does go higher but didn't do much in terms of performance. my gpu still seems to go max 105%, not higher. might be just it's limit and not much can be done about it
Click to expand...

If you are under water and can keep the VRMs cool enough, a Zotac AMP Extreme BIOS will pull 300 W. The performance is not really any better, though, and it may be beyond the capabilities of a single 8-pin power connector.

The MSI Gaming BIOSes claim a max power of 291 W, but my card throttles at about 106%. With the MSI BIOSes I only see that under a 4K load; 1080p and 1440p stay below that power draw and never throttle the clocks back. EVGA, on the other hand, bounces off the 120% limit even under 1080p loads.

My favorite non-native BIOS, matching the best performance I could get with the native Gaming Z BIOS, was the Asus OC BIOS: an 8-pin card, 200 W, and a default core frequency of 1633 MHz. 1080p and 1440p have no power limit issues. With the Samsung memory you have, you can even get the reviewer version of that BIOS, which defaults to OC mode instead of gaming mode, giving a default frequency of about 1658 MHz.


----------



## microchidism

Quote:


> Originally Posted by *gtbtk*
> 
> use the curve.
> 
> the gpu will either increase the voltage to the next higher level if it has headroom or reduce the clock speed by one level as temps increase in 3 degree steps if it is already at its voltage limit
> 
> if you leave the voltage slider on 0, set the 1.050v point to the highest frequency on the curve with voltage points above that level flat. that gives you voltage headroom to 1.063v before the frequency drops - a wider temp range of stable frequencies.
> 
> at +100 voltage do the same thing but use the 1.081v point.


Thank you for your reply, it seemed to help a bit. I ended up leaving it at 2075; it runs most things at 2063, which seems to be about average for 1070s, so I'm happy with that.


----------



## gtbtk

Quote:


> Originally Posted by *microchidism*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> use the curve.
> 
> the gpu will either increase the voltage to the next higher level if it has headroom or reduce the clock speed by one level as temps increase in 3 degree steps if it is already at its voltage limit
> 
> if you leave the voltage slider on 0, set the 1.050v point to the highest frequency on the curve with voltage points above that level flat. that gives you voltage headroom to 1.063v before the frequency drops - a wider temp range of stable frequencies.
> 
> at +100 voltage do the same thing but use the 1.081v point.
> 
> 
> 
> Thank you for your reply, it seemed to help a bit, I ended up leaving it at 2075, it runs most things at 2063 which seems to be about average for the 1070s so i'm happy with that
Click to expand...

Don't get too hung up on the max boost frequency. The card actually uses all parts of the curve while it is processing frames. Now that you have found the maximum at the high end, try raising the middle or lower parts of the curve and see if that increases performance.


----------



## Sn4k3

I have a couple of questions about overclocking. How exactly does the temp target work? Do I have to keep it in mind at all, given that my card has never exceeded 60-65C at full load and the current target is 83C?
Also, I want to know how voltage is managed on this generation: will increasing my clock offset also increase the voltage? I've read that it is limited to 1.09 V, but I don't know whether it is fixed at that value at full load or changes automatically depending on clock speed (so the more I overclock, the closer to that limit it gets).


Spoiler: Warning: Spoiler!



And yes, I finally decided to fix my fans


----------



## gtbtk

Quote:


> Originally Posted by *Sn4k3*
> 
> I have a couple questions regarding overclocking. How does exactly temp target work? Do I have to keep it in mind at all given that my card has never exceeded 60-65c at full load and the current target is 83c?
> Also, I want to know how is voltage managed for this generation, will increasing my clock offset also increase voltage? I've read that it is limited to 1.09v but I don't know whether it is fixed to that value when at full load or if it changes automatically depending on clock speed (so the more I overclock the closer to that final value it'll get).
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> And yes, I finally decided to fix my fans


Temp target is the temp where the card will start thermal throttling. The default is 83 deg, but you can increase it to 93 deg. If you never go past 65, it doesn't matter.

The card operates on a voltage curve. Ctrl-F in Afterburner will open the curve window. With the voltage slider at 0, the card will operate up to 1.063v on the curve; adjusting the voltage slider to 100 lets the card operate further up the curve, to 1.093v.

Increasing the offset will increase the power draw (amperage) as load increases, because the card is operating at higher frequencies; the maximum voltage is set by the voltage slider. Without load on the card, the power draw is not that high.
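As a rough illustration of the amperage point above: dynamic power scales with both voltage and clock activity, so the same voltage ceiling draws far more power under load than at idle. A toy sketch, not NVIDIA's actual algorithm; the constant `k` and the clock/voltage/activity numbers are invented to look 1070-ish, not measured:

```python
# Toy model of load-dependent power draw on a fixed voltage curve.
# k is a fabricated fitting constant, not a real hardware parameter.

VOLTAGE_CEILING_DEFAULT = 1.063  # volts, voltage slider at 0 (per this thread)
VOLTAGE_CEILING_MAX = 1.093      # volts, voltage slider at 100

def dynamic_power_watts(voltage, clock_mhz, activity=1.0, k=0.065):
    """Approximate CMOS dynamic power: P ~ k * activity * V^2 * f."""
    return k * activity * voltage ** 2 * clock_mhz

idle_draw = dynamic_power_watts(0.800, 1500, activity=0.05)  # desktop-ish
load_draw = dynamic_power_watts(1.063, 2000, activity=1.0)   # full 3D load
```

Same peak voltage, wildly different wattage: that is why the power limit only becomes a factor once a real load is applied.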


----------



## Sn4k3

Quote:


> Originally Posted by *gtbtk*
> 
> Temp target is the temp where the card will start thermal throttling. Default is 83 deg but you can increase that to 93 deg. if you never go past 65, it doesnt matter.
> 
> The card operates on a voltage curve. CTRL-F in afterburner will open the curve window. The card will operate up to 1.063v on the curve with the voltage slider at 0 and Adjusting the voltage slider to 100 lets the card operate further up the curve to 1.093v.
> 
> Increasing the offset will increase the power draw (amperage) as the load increases with the card operating at the higher frequencies the maximum is set by the voltage setting. without the load on the card, the power draw is not that high.


Thanks, that clears things up. I will tinker with my card a bit when I get home and post back


----------



## hadis1000

Hey guys, I recently bought a msi gtx1070 Armor and overclocked it.
Now I am utterly confused.

My card is stable at 2080MHz using an offset of +170MHz and crashes beyond that.
What got me confused is that using the curve editor, my card is able to hit 2400MHz without crashing or artifacting. However, the much higher overclock achieved with the curve instead of an offset doesn't yield much better benchmark scores. I didn't try to go further than 2400MHz for fear of breaking anything.

Can anyone here make sense of this?


----------



## gtbtk

Quote:


> Originally Posted by *hadis1000*
> 
> Hey guys, I recently bought a msi gtx1070 Armor and overclocked it.
> Now I am utterly confused.
> 
> My card is stable at 2080MHz using an offset of 170MHz and crashes after that.
> What got me confused is that using this curve, my card is able to hit 2400MHz without crashing or artifacting. However, the much higher overclock achieved by using a curve instead of an offset doesn't yield much better benchmark scores. I didn't try to go further than 2400MHz in fear of breaking anything.
> 
> Can anyone here make sense of this?


Pascal actually operates at every voltage level across the curve, up to the limit set by the voltage slider.

Using the slider to overclock offsets every point on the curve by an equal amount.

Some points will overclock higher than others; in your case, the weakest voltage point on the curve will only overclock to 170MHz above stock. That is a really good result, btw.

If you are hitting 2400, I am guessing that you are only increasing the highest voltage point on the curve that is active given the voltage slider. It won't break anything; it will just crash if you go too far. You have also eliminated that point as being the weakest one.

The curve allows you to raise all the strong points above the level set by that single weak point, to avoid leaving performance on the table.

To get maximum performance from your card, I would suggest this: increase the slider to your 170MHz overclock, then go to the curve, increase the highest point and test, then increase the second highest point and test. Eventually you will find the weak point, so set the curve to increase each of the other points and leave the weak one just below the level where it crashes or artifacts.
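The per-point procedure above can be sketched as a loop. `tune_curve` and `fake_stability_test` are hypothetical names; the real stability test is "run Heaven and watch for artifacts", the per-point limits below are invented for the demo, and the 13MHz step roughly matches the size of Pascal's clock bins:

```python
# Sketch of the per-point tuning procedure: raise each curve point until
# it fails, then back off one step, starting from the highest voltage.

def tune_curve(curve, stability_test, step=13):
    """curve: dict of voltage -> clock offset (MHz), all starting at the
    globally stable offset. Returns the per-point maximum offsets."""
    for voltage in sorted(curve, reverse=True):  # highest voltage first
        while stability_test({**curve, voltage: curve[voltage] + step}):
            curve[voltage] += step
    return curve

# Pretend each point has a hidden maximum stable offset:
limits = {1.062: 180, 1.000: 210, 0.950: 240}

def fake_stability_test(candidate):
    return all(candidate[v] <= limits[v] for v in candidate)

tuned = tune_curve({v: 170 for v in limits}, fake_stability_test)
# 1.062v stays at +170 (the weak point); the other points climb past it.
```

The weak point caps the plain slider overclock at +170, while the curve lets the stronger points keep going, which is exactly the "area under the curve" gain discussed below in the thread.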


----------



## hadis1000

Ahhhh, this makes much more sense now. I'm supposed to maximize the area under my curve to increase performance.
This will be actually really exciting to play around with.

I have one more question though. How much headroom do I have above +170MHz? To test, should I increase each point by, for example, +100MHz or +10MHz?

Thank you for your help!

Edit: Should I keep the points I've verified to work at +200MHz set to +200MHz, or should I reset them back to +170MHz and test every point individually?


----------



## gtbtk

Quote:


> Originally Posted by *hadis1000*
> 
> Ahhhh, this makes much more sense now. I'm supposed to maximize the area under my curve to increase performance.
> This will be actually really exciting to play around with.
> 
> I have one more question though. How much headroom do I have above +170MHz? Should I increase each point by for example +100MHz or +10MHz to test it?
> 
> Thank you for your help!


Correct: think area under the curve, not the curve itself. Be aware that the curve is not as logical as it could be. The 0 offset does not stay fixed; the thin line is actually the 0 baseline.

Every card is different, so I can't tell you how much headroom you have. At one of the voltage points the answer will be 0; at the others you can only find out by testing. Adjust one point at a time. If you see artifacts or it crashes, you have gone too far.

The one thing not to do is have Afterburner automatically apply a profile at startup. If you set something too high and it crashes the system, it can start a crash loop.


----------



## hadis1000

Quote:


> Originally Posted by *gtbtk*
> 
> correct. think area under the curve and not the curve itself. Be aware that the curve is not as logical as it could be. 0 offset does not stay fixed, the thin line is actually the 0 baseline
> 
> Every card is different so I can't answer how much headroom you have. At one of the voltage points the answer will be 0. At others you can only find out by testing. adjust one point at a time. If you see artifacts or it crashes, then you gave gone too far.
> 
> the only thing is not to automatically start and set a profile in afterburner at startup. If you set something too high and it crashes the system it can start a loop.i


I think I got it now, thank you so much for your advice!

I'll report back with the performance increase between my offset overclock and the curve overclock on top of that!


----------



## hadis1000

Turns out that, after hours of testing, an additional overclock of about 35MHz on average on top of my +170MHz offset didn't come anywhere close to the performance of my initial 2400MHz overclock.
Now I'm confused all over again, because I'm pretty sure I'm within 5MHz of the max frequency for every voltage point.
I tried adding another 5MHz to about half of the voltage points, and all of them produced at least a few artifacts within 10 minutes.


----------



## gtbtk

Quote:


> Originally Posted by *hadis1000*
> 
> Turns out after hours of testing and an additional overclock of about 35MHz on average on top of my +170MHz offset didn't result in anywhere close to the performance of my initial 2400MHz overclock.
> Now I'm confused all over again because I'm pretty sure I'm within 5MHz of the max frequency for every voltage point.
> I tried adding another 5MHz to about half of the voltage points and all of them resulted in at least a few artifacts within 10 minutes.


Welcome to the wonderful world of Pascal overclocking. Confusion is pretty normal to start with, especially with the Afterburner curve, whose behaviour is not immediately obvious.

This took me a couple of months to work out, but you need to read the thick adjustable curve together with the thin curve. At times you can make a small adjustment and it will drop back to where the curve was before if you only look at the thick curve with the squares on it. The thin line relates to a 0 offset.

In my usage there are two points that seem to be the most useful: the point at 0.950v and the highest point of the curve allowed by the voltage slider. Get those two as high as you can and push the memory as high as it will go; that is how I got the best performance out of my card.


----------



## hadis1000

SO... Take three:

We can assume that the performance of Pascal somehow relates to the area under the adjustable curve, more specifically between Umin and Umax, and that every voltage increase has to yield a frequency increase.
It also doesn't gain an equal amount from every section between voltage points, and increasing a single voltage point to its maximum potential lowers the maximum potential of (presumably) every other voltage point. This means there's a 'budget' of unused potential which we can distribute across the curve, and our goal is to distribute it in the most beneficial way possible, not equally across every voltage point.
There is presumably at least one voltage point that contributes more to performance than the others: the point on the curve at Umax.

_*Sigh*_ Did I get it right this time?

I'd now proceed by increasing the frequency at 1050mV to 24XX (whatever is stable) and then increasing the frequency at 865mV as far as it can go. (It seems like, for some reason, the 865mV point contributes to stability when starting a 3D application on my card. Don't ask me why, but it does.)

Edit: After trying some things I decided to do some extensive testing tomorrow. Overclocking hooray...
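The "area under the curve" intuition above can be made concrete with a trapezoid-rule sum over the active voltage range. This is purely a toy metric under the stated assumption that performance tracks that area; the point values below are invented for illustration, and real GPU Boost residency weighting is not public:

```python
# Toy "area under the V/F curve" metric (trapezoid rule), illustrative only.

def curve_area(points):
    """points: list of (voltage, clock_mhz) pairs, sorted by voltage."""
    area = 0.0
    for (v0, f0), (v1, f1) in zip(points, points[1:]):
        area += (v1 - v0) * (f0 + f1) / 2.0
    return area

# A flat +offset curve vs. the same curve with the strong mid points raised:
offset_only = [(0.80, 1800), (0.95, 1975), (1.05, 2080)]
per_point   = [(0.80, 1850), (0.95, 2030), (1.05, 2080)]
```

Both curves share the same peak clock, but the per-point curve has more area, which is the model's explanation for why curve tuning can beat a plain offset at the same maximum boost.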


----------



## kliklakloe

Quote:


> Originally Posted by *gtbtk*
> 
> That looks like a shunt resistor. The card uses shunt resistors to manage for its voltage regulation but that one is not one that you want to touch if you are overclocking. I think that is the low voltage VRM that supports the memory and doesn't need to be touched.
> 
> If you can find a shop that can do PCB board level repairs they may be able to reattach the resistor for you. I would not try running the card without the component unless you like the magic blue smoke and smell of burning electronics


For those wondering, I soldered the SMD capacitor back on myself; it's not pretty, but it's back in its place. The card is running nicely under water now at 42 degrees Celsius under extensive load. Clocks are 2164MHz if it keeps under the 112% power target. If only that damn power target could be set higher, it could stay there forever.


----------



## kliklakloe

I'm aware of the shunt mod, I just don't feel like smearing liquid metal over the PCB.


----------



## kliklakloe




----------



## gtbtk

Quote:


> Originally Posted by *kliklakloe*


glad to hear you resolved the problem.

what are you running your memory oc at?


----------



## zipper17

The first *GTX 1070 to break a 23.6k Firestrike graphics score* under LN2, by @buildzoid
*Timespy GS = 7,764*


https://www.reddit.com/r/6egxbr/finally_finished_setting_up_the_gtx_1070_dual_for/
FS GS = 23,623 (he used an LOD modification, btw; it's not legal for 3DMark, but it is for HWBOT)

https://d1ebmxcfh8bf9c.cloudfront.net/u44176/image_id_1850896.png
Timespy GS = 7,764

GPU Boost 3.0 somehow makes the card slower...
2GHz manual (GPU Boost disabled) is better than 2GHz boosted...

He heavily modified the card physically, so GPU Boost 3.0 is totally disabled,
and he used a flat-line curve.


----------



## kliklakloe

+750


----------



## kliklakloe

Can someone tell me if I can flash an EVGA GTX 1070 FTW bios to a GTX 1070 SC, and if doing so would give me the 122% power target the FTW has?


----------



## gtbtk

Quote:


> Originally Posted by *kliklakloe*
> 
> Can someone tell me if i can flash a EVGA gtx 1070 FTW bios to a gtx 1070 SC? and if doing so would give me the 122% power target the FTW has?


yes and yes.

The FTW does have a beefier VRM and extra PCIe power compared to the SC card, though. The top FTW bios will top out at 226W, but in all honesty, I don't think it will give you much better performance than you can get by just overclocking the card on the SC bios.


----------



## kliklakloe

Quote:


> Originally Posted by *gtbtk*
> 
> yes and yes.
> 
> The FTW does have a beefier VRM and extra PCIe power than the SC card does though. The top FTW bios will top out at 226W but in all honesty, I don't think that it will give you much better performance that you cant get by just overclocking the SC bios card


Thanks for the information! The card clocks very well; it just runs into the power limit constantly. This doesn't happen at the 112% it is set to in Afterburner, but already starts at 107%. The result is that the voltage constantly drops to around 1000mv, and the clock speed bounces all over the place, from 2164MHz down to 2038MHz. I know the VRM section isn't beefy enough to go over 200 watts or so, but it should manage a little more than it does now, especially under a full-cover water block. So are you 100% sure flashing the FTW bios isn't going to give me problems?


----------



## gtbtk

Quote:


> Originally Posted by *kliklakloe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> yes and yes.
> 
> The FTW does have a beefier VRM and extra PCIe power than the SC card does though. The top FTW bios will top out at 226W but in all honesty, I don't think that it will give you much better performance that you cant get by just overclocking the SC bios card
> 
> 
> 
> Thanks for your information! The cards clocks very well it just is running into power limit constantly. This does not happen at 112% as what it is set to in after burner but already starts at 107%. Result is that the voltage is constantly dropping to around 1000mv. And the clockspeed is bouncing al over the place from 2164mhz to 2038mhz I know the VRM section is not beefy enough to go over 200watts or so, but it should manage a little more than it is now, especially onder a FC water Block. So you are a 100% sure flashing the FTW bios is not gonna give me problems?
Click to expand...

I have flashed both the SC and FTW bioses to my MSI Gaming card. The key things are that the cards have matching voltage controllers, and that you watch temps and power draw if you are experimenting on cards with lower-specced VRMs. I also know that the Zotac AMP Extreme bios runs fine on an FTW card.

EVGA bioses do have a habit of bouncing off the power limit. They are the only ones in my testing that will hit the power limit under a 1080p load.

Maybe you might like to try the Asus Strix OC bios?

Before you start playing with cross flashing, install Precision XOC and run the auto overclocking utility. You probably won't get a completely stable overclock out of it, but it will show you where the strong and weak voltage points are for curve overclocking if you want to try fine tuning the card in the future.


----------



## kliklakloe

Quote:


> Originally Posted by *gtbtk*
> 
> I have flashed both the SC and FTW bioses to my MSI gaming card. the key thing is that the cards have matching voltage controllers and you watch temps and power draws if you are experimenting with the lower specced VRM cards. I also know that the Zotac Amp Extreme bios runs fine on a FTW card
> 
> EVGA bioses do have a habit of bouncing off the power limit. They are the only ones in my testing that will hit the power limit under a 1080p load.
> 
> Maybe you might like to try the Asus strix OC bios?
> 
> Before you do start playing with cross flashing, install precision XOC and run the auto overclocking utility. You probably wont get a completely stable overclock out of it but it will show you where the strong and weak voltage points are for curve overclocking if you want to try fine tuning the card in the future.


Thanks, I appreciate the info you gave me. I just flashed an Asus Strix OC bios and the power target still only goes to 112%, though the Strix can go to 120%. I checked GPU-Z and the bios flashed correctly; it says it's an Asus card. I don't hear many complaints about people power limiting, but for my card it's the only limit I have. I can't even reach 1.093 volts, because the power limit kicks in long before that. Temps are not an issue: 42 degrees max on water.


----------



## Nawafwabs

I have an Asus 1070 O8G.

I changed motherboards yesterday, from a Z170X Gaming 7 to an MSI Z270 Pro Carbon.

Now I have a problem: the GPU fan speed is 0 RPM.

I can't control it with MSI Afterburner or other monitoring software.

I've tried different drivers; nothing works.









please help me


----------



## kliklakloe

Just flashed an FTW bios and now it runs like a dream: no more power limiting, and clocks stay stable at 2176.5MHz all the time, memory at 9500MHz.


----------



## TerafloppinDatP

Quote:


> Originally Posted by *kliklakloe*
> 
> Just flashed a FTW bios and now it runs like a dream, no more power limiting and clocks stay stable at 2176.5mhz all the time. memory at 9500mhz.


Inspiring. What's your water setup?


----------



## kliklakloe

Quote:


> Originally Posted by *TerafloppinDatP*
> 
> Inspiring. What's your water setup?




I Suck at taking pictures, the rad is an EK coolstream XE360


----------



## TerafloppinDatP

Looks sweet to me! The major lesson I see here is that serious water cooling gets you far more than a second 8-pin power connector does. I read about FTW owners being happy hitting a stable 2100MHz, which you've beaten handily with an SC and its single 8-pin connector. I love EVGA and my 1070 SC, but the FTW's second 8-pin power connector looks like full-on hype the more I read about final overclocks vs other models. FTW owners, if you want to set me straight, please do; I'm here to learn.


----------



## Nawafwabs

Quote:


> Originally Posted by *Nawafwabs*
> 
> i have asus 1070 o8g
> 
> i change motherboard yesterday from z170x gaming 7 to msi z270 pro carbon
> 
> now i have problem, gpu Fan speed 0 rpm
> 
> i cant controll it with mai afterburner or other monitoring software
> 
> i try different driver nothing work
> 
> 
> 
> 
> 
> 
> 
> 
> 
> please help me


Solved it by going back to an older bios.


----------



## kliklakloe

Yes, all the power in the world doesn't get you anywhere if the silicon isn't up for it. And temperature plays a BIG role in this generation: 60 degrees on air is very nice, but go 15 degrees lower and your clocks are more stable and can reach higher. That doesn't change the fact that the power delivery on the reference PCBs is quite weak; it would have been nice to have the extra phase that the GTX 1080 has. But on water the VRMs stay much cooler, so they can handle more power per phase. The power delivery on the FTW is just plain overkill and everybody knows that. Happy with the FTW bios though: the 100% power target on that bios is 185 watts, and that's high enough for these chips, and for me to get nice stable clocks without hitting any limit other than the voltage limit.
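The "more power per phase" point falls out of simple conduction-loss math: current divides across the phases, and the I²R heat in each phase drops with the square of its share. A sketch; the per-phase resistance and the phase counts below are illustrative guesses, not real component specs:

```python
# Per-phase VRM conduction loss: halve the current through a phase and
# its I^2 * R heat drops to a quarter. All values illustrative only.

def phase_loss_watts(total_amps, phases, r_phase_ohms=0.004):
    per_phase_current = total_amps / phases
    return per_phase_current ** 2 * r_phase_ohms

loss_4_phase  = phase_loss_watts(150, 4)   # reference-style VRM
loss_10_phase = phase_loss_watts(150, 10)  # FTW-style VRM
```

Water cooling doesn't reduce the loss itself, but it carries the heat away faster, which is why a leaner phase count is more tolerable under a block.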


----------



## kliklakloe

Benching Heaven @ 2200 now!


----------



## kliklakloe

One of my Rad fans died so temps are a bit higher atm.


----------



## MEC-777

Quote:


> Originally Posted by *kliklakloe*
> 
> Benching Heaven @ 2200 now!


2200, damn! :O Nice OC. Does it stay there though?









What card is this?


----------



## kliklakloe

Quote:


> Originally Posted by *MEC-777*
> 
> 2200, damn! :O Nice OC. Does it stay there though?
> 
> 
> 
> 
> 
> 
> 
> What card is this?


Yes, it stays there. EVGA GTX 1070 SC, same PCB as the Founders Edition.


----------



## MEC-777

Quote:


> Originally Posted by *kliklakloe*
> 
> Yes it stays there, evga gtx 1070 sc. Same pcb as founders edition


Well, I think you got a golden chip, my friend.







Goes to show that OC potential with Pascal seems to be more heavily dependent on silicon lottery than VRM and card design. Will be putting my 1070 Founders under water very soon. Curious to see what it will do then....


----------



## kliklakloe

Quote:


> Originally Posted by *MEC-777*
> 
> Well, I think you got a golden chip, my friend.
> 
> 
> 
> 
> 
> 
> 
> Goes to show that OC potential with Pascal seems to be more heavily dependent on silicon lottery than VRM and card design. Will be putting my 1070 Founders under water very soon. Curious to see what it will do then....


Well, that's a first for me; my last 6 GPUs were all average or just completely garbage when it came to overclocking, no matter what great PCB design they had. So this time I just bought whatever came along for the right price. I do use an FTW bios, though: the stock bios constantly hits the power limit, so voltage was limited to around 1000mv. With the FTW bios it can stretch its legs.


----------



## MEC-777

Quote:


> Originally Posted by *kliklakloe*
> 
> Well that's a first for me, last 6gpus where all average or just completely garbage when it came to overclocking. Didn't matter what great pcb design they had so, this time I just bought whatever came on my path for the right price. I use a FTW bios though stock bios constantly hits power limit so voltage was limited to around 1000mv. With the FTW bios it can stretch its legs.


If the EVGA 1070 SC is essentially a founders card with a different cooler, then I wonder if the FTW bios could be used on any 1070 FE... ? I have a Zotac FE and now I'm curious... lol

I also don't want to brick my card though...


----------



## kliklakloe

Quote:


> Originally Posted by *MEC-777*
> 
> If the EVGA 1070 SC is essentially a founders card with a different cooler, then I wonder if the FTW bios could be used on any 1070 FE... ? I have a Zotac FE and now I'm curious... lol
> 
> I also don't want to brick my card though...


Yes, it works; it's exactly the same card with a different cooler. That I can guarantee you. Flashing is at your own risk, of course. Keep in mind that the FE cooler probably can't keep up with the extra power draw. Mine is cooled by a waterblock, so the VRM section is also much cooler than on air.


----------



## MEC-777

Quote:


> Originally Posted by *kliklakloe*
> 
> Yes it works, it's exactly the same card with a different cooler. That I can guarantee you. Flashing stays at own risk ofcourse. Keep in mind that the FE cooler probably can't keep up with the extra power draw. Mine is cooled by a waterblock so the VRM section is also much cooler than on air.


I'm installing the new kraken G12 water cooling kit with an H55 AIO, so the cooling will be far better than with the stock FE cooler.









I'll try just OCing with the stock bios after installing the water cooling and then probably try the FTW bios and see if there's a difference.

Thanks for the info.


----------



## gtbtk

Quote:


> Originally Posted by *kliklakloe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I have flashed both the SC and FTW bioses to my MSI gaming card. the key thing is that the cards have matching voltage controllers and you watch temps and power draws if you are experimenting with the lower specced VRM cards. I also know that the Zotac Amp Extreme bios runs fine on a FTW card
> 
> EVGA bioses do have a habit of bouncing off the power limit. They are the only ones in my testing that will hit the power limit under a 1080p load.
> 
> Maybe you might like to try the Asus strix OC bios?
> 
> Before you do start playing with cross flashing, install precision XOC and run the auto overclocking utility. You probably wont get a completely stable overclock out of it but it will show you where the strong and weak voltage points are for curve overclocking if you want to try fine tuning the card in the future.
> 
> 
> 
> Thanks I appreciate the info you gave me. I just flashed a Asus strix oc bios and the power target still only goes to 112% thought the strix can go to 120%. Checked gpu-z and the bios is flashed correct, says its an Asus card. I don't hear many complaints about people power limiting but for my card it's the only limit I have. Can't even reach 1.093 volts because the power limit kicks in long before that. Temps are not an issue, 42 degrees max on water.
Click to expand...

As soon as the card goes into the 3D P-state, it will pull the maximum voltage level that the curve you have set allows. As load is applied, the card draws more current, not more voltage, to increase the power draw. If you are only getting to 1.081v, it is because the curve the card is running has flattened off at 1.081v. You do get one benefit from that, though: as temps increase, the card will boost up to 1.093v before it reduces core clock speed. You can go in and adjust the curve manually to make the 1.093v point the highest point on the curve, and it will go to that voltage if you want.

It's the voltage slider, not the power limit slider, that allows you to hit 1.093v. You do need to enable voltage adjustments in the AB settings first.

The total power draw is built into each particular bios and is different for each model of card. The power limit slider is just a percentage above the base limit that the bios sets, not an arbitrary value that all cards share.

Having said that, from memory, the Asus OC bios (86.04.50.00.63) does have a 120% slider. The non-OC bios is 112% and 180W max (86.04.50.00.62, I think?). It is possible that you downloaded a mislabeled one. If you go to TechPowerUp to get the bios, the OC version is a 200W version. This is the OC version: https://www.techpowerup.com/vgabios/187005/asus-gtx1070-8192-161020

I have run the Zotac Extreme bios on my MSI. It will pull over 300W; the best I can get out of the standard MSI bios is 240-250W, but I don't get better frame rates with the Zotac bios. In fact, the Asus bios, which pulls only 200W max, performs better than the Zotac one and just about as well as the Gaming Z bios on my hardware. One of the other guys here has a hybrid water-cooled FTW card and he does get better performance using the Zotac bios, so every card will perform differently, particularly when you cross flash. I can't give you first-hand experience with an EVGA SC card because I don't have one to test. The only way to know for sure is to try it and see how it works for you.
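The percentage-vs-watts point is easy to trip over, so here it is in numbers: the slider is a multiplier on the bios's own base target, so the same "112%" means different watts on different bioses. The base wattages below are the ones quoted in this thread, not official specs:

```python
# The power-limit slider is a percentage of the per-bios base target,
# so identical slider settings mean different watts on different bioses.

def power_limit_watts(bios_base_watts, slider_percent):
    return bios_base_watts * slider_percent / 100.0

asus_oc_max = power_limit_watts(200, 120)  # Asus Strix OC bios, per this thread
ftw_max     = power_limit_watts(185, 122)  # FTW bios, 185W base per this thread
```

Note that 185W at 122% works out to about 226W, which matches the FTW ceiling quoted earlier in the thread.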


----------



## gtbtk

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kliklakloe*
> 
> Yes it works, it's exactly the same card with a different cooler. That I can guarantee you. Flashing stays at own risk ofcourse. Keep in mind that the FE cooler probably can't keep up with the extra power draw. Mine is cooled by a waterblock so the VRM section is also much cooler than on air.
> 
> 
> 
> I'm installing the new kraken G12 water cooling kit with an H55 AIO, so the cooling will be far better than with the stock FE cooler.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll try just OCing with the stock bios after installing the water cooling and then probably try the FTW bios and see if there's a difference.
> 
> Thanks for the info.
Click to expand...

If you want to play and you only have air cooling on the VRMs, I would suggest trying the Asus Strix OC bios before the FTW one: the FTW has the potential to pull 226W, but it has two 8-pin power connectors to supply that power. The Asus is a great bios with a core clock of 1633MHz, and it only uses a single 8-pin cable.


----------



## kliklakloe

Thanks for the info, man, appreciate it, but I have it all figured out; see my other posts on the last page. Running stable now @ 2200MHz.


----------



## kliklakloe

Quote:


> Originally Posted by *MEC-777*
> 
> I'm installing the new kraken G12 water cooling kit with an H55 AIO, so the cooling will be far better than with the stock FE cooler.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll try just OCing with the stock bios after installing the water cooling and then probably try the FTW bios and see if there's a difference.
> 
> Thanks for the info.


Let me know how it goes; I would appreciate that. With the Kraken your VRM section is still air-cooled, so watch out that you don't overload it. A PCB with more VRM phases, like the FTW has, produces less heat per phase, because there are more of them to share the supplied power.


----------



## MEC-777

Quote:


> Originally Posted by *gtbtk*
> 
> If you want to play, you only have aircooling on the VRMs, I would suggest that you try the Asus Strix OC bios over the FTW to start with, the FTW has the potential to pull 226W but it has two 8 Pin power connectors to supply the power. The Asus is a great bios with a core clock of 1633Mhz and it only uses a single 8 pin cable.


The air cooling on the VRM with the G12 still shows an improvement in VRM cooling over the stock cooler, even on the FTW card, as Jayztwocents observed in his testing.




So I'm not too concerned about the VRM cooling. I will definitely keep a close eye on it, especially if I try cross-flashing with the Strix bios. Thanks for suggesting that. I was wondering if it would be an issue using a bios from a 2x 8-pin card on a single 8-pin card. I'm not looking to break records, just a good solid/stable OC.








Quote:


> Originally Posted by *kliklakloe*
> 
> Let me know how it goes, i would appreciate that. with the kraken your VRM section is still aircooled, so watch out you dont overload them, PCB with more VRMS like the FTW has, does less heat per VRM because they are with more to share the supplied power.


See the video I posted above. Better VRM temps with the G12, even with naked VRMs.


----------



## kliklakloe

Interesting video; it's great to have as many sensors as the ICX has. I hope this becomes more common in future generations. Good luck with the Kraken; I hope to hear how it goes.


----------



## zipper17

This is interesting to look at regarding Pascal overclocking, if you want.








I watched his Time Spy benches; something interesting is that he can run a very high core clock with lowered voltage, but it gets bad scores (a fake core clock or something like that). Once he cranks up the voltage, the scores start to scale up, but he can no longer run a very high core clock; it will likely start to crash. CMIIW.

Then he heavily modded the card: he totally disabled the power limit and the voltage limit, and used a completely flat-line curve.


----------



## kliklakloe

Great haircut!


----------



## khanmein

@zipper17 Ever since that 'fella' criticized the ASRock bios and showed a damn bias toward certain brands, I stopped following him; that channel is moving down another path.


----------



## turkishmafia

I've been extremely happy with my GTX 1070 FE, but I am looking to upgrade to SLI or 1080Ti.

I'm not so concerned that games won't support an SLI profile, but I am concerned regarding microstutter. Is microstutter still an issue with SLI?

Thanks very much.


----------



## Madmaxneo

Quote:


> Originally Posted by *turkishmafia*
> 
> I've been extremely happy with my GTX 1070 FE, but I am looking to upgrade to SLI or 1080Ti.
> 
> I'm not so concerned that games won't support an SLI profile, but I am concerned regarding microstutter. Is microstutter still an issue with SLI?
> 
> Thanks very much.


I believe microstutter is still an issue, and though I know you are not concerned about it, there are apparently fewer and fewer games coming out that support SLI. I also believe you would get better performance with a single 1080Ti than you would with two 1070s in SLI. But I could be wrong, as it is a slightly uneducated guess based on comparisons of previous cards in SLI vs a single (albeit more powerful) card.


----------



## gtbtk

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> If you want to play, you only have aircooling on the VRMs, I would suggest that you try the Asus Strix OC bios over the FTW to start with, the FTW has the potential to pull 226W but it has two 8 Pin power connectors to supply the power. The Asus is a great bios with a core clock of 1633Mhz and it only uses a single 8 pin cable.
> 
> 
> 
> The air cooling on the VRM with the G12 still shows improvement in VRM cooling over the stock cooler of even the FTW card like Jayztwocents observed in his testing.
> 
> 
> 
> 
> So I'm not too concerned about the VRM cooling. I will definitely keep a close eye on it, especially if I try cross-flashing with the Strix bios. Thanks for suggesting that. I was wondering if it would be an issue using a bios from a 2x 8-pin card on a single 8-pin card. I'm not looking to break records, just a good solid/stable OC.
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kliklakloe*
> 
> Let me know how it goes, i would appreciate that. with the kraken your VRM section is still aircooled, so watch out you dont overload them, PCB with more VRMS like the FTW has, does less heat per VRM because they are with more to share the supplied power.
> 
> 
> See the video above I posted. Better VRM temps with the G12, even with naked VRMs.

Not anything to panic about, just be aware of and keep an eye on it


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> @zipper17 Ever since that 'fella' criticized the ASRock bios & showed blatant bias toward certain brands, I stopped following that channel; it's moving down another path.


The criticisms were justified - resetting all your settings when you adjust something in the memory timings is stupid. It did have the desired effect and got a fixed bios revision released, though.


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *turkishmafia*
> 
> I've been extremely happy with my GTX 1070 FE, but I am looking to upgrade to SLI or 1080Ti.
> 
> I'm not so concerned that games won't support an SLI profile, but I am concerned regarding microstutter. Is microstutter still an issue with SLI?
> 
> Thanks very much.
> 
> 
> 
> I believe microstutter still is an issue, and though I know you are not concerned, there are apparently less and less games that come out that support SLI. I also believe you would get better performance with a single 1080Ti than you would with two 1070's in SLI. But I could be wrong as it is a slightly uneducated guess based on comparisons of previous cards SLI vrs a single card (albeit more powerful) performance.

Microstutter is still a thing associated with SLI. You will end up with a less problematic solution with a 1080 Ti if you can afford it

You can use Nvidia profile inspector to create your own SLI profiles (even 4 way if you want) for games that don't have native support


----------



## kliklakloe

Quote:


> Originally Posted by *gtbtk*
> 
> Not anything to panic about, just be aware of and keep an eye on it


Bit hard to track without temp sensors on the VRM


----------



## gtbtk

Quote:


> Originally Posted by *kliklakloe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Not anything to panic about, just be aware of and keep an eye on it
> 
> 
> 
> Bit hard to track without temp sensors on the VRM

turn the fans up and dont rely on the stock curve...


----------



## MEC-777

Quote:


> Originally Posted by *gtbtk*
> 
> turn the fans up and dont rely on the stock curve...


The 92mm fan on the G10/12 can be run at 100% and it remains almost completely silent. No reason not to run it wide open and have peace of mind.


----------



## MEC-777

G12 + H55 now installed on the 1070. Good results thus far. 2050 seems to be the max this card can do and remain stable, but at least now it actually stays at 2050 95% of the time and only dips to 2038 every now and then. Finally cracked 20k in Firestrike graphics score.









Also interesting is that I can run this setup without the rad fan running with the GPU at idle and low loads. Core temp levels out at 34-35C with just the pump circulating the water. Under load, I have the rad fan turn on and ramp up to about 50-60% which maintains low/mid 50C core temps under full load with a 2050/9000mhz OC.

Pics of the final setup to come later tonight or tomorrow...


----------



## TerafloppinDatP

Quote:


> Originally Posted by *MEC-777*
> 
> G12 + H55 now installed on the 1070.


Considering this combo. What's the loudest part of your build now? Got my noise levels dialed way down except for when my EVGA 1070 SC is running 70-100% fans.


----------



## kliklakloe

After extensive stable gaming on 2200mhz i tried benching even higher and it takes 2303 as well!


----------



## blaze2210

Quote:


> Originally Posted by *kliklakloe*
> 
> 
> 
> After extensive stable gaming on 2200mhz i tried benching even higher and it takes 2303 as well!


What did you do to keep from hitting the power or voltage limits?


----------



## kliklakloe

Quote:


> Originally Posted by *blaze2210*
> 
> What did you do to keep from hitting the power or voltage limits?




This is an EVGA SC, same PCB as the Founders Edition. I flashed an FTW bios to this card, which solved the power limit problems for me; the FTW bios draws 185 watts @ 100%, so I set the power limit to 95% since this card has a weaker power delivery section than the FTW. As for voltage, I just set the clock I want at the max voltage Pascal can use (1.093) using the curve menu in Afterburner.
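The "flat curve" trick described here can be pictured as a simple data transformation: clamp every point at or above the chosen voltage to one target clock and leave the rest alone. A minimal illustrative sketch (this is not Afterburner's API, and the voltage/clock points below are made-up examples):

```python
# Sketch of the flat-curve overclock: pin the clock at the top voltage point.
def flatten_curve(curve, target_mhz, at_voltage=1.093):
    """curve: list of (voltage, mhz) points; points at/above at_voltage get target_mhz."""
    return [(v, target_mhz if v >= at_voltage else mhz) for v, mhz in curve]

# Hypothetical stock curve points for a Pascal card.
stock = [(0.80, 1700), (0.95, 1900), (1.043, 2000), (1.093, 2050)]
print(flatten_curve(stock, 2300))
# the 1.093 V point now reads 2300 MHz; the lower points are untouched
```

In Afterburner the same effect comes from dragging the 1.093 V point up in the Ctrl+F curve editor and letting everything above it flatten out.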


----------



## Harrywang

Hey guys, planning on upgrading my R9 280X finally. Looking to get the 1070 even though prices are going insane right now. Not sure if I should wait until prices drop or just buy one now.

Anyhow, right now the only ones available are the MSI Gaming X or the Zotac AMP edition (not Extreme). The Zotac is about $60 cheaper. Money isn't a REAL concern, but of these two cards, which is better in terms of overclocking, cooling, and noise?


----------



## MEC-777

Quote:


> Originally Posted by *TerafloppinDatP*
> 
> Considering this combo. What's the loudest part of your build now? Got my noise levels dialed way down except for when my EVGA 1070 SC is running 70-100% fans.


At idle and normal use, it's completely silent. The only way you can hear the 92mm fan on the G12 bracket (which is running at 100% along with the pump on the H55) is if you literally put your ear up to the rear of the case where it's open.

Under full load, there is a faint and gentle "woosh" from the fans as they come on and ramp up to 50% max (the intakes at the bottom) and 40% max (the exhausts at the top). CPU fan only comes on when necessary and usually doesn't need to exceed 50%.

So the fans are the loudest part, but only when they're running and as such, they are very quiet.


----------



## MEC-777

Yesterday I got the setup, fan curves and everything all dialed-in and optimized. After some thorough testing, very happy with the end results. Push-pull actually makes a huge difference. With 1 fan in push at 60%, core temps reached up to 55C. With 2 fans in push-pull at 50%, core temps reached 50C max and hovered around 48-49. Core clocks actually hit 2076 and flat-lined at 2050.

Also confirmed that the core temp does gradually come down, even with no fans running on the rad. Eventually settled back down in the low 30C range.









Setup pics:


----------



## kliklakloe

Quote:


> Originally Posted by *Harrywang*
> 
> Hey guys planning on upgrading my r9 280x finally. Looking to get the 1070 even thuogh the prices are going insane right now. Not sure if I should wait until the prices drop or just buy one now.
> 
> Any how right now the only ones available is the msi gaming X or the zotac AMP edition (not extreme). The zotac is about 60$ cheaper. Money isn't a REAL concern but out of these 2 cards which one is better in terms of overclocking, cooling, and noise?


Where do you live that prices are so high? I think both are really good cards, and I'd go for the cheapest one. Performance doesn't really differ between board designs this generation.


----------



## TerafloppinDatP

Quote:


> Originally Posted by *MEC-777*
> 
> At idle and normal use, it's completely silent. The only way you can hear the 92mm fan on the G12 bracket (which is running at 100% along with the pump on the H55) is if you literally put your ear up to the rear of the case where it's open.
> 
> Under full load, there is a faint and gentle "woosh" from the fans as they come on and ramp up to 50% max (the intakes at the bottom) and 40% max (the exhausts at the top). CPU fan only comes on when necessary and usually doesn't need to exceed 50%.
> 
> So the fans are the loudest part, but only when they're running and as such, they are very quiet.


Thanks. Sweet rig!


----------



## zipper17

Quote:


> Originally Posted by *kliklakloe*
> 
> 
> 
> After extensive stable gaming on 2200mhz i tried benching even higher and it takes 2303 as well!


You're using the curve; can you share what your curve looks like?
Did you only use Unigine Heaven? Try the 3DMark Firestrike/Timespy bench and stress tests.
What type of cooling is your card using?


----------



## Keudn

Hey guys, is there any way to get over the 1.093v limit on these cards? (besides doing a hard mod with soldering and crap). I miss the good days of overclocking my 770 with a softmod to get some extra voltage


----------



## Falkentyne

Been trying to get someone to make a modded NVflash for flashing modded versions of the newer MXM and Pascal Bioses, but either no one cares or the people who are capable of such things have other things to do


----------



## kliklakloe

Quote:


> Originally Posted by *zipper17*
> 
> You're using curve, can you share your curve how it looks like?
> Did you only use Unigine heaven? try 3dmark firestrike/timespy Bench & Stress test
> What type of cooling your card are using?


I ran Firestrike too @ 2300MHz; the graphics score was just under 22000. I haven't had time to do much further testing; I'll post some screenshots tonight. My curve is just 2300 set @ 1.093 volts and flat from there; the rest of the voltage points I leave untouched. I'll show a screenshot of that too. So far the only game that lets me down is Overwatch after an hour or so, but I suspect that's because of the memory OC.


----------



## kliklakloe

Cooling is a custom liquid loop with an EK full cover block.


----------



## Madmaxneo

Quote:


> Originally Posted by *Falkentyne*
> 
> Been trying to get someone to make a modded NVflash for flashing modded versions of the newer MXM and Pascal Bioses, but either no one cares or the people who are capable of such things have other things to do


It's not that. The bios on Pascal chips is encrypted and, as far as I know, no one has been able to break it so far.


----------



## Falkentyne

There is an editor for the MXM Pascal versions already. But it requires a hardware programmer to flash it if you make edits. NVflash just says certificate 2.0 error.
People are using GTX 1070's with 200w TDP...


----------



## kliklakloe

Quote:


> Originally Posted by *MEC-777*
> 
> Yesterday I got the setup, fan curves and everything all dialed-in and optimized. After some thorough testing, very happy with the end results. Push-pull actually makes a huge difference. With 1 fan in push at 60%, core temps reached up to 55C. With 2 fans in push-pull at 50%, core temps reached 50C max and hovered around 48-49. Core clocks actually hit 2076 and flat-lined at 2050.
> 
> Also confirmed that the core temp does gradually come down, even with no fans running on the rad. Eventually settled back down in the low 30C range.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Setup pics:


----------



## rfarmer

Quote:


> Originally Posted by *kliklakloe*
>
> How and why do you get that card in a 45 degree angle?


Thanks for asking that because I was wondering the same thing.


----------



## Warface-NL

Quote:


> Originally Posted by *Harrywang*
> 
> Hey guys planning on upgrading my r9 280x finally. Looking to get the 1070 even thuogh the prices are going insane right now. Not sure if I should wait until the prices drop or just buy one now.
> 
> Any how right now the only ones available is the msi gaming X or the zotac AMP edition (not extreme). The zotac is about 60$ cheaper. Money isn't a REAL concern but out of these 2 cards which one is better in terms of overclocking, cooling, and noise?


The price is getting high because of mining. I bought at the right time for my mining rig ("only bought 2"), but yes, the 1070 is atm the best card for mining because of its power draw, and the cards are running out. That's why they're getting expensive atm :/


----------



## TerafloppinDatP

Much love for my 1070, but I wouldn't pass up the chance to sell it to a miner and trade up to a 1080 at even money. If it's even true that the 1080's GDDR5X makes it not so good for mining, 1080 prices shouldn't go up too. Haven't seen 1070 prices budge on eBay though.


----------



## MEC-777

Quote:


> Originally Posted by *kliklakloe*
>
> How and why do you get that card in a 45 degree angle?


Quote:


> Originally Posted by *rfarmer*
>
> Thanks for asking that because I was wondering the same thing.


I modified the GPU mounting bracket and used a flexible PCIe extension ribbon.

Why? To be different.


----------



## lanofsong

Hello GTX 1070 owners,

We are having our monthly Foldathon from Monday 19th - Wednesday 21st - 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come *sign up* and fold with us - see attached link.

June 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - needs a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## kckyle

quick question guys, does the dual pin draw more power by default or is it only if you really overclock it?


----------



## microchidism

Quote:


> Originally Posted by *TerafloppinDatP*
> 
> Much love for my 1070 but wouldn't pass up the chance to sell it to a miner and trade up to a 1080 at even money. If it's even true the 1080 GDDR5X makes them not so good for mining so they don't go up in price too. Haven't seen 1070 prices budge on eBay though.


I'd do the same, but from the looks of things the prices of 1080s have gone up quite a bit. Around the 1080 Ti release you could find them in the $400s; now they're in the $500s.


----------



## comanzo

Tbh, if I had foreseen the 1070's lack of custom bioses as well as the hard lock Nvidia placed on voltage, I probably would've gone 980 Ti. In other words, knowing what I know now about Pascal's disappointing overclocking, the 980 Ti is the way to go. If you overclock a 980 Ti with enough voltage and put it under a custom bios, you'll definitely get performance closer to a 1080 than to a 1070. Though, this is what I've observed from online forums; I can only speak for the 1070 because that's what I own. I don't own a 980 Ti.


----------



## MEC-777

Quote:


> Originally Posted by *comanzo*
> 
> Tbh, if I foresaw the 1070's lack of custom biose's as well as a hard lock on the voltage placed by nvidia, I probably would've went 980ti. In other words, if I knew what I know now with the horrible overclocking of pascal, 980ti is the way to go. If you overclock the 980ti with enough voltage and put it under a custom bios, you will definitely have performance closer to a 1080 than a 1070. Though, this is what I have observed from online forums. I can only speak for 1070 because that is what I own. I don't own a 980ti.


Overclocking isn't horrible on Pascal; it's actually quite significant. The reason people don't see this is GPU Boost 3.0, which overclocks ALL Pascal GPUs automatically, right out of the box.

Just look at the base/boost clocks of any 1070 and look at what they actually clock to without even touching anything.

The FE has a base clock of 1506 and a boost clock of 1683. Most of them ramp up into the 1900s (mine hits 1911) and settle somewhere in the 1800s. That's already an OC of 150-200MHz over the rated boost clock without doing anything. GPU Boost 3.0 is designed to push the card further if certain parameters are met.

Getting more out of the card above that takes a bit more knowledge, time and finesse and also requires that the core temp be kept as low as possible to prevent the clocks from down-stepping. I was able to get my FE to hold 2050 under water while keeping temps around 50C. That's an OC of +367mhz over the stock boost clock on a reference card. Not to mention most 1070's can run the memory at +500 or more.

They are a different beast than Maxwell and operate differently. They are engineered with higher efficiency in mind, not blistering overclocks. But that being said, look at how much they actually do OC and especially compare that to AMD cards which struggle to do +150-200mhz above stock clocks a lot of the time.









Just some food for thought.
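The arithmetic above can be sketched in a few lines (clock figures taken from this post; the script just restates them):

```python
# Effective overclock of a 1070 FE, using the clocks quoted in this post.
rated_base = 1506      # MHz, NVIDIA's spec for the Founders Edition
rated_boost = 1683     # MHz, NVIDIA's rated boost clock
observed_boost = 1911  # MHz, what GPU Boost 3.0 reached out of the box here
tuned_clock = 2050     # MHz, held under water in this post

print(f"out-of-the-box headroom: +{observed_boost - rated_boost} MHz")  # +228 MHz
print(f"tuned OC over rated boost: +{tuned_clock - rated_boost} MHz")   # +367 MHz
```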


----------



## asdkj1740

Quote:


> Originally Posted by *kckyle*
> 
> quick question guys, does the dual pin draw more power by default or is it only if you really overclock it?


It all depends on the bios power settings, which you can't modify.


----------



## gtbtk

Quote:


> Originally Posted by *kckyle*
> 
> quick question guys, does the dual pin draw more power by default or is it only if you really overclock it?


Power draw is determined by the bios settings and how much you overclock. I assume you mean cards that have 6+8-pin or 8+8-pin power connections vs single 8-pin cards.

The cards with single 8 pin power typically have lower power limits baked in to the bios than the cards with multi power cables.


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Tbh, if I foresaw the 1070's lack of custom biose's as well as a hard lock on the voltage placed by nvidia, I probably would've went 980ti. In other words, if I knew what I know now with the horrible overclocking of pascal, 980ti is the way to go. If you overclock the 980ti with enough voltage and put it under a custom bios, you will definitely have performance closer to a 1080 than a 1070. Though, this is what I have observed from online forums. I can only speak for 1070 because that is what I own. I don't own a 980ti.


A good OC for a 980ti is 350-400mhz with a few getting to 500mhz.

1070 will get similar or better OC results over the published boost clocks. A few are even getting 600mhz over stock with water cooling


----------



## Madmaxneo

Quote:


> Originally Posted by *comanzo*
> 
> Tbh, if I foresaw the 1070's lack of custom biose's as well as a hard lock on the voltage placed by nvidia, I probably would've went 980ti. In other words, if I knew what I know now with the horrible overclocking of pascal, 980ti is the way to go. If you overclock the 980ti with enough voltage and put it under a custom bios, you will definitely have performance closer to a 1080 than a 1070. Though, this is what I have observed from online forums. I can only speak for 1070 because that is what I own. I don't own a 980ti.


Quote:


> Originally Posted by *gtbtk*
> 
> A good OC for a 980ti is 350-400mhz with a few getting to 500mhz.
> 
> 1070 will get similar or better OC results over the published boost clocks. A few are even getting 600mhz over stock with water cooling


I had a 980 with a modded bios that easily hit about 1530MHz. It took some tweaking and really messing with the OC numbers, but I was finally able to just barely beat its benchmarks with my 1070. The thing is, my 980 was not watercooled but my 1070 is, and my 980 sometimes hit its max OC wall because temps got so high.
The 900 series had some great OC potential with or without watercooling; the 1000 series, not so much. Though they are powerful cards, I lean towards the idea that the 900 series had more headroom than the 1000 series.


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *comanzo*
> 
> Tbh, if I foresaw the 1070's lack of custom biose's as well as a hard lock on the voltage placed by nvidia, I probably would've went 980ti. In other words, if I knew what I know now with the horrible overclocking of pascal, 980ti is the way to go. If you overclock the 980ti with enough voltage and put it under a custom bios, you will definitely have performance closer to a 1080 than a 1070. Though, this is what I have observed from online forums. I can only speak for 1070 because that is what I own. I don't own a 980ti.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> A good OC for a 980ti is 350-400mhz with a few getting to 500mhz.
> 
> 1070 will get similar or better OC results over the published boost clocks. A few are even getting 600mhz over stock with water cooling
> 
> 
> I had a 980 with a modded bios that I easily hit about 1530 mhz with. It took some tweaking and really messing with the OC numbers but I was finally able to barely beat the benchmarks with my 1070. The thing is my 980 was not watercooled but my 1070 is. I constantly hit a max OC on my 980 sometimes because it got so high in temps.
> The 900 series of cards had some great OC potential with or without watercooling, the 1000 series of cards, not so much, though they are powerful cards I lean towards the idea that the 900 series had more potential than the 1000 series.

You may want to try an asus strix oc bios on your card. it doesn't hit power limits anywhere near as quickly as the EVGA vbioses do


----------



## Madmaxneo

Quote:


> Originally Posted by *gtbtk*
> 
> You may want to try an asus strix oc bios on your card. it doesn't hit power limits anywhere near as quickly as the EVGA vbioses do


I thought about doing something like that, but I'm not sure this card would do any better.
My ASIC quality is only 60.2 (though I'm not sure how relevant ASIC quality is nowadays), and when running the test with Precision XOC it hit a lot of limits. Plus this card has Micron memory, so it's maybe not a great OC card, but it does a good job. I'm not complaining, because I got this card for free as a warranty replacement when I sent in my 980 for issues with the HDMI out, amongst a few other things.
This is my highest Firestrike score on this card (overall 17,046, with a graphics score of 20,273); my highest overall score with my 980 on a modded bios was only about 14,597, but the framerates were on par with my 1070.


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You may want to try an asus strix oc bios on your card. it doesn't hit power limits anywhere near as quickly as the EVGA vbioses do
> 
> 
> 
> I thought about doing something like that but I am not sure this card would do any better.
> My ASIC is only a 60.2 (though I am not sure how relevant ASIC nowadays) and when running the test with precision XOC it hit a lot of limits. Plus this card has micron memory so maybe not a great OC card, but it does a good job. I am not complaining because I got this card for free as a warranty replacement when I sent my 980 for issues with the HDMI out amongst a few other things.
> This is my highest score with FIrestrike (overall 17,046, with a graphics score of 20,273) on this card, my highest overall score with my 980 with a modded bios was only about 14,597 but the framerates were on par with my 1070.

My rig is down at the moment so I can't check this, but unless ASIC support has been re-added recently in 2.1.0, GPU-Z does not support ASIC scores for Pascal cards. Very old versions of GPU-Z would report a score, but it is not accurate.

During my research to resolve the Micron memory bug, I cross-flashed most of the different bioses to my card. The EVGA bioses were the only ones to hit the power limit under a 1080p load. That was true for both the SC and the FTW bioses.

As you have an EVGA card, the auto overclocking utility in Precision is a handy tool to show you which parts of the curve have more OC headroom than you are currently using. The results themselves are not always stable, but if you compare the shape of the curve it creates against the overclock curve you have created, it will show you which voltage levels you can play with to get more performance. It helped me determine that my card had a hole that would make it crash if I pushed more than +50 at 1.0v. I also discovered that I could tune the hole out with a small CPU PLL adjustment (running a Z68 i7-2600, YMMV).

Make sure that GameDVR is disabled and ASUS AI Suite is not installed; they both kill graphics performance. Having said that, 20273 is not a bad score. I assume you are overclocking using the slider? Afterburner gives you much finer adjustments than Precision. If you set the card up as you normally would, try playing with the curve and boosting the .950v point up as high as you can while remaining stable. You might find that 21000 is achievable.


----------



## zetoor85

Owner of an Asus Dual GTX 1070; I can hit 2100MHz.

My old card was a Gigabyte 980 Ti Xtreme; max OC was 1660MHz and it could eat a GTX 1080 ez







But I'm happy with the GTX 1070; I sold the 980 Ti with my old Skylake rig.. RIP, card of doom


----------



## iARDAs

Guys, I can't find a 1070 anywhere on Amazon or Newegg. What's going on?


----------



## syl1979

Quote:


> Originally Posted by *iARDAs*
> 
> Guys I can't find a 1070 anywhere on amazonr or newegg. Whats going on?


They are all going to Chinese mining farms


----------



## Madmaxneo

Quote:


> Originally Posted by *gtbtk*
> 
> My Rig is down at the moment so I cant check this but unless asic support has been re-added recently in 2.1.0, GPU-z does not support asic scores for Pascal cards. The very old versions of GPU-z would report a score but it is not accurate.
> 
> During my research to resolve the Micron memory bug, I cross flashed most of the different bioses to my card. the EVGA bioses were the only ones to hit the power limit under a 1080p load. That was true for both the SC and the FTW bioses.
> 
> As you have an EVGA card, the auto overclocking utility in precision is a handy tool to show you what parts of the curve have more OC headroom than you are currently using. The results themselves are not always stable but if you look at the shape of the curve that it creates, you can compare against the overclock curve you have created and it will show you which voltage levels that you can play with to get more performance. It helped me determine that my card had a hole that would make the card crash if i pushed my card more than +50 at 1.0v. I also discovered that I could tune the hole out with a small CPU PLL adjustment (running z68 i7-2600, YMMV)
> 
> Make sure that gameDVR is disabled and asus aisuite is not installed. they both kill graphics performance. Having said that, 20273 is not a bad score. I assume that you are overclocking using the slider? Afterburner gives you much finer adjustments that Precision. If you set your card as you normally would overclock it, try playing with the curve and boosting the .950 point up as high as you can get it while remaining stable. You might find that 21000 is achievable.


I was running the GPU-Z version prior to 2.1.0, and I'm not sure my ASIC score was accurate because I realized it was reading -60.2. The latest version doesn't even give me the option to read the ASIC quality. I had AI Suite 3 installed with my GTX 980, and someone else had told me the same thing, so I disabled/uninstalled it. The interesting thing was that my highest scores with my 980 came when I had AI Suite installed. I reinstalled it, but when my 980 and my CPU started dying I uninstalled AI Suite again while troubleshooting, and I haven't reinstalled it since. So I think the AI Suite issues don't happen for everyone, as it ran fine for me. If I get the chance in the next few weeks I'll see if I can call that curve up in Precision XOC; hopefully it's saved. But I don't know how to call up the curve in MSI AB to adjust each voltage point as you recommended.


----------



## gtbtk

Quote:


> Originally Posted by *iARDAs*
> 
> Guys I can't find a 1070 anywhere on amazonr or newegg. Whats going on?


Ethereum mining rigs


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> My Rig is down at the moment so I cant check this but unless asic support has been re-added recently in 2.1.0, GPU-z does not support asic scores for Pascal cards. The very old versions of GPU-z would report a score but it is not accurate.
> 
> During my research to resolve the Micron memory bug, I cross flashed most of the different bioses to my card. the EVGA bioses were the only ones to hit the power limit under a 1080p load. That was true for both the SC and the FTW bioses.
> 
> As you have an EVGA card, the auto overclocking utility in precision is a handy tool to show you what parts of the curve have more OC headroom than you are currently using. The results themselves are not always stable but if you look at the shape of the curve that it creates, you can compare against the overclock curve you have created and it will show you which voltage levels that you can play with to get more performance. It helped me determine that my card had a hole that would make the card crash if i pushed my card more than +50 at 1.0v. I also discovered that I could tune the hole out with a small CPU PLL adjustment (running z68 i7-2600, YMMV)
> 
> Make sure that gameDVR is disabled and asus aisuite is not installed. they both kill graphics performance. Having said that, 20273 is not a bad score. I assume that you are overclocking using the slider? Afterburner gives you much finer adjustments that Precision. If you set your card as you normally would overclock it, try playing with the curve and boosting the .950 point up as high as you can get it while remaining stable. You might find that 21000 is achievable.
> 
> 
> 
> I was running the GPUZ version prior to 2.1.0, I am not so sure my ASIC score was accurate because I realized it was reading -60.2. Now though with the latest version it does not even give me the option to read the ASIC quality. I had AI Suite 3 enabled with my gtx 980 and someone else had told me the same thing so I disabled/uninstalled it. The interesting thing was my highest scores with my 980 were when I had AI suite installed. I reinstalled it but when my 908 and my CPU started dying I again uninstalled AI suite trying to trouble shoot my issues and I have not reinstalled it since. So I think the AI suite issues don't happen with everyone as it ran fine for me. If I get the chance in the next few weeks I will see if I can call that curve up in Precision XOC, hopefully it is saved. But I do not know how to call the curve up in MSI AB for adjusting each voltage slider as you recommended..

AI Suite ran fine for me too, but it did impact my Firestrike scores by about 500 points. Just uninstalling the software leaves two or three services behind that you have to remove manually to be rid of it. If your services list still has ASUS-labeled services running, you can use the SC command from an admin command prompt to delete or disable them.

There are unicorn 980ti cards around, as there are unicorn 1070 cards as well.

As long as you save the Precision auto-generated curve to an empty profile slot, you can recall it after a reboot. It's easy to transfer the EVGA Precision auto curve across to Afterburner: apply the curve in Precision, close that software down completely, then start Afterburner and it will pick up the card's currently applied curve, which you can save to one of the AB profile slots. The overclocks you apply don't rely on Precision or Afterburner remaining open after being applied; they stay on the card until you reboot the PC, at which point it reverts to stock settings.

To open the curve window in AB 4.3 and above, you can either click the little bar graph icon next to the core speed slider or more simply press CTRL-F


----------



## MrTOOSHORT

Quote:


> Originally Posted by *iARDAs*
> 
> Guys I can't find a 1070 anywhere on amazonr or newegg. Whats going on?


Yes, miners eating up the supply. Couple for sale in the OCN market place. Around $450 used though.


----------



## MEC-777

Quote:


> Originally Posted by *gtbtk*
> 
> Ethereum mining rigs


Yep. They are particularly good for this because you can lower the power limit to 60%, OC the memory to +600 and still get the same mining rate while consuming a lot less power (90w vs 150w). My 1070 is mining ETH now, basically 24/7.


----------



## gtbtk

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Ethereum mining rigs
> 
> 
> 
> Yep. They are particularly good for this because you can lower the power limit to 60%, OC the memory to +600 and still get the same mining rate while consuming a lot less power (90w vs 150w). My 1070 is mining ETH now, basically 24/7.
Click to expand...

My rig currently makes power supplies go off with a very large bang and release a bit of the magic smoke when you plug it in and turn it on, so I can't do that right now. I need to find a used motherboard to replace it.


----------



## MEC-777

Quote:


> Originally Posted by *gtbtk*
> 
> My rig currently makes power supplies make a very large bang and release a bit of the magic smoke if you plug it in and turn it on so I cant do that right now. Need to find a used motherboard to replace it.


Well... that's not good...


----------



## gtbtk

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> My rig currently makes power supplies make a very large bang and release a bit of the magic smoke if you plug it in and turn it on so I cant do that right now. Need to find a used motherboard to replace it.
> 
> 
> 
> Well... that's not good...
Click to expand...

No, it's a pain in the butt. Unfortunately Z68 and Z77 motherboards are rare second-hand here and still command a premium when you can find them. Hong Kong people realize how good Sandy Bridge CPUs really were, so they don't sell them often.


----------



## MEC-777

Quote:


> Originally Posted by *gtbtk*
> 
> no, It's a pain in the butt. Unfortunately Z68 and Z77 motherboards are rare second hand here and still rate a premium when you can find them used. Hong Kong people realize how good sandy bridge CPUs really were so they don't sell them often.


Sandy and Ivy Bridge stuff is kind of hard to find here in NA as well, and it holds its value. People don't like to let go of that which works well.


----------



## gtbtk

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> no, It's a pain in the butt. Unfortunately Z68 and Z77 motherboards are rare second hand here and still rate a premium when you can find them used. Hong Kong people realize how good sandy bridge CPUs really were so they don't sell them often.
> 
> 
> 
> Sandy and Ivy bridge stuff is kind of hard to find here in NA as well and hold their value. People don't like to let go of that which works well.
Click to expand...

Agreed. That's the reason I never bothered to invest money in upgrading to a more recent processor. When working, my 2600 (non-K) overclocked to 4.4GHz performs about the same as a Skylake i5-6500 in most things. A 7700K would be faster, but I don't think it will have the longevity of a Sandy Bridge CPU. Ryzen has its own challenges: single-core is about the same as what I have now, and memory latency still has its issues. X299 is ridiculous for my needs, although it would be nice. Better to wait for the Coffee Lake 6-cores, I think.


----------



## rfarmer

Quote:


> Originally Posted by *gtbtk*
> 
> Agreed. The reason why I never bothered to invest money upgrading to a more recent processor. When working, my 2600 non K overclocked to 4.4Ghz performs most things about the same as a skylake i5 6500K. A 7700K would be faster but I don't think that it will have the longevity of a sandy bridge CPU. Ryzen has its own challenges, single core is about the same as what I have now and memory latency still has its issues. x299 is ridiculous for my needs although it would be nice. Better to wait for coffee lake 6 cores i think.


I was all set to pick up that 6-core Coffee Lake, especially since there were early rumors it might show up in August. Now they are talking about an 1151 v2 socket for it; I was like, ffs, I just bought my Z270 and now I need another motherboard. More and more I am thinking of moving to AMD.


----------



## MEC-777

Quote:


> Originally Posted by *gtbtk*
> 
> Agreed. The reason why I never bothered to invest money upgrading to a more recent processor. When working, my 2600 non K overclocked to 4.4Ghz performs most things about the same as a skylake i5 6500K. A 7700K would be faster but I don't think that it will have the longevity of a sandy bridge CPU. Ryzen has its own challenges, single core is about the same as what I have now and memory latency still has its issues. x299 is ridiculous for my needs although it would be nice. Better to wait for coffee lake 6 cores i think.


Lol, yep. That's why I only just upgraded to an i7-4770K (from an i5-4570) a few months ago. No point upgrading the whole platform (motherboard, RAM, CPU, etc.), which would have cost me $800-1000 CAD, when the tangible performance differences are negligible at best (from Haswell to Sky/Kaby Lake). Instead it only cost me the difference between the price of the 2nd-hand 4770K and what I sold the i5 for - so about $170. lol







Good to go for another 3-4 years, I reckon.


----------



## rfarmer

Quote:


> Originally Posted by *MEC-777*
> 
> Lol, yep. That's why I only just upgraded to an i7-4770K (from an i5-4570) just a few months ago. No point upgrading the whole platform (mb, ram, CPU, etc.), which would have cost me $800-1000 CAD, if the tangible performance differences are negligible at best (from Haswell to Sky/Kabylake). Instead it only cost me the difference between the cost of the 2nd hand 4770K and what I sold the i5 for - so about $170. lol
> 
> 
> 
> 
> 
> 
> 
> Good to go for another 3-4 years, I recon.


Yeah I am kicking myself for ever upgrading from my 4690k, new mobo ram and CPU. Loved my Devil's Canyon, wish I still had it.


----------



## gtbtk

Quote:


> Originally Posted by *rfarmer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MEC-777*
> 
> Lol, yep. That's why I only just upgraded to an i7-4770K (from an i5-4570) just a few months ago. No point upgrading the whole platform (mb, ram, CPU, etc.), which would have cost me $800-1000 CAD, if the tangible performance differences are negligible at best (from Haswell to Sky/Kabylake). Instead it only cost me the difference between the cost of the 2nd hand 4770K and what I sold the i5 for - so about $170. lol
> 
> 
> 
> 
> 
> 
> 
> Good to go for another 3-4 years, I recon.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah I am kicking myself for ever upgrading from my 4690k, new mobo ram and CPU. Loved my Devil's Canyon, wish I still had it.
Click to expand...

Live and learn. The rig I had before the Sandy Bridge was a 3GHz Pentium D. That CPU was the worst CPU I have ever bought. A 7700K, in the scheme of things, is not that bad.


----------



## Jayhonda

So I've had my EVGA Hybrid 1070 for about 3 months and I love it. Now I have some time on my hands, but I always want to look into how I can avoid making expensive mistakes. Does anyone have a guide, or maybe an idea of how the teardown process goes? Any links help.


----------



## Madmaxneo

Quote:


> Originally Posted by *Jayhonda*
> 
> So I've had my Evga hybrid 1070 for about 3 months I love it but now i have some time on my hands but always want to look into how I can avoid making expensive mistakes does anyone have a guid or maybe an idea of how the tear down process goes? Please any links help


Correct me if I am wrong but isn't the hybrid already watercooled? If so then why would you want to tear it down?


----------



## gtbtk

Quote:


> Originally Posted by *Jayhonda*
> 
> So I've had my Evga hybrid 1070 for about 3 months I love it but now i have some time on my hands but always want to look into how I can avoid making expensive mistakes does anyone have a guid or maybe an idea of how the tear down process goes? Please any links help


Inquiring minds. I get it.

Make sure that you have some good thermal paste on hand before you pull the card to pieces.

Have a look on YouTube at the Gamers Nexus channel; he does graphics card teardowns. Watch some of those and you should have a good idea of how they go together. 1070 and 1080 cards are basically the same, so 1080 teardowns are just as educational as 1070 videos.


----------



## Jayhonda

Quote:


> Originally Posted by *gtbtk*
> 
> inquiring minds. I get it.
> 
> Make sure that you have some good thermal paste on hand before you pull the card to pieces.
> 
> Have a look on youtube at the gamers nexus channel, he does graphics cards teardowns, watch some of those and you should have a good idea of how they go together. 1070 and 1080 cards are basically the same so 1080 teardowns are just as educational at 1070 videos.


Thanks, yeah, I wanted to switch out the TIM and go with something a little higher end to get some better temps while trying to overclock. Thank you guys. I have seen the Gamers Nexus videos, but I wasn't sure if it was the same under the shroud. Thanks again OC.NET!


----------



## NiKiZ

Ordered an MSI GTX 1070 Armor OC a few days ago. It was the cheapest I could get that was in stock. (Damn Ethereum coin miners..) I wanted an Asus STRIX one, but it was 130-170€ more.. Oh well, at least the card matches my black and white color scheme.







Can't wait for it to arrive. The shipping time is 8 days. :/ Radeon HD7850 -> GTX 1070 should be a nice upgrade.


----------



## gtbtk

Quote:


> Originally Posted by *NiKiZ*
> 
> Ordered a MSI GTX 1070 Armor OC few days ago. It was the cheapest I could get and that was in stock. (Damn Etherium coin miners..) I wanted an Asus STRIX one, but it was 130-170€ more.. Oh well, at least the card matches my black and white color scheme.
> 
> 
> 
> 
> 
> 
> 
> Can't wait for it to arrive. The shipping time is 8 days. :/ Radeon HD7850 -> GTX 1070 should be a nice upgrade.


You can always cross-flash the Asus Strix OC bios to the card and get the same performance as the Strix card.

The cooler may be the only thing really limiting you, as it is not as heavy as the Gaming X/Strix coolers. It's a good choice if you want to upgrade the card to watercooling.


----------



## gtbtk

Quote:


> Originally Posted by *Jayhonda*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> inquiring minds. I get it.
> 
> Make sure that you have some good thermal paste on hand before you pull the card to pieces.
> 
> Have a look on youtube at the gamers nexus channel, he does graphics cards teardowns, watch some of those and you should have a good idea of how they go together. 1070 and 1080 cards are basically the same so 1080 teardowns are just as educational at 1070 videos.
> 
> 
> 
> Thanks ya i wanted to switch out the T.I.M. and go with something a little higher end and get some better temps while trying to over clock. Thank you guys I have seen Gamers videos but i'm not sure if it was the same under the shroud Thanks again OC.NET!!!!!!!!!!
Click to expand...

They all follow pretty much the same build principles: 4 screws for the cooler attachment and other smaller screws holding the backplate and peripheral parts together.


----------



## NiKiZ

Quote:


> Originally Posted by *gtbtk*
> 
> You can always cross flash the Asus Strix OC bios to the card and get the same performance as the strix card.
> 
> The cooler may be the only thing really limiting you as it is not as heavy as the Gaming X/strix coolers. Good choice if you want to upgrade the card to being watercooled


The cooler is the reason I wanted the STRIX card. Mainly for looks. Oh well, I guess I could sell the card and buy the STRIX one when I get more money. I know my friend would buy it from me.


----------



## NiKiZ

YES! My MSI 1070 Armor OC will arrive tomorrow!


----------



## rfarmer

Quote:


> Originally Posted by *NiKiZ*
> 
> YES! My MSI 1070 Armor OC will arrive tomorrow!


Nice, you were lucky to find one. Did you have to pay way too much for it?


----------



## NiKiZ

Quote:


> Originally Posted by *rfarmer*
> 
> Nice, you were lucky to find one. Did you have to pay way too much for it?


I got it for 430€, which was amazing! However, shortly after ordering it disappeared from the product selection and the prices went up, especially the ones that are in stock. For example, Asus GTX1070 Dual OC, 646,60€.


----------



## rfarmer

Quote:


> Originally Posted by *NiKiZ*
> 
> I got it for 430€, which was amazing! However, shortly after ordering it disappeared from the product selection and the prices went up, especially the ones that are in stock. For example, Asus GTX1070 Dual OC, 646,60€.


Yeah, same here in the US. The ones out of stock are like $399 - $469 and the ones in stock are $599 to $799, insane. These coin miners really screw it up for anyone just wanting to build a gaming PC.


----------



## EDK-TheONE

Which 1070 v-bios are compatible with zotac 1070 amp extreme (Samsung memory)?


----------



## EagleHawk

Anyone have good luck with ASUS RMA turnaround, like getting a working card back in 2 weeks? On my STRIX-GTX1070-O8G-GAMING the LEDs are all messed up: set to blue I get a red ROG logo, and set to white I get a yellow logo. This sucks.


----------



## drunkonpiss

Anyone using a 3440x1440 panel? I'm contemplating upgrading my card because I don't know if it would get a consistent 60 fps if I upgrade my monitor. Currently using a Palit JetStream 1070 + i7 4790K combo.


----------



## NiKiZ

It finally arrived!


----------



## skupples

hey folks!

Quick question on SLI'ing 10 series GPUs... How strict is the mix & matching these days?

Like - will I have issues trying to SLi an ACX 2.0 w/ a 3.0? I got lucky n found an EVGA 3.0 1070 for $389.99... While everyone else is sold out, or asking damn near 200% of MSRP. Tiger Direct has a few for under $500 as well.

hopefully its still like the old days where they just need to be on the same bios.


----------



## MEC-777

Quote:


> Originally Posted by *skupples*
> 
> hey folks!
> 
> Quick question on SLI'ing 10 series GPUs... How strict is the mix & matching these days?
> 
> Like - will I have issues trying to SLi an ACX 2.0 w/ a 3.0? I got lucky n found an EVGA 3.0 1070 for $389.99... While everyone else is sold out, or asking damn near 200% of MSRP. Tiger Direct has a few for under $500 as well.
> 
> hopefully its still like the old days where they just need to be on the same bios.


Brand and model don't matter, as long as it's the same GPU.


----------



## skupples

Quote:


> Originally Posted by *MEC-777*
> 
> Brand and model doesn't matter. As long as it's the same "GPU".


good deal, glad that hasn't changed.

thanks

+1

now to just dig a sli bridge out of the stash...


----------



## gtbtk

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Which 1070 v-bios are compatible with zotac 1070 amp extreme (Samsung memory)?


Any bios from EVGA, ASUS, MSI, Palit/Gainward and Gigabyte will work. The current range is 86.04.50.x.x and supports both Samsung and Micron memory. The Galax HOF and snipr bioses are not compatible.

Why are you thinking about cross-flashing? The standard bios will allow you to draw 300W. Make sure you create a backup of your original bios so you can flash back to stock later.

Having said that, the 300W doesn't really give you much performance benefit over other cards. You have a massive cooler; you may find that a bios that pulls less power runs much cooler and gives you better performance.

You might like to try out

Gigabyte xtreme

Gainward GLH

Asus Strix OC

MSI Gaming Z

EVGA FTW 226W (there are two different bioses on that card)


----------



## gtbtk

Quote:


> Originally Posted by *skupples*
> 
> hey folks!
> 
> Quick question on SLI'ing 10 series GPUs... How strict is the mix & matching these days?
> 
> Like - will I have issues trying to SLi an ACX 2.0 w/ a 3.0? I got lucky n found an EVGA 3.0 1070 for $389.99... While everyone else is sold out, or asking damn near 200% of MSRP. Tiger Direct has a few for under $500 as well.
> 
> hopefully its still like the old days where they just need to be on the same bios.


Any 1070 will SLI; the bios version doesn't even matter. Both EVGA cards will SLI fine.

The only thing is that you will get lowest-common-denominator performance. If they are both clocked the same, then it doesn't matter. Use the coolest-running card as the top card.


----------



## gtbtk

Quote:


> Originally Posted by *NiKiZ*
> 
> It finally arrived!


Xmas in June!!!!


----------



## gtbtk

Quote:


> Originally Posted by *drunkonpiss*
> 
> Anyone using a 3440x1440 panel? I'm contemplating to upgrade my card cause I don't know if it would get consistent 60 fps if I upgrade my monitor. Currently using a Palit Jetstream 1070+i7 4790k combo.


3440x1440 is roughly 60% of the pixel count of 4K, and a 1070 may struggle at Ultra settings; you would be better served with a 1080 or 1080 Ti.

A 1070 is probably fine at medium to high settings. Your CPU is more than adequate.


----------



## gtbtk

Quote:


> Originally Posted by *EagleHawk*
> 
> anyone have good luck on asus rma turn around like get a working one back in 2 weeks my STRIX-GTX1070-O8G-GAMING the leds are all messed up set to blue i get a red rog logo and sett to white i get a yellow logo this sucks


The LED power cable may not be seated properly. It sounds like one of the 3 channels is not working.


----------



## EagleHawk

Quote:


> Originally Posted by *gtbtk*
> 
> the led power cable may not be seated properly. It sounds like one of the 3 channels is not working


Thanks. When I get home I'll try a different cable; I doubt that's it, but it's worth a shot. Yes, 3 channels is correct; the bottom one (towards the motherboard) is a different color. I also wish I could repair the LEDs myself.


----------



## skupples

Quote:


> Originally Posted by *gtbtk*
> 
> Any 1070 will SLI. Doesn't even matter about bios version. Both EVGA cards will SLI fine
> 
> the only thing is that you will get lowest common denominator performance. If they are both clocked the same, then it doesn't matter. Use the coolest running card as the top card.


thanks, i'm glad to see not much has changed.

I got duped though; my card wasn't actually in stock, so it's back to the drawing board. I have some bids in on eBay for under $400.

damn you miners, damn you all to hell!

day trading is so much more profitable, and way more green


----------



## NiKiZ

Oh no..


----------



## blaze2210




----------



## skupples

This is why TigerDirect continues to slowly fail.

They sell product they know they don't have, then call you 12-24 hours later in an attempt at retaining your business.

Lead time is currently 2-3 weeks on their next load, so I'd guess everyone else is in a similar boat. HOPEFULLY they over-produce in the next couple batches so prices go right back to MSRP n not a penny more.

people on Ebay are pushing used units north of $500 -.-

honestly though, I don't even NEED it right now - I just wanted to buy myself something nice for my birthday. I currently use Steam In-Home Streaming to stream into my bedroom, but the hard line running from my switch to my back office is only pulling 1-2MB/s down, and barely 1MB/s up, when it should be closer to 150 down & 10 up, so that right there explains why my streaming performance is garbage.


----------



## NiKiZ

Well, it looks like I have to RMA my new card.. I have artifacting at factory clock speeds.. Also, one of the fans gets stuck and won't spin.


----------



## gtbtk

Quote:


> Originally Posted by *EagleHawk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> the led power cable may not be seated properly. It sounds like one of the 3 channels is not working
> 
> 
> 
> thanks when i get home i try a different cable which i doubt but worth a shot yes 3 channels is correct bottom towards mb is different color also wish i can repair the leds myself
Click to expand...

The other possibility is that the cable is on the wrong way around, and what the driver thinks is RGB is actually BGR.


----------



## gtbtk

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Any 1070 will SLI. Doesn't even matter about bios version. Both EVGA cards will SLI fine
> 
> the only thing is that you will get lowest common denominator performance. If they are both clocked the same, then it doesn't matter. Use the coolest running card as the top card.
> 
> 
> 
> thanks, i'm glad to see not much has changed.
> 
> I got duped though, my card wasn't actually in-stock, back to the drawing board. I have some bids in on eBay for under $400.
> 
> damn you miners, damn you all to hell!
> 
> day trading is so much more profitable, and way more green
Click to expand...

You know that you can get the cheapest reference-PCB card from any brand and just cross-flash the EVGA SC bios to it, so you get all the Precision XOC features like K-Boost etc.


----------



## gtbtk

Quote:


> Originally Posted by *NiKiZ*
> 
> Oh no..


Before you RMA the card, try increasing the VCCIO and VCCSA voltages a bit, that could be pcie controller instability and not a bad card.


----------



## comanzo

Hey guys, I know I should make a new thread on this, but in some games my GPU usage drops to 0 and makes me get 30-ish fps. What boggles my mind is that it doesn't happen in all games, and not even in Heaven or Valley unless it's changing scenes, which is normal (or at least I think it is). I already replaced the mobo and it still happens; does this sound like a GPU issue? CPU issue?

GPU - GTX 1070
CPU - i7-4790S
RAM - 12GB DDR3 @ 1600MHz
Games that stutter: Batman Arkham Asylum/City/Knight and Rainbow Six Siege. To my knowledge, every other game runs fine.


----------



## TerafloppinDatP

Quote:


> Originally Posted by *NiKiZ*
> 
> Well, it looks like I have to RMA my new card.. I have artifacting with factory clockspeeds.. Also one of the fans gets stuck and won't spin.


Sorry you're having problems! I had the same issue with an EVGA card; I saw that one of the fan wires was sticking out and stopping the fan from spinning. I had to get in there with needle-nose pliers and a thin screwdriver, but it was an easy fix.


----------



## gtbtk

Quote:


> Originally Posted by *skupples*
> 
> this is why tigerdirect continues to slowly fail.
> 
> They sell product they know they don't have, then call you 12-24 hours later in an attempt @ retaining your business.
> 
> Lead time is currently 2-3 weeks on their next load, so I'd guess everyone else is in a similar boat. HOPEFULLY they over-produce in the next couple batches so prices go right back to MSRP n not a penny more.
> 
> people on Ebay are pushing used units north of $500 -.-
> 
> honestly though, i don't even NEED it right now - I just wanted to buy myself something nice for my birthday. I currently use in-home steam stream to stream into my bedroom, but the hard line running from my switch to my back office is only pulling 1-2MB/s down, n barely 1MB/s up, when it should be closer to 15o down & 10 up, so that right there explains why my streaming performance is garbage.


If you copy a file between two PCs at home across your LAN, are you still only getting 1MB/s (that is about 8 megabits per second)?

Is your switch a 10Mbit, 100Mbit or Gigabit switch? Have you tried restarting the switch? Sometimes a port on the switch can drop into go-slow mode or disable itself; restarting can help.

Are you using at least Cat 5e or Cat 6 cables if you have a gigabit switch? Anything Cat 5 or below will reduce gigabit speeds to much lower.

Try re-seating each patch cable, or move each connection to a new switch port: from the internet router to the switch, and from the switch to your office.

Some of the newer Intel network drivers have diagnostics built in that you can access from the driver properties; you can test the cable/signal quality. You would need to test both connections to your switch.

If you unplug or turn off all other hard-wired devices on your network, do you still get really slow speeds? It could be another device with a faulty adapter on your LAN.
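A common trap in checks like these is mixing up MB/s (megabytes per second, what file copies report) with Mbit/s (megabits per second, how link speeds are quoted). A quick sketch of the conversion, using the 1 MB/s figure mentioned above:

```python
# Convert between MB/s (what a file copy dialog shows) and
# Mbit/s (how network links are rated): 1 byte = 8 bits.

def mbytes_to_mbits(mb_per_s: float) -> float:
    """Megabytes per second -> megabits per second."""
    return mb_per_s * 8

def seconds_to_copy(file_mb: float, rate_mb_per_s: float) -> float:
    """Rough transfer time in seconds for a file of file_mb megabytes."""
    return file_mb / rate_mb_per_s

# The ~1 MB/s observed above is only ~8 Mbit/s; a healthy gigabit
# link should sustain on the order of 100+ MB/s in practice.
print(mbytes_to_mbits(1.0))       # 8.0
print(seconds_to_copy(700, 1.0))  # 700.0 seconds for a 700 MB file
```

So a link reporting 1 MB/s is running at well under 1% of gigabit capacity, which points at a cable, port, or negotiation problem rather than normal overhead.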


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Hey guys, I know I should make a new thread on this, but in some games, my gpu usage drops to 0, and makes me get 30 ish fps. What boggles my mind is that it doesn't happen in all games, and not even heaven or valley unless it's changing scenes, which is normal(or at least I think it is). I already replaced the mobo. and still happens, does this sound like a gpu issue? Cpu issue?
> 
> Gpu-gtx 1070
> Cpu-i7-4790s
> Ram-12gb ddr3 @1600mhz
> Games that stutter:
> Batman arkham asylum/city/knight and rainbow six siege. To my knowlege, every other game runs fine.


Check your power plan on the PC. Try High Performance mode or, alternatively, reset the power mode back to the defaults. You may be hitting core parking, or the power plan is putting something to sleep.

If your PC is sitting idle, is the CPU at or near 0% load?

My laptop has a buggy driver (MacBook Pro 15" Retina running Windows 10) that makes it throw loads of system interrupts at core 0 (i7-4770HQ) and makes other things sluggish until I put the PC to sleep and wake it up again.


----------



## NiKiZ

Quote:


> Originally Posted by *gtbtk*
> 
> Before you RMA the card, try increasing the VCCIO and VCCSA voltages a bit, that could be pcie controller instability and not a bad card.


Thanks for the tip! How much should I increase those? I was going to test the card on my friend's computer before I RMA it, but he isn't at home this weekend. My motherboard has some issues with onboard audio (popping/crackling), so I wouldn't be surprised if the problem was with my motherboard.

Also, I am going to format my Steam SSD and install a fresh copy of Windows on it and see if the problems still exist.


----------



## gtbtk

Quote:


> Originally Posted by *NiKiZ*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Before you RMA the card, try increasing the VCCIO and VCCSA voltages a bit, that could be pcie controller instability and not a bad card.
> 
> 
> 
> Thanks for the tip! How much should I increase those? And I was going to test the card on my friends computer before I RMA it, however he isn't at home this weekend. My motherboard has some issues with onboard audio, (popping/crackling) so I wouldn't be surprised if the problem was with my motherboard.
Click to expand...

If you hit the + key, it should increase in steps. Try one step at a time for each. I think the max for Z170 is 1.25V for both.

Ideally you want to keep them as low as possible while you still get stability.

I assume it is motherboard audio? Have you done the update-bios, update-audio-drivers thing? You could also try increasing the PCH voltage by 0.01 or 0.015V. That probably won't help the graphics, but it may solve your audio crackling problem.


----------



## comanzo

I set the power plan back to default since High Performance doesn't seem to be available unless I create my own custom plan. I am going to try to game a little bit with this power option and see what happens. If it doesn't work, I will try creating a custom power plan with High Performance. Either way, I will report on my findings from this suggestion tomorrow, as it's late over here in the States. Thanks for your help.


----------



## NiKiZ

Quote:


> Originally Posted by *gtbtk*
> 
> If you hit the + key, it should increase in steps. Try one step at a time for each. I think max for Z170 is 1.25v for both
> 
> Ideally you want to keep them as low as possible while you still get stability.
> 
> I assume it is motherboard audio? Have you done the update bios, update audio drivers thing? You could also try increasing PCH voltage by 0.01 or 0.015v. That probably wont help the graphics, but it may solve your Audio crackling problem.


Thanks! I reinstalled Windows on my secondary SSD and now I don't seem to have any problems with the graphics. I'll install some games and test more.

My old Windows installation seems to have been a bit clogged up: Cinebench scored 711 on my old installation and 803 on the fresh installation.









EDIT: Yup, the problem was my Windows installation. I had made a few hardware changes since I installed it, so that might be why it was a bit screwed up. I reinstalled Windows and configured my SSDs in RAID 0 while I had the chance. Now everything is working correctly! I also adjusted the VCCIO, VCCSA and PCH voltages, and the audio crackling problem is gone too! However, I still have some stuttering issues when playing GTA V. :/

EDIT2: Nope, the problem came back after I installed MSI Afterburner, even though I didn't change anything.. The problem still exists after uninstalling it.


----------



## tiramoko

Off topic: why doesn't Amazon have the GTX 1070 in stock? Even 1080s are too expensive. I bought my 1080 3 months ago, but I had to return it. Does anyone know when Amazon will stock their cards again?


----------



## blaze2210

Quote:


> Originally Posted by *tiramoko*
> 
> out of Topic. Why amazon doesn't have gtx 1070 on stock? Even 1080s are too expensive. I bought my 1080 3 months ago, but I had to return it. Does anyone knows when Amazon will stock their cards again?


The 1070 is great for mining, so it's basically either sold out everywhere or being sold for some crazy inflated price.


----------



## NiKiZ

Lowering the core clock by 51MHz (to 1506MHz, the reference clock speed) seems to work for now. The factory OC didn't seem stable at all. Well, I'll contact the seller and make it clear that when I RMA it, I want a replacement card and NOT a refund. Otherwise I need to spend 100-200€ more to get a new card..


----------



## NiKiZ

I don't think the voltage should show mV?

Idle:

While running MSI Kombustor: (You can see some artifacts here, despite the card being underclocked a bit..)

Also VID usage shows 0%:


----------



## Performer81

Does anybody know the difference between PCB revision Rev 3.0 and rev 6.0 with MSI 1070 cards?


----------



## NiKiZ

My 1070 is weird.. Sometimes it works correctly with the factory OC, and sometimes it artifacts at a -100MHz underclock.. Oh well, tomorrow I can test the card on my friend's computer to see if the card really is defective or not. My old HD7850 behaves a bit similarly, but with a lot less instability; the HD7850 did not artifact, but the image on my secondary monitor did turn into a red/green checkerboard-like pattern. I don't think my Corsair RM750i PSU would be the problem, as it shows a solid 12.00V on the 12V rail under load. My motherboard is the main suspect right now because I have some audio problems with it too..


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> I set the power plan back to default since high performance doesn't seem to be availiable unless I create my own custom plan. I am going to try to game a little bit with this power option and see what happens. If it doesn't work, I will try creating a custom power plan with high performance. Either way, I will report on my findings from this suggestion tomorrow as it's late over here in the states. Thanks for your help.


Are you running an Insiders edition of windows 10?

I am and I have to create a new high performance plan. The balance plan is the only one available by default.

The difference between the two is in the way Windows deals with CPU core parking, which can hinder performance, and in how far it allows the CPU frequency to drop at idle.


----------



## gtbtk

Quote:


> Originally Posted by *NiKiZ*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> If you hit the + key, it should increase in steps. Try one step at a time for each. I think max for Z170 is 1.25v for both
> 
> Ideally you want to keep them as low as possible while you still get stability.
> 
> I assume it is motherboard audio? Have you done the update bios, update audio drivers thing? You could also try increasing PCH voltage by 0.01 or 0.015v. That probably wont help the graphics, but it may solve your Audio crackling problem.
> 
> 
> 
> Thanks! I reinstalled Windows on my secondary SSD and now I don't seem to have any problems with the graphics. I'll install some games and test more.
> 
> My old Windows installation seems to be a bit clogged up.. Cinebench scored 711 on the old installation and 803 on the fresh one.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Yup, the problem was my Windows installation. I had made a few hardware changes since I installed it, so that might be why it was a bit screwed up. I reinstalled Windows and configured my SSDs in RAID 0 while I had the chance. Now everything is working correctly! Also adjusted the VCCIO, VCCSA and PCH voltages and the audio crackling problem is gone too! However, I still have some stuttering issues when playing GTA V. :/
> 
> EDIT2: Nope, the problem came back after I installed MSI afterburner and didn't change anything.. The problem still exists after uninstalling it.
Click to expand...

Make sure that you are running the latest vbios on the card. Your problem sounds similar to the micron memory bug that was present in the early days of the 1070. If your card has Micron memory and has been on a shelf for a long time, maybe it has an older version bios installed? The bug fix version for any of the cards should be 86.04.50.00.x displayed in GPU-Z. If the current bios is 86.04.26.00.x you definitely need to do the update.
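If anyone wants to script that version check against what GPU-Z reports, a quick sketch might look like this. (Just an illustration: the dotted-fields-plus-board-suffix format and the `has_micron_fix` helper name are my own assumptions, not an official tool.)

```python
# Sketch only: compare a GPU-Z style vBIOS string against the
# 86.04.50.00 Micron-fix release. Assumes dotted numeric fields
# followed by a per-board suffix (e.g. ".1A") that we ignore.

def has_micron_fix(vbios: str, fix: str = "86.04.50.00") -> bool:
    """True if the card's vBIOS is at or above the bug-fix version."""
    def fields(v: str) -> list:
        out = []
        for part in v.split("."):
            if part.isdigit():
                out.append(int(part))
            else:
                break  # stop at the non-numeric board suffix
        return out
    return fields(vbios) >= fields(fix)

print(has_micron_fix("86.04.26.00.1A"))  # older BIOS: False
print(has_micron_fix("86.04.50.00.0B"))  # fixed BIOS: True
```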

GTA V is famous for stuttering with Intel CPUs. It does it even with 5960X 8 core CPUs.

Something that some people have said helps the GTA V stuttering issue that you could try: open the NVIDIA control panel and, in the 3D settings tab for GTA V, set it to High Performance mode, turn Shader Cache off and turn Threaded Optimisation on.


----------



## NiKiZ

Quote:


> Originally Posted by *gtbtk*
> 
> Make sure that you are running the latest vbios on the card. Your problem sounds similar to the micron memory bug that was present in the early days of the 1070. If your card has Micron memory and has been on a shelf for a long time, maybe it has an older version bios installed? The bug fix version for any of the cards should be 86.04.50.00.x displayed in GPU-Z. If the current bios is 86.04.26.00.x you definitely need to do the update.
> 
> GTA V is famous for stuttering with Intel CPUs. It does it even with 5960X 8 core CPUs.
> 
> Something that some people have said helps the GTA V stuttering issue that you could try is to open the nvidia control panel and in the 3D settings tab for GTA V, Set it high performance mode, turn Shader Cache = off and turn Threaded Optimisation = On


This card does indeed have Micron memory. I did update the BIOS when I first experienced problems, but it didn't do anything.


----------



## gtbtk

Quote:


> Originally Posted by *NiKiZ*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Make sure that you are running the latest vbios on the card. Your problem sounds similar to the micron memory bug that was present in the early days of the 1070. If your card has Micron memory and has been on a shelf for a long time, maybe it has an older version bios installed? The bug fix version for any of the cards should be 86.04.50.00.x displayed in GPU-Z. If the current bios is 86.04.26.00.x you definitely need to do the update.
> 
> GTA V is famous for stuttering with Intel CPUs. It does it even with 5960X 8 core CPUs.
> 
> Something that some people have said helps the GTA V stuttering issue that you could try is to open the nvidia control panel and in the 3D settings tab for GTA V, Set it high performance mode, turn Shader Cache = off and turn Threaded Optimisation = On
> 
> 
> 
> This card does indeed have Micron memory. I did update the BIOS when I first experienced problems, but it didn't do anything.
Click to expand...

I was the one who identified the Micron bug so I know all about the artifacts. I had similar issues to what you are describing, after the new BIOS, if the memory was clocked to +600. The artifacts you are getting are green or magenta in localized parts of the screen?

I was running a Z68 board with a Sandy Bridge CPU though. VCCIO helped. I could not adjust VCCSA on my board; CPU PLL was another setting that I increased slightly. My MB has died since (not OC related), so I can't check it, but from memory VCCIO was increased from the 1.05 default to just over 1.1v, and CPU PLL from 1.8 to about 1.813v. If you increase the values in the BIOS using the + key, they were raised about 4 or 5 increments out of the 30 or so available. It did not need very much to stabilize everything. The CPU PLL voltage also helped with a hole in my GPU overclock: without the adjustment, the curve would hit a ceiling at +50 at the 1.0v point while the rest of the curve would go much higher. Increasing the PLL voltage a bit allowed the 1.0v point to go higher as well.
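To make the "ceiling at one point of the curve" idea concrete, here's a toy sketch of a flat offset applied to a voltage/frequency curve where a single point refuses to move past a per-point cap. All clocks and voltages here are made up for illustration, not readings from gtbtk's card.

```python
# Toy model of the curve "hole": a flat offset applied to a V/F
# curve, except one point (1.000 V here) is capped at +50 and
# will not follow the rest of the curve upward.

base = {0.900: 1823, 0.950: 1873, 1.000: 1911, 1.050: 1949}  # V -> MHz
caps = {1.000: 1911 + 50}  # the stuck point only accepts +50

def apply_offset(curve: dict, offset: int, point_caps: dict) -> dict:
    """Add `offset` to every point, honoring any per-point ceiling."""
    return {v: min(f + offset, point_caps.get(v, f + offset))
            for v, f in curve.items()}

oc = apply_offset(base, 100, caps)
print(oc)  # every point gains +100 except 1.000 V, stuck at 1961
```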


----------



## NiKiZ

Quote:


> Originally Posted by *gtbtk*
> 
> I was the one who identified the micron bug so I know all about the artifacts. I had the similar issues as you are complaining about after the new bios if the memory was clocked to +600 - the artifacts you are getting are green or magenta in localized parts of the screen?
> 
> I was running z68 with a sandy bridge PC though. VCCIO helped. I could not adjust VCCSA on my board, CPU PLL was another setting that i increased slightly. My MB has died since (not OC related), so I cant check it but from memory, vccio was increased from 1.05 default to just over 1.1v and CPU PLL was increased from 1.8 to 1.813v i think. If you increase the values in the bios using the + key, they were increased about 4 or 5 increments out of the 30 or so available. It did not need very much to stabilize everything. The CPU PLL voltage helped with a hole in my gpu overclock. without the adjustment the curve would hit a ceiling at +50 at 1.0v on the curve. the rest of the curve would go much higher. Increasing the PLL voltage a bit allowed 1.0 to go higher as well.


I tried to increase the VCCIO and VCCSA voltages, but it didn't do anything. I stopped at 1.2-1.25 volts when the color turned to pink. (White -> Yellow -> Pink) Standard VCCIO was 0.950V and VCCSA 1.050V.

Here's a better look at the artifacts:


Artifacts even at 250MHz underclock:


I have to run the card at 300MHz underclock, about 250MHz under the reference clock..


----------



## MrTOOSHORT

Quote:


> Originally Posted by *NiKiZ*
> 
> I tried to increase the VCCIO and VCCSA voltages, but it didn't do anything. I stopped at 1.2-1.25 volts when the color turned to pink. (White -> Yellow -> Pink) Standard VCCIO was 0.950V and VCCSA 1.050V.
> 
> Here's a better look at the artifacts:
> 
> 
> Artifacts even at 250MHz underclock:
> 
> 
> I have to run the card at 300MHz underclock, about 250MHz under the reference clock..


If you haven't tried the card in your friend's PC yet, clean out the PCI-E slot and the card's contacts with a toothbrush and isopropyl alcohol.


----------



## NiKiZ

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> If you haven't tried the card in your friend's pc yet
> 
> clean out the pci-E slot and teeth of the card with a tooth brush and isopropyl alcohol.


I did clean the slots and the problem still exists. I tried another slot too and it behaved the same way. My friend should be home soon.


----------



## NiKiZ

Well, it is confirmed. The card is defective; it had artifacts on my friend's computer too. Though he was very impressed with the performance when it ran DOOM at Ultra 1080p settings at 100-150 FPS despite being underclocked by 400MHz.







I sent the RMA email, just need to wait for the reply so I can send the card back.


----------



## gtbtk

Quote:


> Originally Posted by *NiKiZ*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I was the one who identified the micron bug so I know all about the artifacts. I had the similar issues as you are complaining about after the new bios if the memory was clocked to +600 - the artifacts you are getting are green or magenta in localized parts of the screen?
> 
> I was running z68 with a sandy bridge PC though. VCCIO helped. I could not adjust VCCSA on my board, CPU PLL was another setting that i increased slightly. My MB has died since (not OC related), so I cant check it but from memory, vccio was increased from 1.05 default to just over 1.1v and CPU PLL was increased from 1.8 to 1.813v i think. If you increase the values in the bios using the + key, they were increased about 4 or 5 increments out of the 30 or so available. It did not need very much to stabilize everything. The CPU PLL voltage helped with a hole in my gpu overclock. without the adjustment the curve would hit a ceiling at +50 at 1.0v on the curve. the rest of the curve would go much higher. Increasing the PLL voltage a bit allowed 1.0 to go higher as well.
> 
> 
> 
> I tried to increase the VCCIO and VCCSA voltages, but it didn't do anything. I stopped at 1.2-1.25 volts when the color turned to pink. (White -> Yellow -> Pink) Standard VCCIO was 0.950V and VCCSA 1.050V.
> 
> Here's a better look at the artifacts:
> 
> 
> Artifacts even at 250MHz underclock:
> 
> 
> I have to run the card at 300MHz underclock, about 250MHz under the reference clock..
Click to expand...

I have not seen artifacting like that before. It seems that no matter what setting you use, you get the same results.

Have you tried to use a different monitor cable?


----------



## NiKiZ

Quote:


> Originally Posted by *gtbtk*
> 
> I have not seen artifacting like that before. It seems that no matter what setting you use, you get the same results.
> 
> Have you tried to use a different monitor cable?


If it was the monitor cable, the artifacts shouldn't show up in a screenshot. It also artifacted on my friend's computer with a different monitor. Underclocking the GPU core by 400MHz gets rid of the artifacts. But I contacted the store and I am waiting for a reply so I can send the card back.

When playing games without underclocking, anything above 60 FPS causes artifacts. The higher the FPS, the more artifacts. For example, Minecraft has very noticeable artifacts:


----------



## comanzo

Quote:


> Originally Posted by *NiKiZ*
> 
> If it was the monitor cable, the artifacts shouldn't show up in a screenshot. It also artifacted on my friends computer with a different monitor. Underclocking the GPU Core by 400MHz gets rid of the artifacts. But I contacted the store and I am waiting for a reply so I can send the card back.
> 
> When playing games without underclocking, anything above 60 FPS causes artifacts. Higher the FPS, more artifacts. For example, Minecraft has very noticeable artifacts:


It looks like your sword is shooting out bullets. xD but in all seriousness, what store are you contacting? Who is the manufacturer for your card?


----------



## gtbtk

Quote:


> Originally Posted by *NiKiZ*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I have not seen artifacting like that before. It seems that no matter what setting you use, you get the same results.
> 
> Have you tried to use a different monitor cable?
> 
> 
> 
> If it was the monitor cable, the artifacts shouldn't show up in a screenshot. It also artifacted on my friends computer with a different monitor. Underclocking the GPU Core by 400MHz gets rid of the artifacts. But I contacted the store and I am waiting for a reply so I can send the card back.
> 
> When playing games without underclocking, anything above 60 FPS causes artifacts. Higher the FPS, more artifacts. For example, Minecraft has very noticeable artifacts:
Click to expand...

Yes, you are right, of course a screenshot won't show a cable problem. No sleep for 24 hours when I typed that and I didn't think it through properly.

Well, we gave it a try. It looks like the card is faulty and it's time for an RMA.


----------



## NiKiZ

Quote:


> Originally Posted by *comanzo*
> 
> It looks like your sword is shooting out bullets. xD but in all seriousness, what store are you contacting? Who is the manufacturer for your card?


I ordered the card from Tietokonekauppa.fi. The card is MSI GTX 1070 Armor OC.

Quote:


> Originally Posted by *gtbtk*
> 
> Yes you are right, of course a screen shot wont show a cable problem. No sleep for 24 hours when I typed that and I didn't think it through properly.
> 
> Well we gave it a try. It looks like it may be faulty and time for an RMA.


Yeah. I contacted the store and they replied that I can send the card to them, and they have reserved a replacement card for me. As soon as they have verified the card is indeed defective, they will send the new one.


----------



## skupples

Quote:


> Originally Posted by *gtbtk*
> 
> If you copy a file between two PCs at home across your lan are you still only getting 1MB/s (that is about 10 megabits)?
> 
> Is your switch a 10Mbit, 100Mbit or Gigabit switch? Have you tried restarting the switch? sometimes a port on the switch can drop into go slow mode or disable itself. restarting can help.
> 
> Are you using at least Cat 5E or cat 6 cables if you have a gigabit switch? Anything Cat 5 or below will reduce gigabit speeds to much lower.
> 
> Try re-seating each patch cable, or move each connection to a new switch port: the cable from the internet router to the switch, and the patch cables from the switch to your office.
> 
> Some of the newer Intel Network drivers have diagnostics built in that you can access from the driver properties. You can test the cable/signal quality. You would need to test both connections to your switch.
> 
> If you unplug or turn all other hard wired devices on your network off, do you still get really slow speeds. It could be another device with a faulty adapter on your lan


I've narrowed the issue down to the line from my gigabit switch to the back room. Any other non-custom-made cable gets the speeds it should. I borrowed a roll of Cat 6 from work, and will hopefully get it done this week. The wire I ran was incredibly cheap, flimsy, and terrible. I got 500 feet for like $3.

I'm stupidly colorblind, so I require assistance when making cables


----------



## comanzo

Can anyone give me any suggestions for why my GPU usage drops to 0% randomly? I switched out the mobo already, and nada. Maybe this is a bent mobo pin issue? I already tried the Windows power plan settings to no avail. I just changed the page file size to see what happens. If anyone can give any further suggestions, that would be appreciated. I have already listed my specs in an earlier post. Thanks.

Edit: I can't even play wizard101 without my gpu usage dropping to 0% for god sakes.


----------



## gtbtk

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> If you copy a file between two PCs at home across your lan are you still only getting 1MB/s (that is about 10 megabits)?
> 
> Is your switch a 10Mbit, 100Mbit or Gigabit switch? Have you tried restarting the switch? sometimes a port on the switch can drop into go slow mode or disable itself. restarting can help.
> 
> Are you using at least Cat 5E or cat 6 cables if you have a gigabit switch? Anything Cat 5 or below will reduce gigabit speeds to much lower.
> 
> Try re-seating each patch cable, or move each connection to a new switch port: the cable from the internet router to the switch, and the patch cables from the switch to your office.
> 
> Some of the newer Intel Network drivers have diagnostics built in that you can access from the driver properties. You can test the cable/signal quality. You would need to test both connections to your switch.
> 
> If you unplug or turn all other hard wired devices on your network off, do you still get really slow speeds. It could be another device with a faulty adapter on your lan
> 
> 
> 
> I've narrowed the issue down to the line from my gigabit switch to the back room. Any other non-custom-made cable gets the speeds it should. I borrowed a roll of Cat 6 from work, and will hopefully get it done this week. The wire I ran was incredibly cheap, flimsy, and terrible. I got 500 feet for like $3.
> 
> I'm stupidly colorblind, so I require assistance when making cables
Click to expand...

I guess that you get what you paid for.

Be aware that UTP Ethernet cable runs are limited to 100m (328 feet) from the switch. If you need a longer run, you need to put a gigabit switch in the middle of the run so that each separate cable length stays within spec.

Termination of Cat 6 cables is also something that requires precision, as the twisted pairs need to stay tightly twisted right up to the connector. Ideally, the assistance you get is someone with the right termination tool and some experience terminating cabling.
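For anyone measuring out a long run, a trivial sketch of that 100m spec check (the feet-to-meters conversion is the whole trick; the run lengths below are examples, not anyone's actual cabling):

```python
# Sketch of the 100 m channel-length rule for copper Ethernet.
# Run lengths below are illustrative examples.

MAX_CHANNEL_M = 100.0
FT_PER_M = 3.28084

def run_ok(length_ft: float) -> bool:
    """True if a single switch-to-device run fits the 100 m spec."""
    return length_ft / FT_PER_M <= MAX_CHANNEL_M

for run_ft in (100, 250, 400):
    verdict = "within spec" if run_ok(run_ft) else "needs a switch mid-run"
    print(run_ft, "ft:", verdict)
```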


----------



## NiKiZ

Well, I sent the card to RMA and they don't seem to know what GPU-Boost is.. They asked what BIOS I used to update the card as the core frequency was around 2000MHz under load. They determined that the card is fine when underclocked. Oh well, let's see what happens. I tried explaining what GPU-Boost is and put the link to the BIOS file I used. (Which is from MSI website.) And that 1900-2000MHz is normal for 1070 cards.


----------



## MEC-777

Quote:


> Originally Posted by *NiKiZ*
> 
> Well, I sent the card to RMA and they don't seem to know what GPU-Boost is.. They asked what BIOS I used to update the card as the core frequency was around 2000MHz under load. They determined that the card is fine when underclocked. Oh well, let's see what happens. I tried explaining what GPU-Boost is and put the link to the BIOS file I used. (Which is from MSI website.) And that 1900-2000MHz is normal for 1070 cards.


This is one of the problems you run into with some retailers - the people working there aren't always knowledgeable with the products they sell and it can be frustrating. They should know what GPU-Boost is and they should at least acknowledge that it's unacceptable to have to underclock a GPU to stop it from artifacting. The card is obviously faulty and they should give you a replacement. Hope they do...


----------



## skupples

Quote:


> Originally Posted by *gtbtk*
> 
> I guess that you get what you paid for.
> 
> Be aware that UTP ethernet cable runs are limited to 100m in length (328 feet) from the switch. If you need a longer run, you need to put a gigabyte hub or switch in the middle of the run so that each separate cable run length remains within spec.
> 
> Termination of Cat 6 Cables is also something that requires precision as the twisted pairs need to be kept tightly twisted right up to the connector. Ideally the assistance you get is someone with the right termination tool and some experience terminating cabling.


The longest run is ~100 feet, but it has to make quite a few twists and turns to get to its destination. The attic isn't an option, unfortunately. I also have everything running from a 24-port gigabit switch that I acquired from work during our last refresh.

As to the card.
I thought you preferred curve OC via MSI AB over PRECX? Or did I miss the part where PRECX has the same feature?


----------



## comanzo

Hey guys. Anyone have any other suggestions for the sudden GPU usage drops to 0%? I already set the Windows power options to highest performance, as well as the equivalent setting in the NVIDIA control panel. Any other suggestions would be appreciated.


----------



## comanzo

Quote:


> Originally Posted by *gtbtk*
> 
> Are you running an Insiders edition of windows 10?
> 
> I am and I have to create a new high performance plan. The balance plan is the only one available by default.
> 
> The difference between the two is in the way Windows deals with CPU core parking, which can hinder performance, and in how far it allows the CPU frequency to drop at idle.


Hey man. Sorry I didn't reply to this message earlier; I honestly didn't see it as it didn't pop up as a notification. To answer your question, I am running Windows 10 Home.


----------



## NiKiZ

Quote:


> Originally Posted by *MEC-777*
> 
> This is one of the problems you run into with some retailers - the people working there aren't always knowledgeable with the products they sell and it can be frustrating. They should know what GPU-Boost is and they should at least acknowledge that it's unacceptable to have to underclock a GPU to stop it from artifacting. The card is obviously faulty and they should give you a replacement. Hope they do...


I explained GPU Boost to them and suggested that they test a known working card and see how it behaves. They replied that I was right: they tested the replacement card, it was working like it should, and it boosted just like my defective card. The BIOS versions matched, so I didn't flash a wrong BIOS like they thought I did.. They sent the known working card; it should arrive tomorrow.


----------



## Nukemaster

Quote:


> Originally Posted by *NiKiZ*
> 
> I explained GPU Boost to them and suggested that they test a known working card and see how it behaves. They replied that I was right: they tested the replacement card, it was working like it should, and it boosted just like my defective card. The BIOS versions matched, so I didn't flash a wrong BIOS like they thought I did.. They sent the known working card; it should arrive tomorrow.


Good. My Galaxy 670 boosted into instability too. I got an RMA without issues, but I just sent them [email protected] screens since it was unstable unless underclocked.


----------



## MEC-777

Quote:


> Originally Posted by *NiKiZ*
> 
> I explained GPU Boost to them and suggested that they test a known working card and see how it behaves. They replied that I was right: they tested the replacement card, it was working like it should, and it boosted just like my defective card. The BIOS versions matched, so I didn't flash a wrong BIOS like they thought I did.. They sent the known working card; it should arrive tomorrow.


Nice! Glad you got it sorted.


----------



## gtbtk

Quote:


> Originally Posted by *NiKiZ*
> 
> Well, I sent the card to RMA and they don't seem to know what GPU-Boost is.. They asked what BIOS I used to update the card as the core frequency was around 2000MHz under load. They determined that the card is fine when underclocked. Oh well, let's see what happens. I tried explaining what GPU-Boost is and put the link to the BIOS file I used. (Which is from MSI website.) And that 1900-2000MHz is normal for 1070 cards.


It does not exactly instill you with a lot of confidence, does it?


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Are you running an Insiders edition of windows 10?
> 
> I am and I have to create a new high performance plan. The balance plan is the only one available by default.
> 
> The difference between the two is in the way Windows deals with CPU core parking, which can hinder performance, and in how far it allows the CPU frequency to drop at idle.
> 
> 
> 
> Hey man. Sorry I didn't reply to this message earlier; I honestly didn't see it as it didn't pop up as a notification. To answer your question, I am running Windows 10 Home.
Click to expand...

I would suggest that you use DDU to completely uninstall the nvidia drivers and then do a fresh install of the latest drivers


----------



## comanzo

Quote:


> Originally Posted by *gtbtk*
> 
> I would suggest that you use DDU to completely uninstall the nvidia drivers and then do a fresh install of the latest drivers


ok. will do. I will report back on results when I can. Thanks for helping me out.


----------



## NiKiZ

The replacement card is working! It has been mining for 16 hours now at 100% GPU usage and I had no problems. The first card had problems after gaming a bit and I heard some strange pops inside my computer.


----------



## Performer81

I also have a new MSI Armor 1070 and I am very impressed. I bought it one day before it sold out and prices skyrocketed.
I undervolted to 0.95V and overclocked to 2037 core and 9000 memory, and there is almost no difference between idle and load noise; temps stay at around 60 degrees.


----------



## NiKiZ

Quote:


> Originally Posted by *Performer81*
> 
> I also have a new MSI Armor 1070 and I am very impressed. I bought it one day before it sold out and prices skyrocketed.
> I undervolted to 0.95V and overclocked to 2037 core and 9000 memory, and there is almost no difference between idle and load noise; temps stay at around 60 degrees.


Yeah, the fans only start spinning above 60 degrees; very quiet card. I can't hear it over the H110i GT pump, which is pretty loud. I am thinking about replacing it.


----------



## gtbtk

Quote:


> Originally Posted by *NiKiZ*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Performer81*
> 
> I also have a new MSI Armor 1070 and i am very impressed. Bought it also one day before it was sold out and prices went skyrocket.
> I undervolted to 0,95V and overclocked to 2037 Core and 9000 Memory and there is about no difference between idle and load noise and temps stay at around 60 degrees.
> 
> 
> 
> Yeah, the fans only start spinning after 60 degrees, very quiet card. I can't hear it over the H110i GT pump.. Which is pretty loud. I am thinking about replacing it.
Click to expand...

The problem with the auto fans is that the card idles at 59 deg. If you run the fans at 20% at idle, it will sit at about 35 deg. Start the ramp-up at 40-45 deg and you will get even better performance under lighter loads. You can set max fans at whatever your ears are comfortable with. My Gaming X at 100% is audible but not terrible; at 1.050v it will run at 53 deg and still score over 21000 graphics in Firestrike at 2076MHz. 1.093v gets me slightly more and the card runs at 58 deg.
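The manual-curve idea is just linear interpolation between anchor points instead of the stock zero-RPM-until-60-deg behaviour. A rough sketch (the anchor values here are illustrative, not MSI's defaults or anyone's exact curve):

```python
# Rough sketch of a manual fan curve: linear ramp between anchor
# points. Anchor values are illustrative only.

CURVE = [(30, 20), (45, 35), (60, 60), (80, 100)]  # (deg C, fan %)

def fan_percent(temp_c: float) -> float:
    if temp_c <= CURVE[0][0]:
        return float(CURVE[0][1])  # floor below the first anchor
    for (t0, p0), (t1, p1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # linear interpolation between adjacent anchors
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)
    return float(CURVE[-1][1])  # pegged above the last anchor

print(fan_percent(25))    # idle: 20.0
print(fan_percent(52.5))  # halfway between 45 and 60 deg: 47.5
print(fan_percent(90))    # pegged: 100.0
```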


----------



## Performer81

Quote:


> Originally Posted by *gtbtk*
> 
> The problem with the auto fans is that the card idles at 59 deg.


No, atm my fans stand still and I have 33 degrees. With YouTube and stuff, a little over 40 or so.


----------



## NiKiZ

Quote:


> Originally Posted by *gtbtk*
> 
> The problem with the auto fans is that the card idles at 59 deg. If you run the fans at 20% at idle, it will sit at about 35 deg. Start the ramp-up at 40-45 deg and you will get even better performance under lighter loads. You can set max fans at whatever your ears are comfortable with. My Gaming X at 100% is audible but not terrible; at 1.050v it will run at 53 deg and still score over 21000 graphics in Firestrike at 2076MHz. 1.093v gets me slightly more and the card runs at 58 deg.


Yeah, I set a manual fan curve: it starts at 30%, ramps up from 40 degrees, and at 80 degrees it jumps from 80% straight to 100%.

Also overclocked mine a bit. Managed to overclock the core by 180 MHz and the memory by 420 MHz. Max core frequency is now at 2139 MHz and memory clock 4428/8856 MHz.
Quote:


> Originally Posted by *gtbtk*
> 
> and still score over 21000 graphics score in Firestrike at 2076Mhz.


Over 21000 in Fire Strike at 2076 MHz? I only get 20991 at 2139 MHz.
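On the memory numbers: the tools report the same clock in different domains, which is why you see the 4428/8856 pair. A one-liner to convert the data clock Afterburner shows into the DDR-effective figure (the x2 mapping for GDDR5 is the only assumption here):

```python
# GDDR5 clock bookkeeping: Afterburner reports the data clock
# (4428 MHz after the overclock above), while the "8 Gbps" style
# figure is the DDR-effective rate, i.e. data clock x 2.

def effective_mhz(data_clock_mhz: float) -> float:
    return data_clock_mhz * 2  # double data rate

print(effective_mhz(4004))  # stock GTX 1070: 8008
print(effective_mhz(4428))  # after the +420 overclock: 8856
```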


----------



## comanzo

Quote:


> Originally Posted by *gtbtk*
> 
> I would suggest that you use DDU to completely uninstall the nvidia drivers and then do a fresh install of the latest drivers


Hey mate, quick question. Do you think this stuttering problem could be a CPU bent pin issue? One time when putting my Intel CPU into the mobo, I heard some unpleasant noises. My PC boots, posts, and is in Windows right now, but I wanted to know if bent pins could be the suspect. Or would bent pins prevent it from posting in the first place?


----------



## Madmaxneo

Quote:


> Originally Posted by *comanzo*
> 
> Hey mate. quick question. Do you think this stuttering problem could be a cpu bent pin issue? One time when putting in my intel cpu into the mobo. I heard some unpleaseant noises. My pc boots and posts and is in windows right now, but I wanted to know if bent pins could be the suspect. Or would bent pins not lead me to post in the first place?


It all depends on the bent pins. I had bent pins twice on this MB. The first time it wouldn't post; the second time XMP would not work correctly and I couldn't manually OC my RAM. It depends on how many pins are bent and which one(s).
Fixing them can be a real pain. It is much easier to take the MB out of the case and have lots of light. A large magnifying glass helps a lot; a phone camera with a macro setting might work depending on the quality, but I think a magnifying glass works much better. You would also need a precision screwdriver or something similarly skinny so you can push each individual pin back one at a time. But be very careful, because if you push too hard you can mess it up further.
When you check, you have to look at it from all angles and keep an eye out for pins that look discolored or "different", meaning a slightly bent pin will reflect the light at a different angle. I hope you have at least a somewhat steady hand....


----------



## comanzo

Quote:


> Originally Posted by *Madmaxneo*
> 
> It all depends on the bent pins. I had bent pins twice on this MB. The first time it wouldn't post, the second time XMP would not work correctly and I couldn't manually OC my RAM. It depends on how many pins are bent and which one(s).
> Fixing them can be a real pain. It is much easier to take the MB out of the case and have lots of light. A large magnifying glass helps a lot, a phone camera with a micro setting might work depending on the quality but I think a magnifying glass works much better. You would also need a precision screw driver or something skinny like one so you can try to push each individual pin back one at a time. But be very careful because if you push to hard you can mess it up further.
> When you check you have to look at it from all angles and keep an eye out for the pins that look discolored or "different", meaning if it is slightly bent it will reflect the light at a different angle. I hope you have at least a somewhat steady hand....


Hopefully that isn't what's causing my GPU usage to drop to 0% randomly while gaming, causing stutters. I don't think it is, since it doesn't happen with all games, which makes me think this is software related. If it happened with all games, I would suspect hardware. Now, this issue isn't just one or two games that could be written off as bad ports; it's quite a few, though not all. What do you think? Would it only be hardware related if it happened with all games? Or can it still be a hardware issue with only some games having the stutter? Thanks, and I hope the pins aren't it because I don't have steady hands, especially when nervous.


----------



## Madmaxneo

Quote:


> Originally Posted by *comanzo*
> 
> Hopefully that isn't what's causing my gpu to drop to 0% randomly while gaming causing stutters. I don't think it will be the case as it doesn't happen with all games, so it makes me think this is software related. If it happened with all games, then I would suspect hardware issues. Now, this issue isn't just one or two games that could be written off as bad ports, but rather quite a few, but not all. What do you think? Do you think if it happened with all games it would be hardware related? Or can it still be a hardware issue with only a couple games having the stutter? Thanks, and I hope the pins aren't it because I don't have steady hands, especially when nervous.


It happens with a few games but not all of them. Are the games it happens with more graphics-intensive? Name the games it happens with and a few it doesn't. This could also be a driver issue with your card, so make sure you have the latest driver installed. Didn't someone post about using DDU? Was that you? If so, did you try it?
FYI, I had an issue with my old 980: stuttering in some games at first, then the card just started crashing. I sent it in for an RMA and they sent me this 1070 as a replacement. Also check that the PSU is working properly and can handle the load; a failing PSU can cause issues like yours.


----------



## comanzo

Quote:


> Originally Posted by *Madmaxneo*
> 
> It happens with a few games but not all of them. Are the games it happens with more graphics-intensive? Name the games it happens with and a few it doesn't. This could also be a driver issue with your card, so make sure you have the latest driver installed. Didn't someone post about using DDU? Was that you? If so, did you try it?
> FYI, I had an issue with my old 980: stuttering in some games at first, then the card just started crashing. I sent it in for an RMA and they sent me this 1070 as a replacement. Also check that the PSU is working properly and can handle the load; a failing PSU can cause issues like yours.


Yea. Unfortunately DDU didn't work. Not sure if this would cause an issue, but while reinstalling the NVIDIA drivers after using DDU, Windows would update to an older driver version first, and then the new version would install afterwards. I don't think the older version being installed first would cause any issues, but I would like your opinion on that. If you say yes, is there a way to disable Windows Update so I can install NVIDIA drivers without Windows downloading older ones first? As for the games that stutter: Batman Arkham City/Asylum and Rainbow Six Siege. Even Wizard101 stutters and drops GPU usage to 0%, for god's sake. Games that work well: GTA V, Black Ops 3, and Overwatch. There are some other games I didn't include, as I haven't tested them in a while.

I hadn't thought about the PSU causing stutter issues, since I assumed that would only be the case if it caused stutter in all games. Will definitely look into that. Btw, you really lucked out going from a 980 to a 1070. Lemme guess, EVGA was the company that did that. Am I right?

On a final note, I don't think I should pull my CPU to check for bent pins, since that risks causing exactly that problem. I think it's best left as a last resort to avoid unnecessary risk. What do you think? Let me know, and thanks again for helping me out.


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I would suggest that you use DDU to completely uninstall the nvidia drivers and then do a fresh install of the latest drivers
> 
> 
> 
> Hey mate, quick question: do you think this stuttering problem could be a bent-pin CPU issue? One time when putting my Intel CPU into the mobo, I heard some unpleasant noises. My PC boots, POSTs, and is in Windows right now, but I wanted to know if bent pins could be the suspect, or would bent pins prevent it from POSTing in the first place?

I would suspect drivers doing funky things first. You can turn automatic driver updates off in Windows 10.
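If it helps, one way to do that without gpedit is a registry policy. A sketch, assuming your Windows 10 edition/build honors the `ExcludeWUDriversInQualityUpdate` value (run from an elevated Command Prompt):

```shell
REM Tell Windows Update not to bundle driver updates with quality updates.
REM Assumption: this group-policy value is honored on your Windows 10 build.
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate" /v ExcludeWUDriversInQualityUpdate /t REG_DWORD /d 1 /f
```

Setting the value back to 0 (or deleting it) restores the default behavior.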

When you say unpleasant noises, do you mean when you were pulling the arm down to lock in the CPU? The pins are already protected by the CPU in the socket at that stage.


----------



## Madmaxneo

Quote:


> Originally Posted by *comanzo*
> 
> Yea. Unfortunately DDU didn't work. Not sure if this would cause an issue, but while reinstalling the NVIDIA drivers after using DDU, Windows would update to an older driver version first, and then the new version would install afterwards. I don't think the older version being installed first would cause any issues, but I would like your opinion on that. If you say yes, is there a way to disable Windows Update so I can install NVIDIA drivers without Windows downloading older ones first? As for the games that stutter: Batman Arkham City/Asylum and Rainbow Six Siege. Even Wizard101 stutters and drops GPU usage to 0%, for god's sake. Games that work well: GTA V, Black Ops 3, and Overwatch. There are some other games I didn't include, as I haven't tested them in a while.
> 
> I hadn't thought about the PSU causing stutter issues, since I assumed that would only be the case if it caused stutter in all games. Will definitely look into that. Btw, you really lucked out going from a 980 to a 1070. Lemme guess, EVGA was the company that did that. Am I right?
> 
> On a final note, I don't think I should pull my CPU to check for bent pins, since that risks causing exactly that problem. I think it's best left as a last resort to avoid unnecessary risk. What do you think? Let me know, and thanks again for helping me out.


Yes, it was EVGA; they apparently have a rep for doing stuff like that.
Like @gtbtk mentioned, it's sounding more and more like a driver issue, and you can turn off Windows updates or at least defer them to a later time. That way the new driver will get installed without the old one. I have had almost the same issue before, but with different games. I even switched to my old GTX 750 Ti to see if it was my card. At first it all seemed fine, then a week later the issues returned. I switched back to my 980, and soon enough its HDMI ports died completely, so I sent the card in for the RMA. Coincidentally, my CPU crapped out around the same time and I sent that in under the Intel Tuning Plan.
Now, with both the new GPU and CPU, everything seems to be running OK. In my case I wonder if my dying CPU somehow affected my GPU and caused it to die too.
When I say everything "seems" to be running OK: I'm still having what I believe are software issues. The Corsair Utility Engine software for my K95 RGB keyboard keeps throwing "malfunction" errors, and the RGB lights (and macro functions) stop entirely. Plus my system has restarted for unknown reasons twice while I was away from it, with no error reports or anything of the kind.

I'd check the pins as a last resort, but you may be having other issues. Try deferring Windows updates to a later time, then run DDU and install the latest driver directly from the NVIDIA website. I'd also return your system to base clocks if you have any OCs on the CPU or GPU (even the RAM). Also, if you haven't already, run memtest and then a stress test of some kind after you've set the base clocks. See if any of that helps.

One last set of questions:
What are your system specs (you should post these in your sig)?
For the RAM, did you get it all at the same time or did you purchase the modules separately?


----------



## VeauX

Hello folks,

New to the club. I just sold my 1060 and replaced it with a 1070 G1 Gaming. Playing with OC right now, getting 2088/2076MHz boost clocks in benchmarks. I just raised the memory by 150MHz in Afterburner.

Will push more in the following days. Anything I need to know before proceeding?

Thx


----------



## comanzo

Quote:


> Originally Posted by *gtbtk*
> 
> I would suspect drivers doing funky things first. you can turn automatic driver updates off in windows 10
> 
> When you say unpleasant noises, do you mean when you were pulling the arm down to lock in the CPU? the pins are already protected by the CPU in the socket at that stage


Well that, and also when I was placing the CPU into the socket I heard a couple of crunching noises. I didn't press down hard on the CPU, just a little, because the CPU was out of place when I first set it down. The arm also made a couple of noises, but it seems that's not a concern, as you suggested.


----------



## comanzo

Quote:


> Originally Posted by *Madmaxneo*
> 
> Yes, it was EVGA; they apparently have a rep for doing stuff like that.
> Like @gtbtk mentioned, it's sounding more and more like a driver issue, and you can turn off Windows updates or at least defer them to a later time. That way the new driver will get installed without the old one. I have had almost the same issue before, but with different games. I even switched to my old GTX 750 Ti to see if it was my card. At first it all seemed fine, then a week later the issues returned. I switched back to my 980, and soon enough its HDMI ports died completely, so I sent the card in for the RMA. Coincidentally, my CPU crapped out around the same time and I sent that in under the Intel Tuning Plan.
> Now, with both the new GPU and CPU, everything seems to be running OK. In my case I wonder if my dying CPU somehow affected my GPU and caused it to die too.
> When I say everything "seems" to be running OK: I'm still having what I believe are software issues. The Corsair Utility Engine software for my K95 RGB keyboard keeps throwing "malfunction" errors, and the RGB lights (and macro functions) stop entirely. Plus my system has restarted for unknown reasons twice while I was away from it, with no error reports or anything of the kind.
> 
> I'd check the pins as a last resort, but you may be having other issues. Try deferring Windows updates to a later time, then run DDU and install the latest driver directly from the NVIDIA website. I'd also return your system to base clocks if you have any OCs on the CPU or GPU (even the RAM). Also, if you haven't already, run memtest and then a stress test of some kind after you've set the base clocks. See if any of that helps.
> 
> One last set of questions:
> What are your system specs (you should post these in your sig)?
> For the RAM, did you get it all at the same time or did you purchase the modules separately?


I bought them at the same time; it's a pre-built, and both sticks are Kingston. My specs are:

CPU: Intel i7-4790S
GPU: GTX 1070 FTW ACX 3.0
RAM: 12GB Kingston DDR3 @ 1600MHz
Mobo: ASUS H81M-E
PSU: EVGA 750W SuperNOVA

Since I just woke up not too long ago, I haven't had time yet to test what you guys proposed. I will report back with my results soon.


----------



## comanzo

Quote:


> Originally Posted by *VeauX*
> 
> Hello folks,
> 
> New to the club. I just sold my 1060 and replaced it with a 1070 G1 Gaming. Playing with OC right now, getting 2088/2076MHz boost clocks in benchmarks. I just raised the memory by 150MHz in Afterburner.
> 
> Will push more in the following days. Anything I need to know before proceeding?
> 
> Thx


Yea. When overclocking, note that the symptoms of core clock instability aren't just crashes but also artifacting. People commonly treat artifacts as a symptom of memory clock instability, but I got artifacts when my core clock was unstable and crashes when my memory clock was unstable, which is the opposite of the symptoms people usually describe. Other than that, just be patient when overclocking for best results.


----------



## VeauX

Quote:


> Originally Posted by *comanzo*
> 
> Hopefully that isn't what's causing my GPU to drop to 0% usage randomly while gaming, causing stutters. I doubt it's the case, since it doesn't happen with all games, which makes me think it's software-related. If it happened with every game, I would suspect a hardware issue. It isn't just one or two games that could be written off as bad ports, but quite a few; still not all of them, though. What do you think? If it happened with all games, would that point to hardware? Or can it still be a hardware issue with only a couple of games stuttering? Thanks, and I hope the pins aren't it, because I don't have steady hands, especially when nervous.


What power supply do you have?

Also, are you on the latest BIOS with default settings? I would remove all overclocks and run a stability test first, to start on solid ground.


----------



## comanzo

Quote:


> Originally Posted by *VeauX*
> 
> What power supply do you have?
> 
> Also, are you on the latest BIOS with default settings? I would remove all overclocks and run a stability test first, to start on solid ground.


OK. Yea, I'm in the middle of testing everyone's ideas for fixing this, and will certainly try what you're suggesting. I do have the latest BIOS, and I've already removed the GPU OC. PSU: EVGA 750W SuperNOVA. It has enough power for my specs, but that doesn't eliminate it as a suspect, since it could still be faulty.


----------



## VeauX

More than enough for your specs. I personally have a SuperNOVA 650 GS; it has never passed 450W (I have a watt-meter at the wall, a Kill-A-Watt P3).

If the PSU is good, the next thing that could be causing issues would be the MB.

If you have parts to swap around, you might need to do that to find the faulty component.

As a rule of thumb, assemble the bare minimum of components: PSU > MB > CPU > RAM (one stick only) > HDD. If your CPU has integrated graphics, do not install the graphics card. Then do a fresh install of Windows 10 64-bit and start running stability tests. You could start with OCCT Perestroika; Prime95 or AIDA64 work too for system stability. Let them run and monitor temps.

You might want to post pictures of your rig here in case we spot something.


----------



## comanzo

Quote:


> Originally Posted by *VeauX*
> 
> More than enough for your specs. I personally have a SuperNOVA 650 GS; it has never passed 450W (I have a watt-meter at the wall, a Kill-A-Watt P3).
> 
> If the PSU is good, the next thing that could be causing issues would be the MB.
> 
> If you have parts to swap around, you might need to do that to find the faulty component.
> 
> As a rule of thumb, assemble the bare minimum of components: PSU > MB > CPU > RAM (one stick only) > HDD. If your CPU has integrated graphics, do not install the graphics card. Then do a fresh install of Windows 10 64-bit and start running stability tests. You could start with OCCT Perestroika; Prime95 or AIDA64 work too for system stability. Let them run and monitor temps.
> 
> You might want to post pictures of your rig here in case we spot something.


Turns out using DDU and not letting Windows update first did the trick. I don't notice stuttering anymore in Arkham City/Asylum. Haven't tested Wizard101 or Siege yet. My only concern left is this: when running Prime95, my CPU hit 92°C, which is also the Tj Max for my i7-4790S. I turned the PC off immediately by holding the power button. I would say the temps were there for about 5-7 seconds, 10 at most, before the PC turned off. Do you think my CPU could have been damaged? I ran the Intel Processor Diagnostic Tool with a pass, and CPU-Z gave scores similar to what an i7-4790S should get.

I have run Prime95 in the past with temps in the low 80s. I don't know why they got so high this time; maybe I didn't apply the thermal paste effectively when I reapplied it? Anyway, would sitting at Tj Max (the throttle temp) for about 10 seconds cause any permanent damage? It seems fine, but confirmation would be nice. I'm glad the stutter issue is solved.


----------



## blaze2210

Quote:


> Originally Posted by *comanzo*
> 
> Turns out using DDU and not letting Windows update first did the trick. I don't notice stuttering anymore in Arkham City/Asylum. Haven't tested Wizard101 or Siege yet. My only concern left is this: when running Prime95, my CPU hit 92°C, which is also the Tj Max for my i7-4790S. I turned the PC off immediately by holding the power button. I would say the temps were there for about 5-7 seconds, 10 at most, before the PC turned off. Do you think my CPU could have been damaged? I ran the Intel Processor Diagnostic Tool with a pass, and CPU-Z gave scores similar to what an i7-4790S should get.
> 
> I have run Prime95 in the past with temps in the low 80s. I don't know why they got so high this time; maybe I didn't apply the thermal paste effectively when I reapplied it? Anyway, would sitting at Tj Max (the throttle temp) for about 10 seconds cause any permanent damage? It seems fine, but confirmation would be nice. I'm glad the stutter issue is solved.


I don't think you would have damaged your CPU during that short burst, but keep a closer eye on things just in case. Make sure you're monitoring temps, voltages, etc. for a while.


----------



## comanzo

Quote:


> Originally Posted by *blaze2210*
> 
> I don't think you would have damaged your CPU during that short burst, but keep a closer eye on things just in case. Make sure you're monitoring temps, voltages, etc. for a while.


Believe me, I always monitor my temps when playing games. It just baffles me that Prime95 pushed it so high this time; it's usually in the low 80s (on the stock cooler, btw). The only thing I can think of is that I reapplied the thermal paste ineffectively, since I used the same stock cooler for both the low-80s runs and the 92°C run. Thanks for giving me peace of mind that my CPU is probably fine. I was really worried about its lifespan.


----------



## rfarmer

Quote:


> Originally Posted by *comanzo*
> 
> Believe me, I always monitor my temps when playing games. It just baffles me that Prime95 pushed it so high this time; it's usually in the low 80s (on the stock cooler, btw). The only thing I can think of is that I reapplied the thermal paste ineffectively, since I used the same stock cooler for both the low-80s runs and the 92°C run. Thanks for giving me peace of mind that my CPU is probably fine. I was really worried about its lifespan.


I would definitely pull the cooler off and check both your TIM and that the cooler is seated properly and firmly attached. There is no reason for you not to get the same temps as before.


----------



## blaze2210

It could have been a seating issue; an air pocket would definitely cause temps to go up. In your case, I think you caught it soon enough to avoid major damage. If you have some spare TIM, you could always remove the cooler and check whether you got a good spread.


----------



## comanzo

Quote:


> Originally Posted by *rfarmer*
> 
> I would definitely pull the cooler off and check both your TIM and that the cooler is seated properly and firmly attached. There is no reason for you not to get the same temps as before.


I will be doing that soon. However, do you also think I may have damaged it? Luckily, I can still play games and stay at safe temps (below 80°C). I will definitely investigate the temp discrepancy soon, though.


----------



## rfarmer

Quote:


> Originally Posted by *comanzo*
> 
> I will be doing that soon. However, do you also think I may have damaged it? Luckily, I can still play games and stay at safe temps (below 80°C). I will definitely investigate the temp discrepancy soon, though.


No, your CPU should be fine; they don't immediately self-destruct when they reach their max temps. You just don't want to leave it there for prolonged periods.


----------



## Madmaxneo

Quote:


> Originally Posted by *comanzo*
> 
> Turns out using DDU and not letting Windows update first did the trick. I don't notice stuttering anymore in Arkham City/Asylum. Haven't tested Wizard101 or Siege yet. My only concern left is this: when running Prime95, my CPU hit 92°C, which is also the Tj Max for my i7-4790S. I turned the PC off immediately by holding the power button. I would say the temps were there for about 5-7 seconds, 10 at most, before the PC turned off. Do you think my CPU could have been damaged? I ran the Intel Processor Diagnostic Tool with a pass, and CPU-Z gave scores similar to what an i7-4790S should get.
> 
> I have run Prime95 in the past with temps in the low 80s. I don't know why they got so high this time; maybe I didn't apply the thermal paste effectively when I reapplied it? Anyway, would sitting at Tj Max (the throttle temp) for about 10 seconds cause any permanent damage? It seems fine, but confirmation would be nice. I'm glad the stutter issue is solved.


Those temps are too high, even at the 80°C mark, for running any stress test if you are not OCing. As a comparison, I'm running a 4930K OC'd to 4.4GHz, and my highest temps when stress testing reach the mid 60s. I am of course watercooling, but even on air yours is still too high.
As mentioned, remove the cooler and apply fresh TIM; you can't avoid replacing it, because once you pull the cooler off you have to clean it and add new TIM.
FYI, I'd think your PC would shut down before letting the CPU overheat too far. I believe that protection is built into the chip and/or the MB (I'm not entirely sure, though).

Another test you should try is Intel Burn Test, to see how high your temps go. You can stop the test anytime, so you don't have to do a hard shutdown.


----------



## Lahatiel

It is recommended to disable AVX in Prime95 (edit undoc.txt and change CpuSupportsAVX=1 to 0) or to use version 26.6 for testing.
AVX creates an unnatural load and very high temps.
There are circuits that will protect your CPU from overheating, but that is an emergency shutdown and should not be triggered for fun.
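For reference, the whole edit is a one-line change in Prime95's undoc.txt, as described above (whether your build reads this flag depends on the Prime95 version):

```
CpuSupportsAVX=0
```

Restart Prime95 afterwards so the setting is picked up.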


----------



## VeauX

CPUs throttle and shut themselves down before damaging themselves from high temps, so don't worry. You do need to address it, though. You have three routes: reapplying TIM correctly (better too much than too little), changing the fan profile in the BIOS to something more aggressive, and ultimately getting something better like a Hyper 212.


----------



## comanzo

Quote:


> Originally Posted by *VeauX*
> 
> CPUs throttle and shut themselves down before damaging themselves from high temps, so don't worry. You do need to address it, though. You have three routes: reapplying TIM correctly (better too much than too little), changing the fan profile in the BIOS to something more aggressive, and ultimately getting something better like a Hyper 212.


OK, that's a relief. Yea, believe it or not, the CPU didn't even shut down; I shut it off myself as I was frantically trying to stop it. 92°C is Tj Max (which is what I hit in Prime95), so I think Tj Max is the throttling temp, not the shutdown temp; otherwise my CPU would have shut down on its own. The only game that pushes my CPU close to 100% is Black Ops 3, and it gives me temps around the mid 70s. Prime95 and other stress tests are the only things pushing it that high. Do I still need a cooler upgrade, especially since my CPU can't overclock anyway?


----------



## Madmaxneo

Quote:


> Originally Posted by *comanzo*
> 
> OK, that's a relief. Yea, believe it or not, the CPU didn't even shut down; I shut it off myself as I was frantically trying to stop it. 92°C is Tj Max (which is what I hit in Prime95), so I think Tj Max is the throttling temp, not the shutdown temp; otherwise my CPU would have shut down on its own. The only game that pushes my CPU close to 100% is Black Ops 3, and it gives me temps around the mid 70s. Prime95 and other stress tests are the only things pushing it that high. Do I still need a cooler upgrade, especially since my CPU can't overclock anyway?


Do you know what stock cooler came with the system?
Unless the cooler is dying, you should not need to replace it. Make sure the fan on the cooler is running and that it speeds up as the CPU gets hotter.
A Hyper 212 EVO is one of the best air coolers on the market and it is inexpensive; last I checked they were going for about $30.
FYI, stress tests will push your CPU beyond what just about any game today will. Most stress tests are not realistic compared to daily usage, even high-end gaming, but people like to use them to make sure their system can handle it. So you will see higher temps in most stress tests than in games.


----------



## comanzo

Quote:


> Originally Posted by *Madmaxneo*
> 
> Do you know what stock cooler came with the system?
> Unless the cooler is dying, you should not need to replace it. Make sure the fan on the cooler is running and that it speeds up as the CPU gets hotter.
> A Hyper 212 EVO is one of the best air coolers on the market and it is inexpensive; last I checked they were going for about $30.
> FYI, stress tests will push your CPU beyond what just about any game today will. Most stress tests are not realistic compared to daily usage, even high-end gaming, but people like to use them to make sure their system can handle it. So you will see higher temps in most stress tests than in games.


Yea, the Intel stock cooler. It does speed up as temps go up, and no, I don't notice any slowdown in the fan (or any sign of damage). I think a good reapplication of thermal paste will get me back to the temps I usually get in Prime95, and I'll leave it at that. So I agree: unless the fan is dying, I'll probably stick with it. And since my CPU only reached throttling temps for a couple of seconds, I should be fine. It wasn't close to shutting down, just throttling, so I think no damage was done to the CPU.


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I would suspect drivers doing funky things first. You can turn automatic driver updates off in Windows 10.
> 
> When you say unpleasant noises, do you mean when you were pulling the arm down to lock in the CPU? The pins are already protected by the CPU in the socket at that stage.
> 
> 
> 
> Well that, and also when I was placing the CPU into the socket I heard a couple of crunching noises. I didn't press down hard on the CPU, just a little, because the CPU was out of place when I first set it down. The arm also made a couple of noises, but it seems that's not a concern, as you suggested.

Just had a thought: maybe you should try resetting the BIOS to optimised defaults?


----------



## VeauX

You can use HWiNFO to check whether your CPU is throttling during normal use. Launch it in sensors-only mode and then play your games; if any throttling occurs, HWiNFO will flag it.

Check this link as reference: http://www.cnx-software.com/2016/02/04/how-to-check-cpu-throttling-in-windows-10/

Please test and post back results.


----------



## comanzo

Quote:


> Originally Posted by *VeauX*
> 
> You can use HWiNFO to check whether your CPU is throttling during normal use. Launch it in sensors-only mode and then play your games; if any throttling occurs, HWiNFO will flag it.
> 
> Check this link as reference: http://www.cnx-software.com/2016/02/04/how-to-check-cpu-throttling-in-windows-10/
> 
> Please test and post back results.


Already have. In an earlier post I mentioned Black Ops 3 didn't get past 77°C, and by that I mean at or near 100% load on all 4 cores; yes, Black Ops 3 multiplayer at 1440p with a 1070 puts my CPU at or near 100% load. It is indeed the most CPU-stressful game I own. I haven't replaced the TIM yet, btw. As long as it stays under 80°C I'm OK with that; remember, we're talking about a stock cooler here. I'm using Arctic Silver 5, which supposedly needs around 200 hours of use to reach optimal results. I haven't hit the 200 hours yet, meaning temps can only go down from here on out. If after 200 hours they don't decrease, or they go over the 80°C mark, I will reapply the thermal paste.

As of now, the only plan I have is to avoid Prime95 like the plague. I will only use it once I have a decent cooler and an overclocked CPU and want to test stability on it. Not temps, but stability. I find Prime95 a horrible way to measure temps, as they are so unrealistically high that you would never see them in games. Since I have neither a decent cooler nor an overclocked chip, I will avoid it for now to prevent any further damage to my chip.









Edit: I use HWiNFO64 along with RivaTuner to monitor temps while gaming. The two work together: HWiNFO64 supplies the sensor readings and RivaTuner draws the overlay.
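Since HWiNFO64 can also log its sensors to a CSV file, one quick way to confirm those 0% GPU-usage drops after a gaming session is to scan the log. A minimal sketch; the file name and column positions are assumptions, so adjust them to your own log's layout:

```shell
# Sketch: flag samples in a HWiNFO-style CSV log where GPU load hit 0%.
# Assumptions: column 1 is the timestamp, column 2 is "GPU Core Load [%]".
awk -F',' 'NR > 1 && $2 + 0 == 0 { print "usage drop at " $1 }' gpu_log.csv
```

Each printed line marks a logged sample where the GPU sat idle mid-game, which lines up with a stutter.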


----------



## zipper17

Prime95 is one of the best stress-test tools; it's designed with overclocking in mind, and most people use it alongside other programs. If you OC your CPU and it's rock stable in Prime95, gaming will never be a problem. You can do a full custom run for a day, or use specific FFT sizes (1344K, 864K, etc.); the smallest FFTs (8-12K) are the hardest test of your cooling. Each FFT size produces a different kind of load. Always avoid overclocking on a stock cooler. For more detail, search for a tutorial on how to use Prime95 properly. 90°C+ is an indicator that you need better cooling.

Some pro overclockers run Prime95 and FurMark at the same time to test the worst-case scenario (the whole system gets tested: crashes, BSODs, CPU throttling, VRM throttling, temperatures, GPU, PSU, etc.).

My CPU at 4.7GHz / 1.34V ran Prime95 at 85-90°C several times through long full tests. Pretty stable, since there were no BSODs, crashes, or WHEA errors. In gaming it never exceeds 70-75°C at 100% load and mostly stays at 50-65°C, on a Hyper 212+. Upgrading to an AIO might let me push to 5GHz.

For a GPU stress tool, I recommend the 3DMark Fire Strike / Time Spy stress tests.


----------



## zipper17

Quote:


> Originally Posted by *comanzo*
> 
> Turns out using DDU and not letting Windows update first did the trick. I don't notice stuttering anymore in Arkham City/Asylum. Haven't tested Wizard101 or Siege yet. My only concern left is this: when running Prime95, my CPU hit 92°C, which is also the Tj Max for my i7-4790S. I turned the PC off immediately by holding the power button. I would say the temps were there for about 5-7 seconds, 10 at most, before the PC turned off. Do you think my CPU could have been damaged? I ran the Intel Processor Diagnostic Tool with a pass, and CPU-Z gave scores similar to what an i7-4790S should get.
> 
> I have run Prime95 in the past with temps in the low 80s. I don't know why they got so high this time; maybe I didn't apply the thermal paste effectively when I reapplied it? Anyway, would sitting at Tj Max (the throttle temp) for about 10 seconds cause any permanent damage? It seems fine, but confirmation would be nice. I'm glad the stutter issue is solved.


Your CPU is still fine, bro. Just stop Prime95 if it gets too hot.

Did you overclock your CPU for the Prime95 run, or was it at stock clocks with the stock cooler?

A stock cooler is never recommended for overclocking, of course.

With a mild-to-extreme overclock (+500MHz or so), 80°C to about 90°C after very long hours of testing is pretty normal with a mainstream air cooler; it is still doing a pretty good job.
A watercooler is of course better if you want to completely avoid ~80-90°C temps.


----------



## VeauX

Agreed, never overclock on a stock cooler; stock coolers are just designed to cover the base CPU TDP. Regarding the 200-hour TIM settling time frame: while it's not false, we are talking about a few degrees, nothing really relevant for non-uber overclockers.


----------



## NiKiZ

Damn, this card is a beast when overclocked a bit. Very happy with this. Running DOOM at Ultra settings with max AA at 1080p. Most of the time the FPS is about 150-200, rarely dropping to 80 in large, open outdoor areas.


----------



## blaze2210

Quote:


> Originally Posted by *NiKiZ*
> 
> Damn, this card is a beast when overclocked a bit. Very happy with this. Running DOOM at Ultra settings with max AA at 1080p. Most of the time the FPS is about 150-200, rarely dropping to 80 in large, open outdoor areas.


Right?!? I ran through the Titanfall 2 campaign at a constant 144fps, with maxed settings. It was sooo smooth, like watching a movie....


----------



## Madmaxneo

I'm hitting a steady 60fps in The Witcher 3. Maybe I should try OCing this card again and see where I can go with that...lol


----------



## comanzo

Quote:


> Originally Posted by *zipper17*
> 
> Your cpu is still fine bro. Just stop the prime95 if get too hot.
> 
> Do you overclocking your cpu while prime95? or it just run at stock clock with stock cooler?
> 
> Stock cooler will never be recommended for overclocking of course.
> 
> If the CPU on Mildly-to-Extreme overclocking(+500mhz or so) , 80C to about 90C after very long hours testing, it's pretty much normal with mainstream aircooler, it's still doing pretty good job.
> Watercooler of course will be better to completely avoid ~80-90C temps.


Nah bro. I have an i7-4790S, meaning no overclocking going on here. I ran Prime95 to see if my CPU was stable while diagnosing what was causing the stuttering. In other words, testing to see if the CPU was faulty. I typically get low 80s with the stock cooler, except I got a different temp this time. No worries though, as the Intel diagnostic test had me covered, and it doesn't apply as much stress.


----------



## comanzo

Quote:


> Originally Posted by *VeauX*
> 
> Agreed, never overclock on stock cooler. Stock coolers are just designed to cover the base CPU TDP. Regarding the TIM 200h settling time frame, while it is not false, we are talking about a few degrees, nothing really relevant for non uber overclockers.


I am not overclocking though; I was using it to test whether the CPU was the suspect when it came to the stutters. In other words, to see if it was faulty.


----------



## gtbtk

Quote:


> Originally Posted by *NiKiZ*
> 
> Damn this card is a beast when overclocked a bit. Very happy with this. Running DOOM, Ultra settings and max AA at 1080p. Most of the time the FPS is about 150-200 and rarely drops down to 80 in large, open areas outside.


The sad thing is that you will get used to it in time, and then start thinking to yourself, "I wish it was faster."


----------



## VeauX

Quote:


> Originally Posted by *comanzo*
> 
> I am not overclocking though, I was using it to test whether the cpu was the suspect when it came to the stutters. In other words, to see if it was faulty.


A bent pin would usually cause the computer to not POST. Stutter can be caused by a lot of things: background software, drivers, throttling, etc.

CPU throttling can be identified with the correct monitoring tools, like HWMonitor.

Software and driver issues can be long and tedious to track down. If you have a spare HDD to install a fresh Windows copy on, you could try that.

Let us know what you find!

Let us know what you find!


----------



## snow cakes




----------



## comanzo

Hey guys, another quick question. My mobo supports PCIe 2.0 x16 bandwidth. Is there any way I can verify whether my GPU is using all 16 lanes or fewer? Thanks.


----------



## comanzo

Quote:


> Originally Posted by *VeauX*
> 
> Bent pin would cause the Computer to not post. Stutter may be caused by a lot of things, background software, drivers, trottling etc..
> 
> CPU throttling can be identified with the correct monitoring tools like HWmonitor.
> 
> Software and drivers issues can be long and tedious to find out. If you have a spare hdd to install a fresh windows copy, you could try that.
> 
> Let us know what you find!


Well, not just bent pins, but also if the CPU is DOA, or maybe static electricity got the best of it. Bent pins were just one guess, and that one is eliminated since, as you all pointed out, it wouldn't POST. I will post occasionally on any progress I make with my issue. Thanks for being such a great sport and helping me out.


----------



## blaze2210

Quote:


> Originally Posted by *comanzo*
> 
> Hey guys, another quick question. My mobo supports pci-e 2.0 x16 bandwitdth. I would like to verify if my gpu is using all 16 lanes. Is there any way I can verify if it's using all 16 or if it's using less? Thanks.


Download and run GPU-Z; it will show you on the main page, in the Bus Interface field:


----------



## Madmaxneo

Quote:


> Originally Posted by *comanzo*
> 
> Well not just bent pins, but also if the cpu is DOA. Or maybe static electricity got the best of it. Bent pins was just one guess, but that one is eliminated as you all pointed out it wouldn't post. I will post occasionally on any progress I find regarding my issue. Thanks for being such a great sport and helping me out.


Not true: it can POST depending on which pins are bent. I have had this problem twice now, and the computer would POST and run, just not with all of the RAM detected......


----------



## VeauX

Quote:


> Originally Posted by *comanzo*
> 
> Hey guys, another quick question. My mobo supports pci-e 2.0 x16 bandwitdth. I would like to verify if my gpu is using all 16 lanes. Is there any way I can verify if it's using all 16 or if it's using less? Thanks.


Quote:


> Originally Posted by *comanzo*
> 
> Well not just bent pins, but also if the cpu is DOA. Or maybe static electricity got the best of it. Bent pins was just one guess, but that one is eliminated as you all pointed out it wouldn't post. I will post occasionally on any progress I find regarding my issue. Thanks for being such a great sport and helping me out.


Yep, GPU-Z is the easiest way to check the PCI Express lanes. Just don't freak out if it says PCIe gen 2 x16, as that depends on your motherboard and the slot the card is in. PCIe x16 gen 2 does not bottleneck graphics cards; it has enough bandwidth.

Btw, I came across this tool that I had completely forgotten about. You might want to give it a shot: the Intel Processor Diagnostic Tool. Download and run it, and you'll have a pretty good idea if something is not going as expected.
https://downloadcenter.intel.com/download/19792
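The "gen 2 x16 has enough bandwidth" claim is easy to sanity-check with back-of-the-envelope math. A minimal sketch, using the standard PCIe per-generation transfer rates and line-encoding overheads (these figures come from the PCIe spec, not from this thread):

```python
# Rough per-direction PCIe bandwidth math (illustrative).
# Gen 1/2 use 8b/10b encoding (80% efficiency); Gen 3 uses 128b/130b (~98.5%).

def pcie_bandwidth_gb_s(gen: int, lanes: int) -> float:
    """Approximate usable one-way bandwidth in GB/s for a PCIe link."""
    gt_per_s = {1: 2.5, 2: 5.0, 3: 8.0}[gen]      # giga-transfers/sec per lane
    efficiency = 0.8 if gen <= 2 else 128 / 130   # line-encoding overhead
    return gt_per_s * efficiency / 8 * lanes      # bits -> bytes, times lanes

print(f"PCIe 2.0 x16: {pcie_bandwidth_gb_s(2, 16):.1f} GB/s")   # 8.0 GB/s
print(f"PCIe 3.0 x16: {pcie_bandwidth_gb_s(3, 16):.2f} GB/s")   # 15.75 GB/s
```

So a gen 2 x16 slot still gives ~8 GB/s each way, which is plenty for a single 1070.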


----------



## Nawafwabs

What's the best driver for low latency and stability?


----------



## khanmein

Quote:


> Originally Posted by *Nawafwabs*
> 
> what's best driver has low letancy and stable?


Currently, 384.94 is the latest.


----------



## skupples

Quote:


> Originally Posted by *VeauX*
> 
> Yep, GPU-z is the easiest way to check the PCI-express lines. Just don´t freak out if it says PCIe gen 2 16x as it depends on your motherboard and the slot the card it on. PCIe 16x gen 2 does not bottleneck graphic cards, it has enough bandwidth.
> 
> Btw, I came across this tool that I had completely forgotten. You might to give it a shot. It is Intel Processor Diagnostic Tool. Download and run and you could have a pretty good idea if something is not going as expected.
> https://downloadcenter.intel.com/download/19792


It also won't always show the correct speed if the bus isn't in use; the link drops to a slower state at idle to save power. So check GPU-Z while the GPU has some sort of load on it.


----------



## NiKiZ

I'm a bit worried because I sometimes hear some "clicks" inside my computer at heavy loads. When I had my old GPU I did not hear any clicks. I had two different MSI GTX 1070 Armor OC cards and I have heard some clicks when they were installed in my computer. The first one was defective and I used my old HD7850 meanwhile. No clicks. The replacement card has been running fine, but I can hear the clicks again.

Not sure what that's about. It seems to be random, but it occurs most commonly when the card goes from idle to high load quickly. Then it "clicks" a couple of times.


----------



## gtbtk

Quote:


> Originally Posted by *NiKiZ*
> 
> I'm a bit worried because I sometimes hear some "clicks" inside my computer at heavy loads. When I had my old GPU I did not hear any clicks. I had two different MSI GTX 1070 Armor OC cards and I have heard some clicks when they were installed in my computer. The first one was defective and I used my old HD7850 meanwhile. No clicks. The replacement card has been running fine, but I can hear the clicks again.
> 
> Not sure what that's about.. It seems to be random when they happen. It seems to occur most commonly, when the card goes from idle to high load quickly. Then it "clicks" couple of times.


Check that the fan is not slightly catching on the heatsink. You may need to give it a little massage to increase the clearance between the blades and the fins where it catches as the fans accelerate.


----------



## NiKiZ

Quote:


> Originally Posted by *gtbtk*
> 
> check that the fan is not slightly catching on the heat sync. you may need to give it a little massage to increase the tolerance between the blades and fins on the sync where it catches as the fans accelerate


Well, that might be it. One of the fans got stuck occasionally on the defective card. On the replacement, the fans spin freely. I haven't heard any clicking for a couple of days, and I tried unsuccessfully to replicate it by setting the fan speed manually to 0% and rapidly ramping it up to 100%.

The sound is very similar to this, but with only one click, and it happens very rarely:


----------



## Madmaxneo

Quote:


> Originally Posted by *NiKiZ*
> 
> Well that might be it. One of the fans got stuck occasionally on the defective card. On the replacement the fans spin freely. I haven't heard any clicking for couple of days and I tried to replicate it unsuccessfully by setting the speed manually to 0% and rapidly ramping it up to 100%.
> 
> But the sound is very similar to this, but only one click and it happens very rarely:
> 
> 
> Spoiler: Warning: Spoiler!


In that video you can see the one fan starting up and stopping every time there is a click.... Are you having that same problem?


----------



## NiKiZ

Quote:


> Originally Posted by *Madmaxneo*
> 
> In that video you can see the one fan starting up and stopping every time there is a click.... Are you having that same problem?


I haven't seen it. It happens randomly and unexpectedly and I haven't been able to replicate the problem myself. Last time I heard it was 2 days ago.


----------



## Madmaxneo

Quote:


> Originally Posted by *NiKiZ*
> 
> I haven't seen it. It happens randomly and unexpectedly and I haven't been able to replicate the problem myself. Last time I heard it was 2 days ago.


I had a similar problem with my top fans, but it was more pronounced and happened when I had to move my case. The cause of mine was a fan wire that had somehow moved right up against a fan blade.
In your case, it's possible there's a wire that slowly creeps just close enough to click a couple of times, then gets pushed away, only to slowly creep back.
How good are your wire management skills?...lol.

Hmm, on second thought, the fan in that video only made the sound when it was starting up (and maybe stopping). Your card may be stopping the fan at low temps, and it may click when starting up again.....just a thought.


----------



## NiKiZ

Quote:


> Originally Posted by *Madmaxneo*
> 
> I had a similar problem with my top fans but it was more profound and happened when I had to move my case. The cause of mine was a fan wire had somehow moved itself right against a fan blade.
> In your case it is possible there is a wire that slowly creeps just close enough to click a couple of times then it is pushed away only to slowly creep back.
> How good are your wire management skills?...lol.
> 
> Hmmm on second thought that fan in the video only made the sound when the fan was starting up (and maybe stopping). Your card may be stopping the fan at low temps and it may click when starting up again.....just a thought.


Not many wires hanging around the fans.










I actually have a custom fan curve; the fans never stop. I don't like the idea of the GPU idling around 50-60 Celsius. The fans are already quiet enough that it doesn't make any noticeable difference.


----------



## Madmaxneo

Quote:


> Originally Posted by *NiKiZ*
> 
> Not many wires hanging around the fans.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I actually have a custom fan curve, the fan never stops. I don't like the idea of the GPU idling around 50-60 celsius. The fans are already quiet enough that it doesn't make any noticeable difference.


Then this is perplexing. Could it possibly be something else in the system other than the GPU making that noise?


----------



## djleakyg

Howdy!

Just picked up an Asus GTX 1070 8GB OC Edition. As far as OCs are concerned, what are some relatively safe numbers I can hit without maxing out the cooler? I wasn't going to buy a card to replace my 970 SSC, but this popped up at my local Micro Center for $275. I had to return some RAM and ended up checking the open-box section. Looking forward to some improvements.


----------



## Nukemaster

All cards are different, but almost any will do 2000MHz+ (heck, yours may already be at that). You will have to test with different games, because some will be fine while others will cause issues.


----------



## gtbtk

Quote:


> Originally Posted by *NiKiZ*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Madmaxneo*
> 
> I had a similar problem with my top fans but it was more profound and happened when I had to move my case. The cause of mine was a fan wire had somehow moved itself right against a fan blade.
> In your case it is possible there is a wire that slowly creeps just close enough to click a couple of times then it is pushed away only to slowly creep back.
> How good are your wire management skills?...lol.
> 
> Hmmm on second thought that fan in the video only made the sound when the fan was starting up (and maybe stopping). Your card may be stopping the fan at low temps and it may click when starting up again.....just a thought.
> 
> 
> 
> Not many wires hanging around the fans.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I actually have a custom fan curve, the fan never stops. I don't like the idea of the GPU idling around 50-60 celsius. The fans are already quiet enough that it doesn't make any noticeable difference.
Click to expand...

If the fan is catching during acceleration, there should be some marks on the inside of the fan blades.


----------



## gtbtk

Quote:


> Originally Posted by *djleakyg*
> 
> Howdy!
> 
> Just picked up an Asus GTX 1070 8GB OC Edition. As far as OC's are concerned what are some relatively safe numbers I can hit without missing the cooler. I wasn't going to buy a cars to replace my 970 SSC but this popped up at my local MicroCenter for $275. I had to return some RAM and ended up checking the open box section. Looking forward to some improvements.


If you have Micron memory on your card, make sure that you are running an 86.04.50.00.xx version BIOS. If it is still on an 86.04.26.00.xx BIOS, get the update from Asus and update your card.

Memory should be fine at +500 as a starting point. If you are lucky, you may get it up to +700 to +800, but it will likely be generating errors and error-correcting at that stage, which will make the card either crash or lose performance. You can use OCCT to check at what point memory errors start to occur. I get the best performance on my card at about +600, but the card will run at +700.

For core overclocking on the slider, you are already starting with a base frequency of 1633MHz, so you could reasonably expect around an extra +50. A really good card may go higher. To get to 2100MHz+ overclocks, you need to use the voltage/frequency curve.

Set a custom fan curve that idles with the fans at about 25% and starts ramping up in the low 40s, peaking under 60. At 100% fans under load, you should expect the card to run under 60 deg.


----------



## VeauX

I have Micron memory on my Gigabyte G1 Gaming, and the tool provided to update the BIOS is not catching the update from their servers. It says it found an updated BIOS but is not able to download it. However, I have not overclocked the memory much (+300 only), so I'm not seeing issues. The core clock is stable @ ~2075 with +50% on the vcore slider.


----------



## gtbtk

Quote:


> Originally Posted by *VeauX*
> 
> I have micron memory on my Gigabyte G1 Gaming and the tool provided to update bios is not catching the update from their servers. It says it found and updated bios but is not able to download it. However I have not overclocked memory so much (+300 only) so I'm not seeing issues. The core clock is stable @ ~2075 with +50% on the vcore slider.


Take a look in GPU-Z and it will tell you which BIOS version is running on the card. If it shows a version that starts with 86.04.50.00.xx, then you do not have to worry about doing a BIOS update; it is already done. The updates were released last November, so it would be unusual now to find any new cards that came from the factory with the old BIOS.

1070s love faster memory. +500 on your memory should be easily achieved and give you a nice boost; +600 is possible for many cards.


----------



## TerafloppinDatP

Vague question, but does OCing the memory have similar effects on memory temps as OCing the core does on core temps? My 6173, being pre-iCX, doesn't offer temps other than core, and I've left it at stock speeds to keep temps and noise levels down. But if I can do +0 on core and +500 on memory and get a bit of a lift without needing to ramp up my fans to keep up, that would be nice. Hope that makes sense.


----------



## djleakyg

Hey Guys,

Just got my new 1070 installed, and I have zero experience with 1070s. When I run Speccy, it comes up with only 4GB instead of 8. Is this a known issue, or is there something actually wrong?


----------



## djleakyg

Also what are normal temps I can expect from an Asus 1070?


----------



## blaze2210

Quote:


> Originally Posted by *djleakyg*
> 
> Also what are normal temps I can expect from an Asus 1070?


That is going to be heavily dependent on your ambient temps. If your ambient temps are lower, then your card will run cooler, and vice versa. Also, if you're using the stock fan controls, the card will most likely run warmer than it would with an aggressive custom fan curve.


----------



## djleakyg

Quote:


> Originally Posted by *blaze2210*
> 
> That is going to be heavily dependent on your ambient temps. If your ambient temps are lower, then your card will run cooler, and vise versa. Also, if you're using the stock fan controls, then the card will most likely run warmer, than it would if you set an aggressive custom fan curve.


What program would you use to adjust the fan curve? I have the ASUS OC tool and I do not like it very much; it seems very clunky. Any you would recommend?


----------



## Hunched

Quote:


> Originally Posted by *djleakyg*
> 
> What program would you use to augment fan curve? I have the ASUS OC tool & I do not like it very much. It seems very clunky. Any you would recommend?>


The answer is always MSI Afterburner


----------



## blaze2210

Quote:


> Originally Posted by *djleakyg*
> 
> What program would you use to augment fan curve? I have the ASUS OC tool & I do not like it very much. It seems very clunky. Any you would recommend?>


Quote:


> Originally Posted by *Hunched*
> 
> The answer is always MSI Afterburner


^This. Afterburner works, no matter what card you have - Nvidia or AMD....


----------



## Dude970

Quote:


> Originally Posted by *djleakyg*
> 
> Hey Guys,
> 
> Just got my new 1070 installed and I have zero experuience with 1070's. When I run Speccy, it comes up with only 4 GB instead of 8. Is this a known issue or is there something actually wrong?


Known issue, GPU-Z will show the correct amount. and Afterburner will too


----------



## blaze2210

Quote:


> Originally Posted by *Dude970*
> 
> Known issue, GPU-Z will show the correct amount. and Afterburner will too


^ Yup. GPU-Z does one thing, and does it well - gives you GPU info. Afterburner accomplishes a similar task, but more in the OC/monitoring department.


----------



## djleakyg

Quote:


> Originally Posted by *Dude970*
> 
> Known issue, GPU-Z will show the correct amount. and Afterburner will too


Just out of curiosity, why does it do that? It's not like Memorygate (or whatever the hell they called it) with the GTX 970, is it?


----------



## Nukemaster

No, it is most likely just a software bug. I have seen many programs misread video memory.

Unlike the 970, the 1070 runs all of its memory at the same speed.
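One plausible mechanism for this kind of misreport (purely a hypothetical illustration, not Speccy's actual code) is reading the VRAM size through a 32-bit field: such a field tops out just under 4 GiB, so any 8 GiB card reported through it shows as ~4 GB.

```python
# Hypothetical illustration of a 32-bit VRAM-size readout.
# A 32-bit unsigned field can hold at most 2**32 - 1 bytes, so any
# amount of VRAM at or above 4 GiB gets capped at ~4 GiB.

UINT32_MAX = 2**32 - 1

def report_vram_32bit(actual_bytes: int) -> float:
    """Return VRAM size in GiB as seen through a saturating 32-bit field."""
    seen = min(actual_bytes, UINT32_MAX)
    return seen / 2**30

print(report_vram_32bit(8 * 2**30))  # ~4.0 -> an 8 GiB card reads as ~4 GB
print(report_vram_32bit(3 * 2**30))  # 3.0  -> smaller cards read correctly
```

Tools that query the driver with 64-bit-clean APIs (as GPU-Z and Afterburner do) don't hit this cap.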


----------



## NinjaCool

I missed this thread or I would have posted months ago, but anyway: I preordered my Zotac 1070 AMP Extreme on Amazon when it had a good price, and it's been running great with a custom fan profile.









When I went to install it I had to take the drive cage out of my Antec 180b case to fit this beast


----------



## ravihpa

Quote:


> Originally Posted by *NinjaCool*
> 
> I missed this thread or I would have posted months ago but anyway I preordered my Zotac 1070 AMP Extreme over Amazon when it had a good price and it's been running great with a custom fan profile
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When I went to install it I had to take the drive cage out of my Antec 180b case to fit this beast


Hahaha. Same here. Bought mine off Amazon in a Lightning Deal and then took my NZXT Gamma to a friend who works with metal. Cut the drive bay out so I could fit my AMP Extreme in


----------



## gtbtk

Quote:


> Originally Posted by *TerafloppinDatP*
> 
> Vague question, but does OCing the memory have similar effects on memory temps as OCing the core does on core temps? My 6173, being pre-iCX, doesn't offer temps other than core, and I've left it at stock speeds to keep temps and noise levels down. But if I can do +0 on core and +500 on memory and get a bit of a lift without needing to ramp up my fans to keep up, that would be nice. Hope that makes sense.


Running a custom curve with the fans faster than stock does add to card stability, even on the MSI Gaming, which has a mid-plate heat spreader for the memory and VRMs. At idle the card downclocks everything regardless, so temps are not really much of an issue then.

You will need to experiment for yourself to see how your card behaves. Not increasing the fan speed to start with will not kill your card; however, it may make it unstable. If it does, start off by creating a curve that mirrors the stock one except that it starts ramping up at a lower temp.

If that is still no good, try running the card with a 20% fan at idle and start ramping up at about 50 deg. It will still remain quiet, but the idle temps will be around 35 deg instead of the high 50s.
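A fan curve like the one described is just piecewise-linear interpolation between a few (temperature, fan %) points. A minimal sketch; the point values here are illustrative, and in practice you'd set them in Afterburner's curve editor rather than in code:

```python
# Sketch of the suggested curve: ~20% at idle, ramping from about 50 deg C
# so the card idles in the mid-30s instead of the high 50s.

CURVE = [(0, 20), (50, 20), (70, 80), (80, 100)]  # (temp C, fan %)

def fan_percent(temp_c: float) -> float:
    """Linearly interpolate fan speed between the curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin at 100% beyond the last point

print(fan_percent(35))  # 20.0 -> quiet at idle
print(fan_percent(60))  # 50.0 -> halfway up the 50-70 deg ramp
```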


----------



## TerafloppinDatP

Quote:


> Originally Posted by *gtbtk*
> 
> Running a custom curve with fans running faster than stock does add to card stability, even on the msi gamming that has a mid plate heat spreader for memory and VRMs. At idle the card down clocks everything regardless so temps are not really too much of an issue then.
> 
> You will need to experiment for yourself to see how your card behaves. not increasing the fan to start with will not kill your card however i may make it unstable. If it does, start off creating a curve that mirrors the stock one except that it starts ramping up at a lower temp.
> 
> If that is still no good, try running the card with a 20% fan at idle and start ramping up at about 50 deg. It will still remain quiet but the idle temps will be 35 deg and not high 50 degs


Thanks. I think I've got it dialed in with modest +25/+400 overclocks and unlinked 80%/70C on power limit and temp limit. Fan curve looks like this:



Temps are not going over 68C, and the fans hover around 48%/1750rpm, which is very quiet. Obviously the temp limit is throttling performance by design, which might sound strange to your normal balls-to-the-wall overclocker. But everything is cool and quiet while I peg the 80Hz limit of my monitor with a constant 80fps in BF1 on Ultra, so I am happy with it. FWIW, the power draw per HWiNFO64 hovers around 90% and peaks at 95%. If I run Witcher 3, I up the temp limit to 72C, which allows for 65-75fps and still fairly quiet operation. I just don't like the sound of this EVGA 1070 SC at anything over 60% fan speed, which is why I've gone down this tweaking road. I know a lot of people here are all about 80-100% fan speed, but as long as temps are fine, I'm all about keeping fan speed as low as possible. I do have an HWiNFO64 alarm set to go off if the card reaches 80C, which so far has never happened.

Still not entirely positive about memory temps and how to gauge them with pre-iCX products, but "if it ain't broke, stop f-ing with it," or something like that


----------



## comanzo

Quote:


> Originally Posted by *VeauX*
> 
> Yep, GPU-z is the easiest way to check the PCI-express lines. Just don´t freak out if it says PCIe gen 2 16x as it depends on your motherboard and the slot the card it on. PCIe 16x gen 2 does not bottleneck graphic cards, it has enough bandwidth.
> 
> Btw, I came across this tool that I had completely forgotten. You might to give it a shot. It is Intel Processor Diagnostic Tool. Download and run and you could have a pretty good idea if something is not going as expected.
> https://downloadcenter.intel.com/download/19792


Do you think the CPU-Z benchmark and the Intel Processor Diagnostic Tool should be enough to tell whether something is wrong with my CPU? CPU-Z gives me scores similar to my model, which is good, and the Intel diagnostic comes back with a pass.


----------



## gtbtk

Quote:


> Originally Posted by *TerafloppinDatP*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Running a custom curve with fans running faster than stock does add to card stability, even on the msi gamming that has a mid plate heat spreader for memory and VRMs. At idle the card down clocks everything regardless so temps are not really too much of an issue then.
> 
> You will need to experiment for yourself to see how your card behaves. not increasing the fan to start with will not kill your card however i may make it unstable. If it does, start off creating a curve that mirrors the stock one except that it starts ramping up at a lower temp.
> 
> If that is still no good, try running the card with a 20% fan at idle and start ramping up at about 50 deg. It will still remain quiet but the idle temps will be 35 deg and not high 50 degs
> 
> 
> 
> Thanks. I think I've got it dialed in with modest +25/+400 overclocks and unlinked 80%/70C on power limit and temp limit. Fan curve looks like this:
> 
> 
> 
> Temps are not going over 68C and fans hovering around 48%/1750rpm, which is very quiet. Obviously the temp limit is throttling performance by design, which might sound strange to your normal balls-to-the-wall overclocker. But everything is cool and quiet while I peg the 80Hz limit of my monitor with a constant 80fps in BF1 on Ultra, so I am happy with it. FWIW the power draw per HWiNFO64 hovers around 90% and peaks at 95%. If I run Witcher 3, I up the temp limit to 72C which allows for 65-75fps and still fairly quiet operation. I just don't like the sound of this EVGA 1070 SC at anything over 60% fan speed, which is why I've gone down this tweaking road. I know a lot of people here are all about 80-100% fan speed, but as long as temps are fine I'm all about keeping fan speed as low as possible. I do have HWiNFO64 alarm set to go off if the card reaches 80C, which so far has never happened.
> 
> Still not entirely positive about memory temps and how to gauge it with pre-iCX products, but "if it ain't broke, stop f-ing with it," or something like that
Click to expand...

I am a real FPS head. On mine, across the 0-35 deg range of the x axis I run about 25%, start increasing the fans at about 38 deg, and take them to 100% at about 55. I can run under constant load at 53-59 deg depending on where I set the voltage. The fans are audible but not intrusive, unless it is 3 in the morning with no AC turned on.


----------



## gtbtk

My rig is currently down. Has anyone tried out the new 385.12 drivers?

Did you notice any improvements in performance with your 1070s?


----------



## TerafloppinDatP

Quote:


> Originally Posted by *gtbtk*
> 
> I am a real FPS head. On mine, the 0 deg - 35 deg x axis I run at about 25% and start increasing fans at about 38deg and take it to 100% at about 55. I can run under constant load at 53-59 deg depending on where I set the voltage. The fans are audible but not intrusive unless it is 3 in the morning with no AC turned on.


That's definitely the FPS head way to do it. I think I'm just super sensitive to the sound to the point where it's distracting, plus I use on-ear headphones instead of circumaural. And I think the EVGA SC is louder at 100% than the MSI Gaming X.

Now I'm curious to see how high a clock rate I could sustain if I go max fans on everything, even if I have to turn them back down for daily driving....


----------



## gtbtk

Quote:


> Originally Posted by *TerafloppinDatP*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I am a real FPS head. On mine, the 0 deg - 35 deg x axis I run at about 25% and start increasing fans at about 38deg and take it to 100% at about 55. I can run under constant load at 53-59 deg depending on where I set the voltage. The fans are audible but not intrusive unless it is 3 in the morning with no AC turned on.
> 
> 
> 
> That's definitely the FPS head way to do it. I think I'm just super sensitive to the sound to the point where it's distracting, plus I use on-ear headphones instead of circumaural. And I think the EVGA SC is louder at 100% than the MSI Gaming X.
> 
> Now I'm curious to see how high a clock rate I could sustain if I go max fans on everything, even if I have to turn them back down for daily driving....
Click to expand...

I would still recommend running low-speed fans even at idle if you are running overclocked; at 20% they remain silent. You just set a less aggressive curve that starts ramping up at, say, 45 deg.

Keeping temps as low as possible will maximize performance. As temps increase, at each step the card will first try to step up the voltage if there is any headroom, and then reduce frequency when it hits the next temp step (the steps are about 3 deg apart).

To get the most stable high frequency, you will get the best results setting the card to +100 on the voltage slider and creating a curve that starts at its maximum frequency at 1.081v. As it heats up, it will go to 1.093v at that frequency first, and then the next step will be 12MHz down at 1.081v. You will reach a balance point where the fans and heat even out and the frequency stabilizes. I know 2200MHz sounds great, but don't obsess over just getting higher frequencies.

My card does 21000+ in Firestrike starting at 2088MHz, and it balances out at 2076MHz. Micron memory is at +640, running on an i7-2600 rig. I can run both faster, but scores are lower and the frequency is a bit erratic.

http://www.3dmark.com/fs/11822144
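The thermal step-down behavior described above can be put into a toy model: above some threshold, each ~3 deg step costs roughly one 12-13 MHz boost bin. The 40 deg threshold and 13 MHz bin size here are assumptions for illustration, not published NVIDIA numbers:

```python
# Toy model of Pascal's temperature-driven clock step-downs.
# Assumed parameters (hypothetical, for illustration only):

BIN_MHZ = 13          # one boost bin
STEP_DEG = 3          # approximate size of each thermal step
THRESHOLD_C = 40      # assumed temp where step-downs begin

def boosted_clock(start_mhz: int, temp_c: float) -> int:
    """Estimate the stabilized core clock after thermal step-downs."""
    if temp_c <= THRESHOLD_C:
        return start_mhz
    steps = int((temp_c - THRESHOLD_C) // STEP_DEG)
    return start_mhz - steps * BIN_MHZ

print(boosted_clock(2088, 38))  # 2088 -> no step-down below the threshold
print(boosted_clock(2088, 55))  # 2023 under these assumed parameters
```

The takeaway matches the post: every few degrees you shave off under load buys back a boost bin, which is why aggressive fan curves raise the stabilized clock.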


----------



## zipper17

Quote:


> Originally Posted by *gtbtk*
> 
> My rig is currently down. Has anyone tried out the new 385.12 drivers?
> 
> Did you notice any improvements in performance with your 1070s?


I'm still on 382.05

Didn't update because of this:

_384.94 Important Open Issues (For full list of open issues please check out release notes)_
**[GeForce GTX 1070][Doom]: The GPU clocks remain running at high performance speeds after exiting from the game. [1954033]**

I'm curious too; it looks like v384.94 gained about ±200 pts on FS graphics scores, according to the reddit 384.94 driver thread.

385.12 is still beta, with major improvements in rendering performance for Pascal cards.


----------



## b0uncyfr0

Any word on BIOS modding the Pascal line yet? So much potential under the hood, but limited by voltage. I'm sitting at 1.05v without even hitting 55 degrees. The card is begging to be pushed.


----------



## blaze2210

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Any word on bios modding the pascal line yet? So much potential under the hood but limited by voltage. Im sitting at 1.05v without even hitting 55 degrees. The card is begging to be pushed.


If it hasn't happened by now, I'm pretty sure it's not going to. That is, unless Nvidia decides to give us the ability to actually flash the modded vBIOS back to the card.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> My rig is currently down. Has anyone tried out the new 385.12 drivers?
> 
> Did you notice any improvements in performance with your 1070s?
> 
> 
> 
> I'm still on 382.05
> 
> Didn't update because of this:
> 
> 384.94 Important Open Issues (For full list of open issues please check out release notes)
> *[GeForce GTX 1070][Doom]: The GPU clocks remain running at high performance speeds after exiting from the game. [1954033]*
> 
> I'm curious too, looks like v384.94 got about +-200pts on FS graphic scores according to reddit driver 384.94 thread
> 
> 385.12 still beta, major improvement in rendering performances for pascal cards
Click to expand...

I just don't know if the GPGPU performance gains translate to 3D graphics or not.

Nice that there may be a bit of a boost.


----------



## VeauX

Quote:


> Originally Posted by *comanzo*
> 
> Do you think cpu-z benchmark and intel processor diagnostic tool should be enough to tell if something is wrong with my cpu or not? Cpu-z gives me scores similar to my model which is good, and intel processor diagnostic comes back with a pass


Intel Processor Diagnostic would be enough to know if your CPU works as intended, though it won't catch throttling.

For stability (CPU, GPU or both) OCCT is great.

AIDA64 has a stability bench also.


----------



## Falkentyne

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Any word on bios modding the pascal line yet? So much potential under the hood but limited by voltage. Im sitting at 1.05v without even hitting 55 degrees. The card is begging to be pushed.


You can mod the Pascal BIOS right now, but you can't NVFlash the BIOS back to the card. You can, however, force-flash it with an SPI programmer.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> My rig is currently down. Has anyone tried out the new 385.12 drivers?
> 
> Did you notice any improvements in performance with your 1070s?


I can't notice any difference, but so far so good.


----------



## khanmein

Quote:


> Originally Posted by *zipper17*
> 
> I'm still on 382.05
> 
> Didn't update because of this:
> 
> _384.94 Important Open Issues (For full list of open issues please check out release notes)_
> *[GeForce GTX 1070][Doom]: The GPU clocks remain running at high performance speeds after exiting from the game. [1954033]
> *
> 
> I'm curious too, looks like v384.94 got about +-200pts on FS graphic scores according to reddit driver 384.94 thread
> 
> 385.12 still beta, major improvement in rendering performances for pascal cards


The Doom (and other games) stuck-at-high-clocks issue started when NVIDIA made the idle clock stay at 139 MHz, but the issue you mentioned can easily be fixed by selecting 'Adaptive' or 'Optimal power' under the Doom profile.

By default, Doom is set to 'Prefer maximum performance'. If you insist on keeping that setting, all you need to do is reboot your PC when you exit the game and clocks will return to normal.


----------



## icold

Do we have a Pascal BIOS editor now?


----------



## Falkentyne

Quote:


> Originally Posted by *icold*
> 
> Have we pascal bios editor now?


It's been out for a while.
But you need a hardware programmer to flash any modified BIOS.

https://github.com/LaneLyng/MobilePascalTDPTweaker/releases


----------



## icold

Quote:


> Originally Posted by *Falkentyne*
> 
> It's been out for awhile.
> But you need a hardware programmer to flash any modified Bios.
> 
> https://github.com/LaneLyng/MobilePascalTDPTweaker/releases


I only want to change the temp throttling.


----------



## Falkentyne

It doesn't matter. Either you buy a hardware programmer or you don't change temp throttling.
Changing even *ONE BYTE* in a Falcon-protected area of the BIOS causes NVFlash to fail with a certificate error, even after the checksum has been corrected.
You either bypass Ngreedia's greed and force-flash the BIOS yourself, or deal with it.

I complained about it also, believe me. But when I saw how EASY it is to flash a vBIOS with a programmer, I never looked back.
Please check my other post, either in the drivers/overclocking section or in the "Flash different Bios on 1080 ti" thread, for items to buy. You may not even need the SOIC8 adapter: if you're building the wires yourself (without soldering), you just hook them up to the 1.8v adapter directly.

All you need to flash:
https://www.amazon.com/CTYRZCH-SOIC8-Socket-Adpter-Programmer/dp/B015W4PKR6/ref=pd_sbs_504_1?_encoding=UTF8&pd_rd_i=B015W4PKR6&pd_rd_r=GHMAVVZHP650ZSMFTR6V&pd_rd_w=gwG5w&pd_rd_wg=CJKtG&psc=1&refRID=GHMAVVZHP650ZSMFTR6V <--only for the SOIC8 adapter; don't use the crappy clip or cable. The adapter might not be needed at all, since you're building your own clip and wires with proper spacing anyway, so maybe you can skip this.

https://www.amazon.com/gp/product/B00HHH65T4/ref=oh_aui_detailpage_o09_s00?ie=UTF8&psc=1 <--essential.

https://www.amazon.com/gp/product/B01DZC36GY/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1 <--you'll use the female-to-female clips if you are hooking it up to an SOIC8 adapter (the SOIC8 adapter plugs into the 1.8v adapter); you'll use the female-to-male clips if hooking it directly to the 1.8v adapter.

https://www.amazon.com/gp/product/B072KYK2DR/ref=oh_aui_detailpage_o01_s00?ie=UTF8&psc=1 <--essential. Not using the 1.8v adapter can fry the Bios chip. The 1.8v adapter plugs into the flasher.

https://www.amazon.com/gp/product/B01DZC36GY/ref=oh_aui_detailpage_o05_s00?ie=UTF8&psc=1 <--the flasher.

If you want something a little more fancy, and don't mind waiting a bit longer for shipping, you can get the skypro II:
https://www.aliexpress.com/item/SkyPRO-II-update-version-of-skypro-24-25-93-SPI-FLASH-AVR-STM32-offline-programming-offline/32655341936.html?aff_platform=aaf&cpt=1502056055035&sk=zj6qB6AIM&aff_trace_key=98425bf9cd07492eb91755cd1b2f592e-1502056055035-05724-zj6qB6AIM

but you'll still need the 1.8v adapter. (I do NOT know this for certain, but there is NO HARM in using the 1.8v adapter even if the flasher supports native 1.8v. The software for the SkyPro/SkyPro II will tell you if the 1.8v adapter is required when you hook the clip to the BIOS pins and press "Detect".)

The HARDEST thing about the flashing is lining up pin 1 on the clip with pin 1 on the BIOS chip, and making sure pin 1 on the clip hooks up to pin 1 on the 1.8v adapter. It's rather simple, not complicated; just pay attention. NOTE: PINS ARE NUMBERED COUNTERCLOCKWISE, so if pin 1 is in the top left corner, pin 8 is in the top right corner. If pin 1 on the BIOS chip itself is in the top right corner, pin 8 is in the bottom right corner. So when hooking up the clip wires to pin 1, you may have to cross them counterclockwise, because very often pin 1 is in the top LEFT corner on the flasher but in the top RIGHT corner of the BIOS chip (at the notch indentation on the chip).

Just make sure the pins line up on the IC clip, matching pin 1 and direction on the Bios chip, and they line up on the 1.8v adapter.

Seems complicated until you actually start doing it.

The nice thing about having a hardware programmer is that you can recover from any bad flash with this thing. There are also 16-pin clips you can buy for 16-pin ICs. Note: BGA chips require desoldering AND a special adapter, but you won't have to worry about BGA unless you're messing with motherboards and EC firmware.

Pictures of the (Laptop) cards and how you hook up the flashers are in this thread.

http://forum.notebookreview.com/threads/mobile-pascal-tdp-tweaker-update-and-feedback-thread.806161/

It works on desktops also. Titan Xp and 1080 Ti support is unknown, however.


----------



## b0uncyfr0

Quote:


> Originally Posted by *Falkentyne*
> 
> If you want something a little more fancy, and don't mind waiting a bit longer for shipping, you can get the skypro II:
> https://www.aliexpress.com/item/SkyPRO-II-update-version-of-skypro-24-25-93-SPI-FLASH-AVR-STM32-offline-programming-offline/32655341936.html?aff_platform=aaf&cpt=1502056055035&sk=zj6qB6AIM&aff_trace_key=98425bf9cd07492eb91755cd1b2f592e-1502056055035-05724-zj6qB6AIM


This looks interesting - but is it worth using? Do we know if there is much to be gained from modding the bios in the first place?


----------



## khanmein

Quote:


> Originally Posted by *b0uncyfr0*
> 
> This looks interesting - but is it worth using? Do we know if there is much to be gained from modding the bios in the first place?


I don't think the BIOS mod is worth it, since NVIDIA has stated clearly that the BIOS is locked on Pascal.


----------



## Falkentyne

The Bios is not locked. It's "protected."
I think you should do a little research.

http://www.voltground.com/haven/threads/94/

This protection from Ngreedia can be overcome by using a hardware programmer.

https://github.com/LaneLyng/MobilePascalTDPTweaker/releases

I already unlocked my "locked" laptop 1070 from a 115W TDP to 151-170W (the same as the desktop version) and raised the throttling temp (for boost clocks) to 71C.
Now my laptop 1070 is faster than a desktop one (~20500 Firestrike graphics score). I also never get power-limit throttling at a 1886MHz GPU boost clock (+250MHz).
Note I disabled Boost 2 (aka automatic overclocking) clocks, so my core clock always stays at 1886MHz without dropping.

The author of that tool managed to pull 230W from a laptop 1070.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *zipper17*
> 
> I'm still on 382.05
> 
> Didn't update because of this:
> 
> 384.94 Important Open Issues (For full list of open issues please check out release notes)
> *[GeForce GTX 1070][Doom]: The GPU clocks remain running at high performance speeds after exiting from the game. [1954033]*
> 
> I'm curious too, looks like v384.94 got about +-200pts on FS graphic scores according to reddit driver 384.94 thread
> 
> 385.12 still beta, major improvement in rendering performances for pascal cards
> 
> 
> 
> Regarding the Doom or other games stuck at high clocks all started NVIDIA makes the idle clock stayed 139 MHz, but the issue you mentioned can easily be fixed by selecting 'Adaptive' or 'Optimal power' under the Doom profile.
> 
> By default, Doom is set as 'Prefer maximum performance' but if you insisted to use the setting, all you need to do is rebooting your PC when you exit the game which will return back to normal.
Click to expand...

Takes me back to the micron bug days


----------



## gtbtk

Quote:


> Originally Posted by *Falkentyne*
> 
> The Bios is not locked. It's "protected."
> I think you should do a little research.
> 
> http://www.voltground.com/haven/threads/94/
> 
> This protection from Ngreedia can be overcome by using a hardware programmer.
> 
> https://github.com/LaneLyng/MobilePascalTDPTweaker/releases
> 
> I already unlocked my "locked" laptop 1070 card from 115W TDP to 151-170W (same as the desktop version) and increased the throttling temp (boost clocks) to 71C.
> Now my laptop 1070 is faster than a desktop one (~20500 firestrike graphics score). I also never get power limit throttling with 1886 mhz GPU boost clock (+250 mhz)
> Note I disabled boost 2 (aka automatic overclocking) clocks, so my core clock always stays at 1886 mhz without dropping.
> 
> The author of that tool managed to pull 230W from a laptop 1070.


You can make Pascal stay at high clocks by setting dwm.exe to run in high-performance mode. No need for a BIOS hack.

The problem with 230W in a laptop is that there is never enough heat dissipation for sustained use.


----------



## Falkentyne

My downclocking had nothing to do with DWM. It was TDP + temperature. That required a Bios hack.


----------



## zipper17

Quote:


> Originally Posted by *khanmein*
> 
> Regarding the Doom or other games stuck at high clocks all started NVIDIA makes the idle clock stayed 139 MHz, but the issue you mentioned can easily be fixed by selecting 'Adaptive' or 'Optimal power' under the Doom profile.
> 
> By default, Doom is set as 'Prefer maximum performance' but if you insisted to use the setting, all you need to do is rebooting your PC when you exit the game which will return back to normal.


Thx, noted.

Installed 384.94 driver, no difference from my previous driver in FS Graphic Scores.


----------



## Halseluk

Quote:


> Originally Posted by *Falkentyne*
> 
> The Bios is not locked. It's "protected."
> I think you should do a little research.
> 
> http://www.voltground.com/haven/threads/94/
> 
> This protection from Ngreedia can be overcome by using a hardware programmer.
> 
> https://github.com/LaneLyng/MobilePascalTDPTweaker/releases
> 
> I already unlocked my "locked" laptop 1070 card from 115W TDP to 151-170W (same as the desktop version) and increased the throttling temp (boost clocks) to 71C.
> Now my laptop 1070 is faster than a desktop one (~20500 firestrike graphics score). I also never get power limit throttling with 1886 mhz GPU boost clock (+250 mhz)
> Note I disabled boost 2 (aka automatic overclocking) clocks, so my core clock always stays at 1886 mhz without dropping.
> 
> The author of that tool managed to pull 230W from a laptop 1070.


If I could prevent my card from dropping from 2000MHz, I'd be very happy.

But buying the necessary flashing gear is expensive for me.


----------



## khanmein

Quote:


> Originally Posted by *Falkentyne*
> 
> The Bios is not locked. It's "protected."
> I think you should do a little research.
> 
> http://www.voltground.com/haven/threads/94/
> 
> This protection from Ngreedia can be overcome by using a hardware programmer.
> 
> https://github.com/LaneLyng/MobilePascalTDPTweaker/releases
> 
> I already unlocked my "locked" laptop 1070 card from 115W TDP to 151-170W (same as the desktop version) and increased the throttling temp (boost clocks) to 71C.
> Now my laptop 1070 is faster than a desktop one (~20500 firestrike graphics score). I also never get power limit throttling with 1886 mhz GPU boost clock (+250 mhz)
> Note I disabled boost 2 (aka automatic overclocking) clocks, so my core clock always stays at 1886 mhz without dropping.
> 
> The author of that tool managed to pull 230W from a laptop 1070.


Thanks for the info, but you're missing my point: even with an official BIOS unlock for Pascal, it's not worth doing. Volta is around the corner.


----------



## gtbtk

Quote:


> Originally Posted by *Falkentyne*
> 
> My downclocking had nothing to do with DWM. It was TDP + temperature. That required a Bios hack.


Oh I see; it appeared from your post that you wanted to stop the GPU sleeping at idle.

Did you actually manage to turn it off, or just push the boost curve higher?

Pascal will upvolt if there is headroom, or downclock as temps rise, as a function of the chip, but you can actually manage it with the curves as well.


----------



## syl1979

Quote:


> Originally Posted by *Halseluk*
> 
> If I could prevent my card drop from 2000 MHz, I'd be very happy.
> 
> But buying the necessary things for flashing is expensive to me.


The frequency drop is mostly due to temperature; there should be some power margin left for 2000MHz.

Try undervolting your card (for instance, 2025MHz at 0.987v, which would allow two 12.5MHz frequency-reduction steps), max out the cooling (fan at 100%), and check the behavior under load.

Lower voltage => less power => less heat => less frequency drop
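That last line can be made concrete with the usual first-order dynamic-power approximation P ∝ V² · f. This is only a rough model (real cards add leakage and board power), and the voltage/clock points below are illustrative values from this thread, not measurements:

```python
# First-order estimate of how an undervolt cuts GPU power draw,
# using the dynamic power approximation P ~ V^2 * f.
# Baseline and undervolt points are illustrative values only.

def relative_power(v, mhz, v_ref, mhz_ref):
    """Power relative to the reference point, via P ~ V^2 * f."""
    return (v / v_ref) ** 2 * (mhz / mhz_ref)

# e.g. 2025 MHz at 0.987 V vs. a baseline of 2050 MHz at 1.062 V
ratio = relative_power(0.987, 2025, 1.062, 2050)
print(f"undervolted power: {ratio:.0%} of baseline")
```

Even giving up only ~25MHz, the voltage term dominates: dropping from 1.062v to 0.987v alone cuts estimated dynamic power by roughly 15%, which is why the undervolt buys so much thermal headroom.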


----------



## gtbtk

Quote:


> Originally Posted by *Halseluk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Falkentyne*
> 
> The Bios is not locked. It's "protected."
> I think you should do a little research.
> 
> http://www.voltground.com/haven/threads/94/
> 
> This protection from Ngreedia can be overcome by using a hardware programmer.
> 
> https://github.com/LaneLyng/MobilePascalTDPTweaker/releases
> 
> I already unlocked my "locked" laptop 1070 card from 115W TDP to 151-170W (same as the desktop version) and increased the throttling temp (boost clocks) to 71C.
> Now my laptop 1070 is faster than a desktop one (~20500 firestrike graphics score). I also never get power limit throttling with 1886 mhz GPU boost clock (+250 mhz)
> Note I disabled boost 2 (aka automatic overclocking) clocks, so my core clock always stays at 1886 mhz without dropping.
> 
> The author of that tool managed to pull 230W from a laptop 1070.
> 
> 
> 
> If I could prevent my card drop from 2000 MHz, I'd be very happy.
> 
> But buying the necessary things for flashing is expensive to me.
Click to expand...

That is not that difficult to achieve using the curve in Afterburner.

With a desktop card, if you leave the voltage slider at 0, the two voltages that matter are 1.050v and 1.061v. With the voltage at +100, the two voltages you need are 1.081v and 1.093v.

I'm not sure if you can adjust voltage on a laptop 1070, but you should be able to identify the highest voltage steps available for that card by running Afterburner.

You will need to manage your temps with a fan curve to keep them as low as you reasonably can. 2000MHz is not that high or hard on the card, so it should not be difficult.

Using the curve, set the 1.050v point (or the 1.081v point, depending on your voltage setting) to your initial target frequency (Ctrl-F opens the curve window in AB). Click that point on the curve, drag it up to the frequency you want, and flatten the rest of the curve to the right of it. Many desktop 1070s have no problem starting at 2100MHz. Click Apply.

You can also use the slider while looking at the curve: raise the slider until the 1.050v or 1.081v point hits the level you want, then pull the next higher voltage point down to the same level as your target point.

What you will find is that the card starts at the frequency you set. As temps increase, the first step the GPU takes is to raise the voltage to the next higher step while keeping the frequency the same. At the next temperature step, the frequency drops by 12MHz; at the third, the voltage rises to the higher level again while the frequency holds at the new level, and so on. Simply maxing the whole curve will make the card downclock with every 3-degree temperature increase, because there is no voltage headroom left for it to run at the next higher voltage. Using the point one step down from the top keeps that extra voltage step available and doubles the temperature range the card can absorb before it has to drop the frequency.

The trick is to balance the fans to hold max temps at a sensible level, say 60 degrees. Starting at 2100MHz will probably settle at around 2050-2076MHz.

The same principle applies to a laptop: adjust the fans as best you can and balance the curve to match the range of variation you get in that form factor. The key is still to use voltage level n-1 instead of the max voltage. The GPU can step up two voltage levels before dropping frequency if you use level n-2, which gives you about a 12-degree working range to play with to hold your frequency.

This is an example using n-2 with +100 voltage:
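The "set one point and flatten everything to its right" trick can also be modeled in a few lines. Afterburner itself only exposes this through its GUI, so the sketch below just models the resulting curve, and the point values are made up for illustration:

```python
# Model an Afterburner-style voltage/frequency curve as a list of
# (voltage, MHz) points, then apply the flattening trick described
# above: raise the chosen voltage point to the target frequency and
# clamp every higher-voltage point to the same frequency.
# The input points are made-up illustrative values.

def flatten_curve(points, target_v, target_mhz):
    """Return a new curve with target_v raised to target_mhz and all
    points at or above target_v clamped to target_mhz."""
    return [
        (v, target_mhz if v >= target_v else mhz)
        for v, mhz in sorted(points)
    ]

stock = [(1.000, 1936), (1.050, 2000), (1.081, 2050), (1.093, 2063)]
tuned = flatten_curve(stock, 1.081, 2100)
print(tuned)  # points at 1.081 V and above now sit at 2100 MHz
```

With +100 voltage, flattening at 1.081v (the n-1 point) leaves 1.093v at the same frequency, so the card can absorb one temp step by upvolting before it has to shed 12MHz.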


----------



## gtbtk

Quote:


> Originally Posted by *syl1979*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Halseluk*
> 
> If I could prevent my card drop from 2000 MHz, I'd be very happy.
> 
> But buying the necessary things for flashing is expensive to me.
> 
> 
> 
> The frequency drop is mostly due to temperature, there should be some power margin for 2000mhz.
> 
> Try to undervolt your card (for instance 2025mhz for 0.987v, that would allow 2 steps of 12.5mhz frequency reduction) , max the cooling (fan 100%), check behavior under load.
> 
> Low voltage => less power => less heat => less frequency drop
Click to expand...

Voltages and frequency will move around regardless of undervolting; the card starts managing temps below 40 degrees. Undervolting only reduces the temperature rise from idle, which reduces the number of temperature steps the card has to manage.

The one-step-below-max-voltage approach will work at any voltage level.


----------



## Falkentyne

Quote:


> Originally Posted by *gtbtk*
> 
> oh I see, it appeared from your post that you wanted to stop the GPU sleeping at idle.
> 
> Did you actually manage to turn it off or just push the boost curve higher?
> 
> Pascal will upvolt if there is headroom or down clock as temps rise as a function of the chip but you can actually manage it with the curves as well.


I used the '8A' BIOS that many tested on normal locked 1070s and found to perform more stably with fewer core fluctuations (= more stable frametimes), because MSI disabled boost '2' (automatic overclocking) clocks in that BIOS, so it will never exceed the main boost clock. I then modded the 8A BIOS to raise the TDP and temp throttling point to 151-170W and 71C (stock was a fixed 115W and 54C, although boost '2' clocks usually start throttling at 42C, dropping 13MHz every 5C).

This way, when I overclock to +250/+500, the core stays at 1886MHz and never moves. Stable frametimes are extremely important (not framerate).


----------



## microchidism

Not sure if this helps anyone, but in my experience with the 1070, all fluctuations ended up being due to temps. Once that was under control, my clocks and voltage were 100% steady in every game and application. It took probably a whole evening messing with the boost curve, but ultimately it was worth it.


----------



## Halseluk

Quote:


> Originally Posted by *syl1979*
> 
> The frequency drop is mostly due to temperature, there should be some power margin for 2000mhz.
> 
> Try to undervolt your card (for instance 2025mhz for 0.987v, that would allow 2 steps of 12.5mhz frequency reduction) , max the cooling (fan 100%), check behavior under load.
> 
> Low voltage => less power => less heat => less frequency drop


Undervolt + overclock only works with good overclocking GPUs. It won't work with mine because I didn't win the silicon lottery.


----------



## drunkonpiss

Anyone running their 1070 with a 3440x1440 monitor? I'm just wondering what your OC is like, and do you opt for High settings rather than Ultra to get more frames?


----------



## icold

Quote:


> Originally Posted by *drunkonpiss*
> 
> Anyone running their 1070 wth a 3440x1440 monitor? I'm just wondering what is your OC like and do you opt for High settings rather than Ultra to get more frames?


Not good; you need a GTX 1080 Ti. Some games don't even run that well at 1080p with a GTX 1070, like Deus Ex: Mankind Divided.


----------



## gtbtk

Quote:


> Originally Posted by *Falkentyne*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> oh I see, it appeared from your post that you wanted to stop the GPU sleeping at idle.
> 
> Did you actually manage to turn it off or just push the boost curve higher?
> 
> Pascal will upvolt if there is headroom or down clock as temps rise as a function of the chip but you can actually manage it with the curves as well.
> 
> 
> 
> I used the '8A' Bios that was tested by many on normal locked 1070's to perform more stable with less core fluctuations (= more stable frametimes), because that Bios has boost '2' (automatic overclocking) clocks disabled by MSI (so it will never exceed the boost clock), so it won't go above the main boost clocks. I then modded the 8A bios to increase the TDP and temp throttling point to 151-170W and 71C.(stock was 115w fixed and 54C, although usually boost "2" clocks start throttling at 42C, with -13 mhz every 5C.
> 
> This way when I overclock to +250/+500, the core stays at 1886 mhz and never moves. Stable frametimes are extremely important (not framerate).
Click to expand...

Well, that is one way to do it. Yours is a laptop though, right?

I flashed my Gaming X with a Gaming Z BIOS, run an aggressive fan curve, clock the memory to +640 over reference, run +0 volts, and set a curve that starts at 2088MHz and settles at 2076MHz while the card gets no hotter than about 53 degrees. Doing that, I was running 21300 Firestrike graphics scores 6 months ago. If I run +100 on the voltage, temps increase to 59 degrees and the clocks stay at 2088MHz, but it only gains me an extra 50 points sometimes, so it is not worth the bother.

My rig died and I have yet to replace it, but I think drivers have improved things a bit since then, so it should be even better now.


----------



## wefornes

Hello everyone, I have just ordered a GTX 1070 Sea Hawk EK and I was wondering if anyone has one already, and what kind of OC you are getting without a vcore mod. Many thanks.


----------



## gtbtk

Quote:


> Originally Posted by *wefornes*
> 
> Hello everyone, i have just ordered a gtx 1070 seahawk ek and i was wondering if anyone has one alredy. And what kind of Oc is getting with out vcore mod. Many thanks


I have the Gaming X, which is the same PCB with an air cooler. It is not a unicorn card, but not a terrible one either, and I have Micron rather than Samsung memory. The water cooler only adds benefits in terms of temps, but Pascal loves being able to run cool.

With fans at 100%, I can keep my card in the 50-60 degree range under full load, and after much tuning with the curve, I can run Firestrike graphics scores in the 21200-21500 range. With your cooler, I would estimate a temp of around 40-45 degrees under load, so a score around 21500-22000 may be possible with good tuning.


----------



## gtbtk

Quote:


> Originally Posted by *Halseluk*
> 
> Undervolt + overclock just works with good overcloking gpus. It won't work with mine because I didn't win the silicon lottery.


I have been running my card on an ASUS P8Z68-V board with an i7-2600. I found that my card liked a slight increase in VCCIO voltage, and that a slight increase in CPU PLL voltage resolved an overclocking headroom limitation around the 1.0v level on the GPU voltage curve that had been limiting my overclocks.


----------



## outofmyheadyo

Does anyone know what PCB or power limit the MSI Sea Hawk EK X 1070 cards have? I'm considering buying one since I can get a good deal. I don't really game that often, but it would be nice to have a card that could play some games if I wanted to.
Coming from 3 different 1080 Tis, which I sold because of massive coil whine, and, well, they were a bit overkill for browsing reddit.


----------



## Halseluk

Quote:


> Originally Posted by *gtbtk*
> 
> I have been running my card on an asus p8z68-v board with an i7-2600. I found that my card liked a slight increase in VCCIO voltage and that a slight increase in CPU PLL voltage resolved an overclocking headroom limitation at around the 1.0v level on the gpu voltage curve that had been limiting my overclocks


Yeah, I remember you told me about this when you were giving me some hints. My P8Z77-V doesn't have VCCIO. PLL voltage is on Auto, and I don't know how much I can increase it.


----------



## Madmaxneo

Quote:


> Originally Posted by *gtbtk*
> 
> I have the Gaming X which is the same PCB with an air cooler. It is not a unicorn card but also not a terrible card either and I have micron and not samsung memory. The water cooler only adds benefits in terms of temps but Pascal loves being able to run cool
> 
> With 100% fans running, I can run my card in the 50-60 deg range under full load, after much tuning using the curve, I can run Firestrike graphics scores in the 21200-21500 range. with your cooler, I would estimate a temp of around 40-45 deg under load so a score around 21500-22000 may be possible with good tuning


You get a 50 to 60 degree temp range when running Firestrike and your card is watercooled? What cooler are you running on it?
My max temps are in the high 40s/low 50s when running Firestrike, but I have since dropped back down to my card's base clock. I am running a Swiftech H140-X into a Heatkiller IV Pro GPU block for my 1070.


----------



## pez

Quote:


> Originally Posted by *drunkonpiss*
> 
> Anyone running their 1070 wth a 3440x1440 monitor? I'm just wondering what is your OC like and do you opt for High settings rather than Ultra to get more frames?


Quote:


> Originally Posted by *icold*
> 
> Not good
> Not good, you need a gtx 1080ti. Some games dont want so good at 1080p with gtx 1070 like deus ex making divided.


I wouldn't say you need a 1080 Ti for it. You'll definitely make more compromises than some of us (including myself) like to make, but if you opt for a display with G-Sync like the X34, I think you'll still be relatively happy until the next GTX xx70 card comes out and matches the TXP/1080 Ti performance margin.


----------



## kava2126

Before overclocking my new EVGA 1070 SC, I was going for a baseline benchmark, but I noticed it kept hitting the power limit. I haven't changed anything yet; I'm using Precision XOC. I increased the power target, but there was no change. I have the voltage turned all the way down and it still jumps to 1063mV. The temp target is turned all the way up to 92C and set as priority. My temps never go above 40C, so it doesn't make sense why it still throttles back. Any idea what any of this means?

It has been 25 minutes since I last checked it. It now says the voltage is set to 1050mV, but it still hits 1063mV, and the Power % keeps going up until it throttles back. It seems as if the card is stuck wide open and just keeps throttling back after hitting the power limit. Any help would be much appreciated.


----------



## gtbtk

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Does anyone know what PCB or powerlimit do we have on MSI Sea Hawk EK X 1070 cards ? Considering buying one since I can get a good deal, dont really game that often, but would be nice to have a card that could play some games if I wanted to.
> Coming from 3 different 1080Ti-s wich I sold because of massive coilwhine, and well they were a bit overkill for browsing reddit.


It is the same PCB as the Gaming X. The power limit slider goes to 126% and max power is listed at 291W, although I have never gotten mine to pull more than about 250W.


----------



## gtbtk

Quote:


> Originally Posted by *Halseluk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I have been running my card on an asus p8z68-v board with an i7-2600. I found that my card liked a slight increase in VCCIO voltage and that a slight increase in CPU PLL voltage resolved an overclocking headroom limitation at around the 1.0v level on the gpu voltage curve that had been limiting my overclocks
> 
> 
> 
> Yeah, I remember when you were giving me some hints you told me about this. My P8Z77-V doesn't have VCCIO. PLL voltage is in Auto, I don't know how much I can increase.
Click to expand...

The ASUS BIOS lets you increase settings in steps by pressing the plus key. CPU PLL defaults to 1.8v; on my board I ended up at 1.8138v, which I think was about 5 taps on the plus key. My adjustment was not extreme.

The way I dialed it in was to increase by one step and run a benchmark. Scores kept getting better up until 1.8138v, then started dropping off again as I kept increasing.

The BIOS does have VCCIO; you may need to enable manual mode in AI Tweaker to display it.
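That dial-it-in procedure is just a one-dimensional hill climb: raise the setting one step at a time and stop once the benchmark score stops improving. A sketch of the loop, where `score()` is a stand-in for an actual benchmark run and the numbers are illustrative:

```python
# Sketch of the step-and-benchmark tuning loop described above:
# raise a voltage one step at a time and stop once the benchmark
# score stops improving. score() is a placeholder for a real run.

def tune(start, step, score, max_steps=20):
    """Increase the setting stepwise; return the best value found."""
    best_value, best_score = start, score(start)
    value = start
    for _ in range(max_steps):
        value = round(value + step, 4)
        s = score(value)
        if s <= best_score:
            break  # past the peak: stop and keep the best value
        best_value, best_score = value, s
    return best_value

# Toy score function that peaks near 1.8138 V (illustrative only).
peak = tune(1.8000, 0.0028, lambda v: -abs(v - 1.8138))
print(peak)
```

Note this assumes the score rises to a single peak and then falls, which matched the behavior reported here; a noisy benchmark would need repeated runs per step before trusting a "drop".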


----------



## gtbtk

Quote:


> Originally Posted by *kava2126*
> 
> Before I overclock my new EVGA 1070 SC, I was going for a baseline benchmark but I noticed it kept hitting the power limit. I haven't changed anything yet, using PrecisionXOC. Increased power target but no change. Have voltage turned all the way down and it still jumps to 1063mV. Temp target turned all the way to 92c and set as priority. My temps never go above 40c, so doesn't make sense why it still throttles back. Any idea what any of this means?
> 
> Has been 25min since last checked it, now says voltage is set to 1050mV but still hitting 1063mV and the Power % keeps going up until it throttles back. Seems as if card is stuck wide open and just keeps throttling back after it keeps hitting power limit. Any help would be much appreciated.


The EVGA BIOS bounces off the power limit more than the BIOSes from any other brand, so what you are seeing is not unusual.

With the voltage slider at 0, the top voltage you will see is 1.063v. With the voltage slider at 100%, voltage can go up to 1.093v.

Pascal will start adjusting voltage and clocks as the temperature increases. It does that even if the temp is below 40C. It is the way GPU Boost works and is not throttling.

To get the most stable clocks, you want the card to naturally start out at 1.050v. As the temps increase, it will boost voltage to 1.063v if there is headroom. As the temp increases again, the voltage will drop back to 1.050v and the clock will drop by 12MHz until the next temp step, when it will increase the voltage again.

As you have an EVGA card, the easiest way to get close to a good overclock is to run the auto OC utility in Precision XOC and pay attention to the curve that it sets. It may not be exactly stable, but it will show you the parts of the curve that love to be overclocked and the parts that do not. If the process crashes, just restart Precision and it will continue where it crashed out.
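For anyone who wants a mental model of that stepping, here is a toy sketch. The 12MHz bin size comes from the behaviour described above; the temperature step points are my guesses, since Nvidia does not publish GPU Boost's real tables:

```python
# Toy model of the behaviour described above: GPU Boost sheds one 12MHz
# clock bin at each temperature step. The step temperatures here are
# illustrative guesses; Nvidia does not publish the real tables.
BIN_MHZ = 12

def modelled_clock(max_boost_mhz, temp_c, steps=(40, 48, 56, 64, 72)):
    """Drop one bin for every temperature step the core has crossed."""
    bins_dropped = sum(1 for t in steps if temp_c >= t)
    return max_boost_mhz - bins_dropped * BIN_MHZ

print(modelled_clock(2088, 35))  # cool card: full boost, prints 2088
print(modelled_clock(2088, 70))  # warm card: four bins down, prints 2040
```

The point of the model is just that the downclocking is temperature-driven and stepwise, which is why it is normal behaviour and not power throttling.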


----------



## VeauX

I do, and I'm very happy with it. Mostly on high and ultra in everything I play. My monitor is 60Hz only; when something is not working I just decrease the AA to something less taxing. I was running an OC at around 2080MHz, but lately I removed it. The card at stock already boosts to around 1980MHz (Giga G1 Gaming), and I did not notice any difference.


----------



## wefornes

Hello, has anyone tried Nvidia Fast Sync on a 144Hz monitor that came with FreeSync?

Sent from my iPhone using Tapatalk


----------



## khanmein

Quote:


> Originally Posted by *wefornes*
> 
> Hello, has anyone tried Nvidia Fast Sync on a 144Hz monitor that came with FreeSync?
> 
> Sent from my iPhone using Tapatalk


https://www.blurbusters.com/gsync/gsync101-input-lag/


----------



## gtbtk

Quote:


> Originally Posted by *wefornes*
> 
> Hello, has anyone tried Nvidia Fast Sync on a 144Hz monitor that came with FreeSync?
> 
> Sent from my iPhone using Tapatalk


FreeSync would have to be turned off, of course, so Fast Sync would be running as it does on any non-G-Sync 144Hz monitor.

I have not tested it at 144Hz, but it does work effectively at 75Hz with significantly less latency than V-Sync.


----------



## Blackfirehawk

After I replaced the thermal pads and thermal compound on the GPU ... 1 hour stress test on my Gainward GTX 1070 with a Gainward Phoenix GLH BIOS



I need 1.043v for a stable GPU clock of 2012MHz :/.. didn't win the silicon lottery, or the cooler isn't that great on the standard Gainward GTX 1070 (non-Phoenix, but with the Phoenix GLH BIOS)

but I think it's a decent memory clock: +1500MHz on Micron memory,
it runs at 9500MHz

I tested it up to 9580.. @ 9600MHz Heaven starts crashing sometimes


----------



## Hunched

Quote:


> Originally Posted by *Blackfirehawk*
> 
> but I think it's a decent memory clock: +1500MHz on Micron memory,
> it runs at 9500MHz
> 
> I tested it up to 9580.. @ 9600MHz Heaven starts crashing sometimes


Play some games and see if it ever crashes or locks up for a second.
I can pass benchmarks at +700, but I've lowered it by 25 every time I have a game crash from my GPU, and I'm down to like +300 by now...
Recently a game was freezing up for a second like every ~5, and it stopped when I lowered it.
Visual glitches will happen before crashes do, too.
I could pass at +900 or more, just with missing textures/black voids and flickering.

It's weird how instability can exist at less than half of what you can pass benchmarks with.
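In code form, the back-off I'm describing is trivial (a sketch; the numbers are just from my own card):

```python
# Sketch of the step-down approach above: find the highest offset that
# passes benchmarks, then knock 25 off every time a game crashes.
def settled_offset(bench_stable, game_crashes, step=25):
    """Memory offset left after backing off once per in-game crash."""
    return bench_stable - step * game_crashes

print(settled_offset(700, 16))  # prints 300
```

The benchmark-stable number is just a starting point; real games decide where the offset settles.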


----------



## gtbtk

Quote:


> Originally Posted by *Blackfirehawk*
> 
> After I replaced the thermal pads and thermal compound on the GPU ... 1 hour stress test on my Gainward GTX 1070 with a Gainward Phoenix GLH BIOS
> 
> 
> 
> I need 1.043v for a stable GPU clock of 2012MHz :/.. didn't win the silicon lottery, or the cooler isn't that great on the standard Gainward GTX 1070 (non-Phoenix, but with the Phoenix GLH BIOS)
> 
> but I think it's a decent memory clock: +1500MHz on Micron memory,
> it runs at 9500MHz
> 
> I tested it up to 9580.. @ 9600MHz Heaven starts crashing sometimes


1.043v is undervolting the card. Without increasing the voltage slider, the card will go up to 1.063v.

If you really want to undervolt, see how fast you can get the card using the .950v point. I can run my card at 2012MHz at .950v and set the 1.050v point to go all the way to 2088. It will settle at 2076. Those settings get me over 21000 on the Firestrike graphics score.

9500MHz is a great memory overclock. I start getting memory errors on my card above about 1300MHz over reference.


----------



## zipper17

Quote:


> Originally Posted by *Blackfirehawk*
> 
> After I replaced the thermal pads and thermal compound on the GPU ... 1 hour stress test on my Gainward GTX 1070 with a Gainward Phoenix GLH BIOS
> 
> 
> 
> I need 1.043v for a stable GPU clock of 2012MHz :/.. didn't win the silicon lottery, or the cooler isn't that great on the standard Gainward GTX 1070 (non-Phoenix, but with the Phoenix GLH BIOS)
> 
> but I think it's a decent memory clock: +1500MHz on Micron memory,
> it runs at 9500MHz
> 
> I tested it up to 9580.. @ 9600MHz Heaven starts crashing sometimes


What's your Firestrike graphics score?

Btw, for your overclocked memory you really need to make sure it's stable: play some games and run some stability tests. Even the tiniest visual artifact means it's not really stable.

My memory is overclocked to +600. I can go past +600, but it will definitely produce very tiny artifacts on the screen.

For stability testing I recommend the 3DMark stress tests (Firestrike Extreme or Time Spy): run 20 loops and pass with at least 97%. As for Heaven/Valley, I never see my card artifacting or crashing unless it is very, very highly overclocked.
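For reference, the 3DMark stress-test pass mark is frame-rate stability: the slowest loop's average FPS as a percentage of the fastest loop's, which must be at least 97%. A quick sketch (the FPS numbers are invented):

```python
# 3DMark's stress-test pass criterion: frame-rate stability, the slowest
# loop's average FPS as a percentage of the fastest loop's, must be >= 97%.
def frame_rate_stability(loop_avg_fps):
    return 100.0 * min(loop_avg_fps) / max(loop_avg_fps)

loops = [121.3, 120.9, 120.1, 119.8] + [120.5] * 16  # 20 loops, made-up data
print(frame_rate_stability(loops) >= 97.0)  # prints True for this run
```

A card that throttles or glitches partway through the loops drags the slowest loop down and fails the 97% bar even if no single run crashes.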


----------



## Blackfirehawk

It let me pass the Firestrike benchmark.

Didn't see any artifacts after an hour of Firestrike..
played yesterday
2 hours COD BO3
3 hours BF1
and today about 1 hour of Watch Dogs

no crashes / no artifacts

Quote:


> Originally Posted by *gtbtk*
> 
> 1.043v is undervolting the card. Without increasing the voltage slider, the card will go up to 1.063v.
> 
> If you really want to undervolt, see how fast you can get the card using the .950v point. I can run my card at 2012MHz at .950v and set the 1.050v point to go all the way to 2088. It will settle at 2076. Those settings get me over 21000 on the Firestrike graphics score.
> 
> 9500MHz is a great memory overclock. I start getting memory errors on my card above about 1300MHz over reference.


fans are already at 100% .. the card doesn't like the .950v point @ 2012
and even with 100% fan speed @ 1.042v and 2012 the card hits 70 degrees after 1 hour of gaming

the cooler is not the greatest on this card


----------



## b0uncyfr0

I've been stuck at 2088/2076 for a while at stock voltage. I think my card is fairly good, but a solid 2100 core would be amazing (I barely hit 60 degrees too). I haven't had time to go through OC'ing with the curve, but it looks very time consuming.


----------



## icold

Quote:


> Originally Posted by *b0uncyfr0*
> 
> I've been stuck at 2088/2076 for a while at stock voltage. I think my card is fairly good, but a solid 2100 core would be amazing (I barely hit 60 degrees too). I haven't had time to go through OC'ing with the curve, but it looks very time consuming.


No sense using stock voltage; Pascal has a low max voltage. Set 1.093v and you can push past 2.1GHz. I flashed my normal 1070 Strix BIOS to the Strix OC edition and push 2139MHz @ 1.093v + 120% power target. It helps a lot with temp throttling, going from 2050MHz to 2114MHz, with max temps around 65C.


----------



## zipper17

Quote:


> Originally Posted by *Blackfirehawk*
> 
> 
> 
> It let me pass the Firestrike benchmark.
> 
> Didn't see any artifacts after an hour of Firestrike..
> played yesterday
> 2 hours COD BO3
> 3 hours BF1
> and today about 1 hour of Watch Dogs
> 
> no crashes / no artifacts
> fans are already at 100% .. the card doesn't like the .950v point @ 2012
> and even with 100% fan speed @ 1.042v and 2012 the card hits 70 degrees after 1 hour of gaming
> 
> the cooler is not the greatest on this card


Your OC seems higher than mine but has a lower graphics score (19,xxx).
It seems like your GPU is getting thermal throttled. 1070 FE card?

Mine scored 20,9xx - FS GS
with peak clock 2076MHz, average clock 2038-2050MHz,
Samsung mem clock +600, 4600/9200MHz,
60-70C, 100% fan, 100% voltage.

Other members here scored 21k+, some 22k.
I can get 21k+, but the memory will artifact if I increase the mem clock, or the GPU will crash if I increase its core clock.

Edit: tbh I'm still seeking to break 21k+ perfectly 100% stable since I bought the card, but it seems like I'm already on the edge. :SS

Mega Edit:

Programs I use to verify 100% stability (no crashes or tiny artifacts):

3DMark base benchmarks (Firestrike, FS Extreme, Time Spy)
3DMark stress tests (Firestrike Extreme, Time Spy) [20 loops, pass at least 97%]
Witcher 3 (1440p, highest possible settings)
GTA 5 (1440p, highest possible settings)
Hitman (1440p, highest possible settings)
Doom (1440p, highest possible settings)
and several other games


----------



## Blackfirehawk

Quote:


> Originally Posted by *zipper17*
> 
> Your OC seems higher than mine but has a lower graphics score (19,xxx).
> It seems like your GPU is getting thermal throttled. 1070 FE card?
> 
> Mine scored 20,9xx - FS GS
> with peak clock 2076MHz, average clock 2038-2050MHz,
> Samsung mem clock +600, 4600/9200MHz,
> 60-70C, 100% fan, 100% voltage.
> 
> Other members here scored 21k+, some 22k.
> I can get 21k+, but the memory will artifact if I increase the mem clock, or the GPU will crash if I increase its core clock.
> 
> Edit: tbh I'm still seeking to break 21k+ perfectly 100% stable since I bought the card, but it seems like I'm already on the edge. :SS
> 
> Mega Edit:
> 
> Programs I use to verify 100% stability (no crashes or tiny artifacts):
> 
> 3DMark base benchmarks (Firestrike, FS Extreme, Time Spy)
> 3DMark stress tests (Firestrike Extreme, Time Spy) [20 loops, pass at least 97%]
> Witcher 3 (1440p, highest possible settings)
> GTA 5 (1440p, highest possible settings)
> Hitman (1440p, highest possible settings)
> Doom (1440p, highest possible settings)
> and several other games


yeah, my card is thermal throttling.
I can go +100% voltage and the card goes to 74-75 degrees and automatically clocks down under 2000MHz,
even with the temp limit on max.

If I hold the card via the curve at the 1.042v point @ 2012MHz, it's stable @ 70-72 degrees @ 100% fan, and I can play for some hours without crashes or artifacts.

It's not a FE, it's a Gainward:
http://www.gainward.com/main/vgapro.php?id=987&lang=de
(not the Phoenix or GLH or so)
I'm thinking the cooler is just a little bit better than the FE version (really quiet at 100% fan speed / 2500rpm) but not designed with much headroom to OC.

I'm thinking about getting an aftermarket cooler,
maybe a Morpheus 2...


----------



## nuno_p

Is it possible to undervolt lower than 800mv?


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Blackfirehawk*
> 
> After I replaced the thermal pads and thermal compound on the GPU ... 1 hour stress test on my Gainward GTX 1070 with a Gainward Phoenix GLH BIOS
> 
> 
> 
> I need 1.043v for a stable GPU clock of 2012MHz :/.. didn't win the silicon lottery, or the cooler isn't that great on the standard Gainward GTX 1070 (non-Phoenix, but with the Phoenix GLH BIOS)
> 
> but I think it's a decent memory clock: +1500MHz on Micron memory,
> it runs at 9500MHz
> 
> I tested it up to 9580.. @ 9600MHz Heaven starts crashing sometimes
> 
> 
> 
> What's your Firestrike graphics score?
> 
> Btw, for your overclocked memory you really need to make sure it's stable: play some games and run some stability tests. Even the tiniest visual artifact means it's not really stable.
> 
> My memory is overclocked to +600. I can go past +600, but it will definitely produce very tiny artifacts on the screen.
> 
> For stability testing I recommend the 3DMark stress tests (Firestrike Extreme or Time Spy): run 20 loops and pass with at least 97%. As for Heaven/Valley, I never see my card artifacting or crashing unless it is very, very highly overclocked.
Click to expand...

OCCT will test graphics card memory and tell you if it is throwing memory errors. The card will error correct if there are only intermittent single errors. When the OC gets up over +500, you will reach a point where the number of errors starts to increase dramatically, and that is when you typically start seeing instability. That point could be +500 or +600, it could be +750; it is card dependent.


----------



## gtbtk

Quote:


> Originally Posted by *nuno_p*
> 
> Is it possible to undervolt lower than 800mv?


Not with any of the software I have seen.

800mv is idle voltage; why would you want to try going below that?


----------



## gtbtk

Quote:


> Originally Posted by *Blackfirehawk*
> 
> 
> 
> It let me pass the Firestrike benchmark.
> 
> Didn't see any artifacts after an hour of Firestrike..
> played yesterday
> 2 hours COD BO3
> 3 hours BF1
> and today about 1 hour of Watch Dogs
> 
> no crashes / no artifacts
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> 1.043v is undervolting the card. Without increasing the voltage slider, the card will go up to 1.063v.
> 
> If you really want to undervolt, see how fast you can get the card using the .950v point. I can run my card at 2012MHz at .950v and set the 1.050v point to go all the way to 2088. It will settle at 2076. Those settings get me over 21000 on the Firestrike graphics score.
> 
> 9500MHz is a great memory overclock. I start getting memory errors on my card above about 1300MHz over reference.
> 
> 
> 
> fans are already at 100% .. the card doesn't like the .950v point @ 2012
> and even with 100% fan speed @ 1.042v and 2012 the card hits 70 degrees after 1 hour of gaming
> 
> the cooler is not the greatest on this card
Click to expand...

70 degrees is not too hot for the card. You can run it up to about 93, but that is pretty warm. Stock is still 84 degrees before it will start thermal throttling.

I added a side panel fan to my case and it helped a lot with temps. Maybe you need to consider adding additional case fans to improve airflow in your case?


----------



## nuno_p

Quote:


> Originally Posted by *gtbtk*
> 
> Not with any of the software I have seen.
> 
> 800mv is idle voltage why would you want to try going below that?


The idle voltage of both of my cards is around 650mv, and since Pascal GPUs are really good with low voltage, I want to know how far I could go to lower the temps (my ambient temp is around 30 to 35 degrees Celsius) and power consumption.


----------



## icold

Guys, if you want reduce more temps, change stock thermal paste to liquid metal, this can reduce until 10C your GPU:




Sure this depends your cooler


----------



## nuno_p

Quote:


> Originally Posted by *icold*
> 
> Guys, if you want reduce more temps, change stock thermal paste to liquid metal, this can reduce until 10C your GPU:
> 
> 
> 
> 
> Sure this depends your cooler


I have one 1070 Strix and one 1060 Gaming X, but I will lose the warranty if I change the thermal paste.


----------



## GoLDii3

Quote:


> Originally Posted by *icold*
> 
> Guys, if you want reduce more temps, change stock thermal paste to liquid metal, this can reduce until 10C your GPU:
> 
> 
> 
> 
> Sure this depends your cooler


You should also warn people about using it with certain metals. I don't remember which ones right now, but liquid metal causes corrosion with some.


----------



## nuno_p

Quote:


> Originally Posted by *GoLDii3*
> 
> Quote:
> 
> 
> 
> Originally Posted by *icold*
> 
> Guys, if you want reduce more temps, change stock thermal paste to liquid metal, this can reduce until 10C your GPU:
> 
> 
> 
> 
> Sure this depends your cooler
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You should also warn people about using it with certain metals. I don't remember which ones right now, but liquid metal causes corrosion with some.
Click to expand...

Liquid metal corrodes aluminium.


----------



## wefornes

i5 7600K + 1070 Sea Hawk X at stock settings (GPU core 1975MHz most of the time without OC).
Many thanks.

Sent from my iPhone using Tapatalk


----------



## icold

Quote:


> Originally Posted by *nuno_p*
> 
> Liquid metal corrodes aluminium.


causes it thermal grizzly conductonaut???? I hear about corrosion with Collaboratory liquid ultra


----------



## $ilent

Can I join the club? MSI 1070 Gaming Z, 2012MHz stock boost on the core. Seems like a decent boost?


----------



## gtbtk

Quote:


> Originally Posted by *nuno_p*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Not with any of the software I have seen.
> 
> 800mv is idle voltage why would you want to try going below that?
> 
> 
> 
> The idle voltage of both of my cards is around 650mv, and since Pascal gpus are really good with low voltage i want to know how far i could go to lower the temps (my ambient temp is around 30 to 35 degrees celcius) and power consumption.
Click to expand...

I honestly don't know. I like to hotrod my cards; undervolting to that level seems counterproductive to me.

The best way to find out is to try it and see what happens. As long as you don't have the test profile set to auto-start at boot, the worst that can happen is a crash and reboot.


----------



## gtbtk

Quote:


> Originally Posted by *icold*
> 
> Quote:
> 
> 
> 
> Originally Posted by *b0uncyfr0*
> 
> Ive been stuck at 2088/2076 for awhile at stock voltage. I think my card is fairly good but a solid 2100 core would be amazing (i barely hit 60 degrees too) I havent had time to go through oc'ing with the curve but it looks very time consuming.
> 
> 
> 
> No sense using stock voltage; Pascal has a low max voltage. Set 1.093v and you can push past 2.1GHz. I flashed my normal 1070 Strix BIOS to the Strix OC edition and push 2139MHz @ 1.093v + 120% power target. It helps a lot with temp throttling, going from 2050MHz to 2114MHz, with max temps around 65C.
Click to expand...

@b0uncyfr0 Increase the voltage slider to +100, and from your current overclock pull the 1.093v point on the curve up to 2100MHz and test. You can try increasing that in steps of 12MHz and see how high you can get it.


----------



## gtbtk

Quote:


> Originally Posted by *GoLDii3*
> 
> Quote:
> 
> 
> 
> Originally Posted by *icold*
> 
> Guys, if you want reduce more temps, change stock thermal paste to liquid metal, this can reduce until 10C your GPU:
> 
> 
> 
> 
> Sure this depends your cooler
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You should also warn people about using it with certain metals. I don't remember which ones right now, but liquid metal causes corrosion with some.
Click to expand...

Liquid metal does not go well with aluminium.


----------



## zipzop

Most if not all cooler base plates for GPUs are nickel-plated copper nowadays... I applied CLU to my 1070 last October and can confirm it does a great job! I disassemble my card once every couple of months (last time being just a couple of days ago) to remove dust, and there's no corrosion on the base plate.
I have the EVGA SC model. I checked with EVGA first and they confirmed nickel-plated copper, though they did not "recommend" liquid metal TIM. But no mention of voiding the warranty.


----------



## Nukemaster

Watch out, any Asus 1070 Dual cooler users: those are aluminum with copper heat pipes.

Note the very well set up thermal pad to cool the memory, with no metal under it.

I am sure I saw some other cards with copper pipes embedded in aluminum as well.


----------



## gtbtk

Quote:


> Originally Posted by *zipzop*
> 
> Most if not all cooler base plates for GPUs are nickel plated copper nowadays...I applied CLU to my 1070 like last October and can confirm it does a great job! And I disassemble my card once every couple months(last time being just a couple days ago) to remove dust and no corrosion to the base plate.
> I have EVGA SC model, I checked with EVGA first and they confirmed nickel plated copper, though did not "recommend" liquid metal TIM. But no mention of voiding warranty.


Gallium is the component in the liquid metal that eats away aluminium. The modern standard cooler heat plates on graphics cards, as far as I am aware, are either nickel-plated copper or raw copper. The fins are aluminium.

If you end up getting the new EKWB all-aluminium CPU + GPU cooling kit, stay well away from liquid metal.

EVGA is OK with removing the stock cooler and, as long as you don't visibly damage the card, will honour the warranty. MSI, ASUS etc. do not want you removing the cooler and will void the warranty.


----------



## $ilent

Is a 2100MHz core clock at stock voltage any good?

Also, I'm having a strange issue with my 1070. At stock my memory speed is only like 7600MHz, but isn't it supposed to be 8054MHz? I'm running the latest driver and the latest versions of GPU-Z and MSI AB to monitor it.

Thanks


----------



## skupples

Quote:


> Originally Posted by *nuno_p*
> 
> Liquid metal corrodes aluminium.


and pits copper, but it's not a big deal.

Cleaning it off of your IHS will also most likely remove the printed info, which then voids the warranty. Luckily, though, GPUs don't use an IHS anymore, and the markings are laser etched.


----------



## $ilent

Why do people ignore you in this thread?


----------



## skupples

cuz i'm just a lowly dirt farmer


----------



## Falkentyne

Quote:


> Originally Posted by *icold*
> 
> causes it thermal grizzly conductonaut???? I hear about corrosion with Collaboratory liquid ultra


Dude no offense.....but your english.....i can't even understand you.....

"Reduce until 10C your gpu"? What does that mean?
"Reduces your GPU temp by 10C"?


----------



## icold

Quote:


> Originally Posted by *Falkentyne*
> 
> Dude no offense.....but your english.....i can't even understand you.....
> 
> "Reduce until 10C your gpu"? What does that mean?
> "Reduces your GPU temp by 10C"?


It reduces GPU temps by around 10C with Thermal Grizzly Conductonaut.


----------



## gtbtk

Quote:


> Originally Posted by *$ilent*
> 
> Is 2100mhz core clock at stock voltage any good?
> 
> Also I'm having strange issue with my 1070. At stock my memory speed is only like 7600mhz, but isn't it supposed to be 8054mhz? I'm running latest driver and latest version of gpuz and msi ab to monitor it.
> 
> Thanks


Are you talking about a desktop 1070? MSI Gaming X? Are you reading that in Afterburner or GPU-Z? Could you post GPU-Z and Afterburner screenshots?

Reference stock memory is 8008MHz. Afterburner will report it at 1/2 of that, 4004MHz. GPU-Z will report it as 25% of the total value. The Gaming Z BIOS has a slight factory memory overclock at 4054MHz (8108MHz, from memory). If you are only seeing 7600, then either something is strange or you have underclocked your memory to get 2100MHz stable and it is being auto-applied at start up.

Faster memory will get you bigger performance increases than faster GPU clock speeds. You will get better results at 2050MHz GPU and 9000MHz memory than you will at 2100MHz/8000MHz.

If you are running Afterburner, install the latest version. I would suggest that you go into the Afterburner program directory, then to the profiles directory, where you will see a file that matches the VID/PID number of your graphics card. Rename that file and try rebooting the PC. Your card should restart at stock settings. If it is still running at 7600, then maybe consider an RMA.
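To keep the numbers straight: the tools are all showing the same clock at different ratios. A small sketch of the arithmetic above:

```python
# The same GDDR5 clock as each tool reports it, per the ratios above:
# Afterburner shows the real frequency (effective / 2) and GPU-Z shows
# the command clock (effective / 4).
def as_reported(effective_mhz):
    return {"afterburner_mhz": effective_mhz / 2, "gpuz_mhz": effective_mhz / 4}

print(as_reported(8008))  # reference 1070 memory
print(as_reported(8108))  # Gaming Z factory OC
```

So a reading of 4004 in Afterburner and 2002 in GPU-Z are both the reference 8008MHz effective rate, not an underclock.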


----------



## $ilent

The memory is set to 0 in Afterburner, but when I run +150 it runs at like 8.4GHz? Does memory speed adjust on its own like the core clock?

For info, I didn't touch the memory speed to get the core clock of 2100MHz; it's overclocked to that, stable at stock voltage.


----------



## ucode

Memory can downclock to save power. BTW, you're probably running at 4200MHz, which is 4.2GHz / 8400 GT/s.


----------



## gtbtk

Quote:


> Originally Posted by *$ilent*
> 
> The memory is set to 0 on afterburner but when I run +150 it runs at like 8.4ghz? Does memory speed adjust on its own like core clock?
> 
> For info I didn't touch the memory speed to get core clock of 2100mhz, it's overclocked to that stable at stock voltage.


No, memory doesn't have a boost clock. You have GDDR5 memory, which is double data rate. Afterburner is showing the actual frequency, not the megatransfer rate. +150 on the slider is a 300MHz OC. The Gaming Z memory is clocked at 8100MHz, so that makes perfect sense.
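The slider arithmetic, as a sketch (GDDR5 moves two transfers per clock, so the effective rate changes by double the offset):

```python
# Afterburner's memory offset is in real-frequency MHz; GDDR5 transfers
# twice per clock, so the effective rate moves by double the slider value.
def effective_rate(stock_effective_mhz, slider_offset_mhz):
    return stock_effective_mhz + 2 * slider_offset_mhz

print(effective_rate(8100, 150))  # prints 8400, i.e. the ~8.4GHz observed
```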


----------



## $ilent

Quote:


> Originally Posted by *gtbtk*
> 
> No, memory doesn't have a boost clock. You have GDDR5 memory, that is double data rate. Afterburner is showing the actual frequency not the megatransfer rate. +150 on the slider is a 300mhz oc. The gaming Z memory is clocked at 8100mhz so that makes perfect sense


Thanks for your reply, but now the memory is set to +0 and it's showing 3855MHz? Does that not mean it's running at 7710MHz, i.e. lower than what it should be at?


----------



## gtbtk

Quote:


> Originally Posted by *$ilent*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> No, memory doesn't have a boost clock. You have GDDR5 memory, that is double data rate. Afterburner is showing the actual frequency not the megatransfer rate. +150 on the slider is a 300mhz oc. The gaming Z memory is clocked at 8100mhz so that makes perfect sense
> 
> 
> 
> Thanks for your reply, but now the memory is set to +0 and its showing 3855mhz? Does that not mean its running at 7710mhz? I.e lower than what it should be at?
Click to expand...

I have seen that after a driver crash, never from a clean reboot. Make sure you do not apply the GPU OC settings at start up.

The reference memory clock should show 4004, and a Gaming Z should show a higher amount that represents the factory overclock (4053, from memory).

A couple of things you could try:

1. In the Nvidia control panel, set the global setting to maximum performance mode and see what the card is reading. The available choices are optimal, adaptive and max performance modes. It could be downclocking because of a driver setting.

2. In the \program files (x86)\msi afterburner\profiles directory, rename the card settings file. The file uses the card's VID/PID values in the file name. Reboot and restart Afterburner; maybe the settings file is corrupt? The settings files are text files, and you can always cut and paste a complex curve across if you need to.


----------



## $ilent

Quote:


> Originally Posted by *gtbtk*
> 
> I have seen that after a driver crash, never from a clean reboot. Make sure you do not apply the gpu oc settings at start up.
> 
> Reference memory clock should be showing 4004 and a gaming Z should show a higher amount that represents the factory overclock (4053 from memory).
> 
> A couple of things you could try is
> 
> 1. In the Nvidia control panel set the global setting to maximum performance mode and see what the card is reading. the available choices are optimal, adaptive and max performance modes. It could be down clocking because of a driver setting
> 
> 2. in the \program files (x86)\msi afterburner\profiles directory, rename the card settings file. The file uses the cards ViD/PID values in the file name. reboot and restart afterburner. maybe the settings file is corrupt?? the settings files are text files and you can always cut and paste a complex curve across if you needed to.


When I leave the memory set to +0, in GPU-Z it briefly shoots up to 2025MHz, then drops to idle clocks. So I assume this is normal and that it's running at the required 8100MHz?

I noticed that when running Folding@Home the memory doesn't quite get up to full speed, but it's nearly there. I assume this is because the GPU is running at 90%, not 100%, when folding. Guess if I test in gaming and it goes to 2025MHz in GPU-Z then that means it's working fine?

Thanks


----------



## Sycksyde

I just took my Asus 1070 Dual back to the store and paid an extra $20 for an MSI Gaming X. The Dual's fans made an annoying buzzing noise over 40%, and it was hitting 75C with no OC..... horrible card, would not recommend.


----------



## 7oda

Power limit 50: stable performance in mining.
Safe in the future?


At power limit 100/80 the temperature was 68/69;
at power limit 50 the temperature was 57/58.

So is it safe for the GPU in the future or not?


----------



## blaze2210

Quote:


> Originally Posted by *7oda*
> 
> power limit 50 Stable performance In mining
> save in Future ?
> 
> 
> power limit 100/80 temperature was 68/69
> power limit 50 temperature was 57/58
> 
> so it was save for Gpu in Future or not ?


It doesn't look like you have any profiles saved yet. For that, you'll want to press the save icon (below the area that's showing the voltage and temperature readings), then click one of the numbers next to it. Personally, I always have 5 as the total default settings, then I use the other 4 for assorted other profiles.

If you're meaning "safe for the future", then the lower power limit might help, but only time will tell for sure.


----------



## gtbtk

Quote:


> Originally Posted by *$ilent*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I have seen that after a driver crash, never from a clean reboot. Make sure you do not apply the gpu oc settings at start up.
> 
> Reference memory clock should be showing 4004 and a gaming Z should show a higher amount that represents the factory overclock (4053 from memory).
> 
> A couple of things you could try is
> 
> 1. In the Nvidia control panel set the global setting to maximum performance mode and see what the card is reading. the available choices are optimal, adaptive and max performance modes. It could be down clocking because of a driver setting
> 
> 2. in the \program files (x86)\msi afterburner\profiles directory, rename the card settings file. The file uses the cards ViD/PID values in the file name. reboot and restart afterburner. maybe the settings file is corrupt?? the settings files are text files and you can always cut and paste a complex curve across if you needed to.
> 
> 
> 
> When I leave the memory set to +0, in GPU-Z it briefly shoots up to 2025MHz, then drops to idle clocks. So I assume this is normal and that it's running at the required 8100MHz?
> 
> I noticed that when running Folding@Home the memory doesn't quite get up to full speed, but it's nearly there. I assume this is because the GPU is running at 90%, not 100%, when folding. Guess if I test in gaming and it goes to 2025MHz in GPU-Z then that means it's working fine?
> 
> Thanks
Click to expand...

2025 sounds right for a Gaming Z; that is 8100MHz. If you are getting that under load, then you must have the setting at optimal or adaptive rather than high performance.

My rig is offline so I can't check to see what my card does. I have always run mine in high performance mode so the card would not downclock to the lowest P-state.

I have never done folding so I can't comment directly. I do know that if you run a GPGPU application like LuxMark, the GPU will not run in the highest P-state; it runs in the P-state one level down from the maximum. I assume that folding is probably doing the same. Nvidia Inspector will show you, and it can either allow you to overclock at a P-state level or let you add specific applications to higher performance profiles.


----------



## Blackfirehawk

I changed my cooler to an Accelero Twin Turbo II on my Gainward GTX 1070, and it improved temperatures a lot.

After 30 mins of Heaven, the card doesn't go over 56 degrees Celsius.

Boost clock is stable at [email protected]


----------



## $ilent

How do we change performance mice is it in nvidia control?


----------



## Mazudsg

Hello, just got a GTX 1070 AMP! Extreme Edition. Anything else I should know about the 1070 in general, or Nvidia cards (tips and tricks), that I missed?
Why this card? I had a reference RX 480, and here in Cyprus we get 40C in the summer with 32-33C inside the house. Since it's an island, power is quite expensive and I cannot afford to run AC, so after 10 months of using the RX 480 I was going crazy from the noise and the heat, and I decided to sell it for 250 euro (I know, right...) and bought this monster: 3 fans, quiet and COLD! hehe.

I haven't had Nvidia cards for a long time; I've been a console player for ages, but I decided to swap back to my first love (PC) since I can't be bothered paying a yearly tax to play online.
So if I did something wrong, please correct me or give me some advice.
In 2 days of having this card I managed to:

-Max OC of 2088MHz on the core / 9200MHz on the memory... I think I got around 20540 in Fire Strike (graphics score only).
-Undervolt it to 0.950v @ 2000MHz core / 9000MHz memory, and also 0.931v @ 1974MHz core / 8500MHz memory, if that's worth mentioning.
-Set a 65fps cap in NVCP with Fast Sync (my monitor is a 40" Sony at 60Hz). Playing mostly sim racing, running uncapped isn't worth the power draw since this card is quite power hungry, but locked at 65fps I get 60-80W (GPU only) at full HD resolution with every game bumped to max. (Yes, I like the overkill.)

Thanks for taking the time to read this post. I love the card, even if I paid around 370 euro. I don't know if I was lucky, but I like to think I was.
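Rough math on why the undervolt cuts the power draw so much: dynamic power scales roughly as P ∝ f·V², so the voltage drop matters far more than the small clock loss. A quick sketch (the ~1.05v baseline is an assumed stock voltage for comparison, not a figure from this post):

```python
# Dynamic power scales roughly as P ~ f * V^2. Comparing an assumed
# stock-ish point (~1.050 V at 2088 MHz) with the 0.950 V @ 2000 MHz
# undervolt described above:
def relative_power(f_mhz: float, v: float, f0_mhz: float, v0: float) -> float:
    """Power of operating point (f, v) relative to a baseline (f0, v0)."""
    return (f_mhz / f0_mhz) * (v / v0) ** 2

ratio = relative_power(2000, 0.950, 2088, 1.050)
print(f"undervolted power is ~{ratio:.0%} of baseline")  # roughly 78%
```

So a ~4% clock sacrifice buys roughly a 22% cut in dynamic power, which lines up with why capped, undervolted sim racing sits at 60-80W.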


----------



## blaze2210

Quote:


> Originally Posted by *$ilent*
> 
> How do we change performance mice is it in nvidia control?


Yep, under the "Manage 3D settings" heading. (you meant "mode" instead of mice there, right?)


----------



## TLCH723

I have a ZOTAC GeForce GTX 1070 Mini, ZT-P10700G-10M, 8GB GDDR5.

My set up (ideal):
Dell (Primary) - Displayport to DVI adapter using DVI cable to DVI input
ASUS (Secondary) - DVI to HDMI cable to a HDMI switch using HDMI cable to HDMI input
Oculus Rift - HDMI

I am having a problem: the ASUS, the secondary display, doesn't work if I use the HDMI switch. The card can detect the display but cannot display anything. If I connect directly, it displays fine. I tried other ports on the switch; it still doesn't work. Other computers (a NUC with mini-HDMI to HDMI) and devices (Chromecast, Roku, Nintendo Switch) work through the switch; it's just my computer with this GPU that doesn't.

Does anyone have a suggestion?


----------



## b0uncyfr0

Quote:


> Originally Posted by *gtbtk*
> 
> @b0uncyfr0
> Increase voltage slider to +100 and from your current overclock pull the 1.093v point on the curve up to 2100Mhz and test. you can try increasing that in steps of 12 mhz and see how high you can get it.


So with +150 on the core, I was averaging 2088 depending on the temp. With what you suggested, it's still only settling at 2100 with 1.093v after hitting 56C onwards. I don't get it - shouldn't it be going higher? Boost 3.0 is freaking hard to understand. Pushing the vcore didn't do much at all. I get to 2125 briefly, but as soon as I hit 56C+ it just drops.

Also, why does my card sometimes drop down to 1.050v? It refuses to stay at 1.063v all the time. I noticed this also drops the clock down severely. Another area I haven't tested properly is the memory OC - what is a good way to test it, and what are the signs that it's been pushed too far?


----------



## skupples

Quote:


> Originally Posted by *b0uncyfr0*
> 
> So with +150 on the core, i was averaging 2088 depending on the temp. With what you suggested - it's still only settling at 2100 with 1.093v after hitting 56+ onwards. I don't get it - shouldn't it be going higher. Boost 3.0 is freaking hard to understand. Pushing the vcore didn't do much at all. I get to 2125 briefly but as soon as i hit 56+ it just drops.
> 
> Also, why does my card drop down to 1.050v sometimes? It refuses to stay at 1.063v all the time. Noticed this also drops the clock down severely. Another section i haven't tested properly is memory OC - what is a good way to test and what are the signs that its been pushed too much?


you can pretty much +400 your memory & go from there, or call it a day.


----------



## syl1979

Quote:


> Originally Posted by *skupples*
> 
> you can pretty much +400 your memory & go from there, or call it a day.


Just for info, I have Samsung memory that gets artifacts over +450, but it doesn't crash even at +650 (though the display is more of a Christmas tree than anything else).


----------



## $ilent

Hows this looking?


----------



## gtbtk

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> @b0uncyfr0
> Increase voltage slider to +100 and from your current overclock pull the 1.093v point on the curve up to 2100Mhz and test. you can try increasing that in steps of 12 mhz and see how high you can get it.
> 
> 
> 
> So with +150 on the core, i was averaging 2088 depending on the temp. With what you suggested - it's still only settling at 2100 with 1.093v after hitting 56+ onwards. I don't get it - shouldn't it be going higher. Boost 3.0 is freaking hard to understand. Pushing the vcore didn't do much at all. I get to 2125 briefly but as soon as i hit 56+ it just drops.
> 
> Also, why does my card drop down to 1.050v sometimes? It refuses to stay at 1.063v all the time. Noticed this also drops the clock down severely. Another section i haven't tested properly is memory OC - what is a good way to test and what are the signs that its been pushed too much?
Click to expand...

2100 is boosted. The reference boost clock is about 1800mhz; anything above that is GPU Boost overclocking the GPU.

The way GPU Boost works is that the frequency will boost up to a given frequency depending on the voltage curve that has been set and the voltage you have allowed it to use with the voltage slider (1.063v @ 0 and 1.093v @ +100).

As temperature rises in 5 deg steps, the card will either keep the frequency where it is and increase the voltage to the next step up if there is any headroom available, or, if it is already at the limit the voltage slider allows, it will step the frequency down 12mhz.

If you want to keep the most stable frequency possible, always set your curve so that the initial boost goes to a voltage point at least one step below the maximum possible: 1.050v with the voltage slider at 0, or 1.081v with the slider at +100. That gives the card a voltage step of headroom. To do that, you want the curve to go flat from the voltage point you want the card to start at - 1.081v with the slider at +100. If you leave two voltage steps above the curve's high point, it will increase voltage twice before it reduces frequency.

If you start with the curve so the highest point is at max voltage (1.063v or 1.093v), then the only thing the card can do is downclock the GPU frequency, as the curve doesn't have any more voltage headroom to boost up into.
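The stepping behaviour described above can be sketched as a toy model. To be clear, this is only an illustration of the logic in this post (5C thermal steps, 12mhz bins, the 1.050/1.063/1.081/1.093v points), not Nvidia's actual algorithm:

```python
# Toy model of the GPU Boost 3.0 behaviour described above: every 5 C
# thermal step, the card either moves up one voltage step (if the
# slider still allows headroom) or sheds ~12 MHz off the core clock.
VOLTAGE_STEPS = [1.050, 1.063, 1.081, 1.093]  # volts, from the discussion

def boost_state(start_voltage: float, max_voltage: float,
                start_clock_mhz: int, temp_c: int) -> tuple:
    """Return (voltage, clock) after thermal stepping up to temp_c.

    Assumes stepping begins once the card passes 40 C, in 5 C increments.
    """
    voltage, clock = start_voltage, start_clock_mhz
    idx = VOLTAGE_STEPS.index(start_voltage)
    max_idx = VOLTAGE_STEPS.index(max_voltage)
    for _ in range(max(0, (temp_c - 40) // 5)):
        if idx < max_idx:          # headroom left: raise voltage, hold clock
            idx += 1
            voltage = VOLTAGE_STEPS[idx]
        else:                      # at the slider limit: drop 12 MHz instead
            clock -= 12
    return voltage, clock

print(boost_state(1.081, 1.093, 2100, 50))   # starts one step below the cap
print(boost_state(1.093, 1.093, 2100, 50))   # starts already at the cap
```

Starting the curve one voltage step below the cap buys one extra thermal step at full clock before the first 12mhz drop, which is exactly the headroom trick described above.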


----------



## gtbtk

Quote:


> Originally Posted by *syl1979*
> 
> Quote:
> 
> 
> 
> Originally Posted by *skupples*
> 
> you can pretty much +400 your memory & go from there, or call it a day.
> 
> 
> 
> Just for info I have Samsung memory that get artefacts over +450. But not crashing even at +650 (but the display is more a Christmas tree than anything else)
Click to expand...

You should update your rig info in your sig.

With my Sandy Bridge rig, I found that increasing VCCIO and CPU PLL slightly stabilized my VRAM OC at higher frequencies. AMD probably has similar settings that may be worth experimenting with.


----------



## gtbtk

Quote:


> Originally Posted by *TLCH723*
> 
> I have a ZOTAC GeForce GTX 1070 Mini, ZT-P10700G-10M, 8GB GDDR5.
> 
> My set up (ideal):
> Dell (Primary) - Displayport to DVI adapter using DVI cable to DVI input
> ASUS (Secondary) - DVI to HDMI cable to a HDMI switch using HDMI cable to HDMI input
> Orculus Rift - HDMI
> 
> I am having a problem. The ASUS, the secondary display, doesn't work if I use the HDMI switch. It can detect the display but cannot display anything. If I connect directly it will display. I tried another ports on the switch still doesn't work. Other computers (NUC with mini HDMI to HDMI) /devices (Chromecast, ROKU, Nintendo Switch) work, just my computer with this GPU.
> 
> Anyone has a suggestion?


DVI has max resolution of [email protected]

HDMI and DisplayPort support a much greater range of resolutions and refresh rates. HDMI switches tend to introduce some signal loss compared with a straight-through cable; combined with the conversion to DVI, that may put the signal outside the operating range of the Asus monitor. I would suggest avoiding DVI adapters/connections if at all possible.


----------



## TLCH723

Quote:


> Originally Posted by *gtbtk*
> 
> DVI has max resolution of [email protected]
> 
> HDMI and displayport support a much greater range of supported resolutions and frequencies. HDMI Switches tend to introduce some level of signal loss compared to a straight through cable, combined with the conversion to DVI it may put it outside the operating range of the Asus monitor. I would suggest trying to avoid DVI adapters/connections if at all possible.


It was working fine when I had the GTX 760. It was only when I upgraded to the 1070 that it stopped working.


----------



## shadowrain

Quote:


> Originally Posted by *TLCH723*
> 
> It was working fine when I have the GTX 760. Just when I upgrade to 1070 it didnt work.


Maybe because the DVI on the 760 was DVI-I (which also carries an analog signal), and AFAIK the DVI on the 1070 is pure digital only (DVI-D).


----------



## Dan-H

Quote:


> Originally Posted by *TLCH723*
> 
> I have a ZOTAC GeForce GTX 1070 Mini, ZT-P10700G-10M, 8GB GDDR5.
> 
> My set up (ideal):
> Dell (Primary) - Displayport to DVI adapter using DVI cable to DVI input
> ASUS (Secondary) - DVI to HDMI cable to a HDMI switch using HDMI cable to HDMI input
> Orculus Rift - HDMI
> 
> I am having a problem. The ASUS, the secondary display, doesn't work if I use the HDMI switch. It can detect the display but cannot display anything. If I connect directly it will display. I tried another ports on the switch still doesn't work. Other computers (NUC with mini HDMI to HDMI) /devices (Chromecast, ROKU, Nintendo Switch) work, just my computer with this GPU.
> 
> Anyone has a suggestion?


Can you get the switch to work if you use HDMI on the card as the only display?

What brand/model/exact part number is the switch?

Will your OR work with a DP to HDMI or DVI to HDMI adapter?

Quote:


> Originally Posted by *gtbtk*
> 
> DVI has max resolution of [email protected]


True, but TLCH723's 1070 has dual-link DVI, which supports up to 2560×1600: https://www.zotac.com/us/product/graphics_card/zotac-geforce%C2%AE-gtx-1070-mini-0#spec
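For reference, those limits come down to the TMDS pixel clock: single-link DVI caps at 165MHz and dual-link at 330MHz. A rough check (the ~9% blanking overhead here is an approximation of reduced-blanking timings, so treat the numbers as ballpark):

```python
# Rough pixel-clock check for DVI links. Single-link DVI caps at a
# 165 MHz TMDS pixel clock, dual-link at 330 MHz. The blanking overhead
# is approximate: reduced-blanking modes add roughly 9% on top of the
# active pixels, and exact CVT-RB timings differ a little per mode.
SINGLE_LINK_MHZ = 165.0
DUAL_LINK_MHZ = 330.0
BLANKING_OVERHEAD = 1.09

def required_pixel_clock_mhz(width: int, height: int, hz: int) -> float:
    return width * height * hz * BLANKING_OVERHEAD / 1e6

clk = required_pixel_clock_mhz(2560, 1600, 60)
print(f"2560x1600@60 needs ~{clk:.0f} MHz pixel clock")
print("fits single link:", clk <= SINGLE_LINK_MHZ)
print("fits dual link:", clk <= DUAL_LINK_MHZ)
```

Which is why 2560×1600 needs dual-link: it lands well above the single-link ceiling but comfortably inside the dual-link one.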


----------



## wefornes

Hello, does anyone know if, to OC an MSI GTX 1070 EK X without the vcore limit, I need to reflash my GPU with a custom BIOS? Thanks

Sent from my iPhone using Tapatalk


----------



## blaze2210

Quote:


> Originally Posted by *wefornes*
> 
> Hello, does anyone know if for oc an gtx 1070 msi ek x with out vcore limit do i need to re flash with a custom bios my gpu?? Thanks
> 
> Enviado desde mi iPhone utilizando Tapatalk


There's no option to flash a modified BIOS back onto the card, so you're stuck with what you can accomplish through regular overclocking means.


----------



## Falkentyne

Quote:


> Originally Posted by *gtbtk*
> 
> DVI has max resolution of [email protected]
> 
> HDMI and displayport support a much greater range of supported resolutions and frequencies. HDMI Switches tend to introduce some level of signal loss compared to a straight through cable, combined with the conversion to DVI it may put it outside the operating range of the Asus monitor. I would suggest trying to avoid DVI adapters/connections if at all possible.


DVI is capable of [email protected] It requires patching the driver pixel clock, however.


----------



## TLCH723

Quote:


> Originally Posted by *Dan-H*
> 
> Can you get the switch to work if you use HDMI on the card as the only display?
> 
> What brand/model/exact part of the switch?
> 
> Will your OR work with a DP to HDMI or DVI to HDMI adapter?


It works if I connect directly, so I think my cheap eBay switch is the problem.


----------



## gtbtk

Quote:


> Originally Posted by *TLCH723*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> DVI has max resolution of [email protected]
> 
> HDMI and displayport support a much greater range of supported resolutions and frequencies. HDMI Switches tend to introduce some level of signal loss compared to a straight through cable, combined with the conversion to DVI it may put it outside the operating range of the Asus monitor. I would suggest trying to avoid DVI adapters/connections if at all possible.
> 
> 
> 
> It was working fine when I have the GTX 760. Just when I upgrade to 1070 it didnt work.
Click to expand...

The GTX 760 uses earlier versions of HDMI and DisplayPort. It is possible that the signalling power being pushed up the cables is higher on the older card, and that the losses introduced by the switch are not enough to affect that card.


----------



## gtbtk

Quote:


> Originally Posted by *Falkentyne*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> DVI has max resolution of [email protected]
> 
> HDMI and displayport support a much greater range of supported resolutions and frequencies. HDMI Switches tend to introduce some level of signal loss compared to a straight through cable, combined with the conversion to DVI it may put it outside the operating range of the Asus monitor. I would suggest trying to avoid DVI adapters/connections if at all possible.
> 
> 
> 
> DVI is capable of [email protected] It requires patching the driver pixel clock, however.
Click to expand...

Yes, but that is out of spec and it does require a patch.


----------



## gtbtk

Quote:


> Originally Posted by *Dan-H*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TLCH723*
> 
> I have a ZOTAC GeForce GTX 1070 Mini, ZT-P10700G-10M, 8GB GDDR5.
> 
> My set up (ideal):
> Dell (Primary) - Displayport to DVI adapter using DVI cable to DVI input
> ASUS (Secondary) - DVI to HDMI cable to a HDMI switch using HDMI cable to HDMI input
> Orculus Rift - HDMI
> 
> I am having a problem. The ASUS, the secondary display, doesn't work if I use the HDMI switch. It can detect the display but cannot display anything. If I connect directly it will display. I tried another ports on the switch still doesn't work. Other computers (NUC with mini HDMI to HDMI) /devices (Chromecast, ROKU, Nintendo Switch) work, just my computer with this GPU.
> 
> Anyone has a suggestion?
> 
> 
> 
> Can you get the switch to work if you use HDMI on the card as the only display?
> 
> What brand/model/exact part of the switch?
> 
> Will your OR work with a DP to HDMI or DVI to HDMI adapter?
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> DVI has max resolution of [email protected]
> 
> Click to expand...
> 
> True, but TLCH723 's 1070 has Dual Link DVI which supports up to 2560×1600 https://www.zotac.com/us/product/graphics_card/zotac-geforce%C2%AE-gtx-1070-mini-0#spec
Click to expand...

He is adapting it to HDMI. There's no guarantee the dual-link capability is being passed through, and adapters by their very nature introduce signal losses.


----------



## gtbtk

Quote:


> Originally Posted by *blaze2210*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wefornes*
> 
> Hello, does anyone know if for oc an gtx 1070 msi ek x with out vcore limit do i need to re flash with a custom bios my gpu?? Thanks
> 
> Enviado desde mi iPhone utilizando Tapatalk
> 
> 
> 
> No option to flash a modified BIOS back to the card, so you're stuck with what you can accomplish through regular overclocking means.
Click to expand...

You can flash the Gaming Z bios to the card and get a default boost clock of 1633Mhz and a default 100mhz extra on the memory clock.


----------



## Paztak

So, I just bought a new 1440p IPS monitor and I need to upgrade my old buddy, an Asus Strix GTX 970, to a GTX 1070. The GTX 970 handles 1440p just fine in some games, but I want to ensure smooth 60fps gameplay in all games. If I understood correctly, every GTX 1070 model (at least the non-reference cards) should hit at least 1.2GHz core clock without any problems, so there really isn't a need to "find the best" model for overclocking to hit a "sweet spot" like 1500MHz was for the GTX 970 and 1300MHz was for the GTX 670. Am I right? Are there any models that should be avoided?


----------



## nuno_p

@Paztak i have a 1070 strix oc and the max stable oc i can get stable is 2088Mhz


----------



## zipper17

Quote:


> Originally Posted by *Paztak*
> 
> So, I just bought new 1440p IPS monitor and I need to upgrade my old buddy Asus Srtix GTX 970 to GTX 1070. Even though GTX 970 handles 1440p in some games just fine *but to ensure smooth 60fps gameplay at all games.* If I understood correctly, every GTX 1070 model (at least non-reference cards) should hit at least 1.2GHz core clock without any problems, so there really isn't need to "find the best" model for overclocking to achieve such a "sweet spot" like 1500MHz was for GTX 970 and 1300MHz was for GTX 670. Am I Right? Is there any models what should be avoided?


The 1070 is still a great price/performance card for 1440p, but for a perfect 60fps at 1440p all of the time, you just need more than a 1070.

I play Witcher 3 at 1440p, max settings + HairWorks on, and it still averages below 60fps (around 55fps), not a perfect 60. My 1070 is overclocked to 2076mhz+ core @ 4600mhz (effectively 9200mhz) memory. Many other games, I believe, are still not a perfect 60fps at 1440p either, though some wouldn't be a problem.


----------



## gtbtk

Quote:


> Originally Posted by *Paztak*
> 
> So, I just bought new 1440p IPS monitor and I need to upgrade my old buddy Asus Srtix GTX 970 to GTX 1070. Even though GTX 970 handles 1440p in some games just fine but to ensure smooth 60fps gameplay at all games. If I understood correctly, every GTX 1070 model (at least non-reference cards) should hit at least 1.2GHz core clock without any problems, so there really isn't need to "find the best" model for overclocking to achieve such a "sweet spot" like 1500MHz was for GTX 970 and 1300MHz was for GTX 670. Am I Right? Is there any models what should be avoided?


They can all pretty much do 2000mhz; 2100 is usually achievable with curve tweaking, but it will not always give you better performance. In most situations, once you get to the mid-2000s, you will get more from overclocking the memory. Huge VRMs are not necessary with Pascal like they were on earlier generations.

If you like playing with cross-flashing BIOSes, avoid the Galax/KFA2 cards, as a number of them use a different voltage controller to the majority of the other cards, and that won't let you experiment.

If you want to do open-loop watercooling at some stage, choose a card that uses a reference PCB, such as the EVGA SC or Asus Turbo; it is easier to find full-cover waterblocks for those.

You can cross-flash the non-OC version of a card with the OC BIOS and effectively get a free upgrade; they are the same card with different BIOSes. The Strix is a great card. I have been pleased with my MSI Gaming X and have been running it as a Gaming Z.

The cheapest 1070 versions generally sacrifice quality in the cooler/fans and tend to run a little hotter and louder. That is a non-issue if you want to watercool. PCB quality stays good across the entire range.

The Zotac AMP Extreme is enormous, as are the top-end Palit/Gainward cards: expensive, and they generally don't get any better performance than more mainstream cards from Asus, MSI or Gigabyte.


----------



## Aussiejuggalo

How's the Zotac AMP! Edition in terms of temps / noise?


----------



## khanmein

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> How's the Zotac AMP! Edition in terms of temps / noise?


Fan issues! I suggest trying another brand. If the price is too good to turn down, then just grab the Zotac.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *khanmein*
> 
> Fan issue! I suggest trying another brand. If the price is too good to be turned down, then just grabbing Zotac.


Ended up ordering an MSI Armour OC from Newegg just before, was cheaper than the Zotac AMP!.


----------



## khanmein

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Ended up ordering an MSI Armour OC from Newegg just before, was cheaper than the Zotac AMP!.


Enjoy your gaming. Welcome to the club & cheers.


----------



## khanmein

NVIDIA Readying a GeForce GTX 1070 Refresh


----------



## blaze2210

Quote:


> Originally Posted by *khanmein*
> 
> NVIDIA Readying a GeForce GTX 1070 Refresh


Allegedly....


----------



## khanmein

Quote:


> Originally Posted by *blaze2210*
> 
> Allegedly....
> 
> https://www.overclock3d.net/news/gpu_displays/nvidia_is_rumoured_to_be_releasing_a_gtx_1070_ti/1


What's your point? OC3D.net grabbed it from others. Please kindly check the article's release date.


----------



## blaze2210

Quote:


> Originally Posted by *khanmein*
> 
> What's your point? OC3D.net grabbed from others. Please kindly check the article released date.


Mainly I posted that link for the screenshot in the article - I was building off of what you posted....I'll modify my post....


----------



## comanzo

Hey guys. I wanted to know whether I should upgrade my 1070 to a 1080ti to play at 1440p 144hz. I am fine not playing games at ultra detail when targeting 144fps, but I can't play at medium. In other words, the lowest I will go is high, as medium settings compromises too much on graphics. I also am ok overclocking both the 1070 and 1080ti. My 1070 achieves 2075mhz at the moment

One final thing to note: 144fps isn't needed in games such as gta V, the witcher 3, the division, etc. It's more useful in games such as overwatch, call of duty, csgo, etc. In the games like gta V, the witcher, I prefer 60fps with higher details as that's more important than 144fps.


----------



## khanmein

Quote:


> Originally Posted by *comanzo*
> 
> Hey guys. I wanted to know whether I should upgrade my 1070 to a 1080ti to play at 1440p 144hz. I am fine not playing games at ultra detail when targeting 144fps, but I can't play at medium. In other words, the lowest I will go is high, as medium settings compromises too much on graphics. I also am ok overclocking both the 1070 and 1080ti. My 1070 achieves 2075mhz at the moment
> 
> One final thing to note: 144fps isn't needed in games such as gta V, the witcher 3, the division, etc. It's more useful in games such as overwatch, call of duty, csgo, etc. In the games like gta V, the witcher, I prefer 60fps with higher details as that's more important than 144fps.


Yeah, I suggest you should upgrade to 1080Ti & don't forget your processor too. Read this article.


----------



## comanzo

Ok. Yea my processor is an i7 4790s. I plan to upgrade it to ice lake/ryzen 2. What do you think? Do you think the processor needs an upgrade as well? When ice lake comes out, I plan to get the i7 of course, or the r5 for zen 2 as I only need 6 cores for gaming. Thanks for sending the article, I will take a look at it later on when I get the chance.


----------



## khanmein

Quote:


> Originally Posted by *comanzo*
> 
> Ok. Yea my processor is an i7 4790s. I plan to upgrade it to ice lake/ryzen 2. What do you think? Do you think the processor needs an upgrade as well? When ice lake comes out, I plan to get the i7 of course, or the r5 for zen 2 as I only need 6 cores for gaming. Thanks for sending the article, I will take a look at it later on when I get the chance.


I don't think you need to upgrade your processor, so just go for GTX 1080Ti!


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> NVIDIA Readying a GeForce GTX 1070 Refresh


What's the bet that a 1070 Ti will be a 1070 with 9Gbps memory, just like the 9Gbps 1060 cards?

Might be fun to cross-flash the new card's BIOS onto the original 1070 though. The memory manufacturers do not make GDDR5 SKUs rated at 9Gbps as far as I am aware, so it is just a factory OC, the same as what we are doing in Afterburner.


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Ok. Yea my processor is an i7 4790s. I plan to upgrade it to ice lake/ryzen 2. What do you think? Do you think the processor needs an upgrade as well? When ice lake comes out, I plan to get the i7 of course, or the r5 for zen 2 as I only need 6 cores for gaming. Thanks for sending the article, I will take a look at it later on when I get the chance.


Your CPU should still have a few years left in it yet. While not at the same level as an i7-7700K, an i7-2600 still games reasonably well today, though Coffee Lake seems to be about the right time for a Sandy Bridge upgrade.


----------



## khanmein

Quote:


> Originally Posted by *gtbtk*
> 
> What is the bet that a 1070TI will be a 1070 with 9GB/s memory, just like the 1060 9GBps cards.
> 
> Might be fun to cross flash the new cards bios onto the original 1070 though. The memory manufacturers do not make GDDR5 skus rated at 9GB/s as far as I am aware so it is just a factory OC, the same as what we are doing in Afterburner


Yeah, maybe, but I hope they take back all our GTX 1070s & give us brand new GTX 1070 Tis.


----------



## comanzo

Quote:


> Originally Posted by *gtbtk*
> 
> Your CPU should still have a few years left in it yet. While not at the same level as an i7-7700K, An i7-2600 still games reasonably well today though I coffee lake seems to be about the right time for a sandy bridge upgrade.


Oh ok. I thought that since higher frame rates stress the CPU more, and I game at 144hz (meaning I need the higher frame rates), I was due for an upgrade. I realize Intel hasn't made great strides in perf. each generation, but my i7 4790S is 5% behind the i7 4790, and if I jumped to a 4790K and overclocked, that's another 10-15% perf. boost. From the 4790K to the 9700K (Ice Lake)/Zen 2, that's another 35% give or take. That's a solid 50-55% grand total increase in performance, approximately. This calculation doesn't include the increase from 4 cores to 6 cores, just IPC/clockspeed gains. However, if you guys say I don't need an upgrade, I will follow your advice. I just wanted to share my thought process on why I thought I needed an upgrade.
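Strictly speaking, chained speedups multiply rather than add, so those estimates compound to slightly more than the straight sum. A quick sanity check using the same numbers:

```python
# Successive speedups multiply rather than add. Chaining the estimates
# above: +5% (4790S -> 4790), +15% (overclocked 4790K), +35% (next gen):
def compound(*gains_pct: float) -> float:
    """Total percentage gain from chaining individual percentage gains."""
    total = 1.0
    for g in gains_pct:
        total *= 1.0 + g / 100.0
    return (total - 1.0) * 100.0

print(f"{compound(5, 15, 35):.0f}% total")  # ~63%, vs ~55% from adding
```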


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Your CPU should still have a few years left in it yet. While not at the same level as an i7-7700K, An i7-2600 still games reasonably well today though I coffee lake seems to be about the right time for a sandy bridge upgrade.
> 
> 
> 
> oh ok. I thought that since higher frames stress the cpu more, and I game at 144hz meaning I need the higher number of frames, I was due for an upgrade. I realize intel hasn't made great strides in perf. each generation, but my i7 4790s is 5% behind i7 4790 and if I jumped to 4790k and overclocked, that's another 10-15% perf. boost. From the 4790k to the 9700k(ice lake), that's another 35% give or take. That's a solid 50-55% grand total increase in performance approximately. However, if you guys say I don't need an upgrade, I will listen to you guys since you guys have more experience than me. I just wanted to share my thought process on why I thought I needed an upgrade.
Click to expand...

a 9700K is at least a year away. It may well be worth the upgrade when that CPU arrives on the market. 8700K is the chip that has brought enough improvement for me to consider an upgrade from my sandy bridge

a 7700K will generate more frames in most games but it is not by a huge amount.

You can overclock the 4790S chip a bit with a BCLK overclock. A 103.5 BCLK will take the chip to a tad over 4130MHz. I was running an i7-2600 at 4.44GHz with a 105.8MHz BCLK OC, and I was doing 21500 graphics and 10500 physics scores in Firestrike. I never had any issues with ROTR, Watch Dogs or GTA V, but I am also not running at 144Hz.

4GHz 4790S chips are doing about 10% better than mine on physics, and the graphics scores are about the same. You are running at about the same level as a good i5-7600K, which has been considered the standard. Even a 7700K doesn't get the graphics score much higher than the mid-21000s; the highly overclocked CPUs will do about 30-40% more on physics.

I don't want to tell you not to upgrade; you will get more performance, but the increase is not that large yet, and the cost of a new CPU, RAM, motherboard and possibly a better cooler is going to be expensive for only a modest gain. A used 4790K is still a beast. Maybe that is the best thing to consider if you really want an upgrade now.

Gaming at 1080p 144Hz with a 1070 can be a tough ask if you are only prepared to use ultra settings, even on a 7700K.


----------



## comanzo

Oh ok. Well, thanks for letting me know this. Just a couple of things I would like to point out in case they would change your advice. First, I game at 1440p 144hz with the 1070, not 1080p 144hz. Second, the 4GHz only happens on turbo; in other words, it only goes to 4GHz on a single core. For all 4 cores, I could probably go from 3.2 to 3.3GHz or so using BCLK, since 3.2 is the base. Just wanted to point those two things out in case they make any difference to the advice given. Otherwise, I will be on the lookout for the 9700K and Zen 2 and see if it's worth it then. I won't upgrade to Coffee Lake, as you advised.


----------



## khanmein

@comanzo In a nutshell, just stick with your i7-4790S. Your processor is way better than my 4C/4T. You can upgrade to any processor you desire if money is not an issue. Cheers.


----------



## comanzo

Cheers mate.
Quote:


> Originally Posted by *khanmein*
> 
> @comanzo In the nutshell, just stick with your i7-4790s. Your processor is way better than mine 4C/4T. You can upgrade any processor you desire if the money is not an issue. Cheers.


Thanks for helping me out with the advice, along with gtbtk. Mind if I ask another question that's unrelated to GPUs, or am I better off making a thread for it? Don't worry, it's not as complex a question this time. Just want to make sure it's all good asking it here when it's unrelated to GPUs in a GPU owners' club forum.


----------



## zipper17

Quote:


> Originally Posted by *comanzo*
> 
> Hey guys. I wanted to know whether I should upgrade my 1070 to a 1080ti to play at 1440p 144hz. I am fine not playing games at ultra detail when targeting 144fps, but I can't play at medium. In other words, the lowest I will go is high, as medium settings compromises too much on graphics. I also am ok overclocking both the 1070 and 1080ti. My 1070 achieves 2075mhz at the moment
> 
> One final thing to note: 144fps isn't needed in games such as gta V, the witcher 3, the division, etc. It's more useful in games such as overwatch, call of duty, csgo, etc. In the games like gta V, the witcher, I prefer 60fps with higher details as that's more important than 144fps.


How far do you really feel satisfied with your own PC currently? What goals do you really want to achieve in games? looking for Highest framerates as high as possible or @number framerates perfection at all games? max settings as possible? Every people is different how they using their PC or what kind of games they're playing. The answer is really depends on everyone how far they feel satisfied with their own pc. Every kind of PC should be upgraded at certain time because technology will never stop advanced.

Also, a RAM speed upgrade is important imho; even though it's considered a tiny gain, it really improves the minimum frame rates in games. If you're the framerate-perfection kind of guy, literally everything matters.

http://www.overclock.net/t/1487162/an-independent-study-does-the-speed-of-ram-directly-affect-fps-during-high-cpu-overhead-scenarios


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Oh ok. Well thanks for letting me know this. Just a couple of things I would like to point out in case it would make any changes to your advice. First is that I game at 1440p 144hz with a 1070 not 1080p 144hz. Secondly, the 4ghz only happens on turbo, in other words, it only goes to 4 ghz on a single core. For all 4 cores, I could probably go from 3.2 to 3.3 or so since 3.2 is the base using bclk. Just wanted to point those two things out in case it makes any sort of difference to the advice given. Otherwise, I will be on the lookout for the 9700k and zen 2 and see if it's worth it then. I won't upgrade to coffee lake as you advised me not to do.


For your CPU, open your BIOS.

I'm assuming a Z97 MB.

Go to the tweaking section and set it to manual.

Initially, leave the BCLK at 100.

Sync all cores and set them each to a multiplier of 40, not auto.

Save your settings and the CPU should be running at 4GHz all the time.

You can then experiment with BCLK to get some more performance; you may need to tweak voltages a little bit.


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Cheers mate.
> Quote:
> 
> 
> 
> Originally Posted by *khanmein*
> 
> @comanzo In a nutshell, just stick with your i7-4790s. Your processor is way better than my 4C/4T. You can upgrade to any processor you desire if money is not an issue. Cheers.
> 
> 
> 
> Thanks for the advice, and to gtbtk as well. Mind if I ask another question that's unrelated to GPUs, or am I better off making a thread for it? Don't worry, it's not as complex a question this time. Just want to make sure it's OK to ask here, since it's unrelated to GPUs on a GPU owners' club forum.
Click to expand...

We tend to be a pretty helpful bunch here, and no-one is so anal retentive that we have forum police. Give it a try. If it is beyond our knowledge, we can refer you in the right direction.


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Oh ok. Well thanks for letting me know this. Just a couple of things I would like to point out in case it would make any changes to your advice. First is that I game at 1440p 144hz with a 1070 not 1080p 144hz. Secondly, the 4ghz only happens on turbo, in other words, it only goes to 4 ghz on a single core. For all 4 cores, I could probably go from 3.2 to 3.3 or so since 3.2 is the base using bclk. Just wanted to point those two things out in case it makes any sort of difference to the advice given. Otherwise, I will be on the lookout for the 9700k and zen 2 and see if it's worth it then. I won't upgrade to coffee lake as you advised me not to do.


144Hz at 1440p may be a bit beyond the 1070 at ultra settings, regardless of the CPU.

No one is saying don't upgrade. New toys are always nice to have. If getting a new PC makes you happy, go for it. If you are taking a more analytical approach, then any money you spend on a new CPU/motherboard/RAM will not likely get you as much of an improvement as, say, keeping the CPU, selling the 1070 and buying a 1080 Ti, which absolutely performs well at 144Hz 1440p.


----------



## khanmein

Quote:


> Originally Posted by *comanzo*
> 
> Cheers mate.
> Thanks for the advice, and to gtbtk as well. Mind if I ask another question that's unrelated to GPUs, or am I better off making a thread for it? Don't worry, it's not as complex a question this time. Just want to make sure it's OK to ask here, since it's unrelated to GPUs on a GPU owners' club forum.


Yeah, should be no problem. Have you purchased your GTX 1080Ti?


----------



## DeathAngel74

Quote:


> Originally Posted by *gtbtk*
> 
> 144Hz at 1440p may be a bit beyond the 1070 at ultra settings, regardless of the CPU.
> 
> No one is saying don't upgrade. New toys are always nice to have. If getting a new PC makes you happy, go for it. If you are taking a more analytical approach, then any money you spend on a new CPU/motherboard/RAM will not likely get you as much of an improvement as, say, keeping the CPU, selling the 1070 and buying a 1080 Ti, which absolutely performs well at 144Hz 1440p.


I just jumped ship with 7820x and 1080 ti ftw3 hybrid. 1440p was only a dream until now.


----------



## comanzo

Quote:


> Originally Posted by *zipper17*
> 
> How satisfied do you really feel with your PC currently? What goals do you want to achieve in games? Are you looking for the highest framerates possible, a specific framerate target in every game, or max settings? Everyone is different in how they use their PC and what kind of games they play, so the answer really depends on how satisfied you are with your own setup. Every PC should be upgraded at some point, because technology never stops advancing.
> 
> Also, a RAM speed upgrade is important imho; even though it's considered a tiny gain, it really improves the minimum frame rates in games. If you're the framerate-perfection kind of guy, literally everything matters.
> 
> http://www.overclock.net/t/1487162/an-independent-study-does-the-speed-of-ram-directly-affect-fps-during-high-cpu-overhead-scenarios


My goal is to use my 144Hz 1440p monitor to its full extent. I want 144fps where it counts. Some games are fine at 60fps and don't need 144fps, as there's no perceptible difference; in other games, the difference is massive. I want settings on high at a bare minimum; medium is too much of a compromise on graphics. Ultra is a bit overrated imo, and I can live without it if it keeps me from my 144fps goal. Currently, the only game where I see somewhat of a CPU bottleneck with my 1070 is Black Ops 3, which sometimes switches between being GPU-limited and CPU-limited. I'm also CPU bound in older titles that don't use many threads: since my processor is locked, its single-core performance is weak, even though it's strong in multi-thread as an i7. Overall though, besides the older games and Black Ops 3, I am GPU-limited in almost every other game I play. Currently my RAM is DDR3 @ 1600MHz.


----------



## comanzo

Quote:


> Originally Posted by *gtbtk*
> 
> For your CPU open your bios.
> 
> Im assuming z97 MB.
> 
> Go to the tweaking section and set it to manual
> 
> initially leave the bclk at 100
> 
> sync all cores and set them each to a multiplier of 40 and not auto
> 
> save your settings and the CPU should be running at 4ghz all the time.
> 
> You can then experiment with bclk to get some more performance. you may need to tweak voltages a little bit


Can this be done on an H81M-E mobo? I don't have a Z board since my processor is locked anyway. Btw, I bought this PC as a pre-built back when I knew absolutely nothing about hardware. The motherboard isn't the OEM mobo, because I switched it out for an H81 when the OEM board was acting weird and dying.


----------



## comanzo

For anyone asking whether I've upgraded to a 1080 Ti: not yet. I'm going to try to get it around Christmas time. On to my question, which involves case selection. My current case is from a pre-built and won't be ideal when it does come time to upgrade my CPU (which isn't any time soon, as discussed). When that time comes, I want to overclock to the max and be limited by voltage, not temps. Thus, I need watercooling, and my case isn't big enough for that. Before we get into which case to buy, I need to ask this question.

Which is better value, i.e. cooling performance per dollar: a custom loop, or an AIO such as the Corsair H60? I know a custom loop provides better cooling due to better-quality pumps, radiators, etc., while AIOs typically cheap out on quality for a lower price. But which one provides better value? Once I know the answer, I'll go further into the conditions for buying the case.

For anyone wondering what case I have, it's an Asus M51AD pre-built, and it's micro-ATX sized.


----------



## blaze2210

Quote:


> Originally Posted by *comanzo*
> 
> For anyone asking about whether I upgraded to a 1080ti, not yet. I am gonna try to get it around christmas time. Onto my question which involves case selection. You see, my case is a pre-built and wouldn't be best for when it does come time to upgrade my cpu(which isn't any time soon as discussed). When the time does come to upgrade my cpu, I want to overclock it to the max and be limited by voltages, not temps. Thus, I need watercooling. My case isn't big enough for that. Before we go into which case to get, I need to ask this question.
> 
> Which is better for value, in other words, performance for the dollar? When I mean performance, I mean cooling performance. Is it custom loop cooling, or in aio cooling such as corsair H60 for example. I know custom loop provides better cooling due to better quality pumps, radiators, etc. while aio's typically tend to cheap out on quality for a lower price. But my question is, which one provides better value? After knowing the answer to this question, I will go further into some conditions for buying the case.


This is the point at which your question would essentially be hijacking the thread, as your inquiry _will_ start a (potentially lengthy) debate. Creating a new thread in an appropriate section would be your best course of action; then you can always link people to it....


----------



## comanzo

Quote:


> Originally Posted by *blaze2210*
> 
> This is the point at which your question would essentially be hijacking the thread, as your inquiry _will_ start a debate.


Fair enough. I was definitely thinking that would happen; that's why I asked earlier. I will start a thread on it then. The case already has a thread, it's only the AIO vs. custom loop question that doesn't have a thread of its own.


----------



## blaze2210

Quote:


> Originally Posted by *comanzo*
> 
> Fair enough. I was definitely thinking that was going to happen. That's why I asked that earlier. I will start a thread on it then.


Quick questions are one thing, questions that tend to start up huge debates are another. "Which cooling route to take" _almost always_ starts up a debate on OCN, since there are the hardcore air cooling people, the hardcore custom loop people, the people who view CLCs as acceptable, and those who don't really care as long as their parts stay cool. Those groups all tend to clash on here, just saying....


----------



## rfarmer

I agree with blaze2210 on this one. One thing about OCN is there are plenty of experienced watercoolers. Post this question in the watercooling section and you will get lots of advice.

There are also a lot of guys on this forum that despise AIO coolers, so be prepared for that too.


----------



## comanzo

Quote:


> Originally Posted by *rfarmer*
> 
> I agree with blaze2210 on this one. One thing about OCN is there are plenty of experienced watercoolers. Post this question in the watercooling section and you will get lots of advice.
> 
> There are also a lot of guys on this forum that despise AIO coolers, so be prepared for that too.


Ok. Hopefully I have better luck with responses this time; I didn't get anything from the case thread. Will definitely make a thread on it.


----------



## blaze2210

Quote:


> Originally Posted by *comanzo*
> 
> Ok. Hopefully I get more lucky with responses. Didn't get anything from making the case thread. Will definitely make a thread on it.


Like rfarmer said, post in the watercooling section. Since your question is mainly about watercooling, that's an appropriate heading, and you'll be able to get case recommendations there as well.


----------



## Madmaxneo

Quote:


> Originally Posted by *comanzo*
> 
> For anyone asking about whether I upgraded to a 1080ti, not yet. I am gonna try to get it around christmas time. Onto my question which involves case selection. You see, my case is a pre-built and wouldn't be best for when it does come time to upgrade my cpu(which isn't any time soon as discussed). When the time does come to upgrade my cpu, I want to overclock it to the max and be limited by voltages, not temps. Thus, I need watercooling. My case isn't big enough for that. Before we go into which case to get, I need to ask this question.
> 
> Which is better for value, in other words, performance for the dollar? When I mean performance, I mean cooling performance. Is it custom loop cooling, or in aio cooling such as corsair H60 for example. I know custom loop provides better cooling due to better quality pumps, radiators, etc. while aio's typically tend to cheap out on quality for a lower price. But my question is, which one provides better value? After knowing the answer to this question, I will go further into some conditions for buying the case.
> 
> For anyone who is wondering what case I have, it's an asus M51AD pre-built. That is the case I have and it's size is micro-atx.


I know others have suggested posting this question in another thread, but I'll post a brief comment here and then bring it back to GPU territory.
First things first: I am a watercooler, and I watercool with AIOs. But if you want the best value for your money, you'd be better off staying with air cooling, unless you want to do some hefty OCing on your CPU, in which case you'd get somewhat better performance from watercooling. As I mentioned, I use AIOs, in particular the Swiftech series. They keep my system nice and cool. I use an H240-X on my CPU, a 4930K OC'd to 4.4GHz (+1GHz). My peak temps during benchmarks or stress testing are in the low to mid 60s. On air cooling I probably couldn't maintain such a high OC with decent temps. My idle temps are in the mid 30s, mainly because the ambient temps in this room aren't great, except in winter, when my CPU idle temps are normally in the high 20s. If I had the Swiftech H320, my temps would presumably be even better.
But I also watercool my GTX 1070 with an H140-X, and that is where watercooling wins hands down against any kind of air cooling. Since I started watercooling my GPU, its temps have never gone above the mid 40s no matter what I throw at it; in fact it usually stays below 42 degrees even running a DirectX 12 game at high settings.


----------



## khanmein

Quote:


> Originally Posted by *comanzo*
> 
> For anyone asking about whether I upgraded to a 1080ti, not yet. I am gonna try to get it around christmas time. Onto my question which involves case selection. You see, my case is a pre-built and wouldn't be best for when it does come time to upgrade my cpu(which isn't any time soon as discussed). When the time does come to upgrade my cpu, I want to overclock it to the max and be limited by voltages, not temps. Thus, I need watercooling. My case isn't big enough for that. Before we go into which case to get, I need to ask this question.
> 
> Which is better for value, in other words, performance for the dollar? When I mean performance, I mean cooling performance. Is it custom loop cooling, or in aio cooling such as corsair H60 for example. I know custom loop provides better cooling due to better quality pumps, radiators, etc. while aio's typically tend to cheap out on quality for a lower price. But my question is, which one provides better value? After knowing the answer to this question, I will go further into some conditions for buying the case.
> 
> For anyone who is wondering what case I have, it's an asus M51AD pre-built. That is the case I have and it's size is micro-atx.


Now you should concentrate on pulling the trigger on a GTX 1080Ti, & if you are going to wait until X'mas then you'd be better off buying a whole new rig.

You seem to have a limited budget, & if you want better value, I suggest selling off your rig first and then planning your next move.

If money is not an issue for you, I wonder why you haven't purchased a GTX 1080Ti already.


----------



## comanzo

Quote:


> Originally Posted by *khanmein*
> 
> Now you should concentrate on pulling the trigger on a GTX 1080Ti, & if you are going to wait until X'mas then you'd be better off buying a whole new rig.
> 
> You seem to have a limited budget, & if you want better value, I suggest selling off your rig first and then planning your next move.
> 
> If money is not an issue for you, I wonder why you haven't purchased a GTX 1080Ti already.


I haven't purchased it yet because I am hearing lots of rumors about Volta possibly releasing soon. I would be very upset if Volta came out and the 1170 had the same performance as the 1080 Ti. Some rumors put Volta in 2017, and others say it will release in 2018.


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> For your CPU open your bios.
> 
> Im assuming z97 MB.
> 
> Go to the tweaking section and set it to manual
> 
> initially leave the bclk at 100
> 
> sync all cores and set them each to a multiplier of 40 and not auto
> 
> save your settings and the CPU should be running at 4ghz all the time.
> 
> You can then experiment with bclk to get some more performance. you may need to tweak voltages a little bit
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can this be done on a H81M-E mobo? I don't have a Z board since my processor is locked anyways. Btw, I bought this pc as a pre-built back when I knew absolutely nothing about hardware. The motherbaord isn't an oem mobo because I switched it out for a H81 since the oem was acting weird and was dying.
Click to expand...

I believe the H81 will let you increase the multiplier. The CPU is locked, but it will still allow you to raise the multiplier by up to 4 bins; K CPUs can be raised further without that built-in restriction.

Go to the AI Tweaker section in the UEFI.

Here is the MB manual:

http://dlcdnet.asus.com/pub/ASUS/mb/LGA1150/H81M-A/E8599_H81M-Series.pdf

Take a look at the Asus Multicore Enhancement setting and try it on manual. You should then be able to set the core ratio to 40 for all cores. Reboot and the CPU should be running at 4.0GHz.

Try leaving the vcore voltage at stock. If you get a BSOD, increase the vcore slightly. If it is stable, you can then try increasing the BCLK frequency 1% at a time, which will raise the OC further. I don't know how well Haswell copes with BCLK; as I said earlier, I have been running Sandy Bridge at 105.8.
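If it helps to preview those steps, the arithmetic behind a multiplier/BCLK overclock is simply core clock = BCLK x multiplier. A quick sketch (the numbers are illustrative, not measured from any particular board):

```python
# Core clock is just BCLK x multiplier, so you can preview what each
# 1% BCLK step would give you before trying it in the UEFI.
def core_clock_mhz(bclk_mhz: float, multiplier: int) -> float:
    return bclk_mhz * multiplier

print(core_clock_mhz(100.0, 40))  # baseline: 4000.0 MHz (4.0 GHz)

# Stepping BCLK up 1 MHz (1%) at a time, multiplier fixed at 40:
for pct in range(1, 6):
    bclk = 100.0 + pct
    print(f"BCLK {bclk:.1f} MHz -> {core_clock_mhz(bclk, 40):.0f} MHz core")
```

So a 5% BCLK bump at x40 would land around 4.2GHz, which is why small steps plus stability testing are the way to go.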


----------



## comanzo

Quote:


> Originally Posted by *gtbtk*
> 
> I believe that H81 will allow you to increase the multiplier. The CPU is locked but it will allow you to increase the multiplier by 4. The K CPUs can be increased more than that without a built in restriction.
> 
> You go to the AI tweaker section in the uefi.
> 
> There is the MB manual
> 
> http://dlcdnet.asus.com/pub/ASUS/mb/LGA1150/H81M-A/E8599_H81M-Series.pdf
> 
> Take a look at the Asus multicore enhancement setting and try it on manual. you should then be able to set the core ratio to 40 for all cores. Reboot and the CPU should be running at 4.0Ghz.
> 
> Try leaving the vcore voltage at stock. If you get a BSOD, increase the Vcore voltage slightly. If it is stable, you can then try increasing the BCLK frequency 1% at a time and that will increase the OC further. I dont know how well haswell copes with bclk. As I said earlier, I have been running Sandy Bridge at 105.8


ok. Thanks for letting me know. I will try it out. Will report back results when I have time to do it(most likely over weekend).


----------



## rfarmer

http://www.game-debate.com/news/23762/rumour-geforce-gtx-1070-ti-to-launch-in-october-priced-at-429-just-5-slower-than-1080

Well, if the rumors are true, I can't see much reason to buy a 1080; this price will destroy the Vega 56.


----------



## Blackfirehawk

Interesting question: what would happen if you flash a Ti BIOS to a non-Ti 1070?

According to PCGamesHardware they will use the same boards as the non-Ti version, only a different chip.

Maybe we can unlock some shaders with the other BIOS?
The mobile version has more shaders than the desktop version too.


----------



## Nawafwabs

Quote:


> Originally Posted by *Blackfirehawk*
> 
> Interesting question: what would happen if you flash a Ti BIOS to a non-Ti 1070?
> 
> According to PCGamesHardware they will use the same boards as the non-Ti version, only a different chip.
> 
> Maybe we can unlock some shaders with the other BIOS?
> The mobile version has more shaders than the desktop version too.


Can I flash my 1070 with a setting that makes the fan kick in at 50C?


----------



## gtbtk

Quote:


> Originally Posted by *Nawafwabs*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Blackfirehawk*
> 
> Interesting question: what would happen if you flash a Ti BIOS to a non-Ti 1070?
> 
> According to PCGamesHardware they will use the same boards as the non-Ti version, only a different chip.
> 
> Maybe we can unlock some shaders with the other BIOS?
> The mobile version has more shaders than the desktop version too.
> 
> 
> 
> Can I flash my 1070 with a setting that makes the fan kick in at 50C?
Click to expand...

You don't need a flash for that; the tool you want is the custom fan curve in Afterburner.
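For what it's worth, a custom fan curve (in Afterburner, Precision XOC, etc.) is conceptually just a list of (temperature, fan %) points that the software interpolates between. A rough illustration with made-up points, ramping the fan from 50C as asked (none of these numbers come from an actual tool or BIOS):

```python
# Hypothetical curve points: (temp C, fan %). Afterburner lets you drag
# these on a graph; between points the fan speed is interpolated linearly.
CURVE = [(30, 20), (50, 50), (70, 80), (85, 100)]

def fan_percent(temp_c: float, curve=CURVE) -> float:
    if temp_c <= curve[0][0]:          # below the first point: floor speed
        return float(curve[0][1])
    if temp_c >= curve[-1][0]:         # above the last point: max speed
        return float(curve[-1][1])
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(40))  # 35.0 -- halfway between the 30C and 50C points
print(fan_percent(50))  # 50.0
```

The actual curve shape is up to you; the point is that a 50C ramp-up is just one point on the graph, no BIOS flash needed.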


----------



## comanzo

Quote:


> Originally Posted by *gtbtk*
> 
> I believe that H81 will allow you to increase the multiplier. The CPU is locked but it will allow you to increase the multiplier by 4. The K CPUs can be increased more than that without a built in restriction.
> 
> You go to the AI tweaker section in the uefi.
> 
> There is the MB manual
> 
> http://dlcdnet.asus.com/pub/ASUS/mb/LGA1150/H81M-A/E8599_H81M-Series.pdf
> 
> Take a look at the Asus multicore enhancement setting and try it on manual. you should then be able to set the core ratio to 40 for all cores. Reboot and the CPU should be running at 4.0Ghz.
> 
> Try leaving the vcore voltage at stock. If you get a BSOD, increase the Vcore voltage slightly. If it is stable, you can then try increasing the BCLK frequency 1% at a time and that will increase the OC further. I dont know how well haswell copes with bclk. As I said earlier, I have been running Sandy Bridge at 105.8


Oddly enough, I don't have Asus Multicore Enhancement on mine. 
Could the reason be that the BIOS version isn't up to date? The only other explanation I can think of is that since my CPU comes from a pre-built OEM system, maybe the feature isn't included? But I swapped motherboards, so that can't be the reason.


----------



## Nawafwabs

Quote:


> Originally Posted by *gtbtk*
> 
> You don't need a flash for that; the tool you want is the custom fan curve in Afterburner.


I don't like MSI Afterburner.


----------



## [email protected]

Sign me up, I currently own the EVGA 1060 6GB SSC card.


----------



## amalik

Quote:


> Originally Posted by *[email protected]*
> 
> Sign me up i currently own the EVGA 1060 6GB SSC card.


Except this is the GTX 1070 owner's club.


----------



## rfarmer

Quote:


> Originally Posted by *[email protected]*
> 
> Sign me up i currently own the EVGA 1060 6GB SSC card.


There is a 1060 owners club. http://www.overclock.net/t/1610596/official-nvidia-gtx-1060-owners-club


----------



## zipper17

Quote:


> Originally Posted by *Blackfirehawk*
> 
> Interesting question: what would happen if you flash a Ti BIOS to a non-Ti 1070?
> 
> According to PCGamesHardware they will use the same boards as the non-Ti version, only a different chip.
> 
> Maybe we can unlock some shaders with the other BIOS?
> The mobile version has more shaders than the desktop version too.


The shaders are physically disabled/removed; I don't think Nvidia would do that. Also, I've never seen an Nvidia chip that could be unlocked with a BIOS flash. (Correct me if I'm wrong.)

Only some AMD cards can be.

A BIOS voltage unlock is all we need imho. This 1.093V limit is dumb; I believe we could all get better results if the voltage were unlocked.
http://www.overclock.net/t/1605348/bios-hardware-voltage-lock-is-preventing-gtx-1070-from-reaching-1080-performance

All the Pascal limits (power, temp, voltage) seem smart, but they're actually dumb for overclockers.

This guy got rid of the Pascal limits with extreme physical modification:





Yeah, at least we know the GTX 1070 can definitely surpass the GTX 1080.


----------



## DeathAngel74

May I rejoin?
I added the 1070 SC2 as PhysX and an additional 32GB of DRAM.


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I believe that H81 will allow you to increase the multiplier. The CPU is locked but it will allow you to increase the multiplier by 4. The K CPUs can be increased more than that without a built in restriction.
> 
> You go to the AI tweaker section in the uefi.
> 
> There is the MB manual
> 
> http://dlcdnet.asus.com/pub/ASUS/mb/LGA1150/H81M-A/E8599_H81M-Series.pdf
> 
> Take a look at the Asus multicore enhancement setting and try it on manual. you should then be able to set the core ratio to 40 for all cores. Reboot and the CPU should be running at 4.0Ghz.
> 
> Try leaving the vcore voltage at stock. If you get a BSOD, increase the Vcore voltage slightly. If it is stable, you can then try increasing the BCLK frequency 1% at a time and that will increase the OC further. I dont know how well haswell copes with bclk. As I said earlier, I have been running Sandy Bridge at 105.8
> 
> 
> 
> Oddly enough, I don't have asus multicore enhancement on mine.
> Could the reason be that the bios version isn't up to date? That's the only explanation for why it's doing this. The only other thing I can think of is that since my cpu comes from a pre-built oem, maybe it doesn't have that feature included? I swapped motherboards so that can't be the reason.
Click to expand...

Take a look at the Asus website in the support section to see if you have the latest BIOS version.

On the screen you posted, where it has CPU Core Ratio - AUTO: click on that and it should let you set something like Manual or Sync All Cores or similar. Change it to Manual and you should be able to set all cores to x40, either on a single line or on four separate lines, one per core. Save the setting and it should run the chip at 4GHz on all cores after you reboot.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> May I rejoin?
> I added the 1070 SC2 as PhysX and an additional 32GB of DRAM.


Living it up in the hotel california.....


----------



## gtbtk

Quote:


> Originally Posted by *Nawafwabs*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> the tool is called custom curve in afterburner
> 
> 
> 
> i dont like MSI afterburner
Click to expand...

Use Precision XOC or Nvidia Inspector then.


----------



## DeathAngel74

Twas a birfday present from my wife, lol.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> Twas a birfday present from my wife, lol.


Well, happy birfday. That will teach you to scratch the GAS itch.


----------



## DeathAngel74

I'm just happy to reuse the card. $449 sitting in the closet for 3 weeks. I was getting the stink-eye for leaving 32GB of RAM and the 1070 SC2 in there for so long. 120fps @ 1440p on ultra settings + HairWorks in TW3.


----------



## fenixfox

Hi, I'm new to the GTX 1070 scene, just upgraded from an MSI R9 390 to the EVGA GTX 1070 Superclocked Black Edition, and all I can say is: what a fantastic card. I'm even able to play games at 4K, albeit at around 30fps, but I am used to the PS4 Pro so this is still a huge improvement.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> I'm just happy to reuse the card. $449 sitting in the closet for 3 weeks. I was getting the stink-eye for leaving 32gb of ram and 1070sc2 in there for so long. 120fps @ 1440p and ultra settings + hairworks for TW3.


The 1080 Ti is a great new toy to have. It's a shame that a PhysX GPU doesn't give you all that much benefit across the board. If every game supported it, it would be a great use of older hardware.


----------



## gtbtk

Quote:


> Originally Posted by *fenixfox*
> 
> Hi, I'm new to the GTX 1070 scene, just upgraded from an MSI R9 390 to the EVGA GTX 1070 Superclocked Black Edition, and all I can say is: what a fantastic card. I'm even able to play games at 4K, albeit at around 30fps, but I am used to the PS4 Pro so this is still a huge improvement.


Welcome to the 1070 club. I have been really pleased with Pascal. These cards overclock pretty well and give a nice boost to performance. If you need any advice, there are a number of helpful people here.


----------



## DeathAngel74

It actually helps with all the games I play: AC Syndicate, Batman: Arkham Knight, and The Witcher 3.


----------



## gtbtk

Quote:


> Originally Posted by *DeathAngel74*
> 
> It actually helps with all the games I play: AC Syndicate, Batman: Arkham Knight, and The Witcher 3.


lucky that you have the right subset of games


----------



## Akimbad

https://puu.sh/xKScP/06b57c5832.png


----------



## patriotaki

What kind of FPS do you get in the new CoD WWII open beta?


----------



## ImpliedConsent

Sign me up then ... My 1070 comes out of an Alienware Aurora R6 with an i7-7700. Decent box. Added 32GB and an 850 Evo. The 1070 is bumped 10% using MSI Afterburner and a solid . I don't use it for mining, just gaming. In a VR environment (Rift CV1), it's absolutely solid. I stressed it for 60 min using Kombustor 3 and it sits at a solid 79C, fan at 100%. I haven't had any glitches on ultra settings. Not sure what I would get if I bumped up to a 1080/Ti.


----------



## amalik

Quote:


> Originally Posted by *fenixfox*
> 
> Hi, im new to the GTX 1070 scene, just upgraded from an MSI R9 390 to the EVGA GTX 1070 Superclocked Black Edition, and all i can say is what a fantastic card. Even able to play games at 4k all be it at around 30fps, but i am used to the PS4 pro so this is still a huge improvement.


How is 30fps at 4K on a 60Hz monitor an upgrade vs your PS4 Pro? Wouldn't the PS4 Pro and Xbox One X be 4K @ 60fps on a 60Hz TV?


----------



## blaze2210

Quote:


> Originally Posted by *amalik*
> 
> How is 30fps at 4K on a 60Hz monitor an upgrade vs your PS4 Pro? Wouldn't the PS4 Pro and Xbox One X be 4K @ 60fps on a 60Hz TV?


Doesn't look like the PS4 Pro actually delivers the promise of 4K @ 60fps.

Quote:


> In the final analysis, we were left with a mere ten titles that fulfilled the initial brief - handing in the combination of a locked 4K pixel count matched with 60fps gameplay.


Source: http://www.eurogamer.net/articles/digitalfoundry-2017-every-native-4k-ps4-pro-game-tested

Even Destiny 2, a brand new game, seems to be failing at the 60fps claim also.

Quote:


> While Microsoft and Bungie haven't announced any plans to enhance Destiny 2 for the Xbox One X, we do have news that Destiny 2 will see increased performance on PS4 Pro. While the framerate *will remain at 30 FPS*, we see the PS4 Pro version get bumped up to a 4K resolution.


Source: http://heavy.com/games/2017/09/destiny-2-ps4-pro-base-differences-changes-4k-resolution-framerate/

With regards to the Xbox One X, only some of the games will do 60fps @ 4k, more of them still seem to be 30fps. The following site lists the games:

http://www.gamesradar.com/every-xbox-one-x-enhanced-game-4k-hdr-framerates-and-features-explained/


----------



## gtbtk

Quote:


> Originally Posted by *ImpliedConsent*
> 
> Sign me up then ... My 1070 comes off an Alienware Aurora R6 i7700. Decent box. Added 32GB and a 850 Evo. The 1070 is bumped 10% using MSI Afterburner and a solid . I don't use it for mining, just gaming. In a VR environment (Rift CV1), it's absolutely solid. I stressed it 60min using Kombustor 3 and it sits solid 79C, fan 100%. I haven't had any glitches on Ultra settings. Not sure what I would get if I bumped up to 1080/ti.


That card is using reference clocks and should cope with more than a +100 core boost in AB - +150 to +200 should be achievable. Vram should be quite happy with a +500 - +800 boost and give you a nice performance boost
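As a back-of-the-envelope sketch of where those offsets land on a reference card: the base figures below are the commonly quoted GTX 1070 reference numbers, and the assumption (borne out by reports later in this thread) is that Afterburner's memory offset applies to the physical clock, so the effective GDDR5 rate moves by twice the offset:

```python
# Toy helper: map MSI Afterburner offsets to resulting clocks on a
# reference GTX 1070. The reference figures and the 2x GDDR5 convention
# are assumptions for illustration, not vendor specs.

REF_BOOST_MHZ = 1683       # reference GTX 1070 rated boost clock
REF_MEM_EFFECTIVE = 8008   # reference effective GDDR5 rate (MHz)

def core_after_offset(offset_mhz: int) -> int:
    """A core offset shifts the whole boost curve by the same amount."""
    return REF_BOOST_MHZ + offset_mhz

def mem_after_offset(offset_mhz: int) -> int:
    """Afterburner's memory offset is applied to the physical clock, so
    the effective (double data rate) figure moves by 2x the offset."""
    return REF_MEM_EFFECTIVE + 2 * offset_mhz

print(core_after_offset(200))  # 1883 - before GPU Boost adds its own bins
print(mem_after_offset(600))   # 9208, i.e. the ~"9200MHz" people quote
```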


----------



## amalik

Quote:


> Originally Posted by *fenixfox*
> 
> Hi, I'm new to the GTX 1070 scene, just upgraded from an MSI R9 390 to the EVGA GTX 1070 Superclocked Black Edition, and all I can say is what a fantastic card. It's even able to play games at 4K, albeit at around 30fps, but I'm used to the PS4 Pro so this is still a huge improvement.


How is 30fps at 4K on a 60Hz monitor an upgrade vs your PS4 Pro? Wouldn't the PS4 Pro and Xbox One X be 4K@60fps on a 60Hz TV?
Quote:


> Originally Posted by *blaze2210*
> 
> Doesn't look like the PS4 Pro actually delivers the promise of 4K @ 60fps.
> Source: http://www.eurogamer.net/articles/digitalfoundry-2017-every-native-4k-ps4-pro-game-tested
> 
> Even Destiny 2, a brand new game, seems to be failing at the 60fps claim also.
> Source: http://heavy.com/games/2017/09/destiny-2-ps4-pro-base-differences-changes-4k-resolution-framerate/
> 
> With regards to the Xbox One X, only some of the games will do 60fps @ 4k, more of them still seem to be 30fps. The following site lists the games:
> 
> http://www.gamesradar.com/every-xbox-one-x-enhanced-game-4k-hdr-framerates-and-features-explained/


thanks for the links, i'll have to check them out.

I generally find console gaming more fun / more comfortable (couch/laying down/etc vs messing your back up) than PC gaming, but I believe the FPS genre is only good/great on PC; I've been a Counter-Strike player for 17 years.


----------



## blaze2210

Quote:


> Originally Posted by *amalik*
> 
> How is 30fps at 4K on a 60Hz monitor an upgrade vs your PS4 Pro? Wouldn't the PS4 Pro and Xbox One X be 4K@60fps on a 60Hz TV?
> thanks for the links, I'll have to check them out.
> 
> I generally find console gaming more fun / more comfortable (couch/laying down/etc vs messing your back up) than PC gaming, but I believe the FPS genre is only good/great on PC; I've been a Counter-Strike player for 17 years.


That's one of the ways that PC is evolving. We don't have to be the "weirdos at their computer desk" anymore....

I've been a Counter-Strike player for about the last 24 hours now (not a solid 24 hours though)....Hehehe.... Pretty fun, but man, there are A LOT of cheaters....


----------



## pez

You'd have to be getting 30+ at all times and to make it somewhat 'nice' you'd want a G-sync panel. Anything less than that and you're back to tearing. At least consoles generally lock FPS somewhere so that it reduces tearing and framedrops (*usually*) so that you don't have a terrible experience.

Otherwise, I generally will prefer to use a console at 4K. I still have yet to get my hands on a Shield, as I feel that would change my ideal for the most part.


----------



## amalik

Quote:


> Originally Posted by *fenixfox*
> 
> Hi, I'm new to the GTX 1070 scene, just upgraded from an MSI R9 390 to the EVGA GTX 1070 Superclocked Black Edition, and all I can say is what a fantastic card. It's even able to play games at 4K, albeit at around 30fps, but I'm used to the PS4 Pro so this is still a huge improvement.


How is 30fps on a 60hz monitor an upgrade
Quote:


> Originally Posted by *blaze2210*
> 
> That's one of the ways that PC is evolving. We don't have to be the "weirdos at their computer desk" anymore....
> 
> I've been a Counter-Strike player for about the last 24 hours now (not a solid 24 hours though)....Hehehe.... Pretty fun, but man, there are A LOT of cheaters....


I hate CS:GO.

I play CS 1.6 at 144fps on 144hz monitor for the sheer game experience of what I think is the best FPS ever made. Play it for a few minutes and you'll see what I mean.

Graphics suck on it, but gameplay is golden. So, for the sake of the graphics, you'll have to pair it with a more modern game like BF etc on the side or CS:GO if you can stomach it.

Anyway, even if some of those console games are 4K@60fps, it's still pretty hard to do that on PC without probably two 1080 Tis, which is much more expensive than a console. Or a single 1080 Ti with turned-down settings.


----------



## blaze2210

Quote:


> Originally Posted by *amalik*
> 
> I hate CS:GO.
> 
> I play CS 1.6 at 144fps on 144hz monitor for the sheer game experience of what I think is the best FPS ever made. Play it for a few minutes and you'll see what I mean.
> 
> Graphics suck on it, but gameplay is golden. So, for the sake of the graphics, you'll have to pair it with a more modern game like BF etc on the side or CS:GO if you can stomach it.
> 
> Anyway, even if some of those console games are 4K@60fps, it's still pretty hard to do that on PC without probably two 1080 Tis, which is much more expensive than a console. Or a single 1080 Ti with turned-down settings.


I like shooting people in games, so I'm not generally too picky, plus it was cheap.

I enjoy seeing what my hardware can do with graphics, so I generally don't tend to go too far back in gaming history. There are a few exceptions to that though, like games that I really enjoyed when I was trying to play games on my Dad's old non-gaming oriented PC - Jedi Knight: Dark Forces II and Jedi Academy are the main ones that come to mind.

I may give CS a shot, if I can get it for a reasonable price....

It seems like 4K@60fps is challenging, no matter what the platform is. Though I'm fairly certain that it will be fully achieved on PC first, then the tech adapted to consoles. Hehehe....


----------



## b0uncyfr0

Any more news on the 1070 Ti? I'm getting giddy thinking of the possibilities.

@gtbtk - I'm gonna follow your lead and flash the Z bios. A lil boost might just give me what I need.


----------



## gtbtk

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Any more news on the 1070 Ti? I'm getting giddy thinking of the possibilities.
> 
> @gtbtk - I'm gonna follow your lead and flash the Z bios. A lil boost might just give me what I need.


I'm happy with it


----------



## ImpliedConsent

Quote:


> Originally Posted by *amalik*
> 
> I play CS 1.6 at 144fps on 144hz monitor for the sheer game experience of what I think is the best FPS ever made. Play it for a few minutes and you'll see what I mean.


I think CS:GO is what keeps CS going. It's amazing...this, coming from a guy who beta tested Half-Life(1) and struggled through implementing VAC. CS barely registers CPU/GPU stress now.


----------



## b0uncyfr0

So a bit of an update with the Z bios - I haven't touched the overclocking yet, but by simply adjusting the fan to keep it cooler I've hit 2012 at 56 degrees. It's super stable, and Andromeda is one of those games that hates OCs with a passion - not a single crash in 3 hours. Loving it so far.

I will push it later tonight hopefully - a stable 2100 core is the goal. I have a feeling though that my memory is not so good - always had a few issues going past +400.


----------



## gtbtk

Quote:


> Originally Posted by *b0uncyfr0*
> 
> So a bit of an update with the Z bios - I haven't touched the overclocking yet, but by simply adjusting the fan to keep it cooler I've hit 2012 at 56 degrees. It's super stable, and Andromeda is one of those games that hates OCs with a passion - not a single crash in 3 hours. Loving it so far.
> 
> I will push it later tonight hopefully - a stable 2100 core is the goal. I have a feeling though that my memory is not so good - always had a few issues going past +400.


With the same Gaming X with Gaming Z bios setup and Micron memory, my best Firestrike graphics score is about 21300, running on an i7 Sandy Bridge CPU with PCIe 2.0. A modern CPU would probably get a bit more out of the card.

I found my best performance sweet spot to be around 2088/2076MHz with memory at 9200MHz, from memory. I could run the card over 2100MHz but the scores didn't really improve.

If you increase the voltage at the high clocks, it will boost to a higher frequency but will also increase temps by about 8-9 degrees with the fans at 100%, making the card want to downclock more. The end results ended up within about 1-2fps of each other.
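That voltage-vs-temperature trade-off can be sketched with a toy model of GPU Boost's thermal stepping. The ~13 MHz clock bin is commonly reported for Pascal; the 5 C bin spacing, 38 C first threshold, and the example clocks/temps below are illustrative assumptions, not measurements:

```python
# Toy model of GPU Boost thermal stepping: Pascal cards shed clock in
# ~13 MHz bins as core temperature crosses thresholds. The 5 C-per-bin
# spacing and 38 C starting point are illustrative assumptions.

STEP_MHZ = 13
BIN_DEG_C = 5
FIRST_BIN_C = 38

def boost_at_temp(top_clock_mhz: int, temp_c: float) -> int:
    """Clock after thermal step-down at a given core temperature."""
    if temp_c < FIRST_BIN_C:
        return top_clock_mhz
    bins = int((temp_c - FIRST_BIN_C) // BIN_DEG_C) + 1
    return top_clock_mhz - bins * STEP_MHZ

# A voltage bump that buys one extra clock bin but adds ~9 C can wash out:
print(boost_at_temp(2088, 56))  # cooler, lower-voltage setting -> 2036
print(boost_at_temp(2101, 65))  # hotter, higher-voltage setting -> 2023
```

Under this toy model the hotter, higher-voltage setting actually settles lower, which matches the "within 1-2fps of each other" result above.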


----------



## bigazz

Hi guys,

I have a GTX 1070 from an Acer Predator G3 desktop; it looks a bit like a Founders Edition in the fan design. What options do I have to overclock? I was thinking of at least replacing the thermal compound with some liquid metal. Should I get an aftermarket cooler? Can I flash another bios on the card to increase core voltages?


----------



## blaze2210

Quote:


> Originally Posted by *bigazz*
> 
> Hi guys,
> 
> I have a GTX 1070 from an acer predator G3 desktop, it looks a bit like a founders edition in the fan design. What options do I have to overclock? Was thinking of at least replacing the thermal compound with some liquid metal. Should I get an aftermarket cooler? Can I flash another bios on the card to increase core voltages?


Download Afterburner, since it doesn't care what specific card you have - it just works.... You can replace the thermal compound with liquid metal, just make sure that the plate for the cooler is *not* aluminum - nickel-plated copper is perfectly fine... You can flash another vBIOS, but there's no guarantee that you'll bypass whatever voltage limitations your card has, and no absolute guarantee that the vBIOS will work on your card.... Be sure to make a backup of your stock vBIOS first using GPU-Z....


----------



## gtbtk

Quote:


> Originally Posted by *blaze2210*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bigazz*
> 
> Hi guys,
> 
> I have a GTX 1070 from an acer predator G3 desktop, it looks a bit like a founders edition in the fan design. What options do I have to overclock? Was thinking of at least replacing the thermal compound with some liquid metal. Should I get an aftermarket cooler? Can I flash another bios on the card to increase core voltages?
> 
> 
> 
> Download Afterburner, since it doesn't care what specific card you have - it just works.... You can replace the thermal compound with liquid metal, just make sure that the plate for the cooler is *not* aluminum - nickel-plated copper is perfectly fine... You can flash another vBIOS, but there's no guarantee that you'll bypass whatever voltage limitations your card has, and no absolute guarantee that the vBIOS will work on your card.... Be sure to make a backup of your stock vBIOS first using GPU-Z....
Click to expand...

What blaze said except for voltage limitations. Before pulling the card to pieces, I suggest that you install afterburner and see how it goes. A replacement air cooler will certainly run quieter but you wont get that much of a boost over what you can get now. Water cooling will give you a much larger performance boost potential.

There are no bioses that will allow increased voltages - only maximum power draw, default core clock and, in some cases, memory clock. The reference cards and partner models using the reference board, with a base clock of 1506MHz, actually overclock really well without needing to swap out the bios. +200 core and +500 memory should be an easy OC for you.

Additionally, you only have a single 8-pin power connector, so bioses like the Zotac AMP Extreme bios that can pull 300W are not recommended on a reference design card that is nominally a 180W design. The increased power really only translates to extra heat.


----------



## Madmaxneo

Quote:


> Originally Posted by *bigazz*
> 
> Hi guys,
> 
> I have a GTX 1070 from an acer predator G3 desktop, it looks a bit like a founders edition in the fan design. What options do I have to overclock? Was thinking of at least replacing the thermal compound with some liquid metal. Should I get an aftermarket cooler? Can I flash another bios on the card to increase core voltages?


If you're looking to get better temps then I recommend water cooling the card. A different air cooler for the card might not make much of a difference, if any, but watercooling the card will make a huge difference, especially in your max temps. I have a Swiftech H140-X paired with a Heatkiller IV Pro GPU block to keep my GTX 1070 no higher than the mid-to-high 40s for max temps.


----------



## Gurkburk

Been out of the loop for a while.. Any news on custom flashing to get past the volt & temperature throttling in 3.0?


----------



## gtbtk

Quote:


> Originally Posted by *Gurkburk*
> 
> Been out of the loop for a while.. Any news on custom flashing to get past the volt & temperature throttling in 3.0?


nothing to increase card voltages.

You can manage frequencies better by tuning your curve to start at 1.050v with +0 voltage, or 1.075v or 1.081v with +100 voltage. As temps increase, the card will increase the voltage in steps up to the card's maximum rather than drop the clock speed, which it will only do after it has run out of voltage headroom.

If you set the curve to start out running the max voltage, then there is no voltage headroom for GPU Boost to use and the only way it can go is to downclock the GPU core frequency.

To tune the curve, you set the voltage you want to use as the highest point on the curve; everything to the right of the target voltage point remains flat and does not increase the frequency.
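The flattening step can be sketched in a few lines of Python. The voltage/frequency pairs below are made-up illustration values, not a real 1070 curve:

```python
# Sketch of the curve flattening described above: pick a target voltage
# point, then clamp every point at or beyond it to that point's
# frequency, so GPU Boost has voltage headroom but no higher clock bins.

def flatten_curve(curve, v_target):
    """curve: list of (voltage, freq) pairs sorted by voltage.
    Returns a new curve that is flat at and beyond v_target."""
    cap = max(f for v, f in curve if v <= v_target)
    return [(v, f if v < v_target else cap) for v, f in curve]

curve = [(1.000, 1974), (1.025, 2000), (1.050, 2025), (1.075, 2050),
         (1.093, 2088)]
for v, f in flatten_curve(curve, 1.075):
    print(v, f)  # the 1.093v point is held at the 1.075v frequency (2050)
```

This mirrors dragging the editor's points in Afterburner so that everything right of your chosen voltage sits on the same frequency.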


----------



## Falkentyne

How exactly do you do that?
.....

Can you post a screenshot of how the curve looks at 1.050v, please? All this curve talk makes me dizzy because it's confusing.


----------



## gtbtk

Like this. The initial voltage the card will draw in this example is 1.075v. If you set the voltage to +100 you have 2 steps of voltage headroom above 1.075v


----------



## Imprezzion

I've been running this MSI Gaming Z 1070 for quite a while now and I just switched to a new case. Went from my Corsair Air 540 to a Corsair 730T and it lowered GPU temps even further.. I can't begin to understand how this thing can run so cool..

It's not even hitting 60c in games anymore. Usually stays around 57-58c with a 2114MHz core and 4200MHz VRAM OC on +100mV and 126% power target.. Fan curve is pretty much stock. All I did was disable fanless mode by letting it run at 40% when above 20c.

Is it "normal" for a 1070 to run this cool?

Just a shame it can't clock any higher than 2114MHz stable.. It benches a LOT higher but it will give random DirectX crashes at anything above 2114MHz in long-term gaming.. VRAM is also Micron and doesn't OC at all. 4300MHz already gives black lines..


----------



## gtbtk

Quote:


> Originally Posted by *Imprezzion*
> 
> I've been running this MSI Gaming Z 1070 for quite a while now and I just switched to a new case. Went from my Corsair Air 540 to a Corsair 730T and it lowered GPU temps even further.. I can't begin to understand how this thing can run so cool..
> 
> It's not even hitting 60c in games anymore. Usually stays around 57-58c with a 2114MHz core and 4200MHz VRAM OC on +100mV and 126% power target.. Fan curve is pretty much stock. All I did was disable fanless mode by letting it run at 40% when above 20c.
> 
> Is it "normal" for a 1070 to run this cool?
> 
> Just a shame it can't clock any higher than 2114MHz stable.. It benches a LOT higher but it will give random DirectX crashes at anything above 2114MHz in long-term gaming.. VRAM is also Micron and doesn't OC at all. 4300MHz already gives black lines..


Check that you are actually running the latest vbios. GPU-Z should list it as 86.04.50.00.xx and not 86.04.26.00.xx; that update solved a bug with the Micron memory cards.

I found that bumping my VCCIO slightly helped the stability of higher VRAM overclocks after the bios update. I was running at +600 with little problem.

I have the Gaming X card that I flashed with a Gaming Z bios, also with Micron memory. With 100% fan and a side-panel case fan I was running between 53-58 degrees depending on what voltage setting I used.
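For anyone scripting the check, the version string GPU-Z reports can be compared field by field. Parsing each dotted field as hex is an assumption on my part (NVIDIA vBIOS versions can contain hex digits); the 86.04.50.00.00 threshold just reflects the versions mentioned above:

```python
# Compare GPU-Z style vBIOS version strings, e.g. "86.04.50.00.29".
# Fields are parsed as hex (assumption: NVIDIA versions may use hex digits).

def vbios_tuple(version: str):
    """Split a dotted vBIOS version into a comparable tuple."""
    return tuple(int(part, 16) for part in version.split("."))

def has_micron_fix(version: str) -> bool:
    """True if the version is on the fixed 86.04.50.xx branch or later."""
    return vbios_tuple(version) >= vbios_tuple("86.04.50.00.00")

print(has_micron_fix("86.04.50.00.29"))  # True  - updated bios
print(has_micron_fix("86.04.26.00.15"))  # False - pre-fix bios
```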


----------



## Imprezzion

Quote:


> Originally Posted by *gtbtk*
> 
> Check that you are actually running on the latest vbios. GPU-Z Should list it as 86.04.50.00,xx and not 86.04.26.00.xx. that updater solved a bug with the micron memory cards.
> 
> I found that bumping my vccio slightly helped the stability of higher vram overclocks after the bios update. I was running at +600 with little problem
> 
> I have the Gaming X card that I flashed with a Gaming Z bios. also with Micron memory. With 100% fan and a sidepanel case fan I was running between 53-58 deg depending on what voltage setting I used


I'm running 86.04.50.00.29. I updated it right when I got it.

And ehm.. raising VCCIO? On the motherboard or on the GPU, since the latest MSI AB non-beta doesn't have an option for it?


----------



## blaze2210

Quote:


> Originally Posted by *Imprezzion*
> 
> I'm running 86.04.50.00.29. I updated it right when i got it.
> 
> And ehm.. raising VCCIO? On the motherboard or the GPU since the latest MSI AB non beta doesn't have an option for it?


VCCIO would be a voltage on the motherboard.


----------



## Imprezzion

Figured as much. Some GPUs do have a second voltage option in MSI AB but I'm not sure what it's for.. And the 1070 doesn't have this.

I'm not keeping this CPU for long anyway, as I probably found a proper 4.7GHz delidded 6700K with a Z170A Gaming 7 board for a great price in my neighbourhood and I'm really thinking of getting it..


----------






## gtbtk

Quote:


> Originally Posted by *Imprezzion*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Check that you are actually running on the latest vbios. GPU-Z Should list it as 86.04.50.00,xx and not 86.04.26.00.xx. that updater solved a bug with the micron memory cards.
> 
> I found that bumping my vccio slightly helped the stability of higher vram overclocks after the bios update. I was running at +600 with little problem
> 
> I have the Gaming X card that I flashed with a Gaming Z bios. also with Micron memory. With 100% fan and a sidepanel case fan I was running between 53-58 deg depending on what voltage setting I used
> 
> 
> 
> I'm running 86.04.50.00.29. I updated it right when i got it.
> 
> And ehm.. raising VCCIO? On the motherboard or the GPU since the latest MSI AB non beta doesn't have an option for it?
Click to expand...

That is the right bios.

VCCIO on the motherboard in the UEFI. Together with the System Agent voltage in the bios, they help fortify the system memory controller and system IO.


----------



## Falkentyne

Quote:


> Originally Posted by *gtbtk*
> 
> Like this. The initial voltage the card will draw in this example is 1.075v. If you set the voltage to +100 you have 2 steps of voltage headroom above 1.075v


Thank you very much for your help, gtbtk.
This explanation makes sense. The curve is flat and the frequency is flat, with different voltage points, so the voltage can rise at the same clock frequency.
On the "non-flat" curve, a voltage point only has a set clock speed tied to it, and there are no other clock speeds for that voltage, so clocks can only go up (if automatically overclocking and voltage is not locked with Ctrl+L) or down (no voltage points).

So in your example, the curve removes "VRel" (reliability voltage), because there are MORE voltage points along the locked point (either 1.050v or 1.075v)? Do I understand this correctly?

I tried this on my TDP modded laptop 1070 (185W TDP, modded from 115W TDP), however when I set the curve like this--+200 mhz overclock and everything at 1.050v and to the right completely flat (this showed up as 2075 mhz at <42C), GPU-Z showed "vOP" under perfcap reason...

This seems to show that 1.050v is the highest voltage allowed by the controller and thus cannot go higher. So I'm guessing your example won't work for me in this case because 1.050v gives vOP.

Does this mean I have to use a lower voltage point, like 1.00v, lock that, and then make the curve completely flat to the right of 1.00v? Or should I start at 1.012v or 1.025v?

Thank you.


----------



## skupples

Quote:


> Originally Posted by *gtbtk*
> 
> You should update your rig info in your sig.
> 
> With my Sandy Bridge rig, I found that increasing VCCIO and CPU PLL slightly stabilized my VRAM OC at higher frequencies. AMD probably has similar settings that may be worth experimenting with


lol, all that's changed is the tri-SLI K Titans came out and the single 1070 came in, which I might add is faster in 1080p surround than two Titans, which I suppose is decent progress.


----------



## gtbtk

Quote:


> Originally Posted by *Falkentyne*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Like this. The initial voltage the card will draw in this example is 1.075v. If you set the voltage to +100 you have 2 steps of voltage headroom above 1.075v
> 
> 
> 
> 
> 
> Thank you very much for your help, gtbtk.
> This explanation makes sense. The curve is flat and the frequency is flat, with different voltage points, so the voltage can rise, at the same clock frequency.
> On the 'Non flat" curve, a voltage point only has a set clock speed tied to it, and there are no other clock speeds for that voltage, so clocks can only go up (if automatic overclocking and voltage is not locked with Control L), or down (No voltage points).
> 
> So in your example, your example removes "VREL" (reliability voltage), because there are MORE voltage points along the locked point (either 1.050v or 1.075v)? Do I understand this correctly?
> 
> I tried this on my TDP modded laptop 1070 (185W TDP, modded from 115W TDP), however when I set the curve like this--+200 mhz overclock and everything at 1.050v and to the right completely flat (this showed up as 2075 mhz at <42C), GPU-Z showed "vOP" under perfcap reason...
> 
> This seems to show that 1.050v is the highest voltage allowed by the controller and thus cannot go higher. So I'm guessing your example won't work for me in this case because 1.050v gives vOP.
> 
> Does this mean I have to use a lower voltage point, like 1.00v, lock that, and then make the curve completely flat to the right of 1.00v? Or should I start at 1.012v or 1.025v?
> 
> Thank you.
Click to expand...

In that example I posted, you would set your voltage slider to +100. You don't need to lock the voltage, just create a curve that stops increasing at, say, 1.075v or 1.081v and remains flat after that. You can use the core slider as well; you only need to flatten off the last part of the curve up to 1.093v. You can use the same method to undervolt the card.

When you start the 3D load, the card will initially attempt to go up to the highest frequency on the curve where it flattens off and will start there. As temps increase, the card will incrementally increase voltage if there is headroom, or it will step down to the next lower frequency step. There will be a point of equilibrium where the cooling, the voltage and the frequency all balance out, and it will stay at that frequency.

If you leave the voltage at +0 and set a curve that keeps increasing as it goes past 1.050v, it will go one step higher, to 1.061v, before it hits the +0 voltage limit. Increasing the voltage slider to +100 will raise the limit to 1.093v.

I found with my rig that GPU-Z is buggy and doesn't really know how to deal with a GPU overclocking on the voltage curve. In my case, Perfcap would end up with a nice rainbow strip and GPU-Z told me that everything was causing the throttling. You have to go into the settings in Afterburner to enable the voltage slider.

Having flashed most 1070 bioses onto my card, there are many differences between brands. Some cards, if you use the slider, will always have the default curve set one step below max voltage as the highest point on the curve. Others will have the curve continuing to increase, so it will stop once it hits the upper voltage set by the curve. For the sake of your experiments, use the AB curve to try points manually (Ctrl-F): you can try 1.050, 1.043 and 1.061 and see what happens. You can also use a lower point if you want.
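The slider/limit interaction can be sketched as GPU Boost picking the highest-frequency curve point that fits under the current voltage cap. The 1.061v (+0) and 1.093v (+100) limits are the figures from this discussion; the curve values themselves are illustrative assumptions:

```python
# Sketch: GPU Boost runs the highest-frequency point on the curve whose
# voltage fits under the current limit; the slider raises that limit.

V_LIMIT_STOCK = 1.061    # approx. limit with the voltage slider at +0
V_LIMIT_PLUS100 = 1.093  # approx. limit with the slider at +100

def operating_point(curve, v_limit):
    """curve: (voltage, freq) pairs. Return the allowed point with the
    highest frequency."""
    allowed = [(v, f) for v, f in curve if v <= v_limit]
    return max(allowed, key=lambda p: p[1])

curve = [(1.000, 1974), (1.050, 2025), (1.061, 2038), (1.075, 2050),
         (1.093, 2088)]
print(operating_point(curve, V_LIMIT_STOCK))    # (1.061, 2038)
print(operating_point(curve, V_LIMIT_PLUS100))  # (1.093, 2088)
```

With a flattened curve, all points past the target share one frequency, so raising the limit only adds voltage headroom rather than a higher clock bin.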


----------



## Falkentyne

Hi again,
Unfortunately, this (voltage +100mv) is impossible on the laptop cards. Even if you unlock the voltage slider in MSI afterburner, it does absolutely nothing. I did increase my memory speed by another 100 mhz which did show a nice score boost in firestrike graphics test and in Valley. (memory is now 9200 mhz).

I do appreciate the help though, thank you!


----------



## gtbtk

Quote:


> Originally Posted by *Falkentyne*
> 
> Hi again,
> Unfortunately, this (voltage +100mv) is impossible on the laptop cards. Even if you unlock the voltage slider in MSI afterburner, it does absolutely nothing. I did increase my memory speed by another 100 mhz which did show a nice score boost in firestrike graphics test and in Valley. (memory is now 9200 mhz).
> 
> I do appreciate the help though, thank you!


Sorry. I assumed you were on a desktop card.

1070s do love faster memory. You may find that running faster memory, even if you have to sacrifice a few MHz of core clock OC, gets you a bigger performance boost in many workloads. Given that laptop heat management is a priority, it is not surprising that voltage is not unlocked. The extra voltage on a desktop card does increase temps and doesn't really get you a huge increase in performance over stock voltage anyway, so it's probably not worth it.

Without being able to increase the voltage, if the laptop card follows the desktop cards, the max voltage headroom you should have available would be 1.063v. The curve that is adjusted with the core slider in AB tops out at 1.050v, so that gives you one extra step of GPU Boost voltage headroom. If you want to try 1.063v, open the curve and manually drag the point at 1.063v up to whatever frequency you want to try, and the card should boost up to that frequency to start with. As temps increase though, the frequency will immediately start dropping off.


----------



## ravihpa

Quote:


> Originally Posted by *gtbtk*
> 
> Sorry. I assumed you were on a desktop card.


Hi,

I want to OC my GPU. Can you please link me to a newbie's guide or a video on how to properly and safely overclock my GTX 1070 so that I can get the max out of my card?

I have a *Zotac GTX1070 AMP! Extreme*.

I read up on forums and have set up a proper fan curve, which keeps my temps below 65 degrees C.

Thanx a lot in advance for the help
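For anyone who wants to reason about a fan curve like the one described, it is just linear interpolation between (temperature, duty%) points. The points below are example values, not Zotac's defaults:

```python
# Model a GPU fan curve as linear interpolation between
# (temperature C, fan duty %) points, clamped outside the range.

def fan_duty(curve, temp_c):
    """curve: (temp, duty%) points sorted by temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

curve = [(30, 30), (50, 50), (65, 80), (75, 100)]
print(fan_duty(curve, 40))  # 40.0
print(fan_duty(curve, 65))  # 80.0 - ramp hard near the 65 C target
```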


----------



## microchidism

This video should be enough to get you started.


----------



## khanmein

Quote:


> Originally Posted by *microchidism*
> 
> 
> 
> 
> 
> 
> this video should be enough to get you started


Stop promoting this useless YouTube reviewer.


----------



## asdkj1740

Quote:


> Originally Posted by *khanmein*
> 
> Stop promoting this useless YouTube reviewer.


lol


----------



## ravihpa

Quote:


> Originally Posted by *khanmein*
> 
> Stop promoting this useless YouTube reviewer.


Lol. I'm just interested in the end result. Is it a good OC video? If not, can you recommend something else?


----------



## nuno_p

Yes, that video is a good guide, don't worry about the hate comments.


----------



## lanofsong

Hey GTX 1070 owners,

We are having our monthly Foldathon from Monday 16th - Wednesday 18th - 12noon EST.
Would you consider putting all that awesome GPU power to a good cause for those 2 days? If so, come *sign up* and fold with us - see attached link.

October 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - needs a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. Enter your folding name (mine is the same as my OCN name), enter your passkey, and enter Team OCN number - 37726

later
lanofsong


----------



## microchidism

Quote:


> Originally Posted by *khanmein*
> 
> Stop promoting this useless YouTube reviewer.


I don't follow his channel, but I remember when I first got my 1070 it was one of the vids I watched; it explained Pascal's GPU Boost enough to get me started....


----------



## khanmein

Quote:


> Originally Posted by *ravihpa*
> 
> Lol. Am just interested in the end result. Is it a good OC video? If not, can you recommend me something else?


Any video as long as it's not from that moron JayzTwoCents! The fella likes to proclaim himself an enthusiast & said not to install MSI AB through Guru3D. He doesn't even know that MSI AB is @Unwinder's work. We have the Overclock forum.


----------



## asdkj1740

Quote:


> Originally Posted by *khanmein*
> 
> Any videos as long not the moron JayzTwoCents! The fella like to proclaim himself as an enthusiast & stated that don't install MSI: AB thru Guru3D. He don't even know that MSI: AB is @Unwinder work. We have Overclock forum.


I have been wondering how he's achieved 1M subs.


----------



## Capt

How well does the GTX 1070 handle BF1 at ultra settings at 2560x1440? I don't know if I would need a gtx 1080 or if a gtx 1070 can get the job done.


----------



## khanmein

Quote:


> Originally Posted by *asdkj1740*
> 
> i have been thinking how he has achieve 1m subs.


No offense, if you like his videos no one can stop you from watching, but 1M subscribers is not a big deal. I also watch, but purely for entertainment. There are lots of channels with a million subscribers, so who cares, and there are also smaller YouTube reviewers that are more honest than him. Frankly speaking, I didn't learn much from him, and he's cocky!

He has achieved 1M subs. SO?


----------



## asdkj1740

Quote:


> Originally Posted by *khanmein*
> 
> No offense, if you like his videos, no one can stop you from watching, but just 1M subscriber is not a big deal. I also watched, but purely for entertainment only. There's a lot million subscriber so who care he/she had & there's also other that less YouTube reviewer that's honest than him. Frankly speaking, I didn't learn anything much from him & he's cocky!
> 
> He has achieved 1M subs. SO?


No, as a techtuber, 1M subs is really quite high.


----------



## zipper17

Quote:


> Originally Posted by *khanmein*
> 
> Stop promoting this *useless* YouTube reviewer.



That's a quite harsh comment bro lmao. I'm not a fan of any tech reviewers; sometimes I watch their videos, but I never take them personally ... we search for information on the internet based on facts/objectivity, not personal feelings/subjectivity.


----------



## nolive721

How do you guys in the US see the market moving with the soon-to-come 1070 Ti release? Are there people here ready to trade in their beloved 1070 for a Ti variant?

Any idea about retail pricing yet?


----------



## blaze2210

Quote:


> Originally Posted by *khanmein*
> 
> No offense, if you like his videos, no one can stop you from watching, *but just 1M subscriber is not a big deal. I also watched,* but purely for entertainment only. There's a lot million subscriber so who care he/she had & there's also other that less YouTube reviewer that's honest than him. Frankly speaking, I didn't learn anything much from him & he's cocky!
> 
> He has achieved 1M subs. SO?


Sounds like you might be confusing views with subscriptions, unless you mean that while watching one of his videos, you "accidentally" clicked the big red Subscribe button. I wouldn't bat an eye at 1M _views_, but 1M subscribers is something to actually consider. That means that there are 1M people who want to make sure they know when he releases a new video.

I've watched a few of his videos, and he gives decent opinions on the items he talks about, and presents things in an entertaining way (IMHO). What videos would you prefer that people watch for PC info? Whenever you "shut down a road", you should open another at the same time.


----------



## asdkj1740

Quote:


> Originally Posted by *zipper17*
> 
> That's quite harsh comments bro lmao, I'm not a fan on any tech reviewers, sometime I see their channel videos, but I also never take them into personal feeling ... we search information on the internet based on facts/objective, not personal feeling/being subjective.


His water cooling stuff seems to be very good; I don't know, I'm not familiar with that aspect.

For the other stuff, it's just for fun.


----------



## khanmein

Quote:


> Originally Posted by *blaze2210*
> 
> Sounds like you might be confusing views with subscriptions, unless you mean that while watching one of his videos, you "accidentally" clicked the big red Subscribe button. I wouldn't bat an eye at 1M _views_, but 1M subscribers is something to actually consider. That means that there are 1M people who want to make sure they know when he releases a new video.
> 
> I've watched a few of his videos, and he gives decent opinions on the items he talks about, and presents things in an entertaining way (IMHO). What videos would you prefer that people watch for PC info? Whenever you "shut down a road", you should open another at the same time.


I personally prefer PCPER.


----------



## Gurkburk

Quote:


> Originally Posted by *Capt*
> 
> How well does the GTX 1070 handle BF1 at ultra settings at 2560x1440? I don't know if I would need a gtx 1080 or if a gtx 1070 can get the job done.


Pretty well, 90+ fps at all times.


----------



## Blze001

My GTX-1070 FE is thermal throttling like crazy lately. Any thoughts on possible solutions? I'm down to either changing cases to improve the air intake for it, or trying to shoehorn a watercooling setup into my current case.


----------



## blaze2210

Quote:


> Originally Posted by *Blze001*
> 
> My GTX-1070 FE is thermal throttling like crazy lately. Any thoughts on possible solutions? I'm down to either changing cases to improve the air intake for it, or trying to shoehorn a watercooling setup into my current case.


There's always the option of adding a Kraken G12 and a CLC to your card. Easy to connect, pretty affordable, and should solve that thermal issue.


----------



## Blze001

Quote:


> Originally Posted by *blaze2210*
> 
> There's always the option of adding a Kraken G12 and a CLC to your card. Easy to connect, pretty affordable, and should solve that thermal issue.


Interesting solution... probably slightly less insane than trying to stick 2 slim 240mm EKWB radiators and a full loop into a Hadron Air, huh?

Does it increase the width of the card, or will it stay about stock width?


----------



## blaze2210

Quote:


> Originally Posted by *Blze001*
> 
> Interesting solution... probably slightly less insane than trying to stick 2 slim 240mm EKWB radiators and a full loop into a Hadron Air, huh?
> 
> Does it increase the width of the card, will that stay about stock width?


It stays pretty close to the stock width, and costs less than you'd pay for just the block for the card (EK is charging $120+ for theirs).... The tubes may add a little extra to the width, but that mainly depends on what cooler is used. In my pic below, I'm using the LiquidFreezer 120 (got it cheap on eBay).


----------



## Gurkburk

Quote:


> Originally Posted by *Blze001*
> 
> Interesting solution... probably slightly less insane than trying to stick 2 slim 240mm EKWB radiators and a full loop into a Hadron Air, huh?
> 
> Does it increase the width of the card, will that stay about stock width?


I replaced the stock cooler with an Arctic Accelero Xtreme. With VERY heavy games I reach ~60°C, idling around ~30°C. The stock card would run around ~85°C in the same heavy games and also volt/clock throttle due to GPU Boost 3.0.

https://www.arctic.ac/eu_en/accelero-xtreme-iv.html


----------



## Blze001

Quote:


> Originally Posted by *Gurkburk*
> 
> I replaced stock cooler with an Arctic Xtreme. With VERY heavy games, i reach 60*C~. Idling around 30~*C. Normal card would run around 85*C~ with the same heavy game and also volt/clock throttle due to nvidia 3.0.
> 
> https://www.arctic.ac/eu_en/accelero-xtreme-iv.html


I briefly considered that path, but there's no way that'd even come close to fitting in my current case with that massive backplate.

Looks like if I found an AIO with some really flexible tubing or, even better, a 90 degree turn out of the waterblock, it'd be pretty much stock height. Thanks for the info.


----------



## blaze2210

Quote:


> Originally Posted by *Gurkburk*
> 
> I replaced stock cooler with an Arctic Xtreme. With VERY heavy games, i reach 60*C~. Idling around 30~*C. Normal card would run around 85*C~ with the same heavy game and also volt/clock throttle due to nvidia 3.0.
> 
> https://www.arctic.ac/eu_en/accelero-xtreme-iv.html


If the cooler alone is going to eat up 2+ slots _without_ the card, it better work well. That cooler is huge!

Quote:


> Originally Posted by *Blze001*
> 
> Looks like if I found an AIO with some really flexible tubing or, even better, a 90 degree turn out of the waterblock it'd be pretty much stock height. Thanks for the info.


Sounds like a good plan, the 90* ends definitely make the install a bit easier.


----------



## SavantStrike

Quote:


> Originally Posted by *blaze2210*
> 
> It stays pretty close to the stock width, and costs less than you'd pay for just the block for the card (EK is charging $120+ for theirs).... The tubes may add a little extra to the width, but that mainly depends on what cooler is used. In my pic below, I'm using the LiquidFreezer 120 (got it cheap on eBay).


You might want to change the orientation of that AIO on the bottom. You're at risk for a slug of air killing the pump.


----------



## blaze2210

Quote:


> Originally Posted by *SavantStrike*
> 
> You might want to change the orientation of that AIO on the bottom. You're at risk for a slug of air killing the pump.


When I get a different case....In the current situation, I'd have to use copious amounts of zip-ties to orient it any other way.... Plus, for a ~$20 cooler, I'm not overly concerned about it at the moment....If it dies and my PC is idle (when I'm asleep or not home), the liquid inside should keep temps from skyrocketing - never heard of a video card dying from high temps at 974mhz and 0.65v. If it happens while I'm gaming, I'll see my temps climbing.

I appreciate the concern though.


----------



## SavantStrike

Quote:


> Originally Posted by *blaze2210*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SavantStrike*
> 
> You might want to change the orientation of that AIO on the bottom. You're at risk for a slug of air killing the pump.
> 
> 
> 
> When I get a different case....In the current situation, I'd have to use copious amounts of zip-ties to orient it any other way.... Plus, for a ~$20 cooler, I'm not overly concerned about it at the moment....If it dies and my PC is idle (when I'm asleep or not home), the liquid inside should keep temps from skyrocketing - never heard of a video card dying from high temps at 974mhz and 0.65v. If it happens while I'm gaming, I'll see my temps climbing.
> 
> I appreciate the concern though.

Oh, for 20 bucks I wouldn't care either.

It's been my experience that Pascal will just get hot when a pump fails (or when you forget to plug it in, doh). The worst you'll see is you come home to a card that's running at 80C, it'll just keep throttling the core to maintain the temp target.

Sent from my ZTE A2017U using Tapatalk


----------



## blaze2210

Quote:


> Originally Posted by *SavantStrike*
> 
> Oh, for 20 bucks I wouldn't care either.
> 
> It's been my experience that Pascal will just get hot when a pump fails (or when you forget to plug it in, doh). The worst you'll see is you come home to a card that's running at 80C, it'll just keep throttling the core to maintain the temp target.
> 
> Sent from my ZTE A2017U using Tapatalk


Good deal on eBay! I'll admit though, I was extremely skeptical about the price and was half-expecting it to be on its last legs. Yep, ran into that one the first time I installed it.... Hehehe....I plugged in the fan for the G12, and the fans for the cooler, but forgot about the pump. Oops.... On the desktop, temps were in the mid-high 60's before I shut it down. Plus, I think my PC _should_ hit its low-power stages before too much can happen to it.


----------



## SavantStrike

Quote:


> Originally Posted by *blaze2210*
> 
> Good deal on eBay! I'll admit though, I was extremely skeptical about the price and was half-expecting it to be on its last legs. Yep, ran into that one the first time I installed it.... Hehehe....I plugged in the fan for the G12, and the fans for the cooler, but forgot about the pump. Oops.... On the desktop, temps were in the mid-high 60's before I shut it down. Plus, I think my PC _should_ hit its low--power stages before too much can happen to it.


It will hit low power states in the most uneventful way possible. I was mining on a 1080 Ti that lost a pump, and the thing just kept plugging away, maintaining the 65C temp limit I had set. It was running really slowly, but it was completely unfazed.

I might not have been as happy had I let my cards go to 85-90C like some miners do, but I'm an enthusiast, not a monster. As it was I think this happened for a couple hours and might have dragged on for days if I hadn't noticed my hash rate was too low.


----------



## gtbtk

Quote:


> Originally Posted by *khanmein*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ravihpa*
> 
> Lol. Am just interested in the end result. Is it a good OC video? If not, can you recommend me something else?
> 
> 
> 
> Any video, as long as it's not from the moron JayzTwoCents! The fella likes to proclaim himself an enthusiast & said not to install MSI AB through Guru3D. He doesn't even know that MSI AB is @Unwinder's work. We have the Overclock forum.

Since the Pascal launch, EVGA has been supplying him graphics cards and sponsoring his channel. I am pretty sure that the Precision XOC usage is just part of the EVGA sponsorship marketing deal he made. His Maxwell card videos used to feature Afterburner.

I am not crazy about Precision XOC either; it does, however, have some handy features for EVGA cards that are not available elsewhere. The auto-overclock feature on Pascal, while not great for creating a stable overclock, is handy for showing you where the holes/weaknesses are along the voltage curve. You can even use it with other brands' cards if you cross-flash an EVGA BIOS.

That video is an adequate overclocking guide. It at least mentions the voltage/frequency curve where most others don't, so it's an OK place to start. I do agree, though, that he likes to tell everyone what an expert he is on Pascal GPUs, yet there are loads of details we have discovered here that he has no idea about.

Just remember that you cannot kill your graphics card by playing with the overclocking. If you go too far, it will flash artifacts everywhere on screen or crash your PC; after a reboot the card resets to the default settings. Just don't set the overclocking utility to auto-load a profile when Windows starts until you are sure your OC is stable.

If you have a reference-based card (base clock 1506MHz), a good starting point is +150 on the core slider and +400 on the memory slider. If you have a factory-overclocked card, deduct however many MHz you have over 1506MHz from the amount you add to the core slider.
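The slider arithmetic in that last paragraph can be sketched as a tiny helper. This is just an illustration of the rule of thumb above; the function name and the example factory clock are made up, and the 1506MHz reference base clock is the one quoted in the post.

```python
# Sketch of the starting-point arithmetic: a factory-overclocked card already
# carries part of the suggested +150 MHz, so subtract its head start over the
# 1506 MHz reference base clock from the core-slider offset.

REFERENCE_BASE = 1506  # MHz, GTX 1070 reference base clock


def starting_core_offset(factory_base_clock_mhz, reference_offset=150):
    """Suggested starting core-slider offset (MHz) for a given base clock."""
    head_start = max(0, factory_base_clock_mhz - REFERENCE_BASE)
    return reference_offset - head_start


print(starting_core_offset(1506))  # reference card: start at +150
print(starting_core_offset(1607))  # card shipped 101 MHz over reference: +49
```

From there you'd raise the slider in small steps until artifacts or a crash, then back off.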


----------



## gtbtk

Quote:


> Originally Posted by *nolive721*
> 
> how you guys in the US see the market moving with the soon to come 1070TI release? is there people here ready to trade in their beloved 1070 for a TI variant?
> 
> are there any idea about retail pricing yet?


I can't really see much benefit in trading up from a 1070 to a 1070 Ti. I can run at almost reference 1080 speeds now.

Assuming they don't change the voltage controller from the standard one, I am going to try cross-flashing my 1070 with a 1070 Ti BIOS to see if I can unlock some extra CUDA cores, though.


----------



## gtbtk

Quote:


> Originally Posted by *Blze001*
> 
> My GTX-1070 FE is thermal throttling like crazy lately. Any thoughts on possible solutions? I'm down to either changing cases to improve the air intake for it, or trying to shoehorn a watercooling setup into my current case.


If you take the side panel off the PC, does it improve the temps? If not, changing the case won't help.

Have you ever tried cleaning the accumulated dust out of the cooler? The way you phrase the question, it sounds like it never used to be like this, so dust is the most likely reason for the temperature increase.

You can try some compressed air. Taking the shroud off may make access easier.

You can also remove the cooler completely, clean it out, and re-paste the GPU when you put it back together; you should see an improvement.


----------



## Blze001

Quote:


> Originally Posted by *gtbtk*
> 
> If you take the side panel off the PC, does it improve the temps? If not, changing case wont help
> 
> Have you ever tried cleaning the accumulated dust out of the cooler? The way you state the question sounds like it never used to be like that so dust is most likely the reason for the temp increase
> 
> You can try some compressed air. Taking the shroud off make make access easier.
> 
> You can also remove the cooler completely, clean it out completely and then re-paste the GPU when you put it back together again and you should see an improvement.


I'll give the compressed air dust-cleaning a shot. My solution is going to be rather dramatic: I'm sticking everything in a SG13b while I prep my Hadron Air case for a full water loop.


----------



## b0uncyfr0

Quote:


> Originally Posted by *gtbtk*
> 
> I found my best performance sweet spot to be around 2088/2076MHz with memory at 9200MHz, from memory. I could run the card over 2100MHz but the scores didn't really improve.


You are spot on. My card is doing the same. I finally settled on the core: 2100 at 1.081v is just not worth it. I can achieve 2075 at 1.050v with my temps (I have an FTO2 so airflow is superb), and if I can crank it up a bit further I will probably achieve 2088 at 1.050v. Worst case scenario, I'll settle at 1.067v so it never drops the core down.

The real surprise has been the memory: increasing the VTT voltage in the BIOS allowed me to go way past 8400 on the Samsung RAM. I'm now at 9000. Steadily increasing it whilst testing in Firestrike to make sure performance doesn't drop off somewhere. What's the most efficient way to test memory/artifacts though... long instances of GPU-Z?

Please flash the 1070 Ti BIOS - I will be eagerly waiting to see what magic unfolds.
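On the memory-testing question: since GDDR5 error correction retries failed transfers rather than throwing visible artifacts, the usual approach is to step the offset and watch for the benchmark score to stop scaling, then back off to the last offset that still gained. A minimal sketch of that sweep, where `benchmark_score` is a hypothetical hook you'd replace with a real Firestrike/Heaven run at each offset:

```python
# Step through candidate memory offsets and keep the one with the best score.
# Error correction can mask instability as a performance drop, so a falling
# score past some offset marks the point to back off to, even if the card
# still "runs" fine at higher offsets.

def find_memory_sweet_spot(benchmark_score, offsets):
    """Return (best_offset, best_score) over the candidate offsets."""
    best_offset, best_score = None, float("-inf")
    for offset in offsets:
        score = benchmark_score(offset)
        if score > best_score:
            best_offset, best_score = offset, score
    return best_offset, best_score


# Fake scores shaped like the behaviour described above: gains up to +400,
# then losses past +500 despite no crashes or artifacts.
fake_scores = {300: 100.0, 400: 102.5, 500: 101.2, 600: 99.0}
best, _ = find_memory_sweet_spot(fake_scores.get, [300, 400, 500, 600])
print(best)  # 400
```

The point is to trust the score curve, not just "does it crash": a stable-looking run at a higher offset can still be slower.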


----------



## gtbtk

Quote:


> Originally Posted by *Blze001*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> If you take the side panel off the PC, does it improve the temps? If not, changing case wont help
> 
> Have you ever tried cleaning the accumulated dust out of the cooler? The way you state the question sounds like it never used to be like that so dust is most likely the reason for the temp increase
> 
> You can try some compressed air. Taking the shroud off make make access easier.
> 
> You can also remove the cooler completely, clean it out completely and then re-paste the GPU when you put it back together again and you should see an improvement.
> 
> 
> 
> I'll give the compressed air dust-cleaning a shot. My solution is going to be rather dramatic: I'm sticking everything in a SG13b while I prep my Hadron Air case for a full water loop.

That will certainly make things run cooler. If that pushes your buttons, go right ahead, but I'm pretty confident that the compressed air will tide you over until the loop is ready.


----------



## gtbtk

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I found my best performance sweet spot to be around 2088/2076MHz with memory at 9200MHz, from memory. I could run the card over 2100MHz but the scores didn't really improve.
> 
> 
> 
> You are spot on. My card is doing the same. I finally settled on the core - 2100 at 1.081 is just not worth it. I can achieve 2075 at 1.050 with my temps (i have an FTO2 so airflow is superb). And if i can crank it up abit further i will prob achieve 2088 @ 1050. Worst case scenario - will settle at 1.067 so it never drops the core down.
> 
> The real suprise has been the memory - increasing the VTT voltage in the bios allowed me to go way past 8400 on the samsung ram. Im now at 9000. Steadily increasing whilst testing in firestrike to make sure performance doesnt drop off somewhere. Whats the most efficient way to test memory/artifacts though...long instances of GPU-Z?
> 
> Please flash the 1070ti bios - i will be eagerly waiting to see what magic unfolds..

I found VCCIO and CPU PLL helped me improve GPU stability on my rig. I have come to the conclusion that the "silicon lottery" is way overstated and gets blamed for many issues that can be resolved with a bit of BIOS tweaking. "The experts" don't believe me though.

I won't do it if the BIOS is locked for overclocking. I don't think that Nvidia would do that, though; I suspect the overclocking rumor is actually a restriction on the AIB partners to sell the cards all at reference settings.

I can't see any reason why it should not at least run on the card if the voltage controller stays the same. The memory controller and base chip should be the same. I'm not sure if the cores are disabled because of something physical or only via a BIOS setting. This is the first time that the base card and the Ti have the same memory and base chip, so it will be interesting to see what happens.


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> I found VCCIO and CPU PLL helped me improve GPU stability on my rig. I have come to the conclusion that the "silicon lottery" is way overstated and blamed for many issues that can be resolved with a bit of bios tweaking. "The experts" don't believe me though.
> 
> I wont do it if the bios is locked for overclocking. I don't think that Nvidia would do that though, I suspect that the rumor about overclocking is a restriction on the AIB partners to sell the cards all at reference settings.
> 
> I cant see any reason why it should not at least run on the card if the voltage controller stays the same. The memory controller and base chip should be the same. I'm not sure if the cores are disabled because of something physical or only due to a bios setting. This is the first time that a base card and the TI has the same memory and base chip so it will be interesting to see what happens.


lol.
"Silicon lottery" is what the weak blame for things they don't understand.
Nowadays, enthusiasts are everywhere, playing with RGB and tempered glass.


----------



## Falkentyne

Quote:


> Originally Posted by *gtbtk*
> 
> I found VCCIO and CPU PLL helped me improve GPU stability on my rig. I have come to the conclusion that the "silicon lottery" is way overstated and blamed for many issues that can be resolved with a bit of bios tweaking. "The experts" don't believe me though.
> 
> I wont do it if the bios is locked for overclocking. I don't think that Nvidia would do that though, I suspect that the rumor about overclocking is a restriction on the AIB partners to sell the cards all at reference settings.
> 
> I cant see any reason why it should not at least run on the card if the voltage controller stays the same. The memory controller and base chip should be the same. I'm not sure if the cores are disabled because of something physical or only due to a bios setting. This is the first time that a base card and the TI has the same memory and base chip so it will be interesting to see what happens.


Hi,
Can you explain how CPU PLL improves GPU overclocking? My MSI laptop (mainboard MS 17-A1) has access to CPU PLL (I forget, but I think it's in mV), but NO memory voltage or I/O voltage settings. There are AC/DC loadline settings for the IA domain (CPU core), System Agent (what?), GT sliced, and GT unsliced (worthless). I have no idea what changing the AC/DC loadline does for the System Agent, especially since SA voltage (VCCSA?) is not accessible.

Thank you.


----------



## Nawafwabs

Physx settings: GPU or CPU?


----------



## blaze2210

Quote:


> Originally Posted by *Nawafwabs*
> 
> Physx settings: GPU or CPU?


GPU


----------



## zipper17

I doubt "silicon lottery" is a myth or just a weak excuse; it really does have meaning. It's a fact arising from the complexity of the chips and of the overall system itself.

Even pro or competitive overclockers, with the exact same components/setup, never achieve identical results, even after maxing out every component that could limit them. Some results are always better or worse than others.

Better chips crash, BSOD, or throw errors less often; some chips won't even run. Some chips may require higher or lower voltage than others to run at a certain frequency, etc. Those are common examples of what happens in overclocking; the different behavior from one chip to another is the indication.

Maybe it depends on who uses the term, and the context of the discussion.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I found VCCIO and CPU PLL helped me improve GPU stability on my rig. I have come to the conclusion that the "silicon lottery" is way overstated and blamed for many issues that can be resolved with a bit of bios tweaking. "The experts" don't believe me though.
> 
> I wont do it if the bios is locked for overclocking. I don't think that Nvidia would do that though, I suspect that the rumor about overclocking is a restriction on the AIB partners to sell the cards all at reference settings.
> 
> I cant see any reason why it should not at least run on the card if the voltage controller stays the same. The memory controller and base chip should be the same. I'm not sure if the cores are disabled because of something physical or only due to a bios setting. This is the first time that a base card and the TI has the same memory and base chip so it will be interesting to see what happens.
> 
> 
> 
> lol.
> "silicon lottery" is for the weak to explain what they dont know about.
> nowadays, enthusiasts are everywhere, playing rgb and tempered glass.

You are exactly right. Don't know how to do something, or something doesn't work the way it does for others? Don't investigate, just blame the silicon lottery. That was the cause of the Micron memory issue, wasn't it? hahaha

Yes, RGB and tempered glass are both essential for performance. My favorite RGB setting is "off".

When I was a kid in the early 70s, our car had a tempered glass windscreen. A car in front threw up a bolt lying on the road; it smashed the glass and showered me in small shards of windscreen, cutting me all over. I've never really been a fan of tempered glass since. At least a few cases are now starting to put the glass on hinges or support it at the bottom. I cannot believe there are so many cases that use the stupid four-thumb-screw, let's-drop-the-glass-on-the-floor attachment method.


----------



## DeathAngel74

What I've also noticed is people are playing synthetics more than real games these days. $3000-3500 to run benchmarks? I dun get it...


----------



## khanmein

Quote:


> Originally Posted by *DeathAngel74*
> 
> What I've also noticed is ppl mare playing synthetics more than real games these days. $3000-3500 to run benchmarks? I dun get it...


Yeah, I also don't get it. I don't give a f about the benchmark score; what I'm more concerned about is a smooth gaming experience, lower frame times & less input lag.


----------



## DeathAngel74

I mean, don't get me wrong, I like to test out my hardware once I build a new PC too. I can only stand Realbench and the Firestrike demo for so long though, lol.


----------



## gtbtk

Quote:


> Originally Posted by *zipper17*
> 
> I doubt "Silicon lottery" it is a myth or weak opinion, it really does have mean, it is a fact happens from the complexity of the chips and overall system itself.
> 
> In Pro-overclockers or competitive overclocker, with same exact components/setup they still never achieved the same result even they already maxed out every components that would limit the result. Some result always have less or better than the others.
> 
> Better chips has less crash or bsod or errorness, some chip wouldn't even run. Also Some chip may requires higher voltage or less voltage than the others to run at certain frequency, etc. That's the example common things what happens in overclocking, the different behaviors one to another chips, that's also the indication.
> 
> Maybe depends on who or the contexts of discussion when used the term of silicon lottery.


There are certainly minor variations in quality so some chips do perform or overclock better than others if everything else is equal.

The thing is, particularly in forums like this, the different environments are almost always not equal, yet the lottery gets the blame because something doesn't work as expected. It is used as an excuse to not look, not investigate, not understand and remain technically mediocre/incompetent yet sound knowledgable by using a buzzword in the conversation to stifle discussion.

Different motherboards, even down to manufacturing revision differences of the same model name, different bios and bios settings, different voltage settings, different quality power supplies, different brands and speed Ram, different combination of USB devices connected to different ports all contribute to a CPU or GPU performing at a given level of performance. Having Auto settings makes the variations even worse. It is not only down to a bad piece of silicon in the thing you are trying to make go faster. You need to eliminate everything else first before you can blame the Silicon lottery.

The pro Overclockers nail down their environment so that the other variations are eliminated and only things that do change are directly related to the component they are trying to overclock.

It took me, with the help of a few others here including asdkj1740, almost 3 months of arguing with the "silicon lottery"/"Micron is intrinsically broken" crowd, rather than discussing it rationally, before I managed to get Nvidia to fix the bug that was causing the Micron memory issue in the 1070 BIOS. The same attitude exists around AMD Ryzen.

Quote:


> Originally Posted by *DeathAngel74*
> 
> What I've also noticed is ppl mare playing synthetics more than real games these days. $3000-3500 to run benchmarks? I dun get it...


Well, this is an overclocking web site; I am probably guilty as charged. I want to know how my GPU works. Besides, that is how overclocking is justified, isn't it?


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> There are certainly minor variations in quality so some chips do perform or overclock better than others if everything else is equal.
> 
> The thing is, particularly in forums like this, the different environments are almost always not equal, yet the lottery gets the blame because something doesn't work as expected. It is used as an excuse to not look, not investigate, not understand and remain technically mediocre/incompetent yet sound knowledgable by using a buzzword in the conversation to stifle discussion.
> 
> Different motherboards, even down to manufacturing revision differences of the same model name, different bios and bios settings, different voltage settings, different quality power supplies, different brands and speed Ram, different combination of USB devices connected to different ports all contribute to a CPU or GPU performing at a given level of performance. Having Auto settings makes the variations even worse. It is not only down to a bad piece of silicon in the thing you are trying to make go faster. You need to eliminate everything else first before you can blame the Silicon lottery.
> 
> The pro Overclockers nail down their environment so that the other variations are eliminated and only things that do change are directly related to the component they are trying to overclock.
> 
> It took me with the help of a few others here including asdkj1740 almost 3 months of arguing with the "silicon Lottery"/"Micron is intrinsically broken" crowd rather than discussing it rationally before I managed to get Nvidia to fix the bug that was causing the Micron memory bug in the 1070 bios. The same attitude exists with the AMD Ryzen
> Well this is an overclock web site. I am probably guilty as charged. i want to know how my GPU works. Besides, that is how overclocking is justified isnt it?


The latest "silicon lottery" talk is moving to Coffee Lake Z370 now; Ryzen is outdated.
Just a few techtubers, like Tom from OC3D, have found out that load-line calibration is completely broken on some boards, causing bad overclocking.


----------



## SavantStrike

Quote:


> Originally Posted by *asdkj1740*
> 
> the latest "silicon Lottery" is going to coffelake z370 now, ryzen is outdated.
> just few techtubers, like tom from oc3d, have found out the load line calibration is completely failed causing bad overclocking.


The mainstream Intel platforms always attract the novices. Fast and easy (not knocking the platforms). Novices often try to blame their equipment instead of their testing methodology. It's only when someone like Tom posts a video that novices will reevaluate their methods.

There was a time when novices had enough patience to read to learn, but a lot of people will only watch a video today. It gets frustrating when someone gets on a forum, 10 people tell them where their problem is, and they won't listen until another poster links them to a video that's often less detailed than the forum posts.
Quote:


> Originally Posted by *gtbtk*
> 
> There are certainly minor variations in quality so some chips do perform or overclock better than others if everything else is equal.
> 
> The thing is, particularly in forums like this, the different environments are almost always not equal, yet the lottery gets the blame because something doesn't work as expected. It is used as an excuse to not look, not investigate, not understand and remain technically mediocre/incompetent yet sound knowledgable by using a buzzword in the conversation to stifle discussion.
> 
> Different motherboards, even down to manufacturing revision differences of the same model name, different bios and bios settings, different voltage settings, different quality power supplies, different brands and speed Ram, different combination of USB devices connected to different ports all contribute to a CPU or GPU performing at a given level of performance. Having Auto settings makes the variations even worse. It is not only down to a bad piece of silicon in the thing you are trying to make go faster. You need to eliminate everything else first before you can blame the Silicon lottery.
> 
> The pro Overclockers nail down their environment so that the other variations are eliminated and only things that do change are directly related to the component they are trying to overclock.
> 
> It took me with the help of a few others here including asdkj1740 almost 3 months of arguing with the "silicon Lottery"/"Micron is intrinsically broken" crowd rather than discussing it rationally before I managed to get Nvidia to fix the bug that was causing the Micron memory bug in the 1070 bios. The same attitude exists with the AMD Ryzen
> Well, this is an overclocking website. I am probably guilty as charged. I want to know how my GPU works. Besides, that is how overclocking is justified, isn't it?


Most people don't follow the scientific method. The pros do, and they reap the reward for proper testing methodology.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> There are certainly minor variations in quality so some chips do perform or overclock better than others if everything else is equal.
> 
> The thing is, particularly in forums like this, the different environments are almost always not equal, yet the lottery gets the blame because something doesn't work as expected. It is used as an excuse to not look, not investigate, not understand, and remain technically mediocre/incompetent, yet sound knowledgeable by using a buzzword in the conversation to stifle discussion.
> 
> Different motherboards, even down to manufacturing-revision differences of the same model name, different BIOSes and BIOS settings, different voltage settings, different quality power supplies, different brands and speeds of RAM, and different combinations of USB devices connected to different ports all contribute to a CPU or GPU performing at a given level. Having Auto settings makes the variations even worse. It is not only down to a bad piece of silicon in the thing you are trying to make go faster. You need to eliminate everything else first before you can blame the silicon lottery.
> 
> The pro Overclockers nail down their environment so that the other variations are eliminated and only things that do change are directly related to the component they are trying to overclock.
> 
> It took me with the help of a few others here including asdkj1740 almost 3 months of arguing with the "silicon Lottery"/"Micron is intrinsically broken" crowd rather than discussing it rationally before I managed to get Nvidia to fix the bug that was causing the Micron memory bug in the 1070 bios. The same attitude exists with the AMD Ryzen
> Well, this is an overclocking website. I am probably guilty as charged. I want to know how my GPU works. Besides, that is how overclocking is justified, isn't it?
> 
> 
> 
> the latest "silicon lottery" talk is moving to Coffee Lake Z370 now; Ryzen is old news.
> Only a few techtubers, like Tom from OC3D, have found that the load-line calibration is completely broken, causing bad overclocking.

The silicon lottery applies to every CPU. To me the 8700K is a lot more compelling than Skylake or Kaby Lake. Ryzen looked promising, but single-core performance that only matched my Sandy Bridge, together with the memory/PCIe performance quirks, says wait for version 2.

We hear about the same bugs with every new chipset release. OC3D complained about MSI overvolting the CPU on Z270 as well. A BIOS update will be released and the problem will go away.


----------



## gtbtk

Quote:


> Most people don't follow the scientific method. The pros do, and they reap the reward for proper testing methodology.


Yes I know. I am not referring to the pros.

I am referring to the great unwashed who blame the silicon lottery for just about everything and use it as a cover-all for ignorance.


----------



## gtbtk

Doesn't look like the 1070 Ti is going to be a compelling upgrade from the 1070.

Videocardz leaked Fire Strike Extreme 1070 Ti benchmarks: https://videocardz.com/73395/nvidia-geforce-gtx-1070-ti-3dmark-performance

The 1070 Ti is benching graphics scores of about 9500.

My admittedly overclocked 1070 Gaming X, running on a Z68 board with an i7-2600 CPU (PCIe 2.0), gets a graphics score of 9877:

https://www.3dmark.com/fs/11795224

My Time Spy results are within 200 points of the Ti score, and I am running on a Sandy Bridge CPU, not a 7700K, which would have gotten me an extra 1000 points.

Overclocking a Ti should certainly improve the scores, but I don't think the benefit would match the cost of upgrading.


----------



## Falkentyne

gtbtk did you see my message (question)?


----------



## gtbtk

Quote:


> Originally Posted by *Falkentyne*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I found VCCIO and CPU PLL helped me improve GPU stability on my rig. I have come to the conclusion that the "silicon lottery" is way overstated and blamed for many issues that can be resolved with a bit of bios tweaking. "The experts" don't believe me though.
> 
> I wont do it if the bios is locked for overclocking. I don't think that Nvidia would do that though, I suspect that the rumor about overclocking is a restriction on the AIB partners to sell the cards all at reference settings.
> 
> I cant see any reason why it should not at least run on the card if the voltage controller stays the same. The memory controller and base chip should be the same. I'm not sure if the cores are disabled because of something physical or only due to a bios setting. This is the first time that a base card and the TI has the same memory and base chip so it will be interesting to see what happens.
> 
> 
> 
> Hi,
> Can you explain how CPU PLL improves GPU overclocking? My MSI laptop (mainboard MS 17-A1) has access to CPU PLL (I forget, but I think it's in mV), but NO memory voltage or I/O voltage settings. There are AC/DC loadline settings for the IA domain (CPU core), system agent (what?), GT sliced, and GT unsliced (worthless). I have no idea what changing the AC/DC loadline does for the system agent, especially since SA voltage (VCCSA?) is not accessible.
> 
> Thank you.

CPU PLL is a voltage used by the clock-multiplier circuit. It is very relevant if you are using BCLK overclocking, which in my case I was forced to do with a non-K CPU. I had experimented with adjustments and seen that benchmark scores improved and then dropped back again as the voltage increased. This was very early on with my card, so I had the PLL stuck at about 1.85V and had left it there.

VCCIO is the voltage used to manage the connections to the package's memory and PCIe controllers. I did not have access to the SA voltage in the BIOS I was using. Increasing VCCIO about 5 ticks in the BIOS stabilized the Micron VRAM; I had found slight improvements even before getting the vBIOS update that fixed the Micron bug. After the update I readjusted VCCIO and improved stability with a RAM OC up to +650.

My card always had a somewhat disappointing OC limit of about +75 from stock. By playing with the curve, I discovered that at the top and bottom ends of the voltage curve the card was happy boosting to +150 or more, yet at around the 1.0V range it would crash if you tried to lift the curve past that +75 limit.

In my adventures with crossflashing the card, I installed an EVGA BIOS onto my card and was playing with the Precision XOC auto-overclock tool. I discovered that the curve it built pretty closely matched the shape of the curve I had found manually, with the dip in the middle. It occurred to me that this tool could give me rapid feedback on what BIOS changes were doing for me.

With that idea in mind, I started increasing the PLL voltage from the default 1.8V one tick at a time and reran the EVGA tool. The curve it created did not dip to the same extent at 1.0V. I increased the PLL voltage another tick and tested again. The curve kept improving up to 1.8138V, which I think is about 5 or 6 ticks above stock, at which point the curve generated by the utility came close to matching the shape of the stock curve but at +150. At higher voltages than that, performance started to degrade again.

As far as I can tell, changing the PLL voltage has a tuning effect on the signalling frequency and how it interacts with things like the ring bus frequency sort of like getting the radio tuned exactly on the station.
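To make the tuning loop above concrete, here is a minimal Python sketch of the same idea: bump the PLL one tick, rerun the auto-OC scan, and keep the voltage whose generated curve scored best. The function name and the score numbers are made up for illustration; the real "score" was the shape of the curve Precision XOC produced.

```python
# Sketch of the one-tick-at-a-time PLL voltage sweep described above.
# The (voltage, score) pairs stand in for the manual BIOS change plus
# an auto-OC scan after each tick; the numbers here are illustrative.

def pick_best_voltage(results):
    """Given (voltage, curve_score) pairs from successive sweeps, return
    the voltage whose generated curve scored highest. Stop once the score
    starts degrading, as happened above 1.8138V in the post above."""
    best_v, best_score = results[0]
    for v, score in results[1:]:
        if score < best_score:
            break  # performance started to degrade; the earlier peak wins
        best_v, best_score = v, score
    return best_v

# Example: scores measured after each one-tick bump from the 1.8V default
sweep = [(1.800, 70), (1.805, 80), (1.810, 92), (1.8138, 100), (1.820, 85)]
print(pick_best_voltage(sweep))  # -> 1.8138
```

The point of the sweep is simply that each tick is a cheap experiment with a fast feedback signal, so you stop as soon as the trend reverses instead of guessing a final value.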


----------



## Falkentyne

Thank you!


----------



## SavantStrike

Quote:


> Originally Posted by *gtbtk*
> 
> Yes I know. I am not referring to the pros.
> 
> I am referring to the great unwashed who blame the silicon lottery for just about everything and use it as a cover all for ignorance


Yeah, well in that case, forums are like a huge support group for novices

Poster A: I got a bad chip, it won't do 5.2ghz like reviewer XYZ had.
Poster B: Yeah I know, it sucks, Intel did a crappy job on this chip. You should RMA the chip, it sucks. You lost the lottery.
Poster C: Delidding will save your life, I delidded and the black helicopters stopped landing on my front lawn.
Poster D: Silicon lottery! Silicon lottery! You need to go through at least 10 chips to get a good one.


----------



## gtbtk

Quote:


> Originally Posted by *SavantStrike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Yes I know. I am not referring to the pros.
> 
> I am referring to the great unwashed who blame the silicon lottery for just about everything and use it as a cover all for ignorance
> 
> 
> 
> Yeah, well in that case, forums are like a huge support group for novices
> 
> Poster A: I got a bad chip, it won't do 5.2ghz like reviewer XYZ had.
> Poster B: Yeah I know, it sucks, Intel did a crappy job on this chip. You should RMA the chip, it sucks. You lost the lottery.
> Poster C: Delidding will save your life, I delidded and the black helicopters stopped landing on my front lawn.
> Poster D: Silicon lottery! Silicon lottery! You need to go through at least 10 chips to get a good one.

It is true that some chips will clock better at lower voltages, and if you are looking to set world LN2 records, binning is certainly a thing. Unfortunately, Intel, Nvidia and AMD also cherry-pick media samples, or flash media-sample cards with overclocked BIOSes, purely to deceive the public.

How many silicon-lottery conversations have you ever heard that discuss all the supporting voltages and other base considerations before jumping to the conclusion that someone lost the silicon lottery? Maybe they have got a "lottery" loser, but there are many things that can stifle performance which you have to rule out before you can say that for sure. If you just assume that it is a loser CPU, you end up with a replacement that can potentially perform just as poorly.

There is nothing wrong with novices. We were all novices at some point in time. Forums can be a great place to get information and help, but you tend to find that in a forum there are maybe half a dozen people who actually help if you are lucky. There is another larger group who try their best to help but don't know the subject well enough and then there is a big group who have learnt the buzzwords and are great at spreading misinformation.

Unfortunately, it is human nature that when you get any group together, certain half-truths get bent and twisted and take on a life of their own. The repetition keeps reinforcing the half-truth and somehow it becomes a "fact".


----------



## khanmein

@SavantStrike You have been brainwashed by those YouTube reviewers. The samples they receive are cherry-picked and totally different from the actual retail units. I know some of them buy with their own cash, especially the small channels.


----------



## Nawafwabs

I have a problem with my Asus 1070 O8G:

one fan sometimes runs at 100%.


----------



## mailto

Hello, I have an Asus GTX 1070 Strix OC with Micron memory. Is there a BIOS update?

Also, my display no longer shows up in the NVIDIA control panel, and the settings are gone! Can someone please help me with the best control-panel settings? I play first-person shooters!


----------



## Nukemaster

Just hit the Asus support site to check for a BIOS update for your card. Please note the screen may blank while flashing if you do not restore the standard VGA driver (use onboard graphics if you can; no chance of that blanking mid-flash).

If you have Nvidia driver issues, it may be worth reinstalling the drivers (uninstall them, then reinstall with the clean-install option).


----------



## amalik

Quote:


> Originally Posted by *DeathAngel74*
> 
> What I've also noticed is ppl are playing synthetics more than real games these days. $3000-3500 to run benchmarks? I dun get it...


I feel like that's a summary of all the popular hardware review and benchmark channels on youtube.

I mean hell, how many benchmarks can you do against a set of 20 games before you ******* enjoy one of them?


----------



## kcskcw

uhh, benchmarking until games are stable? no shiet sherlock.

Anyway guys, I have the Gigabyte 1070, with BIOS version 86.04.50.00.7A. Is this the latest version?

Mine luckily hits 2.1GHz, but how much are you supposed to overclock the memory? Mine hits its limit @ 4233MHz; any tips to push further?


----------



## kcskcw

There's a YouTube video comparing the i7-2600K vs Coffee Lake showing that there's NO noticeable improvement in gaming framerates, only in workstation/rendering/compressing workloads.

I'll link it if I come across the video later on.


----------



## division2

That was Hardware Canucks' video, and that 2600K was overclocked to 4.8GHz.


----------



## kcskcw

The Coffee Lake was also overclocked, and the video in question was trying to prove beyond a reasonable doubt that there was no difference in framerates (with both the DDR3 and the DDR4 overclocked as well).

Well, to play devil's advocate again, unlocked CPUs were mostly for savvy prosumers who knew what they were getting, and Sandy Bridge was known for its overclocking flexibility, hitting ~5GHz easily on either air or water cooling.

But if you want low power usage, along with native NVMe and a host of other features, I don't blame you.

If I had a 6-7 year old system, I'd just add NVMe and USB 3 expansion cards to keep it going, but that's just me, haha.

TL;DR: Save your money; don't waste it on things you don't want or need.

And I like the old-vs-new comparisons; they put things into perspective.


----------



## gtbtk

Quote:


> Originally Posted by *Nawafwabs*
> 
> i have problem with asus 1070 o8g
> 
> one fan running at 100% sometimes


Is everything stock? BIOS stock?

You may benefit from reflashing your card:

https://www.asus.com/Graphics-Cards/ROG-STRIX-GTX1070-O8G-GAMING/HelpDesk_BIOS/


----------



## gtbtk

Quote:


> Originally Posted by *mailto*
> 
> Hello, I have an Asus GTX 1070 Strix OC with Micron memory. Is there a BIOS update?
> 
> Also, my display no longer shows up in the NVIDIA control panel, and the settings are gone! Can someone please help me with the best control-panel settings? I play first-person shooters!


If you have BIOS 86.04.50.00.xx you are up to date. This utility will automatically update your card if necessary:

https://www.asus.com/Graphics-Cards/ROG-STRIX-GTX1070-O8G-GAMING/HelpDesk_BIOS/

Go to nvidia.com and download the latest GeForce drivers. You are most likely running the default Windows 10 drivers.


----------



## gtbtk

Quote:


> Originally Posted by *kcskcw*
> 
> There's a youtube video comparing i7-2600k vs coffee lake showing that there's NO noticeable improvement framerates in gaming, only in workstations / rendering / compressing crap.
> 
> I'll link it if I come across the video later on.


There is a difference, but it is not huge in most games. My 1070 will do 21000+ graphics scores in Fire Strike with an i7-2600 non-K @ 4.4GHz. That score is better than most 1070s running on any platform. The areas of difference are the physics score (10000 versus about 15000 for a 7700K), and the Sandy Bridge combined score suffers compared to a Kaby Lake CPU as well.

Coffee Lake and Kaby Lake will beat a 2600K in frame rate in most game benchmarks as well, but for the most part the Sandy can still keep 1080p frame rates above the monitor refresh rate, so the difference is pretty meaningless, particularly if you have a 60Hz monitor.

If you have a 2600K and the ONLY things you do are gaming and web browsing, you do not have a pressing need to upgrade just yet. If you are rendering 3D or video, running AutoCAD, etc., it may be a different story.

The 2500K, without the extra threads, is on the other hand still OK, but it is getting close to the point where its performance is becoming marginal and an upgrade is possibly worth considering.

BTW, Sandy Bridge P67 and Z68 both came with SATA III and USB 3.0. The only thing missing is NVMe, and that is not a critical must-have anyway.


----------



## Nawafwabs

Quote:


> Originally Posted by *gtbtk*
> 
> Is everything stock? bios stock?
> 
> you may benefit from reflashing your card
> 
> https://www.asus.com/Graphics-Cards/ROG-STRIX-GTX1070-O8G-GAMING/HelpDesk_BIOS/


I fixed it.

I found a cable was blocking the fan from spinning.


----------



## 8bitG33k

For some reason my Asus Strix 1070 will no longer idle down. It stays at 911-937MHz on the Windows desktop and clocks up when I game, but won't go below the 911-937MHz range. I had just upgraded to the newest Asus GPU Tweak, and it has stayed at those clocks since. Memory is also running at a constant 4007MHz.

I obviously uninstalled Asus GPU Tweak, but the issue persists. I have also updated to the newest Nvidia drivers since then, and I installed MSI Afterburner to see if that helped (it did not). The problem still persists after uninstalling MSI Afterburner and reinstalling the original (older) version of GPU Tweak (the one I was running before the issue occurred).

I am a bit at my wits' end. I've been researching the issue for the last two days but couldn't really find anything helpful other than the aforementioned steps. Many of the results I found were resolved by uninstalling 'x' program, none of which I have installed to begin with.

Any helpful tips would be appreciated and repped!


----------



## gtbtk

Quote:


> Originally Posted by *8bitG33k*
> 
> For some reason my Asus Strix 1070 will no longer idle. It stays at 911-937 when using just the Windows desktop and clocks up when I game, but wont go below the 911-937 range. I had just upgraded to the newest Asus GPU Tweak and it has stayed at those clocks since. Memory is also running at a constant 4007MHz.
> 
> I obviously uninstalled Asus PGU Tweak, but the issue persists. I have also updated to the newest nvidia drivers since then, and I have also installed MSI Afterburner to see if that helped (it did not). Problem still persists after uninstalling MSI Afterburner, and reinstalling the original (older) version of GPU Tweak (the one I was running prior to the issue occuring).
> 
> I am a bit at my wits end at what to do. I've obviously been researching the issue for the last two days, but couldn't really find anything helpful other than the afore mentioned steps. Many of the results I found were resolved after uninstalling 'x' program - none of which I have installed to begin with.
> 
> Any helpful tips would be appreciated and repped!


Have a look at the Nvidia control panel and see what performance mode you have it set to. It sounds like you may be in high-performance mode. Optimal or adaptive should allow the clocks to drop way down at idle.


----------



## 8bitG33k

Quote:


> Originally Posted by *gtbtk*
> 
> Have a look at the Nvidia control panel and see what the performance mode you have it set to. It sounds like you may be in high performance mode. Optimal or adaptive should allow the clocks to drop way down at idle


Ah yes, thanks for reminding me! I forgot to mention another troubleshooting step I found and attempted: changing the performance mode in the Nvidia control panel. That option has 3 settings, and I tried "Optimal" (default) and "Adaptive". I also typically rebooted my PC after each troubleshooting attempt. Further, I reset the Nvidia control panel to its default settings, with no resolution.


----------



## SavantStrike

Quote:


> Originally Posted by *8bitG33k*
> 
> For some reason my Asus Strix 1070 will no longer idle. It stays at 911-937 when using just the Windows desktop and clocks up when I game, but wont go below the 911-937 range. I had just upgraded to the newest Asus GPU Tweak and it has stayed at those clocks since. Memory is also running at a constant 4007MHz.
> 
> I obviously uninstalled Asus PGU Tweak, but the issue persists. I have also updated to the newest nvidia drivers since then, and I have also installed MSI Afterburner to see if that helped (it did not). Problem still persists after uninstalling MSI Afterburner, and reinstalling the original (older) version of GPU Tweak (the one I was running prior to the issue occuring).
> 
> I am a bit at my wits end at what to do. I've obviously been researching the issue for the last two days, but couldn't really find anything helpful other than the afore mentioned steps. Many of the results I found were resolved after uninstalling 'x' program - none of which I have installed to begin with.
> 
> Any helpful tips would be appreciated and repped!


Have you tried a clean install of the drivers? It sounds like you've got the CUDA P0 state enabled all the time. If you haven't locked this on purpose with something like Profile Inspector, then a clean install might reset it.


----------



## Madmaxneo

Is it normal for my EVGA gtx 1070 to be only running at 99% max or could there be something wrong?


----------



## khanmein

Quote:


> Originally Posted by *Madmaxneo*
> 
> Is it normal for my EVGA gtx 1070 to be only running at 99% max or could there be something wrong?


Normal.


----------



## 8bitG33k

Quote:


> Originally Posted by *SavantStrike*
> 
> Have you tried a clean install of the drivers? It sounds like you're got the CUDA P0 state enabled all the time. If you haven't locked this on purpose with something like profile inspector, then a clean install might reset it.


I'm wondering if that setting could be reset using a tool such as Nvidia Inspector? I am really glad I posted here; it sounds as if this is something that can be tweaked via an ini file, or rather its Nvidia-driver analog. I will give that and a clean driver install a try and report back.

EDIT: Performing a clean driver install did not resolve the issue. I even used a driver-cleaner tool to make sure any settings files were removed.

What's more, nvidia Inspector shows said setting, but does not allow me to change the state from P0 to any of the other settings (P2, P5, P8). Whenever I select and apply any of the other states, it remains at P0.

EDIT 2: Is there a registry setting somewhere that can be changed?

EDIT 3: Attaching screenshot of NV Inspector: note how it shows P0 still being reported as P-State after selecting and applying Performance Level [2] - (P2).



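For anyone chasing the same stuck-in-P0 symptom, a quick way to watch the P-state from a script is nvidia-smi's pstate query. This is only a sketch: it assumes `nvidia-smi` is on your PATH, and the helper names are my own.

```python
# Quick check for the "stuck in P0" symptom discussed above, using
# nvidia-smi's pstate query (assumes nvidia-smi is installed and on PATH).
import subprocess

def parse_pstate(raw):
    """Normalize nvidia-smi's csv,noheader pstate output, e.g. 'P0\n' -> 'P0'."""
    return raw.strip().upper()

def gpu_idles_down():
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=pstate", "--format=csv,noheader"],
        capture_output=True, text=True, check=True).stdout
    # P5/P8 at the desktop is normal; P0 while idle matches the bug above
    return parse_pstate(out) in ("P5", "P8")

print(parse_pstate(" p0 \n"))  # -> P0
```

Logging this once a minute while idle would show whether the card ever leaves P0, which is cleaner evidence than watching a clock readout.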

----------



## Madmaxneo

Quote:


> Originally Posted by *khanmein*
> 
> Normal.


Thanks, that is what I figured, but I remember people complaining about it and doing all these troubleshooting steps... lol


----------



## Skylinestar

My GTX 1070 has Samsung memory. Currently it's running at stock (no OC). Temperature is approx 70°C during the Unigine Heaven 4 benchmark; room ambient temp is 33°C. With such a high temp, should I overclock?


----------



## BTCHSLP

Maybe a crossflash to a GTX 1070 Ti will be possible, hm?

What do you think?


----------



## jon666

I doubt it. Trying to remember if Nvidia ever allowed software to unlock anything, but I still doubt it. Last Nvidia card I owned was a GTX 460.


----------



## gtbtk

Quote:


> Originally Posted by *Skylinestar*
> 
> My GTX1070 has Samsung memory. Currently, it's running at stock (non OC). Temperature is approx 70°c during Unigine Heaven 4 bechmark. Room ambient temp is 33°c. With such high temp, should I overclock?


There is no reason why you can't overclock; you are well under the temperature limit at 70°C. The default limit is 84°C and you can raise that in Afterburner to 93°C. You didn't say what model card you have.

I am assuming the temp details you gave us are with everything, including the fans, at stock?

If that is the case, then the first thing you should do is set a custom fan curve. Most Pascal cards have zero-noise fans that don't start spinning until 60°C. I would suggest having the fans spin at 20% at idle; your 1070's idle temps should then drop to about 40-45°C on a 33°C day, giving you a bit more headroom from the start. You can set the fans to max speed on the curve once the GPU temp reaches the 60s or 70°C if you like.

Once you have the fans under control, a memory OC should be fine in the +500 to +700MHz range.

You can increase the power-target slider, but voltage = heat. You can try adding +100 to the voltage slider, and it won't kill your card, but you will probably be better off leaving it at default and benefiting from lower temps and potentially higher sustained frequency. If your case airflow is restrictive, extra case fans will help manage temps as well.

If your card uses a reference board with the 1506MHz reference clocks, you can probably OC the core in the +150 to +200 range with the core slider.
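The fan-curve advice above boils down to a simple piecewise-linear map from temperature to fan speed. Here is a rough sketch with breakpoints taken from the suggestion (20% at idle, full speed by ~70C); Afterburner or GPU Tweak interpolates between curve points the same way:

```python
# Sketch of the custom fan curve suggested above: 20% at idle,
# ramping to full speed by ~70C. The exact breakpoints are one
# reasonable reading of the advice, not a prescribed curve.
CURVE = [(0, 20), (45, 20), (60, 60), (70, 100)]  # (temp C, fan %)

def fan_percent(temp_c, curve=CURVE):
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # linear interpolation between neighbouring curve points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]  # pinned at max above the last point

print(fan_percent(40))   # -> 20.0
print(fan_percent(65))   # -> 80.0
print(fan_percent(90))   # -> 100
```

The flat 20% segment up to 45C is what kills the zero-RPM idle behavior and drops idle temps, while the steep 60-70C segment keeps load temps away from the throttle limit.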


----------



## gtbtk

Quote:


> Originally Posted by *BTCHSLP*
> 
> Maybe a crossflash to a gtx 1070 ti will be possible, hm ?
> 
> What do you think ?


I was planning to try it out to see what happens, but I am going to have to rebuild my rig first; I currently need to find a replacement motherboard.

I am assuming that the Ti boards have not been totally re-engineered with different voltage controllers etc.; I expect the Ti PCB will be the same as the 1070 PCB. I am also assuming that the rumors that the cards are locked from overclocking are wrong. I have a card that I can already OC to close to reference 1080 performance, so I won't do it if the clocks are locked.

We may need to wait for an updated version of NVFlash to be released, but if my understanding of the NV firmware is correct, it stands a chance of working. Actually unlocking CUDA cores is a different matter. The worst that can happen is that the crossflash bricks the card and it needs to be recovered. The flash just copies a file to a chip that is not unlike a USB thumb drive; it will not physically change the card. The risk is that the hardware cannot read the instructions sitting in that file. If you are planning to try, make sure you have either an iGPU or another GPU you can boot from if you do brick the card, and make backups of the BIOS before you try.
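On the "make backups of the BIOS before you try" point, a sketch of the backup step, assuming the stock NVFlash CLI (`nvflash64` with its `--save` and `--index` switches, run elevated). The wrapper names here are my own:

```python
# Backing up the existing vBIOS before any crossflash, as advised above.
# Assumes the stock nvflash64 CLI and its --save/--index flags; run as admin.
import subprocess

def backup_vbios_cmd(out_path="1070_stock_backup.rom", index=0):
    """Build the nvflash command line; --index picks the card in multi-GPU rigs."""
    return ["nvflash64", "--index=%d" % index, "--save", out_path]

def backup_vbios(out_path="1070_stock_backup.rom"):
    subprocess.run(backup_vbios_cmd(out_path), check=True)

print(backup_vbios_cmd("backup.rom"))
```

Keep the saved .rom somewhere off the boot drive; if the flash goes wrong you will be restoring it blind from a second GPU or the iGPU.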

It will be interesting to see if the extra cores have been restricted by the BIOS or by some physical means. The other risk is that the 1070 GPUs are faulty, and the faults cause instability if the BIOS allows access to the parts of the silicon that were previously locked out. This is the first Ti card I can think of that uses the same base GPU as the non-Ti version; normally it is the next level up. I had been toying with the idea of flashing a 1080 BIOS to my 1070. As far as I can tell, the integrated memory controller on the GPU has been designed to support both GDDR5 and GDDR5X memory. What I was not sure about was whether the memory auto-negotiates and trains its own timings or relies on the BIOS to provide the correct timings. A 1070 Ti with the rumored specs shouldn't have that problem.


----------



## BTCHSLP

Quote:


> Originally Posted by *gtbtk*
> 
> I was planning to try it out to see what happens but I am going to have to rebuild my rig first. I currently need to find a replacement MB.
> 
> I am assuming that the TI boards have not been totally re-engineered with different voltage controllers etc. I am expecting that the TI PCB will be the same as as the 1070 PCB. I am also assuming that the rumors that the cards are locked from overclocking are wrong. I have a card that I can already OC to get close to reference 1080 performance so I wont do it if the clocks are locked on the card.
> 
> We may need to wait for an updated version of NVflash to be released but If my understanding of the NV firmware is correct It stands a chance of working. Actually unlocking Cuda Cores is a different matter. The worst that can happen is that the cross flash bricks the card needs to be recovered. The flash is just copying a file to a chip that is not unlike a USB thumb drive. It will not physically change the card. The risk is that the hardware cannot read the instructions sitting in the file. If you are planning to try, make sure that you have either an iGPU or another GPU you can use to boot from if you do brick the card. And make backups of the bios before you try.
> 
> It will be interesting to see if the extra cores have been restricted by the bios or by some physical means. The other riosk is that the 1070 GPUs are faulty and the faults cause instability if the bios allows access to the physical location in the silicon that was previously locked out. This is the first TI card that I can think of that uses the same base GPU as the non TI version. Normally it is the next level up. I had been toying with the idea of flashing a 1080 bios to my 1070. As far as I can tell, the integrated memory controller on the GPU has been designed to support both GDDR5 and GDDR5x memory. What I was not sure about was if the memory auto negotiated and trained itself for timings or it it was reliant on the bios to providing the correct timings. A 1070TI with the rumored specs shouldn't have that problem.


Interesting!

Can you please post a feedback / test after your GTX 1070 Ti crossflash ?


----------



## gtbtk

Quote:


> Originally Posted by *BTCHSLP*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I was planning to try it out to see what happens but I am going to have to rebuild my rig first. I currently need to find a replacement MB.
> 
> I am assuming that the TI boards have not been totally re-engineered with different voltage controllers etc. I am expecting that the TI PCB will be the same as as the 1070 PCB. I am also assuming that the rumors that the cards are locked from overclocking are wrong. I have a card that I can already OC to get close to reference 1080 performance so I wont do it if the clocks are locked on the card.
> 
> We may need to wait for an updated version of NVflash to be released but If my understanding of the NV firmware is correct It stands a chance of working. Actually unlocking Cuda Cores is a different matter. The worst that can happen is that the cross flash bricks the card needs to be recovered. The flash is just copying a file to a chip that is not unlike a USB thumb drive. It will not physically change the card. The risk is that the hardware cannot read the instructions sitting in the file. If you are planning to try, make sure that you have either an iGPU or another GPU you can use to boot from if you do brick the card. And make backups of the bios before you try.
> 
> It will be interesting to see if the extra cores have been restricted by the bios or by some physical means. The other riosk is that the 1070 GPUs are faulty and the faults cause instability if the bios allows access to the physical location in the silicon that was previously locked out. This is the first TI card that I can think of that uses the same base GPU as the non TI version. Normally it is the next level up. I had been toying with the idea of flashing a 1080 bios to my 1070. As far as I can tell, the integrated memory controller on the GPU has been designed to support both GDDR5 and GDDR5x memory. What I was not sure about was if the memory auto negotiated and trained itself for timings or it it was reliant on the bios to providing the correct timings. A 1070TI with the rumored specs shouldn't have that problem.
> 
> Interesting!
> 
> Can you please post feedback / a test after your GTX 1070 Ti cross-flash?

If I do manage to do it, of course I will share my findings. Who doesn't want a free upgrade?

I am hoping that I can unlock cores and make a Ti out of a 1070, but being realistic, I think the experience will be similar to flashing the Vega 64 bios to a Vega 56: it will still let the card run and change the clocks, but probably won't unlock cores.
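For anyone attempting this, the backup-first workflow described above can be sketched as a dry-run script. The nvflash flag spellings here are from memory and should be checked against `nvflash --help` for your version; the filenames are placeholders:

```shell
# Dry-run sketch of a cross-flash workflow. Set DRY_RUN=0 only once you
# have verified every flag against your nvflash build's help output.
DRY_RUN=1
run() { if [ "$DRY_RUN" = "1" ]; then echo "would run: $*"; else "$@"; fi; }

run nvflash --save original_1070.rom   # 1. back up the current vBIOS first
run nvflash --protectoff               # 2. disable the EEPROM write protect
run nvflash -6 gtx1070ti_target.rom    # 3. flash, overriding ID-mismatch checks
# Recovery plan: boot from the iGPU or a second GPU, then re-flash
# original_1070.rom the same way.
```

If the card no longer posts after a real flash, the backup made in step 1 is what gets you back.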


----------



## Skylinestar

Quote:


> Originally Posted by *gtbtk*
> 
> There is no reason why you cant overclock. you are well under power limits at 70 deg. The default limit is 84 and you can increase that in afterburner to 93 deg. You didn't say what model card you have.
> 
> I am assuming that the temp details you gave us is with everything, including the fans, at stock?
> 
> If that is the case, then the first thing you should do is set a custom fan curve. Most Pascal cards have the zero noise fans that don't start spinning until 60 deg. I would suggest that you have the fans spinning at 20% at idle and your 1070 idle temps should drop to about 40-45 deg on a 33 deg day giving you a bit more headroom at the beginning. You can set the fans to max speed on the curve once the GPU temp gets to 60s or 70 deg if you like.
> 
> Once you have the fans under control, memory OC should be fine in the -500 to +700Mhz range.
> 
> Increase the power target slider but, voltage = heat so you can try adding the +100 to the voltage slider, It wont kill your card, but will probably be better off leaving the power target at the default and benefit from lower temps/potentially higher frequency. If your case airflow is restrictive, extra case fans will help manage temps as well.
> 
> If your card uses a reference board with 1506Mhz reference clocks, you can probably OC the core in the +150 to +200 range with the core slider.


My card is the Palit JetStream GTX1070


----------



## gtbtk

Quote:


> Originally Posted by *Skylinestar*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> There is no reason why you cant overclock. you are well under power limits at 70 deg. The default limit is 84 and you can increase that in afterburner to 93 deg. You didn't say what model card you have.
> 
> I am assuming that the temp details you gave us is with everything, including the fans, at stock?
> 
> If that is the case, then the first thing you should do is set a custom fan curve. Most Pascal cards have the zero noise fans that don't start spinning until 60 deg. I would suggest that you have the fans spinning at 20% at idle and your 1070 idle temps should drop to about 40-45 deg on a 33 deg day giving you a bit more headroom at the beginning. You can set the fans to max speed on the curve once the GPU temp gets to 60s or 70 deg if you like.
> 
> Once you have the fans under control, memory OC should be fine in the -500 to +700Mhz range.
> 
> Increase the power target slider but, voltage = heat so you can try adding the +100 to the voltage slider, It wont kill your card, but will probably be better off leaving the power target at the default and benefit from lower temps/potentially higher frequency. If your case airflow is restrictive, extra case fans will help manage temps as well.
> 
> If your card uses a reference board with 1506Mhz reference clocks, you can probably OC the core in the +150 to +200 range with the core slider.
> 
> 
> 
> My card is the Palit JetStream GTX1070

That will overclock fine. Just set a custom fan curve and pay attention to case airflow if you are in a tropical environment. Everything I said in my last post applies.
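A "custom fan curve" is just a piecewise-linear map from GPU temperature to fan duty cycle. A minimal sketch, with illustrative break points rather than a recommendation for any particular card:

```python
def fan_speed(temp_c):
    """Piecewise-linear fan curve: 20% at idle, ramping to 100% by 70 C.
    Break points are illustrative, not tuned for any specific 1070."""
    points = [(30, 20), (50, 35), (60, 60), (70, 100)]  # (temp C, duty %)
    if temp_c <= points[0][0]:
        return points[0][1]
    if temp_c >= points[-1][0]:
        return points[-1][1]
    # Linear interpolation between the two surrounding break points
    for (t0, s0), (t1, s1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(45))  # -> 31.25
```

In Afterburner you would drag the same break points on the fan curve graph; the shape (quiet at idle, aggressive past 60°C) is what matters, not these exact numbers.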


----------



## gtbtk

Well, the MSI 1070 Ti Gaming comes with Micron memory. The PCB is v330 v6.0; the 1070 PCB is v330 v5.0.

As far as I could tell from the Hardware Unboxed video, they are identical except for the screen printing on the GPU.


----------



## SavantStrike

Quote:


> Originally Posted by *gtbtk*
> 
> Well MSI 1070TI Gaming comes with Micron memory. The PCB is v330 v6.0. The 1070 PCB is v330 v5.0.
> 
> As far as I could tell from the hardware unboxed video was that they are identical except for the screen printing on the GPU


I would have been shocked if they were different. Now if there was a BIOS available...


----------



## gtbtk

Quote:


> Originally Posted by *SavantStrike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Well MSI 1070TI Gaming comes with Micron memory. The PCB is v330 v6.0. The 1070 PCB is v330 v5.0.
> 
> As far as I could tell from the hardware unboxed video was that they are identical except for the screen printing on the GPU
> 
> 
> 
> I would have been shocked if they were different. Now if there was a BIOS available...

Give it a week or two and they will start popping up on TechPowerUp.


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> Well MSI 1070TI Gaming comes with Micron memory. The PCB is v330 v6.0. The 1070 PCB is v330 v5.0.
> 
> As far as I could tell from the hardware unboxed video was that they are identical except for the screen printing on the GPU


The MOSFETs should be from UBIQ, not the ON Semiconductor 4C86N,
just like the Gaming Silver 1070.


----------



## asdkj1740

Quote:


> Originally Posted by *SavantStrike*
> 
> I would have been shocked if they were different. Now if there was a BIOS available...


I think we can still cross-flash a bios for a higher power limit.
There's no suitable driver for the 1070 Ti right now; you can't even get the card running properly, since Nvidia blocks the 1070 Ti from using the current latest or older drivers.

It is said that Nvidia has banned all OC bioses.
But according to US Amazon and Newegg, the Asus Strix models ship with higher factory overclocks.
I don't know what is going on. I do know some AIBs will still sell the OC version of the 1070 Ti because they want to clear their stock of OC cards:
Nvidia suddenly changed its mind and banned all OC bioses after the AIBs had already started making OC 1070 Tis, so there must be some stock of OC cards, and the AIBs will still sell it.
However, as far as I'm concerned, the only difference between the OC bioses and the stock bios is the GPU clock; the power limits are the same.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Well MSI 1070TI Gaming comes with Micron memory. The PCB is v330 v6.0. The 1070 PCB is v330 v5.0.
> 
> As far as I could tell from the hardware unboxed video was that they are identical except for the screen printing on the GPU
> 
> 
> 
> the mosfet should be from ubiq, but not onsemconductor 4c86n.
> just like the gaming silver 1070.

I actually think MSI is probably using exactly the same card as the Quick Silver 1070, with the differently labeled GPU, on the PCB labeled v6.0. I never bothered pulling the cooler off mine, but as far as I am aware, all of the Gaming and Quick Silver cards are built on the v330 v5.0 labeled PCB regardless of the brand of memory. The initial cards with Samsung memory had On Semi mosfets; the Quick Silver that was reviewed initially had UBIQ. I don't have enough information to say whether the mosfet change was permanent; it could be switching back and forth just like the vram. The MSI Gaming 8+2 phase VRM is designed with so much overhead, even for 1080 requirements, that I don't think it really matters. The bios doesn't care; the voltage controller takes care of all of that.

My Micron-bug/cross-flashing adventures did show me that Nvidia is in control of all the different levels of factory overclock. They offer the AIB partners a shopping list of core bios versions, with about six different clock-speed levels for the 1070 (and I assume the 1080), which the partners use to differentiate their product ranges. The partners are not actually in control of the amount of overclock beyond selecting level 1, level 2, level 3 and so on from an Nvidia product list. The partners only change the branding and some of the fluffy support features, such as LED lighting functions or support for the EVGA auto-overclock feature.

If the partners were in control of the overclock levels, you would see more variation than you get. All the partners offer 1070 models clocked at 1506, 1582, 1595, 1607, 1633, or 1671 MHz, plus one or two others that use the IR voltage controllers mostly used by Galax. As far as I can tell, there are only two or three factories that OEM a range of cards differentiated by the screen printing on the PCB and by different coolers and shrouds, just like they do with power supplies.

At the moment, Nvidia is only offering one level of 1070 Ti bios. I am sure we will see OC versions coming out.

This "locked", no-overclock 1070 Ti across all the vendors is just Nvidia offering a single core bios.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SavantStrike*
> 
> I would have been shocked if they were different. Now if there was a BIOS available...
> 
> 
> 
> i think we can still cross flash bios for higher power limit.
> no suitable driver for 1070ti right now, cant even get the card run properly, nvidia blocks 1070ti from using current latest or old drivers.
> 
> it is said that nvidia bans all oc bios.
> but according to us amazon and newegg, those asus strix models have higher preoverclocking.
> i dont know what is going on, i know some aib will still sell the oc version 1070ti because they want to sell all the stock of oc version.
> nvidia suddenly changes its mind banning all oc bios after aib have already started to make oc version 1070ti. so there must be some stock of oc card and aib will still sell it.
> however, as far as i am concern, those oc bios and stock bios difference is just the gpu clock, power limit are the same.

The 1070 Ti cards are all 1607 MHz cards as far as I have seen, but they are being marketed as overclockable in Afterburner/Precision XOC etc. They are not locked chips like a non-K Intel CPU. As I said in my last post, it would seem that Nvidia is only supplying a single level of core bios, with the 1607 MHz clock, for these cards right now. We know that it is the bios that sets the voltage/frequency curve, not the silicon itself.

I suspect the Asus cards are being marketed with the GPU Tweak OC-mode frequency in their spec sheets.


----------



## KillerBee33

Are there any threads for dual 1070 laptops, if anyone knows? Plz.


----------



## gtbtk

Quote:


> Originally Posted by *KillerBee33*
> 
> Are there any Threads with dual 1070 laptops if anyone knows plz.


Not seen one.

You can discuss the 1070s here, though.


----------



## KillerBee33

Quote:


> Originally Posted by *gtbtk*
> 
> not seen one.
> 
> you can discuss the 1070s here though.


It's still on its way here, so I wanted to know what issues I might run into... but thanx.


----------



## Madmaxneo

Quote:


> Originally Posted by *Skylinestar*
> 
> My GTX1070 has Samsung memory. Currently, it's running at stock (non OC). Temperature is approx 70°c during Unigine Heaven 4 bechmark. Room ambient temp is 33°c. With such high temp, should I overclock?


I just ran my 1070 in Heaven and my temps never got over 42°C.
But I am watercooling my card with a Swiftech H140-X AIO and a Heatkiller IV waterblock; I paid about $160 combined for both.
I mention this because if you want great temps you should try watercooling, and the route I went was pretty inexpensive comparatively.


----------



## TLCH723

Quote:


> Originally Posted by *Madmaxneo*
> 
> I just ran my 1070 in Heaven and my temps never got over 42°c.
> But I am watercooling my card with a Swiftech H140-X AIO and a Heatkiller IV waterblock. I paid about $160 combined for both of those.
> I mention this because if you want great temps then you should try watercooling, and the route I went was pretty inexpensive comparatively.


Or a bigger fan.


----------



## gtbtk

Quote:


> Originally Posted by *KillerBee33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> not seen one.
> 
> you can discuss the 1070s here though.
> 
> 
> 
> It's still on its way here , so i wanted to know what issues i might run into ...but thanx.

The laptop 1070 actually has a few more CUDA cores than the desktop version but is clocked lower.

Keep an eye on your fan management and temps to keep the frequency up.

Nvidia Inspector allows you to create SLI profiles for games that don't otherwise have any.

There are no glaring problems that I am aware of.


----------



## b0uncyfr0

Alright things are getting interesting - we need more reviews/teardowns.


----------



## gtbtk

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Alright things are getting interesting - we need more reviews/teardowns.


I'm sure they will come. The EVGA SC board, which is a reference card, looked identical to the 1070 card, but of course YouTubers often have no idea what they are looking at, so they don't bother doing things like checking voltage controller chip model numbers. I guess that announcing an observation that the 1070 and 1070 Ti cards are the same would not bode well for their continued supply of product to make content about.

The Chinese and German sites are the ones that provide greater detail on the actual components used on the boards, as they tend to be more focused on the manufacturing aspects of the cards.


----------



## KillerBee33

Quote:


> Originally Posted by *gtbtk*
> 
> The 1070 actually has a few more cuda cores than the desktop but is clocked lower than the desktop version.
> 
> Keep an eye on your fan management and temps to keep frequency up.
> 
> Nvidia inspector allows you to create SLI profiles for games that don't otherwise have any.
> 
> there are no glaring problems that I am aware of.


Well, SLI is the issue for me; I never had it nor wanted to, but this laptop was a steal and I can't use my desktop right now, so I just went with it.


----------



## asdkj1740

The EVGA SC shown on Bitwit should be a 1080 FE PCB, but with one solid cap missing on the right side.
Inno3D this time still uses the FE PCB, but with a full 6 phases for the GPU core.
http://www.expreview.com/57421.html

Rumor:
The reason Nvidia banned all AIB custom bioses is mainly real OC restriction. In short, the voltage/frequency curve is insane.

Users can still overclock the 1070 Ti manually; however, the voltage behavior is very strange.
GPU voltage fluctuates a lot. It can drop from 1.06 V to 0.87 V, making manual OC very difficult when trying to stabilize the core clock.
Can't be sure whether it is a bios problem and/or a driver problem.
As always, the power limit is too low on some 1070 Tis, with a stock 180 W and max 216 W, causing power throttling almost every second.
Voltage locking with Ctrl+L won't work on at least some 1070 Tis.
Flattening the voltage curve from 1 V to 1.09 V at 2000 MHz results in crashes.
Some insider said Nvidia has heavily tweaked the stock bios on all 1070 Tis to prevent the 1070 Ti being too powerful.
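The "flattening the voltage curve" trick mentioned above means pinning every voltage/frequency point at or above a chosen voltage to one clock, so GPU Boost holds a single V/F point. A rough sketch of the idea (the curve values here are made up, and Afterburner does this through its Ctrl+F curve editor rather than code):

```python
def flatten_vf_curve(curve, v_from, clock_mhz):
    """Pin every (voltage, clock) point at or above v_from to clock_mhz,
    so the card holds one V/F point instead of wandering down the curve."""
    return [(v, clock_mhz if v >= v_from else c) for v, c in curve]

# Hypothetical stock curve: (volts, MHz) pairs
curve = [(0.80, 1700), (0.90, 1850), (1.00, 1950), (1.06, 2025), (1.09, 2050)]
print(flatten_vf_curve(curve, 1.00, 2000))
```

The report above is that doing exactly this, 1.00-1.09 V all pinned to 2000 MHz, crashes on the 1070 Ti where it worked on the 1070.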


----------



## gtbtk

Quote:


> Originally Posted by *KillerBee33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The 1070 actually has a few more cuda cores than the desktop but is clocked lower than the desktop version.
> 
> Keep an eye on your fan management and temps to keep frequency up.
> 
> Nvidia inspector allows you to create SLI profiles for games that don't otherwise have any.
> 
> there are no glaring problems that I am aware of.
> 
> 
> 
> Well, SLI is the issue for me , never had one nor wanted to get but this laptop was a steal and i cant use my desktop right now so i just went with it.

A single 1070 still gives you really good performance in a laptop, so you won't need to compromise that much by not running SLI. Disabling SLI and running on a single GPU will certainly help you extend battery life and reduce heat. The second card is a bonus that you can use on AC power in a cool environment if you want to.

Definitely make sure you get hold of the latest version of Nvidia Inspector and the associated profile manager. You can use it for overclocking, but I would recommend Afterburner for that. The great thing is that it allows you to edit the SLI profiles in the driver for games that don't scale in SLI; support can be added to a new or existing profile to create SLI support for that game.

Make sure you do a little homework to understand the different ways SLI can be implemented. It can be set to render either full alternate frames or alternate lines of a single frame. Different games will perform better with one or the other, so you may need some trial and error when creating new profiles.


----------



## mattliston

New 1070 owner here.

PNY XCLR8 double fan updated version from bestbuy about a month ago.

Not sure of the memory type. Under GPU-Z, the vbios shows support for 3 types of Micron and 1 type of Samsung, so I believe PNY put on whatever they had.

Currently running 2 GHz at 1.025 V, stable in games and all benchmarks, never going over 65°C, with a simple +200 offset on memory to sit at the 4.2/8.4 GHz spec.

Does okay, even with my AMD FX. But then, my FX is running 300x16 for 4.8 GHz with 2700 MHz on both the northbridge and HT link, and 2400 MHz CL10 RAM at CR1, so the system as a whole is pretty darn smooth, especially with help from Process Lasso to target the primary core of each module on the chip.
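The 4.2/8.4 GHz numbers above follow from GDDR5 being quad-pumped, with Afterburner's offset applied to the double-data-rate figure. That offset convention is an assumption; tools report memory clocks inconsistently, so treat this as arithmetic, not gospel:

```python
def effective_mts(command_clock_mhz, afterburner_offset_mhz=0):
    """GDDR5 moves 4 bits per pin per command-clock cycle. Afterburner's
    readout and offset work on the 2x (DDR) rate, so apply the offset there
    (assumed convention), then double again for the effective rate."""
    ddr_rate = command_clock_mhz * 2 + afterburner_offset_mhz
    return ddr_rate * 2  # effective transfer rate in MT/s

# Stock 1070 memory (2002 MHz command clock) plus a +200 offset:
print(effective_mts(2002, 200))  # -> 8408, i.e. roughly 4.2/8.4 GHz
```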


----------



## KillerBee33

Quote:


> Originally Posted by *gtbtk*
> 
> A single 1070 sull gives you really good performance in a laptop so you wont need to compromise that much not running SLI. Disabling SLI and running on a single GPU will certainly help you extend battery life and reduce heat. The second card is a bonus that you can use on AC power in a cool environment if you want to.
> 
> Definitely make sure you get hold of the latest version of Nvidia Inspector and the associated profile manager. You can use it for overclocking but I would recommend Afterburner for that. The great thing is that it allows you to edit the SLI profiles in the driver for games that don't scale in SLI. It can be added in a new or existing profile to create the SLI support for that game.
> 
> Make sure you do a little homework to understand the different ways SLI can be implemented. It can be set to render either full alternate frames or alternate lines on a single frame. Different games will perform better with one or the other, so you may need to do some trial and error in creating new profiles.


Yeah, that's exactly the reason I never went with SLI: all this tweaking for a small chance of a performance gain. Well, I'll be back for info once it's here. I mainly got it for VR, since VR applications are starting to support SLI.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> the evga sc shown on bitwit should be a 1080 fe pcb, but one solid cap is missed on the right side.
> inno3d, this time still use the fe pcb but with full 6 phases for gpu core.
> http://www.expreview.com/57421.html
> 
> rumor:
> the reason why nvidia bans all aib custom bios is mainly for real oc restriction. in short, the voltage frequency curve is insane.
> 
> users can still overclock 1070ti manually, however, the voltage behavior is very strange.
> gpu voltage fluctuates a lot, it can be dropped to 0.87v from 1.06v, making maually oc very difficult for stabilizing the core clock.
> cant sure whether it is the bios problem and/or the driver problem.
> as always power limit is too low on some 1070ti with stock 180w and max 216w, causing power throttling on almost every second.
> voltage locking by "ctrl and l" wont work on, at least some, 1070ti.
> flattening the voltage curve from 1v to 1.09v on 2000mhz would be resulting in crashes.
> some insider said nvidia has heavily tweaked the stock bios for all 1070ti for preventing 1070ti being too powerful.


Thank you for posting that. Please keep sharing the Chinese site links here; they are difficult to find if you only search in English.

The EVGA SC card is the same as the 1070 PCB, as is the MSI Gaming board, as far as I can tell. There is only a single 8-pin on the EVGA card, it lacks the extra VRM that GDDR5X needs, and of course it has GDDR5 installed. I am not familiar with the colorful range of 1070s, but electrically the 1070 and 1070 Ti have similar requirements. The Ti, as with the 1080, is likely to draw a little more power than the 1070, but even the reference VRM can support up to 250 W. The 1080 and 1070 reference PCBs are very similar, though.

Interesting to see that all of the boards that have had a teardown are using Micron memory.

How do you know about the reports of voltage fluctuations? The EVGA 1070 bioses had a tendency to jump voltage levels around a lot; the MSI and Asus bioses did not.


----------



## Falkentyne

Quote:


> Originally Posted by *KillerBee33*
> 
> It's still on its way here , so i wanted to know what issues i might run into ...but thanx.


What laptop are you talking about here?

The GT73VR 7RE SLI ? or the GT83VR SLI? Or a Clevo model?


----------



## KillerBee33

Quote:


> Originally Posted by *Falkentyne*
> 
> What laptop are you talking about here?
> 
> The GT73VR 7RE SLI ? or the GT83VR SLI? Or a Clevo model?


msi_gt83vr_titan_sli_055


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> Thank you for posting that. please keep sharing the chinese site links here, they are difficult to find in you only search in English.
> 
> The EVGA SC card is the same as the 1070 PCB as is the MSI Gaming board as far as I can tell. Only a single 8 pin on the EVGA card, it lacks the extra VRM that GDDR5X needs and of course it has GDDR5 installed. I am not familiar with the colorful range of 1070s but electrically, the 1070 and 1070TI have similar requirements. The TI as with teh 1080 is likely to draw a little more power than the 1070 but even the reference VRM can support up to 250W. The 1080 and 1070 reference PCBs are very similar though.
> 
> Interesting to see that all of the boards that have had a teardown are using Micron memory.
> 
> How do you know about the reports of the voltage fluctuations? The EVGA 1070 bioses had a tendency to jump voltage levels around a lot, the MSI and ASUS bioses did not


The EVGA SC 1070 has 4 phases for the GPU core, while the 1070 Ti EVGA SC has 5 phases.
I think there is a little upgrade there, because the stock power limit is said to be 180 W, just like the 1080.
But some AIBs simply reuse the 1070 heatsink design, so we will see poor cooling/noise performance on some 1070 Tis that use the 1070's heatsink rather than the 1080's.
For example, the Gigabyte G1 Gaming: the 1070 G1 has only 2 copper heatpipes, the 1080 G1 has 3, and the latest 1070 Ti Gaming (G1) has only 2.

Exactly; that's why I flashed the Zotac bios onto my EVGA FTW. I thought the fluctuation was because of the EVGA low power limit.
However, the Asus Strix has an even lower power limit than the EVGA FTW. If the Asus Strix doesn't suffer from voltage/clock fluctuation, then it means some bioses from some AIBs are just superior.
Therefore cross-flashing may be helpful to those who would like a stable voltage and core clock.


----------



## Falkentyne

Quote:


> Originally Posted by *asdkj1740*
> 
> evga sc 1070 has 4 phase for the 1070 gpu core, while 1070ti evga sc has 5 phases.
> i think there is a little upgrade, because the stock power limit is said to be 180w, just like 1080.
> but some aib simply uses the 1070 heatsink design....namely we would see poor cooling/noise performance on some 1070ti which is using 1070's heatsink but not 1080 heatsink.
> for example, gigabye g1 gaming, the 1070 g1 has only 2 copper heatpipes, while 1080 g1 has 3, and the latest 1070ti gaming (g1) has 2 only.
> 
> exactly, thats why i flashed zotac bios on my evga ftw. i though that was because of the evga low power limit.
> however asus strix has even lower power limit than evga ftw, if asus strix doesnt suffer from voltage/clock fluctuation then it means some bios from some aib is just superior.
> therefore cross flashing maybe helpful to those who would like to have stable voltage and core clock.


Do the phases even matter?
The MSI laptop MXM 1070 card has a 115 W TDP, yet it can handle *195W* directly through the MXM port with a HW programmer and the Pascal bios editor (provided the mainboard can deliver that power), and it has been tested at 250 W successfully. I use mine at 185 W, so I seriously doubt the desktop cards are that limited by phases.


----------



## Falkentyne

Quote:


> Originally Posted by *KillerBee33*
> 
> msi_gt83vr_titan_sli_055


That laptop is a good candidate for a TDP mod. You can mod both cards to 185 W TDP, but please check whether the 1070 SLI version of the GT83VR uses the same bios and firmware as the 1080 version (please check this); you'll be glad you did.

If it does, you can increase the overall system power limit from 460 W to 660 W (if both cards are inserted; not sure if they have to be ENABLED or not) by using RW Everything, Embedded Controller, offset E3, and increasing the value there by 1. This will change your system power ID from the 1070 tier (230 W / 230 W x2) to the 1080 tier (330 W / 330 W x2).

The system power ID is important for 2 reasons:
1) There is hybrid battery drain if the system load is >70% of the main system rating. Changing the power ID to a higher tier can prevent this.
2) If total system power exceeds the system power ID, the CPU will be forcibly power-limit throttled to TDP (or even LOWER than TDP): 45 W and then 25 W. This is important when TDP-modding video cards. Changing the power ID even if you aren't TDP modding is good, because you can avoid hybrid battery drain at high load.
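The two failure modes described above amount to simple threshold checks against the system power ID. A sketch of that logic; the 70% figure and tier ratings come from the post, everything else is illustrative:

```python
def psu_tier_behavior(load_w, power_id_rating_w):
    """Model of the power-tier behavior described above: exceed the rating
    and the CPU gets power-limit throttled; sit above 70% of it and the
    system dips into hybrid battery drain."""
    if load_w > power_id_rating_w:
        return "cpu power-limit throttle"  # forced down to 45 W, then 25 W
    if load_w > 0.70 * power_id_rating_w:
        return "hybrid battery drain"
    return "normal"

# Same 400 W load, before and after raising the power ID from 460 W to 660 W:
print(psu_tier_behavior(400, 460))  # -> hybrid battery drain (400 > 322)
print(psu_tier_behavior(400, 660))  # -> normal (400 < 462)
```

This is why bumping the power ID helps even without a TDP mod: the same load lands in a lower-stress band of the bigger rating.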


----------



## KillerBee33

Quote:


> Originally Posted by *Falkentyne*
> 
> That laptop is a good candidate for a TDP mod. You can mod both cards to 185W TDP, but please check to see if the 1070 SLI version of the GT83VR uses the same bios and firmware as the 1080 version (please check this)--you'll be glad you did.
> 
> If it does, you can increase the overall system power limit from 460W to 660W (if both cards are inserted, not sure if they have to be ENABLED or not, however), by using RW Everything, Embedded Controller, offset E3, and increase the value here by 1. (this will change your system power ID from 1070 (230W / 230W X2) to 1080 (330W / 330W x2)
> 
> The system power ID is important, for 2 reasons:
> 1) there is hybrid battery drain if the system load is >70% of the main system rating. Changing the power ID to a higher tier can prevent this.
> 2) if total system power exceeds the system power ID, the CPU will be forcibly power limit throttled to TDP (or even LOWER than TDP)--45W and then 25W. This is important when TDP modding videocards. Changing the powerID even if u arent tdp modding is good, because you can avoid hybrid battery drain at high load


Will check that once its here.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Thank you for posting that. please keep sharing the chinese site links here, they are difficult to find in you only search in English.
> 
> The EVGA SC card is the same as the 1070 PCB as is the MSI Gaming board as far as I can tell. Only a single 8 pin on the EVGA card, it lacks the extra VRM that GDDR5X needs and of course it has GDDR5 installed. I am not familiar with the colorful range of 1070s but electrically, the 1070 and 1070TI have similar requirements. The TI as with teh 1080 is likely to draw a little more power than the 1070 but even the reference VRM can support up to 250W. The 1080 and 1070 reference PCBs are very similar though.
> 
> Interesting to see that all of the boards that have had a teardown are using Micron memory.
> 
> How do you know about the reports of the voltage fluctuations? The EVGA 1070 bioses had a tendency to jump voltage levels around a lot, the MSI and ASUS bioses did not
> 
> 
> 
> evga sc 1070 has 4 phase for the 1070 gpu core, while 1070ti evga sc has 5 phases.
> i think there is a little upgrade, because the stock power limit is said to be 180w, just like 1080.
> but some aib simply uses the 1070 heatsink design....namely we would see poor cooling/noise performance on some 1070ti which is using 1070's heatsink but not 1080 heatsink.
> for example, gigabye g1 gaming, the 1070 g1 has only 2 copper heatpipes, while 1080 g1 has 3, and the latest 1070ti gaming (g1) has 2 only.
> 
> exactly, thats why i flashed zotac bios on my evga ftw. i though that was because of the evga low power limit.
> however asus strix has even lower power limit than evga ftw, if asus strix doesnt suffer from voltage/clock fluctuation then it means some bios from some aib is just superior.
> therefore cross flashing maybe helpful to those who would like to have stable voltage and core clock.

You are right; I was sure the reference board was 5-phase. So EVGA/Nvidia has changed their board to support the added power draw from the extra available CUDA cores. The 1070 reference cards have 250 W VRMs; maybe this is to spread the heat load over a greater area?

The MSI board is identical as far as I can tell, but it has always been an 8+6-power board with a 250-300 W VRM. The MSI 1080 Gaming heatsink has an extra heat pipe compared to the 1070, and I heard a rumor that the Titanium model was using the 1080 heatsink while the others have retasked the 1070 one.

I agree with you that not all bioses are created equal. EVGA chose to ramp up power draw faster than the MSI and Asus bioses for a given load. Both of those would need a 1440p load to get near their power limits; you could hit the power limits with even the FTW bios at 1080p.

Having said that, the bios does not know or care anything about the VRM. It only knows it has to send a more-power signal, up to its limits, to the voltage controller, and that controller does the rest.

The cards I have seen to date all have UP9511 voltage controllers, just like the 1070s. They all have Micron memory, so the old 1070 bug should not be coming back. The only thing that is different that is relevant to the bios is the number of cores available in the GPU, and we don't yet know if the core count is limited physically or by a bios setting. It is unusual for Nvidia to make it so easy to find out.


----------



## gtbtk

Quote:


> Originally Posted by *Falkentyne*
> 
> Quote:
> 
> 
> 
> Originally Posted by *asdkj1740*
> 
> evga sc 1070 has 4 phase for the 1070 gpu core, while 1070ti evga sc has 5 phases.
> i think there is a little upgrade, because the stock power limit is said to be 180w, just like 1080.
> but some aib simply uses the 1070 heatsink design....namely we would see poor cooling/noise performance on some 1070ti which is using 1070's heatsink but not 1080 heatsink.
> for example, gigabye g1 gaming, the 1070 g1 has only 2 copper heatpipes, while 1080 g1 has 3, and the latest 1070ti gaming (g1) has 2 only.
> 
> exactly, thats why i flashed zotac bios on my evga ftw. i though that was because of the evga low power limit.
> however asus strix has even lower power limit than evga ftw, if asus strix doesnt suffer from voltage/clock fluctuation then it means some bios from some aib is just superior.
> therefore cross flashing maybe helpful to those who would like to have stable voltage and core clock.
> 
> 
> 
> Do the phases even matter?
> The MSI laptop MXM 1070 card has a 115W TDP, yet can handle *195W* directly through the MXM port with a HW programmer and the pascal bios editor (provided the mainboard can deliver that power), and has been tested at 250W successfully. I use mine at 185W, so I seriously doubt the desktop cards are that limited by phases.
Click to expand...

TDP is a heat-load rating, not a measure of how much power the card draws from the power supply. It can serve as a rough guide, as long as you fudge upwards.

VRM phases matter in terms of how much power they can supply to the GPU. Stepping 12 V down to 1.06 V is not 100% efficient, so it creates heat. With more phases, the heat load is split up and spread across a larger area, so each phase runs cooler and there is less chance the VRM will start throttling power because the MOSFETs get too hot. More power to the GPU also translates to more heat, which in turn gets GPU Boost to drop the clocks.

Given the limited cooling in laptops, keeping the VRM, VRAM and GPU cool becomes an even finer balance. You may find that limiting the power to 10 or 20 W above stock instead of 50 W gives the card a lower peak frequency but actually increases performance under sustained loads.
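To put rough numbers on that, here is a small sketch (my own illustration; the 90% efficiency and even load sharing are assumptions, not measured values) of how per-phase current and conversion-loss heat scale with phase count for a 12 V to 1.06 V buck conversion:

```python
def per_phase_load(gpu_power_w, vcore=1.06, efficiency=0.90, phases=4):
    """Split GPU current and VRM conversion-loss heat evenly across phases."""
    amps_total = gpu_power_w / vcore                 # current the GPU draws
    heat_total = gpu_power_w * (1 / efficiency - 1)  # watts lost in the VRM
    return amps_total / phases, heat_total / phases

# At a 150 W GPU load, going from 4 to 5 phases drops each phase from
# roughly 35 A / 4.2 W of heat to roughly 28 A / 3.3 W.
for n in (4, 5):
    amps, watts = per_phase_load(150, phases=n)
    print(f"{n} phases: {amps:.1f} A, {watts:.1f} W heat per phase")
```

The same arithmetic explains the laptop point: capping the power limit caps both the current and the waste heat at once, which is why a modest limit can keep both the VRM and GPU Boost happier under sustained load.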


----------



## Jaybro

Hey everyone,
I wanted to know which 1070 to get. I've been saving up and finally have a chance to buy one, but I don't know which of these three is best in terms of cooling, noise and warranty. I'm not that big on overclocking. The reason I can't go full-sized is that my mobo has SATA ports on the right edge and my current EVGA 1060 is blocking them.









MSI Aero 1070 ITX
Gigabyte mini 1070
or
Zotac 1070 mini


----------



## khanmein

Quote:


> Originally Posted by *Jaybro*
> 
> Hey everyone,
> I wanted to know which 1070 to get. I've been saving up and finally have a chance to buy one, but I don't know which of these three is best in terms of cooling, noise and warranty. I'm not that big on overclocking. The reason I can't go full-sized is that my mobo has SATA ports on the right edge and my current EVGA 1060 is blocking them.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSI Aero 1070 ITX
> Gigabyte mini 1070
> or
> Zotac 1070 mini


If mini, I'd personally go for the Zotac, but you should go for whichever is cheapest and actually available.









Another suggestion, and the cheapest route: get yourself a PCIe riser cable.


----------



## Jaybro

Quote:


> Originally Posted by *khanmein*
> 
> If mini, I'd personally go for the Zotac, but you should go for whichever is cheapest and actually available.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Another suggestion, and the cheapest route: get yourself a PCIe riser cable.


Thanks, yeah, I was leaning towards the Zotac too; two fans would be better.


----------



## gtbtk

Quote:


> Originally Posted by *Jaybro*
> 
> Hey everyone,
> I wanted to know which 1070 to get. I've been saving up and finally have a chance to buy one, but I don't know which of these three is best in terms of cooling, noise and warranty. I'm not that big on overclocking. The reason I can't go full-sized is that my mobo has SATA ports on the right edge and my current EVGA 1060 is blocking them.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSI Aero 1070 ITX
> Gigabyte mini 1070
> or
> Zotac 1070 mini


I would also pick the Zotac for the dual fans. Having said that, the Gigabyte OC version has the highest core clock out of the box, though not by much. That doesn't help if the cooler is not as efficient.


----------



## mattliston

A poor cooler is not always the worst thing though. My PNY 1070 has been running COOLER with my voltage/frequency curve at 2 GHz than at its out-of-box specs. I love you, MSI Afterburner!!!


----------



## gtbtk

Quote:


> Originally Posted by *mattliston*
> 
> A poor cooler is not always the worst thing though. My PNY 1070 has been running COOLER with my voltage/frequency curve at 2 GHz than at its out-of-box specs. I love you, MSI Afterburner!!!


True, quality is also a thing. Gigabyte have a reputation on 1070s for poor fans and cheap shrouds. The MSI uses what looks like a single fan from the Gaming X range, which may be fine, but I expect it would not be as efficient as a larger card's cooler.

Temps only help with clock stability and, ultimately, performance in longer gaming sessions. The cooler you can keep the card over a period of time, the less the frequency will drop from its peak. Pascal is not a very hot architecture, so you don't need a massive cooler anyway. All of the normal-size two-fan models work well enough. With the mini models cooling is more limited; two fans will generally move more air than one, so that is the basis of my reasoning. Any of the three will perform adequately, especially if he is not overclocking.


----------



## GoLDii3

Got a GTX 1070 Windforce 2X as a replacement for my dead 980 Ti; it's boosting to 1960 MHz out of the box. How's that?


----------



## asdkj1740

Quote:


> Originally Posted by *GoLDii3*
> 
> Got a GTX 1070 Windforce 2X as a replacement for my dead 980 Ti; it's boosting to 1960 MHz out of the box. How's that?


Rev 1 or rev 2? They have different cooler designs as well as different PCBs.

The out-of-the-box boost clock doesn't mean much, as you probably can't stabilise it during gaming.


----------



## GoLDii3

Quote:


> Originally Posted by *asdkj1740*
> 
> Rev 1 or rev 2? They have different cooler designs as well as different PCBs.
> 
> The out-of-the-box boost clock doesn't mean much, as you probably can't stabilise it during gaming.


REV 2.0, what are the differences?


----------



## asdkj1740

Quote:


> Originally Posted by *GoLDii3*
> 
> REV 2.0, what are the differences?


The cooler design is totally changed, although there are still two HDT heatpipes on both versions.
The rev 2.0 PCB's VRM is located to the left of the GPU die; it's more like the AMD RX 480/580 PCBs, which put the VRM in the same place.
The rev 1.0 PCB's VRM is on the right side.
I can't say much about the rev 2 PCB's differences in terms of quality, as I don't have a rev 1 card for comparison, but both have a 6+2 phase design, as shown on the official sites.
It is said that the rev 1 card is already out of production; it is hard to find one on the market now.

In my experience so far with the rev 2 cooler, you need to raise the fan speed to at least 2000 RPM to get enough airflow and/or pressure over the GPU die.
I personally think the cut-outs on the top and bottom of the fan shroud are not big enough to exhaust hot air efficiently. However, once you give the card enough fan RPM, the cooler is not bad and should be better than the rev 1 cooler.


----------



## khanmein

Quote:


> Originally Posted by *GoLDii3*
> 
> REV 2.0, what are the differences?


I would personally avoid Gigabyte, as they tend to release a lot of revisions, including across their motherboard lineup.


----------



## gtbtk

Quote:


> Originally Posted by *GoLDii3*
> 
> Got a GTX 1070 Windforce 2X as a replacement for my dead 980 Ti; it's boosting to 1960 MHz out of the box. How's that?


that is about what I would expect for a 1070.


----------



## LuckLess7

I'm not sure if this is the correct thread for it, but I'm a little worried. I haven't had many issues with my GTX 1070 lately, but yesterday I installed Wolfenstein II and, sadly, I can't get it to run without issues. During video sequences and loading screens I'm getting these artifacts. I checked out another Bethesda game that uses Vulkan, Doom, and the artifacts appeared there too. I updated the driver, nothing changed; rolled back to 388.0.0, just in case. I dialed the OC back to stock, still the same. The last time I played Doom there were no problems like that, but that was on a much older driver version.
Any GTX 1070 owner with the same problem? I really hope it's driver related or something on the software end.


----------



## KillerBee33

So, got my new toy ...this is how the 2X1070's compare to desktop first gen TitanXP watercooled.
TXP https://www.3dmark.com/spy/624012
1070X2 https://www.3dmark.com/spy/2642121


----------



## asdkj1740

Quote:


> Originally Posted by *khanmein*
> 
> I would personally avoid Gigabyte, as they tend to release a lot of revisions, including across their motherboard lineup.


It's common to almost all AIBs.


----------



## GoLDii3

Quote:


> Originally Posted by *LuckLess7*
> 
> I'm not sure if this is the correct thread for it, but I'm a little worried. I haven't had many issues with my GTX 1070 lately, but yesterday I installed Wolfenstein II and, sadly, I can't get it to run without issues. During video sequences and loading screens I'm getting these artifacts. I checked out another Bethesda game that uses Vulkan, Doom, and the artifacts appeared there too. I updated the driver, nothing changed; rolled back to 388.0.0, just in case. I dialed the OC back to stock, still the same. The last time I played Doom there were no problems like that, but that was on a much older driver version.
> Any GTX 1070 owner with the same problem? I really hope it's driver related or something on the software end.


Try disabling async compute and gpu culling. Also enable deferred rendering.


----------



## gtbtk

Quote:


> Originally Posted by *LuckLess7*
> 
> I'm not sure if this is the correct thread for it, but I'm a little worried. I haven't had many issues with my GTX 1070 lately, but yesterday I installed Wolfenstein II and, sadly, I can't get it to run without issues. During video sequences and loading screens I'm getting these artifacts. I checked out another Bethesda game that uses Vulkan, Doom, and the artifacts appeared there too. I updated the driver, nothing changed; rolled back to 388.0.0, just in case. I dialed the OC back to stock, still the same. The last time I played Doom there were no problems like that, but that was on a much older driver version.
> Any GTX 1070 owner with the same problem? I really hope it's driver related or something on the software end.


What CPU/motherboard? How much and what type of RAM are you using? Have you overclocked the CPU? Have you overclocked the memory? Have you increased the BCLK? What other motherboard BIOS voltage adjustments have you made?

Have you tried resetting everything in the UEFI back to optimized defaults?


----------



## gtbtk

Quote:


> Originally Posted by *KillerBee33*
> 
> So, got my new toy ...this is how the 2X1070's compare to desktop first gen TitanXP watercooled.
> TXP https://www.3dmark.com/spy/624012
> 1070X2 https://www.3dmark.com/spy/2642121


that is really impressive for a laptop


----------



## KillerBee33

Quote:


> Originally Posted by *gtbtk*
> 
> that is really impressive for a laptop


That's with just the one-click Turbo mode from MSI; I'll do everything manually later. I'm more than sure it will hit over 10K in Time Spy.
Keep in mind my Titan was clocked to the max with +800 on the memory, and the 6700 in the desktop is at 4.8 GHz.
Check out the SLI scaling on that thing.











----------



## SavantStrike

Quote:


> Originally Posted by *KillerBee33*
> 
> So, got my new toy ...this is how the 2X1070's compare to desktop first gen TitanXP watercooled.
> TXP https://www.3dmark.com/spy/624012
> 1070X2 https://www.3dmark.com/spy/2642121


I'm not sure why two 1070s instead of a single 1080 Ti. Other than that, nice setup.


----------



## KillerBee33

Quote:


> Originally Posted by *SavantStrike*
> 
> I'm not sure why two 1070s instead of a single 1080 Ti. Other than that, nice setup.


With my current job I'm always on the move; I can't be dragging a desktop and a 50-inch around. This thing fits in a special backpack.


----------



## LuckLess7

Quote:


> Originally Posted by *GoLDii3*
> 
> Try disabling async compute and gpu culling. Also enable deferred rendering.


I tried that, didn't change anything, unfortunately.
Quote:


> Originally Posted by *gtbtk*
> 
> What CPU/motherboard? How much and what type of Ram are you using? Have you overclocked the CPU? Have you overclocked the memory? Have you increased BCLK? what other motherboard bios voltage adjustments have you made?
> 
> Have you tried resetting everything in the UEFI back to optimized defaults?


I have a Core i7 6700K with only the stock XMP profile applied, on an Asus Z170 Pro Gaming, and this RAM. I reset the UEFI to defaults, but that didn't change anything either. It must be some Vulkan problem.

Alright, Wolfenstein II is very new, so some issues are expected. But the same thing happens in Doom too, and only with the Vulkan API. Doom lets you choose OpenGL as well, and there are no issues there. Look at the screenshots:




It's the same scene; I took the Vulkan screenshot with my phone, as the artifacts did not show up in actual screenshots!

So is it a graphics chip problem or a Vulkan problem? I played through all of Doom last year when my card was fairly new, and the issue was not there, I'm 100% sure. And I did use Vulkan... So strange.


----------



## Madmaxneo

Quote:


> Originally Posted by *LuckLess7*
> 
> I tried that, didn't change anything, unfortunately.
> I have a Core i7 6700K with only the stock XMP profile applied on an Asus Z170 Pro Gaming. And this RAM I put the UEFI to default but that didn't change anything either. It must be some Vulkan problem.
> 
> Alright, Wolfenstein II is very new, so there are issues, sure. But the same thing is in Doom too, but only if I use Vulkan API. Doom lets you choose ogl as well, there are no issues. Look at the screenshots:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Its the same scene, I took the vulkan screenshot with my phone as it did not show up on actual screenshots!
> 
> So is that a graphics chip problem or a vulkan problem? I played through all of Doom last year when my card was fairly new, it was not there, I'm 100% sure. And I did use Vulkan... So strange


I may be going out on a limb here but is there any chance it could be your monitor reacting to using vulkan?

What monitor do you have?


----------



## LuckLess7

Quote:


> Originally Posted by *Madmaxneo*
> 
> I may be going out on a limb here but is there any chance it could be your monitor reacting to using vulkan?
> 
> What monitor do you have?


I have this rather generic 4K monitor https://www.asus.com/de/Monitors/PB287Q/, a rather expensive Samsung 4K TV, and a very good Full HD Asus monitor for editing. I launched the game on all three in many configurations, and it turns out those artifacts only occur at native 4K resolution, no matter the VSYNC mode: adaptive, on or off. At lower resolutions, so 1440p or Full HD on the 4K displays, I don't get these artifacts/glitches in either Vulkan game.
When the API is not Vulkan, I don't experience these issues either.

TL;DR: Vulkan @ 4K causes glitches, artifacts and ugly screentearing/lining.

I have tested DX12 games: no issues (Tomb Raider and Quantum Break).


----------



## gtbtk

Quote:


> Originally Posted by *SavantStrike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KillerBee33*
> 
> So, got my new toy ...this is how the 2X1070's compare to desktop first gen TitanXP watercooled.
> TXP https://www.3dmark.com/spy/624012
> 1070X2 https://www.3dmark.com/spy/2642121
> 
> 
> 
> I'm not sure why two 1070s instead of a single 1080 Ti. Other than that, nice setup.
Click to expand...

They don't make 1080 Ti laptops, as far as I am aware.


----------



## gtbtk

Quote:


> Originally Posted by *LuckLess7*
> 
> Quote:
> 
> 
> 
> Originally Posted by *GoLDii3*
> 
> Try disabling async compute and gpu culling. Also enable deferred rendering.
> 
> 
> 
> I tried that, didn't change anything, unfortunately.
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> What CPU/motherboard? How much and what type of Ram are you using? Have you overclocked the CPU? Have you overclocked the memory? Have you increased BCLK? what other motherboard bios voltage adjustments have you made?
> 
> Have you tried resetting everything in the UEFI back to optimized defaults?
> 
> Click to expand...
> 
> I have a Core i7 6700K with only the stock XMP profile applied on an Asus Z170 Pro Gaming. And this RAM I put the UEFI to default but that didn't change anything either. It must be some Vulkan problem.
> 
> Alright, Wolfenstein II is very new, so there are issues, sure. But the same thing is in Doom too, but only if I use Vulkan API. Doom lets you choose ogl as well, there are no issues. Look at the screenshots:
> 
> 
> 
> 
> Its the same scene, I took the vulkan screenshot with my phone as it did not show up on actual screenshots!
> 
> So is that a graphics chip problem or a vulkan problem? I played through all of Doom last year when my card was fairly new, it was not there, I'm 100% sure. And I did use Vulkan... So strange
Click to expand...

Nothing in your hardware suggests anything strange. Only one API causing the problem suggests a configuration issue somewhere in the system, affecting the way Vulkan uses your hardware.

The screenshots not picking up the weirdness suggests it may be coming from something in the chain after the signal has left the graphics card.

Does your monitor have any funky video-enhancement modes enabled, or overclocked refresh rates set? I have had an experience on my monitor where the automatic video-enhancement mode somehow got turned on; all of a sudden, anything I opened in Photoshop's Camera Raw had funky colours. The in-monitor "enhancement" modes can and do mess with video signals and sometimes cause unexpected problems.

Have you tried changing the method of attachment to your monitor (DVI to HDMI, or HDMI to DisplayPort, for example)?


----------



## gtbtk

Quote:


> Originally Posted by *LuckLess7*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Madmaxneo*
> 
> I may be going out on a limb here but is there any chance it could be your monitor reacting to using vulkan?
> 
> What monitor do you have?
> 
> 
> 
> I have an this rather generic 4k Monitor https://www.asus.com/de/Monitors/PB287Q/ and a rather expensive Samsung 4k TV. And a very good Full HD Asus monitor for editing. I launched the game on all three monitors in many configurations, and it turns out those artifacts only occur on Native 4K resolution. No matter the VSYNC mode, adaptive, on or off. When the resolution is lower, So for the 4k Displays 1440p or Full HD, I don't get these artifacts/glitches in neither Vulkan game.
> When the API is not Vulkan, I don't experience these issues either.
> 
> TL;DR: Vulkan @ 4K causes glitches, artifacts and ugly screentearing/lining.
> 
> I have tested DX12 games: no issues (Tomb Raider and Quantum Break.
Click to expand...

I posted my last message to you before I saw this one. Mentioning a TV suggests you are using HDMI. Earlier versions of HDMI only supported 4K at 30 Hz; HDMI 2.0 is good for 60 Hz. Maybe you are hitting HDMI bandwidth issues. Using a DisplayPort connection may solve the problem on the monitors. If you have a really old HDMI cable, the maker may have cut corners that cause problems with later versions of HDMI; swapping it out for a new one may help.
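A back-of-the-envelope check of those bandwidth limits (my own arithmetic; the 20% blanking overhead is an approximation of the real CTA-861 timings, and 8b/10b line coding adds more on the wire):

```python
def video_data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking=1.20):
    """Approximate uncompressed video data rate in Gbit/s.

    blanking=1.20 roughly accounts for the non-visible blanking intervals;
    exact values depend on the CTA-861 timing used.
    """
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

# 4K60 needs roughly 14 Gbit/s of video data: beyond HDMI 1.4's ~10.2 Gbit/s
# link, but within HDMI 2.0's 18 Gbit/s, which is why older HDMI ports
# capped 4K output at 30 Hz.
print(round(video_data_rate_gbps(3840, 2160, 60), 1))  # 14.3
print(round(video_data_rate_gbps(3840, 2160, 30), 1))  # 7.2
```

DisplayPort 1.2 carries about 17.28 Gbit/s of data, so it clears 4K60 comfortably, which is why switching connections is a reasonable test here.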


----------



## LuckLess7

Quote:


> Originally Posted by *gtbtk*
> 
> Does your monitor have any funky video-enhancement modes enabled, or overclocked refresh rates set? I have had an experience on my monitor where the automatic video-enhancement mode somehow got turned on; all of a sudden, anything I opened in Photoshop's Camera Raw had funky colours. The in-monitor "enhancement" modes can and do mess with video signals and sometimes cause unexpected problems.
> 
> Have you tried changing the method of attachment to your monitor (DVI to HDMI, or HDMI to DisplayPort, for example)?


The TV switches into Game Mode automatically when I launch a game, but that mode is actually designed to get rid of all those funky enhancements. The monitor does not apply any enhancements either, and the issue is visible on both displays in the same manner. The Asus monitor is connected via DisplayPort and the TV via HDMI, but it is HDMI 2.0, so it's perfectly capable of 4K @ 60 Hz.

Well, since the issue appeared after a driver update, I would like to install a driver version from November 2016. But then again, I'm afraid it might cause additional conflicts, so I'd better not do that. I seem to be the only one experiencing this particular kind of issue, though.


----------



## Blackfirehawk

The first 1070 Ti BIOSes are available:
https://www.techpowerup.com/vgabios/195938/palit-gtx1070ti-8192-171011

but I'm afraid to crossflash :/


----------



## gtbtk

Quote:


> Originally Posted by *LuckLess7*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Does your monitor have any funky video enhancement modes enabled or overclocked refresh rates set? I have had experience on my monitor where somehow the automatic video enhancement mode got turned on. All of a sudden if I used photoshop. Anything that I opened in Camera Raw had funky colours. The in monitor "enhancement" modes can and do mess with video signals and sometimes will cause unexpected problems.
> 
> Have you tried changing the method of attachment to you monitor (DVI to HDMI or HDMI to Displayport for example)?
> 
> 
> 
> The TV switches into Game Mode automatically when I launch a game, but I think it's actually designed to get rid of all these funky enhancements. The monitor, however, does not apply any enhancements and the issue is visible on both displays in the same manner. Asus Monitor is connected via Displayport, and the TV via HDMI, but it is HDMI 2.0, so it's perfectly capable of 4k @ 60 Hz.
> 
> Well, since the issue appeared after a driver update, I would like to install a driver version of November 2016. But then again I'm afraid it causes additional conflicts, so I'd better not do that. I seem to be the only one experiencing this particular kind of issue though.
Click to expand...

If it appeared after a driver update, roll back to the previous driver version and that should resolve your problem.


----------



## gtbtk

Quote:


> Originally Posted by *Blackfirehawk*
> 
> The first 1070 Ti BIOSes are available:
> https://www.techpowerup.com/vgabios/195938/palit-gtx1070ti-8192-171011
> 
> but I'm afraid to crossflash :/


I just downloaded the MSI version; unfortunately I don't have a working motherboard to run my 1070 in just now.

I have taken a look at the MSI 1070 Ti Gaming PCB and, as far as I can tell, it is identical to the 1070 Quick Silver board; the voltage controller is the same as on the 1070 boards. When I do get my card back up and running, I plan to try it out, if only to see whether the cores are locked by the BIOS or physically.


----------



## asdkj1740

Quote:


> Originally Posted by *Blackfirehawk*
> 
> The first 1070 Ti BIOSes are available:
> https://www.techpowerup.com/vgabios/195938/palit-gtx1070ti-8192-171011
> 
> but I'm afraid to crossflash :/


No point flashing a BIOS with a stock power limit.


----------



## BTCHSLP

Quote:


> Originally Posted by *Blackfirehawk*
> 
> The first 1070 Ti BIOSes are available:
> https://www.techpowerup.com/vgabios/195938/palit-gtx1070ti-8192-171011
> 
> but I'm afraid to crossflash :/


Same here... I'm a bit afraid of that, and I don't have a second graphics card to reflash the main card if it gets bricked.

https://www.techpowerup.com/vgabios/?architecture=&manufacturer=&model=GTX+1070+Ti&interface=&memType=&memSize=&since=


----------



## Dude970

Quote:


> Originally Posted by *gtbtk*
> 
> I just downloaded the MSI version; unfortunately I don't have a working motherboard to run my 1070 in just now.
> 
> I have taken a look at the MSI 1070 Ti Gaming PCB and, as far as I can tell, it is identical to the 1070 Quick Silver board; the voltage controller is the same as on the 1070 boards. When I do get my card back up and running, I plan to try it out, if only to see whether the cores are locked by the BIOS or physically.


I have an MSI GTX 1070 Gaming X. I'm curious what the Ti BIOS might do too. I didn't save the link for nvflash; can you post a link to the right NVflash version?


----------



## Madmaxneo

Quote:


> Originally Posted by *LuckLess7*
> 
> The TV switches into Game Mode automatically when I launch a game, but I think it's actually designed to get rid of all these funky enhancements. The monitor, however, does not apply any enhancements and the issue is visible on both displays in the same manner. Asus Monitor is connected via Displayport, and the TV via HDMI, but it is HDMI 2.0, so it's perfectly capable of 4k @ 60 Hz.
> 
> Well, since the issue appeared after a driver update, I would like to install a driver version of November 2016. But then again I'm afraid it causes additional conflicts, so I'd better not do that. I seem to be the only one experiencing this particular kind of issue though.


Have you tried completely uninstalling the video drivers and installing a fresh download directly from the Nvidia site?
If not, you could also try DDU, though there is a warning that you use it at your own risk if your version of Windows is newer than 1703.
Go here for DDU.


----------



## gtbtk

Quote:


> Originally Posted by *Dude970*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I just downloaded the MSI version. unfortunately I don't have a working motherboard to run my 1070 in just now.
> 
> I have taken a look at the MSI 1070TI gaming PCB and as far as I can tell it is identical to the 1070 Quicksilver board. Voltage controller and is identical to thew 1070 boards. When I do get my card back up and running I am planning to try it out if only to see if the cores are locked by bios or physically.
> 
> 
> 
> I have a MSI GTX 1070 GamingX. Im curious what the Ti Bios might do too. I didnt save the link for nvlash, can you post a link fo the right NVflash?
Click to expand...

I would advise caution if you do not have an iGPU or another graphics card available, in case the flash bricks your card. As the only thing that differs is the number of enabled cores on the same GPU, I do not think it will brick the card, but without trying it I don't know for sure.

Make sure you use GPU-Z to back up your original BIOS file before you try flashing the Ti BIOS.

Right now, we will only know whether the available NVflash versions will run on the card if we try. Incompatible versions just stop with an error and do not touch the card. The MSI PCB is physically identical to the one used for the 1070 Quick Silver, as far as I could tell from photos and a video, so I would think it should be OK.

You can get a recent version of NVflash by going to the Asus Strix 1070 support page and downloading their BIOS update. You can extract the files to a directory using 7-Zip or WinRAR; there is a recent version of NVflash in the archive, which I was using when I was cross-flashing my 1070. There may be more recent versions on techpowerup.com, but I have not been keeping track recently.


----------



## Dude970

Quote:


> Originally Posted by *gtbtk*
> 
> I would advise caution if you do not have an igpu or another graphics card available if the flash bricks your card. As the only think that is different is the number of available cores on the same model GPU, i do not think that it will brick the card but without trying it I don't know for sure.
> 
> Make sure that you use GPU-Z to make a backup of your original bios file before you try flashing the TI bios.
> 
> Right now. We will only know if available NVflash versions will run on the card if we try it. Incompatible versions will just stop with an error and not touch the card. The MSI PCB is physically identical to the one used for the 1070 quicksilver as far as I could tell from photos and a video so I would think it should be ok.
> 
> You can get a recent version of NVflash by going to the Asus Strix 1070 support page and download their bios update. You can extract the the files to a directory using 7zip or winrar. There is a recent version of NVflash in the archive that I was using when I was cross flashing my 1070. There may be more recent versions on techpowerup.com but I have not been keeping track recently


Thank you. I tried with NVflash 5.370 but it gave a GPU mismatch error.


----------



## BTCHSLP

Quote:


> Originally Posted by *Dude970*
> 
> Thank you. I tried with NVflash 5.370 but it gave GPU mismatch error.


"nvflash -4 -5 -6 *filename.rom*" maybe force the flash


----------



## Dude970

Quote:


> Originally Posted by *BTCHSLP*
> 
> "nvflash -4 -5 -6 *filename.rom*" maybe force the flash


I only tried -6, will give this a go soon


----------



## Dude970

Found NVflash 5.416.0; still getting the GPU mismatch error.


----------



## asdkj1740

Quote:


> Originally Posted by *Dude970*
> 
> Found NVflash 5.416.0 still get GPU mismatch error


Try the version in the "how to flash a different BIOS on your 1080 Ti" thread:
http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti

This kind of cross-flashing between different GTX models requires a special NVflash version; AFAIK some NVflash versions only support this kind of cross-flashing on Maxwell GPUs.


----------



## Dude970

Quote:


> Originally Posted by *asdkj1740*
> 
> try the version on the "how to flash bios on 1080ti" thread.
> http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti
> 
> this kind of cross flashing on different gtx models requires special nvflash version, afaik only maxwell gpus are supported on this kind of cross flashing on some nvflash version.


I did try that, that is version 5.370.0


----------



## gtbtk

The BIOS version determines the version of NVflash you need to use. When the Micron-bug BIOS for the 1070 was originally released, it needed a later version of NVflash than the one I had been using to cross-flash the release BIOS. Looks like we need to be patient; it will arrive before too much longer.

5.416 claims 1070 Ti support. Make sure that you use the x64 version.

Make sure that you run it as admin.

Disable the graphics card in Device Manager.

The command that used to work: nvflash -6 biosname.rom

The 1080 Ti BIOS thread is now talking about needing to turn write protection off before you can flash the card.
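For anyone scripting this, the sequence can be sketched as data, with the backup step first so there is always a known-good ROM to fall back on (a hypothetical helper, not an official tool; the --save and -6 switches are the ones discussed in this thread, but verify them against your NVflash version's own help output):

```python
def nvflash_cmds(rom_path, nvflash="nvflash64"):
    """Build the cautious cross-flash sequence: back up first, then flash.

    Step 1 saves the card's current BIOS as a fallback; step 2 writes the
    new ROM with -6, which overrides the PCI subsystem ID mismatch check.
    Run the commands from an elevated prompt with the card disabled in
    Device Manager, as described above.
    """
    return [
        [nvflash, "--save", "backup.rom"],  # keep the original BIOS first
        [nvflash, "-6", rom_path],          # force past the ID mismatch
    ]

# e.g.: for cmd in nvflash_cmds("1070ti.rom"): subprocess.run(cmd, check=True)
```

The backup step matters more than the flash itself: with backup.rom saved, a bad flash can usually be undone with the same tool and a second GPU or iGPU for display.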


----------



## Dude970

Quote:


> Originally Posted by *gtbtk*
> 
> the Bios version determines the version of nvflash that you need to use. when the Micron bug bios for the 1070 was originally released, it needed a later version of NVFLASH than the one I had been using for the release bios cross flashing. Looks like we need to be patient. It will arrive before too much longer.
> 
> 5.416 claims 1070TI support. Make sure that you use the x64 version
> 
> Make sure that you run as admin
> disable the graphics card in device manager
> the command used to work nvflash -6 biosname.rom
> 
> the 1080TI bios thread is now talking about needing to turn protection off before you can flash the card.


Thanks for all the advice. I tried all that and still get the GPU mismatch error.


----------



## gtbtk

Quote:


> Originally Posted by *gtbtk*
> 
> the Bios version determines the version of nvflash that you need to use. when the Micron bug bios for the 1070 was originally released, it needed a later version of NVFLASH than the one I had been using for the release bios cross flashing. Looks like we need to be patient. It will arrive before too much longer.
> 
> 5.416 claims 1070TI support.
> 
> Make sure that you run as admin
> 
> disable the graphics card in device manager
> 
> the command used to work nvflash -6 biosname.rom
> 
> the 1080TI bios thread is now talking about needing to turn protection off before you can flash the card.


Quote:


> Originally Posted by *Dude970*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> the Bios version determines the version of nvflash that you need to use. when the Micron bug bios for the 1070 was originally released, it needed a later version of NVFLASH than the one I had been using for the release bios cross flashing. Looks like we need to be patient. It will arrive before too much longer.
> 
> 5.416 claims 1070TI support. Make sure that you use the x64 version
> 
> Make sure that you run as admin
> disable the graphics card in device manager
> the command used to work nvflash -6 biosname.rom
> 
> the 1080TI bios thread is now talking about needing to turn protection off before you can flash the card.
> 
> 
> 
> Thanks for all the advice. Tried all that and still GPU mismatch error
Click to expand...

From memory, GPU Tweak from Asus also includes a copy of NVFlash as part of the install. The latest release came out 26 October with support for the Ti, so it may be worth checking that one as well. There are several different versions of NVFlash; the best one to look for is the OEM version. I don't know which one ships with GPU Tweak.
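For reference, the flashing steps mentioned in this thread boil down to two commands. This is a hedged sketch only, assembled as strings rather than executed: "nvflash64" and "biosname.rom" are placeholders (use the x64 OEM build and your actual ROM file), and it all assumes an elevated prompt with the card disabled in Device Manager first.

```python
# Sketch of the nvflash sequence described in this thread, built as command
# strings since the actual flash is hardware-specific.
# "nvflash64" and "biosname.rom" are placeholder names, not verified paths.
def flash_commands(rom):
    return [
        "nvflash64 --protectoff",  # newer BIOSes reportedly need protection off first
        "nvflash64 -6 " + rom,     # -6 overrides the board-ID mismatch check
    ]

for cmd in flash_commands("biosname.rom"):
    print(cmd)
```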


----------



## bahn

Finally upgraded my aging R9 380 with this


----------



## gtbtk

Quote:


> Originally Posted by *bahn*
> 
> Finally upgraded my aging R9 380 with this


Nice, welcome.

Are you pleased with the performance? Take a look back through this thread for 1070 overclocking tips if you are interested.

Let us know how you get on


----------



## nolive721

Quote:


> Originally Posted by *bahn*
> 
> Finally upgraded my aging R9 380 with this


Can you share what core/memory clocks you are getting out of the box, and whether you have OC'd the card already?

I bought one of those last month in Japan but was left a bit underwhelmed by the performance, even after OCing, running my triple 1080p monitor setup. I think it's because I didn't make enough use of the voltage/frequency curve feature that MSI AB offers to keep the core boost high.

So I moved to a GTX 1080 last week, which gave me a good 20% increase in performance, but it is kind of overkill now.

I know it sounds strange, but I might go back to a 1070 and do a better OCing job with it if prices go down after the introduction of the 1070 Ti here in Japan.


----------



## Falkentyne

gtbtk:
How many phases does this MSI laptop card have?
Is it similar to the desktop cards?

https://www.ebay.com/itm/NVIDIA-GTX-1070-N17E-G2-8GB-DDR5-MXM-3-0-Module-150W-Upgrade-Kit-MSI-AW/112144267554?_trkparms=aid%3D222007%26algo%3DSIM.MBE%26ao%3D2%26asc%3D48754%26meid%3Defc146a95af546d289cf74208ff9b3ac%26pid%3D100005%26rk%3D5%26rkt%3D6%26mehot%3Dpp%26sd%3D132217802860&_trksid=p2047675.c100005.m1851

From what you know, what is the maximum wattage this phase setup can pull successfully? 250W?


----------



## DavyGT

Quote:


> Originally Posted by *Dude970*
> 
> Thanks for all the advice. Tried all that and still GPU mismatch error


Have you tried using "nvflash64 --protectoff" before flashing?


----------



## mailto

Hello, can I flash my Asus 1070 Strix OC to a 1070 Ti BIOS, or is the Ti not better?


----------



## khanmein

Quote:


> Originally Posted by *mailto*
> 
> hello, can I update my asus 1070 strix oc on a 1070ti or it is not better ti?


You can, but are you into crazy overclocking benchmark scores? If not, better to stick with the stock vBIOS.


----------



## mailto

Thanks, khan.

How can I activate K-Boost, and which clocks and voltage would be best for the card?

In the Nvidia control panel, what settings do I have to enable to get the highest performance?


----------



## khanmein

Quote:


> Originally Posted by *mailto*
> 
> ty khan,.
> 
> how can I activate kboost, and which mhz, voltage would be best for the card?
> 
> in nvidia system control what settings do I have to enable to have the highest performance?


The easiest way is to install the Asus GPU Tweak II application & select the OC mode profile.


----------



## gtbtk

Quote:


> Originally Posted by *Falkentyne*
> 
> gtbtk:
> How many phases is this MSI laptop card?
> is it similar to desktop cards?
> 
> https://www.ebay.com/itm/NVIDIA-GTX-1070-N17E-G2-8GB-DDR5-MXM-3-0-Module-150W-Upgrade-Kit-MSI-AW/112144267554?_trkparms=aid%3D222007%26algo%3DSIM.MBE%26ao%3D2%26asc%3D48754%26meid%3Defc146a95af546d289cf74208ff9b3ac%26pid%3D100005%26rk%3D5%26rkt%3D6%26mehot%3Dpp%26sd%3D132217802860&_trksid=p2047675.c100005.m1851
> 
> from what you know, what is the maximum watts this phase setup can pull successfully? 250W?


I have no definitive answer; I have not played with the MXM flavor of 1070s before.

From what I can see there is no extra power connector, so I would think 250W is overly optimistic. My guess would be closer to 150W. If you can find out what model MOSFETs the card is using, we may be able to find a datasheet.


----------



## gtbtk

Quote:


> Originally Posted by *mailto*
> 
> hello, can I update my asus 1070 strix oc on a 1070ti or it is not better ti?


@Dude970 is attempting to cross flash but has not had any luck so far. I have noticed that an MSI Gaming 1070 Ti BIOS is a couple of bytes larger than an MSI Gaming/Quick Silver 1070 BIOS, even though they both run on the same PCB. It seems Nvidia are maybe not quite as dumb as we would like them to be, and have added a little extra security.


----------



## gtbtk

Quote:


> Originally Posted by *mailto*
> 
> ty khan,.
> 
> how can I activate kboost, and which mhz, voltage would be best for the card?
> 
> in nvidia system control what settings do I have to enable to have the highest performance?


K-Boost is an EVGA feature. If you have an EVGA card you can just install Precision XOC.

If you have another brand of card, you can hack Afterburner to enable it. You need to get a skin for EVGA Precision 16 that has the K-Boost button (the older version, before XOC; its skin format is the same as Afterburner's, while the new version uses different file formats). Install that skin file as a custom skin for Afterburner and it will let you use the K-Boost button to enable K-Boost functionality.

All cards are different. With my card, the best performance I got was by using the curve to OC, running the core at 2088 and memory at +650. +100 on the voltage slider will get you the absolute highest frequency, but leaving the voltage at 0 lets the card run about 6°C cooler, so it is easier to hold stable clocks.

Enable high performance mode in the NV control panel.


----------



## bahn

Quote:


> Originally Posted by *nolive721*
> 
> can you share what Core/memory clocks you are getting out of the box? and if you have OCed the card already?
> 
> I bought one of those lats month in Japan but was left bit underwhelmed by the perfs even after OCing to run my triple 1080p monitor set-up but I think its because I didnt use enough the voltage vs frequency feature that MSI AB offers to keep core boost high
> 
> so I moved to a GTX 1080 since last week which gave me a good 20% increase in performance but it is kind of overkill now
> 
> I know it sounds strange but I might go back to a 1070 and do a better OCing job with it if the prices go down after the introduction of 1070Ti here in Japan


Out of the box. Haven't OC'd yet.

Quote:


> Originally Posted by *gtbtk*
> 
> Nice. welcome.
> 
> Are you pleased with the performance? Take a look back through this thread for 1070 overclocking tips if you are interested.
> 
> Let us know how you get on


Thanks.

I tested it on Resident Evil 7 and Rise of the Tomb Raider. Didn't notice much improvement, although I did increase the graphics settings. I ran a Unigine benchmark but I forgot where I saved the screenshot of the result.


----------



## nolive721

Thank you.
Do you monitor the actual boost clock during benchmark or gaming sessions?
Mine was poor, not going higher than 1988MHz if memory serves.

Speaking of memory, Galax uses Micron chips and mine didn't OC well.

Anyway, happy if you can share.


----------



## bahn

Noob question: is this right?


----------



## microchidism

What is that picture from?

Anyhow, your GPU clocks itself based on workload; it is normal for clocks to be low (to save power, etc.) when you are not doing anything demanding.


----------



## KillerBee33

Can someone tell me what I can use to get SLI working in GTA V, please? A single 1070 runs better than two. I'd heard SLI blows, but didn't think it was this bad...


----------



## gtbtk

Quote:


> Originally Posted by *bahn*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nolive721*
> 
> can you share what Core/memory clocks you are getting out of the box? and if you have OCed the card already?
> 
> I bought one of those lats month in Japan but was left bit underwhelmed by the perfs even after OCing to run my triple 1080p monitor set-up but I think its because I didnt use enough the voltage vs frequency feature that MSI AB offers to keep core boost high
> 
> so I moved to a GTX 1080 since last week which gave me a good 20% increase in performance but it is kind of overkill now
> 
> I know it sounds strange but I might go back to a 1070 and do a better OCing job with it if the prices go down after the introduction of 1070Ti here in Japan
> 
> 
> 
> Out of the box. Havent OCed yet
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Nice. welcome.
> 
> Are you pleased with the performance? Take a look back through this thread for 1070 overclocking tips if you are interested.
> 
> Let us know how you get on
> 
> Click to expand...
> 
> Thanks.
> 
> I tested it on Resident Evil 7 and Rise of the Tomb Raider. Didnt notice much improvement although I increased the graphic settings. I did a Unigine benchmark but I forgot where I saved the screenshot of the result
Click to expand...

These cards love memory overclocks.

Core clock increases are good, but don't get hung up on core clocks alone. In most situations, if you have to choose between max core and max memory clocks, pick max memory. I can run mine at 2126MHz stable, but my best Fire Strike graphics scores were set at 2088. I have managed about 21,500 points on my Gaming X with a Gaming Z BIOS installed.
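The "pick the best score, not the highest clock" point can be sketched as a tiny search over benchmark runs. The numbers below are invented for illustration, not real Fire Strike results:

```python
# Illustrative only: choose the settings that maximize a benchmark score
# rather than the raw core clock. The scores dict stands in for real runs.
def best_settings(scores):
    """scores maps (core_mhz, mem_offset) -> graphics score."""
    return max(scores, key=scores.get)

runs = {
    (2126, 500): 20900,  # highest core clock, not the best score
    (2088, 650): 21500,  # lower core, higher memory: best result
    (2050, 700): 21100,
}
print(best_settings(runs))  # (2088, 650)
```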


----------



## gtbtk

Quote:


> Originally Posted by *KillerBee33*
> 
> Can someone tell me what can i use to work SLI into GTA V plz. Single 1070 runs better than 2 . I've heard SLI blows but didn't think it was this much...


Nvidia profile inspector

https://pcgamingwiki.com/wiki/NVIDIA_Profile_Inspector

Not the greatest video, but he does show how it is done.


----------



## gtbtk

Quote:


> Originally Posted by *bahn*
> 
> noob question. is this right?


Those are idle clocks; they will increase under load.

You have the Nvidia Control Panel power setting on "Optimal power". If you set the driver to "Prefer maximum performance" it will run the high clocks at idle.


----------



## gtbtk

Quote:


> Originally Posted by *nolive721*
> 
> Thank you
> Do you monitor actual boost during benchmark or gaming sessions?
> Mine was crap not going higher than 1988mhz if my memory serves well
> 
> Talking about memory galax uses micron brand and it didn't ox well on my card
> 
> Anyway happy if you can share


If you are at stock, 1988 is about right. The high-end factory-overclocked cards will boost to just over 2000 out of the box.

If you install MSI Afterburner, as well as being the tool to overclock the card, it has an on-screen display that will overlay the game or benchmark so you can keep an eye on your boost clocks. You can set up what it displays in the settings menu.

Check GPU-Z and make sure the BIOS version on your card is 86.04.50.00.xx. If it is 86.04.26.x.x or 86.04.3b.x.x, you need a BIOS update that resolves a bug affecting Micron memory.

Otherwise, you may see some benefit if you go into your motherboard BIOS and increase the VCCIO voltage slightly.
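A minimal sketch of that VBIOS version check. The affected/fixed prefixes are taken from the post above, not an official compatibility list, so treat them as an assumption:

```python
# Hedged sketch: flag a 1070 VBIOS that predates the Micron-memory fix.
# The prefix list below is drawn from this thread, not NVIDIA documentation.
AFFECTED_PREFIXES = ("86.04.26.", "86.04.3B.")  # pre-fix releases

def needs_micron_fix(vbios_version):
    """Return True if the GPU-Z-reported version matches a pre-fix release."""
    return vbios_version.strip().upper().startswith(AFFECTED_PREFIXES)

print(needs_micron_fix("86.04.26.00.1A"))  # True: update needed
print(needs_micron_fix("86.04.50.00.70"))  # False: already patched
```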


----------



## skupples

Anyone running SLI 1070s on a Predator?

Getting a vibe that I should cash in for a 1080 Ti (or whatever's next) due to memory bandwidth limitations, and SLI just feels dead. Like they don't even care anymore.


----------



## zipzop

Quote:


> Originally Posted by *gtbtk*
> 
> otherwise, you may see some benefits if you open your bios and increase VCCIO voltage slightly


Does everyone have VCCIO voltage control in their BIOS, though? Or is it called something different by other manufacturers? Some say VCCIO is rolled into the system agent voltage (CPU SA voltage) on MSI Z97 motherboards but isn't directly controllable. Any ideas?


----------



## nolive721

Quote:


> Originally Posted by *gtbtk*
> 
> If you are at stock, 1988 is about right. the high end factory overclocked cards will boost to just over 2000 out of the box.
> 
> If you install MSI afterburner, as well as being the tool to overclock the card, it has an on screen display that will overlay over the game or benchmark and you can keep an eye on your boost clocks. you can set it up in the settings menu to display what you like.
> 
> Check GPU-Z and make sure that the bios version on your card is 86.04.50.00.xx if it is 86.04.26.x.x or 86.04.3b.x.x you need to get a bios update and it resolves a bug that effects micron memory.
> 
> otherwise, you may see some benefits if you open your bios and increase VCCIO voltage slightly


As I mentioned, my first experience with Pascal was this Galax 1070, and I had been so brainwashed from spending too much time OCing and BIOS editing my Polaris RX 480 (lol) that I missed some of the voltage/frequency curve features in MSI AB, for example.

I'm pretty sure I could have clocked the core higher and kept it stable, like I am doing now with my current GTX 1080.

I had noticed the VRAM was from Micron, as opposed to the Samsung chips I believe MSI uses on the Gaming X, and read that it was not such a good OCer.

If for whatever reason I decide to go back to a 1070 with Micron VRAM, can you explain what this bug is and what the BIOS update fixes? Just curious, because I confess I like the looks of the white Galax in my white-and-red rig more than the ugly Zotac AMP Extreme yellow/gun-gray backplate...


----------



## KillerBee33

Quote:


> Originally Posted by *gtbtk*
> 
> Nvidia profile inspector
> 
> https://pcgamingwiki.com/wiki/NVIDIA_Profile_Inspector
> 
> not the greatest vid but he does show how it is done


Tried quite a few things and FPS is still flying between 120 and 40. BTW, do the presets in Profile Inspector need to be tweaked, or left untouched?


----------



## gtbtk

Quote:


> Originally Posted by *zipzop*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> otherwise, you may see some benefits if you open your bios and increase VCCIO voltage slightly
> 
> 
> 
> Does everyone have VCCIO voltage control in their BIOS though? Or is it called something different in other manufacturer's? Some say VCCIO is integrated into system agent voltage(CPU SA voltage) on MSI Z97 motherboards but doesn't directly control it. Any ideas?
Click to expand...

Haswell/Z97 added an integrated voltage controller and changed things around a bit. You might try the SA (system agent) voltage: small increases, then test. The recommended overclocking range is 1.15-1.25V, but the lower you can keep any voltage, the better.

You seem to be at 1.05 now, so increase one step at a time and see if it improves things.
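The "one step at a time" approach above can be sketched as a simple search. `is_stable` is a placeholder for an actual stress test, and the window and step size are assumptions based on the figures in this post:

```python
# Sketch only: raise SA voltage in small increments inside a safe window and
# stop at the first value that passes a (placeholder) stability test.
def find_min_stable_sa(is_stable, start=1.05, step=0.01, vmax=1.25):
    v = start
    while v <= vmax:
        if is_stable(v):
            return v
        v = round(v + step, 3)  # avoid float drift in the step values
    return None  # never stabilized inside the safe window

# Pretend the system needs at least 1.10 V to hold the overclock.
print(find_min_stable_sa(lambda v: v >= 1.10))  # 1.1
```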


----------



## gtbtk

Quote:


> Originally Posted by *KillerBee33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Nvidia profile inspector
> 
> https://pcgamingwiki.com/wiki/NVIDIA_Profile_Inspector
> 
> not the greatest vid but he does show how it is done
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tried quite a few things and still Flying 120-40 FPS . BTW. are Presets in Profile Inspector need to be tweaked or untouched?
Click to expand...

The game presets in the drop-down box? Yes, I believe so. Be aware that GTA is really a bit funky; I know it is a common problem. You could also try disabling the shader cache for GTA in the Nvidia control panel.


----------



## KillerBee33

Quote:


> Originally Posted by *gtbtk*
> 
> the game presets in the drop down box? yes i believe so. Be aware that the GTA is a really a bit funky. I know it is a common problem. You could also try disabling the shader cache for GTA in the Nvidia control panel.


Well, "live and learn". What's messed up is that a single FPS drop in SLI feels like a frame skip, and it's annoying as hell. Thanks for the help.

I'll keep trying, but I already regret getting this so-called beast.


----------



## SavantStrike

Quote:


> Originally Posted by *skupples*
> 
> Nyone running sli 1070s on a predator?
> 
> Getting a vibe that i should cash in fir a 1080ti (or whats next) due to memory bandwidth limitations, and SLi just feels dead. Like they dont even care anymore.


I wouldn't recommend SLI for anything less than a pair of 1080s. Scaling isn't great and some titles don't even support it.

I don't have data to confirm this, but given the shift to larger textures over time and the rumored 16GB framebuffer for Volta, the 8GB of VRAM that the 1070 (and for that matter the 1080) has could make the horsepower of SLI irrelevant if devs start using all of it.

Which 1070s are you using, out of curiosity?


----------



## skupples

Yeah, the scaling is garbage, and the Predator is making everyone I've shown it to sick (including me) due to a weird random backlight flicker issue in games. Also, flicker aside, the best viewing distance is like 3 feet.

It's all going back to Amazon, methinks. Get a 30-something-inch 4K and actually see what's going on in GPU town.


----------



## SavantStrike

Quote:


> Originally Posted by *skupples*
> 
> Yeah, the scaling is garbage, n the predator is making everyone ive showed it to sick (including me) due to a weird random back light flicker issue in games. Also, besides the random flicker, best viewing distance is like 3 feet.
> 
> Its all going back to amazon me thinks. Get a 30some 4k, n actually see whats going on in gpu town.


You water cool, right?

If you can swing it with a full-size desktop (or heck, even some franken-ITX build), the EK MSI Gaming X is a lot cheaper than an SLI gaming laptop (although a heck of a lot less portable).

That screen sounds amazing as a torture device. I wonder if Acer plans to make VR headsets. Strap something like that on somebody, tell them it's a carnival ride, and you'd make a fortune. They could call it the Acer Pukador.


----------



## zipper17

I really want to try the tips from @gtbtk about the VCCIO trick to push the GPU overclock further. I have been hoping it works on mine.

Still stuck at 20.9K FS. I can get 21K+, but then it tends to crash or show memory artifacts.


----------



## mailto

Thanks, bro.
At 2088MHz I have a core offset of +55, power limit at 120, and memory clock at +500. No crash! I can go to +150 on the core max!
For K-Boost I used MSI Afterburner with Ctrl+F and adjusted the curve to 1150mV. Is that right?


----------



## Falkentyne

Quote:


> Originally Posted by *zipper17*
> 
> I really want to try again tips from @gtbk about vccio tricks to push the overclock GPU further
> 
> 
> 
> 
> 
> 
> 
> . I have been hoping it works on mine
> 
> 
> 
> 
> 
> 
> 
> (
> 
> still stuck at 20,9K FS
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can get 21k+ but it rather would crash or get memory artifacts.


Firestrike total score? Or graphics score?


----------



## VeauX

Question for you guys: I've been looking to upgrade my GTX 1070 (Gigabyte Gaming 1) to a GTX 1080. I'm playing at 1440p on a 60Hz monitor. Someone is offering to swap his GTX 1080 FE for my card and asks for 75 USD to even out the values. Both our cards are around 6 months old.

Do you think it is a good deal?


----------



## SavantStrike

Quote:


> Originally Posted by *VeauX*
> 
> Question for you guys, I've been looking to upgrade my GTX 1070 (Gigabyte Gaming 1) for a GTX 1080. I'm playing at 1440p on a 60hz monitor. Someone offers me to swap his GTX 1080 FE with my card and asks for 75 usd to even values. Both our cards are around 6 months old.
> 
> Do you think it is a good deal?


Talk him down to 50 bucks and it'll be a fantastic deal. As it is, it's a fair deal and should give you a healthy bump in performance.


----------



## gtbtk

Quote:


> Originally Posted by *KillerBee33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> the game presets in the drop down box? yes i believe so. Be aware that the GTA is a really a bit funky. I know it is a common problem. You could also try disabling the shader cache for GTA in the Nvidia control panel.
> 
> 
> 
> Well "Live & Learn" . Whats messed up is a single FPS drop in SLI feels like a Frame Skip and its annoying as hell. Thanx for the help
> 
> 
> 
> 
> 
> 
> 
> i'll keep trying but already regret getting this so called Beast
Click to expand...

I know you have a laptop, but didn't I suggest a 1080 Ti instead of SLI 1070s because of the funkiness of SLI about a month or so ago?

Now that you have it: GTA is only a single game, and a single 1070 runs it at a respectable rate anyway, so I would not worry about it too much. I know stutter problems have been solved by disabling the shader cache on other systems, but that was not specifically an SLI fix; it also helped with stutters on single-card systems.

Other games work OK out of the box. Some of the ones that don't can be tweaked and made to work. Remember SLI has four different modes it can run in, which you can change in Nvidia Inspector. The best mode depends on the game engine and the way the game is coded. Did you try each of the different modes?


----------



## VeauX

Quote:


> Originally Posted by *SavantStrike*
> 
> Talk him down to 50 bucks and it'll be a fantastic deal. As it is it's a fair deal and should give you a healthy bump in performance.


Isn't a GTX 1080 overkill for what I'm using it for? (1440p)


----------



## KillerBee33

Quote:


> Originally Posted by *gtbtk*
> 
> I know you have a laptop but didn't I suggest a 1080TI instead of sli 1070 because of the funkyness of sli about 1 month or so ago?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now that you have it, GTA is only a single game and a single 1070 lets it run at a respectable rate anyway so I would not worry about it too much. I know that stutter problems have been solved by disabling the shader cache in other systems but that was not specifically an SLI fix, that also helped with stutters in single card systems as well.
> 
> Other games work OK out of the box. Some of the ones that don't can be tweaked and made work. Remember SLI has four different modes it can run in that you can change in the nvidia inspector. The best mode to use depends on the game engine and the way the card is coded. Did you try each of the different modes?


Well, as I said before, I have a Titan XP build locked in storage, and the only reason I got this laptop is to be able to move it around in a single backpack. I checked a bunch of games and most work just fine with SLI and run at 4K. GTA V is one of my favorites and it just pissed me off.

Also, I changed options in the NVIDIA Control Panel and played with Profile Inspector, but I can't find any modes in Nvidia Inspector other than performance levels 0-4, and mine is set to 4.


----------



## skupples

Quote:


> Originally Posted by *SavantStrike*
> 
> You water cool right?
> 
> If you can swing it with a full size desktop (or heck even some franken ITX build), the EK MSI Gaming X is a lot cheaper then a SLI gaming laptop (although a heck of a lot less portable).
> 
> That screen sounds amazing as a torture device. I wonder if Acer plans to make VR headsets. Strap something like that on somebody and tell them it's a carnival ride and you'd make a fortune. They could call it the acer pukador.


Yeah, this is in my STH10; motherboard/chip/memory are under water at the moment.

I have a Lenovo Y70 with a 970M, a T460, and a 2017 Shield Pro TV, so I should be good on a laptop (as far as travel gaming goes) for a while.

And yeah, it's a torture device. It's just amazing that it's brighter than my old 3x 1080p VA panels... It's enjoyable when I attach my Corsair keyboard-and-mouse lap tray or use a controller, but working from my standard keyboard tray is annoying. I've already requested a return label, though I will continue to troubleshoot it this weekend before actually sending it back. I love the idea, but I should've done my homework, and I don't even see G-Sync doing a damn thing, so IDK. (Yes, I enabled it in the control panel.)

I also just realized it's an Acer product, not an Asus product. LOL


----------



## gtbtk

Quote:


> Originally Posted by *VeauX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SavantStrike*
> 
> Talk him down to 50 bucks and it'll be a fantastic deal. As it is it's a fair deal and should give you a healthy bump in performance.
> 
> 
> 
> A GTX 1080 is not overkill for what I'm using it for? (1440p)
Click to expand...

A 1080 is perfect if you can get one at a good price. It is not quite enough for 4K; you need a Ti for that in the majority of games. The more powerful the card, the longer before obsolescence.


----------



## gtbtk

Quote:


> Originally Posted by *KillerBee33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I know you have a laptop but didn't I suggest a 1080TI instead of sli 1070 because of the funkyness of sli about 1 month or so ago?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now that you have it, GTA is only a single game and a single 1070 lets it run at a respectable rate anyway so I would not worry about it too much. I know that stutter problems have been solved by disabling the shader cache in other systems but that was not specifically an SLI fix, that also helped with stutters in single card systems as well.
> 
> Other games work OK out of the box. Some of the ones that don't can be tweaked and made work. Remember SLI has four different modes it can run in that you can change in the nvidia inspector. The best mode to use depends on the game engine and the way the card is coded. Did you try each of the different modes?
> 
> 
> 
> Well, as i said before i have a TitanXP build locked in storage and the only reason i got this Laptop is to be able to move it around in a single Backpack. I checked bunch of games and most work just fine with SLI and run @ 4K , GTAV is one of the favorites and it just pissed me off
> 
> 
> 
> 
> 
> 
> 
> . Also changed options in NVIDIA Control Panel and played with Profile Inspector but i cant find any Modes in Nvidia Inspector other than Performance level 0-4 and mine is set to 4
Click to expand...

I understand the reasons why a laptop is desirable. I have one for the same reasons, which is why I prefaced my answer with the acknowledgement that it is a laptop.

The thing is, people recommend a single card as better for exactly the reasons you are seeing for yourself now. I think with SLI you have to accept that it will not always work, but you get the benefit of more power most of the time.

Nvidia Profile Inspector is different software from the Nvidia Control Panel; the Inspector shares some settings with it, but what it gives you is direct access to the Nvidia driver. There is an option called SLI compatibility bits somewhere near the top of the list. Try the Assassin's Creed or Dark Souls options in that setting.


----------



## skupples

IDK why Nvidia doesn't put more effort into SLI functionality. It still benches great, but in-game performance gains seem to drop with each generation.

This is the second generation since the 480 where I've skipped buying two flagships; the last one was the 9xx series.

I'm returning my second 1070 and selling the first one in exchange for a used 1080 Ti, which will sit all by its lonesome due to how pathetic SLI has become.


----------



## zipper17

Quote:


> Originally Posted by *Falkentyne*
> 
> Firestrike total score? Or graphics score?


GS or Graphics Scores


----------



## Loodeelow

My first post here.

I am so glad to finally register on this forum. First of all, thanks to all of you guys from across the globe for making it possible to overclock and hunt for performance with your advice and competence, some a little more and some a little less. English is not my native resolution, so apologies in advance for the grammar and blablabla.

Following instructions in this thread I managed, with AB tweaks on my 1070 FE on the stock BIOS, to get a stable 1940-1988 with Samsung memory at +700. With all 5 profiles active in AB, I manage the temps by just lowering the memory clock when it feels too hot and setting the higher frequency otherwise. My card has been running like this for the last year and a half or two (I can't even remember; time is but a window), very hot by this thread's standards: 75-82/83/84°C. On my last Ctrl+F curve at 925mV the temp maxed at 78. (I am close to the central heating radiator in my room, so in winter it is hot, and I live on the top floor of my building and hate AC, so in summer it is hot; it is always hot in here.) No problems at all. My rig is a Z170 with a non-K 6700 at 4.5 and 16GB of stock 2666 boosted to 3200. I play Dota and CS:GO mainly, at 1440p native 144Hz, easily, if you don't mind the temps.

I was having an issue before with my PSU and some crackling noise, so I just underclocked the card to solve it. Then I changed my PSU and was able to push my FE to its max potential. My conclusion is that everybody should buy an FE a couple of months after the release date, because you save money and the difference from other cards is marginal. If 100 euros or dollars is worth 2100MHz instead of 1950 to you, then go for it.

And water cooling: you get your temps low, and then what? You sit firmly at 2100 while I drop maybe -12/-24/-36 from 2000. Another 50 euros or dollars, for what, like 3fps? 150 in total... maybe for a 1080 Ti.

The biggest nonsense for me is 3DMark or whatever the exact name of those programs is: I need to give someone money to download a program that shows I can clock my card to 2600 and you can't? And where is the point where you actually start a favorite game and just play it? Who gives a .... about your GS and your big glassed shiny case. Just a little observation. Same with the trend that my memory kit needs to glow and my case is built so anybody can see the full spectrum of colors. C'mon. To be honest, my case is covered in dust. I know this is a little bit of radical thinking for the majority here, but it would be boring if we all thought the same.

And yeah, what happened with that Ti BIOS? That is the only BIOS I would use on my FE, if it is possible and if it can actually add some performance to a regular 1070.

One more time, sincere apologies for the grammar; I don't want to insult anybody here. Love ya guys, keep up the good work!


----------



## khanmein

Quote:


> Originally Posted by *VeauX*
> 
> A GTX 1080 is not overkill for what I'm using it for? (1440p)


Sort of for 1440p @ 60 FPS, but not for high refresh rate.


----------



## BTCHSLP

The EK waterblock backplate fits without any modifications on the original FE heatsink.

...for those who want to replace the boring OEM backplate and cool the backside of the VRM.


----------



## BTCHSLP

My EVGA GTX 1070 FE is currently flashed with the EVGA GTX 1070 Hybrid BIOS; both cards have the reference PCB.
It is ~90MHz more than the stock FE BIOS.
The GPU clock is at 1595MHz and the card will boost up to 1987-2000MHz.

The GPU clock of an ASUS GeForce GTX 1070 8GB Dragon TOP is at 1670MHz, but I'm a bit too scared to do that crossflash.

Is there another recommended BIOS which boosts higher than 2000MHz?


----------



## KillerBee33

Quote:


> Originally Posted by *gtbtk*
> 
> I understand the reasons why a laptop is desirable. I have one for the same reasons and the reason why I prefaced my answer with the acknowledgement that it is a laptop.
> 
> The thing is, people are recommending a single card as being better for just the reasons that you are seeing for yourself now. I think that with SLI, you have to accept that it will not always work but you are getting the benefit of more power most of the time.
> 
> the nvidia profile inspector software is different to the nvidia control panel, the inspector panel has some shared setting. what it gives you is direct access to the Nvidia driver. there is an option called sli compatibility bits somewhere in the list. It was near the top. Assasin's creed or dark souls options in that setting


Well, so far it's a single game that I have to run @ 1080p on a single 1070, "GTAV". I think I'll live








But check what this thing can do...


Spoiler: Warning: Spoiler!






https://www.3dmark.com/3dm/23219428


----------



## gtbtk

Quote:


> Originally Posted by *KillerBee33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I understand the reasons why a laptop is desirable. I have one for the same reasons and the reason why I prefaced my answer with the acknowledgement that it is a laptop.
> 
> The thing is, people are recommending a single card as being better for just the reasons that you are seeing for yourself now. I think that with SLI, you have to accept that it will not always work but you are getting the benefit of more power most of the time.
> 
> the nvidia profile inspector software is different to the nvidia control panel, the inspector panel has some shared setting. what it gives you is direct access to the Nvidia driver. there is an option called sli compatibility bits somewhere in the list. It was near the top. Assasin's creed or dark souls options in that setting
> 
> 
> 
> Well, so far its a single game that i have to run @ 1080p on a Single 1070 , "GTAV" i think i'll live
> 
> 
> 
> 
> 
> 
> 
> 
> But check what this thing can do...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> https://www.3dmark.com/3dm/23219428
Click to expand...

nice


----------



## gtbtk

Quote:


> Originally Posted by *BTCHSLP*
> 
> My EVGA GTX 1070 FE is actually flashed with the EVGA GTX 1070 Hybrid-BIOS - both cards got the reference PCB.
> It is ~90 Mhz more than the Stock FE-BIOS.
> The GPU clock is at 1595 Mhz and the card will boost up to 1987-2000 Mhz.
> 
> The GPU clock of a ASUS GeForce GTX 1070 8GB DRAGON TOP is at 1670 Mhz, but i'm a bit to scared to do this crossflash.
> 
> Is there an another recommend BIOS which boost higher then 2000 Mhz ?


The EVGA bioses have a tendency to hit the power limits even at 1080p and then start throttling the card back. It is handy if you want to play with K-Boost or the auto oc utility in precision XOC though.

The Dragon BIOS may be a little over the top given the VRM and power delivery on your reference board. I never had much joy with any of the 1670 BIOSes on my MSI Gaming; Palit and Gigabyte have versions as well as the Dragon. The Gigabyte BIOS makes DisplayPort 1 stop working, as it tries to redirect the signal to a rear HDMI port

The Asus Strix OC BIOS is, in my opinion, a better BIOS, as the power delivery curve is not set so low. It is an 8-pin, 200 W card, and it typically doesn't start hitting power limits until you get to a 1440p load. It is a 1633 MHz card, from memory.


----------



## BTCHSLP

Quote:


> Originally Posted by *gtbtk*
> 
> The EVGA bioses have a tendency to hit the power limits even at 1080p and then start throttling the card back. It is handy if you want to play with K-Boost or the auto oc utility in precision XOC though.
> 
> The Dragon bios may be a little over the top given your VRM and power supply on the reference board. I never had much joy with any of the 1670 bios on my msi gaming. Palit and Gigabyte have versions as well as the dragon. The giga bios makes display port one stop working as it tries to redirect the signal to a rear HDMI port
> 
> The Asus Strix OC bios is, in my opinion, a better bios as the power delivery curve is not set so low, It is an 8 pin 200w card and it doesn't start hitting power limits typically until you get to a 1440p load. It is a 1633Mhz card from memory.


Thank you !








I don't really like the OC software...

Do you think I can flash the Strix OC BIOS onto my FE without any problems?
Is this the correct BIOS?
https://www.techpowerup.com/vgabios/192133/asus-gtx1070-8192-161026


----------



## gtbtk

Quote:


> Originally Posted by *BTCHSLP*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The EVGA bioses have a tendency to hit the power limits even at 1080p and then start throttling the card back. It is handy if you want to play with K-Boost or the auto oc utility in precision XOC though.
> 
> The Dragon bios may be a little over the top given your VRM and power supply on the reference board. I never had much joy with any of the 1670 bios on my msi gaming. Palit and Gigabyte have versions as well as the dragon. The giga bios makes display port one stop working as it tries to redirect the signal to a rear HDMI port
> 
> The Asus Strix OC bios is, in my opinion, a better bios as the power delivery curve is not set so low, It is an 8 pin 200w card and it doesn't start hitting power limits typically until you get to a 1440p load. It is a 1633Mhz card from memory.
> 
> 
> 
> Thank you !
> 
> 
> 
> 
> 
> 
> 
> 
> I do not really like the oc-softwares ...
> 
> Do you think i can flash the strix oc bios without any problems on my FE ?
> Is that the correct bios ?
> https://www.techpowerup.com/vgabios/192133/asus-gtx1070-8192-161026
Click to expand...

Yes, that's the one. It should allow you to boot with graphics. The important thing is to make sure both cards use the same voltage controller; the Kingpin and HOF cards use different controllers and their BIOSes won't work on most cards.

Cross flashing is a bit of a lottery in terms of how well the BIOS settings suit the hardware. I have an MSI Gaming X and the Asus BIOS works well; I have never tried it on a reference card. The only way you will know how well they match is to try it. The non-OC version may also be worth looking at if this one turns out to be a bit over the top.

The easiest way to get a working nvflash is to go to the Strix page on the Asus site and download the 1070 BIOS update utility. It is actually an executable archive that you can unzip: use 7-Zip or WinRAR to open the exe file and extract the contents. A working version of nvflash64 is inside the update utility's folder structure. All the BIOS versions for all the Asus cards are also in there, but they use Asus's internal names, so identifying the correct one for a particular model is a challenge.
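For reference, the flashing sequence itself is short. Here's a minimal sketch, assuming nvflash64 has been extracted as described above; the ROM file names are placeholders, and the `-6` switch (overriding the PCI subsystem-ID mismatch that a cross-vendor flash triggers) should be double-checked against your own nvflash build's help output:

```shell
# Run from an elevated command prompt in the folder holding nvflash64.
# ROM file names below are placeholders, not real BIOS files.

# 1. Always save the current BIOS first so you can flash back later.
nvflash64 --save fe_backup.rom

# 2. Some cards need write protection disabled before flashing.
nvflash64 --protectoff

# 3. Flash the downloaded BIOS; -6 overrides the subsystem-ID
#    mismatch check that a crossflash between vendors will trigger.
nvflash64 -6 strix_oc.rom

# 4. If anything misbehaves afterwards, restore the original.
nvflash64 -6 fe_backup.rom
```

Reboot after each flash so the new BIOS actually takes effect before you judge the result.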


----------



## VeauX

Quote:


> Originally Posted by *SavantStrike*
> 
> Talk him down to 50 bucks and it'll be a fantastic deal. As it is it's a fair deal and should give you a healthy bump in performance.


So I was able to get that for 100 bucks. Haven't had a chance to test much, but at first blush that thing is freaking loud. I'll go bother the guys in the 1080 club to see how to tweak it.

Thanks for the feedback!


----------



## skupples

I'm glad I'm returning this 1070, as something's whack. It'll only ever clock into the 1600s before artifacting... my other card does 2100 all day.


----------



## gtbtk

Quote:


> Originally Posted by *skupples*
> 
> I'm glad I'm returning this 1070, as something's wack. it'll only ever clock to the 1600s before artifacting... my other card does 2100 all day.


When the Micron memory bug was a thing, we were seeing similar sorts of whack behavior, except with memory.

If you can't get the clock past 1600, you might try using nvflash to reflash the default BIOS before you send it back. It could be a slightly corrupted BIOS flash causing the problems. If that doesn't fix it, RMA it


----------



## SavantStrike

Quote:


> Originally Posted by *VeauX*
> 
> So I was able to get that for 100bucks. Didn't had a chance to test much but at first, that thing is freaking loud. I'l go bothering the guys in the 1080 club to see how to tweak it.
> 
> Thanks for the feedback!


Yeah, the blowers are loud. A DIY hybrid is the quietest option and can be done with a few M3 screws and a copper shim if you've got an AIO lying around. Otherwise, a repaste with good TIM can net you a few degrees, which might let you drop the fan speed a bit.


----------



## bahn

Having trouble with my PC since installing my 1070. The first time I booted after installing, there was no display. I plugged in my old R9 380; still no display. After several restarts it finally came on.
This morning I was playing Civ 6 and the game froze. I restarted the PC and continued playing. It froze again. I restarted again and there was no display this time. I unplugged and removed my 1070. It took several restarts for it to boot up.

I suspect it's the HDMI cable not getting fully plugged in because of my S340


----------



## asdkj1740

Quote:


> Originally Posted by *bahn*
> 
> Having trouble with my pc since I installing my 1070. 1st time I booted after installing theres was no display. I plugged my old r9 380 still no display. After several restarts it finally came on.
> This morning I was playing Civ 6 the game froze. I restarted the pc and continued playing. It froze again. I restarted again there was no display this time. I unplugged and removed my 1070. It took several restarts for it to boot up.
> 
> I suspect that its the hdmi cable not getting fully plugged in because of my S340


Try loosening the PCIe screws to let your card rotate downward a little bit.
Plug in the HDMI cable first and then tighten the PCIe screws back up.

This problem happens quite often on low-end cases, especially in the past.
If the above doesn't solve the problem, then you have to choose another cable with a smaller plastic cover around the edge of the metal plug, or find a cable with a longer metal plug.


----------



## Loodeelow

Hello guys... a problem occurred with the new 388.31 driver... a 7-Zip data error?? Windows Defender kicked in about 10-15% into the installation, asking whether I wanted to continue. I deleted the driver from Downloads, added an exception in Defender, and downloaded it again from NVIDIA; Defender again asked whether I wanted to continue with the installation, file origin unknown?? And again the 7-Zip error occurred...

edit: AB says it's the .31 driver... how can I be sure which driver version is installed? Because at the end of the installation it said everything was installed except the driver.
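One way to confirm which driver actually got installed, rather than trusting what AB reports, is to ask the driver itself. `nvidia-smi` ships with the NVIDIA driver (on Windows it usually lives under `C:\Windows\System32`); the query flags below are standard, but treat the exact install path as an assumption for your setup:

```shell
# Prints only the installed display driver version, e.g. 388.31
nvidia-smi --query-gpu=driver_version --format=csv,noheader
```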


----------



## khanmein

Quote:


> Originally Posted by *Loodeelow*
> 
> hellou guys...a problem occur with new 388.31 driver...7zip data error?? and windows defender turning on and asking me if in the middle of instalation frist 10-15%...would i continue with instalation? i deleted the driver from downloads...add an exeption in defender,new download from nvidia ,defender asking me whould i continue with instalation,file origin unknown?? and again 7zip error occur...
> 
> edit: in AB saying it is .31 driver...how can i be sure which version of driver is installed,bcs when i did instalation...at the end sayin everything was instaled exept the driver.


Grab the driver from the official website.







How come it's a zip file?


----------



## Loodeelow

Quote:


> Originally Posted by *khanmein*
> 
> Grab the driver from the official website.
> 
> 
> 
> 
> 
> 
> 
> How come is a zip file?


Idk







Right now I'm stuck; after DDU in safe mode I'm not able to install any driver.







CRC error, 7-Zip, blah blah...
Any tips would be appreciated...
edit: it worked by itself... no valid explanation. Idk, but right now I'm on .31. Thanks anyway.








Any news on the Ti BIOS?


----------



## skupples

Quote:


> Originally Posted by *gtbtk*
> 
> when the micron memory bug was a thing. we were seeing similar sorts of whack except with memory.
> 
> If you cant get the clock past 1600, you might try using nvflash and reflashing the default bios before you send it back. It could be a slightly corrupt bios flash on your card causing the problems. If that doesn't fix it RMA it


Honestly, I'll probably go with a used 1080 Ti, since two 4K monitors are coming in (a 32-inch 10-bit for the office and a 28-inch 8-bit for the 2017 4K Shield)


----------



## gtbtk

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> when the micron memory bug was a thing. we were seeing similar sorts of whack except with memory.
> 
> If you cant get the clock past 1600, you might try using nvflash and reflashing the default bios before you send it back. It could be a slightly corrupt bios flash on your card causing the problems. If that doesn't fix it RMA it
> 
> 
> 
> honestly, i'll probably go with a used 1080ti since 2x 4k monitors are coming in (32 inch 10bit for office and 28 inch 8bit for 2017 4K shield)
Click to expand...

that would be nice


----------



## b0uncyfr0

Still no progress on flashing the 1070 Ti BIOS?


----------



## NIK1

I have an Asus GeForce GTX 1060 6GB and wonder whether it's worthwhile to upgrade to an ASUS ROG Strix GeForce GTX 1070 8GB Gaming OC (STRIX-GTX1070-O8G-GAMING) video card.


----------



## syl1979

Quote:


> Originally Posted by *NIK1*
> 
> I have a Asus GeForce Gtx 1060 6GB and wonder is it worthwhile to upgrade to a ASUS ROG Strix GeForce GTX 1070 8GB Gaming OC (STRIX-GTX1070-O8G-GAMING) video card.


I would say definitely yes, if you have a decent processor for 1080p or want to increase the resolution.


----------



## NIK1

I have an Asus Z270 Apex motherboard with an i7-7700K CPU.


----------



## gtbtk

Quote:


> Originally Posted by *NIK1*
> 
> I have a Asus GeForce Gtx 1060 6GB and wonder is it worthwhile to upgrade to a ASUS ROG Strix GeForce GTX 1070 8GB Gaming OC (STRIX-GTX1070-O8G-GAMING) video card.


It depends on what you are using the graphics card for.

If you are hammering it with AAA games, then you will see a benefit. You may even want to look at a 1070 Ti or one of the lower-end 1080s that are only about $50 more than some of the 1070s.

If you are only playing War Thunder or World of Tanks level games, you are not likely to see much benefit over the 1060


----------



## ColdDeckEd

I recently picked up an evga 1070 FE, with Samsung memory and with a bios version of 86.04.1E.00.70.

Put a hybrid cooler on it, so I was wondering: what's a good BIOS that pushes the power limit higher? I'm guessing I need to pick one that has 1E?

Would this one be good? https://www.techpowerup.com/vgabios/184891/palit-gtx1070-8192-160531-2


----------



## khanmein

Quote:


> Originally Posted by *ColdDeckEd*
> 
> I recently picked up an evga 1070 FE, with Samsung memory and with a bios version of 86.04.1E.00.70.
> 
> Put a hybrid cooler on it, so was wondering what's a good bios that pushes the power limit higher? I'm guessing that I need to pick one that has 1E?
> 
> Would this one be good? https://www.techpowerup.com/vgabios/184891/palit-gtx1070-8192-160531-2


Damn, Samsung memory chip. You're lucky..







Enjoy; just stick with the current VBIOS. 86.04.1E.00.70 is the latest one.


----------



## gtbtk

Quote:


> Originally Posted by *ColdDeckEd*
> 
> I recently picked up an evga 1070 FE, with Samsung memory and with a bios version of 86.04.1E.00.70.
> 
> Put a hybrid cooler on it, so was wondering what's a good bios that pushes the power limit higher? I'm guessing that I need to pick one that has 1E?
> 
> Would this one be good? https://www.techpowerup.com/vgabios/184891/palit-gtx1070-8192-160531-2


The 1E is the original release bios that came on the samsung memory cards. There is nothing wrong with that bios. If you are happy with it you don't need to do anything.

The Micron memory cards with the .26 BIOS are the ones that are affected by the bug.

The .50 bioses are currently coming on all AIB Samsung and Micron cards. I don't think that any of the FE cards ever got an update.

If you are manually overclocking, you are not really going to get much benefit from cross flashing. Your card has a 4-phase VRM and the MOSFETs are OK, but pumping too much power through it may not always be the best thing compared to some of the higher-phase VRMs on the AIB cards. FE cards actually overclock really well without needing a crossflash; the cooler has always been the limiter. If you have put it under water already, you have solved the only real problem the card had.

Are you hitting power limits on your card now? If you really have the urge, you could look at the EVGA SC BIOS. That card has a higher power limit and a 1595 MHz base clock, and it uses the reference PCB. It will allow you to continue using the extra EVGA auto-overclock tools as well.

Even that BIOS can hit power limits, though; the FTW BIOSes do on my MSI card at 1080p. The Asus Strix OC BIOS is a max-200 W BIOS, uses a single 8-pin, and works quite well on my MSI Gaming card. You could try that. The base clock will jump from 1506 MHz up to 1633 MHz


----------



## ColdDeckEd

Yeah, the power limit is way too low for the hybrid cooler.

I had the hybrid on an EVGA SC with Micron memory that I flashed with the Palit BIOS, which gave the card the right amount of power limit to max it out.

With the hybrid on the EVGA FE with Samsung, it hits the power limit very quickly even though I have a lot of thermal headroom left.

And yes, the Samsung memory kicks the Micron's ass lol.

But if you are saying there really aren't any BIOSes worth crossflashing to, I guess that answers my question. Thanks for the info!

edit:
And when I say max it out, I mean I had no problem boosting up to 2100 with the EVGA SC w/ hybrid and Palit BIOS.

With the EVGA FE w/ hybrid, at max power, it can hardly maintain a 2000 boost.


----------



## gtbtk

Quote:


> Originally Posted by *ColdDeckEd*
> 
> Yeah the power limit is way to low for the hybrid cooler.
> 
> I had the hybrid on an evga sc with micron memory that I flashed with palit bios that gave the card the right amount of power limit to max it out.
> 
> Wiith the hybrid on the evga FE with Samsung,it hits the power limit very quickly even though I have a lot of temp left.
> 
> And yes the Samsung memory kicks the Microns ass lol.
> 
> But if you are saying that there really isn't any bioses to cross flash, I guess that answers my questions. Thanks for the info!
> 
> edit:
> And when I say max it out, I mean I had no problem boosting up to 2100 with the evga SC w/ hybrid and palit bios.
> 
> With the evga FE w/ hybrid, with max power, it can hardly maintain 2000 boost.


There are no hard and fast rules with cross flashing; the different manufacturers tweak their own power delivery curves and tune them for the particular VRM on the board.

Obviously an FE card is different from a card that has an 8-phase VRM. I found that on my MSI, the MSI BIOSes are actually the best sorted for that card. I guess that should not be surprising. The Asus BIOS was really good and a pretty good match as well.

The Zotac AMP Extreme BIOS was disappointing, although it would pull up to 300 W.

The EVGA FTW BIOS would hit its power limit at 1080p; I assume that is the same on a real FTW card as well.

The lower-end Palit BIOSes worked on my MSI but were not spectacular. With the high-end BIOSes, my card would run, but it did not really like the Palit or Gigabyte BIOSes that OC the base clock to 1671 MHz.

Samsung memory will clock faster, but above about +600 both Samsung and Micron start producing errors that the built-in error correction has to deal with, slowing performance down if it doesn't crash outright.

OCCT has a great tool that will test VRAM for errors at overclocked rates; I found it handy with my card. In practice, Samsung RAM and Micron RAM tend to perform about the same


----------



## Falkentyne

@gtbtk

You or anyone else ever see someone push **10 ghz** on the RAM?
This guy just did on a MXM 1070 card (TDP modded from 115W to 190W).



I don't think I've seen anyone in this thread push 10 ghz on a desktop card!

http://forum.notebookreview.com/threads/mobile-pascal-tdp-tweaker-update-and-feedback-thread.806161/page-76


----------



## shadowrain

Quote:


> Originally Posted by *Falkentyne*
> 
> @gtbtk
> 
> You or anyone else ever see someone push **10 ghz** on the RAM?
> This guy just did on a MXM 1070 card (TDP modded from 115W to 190W).
> 
> 
> 
> I don't think I've seen anyone in this thread push 10 ghz on a desktop card!
> 
> http://forum.notebookreview.com/threads/mobile-pascal-tdp-tweaker-update-and-feedback-thread.806161/page-76



I'm close, at 9828 MHz, but this is just with EVGA Precision and no mods.


----------



## gtbtk

Quote:


> Originally Posted by *Falkentyne*
> 
> @gtbtk
> 
> You or anyone else ever see someone push **10 ghz** on the RAM?
> This guy just did on a MXM 1070 card (TDP modded from 115W to 190W).
> 
> 
> 
> I don't think I've seen anyone in this thread push 10 ghz on a desktop card!
> 
> http://forum.notebookreview.com/threads/mobile-pascal-tdp-tweaker-update-and-feedback-thread.806161/page-76


There have been a couple who said they can do it. The problem with most cards, particularly the Samsung ones, is that they clock high, but more than about +600 to +650 over the stock memory clock, the memory starts throwing errors, either crashing the driver or forcing the card to do error correction, which slows performance down. I have Micron memory and can run it at up to about 9500 MHz; performance, though, is better at 9300 MHz.

A great utility is the free version of OCCT. It will let you test your VRAM and tell you at what point loads of errors start creeping in as you overclock the card. These cards will cope with an error every now and then, but when you start getting hundreds, it causes problems.
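To put numbers on that, here's a rough sketch of the offset-to-effective-clock arithmetic, assuming a reference GTX 1070 (8008 MHz effective GDDR5) and an OC-tool offset that counts twice in the quad-pumped effective rate. The helper name and the +650 threshold are just illustrations of the figures in the post above, not hard limits for any particular card:

```shell
# Illustrative: effective GDDR5 clock for a given OC-tool offset,
# assuming the offset applies to the double-data-rate clock and so
# counts twice in the quad-pumped "effective" figure (8008 MHz stock).
effective_mhz() { echo $((8008 + 2 * $1)); }

for off in 0 600 650; do
  echo "+${off}: $(effective_mhz $off) MHz"
done
```

That lines up with the ~9300 MHz sweet spot mentioned above (+650 gives 9308 MHz); OCCT's error counter is what tells you where your own card's threshold really sits.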


----------



## asdkj1740

Recently an NVIDIA official from China said Volta will be very powerful, and much more expensive.


----------



## KillerBee33

Quote:


> Originally Posted by *asdkj1740*
> 
> recently a nvidia official from china said volta will be very powerful, and much more expensive.


The only benefit I may see from Volta is VR enhancements; other than that, the 10 series is as killer as it gets.


----------



## bahn

I can't play Civilization V with this card. I have the latest drivers.
If I choose DX10/11 I get a blank screen.
If I choose DX9 I get this:


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> recently a nvidia official from china said volta will be very powerful, and much more expensive.


If Nvidia follow the trend they have created, the Volta 70 level card should have similar performance to a Titan Xp.

The V100 uses a huge die, which means expensive. Maybe Nvidia is going bonkers, making huge GPUs with many transistors, to create a Volta line of cards that blows any anticipated AMD Navi card out of the water. That may explain the "expensive" comment?


----------



## gtbtk

Quote:


> Originally Posted by *bahn*
> 
> I cant play Civilization 5 with this card. I have the latest drivers
> If I choose DX 10/11 I get a blank screen.
> If I choose DX 9 I get this


The Nvidia GeForce forum would be the best place to post that query.


----------



## gtbtk

Quote:


> Originally Posted by *KillerBee33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *asdkj1740*
> 
> recently a nvidia official from china said volta will be very powerful, and much more expensive.
> 
> 
> 
> The only benefit i may see from VOLTA is VR enhancements , other than that 10 Series is as Killer as it Gets .
Click to expand...

The AMD cards are actually more powerful than the Nvidia cards in terms of raw TFLOP processing power. Where they miss out is that the AMD DX11 drivers, and it seems the PCIe controllers, are nowhere near as efficient as the NV drivers at creating multi-threaded access to the GPU.

The story changes with DX12, because async compute benefits more from the hardware schedulers AMD have built into the architecture. Nvidia uses software running on the CPU to provide the scheduling to the card, and that is not as efficient under the DX12 API.


----------



## Sgang

After a very good 1060 3GB, I picked up a very good 1070 from ASUS (the Turbo version), which I plan to water cool with an Accelero Hybrid III 120 I have.

How do you judge these numbers? They're the result of three hours of benchmarking and gaming.

My Micron memory goes very high, +750 and more, but after reading about the errors generated I decided to stop here.

The core is not so lucky; at plus 230 I think I barely reach 2000 MHz.

Temperatures with a custom fan curve seem very good, but as I said, I plan to water cool. Do you think I will get better results doing it?

In gaming I mostly played Battlefront 2 and found about 10-15 FPS more. But even when not OC'd, the game has some very bad drops in certain parts...









Sent from my iPhone using Tapatalk


----------



## KillerBee33

https://www.3dmark.com/fs/14365367 FS
https://www.3dmark.com/spy/2890786 TS
Keep playing with benches; this thing is a beast in benchmarks, but SLI game support still blows


----------



## KillerBee33

Titan V (Volta) is OUT, $3000. Just got an email from NVIDIA.
https://www.nvidia.com/en-us/titan/titan-v/?ncid=em-ded-tnvptlh-29076&deliveryName=DM6510


----------



## gtbtk

Quote:


> Originally Posted by *Sgang*
> 
> After a very good 1060 3gb I picked a very good 1070 from ASUS (the Turbo Version) that I plan to water cool with an accelero Hybrid iii 120 I have.
> 
> How do you judge these numbers? Is the result of a 3 hours benchmarking and gaming
> 
> My micron memories goes very high, +750 and more but reading about the error generated I decided to stop here
> 
> The core is not so lucky, plus 230 I think I barely arrive to 2000mhz.
> 
> Temperature with a custom fan curve seems very good but as I said I plan to water cooling. Do you think I will get better result doing it?
> 
> In gaming I mostly used battlefront 2, and I found about 10/15 fps more. But the game also if not in oc, has some very bad drops in certain parts...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Inviato dal mio iPhone utilizzando Tapatalk


Download the free copy of OCCT and you can test your video card memory for errors. You can see the number of errors increase significantly once the overclock gets to a certain point; you only need to back the OC down to just under that point and you are golden.

Start learning how to use the voltage curve to maximize the overclock performance. To get the best gaming performance, you do not need to have all of the curve set really high, just select bits of it.

YMMV, but my Micron card starts throwing significant amounts of VRAM errors above about +600 to +650.

Water cooling will certainly give you better performance than a blower cooler. As temps rise, the clocks drop off; water cooling should keep the GPU running in the 40-50°C range under load


----------



## gtbtk

Quote:


> Originally Posted by *KillerBee33*
> 
> https://www.3dmark.com/fs/14365367 FS
> https://www.3dmark.com/fs/14365367 TS
> Keep playin with Benches , this thing is a beast in Bench but SLI game support still blows


Firestrike is one of my favorites as well. You didn't post the correct link for Time spy

What are you getting in FS with a single 1070?


----------



## KillerBee33

Quote:


> Originally Posted by *gtbtk*
> 
> Firestrike is one of my favorites as well. You didn't post the correct link for Time spy
> 
> What are you getting in FS with a single 1070?


TS link fixed, and I never ran a single GPU on any benches...


----------



## Nukemaster

Quote:


> Originally Posted by *Sgang*
> 
> After a very good 1060 3gb I picked a very good 1070 from ASUS (the Turbo Version) that I plan to water cool with an accelero Hybrid iii 120 I have.
> 
> How do you judge these numbers? Is the result of a 3 hours benchmarking and gaming
> 
> My micron memories goes very high, +750 and more but reading about the error generated I decided to stop here
> 
> The core is not so lucky, plus 230 I think I barely arrive to 2000mhz.
> 
> Temperature with a custom fan curve seems very good but as I said I plan to water cooling. Do you think I will get better result doing it?
> 
> In gaming I mostly used battlefront 2, and I found about 10/15 fps more. But the game also if not in oc, has some very bad drops in certain parts...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Inviato dal mio iPhone utilizzando Tapatalk


This is off topic, but what is your wallpaper?

I have an Asus DUAL-GTX1070-O8G with an Arctic Accelero Mono Plus, so now I am back on topic


----------



## gtbtk

Quote:


> Originally Posted by *KillerBee33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Firestrike is one of my favorites as well. You didn't post the correct link for Time spy
> 
> What are you getting in FS with a single 1070?
> 
> 
> 
> TS Fixed and i never ran single GPU on any Benches ....
Click to expand...

Give it a whirl with SLI disabled. Before I killed my motherboard, I was doing about 21500 graphics with a single 1070 on an i7-2600. I would imagine that with a Coffee Lake CPU, it would probably be doing closer to 22000-22500.

https://www.3dmark.com/fs/11784694

I would be interested to see how the extra CUDA cores on an overclocked MXM card, with its thermal handicap, compare to my desktop card that maxed out at about 240-250 W


----------



## gtbtk

Quote:


> Originally Posted by *Nukemaster*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgang*
> 
> After a very good 1060 3gb I picked a very good 1070 from ASUS (the Turbo Version) that I plan to water cool with an accelero Hybrid iii 120 I have.
> 
> How do you judge these numbers? Is the result of a 3 hours benchmarking and gaming
> 
> My micron memories goes very high, +750 and more but reading about the error generated I decided to stop here
> 
> The core is not so lucky, plus 230 I think I barely arrive to 2000mhz.
> 
> Temperature with a custom fan curve seems very good but as I said I plan to water cooling. Do you think I will get better result doing it?
> 
> In gaming I mostly used battlefront 2, and I found about 10/15 fps more. But the game also if not in oc, has some very bad drops in certain parts...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Inviato dal mio iPhone utilizzando Tapatalk
> 
> 
> 
> This is off topic, but what is your wallpaper?
> 
> I have an Asus DUAL-GTX1070-O8G with a Arctic Accelero Mono Plus so now i am back on topic
Click to expand...

Lucky we don't have the Forum Police coming around


----------



## KillerBee33

Quote:


> Originally Posted by *gtbtk*
> 
> Give it a whirl with SLI disabled. Before I killed my motherboard, I was doing about 21500 Graphics with a single 1070 on a i7-2600. I would imagine that with a coffee lake CPU, it would probably be doing closer to 22000 - 22500.
> 
> https://www.3dmark.com/fs/11784694
> 
> I would be interested to see how well the extra cuda cores on an overclocked MXM card with its thermal handicap compares to my desktop card that maxed out at about 240-250W


https://www.3dmark.com/3dm/23844692


----------



## asdkj1740

https://www.chiphell.com/thread-1804707-1-1.html
Titan V gaming performance is ~30% above a 1080 Ti,
so it should be at about 2080 OC performance level.


----------



## shadowrain

Quote:


> Originally Posted by *asdkj1740*
> 
> https://www.chiphell.com/thread-1804707-1-1.html
> titan v gaming performance ~30%>1080ti.
> so it should be about 2080 oc performance level.


Considering that the 1st Titan's performance was roughly a GTX 970's, and the Titan X Maxwell and 980 Ti roughly a 1070's, a 1080 Ti should be around the 2070 level. The Titan V's performance, given the price increase and HBM2, makes Volta seem a bit underwhelming, especially if the Titan V will be almost equal to a 2080.

Then again, these are just the first benchmarks, using unoptimised drivers.


----------



## gtbtk

Quote:


> Originally Posted by *KillerBee33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Give it a whirl with SLI disabled. Before I killed my motherboard, I was doing about 21500 Graphics with a single 1070 on a i7-2600. I would imagine that with a coffee lake CPU, it would probably be doing closer to 22000 - 22500.
> 
> https://www.3dmark.com/fs/11784694
> 
> I would be interested to see how well the extra cuda cores on an overclocked MXM card with its thermal handicap compares to my desktop card that maxed out at about 240-250W
> 
> 
> 
> https://www.3dmark.com/3dm/23844692
Click to expand...

Thanks. That is a pretty good result given the laptop environment in which it is running. Your memory clocks are about what I was getting reliably on the desktop card. It is amazing what performance that they are getting out of a 47W CPU these days. At 4.4Ghz, My Sandy Bridge I7 can only do 10500 physics scores. You are another 2000 ahead of that. I have a Macbook pro with an i7-4770HQ that I can overclock the turbo an extra 200Mhz on each core to a max of 3.6Ghz . It cant quite hit 10,000 but it is close. I'm still impressed what it can do given that it is a base clock of only 2.2Ghz. Unfortunately the laptop only has the iGPU but It does have the GT3 128Mb eDRAM L4 Cache that helps things alond. The dGPU models of Macs only had an Nvidia GT750M which is only about 15% faster and not really worth spending the extra money on.

You were getting 1974MHz reported in the SLI run but only 1848MHz in the single run. Why was there so much difference? Do you find that the cooling keeps up well, or does it start to heat soak after a period of gaming/usage and cause the GPU to reduce clocks?

The SLI scaling that you are getting is almost 100% over the single card run. Isn't it a shame that all games don't work the same way?


----------



## KillerBee33

Quote:


> Originally Posted by *gtbtk*
> 
> Thanks. That is a pretty good result given the laptop environment in which it is running. Your memory clocks are about what I was getting reliably on the desktop card. It is amazing what performance that they are getting out of a 47W CPU these days. At 4.4Ghz, My Sandy Bridge I7 can only do 10500 physics scores. You are another 2000 ahead of that. I have a Macbook pro with an i7-4770HQ that I can overclock the turbo an extra 200Mhz on each core to a max of 3.6Ghz . It cant quite hit 10,000 but it is close. I'm still impressed what it can do given that it is a base clock of only 2.2Ghz. Unfortunately the laptop only has the iGPU but It does have the GT3 128Mb eDRAM L4 Cache that helps things alond. The dGPU models of Macs only had an Nvidia GT750M which is only about 15% faster and not really worth spending the extra money on.
> 
> You were getting 1974mhz reported in the SLI run and only reporting 1848Mhz in the single run. Why was there so much difference? Do you find that the cooling keeps up well or does it start to heat soak after a period of gaming/useage and cause the GPU to start to reduce clocks?
> 
> The SLI scaling that you are getting is almost 100% over the single card run. Isn't it a shame that all games don't work the same way?


I'm not very happy with the 6820HK. My desktop got a 6700K LOTTERY WINNER that can run @ 4.8;
this thing crashes @ 4.2 and isn't benching that well @ 4.1.








But these 1070's can handle up to +700 on memory and I run 'em @ +600 at all times.
It's hard to say what they're boosting to, sometimes 1800s, sometimes 2050s.
This was done with the latest driver. I'm quite pleased with this score, to be honest: https://www.3dmark.com/spy/2890786


----------



## gtbtk

Quote:


> Originally Posted by *KillerBee33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Thanks. That is a pretty good result given the laptop environment in which it is running. Your memory clocks are about what I was getting reliably on the desktop card. It is amazing what performance that they are getting out of a 47W CPU these days. At 4.4Ghz, My Sandy Bridge I7 can only do 10500 physics scores. You are another 2000 ahead of that. I have a Macbook pro with an i7-4770HQ that I can overclock the turbo an extra 200Mhz on each core to a max of 3.6Ghz . It cant quite hit 10,000 but it is close. I'm still impressed what it can do given that it is a base clock of only 2.2Ghz. Unfortunately the laptop only has the iGPU but It does have the GT3 128Mb eDRAM L4 Cache that helps things alond. The dGPU models of Macs only had an Nvidia GT750M which is only about 15% faster and not really worth spending the extra money on.
> 
> You were getting 1974mhz reported in the SLI run and only reporting 1848Mhz in the single run. Why was there so much difference? Do you find that the cooling keeps up well or does it start to heat soak after a period of gaming/useage and cause the GPU to start to reduce clocks?
> 
> The SLI scaling that you are getting is almost 100% over the single card run. Isn't it a shame that all games don't work the same way?
> 
> 
> 
> I'm not very happy with the 6820HK my Desktop got 6700K LOTTERY WINNER can run @ 4,8
> This thing Crashes @ 4.2 and not benching that well @ 4.1
> 
> 
> 
> 
> 
> 
> 
> 
> But these 1070's can handle up to + 700 on Memory and i run'em @ + 600 at all times
> Its hard to say what they boosting to , sometimes 1800's sometimes 2050's .
> This done with latest Driver , i,m quite pleased with this score to be honest https://www.3dmark.com/spy/2890786

It is only a 47W CPU in a laptop chassis; I would not expect desktop-level overclocks from it. Are the crashes the 0x124 WHEA BSOD? If so, you are running out of vcore to the CPU, or experiencing vdroop to the point where the CPU can't continue.

Gamers Nexus only got the 6820HK to a 4.0GHz overclock; they found unstable clocks at 4.1/4.2 as well.

Are you using the BIOS or MSI Gaming Center software to overclock, or Intel XTU? If you are doing the overclocking in the MSI software, you may want to consider XTU instead. It is a handy tool for dialing in the overclock, and it allows on-the-fly changes to clocks, voltage offsets, and power and current limits.

If you under-volt the CPU to try to keep temperatures at manageable levels, the current draw will increase, so you need to raise the current limit. On mine, I can boost all cores by a max of 200MHz, and under full load the package will draw 69W. I can undervolt the chip to -0.070v and things remain stable, but under load the chip will current-throttle if I don't increase the default current limit from 95A to 105A. Setting the undervolt to -0.075v is mostly stable, but I get the occasional crash.


----------



## KillerBee33

Quote:


> Originally Posted by *gtbtk*
> 
> It is only a 47W CPU in a laptop chassis. I would not be expecting desktop level overclocks from it. Are the crashes the 0x124 WHEA BSOD? If so, you are running out of vcore to the CPU or experiencing vdroop to the point where the CPU cant continue.
> 
> Gamers Nexus only got the 6820HK to a 4.0GHz overclock. they found unstable clocks at 4.1/4.2 as well
> 
> Are you using the bios or MSI Gaming Center software to overclock or using Intel XTU? If you are doing the overclocking in the MSI software, you may want to consider XTU instead. It is a handy tool to dial in the overclock and allows on the fly changes for clocks, voltage offsets, power and current limits.
> 
> If you under-volt the CPU to try and keep the temperatures down to manageable levels, the current draw will increase so you need to increase the current limit. On mine, I can boost all cores by a max of 200Mhz and the package, under a full load will draw 69W. I can under volt the chip to -0.070v and things remain stable but under load the chip with current throttle if i don't increase the default current limit from 95A to 105A. Setting the undervolt to -0.075v is mostly stable but I will get the occasional crash


That's the problem, I'm not very good with the XTU Intel utility, so BIOS it is. There's an option to change Vcore for the CPU in the BIOS, but it does nothing; I checked, and most people say that. I run it stable @ 4.1 with stock voltage + power, never seen a crash yet.


----------



## comanzo

Hey guys, a quick question.

So I noticed while gaming (Batman Arkham Origins, for example) that GPU usage will drop from 99% to the 70s. It doesn't cause stutter (or at least not most of the time), and it stays at a constant 70% usage before returning to 99%. I suspect a game engine issue, since my fps is around 120-140. But since it's an open-world game, I was also thinking it could be my HDD having trouble loading the textures. Any suggestions as to why this is happening? While the stutter isn't bad, I would like it to disappear, and I'd like to use my GPU to its full potential.

Specs:
1070
i7 4790s
12gb ram


----------



## TheReciever

Quote:


> Originally Posted by *gtbtk*
> 
> It is only a 47W CPU in a laptop chassis. I would not be expecting desktop level overclocks from it. Are the crashes the 0x124 WHEA BSOD? If so, you are running out of vcore to the CPU or experiencing vdroop to the point where the CPU cant continue.
> 
> Gamers Nexus only got the 6820HK to a 4.0GHz overclock. they found unstable clocks at 4.1/4.2 as well
> 
> Are you using the bios or MSI Gaming Center software to overclock or using Intel XTU? If you are doing the overclocking in the MSI software, you may want to consider XTU instead. It is a handy tool to dial in the overclock and allows on the fly changes for clocks, voltage offsets, power and current limits.
> 
> If you under-volt the CPU to try and keep the temperatures down to manageable levels, the current draw will increase so you need to increase the current limit. On mine, I can boost all cores by a max of 200Mhz and the package, under a full load will draw 69W. I can under volt the chip to -0.070v and things remain stable but under load the chip with current throttle if i don't increase the default current limit from 95A to 105A. Setting the undervolt to -0.075v is mostly stable but I will get the occasional crash


People have gotten the 6820HK to 4.6GHz, I believe, but certainly to 4.5GHz with a lot of fine tuning.

You MAY be able to apply a microcode exploit that enables 4GHz+ operation, but that also depends on your platform being capable of delivering the power needed at those clocks.

Scratch XTU and use ThrottleStop; you may be able to apply the powercut exploit to your machine, which under-reports the TDP.


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Hey guys, a quick question.
> 
> So I noticed while gaming that while playing it (Batman Arkham origins for example), it will drop from 99% to 70's. It doesn't cause stutter however (or most of the time), and stays at a constant 70% usage before returning to 99%. I suspect a game engine issue with this since my fps is around 120's-140's. But since it's an open world game, I was also thinking it could be my HDD having trouble loading the textures(it's an open-world game remember). Any suggestions as to why this is happening? While stutter isn't bad, I would like it to disappear, and would like to take advantage of my gpu to it's full potential.
> 
> Specs:
> 1070
> i7 4790s
> 12gb ram


I assume that 99% to 70% refers to the load the GPU is under?

As a game progresses, the load varies depending on what the game engine is asking the GPU to render. If a scene has fewer triangles to render, there is simply not enough work to keep the card running at 99%.

If it is the HDD causing the stutter due to slow load times, you could test that: set a save point just before the spot where it happens, let it play through and stutter, then go back to that save and run the same passage of gameplay again. The textures should still be resident in VRAM and should not need to load from disk on the second run. If you still get stutters and the card drops utilization again, it is not likely the HDD speed causing the issue, but more likely a driver optimization issue.

Might be worth asking Santa for a new SSD for Xmas.

Other options you could try: set the Nvidia Control Panel setting for the game to prefer maximum performance, or disable the shader cache for that game. The latter has been known to solve stutter problems in GTA V.
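The point about the GPU not having enough work can be sketched with a toy model: over a sampling window, utilization is roughly the share of each frame interval the GPU spends rendering, so a CPU or engine bottleneck shows up as sub-99% usage rather than as higher fps. This is a simplified illustration, not a measurement:

```python
# Toy model: the slower of CPU and GPU frame time sets the pace of the
# frame loop; the GPU sits idle for the remainder of each interval.

def gpu_usage(cpu_frame_ms: float, gpu_frame_ms: float) -> float:
    """Return approximate GPU utilization as a percentage."""
    frame_ms = max(cpu_frame_ms, gpu_frame_ms)  # slower side sets the pace
    return 100.0 * gpu_frame_ms / frame_ms

# GPU-bound scene: GPU needs 8 ms, CPU only 5 ms -> GPU stays pegged
print(gpu_usage(cpu_frame_ms=5.0, gpu_frame_ms=8.0))   # 100.0 (125 fps)

# Lighter scene: GPU needs 5 ms but CPU still needs 7 ms -> ~71% usage
print(gpu_usage(cpu_frame_ms=7.0, gpu_frame_ms=5.0))   # ~71.4 (143 fps)
```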


----------



## gtbtk

Quote:


> Originally Posted by *KillerBee33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> It is only a 47W CPU in a laptop chassis. I would not be expecting desktop level overclocks from it. Are the crashes the 0x124 WHEA BSOD? If so, you are running out of vcore to the CPU or experiencing vdroop to the point where the CPU cant continue.
> 
> Gamers Nexus only got the 6820HK to a 4.0GHz overclock. they found unstable clocks at 4.1/4.2 as well
> 
> Are you using the bios or MSI Gaming Center software to overclock or using Intel XTU? If you are doing the overclocking in the MSI software, you may want to consider XTU instead. It is a handy tool to dial in the overclock and allows on the fly changes for clocks, voltage offsets, power and current limits.
> 
> If you under-volt the CPU to try and keep the temperatures down to manageable levels, the current draw will increase so you need to increase the current limit. On mine, I can boost all cores by a max of 200Mhz and the package, under a full load will draw 69W. I can under volt the chip to -0.070v and things remain stable but under load the chip with current throttle if i don't increase the default current limit from 95A to 105A. Setting the undervolt to -0.075v is mostly stable but I will get the occasional crash
> 
> 
> 
> Thats the problem im not very good with XTU Intel Utility so BIOS but ===theres an option to change Vcore for CPU in bios but it does nothing i checked and most people say that,. I run it stable @ 4.1 and stock Voltage + Power never seen a crash yet

You didn't say what sort of crashes you were having when you overclocked to 4.2GHz.

TheReciever has made another suggestion, ThrottleStop. I have played with that tool as well, but it is not as intuitive as XTU, and I have not spent much time with it, so I am not as familiar with it as I am with XTU.

What I like about XTU is that you basically get the BIOS settings displayed in a Windows-based utility that you can change with sliders on the fly. Handy for experimenting.

If you go too far and crash the system, none of the changes you made are permanent, but you can save your settings in different profiles. If it crashes because you went too far, it restarts with the system back at default settings. You can then take the settings you've found to work well and apply them in the BIOS.


----------



## TheReciever

Quote:


> Originally Posted by *gtbtk*
> 
> You didn't say what sort of crashes you were having if you overclocked to 4.2Ghz
> 
> There has been another suggestion for Throttlestop by theReceiver. I have played with that tool as well but it is not as intuitive as XTU and have not spent much time using it so I am not as familiar with it as I am with XTU.
> 
> What I like about XTU is that you basically get the bios settings displayed in a windows based utility that you can change with sliders on the fly. Handy for experimenting with.
> 
> If you go too far and crash the system, none of the changes you made are permanent but you can save your settings in different profiles. If it crashes because you went too far, it will restart with the system back at the default settings again. You always then have the option of taking the settings you have discovered work well and applying them in the bios.


You can do the same and more with ThrottleStop. In fact, Intel has dialed back a number of the voltage levels you can apply in XTU; ThrottleStop retains them.

ThrottleStop lets you apply changes immediately, after closure (in case of a crash), or not at all, along with four different profiles.

ThrottleStop is simply a much better tool in all respects.


----------



## gtbtk

Quote:


> Originally Posted by *TheReciever*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You didn't say what sort of crashes you were having if you overclocked to 4.2Ghz
> 
> There has been another suggestion for Throttlestop by theReceiver. I have played with that tool as well but it is not as intuitive as XTU and have not spent much time using it so I am not as familiar with it as I am with XTU.
> 
> What I like about XTU is that you basically get the bios settings displayed in a windows based utility that you can change with sliders on the fly. Handy for experimenting with.
> 
> If you go too far and crash the system, none of the changes you made are permanent but you can save your settings in different profiles. If it crashes because you went too far, it will restart with the system back at the default settings again. You always then have the option of taking the settings you have discovered work well and applying them in the bios.
> 
> 
> 
> You can do the same and more with Throttlestop. In fact Intel has dialed back a number of the voltage levels you can apply as well that is retained in Throttlestop.
> 
> Throttlestop you can set changes immediately, after closure (in case of crash) or none at all. Along with 4 different profiles.
> 
> Throttlestop is just simple a much better tool in all aspects.

I have just taken another look at TS 8.50. I need to spend some more time reading up on all the settings. I see that there are more voltage settings that I can adjust. I just need to get used to the workflow they have created.


----------



## TheReciever

Quote:


> Originally Posted by *gtbtk*
> 
> I have just taken another look at TS 8.50. I need to spend some more time reading up on all the settings. I see that there are more voltage settings that I can adjust. I just need to get used to the workflow they have created.


Just be sure not to use both at the same time, as that would create unwanted behavior.


----------



## KillerBee33

Quote:


> Originally Posted by *gtbtk*
> 
> You didn't say what sort of crashes you were having if you overclocked to 4.2Ghz
> 
> There has been another suggestion for Throttlestop by theReceiver. I have played with that tool as well but it is not as intuitive as XTU and have not spent much time using it so I am not as familiar with it as I am with XTU.
> 
> What I like about XTU is that you basically get the bios settings displayed in a windows based utility that you can change with sliders on the fly. Handy for experimenting with.
> 
> If you go too far and crash the system, none of the changes you made are permanent but you can save your settings in different profiles. If it crashes because you went too far, it will restart with the system back at the default settings again. You always then have the option of taking the settings you have discovered work well and applying them in the bios.


BLUE SCREEN with that stop code on it when the CPU is unstable. I did manage to raise Vcore to god knows what, and I don't know how; got it to 4.2 but it gets into the 90s, so the actual score is worse. Just reversed it back to 4.1 on stock Vcore and power and I'm all good. F*** it, won't play with that any more; that 0.1 from 4.1 to 4.2 ain't worth the trouble... still get over 10K on Time Spy so I'm ok.







Cooling is good on this thing, never seen the GPUs go over 65.
Decided to reinstall Windows last night; got a free Pro upgrade key, so... this is with 4.1 and all stock V and P: https://www.3dmark.com/3dm/23876990


----------



## TheReciever

You're going to need liquid metal when you start pushing additional voltage.


----------



## KillerBee33

For some reason this doesn't want to show up on my Results page in 3DMark.


Spoiler: Warning: Spoiler!


----------



## Falkentyne

Quote:


> Originally Posted by *gtbtk*
> 
> Download the free copy of OCCT and you can test your video card memory for errors. You can see the numbers of errors increase significantly onve the overclock gets to a certain point. you only need to back the OC down to just under that point and you are golden.
> 
> Start learning how to use the voltage curve to maximize the overclock performance. To get the best gaming performance, you do not need to have all of the curve set really high, just select bits of it.
> 
> YMMV, but my Micron card start throwing significant amounts of VRam errors above about +600 to +650
> 
> Water cooling will certainly give you better performance than a blower cooler. As temps rise, the clocks drop off. Water cooling should keep the CPU running in the 40-50 deg range under load


Looks like my VRAM tops out without errors at +700 (9400 MHz).
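For reference, the offset-to-effective-clock arithmetic works out like this. A sketch, assuming the Afterburner-style convention where the offset adds to the DDR figure (4004 MHz on a reference 1070), which is half the effective GDDR5 data rate:

```python
# Assumption: Afterburner-style memory offsets add to the DDR figure,
# and GDDR5's effective data rate is twice that figure.
BASE_DDR_MHZ = 4004  # GTX 1070 reference memory clock as Afterburner shows it

def effective_mem_clock(offset_mhz: int) -> int:
    """Effective GDDR5 data rate (MHz) for a given memory offset."""
    return (BASE_DDR_MHZ + offset_mhz) * 2

print(effective_mem_clock(700))   # 9408 -> the "+700 (9400 MHz)" above
print(effective_mem_clock(600))   # 9208
```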


----------



## Falkentyne

Quote:


> Originally Posted by *KillerBee33*
> 
> I'm not very happy with the 6820HK my Desktop got 6700K LOTTERY WINNER can run @ 4,8
> This thing Crashes @ 4.2 and not benching that well @ 4.1
> 
> 
> 
> 
> 
> 
> 
> 
> But these 1070's can handle up to + 700 on Memory and i run'em @ + 600 at all times
> Its hard to say what they boosting to , sometimes 1800's sometimes 2050's .
> This done with latest Driver , i,m quite pleased with this score to be honest https://www.3dmark.com/spy/2890786


6820HK CPUs were considered a dog and a half. Only the very best could do 4.3GHz, and most topped out at 4.2GHz. Yet the HQ versions (3.1GHz on 4 cores) of these processors could undervolt quite a bit.

7820HKs came from a better process, usually reaching 4.2-4.4GHz stable, but there are still some dogs out there. Usually the default VID shown at 4.2GHz will tell you how far your CPU can go. My sample can run Prime stable (non-AVX) at 4.7GHz small FFTs with an LM repaste just fine, albeit with high temps in the 80s, and it is fully game stable at 4.8GHz but cannot keep temps under control for stress testing. At 4.9GHz it can run Cinebench with high voltage (>1.325v or 1.35v, I forget), but temps get into the 90s, and it can boot Windows and run SuperPi 1M at 5GHz.

What's the VID shown in ThrottleStop on your 6820HK @ 4.1GHz? (On adaptive voltage, as manual voltage changes the VID.)


----------



## gtbtk

Quote:


> Originally Posted by *KillerBee33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> You didn't say what sort of crashes you were having if you overclocked to 4.2Ghz
> 
> There has been another suggestion for Throttlestop by theReceiver. I have played with that tool as well but it is not as intuitive as XTU and have not spent much time using it so I am not as familiar with it as I am with XTU.
> 
> What I like about XTU is that you basically get the bios settings displayed in a windows based utility that you can change with sliders on the fly. Handy for experimenting with.
> 
> If you go too far and crash the system, none of the changes you made are permanent but you can save your settings in different profiles. If it crashes because you went too far, it will restart with the system back at the default settings again. You always then have the option of taking the settings you have discovered work well and applying them in the bios.
> 
> 
> 
> BLUE SCREEN WITH
> 
> 
> 
> 
> 
> 
> 
> on it when CU is unstable , i did manage to raise Vcore to god knows what and i dont know how , got it to 4.2 but it gets to 90's so the actual score its worse , just reversd it back to 4.1 on STOCK Vcore and Power im all good. F*** it wont play with that no more that .01 from 4.1 to 4.2 aint worth the trouble ... still get over 10K on TimeSpy so im ok
> 
> 
> 
> 
> 
> 
> 
> Cooling is good on this thing never seen GPU's go over 65
> Decided to reinstall Windows last night, got a free PRO UPGRADE KEY so....this is with 4.1 and all stock V and P https://www.3dmark.com/3dm/23876990

You might find that one of the two software utilities is actually a better way to experiment.

If the error message on the blue screen is the 0x124 WHEA uncorrectable error, it means it is lacking vcore. Yes, laptops will always be compromised when it comes to ultimate overclocking and temps, unless you go and get the ASUS water-cooled laptop.

Quote:


> Originally Posted by *Falkentyne*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Download the free copy of OCCT and you can test your video card memory for errors. You can see the numbers of errors increase significantly onve the overclock gets to a certain point. you only need to back the OC down to just under that point and you are golden.
> 
> Start learning how to use the voltage curve to maximize the overclock performance. To get the best gaming performance, you do not need to have all of the curve set really high, just select bits of it.
> 
> YMMV, but my Micron card start throwing significant amounts of VRam errors above about +600 to +650
> 
> Water cooling will certainly give you better performance than a blower cooler. As temps rise, the clocks drop off. Water cooling should keep the CPU running in the 40-50 deg range under load
> 
> 
> 
> Looks like my VRAM tops out without errors at +700 (9400 mhz).

There you go. It took me ages to find that utility. In the pre-Micron-bug-fix days, I had been getting really inconsistent OC results. There seemed to be no rhyme or reason; sometimes a stress test would run and other times it would fail with the same settings. It was that utility that finally let me see what was happening with my card throwing memory errors, and identify the bug in the BIOS for Micron memory cards.
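The back-off procedure described above ("back the OC down to just under that point") can be sketched as a simple step-down search. The `run_occt_vram_test` helper here is hypothetical; in practice you run OCCT's VRAM test yourself and read the error count off its GUI:

```python
# Sketch of the tuning loop: start high, step the memory offset down
# until a test pass completes with zero reported errors.

def find_max_stable_offset(run_occt_vram_test, start=800, step=25):
    """Walk the offset down until a run completes error-free."""
    offset = start
    while offset > 0:
        errors = run_occt_vram_test(offset)   # hypothetical helper
        if errors == 0:
            return offset                     # highest error-free offset found
        offset -= step                        # back the OC down and retry
    return 0

# e.g. simulating a card that starts throwing errors above +650:
fake_test = lambda off: max(0, off - 650)
print(find_max_stable_offset(fake_test))      # 650
```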


----------



## comanzo

Quote:


> Originally Posted by *gtbtk*
> 
> I assume that 99% to 70% refers to the load that the GPU is under?
> 
> As games progress, the load does vary depending on what the game engine is asking the GPU to render. If a scene has less triangles to render there is just simply not enough work to keep the card running at 99%.
> 
> If it is the HDD causing the stutter due to slow load times, you could test that by setting a save point just before the point that it happens, let it go through and stutter. Stop the gameplay and go back to the save setting just before the point that you saw the problem and run through the same passage of game play again. The textures should still be resident in the vram and not need to load from the disk for the second run. If you still get stutters and the card drops utilization again, it is not likely to be the hdd speed causing the issue but more likely a driver optimization issue.
> 
> Might we worth asking Santa for a new SSD for Xmas.
> 
> Options that you could also try is to set the Nvidia Control panel setting for the game to be max performance mode. You could also try disabling the shader cache for that game. It has been known to solve the stutter problems in GTA V


Hey gtbtk, I appreciate the response. I just got the SSD (Samsung 850 EVO 500GB). Gonna install it into my system soon and will find out. I will also try the other methods suggested. Appreciate the help.









On another note: you said a game engine sometimes has fewer triangles to ask the GPU to render, and thus doesn't peg it at 99%. My question is, if there's less stuff to render, instead of GPU usage dropping, wouldn't you just get higher fps with it still pegged at 99%? What are your thoughts on this?


----------



## KillerBee33

Quote:


> Originally Posted by *Falkentyne*
> 
> 6820HK CPU's were considered a dog and a half. Only the very best CPU's could do 4.3 ghz and most topped out at 4.2 ghz. Yet the HQ versions (3.1 ghz on 4 cores) of these processors could undervolt quite a bit.
> 
> 7820HK's came from a better process, usually reaching 4.2-4.4 ghz stable, but there are still some dogs out there. Usually the default VID shown at 4.2 ghz will tell you how far your CPU can go. My sample can Prime stable (NON AVX) at 4.7 ghz small FFT's with LM repaste just fine but with high temps in the 80's, and is fully game stable at 4.8 ghz, but cannot keep temps under control for stress testing. 4.9 ghz can Cinebench with high voltage (>1.325v or 1.35v I forgot+), but temps get into the 90's, and can boot windows and SuperPi 1M at 5 ghz.
> 
> What's the VID shown in Throttlestop on your 6820HK @ 4.1 ghz? (adaptive voltage, as manual voltage changes the VID).


4.1 is what I run 24/7 on stock V and P, absolutely stable.


Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!






The highest Vcore I've seen was 1.270V for a split second; it usually runs @ around 1.220V.
Keeping this, it seems to be the golden top.


----------



## KillerBee33

Quote:


> Originally Posted by *gtbtk*
> 
> you might find that one of the two software utilities is actually a better way to experiment
> 
> if the error message on the blue screen is 0x124 Whea correctable error. it means that it is lacking vcore. yest laptops will always be compromised when it comes to ultimate overclocking and temps unless you go and get the ASUS watercooled laptop
> There you go. It took me ages to find that utility. IN tyhe pre micron bug fix days, I had been finding really inconsistent OC results. There seemed to be no Ryhme or reason, sometimes a stress test would run and other times it would fail with the same settings. It was that utility that let me finally see what was happening with my card throwing memory errors and identify the bug in the bios for Micron memory cards.


I tried XTU with 1.350V @ 4.4. It ran Time Spy without crashing, but the score was horrible and temps went from 75 to 92. Meh, I'll keep my 4.1 with stock options in BIOS; it really seems to be the golden spot.


----------



## Falkentyne

Did you raise CPU CURRENT LIMIT in the BIOS?
I see in your screenshot this is set to 0.
That's 70 amps. You're going to get throttling almost instantly in XTU and other stuff at 4.2GHz with that. 70 amps is like 80 watts or so (depending on vcore, as watts = volts * amps).

If not, you're going to get throttling because of CURRENT, not temps.
Please change CPU Current Limit to 200 amps.
If it's a 1/4 divider, then that's a value of 800 in the BIOS. Check the divider.
This can also be called ICCMAX.
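The amps-to-watts arithmetic above can be checked quickly. A sketch; the 1.22 V figure is just an example vcore, not a measured value:

```python
# Power ceiling implied by a current limit: P = vcore * ICCMAX.
# A 70 A limit caps package power at roughly 70 * vcore watts.

def power_limit_watts(vcore: float, iccmax_amps: float) -> float:
    return vcore * iccmax_amps

print(power_limit_watts(1.22, 70))    # ~85 W ceiling at the stock 70 A limit
print(power_limit_watts(1.22, 200))   # ~244 W -- effectively uncapped
```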

On a GT73VR laptop, you absolutely can *NOT* run higher than 1.3v without an LM repaste on anything that stresses multiple CPU cores. Temps will get out of control.

Here is how to properly apply Conductonaut on a CPU to avoid short-circuiting the motherboard or the SMD components around the CPU (or even the GPU if you choose to do that; the GPU requires a lot more prep work!):

1) Nail polish (transparent, cellulose based), 3 coats around the exposed SMD resistors around the BGA slug:
This can be used instead of Kapton tape or Super 33+ tape protection; some people found better temps this way. Drawback: harder to remove, and you have to be very careful not to let it touch the silicon and harden there:

Link:
http://forum.notebookreview.com/threads/after-1-year-of-thermal-grizzly-conductonaut.799343/
The shiny stuff you see in the bottom picture is the nail polish coating.
If you choose not to use nail polish, use Super 33+ tape as shown below. I chose nail polish + foam.

2) FOAM DAM: cutout squares/rectangles of highly compressible foam as barriers to prevent stray conductive LM balls from getting anywhere outside the silicon housing area and onto the mainboard (THIS IS NOT for protecting SMD components, it won't protect them; it's to trap LM in case of a huge sudden jolt). If the laptop takes a huge jolt, temps rise really high, but it still works; you can thank those foam dams for protecting the motherboard.

link: http://forum.notebookreview.com/threads/liquid-metal-showdown-thermal-grizzly-conductonaut-vs-cool-laboratory-liquid-ultra-pro.791489/page-41#post-10581804
link: http://forum.notebookreview.com/threads/liquid-metal-showdown-thermal-grizzly-conductonaut-vs-cool-laboratory-liquid-ultra-pro.791489/page-38#post-10534488

(That's an LGA CPU and an MXM video card in each link, but you can see how the foam dam complements the tape protection.)

There is also another problem: the VID you see in the BIOS is not actually the true voltage. It is misreported, and can read MUCH lower than the actual voltage target the CPU is getting. And even this does NOT account for VDROOP, as there is no vcore sensor either.
The only way to get a fully accurate VID is to use an IA AC/DC loadline of 1 and 1 for both settings in the Core IA domain, but this requires the unlocked BIOS. Svet can unlock the BIOS for you for a donation (on the official MSI forums). Very experienced users (I suggest having a hardware programmer on hand in case you mess up) can unlock the BIOS manually by unlocking options with AMIBCP: dump the BIOS user space with FPT64, edit it, set the hidden options to "Supervisor", and reflash. However, FPT64 will fail to flash if the "BIOS Lock" security setting is turned on. Svet knows how to turn that off and flash with his own method, but trying to disable it yourself is VERY difficult: it requires downloading the official BIOS, finding the GUID string, and booting to an EFI USB prompt to "write" the command that disables it (very complicated). There are instructions on the notebookreview forums for how to do this, written by sirgeorge, but I need to make clear that you do NOT, not EVER, modify the APTIO BIOS options in the DOWNLOADED BIOS from MSI!! The downloaded file contains more than just the BIOS, such as the Management Engine and other components, so if you try to edit and flash that, that's a brick. You have to back up the APTIO user space from the BIOS, edit that, and then reflash that. Again, this only works if you already have "BIOS Lock" disabled.

There is another way to unlock the Bios, with a key combination. You can find that on the Chinese Baidu site if you search for GT73. I can't help with this.

If you do get access to the IA AC DC Loadline setting and you set it to 1, you MUST first know what minimum voltage you need for stability. Because the VID will show the true voltage BEFORE vdroop, if the VID is too low (say it reads 1.19v) and you put a load on the CPU, you may BSOD instantly. In that case you need to add a positive voltage offset and raise it until you find your stable point. The good news is you will have lower temps by setting IA AC DC Loadline to 1 and finding an accurate voltage with adaptive + offsets than if you used the AC DC = 0 (Auto) setting.

You can also set this value to "1", and use manual voltage (the unlocked Bios will allow you to toggle between adaptive and manual).

The "Auto" setting does VID BOOSTING in the background based on CPU load: the higher the load, the bigger the VID boost, which raises the VID target as the CPU load increases. But this is STILL before vdroop is applied (which cannot be shown). The Auto setting seems to use a 'reference' loadline of 1.80 to 2.10 mOhms. It is designed for 'basic' users who use adaptive vcore and Dragon Center for overclocking, but it pushes voltage and temps beyond what is needed. The Auto setting also GREATLY misreports the VID at full load! If you notice, when your CPU is idle, the VID may jump from 1.21v to 1.3v randomly despite there being no load; then at load it may show 1.22v when it's actually more like 1.3v. To stop that crap and regain control of your VID, you need to use a value of 1 (you can go up to 25 for minor VID boosting, but the boosting starts getting high even with 25 once you exceed 1.35v).

For my 7820HK, at 4.2 ghz, my default VID is 1.08v, which is fully NON AVX prime stable (this is with IA AC DC loadline=1). 4.5 ghz requires VID 1.175v (shown as 1.1794).
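The VID boosting and vdroop described above follow a simple loadline formula; here is a rough sketch (the 60 A load and the 1.9 mOhm "Auto" loadline are illustrative numbers I picked, and the actual BIOS unit scaling varies by vendor):

```python
def boosted_vid(base_vid, icc_amps, ac_ll_mohm):
    # AC loadline raises the VID target with load: VID' = VID + Icc * AC_LL
    return base_vid + icc_amps * ac_ll_mohm / 1000.0

def delivered_vcore(vid, icc_amps, dc_ll_mohm):
    # Vdroop: the vcore actually delivered sags below the VID by Icc * DC_LL
    return vid - icc_amps * dc_ll_mohm / 1000.0

# "Auto" behaviour: a ~1.9 mOhm AC loadline inflates a 1.08 V request under a 60 A load
print(round(boosted_vid(1.08, 60, 1.9), 3))   # 1.194
# Loadline near zero: the VID stays honest and you add any needed offset yourself
print(round(boosted_vid(1.08, 60, 0.01), 3))  # 1.081
```

This is why a loadline of 1 both lowers temps and makes the reported VID trustworthy: the boost term all but vanishes.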


----------



## KillerBee33

Quote:


> Originally Posted by *Falkentyne*
> 
> Did you raise CPU CURRENT LIMIT in the Bios?
> I see in your screenshot this is set to 0.
> That's 70 amps. You're going to get throttling almost instantly in XTU and other stuff at 4.2 ghz with that. 70 amps is like 80 watts or so (depending on vcore, as watts = volts * amps).
> 
> if not, you're going to get throttling because of CURRENT, not temps.
> Please change CPU Current Limit to 200 amps.
> If it's a 1/4 divider then that's a value of 800 in the Bios. Check the divider.
> This can also be called ICCMAX..


You mean change this to 200?

BTW, changing CPU CORE_RING Voltage in the BIOS to 1300 starts it flying all the way to 1.47 at times, so I just left it at "0"


----------



## Falkentyne

Read the help text to the right when you click CPU Current Limit. It tells you the divider. I don't know exactly what you enter, but it's 200 amps. Read the help text. It might be 800: 800 with a 1/4 divider is 200. So you would then enter 800, for 200 amps. See?
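Spelled out as code (a sketch of the arithmetic only; the 0.25 A-per-unit step comes from the BIOS help text quoted in this thread, and the 1.15 V vcore is an assumed figure):

```python
def bios_value_for_amps(target_amps, amps_per_unit=0.25):
    # The CPU Current Limit field counts in 1/4 A units: entered value * 0.25 = amps
    return int(target_amps / amps_per_unit)

def cpu_watts(vcore, amps):
    # Package power is roughly volts * amps
    return vcore * amps

print(bios_value_for_amps(200))    # 800 -> 200 A
print(bios_value_for_amps(100))    # 400 -> 100 A
print(round(cpu_watts(1.15, 70)))  # ~80 W, i.e. the stock 70 A limit
```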


----------



## KillerBee33

Quote:


> Originally Posted by *Falkentyne*
> 
> Read the help thing to the right when you click CPU current limit. It tells you the divider. I dont know what you enter but its 200 amps. Read the help thing. It might be 800. 800 with a 1/4 divider is 200. So you would then enter 800, for 200 amps. See?


I'm not going to repaste anything here, that's the reason I got a laptop; my living arrangements don't allow me to do any work here. That's why my TitanXP is just rusting in storage ;(
Not sure what help thingy you're referring to though...
But this is what it looks like and I don't understand a word of it


----------



## Falkentyne

Dude it's simple english.
I told you what to do
I said set it to 200 amps.

it says a Value is divided by 1/4.
And a value of 400 is 100 amps
It says that right in front of you in simple english.
It's currently set at 0, which means auto.
I don't understand why I have to explain something extremely simple.

SO what would 200 amps be?

Please think.


----------



## KillerBee33

Quote:


> Originally Posted by *Falkentyne*
> 
> Dude it's simple english.
> I told you what to do
> I said set it to 200 amps.
> 
> it says a Value is divided by 1/4.
> And a value of 400 is 100 amps
> It says that right in front of you in simple english.
> It's currently set at 0, which means auto.
> I don't understand why I have to explain something extremely simple.
> 
> SO what would 200 amps be?
> 
> Please think.


400 is 100 so to get 200 i should set it to 800?


----------



## Falkentyne

Yes that's what I said earlier. You're making this hard. The very first time I ever looked at that screen I instantly understood everything, and I'm not an engineer. I just play videogames. I mean it's simple math.


----------



## TheReciever

Quote:


> Originally Posted by *Falkentyne*
> 
> Yes that's what I said earlier. You're making this hard. The very first time I ever looked at that screen I instantly understood everything, and I'm not an engineer. I just play videogames. I mean it's simple math.












It's evident we spend a lot of time on NBR lol

Hopefully he can get his system running optimally. I'm still trying to get mine sorted actually. Updated the v22 Haswell microcode and Intel ME but still get the VRM limit. Hopefully I can fix that with software... can't get the IDP and Thermal Framework to install


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I assume that 99% to 70% refers to the load that the GPU is under?
> 
> As games progress, the load does vary depending on what the game engine is asking the GPU to render. If a scene has less triangles to render there is just simply not enough work to keep the card running at 99%.
> 
> If it is the HDD causing the stutter due to slow load times, you could test that by setting a save point just before the point that it happens, let it go through and stutter. Stop the gameplay and go back to the save setting just before the point that you saw the problem and run through the same passage of game play again. The textures should still be resident in the vram and not need to load from the disk for the second run. If you still get stutters and the card drops utilization again, it is not likely to be the hdd speed causing the issue but more likely a driver optimization issue.
> 
> Might be worth asking Santa for a new SSD for Xmas.
> 
> Options that you could also try is to set the Nvidia Control panel setting for the game to be max performance mode. You could also try disabling the shader cache for that game. It has been known to solve the stutter problems in GTA V
> 
> 
> 
> Hey gtbtk. I appreciate the response. I got the ssd (samsung 850 evo 500gb just now). Gonna install it soon into my system and will find out. I will also try the other methods suggested. Appreciate the help.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On another note, when you said a game engine simply does have less triangles to ask the gpu to render, thus not pegging it at 99%. my question to that though, is if there's less stuff to render, instead of it dropping in gpu usage, wouldn't you just get higher fps with it still pegged at 99%? Since there's less stuff to render, it makes sense for it to stay at 99% with simply higher fps(less stuff to render). What are your thoughts on this?
Click to expand...

That is a good SSD. I have the 840 Evo and I am really pleased with it. If you install Windows on the SSD and boot from it, you will have what feels like a brand new machine.

You are correct if that is the only thing going on. However, among other things, the card also needs to load textures and they are all interdependent on each other. With a slow HDD it can exacerbate that problem and lead to stutters. Even with an SSD installed there is still a time component, albeit a smaller one, required to load the assets. The GPU also needs CPU cycles available to process the draw requests for it to process the next frame. If the CPU is already running at 100% or the GPU is starved of the assets required to create the frame, the graphics card has to either wait or draw the screens without textures.

The other thing that I didn't touch on is how much RAM you have installed in your machine and what is running in the background. If the RAM is marginal, the OS could also be swapping memory out to the page file, which will also slow things down.


----------



## TheReciever

840 EVO had some issues with old files iirc


----------



## gtbtk

Quote:


> Originally Posted by *KillerBee33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> you might find that one of the two software utilities is actually a better way to experiment
> 
> If the error message on the blue screen is 0x124 WHEA uncorrectable error, it means that it is lacking vcore. Yes, laptops will always be compromised when it comes to ultimate overclocking and temps unless you go and get the ASUS watercooled laptop.
> There you go. It took me ages to find that utility. In the pre-Micron-bug-fix days, I had been finding really inconsistent OC results. There seemed to be no rhyme or reason: sometimes a stress test would run and other times it would fail with the same settings. It was that utility that let me finally see what was happening with my card throwing memory errors and identify the bug in the BIOS for Micron memory cards.
> 
> 
> 
> I tried XTU with 1.350V @ 4.4, it ran without a crash in Time Spy but the score was horrible and temps went from 75 to 92. Mehh, I'll keep my 4.1 with stock options in the BIOS, it really seems to be the golden spot.
Click to expand...

The score drop is likely caused by thermal throttling. Increased voltages do cause increased temps. The overclocking game is all about finding the right compromises: 4.1 at 100% all the time is likely better than starting at 4.4 but spending most of the time at 3GHz because of a throttle situation.

Now that you have confirmed that voltage is the reason your CPU struggles above 4.1GHz, you can do some fine tuning *if you want to*. Faster CPU clocks always require more voltage. The stock voltage settings usually leave a little headroom, and by going to 4.1 you have used up the voltage headroom at stock settings.

Overclocking anything, laptop or desktop, is always about finding the best compromise between frequency, voltage and temps. Your CPU would probably run at 5GHz with 1.45v if you could cool it down enough. Sadly, a laptop chassis doesn't give you enough cooling potential to do that without major modifications and sub-zero cooling.

The best compromise may be something like 4.25GHz at 1.27v @ 80 deg (just pulling numbers out of the air here). If it was me, I would dial in the OC by what temps the settings cause: find the compromise that gives you the best performance while keeping temps in check and not throttling. 95 deg at 1.35v is too much, but if you are topping out at 75 degrees with stock volts, you still have a bit of thermal headroom to play with. 80-85 deg should still be fine and not hitting thermal limits. You can increase the clocks and the voltage by 5 mV at a time and check temps.

edit: current throttling actually seems like a more likely bet but that has been discussed
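A back-of-envelope sketch of why the extra voltage hurts temps so much, using the rough rule that dynamic CPU power scales with frequency times voltage squared (the clocks and voltages here are made-up examples, not measured values):

```python
def relative_power(f_ghz, vcore, f0=4.1, v0=1.20):
    # Dynamic CPU power scales roughly with frequency * voltage^2,
    # normalized here to a 4.1 GHz @ 1.20 V baseline
    return (f_ghz / f0) * (vcore / v0) ** 2

# Going 4.1 GHz @ 1.20 V -> 4.4 GHz @ 1.35 V is only ~7% more clock...
print(round(relative_power(4.4, 1.35), 2))  # ...for roughly 1.36x the heat
```

That lopsided trade is exactly why the last 200-300 MHz on a laptop tends not to be worth it.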


----------



## gtbtk

Quote:


> Originally Posted by *TheReciever*
> 
> 840 EVO had some issues with old files iirc


It had issues with the initial release firmware, TRIM and a performance drop-off. It took 2 firmware updates, but that was resolved a number of years ago.


----------



## TheReciever

Quote:


> Originally Posted by *gtbtk*
> 
> It had issues with the initial release firmware, TRIM and a performance drop-off. It took 2 firmware updates, but that was resolved a number of years ago.


Ah, well glad it was fixed. Last I read the fix was basically a monthly scan to keep the files current.


----------



## KillerBee33

Quote:


> Originally Posted by *gtbtk*
> 
> The score drop is likely cause by thermal throttling. Increasing Voltages do cause increased temps. The overclocking game is all about finding the right compromises. 4.1 at 100% all the time is likely better than starting at 4.4 but spending most of its time at 3Ghz because of a throttle situation.
> 
> Now that you have confirmed that voltage is the reason that your CPU struggles above 4.1GHz, you can do some fine tuning *if you want to*. Faster CPU clocks always require more voltage. The stock voltage settings usually leave a little headroom and by going to 4.1 you have used the voltage headroom at stock settings up.
> 
> Overclocking anything, laptop or desktop, is always about finding the best compromise betweeen frequency, voltage and temps. Your cpu would probably run at 5Ghz with 1.45v if you could cool it down enough. Sadly, a laptop chassis doesn't allow you enough cooling potential to allow you to do that without major modifications and sub zero cooling.
> 
> The best compromise may be something like 4.25Ghz at 1.27v @ 80 deg (just pulling numbers out of the air here). If it was me, I would dial in the OC by what temps the settings cause. Find compromise gives you the best performance while keeping temps in check and not throttling. 95 deg at 1.35v is too much but If you are topping out at 75 degrees with stock volts, you still have a bit of thermal headroom to play with. 80-85 deg should still be fine and not hitting thermal limits. You can increase the clocks and the voltage by 5mv at a time and check temps.
> 
> edit: current throttling actually seems like a more likely bet but that has been discussed


It can run 4.2 but most benchmark scores drop a lot, so I just keep it @ 4.1, which keeps me at reasonable performance and under 75 degrees.
Also, no matter what I do, 4.2 jumps up to the 90's in temps. More than sure that " .1 " ain't gonna give me much performance anyway, so I'll stay on the safe side


----------



## KillerBee33

Quote:


> Originally Posted by *Falkentyne*
> 
> Yes that's what I said earlier. You're making this hard. The very first time I ever looked at that screen I instantly understood everything, and I'm not an engineer. I just play videogames. I mean it's simple math.


So do I, gaming mostly, just do some benching at times for fun, but it doesn't always work out because I don't know the basics of voltages, watts and amps. Anyway, this is what I ended up with; it got no extra performance, so it basically works the same as when it was @ "0", but thanks for the help. BTW, CPU OC was way easier on my Z170A Gaming Pro Carbon desktop, their BIOS options were absolutely awesome.


Spoiler: Warning: Spoiler!






Also forgot to mention I've only got a 460W power supply to keep it all running: two 1070s, the 6820HK and an 18-inch display, all four overclocked


----------



## comanzo

Quote:


> Originally Posted by *gtbtk*
> 
> That is a good SSD. I have the 840 Evo and I am really please with it. If you install windows on the SSD and boot from it, you will have what feels like a brand new machine.
> 
> You are correct if that is the only thing going on. However, among other things, the card also needs to load textures and they are all interdependent on each other. With a slow HDD it can exacerbate that problem and lead to stutters. Even with an SSD installed there is still a time component, albeit a smaller one, required to load the assets. The GPU also needs CPU cycles available to process the draw requests for it to process the next frame. If the CPU is already running at 100% or the GPU is starved of the assets required to create the frame, the graphics card has to either wait or draw the screens without textures.
> 
> The other thing that I didn't touch on is how much Ram do you have installed in you machine and what is running in the background? If the Ram is marginal, the OS could also be swapping memory out to the page file which will also slow things down.


I have 12gb of RAM. The CPU could possibly be a culprit even though it's an i7 (an i7 4790S to be precise). The reason is that the individual threads don't have as much single-thread performance as, say, an overclocked i7 4790K. The CPU is clocked to 3.6GHz instead of the base 3.2, but turbo had to be disabled. I was unable to get 4GHz out of the chip; the BIOS simply wouldn't let me, not because it was unstable. However, in Batman: Arkham Origins I don't see any threads at 100% load, or if they are, it's not frequent enough for me to notice. It can be a factor in other games, however. Just some food for thought. Since some games only use 4 of the 8 available threads, it's more about frequency at that point than anything else. Just a possibility.

So to answer your question, 12 gb ddr3 @ 1600mhz.


----------



## Falkentyne

Quote:


> Originally Posted by *KillerBee33*
> 
> So do i, Gaming mostly just do some Benchin at times for fun but its not always works out =bcz, i dont know the basics of Voltages, Watts and Amps. Anyway this is what i ended up with it got no extra performance so basically works the same as when it was @ "0" but thanx for the help. BTW CPU OC was way easier on my Z170A Gaming Pro Carbon -Desktop their options were absolutely awesome in BIOS.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Also forgot to mention i only got a 460W Power Supply to keep it all running 2 1070's , 6820 hk and an 18 inch Display. all 4 Overclocked


Please use Throttlestop 8.50 when benching and testing your CPU and have "Limit Reasons" open so we can see if you are getting CPU throttling and why you are getting it.
PL1 and PL2 are power limit 1 and 2 throttling. Since those default to 200 watts, those should NOT happen unless you exceed the system's AC power rating or some other EC restriction.

EDP Other (by itself, without a power limit also appearing) is usually from exceeding the CPU current limit.
Current (by itself) is for exceeding the TDC, which stands for thermal design current and is usually an obsolete setting (it is NOT "ICCMAX"/CPU Current Limit). You should never see this unless you messed with the TDC setting, which is hidden in the locked BIOS; it should never trigger normally.
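For reference, the flags above can be summarized as a small lookup table (a sketch; the labels are written the way ThrottleStop's Limit Reasons panel shows them):

```python
# Why a given Limit Reasons flag lights up, per the explanation above
LIMIT_REASONS = {
    "PL1": "power limit 1 exceeded (long-term package watts)",
    "PL2": "power limit 2 exceeded (short-term package watts)",
    "EDP Other": "CPU current limit (ICCMAX) exceeded",
    "Current": "TDC (thermal design current) exceeded -- rare, hidden setting",
}

for flag, cause in LIMIT_REASONS.items():
    print(f"{flag}: {cause}")
```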


----------



## KillerBee33

Quote:


> Originally Posted by *Falkentyne*
> 
> Please use Throttlestop 8.50 when benching and testing your CPU and have "Limit Reasons" open so we can see if you are getting CPU throttling and why you are getting it.
> PL1 and PL2 are power limit 1 and 2 throttling. Since those default to 200 watts, those should NOT happen unless you exceed the power AC ratings of the system or some other EC restriction.
> 
> EDP Other (by itself, without power limit also appearing) is usually from exceeding the CPU current limit.
> Current (by itself) is for exceeding the TDC which stands for thermal design current and is usually an obsolete setting (which is NOT "ICCMAX" (CPU Current Limit). You should never see this unless you messed with the TDC setting, which is hidden in the locked Bios; this should never be triggered normally.


Will do, let ya know what shows up


----------



## Sgang

Quote:


> Originally Posted by *gtbtk*
> 
> Download the free copy of OCCT and you can test your video card memory for errors. You can see the number of errors increase significantly once the overclock gets to a certain point. You only need to back the OC down to just under that point and you are golden.
> 
> Start learning how to use the voltage curve to maximize the overclock performance. To get the best gaming performance, you do not need to have all of the curve set really high, just select bits of it.
> 
> YMMV, but my Micron card start throwing significant amounts of VRam errors above about +600 to +650
> 
> Water cooling will certainly give you better performance than a blower cooler. As temps rise, the clocks drop off. Water cooling should keep the GPU running in the 40-50 deg range under load.


Sorry for the late reply, and thank you for the precious info. I downloaded the OCCT free version, but what do I have to do to test for errors?


----------



## Falkentyne

GPU 3D, error checking.


----------



## gtbtk

Quote:


> Originally Posted by *TheReciever*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> It had issues with the initial release firmware, TRIM and a performance drop-off. It took 2 firmware updates, but that was resolved a number of years ago.
> 
> 
> 
> Ah, well glad it was fixed. Last I read the fix was basically a monthly scan to keep the files current.
Click to expand...

That was firmware update number 1; they came out with another one after that.


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> That is a good SSD. I have the 840 Evo and I am really please with it. If you install windows on the SSD and boot from it, you will have what feels like a brand new machine.
> 
> You are correct if that is the only thing going on. However, among other things, the card also needs to load textures and they are all interdependent on each other. With a slow HDD it can exacerbate that problem and lead to stutters. Even with an SSD installed there is still a time component, albeit a smaller one, required to load the assets. The GPU also needs CPU cycles available to process the draw requests for it to process the next frame. If the CPU is already running at 100% or the GPU is starved of the assets required to create the frame, the graphics card has to either wait or draw the screens without textures.
> 
> The other thing that I didn't touch on is how much Ram do you have installed in you machine and what is running in the background? If the Ram is marginal, the OS could also be swapping memory out to the page file which will also slow things down.
> 
> 
> 
> I have 12gb of ram. The cpu could possibly be a culprit even though it's an i7 (i7 4790s to be precise). Reason why is because the individual threads don't have as much single-thread perf. as a i7 4790k overclocked for example. The cpu is clocked to 3.6 ghz instead of the base 3.2, but turbo had to be disabled. I was unable to get 4ghz out of the chip, it simply wouldn't let me in bios, not because it was unstable. However, in batman arkham origins, I don't see any threads being at 100% load, or if they are, it's not frequent enough for me to notice. It can be attributed in other games however. Just some food for thought. Since some games only use 4 out of the 8 threads avaliable, it's more about frequency at that point than anything else. Just a possibility.
> 
> So to answer your question, 12 gb ddr3 @ 1600mhz.
Click to expand...

I can't comment on the Batman game in particular, I have not played it. We are, though, getting to the point where 8GB is starting to be not enough for some of the new AAA games.

I was running an i7-2600 at 4.4GHz, which performed similarly to an i5-6600, and while it was not the fastest thing around any more, it didn't feel like it had quite reached the end of its life.

One thing that I discovered with Ryzen may be happening with your box as well. Under some game loads, the application was really loading the very last logical CPU (cpu15 on Ryzen) up to 100%. There would be a couple of low-numbered CPUs with lesser load as well, but a number of CPUs in the mid range were not getting much load at all.

If you manually set the CPU affinities on Ryzen to ignore the last logical CPU, gaming performance really improves and the load is spread more evenly across the different cores. A similar thing may be happening with your rig. You could try setting the CPU affinity for the game in Task Manager to only use the first 7 logical CPUs (0-6) and leave the last logical CPU (7) out. If the game was running something important on the highest-numbered CPU, it will now be doing it on one of the primary ones that get the majority of the compute resources.
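The affinity trick above can be sketched as a bitmask calculation (assuming 8 logical CPUs, as on the 4790S mentioned here; on Windows, `start /affinity` accepts such a mask in hex, and Task Manager sets the same thing interactively):

```python
def affinity_mask(num_logical, exclude_highest=1):
    # Bitmask with one bit per logical CPU, dropping the highest-numbered ones
    usable = num_logical - exclude_highest
    return (1 << usable) - 1

# 8 logical CPUs, leave CPU 7 free -> game runs on CPUs 0-6
print(hex(affinity_mask(8)))  # 0x7f
# e.g. from a Windows command prompt:  start /affinity 7f game.exe
```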


----------



## KillerBee33

Quote:


> Originally Posted by *Falkentyne*


So I got this thing but I can't find a use for it... what exactly am I looking for here?


[email protected] is the best i've got so far https://www.3dmark.com/3dm/23905908

these are the HIGHEST Temps for that run
CPU
GPU


----------



## TheReciever

Quote:


> Originally Posted by *KillerBee33*
> 
> So i go this thing but i cant find a use for it...what exactly am i looking for here?
> 
> 
> [email protected] is the best i've got so far https://www.3dmark.com/3dm/23905908
> 
> these are the HIGHEST Temps for that run
> CPU
> GPU


Thats worse than my 4930mx at 3.5Ghz


----------



## KillerBee33

Quote:


> Originally Posted by *TheReciever*
> 
> Thats worse than my 4930mx at 3.5Ghz


These are notebook chips... and also, I've stated I'm not very happy with the 6820HK







or i'm really doing something wrong


----------



## TheReciever

Quote:


> Originally Posted by *KillerBee33*
> 
> These are Notebook chips....ans also i've stated i'm not very happy with the 6820HK
> 
> 
> 
> 
> 
> 
> 
> or i'm really doing something wrong


4930mx = unlocked mobile chip 4th gen

6820hk = unlocked mobile chip 6th gen

Your 4.1GHz seems to be vastly underperforming my 3.5GHz 4th-gen mobile chip

Im not super intimate with overclocking but at least I know what a model number is...

I generally pull 8ish on 32m and 250-275 in 1024m


----------



## KillerBee33

Quote:


> Originally Posted by *TheReciever*
> 
> 4930mx = unlocked mobile chip 4th gen
> 
> 6820hk = unlocked mobile chip 6th gen
> 
> Your 4.1Ghz seems to be vastly underperforming below my 3.5Ghz 4th gen mobile chip
> 
> Im not super intimate with overclocking but at least I know what a model number is...
> 
> I generally pull 8ish on 32m and 250-275 in 1024m


What's that?
"I generally pull 8ish on 32m and 250-275 in 1024m"


----------



## TheReciever

Quote:


> Originally Posted by *KillerBee33*
> 
> Whats that " ?"
> I generally pull 8ish on 32m and 250-275 in 1024m


Honestly dude, it sounds like you need to spend time reading up on what you're doing.

The TS Bench scores

32m and 1024m, its in the screen shot you provided. The second.


----------



## KillerBee33

Quote:


> Originally Posted by *TheReciever*
> 
> Honestly dude, it sounds like you need to spend time reading up on what your doing.
> 
> The TS Bench scores
> 
> 32m and 1024m, its in the screen shot you provided. The second.


It's ThrottleStop 8.50, I'm not sure what that thing is for exactly.
Is this any better?


----------



## TheReciever

Quote:


> Originally Posted by *KillerBee33*
> 
> Its the throttlestop 850 im not sure what that thing is for exactly


"You can lead a horse to water but you can't make it drink" is pretty fitting here...

Your system is not configured correctly; my chip, which is 2 generations older, underclocked and undervolted, is outperforming your overclocked 6th-gen 6820HK.

Thats not supposed to happen.

You need to go read up on how to configure a machine like yours. Me telling you whats wrong wouldnt even help because you dont seem to grasp the basics.

tl;dr time to hit the books!


----------



## KillerBee33

Quote:


> Originally Posted by *TheReciever*
> 
> You can lead a horse to water but you can make it drink is pretty fitting here...
> 
> Your system is not configured correctly, my chip which is 2 generations older, underclocked and undervolted is out performing your overclocked 6th gen 6820hk.
> 
> Thats not supposed to happen.
> 
> You need to go read up on how to configure a machine like yours. Me telling you whats wrong wouldnt even help because you dont seem to grasp the basics.
> 
> tl;dr time to hit the books!


Look at the changes there... it was running the wrong thread count


----------



## TheReciever

It's better but still looks to be underperforming a little


----------



## KillerBee33

Quote:


> Originally Posted by *TheReciever*
> 
> Its better but still looks to be underperforming a little


Well. thats all i got now.


----------



## Falkentyne

Quote:


> Originally Posted by *KillerBee33*
> 
> Its the throttlestop 850 im not sure what that thing is for exactly
> IS THIS ANY BETTER?


Can you please explain why your battery is disconnected?


----------



## TheReciever

Quote:


> Originally Posted by *KillerBee33*
> 
> Well. thats all i got now.


Unless I am mistaken a 4.3Ghz 6820hk should hit about 150 1024m

4.1Ghz and 201 seems quite a leap unless the system I am recalling had high speed ram or something.


----------



## Falkentyne

Now I am using overclocked RAM timings of 15/15/15/35, 1T, tREFI 32727, tRFC=270, but that shouldn't account for that big of a difference.

*EDIT* WRONG WRONG WRONG.
(not used to TS bench......messing up here).

Um ok what.
I messed up somewhere.
Now the 1024M test overwrote the 32M test, unless it's showing an old result.

Right now: it says:

32m: 6.607
1024M: 208.403

This is 7820HK @ 4.1 ghz. with RAM timings as above.

Not sure why I had a 347.925 result on 1024M previously, ignore that.

Ok anyway your scores seem fine.
Still want to know why your battery is disconnected though.
Do you have a battery icon in the windows bottom right taskbar at all?

Ran it a 2nd time at 4.1 GHz, got 208.269 for 1024M.

4.3 ghz is: 198.572


----------



## KillerBee33

Quote:


> Originally Posted by *Falkentyne*
> 
> Can you please explain why your battery is disconnected?


The battery is not accessible, it's hidden inside, so I'm not really sure how it can be disconnected, but I'll check.


----------



## KillerBee33

Quote:


> Originally Posted by *TheReciever*
> 
> Unless I am mistaken a 4.3Ghz 6820hk should hit about 150 1024m
> 
> 4.1Ghz and 201 seems quite a leap unless the system I am recalling had high speed ram or something.


I've posted multiple times it won't run past 4.1... if someone can get it up to 4.3 bench-stable it'll be a miracle.
And Intel XTU seems to just **** with it and make changes in the BIOS too, but without XTU (which is not even installed anymore) the BIOS allows only 4.2, and
CPU Core Voltage
CPU Ring Voltage
CPU Core Voltage offset do save in the BIOS but make no changes inside Windows... meaning they just don't work from the BIOS.
Unlocking the BIOS on a laptop just to get that " .2" is not worth it in my opinion, and I doubt it'll give me any real gaming performance for that matter.
I used to go extreme when everything was watercooled, but for now I'm stuck with this laptop

And this https://www.3dmark.com/spy/624012


----------



## KillerBee33

Quote:


> Originally Posted by *Falkentyne*
> 
> Now I am using overclocked RAM timings of 15/15/15/35, 1T, tREFI 32727, tRFC=270, but that shouldn't account for that big of a difference.
> 
> *EDIT* WRONG WRONG WRONG.
> (not used to TS bench......messing up here).
> 
> Um ok what.
> I messed up somewhere.
> Now the 1024M test overwrote the 32M test, unless it's showing an old result.
> 
> Right now: it says:
> 
> 32m: 6.607
> 1024M: 208.403
> 
> This is 7820HK @ 4.1 ghz. with RAM timings as above.
> 
> Not sure why I had a 347.925 result on 1024M previously, ignore that.
> 
> Ok anyway your scores seem fine.
> Still want to know why your battery is disconnected though.
> Do you have a battery icon in the windows bottom right taskbar at all?
> 
> ran it a 2nd time at 4.1 ghz, got 208.269 for 32M.
> 
> 4.3 ghz is: 198.572


The battery is weird on this thing, it's got a mind of its own. I even had it calibrated a few days ago, which MSI only suggests doing every 3 months with their own application. So I'm not really sure yet, I've only had this thing for about a week and a half.








And this thing takes about 3 hours


----------



## Sgang

I'm experiencing some bad fps drops with all 3 configurations I've played the game on.

I tried a 1060 3GB at 2K (no matter what preset: normal, high or ultra), my 1070 at 2K (high and ultra) on a Ryzen 1800X @ 4.9ghz with 16GB DDR4 @ 2666, both overclocked and stock, and a 1080 Ti at 4K (high and ultra) on a 7820X @ 4.8ghz with 16GB DDR4 @ 3200mhz.

The game runs smoothly at 50-55-60fps, then suddenly the frame rate drops down to 20fps for a couple of seconds and then comes back to normal. This happens especially in the cutscenes. It's really annoying...
Has anyone encountered this kind of problem? All the other games are perfect.


----------



## KillerBee33

Quote:


> Originally Posted by *Falkentyne*
> 
> Can you please explain why your battery is disconnected?


This is what the BATTERY options look like

This is what they say about a similar TITAN on the MSI forums








https://forum-en.msi.com/index.php?topic=262502.0


----------









## Falkentyne

Can you go to your Throttlestop options page and click "Battery monitoring?"


----------



## KillerBee33

Quote:


> Originally Posted by *Falkentyne*
> 
> Can you go to your Throttlestop options page and click "Battery monitoring?"


----------



## Falkentyne

Ok good. Always keep that on. Doesn't hurt.

Disconnecting the battery internally causes power to be limited to about 60% of max system power, due to the hybrid power cutoff. (Unlike some other MSI laptops, the battery can be completely removed; the whole battery is accessible and just held in place by some square tabs and tape.) These MSI laptops still use battery boost (called NOS) to combine battery and system power, from the days when the 180W PSUs were not sufficient on GT72s. There is a way to prevent this from happening with the battery disconnected, but it requires writing a few values to three EC RAM registers to trick the EC into thinking the battery is still connected. (Again, I do not know how SLI systems are affected by the hybrid battery cutoff.)
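The register trick is hardware-specific, but the underlying write protocol is the standard ACPI embedded-controller interface. A minimal sketch, assuming the standard 0x62/0x66 EC ports; the register addresses and values below are placeholders, not the real MSI ones:

```python
# Sketch of the ACPI embedded-controller (EC) write protocol behind the
# "trick the EC" idea above. The EC is driven through two I/O ports defined
# by the ACPI spec; the register addresses/values below are PLACEHOLDERS,
# not the real (model-specific) MSI battery-presence registers.

EC_CMD_PORT = 0x66   # EC command/status port (ACPI spec)
EC_DATA_PORT = 0x62  # EC data port (ACPI spec)
EC_WRITE_CMD = 0x81  # "Write Embedded Controller" command byte

def ec_write_sequence(reg: int, value: int):
    """Return the (port, byte) sequence for one EC RAM write.

    Actually issuing these writes needs admin rights and a port-I/O driver
    (e.g. RWEverything on Windows), waiting for the IBF status bit to clear
    between each byte.
    """
    return [
        (EC_CMD_PORT, EC_WRITE_CMD),  # announce a write
        (EC_DATA_PORT, reg),          # EC RAM address
        (EC_DATA_PORT, value),        # value to store
    ]

# Three hypothetical registers standing in for "battery still connected":
for reg, val in [(0xA0, 0x01), (0xA1, 0x64), (0xA2, 0x01)]:
    print(ec_write_sequence(reg, val))
```

This only builds the I/O sequence; performing it for real is firmware-level surgery and can brick the EC if the addresses are wrong.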


----------



## KillerBee33

How do I reset TS? I tried it with an OC @ 4.3, then pushed some reset button; now every time I turn it on it brings everything back to stock until I restart...


----------



## KillerBee33

Quote:


> Originally Posted by *Falkentyne*
> 
> Ok good. Always keep that on. Doesn't hurt.
> Disconnecting the battery internally (it can be completely removed, unlike some other MSI laptops, the entire battery is accessible and just held in place by some square thingies and tape) causes power to be limited to about 60% of max system power, due to hybrid power cutoff (these MSI laptops still use battery boost (called NOS), to combine battery and system power, from when the 180W PSU's were not sufficient on GT72s). There is a way to prevent this from happening with the battery disconnected, but requires writing a few values to three EC RAM registers to trick the EC into thinking the battery is still connected (again I do not know how SLI systems are affected by battery hybrid cutoff).


Do you think there might be software to just disable the battery? All my tools are locked in storage and it's a drag to get to, and I really don't want to take this thing apart even though I need to; I want to swap my main 256GB SSD for a 1TB one I've got laying around.


----------



## Falkentyne

No.


----------



## KillerBee33

Quote:


> Originally Posted by *Falkentyne*
> 
> No.


Well, I guess if I decide to change the SSD I'll disconnect the battery then. Thanks again.


----------



## blue.dot

Hey everyone.

I'd like to ask owners of the MSI 1070 Gaming X: do these models have any problems one should know of, or be awareel of, before purchase?

I'm planning to buy this model in the upcoming weeks. I had been waiting for an AIB Vega 56, but it seems the prices will be crazy, especially here in Eastern Europe.
The MSI Gaming X will cost me just under 480€, might even go a bit lower, and that's a price I'm okay-ish with.

I currently have an MSI R9 380 and I'm very happy with its cooler, so I thought I'd continue with MSI.
I have also looked at EVGA, but I remember something about their coolers not cooling the VRMs properly? The one I can buy is the EVGA GeForce GTX 1070 FTW GAMING ACX 3.0; is this the problematic one? Or are there any other models you guys could recommend?

Thanks

/Edit:
Looking a bit more in the store, the MSI 1070 Ti is just 533€. I think that would be a better purchase? Would my CX550 PSU be enough for an OCed Ti?


----------



## gtbtk

Quote:


> Originally Posted by *blue.dot*
> 
> Hey everyone.
> 
> I'd like to ask owners of MSI 1070 Gaming X if these models have any problems that one should know of, or be aware of before purchase?
> 
> I'm planning to buy this model in upcoming weeks. I have been waiting so far for AIB Vega 56, but it seems the prices will be crazy, especially here in Eastern Europe.
> The MSI Gaming X will cost me just under 480€, might even go bit lower, and this is the price I'm okayish with.
> 
> I currently have MSI R9 380 and I'm very happy with its cooler, so I'd thought I will continue with MSI.
> I have also looked at EVGA, but I remember something about their coolers not cooling VRMs properly? The one I can buy is EVGA GeForce GTX 1070 FTW GAMING ACX 3.0, is this the problematic one? Or any other models you guys could recommend?
> 
> Thanks
> 
> /Edit:
> Looking a bit more in the store, the MSI 1070 Ti is just 533€. I think that would be better purchase? Would my CX550 PSU be enough for OCed Ti?


I have one; I got it in July 2016. It was one of the first cards with Micron memory to hit the market and suffered from the Micron bug that was resolved in the firmware update released in Nov 2016.

With the Micron bug solved, it has been a great card.

EVGA cards tend to hit their power limits early, and I would not recommend them because of that.

The Asus Strix is a good card to have a look at.

Having said that, they will all perform roughly the same; the extra money generally buys you a slightly better cooler.

Don't forget to look at 1080 prices as well. As it is not a new card any more, there are discounts around that are not being applied to the 1070 Ti, which is a new release.

I assume you are not running an HEDT X299/TR system. For mainstream systems, 550W for a single 1070 should be fine; the MSI 1070 with an OC will pull about 250W. The 1070 Ti will not pull much more than that, so it should also be OK.
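The 550W judgment is just addition; here is a rough sketch of the arithmetic, with ballpark draw figures that are assumptions rather than measurements:

```python
# Back-of-envelope PSU headroom check for the CX550 question above.
# The per-component wattages are rough assumptions, not measurements.

def psu_headroom(psu_watts: float, loads: dict) -> float:
    """Watts left over after summing the worst-case component draws."""
    return psu_watts - sum(loads.values())

system = {
    "OC'd 1070/1070 Ti": 250,     # the ~250 W figure quoted in this thread
    "mainstream CPU, OC'd": 120,  # assumed
    "board/RAM/drives/fans": 60,  # assumed
}
print(psu_headroom(550, system))  # ~120 W spare on a 550 W unit
```

Roughly 100W of spare capacity is a common rule of thumb, so the conclusion above holds under these assumptions.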


----------



## blue.dot

Thank you gtbtk for the reply.

I did also look at the ASUS Strix cards, but their prices are messed up here: the two 1070 variants cost 523€ and 538€ (normal and OC), and the 1070 Ti costs almost 560€.
Gigabyte models are also a bit cheaper, but I had a really long and frustrating RMA with them on my previous GPU, so I'm avoiding them.

I think I'll go with the MSI 1070 Ti; it should give me enough performance at 1080p even for newer games.









Thanks again


----------



## gtbtk

Quote:


> Originally Posted by *blue.dot*
> 
> Thank you gtbtk for reply.
> 
> I did also look at ASUS Strix cards, but their prices are messed up here. Both 1070 variants cost 523€ and 538€ (normal and OC) and the 1070 Ti cost almost 560€.
> Gigabyte models are also bit cheaper but I had really long and frustrating RMA with them on my previous GPU, so I'm avoiding them.
> 
> I think I'll go with the MSI 1070 Ti, it should give me enough performance at 1080p even for newer games
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks again


The fan quality on the Gigabyte cards has been a bit marginal.

If the Asus cards are at stupid prices, I would pass them by too. I am really pleased with the MSI cooler on my 1070: it is quiet even at 100% fan and very effective. With a side-panel fan, under load I am in the mid 50s; without the extra fan I am in the low 60s.

If the difference in price is only 30 euro, then I would go with the Ti card. With an overclock it is faster than a stock 1080, and the more powerful the card, the longer it will last before obsolescence.


----------



## KillerBee33

Quote:


> Originally Posted by *Falkentyne*


Tried everything I know of so far; these are the results:
4.3 with XTU

4.2 with XTU uninstalled, so pure BIOS

4.1 in BIOS with everything else at factory defaults; voltage and power seem most stable


----------



## asdkj1740

It's rare to see a Pascal model (above the 1060) using a different VRM controller instead of the de facto Pascal standard, the uPI uP9511, which is on most Pascal models (except the LN2-oriented ones).

EVGA again cheaps out on the mid-plate that cools the VRM: a completely flat mid-plate without any fins or extra surface area, just like the old design with its extremely bad performance.
This time, however, EVGA adds a flat plate to the bottom of the main heatsink instead of using that (what they called) L-shaped fin design to make contact with the mid-plate.
It's still a redundant design that makes no sense, adding more layers in the middle and hurting VRM cooling performance.
It would be best if EVGA simply used the main heatsink's bottom plate to actively cool and make contact with the VRM MOSFETs directly, but GPU core temps would suffer, maybe a lot.


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> 
> 
> 
> 
> 
> rare to see a pascal model (above 1060) using another vrm controller instead of pascal standard controller upi up9511 for most of the pascal models (except those ln2 aiming models).
> 
> evga again cheaps out on the mid plate cooling the vrm...super flat mid plate without any fins/extra surface area just like the old design with extremely bad perforamnce.
> however this time evga adds a flat plate to the bottom of the main heatsink, instead of using that stupid (what they called) L-shape fin design to make contact with the mid plate.
> its still a stupid and redundant design making no sense for adding more layers in the middle hurting vrm cooling performance.
> it would be the best if evga simply uses the main heatsink's bottom plate to actively cool/make contact with the vrm mosfet directly, but the gpu core temp must be hurt, may be a lot.


The new voltage buck controller they used on the 1070 Ti FTW has an I2C interface and was released in June 2017. The older uP9511 controller, used on almost all the other Nvidia cards, has no communication interface.

Maybe it is being done, in addition to the encryption (which could always be cracked), to ensure that a 1070 Ti BIOS can't be cross-flashed to 1070 cards? I guess that by spreading the load over 10 MOSFETs, it reduces temps on a per-MOSFET basis. It would seem the drama with the 1070 cards has had a profound effect on EVGA.

The mid-plate would still conduct heat away from the MOSFETs. Given that their number has been doubled, each MOSFET is not going to get that hot.


----------



## comanzo

Quote:


> Originally Posted by *gtbtk*
> 
> I cant comment on the batman game in particular, I have not played it. We are though getting to the point where 8GB is starting to be not enough for some of the New AAA games.
> 
> I was running an i7-2600 at 4.4Ghz which performed similarly to an i5-6600 and while it was not he fastest thing around any more, didn't feel that had quite reached the end of its life.
> 
> One thing that I discovered with Ryzen, may be happening with your box as well. Under some game loads, the application was really loading up the very last logical CPU (cpu15 on Ryzen) up to 100%. There would be a couple of low numbered CPUs with lesser load as well but a number of cpus in the mid range were not getting much load at all.
> 
> If you manually set the CPU affinities in Ryzen to ignore the last logical CPU, gaming performance really improves and the load more evenly spread across the different cores. A similar thing may be happening with your rig. You could try setting the CPU affinity for the game to only use the first 7 logical CPUs in task manager and leave the last logical cpu(7) out. If it is running something important on the highest numbered CPU, it will be doing it on one of the primary ones that get the majority of the compute resources.


Hey gtbtk, quick update. I installed the 850 EVO and noticed a substantial improvement in frame times for the Batman game. The stuttering is almost negligible, minus a few hiccups here and there. Another observation: when the game auto-saves (the little save icon on the bottom right of the screen), it used to cause stutter, and now, with the SSD, it doesn't. However, GPU usage remains the same (under 100%). When flying around in the city, it's in the 50s-60s. I can say with 90% confidence it's a CPU issue, since my 1440p monitor, located in another apartment, made the GPU usage higher. Right now I am using a 1080p monitor and seeing 50-60% GPU usage. This definitely confirms a CPU issue.

Since CPU usage is in the 25%-35% range, the game only uses 2 cores and is encroaching on a third (I have an i7, remember). When I monitor the individual threads, I also see idle threads. This means any further improvement in GPU usage must come from clock speed and IPC alone (since more threads don't matter for this title). Though at 100-120ish FPS, that is still great performance despite the GPU not being fully utilized. Thanks again for your help; the stuttering is now bearable and playable.

So for anyone out there who says SSDs only improve load times in games and that's it, I disagree. With open-world games (like Batman), an SSD also reduces stutter in-game.

Edit: I also tried the affinity suggestion, with no success. GPU usage remains in the same range regardless of whether the last core is excluded or not. The game engine could also be hampering GPU usage (making the CPU not as much of the culprit). A couple of other questions: do you think 12GB of RAM is enough for me until my next CPU upgrade? Also, is DDR3 @ 1600MHz slow enough to be a bottleneck in my system?
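For anyone wanting to script the affinity experiment instead of clicking through Task Manager, here is a minimal sketch. The mask arithmetic is portable; the stdlib call that applies it exists on Linux only, and for a game you would pass the game's PID rather than 0:

```python
# Sketch of the "exclude the highest-numbered logical CPU" experiment
# suggested earlier in the thread. On Windows, Task Manager or a
# third-party module such as psutil would be needed to apply the mask.
import os

def all_but_last_cpu(n_logical: int) -> set:
    """Logical CPU indices 0..n-2, i.e. everything except the last one."""
    return set(range(n_logical - 1))

n = os.cpu_count() or 1
mask = all_but_last_cpu(n) or {0}   # never pass an empty set
if hasattr(os, "sched_setaffinity"):  # Linux-only stdlib API
    os.sched_setaffinity(0, mask)     # 0 = this process; use the game's PID
print(sorted(mask))
```

On an 8-thread i7 this leaves CPUs 0-6 for the game, matching the manual Task Manager approach described above.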


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> I cant comment on the batman game in particular, I have not played it. We are though getting to the point where 8GB is starting to be not enough for some of the New AAA games.
> 
> I was running an i7-2600 at 4.4Ghz which performed similarly to an i5-6600 and while it was not he fastest thing around any more, didn't feel that had quite reached the end of its life.
> 
> One thing that I discovered with Ryzen, may be happening with your box as well. Under some game loads, the application was really loading up the very last logical CPU (cpu15 on Ryzen) up to 100%. There would be a couple of low numbered CPUs with lesser load as well but a number of cpus in the mid range were not getting much load at all.
> 
> If you manually set the CPU affinities in Ryzen to ignore the last logical CPU, gaming performance really improves and the load more evenly spread across the different cores. A similar thing may be happening with your rig. You could try setting the CPU affinity for the game to only use the first 7 logical CPUs in task manager and leave the last logical cpu(7) out. If it is running something important on the highest numbered CPU, it will be doing it on one of the primary ones that get the majority of the compute resources.
> 
> 
> 
> Hey gtbtk. quick update. I installed the 850 evo and noticed a substantial improvement in frame-times for the batman game. The stuttering is almost negligible minus a few hiccups here and there. Another observation made is that when the game auto-saves itself with the little save icon on the bottom right of the screen, it used to cause stutter, and now, it doesn't with the ssd. However, gpu usage remains the same(under 100%). When flying around in the city, it's in the 50's-60's. I can say with 90% confidence it's a cpu issue since my 1440p monitor located in another apartment made the gpu usage higher. However, right now I am using a 1080p monitor and it causes 50-60% gpu usage. This definitely confirms a cpu issue.
> 
> Since cpu usage is in the 25%-35% range, the game only uses 2 cores and is encroaching on the third(I have an i7 remember). When I monitor all the individual threads, I also saw threads being idle. This means any further improvement to raising gpu usage must be through speed and ipc alone(since more threads don't matter for this tittle.) Though with 100-120 ish fps, that is still great performance despite gpu not being fully utilized. Thanks again for your help, as stuttering is now bearable and playable.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So for anyone out there who says ssd's only improve load times in games and that's it, I disagree. With open world games (like batman), it also reduces stutter in-game.
> 
> Edit: I also tried the affinity suggestion, with no success. Gpu usage remains in the same range regardless of whether the last core is disabled or not. Game engine could also hamper gpu usage ( making the cpu not as much of the culprit). A couple of other questions: do you think 12gb of ram is enough for me until my next cpu upgrade? Also, is ddr3 @1600mhz ram slow enough to be a bottleneck to my system?
Click to expand...

The SSD makes it feel almost like a new machine, doesn't it? It helps with load times, but it also helps with the operating system, swap files, etc. I still don't think it is your CPU.

Are you running with V-Sync turned on, either in-game or in the Nvidia control panel? Turn it off if you are. That will cap the frame rate and keep GPU usage down.

Different games make use of different numbers of threads. Software needs to be written to make use of many threads, and many games are not.

12GB should be OK for 99% of anything you throw at it; Batman I don't know. If you monitor RAM usage during the game with the OSD from Afterburner, you can see if it is running out of RAM in-game and having to swap out to disk and use virtual memory. DDR3 1600MHz RAM should be fine.


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> That new Voltage Buck controller that they have used on the 1070 TI FTW has an i2C inteface and was released in June 2017. The older up9511 controllers that were being used on almost all of the other Nvidia Cards has no communication interface.
> 
> Maybe it is being done, in addition to the encryption that could always be cracked, to ensure that 1070TI bios cant be cross flashed to 1070 cards? I guess that by spreading the load over 10 mosfets, it reduces temps on a per mosfet basis. It would seem that the drama with the 1070 cards has had a profound effect on EVGA
> 
> The mid plate would still conduct heat away from the mosfets. Given that it has been doubled, each mosfet is not going to be getting that hot .


A different voltage controller adds more uncertainty to cross-flashing BIOSes, if we want to max out the performance of Pascal or the coming Volta cards.

Check these out: the Gainward 1070 Ti has a higher and more stable GPU core clock than the Galax one.
However, the Galax 1070 Ti is said to have a higher power limit in its BIOS than the Gainward.

http://www.tomshardware.de/gainward-leistungsaufnahme-performance-temperaturen-lautstarke,testberichte-242419-5.html
http://www.tomshardware.de/kfa2-gtx-1070-ti-performance-temperaturen-schaltungsdetails-lautstarke,testberichte-242417-5.html

I have not checked the MOSFET datasheet for that EVGA 1070 Ti FTW, but Steve said it is a 25A MOSFET rather than the 35A on the old 1070 FTW.
I don't know what EVGA has changed, or what for, but I surely don't like it.

BTW, this is the way MOSFETs should be cooled:
https://www.kitguru.net/wp-content/uploads/2017/12/heatsink3.jpg
https://www.kitguru.net/components/graphic-cards/dominic-moass/sapphire-rx-vega-64-nitro-limited-edition-better-than-gtx1080/2/


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> That new Voltage Buck controller that they have used on the 1070 TI FTW has an i2C inteface and was released in June 2017. The older up9511 controllers that were being used on almost all of the other Nvidia Cards has no communication interface.
> 
> Maybe it is being done, in addition to the encryption that could always be cracked, to ensure that 1070TI bios cant be cross flashed to 1070 cards? I guess that by spreading the load over 10 mosfets, it reduces temps on a per mosfet basis. It would seem that the drama with the 1070 cards has had a profound effect on EVGA
> 
> The mid plate would still conduct heat away from the mosfets. Given that it has been doubled, each mosfet is not going to be getting that hot .
> 
> 
> 
> different voltage controller adds more uncertainties on cross flashing bios, if we want to max out the performance of pascal/coming volta card..
> 
> check these out, the gainward 1070ti has got higher and more stable gpu core clock than the galax one.
> however it is said that galax 1070ti has higher power limit on the bios than the gainward one.
> 
> http://www.tomshardware.de/gainward-leistungsaufnahme-performance-temperaturen-lautstarke,testberichte-242419-5.html
> http://www.tomshardware.de/kfa2-gtx-1070-ti-performance-temperaturen-schaltungsdetails-lautstarke,testberichte-242417-5.html
> 
> i have not check the mosfet datasheet of that 1070ti evga ftw, but steve said it is a 25a mosfet rather than 35a on the old 1070 ftw.
> dont know what evga has changed and what for, but surely i dont like it.
> 
> btw this is the way mosfets should be cooled:
> https://www.kitguru.net/wp-content/uploads/2017/12/heatsink3.jpg
> https://www.kitguru.net/components/graphic-cards/dominic-moass/sapphire-rx-vega-64-nitro-limited-edition-better-than-gtx1080/2/
Click to expand...

That is what I meant: the different voltage controller prevents cross-flashing the FTW BIOSes.

25A x 10 makes the FTW a 250W card. The MSI Gaming X 1070 uses 32A 4C86N MOSFETs across 8 phases, also giving about 250W, on the V330 rev 5.0 PCB. The latest cards, including the Quick Silver and the new 1070 Ti, use the V330 rev 6.0, which has a later-model QBI MOSFET. I think it is using the uP9511 controller. The Palit and Galax cards are still using the 9511 as well.

The temps seem to indicate that the cooling on the Palit card is significantly better than on the Galax cards: about a 10-degree difference.
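The VRM arithmetic in this exchange generalizes to a one-liner: per-phase current rating times phase count times core voltage. A quick sketch, assuming roughly 1 V core voltage for round numbers (Pascal actually runs around 0.8-1.093 V):

```python
# Rough VRM power ceiling: per-phase current rating x phase count x Vcore.
# Vcore = 1.0 V is an assumption chosen for round numbers.

def vrm_power_limit(amps_per_phase: float, phases: int, vcore: float = 1.0) -> float:
    return amps_per_phase * phases * vcore

print(vrm_power_limit(25, 10))  # EVGA 1070 Ti FTW: 250.0 W
print(vrm_power_limit(32, 8))   # MSI Gaming X 1070: 256.0 W ("about 250 W")
```

This is a ceiling on delivery capability, not what the card draws; the BIOS power limit is usually set well below it.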


----------



## comanzo

Quote:


> Originally Posted by *gtbtk*
> 
> The SSD makes it like almost a new machine doesn't it? It helps with load times but it also helps with the operating system, swap files etc etc. I still don't think that it is your CPU
> 
> Are you running with v-sync turned on either in game or in the nvidia control panel? Turn it off if you are. That will restrict the frame rate to a set frame rate and will keep the gpu usage down.
> 
> Different games will make use of a different number of threads. Software needs to be written to make use of many threads, many games are not.
> 
> 12GB should be OK for 99% of anything you throw at it. Batman I don't know. if you monitor the ram usage during the game with the OSD from afterburner, you can see if it is running out of ram in game and having to swap out to disk and use virtual memory. DDR3 1600mhz ram should be fine


No, I am not running V-Sync in the Nvidia control panel or in-game. If you don't think it's a CPU issue, then maybe it's a game-engine issue. Some games simply don't work well at high FPS. Since I am running a 1070 on a 1080p monitor at the moment, I am in the range of 120-200 FPS depending on what I am doing. GTA V, for example, is notorious for not working well at high FPS (in the 150-200 range). I know Batman isn't one of the games you play, so you can't really agree or disagree, but from my experience and others', the optimization the devs put into this franchise is far from perfect. Process of elimination also suggests this, since neither the CPU nor storage speed is causing it. Also, only 6GB at most is used, so there's definitely not a RAM issue.

The only other possibility is that it's a software issue with my PC. However, I have used DDU to uninstall and reinstall the graphics drivers, and have even reinstalled the OS. Therefore, I think it's the game engine, especially since this game is the exception to the rule and not a problem I face with other games in my library. If it were something I faced with all of my games, then I would start to think it's my PC (whether software or hardware). Also, the franchise has gotten criticism for its PC ports.


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> The SSD makes it like almost a new machine doesn't it? It helps with load times but it also helps with the operating system, swap files etc etc. I still don't think that it is your CPU
> 
> Are you running with v-sync turned on either in game or in the nvidia control panel? Turn it off if you are. That will restrict the frame rate to a set frame rate and will keep the gpu usage down.
> 
> Different games will make use of a different number of threads. Software needs to be written to make use of many threads, many games are not.
> 
> 12GB should be OK for 99% of anything you throw at it. Batman I don't know. if you monitor the ram usage during the game with the OSD from afterburner, you can see if it is running out of ram in game and having to swap out to disk and use virtual memory. DDR3 1600mhz ram should be fine
> 
> 
> 
> No I am not running v-sync in NVidia control and/or in-game. If you don't think it's a cpu issue, then maybe it's a game engine issue. Some games simply don't work well with high fps. Since I am running a 1070 on a 1080p monitor at the moment, I am in the range of 120-200fps depending on what I am doing. GTA V for example, is notorious for not working well with high fps (like in the 150-200fps range). I know that batman isn't one of the games you play, so you can't really agree or disagree, but from my experience and others, the optimization the devs. put into this franchise is far from perfect. Process of elimination also suggests this too since the cpu nor storage speed is causing this. Also, only 6gb at the most are used, so there's definitely not a ram issue.
> 
> The only other possiblility is that it's a software issue with my pc. However, I have used ddu to uninstall and re-install graphics drivers and even re-installed the OS. Therefore, I think it's the game engine. Especially since this game is the exception to the rule, and not a problem I face with other games in my library. If it was something I faced with all of my games, then I would start to think it's my pc(whether it's software or hardware). Also, the franchise has gotten criticism for it's pc ports.
Click to expand...

The game engine is always a possibility. Did you try disabling the shader cache in the NV control panel for the game? Have you seen any reviews of the game? Did they find it was an FPS killer?

Is the Windows power plan on Balanced or High Performance? That may make a difference.

Changing the monitor will not make any difference to how well the GPU can render frames, only how many can be displayed on screen. Your CPU is only being loaded to relatively low levels.


----------



## smokerings

I am moving to a 1080 Ti FTW3, and it should be in this week.

I can still get good money for my 1070 here in Canada, so it shouldn't hurt a whole lot, but that depends on how long I keep the 1070 around and installed while playing around with my Xeon system!
I think a few of us here have that problem with getting rid of some of our old hardware.

I still love my 1070, but I got a 4K panel a month and a half after buying the GPU in November 2016, and as soon as I did I could really feel the jump in pixels hitting the card, even though I don't play a lot of heavy AAA titles.


----------



## KillerBee33

Quote:


> Originally Posted by *smokerings*
> 
> I am moving to a 1080ti FTW3 and it should be in this week.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can still get good money for my 1070 here in Canada so it shouldn't hurt a whole lot but that depends on how long I keep the 1070 around and installed playing around with my xeon system!
> I think a few of us here have that problem with getting rid of some of our old hardware.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I still love my 1070 but I got a 4k panel a month and a half after I bought the GPU in November 2016 and as soon as I did I could really feel the jump in pixels hitting the card even though I don't play a lot of heavy AAA titles.


Couldn't wait a few more months for Volta, huh? I'm just wondering what it'll be.


----------



## KillerBee33

Has anyone tried the new 4.4.2 Afterburner yet? I was able to clock my memory +650 @ 2,329 MHz without artifacts on 4.4.0; now it's artifacting all over with those settings. Same driver.
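For anyone comparing these numbers to spec sheets: the 2,329 MHz figure is the GDDR5 command clock that GPU-Z and Afterburner report, while marketing quotes the quad-pumped effective rate, four times higher. A quick conversion sketch:

```python
# GDDR5 transfers four bits per pin per clock, so the "effective" rate in
# spec sheets is 4x the command clock shown by GPU-Z/Afterburner.

def gddr5_effective(command_clock_mhz: float) -> float:
    return command_clock_mhz * 4

print(gddr5_effective(2002))  # stock GTX 1070: 8008 MT/s ("8 GHz effective")
print(gddr5_effective(2329))  # the +650 overclock above: 9316 MT/s
```

So the overclock above is running the memory roughly 16% past stock, which is where batch-to-batch variation between Afterburner versions or drivers can plausibly tip it into artifacting.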


----------



## asdkj1740

Quote:


> Originally Posted by *gtbtk*
> 
> That is what I meant. The different voltage controller prevents the cross flashing FTW bioses.
> 
> 25A x 10 makes the FTW is a 250W card. The MSI gaming X 1070 is using the 32A 4C86N mosfets across 8 phases also giving about 250W that is on the v330 rev 5.0 PCB. The latests cards, including the quicksilver and the new 1070TI are using the v330 Rev 6.0 which is using a later model QBI Mosfet. I think that it is using the UP9511 Controller. The palit and galaxy are still using the 9511 as well.
> 
> The temps seem to indicate that the cooling on the Palit card is significantly better than the Galaxy cards. 10 deg difference in temps.


The Zotac AMP Extreme has an even lower temp, and the AMP Extreme is said to have the highest power limit setting, at 240W, among all the 1070 Tis.
http://www.tomshardware.com/reviews/zotac-geforce-gtx-1070-ti-amp-extreme,5337-5.html


----------



## gtbtk

Quote:


> Originally Posted by *asdkj1740*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> That is what I meant. The different voltage controller prevents the cross flashing FTW bioses.
> 
> 25A x 10 makes the FTW is a 250W card. The MSI gaming X 1070 is using the 32A 4C86N mosfets across 8 phases also giving about 250W that is on the v330 rev 5.0 PCB. The latests cards, including the quicksilver and the new 1070TI are using the v330 Rev 6.0 which is using a later model QBI Mosfet. I think that it is using the UP9511 Controller. The palit and galaxy are still using the 9511 as well.
> 
> The temps seem to indicate that the cooling on the Palit card is significantly better than the Galaxy cards. 10 deg difference in temps.
> 
> 
> 
> the zotac amp extreme has even lower temp. and zotac amp extreme is said to have the highest power limit settings at 240w among all 1070ti.
> http://www.tomshardware.com/reviews/zotac-geforce-gtx-1070-ti-amp-extreme,5337-5.html
Click to expand...

The AMD cards do tend to be the over-the-top power cards. The 1070 has a 300W limit; interesting that the Ti is only 240W.


----------



## comanzo

Quote:


> Originally Posted by *gtbtk*
> 
> Game engine is always a possibility. Did you try disabling the shader cache in the NV control panel for the game? Have you seen any reviews on the game? Did they find that it was a FPS killer?
> 
> Is the windows power plan in Balanced or high performance? That may make a difference
> 
> Changing monitor will not make any difference to how well the GPU can render frames, only how many can be displayed on screen. Your CPU is only being loaded up to relatively low levels.


Hey gtbtk. It's been a while, I know. With the holidays on top of work, can never seem to find the time to go back to the forums. Anyways, just as an update, I figured out what the problem was and the solution to it for batman. So I thank you again for your help. On another note, I am planning to get PUBG soon, and wanted to know if you had any thoughts on how that runs. Is the port a mess? I have heard mixed responses depending on the person. Thanks again.









As for the discussion regarding different monitors influencing how the GPU renders frames, they can in an indirect way. If the resolutions of two monitors are different, and assuming you run each at its native resolution, the lower resolution will put more stress on the CPU, since the GPU produces more frames for the CPU to prepare. If the CPU is stressed too much by the number of frames produced, the GPU gets held back. So in a way, a monitor sort of can influence GPU frame rendering. Correct me if I am wrong, as any new insight is always welcome.


----------



## Madmaxneo

Quote:


> Originally Posted by *comanzo*
> 
> Hey gtbtk. It's been a while, I know. With the holidays on top of work, can never seem to find the time to go back to the forums. Anyways, just as an update, I figured out what the problem was and the solution to it for batman. So I thank you again for your help. On another note, I am planning to get PUBG soon, and wanted to know if you had any thoughts on how that runs. Is the port a mess? I have heard mixed responses depending on the person. Thanks again.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As for the discussion regarding different monitors influencing how the gpu renders frames, it can in an indirect way. If the resolution of two monitors are different, and assuming you run them at their native resolutions, the lower resolution will put more stress on the cpu since it's rendering more frames. If the cpu is being stressed too much by the amount of frames produced, the gpu gets held back. So in a way, it sort of can influence the gpu rendering frames. Correct me if I am wrong, as any new insight is always welcomed.


That doesn't make any sense.
I have two very different 24" monitors: an AOC G-Sync G2460PG with a 144Hz refresh rate and a 24" Vizio smart TV with a max refresh rate of 60Hz. I've run stress tests and benchmarks with both monitors on and also with the Vizio TV off (and disabled in Windows), and there was absolutely no difference in performance from either the card or the CPU. In fact, the best Firestrike score I ever achieved was with both monitors on. That was about a year ago with a GTX 980. My 1070 can just about reach those scores, but that is without OCing it, whereas my 980 was maxed out with a modded BIOS...


----------



## blaze2210

Quote:


> Originally Posted by *Madmaxneo*
> 
> That doesn't make any sense.
> I have two very different 24" monitors. One is an AOC gsync G2460PG with a refresh rate of 144hz and a 24" vizio smart TV with a max refresh rate of 60hz. I've run both stress tests and benchmarks with both monitors on and also with the Vizio TV off (and disabled in windows) and there was absolutely no difference in performance from either the card or the CPU. In fact the best Firestrike score I ever achieved was with both monitors on. That was about a year ago with a GTX 980 card. My 1070 can just about reach those scores but that is without OCing it where my 980 was maxed out OC with a modded bios...


^This.

I've been running 4 monitors for close to a year now, 2 of them are the same, so that makes 3 totally different monitor models: the 2x AOC I2269VWs have a 60hz refresh rate @ 1080p, the ASUS VG248QE is set to 120hz normally @ 1080p, and the emachines "HDMonitor" is 75hz @ 1440x900. No difference in performance by turning any of them off....The best Firestrike score I've reached so far was with all of my monitors on.


----------



## GreedyMuffin

Anybody got some numbers for an OCed 1070 on WinMiner/NiceHash?

Just bought an ASUS ROG Strix 1070 at a nice discount, and it should pay for itself within 4-4.5 months including electricity, so I thought why not.


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> Game engine is always a possibility. Did you try disabling the shader cache in the NV control panel for the game? Have you seen any reviews on the game? Did they find that it was a FPS killer?
> 
> Is the windows power plan in Balanced or high performance? That may make a difference
> 
> Changing monitor will not make any difference to how well the GPU can render frames, only how many can be displayed on screen. Your CPU is only being loaded up to relatively low levels.
> 
> 
> 
> Hey gtbtk. It's been a while, I know. With the holidays on top of work, can never seem to find the time to go back to the forums. Anyways, just as an update, I figured out what the problem was and the solution to it for batman. So I thank you again for your help. On another note, I am planning to get PUBG soon, and wanted to know if you had any thoughts on how that runs. Is the port a mess? I have heard mixed responses depending on the person. Thanks again.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As for the discussion regarding different monitors influencing how the gpu renders frames, it can in an indirect way. If the resolution of two monitors are different, and assuming you run them at their native resolutions, the lower resolution will put more stress on the cpu since it's rendering more frames. If the cpu is being stressed too much by the amount of frames produced, the gpu gets held back. So in a way, it sort of can influence the gpu rendering frames. Correct me if I am wrong, as any new insight is always welcomed.
Click to expand...

So what was the problem? Was it because you were running two monitors with different resolutions? I believe that in that situation you are facing a lowest-common-denominator problem.

I have not played PUBG, but I believe it is not that well optimized. It has particular issues with Ryzen because of the UE4 game engine. This may be informative.


----------



## comanzo

Quote:


> Originally Posted by *gtbtk*
> 
> So what was the problem? Is it because you were running two monitors with different resolutions? I believe that in that situation, you are facing a lowest common denominator problem
> 
> I have not played pubg but I believe that it is not that well optimized. It has particular issues with Ryzen because of the Ue4 game engine. This may be informative


No, the monitor discussion was initiated by a statement you made earlier saying that it doesn't affect GPU performance; nothing to do with the game issues. I did have higher GPU usage with the 1440p monitor than the 1080p monitor, though, of that I am certain. I did some research and found it was a port issue: the game doesn't use many cores, only 2, sometimes encroaching on a 3rd. People have also complained about GPU usage issues and stuttering with the game, which makes me feel better since it's not an issue with my system in particular.


----------



## comanzo

Quote:


> Originally Posted by *Madmaxneo*
> 
> That doesn't make any sense.
> I have two very different 24" monitors. One is an AOC gsync G2460PG with a refresh rate of 144hz and a 24" vizio smart TV with a max refresh rate of 60hz. I've run both stress tests and benchmarks with both monitors on and also with the Vizio TV off (and disabled in windows) and there was absolutely no difference in performance from either the card or the CPU. In fact the best Firestrike score I ever achieved was with both monitors on. That was about a year ago with a GTX 980 card. My 1070 can just about reach those scores but that is without OCing it where my 980 was maxed out OC with a modded bios...


Several problems with your argument. First, it's not clear whether the two displays have different resolutions; refresh rate is irrelevant to CPU/GPU performance (for this discussion). When it comes to games, it depends on the game, as some games are more CPU dependent than others. However, it's been proven time and time again that if you run a game with a high-end GPU, a certain trend can be found: the lower the resolution, the more frames a high-end GPU can produce. The CPU then ends up being the limitation preventing the GPU from reaching its full 99% usage. Why do you think that in CPU tests, YouTubers use a 1080 Ti at 1080p? It's not because a 1080 Ti is needed for 1080p, but rather to cause a CPU bottleneck on purpose.

In conclusion, not only do lower resolutions lead to more FPS and thus more CPU stress, but lower in-game settings also lead to more FPS and more CPU stress as a result. Think of the CPU as an FPS limit that is different for every game; the stronger the CPU, the higher the frame cap. Most people don't have to worry about this frame cap unless gaming at high refresh rates with a GPU that can supply such high FPS.

Now, assuming you run the monitors at their native resolutions, this then leads to such discrepancies in performance. No, a monitor can't physically impact gaming performance; it's the resolution the monitor runs at that impacts it. The last issue with your argument is that you are using Firestrike, which is a GPU test. Since the GPU gets stressed a lot in this test, you won't notice the CPU holding you back unless you run a ridiculously low resolution like 144p, or extremely low settings in the benchmark.
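The frame-cap idea above can be written as a toy model (the numbers and function names below are purely illustrative, not measurements):

```python
def effective_fps(cpu_fps_cap, gpu_fps):
    """The frame rate you see is bounded by whichever side is slower:
    the CPU preparing frames or the GPU rendering them."""
    return min(cpu_fps_cap, gpu_fps)

# Made-up numbers: a CPU that can prepare 140 frames/s for this game,
# and a GPU that renders 200 fps at 1080p but only 110 fps at 1440p.
print(effective_fps(140, 200))  # 140 -> CPU-bound at 1080p
print(effective_fps(140, 110))  # 110 -> GPU-bound at 1440p
```

Dropping the resolution (or the settings) raises the GPU side of the `min()`, which is exactly when the CPU cap starts to show.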


----------



## comanzo

Quote:


> Originally Posted by *blaze2210*
> 
> ^This.
> 
> I've been running 4 monitors for close to a year now, 2 of them are the same, so that makes 3 totally different monitor models: the 2x AOC I2269VWs have a 60hz refresh rate @ 1080p, the ASUS VG248QE is set to 120hz normally @ 1080p, and the emachines "HDMonitor" is 75hz @ 1440x900. No difference in performance by turning any of them off....The best Firestrike score I've reached so far was with all of my monitors on.


Refer to Madmaxneo's post and my response to it. The same answer can be applied here as well.


----------



## Madmaxneo

Quote:


> Originally Posted by *comanzo*
> 
> Several problems with your argument. First, it's not clear whether the two displays have different resolutions. Refresh rate is irrelevant in cpu/gpu performance(for this discussion). When it comes to games, it depends on the game as some games are more cpu dependent than others. However, it's been proven time and time again that if you run a game with a high end gpu, a certain trend can be found. The lower the resolution, the more frames a high end gpu can produce. The cpu then ends up being the limitation from the gpu reaching it's full 99% usage. Why do you think in cpu tests, youtubers use a 1080ti at 1080p? It's not because a 1080ti is needed for 1080p, but rather to cause a cpu bottleneck on purpose.
> 
> In conclusion, not only do lower resolutions lead to more fps and thus, more cpu stress, but lower settings in-game also lead to more fps and more stress for the cpu as a result. Think of the cpu as a fps limit that is different for every game, and the stronger the cpu, the higher the frame cap is. Most people don't have to worry about this frame cap unless gaming at high refresh rates with a gpu that can supply such high fps.
> 
> Now, assuming you run the monitors at their native resolutions, this then leads to such discrepancies in performance. No, a monitor can't physically impact gaming performance. It's the resolution the monitor has that impacts it. The last issue with your argument is that you are using Firestrike, which is a gpu test. Since the gpu gets stressed a lot in this test, you won't notice the cpu holding you back unless you run a ridiculously low resolution like 144p for example, or extremely low settings in the benchmark.


Doh! Forgive my ignorance in not really reading the post... My mind automatically focused on refresh rate and not resolution. Apologies!


----------



## Hiv359

Sup, any Gigabyte 1070 Ti GV-N107TGAMING-8GD owners here? I'm trying to figure out a few things:
1) Average and best memory/core OC
2) Best curve setup for 1080p gaming
3) Cause of an almost random (6 reboots out of 10) black screen at Windows 10 logon; more often it just gets stuck on the blue welcome screen. I've tried different drivers and uninstalled them with both DDU and the default uninstaller; nothing helped, but when I plug in my old AMD R9 280X everything is fine.
4) Coil whine. DAMN, IT'S TOO LOUD.

Happy New Year everyone!


----------



## khanmein

One of the best features from Gigabyte, aka Aorus, is coil whine.

Happy New Year.


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Madmaxneo*
> 
> That doesn't make any sense.
> I have two very different 24" monitors. One is an AOC gsync G2460PG with a refresh rate of 144hz and a 24" vizio smart TV with a max refresh rate of 60hz. I've run both stress tests and benchmarks with both monitors on and also with the Vizio TV off (and disabled in windows) and there was absolutely no difference in performance from either the card or the CPU. In fact the best Firestrike score I ever achieved was with both monitors on. That was about a year ago with a GTX 980 card. My 1070 can just about reach those scores but that is without OCing it where my 980 was maxed out OC with a modded bios...
> 
> 
> 
> Several problems with your argument. First, it's not clear whether the two displays have different resolutions. Refresh rate is irrelevant in cpu/gpu performance(for this discussion). When it comes to games, it depends on the game as some games are more cpu dependent than others. However, it's been proven time and time again that if you run a game with a high end gpu, a certain trend can be found. The lower the resolution, the more frames a high end gpu can produce. The cpu then ends up being the limitation from the gpu reaching it's full 99% usage. Why do you think in cpu tests, youtubers use a 1080ti at 1080p? It's not because a 1080ti is needed for 1080p, but rather to cause a cpu bottleneck on purpose.
> 
> In conclusion, not only do lower resolutions lead to more fps and thus, more cpu stress, but lower settings in-game also lead to more fps and more stress for the cpu as a result. Think of the cpu as a fps limit that is different for every game, and the stronger the cpu, the higher the frame cap is. Most people don't have to worry about this frame cap unless gaming at high refresh rates with a gpu that can supply such high fps.
> 
> Now, assuming you run the monitors at their native resolutions, this then leads to such discrepancies in performance. No, a monitor can't physically impact gaming performance. It's the resolution the monitor has that impacts it. The last issue with your argument is that you are using Firestrike, which is a gpu test. Since the gpu gets stressed a lot in this test, you won't notice the cpu holding you back unless you run a ridiculously low resolution like 144p for example, or extremely low settings in the benchmark.
Click to expand...

I think that you need to be more specific. If you are running a game at 1080p, the resolution of the monitor won't make any difference as long as the monitor supports 1080p or better. If you are running a game at the native resolution of the monitor, the higher the resolution, the more the performance balance is pushed onto the GPU and away from the CPU. The reason is that the GPU is forced to render more pixels at the higher resolution, while the CPU has fewer calculations for physics and game logic because the GPU cannot render the same number of frames.

PUBG creates other issues due to the extremely poor optimization of the code running on the UE4 game engine. The game should be running at really high frame rates given the relative simplicity of the graphics compared to something like Battlefront 2 or Battlefield 1.
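The pixel-count side of this is easy to put numbers on (plain arithmetic, no benchmark data):

```python
# Pixels the GPU must shade per frame at common native resolutions
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

base = 1920 * 1080  # 2,073,600 pixels at 1080p
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels, {w * h / base:.2f}x the 1080p workload")
# 1440p works out to 1.78x and 4K to exactly 4x the per-frame pixel count
```

So moving the same game from 1080p to 1440p nearly doubles the rendering work per frame while the CPU's per-frame work stays roughly the same.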


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *blaze2210*
> 
> ^This.
> 
> I've been running 4 monitors for close to a year now, 2 of them are the same, so that makes 3 totally different monitor models: the 2x AOC I2269VWs have a 60hz refresh rate @ 1080p, the ASUS VG248QE is set to 120hz normally @ 1080p, and the emachines "HDMonitor" is 75hz @ 1440x900. No difference in performance by turning any of them off....The best Firestrike score I've reached so far was with all of my monitors on.
> 
> 
> 
> Refer to Madmaxneo's post and my response to it. The same answer can be applied here as well.
Click to expand...

Again, there are two different things on the table here. With multiple monitors, you can either run a game in surround mode across multiple monitors or run it on a single monitor. Firestrike is fixed at 1080p on a single monitor, so the variation would come from all the other things running in the background, not from the fact that the 2nd monitor was turned on. If you are running across multiple monitors, the GPU has to render double the number of pixels, so it will impact performance.


----------



## gtbtk

Quote:


> Originally Posted by *Hiv359*
> 
> Sup, any Gigabyte 1070Ti GV-N107TGAMING-8GD owners here? Im trying to figure out a few things:
> 1)Average and best mem/clock OC
> 2)Best curve setup for 1080p gaming
> 3)Cause of almost random(happens in 6 reboots out of 10) black screen on windows10 logon but more often it just stucked on blue welcome screen: tried different drivers, was uninstalling it with DDU and default uninstaller - nothing helped, but when i plug in my old amd r9 280x everything becomes ok.
> 4)Coil whining. DAMN, ITS TOO LOUD.
> 
> Happy New Year everyone!


What is the code of the BSOD? The 1070 puts significantly more load on your hardware than an R9 280X. Various motherboard chipsets have voltages you can tweak to help with memory-controller and PCIe-controller loads; VCCIO and System Agent are two worth looking into. Vcore may also benefit from a slight boost.

Can't give you specifics on a Ti, but the 1070 and 1070 Ti are almost the same card.

Memory should be OK at +500 as a starting point; you may even find that it gets up to +650. The VCCIO voltage helps there.

I found with my 1070 that adding extra voltage increased temps but didn't really help that much with extra performance. The best frequency for my 1070 was around 2076/2088MHz. I can run it over 2100, but the performance is not any better.

Start out with the core slider and increase it until it starts to artifact or crash, then back it off 25 points. That is the core starting point. You can then try increasing single points on the curve. The two I would start with are 0.950v and 1.050v with the voltage slider at 0. With the voltage slider at 0, the max voltage your card will draw is 1.063v, but if you use that point, the only place for the clocks to go as the card heats up is down. If you stay one step below the max voltage, you leave headroom so the card can increase its voltage draw first before it starts to drop frequencies, keeping the clocks more stable.

If you want to use +100 voltage, then the two important points are 1.081v and the maximum 1.093v.
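The "raise the core slider until it artifacts, then back off 25" procedure can be sketched as a simple search. Here `is_stable` is a stand-in for a real stress run (e.g. looping Heaven and watching for artifacts), and all the numbers are made up for illustration:

```python
def find_core_offset(is_stable, start=0, step=25, limit=300):
    """Raise the core-clock offset in `step` MHz increments until the
    stress test fails, then settle one step below the failure point."""
    offset = start
    while offset + step <= limit and is_stable(offset + step):
        offset += step
    return offset

# Pretend this particular card artifacts above +175 MHz:
print(find_core_offset(lambda mhz: mhz <= 175))  # -> 175
```

In practice each `is_stable` check is minutes of benchmarking, which is why people step in 25MHz chunks rather than hunting MHz by MHz.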


----------



## comanzo

Quote:


> Originally Posted by *gtbtk*
> 
> I think that you need to be more specific. If you are running a game at 1080p, the resolution of the monitor wont make any difference as long as the monitor supports 1080p or better. If you are running a game at the native resolution of the monitor, the higher the resolution you go, the more the performance balance is pushed onto the GPU and away from the CPU. The reason is that the GPU is being forced to render more pixels at the higher resolution while the CPU has less calculations for the physics and game logic as the GPU cannot render the same number of frames.
> 
> PUBG creates other issues due to the extremely poor optimization of the code running on the UE4 game engine. The game should be running at really high frame rates given the relative simplicity of the graphics when compared to something like battlefront 2 or battlefield 1.


I did say at its native resolution. Anyways, I guess I wasn't specific enough since multiple people objected to it. My apologies.

On another note, when do you guys think Volta is coming out? My guess is by this summer.


----------



## comanzo

Quote:


> Originally Posted by *Madmaxneo*
> 
> Doh! Forgive my ignorance in not really reading the post... My mind automatically focused on refresh rate and not resolution. Apologies!


No worries man.


----------



## comanzo

Quote:


> Originally Posted by *gtbtk*
> 
> again, there are two different things on the table here. With multiple monitors, you can either run a game on surround mode across multi monitors or you can run it on a single monitor. Firestrike is fixed at 1080p on a single monitor so the variation would be on all the other things that are running in the background, not the fact that the 2nd monitor was turned on. If you are running over multi monitors, the GPU is having to render double the amount of pixels so it will impact performance


Oh OK, I see. Thanks for the clarification. I thought that if both monitors were connected to the same GPU, even if not run in surround, there would still be increased GPU usage, since both monitors' pixels must be rendered. However, since I have never put two monitors on the same GPU simultaneously, I can't really comment on that.


----------



## gtbtk

Quote:


> Originally Posted by *comanzo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> again, there are two different things on the table here. With multiple monitors, you can either run a game on surround mode across multi monitors or you can run it on a single monitor. Firestrike is fixed at 1080p on a single monitor so the variation would be on all the other things that are running in the background, not the fact that the 2nd monitor was turned on. If you are running over multi monitors, the GPU is having to render double the amount of pixels so it will impact performance
> 
> 
> 
> oh ok. I see. thanks for the clarification. I thought that if both monitors were connected to the same gpu, even if not run in surround, will still cause increased gpu usage since both monitors' pixels must be loaded. However, since I never put two monitors simultaneously on the same gpu, I can't really comment on that.
Click to expand...

Two monitors on one card will increase the load slightly, mostly on the VRAM, but if the image on one screen, such as your desktop, is static, the GPU doesn't have to do much work because nothing changes. The game, on the other hand, makes the GPU render multiple frames that are constantly changing, with movement, particles, physics, etc.

Good lesson for life in general here. You just need to keep in mind that we cannot see your gear, so while you are familiar with it, it is not safe to assume that everyone else is. It is unique to you, so go the extra mile to identify all the variables and describe them. Communication improves when no one is talking in circles or going off on tangents.

If you can keep that concept in mind for every interaction you have with other people, not just on IT forums, then you are way ahead of the average pleb in the street.


----------



## dafour

Just got myself an MSI Aero 1070 Ti and slapped my Arctic Twin Turbo II on it; I'm getting 1.9GHz stock due to better temperatures. Is this a good boost speed?

Also, is this a reference PCB?


----------



## mattliston

Looks very close to my PNY 1070.

I thought it was PNY just cheaping out a ton by removing quite a few power-delivery and filtering components.

BTW, repasted my 1070. PNY is such a GARBAGE company to produce my 1070 in such a way that I'VE SEEN TREE BARK SMOOTHER THAN MY HEATSINK!

This garbage aluminum heatsink would struggle with an overclocked 1060.

No wonder my card has reached over 92*C with fans at 100% in sub-70*F ambient.

The 7870 two slots below it folds at its stock 1GHz core at only 56*C at roughly 55% fan speed.

This garbage 1070 cooler is enough to make me want to hack my old Kraken X61 onto it.

The repaste keeps the card in the lower 80s now.

I cannot remember the last card I've read about getting a 10*C drop from a repaste; that is disgusting in my book. Such a waste. I'd LOVE to pay an additional 5-20 bucks for a brand-new video card if I knew it had thermal paste that was at the VERY least middle of the road.

The paste on my card was greasy and sludgy around the edges, and chalky and dry in the center. There were obvious contact issues, like the paste was not thick enough.

What else on this card did PNY cheap out on? Is it even going to survive more than a few years?


----------



## zipper17

It doesn't make sense to judge the entire product as garbage.

You bought the standard-series PNY? Is that the reference blower type? That's what you paid for, dude. Was the 92C out of the box, or after it had been used for a long time? Have you maintained the GPU or cleaned the dust inside your PC? How good is the airflow?

Every GPU brand has lower and higher quality tiers in their product line. PNY has higher-end series like the JetStream and GameRock editions, just like other brands have the ASUS ROG, MSI Gaming X, Gigabyte Aorus, Galax EXOC/HOF, Zotac AMP, etc. They also have cheaper, lower-tier/reference cards that use standard components and coolers.


----------



## mattliston

I thought I was clearly crapping on PNY, not the 1070.

That was the only 1070 I could buy locally within a 3-hour drive radius.

And since it was cheaper than the cheapest online order at the time, it made sense.

The cooler, which appears hand-crafted with a torch and a toothpick, is what concerns me, not to mention how the PCB indicates 20-30% of the planned power delivery was removed, most likely to save a buck.

It's not a reference PCB, so I cannot use an OEM cooler, which would prove to be quite an improvement.

I'm thinking of going with a Kraken G12 and my old Kraken X61 to keep temps in check, but I would then almost need to do the power-shunt mod to make use of the temps.

This card has, surprisingly, briefly touched 2.1GHz, and is very stable at 2062 at 0.993 volts. No idea if that is any good, but I would love to be able to chuck full voltage down its throat with better temps and see what it has to offer.


----------



## gtbtk

Quote:


> Originally Posted by *dafour*
> 
> Just gotten myself a MSI Aero 1070 ti and slapped my arctic twin turbo 2 on it,i'm getting 1.9 ghz stock due better temperatures.Is this a good boost speed?
> 
> Also is this a reference pcb?


Yes, it is using a reference PCB, which is perfectly fine with reasonable cooling.

You should be able to push that up to 2100MHz, with the memory boosted by +500 to +650 or thereabouts, with MSI Afterburner.


----------



## gtbtk

Quote:


> Originally Posted by *mattliston*
> 
> I thought I was clearly crapping on PNY, not the 1070.
> 
> That is the only 1070 I could buy locally in a 3 hour drive radius.
> 
> And when it was cheaper than the cheapest online to order at the time, it made sense.
> 
> The cooler that appears hand crafted with a torch and a toothpick is what is concerning to me, not to mention how the PCB indicates 20-30% of the planned power delivery was removed, most likely to save a buck.
> 
> Its not a reference PCB, so I cannot use an OEM cooler, which would prove to be quite an improvement.
> 
> Im thinking of going with a Kraken G12 and my old Kraken X61 to keep temps in check, but I would then almost need to do the power shunt mod to make use of the temps.
> 
> This card, surprisingly has touched 2.1ghz quite briefly, and is very stable at 2062 at 0.993 volts No idea if that is any good, but I would love to be able to chuck full voltage down its throat with better temps and see what it has to offer on the table.


The power shunt is not going to get you that much improvement. Water cooling will give you a good bump keeping the clock speeds up. Make sure that you address memory and VRM cooling as well. The cards really like running under 60 deg, under 50 is even better


----------



## 21Dante

Even if you bump them up to 2100, does the card keep those clocks while gaming?
My ASUS 1070, with 112% power, sometimes throttles due to the power limit.
And I have it clocked at [email protected]
Testing anything higher just had it underclocking to stay within the power limit.


----------



## mattliston

2062 is steady during gaming, not so much during folding, so I took a different tactic.

I use offset clocking for folding and curve tuning for gaming.

Curve tuning wins every time for gaming and benchmarks.

EDIT: if you take a dozen 1070s and put them all at a known steady clock and voltage of 2000MHz and 1.000v (assuming stability), they will each draw a different amount of power.

Don't compare your power % to X clocks and Y voltage. Very few cards will 100% mimic each other.


----------



## gtbtk

Quote:


> Originally Posted by *21Dante*
> 
> Even if you bump them up to 2100,does the card keep these clocks at gaming?
> My asus 1070,with 112% power,sometimes throttles due to power limit.
> And I have it clocked at [email protected]
> Testing with anything higher just had it underclocking to keep the power limit.


You can if you play with the curves; not generally worth it though.

My best performance was at 2076-2088MHz. Scores really didn't improve if I increased voltage above stock levels.

The key to maximizing clocks under load is to make sure your case has good airflow and to set the peak frequency one step below the max voltage point on the curve that your current settings allow (1.050v for stock, or 1.081v at +100). That leaves the card an extra step of voltage headroom to use before it starts to reduce clocks as temps increase.
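The "one step below the max voltage point" idea can be illustrated with a toy voltage/frequency curve. The V/F pairs below are made-up numbers, and `flatten_from` is a hypothetical helper, not an Afterburner API; real curves come from dragging points in the Afterburner curve editor:

```python
# Toy V/F curve: voltage (V) -> target clock (MHz), made-up values
curve = {0.950: 1962, 1.000: 2012, 1.050: 2050,
         1.063: 2076, 1.081: 2088, 1.093: 2100}

def flatten_from(curve, v_peak):
    """Clamp every point at or above v_peak to the clock at v_peak, so the
    card can step up to higher voltage bins without raising frequency."""
    peak = curve[v_peak]
    return {v: min(f, peak) if v >= v_peak else f
            for v, f in sorted(curve.items())}

# Peaking at 1.050 V leaves the 1.063 V bin as headroom: as temps rise,
# the card raises voltage first instead of immediately dropping clocks.
print(flatten_from(curve, 1.050))
```

This is the same move mattliston describes later in the thread as "flattened and matched all points to the right."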


----------



## gtbtk

Quote:


> Originally Posted by *mattliston*
> 
> 2062 is steady during gaming. Not so much during folding, so I took a different tactic.
> 
> I use offset clocking for folding, and curve tuning for gaming.
> 
> curve tuning wins every time for gaming and benchmarks.
> 
> EDIT if you take a dozen 1070s and put them all at a found steady clock and voltage of 2000mhz and 1.000v, (assuming stability) they will each draw a different amount of power.
> 
> Dont compare your power % to X clocks and Y voltage. Very few cards will 100% mimic each other.


Folding is likely not using the same P-state as gaming, so your frequencies will differ. NVIDIA Inspector is a handy tool that will show you what P-state the card is operating in.


----------



## Madmaxneo

Quote:


> Originally Posted by *gtbtk*
> 
> you can if you play with the curves. not generally worth it though.
> 
> My best performance was at 2088-2076mhz. Scores really didn't improve any if I increased voltage above stock levels.
> 
> The key to maximizing clocks under load is to make sure your case has good air flow and set the peak frequency one step below the max voltage point on the curve that your current settings allow (1.050v for stock or 1.081v at +100). That will leave the card an extra step of voltage headroom to use before it will start to reduce clocks as temps increase.


You can also add a waterblock to your GPU to help ensure the temps never go high enough to cause the card to throttle down the clocks.... I use a Heatkiller IV pro water block on my 1070 and it has never gone above about 50 degs on any bench, test, or game so far.


----------



## 21Dante

Quote:


> Originally Posted by *gtbtk*
> 
> you can if you play with the curves. not generally worth it though.
> 
> My best performance was at 2088-2076mhz. Scores really didn't improve any if I increased voltage above stock levels.
> 
> The key to maximizing clocks under load is to make sure your case has good air flow and set the peak frequency one step below the max voltage point on the curve that your current settings allow (1.050v for stock or 1.081v at +100). That will leave the card an extra step of voltage headroom to use before it will start to reduce clocks as temps increase.


I use the curve and still, the power limit hits 112% in most games.
Maybe my 1070 is power hungry, dunno.


----------



## mattliston

Anybody else able to hit a touch over 2GHz core with 0.962V?

I'm working on a tighter curve, to see how much voltage I really need for 2GHz.

I'm hoping for the upper 800s of mV.

What are your experiences?


----------



## 21Dante

My Asus Dual can hit 2GHz with 0.893V.
At 2012MHz the drivers crash.


----------



## Madmaxneo

I am looking at doing some benchmarks with my GTX 1070.

But I have forgotten some basic things...lol.

What were the basic parameters for increasing core and memory clocks again?


----------



## KillerBee33

Quote:


> Originally Posted by *Madmaxneo*
> 
> I am looking at doing some benchmarks with my GTX 1070.
> 
> But I have forgotten some basic things...lol.
> 
> What were the basic parameters for increasing core and memory clocks again?


Start +150 Clock +500 Memory and take it from there. 10 Series can go up to +800 on the memory.


----------



## mattliston

It works best with two monitors, but I have MSI Afterburner open, the curve graph open (since it shows in real time what voltage and frequency the GPU is at with a crosshair), and GPU-Z open to monitor power usage.

Pick a voltage point on the graph. I picked 1000mV, as it was simple and easy to remember, AND much lower than what the GPU uses by default to do a poor job of maintaining over 1800MHz.

I simply clicked the square on the graph that represents 1000mV, dragged it up close to 2000 (you can use the Up arrow key), flattened and matched all points to the right so the GPU doesn't increase voltage, and looped Heaven at a custom resolution covering 90% of the desktop on the second display, so I could leave it in windowed mode.

My 1070 gets scorching hot, so I settled for what was stable for 3-4 hours in Heaven. While folding, I drop the clocks at the same voltages down about 100MHz or so. I currently use 1936MHz at 943mV, and it folds a LOT faster than the default clocks/curve, which would have had it folding at approx 1700MHz or so.

If you do folding, keep in mind it sets the memory clocks to 3800MHz and tightens the clock straps, so I suggest removing any memory overclock. The memory speed is not a big enough folding performance increase to risk being unstable or giving Folding@Home crappy data.


----------



## mattliston

I forgot to add something.

Something that seems to be an Afterburner quirk: if you set the points in the curve too aggressively, such as a big leap from one to the next, the graph will automatically "soften" the curve.

Example: I am testing how low a voltage I can run 1898MHz at. I set a point to the left up to 1898, and the graph doesn't quite match the point to 1898. I merely repeat the process, but see Afterburner is actually setting much lower points at a higher frequency as well, attempting to keep the curve "soft".

For 893mV, for example, I had to repeat this 1898 point 4 times before it clicked into place, and I then watched the active crosshair go to it, and I also saw my GPU temp drop 1 degree lol

Even with this terrible TERRIBLE aluminum cooler on this PNY card, it is currently running Heaven at 68°C, and I started at 76°C at a higher voltage.

This curve stuff makes a HUGE difference.

Keep in mind, if you reduce the fan speed in order to force more active thermal throttling, you can fine-tune where the card will sit once it's doing its various throttling behaviors.

I am currently forcing a 70% fan speed (these fans suck, so 100% isn't much better) and still in the 60s for temps at full load. BUT this is for 1898MHz, a low-power speed in my eyes.

I always start with lower frequencies, because it gets irritating to manually increase every point to the right of my active point. When you increase the frequency, it auto-adjusts everything to the right to be at least as fast. It saves you 5-20 minutes of adjusting lol


----------



## mattliston

Welp. 975mV, 1974MHz core, +100MHz memory offset, and 100% fan speed flat out force power usage above 100% at all times during the Heaven bench.

Caught a few random driver crashes at 2012MHz and 983mV, possibly since the next lowest frequency was 1936MHz at 893mV. Perhaps the changing voltage/frequency jumps threw it out of whack for a second.

A power shunt mod is coming. I'm thinking of soldering on an extra male MOLEX plug, running 2 pairs of wires to each side of the shunt resistors, and using the female MOLEX as a housing for 2 resistors, of some later-calculated value, to get the card to "see" 70-80% of what it currently sees as approx 100%. No need to go further; temps will hold me back until I decide to use the Kraken X61 that's sitting around.

Whoever said the power shunt mod is not worth doing doesn't own a 1070 with high power leakage.

I can only go 1 (or both) of 2 ways to fix this: power shunt, or VERY cold temperatures (like max 45°C core).

Those are the only two ways I will be able to allow this card to live above 1000mV and not constantly switch the frequency on me at full synthetic load.

When it's stable at 2062MHz at 1000mV, it is only because games don't attempt to crush this 1070 constantly like a good bench loop does.
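For anyone curious about sizing those parallel resistors, the arithmetic is short. A minimal sketch, assuming a hypothetical 5 mΩ stock shunt (check the actual shunts on your own card before soldering anything):

```python
def parallel_shunt(r_shunt_mohm, target_fraction):
    """Resistor to place in parallel with a current-sense shunt so the
    card reports target_fraction of its true power draw.

    The sensed voltage scales by R_eff / R_shunt, where
    R_eff = (R_shunt * R_add) / (R_shunt + R_add), which simplifies to
    fraction = R_add / (R_shunt + R_add). Solving for R_add gives:
    R_add = fraction * R_shunt / (1 - fraction).
    """
    return target_fraction * r_shunt_mohm / (1 - target_fraction)

# Hypothetical 5 mOhm shunt; card should "see" 75% of real power:
print(parallel_shunt(5.0, 0.75))  # -> 15.0 (mOhm)
```

So the parallel resistor is a few times larger than the shunt itself, which is why tiny value errors matter less than you would expect.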


----------



## Madmaxneo

@KillerBee33 That is exactly what I was looking for.
@mattliston Thanks for the added tips. Though I won't be doing the power shunt mod....

Thanks for the advice so far all!

FYI, I am not that worried about temps as I am watercooling my GPU via a Swiftech H140-X paired with a Heatkiller IV Pro GPU block. My temps rarely go above the low 40s, and they only reach those temps under extreme benching.


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> you can if you play with the curves. not generally worth it though.
> 
> My best performance was at 2088-2076mhz. Scores really didn't improve any if I increased voltage above stock levels.
> 
> The key to maximizing clocks under load is to make sure your case has good air flow and set the peak frequency one step below the max voltage point on the curve that your current settings allow (1.050v for stock or 1.081v at +100). That will leave the card an extra step of voltage headroom to use before it will start to reduce clocks as temps increase.
> 
> 
> 
> You can also add a waterblock to your GPU to help ensure the temps never go high enough to cause the card to throttle down the clocks.... I use a Heatkiller IV pro water block on my 1070 and it has never gone above about 50 degs on any bench, test, or game so far.
Click to expand...

That is true; however, whether taking the card apart voids the warranty depends on what brand you buy. EVGA is OK with watercooling all of their models; Asus and MSI will void the warranty, for example.

My MSI will run in the 50s, so it is not a huge issue leaving my card on air, and it still performs better than most cards out there even running on a Sandy Bridge CPU.


----------



## gtbtk

Quote:


> Originally Posted by *21Dante*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> you can if you play with the curves. not generally worth it though.
> 
> My best performance was at 2088-2076mhz. Scores really didn't improve any if I increased voltage above stock levels.
> 
> The key to maximizing clocks under load is to make sure your case has good air flow and set the peak frequency one step below the max voltage point on the curve that your current settings allow (1.050v for stock or 1.081v at +100). That will leave the card an extra step of voltage headroom to use before it will start to reduce clocks as temps increase.
> 
> 
> 
> I use the curve and still,power limit hits 112% at most games.
> Maybe my 1070 is power hungry,dunno.
Click to expand...

The vendor bios has a lot to do with cards hitting power limits. The EVGA bios sets the card up so it power throttles with a 1080p workload. The MSI Gaming bioses will not.

I was not talking about power limits in my post though; I was talking about setting the max frequency on the curve one step down from the maximum that the Afterburner power settings allow. You set up the curve so that the fastest frequency the curve allows sits at the second-to-last point within the accessible range, and then flatten the curve from there through the last accessible voltage point. The card will use that voltage regardless of the load that you place on it.

Power limits depend on the load that you put on the card.
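The flatten-past-the-peak idea can be sketched in a few lines. This is just an illustration of the shape, using a made-up list of (mV, MHz) curve points rather than anything read from a real card:

```python
def flatten_curve(points, peak_mv, peak_mhz):
    """Pin the chosen voltage point to the target frequency and flatten
    every point at or above that voltage to the same frequency, so the
    card never needs a higher voltage step to go faster.

    points: list of (millivolts, megahertz) tuples, sorted by voltage.
    """
    out = []
    for mv, mhz in points:
        if mv >= peak_mv:
            out.append((mv, peak_mhz))       # flat ceiling past the peak
        else:
            out.append((mv, min(mhz, peak_mhz)))
    return out

# Made-up stock-ish curve: frequency climbs with voltage
stock = [(900, 1860), (950, 1911), (1000, 1949), (1050, 1999), (1081, 2012)]
print(flatten_curve(stock, 1050, 2050))
```

Every point from 1050mV upward ends up at the same frequency, which is exactly what dragging and flattening in the Afterburner curve editor achieves by hand.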


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> @KillerBee33 That is exactly what I was looking for.
> @mattliston Thanks for the added tips. Though I won't be doing the power shunt mod....
> 
> Thanks for the advice so far all!
> 
> FYI, I am not that worried about temps as I am watercooling my GPU via a Swiftech H140-X paired with a Heatkiller IV pro GPU block. My temps rarely go above the low 40's and only reach those temps under extreme benching.


make sure you also have airflow on the VRMs


----------



## gtbtk

Quote:


> Originally Posted by *KillerBee33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Madmaxneo*
> 
> I am looking at doing some benchmarks with my GTX 1070.
> 
> But I have forgotten some basic things...lol.
> 
> What were the basic parameters for increasing core and memory clocks again?
> 
> 
> 
> Start +150 Clock +500 Memory and take it from there. 10 Series can go up to +800 on the memory.
Click to expand...

the core clock offset will depend on the factory OC in the bios. The Gigabyte Xtreme will be lucky to get more than +25, as it is already +155 above the reference clock out of the factory.

You can calculate the headroom starting from the reference clock of 1506MHz. The EVGA SC is 1595MHz base from memory, so +150 may not be achievable.
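In other words, what matters is the total clock, not the slider value. A quick sketch of that arithmetic (the ~1706MHz ceiling here is just an illustrative number for a hypothetical chip, not a spec):

```python
REFERENCE_BASE = 1506  # MHz, GTX 1070 reference base clock

def remaining_offset(factory_base_mhz, silicon_ceiling_mhz):
    """Offset headroom left in Afterburner once the factory OC is
    accounted for: the same silicon ceiling means a smaller slider
    value on a factory-overclocked card."""
    return silicon_ceiling_mhz - factory_base_mhz

# Hypothetical chip that tops out +200 over reference (1706MHz total):
ceiling = REFERENCE_BASE + 200
print(remaining_offset(1506, ceiling))  # reference card:   200
print(remaining_offset(1595, ceiling))  # EVGA SC:          111
print(remaining_offset(1661, ceiling))  # Gigabyte Xtreme:   45
```

Same silicon, same total clock; only the slider number changes between models.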


----------



## KillerBee33

Quote:


> Originally Posted by *gtbtk*
> 
> the core clock will depend on the factory OC in the bios. the Gigabyte Xtreme will be lucky if you can get more than +25 as it is already +155 above the reference clock out of the factory.
> 
> you can calculate the addition starting calculations at the reference clock of 1506Mhz. The EVGA SC is 1595mhz base from memory so +150 may not be achievable


He got an EVGA SC with a 1607MHz base clock, on which you might still get another +125 depending on the card. But I'd calculate +200 from the reference 1506MHz first. Memory really varies on all of them but gives a huge difference in benches.


----------



## Madmaxneo

Quote:


> Originally Posted by *gtbtk*
> 
> make sure you also have airflow on the VRMs


The entire card is covered by the waterblock; this includes the VRMs.


Spoiler: Warning: Spoiler!






Quote:


> Originally Posted by *gtbtk*
> 
> the core clock will depend on the factory OC in the bios. the Gigabyte Xtreme will be lucky if you can get more than +25 as it is already +155 above the reference clock out of the factory.
> 
> you can calculate the addition starting calculations at the reference clock of 1506Mhz. The EVGA SC is 1595mhz base from memory so +150 may not be achievable


Without any OC my card hits just under 2000MHz and never gets hotter than 43 degs in Valley, but that's normal. The memory on my card is Micron, and it is limited in how far I can OC it before I start getting artifacts.


----------



## THC Butterz

My new Asus 1070 Strix. Only paid $250 shipped for this bad boy with the EK waterblock and backplate, plus the stock Strix cooler for backup.


----------



## gtbtk

Quote:


> Originally Posted by *Madmaxneo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> make sure you also have airflow on the VRMs
> 
> 
> 
> The entire card is covered by the waterblock this includes the VRMs
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> the core clock will depend on the factory OC in the bios. the Gigabyte Xtreme will be lucky if you can get more than +25 as it is already +155 above the reference clock out of the factory.
> 
> you can calculate the addition starting calculations at the reference clock of 1506Mhz. The EVGA SC is 1595mhz base from memory so +150 may not be achievable
> 
> Click to expand...
> 
> Without any OC my card hits just under 2000mhz and it never gets hotter than 43 degs on Valley, but that's normal. The memory on my card is micron and it is limited on how far I can OC it until I start getting artifacts.
Click to expand...

Sorry, I thought you said you were using an AIO.

I was trying to say that a factory overclock reduces the overclock headroom available on the GPU. While a reference card may be fine at +200, an overclocked card is unlikely to get that high. Someone said start at +200, but that really depends on the card you are using.

I have Micron memory too. I'm the guy that tracked down the Micron bug and got Nvidia to fix it, with the help of a few guys here in the forum. Make sure that your card is running the 86.04.50.00.xx bios version and not the original 86.04.26 bios that the Micron cards shipped with.

I found, too, that increasing VCCIO voltage a little in the UEFI helped with VRAM OC stability as well. I can do +650 above reference on my Micron card without any dramas. My card starts getting funky just under +700 above the reference clock.


----------



## gtbtk

Quote:


> Originally Posted by *THC Butterz*
> 
> My new Asus 1070 Strix, Only paid $250 shipped for this bad boy with the EK watterblock and backplate + the stock strix cooler for backup


That's a great deal.


----------



## Madmaxneo

Quote:


> Originally Posted by *gtbtk*
> 
> Sorry, I thought you said you were using an AIO.
> 
> I was trying to say that the factory overclock reduce the overclock headroom available on the GPU. While a reference card may be fine at +200, an overclocked card is unlikely to get that high. Someone said start at +200 but that really depends on the card you are using.
> 
> I have Micron memory too. I'm the guy that tracked down the Micron bug and got Nvidia to fix it with the help of a few guys here in the forum. Make sure that your card is running the 86.04.50.00.xx bios version and not the original 86.04.26 bios that originally came with the micron cards.
> 
> I found too, that increasing VCCIO voltage in the UEFI a little helped with the Vram Memory OC stability performance as well. I can do +650 above reference on my Micron card without any dramas. My card starts getting funky just under +700 above the reference clock.


I'm using a Swiftech H140-X AIO, but I've changed the CPU block out for the Heatkiller block along with changing the tubing. Swiftech AIOs are meant to be configurable.
I'll check on that bios and possibly increase the VCCIO when I get home. Thanks!


----------



## Sylencer90

Hello guys, i just registered here to ask some simple questions about my overclocking.

Currently I'm getting 1962MHz on the clock and 4276MHz on the mem.

Had my clock higher a while ago, but I faced freezes in a couple of games. Now the max I can go up is +60.

Using the Gainward GeForce GTX 1070 Phoenix "GLH".

Can I go higher on the mem? I'm scared to risk my GPU since I heard the Micron memory is really fragile xD

Here's my Afterburner settings: http://prntscr.com/i19n76


----------



## Dude970

Crank your memory up to +500. The Micron bug was fixed with a vBIOS update.


----------



## gtbtk

Quote:


> Originally Posted by *Sylencer90*
> 
> Hello guys, i just registered here to ask some simple questions about my overclocking.
> 
> currently im getting 1962 mhz on clock and 4276 on mem
> 
> Had my clock higher a while ago but i faced freezes in couple games. Now the max i can go up is +60
> 
> Using the Gainward GeForce® GTX 1070 Phoenix "GLH"
> 
> Can i go higher on mem? Im scared to risk my gpu since i heard the micron memory is really fragile xD
> 
> Heres my afterburner settings : http://prntscr.com/i19n76


make sure that your card has the 86.04.50.xx.xx version vbios installed. Micron memory is fine; +500 to +650 should be achievable.

The worst you can do to your GPU if you are overclocking an unmodded card is crash the PC; after a reboot it will be back at stock speeds.


----------



## mattliston

+500MHz memory?

This whole time I've been uber conservative, thinking I was forcing the ECC functionality to jump in. Perhaps I was coming up on a memory strap that just needed to be pushed to another setting.


----------



## Sylencer90

Quote:


> Originally Posted by *gtbtk*
> 
> make sure that your card has the 86.04.50.xx.xx version vbios installed. Micron memory is fine. +500 to +650 should be achievable.
> 
> The worst you can do to your GPU if you are overclocking an unmodded card is crash the PC, after a reboot it will be back at stock speeds





So am I good to push it up more?

Edit: Core 2126 & mem 4604 is the max I can go; everything above will crash my Unigine Valley benchmark.

Was running Witcher 3 for like 30 mins on +450 mem and then the GPU driver crashed.

Guess I'll have to leave it at +60 core / +350 mem.


----------



## gtbtk

Quote:


> Originally Posted by *mattliston*
> 
> +500mhz memory?
> 
> this whole time Ive been uber conservative, thinking I was forcing the ECC functionality to jump in. Perhaps I was coming up on a memory strap that just needed to get pushed to another setting.


Micron memory cards using the original vbios (86.04.26.xx.xx) need updating.

I found that after the vbios update on my rig, a little extra VCCIO voltage helped improve VRAM OC stability and got me higher memory clocks.

Get hold of OCCT. The free version has a VRAM test section that you can run against your card. It will show you any memory errors in real time.

I found it handy to run that tool as I push up memory clock speeds in steps. For the first 500 or so there are no memory errors. You will initially see one or two errors pop up occasionally; then, as you go higher still, the number of errors will start to increase exponentially. These cards do have the capacity to error-correct to a point, but as the number of errors increases, performance drops off, or ultimately the errors surpass the card's ability to correct them, causing intermittent crashes. You want to limit the overclock to the point just below where the errors start to increase exponentially. The card should deal with the odd intermittent error no problem.
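That stepping procedure amounts to a simple stopping rule around any error counter. A sketch, where `count_vram_errors` is a purely hypothetical stand-in for however you read the error count out of your test tool (OCCT has no public API):

```python
def find_memory_ceiling(count_vram_errors, start=0, step=50, stop=1000,
                        blowup_factor=10):
    """Step the memory offset up and back off when errors start growing
    exponentially. count_vram_errors(offset) -> errors observed in one
    test pass at that offset (hypothetical hook into your test tool).
    Returns the last offset tried before the error count blew up."""
    last_good, prev_errors = start, 0
    for offset in range(start, stop + step, step):
        errors = count_vram_errors(offset)
        # the "exponential" knee: errors jumped by blowup_factor in one step
        if prev_errors and errors > prev_errors * blowup_factor:
            break
        if errors > prev_errors:
            prev_errors = errors
        last_good = offset
    return last_good

def fake_errors(offset):
    """Fake error curve: clean to +500, a trickle, then a blowup."""
    if offset <= 500:
        return 0
    if offset <= 650:
        return 2
    return 500

print(find_memory_ceiling(fake_errors))  # -> 650
```

The odd stray error is tolerated, exactly as described above; only the sharp jump ends the search.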


----------



## gtbtk

Quote:


> Originally Posted by *Sylencer90*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gtbtk*
> 
> make sure that your card has the 86.04.50.xx.xx version vbios installed. Micron memory is fine. +500 to +650 should be achievable.
> 
> The worst you can do to your GPU if you are overclocking an unmodded card is crash the PC, after a reboot it will be back at stock speeds
> 
> 
> 
> 
> 
> 
> So am I good to push it up more?
> 
> Edit : Core 2126 & Mem 4604 is the max i can go, everything above will let my Unigine Valley Benchmark crash.
> 
> Was running witcher 3 for like 30mins on +450mem and then the gpu driver crashed.
> 
> Guess i'll have to let it on +60 core / + 350 mem.
Click to expand...

In the Afterburner settings panel you need to check the box that allows you to adjust the card voltage. The way your screenshot of Afterburner is set, the maximum voltage you will get from the card is 1.063V. You can increase that to 1.093V, and you might get more stability at 2126. I can run my card at that speed too, but I honestly got better results at 2088-2076.

For memory overclocks, I found with my card that in everything I tested, with one exception, I was stable at +500 above reference clocks. Your card has a factory overclock on the memory to start with, so you need to adjust those numbers, but I would think that you could explore an extra 150-200MHz. You are at 8910MHz now; 9200-9300 may well be achievable.

The one exception to +500 stability I found was Time Spy. I could not get the memory clocks above about +430 without crashing.

Overclocking using Afterburner will not hurt or damage the card. It may crash/hang the PC, but the card will go back to stock clocks after the reboot. All these cards are slightly different; the best way to check is to try it and see what happens.
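The arithmetic behind those numbers: GDDR5 is quad-pumped, and Afterburner shows double the command clock, so the effective rate is 2x the Afterburner reading, and each +100 of offset adds 200MHz effective. A sketch, assuming the usual 4004MHz reference reading for a 1070 (verify against your own card):

```python
REF_AB_CLOCK = 4004  # MHz shown in Afterburner for a reference GTX 1070

def effective_mhz(afterburner_mhz):
    """GDDR5 moves data at 4x the command clock; Afterburner already
    shows 2x, so effective rate = 2x the Afterburner reading."""
    return 2 * afterburner_mhz

print(effective_mhz(REF_AB_CLOCK))        # 8008: reference effective rate
print(effective_mhz(REF_AB_CLOCK + 500))  # 9008: +500 offset
print(effective_mhz(4455))                # 8910: the GLH factory OC above
```

That is why a +500 offset relative to reference lands a factory-overclocked card like the GLH in the 9200-9300 effective range with only another +150-200 on the slider.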


----------



## Sylencer90

Quote:


> Originally Posted by *gtbtk*
> 
> In the afterburner settings panel you need to check the box to allow you to adjust the card voltage. the way your screen shot of afterburner is set, the maximum voltage you will get from the card is 1.063V. You can increase that to 1.093v and you might get more stability at 2126. I can run my card at that speed too but I honestly got better results at 2088-2076.
> 
> For memory overclocks, I found with my card in everything i tested, except one thing, i was stable at +500 above reference clocks. Your card has a factory overclock on the memory to start with, so you you need to adjust those numbers but I would think that you could explore overclocks with an extra 150-200mhz. You are at 8910Mhz now. 9200 -9300 may well be achievable.
> 
> The one exception to +500 stability I found was time spy. I could not get the memory clocks above about +430 without crashing.
> 
> Overclocking using afterburner will not hurt or damage the card. It may crash/hang the PC but the card will go back to stock clocks after the reboot. All these cards are slightly different, the best way to check is to try it and see what happens.


At how much would I need to set the slider to get those extra volts?

Nvm. Put the slider on +100. Sitting at 1.0810V now. Will see how it works.

Edit: +60 core / +360 mem is the max I reach before Unigine Valley crashes/freezes on me.


----------



## mattliston

Is anybody interested in some PNY 1070 pictures? I have some questions regarding 2 spots on the PCB where it says "unlock vcore" and "unlock ocp".

Also wondering if there are any good points to add some capacitors to clean up power delivery for my future power shunt mod, as I constantly hit the power limit in synthetic benchmarks, even at under 900mV.

Heck, OCCT was hitting the power limit at 1780-ish MHz and 800mV.


----------



## gtbtk

Quote:


> Originally Posted by *mattliston*
> 
> Is anybody interested in some PNY 1070 pictures? I have some questions regarding 2 spots where it says unlock vcore and unlock ocp (spots on pcb)
> 
> Also, wondering if there are any good points to add some capacitors in order to clean up power delivery for my future power shunt mod, as I constantly hit power limit in synthetic benchmarks, even at under 900mV
> 
> heck, OCCT was hitting power limit at 1780-ish mhz and 800mV


If you feel like being experimental, you could try cross flashing your card. Different bios versions from different vendors handle power limits and the ramp-up in power draw differently. The EVGA bios versions hit power limits very quickly. The Asus Strix and MSI Gaming X/Z bioses tend to be much more resistant to hitting power limits, particularly at 1080p.

The OCCT and FurMark workloads are designed to hit power limits early, and do it on all cards; they are "power virus" applications and not representative of the real world. Don't focus your attention on those results regarding power draw.

PNY is about the only brand of bios that I have not tried cross flashing to my card (MSI Gaming X). I don't know what sort of VRM they are using. As far as I can tell, all the 1070 cards are using reference PCBs; if that is true, the VRM will handle up to about 250W. If you do decide to try cross flashing, be aware that some bioses work better on your hardware than others. Some bios versions will disable some of the display ports to support their custom HDMI ports etc., so if your screen doesn't seem to work on the reboot, don't panic, and try the other ports on the card first.

The Galax HOF and snipr cards use a different voltage controller, so don't try those or it will brick your card.


----------



## mattliston

10 pictures in cellphone potato glory.

These were taken during my recent re-paste.

Thermal pads felt like junk, the paste was nasty compared to my XFX RX 480 and R7 370, and the pictures don't show the tree-bark texture of the cooler.

If this is indeed a reference design, not only am I sad Nvidia gimped the execution of the power delivery components, but I am happy I would be able to use a copper-based cooler with much better VRM cooling.


Spoiler: Warning: Spoiler!
















Any thoughts on power filtering? I'd like to experiment with several groups of small capacitors in the 100-220µF range, probably ones with very low ESR and perhaps around a 50% voltage margin (capacitors perform better in the middle of their spec, not at the top or bottom, i.e. a 2V system might want 2.7V capacitors).

EDIT: I cannot remember if I used TG-7 or IC7 on this GPU.

EDIT #2: the synthetic workloads I used were OCCT for memory checking, and Heaven run at a custom window resolution of around 1700x1045 or something, to have a windowed run that covers 95% of the desktop but leaves the taskbar available to use.


----------



## philrock

I have an Asus ROG Strix GTX 1070 OC running on a Threadripper 1950x system.

I used the Heaven benchmark to stress test with MSI Afterburner and settled on a stable setting of +90 core, +400 memory. I didn't touch the voltage in my final settings; when I experimented with it, it didn't seem to give me any more stability, and the card would hang at the same settings.

Here are CPU-Z stats...

*At stock:*

*
At +90 core, +400 mem:*


Heaven Benchmark results...

*Stock (4244):*


*OC (4430):*


I also wanted to test for content creation; my main use is 3D animation. For some reason I found Cinebench R15 to be very inconsistent, so I abandoned that and found a benchmark program for Redshift, which in the end is a lot more useful as I plan on using it for rendering.

*Stock (15m33s):*


*OC (14m27s):*


Doing the math, I found a +4.4% increase in performance with the Heaven bench, and a +7.6% increase with Redshift.

I also took note of peak core speeds: 2075MHz for Heaven and 2114MHz for Redshift, which might explain the slightly better performance increase.

Overall I'm pretty happy. I noticed the stock boost was already giving me high performance at 1987MHz, well above the advertised boost speed of 1873MHz, so in the end even people who don't overclock are getting great value with this card.

It's debatable whether running Afterburner on startup is even worth it for a <10% increase in performance; I'd imagine for most people it wouldn't be noticeable in everyday use, but it's nice to have the option.
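Those percentages check out if you treat Heaven as score-based (higher is better) and Redshift as time-based (lower is better, so the gain is measured as throughput). A quick verification:

```python
def score_gain(stock, oc):
    """Percent gain for higher-is-better scores."""
    return (oc / stock - 1) * 100

def time_gain(stock_s, oc_s):
    """Percent throughput gain for lower-is-better times."""
    return (stock_s / oc_s - 1) * 100

print(round(score_gain(4244, 4430), 1))             # Heaven:   4.4
print(round(time_gain(15*60 + 33, 14*60 + 27), 1))  # Redshift: 7.6
```

Dividing the times the other way (867/933) would give a 7.1% time reduction; the 7.6% figure is the render-throughput gain, which is the more useful number for rendering workloads.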


----------



## gtbtk

@philrock Threadripper brings its own challenges; the NUMA nature of the chip complicates things, particularly when accessing a GPU. You want to render, which benefits from all the cores, but you also want to game, where running on all 16 cores is detrimental.

You will get the best results from your Threadripper rig if you manage the threads of your applications based on what workload you are planning to run. There are a couple of tools that will automate it for you. Process Lasso is a paid product and works quite well, but I have also found a free one called Bill2's Process Manager that you can download from here https://www.bill2-software.com/processmanager/download-en.shtml and here is a review http://www.thewindowsclub.com/bill-2-s-process-manager-review-download

For your gaming loads, set a template to use the first 15 logical CPUs, leaving the last logical CPU on die 0 unused for games. For some reason, the drivers want to put a large load on the last SMT thread and run it at 100% instead of spreading it more evenly over the rest of the die. By adjusting the affinity, you can mitigate that to a large extent and maximize gaming performance.

The results that you are getting with your Asus card are in line with what I would expect given the 1633MHz base the bios sets. You may improve performance by using the curve instead of the slider. Leave most points at stock, increase the 0.950V point as high as you can get it stable, and boost the 1.081V point up as high as you can keep it stable.

The memory at +400 is maybe a touch lower than I would expect on an Intel platform, but that may have something to do with the TR memory controller rather than the GPU itself.


----------



## HowYesNo

guys, I got this Gainward GTX 1070. It runs fine, and there's no backplate on it.
I've noticed during gaming that the back side, where the VRMs are, gets quite hot.
Can I place some thermal pad and a heatsink on the area marked in the photo?
It's not an even surface, as there are some capacitors and stuff there, so will that work?
Thanks.


----------



## mattliston

Do you have a spare fan header open on your motherboard?

Do what I did on my 1070: grab a 60 or 80mm fan, run 4 zip ties through the mounting holes to space the fan off the card a few millimeters (with the heads of the zip ties), and enjoy a very noticeable temperature drop.

My core runs around 3-4 degrees cooler, and I no longer hear the fans kicking up and down.


----------



## Nukemaster

Quote:


> Originally Posted by *HowYesNo*
> 
> guys, i got this gainward gtx 1070 it runs fine, and no backplate on it.
> I've noticed during gaming back side where the vrm's are get quite hot.
> can i place some thermal pad and some heatsink on area marked in photo?
> it's not even surface as ther are some capacitors and stuff there, so will that work.
> thanks.


You can.
That is the idea behind Arctic's backside VRM coolers bundled with many of the VGA coolers they sell.

Example
https://www.arctic.ac/worldwide_en/accelero-xtreme-iv.html

This works very well because, if you look at the card, you will see that it has many thermal vias (they look like dots on the back of the board) connecting the front of the board to the back.

This is very common with surface-mount parts because they are very small and lack a metal tab to bolt a heatsink onto like older parts. They instead have a pad that, when soldered to the board, dumps heat into the board itself. The thermal vias give the heat a path to the back of the board, where the board maker can have large sections of copper to help with cooling.

Something like this

Image source here

A heatsink on the back will enhance this effect.

Please note these VRMs can take VERY high temperatures, so you do not HAVE to do this, but this is OCN.


----------



## asdkj1740

Quote:


> Originally Posted by *HowYesNo*
> 
> guys, i got this gainward gtx 1070 it runs fine, and no backplate on it.
> I've noticed during gaming back side where the vrm's are get quite hot.
> can i place some thermal pad and some heatsink on area marked in photo?
> it's not even surface as ther are some capacitors and stuff there, so will that work.
> thanks.


do this
http://cdn.overclock.net/6/61/900x900px-LL-61df8e32_SBPC.png


----------



## saunupe1911

Anybody crypto mining with their GTX 1070??? Has it been profitable?

Sent from my Pixel 2 XL using Tapatalk


----------



## BulletSponge

saunupe1911 said:


> Anybody crypto mining with their GTX 1070??? Has it been profitable?
> 
> Sent from my Pixel 2 XL using Tapatalk


A single 1070 gets about $4 a day with NiceHash. Not much, but I run it when I'm at work or asleep.
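Whether that $4 a day is actually profit depends on your power draw and electricity rate. A back-of-the-envelope sketch (the 150W draw and $0.12/kWh figures are assumptions; plug in your own):

```python
def daily_profit(revenue_usd, card_watts, kwh_price_usd):
    """Net mining profit per day: payout minus 24 hours of electricity."""
    kwh_per_day = card_watts * 24 / 1000
    return revenue_usd - kwh_per_day * kwh_price_usd

# Assumed: $4/day payout, ~150W at the wall, $0.12/kWh
print(round(daily_profit(4.0, 150, 0.12), 2))  # -> 3.57
```

At those assumed figures, electricity eats only about 10% of the payout, though that ratio swings a lot with local power prices.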

Off topic, how are you liking the Pixel 2 XL?


----------



## saunupe1911

BulletSponge said:


> A single 1070 gets @ $4 a day with Nicehash. Not much but I run it when at work or asleep.
> 
> Off topic, how are you liking the Pixel 2 XL?


It's the fastest phone I've ever used. However, my screen does have a very slight blue tint at tilted angles.

I think I may start mining just for the heck of it, since I leave my PC on almost 24/7.

Sent from my Pixel 2 XL using Tapatalk


----------



## Nawafwabs

How can I try different vbios ?

I have Asus 1070 o8g


----------



## gtbtk

Nawafwabs said:


> How can I try different vbios ?
> 
> I have Asus 1070 o8g


Experimentation is fun, but do not try to flash the Galax HOF, MSI Lightning, EVGA Kingpin or Classified bioses: they use a different voltage controller and will brick your card. Some bioses also disable ports, so if you flash a bios and your monitor stops working when you reboot, try connecting the monitor to a different port on the card before you worry about the card being bricked. Bricking a card does not mean you have destroyed it; you can recover it if you have an iGPU or a second graphics card that you can boot your system from.

Make sure that you extract a copy of your original bios and save it to a safe place before you start messing with anything. You can use nvflash or GPU-Z to do that; GPU-Z is probably the easiest.

The Asus Strix OC bios is actually one of the best bioses around outside of the specialist cards like the Lightning or Kingpin. In my opinion, you are not likely to find much, if any, improved performance by cross flashing the OC model card. If your card was the non-OC or the Advanced version, flashing the Strix OC version will give you a free upgrade to the top model, as the hardware on all three models is identical. Remember, the card is limited by the quality of the physical silicon, and cross flashing does not change that.

If you still want to experiment, you need a copy of nvflash64 and a copy of the bios file you want to test. You can download both from techpowerup.com: the bios files are in the VGA Bios database and the latest version of nvflash is in the downloads section. Only download and flash 1070 bios files that are version 86.04.50.xx.xx. Remember that cross flashing vbioses is a risk.

After you make a backup of your original bios, the process to manually flash your card is as follows:

Make a directory; c:\nvflash makes this easy. Copy the nvflash64.exe and nvflash64.sys files and the bios file into the directory you created.
Open Device Manager and disable the 1070 device. The screen will go black and then the PC will revert to the default Microsoft VGA driver.
Open an administrative command prompt.
In the command prompt, cd to the directory you created.

I am assuming that you only have a single 1070 installed in your machine. The command you need to type in is:

nvflash64 -6 newbiosfile.rom

You will get two warnings that there is a mismatch between the bios and your card.

Type "Y" to continue at each prompt. The utility should then flash the card and report that the flash was successful. 

Open Device Manager and re-enable the 1070 device, then reboot your PC. The new bios will take effect at boot time.

If it tells you that the card was erased and then fails during the flashing process, don't panic and don't reboot the PC. These cards only ever read the bios at boot time, so grab a copy of your backup bios file and reflash the card with it.
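The backup-then-flash sequence above can be sketched as a dry-run script. The filenames (backup.rom, newbios.rom) are placeholders, and the script only prints the commands unless you deliberately flip the switch; the `--save` and `-6` switches are the nvflash ones described above:

```python
# Dry-run sketch of the flashing session described above.
# backup.rom / newbios.rom are placeholder names, not real files.
# Leave DRY_RUN = True unless you actually intend to flash.
import subprocess

DRY_RUN = True

def run(*cmd):
    """Print the command; execute it only when DRY_RUN is off."""
    print("would run:", " ".join(cmd))
    if not DRY_RUN:
        subprocess.run(cmd, check=True)

run("nvflash64", "--save", "backup.rom")  # 1. back up the original bios first
run("nvflash64", "-6", "newbios.rom")     # 2. flash; -6 overrides the mismatch prompts
```

Remember the mismatch prompts still ask for "Y" twice when you run the real thing interactively.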

The Asus card has a VRM rated up to about 250W and, from memory, only a single 8-pin power connector. I would recommend that you avoid bioses like the Zotac AMP Extreme's, as it is designed to pull 300W and can potentially overload the VRM on the Asus card. The card will run with the Zotac bios; the danger is more long term, caused by the higher-than-spec power draw, and it will not cause instant failure, but if you do try any of the top-of-range bioses, monitor your power and temps closely. hwinfo64 is the best tool for that.

While I don't like the EVGA bioses as daily drivers (they hit power limits much earlier than other brands), they do enable the auto-overclocking utility in Precision XOC, which you can use to profile your card and easily find roughly what the overclock limit is for each voltage point on your voltage curve. With Pascal, if you use the slider to overclock, the best increase you can get is limited by the weakest voltage point, and you leave unused the extra performance that overclocking from the curve's other, stronger points would give you.
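To illustrate the slider-vs-curve point numerically: with a single slider the whole curve shifts by one offset, capped by the weakest point, while curve editing lets each point sit at its own limit. All the per-point clocks and limits below are invented numbers for the example, not real profiling data:

```python
# Toy model of slider vs per-point curve overclocking on Pascal.
# All clocks/limits below are invented for illustration.
stock = {0.95: 1936, 1.00: 1974, 1.05: 2000}   # stock clock at each voltage point
limit = {0.95: 2088, 1.00: 2050, 1.05: 2101}   # max stable clock found per point

# Single slider: one offset applied everywhere, capped by the weakest point.
slider_offset = min(limit[v] - stock[v] for v in stock)
slider_curve = {v: stock[v] + slider_offset for v in stock}

# Curve editing: every point runs at its own limit.
curve = dict(limit)

print(slider_offset)                                   # capped by the 1.00v point
print(max(slider_curve.values()), max(curve.values())) # slider leaves MHz on the table
```

In this made-up example the slider is stuck at +76 because of the weak 1.00v point, while the curve still reaches the 1.05v point's full limit.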

Some of the bioses that may be worth experimenting with are the Gainward Phoenix GLH/Palit GameRock Premium bioses; they are highly overclocked cards with 8-pin power designs.


----------



## zipper17

I hit a 21k GPU score in Firestrike standard with the most stable settings I have been using, on the 390.65 driver.

Previously I never hit 21k with those stable settings, only 20.8-20.9k. Hmm... lucky, or did the driver improve?

I can easily get 21k with dirty & rough settings, but it fails the stress test (crashes).


----------



## gtbtk

zipper17 said:


> I hitted 21k GPU scores in Firestrike standard, with most stable settings that I have been use, using 390.65 driver
> 
> previously I never hit 21K with that stable settings, only 20,8-20,9k. Hmm ... lucky or the driver get improved?!
> 
> I can easily get 21K with dirty & rough settings, but it will fail in stress test (Crashes)


Well done. 

My best graphics score is 21500. I was stuck at the same wall you were for a long time, until I finally found the right combination of settings: for me, the best performance came at 2076-2088 rather than over 2100MHz, plus a side-panel-mounted case fan.


----------



## HMRMike

gtbtk said:


> Some of the bioses that may be worth experimenting with are the Gainward Pheonix GLH/Palit Gamerock premium bioses. they are highly overclocked cards and are 8 pin power desgned cards


The whole guide was extremely helpful, but this last point reminded me to try again!
I tried different, apparently compatible, BIOS versions on my Gainward Phoenix GS a while back, but every flashing attempt and command failed. Maybe the newest nvflash did some magic this time?
Among the various Gainward listings at TPU there is this one marked as GLH, distinctly limited to a nice 225W instead of 195W like the rest:
https://www.techpowerup.com/vgabios/186552/gainward-gtx1070-8192-160930
After flashing, Afterburner let me throw the power slider up to 115% instead of 114%. The real change was the reported power usage. Where the original BIOS hit 114% and throttled even under 2100MHz, now it struggles to go above 100% at all. The result is a perfectly steady core clock. 

Cooling the GPU was also important to me, so it received a coating of some liquid metal (cheap eBay galinstan, none of that 1-gram-for-a-ripoff-price commercial stuff). At 100% fan speed the core now stays under 49-50C (20C ambient).

It doesn't clock too high on the core before hitting instability at any possible voltage, so nothing special there, but at least the Samsung memory helps a bit.
Passed Time Spy at 2125/9600 and broke my 7K barrier. Maybe further tweaking could bump this a few points.
http://www.3dmark.com/spy/3304053
So I'm happy and can calm down and clock down until the next generation.
Many thanks for your time to explain everything!


----------



## Falkentyne

You DID put nail polish (Cellulose based) around those SMD resistors didn't you?

Also, can you link the ebay item that you purchased? There's strange stuff all over. Stuff from Latvia and other places 
Thank you!


----------



## HMRMike

Falkentyne said:


> You DID put nail polish (Cellulose based) around those SMD resistors didn't you?
> 
> Also, can you link the ebay item that you purchased? There's strange stuff all over. Stuff from Latvia and other places
> Thank you!


In the photo there's temporary tape on the SMDs for masking, later coated properly 
I learned a big lesson after frying a 280X that way: a little too much metal and it went all over the PCB... But I was young and stupid.

The original listing was sold out, this is the equivalent I can find on the seller's page, and it's sold out again  German stuff, sehr gut!
https://www.ebay.com/itm/GALINSTAN-...ber-Alternative-Flussig-Gallium-/262740584919
The thing is, any liquid metal I could previously find that was marketed as TIM only came in small ~1 gram syringes, for the same price as these 10 grams.


----------



## Falkentyne

Do you think this stuff is safe to buy?

https://www.ebay.com/itm/Galinstan-...372053?hash=item4b38602455:g:488AAOSwzBJaf3aT


----------



## HMRMike

Galinstan is galinstan, as long as the seller isn't lying and it's not actually mercury or NaK, which kill you even better (but who in their right mind would sell those for cheap?).
I think 30 grams is way too much but boy is it cheap (relatively) per application! 
I think 30 grams is way too much, but boy is it cheap (relatively) per application! 
Make sure to remove any oxides and stuff you'll see on the surface layer; they can get lumpy, and we just want the shiny, shiny liquid. You'll also face the issue of removing the tiny amount you need from the container without contaminating the rest somehow. I've had success with wooden toothpicks: the metal somewhat sticks to them after some scraping, and you can isolate a small drop from the pool. If you have a tiny syringe, just use that.


----------



## mattliston

If you can get to a hobby shop, you can find those Q-tips made with a kind of foam-ish material; they grab at debris fairly well.


I'm kinda thinking along the lines of metallurgy/metal-melting, where you remove the top "crust" to get at the good material beneath.


----------



## Nawafwabs

Guys I have Asus 1070 o8g 

I want to run it on lower speed clock and memory also

Because I feel input lag when I play games 

If I flash bios from Asus nonOC edition will work with my GPU?


----------



## mattliston

Lower core and memory clocks won't solve input lag.

Go into the Nvidia control panel and look at the v-sync options. One of them is either fast, adaptive, or perhaps another name; the description of the setting will mention "reduces input lag".


Your core/memory speeds only affect input lag when the card is struggling to render frames. Otherwise it's generally a driver setting that needs to be adjusted.


----------



## Loodeelow

So conclusion is that TI bios is not possible on regular 1070?


----------



## gtbtk

No problem, glad it worked out for you. When you start cross flashing these cards, you really see that the higher price does buy you a better cooler; Pascal is quite efficient and doesn't need a bazillion-phase VRM, and other than that they are all pretty much the same.

The 1671MHz bios versions are only available from Gainward/Palit and Gigabyte. The Gigabyte bios loses one of the display ports on my MSI Gaming X. Neither the Gainward nor the Gigabyte bios suits my card very well; they only run on the verge of stability for me, because my card has trouble overclocking the voltage points around the 1.0v level, and the default curve on the 1671 bioses starts hitting those funky points. My card is one of the very first Micron cards though. I'm glad it worked out for you.



HMRMike said:


> The whole guide was extremely helpful, but this last point reminded me to try again!
> I've been trying different, apparently compatible, BIOS versions on my Gainward Phoenix GS a while back but failed flashing every single attempt and command. Maybe the newest nvflash did some magic this time?
> Among the various Gainward listings at TPU there is this one marked as GLH, distinctly limited to a nice 225W instead of 195W like the rest:
> https://www.techpowerup.com/vgabios/186552/gainward-gtx1070-8192-160930
> After flashing, Afterburner let me throw the power slider up to 115% instead of 114%. The real change was the reported power usage. Where the original BIOS hit 114% and throttled even under 2100MHz, now it struggles to go above 100% at all. The result is a perfectly steady core clock.
> 
> Cooling the GPU was also important to me, so it received a coating of some liquid metal (cheap ebay galinstan, none of that 1 gram for a ripoff price commercial stuff). At 100% fan speed the the core now stays under 49-50C (20C ambient).
> 
> It doesn't clock too high on the core before hitting instability at any possible voltage, so nothing special there, but at least the Samsung memory helps a bit.
> Passed Time Spy at 2125/9600 and broke my 7K barrier. Maybe further tweaking could bump this a few points.
> http://www.3dmark.com/spy/3304053
> So I'm happy and can calm down and clock down until the next generation.
> Many thanks for your time to explain everything!


----------



## gtbtk

Yes, it will work, but it won't solve your lag problem.

Some suggestions before you worry about flashing the card:

Don't use Vsync; it adds lots of latency. If you have screen tearing, use Nvidia Fast Sync instead, as it provides much the same function at much lower latencies. Alternatively, run without anything and learn to ignore tearing artifacts.

Also, if you have USB peripherals, make sure they are plugged directly into the motherboard's ports. Don't use a hub.



Nawafwabs said:


> Guys I have Asus 1070 o8g
> 
> I want to run it on lower speed clock and memory also
> 
> Because I feel input lag when I play games
> 
> If I flash bios from Asus nonOC edition will work with my GPU?


----------



## gtbtk

I don't think anyone has worked out how to flash it. Nvidia has added some extra security to prevent cross flashing a 1070 to a Ti, as far as I am aware.



Loodeelow said:


> So conclusion is that TI bios is not possible on regular 1070?


----------



## Skylinestar

gtbtk said:


> also if you have USB peripherals, make sure thay are plugged into ports off the motherboard. Dont use a hub.


What? USB hub increases system lag? (I'm not talking about keyboard/mouse lag)


----------



## gtbtk

Skylinestar said:


> What? USB hub increases system lag? (I'm not talking about keyboard/mouse lag)


They can, if the hub is not enumerating properly because of a fault or a bad connection, the retries can add latency. If you remove the hub from the equation and just plug direct to the root hub controller, best case you solve the problem or worst case, you eliminate another variable in the problem solving process.


----------



## Nukemaster

I can personally say that I have seen a usb hub affect the input/sec rate of a mouse in the past.

It is worth trying without it.


----------



## Blze001

Has anyone attempted to desolder/snip the DVI port off of one of these cards? I'm embarking on a quest to put watercooling in a case that wasn't meant for it. Not sure if I'll _need_ to get this drastic, just curious if it's an option.


----------



## Madmaxneo

Blze001 said:


> Has anyone attempted to desolder/snip the DVI port off of one of these cards? I'm embarking on a quest to put watercooling in a case that wasn't meant for it. Not sure if I'll _need_ to get this drastic, just curious if it's an option.


I am not sure removing the DVI port will make the card much thinner than it will already be once you add the waterblock. 
Wouldn't it be easier and safer to just get a different case? There are loads of cases out there for less than $50, some as low as $25 to $30.


----------



## TLCH723

Blze001 said:


> Has anyone attempted to desolder/snip the DVI port off of one of these cards? I'm embarking on a quest to put watercooling in a case that wasn't meant for it. Not sure if I'll _need_ to get this drastic, just curious if it's an option.



You mean something like this??


----------



## comanzo

Hi everyone. I recently updated MSI Afterburner and had to set my fan curve again (it was back to default). I set it up, but then didn't play for several days. When I finally did, I forgot to check that the fan curve was applied correctly (of course it wasn't), and I didn't realize my GPU was overheating until 10 minutes at most into the game (approximately; wasn't using a stopwatch, just a guess). Temps reached 90-92 degrees for those ten minutes. My GPU is fine as of right now, no hiccups or anything, but should I contact EVGA for a replacement in case it got damaged? Or should I be OK since it was only 10 minutes and they put in safety protections for temps and stuff? Just wondering if you guys think I should start a warranty claim with EVGA or not. Thanks.


----------



## Blze001

Madmaxneo said:


> Wouldn't it just be easier and safer to just get a different case? There are loads of cases out there for less than $50, some as low as $25 to $30.


Yeah, but then I'd have this perfectly good one sitting around gathering dust. Plus part of this build is me trying to put a full loop in a case not meant to support water cooling in the first place.

I probably wont need to, it was more of an exploratory question.


----------



## Madmaxneo

Blze001 said:


> Yeah, but then I'd have this perfectly good one sitting around gathering dust. Plus part of this build is me trying to put a full loop in a case not meant to support water cooling in the first place.
> 
> I probably wont need to, it was more of an exploratory question.


Oh, a full loop kind of mod. Sounds like fun....lol. 
If I could make a suggestion you might try for an AIO loop instead of a full one as they don't take up that much space. I personally recommend Swiftech AIOs, I have two of their previous gen AIOs in my system right now and they do a great job. They're listed in my sig....


----------



## gtbtk

You should be fine. Afterburner allows you to increase the threshold to 92 deg for a start. The chip does start to throttle by design when it reaches high temps, but that is a protection mechanism built into the chip.

Of course, if the card is misbehaving and throwing artifacts all over the place, or not working at all, then you should RMA it.



comanzo said:


> Hi everyone. I recently updated msi afterburner and had to mess with the fan curve again(it was back to default). I set my fan curve and everything, but didn't play for several days. Couple of days later when I decide to play, I forgot to check to see if the fan curve was calibrated correctly(and of course, it wasn't). I didn't realize that my gpu was overheating until 10 minutes at most in the game(approximately, wasn't using a stopwatch or anything, just a guess). The temps reached at 90-92 degrees for those ten minutes. My gpu is fine as of right now, no hiccups or anything, but should I start contacting EVGA for a replacement(since it probably got damaged)? Or should I be ok since it was only 10 minutes and they put in safety protections for temps and stuff? Jus wondering if you guys think I should start contacting evga for warranty or not. Thanks.


----------



## Madmaxneo

Hey all, I need help solving an annoying issue where my system switches things between my monitors on startup. 
First:
I have two monitors. The "1" monitor is my AOC G2460PG connected via DisplayPort, and the "2" monitor is a 24" Vizio 1080p smart TV connected via HDMI. Both are connected to my GTX 1070.

My issues:
As the system starts, and when in the bios, it auto-defaults to the "2" monitor for some reason; it also does this when going into safe mode. This has been going on for a few months at least. 
When the system starts up, I have a monitoring program that normally shows up on the "2" monitor, and when I open Chrome it opens on my "1" monitor. But for some reason these are now switched when I first boot up. Note that, as far as I can tell, it never switches which monitor is "1" and which is "2".
The primary fix I'd like is for it to default to the "1" monitor for startup, BIOS, and safe mode. Then I would like Chrome and the monitoring program to stay on their respective monitors.

Is there a setting that has somehow changed from its normal default that I can change to fix these things?


----------



## BulletSponge

Madmaxneo said:


> Hey all, I need help in solving an annoying issue I'm having with my system switching things on my monitors on startup.
> First:
> I have two monitors, The "1" monitor is my AOC G2460PG connected via display port and the "2" monitor is a 24" vizio 1080p smart TV connected via HDMI. These are connected to my GTX 1070.
> 
> My issues:
> As it starts and when in the bios the system auto defaults to the "2" monitor for some reason, it also does this when going into safe mode. This has been going on for a few months at least.
> When the system starts up I have a monitoring program that normally shows up on the "2" monitor and when I open Chrome it opens on my "1" monitor. But for some reason these are now switched when I first boot it up. Note that as far as I can tell it never switches which monitor is "1" and which is "2".
> The primary I'd like to have is that it defaults to the "1" monitor for startup, BIOS, and safe mode. Then I would like for chrome and the monitoring program to stay on their perspective monitors....
> 
> Is there a setting that has somehow defaulted to different settings than normal that I can change to fix these things?



I had the exact same issue, GTX 1070 with monitor 1 connected via DP and monitor 2 connected via HDMI. Until I actually got into windows the BIOS splash screen would be on monitor 2. I never did get it sorted to my satisfaction and went back to running a single display.


----------



## Bee Dee 3 Dee

wats the name of the monitoring program? (Madmaxneo)


----------



## Madmaxneo

Bee Dee 3 Dee said:


> wats the name of the monitoring program? (Madmaxneo)


It is the NZXT CAM program. I have both the Hue+ and the Grid+ V2 that are run through it.


----------



## tiramoko

I'm looking to upgrade my 7950 to a 1070. Any idea of the price range for a used 1070? I've been checking 1070s on eBay and they range from $350-400. Is that a good price for a 1070?


----------



## TLCH723

tiramoko said:


> I'm looking to upgrade my 7950 to 1070. Any idea what the range price of used 1070? I've been checking 1070 at Ebay and they range 350-400. Is that a good price for a 1070?


I got my 1070 new at $330 last year. People sell mined cards on eBay, so be careful.

I would say hold out as long as possible.
First, mining is starting to cool off a bit, just a tiny bit. Hopefully prices will keep dropping.
Second, the 1070 is an almost two-year-old card. Buying used at or above MSRP ($379 and $399 for the regular and Founders Edition) doesn't seem worth it IMO.
Third, Volta should arrive either late this year or the first half of next year. I am betting around May of next year, but I am no expert.


----------



## Zensou

Trying my hand at overclocking my GTX 1070 SC (Micron mem), and so far I've been able to get it stable at +130 core / +200 memory, for a total of 2100 core and 4200 memory. But I noticed something peculiar: upon starting Unigine Heaven, the core and memory run at 2100/4200 with the voltage hovering around 1030-1050mV. After maybe 15-20 minutes the core gets downclocked to around 2050 and stays there, with the voltage dipping into 1010-1020mV territory. 

Temps are fine, 67C at maximum. Is there some sort of throttling going on here or what?


----------



## GoLDii3

Zensou said:


> Trying my hand at overclocking my gtx 1070 sc (micron mem) and so far I've been able to get it stable at +130 core / +200 memory for a total of 2100 core and 4200 memory. But I noticed something peculiar. Upon starting unigene heaven the core and memory are running at 2100/4200 with voltage hovering around 1030-1050. After maybe 15-20 minutes the core gets downclocked to around 2050 and stays there with voltage dipping into 1010-1020 territory.
> 
> Temps are fine, 67 C at maximum. Is there some sort of throttling going on here or what?


Maxwell and Pascal GPUs downclock as the temperature gets higher.


----------



## Zensou

GoLDii3 said:


> Maxwell and Pascal GPU's downclock the higher the temperature gets


But my temperatures were 67C at most and GPU-Z didn't show any signs of temperature throttling (sensors tab, PowerRel sensor or something..it bounced from idle to Pwr)


----------



## epic1337

Question for you guys: does anyone own a GALAX Katana? I saw this awesome-looking single-slot GTX 1070 and was wondering whether the single-slot variant has any thermal or noise issues.


----------



## GoLDii3

Zensou said:


> But my temperatures were 67C at most and GPU-Z didn't show any signs of temperature throttling (sensors tab, PowerRel sensor or something..it bounced from idle to Pwr)


It's the way boost works. I don't remember exactly which temperature steps it takes into consideration, but at each step it downclocks by 13 MHz.
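The binning behavior described here can be sketched as a toy model: drop one ~13 MHz step for each temperature threshold the card crosses. The threshold values below are invented placeholders (the post itself says the real steps aren't remembered); only the 13 MHz-per-bin step comes from the discussion:

```python
# Toy illustration of GPU Boost-style temperature binning.
# The 37/54/62C thresholds are made-up placeholders; only the
# ~13 MHz-per-bin step is taken from the discussion above.
def boost_clock(base_mhz, temp_c, bins=(37, 54, 62), step_mhz=13):
    """Drop one clock step for each temperature bin that is exceeded."""
    bins_exceeded = sum(temp_c > t for t in bins)
    return base_mhz - bins_exceeded * step_mhz

print(boost_clock(2100, 30))  # cool card: no bins exceeded
print(boost_clock(2100, 65))  # warm card: all three bins exceeded
```

This is why a card can sit well under its thermal limit and still shed a few 13 MHz steps as it warms up, exactly as Zensou observed.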


----------



## Minium

GoLDii3 said:


> It's the way boost works. Don't remember what temperature steps it takes into consideration but each step it downclocks by 13 MHz.


There is a way to raise the throttle temp with PascalEditor. So if your card doesn't have any cooling problems but runs at 70C and downclocks a bit because it's over 67C, you can just raise the threshold a bit and it won't downclock anymore.


----------



## JennyBeans

I guess I can finally ascend to this group, as I'm now the proud owner of an EVGA 1070 SC


----------



## Madmaxneo

JennyBeans said:


> I guess I can finally accend to this group as I'm now the proud owner of a evga 1070sc


Awesome and welcome to the club! 

Love that tea cup BTW!


----------



## JennyBeans

Madmaxneo said:


> Awesome and welcome to the club!
> 
> Love that tea cup BTW!


thank you , I'm gonna miss that cup  i broke it last week


----------



## SavantStrike

JennyBeans said:


> thank you , I'm gonna miss that cup  i broke it last week


It looks like it was a nice cup. I switched to vacuum insulated stainless cups so I wouldn't experience the pain of losing another perfect cup.


----------



## JennyBeans

Heres the validation kinda forgot


----------



## Madmaxneo

JennyBeans said:


> thank you , I'm gonna miss that cup  i broke it last week


Can't you get a new one? I have a Diablo 3 cup that I got from the Blizzard store a while back. If it breaks I am not even sure I could replace it. 

Are you going to OC that card? Even though you have Micron memory you should be able to hit a real decent OC without flashing the bios. You could go higher if you flashed the bios, but I'd ask others on here what works for that card before jumping right in.


----------



## JennyBeans

Madmaxneo said:


> Can't you get a new one? I have a Diablo 3 cup that I got from the Blizzard store a while back. If it breaks I am not even sure I could replace it.
> 
> You going to OC that card? Even though you have micron memory you should be able to hit a real decent OC without flashing the bios. You could go higher if you flashed the bios but I'd ask others on here what works for that card before jumping right in.


I would love to... but right now I wouldn't benefit from OCing it with my 8350. I'm having mad lag spikes and really slow loading in games, so I'm fairly sure something on the board or CPU is dying.


----------



## epic1337

Slow loading sometimes means it's the SSD/HDD, so you might wanna check that.


----------



## JennyBeans

epic1337 said:


> slow loading sometimes means its the SSD/HDD, so you might wanna check that.


Seems fine when I checked... it just seems to lag out, especially when I alt-tab out of CS. It takes like 20-30 seconds after I tab back in to unfreeze.


----------



## Madmaxneo

JennyBeans said:


> seems fine when I checked... it just seems to lag out .. specially when I alt tab out of cs .. take like 20 -30 seconds after i tab back in to unfreeze


That is an older but great-performing chip, and I would expect some bottlenecking of this card from it. My 4930K bottlenecks my 1070, but I never notice it unless I compare benchmarks...lol. 
You shouldn't have that kind of lag though. What are your temps like? 

I don't know what you've already done or looked at so far, but if you'd like help diagnosing the issue there are plenty here who could help. To avoid going too far off topic in this thread, I'd start another thread somewhere and post the link so we can offer our collective experience.

Also, what are your system specs?


----------



## JennyBeans

Madmaxneo said:


> That is an older but great performing chip and I would expect some bottlenecking from that chip for this card. My 4930k bottlenecks my 1070 but I never notice it unless I compare benchmarks...lol.
> You shouldn't have that kind of lag though. What are your temps like?
> 
> I don't know what you've already done or looked at so far but if you'd like help in diagnosing the issue there are plenty here who could help. To not go to far off topic from this thread I'd start another thread somewhere and post the link so we might possibly offer our collective experience on this.
> 
> Also, what are your system specs?


AMD 8350 @ 4.1 ghz cooled by H80
16gb Kingston hyperx Fury Ram @ 1866
EVGA GTX 1070sc
G2 Supernova EVGA 850w psu
Samsung Evo 850 120gb SSD
2x 1tb WD Blue
1x 2TB WD Green
LG 2442 Flatron @ 75hz

Apparently you can control the color of the LED in these cards, but how do I go about doing that?


----------



## Nukemaster

Most likely the EVGA software will allow LED control.

Nvidia has its own, but I am not sure it supports RGB.

Check here
https://www.evga.com/precisionxoc/


----------



## Madmaxneo

JennyBeans said:


> AMD 8350 @ 4.1 ghz cooled by H80
> 16gb Kingston hyperx Fury Ram @ 1866
> EVGA GTX 1070sc
> G2 Supernova EVGA 850w psu
> Samung Evo 850 120gb SSD
> 2x 1tb WD Blue
> 1x 2TB WD Green
> LG 2442 Flatron @ 75hz​Apparently you can control the color of the led in these cards but how do I go about doing that ?


Hmm, your system shouldn't lag like it does for some of those things. It could be a software issue of some kind; you could try running CCleaner, for one, among other things. 

Does your card come with RGB LEDs or just white LEDs?
You can enable the LEDs if they aren't already, but I am not sure how nowadays. There used to be an option in GeForce Experience to turn these LEDs on and set them, but they took that option out a long time ago for some reason. 
Someone posted a fix of some kind (a regedit) a while back when I asked about it, but I have no idea when that was or even if it was in this thread. 

My GPU LEDs are off because I went with water cooling shortly after I got this card.


----------



## TheLastHero

JennyBeans said:


> AMD 8350 @ 4.1 ghz cooled by H80
> 16gb Kingston hyperx Fury Ram @ 1866
> EVGA GTX 1070sc
> G2 Supernova EVGA 850w psu
> Samung Evo 850 120gb SSD
> 2x 1tb WD Blue
> 1x 2TB WD Green
> LG 2442 Flatron @ 75hz​Apparently you can control the color of the led in these cards but how do I go about doing that ?


Hello from Vancouver 

The Superclocked version of the EVGA 1070 only has white LEDs. The "FTW" versions have the RGB LEDs. 

Regarding your slow loading times, are your games installed on the SSD or the HDD? The only thing I can think of that would slow things down is running them off the HDD. Alt-tabbing out of games did the same thing for me with my 8370: it took about 15-20 seconds to unfreeze, and a lot of times it would totally freeze and I'd have to kill the game via Task Manager and restart it. The 8350 is definitely bottlenecking that card; even my 8370 OC'd to 4.5GHz was, with about 60% utilization on the graphics card in all my games. Sadly it's too much card for that setup... so I ended up switching to an Intel 8700K, and now it's at 98-99% utilization all day at 1080p. Prior to getting the 1070 I had an EVGA 1060 SSC 6GB, and it was a much better match for the FX CPU. 

My suggestion would be to upgrade your CPU if you can; that way you'll get the full performance out of your video card. Plus you'll be able to alt-tab out of games without issue.


----------



## gtbtk

epic1337 said:


> question to you guys, anyone own a GALAX Katana? i saw this awesome looking single-slot GTX1070 and was wondering if this single-slot variant doesn't have any thermal or noise issues.


I don't own one, but I believe they sound like a vacuum cleaner all the time, as the fan needs to run at high speed under any kind of load.

Fine if the machine is in an isolated data centre; not so good on your desk.


----------



## gtbtk

Zensou said:


> Trying my hand at overclocking my gtx 1070 sc (micron mem) and so far I've been able to get it stable at +130 core / +200 memory for a total of 2100 core and 4200 memory. But I noticed something peculiar. Upon starting unigene heaven the core and memory are running at 2100/4200 with voltage hovering around 1030-1050. After maybe 15-20 minutes the core gets downclocked to around 2050 and stays there with voltage dipping into 1010-1020 territory.
> 
> Temps are fine, 67 C at maximum. Is there some sort of throttling going on here or what?


You should be looking at getting memory up to somewhere around +500. You may find that a small increase to the memory, VCCIO and SA voltages helps, as they help with memory controller stability under load. I have one of the first MSI Gaming X cards made with Micron memory, and I was the one who managed to get Nvidia to create and release the BIOS update that fixed the bug with the memory timings. Check your card's BIOS version with GPU-Z and make sure you are running an 86.04.50.00.xx BIOS and not the buggy 86.04.26.00.xx version. By now you should be getting cards with the new BIOS, but it's best to check. 

After the bios update, I could run my memory stable up to about +650. I also found that the best performance on my card came when the card was running at 2076-2080. I have run my card at 2164 using curves but the framerates drop off.

For determining the max level of memory OC, you can use OCCT, which is free and has a vram test tool that shows you errors in real time as you progressively increase the vram frequency. These GPUs can easily deal with an error or two every few seconds, but batches of errors will affect performance. As you keep increasing the memory frequency, the number of errors will keep growing roughly exponentially until the card crashes. 

Keep increasing the frequency until you start seeing infrequent errors, then stop there or drop it back a little. If you have a utility like XTU or Asus AI Suite that lets you adjust system voltages from software, you can use the OCCT test to see whether a voltage adjustment has a positive effect on stability or not. 
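The procedure above — raise the offset until OCCT starts showing occasional errors, then back off — amounts to a simple search. A Python sketch of that rule, with made-up error counts for illustration (OCCT only reports errors on its screen; there is no API being driven here):

```python
# Offsets tested in ascending order, paired with the errors/min seen in
# OCCT's vram test at that offset. These numbers are invented.
observed = [(100, 0), (200, 0), (300, 0), (400, 1), (500, 4), (600, 30)]

def pick_memory_offset(observations, max_errors_per_min=2):
    """Highest offset whose error rate stays at or below the threshold;
    past that point errors grow roughly exponentially, so stop there."""
    best = 0
    for offset_mhz, errors_per_min in observations:
        if errors_per_min <= max_errors_per_min:
            best = offset_mhz
        else:
            break
    return best

print(pick_memory_offset(observed))  # -> 400
```

An error or two per minute is tolerable per the post above; the threshold is the knob you'd tune to taste.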
Another neat trick you can use with your EVGA card is to run the auto-overclocking utility built into EVGA Precision XOC; run the test between 75 and 200 in 25MHz steps. The curve overclock it creates, by adjusting the curve across the entire voltage range, will probably end up crashing your PC, so don't treat it as a final setting. What it will show you, though, are any voltage points or ranges along the curve that are weaker than others. You can then fine-tune curves that work around any performance holes the EVGA utility found.

I found my card was most efficient at 2080 using the curve with +650 vram. At those settings, it performs better than a 1070 Ti at stock.


----------



## JennyBeans

TheLastHero said:


> Hello from Vancouver
> 
> The superclocked version of the EVGA 1070 only has white LED's. The "FTW" versions have the RGB LED's.
> 
> Regarding your slow loading times, are your games installed on the SSD or HD? The only thing I can think of that's slowing things down would be running those off the HD. Alt+Tabbing out of games did the same thing for me with my 8370, took about 15-20 seconds to unfreeze. A lot of times it would totally freeze, and I'd have to kill the game via taskmanager and restart it. The 8350 is definitely bottle necking that card, even my 8370 OC'd to 4.5ghz was. About 60% utilization on the graphics card in all my games, sadly it's too much card for that setup....so I ended up switching to an Intel 8700K, 98-99% utilization all day at 1080p. Prior to getting the 1070, I had a EVGA 1060 SSC 6GB and it was much better when paired up with the FX CPU.
> 
> My suggestion would be to try and upgrade your CPU if you can, that way you'll get the full performance out of your videocard. Plus you'll be able to alt+tab out of games without issue



Yeah, I even get intermittent lag spikes every 30 seconds in games... it's like the CPU just gives up. And this was happening before the 1070, so I know it's not a bottleneck issue


----------



## Madmaxneo

JennyBeans said:


> yeah i even get intermittent lag spikes every 30 seconds in games .. its like cpu is like nope .. and this was before the 1070 so I know its not a bottle neck issue


My games run off of a HDD too, and I do not have that kind of lag issue. A few questions:
1. Have you run the diagnostics for the HDD that the games are on? If not, I'd check the manufacturer's website, download the one they recommend, and run it. The lag might be an issue with your drives.
2. Run CCleaner, especially the registry cleaner.
3. Check for chipset updates for your motherboard. This tends to be the most common cause of issues with older CPUs and boards. 
4. Did you install the Spectre (and Meltdown) updates to your BIOS, if they were available? These are well known to slow down CPU cycles.
5. I only ask this just in case: run a full virus scan. Then run a free tool like Malwarebytes along with some other one. You never know.
6. If any of the above found issues, or there were updates, run CCleaner again. 

Just ideas. If you've already done this stuff, let us know.


----------



## gtbtk

Before doing any of that, I would suggest resetting the BIOS to the optimized defaults, not overclocking the GPU at all, and testing whether you get the same lag. BCLK overclocks can cause intermittent issues for GPUs and SATA controllers.

If not, then you have ruled out the OS and it is a BIOS setting. Lag under a gaming load can also be a symptom of vram being overclocked too much, creating bursts of memory errors that the card has to recover from; push it higher and the thing could crash completely.



Madmaxneo said:


> My games are run off of a HDD also and I do not have that kind of lag issue. A few questions:
> 1. Have you run the diagnostic for the HDD that the games are on? If not I'd check the manufacturer website and download the one they recommend and run it. The lag times might be an issue with your drives
> 2. Run Ccleaner, especially the registry cleaner
> 3. Check for chipset updates for your MB. This tends to be the most common cause of issues with older CPUs and MBs.
> 4. Did you install specter (and other one) updates to your bios if they were available? These are well known to slow down CPU cycles.
> 5. I only ask this just in case, Run a full virus scanner. Then run a free one like malwarebytes along with some other one. You never know.
> 6. If any of the above found any issues or there were updates run Ccleaner again.
> 
> Just ideas. If you've already done this stuff let us know.


----------



## JennyBeans

gtbtk said:


> Before doing any of that. I would suggest you try resetting the bios to the optimized defaults, dont overclock the gpu at all and testing if you have the same lag experience. BCLK overclocks can cause intermittant issues for GPUs and sata controllers
> 
> If not then you have ruled out the OS and it is a bios setting. Lag under a gaming load could be a symptom vram being overclocked too much and it is creating bursts of memory errors that the card is recovering from. push it higher and the thing could crash completely.


I've done that .. and nothing is oc'd .. I even had the cpu at stock and was still doing it but I put the cpu back to 4.1 mild oc , so its not the gpu it was doing it when i had the 290x too


----------



## gtbtk

JennyBeans said:


> I've done that .. and nothing is oc'd .. I even had the cpu at stock and was still doing it but I put the cpu back to 4.1 mild oc , so its not the gpu it was doing it when i had the 290x too


OCCT is a free test utility with built-in memory and vram tests that display any errors in real time. Updating the BIOS and drivers may help as well. If you saw the same issues with an old GPU, it is most likely the PC. Event Viewer may give you a clue too.

Your description sounds like some sort of configuration error / adjustment / silicon-lottery issue. It is possible that your RAM is generating errors and could benefit from slightly higher voltage, or the memory controller could find more stability with a little extra VCCIO or SA voltage. i.e. it could be any number of things causing the root problem.

I suggest that, having tried the default settings, you run some OCCT tests and see if anything becomes apparent. If you see any memory errors, try reseating the memory and blowing out the slots with compressed air; there may be dust or even a little corrosion that re-seating can sometimes resolve. After trying that, look at increasing the memory voltage slightly. Do one thing at a time so you can confirm or eliminate causes as you go. If nothing changes when you test, revert the setting and try the next thing, such as a small bump to the VCCIO voltage.

Have you been having BSODs? What event ID errors were they?


----------



## khanmein

@gtbtk Did you face any power limit issues like @Wagnard mentioned over here?


----------



## Falkentyne

khanmein said:


> @gtbtk Did you faced any power limit issue like @Wagnard mentioned over here?


Seems like setting power plan to "prefer maximum performance" in the Nvidia d3d settings fixes that.


----------



## khanmein

Falkentyne said:


> Seems like setting power plan to "prefer maximum performance" in the Nvidia d3d settings fixes that.


What about using MSI AB and dragging the power limit to the max, 112%?


----------



## gtbtk

khanmein said:


> What about using MSI: AB drag the power limit to the max 112?


The "prefer maximum performance" setting makes the card use higher P-states by default. My 1070 is currently in a box pending me getting around to building a new rig, but what I have noticed is that the power management is a bit strange. 

My MSI card has a rated power limit of 290W with a +130% slider, but the card power-throttles down at about 105% (240W). In GPU-Z, if you are using a curve, the "reason" bar for the limit shows a nice rainbow that I read as "all of the above" (GPU-Z has no idea).

I suspect that the ability to edit the power delivery curves on Pascal has had ramifications for the existing software and how it reports power levels. As it never really impacted what I was doing, I put it on the list of things to ponder one day, but never got to it before the motherboard VRMs decided to let the magic smoke out of a series of Platinum-rated power supplies.


----------



## Caveat

Hello,

I own an Asus Strix GTX 1070 8GB Gaming, but I freakin' hate the GPU Tweak 2 program, because it does not keep my fan speed settings every time I start my PC. I know there are other programs that run better, like MSI Afterburner. But do I still need GPU Tweak 2 installed to get the card's performance, or can I just uninstall that program, install MSI Afterburner, and it will work the same?


----------



## GoLDii3

Caveat said:


> Hello,
> 
> I own an Asus Strix GTX1070 8gb gaming. But i freakin hate the GPU Tweak 2 program. Because it does not keep my settings for the fan speed everytime i start my pc. I know there are other programs that run better, like MSI afterburner. But, do i still need GPU Tweak 2 installed to get the card performence or can i just uninstall that program and install MSI Afterburner and it wil work the same?


It will work the same, use MSI Afterburner


----------



## thanos999

Just ordered this, hope it's a good graphics card to replace my GTX 760

https://www.msi.com/Graphics-card/GeForce-GTX-1070-ARMOR-8G-OC


----------



## JennyBeans

@gtbtk, cause of my issues is solved .. my mobo's warped lol


----------



## Caveat

GoLDii3 said:


> Caveat said:
> 
> 
> 
> Hello,
> 
> I own an Asus Strix GTX1070 8gb gaming. But i freakin hate the GPU Tweak 2 program. Because it does not keep my settings for the fan speed everytime i start my pc. I know there are other programs that run better, like MSI afterburner. But, do i still need GPU Tweak 2 installed to get the card performence or can i just uninstall that program and install MSI Afterburner and it wil work the same?
> 
> 
> 
> It will work the same,use MSI Afterburner
Click to expand...

Ok. Thank you for your answer


----------



## Falkentyne

gtbtk said:


> the prefer max performance sets the card to use higher p-states by defalt. My 1070 is currently in a box pending me getting around to building a new rig but what I have noticed is that the power management is a bit strange.
> 
> my MSI cad has a rated power limit of 290W with a +130% slider but the card power throttles down at about 105% 240W. in GPU-z, the power limit, if you are using a curve, shows a nice rainbow in the "reason" bar for the limit that I read as "all of the above" - {gpuz has no idea).
> 
> I suspect that the ability to edit the power deliver curves in pascal has had ramifications with the existing software and it is reporting power levels etc. As it never really impacted what I was doing I put it on he list of things to ponder one day but never got to it before the Motherboard VRMs decided to let the magic smoke out a series of Platinum rated power supplies


I have an "idea" about the reason for this, but I need to see something.

Can you give me a vbios dump of your card? or are you not using it right now?


----------



## gtbtk

Falkentyne said:


> I have an "idea" about the reason for this, but I need to see something.
> 
> Can you give me a vbios dump of your card? or are you not using it right now?


Sorry it is packed up in its cardboard box without a rig right now. The vbios currently installed is the MSI Gaming Z bios.

what are you thinking?

The thing that gave me the best performance was to increase the curve by about +50, then lift the 0.950v point to 2037MHz and the 1.050v point to 2075MHz. The 0.950v boost would overclock the "Video Clock" that you can see in HWiNFO64, and the 1.050v boost was what got reported to the system. While the 0.950v boost helps performance, I think it screwed with the GPU-Z view of power delivery
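That curve tweak can be sketched in Python. The flat +50 offset and the two pinned points (0.950 V to 2037 MHz, 1.050 V to 2075 MHz) are the values quoted above; the base curve itself is invented for illustration, since I don't have this card's real voltage/frequency table:

```python
# Hypothetical stock voltage/frequency curve (volts -> MHz); illustrative only.
base_curve = {0.900: 1936, 0.950: 1962, 1.000: 2000, 1.050: 2025}

def tweak_curve(curve, flat_offset_mhz, pinned_points):
    """Apply a flat MHz offset to every point, then pin specific voltage
    points to exact clocks (the pins override the flat offset)."""
    tweaked = {v: mhz + flat_offset_mhz for v, mhz in curve.items()}
    tweaked.update(pinned_points)
    return tweaked

new_curve = tweak_curve(base_curve, 50, {0.950: 2037, 1.050: 2075})
print(new_curve[0.950], new_curve[1.050])  # -> 2037 2075
```

The design point is the same as in Afterburner's curve editor: a flat offset shifts everything, and individual points can then be dragged to override it.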


----------



## gtbtk

thanos999 said:


> just orderd this hope its a good graphics card to replace my gtx760
> 
> https://www.msi.com/Graphics-card/GeForce-GTX-1070-ARMOR-8G-OC


The PCB is really good, same as the Gaming X card. The cooler, not so much. Make sure you have a case fan blowing directly at the card and you should be able to keep the performance up and the temps under control.

Makes a great base for watercooling if you are interested in that.


----------



## Falkentyne

gtbtk said:


> Sorry it is packed up in its cardboard box without a rig right now. The vbios currently installed is the MSI Gaming Z bios.
> 
> what are you thinking?
> 
> The thing that I was doing that gave me the best performance was too increase the curve by about +50, then lift the .950v point to 2037Mhz and the 1.050 point to 2075Mhz. The .950v boost would overclock teh "Video Clock" that you can see in HWINFO64 and the 1050 boost was what is reported to the system. File the.950 boost helps performance, I thought that it screwed with the GPU-Z view on power delivery



Wanted to open the vbios in the Pascal Bios Editor and check the first value in the "Extreme power limits" section.

You mentioned Power Limit throttling at 105%
I encountered this same PL throttling at 105%, but only in PUBG (so far, but I can't test that anymore since I would have to re-do the LM and SPI flash it again). 
Basically, when using a TDP range of 150W for 100% and 200W for 132% on an MSI MXM GTX 1070 (so, 151W to 200W in the editor), and using the default GTX 1080 presets via the preset button (this makes the "extreme" power limits section identical to the defaults for a GTX 1080 MXM vbios; failing to use the preset button would prevent the TDP from going much past 115W), I noticed that in PUBG I would start getting power limit throttling at 105% TDP, which is basically 160W, even though the TDP limit was set to 132%. And it was throttling: the voltage would drop (like from 1.050v down to 1.0v or 0.993v) and the clocks would drop when the PL flag appeared.

It only did this in PUBG. Gsync on or off made no difference in it.
In Valley Benchmark and Heaven benchmark, it did not do this.
In Heaven, it would only show TDP throttling at 122% and higher, although the clocks and voltage would not actually drop unless it exceeded 132%.

Yet these were GTX 1080 advanced power presets, so why would it do this?

I then opened a MSI GTX 1080 MXM Vbios, which of course had the same default "Extreme power limits" setting as what the "preset" button for the 1070 does, then I pressed the preset button with the GTX 1080 vbios open.
This changed the first value from 16200 mW to 19200 mW, and the TDP Range from 215W default to 238W or 258W (I think) maximum.

All I know is that, this value had to be changed to allow the TDP to actually exceed 200W. No one wanted to explain what these advanced TDP options did over on notebookreview, saying that people messing with these could destroy the card, but I did get one reply from Coolane, saying that the first value had to be raised from 16200, to allow the card to pull more than 200W TDP.

Anyway, when I manually set this on the 200W GTX 1070, to 19200 (the higher range for the GTX 1080 preset), it COMPLETELY FIXED PUBG, and now PUBG only displayed "PWR" at TDP higher than 122%, and only dropped clocks and voltage at >132%, exactly like Heaven and Valley were doing.

So I was thinking, IF the pascal bios editor supported your video bios, maybe you could try the same thing (increasing the first extreme power limit value by 3000) and then force flashing the card with a SPI programmer, and see if that fixed your "105%=power limit" issue.
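The percent-to-watt arithmetic running through this post is easy to sanity-check. A tiny Python sketch, using the 150 W base TDP quoted above:

```python
# TDP percentages are relative to the 100% board power target.
# 150 W is the base figure quoted in the post above.
BASE_TDP_W = 150.0

def tdp_watts(percent):
    """Convert a TDP slider/limit percentage into watts."""
    return BASE_TDP_W * percent / 100.0

print(tdp_watts(105))  # 157.5 W -- the "basically 160W" throttle point
print(tdp_watts(132))  # 198.0 W -- the ~200 W slider maximum
```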


----------



## thanos999

gtbtk said:


> PCB is really good. same as the Gaming X card. The cooler not as much. Make sure you have a case fan blowing directly at the card and you should be able to keep the performance up and keep temps under control.
> 
> Makes a great base for watercooling if you are interested in that.


I currently have two 120mm fans blowing from the middle of the case, in between the hard drives and motherboard, that I can angle up to 45° to blow air onto the card, like I'm currently doing with the 760s. I might also watercool it when I can afford a waterblock for it. I will try and post pictures later today in a build log.

PS: are there any waterblocks that will fit it?


----------



## gtbtk

The BIOS that I used came from one of the guys here; it is up on the TechPowerUp database.

I could not get the card to throttle under any 1080p load; I had to push the load up to 1440p and above to get the power draw above 100%. As I only have a 1200p monitor, it never ended up being an issue.

The BIOS power limit is supposed to be up at 290W, but with an MSI BIOS I can't get it above 250W. After the card was boxed up, I did see something on the 1080 Ti forum that mentioned the same thing and suggested using the nvidia-smi utility to reset things after cross flashing the card. I started cross flashing my card when we were trying to work out the Micron memory bug, so I have never had the chance with a fresh-from-factory card at full blast. I've not had the chance to try the smi reset as yet.

There are certainly more things going on with the cards than Nvidia are documenting.




Falkentyne said:


> Wanted to open the vbios in the Pascal Bios Editor and check the first value in the "Extreme power limits" section.
> 
> You mentioned Power Limit throttling at 105%
> I encountered this same PL throttling at 105%, but only in PUBG (so far but can't test that anymore since i would have to re-do LM and SPI flash it again).
> Basically, when using a TDP range of 150W for 100% and 200W for 132% on a MSI MXM GTX 1070 (so, 151W to 200W in the editor), and using the default GTX 1080 presets via the preset button (this makes the "extreme" power limits section identical to the defaults for a GTX 1080 MXM vbios--failure to use the preset button would prevent the TDP from going much past 115W), I noticed that in PUBG, I would start getting power limit throttling at 105% tdp, which is basically 160W, even though the TDP limit was set to 132%. And it was throttling because the voltage would either drop lower (like from 1.050v down to 1.0v or 0.993v) and the clocks would drop, when the PL flag appeared.
> 
> It only did this in PUBG. Gsync on or off made no difference in it.
> In Valley Benchmark and Heaven benchmark, it did not do this.
> In Heaven, it would only show TDP throttling at 122% and higher, although the clocks and voltage would not actually drop unless it exceeded 132%.
> 
> Yet these were GTX 1080 advanced power presets, so why would it do this?
> 
> I then opened a MSI GTX 1080 MXM Vbios, which of course had the same default "Extreme power limits" setting as what the "preset" button for the 1070 does, then I pressed the preset button with the GTX 1080 vbios open.
> This changed the first value from 16200 mW to 19200 mW, and the TDP Range from 215W default to 238W or 258W (I think) maximum.
> 
> All I know is that, this value had to be changed to allow the TDP to actually exceed 200W. No one wanted to explain what these advanced TDP options did over on notebookreview, saying that people messing with these could destroy the card, but I did get one reply from Coolane, saying that the first value had to be raised from 16200, to allow the card to pull more than 200W TDP.
> 
> Anyway, when I manually set this on the 200W GTX 1070, to 19200 (the higher range for the GTX 1080 preset), it COMPLETELY FIXED PUBG, and now PUBG only displayed "PWR" at TDP higher than 122%, and only dropped clocks and voltage at >132%, exactly like Heaven and Valley were doing.
> 
> So I was thinking, IF the pascal bios editor supported your video bios, maybe you could try the same thing (increasing the first extreme power limit value by 3000) and then force flashing the card with a SPI programmer, and see if that fixed your "105%=power limit" issue.


----------



## gtbtk

thanos999 said:


> currently will have 2 120mm fans blowing from the middle off the case inbeetwen the harddrives and motherboard that i can angle up to a 45% too blow air onto the card like im currently doing with the 760s might also watercool it when i can afford a waterblock for it i will try and post pictures later on today in a build log
> 
> ps is there any waterblocks that will fit it ?


The Armor has the same PCB as one used for the Gaming X, just a lighter weight cooler and shroud. the 1070 tends to be a cool running card and you may find the extra airflow keeps the temps in the 60s anyway. 

EK make one. https://www.ekwb.com/configurator/waterblock/3831109831571. 

Barrow and Bykski in China also make blocks that are compatible: 
http://www.barrowint.com/index.php/article/622.html. 

https://www.amazon.com/Cooling-GeForce-GTX1080-GTX1070-GTX1060/dp/B01IEZMQCC

You can probably order them from aliexpress.com as well


----------



## thanos999

gtbtk said:


> The Armor has the same PCB as one used for the Gaming X, just a lighter weight cooler and shroud. the 1070 tends to be a cool running card and you may find the extra airflow keeps the temps in the 60s anyway.
> 
> EK make one. https://www.ekwb.com/configurator/waterblock/3831109831571.
> 
> Barrow and Bykski in China also make blocks that are be compatible
> http://www.barrowint.com/index.php/article/622.html.
> 
> https://www.amazon.com/Cooling-GeForce-GTX1080-GTX1070-GTX1060/dp/B01IEZMQCC
> 
> Probably order them from on aliexpress.com as well


ok thanks for that


----------



## thanos999

Getting terrible benchmark scores, here's a screenshot. My old GTX 760 is better than my new GTX 1070


----------



## rfarmer

thanos999 said:


> getting terrible benchmark scores heres a screenshot my old gtx760 is better than new gtx1070


There is definitely something wrong, here is my 1070 at stock settings. http://www.3dmark.com/fs/15518119

17,571 graphics score, over 20,000 with overclock. I have seen many over 21,000.


----------



## thanos999

rfarmer said:


> There is definitely something wrong, here is my 1070 at stock settings. http://www.3dmark.com/fs/15518119
> 
> 17,571 graphics score, over 20,000 with overclock. I have seen many over 21,000.


Fixed it. There was a program using most of the memory on my new GPU, some kind of mining program that I must have accidentally downloaded.
Here's a screenshot of it working normally now


----------



## rfarmer

thanos999 said:


> fixed it there was a programm that was using most off the memory on my new gpu some kind off mining programm that i must have accidently downloaded
> heres a screnshot off it working normal now


Now that looks much better.


----------



## gtbtk

That looks more like a stock-clocked 1070.

this is my best result with an i7-2600 

https://www.3dmark.com/fs/11532231



thanos999 said:


> fixed it there was a programm that was using most off the memory on my new gpu some kind off mining programm that i must have accidently downloaded
> heres a screnshot off it working normal now


----------



## thanos999

gtbtk said:


> that looks more like a stock clocked 1070
> 
> this is my best result with an i7-2600
> 
> https://www.3dmark.com/fs/11532231


Yes, it's stock at the moment. I don't really need to overclock it since I'm only gaming on a 40" 1080p TV. A 1060 would have been good enough, but I thought I might as well go for a 1070; it should last me a bit longer than a 1060 before my next GPU upgrade


----------



## Minime1981

Hi Guys.

I am new to this forum. I am 36 and from South Africa

I own an Asus Strix GTX 1070 Ti.

Can I flash the BIOS with any other GTX 1070 Ti's? I am on page 492 of this thread and I have yet to come across any GTX 1070 Ti discussion.


----------



## gtbtk

Minime1981 said:


> Hi Guys.
> 
> I am new to this forum. I am 36 and from South Africa
> 
> I own a Asus Strix GTX1070TI.
> 
> Can i flash the bios with any other GTX1070ti? I am on PAge 492 of this tread and i have yet to come across the GTX 1070ti discussion.


All 1070 Tis are supposed to have the same default settings; some vendors have produced cards that will give you a higher push-button overclock, but the base settings are the same.

You should be able to flash the base model Strix card with the Asus Strix Advanced Edition BIOS, but the only difference you will see is a higher automatic overclock in GPU Tweak. If you are looking for the best performance, manual overclocking is the way to go, and Afterburner is a better tool than GPU Tweak anyway, so the different BIOS is irrelevant.

The original 1070 cards got a benefit because the out-of-box clocks were improved if you flashed the OC BIOS onto the standard model card.


----------



## JennyBeans

wondering if I should flash my bios of my 1070sc2 once I get my 2600x


----------



## gtbtk

Make sure you back up your original bios before doing any cross flashing.

If you are using a CPU with an integrated GPU, I would suggest that you play with cross flashing now rather than after you get a 2600x. If you flash the card and brick it, you can boot from the igpu and just flash a working bios back to your card. The 2600x doesn't have a gpu so it makes it more of a challenge to recover the card, unless you have another GPU you can put in the PC to boot from, 
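For the backup itself, the usual tool is NVIDIA's nvflash. A sketch of the sequence (run from an elevated prompt; exact flags vary between nvflash builds, so check `nvflash --help` on your version first):

```shell
# Save the current vbios to a file before touching anything.
nvflash --save original_backup.rom

# Some cards need the EEPROM write protection lifted first.
nvflash --protectoff

# Flash the replacement bios; -6 allows a PCI subsystem ID mismatch,
# which is what a cross-flash from another vendor's card will trigger.
nvflash -6 crossflash.rom
```

Keep `original_backup.rom` somewhere safe off the machine; it is the only way back if the cross-flashed BIOS misbehaves.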

I was one of the guys who identified the Micron memory bug that afflicted 1070s back in the beginning. I have an MSI Gaming X card, and I cross flashed just about every different BIOS from most models of 1070 to my card to see if it was a software issue common to everyone, or just my card, or MSI firmware.

What I found with the EVGA BIOSes, both SC and FTW, was that they would increase power draw faster than other brands' BIOSes; EVGA's was the only BIOS on my card that hit power limits and throttled under a 1080p load. The flip side was that you could use the auto-overclocking tools in Precision XOC which, while they generally didn't produce the most stable overclocks, let you see which voltage points overclocked better than others on the curve if you wanted to try curve overclocking. The information you gain from that carries over to any BIOS, with adjustments for the factory overclock. 

If you want to just set and forget with a faster card, you could try one of the Palit BIOSes. One model has a high factory OC and is still a single 8-pin power card. 

If you are keen on manual overclocking, the Founders Edition cards use the same PCB as the SC. The Founders Edition BIOS will give you a slower default frequency but tends to overclock really well, so you may find that is a good one to try. The Founders card is also supported by the Precision XOC tools.


----------



## Blze001

So I finally got around to running the Heaven benchmark and the numbers I got were atrocious.

1498 on the extreme setting, which seems amazingly low. In Afterburner, the core clock was at 1999Mhz and the memory clock was at 4354Mhz, with temps never going above 47c. Something seems really off, any ideas?

This is a Founder's Edition in a watercooling loop, btw.


----------



## Salman8506

INNO3d ICHILL x3 1070 here.


----------



## Madmaxneo

Blze001 said:


> So I finally got around to running the Heaven benchmark and the numbers I got were atrocious.
> 
> 1498 on the extreme setting, which seems amazingly low. In Afterburner, the core clock was at 1999Mhz and the memory clock was at 4354Mhz, with temps never going above 47c. Something seems really off, any ideas?
> 
> This is a Founder's Edition in a watercooling loop, btw.


I just ran mine again to compare. I have an EVGA GTX 970 SC Black Edition that is watercooled in its own block (specs in my sig). My GPU is not currently OC'd. I ran it on Extreme settings in full screen at 1920x1080. My core clock reached 1976MHz at the highest (at least, that is what I noticed). 

My final score was *2307* with a high temp of about 51 degrees. 

So yeah, your score seems a little low.


----------



## Falkentyne

Blze001 said:


> So I finally got around to running the Heaven benchmark and the numbers I got were atrocious.
> 
> 1498 on the extreme setting, which seems amazingly low. In Afterburner, the core clock was at 1999Mhz and the memory clock was at 4354Mhz, with temps never going above 47c. Something seems really off, any ideas?
> 
> This is a Founder's Edition in a watercooling loop, btw.


Disable or disconnect all monitors except your primary one. Don't run multi-monitor; it just makes things harder to diagnose.
Disable G-Sync.
Set vsync to application-controlled.
Disable ALL anti-aliasing settings (you can set NV to driver defaults if you wish).
Set anisotropic filtering to application-controlled.
Run Heaven again. I don't know what resolution you ran, but at 1920x1080 (I had to click Extreme, then Custom, then system resolution, full screen, as there was no default for 1920x1080 full screen in the Extreme preset), with +145 core, +675 memory, and TDP @ 230W, I got 2780. That's on a laptop card with a few extra shaders or cores or whatever. So a desktop card at >170W TDP should not be that far off.


----------



## Madmaxneo

Falkentyne said:


> Disable or disconnect all monitors except your primary one. Don't run multi-monitor; it just makes things harder to troubleshoot.
> Disable Gsync.
> Set vsync to application-controlled.
> Disable ALL anti-aliasing settings (you can set NV to driver defaults if you wish).
> Set anisotropic filtering to application-controlled.
> Run Heaven again. I don't know what resolution you ran, but at 1920x1080 (I had to click Extreme, then Custom, then system resolution, full screen, as there was no 1920x1080 fullscreen default in the Extreme preset), with +145 core, +675 memory, and TDP at 230W, I got 2780. That's on a laptop card with a few extra shaders. A desktop card at a >170W TDP should not be far off.


Multiple monitors do not affect GPU performance here; I've run it without my secondary monitor and it made absolutely no difference whatsoever. My score (2307) was with everything at stock speeds, Gsync on, and full screen, and it was still ~800 points higher than his 1498. So something is definitely up there.


----------



## Blze001

I increased the memory overclock and the score went DOWN to 13-something. I'll try using DDU to wipe all the display drivers, reinstall, and then test without the OC. I really hope the card isn't dying on me; I just spent $500 on the watercooling loop, so it's gonna be a while before I could spend more on another GPU.


----------



## Falkentyne

Blze001 said:


> I increased the memory overclock and the score went DOWN to 13-something. I'll try using DDU to wipe all display drivers, re-install, and then test without the OC. I really hope the card isn't dying on me, I just spent $500 on the watercooling loop, it's gonna be awhile before I could spend more on another GPU.


Memory overclock? By how much?

I saw that you're using watercooling.
There was a post somewhere, maybe in a GTX 1080 thread, where someone using water cooling was getting EXTREME power limit throttling. *Extreme*. And it went away when he went back to the STOCK heatsink.

What you need to do is stop panicking and get your wits together.

Run DDU.
Remove the graphics card after.
Go back to the STOCK HEATSINK. I don't care if it's a lot of work. Life isn't for people who want the easy way out. Sometimes you have to work.

Test on new drivers and the stock heatsink.
Test your 'bad' RAM overclock.

If all is now good, then you know it was the waterblock.


----------



## Blze001

Falkentyne said:


> Memory overclock? By how much?
> 
> I saw that you're using watercooling.
> There was a post somewhere, maybe in a GTX 1080 thread, where someone using water cooling was getting EXTREME power limit throttling. *Extreme*. And it went away when he went back to the STOCK heatsink.
> 
> Run DDU.
> Remove the graphics card after.
> Go back to the STOCK HEATSINK. I don't care if it's a lot of work. Life isn't for people who want the easy way out. Sometimes you have to work.
> 
> Test on new drivers and the stock heatsink.
> Test your 'bad' RAM overclock.
> 
> If all is now good, then you know it was the waterblock.


Strange. I did a ton of research and didn't come across any cases of 1070s power throttling with a waterblock installed. Is it because waterblocks don't have anything to plug into the fan header, so the card thinks it has no cooling?

EDIT: Also, your comment about power got me thinking: my PSU cables are the weird daisy-chained variety, an 8-pin with another 8-pin attached to it, and I'm running off that second connector because there's no way to cable-manage off the first one. Could it be that the second connector can't deliver enough power?


----------



## Caveat

I have a question. Why can't I get my fan speed below 40%? I mean, it jumps from 0% to 40%, and from 40% I can set every percentage I want up to 100%.


----------



## Caveat

Oh yeah, I run GPU Tweak II.


----------



## Falkentyne

Blze001 said:


> Strange. I did a ton of research and didn't come across any cases of 1070s power throttling with a waterblock installed. Is it because water blocks don't have something to plug into the fan pins so the card thinks it doesn't have any cooling?
> 
> EDIT: Also, your comment about power had me thinking: my PSU cables are the weird variety with an 8 pin then another 8 pin attached to that and I'm running on that second one because there's no way to cable manage off the first one. Could it be that second one doesn't have the power delivery ability?


I can't help with this. I know nothing about power cable wiring.

Please search for the other thread in this forum called "1080 power limit throttling".
Exact. Same. Problem.

Puts on waterblock. Gets terribly power throttled.
Removes waterblock. Card back to normal.

I don't watercool. I have an R9 290X; my laptop has the 1070.
I can only tell you what I've read elsewhere. I don't have the means to directly help you with your problem.

All I can say is I'm sorry, and good luck.


----------



## Madmaxneo

Blze001 said:


> Strange. I did a ton of research and didn't come across any cases of 1070s power throttling with a waterblock installed. Is it because water blocks don't have something to plug into the fan pins so the card thinks it doesn't have any cooling?
> 
> EDIT: Also, your comment about power had me thinking: my PSU cables are the weird variety with an 8 pin then another 8 pin attached to that and I'm running on that second one because there's no way to cable manage off the first one. Could it be that second one doesn't have the power delivery ability?





Falkentyne said:


> I can't help with this. I know nothing about it or power cable wiring stuff.
> 
> Please search the other thread in this forum called 1080 power limit throttling.
> Exact. Same. Problem.
> 
> Puts on waterblock. Gets terribly power throttled.
> Removes waterblock. Card back to normal.
> 
> I don't watercool. I have r9 290x. my laptop has the 1070.
> I can only tell you what I've read elsewhere. I don't have the capability to directly help you with your problems.
> 
> All i can say is I'm sorry, and good luck.


 I am watercooling and it works just fine. I have a separate loop just for my 1070 and the radiator fan is hooked into the fan pins on the card. But I seriously doubt that is the problem.


----------



## Falkentyne

Madmaxneo said:


> I am watercooling and it works just fine. I have a separate loop just for my 1070 and the radiator fan is hooked into the fan pins on the card. But I seriously doubt that is the problem.


Did any of you read the 1080 thread I linked? Or are you just replying to sound smart?
A user with a 1080 DID have extreme power limit problems after putting on a block. Something was causing a short or a malfunction. It went away when he went back to pure stock.

Do you people know how to do BASIC troubleshooting?
When you have a mod and you have problems, the VERY FIRST THING YOU DO, if drivers won't fix the problem, is REMOVE THE MOD AND GO BACK TO STOCK.
Surely you guys went to high school and learned about the scientific method... Not to be a jerk or anything, but please use your brains.

Do things by the book.


----------



## Blze001

Falkentyne said:


> Did any of you read the 1080 thread I linked? Or are you just replying to sound smart?
> A user with a 1080 DID have extreme power limit problems after putting on a block. Something was causing a short or a malfunction. It went away when he went back to pure stock.
> 
> Do you people know how to do BASIC troubleshooting?
> When you have a mod and you have problems, the VERY FIRST THING YOU DO, if drivers won't fix the problem, is REMOVE THE MOD AND GO BACK TO STOCK.
> Surely you guys went to high school and learned about the scientific method... Not to be a jerk or anything, but please use your brains.
> 
> Do things by the book.


While I appreciate the advice, I'm not really sure why you're being so hostile. Yes, going back to stock is a troubleshooting step; I just prefer to build a list of steps and try less drastic solutions before reverting. Besides, I'd have to order new thermal pads anyway, and those will take a while to ship.

I'm trying to understand the what and why of the problem before I jump on the "give up on watercooling" wagon.

I did some poking around with GPU-Z and the limiting reasons were Vrel and Pwr, with TDP at 96-97%, so it seems to me like it's drawing power properly. I still haven't done the driver clean, too busy until tomorrow, but I'm going to try a different rail and a different cable just in case before taking the drastic step.


----------



## Falkentyne

Sorry. I'm hostile because people don't want to keep things to basics, and I want their hardware to work. I'm slowly dying (maybe the word is... degrading) from a serious medical disability, yet I still enjoy helping people get properly working hardware. But when people say "but... but... there was no problem when I did this," I see that as an excuse for laziness. Something I learned from living a long time and experiencing it myself.

I even asked you to do research to see if perhaps there were similarities.

Look at this thread.

http://www.overclock.net/forum/69-nvidia/1674225-gtx-1080-tdp-throttling-why-solved.html

It sure sounds like his problem was VERY similar to yours.

That's why I wanted you to revert to stock and then test. Because if the problem happened at stock, then it's either a defective video card or a bad driver in the OS (but that's just bad news in that case).
But clearly if another user had a similar problem as you, it's something worth looking at. Research is invaluable.


----------



## Madmaxneo

Falkentyne said:


> Did any of you read the 1080 thread I linked? Or are you just replying to sound smart?
> A user with a 1080 DID have extreme power limit problems after putting on a block. Something was causing a short or a malfunction. It went away when he went to pure stock.
> 
> Do you people know how to do BASIC troubleshooting?
> when you have a mod and you have problems, the VERY FIRST THING YOU DO if drivers won't fix the problem is to REMOVE THE MOD AND GO BACK TO STOCK.
> Surely you guys went to highschool and learned about the scientific method....Not to be a jerk or something. But please use your brains.
> 
> Do things by the book.


FYI, if the drivers didn't fix it, I would do more troubleshooting before going back to stock, because removing the waterblock and components and then putting everything back on the stock cooler is a lot more work.

I did not reply to sound smart; I replied to let others know that watercooling works just fine for these cards. In fact, I know quite a few people who watercool their 10-series GPUs and everything works fine. I guess some just have poor luck with certain things while others do not.

Your answer is not always the best answer. Going back to stock and everything working fine would mean something is wrong with the way it was hooked up, a part of the water cooling system is faulty, or something is simply incompatible. I would double- and triple-check all of that first before going back to stock.

My suggestion would be to keep troubleshooting, because these cards work just fine with water cooling.



Blze001 said:


> While I'm appreciative of the advice, not really sure why you're being so hostile. Yes, going back to stock is a troubleshooting step, I just prefer to build a list of steps and try less extreme solutions before taking the more extreme option of reverting. Besides, I'd have to order new thermal pads anyway and those will take awhile to ship.
> 
> I'm trying to understand the what and why of the problem before I jump on the "give up on watercooling" wagon.
> 
> I did some poking around with GPU-Z and the limiting reason was Vrel and PWR, with TDP at 96-97%, so it seems to me like it's drawing power properly. Still haven't done the driver clean yet, too busy until tomorrow, but I'm going to try a different rail and different cable just in case before I take the drastic step.


I might have missed it somewhere, but what block are you using to watercool your GPU?

You might have to take the block off to check that all the thermal pads are properly placed. Before you do, I recommend getting another set of thermal pads to replace them, just in case. If one thermal pad is not seated properly, that could cause the throttling issues you are experiencing.


----------



## Blze001

Madmaxneo said:


> I might have missed it somewhere but what block are you using to watercool your GPU?
> 
> You might have to take the block off to check and make sure all the thermal pads were properly placed. Before you do that I would recommend getting another set of thermal pads to replace them just in case. If one thermal pad is not set properly then it is possible that would cause the throttling issues you are experiencing.


EK's Acetal/Nickel block. I don't think it's thermal related: if the VRAM or something else were getting too hot, performance would vary as things warmed up and cooled off, with throttling tracking temperature. It's rock steady and runs at the same speed whether the test has been running for 1 minute or 30.


----------



## Falkentyne

Blze001 said:


> EK's Acetal/Nickel block. I don't think it's thermal related because if it was a case of the Vram or something else getting too hot, the performance would vary as things warmed up and cooled off due to throttling adjusting temperature. It's rock steady and runs the same speed regardless of if the test has been running for 1 minute or 30.


Pascal has several cancerous throttling mechanisms.
It can throttle if something is shorted, and not even signal that anything is wrong.

PUBG makes Pascal throttle WAY prematurely. Look:

https://linustechtips.com/main/topic/817866-pubg-throttles-my-gpu-help/

I've encountered this on my laptop 1070 modded to run at a 230W TDP. Yet it doesn't always happen; something is causing it to sometimes flag itself. When it happens, facing a wall during the "map preparation" phase, you will be power-limit throttled at strange GPU usages and see lower framerates (shadows set to High, everything else Ultra). When it is working right, you will have 144 FPS (the framerate cap). And there seems to be no exact... trigger I can find. Same map, same exact spot, same temps... I even wondered if micro-lag from a wireless adapter or a Bluetooth Xbox controller might be signaling something strange that only PUBG brings out.

Again, please read this thread fully.

http://www.overclock.net/forum/69-nvidia/1674225-gtx-1080-tdp-throttling-why-solved.html

It looks VERY similar to your problem.


----------



## Madmaxneo

Falkentyne said:


> Pascal has several cancerous throttling mechanisms.
> It can throttle if something is shorted, and not even signal that anything is wrong.
> 
> PUBG makes Pascal throttle WAY prematurely. Look:
> 
> https://linustechtips.com/main/topic/817866-pubg-throttles-my-gpu-help/
> 
> I've encountered this on my laptop 1070 modded to run at a 230W TDP. Yet it doesn't always happen; something is causing it to sometimes flag itself. When it happens, facing a wall during the "map preparation" phase, you will be power-limit throttled at strange GPU usages and see lower framerates (shadows set to High, everything else Ultra). When it is working right, you will have 144 FPS (the framerate cap). And there seems to be no exact... trigger I can find. Same map, same exact spot, same temps... I even wondered if micro-lag from a wireless adapter or a Bluetooth Xbox controller might be signaling something strange that only PUBG brings out.
> 
> Again, please read this thread fully.
> 
> http://www.overclock.net/forum/69-nvidia/1674225-gtx-1080-tdp-throttling-why-solved.html
> 
> It looks VERY similar to your problem.


That is one of the things I believe could be going on with Blze001's setup. The issue in the link was that the VRMs were not being cooled correctly by the waterblock, which is why the stock cooler worked so well. The person who asked the question didn't go into detail, but something tells me he did not install the waterblock correctly, as he did say it was his first time.


----------



## Blze001

Did the DDU dance, and on a whim I decided to give 3DMark Fire Strike a go and see what it had to say: 13630 with no OC and 14835 as my max score. For reference, the highest score I've found for my CPU/GPU combo period was in the low 16000s, and the highest with the same amount of RAM was mid-15000s. That was with the core at 2076MHz, memory at 4608MHz, and the card undervolted to 1.025V; also, this was just me fiddling for an hour, not a meticulously dialed-in OC.

Heaven still shows a sub-1500 score. I'm wondering if Heaven just doesn't like my system for some reason, because games and 3DMark seem okay with it.


----------



## AT0MAC

I'm trying to figure out if this is an OK OC, but in the corresponding threads for the different benchmarks people mainly have 1080s or 1080 Tis, so it's really hard to find something to compare with.

Maybe you guys here can tell me if I have a good or a bad GPU.
The GPU speed seems a little low, but it won't go any higher; the RAM speed, though, seems a little on the high side, so it's a strange combo.


----------



## gtbtk

Caveat said:


> I have a question. Why can't I get my fan speed below 40%? I mean, it jumps from 0% to 40%, and from 40% I can set every percentage I want up to 100%.


In my experience GPU Tweak II fan management was always a bit weird. I don't know if it has improved recently.

I would recommend that you retire the Asus software and replace it with MSI Afterburner; it's much better GPU OC software. See how you go using Afterburner to set the fan curve as a first step.


----------



## gtbtk

Blze001 said:


> Did the DDU dance, and on a whim I decided to give 3DMark Firestrike a go and see what it had to say. 13630 with no OC and 14835 as my max score. For reference, the highest score I've found for my CPU/GPU period was in the low 16000s and the highest I've found with the same amount of RAM was mid-15000s. This with the core at 2076Mhz, memory at 4608Mhz, and card undervolted to 1.025, also this was just me fiddling for an hour trying stuff, not a meticulously dialed in OC.
> 
> Heaven still shows a sub-1500 mark. I'm wondering if Heaven is just not liking my system for some reason, because games and 3DMark seem okay with it.


You are running quite a high overclock on the memory. Many cards start throwing errors at about +500, and by +600 the memory is throwing errors constantly. Do you find that you get unexplained intermittent crashes as well?

Download OCCT and run the VRAM memory test with the overclock applied. I am pretty sure you will find that it is throwing errors all over the place; the error correction is what is dropping your performance. OCCT is a great tool: you can run the test, progressively step up the memory frequency, and see the point where the errors start increasing. You set your OC just below that.
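
A rough sketch of that stepping procedure (illustrative only: `run_vram_test` is a made-up stand-in for one OCCT VRAM pass returning an error count, since OCCT has no scripting API that I know of; in practice you do these passes by hand):

```python
# Step the memory offset up in fixed increments until the VRAM test
# reports errors, then settle on the last offset that ran clean.
# run_vram_test(offset_mhz) -> error count is a HYPOTHETICAL callback.

def find_max_stable_offset(run_vram_test, start=0, stop=800, step=50):
    """Return the highest tested offset (MHz) that produced zero errors."""
    last_good = start
    for offset in range(start, stop + 1, step):
        if run_vram_test(offset) > 0:
            break              # errors appeared: stop stepping up
        last_good = offset     # this offset ran error-free
    return last_good
```

For example, if errors first appear at +600 with 50MHz steps, this settles on +550; you would then set your everyday OC a notch below that.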

Have you crossflashed the card at all?

You may want to try running this command from the command line and seeing if it resolves your power limit issue. You need to change the number at the end to match the TDP limit shown in the vbios that you are running on your card. If you are not sure what that should be, look up your vbios in the TechPowerUp vbios database and the listing will tell you what the bios is set to.

"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -pl 290
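
If you want to sanity-check the command before running it (in a Git Bash / WSL style shell; on plain cmd just paste the quoted line), a dry-run wrapper might look like this. The 290 is only the example figure from this post; substitute your own vbios TDP:

```shell
# Build and print the nvidia-smi power-limit command before running it.
# TDP_WATTS must match the limit listed for YOUR vbios in the
# TechPowerUp database; 290 is just the example value used above.
NVSMI='C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe'
TDP_WATTS=290
CMD="\"$NVSMI\" -pl $TDP_WATTS"
printf '%s\n' "$CMD"   # review, then paste into an elevated prompt to apply
```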


----------



## gtbtk

A 20200 graphics score in Fire Strike is about middle of the pack. You can probably get another 1000 points out of it by using the voltage curve to overclock instead of just the slider.



AT0MAC said:


> I'm trying to figure out if this is an OK OC, but in the corresponding threads for the different benchmarks people mainly have 1080s or 1080 Tis, so it's really hard to find something to compare with.
> 
> Maybe you guys here can tell me if I have a good or a bad GPU.
> The GPU speed seems a little low, but it won't go any higher; the RAM speed, though, seems a little on the high side, so it's a strange combo.


----------



## mailto

Nice score.
But why does TechPowerUp GPU-Z show my GPU running at PCIe 3.0 x1? I have an Asus 1070 Strix, an Asus Maximus IX Hero, a 7700K, an H100i, and 3200MHz G.Skill.

Can someone tell me?
Thanks all


----------



## john1016

mailto said:


> Nice score.
> But why does TechPowerUp GPU-Z show my GPU running at PCIe 3.0 x1? I have an Asus 1070 Strix, an Asus Maximus IX Hero, a 7700K, an H100i, and 3200MHz G.Skill.
> 
> Can someone tell me?
> Thanks all



So you have 3.0 x1 instead of 3.0 x16? Is that what gpuz says? Can you post a pic?


----------



## Blackfyre

Last time I was here was close to page 400. Sorry for asking this question, but was a custom BIOS that allowed over-voltage ever made for the GTX 1070?

I own the MSI GTX 1070 Gaming X card, the first batches that came out with Samsung vRAM. Still using the original BIOS which I never updated that allows PowerLimit on MSI Afterburner to go to 126% as well. GPU has been overclocked and locked at maximum voltage of 1093 mV, my Core clock running at 2088Mhz and Memory Clock at +517Mhz (4520Mhz or 9040Mhz). Been running it like this for well over a year and gaming with it. Stable as always.

Probably won't gain much more out of it really even with a custom BIOS, but I am curious if a CUSTOM bios was ever made that allows voltage to go past 1093 mv?

Just ran a Fire Strike to show my current results:

https://www.3dmark.com/3dm/26986815
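
For the curious, the clock math behind "+517MHz (4520MHz or 9040MHz)" works out roughly as follows, assuming Afterburner reads a stock GTX 1070's GDDR5 at about 4004MHz (my assumption; the exact baseline may differ by a few MHz):

```python
# Reconstructing the memory-clock figures quoted in the post above.
STOCK_SHOWN_MHZ = 4004   # approximate Afterburner reading at stock (assumed)
OFFSET_MHZ = 517         # the offset quoted in the post

shown = STOCK_SHOWN_MHZ + OFFSET_MHZ   # what Afterburner displays
effective = shown * 2                  # GDDR5 transfers data twice per clock

print(shown, effective)  # 4521 9042, within a few MHz of the quoted 4520/9040
```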


----------



## Blze001

gtbtk said:


> You are running quite a high overclock on the memory. Many cards start throwing errors at about +500, and by +600 the memory is throwing errors constantly. Do you find that you get unexplained intermittent crashes as well?
> 
> Download OCCT and run the VRAM memory test with the overclock applied. I am pretty sure you will find that it is throwing errors all over the place; the error correction is what is dropping your performance. OCCT is a great tool: you can run the test, progressively step up the memory frequency, and see the point where the errors start increasing. You set your OC just below that.
> 
> Have you crossflashed the card at all?
> 
> You may want to try running this command from the command line and seeing if it resolves your power limit issue. You need to change the number at the end to match the TDP limit shown in the vbios that you are running on your card. If you are not sure what that should be, look up your vbios in the TechPowerUp vbios database and the listing will tell you what the bios is set to.
> 
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -pl 290


Thanks for the input, I'll have to play with this going forward. Sorry for the late response, I was getting sucked into various games.

The only unexplained crashes I've seen were in Witcher 3, and those happened regardless of whether the OC was running. Dialing back HairWorks fixed them completely in both the non-OC and OC cases. Otherwise, games and Fire Strike have been running happily with my current settings.

I'll give the OCCT thing a shot, I've never played around with it, always thought it was for CPU stress testing.

And no, I haven't crossflashed or anything. I don't know if the bios on this card has even been updated; I'm too nervous I'll brick it.

I reran the powerlimit set and it went to 170 from 169.12.


----------



## mailto

hi ty here the screen


----------



## gtbtk

Blze001 said:


> Thanks for the input, I'll have to play with this going forward. Sorry for the late response, I was getting sucked into various games.
> 
> The only crashes I've seen that were unexplained were in Witcher 3, and that was happening regardless of if the OC was running or not. Dialing back the hairworks fixed them completely for both non-OC and OC cases. Otherwise games and Firestrike have been running happily with my current settings.
> 
> I'll give the OCCT thing a shot, I've never played around with it, always thought it was for CPU stress testing.
> 
> And no, I haven't crossflashed or anything. I dunno if the bios on this card has even been updated, too nervous I'll brick it.
> 
> I reran the powerlimit set and it went to 170 from 169.12.


It is a PC test and information utility. It just happens to have the only tool I am aware of that tests vRAM and reports any errors thrown. I found it really useful in tuning my PC.

The nvidia-smi command is still something worth trying; it won't hurt the card.

There have been bios updates issued for 1070s; you may find it beneficial to check and update the bios if it is not the latest version.


----------



## gtbtk

No, nothing that can be easily flashed. There is a notebook editor, but it requires you to use an EPROM burner and swap the physical EPROM, and I am not sure if it has worked on desktop cards.

I have the same card with Micron memory, but I flashed the Gaming Z bios just to get higher default clocks. My best memory OC is just under +700 above reference (650 on the Gaming Z bios).

The Samsung cards never had a problem. The new bios also goes to 126%, but my card will start to throttle at about 106% regardless of where the slider is set. It needs a 1440p load, though.

My best Fire Strike run gave me a graphics score of 21500. Samsung memory will usually go a bit faster, and it does give you a boost in FS.



Blackfyre said:


> Last time I was here was close to page 400. Sorry for asking this question, but was a custom BIOS that allowed over-voltage ever made for the GTX 1070?
> 
> I own the MSI GTX 1070 Gaming X card, the first batches that came out with Samsung vRAM. Still using the original BIOS which I never updated that allows PowerLimit on MSI Afterburner to go to 126% as well. GPU has been overclocked and locked at maximum voltage of 1093 mV, my Core clock running at 2088Mhz and Memory Clock at +517Mhz (4520Mhz or 9040Mhz). Been running it like this for well over a year and gaming with it. Stable as always.
> 
> Probably won't gain much more out of it really even with a custom BIOS, but I am curious if a CUSTOM bios was ever made that allows voltage to go past 1093 mv?
> 
> Just ran a Fire Strike to show my current results:
> 
> https://www.3dmark.com/3dm/26986815


----------



## Falkentyne

gtbtk said:


> No, nothing that can be easily flashed. There is a notebook editor, but it requires you to use an EPROM burner and swap the physical EPROM, and I am not sure if it has worked on desktop cards.
> 
> I have the same card with Micron memory, but I flashed the Gaming Z bios just to get higher default clocks. My best memory OC is just under +700 above reference (650 on the Gaming Z bios).
> 
> The Samsung cards never had a problem. The new bios also goes to 126%, but my card will start to throttle at about 106% regardless of where the slider is set. It needs a 1440p load, though.
> 
> My best Fire Strike run gave me a graphics score of 21500. Samsung memory will usually go a bit faster, and it does give you a boost in FS.





> Last time I was here was close to page 400. Sorry for asking this question, but was a custom BIOS that allowed over-voltage ever made for the GTX 1070?
> 
> I own the MSI GTX 1070 Gaming X card, the first batches that came out with Samsung vRAM. Still using the original BIOS which I never updated that allows PowerLimit on MSI Afterburner to go to 126% as well. GPU has been overclocked and locked at maximum voltage of 1093 mV, my Core clock running at 2088Mhz and Memory Clock at +517Mhz (4520Mhz or 9040Mhz). Been running it like this for well over a year and gaming with it. Stable as always.
> 
> Probably won't gain much more out of it really even with a custom BIOS, but I am curious if a CUSTOM bios was ever made that allows voltage to go past 1093 mv?
> 
> Just ran a Fire Strike to show my current results:
> 
> https://www.3dmark.com/3dm/26986815


The only overvoltage possible for desktop cards (+100mV) requires a config file edit in MSI Afterburner. Unlocking the voltage slider is always possible; however, if the card does not recognize the voltage increase after making the changes, there's nothing you can do.

The Pascal bios editor DOES work with -some- stock desktop cards, but it is over a year out of date and most newer cards are simply not supported now. It does *NOT* require desoldering, just an SPI flasher (a SkyPro is usually the best choice), a 1.8V adapter, and, very highly recommended, a Pomona 5250 IC clip plus some male-to-female jumper cables to hook the Pomona clip up to the 1.8V adapter.
Desoldering is not required because vbios (and most system bios) chips are bus or SPI isolated, so pushing current through the chip with the programmer will not damage or scramble the data. Just make sure no power is going through the device (this applies to system bios chips, video card bios chips, etc.). There are chips which are NOT bus isolated, like Thunderbolt 3 chips; those can be damaged by trying to read them with a programmer, and those do require desoldering.

There is no vbios editor anywhere that forcibly allows changing voltage on Pascal cards if the MSI Afterburner unlock doesn't work. You would have to do a physical hardware mod to the card, like soldering on certain-ohm resistors and so on, to push the voltage higher than the 1.062V or 1.081V limit, whatever your limit is.


----------



## Madmaxneo

Does anyone know how we can find out if there is a stock bios update for the EVGA 1070 ACX 3.0 SC Black ed? The current bios on my card is 86.4.50.0.72. I noticed the bios I have on my card was updated near the end of 2016 so I was wondering if there was an updated bios and where I could get it. I have looked and haven't been able to find anything on it anywhere.


----------



## khanmein

Madmaxneo said:


> Does anyone know how we can find out if there is a stock bios update for the EVGA 1070 ACX 3.0 SC Black ed? The current bios on my card is 86.4.50.0.72. I noticed the bios I have on my card was updated near the end of 2016 so I was wondering if there was an updated bios and where I could get it. I have looked and haven't been able to find anything on it anywhere.


86.04.50.00.72 is the latest and also the last one.

https://www.techpowerup.com/vgabios...=GTX+1070&interface=&memType=&memSize=&since=


----------



## Madmaxneo

khanmein said:


> 86.04.50.00.72 Is the latest & also the last one.
> 
> https://www.techpowerup.com/vgabios...=GTX+1070&interface=&memType=&memSize=&since=


Thanks for the info. I saw the bios was last updated about a year and a half ago, but wasn't sure if TechPowerUp was reliable enough for that data.


----------



## comanzo

Hey guys... My friend is looking for a B360 mobo since he won't be overclocking (he will be getting an i7 8700). I can't seem to find a motherboard that supports Bluetooth. Gigabyte is preferred, but not essential. Any help or links would be much appreciated. Thanks.


----------



## khanmein

comanzo said:


> Hey guys... My friend is looking for a b360 mobo since he won't be overclocking (will be getting a i7 8700). I can't seem to find a motherboard that supports Bluetooth. Gigabyte is preferred, but not essential. Any help or links would be much appreciated. Thanks.


Avoid Gigabyte; their B360 boards have crappy BIOSes and VRMs. 

ASUS TUF B360-PRO GAMING


----------



## Blze001

BIOS update really seems to have helped. This was my third Time Spy run in a row (I like to get things nice and hot before I record my "real" score), and a long Division session last night had zero problems at all.

Yeah, the temps are a little high for a waterloop, but this loop isn't really optimized, it's using parts I had bought for my attempt at a watercooled Hadron Air, so the slim fans and slim rads aren't the best thermally. Next time I drain the loop I'm gonna reroute things and get some proper 25mm fans on the top.

http://www.3dmark.com/spy/3925504

That CPU test, tho. The old 4670k is starting to get long in the tooth, even at 4.4GHz.


----------



## rfarmer

Blze001 said:


> BIOS update really seems to have helped. This was my third Time Spy run in a row (I like to get things nice and hot before I record my "real" score), and a long Division session last night had zero problems at all.
> 
> Yeah, the temps are a little high for a waterloop, but this loop isn't really optimized, it's using parts I had bought for my attempt at a watercooled Hadron Air, so the slim fans and slim rads aren't the best thermally. Next time I drain the loop I'm gonna reroute things and get some proper 25mm fans on the top.
> 
> http://www.3dmark.com/spy/3925504
> 
> That CPU test, tho. The old 4670k is starting to get long in the tooth, even at 4.4Ghz.


http://www.3dmark.com/spy/3925669 Here are my Time Spy results with an 8700k; your graphics score is a bit better, but the CPU score is much higher with the 8700k.


----------



## comanzo

Quick question guys. Is running the voltage slider at 1.093 V (100%, all the way) safe for an overclocked 1070? I assume so, since 1.093 V seems like a mild increase over the default amount.


----------



## JennyBeans

for some reason I'm seeing really crappy frames, like 30-40 fps, in Fire Strike in 3DMark. I uninstalled my drivers and reinstalled them and am still getting low fps, so I'm at a loss. Nothing is thermal throttling, all the temps look good for both the CPU and GPU (the card itself only hit 65C max), yet I went from a 15500 score to 8k?


----------



## pez

JennyBeans said:


> for some reason I'm seeing really crappy frames like 30-40 fps in firestrike in 3d mark .. I uninstalled my drivers and reinstalled them still getting low fps , so I'm at a loss, nothing thermal throttling, all the temps look good for both the cpu and gpu , it only hit max 65c the card itself yet i went to a 15500 score to 8k ?


What usage and clock speeds are you seeing during the duration of the benchmarks?


----------



## JennyBeans

pez said:


> What usage and clock speeds are you seeing during the duration of the benchmarks?



goes up to its max clock speeds, core clock 1974


memory clock to 4067


the memory usage on it only 2628


99% gpu usage


----------



## pez

JennyBeans said:


> goes up to its max clock speeds core clock 1974
> 
> 
> memeory clock to 4067
> 
> 
> the memory usage on it only 2628
> 
> 
> 99% gpu usage


Not gonna lie, I've not tested 3DMark on my 1070 rig, but what is it supposed to be getting compared with other setups like yours? Have you tried a reinstall of 3DMark, or running it in safe mode? A better question at this point might be what kind of CPU usage you are seeing and what is eating it up.


----------



## JennyBeans

pez said:


> Not gonna lie, I've not tested 3DMark on my 1070 rig, but what is it supposed to be getting compared with other setups like yours? Have you tried a reinstall of 3DMark or trying to run it via safe mode? Better question at this point, too might be to ask what type of CPU usage you are seeing and what is eating it up.



like 55-66%, the cpu isn't even a problem


----------



## gtbtk

JennyBeans said:


> for some reason I'm seeing really crappy frames like 30-40 fps in firestrike in 3d mark .. I uninstalled my drivers and reinstalled them still getting low fps , so I'm at a loss, nothing thermal throttling, all the temps look good for both the cpu and gpu , it only hit max 65c the card itself yet i went to a 15500 score to 8k ?


Look at the Windows power settings; for benching, you want Max Performance selected.

Look at the Nvidia control panel and set it to prefer maximum performance as well.


----------



## gtbtk

JennyBeans said:


> like 55-66% the cpu isn't even an problem


Best Firestrike result I have managed is 15364, running on a rig with an i7-2600 @ 4.4GHz. 

https://www.3dmark.com/fs/11784694

The graphics score was 21500 with a finely tuned overclock. 

More modern i7 CPUs running PCIe 3.0 will let a well tuned 1070 into the 22000 graphics-score range without exotic cooling.


----------



## Waleh

Hey guys,

I have been experiencing an issue with my rig over the last week or so and I figured I would turn to the OCN community for help. Just as an FYI, this is an ITX system. This is a list of my specs:
i7 7700k (stock)
Asus Z170i Pro gaming motherboard
GTX 1070 FE (+205 Core clock/+550 memory clock) - running at this setting for almost 2 years
500 GB Samsung 850 EVO - boot drive, 1 TB Crucial storage SSD
16 GB 2400 Corsair Vengeance RAM
Corsair SF 600 W PSU

The issue is that I get severe FPS drops in any game I play. I mainly play BF1 and I will go from 120 FPS to 40 FPS (as an example) for a second or so and then go back up to a high FPS. This will happen in cycles throughout the round. The temps are fine in the system and it is not a temperature related issue causing this problem.

I ran Firestrike 1.1 and got these results: 
Score- 16 024 
Graphics Score- 20 104 
Physics Score- 12 690 
Combined Score - 7532

I also used DDU to uninstall the drivers and re-installed the latest drivers, but it did not help. At this point I'm not even sure the issue is the GPU, but I would love some input. I assume it could be the RAM, but I'm not sure how to test RAM. Thanks everyone!


----------



## gtbtk

The overclock on the 1070 looks like it is within expected ranges. Have you tried running without an overclock? Does performance improve again in BF1?

Have you made any changes in the BIOS recently? 

Have you cleaned your rig out? It could be dust making things run too hot. 

Firestrike results look about right.

You could take a look at the power profile settings you are running. Something may have changed it to, say, power saving mode.

It may also be an early sign that your PSU is starting to die. Are the fans spinning on the PSU under load? BF1 exercises both the CPU and GPU at the same time, putting more stress on the PSU. Firestrike loads either the CPU or the GPU, but not really both at the same time. 



Waleh said:


> Hey guys,
> 
> I have been experiencing an issue with my rig over the last week or so and I figured I would turn to the OCN community for help. Just as an FYI, this is an ITX system. This is a list of my specs:
> i7 7700k (stock)
> Asus Z170i Pro gaming motherboard
> GTX 1070 FE (+205 Core clock/+550 memory clock) - running at this setting for almost 2 years
> 500 GB Samsung 850 EVO - boot drive, 1 TB Crucial storage SSD
> 16 GB 2400 Corsair Vengence RAM
> Corsair SF 600 W PSU
> 
> The issue is that I get severe FPS drops in any game I play. I mainly play BF1 and I will go from 120 FPS to 40 FPS (as an example) for a second or so and then go back up to a high FPS. This will happen in cycles throughout the round. The temps are fine in the system and it is not a temperature related issue causing this problem.
> 
> I ran Firestrike 1.1 and got these results:
> Score- 16 024
> Graphics Score- 20 104
> Physics Score- 12 690
> Combined Score - 7532
> 
> I also used DDU to uninstall the drivers and re-installed the latest drivers but it did not help. At this point, I'm not even sure the issue is the GPU but I would love some input. I assume it could be the ram but Im not sure how to test ram. Thanks everyone!


----------



## gtbtk

double post


----------



## ianusy2k2

hi guys... just want to sort something out on my mining rig, which is a mix of Palit JetStream GTX 1070s and Zotac GTX 1070 AMP Core Editions. It was my first time using Nvidia Inspector a few days back, and I discovered that 1 of the 6 Zotacs I have shows this bios in Inspector... attached are photos for reference


----------



## gtbtk

ianusy2k2 said:


> hi guys... just wanna sort something out of my Mining Rig which is mixed of PALIT Jetstream GTX1070 and Zotac GTX 1070 Amp Core Edition.... It was my first time using Nvidia Inspector few days back and was able to discover that 1 of 6 Zotacs i have is showing in inspector this bios... attached are photos for reference


Only a guess, but you are running the card undervolted and the GPU clock is running below stock clocks, so the Modified message may be because the clocks don't match the stated spec.


----------



## par

someone has any idea if I can mount an accelero xtreme on a gigabyte 1070 g1 gaming ?

TY


----------



## gtbtk

par said:


> someone has any idea if I can mount an accelero xtreme on a gigabyte 1070 g1 gaming ?
> 
> TY



Yes, it will mount; the mounting holes are standard. It needs stick-on heatsinks on the memory and VRM.


----------



## GunnzAkimbo

moving up to a 2080, near the same price as a high end 2070.


----------



## [email protected]

Is EVGA 2060 like a 1070Ti or what?


----------



## skupples

[email protected] said:


> Is EVGA 2060 like a 1070Ti or what?


Close enough. Maybe a bit faster. I believe it's been compared next to the 1080.


----------



## PloniAlmoni

Definitely a side grade from a 1070, like a 1070ti.


----------



## skupples

The 2080 felt like a modest upgrade from the 1070 when attempting to make 4K more playable. 

Yes, it added quite a few titles to the list of viable @ 4K, but it totally wasn't worth it at the price I paid. I got a Zotac blower (worst card I've ever had in my life) for $759 + tax, tag & title; it was almost $900. 

Instead, I got a used 1080 Ti for $500 and retired my 1070 to my work tower.


----------



## PloniAlmoni

I'm not playing at 4k, if I was, I'd consider a 2080 or a used 1080ti. Neither a 1070 nor a 2060 would have 60fps framerates in AAA games at 4k.


----------



## skupples

PloniAlmoni said:


> I'm not playing at 4k, if I was, I'd consider a 2080 or a used 1080ti. Neither a 1070 nor a 2060 would have 60fps framerates in AAA games at 4k.


2080 & 1080 ti barely do it well enough either  

& most of the time it isn't a gap that simply turning down the settings can fix.


----------



## TwilightRavens

So how common is 2000MHz on a 1070, perchance? I know on my 1080 it's not really that difficult, but this 1070 I got for my other rig just isn't having it. Would I see more benefit from just clocking the VRAM? Yes, I know it's being bottlenecked in this system, but a lot of the time it does see full usage, so it's not a total waste.


----------



## skupples

I ran my 1070 @ +130 +500 via air with little issue, never really tried to push it beyond that.


----------



## TwilightRavens

skupples said:


> I ran my 1070 @ +130 +500 via air with little issue, never really tried to push it beyond that.


Good to know, thanks.

Right now it seems to do +450 memory with no issue, but it really didn't like +50 on the core. Then again, GPU-Z already reports the card as boosting to 1999MHz, so maybe that's it already. Weird though, you would think it would overclock slightly better than a 1080, but my 1080 doesn't really put up much of a fight going to 2100. Might try for higher on the memory until I get this 1070 repasted since I bought it used, though honestly it looks like it's barely been used.


----------



## skupples

Better yet, those limits will change per title if you're really pushing the edge of the power limit.


----------



## TwilightRavens

skupples said:


> better yet, those limits will change per title if you're really pushing the edge of power.


It very well could be that my cheapo PSU is also holding it back, which I do plan to replace in the future. But yeah, I will probably push it as far as it will let me. The first thing I've learned to do with Pascal is max out the voltage limit and power limit, and that in itself pretty much overclocks the card, at least GPU Boost 3.0 wise.


----------



## TwilightRavens

So I was able to get +600MHz on the memory, albeit not 100% stable but +525MHz is rock solid and increased my Firestrike score by about 800 points even though I am CPU bottlenecked. Also so far have gotten +10MHz on the boost clock (haven’t really been pushing it that much because the gains are minimal as most of the games I play on this system are older than 2013.) But yeah so far my Firestrike score is 11,718 (remember I am CPU bottlenecked) and going to try and see if I can get her to 12,000 ish with a hair more on the CPU and GPU. 

I think repasting both the CPU and GPU has gotten me a bit of headroom temp wise as the 1070 has dropped about 10C and the CPU about 5C, we will see what unfolds. I might try the EVGA OC tool as a friend of mine suggested as supposedly it allows you to add a tiny bit of voltage to the card but I am way more comfortable with Afterburner to be honest.


----------



## kignt

This might be an unconventional way to find a memory clock limit...

I had a crashing issue with Blender benchmark 1.0 beta 2 (focusing on the CPU bench) with MSI Afterburner running in the background: black screen, but the Blender process still running. The 2nd scene, classroom, would always crash, though the long bench run continues; closing AB clears the bench runs. It seems the memory clock of the GTX 1070 was too high at 9300 effective. Dialed down to 8600, the CPU quick and long benches now run fine. 
I hadn't bothered to change it from 9300 since more than a year ago; it clears the Superposition bench and cinematic mode fine. I had tried dialing back the mem clock before, but not that low. It only became apparent when I tried full default GPU clocks.


----------



## Desolutional

Best way to check max VRAM clock is to pause Unigine Heaven on a scene, then adjust the VRAM clock and watch the fps increase. When the increases start getting smaller per bin, you're reaching the limit.
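That pause-and-watch approach lends itself to a tiny script: log the fps you see at each VRAM offset, then find the last step where the gain per bin is still meaningful; beyond that point, GDDR5 error correction is likely eating the bandwidth you added. A minimal sketch, assuming you record the numbers by hand (the fps values below are made up for illustration):

```python
# Sketch: find where per-bin fps gains collapse while stepping the VRAM
# offset in a paused Heaven scene. The sample numbers are illustrative,
# not measured on real hardware.

def knee_offset(samples, min_gain=0.5):
    """samples: list of (mem_offset_mhz, fps) in ascending offset order.
    Returns the last offset whose fps gain over the previous step is
    still at least `min_gain` fps."""
    best = samples[0][0]
    for (prev_off, prev_fps), (off, fps) in zip(samples, samples[1:]):
        if fps - prev_fps >= min_gain:
            best = off
        else:
            break  # gains collapsed: error correction is likely kicking in
    return best

samples = [(0, 60.0), (100, 61.2), (200, 62.3), (300, 63.1),
           (400, 63.5), (500, 63.6), (600, 63.2)]  # hypothetical Heaven fps
print(knee_offset(samples))  # -> 300
```

With real logged numbers, the returned offset is a starting point for proper stability testing rather than a guaranteed maximum.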


----------



## b0uncyfr0

Desolutional said:


> Best way to check max VRAM clock is to pause Uniengine Heaven on a scene then adjust VRAM clock and watch the fps increase. When the increases start getting lower per bin, you're reaching the limits.


Oh that's good, I've always wondered how to check this without doing a firestrike run each time. Much appreciated mate!


----------



## specopsFI

So, after a long time of having a 1080 Ti that basically did whatever I asked of it, I have come across a Pascal dilemma. Picked up a nice first batch 1070 Gaming X with Samsung memory. Seems like an okay chip, but two things are causing me to wonder.

The Fire Strike Ultra stress test passes without a hitch at 1.031V / core 2088MHz / mem +600MHz. The first problem is that the temps seem a bit high, though of this I'm not sure. At those volts with fans at 100%, I would have expected more along the lines of 70 degrees max instead of 74-75. Airflow is plentiful, but the ambient temp of around 24-25 degrees could explain it. Still, it's not nice seeing such a large cooler let the clocks come down by three clock bins even running full steam. Does anyone have an opinion on whether those temps are normal? I will most likely change the paste eventually, but would still like to hear opinions.

But the second problem really baffles me. One of the best qualities of the 1070 Gaming X has always been its generous (comparably) power limit. But no matter what I do, this card power throttles when power hits 104% (limit set at 126%). I tried searching the forum about what's going on here and came across @gtbtk and @Falkentyne talking about similar problems but from my understanding, only when cross-flashing with a BIOS of a different card. My card is on its original BIOS.

I've tried using NVSMI to both read and set my power limit and it says it applies the full 291W. But the card still throttles with power limit flag as seen in the attachment.

Is this a widespread issue, and does anyone have ideas on how to troubleshoot? It would be nice to push this elderly card beyond 2.1GHz.


----------



## Falkentyne

specopsFI said:


> So, after a long time of having a 1080 Ti that basically did whatever I asked of it, I have come across a Pascal dilemma. Picked up a nice first batch 1070 Gaming X with Samsung memory. Seems like an okay chip, but two things are causing me to wonder.
> 
> Fire Strike Ultra Stress test passes without a hitch at 1.031V / core 2088MHz / mem +600MHz. The first problem is that the temps seem a bit high, but of this I'm not sure. At those volts and fans at 100% I would have expected more along the lines of 70 degrees max in stead of 74-75. Airflow is a plenty but ambient temp around 24-25 degrees so could explain it. Still, not nice seeing such a large cooler having the clocks come down by three clock bins even running full steam. Does anyone have an opinion if those temps are normal? Will most likely change paste eventually, but would still like to hear opinions.
> 
> But the second problem really baffles me. One of the best qualities of the 1070 Gaming X has always been its generous (comparably) power limit. But no matter what I do, this card power throttles when power hits 104% (limit set at 126%). I tried searching the forum about what's going on here and came across @gtbtk and @Falkentyne talking about similar problems but from my understanding, only when cross-flashing with a BIOS of a different card. My card is on its original BIOS.
> 
> I've tried using NVSMI to both read and set my power limit and it says it applies the full 291W. But the card still throttles with power limit flag as seen in the attachment.
> 
> Is this a wide spread issue and does anyone have ideas on how to troubleshoot? Would be nice to push this elderly beyond 2.1GHz.


What's the actual power watts of the card when it's throttling?
A GTX 1070 with a *291W* power limit?
I know 1080's are capable of that but I've never seen a 1070 go that high! Where are you getting this reading from?
A GTX 1070 is supposed to have a 151W base power limit and 170W (100%-112%). Just for reference, 152% is *230W*.
Find out what the base TDP is at 100% for that card, please and what wattage it is actually using.
The newest version of MSI Afterburner should be able to read the wattage. if it can not, use HWinfo64 (it even has a RTSS plugin overlay so you can use it with RTSS). HWinfo64 will show the exact wattage the card is pulling.

I'm 98% sure a GTX 1070 can't go anywhere close to 291W TDP without voltage adjustments. You would need to really crank up the voltage for that. It definitely can't do that at 1.031v for sure.
291W TDP is modded GTX 1080 territory. You mentioned "126% TDP". 126% of a 150W base is about 189W.
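Since power-limit percentages keep coming up in this thread, the slider arithmetic is easy to sanity-check: the slider percentage is just a multiplier on the board's base TDP. A quick sketch, using the board power figures quoted in this thread for the reference 1070 and the Gaming X:

```python
# Convert a power-limit slider percentage to watts for a given base TDP.
def slider_watts(base_tdp_w, percent):
    return base_tdp_w * percent / 100.0

# Reference GTX 1070: 151 W base, 112% max slider
print(slider_watts(151, 112))  # -> 169.12
# MSI 1070 Gaming X: 230 W base, 126% max slider
print(slider_watts(230, 126))  # -> 289.8 (the ~291 W figure, give or take rounding)
```

This is also a quick way to translate a throttle flag seen at, say, 104% into actual watts for whichever BIOS the card is running.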


----------



## specopsFI

Falkentyne said:


> What's the actual power watts of the card when it's throttling?
> A GTX 1070 with a *291W* power limit?
> I know 1080's are capable of that but I've never seen a 1070 go that high! Where are you getting this reading from?
> A GTX 1070 is supposed to have a 151W base power limit and 170W (100%-112%). Just for reference, 152% is *230W*.
> Find out what the base TDP is at 100% for that card, please and what wattage it is actually using.
> The newest version of MSI Afterburner should be able to read the wattage. if it can not, use HWinfo64 (it even has a RTSS plugin overlay so you can use it with RTSS). HWinfo64 will show the exact wattage the card is pulling.
> 
> I'm 98% sure a GTX 1070 can't go anywhere close to 291W TDP without voltage adjustments. You would need to really crank up the voltage for that. It definitely can't do that at 1.031v for sure.
> 291W TDP is modded GTX 1080 territory. You mentioned "126% TDP". 126% of a 150W base is about 189W.


Yes, the stock BIOS for the GTX 1070 Gaming X is 230W at 100% and maxes out at 126% = 291W. NVSMI log of my card attached; this is what I get when I set the power limit to max with either Afterburner or NVSMI.

Can be seen in the TPU hosted BIOS as well: https://www.techpowerup.com/vgabios/183801/msi-gtx1070-8192-160602

Also, @gtbtk was running a very similar BIOS with his card, the 1070 Gaming Z BIOS that is: https://www.overclock.net/forum/69-...a-gtx-1070-owner-s-club-965.html#post27352001.

It has the same 291W max limit, and he saw throttling at 105% = 242W. Mine hits the power limit flag at ~104%, which should be ~239W, regardless of the fact that Afterburner and NVSMI show the current power limit being set at 126% / 291W. And maybe the weirdest thing is HWinfo showing the max wattage at only ~211W!

The normal troubleshooting has been done, aka DDU-> clean reinstall of drivers and Afterburner etc... Nothing changed. Weird thing, had nothing like this ever with my 1080 Ti.

That is exactly where the conundrum is. The BIOS should have *plenty* of headroom, but refuses to use it.


----------



## specopsFI

So not a lot of love for these 1070s these days, it seems. I'm still gonna continue my monologue for one more message.

As said, my 1070 Gaming X was struggling a bit with temps before. I have now repasted it with Kryonaut and that dropped the temps. Like *a lot*. Last time the FS Ultra stress test settled at 75 degrees; now it's at 63, as seen in the attachment. So that's progress.

But the power throttling remains. Is there truly no one with experience of a Pascal card that power throttles *even when not hitting the set and reported power limit*? Both NVSMI and Afterburner say that the setting of 126% = 291 W is sticking, but the card just overrules it and power throttles when reaching ~103%.


----------



## TwilightRavens

specopsFI said:


> So not a lot of love for these 1070s these days, it seems. I'm still gonna continue my monologue for one more message.
> 
> As said, my 1070 Gaming X was struggling a bit with temps before. I have now put it on Kryonaut and that dropped the temps. Like *a lot*. Last time FS Ultra stress test settled at 75 degrees, now it's at 63 as seen in the attachment. So that's progress.
> 
> But the power throttling remains. Is there truly no one with experience on a Pascal card that power throttles *even when not hitting the set and reported power limit*? Both NVSMI and Afterburner say that the setting of 126% = 291 W is sticking, but the card just overrules it and power throttles when reaching ~103%.


There must be something else at play then; neither my 1070 nor my 1080 does that, but what it is I am not entirely sure. Have you DDU’d and installed an updated driver?


----------



## specopsFI

TwilightRavens said:


> There must be something else at play then, neither my 1070 or my 1080 do that but what I am not entirely sure. Have you DDU’d and installed an updated driver?


Yes, several times. Tried several different drivers as well. I had a similar experience with my 1080 Ti Jetstream after flashing it with a Super Jetstream BIOS. Afterburner did set the power limit, but the card power throttled at 100% anyway; only by setting the power limit with NVSMI did the card use up to 140% power. And it was a one-time deal: after that I never had a problem with it, even after DDU + new drivers. But this time NVSMI does nothing that Afterburner doesn't do as well, so I'm running out of ideas. Maybe I'll give it another go on my new Ryzen build once I get my hands on the CPU...

Weird stuff.


----------



## gtbtk

specopsFI said:


> Yes, the stock BIOS for the GTX 1070 Gaming X is 230W at 100% and max at 126% = 291W. NVSMI log of my card attached, this is when set the power limit at max with either Afterburner or NVSMI.
> 
> Can be seen in the TPU hosted BIOS as well: https://www.techpowerup.com/vgabios/183801/msi-gtx1070-8192-160602
> 
> Also, @gtbtk was running a very similar BIOS with his card, the 1070 Gaming Z BIOS that is: https://www.overclock.net/forum/69-...a-gtx-1070-owner-s-club-965.html#post27352001.
> 
> Has the same 291W max limit, and he saw throttling at 105% = 242W. Mine hits the power limit flag at ~104%, which should be ~239W. Regardless of the fact that Afterburner and NVSMI show the current power limit being set at 126% / 291W. And maybe the weirdest thing is HWinfo showing max wattage being only at ~211W!
> 
> The normal troubleshooting has been done, aka DDU-> clean reinstall of drivers and Afterburner etc... Nothing changed. Weird thing, had nothing like this ever with my 1080 Ti.
> 
> That is exactly where the conundrum is. The BIOS should have *plenty* of headroom, but refuses to use it.


Not been here for a while. My rig is in pieces so I cannot test this. Take a look at the 1080 Ti BIOS flashing guide in the 1080 Ti owners forum. It mentions the same issue with the 1080 Ti when you flash a different BIOS and claims that you can reset the power limit using the nvidia-smi utility. The power limit is definitely a product of the installed firmware. At one stage I had a Galax BIOS on the card and it was pulling 300W, but I could never get the card totally stable.

I have not been able to test the nvidia-smi reset myself, but it might be worth a try. Let me know if it solves your problem.

The command to reset the power limit is "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -pl 291
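On the same theme, nvidia-smi can also log live draw versus limit as CSV (e.g. `nvidia-smi --query-gpu=power.draw,power.limit --format=csv,noheader,nounits -l 1`), and that output is trivial to post-process when chasing a throttle flag. A hedged sketch, assuming that query format:

```python
# Parse 'power.draw, power.limit' CSV lines as emitted by:
#   nvidia-smi --query-gpu=power.draw,power.limit --format=csv,noheader,nounits -l 1
def parse_power(lines):
    """Return a list of (draw_w, limit_w, pct_of_limit) tuples."""
    out = []
    for line in lines:
        draw, limit = (float(x) for x in line.split(","))
        out.append((draw, limit, 100.0 * draw / limit))
    return out

sample = ["238.9, 291.0", "211.4, 291.0"]  # hypothetical readings
for draw, limit, pct in parse_power(sample):
    print(f"{draw:.1f} W / {limit:.1f} W ({pct:.0f}%)")
```

Comparing the logged percentage against the point where the perfcap flag appears shows whether the card is actually honoring the limit it reports.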


----------



## specopsFI

gtbtk said:


> Not been here for a while. My rig is in pieces so I cannot test this. Take a look at he 1080ti bios flashing guide in teh 1080ti owners forum. It mentions the same issue with 1080ti when you flash a different bios and claims that you can reset the power limit using the nvidia-smi utility. The power limits is definitely a product of the installed firmware. At one stage I had a Galax bios on the card and it was pulling 300W but I could never get the card totally stable
> 
> I have not been able to test the nvidia-smi reset it but it might be worth a try. Let me know if it solves your problem.
> 
> The command to reset the power limit is "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -pl 291


Thank you for the suggestion. Unfortunately that doesn't help, tried it already. But the problem isn't that severe, to be honest. I can't get the throttling to occur in anything other than Fire Strike Ultra. So maybe I'll just think of it as an outlier and not worry about it. Might take another shot once my new rig is up and running.


----------



## TwilightRavens

specopsFI said:


> Thank you for the suggestion. Unfortunately that doesn't help, tried it already. But the problem isn't that severe, to be honest. I can't get the throttling to occur in anything other than Fire Strike Ultra. So maybe I'll just think of it as an outlier and not worry about it. Might take another shot once my new rig is up and running.


Could also try putting liquid metal on the shunt resistors, from what I have seen that raises the power limit and keeps it from throttling that way.


----------



## gtbtk

specopsFI said:


> Thank you for the suggestion. Unfortunately that doesn't help, tried it already. But the problem isn't that severe, to be honest. I can't get the throttling to occur in anything other than Fire Strike Ultra. So maybe I'll just think of it as an outlier and not worry about it. Might take another shot once my new rig is up and running.


In my experience with my Gaming X at 1080p, the card never hit the power limits anyway, even when overclocked to a 21600 graphics score in Firestrike. 1440p loads would hit the power limit. As my monitor is 1200p, it was never a real issue for me, so I didn't investigate it too much.


----------



## wokie

Dear all,

Hope you can help me here. I recently downloaded the latest BIOS, F3, from the Gigabyte website and tried to update my card (Gigabyte GeForce GTX 1070 Xtreme Gaming). The result was a black screen. I know, "don't fix it if it works", but well... it happened. So I downloaded GV-N1070XTREME-8GD/F2/0404 from the TechPowerUp website, which partially fixed the issue: the first time I power on my computer I get a black screen again, but if I then reboot, it loads normally. The problem is that from time to time some application now crashes, and the performance is not the same as it used to be. Also, rebooting the computer every time I want to start it is quite annoying. Do you have any idea how to solve this problem? Some further details: of course I already tried uninstalling the Nvidia drivers and applications and then did a clean installation. I also tried re-flashing the BIOS again, but that didn't help either.

Thank you,

Regards,

Tomas


----------



## FastOne8

wokie said:


> Dear all,
> 
> Hope you can help me here. I have recently downloaded latest BIOS F3 from Gigabyte website and tried to update my card (GIGABYTE GeForce GTX 1070 Xtreme Gaming). Well the result was that I had a black screen. I know, "Don't fix it if it works" but well...it happened. So I have tried to download GV-N1070XTREME-8GD/F2/0404 from techpowerup web site which partially fixed the issue. First time when I power on my computer I get black screen again, but if I reboot my computer then it loads normally. The problem is that from time to time some application crashes now and the performance is not same like it used to be. And also rebooting computer anytime when I want to start it is quite annoying. Do you have any idea how to solve this problem? Some further details: Of course I already tried to uninstall nvidia drivers, applications and then I made a clean installation. I have tried to re-flash the BIOS again but that also didn't help.
> 
> Thank you,
> 
> Regards,
> 
> Tomas


Hi. You can try downloading and installing the official Gigabyte BIOS from here: https://www.gigabyte.com/us/Graphics-Card/GV-N1070XTREME-8GD-rev-10/support#support-dl-bios
and see if it helps. 
Also, on the same website you mentioned there are multiple different BIOS versions for your card model. You can try flashing them one by one and see if any works the same as your old one.

What was the original reason that made you install different firmware? Did you have some problems with your GPU before?


----------



## icold

My 1070 Strix has run at 2152MHz/4363MHz stable for almost 3 years. My memory doesn't go up much, but the GPU is reasonable. Changing my stock bios to the Strix OC bios helped reduce throttling.


----------



## gtbtk

icold said:


> My 1070 strix run´s at 2152mhz/4363mhz in almost 3 years stable, my memory dont up much, but gpu is reasonable. I change my stock bios to strix OC bios helps reduce throttling.


Strix OC bios is one of the best 1070 bioses out there.


----------



## icold

gtbtk said:


> Strix OC bios is one of the best 1070 bioses out there.



The difference from the stock bios is the power target, 112% vs 120%, and it works well because my GPU is the Strix (non-OC). I wish I had gotten Samsung memory.


----------



## gtbtk

wokie said:


> Dear all,
> 
> Hope you can help me here. I have recently downloaded latest BIOS F3 from Gigabyte website and tried to update my card (GIGABYTE GeForce GTX 1070 Xtreme Gaming). Well the result was that I had a black screen. I know, "Don't fix it if it works" but well...it happened. So I have tried to download GV-N1070XTREME-8GD/F2/0404 from techpowerup web site which partially fixed the issue. First time when I power on my computer I get black screen again, but if I reboot my computer then it loads normally. The problem is that from time to time some application crashes now and the performance is not same like it used to be. And also rebooting computer anytime when I want to start it is quite annoying. Do you have any idea how to solve this problem? Some further details: Of course I already tried to uninstall nvidia drivers, applications and then I made a clean installation. I have tried to re-flash the BIOS again but that also didn't help.
> 
> Thank you,
> 
> Regards,
> 
> ​​​​​​​Tomas


Probably too late, but try using a different display output port on the card. DisplayPort 1 on a Gigabyte card is not always wired the same as DisplayPort 1 on a different card.


----------



## gtbtk

icold said:


> The difference from the stock BIOS is the power target, 112% vs. 120%, and it works well because my GPU is the Strix (non-OC). I wish I had gotten Samsung memory.


Even with the Samsung cards, anything above about +600 and you start getting memory errors and losing performance. The Samsung memory just handles the errors more gracefully.

My Gaming X, from the very first batch of Micron cards, would stably clock to just over +600 from stock after the bug-fix BIOS, with either a Gaming Z BIOS or the Strix BIOS installed. I ended up settling on the Gaming Z BIOS because the performance was similar to the Asus Strix BIOS but it was actually written for the hardware it was running on. The 126% power slider placebo got me in as well.


----------



## icold

gtbtk said:


> Even with the Samsung cards, anything above about +600 and you start getting memory errors and losing performance. The Samsung memory just handles the errors more gracefully.
> 
> My Gaming X, from the very first batch of Micron cards, would stably clock to just over +600 from stock after the bug-fix BIOS, with either a Gaming Z BIOS or the Strix BIOS installed. I ended up settling on the Gaming Z BIOS because the performance was similar to the Asus Strix BIOS but it was actually written for the hardware it was running on. The 126% power slider placebo got me in as well.



I just wish I could remove the 55°C throttling and hold a fixed 2152MHz.


----------



## Tiihokatti

Installed Accelero Mono Plus on my ~3 years old Asus GTX 1070 Dual OC.
The stock cooler is simply horrendous: The fans are loud and inefficient and the heatsink is extremely bad at cooling the card with its meager 2 heatpipes.

I have had 5 different cooling setups on the card now:
1. Stock
2. Antec h20 620 AIO with zipties, also used a hackjob version with 2x 120mm radiators on the loop.
3. Kraken G12 (which wasn't readily compatible with the non-reference PCB of the card) + Corsair Hydro H110i GTX (280mm AIO)
4. Stock heatsink with 3x Arctic F8 Silent fans strapped on it with zipties.
5. Arctic Accelero Mono Plus (stock and also with the fan replaced with Arctic F12)

And out of all those options the Accelero + F12 has been by far the best solution when it comes to thermals and noise. Temps are at most @55C and fan @800rpm. And all this is inside a NZXT H500i case.
We are talking about a 35€ air-cooler when the Corsair AIO-setup cost me over 100€. I feel utterly stupid thinking about the fact that I spent that much money on the Corsair + Kraken G12 (which also had to be "Dremeled" for it to fit on the card) when a simple and cheap air-cooler worked as efficiently as the 280mm AIO. And the air-cooler is actually much more silent than the AIO as the pump was very noisy on the Corsair.
And also keep in mind that the air-cooler is pretty much unbreakable and maintenance-free, while the AIO is in need of refilling/cleaning within 3-5 years. I've had 3 different AIOs in the past 7-8 years and I have done teardown+assembly on every single one of those (+ I have a full 280mm custom-loop inside another ITX-setup so I know my Schiit). AIOs aren't simply built to last over 5 years without maintenance...


----------



## Nukemaster

Welcome to the 1070 + Mono Plus club


----------



## Madmaxneo

Tiihokatti said:


> Installed Accelero Mono Plus on my ~3 years old Asus GTX 1070 Dual OC.
> The stock cooler is simply horrendous: The fans are loud and inefficient and the heatsink is extremely bad at cooling the card with its meager 2 heatpipes.
> 
> I have had 5 different cooling setups on the card now:
> 1. Stock
> 2. Antec h20 620 AIO with zipties, also used a hackjob version with 2x 120mm radiators on the loop.
> 3. Kraken G12 (which wasn't readily compatible with the non-reference PCB of the card) + Corsair Hydro H110i GTX (280mm AIO)
> 4. Stock heatsink with 3x Arctic F8 Silent fans strapped on it with zipties.
> 5. Arctic Accelero Mono Plus (stock and also with the fan replaced with Arctic F12)
> 
> And out of all those options the Accelero + F12 has been by far the best solution when it comes to thermals and noise. Temps are at most @55C and fan @800rpm. And all this is inside a NZXT H500i case.
> We are talking about a 35€ air-cooler when the Corsair AIO-setup cost me over 100€. I feel utterly stupid thinking about the fact that I spent that much money on the Corsair + Kraken G12 (which also had to be "Dremeled" for it to fit on the card) when a simple and cheap air-cooler worked as efficiently as the 280mm AIO. And the air-cooler is actually much more silent than the AIO as the pump was very noisy on the Corsair.
> And also keep in mind that the air-cooler is pretty much unbreakable and maintenance-free, while the AIO is in need of refilling/cleaning within 3-5 years. I've had 3 different AIOs in the past 7-8 years and I have done teardown+assembly on every single one of those (+ I have a full 280mm custom-loop inside another ITX-setup so I know my Schiit). AIOs aren't simply built to last over 5 years without maintenance...


I have a Swiftech H140 AIO with a Heatkiller IV Pro block on my 1070 and it works great. My idle temps are around 30C and it barely goes over 42C when gaming at high resolutions.
I was always under the impression that you had to drain and refill the fluid in a watercooling setup at least once a year, even though I haven't touched that AIO in the 2+ years since installing it. The H240-X I have on my CPU, on the other hand, has needed maintenance every year or so. Right now is the longest I have run it without maintenance, almost a year, and I need to drain and refill; my temps have been slowly creeping up over the last few months. At idle with a 5% load (a dozen Chrome tabs open) it runs at about 38C, but when I push it, it doesn't go above 65C (it's also paired with a Heatkiller IV Pro). I have had my H240-X for almost 5 years now.

The Swiftech AIOs being serviceable makes them a much better investment than AIOs that aren't.


----------



## Squall Leonhart

I need a copy of nvflash for Windows, version 5.292; certain 1070s will not flash with anything newer (BIOS Cert 2.0 verification error).

Has anyone got it?
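For anyone who lands here from a search later: once you do have a suitable nvflash build, the usual backup-then-flash sequence looks roughly like the sketch below. It is not a definitive recipe; `gaming_z.rom` is a placeholder file name, and the flags are the commonly used ones from nvflash's own help output. The script only prints the commands until you flip `DRY_RUN` off on a machine that actually has nvflash and the card.

```shell
#!/bin/sh
# Sketch of a typical nvflash backup/flash sequence.
# DRY_RUN=1 only prints the commands; set it to 0 on a machine that
# actually has nvflash in PATH and the target card installed.
DRY_RUN=1
run() { if [ "$DRY_RUN" = "1" ]; then echo "$@"; else "$@"; fi; }

run nvflash --save backup.rom   # always dump the current vBIOS first
run nvflash --protectoff        # disable EEPROM write protection
run nvflash -6 gaming_z.rom     # -6 overrides the PCI subsystem ID mismatch check
```

Keep `backup.rom` somewhere safe; it is the only way back if the new image misbehaves.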


----------



## snorlaxgangs

Hi guys, I'm new to the forum. I searched this forum a lot for a voltage-unlock BIOS mod but couldn't find one for the 1070. Anyway, I bought the Gigabyte 1070 G1 Gaming in 2016 IIRC; it's stable at +130/+600 under FurMark and OCCT max load and has been a great graphics card. A couple of days ago I bought a Bykski block, the N-GV1080G1-X, since there was no other choice. Temperatures were a nice, stable ~35°C; not sure if they really use copper, but the whole thing is heavy. Yesterday I made a stupid mistake: I decided to use the RGB LED strip that came with the block and plugged it into one of the 4-pin headers on the GPU board. The moment I turned the PSU on, there was smoke. Since then I get an abnormal white light flashing near the power connector. Some local shops said one of the chips is blown and they can't get those anymore. Does anyone know where I can get the chip in the attached photo?


----------



## TwilightRavens

snorlaxgangs said:


> Hi guys, I'm new to the forum. I searched this forum a lot for a voltage-unlock BIOS mod but couldn't find one for the 1070. Anyway, I bought the Gigabyte 1070 G1 Gaming in 2016 IIRC; it's stable at +130/+600 under FurMark and OCCT max load and has been a great graphics card. A couple of days ago I bought a Bykski block, the N-GV1080G1-X, since there was no other choice. Temperatures were a nice, stable ~35°C; not sure if they really use copper, but the whole thing is heavy. Yesterday I made a stupid mistake: I decided to use the RGB LED strip that came with the block and plugged it into one of the 4-pin headers on the GPU board. The moment I turned the PSU on, there was smoke. Since then I get an abnormal white light flashing near the power connector. Some local shops said one of the chips is blown and they can't get those anymore. Does anyone know where I can get the chip in the attached photo?


Don’t know if the site is legit or not but: http://www.holtekusa.com/32bitmcuone.html


----------



## snorlaxgangs

TwilightRavens said:


> Don’t know if the site is legit or not but: http://www.holtekusa.com/32bitmcuone.html


Nice, my chip is the HT32F52241, a 32-bit ARM Cortex-M0+ MCU according to Tom's Hardware. I'm going to order right now. Thank you.


----------



## dVeLoPe

I have an "OEM HP-branded model 1070" according to GPU-Z, and the fan has randomly stopped spinning.

I don't want to spend $50 on an Accelero or a Kraken + AIO. Can anyone help me source a replacement blower fan for my card?


----------



## TwilightRavens

dVeLoPe said:


> I have an "OEM HP-branded model 1070" according to GPU-Z, and the fan has randomly stopped spinning.
> 
> I don't want to spend $50 on an Accelero or a Kraken + AIO. Can anyone help me source a replacement blower fan for my card?


Check on ebay for 1070’s listed as “for parts or not working” they can usually be had pretty cheap and the fans almost always still work.

Edit: Actually really any blower model nvidia card that’s listed as not working, (i.e 980, 770, 680, 970, 1060 etc) as long as they are blower model cards you can usually just rip the blower fan out and replace yours with it.


----------



## JustEngry

Hello,
I own a GTX 1070 MSI Gaming X with the BIOS flashed to Gaming Z (86.04.1E.00.6C). After overclocking I got this result: https://www.3dmark.com/spy/9359082 . I can't get 2100MHz on the core clock. GPU-Z shows that I have Samsung memory. So far my maximum is 2076MHz; anything higher causes freezes (not artifacts). Is there any other BIOS that could help me break the 2100 wall? My MSI Afterburner settings:
Core Voltage: 100
Power Limit: 126
Core Clock: 70
Memory Clock: 775.


----------



## Tiihokatti

JustEngry said:


> Hello,
> I own gtx 1070 MSI Gaming X with flashed bios to Gaming Z (86.04.1E.00.6C). After overclocking my card I got results of: https://www.3dmark.com/spy/9359082 I can't get 2100Mhz on Core Clock. Gpu-Z shows that I have Samsung Memory. So far my maximum was 2 076 MHz. Everything higher causes freezes (not artifacts). Is there any other BIOS that could help me to break the 2100 wall? My Msi afterburner settings:
> Core Voltage: 100
> Power Limit: 126
> Core Clock: 70
> Memory Clock: 775.


Are you doing curve-OC or the traditional slider-OC?
By fiddling around with the curve you can force a tiny bit higher voltage than normal (1.093v): https://forums.evga.com/m/tm.aspx?m=2820280&p=1


Honestly, slider-OC doesn't really work on Pascal as undervolting can get you higher clocks as the temperatures drop.


----------



## JustEngry

I'm using the traditional sliders. My current GPU voltage is a constant 1.093V. Max temperature in 3DMark and the Heaven benchmark was 70 degrees. That's what the curve looks like.


----------



## TwilightRavens

JustEngry said:


> I'm using traditional sliders. My current Gpu voltage is const. 1.093. Max temperature in 3DMark and Heaven Benchmark was 70 degrees. That's how the curve looks like


Personally I'd use the curve. When I used the sliders on my 1080, the highest OC I could get was 2062MHz, but if you max the fan out and adjust the 1.093V point first, you might be surprised by what you can get. I was able to get 2126MHz at 1.093V, and the step down from there, 1.081V, drops to 2113MHz. Try something like that, or honestly start with 2100MHz at 1.093V and 2088MHz at 1.081V, then adjust the other points in 13MHz increments. Don't forget to hit Apply in Afterburner before closing the graph or starting the stress test. Keep GPU-Z monitoring open in the background while stressing so you can see which voltage step and clock the stress fails at (Fire Strike is really good at finding Pascal instability in the 1st and 2nd graphics tests); if you note that down, you know exactly which voltage state is unstable. It looks extremely daunting at first, but once you get into the rhythm it's not too bad.

Oh, and honestly it's probably not 2100MHz itself causing the instability. I had a lot of trouble with the 1.043V and 1.050V states and the one right above them; 1.081V can almost always run the same clock as 1.093V, and if not it's within 13-26MHz of it. You can fine-tune each point on the curve in +/-1MHz steps with the up and down arrow keys, but I'd stick to 13MHz intervals because Afterburner just rounds to the nearest step anyway: instead of 2100, 2095 it will give you 2100, 2100 or 2100, 2088.


----------



## Kevinf63

Bit of a cross-post, I apologize! I wrote my thread before noticing this club thread!

My Question:
https://www.overclock.net/forum/69-...x-1070-sea-hawk-x-vbios-5-power-limit-oc.html


----------



## icold

JustEngry said:


> Hello,
> I own gtx 1070 MSI Gaming X with flashed bios to Gaming Z (86.04.1E.00.6C). After overclocking my card I got results of: https://www.3dmark.com/spy/9359082 I can't get 2100Mhz on Core Clock. Gpu-Z shows that I have Samsung Memory. So far my maximum was 2 076 MHz. Everything higher causes freezes (not artifacts). Is there any other BIOS that could help me to break the 2100 wall? My Msi afterburner settings:
> Core Voltage: 100
> Power Limit: 126
> Core Clock: 70
> Memory Clock: 775.




My GPU can push 2152MHz (below 55°C). The chip is the limit of the overclock.


----------



## gtbtk

icold said:


> I just wish I could remove the 55°C throttling and hold a fixed 2152MHz.


Late reply. Sorry.

The card is not throttling; it is performing as per the design of GPU Boost.

To keep the card at a stable frequency, you need to set the maximum frequency you want to reach at least one step below the maximum available voltage on the voltage/frequency curve; two steps is even better.

GPU Boost works by starting at the highest frequency point on the curve. As temps rise, at each 5-degree temperature step the card will first attempt to step up the voltage to maintain frequency, until it hits the hard voltage limit of the card, and then it steps down the frequency.

If you have the curve set to the max frequency at max voltage, there is no voltage headroom available, so the only direction the card has to go is to drop the frequency.


----------



## gtbtk

JustEngry said:


> Hello,
> I own gtx 1070 MSI Gaming X with flashed bios to Gaming Z (86.04.1E.00.6C). After overclocking my card I got results of: https://www.3dmark.com/spy/9359082 I can't get 2100Mhz on Core Clock. Gpu-Z shows that I have Samsung Memory. So far my maximum was 2 076 MHz. Everything higher causes freezes (not artifacts). Is there any other BIOS that could help me to break the 2100 wall? My Msi afterburner settings:
> Core Voltage: 100
> Power Limit: 126
> Core Clock: 70
> Memory Clock: 775.


Your card has a very old vBIOS installed. Check GPU-Z and see what type of memory the card has. If you have Samsung memory, you are fine; if you are running Micron memory, you need a BIOS update, and you can get the update utility for the Gaming Z from the MSI website.

To get 2100 on the card, you need to use the voltage/frequency curve and not the slider (assuming the silicon is not a complete dog).

Max the voltage, temp, and power limit sliders in Afterburner.

Open the curve editor with Ctrl+F.

Reset the curve to stock and only boost the voltage points from 1.043V up: set 1.043V to, say, 2050, 1.050V to 2080, and 1.063V to 2100, leave the rest of the points after 1.063V flat, and try it out.

If it crashes, reduce the levels of the voltage points you set and try the next-highest voltage point on the curve at 2100. Rinse and repeat until you find something stable.


----------



## neoflix

Better late than never: just got my GTX 1070, an MSI Gaming X.
My old card was an R9 280, so it was a really good upgrade for me.

Timespy result, memory is +600 and core boost around 2024
https://i.gyazo.com/48e679ada1dc0b6d812360ed66c09bde.jpg


----------



## gtbtk

neoflix said:


> Better late than never, just got my gtx1070, msi gaming x
> My old card was r9 280 so it was really good upgrade for me
> 
> Timespy result, memory is +600 and core boost around 2024
> https://i.gyazo.com/48e679ada1dc0b6d812360ed66c09bde.jpg


My card really struggled with memory above about +400 in Time Spy.

Play with the curve rather than the slider to overclock the core. You don't need to overclock the frequency points at the lower voltage settings to get the performance.

https://www.3dmark.com/spy/651325

https://www.3dmark.com/fs/11532231


----------



## TwilightRavens

You know I have never actually checked what the limit on my 1070 is, I know my 1080 and 1080 ti’s limit but i’ve never tried my 1070 except for a quick 2050 core and +400 memory. But I also don’t know how i’d go about doing that on Ubuntu.
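On Ubuntu there's no Afterburner, but you can get slider-style offsets through Coolbits and `nvidia-settings` (X11 sessions only). A sketch, with illustrative offsets rather than known-stable values; the attribute names are the ones the proprietary driver exposed in the GTX 10-series era, and `[3]` is the highest performance level on most Pascal cards. The script just echoes the commands until `DRY_RUN` is turned off:

```shell
#!/bin/sh
# Sketch: basic Pascal overclocking on Ubuntu/X11 via Coolbits + nvidia-settings.
# DRY_RUN=1 only prints the commands; set to 0 on the actual machine.
DRY_RUN=1
run() { if [ "$DRY_RUN" = "1" ]; then echo "$@"; else "$@"; fi; }

# One-time: enable clock/fan control bits in xorg.conf, then restart X.
run sudo nvidia-xconfig --cool-bits=28

# Prefer maximum performance, then apply core and memory offsets.
run nvidia-settings -a "[gpu:0]/GPUPowerMizerMode=1"
run nvidia-settings -a "[gpu:0]/GPUGraphicsClockOffset[3]=50"
run nvidia-settings -a "[gpu:0]/GPUMemoryTransferRateOffset[3]=400"

# Watch clocks and temperature while you stress (Unigine runs fine on Linux).
run nvidia-smi --query-gpu=clocks.gr,clocks.mem,temperature.gpu --format=csv -l 1
```

Note that the memory offset here is in transfer rate (MT/s), so it reads roughly double the number Afterburner would show for the same bump.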


----------



## b0uncyfr0

TwilightRavens said:


> You know I have never actually checked what the limit on my 1070 is, I know my 1080 and 1080 ti’s limit but i’ve never tried my 1070 except for a quick 2050 core and +400 memory. But I also don’t know how i’d go about doing that on Ubuntu.


Definitely push it; keeping it cool will certainly help even more.

On another note, I realized (being the doofus I am) that I had never played with my memory overclock, so I tried it today. I managed to get +750 working in a number of games and it seems stable, no artifacting. But it comes at a cost: the core is crippled to around 2025-2035, and that's with no extra juice to the core voltage either.

That's much lower than the 2114 I can get at 1.081V, so I'm torn. What's the method to OC and get a decent balance here? I'd gladly settle for anything above 2050 on the core.

With the fat core OC at 2114 I can only get +325 from the memory, but even testing with no core OC and just the memory OC, games appear less jittery, like the minimum framerate has jumped up a bit.

How should I proceed?

Also, I discovered something very peculiar about my Gaming X (flashed to Z) that I never asked about:
the max vcore is 1.093V (if I remember correctly), and yet with only +50 on the core I'm able to hit 1.081V. How is that happening? Shouldn't it be somewhere around .60?
My core OC of 2114 is extremely stable at +50 but not at +100; that's weird, right?


----------



## Destrto

Hey guys, I thought I would ask in here because you guys are all 1070 owners. Hoping this includes 1070Ti as well. 

I just recently purchased the MSI GTX 1070 Ti Aero 8G. And I have been hunting for a waterblock for it. But I seem to be having a bit of trouble finding one that says it is compatible. 

Do any of you happen to know of a waterblock that is compatible? 

I see that EK has a block that says it fits the 1070 non Ti version of this card. Does anyone know if the Ti and non-Ti version use the same PCB?


----------



## TwilightRavens

Destrto said:


> Hey guys, I thought I would ask in here because you guys are all 1070 owners. Hoping this includes 1070Ti as well.
> 
> I just recently purchased the MSI GTX 1070 Ti Aero 8G. And I have been hunting for a waterblock for it. But I seem to be having a bit of trouble finding one that says it is compatible.
> 
> Do any of you happen to know of a waterblock that is compatible?
> 
> I see that EK has a block that says it fits the 1070 non Ti version of this card. Does anyone know if the Ti and non-Ti version use the same PCB?



From what I've read, the 1070 Ti Aero uses the same PCB as the 1080 Aero, not the 1070 Aero, which makes sense because it's essentially a cut-down 1080. And as far as I can tell, Bykski and Barrow are the only ones that make a block for it.


----------



## Destrto

TwilightRavens said:


> From what i’ve read the 1070 ti Aero uses the same pcb as the 1080 aero and not the 1070 aero, which makes sense because its essentially a cut down 1080. And as far as I can tell Bykski and Barrow are the only ones that have a block.


Ah, great. thank you for the info.


----------



## garyd9

(deleted - I posted in wrong thread)


----------



## taowulf

My 1070 has been flickering a lot in games lately; it looks like the years of constant folding might be catching up with it. I ordered a 2070 the other night, along with a new Bykski water block. I figured I'd get ahead of it rather than waiting for it to fail.


----------



## Madmaxneo

Where can I find out what thickness thermal pads I need for my EVGA GeForce GTX 1070 SC GAMING?

I have a waterblock on it and I am about to change up my entire loop and I need to open up the waterblock to clean it out. I do not remember what thickness thermal pads I got last time, which was about 2 or 3 years ago.

If you have links to good deals on these thermal pads that would be greatly appreciated!


----------



## taowulf

Madmaxneo said:


> Where can I find out what thickness thermal pads I need for my EVGA GeForce GTX 1070 SC GAMING?
> 
> I have a waterblock on it and I am about to change up my entire loop and I need to open up the waterblock to clean it out. I do not remember what thickness thermal pads I got last time, which was about 2 or 3 years ago.
> 
> If you have links to good deals on these thermal pads that would be greatly appreciated!


Mostly 1.5mm, looks like some spots take 1 or 2mm as well. 

https://forums.evga.com/EVGA-Thermal-Mod-Pad-sizes-and-locations-for-DIYers-m2586299.aspx

I just hopped on amazon and ordered some, I wasn't really bargain shopping though.


----------



## TwilightRavens

Madmaxneo said:


> Where can I find out what thickness thermal pads I need for my EVGA GeForce GTX 1070 SC GAMING?
> 
> I have a waterblock on it and I am about to change up my entire loop and I need to open up the waterblock to clean it out. I do not remember what thickness thermal pads I got last time, which was about 2 or 3 years ago.
> 
> If you have links to good deals on these thermal pads that would be greatly appreciated!


Personally I always use Fujipoly 17 W/mK pads on everything now. They're probably the best out there, but the price reflects that, at ~$30 for a 50mm x 60mm sheet.


----------



## JackCY

TwilightRavens said:


> Personally I always use Fujipoly 17w/mk pads on anything now, probably the best out there but the price also reflects that being ~$30 for a 50mmx60mm sheet.


That's crazy considering a decent GPU air cooler costs around $60. These branded thermal pads are nuts price-wise; I bet the manufacturers pay at least an order of magnitude less. It really makes no sense for thermal pads to cost a fortune, other than low-volume sales to DIYers and milking it.


----------



## Madmaxneo

JackCY said:


> That's crazy considering a decent GPU air cooler costs around $60. These branded thermal pads are nuts price-wise; I bet the manufacturers pay at least an order of magnitude less. It really makes no sense for thermal pads to cost a fortune, other than low-volume sales to DIYers and milking it.


Not even sure why anyone would want higher-priced ones like that. The bargain ones I have been using for about 2 years have worked perfectly fine; my GPU does not go above 45°C in any benchmark. Maybe with the new rad and a stronger pump it will be even lower!


----------



## TwilightRavens

JackCY said:


> That's crazy considering a decent GPU air cooler costs around $60. These branded thermal pads are nuts price-wise; I bet the manufacturers pay at least an order of magnitude less. It really makes no sense for thermal pads to cost a fortune, other than low-volume sales to DIYers and milking it.


They are priced like that because they aren't like most thermal pads out there, which are like sticking styrofoam on MOSFETs and VRAM; these actually do help thermals, but really only if your cooler is up to it. That's not to say there aren't other options, but if you want to shave off every last degree, these are your only option performance-wise, and they're well worth the price because they'll last 10+ years without drying up and turning into raisins like most thermal pads do.


----------



## Blatsz32

Hello all, quick question: I've been running two MSI GTX 1070s for a while now, and I was wondering if upgrading to a 2070 would be a noticeable upgrade. I have a 7700K CPU and 32GB of RAM, and at the moment I am not struggling to play any games at 2K. With the new 3000 series coming out, some of the 2070s seem to be coming down in price. Should I get an RTX 2070, or wait a bit and see how the 3000 series performs?


----------



## yanks8981

Blatsz32 said:


> Hello all, quick question: I've been running two MSI GTX 1070s for a while now, and I was wondering if upgrading to a 2070 would be a noticeable upgrade. I have a 7700K CPU and 32GB of RAM, and at the moment I am not struggling to play any games at 2K. With the new 3000 series coming out, some of the 2070s seem to be coming down in price. Should I get an RTX 2070, or wait a bit and see how the 3000 series performs?


I'd personally keep the two 1070s and wait to see what else is released. I bet the 2070 isn't going to do as well in most things as the two 1070s, with the exception of less heat, noise, and power consumption. The only way I'd sell the 1070s now is to get top dollar and buy the new cards when they are released, if you can live without GPUs for that long.


----------



## Bee Dee 3 Dee

Blatsz32 said:


> Hello all, quick question: I've been running two MSI GTX 1070s for a while now, and I was wondering if upgrading to a 2070 would be a noticeable upgrade. I have a 7700K CPU and 32GB of RAM, and at the moment I am not struggling to play any games at 2K. With the new 3000 series coming out, some of the 2070s seem to be coming down in price. Should I get an RTX 2070, or wait a bit and see how the 3000 series performs?


Are you using Nvidia Profile Inspector to enable an SLI profile for the games you play that don't natively support SLI?


----------



## Blatsz32

Bee Dee 3 Dee said:


> Are you using Nvidia Profile Inspector to enable an SLI profile for the games you play that don't natively support SLI?


Profile Inspector? I've never used it before. I mostly play Escape from Tarkov, and the usage is on both GPUs.


----------



## Bee Dee 3 Dee

Blatsz32 said:


> Profile inspector? I've never used it before. I mostly play Escape from Tarkov, the usage is on both gpus.


----------



## HowYesNo

I have a Gainward GTX 1070 with the dual-fan cooler.
Just a quick question:
the VRMs on the card have a thermal pad and make contact with the cooler, but those larger components next to the VRMs are not connected to the cooler; there is a cutout in the design so they don't touch it.
So the things in the red rectangle are what I am wondering about.
Should I put some thermal pad on those?
Thanks


----------



## Nukemaster

Those are inductors.

If Gainward did not add cooling for them, chances are they would not need it. They get enough cooling from the fan airflow alone.

My card does not have a heatsink or anything on them either.

If you are going custom, you can do whatever you want with them. Overkill is the norm for overclocking.


----------



## HowYesNo

Well, I did contact Gainward about pad thickness, and they sent the image below.
There seems to be a thermal pad for those inductors on the diagram, just not on the cooler itself.


----------



## HowYesNo

Just an update for those still reading:
I received the EKWB 2mm-thick pads today, and they are way too thick.
With them installed I can see a gap between the cooler and the PCB; the points where the screws go aren't even close to one another.
I did not try to assemble it, as I felt it would bend the PCB.
A shame, since these Gainward ones look crappy.


----------



## Nukemaster

Well that settles it. Very good to see them give information like this.

I have seen some Gainward units with another cooler on a similar board without the inductors being cooled.


----------



## rares495

Is there no 1060 club? I swear I looked everywhere but cannot find it. 

I bought a Strix 1060 and want to see if my card is any good. What do you guys think?


----------



## TwilightRavens

yanks8981 said:


> I'd personally keep the two 1070s and wait to see what else is released. I bet the 2070 isn't going to do as well in most things as the two 1070s, with the exception of less heat, noise, and power consumption. The only way I'd sell the 1070s now is to get top dollar and buy the new cards when they are released, if you can live without GPUs for that long.



^ This, an Nvidia rep said Turing is gonna age like Kepler so you probably wouldn’t get much out of one if you upgraded. Pascal seems to be aging like Maxwell so that’s not a bad thing and means you can get away with it for a while yet.


----------



## Madmaxneo

I am looking to OC my 1070 again with my new system.

From what I remember there were certain step sizes to follow when OCing the memory (and possibly the core clock); what were those steps again?

Is there any better method for OCing these cards?

For reference I have an EVGA GeForce GTX 1070 SC GAMING that is watercooled.


----------



## mllrkllr88

Hey guys, new to this club! I just got a 1070 Strix to mess around with, and it's doing well on water: 2214/2311 in Time Spy. https://hwbot.org/submission/451817..._geforce_gtx_1070_7974_marks?recalculate=true

I have a question for you guys:

1. Is there a performance bios for Strix with Micron?
2. Can the timings be altered easily for better performance?


----------



## HowYesNo

gtbtk said:


> Quote: Originally Posted by *Blackfirehawk*
> 
> After i Replaced Thermal Pads and Thermal Compount on the GPU ... 1 Hour Stresstest on my Gainwand GTX 1070 with a Gainwand phoenix GLH Bios
> 
> 
> 
> 
> 
> i need 1.043 mv for a stable GPU clock of 2012 :/.. didn´t win in Silicon lottery or the Cooler isn´t that great on the Standart gainwand GTX1070 (non phoenix but with phoenix GLH Bios)
> 
> but i think its a decent Memory Clock + 1500mhz on micron Memory
> it runs at 9500mhz
> 
> i testet it up to 9580.. @ 9600mhz heaven starts crashing some times
> 
> 1.043 is undervolting the card. Without increasing the voltage slider, the card will go up to 1.063v.
> 
> If you really want to undervolt, see how fast you an get the card using the .950v point. I can run my card at 2012Mhx at .950v and set the 1.050 to go all the way to 2088. It will settle at 2076. Those settings gets me over 21000 on firestrike graphics score
> 
> 
> 
> 9500mhz is a great memory overclock. I start getting memory errors on my card above about 1300Mhz over reference


What thermal pads did you use? What thickness?


----------



## haffulawl

Hi all! As I'm pretty new to this, I was hoping to get some advice here.
I've started practicing my soldering skills, and I'm starting to feel confident that I can pull off removing the NVIDIA power limit on my GTX 1070 Strix.
This is my starter project; I'm doing it in order to progress to more complex hardmods. I have the necessary resistors and soldering gear.
From what I've gathered, removing the power limit will allow the card to draw more power (somewhat obvious, but...), yet I haven't been able to find a conclusive answer as to
whether the core voltage will also be increased.

My card hits the volt/power limit all the time in games and stays below 52C the whole time. The sliders are maxed in AB, and I can see in the GPU-Z logs that I max out at 240W total.

I'm looking for people who actually have insight into this, not people who have only watched YouTube clips of putting liquid metal on their shunts.
I have read the guide on xdevs.com to the best of my abilities.

Looking forward to your replies.


----------

