# [NV] Geforce Driver for 980 Ti 353.06 WHQL - Kepler GPUs performance FIX driver



## kx11

Quote:


> R353.06
> Supports G-SYNC technology in windowed mode. (Supported for 2-way SLI, but not 3-way or 4-way.)
> Offers both G-SYNC (V-SYNC on) and G-SYNC (V-SYNC off) options.
> Adds support for Ultra-Low Motion Blur (ULMB) in the NVIDIA Control Panel.
> Provides Dynamic Super Resolution (DSR) for notebooks with discrete GeForce GPUs, Fermi-based and above.
> *Implements optimizations and bug fixes that provide increased performance for Kepler-based GPUs.*
> 
> Profile Changes:
> F1 2015 (added 3D Profile "rated as not recommended")
> Heroes of the Storm (updated SLI Profile "set to SingleGPU Mode")
> Killing Floor 2 (updated 3D Profile, Compatibility-Mode "rated as not recommended")
> Monster Hunter Online (added SLI Profile)
> Project CARS (updated 3D Profile,"rated as not recommended")
> Space Engineers (added SLI Profile)
> The Witcher 3 (updated 3D Profile, Compatibility-Mode "rated as good")
> World Of Warships (updated 3D Profile "rated as not recommended")
> 
> Bugfixes:
> [GeForce Experience, SLI]: Following an in-game resolution change, Shadowplay icon is missing after Alt+Tab. [200108633]
> [SLI]: Display goes to half tile when 3840 × 2160 resolution is applied. [200095220]
> [OpenGL, Windows 8.1 -x86/x64]: GLSL shader compile error. [1647324]
> [N16x, Windows 8 -x64]: Allow overclocking settings to be read from the video BIOS as long as it has overclocking enabled. [1579756]
> 
> *Additional Details:*
> Installs PhysX System Software 9.15.0428
> Installs HD Audio 1.3.34.3
> Installs GeForce Experience 2.4.5.28


get it while it's hot
http://www.nvidia.com/download/driverResults.aspx/85823/en-us


----------



## Lansow

Quote:


> Originally Posted by *kx11*
> 
> get it while it's hot
> http://www.nvidia.com/download/driverResults.aspx/85823/en-us


Fantastic! Maybe now the FUD surrounding NV "abandoning" Kepler can stop.

The best news in this, at least for me, is:

"Supports G-SYNC technology in windowed mode, offers both G-SYNC (V-SYNC on) and G-SYNC (V-SYNC off) options, support for Ultra-Low Motion Blur (ULMB) in the NVIDIA Control Panel"

YAY! Now G-SYNC monitors won't be stuck in G-SYNC mode all the time... and G-SYNC in windowed applications? That... is interesting.


----------



## Apolladan

if Nvidia were "evil" they'd separate the driver fix from the launch of their newest flagship by a month for higher sales

at this rate, they're malevolent at best


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Lansow*
> 
> The best news in this, at least for me, is:
> 
> "Supports G-SYNC technology in windowed mode, offers both G-SYNC (V-SYNC on) and G-SYNC (V-SYNC off) options, support for Ultra-Low Motion Blur (ULMB) in the NVIDIA Control Panel"
> 
> YAY! Now G-SYNC monitors won't be stuck in G-SYNC mode all the time... and G-SYNC in windowed applications? That... is interesting.


This. While there was no downside for how I play, it was odd to always have G-Sync enabled.


----------



## Shogon

Quote:


> Supports G-SYNC technology in windowed mode.


Wow. I'm looking forward to testing this out in the future. One of the issues with G-SYNC was exactly that, and now they've fixed it. Sadly not for 3-way or 4-way SLI users, but good news for single-GPU and SLI users.


----------



## iamhollywood5

Nvidia fixing Kepler optimization and setting the 980 Ti price at $650 actually makes me think twice about upgrading my 780 Ti to a 980 Ti.

Good guy Nvidia, for once.


----------



## 47 Knucklehead

Awesome!



Trying out the new driver now.









This will be awesome, since I like to play my MMOs in windowed mode: I can game on one G-Sync monitor, keep my second G-Sync monitor for web browsing and other things, and not have to Alt-Tab all the time to shift focus.


----------



## gooface

*Found the Notebook Link:*
http://us.download.nvidia.com/Windows/353.06/353.06-notebook-win8-win7-64bit-international-whql.exe

Quote:


> Implements optimizations and bug fixes that provide increased performance for Kepler-based GPUs.


Quite the timing on that....


----------



## PiERiT

No Win10 driver?


----------



## Lansow

There's some weirdness in this one. G-SYNC is staying enabled on the Windows desktop unless you change the global setting to ULMB under Mange 3D Settings. Kinda wonky IMO. The default on the desktop should go back to ULMB (if you configured it this way), with G-SYNC in fullscreen mode only. The current implementation requires me to set ULMB globally and enable G-SYNC individually for each title... not an optimal situation.

Oh well, it's better than it was! Guess I'll be filling out another Driver Feedback post.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Lansow*
> 
> There's some weirdness in this one. G-SYNC is staying enabled on the windows desktop unless you change the global setting to ULMB under Mange 3D Settings


The previous version did the same thing.


----------



## Lansow

Quote:


> Originally Posted by *47 Knucklehead*
> 
> The previous version did the same thing.


Gotcha... I didn't notice that in 352. I know I could disable G-SYNC globally to re-enable ULMB, but I didn't see the option in Manage (mange... lol) 3D Settings. Thanks for the heads up.


----------



## xSociety

Still waiting for MFAA support for SLI.


----------



## 47 Knucklehead

THIS IS AWESOME!
Quote:


> *Two new options found their way into the control panel as well: G-Sync will now let you set V-Sync on or off above the maximum refresh rate of the panel (hurrah for peer pressure!) and you can now enable ULMB directly.* This change to the V-Sync capability is only available at the high side of the monitor's refresh rate and will let a user disable V-Sync (and thus introduce horizontal tearing) in order to gain the biggest advantage possible with the lowest latency the system can muster. *This basically matches what AMD has done with FreeSync though NVIDIA's G-Sync still has a superior implementation of low frame rate technology as we demonstrated here.*


----------



## Noufel

Anyone able to confirm the Kepler improvement?


----------



## Silent Scone

No, but I can confirm it breaks Firefox under Win 8.1. It doesn't display correctly and it just hung my machine lol.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Silent Scone*
> 
> No but I can confirm it breaks Firefox under Win 8.1. Doesn't display correctly and just hung my machine lol.


I'm running Windows 8.1 and Firefox 38.0.1 and I'm not having any issues at all.


----------



## Silent Scone

Doesn't display correctly at all for me, no overclock on the cards. Tabs won't display unless you hover over them, and the page is distorted. Fine in IE/Opera.


----------



## jcde7ago

Anyone know if this fixes the TDR crashing with Chrome? I had to disable hardware acceleration in Chrome to stop the crashing on 352.86.


----------



## kx11

3 damn "driver stopped responding" messages in a row

holy crap


----------



## 47 Knucklehead

Quote:


> Originally Posted by *kx11*
> 
> 3 damn "driver stopped responding" messages in a row
> 
> holy crap


Doing what?


----------



## jmcosta

Quote:


> Originally Posted by *Silent Scone*
> 
> Doesn't display correctly at all for me. no overclock on the cards. Tabs won't display unless you hover over them and the page is distorted. Fine in IE / Opera.


I think it's a bug in Firefox/Waterfox, because it happens sometimes on both of my machines (AMD and Nvidia).


----------



## kx11

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Doing what?


you know the usual


----------



## looniam

Quote:


> Originally Posted by *Noufel*
> 
> any one to confirm kepler improuvement ?


I just got done D/L'ing, but:
http://forums.guru3d.com/showthread.php?t=399569 (#12)


----------



## tpi2007

Quote:


> Originally Posted by *Lansow*
> 
> Fantastic! Maybe now the FUD surrounding NV "abandoning" Kepler can stop.


Let's see reviews comparing this driver to the previous one, and then, if it makes a difference (the post above mine is a good indication that it does), yes, it can stop, but not because of what you say. It's because there was a problem, the community spoke up, and Nvidia had to acknowledge the problem and fix it. And apparently it was fixable after all. No FUD; on the contrary. The hardware capabilities of Kepler weren't maxed out compared to Maxwell, as some have argued while trying to defend Nvidia.

Also, you have no idea if Nvidia would have fixed it, or if they would have been this fast if it wasn't for the community complaining. The mere fact that they acknowledged the problem and issued a fix proves that there was in fact something wrong with the drivers for some time now and the community did the right thing. Nvidia obviously also did the right thing in order to keep customer loyalty.

And that's that, don't try to spin this the other way around.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *tpi2007*
> 
> Also, you have no idea if Nvidia would have fixed it, or if they would have been this fast if it wasn't for the community complaining. The mere fact that they acknowledged the problem and issued a fix proves that there was in fact something wrong with the drivers for some time now and the community did the right thing. Nvidia obviously also did the right thing in order to keep customer loyalty.


Sort of like AMD and the frame-pacing fix for Crossfire on DirectX 11 and 10 (but not 9). AMD denied it was a problem for months until PCPer pulled out hardware and actually proved it was happening, and the community complained, loudly; then AMD fixed it for two of the three commonly used DirectX versions.


----------



## jcde7ago

Quote:


> Originally Posted by *kx11*
> 
> you know the usual


Welp, that answers my previous question...


----------



## Exilon

I got a ~20% bump in Witcher 3 on ultra no HW on my GTX 780. I've only compared at my latest save though.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Exilon*
> 
> I got a ~20% bump in Witcher 3 on ultra no HW on my GTX 780. I've only compared at my latest save though.


I got a 15% increase on my favorite MMO ... Rift.


----------



## OPsyduck

People on Maxwell, have you guys noticed any increase?


----------



## 47 Knucklehead

Quote:


> Originally Posted by *OPsyduck*
> 
> People on Maxwell, have you guys noticed any increase?


Sorry, I am on Maxwell ... GTX 980 with a 1500MHz overclock.


----------



## OPsyduck

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Sorry, I am on Maxwell ... GTX 980 with a 1500MHz overclock.


15% is pretty good. I'm gonna test later on Dragon Age Inquisition.


----------



## 47 Knucklehead

Off to test some others ... ARMA3 and Mechwarrior Online.


----------



## nyxagamemnon

Tested Witcher 3 with the previous drivers vs. these new ones on a single 780 Ti and gained 8-10 FPS lol.


----------



## Clocknut

sounds like the abandoning Kepler story is too lol

now all we need is to rebench the 760, 770, 780, and 780 Ti against the 960, 970, and 980 to compare.


----------



## Lansow

Quote:


> Originally Posted by *tpi2007*
> 
> Let's see reviews of this driver compared to the previous one and then if it makes a difference (the post above mine is a good indication that it does), yes, it can stop, but not because of what you say. It's because there was a problem, the community spoke up and Nvidia had to acknowledge the problem and fix it. And apparently it was fixable after all. No FUD, on the contrary. The hardware capabilities of Kepler weren't maxed out compared to Maxwell as some have argued, trying to defend Nvidia.
> 
> Also, you have no idea if Nvidia would have fixed it, or if they would have been this fast if it wasn't for the community complaining. The mere fact that they acknowledged the problem and issued a fix proves that there was in fact something wrong with the drivers for some time now and the community did the right thing. Nvidia obviously also did the right thing in order to keep customer loyalty.
> 
> And that's that, don't try to spin this the other way around.


You are entitled to your opinion, but "abandon" has a very specific meaning that you've missed. Only one of us is trying to "spin". I am basing my statement on facts (they fixed the performance), you are basing yours on conjecture.

Let's stick to facts.


----------



## Lansow

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Sort of like AMD and the Frame Pacing fix on Crossfire for DirectX 11 and 10 (but not 9). AMD denied it was a problem for months until PCPer pulled out hardware and actually proved it was happening and the community complained, loudly, then AMD fixed it for 2 of the 3 commonly used DirectX versions.


The difference is that I never saw Nvidia deny a problem. If anyone has links to them doing so I'm open to it.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Lansow*
> 
> The difference is that I never saw Nvidia deny a problem. If anyone has links to them doing so I'm open to it.


They have. The most famous one was the most recent one ... the 970 memory issue. While technically they are right (it does have 4GB of memory and it is all usable), when you go above 3.5GB, it kills performance. Sure, if you aren't using a 970 for 1600p or 4K, odds are you'll never hit the issue, but a lot of people felt burned by it, especially the people who thought they were getting 95% of a 980 for $200 less. Granted, that is arguably a "deny," but even I have to admit, it is one hell of an after-the-fact spin.


----------



## dboythagr8

I am now getting upper-90% GPU usage on my Titan X SLI setup while using G-SYNC. There was definitely something wrong in the Witcher 3 Game Ready set when using G-SYNC and SLI; I was getting better card usage with G-SYNC turned off.

There is also an option under Vertical Sync (NVCP) that now reads "On (Smooth)". The description says "Select Smooth to smooth the frame-time patterning for SLI". That could be interesting.


----------



## Alvarado

Well, I played around and couldn't tell a difference. Guessing less than a 5 FPS increase on my 770.


----------



## Assirra

Wow, those increases are quite big.
Now I gotta download these.


----------



## DIYDeath

Using a Titan Black with GW enabled, I'm getting a 5-10 FPS increase in TW3.


----------



## bvsbutthd101

Same with Witcher 3 for me. Just did a quick test; getting a 7-8 FPS increase in the area I tested. This is coming from 350.12.


----------



## axiumone

The Witcher 3 SLI profile for Titan X cards still sucks. Even though usage is shown across all cards, frame rates show zero improvement.


----------



## looniam

idk about 5-8 FPS depending on the "view", but I definitely get less stuttering!




old




same settings but used HBAO+ in newer


----------



## Phaelynar

Installed these a few minutes ago. Will not have any time to play with them though until a bit later.


----------






## majin662

Quote:


> Originally Posted by *Exilon*
> 
> I got a ~20% bump in Witcher 3 on ultra no HW on my GTX 780. I've only compared at my latest save though.


Same. The performance was instantly noticeable; areas that used to be 50-ish are a solid 60. If I had to put a number on it for Witcher 3, it was between 5-10 FPS in the same areas, with the same settings, as prior to this driver.


----------



## Assirra

Got the same 10 FPS increase as most here on my GTX 980.
I'm able to use HW on Geralt now with the same settings as I had before, at maybe 1-2 FPS less.


----------



## LaBestiaHumana

352.86 and 353.06 both crash while web browsing; this has been happening since 352.86. These drivers also don't let me play GTA 5 with any type of overclock, and sometimes the top card downclocks itself to 575 MHz. My only fix was going back to 350.12: no crashes, and I can overclock the snot out of my Titans.


----------



## Mattousai

THIS is how I was expecting my 780 to perform from the get-go. On the last driver, in order to maintain 60 FPS, I had to keep everything on high with shadows and grass on medium. Now I can run everything on high, with all the textures on ultra (HW off, of course), and maintain 60 FPS.









Of course Nvidia would never come out and say they intentionally crippled Kepler-based cards. If no one had said anything, though, does anyone think they would have "fixed" the issue? Not saying this is the case... but come on. They're a business, and making money is priority number one.


----------



## DIYDeath

Quote:


> Originally Posted by *Mattousai*
> 
> THIS is how I was expecting my 780 to perform from the get go. On the last driver in order to maintain 60FPS, I had to keep everything on high, with shadows and grass on medium. Now I can run everything on high, with all the textures running on ultra (HW off of course) and maintain 60 FPS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Of course nvidia would never come out and say they intentionally crippled kepler based cards. If no one said anything though, does anyone think they would have "fixed" the issue? Not saying this is the case... but come on. They're a business, and making money is priority number 1.


I doubt they would have fixed the issue on their own. He who trusts a corporation to act in the best interest of the customer and not in the best interest of itself is a fool.


----------



## Lansow

Quote:


> Originally Posted by *Mattousai*
> 
> Of course nvidia would never come out and say they intentionally crippled kepler based cards. If no one said anything though, does anyone think they would have "fixed" the issue? Not saying this is the case... but come on. They're a business, and making money is priority number 1.


Of course they wouldn't, because it didn't happen. Other members on this board have tested it themselves, and seen performance INCREASE on Kepler hardware across driver releases in the past year, not decrease.

The Witcher 3 had some issues with the Game Ready driver release on Kepler. Nobody denies that, but claiming it's intentional? That's a leap of logic I'm simply unwilling to make.


----------



## Lansow

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Kepler performance wasn't intentionally crippled, but it didn't get any performance boosts since Maxwell's release. Some may argue that it's the same difference.


Majin SSJ Eric tested this in this post http://www.overclock.net/t/1556199/pcgameshardware-witcher-3-benchmarks/290#post_23925079

He found that performance has increased, even though it hasn't been by much. He saw increased 3DMark scores despite lower clock speeds. Those results came from someone who has been pretty vocal about the perceived issue.

If I had to guess, I would assume it's more because Kepler is a very mature architecture with very few optimizations left on the table, whereas Big Maxwell is still in its relative infancy.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *Lansow*
> 
> Majin SSJ Eric tested this in this post http://www.overclock.net/t/1556199/pcgameshardware-witcher-3-benchmarks/290#post_23925079
> 
> He found that performance has increased, even though it hasn't been by much. He saw increased 3DMark scores despite lower clock speeds. Those results came from someone who has been pretty vocal about the perceived issue.


Mine actually dropped. But it isn't apples to apples since I upgraded CPU.


----------



## LaBestiaHumana

http://www.3dmark.com/3dm/7138725 - Graphics Score:25803 - MAY 2015



http://www.3dmark.com/fs/1751682 - Graphics Score: 25813 - FEB 2014


----------



## Lansow

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Mine actually dropped. But it isn't apples to apples since I upgraded CPU.


That's bizarre. I wonder what led to that drop... is it a large drop in performance, or minor?

EDIT: Okay, saw your results and it has dropped ever so slightly... strange. Still, 10 points is well within margin of error, isn't it? A single degree difference on the GPU temps could be sufficient to account for that.


----------



## DoomDash

Kepler <3


----------



## Assirra

Quote:


> Originally Posted by *Mattousai*
> 
> THIS is how I was expecting my 780 to perform from the get go. On the last driver in order to maintain 60FPS, I had to keep everything on high, with shadows and grass on medium. Now I can run everything on high, with all the textures running on ultra (HW off of course) and maintain 60 FPS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Of course nvidia would never come out and say they intentionally crippled kepler based cards. If no one said anything though, does anyone think they would have "fixed" the issue? Not saying this is the case... but come on. They're a business, and making money is priority number 1.


Even when they fixed their booboo, the conspiracy paranoia goes on.
Nvidia can never win, it seems.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *Lansow*
> 
> That's bizarre. I wonder what led to that drop... is it a large drop in performance, or minor?


It didn't really drop, considering runs can have slight variations. You can see the current GPU score is identical to the one from February 2014, so no improvements there.

I will run some Firestrike with this driver and see if there is boost on synthetic stuff.

I wouldn't mind improvements on GTA 5, since it's what I'm currently playing non-stop.


----------



## y2kcamaross

Hmmmm, using these drivers with G-Sync mode set to "fullscreen and windowed", playing The Witcher 3 in fullscreen cuts my FPS in half; put G-Sync mode back to "fullscreen only" and my old FPS returns. Very odd.


----------



## Noufel

Did anyone with 980 SLI notice a performance boost with this driver?


----------



## Yungbenny911

Finally... Hopefully the number of Kepler owners cursing the day they did business with Nvidia will shrink lol


----------



## LaBestiaHumana

Quote:


> Originally Posted by *Yungbenny911*
> 
> Finally... Hopefully the number of Kepler owners cursing the day they did business with Nvidia will shrink lol


At least Nvidia has shown that it listens. It would be bad if people just kept quiet.


----------



## Qu1ckset

Saw 5-10FPS increase in Witcher 3!

I think I'm just going to try to find a second 780 Ti for 300-350 CAD used; much cheaper than dropping $880 CAD on the 980 Ti. SLI 780 Tis should hold me over until Arctic Islands and Pascal!


----------



## ttnuagmada

Haven't tested anything else yet, but this is a definite win for Witcher 3. I had been playing it on my TV with a gamepad. Prior to these drivers, with everything but HW maxed, I stayed mostly locked at 60 with an occasional dip into the mid-50s at 1080p. Now suddenly I can do 1440p DSR and I've yet to see it go below 60.


----------



## barsh90

Did anyone notice any FPS increases in games other than The Witcher 3?


----------



## tsm106

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> http://www.3dmark.com/3dm/7138725 - Graphics Score:25803 - MAY 2015
> 
> 
> 
> http://www.3dmark.com/fs/1751682 - Graphics Score: 25813 - FEB 2014


Quote:


> Originally Posted by *Lansow*
> 
> Quote:
> 
> 
> 
> Originally Posted by *LaBestiaHumana*
> 
> Mine actually dropped. But it isn't apples to apples since I upgraded CPU.
> 
> 
> 
> That's bizarre. I wonder what led to that drop... is it a large drop in performance, or minor?
> 
> EDIT: Okay, saw your results and it has dropped ever so slightly... strange. Still, 10 points is well within margin of error, isn't it? A single degree difference on the GPU temps could be sufficient to account for that.

Those two runs aren't directly comparable because of the different CPU platforms and the difference in GPU clocks. With the differences accounted for, the new run is actually more efficient on graphics score because it's running lower GPU clocks, unless I'm mistaken. Btw, look at that silly hexacore CPU bias.


----------



## Twist86

Quote:


> Originally Posted by *iamhollywood5*
> 
> Nvidia fixing Kepler optimization and setting the 980 Ti price at $650, which actually makes me think twice about upgrading my 780 Ti to a 980 Ti.
> 
> Good guy Nvidia, for once.


I wouldn't go that far; I think the massive backlash made them change their thinking on this. I mean, let's face it, Nvidia has been caught pulling this crap in the past, so it's not a major stretch to think they would try it again.


----------



## xSociety

Anyone tested out 'On (Smooth)' vs. just 'On' with Gsync?


----------



## go4life

How are the gains in other games? I don't play W3; mostly GTA V at the moment. Anything for SLI?

Is this driver stable overall, unlike the last one?


----------



## Yungbenny911

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Yungbenny911*
> 
> Finally... Hopefully the amount of Kepler owners cursing the day they did business with Nvidia would reduce lol
> 
> 
> 
> At least Nvidia has shown that it listens. *It would be bad if people just kept quiet.*

Agreed!


----------



## TFL Replica

Well, the previous driver was 100% stable for me. I'll let you guys know if this one somehow manages to crash.


----------



## General123

Here's to hoping the random crashes are gone.


----------



## barsh90

Quote:


> Originally Posted by *General123*
> 
> Here's to hoping the random crashes are gone.


The random crashes on the previous driver were the result of Google Chrome or Firefox using hardware acceleration (turning off the acceleration seems to fix the issue).
This driver did not fix that issue, however; I'm still having crashes with Google Chrome and Firefox. Nvidia was well aware of this issue and apparently decided to overlook it in this version. How lame.
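For reference, the workaround being passed around the thread is just turning off browser hardware acceleration so the driver's GPU compositing path is never exercised. A minimal sketch follows; the install path is an assumption for a default Windows setup, and whether this actually avoids the TDR crashes is the thread's claim, not a guarantee.

```shell
# Workaround sketch: launch Chrome with GPU compositing disabled so the
# NVIDIA acceleration path suspected of triggering the TDR crashes is
# bypassed. (Path assumes a default Windows install of 32-bit Chrome.)
"C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" --disable-gpu

# Firefox equivalent: untick "Use hardware acceleration when available"
# under Options > Advanced, or set the about:config pref
#   layers.acceleration.disabled = true
# and restart the browser.
```

Both toggles are reversible, so it's an easy way to test whether a given crash is the driver's acceleration path or something else.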


----------



## tsm106

Quote:


> Originally Posted by *barsh90*
> 
> Quote:
> 
> 
> 
> Originally Posted by *General123*
> 
> Here's to hoping the random crashes are gone.
> 
> 
> 
> The random crashes from the previous driver were as a result of google chrome or Mozilla using hardware acceleration(turning off the acceleration seems to fix the issue).
> However this driver did not fix that issue. I'm still having crashes with Google Chrome and Firefox. Nvidia was well aware of this issue, and apparently decided to overlook it with this version. How lame.

You can prevent the HW accel issue easily with Afterburner. I wrote a section on HW accel and the use of RTSS; it's a universal process.

http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


----------



## barsh90

Can anyone confirm a performance increase in games other than The Witcher 3? Like 3DMark, GTA 5, etc.?


----------



## Lansow

Quote:


> Originally Posted by *tsm106*
> 
> You can prevent the hw accel easy with Afterburner. I wrote a section on hw accel and the use of RTSS. It's a universal process.
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


I didn't even know that thread existed! There is some *very* cool stuff in there! Thank you.


----------



## Lansow

Quote:


> Originally Posted by *y2kcamaross*
> 
> Hmmmm, using these drivers with G-Sync mode set to "fullscreen and windowed", playing The Witcher 3 in fullscreen cuts my FPS in half; put G-Sync mode back to "fullscreen only" and my old FPS returns. Very odd.


Wouldn't switching to windowed mode disable SLI? That might account for what you're seeing. Sounds like a bug where the driver isn't properly detecting fullscreen mode anymore.

Does it happen in any other titles?


----------



## rluker5

Quote:


> Originally Posted by *looniam*
> 
> idk about 5-8 fps depending on the "view" but i definitely get less stuttering!
> 
> 
> 
> 
> old
> 
> 
> 
> 
> same settings but used HBAO+ in newer


ooo I hate stuttering in that game. Tomorrow I'll check to see if it can match my W10 [email protected]:2:0 cobbled together performance boost. Maybe I'll set W8.1 back to #1 on boot priority.


----------



## Lansow

Quote:


> Originally Posted by *rluker5*
> 
> ooo I hate stuttering in that game. Tomorrow I'll check to see if it can match my W10 [email protected]:2:0 cobbled together performance boost. Maybe I'll set W8.1 back to #1 on boot priority.


Love your av. My daughter did the same thing when I first got the 60". LOL


----------



## phaseshift

Display driver crash still not fixed....


----------



## ALT F4

Quote:


> Originally Posted by *phaseshift*
> 
> display driver crash still not fixed....


That sucks. I'm going to try on my end.
Anyone else still having this issue?


----------



## xSociety

Can someone else try setting Vertical Sync to "On (Smooth)" with SLI? I just tried it and holy hell, it feels WAAAY smoother; just want to make sure I'm not crazy.


----------



## tpi2007

Quote:


> Originally Posted by *Lansow*
> 
> You are entitled to your opinion, but "abandon" has a very specific meaning that you've missed. Only one of us is trying to "spin". I am basing my statement on facts (they fixed the performance), you are basing yours on conjecture.
> 
> Let's stick to facts.


What exactly in my post are you calling "opinion"?

1. I started by saying that we should first have some reviews of the driver comparing performance before and after, before jumping to conclusions like you promptly did, even before people started posting their own personal experiences. I don't know how you can take Nvidia's word for it and declare the fix a fact before third-party testing results are presented.

2. "Abandon" (that is your word of choice to describe the problem, btw, not everyone's) is not meant to be taken literally, so there goes your specific meaning; new drivers will keep supporting Kepler for a few more years. It conveys the fact that they stopped optimizing the drivers for newer games the way they had done for Fermi, for example, when Kepler and Big Kepler had been out for years;

3. People have been complaining about unexpected Kepler performance vs. GM204 Maxwell vs. Hawaii since last year, also on this forum. The problem for Nvidia is that we do know some key metrics from their own admitted facts: one 128-CUDA-core SMM (Maxwell) has 90% of the performance of a 192-CUDA-core SMX (Kepler); also, by their own admission, the GTX 960 wasn't supposed to be better than a GTX 760, nor was it supposed to replace it; it was designed to replace the GTX 660.

4. The way you phrased your post conveyed the message that people were wrong to have complained and that 'there is nothing to see here, especially now, so please be quiet', as if that were necessary. Your anticipation of a negative reaction to something good is inexplicable. It's counter-productive damage control at its finest. If the problems are solved, then great; that was the purpose of the complaining, getting the problems solved. People won't forget them, though. It's always nice to remember when certain things happened and the circumstances under which they changed.

Quote:


> Originally Posted by *Lansow*
> 
> Majin SSJ Eric tested this in this post http://www.overclock.net/t/1556199/pcgameshardware-witcher-3-benchmarks/290#post_23925079
> 
> He found that performance has increased, even though it hasn't been by much. He saw increased 3DMark scores despite lower clock speeds. Those results came from someone who has been pretty vocal about the perceived issue.
> 
> If I had to guess, I would assume it's more because Kepler is a very mature architecture with very few optimizations left on the table, whereas Big Maxwell is still in its relative infancy.


Big Maxwell is, at its core, practically the same as GM107 for all intents and purposes that matter in current DX 11.0 games, and that chip was released in February of last year.

Quote:


> GM204 may be a second generation Maxwell part, but it is without question still a Maxwell part. Maxwell has learned some new tricks that we are going to cover here, but functionally speaking you can consider GM204 to be a bigger version of GM107, taking more SMMs and more ROP/memory partitions and using them to build a bigger, more powerful GPU.


http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/3

Quote:


> Diving into the specs, GM200 can for most intents and purposes be considered a GM204 + 50%. It has 50% more CUDA cores, 50% more memory bandwidth, 50% more ROPs, and almost 50% more die size.


http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review
Quote:


> GM200 is 601mm2 of graphics, and this is what makes it remarkable. There are no special compute features here that only Tesla and Quadro users will tap into (save perhaps ECC), rather it really is GM204 with 50% more GPU. This means we're looking at the same SMMs as on GM204, featuring 128 FP32 CUDA cores per SMM, a 512Kbit register file, and just 4 FP64 ALUs per SMM, leading to a puny native FP64 rate of just 1/32.


http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/2


----------



## bluewr

Quote:


> Originally Posted by *tsm106*
> 
> You can prevent the hw accel easy with Afterburner. I wrote a section on hw accel and the use of RTSS. It's a universal process.
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


What am I supposed to use in that one?


----------



## RagingCain

Quote:


> Originally Posted by *tpi2007*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Lansow*
> 
> You are entitled to your opinion, but "abandon" has a very specific meaning that you've missed. Only one of us is trying to "spin". I am basing my statement on facts (they fixed the performance), you are basing yours on conjecture.
> 
> Let's stick to facts.
> 
> 
> 
> What exactly in my post are you calling "opinion"?
> 
> 1. I started by saying that first let's have some reviews of the driver comparing the performance before and after before jumping to conclusions like you promptly did even before people started posting their own personal experiences. I don't know how you can take their word for it and declare that they fixed it a fact before third party testing results are presented.
> 
> 2. "Abandon" (that is your word of choice to describe the problem btw, but not everyone's) is not meant to be taken literally, so there goes your specific meaning, as new drivers will keep supporting Kepler for a few more years, it's to convey the fact that they stopped optimizing the drivers for newer games like they had done before for Fermi, for example, when Kepler and Big Kepler had been out for years;
> 
> 3. People have been complaining about unexpected Kepler performance vs GM204 Maxwell vs Hawaii since last year, also on this forum. The problem for Nvidia is that we do know some key metrics they themselves have stated: one 128-CUDA-core SMM (Maxwell) has 90% of the performance of a 192-CUDA-core SMX (Kepler); also, by their own admission, the GTX 960 wasn't supposed to be better than a GTX 760, nor was it supposed to replace it; it was designed to replace the GTX 660.
> 
> 4. The way you phrased your post conveyed the message that people were in the wrong for having complained and that 'there is nothing to see here, especially now, so please be quiet', as if that were necessary. Your anticipation of a negative reaction to something good is inexplicable; it's counter-productive damage control at its finest. If the problems are solved, then great, that was the purpose of the complaining: getting the problems solved. People won't forget them, though. It's always nice to remember when certain things happened and the circumstances under which they changed.
> 
> Quote:
> 
> 
> 
> Originally Posted by *Lansow*
> 
> Majin SSJ Eric tested this in this post http://www.overclock.net/t/1556199/pcgameshardware-witcher-3-benchmarks/290#post_23925079
> 
> He found that performance has increased, even though it hasn't been by much. He saw increased 3DMark scores despite lower clock speeds. Those results came from someone who has been pretty vocal about the perceived issue.
> 
> If I had to guess, I would assume it's more because Kepler is a very mature architecture with very few optimizations left on the table, whereas Big Maxwell is still in its relative infancy.
> 
> Click to expand...
> 
> For all intents and purposes that matter in current DX 11.0 games, Big Maxwell is at its core practically the same as GM107, which was released in February of last year.
> 
> Quote:
> 
> 
> 
> GM204 may be a second generation Maxwell part, but it is without question still a Maxwell part. Maxwell has learned some new tricks that we are going to cover here, but functionally speaking you can consider GM204 to be a bigger version of GM107, taking more SMMs and more ROP/memory partitions and using them to build a bigger, more powerful GPU.
> 
> Click to expand...
> 
> http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/3
> 
> Quote:
> 
> 
> 
> Diving into the specs, GM200 can for most intents and purposes be considered a GM204 + 50%. It has 50% more CUDA cores, 50% more memory bandwidth, 50% more ROPs, and almost 50% more die size.
> 
> Click to expand...
> 
> http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review
> Quote:
> 
> 
> 
> GM200 is 601mm2 of graphics, and this is what makes it remarkable. There are no special compute features here that only Tesla and Quadro users will tap into (save perhaps ECC), rather it really is GM204 with 50% more GPU. This means we're looking at the same SMMs as on GM204, featuring 128 FP32 CUDA cores per SMM, a 512Kbit register file, and just 4 FP64 ALUs per SMM, leading to a puny native FP64 rate of just 1/32.
> 
> Click to expand...
> 
> http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/2
Click to expand...

Dude, you are way off base.

His original post was about people spreading misinformation, not him dismissing a performance bug introduced when the drivers were merged into the Windows 10 family set, which is 35x.xx.

By the way, speaking as a Kepler owner myself, the only surprising performance was in PCars and Witcher 3. My 3DMark scores haven't been higher, nor have my Hardline/BF4 frames been better.


----------



## barsh90

Apparently I had some decrease in performance in other games (I have two GTX 980s in SLI).

I ran some short benchmarks (1440p) on Dying Light, GTA 5, and Firestrike Extreme.

Dying Light average fps decreased from 113 fps to 100 fps.

GTA 5 went down from 59 fps to 57 fps.

And the Firestrike Extreme graphics score went down from 12329 to 12240.


----------



## th3illusiveman

So how much have they improved Kepler GPUs by?


----------



## ALT F4

Quote:


> Originally Posted by *RagingCain*
> 
> Dude, you are way off base.
> 
> His original post was about people spreading misinformation, not him dismissing a performance bug introduced when the drivers were merged into the Windows 10 family set, which is 35x.xx.
> 
> By the way, speaking as a Kepler owner myself, the only surprising performance was in PCars and Witcher 3. My 3DMark scores haven't been higher, nor have my Hardline/BF4 frames been better.


Quote:


> Originally Posted by *th3illusiveman*
> 
> So how much have they improved Kepler GPUs by?


Kepler improvement will be minimal; I don't think there is room within the architecture for dramatic gains. Hopefully they don't just completely leave Kepler in the dust while optimizing new games for the newer architectures.


----------



## Mad Pistol

Quote:


> Originally Posted by *ALT F4*
> 
> Kepler improvement will be minimal; I don't think there is room within the architecture for dramatic gains. Hopefully they don't just completely leave Kepler in the dust while optimizing new games for the newer architectures.


I've been saying this for a while. Kepler is a 3-year-old architecture. It has been optimized a lot over the last 3 years. I highly doubt there is much more for Kepler to give that hasn't already been given.

What I do want to see, though, is continuing support and optimization for future titles (like Witcher 3) so that the game performs the best it possibly can. I understand that my GTX 780 is not going to be as fast as a 970 or above. However, I NEVER want to see it be slower than a GTX 960... a card with less than half the CUDA cores and a third the memory bus width.

It sounds like Nvidia fixed Witcher 3 performance on Kepler, so all is well.


----------



## iluvkfc

Getting crashes on Chrome with this new driver, rolled back to GTA V driver which is stable for me.

Also enabling ULMB from NV control panel doesn't enable anything, all it does is disable G-Sync.

Oh well at least 980 Ti is priced nicely.


----------



## HeadlessKnight

The GTX 980 Ti looks like a very promising card, but I am still afraid to touch anything new from Nvidia after their Kepler driver fiasco, since I generally keep my cards for more than a year. I am sure they only made the Kepler fix because of the uproar in their forums; they don't want to get a bad reputation from it. Without people making any noise, this Kepler optimization driver probably would never have existed; instead they would have made the gap a selling point for their GTX 900 series cards.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *HeadlessKnight*
> 
> The GTX 980 Ti looks like a very promising card, but I am still afraid to touch anything new from Nvidia after their Kepler driver fiasco, since I generally keep my cards for more than a year. I am sure they only made the Kepler fix because of the uproar in their forums; they don't want to get a bad reputation from it. Without people making any noise, this Kepler optimization driver probably would never have existed; instead they would have made the gap a selling point for their GTX 900 series cards.


Agreed.

At least there is some attempt to rectify the situation for Kepler cards, but there is still a sour taste about the whole thing.


----------



## jmcosta

Did a quick bench in Metro LL on my gf's rig (GTX 670).

A gain of 4 fps, which is like 10 in-game.


Spoiler: Warning: Spoiler!







My 980 still has the same 90 fps average it has had since February.


----------



## Silent Scone

Quote:


> Originally Posted by *tpi2007*
> 
> Let's see reviews of this driver compared to the previous one and then if it makes a difference (the post above mine is a good indication that it does), yes, it can stop, but not because of what you say. It's because there was a problem, the community spoke up and Nvidia had to acknowledge the problem and fix it. And apparently it was fixable after all. No FUD, on the contrary. The hardware capabilities of Kepler weren't maxed out compared to Maxwell as some have argued, trying to defend Nvidia.
> 
> Also, you have no idea if Nvidia would have fixed it, or if they would have been this fast if it wasn't for the community complaining. The mere fact that they acknowledged the problem and issued a fix proves that there was in fact something wrong with the drivers for some time now and the community did the right thing. Nvidia obviously also did the right thing in order to keep customer loyalty.
> 
> And that's that, don't try to spin this the other way around.


Yeah nice speech, but if Maxwell performance has suffered as a result you can shove your patch work armchair up your proverbial









If not for the end users, where else is the incentive to extract the very most out of a discontinued line?! Especially when those optimisations may even come at a performance penalty for newer cards, which some are finding.

The tinfoil around here is getting mighty ridiculous of late, it's like a subsection on godlikeproductions.


----------



## Leader2light

God, I updated to fix the Chrome crashes. Guess that's not fixed.


----------



## bluewr

OK, I have GTX 780 SLI and did some benchmark tests on it.
Catzilla didn't seem to have any change; Firestrike Extreme got some boost, though.
I'll try some gaming to see.
But I did hit one of the crash-to-desktop bugs.
Both benchmarks were run at stock, no overclock.

Old driver 347.88



New driver 353.06


----------



## xSociety

I haven't crashed in Chrome on ANY Nvidia driver I have ever used, and I update to every single one of them. I use nothing but Chrome, btw.


----------



## Silent Scone

I've had two lock-ups at the desktop with these (no TDR), on an overclock that's been stable since the X99 launch last September. Have rolled back.


----------



## cookieboyeli

I'm getting 3-8 fps more in The Witcher 3 depending on location with these drivers vs 347.88 and 352.86. Only 350.12 was causing me crashes in everything. I've yet to crash with these. I'm running a GTX 770 Lightning (2GB). TW3 is finally playable everywhere at 1440p!


----------



## ALT F4

Quote:


> Originally Posted by *Silent Scone*
> 
> I've had two lock-ups at the desktop with these (no TDR), on an overclock that's been stable since the X99 launch last September. Have rolled back.


I just had an issue where all my windows started glitching, became the color of my Windows theme, and finally became a black hole: windows with nothing but black inside.







Had to restart and continue what I was doing.


----------



## Silent Scone

Quote:


> Originally Posted by *ALT F4*
> 
> I just had an issue where all my windows started glitching, became the color of my Windows theme, and finally became a black hole: windows with nothing but black inside.
> 
> 
> 
> 
> 
> 
> 
> Had to restart and continue what I was doing.


lol this is what I had in Firefox practically. Then it finally locked completely.

I blame Kepler users....


----------



## cookieboyeli

Quote:


> Originally Posted by *ALT F4*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Silent Scone*
> 
> I've had two lock-ups at the desktop with these (no TDR), on an overclock that's been stable since the X99 launch last September. Have rolled back.
> 
> 
> 
> I just had an issue where all my windows started glitching, became the color of my Windows theme, and finally became a black hole: windows with nothing but black inside.
> 
> 
> 
> 
> 
> 
> 
> Had to restart and continue what I was doing.
Click to expand...

I had that issue on 350.12. This driver is more stable than that for me, though. Definitely worth a try vs. 347.88...


----------



## Jinto

So happy about G-sync in windowed mode. Now I can finally play FFXIV in borderless windowed and keep ACT UI on the main display.


----------



## Wishmaker

I bet AMD did not expect this!


----------



## Silent Scone

I'm giving these another shot. Have disabled hardware acceleration in Opera to see if that remedies it. Seeing the trend at the moment I'd be inclined to pin it on browser hardware acceleration in general.

[EDIT] After doing this, the browser is actually a lot smoother when scrolling and moving the window around.


----------



## azanimefan

Quote:


> Originally Posted by *Wishmaker*
> 
> I bet AMD did not expect this!


What? That Nvidia would release two straight drivers that can't even play a YouTube video without crashing?

Congrats on the +3-8 fps Kepler owners are seeing in TW3, but for me the bugginess of the last driver was the most offensive part of it. It was a junk driver, just like this one.


----------



## Wishmaker

Quote:


> Originally Posted by *azanimefan*
> 
> What? That Nvidia would release two straight drivers that can't even play a YouTube video without crashing?
> 
> Congrats on the +3-8 fps Kepler owners are seeing in TW3, but for me the bugginess of the last driver was the most offensive part of it. It was a junk driver, just like this one.


*bearhug* My last AMD driver, 15.5 beta, is crashing on YouTube as well, so don't feel so special.


----------



## Anateus

Finally. And please don't tell me they haven't nerfed and abandoned Kepler. Had the internet not acted, Nvidia would have done nothing.


----------



## Glottis

Quote:


> Originally Posted by *Wishmaker*
> 
> *bearhug* my last AMD driver 15.5 beta is crashing on youtube as well, so don't feel so special


But the grass is always greener on the other side, isn't it?


----------



## Kinaesthetic

Quote:


> Originally Posted by *Leader2light*
> 
> God, I updated to fix the Chrome crashes. Guess that's not fixed.


Settings -> "Show Advanced Settings" (at the bottom of the settings page) -> Scroll down to the "System" subcategory -> Uncheck "Use Hardware Acceleration When Available".

Done.


----------



## sugalumps

Quote:


> Originally Posted by *Anateus*
> 
> Finally. And please don't tell me they haven't nerfed and abandoned Kepler. Had the internet not acted, Nvidia would have done nothing.


You are a true internet hero

*tips


----------



## cookieboyeli

Quote:


> Originally Posted by *Kinaesthetic*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Leader2light*
> 
> God, I updated to fix the Chrome crashes. Guess that's not fixed.
> 
> 
> 
> Settings -> "Show Advanced Settings" (at the bottom of the settings page) -> Scroll down to the "System" subcategory -> Uncheck "Use Hardware Acceleration When Available".
> 
> Done.
Click to expand...

While this does fix the crashing, it destroys YouTube videos: they now use bicubic scaling and it's ugly, really wannabe-2008-Flash-player-esque. Probably better to use 347.88 until they fix this.


----------



## Wishmaker

Quote:


> Originally Posted by *Glottis*
> 
> but grass is always greener on the other side isn't it?


Well NVIDIA does use green, so it is greener


----------



## ALT F4

Quote:


> Originally Posted by *azanimefan*
> 
> What? That Nvidia would release two straight drivers that can't even play a YouTube video without crashing?
> 
> Congrats on the +3-8 fps Kepler owners are seeing in TW3, but for me the bugginess of the last driver was the most offensive part of it. It was a junk driver, just like this one.


So I'm not the only one


----------



## Sisaroth

Quote:


> Originally Posted by *Lansow*
> 
> Fantastic! Maybe now the FUD surrounding NV "abandoning" Kepler can stop.
> 
> The best news in this, at least for me, is:
> 
> "Supports G-SYNC technology in windowed mode, offers both G-SYNC (V-SYNC on) and G-SYNC (V-SYNC off) options, support for Ultra-Low Motion Blur (ULMB) in the NVIDIA Control Panel"
> 
> YAY! Now G-SYNC monitors won't be stuck in G-SYNC mode all the time... and G-SYNC in windowed applications? That... is interesting.


Without the FUD we might not even have got this.


----------



## Silent Scone

Witcher 3 just hung during a cutscene. No TDR event, and I was able to Alt+Tab my way out to terminate it. First time that's happened in 75 hours of game time; the only variable is these drivers.

[EDIT] and again...

Gave it a second go, and regretting it. Off it comes...


----------



## Kinaesthetic

Quote:


> Originally Posted by *cookieboyeli*
> 
> While this does fix the crashing, it destroys YouTube videos: they now use bicubic scaling and it's ugly, really wannabe-2008-Flash-player-esque. Probably better to use 347.88 until they fix this.


Well, if it makes you feel any better, I had to do the exact same thing to get Chrome to play YouTube videos properly on my MacBook Air, which does NOT have any Nvidia or AMD hardware in it, only Intel integrated graphics. In particular, 1080p 60 fps video was dropping a significant number of frames and occasionally would crash Chrome.

So I'm led to believe that this problem is just as much Google's as it is Nvidia's.


----------



## Panzerfury

Quote:


> Originally Posted by *Silent Scone*
> 
> Witcher 3 just hung during a cutscene. No TDR event, and I was able to Alt+Tab my way out to terminate it. First time that's happened in 75 hours of game time; the only variable is these drivers.
> 
> [EDIT] and again...
> 
> Gave it a second go, and regretting it. Off it comes...


I just got the same thing. Also happened to me on the previous one.


----------



## Sisaroth

Quote:


> Originally Posted by *Kinaesthetic*
> 
> Well, if it makes you feel any better, I had to do the exact same thing to get Chrome to play YouTube videos properly on my MacBook Air, which does NOT have any Nvidia or AMD hardware in it, only Intel integrated graphics. In particular, 1080p 60 fps video was dropping a significant number of frames and occasionally would crash Chrome.
> 
> So I'm led to believe that this problem is just as much Google's as it is Nvidia's.


For me it's been a very long time since a YouTube video failed or had any kind of problem. But I'm on AMD with drivers from last year, though.


----------



## Silent Scone

Quote:


> Originally Posted by *Panzerfury*
> 
> I just got the same thing. Also happened to me on the previous one.


That makes matters more confusing; I haven't had that once on the previous driver.

Hope Paupler owners are happy now.

Look at what you've _DONE!_


----------



## Lantian

Just hope the Win 10 driver will follow shortly.


----------



## y2kcamaross

Quote:


> Originally Posted by *Lansow*
> 
> Wouldn't switching to windowed mode disable SLI? That might account for what you're seeing. Sounds like a bug where the driver isn't properly detecting fullscreen mode anymore.
> 
> Does it happen in any other titles?


No, SLI works perfectly fine in windowed mode


----------



## Maintenance Bot

Quote:


> Originally Posted by *barsh90*
> 
> Apparently i had some decrease in performance in other games ( i have 2 980 gtx in SLI)
> 
> I ran some short benchmarks(1440p) on dying light gta 5 and firemark extreme
> 
> dying light average fps decreased from 113fps to 100fps
> 
> Gta 5 went down from 59fps fps to 57fps
> 
> And firemark extreme graphic score went down from 12329 to 12240


Seen that also. I took a hit of around 5 to 7 fps in Dying Light and BF4 at 1440p.


----------



## xXUNLUCKYXx

Going to install this driver tonight!

I was getting better performance from a single GTX 680 4GB than when playing in SLI.

I'll update later on!

UPDATE:

SLI performance on my 680s is still shocking; I'm getting about 10 to 15 fps more from a single card....


----------



## FloJoe6669

Well, these drivers change my plans a bit... now wondering if I should bite on a 970 (or an upcoming AMD offering) when I get to start The Witcher 3 in a few weeks, as The Witcher 3 is the first and only game in my favourite franchise that I will upgrade my PC solely for. For people on Kepler running Witcher 3 v1.04 on the 353.06 drivers, thoughts? (Currently running a 4GB 670 @ 1440p.)


----------



## orion933

G-Sync in windowed and borderless windowed modes doesn't seem to work for me:

I go to NVCP and enable G-Sync for both full screen and windowed, but in-game, when I set windowed mode, the LED goes white and G-Sync is not active.
Has anyone else had that happen?


----------



## EniGma1987

Quote:


> Originally Posted by *Lansow*
> 
> There's some weirdness in this one. G-SYNC is staying enabled on the Windows desktop unless you change the global setting to ULMB under Manage 3D Settings. Kinda wonky IMO. The default on the desktop should go back to ULMB (if you configured it this way) and G-SYNC in fullscreen mode only. The current implementation requires me to set ULMB globally and enable G-SYNC individually for each title... not an optimal situation.
> 
> Oh well, it's better than it was! Guess I'll be filling out another Driver Feedback post.


But the previous driver enabled G-Sync all the time when the feature was on, not just in games, so you could never even use ULMB unless G-Sync was entirely disabled. So adding the global ULMB option is probably a better solution all around.


----------



## specopsFI

Did some testing with a random benchmark selection (whatever I happened to have installed). These are all "old" titles, which haven't been tampered with at any point, even if some of the post-Maxwell-launch games have seen curious performance numbers.

GPU: Asus GTX 670 DC2 4GB @1228/3456MHz
Rest of the rig: from my signature

Bioshock Infinite:
350.12: 83 fps
353.06: 83.6 fps

FFXIV A Realm Reborn benchmark:
350.12: 11589
353.06: 11587

Heaven benchmark:
350.12: 38.1 fps
353.06: 38.3 fps

Hitman Absolution:
350.12: 37.3 fps
353.06: 38.9 fps

Metro Last Light:
350.12: 47.88 fps
353.06: 48.62 fps

Sniper Elite V2:
350.12: 65.57 fps
353.06: 65.63 fps

Tomb Raider:
350.12: 40.7 fps
353.06: 41.2 fps

Valley Benchmark:
350.12: 47.7 fps
353.06: 47.9 fps

So the difference in these was mostly in the 1% category, and the only real improvement was in Hitman Absolution. Overall the 353.06 driver was 1.1% faster. Nothing spectacular, but who in their right mind would expect a 3+ year old GPU to get an across-the-board performance boost? Not me, at least.

It would be interesting to see results from the most controversial game titles from the last year or so: Project Cars, Shadow of Mordor, Watch_Dogs, Far Cry 4, GTA V, etc. Personally, I haven't seen anything but steady performance from driver to driver with this GTX 670, but then again, I haven't had the time or the interest to get involved with the fresh game releases lately. Game-by-game optimizations at or right after game release are always where the big gains are seen, and it's no surprise to see Nvidia drag its feet with their older uarchs. I have no doubt that AMD would have been doing the same, had they actually launched a new uarch in the last three years.
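For what it's worth, the 1.1% overall figure is just the plain average of the per-title changes; a quick illustrative script (all numbers taken from the results above, nothing else assumed):

```python
# Per-title results from the post: title -> (350.12 score, 353.06 score).
# Units differ (fps vs benchmark points), so only relative change is compared.
results = {
    "Bioshock Infinite": (83.0, 83.6),
    "FFXIV A Realm Reborn": (11589, 11587),
    "Heaven": (38.1, 38.3),
    "Hitman Absolution": (37.3, 38.9),
    "Metro Last Light": (47.88, 48.62),
    "Sniper Elite V2": (65.57, 65.63),
    "Tomb Raider": (40.7, 41.2),
    "Valley": (47.7, 47.9),
}

# Percentage gain per title, then the unweighted average across titles.
gains = {name: (new - old) / old * 100 for name, (old, new) in results.items()}
for name, gain in sorted(gains.items(), key=lambda kv: -kv[1]):
    print(f"{name:22s} {gain:+.2f}%")

average_gain = sum(gains.values()) / len(gains)
print(f"average gain: {average_gain:.1f}%")  # ~1.1%, matching the post
```

At deltas this small, a geometric mean of the ratios gives essentially the same answer, so the simple average is a fair summary.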


----------



## xSociety

So no one else has tried the SLI Frame-Time smoothing option but me? Come on guys let me know what you think.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *azanimefan*
> 
> What? That Nvidia would release two straight drivers that can't even play a YouTube video without crashing?


I watched about 30 different music videos on Youtube last night, both on FF and IE (I really don't care for Chrome anymore) and never crashed once.

I'm just saying.

Time to download the drivers on my work PC and see how the GTX 560Ti likes them.


----------



## axiumone

Quote:


> Originally Posted by *xSociety*
> 
> So no one else has tried the SLI Frame-Time smoothing option but me? Come on guys let me know what you think.


Can you snap a screen shot of the option? I couldn't find it.


----------



## overvolted

Wasn't the GSYNC input lag issue easily fixed by capping the frame rate to 140?


----------



## xSociety

Quote:


> Originally Posted by *axiumone*
> 
> Can you snap a screen shot of the option? I couldn't find it.


----------



## Creator

Quote:


> Originally Posted by *ALT F4*
> 
> Kepler improvement will be minimal, I don't think there is any room within the architecture for exponential gains. Hopefully they don't just completely leave kepler in the dust while optimizing the new games for the newer architectures.


No architecture is ever maxed out for any new game. Even Fermi could be optimized for TW3.


----------



## Threx

Gsync is working for both fullscreen and borderless window modes for me. Thank goodness I don't have to play games in fullscreen anymore.


----------



## axiumone

Quote:


> Originally Posted by *xSociety*


Ah, thanks for that. That option was there before, so they've added the additional description pertaining to SLI?

If it operates the same as before, I remember messing around with it, but it didn't produce results that were noticeably different from standard vsync.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Threx*
> 
> Gsync is working for both fullscreen and borderless window modes for me. Thank goodness I don't have to play games in fullscreen anymore.


Yup, best thing about the driver for me.

A close second is the older card updates.

The bit about selectively being able to enable or disable VSync when above the panel's refresh rate is cute, but I don't use it. At 1440p, I never go above 120-144 FPS anyway. But it is nice to shut up the AMD FreeSync people about the possibility of lag above the VRR range.


----------



## overvolted

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Yup, best thing about the driver for me.
> 
> A close second is the older card updates.
> 
> The bit about selectively being able to enable or disable VSync when above the panel's refresh rate is cute, but I don't use it. At 1440p, I never go above 120-144 FPS anyway. But it is nice to shut up the AMD FreeSync people about the possibility of lag above the VRR range.


Overall, yeah, it's cute and a nice jab at AMD, but they could improve it a bit. Every now and then I catch my Swift letting the fps get up to 145, which isn't ideal. They could do us all a favor by capping it to 140 on the driver side instead of 143. Or better yet, do the right thing and let us decide our own frame rate cap.









Maybe I just have bad luck, but any time I add a secondary in-game frame limiter, the FPS swings stop being so smooth. Pretty much no choice but to leave it uncapped.


----------



## Silent Scone

Quote:


> Originally Posted by *47 Knucklehead*
> 
> I watched about 30 different music videos on Youtube last night, both on FF and IE (I really don't care for Chrome anymore) and never crashed once.
> 
> I'm just saying.
> 
> Time to download the drivers on my work PC and see how the GTX 560Ti likes them.


Disabling hardware acceleration within the browser fixes this issue anyway. Easily remedied. What I can't tolerate, obviously, is how it's now crashing Witcher 3. Not a TDR event, but a game crash event. No issues at all on 352.86.

So the "works on my machine" certificate isn't exactly an indication that the driver is decent.


----------



## djriful

CPU 3.8Ghz (stock)

*352.86* - The Witcher 3 + Hairworks on GTX TITAN 1200Mhz -> 28-35FPS

*353.06* - The Witcher 3 + Hairworks on GTX TITAN 1200Mhz -> 40-45FPS

Yeah huge improvement.


----------



## zefs

+4fps on GTX 780 - Witcher 3, wow what a "fix"
Nvidia is mocking customers again.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *zefs*
> 
> +4fps on GTX 780 - Witcher 3, wow what a "fix"
> Nvidia is mocking customers again.


Don't like it? Uninstall the driver and lose those 4 FPS. Or sell your 780 and get a 980 (I did). But seriously, complaining because you got better performance for FREE?


----------



## barsh90

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Don't like it? Uninstall the driver and lose those 4 FPS. Or sell your 780 and get a 980 (I did). But seriously, complaining because you got better performance for FREE?


Me and some other people have already confirmed performance loss in other games.


----------



## phaseshift

After updating to this I still experienced the "driver has stopped working" error, so I reverted back to 350.12 with a clean install, then updated to 353.06 again. Needless to say, I was able to do work and play Witcher 3 for about 5 hours without any crashes. Will continue to do some testing; I hope the issue is fixed. How? I don't know lol.


----------



## Dotachin

Is the 960 still better than the 780 on tw3?


----------



## Anateus

Quote:


> Originally Posted by *Dotachin*
> 
> Is the 960 still better than the 780 on tw3?


Was it ever better?


----------



## Dotachin

Quote:


> Originally Posted by *Anateus*
> 
> Was it ever better?


Yes.


----------



## Olivon

http://www.computerbase.de/2015-06/geforce-353.06-leistung-kepler-witcher-3/


----------



## xSociety

Quote:


> Originally Posted by *axiumone*
> 
> Ah, thanks for that. That option was there before, so they've added the additional description pertaining to sli?
> 
> If it operates the same as before, I remember messing around with it, but it didn't produce results that were noticeably different from standard vsync.


It feels a lot smoother to me now. I was even playing Far Cry 4 while not noticing any stuttering or hiccups. Would like to see an actual frame time review of this option though.


----------



## PiOfPie

As a heads-up: 353.06 broke game optimization functionality through GeForce Experience for me. It says the 760M in my lappy isn't a supported card; 352.86 worked fine.

Going to try a reinstall and see if it fixes things.


----------



## provost

This is an unstable driver. Rolling back


----------



## Nightingale

Quote:


> Originally Posted by *barsh90*
> 
> Me and some other people have already confirmed performance loss in other games.


I can concur. It's not just The Witcher 3, and it seems people are missing that point. This didn't start with The Witcher 3; people had been noticing this ever since the 344.77 drivers. The Witcher 3 was just the straw that broke the camel's back.


----------



## iluvkfc

Anyone have an issue where selecting ULMB doesn't enable ULMB?


----------



## 47 Knucklehead

Well, I updated my i7 at work running 64-bit Windows 7 with a GTX 560Ti card and I've been playing Youtube videos for 30 minutes now on FF, Chrome, and IE ... all at the same time no less ... and no crashes.


----------



## y2kcamaross

I'm going to reinstall tonight in hopes that it fixed my windowed G-SYNC problems, though I've only tested it in The Witcher 3. Can anyone else test it to see if it's just on my end or the driver in general? I have G-SYNC set in the NVIDIA control panel to fullscreen and windowed, and when I go to play The Witcher 3 (in fullscreen mode) my framerate is tanked, in the low 40s. Yet when I go back and change G-SYNC to just fullscreen mode in the control panel and then go back into The Witcher 3 (in fullscreen mode), my frame rates are back to the 75-90 they should be.


----------



## Aftermath2006

Drivers are completely unstable for me: constant driver crashing, and no issues before this install. Guess I'm rolling it back. Hope when my two 980 Tis get here this issue doesn't persist.


----------



## bigtonyman1138

Haven't had any issues yet with my 780ti and the latest drivers. Haven't had a chance to fire up the witcher 3 yet though.


----------



## Cerax

Really unstable driver. I'm using a GTX 970.


----------



## Horsemama1956

Quote:


> Originally Posted by *Lansow*
> 
> Majin SSJ Eric tested this in this post http://www.overclock.net/t/1556199/pcgameshardware-witcher-3-benchmarks/290#post_23925079
> 
> He found that performance has increased, even though it hasn't been by much. He saw increased 3DMark scores despite lower clock speeds. Those results came from someone who has been pretty vocal about the perceived issue.
> 
> If I had to guess, I would assume it's more because Kepler is a very mature architecture with very few optimizations left on the table, whereas Big Maxwell is still in its relative infancy.


Umm you're basing your argument on one guy doing a few tests? LOL


----------



## m0n4rch

Witcher 3 crashed as soon as I started playing, and then again later during gameplay; luckily I saved the game right before that happened. Never had crashes on 350.12. This driver is definitely unstable.


----------



## Abovethelaw

Cool. Yet another release I have to skip because GTA V will start to crash again. I'm still using 348.88 because it has never crashed with it, but it will crash with 350.12, 352.86, etc.


----------



## jdstock76

Quote:


> Originally Posted by *Silent Scone*
> 
> lol this is what I had in Firefox practically. Then it finally locked completely.
> 
> I blame Kepler users....


LoL ... Sorry but this is funny.

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Don't like it? Uninstall the driver, lose those 4FPS. Or sell your 780 and get a 980 (I did). But seriously, complaining because you got better performance for FREE, seriously?


^true

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Well, I updated my i7 at work running 64-bit Windows 7 with a GTX 560Ti card and I've been playing Youtube videos for 30 minutes now on FF, Chrome, and IE ... all at the same time no less ... and no crashes.


Hahahaha .... operator error!

Anyone have FF ARR, ARMA3, FC4, PR, Insurgency numbers? Still stuck at work.


----------



## cutty1998

Quote:


> Originally Posted by *iamhollywood5*
> 
> Nvidia fixing Kepler optimization and setting the 980 Ti price at $650, which actually makes me think twice about upgrading my 780 Ti to a 980 Ti.
> 
> Good guy Nvidia, for once.


I had a feeling that after all the backlash they heard over the 970 dilemma, and then the gimping of Kepler cards, they would make good and un-gimp them! Too bad there is nothing they can do to fix the 970s.


----------



## Xoriam

Crashing like crazy every time I alt-tab in FFXIV, and getting random hiccups and driver crashes.
Hardware acceleration in Chrome is off.


----------



## Apolladan

lmao

driver literally crashed while installing, a first for me

has been stable other than that, knock on wood


----------



## toxify

I've been playing W3 and Metro Last Light Redux with no crashes or drop in performance. I've also had Chrome open for the past day with Twitch on.


----------



## ALT F4

Quote:


> Originally Posted by *Creator*
> 
> No architecture is ever maxed out for any new game. Even Fermi could be optimized for TW3.


I agree, but you must have misread my post. Focus on "minimal" and "exponential"; there is a huge difference between a 0.1%, a 1%, and an 18% performance increase.

I never said any architecture was maxed out, I simply said there is no room for exponential gains. You don't have to bother responding, because the proof is in the performance itself. If there were more room for optimization, Fermi and Kepler would have received a bigger boost; they didn't for a reason.
Quote:


> Originally Posted by *ALT F4*
> 
> Kepler improvement will be minimal, I don't think there is any room within the architecture for exponential gains. Hopefully they don't just completely leave kepler in the dust while optimizing the new games for the newer architectures.


----------



## Cyclops

Can anyone confirm that a 780 is no longer being "bested" by a 960 in Witcher 3? I mean seriously...


----------



## MapRef41N93W

This is the second NVIDIA driver in a row that is garbage. Never seen this with NVIDIA before. Normally you get one dud driver that crashes a lot every year or so, but two in a row? That is some AMD level stuff right there.


----------



## Xoriam

Yeah.. I'm reverting. Too many crashes.

Also, when are we going to get a fix for ShadowPlay crashing games when alt-tabbing?


----------



## Slink3Slyde

Apart from the instability, can anyone confirm any improvement for Kepler in games other than Witcher 3? I tried to download the drivers last night but our internet is screwed at the moment.

I know someone posted some results showing slight improvements. I'm more interested in games released after Maxwell was.

That's the exact point at which Kepler started to show ~10% less performance relative to Maxwell than in earlier games, across most review sites I've looked at.

I'm waiting for the custom boards to release so I can check TPU's numbers with this latest driver against the older ones; that should give a better idea.


----------



## barsh90

Quote:


> Originally Posted by *Slink3Slyde*
> 
> Apart from the instability, can anyone confirm any improvement for Kepler in games other then Witcher 3? I tried to download the drivers last night but our internet is screwed at the moment.
> 
> I know someone posted some results showing slight improvements. I'm more interested in games released after Maxwell was.
> 
> Thats the exact point at which Kepler started to show ~10 % less performance then on games before across most review sites Ive looked at relative to Maxwell.
> 
> I'm waiting for the custom boards to release so I can check out TPU's numbers with this latest driver and reference it to the older ones, should give a better idea.


I can confirm a loss of performance in GTA 5, Dying Light, Killing Floor 2, and Fire Strike. Too lazy to bench other games, but you get the idea; this driver is a dud as well, just like the last one. Nvidia keeps dropping the ball.


----------



## dboythagr8

Might be weird... but does anybody notice a difference in YouTube quality after installing these drivers? Like, things are not as sharp even in 1080p. It looks like the video is blown up or something and not native 1920x1080. I've watched various videos and they all look like this, wth


----------



## Slink3Slyde

Quote:


> Originally Posted by *barsh90*
> 
> I can confirm a loss of performance in GTA 5, Dying Light, Killing Floor 2, and Fire Strike. Too lazy to bench other games, but you get the idea; this driver is a dud as well, just like the last one. Nvidia keeps dropping the ball.


Maybe they've rushed it out to improve Witcher 3 on older cards and broken something. I had problems with the previous ones as well in Dying Light: an overclock I'd had stable for a long time had to be wound back a notch, and I had to disable hardware acceleration in Firefox.

I'm sure they'll get a fix sorted sooner rather than later if this many people are having problems.

Off to work, let's see how this develops


----------



## rluker5

I've got instability issues with Witcher 3 too. The last Game Ready driver gave me good SLI; now the game is a mess. OCing GPUs is not working well on this driver at all. I do have the shader cache on a ramdisk and that might be the problem, but performance is still like 10fps behind (other than when standing still) [email protected]:2:0 on W10. (This only improves W3 at 4k in my experience.)
Other games may be better with this driver, but all my gaming time is going to W3, so I'm not going to bother with my 8.1 drive too much for a while.
If they put this fix in the Windows 10 driver, that might get rid of the occasional stutter or make it worth raising my settings (high textures, veg draw distance, medium most else right now).
Or it might make me drop a lot of OC like the 8.1 one does.
I like the idea; maybe I just haven't figured out how to make use of this new driver.


----------



## barsh90

Quote:


> Originally Posted by *rluker5*
> 
> I've got instability issues with Witcher 3 too. The last game ready driver gave me good sli. Now the game is a mess. OCing gpus not working well on this driver at all. I do have shader cache on ramdisk and that might be the problem, but performance is still like 10fps behind (other when standing still) [email protected]:2:0 on W10. (this only improves W3 at 4k in my experience)
> Other games may be better with this driver, but all my gaming time is going to W3 so I'm not going to bother with my 8.1 drive too much for a while.
> If they put this fix on the windows 10 driver that might get rid of occasional stutter or make it worth it to raise my settings (high tex, veg draw distance, med most else -right now)
> Or it might make me drop lots of oc like the 8.1 does.
> I like the idea, maybe I just haven't figured out how to make use of this new driver.


I noticed that as well. The previous driver gave me good SLI scaling in some games, while this driver has horrible SLI performance. Seems NVIDIA is fixing one thing while going backwards on another.


----------



## SONICDK

My G-SYNC did NOT work in windowed mode using the Win 7 Basic theme.

Setting the Win 7 Aero theme ON fixed it.
Just tested with Halo: Spartan Assault in the menu with in-game V-SYNC off.


----------



## John Shepard

Still haven't fixed the crashes in Chrome.....


----------



## Blackcurrent

Performance boost in The Witcher 3 is nice, but the driver gives me stutter in other games. Will revert back to 347.88 and force PhysX to CPU mode to get the Kepler fix in The Witcher 3.


----------



## Silent Scone

Resolved all my crash issues by...

Disabling hardware acceleration in Opera.

Reinstalling the driver over the previous one without DDU, using a clean install with only GFE, PhysX and the driver.

Uninstalling Precision X. Seems for whatever reason this driver causes Witcher 3 to app-crash when running PX, even with the overlay disabled.

Just in case this happens to help anyone. Or if people would like to join Donoto in his lack of fundamental understanding of system variables and pot-luck 'works on my machine' certificate of excellence.


----------



## ALT F4

Quote:


> Originally Posted by *Silent Scone*
> 
> Resolved all my crash issues by...
> 
> Disabling Hardware acceleration in Opera
> 
> Reinstalling the driver up from previous without DDU and using clean install with only GFE, PhysX and driver.
> 
> Uninstalling Precision X. Seems for whatever reason this driver causes Witcher 3 to app crash when running PX, even with overlay disabled.
> 
> Just incase this happens to help anyone. Or if people would like to join Donoto in his lack of fundamental understanding of system variables and pot luck 'works on my machine' certificate of excellence.


Haven't even thought of removing Precision X. Definitely worth a try, and thanks for sharing.


----------



## Silent Scone

There's also a flash update. Might be of no benefit but might help with some browser issues


----------



## zefs

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Don't like it? Uninstall the driver, lose those 4FPS. Or sell your 780 and get a 980 (I did). But seriously, complaining because you got better performance for FREE, seriously?


Or, nvidia could do their job and provide an actual fix for Kepler instead of laughing at customers who don't want to upgrade their cards.
I would upgrade if there were good AAA titles out there, but I'm not going to pay €550+ for a card for just the 2-3 games out now that are actually good and graphically demanding.


----------



## rluker5

Quote:


> Originally Posted by *Silent Scone*
> 
> Resolved all my crash issues by...
> 
> Disabling Hardware acceleration in Opera
> 
> Reinstalling the driver up from previous without DDU and using clean install with only GFE, PhysX and driver.
> 
> Uninstalling Precision X. Seems for whatever reason this driver causes Witcher 3 to app crash when running PX, even with overlay disabled.
> 
> Just incase this happens to help anyone. Or if people would like to join Donoto in his lack of fundamental understanding of system variables and pot luck 'works on my machine' certificate of excellence.


I have Afterburner, but I have the RivaTuner overlay up in the corner. Uninstalling that is easy to try.
Thanks.


----------



## lyx

Not a single crash so far after installing the new ones yesterday. Got MSI Afterburner 4.1.1 but don't have any overlay active.


----------



## Olivon

I'm only playing TW3 but I really appreciate these drivers.
The gain is so significant on the 780 Ti that I double-checked my in-game settings and ini files to see whether anything had changed.
Next time, bring it first, nVidia. The usual "Kepler is crippled" story is not good for your reputation.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *zefs*
> 
> Or, nvidia could do their job and provide an actual fix for kepler instead of laughing at customers that don't want to upgrade their cards.
> I would upgrade if there were good AAA titles out there, but I am not going to pay 550eu + for a card just for 2-3 games that are actually good and graphic demanding out now.


Dude, it's been proven time and time again that performance on Kepler has NOT gone down over time. It has SLOWLY gone up over time. It's just that nVidia hasn't been able to devote as many resources to squeezing more and more optimizations out of such an old card until now. They've been working on Maxwell, DirectX 12, Pascal, and deals with Elon Musk.

But no, they didn't "nerf" it and make it worse. That lie has been proven false time and time again. Your card is still slightly faster than it was when you bought it. You lost nothing. nVidia can't help it if EA and other companies make unoptimized games that bring an older card to its knees. That is called progress. It's like you buying a Chevy Chevette when the speed limit was 55mph and then, when the Feds raise the national speed limit to 75mph, you go back to Chevy and whine to them because the Chevette can't keep up at 75mph anymore.


----------



## zefs

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Dude, it's been proven time and time again that performance on Kepler has NOT gone down over time. It has SLOWLY gone up over time. It's just that nVidia hasn't been able to devote as many resources to squeezing more and more optimizations out of such an old card until now. They've been working on Maxwell, DirectX 12, Pascal, and deals with Elon Musk.
> 
> But no, they didn't "nerf" it and make it worse. That lie has been proven false time and time again. Your card is still slightly faster than it was when you bought it. You lost nothing. nVidia can't help it if EA and other companies make unoptimized games that bring an older card to its knees. That is called progress. It's like you buying a Chevy Chevette when the speed limit was 55mph and then, when the Feds raise the national speed limit to 75mph, you go back to Chevy and whine to them because the Chevette can't keep up at 75mph anymore.


I was talking about that particular game, Witcher 3. The 960 offers about the same fps, while with my OC on the 780 I should be closing in on 970 performance.
Even the performance gain from the CPU PhysX option (through NVCP) shows that something wrong is going on.

About "they've been working on other things": you do know how much time they've had since they were given all the necessary Witcher 3 source material to work with, right?
I can understand other cards having priority, but giving us a 4fps driver "fix" while there is a lot more performance being lost (and the 960's fps output in this game proves that) is silly.

It doesn't have so much to do with "they are obliged to fix this", but I expect better from them.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *zefs*
> 
> I was talking about that particular game, Witcher 3. The 960 offers about the same fps while with my OC on the 780 I should be closing on the 970 performance.
> Even the CPU Physx option (through nvcp) performance gain shows that there is something wrong going on.


My point still stands. You are blaming nVidia for one particular game that isn't made by them? How logical is that?


----------



## zefs

Quote:


> Originally Posted by *47 Knucklehead*
> 
> My point still stands. You are blaming nVidia for one particular game that isn't made by them? How logical is that?


It's an nvidia title with GameWorks features; they've been working closely with CDPR. I don't care whose fault it is. Nvidia providing a minor fix for this is a clear sign that they are to blame, so why shouldn't I?

My point is all they do lately is disappoint their customers by chasing more profit, and I think over 100 pages of complaints about this exact issue on their official forums proves that.


----------



## hwoverclkd

I wonder which game(s) they targeted for performance improvement. GTA V (and my other older games) remains the same as on previous drivers, and my Fire Strike graphics score in Win 7 went a bit lower. On Win 8.1, however, it remains the same. I'm not playing Witcher so I can't confirm.


----------



## 47 Knucklehead

Nevermind, I'm too tired to care about trying to make everyone happy.

Play the game or not, use the driver or not, upgrade your computer or not, go to AMD and see what they have to offer or not. I really don't care anymore.


----------



## Nightingale

Quote:


> Originally Posted by *47 Knucklehead*
> 
> My point still stands. You are blaming nVidia for one particular game that isn't made by them? How logical is that?


Sigh, it's not just one game. They are neglecting Kepler and there is substantial proof to back that up.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Nightingale*
> 
> Sigh, it's not just one game. They are neglecting Kepler and there is substantial proof to back that up.


So prove it.


----------



## gamervivek

Quote:


> Originally Posted by *Nightingale*
> 
> Sigh, it's not just one game. They are neglecting Kepler and there is substantial proof to back that up.


You'd be surprised at how many folks have used that 'one game' excuse on me.


----------



## NvNw

Quote:


> Originally Posted by *jdstock76*
> 
> LoL ... Sorry but this is funny.
> ^true
> Hahahaha .... operator error!
> 
> Anyone have FF ARR, ARMA3, FC4, PR, Insurgency numbers? Still stuck at work.


With the current FF ARR client on a 780 I'm getting around the same performance as before, and two in SLI is also not working as well for me as before.

But with the Heavensward benchmark, before the update I got a 17000 score with SLI 780s, and now I get 18800 with the same setup.

Hope the SLI performance sticks in the final DX11 client and it's not only the benchmark that works okay...


----------



## KenLautner

Almost the same performance / 2-3 fps gain maybe, with the new drivers on Witcher 3 and GTA 5.

GTX 760.


----------



## bluewr

Quote:


> Originally Posted by *47 Knucklehead*
> 
> So prove it.


There have been multiple posts and topics here and on the official forum that prove it.

Choosing to ignore them is not helping your argument.


----------



## Woundingchaney

Quote:


> Originally Posted by *gamervivek*
> 
> You'd be surprised at how many folks have said that 'one game' excuse to me.


I think one of the issues the 700 series has is handling high levels of tessellation and DirectCompute. These are known weaknesses of the architecture. The updated 900 series addressed these very issues, and even the 200 series from AMD fared better (on top of having more VRAM and a wider bus).

With modern games relying more and more on tessellation and DirectCompute, there is only so much optimization that can be done at the software level without hardware support. I suppose one could blame Nvidia for the hardware design of the chips, but it's more an issue of where the software industry is going than any deliberate choice by Nvidia to "bork" the cards.

I very much understand how people can be upset about their hardware not having the best shelf life, but to suggest this is due to Nvidia intentionally crippling their own performance doesn't make sense.
Quote:


> There have been multiple posts and topics here and on the official forum that prove it.
> 
> Choosing to ignore them is not helping your argument.


I'm not aware of performance actually going down; the hardware hasn't scaled as well over time, perhaps. There is a dramatic difference between these two realities. The hardware hasn't scaled as well because it doesn't have the ability to.


----------



## Slink3Slyde

Quote:


> Originally Posted by *47 Knucklehead*
> 
> So prove it.


Following up on a Reddit thread that showed something similar, I took data from TechPowerUp and Techspot reviews, as well as looking at reviews from Guru3D and Sweclockers to check the results where they had benched the same games. There's a thread called 'Are Nvidia Neglecting driver optimisations since Maxwell' if you care to look.

IIRC, in games released before Maxwell the reference GTX 780 is on average 10% behind the 970. In games released after Maxwell's release it's 20% on average. The reference 780 Ti was about 8% ahead of the 970; in games released after, it's almost the same performance. Reference clocks. I do understand.

I'm not rabid about this; it's understandable from a business perspective not to put a lot of resources into a product that is legacy and not making you any money any more. It's also possible that the older games are not optimised for Maxwell, and that could explain the difference. The fact that AMD's cards have improved relative to all Nvidia cards is explained by the improved CPU overhead in their drivers and the fact they haven't changed their architecture in basically 3 years.

However, if people are now reporting improvements in The Witcher 3 on Kepler with the new driver, I would suggest that maybe they can do better for Kepler in most new games, and that it is not 'tapped out' or the result of developers creating games that run badly on Kepler. Not too many people were interested in this before the Witcher benches came out.

I'm sure I won't buy an AMD card when they have a new generation due either, because I would expect something similar. It's just good to know it can happen, for future reference. *Overclocked Kepler cards are still good gaming cards*; you'll just miss out on the 5-10% FPS of optimisations that Nvidia can't justify spending time on when their driver teams are working on improving performance for Maxwell.

I'm not sure if this happened before; 580 to 680 was such a jump in performance it probably wasn't as noticeable. Pascal may be the same.
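For anyone curious how those "780 is X% behind the 970 on average" figures are typically derived from review data, here's a rough sketch. The FPS numbers below are purely hypothetical (not taken from any review), and `relative_performance` is just a name I made up: you take each game's FPS for both cards, form per-game ratios, and average the ratios with a geometric mean so no single title dominates.

```python
# Sketch of averaging relative GPU performance across games.
# All FPS values below are hypothetical, chosen only to illustrate
# the ~10% (older games) vs ~20% (newer games) gaps discussed above.
from math import prod

def relative_performance(fps_a, fps_b):
    """Geometric mean of per-game fps_a/fps_b ratios (same game order)."""
    ratios = [a / b for a, b in zip(fps_a, fps_b)]
    return prod(ratios) ** (1 / len(ratios))

# Hypothetical pre-Maxwell titles: 780 trails the 970 by roughly 10%.
gtx780_old = [55.0, 72.0, 48.0]
gtx970_old = [61.0, 80.0, 53.5]

# Hypothetical post-Maxwell titles: the gap widens toward 20%.
gtx780_new = [40.0, 33.0, 52.0]
gtx970_new = [50.0, 41.0, 65.0]

old_gap = 1 - relative_performance(gtx780_old, gtx970_old)
new_gap = 1 - relative_performance(gtx780_new, gtx970_new)
print(f"780 behind 970, older games: {old_gap:.0%}")
print(f"780 behind 970, newer games: {new_gap:.0%}")
```

The geometric mean is the usual choice here (TechPowerUp's relative-performance charts work on ratios for the same reason): an arithmetic mean of raw FPS would let one high-framerate game swamp the average.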


----------



## 47 Knucklehead

Quote:


> Originally Posted by *bluewr*
> 
> There have been multiple posts and topics here and on the official forum that prove it.
> 
> Choosing to ignore them is not helping your argument.


So link to them.

Quote:


> Originally Posted by *Woundingchaney*
> 
> Im not aware of performance actually going down, the hardware hasn't scaled as well over time perhaps. There is a dramatic difference between these two realities. The hardware hasn't scaled as well because it doesn't have the ability to.


EXACTLY!

Not going up as fast as other cards is VASTLY DIFFERENT from "going down".


----------



## Slink3Slyde

Quote:


> Originally Posted by *Woundingchaney*
> 
> I think one of the issues is that the 700 series has is handling high levels of tessellation and direct compute. This is known to be weaknesses for the architecture. The updated 900 series addressed these very issues and even the 200 series from AMD faired better (on top of having a higher level of vram and a larger bus size).
> 
> With modern games relying more and more on tessellation and direct compute there is only so much optimization that can be done on the software level without the presence of hardware support. I suppose one could blame Nvidia for the hardware design of the chips, buts its more of an issue with where the software industry is going rather than any relevant choice for Nvidia to "bork" the cards.
> 
> I very much understand how people can be upset about their hardware not having the best shelf life, but to suggest this is due to Nvidia intentionally crippling their own performance doesn't make sense.
> Im not aware of performance actually going down, the hardware hasn't scaled as well over time perhaps. There is a dramatic difference between these two realities. The hardware hasn't scaled as well because it doesn't have the ability to.


You're correct, performance hasn't gone down, and it's not a deliberate nerf. However, I wonder how the AMD cards' performance is not being affected by the tessellation in new games. They've actually improved a lot relative to Kepler despite worse tess performance.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Slink3Slyde*
> 
> You're correct, performance hasn't gone down, and it's not a deliberate nerf. However, I wonder how the AMD cards' performance is not being affected by the tessellation in new games. They've actually improved a lot relative to Kepler despite worse tess performance.


Memory bus comes to mind as a possible reason why. nVidia has been on 256-bit and 384-bit buses for a long time now, while AMD has gone to 512-bit and soon even wider with HBM. Once nVidia comes out with HBM gen 2, it will double even the upcoming AMD bus.

Also, the 900 series has a vastly different memory system than the 700 or 600 series of cards, so I tend to think that a good deal of the issue with the older cards is the memory system. The older systems just aren't as dynamic as the newer ones, even at the same narrow (compared to AMD) bus width.


----------



## Woundingchaney

Quote:


> Originally Posted by *Slink3Slyde*
> 
> You're correct, performance hasn't gone down, and it's not a deliberate nerf. However, I wonder how the AMD cards' performance is not being affected by the tessellation in new games. They've actually improved a lot relative to Kepler despite worse tess performance.


The AMD architecture has advantages over Kepler in other regards, though. Its DirectCompute performance is impressive, and it has more available VRAM and a wider bus. For instance, when one starts pushing resolution and other VRAM-intensive aspects of software, the 200 series really starts to show its worth, even in comparison with the initial 900 series releases.

It's also important to note that at the driver level there is/was more room for optimization in the 200 series than in Nvidia's 700 series.


----------



## Slink3Slyde

The 384-bit 280X is not too far off the 384-bit 780 in newer games?

http://www.guru3d.com/articles_pages/dragon_age_inquisition_vga_graphics_performance_benchmark_review,8.html

http://www.techspot.com/review/991-gta-5-pc-benchmarks/page3.html

https://www.techpowerup.com/reviews/ASUS/GTX_960_STRIX_OC/21.html

I should say I fully understand that AMD really had to sort their drivers out at some point. I've already said that they are still using basically the same arch as 3 years ago, with more cores and tweaks. They are making money there, so they have reason to devote resources.

Nvidia could keep spending time pushing 5-10% more performance out of Kepler in new games, but it doesn't make sense for them to. If people don't want to believe that Nvidia isn't tirelessly devoted to maximising their old architectures for every % when they aren't making money on it, then so be it.


----------



## Silent Scone

Quote:


> Originally Posted by *Woundingchaney*
> 
> I think one of the issues is that the 700 series has is handling high levels of tessellation and direct compute. This is known to be weaknesses for the architecture. The updated 900 series addressed these very issues and even the 200 series from AMD faired better (on top of having a higher level of vram and a larger bus size).
> 
> With modern games relying more and more on tessellation and direct compute there is only so much optimization that can be done on the software level without the presence of hardware support. I suppose one could blame Nvidia for the hardware design of the chips, buts its more of an issue with where the software industry is going rather than any relevant choice for Nvidia to "bork" the cards.
> 
> I very much understand how people can be upset about their hardware not having the best shelf life, but to suggest this is due to Nvidia intentionally crippling their own performance doesn't make sense.
> Im not aware of performance actually going down, the hardware hasn't scaled as well over time perhaps. There is a dramatic difference between these two realities. The hardware hasn't scaled as well because it doesn't have the ability to.


The performance squeezed in this driver for Kepler is within reasonable bounds of what one could expect when forced to 'knuckle down'. Being forthright, I think people who believe NVIDIA would intentionally neuter performance on an old architecture are idiots. Tessellation is very much driven by the source and geometry data and ultimately once it's come out of the DX pipeline it's down to how well the GPU hardware copes with it. The problem with HairWorks is it uses geometry which isn't ideal for slower architecture in the first place.


----------



## Creator

Quote:


> Originally Posted by *Silent Scone*
> 
> The performance squeezed in this driver for Kepler is within reasonable bounds of what one could expect when forced to 'knuckle down'. Being forthright, I think people who believe NVIDIA would intentionally neuter performance on an old architecture are idiots. Tessellation is very much driven by the source and geometry data and ultimately once it's come out of the DX pipeline it's down to how well the GPU hardware copes with it. The problem with HairWorks is it uses geometry which isn't ideal for slower architecture in the first place.


First there were the rumors, then the TW3 happened. And now we have this performance boosting driver. This shows Kepler has been neglected by NV, and NV likely would have done nothing if there were no mass outcry. You think someone over at NV would have asked why the 780 was performing slower than the 960 while they were optimizing, but that never happened because they didn't have anyone on Kepler.

It's not that NV is neutering; it's just disturbing that they seemingly dropped support for the architecture. I'm not confident they'll do anything more with Kepler, despite the latest driver release, until the next disastrous set of performance results like TW3's comes in. But at least we know NV will respond. That's still a plus.

And finally, nothing is ever maxed out for new games. I'm not directing this at you, but at those who keep saying Kepler is maxed out. This "maxed out" talk really needs to be put to rest, because it just gives both AMD and NV an excuse to stop supporting older architectures. Even Fermi could be optimized to run better than it has in the past few releases. But there's not much point, since very few still use Fermi, unlike Kepler...


----------



## Silent Scone

I'm just stating fundamental truths; I don't care why performance was neglected. The gains in this driver are shrug-worthy anyway, frankly. If I were cross with NVIDIA before, I wouldn't exactly be jumping for joy now.


----------



## Seven7h

Quote:


> Originally Posted by *Mad Pistol*
> 
> I've been saying this for a while. Kepler is a 3-year-old architecture. It has been optimized a lot over the last 3 years. I highly doubt there is much more for Kepler to give that hasn't already been given.
> 
> What I do want to see, though, is continuing support and optimization for future titles (like Witcher 3) so that the game performs the best it possibly can. I understand that my GTX 780 is not going to be as fast as a 970 or above. However, I NEVER want to see it be slower than a GTX 960... a card with less than 1/2 the CUDA cores and 1/3 the memory bus.
> 
> It sounds like Nvidia fixed Witcher 3 performance on Kepler, so all is well.


A graphics architecture is made of more than just CUDA cores. Maxwell is simply much better at tessellation: fewer architectural potholes for Witcher 3 to hit. It's a fact of life, not a conspiracy.


----------



## Seven7h

Quote:


> Originally Posted by *zefs*
> 
> It's an nvidia title with GameWorks features; they've been working closely with CDPR. I don't care whose fault it is. Nvidia providing a minor fix for this is a clear sign that they are to blame, so why shouldn't I?
> 
> My point is that all they do lately is disappoint their customers by trying to earn more profit, and I think over 100 pages of complaints about this exact issue on their official forums proves that.


Whose fault is the disappointment if the expectations were unreasonable to begin with?

A bunch of kids on Reddit with no engineering backgrounds complaining about some conspiracy theory with no factual basis doesn't mean NVIDIA is disappointing all their customers.


----------



## Silent Scone

Quote:


> Originally Posted by *Seven7h*
> 
> Whose fault is the disappointment if the expectations were unreasonable to begin with?
> 
> *A bunch of kids on Reddit with no engineering backgrounds complaining about some conspiracy theory with no factual basis* doesn't mean NVIDIA is disappointing all their customers.


To be honest, a lot of these vendor debates could be summarised by the highlighted part. Sadly the truth hurts, as Reddit is rife with exactly that.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Seven7h*
> 
> A bunch of kids on Reddit with no engineering backgrounds complaining about some conspiracy theory with no factual basis doesn't mean NVIDIA is disappointing all their customers.


Exactly. Most of the threads on Reddit are so full of lies it isn't even funny.


----------



## Mumbles37

Quote:


> Originally Posted by *47 Knucklehead*
> 
> So link to them.


If you wanted proof that there were issues, all you had to do was google "nVidia Kepler problem." Five of the first seven results prove that there really was an issue and nVidia acknowledged it. Here's just one of them: http://www.eteknix.com/witcher-3-kepler-problems-will-fixed-new-driver-update/

If people hadn't complained and notified nVidia, would it ever have been fixed? Who knows. But the people who were not just whining in general (many will whine whenever a new card comes out, but that's nothing new) had actually found an issue with Kepler cards specifically in Witcher 3, which this driver is meant to fix.

So there was an issue. It is supposed to be resolved for win 7/8 users with this driver.


----------



## provost

The whole architecture-vs-software-vs-hardware discussion is a circular argument. Let's look at this from the point of view of Nvidia's business model. As many have stated, Nvidia has a dominant share of the discrete GPU market, which means the low-hanging fruit of taking customers away from your competitor is gone. Nvidia is now in the business of selling performance, and pricing the "experience" it delivers based on that performance. This performance-based pricing strategy relies heavily on software rather than hardware: multiple SKUs, various boost-control price points, and other "performance levers" such as G-Sync, GameWorks, etc.
To say that Kepler cannot be further optimized is another way of saying that Kepler no longer fits Nvidia's overall strategy of charging customers for software-based performance rather than for the hardware they paid for. With no low-hanging fruit left in market share, the only way to grow the top line is to reduce hardware-development capex and manage performance-tiered pricing for more frequent product turnover.
Kepler was no more obsolete a few months ago than it is today, and Maxwell will be no more obsolete a few months after Pascal drops, unless performance is being managed to maximize the return on Pascal by turning over the existing user base.
The problem is not necessarily why Nvidia stopped optimizing Kepler, but rather the longevity of cards under this new priced-for-performance regime that Nvidia has introduced via software controls. With Apple, people know a new iPhone model will be released every year, but they don't lose support for the old iPhone, and Apple even throws in an incentive to upgrade. The handset market has plenty of competition; the discrete GPU market, not so much.
If Nvidia had adequately disclosed the optimization-support period for its cards, it is highly unlikely it could have charged $1000 for a card (the Titan Black, for example) that it knew would lose optimization support within a few months of the 980's release.
The rules of the industry have changed with Nvidia's dominant market share, and therefore caveat emptor, buyer beware, matters so much more than it did a few years ago when Nvidia and AMD were going toe to toe and keeping some semblance of a competitive, consumer-friendly market, even in a duopoly.

Edit: And it's funny that people call the performance from this driver "free" for Kepler..lol. It's not "free" performance after months of neglect; all GK110 owners paid up front for an adequate period of optimization support when they purchased their Titans, 780s, 780 Tis, Titan Blacks and Titan Zs... On a separate note, some kinks still need to be worked out, as this driver isn't stable.


----------



## Seven7h

Quote:


> Originally Posted by *Mumbles37*
> 
> If you wanted proof that there were issues, all you had to do was google "nVidia Kepler problem." Five of the first seven results prove that there really was an issue and nVidia acknowledged it. Here's just one of them: http://www.eteknix.com/witcher-3-kepler-problems-will-fixed-new-driver-update/
> 
> If people hadn't complained and notified nVidia, would it ever have been fixed? Who knows. But the people who were not just whining in general (many will whine whenever a new card comes out, but that's nothing new) had actually found an issue with Kepler cards specifically in Witcher 3, which this driver is meant to fix.
> 
> So there was an issue. It is supposed to be resolved for win 7/8 users with this driver.


An additional 15% is a "fix"?

People convinced of the conspiracy will never be satisfied. It's a form of mental cancer. If you believe there was a conspiracy and NVIDIA does nothing, then it's evil. If it offers up 15% performance, then that's clear proof they "intentionally gimped it to begin with", and they're evil, and "got caught", and if it "wasn't for those meddling kids, they would've gotten away with it too!!!"

Lack of monitoring and analysis of performance on last-generation hardware does not constitute a conspiracy. It just means other things were prioritized. And honestly, even after the 15% improvement, most will still think there's intentional gimping. The architecture is at its limits with this game. Accept it and move on.


----------



## Slink3Slyde

Quote:


> Originally Posted by *Seven7h*
> 
> Whos fault is the disappointment if the expectations were unreasonable to begin with?
> 
> A bunch of kids on Reddit with no engineering backgrounds complaining about some conspiracy theory with no factual basis doesn't mean NVIDIA is disappointing all their customers.


Sigh. It's not a 'nerf' or a 'gimp'. But I believe there is a bit less performance from Kepler these days than there potentially could be, because they aren't really trying anymore. No grand conspiracy; they've probably always done it. Except this time the two generations weren't all that far apart in performance initially, and before, they weren't charging $1000+ for a card until a month before moving on to the next gen.

The thread and numbers I referenced earlier were my own, with data I took myself from the TPU and Techspot reviews. If benchmark numbers from two different popular websites aren't factual, well....

Just to be clear.


----------



## Mumbles37

Quote:


> Originally Posted by *Seven7h*
> 
> An additional 15% is a "fix"?
> 
> People convinced of the conspiracy will never be satisfied. It's a form of mental cancer. If you believe there was a conspiracy and NVIDIA does nothing, then it's evil. If it offers up 15% performance, then that's clear proof they "intentionally gimped it to begin with", and they're evil, and "got caught", and if it "wasn't for those meddling kids, they would've gotten away with it too!!!"
> 
> Lack of monitoring and analysis of performance on last-generation hardware does not constitute a conspiracy. It just means other things were prioritized. And honestly, even after the 15% improvement, most will still think there's intentional gimping. The architecture is at its limits with this game. Accept it and move on.


Really hope I don't have mental cancer.

Did not think I was representing either view. If it came across that way, I hereby officially deny being for or against nVidia. Was simply stating *the facts: there was/is an issue with Kepler. This driver is supposed to fix it.* Somebody asked for proof so I proved it. This thread, incidentally, contains all of the official information from nVidia: https://forums.geforce.com/default/topic/833016/geforce-700-600-series/gtx-780-possible-fail-as-performance-in-the-witcher-3-wild-hunt-/1/

I do hope you were not referring to me when using the words "conspiracy," "evil," or "cancer." I didn't use those terms; actually, I didn't even use the word "fix." Honestly, I don't know why you took the time to quote me, as 1. I don't see the connection between my post and what you wrote, and 2. I'm not quite sure what your point was.


----------



## 47 Knucklehead

Your "proof" has been invalidated by your own link ...

https://forums.geforce.com/default/topic/833016/geforce-700-600-series/gtx-780-possible-fail-as-performance-in-the-witcher-3-wild-hunt-/post/4532052/#4532052

Two drivers didn't support as many older cards as some people expected. The previous driver and the later driver did, and guess what? Witcher 3 works a LOT better on 7 series cards.
Quote:


> 7series GPU owners, install 347.88 driver and you will get all your performance back.
> DON'T!!!!! install the newer 350.12 and 352.86 drivers, because they don't support GTX760/GTX760Ti, GTX770/GTX770Ti, GTX780/GTX780Ti
> Those two drivers will significantly cripple your 7series GPU!!!


Welcome to the universe of limited resources and having to prioritize things.


----------



## Mumbles37

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Your "proof" has been invalidate by your own link ...
> 
> https://forums.geforce.com/default/topic/833016/geforce-700-600-series/gtx-780-possible-fail-as-performance-in-the-witcher-3-wild-hunt-/post/4532052/#4532052
> 
> One driver didn't support as many older cards as some people expected. Subsequent drivers did, and fixed it.


Wait wait wait... what? ManuelG, the nVidia rep, acknowledged in the thread that there was a Kepler problem, that the team was looking into it, then that they made a fix, and then: we have 353.06 driver, with the official text on the driver including: *Implements optimizations and bug fixes that provide increased performance for Kepler-based GPUs.*

That means bug fixes, because they wrote "bug fixes." As in there was a bug (or more than one).

I don't see where anything was invalidated. I want to be clear, what I wrote and am writing is neither a defense of the "conspiracy" side nor the nVidia "defenders" but rather just a statement of the facts. That is all.

By the way, I don't believe you read much of what I linked. You referred to post #19 out of 1,552 posts; I don't see what you were trying to say, and a lot can happen in the 1,533 posts after it. The post you referenced pointed to a misprint in the 352.86 driver documentation where some Kepler GPUs were not listed.


----------



## 47 Knucklehead

People didn't bother to read what cards the driver supported and when they installed it on an unsupported card they didn't get the performance they expected ... and when they installed either an older or newer driver that DID support their card, the performance came back?

You don't say!


----------



## zefs

Quote:


> Originally Posted by *47 Knucklehead*
> 
> People didn't bother to read what cards the driver supported and when they installed it on an unsupported card they didn't get the performance they expected ... and when they installed either an older or newer driver that DID support their card, the performance came back?
> 
> You don't say!


What are you talking about? I think you are confused.

This is from the driver list of changes:

_Implements optimizations and bug fixes that provide increased performance for
Kepler-based GPUs._


----------



## Mumbles37

Quote:


> Originally Posted by *47 Knucklehead*
> 
> People didn't bother to read what cards the driver supported and when they installed it on an unsupported card they didn't get the performance they expected ... and when they installed either an older or newer driver that DID support their card, the performance came back?
> 
> You don't say!


I agree, you seem confused.

That was a misprint, and nobody (other than the people who complain about anything) cared about it.

That has nothing to do with anything. nVidia has publicly acknowledged that there was an issue (or more than one issue) with Kepler-based GPUs in the latest drivers. That bug(s) was limiting the performance of Kepler GPUs especially in Witcher 3.

I don't see what is so hard to understand about all this. It happened. It came from nVidia.

The only takeaway from this for me is that you will argue no matter what I or anyone says. And if that's the case I regret getting involved.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *zefs*
> 
> What are you talking about? I think you are confused.
> 
> This is from the driver list of changes:
> 
> _Implements optimizations and bug fixes that provide increased performance for
> Kepler-based GPUs._


Try reading the link I posted ...
Quote:


> Here you have proof that GTX780 is NOT supported. Go to page 34 (supported NVIDIA Desktop Products)and there you will find following GPU supported list:
> GeForce GTX TITAN Z
> GeForce GTX TITAN X
> GeForce GTX 980
> GeForce GTX 970
> GeForce GTX 960
> GeForce GTX 750 Ti
> GeForce GTX 750
> GeForce GTX 560 Ti
> GeForce GT 740
> GeForce GT 730
> GeForce GT 720
> GeForce GT 610
> GeForce 210


NOT ALL Kepler GPUs ... ONLY THE SUPPORTED ONES.


----------



## Mumbles37

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Try reading the link I posted ...
> NOT ALL Kepler GPU's ... ONLY THE SUPPORTED ONES.


Please, before saying anything else, read this:

*That was a misprint. Officially acknowledged as a misprint by nVidia and then fixed.*

If you need further proof, please check page 33, Table 3.1 in the revised 352.86 driver documentation: http://us.download.nvidia.com/Windows/352.86/352.86-win8-win7-winvista-desktop-release-notes.pdf

Consumer Products Notes
GeForce GTX TITAN X
GeForce GTX TITAN Z
GeForce GTX TITAN Black
GeForce GTX TITAN
GeForce GTX 980
GeForce GTX 970
GeForce GTX 960
GeForce GTX 780 Ti
GeForce GTX 780
GeForce GTX 770
GeForce GTX 760
GeForce GTX 760 Ti (OEM)
GeForce GTX 750 Ti
GeForce GTX 750
GeForce GTX 745
GeForce GT 740
GeForce GT 730
GeForce GT 720
GeForce GTX 690
GeForce GTX 680

Again, nobody is arguing with you except yourself. Nobody cares about a misprint.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Mumbles37*
> 
> Please, before saying anything else, read this:
> 
> *That was a misprint. Officially acknowledged as a misprint by nVidia and then fixed.*


Fine. As you said, it was a misprint, the driver has been fixed, and all is right with the world again. Witcher 3 works with Kepler cards two weeks after the game's release.

What's the problem again?


----------



## Mumbles37

Quote:


> Originally Posted by *47 Knucklehead*
> 
> What's the problem again?


Exactly.

That's what I was saying in my first post.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Mumbles37*
> 
> Exactly.
> 
> That's what I was saying in my first post.


Even still, that is just one game, one that was released 2 weeks earlier. I'm still wondering where this "nVidia intentionally nerfed their driver for the Kepler so that people would buy Maxwell" garbage came from. But I guess conspiracy theorists like others (not you) on this thread (or Reddit) can't really be taken seriously.


----------



## Mumbles37

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Even still, that is just one game, one that was released 2 weeks earlier. I'm still wondering where this "nVidia intentionally nerfed their driver for the Kepler so that people would buy Maxwell" garbage came from. But I guess conspiracy theorists like others (not you) on this thread (or Reddit) can't really be taken seriously.


I totally agree, some people just look for (or invent, maybe?) reasons to support their agendas. There is no proof that nVidia did anything intentionally.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Mumbles37*
> 
> I totally agree, some people just look for (or invent, maybe?) reasons to support their agendas. *There is no proof that nVidia did anything intentionally.*


That, ultimately, was my entire point. Hence why I was asking for proof.


----------



## NvNw

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Even still, that is just one game, one that was released 2 weeks earlier. I'm still wondering where this "nVidia intentionally nerfed their driver for the Kepler so that people would buy Maxwell" garbage came from. But I guess conspiracy theorists like others (not you) on this thread (or Reddit) can't really be taken seriously.


What nvidia fixed is not The Witcher 3's performance on Kepler alone; it's supposed to fix Kepler's performance when running GameWorks code, and that should help any game that uses GameWorks, like TW3 and Project CARS. The last FF ARR benchmark also seems to use some GameWorks code (or at least it has the logo on it), and I saw around a 10% improvement (using two 780s in SLI).

What I don't like about nvidia is that they're focusing driver development only on the latest architecture and not doing proper testing and optimization on every card they "support".


----------



## Dinky Dino

Has anyone found that with the latest 353.06 driver their voltage is limited to 1.2 V even if you flash the BIOS? I went back to 335.23 and found it was properly using 1.212 V again. I have a 760. Or was this happening before this driver? Overclocking used to be so easy, and now it's giving me a huge headache with all these limitations.


----------



## NvNw

Yesterday I set 1.3 V on my 780s using just MSI Afterburner and a skyn3t ROM.


----------



## provost

The line between "gimping" and "not optimizing" is very thin, a matter of splitting hairs. You have to look at cause and effect to understand the "why":

Cause:
Not optimizing drivers for cards released anywhere from five months ago (Titan Z) to 1.5 years ago (Titan), and everything in between. GK110 cards start to suffer in newer games.

Rationale Provided:
Customers are supposed to understand that Nvidia has other priorities, such as Tegra, so tough luck if you bought a Titan Black, a Titan Z, or any other GK110 card.

Effect:
Either GK110 customers upgrade to get the same performance, or they live with continued degradation of performance even relative to AMD's old architecture.

Rationale:
Given the relative lack of competition, and the fact that people buying GK110 are the "stickier" type of customer (particularly if they also bought an Nvidia G-Sync monitor), it is likely these customers will upgrade to another Nvidia card. If it weren't for all the complaining, this would be perfect from Nvidia's perspective (and if AMD never released a worthwhile card again, even better).

Summary:
No one can accurately guess the motivation, or whether Nvidia planned this obsolescence while selling $1k cards months before stopping optimization (the reason doesn't matter; from the customer's perspective, Nvidia's corporate priorities are not the customer's problem). However, one can predict the upside or downside (the "Effect") for Nvidia from an action or inaction (the "Cause"), which in this case is the very short period of driver optimization for GK110 cards.


----------



## KenLautner

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Even still, that is just one game, one that was released 2 weeks earlier. I'm still wondering where this "nVidia intentionally nerfed their driver for the Kepler so that people would buy Maxwell" garbage came from. But I guess conspiracy theorists like others (not you) on this thread (or Reddit) can't really be taken seriously.


It's not just one game, my friend. They said the driver fix was for Kepler in general, across a range of games.
No one can prove they intentionally nerfed the drivers, nor can anyone prove the problem was unintentional. The fact remains that they neglected Kepler for a while, until all this conspiracy talk started coming up and grabbed their attention.


----------



## LaBestiaHumana

Unfortunately this driver keeps crashing. I had to roll back to 350.


----------



## Silent Scone

Just plugged in the 40" 4K VA Philips; the driver is still working very well after disabling browser hardware acceleration and Precision X.


----------



## Woundingchaney

Quote:


> Originally Posted by *NvNw*
> 
> What nvidia fixed is not the performance of The Witcher 3 with Kepler alone, it's supposed to fix the performance of Kepler running the Gameworks code, and that should fix any game that use Gameworks like TW3 and ProyectCars, FF ARR last benchmark also seems to use some Gameworks code (or at least it have the logo on it) and i saw around a 10% improvement (using 2 780 in SLI).
> 
> What I'm not liking about nvidia is that they only are focusing the development of the drivers on the last architecture and not doing a proper testing and optimization on every card they "support".


I'm wondering if the optimizations are anything more than just limiting the settings associated with GameWorks titles. Has there been anything to suggest a lowering of visual fidelity? They could simply be limiting things like tessellation for features such as HairWorks when run on the 700 series.
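If a driver did take that route (purely hypothetical; nothing in the release notes says it does), the crudest form would be clamping the tessellation factor an application requests before it reaches the hardware, similar to the tessellation slider AMD exposes in its control panel. A toy sketch, with the function name and the default cap both being illustrative assumptions:

```python
def clamp_tess_factor(requested: float, cap: float = 16.0) -> float:
    """Hypothetical driver-side override: never forward a tessellation
    factor larger than `cap` to the tessellator. The name and the
    default cap of 16.0 are assumptions, not anything NVIDIA documented."""
    return min(requested, cap)

print(clamp_tess_factor(64.0))  # an aggressive request gets capped to 16.0
print(clamp_tess_factor(8.0))   # modest requests pass through unchanged: 8.0
```

Such a clamp would trade some visual fidelity for frame rate, which is exactly why the posts below compare before/after screenshots.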


----------



## Menta

When will Nvidia fix these TDR issues? It's been going on for years and years.


----------



## bluewr

Quote:


> Originally Posted by *Menta*
> 
> When will Nvidia fix these TDR issues, its being going ON for years and years


Probably when a gaming news site makes an article about it?


----------



## KenLautner

Quote:


> Originally Posted by *Woundingchaney*
> 
> I'm wondering if the optimizations are anything more than just limiting the settings associated with GameWorks titles. Has there been anything to suggest a lowering of visual fidelity? They could simply be limiting things like tessellation for features such as HairWorks when run on the 700 series.


At first I thought the same, because when I launched Witcher 3 after the driver update, the first effect was gone. I mean, I was casting the Igni sign but no flames showed up, just the red lines on the ground. It worked after a bit, though. Then I went to the same places I'd been before the update and compared: it looks the same. So no downgrade in graphics.


----------



## hyp36rmax

I'll have to try this with my GTX 780 Ti and i7 4770K when I get home.


----------



## neoroy

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Unfortunately this driver keeps crashing. I had to roll back to 350.


Same happens here too, but so far it only crashes in games; browsing still doesn't crash. Playing AC Unity with a GTX 980 Strix at stock, it crashed after 5 minutes, and my friend with an MSI GTX 970 Gaming said GTA 5 froze every 5 minutes. This is horrible, Nvidia.
Looks like 350.12 WHQL is the best driver for now.


----------



## OkanG

I've used this since it was released, and I'm still crashing at the desktop. Not only that, but the stuttering in GTA 5 has come back, which was the only thing 352.xx had going for it.

No idea what NVIDIA is actually thinking with these drivers, they've really dropped the ball lately.


----------



## rluker5

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Unfortunately this driver keeps crashing. I had to roll back to 350.


Got RivaTuner Statistics installed? I uninstalled it and my goofy setup was fixed; it now works great.
Just an idea.


----------



## 47 Knucklehead

Question for everyone who is crashing during browsing ... are you turning off the browser's hardware acceleration mode?


----------



## rluker5

Quote:


> Originally Posted by *Silent Scone*
> 
> Resolved all my crash issues by...
> 
> Disabling Hardware acceleration in Opera
> 
> Reinstalling the driver up from previous without DDU and using clean install with only GFE, PhysX and driver.
> 
> Uninstalling Precision X. Seems for whatever reason this driver causes Witcher 3 to app crash when running PX, even with overlay disabled.
> 
> Just in case this happens to help anyone. Or if people would like to join Donoto in his lack of fundamental understanding of system variables and pot-luck 'works on my machine' certificate of excellence.


I uninstalled RivaTuner Statistics because of this suggestion and now the game works great; stutter is greatly reduced as well.
It's easy to see if the framerate drops below 55 fps on this TV because of how crisply it refreshes.
Maybe I'll try Fraps for the first time, maybe not. Playing the game matters more to me right now than statistics about it. Maybe once I've beaten it.


----------



## maarten12100

I guess they realized that abandoning the old arch (as in, not pushing improvements to it) had a bad PR effect. Releasing this is better than pushing people to the newer platform, and good for those with Kepler cards.


----------



## y2kcamaross

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Question for everyone who is crashing during browsing ... are you turning off the browser's hardware acceleration mode?


I'm not, but then again it doesn't really bother me much: just a black screen for a second every couple of hours when the driver crashes.


----------



## hyp36rmax

I tried both The Witcher 3 and Project CARS, and both were pretty smooth on an i7 4770K and a GTX 780 Ti @ 1080p. I'll post my settings later. No driver crashes on YouTube videos or the like. I believe some of you have software conflicts.


----------



## djriful

I had one crash so far, right after I log back into my Windows after 10min idle screensaver auto lock.


----------



## Lindwurm

After having no issues with any of the previous drivers, my GTX 770 started getting driver crashes in TW3 with 353.06. After the first one I rolled my already-modest overclock back, and it crashed again. Rolled back to 347.88.


----------



## decimator

Installed 353.06 a couple days ago and my SLi 780 Ti Classy's are running fine. Played AC: Unity for about an hour last night without issues. Internet browsing and YouTube playback with hardware acceleration enabled are also fine. I don't have TW3 but plan on getting it soon. I should note that my system is pretty bare bones -- Windows 7, drivers, Microsoft Office, Steam, and games. There aren't a lot of background applications running, so there's basically no chance of any software conflicts.


----------



## barsh90

Quote:


> Originally Posted by *hyp36rmax*
> 
> I tried both The Witcher 3 and Project cars as they were both pretty smooth on an i7 4770k and a GTX 780Ti @ 1080P. I'll post up my settings later. No Driver crashing on YouTube videos of the like. I believe some of you have software conflicts.


Quote:


> Originally Posted by *decimator*
> 
> Installed 353.06 a couple days ago and my SLi 780 Ti Classy's are running fine. Played AC: Unity for about an hour last night without issues. Internet browsing and YouTube playback with hardware acceleration enabled are also fine. I don't have TW3 but plan on getting it soon. I should note that my system is pretty bare bones -- Windows 7, drivers, Microsoft Office, Steam, and games. There aren't a lot of background applications running, so there's basically no chance of any software conflicts.


Congratz..


----------



## decimator

Quote:


> Originally Posted by *barsh90*
> 
> Congratz..


You act like I'm bragging or something. People are having issues with this driver and I outlined why I'm *not* having issues with this driver (most of the evidence points toward software conflicts). I don't see how what I said doesn't add to the conversation...

You're also responding to 2 people with Kepler cards, not Maxwell cards like you have. This driver looks like it was focused on getting Kepler running better, not necessarily on improving Maxwell performance, as well. nVidia probably rushed it to placate all the "nVidia is gimping Kepler" noobs and didn't really put enough effort into Maxwell.


----------



## barsh90

Quote:


> Originally Posted by *decimator*
> 
> You act like I'm bragging or something. People are having issues with this driver and I outlined why I'm *not* having issues with this driver (most of the evidence points toward software conflicts). I don't see how what I said doesn't add to the conversation...
> 
> You're also responding to 2 people with Kepler cards, not Maxwell cards like you have.


Didn't mean it to come out like that, just woke up. Did you turn off hardware acceleration by chance? I turned it off and haven't had any issues, other than choppy Chrome performance, until Nvidia fixes this whole fiasco.


----------



## hyp36rmax

Quote:


> Originally Posted by *decimator*
> 
> You act like I'm bragging or something. People are having issues with this driver and I outlined why I'm *not* having issues with this driver (most of the evidence points toward software conflicts). I don't see how what I said doesn't add to the conversation...
> 
> You're also responding to 2 people with Kepler cards, not Maxwell cards like you have. This driver looks like it was focused on getting Kepler running better, not necessarily on improving Maxwell performance, as well. nVidia probably rushed it to placate all the "nVidia is gimping Kepler" noobs and didn't really put enough effort into Maxwell.


I was going to tell him the same thing. Great response! My system is similar to yours, as bare as it can be, with Windows 8.1, since I use it strictly as a LAN box. I'm sure that if some users did a fresh install, some of their problems would go away.


----------



## decimator

Quote:


> Originally Posted by *barsh90*
> 
> Didn't mean for it to come out like that, just woke up. Did you turn off hardware acceleration by chance? I turned it off and haven't had any issues, other than choppy Chrome performance, until Nvidia fixes this whole fiasco.


No worries, all good. I don't use Chrome and stick with IE (I know, terrible, but bare bones and all...). I left hardware acceleration on for Flash and it runs fine for me.


----------



## neoroy

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Question for everyone who is crashing during browsing ... Are you turning off the browsers hardware acceleration mode?


I was crashing with 352.86 when browsing, and I didn't touch anything in the browser's hardware acceleration settings (I use Mozilla Firefox, by the way). With 353.06, so far it still works great and there are no crashes while browsing. While gaming I crashed once, maybe 5 minutes into AC Unity; I'll test AC Unity again some more. Other games like FIFA 15 work great with no crashing.


----------



## Assirra

Quote:


> Originally Posted by *barsh90*
> 
> Didn't mean for it to come out like that, just woke up. Did you turn off hardware acceleration by chance? I turned it off and haven't had any issues, other than choppy Chrome performance, until Nvidia fixes this whole fiasco.


Just curious, but why is Nvidia getting blamed for something that might as well be a Chrome fault?


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Assirra*
> 
> Just curious but why is nvidia getting blamed for something that might as well be a chrome fault?


Par for the course.


----------



## Abovethelaw

Quote:


> Originally Posted by *barsh90*
> 
> Didn't mean for it to come out like that, just woke up. Did you turn off hardware acceleration by chance? I turned it off and haven't had any issues, other than choppy Chrome performance, until Nvidia fixes this whole fiasco.


I still crash with HW acceleration disabled. It made no difference on my system. Back to 344.48.


----------



## Arizonian

So I just had a driver crash on the desktop while surfing with IE.


Have to roll back to 350.12, the last working driver.


I guess it's not as easy to support Kepler GPUs as it seems for Nvidia. I think they might be getting a taste of what really goes into supporting their products for longer than a generation. I'm glad they're trying to change this, but the drivers have to work. One would hope driver improvements for Kepler can at the very least continue until Pascal.


----------



## 47 Knucklehead

Still have yet to crash on either my Maxwell or my Kepler machines, even after using both for at least 8+ hours each.


----------



## Arizonian

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Still have yet to crash on either my Maxwell or my Kepler machines, even after using both for at least 8+ hours each.


Not even two hours for me before a crash. Updated this morning; didn't even get a chance to game.


----------



## garikfox

I'm convinced all these TDR/driver crashes are coming from OCed and factory-OCed cards. It's weird that NVIDIA hasn't fixed this yet.
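For anyone chasing these: a TDR is Windows' Timeout Detection and Recovery resetting a display driver that hasn't responded within the default ~2-second window. A commonly shared diagnostic tweak is lengthening that window via the `TdrDelay` registry value. To be clear, this only masks the reset so you can observe the hang; it is not a fix, and registry edits are at your own risk.

```shell
:: Raise the GPU timeout from the default 2 s to 10 s
:: (run from an elevated prompt, then reboot).
:: Diagnostic aid only -- it hides the reset, it does not cure the hang.
reg add "HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers" /v TdrDelay /t REG_DWORD /d 10 /f
```

Delete the value afterward to return to default behavior.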


----------



## rationalthinking

Quote:


> Originally Posted by *tinate*
> 
> Quote:
> 
> 
> 
> Originally Posted by *barsh90*
> 
> Didn't mean for it to come out like that, just woke up. Did you turn off hardware acceleration by chance? I turned it off and haven't had any issues, other than choppy Chrome performance, until Nvidia fixes this whole fiasco.
> 
> 
> 
> I still crash with HW acceleration disabled. It made no difference on my system. Back to 344.48.
Click to expand...

Try flashing a stock VBIOS for your card. I've had this issue several times; corrupted firmware seems to cause a lot of problems that look purely like driver issues.

EDIT: Go to the 980 Owner's Club thread to find EZFlash download links and stock VBIOSes for almost every card on the NA/EU market.
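If the EZFlash packages don't cover your card, the underlying tool is typically NVIDIA's nvflash. A commonly cited manual sequence looks like the sketch below; this assumes you've already verified the stock ROM matches your exact board (subsystem ID and all), since flashing the wrong ROM can brick the card.

```shell
:: Back up the current VBIOS before touching anything.
nvflash --save backup.rom

:: Disable the EEPROM write protect, then flash the stock ROM.
:: -6 overrides the PCI subsystem-ID mismatch prompt that trips
:: when the currently installed BIOS is a modded one.
nvflash --protectoff
nvflash -6 stock.rom
```

Reboot and reinstall the driver (after a DDU pass) once the flash completes.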


----------



## Abovethelaw

Quote:


> Originally Posted by *rationalthinking*
> 
> Try flashing a stock VBIOS for your card. I've had this issue several times; corrupted firmware seems to cause a lot of problems that look purely like driver issues.
> 
> EDIT: Go to the 980 Owner's Club thread to find EZFlash download links and stock VBIOSes for almost every card on the NA/EU market.


I'll try. It seems unlikely to be a card issue though since it only crashes on GTA V and no other game.


----------



## rationalthinking

Quote:


> Originally Posted by *tinate*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rationalthinking*
> 
> Try flashing a stock VBIOS for your card. I've had this issue several times; corrupted firmware seems to cause a lot of problems that look purely like driver issues.
> 
> EDIT: Go to the 980 Owner's Club thread to find EZFlash download links and stock VBIOSes for almost every card on the NA/EU market.
> 
> 
> 
> I'll try. It seems unlikely to be a card issue though since it only crashes on GTA V and no other game.
Click to expand...

I thought the same thing; it was always application/game specific. I pulled my hair out blaming the motherboard, bent pins, and bad PSUs. It turned out to be the last thing I suspected: my GPU firmware.

My computer room/office has a ton of equipment running in it, and we have those electrical fire-safety breakers in our home. During the summer in Louisiana it gets pretty hot, and the lines running to the breaker can heat up, causing these special breakers to trip. I think power drops in the middle of gaming sessions corrupted the GPU BIOS on my main rig. Since then I have put my home server on a UPS, but some of my other rigs are not.


----------



## littledonny

Quote:


> Originally Posted by *Arizonian*
> 
> So I just had a driver crash on the desktop while surfing with IE.
> 
> Have to roll back to 350.12, the last working driver.
> 
> I guess it's not as easy to support Kepler GPUs as it seems for Nvidia. I think they might be getting a taste of what really goes into supporting their products for longer than a generation. I'm glad they're trying to change this, but the drivers have to work. One would hope driver improvements for Kepler can at the very least continue until Pascal.


You took the words right out of my mouth.

I appreciate the effort, but driver crashes in games (for me) and on the desktop aren't worth a 5% performance gain. Back to 350.12 I go.

The drivers are usually solid, so this is a rare inconvenience.


----------



## criminal

Quote:


> Originally Posted by *Arizonian*
> 
> So I just had a driver crash on the desktop while surfing with IE.
> 
> Have to roll back to 350.12, the last working driver.
> 
> I guess it's not as easy to support Kepler GPUs as it seems for Nvidia. I think they might be *getting a taste of what really goes into supporting their products for longer than a generation*. I'm glad they're trying to change this, but the drivers have to work. One would hope driver improvements for Kepler can at the very least continue until Pascal.


Lol... very true. That is why I want to dump this 980 before Pascal gets close. I don't want to be subject to this issue with my 980. Maxwell is over a year old now. Can't keep supporting this dinosaur of a product forever.


----------



## ALT F4

Quote:


> Originally Posted by *maarten12100*
> 
> They realized that abandoning the old arch as in not pushing the improvements had bad PR effect I guess. It's a good thing rather than pushing people to the newer platform. Good for those with Kepler cards.


Almost a 20% performance increase on my Kepler Kingpins, which was completely unexpected. I knew something was wrong earlier, when 100-200 MHz on the core and memory was amounting to fractions of an FPS. I was literally at a 45-50 average with the same settings where now I'm at a 60+ average. This is with almost everything on ultra and HairWorks turned off.

However, every 1 out of 5 attempts at loading any game on this driver causes my screens to flicker and bug out.


----------



## RagingPwner

Not sure if it was just a coincidence, but my PC got a BSOD while the driver was installing. I didn't feel like messing with it, so I rolled back to 347.88, as that's what I had handy. I'll try again when I wake up and see if I can't get the damn driver to install.


----------



## rationalthinking

Quote:


> Originally Posted by *RagingPwner*
> 
> Not sure if it was just a coincidence, but my pc got a bsod while the driver was installing. I didn't feel like messing with it so I rolled back to 347.88 as that's what I had handy. I'll try again when I wake up and see if I can't get the damn driver to install.


DDU, then flash the card with a fresh VBIOS.


----------



## RagingCain

I wonder if the TDR issues are stemming from a new voltage algorithm or possibly from many users using custom BIOSes or even both.


----------



## xenophobe

I crashed in ACU about 10 minutes in. Haven't tried anything else. Not getting crashes while not gaming.


----------



## Clazman55

The only issue I've had so far on the new drivers is in TW3. The screen would just disappear but the game would keep running, as if the program had been minimized. That happened 3 times after the driver install, anywhere from 5-15 mins into the game.

After another reboot, driver reinstall, and reboot, I haven't had any issues in the last 9 hours of play.

The only thing of note is that the issue only happened when I was using the "Activate All Monitors" option. Using SLI, the issue didn't occur. Thankfully, the reinstall fixed it.


----------



## Exilon

Quote:


> Originally Posted by *RagingCain*
> 
> I wonder if the TDR issues are stemming from a new voltage algorithm or possibly from many users using custom BIOSes or even both.


No crashes here and I have a custom bios.


----------



## RagingCain

Quote:


> Originally Posted by *Exilon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RagingCain*
> 
> I wonder if the TDR issues are stemming from a new voltage algorithm or possibly from many users using custom BIOSes or even both.
> 
> 
> 
> No crashes here and I have a custom bios.
Click to expand...

I too am on custom bioses, Skyn3t's bios specifically. No TDRs yet, knock on wood.


----------



## djriful

Quote:


> Originally Posted by *RagingCain*
> 
> I wonder if the TDR issues are stemming from a new voltage algorithm or possibly from many users using custom BIOSes or even both.


I strongly believe there is a change in this driver meant to discourage custom BIOS modding on future GPUs, with side effects on Kepler, where many users are running custom BIOSes. Any modded BIOS crashes a lot with these new drivers. It seems like I'm being forced to run the stock BIOS.

It's weird: on paper, the 980 was, for example, 1.5x faster than the original TITAN. A couple of weeks later, in some new games, the 980 is only 1.1x faster... and it keeps getting lower. An older Kepler GPU running a custom modded BIOS can keep up close. I bet Nvidia doesn't like that; it makes the newer GPU harder to sell when it's only a marginal improvement.


----------



## dubldwn

With 353 my 980 overclock became unstable. Even at stock clocks I crashed twice in The Witcher 3 over an hour or so, which hadn't happened before. I went back to 350 and had no problems after a gaming marathon.

This driver is no good.


----------



## LaBestiaHumana

Has anyone tried 353.12?


----------



## jdstock76

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Has anyone tried 353.12?


----------



## djriful

Quote:


> Originally Posted by *dubldwn*
> 
> With 353 my 980 overclock became unstable. Even at stock clocks I crashed twice in The Witcher 3 over an hour or so, which hadn't happened before. I went back to 350 and had no problems after a gaming marathon.
> 
> This driver is no good.


Did you have GeForce Experience running? Shadowplay on? Do you still crash with GeForce Experience off entirely?

353.12: http://www.overclock.net/t/1558826/nv-geforce-hotfix-driver-353-12#post_23995615


----------



## LaBestiaHumana

Quote:


> Originally Posted by *jdstock76*


I saw it was released today. Am I missing something?


----------



## Silent Scone

This driver had issues with W3 for me. No driver crashes to speak of, but the game will freeze-frame even though I can still move about and hear myself getting attacked. No such problem on 352.86 (86 hours' worth).


----------



## dubldwn

Quote:


> Originally Posted by *djriful*
> 
> Did you have GeForce Experience running? Shadowplay on? Still crash without GeForce Experience on at all?


I never install GeForce Experience; only the driver and PhysX. The only thing running was The Witcher. With 350 I can run Precision, HWiNFO, and The Witcher all at once.


----------



## Booty Warrior

Huh, guess I've been lucky. I've been using these for a few days now and no crashes or TDRs so far.

The browser TDRs seem to be a nagging issue for Nvidia though. I remember a rash of reports of the same thing with the 320 driver variants a couple of years ago. My reference 780 (non-Ti) got hit with the 36-hour TDR bug, which was later fixed by another driver. It's a little disconcerting to see it keep popping back up.


----------



## specopsFI

Two days of non-stop usage from me with a custom BIOS GTX 670, hefty OC, lots of gaming and benching with various titles, Chrome usage with HW acceleration ON...

Zero problems. In fact, zero problems for as long as I can remember, with any driver.


----------



## RagingCain

Quote:


> Originally Posted by *Booty Warrior*
> 
> Huh, guess I've been lucky. I've been using these for a few days now and no crashes or TDRs so far.
> 
> The browser TDRs seems to be a nagging issue for Nvidia though. I remember there being a rash of reports of the same thing with the 320 driver variants a couple of years ago. My reference 780 (non Ti) got hit with the 36hr TDR bug, but was later fixed with another driver. It's a little disconcerting to see it keep popping back up.


I think they have changed the power profile voltages; that's the problem. The voltage is slightly too low for HW acceleration or HD video content. Custom/non-reference models would be the most likely to hit issues, but even reference/OEM versions are not all identical in their voltages. This would also explain why some overclockers are reporting unstable overclocks. The voltages may be reporting wrong, so they think nothing has changed, but there may be a voltage drop in different P-states... since P3 has been somewhat stable.

Power profile changes would make sense, since these drivers (35x.xx) are merging changes with the Windows 10 family, which brings a new power-saving engine.

What we need is someone with a little knowledge and a voltmeter to verify that the sensors are reporting what the GPU is actually using.

Then we report to Nvidia and wait a week or two for a fix to the voltage, or at least to the sensor readout.
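Once someone has meter readings, the cross-check could be as simple as logging (P-state, reported voltage) pairs and flagging readings that dip below what each state should hold. A rough sketch of that comparison; the per-state floors here are purely hypothetical placeholders (real values vary by card and BIOS), the point is the method, not the numbers:

```python
# Sketch: flag P-state voltage readings that fall below a nominal floor.
# The floors below are hypothetical placeholders -- real values vary per
# card and BIOS; the point is the comparison, not the numbers.
NOMINAL_FLOOR_V = {
    "P0": 1.150,  # full 3D load
    "P2": 0.950,  # video / HW-accelerated browsing
    "P8": 0.850,  # idle
}

def suspect_samples(samples, margin=0.025):
    """Return (pstate, volts) readings more than `margin` below that state's floor."""
    return [(p, v) for p, v in samples
            if p in NOMINAL_FLOOR_V and v < NOMINAL_FLOOR_V[p] - margin]

# A fake log of (P-state, sensor voltage) pairs:
log = [("P0", 1.162), ("P2", 0.906), ("P2", 0.949), ("P8", 0.843)]
print(suspect_samples(log))  # -> [('P2', 0.906)]
```

If the meter agrees with the sensor on the flagged samples, it's a real voltage drop; if not, it's a sensor readout bug.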


----------



## RagingPwner

Quote:


> Originally Posted by *rationalthinking*
> 
> DDU, then flash the card with a fresh VBIOS.


DDU plus a manual download from Nvidia rather than through GeForce Experience, and I've been good so far.


----------



## DoktorCreepy

^Same

I only install the display driver and PhysX as well.


----------



## looniam

Quote:


> Originally Posted by *RagingCain*
> 
> I think they have changed the power profile voltages; that's the problem. The voltage is slightly too low for HW acceleration or HD video content. Custom/non-reference models would be the most likely to hit issues, but even reference/OEM versions are not all identical in their voltages. This would also explain why some overclockers are reporting unstable overclocks. The voltages may be reporting wrong, so they think nothing has changed, but there may be a voltage drop in different P-states... since P3 has been somewhat stable.
> 
> Power profile changes would make sense, since these drivers (35x.xx) are merging changes with the Windows 10 family, which brings a new power-saving engine.
> 
> What we need is someone with a little knowledge and a voltmeter to verify that the sensors are reporting what the GPU is actually using.
> 
> Then we report to Nvidia and wait a week or two for a fix to the voltage, or at least to the sensor readout.


you may have something there. i just did a quick run w/ Fire Strike Extreme. the first run gave me a TDR - rebooted and it passed on the second attempt. skyn3t's bios for my classy would give me base 1019 MHz (0.994v), boost 1176 MHz (1.258-1.264v). however **i think** the voltage would go up to 1.276v before, and it seems to downclock to 1019 (0.994v) really fast.

yeah, get a dmm and probe it


----------



## Asus11

I wonder if they updated the Titan Z?


----------



## Ithanul

Hmmm, I have yet to try this driver out. My vanilla Titans have been sitting on the 347.09 drivers for months now. I may try this one just for the heck of seeing if it gives better folding points.


----------



## jdstock76

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> I saw it was released today. Am I missing something?


Nope, I misread your post. My bad! But after looking it up, I don't see it on Nvidia's website, not even in beta. Maybe I missed it.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *jdstock76*
> 
> Nope, I misread your post. My bad! But after looking it up, I don't see it on Nvidia's website, not even in beta. Maybe I missed it.


It may not be an official driver after all. I was on my phone earlier and didn't have a chance to read through it.

OCN Thread: http://www.overclock.net/t/1558826/nv-geforce-hotfix-driver-353-12/0_30

driver link:

http://nvidia.custhelp.com/app/answers/detail/a_id/3676


----------



## jdstock76

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> It may not be an official driver after all. I was on my phone earlier and didn't have a chance to read through it.
> 
> OCN Thread: http://www.overclock.net/t/1558826/nv-geforce-hotfix-driver-353-12/0_30
> 
> driver link:
> 
> http://nvidia.custhelp.com/app/answers/detail/a_id/3676


Hmmmm... I haven't installed 353.06 yet, but I wonder if installing that right after will help?


----------



## nSone

same here, 353.06 - vanilla Titan, crashes with Chrome... it's stupid


----------



## FungYW

Quote:


> Originally Posted by *nSone*
> 
> same here, 353.06 - vanilla Titan, crashes with Chrome... it's stupid


It can be fixed by manually setting your GPU for PhysX. Don't use the default auto-select.


----------



## neoroy

Until now I've had no crashes in Firefox, and gaming in The Witcher 3 is smooth with no crashes. I'll keep monitoring my PC for crashes like 352.86 gave me. Mortal Kombat X is also smooth.


----------



## nSone

I'll give it a shot now, thank you!
Quote:


> Originally Posted by *FungYW*
> 
> It can be fixed by manually setting your GPU for PhysX. Don't use the default auto-select.


----------



## auraofjason

People with crashing issues, nvidia is looking for help: http://www.reddit.com/r/nvidia/comments/38qokw/geforce_users_with_random_tdrs_in_chrome_after/


----------



## jdstock76

I'm using Chrome. Just installed new drivers with the new 980 ti and everything is gtg so far. Will keep pestering it till I can get it to crash though.


----------



## djriful

Nvidia is requesting local help with these major driver issues lately. Three drivers have been released since The Witcher 3 launched, and it has been TDR-crashing, with Chrome or without.

https://forums.geforce.com/default/topic/840191/geforce-drivers/geforce-users-with-random-tdrs-in-chrome-after-installing-352-86-driver-living-in-bay-area-sacramen/


----------



## neoroy

Quote:


> Originally Posted by *djriful*
> 
> Nvidia is requesting local help with these major driver issues lately. Three drivers have been released since The Witcher 3 launched, and it has been TDR-crashing, with Chrome or without.
> 
> https://forums.geforce.com/default/topic/840191/geforce-drivers/geforce-users-with-random-tdrs-in-chrome-after-installing-352-86-driver-living-in-bay-area-sacramen/


Hmm, sounds like this is starting to get a bit serious; I hope Nvidia fixes it soon.
Btw, my card's OC is not quite stable with the 353.06 driver when playing The Witcher 3, but at stock it's quite smooth.


----------



## Cyclops

Haven't you learned your lesson people? Clearly Internet Explorer is the way to go.


----------



## Lantian

Quote:


> Originally Posted by *Cyclops*
> 
> Haven't you learned your lesson people? Clearly Internet Explorer is the way to go.











But I prefer Waterfox; I've never had any issues with it, unlike Firefox or Chrome.


----------



## ALT F4

Quote:


> Originally Posted by *Lantian*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But I prefer Waterfox; I've never had any issues with it, unlike Firefox or Chrome.


Their logo reminds me of the Daemon Tools one; very similar.


----------



## Rndomuser

Haven't had a single TDR (or any other kind of) crash with Chrome, using EVGA's 980 SC card (no manual overclocking or voltage adjustments beyond stock; no need for that for now). Previous drivers crashed a couple of times when playing The Witcher 3; the current ones have no issues in any games so far (or in Chrome/FF). Sucks for the people who do have issues, though. Probably a combination of driver settings and particular non-reference clock/voltage adjustments on particular cards from particular manufacturers.


----------



## neoroy

Quote:


> Originally Posted by *Cyclops*
> 
> Haven't you learned your lesson people? Clearly Internet Explorer is the way to go.


Looks like I don't have any crashes anymore with this new driver while browsing in Firefox. Only The Witcher 3 is a bit unstable when I overclock my card. Btw, I used the latest DDU to remove 352.86 before installing 353.06.


----------



## bigkahuna360

Praying for a Windows 10 hotfix or beta driver for the 980 Ti... I don't like thinking I just purchased two $700 paperweights.


----------



## dawn1980

Drivers 353.06 plus running Precision X crashes GTA 5... anyone else having crashes while running Precision X in GTA 5? The game runs fine without Precision X running. Tried lowering the OC too; nothing works. I even tried Afterburner, and it does it too, both with K-Boost enabled, etc. Nothing works. I'd really like to run something in GTA 5 to monitor my card... Windows 8.1 with all drivers up to date. Anyone running Windows 10 with Precision X / GTA 5? Thanks in advance.


----------



## P.J

Quote:


> Originally Posted by *nSone*
> 
> same here, 353.06 - vanilla Titan, crashes with Chrome... it's stupid


Same here x)


----------



## Silent Scone

Quote:


> Originally Posted by *neoroy*
> 
> Looks like I don't have any crashes anymore with this new driver while browsing in Firefox. Only The Witcher 3 is a bit unstable when I overclock my card. Btw, I used the latest DDU to remove 352.86 before installing 353.06.


Are you sure it's your overclock? 353.06 is crap for me on Win 8.1 and a TITAN X. The game stops responding, yet I can still hear it running, and there is no driver-crash event log. I occasionally get a game crash log if it manages to exit. It's fine on 352.86. GPUs are at stock.


----------



## MonarchX

The 353.12 beta is out, guys! Nvidia says it's only for the G-Sync borderless-mode issue, but I bet they fixed some other things too!


----------



## djriful

Quote:


> Originally Posted by *bigkahuna360*
> 
> Praying for a Windows 10 hotfix or beta driver for the 980 Ti... I don't like thinking I just purchased two $700 paperweights.


Windows 10 is not officially out; you can't make statements like that until July 29. It's a beta/preview product; there's nothing to argue about.


----------



## provost

There is no fix for GK110s, as I constantly crash with my Titans on this driver. I am running stock BIOS, no OC, and no background programs such as AB, PX, Fraps, or the Steam overlay. With the previous version of the driver I still got crashes, but not as frequently. Since I've been having these driver problems, I started following some threads at other forums to see if others have the same issue.

This is from Guru3D, where someone is getting a much higher 3DMark score on their Kepler cards with drivers from last year vs. this driver:

http://forums.guru3d.com/showthread.php?t=399569&page=16

Here is a real gem. This is the poster I ignored reading at least 20-25 times, but he keeps coming back. Either he is neurotic or he has absolute conviction in his beliefs, and I am starting to think that it's the latter at this point.. lol

https://forums.geforce.com/default/topic/841042/geforce-700-600-series/nvidia-has-reduced-kepler-oc-in-many-new-drivers-with-1-1-max-instad-1-2v-manuelg-s-ca-m/

So when Nvidia says it is a software company now, what are they talking about, Tegra? Because they clearly have a total of zero people working on QC for their drivers now.


----------



## Silent Scone

Quote:


> Originally Posted by *provost*
> 
> There is no fix for GK110s, as I constantly crash with my Titans on this driver. I am running stock BIOS, no OC, and no background programs such as AB, PX, Fraps, or the Steam overlay. With the previous version of the driver I still got crashes, but not as frequently. Since I've been having these driver problems, I started following some threads at other forums to see if others have the same issue.
> 
> This is from Guru3D, where someone is getting a much higher 3DMark score on their Kepler cards with drivers from last year vs. this driver:
> 
> http://forums.guru3d.com/showthread.php?t=399569&page=16
> 
> Here is a real gem. This is the poster I ignored reading at least 20-25 times, but he keeps coming back. Either he is neurotic or he has absolute conviction in his beliefs, and I am starting to think that it's the latter at this point.. lol
> 
> https://forums.geforce.com/default/topic/841042/geforce-700-600-series/nvidia-has-reduced-kepler-oc-in-many-new-drivers-with-1-1-max-instad-1-2v-manuelg-s-ca-m/
> 
> *So when Nvidia says it is a software company now, what are they talking about, Tegra? Because they clearly have a total of zero people working on QC for their drivers now.*


You say he's neurotic, but then you make this comment? lol.









I've no TDR issues with any of these drivers.


----------



## looniam

oh that guy is a GEM!


----------



## Ganf

Quote:


> Originally Posted by *djriful*
> 
> Windows 10 is not officially out; you can't make statements like that until July 29. It's a beta/preview product; there's nothing to argue about.


BS there isn't anything to complain about; people testing the W10 preview are supposed to be testing hardware compatibility as well as software. If Nvidia leaves the 980 Tis broken until W10 releases and then slaps on a fix, that means their owners get to beta test the new drivers AFTER the beta is over.

Not that they're going to stay broken; Nvidia and AMD have been updating their Windows 10 drivers at least once a week. But they both need to keep up with the rapid hotfixes if anyone is going to have working GPUs on the release of Windows 10, which everyone is pushing in order to get high adoption rates for DX12. That's everyone from Microsoft, to developers, to Nvidia and AMD, and all the bits and pieces in between. If Windows 10 is buggy for games on release, people are going to LOL from the high heavens of the Tumblrnetz down to the waste retention ponds of 4chan, and it'll be months before anyone even thinks about switching over.

People testing W10 need a working product NOW, just a month before release, because if it isn't working at this point, what hope do we have of everything going smoothly on release?


----------



## Silent Scone

Quote:


> Originally Posted by *Ganf*
> 
> BS there isn't anything to complain about, people testing the W10 preview are *supposed to be testing hardware compatibility as well as software*


Precisely, and right there you invalidated your entire rant.


----------



## Ganf

Quote:


> Originally Posted by *Silent Scone*
> 
> Precisely and there on you invalidated your entire rant.


Nope. Nothing is being tested, so it's still a problem. Kahuna and thousands of people like him are clearly in sit-and-wait mode; meanwhile, the latest W10 build is one of the buggiest yet.

Yay for broken OS releases!


----------



## bigkahuna360

Windows 10 has been pretty good for support pre-upgrade. Drivers that had support for 8.1 had unofficial support for Windows 10, and I don't believe I've had a single crash (outside of playing The Evil Within; that's the devil's work all right). At this point, it's hard to call it a beta anymore, this close to release. They've fixed every major problem that I've experienced in previous versions, and now it's down to just ironing out the smaller issues others may be having. You would think that a company that makes so much money in a year, and that also releases drivers for previous cards on Windows 10, would be able to release a single hotfix driver for their most expensive consumer card.


----------



## MonarchX

I do not think Nvidia crippled the GTX 980 4GB... Where is the evidence of that? For 1080p, 4GB should be fine for quite some time, at least until the end of 2015...


----------



## rdr09

Quote:


> Originally Posted by *MonarchX*
> 
> I do not think Nvidia crippled the GTX 980 4GB... Where is the evidence of that? For 1080p, 4GB should be fine for quite some time, at least until the end of 2015...


We were just discussing VRAM in another thread. Members say 3GB is still fine even up to 1440p.


----------



## ALT F4

Quote:


> Originally Posted by *rdr09*
> 
> We were just discussing VRAM in another thread. Members say 3GB is still fine even up to 1440p.


It will do fine at 2160p for a good while also.
This is no insult to anyone, so please read carefully to understand my point.
I think there is a huge miscommunication when people talk about this specific issue. The main problem comes from people giving opinions about hardware they don't own, or using a benchmark for reference. There is a clear difference between maxing out settings to bench a game and maxing out settings to the extent of what your eye can see. I'm sorry, but 16x AA on water-reflection shadowing might eat up 500MB of VRAM, yet your eyes aren't seeing anything different from 2x AA on water-reflection shadowing. To claim you can't enjoy the 2160p experience with 3GB of VRAM is a bit silly; we have to be real careful when we read about this.


----------



## DADDYDC650

I'm bored and want to upgrade from Windows 8.1 to Windows 10. I game a lot and I'm running SLI. Bad idea? Has anyone noticed any performance increases? I've used Windows 10 before, but that was a few months ago when it had a crippling issue with Steam.


----------



## bigkahuna360

Quote:


> Originally Posted by *DADDYDC650*
> 
> I'm bored and want to upgrade from Windows 8.1 to Windows 10. I game a lot and I'm running SLI. Bad idea? Has anyone noticed any performance increases? I've used Windows 10 before, but that was a few months ago when it had a crippling issue with Steam.


You should be fine as long as you don't get 980 Ti's. I've been gaming on Windows 10 since the preview's release, and builds 10074 and 10130 work just fine. No crashing.


----------



## djriful

Quote:


> Originally Posted by *looniam*
> 
> oh that guy is a GEM!


That guy has nothing else to do.


----------



## DADDYDC650

Quote:


> Originally Posted by *bigkahuna360*
> 
> You should be fine as long as you don't get 980 Ti's. I've been gaming on Windows 10 since its release and build 10074 and build 10130 work just fine. No crashing.


I have two Titan X's. Cause for concern?


----------



## bigkahuna360

Quote:


> Originally Posted by *DADDYDC650*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bigkahuna360*
> 
> You should be fine as long as you don't get 980 Ti's. I've been gaming on Windows 10 since its release and build 10074 and build 10130 work just fine. No crashing.
> 
> 
> 
> I have two Titan X's. Cause for concern?

I wouldn't imagine so. I'm only held back by not having Windows 10 drivers yet, since the 980 Ti is so new.


----------



## DADDYDC650

Quote:


> Originally Posted by *bigkahuna360*
> 
> I wouldn't imagine so. I'm only held back by not having Windows 10 drivers yet, since the 980 Ti is so new.


Wouldn't the Titan X drivers work well with the 980 Ti's? Same GM200 chip and everything.

Also, has the Steam bug been fixed? I have games installed on multiple drives which caused Steam to crash.


----------



## bigkahuna360

Quote:


> Originally Posted by *DADDYDC650*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bigkahuna360*
> 
> I wouldnt imagine so. I'm only held back due to not having drivers for Windows 10 since the 980 Ti is so new.
> 
> 
> 
> Wouldn't the Titan X drivers work well with the 980 Ti's? Same GM200 chip and everything.
> 
> Also, has the Steam bug been fixed? I have games installed on multiple drives which caused Steam to crash.

I have no idea. I can try it when I get home, but it most likely won't work.

As for the Steam issue, I have games installed on both my SSD and HDD and it's never crashed under 10.


----------



## Ganf

Quote:


> Originally Posted by *djriful*
> 
> That guy has nothing else to do.


They keep him and a couple others unbanned simply to stir up resentment for the competition. AMD would have the same if anyone actually went to their forums.


----------



## provost

Quote:


> Originally Posted by *djriful*
> 
> That guy has nothing else to do.


Well, clearly that guy is off the reservation, and it's unfortunate, as his almost psychotic posts bury any legitimate comments others might be trying to make on that forum.
However, this kind of driver performance problem just doesn't happen overnight; it's most likely a symptom of sustained underinvestment in driver resources by Nvidia for some time now. SLI driver support has been lagging for a while, and the redirection of the company's resources to other initiatives (whether G-Sync, GameWorks, Tegra, etc.) is now manifesting itself in bigger driver issues. I have been using Nvidia cards for some time, and drivers just don't fall off a cliff like this unless someone has taken their eye off the ball to focus on other "corporate priorities"... my two cents


----------



## ALT F4

Quote:


> Originally Posted by *provost*
> 
> Well, clearly that guy is off the reservation*, and it's unfortunate, as his almost psychotic posts bury any legitimate comments others might be trying to make on that forum.*
> However, this kind of driver performance problem just doesn't happen overnight; it's most likely a symptom of sustained underinvestment in driver resources by Nvidia for some time now. SLI driver support has been lagging for a while, and the redirection of the company's resources to other initiatives (whether G-Sync, GameWorks, Tegra, etc.) is now manifesting itself in bigger driver issues. I have been using Nvidia cards for some time, and drivers just don't fall off a cliff like this unless someone has taken their eye off the ball to focus on other "corporate priorities"... my two cents


You are spot on. Couldn't have said it any better.


----------



## Redeemer

Anybody getting the "NVIDIA Display driver has stopped responding" and recovered message lately?


----------



## Qu1ckset

Everyone keeps ripping on AMD for crappy drivers, yet the last three drivers from Nvidia have been straight garbage. I've never had issues with Nvidia drivers in the past, but lately there's been so much crashing. If they don't fix this stuff I'd consider AMD next round; drivers were one of the main reasons I switched teams...


----------



## djriful

Quote:


> Originally Posted by *Qu1ckset*
> 
> Everyone keeps ripping on AMD for crappy drivers, yet the last three drivers from Nvidia have been straight garbage. I've never had issues with Nvidia drivers in the past, but lately there's been so much crashing. If they don't fix this stuff I'd consider AMD next round; drivers were one of the main reasons I switched teams...


I'm with you.


----------



## azanimefan

Quote:


> Originally Posted by *Qu1ckset*
> 
> Everyone keeps ripping on AMD for crappy drivers, yet the last three drivers from Nvidia have been straight garbage. I've never had issues with Nvidia drivers in the past, but lately there's been so much crashing. If they don't fix this stuff I'd consider AMD next round; drivers were one of the main reasons I switched teams...


I owned Nvidia for years before getting an HD 7770 and R9 280X, and had no issues with those drivers. Frankly, I didn't really have any Nvidia driver issues prior to those two AMD cards either. Now I'm back with Nvidia for this GTX 970, and I gotta say, these last few drivers are the worst I've had the displeasure of using from either company in the last six years or so.


----------



## neoroy

Quote:


> Originally Posted by *Redeemer*
> 
> Anybody getting the "NVIDIA Display driver has stopped responding" and recovered message lately?


I just got one, but not while browsing in Firefox like it used to be with 352.86; this time it was in Mortal Kombat X with my card at stock clocks. Sadly, I guess 353.06 still has some bugs, so I'm rolling back to 350.12.
Quote:


> Originally Posted by *Silent Scone*
> 
> Are you sure it's your overclock? 353.06 is crap for me on WIN8.1 and TITAN X. The game stops responding yet I can still hear it running, there is no driver crash event log. Occasionally get a game crash log if it manages to exit. Fine on 352.86. GPUs are at stock.


Now I just got a crash to desktop, and a balloon in the corner of my desktop saying "NVIDIA display driver has stopped responding...". I guess 353.06 still has a bug, although I never crash anymore while browsing with Firefox... weird.
With 352.86 I crashed almost every day while browsing.
Time to roll back to 350.12 until Nvidia fixes this issue.


----------



## rbarrett96

I'll be happy when they get HairWorks working properly on my GTX 680.


----------



## decimator

Quote:


> Originally Posted by *Redeemer*
> 
> Anybody getting the "NVIDIA Display driver has stopped responding" and recovered message lately?


Yes, this is what's known as a TDR (timeout detection and recovery), and plenty of people have been getting it; scroll back a couple of pages in this thread.
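Since TDRs keep coming up in this thread: the timeout itself is a Microsoft-documented registry knob, and a common stopgap while waiting for a driver fix is raising `TdrDelay` from its 2-second default so borderline driver stalls recover instead of triggering a reset. A sketch only, not a fix for the underlying crash; edit the registry at your own risk:

```shell
# Raise the GPU timeout-detection delay from the default 2 s to 8 s.
# TdrDelay under HKLM\...\GraphicsDrivers is Microsoft's documented TDR key.
# Run from an elevated prompt; a reboot is required for it to take effect.
reg add "HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers" /v TdrDelay /t REG_DWORD /d 8 /f
```

This only gives a stalled driver more time before Windows resets it, so it can mask a hang rather than cure it.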


----------



## neoroy

Just rolled back to 350.12, and now I can play The Witcher 3 for almost 2 hours with my OCed 980 Strix at 1528/1953 on stock voltage.
Looks like I'm gonna use it for a while, until Nvidia releases a proper new driver.


----------



## djriful

https://forums.geforce.com/default/topic/841230/geforce-drivers/possible-fix-tdr-352-xx-353-xx-link-power-state-management-to-off/
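For anyone who'd rather script that Link State Power Management workaround than dig through the power plan UI, the same setting can be flipped with powercfg; `SUB_PCIEXPRESS` and `ASPM` are the aliases powercfg itself reports via `powercfg /aliases`. A sketch, assuming you want it off for the currently active plan:

```shell
# Set PCI Express -> Link State Power Management to Off (index 0)
# for both AC and battery in the active power plan.
powercfg /setacvalueindex SCHEME_CURRENT SUB_PCIEXPRESS ASPM 0
powercfg /setdcvalueindex SCHEME_CURRENT SUB_PCIEXPRESS ASPM 0
# Re-apply the current scheme so the change takes effect immediately.
powercfg /setactive SCHEME_CURRENT
```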


----------



## Redeemer

Quote:


> Originally Posted by *decimator*
> 
> Yes, this is what's known as a TDR (timeout detection and recovery) and plenty of people have been getting it if you scroll back in this thread a couple pages.


So it's driver-related and we can expect a fix?


----------



## Rndomuser

Quote:


> Originally Posted by *Redeemer*
> 
> So it's driver-related and we can expect a fix?


Well, if you read up you'll see that the issue disappears for most people when they roll back to older drivers, so yeah, it's mostly a driver issue. And yeah, Nvidia will fix it as soon as they can properly reproduce it and figure out what's causing it. You can even help them do it faster if you live near their location in the Bay Area:
https://forums.geforce.com/default/topic/840191/geforce-drivers/geforce-users-with-random-tdrs-in-chrome-after-installing-352-86-driver-living-in-bay-area-sacramen/

They really need to start a proper "user bug reporting" program with appropriate rewards: if someone successfully demonstrates a bug and that helps Nvidia reproduce and fix it, they should give that person a proper reward. Perhaps a new Titan X if it was a major bug, or a 980 if it was less severe. It would be a win-win: Nvidia would save a lot on QA salaries (since QA seems to be doing an awful job anyway) and people would have a real incentive to properly report bugs to them.


----------



## Lansow

Quote:


> Originally Posted by *Rndomuser*
> 
> Nvidia would save a lot on QA workers' salary (since they seem to be doing an awful job anyway) and people would actually have a good incentive to properly report bugs to them


I am a senior QA analyst at a major software company. It's not as easy as many users seem to think; everyone's system is different, with different software running, different hardware, mixes of old and new hardware/software/etc. Identifying issues with software that could be running on hardware you've never even heard of before is impossible.

tl;dr - Nvidia is doing an amazing job of QA on their drivers, they just can't catch everything... after all, not everyone is experiencing these issues.


----------



## provost

Quote:


> Originally Posted by *Lansow*
> 
> Your statement is just asinine and shows *you don't understand QA on software.*.. like at all.
> 
> *I am a senior QA analyst at a major software company*. It's not as easy as many users seem to think; everyone's system is different, with different software running, different hardware, mixes of old and new hardware/software/etc. *Identifying issues with software that could be running on hardware you've never even heard of before is impossible.*
> 
> tl;dr - *Nvidia is doing an amazing job of QA on their drivers, they just can't catch everything... after all, not everyone is experiencing these issues*.


This is a general response (putting myself in the other poster's shoes, as I am having tons of crashes):
1. Don't care, as it's not my job to know QA. That's what I pay Nvidia for.
2. Not relevant.
3. If this is an amazing job, I'd hate to think what a crappy job looks like. And no, not everyone is having issues, if that's the lowest common denominator used to gauge customer satisfaction and pat yourself on the back come annual bonus time; however, every other person seems to be having issues with these new drivers:

https://forums.geforce.com/default/topic/836914/geforce-drivers/official-nvidia-353-06-whql-game-ready-display-driver-feedback-thread-released-5-31-15-/46/
http://forums.evga.com/NVIDIA-WHQL-driver-35306-ready-980-Ti-m2342264-p3.aspx
http://forums.evga.com/NVIDIA-WHQL-driver-35286-ready-Witcher-3-m2336784.aspx
http://forums.overclockers.co.uk/showthread.php?t=18675135&page=7
http://forums.guru3d.com/showthread.php?t=399569&page=17


----------



## ComputerRestore

A lot of users on the forums of the game I play have had that "Display driver stopped working" error.

Not sure if it's related, but it seemed that Windows was blocking certain things from installing, which caused issues in only some applications.

Some got around it by doing a clean installation as "Administrator".


----------



## EniGma1987

Driver installations should always be done as Administrator...


----------



## neoroy

Quote:


> Originally Posted by *EniGma1987*
> 
> Driver installations should always be done as Administrator...


Hmm, I never had a problem running the driver installer without Administrator on the old drivers before 352.86, but perhaps this one needs to be run as admin.
Btw guys, how about "Link State Power Management" with the 353.06 driver? Is it already Off by default? I'm on 350.12 now; so far so good.


----------



## Silent Scone

No issues on 353.06 or 353.12 with system spec in sig.

I resolved one issue I was having with Witcher 3 by updating to the hotfix driver and setting PhysX to CPU.


----------



## provost

Yeah, setting PhysX to CPU seems to stop the TDR errors for me, but I haven't tried it extensively yet. This is on a multi-Titan setup. I am still on the last driver version, as 353.06 was giving me a lot of crashes, so I haven't tried this with the 353.06 driver either.
Either way, what a ridiculous bug, since PhysX is supposed to work well on Nvidia GPUs as I understood it.


----------



## jologskyblues

I don't have any issues with the 353.06 driver so far.

I did have a few driver crashes with the last two releases prior to this one.


----------



## Dyaems

Same here.

I just updated through GeForce Experience (I just wanted to try it out to see how it works) without removing the current drivers first (something I'd never skipped until today), and it seems I'm not experiencing the issues some users here were having *knock on wood*.


----------



## neoroy

Quote:


> Originally Posted by *Dyaems*
> 
> Same here.
> 
> I just updated through GeForce Experience (I just want to try it out to see how it works) without removing the current drivers first (which I don't do until today) and it seems that I don't experience the issues some users were having here *knock on wood*


Hmm, interesting. I will try your way later if my PC is still crashing.

Quote:


> Originally Posted by *Silent Scone*
> 
> No issues on 353.06 or 353.12 with system spec in sig.
> 
> I resolved one issue I was having with Witcher 3 by updating to the hotfix driver and setting PhysX to CPU.


OK, I will try the 353.12 BETA hotfix driver and change PhysX to CPU. Hope it will be stable in gaming, because I get no more crashes on the desktop with 353.06; it's only unstable in gaming.


----------



## Chargeit

The updated drivers have been kind to me. Performance seems up all around, and no crashes. Not that I had a lot of crashes before; it's been about a year since my last ones, which were due to a faulty PSU.

I did notice some extra pop-in testing Dead Rising 3. A minor annoyance.


----------



## Qu1ckset

Just reverted back to 350.12, couldn't stand the amount of crashes.


----------



## Dyaems

I guess I did have an issue after updating after all, although it seems to only happen when I am browsing GOG, so hopefully it's a minor issue *knocks on wood*.

I'm getting random black screens while browsing the GOG website. Besides that I haven't really noticed anything odd. By black screen I mean something like someone pulling the video cable for a second and plugging it back in.


----------



## Atomfix

Only had my GTX 770 for a few days and this is the first driver I installed. Haven't seen any crashes or the like. What's the problem, peeps?


----------



## erocker

Quote:


> Originally Posted by *Atomfix*
> 
> Only had my GTX770 for a few days and this is the first driver I installed. Haven't seen any crashes or the likes of it. What's the problems peeps?


What's the problem? The driver crashes... Isn't it obvious? Apparently the crashes don't affect your older card.


----------



## rluker5

Quote:


> Originally Posted by *Lansow*
> 
> I am a senior QA analyst at a major software company. It's not as easy as many users seem to think; everyone's system is different, with different software running, different hardware, mixes of old and new hardware/software/etc. Identifying issues with software that could be running on hardware you've never even heard of before is impossible.
> 
> tl;dr - Nvidia is doing an amazing job of QA on their drivers, they just can't catch everything... after all, not everyone is experiencing these issues.


It doesn't help Nvidia that it's popular to OC to the edge of stability for the voltage. I know I do it.
I've got my rig hooked up to a higher-end 4K TV that stores settings. If I switch from multi-stream DisplayPort at 4K to single-stream at 1080 or 1440 on the TV remote, the drivers keep up. If I switch any DP to HDMI 2.0 in at 4K 4:2:0 via the TV remote, the drivers keep up. If I put a medium-sized game in a ramdisk only to be disappointed by the same performance, the drivers keep up. If I swap my Windows 10 preview drive back and forth into my AMD laptop (I like to keep a spare there because I've messed it up before), the drivers keep up.
I'm sure mobo manufacturers differ in their power-saving methods too.
With SLI and dedicated PhysX.
The drivers accommodate tiny minorities of users and work well for most in basically every game.

This driver works particularly well for me, as it has removed my stuttering when I play W3 at 4K 4:2:0 with my rig OC'd to the edge of stability. The stuttering is still there in W10 at all of the same settings, so I know the driver gets the credit.

My rig can barely handle W3, so I've been shopping for new GPUs. The Fury really intrigues me, but the driver support concerns me. I want to run multiple cards and have them work in the games I want to play. It will be hard for me to leave Nvidia, since their drivers do so much right.


----------



## Kylar182

At first it seemed fine, but after patch 1.06 for TW3, TW3 crashes very regularly with and without an OC. No clue as to what causes it.

i7 5930K @ 4.2
ROG Rampage V Extreme X99
8x4 DDR4 Ballistix 15-16-16-34 1T
4x GTX 980 Hydro Copper @ 1520MHz/7602MHz
Custom EK loop
Corsair 1500i
3x PG278Q @ 7680x1440 + 1x Dell S2240 Touch @ 1920x1080


----------



## Woundingchaney

Do the recent iCafe drivers work with the 980 Ti? I have been running them rock solid for about 3 weeks.


----------



## provost

Quote:


> Originally Posted by *Woundingchaney*
> 
> Do the recent iCafe drivers work with the 980ti. I have been running those rock solid for about 3 weeks.


Sorry, but what the heck is an iCafe driver? I have seen it mentioned by a crazy person on the Nvidia forums and by some more sane people on other forums, but why would iCafe drivers be different from these atrocious WHQL drivers that keep crashing my system? Do these iCafe drivers have SLI profiles and otherwise stay up to date on applications?
_Since the Fury X isn't out yet, I might as well try whatever Hail Mary comes my way in the meantime.. lol_

System info: OG Titan 4-way SLI (stock BIOS and stock clocks)
3930K (stock clocks)
RIVE
Corsair 1500W and 1200W dual PSUs

So far I have tried all kinds of fixes, including AB polling rate at 5000 or 6000, not running any overlay (PX, AB or Steam), the Performance and Adaptive settings in the NV control panel, PhysX to CPU and back to GPU, and clean installs numerous times. 353.06 is a complete bust for me; the previous version gives me fewer crashes.

If there is a silver bullet out there, I'm more than willing to try it to put my rig out of its misery..


----------



## iamhollywood5

I finally got around to updating to these drivers. I've had absolutely no problems, to my pleasant surprise. 780 Ti on win8.1. I used DDU to completely wipe the old drivers, and then explicitly chose to install the new set as admin.


----------



## w35t

I'm crashing extremely often using both 970 and 980 Ti GPUs. Getting extremely frustrated by this.


----------



## Woundingchaney

Quote:


> Originally Posted by *provost*
> 
> Sorry, but what the heck is an iCafe driver? I have seen it mentioned by a crazy person on the Nvidia forums and by some more sane people on other forums, but why would iCafe drivers be different from these atrocious WHQL drivers that keep crashing my system? Do these iCafe drivers have SLI profiles and otherwise stay up to date on applications?
> _Since the Fury X isn't out yet, I might as well try whatever Hail Mary comes my way in the meantime.. lol_
> 
> System info: OG Titan 4-way SLI (stock BIOS and stock clocks)
> 3930K (stock clocks)
> RIVE
> Corsair 1500W and 1200W dual PSUs
> 
> So far I have tried all kinds of fixes, including AB polling rate at 5000 or 6000, not running any overlay (PX, AB or Steam), the Performance and Adaptive settings in the NV control panel, PhysX to CPU and back to GPU, and clean installs numerous times. 353.06 is a complete bust for me; the previous version gives me fewer crashes.
> 
> If there is a silver bullet out there, I'm more than willing to try it to put my rig out of its misery..


http://drivers.softpedia.com/blog/NVIDIA-Outs-iCafe-GeForce-Graphics-Driver-352-94-Download-Now-482678.shtml

This is what I am using and my system is rock solid.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *provost*
> 
> Sorry, but what the heck is an iCafe driver? I have seen it mentioned by a crazy person on the Nvidia forums and by some more sane people on other forums, but why would iCafe drivers be different from these atrocious WHQL drivers that keep crashing my system? Do these iCafe drivers have SLI profiles and otherwise stay up to date on applications?
> _Since the Fury X isn't out yet, I might as well try whatever Hail Mary comes my way in the meantime.. lol_
> 
> System info: OG Titan 4-way SLI (stock BIOS and stock clocks)
> 3930K (stock clocks)
> RIVE
> Corsair 1500W and 1200W dual PSUs
> 
> So far I have tried all kinds of fixes, including AB polling rate at 5000 or 6000, not running any overlay (PX, AB or Steam), the Performance and Adaptive settings in the NV control panel, PhysX to CPU and back to GPU, and clean installs numerous times. 353.06 is a complete bust for me; the previous version gives me fewer crashes.
> 
> If there is a silver bullet out there, I'm more than willing to try it to put my rig out of its misery..


This person gave a good explanation about it:
Quote:


> Originally Posted by *dmasteR*
> 
> NVIDIA China typically gets the latest branch of NVIDIA drivers. They're typically very stable and offer slightly better performance than the regular NVIDIA drivers. Just remember not to install anything but the drivers.
> 
> These drivers are used at Internet cafes in China, since Internet cafes are very popular over there, so they need the most stable drivers possible with a slight performance boost.


----------



## caenlen

Quote:


> Originally Posted by *kx11*
> 
> 3 damn " driver stopped responding " messages in a row
> 
> holy crap


Happens to me randomly all the time. I installed my 290X again. Rock solid. AMD gets a bad rap.


----------



## Silent Scone

Drivers have been rock solid for me besides a couple of issues with Witcher 3.


----------



## PyroTechNiK

Horrible experience with these drivers. Constant TDR crashes all day when playing any game. I removed this abomination of a driver and installed the iCafe 352.94, and I haven't had a crash in 2 days so far.


----------



## Slink3Slyde

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> This person gave a good explanation about it:


What am I missing here? Is there any particular reason why, if these iCafe drivers are better than the standard ones, they aren't just used as the standard?


----------



## Toan

Do these iCafe drivers work with the 980 Ti?


----------



## 8472

I haven't installed my 980 Ti's yet, but if I have trouble with this driver, will any other drivers work with the 980 Ti?


----------



## Celcius

These drivers have been flawless for my 780Ti.


----------



## Blackops_2

I get frequent crashes weekly, gaming and browsing. Stock 780 Classies. Dunno, never had this issue before, and I did a clean install.


----------



## Toan

I can't even overclock cause the drivers crash while I'm benching.


----------



## anti-clockwize

Quote:


> Originally Posted by *neoroy*
> 
> Time to rollback to 350.12 untill Nvidia fix this issue.


350.12 FTW, I am also back on them. Nice and stable compared to the later two, which both gave me a BSOD x050 - nvlddmkm.sys while using a web browser or even doing nothing at all (on 780 Ti SLI).


----------



## SharpShoot3r07

I used to get BSODs all the time with a 780 Ti SLI setup. Firefox would also randomly crash about once an hour. I decided to delete the drivers and reinstall them. Ever since then it seems to work a lot better. My web browser only crashes maybe once or twice a day, and I can't remember any BSODs in the past two days.


----------



## Dyaems

Kind of off-topic... but do the latest drivers work with a PLP surround setup? Or do we still need a third-party program (if there is one) to get it to work?


----------



## Asus11

Quote:


> Originally Posted by *Celcius*
> 
> These drivers have been flawless for my 780Ti.


What drivers are you running? Hopefully they should be good for the Titan Z? I'm moments away from watercooling it.


----------



## SirWaWa

Are the 350.12 drivers as solid as 347.88? Any TDRs?


----------



## provost

Quote:


> Originally Posted by *SirWaWa*
> 
> are 350.12 drivers as solid as 347.88? any TDR's?


With the qualifier that I haven't tested the iCafe drivers extensively: using them hasn't given me any crashes, but you also lose any optimization boost for Kepler that was supposedly part of the 353.06 drivers. So thanks to Woundingchaney and MrTOOSHORT for recommending and clarifying these drivers for me.


----------



## SirWaWa

Quote:


> Originally Posted by *provost*
> 
> With the qualifier that I haven't tested the iCafe drivers extensively: using them hasn't given me any crashes, but you also lose any optimization boost for Kepler that was supposedly part of the 353.06 drivers. So thanks to Woundingchaney and MrTOOSHORT for recommending and clarifying these drivers for me.


So stick with 347.88 is what you're saying?
I was getting TDRs with 353.06.


----------



## provost

Quote:


> Originally Posted by *SirWaWa*
> 
> so stick with 347.88 is what you're saying?
> I was getting TDR's with 353.06


I am using these that Woundingchaney had linked earlier.

http://drivers.softpedia.com/blog/NVIDIA-Outs-iCafe-GeForce-Graphics-Driver-352-94-Download-Now-482678.shtml


----------



## nircc

Hey guys,
with these new drivers, do you think it's possible for me to run HairWorks ON in The Witcher 3 at 1440p with all settings on Ultra?
PG278Q + 980 Ti G1?
Or should I run it with HW off?


----------



## DNMock

Quote:


> Originally Posted by *nircc*
> 
> Hey Guys
> with these new drivers do you think its possible for me to Hairworks ON in TheWitcher 3 1440p all settings on Ultra
> PG278Q +980 TI G1 ?
> or should i run it with HW Off


The 1.05 Witcher 3 patch did a lot more for improving performance with hairworks than the Nvidia drivers did.


----------



## nircc

Quote:


> Originally Posted by *DNMock*
> 
> The 1.05 Witcher 3 patch did a lot more for improving performance with hairworks than the Nvidia drivers did.


So it should be possible?


----------



## DNMock

Quote:


> Originally Posted by *nircc*
> 
> So its should be possible


Yeah, and if not, you can probably go into the user.settings file and turn the HairWorks settings down a little.
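For reference, the tweak usually passed around for this is lowering the HairWorks AA level in `user.settings` (under Documents\The Witcher 3). The key name below is the one commonly reported on forums, not something verified against the game files, so treat it as an assumption and back the file up first:

```ini
; Documents\The Witcher 3\user.settings -- commonly reported HairWorks tweak
; (key name as circulated on forums; back up the file before editing)
[Rendering]
HairWorksAALevel=2   ; reported default is 8; lower values cost less performance
```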


----------



## Asus11

Just downloaded these; now my system shuts off while playing games.


----------



## mcg75

I hesitated to go to these drivers based on feedback here.

But I did so a week ago. I've only played Dying Light for about 10 hours so far, and use Chrome for browsing; no issues to report.


----------



## azanimefan

Quote:


> Originally Posted by *mcg75*
> 
> Hesitated going to these drivers based on feedback here.
> 
> But did so a week ago. Have only played Dying Light for about 10 hours so far but using Chrome only for browsing, no issues to report.


I didn't have any issues for a few days with these drivers, so I was in here saying they worked for me. Well, I admit now, they're junk drivers. After a couple of days nothing worked anymore. Ended up rolling back to the stable 347.88.


----------

