# [PCPer] NVIDIA Responds to GTX 970 3.5GB Memory Issue



## Baghi

Quote:


> The GeForce GTX 970 is equipped with 4GB of dedicated graphics memory. However the 970 has a different configuration of SMs than the 980, and fewer crossbar resources to the memory system. To optimally manage memory traffic in this configuration, we segment graphics memory into a 3.5GB section and a 0.5GB section. The GPU has higher priority access to the 3.5GB section. When a game needs less than 3.5GB of video memory per draw command then it will only access the first partition, and 3rd party applications that measure memory usage will report 3.5GB of memory in use on GTX 970, but may report more for GTX 980 if there is more memory used by other commands. When a game requires more than 3.5GB of memory then we use both segments.


Source
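NVIDIA's statement above amounts to a priority allocator over two segments: fill the fast 3.5GB pool first, spill into the slow 0.5GB pool only when needed. As a rough illustration only (the real logic lives in the driver; the class, method names, and MB bookkeeping here are invented for the sketch):

```python
# Toy model of the two-segment allocation NVIDIA describes (illustrative only;
# the real allocator lives in the driver, and these names are invented).
FAST_MB, SLOW_MB = 3584, 512  # 3.5GB high-priority + 0.5GB low-priority

class SegmentedVram:
    def __init__(self):
        self.fast_used = 0
        self.slow_used = 0

    def alloc(self, mb):
        """Fill the fast 3.5GB segment first; spill the rest into the slow 0.5GB."""
        fast_part = min(mb, FAST_MB - self.fast_used)
        slow_part = mb - fast_part
        if slow_part > SLOW_MB - self.slow_used:
            raise MemoryError("out of VRAM")
        self.fast_used += fast_part
        self.slow_used += slow_part
        return fast_part, slow_part

vram = SegmentedVram()
print(vram.alloc(3000))  # fits entirely in the fast segment -> (3000, 0)
print(vram.alloc(800))   # 584MB of fast remains, 216MB spills -> (584, 216)
```

This is also why monitoring tools report ~3.5GB in use: workloads under 3.5GB never touch the second segment at all.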


----------



## ZealotKi11er

Clearly this was by design.


----------



## hurleyef

Seems pretty negligible to me, but I'd still like to see more in-depth testing from someone other than Nvidia, just to be sure.


----------



## Cybertox

So the VRAM of the 970 is segmented into 3.5GB and 0.5GB sections, whereas the 980 has a single 4GB segment? Sounds like a cheesy design strategy.


----------



## Noufel

Why didn't Nvidia say that when they launched the 970?


----------



## benbenkr

They can't even spell Shadow of Mord*o*r right. Yeah.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Noufel*
> 
> Why didn't Nvidia say that when they launched the 970?


So they can sell more. Most people are not going to use more than 3.5GB, but many people bought the GTX 970 as an upgrade over the GTX 780/Ti because of the extra VRAM.


----------



## Kuivamaa

Time for a site that has access to FCAT to examine more than framerates (I hope it is allowed since FCAT is an nvidia tool).


----------



## Noufel

Quote:


> Originally Posted by *Kuivamaa*
> 
> Time for a site that has access to FCAT to examine more than framerates (I hope it is allowed since FCAT is an nvidia tool).


I don't think Nvidia will allow this until they find a solution for the problem.


----------



## hht92

That's why I waited a year for my 780 (yeah, yeah, I got it four months before the 900 series): the newest product isn't always the best product.


----------



## LocutusH

Quote:


> Originally Posted by *Noufel*
> 
> Why didn't Nvidia say that when they launched the 970?


Why would anyone care about this? The 970 brings an awesome performance/price ratio either way. How it does that internally isn't really that interesting. Only for synthetic testers, maybe.


----------



## iTurn

How is the 4GB claim not false advertising?

I'm glad this came to light. Initially, those who were affected by this and posted about it were accused of trying to start a smear campaign.


----------



## benbenkr

Quote:


> Originally Posted by *iTurn*
> 
> How is the 4GB claim not false advertising?
> 
> I'm glad this came to light. Initially, those who were affected by this and posted about it were accused of trying to start a smear campaign.


It's not false advertising. The fact remains that the 970 has 4GB of VRAM on board. That's it; don't argue about this further.

The issue is how the 4GB of VRAM is being used, *NOT* that the 970 is missing 512MB of VRAM.
Quote:


> Originally Posted by *LocutusH*
> 
> Why would anyone care about this? The 970 brings an awesome performance/price ratio either way. How it does that internally isn't really that interesting. Only for synthetic testers, maybe.


It's called first world problems.

People are angry that the 4GB of VRAM doesn't work as it should and that Nvidia kept quiet about it. A few days ago a rep was saying they are "looking" into the issue, but really it's more about how to word their PR statement than about actually doing anything specific to the 970.


----------



## Noufel

Quote:


> Originally Posted by *LocutusH*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Noufel*
> 
> Why didn't Nvidia say that when they launched the 970?
> 
> 
> 
> Why would anyone care about this? The 970 brings an awesome performance/price ratio either way. How it does that internally isn't really that interesting. Only for synthetic testers, maybe.

I think people who bought the 970 expecting 4GB, not 3.5GB (technically only 3.5GB is fully usable by the GPU), for 350 bucks apiece care.


----------



## mtcn77

Quote:


> Originally Posted by *LocutusH*
> 
> Why would anyone care about this? The 970 brings an awesome performance/price ratio either way. How it does that internally isn't really that interesting. *Only for synthetic* testers, maybe.


Except the driver detects benchmarks, so you never notice the difference except in games (see post #90).
Many reviewers also resort to drawing conjectures from a repertoire of games that never comes close to using 4GB.


----------



## LocutusH

Quote:


> Originally Posted by *Noufel*
> 
> I think people who bought the 970 expecting 4GB, not 3.5GB, for 350 bucks apiece care.


While I see your point, there is no reason to call it 3.5GB. It has 4GB, and all 4GB is accessible. There was no word about a slower part, though.
Either way, anyone who bought a 970 did it because of its performance in today's games. Is that affected by this newfound issue? NO. It still brings the fps numbers everyone wanted...


----------



## velocd

Quote:


> Originally Posted by *benbenkr*
> 
> It's not false advertising. The fact remains, the 970 has 4GB of VRAM on board. That's it, don't argue about this further.


It's not false advertising, but it is deceptive advertising, which may hurt Nvidia sales in the long run. If something is marketed as 4GB of VRAM, you reasonably expect it to use up to 4GB effectively, not 3.5GB effectively and the last 0.5GB poorly. I'm okay with the latter as long as it's marketed to perform that way, but Nvidia kept that quiet for obvious reasons.


----------



## JonnyBigBoss

I feel like Nvidia was dishonest by not being clear about this from the get-go.

I own a GTX 970 and now feel like my PC is less future proof.


----------



## maarten12100

Quote:


> Originally Posted by *Noufel*
> 
> Why didn't nvidia said that when they launched their 970 ?


They have basically made up this two-way segmentation PR rubbish just now.

They should have listed it as 3.5 + 0.5 GB or something like that when they sold it.

Nvidia lists throughput here:
http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970/specifications

224GB/s for the full 4GB, which is not the case; that would qualify as false advertising, because the actual bandwidth of the slow segment is roughly 1/10th of that, which is significantly off.
Quote:


> Originally Posted by *LocutusH*
> 
> Why would anyone care about this? The 970 brings awesome performance/price ratio either way. How it does that internally, isnt really that interesting. Only for syntethic testers maybe.


Bandwidth is cut to a point where hiccups occur.
Quote:


> Originally Posted by *benbenkr*
> 
> People are angry that the 4GB of VRAM doesn't work as it should and that Nvidia kept quiet about it. A few days ago a rep was saying they are "looking" into the issue, but really it's more like how they should word out their PR statement instead of actually doing anything specifically to the 970.


Indeed.


----------



## Xuper

Another Link :

http://techreport.com/news/27721/nvidia-admits-explains-geforce-gtx-970-memory-allocation-issue

So according to NV, we have no problem, lol! Serious question: why didn't Nvidia tell us before launch?


----------



## ZealotKi11er

Quote:


> Originally Posted by *LocutusH*
> 
> While I see your point, there is no reason to call it 3.5GB. It has 4GB, and all 4GB is accessible. There was no word about a slower part, though.
> Either way, anyone who bought a 970 did it because of its performance in today's games. Is that affected by this newfound issue? NO. It still brings the fps numbers everyone wanted...


No. People wanted 4GB of VRAM. A lot of GTX 780 owners got it for the extra 1GB, not 512MB. It's false advertising if it's true. It's like putting 1.5GB of VRAM in a GTX 570 and saying only 1.25GB is usable. As a 3.5GB card, the GTX 970 would have lost sales to the R9 290 easily.


----------



## gigafloppy

It's not clear to me: does this mean the main 3.5GB has lower bandwidth because the last 0.5GB is not used most of the time? And is that 0.5GB a physical chip or a range of addresses across all 8 memory chips?


----------



## velocd

Quote:


> Originally Posted by *JonnyBigBoss*
> 
> I feel like Nvidia was dishonest by not being clear about this from the get-go.
> 
> I own a GTX 970 and now feel like my PC is less future proof.


I feel the same way. I build systems to last 4-5 years, and I would have purchased a GTX 980 had I known the GTX 970 was effectively 3.5GB.

This news also depreciates the resale value of the GTX 970.


----------



## LocutusH

Quote:


> Originally Posted by *JonnyBigBoss*
> 
> I feel like Nvidia was dishonest by not being clear about this from the get-go.
> 
> I own a GTX 970 and now feel like my PC is less future proof.


What would have been an honest start? To say it's 3.5GB, when it does have 4GB? Or to tell people that 3.5GB is fast and 0.5GB is slower, because the GPU is stripped down from the 980 and just can't do more?

And there's also the fact that users found a lot of other, older cards that are affected by this slow upper memory region with that specific synthetic benchmark; a lot of GPUs, even 290s, some more, some less. I think both GPU companies use this technique to address memory. It always depends on the GPU architecture whether it can use exactly 4GB, or 2GB, or just a bit less.


----------



## ZealotKi11er

Quote:


> Originally Posted by *gigafloppy*
> 
> It's not clear to me: does this mean the main 3.5GB has lower bandwidth because the last 0.5GB is not used most of the time? And is that 0.5GB a physical chip or a range of addresses across all 8 memory chips?


I am not 100% sure how memory bandwidth is allocated, but it would mean the card has 7/8ths of its memory bandwidth if only 3.5GB is full speed. That would be 256-bit down to 224-bit.
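The 7/8ths arithmetic above checks out against the spec-sheet numbers. A quick back-of-envelope sketch (assuming the advertised 7 Gbps effective GDDR5 data rate; the 224-bit width is this thread's speculation, not a confirmed spec):

```python
# Back-of-envelope GDDR5 bandwidth: bus_width_bits / 8 bytes * effective data rate.
# 7 Gbps is the GTX 970's advertised memory data rate; 224-bit is the
# hypothetical width if one of eight 32-bit channels were excluded.
def bandwidth_gbs(bus_bits, gbps=7.0):
    return bus_bits / 8 * gbps

print(bandwidth_gbs(256))  # 224.0 GB/s -- the advertised number
print(bandwidth_gbs(224))  # 196.0 GB/s -- 7/8ths, as speculated above
```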


----------



## maarten12100

Quote:


> Originally Posted by *LocutusH*
> 
> What would have been an honest start? To say it's 3.5GB, when it does have 4GB? Or to tell people that 3.5GB is fast and 0.5GB is slower, because the GPU is stripped down from the 980 and just can't do more?
> 
> And there's also the fact that users found a lot of other, older cards that are affected by this slow upper memory region with that specific synthetic benchmark; a lot of GPUs, even 290s, some more, some less. I think both GPU companies use this technique to address memory. It always depends on the GPU architecture whether it can use exactly 4GB, or 2GB, or just a bit less.


They should have said that 3.5GB runs at 224GB/s and that touching the other 0.5GB cuts the bandwidth of the whole pool back to a puny ~20GB/s.

Also, that benchmark is meant to be run headless; otherwise it isn't an indication of anything, and all cards would fail to meet their bandwidth numbers at those ranges of memory usage. Now there are reports of hiccups, so there are real implications. Note also that the Nvidia-provided numbers are labeled "example", as if they were not representative; this might be because users report hiccups despite the frame rate staying constant.


----------



## ZealotKi11er

Either way, Nvidia has to pay for this. They made a false advertisement, damaging their partners and the competition's sales, and they created sales from people who would not have upgraded or switched sides if the GTX 970 had been a 3.5GB card.


----------



## thebski

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am not 100% sure how memory bandwidth is allocated, but it would mean the card has 7/8ths of its memory bandwidth if only 3.5GB is full speed. That would be 256-bit down to 224-bit.


I could be wrong, but isn't the memory bus made up of 64-bit controllers? A 256-bit bus would be made up of four 64-bit controllers. That would mean it could run on either a 192-bit or a 256-bit bus. Feel free to correct me, but I don't know that 224-bit is possible.
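Whether 224-bit is reachable depends on the granularity of the memory channels, which this thread hasn't established. A tiny enumeration of both possibilities (purely hypothetical; it doesn't claim either layout for this GPU):

```python
# Enumerate the bus widths reachable when a 256-bit bus is assembled from
# controllers of a given granularity. With 64-bit controllers only, 224-bit
# is impossible; with independent 32-bit channels, it becomes reachable.
def widths(total_bits, granularity):
    return [n * granularity for n in range(1, total_bits // granularity + 1)]

print(widths(256, 64))  # [64, 128, 192, 256] -- no 224 here
print(widths(256, 32))  # [32, 64, ..., 224, 256] -- 224 is possible
```

So the answer to the question hinges on whether each 64-bit controller can run one of its halves independently.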


----------



## velocd

Quote:


> Originally Posted by *LocutusH*
> 
> What would have been an honest start? To say it's 3.5GB, when it does have 4GB? Or to tell people that 3.5GB is fast and 0.5GB is slower, because the GPU is stripped down from the 980 and just can't do more?


I think they can advertise 4GB, but it should be written in the technical specifications that the last 0.5GB is allocated differently and is slower. They didn't explain this because it would have hurt sales.


----------



## maarten12100

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Either way, Nvidia has to pay for this. They made a false advertisement, damaging their partners and the competition's sales, and they created sales from people who would not have upgraded or switched sides if the GTX 970 had been a 3.5GB card.


It is 4GB, but that 0.5GB just wrecks the available bandwidth to the point where having a lower amount of faster memory would run better. So it is there, but it is not practical to ever use. Still, I believe the bandwidth of those parts and how the memory system works were falsely advertised.

Not sure it's reason to start a lawsuit (I actually think not), but people are going to try, I guess.


----------



## NuclearPeace

I don't see a problem. All 970s come with 4GB of GDDR5 on the board. Just because it's rarely used doesn't mean it isn't there. It doesn't seem to affect performance much, if at all, anyway.

The 660 usually could only use 1.5GB of its VRAM most of the time, for similar reasons.


----------



## Bartouille

So basically it's a 3.5GB card. It's not too bad, but still, it's not the first time Nvidia has used weird memory setups (like the GTX 660 Ti with 2GB on a 192-bit bus).


----------



## Xoriam

Guys, mine starts to crap out at about 2.5GB and then crashes on the final 2 or 3 passes.
I've tried changing the BIOS and drivers and swapping my cards.
Same result every time.


----------



## ZealotKi11er

Quote:


> Originally Posted by *maarten12100*
> 
> It is 4GB, but that 0.5GB just wrecks the available bandwidth to the point where having a lower amount of faster memory would run better. So it is there, but it is not practical to ever use. Still, I believe the bandwidth of those parts and how the memory system works were falsely advertised.
> 
> Not sure it's reason to start a lawsuit (I actually think not), but people are going to try, I guess.


I don't see why not. Being there really means nothing. It's like the 3930K being an 8-core die with only 6 cores operational. Would you like Intel to advertise it as an 8-core CPU when you can only use 6? It's technically there.


----------



## Xuper

Quote:


> Originally Posted by *LocutusH*
> 
> What would have been a honest start? To say its 3.5GB, when it does have 4GB? Or to tell that 3.5GB is fast, and 0.5GB is slower, because the GPU is stripped down from the 980, and it just cant do more?
> 
> And there also the fact, that users found a lot of other, older cards, that are also affected by this slowing down upper memory part. In fact a lot of GPU-s, even 290-s with that specific syntethic benchmark. Some more, some less. I think both GPU companies are using this techique to address memory. It is always highly dependent on the GPU architecture, if it can use exactly 4GB, or 2GB, or just a bit less.


Dude, the problem is not the GTX 970, nor the memory, 3.5GB or 512MB of VRAM. The thing we must focus on is *trust*. (Sorry, my English is not good, but you can understand what I mean.) Nvidia knew this from the start; on the other hand, Nvidia has lost the trust of 970 owners.


----------



## benbenkr

Quote:


> Originally Posted by *maarten12100*
> 
> It is 4GB, but that 0.5GB just wrecks the available bandwidth to the point where having a lower amount of faster memory would run better. So it is there, but it is not practical to ever use. Still, I believe the bandwidth of those parts and how the memory system works were falsely advertised.
> 
> *Not sure it's reason to start a lawsuit (I actually think not), but people are going to try, I guess*.


Not sure there's much chance of winning if there's a lawsuit. At the end of the day, the 970 does have 4GB of VRAM onboard, and even though 0.5GB is technically crippled, Nvidia never claimed the 970 had less than 4GB of VRAM. The 0.5GB of VRAM is also usable, albeit slow.


----------



## TheMentalist

If I had a company making GPUs, I would do the same thing Nvidia did. It's called smart marketing **evil laugh**


----------



## thebski

Quote:


> Originally Posted by *benbenkr*
> 
> Not sure there's much chance of winning if there's a lawsuit. At the end of the day, the 970 does have 4GB of VRAM onboard, and even though 0.5GB is technically crippled, Nvidia never claimed the 970 had less than 4GB of VRAM. The 0.5GB of VRAM is also usable, albeit slow.


They did technically lie about it having 4GB of VRAM on a 256-bit bus. I would assume most people consider the memory subsystem as a whole when purchasing (quantity, clock, and bus width).


----------



## ZealotKi11er

Quote:


> Originally Posted by *benbenkr*
> 
> Not sure there's much chance of winning if there's a lawsuit. At the end of the day, the 970 does have 4GB of VRAM onboard, and even though 0.5GB is technically crippled, Nvidia never claimed the 970 had less than 4GB of VRAM. The 0.5GB of VRAM is also usable, albeit slow.


There are 1 million+ GTX 970/980 owners, probably 80% of them with GTX 970s. Nvidia would lose easily if enough people went for it. I was going to buy a GTX 970 for a friend's build, even spending $110 more than on an R9 290, but I don't think I'm getting it for him now.


----------



## looniam

Quote:


> Originally Posted by *Xoriam*
> 
> Guys, mine starts to crap out at about 2.5GB and then crashes on the final 2 or 3 passes.
> I've tried changing the BIOS and drivers and swapping my cards.
> Same result every time.


That's an unreliable benchmark... throw it away.


----------



## PontiacGTX

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I don't see why not. Being there really means nothing. It's like the 3930K being an 8-core die with only 6 cores operational. Would you like Intel to advertise it as an 8-core CPU when you can only use 6? It's technically there.


Technically, the only false advertising would be if the last 64 bits of the memory controller have 4 functional ROPs in a >3.5GB scenario, and if you can prove that the card uses a 213-bit bus instead of 256-bit. As a card, it could manage [email protected]/1440+AA without the delta compression in game.


----------



## Blindsay

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I don't see why not. Being there really means nothing. It's like the 3930K being an 8-core die with only 6 cores operational. Would you like Intel to advertise it as an 8-core CPU when you can only use 6? It's technically there.


That's not really a good comparison.

The 2 cores on that 3930K die have been fused off and could never be used in any way, shape, or form.

This card does have 4GB and can use it all, just not in the way one would typically expect.


----------



## Asus11

Whatever happens, Nvidia in my eyes has done wrong, but what can you expect? The price difference between the 970 & 980 is huge...

There had to be something up. I guess you get what you pay for when it comes to Nvidia.


----------



## PontiacGTX

Quote:


> Originally Posted by *Asus11*
> 
> Whatever happens, Nvidia in my eyes has done wrong, but what can you expect? The price difference between the 970 & 980 is huge...
> 
> There had to be something up. I guess you get what you pay for when it comes to Nvidia.


Call it the GTX 970 SE, not GTX 970, and price it at 280-310 instead of 330, with a caution note.


----------



## TopicClocker

Sigh, this is not good.


----------



## FlyingSolo

Quote:


> Originally Posted by *ZealotKi11er*
> 
> No. People wanted 4GB of vRAM. A lot of GTX780 owners got it for the extra 1 GB and not 512MB. Its false advertisement if it's true. It's like putting 1.5GB of vRAM in GTX570 and saying only 1.25GB is being used. As a 3.5GB card GTX970 would have lost sales to R9 290 easily.


I have to agree with you. I went from a 780 to a 970 because of the VRAM; otherwise I would have bought a 290 or 290X. I think Nvidia needs to give refunds to the people who bought the cards because of the VRAM. With GTA 5 coming, I'm sure mods will push it over 3.5GB easily.


----------



## John Shepard

So what happens when it accesses the 512MB section?

I go over 3.5GB all the time and haven't noticed anything strange.


----------



## CircuitFreak

Quote:


> Originally Posted by *NuclearPeace*
> 
> I don't see a problem. All 970s come with 4GB of GDDR5 on the board. Just because it's rarely used doesn't mean it isn't there. It doesn't seem to affect performance much, if at all, anyway.
> 
> The 660 usually could only use 1.5GB of its VRAM most of the time, for similar reasons.


That's exactly why I bought the wifey the 3GB 660 Ti: no bottleneck on a small part of the memory if it was ever needed.


----------



## ZealotKi11er

When Nvidia tries to hit a price point, they do much worse than AMD. They are good at making cards with no budget constraints or a very large budget, like the 780 Ti and Titan.


----------



## morbid_bean

So can someone explain this to me? Probably a question for Nvidia, I dunno.

Now that NVIDIA has explained the setup, it sounds like it's not a problem, right? If it needs more than 3.5GB, it jumps to the second pool of RAM? Then why are people getting stuttering issues on these cards?


----------



## DuckieHo

Quote:


> Originally Posted by *Xuper*
> 
> So according to NV, we have no problem, lol! Serious question: why didn't Nvidia tell us before launch?


because the vast majority don't understand... and even more don't care.


----------



## mtcn77

So, factually, how wide is the card's memory bus interface? 208, 213, 224, 240, or 256 bits?


----------



## ZealotKi11er

Quote:


> Originally Posted by *mtcn77*
> 
> So, factually, how wide is the card's memory bus interface? 208, 213, 224, 240, or 256 bits?


If only 3.5GB is full speed, it would technically be 224-bit. I've noticed something strange about the GTX 970 since launch: the x70 part was much slower, relative to the x80, than in older generations, considering that on paper there should not have been that big a gap between the GTX 980 and GTX 970. Most reviews also used reference 980s, while the 970s were mostly overclocked non-reference cards.


----------



## 2010rig

Quote:


> Originally Posted by *ZealotKi11er*
> 
> There are 1 million+ GTX 970/980 owners, probably 80% of them with GTX 970s. Nvidia would lose easily if enough people went for it. I was going to buy a GTX 970 for a friend's build, even spending $110 more than on an R9 290, but I don't think I'm getting it for him now.


Has the 970 become slower than a 290 all of a sudden?


----------



## mcg75

Same warning here as the other thread.

Keep posts on topic.

Do not make personal comments toward others such as calling them fanboy etc.

If you're going to debate, treat others with respect while doing so.


----------






## Fateful_Ikkou

Quote:


> Originally Posted by *LocutusH*
> 
> While I see your point, there is no reason to call it 3.5GB. It has 4GB, and all 4GB is accessible. There was no word about a slower part, though.
> Either way, *anyone who bought a 970 did it because of its performance in today's games. Is that affected by this newfound issue? NO. It still brings the fps numbers everyone wanted*...


I beg to differ with that statement. My brother, who has two GTX 970s in SLI, can't max out Battlefield 4 properly at 1080p because his cards hit the 3.5GB "limit" and he starts stuttering like hell. It's the same with my one 970 and COD: Ghosts: when I hit that VRAM "limit" I go from a steady 55-60fps down to 37-46 and start stuttering and lagging to the point I can't play. The issue exists; claiming it doesn't exist for a majority doesn't mean it doesn't exist or that everyone is happy. Don't get me wrong, I still love the card, and it works fine for 95% of the games I play, but the other 5% is a big letdown. This card is more than capable of maxing out COD: Ghosts and Battlefield 4, but because of that VRAM issue I have to either lower my settings or watch my VRAM usage like a hawk and quit when it gets to that limit, and I shouldn't have to do that when the GPU itself is more than capable.


----------



## Cybertox

Quote:


> Originally Posted by *DuckieHo*
> 
> because the vast majority don't understand... and even more don't care.


Judging by this thread most of the users actually care.


----------



## DuckieHo

Quote:


> Originally Posted by *Cybertox*
> 
> Judging by this thread most of the users actually care.


The operative phrase is "vast majority".

Companies have to simplify marketing and published specs for the masses. There are not many out there who understand the intricacies of hardware and software design.


----------



## jprovido

Quote:


> Originally Posted by *LocutusH*
> 
> Why would anyone care about this? The 970 brings an awesome performance/price ratio either way. How it does that internally isn't really that interesting. Only for synthetic testers, maybe.


If you bought two of those cards expecting 4GB of VRAM, and you "upgraded" from a GTX 780 because of that, it would definitely piss you off. And yeah, I am that person.


----------



## Cybertox

Quote:


> Originally Posted by *DuckieHo*
> 
> The operative phrase is "vast majority".
> 
> Companies have to simplify marketing and published specs for the masses. There are not many out there who understand the intricacies of hardware and software design.


Pretty sure that as all this news and these rumours spread, more and more users will find out about this, and they won't be too satisfied learning about Nvidia's tricky marketing strategy and the segmentation of the VRAM on the 970.


----------



## Vesku

Quote:


> Originally Posted by *Noufel*
> 
> Why didn't nvidia said that when they launched their 970 ?


This.

Why are reviewers cutting Nvidia so much slack for not telling people about this memory configuration?

From the PCPer write-up:
Quote:


> Our own Josh Walrath offers this analysis:
> 
> A few days ago when we were presented with evidence of the 970 not fully utilizing all 4 GB of memory, I theorized that it had to do with the reduction of SMM units. It makes sense from an efficiency standpoint to perhaps "hard code" memory addresses for each SMM. The thought behind that would be that 4 GB of memory is a huge amount of a video card, and the potential performance gains of a more flexible system would be pretty minimal.
> 
> I believe that the memory controller is working as intended and not a bug. When designing a large GPU, there will invariably be compromises made. From all indications NVIDIA decided to save time, die size, and power by simplifying the memory controller and crossbar setup. These things have a direct impact on time to market and power efficiency. NVIDIA probably figured that a couple percentage points of performance lost was outweighed by the added complexity, power consumption, and engineering resources that it would have taken to gain those few percentage points back.


That's fine but they should have disclosed this compromise.


----------



## TheMentalist

I wonder why this wasn't discovered earlier. We need to start inspecting the cards more thoroughly.


----------



## DuckieHo

Quote:


> Originally Posted by *Cybertox*
> 
> Pretty sure that as all this news and these rumours spread, more and more users will find out about this, and they won't be too satisfied learning about Nvidia's tricky marketing strategy and the segmentation of the VRAM on the 970.


...and yet, the masses probably still won't care.

It's not trickery but a design limitation. How do you explain memory allocation and tiering to the masses?


----------



## boot318

Vesku, the only tech company that would take flak for this is AMD. The "fat cats" would just give incentives not to criticize them.


----------



## Leopard2lx

Quote:


> Originally Posted by *2010rig*
> 
> Has the 970 become slower than a 290 all of a sudden?


Wow! Only a 20% difference between a single 980 and the dual-GPU 295X2.
On the other hand, the 970 is kicking butt, so I don't see why people are complaining. And if you want to play at 4K you should have gotten a 980 or two. I don't have any stutters at 4K even when VRAM is maxed out.


----------



## tweezlednutball

Quote:


> Originally Posted by *Fateful_Ikkou*
> 
> I beg to differ with that statement. My brother, who has two GTX 970s in SLI, can't max out Battlefield 4 properly at 1080p because his cards hit the 3.5GB "limit" and he starts stuttering like hell. It's the same with my one 970 and COD: Ghosts: when I hit that VRAM "limit" I go from a steady 55-60fps down to 37-46 and start stuttering and lagging to the point I can't play. The issue exists; claiming it doesn't exist for a majority doesn't mean it doesn't exist or that everyone is happy. Don't get me wrong, I still love the card, and it works fine for 95% of the games I play, but the other 5% is a big letdown. This card is more than capable of maxing out COD: Ghosts and Battlefield 4, but because of that VRAM issue I have to either lower my settings or watch my VRAM usage like a hawk and quit when it gets to that limit, and I shouldn't have to do that when the GPU itself is more than capable.


That sounds really bad, as I have two 7970s (old faithful) in CrossFire with only 3GB of VRAM each, and I run the game at full ultra with 200% resolution scale, buttery smooth. I guess that's what a true 384-bit bus gets you. Nvidia has been known to split buses like this; it isn't the first time.


----------



## ChrisB17

I will be returning mine; I didn't even open it. I don't want a card that is falsely advertised and, IMO, defectively designed.


----------



## boot318

Quote:


> Originally Posted by *TheMentalist*
> 
> I wonder why this wasn't discovered earlier. We need to start inspecting the cards more thoroughly.


People have been saying this for a while; they mostly got talked down to. People had faith in Nvidia... and this happened. Those who followed what the numbers were telling them must feel vindicated.


----------



## jprovido

Would the distributors in my country even accept returns for my 970s? This is getting pretty annoying. Damn you, Nvidia; I have words in my mind I can't type here on OCN. I'm not happy at all.


----------



## Menta

Well, they don't seem worried. This is really lame: abusing or altering the real facts of what we paid for!


----------



## Vesku

Quote:


> Originally Posted by *DuckieHo*
> 
> ...and yet, the masses probably still won't care.
> 
> It's not trickery but a design limitation. How do you explain memory allocation and tiering to the masses?


They were upfront about it when they mixed memory before: see the 550 Ti. Not disclosing it with the 970 was a pure marketing decision that reduced how informed buyers COULD be. Turn the thinking around: if most of the buying masses won't notice or understand anyway, why not continue to disclose such design choices?


----------



## Cybertox

Quote:


> Originally Posted by *DuckieHo*
> 
> ...and yet, the masses probably still won't care.
> 
> It's not trickery but a design limitation. How do you explain memory allocation and tiering to the masses?


There is nothing to explain to them. Pretty sure the people who read the news and the rumours will deduce that the 970 only has 3.5GB instead of 4GB, and will come away with an even stronger impression of being cheated and of the advertised VRAM being a false claim. I doubt they will go as far as understanding that the 970's memory is segmented and still totals 4GB.


----------



## Vesku

Quote:


> Originally Posted by *Leopard2lx*
> 
> Wow! Only 20% difference between single 980 and 295x2 dual-gpu.
> On the other hand 970 is kicking butt so I don't see why people are complaining. And if you wanna play at 4k you should have gotten a 980 or two. I don't have any stutters at 4k even when VRAM is maxed out.


That's without examining frame times with a tool like FCAT. Nvidia is the one who pointed out to reviewers that the gaming experience is about more than FPS.


----------



## SandGlass

This issue basically kills any potential for using the 970 for scientific computing, especially workloads that use multi-GB (or even multi-TB) data sets divided into multi-GB segments. Ironic, considering CUDA has been failing us on Ubuntu 14.04 with "cudaErrorMemoryAllocation" errors and occasional two-orders-of-magnitude spikes in memory access latency, and OpenCL is around 20% slower for what we do. More investigation needs to be done on Nvidia cards to see when exactly this happens, and whether AMD has similar problems.


----------



## Vesku

Quote:


> Originally Posted by *SandGlass*
> 
> This issue basically kills any potential for using the 970 for anything scientific computing, especially those that use multi GB or even multi TB data sets divided into multi GB segments. Ironic considering CUDA has been failing us for Ubuntu 14.04 with "cudaErrorMemoryAllocation" errors with occasional two magnitude spikes in memory access latency , and OpenCL is around 20% slower for what we do. More investigation needs to be done on Nvidia cards to see when exactly this happens, and if amd has similar problems.


Now that it's a known issue, they can tweak their code to treat the 970 as a 3.5GB card. Before this was exposed, they might have thought something was wrong with their own code or the CUDA libraries.
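A sketch of that kind of workaround (the budget constant and helper name here are illustrative, not any real CUDA or driver API): cap the per-device working set at 3.5GB and stream larger data sets through in budget-sized chunks.

```python
# Illustrative only: treat the "4GB" GTX 970 as a 3.5GB device and split a
# large data set into chunks that each fit a conservative budget.
# BUDGET_BYTES and chunk_offsets are made-up names, not a real API.

BUDGET_BYTES = int(3.5 * 1024**3)  # stay inside the fast 3.5GB segment

def chunk_offsets(total_bytes, budget=BUDGET_BYTES):
    """Yield (offset, length) pairs covering total_bytes in budget-sized pieces."""
    offset = 0
    while offset < total_bytes:
        length = min(budget, total_bytes - offset)
        yield offset, length
        offset += length

# A 9GB data set becomes three transfers: 3.5GB + 3.5GB + 2GB.
chunks = list(chunk_offsets(9 * 1024**3))
print([length / 1024**3 for _, length in chunks])  # [3.5, 3.5, 2.0]
```

Each chunk would then be copied to the device, processed, and copied back before the next one is uploaded.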


----------



## Pnanasnoic

With any luck the 970 prices will drop so I can pick one up!


----------



## nyxagamemnon

Alright everybody, here's your GTX 970 memory data plan: 4GB total, 3.5GB at full speed; if you exceed 3.5GB, you're subject to data throttling to 1/10th speeds.

The last 512MB is effectively so slow that using it will make your game run like crap.

Nvidia knew about this. How could they not? They created the damn thing, and the creator knows all.

Word to the wise: all GPUs should be tested from now on.


----------



## cooperb21

How did reviewers really not notice this when they hyped the card up so much? Talk about BS; they're also probably paid by Nvidia.


----------



## darealist

Lol. I couldn't care less about 500MB less VRAM, but this GTX 970 coil whine is killing my ears. Welp. On to my 5th RMA.


----------



## sugalumps

Quote:


> Originally Posted by *nyxagamemnon*
> 
> Alright eveybody here's your gtx 970 memory data plan. 4gb total 3.5gb full speed if you exceed 3.5gb your subject to data throttling to 1/10th speeds.
> 
> The last 512mb is effectively so slow that using it will make your game run like crap.
> 
> Nvidia knew about this how can you not, they created the dam thing and the creator knows all.
> 
> Word to the wise all gpu's should be tested from now on.


All the reviewers and users using it till now - how did none of them pick up on this? Is it because 99.9% of 970 users are still at 1080p, thus never going over 3.5GB?


----------



## cooperb21

Does this mean that 8 GB 970's won't happen?


----------



## jprovido

Quote:


> Originally Posted by *nyxagamemnon*
> 
> Alright eveybody here's your gtx 970 memory data plan. 4gb total 3.5gb full speed if you exceed 3.5gb your subject to data throttling to 1/10th speeds.
> 
> The last 512mb is effectively so slow that using it will make your game run like crap.
> 
> Nvidia knew about this how can you not, they created the dam thing and the creator knows all.
> 
> Word to the wise all gpu's should be tested from now on.


I think it's better to just give us a BIOS that effectively reduces the VRAM to 3.5GB to avoid the performance issues. What they did is a crime, but there's nothing we can do.
Quote:


> Originally Posted by *cooperb21*
> 
> Does this mean that 8 GB 970's won't happen?


7GB gtx 970


----------



## NuclearPeace

I doubt that prices will drop. The 970 already launched $70 cheaper than the 770.

Along with that, you guys are overestimating how much research people are doing when it comes to building computers. The 980 and the 970 have collectively sold more than a million cards already despite the 290x and the 290 being substantially cheaper. A lot of people buy electronics (PC hardware included) based off testimonials from their friends and family. Misinformation from comment fanboys also paints AMD as this dodgy budget brand and NVIDIA as the premium luxury brand.


----------



## cutty1998

Quote:


> Originally Posted by *velocd*
> 
> I feel the same way. I build systems to last 4-5 years, and I would have purchased a GTX 980 had I known the GTX 970 was effectively 3.5GB.
> 
> This news depreciates the value of the GTX 970 for resale.


Maybe not, if you hurry up!


----------



## Yvese

Don't worry Nvidia's superior drivers will get this fixed asap.


----------



## Vesku

Quote:


> Originally Posted by *sugalumps*
> 
> All the reviewers and users using it till now, how did none of them pick up on this? Is it because 99.9% of 970 users are still at 1080p, thus never going over 3gb.


Some users did notice, that's why Nvidia had to "respond". The GPU review sites have some explaining to do though, especially the sudden reduction in the use of the FCAT tool for frame time analysis.


----------



## jprovido

Quote:


> Originally Posted by *Yvese*
> 
> Don't worry Nvidia's superior drivers will get this fixed asap.


----------



## HaGGeN

I can't help but feel a bit dissatisfied as a long-time NVIDIA customer. On top of that, I just purchased a GTX 970 a few days ago. I love how it has performed so far, but I don't like buying a product that doesn't live up to what it says it does.


----------



## nSone

did you check this out?
PSA: Gigabyte has just released new bios update for all gtx 970's
https://forums.geforce.com/default/topic/806255/geforce-900-series/psa-gigabyte-has-just-released-new-bios-update-for-all-gtx-970s/


----------



## cooperb21

Quote:


> Originally Posted by *Yvese*
> 
> Don't worry Nvidia's superior drivers will get this fixed asap.


Actually, apparently a BIOS update might fix it; Gigabyte just put out a new one that might have fixed everything.


----------



## lester007

From what I understand it's the hardware design; I don't think software will fix this thing. But I don't care about 4K gaming yet.


----------



## Noufel

Quote:


> Originally Posted by *jprovido*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nyxagamemnon*
> 
> Alright eveybody here's your gtx 970 memory data plan. 4gb total 3.5gb full speed if you exceed 3.5gb your subject to data throttling to 1/10th speeds.
> 
> The last 512mb is effectively so slow that using it will make your game run like crap.
> 
> Nvidia knew about this how can you not, they created the dam thing and the creator knows all.
> 
> Word to the wise all gpu's should be tested from now on.
> 
> 
> 
> I think it's better to just give us a bios that effectively reduces vram to just 3.5gb to avoid the performance issues. what they did is a crime but there's nothing we can do.
> Quote:
> 
> 
> 
> Originally Posted by *cooperb21*
> 
> Does this mean that 8 GB 970's won't happen?
> 
> Click to expand...
> 
> 7GB gtx 970
Click to expand...

man u killing me







And don't forget the 960 with a 112-bit bus and 1.75GB.
Just kidding btw.


----------



## Cybertox

Quote:


> Originally Posted by *jprovido*
> 
> *7GB gtx 970*


----------



## velocd

Quote:


> Originally Posted by *lester007*
> 
> From what I understand its hardware design i don't think software will fix this thing, but i dont care about 4k gamin yet


You can exceed 3.5GB at 1080p and 1440p.


Ultra resolution texture packs. e.g. Shadow of Mordor.
Extreme AA (depending on the game)
Nvidia DSR
To name a few.


----------



## jprovido

Quote:


> Originally Posted by *velocd*
> 
> You can exceed 3.5GB at 1080p and 1440p.
> 
> 
> Ultra resolution texture packs. e.g. Shadow of Mordor.
> Extreme AA.
> Nvidia DSR
> To name a few.


at 2560x1600 I think I'm already experiencing the problem. I just shrugged it off and thought it was something else. I will look into this more. nvidia is ass


----------



## ZealotKi11er

Considering Nvidia cards have always had less vRAM, I think most Nvidia users are used to it.


----------



## Xoriam

One thing I'm worried about, my main game is Final Fantasy XIV.
I'm currently playing at 3840x2160
The current client is DX9, and memory usage already hits 2.5GB sometimes.
They've announced that the DX11 client will use uncompressed textures, so I'm worried I might end up in that 3.5GB+ range with FPS starting to crap out.

I know that on the few occasions I hit 4096MB in ACU at 4K, the fps started going chug chug chug chug.


----------



## jprovido

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Considering Nvidia has always had lower vRAM i think most Nvidia users are used to having less vRAM.


tbh when I bought my gtx 970's deep inside I was like yea 4GB VRAM take that AMD cards. again, nvidia is ass

time to change the vram count on my sig rig.


----------



## Cybertox

Quote:


> Originally Posted by *jprovido*
> 
> tbh when I bought my gtx 970's deep inside I was like yea 4GB VRAM take that AMD cards. again, nvidia is ass
> 
> time to change the vram count on my sig rig.


Take that AMD? AMD GPUs had 4 GBs of VRAM before the 900 series was even released. AMD was always ahead in terms of VRAM capacity.


----------



## Kinaesthetic

Can someone with a Gigabyte GTX 970 who has run the test once and had this VRAM issue install that vBIOS and re-run the test?


----------



## jprovido

Quote:


> Originally Posted by *Cybertox*
> 
> Take that AMD? AMD GPUs had 4 GBs of VRAM before the 900 series was even released. AMD was always ahead in terms of VRAM capacity.


whoah that's news to me! who knew they had 4gb vram?!?

Of course I knew.







I had a gtx 780 before and I was fed up with nvidia always having less vram, so yeah, I sold my 780s to get 970s because of the extra 512MB vram. I are smart


----------



## Gilles3000

This is pretty unacceptable. If this was by design they should have marketed it as either a 3.5GB VRAM card or 3.5GB VRAM+0.5GB Useless VRAM.

This is basically false advertising.


----------



## doomlord52

So isn't this assuredly false advertising?

Nvidia's own site lists the GTX 970 as having 4GB at 224GB/s. However, it seems that this is not the case. While the card does technically have 4GB, not all of it is accessible at 224GB/s. That would make the spec sheet claim on their site false (a lie), since once all of the available 4GB of VRAM is used, it no longer operates anywhere near the claimed speed.

So.... where is the lawsuit? I could go for an upgrade to a 980 (or a 2nd, free, 970).


----------



## flippin_waffles

Where are all the FCAT latency tests??


----------



## Cybertox

It's not as price competitive with 3.5 GB of VRAM, is it now?


----------



## Xoriam

Quote:


> Originally Posted by *doomlord52*
> 
> So isn't this assuredly false advertising?
> 
> Nvidia's own site lists the GTX 970 as having 4GB at 224gb/s. However, it seems that this is not the case. While the card does technically have 4gb, it is not all accessible at 224gb/s. That would make the spec sheet claim on their site false (a lie), since once all of the available 4GB of Vram is used, it no longer operates even close to the claimed speed.
> 
> So.... where is the lawsuit? I could go for an upgrade to a 980 (or a 2nd, free, 970).


It's not false advertising, because the cards "technically" have 4gb of ram :/


----------



## MxPhenom 216

Quote:


> Originally Posted by *JonnyBigBoss*
> 
> I feel like Nvidia was dishonest by not being clear about this from the get-go.
> 
> I own a GTX 970 and now feel like my PC is less future proof.


Your computer was never future proof...


----------



## Cybertox

Quote:


> Originally Posted by *Xoriam*
> 
> it's not false advertising because the cards "techincally" have 4gb of ram :/


However, it is segmented, whereas all the other 4 GB cards have no segmentation of their VRAM capacity.


----------



## Baghi

Quote:


> Originally Posted by *jprovido*
> 
> time to change the vram count on my sig rig.


Ah, that was quick!


----------



## ZealotKi11er

Everyone thought the GTX 970 was an amazing card, but what about the GTX 980? Was the GTX 980 impressive at $550, then or now?


----------



## Xoriam

Quote:


> Originally Posted by *Cybertox*
> 
> However it is segmented where as all the other 4 GBs cards do not have any segmentation of VRAM capacity.


yeah which is total BS.
I bet not one person here knew about this before the purchase.


----------



## jprovido

Quote:


> Originally Posted by *Baghi*
> 
> Ah, that was quick!


I'm very unlucky with graphics cards this gen. My gtx 970s have fake 4GB vram; my newly bought r9 280x flickers at 2D clocks. nvidia and amd = ass


----------



## hurleyef

Quote:


> Originally Posted by *Kinaesthetic*
> 
> Can someone with a Gigabyte GTX 970 who has ran the test once and had this Vram issue problem install that vBIOS and re-run the test?


I too would like to see the results of this.


----------



## Baghi

Quote:


> Originally Posted by *jprovido*
> 
> I'm very unlucky with graphics cards this gen. my gtx 970's has fake 4gb vram. my newly bought r9 280x flickers at 2d clocks. nvidia and amd = ass


Disable ULPS for AMD cards.


----------



## Xoriam

I have the Gigabyte GTX 970. I've run the tests on both BIOSes; these were my results. As stated before, it starts to have issues at 2.5GB and crashes on the last few passes.

I don't see this result in games though.


----------



## HMBR

Quote:


> Originally Posted by *jprovido*
> 
> 7GB gtx 970


Don't forget the bonus ultra-slow 1GB for a total of 8GB!

That's some really weird stuff, but it was clear the whole 64 ROPs / 256-bit memory setup looked weird considering the difference from the 980 in some synthetic tests; the disabled SMX had a deeper impact than just TMUs/ALUs.

It's probably no big deal in reality, but perhaps the slow 512MB should be disabled?


----------



## Vesku

Quote:


> Originally Posted by *Xoriam*
> 
> it's not false advertising because the cards "techincally" have 4gb of ram :/


The memory bandwidth is not as advertised, though.


----------



## Vesku

Quote:


> Originally Posted by *jprovido*
> 
> I'm very unlucky with graphics cards this gen. my gtx 970's has fake 4gb vram. my newly bought r9 280x flickers at 2d clocks. nvidia and amd = ass


That 2D flickering is most likely a BIOS with clocks set too conservatively for that GPU state. It's potentially fixable with a BIOS update; not so for the GTX 970 memory config, unfortunately.


----------



## hurleyef

Quote:


> Originally Posted by *Xoriam*
> 
> I have the gigabyte gtx 970, I've run the tests on both bios this was my results as stated before, starts to have issues at 2,5gb and crashes on the last few passes.
> 
> I don't see this result in games though.
> 
> *snip*


Are you running this headless?


----------



## mtcn77

Quote:


> Originally Posted by *jprovido*
> 
> I'm very unlucky with graphics cards this gen. my gtx 970's has fake 4gb vram. my newly bought r9 _280x flickers at 2d clocks_. nvidia and amd = ass


There's a cure for that. Have you tried this?


----------



## Xoriam

Quote:


> Originally Posted by *hurleyef*
> 
> Are you running this headless?


sorry what?


----------



## hurleyef

Quote:


> Originally Posted by *Xoriam*
> 
> sorry what?


When you run the test you should not be using the card that you are running it on as your display adapter; ie use your igp while running it.


----------



## Xoriam

Quote:


> Originally Posted by *hurleyef*
> 
> When you run the test you should not be using the card that you are running it on as your display adapter; ie use your igp while running it.


brb with update


----------



## Xoriam

Same results.

The only difference is the slowdown starts at 3GB instead of 2.5GB, and the last pass crashes the driver.


----------



## hurleyef

Quote:


> Originally Posted by *Xoriam*
> 
> Same results


Interesting. Does anyone have a gigabyte 970 that can test with the new bios?


----------



## Xoriam

Quote:


> Originally Posted by *hurleyef*
> 
> Interesting. Does anyone have a gigabyte 970 that can test with the new bios?


Same results on new and old. new and old drivers as well.
2 different cards.


----------



## tpi2007

As I've read over at PCPer, this is because they chose to do it like this. Having fewer cores than the full chip doesn't necessarily imply inferior access to the VRAM. Does an Intel hexacore only have access to part of your RAM at the rated bandwidth, so that to get the whole thing you need to buy the octacore? No, of course not. So this isn't a clear-cut case where Nvidia's design choices can be excused.

In any case, it's interesting to see how they apparently addressed the issue, yet completely avoided commenting on the memory benchmark results. What is the actual effective bandwidth when accessing that 0.5GB section? Is it 224 GB/s?

If it isn't then we have a problem, because then that would be false advertising:

http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970/specifications



If the bandwidth is inferior then they shouldn't be allowed to make a global claim; it should read 224 GB/s for the first 3.5 GB, and whatever it actually is (10x less, it seems) for the remaining 0.5GB.

I mean, sure, the VRAM bandwidth is technically correct, but if the GPU can't make use of that bandwidth for that section, it's deceiving. It reminds me of a case a few years ago where an OEM builder was selling desktop PCs advertised as having DDR2 667 MHz memory, while forgetting to mention that the entry-level motherboard only supported DDR2 533 MHz, and that was the speed it ran at.

Also, framerates don't tell the whole story, let's see if we get some FCAT results along with the fps numbers.
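For what it's worth, the per-segment figures circulating in early third-party tests are roughly 196 GB/s for the 3.5GB section (7 of 8 memory controllers) and under 30 GB/s for the 0.5GB section; those numbers are assumptions here, not published Nvidia specs. A quick capacity-weighted average shows how far a fully used card would fall from the single 224 GB/s claim:

```python
# Back-of-envelope check using segment bandwidths reported by early
# third-party benchmarks (assumptions, not Nvidia's published specs).

FAST_GB, FAST_BW = 3.5, 196.0   # segment size (GB), bandwidth (GB/s)
SLOW_GB, SLOW_BW = 0.5, 28.0

def capacity_weighted_bandwidth(segments):
    """Average bandwidth weighted by how much data lives in each segment."""
    total = sum(size for size, _ in segments)
    return sum(size * bw for size, bw in segments) / total

avg = capacity_weighted_bandwidth([(FAST_GB, FAST_BW), (SLOW_GB, SLOW_BW)])
print(f"{avg:.0f} GB/s")  # prints 175 GB/s - well short of the advertised 224
```

Even this is optimistic, since it assumes the slow segment is only touched in proportion to its size.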


----------



## Creator

Maybe they can issue a driver update to have Windows tasks like Aero occupy the 500MB segment? That way games could utilize the full 3.5GB, and the difference would end up being much smaller, considering the VRAM usage of Windows tasks.


----------



## John Shepard

So what happens now?

I mean, this is a hardware issue; it cannot be fixed through software.

Are we stuck with a 3.5GB card?


----------



## Vesku

Quote:


> Originally Posted by *Creator*
> 
> Maybe they can driver update to have Windows tasks like Aero occupy the 500mb segement? This way games can utilize 3.5gb and the difference would then end up being much less considering the VRAM usage of Windows tasks.


Would not surprise me if they already do this. But I don't think that handles true Fullscreen game modes as opposed to Windowed Fullscreen.


----------



## Xoriam

Quote:


> Originally Posted by *Vesku*
> 
> Would not surprise me if they already do this. But I don't think that handles true Fullscreen game modes as opposed to Windowed Fullscreen.


Yeah, I'd think it would already be optimized to push things like that into that sector first, if they're not already being loaded there in the first place. At 4K my idle consumption is roughly 500MB.


----------



## Master__Shake

So it's a non-issue?


----------



## iSlayer

^ if Nvidia is correct...yes. We need testing to verify that though.
Quote:


> Originally Posted by *hurleyef*
> 
> Seems pretty negligible to me, but I'd still like to see more in depth testing from someone that wasn't nVidia just to be sure.


Indeed, we need third party results.
Quote:


> Originally Posted by *benbenkr*
> 
> It's not false advertising. The fact remains, the 970 has 4GB of VRAM on board. That's it, don't argue about this further.
> 
> The issue is how the 4GB of VRAM is being used, *NOT* that the 970 having a missing 512mb of VRAM.
> It's called first world problems.
> 
> People are angry that the 4GB of VRAM doesn't work as it should and that Nvidia kept quiet about it. A few days ago a rep was saying they are "looking" into the issue, but really it's more like how they should word out their PR statement instead of actually doing anything specifically to the 970.


"There are bigger problems in life so yours don't matter!" I do believe that's a logical fallacy.
Quote:


> Originally Posted by *Noufel*
> 
> i think people who bought the 970 with 4g and no 3.5g (technicaly only 3.5 are fully used by the gpu) for 350 bucks piece care.


Ooh ooh, me, me! I do, I do!


----------



## Noufel

Quote:


> Originally Posted by *Master__Shake*
> 
> so it's a non issue?


it's Nvidia so there is no problem


----------



## mcg75

Quote:


> Originally Posted by *Master__Shake*
> 
> so it's a non issue?


I think when some sites start doing frame time analysis of when that last 500MB actually comes into use, it's going to show the issue more prominently than an average fps number does.
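The point about averages hiding stutter is easy to demonstrate. The sketch below uses made-up frame timestamps (a steady 60 FPS run with one injected ~100 ms hitch, as you might see when the slow segment gets touched); `percentile` is a simple nearest-rank implementation:

```python
import math

def frame_times_ms(timestamps_s):
    """Per-frame durations in milliseconds from a list of frame timestamps."""
    return [(b - a) * 1000.0 for a, b in zip(timestamps_s, timestamps_s[1:])]

def percentile(values, pct):
    """Nearest-rank percentile: the value below which pct% of samples fall."""
    ordered = sorted(values)
    k = max(0, math.ceil(pct / 100.0 * len(ordered)) - 1)
    return ordered[k]

# 100 timestamps at a steady 60 FPS, then add 100 ms to one frame mid-run.
times = [i / 60.0 for i in range(100)]
times[50:] = [t + 0.1 for t in times[50:]]

ft = frame_times_ms(times)
avg_fps = 1000.0 / (sum(ft) / len(ft))
print(f"avg {avg_fps:.0f} FPS, 99th percentile {percentile(ft, 99):.0f} ms")
# prints: avg 57 FPS, 99th percentile 117 ms
```

The average still looks like a smooth ~57 FPS, while the percentile exposes the 100ms+ hitch; that is exactly what FCAT-style frame time analysis would surface here.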


----------



## Master__Shake

Quote:


> Originally Posted by *mcg75*
> 
> I think when some sites start doing frame time analysis of when that last 500 mb actually comes into use, it's going to show the issue is more prominent than just giving an average fps number does.


But isn't FCAT Nvidia's tech?

Can they refuse to let people publish results using it?


----------



## Vesku

Quote:


> Originally Posted by *Master__Shake*
> 
> but isn't fcat nvidias tech?
> 
> can they not allow people to publish results using it?


Those people should tell the public if Nvidia tries to keep them from investigating. Nvidia marketing has always seemed competent to me; I'd like to think they'd realize the potential Streisand Effect if they embargoed FCAT investigations of this.


----------



## MonarchX

OK, I just read the Guru3D nVidia response, which pretty much states that nVidia did GTX 970 users over with the VRAM segmentation, which obviously does not work as nVidia intended. On top of that, they never warned anyone about it, but seeing how quickly they figured it out, someone at nVidia undeniably knew about this ...crap... (I mean, how else would you call it?). As I've posted in the other thread, nVidia is messing up more and more, sliding down to a low level of product and support quality. It no longer releases drivers with performance updates of any kind, with exceptions for a game or two. In fact, there are still no truly optimized Maxwell drivers that utilize the new architecture; all they are giving us are generic, raw, hardware- and clock-driven drivers. The way things are going, I may end up with AMD for the next round, especially if nVidia's Maxwell Titan X price is $1350 like they say!

It has always been like this: nVidia becomes a cocky, dominant card maker and then starts slipping after several years of excellent GPU sales dominance, thinking they are unbeatable in top performance (per single GPU, at least). In the meantime, AMD gets sick of its bad decisions and losses, so they pour all their strength into developing and releasing a super-card that undeniably beats the competition, or at the very least stays on par with it for a much lower price. I may be wrong, but I believe stacked memory is a year away on nVidia's schedule (or not?), while AMD is already pushing it. And that whole $1350 figure is just insane, though it's only rumors... You can buy a rather good gaming PC for that much! AFAIK, there was a similar situation when the Radeon 9700/9800 Pro cards were released and obliterated everything nVidia-made, because nVidia made a horrible call with the card (the FX series) they decided to release. Today's situation is different, but there are enough factors for nVidia to have a bad year and for AMD to rise somewhat.

*FYI: I am actually a big fan, supporter, and defender of nVidia's products and drivers*, when they are actually good. Don't believe me? Check out my previous posts, although you may need to dig quite a bit before you find the ones where I made pro-nVidia and anti-AMD statements. When fans like me begin to question and dislike nVidia's current state of affairs and the future of its GPU products and support, seeing how easily nVidia could slide into a (temporary) abyss, it's a really bad sign, although I am probably giving myself too much credit when I say all this.


----------



## Rookie1337

I'm just curious how one is going to test this...load up a game set so that it uses more than 3.5GB but not more than 4GB? I mean otherwise wouldn't that miss testing just the effect of this partitioning scheme?


----------



## darkreize

Quote:


> Originally Posted by *jprovido*
> 
> would the distributors in my country even accept returns for my 970's. this is getting pretty annoying. damn you nvidia I have words in my mind I can't type here in OCN. i'm not happy at all


I doubt it.







You know how it is here.









And I had plans to get one before the end of the month.


----------



## AgentHydra

Even if it's only 3.5GB, these cards still kick ass; I don't regret getting mine.

Although if Nvidia wanted to give out some games as a consolation I'd be down for that.


----------



## Xoriam

Quote:


> Originally Posted by *AgentHydra*
> 
> Even if its only 3.5GB these cards still kick ass, I don't regret getting mine.
> 
> Although if Nvidia wanted to give out some games as a consolation I'd be down for that.


Or a discount on whatever is next


----------



## iSlayer

Any updates on the situation?

@2010rig insert your avatar here.
Quote:


> Originally Posted by *MonarchX*
> 
> *snip*


----------



## maarten12100

Quote:


> Originally Posted by *benbenkr*
> 
> Not sure if there's much chance to win if there's a lawsuit. End of the day, the 970 does have 4GB of VRAM onboard and even though 0.5GB is technically crippled, Nvidia never lied about the 970 having less than 4GB of VRAM. The 0.5GB of VRAM is also usable, albeit being slow.


They list the bandwidth as 224GB/s for the full 4GB, not 3.5GB at 224GB/s and 0.5GB at ~20GB/s, so someone could take it to court. But I don't think anyone will.


----------



## Seven7h

It's not a 3.5GB GPU. If it were, your performance would be worse, because for any single command buffer that references more than 3.5GB, you'd be texturing out of system memory, which is way slower than the last 512MB.

The driver has smarts to avoid using the last 512MB unless it has to, and it is intelligent about putting the least important resources in there.

So anyone seeing just 3.5GB available, that's just a reporting issue. In truth, if a single command buffer of a normal game workload (mixed resources) references more than 3.5GB of resources, then there is no choice but to make some of that data resident in the last 512MB. If a game creates more than 3.5GB of resources over time but doesn't reference them all in a single command buffer, the OS would rather evict something to system memory to make room for newly created resources in the first 3.5GB than place a potentially important resource into the last 512MB.

In short, when necessary, the memory will be used, and it's much better than coming from system memory.

And in any real world game, when you have enough resources referenced in a single command buffer to start to want to use the last 512MB, there is always some subset of those resources that you can put there that will ensure a fairly insignificant overall performance impact by having them there.
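A toy model of the placement policy described above (purely illustrative; this is not Nvidia's actual driver logic, and the resource names, sizes, and priorities are invented): fill the fast segment with the most important resources first, and spill only the least important ones into the slow 0.5GB segment.

```python
# Toy model of priority-based placement - an illustration, not Nvidia's
# actual driver logic. The fast segment is filled highest-priority-first;
# overflow goes to the slow segment instead of system memory.

FAST_CAP_MB = 3584   # 3.5GB segment
SLOW_CAP_MB = 512    # 0.5GB segment

def place(resources):
    """resources: (name, size_mb, priority) tuples; higher priority stays fast."""
    fast, slow = [], []
    fast_used = slow_used = 0
    for name, size, prio in sorted(resources, key=lambda r: -r[2]):
        if fast_used + size <= FAST_CAP_MB:
            fast.append(name)
            fast_used += size
        elif slow_used + size <= SLOW_CAP_MB:
            slow.append(name)
            slow_used += size
        else:
            raise MemoryError(f"{name} would have to spill to system memory")
    return fast, slow

fast, slow = place([
    ("render_targets", 1536, 3),  # invented sizes/priorities for illustration
    ("hot_textures",   2048, 2),
    ("shadow_maps",     448, 1),  # working set > 3.5GB: lowest priority spills
])
print(slow)  # ['shadow_maps']
```

Only when the working set exceeds the fast segment does anything land in the slow 512MB, and it's the least important resource that goes there, which is the claimed reason the overall impact stays small.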


----------



## maarten12100

Quote:


> Originally Posted by *Seven7h*
> 
> It's not a 3.5GB part. If it were, your perf would be worse because for any single command buffers that reference an amount over 3.5GB, *you'd be texturing out of system memory, which is way slower than the last 512MB.*


Not true; 20GB/s is about the same speed as dual-channel DDR3, so the bandwidth would actually be about equal.


----------



## solid9

I'm pretty disappointed by this, because NVIDIA used to tell customers when a card had a slower section of memory (the 660, for example), but with the 970 they didn't - and that's because buyers like me would have bought a 290X instead (I opened a thread on the 4GB 970 vs the 8GB 290X for slightly more; guess what I would have bought if it was 3.5GB vs 8GB?).
They sold me something that doesn't do what they advertised, so I expect them to give me what I paid for, or else give me a refund.
And for those who don't think this is a big issue: I'm neutral between AMD and Nvidia, so I can tell for sure that if this thread were about the 290 having less memory, lots of you would be here bashing AMD instead of saying this isn't really an issue, when some games use more than 3.5GB even at 1080p.


----------



## Triniboi82

Quote:


> Originally Posted by *AgentHydra*
> 
> *Even if its only 3.5GB these cards still kick ass,* I don't regret getting mine.
> 
> Although if *Nvidia wanted to give out some games as a consolation I'd be down for that*.


X2. I am, however, a bit disappointed reading about this, as I was planning to upgrade to 4K and thought I had relatively decent GPU power/VRAM. Still, 1440p is good enough for me atm, and I'll stick with my cards till the next gen comes out, but I will be more cautious when shopping for Nvidia's lower-end models. As it's a hardware issue with no possible fix, I hope they do right by us and offer some sort of incentive. IMO this is false advertising by Nvidia.


----------



## lester007

I did some testing myself: I used as much VRAM as I could, and then I saw spikes everywhere in FPS, memory used, controller load, and bus interface load.


----------



## Mad Pistol

The GTX 970 is still a stupidly fast and efficient card, but I am a little disappointed now. Honestly, I can't even max out the 3GB of my GTX 780, let alone the 4GB of my wife's GTX 970. Still, when a card is advertised as a 4GB card, most people expect the card to perform perfectly all the way up to the 4GB cap on frame buffer. In that sense, I am disappointed.


----------



## Seven7h

Quote:


> Originally Posted by *maarten12100*
> 
> Not true 20GB/s is about the same speed as seen on dual channel DDR3 setups so the bandwidth would be equal actually.


No... It's more likely that the test you're referencing is basic enough in how it accesses memory that it's not actually measuring the performance of the last 512MB; it is likely measuring system memory accesses. In a non-synthetic allocation test with real games, real resource types, and real access patterns, you'll get the last 512MB instead of system memory.

Not to mention that no matter how much memory bandwidth you have on the CPU side in total, it is shared with the CPU's own processing needs. The 512MB is dedicated.

Also local video memory has much better latency than system memory, which affects performance. There are reasons why GPUs use GDDR and don't use standard DDR modules for vidmem.


----------



## Clockster

Gotta love big companies' PR bull... laughing so hard.
I also love how some people defend this rubbish, lol... epic!


----------



## solid9

Yes, they did false advertising and they should pay for it. Also, isn't this also true for the 780? Does this mean a 780 has 2.7GB of VRAM and the Titan 5.5GB?


----------



## Gilles3000

Quote:


> Originally Posted by *michaelius*
> 
> Due to inefficient drivers AMD cards run into cpu-bottlenecks faster. Also since it's average among many titles probably crossfire was not working in some of them.


Do you have a source for either of those claims?

And what do CPU bottlenecks have to do with the GPU drivers?


----------



## Xoriam

Quote:


> Originally Posted by *Gilles3000*
> 
> Do you have a source for either of those claims?
> 
> And what do CPU bottlenecks have to do with the GPU drivers?


Yeah, I'm curious too, because I've always run into GPU bottlenecks before CPU bottlenecks on AMD GPUs.


----------



## ebduncan

When I read threads like this, all I keep thinking is:

fuel for AMD fanboys to throw on their red fire and further announce why AMD rules.

I didn't expect graphics card makers to follow the same path as hard drive makers, i.e. advertise a 240GB SSD yet only have 238GB of free space available, etc. Guess they got caught red-handed; now I guess the question is how far they're going to back-pedal before admitting an error.

Come on, a 3.5GB sector and a 0.5GB sector? This isn't a hard drive.


----------



## Xoriam

Quote:


> Originally Posted by *ebduncan*
> 
> when i read threads like this all I keep thinking is
> 
> Fuel for AMD Fanboys to throw on their red fire and further announce why AMD rules.
> 
> I didn't expect graphics card markets to follow in the same path as hard drive makers. IE advertise a 240gb SSD yet only have 238gb free space available etc... Guess they got caught red handed, now i guess the question is how far are they going to back pedal before admitting a error.
> 
> Common 3.5gb sector and a 0.5 gb sector? this isn't a hard drive.


Or you could attribute it to Nvidia being able to say "nananana, still outperforming AMD with 3.5GB."


----------



## mtcn77

Quote:


> Originally Posted by *solid9*
> 
> Yes , they did false advertising and they should pay for it , also isn't this also true for the 780? Does this mean that a 780 has 2.7 gb of vram and Titan 5.5?


They misrepresented the memory bus width in marketing presentations, as far as I can tell.


----------



## Menta

Was wondering when someone would play the HDD card... sad, really.


----------



## Cybertox

Quote:


> Originally Posted by *michaelius*
> 
> Due to inefficient drivers AMD cards run into cpu-bottlenecks faster. Also since it's average among many titles probably crossfire was not working in some of them.


Inefficient GPU drivers causing CPU bottlenecks faster? What are you even talking about?


----------



## mtcn77

Quote:


> Originally Posted by *Cybertox*
> 
> Inefficient GPU drivers causing CPU bottlenecks faster? What are you even talking about?


He is referring to how AMD doesn't support DX11 draw-call multithreading.


----------



## Forceman

Quote:


> Originally Posted by *mtcn77*
> 
> They misrepresented the memory bus width on marketting presentations as far as I can tell.


I don't know that it's possible to have different bus widths for different parts of the VRAM - the chips are all connected via a 256-bit bus. This sounds like more of an internal GPU die limitation, not a physical bus width issue.


----------



## mtcn77

Quote:


> Originally Posted by *Forceman*
> 
> I don't know that it's possible to have different bus widths for different parts of the VRAM - the chips are all connected via a 256-bit bus. This sounds like more of an internal GPU die limitation, not a physical bus width issue.


I don't know where the crossbar switch is placed, but I know GDDR5 requires a point-to-point transmission interface, so admitting one implies the other, imo.


----------



## Cybertox

Quote:


> Originally Posted by *mtcn77*
> 
> He is referring how AMD doesn't support DX11 drawcall multithreading.


Yet supports Mantle which is the least CPU bottlenecking API.


----------



## Seven7h

Quote:


> Originally Posted by *Cybertox*
> 
> Inefficient GPU drivers causing CPU bottlenecks faster? What are you even talking about?


The GPU driver does work that is required to set graphics runtime data up for the GPU. This work happens on the CPU.

Recently NVIDIA has made big improvements in the CPU overhead of such work for DX11. Meanwhile AMD has not really done as much for DX11 driver overhead, and they have instead focused far more on Mantle, which eliminates the need for some of this driver work altogether, while also pushing much of it to being the developer's responsibility (which some developers prefer due to control and visibility).


----------



## Forceman

Quote:


> Originally Posted by *mtcn77*
> 
> I don't know where the crossbar switch is placed, but I know GDDR5 requires point to point transmission interface, so admitting one is implying the other, imo.


I'm not sure what the industry standard is for defining the interface. If it has a 256-bit physical connection, which I'm sure it does, that would seem to qualify.

Same with the memory bandwidth number, is that normally reported as an average, or a max, or what? Not saying it's right, but I doubt if there are any grounds for a lawsuit or recall. I don't remember anything like that with the 660 that had the same issue.


----------



## Horsemama1956

Quote:


> Originally Posted by *Xoriam*
> 
> or you could contribute it to nvidia being able to say nananana still outperforming amd with 3.5gb


What would nVidia have to do for you not to defend this? Kind of messed up that you are. Just a huge nVidia fan or what?


----------



## Gilles3000

Quote:


> Originally Posted by *mtcn77*
> 
> He is referring how AMD doesn't support DX11 drawcall multithreading.


And does this cause any relevant CPU bottlenecks? Because this is the first time I'm hearing about this.
Quote:


> Originally Posted by *Xoriam*
> 
> or you could contribute it to nvidia being able to say nananana still outperforming amd with 3.5gb


All I know is that this will stop me from buying a 970 if they don't fix this problem.

I don't get why people keep fanboying for either company, just get the gpu that performs the best in your price range or has the features you need, sheez.


----------



## NuclearPeace

Doesn't Mantle have memory leaks? I heard that a lot of people were having insane VRAM usage while playing BF4, to the point where DX11 was faster.

That, along with the fact that an i3-4330 won't bottleneck a 260X at all (so there wouldn't be much gained from Mantle), is why I passed up the 260X and grabbed a $130 750 Ti FTW.

Anyway, the 900 series started off looking so good. The 980 was a decent flagship, and the 970 forced AMD into huge price drops. Then the 960 released and turned out to be only a little faster than the 760, and now you have this discovery.


----------



## mtcn77

Quote:


> Originally Posted by *Forceman*
> 
> I'm not sure what the industry standard is for defining the interface. If it has a 256-bit physical connection, which I'm sure it does, that would seem to qualify.
> 
> Same with the memory bandwidth number, is that normally reported as an average, or a max, or what? Not saying it's right, but I doubt if there are any grounds for a lawsuit or recall. I don't remember anything like that with the 660 that had the same issue.


That is right, because the 660 Ti wasn't misrepresented as a 256-bit card, for that matter. Nobody can prove anything about the 4GB, but the bus is crippled.


----------



## Xoriam

Quote:


> Originally Posted by *Horsemama1956*
> 
> What would nVidia have to do for you not to defend this? Kind of messed up that you are. Just a huge nVidia fan or what?


You're kidding, right? That was just an example. My card is crashing in the benchmark in the final passes....

I've been using AMD since the 9800 GTX+.


----------



## PontiacGTX

Quote:


> Originally Posted by *Forceman*
> 
> I'm not sure what the industry standard is for defining the interface. If it has a 256-bit physical connection, which I'm sure it does, that would seem to qualify.
> 
> Same with the memory bandwidth number, is that normally reported as an average, or a max, or what? Not saying it's right, but I doubt if there are any grounds for a lawsuit or recall. I don't remember anything like that with the 660 that had the same issue.


8x 512MB dies
4x 64-bit buses
16 ROPs per memory bus, which leaves 12 ROPs without function on the last 64-bit bus. That implies 1GB per 64 bits; 3/4 of the 16 ROPs don't work, and 3/4 of 1024MB is 768MB, which leaves 3328MB active with 52 ROPs.

208-bit: 52 ROPs, 3328MB, 182GB/s
192-bit: 48 ROPs, 3072MB, 168GB/s
64-bit: 16 ROPs, 1024MB, 56GB/s
48-bit: 12 ROPs, 768MB, 42GB/s
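For reference, the GB/s figures in the list above fall out of a simple formula: bandwidth = (bus width in bits / 8) x effective data rate. A quick sketch, assuming the GTX 970's stock 7.0 Gbps effective GDDR5 data rate (factory-overclocked cards differ):

```python
# Peak theoretical bandwidth from bus width and effective GDDR5 data rate.
# 7.0 Gbps is the stock GTX 970 memory spec (an assumption here).
GBPS = 7.0  # effective per-pin data rate, Gbit/s

def bandwidth_gbs(bus_bits):
    """Peak theoretical bandwidth in GB/s for a given bus width."""
    return bus_bits / 8 * GBPS

for bits in (256, 208, 192, 64, 48):
    # 256-bit gives the advertised 224 GB/s; 208-bit gives the 182 GB/s above.
    print(f"{bits:>3}-bit: {bandwidth_gbs(bits):.0f} GB/s")
```

The 208/192/64/48-bit results reproduce the 182/168/56/42 GB/s figures listed above exactly.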


----------



## michaelius

Quote:


> Originally Posted by *Cybertox*
> 
> Inefficient GPU drivers causing CPU bottlenecks faster? What are you even talking about?


http://www.neogaf.com/forum/showpost.php?p=138983332&postcount=39
Quote:


> Originally Posted by *Gilles3000*
> 
> Do you have a source for either of those claims?
> 
> And what do CPU bottlenecks have to do with the GPU drivers?



Not a perfect example, since it's a 760 vs a 280X due to pricing at the time the test was made, but still:




You can see the AMD card gaining a lot more from the overclocked Haswell, which means it was held back by the CPU performance of the 3.9GHz chip, while the 760 gets only a minimal bump, which means it's fully restricted by GPU performance.

This will affect average scores in benchmarks.


----------



## Seven7h

Quote:


> Originally Posted by *NuclearPeace*
> 
> Doesn't Mantle have memory leaks? I heard that a lot of people were having insane VRAM usage while playing BF4 to the point were DX11 was faster.
> 
> That along with the fact that the i3 4330 wont bottleneck a 260x at all (so there won't be much gained from Mantle) is why I passed up the 260x and grabbed a $130 750 Ti FTW.
> 
> Anyway, the 900 series started off looking so well. The 980 was a decent flagship and the 970 made AMD make huge price drops. Then, the 960 released and it turned out it was only a little bit faster than the 760 and then you have this discovery.


I doubt Mantle itself has memory leaks. It's more likely that the developer's Mantle implementation was somewhat buggy, because it is a less well-established development path with fewer tools, and it puts a ton of responsibility on the developer, while the platform represents only a fraction of the revenue base... i.e. it's lower priority than what makes them most of their money, so the implementation won't be perfect.

This is true for any niche PC gaming tech.


----------



## Forceman

Quote:


> Originally Posted by *PontiacGTX*
> 
> 512MB Dies 8x
> 4x 64Bit bus each
> 16 ROP per memory bus. which leaves 12 rop without function on the last 64 bit which implies 1 GB per 64 bits. 3/4 of the 16 rop doesnt work 3/4 of the 1024MB are 768MB, leaves 3328 active with 52 rops
> 
> 208 Bit- with 52 ROP with 3328MB
> 192 Bit- with 48 ROP with 3072MB


But the 970 still has 64 ROPs, same as the 980, so the interface between the die and the memory chips should be the same. The difference is internal to the die and how those ROPs are connected to the SMMs (or whatever internally). It may effectively have fewer ROPs, but for the purposes of marketing materials (all I'm saying), it's still got a 256-bit bus.


----------



## PontiacGTX

Quote:


> Originally Posted by *Forceman*
> 
> But the 970 still has 64 ROPs, same as the 980, so the interface between the die and the memory chips should be the same. The difference is internal to the die and how those ROPs are connected to the SMMs (or whatever internally). It may effectively have fewer ROPs, but for the purposes of marketing materials (all I'm saying), it's still got a 256-bit bus.


Quote:


> Originally Posted by *wikipedia*
> The ROPs perform the transactions between the relevant buffers in the local memory - this includes writing or reading values, as well as blending them together.


Quote:


> Originally Posted by *anadtech*
> Now the subject of ROPs is always a dicey one because of the nature of pixel operations. Unlike compute hardware, which can be scaled up rather effectively with more complex workloads and better caching methods, the same is not true for ROPs*. ROPs are the ultimate memory bandwidth burner. They are paired with memory controllers specifically because the work they do - the Z testing, the pixel blending, the anti-aliasing - devours immense amounts of bandwidth. As a result, even if you are bottlenecked by ROP performance increasing the ROP count won't necessarily be performance effective if those ROPs are going to be bandwidth starved*.


With these two quotes you could say that the ROPs are one of the final stages of the memory interface's processing, which could mean that only the first group of ROPs is available, and therefore 208-bit / 3328MB is available to be processed; past that, the other 12 ROPs and 768MB are a kind of bottleneck.
Quote:


> Originally Posted by *michaelius*
> 
> snip.


Offtopic, but:

those charts show outdated AMD drivers; here are the recent ones:










Spoiler: Offtopic



http://www.overclock.net/content/type/61/id/2329648/width/500/height/1000
http://www.overclock.net/content/type/61/id/2329649/width/500/height/1000
http://www.overclock.net/content/type/61/id/2329651/width/500/height/1000


----------



## Art Vanelay

Quote:


> Originally Posted by *Cybertox*
> There are way better options which allocate VRAM capacity properly, there are way better cars with the same 150 limit, there are way better mice with 8000 dpi.


Why pay more for features you're not going to use, though? If something's way better just because it performs better in a situation that you, personally, will never encounter, it's not actually better.


----------



## Unknownm

I would be pissed off if my R9 290s in CF came with only 3.5GB at full bandwidth and a crippled 512MB.

I'm always using about 3.5-4GB of VRAM when running 4K with 0x AA, and almost the same amount at 1080p with 8x AA.

Mind you, I do have a 512-bit bus and 4GB at full bandwidth, so I shouldn't complain.

Good luck, Nvidia 970 owners!

Sent from my HTC Incredible S using Tapatalk 2


----------



## sugalumps

Quote:


> Originally Posted by *Art Vanelay*
> 
> Why pay more for features you're not going to use, though? If something's way better just because it performs better in a situation that you, personally, will never encounter, it's not actually better.


This. The people that buy top i7s and 970s to play LoL at 1080p are silly.


----------



## poii

Voila, a small update from Nai talking a bit about how CUDA and his benchmark work:

https://translate.google.com/translate?hl=en&sl=de&tl=en&u=http%3A%2F%2Fwww.computerbase.de%2Fforum%2Fshowthread.php%3Ft%3D1435408%26page%3D7%26p%3D16912375%23post16912375&sandbox=1


----------



## Art Vanelay

Quote:


> Originally Posted by *sugalumps*
> 
> This the people that buy the top i7's and 970's to play LoL at 1080p are silly.


I was mostly talking about people who buy the 970 and then play games that don't use that much VRAM.

People who buy i7s just to play games are silly in general, though.


----------



## BulletSponge

Quote:


> Originally Posted by *NuclearPeace*
> 
> Doesn't Mantle have memory leaks? I heard that a lot of people were having insane VRAM usage while playing BF4 to the point were DX11 was faster.
> 
> That along with the fact that the i3 4330 wont bottleneck a 260x at all (so there won't be much gained from Mantle) is why I passed up the 260x and grabbed a $130 750 Ti FTW.
> 
> Anyway, the 900 series started off looking so well. The 980 was a decent flagship and the 970 made AMD make huge price drops. Then, the 960 released and it turned out it was only a little bit faster than the 760 and then you have this discovery.


Mantle has been unusable for me in DA:I and BF4; it only works fairly well in Civ:BE. I never noticed high VRAM usage with Mantle, but in BF4 and Dragon Age it eats system memory. Once it is using over 9GB of my 16, both games start stuttering like mad, and once it hits 10GB that's it: system crash every time.


----------



## TheReciever

Well now this is a little disappointing. I was planning on getting a couple later this year...

Guess I'll go back to the drawing board. Thanks for the post

Been a while since I played red team


----------



## Ganf

Quote:


> Originally Posted by *Art Vanelay*
> 
> Why pay more for features you're not going to use, though? If something's way better just because it performs better in a situation that you, personally, will never encounter, it's not actually better.


We all encounter this situation when we keep our hardware longer than a year: something is released that we want to play, and we find out that our card can no longer max out the refresh rate of our monitor.

The speed limit isn't constantly being raised from year to year on the interstate, so no, you don't need to future-proof your car to any great degree.


----------



## Menta

Quote:


> Originally Posted by *Art Vanelay*
> 
> I was mostly talking about people who buy the 970 then plays games that don't use that much VRAM.
> 
> People who buy i7s to play games in general are just silly though.


That's not actually true anymore. With the launch of the new consoles and multi-threaded architecture, the i7 is actually starting to make some difference in newer games, and probably more so in the future.


----------



## Redwoodz

LMAO!







FCAT SLi results



Look at all the dropped frames. http://www.guru3d.com/articles-pages/geforce-gtx-970-sli-review,8.html



Look at the frame drops, and the declining FPS as more VRAM is used.
970 owners, you have been had.


----------



## Kuivamaa

Quote:


> Originally Posted by *michaelius*
> 
> http://www.neogaf.com/forum/showpost.php?p=138983332&postcount=39
> .
> 
> Not perfect example since it's 760 vs 280x due to pricin at the time test was made but still:
> 
> 
> 
> 
> you can see AMD card gaining lot more from overclocked Haswell which means it was held back by cpu performance of 3,9Ghz cpu while 760 get's only minimal bump which means it's fully restricted by gpu performance.
> 
> This will affect average scores in benchmarks.


Two issues:
a) The DX11 multithreading capacity of Radeon vs GeForce is irrelevant in this scenario, because TW2 is strictly DX9.
b) PCLab is probably the worst benchmarking site out there. Pretty much every test they do is badly botched.


----------



## Exilon

Quote:


> Originally Posted by *poii*
> 
> voila a small update from Nai talking a bit how CUDA and his benchmark work
> 
> https://translate.google.com/translate?hl=en&sl=de&tl=en&u=http%3A%2F%2Fwww.computerbase.de%2Fforum%2Fshowthread.php%3Ft%3D1435408%26page%3D7%26p%3D16912375%23post16912375&sandbox=1


Translation: "there's a CUDA memory allocation bug that's causing the last 500 MB of virtual address space to be constantly swapped over the PCIe bus instead of being read from the 500 MB partition."

This is probably a driver or firmware issue caused by the magic prioritization that Nvidia is doing. No, your last 500 MB isn't 20 GB/s, that's the PCIe bus bandwidth + L2 caching.


----------



## poii

https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4432672/#4432672

That makes the pcper and techreport statements official I guess


----------



## Exilon

Quote:


> Originally Posted by *Redwoodz*
> 
> LMAO!
> 
> 
> 
> 
> 
> 
> 
> FCAT SLi results
> 
> Look at all the dropped frames. http://www.guru3d.com/articles-pages/geforce-gtx-970-sli-review,8.html
> 
> Look at the frame drops,and the declining FPS as more VRAM is used.
> 970 owners you have been had.


The SLI 980 doesn't have this issue and the FCAT is the same.
http://www.guru3d.com/articles-pages/geforce-gtx-980-sli-review,9.html


----------



## damric

Could this problem occur on other cut-down cards? Could this happen with cut-down Radeons as well? I have Crossfired HD 7850s if someone wants me to test in some way.


----------



## Seven7h

Quote:


> Originally Posted by *Redwoodz*
> 
> LMAO!
> 
> 
> 
> 
> 
> 
> 
> FCAT SLi results
> 
> 
> 
> Look at all the dropped frames. http://www.guru3d.com/articles-pages/geforce-gtx-970-sli-review,8.html
> 
> 
> 
> Look at the frame drops,and the declining FPS as more VRAM is used.
> 970 owners you have been had.


You don't seem to be even remotely interested in approaching this scientifically, and seem more interested in the perceived opportunity for drama. This doesn't prove anything. You should compare the same results to the 980 as a baseline.

Also, why would slower memory lead to dropped frames? If anything you would see very slow frame times, not instantaneous ones. Obviously this is unrelated.

If FPS goes down with more memory usage, it's likely that you turned up the resolution (more work, lower FPS) or turned up settings (more work, lower FPS). If all you did was change texture quality and you saw lower FPS, that would be decent evidence.

Correlation does not imply causation.


----------



## Redwoodz

Quote:


> Originally Posted by *Exilon*
> 
> The SLI 980 doesn't have this issue and the FCAT is the same.
> http://www.guru3d.com/articles-pages/geforce-gtx-980-sli-review,9.html


Well, there is also Nvidia's power-draw trickery: they purposely toggle voltage while gaming to give better numbers.
Quote:


> Originally Posted by *Seven7h*
> 
> You dont seem to be even remotely interested in approaching this scientifically, and seem more interested in the perceived opportunity for drama. This doesn't prove anything. You should compare the same results to 980 as a baseline.
> 
> If FPS goes down with more memory usage it's likely that you turned up resolution (more work, lower FPS) or you turned up settings (more work, lower FPS). If all you did was change texture quality and you saw lower FPS, that would be decent evidence.
> 
> Correlation does not imply causation.


The graph does not lie my friend, whatever the cause.


----------



## swiftypoison

I honestly feel cheated. Not only did they purposely withhold information about the VRAM configuration, they are now playing with the numbers in order to avoid some bad PR. I was planning on ordering a GTX 980, but now I'll just keep my GTX 770 Classy and wait for AMD's 380X.


----------



## Seven7h

Quote:


> Originally Posted by *Redwoodz*
> 
> Well there is also Nvi's powerdraw trickery too, they purposely toggle voltage while gaming to give better numbers.
> The graph does not lie my friend, whatever the cause.


Are you capable of hypothesizing how slower memory could result in dropped frames instead of just slower performance overall? Memory residency is not a spiky, high-frequency process, and it certainly doesn't last for just one frame at a time.

If this is even valid data, it's pretty clearly unrelated to the separate 512MB memory section.


----------



## nSone

u saw this right?
https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4432672/#4432672


----------



## Exilon

Quote:


> Originally Posted by *Redwoodz*
> 
> Well there is also Nvi's powerdraw trickery too, they purposely toggle voltage while gaming to give better numbers.
> The graph does not lie my friend, whatever the cause.


You're just trolling m8.
Quote:


> Originally Posted by *poii*
> 
> voila a small update from Nai talking a bit how CUDA and his benchmark work
> 
> https://translate.google.com/translate?hl=en&sl=de&tl=en&u=http%3A%2F%2Fwww.computerbase.de%2Fforum%2Fshowthread.php%3Ft%3D1435408%26page%3D7%26p%3D16912375%23post16912375&sandbox=1


Chrome has a better translation than translate.google.com. How does that work?
Quote:


> The benchmark tries to reduce this effect by repeatedly requesting the data in each storage area in turn. The first request "should" cause a page fault, and the page fault "should" copy the page from CPU DRAM into GPU DRAM. The remaining global memory accesses would then run at DRAM bandwidth. At least, that was my assumption.
> 
> Interestingly, the GPU does not behave the way I expected. It does not seem to upload the corresponding data into its own DRAM; instead, on every memory access it requests the data directly from CPU DRAM again via a page fault in CUDA. So in such cases the benchmark more or less measures CUDA's swapping behavior rather than DRAM bandwidth. This can easily be verified by letting applications run in the background that consume a lot of the GPU's DRAM, so that more swapping is needed. In that case, the benchmark collapses as well.


In English:
There's a bug in CUDA memory allocation and Nai's benchmark is measuring the effective PCIe bus bandwidth over the last 1 GB of virtual address space instead of VRAM bandwidth. The 500 MB partition should be slower than the other 3.5 GB, but it probably isn't 10x slower.
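A quick back-of-envelope check of this point. All figures below are theoretical peaks, and the lone 32-bit controller width for the slow segment is my assumption for illustration, not a confirmed spec:

```python
# Sanity check: a ~20 GB/s reading sits in PCIe-plus-cache territory,
# far below even a hypothetical crippled VRAM partition.
pcie3_x16 = 16 * 8e9 * (128 / 130) / 8 / 1e9   # 16 lanes, 8 GT/s, 128b/130b encoding -> GB/s
vram_full = 256 / 8 * 7.0                      # advertised 256-bit bus @ 7 Gbps effective
slow_seg  = 32 / 8 * 7.0                       # assumed lone 32-bit controller @ 7 Gbps

print(f"PCIe 3.0 x16 peak: {pcie3_x16:.1f} GB/s")   # ~15.8
print(f"Full VRAM peak:    {vram_full:.0f} GB/s")   # 224
print(f"Slow segment:      {slow_seg:.0f} GB/s")    # 28

# A ~20 GB/s benchmark result is just above the PCIe peak (plausible with
# L2 caching on top) and well below even the assumed slow segment.
measured = 20.0
assert pcie3_x16 < measured < slow_seg
```

Under these assumptions, the ~20 GB/s figure is consistent with data being swapped over the bus rather than read from a crippled on-card partition, which is exactly the CUDA-allocation explanation above.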


----------



## poii

It sounds to me like the benchmark can't clear the VRAM, and thus CUDA writes the created allocations (or whatever you'd call them) somewhere else, therefore measuring PCIe speed or something else.


----------



## Redeemer

Guys, is it the BIOS or the driver that dictates memory allocation?


----------



## Clovertail100

If this was intentional, nVidia probably knew it could come to light, and they probably had their lawyers involved before ever screwing over these 970 owners. I'm sure they were confident that the benefits of all this would outweigh the consequences for them.

If it wasn't intentional, wow.
But no, it was intentional.


----------



## Seven7h

Quote:


> Originally Posted by *Exilon*
> 
> There's a bug in CUDA memory allocation and Nai's benchmark is measuring the effective PCIe bus bandwidth over the last 1 GB of virtual address space instead of VRAM bandwidth. The 500 MB partition should be slower than the other 3.5 GB, but it probably isn't 10x slower.


This is 95% likely the case, which means that people were indeed seeing that benchmark treat the GPU as a 3.5GB card. But it's just plain wrong, since that's not what games get... They would get the 512MB of much-faster-than-system-memory memory. This test clearly didn't simulate real games.

These are not workstation or data center GPUs, so even if the 3.5GB-limit-before-spilling-to-system-memory behavior when allocating through CUDA were intentional and permanent (unlikely), games simply don't spill to system memory above 3.5GB like this.


----------



## Redwoodz

Quote:


> Originally Posted by *Exilon*
> 
> You're just trolling m8.


Yes, I am just making this up.


http://www.tomshardware.co.uk/nvidia-geforce-gtx-980-970-maxwell,review-33038-12.html


----------



## Seven7h

Quote:


> Originally Posted by *Redwoodz*
> 
> Yes, I am just making this up.
> 
> 
> http://www.tomshardware.co.uk/nvidia-geforce-gtx-980-970-maxwell,review-33038-12.html


At first I didn't understand why you are posting what you are posting.

Now it has become clear that you don't understand why you are posting what you are posting.


----------



## Menta

Where do power and heat factor in?


----------



## looniam

Quote:


> Originally Posted by *Redwoodz*
> 
> LMAO!
> 
> 
> 
> 
> 
> 
> 
> FCAT SLi results
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Look at all the dropped frames. http://www.guru3d.com/articles-pages/geforce-gtx-970-sli-review,8.html
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Look at the frame drops,and the declining FPS as more VRAM is used.
> 970 owners you have been had.


very nice SLI results BUT for a single card:




http://www.guru3d.com/articles-pages/nvidia-geforce-gtx-970-and-980-reference-review,1.html

how about paying more attention?


----------



## Mad Pistol

Quote:


> Originally Posted by *Redeemer*
> 
> Guy is it the BIOS or Driver that dictates memory allocation?


Both, but this is a hardware fault. No amount of BIOS changes or driver tweaking can fix it. Only a physical change on the chip's die can fix this.


----------



## Menta

Quote:


> Originally Posted by *Mad Pistol*
> 
> Both, but this is a hardware fault. No amount of BIOS changes or driver tweaking can fix it. Only a physical change on the chip's die can fix this.


i wish you were wrong


----------



## Exilon

Quote:


> Originally Posted by *Redwoodz*
> 
> Yes, I am just making this up.
> http://www.tomshardware.co.uk/nvidia-geforce-gtx-980-970-maxwell,review-33038-12.html


You see, this is another case where the "tech news" blogs should really take their findings to the manufacturer for comment instead of just publishing them outright.

Tom's Hardware (and you) didn't know that Gigabyte had shipped modified firmware on their GTX 980/970s with a 250W power limit.

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-13.html
Quote:


> We originally posted Power Consumption Torture (GPGPU) results that showed a simulated GeForce GTX 970 reference card pulling over 240 Watts. This does not represent Nvidia's reference GeForce GTX 970 board because our data point was simulated with a Gigabyte GTX 970 card that has a non-reference ~250 Watt power target, unlike the reference board's ~150 W power target.
> 
> We have since pulled that data since it does not represent Nvidia's reference GeForce GTX 970 card. On the other hand, as far as we know there are no actual GeForce GTX 970 reference card designs for sale as each manufacturer has put their own spin on this model. None of the manufacturers we have talked to have released a GeForce GTX 970 card with a ~150 Watt power target as of this time, opting instead to give this product more performance headroom.
> 
> This is an issue we are keeping a close eye on, and we will follow up with a detailed investigation in the near future. We are curious to see if a reference-based GeForce GTX 970 will perform in the same league as the cards we have tested with higher power targets, but it would certainly make more sense in an HTPC or for use in smaller form factors. In the meantime, we have removed the 'simulated' GeForce GTX 970 data point from the following charts.


Of course, this edit was way too late to stop all the AMD trolls like you and Abwx from going around claiming that Nvidia is lying about their TDP for the next year.


----------



## CasualCat

What's with the conspiracy theories about the reviewers? Some people's red tinfoil is too tight.









Nvidia may have screwed up here, but unless the reviewers were fudging their tests (the tests which are usually standardized) I don't see how they're accountable. I do think the better ones though will use this experience/knowledge to start doing additional tests on cards meant to catch this sort of thing.
Quote:


> Originally Posted by *Redwoodz*
> 
> LMAO!
> 
> 
> 
> 
> 
> 
> 
> FCAT SLi results
> 
> 
> 
> Look at all the dropped frames. http://www.guru3d.com/articles-pages/geforce-gtx-970-sli-review,8.html
> 
> 
> 
> Look at the frame drops,and the declining FPS as more VRAM is used.
> 970 owners you have been had.


I remember reading those reviews and hoping to see a follow-up to the dropped frame investigation and not hearing/reading anything. Is this definitely the cause though or are you just speculating?


----------



## Redwoodz

Quote:


> Originally Posted by *looniam*
> 
> very nice SLI results BUT for a single card:
> 
> 
> 
> 
> http://www.guru3d.com/articles-pages/nvidia-geforce-gtx-970-and-980-reference-review,1.html
> 
> how about paying more attention?


It is relevant because SLI mirrors VRAM across cards, so if there is a VRAM limitation the issue would show up even more clearly.









Quote:


> Originally Posted by *Exilon*
> 
> You see, this is another case where the "tech news" blogs should really take their findings to the manufacturer for comment instead of just publishing them outright.
> 
> Tom's Hardware (and you) didn't know that Gigabyte had a modified firmware on their GTX 980/970s to have a power limit of 250W.
> 
> http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-13.html
> Of course, this edit was way too late to stop all the AMD trolls like you and Abwx from going around claiming that Nvidia is lying about their TDP for the next year.


I never claimed they lied. I am stating they achieved their superior TDP figures by cycling power, not necessarily by an inherently more efficient process. Whether that is acceptable is up to the user. I suspect there may be some drawbacks to this approach, however, such as driver problems.
Quote:


> Originally Posted by *CasualCat*
> 
> What's with the conspiracy theories about the reviewers? Some people's red tinfoil is too tight.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nvidia may have screwed up here, but unless the reviewers were fudging their tests (the tests which are usually standardized) I don't see how they're accountable. I do think the better ones though will use this experience/knowledge to start doing additional tests on cards meant to catch this sort of thing.
> I remember reading those reviews and hoping to see a follow-up to the dropped frame investigation and not hearing/reading anything. Is this definitely the cause though or are you just speculating?


Speculating, but I will await PCPer's full FCAT results, since they seem to be missing so far.


----------



## mtcn77

Quote:


> Originally Posted by *Exilon*
> 
> You see, this is another case where the "tech news" blogs should really take their findings to the manufacturer for comment instead of just publishing them outright.
> 
> Tom's Hardware (and you) didn't know that Gigabyte had a modified firmware on their GTX 980/970s to have a power limit of 250W.
> 
> http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-13.html
> Of course, this edit was way too late to stop all the AMD trolls like you and Abwx from going around claiming that Nvidia is lying about their TDP for the next year.


"*None of the manufacturers we have talked to have released a GeForce GTX 970 card with a ~150 Watt power target as of this time, opting instead to give this product more performance headroom.*"


----------



## Forceman

Quote:


> Originally Posted by *Redwoodz*
> 
> I never claimed they lied. I am stating they achieved their superior TDP figures by cycling power, not necessarily by an inherently more efficient process. Whether that is acceptable is up to the user. I suspect there may be some drawbacks to this approach, however, such as driver problems.


Do you have any comparable graphs of AMD cards that show a different power use pattern? Because power use doesn't necessarily mean the voltage is cycling, and doesn't mean it is anything out of the ordinary.


----------



## Redeemer

So the 970 has 64 ROPs but cannot use them effectively as the 980 can?


----------



## looniam

Quote:


> Originally Posted by *Redwoodz*
> 
> It is relevant because SLI mirrors VRAM across cards, so if there is a VRAM limitation the issue would show up even more clearly.


we both know the vram is mirrored on both cards, but its usage doesn't go up compared to a single card. and that it's not _dropped frames_ (a common symptom in SLI) but *frame time stuttering.*

however, my little 780TI would like to discuss TR and Thief needing more than 3GB of vram @ 1440 under the AA settings guru3d used in their review









btw, it's nice you are still using that flawed, cherry-picked power consumption result from THG. i find it highly intriguing *that they stopped using it.*

i wonder why?
maybe because their results were never replicated on _ANY OTHER_ site?

hhhhmmmmmmmm.


----------



## mxthunder

This is uber shady. I am disappointed in Nvidia.


----------



## venomblade

Dang, was just about to pull the trigger on a 970. So this can't be fixed via driver/bios update? Guess I'll wait for a v2.


----------



## PureBlackFire

oh noes man. every time I give nvidia the benefit of the doubt, they prove unworthy.


----------



## mrkk

Quote:


> Originally Posted by *looniam*
> 
> [/SPOILER]
> 
> that's unreliable benchmark . .throw it away.


Unreliable tester, not unreliable benchmark.


----------



## iTurn

Quote:


> Originally Posted by *benbenkr*
> 
> It's not false advertising. The fact remains, the 970 has 4GB of VRAM on board. That's it, don't argue about this further.
> 
> The issue is how the 4GB of VRAM is being used, *NOT* that the 970 having a missing 512mb of VRAM.
> It's called first world problems.
> 
> People are angry that the 4GB of VRAM doesn't work as it should and that Nvidia kept quiet about it. A few days ago a rep was saying they are "looking" into the issue, but really it's more like how they should word out their PR statement instead of actually doing anything specifically to the 970.


Calm your tits, there's a lot of deception in their marketing now that this discovery has been revealed... see why below.
Quote:


> Originally Posted by *velocd*
> 
> It's not false advertisement but it is deceptive advertisement, which may hurt Nvidia sales in the long run. If something is marketed as 4GB VRAM you reasonably expect it to use up to 4GB of VRAM effectively, not 3.5GB effective and the last .5 poorly. I'm okay with the latter as long as it's marketed to perform that way, but Nvidia has kept that quiet for obvious reasons.


Quote:


> Originally Posted by *maarten12100*
> 
> They have basically made up this 2 way segmentation PR rubish. Just now.
> 
> Should've listed it as 3,5 + 0,5 GB or something like that when they sold it.
> 
> Nvidia lists throughput here:
> http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970/specifications
> 
> 224GB/s for the full 4GB while that is not the case so that would qualify as false advertising because the actually bandwidth is 1/10th of that and that means it is significantly off.
> Bandwidth is cut to a point where hiccups occur.
> indeed


Agreed! False advertising if I've ever seen it, and the nvidia defense force is always out... same with the 970 coil whine and now with this "smear campaign".

I'm just glad Nvidia admitted it and is hopefully looking into a fix.


----------



## looniam

Quote:


> Originally Posted by *mrkk*
> 
> Unreliable tester not unreliable benchmark.


i guess you missed the part where Nai has now seen/admitted a cuda programming error?
Quote:


> The benchmark "actually" measures not the DRAM bandwidth but the bandwidth of the "global memory" in CUDA. The global memory in CUDA is a virtual memory space, which includes not only the DRAM of the GPU but also DRAM areas of the CPU.


dram areas of the cpu via the pci-e bus
here

i spent hours with the benchmark on guru3D's forum with others and experienced several inconsistencies on a 780TI . . so that's borked too? :/
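for context, a sweep benchmark like Nai's conceptually just allocates memory in chunks and times a copy through each one, reporting GB/s per chunk. here's a toy CPU-side sketch in python (this touches host RAM, not VRAM — it's only meant to show the shape of such a test, not to measure any GPU):

```python
import time

def measure_copy_bandwidth(chunk_mb=64, n_chunks=8):
    """Allocate n_chunks buffers and time a full copy through each,
    mimicking how a VRAM-sweep benchmark reports GB/s per chunk."""
    size = chunk_mb * 1024 * 1024
    chunks = [bytearray(size) for _ in range(n_chunks)]
    results = []
    for i, buf in enumerate(chunks):
        start = time.perf_counter()
        _ = bytes(buf)                 # force a full read + copy of the chunk
        elapsed = time.perf_counter() - start
        gbps = (size / elapsed) / 1e9  # bytes/sec -> GB/s
        results.append((i, gbps))
    return results

for idx, gbps in measure_copy_bandwidth():
    print(f"chunk {idx}: {gbps:.1f} GB/s")
```

on a real GPU the catch (per the quote above) is that "global memory" can silently include CPU DRAM reached over PCIe, which is why the tool's tail-end numbers were misleading.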


----------



## mtcn77

Quote:


> Originally Posted by *looniam*
> 
> i guess you missed the part where Nai has now seen/admitted a cuda programming error?
> dram areas of the cpu via the pci-e bus
> here
> 
> i spent hours with the benchmark on guru3D's forum with others and experienced several inconsistencies on a 780TI . . so that's borked too? :/


Technically, it has 40 rops, so yeah.


----------



## battleaxe

Well. I'm definitely not getting a second 970 now. Crossfire it is.


----------



## Creator

Are any other cards with cut down ROPs affected by this? Like GTX 780?


----------



## looniam

Quote:


> Originally Posted by *mtcn77*
> 
> Technically, it has 40 rops, so yeah.


pssst . .*48* ROPs


----------



## DzillaXx

The card still performs well despite the 3.5GB, but this is something Nvidia should have told its users from the start. It should have been in the release reviews, not something to find out a month later after people start reporting problems.

If the cut-down Maxwell die can't allocate all the memory without performance impacts, this really should have been day-one information.

You simply are not getting a 4GB card; that last 512MB is useless. In fact, I would ask for a way to disable that 512MB altogether, as I wouldn't want to take the chance of it impacting overall performance.


----------



## Seven7h

Quote:


> Originally Posted by *DzillaXx*
> 
> You simply are not getting a 4GB card, that last 512mb is useless. Infact I would ask for a way to disable that 512mb altogether, as I wouldn't want to take the change of it impacting overall performance.


Sorry, but that is just straight-up factually wrong.

Unlike a card where the entire memory pool runs at identical access speeds, the 970 uses that last 512MB for graphics resources only when the first 3.5GB is actually exceeded, and it is still much faster than system memory.

Couple this with the fact that only low priority/infrequently accessed resources go into that 512MB when above 3.5GB in the first place, and the effect on your overall frame time is absolutely minute.

Disabling it means literally anything above 3.5GB goes into system memory, which is the bandwidth you see above 3.5GB in the Nai CUDA app. I thought you *didn't* want the bandwidth seen in that test? Cause that's what you'd get if you disabled the additional 512MB.

I'm not saying the lack of heads-up about this is great, but it's certainly being blown out of proportion, especially given that it doesn't seem to impact gaming performance.
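To put rough numbers on why keeping spillover on-card beats spilling to system memory, here's a back-of-the-envelope model (every bandwidth figure below is an assumption for illustration, not an NVIDIA spec):

```python
# Rough effective-bandwidth model for split memory access.
# All bandwidth figures are illustrative assumptions, not measured specs.
FAST_SEG_GBPS = 196.0   # assumed speed of the ~3.5GB segment
SLOW_SEG_GBPS = 28.0    # assumed speed of the ~0.5GB segment
SYSTEM_GBPS   = 8.0     # assumed PCIe path to system RAM

def effective_bandwidth(fraction_slow, slow_gbps):
    """Harmonic-mean combination: total time per GB is the weighted
    sum of time spent on the fast path and on the slow path."""
    time_per_gb = (1 - fraction_slow) / FAST_SEG_GBPS + fraction_slow / slow_gbps
    return 1.0 / time_per_gb

for frac in (0.01, 0.05, 0.125):
    on_card = effective_bandwidth(frac, SLOW_SEG_GBPS)
    spilled = effective_bandwidth(frac, SYSTEM_GBPS)
    print(f"{frac:.0%} slow accesses: {on_card:.0f} GB/s on-card "
          f"vs {spilled:.0f} GB/s via system RAM")
```

Even with 12.5% of accesses landing in the slow segment (the full 0.5GB of 4GB), the on-card path comes out well ahead of falling back to system memory, which is the point about disabling the segment being self-defeating.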


----------



## MonarchX

The worst thing is that after all this, GTX 970 will still have 4GB in specs and in ads without mention of this huge defect. Be lucky if you get ANYTHING out of all this. No games will compensate for this and there is no chance of recall and a newer revision. GTX 970 should be PULLED OFF THE MARKET.

I do wish a huge petition for a refund will get started somewhere to have nVidia bite its nails and toe nails!


----------



## Exilon

Quote:


> Originally Posted by *Creator*
> 
> Are any other cards with cut down ROPs affected by this? Like GTX 780?


No, this seems unique to 2nd-gen Maxwell. It's also the first time Nvidia has had a symmetric memory setup (4x64-bit controllers with two 4Gb GDDR5 chips attached to each) while prioritizing a section of memory for internal architecture reasons. There looks to be a bug where the 512MB partition isn't being used at all in the benchmark everyone's running. That isn't expected behavior and Nvidia will need to fix it.


----------



## Mad Pistol

Quote:


> Originally Posted by *Creator*
> 
> Are any other cards with cut down ROPs affected by this? Like GTX 780?


Nope. I can max out the VRAM on my GTX 780, and the performance is great. It isn't until I try and exceed the 3GB of framebuffer that things start getting hairy... which is normal.

You guys know what this means, right? GK110 is still a king in its own right. A true beast without compromises.


----------



## cutty1998

Quote:


> Originally Posted by *MonarchX*
> 
> The worst thing is that after all this, GTX 970 will still have 4GB in specs and in ads without mention of this huge defect. Be lucky if you get ANYTHING out of all this. No games will compensate for this and there is no chance of recall and a newer revision. GTX 970 should be PULLED OFF THE MARKET.
> 
> I do wish a huge petition for a refund will get started somewhere to have nVidia bite its nails and toe nails!


I concur. This is bad. Thinking AMD will capitalize off this!


----------



## Redeemer

Can anyone explain why exactly the 970 cannot match the 980s pixel throughput?


----------



## MonarchX

They need to check if GTX 960 has similar problems. Is there a 4GB version? I feel so lucky I went for GTX 980, the only cut Maxwell board worth purchasing.


----------



## Mopar63

Quote:


> Originally Posted by *velocd*
> 
> It's not false advertisement but it is deceptive advertisement, which may hurt Nvidia sales in the long run. If something is marketed as 4GB VRAM you reasonably expect it to use up to 4GB of VRAM effectively, not 3.5GB effective and the last .5 poorly. I'm okay with the latter as long as it's marketed to perform that way, but Nvidia has kept that quiet for obvious reasons.


Simple solution to this: nVidia should offer a 100% refund to anyone who feels they were cheated by the marketing that was done.


----------



## nSone

is OCCT a proper GPU benchmark we could use to compare memory/bandwidth across different cards including amd?


----------



## Mad Pistol

Quote:


> Originally Posted by *MonarchX*
> 
> They need to check if GTX 960 has similar problems. Is there a 4GB version? I feel so lucky I went for GTX 980, the only cut Maxwell board worth purchasing.


GTX 960 is a fully enabled GM206 core. There's almost no chance that this issue will affect it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mad Pistol*
> 
> Nope. I can max out the VRAM on my GTX 780, and the performance is great. It isn't until I try and exceed the 3GB of framebuffer that things start getting hairy... which is normal.
> 
> You guys know what this means, right? GK110 is still a king in its own right. A true beast without compromises.


Think about it. The GTX 780 is a $650 card, the GTX 970 a $330 card.


----------



## mtcn77

Quote:


> Originally Posted by *Mopar63*
> 
> Simple solution to this, nVidia should offer a 100% refund to anyone that feels they where cheated by the marketing that was done.


And scrap the marketing material to restate the card's correct bus width.


----------



## HyperC

Quote:


> Originally Posted by *MonarchX*
> 
> The worst thing is that after all this, GTX 970 will still have 4GB in specs and in ads without mention of this huge defect. Be lucky if you get ANYTHING out of all this. No games will compensate for this and there is no chance of recall and a newer revision. GTX 970 should be PULLED OFF THE MARKET.
> 
> I do wish a huge petition for a refund will get started somewhere to have nVidia bite its nails and toe nails!


Don't wish about it, start one! Anyway, has anyone tried flashing a 980 BIOS to a 970 yet? Maybe it could fix the issue.


----------



## Forceman

Quote:


> Originally Posted by *HyperC*
> 
> Don't wish about it start one! Anyways anyone try to flash 980 bios to 970 yet? maybe it could fix the issue perhaps


It's a hardware limitation. No BIOS is going to be able to fix it. The best they'll be able to do is minimize the impact by limiting the use of the slower VRAM area, which is what it sounds like the drivers already do.

But until someone comes up with a way to test the impact of using that memory, there's really no way to say how much of an impact it really has. That "benchmark" program is bugged, so that's useless, and just increasing the graphics options on a game to increase memory usage doesn't help too much because increasing the options is going to hurt performance anyway. You'd need a performance-neutral way of increasing memory use, and I don't know what that is, although an FCAT analysis of a high memory use situation would be helpful, since that would show any stuttering. I'm sure someone is working on that.

Nvidia's comparison of the performance relative to the GTX 980 as you increase the memory is another possibility for independent testing.
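For what it's worth, the kind of metric an FCAT-style pass would surface is easy to sketch: frametime percentiles and frame-to-frame jumps instead of average FPS. A minimal example with made-up frametime traces (Python purely for illustration):

```python
def frametime_stats(frametimes_ms):
    """Average FPS hides stutter; the 99th-percentile frametime and the
    worst frame-to-frame jump are what expose it."""
    n = len(frametimes_ms)
    avg_fps = 1000.0 / (sum(frametimes_ms) / n)
    ordered = sorted(frametimes_ms)
    p99 = ordered[min(n - 1, int(n * 0.99))]          # 99th-percentile frametime
    worst_jump = max(abs(b - a) for a, b in zip(frametimes_ms, frametimes_ms[1:]))
    return avg_fps, p99, worst_jump

smooth  = [16.7] * 100                    # steady ~60 FPS trace (made up)
stutter = [16.7] * 99 + [100.0]           # nearly same average, one huge spike
for label, trace in (("smooth", smooth), ("stutter", stutter)):
    fps, p99, jump = frametime_stats(trace)
    print(f"{label}: avg {fps:.0f} FPS, p99 {p99:.1f} ms, worst jump {jump:.1f} ms")
```

The two traces have almost identical average FPS, but the percentile and jump numbers separate them immediately, which is exactly why a high-VRAM-usage FCAT run would settle this where FPS charts can't.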


----------



## Orangey

Has anyone started a petition/class action?


----------



## Vesku

If this was a non-issue why do some people not trust that the response is legitimate? Been seeing on some sites "How do we know PCPer isn't making this up?" "Maybe all those other sites are referencing PCPer who made it up." It matters to at least some people and given the difference in how 550 Ti and other mixed memory was reported vs the 970 I'd say Nvidia marketing would be well aware of the potential for reducing the hype around the 970 launch.


----------



## Exilon

Quote:


> Originally Posted by *Redeemer*
> 
> Can anyone explain why exactly the 970 cannot match the 980s pixel throughput?


The same reason why it can't match the GTX 980's "bandwidth" in Nai's benchmark. Both benches are bottlenecked by shaders.


----------



## Mad Pistol

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Think about it. GTX780 is $650 card, GTX970 $330 card.


It was around $500 at the end of its life.

No one here is denying that the GTX 970 is both a cheaper and a faster card. However, at least we got exactly what was advertised.









I love my GK110 beast.


----------



## Art Vanelay

Quote:


> Originally Posted by *Ganf*
> 
> We all encounter this situation when we keep our hardware longer than a year, something is released that we want to play, and we find out that our card can no longer max out the refresh rate of our monitor.
> 
> The speed limit isn't constantly being raised from year to year on the interstate, so no, you don't need to future-proof your car to any great degree.


Yeah, but there are still people who realistically won't be using more than 3.5GB of VRAM within the next few years.

Someone who's running a 120hz monitor and doesn't like AA probably isn't going to use that much VRAM in the future... plus saving money now and just buying a better card when you need it isn't a bad option, considering you're only losing .5GB.

Quote:


> Originally Posted by *Menta*
> 
> thats not actually true anymore, with the luanch of the new consoles and multi threaded architecture , the i7 is actually starting to make some diference in newer games and in the future probably more so


From every benchmark I've seen, the performance improvement going from a core i5 to i7 is very far from worth the extra $100 it costs.


----------



## Clocknut

This is clearly a PR statement to avoid a 970 recall. It seems they do not want to do an "Intel P67"-style recall despite having 2 issues: coil whine + 3.5GB.

This is clearly false advertising; if the last 512MB is not performing at full bandwidth, they should have listed 4GB*** with a *** and a mention of what it means at the bottom of the box. Kinda like how Intel did with the 5820K's 28 PCIe lanes.

take for example..

1. would you be mad if you bought a turbocharged car not knowing the turbo stops working once you're in 6th gear?
2. How about an i7-4790K that advertises 4GHz but can only run 3GHz once you start using HT?

no other hardware maker does this, only Nvidia did. They would have been fine if they had put a *** next to 4GB.


----------



## hurleyef

Quote:


> Originally Posted by *Vesku*
> 
> If this was a non-issue why do some people not trust that the response is legitimate? Been seeing on some sites "How do we know PCPer isn't making this up?" "Maybe all those other sites are referencing PCPer who made it up."


It's official. People are just paranoid.


----------



## Mad Pistol

Quote:


> Originally Posted by *hurleyef*
> 
> It's official. People are just paranoid.


The fact that we have multiple independent sources that have tested this and confirmed it on their cards should be enough to prove that this is not some hoax made up by an AMD fan. The issue is very real.

And I also agree that it's not that big of a deal in terms of card performance. What is the big deal, though, is that people were given a different product than what was advertised. That is a massive liability for Nvidia.


----------



## hurleyef

Quote:


> Originally Posted by *Mad Pistol*
> 
> The fact that we have multiple independent sources that have tested this and confirmed it on their cards, that should be enough to prove that this is not some hoax made up by an AMD fan. The issue is very real.


I was referring to the post itself, as originally PCPer didn't link to the original forum post that they sourced, leading some to suspect that they'd just made it up altogether.


----------



## Art Vanelay

Quote:


> Originally Posted by *Mad Pistol*
> 
> The fact that we have multiple independent sources that have tested this and confirmed it on their cards, that should be enough to prove that this is not some hoax made up by an AMD fan. The issue is very real.
> 
> And I also agree that it's not that big of a deal in terms of card performance. What is the big deal, though, is that people were given a different product than what was advertised. That is a massive liability for Nvidia.


According to Nvidia, it'll still use the extra 0.5GB of RAM if a game needs it, though.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Art Vanelay*
> 
> According to Nvidia, it'll still use the extra 0.5GB of RAM if a game needs it, though.


Nvidia has to come up with a better explanation.


----------



## PureBlackFire

Quote:


> Originally Posted by *Art Vanelay*
> 
> According to Nvidia, it'll still use the extra 0.5GB of RAM if a game needs it, though.


the issue is not using the vram. when it uses the last .5GB the bandwidth gets reduced by a huge amount and this leads to stutter in games. I tested this with several games at 4K last night. I didn't run FC4 or Crysis 3, and the only game I was able to confirm the BS with was BF4, because almost all the other games tested (except Tomb Raider, Alan Wake and Borderlands 2, none of which used over 2.5GB at 4K) ran so terribly slow at 4K _with_ the settings necessary to eat up all that vram that I wouldn't have noticed any additional performance degradation on top of how badly they ran in the first place. average frame rate didn't even drop in BF4, it just got real stuttery every time AB showed > 3500mb vram usage. for single card users this is mostly a non-issue. for SLI users it can be an issue. from what I've seen a single 970 runs out of horsepower way before it runs out of vram.


----------



## Vesku

Quote:


> Originally Posted by *hurleyef*
> 
> It's official. People are just paranoid.


I was using their disbelief as an example that Nvidia should have disclosed this when they launched the 970. Some people even had trouble believing Nvidia would keep quiet about it.


----------



## Xoriam

Quote:


> Originally Posted by *Art Vanelay*
> 
> According to Nvidia, it'll still use the extra 0.5GB of RAM if a game needs it, though.


I've gotten ACU up to 4096MB @ 4K; however, the performance was not stunning once I hit the 3900MB mark.


----------



## Seven7h

Quote:


> Originally Posted by *PureBlackFire*
> 
> the issue is not using the vram. when it uses the last .5GB the bandwidth gets reduced by a huge amount and this leads to stutter in games. I tested this with several games at 4K last night. didn't run FC4 or Crysis 3, but the only game I was able to confirm the BS with was BF4. because almost all the other games tested (except Tomb Raider, Alan Wake and Borderlands 2, none of which used over 2.5GB at 4K) ran so terribly slow at 4K _with_ the necessary settings to eat up all that vram and I wouldn't have been able to notice any performance degrade as bad as they ran in the first place. average frame rate didn't even drop in BF4, it just got real stuttery every time AB showed > 3500mb vram usage. for single card users this is mostly a non issue. for SLI users it can be an issue. from what I've seen single 970 runs out of horsepower way before it runs out of vram.


That's not true. You don't get stutter due to slower memory bandwidth for a section of ram.

You experience stutter on a per frame basis. When frametimes vary from frame to frame, you feel stutter.

Reaching into a segment of slower memory just slows the frametime down on *every frame equally*.

So you put stuff you don't need frequently there and the performance impact is negligible. It's not like having a couple infrequently referenced resources there linearly and dramatically slows down frametimes. It depends on how much data is accessed, and how many times over the frame. Given that it's optimized to be the stuff that doesn't matter as much, the impact is 1-2% *when using memory above 3.5GB in a single command buffer*.

Even old 8-series GPUs had a notorious extra slowdown specific to higher MSAA settings, and it was bigger than 1-2%. That sort of thing isn't unheard of.

Whatever stutter you see is unrelated, or placebo.

Mass Hysteria!!!!!!!
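To put numbers on the "fixed cost per frame, not spikes" point, here's a toy model (every figure below is an assumption for illustration, not a measurement):

```python
# Toy model: extra per-frame cost of reading from the slow segment.
# All figures are assumptions for illustration, not measured specs.
BASE_FRAME_MS = 16.0    # assumed frame time with everything in the fast segment
SLOW_GBPS     = 28.0    # assumed bandwidth of the 0.5GB segment

def frame_time_ms(slow_bytes_per_frame):
    """If the same low-priority resources are read every frame, the
    penalty is a constant per-frame cost, not a stutter spike."""
    penalty_ms = (slow_bytes_per_frame / (SLOW_GBPS * 1e9)) * 1000.0
    return BASE_FRAME_MS + penalty_ms

for mb in (0, 8, 32):
    t = frame_time_ms(mb * 1024 * 1024)
    print(f"{mb:3d} MB/frame from slow segment -> {t:.2f} ms ({1000 / t:.1f} FPS)")
```

In this model a steady few MB per frame from the slow segment costs on the order of a couple percent of frame time, in line with the 1-2% figure above; stutter requires the cost to vary frame to frame, which is a different failure mode.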


----------



## FLCLimax

so it's not a defect, it's a feature eh?


----------



## Vesku

Quote:


> Originally Posted by *Seven7h*
> 
> That's not true. You don't get stutter due to slower memory bandwidth for a section of ram.
> 
> You experience stutter on a per frame basis. When frametimes vary from frame to frame, you feel stutter.
> 
> Reaching into a segment of slower memory just slows down the frametime down on *every frame equally*.
> 
> So you put stuff you don't need frequently there and the performance impact is negligible. It's not like having a couple infrequently referenced resources there linearly slows down frametimes. It depends on how much data is accessed, and how many times over the frame. Given that it's optimized to be the stuff that doesn't matter as much, the impact is 1-2% *when using memory above 3.5GB in a single command buffer*.
> 
> Even old 8-series GPUs had a notorious extra slowdown specific to higher MSAA settings. That sort of thing isn't unheard of.
> 
> Whatever stutter you see is unrelated, or placebo.
> 
> Mass Hysteria!!!!!!!


What assets end up in that 500MB will depend on the game. I guess titles that use the major licensed game engines will be relatively well catered for by Nvidia, such as Unreal Engine, Unity and CryEngine.


----------



## ZeusHavok

I have this issue with my 970s and I couldn't figure out why. I'm pretty annoyed because I game @ 5k and I am constantly running into this issue. Anything over 3.5gb VRAM causes crazy amounts of jittering and load lag.

I'm buying AMD cards next time, I can't be bothered with Nvidia's blatant lies anymore.


----------



## PureBlackFire

Quote:


> Originally Posted by *Seven7h*
> 
> That's not true. You don't get stutter due to slower memory bandwidth for a section of ram.
> 
> You experience stutter on a per frame basis. When frametimes vary from frame to frame, you feel stutter.
> 
> Reaching into a segment of slower memory just slows the frametime down on *every frame equally*.
> 
> So you put stuff you don't need frequently there and the performance impact is negligible. It's not like having a couple infrequently referenced resources there linearly and dramatically slows down frametimes. It depends on how much data is accessed, and how many times over the frame. Given that it's optimized to be the stuff that doesn't matter as much, the impact is 1-2% *when using memory above 3.5GB in a single command buffer*.
> 
> Even old 8-series GPUs had a notorious extra slowdown specific to higher MSAA settings, and it was bigger than 1-2%. That sort of thing isn't unheard of.
> 
> Whatever stutter you see is unrelated, or placebo.
> 
> Mass Hysteria!!!!!!!


thanks for the info but you can keep the sarcasm.









this could be an engine issue as it occurs at regular intervals. the vram usage does go over 3500mb each time the stutter occurs. like I said though, BF4 is the only game it happens in. everything else is either fine or borderline unplayable anyway.


----------



## raghu78

Nvidia's response to the whole thing is classic Nvidia - deny any wrongdoing or withholding of factual information from the customer. They have shown in the past how unscrupulous they are with their handling of the GPU bumpgate scandal. What did users expect from them this time around? Now if only the press got serious and investigated this by testing games at 1440p and 4K which exceed 3.5 GB VRAM usage, both in single GPU and SLI, complete with FCAT results. Does anybody have the guts to do a full-blown investigation and come out with results? A few games known to hit 3.5 GB+ usage are Middle Earth: Shadow of Mordor, Call of Duty: Advanced Warfare, AC Unity, Far Cry 4, Watch Dogs, Lords of the Fallen, Ryse: Son of Rome, and Dead Rising 3. The irony is that the majority of these games are TWIMTBP games.


----------



## looniam

Quote:


> Originally Posted by *Xoriam*
> 
> I've gotten ACU up to 4096 @4k, however the performance was not stunning when I hit the 3900 mark.


wellll . .

Quote:


> Originally Posted by *PureBlackFire*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> the issue is not using the vram. when it uses the last .5GB the bandwidth gets reduced by a huge amount and this leads to stutter in games. I tested this with several games at 4K last night. didn't run FC4 or Crysis 3, but the only game I was able to confirm the BS with was BF4. because almost all the other games tested (except Tomb Raider, Alan Wake and Borderlands 2, none of which used over 2.5GB at 4K) ran so terribly slow at 4K _with_ the necessary settings to eat up all that vram and I wouldn't have been able to notice any performance degrade as bad as they ran in the first place. average frame rate didn't even drop in BF4, it just got real stuttery every time AB showed > 3500mb vram usage. for single card users this is mostly a non issue. for SLI users it can be an issue.
> 
> 
> *from what I've seen single 970 runs out of horsepower way before it runs out of vram.*


excuse me for the "editing" PBF









to all:
*THE RESULTS OF THE NAI BENCHMARK HAVE BEEN PROVEN TO BE INVALID!*

do i need to repeat that? (yes, i think i do)

*THE RESULTS OF THE NAI BENCHMARK HAVE BEEN PROVEN TO BE INVALID!*

however is there an issue with *some* gtx970s?

yes that has been shown.

is nvida "limiting" the bandwidth? (oh, dear i have to do it again)

*THE RESULTS OF THE NAI BENCHMARK HAVE BEEN PROVEN TO BE INVALID!*

in the meantime nvidia has claimed the vram is there and available when it is needed. so, what we have at this moment is a problem with a symptom but an unknown cause. it is just as likely that some game code that allocates vram is not able to access it through the crossbar as easily as other games. (DX11 anyone?)

btw, as far as liability: good luck proving nvidia and/or its marketing department knowingly misled customers and fraudulently advertised the vram on the 970. get real - that ain't going to happen. that is not to say i don't feel bad for owners of the 970s _that are having issues._ i would highly recommend going to the nvidia forum and having a positive and constructive dialogue, and helping provide whatever testing is necessary for resolving your issue(s)

in the meantime, ignore the AMD shills that have been cross-posting on several forums.


----------



## mtcn77

Quote:


> Originally Posted by *ZeusHavok*
> 
> I have this issue with my 970s and I couldn't figure out why. I'm pretty annoyed because I game @ 5k and I am constantly running into this issue. Anything over 3.5gb VRAM causes crazy amounts of jittering and load lag.
> 
> I'm buying AMD cards next time, I can't be bothered with Nvidia's blatant lies anymore.


MLAA is even smoother than SMAA. Also, RadeonPro counteracts coil whine as a side effect.


----------



## Seven7h

Quote:


> Originally Posted by *Vesku*
> 
> Not all textures are equal. Those of a seldom used object will be less likely to cause issues.


That's what I just said. I'm saying a texture is a type of resource, so when you said it depends on the type of resource it makes it sound like anything could go in there and performance of your game will be non-deterministic.

My point was just that that's not the case and things will work reliably, with the space being used as efficiently as possible.


----------



## raghu78

Quote:


> Originally Posted by *looniam*
> 
> in the meantime nvidia has claimed the vram is there and available when it is needed. so, what there is at this moment is a problem that has a symptom but an unknown cause. it is just as likely that some game code that allocates vram is not able to access through the crossbar as easily as other games.(DX11 anyone?)
> 
> btw, as far as liability; good luck on nvidia and or its marketing department knowing mislead and fraudulently advertised the vram on the 970. get real - that ain't going to happen. that is not to say i feel bad for owners of the 970s _that are having issues._ i would like to highly recommend going to the nvidia forum and having a positive and construct dialogue and help provide what testing is necessary at resolving your issue(s)
> 
> in the meantime ignore the AMD shills that have been circularly posting on several forums.


You are trying real hard to do damage control for Nvidia.







But the fact is Nvidia withheld information from the customer and falsely advertised 4GB at 224 GB/s. Nowhere in the GTX 970 marketing material given to press or on the product box was the partition and the access speed difference mentioned.


----------



## Vesku

Quote:


> Originally Posted by *Seven7h*
> 
> That's what I just said. I'm saying a texture is a type of resource, so when you said it depends on the type of resource it makes it sound like anything could go in there and performance of your game will be non-deterministic.
> 
> My point was just that that's not the case and things will work reliably, with the space being used as efficiently as possible.


I doubt Nvidia will be able to ensure that the assets of every game using 3.5+GB of VRAM are properly aligned between the 3.5GB and 0.5GB section. Time will tell.


----------



## looniam

Quote:


> Originally Posted by *raghu78*
> 
> You are trying real hard to do damage control for Nvidia.
> 
> 
> 
> 
> 
> 
> 
> But the fact is Nvidia withheld information from the customer and falsely advertised 4GB at 224 GB/s. Nowhere in the GTX 970 marketing material given to press or on the product box was the partition and the access speed difference mentioned.


neither you nor anyone else has valid proof of those claims, matey. you have completely ignored:

*THE RESULTS OF THE NAI BENCHMARK HAVE BEEN PROVEN TO BE INVALID!*

let's look at history; was there anything on the box about the 550 Ti's mixed memory densities?

*NOPE.*

and btw, i am doing nothing for nvidia. i am trying to provide a factual basis from which the owners of cards having issues can find a resolution.

you on the other hand, seem to be on a witch hunt, for your own reasons . .


----------



## Seven7h

Quote:


> Originally Posted by *raghu78*
> 
> You are trying real hard to do damage control for Nvidia.
> 
> 
> 
> 
> 
> 
> 
> But the fact is Nvidia withheld information from the customer and falsely advertised 4GB at 224 GB/s. Nowhere in the GTX 970 marketing material given to press or on the product box was the partition and the access speed difference mentioned.


It's an implementation detail. You bought it based on the performance it provides in games at various settings as seen in real reviews, not for specs on a box.

Memory bandwidth is never guaranteed anyway, since it is always a theoretical maximum from a contrived measurement case. No game actually sees the full memory bandwidth. By your logic you've been cheated on every GPU you've bought from NVIDIA and AMD alike. Also you've been screwed over by Intel's and AMD's CPU system-memory bandwidth quotes.

Memory bandwidth is a top speed figure, not an average or minimum.


----------



## Vesku

We won't know the throughput of that 0.5GB section until Nvidia updates CUDA to know about it or someone figures out another way to test it. But by Nvidia's own admission there is a performance penalty once that 0.5GB section comes into play, they are simply saying it is low single digit %.
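Until someone measures that section, the arithmetic can at least be sketched. Treating the card as two pools and weighting by how much traffic hits each, the blended bandwidth is a traffic-weighted harmonic mean. Every figure below is hypothetical — the slow segment's real speed is exactly the unknown here, and the fast figure just assumes 7/8 of the rated 224 GB/s:

```python
def effective_bandwidth(b_fast, b_slow, slow_fraction):
    """Blended bandwidth (GB/s) when `slow_fraction` of the traffic
    is served by the slow segment: a traffic-weighted harmonic mean
    of the two pools (time per byte adds, bandwidth doesn't)."""
    time_per_byte = (1 - slow_fraction) / b_fast + slow_fraction / b_slow
    return 1 / time_per_byte

# Hypothetical: 7 fast controllers delivering 196 GB/s (7/8 of 224),
# the last 0.5GB at a guessed 28 GB/s, taking 1/8 of the traffic.
print(effective_bandwidth(196.0, 28.0, 1 / 8))
```

With these made-up numbers, an eighth of the traffic landing on a much slower tail roughly halves the blended figure — which is why the real measurement matters.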


----------



## clerick

It seems like the nvidia compression thing is breaking down above 3.5gb (the purple thing is their third generation delta compression color thing?)


----------



## ZealotKi11er

Quote:


> Originally Posted by *clerick*
> 
> 
> 
> 
> 
> It seems like the nvidia compression thing is breaking down above 3.5gb (the purple thing is their third generation delta compression color thing?)


He did something to that system. He is using 13.5GB of RAM lol.


----------



## mtcn77

You guys are being proficiently steered by Nvidia. There is no point in arguing about the memory buffer; that memory is there, either linked or unlinked. It is the memory bus interface that is being falsely advertised. They have admitted it firsthand.
Notice this has never occurred before; the 660 Ti, for instance, was advertised as what it truly was - 192-bit. There is no way that card functions as 256-bit.


----------



## Mad Pistol

Quote:


> Originally Posted by *mtcn77*
> 
> You guys are being proficiently steered by Nvidia. No point in arguing the memory buffer, that memory is there, either linked, or unlinked. It is the memory bus interface that is being falsely advertised. They have admitted it firsthand.
> Notice this has never occurred before, 660Ti for instance has been advertised as it truly were - 192 bit. No way that card is functioning as 256 bit.


The 660 Ti exhibited a similar scenario, where anything above 1.5GB utilization caused the card's performance to tank hard.


----------



## Seven7h

Quote:


> Originally Posted by *Vesku*
> 
> We won't know the throughput of that 0.5GB section until Nvidia updates CUDA to know about it or someone figures out another way to test it. But by Nvidia's own admission there is a performance penalty once that 0.5GB section comes into play, they are simply saying it is low single digit %.


Yes, a low single-digit % for the frame overall. The slowdown on that memory itself is likely larger than that, but since anything placed there is inherently less impactful on the frametime, it amounts to only a low % over the whole frame.

Think of it as a "50% of 5%" type of math (making up figures here to provide an example of how a big speed difference can result in a tiny overall difference).


----------



## Art Vanelay

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Nvidia has to come with better explanation.


To be fair, the majority of the people they have to explain to have probably never programmed in anything lower level than Java, so explaining advanced firmware programming to an audience like that is going to be pretty difficult, if you want a thorough explanation.


----------



## looniam

Quote:


> Originally Posted by *mtcn77*
> 
> You guys are being proficiently steered by Nvidia. No point in arguing the memory buffer, that memory is there, either linked, or unlinked. *It is the memory bus interface that is being falsely advertised. They have admitted it firsthand.*
> Notice this has never occurred before, 660Ti for instance has been advertised as it truly were - 192 bit. *No way that card is functioning as 256 bit.*


nvidia said they prioritized the vram into segments; let's keep it straight, eh?

as far as the card not running 256 bit, please provide valid proof.


----------



## mtcn77

Quote:


> Originally Posted by *Mad Pistol*
> 
> The 660 Ti exhibited a similar scenario, where anything above 1.5GB utilization caused the card's performance to tank hard.


Except, it wasn't falsely advertised as a 256 bit card, for instance.


----------



## Alatar

eh

1) Assuming that memory bandwidth is actually consistently lower for the last 0.5 gigs, people who own a 970 and care should probably consider something like a class-action lawsuit for false advertising. Seems like a reasonable response.

2) Considering that there are already plenty of FCAT and normal FPS measurements of the 970 (and of the 980 too), and they don't show anything out of the ordinary as far as performance goes, I would suggest people think about and examine how much of this VRAM marketing nonsense actually has a tangible effect in reality.

3) People linking random power consumption graphs (from 3rd-party OC cards during specific GPGPU loads) and out-of-context FCAT graphs (where 980 graphs from the same review show the same results) are going full tinfoil and using an opportunity to rile people up.


----------



## Forceman

Quote:


> Originally Posted by *mtcn77*
> 
> Except, it wasn't falsely advertised as a 256 bit card, for instance.


The connection between the memory and the die is 256-bit. Just because the die internally addresses some memory inefficiently doesn't mean it is falsely advertised.


----------



## Mad Pistol

Quote:


> Originally Posted by *mtcn77*
> 
> Except, it wasn't falsely advertised as a 256 bit card, for instance.


No, but the GTX 660 Ti wasn't very usable as a 2GB video card either. It didn't like anything above 1.5GB utilization.


----------



## nSone

I'm really confused by this statement.
Maybe it's just me, but does it state that both the 970 and 980 have 3.5 + 0.5 GB VRAM sections and, because of the different SM configurations, only the 980 manages to utilize that 0.5GB properly? Or is it that bandwidth drops on both cards?


----------



## mtcn77

Quote:


> Originally Posted by *looniam*
> 
> nvidia said they prioritized the vram into segments, lets keep it straight, eh?
> 
> as far as the card not running 256 bit, please provide valid proof.


Sure.
I don't know which is the bigger denial, responding like nothing fishy is present, or trying to fish out the official response.
Quote:


> However the 970 has a different configuration of SMs than the 980, and fewer crossbar resources to the memory system. To optimally manage memory traffic in this configuration, we segment graphics memory into a 3.5GB section and a 0.5GB section. The GPU has higher priority access to the 3.5GB section. When a game needs less than 3.5GB of video memory per draw command then it will only access the first partition, and 3rd party applications that measure memory usage will report 3.5GB of memory in use on GTX 970, but may report more for GTX 980 if there is more memory used by other commands. When a game requires more than 3.5GB of memory then we use both segments.


Let's hush this up before it spreads, quick!


----------



## raghu78

Quote:


> Originally Posted by *looniam*
> 
> you nor does anyone else have valid proof of those claims matey. as you have completely ignored:
> 
> and btw, i am doing nothing for nvidia. i am trying to provide the factual basis for which the owners of cards having issues can find a resolution.


dude, Nvidia has come out with a statement clearly explaining the memory partition. *What they have not revealed in their carefully worded PR statement is the actual memory access speed and bandwidth of the last 0.5 GB. Why do you think they did not reveal that information? It's classic PR and damage control.* That's why the press needs to get serious, investigate this with game testing at 1440p and 4K, and prove whether it actually affects performance in games which exceed 3.5 GB usage. There are quite a few games, which I already mentioned, that do that easily. For once FCAT could come back to haunt Nvidia. There are users who report stuttering when memory usage hits > 3.5 GB. btw, the owners of the cards are not required to find a resolution to this problem. It's Nvidia's damn job to address this problem, not deny it or run away from it with carefully worded PR.


----------



## looniam

Quote:


> Originally Posted by *mtcn77*
> 
> Sure.
> I don't know which is the bigger denial, responding like nothing fishy is present, or trying to fish out the official response.
> Quote:
> 
> 
> 
> However the 970 has a different configuration of SMs than the 980, and fewer crossbar resources to the memory system. To optimally manage memory traffic in this configuration, we segment graphics memory into a 3.5GB section and a 0.5GB section. The GPU has higher priority access to the 3.5GB section. When a game needs less than 3.5GB of video memory per draw command then it will only access the first partition, and 3rd party applications that measure memory usage will report 3.5GB of memory in use on GTX 970, but may report more for GTX 980 if there is more memory used by other commands. When a game requires more than 3.5GB of memory then we use both segments.
> 
> 
> 
> Let's hush this up before it spreads, quick!

yes, because there are fewer cuda cores (SMs) there would not be as many crossbar resources as in the gtx980.

what is there to hide?

and how is that proof?

. .funny you mentioned fishing. hope to catch something . .


----------



## Seven7h

Quote:


> Originally Posted by *nSone*
> 
> I'm really confused by this statement
> maybe it's just me but does it state that both 970 and 980 have 3.5 + 0.5 vram sections and because of different SMs configurations only the 980 manages to utilize that 0.5GB properly or is it that bandwidth drops on both cards?


980 works as GPUs traditionally have. No new behavior there, just exclude it from this discussion.

970 has a slower access speed to the last 512MB, but the Nai CUDA test was absolutely wrong in the numbers it showed... People freaked out about the results it showed and drew the incorrect conclusion from them. For memory above 3.5GB, the benchmark was *not* testing the separate 512MB memory section bandwidth... instead it was *actually* testing system memory bandwidth.

So no one has actually tested the real speed of this last 512MB yet, except via game workloads (by forcing memory consumption above 3.5GB but below 4GB), which show negligible performance differences.


----------



## mark_thaddeus

I thought this was already explained by Tech Report last October 1, 2014?

http://techreport.com/blog/27143/here-another-reason-the-geforce-gtx-970-is-slower-than-the-gtx-980

They already mentioned...
Quote:


> On paper, the GTX 970 ought to be nearly as fast on this front as the 980-and the Asus Strix card ought to be a smidgen faster. The 3DMark color fill test we use has evidently been limited by memory bandwidth at times in the past, but that shouldn't be an issue since all three cards in question have the exact same memory config.
> 
> For a while, I've thought I should drop you an email about some pixel fillrate numbers you use in the peak rates tables for GPUs. Actually, most people got those numbers wrong as Nvidia is not crystal clear about those kind of details unless you ask very specifically.
> 
> The pixel fillrate can be linked to the number of ROPs for some GPUs, but it's been limited elsewhere for years for many Nvidia GPUs. Basically there are 3 levels that might have a say at what the peak fillrate is :
> 
> The number of rasterizers
> The number of SMs
> The number of ROPs
> 
> On both Kepler and Maxwell each SM appears to use a 128-bit datapath to transfer pixel color data to the ROPs. Those appear to be converted from FP32 to the actual pixel format before being transferred to the ROPs. With classic INT8 rendering (32-bit per pixel) it means each SM has a throughput of 4 pixels/clock. With HDR FP16 (64-bit per pixel), each SM has a throughput of 2 pixels/clock.
> 
> On Kepler each rasterizer can output up to 8 pixels/clock. With Maxwell, the rate goes up to 16 pixels/clock (at least with the currently released Maxwell GPUs).
> 
> So the actual pixels/cycle peak rate when you look at all the limits (rasterizers/SMs/ROPs) would be :
> 
> GTX 750 : 16/16/16
> GTX 750 Ti : 16/20/16
> GTX 760 : 32/24/32 or 24/24/32 (as there are 2 die configuration options)
> GTX 770 : 32/32/32
> GTX 780 : 40/48/48 or 32/48/48 (as there are 2 die configuration options)
> GTX 780 Ti : 40/60/48
> GTX 970 : 64/52/64
> GTX 980 : 64/64/64
> 
> Extra ROPs are still useful to get better efficiency with MSAA and so. But they don't participate in the peak pixel fillrate.
> 
> *That's in part what explains the significant fillrate delta between the GTX 980 and the GTX 970* (as you measured it in 3DMark Vantage). *There is another reason which seem to be that unevenly configured GPCs are less efficient with huge triangles splitting* (as it's usually the case with fillrate tests).


Isn't that the reason why the 970 struggles with high VRAM usage versus the 980 or isn't as efficient when dealing with high memory usage? Why is this still an issue? Oh yeah... first world problems!
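The quoted rule — the narrowest of the three stages (rasterizers / SMs / ROPs) sets the peak INT8 fillrate — is easy to sanity-check with the per-clock limits from the quote. The clock below is the 970's reference boost; treat it as illustrative:

```python
# (rasterizer, SM, ROP) limits in pixels/clock, from the quote above
LIMITS = {
    "GTX 970": (64, 52, 64),
    "GTX 980": (64, 64, 64),
}

def peak_fillrate_gpix(card, clock_mhz):
    """Peak INT8 fillrate in Gpixels/s: the minimum of the three
    per-clock limits times the clock."""
    return min(LIMITS[card]) * clock_mhz / 1000.0

for card in LIMITS:
    print(card, peak_fillrate_gpix(card, 1178.0), "Gpix/s")
```

So on paper the 970's pixel output is bound by its 52 SM-side pixels/clock, not its 64 ROPs — consistent with the fillrate deficit Tech Report measured.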


----------



## looniam

Quote:


> Originally Posted by *raghu78*
> 
> dude Nvidia has come out with a statement clearly explaining the memory partition. *What they have not revealed in their carefully worded PR statement is the actual memory access speeds and bandwidth of the last 0.5 GB. Why do you think they did not reveal that information. its classic PR and damage control.* Thats why the press needs to get serious and investigate this with game testing at 1440p and 4k and prove if it actually affects game performance in games which exceed 3.5 GB usage. There are quite a few games which I already mentioned that do that easily. For once FCAT could come to haunt Nvidia. there are users who report the stuttering when memory usage hits > 3.5 GB. btw the owners of the cards are not required to find a resolution to this problem. Its Nvidia's damn job to address this problem and not deny it or run away from it with carefully worded PR.


so because nvidia didn't admit anything, _there must be something wrong_. careful, it may seem to some that you are in the tinfoil hat brigade.

and true, it is their problem, and it is up to nvidia to address it. however, no company can address an issue unless customers provide constructive feedback, ie their experiences/games/setups when the problem occurs. _just like when there is a driver issue._ so, it is not asking anything out of the ordinary . . dude.

e:
typos


----------



## Vesku

Quote:


> Originally Posted by *looniam*
> 
> so because nvidia didn't admit anything, _there must something wrong_. careful it may seem to some that you are in the tinfoil hat brigade.
> 
> and true it is their problem, and it is up to nvidia to address it. however any company cannot address unless customers provide constructive feedback, ie their experiences/games/set ups when the problem occurs. _just like when there is a driver issue._ so, it is not asking nothing out of the ordinary . .dude.


They certainly weren't embracing constructive feedback by not telling anyone about the memory configuration.


----------



## mtcn77

Quote:


> Originally Posted by *looniam*
> 
> yes because there are less cuda cores (SMs) there would not be as many crossbars as opposed to the gtx980.
> 
> what is there to hide?
> 
> and how is that proof?
> 
> . .funny you mentioned fishing. hope to catch something . .


So, there are 52 "active ROPs" and 13 SMMs, three of which are turned off, yet these cards show no variation in fill rate from their stated maximum?
According to what I found, shaders interface to L2, and via L2 to the memory controllers, through the crossbar. So, say you have fewer shaders and a cut-down crossbar. What happens to the part of the layout connected to the individual memory modules? Something's gotta give.


----------



## nSone

thx! Seven7h
I understand the part about the CUDA test, but I was wondering if the reported frame drops are due to the same memory allocation/section issues


----------



## Orangey

If the latest rumours about 380X are to be believed AMD have run into this problem too, albeit in a different way. The 4GB "limit" on HBM gen1 seems to have caused them to implement a two-level memory system that uses a 2nd pool of slower (possibly GDDR5) RAM for less crucial data. How they make it "smart" I cannot see but I fully expect Huddy to work this scandal into his rhetoric somehow.


----------



## raghu78

Quote:


> Originally Posted by *looniam*
> 
> so because nvidia didn't admit anything, _there must something wrong_. careful it may seem to some that you are in the tinfoil hat brigade.
> 
> and true it is their problem, and it is up to nvidia to address it. *however any company cannot address unless customers provide constructive feedback, ie their experiences/games/set ups when the problem occurs*. _just like when there is a driver issue._ so, it is not asking nothing out of the ordinary . .dude.


Except this is a design decision which Nvidia knew very well about; it is not a driver issue. Nvidia did not care to explain or reveal it to the press in their marketing material, or to the customer on the product box. Nvidia is giving the expected PR response to an actual withholding of factual information from the press and the customer.
Quote:


> Originally Posted by *Vesku*
> 
> They certainly weren't embracing constructive feedback by not telling anyone about the memory configuration.


spot on.


----------



## Seven7h

Quote:


> Originally Posted by *Vesku*
> 
> They certainly weren't embracing constructive feedback by not telling anyone about the memory configuration.


Yes, they should also ask users how the GPU architecture and plumbing should be laid out. Because consumers are obviously experts in the goods they're consuming, and the trade off implications in cost and yields are something they should be providing feedback on.

Also, people on forums are absolutely the entire demographic for any product, and purchasing customers don't care about real world performance... Just numbers printed on a box.

More seriously, customers are selfish in that they want to get everything for nothing. Stockholders are selfish in that they want to give the customer nothing and take everything from them.

A company has the responsibility of balancing those demands on both sides, and does so while keeping an eye on its competition and their offerings.


----------



## BulletSponge

I'm sure if enough people complain they'll offer something along the lines of $100 in F2P in-game currency.


----------



## EarthSpiritD2

I ordered a Gigabyte G1 but it has not been received yet and I will want a refund. Will Newegg accept my return and give me a refund?


----------



## Seven7h

Quote:


> Originally Posted by *nSone*
> 
> thx! Seven7h
> I understand the part about the CUDA test, but was wondering if the stated frame drops are due to same memory allocation/section issues


There's no logical reason they should be. The slower memory will be slow every frame, and therefore will not cause stutter or drops or other momentary effects... It would instead just slow every frame down equally by some small percentage (and only when resources sit in that final 512MB anyway, because you have consumed the prior 3.5GB).
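To illustrate the distinction with purely synthetic frame times: the same total cost spread evenly over every frame lowers the average frame rate but keeps pacing even, while the same cost bunched into occasional frames is what shows up as stutter.

```python
from statistics import mean, pstdev

base = [16.7] * 8  # ms per frame, a steady 60 fps (synthetic)

# Uniform penalty: every frame 3% slower -- a slightly lower
# average frame rate, but perfectly even pacing, i.e. no stutter.
uniform = [t * 1.03 for t in base]

# Same total penalty concentrated into one frame -- identical
# average, but a visible spike: this is what stutter looks like.
spiked = base[:]
spiked[4] += sum(t * 0.03 for t in base)

print(mean(uniform), pstdev(uniform))  # even pacing, zero spread
print(mean(spiked), pstdev(spiked))    # same mean, big spread
```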


----------



## Vesku

Quote:


> Originally Posted by *Seven7h*
> 
> Yes, they should also ask users how the GPU architecture and plumbing should be laid out. Because consumers are obviously experts in the goods they're consuming, and the trade off implications in cost and yields are something they should be providing feedback on.
> 
> Also, people on forums are absolutely the entire demographic for any product, and purchasing customers don't care about real world performance... Just numbers printed on a box.
> 
> More seriously, customers are selfish in that they want to get everything for nothing. Stockholders are selfish in that they want to give the customer nothing and take everything from them.
> 
> A company has the responsibility of balancing those demands on both sides, and does so while keeping an eye on its competition and their offerings.


Yes, they should reveal when a product has its own specific quirk. That way reviewers and enthusiast customers can do their own testing and if they find any issues let Nvidia know. Nvidia could just slap a name on a card and reveal no details at all to reviewers and customers, they'd probably continue to sell well. That doesn't mean we should just shrug and say "no big deal".

This doesn't mean the GTX 970 is suddenly a bad card, but it's not quite the card everyone thought it was.


----------



## Clockster

Quote:


> Originally Posted by *raghu78*
> 
> Except this is a design decision which Nvidia knows very well and not a driver issue. Nvidia did not care to explain or reveal to the press in their marketing material or on the product box to the customer. Nvidia is giving the expected PR response to an actual witholding of factual information from the press and customer.
> spot on.


You're wasting your time bud, the Nvidia defenders are here in force and you know how they are... just as bad as the AMD defenders.
Wish people would wake up one day and realise these big companies like Nvidia, AMD and Intel don't give a crap about them.

There is no way Nvidia didn't know about this when the cards launched.


----------



## Xoriam

Quote:


> Originally Posted by *BulletSponge*
> 
> I'm sure if enough people complain they'll offer something along the lines of $100 FTP in game currency.


I'd prefer a $100 cash-back rebate or something, rather than $100 in games.


----------



## Seven7h

Quote:


> Originally Posted by *Vesku*
> 
> Yes, they should reveal when a product has its own specific quirk. That way reviewers and enthusiast customers can do their own testing and if they find any issues let Nvidia know. Nvidia could just slap a name on a card and reveal no details at all to reviewers and customers, they'd probably continue to sell well. That doesn't mean we should just shrug and say "no big deal".
> 
> This doesn't mean the GTX 970 is suddenly a bad card, but it's not quite the card everyone thought it was.


I'd argue it's exactly the card everyone thought it was. The performance documented in every review doesn't change, and plenty of reviewers should've and likely did test performance at 4K.

If there were issues they should've been brought up to NVIDIA and documented in the review, regardless of whether or not they had prior knowledge of a different behavior. No one mentioned anything of the sort in their reviews.

Even if you couldn't use the last 512MB (which you can), I bet most people complaining would've bought the same card at the same price when looking at the performance seen in reviews, even if it actually did only have 3.5GB.

We are witnessing an emotional response rather than a logical, fact-curious response.


----------



## mark_thaddeus

Quote:


> Originally Posted by *Xoriam*
> 
> I'd perfer a 100$ cash back rebate or something, more than 100$ in games.


Sorry, why would NV give $100 back?

1. It's a business
2. It was cheaper at launch (before the 780 Ti dropped in price) than the 780 Ti and performs on par (or close to it) with more VRAM
3. It still does well for all but the last 1% of users (us at OCN and the like); the other 99% will never OC their card and will just use it with their 1080p monitors as advertised


----------



## Xoriam

Quote:


> Originally Posted by *mark_thaddeus*
> 
> Sorry why would NV give back $100 dollars back?
> 
> 1. It's a business
> 2. It was cheaper when they sold initially (before 780 Ti dropped down in prices) it versus the 780 Ti and performs at par (or close to it) to it with more VRAM
> 3. It still does well for all but the last 1% of users (us at OCN and the like) who willl never OC their card and just use it in their 1080p monitors as advertised


um.. he just said he thinks they might offer $100 in games; I said I would want $100 in real money back instead.

I never said I expected it. I'm not running into any issues even when I surpass 3.5GB in games.


----------



## looniam

Quote:


> Originally Posted by *mtcn77*
> 
> So, there are 52 "active rops" 13 SMM's three of which is turned off, yet these cards have no variation in fill rate from their stated maximum?
> According to what I found, shaders interface to L2 and via L2 to memory controllers is through the crossbar interface. So, say you have less of the shaders & a cut down crossbar. What happens to the layout that overrides a portion of that connected to individual memory modules? Something's gotta give.


no there are 64 ROPs in both the 980 and 970 . .
Quote:


> Originally Posted by *raghu78*
> 
> Except this is a design decision which Nvidia knows very well and not a driver issue. Nvidia did not care to explain or reveal to the press in their marketing material or on the product box to the customer. Nvidia is giving the expected PR response to an actual witholding of factual information from the press and customer.
> Quote:
> 
> 
> 
> Originally Posted by *Vesku*
> 
> They certainly weren't embracing constructive feedback by not telling anyone about the memory configuration.
> 
> 
> 
> spot on.

again, nvidia didn't care to explain the mixed memory densities in the 550ti nor was it listed on the box.

so where is it decided what information ought to be disclosed? hmmm?

you are still just making accusations of nvidia hiding facts/information when you haven't shown that anyone knew there would be a problem. how many review sites had these very cards in their hands and found no such issue?

none out of the dozen, or more, that did. so please, you have nothing to back your claims up but your own assumptions.


----------



## mark_thaddeus

Oops! Refer to post 314


----------



## Vesku

Quote:


> Originally Posted by *Seven7h*
> 
> I'd argue it's exactly the card everyone thought it was. The performance documented in every review doesn't change, and plenty of reviewers should've and likely did test at 4k performance.
> 
> If there were issues they should've been brought regardless of whether or not they had prior knowledge of a different behavior. No one mentioned anything of the sort in their reviews.
> 
> Even if you couldn't use the last 512MB (which you can), I bet most people complaining would've bought the same card at the same price when looking at the reviews if it actually did only have 3.5GB.


Some reviewers would be a bit more rigorous with the high VRAM testing if they had known about this. Anandtech and PCPer have already stated they plan to do more analysis of that type of game load.
Quote:


> Originally Posted by *looniam*
> 
> no there are 64 ROPs in both the 980 and 970 . .
> again, nvidia didn't care to explain the mixed memory densities in the 550ti nor was it listed on the box.


Actually they did reveal mixed memory to reviewers:

GTX 550 Ti's Quirk: 1GB Of VRAM On A 192-bit Bus

http://www.anandtech.com/show/4221/nvidias-gtx-550-ti-coming-up-short-at-150/2


----------



## mark_thaddeus

Quote:


> Originally Posted by *Xoriam*
> 
> um.. he just said he thinks they might consider $100 in games, I said I would want $100 in real money back instead.
> 
> I never said I expected it. I'm not running into any issues even when I surpass 3.5GB in games.


Cool!


----------



## Xoriam

However I do hope this is the last batch of 4GB cards.


----------



## mark_thaddeus

Quote:


> Originally Posted by *mtcn77*
> 
> So, there are 52 "active ROPs", 13 SMMs with three turned off, yet these cards have no variation in fill rate from their stated maximum?
> According to what I found, shaders interface to L2, and via L2 to the memory controllers, through the crossbar interface. So, say you have fewer shaders and a cut-down crossbar. What happens to the layout that overrides the portion connected to individual memory modules? Something's gotta give.


Rasterizer / SMs / ROPs

GTX 970 : 64/52/64
GTX 980 : 64/64/64


----------



## looniam

Quote:


> Originally Posted by *Vesku*
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> no there are 64 ROPs in both the 980 and 970 . .
> again, nvidia didn't care to *explain the mixed memory densities in the 550ti nor was it listed on the box.*
> 
> 
> 
> Actually they did reveal mixed memory to reviewers:
> 
> GTX 550 Ti's Quirk: 1GB Of VRAM On A 192-bit Bus
> 
> http://www.anandtech.com/show/4221/nvidias-gtx-550-ti-coming-up-short-at-150/2
Click to expand...

it was there but not explained.
Quote:


> *Our base assumption* is that NVIDIA is using a memory interleaving mode similar to "flex" modes on desktop computers,


----------



## mark_thaddeus

Did they not market this card as better than the last gen, and doesn't performance show that (970 vs 780 / 980 vs 780 Ti, in performance per watt)? So what's misleading about that?









It's basically like what Intel did with their 4790k vs 4770k right? So what's misleading about that?


----------



## mtcn77

Quote:


> Originally Posted by *looniam*
> 
> no there are 64 ROPs in both the 980 and 970 . .


If both cards are equal, how do you explain this, then? These are strictly rop benchmarks.


----------



## mark_thaddeus

Quote:


> Originally Posted by *mtcn77*
> 
> If both cards are equal, how do you explain this, then? These are strictly rop benchmarks.


It's not just about ROPs, it's a combination of the rasterizer, the SM count and the ROPs:

Rasterizer / SMs / ROPs

GTX 970 : 64/52/64
GTX 980 : 64/64/64

The way bandwidth is allocated is different from what we've known before. Check out the Tech Report article I quoted on page 15.


----------



## raghu78

Quote:


> Originally Posted by *Clockster*
> 
> You're wasting your time bud, the Nvidia defenders are here in force and you know how they are..just as bad as the AMD defenders. Wish people would wake up one day and realise these big companies like Nvida, AMD, Intel don't give a crap about them. There is no way Nvidia didn't know about this when the cards launched.


I cannot believe people are defending Nvidia on this. AMD got hammered on the reference Hawaii cooler, and the result was that AMD delivered a much better solution with the R9 295X2. Similarly, the public and press should hold Nvidia accountable. Either change the marketing material and the info on the product box, or recall the product. The easier option is the first, and it's not going to cost a lot. Clearly mention the actual bandwidth for the last 0.5 GB on the box. What we also need is the press informing users with extensive testing, so as to enable them to make an informed decision. I hope pcper and other sites like techreport show the same diligence that they showed on the HD 7900 series frametime issues.


----------



## raghu78

Quote:


> Originally Posted by *looniam*
> 
> *again, nvidia didn't care to explain the mixed memory densities in the 550ti nor was it listed on the box.* however, where is it decided what information ought to be disclosed? hhmmm?


pathetic lies. The press was well informed and they did a good job explaining to the customer. But they still wanted more details.

http://www.anandtech.com/show/4221/nvidias-gtx-550-ti-coming-up-short-at-150/2

" *Finally, we certainly haven't forgotten about NVIDIA's interesting memory arrangement with the GTX 550 Ti. It's a shame that they won't tell us more about how they're interleaving memory accesses on this unique design, but hopefully they'll open up in the future. It's something we're definitely going to revisit once the CUDA memory bug is dealt with, and hopefully at that time we'll be able to learn more about how NVIDIA is accomplishing this. If this is the start of a long term change to memory layout by NVIDIA, then getting to better understand how they're interleaving memory accesses here will be all the more important to understanding future products*."

I expect this will stop you from comparing the GTX 970 case to the GTX 550 Ti. Nvidia revealed the mixed memory design, and the press did a good job of communicating it to the market.


----------



## Vesku

Quote:


> Originally Posted by *looniam*
> 
> it was there but not explained.


Nvidia revealed they would be interleaving the memory they just didn't reveal the details of how it would be split and managed. With the GTX 970 they revealed nothing about the odd memory configuration.

Another site that reported on the odd 550 Ti memory configuration, saying Nvidia told them about it but wouldn't reveal the details of operation. Which is what they've belatedly done with this 970 memory issue response.

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/41590-nvidia-geforce-gtx-550-ti-1gb-review-2.html
Quote:


> NVIDIA now has a way to allow for mixed memory allotments on a per-channel basis. Since the technology is proprietary and will presumably a closely guarded secret, they declined to discuss the specifics with us. What we do know is that two of the GTX 550 Ti's memory controllers are populated with 256MB of memory (in two 128MB modules) while the other is paired up with 512MB of GDDR5. Presumably, there is some sort of load balancing going on behind the scenes which is facilitated by a slightly revised driver stack but we're sure that some core changes were implemented as well.
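That mixed-density arrangement is easy to sanity-check numerically. The per-controller capacities below come straight from the quote; the symmetric/leftover breakdown is my own inference (Nvidia kept the interleaving scheme proprietary), sketched because it mirrors the 970's 3.5 + 0.5 situation:

```python
# GTX 550 Ti memory layout as described in the quote above: three 64-bit
# controllers, two populated with 256 MB (2 x 128 MB) and one with 512 MB.
controllers_mb = [256, 256, 512]

total_mb = sum(controllers_mb)  # the advertised capacity

# Inference, not an Nvidia-confirmed detail: the region that can be striped
# across all three controllers at full 192-bit width is limited by the
# smallest controller; whatever remains sits behind a single controller.
symmetric_mb = min(controllers_mb) * len(controllers_mb)
leftover_mb = total_mb - symmetric_mb

print(total_mb, symmetric_mb, leftover_mb)
```

If that inference holds, 768 MB runs at full bus width and the last 256 MB at a third of it, which is exactly why reviewers at the time wanted details on the load balancing.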


----------



## mark_thaddeus

Quote:


> Originally Posted by *raghu78*
> 
> I cannot believe people are defending Nvidia on this. AMD got hammered on the reference Hawaii cooler, and the result was that AMD delivered a much better solution with the R9 295X2. Similarly, the public and press should hold Nvidia accountable. Either change the marketing material and the info on the product box, or recall the product. The easier option is the first, and it's not going to cost a lot. Clearly mention the actual bandwidth for the last 0.5 GB on the box. What we also need is the press informing users with extensive testing, so as to enable them to make an informed decision. I hope pcper and other sites like techreport show the same diligence that they showed on the HD 7900 series frametime issues.


So based on your logic, Nvidia will make changes on the next cards coming out, since they didn't change anything on the 290X / 290 coolers, right? As for the bandwidth, you can still use all 4GB, it's just set up differently than what we're used to. It's not as efficient as we would want it to be, but we can still use it. It was a design decision because of the reduced SM count (fewer than the 980), which was pointed out by Tech Report since October 1 of last year. It still performs according to what they promised, which is better than the 780 and the 670 (its direct sibling). As for the whole cooler fiasco, the reason AMD had to make changes is because it affected all their cards, even the ones that didn't OC (when I say OC, I mean like us who push it to the limits) but used uber mode (meaning not just the 1% like us, but also the remaining 99%). Right now the people who complain about the 970 are the 1% and not the remaining 99%, big difference there!

I'm not trying to get in an argument here, just pointing out facts.









EDIT: And Nvidia did mention that they would use interleaved memory; why would they need to explain how they did it in marketing material, when it would only interest the 1%? It was marketed as being better than last gen, and it is!

Lastly, I'm all for the reports coming out because this would mean the next GPUs will have this sorted out just like the 290x to the 295x2!


----------



## looniam

Quote:


> Originally Posted by *mtcn77*
> 
> If both cards are equal, how do you explain this, then? These are strictly rop benchmarks.
> 
> 
> Spoiler: Warning: Spoiler!


let me reiterate:
Quote:


> Originally Posted by *looniam*
> 
> nvidia said they prioritized the vram into segments, lets keep it straight, eh?
> 
> *as far as the card not running 256 bit, please provide valid proof.*


i'll explain; trying to throw things against the wall and hoping something sticks is not providing proof.

_nor is it doing a favor to those that are having an issue_.








Quote:


> Originally Posted by *raghu78*
> 
> pathetic lies. The press was well informed and they did a good job explaining to the customer.
> 
> http://www.anandtech.com/show/4221/nvidias-gtx-550-ti-coming-up-short-at-150/2


lies? well first of all it was a SINGULAR statement, not plural. second of all, there you go again making an accusation. as i said before, but you obviously, because of your own agenda, overlooked:
Quote:


> *Our base assumption* is that NVIDIA is using a memory interleaving mode similar to "flex" modes on desktop computers


seriously dude, we are done. and cheers to you for stooping low and starting to be insulting to end this discussion; it shows a lot about your character.


----------



## Exilon

Quote:


> Originally Posted by *Clockster*
> 
> You're wasting your time bud, the Nvidia defenders are here in force and you know how they are..just as bad as the AMD defenders.
> Wish people would wake up one day and realise these big companies like Nvida, AMD, Intel don't give a crap about them.
> 
> There is no way Nvidia didn't know about this when the cards launched.


Of course they knew about the consequences of disabling the 3 SMMs. They segmented the VRAM to mitigate it so by definition they must have known about it.

Now there are a lot of misunderstandings, both willful and not, going around. In the end it's a bunch of enthusiasts who know nothing about their toys wildly speculating. Let me lay down some facts:

No, the last 500 MB isn't 20 GB/s. That's PCIe bandwidth. Nai says the last 500 MB is being thrashed for some reason and the GPU is always swapping from system RAM in that segment for his CUDA benchmark. We don't know how much effective bandwidth there is in the last 500 MB at all.

No, the GTX 970 is not a 20x-bit card. The GTX 980 gets a "180 GB/s" score because it's a floating point addition benchmark and the $550 card has 23% more shaders. (180 GB / 16 bytes per float4 vector: 180 * 1024^3 / 16 bytes per float4 * 4 FLOP/float4 ≈ 48.3 GFLOPS.) Yes, the benchmark is shader bottlenecked until it starts swapping from system RAM.

The other remaining question is (and the most important):

How badly does this affect 4K frame times when the upper .5 GB needs to be used? We have plenty of 4K SLI benchmarks showing perfectly acceptable 99-percentile frametimes, but no indication of how much VRAM was needed to draw those frames.
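For anyone who wants to check that arithmetic, here is the same back-of-the-envelope calculation in a few lines (assuming, as described above, that the benchmark reports GiB/s and that each 16-byte float4 element costs 4 FLOPs to increment):

```python
# Convert the CUDA benchmark's reported "bandwidth" into effective GFLOPS,
# under the assumptions stated in the post: 16 bytes per float4 element,
# 4 FLOPs to increment each element.

def reported_gflops(bandwidth_gib_s: float,
                    bytes_per_vec: int = 16,
                    flops_per_vec: int = 4) -> float:
    """Effective GFLOPS implied by a reported GiB/s score."""
    vecs_per_s = bandwidth_gib_s * 1024**3 / bytes_per_vec
    return vecs_per_s * flops_per_vec / 1e9

# The GTX 980's ~180 GiB/s score corresponds to roughly 48 GFLOPS of
# float4 additions, which is the figure the arithmetic above refers to.
print(round(reported_gflops(180.0), 1))
```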


----------



## error-id10t

Quote:


> Originally Posted by *Seven7h*
> 
> People freaked out about the results it showed and drew the incorrect conclusion from them. For memory above 3.5GB, the benchmark was *not* testing the separate 512MB memory section bandwidth... *instead it was actually testing system memory bandwidth*.
> 
> *So no one has actually tested the real speed of this last 512MB yet*, except via game workloads (by forcing memory consumption above 3.5GB but below 4GB), which show negligible performance differences.


On the first bolded part.. curious, why did people get the same speed regardless of their system? If this was system memory then I certainly have faster RAM than someone running 1600MHz (example only).

On the second bolded part.. I bet Nvidia has tested it, and if there was nothing there, they'd have shown it and provided a tool/benchmark to prove it.


----------



## raghu78

Quote:


> Originally Posted by *looniam*
> 
> lies? well first of all it was a SINGULAR statement not plural. second of all - there you go again making an accusation. as i said before but you obviously, because of your own agenda, over looked
> seriously dude, we are done and cheers to you for stooping low and start to be insulting to end this discussion - shows a lot of your character.


This is what you said
Quote:


> Originally Posted by *looniam*
> 
> *again, nvidia didn't care to explain the mixed memory densities in the 550ti nor was it listed on the box.* however, where is it decided what information ought to be disclosed? hhmmm?


So yeah, you lied about Nvidia not explaining the mixed memory densities and the memory interleaving design on the GTX 550 Ti. btw, when you are caught red-handed, at least have the grace to accept it.


----------



## Exilon

Quote:


> Originally Posted by *error-id10t*
> 
> On the first bolded part.. curious, why did people get the same speed regardless of their system? If this was system memory then I certainly have faster RAM than someone running 1600MHz (example only).
> 
> On the second bolded part.. I bet Nvidia has tested it, and if there was nothing there, they'd have shown it and provided a tool/benchmark to prove it.


Actually, it's not system RAM bandwidth. *It's not even bandwidth at all.*

The benchmark is calculating how much time it takes to increment 128 MB worth of SP floats from 0 to 10. According to Nai, when it hits the 500 MB portion, it's constantly thrashing (i.e. page fault, load from system RAM, do the calculation, try to grab it from VRAM, page fault, load from system RAM again). The bottleneck here should be PCIe link speed.

Maybe if you limit the link speed, you'll be able to change the speed on the last portion.
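For context on why ~20 GB/s points at the PCIe link, the theoretical one-way bandwidth falls out of the standard per-lane signaling rates and line encodings (real-world throughput is lower still due to protocol overhead):

```python
# Theoretical one-direction PCIe bandwidth: per-lane transfer rate times
# lane count, scaled by line-encoding efficiency, converted to bytes.
GT_PER_LANE = {"1.1": 2.5, "2.0": 5.0, "3.0": 8.0}        # gigatransfers/s
ENCODING = {"1.1": 8 / 10, "2.0": 8 / 10, "3.0": 128 / 130}

def pcie_gb_per_s(gen: str, lanes: int = 16) -> float:
    """Peak one-way bandwidth in GB/s for a PCIe link."""
    return GT_PER_LANE[gen] * lanes * ENCODING[gen] / 8   # bits -> bytes

# PCIe 3.0 x16 peaks just under 16 GB/s each way, and 2.0 x16 at 8 GB/s,
# so a benchmark constantly faulting over the bus lands in this ballpark.
print(round(pcie_gb_per_s("3.0"), 2), pcie_gb_per_s("2.0"))
```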


----------



## mtcn77

Quote:


> Originally Posted by *looniam*


Your whole plan of engagement is nonsensical derailment and demagogy, is that not correct?
Once again, that "rop benchmark" which you inanely paraphrased without rebuttal was from Hardware.fr.
Quote:


> Originally Posted by *looniam*
> 
> _nor is it doing a favor to those that are having an issue_.


I'm not a company representative. I'm a user, and not an Nvidia user with Stockholm's Vindication.


----------



## mark_thaddeus

Quote:


> Originally Posted by *Exilon*
> 
> Of course they knew about the consequences of disabling the 3 SMMs. They segmented the VRAM to mitigate it so by definition they must have known about it.
> 
> *Now there are a lot of misunderstandings, both willful and not, going around. In the end it's a bunch of enthusiasts who know nothing about their toys wildly speculating.* Let me lay down some facts:
> 
> No, the last 500 MB isn't 20 GB/s. That's PCIe bandwidth. Nai says the last 500 MB is being thrashed for some reason and the GPU is always swapping from system RAM in that segment for his CUDA benchmark. We don't know how much effective bandwidth there is in the last 500 MB at all.
> 
> No, the GTX 970 is not a 20x-bit card. The GTX 980 gets a "180 GB/s" score because it's a floating point addition benchmark and the $550 card has 23% more shaders. (180 GB / 16 bytes per float4 vector: 180 * 1024^3 / 16 bytes per float4 * 4 FLOP/float4 ≈ 48.3 GFLOPS.) Yes, the benchmark is shader bottlenecked until it starts swapping from system RAM.
> 
> The other remaining question is (and the most important):
> 
> How badly does this affect 4K frame times when the upper .5 GB needs to be used? We have plenty of 4K SLI benchmarks showing perfectly acceptable 99-percentile frametimes, but no indication of how much VRAM was needed to draw those frames.


^ This and everything else!


----------



## looniam

Quote:


> Originally Posted by *raghu78*
> 
> This is what you said
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> *again, nvidia didn't care to explain the mixed memory densities in the 550ti nor was it listed on the box.* however, where is it decided what information ought to be disclosed? hhmmm?
> 
> 
> 
> So yeah you lied about Nvidia not explaining the mixed memory densities and the memory interleaving design on GTX 550 Ti. btw when you are caught redhanded atleast have the grace to accept it
Click to expand...

i know exactly what i said _but as you further edited your post to include_:
Quote:


> Originally Posted by *raghu78*
> 
> " *It's a shame that they won't tell us more about how they're interleaving memory accesses on this unique design, but hopefully they'll open up in the future.* ."


it is not i who has been deceitful. so do not demand of me, nor request anything about having grace, when you lack that virtue yourself.

though do continue the personal attacks; you're showing your true colors here.


----------



## looniam

Quote:


> Originally Posted by *mtcn77*
> 
> Your whole plan of engagement is nonsensical derailment and demagogy, is that not correct?
> Once again, that "rop benchmark" which you inanely paraphrased without rebuttal was from Hardware.fr.
> I'm not a company representative. I'm a user, and not an Nvidia user with Stockholm's Vindication.


i have no plan of engagement; i'm looking to find the truth, which ill-informed speculation and accusations will fail to find.

unfortunately there is quite a bit of that, and it has covered up a number of informative posts directly related to the matter at hand.

e:
grammar


----------



## Seven7h

Quote:


> Originally Posted by *error-id10t*
> 
> On the first bolded part.. curious, why did people get the same speed regardless of their system. If this was system memory then I certainly have faster speed RAM than someone running 1600MHz (example only).
> 
> On the second bolded part.. I bet Nvidia have and if there was nothing there, they'd have shown it and provided a tool/benchmark to prove it.


More specifically, the bottleneck in the benchmark when accessing system memory from a GPU is PCIE. 20GB/s is in line with PCIE speeds, just as Exilon said, and as the author of the benchmark confirmed. It's a safe assumption that whatever bandwidth the 512MB section of video memory has, it is higher than this, otherwise there would be little reason to use it instead of swapping from system memory.

This is all a brand new controversy. Whether NVIDIA chooses to give out the bandwidth or a tool to test it for the 512MB section of video memory in the future is up to them. A more cleverly written graphics test could probably get more accurate data though (unlike the CUDA test).

Key facts though:

- You have 4GB and get to use it all.
- 512MB is a different segment, and will be slower to access, though it's not understood how slow.
- The driver is very careful about how it uses that 512MB such that the performance impact is negligible.
- This whole situation appears to not affect real world game performance much at all.


----------



## Noufel

It's mass hysteria for 970 owners. Friends of mine are returning their 970s, and I think many people will do that even if the Nai bench was incorrect and the last 500 MB isn't so slow.
The vast majority of people don't understand this, but they want the 4GB 970 they paid for, not some tricky 3.5 + 0.5 thing.


----------



## raghu78

Quote:


> Originally Posted by *looniam*
> 
> *i know exactly what i said* _but as you further edited your post to include_:
> it is not i who has been deceitful. so do not demand of me, nor request anything about having grace, when you lack that virtue yourself.


do you? I don't think so. So, again, to remind you of what you said:
Quote:


> Originally Posted by *looniam*
> 
> *again, nvidia didn't care to explain the mixed memory densities in the 550ti nor was it listed on the box.* however, where is it decided what information ought to be disclosed? hhmmm?


Nvidia did explain the mixed memory densities. What they did not reveal is the exact details of the memory interleaving design which they said is proprietary (according to hwc quote mentioned above).
Quote:


> though do continue the personal attacks, you're showing your true colors here.


no personal attacks. just pointing out facts and refreshing your memory.


----------



## Vesku

Quote:


> Originally Posted by *Noufel*
> 
> It's mass hysteria for 970 owners. Friends of mine are returning their 970s, and I think many people will do that even if the Nai bench was incorrect and the last 500 MB isn't so slow.
> The vast majority of people don't understand this, but they want the 4GB 970 they paid for, not some tricky 3.5 + 0.5 thing.


It's a shame Nvidia didn't just give this same brief description at GTX 970 launch. It's a great card for the money. I don't think it would scare off most potential buyers and the few that really didn't like it would be at least as likely to get a 980 as a 290 or 290X. Keeping quiet about it just creates a cloud of "what was so bad about it that they needed to hide it?"


----------



## Seven7h

Quote:


> Originally Posted by *Noufel*
> 
> It's mass hysteria for 970 owners. Friends of mine are returning their 970s, and I think many people will do that even if the Nai bench was incorrect and the last 500 MB isn't so slow.
> The vast majority of people don't understand this, but they want the 4GB 970 they paid for, not some tricky 3.5 + 0.5 thing.


The vast majority of people aren't enthusiasts on forums. They are people like my friends who bought them and experience the performance they saw in reviews.

It's fine to feel like you got a somewhat different product than you thought. But it's clearly a product that is better than a 3.5GB GPU and most still would've bought it if it had been marketed as 3.5GB. Same product in each case, but two different emotional reactions.

It likely would've cost more to make it as people expected, so the compromise was pretty likely baked into the price. So it's not as if anyone got screwed financially.


----------



## XXnomadXX

I paid for my GTX 970 with 4GB, and I expect that 4GB of memory to be available, not 3.5GB.


----------



## Xoriam

Quote:


> Originally Posted by *XXnomadXX*
> 
> I paid for my GTX 970 with 4GB, and I expect that 4GB of memory to be available, not 3.5GB.


It has 4GB available. 500MB is just slower, and it isn't clear exactly how much slower.


----------



## XXnomadXX

I would not be surprised if Nvidia and other brands release a 3.5GB GTX 970 in the future, then tell everyone that the 512MB is OS memory usage.


----------



## Seven7h

Quote:


> Originally Posted by *Vesku*
> 
> It's a shame Nvidia didn't just give this same brief description at GTX 970 launch. It's a great card for the money. I think most potential buyers could probably live with it and the few that could not would be at least as likely to get a 980 as a 290 or 290X.


I agree... It would've been better if it had been documented. The assumption was likely that performance and benchmarks and price are all that really matter here. And those are all still valid. But for enthusiasts who like to know their GPU inside and out, information and disclosure can be comforting.


----------



## 2010rig

Quote:


> Originally Posted by *Noufel*
> 
> It's mass hysteria for 970 owners. Friends of mine are returning their 970s, and I think many people will do that even if the Nai bench was incorrect and the last 500 MB isn't so slow.
> The vast majority of people don't understand this, but they want the 4GB 970 they paid for, not some tricky 3.5 + 0.5 thing.


Make sure they downgrade to a 290 or 290X since those have 4GB of RAM.
Quote:


> Originally Posted by *XXnomadXX*
> 
> I would not be surprised if Nvidia and other brands release a 3.5GB GTX 970 in the future, then tell everyone that the 512MB is *OS memory usage*


WHAT - that doesn't even make sense.

I have barely posted on this topic, I don't like to make assumptions when I don't know the whole story, and I haven't had the time to look into it. So surprise surprise, this is only my 3rd post on the topic.

I tell it how it is when it comes to any company. AMD hasn't released anything worthy of talking about one way or the other in quite a while.

No new CPU's or GPU's in how long now?

The benchmark that caused this mass hysteria turned out to be buggy, so how about we wait and see what the true bandwidth numbers are for that final .5 GB, because it certainly seems that it's not 20 GB/s like so many have wildly speculated. If NVIDIA isn't delivering as promised, I'm not just going to blindly defend them, and will speak out for them needing to fix things.


----------



## spacin9

Quote:


> Originally Posted by *Noufel*
> 
> It's mass hysteria for 970 owners. Friends of mine are returning their 970s, and I think many people will do that even if the Nai bench was incorrect and the last 500 MB isn't so slow.
> The vast majority of people don't understand this, but they want the 4GB 970 they paid for, not some tricky 3.5 + 0.5 thing.


I'd return mine if I could, too. I wouldn't keep them unless I had to or got a super-awesome deal on one. There's no way to sell these puppies now on eBay without taking a huge hit, for me anyway.

I got Zotac Omegas... wonder if I can use this situation to leverage them giving me the super-secret, non-existent, as advertised BIOS to unlock extra voltage and power features.







Nope... not gonna happen.


----------



## looniam

Quote:


> Originally Posted by *raghu78*
> 
> Nvidia did explain the mixed memory densities. What they did not reveal is the exact details of the memory interleaving design which they said is proprietary (according to hwc quote mentioned above).


since i had a minute to google "550ti mixed memory" along with:
TechPowerUp Review Database









Introducing the GeForce GTX 550 Ti
Quote:


> *A memory mystery*
> 
> Armed with half-a-dozen 32-bit memory controllers, NVIDIA was faced with something of a conundrum; does it equip the GTX 550 Ti with 768MB (6 x 128MB) or 1,536MB (6 x 256MB chips) of memory. The answer is neither. In an effort to match the GTS 450's 1GB frame buffer, the manufacturer has done something a little weird - it's used mixed-density memory chips; 4 x 128MB and 2 x 256MB.
> 
> It's an approach to GPU memory we've not seen before, and *NVIDIA must be confident* that the mixed modules won't have a knock-on effect on latency. *On paper*, it has the required effect - GTX 550 Ti gets the desired 1,024MB of GDDR5 memory and sidesteps having to go into expensive-to-produce 1,536MB territory. Everyone's a winner, *it seems*.


sure, those are just the statements a reviewer would make if nvidia had provided more than a spec sheet.









and btw, that is the most information given by any reviewer of the 550ti, other than anand's *assumptions*. so unless you have a pic of a box with a detailed description, or a review that has a statement from nvidia, you have not shown me to be a liar, let alone wrong.

seems claims w/o proof are your specialty today, like most.

e:
formating


----------



## Kinaesthetic

Quote:


> Originally Posted by *raghu78*
> 
> I cannot believe people are defending Nvidia on this. AMD got hammered on the reference hawaii cooler and the result was AMD delivered a much better solution with R9 295X2. Similarly the public and press should hold Nvidia accountable. Either change the marketing material and info on the product box or recall the product. The easier option is the first and not going to cost a lot. Clearly mention the actual bandwidth for the last 0.5 GB on the box. What we also need is the press to inform the users with extensive testing so as to enable the user to make a informed decision. I hope pcper and other sites like techreport show the same diligence that they showed in the HD 7900 series frametime issues.


The usual Nvidia suspects are defending this as usual.......just like YOU and the other usual AMD suspects were defending the reference Hawaii cooler as usual. A person who is a hypocrite is worse than just a fanboy/girl (although both are incredibly stupid).

The other people, everyone else, are gladly roasting Nvidia on this without giving one darn for who they root for. And those people are also trying to gather up evidence, cold hard numbers, rather than just blowing hot air with the other side's fanboys/girls.

Honestly, there are some people that need to be banned from this thread...

*Now, actually trying to get to the bottom of this. Can someone with a Samsung vram card run the benchmark (check in GPU-Z)? *Was reading through that Nvidia thread, and there was someone that mentioned trying that, but no one actually did. Because a vast majority of the GTX 970s on the market are using Hynix. My GTX 970 is personally Hynix and does exhibit the problem according to Nai's bench. (EVGA GTX 970 SC)


----------



## bonami2

Boost GPUs are called boost GPUs for a reason: they don't sell them as 1000MHz GPUs if they run at 930 and boost to 1000, because that would be false advertising.

The 3.5GB is usable, but the 4GB is just not working as expected.

I'm sure they could end up in a lawsuit for stupid things like that. It hurt AMD and blabla.

And who with any reasonable mind would say:

I have a 3GB GPU

I'm upgrading for 30-70% more power

But only 0.5GB more


----------



## 2010rig

You're right, and I was wrong, I meant to say Redwoods, not you.









Was talking about this post, which has *no correlation* to the topic at hand, because if it did, you'd see the same issue with Single card results.
Quote:


> Quote:
> 
> 
> 
> Originally Posted by *Redwoodz*
> 
> LMAO!
> 
> 
> 
> 
> 
> 
> 
> FCAT SLi results
> 
> 
> 
> Look at all the dropped frames. http://www.guru3d.com/articles-pages/geforce-gtx-970-sli-review,8.html
> 
> 
> 
> Look at the frame drops,and the declining FPS as more VRAM is used.
> 970 owners you have been had.
Click to expand...


----------



## spacin9

_*"Now, actually trying to get to the bottom of this. Can someone with a Samsung vram card run the benchmark (check in GPU-Z)? *Was reading through that Nvidia thread, and there was someone that mentioned trying that, but no one actually did. Because a vast majority of the GTX 970s on the market are using Hynix. My GTX 970 is personally Hynix and does exhibit the problem according to Nai's bench. (EVGA GTX 970 SC)"_

Samsung here... same issue as the other 970s.

Nai's test:

http://www.overclock.net/t/1537596/lazygamer-nvidia-s-gtx970-has-a-rather-serious-memory-allocation-bug/100#post_23450576


----------



## John Shepard

Is this why SoM stutters like hell at 4K with ultra textures?
With textures set to high, VRAM usage is around ~3.5GB and the game runs great, but with ultra textures the VRAM maxes out and it stutters like crazy.

Can anybody with SLI 980s test this? Does your game stutter with ultra settings/textures at 4K?


----------



## Seven7h

Quote:


> Originally Posted by *John Shepard*
> 
> Is this why SoM stutters like hell at 4K with ultra textures?
> With textures set to high, VRAM usage is around ~3.5GB and the game runs great, but with ultra textures the VRAM maxes out and it stutters like crazy.
> 
> Can anybody with SLI 980s test this? Does your game stutter with ultra settings/textures at 4K?


On *any gpu* if you approach the dedicated video memory limit (say within 100-200MB) all bets are off with regard to stuttering and performance. This is because Windows doesn't want to fail an application requested allocation, and cause corruption or a crash, so it starts shuffling stuff into system memory. This shuffling process into and out of video memory is called paging and can cause stuttering, especially when the stuff coming from "system memory" was actually kicked out to disk (HDD/SSD).

This has always been the case, and always will be, both for this particular configuration as well as standard memory configurations that have equal speed access to all video memory.

Windows is in control of what resources live where, and when to change that.
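The paging behavior described above can be sketched with a toy model. This is purely illustrative: the 224 GB/s figure matches the 970's advertised bandwidth, but the ~16 GB/s PCIe fallback number and the simple weighting are assumptions, not measurements of any real card.

```python
# Toy model of paging near the VRAM limit (illustrative only, not a
# measurement). Assumed numbers: VRAM at 224 GB/s, PCIe fallback to
# system RAM at ~16 GB/s.
VRAM_GB = 4.0
VRAM_BW = 224.0   # GB/s, on-card (advertised spec)
PCIE_BW = 16.0    # GB/s, path used when Windows pages out (assumed)

def effective_bandwidth(working_set_gb):
    """Weighted average bandwidth when part of the working set is paged out."""
    if working_set_gb <= VRAM_GB:
        return VRAM_BW
    resident = VRAM_GB / working_set_gb      # fraction served from VRAM
    paged = 1.0 - resident                   # fraction paged over PCIe
    # harmonic weighting: average time per byte, not bytes per second
    return 1.0 / (resident / VRAM_BW + paged / PCIE_BW)

for ws in (3.0, 4.0, 4.5, 6.0):
    print(f"{ws:.1f} GB working set -> {effective_bandwidth(ws):6.1f} GB/s effective")
```

Even a small paged-out fraction drags the effective figure down hard, which is why performance falls off a cliff rather than degrading gently.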


----------



## Noufel

Quote:


> Originally Posted by *John Shepard*
> 
> Is this why SoM stutters like hell at 4K with ultra textures?
> With textures set to high VRAM usage is around ~3.5GB and the game runs great but with ultra textures the vram maxes out and it stutters like crazy.
> 
> Can anybody with SLI 980s test this? Does your game stutter with ultra settings/textures at 4K?


I experienced stuttering playing SoM at 4K DSR ultra with my 980 G1s in SLI, but I think I hit a memory usage wall, maxing the 4GB of VRAM in some places.


----------



## John Shepard

Quote:


> Originally Posted by *Noufel*
> 
> I experienced stuttering playing SoM at 4k DSR ultra with my 980s g1 sli but i think i hit a memory usage wall maxing the 4gb of vram in some places.


Is it stuttering to the point where the game becomes completely unplayable, or is it occasional?
How is the GPU usage?

I noticed my 970s are never at 99%.


----------



## darealist

Nvidia's all new data plan. 3.5g max until throttle.


----------



## Kinaesthetic

Quote:


> Originally Posted by *spacin9*
> 
> *"Now, actually trying to get to the bottom of this. Can someone with a Samsung vram card run the benchmark (check in GPU-Z)? *Was reading through that Nvidia thread, and there was someone that mentioned trying that, but no one actually did. Because a vast majority of the GTX 970s on the market are using Hynix. My GTX 970 is personally Hynix and does exhibit the problem according to Nai's bench. (EVGA GTX 970 SC)"
> 
> Samsung here... same issue as the other 970s.
> 
> Nai's test:
> 
> http://www.overclock.net/t/1537596/lazygamer-nvidia-s-gtx970-has-a-rather-serious-memory-allocation-bug/100#post_23450576


Appreciate it. Unfortunately it looks like I'm just gonna have to personally stick with the GTX 970. It is already under water, and I REALLY don't want to have to re-bend the acrylic runs to the GPU block. Had to do two miserably hard bends. The Caselabs S3 doesn't give much space to work with once you start packing a ton of components into it. Laziness > 3.5GB VRAM.


----------



## raghu78

Quote:


> Originally Posted by *Kinaesthetic*
> 
> The usual Nvidia suspects are defending this as usual.......*just like YOU and the other usual AMD suspects were defending the reference Hawaii cooler as usual*. A person who is a hypocrite is worse than just a fanboy/girl (although both are incredibly stupid).


I did not defend the reference Hawaii cooler, and AMD got their fair share of criticism. That also changed AMD's approach towards cooling and resulted in a much better R9 295X2, both thermally and acoustically. Btw, before you spout some nonsense, try and back it up with proof.


----------



## Noufel

Quote:


> Originally Posted by *John Shepard*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Noufel*
> 
> I experienced stuttering playing SoM at 4k DSR ultra with my 980s g1 sli but i think i hit a memory usage wall maxing the 4gb of vram in some places.
> 
> 
> 
> Is it stuttering to the point where the game becomes completely unplayable, or is it occasional?
> How is the gpu usage?
> 
> I noticed my 970s are never at 99%.
Click to expand...

No, not unplayable, it's more occasional, especially in the daytime,
and the GPUs are at 99% most of the time at 4K DSR. The minimum memory usage I've seen was 3.2 GB.


----------



## Noufel

Quote:


> Originally Posted by *Kinaesthetic*
> 
> Quote:
> 
> 
> 
> Originally Posted by *spacin9*
> 
> _*"Now, actually trying to get to the bottom of this. Can someone with a Samsung vram card run the benchmark (check in GPU-Z)? *Was reading through that Nvidia thread, and there was someone that mentioned trying that, but no one actually did. Because a vast majority of the GTX 970s on the market are using Hynix. My GTX 970 is personally Hynix and does exhibit the problem according to Nai's bench. (EVGA GTX 970 SC)"_
> 
> Samsung here... same issue as the other 970s.
> 
> Nai's test:
> 
> http://www.overclock.net/t/1537596/lazygamer-nvidia-s-gtx970-has-a-rather-serious-memory-allocation-bug/100#post_23450576
> 
> 
> 
> Appreciate it. Unfortunately it looks like I'm just gonna have to personally stick with the GTX 970. It is already under water, and I REALLY don't want to have to re-bend the acrylic runs to the GPU block. Had to do two miserably hard bends. The Caselabs S3 doesn't give much space to work with once you start packing a ton of components into it. Laziness > 3.5GB VRAM.
Click to expand...

that's a good point


----------



## Almost Heathen

The more I read about Intel and Nvidia, the less I like their business practices.

If AMD has competitive hardware when it's upgrade time, I'll definitely be looking more closely at them than I have before.


----------



## hurleyef

Quote:


> Originally Posted by *Exilon*
> 
> Of course they knew about the consequences of disabling the 3 SMMs. They segmented the VRAM to mitigate it so by definition they must have known about it.
> 
> Now there are a lot of misunderstandings, both willful and not, going around. In the end it's a bunch of enthusiasts who know nothing about their toys wildly speculating. Let me lay down some facts:
> 
> *No, the last 500 MB isn't 20 GB/s. That's PCIe bandwidth. Nai says the last 500 MB is being thrashed for some reason and the GPU is always swapping from system RAM in that segment for his CUDA benchmark. We don't know how much effective bandwidth there is in the last 500 MB at all.
> *
> No, the GTX 970 is not a 20x-bit card. The GTX 980 gets a "180 GB/s" score because it's a floating point addition benchmark and the $550 card has 23% more shaders. (180 GB / 16 bytes per float4 vector == 180 * 1024^3 / 16 bytes per float4 * 4 FLOP/float4 == 48.3 GFLOPS.) Yes, the benchmark is shader bottlenecked until it starts swapping from system RAM.
> 
> *The other remaining question is (and the most important):
> 
> How badly does this affect 4K frame times when the upper .5 GB needs to be used? We have plenty of 4K SLI benchmarks showing perfectly acceptable 99-percentile frametimes, but no indication of how much VRAM was needed to draw those frames.*


This.
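For reference, the bandwidth-to-FLOPS conversion in the quoted post works out as follows (assuming, as the quote does, one 16-byte float4 read and 4 FLOPs per vector, with "GB" interpreted as GiB):

```python
# Converting a measured memory bandwidth into the FLOPS a float4
# addition benchmark would report. Assumptions: one 16-byte read and
# 4 FLOPs (one add per component) per float4.
bytes_per_float4 = 16
flops_per_float4 = 4               # one add per component
bandwidth_bytes = 180 * 1024**3    # the "180 GB/s" measured score

vectors_per_sec = bandwidth_bytes / bytes_per_float4
gflops = vectors_per_sec * flops_per_float4 / 1e9
print(f"{gflops:.1f} GFLOPS")      # → 48.3 GFLOPS
```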


----------



## pony-tail

With this on the 970, and the 960 not playing nice at over 1080p, there are not many choices left for people at higher resolutions.
I am looking for a card for my sig rig, and one for its twin, that will run well at 1440p. I waited quite a while for the 960 to come out, only to find it is not good at 1440, so I started looking at the 970 to see if there was one that was 9.5" or less, and I found this. I hope somebody comes up with a decent mid-range card soon; I am stuck on an R7 260X, which I might add is not great at 1440p.
P.S. Its twin is running an MSI 760 mini-ITX card and it is only slightly better.


----------



## 2010rig

Quote:


> Originally Posted by *Almost Heathen*
> 
> The more I read about Intel and Nvidia, the less I like their business practices.
> 
> If AMD has competitive hardware when it's upgrade time, I'll definitely be looking more closely at them than I have before.


It's been 8 years since AMD last had the lead with CPUs.

Making faster products than the competition, and figuring out how to cripple the performance of your products is a hard business for Intel and NVIDIA, cut them some slack man.

The $220 difference between 970 & 980 had to have some drawbacks.

Why don't you wait and see for the whole story to unfold? The 970 isn't any slower at gaming than it was at launch.


----------



## hurleyef

Quote:


> Originally Posted by *Noufel*
> 
> I experienced stuttering playing SoM at 4k DSR ultra with my 980s g1 sli but i think i hit a memory usage wall maxing the 4gb of vram in some places.


Shadow of Mordor has had many well-documented issues with SLI. Even with SLI disabled I still had microstutter issues with two cards; it ran great with just one, though.


----------



## brootalperry

Is another one of these threads really required?


----------



## SperVxo

Quote:


> Originally Posted by *Defoler*
> 
> I doubt most people feel the effect of this max memory issue to throw their 970s away and switch to AMD.
> 
> And besides one artificial test, there have been no actual tests that can even show whether this 0.5GB memory bandwidth reduction affects the gaming experience at all.
> In all tests, eventually, the 970 sits where it's supposed to, around the 290X, and costs the same.
> So if real performance and reviews show the performance of the card in various ways, and the FPS looks to be OK, I see no problem with this. This is a potential problem, but looking at reviews, can you point to where it really comes into effect? I can't. And we aren't running artificial tests when playing games.
> 
> Between artificial test and a potential problem, and an actual gaming problem, there needs to be proof of this. And I have yet to see any at all.


Well, it seems in testing all over that when the card uses over 3.5GB of memory, the FPS drops and the game stutters. If they advertise it as 4GB at a specific speed, it should work at that speed, not drop to 90% lower speed when going over 3.5GB of VRAM.

This doesn't seem to happen on the 980.

If they had advertised the card correctly, as 3.5GB of VRAM plus 500MB that is 90% slower, people wouldn't have bought the card and would instead have gone for a 980 or an AMD card.

And the card wouldn't be worth that kind of money if it only works at 100% up to 3.5GB of VRAM.

Nvidia screwed customers, and some might take it harder than others. A lot of users are pissed off.


----------



## pony-tail

I haven't bought yet, but that said, I am not sure what I am going to buy either.
The issues with the new Nvidia cards and the Aussie prices have me wondering; I may even get a couple of used GTX 770s yet. It doesn't look as if AMD is coming out with anything in the near future.
Looks like: suck it up and buy a 960/970, or buy used.
Edit! I guess there is a third option: ditch the whole ecosystem and get an Android tablet (if I can figure out a use for it).


----------



## Clocknut

IMO, Nvidia should now release a 970 Ti with only 2 SMMs cut and an actual full 4GB to remedy this problem, and EOL this 970.


----------



## XXnomadXX

Oh god no. If Nvidia EOLs the GTX 970, this is the last purchase I'm making before moving to AMD or something else. The last time Nvidia screwed me was with the defective 8800 GTS series chip.


----------



## pony-tail

Quote:


> Originally Posted by *Clocknut*
> 
> IMO, Nvidia should now release a 970Ti with only 2 SMM cut and actually have full 4GB to remedy this problem and EOL this 970.


That would work for me


----------



## poii

Quote:


> Originally Posted by *looniam*
> 
> *THE RESULTS OF THE NAI BENCHMARK HAVE BEEN PROVEN TO BE INVALID!*


I want to be clear about this: I am German and my English is far from perfect, but so is the Google/Chrome translator.
I can't translate the whole post made by Nai with all the technical jargon, but I can try to tell you what he was saying before all the technical stuff, in the last 3 paragraphs.

He said his benchmark has trouble if GPU RAM is already in use by DirectX. He thought that more requests to write his CUDA buffers into the already used GPU RAM would clear that part for his CUDA bench.

This seems not to be the case, and CUDA addresses other RAM space instead.

However, and this is speculation on my part that needs confirmation from someone who knows this stuff:
if this test is done in headless mode, no GPU RAM should be in use and the benchmark should run without any problems, as shown with many other GPUs.


----------



## spacin9

Oh they'll put out the 970 TI alright. Just like they did when AMD initiated another coup with the 290s vs the GTX 780. They rushed out the 780 Ti... at a big premium over the 780. Nvidia is only too happy to rectify all this.


----------



## Clocknut

Quote:


> Originally Posted by *XXnomadXX*
> 
> Oh god no. If Nvidia EOLs the GTX 970, this is the last purchase I'm making before moving to AMD or something else. The last time Nvidia screwed me was with the defective 8800 GTS series chip.


Well, this is the only way out for them to put this GTX 970 cancer child (coil whine + 3.5GB) behind them without doing a massive "Intel P67"-style recall:

1. Release a 970 Ti with 1792 cores/64 ROPs + a true 4GB of VRAM @ $399.
2. EOL the 970, and use those GM204 chips with defective ROPs for a 3GB 1536-core/48-ROP 192-bit 960 Ti @ $279.

Then everybody forgets about the 970, lol.

Right now, with the GTX 960 @ $199, I am not sure how they would fit into such a tight price range if they were to follow the 970's current $300-320 pricing. Between the 970 and 960 there is only a $100 difference, so if they put in a 960 Ti now, the price gap between cards would be only $50, and those SuperClocked-edition 960s/960 Tis are gonna have a hard time selling when the next tier is just a little bit extra.


----------



## damric

The R9 295X2 is looking really good right now at less than $700 for anyone running super high resolutions.

http://pcpartpicker.com/parts/video-card/#c=169&sort=a8


----------



## Exilon

Quote:


> Originally Posted by *poii*
> 
> I want to be clear with that, I am german and my english is far from perfect but so is the google/chrome translater.
> I can't translate the whole post made by Nai with all the technical jibberish but I can however try to tell you what he was saying before all this technical stuff in the last 3 paragraphs.
> 
> He said his benchmark has troubles if GPU RAM is already used by DirectX files. He thought due to more requests to write those CUDA files (or w/e) into the already used GPU RAM it would clear the part for his CUDA bench.
> 
> This seems not to be the case and CUDA adresses other RAM space instead.
> 
> However and that is speculation on my part that needs confirmation from someone who knows this stuff:
> If this test is done in headless mode no GPU RAM should be used and the benchmarks should run without any problems as shown with many other GPUs.


Yeah, you got the gist of it. The nuance is that portions of VRAM with contention will be much slower, because the CUDA bench and whatever else is using the RAM both need to store data there. What ends up happening in the contested portions is that the CUDA bench hits a page fault and has to pull the data from system memory. A little later, the other thing using the VRAM runs and page faults, booting the CUDA bench's page out of VRAM into system RAM. Basically the two programs swap the data in and out of system RAM. This causes the effective bandwidth to drop, since the GPU will be waiting for the swap to happen, and it takes longer to process the 128 MB block.

The strange thing is that the last 500 MB block is also behaving like something is kicking it out of VRAM, or at least the 128MB workload is taking just as long to finish. This could be a side-effect of the prioritization magic Nvidia is pulling in the background, _or_ the 500 MB block is as slow as the PCIe bus which would really make it a 3.5 GB card...

One way to test whether the numbers are a coincidence is to put the GTX 970 on a PCIe 2.0 or PCIe 3.0 8x port and see if the slow portion gets worse. I know a lot of you have GTX 970 so someone should test it real quick.
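The ping-pong effect described here can be illustrated with a toy simulation. The access and swap costs below are made-up numbers chosen only to show the shape of the problem, not measurements:

```python
# Toy simulation of two workloads contending for one slot of fast
# memory. Each access to a page that was evicted costs a slow transfer
# from system RAM. Costs are assumed, purely illustrative.
FAST_ACCESS_NS = 1
SWAP_COST_NS = 100   # assumed penalty to pull a page back over PCIe

def run(accesses, contended):
    """Total time for a stream of accesses to a single page slot."""
    resident = None  # which workload's page currently occupies the slot
    total = 0
    for i in range(accesses):
        who = i % 2 if contended else 0   # alternate workloads if contended
        if resident != who:
            total += SWAP_COST_NS          # page fault: swap in from system RAM
            resident = who
        total += FAST_ACCESS_NS
    return total

print("uncontended:", run(1000, False), "ns")
print("contended:  ", run(1000, True), "ns")
```

With every access faulting, the contended case is dominated entirely by swap cost, which is why a benchmark sharing VRAM with a desktop session can report bandwidth near PCIe speed rather than VRAM speed.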


----------



## wermad

Lol, ppl chill. You bought a GTX 970, and there had to be some handicap(s) in place in order for it to relinquish overall dominance to the GTX 980. If you want your 4GB, put up the cash and get a 980 (or head over to the hot, literally, red camp w/ Hawaii). Didn't we learn how much Nvidia cripples its runner-up w/ the GTX 570???

As far as false advertisement, it's not. They are effectively delivering you a product that does have 4GB. It's not missing any. How it utilizes that memory is questionable to some, but it does deliver on its premise and so far has had impressive results imho. Furthermore, both AMD and Nvidia advertise their dual-GPU cards w/ the total memory, not effective memory, as a marketing scheme. Dishonest? No, and you have to understand how marketing and capturing consumers' "desires" works. Also, small clean numbers work best for marketing. Sure, we can get down to the nitty-gritty details, but that's for the reviews. It's just how it works in the art of marketing.

A good analogy is car horsepower. It's measured at the flywheel in the majority of cases, and you don't get all of it applied to the tarmac. But a 400 hp car sounds better than a 331.28 whp car. I can't sue a car maker for not giving me 400 ponies at the rubber. C'est la vie.

For all those with their knickers in a bunch, Nvidia can solve this with the magical asterisk:

GTX 970 4GB*

*(total effective vram available) (<- lawyers love to be vague and tiny).

Did anyone confirm whether this is only Hynix and not Samsung? Just curious, and has anyone here tested this theory?


----------



## Obrigado

GTX 980 users!!! Attention please...

Run this test like mine to clarify the situation:

*FarCry 4 - DSR 4K - all details to maximum - test 1 with MSAA 2x - test 2 with MSAA 4x*

Here are my screenshots of the test:

*MSAA 2x = 21.6fps*

*MSAA 4x = 3.7fps*


If the 980 roughly behaves the same way, with a gap of +10-15%, then we can stay calm.


----------



## error-id10t

Quote:


> Originally Posted by *wermad*
> 
> GTX 970 4GB*
> 
> (total effective vram available) (<- lawyers love to be vague and tiny).


Memory Bandwidth (GB/sec): 224.

Already mentioned, so now all they need to do is show that's true. I'm still trying to duplicate the BF4 setup (anyone should be able to).


----------



## bonami2

Quote:


> Originally Posted by *error-id10t*
> 
> Memory Bandwidth (GB/sec): 224.
> 
> Already mentioned, so now all they need to do is show that's true. I'm still trying to duplicate the BF4 setup (anyone should be able to).


Does "effective" mean full potential?

If the 500MB causes bad performance,

the lawyers win millions.


----------



## raghu78

Quote:


> Originally Posted by *2010rig*
> 
> Making faster products than the competition, *and figuring out how to cripple the performance of your products is a hard business for Intel and NVIDIA*, cut them some slack man.


Maybe difficult for Intel, but not so much for Nvidia, given how Kepler performance has been less than stellar in recent games, allowing the R9 290X to now match the GTX 780 Ti when a year back the 780 Ti was clearly faster.

Quote:


> *The $220 difference between 970 & 980 had to have some drawbacks*.
> 
> Why don't you wait and see for the whole story to unfold? The 970 isn't any slower at gaming than it was at launch.


The problem is not the drawbacks, but the fact that Nvidia did not reveal the memory partitioning detail at launch, or the bandwidth of the last 0.5 GB. As for the performance ramifications when more than 3.5 GB of VRAM is accessed, we cannot trust Nvidia, given how they deliberately withheld this information from the public. We need the tech press to do an honest and thorough investigation; someone needs to have the guts to do a fair job and not be afraid to step on Nvidia's toes. Btw, are you not now defending Nvidia (which is what you accuse others of)?


----------



## Noufel

Quote:


> Originally Posted by *raghu78*
> 
> Quote:
> 
> 
> 
> Originally Posted by *2010rig*
> 
> Making faster products than the competition, *and figuring out how to cripple the performance of your products is a hard business for Intel and NVIDIA*, cut them some slack man.
> 
> 
> 
> maybe difficult for Intel but no so much for Nvidia given how Kepler performance has been less than stellar in recent games allowing the R9 290X to now match GTX 780 Ti when a year back it was clearly faster.
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> *The $220 difference between 970 & 980 had to have some drawbacks*.
> 
> 
> 
> 
> 
> 
> 
> Why don't you wait and see for the whole story to unfold? The 970 isn't any slower at gaming than it was at launch.
> 
> Click to expand...
> 
> The problem is not about the drawbacks. But the fact that Nvidia did not reveal the memory partition detail at launch or the bandwidth of the last 0.5 GB. For the performance ramifications when VRAM > 3.5 GB is accessed we cannot trust Nvidia given how they deliberately withheld this information from the public. We require the tech press to do a honest and thorough investigation. someone needs to have the guts to do a fair job with the investigation and not be afraid to step on Nvidia's toes. btw are you now not defending Nvidia (which is what you accuse others of)
Click to expand...

I can't see Nvidia advertising the 970 as a 3.5 + 0.5 GB GPU; even at that price the potential buyer would think twice.


----------



## raghu78

Quote:


> Originally Posted by *wermad*
> 
> As far as false advertisement, its not. They are effectively delivering you a product that does have 4gb. Its not missing any. How it utilizes that memory, is questionable to some but it does deliver on its premise and so far has had impressive results imho.


It does not. Nvidia said 224 GB/s memory bandwidth on a 256-bit GDDR5 memory bus running 7GHz GDDR5 memory, and a 4GB VRAM capacity. What they conveniently left out was that the last 0.5 GB is not accessed at the same speed as the rest of the 3.5 GB. How can you defend such crap?

http://www.hardocp.com/image.html?image=MTQxMTk3NjU5NW5pdEZaMTFFZzFfMV8yX2wuZ2lm

Would you accept it if a portion of an Intel Core i5/i7's L3 cache ran at a much lower speed than the rest of the L3, causing performance to suffer whenever that portion of the cache is used? For CPUs, the L3 cache is the last level before accesses go to system memory. For GPUs, it's the onboard GPU memory. The fact that you are justifying it is ridiculous.


----------



## John Shepard

Can someone with a 980 perform the following test?

Put DAI at 4K ultra settings, go into the war room, and post your frame rates and memory usage.

On two 970s (even with SLI disabled), whenever I go into the war room the memory maxes out and the game starts stuttering/lagging.
This does not happen at 1080p/1440p.

Outside of the war room memory usage is <3500MB and the game runs fine.


----------



## error-id10t

This is my choppy experience at >3.5GB with BF4; it goes without saying you'd feel like your eyes were about to explode if this were normal. A 4790K @ 4.8giggles and 970 SLI drives it plenty compared to the Nvidia example using a single card.



Ignore the few dips prior to that; I had to keep upping the resolution scaling etc. to bump it up. Finally, with the pagefile at 12GB, I got it not to crash on me!


----------



## sy573mx

I'm Happy with my 3.5GB.... Up from 1GB

I'm still on 1080p so No issues for me right now.


----------



## Nafu

Oh, over 350 posts of discussion on that serious flaw from NVIDIA, and I am just in; seems I'm a bit late to the party.

This flaw is not general or unnoticeable; it's quite condemnable, as the 970 is their best-selling product right now globally. The 0.5GB being effectively unusable creates a bottleneck to some extent by limiting the memory bandwidth games can utilize, especially in SoM, Battlefield, and COD: AW, which are more VRAM-intensive than other games. If it turns out to be a hardware-return case, it becomes a mess, which I don't think it will.

I am sure the general gamer doesn't bother with this issue. Overclockers and benchmarkers have serious concerns about resources not being fully utilized, hurting performance.


----------



## delboy67

Quote:


> Originally Posted by *wermad*
> 
> Lol, ppl chill. You bought a GTX 970 and there has to be some handicap(s) in place in order for it to relinquish overall dominance to GTX 980. If you want your 4gbs, put up the cash and get a 980 (or head over to the hot, literally, red camp w/ Hawaii). Didn't we learn how much Nvidia cripples its runner up w/ GTX 570???
> 
> As far as false advertisement, its not. They are effectively delivering you a product that does have 4gb. Its not missing any. How it utilizes that memory, is questionable to some but it does deliver on its premise and so far has had impressive results imho. Furthermore, both Amd and Nvidia advertise their dual gpu cards w/ the total memory not effective memory as a good marketing scheme. Dishonest? No, and you have to understand how marketing and capturing consumer's "desires" works. Also, small clean numbers work best for marketing. Sure, we can get down to the nitty-gritty details, but that's for the reviews. Its just how it works in the art of marketing.
> 
> A good analogy is car horsepower. Its measured at the flywheel in the majority of cases, and you don't get all those applied to the tarmac. But a 400 hp car sounds better than a 331.28 whp car. I can't sue a car maker for not giving me 400 ponies at the rubber. C'est la vie.
> 
> All those in a bunch, Nvidia can solve this with the magical * :
> 
> GTX 970 4GB*
> 
> (total effective vram available) (<- lawyers love to be vague and tiny).
> 
> Did any one confirm if this is only Hynix and not Samsung? Just curious on this and if anyone here has tested this theory?


A better analogy would be: here's our new car, it has 300bhp, except when you go over 80mph it drops to 80bhp.


----------



## Obrigado

last test GTX 970 VS GTX 980

Different CPUs and in-game locations, but... the same settings (DSR 4K, all maximum settings).

GTX 970

msaa2x


msaa4x


GTX980

msaa2x
http://imgur.com/SkJGwRx

msaa4x
http://imgur.com/yVJm1aF

GTX 970 T1 = 20.4fps (video RAM 3704MB) (system RAM 5.5GB of 16)
GTX 980 T1 = 23fps (video RAM 3946MB) (system RAM 7.3GB of 16)

GTX 970 T2 = 12.1fps (video RAM 4090MB) (system RAM 8.8GB of 16)
GTX 980 T2 = 11.9fps (video RAM 4020MB) (system RAM 9.2GB of 16)

no problem!
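One quick sanity check on the numbers above: compare each card's relative drop going from MSAA 2x to MSAA 4x. If the 970's upper 0.5 GB were catastrophically slow, its drop should be far steeper than the 980's once usage passes ~3.7 GB. This is a rough reading of these two data points only, not a controlled test:

```python
# Relative slowdown from MSAA 2x to MSAA 4x for each card, using the
# FPS numbers reported in the post above.
gtx970 = {"msaa2x": 20.4, "msaa4x": 12.1}
gtx980 = {"msaa2x": 23.0, "msaa4x": 11.9}

for name, fps in (("GTX 970", gtx970), ("GTX 980", gtx980)):
    drop = (1 - fps["msaa4x"] / fps["msaa2x"]) * 100
    print(f"{name}: {drop:.0f}% slower at MSAA 4x")
```

On these two data points the 980 actually loses the larger percentage, which is consistent with the "no problem!" conclusion, though both runs were in different game locations.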


----------



## rdr09

Quote:


> Originally Posted by *delboy67*
> 
> A better analogy would be heres our new car it has 300bhp, except when you go over 80mph, then it falls to 80bhp.


prolly was using a spare tire. can't go over 55MPH with those.

edit: my bad. 50MPH.


----------



## piggycute

I've tested my card with Shadow of Mordor. The settings were 2880*1800 (1.5x DSR from 1920*1200) with everything on Ultra.

Running the in-game benchmark, which mostly consumes 3.5GB of VRAM, sounds no good.

But when I continued playing the story mode, I found that the game takes up 3.9-4GB of VRAM, and the gaming experience was very smooth; I hardly found any stuttering :O

I have an idea: we started to blame Nvidia for cheating us, but why don't we ask Nai for the source code so we know how that benchmark works? I guess Nai's benchmark wasn't testing VRAM bandwidth but testing the GPU Xbar's performance.

And I live in mainland China, so I'm very sorry for my bad English, and I can't provide any testing videos on YouTube.

My rig:

4790K @ 4.4GHz, 16GB RAM, dual GTX 970s (Zotac, a model unique to mainland China)

BTW: my cards had a bad result on Nai's benchmark.


----------



## TheReciever

Welcome to the forums


----------



## 2010rig

Quote:


> Originally Posted by *raghu78*
> 
> maybe difficult for Intel but no so much for Nvidia given how Kepler performance has been less than stellar in recent games allowing the R9 290X to now match GTX 780 Ti when a year back the 780 Ti was clearly faster.
> 
> 
> 
> 
> 
> 
> 
> 
> The problem is not about the drawbacks. But the fact that Nvidia did not reveal the memory partition detail at launch or the bandwidth of the last 0.5 GB. For the performance ramifications when VRAM > 3.5 GB is accessed we cannot trust Nvidia given how they deliberately withheld this information from the public. We require the tech press to do a honest and thorough investigation. someone needs to have the guts to do a fair job with the investigation and not be afraid to step on Nvidia's toes. btw are you now not defending Nvidia (which is what you accuse others of)


If you couldn't detect the sarcasm in my post, well....

My only stance right now is to wait and see how fast that last 0.5GB of RAM really runs, because I'm certain it's not 20 GB/s, and whether it's being utilized. I'm not going to make up my mind based on one buggy benchmark. I'd like to see more gaming results.
Quote:


> Originally Posted by *Obrigado*
> 
> last test GTX 970 VS GTX 980
> 
> different cpu and game places but... same settings (DSR 4k - all maximum settings)
> 
> GTX 970 T1 = 20.4fps (*video RAM 3704MB*) (*system RAM 5.5GB of 16*)
> GTX 980 T1 = 23fps (video RAM 3946MB) (system RAM 7.3GB of 16)
> 
> GTX 970 T2 = 12.1fps (*video RAM 4090MB*) (system RAM 8.8GB of 16)
> GTX 980 T2 = 11.9fps (video RAM 4020MB) (system RAM 9.2GB of 16)
> no problem!


Interesting results


----------



## MapRef41N93W

Quote:


> Originally Posted by *tweezlednutball*
> 
> That sounds really bad as i have 2 7970's (old faithful) in crossfire with only 3gb vram each and i run the game full ultra with 200 resolution scale buttery smooth. I guess thats what true 384 bit bus gets u. NVidia has been known to split busses like this, it hasnt been the first time.


Nah, that's not the reason. That poster is just clearly exaggerating his VRAM usage. I can play both of those games at 4K and never see above 3GB of VRAM on either a 295X2 or a 970.


----------



## Ezpuck

Quote:


> Originally Posted by *John Shepard*
> 
> Can someone with a 980 perform the following test?
> 
> Put DAI at 4K ultra settings, go into the war room, and post your frame rates and memory usage.
> 
> On two 970s(even with sli disabled)whenever i go into the war room the memory maxes out and the game starts stuttering/lagging.
> This does not happen at 1080/1440p res.
> 
> Outside of the war room memory usage is <3500MB and the game runs fine.


I have a GTX 980 and an i7 920 @ 3.6. I tried the war room at 4K ultra and the game didn't stutter at all. There are no frame drops or stuttering when VRAM usage is >3.5GB.


----------



## thegreatsquare

Quote:


> Originally Posted by *raghu78*
> 
> It does not. Nvidia said 224 GB/s memory bandwidth on a 256 bit GDDR5 memory bus running 7Ghz GDDR5 memory and a 4GB VRAM capacity. What they conveniently missed out was the last 0.5 GB is not accessed at the same speed as the rest of the 3.5 GB. How can you defend such crap ?


I have to agree. It's "Memory Bandwidth (GB/sec) - 224", not "Memory Bandwidth (GB/sec) - 224***".


----------



## gamervivek

It's about ethics in hardware journalism.


----------



## mtcn77

Quote:


> Originally Posted by *piggycute*
> 
> I've tested my Card with Shadow of mordor,the Settings was 2880*1800 (1.5x DSR from 1920*1200) with all Ultra,
> 
> to run in-game Benchmark,that mostly consumes 3.5GB of vram,sounds no good.
> 
> But I continue playing the story mode ,i found that the game take up 3.9G-4GB of vram , and gaming experience was very smooth,hardly find any stuttering :O
> 
> i have a idea, that we started to blame nvidia is cheating us,but why dont we ask Nai for the Source code so let us know hows that benchmark to work? I guess Nai's benchmark wasnt testing vRam bandwidth but testing *GPU Xbar's performance*.


Hi!
According to an Nvidia developer's guide, the Xbar is the link from the SMMs to the L2 cache. There seems to be no discrepancy about what is and isn't there; it is worded quite directly. So I think it is fair to say that without the Xbar, the memory bus (which is connected via the L2 cache) is dislocated from the GPU. In effect, the traces may be on the PCB, albeit unconnected, which is why the card is not fully "256 bit".


----------



## thegreatsquare

It seems my 980M 8GB does have the bug, but I'm not sure I should care a lot. If the card is not going to have an issue until I hit 7.5GB, big whoop. I don't understand why the end of the first 4GB is fine when the last of the 8GB has the issue; I sort of expected the issue to occur at both 3.5GB and 7.5GB.


----------



## ojin

Frametimes at >3.5GB vram usage should be tested at over 60fps, because tests at low fps will give the vram more time each frame and hide the problem (if it actually exists).
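To put rough numbers on this point (the slow-segment bandwidth below is an assumed figure for illustration, not a measurement), a frame at 30fps has twice the time budget of a frame at 60fps, so it can hide twice as much slow-segment traffic:

```python
# Illustrative sketch: how much data could be streamed from a slow VRAM
# segment inside one frame's time budget. The bandwidth figure is assumed
# for illustration, not a measured value.
SLOW_SEGMENT_GBPS = 22.0  # assumed bandwidth of the 0.5GB segment

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render one frame at a given frame rate."""
    return 1000.0 / fps

def slow_mb_per_frame(fps: float) -> float:
    """MB reachable in the slow segment within one frame's budget."""
    return SLOW_SEGMENT_GBPS * frame_budget_ms(fps)  # 1 GB/s == 1 MB/ms

for fps in (30, 60, 120):
    print(f"{fps} fps: {frame_budget_ms(fps):.1f} ms budget, "
          f"~{slow_mb_per_frame(fps):.0f} MB reachable in the slow segment")
```

The lower the frame rate, the more slow-segment traffic fits inside the budget, which is why a stall that is invisible at 30fps can surface at 60fps and above.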


----------



## Shaded War

So this is why I still get warnings to disable aero to save on memory with my 5760x1080 setup on a 4GB card when my games never use over 3.5GB.

This is my first Nvidia graphics card, and probably my last. AMD never pulled any crap like this on me.


----------



## Gilles3000

Actually, i wonder what the best buy for 4K is,

2X ASUS GTX 780 6GB Strix for €780

or

2X ASUS GTX 970 3.5GB Strix for €700

(Ignoring the obvious 2X XFX 290X 8GB for €770 to avoid "unfair" comparison arguments)


----------



## Defoler

Quote:


> Originally Posted by *Shaded War*
> 
> This is my first Nvidia graphics, and probably my last. AMD never pulled any crap like this on me.


As far as you know.
There is no CUDA on AMD so you can't run this program to test AMD cards to see if they also pull that same thing or not.

After all, AMD never says anything false; they are always honest and their PR is always spot on.

Besides, outside of this little CUDA benchmark, I have yet to see any real performance hindrance that doesn't correspond to pushing the GPU to its complete limit and getting lower performance than you would expect from any card in that same line.
Nothing in this thread or elsewhere shows me a game that goes up to the 4GB limit, above 3.5GB, and gives FPS that is too low compared to the 290X or to the relative performance of the 980.

It only shows CUDA being pushed and stored data being overwritten by the benchmark. Gaming =/= CUDA calculation.


----------



## mtcn77

Quote:


> Originally Posted by *Defoler*
> 
> As far as you know.
> There is no CUDA on AMD so you can't run this program to test AMD cards to see if they also pull that same thing or not.
> 
> After all, AMD never say anything false and they are always honest and their PR is always spot on


No, no, don't just throw in the towel like that.
AMD doesn't cheat in their marketing material like Nvidia. 64 ROPs is INDEED 64 ROPs on the 290.
You may check it yourself.


----------



## rainzor

So no videos to show this sudden fps drop and stutter at over 3.5GB vram usage? Yet so much complaining..


----------



## DzillaXx

Quote:


> Originally Posted by *Defoler*
> 
> As far as you know.
> There is no CUDA on AMD so you can't run this program to test AMD cards to see if they also pull that same thing or not.
> 
> After all, AMD never say anything false and they are always honest and their PR is always spot on
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Besides, outside of this little CUDA benchmark, I have yet to see any, at all, real performance hindering which doesn't correspond with pushing the GPU to its complete limit and get lower performance than you would expect from the card in that same line.
> Not in this thread or others or anywhere which shows me a game which goes to the 4GB limit, above the 3.5GB, and give FPS too low compared to the 290x or the relative performance of the 980.
> 
> It only shows CUDA being pushed and data stored being overwritten by the benchmark. Gaming =/= CUDA calculation.


If you go on YouTube, or look at previous posts in this thread, you'll see that this problem does happen in games. Plenty of YouTube videos show the stuttering caused by going over 3.5GB.

Though on Windows 7, all you need to do is right-click the game's EXE, go to Properties, and disable desktop composition. IMO a must for gaming on Windows 7, as Aero is a memory hog. It will keep the warnings from popping up and give you better game performance.

Also, AMD cards have always had a solid, proven back end. Don't even try to bash it.


----------



## Baghi

Quote:


> Originally Posted by *Defoler*
> 
> As far as you know.
> There is no CUDA on AMD so you can't run this program to test AMD cards to see if they also pull that same thing or not.


That benchmark may be CUDA-dependent, but the warning to disable Aero is not.


----------



## PontiacGTX

Quote:


> Originally Posted by *damric*
> 
> The R9 295 is looking really good right now at less than $700 for anyone running super high resolution.
> 
> http://pcpartpicker.com/parts/video-card/#c=169&sort=a8


Better to spend $200 more on CF 8GB 290Xs if you want futureproofing.


----------



## Silent Scone

Quote:


> Originally Posted by *rainzor*
> 
> So no videos to show this sudden fps drop and stutter at over 3.5GB vram usage? Yet so much complaining..


That's because it's not happening at 3.5GB; it's happening after 4GB. It's hard to see the forest for the trees on GeForce.com through all the angry 17-year-olds who feel they were cheated for Christmas.


----------



## Biorganic

VRAMGate 2015.

Glad I'm waiting until 14/16nm to upgrade my entire rig. I want to see a new top-end AMD CPU, and the boldest die-shrunk Maxwell/Fiji possible. With all my VRAM usable, thank you.

Sorry to all the folks who were shafted out of a "usable" 512MB on their 4GB cards; shameful business practice.


----------



## Defoler

Quote:


> Originally Posted by *Baghi*
> 
> That benchmark may be CUDA depended, but the warning to disable Aero is not.


I had that warning playing with AMD GPUs as well. There is nothing Nvidia-specific about how crappy Aero is.
Quote:


> Originally Posted by *mtcn77*
> 
> No, no, don't just throw the towel like that.
> AMD doesn't cheat in their marketting material like Nvidia. 64 rops is INDEED 64 rops in 290.


What are you talking about? ROPs are not SMs.
An SM is a streaming multiprocessor, each with X number of cores in it. Fewer SMs means fewer cores.
The ROPs are connected to the memory controllers, 16 to each of the 4 memory controllers; 64 on both the 980 and the 970.
Learn your Maxwell (and the OP's article should learn it as well...).
The 980 has 16 SMs, each containing 128 cores. The 970 has 13 SMs, each containing 128 cores. Nvidia never said otherwise. EVER.

What they do have is fewer dedicated cores for memory handling inside each SM (from what I understand of Nvidia's response). Or less addressing. I'm not sure.
And you can see from their response, and the numbers from the SoM example, that there is no actual problem in games. And I have yet to see anywhere that shows one.

BTW, the 290X has 8 MCs with 8 ROPs each and a shared L2 cache.
The 980/970 has 4 MCs, but with 16 ROPs each and a dedicated L2 cache.
Each approach has its pros and cons, but both are fine.
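As a sanity check, the shader-core arithmetic above matches the public GM204 specs:

```python
# Core counts from the public GM204 (Maxwell) specs discussed above:
# 128 cores per SM, 16 SMs on the 980, 13 SMs on the 970.
CORES_PER_SM = 128

gtx_980_cores = 16 * CORES_PER_SM
gtx_970_cores = 13 * CORES_PER_SM

print(gtx_980_cores)  # 2048
print(gtx_970_cores)  # 1664
```

Those are exactly the CUDA core counts on both cards' spec sheets, so the disabled SMs were disclosed; the memory-segment behavior was not.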
Quote:


> Originally Posted by *DzillaXx*
> 
> If you go on youtube as well as previous posts, will show that this problem does happen in games. Plenty of youtube video's show the stuttering caused by going over 3.5gb.
> 
> Though with windows 7 all you need to do is right click on the game's EXE, go to properties, and disable desktop composition. IMO a must for gaming on windows 7, as Aero on 7 is a memory hog. Will keep the warnings from popping up, and will give you better game performance.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also AMD card's have always had a solid proven backend. Don't even try to bash it.


Many new games have a lot of micro-stuttering on both the 290X and the 980, and many times it was fixed with patches and new drivers.
And this problem wouldn't cause micro-stuttering; it would cause performance drops. FPS should drop extremely low once the game starts to push the graphics, and we don't see that happen just because it's a 970 going over the 3.5GB mark; we see it because the GPU can't handle the load. We see similar performance reductions on the 980, 290X, 780 Ti and so on as well.

If this were the case, we should have seen the 970 giving seriously bad FPS results in 4K gaming with AA, where the 4GB limit is pushed, compared to the 290X or 980. As in, if they drop by 40%, the 970 should drop by 90% if those numbers in the CUDA benchmark were true.
In reality, in games, this is not happening.

Still not convinced. Especially since, if what you say about Aero is a Windows 7 issue that's getting fixed, it looks more like an OS problem fighting against the heavy burden on the GPU than a memory issue.


----------



## Forceman

Quote:


> Originally Posted by *Shaded War*
> 
> So this is why I still get warnings to disable aero to save on memory with my 5760x1080 setup on a 4GB card when my games never use over 3.5GB.
> 
> This is my first Nvidia graphics, and probably my last. AMD never pulled any crap like this on me.


The aero thing is a bug/glitch/idiosyncrasy, it's been happening for years.
Quote:


> Originally Posted by *thegreatsquare*
> 
> It seems my 980m 8GB does have the bug, but I'm not sure I should care a lot. If the card is not going to have an issue until I hit 7.5GB, big whoop. I don't understand why the end of the first 4GB is fine when the last 4GB has the issue. I sort of expected the issue to occur at 3.5GB and 7.5GB.


The test is bugged. *Stop using it.*


----------



## i7Stealth1366

Come on, AMD R9 39X series: if you offer something that beats the pants off of Nvidia, I'll buy!


----------



## MonarchX

Quote:


> Originally Posted by *Shaded War*
> 
> So this is why I still get warnings to disable aero to save on memory with my 5760x1080 setup on a 4GB card when my games never use over 3.5GB.
> 
> This is my first Nvidia graphics, and probably my last. AMD never pulled any crap like this on me.


Yeah, they used to produce lower-quality drivers, but recently there have been fewer complaints, and these days Nvidia drivers are no better! I've always said that AMD hardware was actually more powerful and overall better, but the drivers for it were a mess. Now that some games have already picked up Mantle, the AMD 390X is surely appealing, especially since it uses stacked VRAM, a technology Nvidia wasn't planning on using until next year. Then we also have FreeSync, which may be picked up faster than G-Sync, which costs extra and takes more time to integrate.

One company is *on its way to being pushed into the abyss, "This is SPARTA!!!" style*, while the other is rising from the ashes like a forgotten but *never completely defeated entity of the Red Devil itself*. Don't you love business competition?


----------



## Hattifnatten

Quote:


> Originally Posted by *rainzor*
> 
> So no videos to show this sudden fps drop and stutter at over 3.5GB vram usage? Yet so much complaining..


I've seen four, and I don't even follow this topic


----------



## bossie2000

OK, this is what Nvidia has to do: all you people with 970s, just come into the nearest store and get yourself a 980!


----------



## morbid_bean

Can anyone confirm whether this affects the just-released EVGA FTW+ edition?


----------



## mtcn77

What is striking is how carefully PCPer has steered the GTX 970 around 4K SLI in AAA titles like Battlefield 4, BioShock: Infinite, Crysis 3, Metro: LL & Skyrim (because two cheaper cards cannot SLI, right?).


----------



## Forceman

Quote:


> Originally Posted by *morbid_bean*
> 
> Can anyone confirm if this affects the just released EVGA FTW+ edition


Does it have a 970 GPU? Then yes.

It's part of the GPU itself; no card is going to be different.


----------



## UZ7

Well, what intrigues me is how this unfolded. First, people were talking about the 3.5GB VRAM usage, as various games hovered around 3500MB used (vs. 4000MB on a 980), and to get more you almost had to force it. Then a bunch of people called BS and said everyone was overreacting, and ran tests to show they could use more than 3.5GB. Then it moved to "well, how come when I use more than 3.5GB the speed drops to a crawl?", which was met with "if you're running high res or high AA, of course you're going to hit slow speeds". Some users ran tests showing games over 3.5GB still worked fine, but other tests showed an increase in frame times. Then a benchmark came out that showed speeds/bandwidth at higher memory use, which itself raised questions of validity, VRAM reserved for the OS, PhysX, and such.

With all this, Nvidia replies with, more or less: welp, you guys are right, we have 3.5/0.5GB RAM partitions and it's working as intended, you should've bought a 980 instead; here is a report showing there is barely any difference and you should see no performance loss (okay, it wasn't actually typed up by interns, but the picture made it seem rushed to cover the claims). So, given how the 970 was designed, they confirmed the whole 3.5+0.5GB split that half the people arguing were calling BS on. Now the question many people have is: when/if they do have to utilize more RAM, will they get shafted, and how will this present in the future?

That's pretty much the summary I've gathered just from reading the forums (while I was gaming). But if you rewind back, all I read is: hey, how come the 970 only uses 3500 and the 980 uses 4000 in the same game... users test, test, test, benchmark, test... Nvidia says hold on, let us test as well... okay, we're back; basically what you guys are saying is "true", the card is designed 3.5/0.5, it's working as intended, it shouldn't make a difference in what you do, and here is a benchmark we did ourselves to back up our claim. So people go back to: uhh, so we see 3500MB because that's how it was designed. After that post from Nvidia, many went back to "well guys, I can use 4GB, whatchu talkin' bout, Willis?"... while many are confused, and others are unsatisfied with Nvidia's response. *At the end of it all: would Nvidia have made a post like that if no one had posted about it in the first place?*


----------



## PontiacGTX

Quote:


> Originally Posted by *UZ7*
> 
> Well what intrigues me is how many people were talking about the 3.5GB vram usage as it was seen in various gaming that it hovered around 3500MB use (vs 980 4000mb use) and to get more you almost have to force it, then there were also a bunch of people just calling BS and how everyones just over reacting and were doing tests to show they can use more than 3.5GB of ram, then it moved to well how come when I use more than 3.5GB ram the speed jumps down to a crawl.. which then coined up well if you're doing high res or high AA of course you're going to hit slow speeds also there were some testing users were doing and showing that they were able to run games over 3.5 use and still worked fine, but theres also tests that show an increase in frame time. Then a benchmark came out that showed speeds/bandwidth when using more ram.. which had many questions of validity as well as vram reserved for OS, physx and such...... but with all this nVidia replies with more or less... welp you guys are right, we have 3.5/0.5 GB ram partitions and its working as intended, you shouldve bought a 980 instead... here is a report typed up from one of our interns showing how there is barely any difference and you should see no performance losses (okay it wasnt done by the interns but the picture made it seem rushed to cover the claims). So with that in mind, based on the nature and how the 970 was designed... they confirmed the whole 3.5+0.5GB when half the people who were arguing were calling BS. Now the question many people have is when/if they do have to utilize more ram use, will they get shafted? and how will it present in the future.
> 
> Pretty much this is the summary I've gathered by just reading the forums (while I was gaming
> 
> 
> 
> 
> 
> 
> 
> ), but if you rewind back.. all i read is.. hey how come 970 only uses 3500 and 980 uses 4000 on the same game.. users test test test benchmark test... nvidia says hold on let us test as well.. okay we're back, basically what you guys are saying is "true" the card is designed 3.5/0.5.. its working as intended, shouldnt make a difference in what you guys do, here is a benchmark we did ourselves to back up our claim... so people go back to uhh so we see 3500MB because thats how it was designed... after that post nvidia made many went back to well guys i can use 4gb, whatchu talkin bout willis?... while many are confused... and others unsatisfied with nVidia's response *at the end of it all... would nVidia have made a post like that if no one posted about it in the first place?
> 
> 
> 
> 
> 
> 
> 
> *


The problem was the slowdown generated by the bottleneck of the last ROPs/memory bandwidth past 3328MB/3500MB.


----------



## mcg75

Reminder.

Please keep off-topic posts out of the thread. The topic is the 970 and its memory bug.

That means arguing about drivers is off topic.

Also, please stop with the personal comments and calling each other out.


----------



## Defoler

I just wanted to say another point.

It's like checking your symptoms online.
You sneeze once, and all of a sudden you run to the doctor demanding cancer treatment.

Micro-stuttering has nothing to do with this issue. Micro-stuttering is when frames do not arrive at constant intervals, and it shows far less at lower FPS.
The main symptom of this problem would be FPS drops. When the data for a frame sits in the upper 0.5GB, it should cause extremely low FPS for several frames, because those frames take longer to render while the memory they need is slower to access.
We don't really see this symptom.
People see micro-stuttering and yell "cancer!" without even considering other options.
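A back-of-the-envelope model of the symptom being described here, with both bandwidth numbers assumed purely for illustration (not measured figures):

```python
# Naive model: blended bandwidth when a frame's working set spills past
# a fast 3.5GB segment into a slower 0.5GB segment. Both bandwidth
# figures are assumptions for illustration only.
FAST_GBPS = 196.0  # assumed fast-segment bandwidth
SLOW_GBPS = 22.0   # assumed slow-segment bandwidth

def effective_gbps(working_set_gb: float) -> float:
    """Blended bandwidth if the whole working set is read once per frame."""
    fast = min(working_set_gb, 3.5)
    slow = max(working_set_gb - 3.5, 0.0)
    total_time = fast / FAST_GBPS + slow / SLOW_GBPS
    return working_set_gb / total_time

print(f"{effective_gbps(3.5):.0f} GB/s")  # stays within the fast segment
print(f"{effective_gbps(4.0):.0f} GB/s")  # spills 0.5GB into the slow segment
```

Under these made-up numbers, spilling even 0.5GB into the slow segment roughly halves the blended bandwidth, which is why the expected symptom would be a sustained FPS drop rather than occasional micro-stutter.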


----------



## Sisaroth

This all seems so overblown. That benchmark either doesn't even work correctly or tons of cards are affected by this.


----------



## mtcn77

Quote:


> Originally Posted by *Sisaroth*
> 
> This all seems so overblown. That benchmark either doesn't even work correctly or tons of cards are affected by this.


Except there are working FCAT benchmarks everywhere, and the 970 has NEVER been tested in 4K SLI. I mean, what are the chances that "all" reviewers turned the idea down, even those that tested the 980 right beside the 970 in the same review?


----------



## ZealotKi11er

Quote:


> Originally Posted by *DzillaXx*
> 
> If you go on youtube as well as previous posts, will show that this problem does happen in games. Plenty of youtube video's show the stuttering caused by going over 3.5gb.
> 
> Though with windows 7 all you need to do is right click on the game's EXE, go to properties, and disable desktop composition. IMO a must for gaming on windows 7, as Aero on 7 is a memory hog. Will keep the warnings from popping up, and will give you better game performance.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also AMD card's have always had a solid proven backend. Don't even try to bash it.


It's funny because only people with Nvidia cards had to disable Aero in games. I never had to disable it with ATI cards.


----------



## intelfan

Haha this is just like T-Mobile's data plans. Go over your high speed data and you get throttled to 128kbps


----------



## mtcn77

Quote:


> Planned obsolescence or built-in obsolescence[1] in industrial design is a policy of planning or designing a product with an artificially limited useful life, so it will become obsolete, that is, unfashionable or no longer functional after a certain period of time.[2] The rationale behind the strategy is to generate long-term sales volume by reducing the time between repeat purchases (referred to as "shortening the replacement cycle").
> Firms that pursue this strategy believe that the additional sales revenue it creates more than offsets the additional costs of research and development and opportunity costs of existing product line cannibalization. The rewards are by no means certain: *in a competitive industry, this can be a risky strategy because consumers may decide to buy from competitors.
> Planned obsolescence tends to work best when a producer has at least an oligopoly*.[3] _Before introducing a planned obsolescence, the producer has to know that the consumer is at least somewhat likely to buy a replacement from them_. In these cases of planned obsolescence, there is an information asymmetry between the producer - who knows how long the product was designed to last - and the consumer, who does not. When a market becomes more competitive, product lifespans tend to increase.


----------



## hurleyef

Quote:


> Originally Posted by *Sisaroth*
> 
> This all seems so overblown. That benchmark either doesn't even work correctly or tons of cards are affected by this.


The writer of the benchmark already said as much. Its results are completely irrelevant in the context of this discussion.

That's not to say that there is no issue, only that it has not yet really been quantified except for the data released by nVidia from the article that started this thread. Sadly, it's been almost entirely ignored in favor of hand waving and trolling.

My original post stands:
Quote:


> Originally Posted by *hurleyef*
> 
> Seems pretty negligible to me, but I'd still like to see more in depth testing from someone that wasn't nVidia just to be sure.


----------



## MonarchX

*This is so ridiculous.* Mods are removing replies that do not violate any rules whatsoever just because they are pro-AMD and anti-Nvidia, even when backed up with legitimate evidence. How lame is that? Let's hail Nvidia for this issue. After all, their logo is still green, which makes them L337! I can't say for sure, but I bet Nvidia pays money to all kinds of organizations and forums to keep this issue as quiet as possible, just like they "work" with game developers to make sure their games are optimized ("made") for Nvidia cards. Boo!


----------



## sugalumps

Quote:


> Originally Posted by *i7Stealth1366*
> 
> Come on AMD R9 39X series, if you offer something that beats the pants off of nvidia Ill buy!


It would have to beat the 980 in FPS / lower power draw / heat output / noise for me to consider it. Those are the reasons I went for Maxwell (980), and obviously the fact that it outperformed my last card in FPS.


----------



## Asus11

I hope we see price drops on GTX 970s now that this is out in the open... or better yet, price drops on the 980.

I'm guessing a lot of people will be a bit sceptical of buying the lower-end card from Nvidia's next gen, considering what just went on.


----------



## Defoler

Quote:


> Originally Posted by *mtcn77*
> 
> Except, there are working fcat benchmarks everywhere and 970 has NEVER been tested at 4K-SLI. I mean, what are the chances that "all" reviewers turned this idea down - even those that have tested 980 right beside the 970 in the same review?


Are you freaking kidding me?

970 SLI 4K fcat + review
970 + SLI + 4K + Surround


----------



## mtcn77

Quote:


> Originally Posted by *Defoler*
> 
> Are you freaking kidding me?
> 
> 970 SLI 4K fcat + review
> 970 + SLI + 4K + Surround


Nice.
Those aren't 4K FCAT frametime benchmarks; those are average FPS + frametime benchmarks. What matters isn't
a) the average FPS
b) the average frametime
but the ordered-view 99th-percentile FCAT frametime benchmarks, the ones Guru3D has restricted to 2K (like this), and that PCPer has curiously NOT repeated for the 970 the way they did for the 980 in the same benchmark. And please don't play the "they may not have had two 970s at the review launch period" speculation. They did.
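For readers unfamiliar with the metric: the ordered-view 99th-percentile frame time sorts all recorded frame times and reads off the worst-1% boundary, which averages can completely hide. A minimal sketch with an invented frame-time log:

```python
# Minimal sketch of the ordered-view percentile metric: sort the raw
# frame times and read off the 99th percentile. The average can look
# smooth while the tail exposes stutter.
def percentile_frametime(frametimes_ms, pct=99):
    """Frame time (ms) at the given percentile of the ordered view."""
    ordered = sorted(frametimes_ms)
    idx = min(int(len(ordered) * pct / 100), len(ordered) - 1)
    return ordered[idx]

# Invented log: 99 smooth 16.7ms frames plus one 80ms stutter frame.
log = [16.7] * 99 + [80.0]
avg = sum(log) / len(log)
print(f"average: {avg:.1f} ms")                            # looks smooth
print(f"99th percentile: {percentile_frametime(log)} ms")  # exposes the spike
```

That single 80ms frame barely moves the average (17.3ms) but dominates the 99th-percentile figure, which is exactly why the ordered view matters for this kind of issue.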


----------



## Forceman

Quote:


> Originally Posted by *mtcn77*
> 
> Nice.
> Those aren't fcat frametime benchmarks. Those are average fps + frametime benchmarks. What matters isn't
> a- the average fps
> b-the average frametime
> But the ordered view 99p fcat frametime benchmarks, ones that guru3d has refrained themselves to 2k,(like this) and that PcPer has curiously NOT have at least repeated what they did for 980 in the same benchmark. And please don't play the "they may not have had two 970 at the review launch period" speculation. They just did.


Yeah, much more likely that it's a huge conspiracy organized by Nvidia.

Several sites have tested GTX 970 SLI at 4K and none of them noticed any issues visually; which was what started the whole FCAT thing in the first place, the difference between what the user saw and what the numbers showed.

Maybe, just maybe, it's not the Earth-ending catastrophe some people want to make it out to be.


----------



## Redwoodz

Quote:


> Originally Posted by *mtcn77*
> 
> Nice.
> Those aren't 4K fcat frametime benchmarks. Those are average fps + frametime benchmarks. What matters isn't
> a- the average fps
> b-the average frametime
> But the ordered view 99p fcat frametime benchmarks, ones that guru3d has refrained themselves to 2k,(like this) and that PcPer has curiously NOT have at least repeated what they did for 980 in the same benchmark. And please don't play the "they may not have had two 970 at the review launch period" speculation. They just did.


Yes, I find this very curious, since they have sold a million of them, yet no full review. Has anyone asked Ryan Shrout why?


----------



## mtcn77

Quote:


> Originally Posted by *Forceman*
> 
> Yeah, much more likely that it's a huge conspiracy organized by Nvidia.


You know what: it makes sense when you speak your mind.


----------



## MonarchX

Quote:


> Originally Posted by *Forceman*
> 
> Yeah, much more likely that it's a huge conspiracy organized by Nvidia.
> 
> Several sites have tested GTX 970 SLI at 4K and none of them noticed any issues visually; which was what started the whole FCAT thing in the first place, the difference between what the user saw and what the numbers showed.
> 
> Maybe, just maybe, it's not the Earth-ending catastrophe some people want to make it out to be.


No conspiracy, just pure GTX 970 owner PWNAGE, and not the usual OWNAGE, but the French "peenuhjay", just because it's extra bad.


----------



## Silent Scone

Quote:


> Originally Posted by *Forceman*
> 
> Yeah, much more likely that it's a huge conspiracy organized by Nvidia.
> 
> Several sites have tested GTX 970 SLI at 4K and none of them noticed any issues visually; which was what started the whole FCAT thing in the first place, the difference between what the user saw and what the numbers showed.
> 
> Maybe, just maybe, it's not the Earth-ending catastrophe some people want to make it out to be.


+1 but wouldn't waste your breath.


----------



## Forceman

Quote:


> Originally Posted by *Silent Scone*
> 
> +1 but wouldn't waste your breath.


I know, there's no point in trying to be reasonable when video cards are involved.

I do find it amusing that certain people are clamoring for FCAT results, when those same people thought FCAT was a fraud perpetrated by Nvidia specifically to make AMD cards look bad, and widely derided any and all FCAT results when it was introduced.


----------



## Bluemustang

Well, this kinda sucks. I've got SLI Gigabyte G1 970s that I'm in the process of putting under a custom water loop right now (parts arrive mid next week).

As usual, I intended to keep these for one, maybe two generations before reselling them for the new hotness. But how seriously do you think this will affect resale value in the future when I sell these to buy new ones (and the resale value of the water blocks and backplates, for that matter...)?


----------



## mtcn77

Quote:


> Originally Posted by *Forceman*
> 
> I know, there's no point in trying to be reasonable when video cards are involved.
> 
> I do find it amusing that certain people are clamoring for FCAT results, *when those same people thought FCAT was a fraud perpetrated by Nvidia specifically to make AMD card look bad*, and widely derided any and all FCAT results when it was introduced.


What do your lamentations have to do with objective testing? This generation hasn't spared us the run-of-the-mill pomposity, eh?


----------



## MonarchX

Quote:


> Originally Posted by *Pill Monster*
> 
> Hello Vulture. You still banned?




How can I be banned if I am posting?


----------



## Forceman

Quote:


> Originally Posted by *mtcn77*
> 
> What do your lamentations have got to do with an objective testing?


The irony of people asking for objective testing with the exact same tool they ran away from when it was first released.

It's amusing to me.
Quote:


> Originally Posted by *Bluemustang*
> 
> Like usual i intended to keep these for 1 maybe 2 generations before reselling them for the new hotness. But how seriously do you think this will affect resale value in the future when i sell these to buy new ones (and the resale value of the water blocks and back plates for that matter as well....)?


If you are selling on eBay, or Craigslist or something, probably not much at all. Kind of early to tell though, depends on what the "objective" testing finds for actual performance impacts.


----------



## darealist

I just sold my GTX 970 because of this bs and never-ending coil whine. It's cheap for a reason guys. Cheap components for a cheap card. I'll wait for GM200 from here on out.


----------



## Redwoodz

Quote:


> Originally Posted by *Forceman*
> 
> I know, there's no point in trying to be reasonable when video cards are involved.
> 
> I do find it amusing that certain people are clamoring for FCAT results, when those same people thought FCAT was a fraud perpetrated by Nvidia specifically to make AMD card look bad, and widely derided any and all FCAT results when it was introduced.


Well, it does kind of seem FCAT testing has suddenly lost favor since AMD came out with superior Crossfire results after eliminating the need for bridges. You can't have it both ways: either it is very relevant, or it never was. Regardless of which brand I may prefer, I build custom PCs for my customers, who often request Nvidia GPUs. It is my duty to fully investigate all relevant products and their performance in common usage scenarios. I want to know how best to advise my customers.


----------



## rainzor

https://www.youtube.com/watch?v=EIHlX3nyr-M
https://www.youtube.com/watch?v=rxy4Ct7TZjw
https://www.youtube.com/watch?v=OIg01XdK1Z0
https://www.youtube.com/watch?v=ctZ8o27UbUk
https://www.youtube.com/watch?v=eeqJ9ngljoY
https://www.youtube.com/watch?v=kDD9kimnjCg
https://www.youtube.com/watch?v=5r_eezULDro
https://www.youtube.com/watch?v=xbmKik7KPqw

Look at all those poor bastards stuttering. Oh wait... they're not.

But let's make a fuss anyway.


----------



## mtcn77

Quote:


> Originally Posted by *Forceman*
> 
> The irony of people asking for objective testing with the exact same tool they ran away from when it was first released.
> 
> It's amusing to me.


The irony of people saying others ran away from it once, while themselves running away from it now.
It is amusing to me, too, to see an objective tool treated as a heinous backstabbing dagger when in fact it is only a mirror.


----------



## Vesku

Quote:


> Originally Posted by *Forceman*
> 
> Yeah, much more likely that it's a huge conspiracy organized by Nvidia.
> 
> Several sites have tested GTX 970 SLI at 4K and none of them noticed any issues visually; which was what started the whole FCAT thing in the first place, the difference between what the user saw and what the numbers showed.
> 
> Maybe, just maybe, it's not the Earth-ending catastrophe some people want to make it out to be.


The whole point of FCAT was to catch issues that often only a fraction of users would notice. Now that this has been confirmed detailed FCAT analysis is in order.


----------



## Forceman

Quote:


> Originally Posted by *Redwoodz*
> 
> Well, it does kind of seem FCAT testing has suddenly lost favor since AMD came out with superior crossfire results with elimination of the need for bridges. Can't have it both ways, either it is very relevant, or it never was. Regardless of which brand I may prefer, I build custom PC's for my customers, who often request Nvidia GPU's. It is my duty to fully investigate all relevant products and their performance in common usage scenarios. I want to know how best to advise my customers.


How many sites really used it, though? PCPer and maybe Anand? I don't recall it being all that common. It may also be harder to use for 4K testing, which could limit some coverage; I'd assume capturing 4K output requires more robust hardware.
Quote:


> Originally Posted by *mtcn77*
> 
> The irony of people calling others ran away from it once and themselves running away from it now.


I'm not running away from it, I'm just skeptical that there's a conspiracy to not use it specifically to cover up this issue.
Quote:


> Originally Posted by *Vesku*
> 
> The whole point of FCAT was to catch issues that often only a fraction of users would notice. Now that this has been confirmed detailed FCAT analysis is in order.


I fully expect to see more in-depth testing of this now that it's been widely acknowledged, just like there was widespread testing once the AMD Crossfire issue was accepted. Not thinking there is a conspiracy is not the same as not wanting to see testing.

Edit: And in regards to PCPer testing it:
Quote:


> January 24, 2015 | 07:21 PM - Posted by Ryan Shrout
> 
> It's a weekend guys. I was trying to spend it with my family. Sorry to let you down.
> 
> Monday will be here soon!


But also:
Quote:


> January 24, 2015 | 07:20 PM - Posted by Ryan Shrout
> 
> No, we definitely didn't see it in our own 4K testing. As far as I know, no reviewer on the normal circuit I know about has found it before forums and others began to post about it.


----------



## revro

so are we now doing a class-action suit?


----------



## ZealotKi11er

Review sites, even PCPer, don't go out of their way to look for possible problems in every new GPU. When you have something like the GTX 970 and GTX 980, you usually just test the higher-end model and draw conclusions for the GTX 970 from that. There is nothing that can be done for the GTX 970 now. People should just be more careful with their next GPU purchase. Even with this, the GTX 970 is still a great card at a great price.


----------



## mtcn77

Quote:


> Originally Posted by *Forceman*
> 
> Edit: And in regards to PCPer testing it:
> But also:


What is funny is that they avoided reporting the results of 4K SLI tests for the 970, and now claim not to have seen anything.


----------



## Seven7h

Quote:


> Originally Posted by *delboy67*
> 
> A better analogy would be heres our new car it has 300bhp, except when you go over 80mph, then it falls to 80bhp.


No, bad analogy. Only the resources placed in the slower memory segment are slower to access, and those are lower priority by definition. It does not significantly slow your overall performance the way a drop to 80bhp would.


----------



## nleksan

First, I have not owned, neither do I possess any plans to own, any GM204 cards. I have been holding out since selling my trio of Kingpins, the latest cadre of impressively engineered silicon "Game Playing Units" following the less powerful but nevertheless impressive 780 Classifieds, the promising in potential but unfortunately underwhelming 290X Lightning CFX setup, the 680 Lightning 2-way SLI (with the artisan quality Aquacomputer limited edition blocks) that MSI graciously offered as replacement for the two previous GPU's, a pair of 7970 Lightnings (less impressive blocks via EK) both having failed (*read: sudden death with no apparent cause), and my beloved EVGA GTX670 FTW's in their unparalleled HK blocks/backplates (and currently my main cards, having sold most others to permit justification to self for purchase of 2-3x full GM210/GM200 beasts, and as my 680 Lightnings reside in my 3770K benching rig).
So call me a fanboy, I don't care, but do know that my first self bought high end GPU was an X800XT Platinum Edition (to complement the FX-51,yes a $1000 AMD CPU, and actually received over a week before release due to the retailer having confused the date on which sales could begin), and I have continued to spend my own hard earned money on the absolute pinnacle AMD cards each generation not only in terms of the most powerful die but also by exclusively purchasing THE absolute best of the best of custom AIB designs such as the unparalleled (on the Red Front) Lightnings courtesy MSI. If you want to say that I am biased, or that I have no place discussing the relative merits, and downfalls, of each company's flagship Single-GPU cards of each generation (a topic I have broached in other threads, resulting from the relevance to the topic presented in those threads, while my words in this case merely serve as a means to present the simple fact that I have spent, frankly far too much, in my quest to find out for myself what is "better" and in what circumstances each competing architecture's benefits, as well as downsides, reveal themselves; as a scientist I value true, bias-free data above all, but as this forum has come to demonstrate to the point of absurdity, I will sooner find myself engaged in a menage-a-trois with the illustrious Scarlett Johansson and the charming and beautiful Amy Acker, than I am to find consensus on any such topic involving AMD, Intel, or Nvidia; however simply because data obtained from the proper execution of double-blind testing methodology is as valuable as it is borderline-extinct in such a topic as this, does not mean that I don't value anecdotal evidence (remember, anecdotes are at best evidence, and can never prove or disprove a legitimate theory, for as the world exists not in black and white but rather myriad shades of grey and color (although never exceeding 14.7 million for TN users







), so must theory be formed to the world it explores and not the idealized fantasy world specific to each individual), and with that said now so must this, which is that the wall of pseudo anonymity and false security the Internet is so adept at providing calls into question the validity of each and every piece of "evidence" presented through the medium, in which biases are easily disguised, "credibility" is as easily manufactured by the individual themself as it is earned if not easier, and true motives are nigh impossible to discern lest the person slip up in their furious finger tap-dancing to issue a retort/insult/sleight/display of wit or pseudo-intellectual ego-stroking...)

All that said, and to ensure that even those with the poorest of reading comprehension understand in no uncertain terms, I ADMIT that I have a bias towards Nvidia, a preference that has evolved over the past 13 years as the result of not one, or a few, but myriad experiences that are (going by the list of GPU's purchased aka owned by myself from each of the two real players at the table, and while I admit that some is dependent on my admittedly fallible Electrochemical Cerebral Calculation Device, or "brain" in the far less fun common vernacular, I am fairly confident that I have not forgotten but perhaps one or two cards, out of the 70+ total, which (by means of a more traditional calculator, a necessity by way of my severe aversion to anything "math-y", my own GPU history actually slants more towards ATi/AMD than Nvidia by percentage, closer a 60/40 split than 50-50) based not upon my admittedly underdeveloped Google-fu or a belief that "cheaply attained information is of equivalent value to the knowledge acquired by those willing to put forth the time, money, and energy in its pursuit" as has become so unfortunately de-rigeur anytime some poor soul having stumbled across our (otherwise, mostly unparalleled) little slice of heaven makes the unknowing and unfortunate mistake of asking for help deciding between AMD/Intel/Nvidia and expecting to receive unbiased responses in the midst of a respectful discussion between equals who value the opinions and experience of others while simultaneously acknowledging the limits of their own, but is instead assaulted with a virtual onslaught of slander, insults, and based on my experience in the world of hard science, the most dangerous of all: presentation of (cherry-picked or not, but we all know that the former is the sad reality) benchmarks conducted by third parties (usually found by scouring Google Images for the first pretty colored graph that "agrees" with their position, the poster caring naught the fact that should 
said graph require that you scan through twenty or fifty other, "less agreeable" ones, the process has transitioned from the pursuit of existing facts to the manufacture of them, tailored however necessary to fit their "truths"; there exist few things which inspire as much genuine fear as does the constantly growing trend towards this, one of the most grievous abuses of information outside of your local NSA SCIF), including "personal recommendations" made by the poster aligned with their "truth chart", all while having never owned the products in question. I have a firm, unwavering belief that regardless of education, training, and experience specific to addiction, unless you have struggled with this the most misunderstood of diseases, you can never truly understand what it is like to fight so hard internally that you risk performing an inadvertent Corpus Collosotomy, and even as every single thread of your conscious mind screams in protest, you watch on as your hands draw up the dope, tap out air trapped in the barrel, slide the needle through the skin and into a vein, pull back the plunger to get the flash and ensure the hit is registered, and then depress the plunger as the solution enters the bloodstream and is transported through the blood-brain barrier and to the neurons as the warm narcotic embrace welcomes you back and your shame, guilt, and self-loathing are temporarily assuaged, muted, only to come back stronger each time the temporary bliss runs it's course, much less what it's like to know that you are literally going to die but the same thing which will ultimately claim credit for your demise is the agent that makes it nigh impossible to ask for help (the criminalization of a disease every bit as legitimate as cancer or MS is certainly not helpful in encouraging those still struggling with the horrors of addiction to seek treatment when doing so is by necessity an admission of guilt to multiple felony statutes).
Just as an individual can become extremely informed about the disease of chemical abuse and dependance via "2nd/3rd party" sources, yet never have a TRUE understanding of what it is to be an addict, so too can people who present (hypothetically, unbiased; hypothetical because that's so rarely the case that it's more an exercise in the theoretical than the actual reality) a variety of data from only highly regarded sources and in its entirety, present the individual seeking information and knowledge a source of data that can be potentially one of multiple independent resources on which to base further questions, but it cannot be presented as anything more just as neither can it be of any value should it be qualified by the opinion of a person who has no experience with the items in question, for their knowledge base does not extend beyond the quantifiable, the hard numbers which when (properly) collected, organized, and assembled serve to portray a legitimate, but entirely incomplete, picture of a much larger image. While frame time analysis has been a valuable new tool in the dissection and analysis of a finer grained set of numbers, they are still numbers and anyone who believes that they can exist in a vacuum, separated from the experience of actual ownership and use of whatever products are being called into question, is someone whose opinion should be, while not outright dismissed, at least weighed against the opinions of those who can provide experiential information nonexistent to the former parties.
It's a trap all too easy to miss until after the fall, and I will never claim to have successfully avoided falling prey to such "easy thinking" myself in the past, but I also actively watch out for it having realized the implications of such behavior, which is to say that it's far easier to tell others how they should spend their money, than it is to actually spend it yourself and remain silent otherwise.
Yet, until everyone can realize that conflict is not a necessity, nor does it even belong, in a place such as this, a sanctuary for those who share a common interest, passion, and the means for a free exchange of ideas such as cannot exist in remotely the same capacity in "real life" or through any other existing medium, we will instead suffer through a dozen or more insults or presentations of falsehoods/incomplete information and biased half-formed opinions for every nugget of truth. We will continue to watch our community, the single best large PC enthusiasts forum, devolve and eventually (God forbid) become no better than Toms Hardware, so long as we allow ourselves to be split by the most meaningless and inconsequential of allegiances, and those who seemingly exist for no reason other than to stoke the fires of civil unrest in the hopes of an all out "war" regardless of their motivation (although the motive is irrelevant, it's actions that reveal an individuals true character), the faster we drive the nails into the coffin of, and shovel dirt atop, the value our community possesses.

The bottom line is that what was once an "occurrence", back when I just lurked (and stalked Alatar, of course; btw buddy, would you mind checking to see if there are any spy satellites and/or remote surveillance drones that have crashed somewhere in your vicinity? Why do I ask? No reason, and it's definitely completely unrelated to the previous sentence, yup, there's no relation between the, oh, say, 113 expended surveillance devices littered around you, especially not the ones in your bed watching you sleep, their giant eye unblinking...) and anything else I've said, no sir!) and for a while after I completed my initiation (anyone else think it weird that no one was wearing anything under their hooded robes during the "ritualistic chanting" portion of the process, or is it just me?)... However, at some point there was a transition, and unlike before, speculative pandering, fierce post-purchase rationalization, and the discarding of unbiased data balanced with informed commentary and opinions resulting from actual ownership and use of "the item(s)" have become the standard, while many of the attempts to provide such things, especially the last one, result in an outpouring of accusations of bias, defensive posturing, and of course the classic false-ego derived "that's wrong because I experienced something different".
Even if you post in a thread asking "Should I buy X or Y?", and you own both, it's far more likely that those who believe that their posts are at least equal in value, if not significantly more valuable, will use whatever tactics they can to make it look like their ability to post benchmarks favoring the position they maintain (again, with zero actual basis for such a belief) means their having never actually laid out their own money for, or even simply having spent any real time (if any at all) using, the items in question is irrelevant, and heaven forbid the person who has had both settled on the other company's product as then they're clearly a shill, and to someone new to the forums, or even new to the hobby, they're unlikely to be able to see through the layers of FUD being smeared mercilessly, so in essence the people doing the smearing have found a way to continually perpetuate and reinforce their own distorted beliefs about the relative value of information depending upon the source...

Okay, that was far longer than I intended, but it (mostly, at least) comes back full circle to the issue at hand...

Namely:
If you own AMD, or dislike Nvidia, you are not benefitting ANYONE by posting here, this topic is NOT a place for AMD's Mutual Mastur...err, Admiration Society to congregate, it's not a place to bash Nvidia, it's not a place to spout nonsensical theories nor for those who have no knowledge of the topic to fluff up their own ego by espousing theories regarding the semiconductor architecture as fact despite having no knowledge of chip design outside of what can be copied and pasted from Wikipedia, it's not the place to try to make yet another "lol AMD roxxorz, Nvidia suxxorz ROFL-copter *stupid meme* U g0t [email protected], N00b!" 'argument' (we get it, that exact argument is flawless and I bow to your unquestionably superior intellect and mastery of the English language's finest and most subtle nuances)...

Whether this is a real issue or not, The Great VRAM Panic of 2015, remains to be seen. Given a little time, conclusive evidence will emerge, and the issue can either be laid to rest or everyone can power up to Level 3 Bickering and Level 5 "Convinced of Ability to Dictate Corporate Policy from Behind a Monitor on a Forum" powers.

It's not happened yet, so please, those who have no ability or intent to contribute in a purely constructive manner, I will refrain from pointing any fingers but will say that most of this group is obvious, PLEASE gracefully refrain from pushing your agenda and if you absolutely must continue participating in a discussion that you have no real reason to be in, then refrain from posting anything unless it legitimately moves the thread forward. Talking about rival companies, their products, or simply taking cheap shots at EITHER company are exactly what DO NOT belong in here! (frankly, anything less than respectful civil discussion doesn't belong on the forum period, but I have not seen that stop people...).

Fin.


----------



## tsm106

Call me when its class action time. I want my money back.


----------



## tpi2007

There's quite a few questions that still need answering:

1. If the benchmark people are calling bugged can't properly access the last 0.5 GB on the GTX 970, but can on the GTX 980, is that because Nvidia's drivers aren't recognizing the benchmark and applying their mitigation routines to it? I have a hard time seeing why this would be the benchmark's problem rather than the drivers'. The benchmark just assumes the normal case; it's up to the drivers to re-route its requests to conform to the card's specific layout. Otherwise there would be no universal interface, because every card is different, and the burden would fall on developers to change their code every time a new card comes out. It seems obvious that the specific memory access configuration, and the way it's managed, are the issue here.

2. How much slower is access to the last 0.5 GB, and how does it impact performance when resolution and quality settings stay the same and only texture quality changes? 4K SLI would be the worst-case scenario to test.

3. Since some people are claiming their cards are not affected, do we actually know whether Nvidia fuses off the disabled cores in exactly the same way on every GTX 970? I'd say no, as that would imply every faulty GM204 die has defects in exactly the same place. For chips that are fully functional but needed to fill the GTX 970 quota, I would bet they fuse off the extra cores in the most optimal way possible. The question is: does this make any difference? Could some cards be affected more than others? After all, until yesterday nobody outside Nvidia, including the tech press, knew there were two memory sections with the first having higher-priority access, so what else don't we know before we can draw an objective conclusion?
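For question 2, anyone can probe this themselves with the same chunked allocate-and-time approach the community benchmark making the rounds uses: allocate VRAM in fixed-size blocks and measure copy bandwidth into each one, watching for a drop in the blocks that land in the final 0.5 GB. Here is a minimal sketch of that methodology in Python, using host RAM as a stand-in for VRAM (a real test would need CUDA; the chunk size and count are arbitrary placeholders):

```python
import time

def measure_chunk_bandwidth(chunk_mb=128, n_chunks=8):
    """Allocate memory in fixed-size chunks and time a bulk write into each,
    returning (chunk_index, MB_per_second) pairs.

    In the CUDA version of this test on a GTX 970, bandwidth reportedly
    falls off for the chunks that land in the final 0.5 GB segment; this
    host-RAM sketch only demonstrates the measurement methodology."""
    chunk_bytes = chunk_mb * 1024 * 1024
    pattern = b"\xff" * chunk_bytes
    chunks, results = [], []
    for i in range(n_chunks):
        buf = bytearray(chunk_bytes)   # stands in for a cudaMalloc of one chunk
        t0 = time.perf_counter()
        buf[:] = pattern               # stands in for the timed device-side copy
        elapsed = time.perf_counter() - t0
        chunks.append(buf)             # keep chunks alive so allocations accumulate
        results.append((i, chunk_mb / elapsed))
    return results

if __name__ == "__main__":
    for idx, bw in measure_chunk_bandwidth():
        print(f"chunk {idx}: {bw:,.0f} MB/s")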


----------



## iTurn

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *nleksan*
> 
> *nleksan's full post, quoted in full above, snipped for length*
> 
> Fin.






Wall of text crits you for 10,000... you die


----------



## tsm106

I guess he is in love with reading his own wall of text.


----------



## Ramzinho

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *nleksan*
> 
> First, I have not owned, neither do I possess any plans to own, any GM204 cards. I have been holding out since selling my trio of Kingpins, the latest cadre of impressively engineered silicon "Game Playing Units" following the less powerful but nevertheless impressive 780 Classifieds, the promising in potential but unfortunately underwhelming 290X Lightning CFX setup, the 680 Lightning 2-way SLI (with the artisan quality Aquacomputer limited edition blocks) that MSI graciously offered as replacement for the two previous GPU's, a pair of 7970 Lightnings (less impressive blocks via EK) both having failed (*read: sudden death with no apparent cause), and my beloved EVGA GTX670 FTW's in their unparalleled HK blocks/backplates (and currently my main cards, having sold most others to permit justification to self for purchase of 2-3x full GM210/GM200 beasts, and as my 680 Lightnings reside in my 3770K benching rig).
> So call me a fanboy, I don't care, but do know that my first self bought high end GPU was an X800XT Platinum Edition (to complement the FX-51,yes a $1000 AMD CPU, and actually received over a week before release due to the retailer having confused the date on which sales could begin), and I have continued to spend my own hard earned money on the absolute pinnacle AMD cards each generation not only in terms of the most powerful die but also by exclusively purchasing THE absolute best of the best of custom AIB designs such as the unparalleled (on the Red Front) Lightnings courtesy MSI. If you want to say that I am biased, or that I have no place discussing the relative merits, and downfalls, of each company's flagship Single-GPU cards of each generation (a topic I have broached in other threads, resulting from the relevance to the topic presented in those threads, while my words in this case merely serve as a means to present the simple fact that I have spent, frankly far too much, in my quest to find out for myself what is "better" and in what circumstances each competing architecture's benefits, as well as downsides, reveal themselves; as a scientist I value true, bias-free data above all, but as this forum has come to demonstrate to the point of absurdity, I will sooner find myself engaged in a menage-a-trois with the illustrious Scarlett Johansson and the charming and beautiful Amy Acker, than I am to find consensus on any such topic involving AMD, Intel, or Nvidia; however simply because data obtained from the proper execution of double-blind testing methodology is as valuable as it is borderline-extinct in such a topic as this, does not mean that I don't value anecdotal evidence (remember, anecdotes are at best evidence, and can never prove or disprove a legitimate theory, for as the world exists not in black and white but rather myriad shades of grey and color (although never exceeding 14.7 million for TN users
> 
> 
> 
> 
> 
> 
> 
> ), so must theory be formed to the world it explores and not the idealized fantasy world specific to each individual), and with that said now so must this, which is that the wall of pseudo anonymity and false security the Internet is so adept at providing calls into question the validity of each and every piece of "evidence" presented through the medium, in which biases are easily disguised, "credibility" is as easily manufactured by the individual themself as it is earned if not easier, and true motives are nigh impossible to discern lest the person slip up in their furious finger tap-dancing to issue a retort/insult/sleight/display of wit or pseudo-intellectual ego-stroking...)
> 
> All that said, and to ensure that even those with the poorest of reading comprehension understand in no uncertain terms, I ADMIT that I have a bias towards Nvidia, a preference that has evolved over the past 13 years as the result of not one, or a few, but myriad experiences that are (going by the list of GPU's purchased aka owned by myself from each of the two real players at the table, and while I admit that some is dependent on my admittedly fallible Electrochemical Cerebral Calculation Device, or "brain" in the far less fun common vernacular, I am fairly confident that I have not forgotten but perhaps one or two cards, out of the 70+ total, which (by means of a more traditional calculator, a necessity by way of my severe aversion to anything "math-y", my own GPU history actually slants more towards ATi/AMD than Nvidia by percentage, closer a 60/40 split than 50-50) based not upon my admittedly underdeveloped Google-fu or a belief that "cheaply attained information is of equivalent value to the knowledge acquired by those willing to put forth the time, money, and energy in its pursuit" as has become so unfortunately de-rigeur anytime some poor soul having stumbled across our (otherwise, mostly unparalleled) little slice of heaven makes the unknowing and unfortunate mistake of asking for help deciding between AMD/Intel/Nvidia and expecting to receive unbiased responses in the midst of a respectful discussion between equals who value the opinions and experience of others while simultaneously acknowledging the limits of their own, but is instead assaulted with a virtual onslaught of slander, insults, and based on my experience in the world of hard science, the most dangerous of all: presentation of (cherry-picked or not, but we all know that the former is the sad reality) benchmarks conducted by third parties (usually found by scouring Google Images for the first pretty colored graph that "agrees" with their position, the poster caring naught the fact that should 
> said graph require that you scan through twenty or fifty other, "less agreeable" ones, the process has transitioned from the pursuit of existing facts to the manufacture of them, tailored however necessary to fit their "truths"; there exist few things which inspire as much genuine fear as does the constantly growing trend towards this, one of the most grievous abuses of information outside of your local NSA SCIF), including "personal recommendations" made by the poster aligned with their "truth chart", all while having never owned the products in question. I have a firm, unwavering belief that regardless of education, training, and experience specific to addiction, unless you have struggled with this the most misunderstood of diseases, you can never truly understand what it is like to fight so hard internally that you risk performing an inadvertent Corpus Collosotomy, and even as every single thread of your conscious mind screams in protest, you watch on as your hands draw up the dope, tap out air trapped in the barrel, slide the needle through the skin and into a vein, pull back the plunger to get the flash and ensure the hit is registered, and then depress the plunger as the solution enters the bloodstream and is transported through the blood-brain barrier and to the neurons as the warm narcotic embrace welcomes you back and your shame, guilt, and self-loathing are temporarily assuaged, muted, only to come back stronger each time the temporary bliss runs it's course, much less what it's like to know that you are literally going to die but the same thing which will ultimately claim credit for your demise is the agent that makes it nigh impossible to ask for help (the criminalization of a disease every bit as legitimate as cancer or MS is certainly not helpful in encouraging those still struggling with the horrors of addiction to seek treatment when doing so is by necessity an admission of guilt to multiple felony statutes).
> Just as an individual can become extremely informed about the disease of chemical abuse and dependance via "2nd/3rd party" sources, yet never have a TRUE understanding of what it is to be an addict, so too can people who present (hypothetically, unbiased; hypothetical because that's so rarely the case that it's more an exercise in the theoretical than the actual reality) a variety of data from only highly regarded sources and in its entirety, present the individual seeking information and knowledge a source of data that can be potentially one of multiple independent resources on which to base further questions, but it cannot be presented as anything more just as neither can it be of any value should it be qualified by the opinion of a person who has no experience with the items in question, for their knowledge base does not extend beyond the quantifiable, the hard numbers which when (properly) collected, organized, and assembled serve to portray a legitimate, but entirely incomplete, picture of a much larger image. While frame time analysis has been a valuable new tool in the dissection and analysis of a finer grained set of numbers, they are still numbers and anyone who believes that they can exist in a vacuum, separated from the experience of actual ownership and use of whatever products are being called into question, is someone whose opinion should be, while not outright dismissed, at least weighed against the opinions of those who can provide experiential information nonexistent to the former parties.
> It's a trap all too easy to miss until after the fall, and I will never claim to have successfully avoided falling prey to such "easy thinking" myself in the past, but I also actively watch out for it having realized the implications of such behavior, which is to say that it's far easier to tell others how they should spend their money, than it is to actually spend it yourself and remain silent otherwise.
> Yet, until everyone can realize that conflict is not a necessity, nor does it even belong, in a place such as this, a sanctuary for those who share a common interest, passion, and the means for a free exchange of ideas such as cannot exist in remotely the same capacity in "real life" or through any other existing medium, we will instead suffer through a dozen or more insults or presentations of falsehoods/incomplete information and biased half-formed opinions for every nugget of truth. We will continue to watch our community, the single best large PC enthusiasts forum, devolve and eventually (God forbid) become no better than Toms Hardware, so long as we allow ourselves to be split by the most meaningless and inconsequential of allegiances, and those who seemingly exist for no reason other than to stoke the fires of civil unrest in the hopes of an all out "war" regardless of their motivation (although the motive is irrelevant, it's actions that reveal an individuals true character), the faster we drive the nails into the coffin of, and shovel dirt atop, the value our community possesses.
> 
> [snip: the remainder of the quoted post is reproduced in full above]

that wall of text







Chill out, Nick... don't feed the trolls. People will be the same; fanboys are everywhere, no more, no less.

On the other hand, I think Nvidia didn't expect a GPU targeted at 1080p to use more than 3.5GB of VRAM, but they neglected the fact that many people would go SLI 970 instead of a single 980, and this is where the trouble started. If people really want to give it a shot, why not try crippling a 980 to match a 970's performance and see if the VRAM behavior is really different?


----------



## Baghi

Quote:


> Originally Posted by *darealist*
> 
> I just sold my GTX 970 because of this bs and never-ending coil whine. It's cheap for a reason guys. Cheap components for a cheap card. I'll wait for GM200 from here on out.


So true. ASUS also has 970 Strix and 980 Strix with very different coolers on them.

Premium:


Cheap:


----------



## Seven7h

Quote:


> Originally Posted by *Redwoodz*
> 
> Well, it does kind of seem FCAT testing has suddenly lost favor since AMD came out with superior crossfire results with elimination of the need for bridges. Can't have it both ways, either it is very relevant, or it never was. Regardless of which brand I may prefer, I build custom PC's for my customers, who often request Nvidia GPU's. It is my duty to fully investigate all relevant products and their performance in common usage scenarios. I want to know how best to advise my customers.


If anything, removing bridges hurts FCAT. You need to transfer an extra screen-resolution-sized buffer (large) over PCIe rather than reading it directly from the source by MUXing which GPU the display reads rendered image data from.


----------



## Forceman

Quote:


> Originally Posted by *Seven7h*
> 
> If anything, removing bridges hurts fcat. You need to transfer an extra screen res sized buffer (large) over PCIE rather than read it directly from the source by MUXing the GPU that the display reads rendered image data from.


Not sure what you mean. FCAT pulls the data from the video card output, just like what goes to the monitor, and into a capture card. Not sure how bridge/no-bridge affects FCAT testing itself.


----------



## nleksan

Quote:


> Originally Posted by *tsm106*
> 
> I guess he is in love with reading his own wall of text.


Or maybe I believe that forum quality will only plummet if people refuse to show any respect for others, and I very much don't want it to decline?
You are welcome to disagree with me, and I would absolutely be interested in a civil conversation about any points of contention, or about why you disagree with my requests/hopes, instead of the derogatory/insulting response posted...

Do you really think that "respect" and "decency" are too much to ask, let alone requests that deserve to be mocked?


----------



## Seven7h

Quote:


> Originally Posted by *Forceman*
> 
> Not sure what you mean. FCAT pulls the data from the video card output, just like what goes to the monitor, and into a capture card. Not sure how bridge/no-bridge affects FCAT testing itself.


Delaying the display-out in order to transfer a resource over PCIE is prone to introducing stutter. You can't display from it until the transfer is complete. It's logically better to just display it out of the source directly and switch which GPU is the source each frame.


----------



## Forceman

Quote:


> Originally Posted by *Seven7h*
> 
> Delaying the display-out in order to transfer a resource over PCIE is prone to introducing stutter. You can't display from it until the transfer is complete. It's logically better to just display it out of the source directly and switch which GPU is the source each frame.


That's inherent in Crossfire, though; it has nothing to do with FCAT itself. The output comes out of one video card no matter what it is connected to (monitor or FCAT capture card).

On a related note, I wonder what capture card they are using for 4K testing? The one they had before was only up to 2560x1600 @ 60 FPS.
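For readers unfamiliar with how FCAT works: the tool overlays a per-frame color bar on the rendered output, and the capture-card video is then scanned to count how many scanlines each color actually occupied on screen. A minimal sketch of that analysis step (illustrative only, with made-up function and parameter names; this is not the actual FCAT toolchain):

```python
# Sketch of FCAT-style frame-time extraction: each rendered frame is tagged
# with a color bar, and the captured output is scanned scanline by scanline.
# Consecutive scanlines of the same color belong to the same displayed frame.

def frame_times_ms(scanline_colors, scanlines_per_refresh=1080, refresh_ms=16.67):
    """Collapse a stream of per-scanline color tags into per-frame display times."""
    runs = []
    for color in scanline_colors:
        if runs and runs[-1][0] == color:
            runs[-1][1] += 1          # same frame still on screen
        else:
            runs.append([color, 1])   # a new frame began scanning out
    ms_per_scanline = refresh_ms / scanlines_per_refresh
    return [(color, count * ms_per_scanline) for color, count in runs]
```

A frame whose color survives for only a handful of scanlines is what this kind of analysis flags as a "runt" frame, which is why it can surface stutter that plain FPS counters miss.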


----------



## Dry Bonez

maybe someone can drop some knowledge to me for what i am about to say.....

Ok,i understand whats going on here and feel bad for those affected. BUT is this type of RAM the same as almost all other kinds of RAM? for example, RAM as in DDR3 RAM, if you get 8gb,you dont actually use 8gb,it says 7.xx usable. Same with tablets,phones,and everything i can think of with RAM. So IF this is true,what is the big issue about? Another thing i wana point out, well ask really. What if this card was equipped with 3.5gb? would the same issue occur and only use 3gb instead of 3.5gb?

Please drop some knowledge because i am trying to understand why people are fighting about "they should have said something" . YET when you buy an SSD/HDD of, lets say, 120gb. We all know we dont use that much. so please enlighten me on the BIGGER picture.


----------



## Seven7h

Quote:


> Originally Posted by *Forceman*
> 
> That's inherent in Crossfire though, it has nothing to do with FCAT itself. The output comes out of one video card no matter what it is connected to (monitor or FCAT capture card).
> 
> On a related note, I wonder what capture card they are using for 4K testing? The one they had before was only up to 2560x1600 @ 60 FPS.


That was my point. It would hurt FCAT results, if anything. I didn't mean FCAT was wrong or invalid.

Then again, it's the same size transfer every frame, so it probably wouldn't cause stutter. It just seems inefficient, though it's true there is a ton of PCIe bandwidth to spare.
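The "same size transfer every frame" point is easy to sanity-check with back-of-the-envelope numbers (assumed figures: 4 bytes per pixel and roughly 15.75 GB/s of usable PCIe 3.0 x16 bandwidth):

```python
# Rough cost of shipping one finished frame over the bus instead of displaying
# it directly from the rendering GPU. Assumed: 32-bit pixels, ~15.75 GB/s link.

def transfer_ms(width, height, bytes_per_pixel=4, link_gb_per_s=15.75):
    """Milliseconds of bus time to move one frame buffer over PCIe."""
    frame_bytes = width * height * bytes_per_pixel
    return frame_bytes / (link_gb_per_s * 1e9) * 1000

# A 4K frame is ~33 MB, i.e. roughly 2 ms of bus time out of a 16.7 ms
# frame budget at 60 Hz - noticeable, but far from saturating the link.
```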


----------



## Forceman

Quote:


> Originally Posted by *Dry Bonez*
> 
> maybe someone can drop some knowledge to me for what i am about to say.....
> 
> Ok,i understand whats going on here and feel bad for those affected. BUT is this type of RAM the same as almost all other kinds of RAM? for example, RAM as in DDR3 RAM, if you get 8gb,you dont actually use 8gb,it says 7.xx usable. Same with tablets,phones,and everything i can think of with RAM. So IF this is true,what is the big issue about? Another thing i wana point out, well ask really. What if this card was equipped with 3.5gb? would the same issue occur and only use 3gb instead of 3.5gb?
> 
> Please drop some knowledge because i am trying to understand why people are fighting about "they should have said something" . YET when you buy an SSD/HDD of, lets say, 120gb. We all know we dont use that much. so please enlighten me on the BIGGER picture.


It's not the same as system RAM, pretty much the full 4GB should be available - and it is still available with these cards, it just can't be accessed as efficiently. If they had made it a 3.5GB card (through the BIOS I guess, you couldn't put physical memory chips on to make it that way) you would still have 3.5GB available. It isn't a "take away from what is there" kind of problem, it is a "part of the RAM isn't connected the same way" problem.
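The "higher priority access" behavior Nvidia describes can be pictured with a toy allocator that fills the fast partition first and only spills into the slow one afterwards (purely illustrative, with invented sizes and names; the real driver's placement logic is not public):

```python
# Toy model of the segmented-VRAM placement described in Nvidia's statement:
# serve allocations from the fast 3.5 GB partition first, spill to the slow
# 0.5 GB partition only when the fast one is full. Sizes are in MB.

FAST_MB, SLOW_MB = 3584, 512

def place_allocations(requests_mb):
    fast_used, slow_used, placement = 0, 0, []
    for size in requests_mb:
        if fast_used + size <= FAST_MB:
            fast_used += size
            placement.append("fast")
        elif slow_used + size <= SLOW_MB:
            slow_used += size
            placement.append("slow")
        else:
            placement.append("evicted")  # a real driver would page over PCIe
    return placement
```

This also shows why monitoring tools topped out at ~3.5GB in most games: nothing touches the second partition until the first one is exhausted.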


----------



## gamervivek

If FCAT is an Nvidia creation then its objectivity is in doubt, though it is interesting that it would now incriminate its own maker.

As for that wall of text: it's about ethics in hardware journalism.


----------



## sage101

It seems like the AMD guys are rejoicing as if they have won a battle, but the real battle to be won is AMD bringing out a much faster card than Nvidia. The Nvidia guys seem to be doing some serious damage control; a lot of 970 owners are experiencing problems when using more than 3.5GB of VRAM, so there's definitely an issue here and not some conspiracy. Anyway, anybody willing to sell their gimped 970, link me up; I'm prepared to pay $250, since you guys think it's worth much less than the original price.


----------



## PontiacGTX

Quote:


> Originally Posted by *Forceman*
> 
> It's not the same as system RAM, pretty much the full 4GB should be available - and it is still available with these cards, it just can't be accessed as efficiently. If they had made it a 3.5GB card (through the BIOS I guess, you couldn't put physical memory chips on to make it that way) you would still have 3.5GB available. It isn't a "take away from what is there" kind of problem, it is a "part of the RAM isn't connected the same way" problem.


Well, the OS uses VRAM as well (with Aero).


----------



## rdr09

Quote:


> Originally Posted by *nleksan*
> 
> Or maybe I believe that the forum quality is going to only plummet if people refuse to show any respect for others, and I very much don't want the forum quality to decline?
> You are welcome to disagree with me, and I would absolutely be interested in a civil conversation about any points of contention or why you disagree with my requests/hopes! Instead of the derogatory/insulting response posted...
> 
> Do you really think that "respect" and "decency" are too much to ask, let alone requests that deserve to be mocked?


Let us stay focused on the topic at hand.


Spoiler: Warning: Spoiler!


----------



## ZealotKi11er

Quote:


> Originally Posted by *PontiacGTX*
> 
> well OS uses VRAM aswell (with aero)


Does that mean games in Windows 8.1 use less vRAM by default?


----------



## PontiacGTX

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Does that mean games in Windows 8.1 use less vRAM by default?


I haven't tried Win 8.1, but I found this.

How does Windows 8.1 allocate VRAM versus Windows 7? I don't know.

But look at this:

http://forums.evga.com/GTX-670-FTW-3way-SLI-Scaling-Review-at-5760x1080-NEW-Overclock-Comparison-m1625084.aspx
post 17


----------



## Noufel

I think the people hurt most are the ones who went the 970 SLI way to run games at 4K resolution, and the people who "upgraded" from a 780 for the 1GB of VRAM difference.


----------



## Forceman

Quote:


> Originally Posted by *PontiacGTX*
> 
> I havent tried Win 8.1
> 
> but I found this
> 
> 
> 
> how does Windows 8.1 allocate the vram versus Windows 7?
> I dont know


I wonder what happens when you run a windowed game? I would assume that when you go full-screen that memory gets dumped, but in windowed mode I wonder if it is still allocated?


----------



## Gilles3000

Quote:


> Originally Posted by *nleksan*
> 
> Or maybe I believe that the forum quality is going to only plummet if people refuse to show any respect for others, and I very much don't want the forum quality to decline?
> You are welcome to disagree with me, and I would absolutely be interested in a civil conversation about any points of contention or why you disagree with my requests/hopes! Instead of the derogatory/insulting response posted...
> 
> Do you really think that "respect" and "decency" are too much to ask, let alone requests that deserve to be mocked?


Fanboys are going to stay fanboys; they're always going to throw their excrement at each other. You're better off simply ignoring them. They're just as bad as the extremely religious people versus the extreme atheists.

I don't get why people keep taking that "I don't like what you like, therefore I must insult you" stance, though. And even less why people are so bound to a single company that they feel the need to defend it.

So can we just ignore the fanboys and trolls now?


----------



## Menta

Damn, the NV forum is ON FIRE.

I would like, and even expect, NVIDIA to launch a trade-in program; I would probably get a 980.


----------



## Noufel

Quote:


> Originally Posted by *Menta*
> 
> dam the NV FORUM IS ON FIRE
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I Would like and even expect NVIDIA to launch a trade in program i would probably get a 980


noooooooooooooooooooooooo you have to pay the full price for the 980 like me


----------



## Menta

Quote:


> Originally Posted by *Noufel*
> 
> noooooooooooooooooooooooo you have to pay the full price for the 980 like me


LOL, I think it's the least they could do, but I hate waiting. I would be willing to pay the extra amount, but not full price.

Option 1: RMA and then money back, but I hate waiting.

Option 2: sell the card and get a 980 (not willing to lose too much money).

Or wait and see.


----------



## Forceman

Quote:


> Originally Posted by *Menta*
> 
> I Would like and even expect NVIDIA to launch a trade in program i would probably get a 980


I doubt it. This wasn't an unexpected error like the P67 bug; this was a design choice they made knowingly. There's no way they are going to do a recall.

Edit: although I guess it would depend on how much pressure the vendors put on them.


----------



## Menta

If the stores receive the cards back for RMA in big numbers, that would trigger something, but I guess most people won't bother; they're just shouting.


----------



## tsm106

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Menta*
> 
> I Would like and even expect NVIDIA to launch a trade in program i would probably get a 980
> 
> 
> 
> I doubt it. This wasn't an unexpected error like the P67 bug, *this was a design choice they made knowingly.* There's no way they are going to do a recall.
> 
> Edit: although I guess it would depend on how much pressure the vendors put on them.
Click to expand...

I would suspect this is something the Feds would get involved in, because they set out to mislead and deceive consumers. Intel was able to recall and replace. I wonder how Nvidia will handle this.


----------



## tweezlednutball

I just don't understand why people can't wait a few months for AMD's 300 series. Even if they do decide to go with Nvidia at that point, prices are surely going to be slashed.


----------



## ZealotKi11er

Quote:


> Originally Posted by *tweezlednutball*
> 
> I just dont understand why people just cant wait a few months for amd's 300 series. even if they do decide to go with nVidia at that point in time, prices are surely going to be slashed.


A lot of people don't consider AMD, so there is no point in waiting.


----------



## Forceman

Quote:


> Originally Posted by *tsm106*
> 
> I would suspect this is something the FEDs would get involved in because they set out to mislead and deceive consumers. Intel was able to recall and replace. I wonder how Nv will handle this.


Not disclosing the internal workings of the memory subsystem is not the same as intentionally misleading or deceiving. I think anyone waiting for a government response is going to be waiting a long time.

The card has the full 4GB, and I doubt there's much specificity in how things like memory bandwidth have to be reported (under what conditions, with what tests, etc.). There's a fair amount of wiggle room in marketing materials.


----------



## michaelius

Quote:


> Originally Posted by *tweezlednutball*
> 
> I just dont understand why people just cant wait a few months for amd's 300 series. even if they do decide to go with nVidia at that point in time, prices are surely going to be slashed.


http://videocardz.com/52302/amd-radeon-r9-390x-cooling-pictured

AMD hinted at a new GPU release on September 12th, which means anyone who decided to wait would be facing a six- to nine-month wait.


----------



## Seven7h

Quote:


> Originally Posted by *PureBlackFire*
> 
> thanks for the info but you can keep the sarcasm.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> this could be an engine issue as it occurs at regular intervals. the vram usage does go over 3500mb each time the stutter occurs. like I said though, BF4 is the only game it happens in. everything else is either fine or borderline unplayable anyway.


There was absolutely zero sarcasm in my statement... I'm confused...


----------



## tpi2007

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I would suspect this is something the FEDs would get involved in because they set out to mislead and deceive consumers. Intel was able to recall and replace. I wonder how Nv will handle this.
> 
> 
> 
> Not disclosing the internal workings of the memory subsystem is not the same as intentionally misleading or deceiving. I think anyone waiting for a government response is going to be waiting a long time.
Click to expand...

Quote:


> However the 970 has a different configuration of SMs than the 980, *and fewer crossbar resources to the memory system.* To optimally manage memory traffic in this configuration, we segment graphics memory into a 3.5GB section and a 0.5GB section. The GPU has higher priority access to the 3.5GB section. When a game needs less than 3.5GB of video memory per draw command then it will only access the first partition, and 3rd party applications that measure memory usage will report 3.5GB of memory in use on GTX 970, but may report more for GTX 980 if there is more memory used by other commands. When a game requires more than 3.5GB of memory then we use both segments.


As I said in the previous thread, the first part of the sentence does not necessarily imply the second (bolded) part. People who bought an Intel hexacore are getting all the memory throughput to the quad channel memory interface, they don't have to buy the octacore model to get that.

Also, "higher priority access" usually means 'faster' compared to lower priority access, which must correspond to throughput in some way; that may very well mean you don't actually get the 224 GB/s they state on their site when accessing all the VRAM.

Also, their wording (last sentence quoted) implies that they have to do some driver work in order to use the second segment ("we use"). Combine that with the benchmark's failure to address it properly and it seems there is some hand tuning going on, which raises the question of the card's longevity: for how long will they keep doing that fine tuning? Not to mention what the performance penalty really is (FCAT plus minimum framerates at the same resolution but different texture quality, which is something many games will start emphasizing within the next year, making people question this card's value proposition).
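For what it's worth, a quick back-of-envelope calculation shows how much a slow tail segment could drag down effective bandwidth. Every figure here is an assumption for illustration; nobody outside Nvidia has published the real speed of the 0.5GB segment:

```python
# Back-of-envelope only: what happens to effective bandwidth if the
# last 0.5 GB is slower than the rest. Every figure here is an
# assumption for illustration, not a measured GTX 970 value.
def blended_bandwidth(fast_gb, fast_gbps, slow_gb, slow_gbps):
    """Time-weighted bandwidth for one pass over the whole pool (GB/s)."""
    total_gb = fast_gb + slow_gb
    total_time = fast_gb / fast_gbps + slow_gb / slow_gbps
    return total_gb / total_time

# 3.5 GB at the advertised 224 GB/s, 0.5 GB at a hypothetical quarter speed
print(round(blended_bandwidth(3.5, 224.0, 0.5, 56.0), 1))  # -> 162.9
```

Even if the slow segment only ran at a quarter of the advertised speed, a full sweep of all 4GB would average well under the 224 GB/s on the spec sheet, which is exactly the kind of thing FCAT-style testing should be able to surface.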


----------



## tsm106

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I would suspect this is something the FEDs would get involved in because they set out to mislead and deceive consumers. Intel was able to recall and replace. I wonder how Nv will handle this.
> 
> 
> 
> Not disclosing the internal workings of the memory subsystem is not the same as intentionally misleading or deceiving. I think anyone waiting for a government response is going to be waiting a long time.
> 
> It has the full 4GB and I doubt there's much specificity in how you have to report things like memory bandwidth (what conditions, what tests, etc). Fair amount of wiggle room in marketing materials.
Click to expand...

I think all we need is common sense and fair expectations. It isn't expecting too much to get 4GB of VRAM that operates exactly like the first 3.5GB portion, is it? They knew that wasn't possible, however, and did not disclose that information. There are consumer protection laws against this sort of seediness.


----------



## PontiacGTX

Quote:


> Originally Posted by *Forceman*
> 
> Not disclosing the internal workings of the memory subsystem is not the same as intentionally misleading or deceiving. I think anyone waiting for a government response is going to be waiting a long time.
> 
> It has the full 4GB and I doubt there's much specificity in how you have to report things like memory bandwidth (what conditions, what tests, etc). Fair amount of wiggle room in marketing materials.


you can check the performance drop
http://www.tomshardware.com/reviews/geforce-gtx-660-ti-memory-bandwidth-anti-aliasing,3283-11.html


----------



## Tivan

Quote:


> Originally Posted by *tpi2007*
> 
> As I said in the previous thread, the first part of the sentence does not necessarily imply the second (bolded) part. People who bought an Intel hexacore are getting all the memory throughput to the quad channel memory interface, they don't have to buy the octacore model to get that.
> 
> Also, "higher priority access" usually means 'faster' compared to the lower priority access, which must have some correspondence to throughput, which may very well mean that you actually don't get 224 GB/s when accessing all the VRAM, which is what they state on their site.
> 
> Also, their wording (last sentence quoted) implies that they have to do some driver work in order to use the second segment - "we use", so combine that with the benchmark's failure to address it properly and it seems that there is some hand tuning going on, which raises the question of the card's longevity - until when will they do that fine tuning. Not to mention what the performance penalty really is (FCAT + min framerates at same resolution, but different textures quality, which is something that many games will start emphasizing within the next year, making people question this card's value proposition).


What's important to highlight, to me, is that despite the read-between-the-lines implications, they entirely avoid making a clear statement about access speed to the last 0.5GB of RAM.

So if you feel your FPS is worse than reviews would imply when your VRAM usage is above 3.5GB, Nvidia is pretty much in 'no comment' mode right now.

edit: and I would definitely RMA the card in that case. (well if I had a need for it~)


----------



## sugalumps

"Dont go for the 980 over the 970, it's a waste", " The 970 is the perfect card at every price point", "970 this 970 that".









Sucks that this has happened, but I am so glad I grabbed the 980 and I am guessing a lot of others are feeling the same way now.


----------



## Defoler

Quote:


> Originally Posted by *tpi2007*
> 
> As I said in the previous thread, the first part of the sentence does not necessarily imply the second (bolded) part. People who bought an Intel hexacore are getting all the memory throughput to the quad channel memory interface, they don't have to buy the octacore model to get that.
> 
> Also, "higher priority access" usually means 'faster' compared to the lower priority access, which must have some correspondence to throughput, which may very well mean that you actually don't get 224 GB/s when accessing all the VRAM, which is what they state on their site.
> 
> Also, their wording (last sentence quoted) implies that they have to do some driver work in order to use the second segment - "we use", so combine that with the benchmark's failure to address it properly and it seems that there is some hand tuning going on, which raises the question of the card's longevity - until when will they do that fine tuning. Not to mention what the performance penalty really is (FCAT + min framerates at same resolution, but different textures quality, which is something that many games will start emphasizing within the next year, making people question this card's value proposition).


As we don't see any gaming effect outside a forced CUDA benchmark, and as the card has 4GB which is fully accessible, and without any full information regarding what is fast, prioritised or whatever, everything is completely speculative.

You want to test SLI 4K FCAT? No one is stopping anyone. It's completely open and free to be tested.
You want to return the card even if, performance-wise, you got what you paid for (no one promised 100fps at 4K with MSAA x10 and DSR x100, and yeah, I'm exaggerating)? You are free to do so. The vendor does not necessarily have to accept it as long as there is nothing wrong with the card.

I have yet to see a single actual proof of an actual issue outside the benchmark. And we have seen games at full 4GB usage on a single card without the sudden frame drops that would correspond to the so-called problem, or anything which could indicate a problem with the card's longevity. No one promised the card would stay competitive against everything else on the market forever, either.

People are overdoing it, in extreme ways, while most owners are fine with their card and it performs to their liking.


----------



## Menta

Any PC enthusiast digs into specs and tries to build the best system they possibly can while weighing cost and performance; discovering they got a "glitch" or were somehow cheated is kind of a letdown at the end of the day...

That's my feeling, anyway.


----------



## Tivan

Quote:


> Originally Posted by *Defoler*
> 
> As we don't see any gaming effect outside a forced CUDA benchmark, and as the card has 4GB which are fully accessible, and without any full information regarding what is far, prioritised or whatever, everything is completely speculative.
> 
> You want to test SLI 4K FCAT? No one is stopping anyone. Its completely open and free to be tested.
> You want to return the card even if performance wise you get what you paid for (no one promised 100fps at 4K with msaax10 and DSR x100, and yeah, I'm exaggerating)? You are free to do so. The vendor does not necessarily accept it as long as there is nothing wrong with the card.
> 
> I have yet to see a single actual proof outside the benchmark of an actual issue. And we have seen 4GB full usage games not suddenly dropping frames with a single card to correspond with the so called problem or something which can indication of a problem about the card's longevity. No one also promised the card to be competitive vs everything else on the market forever.
> 
> People are over doing it, an extreme ways, while most owners are fine with their card and it performs to their liking.


I think this issue is nothing to be worried about either. People who experience a performance falloff with over 3.5GB usage should just RMA!

We still need a thread and some coverage about this, so people can isolate issues more easily, should they be related to the RAM.


----------



## tpi2007

Quote:


> Originally Posted by *Defoler*
> 
> As we don't see any gaming effect outside a forced CUDA benchmark, and as the card has 4GB which are fully accessible, and without any full information regarding what is far, prioritised or whatever, everything is completely speculative.
> 
> You want to test SLI 4K FCAT? No one is stopping anyone. Its completely open and free to be tested.
> You want to return the card even if performance wise you get what you paid for (no one promised 100fps at 4K with msaax10 and DSR x100, and yeah, I'm exaggerating)? You are free to do so. The vendor does not necessarily accept it as long as there is nothing wrong with the card.
> 
> I have yet to see a single actual proof outside the benchmark of an actual issue. And we have seen 4GB full usage games not suddenly dropping frames with a single card to correspond with the so called problem or something which can indication of a problem about the card's longevity. No one also promised the card to be competitive vs everything else on the market forever.
> 
> People are over doing it, an extreme ways, while most owners are fine with their card and it performs to their liking.


As I also asked previously, there's another question we may not know the answer to: does Nvidia fuse off the disabled cores in exactly the same way on every GM204 die? I wouldn't think so, as that would imply all the faulty chips are damaged in exactly the same area. So, having said that, does cutting cores here or there make any difference performance-wise? How many crossbars are available depending on where they cut? Nobody knows. What I know is that some people are claiming they don't have problems and others claim otherwise. And that is strange.

As Anandtech said:

http://www.anandtech.com/show/8931/nvidia-publishes-statement-on-geforce-gtx-970-memory-allocation
Quote:


> If nothing else, what we've learned today is that *we know less than we thought we did*, and that's never a satisfying answer.


----------



## Forceman

Quote:


> Originally Posted by *PontiacGTX*
> 
> you can check the performance drop
> http://www.tomshardware.com/reviews/geforce-gtx-660-ti-memory-bandwidth-anti-aliasing,3283-11.html


Totally different situation, and those results may have no bearing on what happens in this case.

Edit: Actually, what are you trying to show there? That memory bandwidth affects high-VRAM-use performance? Because I think we all know that, the question is how much.

Bottom line, until we see more testing, or get more details from Nvidia, it's impossible to know how big a deal this is. But by all means, let's jump to conclusions and make wild accusations in the meantime.


----------



## michaelius

Quote:


> Originally Posted by *sugalumps*
> 
> "Dont go for the 980 over the 970, it's a waste", " The 970 is the perfect card at every price point", "970 this 970 that".
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sucks that this has happened, but I am so glad I grabbed the 980 and I am guessing a lot of others are feeling the same way now.


You paid $200 for 20% more performance and 0.5GB more RAM.


----------



## gamervivek

Quote:


> Originally Posted by *Defoler*
> 
> As we don't see any gaming effect outside a forced CUDA benchmark, and as the card has 4GB which are fully accessible, and without any full information regarding what is far, prioritised or whatever, everything is completely speculative.
> 
> You want to test SLI 4K FCAT? No one is stopping anyone. Its completely open and free to be tested.
> You want to return the card even if performance wise you get what you paid for (no one promised 100fps at 4K with msaax10 and DSR x100, and yeah, I'm exaggerating)? You are free to do so. The vendor does not necessarily accept it as long as there is nothing wrong with the card.
> 
> I have yet to see a single actual proof outside the benchmark of an actual issue. And we have seen 4GB full usage games not suddenly dropping frames with a single card to correspond with the so called problem or something which can indication of a problem about the card's longevity. No one also promised the card to be competitive vs everything else on the market forever.
> 
> People are over doing it, an extreme ways, while most owners are fine with their card and it performs to their liking.


The CUDA benchmark came into existence because people were seeing it in games. If the 980 loads up 4GB while the 970 doesn't, are you sure there won't be any differences?


----------



## Mad Pistol

Those of you who did buy the GTX 970 should probably just keep the card and enjoy it. It's still phenomenally fast and inexpensive for what it can do. You're going to lose money in the long run if you try to sell it and buy an AMD card. For the price you paid, it's still a really good card.


----------



## dean_8486

This really sucks... I was playing Star Citizen at 1440p maxed out and getting stuttering every 30 seconds or so; checking MSI Afterburner, my VRAM hits about 3580MB usage and I get a big dip in GPU usage, which in turn causes the stuttering. I would have got a 980 if I had known this! The card is as good as useless if you want stutter-free gaming in Star Citizen and newer high-res texture games going forward.
I hope this issue can be resolved in some way or form, or I will be moving back to AMD next time. Not good, Nvidia...


----------



## Menta

Quote:


> Originally Posted by *Mad Pistol*
> 
> Those of you that did buy the GTX 970, you should probably just keep the card and enjoy it. The card is still phenomenally fast and inexpensive for what it can do. You're going to lose money in the long run if you try and sell it then buy an AMD card. For the price you paid for the card, it's still a really good card.


AMD is always off the table; I just don't like their software on my PC.
Quote:


> Originally Posted by *Mad Pistol*
> 
> Those of you that did buy the GTX 970, you should probably just keep the card and enjoy it. The card is still phenomenally fast and inexpensive for what it can do. You're going to lose money in the long run if you try and sell it then buy an AMD card. For the price you paid for the card, it's still a really good card.


That's not the point; the card is not reaching its full potential, clocks etc. There is a problem within the memory management.


----------



## gamervivek

I hope they can change the behavior via drivers; apparently OpenGL doesn't have a problem with memory allocation, but the DirectX drivers seem stricter.


----------



## Forceman

Quote:


> Originally Posted by *gamervivek*
> 
> I hope they can change the behavior via drivers, apparently opengl doesn't have a problem with memory allocation, the directx drivers seem more strict though.


Sounds like they already are managing it via drivers, by not using the segmented section unless absolutely needed.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> Sounds like they already are managing it via drivers, by not using the segmented section unless absolutely needed.


The GTX 970, even as a 3.5GB card, is more than fast enough that most people couldn't tell the difference from a true 4GB card. 4GB just looks a lot better to buyers when the competition also has 4GB.


----------



## dean_8486

Not true, 500MB could be the difference between smooth and stuttering gaming.


----------



## ZealotKi11er

Quote:


> Originally Posted by *dean_8486*
> 
> Not true 500mb could be the difference between smooth and stuttering gaming


Yes, but how many people will it affect? There are only some games and some high resolutions that will push usage past 3.5GB.


----------



## gamervivek

Someone posted that the behavior was different on an earlier driver; not sure what to make of it. Would they give reviewers more memory, or would they go with more fps if the bandwidth is lower for that segment? Assuming they were right, of course.

This one looks like GTX 970 usage, where the game engine supposedly fills up the whole VRAM regardless of settings:
Quote:


> Even on the highest AA setting and 4K resolution we see the maximum VRAM usage at 3575MB. For those using different GPUs and lower or higher VRAM the numbers might vary, as the game utilizes as much as it can, as seen on the HD resolution.


http://www.hardwarepal.com/wp-content/uploads/2014/11/COD-Advanced-Warfare-Video-ram-Usage.jpg

It might not lead to a drop in fps or a stutter, but it has to have an impact somewhere, and if it affects objects on screen or some effects, then expect more drama when somebody starts going on about it.


----------



## Forceman

Depends on what the VRAM is being used for. If the game is just caching extra assets when using 4GB over 3.5GB then it might not make any difference. If that extra VRAM is actually needed though...

I don't know how much of the 4GB games really need, as opposed to how much they just use because it is available. Another reason why we need to wait for more testing.


----------



## Deluxe

I think the best we got out of this thread was the massive wall of text that no one read.


----------



## givmedew

Quote:


> Originally Posted by *hht92*
> 
> That's why i waited for my 780 1 year (yea yea i took her 4 months before 900 series), cause the new product isn't always the best product.


The new product might not always be the best, in the sense that it's not better than a 780 Ti, but it is certainly better than a 780 and by far a better value than both.

The card performs rock solid, is almost as fast as a 780 Ti, costs $330, and has 4GB of RAM... so what if applications report the RAM usage wrong? If a game doesn't need 4GB, it only uses what it needs.


----------



## thegreatsquare

Quote:


> Originally Posted by *Deluxe*
> 
> I think the best we got out of this thread was the massive wall of text that noone read.


Yes, there is that.

As for me, I have two opposing ideas in my head.

1: People are telling me that the test is flawed and there isn't a problem to worry about.

2: Nvidia makes a statement basically admitting to some ghetto-rigging on the 970.


----------



## gamervivek

Quote:


> Originally Posted by *thegreatsquare*
> 
> Yes, there is that.
> 
> As for me as have two opposing ideas in my head.
> 
> 1: People are telling me that the test is flawed and there isn't a problem to worry about.
> 
> 2: Nvidia makes a statement basically admitting to some ghetto-rigging on the 970.


The test-maker has confirmed (or so I read in a post) that the test is swapping to system RAM for that 0.5GB of memory, and that's why the values are so low. So his program can't really access that CUDA memory, and we don't know how much the hit to bandwidth is there, if any. Again, second-hand information.

It's possible that nvidia is keeping it separate just because it makes their drivers run better that way, or makes it simpler to code for.

But they should've been upfront about it: the 52 effective ROPs and now this. As much as I love being an nvidia hateboy, there are simply too many cards out there already to wish this turned into a real problem for buyers.
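For context, the benchmark being discussed works by allocating VRAM in fixed-size chunks and timing access to each one; a slow tail shows up as a sharp bandwidth drop past a certain offset. Here's a minimal sketch of that inference step, with entirely made-up bandwidth numbers:

```python
# Sketch of how a chunked VRAM test can locate a slow segment:
# allocate fixed-size blocks, measure each one's bandwidth, and report
# the offset where throughput collapses. All numbers here are made up.
def find_slow_boundary(chunk_mb, bandwidths_gbps, drop_ratio=0.5):
    """Offset in MB of the first chunk measuring below drop_ratio
    times the first chunk's bandwidth, or None if there is no drop."""
    baseline = bandwidths_gbps[0]
    for i, bw in enumerate(bandwidths_gbps):
        if bw < drop_ratio * baseline:
            return i * chunk_mb
    return None

# 28 fast chunks then 4 slow ones, 128 MB each (4096 MB total)
measured = [150.0] * 28 + [22.0] * 4
print(find_slow_boundary(128, measured))  # -> 3584
```

The point of the sketch is that the boundary falls out of the measurements themselves; whether the drop reflects the card's memory layout or swapping to system RAM, as claimed above, is exactly what's still unclear.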


----------



## Cyclonic

Wonder if the 780 6GB editions have the same sort of problem.


----------



## givmedew

Quote:


> Originally Posted by *sugalumps*
> 
> "Dont go for the 980 over the 970, it's a waste", " The 970 is the perfect card at every price point", "970 this 970 that".
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sucks that this has happened, but I am so glad I grabbed the 980 and I am guessing a lot of others are feeling the same way now.


*I'm fairly certain most people who bought a 970 are still happy that they didn't make the mistake of buying a 980 for a tiny little performance bump.*
Quote:


> Originally Posted by *michaelius*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugalumps*
> 
> "Dont go for the 980 over the 970, it's a waste", " The 970 is the perfect card at every price point", "970 this 970 that".
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sucks that this has happened, but I am so glad I grabbed the 980 and I am guessing a lot of others are feeling the same way now.
> 
> 
> 
> You paid 200$ for 20% more performance and 0,5GB ram more
Click to expand...

*You don't get 20% more performance with a 980, so yeah, it is kind of a rip-off. For the price difference it should have had 6GB or more, so that those running two or more cards had enough memory to match the performance of SLI.*
Quote:


> Originally Posted by *dean_8486*
> 
> This really sucks... I was playing star citizen 1440p maxed out and I am getting stuttering every 30 seconds or so, checking MSI Afterburner my VRAM hits about 3580mb usage and I get a big dip in GPU usage, which in turn causes the stuttering. I would have got a 980 if I knew this! The card is as good as useless if you want stutter free gaming in Star Citizen and newer high res texture games going forward.
> I hope this issue can be resolved in some way or form, or I will be moving back to AMD next time, not good Nvidia...


I will have to check this out on the 970 I bought for my brother, but my guess is you are using anti-aliasing, so just turn it off... The bottom line is that if, as you people are saying, 500MB is the difference between fail and not fail, then the 970 and 980 are both TOTAL FAILURES!!!! Because if 3.5GB isn't enough for a 970, then 4GB isn't enough for two or even three 980s... you would need more memory, since those cards could easily handle higher anti-aliasing and graphics settings than a single 980.

I will be paying attention to this since I pay for and maintain my little brother's gaming rig... but I am not worried about it at all. All of his games rock at 1600p.


----------



## sugalumps

Quote:


> Originally Posted by *michaelius*
> 
> You paid 200$ for 20% more performance and 0,5GB ram more


The irony being that it probably was a waste for me, since I have yet to go over 3GB of VRAM at 1440p. On a single card you are never going to use 4GB of VRAM without running out of raw GPU power first, unless you are fine with 40fps territory, because that is where you will be if you are running all those textures and AA options.

That is the best part about all of this: 99% of 970 users would never have known or even found out about this, but now that they have been told, they are freaking out. It sucks what has happened, but most of them will never hit 3.5GB of VRAM and above, except the people running two in SLI with all the options at max.


----------



## nleksan

Quote:


> Originally Posted by *Deluxe*
> 
> I think the best we got out of this thread was the massive wall of text that noone read.


Not sure if sarcasm or..?

But I should perhaps change my user title to something that appropriately reflects my tendency towards meandering expository text....


----------



## gamervivek

Quote:


> Originally Posted by *Cyclonic*
> 
> Wonder if the 780 6 gig editions got the same sort of problems


If the 3GB version didn't, then the 6GB one shouldn't.
It has some asymmetry too, though not like the GTX 970's.
Quote:


> So the actual pixels/cycle peak rate when you look at all the limits (rasterizers/SMs/ROPs) would be :
> 
> GTX 750 : 16/16/16
> GTX 750 Ti : 16/20/16
> GTX 760 : 32/24/32 or 24/24/32 (as there are 2 die configuration options)
> GTX 770 : 32/32/32
> GTX 780 : 40/48/48 or 32/48/48 (as there are 2 die configuration options)
> GTX 780 Ti : 40/60/48
> GTX 970 : 64/52/64
> GTX 980 : 64/64/64
> 
> Extra ROPs are still useful to get better efficiency with MSAA and so. But they don't participate in the peak pixel fillrate.


http://techreport.com/blog/27143/here-another-reason-the-geforce-gtx-970-is-slower-than-the-gtx-980
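The table quoted above implies the 970's peak pixel rate is SM-limited at 52 pixels/clock rather than ROP-limited. A tiny sketch of that arithmetic (treating peak fillrate as min(limits) × clock is my simplification; 1114 and 1126 MHz are the published base clocks):

```python
# The quoted limits say the GTX 970's rasterizers and ROPs can push 64
# pixels/clock but the SMs only 52, so the SMs set the ceiling.
# Peak fillrate = min(stage limits) * clock is a simplification;
# 1114 / 1126 MHz are the published GTX 970 / 980 base clocks.
def peak_fillrate_gpix(raster_px, sm_px, rop_px, clock_mhz):
    """Peak pixel fillrate in Gpixels/s, bounded by the narrowest stage."""
    return min(raster_px, sm_px, rop_px) * clock_mhz / 1000.0

print(peak_fillrate_gpix(64, 52, 64, 1114))  # GTX 970 -> 57.928 Gpix/s
print(peak_fillrate_gpix(64, 64, 64, 1126))  # GTX 980 -> 72.064 Gpix/s
```

Which is consistent with TechReport's point: the extra ROPs still help MSAA efficiency, but they don't raise the peak rate.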


----------



## Silent Scone

Quote:


> Originally Posted by *Forceman*
> 
> Depends on what the VRAM is being used for. If the game is just caching extra assets when using 4GB over 3.5GB then it might not make any difference. If that extra VRAM is actually needed though...
> 
> I don't know how much of the 4GB games really need, as opposed to how much they just use because it is available. Another reason why we need to wait for more testing.


Another reason why we need DX12 more like.


----------



## Seven7h

Quote:


> Originally Posted by *Silent Scone*
> 
> Another reason why we need DX12 more like.


This has nothing to do with the API and is not impacted by it.

Even under DX12, GPU implementations and their drivers, along with Windows, will decide how memory is managed.

Also, today in DX11, developers already know what their consumption looks like, and how much is allocated vs used each frame, or each command buffer.


----------



## error-id10t

Not wanting to cross-post, but I doubt everyone here is also reading the Nvidia section; this is what I see in BF4. Hope it helps someone.

http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/630#post_23457783

Secondly, I remember this test (PCI-E bandwidth test (cuda)):

http://forums.evga.com/tm.aspx?m=1972266

Post #24 shows my results back when I used 670 SLI (SC+ 4GB) cards. I believe I was using a 4770K but otherwise the setup is the same. Comparing those numbers to what I see now on my 970s:



Otherwise the same, except bidirectional speeds have gone up. I didn't know what this did specifically back then and still don't, but that's what I see. Curious if anyone else could take a look and maybe compare with a 980.


----------



## TheBlindDeafMute

I don't think this is as big of a deal as people are making it out to be. That being said, it was a bit shady. Still an excellent card, though. Chances are most people bought the 970 for 1080p, for which 3.5GB should indeed be plenty.


----------



## tweezlednutball

Quote:


> Originally Posted by *ZealotKi11er*
> 
> A lot of people dont consider AMD so there is no point on waiting.


I guess people really are that closed-minded and stubborn. But like I said, prices will fall, so why the hell not wait? This is, IMO, one of those times you actually wait it out for a bit.


----------



## meowth2

People are going to jump on this.


----------



## Quasimojo

Maybe I'm just not as well educated on the inner workings of a GPU as some. It would appear to me that the card was advertised with 4GB of memory, it comes with 4GB of memory, and it can utilize all 4GB of that memory. It doesn't seem to me that its being segmented or not has much effect on the 970 being able to efficiently utilize all of it. This is something we've seen several times in the past: the less-than-flagship cards being held back by lower bandwidth, occasionally to the point of not being able to effectively utilize all the memory on board. I remember seeing the math at one point showing something like how a 128-bit card couldn't possibly utilize all of 2GB of on-board memory at once, yet at the time we were seeing those cards with 2GB and even 3GB flying off the shelves.

The fact remains that the 970 is still a marvel that we've not seen very often at that price point. The people who thought there was no reason to pony up the extra scratch for a 980 when a 970 was so much cheaper and just about as fast were deluding themselves. No such thing as a free lunch.
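That old back-of-envelope argument about narrow buses goes roughly like this (the bus width, memory speed, and frame rate below are illustrative, not any specific card's specs):

```python
# Rough version of "the math": memory bandwidth caps how much VRAM a
# GPU can even touch in one frame. The 128-bit bus and 6 Gbps
# effective GDDR5 speed here are illustrative, not a specific card.
def max_gb_touched_per_frame(bus_bits, effective_gbps, fps):
    """Upper bound on GB of memory readable/writable in one frame."""
    bandwidth_gb_s = (bus_bits / 8) * effective_gbps  # bytes/transfer * GT/s
    return bandwidth_gb_s / fps

# 128-bit * 6 Gbps = 96 GB/s -> at 60 fps, ~1.6 GB per frame at best
print(round(max_gb_touched_per_frame(128, 6.0, 60), 2))  # -> 1.6
```

At ~96 GB/s, a 60fps frame can only touch about 1.6GB even in the theoretical best case, so most of a 2-3GB pool on such a card effectively serves as an asset cache rather than data the GPU works through every frame.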


----------



## criminal

Quote:


> Originally Posted by *Quasimojo*
> 
> Maybe I'm just not as well educated on the inner workings of a gpu as some. It would appear to me that the card was advertised with 4GB memory, it comes with 4GB of memory and it can utilize all 4GB of that memory. It doesn't seem to me that its being segmented or not would have much effect on the 970 being able to efficiently utilize all of it. This is something we've seen several times in the past - the "less than flagship" cards being held back by a lower bandwidth, occasionally causing them not to be able to effectively utilize all the memory on-board. I remember seeing the math at one point showing something like how a 128-bit card couldn't possibly utilize all of 2GB of on-board memory at once, yet at the time we were seeing those cards with 2GB and even 3GB flying off the shelves.
> 
> *The fact remains that the 970 is still a marvel that we've not seen very often at that price point. The people who thought there was no reason to pony up the extra scratch for a 980 when a 970 was so much cheaper and just about as fast were deluding themselves. No such thing as a free lunch.*


Lol... okay. Nvidia didn't do anything wrong at all. Consumers just expect too much.


----------



## ckool

I am not new to the computing world but I am new to this forum... just wondering how many people actually work for nvidia here lol, some of the comments are just unbelievable...


----------



## Leopard2lx

Quote:


> Originally Posted by *ckool*
> 
> I am not new to the computing world but i am new to this forum.... just wondering how many prople actually work for nvidia here lol,some of the comments is just unbelievable....


There are just as many NVIDIA employees here as there are AMD.


----------



## damric

Quote:


> Originally Posted by *ckool*
> 
> I am not new to the computing world but i am new to this forum.... just wondering how many prople actually work for nvidia here lol,some of the comments is just unbelievable....


It's been a boring last few months, so all of us trolls are happy to have something to argue about.


----------



## eddyg

This is quite surprising. I was thinking of getting a GTX 970 for a new build, but maybe I will save up some more money and get the GTX 980 instead. Do you guys know if the GTX 980 is affected by this?


----------



## Forceman

No, the 980 is not affected.


----------



## Quasimojo

Quote:


> Originally Posted by *criminal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Quasimojo*
> 
> Maybe I'm just not as well educated on the inner workings of a gpu as some. It would appear to me that the card was advertised with 4GB memory, it comes with 4GB of memory and it can utilize all 4GB of that memory. It doesn't seem to me that its being segmented or not would have much effect on the 970 being able to efficiently utilize all of it. This is something we've seen several times in the past - the "less than flagship" cards being held back by a lower bandwidth, occasionally causing them not to be able to effectively utilize all the memory on-board. I remember seeing the math at one point showing something like how a 128-bit card couldn't possibly utilize all of 2GB of on-board memory at once, yet at the time we were seeing those cards with 2GB and even 3GB flying off the shelves.
> 
> *The fact remains that the 970 is still a marvel that we've not seen very often at that price point. The people who thought there was no reason to pony up the extra scratch for a 980 when a 970 was so much cheaper and just about as fast were deluding themselves. No such thing as a free lunch.*
> 
> 
> 
> Lol... okay. Nvidia didn't do anything wrong at all. Comsumers just except to much.
Click to expand...

They've never marketed product shortcomings in the past (none of the GPU manufacturers have), so why are we getting all b-hurt about it now? I just don't see why it's suddenly some kind of evil conspiracy.









Everyone was thrilled with their 970s, until someone pushed the card's limits and posted results that *appeared* to be a big problem, but turned out to simply be an architecture quirk. You *still* have 4GB of memory, and the 970 is *still* using it all. You get some hiccups when pushing it to its limits? Why would that surprise anyone about a GTX x70 series card? It's still the best bang-for-the-buck second-fiddle GPU nVidia has produced in a long time.


----------



## nSone

For all the "official" info we've got for now, it isn't the 980 - that card seems to have all the needed SMs to drive its 4GB of VRAM properly.
- It's the 970's inner workings that remain unclear for now... hope we see a non-dismissive explanation of the issue from nVidia real soon.


----------



## nSone

Quote:


> Originally Posted by *Quasimojo*
> 
> ...
> turned out to simply be an architecture quirk.
> ...


That's exactly the part that ruins the party... at least for me it does, since I want to know how big a quirk it is.
It ain't really bang-for-the-buck if it affects the performance, endurance, etc. of the card.
For me it doesn't matter which GPU manufacturer does it; I take it as an underestimation of consumers' engagement with the product. It's not like we all go around switching cards every month or so.


----------



## darkwizard

Quote:


> Originally Posted by *Quasimojo*
> 
> They've never marketed product shortcomings in the past (none of the GPU manufacturers have), why are we getting all b-hurt about it now? I just don't see why it's suddenly some kind of evil conspiracy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Everyone was thrilled with their 970's, until someone pushed its limits and posted results that *appeared* to be a big problem, but turned out to simply be an architecture quirk. You *still* have 4GB of memory, and the 970 is *still* using it all. You get some hiccups when pushing it to its limits? Why would that surprise anyone from a GTX x70 series card?? It's still the best bang for the buck second fiddle GPU nVidia has produced in a long time.


After reading pretty much the whole thread, there are variables that some are omitting:

- The 970 for now seems OK, with games not really pushing the envelope on VRAM at 1080p, but as developers get more comfortable with newer cards that have 4GB or more, how will the 970 perform then? - long-term ramifications
- Console ports.
- Some might have gone through quite a bit of struggle to save up for the card, only to find out it doesn't work the way it was "advertised" - I do feel bad for those who feel somewhat cheated (regardless of whether they ever hit the performance ceiling/issue)
- Not everyone has the cash flow to upgrade every 6-12 months for newer cards.
- 4K gaming - I remember seeing quite a few posts (not just on OCN but on different sites) that would say, "Why a 980? Get 2 970s for just $100 more and better performance for 4K"
- We'll see how the 970 behaves once The Witcher 3 comes out


----------



## Quasimojo

Quote:


> Originally Posted by *nSone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Quasimojo*
> 
> ...
> turned out to simply be an architecture quirk.
> ...
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *nSone*
> 
> that's exactly the part that ruins the party... at least for me it does, since I want to know how big a quirk it is
> 
> Click to expand...
Click to expand...

The proof's in the pudding. Look at the benchmarks. The 970s have been kicking butt. How big a quirk could it be?

Quote:


> Originally Posted by *nSone*
> 
> it ain't really a bang-for-the-buck if it affects the performance, endurance etc. of the card


Yes, it's still a lot of bang for the buck. Again, look at the benchmarks (unless the prevailing theory is that they must have all been faked due to pressure from nVidia). And I can't think of any sane reason to think this has any effect on the endurance of the card.

Quote:


> Originally Posted by *nSone*
> 
> for me it doesn't matter which GPU manufacturers does it, I take it as an understatement of consumers engagement with the product, it's not like we all go around switching cards every month or so


Ok, I don't understand how that plays into this. Why do you think people will need to switch cards?


----------



## doomlord52

Quote:


> Originally Posted by *Quasimojo*
> 
> They've never marketed product shortcomings in the past (none of the GPU manufacturers have), why are we getting all b-hurt about it now? I just don't see why it's suddenly some kind of evil conspiracy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Everyone was thrilled with their 970's, until someone pushed its limits and posted results that *appeared* to be a big problem, but turned out to simply be an architecture quirk. You *still* have 4GB of memory, and the 970 is *still* using it all. You get some hiccups when pushing it to its limits? Why would that surprise anyone from a GTX x70 series card?? It's still the best bang for the buck second fiddle GPU nVidia has produced in a long time.


It doesn't matter if the card continues to work fine, or that the card can in fact use all 4GB of the VRAM - the problem is, the card is flawed in a way that makes it no longer meet its advertised performance spec by a large margin.

On Nvidia's own website page for the 970, they state that the card has 4GB at 224GB/s. It does not say 3.5GB at 224GB/s + 0.5GB at 22GB/s. It claims that all 4GB are accessible at 224GB/s. While most consumers probably didn't go out of their way to make decisions based entirely around memory access speed, the fact still stands - Nvidia advertised it as having 4GB (and ALL 4GB) of the memory available at 224GB/s.

It now appears that this claim (all 4GB accessible at 224GB/s) is no longer true. Whether or not Nvidia was aware of this at launch is irrelevant - the product no longer meets the advertised spec. Hence, the product was (knowingly or unknowingly) sold under false pretenses, which is illegal in many if not most countries of the world.
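For what it's worth, the headline numbers follow from simple bus arithmetic. A quick sketch (the 32-bit-wide path for the 0.5GB segment is an assumption for illustration, not a published figure):

```python
# Theoretical peak bandwidth of a GDDR5 bus: width (bits) / 8 * effective data rate.
# The 970's advertised spec is a 256-bit bus at 7 Gbps effective.

def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps=7.0):
    """Theoretical peak bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(256))  # 224.0 -- the advertised full-bus figure
# If (hypothetically) the 0.5GB segment sat behind a single 32-bit channel:
print(peak_bandwidth_gbs(32))   # 28.0 -- still above the ~22 GB/s the buggy benchmark reports
```

Nothing here is a measurement of the actual card; it only shows where a hypothetical "one channel" figure would land relative to the benchmark's output.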


----------



## nSone

Well, for one thing, if I buy a GPU that has 4GB of VRAM declared on its specification chart, I want that to be exactly what I get for the money I give.
Take a moment and consider this situation - I personally don't care about games; I actually desperately need a GPU for rendering, so when considering the 970 over the 780, one of the main arguments in favor would be the amount of VRAM.
It simply changes the equation, not to go on about the other performance effects we still don't know about because of this thing you call a quirk.


----------



## nSone

Quote:


> Originally Posted by *doomlord52*
> 
> It doesn't matter if the card continues to work fine, or that the card can in fact use all 4GB of the VRAM - the problem is, the card is flawed in a way that makes it no longer meet its advertised performance spec by a large margin.
> 
> On Nvidia's own website page for the 970, they state that the card has 4GB at 224gb/s. It does not say 3.5gb at 224gb/s + 0.5gb at 22gb/s. It claims that all 4gb run are accessible at 224gb/s. While most consumers probably didn't go out of their way to make decisions based entirely around memory access speed, the fact still stands - Nvidia advertised it as having 4gb (and ALL 4gb) of the memory being available at 224gb/s.
> 
> It now appears that this claim (all 4gb accessible at 224gb/s) is no longer true. Weather or not Nvidia was aware of this at launch is irrelevant - the product no longer meets advertised spec. Hence, the product (knowingly or unknowingly) was sold under false pretense; which is illegal in many/most countries of the world.
> .


^this
I don't know how it's not clear by now.
All over the net people go on about personal judgements of their gaming experience, AMD this, Nvidia that, etc., while ignoring the fact that by now it's clear a product is being sold under false pretenses. It looks like it's already confirmed, but no explanation has been provided other than "it still gives you fps, so yeah... fps".


----------



## Defoler

Quote:


> Originally Posted by *tpi2007*
> 
> As I also asked previously, there's another question that we may not know about: does Nvidia fuse off the other cores in exactly the same way on the GM204 die ? I wouldn't think so as that would imply all the faulty chips to be damaged in exactly the same area. So, having said that, does cutting cores here or there make any difference performance wise ? How many crossbars are available depending on where they cut off ? Nobody knows. What I know is that some people are claiming that they don't have problems and others claim otherwise. And that is strange.
> 
> As Anandtech said:
> 
> http://www.anandtech.com/show/8931/nvidia-publishes-statement-on-geforce-gtx-970-memory-allocation


Why does it matter?
Are we all GPU engineers who need to know which SM on which part of which segment is being disabled?
I have a feeling Nvidia did not make one card with 40 crossbars and one with 24 and one with 90. If all cards perform similarly at the end of the day, I see no issue at all.
Do you know how AMD is doing it with the 280X? How they will do it with the 380X or whatever its name is going to be?
You don't. And you never will.
Quote:


> Originally Posted by *gamervivek*
> 
> The CUDA benchmark came into existence because people were seeing it in games. If the 980 loads up 4GB while the 970 doesn't, are you sure there won't be any differences?


That is false.
People saw the problem only after the CUDA benchmark appeared. No one was complaining about stuttering and 4K / DSR / SLI running at 10 fps with extremely low performance before it came into existence.
Cards were performing as you could see in reviews of similar cases. This issue was born once the benchmark came to be.
There were no threads, no "omg my card doesn't perform as it should!", no "my card's memory doesn't function!" issues.


----------



## Quasimojo

Quote:


> Originally Posted by *doomlord52*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Quasimojo*
> 
> They've never marketed product shortcomings in the past (none of the GPU manufacturers have), why are we getting all b-hurt about it now? I just don't see why it's suddenly some kind of evil conspiracy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Everyone was thrilled with their 970's, until someone pushed its limits and posted results that *appeared* to be a big problem, but turned out to simply be an architecture quirk. You *still* have 4GB of memory, and the 970 is *still* using it all. You get some hiccups when pushing it to its limits? Why would that surprise anyone from a GTX x70 series card?? It's still the best bang for the buck second fiddle GPU nVidia has produced in a long time.
> 
> 
> 
> It doesn't matter if the card continues to work fine, or that the card can in fact use all 4GB of the VRAM - the problem is, the card is flawed in a way that makes it no longer meet its advertised performance spec by a large margin.
> 
> On Nvidia's own website page for the 970, they state that the card has 4GB at 224gb/s. It does not say 3.5gb at 224gb/s + 0.5gb at 22gb/s. It claims that all 4gb run are accessible at 224gb/s. While most consumers probably didn't go out of their way to make decisions based entirely around memory access speed, the fact still stands - Nvidia advertised it as having 4gb (and ALL 4gb) of the memory being available at 224gb/s.
> 
> It now appears that this claim (all 4gb accessible at 224gb/s) is no longer true. Weather or not Nvidia was aware of this at launch is irrelevant - the product no longer meets advertised spec. Hence, the product (knowingly or unknowingly) was sold under false pretense; which is illegal in many/most countries of the world.
> .
Click to expand...

I didn't read the entire thread, so I may have missed it. If it is, indeed, true that the last 512MB of memory is only accessible at 22GB/s, then I guess I can see where some may have a real problem with that. Still, it's not much different from the neutering that GPU makers have been doing to their lesser parts for decades now, just to hit a particular price point.

Quote:


> Originally Posted by *Defoler*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gamervivek*
> 
> The CUDA benchmark came into existence because people were seeing it in games. If the 980 loads up 4GB while the 970 doesn't, are you sure there won't be any differences?
> 
> 
> 
> That is false.
> People saw the problem after the CUDA benchmark was alive. No one was complaining about stuttering and low 4K / DSR / SLI at 10 fps with extreme low performance after this came into existence.
> Cards were performing as you could see on reviews in similar cases. This issue was born once the benchmark came to be.
> There are no threads, no "omg my card doesn't perform as it should!" no "my card's memory doesn't function!" issues.
Click to expand...

That's my point as well. How many people who are up in arms had no reason to think they had a problem, until someone told them they did? Again, all one has to do is look at the benchmarks and reviews. Scoreboard. I can tell you that I could not care less how much memory, how many stream processors or shaders a card has, as long as it can use what it has to produce that kind of performance. They could tell me it runs on pixie dust and moonbeams for all I care. Content creators? Perhaps, but they should be using Quadro or Fire cards, anyway - not that it matters, if the GTX can still do the job as advertised.


----------



## Forceman

Quote:


> Originally Posted by *doomlord52*
> 
> It does not say 3.5gb at 224gb/s + 0.5gb at 22gb/s. It claims that all 4gb run are accessible at 224gb/s.


For the 80th time, that benchmark is bugged. *It is not 22 GB/sec.* No one knows what the actual speed of memory access to that section of VRAM is (and quite possibly never will).

I swear that benchmark is the worst thing that could have happened for any kind of rational discussion of this issue (not that rational discussion was very likely anyway, mind you).


----------



## nSone

Quote:


> Originally Posted by *Quasimojo*
> 
> Content creators? Perhaps, but they should be using Quadro or Fire cards, anyway - not that it matters, if the GTX can still do the job as advertised.


what about this?
Quote:


> NVIDIA Maxwell GPU support for iray in Autodesk 3ds Max 2015
> 
> 
> Maxwell is the next generation NVIDIA GPU architecture. In order for iray to render with Maxwell GPUs using Autodesk 3ds Max you need to update the iray library (.dll) that is distributed with 3ds Max 2015. If you have a Maxwell GPU and use iray in 3ds Max 2015, it uses only the CPU without this updated dll.
> Download the updated iray for Maxwell library here
> Step by step instructions for updating the iray library can be found on the mental ray blog.
> This patch is only relevant to 3ds Max 2015 SP2. Future patches and releases of 3ds Max will automatically support Maxwell generation GPUs
> 
> Maxwell GPUs models*
> Quadro K2200
> Quadro K620
> GeForce GTX 980
> GeForce GTX 980M
> GeForce GTX 970
> GeForce GTX 970M
> GeForce GTX 860M
> GeForce GTX 850M
> GeForce GTX 840M
> GeForce GTX 830M
> GeForce GTX 750 Ti
> GeForce GTX 750
> GeForce GTX 745
> 
> *Available on October 10th, 2014
> Legal and Privacy Information


http://www.nvidia-arc.com/products/mentalray/maxwell-gpu-support.html


----------



## Clocknut

Quote:


> Originally Posted by *darkwizard*
> 
> After reading pretty much the whole thread, there are variables that some are omitting:
> 
> - 970 for now seem ok, games not really pushing the envelop on VRAM at 1080p, as developers get more comfortable with newer cards that have 4gb or more, how will the 970 perform then? - long term ramifications
> - Console ports.
> - some might have gone through quite a bit of struggle to save up for the card to find out it doesn't work they way it was "Advertised" - I do feel bad for those that feel somewhat cheated (regardless that if they never hit the performance ceiling-issue)
> - not everyone has the cash flow to upgrade every 6-12months for newer cards.
> - 4K gaming - I remember seeing quite a bit of posts (not just in OCN but on different sites) that would say, "Why 980? get 2 970's for just $100 and better performance for 4k"
> - We'll see how the 970 behaves once Witcher 3 comes out


It may not be a problem now, but for those who saved for a 970 to use for 3-4 years, this 3.5GB cap will come back to bite them. The GTX 470/480/570/580/670/680 used to have sufficient VRAM. Look what's happening now lol

FYI, I am still using a 570 in my other rig; it's still a pretty decent card for 1080p. The problem I'm facing now is that the card is fast enough, but there isn't enough VRAM to keep up. I think I'm going to hold out for an Nvidia/AMD 8GB/16GB VRAM GPU; next-gen console ports are going to get a lot worse as developers push the consoles' limits in a few years' time.


----------



## spacin9

Quote:


> Originally Posted by *Noufel*
> 
> I think the most hurt people are ones that have gone the 970 sli way to run games at 4k resolution and people who "upgraded" from a 780 for the 1gb of vram difference .


My rig overclocked does 19,500 in Firestrike, and I never got quite that high, even with 780 Tis. Nowhere near with 780s... about 17,500 maybe. And 5,500 in 4K Firestrike, vs 5,100 with the Tis. It is a big upgrade. I sold them right before the 970s came out and I don't regret it at all, despite this debacle or whatever it is. The only thing I miss about my 780s is that they kept my feet warm during gaming.


----------



## GrimDoctor

Quote:


> Originally Posted by *Forceman*
> 
> For the 80th time, that benchmark is bugged. *It is not 22 GB/sec.* No one knows what the actual speed of memory access to that section of VRAM is (and quite possibly never will).
> 
> I swear that benchmark is the worst thing that could have happened for any kind of rational discussion of this issue (not that rational discussion was very likely anyway, mind you).


Thank you for saying this, someone needed to.


----------



## gamervivek

It's before his benchmark.
Quote:


> Originally Posted by *Defoler*
> 
> Why does it matter?
> Are we all GPU engineers and we need to know which SM on which part on which segment is being disabled?
> I have a feeling Nvidia did not make one card with 40 crossbars and one with 24 and one with 90. If all cards perform similar in the end of the day, I see no issue at all.
> Do you know how AMD are doing it with the 280x? How they will do it with the 380x or whatever its name is going to be?
> You don't. And you never will.
> That is false.
> People saw the problem after the CUDA benchmark was alive. No one was complaining about stuttering and low 4K / DSR / SLI at 10 fps with extreme low performance after this came into existence.
> Cards were performing as you could see on reviews in similar cases. This issue was born once the benchmark came to be.
> There are no threads, no "omg my card doesn't perform as it should!" no "my card's memory doesn't function!" issues.


That's of course the worst case scenario, but it's before his benchmark if the 15th Jan date is correct.

http://forums.anandtech.com/showthread.php?t=2416150

https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/1/

The 'omg someone told me and now I see it all the time' line is no excuse to wish away buyers' concerns, and I've very little patience for that defense.


----------



## looniam

Quote:


> Originally Posted by *GrimDoctor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> For the 80th time, that benchmark is bugged. *It is not 22 GB/sec.* No one knows what the actual speed of memory access to that section of VRAM is (and quite possibly never will).
> 
> I swear that benchmark is the worst thing that could have happened for any kind of rational discussion of this issue (not that rational discussion was very likely anyway, mind you).
> 
> 
> 
> Thank you for saying this, someone needed to.
Click to expand...

It's been shown on every forum where the Nai benchmark has made an appearance that it is unreliable. But still people take it as relevant, since they hardly ever read everything...


----------



## error-id10t

Quote:


> Originally Posted by *Forceman*
> 
> For the 80th time, that benchmark is bugged. *It is not 22 GB/sec.* No one knows what the actual speed of memory access to that section of VRAM is (and quite possibly never will).
> 
> I swear that benchmark is the worst thing that could have happened for any kind of rational discussion of this issue (not that rational discussion was very likely anyway, mind you).


The benchmark isn't what broke the rational discussion; it's the unbelievable Nvidia defending that happens on this site. Previously I had only read about it, but this thread is proof of it - comments from many people who don't even own the card saying it's a non-issue, comments from people who have one card and think 20 FPS is "normal", etc. etc.


----------



## nSone

Besides that flawed Nai CUDA test, all the subjective experiences, the non-professional benchmarks circulating around, etc., no one needs to be an expert to see that Nvidia confirmed the issue. Details are yet to be provided, but yeah, they confirmed it, and here we are trying to come up with a better explanation than theirs, so... OK?


----------



## Quasimojo

Quote:


> Originally Posted by *nSone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Quasimojo*
> 
> Content creators? Perhaps, but they should be using Quadro or Fire cards, anyway - not that it matters, if the GTX can still do the job as advertised.
> 
> 
> 
> what about this?
> Quote:
> 
> 
> 
> NVIDIA Maxwell GPU support for iray in Autodesk 3ds Max 2015
> 
> Click to expand...
Click to expand...

What about it? Everyone knows you can run Autodesk on a GTX card. Most people who do it for a living also know that the Quadro and Fire cards are better suited for it.

Quote:


> Originally Posted by *Clocknut*
> 
> Quote:
> 
> 
> 
> Originally Posted by *darkwizard*
> 
> After reading pretty much the whole thread, there are variables that some are omitting:
> 
> - 970 for now seem ok, games not really pushing the envelop on VRAM at 1080p, as developers get more comfortable with newer cards that have 4gb or more, how will the 970 perform then? - long term ramifications
> - Console ports.
> - some might have gone through quite a bit of struggle to save up for the card to find out it doesn't work they way it was "Advertised" - I do feel bad for those that feel somewhat cheated (regardless that if they never hit the performance ceiling-issue)
> - not everyone has the cash flow to upgrade every 6-12months for newer cards.
> - 4K gaming - I remember seeing quite a bit of posts (not just in OCN but on different sites) that would say, "Why 980? get 2 970's for just $100 and better performance for 4k"
> - We'll see how the 970 behaves once Witcher 3 comes out
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> it may not be a problem now but those who save for a 970 to use for 3-4years, This 3.5Gb cap will come back to bite them.
Click to expand...

What cap?! There is no cap. You're still getting use of the full 4GB - just not in the way you thought it worked (if you thought about it at all before this).


----------



## Defoler

Quote:


> Originally Posted by *Clocknut*
> 
> it may not be a problem now but those who save for a 970 to use for 3-4years, This 3.5Gb cap will come back to bite them. GTX470/480/570/580/670/680 use to have sufficient VRAM. Look what happen now lol.


The 970 will not be competitive with updated cards in the best new games on the market in 3-4 years. Especially when you are talking about the performance of games which are running at 10-15 fps now, where this "issue" is being pointed out.
So no, I don't think buyers are being falsely advertised to, or hindered by some 3.5GB cap which doesn't really exist, as Nvidia stated that games can access this 0.5GB, and we have seen reviews and numbers of this without the performance damage that is being speculated about.
Quote:


> Originally Posted by *gamervivek*
> 
> It's before his benchmark.
> That's of course the worst case scenario, but it's before his benchmark if the 15th Jan date is correct.
> 
> http://forums.anandtech.com/showthread.php?t=2416150
> 
> https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/1/
> 
> The 'omg someone told me and now I see it all the time' is no excuse to wish away a buyers' concerns, and I've very little patience for that defense.


You are pointing me to a Sandy Bridge with PCIe 2.0 vs a Haswell-E with PCIe 3.0 as an example of "omg my card has issues" for a stress-test memory benchmark? How can you even make any kind of an educated guess when the two are so vastly different?
And if you look at the Far Cry 4 benchmark results they posted, they are fine for this card compared to the 980, even where some of the 970 vs 980 systems are so different. And still, performance-wise the card is operating as expected.

I still see zero issues here, or any performance hindrance in games.
Quote:


> Originally Posted by *error-id10t*
> 
> The benchmark isn't what broke the rational discussion, it's the unbelievable Nvidia defending that happens on this site. Previously I had only read about it but this thread is prove of it - comments from many people who don't even own the card saying it's a non-issue, comments from people who have 1 card and think 20FPS is "normal" etc etc.


If you get 20 fps and the 980 gets 25 fps, then it's 100% a non-issue, as the card performs *as expected*.
If you get 20 fps and the 980 gets 100 fps, then it's an issue.
And we have seen zero proof of the second option. None.

People are defending this because they see the logic that the benchmark is flawed and that people are seeing things which aren't there.
It's like someone asking "do you feel ok?" in a worried tone, and all of a sudden you feel like every germ has nested in your body, even if you have nothing. Psychology is sometimes so strong that your body starts to believe the lies.

I would still like to see one actual performance issue in a game which shows us 100% that this is an issue.
One. Please. No one can provide it, but still so many say it's an issue.
It's like people yelling "aliens are invading us right now!!!" but nothing is happening...


----------



## gamervivek

Quote:


> Originally Posted by *Defoler*
> 
> The 970 will not be competitive to updated cards for the best new games on the market in 3-4 years. Especially when you are talking about performance of games which are running at 10-15 fps now where this "issue" is being pointed out.
> So no, I don't think buyers will be falsely advertised or be hindered with a 3.5GB or some cap which doesn't really exist, as nvidia stated that games can access this 0.5GB, and we have seen reviews and numbers of this without damaging performance as being speculated.
> You are pointing me to a SB with PCIE 2.0 vs a haswell-EX with PCIE 3.0 as an example for "omg my card have issues" for a stress test memory benchmark? How can you even make any kind of an educated guess when the two are so vastly different?
> And if you look at the far cry 4 benchmarks results they posted, they are fine for this card compared to the 980 as well as some 970 vs 980 are so different. And still performance wise the card is operating as expected.


I am not gonna play 'keep hitting for the goal while I move the posts around' with you.
Quote:


> I still see zero issues here or any performance hindering in games.


I am sure you will continue to do so.


----------



## nSone

Quote:


> Originally Posted by *Quasimojo*
> 
> What about it? Everyone knows you can run Autodesk on a GTX card. Most people who do it for a living also know that the Quadro and Fire cards are better suited for it.
> What cap?! There is no cap. You're still getting use of the full 4GB - just not in the way you thought it worked (if you thought about it at all before this).


OMG, it's not about whether you can run Autodesk, Blender, the Adobe suite, etc. on GTX cards... maybe you can run them on a Nintendo for that matter.
The thing is, they are advertised, sold and purchased for those exclusive CUDA features. GTX cards even perform better than Quadros in some cases.
I have no intention of arguing with you, nor of making anyone more insightful, but do you know the number of users buying these cards solely for that purpose?
edit:
here's one example
Blender Artists Community Statistics
Threads 329,966 Posts 2,753,540 *Members 211,028*


----------



## Xoriam

I'm going to repeat this as well for the people who keep saying "the 970 only has 3.5GB of RAM":
I hit the 4GB cap daily playing ACU @ 4K.
*There is no 3.5GB limit on usable RAM.*


----------



## Defoler

Quote:


> Originally Posted by *gamervivek*
> 
> I am not gonna play 'keep hitting for the goal while I move the posts around' with you.
> I am sure you will continue to do so.


You are always welcome to give us proof otherwise.
Yelling "here is your proof" with links which don't show gaming proof and give us benchmarks on completely different systems shows nothing at all.


----------



## Noufel

Quote:


> Originally Posted by *Xoriam*
> 
> I'm going to repeat this as well for people who keep saying "970 only has 3.5gb of ram"
> I hit the 4gb cap daily playing ACU @ 4k.
> *There is no limit of 3.5gb ram usable.*


No one is saying that anymore (except trolls), but the fact is that there are two sections: one of 3.5GB running at full speed, and another of 0.5GB that runs at a slower speed (which we don't know, since the Nai bench is buggy), and that's what's causing stuttering past the 3.5GB mark.
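The two-section behaviour described here matches NVIDIA's statement at the top of the thread: allocations prefer the fast 3.5GB section and only spill into the 0.5GB section past that point. A toy model, purely illustrative (segment sizes per NVIDIA's statement; the greedy spill policy and names are guesses, not NVIDIA's actual allocator):

```python
# Toy model of the 970's segmented VRAM (sizes from NVIDIA's statement;
# the fast-first spill policy here is an illustration, not the real allocator).

FAST_MB = 3584  # 3.5 GB high-priority segment
SLOW_MB = 512   # 0.5 GB lower-priority segment

def allocate(request_mb):
    """Split one request across the segments, fast segment first."""
    if request_mb > FAST_MB + SLOW_MB:
        raise MemoryError("request exceeds 4 GB of VRAM")
    fast = min(request_mb, FAST_MB)
    slow = request_mb - fast
    return fast, slow

print(allocate(3000))  # (3000, 0) -- fits entirely in the fast segment
print(allocate(3800))  # (3584, 216) -- the overflow lands in the slow segment
```

On this reading, "stuttering past 3.5GB" is simply the working set starting to touch the slower segment, whatever its real speed turns out to be.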


----------



## Xoriam

Quote:


> Originally Posted by *Noufel*
> 
> no one is saying that anymore but the fact is that there are 2 section one of 3.5gb running at full speed and another 0.5 gb that runs at a slower speed ( that we dont know, the nai bench is buggy) an that's what causing stetturing passing the 3.5gb cap .


I saw someone a page or two back still whining about this; that's why I mentioned it.

I haven't noticed any loss of performance in that zone of memory use.
Most of the time it's sitting between 3.6-3.9gb used.

The only performance loss I'm getting, which happens with every card, is when you hit the physical 4GB cap of the memory and the game suddenly needs a bunch of data, like when you dash around a corner or spin the camera really fast. You get a stutter.
Thats all I've seen.


----------



## ZealotKi11er

There are people who think this card has 3.5GB and those who say it has 4GB, while Nvidia itself says it has 3.5GB + 0.5GB. Something is going on. Why is the GTX 970 different from the 980? What is the effect of this split? This needs proper testing so people have a better idea instead of just blaming Nvidia. It was clearly a design decision that had to be made, whether for cost, product differentiation, or because it was the only way to do it. Either way, you get what you pay for; the GTX 980 is $220 more expensive for these reasons.


----------



## EarlZ

Has anyone tested if this is also affecting the 780s ?


----------



## doomlord52

Quote:


> Originally Posted by *Quasimojo*
> 
> I didn't read the entire thread, so I may have missed it. If it is, indeed, true that the last 512MB of memory is only accessible at 22Gb/s, then I guess I can see where some may have a real problem with that. Still, it's not much different from the neutering that GPU makers have been doing to their lesser parts for decades, now, just to hit a particular price point.


It's very different. Usually when they nerf their cards for profit, the cards still work as it says on the box. That's why this is so different, and is causing so much anger.


----------



## Defoler

Quote:


> Originally Posted by *doomlord52*
> 
> It's very different. Usually when they nerf their cards for profit, the cards still work as it says on the box. That's why this is so different, and is causing so much anger.


That would be true if we could see the card not performing as it should at 4GB memory usage.
The problem is that the anger is artificial; people are furious over something that is still just speculation.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> There are people who think this card has 3.5GB and those who say it has 4GB, while Nvidia itself says it has 3.5GB + 0.5GB. Something is going on. Why is the GTX 970 different from the 980? What is the effect of this split? This needs proper testing so people have a better idea instead of just blaming Nvidia. It was clearly a design decision that had to be made, whether for cost, product differentiation, or because it was the only way to do it. Either way, you get what you pay for; the GTX 980 is $220 more expensive for these reasons.


People have been testing it left and right for the last 24 hours.
So far the 970 performs as expected, within 10-15% of the 980.
Synthetic performance tests show some difference, but the card still performs well.
I don't know what more people expect to appear. If there were an actual problem with performance and drops in FPS because of the 0.5GB memory segment, we would have seen it by now.


----------



## Seven7h

Quote:


> Originally Posted by *doomlord52*
> 
> It's very different. Usually when they nerf their cards for profit, the cards still work as it says on the box. That's why this is so different, and is causing so much anger.


I'm confused... do you read a box, or do you read a review?


----------



## HMBR

Unfortunately, with 56 pages it's not easy for me to see whether this has already been answered, but did anyone heavily underclock the 970's memory to see whether, in that CUDA test, the speed of the upper memory part (over 3.5GB) changes as much as the rest?

Like try 300MHz or something really low.

Anyway, the ~20GB/s results are low for VRAM but way too high for real-world PCIe 3.0 x16 usage (it should be more like 10GB/s if it were transferring over PCIe), I think.


----------



## Silent Scone

Quote:


> Originally Posted by *Seven7h*
> 
> I'm confused... do you read a box, or do you read a review?


It's a turn of phrase...









This thread....


----------



## darealist

970 is also the very reason that coil whine is now a major concern. VRAM and coil whine testing should be standard for reviewers in the future.


----------



## Defoler

Quote:


> Originally Posted by *HMBR*
> 
> Unfortunately, with 56 pages it's not easy for me to see whether this has already been answered, but did anyone heavily underclock the 970's memory to see whether, in that CUDA test, the speed of the upper memory part (over 3.5GB) changes as much as the rest?
> 
> Like try 300MHz or something really low.
> 
> Anyway, the ~20GB/s results are low for VRAM but way too high for real-world PCIe 3.0 x16 usage (it should be more like 10GB/s if it were transferring over PCIe), I think.


To repeat something that has been posted many times already: the benchmark has been shown to be incorrect. The author of the benchmark said so himself; the results are not right.
The CUDA benchmark is irrelevant to the question of whether, or how much, performance drops in the upper 0.5GB, if at all.


----------



## Arturo.Zise

Funny that many 970 users have had no problems at all until they read this, then all of a sudden their card sucks and doesn't work properly


----------



## Final8ty





Noticed that the 980 would use 4GB but the 970 would not go over 3.5GB.


----------



## Crouch

Glad I didn't upgrade to the 970. I'll be upgrading my monitor from 1080p to 1440p/120Hz+ so I'll need all the memory I can get. Can't wait for the new AMD cards, hopefully they won't have this issue.


----------



## doomlord52

Quote:


> Originally Posted by *Defoler*
> 
> That is true, if we could see the card not performing as it should at 4GB memory usage.
> The problem is that the anger is artificial and people are so angry over something which is still speculated still.


I was under the impression that even though the 'Nai' benchmark was broken, the reports of games stuttering once they go over 3.5GB were still true.

I guess I will have to do some testing of my own. Unfortunately, the only game I have that consistently uses even close to 3.5GB of VRAM is Skyrim... which has a pretty stuttery engine to begin with.


----------



## Moparman

I'm almost thinking of returning my 3rd 970 until the price drops. My other two were well under $300 open box, so maybe they will hit the $299-or-so mark.


----------



## Xuper

I don't know why you defend Nvidia. When we say:

*How much VRAM the GeForce GTX 970 has = how much VRAM we can use for games.*

Nvidia admitted it, so please accept that there is 3.5GB of VRAM + 512MB; in other words, two partitions instead of one! Nvidia should have said this before launch. Nvidia said it's 4GB, so we thought we had 4GB we could use for games, but now we can only use 3.5GB. So it's a cheat, or false advertising. Forget about performance; just nail them for hiding it from users.


----------



## dish_moose

As a GTX 970 owner with a 1920x1200 monitor, I am having no issues with my GPU. The misleading advertising about the VRAM is a little disconcerting, but for all those who state "I'm glad I did not buy a GTX 970," I say: what other card would you buy that gives the same performance at its price point? What Nvidia did was wrong, no way to defend it! I have to chuckle when I see people with GPUs like an HD 5770 calling out the GTX 970.
-Bruce


----------



## tojoleon

Is this design only in the 970, or is it the same for all green cards from the 700 series onwards?
Is this why my 780 is slipping down recent game benchmarks, tying with the 7970?
Either way, I am disappointed and will jump to the red team on my next buy.


----------



## TheReciever

5770 was a legend for its time when it released so its funny that you reference it


----------



## dish_moose

Legend... OK, but nowhere near the performance of a GTX 970. Like me owning a VW GTI and calling out the performance of a Turbo Carrera.
-Bruce


----------



## looniam

though he does have a point.

just like people posting you tube videos from some obscure site when they usually spend their time on AMD threads . . .


----------



## Xuper

Quote:


> Originally Posted by *TheReciever*
> 
> 5770 was a legend for its time when it released so its funny that you reference it


When I decided to buy a new graphics card, I was hesitating between the HD 4890 (VisionTek), the GeForce GTX 260 (Gainward), and the HD 5770 (all priced between $275 and $285). I chose the 5770 because of DX11, and I'm glad I bought it; otherwise I couldn't play Crysis 3!









Quote:


> Originally Posted by *dish_moose*
> 
> As a GTX 970 owner with a 1920x1200 monitor I am having no issues with my gpu. The misleading advertising about the vram is a little disconcerting but, for all those that state " I'm glad I did not buy a GTX 970", I say what other card would you buy that gives the same performance at it's price point. What nvidia did was wrong - no way to defend it! I have to chuckle when I see people with gpus like an HD 5770 calling out the GTX 970.
> -Bruce


LOL! Well, should I laugh at you for being fooled by Nvidia, heh?


----------



## TheReciever

Quote:


> Originally Posted by *dish_moose*
> 
> Legend... OK, but nowhere near the performance of a GTX 970. Like me owning a VW GTI and calling out the performance of a Turbo Carrera.
> -Bruce


Seems you glossed over my post and somehow took it as a comparison of the 5770 vs the GTX 970.

The 5770 was an early DX11 card that supported Eyefinity while beating the previous-gen 4890 (once drivers matured) with an overclock. It was considered one of the best price/performance cards of its time, and CrossFire scaled really well.

Hence the irony of you referencing it as a crap card.


----------



## Slink3Slyde

Quote:


> Originally Posted by *tojoleon*
> 
> Is this design only in the 970, or is it the same for all green cards from the 700 series onwards?
> Is this why my 780 is slipping down recent game benchmarks, tying with the 7970?
> Either way, I am disappointed and will jump to the red team on my next buy.


The 7xx series losing out in some sites recent new game benchmarks is a completely separate issue.


----------



## poii

Quote:


> Originally Posted by *dish_moose*
> 
> Legend... OK, but nowhere near the performance of a GTX 970. Like me owning a VW GTI and calling out the performance of a Turbo Carrera.
> -Bruce


Well, I think I could say that an 8L V10 putting out less performance than a 5.2L V10 is pretty badly engineered (both naturally aspirated), even though I own a Golf GTI. Or can I not?


----------



## Defoler

Quote:


> Originally Posted by *Xuper*
> 
> I don't know Why do you Defend Nvidia? When We say :
> 
> *How much Vram does Geforce GTX 970 have = How much Vram can we Use for Game.*
> 
> Nvidia Admitted it and Please accept that there is 3.5GB Vram + 512mb on other hand there are 2 partition instead of One Partition!So Nvidia should have said it before lanuch.Nvidia Said it's 4GB so we thought we have 4 GB that we can use for game but now we can only use 3.5GB So it's "Cheat" Or false advertisement.Forget about performance.just nail to ass that why did you hide it from Users.


That is false. They said that if a game requires 4GB, it will take 4GB.
They never said otherwise. The fact that the memory has two partitions does not change that.
Quote:


> Originally Posted by *Final8ty*
> 
> Noticed that the 980 would use 4GB but the 970 would not go over 3.5GB.


Let me just remind you how VRAM works.
The card will load as much as it can into its memory, regardless of whether it needs it right now. It could hold 2GB of the current map plus another 2GB (or 1.5GB) left over from the previous map; only the 2GB of the current map actually matter.
It's like filling your desk with books. You put out all the books you need, as well as books you might need later. If you don't have room for the books you'll need later, you leave them off the desk, fetch them only when you really need them, and put unused books back to free some space.

If the card uses 3.5GB, it doesn't mean 0.5GB of data is missing from the frames. It only means it has loaded 3.5GB of data since the game started.
If the card needed the other 0.5GB, it would go and bring it in.

This is pretty simple. Where the 980 will bring data into memory and keep it around in case it needs it later, the 970 may decide not to bring it in at all. That doesn't mean performance is hindered, or that data is missing from the current map and is constantly being streamed from system memory.
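The desk-of-books analogy can be sketched as a tiny opportunistic cache. This is purely a hypothetical illustration (the class, the sizes, and the `soft_limit` idea are my own assumptions, not NVIDIA's actual allocator), showing how a card could report only ~3.5GB "in use" under light load while still accepting required data up to the full 4GB:

```python
# Hypothetical sketch of the "desk of books" analogy: an opportunistic cache
# that fills free space with assets it *might* need, and evicts those
# speculative entries when something is actually required right now.

class VramCache:
    def __init__(self, capacity_mb, soft_limit_mb=None):
        self.capacity = capacity_mb
        # A card may prefer to stop speculative caching below full capacity
        # (e.g. 3584MB on a 4096MB card) while still letting required data in.
        self.soft_limit = soft_limit_mb if soft_limit_mb is not None else capacity_mb
        self.resident = {}  # asset name -> (size_mb, required_now)

    def used(self):
        return sum(size for size, _ in self.resident.values())

    def prefetch(self, asset, size_mb):
        """Cache an asset we might need later, but only under the soft limit."""
        if self.used() + size_mb <= self.soft_limit:
            self.resident[asset] = (size_mb, False)

    def require(self, asset, size_mb):
        """Load an asset needed right now, evicting speculative entries if full."""
        while self.used() + size_mb > self.capacity:
            victim = next((a for a, (_, req) in self.resident.items() if not req), None)
            if victim is None:
                raise MemoryError("out of VRAM")
            del self.resident[victim]
        self.resident[asset] = (size_mb, True)
```

Under this (assumed) policy, a monitoring tool would read "3.5GB used" most of the time even though required data can still push the card to its full capacity.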

The correct test would be to see how the card behaves with the full 4GB of memory in use: run maximum settings with high DSR and high AA and hope the game pushes the GPU's memory. Then the card will really have to work. FPS will drop on both the 980 and the 970, and we will see how far apart they are in performance.
If the 980 dropped from 70fps to 35fps while the 970 dropped from 60fps to 10fps, we would say "OK, we have a problem." If the 970 instead dropped to 30fps, with both cards losing about 50-55% of their performance, there is no issue.
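That proportional-drop reasoning is just arithmetic; here is a minimal sketch (the function name is mine, and the numbers simply restate the example above):

```python
# If both cards lose roughly the same *fraction* of their framerate under a
# 4GB load, the split memory isn't the bottleneck; a disproportionate drop
# on the 970 would point straight at it.

def relative_loss(fps_before, fps_after):
    """Fraction of performance lost, e.g. 70 -> 35 fps is a 0.5 (50%) loss."""
    return 1 - fps_after / fps_before

# Numbers from the example: the 980 going 70 -> 35, and two 970 scenarios.
loss_980 = relative_loss(70, 35)      # 50% loss
loss_970_bad = relative_loss(60, 10)  # ~83% loss: disproportionate, a problem
loss_970_ok = relative_loss(60, 30)   # 50% loss: in line with the 980, no issue
```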

This video doesn't show that. At all. So I still don't see a problem.
The reason the extra 0.5GB wasn't used is most likely that it wasn't needed; Nvidia said the 3.5GB section has higher priority, so the driver prefers to use it. Which is fine, design-wise.


----------



## Fiery

I'm gonna return my 970 strix today and save up a bit more and get a 980 after reading this, that is if my mom allows me to use that amount of money...


----------



## mtcn77

Quote:


> Originally Posted by *Fiery*
> 
> I'm gonna return my 970 strix today and save up a bit more and get a 980 after reading this, that is if my mom allows me to use that amount of money...


Get 780 Strix in return if they cause undue problems. Most Strix 780 reviews demonstrated 40 rop performance(same as 780ti).


----------



## Wirerat

Quote:


> Originally Posted by *Final8ty*
> 
> Noticed that the 980 would use 4GB but the 970 would not go over 3.5GB.


You know what the most important part of that video is? The GTX 970 was just as smooth as the GTX 980. It's also doing the same job using less VRAM.

It's all about perspective.


----------



## Final8ty

Quote:


> Originally Posted by *Defoler*
> 
> That is false. They said that if a game requires 4GB, it will take 4GB.
> They never said otherwise. The fact that the memory has 2 partition does not change that fact.
> Let me just remind you how VRAM works.


I know how VRAM works, and the fact that it behaves differently on the 970 than on the 980 could be an issue for some games.
I'm not disputing that the 970 has 4GB of VRAM in total.


----------



## Final8ty

Quote:


> Originally Posted by *Wirerat*
> 
> You know what the most important part of that video is? The gtx 970 was just as smooth as the gtx 980. Its also doing the same job using less vram.
> 
> Its all about perspective.


"Just as smooth" is a matter of opinion, and I will agree to disagree.


----------



## rdr09

Quote:


> Originally Posted by *mtcn77*
> 
> Get 780 Strix in return if they cause undue problems. Most Strix 780 reviews demonstrated 40 rop performance(same as 780ti).


that's a downgrade.

features alone the 970 is still better by far.


----------



## looniam

if anyone cares:



which would be:

NVIDIA Responds to GTX 970 3.5GB Memory Issue
(same as OP)


----------



## Wirerat

Quote:


> Originally Posted by *Final8ty*
> 
> Just as smooth is a matter of opinion and i will agree to disagree.


There is no visible stutter on either side playing at regular speed. No idea how that can be an opinion. But if you say so.


----------



## Final8ty

Quote:


> Originally Posted by *Wirerat*
> 
> There is no visible stutter on either side playing at regular speed. No idea how that can be an opinion. But if you say so.


As I have said, it's a matter of opinion: what one person considers smooth, another doesn't. It's no different from what minimum FPS counts as smooth enough; some people are happy going as low as 30fps and some are not, some are happy with Vsync and some are not. So no, it's not just because I say so.


----------



## Cyro999

Quote:


> That is false. They said that if a game requires 4GB, it will take 4GB.
> They never said otherwise.


Why does Shadow of Mordor show a not-enough-memory error popup from Windows, and crash before going over ~3550MB, if you have the paging file too small or disabled?

I don't care about what they said, only about what's actually happening.


----------



## dean_8486

If anyone thinks this is a non-issue, play Star Citizen @1440p on ultra settings and report back. I am hitting 3500MB of VRAM usage, and GPU usage drops significantly, causing the game to stutter. To me this proves that the last 500MB of VRAM is next to useless in a gaming situation. If this cannot be resolved via a firmware/driver fix, I want a refund, and if I am refused I will take further action.


----------



## mtcn77

Quote:


> Originally Posted by *rdr09*
> 
> that's a downgrade.
> 
> features alone the 970 is still better by far.


What features are those? I suppose you haven't seen the latest Call of Duty 4K benchmarks. Gtx970 is walking the ignoble path to glory.


----------



## Silent Scone

You'd be an idiot to buy a 970 for 4K regardless of what is happening here, frankly.

Quote:


> Originally Posted by *dean_8486*
> 
> If anyone thinks this is a non-issue, play Star Citizen @1440p on ultra settings and report back. I am hitting 3500MB of VRAM usage, and GPU usage drops significantly, causing the game to stutter. To me this proves that the last 500MB of VRAM is next to useless in a gaming situation. If this cannot be resolved via a firmware/driver fix, I want a refund, and if I am refused I will take further action.


There is currently no accurate way to monitor VRAM usage on the 970; that is what drew people's attention to this issue in the first place. How much multisampling are you using?


----------



## looniam

Quote:


> Originally Posted by *Wirerat*
> 
> There is no visible stutter on either side playing at regular speed. No idea how that can be an opinion. But if you say so.


yeah looks good too.
yeah looks good too.

But considering one system has an i5 and the other an i7, I wonder whether the resolutions and settings are the same?

. . . too bad that's not in the description.









maybe *the 970 owners themselves ought to be posting their own videos* _so answers to questions like that can be known._


----------



## Final8ty

Quote:


> Originally Posted by *looniam*
> 
> yeah looks good too.
> 
> but considering one system has an i5 and the other an i7 . i wonder if the resolutions and settings are the same??
> 
> . . . too bad that's not in the description.
> 
> maybe *the 970 owners themselves ought to be posting their own videos* _so answers to questions like that can be known._


They both look bad to me, and I would not accept the tearing. Different strokes for different folks.


----------



## Wirerat

Quote:


> Originally Posted by *dean_8486*
> 
> If anyone thinks this is a non issue play star citizen @1440p on ultra settings and report back. I am hitting 3500mb vram usage and gpu usage drops significantly caising the game to stutter. This to me proves that the last 500mb of vram is next to useless in a gaming situation. If this cannot be resolved via firmware/driver fix I want a refund and if I am refused I will take further action.


Not that I don't believe you, but you are talking about returning a GPU based on its performance in a beta game that's not even released yet.


----------



## mtcn77

So, the reviews are silent about this, although Guru3D's 2K SLI FCAT measurements are riddled with spikes.
If users also aren't pursuing it, Nvidia can perhaps sustain their bubble version of reality.


----------



## looniam

Quote:


> Originally Posted by *Final8ty*
> 
> They both look bad to me and the tearing i would not accept, different strokes for different folks


since you can't take a hint let me be direct:
Quote:


> Originally Posted by *looniam*
> 
> just like people posting you tube videos from some obscure site when they usually spend their time on AMD threads . . .


^ guess who that was meant for?









several threads across different forums have been locked down because of spammers/shills polluting them with garbage and then defending their post(s).

yeah, that's helpful.


----------



## Final8ty

Quote:


> Originally Posted by *looniam*
> 
> since you can't take a hint let me be direct:
> ^ guess who that was meant for?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> several threads across different forums have been locked down because of spammers/shills polluting them with garbage and then defending their post(s).
> 
> yeah, thats helpful.


What's been going on in other forums with spammers/shills is not my problem; it changes nothing about my opinion of the vid.


----------



## tpi2007

Quote:


> Originally Posted by *Defoler*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tpi2007*
> 
> As I also asked previously, there's another question that we may not know about: does Nvidia fuse off the other cores in exactly the same way on the GM204 die ? I wouldn't think so as that would imply all the faulty chips to be damaged in exactly the same area. So, having said that, does cutting cores here or there make any difference performance wise ? How many crossbars are available depending on where they cut off ? Nobody knows. What I know is that some people are claiming that they don't have problems and others claim otherwise. And that is strange.
> 
> As Anandtech said:
> 
> http://www.anandtech.com/show/8931/nvidia-publishes-statement-on-geforce-gtx-970-memory-allocation
> 
> 
> 
> Why does it matter?
> Are we all GPU engineers and we need to know which SM on which part on which segment is being disabled?
> I have a feeling Nvidia did not make one card with 40 crossbars and one with 24 and one with 90. If all cards perform similar in the end of the day, I see no issue at all.
> Do you know how AMD are doing it with the 280x? How they will do it with the 380x or whatever its name is going to be?
> You don't. And you never will.
Click to expand...

I thought my post was self-explanatory, but I'll give it another try: it matters because, if where on the die they cut off defective cores affects performance from card to card, that could explain why some people are reporting problems and some aren't.

It's an idea worth discussing, and more importantly, that the tech media should ask Nvidia. And that is why I quoted that Anandtech sentence that sums it up pretty well: up until now reviewers thought they knew how the cards worked, and now they realise that wasn't true, so what else that is relevant to the discussion don't we know?

Think about it for a moment. If this had been known from the start and more directed reviews had come out pointing out some possible problems in the future when games start requiring all of the 4GB and not just caching (which gives Nvidia some freedom to swap non priority things around, thus minimizing the problem), some people may not have been so quick to buy the card.

I can certainly see many people having bought their GTX 970 first, being very happy with it at 1080p / 1440p and then deciding to buy a 4K monitor during the holiday season or in a January sale, and here we are, in January, after the festivities, and people are starting to realise the problems in demanding scenarios. It doesn't seem unreasonable.

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *doomlord52*
> 
> It does not say 3.5GB at 224GB/s + 0.5GB at 22GB/s. It claims that all 4GB are accessible at 224GB/s.
> 
> 
> 
> For the 80th time, that benchmark is bugged. *It is not 22 GB/sec.* No one knows what the actual speed of memory access to that section of VRAM is (and quite possibly never will).
> 
> I swear that benchmark is the worst thing that could have happened for any kind of rational discussion of this issue (not that rational discussion was very likely anyway, mind you).
Click to expand...

I'll ask again: is it really bugged? If it returns normal results on the 980 because it assumes all of the card's VRAM is accessible in the same way, and returns erroneous results on the 970 because of that card's specifics, who is really responsible? Are you implying that the guy who coded the benchmark should have been aware of a hardware quirk nobody outside Nvidia knew about, and coded around it? Or is that the job of Nvidia's driver team: to identify when an application or game requires access to the full 4GB and put the "access the 3.5GB segment first, then proceed to the 0.5GB segment" procedure in motion? Nvidia's statement implies that it is indeed their job. Otherwise, game and application developers would have to adjust all their products to take one specific card into account.

What I take from this is that Nvidia's driver failed to catch the benchmark and re-route its requests to make it work properly on the GTX 970, thus producing what some people are describing as a bug. How can you patch a bug when you don't know the card has a special way of addressing memory?

Quote:


> Originally Posted by *Silent Scone*
> 
> You'd be an idiot to buy a 970 for 4K regardless of what is happening here, frankly.


Oh really? What about SLI? Do you think it's completely unreasonable to spend $660 on two GTX 970s instead of $1100 on two GTX 980s? Everybody should know you can't max out games at 4K with one card, so SLI gives you the horsepower, but then you learn that this great deal may not be so great because some of the VRAM you thought you had can't be accessed as efficiently.


----------



## criminal

I don't understand why so many people are standing in defense of Nvidia (especially those who don't even own a 970). Nvidia may not have outright lied about the 970, but they didn't tell the whole truth either. Who cares if some of the upset users haven't run into the issue themselves? They spent their money on a card that Nvidia fudged the specs on. If the huge price difference between the 970 and 980 was because of this issue, then it was Nvidia's responsibility to make that known. Saying that people who bought a 970 because it was so much cheaper should have known something was up is ridiculous. Last I checked, it was not a user's responsibility to vet out the issues of a GPU.


----------



## looniam

Quote:


> Originally Posted by *Final8ty*
> 
> Whats been going on in other forums and spammers/shills is not my problem, it changes noting about my opinion on the vid.


since you don't have a 970, thats not your problem either.









and your opinion about the vid doesn't matter.


----------



## spacin9

Even if this is all BS, NV needs to do some damage control. Give away a game, a refund. Store credit gift cards. A coupon for a free Dunkin coffee or something. This mess lowered the resale value of my cards by 50 bucks or more, which is what really bugs me. After a month gaming @ 4K, I got the performance I expected.


----------



## mtcn77

Quote:


> Originally Posted by *looniam*
> 
> since you don't have a 970, thats not your problem either.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and your opinion about the vid doesn't matter.


So much for advocating false marketing and the censoring of its unveiling. The consolation is worth it, I hope.


----------



## Kuivamaa

Quote:


> Originally Posted by *Cyro999*
> 
> Why is Shadow of Mordor showing not enough memory error popup from windows, and crashing before going over ~3550MB if you have paging file too small or disabled?
> 
> I don't care about what they said, only what's actually happening


I think your observation touches the root of the problem.


----------



## kingduqc

Quote:


> Originally Posted by *looniam*
> 
> since you don't have a 970, thats not your problem either.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and your opinion about the vid doesn't matter.


...

You can have an opinion on a GPU even if you don't own it. Did you read what you just typed? Can you keep remarks like "your opinion doesn't matter" somewhere other than a forum?


----------



## MerkageTurk

nVidia and their shady tactics,

Then there's a legitimate argument that the 780 Ti is being dumbed down via drivers

290x best price/performance


----------



## looniam

Quote:


> Originally Posted by *tpi2007*
> 
> I'll ask again: is it really bugged? If it returns normal results on the 980 because it assumes all of the card's VRAM is accessible in the same way, and returns erroneous results on the 970 because of that card's specifics, who is really responsible? Are you implying that the guy who coded the benchmark should have been aware of a hardware quirk nobody outside Nvidia knew about, and coded around it? Or is that the job of Nvidia's driver team: to identify when an application or game requires access to the full 4GB and put the "access the 3.5GB segment first, then proceed to the 0.5GB segment" procedure in motion? Nvidia's statement implies that it is indeed their job. Otherwise, game and application developers would have to adjust all their products to take one specific card into account.
> 
> What I take from this is that Nvidia's driver failed to catch the benchmark and re-route its requests to make it work properly on the GTX 970, thus producing what some people are describing as a bug. How can you patch a bug when you don't know the card has a special way of addressing memory?


The benchmark was developed after the complaints about the 970 being limited to 3.5GB of VRAM, and since then the dev himself (Nai on computerbase.de) has described how it is flawed:
http://www.computerbase.de/forum/showthread.php?t=1435408&page=7&p=16912375#post16912375
Condensed (translated from German via Chrome, btw):
Quote:


> Thus, in such cases the benchmark more or less measures the swapping behavior of CUDA rather than the DRAM bandwidth. This can easily be verified by running background applications that consume a lot of the GPU's DRAM, so more swapping is needed; in that case the benchmark collapses as well.


I'll admit it's slightly gibberish to me, but several users on Guru3D agreed with him: it's unreliable.
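The quoted point, that the benchmark ends up timing CUDA's swapping rather than raw DRAM reads, can be shown with a toy timing model. All figures here are illustrative assumptions (a 224GB/s nominal DRAM bandwidth and a made-up 12GB/s effective PCIe path), not measurements of any real card:

```python
# Toy model of why a naive VRAM benchmark under-reports bandwidth once its
# allocations get swapped out: the reported figure is bytes / total_time, and
# total_time silently includes paging data back in over PCIe, not just
# reading it from DRAM.

def reported_bandwidth_gbs(block_gb, dram_gbs, pcie_gbs, swapped_fraction):
    """Apparent GB/s when `swapped_fraction` of the block must first be
    paged back in over the PCIe bus before it can be read from DRAM."""
    dram_time = block_gb / dram_gbs
    swap_time = (block_gb * swapped_fraction) / pcie_gbs
    return block_gb / (dram_time + swap_time)

# Nothing swapped: the benchmark sees something close to true DRAM bandwidth.
clean = reported_bandwidth_gbs(1.0, dram_gbs=224.0, pcie_gbs=12.0, swapped_fraction=0.0)

# Fully swapped: the number collapses toward PCIe speed. That low figure is a
# benchmark artifact, not the actual speed of the memory segment being tested.
dirty = reported_bandwidth_gbs(1.0, dram_gbs=224.0, pcie_gbs=12.0, swapped_fraction=1.0)
```

Which is roughly what Nai describes: run anything else that pressures GPU memory and the measured "bandwidth" drops, regardless of the DRAM itself.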









edit: just to be clear:


Quote:


> Originally Posted by *kingduqc*
> 
> You can have an opinion on gpu even if you don't own it. Did you read what you just typed? Can you keep remarks like "Your opinion don't matter" elsewhere then a forum.


Kind sir, if you look at any of my posts, you will see that any opinion of mine on the 970 is entirely omitted.

What you will see is a lot of criticism of people seemingly stirring the pot.


----------



## gamervivek

Quote:


> Originally Posted by *criminal*
> 
> I don't understand why so many people are standing in defense of Nvidia (especially those who don't even own a 970). Nvidia may not have outright lied about the 970, but they didn't tell the whole truth either. Who cares if some of the upset users haven't run into the issue themselves? They spent their money on a card that Nvidia fudged the specs on. If the huge price difference between the 970 and 980 was because of this issue, then it was Nvidia's responsibility to make that known. Saying that people who bought a 970 because it was so much cheaper should have known something was up is ridiculous. Last I checked, it was not a user's responsibility to vet out the issues of a GPU.


I could give the benefit of the doubt to those who are going against the outraged ones talking about the CUDA benchmark, but going the other way and claiming that Nvidia isn't in the least bit in error here is just amazing. I've decided some of them aren't worth the time, and hopefully others will come to the same conclusion soon.

The simple thing is if nvidia had disclosed this earlier, would it have made a difference to the bottom line? Yes, absolutely. Case closed. The only saving grace is if they could salvage the situation via some driver or vbios changes.


----------



## provost

Quote:


> Originally Posted by *MerkageTurk*
> 
> nVidia and their shady tactics,
> 
> Then the 780ti drivers have a legitimate argument that it is being dumbed down
> 
> 290x best price/performance


Sorry for OT, but can you please direct me to the source of this about the GK110 being dumbed down through the divers? I would be curious to know how much support is being cut back to prop up the Maxwell, on a relative performance basis, since some part of the Maxwell performance is being driven by software efficiencies, although a large part appears to be architecture related.
Apple is rumored to do it for the older iPhones and iPads when they release a new product. In fact, I have experienced it myself on my ipad 3 with the latest software updates. So, no more apple for me.. Lol


----------



## mtcn77

Quote:


> Originally Posted by *gamervivek*
> 
> I could give the benefit of the doubt to those who are going against the outraged ones talking about the CUDA benchmark, but going the other way and claiming that Nvidia isn't in the least bit in error here is just amazing. I've decided some of them aren't worth the time, and hopefully others will come to the same conclusion soon.
> 
> The simple thing is if nvidia had disclosed this earlier, would it have made a difference to the bottom line? Yes, absolutely. Case closed. The only saving grace is if they could salvage the situation via some driver or vbios changes.


Think of all the people who purchased believing this was a 64 ROP, 256-bit, 4GB card. They could have taken their business to an alternative.


----------



## flippin_waffles

A lot of the complaints about the 970 are about frame latency and stuttering, so obviously reviewers should be testing with FCAT. It's awfully suspicious that PCPer and TechReport have so abruptly stopped using it.


----------



## looniam

Quote:


> Originally Posted by *flippin_waffles*
> 
> A lot of the complaints about the 970 are about frame latency and stuttering, so obviously reviewers should be testing with FCAT. It's awfully suspicious that PCPer and TechReport have so abruptly stopped using it.


oh?

i guess you missed those reviews, huh?
http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-980-and-GTX-970-GM204-Review-Power-and-Efficiency/Battlefi


Spoiler: Warning: Spoiler!


----------



## Quasimojo

Quote:


> Originally Posted by *nSone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Quasimojo*
> 
> What about it? Everyone knows you can run Autodesk on a GTX card. Most people who do it for a living also know that the Quadro and Fire cards are better suited for it.
> What cap?! There is no cap. You're still getting use of the full 4GB - just not in the way you thought it worked (if you thought about it at all before this).
> 
> 
> 
> OMG it's not that you can run Autodesk, Blender, Adobe suite etc. programs on GTX cards... maybe you can run them on a nintendo for that matter
> the thing is they are advertised, sold and purchased for those exclusive Cuda features. GTX cards perform even better in some cases than Quadros
> I got no intention to argue with you, nor to make anyone more insightful but do you know the amount of users buying these cards solely for that purpose?
> edit:
> here's one example
> Blender Artists Community Statistics
> Threads 329,966 Posts 2,753,540 *Members 211,028*
Click to expand...

No idea what that is supposed to illustrate. A lot of people use Blender? I do too, and on a GTX card, no less. But I bought the card for gaming and don't expect it to be the best option for everything else. It's not.

Quote:


> Originally Posted by *doomlord52*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Quasimojo*
> 
> I didn't read the entire thread, so I may have missed it. If it is, indeed, true that the last 512MB of memory is only accessible at 22Gb/s, then I guess I can see where some may have a real problem with that. Still, it's not much different from the neutering that GPU makers have been doing to their lesser parts for decades, now, just to hit a particular price point.
> 
> 
> 
> It's very different. Usually when they nerf their cards for profit, they still work as it says on the box. That's why this is so different, and is causing so much anger.
Click to expand...

It does work "as it says on the box". It's got 4GB memory and uses every bit of it. People have speculated that it doesn't access the last 512MB of memory as fast, but that's all it is - pure speculation. My guess is that any problem in this regard is a driver issue. That is only a guess as well, but it seems to me to be more likely. After all, if you segment your HDD into multiple partitions, one doesn't function any faster than the other (ok, there's short-stroking, but that's a mechanical thing). Heck, the RAM is segmented as well.

The fact remains that the 970 performs 85-90% as well as the 980. I just can't fathom how that's not enough. The fact that it can't quite manage Shadow of Mordor on ultimate quality at 4k or even 1440p should not come as a shock to anyone. $350 is not meant to get you a card that can do that.


----------



## Sargas290X

Quote:


> Originally Posted by *hht92*
> 
> That's why i waited for my 780 1 year (yea yea i took her 4 months before 900 series), cause the new product isn't always the best product.


Exactly, the 780 is a premium card. The 970 is a gimped card. Even though the performance might be better on paper. That's why I got a 290x instead of a 970, and I was considering getting a 780 Ti too. People who use more than 3.5gb on a 970 will experience a drop in performance. It's always best to wait a bit before jumping on new hardware, it's risky being an early adopter.


----------



## mtcn77

Quote:


> Originally Posted by *looniam*
> 
> oh?
> 
> i guess you missed those reviews, huh?
> http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-980-and-GTX-970-GM204-Review-Power-and-Efficiency/Battlefi
> 
> 
> Spoiler: Warning: Spoiler!


May I ask where the SLI scores are? (Post #173)


----------



## Final8ty

Quote:


> Originally Posted by *looniam*
> 
> since you don't have a 970, thats not your problem either.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and your opinion about the vid doesn't matter.


You don't need to own it or suffer from the issues personally to have an opinion on it, but that's your opinion that people do, so go and make your own forum and set the rules as such.


----------



## Silent Scone

Quote:


> Originally Posted by *tpi2007*
> 
> Oh really? What about SLI? Do you think that it's completely unreasonable to spend $660 for two GTX 970s instead of $1100 for two GTX 980s? Everybody should know you can't max out games at 4K with one card, so SLI will give you the horsepower, but then you learn that this great deal may not be so great because some of the VRAM you thought you had can't be accessed so efficiently.


Yes really. 4GB VRAM on a 256-bit memory bus. 4GB is an absolute minimum for 4K, and with SLI this only gets more prominent due to the extra memory required. What some users are experiencing is the cards exhausting the entire frame buffer, not just using the slower 0.5GB segment of the divided VRAM. I know this because I experience the same thing with 980s in the same scenarios as some of the complainants. I would wait and see what further correspondence Ryan Shrout gets today.


----------



## looniam

Quote:


> Originally Posted by *mtcn77*
> 
> May I ask where the SLI scores are? (Post #173)


since you knew, why ask?

oohhh so you can skew data by introducing the variable of SLI!










hey, next time tell your buddy not to look more foolish than you.


----------



## criminal

Quote:


> Originally Posted by *gamervivek*
> 
> I could give the benefit of the doubt to those who are going against the outraged ones talking about the CUDA benchmark, but going the other way and claiming that Nvidia isn't in the least bit in error here is just amazing. I've decided some of them aren't worth the time, and hopefully others will come to the same conclusion soon.
> 
> The simple thing is if nvidia had disclosed this earlier, would it have made a difference to the bottom line? Yes, absolutely. Case closed. The only saving grace is if they could salvage the situation via some driver or vbios changes.


I have no doubt that Nai's benchmark is unreliable, so people getting outraged by that alone deserve some criticism. But yeah, those saying Nvidia is well within their rights not to disclose everything about a product are wrong. I believe sales would have been impacted had this issue (whether it impacts performance or not) been disclosed upon release, and Nvidia knew that. That is why this is a shady practice.

Issues like these really show one's bias. There are some heavily biased people who love anything Nvidia and can't see the wrong they do. Die-hard AMD users will act the same way. Last I checked, both companies are in it to make money.

Nvidia is in the wrong with this issue, plain and simple. People making excuses for them should be ashamed. Let Nvidia do their own damage control. They make more than enough money to do so.


----------



## Baghi

Quote:


> Originally Posted by *looniam*
> 
> since you don't have a 970, thats not your problem either.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and your opinion about the vid doesn't matter.


I don't own a Ferrari or a Lambo, but I give examples about them and talk; is that fine, master?


----------



## mtcn77

Quote:


> Originally Posted by *looniam*
> 
> since you knew, why ask?
> 
> oohhh so you can skew data by introducing the varible of SLI!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> hey, next time tell your buddy not to look more foolish than you.


I ask because you didn't EVEN address the fact that the 970 is missing from *EACH & EVERY PAGE* of multi-gpu results. How much more incompetent can you get?
There.


----------



## Forceman

Quote:


> Originally Posted by *mtcn77*
> 
> I ask because you didn't EVEN address the fact that the 970 is missing from *EACH & EVERY PAGE* of multi-gpu results. How much more incompetent can you get?
> There.


I'm guessing there are no results on "each and every page" because they didn't test SLI with the 970.

Obviously because Nvidia paid them not to, and not because maybe they didn't have 2 cards or some other completely reasonable reason.


----------



## looniam

Quote:


> Originally Posted by *mtcn77*
> 
> I ask because you didn't EVEN address the fact that the 970 is missing from *EACH & EVERY PAGE* of multi-gpu results. How much more incompetent can you get?
> There.


so you missed guru3Ds FCAT results.

how are your personal problems related?

do you need my help?

ask my secretary to make you an appointment.


----------



## nSone

Quote:


> Originally Posted by *Quasimojo*
> 
> No idea what that is supposed to illustrate. A lot of people use Blender? I do too, and on a GTX card, no less. *But I bought the card for gaming and don't expect it to be the best option for everything else. It's not.*
> It does work "as it says on the box". It's got 4GB memory and uses every bit of it. People have speculated that it doesn't access the last 512MB of memory as fast, but that's all it is - pure speculation. My guess is that any problem in this regard is a driver issue. That is only a guess as well, but it seems to me to be more likely. After all, if you segment your HDD into multiple partitions, one doesn't function any faster than the other (ok, there's short-stroking, but that's a mechanical thing). Heck, the RAM is segmented as well.
> 
> The fact remains that the 970 performs 85-90% as well as the 980. I just can't fathom how that's not enough. The fact that it can't quite manage Shadow of Mordor on ultimate quality at 4k or even 1440p should not come as a shock to anyone. $350 is not meant to get you a card that can do that.


So what matters is what YOU bought it for, right? Go back to your toys and please don't answer me. I didn't know OCN was a gamer community, and lots of them egocentric beyond repair for that matter.
I bought 2 of them for a price equal to more than what I earn in a month here. I don't play games, and just like 90% of those members at the Blender community I never needed a Quadro. I'm just glad I didn't open either of them, so tomorrow I'm returning them; I'm lucky the store owner is my friend or I'd be f*d big time.
After this I rest my case and wait for Nvidia to give an official answer. This thread is becoming more than ridiculous... sorry for being rude, but instead of giving Nvidia some pressure for messing up, I just can't stand seeing people talking nonsense around in their defense.


----------



## mtcn77

Quote:


> Originally Posted by *Forceman*
> 
> I'm guessing there are no results on "each and every page" because they didn't test SLI with the 970.
> 
> Obviously because Nvidia paid them not to, and not because maybe they didn't have 2 cards or some other completely reasonable reason.


I mean, the 970 has SLI ports, so why hasn't anybody come up with this idea? I would have been equally satisfied if they hadn't included the port in the first place, given how adamant they are about never mentioning it. Again, false advertising.


----------



## notarat

Quote:


> Originally Posted by *benbenkr*
> 
> They can't even spell Shadow of Mord*o*r right. Yeah.


One does not simply look up Mordor in a dictionary


----------



## Wirerat

Quote:


> Originally Posted by *looniam*
> 
> so you missed guru3Ds FCAT results.
> 
> how are your personal problems related?
> 
> do you need my help?
> 
> ask my secretary to make you an appointment.


From that article:
Quote:


> With this chart, lower = better. Huge spikes above 40ms to 50ms can be considered a problem like a stutter or indicate a low framerate. *Quite honestly it still is nothing to worry about as the overall plot is very smooth*.
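The summary in that quote boils down to a few lines of Python; a hypothetical sketch (not Guru3D's actual tooling), with the ~40ms spike threshold taken from the quote:

```python
# Minimal FCAT-style frame-time summary (frame times in milliseconds).
def frame_time_report(frame_times_ms, spike_ms=40.0):
    times = sorted(frame_times_ms)
    p99_idx = max(0, int(round(0.99 * len(times))) - 1)
    return {
        "avg_fps": 1000.0 * len(times) / sum(times),
        "p99_ms": times[p99_idx],                             # 99th-percentile frame time
        "spikes": sum(t > spike_ms for t in frame_times_ms),  # candidate stutters
    }

# e.g. 98 smooth ~60fps frames plus two long ones:
# frame_time_report([16.7] * 98 + [45.0, 60.0]) -> 2 spikes, p99 of 45ms
```

A smooth average FPS can hide exactly the spikes this counts, which is why people keep asking for FCAT numbers rather than framerates alone.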


----------



## mkclan

Next time we'll have a 12GB GPU with 3GB of GDDR5 and the other 9GB some old-school thing.


----------



## mtcn77

Quote:


> Originally Posted by *looniam*
> 
> so you missed guru3Ds FCAT results.
> 
> how are your personal problems related?
> 
> do you need my help?
> 
> ask my secretary to make you an appointment.


Do you see any 4K FCAT results on Guru3D, or would you like me to call your caretaker to clean your glasses?


----------



## looniam

Quote:


> Originally Posted by *Baghi*
> 
> I don't own a Ferrari or a Limbo, but I give examples about them and talk; is it fine master?


i don't know.

do you go on ferrari and lambo threads talking smack about any problems when you haven't a clue?

tell others to take legal action with no verifiable proof?

yeah i could see how that's a potential problem.


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *Quasimojo*
> 
> Maybe I'm just not as well educated on the inner workings of a gpu as some. It would appear to me that the card was advertised with 4GB memory, it comes with 4GB of memory and it can utilize all 4GB of that memory. It doesn't seem to me that its being segmented or not would have much effect on the 970 being able to efficiently utilize all of it. This is something we've seen several times in the past - the "less than flagship" cards being held back by a lower bandwidth, occasionally causing them not to be able to effectively utilize all the memory on-board. I remember seeing the math at one point showing something like how a 128-bit card couldn't possibly utilize all of 2GB of on-board memory at once, yet at the time we were seeing those cards with 2GB and even 3GB flying off the shelves.
> 
> The fact remains that the 970 is still a marvel that we've not seen very often at that price point. The people who thought there was no reason to pony up the extra scratch for a 980 when a 970 was so much cheaper and just about as fast were deluding themselves. No such thing as a free lunch.


Calling the 970 owners delusional is harsh and uncalled for. There was little to no evidence at the time that the 970 was crippled like this.

There are also plenty of instances where a GPU was 90% as fast as another GPU but substantially cheaper, and in these cases the cheaper GPU did not involve false advertising. The 8800GT launched at $200 and had about 90% or more of the power of the 8800GTX, which was in the neighborhood of $500 to $600. The 4870 was close to the GTX 280 and was substantially cheaper. The 670 and 680 weren't far apart. The 290 and 290X are within 10% of each other. The 7950 and 7970 are pretty close as well. Should we also call everyone who bought an 8800GT, 4870, 670, 290, or 7950 delusional?


----------



## looniam

Quote:


> Originally Posted by *mtcn77*
> 
> Do you see any 4K FCAT results on Guru3D, or would you like me to call your caretaker to clean your glasses?


and 4K would mean . . .

oh, nothing.

but no thank you, she took my dentures so i can't trust her with my glasses


----------



## criminal

Quote:


> Originally Posted by *looniam*
> 
> i don't know.
> 
> do you go on ferrari and limbo threads talking smack about any problems when you haven't a clue?
> 
> tell others to take legal action with no verifiable proof?
> 
> yeah i could see how that's a potential problem.


Nvidia has already admitted to the VRAM being divided into 3.5 and 0.5 segments. As far as I can tell, it may not cause a major issue, so why not be upfront about it at the beginning? Why are so many so relentless in defending Nvidia's practice? People can criticize a product without actually owning it, based off information given about the product.

Maybe the performance complaints are overblown, but arguing that Nvidia didn't do anything wrong is bad as well.

Anyone that has seen me post knows I am partial to Physx and lean more towards Nvidia cards. This is shady plain and simple no matter what "team" you are on.


----------



## Forceman

Quote:


> Originally Posted by *mtcn77*
> 
> Do you see any 4K FCAT results on Guru3D, or would you like me to call your caretaker to clean your glasses?


So sites tested 4K and it was fine, and sites tested SLI and it was fine, but because no site tested 4K and SLI it's a huge cover-up by Nvidia?

Just spitballing here, but maybe Guru3D doesn't have 4K capable FCAT hardware, and PCPer didn't have 2 970s to test with (since, as you pointed out, EACH & EVERY PAGE, didn't show SLI results)?

In any case, PCPer already said they are working on testing it, so maybe just cool your heels until this afternoon and see what comes out of that testing. I heard the only reason it was delayed until today was because the money truck from Nvidia was late.


----------



## Fiery

Quote:


> Originally Posted by *mtcn77*
> 
> Get a 780 Strix in return if they cause undue problems. Most Strix 780 reviews demonstrated 40 ROP performance (same as the 780 Ti).


well then there is a different problem, as I can't get a 780 Strix here; the only one I could find was the Poseidon, so I'm gonna save and get a 980 Strix or MSI Twin Frozr I think


----------



## revro

Quote:


> Originally Posted by *dean_8486*
> 
> If anyone thinks this is a non-issue, play Star Citizen @1440p on ultra settings and report back. I am hitting 3500MB VRAM usage and GPU usage drops significantly, causing the game to stutter. This to me proves that the last 500MB of VRAM is next to useless in a gaming situation. If this cannot be resolved via a firmware/driver fix I want a refund, and if I am refused I will take further action.


same here, i am also getting 3.5GB usage in star citizen.

well i guess i will sell my 970 once SC comes out and switch to a 390X or whatever is out there. that way i can stay on win7, since who knows how shady win10 will be in terms of privacy.
i also don't plan to lose my boxed win7 serial for a win10 oem serial lol

you lose customers' trust, it will come back to bite you nvidia


----------



## mtcn77

Quote:


> Originally Posted by *Forceman*
> 
> So sites tested 4K and it was fine, and sites tested SLI and it was fine, but because no site tested 4K and SLI it's a huge cover-up by Nvidia?
> 
> Just spitballing here, but *maybe* Guru3D doesn't have 4K capable FCAT hardware, and *PCPer didn't have 2 970s* to test with (since, as you pointed out, EACH & EVERY PAGE, didn't show SLI results)?


"Maybe" & "didn't" aren't very strong arguments. Let's just say, they avoided the situation. What situation? This situation:


----------



## looniam

Quote:


> Originally Posted by *criminal*
> 
> Nvidia has already admitted to the VRAM being divided into 3.5 and 0.5 segments. As far as can tell, it may not cause a major issue, so why not be upfront about it at the beginning? Why are so many so relentless in defending Nvidia's practice? People can criticize a product without actually owning the product based off information given about the product.
> 
> Maybe the performance complaints are overblown, but arguing that Nvidia didn't do anything wrong is a bad as well.
> 
> Anyone that has seen me post knows I am partial to Physx and lean more towards Nvidia cards. This is shady plain and simple no matter what "team" you are on.


ya know since you asked me directly:

i have no idea why nvidia did not fully disclose the vram segmentation. they have done some wacky things in the past with playing around and it hadn't been an issue such as this. but that is guessing. so why? they need to answer that because i don't speak for them. pcper is having a further discussion @ 1pm ET; something might show up then.

what i do do is not defend nvidia but (now hold on to your hat) look out for the affected consumer; those 970 users. because even though anyone can criticize a company and its practices, that still doesn't excuse or leave open license for those that have nothing better to do than stir the pot or pass FUD off as valid, with the latter ending up helping no one. instigating a number of users, causing a mob mentality, would only harm those seeking fair compensation for their inconvenience.

in other words, if people don't have their ducks lined up properly, they won't stand a snowball's chance in hell at getting anything from nvidia.

but hey, if those who are butt hurt from past experiences at their favorite team getting criticized want to hypocritically retort; i've got time while i am seasonally laid off to engage in an adversarial process, to a point.

thanks for asking.

e:
grammar


----------



## skupples

please do try to sue nvidia, that will be the funniest thing to happen in the GPU industry in quite some time.

The card has 4GB of memory, so no false advertisement,

nice try though. It's extremely cute, and gives me a great laugh while sitting here @ work, on the phone w/ Lenovo, because they forgot to include the memory in the laptop they replaced the motherboard in. Now they're accusing me of sending it to them without memory in the first place, even though the original issue was with the track pad; how the hell could I know the track pad is broken if it has NO MEMORY?

That's a real outrage, but you people, you people of this thread, yours is faux outrage.

Yay for still having my titans, with all 6GB of included memory.

People shouldn't even be using these cards for the resolutions required to actually NEED that amount of memory.


----------



## Quasimojo

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Quasimojo*
> 
> Maybe I'm just not as well educated on the inner workings of a gpu as some. It would appear to me that the card was advertised with 4GB memory, it comes with 4GB of memory and it can utilize all 4GB of that memory. It doesn't seem to me that its being segmented or not would have much effect on the 970 being able to efficiently utilize all of it. This is something we've seen several times in the past - the "less than flagship" cards being held back by a lower bandwidth, occasionally causing them not to be able to effectively utilize all the memory on-board. I remember seeing the math at one point showing something like how a 128-bit card couldn't possibly utilize all of 2GB of on-board memory at once, yet at the time we were seeing those cards with 2GB and even 3GB flying off the shelves.
> 
> The fact remains that the 970 is still a marvel that we've not seen very often at that price point. The people who thought there was no reason to pony up the extra scratch for a 980 when a 970 was so much cheaper and just about as fast were deluding themselves. No such thing as a free lunch.
> 
> 
> 
> Calling the 970 owners delusional is harsh and uncalled for. There was little to no evidence at the time that the 970 was crippled like this.
> 
> There are also plenty of instances where a GPU was 90% as fast as another GPU but substantially cheaper and in these cases, the cheaper GPU did not contain false advertisement. The 8800gt was launched at $200 and was about 90% or more the power of the 8800gtx which was in the neighborhood of $500 to $600. The 4870 was close to the 480 and was substantially cheaper. The 670 and 680 weren't far apart. The 290 and 290x are within 10% of each other. The 7950 and 7970 are pretty close as well. Should we also call everyone who bought an 8800gt, 4870, 670, 290, 7950 delusional?
Click to expand...

They were if they were expecting close to the same performance capability as the more expensive part for $200 less cost *with no downside* (if you can even call this a downside). Nearly every second-tier GPU is crippled somehow to some extent to get slotted in the second tier. I've been buying GPUs for a long time, and this has been the case more often than not. Whether we knew exactly how it was being done in this case is irrelevant when the results are the same. 85-90% the performance of the top-tier part. 'Nuf said.


----------



## looniam

nice, now the REAL cynic is here and i have to find another hobby









what i have wanted to say for the longest time is i think it would be sweet irony and even poetic if AMD had a tool/test/bench that showed exactly what is flonky with nvidia's vram.


----------



## criminal

Quote:


> Originally Posted by *Intervention*
> 
> I don't even boost past 4th. 138mph is fast enough for me...
> 
> Realistically, I can't really see why everyone is so upset at this. These cards perform phenomenally and offer exceptional performance per dollar
> 
> Seems people are harder and harder to please and always looking for something to be unhappy about


Quote:


> Originally Posted by *skupples*
> 
> please do try to sue nvidia, that will be the funniest thing to happen in the GPU industry in quite some time.
> 
> The card has 4GB of memory, so no false advertisement,
> 
> nice try though. It's extremely cute, and gives me a great laugh while sitting here @ work, on the phone w/ Lenovo, because they forgot to include the memory in the laptop they replaced the motherboard in. Now they're accusing me of sending it to them without memory in the first place, even though the original issue was with the track pad; how the hell could I know the track pad is broken if it has NO MEMORY?
> 
> That's a real outrage, but you people, you people of this thread, yours is faux outrage.
> 
> Yay for still having my titans, with all 6GB of included memory.
> 
> People shouldn't even be using these cards for the resolutions required to actually NEED that amount of memory.


Quote:


> Originally Posted by *Quasimojo*
> 
> They were if they were expecting close to the same performance capability as the more expensive part for $200 less cost *with no downside* (if you can even call this a downside). Nearly every second-tier GPU is crippled somehow to some extent to get slotted in the second tier. I've been buying GPUs for a long time, and this has been the case more often than not. Whether we knew exactly how it was being done in this case is irrelevant when the results are the same. 85-90% the performance of the top-tier part. 'Nuf said.


Then Nvidia should have just been upfront about the design. Again, Nvidia is shady for not disclosing this information until after they basically got found out. No matter the issue, it is shady plain and simple. We can argue performance and all the other crap till the sun goes down, but it does not excuse Nvidia for not being upfront. I can almost guarantee it would have impacted sales, and that is exactly why nothing was mentioned by Nvidia.


----------



## mtcn77

Quote:


> Originally Posted by *looniam*
> 
> ya know since you asked me directly:
> 
> i have no idea why nvidia did not fully disclose the vram segmentation. they have done some wacky things in the past with playing around and it hadn't been an issue such as this. but that is guessing. so why? they need to answer that because i don't speak for them. pcper is having a further discussion @ 1pm ET; something might show up then.
> 
> *what i do* do *is not defend nvidia* but (now hold on to your hat) look out for the affected consumer; *those 970 users*. because even though anyone can criticize a company and its practices, that still doesn't excuse or leave open license for those that have nothing better to do than stir the pot or *pass FUD off as valid*, with the latter ending up helping no one. instigating a number of users, *causing a mob mentality, would only harm those seeking fair compensation for their inconvenience*.
> 
> in other words, *if people don't have their ducks lined up properly, they won't stand* *a* snowball's *chance in hell* at getting anything from nvidia.
> 
> but hey, if those who are butt hurt from past experiences at their favorite team getting criticized want to hypocritically retort; i've got time while i am seasonally laid off to engage in an adversarial process, to a point.


Who is passing FUD to whom? Those passing "intimidation" around, or those unveiling many?
Quote:


> Originally Posted by *skupples*
> 
> please do try to sue nvidia, that will be the funniest thing to happen in the GPU industry in quite some time.
> 
> The card has 4GB of memory, so *no false advertisement*,
> 
> *People shouldn't even be using these cards*... *to actually NEED that*... .


How about the traces to and from the memory? Doesn't "256 bit" appear on every box of every 970 model?


----------



## Cryosis00

Quote:


> Originally Posted by *morbid_bean*
> 
> So can someone explain this to me? Probably a question for Nvidia, I donno.
> 
> So now that NVIDIA has explained the setup, sounds like its not a problem right? If it needs more than 3.5 it jumps to the second pool of ram? Then why are people getting choke issues on the cards?


Herein lies the deception. Nvidia never claimed your card would not choke. They merely stated that the second bank is accessed once memory use goes past 3.5GB.

Car manufacturers use the same practice. A car's spec claims 400hp. While that is true, most people don't realize that the HP number is taken at the crank, so you have to account for parasitic loss from the flywheel through the driveshaft to the wheels. Depending on that loss you could end up with 350-375hp at the wheels.

A simple dyno will tell us those results.

Review practices for graphics cards will need to start testing total memory of cards more thoroughly so we can be just as informed.
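The drivetrain analogy is just one multiplication; as a throwaway sketch (all numbers illustrative, matching the hypothetical figures above):

```python
# Crank hp minus parasitic drivetrain loss = wheel hp (illustrative numbers).
def wheel_hp(crank_hp, loss_fraction):
    return crank_hp * (1.0 - loss_fraction)

# A "400hp" spec with 6.25%-12.5% drivetrain loss lands at 375-350hp at the wheels.
```

The point being: the spec-sheet number is true, but it's not the number you actually get where it matters.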


----------



## looniam

Quote:


> Originally Posted by *mtcn77*
> 
> Who is passing FUD to whom? Those spreading "intimidation" around, or those unveiling the many issues?


wow, if you are insinuating that this toothless old man with dirty glasses, sitting behind a keyboard and wearing a pair of depends, is intimidating anyone via the internet . .

i am sorry . . . but thanks!


----------



## skupples

Quote:


> Originally Posted by *Quasimojo*
> 
> They were if they were expecting close to the same performance capability as the more expensive part for $200 less *with no downside* (if you can even call this a downside). Nearly every second-tier GPU is crippled somehow, to some extent, to get slotted into the second tier. I've been buying GPUs for a long time, and this has been the case more often than not. Whether we knew exactly how it was being done in this case is irrelevant when the results are the same: 85-90% the performance of the top-tier part. 'Nuf said.


woah woah bro.

get out of here with this logic & reason.

970 is a 980, derp, and Nvidia is liable for lawsuit because of reasons & things! Who cares if it actually has 4GB of memory, who cares if manufacturers are starting to offer free replacements (why? IDK!) and who cares if multiple people have zero performance issues above 3.5GB!


----------



## iSlayer

Quote:


> Originally Posted by *skupples*
> 
> woah woah bro.
> 
> get out of here with this logic & reason.
> 
> 970 is a 980, derp, and Nvidia is liable for lawsuit because of reasons & things! Who cares if it actually has 4GB of memory, who cares if manufacturers are starting to offer free replacements (why? IDK!) and who cares if multiple people have zero performance issues above 3.5GB!


Uh...free replacements? Should I call up MSI?


----------



## darkwizard

Quote:


> Originally Posted by *Cryosis00*
> 
> Herein lies the deception. Nvidia never claimed your card would not choke. They merely stated the 2nd bank is accessed at 3.5+ memory.
> 
> Car manufacturers use the same practice. Car spec claims 400hp. While that is true most people don't realize that HP# is taken at the crank, so you have to account for parasitic loss from the flywheel through the driveshaft to the wheels. Depending on that loss you could end up with 350 - 375hp at the wheels.
> 
> A simple dyno run will tell us those results.
> 
> Review practices for graphics cards will need to start testing total memory of cards more thoroughly so we can be just as informed.


While what you say about cars and horsepower is true, and has been for a long time, it doesn't really hold for this GPU debacle, just as the analogy didn't hold when someone applied it to hard drive storage measurements.

At this point it is better to wait for in-depth analysis, but if you buy something that is supposed to work the way it was advertised and it doesn't, then on principle that is wrong. Therefore, even if the card performs, even at 3.5gb, it is the principle that matters; some people bought a 970 just for that extra 1gb of vram.

While more testing may or may not uncover issues, what Nvidia did is wrong, that's my point of view.


----------



## mtcn77

Quote:


> Originally Posted by *looniam*
> 
> nice, now the REAL cynic is here and i have to find another hobby
> 
> what i have wanted to say for the longest time is i think it would be sweet irony and even poetic if AMD had a tool/test/bench that showed exactly what is wonky with nvidia's vram.


Except AMD doesn't pull cheap stunts like Nvidia. The FP16 demotion, the FCAT debacle... help me out naming a few more.


----------



## GTR Mclaren

GTX 660Ti all over again


----------



## mtcn77

Quote:


> Originally Posted by *GTR Mclaren*
> 
> GTX 660Ti all over again


No, this is false advertising; that was faltering performance only.
Compare this, with this.


----------



## looniam

Quote:


> Originally Posted by *mtcn77*
> 
> Except AMD doesn't pull cheap stunts like Nvidia. The FP16 demotion, the FCAT debacle... help me out naming a few more.


i was offline in 2010 for the FP16 demotion

but the FCAT debacle wasn't nvidia's doing; it was scott wasson and ryan shrout at tech report and pcper, the former starting in 2012 with a damage report blog. nvidia's involvement was limited to the point when ryan asked them how they test frame pacing for SLI; they gave him a script that he then edited to use with his own equipment.

here is what started that:
As the second turns: the web digests our game testing methods


----------



## Noufel

So basically the 3.5 / 0.5 design of the 970 is a boost in the other direction: when you pass the 3.5GB cap and use the rest of the vram, it becomes sluggish.


----------



## mtcn77

Quote:


> Originally Posted by *looniam*
> 
> i was offline in 2010 for the FP16 demotion
> 
> but the FCAT debacle wasn't nvidia's doing; it was scott wasson and ryan shrout at tech report and pcper, the former starting in 2012 with a damage report blog. nvidia's involvement was limited to the point when ryan asked them how they test frame pacing for SLI; they gave him a script that he then edited to use with his own equipment.
> 
> here is what started that:
> As the second turns: the web digests our game testing methods


Yeah, which is why Mr. Scott Wasson and Mr. Ryan Shrout have been cued not to publish gtx 970 SLI tests in any of their benchmark reviews. Purely of their own accord, I get it.


----------



## looniam

Quote:


> Originally Posted by *mtcn77*
> 
> Yeah, which is why Mr. Scott Wasson and Mr. Ryan Shrout have been cued not to publish gtx 970 SLI tests in any of their benchmark reviews. Purely of their own accord, I get it.


yeah, and ryan drives a bentley purchased by nvidia. he said so himself on this very forum in the thread explaining FCAT.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Noufel*
> 
> So basically the 3.5 / 0.5 design of the 970 is a boost in the other direction: when you pass the 3.5GB cap and use the rest of the vram, it becomes sluggish.


Yeah, it is a huge sales booster.


----------



## dean_8486

Quote:


> Originally Posted by *Wirerat*
> 
> not that i dont believe you but you are talking about returning a gpu based on performance in a beta game thats not even released yet.


Never happened on my 680, which only has 2gb


----------



## Defoler

Quote:


> Originally Posted by *tpi2007*
> 
> I thought my post was self-explanatory, but I'll give it another try: it matters if, depending on where on the die they cut off defective cores that affects performance from card to card, thus possibly explaining why some people are reporting problems and some aren't.
> 
> It's an idea worth discussing, and more importantly, that the tech media should ask Nvidia. And that is why I quoted that Anandtech sentence that sums it up pretty well: up until now reviewers thought they knew how the cards worked, and now they realise that wasn't true, so what else that is relevant to the discussion don't we know?
> 
> Think about it for a moment. If this had been known from the start and more directed reviews had come out pointing out some possible problems in the future when games start requiring all of the 4GB and not just caching (which gives Nvidia some freedom to swap non priority things around, thus minimizing the problem), some people may not have been so quick to buy the card.
> 
> I can certainly see many people having bought their GTX 970 first, being very happy with it at 1080p / 1440p and then deciding to buy a 4K monitor during the holiday season or in a January sale, and here we are, in January, after the festivities, and people are starting to realise the problems in demanding scenarios. It doesn't seem unreasonable.
> I'll ask again: is it really bugged ? If it returns normal results on the 980 - because it assumes that all of it's VRAM is accessible in the same way, and returns erroneous results on the 970 because of its specificities, who is really responsible ? Are you implying that the guy who coded the benchmark should be aware of a hardware specificity that nobody outside of Nvidia knew about and coded around it ? Or is that Nvidia's driver team job ? To identify when an application / game requires access to the full 4 GB and put the 'access 3.5 GB segment first, then proceed to 0.5 GB segment.' procedure in motion. Nvidia implies on its statement that that is indeed their job. Otherwise game and application developers would have to make adjustments to all their products to take one specific card into account.
> 
> What I take from this is that Nvidia's driver failed to catch the benchmark and re-route its requests to make it work properly on the GTX 970, thus showing what some people are describing as a bug. How can you patch a bug when you don't know that card has a special way to address memory ?


This would be funny if it weren't sad.

First off, I'm not sure either nvidia or amd need to tell the media exactly how they are removing/adding SMs (whether by firmware or by physically disabling them). Also, you are speculating too much and making it sound like you know they are actually removing the SMs, which you don't. We have seen enough evidence in the past that both nvidia and amd do it by firmware anyway.

Secondly, the people who claim they have an issue keep pointing to stuttering in SLI as the problem, which is not the case. The card performs as expected, and the stuttering is there on every card; it's related to SLI/crossfire issues (especially with SoM, which also stutters on amd cards).
The ones running to replace their card now have never, ever, shown that they have a reason to. The cards perform *as expected* compared to other cards.

The whole "my card has issues!" complaint is completely irrelevant as long as we don't see the problem manifest even when the cards are using the whole 4GB of memory, and we don't.
How nvidia makes the memory accessible is completely irrelevant. You have no idea whether amd does it exactly the same way.

You don't know, but you still speculate that there is a problem. Why?
Does the card perform well and as expected? Yes.
Does the card drop huge FPS numbers when reaching the 4GB limit (and it does go there) compared to other cards? No.
So where is the problem? Where is the proof that people are getting low FPS because of the 0.5GB line? It's nowhere.
Stuttering? At less than 20fps? duh... Every card has that issue.

People are crazy about "problems" when they can't even show them.
We have seen SLI 4K with 4GB memory usage with the 970, and we have not seen large FPS drops from those cards higher than 980s or 290x.
So I call bull until I can actually see a single, just one, proof. You are just speculating about nothing, making up issues and treating completely irrelevant things as relevant.

I agree that the way nvidia approached the 0.5GB segment is strange. But is it a problem? No. Does it matter? No. Can the card still use 4GB? Yes.
So returning 970s, or claiming that nvidia pulled a "fast one" on everyone? Give us a break.

It could just be another way to reduce performance so they can give the 980 higher leverage with people who want the best. And that's fine. It's their prerogative.
As long as the card performs as expected and as the reviews show, it doesn't matter at all.
Quote:


> Originally Posted by *Noufel*
> 
> So basically the 3.5 / 0.5 design of the 970 is a boost in the other direction: when you pass the 3.5GB cap and use the rest of the vram, it becomes sluggish.


That is the whole problem.
Except there is no sluggishness that isn't also there with the 980, 290x, and every other card.


----------



## sugalumps

Quote:


> Originally Posted by *skupples*
> 
> woah woah bro.
> 
> get out of here with this logic & reason.
> 
> 970 is a 980, derp, and Nvidia is liable for lawsuit because of reasons & things! Who cares if it actually has 4GB of memory, who cares if manufacturers are starting to offer free replacements (why? IDK!) and who cares if multiple people have zero performance issues above 3.5GB!


Aye, this. People have been claiming the 970 is basically the exact same as a 980....... yeah, no.


----------



## LancerVI

Quote:


> Originally Posted by *criminal*
> 
> I have no doubt that Nai's benchmark is not reliable, so people getting outraged by that alone deserve some criticism. But yeah, those saying Nvidia is well within their rights not to disclose everything about a product are wrong. I believe sales would have been impacted had this issue (whether it impacts performance or not) been disclosed at release, and Nvidia knew that. That is why this is a shady practice.
> 
> Issues like these really show one's biases. There are some heavily biased people who love anything Nvidia and can't see the wrong they do. Die-hard AMD users will act the same way. Last I checked, both companies are in it to make money.
> 
> Nvidia is in the wrong on this issue, plain and simple. People making excuses for them should be ashamed. Let Nvidia do their own damage control; they make more than enough money to do so.


This Exactly.

That is all.


----------



## iSlayer

Quote:


> Originally Posted by *Defoler*
> 
> This would be funny if it weren't sad.
> 
> First off, I'm not sure either nvidia or amd need to tell the media exactly how they are removing/adding SMs (whether by firmware or by physically disabling them). Also, you are speculating too much and making it sound like you know they are actually removing the SMs, which you don't. We have seen enough evidence in the past that both nvidia and amd do it by firmware anyway.
> 
> Secondly, the people who claim they have an issue keep pointing to stuttering in SLI as the problem, which is not the case. The card performs as expected, and the stuttering is there on every card; it's related to SLI/crossfire issues (especially with SoM, which also stutters on amd cards).
> The ones running to replace their card now have never, ever, shown that they have a reason to. The cards perform *as expected* compared to other cards.
> 
> The whole "my card has issues!" complaint is completely irrelevant as long as we don't see the problem manifest even when the cards are using the whole 4GB of memory, and we don't.
> How nvidia makes the memory accessible is completely irrelevant. You have no idea whether amd does it exactly the same way.
> 
> You don't know, but you still speculate that there is a problem. Why?
> Does the card perform well and as expected? Yes.
> Does the card drop huge FPS numbers when reaching the 4GB limit (and it does go there) compared to other cards? No.
> So where is the problem? Where is the proof that people are getting low FPS because of the 0.5GB line? It's nowhere.
> Stuttering? At less than 20fps? duh... Every card has that issue.
> 
> People are crazy about "problems" when they can't even show them.
> We have seen SLI 4K with 4GB memory usage with the 970, and we have not seen large FPS drops from those cards higher than 980s or 290x.
> So I call bull until I can actually see a single, just one, proof. You are just speculating about nothing, making up issues and treating completely irrelevant things as relevant.
> 
> I agree that the way nvidia approached the 0.5GB segment is strange. But is it a problem? No. Does it matter? No. Can the card still use 4GB? Yes.
> So returning 970s, or claiming that nvidia pulled a "fast one" on everyone? Give us a break.
> 
> It could just be another way to reduce performance so they can give the 980 higher leverage with people who want the best. And that's fine. It's their prerogative.
> As long as the card performs as expected and as the reviews show, it doesn't matter at all.
> That is the whole problem.
> Except there is no sluggishness that isn't also there with the 980, 290x, and every other card.


Some sanity.

Do we have any evidence that there is an issue as a result of the partitioning?

The hysteria at first was nice and fun but we all need to take a step back, chill, make sure there is or isn't a problem, then react accordingly.

Because Nai's benchmark seems to be flawed, and we don't really have anything that proves beyond a reasonable doubt that the partitioning is to blame (assuming Nvidia is correct). We need facts.


----------



## jimlaheysadrunk

hahahah omg the outrage over 512mb of ram that IS STILL THERE.

man, i'm ashamed to be a pc gamer these days. nothing but people crying about everything anymore.


----------



## criminal

Quote:


> Originally Posted by *jimlaheysadrunk*
> 
> hahahah omg the outrage of 512mb of ram that IS STILL THERE.
> 
> man, im ashamed to be a pc gamer these days. nothing but people crying about everything anymore.


This could possibly have an influence on the resale value of the 970 once that time comes, so that is a concern. And yeah, Nvidia was misleading... so I see some merit to the complaints.


----------



## Wirerat

Quote:


> Originally Posted by *dean_8486*
> 
> Never happened on my 680, which only has 2gb


so your gtx 680 was able to handle the same settings you are running on the gtx 970 and perform better?
Quote:


> Originally Posted by *criminal*
> 
> This could possibly have an influence on the resale value of the 970 once that time comes, so that is a concern. And yeah, Nvidia was misleading... so I see some merit to the complaints.


i doubt it. 3 years later i just got $110 each for my 660ti. They sold the 2nd day i posted them too.


----------



## skupples

I'm confused as to why people aren't publishing more proof / examples of this flaw.

Did I miss the smoking gun?
Quote:


> Originally Posted by *criminal*
> 
> This could possible have an influence on resell value of the 970 once that time comes, so that is a concern. And yeah Nvidia was misleading... so I see some merit for the complaints.


eh, a minority of the GPU community pays attention to things like this.

Also, I really want to see some cut-and-dried proof & data:

someone downclocking a 980 to the point of being equal in performance to a 970, point for point, then putting them through the wringer.


----------



## iSlayer

Quote:


> Originally Posted by *criminal*
> 
> This could possibly have an influence on the resale value of the 970 once that time comes, so that is a concern. And yeah, Nvidia was misleading... so I see some merit to the complaints.


Go look in the marketplace here. You'll find the results shocking.

A lot of people don't care about this issue. At all.
Quote:


> Originally Posted by *skupples*
> 
> I'm confused as to why people aren't publishing more proof / examples of this flaw.
> 
> Did I miss the smoking gun?
> eh, a minority of the GPU community pays attention to things like this.
> 
> Also, I really want to see some cut-and-dried proof & data:
> 
> someone downclocking a 980 to the point of being equal in performance to a 970, point for point, then putting them through the wringer.


What I really need.


----------



## jprovido

http://www.tweaktown.com/news/43151/nvidia-gtx-970-suffers-memory-bug/index.html
Quote:


> GPU memory architecture is somewhat similar to the overall design of all computer systems. The memory bus is a linearly scalable bus, which means all RAM components connected have to respond with the same latency and bandwidth, regardless of how much RAM is present. GPU's are also designed with the same approach, so the results are definitely not expected. *The problem is most likely a firmware bug, in light of the fact that these memory subsystems are almost identical on the GTX 970 and GTX 980*.
> Quote:
> 
> 
> 
> Expect NVIDIA TO ISSUE A FIX SOON
> 
> 
> 
> Read more at http://www.tweaktown.com/news/43151/nvidia-gtx-970-suffers-memory-bug/index.html

I doubt it lol


----------



## skupples

Quote:


> Originally Posted by *jprovido*
> 
> http://www.tweaktown.com/news/43151/nvidia-gtx-970-suffers-memory-bug/index.html
> I doubt it lol


Why do you doubt it?

Do you have an EE degree which would allow you to elaborate on a deeper level than "lulz, nv bad, lulz"?

I'm truly curious.

99% of the posts in this thread are hot air blown from the mountain tops by people with obscenely over-inflated egos and little to no actual working knowledge of what they're discussing. So, instead, they throw around talking points & gibberish when attempting to prove their points.

So please, can we get some actual knowledge outside of TweakTown links, TechPowerUp editorials, & copy/pastas of Nvidia's official statement?

like I said before, I probably already missed the smoking gun, but all I can see from the picture history is 1 or 2 things related to this, and a ton of pictures of people rehashing old issues, like Kepler ROPs and other nonsense.


----------



## Quasimojo

Quote:


> Originally Posted by *criminal*
> 
> Why are so many so relentless in defending Nvidia's practice?


I can only speak for myself, but the way I see it, there is nothing to defend. GPU manufacturers have been doing this kind of thing since discrete GPUs first hit the scene, and the fact of the matter is that if no one had brought this particular bit of data to light, people would have been just fine with the performance of their 970. nVidia releases not one but a pair of GPUs that provide better value and performance than any of AMD's offerings at their respective price points, and people just can't stand it until they can dig up something like this to rail on about. Never mind the fact that they were able to give us real, noticeable performance gains on the same die fabrication tech.

I've continued to contribute to this thread, because this Chicken Little kangaroo court of public opinion crap drives me bonkers. This is why we can't have nice things.


----------



## skupples

Quote:


> Originally Posted by *Quasimojo*
> 
> I can only speak for myself, but the way I see it, there is nothing to defend. GPU manufacturers have been doing this kind of thing since discrete GPUs first hit the scene, and the fact of the matter is that if no one had brought this particular bit of data to light, people would have been just fine with the performance of their 970. nVidia releases not one but a pair of GPUs that provide better value and performance than any of AMD's offerings at their respective price points, and people just can't stand it until they can dig up something like this to rail on about. Never mind the fact that they were able to give us real, noticeable performance gains on the same die fabrication tech.
> 
> I've continued to contribute to this thread, because this Chicken Little kangaroo court of public opinion crap drives me bonkers. This is why we can't have nice things.


I'm going to sue you for something... just give me a minute, need to find a viable reason...

So glad my Titan can use all 6GB of its available memory... funny thing about that though, performance still falls off in those last 300-500mb. same for my old 670s, and my old 580s, and old 480s, and... welp, pretty much every card i've ever had has started to degrade in performance when consuming the last drops of frame buffer.

still, sifting through 15 pages has become quite annoying. does anyone have proof of the smoking gun?

I've seen the official response from Nvidia, but I haven't seen viable evidence from the community.

I know someone somewhere was writing something in CUDA in an attempt at exploiting this flaw. was this completed? is the data readily available & digestible by the common folk?


----------



## mtcn77

Quote:


> Originally Posted by *skupples*
> 
> Why do you doubt it?
> 
> Do you have an EE degree which would allow you to elaborate on a deeper level than "lulz, nv bad, lulz"?
> 
> I'm truly curious.
> 
> 99% of the posts in this thread are hot air blown from the mountain tops by people with obscenely over-inflated egos and little to no actual working knowledge of what they're discussing. So, instead, they throw around talking points & gibberish when attempting to prove their points.
> 
> So please, can we get some actual knowledge outside of TweakTown links, TechPowerUp editorials, & copy/pastas of Nvidia's official statement?
> 
> like I said before, I probably already missed the smoking gun, but all I can see from the picture history is 1 or 2 things related to this, and a ton of pictures of people rehashing old issues, like Kepler ROPs and other nonsense.


Let me help refresh your memory.


Spoiler: Warning: Spoiler!



The GeForce GTX 970 is equipped with 4GB of dedicated graphics memory. However the 970 has a different configuration of SMs than the 980, and *fewer crossbar resources to the memory system*. To optimally manage memory traffic in this configuration, we segment graphics memory into a 3.5GB section and a 0.5GB section. *The GPU has higher priority access to the 3.5GB section*. When a game needs less than 3.5GB of video memory per draw command then it will only access the first partition, and 3rd party applications that measure memory usage will report 3.5GB of memory in use on GTX 970, *but may report more for GTX 980* if there is more memory used by other commands.


They have admitted to cutting down the crossbar switch. Now what is that?
Lo and behold! We have actual info as to what it does:


Spoiler: NVIDIA GeForce 8800 GPU Architecture Overview



Hardware Organization Overview
GPU chip consists of one or more multiprocessors.
A multiprocessor consists of 1 (CC 1.x), 2 (CC 2.x), or 4 (CC 3.x) warp schedulers.
A multiprocessor consists of 8 to 192 CUDA cores.
A multiprocessor consists of functional units of several types . . .
. . . a CUDA core is a functional unit for single-precision floating point.
GPU chip consists of one or more L2 Cache Units for mem access.
Multiprocessors connect to L2 Cache Units via a crossbar switch.
Each L2 Cache Unit has its own interface to device memory.


I wonder if things have changed much from the 8800 to Maxwell; all gpus since have been designed around the unified shader architecture.
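To picture that organization, here's a toy model (mine, not nvidia's actual topology, and the unit counts are made up): every multiprocessor reaches every L2 cache unit through the crossbar, and each L2 unit fronts its own slice of device memory.

```python
# Toy crossbar: any multiprocessor (SM) can reach any L2 cache unit, and
# each L2 unit owns its own interface to a slice of device memory.
# The SM and L2 counts here are illustrative, not any real GPU's config.
SMS = [f"SM{i}" for i in range(4)]
L2_UNITS = [f"L2-{i}" for i in range(2)]

def route(sm, address, num_partitions=2):
    """Pick the L2 unit serving this address (simple interleaving)."""
    assert sm in SMS
    return L2_UNITS[address % num_partitions]

# Every SM can reach every memory partition through the crossbar:
assert {route(sm, a) for sm in SMS for a in range(4)} == set(L2_UNITS)
```

The point of the model: cut down the crossbar (fewer ports, fewer L2 units reachable at full speed) and some slice of memory becomes second-class, which is exactly the shape of nvidia's 3.5/0.5 description.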


----------



## criminal

Quote:


> Originally Posted by *Quasimojo*
> 
> I can only speak for myself, but the way I see it, there is nothing to defend. GPU manufacturers have been doing this kind of thing since discreet GPU's first hit the scene, and the fact of the matter is that if no one had brought this particular bit of data to light, people would have been just fine with the performance of their 970. nVidia releases not one but a pair of GPU's that provide better value and performance than any of AMD's offerings at their respective price points, and people just can't stand it until they can dig up something like this to rail on about. Never mind the fact that they were able to give us real, noticeable performance gains on the same die fabrication tech.
> 
> I've continued to contribute to this thread, because this Chicken Little kangaroo court of public opinion crap drives me bonkers. This is why we can't have nice things.


Again, Nvidia felt the need to come clean about how the memory was divided up AFTER someone stumbled upon it. No matter what, Nvidia should have been upfront about this from the day the card released. People have a right to be upset because they feel lied to. Nothing wrong with nice things, but companies need to be upfront about crap like this, no matter who they are, so they don't get criticized for such practices.
Quote:


> Originally Posted by *skupples*
> 
> I'm going to sue you for something... just give me a minute, need to find a viable reason...
> 
> So glad my Titan can use all 6GB of its available memory... funny thing about that though, performance still falls off in those last few 300-500mb, same for my old 670s, and my old 580s, and old 480s, and ... welp, pretty much every card i've ever had has started to degrade in performance when consuming the last drops of frame buffer.
> 
> still, sifting through 15 pages has become quite annoying, does anyone have proof of the smoking gun?
> 
> I've seen the official response form Nvidia, but I haven't seen viable evidence from the community.
> 
> I know someone somewhere was writing something in CUDA in an attempt @ exploiting this flaw, was this completed? is the data readily available & digestible by the common folk?


Oh, just in case it is coming across differently, I am not advocating suing Nvidia over this. But I do think companies should be called out and held accountable for their practices. If we consumers joined together in discouraging this type of behavior, then maybe we could have an effect on how people "see" these companies.


----------



## skupples

huh, guess people missed the part where I said i've seen the NV copy pasta 1,000 times.

It's OK, I'm just going to PM someone who I know will ACTUALLY know, and can properly explain what the "crossbar switch" is and does.

"things that affect performance" isn't quite the answer I'm looking for.


----------



## Xoriam

Ugh. Go to bed, come back, and people are talking about the card only being able to access 3.5gb of ram again (a few pages back).
I thought we all understood by now that the card can use all 4GB of ram without any noticeable performance decrease.....


----------



## Intervention

Quote:


> Originally Posted by *darkwizard*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Cryosis00*
> 
> Herein lies the deception. Nvidia never claimed your card would not choke. They merely stated the 2nd bank is accessed at 3.5+ memory.
> 
> Car manufacturers use the same practice. Car spec claims 400hp. While that is true most people don't realize that HP# is taken at the crank, so you have to account for parasitic loss from the flywheel through the driveshaft to the wheels. Depending on that loss you could end up with 350 - 375hp at the wheels.
> 
> A simple dyno run will tell us those results.
> 
> Review practices for graphics cards will need to start testing total memory of cards more thoroughly so we can be just as informed.
> 
> 
> 
> While what you say is true about cars and horsepower, which has been true for a long period of time; doesn't really hold on the GPU debacle, same as someone tried to the same analogy with the Hard drives storage measurements.
> 
> At this point it is better to wait for in-depth analysis, but if you buy something that is supposed to work the way it was advertised and it doesn't, then by principle is wrong. Therefore, even if the card performs, even at 3.5gb, it is the principle that matters; some people bought a 970 just for that extra 1gb of vram.
> 
> While more testing may or may not uncover issues, what Nvidia did is wrong, that's my point of view.

To your point about crank and wheel horsepower: no manufacturer really rates power at the wheels.

It's always crank/brake horsepower.

So it's a bit different, but I see your point. It's rated at 4GB but you really only get 3.5.

My Cobra makes 700whp, so about 820 at the crank if you use a 15% loss factor.


----------



## mtcn77

Quote:


> Originally Posted by *skupples*
> 
> huh, guess people missed the part where I said i've seen the NV copy pasta 1,000 times.
> 
> It's OK, I'm just going to PM someone that I know will ACTUALLY know, and can properly explain what the "crossbar switch" is, and does.
> 
> "things that effect performance" isn't quite the answer I'm looking for.


Without the crossbar switch, the gpu is "nothing". It connects the shaders to the whole memory structure: shaders > crossbar > L2; L2 > bus > modules.
They probably differentiated the two gpus by the crossbar alone, since they didn't have a rejected batch to make into 970s before launch (both models launched simultaneously).
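To picture nvidia's two-segment description, here's a toy allocation model. The 3584MB/512MB split comes straight from their statement; the fill-the-fast-segment-first policy is just my guess at what "higher priority access" means in practice, not their actual driver logic:

```python
# Toy model of the segmented VRAM nvidia describes: allocations land in the
# 3.5GB high-priority segment first, spilling into the 0.5GB segment only
# once the first is full. The spill policy is a guess, not nvidia's code.
FAST_MB, SLOW_MB = 3584, 512

def allocate(requests_mb):
    """Return (fast_used, slow_used) after serving allocations in order."""
    fast, slow = 0, 0
    for req in requests_mb:
        if fast + req <= FAST_MB:    # GPU has priority access here
            fast += req
        elif slow + req <= SLOW_MB:  # spill into the slower segment
            slow += req
        else:
            raise MemoryError("out of VRAM")
    return fast, slow

# A game needing less than 3.5GB never touches the second segment,
# which is why monitoring tools report "3.5GB" on a 970:
print(allocate([1024, 1024, 1024]))   # (3072, 0)
# Past 3.5GB, both segments end up in use:
print(allocate([2000, 1500, 300]))    # (3500, 300)
```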


----------



## Quasimojo

Quote:


> Originally Posted by *criminal*
> 
> Again, Nvidia felt a need to come clean with how the memory was divided up AFTER someone stumbled upon it. No matter what Nvidia should have been upfront with the issue from day one this card released. People have a right to be upset because they felt lied to. Nothing wrong with nice things, but companies need to be upfront about crap like this no matter who they are so they don't get criticized for such practices.


You call it "coming clean". I call it providing an explanation for the test results people are seeing. People usually like that kind of open dialog from a manufacturer. You think it's an "issue", and I disagree. It's the way the GPU was designed, and it's working as advertised - all 4GB is accessible at the same bandwidth. Again, partitioning your SATA3 HDD doesn't turn one of the partitions into SATA1. If you're seeing it struggle at ultra settings on a 4k or even 1440p display...well, duh.

Benchmarks. Scoreboard.


----------



## Xoriam

Quote:


> Quote:
> Originally Posted by darkwizard View Post
> 
> Quote:
> Originally Posted by Cryosis00 View Post
> 
> Herein lies the deception. Nvidia never claimed your card would not choke. They merely stated the 2nd bank is accessed at 3.5+ memory.
> 
> Car manufacturers use the same practice. Car spec claims 400hp. While that is true most people don't realize that HP# is taken at the crank, so you have to account for parasitic loss from the flywheel through the driveshaft to the wheels. Depending on that loss you could end up with 350 - 375hp at the wheels.
> 
> A simple dyno will tells us those results.
> 
> Review practices for graphics cards will need to start testing total memory of cards more thoroughly so we can be just as informed.
> 
> While what you say is true about cars and horsepower, which has been true for a long period of time; doesn't really hold on the GPU debacle, same as someone tried to the same analogy with the Hard drives storage measurements.
> 
> At this point it is better to wait for in-depth analysis, but if you buy something that is supposed to work the way it was advertised and it doesn't, then by principle is wrong. Therefore, even if the card performs, even at 3.5gb, it is the principle that matters; some people bought a 970 just for that extra 1gb of vram.
> 
> While more testing may or may not uncover issues, what Nvidia did is wrong, that's my point of view.
> 
> To your point about crank and wheel horsepower. No manufacturer really rates power at the wheels.
> 
> Its always crank/brake horse power
> 
> So its a bit different, but I see your point. Its rated at 4GB but you really only get 3.5
> 
> My Cobra makes 700whp, so about 820 crank if you use a 15% loss factor


Again, all 4GB works; the card IS NOT 3.5GB.


----------



## UZ7

Quote:


> Originally Posted by *skupples*
> 
> huh, guess people missed the part where I said i've seen the NV copy pasta 1,000 times.
> 
> It's OK, I'm just going to PM someone that I know will ACTUALLY know, and can properly explain what the "crossbar switch" is, and does.
> 
> "things that effect performance" isn't quite the answer I'm looking for.


http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Discloses-Full-Memory-Structure-and-Limitations-GTX-970 this should explain a bit more


----------



## criminal

Quote:


> Originally Posted by *Quasimojo*
> 
> You call it "coming clean". I call it providing an explanation for the test results people are seeing. People usually like that kind of open dialog from a manufacturer. You think it's an "issue", and I disagree. It's the way the GPU was designed, and it's working as advertised - all 4GB is accessible at the same bandwidth. Again, partitioning your SATA3 HDD doesn't turn one of the partitions into SATA1. If you're seeing it struggle at ultra settings on a 4k or even 1440p display...well, duh.
> 
> Benchmarks. Scoreboard.


Yeah, I guess you have a point. BUT... people that actually bought a card and are upset about this still have a right to complain. I see no issue with that, considering it is their money and this may have impacted their buying decision.

When the 970 was released, I remember lots of people talking about how overpriced the 980 was in comparison. The 970 was being praised like no other and now some of those same people are being smug and saying "well that's your fault for thinking a 970 didn't have something wrong with it to be so cheap". LOL that statement alone would piss me off. Logically the 970 would not have been praised so well if this information had been known. That is all I am trying to get people to understand.


----------



## Forceman

Quote:


> Originally Posted by *UZ7*
> 
> http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Discloses-Full-Memory-Structure-and-Limitations-GTX-970 this should explain a bit more


So according to Nvidia, 4-6% slower than a theoretical 970 with full access to the entire memory stack. But no discussion about possible stuttering if you start using that pool of memory.


----------



## Noufel

Quote:


> Originally Posted by *criminal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Quasimojo*
> 
> You call it "coming clean". I call it providing an explanation for the test results people are seeing. People usually like that kind of open dialog from a manufacturer. You think it's an "issue", and I disagree. It's the way the GPU was designed, and it's working as advertised - all 4GB is accessible at the same bandwidth. Again, partitioning your SATA3 HDD doesn't turn one of the partitions into SATA1. If you're seeing it struggle at ultra settings on a 4k or even 1440p display...well, duh.
> 
> Benchmarks. Scoreboard.
> 
> 
> 
> Yeah, I guess you have a point. BUT... people that actually bought a card and are upset about this may still they have a right to complain. And I see no issue considering it is their money and may have impacted their buying decision.
> 
> When the 970 was released, I remember lots of people talking about how overpriced the 980 was in comparison. The 970 was being praised like no other and now some of those same people are being smug and saying "well that's your fault for thinking a 970 didn't have something wrong with it to be so cheap". LOL that statement alone would piss me off. Logically the 970 would not have been praised so well if this information had been known. That is all I am trying to get people to understand.
Click to expand...

All this and that's why i got the 980 .


----------



## Intervention

Quote:


> Originally Posted by *Xoriam*
> 
> Quote:
> 
> 
> 
> Quote:
> Originally Posted by darkwizard View Post
> 
> Quote:
> Originally Posted by Cryosis00 View Post
> 
> Herein lies the deception. Nvidia never claimed your card would not choke. They merely stated the 2nd bank is accessed at 3.5+ memory.
> 
> Car manufacturers use the same practice. Car spec claims 400hp. While that is true most people don't realize that HP# is taken at the crank, so you have to account for parasitic loss from the flywheel through the driveshaft to the wheels. Depending on that loss you could end up with 350 - 375hp at the wheels.
> 
> A simple dyno will tells us those results.
> 
> Review practices for graphics cards will need to start testing total memory of cards more thoroughly so we can be just as informed.
> 
> While what you say is true about cars and horsepower, which has been true for a long period of time; doesn't really hold on the GPU debacle, same as someone tried to the same analogy with the Hard drives storage measurements.
> 
> At this point it is better to wait for in-depth analysis, but if you buy something that is supposed to work the way it was advertised and it doesn't, then by principle is wrong. Therefore, even if the card performs, even at 3.5gb, it is the principle that matters; some people bought a 970 just for that extra 1gb of vram.
> 
> While more testing may or may not uncover issues, what Nvidia did is wrong, that's my point of view.
> 
> To your point about crank and wheel horsepower. No manufacturer really rates power at the wheels.
> 
> Its always crank/brake horse power
> 
> So its a bit different, but I see your point. Its rated at 4GB but you really only get 3.5
> 
> My Cobra makes 700whp, so about 820 crank if you use a 15% loss factor
> 
> 
> 
> again 4GB works the card IS NOT 3.5gb
Click to expand...

Yeah I know man

Idk why everybody is so upset. Impossible to please


----------



## dean_8486

So many trolls in here...


----------



## Intervention

Quote:


> Originally Posted by *Intervention*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Xoriam*
> 
> Quote:
> 
> 
> 
> Quote:
> Originally Posted by darkwizard View Post
> 
> Quote:
> Originally Posted by Cryosis00 View Post
> 
> Herein lies the deception. Nvidia never claimed your card would not choke. They merely stated the 2nd bank is accessed at 3.5+ memory.
> 
> Car manufacturers use the same practice. Car spec claims 400hp. While that is true most people don't realize that HP# is taken at the crank, so you have to account for parasitic loss from the flywheel through the driveshaft to the wheels. Depending on that loss you could end up with 350 - 375hp at the wheels.
> 
> A simple dyno will tells us those results.
> 
> Review practices for graphics cards will need to start testing total memory of cards more thoroughly so we can be just as informed.
> 
> While what you say is true about cars and horsepower, which has been true for a long period of time; doesn't really hold on the GPU debacle, same as someone tried to the same analogy with the Hard drives storage measurements.
> 
> At this point it is better to wait for in-depth analysis, but if you buy something that is supposed to work the way it was advertised and it doesn't, then by principle is wrong. Therefore, even if the card performs, even at 3.5gb, it is the principle that matters; some people bought a 970 just for that extra 1gb of vram.
> 
> While more testing may or may not uncover issues, what Nvidia did is wrong, that's my point of view.
> 
> To your point about crank and wheel horsepower. No manufacturer really rates power at the wheels.
> 
> Its always crank/brake horse power
> 
> So its a bit different, but I see your point. Its rated at 4GB but you really only get 3.5
> 
> My Cobra makes 700whp, so about 820 crank if you use a 15% loss factor
> 
> 
> 
> again 4GB works the card IS NOT 3.5gb
> 
> Click to expand...
> 
> Yeah I know man
> 
> Idk why everybody is so upset. Impossible to please
> 
> 85-90% the performance of a 980 and still not happy. Hell, my ftw at 1550 us faster than most stock 980
Click to expand...


----------



## Xoriam

it's funny because most of the people who are mad in this thread don't even own a 970.


----------



## Wirerat

Quote:


> Originally Posted by *dean_8486*
> 
> So many trolls in here...


Omg lulz nvidea hav 3.5giggz on teh 970 doubleUtf roflcopter...


----------



## flopper

Quote:


> Originally Posted by *Xoriam*
> 
> it's funny because most of the people who are mad in this thread don't even own a 970.


Only response Nvidia has to do is simple, full refund to any owner of 970/980 etc...

I can see a class action suit forming which will sink Nvidia big time.
Obviously they knew they sold a card that wasn't performing the way they said it should.
Slam-dunk class action suit if I ever saw one.


----------



## vloeibaarglas

http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Discloses-Full-Memory-Structure-and-Limitations-GTX-970

*Let's be blunt here: access to the 0.5GB of memory, on its own and in a vacuum, would occur at 1/7th of the speed of the 3.5GB pool of memory.*

There we have it. Hardware issue that can't be software patched.
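The 1/7th figure falls straight out of the bus arithmetic. Assuming the stock GTX 970 memory spec (7 Gbps effective GDDR5 on a 256-bit bus split into eight 32-bit controllers, per the reviews), a back-of-envelope check:

```python
# Back-of-envelope check of PCPer's "1/7th" figure, assuming the stock
# GTX 970 memory spec: 7 Gbps effective GDDR5 on a 256-bit bus.
effective_gbps_per_pin = 7          # GDDR5 per-pin data rate
bus_bits = 256
total_bw = effective_gbps_per_pin * bus_bits / 8   # GB/s for the full bus

per_controller = total_bw / 8       # eight 32-bit controllers, 28 GB/s each
fast_pool = 7 * per_controller      # the 3.5 GB pool spans seven controllers
slow_pool = 1 * per_controller      # the 0.5 GB pool sits behind one

print(total_bw, fast_pool, slow_pool)   # 224.0 196.0 28.0
# slow_pool / fast_pool is exactly 1/7
```

So in isolation the small segment peaks at 28 GB/s versus 196 GB/s for the big one, which matches the "1/7th of the speed" wording above.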


----------



## skupples

Quote:


> Originally Posted by *Forceman*
> 
> So according to Nvidia, 4-6% slower than a theoretical 970 with full access to the entire memory stack. But no discussion about possible stuttering if you start using that pool of memory.


And then you run into the wall of how people perceive stuttering: the game being the cause, bad overclocks being the cause, out-of-date software being the cause, people running settings a single GPU couldn't handle either way, and overall subjectiveness that's impossible to measure.


----------



## UZ7

Quote:


> Originally Posted by *Forceman*
> 
> So according to Nvidia, 4-6% slower than a theoretical 970 with full access to the entire memory stack. But no discussion about possible stuttering if you start using that pool of memory.


Well I think the false advertising claim would also come from the actual amounts:

ROP Units: 64 -> 56 (edit: not 52, sorry)
L2 Cache: 2048 KB -> 1792 KB

And in a sense, it may be up to drivers/software/game optimizations to utilize the remaining memory seamlessly. It makes it seem like there is 4GB of VRAM to access, but the game has to see it, and it doesn't like using more than 3.5GB; when/if it does use it, you get the stutter and increased frametimes.
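No one outside NVIDIA knows the driver's real heuristic, but the behaviour being described here — fill the fast 3.5GB segment first and spill into the slow 0.5GB segment only when forced — can be sketched as a toy allocator. The sizes are the only facts; everything else is purely illustrative:

```python
# Toy model of the allocation behaviour described above: prefer the fast
# 3.5 GB segment, spill into the slow 0.5 GB segment only when forced.
# The real driver heuristic is unpublished; this is purely illustrative.

FAST_MB, SLOW_MB = 3584, 512

def place(requests_mb):
    fast_used = slow_used = 0
    placements = []
    for size in requests_mb:
        if fast_used + size <= FAST_MB:
            fast_used += size
            placements.append("fast")
        elif slow_used + size <= SLOW_MB:
            slow_used += size
            placements.append("slow")   # here frametimes may suffer
        else:
            placements.append("evict")  # would spill to system RAM
    return placements

# A working set under 3.5 GB never touches the slow segment...
print(place([1792, 1792]))        # ['fast', 'fast']
# ...but push past it and the last allocation lands in the slow pool.
print(place([1792, 1792, 512]))   # ['fast', 'fast', 'slow']
```

This is also why tools reporting "only 3.5GB in use" aren't necessarily wrong: under light loads the driver simply never needs the second segment.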


----------



## Intervention

Quote:


> Originally Posted by *Xoriam*
> 
> it's funny because most of the people who are mad in this thread don't even own a 970.


I haven't paid enough attention to what their rigs are

I love my 970FTW+

Super fast, quiet, cool.

I wouldn't care if it had 3GB of RAM. It performs awesome. Who cares how it does it

Like people complaining the new Ford GT is a v6. Who cares, it's 600+ horsepower


----------



## Xoriam

Quote:


> Originally Posted by *skupples*
> 
> And then you run into the wall of how people perceive stuttering: the game being the cause, bad overclocks being the cause, out-of-date software being the cause, people running settings a single GPU couldn't handle either way, and overall subjectiveness that's impossible to measure.


Yeah, exactly.
Like I've said numerous times before, I've taken the card to the max 4GB and seen literally no difference.
There was a bit of stuttering ONLY when the card needed to swap a lot of information extremely fast, which happens with ANY card that hits its RAM limit.


----------



## Xoriam

Quote:


> Originally Posted by *Intervention*
> 
> I haven't paid enough attention to what their rigs are
> 
> I love my 970FTW+
> 
> Super fast, quiet, cool.
> 
> I wouldn't care if it had 3GB of RAM. It performs awesome. Who cares how it does it
> 
> Like people complaining the new Ford GT is a v6. Who cares, it's 600+ horsepower


Yeah man, I own two Gigabyte GTX 970 G1 Gamings and an EVGA GTX 970.
I'm very happy with them considering their price compared to the performance of the 980.


----------



## Xoriam

Quote:


> Originally Posted by *UZ7*
> 
> Well I think also the false advertising would be coming from the actual amount of:
> 
> ROP Units 64 -> 52
> L2 Cache: 2048 KB -> 1792 KB
> .


This part, however, is extremely annoying. I'm hoping they can "unlock" it.


----------



## Forceman

Quote:


> Originally Posted by *UZ7*
> 
> Well I think also the false advertising would be coming from the actual amount of:
> 
> ROP Units 64 -> 52
> L2 Cache: 2048 KB -> 1792 KB
> 
> and in a sense, it may be up to drivers/software/game optimizations to be able to utilize the remaining seamlessly? it makes it seem like there is 4GB of vram to have access to but the game has to see it and it doesnt like using more than 3.5, when/if it does use it pose the whole stutter and increased frametime.


Yeah, they should have been clear about that. It doesn't really say why they had to cut the L2 and ROPs instead of just disabling the SMMs. Must be something architectural, but it's not clear what.


----------



## gamervivek

Quote:


> First, despite initial reviews and information from NVIDIA, the GTX 970 actually has fewer ROPs and less L2 cache than the GTX 980. NVIDIA says this was an error in the reviewer's guide and a misunderstanding between the engineering team and the technical PR team on how the architecture itself functioned. That means the GTX 970 has 56 ROPs and 1792 KB of L2 cache compared to 64 ROPs and 2048 KB of L2 cache for the GTX 980. Before people complain about the ROP count difference as a performance bottleneck, keep in mind that the 13 SMMs in the GTX 970 can only output 52 pixels/clock and the seven segments of 8 ROPs each (56 total) can handle 56 pixels/clock. The SMMs are the bottleneck, not the ROPs.


Quote:


> Let's be blunt here: access to the 0.5GB of memory, on its own and in a vacuum, would occur at 1/7th of the speed of the 3.5GB pool of memory. If you look at the Nai benchmarks floating around, this is what you are seeing.


tl;dr: the benchmark was right; the 0.5GB pool is accessed before system memory, and it shouldn't affect games much, at least for now. And the new architecture means Nvidia can bring out a 960 Ti as well.

http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Discloses-Full-Memory-Structure-and-Limitations-GTX-970

So the outrage against the CUDA benchmark was misplaced: the L2 cache measurement was correct, and since a partition was disabled, some ROPs went with it.
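PCPer's SMM-vs-ROP argument quoted above is simple arithmetic, and worth checking. Using the article's own figures (13 active SMMs at 4 pixels/clock each, seven enabled partitions of 8 ROPs at 1 pixel/clock each):

```python
# Sanity check of PCPer's pixel-throughput claim: with 13 active SMMs
# (4 pixels/clock each, per the article) feeding 56 ROPs (1 pixel/clock
# each), the shader side saturates first.
smm_count, pixels_per_smm = 13, 4
rop_count = 7 * 8                       # seven enabled partitions of 8 ROPs

smm_rate = smm_count * pixels_per_smm   # pixels/clock the SMMs can emit
rop_rate = rop_count                    # pixels/clock the ROPs can retire

bottleneck = "SMMs" if smm_rate < rop_rate else "ROPs"
print(smm_rate, rop_rate, bottleneck)   # 52 56 SMMs
```

So even with 56 ROPs instead of the advertised 64, the ROPs still have headroom over what the 13 SMMs can feed them, which is the article's point.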


----------



## criminal

Quote:


> NVIDIA has come clean; all that remains is the response from consumers to take hold. For those of you that read this and remain affronted by NVIDIA calling the GeForce GTX 970 a 4GB card without equivocation: I get it. But I also respectfully disagree. Should NVIDIA have been more upfront about the changes this GPU brought compared to the GTX 980? Absolutely and emphatically. But does this change the stance or position of the GTX 970 in the world of discrete PC graphics? I don't think it does.


From the article. Yep, came clean after the fact. Impact performance? I guess not enough to matter. Reason for _some_ 970 owners to be upset and complain? Yep, but not on the basis of performance issues.


----------



## Cyro999

Quote:


> Originally Posted by *Defoler*
> 
> I'll bet you know the difference between system page file and VRAM.
> But if you don't ,you should learn about it.
> 
> Its like complaining about your car malfunctioning and claiming that the sun made it break down.


If the sun makes a 970 break down but not a 980 then it's a valid criticism
Quote:


> tl;dr - the benchmark was right, the 0.5GB will be accessed before the system memory


It's only being accessed sometimes, not all of the time


----------



## Xoriam

Quote:


> Originally Posted by *criminal*
> 
> From the article. Yep, came clean after the fact. Impact performance? I guess not enough to matter. Reason for _some_ 970 owners to be upset and complain? Yep, but not on the basis of performance issues.


It's seriously only noticeable in synthetics.

This part being left out at purchase is what gets to me, though:
ROP Units: 64 -> 56
L2 Cache: 2048 KB -> 1792 KB

The card actually has 56 ROPs, not the 64 that were advertised.


----------



## Quasimojo

Quote:


> Originally Posted by *criminal*
> 
> When the 970 was released, I remember lots of people talking about how overpriced the 980 was in comparison. The 970 was being praised like no other and now some of those same people are being smug and saying "well that's your fault for thinking a 970 didn't have something wrong with it to be so cheap". LOL that statement alone would piss me off. Logically the 970 would not have been praised so well if this information had been known. That is all I am trying to get people to understand.


That's probably the primary point of contention. I don't really see that anything is "wrong" with the 970. That's just the way it was designed.


----------



## iSlayer

Quote:


> Originally Posted by *Xoriam*
> 
> it's funny because most of the people who are mad in this thread don't even own a 970.


Well I'm one, and I'm waiting on more results as the benchmark was invalidated.

We need facts, not FUD...
Quote:


> Originally Posted by *flopper*
> 
> Only response Nvidia has to do is simple, full refund to any owner of 970/980 etc...
> 
> I watch a class action suit forming which will sink nvidia big time.
> obviosuly they knew they sold a card that wasnt performing to what they said it should.
> slam dunk class action suit if I ever seen one.


Speak of the devil, another post from someone who doesn't own a 970 but feels the need to call for a suit on our behalf.

What's that word for these people? Chill? Pill? I can't remember...


----------



## Olivon

Quote:


> The error, as NVIDIA explains it, is that in creating the GTX 970 reviewer's guide, the technical marketing team was unaware of Maxwell's aforementioned and new "partial disable" capabilities when they filled out the GTX 970 specification table. They were aware that the GTX 970 would have the full 256-bit memory bus, and unaware of the ability to independently disable ROPs they assumed that all 64 ROPs and the full 2MB of L2 cache was similarly available and wrote the specification table accordingly. This error then made it into the final copy of the guide, not getting caught even after being shared around various groups at NVIDIA, with that information finally diffused by press such as ourselves.


Quote:


> Now as NVIDIA is in full damage control mode at this point, consideration must be given as to whether NVIDIA's story is at all true; NVIDIA would hardly be the first company to lie when painted into a corner by controversy. With that in mind, given the story that NVIDIA has provided, do we believe them? In short, yes we do. To be blunt, if this was intentional then this would be an incredibly stupid plan, and NVIDIA as a company has not shown themselves to be that dumb. NVIDIA gains nothing by publishing an initially incorrect ROP count for the GTX 970, and if this information had been properly presented in the first place it would have been a footnote in an article extoling the virtues of the GTX 970, rather than the centerpiece of a full-on front page exposé. Furthermore if not by this memory allocation issues then other factors would have ultimately brought these incorrect specifications to light, so NVIDIA would have never been able to keep it under wraps for long if it was part of an intentional deception. Ultimately only NVIDIA can know the complete truth, but given what we've been presented we have no reason to doubt NVIDIA's story.


Quote:


> In any case, the one bit of good news here is that for gaming running out of VRAM is generally rather obvious. Running out of VRAM, be it under normal circumstances or going over the GTX 970's 3.5GB segment, results in some very obvious stuttering and very poor minimum framerates. So if it does happen then it will be easy to spot. Running out of (fast) VRAM isn't something that can easily be hidden if the VRAM is truly needed.
> 
> To that end in the short amount of time we've had to work on this article we have also been working on cooking up potential corner cases for the GTX 970 and have so far come up empty, though we're by no means done. Coming up with real (non-synthetic) gaming workloads that can utilize between 3.5GB and 4GB of VRAM while not running into a rendering performance wall is already a challenge, and all the more so when trying to find such workloads that actually demonstrate performance problems. This at first glance does seem to validate NVIDIA's overall claims that performance is not significantly impacted by the memory segmentation, but we're going to continue looking to see if that holds up. In the meantime NVIDIA seems very eager to find such corner cases as well, and if there are any they'd like to be able to identify what's going on and tweak their heuristics to resolve them.


http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation


----------



## XXnomadXX

I finished reading the PCPer piece on the GTX 970. Not happy that Nvidia hid that the memory works as 3.5GB for free use while the 0.5GB is a shared pool behind the L2. Just label the box as 3.5GB and warn consumers about the slow access to the remaining 0.5GB of memory. Yes, the 970 is a very good card for the price and performance right now, but 4K and 5K monitors are approaching, and the future of this card is going to take a hit when memory usage creeps past 3.5GB. Me, I want the full 4GB as a usable portion, not to be told that going past 3.5GB will slow the performance of the card.

4GB = 3.5GB free + 0.5GB slowdown; only the 3.5GB runs at normal speed.

(Reference GTX 970 here.)


----------



## iSlayer

^ which will need SLi, and 4k monitors. For now, that's fine. In the future, even 970 SLi won't be enough to maintain frame rate and settings at 4k.

Ah, a post from Anandtech, hopefully this gets to the bottom of it.

Edit: mistakes were made. Fine, not exactly a great explanation even if we trust it, but fine.


----------



## Heavy MG

Quote:


> Originally Posted by *NuclearPeace*
> 
> I doubt that prices will drop. The 970 already launched $70 cheaper than the 770. Along with that, you guys are overestimating how much research people are doing when it comes to building computers. The 980 and the 970 have collectively sold more than a million cards already despite the 290x and the 290 being substantially cheaper. A lot of people buy electronics (PC hardware included) based off testimonials from their friends and family. Misinformation from comment fanboys also paints AMD as this dodgy budget brand and NVIDIA as the premium luxury brand.


Not really; if Nvidia had made it publicly known that the GTX 970 only really has 3.5GB of RAM, I would have purchased a 4GB 290X instead.
Quote:


> Originally Posted by *Quasimojo*
> 
> No idea what that is supposed to illustrate. A lot of people use Blender? I do too, and on a GTX card, no less. But I bought the card for gaming and don't expect it to be the best option for everything else. It's not. It does work "as it says on the box". It's got 4GB memory and uses every bit of it. People have speculated that it doesn't access the last 512MB of memory as fast, but that's all it is - pure speculation. My guess is that any problem in this regard is a driver issue. That is only a guess as well, but it seems to me to be more likely. After all, if you segment your HDD into multiple partitions, one doesn't function any faster than the other (ok, there's short-stroking, but that's a mechanical thing). Heck, the RAM is segmented as well.


But it doesn't work as it says on the box. Nvidia is falsely advertising it as a 4GB 256-bit GPU when it is effectively neither. If you buy a 4GB card, you should expect it to use all 4GB and not turn into a laggy, stuttering pile of crap once you go over 3.5GB. As much as I like Nvidia, a lawsuit should happen, as there's no way Nvidia didn't know about the issue the whole time.
Quote:


> Originally Posted by *Intervention*
> 
> I haven't paid enough attention to what their rigs are
> I love my 970FTW+Super fast, quiet, cool. I wouldn't care if it had 3GB of RAM. It performs awesome. Who cares how it does it
> Like people complaining the new Ford GT is a v6. Who cares, it's 600+ horsepower


I like my 970 G1, but I'm really disappointed that I didn't get all of the usable VRAM that I paid for. I never bothered to update my signature, and might not while I decide whether I should sell my 970. I could probably sell it and get a 4GB 770 or a 4GB 290X and know I'm getting what I paid for, lol.
It's like buying that 600HP Ford GT and then finding out you can't use all 600HP because it stalls out whenever you try.


----------



## ebduncan

It really doesn't matter how well they perform.

The fact is the 970 was falsely advertised as having the same ROPs and L2 as the 980, on the same 256-bit memory bus, when it effectively has a 224-bit memory bus, fewer ROPs, and less L2.

It's an open-and-shut class action suit.

There is no way to avoid that now. Nvidia is taking the route of admitting the faults and giving as much information as they can about the issue.

If it's possible, they might even release firmware that unlocks the disabled portion of the card to make it a 980, which would keep a lot of owners from jumping in on the class action suit to come.

Such a shame, really.


----------






## Forceman

Reading the more in-depth Anand article makes it seem like something driver optimizations can help out with, if a game is not properly using the segmented VRAM.

http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation


----------



## Xoriam

Quote:


> Originally Posted by *ebduncan*
> 
> it really doesn't matter how well they perform.
> 
> The fact is the 970 was falsely advertised as having the same ROPs and L2 as the 980, on the same 256-bit memory bus, when it effectively has a 224-bit memory bus, fewer ROPs, and less L2.


Yeah this is what should be focused on.

Not the ram, because it is ALL useable.


----------



## cq3mrd

Can all you naysayers now finally admit you were wrong?
nVidia fracking admitted they have only 56 ROPs on 970s as opposed to 64 on the 980.
Ironically, I'm in luck, cause the G1 Gaming I ordered the other day is out of stock and will get shipped in 5-10 days, so I'll have some more time to decide what I'll do.


----------



## Intervention

Quote:


> Originally Posted by *Heavy MG*


Quote:


> Originally Posted by *NuclearPeace*
> 
> I doubt that prices will drop. The 970 already launched $70 cheaper than the 770. Along with that, you guys are overestimating how much research people are doing when it comes to building computers. The 980 and the 970 have collectively sold more than a million cards already despite the 290x and the 290 being substantially cheaper. A lot of people buy electronics (PC hardware included) based off testimonials from their friends and family. Misinformation from comment fanboys also paints AMD as this dodgy budget brand and NVIDIA as the premium luxury brand.


Not really; if Nvidia had made it publicly known that the GTX 970 only really has 3.5GB of RAM, I would have purchased a 4GB 290X instead.
Quote:


> Originally Posted by *Quasimojo*
> 
> No idea what that is supposed to illustrate. A lot of people use Blender? I do too, and on a GTX card, no less. But I bought the card for gaming and don't expect it to be the best option for everything else. It's not. It does work "as it says on the box". It's got 4GB memory and uses every bit of it. People have speculated that it doesn't access the last 512MB of memory as fast, but that's all it is - pure speculation. My guess is that any problem in this regard is a driver issue. That is only a guess as well, but it seems to me to be more likely. After all, if you segment your HDD into multiple partitions, one doesn't function any faster than the other (ok, there's short-stroking, but that's a mechanical thing). Heck, the RAM is segmented as well.


But it doesn't work as it says on the box. Nvidia is falsely advertising it as a 4GB 256-bit GPU when it is effectively neither. If you buy a 4GB card, you should expect it to use all 4GB and not turn into a laggy, stuttering pile of crap once you go over 3.5GB. As much as I like Nvidia, a lawsuit should happen, as there's no way Nvidia didn't know about the issue the whole time.
Quote:


> Originally Posted by *Intervention*
> 
> I haven't paid enough attention to what their rigs are
> I love my 970FTW+Super fast, quiet, cool. I wouldn't care if it had 3GB of RAM. It performs awesome. Who cares how it does it
> Like people complaining the new Ford GT is a v6. Who cares, it's 600+ horsepower


I like my 970 G1, but I'm really disappointed that I didn't get all of the usable VRAM that I paid for. I never bothered to update my signature, and might not while I decide whether I should sell my 970. I could probably sell it and get a 4GB 770 or a 4GB 290X and know I'm getting what I paid for, lol.
It's like buying that 600HP Ford GT and then finding out you can't use all 600HP because it stalls out whenever you try.

Oh... I didn't realize the 970s were completely shutting off when you hit 3.5GB

Stalling and "stuttering" are not the same


----------



## jprovido

Quote:


> Originally Posted by *Xoriam*
> 
> Yeah this is what should be focused on.
> 
> Not the ram, because it is ALL useable.


not convinced. sig rig will still be at 3.5gb lol.

I love my GTX 970 EXOCs; they run much cooler and quieter than my previous cards. The thing I'm upset about is that I actually "sidegraded" from two GTX 780s. I knew it was stupid, but I did it because of the VRAM. I didn't know it was THIS stupid. I upgraded for 512MB of VRAM. i are so smart


----------



## SchmoSalt

Looks like my patience has caused me to dodge a bullet. I was going to buy 2x 970s after the next set of ATI cards came out. My hope was that the ATI cards would make NVIDIA drop the prices on their cards.

Now I will not be buying 2x 970s or even a single 970. Why would I invest $600-$700 into a set of cards that have a known faulty design? I would be insane if I did. Hopefully the 980 Ti turns out to be a good card. If not then I don't have many other options on the table. I'd hate to go ATI again because of stability issues but I would probably have to if the 980 Ti ends up being a disappointment too.


----------



## Xoriam

Quote:


> Originally Posted by *Intervention*
> 
> Not really,if Nvidia had made it publicly known than the GTX 970 only really has 3.5GB of ram,I would have purchased a 4GB 290X instead.
> But it doesn't work as it says on the box. Nvidia is falsely advertising it as a 4GB 256bit GPU when it is neither. If you are buying a 4GB card you should expect that it use all 4GB and not be a laggy stuttering pile of crap if you use over 3.5GB. As much as i like Nvidia, a lawsuit should happen as there's no way Nvidia couldn't have known about the issue the whole time.
> I like my 970 G1,but am really disappointed that I didn't get all of the useable vram that i pad for. I never bothered to update my signature,and might not while i decide if i should sell my 970. I could probably sell it and get a 4GB 770 or a 4GB 290X and know i'm getting what i paid for,lol.
> It's like buying that 600HP Ford GT then finding out you can't use all 600hp because it stalls out when you were to try using all 600hp.


Oh... I didn't realize the 970s were completely shutting off when you hit 3.5GB

Stalling and "stuttering" are not the same

they are not shutting off -_-


----------



## NuclearPeace

If it's an open-and-shut class action, then go ahead and sue.


----------



## Quasimojo

Quote:


> Originally Posted by *Heavy MG*
> 
> But it doesn't work as it says on the box. Nvidia is falsely advertising it as a 4GB 256bit GPU when it is neither. If you are buying a 4GB card you should expect that it use all 4GB and not be a laggy stuttering pile of crap if you use over 3.5GB. As much as i like Nvidia, a lawsuit should happen as there's no way Nvidia couldn't have known about the issue the whole time.


It's not a "laggy stuttering pile of crap". Your continued insistence on hanging your hat on a point that has been disproven by every benchmark and review available discredits you.

If they had come right out and detailed these design choices right up front - told us exactly how every byte of that 4GB of VRAM was utilized - people still would have read all the reviews and stood in line to buy two of them. Yet here we are storming their office building with pitchforks, based on information that doesn't amount to much of anything, when you look at the end result.

This thread is beyond ridiculous, and someone at nVidia is likely to get demoted or fired because of this kind of hysteria. It's a shame.

Enjoy your lynch mob. I'm out.


----------



## gamervivek

Quote:


> The error, as NVIDIA explains it, is that in creating the GTX 970 reviewer's guide, the technical marketing team was unaware of Maxwell's aforementioned and new "partial disable" capabilities when they filled out the GTX 970 specification table. They were aware that the GTX 970 would have the full 256-bit memory bus, and unaware of the ability to independently disable ROPs they assumed that all 64 ROPs and the full 2MB of L2 cache was similarly available and wrote the specification table accordingly. This error then made it into the final copy of the guide, not getting caught even after being shared around various groups at NVIDIA, with that information finally diffused by press such as ourselves.











Quote:


> With that in mind, given the story that NVIDIA has provided, do we believe them? In short, yes we do. To be blunt, if this was intentional then this would be an incredibly stupid plan, and NVIDIA as a company has not shown themselves to be that dumb. NVIDIA gains nothing by publishing an initially incorrect ROP count for the GTX 970, and if this information had been properly presented in the first place it would have been a footnote in an article extolling the virtues of the GTX 970, rather than the centerpiece of a full-on front page exposé.


Always attribute to greed which can be explained by stupidity.
Quote:


> Coming up with real (non-synthetic) gaming workloads that can utilize between 3.5GB and 4GB of VRAM while not running into a rendering performance wall is already a challenge, and all the more so when trying to find such workloads that actually demonstrate performance problems. This at first glance does seem to validate NVIDIA's overall claims that performance is not significantly impacted by the memory segmentation, but we're going to continue looking to see if that holds up.


>3.5GB would become an AMD Gaming Evolved motto from now on.

http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation


----------



## provost

Quote:


> Originally Posted by *Xoriam*
> 
> Yeah this is what should be focused on.
> 
> Not the ram, because it is ALL useable.


Would it have really cost that much more for Nvidia to have included the ROPs and cache specs as advertised, or is this more of a performance segmentation issue to differentiate the 980 from the 970?


----------



## cq3mrd

Quote:


> Originally Posted by *Quasimojo*
> 
> It's not a "laggy stuttering pile of crap". Your continued insistence on hanging your hat on a point that has been disproven by every benchmark and review available discredits you.
> 
> If they had come right out and detailed these design choices right up front - told us exactly how every byte of that 4GB of vram was utilzied - people still would have read all the reviews and stood in line to buy two of them.


And you know this how? I say that people wouldn't have bought a 3.5GB card instead of the 980/290(x). See how easy it is to make unfounded remarks?
Quote:


> Originally Posted by *Quasimojo*
> 
> Yet here we are storming their office building with pitchforks, based on information that doesn't amount to much of anything, when you look at the end result.
> 
> This thread is beyond ridiculous, and someone at nVidia is likely to get demoted or fired because of this kind of hysteria. It's a shame.
> 
> Enjoy your lynch mob. I'm out.


So you were *proven* wrong, refused to acknowledge it, went and made unfounded claims, lashed out at others that disagreed with you and finally took your toys and went away?

Well boo-hoo!


----------



## nSone

so after this disclosure *ALL* reviews out there are now obsolete right?
the GTX 970 is still a great card, but oh, this sure is a mess... please don't tell me no one on nVidia's technical/PR staff ever noticed that all those specs on trusted review sites were wrong.
now that I think of it, that would explain why there was no info on the 970 in the famous and over-quoted Maxwell whitepaper


----------



## Xoriam

Quote:


> Originally Posted by *provost*
> 
> Would it have really cost that much more for Nvidia to have included the rops, and cache specs, as advertised, or is this more of a performance segmentation issue to differentiate 980 from 970?


It is probably to differentiate the 980 from the 970, however it's kind of coming back to bite them in the ass now.


----------



## Forceman

Quote:


> Originally Posted by *nSone*
> 
> so after this disclosure *ALL* reviews out there are now obsolete right?


The performance is what the performance is. The specification table of all the reviews is wrong, but the actual test results haven't changed.


----------



## specopsFI

Frankly, I've been very understanding to Nvidia even after the first statement as it painted a situation where the 512MB section wasn't completely neutered but rather advantageous. This more in-depth story paints a different scenario and sadly, this is the real thing. That smaller section of the VRAM really is bogus. It can be used and it can be of use, but there will always be situations where a crucial texture is in the slower VRAM section at the wrong time. Those times might not be too common, but play at settings requiring 3.5-4GB of VRAM and they are sure to pop in (pun intended).

I've been very happy about my G1, one of the nicest GPUs I've owned, but there are potential future usage scenarios that are tarnished by Nvidia not giving us the full info on such obvious shortcomings of their design. That is really it: IMHO this actually was a lie that caused harm to me. Besides, there was an actual lie in their reviewers guide about the ROP count and L2 cache size.

So as things are, I do expect to be compensated to some degree. I'm still interested in the FCAT results that are sure to follow, but no amount of testing with current games can put my mind at ease that this VRAM issue won't become a problem further down the road.


----------



## GTR Mclaren

This is a true problem

I owned a 660 with the same "problem"

only 1.5GB were "true"

when games used more than that, lag and sudden frame drops appeared

Nvidia dropped the ball; between this and the 960 fail, it's a bad month for the green team


----------



## darealist

Watching the nVidia defenders against the army of AMD trolls is hilarious.


----------



## RagingCain

Looks like nVidia screwed up and end-users should be compensated in some form. A free game sounds fair.

Geforce GTX 970 is still a great card for the money.


----------



## iSlayer

Remember kids, we need actual testing to see what the impact of this is before we rage. As of now all 4GBs is available and can be used, we just don't know what the performance penalty is.
Quote:


> Originally Posted by *RagingCain*
> 
> Looks like nVidia screwed up and end-users should be compensated in some form. A free game sounds fair.
> 
> Geforce GTX 970 is still a great card for the money.


I'd prefer my $ back, but I'm one of those people who owns a 970 and thus has room to complain.

Edit: lol


----------



## Cyro999

So it's official that although the 970 has the same specs as the 980 on paper:

970 has 56 ROPs usable instead of 64
970 has 12.5% of its L2 cache DISABLED

970 heavily prefers NOT to use the 8th memory chip, and when it does access it, can only do so by cannibalizing bandwidth from the seventh. It can never be used to increase overall bandwidth.

As a result, even within the fast 3.5GB of memory there is a total bandwidth loss of 1/8th, i.e. 12.5%. That does significantly affect performance, and it's a huge part of the reason the 980 performs almost linearly better with the increase in core count (so many other resources are tied to it)
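The per-segment numbers being thrown around follow from simple arithmetic on the commonly cited GTX 970 figures - a 256-bit bus built from eight 32-bit GDDR5 channels at 7 Gbps per pin. These inputs are assumptions for illustration, not NVIDIA-confirmed internals:

```python
# Back-of-the-envelope GTX 970 bandwidth math. Inputs are the commonly
# cited figures from this thread (assumed, not official).
CHANNELS = 8          # eight 32-bit channels make up the 256-bit bus
BITS_PER_CHANNEL = 32
GBPS_PER_PIN = 7      # effective GDDR5 data rate per pin

per_channel = BITS_PER_CHANNEL * GBPS_PER_PIN / 8  # GB/s per channel: 28.0
total = CHANNELS * per_channel                     # advertised peak: 224.0

# The fast 3.5GB segment stripes across seven channels; the 0.5GB
# segment sits alone on the eighth and cannot stripe with the others.
fast_segment = 7 * per_channel  # 196.0 GB/s
slow_segment = 1 * per_channel  # 28.0 GB/s

print(per_channel, total, fast_segment, slow_segment)
```

Which is where the roughly 1/8th (12.5%) peak-bandwidth loss for the fast segment comes from.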


----------



## nSone

Quote:


> Originally Posted by *Forceman*
> 
> The performance is what the performance is. The specification table of all the reviews is wrong, but the actual test results haven't changed.


well, right, but now that I think about it - I was blaming HardOCP for not testing the 970 in SLI properly, but I might need to take my words back
this should have been disclosed from day 1, to all consumers and reviewers, and we'd all be happy having made our choices on informed specs/reviews
for me this is a mess, and I'll be returning two 970s tomorrow, but I don't want to know how this will affect retailers


----------



## PureBlackFire

GTX 970 is still the value champ IMO.


----------



## skupples

Why is NV always fudging ROP numbers?

Also, reviewers have been doing whack jobs for quite some time. Nothing new here.


----------



## Forceman

Sounds like it really is going to come down to a game-by-game or driver-by-driver issue, where some games will properly be able to keep priority assets in the priority memory and others won't, and some drivers will be able to help compensate and some won't. So likely a non-issue for a large majority of titles, but with hidden landmines waiting to blow up your performance at any time.

The evil part of me wonders if we might end up seeing a GTX 970 Ti that has the same number of SMMs but the full 64 ROPs (thereby giving back that 4-6% performance and full-speed 4GB access). That'd probably be too evil even for Nvidia though - but I bet if they hadn't botched the 970 roll-out this way it would be a possibility. Doesn't sound like there's any reason they _had_ to disable those ROPs.
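The "priority assets in the priority memory" behavior can be pictured with a toy model. Everything below (the function, the sizes, the spill policy) is a hypothetical illustration, not NVIDIA's actual driver logic:

```python
# Toy model of placement on a segmented card: the fast 3.5GB segment is
# filled first; allocations spill to the slow 0.5GB segment only after,
# and to system RAM once both are full. Hypothetical illustration only.
FAST_MB, SLOW_MB = 3584, 512

def place(allocations_mb):
    """Return (segment, size_mb) placements, filling the fast segment first."""
    placements, fast_used, slow_used = [], 0, 0
    for size in allocations_mb:
        if fast_used + size <= FAST_MB:
            placements.append(("fast", size))
            fast_used += size
        elif slow_used + size <= SLOW_MB:
            placements.append(("slow", size))
            slow_used += size
        else:
            placements.append(("system_ram", size))
    return placements

# A game requesting 4GB in 512MB chunks: the first seven chunks land in
# the fast segment, the eighth in the slow one.
print(place([512] * 8))
```

Whether a real driver can also keep the *right* assets - the frequently touched ones - out of the slow segment is exactly the game-by-game question raised above.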


----------



## jprovido

so I'm not sure if I got it right...there was just a MISUNDERSTANDING between the engineering team and the PR team and no one from nvidia knew about this?

do you guys believe this bs? lies lies lies lies lies


----------



## PureBlackFire

Quote:


> Originally Posted by *Forceman*
> 
> Sounds like it really is going to come down to a game-by-game or driver-by-driver issue, where some games will properly be able to keep priority assets in the priority memory and others won't, and some drivers will be able to help compensate and some won't. So likely a non-issue for a large majority of titles, but with hidden landmines waiting to blow up your performance at any time.
> 
> Makes me wonder if we might end up seeing a GTX 970 Ti that has the same number of SMMs but the full 64 ROPs (thereby giving back that 4-6% performance and full-speed 4GB access).


a company rep (I believe it was MSI) did mention a 970 Ti a few months ago, iirc. I made my choice to sidegrade knowing it likely wasn't going to be permanent anyway. I do feel some kind of way about going from a 4GB/64 ROP gpu to a 4GB/56 ROP gpu. performance has been satisfactory though and I don't crank settings when I play at 4K anyway.
Quote:


> Originally Posted by *jprovido*
> 
> so I'm not sure if I got it right...there was just a MISUNDERSTANDING between the engineering team and the PR team and no one from nvidia knew about this?
> 
> do you guys believe this bs? lies lies lies lies lies


I don't for a second. but it's whatever.


----------



## Woundingchaney

I think the 970 is excellent value for the money. Having said that, if I had known about the issue with the ROPs and the segmented memory, it would definitely have made me second-guess my purchase. I game at 4K, and current performance, while important, is not as important as performance 12-18 months from the time of purchase. I think it is very reasonable to say that this hampers the longevity of my purchase.

I'm on the fence as to what I want to do with my cards, but I do expect some sort of compensation. I simply have no idea how to achieve it.

I upgraded from CrossFire 290s for reasons such as HDMI 2.0 and noise, but now that the video memory is hampered, I'm not sure I would make that decision again.


----------



## criminal

Quote:


> Originally Posted by *Forceman*
> 
> Sounds like it really is going to come down to a game-by-game or driver-by-driver issue, where some games will properly be able to keep priority assets in the priority memory and others won't, and some drivers will be able to help compensate and some won't. So likely a non-issue for a large majority of titles, but with hidden landmines waiting to blow up your performance at any time.
> 
> The evil part of me wonders if we might end up seeing a GTX 970 Ti that has the same number of SMMs but the full 64 ROPs (thereby giving back that 4-6% performance and full-speed 4GB access). That'd probably be too evil even for Nvidia though - but I bet if they hadn't botched the 970 roll-out this way it would be a possibility. Doesn't sound like there's any reason they _had_ to disable those ROPs.


Yep. 970 users will have to be concerned if new games coming out will be able to address the vram on their cards correctly. Not something I would be happy about.


----------



## Wirerat

I would prefer a way to deactivate the 512MB in NVCP, just in case some game doesn't handle it correctly.


----------



## XXnomadXX

Go back in time and have the reviewers write that the GTX 970 has 4GB of memory but only 3.5GB is usable at normal speed, and that past that, the 0.5GB will cripple the speed of the card, making 3.5GB the real usable amount. I wonder if consumers would still look at the GTX 970 as worth a true 4GB card's price.


----------



## PontiacGTX

Quote:


> Originally Posted by *criminal*
> 
> Yep. 970 users will have to be concerned if new games coming out will be able to address the vram on their cards correctly. Not something I would be happy about.


More VRAM-demanding ports for sure


Spoiler: List!



http://www.overclock.net/content/type/61/id/2329546/width/500/height/1000
http://www.overclock.net/content/type/61/id/2329544/width/500/height/1000
http://www.overclock.net/content/type/61/id/2329543/width/500/height/1000
http://www.overclock.net/content/type/61/id/2329547/width/500/height/1000
http://www.overclock.net/content/type/61/id/2329548/width/500/height/1000
http://www.overclock.net/content/type/61/id/2329550/width/500/height/1000
http://www.overclock.net/content/type/61/id/2329554/width/500/height/1000
http://www.overclock.net/content/type/61/id/2329553/width/500/height/1000
http://www.overclock.net/content/type/61/id/2329552/width/500/height/1000
http://www.overclock.net/content/type/61/id/2329551/width/500/height/1000
http://www.overclock.net/content/type/61/id/2329555/width/500/height/1000


----------



## fleetfeather

Quote:


> Originally Posted by *Wirerat*
> 
> I would prefer a way to deactivate the 512mb in nvcp. Just incase some game doesn't handle it corectly.


agreed. i want to ensure applications aren't allocating any of the 0.5gb partition for games. Actually, i'd love to see this 0.5gb partition as a system ram cache of some sort, since it's apparently "4x faster" than regular system ram


----------



## Luck100

Quote:


> Originally Posted by *fleetfeather*
> 
> agreed. i want to ensure applications aren't allocating any of the 0.5gb partition for games. Actually, i'd love to see this 0.5gb partition as a system ram cache of some sort, since it's apparently "4x faster" than regular system ram


The 0.5 GB partition is only faster than system RAM for the GPU. The CPU will be limited by PCIE bus when it reads/writes that partition.


----------



## Wirerat

Quote:


> Originally Posted by *fleetfeather*
> 
> agreed. i want to ensure applications aren't allocating any of the 0.5gb partition for games. Actually, i'd love to see this 0.5gb partition as a system ram cache of some sort, since it's apparently "4x faster" than regular system ram


it adds unwanted latency. What if a game that runs fine with 3.5GB caches 3.6GB just because it thinks it's available? It will slow things down for no good reason.

I know the driver will deal with that but having some control would be nice.

I want to overclock them separately too lol.


----------



## MyLeftNut

I'm pretty sure even a game like COH2 at 1080p max settings uses over 3.5gb. MSI afterburner reports 3520+ for memory usage


----------



## fleetfeather

Quote:


> Originally Posted by *Luck100*
> 
> The 0.5 GB partition is only faster than system RAM for the GPU. The CPU will be limited by PCIE bus when it reads/writes that partition.


Oh, so the system RAM can be used for the GPU, but VRAM can't be used for the system? I don't know a whole heap regarding EE








Quote:


> Originally Posted by *Wirerat*
> 
> it adds unwanted latency. What if a game that runs fine with 3.5gb is caching 3.6 just cause it things its available? It will slow things down for no good reason.


yep


----------



## Olivon

Quote:


> What to think? Return the GTX 970?
> 
> We are currently still analyzing the information and will complete this article with further explanations. But of course the question of how current owners of the GeForce GTX 970 should take this news is difficult to avoid.
> We believe the GeForce GTX 970 should now be seen as a graphics card downgraded to a 224-bit memory bus and merely 56 ROPs. The specifications change, but not its good performance, since this deficit was already baked into the initial results. It is the communication problem that emerges now, and a user can rightly feel that the product was misrepresented.
> Outside of synthetic tests designed to expose it, we have not observed cases where there is a real problem, but it is not impossible that such situations exist, for example in poorly optimized games or with multi-GPU systems. If necessary, Nvidia may in some cases improve the behavior of the GTX 970 via new drivers. But the GTX 970 will never use 4GB as well as the GTX 980 does.
> The GeForce GTX 970 remains an excellent graphics card, but we must keep some reservations about multi-GPU, which tends to push memory subsystems to their limits.
> In any case, we believe a user has the right to feel aggrieved, and that this error in the original specifications is sufficient grounds to demand a return of the product to the dealer...


http://www.hardware.fr/focus/106/gtx-970-3-5-go-224-bit-lieu-4-go-256-bit.html

*Translated*


----------



## gigafloppy

It's what I feared. The 3.5GB section does not have the promised 224 GB/s of bandwidth, only 196 GB/s.

So we were promised:
4GB, 224 GB/s, 64 ROPs. What we got:
3.5GB, 196 GB/s, 56 ROPs.

And this was just a 'mistake' by the marketing team? No one noticed the wrong numbers in the reviews? For FOUR months? Really?


----------



## iSlayer

^ facepalm, read the anandtech post.
Quote:


> Originally Posted by *criminal*
> 
> Yep. 970 users will have to be concerned if new games coming out will be able to address the vram on their cards correctly. Not something I would be happy about.


We have gone months without it being an issue. I think its safe to say it won't be.
Quote:


> Originally Posted by *XXnomadXX*
> 
> go back in time and the reviewers write or type that the gtx 970 have 4GB of memory but only 3.5GB is usable for normal speed but past that the 0.5 will cripple the speed of the card thus showing only a 3.5GB is the real usable of the card. i wonder if the consumers will look at the gtx 970 as a true 4GB price card.


We would still look at the performance and have our socks knocked off, given the price.

The last lines of the Anandtech article said it best, something a lot of very thick people seem to not be getting.
Quote:


> But so far with this new information we have been unable to break the GTX 970, which means NVIDIA is likely on the right track and the GTX 970 should still be considered as great a card now as it was at launch. In which case what has ultimately changed today is not the GTX 970, but rather our perception of it.


Repeat after me.
*what has changed today is not the GTX 970, but rather our perception of it.*


----------



## skupples

Quote:


> Originally Posted by *jprovido*
> 
> so I'm not sure if I got it right...there was just a MISUNDERSTANDING between the engineering team and the PR team and no one from nvidia knew about this?
> 
> do you guys believe this bs? lies lies lies lies lies


I believe it just about as much as the nonsense AMD spews @ their little gatherings about being the world's fastest everything.


----------



## jprovido

I have a crazy/stupid idea: would it be possible to unlock the ROPs on SOME of the GTX 970s, like ACC on AMD CPUs back in the day? nvidia would release a BIOS update unlocking the cards with functional ROPs and L2 cache and just RMA the others. it's a long shot but I tried, rofl. I just figured there could be a number of 970s with merely "disabled" components, like those AMD CPUs.


----------



## provost

Quote:


> Originally Posted by *Xoriam*
> 
> It is probably to differentiate the 980 from the 970, however it's kind of coming back to bite them in the ass now.


I think you're right. But the optics of this faux pas are definitely worse than the actual issue, as it creates a perception that will take some time to correct itself.
As someone said earlier in this thread, it makes you wonder what else there is that we don't know.
The old adage that what you don't know won't hurt you goes out the window, since the issue is known now.
Nvidia can take the PR hit on the 970 and be OK, but how this may change people's perception of buying Nvidia cards in the future, only time will tell.


----------



## Luck100

Quote:


> Originally Posted by *fleetfeather*
> 
> Oh, so the system RAM can be used for the GPU, but VRAM can't be used for the system? I don't know a whole heap regarding EE
> 
> 
> 
> 
> 
> 
> 
> 
> yep


Your CPU can use the VRAM, but communication between CPU and VRAM goes through the PCIE bus. PCIE tops out at 16 GB/sec theoretical maximum (less in practice).


----------



## Xoriam

Quote:


> Originally Posted by *jprovido*
> 
> I have a crazy/stupid idea. would it be possible to unlock the rops on SOME of the gtx 970's like ACC on AMD cpu's back in the day. nvidia would be releasing this bios update unlocking the cards with functional ROP's, L2 cache and just RMA the others. it's a long shot but I tried rofl. I just figured there could be a number of 970's with just "disabled" components like those amd cpu's.


If I'm not mistaken the chip is actually cut.
Not sure how much they can enable via BIOS.


----------



## criminal

Quote:


> Originally Posted by *Xoriam*
> 
> If I'm not mistaken the chip is actually cut.
> Not sure how much they can enable via BIOS.


Yeah, I think that is the case. Otherwise a community like OCN would have already stumbled upon it and everyone with a 970 would be running a 980 bios.


----------



## fleetfeather

Quote:


> Originally Posted by *Luck100*
> 
> Your CPU can use the VRAM, but communication between CPU and VRAM goes through the PCIE bus. PCIE tops out at 16 GB/sec theoretical maximum (less in practice).


gotcha. thanks!


----------



## Xoriam

Quote:


> Originally Posted by *criminal*
> 
> Yeah, I think that is the case. Otherwise a community like OCN would have already stumbled upon it and everyone with a 970 would be running a 980 bios.


Lol you know someone would have found out how to unlock them to 980s by now.


----------



## Wirerat

Two words. Nvidia knew.


----------



## rdr09

Quote:


> Originally Posted by *skupples*
> 
> I believe it just about as much as the nonsense AMD spews @ their little gatherings about being the world's fastest everything.


why bring up AMD? Nvidia duped them not AMD. lol


----------



## fleetfeather

Quote:


> Originally Posted by *rdr09*
> 
> why bring up AMD? Nvidia duped them not AMD. lol


NV is _pulling a Roy_


----------



## jprovido

Quote:


> Originally Posted by *Xoriam*
> 
> If I'm not mistaken the chip is actually cut.
> Not sure how much they can enable via BIOS.


it's a hit-or-miss thing. I know it's possible some of these GPUs were not laser cut and were just disabled. I've unlocked a Phenom II X2 555 before; all cores were stable with L3 cache. my friend's chip had an unstable 4th core, etc. if it works for you, then hooray for you. if it doesn't, you get a free game? see, I knew it was stupid lol


----------



## criminal

Quote:


> Originally Posted by *fleetfeather*
> 
> NV is _pulling a Roy_











Quote:


> Originally Posted by *jprovido*
> 
> it's like a hit or miss thing.I know it's possible some of these gpus were not laser cut and was just disabled. I've unlocked a Phenom II x2 555 before all cores were stable with l3 cache. my friend's chip had an usntable 4th core etc. if it works for you then hooray for you. if it doesn't you get a free game? see I knew it was stupid lol


I think in those cases it was just easier to disable cores. Nvidia is too greedy. 970 had to be laser cut... no way they "just disabled" anything.


----------



## Noufel

But the 970 is still the king in the 3.5 gb segment ...... just kidding


----------



## Noufel

Quote:


> Originally Posted by *fleetfeather*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> why bring up AMD? Nvidia duped them not AMD. lol
> 
> 
> 
> NV is _pulling a Roy_
Click to expand...

best post in this thread


----------



## rdr09

Quote:


> Originally Posted by *Noufel*
> 
> best post in this thread


i knew he works for NV. lol


----------



## Vesku

Quote:


> You should take two things away from that simple description. First, despite initial reviews and information from NVIDIA, the GTX 970 actually has fewer ROPs and less L2 cache than the GTX 980. NVIDIA says this was an error in the reviewer's guide and a misunderstanding between the engineering team and the technical PR team on how the architecture itself functioned.


Quote:


> That means the GTX 970 has 56 ROPs and 1792 KB of L2 cache compared to 64 ROPs and 2048 KB of L2 cache for the GTX 980.


So is this still not a big deal? Personally I think the smaller L2 cache is a bigger deal than even the RAM partitioning. There are definitely a few people who would have splurged for a 980, or been OK jumping ship to a 290/290X after their prices dropped.


----------



## Poisoner

Looks like I dodged a bullet this go around by skipping the GTX 970.

If it sounds too good to be true...


----------



## skupples

Quote:


> Originally Posted by *provost*
> 
> I think that you are right. But, the optics of this faux pas are definitely worst than the actual issue, as this goes on to creating a perception which will take some time to correct itself.
> As someone said earlier in this thread, it makes you wonder what else is there that we don't know.
> The old adage of what you don't know, won't hurt you, goes out the window, since the issue is known now.
> Nvidia can take the PR hit on the 970, and be ok, but how this may change people's perception of buying Nvidia cards in the future , only time will tell.


makes me wonder what else they did to ninja nerf Vanilla Titans, as we all know we ended up getting way more power out of those than Nvidia ever factored for. They never expected the avg watercooler to get 1300MHZ+ out an A1 Revision GK110 chip.
Quote:


> Originally Posted by *Poisoner*
> 
> Looks like I dodged a bullet this go around by skipping the GTX 970.
> 
> If it sounds too good to be true...


are you gaming @ 4K? No? Doesn't matter. Were you planning to buy 2-3 of them to push the kind of settings required to use all 4GB while still maintaining 60+FPS? no? then it doesn't matter.


----------



## jprovido

Quote:


> Originally Posted by *Vesku*
> 
> So is this still not a big deal?


It is a big deal. Lawsuits are imminent


----------



## PureBlackFire

Quote:


> Originally Posted by *skupples*
> 
> I believe it just about as much as the nonsense AMD spews @ their little gatherings about being the world's fastest everything.


this topic has nothing to do with AMD.
Quote:


> Originally Posted by *Noufel*
> 
> But the 970 is still the king in the 3.5 gb segment ...... just kidding


best 3.5GB gpu ever!


----------



## gamervivek

Quote:


> Originally Posted by *jprovido*
> 
> so I'm not sure if I got it right...there was just a MISUNDERSTANDING between the engineering team and the PR team and no one from nvidia knew about this?
> 
> do you guys believe this bs? lies lies lies lies lies


A misunderstanding that went on for four months, and one that is now being touted as an architectural improvement over Kepler.


----------



## skupples

Quote:


> Originally Posted by *PureBlackFire*
> 
> this topic has nothing to do with AMD.
> best 3.5GB gpu ever!


It's called a comparison







Felt like it was needed, since you know... NV bad, AMD good!

acting like you can't trust one company, without stating that the other isn't trustworthy either, leaves the statement open to interpretation.

Intel, AMD, NV...

can't trust any of them, period. Neither of these entities are your friend, as much as some of them pretend to be.

The goal? $$$$

The goal? The best way to swoon you away from your $$$.


----------



## MerkageTurk

The 290X or 290 is far better than what Nvidia has on offer in terms of price/performance.

The Ti is rubbish now, after Nvidia stopped optimizing drivers for it, so we can upgrade to a mediocre GPU, the 970/980.

I did not expect such a low move by Nvidia, as I thought Nvidia was a premium brand, just like Apple (LOL).

If AMD did this I am sure many would be complaining and asking for refunds or compensation, i.e. games.


----------



## provost

Quote:


> Originally Posted by *skupples*
> 
> Makes me wonder what else they did to ninja nerf vanilla Titans, as we all know we ended up getting way more power out of those than Nvidia ever factored for. They never expected the average watercooler to get 1300MHz+ out of an A1 revision GK110 chip.


But at least the vanilla Titans met the advertised hardware specs... Lol
What's happened to the 970 is in a league of its own when it comes to nerfing performance.


----------



## skupples

Quote:


> Originally Posted by *provost*
> 
> But at least the vanilla Titans met the advertised hardware specs... Lol
> What's happened to the 970 is in a league of its own when it comes to nerfing performance.


if I'm remembering correctly, no, no they didn't.

Wasn't there a theoretical Vs. Actual ROPs issue w/ GK110?
Quote:


> Originally Posted by *MerkageTurk*
> 
> 290x or 290 is far better than what nvidia has on offer, regarding price performance
> 
> the Ti is rubbish now, after nvidia not optimizing drivers, so we can upgrade to a mediocre gpu 970/980
> 
> I did not expect such a low move by nVidia, as I thought nvidia was a premium brand, just like Apple (LOL)
> 
> If AMD did this I am sure many will be complaining and asking for refunds or compensation i,e. games


truth, I expect GTA-V to be one of the first major tells that Nvidia has been tampering with GK110s via drivers.

also something else to note. They go back and update old drivers, so drivers you have stored locally, will end up being different than old drivers hosted on their website.

The funny thing is, they don't need to play these games. They make plenty of money, and would continue to make plenty of money w/o screwing with the end user.

Messing w/ GK110 in the drivers just makes an Nvidia user want to switch over to AMD.

Especially now w/ the price of the 290X, its only issue being obscene power usage & heat.


----------



## iSlayer

Quote:


> Originally Posted by *Xoriam*
> 
> If I'm not mistaken the chip is actually cut.
> Not sure how much they can enable via BIOS.


Quote:


> Originally Posted by *criminal*
> 
> Yeah, I think that is the case. Otherwise a community like OCN would have already stumbled upon it and everyone with a 970 would be running a 980 bios.


Yeah, I doubt NV would leave it possible for us to unlock to a 980.
Quote:


> Originally Posted by *Vesku*
> 
> So is this still not a big deal? Personally I think the smaller L2 cache is a bigger deal than even the RAM partitioning, definitely a few people who would have splurged for a 980 or been OK jumping ship to a 290/290X after their prices dropped.


For anyone looking to get a 970? No, the 970 still performs like it does.
For those of us with a 970? Well, a class-action lawsuit wouldn't be unwelcome. I wouldn't mind my cash back.
Quote:


> Originally Posted by *Poisoner*
> 
> Looks like I dodged a bullet this go around by skipping the GTX 970.
> 
> If it sounds too good to be true...


The performance is still exactly what it is. It'd only be too good to be true if the benchmarks reported inaccurate performance levels.
Quote:


> Originally Posted by *jprovido*
> 
> It is a big deal. Lawsuits are imminent


Well, time to get ready for that. I would like my cash back.
Quote:


> Originally Posted by *MerkageTurk*
> 
> 290x or 290 is far better than what nvidia has on offer, regarding price performance
> 
> the Ti is rubbish now, after nvidia not optimizing drivers, so we can upgrade to a mediocre gpu 970/980
> 
> I did not expect such a low move by nVidia, as I thought nvidia was a premium brand, just like Apple (LOL)
> 
> If AMD did this I am sure many will be complaining and asking for refunds or compensation i,e. games


The Kepler nerf turned out to be panic based on Ubicrap games.

Yes, the 290(x) is better price/performance, that's not the sole metric of a GPU.

No, the Ti isn't rubbish, at all.

You know how much talk there is of lawsuits in this thread? Even ignoring half of it being from people without 970s?


----------



## provost

Quote:


> Originally Posted by *skupples*
> 
> if I'm remembering correctly, no, no they didn't.
> 
> Wasn't there a theoretical Vs. Actual ROPs issue w/ GK110?
> truth, I expect GTA-V to be one of the first major tells that Nvidia has been tampering with GK110s via drivers.
> 
> also something else to note. They go back and update old drivers, so drivers you have stored locally, will end up being different than old drivers hosted on their website.


You know , I never knew about that rops issue, my biggest beef was them throttling it via the power throttle, which luckily the good guys here were able to fix.

Appreciate the heads up on the old drivers... gotta make sure I don't delete my backups.
Yeah, I am keeping a cautious eye out on the drivers for dumbing down performance issues on older cards.


----------



## iSlayer

Quote:


> Originally Posted by *skupples*
> 
> if I'm remembering correctly, no, no they didn't.
> 
> Wasn't there a theoretical Vs. Actual ROPs issue w/ GK110?
> truth, I expect GTA-V to be one of the first major tells that Nvidia has been tampering with GK110s via drivers.
> 
> also something else to note. They go back and update old drivers, so drivers you have stored locally, will end up being different than old drivers hosted on their website.
> 
> The funny thing is, they don't need to play these games. They make plenty of money, and would continue to make plenty of money w/o screwing with the end user.
> 
> Messing w/ GK110 in the drivers just makes an Nvidia user want to switch over to AMD.
> 
> Especially now w/ the price of the 290X, its only issue being obscene power usage & heat.


Care to provide proof of driver manipulation?


----------



## skupples

@jprovido

lawsuits for what exactly?

people keep tossing this around, yet have an obscenely loose case.

the card has 4 physical GBs of GDDR5 memory, and can use all of those giggles, so please tell us how lawsuits are imminent... Seriously, such a joke.

I haven't seen one lawyer pipe up about this, and I can guarantee at LEAST ONE is in this thread viewing & posting right now. Most likely laughing his ass off over these statements.

Does this mean I can sue AMD over Bulldozer? it was advertised as the bestest! but it sucked.


----------



## Seven7h

Quote:


> Originally Posted by *Wirerat*
> 
> I would prefer a way to deactivate the 512MB in NVCP, just in case some game doesn't handle it correctly.


There are exactly zero cases where that would be better. System memory is far slower, especially if the resource is no longer in system memory and was paged out to disk.

This is like saying "One tire on my car is a slightly different size. I'd rather take it off and have to drive like that than drive with a different sized tire."


----------



## Woundingchaney

Quote:


> Originally Posted by *skupples*
> 
> @jprovido
> 
> lawsuits for what exactly?
> 
> people keep tossing this around, yet have an obscenely loose case.
> 
> the card has 4 physical GBs of GDDR5 memory, and can use all of those giggles, so please tell us how lawsuits are imminent... Seriously, such a joke.
> 
> I haven't seen one lawyer pipe up about this, and I can guarantee at LEAST ONE is in this thread viewing & posting right now. Most likely laughing his ass off over these statements.
> 
> Does this mean I can sue AMD over Bulldozer? it was advertised as the bestest! but it sucked.


It doesn't have the ROPs that it was stated as having, and while it does have the physical 4 gigs on the card, the bandwidth is misstated as well.

These are both issues of false advertising and misrepresenting a consumer device.


----------



## MerkageTurk

AMD's Bulldozer is good at multi-threaded applications, plus it came as advertised, nothing missing.

This, however, is missing something.


----------



## jprovido

I bought a 4GB VRAM card and I expect to get one. Since GTX 980s have 4GB of VRAM and are the same architecture, it would be fair if they replaced my two 970s with two 980s and one of Jen-Hsun's jackets. I am not taking advantage of the situation at all.


----------



## Wirerat

Quote:


> Originally Posted by *Seven7h*
> 
> There are exactly zero cases where that would be better. System memory is far slower, especially if the resource is no longer in system memory and was paged out to disk.
> 
> This is like saying "One tire on my car is a slightly different size. I'd rather take it off and have to drive like that than drive with a different sized tire."


I want the control. I didn't say it would be off all the time.

Consider a situation where a game is caching 5MB over the 3.5GB partition and adding latency for nothing. I don't trust the driver to always make the right choice with every setting/resolution/game.

Your scenario assumes the driver makes the optimal choice every time.


----------



## PureBlackFire

Quote:


> Originally Posted by *skupples*
> 
> It's called a comparison
> 
> 
> 
> 
> 
> 
> 
> Felt like it was needed, since you know... NV bad, AMD good!
> 
> acting like you can't trust one company, without stating that the other isn't trustworthy either, leaves the statement open to interpretation.
> 
> Intel, AMD, NV...
> 
> can't trust any of them, period. Neither of these entities are your friend, as much as some of them pretend to be.
> 
> The goal? $$$$
> 
> The goal? The best way to swoon you away from your $$$.


that's all true, but we just got the real specs of a runaway success GPU today. it doesn't change the admirable performance/dollar and performance/watt at all, but it may become an issue in the future, and at the very least it is deceptive. this wasn't some accident or miscommunication. they probably never expected to be under the microscope after they took $70 off the standard GTX x70 price tag. as for GK110, I'm sure (by looking at recent games and after updates in GM204 and Hawaii drivers) that Nvidia is simply ignoring them in their driver optimization in favor of the new toys right now. to hell with the $1k price tag, but it cannot be said that the Titan didn't age well. turn up the clocks and it stomps, as you should know.


----------



## vloeibaarglas

Good lord Nvidia.

Lower ROP (64 -> 56)

Misrepresented VRAM (4 GB -> 3.5 GB + 0.5 GB at 1/7 speed)

Lower L2 cache (2MB -> 1.75 MB)

Lower theoretical memory bandwidth (224 -> 196 GB/s)


----------



## Forceman

Quote:


> Originally Posted by *iSlayer*
> 
> For those of us with a 970? Well, a class-action lawsuit wouldn't be unwelcome. I wouldn't mind my cash back.


Please. At best you'd get a voucher for $5 in free to play coins 5 years from now. No one is getting their money back from Nvidia from this. If you are very lucky you might get a vendor to accept a return on a card based on this, but I doubt even that.
Quote:


> Originally Posted by *Woundingchaney*
> 
> It doesn't have the ROPs that it was stated as having and while it does have the physical 4 gigs on the card the bandwidth is misstated as well.
> 
> These are both issues of false advertising and misrepresenting a consumer device.


Well then run on down to small claims court and file suit. No lawyer needed.


----------



## iSlayer

Quote:


> Originally Posted by *Wirerat*
> 
> I want the control. I didn't say it would be off all the time.
> 
> Consider a situation where a game is caching 5MB over the 3.5GB partition and adding latency for nothing. I don't trust the driver to always make the right choice with every setting/resolution/game.
> 
> Your scenario assumes the driver makes the optimal choice every time.


You would cache from system RAM instead, which would be considerably slower.

The card will ALWAYS be faster even at nerfed speeds than system RAM since you need not deal with PCIe bandwidth or latency.
Quote:


> Originally Posted by *PureBlackFire*
> 
> that's all true, but we just got the real specs of a runaway success GPU today. it doesn't change the admirable performance/dollar and performance/watt at all, but it may become an issue in the future, and at the very least it is deceptive. this wasn't some accident or miscommunication. they probably never expected to be under the microscope after they took $70 off the standard GTX x70 price tag. as for GK110, I'm sure (by looking at recent games and after updates in GM204 and Hawaii drivers) that Nvidia is simply ignoring them in their driver optimization in favor of the new toys right now. to hell with the $1k price tag, but it cannot be said that the Titan didn't age well. turn up the clocks and it stomps, as you should know.


Turned out to be FUD based upon bugged Ubisoft titles, nothing substantial.

Again why I'd like some sauce for that claim.


----------



## NuclearPeace

Quote:


> Originally Posted by *skupples*
> 
> @jprovido
> 
> lawsuits for what exactly?
> 
> people keep tossing this around, yet have an obscenely loose case.
> 
> the card has 4 physical GBs of GDDR5 memory, and can use all of those giggles, so please tell us how lawsuits are imminent... Seriously, such a joke.
> 
> I haven't seen one lawyer pipe up about this, and I can guarantee at LEAST ONE is in this thread viewing & posting right now. Most likely laughing his ass off over these statements.
> 
> Does this mean I can sue AMD over Bulldozer? it was advertised as the bestest! but it sucked.


If they think they can win, then I don't see why not.

I personally think this freakout is hilarious.


----------



## Woundingchaney

Quote:


> Originally Posted by *Forceman*
> 
> Well then run on down to small claims court and file suit. No lawyer needed.


I would imagine this is a joke, correct?

Are you suggesting that I go down to my county's small claims court and file against corporate giants such as Nvidia and Zotac?


----------



## Cryosis00

Quote:


> Originally Posted by *darkwizard*
> 
> While what you say about cars and horsepower is true, and has been for a long time, it doesn't really hold for this GPU debacle, just as when someone tried the same analogy with hard drive storage measurements.
> 
> At this point it is better to wait for in-depth analysis, but if you buy something that is supposed to work the way it was advertised and it doesn't, then on principle it is wrong. Therefore, even if the card performs well at 3.5GB, it is the principle that matters; some people bought a 970 just for that extra 1GB of VRAM.
> 
> While more testing may or may not uncover issues, what Nvidia did is wrong, that's my point of view.


What did they do wrong?

You may know what I know about cars, but the general public does not. Generally the guy on the lot is even more clueless. Nvidia is using the same practice. In fact, this is how the entire marketing world works.

I 100% agree it is shady. Especially if using the 2nd bank degrades performance. Early results say yes.


----------



## MerkageTurk

Or under EU law, as I live in the United Kingdom, I can ask the retailer for a refund.


----------



## Wirerat

Quote:


> Originally Posted by *iSlayer*
> 
> You would cache from RAM instead. Which would be considerably slower.


Some games cache VRAM just because they think it's there.

Everything you said is true, but I still don't trust the driver to cache the optimal amount in every situation. It's an extra variable.


----------



## KeepWalkinG

We should always buy the full chip; then we will not have any problems.


----------



## iSlayer

Quote:


> Originally Posted by *Wirerat*
> 
> Some games cache VRAM just because they think it's there.
> 
> Everything you said is true, but I still don't trust the driver to cache the optimal amount in every situation. It's an extra variable.


Well, it's like setting the affinity on programs. You may think you're smarter and better at organizing workloads than the incredibly smart, efficient algorithms put in place by computer scientists and engineers, but you're not. If you thought of it, the design team surely did too. Like Nvidia doesn't work with game devs and understand exactly how game devs use their GPUs...

I'd leave it be personally.


----------



## Forceman

Quote:


> Originally Posted by *Woundingchaney*
> 
> I would imagine this is a joke, correct?
> 
> Are you suggesting that I go down to my county's small claims court and file against corporate giants such as Nvidia and Zotac?


If it is as cut and dried as people make it out to be, why not? I'm talking US, I don't know your legal system, but here you can file suit for basically nothing.

But no, my comment was sarcastic, because you have basically no chance of winning a lawsuit on the grounds that a few numbers posted in a website review were wrong. I'm basically making fun of the whole "sue them" attitude running around here. Sorry to make you the recipient, but you had a nice short post to make a point on.


----------



## wholeeo

Quote:


> Originally Posted by *Civicer*


Lmao!


----------



## Luck100

NVidia screwed up with their press materials for the 970 and they will eat humble pie (or worse) for it now. Rightly so.

BUT: the funny business with the 970's memory architecture will have very little impact on any real-world measure of performance. The GTX 970 loses 3 out of 16 SMMs compared to the 980, but it only loses 1 out of 8 ROP/L2 cache units. The 970 has MORE memory bandwidth per compute unit than the 980. It's the 3 missing SMMs which have the most impact, and that was always clear from the beginning.

Anandtech and PCPer are supposed to be looking at tests with gaming scenarios using between 3.5 GB and 4 GB of VRAM (the only case where performance depends on the slow VRAM). I wouldn't be surprised if they have to work really hard to engineer those scenarios. When you start bumping up resolution and DSR settings you will easily jump from under 3.5 GB to over 4 GB.
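A quick back-of-the-envelope check of the "more bandwidth per compute unit" claim, as a rough sketch: the 224/196 GB/s figures and SMM counts are the ones discussed elsewhere in the thread, and the slow 0.5 GB segment is ignored here.

```python
# Theoretical memory bandwidth per SMM (figures assumed from the thread,
# not measured): 980 = 16 SMMs on the full 224 GB/s bus;
# 970 = 13 SMMs on the 196 GB/s fast segment.
bw_per_smm_980 = 224 / 16
bw_per_smm_970 = 196 / 13

print(round(bw_per_smm_980, 2))  # 14.0 GB/s per SMM
print(round(bw_per_smm_970, 2))  # 15.08 GB/s per SMM
```

So, under those assumptions, the 970 indeed has slightly more fast-segment bandwidth per SMM than the 980.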


----------



## Woundingchaney

Quote:


> Originally Posted by *Forceman*
> 
> If it is as cut and dried as people make it out to be, why not? I'm talking US, I don't know your legal system, but here you can file suit for basically nothing.
> 
> But no, my comment was sarcastic, because you have basically no chance of winning a lawsuit on the grounds that a few numbers posted in a website review were wrong. I'm basically making fun of the whole "sue them" attitude running around here. Sorry to make you the recipient, but you had a nice short post to make a point on.


I'm not sure if you are aware of just how this would play out in small claims court. This is not something that is typically handled in small claims court; it would almost immediately be thrown out, to be reopened through another court or transferred.

Filing a small claim against corporations like this is essentially an exercise in futility from an individual standpoint. There is no way one could effectively represent themselves even in small claims court, particularly given that settling outside of a court's ruling could be viewed as an admission of guilt by Nvidia or whatever company.


----------



## Redwoodz

Remember when we were trying to figure out why Nvidia introduced the 970 at such a low price? Now we know. If anyone believes they did not purposefully lie about the card's specs, well, I have a bridge in Arizona I would like to sell you. It is very relevant because the 970's main competition at that price point is the 290X, which has to its advantage a wider memory bus and 4GB of RAM. Especially with all the 4K hype being bandied about. Stick your head in the sand if you want to; I know a scam when I see one. They sold hundreds of thousands of those cards on the premise that they have the superior 4K option.


----------



## PontiacGTX

Quote:


> Originally Posted by *Defoler*
> 
> BTW, the 290x has 8 MCs with 8 ROPs each, with a shared L2 cache.
> The 980/970 has 4 MCs, but with 16 ROPs each and a dedicated L2 cache.


now they say 8MC on the 970 :/


----------



## Neilthran

So Nvidia had a communication problem, and because of that incorrect information was used in the specifications of the 970? So basically they are saying they are sorry? Hahahahahaha. No. Nvidia is a billion-dollar-plus international corporation; I don't care about their justifications for the error, they should be doing better than that, and no sane consumer should cut them some slack. They lied in the 970 specifications. Sure, you can spin it all you want, but in the end they lied, plain and simple.

I don't own a 970, nor do I care to. Still, I have the right to post in this thread, even if some try to imply you have to own a 970 to have a valid opinion. This is a forum, not the GTX 970 exclusive owners club.

Still, the 970 is a very good card, but that doesn't change the fact that Nvidia lied, and we, as consumers, deserve honest and better treatment.


----------



## Vesku

Quote:


> Originally Posted by *Luck100*
> 
> NVidia screwed up with their press materials for the 970 and they will eat humble pie (or worse) for it now. Rightly so.
> 
> BUT: the funny business with the 970's memory architecture will have very little impact on any real-world measure of performance. The GTX 970 loses 3 out of 16 SMMs compared to the 980, but it only loses 1 out of 8 ROP/L2 cache units. The 970 has MORE memory bandwidth per compute unit than the 980. It's the 3 missing SMMs which have the most impact, and that was always clear from the beginning.
> 
> Anandtech and PCPer are supposed to be looking at tests with gaming scenarios using between 3.5 GB and 4 GB of VRAM (the only case where performance depends on the slow VRAM). I wouldn't be surprised if they have to work really hard to engineer those scenarios. When you start bumping up resolution and DSR settings you will easily jump from under 3.5 GB to over 4 GB.


Yes, but those scenarios will become more common over time as 4GB cards increase in popularity. At least some big game titles will want to show they are putting enthusiasts' hardware to use. Watch Dogs and Shadow of Mordor's Ultra textures are just the beginning.

Oh and Anandtech's Ryan Smith seems to be saying that Nai's benchmark is not actually flawed:
Quote:


> From an API perspective this is applicable towards both graphics and compute, though it's a safe bet that graphics is the more easily and accurately handled of the two thanks to the rigid nature of graphics rendering. Direct3D, OpenGL, CUDA, and OpenCL all see and have access to the full 4GB of memory available on the GTX 970, and from the perspective of the applications using these APIs the 4GB of memory is identical, the segments being abstracted. This is also why applications attempting to benchmark the memory in a piecemeal fashion will not find slow memory areas until the end of their run, as their earlier allocations will be in the fast segment and only finally spill over to the slow segment once the fast segment is full.


http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation/3
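The Anandtech point about piecemeal benchmarks can be illustrated with a toy model. This is purely illustrative (the chunk size and segment sizes are assumptions); it only mimics the allocation order, not real driver behavior:

```python
# Toy model of why a chunk-by-chunk VRAM benchmark (like Nai's) only sees
# the slow segment at the very end of its run: earlier allocations land in
# the fast 3.5 GB segment, and only the spillover hits the 0.5 GB segment.
FAST_SEGMENT_MB = 3584   # 3.5 GB segment (~196 GB/s)
SLOW_SEGMENT_MB = 512    # 0.5 GB segment (~28 GB/s)
CHUNK_MB = 128           # assumed benchmark allocation size

def segment_for_chunk(total_allocated_mb: int) -> str:
    """Return which segment a newly allocated chunk lands in."""
    return "fast" if total_allocated_mb < FAST_SEGMENT_MB else "slow"

allocated = 0
segments = []
while allocated < FAST_SEGMENT_MB + SLOW_SEGMENT_MB:
    segments.append(segment_for_chunk(allocated))
    allocated += CHUNK_MB

print(segments.count("fast"), segments.count("slow"))  # 28 fast chunks, then 4 slow
```

In this sketch the benchmark measures 28 fast chunks before the last 4 ever touch the slow segment, which is why a partial run can report full-speed memory across the board.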


----------



## Cyro999

Quote:


> For the 80th time, that benchmark is bugged. It is not 22 GB/sec. No one knows what the actual speed of memory access to that section of VRAM is (and quite possibly never will).


Just to confirm, we actually do know now. The 8th RAM chip cannot be accessed at the same time as the 7th (so it can't be accessed at the same time as the entire first pool that the 7th is part of), and to access it alone, it effectively has a 32-bit bus, while the other seven combined add up to 224-bit.

While the 980 and 770, for example, have 256-bit across all of the memory (224GB/s theoretical at 7GHz), the 970 is limited to 196GB/s on the fast part of the pool and 28GB/s on the slow part, which cannot be accessed simultaneously.

That's theoretical performance, so real-world is lower, I guess - but basically, Nvidia has confirmed 100% that the numbers shown in the benchmark are actually accurate.
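Those theoretical figures follow directly from bus width times data rate. A minimal sketch of the arithmetic, assuming the stock 7 Gbps effective GDDR5 data rate:

```python
# Theoretical GDDR5 bandwidth: bus pins * bits per second per pin,
# divided by 8 to convert bits to bytes.
def gddr5_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float = 7.0) -> float:
    """Theoretical bandwidth in GB/s for a given bus width."""
    return bus_width_bits * data_rate_gbps / 8

print(gddr5_bandwidth_gbs(256))  # 980, full 256-bit bus  -> 224.0 GB/s
print(gddr5_bandwidth_gbs(224))  # 970 fast segment       -> 196.0 GB/s
print(gddr5_bandwidth_gbs(32))   # 970 slow segment       -> 28.0 GB/s
```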


----------



## iSlayer

Quote:


> Originally Posted by *Luck100*
> 
> Anandtech and PCPer are supposed to be looking at tests with gaming scenarios using between 3.5 GB and 4 GB of VRAM (the only case where performance depends on the slow VRAM). I wouldn't be surprised if they have to work really hard to engineer those scenarios. When you start bumping up resolution and DSR settings you will easily jump from under 3.5 GB to over 4 GB.


I imagine that'll be the case. At sane settings there won't be any impact, but resolution and AA will be turned up to ridiculous settings to try and push VRAM use toward 4GB, and by that point the stuttering and performance issues will just be down to not enough horsepower.
Quote:


> Originally Posted by *Redwoodz*
> 
> Remember when we were trying to figure out why Nvidia introduced the 970 at such a low price? Now we know. If anyone believes they did not purposefully lie about the cards specs, well I have a bridge in Arizona I would like to sell you. It is very relevant because the 970's main competition at that price point is the 290x, which has to it's advantage a wider memory bus and 4GB of RAM. Especially with all the 4K hype being banded about. Stick your head in the sand if you want to, I know a scam when I see one. They sold hundreds of thousands of those cards with the premise they have the superior 4K option.


The 290(x)s sell for less than the 970. I guess those are a scam too.


----------



## criminal

Quote:


> Originally Posted by *Neilthran*
> 
> So nvidia had a communication problem and because of that incorrect information was used in the specifications of the 970? So basically they are saying they are sorry? hahahahahahaha. No, nvidia is a billion+ international corporation, i don't care about their justifications for the error, they should be doing better than that, no sane consumer should cut them some slack. They lied in the 970 specifications. Sure, you can spin it all you want, but in the end they lied, plain and simple.
> 
> I don't own a 970, nor i care to. Still i have the right to post in this thread, even if some try to imply you have to own a 970 to have a valid opinion. This is a forum, not the GTX 970 exclusive owners club.
> 
> Still the 970 is a very good card, but that doesn't change the fact that nvidia lied, and we, as consumers deserve an honest and better treatment.


Yep, this is an open forum, anyone can complain.









The way I see it now, they knew it all along. Why else price the card like they did? Now they are trying to do a PR spin on the issue and say it was a big mistake. Shady, shady, shady... Those of you who think it is an okay practice to do this should realize that a good majority of people would probably have gone for the 980 or something from AMD instead of grabbing a 970 if this had been made known from the start. Others would have bought the card anyway, but discounting the 970 owners who are truly upset is not fair at all. Again, they have a right to feel lied to and cheated.


----------



## Xoriam

Quote:


> Originally Posted by *iSlayer*
> 
> I imagine that'll be the case. At sane settings there won't be any impact, but resolution and AA will be turned up to ridiculous settings to try and push VRAM use toward 4GB, and by that point the stuttering and performance issues will just be down to not enough horsepower.
> The 290(x)s sell for less than the 970. I guess those are a scam too.


In most of Europe the 290X is €100-200 more expensive than a 970.


----------



## iSlayer

^ in the UK they're 50 pounds less than the 970. Different markets, different prices...the point was that the price says nothing about it being a scam or not.
Quote:


> Originally Posted by *criminal*
> 
> Yep, this is an open forum, anyone can complain.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The way I see it now, they knew it all along. Why else price the card like they did? Now they are trying to do a PR spin on the issue and say it was a big mistake. Shady, shady, shady... Those of you who think it is an okay practice to do this should realize that a good majority of people would probably have gone for the 980 or something from AMD instead of grabbing a 970 if this had been made known from the start. Others would have bought the card anyway, but discounting the 970 owners who are truly upset is not fair either. Again, they have a right to feel lied to and cheated.


That's the thing.

We don't need Nvidia's story to be the truth to yell at them, though as Anandtech pointed out, the story seems legit.

If Nvidia knew about this, do you think they'd have been so unprepared for this possible outcome? Again, Anandtech did point out that a VP at Nvidia is running around. I doubt they're just trying to throw up a smokescreen.

Anywho, the reason it doesn't matter if it is or isn't a lie from Nvidia is that Nvidia messed up. It doesn't matter if it was a mistake, they messed up.

That's what should be the focus, not the plausibility of the excuse.


----------



## Vesku

Given how many times it has been said, even in giant red font, I'd like to reiterate that, according to Anandtech's analysis of the session with Nvidia engineering:










When run properly that is.


----------



## GrimDoctor

Quote:


> Originally Posted by *Xoriam*
> 
> In most of Europe the 290X is €100-200 more expensive than a 970.


Here in Australia they are similarly priced. I would have gone 290X if it were cheaper; my previous AMDs went strong for many years, but equally so did the Nvidias. I thought I was still going to go 970 SLI, but now that I have read the extra info I am unsure, mainly in regards to future games and the benches I've seen in said SLI. I will certainly wait for more information.


----------



## Shadin

I've owned nothing but ATI since my X1950X (with a brief affair with a used 8800GT), but the 970 was tempting since I need a card soon and felt that I should give NVIDIA another chance. Now this. They absolutely knew the card was engineered this way beforehand, it was just more lucrative to advertise it as 4GB at first and let people figure it out than try to explain this dumpster fire from the start.


----------



## Seven7h

Quote:


> Originally Posted by *Shadin*
> 
> I've owned nothing but ATI since my X1950X (with a brief affair with a used 8800GT), but the 970 was tempting since I need a card soon and felt that I should give NVIDIA another chance. Now this. They absolutely knew the card was engineered this way beforehand, it was just more lucrative to advertise it as 4GB at first and let people figure it out than try to explain this dumpster fire from the start.


I'll be happy to buy your perfectly working 4GB GPU from you at a discount.


----------



## Xoriam

Quote:


> Originally Posted by *GrimDoctor*
> 
> Here in Australia they are similarly priced. I would have gone 290X if it were cheaper, my previous AMDs went strong for many years, but equally so did the Nvidias. I thought I was still going to go 970 SLi but now I have read the extra info I am unsure, mainly in regards to future games and the benches I've seen in said SLi. I will certainly wait for more information.


Yeah this is the first time in nearly 10 years for me that Nvidia was the choice when it comes to price.
I was pretty surprised.


----------



## notarat

Quote:


> Originally Posted by *Xoriam*
> 
> In most of europe the 290x is 100-200€ more expensive than a 970


I hear that, in Europe, it probably costs more to ship cards which actually meet the specifications on the box.


----------



## Shadin

Quote:


> Originally Posted by *Seven7h*
> 
> I'll be happy to buy your perfectly working GPU from you at a discount.


I still have a 7870 LE Tahiti; I haven't pulled the trigger yet. I should have clarified: if I had bought a 970 this would make me mad, but I'd just have to live with it. Since I haven't bought a GPU yet, this has made me decide to take my money elsewhere.


----------



## Gilles3000

Quote:


> Originally Posted by *Xoriam*
> 
> In most of europe the 290x is 100-200€ more expensive than a 970


Cheapest GTX 970 I could quickly find: €315.25

Cheapest R9 290X I could quickly find: €316

Price difference: *€0.75*









Also note that the 290X in question is a much higher quality model.


----------



## RagingCain

Does anybody else think the 4GB of VRAM should have been one unified partition and this 0.5 GB thing is stupid? Am I the only one who doesn't really get it? They had to go out of their way, add R&D, software engineers, programmers and testers for this partition, which actually sounds like it lowers performance on the 970.

It would be cool if nVidia made this right though with their users.


----------



## Cyro999

Quote:


> Originally Posted by *RagingCain*
> 
> Does anybody else think the 4GB of VRAM should have been one unified partition and this 0.5 GB thing is stupid? Am I the only one who doesn't really get it?


They can only address either the 7th or the 8th RAM chip with a read operation per cycle.

Unifying it into one pool and accessing it in parallel would result in halving the effective memory bandwidth of the whole GPU, just because the last quarter of it can't keep up.
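The point above can be illustrated with a toy model (my own sketch under stated assumptions, not an official description of the hardware; the stripe counts are arbitrary): if the last two DRAM chips share a single read port, striping a buffer evenly across all eight chips is paced by that shared port, while keeping the data on the first seven chips leaves them fully parallel.

```python
# Toy model (illustrative sketch, not an official description): on the 970,
# the 7th and 8th DRAM chips share one read port, so only one of the pair
# can be read per cycle. Striping data across all eight chips then
# bottlenecks on that shared port.

def cycles_to_read(stripes_per_chip, chips_on_shared_port=2):
    """Cycles needed to read `stripes_per_chip` stripes from each chip when
    `chips_on_shared_port` chips must take turns on a single read port."""
    independent_cycles = stripes_per_chip                    # parallel chips
    shared_cycles = chips_on_shared_port * stripes_per_chip  # serialized pair
    return max(independent_cycles, shared_cycles)

# Unified pool: 800 stripes spread over all 8 chips -> paced by the pair.
unified = cycles_to_read(100)   # 200 cycles for 800 stripes
# Segmented pool: the same 800 stripes over only the 7 fast chips.
segmented = -(-800 // 7)        # ceil(800 / 7) = 115 cycles

print(unified, segmented)  # 200 115
```

In this toy model the unified layout delivers 800 stripes in 200 cycles (4 per cycle, half the ideal 8), which is the "halving" being described, while the 7-chip segment sustains roughly 7 per cycle.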


----------



## PureBlackFire

Quote:


> Originally Posted by *Gilles3000*
> 
> Cheapest GTX 970 I could quickly find: €315.25
> 
> Cheapest R9 290X I could quickly find: €316
> 
> Price difference: *€0.75*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also note that the 290X in question is a much higher quality model.


The 290X is a higher quality card in general.


----------



## RagingCain

Quote:


> Originally Posted by *Cyro999*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RagingCain*
> 
> Does anybody else think the 4GB of VRAM should have been one unified partition and this 0.5 GB thing is stupid? Am I the only one who doesn't really get it?
> 
> 
> 
> They can only address either the 7th or the 8th RAM chip with a read operation per cycle.
> 
> Unifying it into one pool and sequentially accessing it would result in halving the effective memory bandwidth of the whole GPU. They put aside and isolated the weak link instead.
Click to expand...

This was just a silly design then. Although stuff like this is exactly why I don't buy any of the sliced and diced GPUs from either side. God knows what they did to make a lower model. Only high-end or no end.


----------



## Vesku

Quote:


> Originally Posted by *RagingCain*
> 
> Does anybody else think the 4GB of VRAM should have been one unified partition and this 0.5 GB thing is stupid? Am I the only one who doesn't really get it? They had to go out of their way, add R&D, software engineers, programmers and testers for this partition, which actually sounds like it lowers performance on the 970.


They tweaked their design so they could salvage more 980 GPUs and turn them into 970s. For each one they could save this way they make an additional ~$50 in profit. That's just for this GPU series, they probably plan to reuse this technology and software in the future. After all this is just an advancement on what they were doing with the 550 Ti and 660 Ti.


----------



## UZ7

For those new to the thread, here is a brief summary of what was talked about over the past few days.


----------



## RagingCain

Quote:


> Originally Posted by *Vesku*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RagingCain*
> 
> Does anybody else think the 4GB of VRAM should have been one unified partition and this 0.5 GB thing is stupid? Am I the only one who doesn't really get it? They had to go out of their way, add R&D, software engineers, programmers and testers for this partition, which actually sounds like it lowers performance on the 970.
> 
> 
> 
> They tweaked their design so they could salvage more 980 GPUs and turn them into 970s. For each one they could save this way they make an additional ~$50 in profit. That's just for this GPU series, they probably plan to reuse this technology and software in the future. After all this is just an advancement on what they were doing with the 550 Ti and 660 Ti.
Click to expand...

Ugh low-ends!

Just kidding (right tool for the right job.)


----------



## sage101

Are we going to see a price drop on the 970? $249.99 please


----------



## Cyro999

Quote:


> Originally Posted by *RagingCain*
> 
> This was just a silly design then. Although stuff like this is exactly why I don't buy any of the sliced and diced GPUs from either side. God knows what they did to make a lower model. Only high-end or no end.


I see where you're coming from, but if the alternative was cutting it to 3GB VRAM on a 192-bit bus, then it's still an architectural improvement to be able to partially shut down resources like this.

However, it should have been advertised as better than 3GB/192-bit but still limited: it's a full 3.5GB on a 224-bit bus, plus an extra 32-bit, 500MB pool of backup VRAM. Instead it was advertised as 4GB @ 256-bit, which was false.


----------



## PontiacGTX

Quote:


> Originally Posted by *sage101*
> 
> Are we going to see a price drop on the 970? $249.99 please


No, unless AMD sells a 980 killer. More likely they will sell a 970 Ti or a 970 rev 2.


----------



## criminal

Quote:


> Originally Posted by *RagingCain*
> 
> This was just a silly design then. Although stuff like this is exactly why I don't buy any of the sliced and diced GPUs from either side. God knows what they did to make a lower model. Only high-end or no end.


That is funny that you say this, because almost every person dropping by the 970 reviews thread when it first popped up said the opposite. (Not saying you were one of those people.) Basically everyone was saying the 980 was way overpriced and a waste of money because the 970 was so good. LOL how things have changed.

Has anyone realized that big Maxwell (GM200) will probably have similar issues on the "not top end" model? If the 970 was done this way for a valid reason, then it is reasonable to assume a cut-down version of GM200 will have a similar issue. Yikes!


----------



## gigafloppy

Quote:


> Originally Posted by *PontiacGTX*
> 
> No, unless AMD sells a 980 killer. More likely they will sell a 970 Ti or a 970 rev 2.


How? Why? The chip can't be modified. The only option is to EOL the 970 and introduce a cheaper 192-bit 3GB 965 or something like that.


----------



## damric

What happens when non-reference cards show up with 8GB VRAM? Is a whole 1GB going to be gimped?


----------



## Forceman

Quote:


> Originally Posted by *gigafloppy*
> 
> How? Why? The chip can't be modified. The only option is to EOL the 970 and introduce a cheaper 192-bit 3GB 965 or something like that.


They could sell a card that had the same number of SMMs as the 970, but with the full 64 ROPs and L2 and full access to the 4 GB VRAM. According to Nvidia, that would be 4-6% faster, and wouldn't suffer from any over 3.5GB issues. So basically what the 970 really should have been.


----------



## criminal

Quote:


> Originally Posted by *damric*
> 
> What happens when non-reference cards show up with 8GB VRAM? Is a whole 1GB going to be gimped?


I don't see the 970 being powerful enough to take advantage of 8GB of RAM anyway, but yeah, pretty much.


----------



## iSlayer

Quote:


> Originally Posted by *criminal*
> 
> That is funny that you say this, because almost every person dropping by the 970 reviews thread when it first popped up said the opposite. (Not saying you were one of those people.) Basically everyone was saying the 980 was way overpriced and a waste of money because the 970 was so good. LOL how things have changed.
> 
> Has anyone realized that big Maxwell (GM200) will probably have similar issues on the "not top end" model? If the 970 was done this way for a valid reason, then it is reasonable to assume a cut-down version of GM200 will have a similar issue. Yikes!


Sigh, criminal, I'm beginning to think you're not too smart, alongside being insecure about that Titan.

*What has changed today is not the GTX 970, but rather our perception of it.*

The 970 on realization of this issue didn't stop being as powerful as it is.

How
Hard
Is
It
To
Understand
That


----------



## provost

Quote:


> Originally Posted by *Forceman*
> 
> If it is as cut and dried as people make it out to be, why not? I'm talking US, I don't know your legal system, but here you can file suit for basically nothing.
> 
> But no, my comment was sarcastic, because you have basically no chance of winning a lawsuit on the grounds that a few numbers posted in a website review were wrong. I'm basically making fun of the whole "sue them" attitude running around here. Sorry to make you the recipient, but you had a nice short post to make a point on.


Given our litigious society, anyone can sue anyone... lol.
I am no lawyer by any stretch of the imagination (thank goodness for that), but there could be a potential claim here, and it all depends on who brings it and how it's organized.
Even if it ever went to court, my guess is Nvidia would prefer to settle out of court, so as not to have any dirty laundry about its business practices aired publicly.
Heck, if AMD wanted to be nasty, they could fund the upfront cost of such an action.


----------



## Wirerat

Marketing sold us the 960 Ti as if it's the 970.


----------



## criminal

Quote:


> Originally Posted by *iSlayer*
> 
> Sigh, criminal beginning to think you're not too smart alongside being insecure about that titan.
> 
> *what has changed today is not the GTX 970, but rather our perception of it.*


I don't own a Titan, so why would I be insecure? I sold my Titan over a year ago for $910, so yeah, no insecurities.

Edit: I guess I am not understanding what you are putting out. Yeah, the 970 is the same performance-wise as it was before we knew about this issue, but we have no idea if this will cause issues in the future. That would concern me.


----------



## mtcn77

Quote:


> Originally Posted by *iSlayer*
> 
> Sigh, criminal beginning to think you're not too smart alongside being insecure about that titan.
> 
> *what has changed today is* not *the GTX 970*, but rather our perception of it.


----------



## iSlayer

Quote:


> Originally Posted by *mtcn77*


@2010rig your avatar if you'd please.


----------



## gigafloppy

Quote:


> Originally Posted by *Forceman*
> 
> They could sell a card that had the same number of SMMs as the 970, but with the full 64 ROPs and L2 and full access to the 4 GB VRAM. According to Nvidia, that would be 4-6% faster, and wouldn't suffer from any over 3.5GB issues. So basically what the 970 really should have been.


There's probably a reason why Nvidia went with this weird setup. Too many chips with L2 defects maybe? There may not be enough good chips to support your "970-Ti"


----------



## mtcn77

Quote:


> Originally Posted by *iSlayer*
> 
> Smh
> 
> OCN could do with less shills.


How is the public relations business going?


----------



## error-id10t

Quote:


> Originally Posted by *iSlayer*
> 
> *what has changed today is not the GTX 970, but rather our perception of it.*


lol okey dokey then. This comment is the saddest here.


----------



## Vesku

Quote:


> NVIDIA's performance labs continue to work away at finding examples of this occurring and the consensus seems to be something in the 4-6% range. A GTX 970 without this memory pool division would run 4-6% faster than the GTX 970s selling today in high memory utilization scenarios. Obviously this is something we can't accurately test though - we don't have the ability to run a GTX 970 without a disabled L2/ROP cluster like NVIDIA can. All we can do is compare the difference in performance between a reference GTX 980 and a reference GTX 970 and measure the differences as best we can, and that is our goal for this week.


http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Discloses-Full-Memory-Structure-and-Limitations-GTX-970

From the horse's mouth: they released a spec in their reviewer's guide for a 4-6% faster 970 than they were actually selling. That's not even getting into possible "smoothness" penalties. Sounds like a job for FCAT. Nvidia should provide reviewers with cards that meet the mistaken spec so they can compare the actual 970 against them.


----------



## PontiacGTX

Quote:


> Originally Posted by *gigafloppy*
> 
> How? Why? The chip can't be modified. The only thing possible is EOL 970 and introducing a cheaper 192-bit 3GB 965 or something like that.


Enabling more CUDA/SM units and dropping the 970 price to $250, the 960 to $100-150, and a 960 Ti to $200, or refreshing the 970 arch.


----------



## criminal

Quote:


> Originally Posted by *Wirerat*
> 
> Marketing sold us the 960 Ti as if it's the 970.












Quote:


> Originally Posted by *error-id10t*
> 
> lol okey dokey then. This comment is the saddest here.


He works for Nvidia.


----------



## Olivon

Quote:


> Originally Posted by *mtcn77*


Yeah, right. We now know that a 3.5GB, 224-bit GM204 card achieved incredible success against the power-hungry 512-bit card because of its performance and high efficiency.
It seems clear that nVidia made a big communication mistake, but this doesn't change the 970 itself or the performance reviews already made.


----------



## GTR Mclaren

LOL, and now you can get a 290X Lightning for $350... AMD will be happy with the news.


----------



## tsm106

Quote:


> Originally Posted by *criminal*
> 
> He works for Nvidia.


Wasn't he the one calling everyone else a shill? Oh the ironing...


----------



## gigafloppy

Quote:


> Originally Posted by *iSlayer*
> 
> *What has changed today is not the GTX 970, but rather our perception of it.*


You're absolutely right. We perceived it as a 4GB, 64-ROP, 224GB/s monster card. Now we don't.


----------



## sage101

Quote:


> Originally Posted by *PontiacGTX*
> 
> Enabling more CUDA/SM units and dropping the 970 price to $250, the 960 to $100-150, and a 960 Ti to $200, or refreshing the 970 arch.


My thoughts exactly, thanks for repairing my burst bubble from your previous post


----------



## mtcn77

Quote:


> Originally Posted by *Olivon*
> 
> Yeah, right. We now know that a 3.5GB, 224-bit GM204 card achieved incredible success against the power-hungry 512-bit card because of its performance and high efficiency.
> It seems clear that nVidia made a big communication mistake, but this doesn't change the 970 itself or *the performance reviews already made*.


Or rather the reviews that weren't made. You are still assuming you have the better of the situation; that is false.


----------



## aka13

I own a 970, and that issue sure is somewhat disappointing in terms of false advertisement, but I will still buy a second 970, and probably a third for triple SLI. Sure, I mean, well, 500gb less in effective memory, but the 350 euros I paid were OK even for the now "reduced" card. I can only hope that it drops the price.


----------



## iSlayer

Quote:


> Originally Posted by *criminal*
> 
> I have no clue if you do or not, but you sure defend them like you do. Shill then?


Do you even read my posts, or do the little voices in your head fill in the blanks?

I want a class-action lawsuit. As a customer of the 970, I was lied to.

Nvidia doesn't have a shred of my sympathy. For those of us with a 970, though, the truth would be of value. Nvidia says that it's a minor performance penalty; we need data to verify that. If a lawsuit doesn't happen, we at least need a better understanding of how capable our GPUs are.


----------



## Silent Scone

As a GTX 980 owner, and generally not a consumer who settles for "the next best thing": in light of the further response from Jonah Alben, if I were a 970 owner I think I'd be a little disgruntled. It's of course not the first time and it won't be the last. I'm fairly sure AMD never divulged on a consumer level that the triple-core Phenom was in fact a poor-yielding reject; in that case, however, it wasn't marketed as a quad core either.

Honestly, and I say this as a long-time Nvidia user: I think the fact this wasn't made clear from the beginning is deliberate. However, I don't think the performance impact is as large as users are suddenly making out. The issues some are experiencing are due to exceeding the entire frame buffer, and I think a lot of the _more_ disgruntled or livid owners probably expected 4GB to go further than it actually does in today's games at higher pixel counts, coming from what are likely 2GB cards.

In short though, I don't think this is something people are going to forget in a hurry.


----------



## 8g4kv369nnn2

Quote:


> Originally Posted by *iSlayer*
> 
> *What has changed today is not the GTX 970, but rather our perception of it.*


What has changed today is Nvidia admitting they sold a product not as advertised and now trying to cover their ass.


----------



## morbid_bean

Returning to this thread after about 12 hours, I really don't feel like reading 20 pages of unnecessary complaints and fanboy praise. Does anybody care to share any *non-biased* new information I have missed? Has NV come out and said anything new yet? Any new findings, lawsuits (lol), recalls?


----------



## Kand

Heh. Early adopters.


----------



## damric

Quote:


> Originally Posted by *8g4kv369nnn2*
> 
> What has changed today is Nvidia admitting they sold a product not as advertised and now trying to cover their ass.


lol, welcome to OCN. You joined this forum to say that? Just couldn't hold it in any longer heh


----------



## ZealotKi11er

Quote:


> Originally Posted by *Shadin*
> 
> How is he a shill when what he just posted is 100% true? Performance hasn't changed, but that doesn't mean nvidia didn't withhold information from customers, which should definitely be addressed.


A lot of people buy GPUs for future performance, and VRAM is the part that is affected later.


----------



## Kinaesthetic

Quote:


> Originally Posted by *morbid_bean*
> 
> Returning to this thread after about 12 hours, I really don't feel like reading 20 pages of unnecessary complaints and fanboy praise. Does anybody care to include a *non-bias* new information I have missed? Has NV came out and said anything new yet? Any new findings, lawsuits? (lol), recalls.


Yep. Read this article: http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Discloses-Full-Memory-Structure-and-Limitations-GTX-970 . Unfortunately, everyone else here is acting like bickering kids, which makes it incredibly hard to find real information in this thread. Wish it were cleaned up already; instead of 18 50-post pages, it should be more like 6-7 pages of real conversation.


----------



## iSlayer

For anyone curious about my shilling.

http://www.overclock.net/forums/posts/by_user/id/94347/thread/1537725

See all the glorious shill posts I've made, including but not limited to "Nvidia's excuse doesn't matter, it only serves to smokescreen the real issue that they messed up" and "I want my $ back".
Quote:


> Originally Posted by *morbid_bean*
> 
> Returning to this thread after about 12 hours, I really don't feel like reading 20 pages of unnecessary complaints and fanboy praise. Does anybody care to include a *non-bias* new information I have missed? Has NV came out and said anything new yet? Any new findings, lawsuits? (lol), recalls.


Nvidia admitted there is a problem but claims it's not a big deal; that said, Nvidia has been in heavy damage-control mode. We're waiting on Anandtech and other sites to do testing to verify Nvidia's claims.

As of now, though, the short answer is that the 970 uses a complex memory scheme consisting of two partitions: one fast 3.5GB partition and a secondary slow 0.5GB partition.
Quote:


> Originally Posted by *criminal*
> 
> "Shill - acts as an enthusiastic customer to entice or encourage others"
> 
> He seems to be doing major damage control. He says he is upset, but most of his post don't come across that way. Especially before Nvidia's most recent statement.


Probably because we need DATA. We know the cards are nerfed; the question is HOW badly. Nvidia put it at <5%, and we need third-party testing.

I am beginning to sound like a broken record with how many times I've repeated that: we need testing.


----------



## fleetfeather

Quote:


> Originally Posted by *sugalumps*
> 
> Ofcourse mtcn has the most posts in this thread, this is the best thing to happen to his life ever. Starting to think it's the main man roy himself.


Didn't you know, Mtcn's rig is cooled by the tears of Nvidia fanboys


----------



## Noufel

Quote:


> Originally Posted by *aka13*
> 
> I own a 970, and that issue sure is somewhat disappointing in terms of false advertisement, but I will still buy a second 970, and probably a third for triple SLI. Sure, I mean, well, 500gb less in effective memory, but the 350 euros I paid were OK even for the now "reduced" card. I can only hope that it drops the price.


You mean 0.5GB less.


----------



## mtcn77

Quote:


> Originally Posted by *sugalumps*
> 
> Ofcourse mtcn has the most posts in this thread, this is the best thing to happen to his life ever. Starting to think it's the main man roy himself.


Yep, yep.
So I calculated how many falsehoods they unveiled:

1 out of 8 GM204 memory interface ports is missing: 256-bit > 224-bit.
The remaining seven memory device ports thus provide 7/8 of the card's announced maximum theoretical bandwidth.
The bus with the missing port has to be waited on; its transactions run at 1/7 the speed of the rest of the configuration.
The bus with the missing port delays the rest of the ports due to the 2:1 memory bus to memory interface configuration.
Can't compute... *224* GB/s card... *now*... *175* GB/s...
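The peak-bandwidth arithmetic being argued over can be sanity-checked with a back-of-envelope sketch (my own figures, assuming 7 Gbps effective GDDR5 on eight 32-bit controllers; this is not an official NVIDIA breakdown, and real effective bandwidth depends on access patterns):

```python
# Back-of-envelope GDDR5 peak-bandwidth sketch for the GTX 970 layout.
# Assumed figures (illustrative): 7 Gbps effective data rate per pin,
# eight 32-bit memory controllers, one of which serves only the 0.5GB
# segment.

GBPS_PER_PIN = 7          # effective GDDR5 data rate, Gbit/s per pin
BITS_PER_CONTROLLER = 32  # width of each memory controller

def peak_bandwidth_gbs(controllers):
    """Peak bandwidth in GB/s for a given number of 32-bit controllers."""
    return controllers * BITS_PER_CONTROLLER * GBPS_PER_PIN / 8

advertised = peak_bandwidth_gbs(8)  # full 256-bit figure
fast_3_5gb = peak_bandwidth_gbs(7)  # 3.5GB segment, 224-bit
slow_0_5gb = peak_bandwidth_gbs(1)  # 0.5GB segment, 32-bit

print(advertised, fast_3_5gb, slow_0_5gb)  # 224.0 196.0 28.0
```

Under these assumptions the fast segment alone peaks at 196 GB/s rather than the advertised 224 GB/s; the exact effective figure in practice depends on how often the slow segment is touched, which is what third-party testing would need to measure.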


----------



## Noufel

Quote:


> Originally Posted by *sugalumps*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iSlayer*
> 
> Mtcn, you don't even own a 970 and you have 45 posts in this thread. Your words are charitably described as lively, uncharitably as biased, self-interested FUD.
> 
> Criminal, you are tied with me in posts (or was) at 17. You also don't own a 970.
> 
> Out of the three of us, I'm the one with a 970. Of the three of us, none of us like that Nvidia lied nor are we going to excuse that.
> 
> Now, let's address the I work for Nvidia thing. Oh wait, you're serious? Let me laugh even harder.
> 
> The point ---->
> Your head .
> 
> The idea isn't that the GTX 970 isn't flawed (we know it is); it's that its performance hasn't somehow changed as a result of this find. It was what it was, and it remains what it is, performance-wise.
> That'd make twice I've been accused of shilling. The first was in the rumor version of this thread. That argument got dropped on its head incredibly easily when it was realized that I, too, am not amused about Nvidia withholding and misleading the public on the specs of the GPU.
> 
> 
> 
> Ofcourse mtcn has the most posts in this thread, this is the best thing to happen to his life ever. Starting to think it's the main man roy himself.
Click to expand...

no, roy works for nvidia now


----------



## criminal

Quote:


> Originally Posted by *iSlayer*
> 
> For anyone curious about my shilling.
> 
> http://www.overclock.net/forums/posts/by_user/id/94347/thread/1537725
> 
> See all the glorious shill posts i've made. Including but not limited to "Nvidia's excuse doesn't matter it only serves to smokescreen the real issue that they messed up" and "I want my $ back".
> Nvidia admitted there is a problem but supposedly it's not a big deal, that said, Nvidia has been going into heavy damage control. We're waiting on Anandtech and other sites to do testing to verify Nvidia's claims.
> 
> As of now though, the 970 uses a complex memory scheme that short answer, consists of two partitions. One 3.5GB fast partition, and a secondary 0.5GB slow partition.
> Probably because we need DATA. We know the cards are nerfed, the question is HOW badly. Nvidia put it at <5%, we need third party testing.
> 
> I am beginning to be a broken record with how many times i've repeated that. We need testing.


Yes, we need testing. I never said we didn't. But users wanting a refund because they were lied to is justifiable as well. It may not be an issue right now, but it may be in the future. Users within a return window right now have reason to act, because they possibly won't be able to return at a later date.

Edit: Sorry for the shill comment, but saying over and over that the 970 is the same and only our perception has changed kinda pushed me in that direction. Fact is, a lot of the 970 owners who think this is a big deal would not have bought the cards to begin with had they known. Which means the 970 has changed as well as perception.


----------



## morbid_bean

@Kinaesthetic and @iSlayer

Thanks, guys, for the mature replies.


----------



## RagingCain

The GTX 970 was already bottlenecked at pixel fill rate before memory bandwidth came into play; at least I think that's what AnandTech believed.

It isn't like the GPU is powerful enough to utilize all that VRAM for rendering anyway; it was just going to be storage for cached assets.

Still though this is a black eye for nVidia, no doubt about that.


----------



## doomlord52

Wow, so not only did Nvidia blatantly lie about memory performance (they knew this would happen), they also got other technical info wrong (number of ROPs). And even better: even after announcing all this, the GTX 970 official spec page still lists 4GB @ 224GB/s with a 256-bit interface... despite both of those being proven (and admitted) false.

Wow. NV is going to get sued for this.


----------



## PostalTwinkie

This is, I think, the 4th attempt of mine to put something down that I felt comfortable putting down about this. It has proven difficult as I see both sides of this; consumer and Nvidia.

From the consumer standpoint I am upset this went undisclosed by Nvidia, but honestly that is only because it is now out there. I find myself asking _"If they had disclosed it at release, and it was mentioned in reviews, what would be different right now?"_ At the end of it, I can't really answer that question beyond _"Nothing."_ The performance of the card hasn't changed suddenly; the reviews and comparisons are still valid. This new hardware-configuration *knowledge* doesn't invalidate anything about the card.

Still, as a consumer I would have rather seen 3.5GB+500MB on the box; with some marketing mumbo jumbo about it being a "Turbo Pool" or something, than the box to just say "4 GB". I feel it was a bit underhanded of Nvidia to not communicate this dual pool configuration earlier. However, that still doesn't change the performance of the card from release to now.

From the perspective of Nvidia/Business, I can see why they may have not wanted to communicate this to the consumer, at least in a meeting environment. Is the card equipped with 4GB of VRAM? Yes. Will games have access to the full 4GB? Yes. Would trying to explain to the _*average*_ consumer split memory pool designs confuse them, and cause potential damage? Yes. Should we confuse our customers? No.

This is a really clear example of good business decisions not translating into good consumer/public decisions.

As for the performance argument, you can't make it, there isn't one. This new *knowledge*, again, doesn't change the performance of the card from release day to now or in the future. The _"Well if we didn't have this configuration and had an extra port...."_ isn't valid at all, because it is the same as _"Well if my 970 had extra SMX units...."_, or, _"If it had HBM"_, basically you can _"What if..."_ all day.

The card was released, it was benched against others, and those are still valid. The performance is the performance, arguing _"What if they didn't..."_ is no different than arguing for any other hardware difference. I can sit here and _"What if..."_ myself a GTX 970 for $100 with dual GPUs; still doesn't change the fact that the performance of the 970 is what it is.

Disclosure: I don't own a 970, these are just general thoughts.

P.S.

Still have part of me that feels that maybe the 970 really truly was the 960(Ti), the 980 is the 970, and the possible 980 Ti or Titan would really be a 980. It is one thing to argue the logic of Nvidia selling mid range as top tier based only on a naming convention (GMxxx vs GKxxx, etc). It is an entirely different thing when you have an actual hardware difference slapping you right in the face making the argument.

EDIT:

Oh, and I think Nvidia is going to get bit right in the ass for this, mainly over the incorrect specifications sold on the box. I think some refunds will be in order.
Quote:


> Originally Posted by *mtcn77*
> 
> Yep, yep.
> So I calculated how many falsehoods they unveiled:
> 
> 1 out of 8 GM204 memory interface ports is missing: 256-bit > 224-bit.
> The remaining seven memory device ports thus provide 7/8 of the card's announced maximum theoretical bandwidth.
> The bus with the missing port has to be waited on; its transactions run at 1/7 the speed of the rest of the configuration.
> The bus with the missing port delays the rest of the ports due to the 2:1 memory bus to memory interface configuration.
> Can't compute... 224 GB/s card... now... 172 GB/s...


So, yes, you are correct in this (incorrect specs printed in some way). I think this is where Nvidia is going to get hit: not on the performance itself, but on the advertised specifications of the card. People will now VERY easily be able to say _"Hey, I bought XYZ, and was sold LMN without disclosure."_ Nvidia is going to have to answer for that one.


----------



## sugalumps

Quote:


> Originally Posted by *mtcn77*
> 
> Yep, yep.
> So I calculated how many falsehoods they unveiled:
> 
> 1 out of 8 GM204 memory interface ports is missing: 256-bit > 224-bit.
> The remaining seven memory device ports thus provide 7/8 of the card's announced maximum theoretical bandwidth.
> The bus with the missing port has to be waited on; its transactions run at 1/7 the speed of the rest of the configuration.
> The bus with the missing port delays the rest of the ports due to the 2:1 memory bus to memory interface configuration.
> Can't compute... *224* GB/s card... *now*... *172* GB/s...


Well, you win this time; Goldentiger will show up the next time AMD messes up.


----------



## iSlayer

Quote:


> Originally Posted by *criminal*
> 
> Yes we need testing. Never said we didn't. But users wanting a refund because of being lied to is justifiable as well. It may not be an issue right now, but it may be in the future. Users with a return window right now have reasons to because they possible won't be able to return at a later date.


Oh, well, yeah, if you're talking about people who are within their refund window, definitely return. I'd love to still be able to return mine; I have GM200 and Fiji on my mind.

I'm one of those customers that was lied to who wants a refund. Even if Nvidia is right and it's a small performance gap, they still lied, they still messed up, and I still want my cash back.

From now on I'm going flagship GPU only.
Quote:


> Originally Posted by *Noufel*
> 
> No, Roy works for Nvidia now


Roy used to work for Nvidia and now works for AMD.

The fact he's worked for both companies is all the evidence I needed that both Nvidia and AMD have a few loose screws in management.
Quote:


> Originally Posted by *mtcn77*
> 
> Yep, yep.
> So I calculated how many falsehoods they unveiled:
> 
> 1 out of 8 GM204 memory interface ports is missing 256 bit > 224 bit,
> The rest of memory device ports, seven, are thus providing 7/8th the card's announced maximum theoretical bandwidth,
> The bus with the missing port has to wait on its transactions, which run at 1/7th the speed of the rest of the configuration,
> The bus with the missing port delays the rest of the ports due to the 2:1 memory bus to memory interface configuration.
> Can't compute... *224* GB/s card... *now*... *172* GB/s...


So, we have the guy who doesn't own a 970, has made it his personal vendetta to spread FUD, and has filled a thread he has no business posting in this many times, versus Anandtech. I'm going to go with Anandtech.

http://anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation
Quote:


> Originally Posted by *morbid_bean*
> 
> @Kinaesthetic and @iSlayer
> 
> Thanks Guys for the Mature reply


http://anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation

In depth stuff here.


----------



## criminal

Quote:


> Originally Posted by *iSlayer*
> 
> Oh well yah if you're talking about people who are within their refund window, definitely return. I'd love to be able to return still, I have GM200 and Fiji on my mind.
> 
> I'm one of those customers that was lied to who wants a refund. Even if Nvidia is right and it's a small performance gap, they still lied, they still messed up, and I still want my cash back.
> 
> From now on i'm going flagship GPU only.


We agree!!!

Edit: On a funnier note, where is Goldentiger? He was like the one-man crusade of getting people to buy the 970 over everything! Best card ever!


----------



## lester007

That's still a good move by Nvidia, so I can return my 970 and think about what GPU I should get.


----------



## RagingCain

Quote:


> Originally Posted by *criminal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iSlayer*
> 
> Oh well yah if you're talking about people who are within their refund window, definitely return. I'd love to be able to return still, I have GM200 and Fiji on my mind.
> 
> I'm one of those customers that was lied to who wants a refund. Even if Nvidia is right and it's a small performance gap, they still lied, they still messed up, and I still want my cash back.
> 
> From now on i'm going flagship GPU only.
> 
> 
> 
> We agree!!!
> 
> Edit: *On a funnier note, where is Goldentiger*? He was like the one-man crusade of getting people to buy the 970 over everything! Best card ever!
Click to expand...

*Looks at criminal's custom avatar sentence.*

Silence is golden ?


----------



## tpi2007

I like Nvidia GPUs, but I just felt my intelligence was insulted twice: first by Nvidia, and then by sites such as PCPer and Anandtech that said they believe them (they could at least have taken a neutral stance).

I'm just having a very hard time believing that nobody from Nvidia's engineering staff read ANY of the GTX 970 reviews and noticed that the card's specs were being misrepresented everywhere. I also have a hard time believing that nobody from Nvidia's engineering staff used a utility called GPU-Z, or saw screenshots from said utility scattered all over the Internet in the past four months, where it's written that the 970 has 64 ROPs. In essence, I have a hard time believing that these people live in a cave when it comes to their own products.

The fact is that AMD had two cards with 64 ROPs and 4 GB of VRAM for a year when they launched these, so from a PR point of view matching AMD's cards in ROP count and claiming that the GPU has more L2 cache than it really has, helping ease the bandwidth concerns of a 256-bit bus, most certainly did impact the audience.

And this only further compounds the value assessment (and false publicity):

http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation/2
Quote:


> GTX 970 can read the 3.5GB segment at 196GB/sec (7GHz * 7 ports * 32-bits), or it can read the 512MB segment at 28GB/sec, *but not both at once*; it is a true XOR situation. Furthermore because the 512MB segment cannot be read at the same time as the 3.5GB segment, *reading this segment blocks accessing the 3.5GB segment for that cycle, further reducing the effective memory bandwidth of the card.* The larger the percentage of the time the crossbar is reading the 512MB segment, the lower the effective memory bandwidth from the 3.5GB segment.


How do you calculate how consumer decision making was impacted? It's hard to tell, but I'm sure some lawyers - from AMD, no less - are probably doing the math right now. Yes, the 970 still performs the same as yesterday, but it doesn't have the same headroom people thought it had, and that is a major difference when you're evaluating a card's value over its useful lifetime. So, will it perform the same 'tomorrow', when new games don't optimize for this memory configuration? Or when Nvidia decides to stop optimizing for it?

Why did they disable the ROPs and L2 cache anyway? Was it really needed from a technical standpoint? They didn't explain. It seems like they did it because of segmentation / binning, as Anandtech also says.
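
For what it's worth, the either/or bandwidth figures being thrown around in this thread can be sanity-checked with quick arithmetic. This is only a back-of-envelope sketch using the numbers quoted above (7 Gbps effective GDDR5 rate, 32-bit channels, 8 channels on a full GM204), not an official Nvidia breakdown:

```python
# Back-of-envelope check of the GTX 970 bandwidth figures quoted in the thread.
# Assumes a 7 Gbps effective GDDR5 data rate and 32-bit memory channels,
# with 8 channels on a fully enabled GM204.

GDDR5_RATE_GBPS = 7      # effective Gbit/s per pin
CHANNEL_BITS = 32        # width of one memory channel

def bandwidth_gbs(channels: int) -> float:
    """Aggregate bandwidth in GB/s for the given number of 32-bit channels."""
    return channels * CHANNEL_BITS * GDDR5_RATE_GBPS / 8  # bits -> bytes

print(bandwidth_gbs(8))  # 224.0 GB/s - the advertised 256-bit figure
print(bandwidth_gbs(7))  # 196.0 GB/s - the 3.5 GB segment
print(bandwidth_gbs(1))  #  28.0 GB/s - the 0.5 GB segment
```

Since the two segments are accessed XOR-style, the card never actually sustains 224 GB/s; only adding the two pools' separate peaks produces that number.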


----------



## Orangey

Quote:


> Originally Posted by *criminal*
> 
> Edit: On a funnier note, where is Goldentiger? He was like the one man crusade of getting people to buy the 970 over everything! Best card ever!


He's in a holding pattern, if this blows over he will return. If not he will have to upgrade to 980s so he can keep spewing.


----------



## ebduncan

I think Nvidia screwed themselves out of more money, honestly.

Just think if the 970's specs had been right from the start and it had only come with 3.5 GB of VRAM. I bet they would have sold a boatload more 980s.

People may also have considered the 290X or the 290 more.

Oh well, what's done is done. Now the world knows the truth. Don't get me wrong, the 970 is a good card, but if I were an owner of one right now I can't say I wouldn't feel cheated. This whole issue would make me rather salty. It's a low blow, so sorry to the 970 users.


----------



## doomlord52

Quote:


> Originally Posted by *PostalTwinkie*
> 
> So, yes, you are correct in this (incorrect specs printed in some way). I think this is where Nvidia is going to get hit: not on the performance itself, but on the advertised specifications of the card. People will now VERY easily be able to say _"Hey, I bought XYZ, and was sold without disclosure LMN."_ Nvidia is going to have to answer for that one.


Yep, exactly this.

Regardless of whether it was an internal error (accident) or intentional, specs were printed and used for advertising, and those specs have been shown to be false by Nvidia themselves. It's a pretty open-and-shut case: Nvidia falsely advertised the GTX 970. The EU and (apparently) Australia, as well as many other countries, have very strict laws protecting consumers from this - NV will almost certainly get hit.

As for what the consumer will get out of it; well, that's hard to tell. Some people are suggesting that NV will do the "free game in exchange for your right to sue" thing, while others are expecting something completely different (replacement, upgrade, etc.). My bet is that NV will try the 'free game' approach, but only in some areas. The EU will likely go after them purely on principle, which would result in some other form of compensation.


----------



## PostalTwinkie

Quote:


> Originally Posted by *tpi2007*
> 
> I'm sorry, I like Nvidia GPUs, but I just felt my intelligence was insulted two times. First by Nvidia and then by sites such as PCPer and Anandtech that said they believe them.
> 
> I'm sorry, I'm just having a very hard time believing that nobody from Nvidia's engineering staff read ANY of the GTX 970 reviews and noticed that the card's specs were being misrepresented everywhere. I also have a hard time believing that nobody from Nvidia's engineering staff used a utility called GPU-Z or saw screenshots from said utility scattered all over the Internet in the past four months where it's written that the 970 has 64 ROPs. In essence, I have a hard time believing that these people live in a cave when it comes to their own products.
> 
> The fact is that AMD had two cards with 64 ROPs and 4 GB of VRAM for a year when they launched these, so from a PR point of view matching AMD's cards in ROP count and claiming that the GPU has more L2 cache than it really has, helping ease the bandwidth concerns of a 256-bit bus, most certainly did impact the audience.
> 
> And this only further compounds the value assessment (and false publicity):
> 
> http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation/2


Oh, I think they knew. The card was just selling so damn well that they didn't want to interrupt that. I can't imagine their official website, all the marketing campaigns, etc. going unchecked for that long. It was also pretty golden when Anand suggested that it was not Nvidia's fault, but that of journalists and reviewers.....
Quote:


> Meanwhile as press we play a role in this as well, as in retrospect we should have seen this sooner.


He then continues by saying it was due to the software they had reporting the specifications - which we now know to be correct - yet goes on to mention that the various issues they noticed were later corrected by a driver update.

The fact the writer would try and take any of the blame away from Nvidia is a bit disturbing and dropped Anand on my respect meter a little.


----------



## RagingCain

Quote:


> Originally Posted by *tpi2007*
> 
> I'm sorry, I like Nvidia GPUs, but I just felt my intelligence was insulted two times. First by Nvidia and then by sites such as PCPer and Anandtech that said they believe them.
> 
> I'm sorry, I'm just having a very hard time believing that nobody from Nvidia's engineering staff read ANY of the GTX 970 reviews and noticed that the card's specs were being misrepresented everywhere. I also have a hard time believing that nobody from Nvidia's engineering staff used a utility called GPU-Z or saw screenshots from said utility scattered all over the Internet in the past four months where it's written that the 970 has 64 ROPs. In essence, I have a hard time believing that these people live in a cave when it comes to their own products.
> 
> The fact is that AMD had two cards with 64 ROPs and 4 GB of VRAM for a year when they launched these, so from a PR point of view matching AMD's cards in ROP count and claiming that the GPU has more L2 cache than it really has, helping ease the bandwidth concerns of a 256-bit bus, most certainly did impact the audience.
> 
> And this only further compounds the value assessment (and false publicity):
> 
> http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation/2
> Quote:
> 
> 
> 
> GTX 970 can read the 3.5GB segment at 196GB/sec (7GHz * 7 ports * 32-bits), or it can read the 512MB segment at 28GB/sec, *but not both at once*; it is a true XOR situation. Furthermore because the 512MB segment cannot be read at the same time as the 3.5GB segment, *reading this segment blocks accessing the 3.5GB segment for that cycle, further reducing the effective memory bandwidth of the card.* The larger the percentage of the time the crossbar is reading the 512MB segment, the lower the effective memory bandwidth from the 3.5GB segment.
> 
> 
> 
> How do you calculate how much? It's hard to tell, but I'm sure some lawyers - from AMD, no less - are probably doing the math right now. Yes, the 970 still performs the same as yesterday, but it doesn't have the same headroom people thought it had, and that is a major difference when you're evaluating a card's value over its useful lifetime. *So, will it perform the same 'tomorrow', when new games don't optimize for this memory configuration? Or when Nvidia decides to stop optimizing for it?*
> 
> Why did they disable the ROPs and L2 cache anyway? Was it really needed from a technical standpoint? They didn't explain. It seems like they did it because of segmentation / binning, as Anandtech also says.
Click to expand...

There is no way I would buy a GPU of this complexity, for exactly this kind of reason: once the optimization stops, weird crap will crop up. This card is not future-proof by design.

I still don't quite get this. I get that the GPU is modular, allowing fine-grained binning, but this is asininely modular. Not to mention everyone had to have known this was being done; this was not something they just decided on a whim: "yeah, let's disable a half dozen ROPs, some L2 cache, and half a gig of VRAM placed in a secondary partition."


----------



## Vesku

Quote:


> Originally Posted by *tpi2007*
> 
> I'm sorry, I like Nvidia GPUs, but I just felt my intelligence was insulted twice. First by Nvidia and then by sites such as PCPer and Anandtech that said they believe them.
> 
> I'm sorry, I'm just having a very hard time believing that nobody from Nvidia's engineering staff read ANY of the GTX 970 reviews and noticed that the card's specs were being misrepresented everywhere. I also have a hard time believing that nobody from Nvidia's engineering staff used a utility called GPU-Z or saw screenshots from said utility scattered all over the Internet in the past four months where it's written that the 970 has 64 ROPs. In essence, I have a hard time believing that these people live in a cave when it comes to their own products.


Yes, that's very hard to believe, really. Most likely, either no one wanted to be the one to point this out because of the implied office-politics brawl, or it was deliberate - mainly because the lower L2 cache doesn't seem like something that could be easily confused. If it wasn't deliberate, then someone on the technical documentation team probably "corrected" it to match the 980, thinking the difference was a mistake, and the lack of a correction was internal Nvidia politics.

AFAIK Nvidia still hasn't said they'll adjust the way they advertise the 970's memory and memory bandwidth, so that seems intentional from day one.


----------



## Heavy MG

Quote:


> Originally Posted by *lester007*
> 
> thats still a good move for nvidia, so i can return my 970 and think what gpu should i get


I think they should offer a return window for 970 owners and a discount to upgrade to a 980, though the 980 is still quite expensive IMO, which is why I bought the 970 instead. But I don't think a full hardware return would be entirely realistic. I could totally see them giving away a game title for free, or a Steam coupon.
Quote:


> Originally Posted by *vloeibaarglas*
> 
> Good lord Nvidia.
> Lower ROP count (64 -> 56)
> Misrepresented VRAM (4 GB -> 3.5 GB + 0.5 GB at 1/7th speed)
> Lower L2 cache (2 MB -> 1.75 MB)
> Lower theoretical memory bandwidth (224 -> 196 GB/s)


The true 970 specs are kind of disappointing; then again, not bad, as it does keep up with the 780 and 780 Ti in quite a few games. An upgrade at the price point of the 770 when it came out, and it does have a bit more VRAM at least :/


----------



## ZealotKi11er

I just want someone to make a poll and ask GTX 970 owners whether they would still have bought the GTX 970 when they did, knowing what they know now. Nothing else matters, really. Nothing has changed about the GTX 970; we just have more info now.


----------



## Redwoodz

Looking at the layout raises another question: why would you add another section of 0.5 GB of VRAM and a memory controller without its own crossbar port? The card obviously performs very well as a 3.5 GB, 196 GB/s GPU. Since the total 4 GB of VRAM can't be accessed at once, what is the benefit?


----------



## Vesku

Quote:


> Originally Posted by *Redwoodz*
> 
> Looking at the layout raises another question: why would you add another section of 0.5 GB of VRAM and a memory controller without its own crossbar port? The card obviously performs very well as a 3.5 GB, 196 GB/s GPU. Since the total 4 GB of VRAM can't be accessed at once, what is the benefit?


4GB 256-bit on the box vs 3.5GB 224-bit. Such a choice was made possible by Maxwell being more flexible than Kepler in the way it physically handles memory. That seems to be the whole reason.


----------



## PostalTwinkie

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I just want someone to make a poll and ask GTX970 owners if they would have still bought GTX970 when they did knowing what they know now. Nothing else matters really. Nothing has changed about GTX970. We just have more info now.


The poll would be completely invalid, because the perception of the product has changed. Unfortunately for the poll itself, perception is reality to most. Although I do feel comfortable saying that there are at least a few people who legitimately wouldn't have.


----------



## skupples

Mother of god... less than 3 hours and 99 more posts...

Get over yourselves. You got screwed... Buying any GXX04 product is allowing yourself to get screwed. It's your own damn fault for riding the hype wagon.

This is why you don't buy GXX04 cards. Period.

GIMP is the name of the GXX04 game.

Get used to it, or go back to consoles.

funny thing is, 970 is still a workhorse, at a low enough price point, that runs extremely cool, and sucks just north of an old world lightbulb.

Nothing changed, just perception.

You would think everyone had received a 50% reduction in performance, based on the faux outrage making the rounds in these threads.

Hell, most of the people posting said faux outrage don't even own a Maxwell product.

The real question is, how many times are people going to keep calling it a 3.5GB card when the article from PCPer says quite otherwise?

I mean really, how many of you, that do have 970s, are actually doing something with it where you hit a memory limitation BEFORE you hit a core power limitation?

ZOMG! SoM is running @ 20FPS @ Downsampled 4K AND I'M ALL OUTA MEMORY!!!!


----------



## Orthello

Suddenly I don't feel so bad for paying the premium for the 980s. I mean, here the 980 costs an exorbitant amount more than the 970, so anything that makes me feel better about it, I'll go with.

Still, it's going to be interesting to see what Nvidia comes up with to quell this, and whether free games will cut it.


----------



## Forceman

Quote:


> Originally Posted by *doomlord52*
> 
> Yep, exactly this.
> 
> Regardless of it being an internall error (accident) or intentional, specs were printed and used for advertising, and those specs have been shown to be false by Nvidia themselves. It's a pretty open-and-shut case; Nvidia falsely advertised the GTX 970. The EU and (apparently) Aus, as well as many other countries in the world have very strict laws protecting consumers from this - NV will almost certainly get hit.
> 
> As for what the consumer will get out of it; well, that's hard to tell. Some people are suggesting that NV will do the "free game in exchange for your right to sue" thing, while others are expecting something completely different (replacement, upgrade, etc.). My bet is that NV will try the 'free game' approach, but only in some areas. The EU will likely go after them purely on principle, which would result in some other form of compensation.


It's very similar to the 660 Ti, and they didn't get sued over that. They list 192-bit and 144 GB/sec for the 2GB card, but for part of the VRAM it's 64-bit and 48 GB/sec.

Edit: I originally said it's exactly the same, but the implementation is a little different, so it's only very similar. The bottom line is the same though: only part of the VRAM gets the listed performance.
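
Putting the two cards' quoted figures side by side makes the comparison concrete. This is just an illustrative sketch built from the bus widths and data rates mentioned in this thread (the dict layout is mine, not anything official):

```python
# Compare the asymmetric memory setups of the GTX 660 Ti and GTX 970,
# using the figures quoted in the thread (bus width in bits, rate in Gbps).

def gbs(bus_bits: int, rate_gbps: int) -> float:
    """Peak bandwidth in GB/s for a bus of `bus_bits` at `rate_gbps` per pin."""
    return bus_bits * rate_gbps / 8  # bits -> bytes

gtx_660_ti = {
    "listed": gbs(192, 6),  # 144.0 GB/s on the box
    "slow":   gbs(64, 6),   #  48.0 GB/s for the slower portion of VRAM
}
gtx_970 = {
    "listed": gbs(256, 7),  # 224.0 GB/s on the box
    "fast":   gbs(224, 7),  # 196.0 GB/s for the 3.5 GB pool
    "slow":   gbs(32, 7),   #  28.0 GB/s for the 0.5 GB pool
}
```

Either way, the pattern is the same: only part of the VRAM ever sees the listed number.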


----------



## fleetfeather

Skupps on that crusade right now... Gotta cool your jets chief


----------



## Wirerat

Quote:


> Originally Posted by *Forceman*
> 
> It's very similar to the 660 Ti, and they didn't get sued over that. They list 192-bit and 144 GB/sec for the 2GB card, but for part of the VRAM it's 64-bit and 48 GB/sec.
> 
> Edit: I originally said it's exactly the same, but the implementation is a little different, so it's only very similar. The bottom line is the same though: only part of the VRAM gets the listed performance.


That impacted performance more, too, as it was 25% of the total.


----------



## Menta

Is there any update on the issue?


----------



## Vesku

Quote:


> Originally Posted by *Forceman*
> 
> It's very similar to the 660 Ti, and they didn't get sued over that. They list 192-bit and 144 GB/sec for the 2GB card, but for part of the VRAM it's 64-bit and 48 GB/sec.
> 
> Edit: I originally said it's exactly the same, but the implementation is a little different, so it's only very similar. The bottom line is the same though: only part of the VRAM gets the listed performance.


Does any of the VRAM get the full 224GB/sec?

"To those wondering how peak bandwidth would remain at 224 GB/s despite the division of memory controllers on the GTX 970, Alben stated that it can reach that speed only when memory is being accessed in both pools." - PCPer write up

They add the 3.5GB bandwidth to the 0.5GB bandwidth to get 224 GB/sec, which is even more disingenuous than it sounds, because the two pools can't operate at the same time. It's an XOR memory segmentation.

"GTX 970 can read the 3.5GB segment at 196GB/sec (7GHz * 7 ports * 32-bits), or it can read the 512MB segment at 28GB/sec, but not both at once; it is a true XOR situation. Furthermore because the 512MB segment cannot be read at the same time as the 3.5GB segment, reading this segment blocks accessing the 3.5GB segment for that cycle, further reducing the effective memory bandwidth of the card. The larger the percentage of the time the crossbar is reading the 512MB segment, the lower the effective memory bandwidth from the 3.5GB segment." - Anandtech write up

That 224 GB/s figure doesn't actually sound true even the way Nvidia spins it. That, or Anandtech is incorrect and the two segments can be accessed simultaneously.
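
The XOR behaviour Anandtech describes can be modelled crudely: if some fraction of crossbar cycles goes to the 0.5 GB pool, effective bandwidth is a weighted average of the two pools' rates, never their sum. This is a toy model under that assumption, not Nvidia's actual arbitration logic; the 196 and 28 GB/s figures come from the quotes above:

```python
# Toy model of the XOR segment access described by Anandtech: each crossbar
# cycle services either the 3.5 GB pool (196 GB/s) or the 0.5 GB pool
# (28 GB/s), so effective bandwidth is a weighted average, never the sum.

FAST_POOL_GBS = 196.0  # 3.5 GB segment, 7 channels
SLOW_POOL_GBS = 28.0   # 0.5 GB segment, 1 channel

def effective_bandwidth(slow_fraction: float) -> float:
    """Effective GB/s when `slow_fraction` of cycles hit the 0.5 GB pool."""
    return (1 - slow_fraction) * FAST_POOL_GBS + slow_fraction * SLOW_POOL_GBS

print(effective_bandwidth(0.0))  # 196.0 - never touching the slow pool
print(effective_bandwidth(0.1))  # ~179.2 - 10% of cycles in the slow pool
print(effective_bandwidth(1.0))  #  28.0 - worst case
# No value of slow_fraction ever yields the advertised 224 GB/s.
```

Under this model the best the card can ever do is 196 GB/s, which is the point Vesku is making.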


----------



## looniam

Quote:


> Originally Posted by *Orangey*
> 
> Quote:
> 
> 
> 
> Originally Posted by *criminal*
> 
> Edit: On a funnier note, where is Goldentiger? He was like the one man crusade of getting people to buy the 970 over everything! Best card ever!
> 
> 
> 
> He's in a holding pattern, if this blows over he will return. If not he will have to upgrade to 980s so he can keep spewing.
Click to expand...

So I take it neither of you has been to [ another forum ] and seen him complaining that he was ready to return a 4K display, thinking the stuttering was the display's fault?

(hint provided . .)

And by the way, nothing in any _updated information_ changes the fact that the developer himself stated:


----------



## Exilon

Quote:


> Originally Posted by *Forceman*
> 
> The performance is what the performance is. The specification table of all the reviews is wrong, but the actual test results haven't changed.


It's a matter of consumer confidence.
Quote:


> Originally Posted by *Forceman*
> 
> It's very similar to the 660 Ti, and they didn't get sued over that. They list 192-bit and 144 GB/sec for the 2GB card, but for part of the VRAM it's 64-bit and 48 GB/sec.
> 
> Edit: I originally said it's exactly the same, but the implementation is a little different, so it's only very similar. The bottom line is the same though: only part of the VRAM gets the listed performance.


Actually, it's different.

The GTX 660 Ti had one controller with two banks of two chips attached to it, so reading the lower 1.5 GB had the full 192-bit * 6 GHz bandwidth. The GTX 970 has three full controllers with two channels attached to each, and one controller in a special setup where only one 32-bit channel is active for the 3.5 GB pool. The result is 7 * 32-bit * 7 GHz of bandwidth for the 3.5 GB segment and 32-bit * 7 GHz for the last 500 MB. None of the VRAM gets the full 256-bit * 7 GHz bandwidth we expect.


----------



## tpi2007

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Oh, I think they knew. The card was just selling so damn well that they didn't want to interrupt that. I can't imagine seeing their official website, all marketing campaigns, etc. going unchecked for that long. It was also pretty golden when Anand suggested that it was not Nvidias fault, but that of journalists and reviewers.....
> Quote:
> 
> 
> 
> Meanwhile as press we play a role in this as well, as in retrospect we should have seen this sooner.
> 
> 
> 
> Then continuing to say it is due to software they had reporting the, what we now know being correct, specifications. Yet then goes on to mention that the various issues they noticed were later corrected by a driver update.
> 
> The fact the writer would try and take any of the blame away from Nvidia is a bit disturbing and dropped Anand on my respect meter a little.
Click to expand...

They should at least have taken a more neutral stance. We all know how cynical businesses can be, so why take sides when they obviously don't know what actually happened or what motivations were involved - especially when common sense says it is so far-fetched that nobody noticed for four months? The best approach to appease both Nvidia and your readers is to not take sides and just report what they said. It left me uneasy with them too.

Quote:


> Originally Posted by *RagingCain*
> 
> There is no way I would buy a GPU of this complexity for exactly this kind of reasoning: once the optimization stops, weird crap will crop up. This card is not future proof by design.
> 
> I still don't quite get this. I get that the GPU is modular granting fine graining but this is asininely modular. Not to mention everyone had to have known this was being done, this was not something they just decided, yeah lets disable a half dozen ROPs, L2 Cache, and half a gig of VRAM placed in a secondary partition.


They sure must make a lot of money on those better yields to go through all this trouble. The problem is that they can then discontinue optimization, and 970 owners are left behind. This isn't the same as the GTX 570 and its low VRAM amount, but the question marks regarding future-proofing (as much as it can be had) are there too.

Quote:


> Originally Posted by *Vesku*
> 
> Yes, that's very hard to believe really. Most likely is either no one wanted to be the one to point this out because of the implied office politics brawl or it was deliberate. Mainly because the lower L2 cache doesn't seem like something that could be easily confused. If it wasn't deliberate then someone on the technical documentation team probably "corrected" to match the 980 thinking it being different was a mistake and the lack of a correction was internal Nvidia politics.
> 
> AFAIK Nvidia still hasn't said they'll adjust the way they advertise the 970s memory and memory bandwidth so that seems intentional from day one.


This is especially important as the card's effective memory bandwidth is *either* 196 GB/s *or* 28 GB/s. Saying it's 224 GB/s is like saying you have a car that can go at 196 km/h in 5th gear and 28 km/h in 1st gear, therefore the car's top speed is 224 km/h. And with this I have just proved that you can actually make a car analogy that works.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I just want someone to make a poll and ask GTX970 owners if they would have still bought GTX970 when they did knowing what they know now. Nothing else matters really. Nothing has changed about GTX970. We just have more info now.


Wrong. The headroom you thought was there isn't. The card has less bandwidth available, and it has less L2 cache too. It is dependent on heuristics, and possibly per-application optimization, to mitigate performance penalties - which in turn depends on Nvidia caring enough to keep optimizing. You have to subtract points from the future-proofing part of the equation when comparing cards and their respective prices.


----------



## provost

Quote:


> Originally Posted by *RagingCain*
> 
> There is no way I would buy a GPU of this complexity for exactly this kind of reasoning: once the optimization stops, weird crap will crop up. This card is not future proof by design.
> 
> I still don't quite get this. I get that the GPU is modular granting fine graining but this is asininely modular. Not to mention everyone had to have known this was being done, this was not something they just decided, yeah lets disable a half dozen ROPs, L2 Cache, and half a gig of VRAM placed in a secondary partition.


You know, one would have to assume a segmentation decision at this level was not made without a strategic plan to implement it across future SKU releases if it turned out to be a success. At least, this is how it works in the real world of corporate decision making. Well, the upshot here may be that this got caught early, and Nvidia won't try this type of segmentation again without full disclosure, given the PR disaster it has turned into....

As one of my colleagues from the GK110 thread said earlier, they don't have to do it, since they make enough money already... but enough is never enough when it comes to generating returns for the stakeholders, unless they can't get away with it in a competitive market, and that's just the way it goes.. lol


----------



## iSlayer

Quote:


> Originally Posted by *criminal*
> 
> Yes, we need testing. Never said we didn't. But users wanting a refund because they were lied to is justifiable as well. It may not be an issue right now, but it may be in the future. Users with a return window right now have reason to use it, because they possibly won't be able to return at a later date.
> 
> Edit: Sorry for the shill comment, but saying over and over that the 970 is the same and just our perception has changed kinda pushed me in that direction. Fact is, a lot of the 970 owners who think this is a big deal would not have bought the cards to begin with had they known. Which means the 970 has changed as well as perception.


Didn't see that edit.

970 owners seem to be less outraged than other people I could mtcn, I mean mention. Again, just look at how many people are panic selling. It may just be a silent majority, but the reaction is really subdued.
Quote:


> Originally Posted by *criminal*
> 
> We agree!!!
> 
> Edit: On a funnier note, where is Goldentiger? He was like the one-man crusade of getting people to buy the 970 over everything! Best card ever!


Yes, we seem to agree on more than a few things.

Honestly, I haven't even thought about the guy. He, if anyone, would do damage control. The man eats, drinks, and sleeps SLI 970s. I wonder how Nvidia's DX11 optimizations will fix the 970s.
Quote:


> Originally Posted by *RagingCain*
> 
> *Looks at criminal's custom avatar sentence.*
> 
> Silence is golden ?


Too perfect lol.
Quote:


> Originally Posted by *tpi2007*
> 
> I like Nvidia GPUs, but I just felt my intelligence was insulted twice: first by Nvidia, and then by sites such as PCPer and Anandtech that said they believe them (they could at least have taken a neutral stance).
> 
> I'm just having a very hard time believing that nobody from Nvidia's engineering staff read ANY of the GTX 970 reviews and noticed that the card's specs were being misrepresented everywhere. I also have a hard time believing that nobody from Nvidia's engineering staff used a utility called GPU-Z or saw screenshots from said utility scattered all over the Internet in the past four months where it's written that the 970 has 64 ROPs. In essence, I have a hard time believing that these people live in a cave when it comes to their own products.
> 
> The fact is that AMD had two cards with 64 ROPs and 4 GB of VRAM for a year when they launched these, so from a PR point of view matching AMD's cards in ROP count and claiming that the GPU has more L2 cache than it really has, helping ease the bandwidth concerns of a 256-bit bus, most certainly did impact the audience.
> 
> And this only further compounds the value assessment (and false publicity):
> 
> http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation/2
> How do you calculate how consumer decision making was impacted? It's hard to tell, but I'm sure some lawyers - from AMD, no less - are probably doing the math right now. Yes, the 970 still performs the same as yesterday, but it doesn't have the same headroom people thought it had, and that is a major difference when you're evaluating a card's value over its useful lifetime. So, will it perform the same 'tomorrow', when new games don't optimize for this memory configuration? Or when Nvidia decides to stop optimizing for it?
> 
> Why did they disable the ROPs and L2 cache anyway? Was it really needed from a technical standpoint? They didn't explain. It seems like they did it because of segmentation / binning, as Anandtech also says.


Again, whether Nvidia didn't realize it or lied isn't the important thing; that only serves as a smokescreen for the real issue. The issue is that Nvidia messed up.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> I just want someone to make a poll and ask GTX970 owners if they would have still bought GTX970 when they did knowing what they know now. Nothing else matters really. Nothing has changed about GTX970. We just have more info now.


Quote:


> Originally Posted by *PostalTwinkie*
> 
> The poll would be completely invalid, because the perception of the product has changed. Unfortunately for the poll itself, perception is reality to most. Although I do feel comfortable saying that there are at least a few people who legitimately wouldn't have.


This along with hindsight bias. It's hard to tell what I would've done. I probably would've looked at the perf and still gone with it.

The 970 is only a stopgap GPU from the 770 for me till GM200 and Fiji.
Quote:


> Originally Posted by *fleetfeather*
> 
> Skupps on that crusade right now... Gotta cool your jets chief


I don't blame him for getting upset, this issue has got me tense at least.
Quote:


> Originally Posted by *Menta*
> 
> IS THERE any update on the issue?


No, we're still waiting on testing.


----------



## Forceman

Quote:


> Originally Posted by *Vesku*
> 
> Does any of the VRAM get the full 224GB/sec?
> 
> "To those wondering how peak bandwidth would remain at 224 GB/s despite the division of memory controllers on the GTX 970, Alben stated that it can reach that speed only when memory is being accessed in both pools." - PCPer write up
> 
> They add the 3.5GB bandwidth to the 0.5GB bandwidth to get 224GB/sec, which is even more disingenuous than it sounds because they can't operate at the same time. It's an XOR memory segmentation.
> 
> "GTX 970 can read the 3.5GB segment at 196GB/sec (7GHz * 7 ports * 32-bits), or it can read the 512MB segment at 28GB/sec, but not both at once; it is a true XOR situation. Furthermore because the 512MB segment cannot be read at the same time as the 3.5GB segment, reading this segment blocks accessing the 3.5GB segment for that cycle, further reducing the effective memory bandwidth of the card. The larger the percentage of the time the crossbar is reading the 512MB segment, the lower the effective memory bandwidth from the 3.5GB segment." - Anandtech write up
> 
> That 224 GB/s doesn't actually sound true even the way Nvidia spins it. That or Anandtech is incorrect and they can be accessed simultaneously.


Yeah, that's a good point actually. I don't think the PCPer article made that point as clearly.
Quote:


> Originally Posted by *Exilon*
> 
> It's a matter of consumer confidence.
> Actually it's different.
> 
> The GTX 660 Ti had one controller with two banks of two chips attached to it, so reading the lower 1.5 GB had full 192-bit * 6 GHz bandwidth. The GTX 970 has 3 controllers with 2 chips attached to each, and one controller with a special setup where only one 32-bit channel is active for the 3.5 GB. The result is 7 * 32-bit * 7 GHz bandwidth for the 3.5 GB, and 32-bit * 7 GHz for the last 500 MB. None of the VRAM gets the full bandwidth we'd expect from 256-bit * 7 GHz.


Yeah, Vesku pointed that out.

Still not sure it matters though, from a legal standpoint. It does still have a full 256-bit bus between the VRAM chips and the memory controllers, and a theoretical 224 GB/sec over that path; it's just that the memory controller isn't able to provide that speed internally. Nvidia is hardly the only company in the world that fudges what is essentially marketing data. The ROP/L2 discrepancy is more clear cut.

Edit: Actually, the PCPer comment kind of makes it sound like both pools could be accessed simultaneously.
Quote:


> *To those wondering how peak bandwidth would remain at 224 GB/s despite the division of memory controllers on the GTX 970, Alben stated that it can reach that speed only when memory is being accessed in both pools.


----------



## Vesku

Quote:


> Originally Posted by *looniam*
> 
> so i take it neither of you have been to [ another forum ] complaining that his was ready to return a 4K display think the stuttering was the display's fault?
> 
> (hint provided . .)
> 
> and btw, nothing in any _updated informaton_ changes the fact that the developer himself stated:


It still caught the difference, though. When run with no other VRAM being used, that 512MB section is 1/7th the speed of the rest.


----------



## SandGlass

Extremetech benchmarks are up,

2 game benchmarks:




1 synthetic:


I can believe that the initial specs were an honest mistake, but I highly doubt that nobody at Nvidia caught it in the 4 months that followed. False advertising until they got caught.


----------



## spacin9

Quote:


> Originally Posted by *Wirerat*
> 
> Two words. Nvidea knew.


nvidia lied and people cried.


----------



## ZealotKi11er

Is the GTX 970 suddenly a worse card? At the end of the day it destroyed R9 290/290X prices for AMD fans to buy, and GTX 780/Ti resale value. It did its job: it lowered the prices of all cards. The fact that Nvidia's fastest single GPU is $550 is a good thing. The GTX 780 released a year before with 3GB, versus the GTX 970, which was better in every way for half the price.


----------



## Woundingchaney

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Is the GTX 970 suddenly a worse card? At the end of the day it destroyed R9 290/290X prices for AMD fans to buy, and GTX 780/Ti resale value. It did its job: it lowered the prices of all cards. The fact that Nvidia's fastest single GPU is $550 is a good thing. The GTX 780 released a year before with 3GB, versus the GTX 970, which was better in every way for half the price.


For those of us looking at the potential longevity of the card: yes, it is suddenly a worse card. There can be no doubt that this revelation changes its future usability.


----------



## justinone

I would like an update on this. Is a driver fix possible? Or do I have to start thinking about returning my 970s for a 980 instead?


----------



## Exilon

Quote:


> Originally Posted by *Forceman*
> 
> Still not sure it matters though, from a legal standpoint. It does still have a full 256-bit bus between the VRAM chips and the memory controllers, and a theoretical 224 GB/sec over that path, it's just that the memory controller isn't able to provide that speed internally. Nvidia is hardly the only company in the world that fudges what is essentially marketing data. The ROP/L2 discrepancy is more clear cut.
> 
> Edit: Actually, the PCPer comment kind of makes it sound like both pools could be accessed simultaneously.


No, both pools can't be accessed simultaneously.

The RAM is striped for the 224-bit bus and the last memory controller can only use one of its read ports from L2. So if the GPU needs to read the 3.5 GB partition, it cannot read the 0.5 GB partition. If the GPU needs to read the 0.5 GB partition, the 7th stripe on the 3.5 GB partition is blocked and the larger partition can't be read.


----------



## skupples

so does this mean maxwell cores scale poorly? would the 970 be EVEN closer to the 980 if it could properly address this 512MB of memory? Or are they microsoft megabytes?

I'm still trying to figure out how many people here have actually been truly that close to the edge. Plenty of modern games will load every last MB; this happens with my Titans ALL THE TIME, even @ 1080p in most modern titles. The thing is, another card with only 4GB comes around and experiences the same thing, with near identical numbers...

I mean, if you're running 1440p or higher with a single, cut, GXX04 card... your priorities and expectations be whack.


----------



## Xoriam

Quote:


> Originally Posted by *Exilon*
> 
> No, both pools can't be accessed simultaneously.
> 
> The RAM is striped for the 224-bit bus and the last memory controller can only use one of its read ports from L2. So if the GPU needs to read the 3.5 GB partition, it cannot read the 0.5 GB partition. If the GPU needs to read the 0.5 GB partition, the 7th stripe on the 3.5 GB partition is blocked and the larger partition can't be read.


it can use both. chips 7 and 8 are just using the same L2; they said the L2 and MC are fast enough to handle both chip 7 and the slower speed of chip 8 without a bottleneck.

I have no idea why you would say that both partitions are not accessible at the same time, since I play daily with 4GB used.

and the benchmarks being posted lately are all using above 3.5GB of RAM.


----------



## Seven7h

Quote:


> Originally Posted by *RagingCain*
> 
> There is no way I would buy a GPU of this complexity for exactly this kind of reasoning: once the optimization stops, weird crap will crop up. This card is not future proof by design.
> 
> I still don't quite get this. I get that the GPU is modular granting fine graining but this is asininely modular. Not to mention everyone had to have known this was being done, this was not something they just decided, yeah lets disable a half dozen ROPs, L2 Cache, and half a gig of VRAM placed in a secondary partition.


The "optimization" won't stop. It's executed heuristically, and it's NVIDIA's core memory management logic plus the OS's logic that has always been in place for all GPUs.

This is not some special, hacky logic or hand tuning.


----------



## Vesku

Quote:


> Originally Posted by *Xoriam*
> 
> it can use both. chip 7 and 8 are just using the same L2, they said the L2 and MC are fast enough to handle both chip 7 and the slower speed of chip 8 without a bottleneck.
> 
> I have no idea why you would say that both partitions are not acessible at the same time since I play daily with 4gb used.
> 
> and the benchmakrs being posted lately are all using above 3,5gb of ram.


The 3.5GB and 0.5GB can't both be read at the same time, so you don't ever get 224GB/s of bandwidth. Your card is either accessing the 3.5GB at 196GB/s or the 0.5GB at 28GB/s, never both at once.
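The numbers above can be sanity-checked with some quick arithmetic. A minimal sketch, assuming the figures from the Anandtech quote (7 GHz effective GDDR5, seven 32-bit ports for the fast segment, one for the slow):

```python
# Back-of-envelope GTX 970 segment bandwidths from the quoted figures.
def segment_bandwidth_gbps(effective_clock_ghz, ports, port_width_bits=32):
    """Peak read bandwidth in GB/s for a group of memory ports."""
    return effective_clock_ghz * ports * port_width_bits / 8

fast = segment_bandwidth_gbps(7.0, 7)  # 3.5GB segment: 7 ports
slow = segment_bandwidth_gbps(7.0, 1)  # 0.5GB segment: 1 port
print(fast, slow, fast + slow)         # 196.0 28.0 224.0
```

So the 224 GB/s headline number is just the sum of two rates that, per the XOR description, are never realized in the same cycle.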


----------



## cowie

some of you guys are crazy. why did you get the card? because it's the best price/performance card out atm. but you hear "oh, it's got 3.5GB" and now you cry? the performance has not changed; the reviews were right on that. so you cry "I was lied to," but look, you were a cheap (or smart, really) guy to get one over a 980, so give it a rest already.

jeez, I got a Zotac POS 980 that won't SLI with any other card, and it says SLI on the box. sure I got mad, but damn, you guys act like it's a dual-980 card with 8GB of RAM.


----------



## looniam

Quote:


> Originally Posted by *Vesku*
> 
> It still caught the difference, though. When run with no other VRAM being used, that 512MB section is 1/7th the speed of the rest.


it also gave a result saying my 780 Ti had a problem when tested headless with both WDM services and the swap file turned off.

pro tip: you don't treat a benchmark's results as valid just because it _accidentally_ finds something.


----------



## tpi2007

Quote:


> Originally Posted by *Xoriam*
> 
> it can use both. chip 7 and 8 are just using the same L2, they said the L2 and MC are fast enough to handle both chip 7 and the slower speed of chip 8 without a bottleneck.
> 
> I have no idea why you would say that both partitions are not acessible at the same time since I play daily with 4gb used.
> 
> and the benchmakrs being posted lately are all using above 3,5gb of ram.


Anandtech says otherwise:

http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation/2
Quote:


> This in turn is why the 224GB/sec memory bandwidth number for the GTX 970 is technically correct and yet still not entirely useful as we move past the memory controllers, as it is not possible to actually get that much bandwidth at once on the read side. GTX 970 can read the 3.5GB segment at 196GB/sec (7GHz * 7 ports * 32-bits), or it can read the 512MB segment at 28GB/sec, but not both at once; it is a true XOR situation. Furthermore because the 512MB segment cannot be read at the same time as the 3.5GB segment, reading this segment blocks accessing the 3.5GB segment for that cycle, further reducing the effective memory bandwidth of the card. The larger the percentage of the time the crossbar is reading the 512MB segment, the lower the effective memory bandwidth from the 3.5GB segment.


Quote:


> Originally Posted by *Seven7h*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RagingCain*
> 
> There is no way I would buy a GPU of this complexity for exactly this kind of reasoning: once the optimization stops, weird crap will crop up. This card is not future proof by design.
> 
> I still don't quite get this. I get that the GPU is modular granting fine graining but this is asininely modular. Not to mention everyone had to have known this was being done, this was not something they just decided, yeah lets disable a half dozen ROPs, L2 Cache, and half a gig of VRAM placed in a secondary partition.
> 
> 
> 
> The "optimization" won't stop. It's heuristically executed and it is NVIDIAs core memory management logic and the OSes logic that has always been in place for all GPUs.
> 
> This is not some special, hacky logic or hand tuning.
Click to expand...

Are you 100% certain that there is _also_ no per-application optimization going on?

http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation/3
Quote:


> The way NVIDIA describes the process we suspect there are even per-application optimizations in use, though NVIDIA can clearly handle generic cases as well.


----------



## vloeibaarglas

Quote:


> Originally Posted by *Xoriam*
> 
> it can use both. chip 7 and 8 are just using the same L2, they said the L2 and MC are fast enough to handle both chip 7 and the slower speed of chip 8 without a bottleneck.
> 
> I have no idea why you would say that both partitions are not acessible at the same time since I play daily with 4gb used.
> 
> and the benchmakrs being posted lately are all using above 3,5gb of ram.


They can't be accessed on the same cycle. The GTX 970's GDDR5 runs at 7 GHz effective. If you try to access the 3.5 GB, then the 0.5 GB, then the 3.5 GB, and so on, your memory bandwidth is roughly halved.

So the driver and game should avoid the last 0.5 GB at all costs.
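As an illustration only (this is a toy model of the XOR behaviour described above, not Nvidia's documented arbitration), effective bandwidth drops with the fraction of read cycles spent on the slow segment:

```python
# Toy XOR model: each read cycle goes to either the 3.5GB segment
# (196 GB/s) or the 0.5GB segment (28 GB/s), never both at once.
def effective_bandwidth_gbps(frac_slow, fast=196.0, slow=28.0):
    """Blended bandwidth when frac_slow of cycles hit the 0.5GB pool."""
    return (1 - frac_slow) * fast + frac_slow * slow

print(effective_bandwidth_gbps(0.0))  # 196.0 - never touching the 0.5GB pool
print(effective_bandwidth_gbps(0.5))  # 112.0 - strict alternation
```

Strict alternation lands at 112 GB/s, half the 224 GB/s headline figure, which matches the "bandwidth is halved" point.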


----------



## MerkageTurk

guys, my EVGA 780 Ti is still £559 here in the UK

if nvidia did optimise the Titan/Ti, it would destroy the 970/980; it still has higher fps


----------



## provost

Quote:


> Originally Posted by *tpi2007*
> 
> Anandtech says otherwise:
> 
> http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation/2
> 
> Are you 100% certain that there is _also_ no per application optimization going on ?
> 
> http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation/3


At least someone here is asking the right type of questions.


----------



## Menta

wow, just went over to the GeForce forum and all topics on 970 VRAM issues were wiped


----------



## skupples

Not surprising. Typical action of any community when poop hits the fan.

I can list 10 different games that did it in the last 2 years, if you like. ED, SC, & MWO being at the top of that list.


----------



## GorillaSceptre

What will the 970 perform like when the "next gen" games arrive this year?

Not a 970 owner, and I have no dog in this fight... but if I buy something, I expect it to have what was advertised on the box. It may be perfectly fine for future games, but if I were a 970 owner I'd be a bit uncomfortable.


----------



## Wirerat

Quote:


> Originally Posted by *GorillaSceptre*
> 
> What will the 970 perform like when the "next gen" games arrive this year?
> 
> Not a 970 owner and have no dog in this fight... But, if i buy something i expect it to have what was advertised on the box. It may be perfectly fine for future games but if i was a 970 owner i'd be a bit uncomfortable.


it'll last as long as, or longer than, the 3GB 780s


----------



## ZealotKi11er

Quote:


> Originally Posted by *skupples*
> 
> so does this mean maxwell cores scale poorly? would the 970 be EVEN closer to 980 if it could properly address this 512MB of memory? Or are they microsoft megabytes?
> 
> I'm still trying to figure out how many people here have actually been truly that close to the edge. Plenty of modern games will load every last Mb, this happens with my Titans ALL THE TIME, even @ 1080P in most modern titles, the thing is, another card, w/ only 4GB comes around, and experiences the same thing, with near identical #s...
> 
> I mean, if you're running 1440P or higher, with a single, cut, GXX04 card... Your priorities and expectations be whack.


Think of it this way: if Nvidia had a real GTX 970 at hand, it would have been close to $400-450, because it would be very close to the GTX 980 in performance. With custom models the GTX 970 would have walked with the GTX 980, but because it's 224-bit/56 ROP, it can't. I don't know why people did not test this card before now to see the difference between the GTX 980 and GTX 970. For example, all I have to do with an R9 290 is clock it to 1100MHz to match the 290X in theory, and it does in practice, within 1-2%. This was not the case with the GTX 970, especially considering these cards need all the memory bandwidth they can get.


----------



## looniam

Quote:


> Originally Posted by *Menta*
> 
> wow just went over to the geforce forum and all topics os 970 vram issues where wiped


you sure?


Spoiler: Warning: Spoiler!


----------



## GorillaSceptre

But 970 owners didn't buy 780's, did they?

Not to mention the smaller bus..


----------



## skupples

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Think of it this way: if Nvidia had a real GTX 970 at hand, it would have been close to $400-450, because it would be very close to the GTX 980 in performance. With custom models the GTX 970 would have walked with the GTX 980, but because it's 224-bit/56 ROP, it can't. I don't know why people did not test this card before now to see the difference between the GTX 980 and GTX 970. For example, all I have to do with an R9 290 is clock it to 1100MHz to match the 290X in theory, and it does in practice, within 1-2%. This was not the case with the GTX 970, especially considering these cards need all the memory bandwidth they can get.


kinda like how it only takes some 100MHz for Titan to trade blows with the 780 Ti (minus the gains you'll never make back from the missing TMUs, and the 780 Ti being the B1 revision, thus better clocks at lower volts)


----------



## spacin9

Quote:


> Originally Posted by *Menta*
> 
> wow just went over to the geforce forum and all topics os 970 vram issues where wiped


lol, I was just there myself... they did it in the last 15 minutes, I think.

I don't want to make a long, contrived post because I don't know what it means, but basically what I'm seeing is: @ 4K in Kombustor, the 1GB memory burner and 2GB memory burner perform about the same. Jump to the 3GB memory burner and it drops 10-15 fps. All things being equal, shouldn't it be the same @ 3GB?

Also, @ 4K, *system memory and swap file usage* to run the Kombustor 3GB memory burner almost trebles (that's 3x) over 2GB and 1GB. I have no idea why or what it means, but it doesn't look right, since video RAM is the only variable changing.

I have the screenshots if someone wants to see them. I'm not going to post them up if this post gets lost in the flame war going on in this thread and no one cares to read it. It's great popcorn material, tho.


----------



## Wirerat

Quote:


> Originally Posted by *GorillaSceptre*
> 
> But 970 owners didn't buy 780's, did they?


I'm a 970 owner myself. 1.5GHz feels just as good today as it did yesterday. How long it holds up won't matter; I'll buy a new card every 1-2 years anyway.


----------



## Vesku

Quote:


> Originally Posted by *looniam*
> 
> it also gave a result that my 780ti had a problem when tested headless with both WDM services and swap file turned off.
> 
> pro tip: you don't use a benchmark's results as valid just because it _accidentally_ finds something.


No but we do say "Huh, what's up with that result?" and then make a lot of noise until Nvidia reveals their deception.

Edit: Oh, and with some people saying "There is nothing wrong. You're imagining this quirkiness of the 970" the whole time, up until the admission by Nvidia.


----------



## skupples

Quote:


> Originally Posted by *spacin9*
> 
> lol I was just there myself... they did it in the last 15 minutes I think.
> 
> I don't want to make a long, contrived post because I don't know what it means, but basically what I'm seeing is, @ 4K Kombustor, 1GB memory burner and 2GB memory burner perform about the same. Jump to 3GB memory burner and it drops 10-15 fps. All things being equal, shouldn't it be the same @ 3GB
> 
> Also, @4K *System memory and swap file usage* to run Kombustor 3GB memory burner almost trebles (that's 3 X) over 2GB and 1GB. I have no idea why or what it means, but it doesn't look right since video ram is the only variable changing.
> 
> I have the screenshots if someone wants to see them. I'm not going to post them up if this post gets lost in the flame war going on in this thread and no cares to read it. It's great popcorn material, tho.


not sure why anyone believed a GXX04 card w/ a tiny memory bus was good for 4K anyway. I mean, yes, Nvidia advertised it as such, but um... when did this community become dumb enough to buy into illogical statements?


----------



## Menta

Quote:


> Originally Posted by *looniam*
> 
> you sure?
> 
> 
> Spoiler: Warning: Spoiler!


its back up....

thought they would kill the "thing"


----------



## GorillaSceptre

Quote:


> Originally Posted by *Wirerat*
> 
> im a 970 owner myself. 1.5ghz feels just as good today as it did yesterday. how loong it holds up wont matter. ill buy a new card every 1- 2 years anyways.


Aye man, it's your money. I'm glad you like your card, but if it were me I'd be pissed.

Blatant false advertising by Nvidia.


----------



## provost

I could be wrong, but some of the smart questions being asked in this thread may have future implications beyond the 970......


----------



## spacin9

Quote:


> Originally Posted by *skupples*
> 
> *not sure why anyone believed a GXX04 card w/ tiny memory bus was good for 4K anyways. I mean yes, Nvidia advertised it as such, but um... when did this community become dumb enough to buy into illogical statements?*


Not sure if you meant to respond to me.. but my answer is: err, umm..trusted review sites and delta color compression?


----------



## looniam

Quote:


> Originally Posted by *Vesku*
> 
> No but we do say "Huh, what's up with that result?" and then make a lot of noise until Nvidia reveals their deception.


no. it was people uploading YouTube videos and posting descriptions of their experiences that brought the subject up.

granted, the benchmark giving a false empirical impression did get noticed by some sites after shills used it to spam forums in circles. but no, the real credit goes to the 970 owners.


----------



## iSlayer

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Aye man, your money. I'm glad you like your card, but if it was me i'd be pissed.
> 
> Blatant false advertising by Nvidia.


Indeed, it seems to be that way; we'll see what happens as far as lawsuits and returns go. I want my $ back... preferably right when the 390X and a fully enabled GM200 without DP come out.
Quote:


> Originally Posted by *provost*
> 
> I could be wrong, but some of the smart questions being asked in this thread may have future implications beyond the 970......


That's why I've been going on about data. It's not so much about verifying Nvidia's claims as it is seeing what the damage will be like for 970 owners who are future-proofing or going SLI.


----------



## Exilon

Quote:


> Originally Posted by *provost*
> 
> I could be wrong, but some of the smart questions being asked in this thread may have future implications beyond the 970......


Yes, like when they sell us a cutdown 384-bit GM200, is it a full 384-bit at 100% bandwidth or 352-bit at 7/8 bandwidth + 32-bit at 1/8 bandwidth?


----------



## looniam

Quote:


> Originally Posted by *Menta*
> 
> its back up....
> 
> thought they would kill the "thing"


i see that now.

go vote in the poll. might be pointless but, you never know . . .


----------



## skupples

Quote:


> Originally Posted by *spacin9*
> 
> Not sure if you meant to respond to me.. but my answer is: err, umm..*trusted review sites?*


ahhh, there's your problem.

Eventually you learn that the only reviews you can trust are end-user reviews, and that no review site is worth the aspx/html it's hosted on if they don't do everything within their power to push the card to its limits. This includes VRAM utilization.

In the modern era of games that suck up every last drop, 4K monitors, and VR headsets, these sites had/have no excuse.

They will continue to be jokes until they figure out a way to max out the VRAM of a card in real-world scenarios... I have a few tips, if they really can't figure it out.

Throw Skyrim w/ ENB onto a 1440p monitor, for example (even if Skyrim is a CPU whore), or throw the last three Assassin's Creed games onto a 1440p monitor w/ max settings.

This is not in jest. Review sites are only good for one thing these days, and that's a link-fueled pissing contest.


----------



## Menta

Quote:


> Originally Posted by *looniam*
> 
> no. it was people's uploading youtube videos with posting and describing their experiences that brought the subject up.
> 
> granted, the benchmark giving a false empirical impress did get noticed by some sites from shills using it to circularly post/spam through forums. but no, the real credit goes to the 970 owners.


that's why reviews mean "nothing" and I started seeking real info on forums...

community is everything, and all the reviewers should be ashamed


----------



## spacin9

Quote:


> Originally Posted by *skupples*
> 
> ahhh, there's your problem.
> 
> Eventually you learn that the only reviews you can trust are end user reviews, and that no review site is worth the aspx/html its hosted on, if they don't do everything within their power to push the card to its limits. This includes VRAM utilization.
> 
> In the modern era of games that suck up every last drop, 4K monitors, and VR headsets, these sites had/have no excuse.
> 
> They will continue to be jokes until they figure out a way to max out the VRAM of a card, in real world scenarios... I have a few tips, if they really can't figure it out.
> 
> throw skyrim w/ ENB onto a 1440P monitor, for example (even if Skyrim is a CPU whore) or! Through the last three assassins creed games onto a 1440P monitor w/ max settings.
> 
> This is not in jest. Review sites are only good for one thing these days, and that's a link fueled pissing contest.


I use ENB @ 4K on Skyrim... I get about 2500MB VRAM usage. You guys with the Skyrim mods... jeez. I don't game unless it's a steady 60fps in shooters, so my VRAM will rarely go over 3GB anyway. That's why it's less pronounced for me.

Still, you are somewhat correct; everything is caveat emptor.


----------



## GorillaSceptre

Quote:


> Originally Posted by *iSlayer*
> 
> Indeed seems to be that way, we'll see what happens as far as law suits and returns go. I want my $ back...preferably right when the 390x and fully enabled GM200 without DP comes out
> 
> 
> 
> 
> 
> 
> 
> .
> Why i've been going on about data. It's not so much about verifying Nvidia's claims as it is seeing what the damage will be like for 970 owners that are future proofing or going SLI.


As far as lawsuits go... I don't see anyone getting anywhere with a corporation like this. I don't believe for a second that this was a misunderstanding, etc. They knew what they were doing, and as such I think they would have already covered themselves before shipping.

All Nvidia have done is push me towards AMD. For a lot of people this won't be a big deal; they can sell their card if they are really annoyed, but I have to spend a crap ton importing my stuff. Not going to take a chance on them doing stuff like this in the future.


----------



## ZealotKi11er

Why did Nvidia say that the 0.5GB is 4 times faster than system memory? My system memory does 35GB/s. X79 and X99 do even more.


----------



## GorillaSceptre

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Why did Nvidia say that 0.5GB is 4 times faster then System memory? My system memory does 35GB/s. X79 and X99 do even more.


Because PR, "lets turn false advertising into a selling point" lmao.


----------



## Vesku

Quote:


> Originally Posted by *looniam*
> 
> no. it was people's uploading youtube videos with posting and describing their experiences that brought the subject up.
> 
> granted, the benchmark giving a false empirical impress did get noticed by some sites from shills using it to circularly post/spam through forums. but no, the real credit goes to the 970 owners.


Anandtech's take:
Quote:


> From an API perspective this is applicable towards both graphics and compute, though it's a safe bet that graphics is the more easily and accurately handled of the two thanks to the rigid nature of graphics rendering. Direct3D, OpenGL, CUDA, and OpenCL all see and have access to the full 4GB of memory available on the GTX 970, and from the perspective of the applications using these APIs the 4GB of memory is identical, the segments being abstracted. This is also why applications attempting to benchmark the memory in a piecemeal fashion will not find slow memory areas until the end of their run, as their earlier allocations will be in the fast segment and only finally spill over to the slow segment once the fast segment is full.


PCPer's take:
Quote:


> Let's be blunt here: access to the 0.5GB of memory, on its own and in a vacuum, would occur at 1/7th of the speed of the 3.5GB pool of memory. If you look at the Nai benchmarks floating around, this is what you are seeing.


So while it may be tricky to ensure your card has no pre-allocated VRAM, the simple CUDA check did indeed show when it started to run into the 512MB of slow memory.

Why is this still being disputed? People noticed that the 970 oddly hung around 3.5GB -> others started testing game settings and running Nai's little app and the issue gained traction in tech forums -> Nvidia confessed.
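The spill-over behaviour Anandtech describes is exactly why a Nai-style piecemeal benchmark only sees slow chunks at the end of its run. A hypothetical sketch (the 128MB chunk size and the fill-the-fast-pool-first allocator behaviour are assumptions for illustration):

```python
# Why a piecemeal VRAM benchmark sees slowness only at the end:
# allocations land in the fast 3.5GB segment until it is full,
# then spill into the slow 0.5GB segment.
FAST_MB, TOTAL_MB = 3584, 4096
FAST_GBPS, SLOW_GBPS = 196.0, 28.0

def chunk_speeds(chunk_mb):
    """Bandwidth each successive chunk would measure, fast pool first."""
    speeds = []
    for offset_mb in range(0, TOTAL_MB, chunk_mb):
        speeds.append(FAST_GBPS if offset_mb < FAST_MB else SLOW_GBPS)
    return speeds

speeds = chunk_speeds(128)
print(speeds[0], speeds[-1])  # 196.0 28.0 - only the tail of the run is slow
```

This also matches the caveat about pre-allocated VRAM: anything already resident shifts where in the run the spill-over shows up.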
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Why did Nvidia say that 0.5GB is 4 times faster then System memory? My system memory does 35GB/s. X79 and X99 do even more.


Silly, most people who buy a GTX 970 are obviously going to run a single stick of DDR3-800.


----------



## solid9

They're in the wrong and customers are pissed. I hope we get some sort of refund, and I don't mean a ****ty game but some real $$. Plus, they deserve a big fine for this.


----------



## skupples

Quote:


> Originally Posted by *Vesku*
> 
> Anandtech's take:
> PCPer's take:
> So while it may be tricky to ensure your card has no pre-allocated VRAM, the simple CUDA check did indeed show when it started to run into the 512MB of slow memory.


so wouldn't this mean that rendering on 970 would be much slower than on 980? assuming your renders gobble up all the memories.


----------



## Menta

Quote:


> Originally Posted by *looniam*
> 
> i see that now.
> 
> go vote in the poll might be pointless but, you never know . . .


will do:thumb:


----------



## Noufel

Quote:


> Originally Posted by *Exilon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *provost*
> 
> I could be wrong, but some of the smart questions being asked in this thread may have future implications beyond the 970......
> 
> 
> 
> Yes, like when they sell us a cutdown 384-bit GM200, is it a full 384-bit at 100% bandwidth or 352-bit at 7/8 bandwidth + 32-bit at 1/8 bandwidth?

Nice, I want my Titan X with 10.5 + 1.5 GB.

No, seriously, I don't think they'll do that; the 970's case is special. With this price tag Nvidia aimed it at the 1080p crowd (the 970 still has a very good perf/price ratio at that resolution), not 4K, and for that they had to make compromises. But the thing that bothers me is: why did they lie about the specs of the 970?


----------



## Exilon

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Why did Nvidia say that 0.5GB is 4 times faster than system memory? My system memory does 35GB/s. X79 and X99 do even more.


Because the PCIe 3.0 x16 bus has much higher latency and can only do ~16 GB/s with overhead.
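For reference, that ~16 GB/s is roughly the theoretical one-way PCIe 3.0 x16 rate (a quick sanity check, not a number from Nvidia):

```python
# Sanity check: PCIe 3.0 x16 theoretical bandwidth, one direction.
# 8 GT/s per lane, 128b/130b line encoding, 16 lanes.
transfers_per_s = 8e9            # 8 GT/s per lane
lanes = 16
encoding_efficiency = 128 / 130  # 128b/130b encoding overhead
bw_bytes = transfers_per_s * lanes * encoding_efficiency / 8
print(round(bw_bytes / 1e9, 2))  # ~15.75 GB/s before protocol overhead
```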


----------



## Vesku

Quote:


> Originally Posted by *skupples*
> 
> so wouldn't this mean that rendering on 970 would be much slower than on 980? assuming your renders gobble up all the memories.


No idea, but it probably leads to a bigger difference between the 970 and 980 than gaming does. At the very least, the GTX 970's odd memory configuration is probably an important thing to know if you are in charge of optimizing CUDA code.


----------



## skupples

Quote:


> Originally Posted by *Noufel*
> 
> Nice, I want my Titan X with 10.5 + 1.5 GB.
> 
> No, seriously, I don't think they'll do that; the 970's case is special. With this price tag Nvidia aimed it at the 1080p crowd (the 970 still has a very good perf/price ratio at that resolution), not 4K, and for that they had to make compromises. But the thing that bothers me is: why did they lie about the specs of the 970?


people seem to forget that the original marketing of this card, while it was in the hands of Mister CEO, was for 680 owners to upgrade.



oops, grabbed the wrong pic... meant to click the one of him holding up 970, not 980, derp. owellz.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Exilon*
> 
> Because the PCIe 3.0 x16 bus has much higher latency and can only do ~16 GB/s with overhead.


Still not 4 times faster even if latency is involved.


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Why did Nvidia say that 0.5GB is 4 times faster than system memory? My system memory does 35GB/s. X79 and X99 do even more.


Probably in relation to fetching the graphics assets from system RAM over the PCIe bus, not actual DDR speeds. Although those numbers don't actually match up either.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Noufel*
> 
> nice i want my titan x with 10.5 + 1.5gb
> 
> 
> 
> 
> 
> 
> 
> 
> no seriously i don't think they'll do that the 970 case is special nvidia with this price tag aimed at the 1080p people ( the 970 still has a very good perf/price ratio at that resolution ) not the 4k and for that they had to do compromises but the thing that bothers me is "why they lied about the specs of the 970?"


That's the part that opens them up to a lawsuit, imo: it's false advertising aimed at casual buyers who walk into a store and see an AMD card that says 4GB, etc. Most people think higher number = better. Considering how much the 970 has been hurting AMD, I wouldn't be surprised if they were the ones to take Nvidia to court.


----------



## GrimDoctor

I was considering 970 SLI, but the fact that issues are getting worse with driver updates, outside of games, with rendering, has me worried. Drivers since 344.75 have only gotten worse. While I wasn't expecting improvement, since drivers are usually updated to accommodate games rather than apps, I wasn't expecting things to get worse. Stuttering and a crash in an app is far more damaging than in a game.

I can only assume it has something to do with this, based on the fact that I still have one of my 760s, and when I run it with its 2GB I don't run into the issue. Less RAM but maybe better allocation? I don't know.

Edit: Didn't mean to quote.


----------



## Menta

This is a mess, because technically my card is an ASUS Strix. I think people should maybe be contacting the board makers and not Nvidia.


----------



## Seven7h

Quote:


> Originally Posted by *tpi2007*
> 
> Anandtech says otherwise:
> 
> http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation/2
> 
> Are you 100% certain that there is _also_ no per application optimization going on ?
> 
> http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation/3


Yup. They track how resources are being used and know what is the most important and what should not go in there. There will never be a case where everything in memory is equally important, or has equal effect on frametime.

Ever wonder why your game doesn't instantly crash if you use more video memory than you have? The driver and OS are putting graphics resources in system memory. If it's important stuff, performance can tank. That's why there is specific logic to keep important stuff out of there, and performance remains fine. It's *exactly the same* here, except you now have a third tier in between the two.

Why would you trust the exact same logic that has kept important render targets out of system memory for years on every other GPU, but suddenly call its decision making or sensitivity into question over this? Lol it's exactly the same resource management that every GPU has and has used.


----------



## spacin9

Interesting.
Quote:


> Originally Posted by *Menta*
> 
> This is a mess, because technically my card is an ASUS Strix. I think people should maybe be contacting the board makers and not Nvidia.


I asked my retailer about the issue, pretty much knowing the answer already, and they told me to contact the board partner.


----------



## Menta

Quote:


> Originally Posted by *spacin9*
> 
> Interesting.
> I asked my retailer about the issue, pretty much knowing the answer already, and they told me to contact the board partner.


I think they're also to blame; they make the damn stuff.


----------



## looniam

Quote:


> Originally Posted by *Vesku*
> 
> Anandtech's take:
> 
> 
> Quote:
> 
> 
> 
> From an API perspective this is applicable towards both graphics and compute, though it's a safe bet that graphics is the more easily and accurately handled of the two thanks to the rigid nature of graphics rendering. Direct3D, OpenGL, CUDA, and OpenCL all see and have access to the full 4GB of memory available on the GTX 970, and from the perspective of the applications using these APIs the 4GB of memory is identical, the segments being abstracted. This is also why applications attempting to benchmark the memory in a piecemeal fashion will not find slow memory areas until the end of their run, as their earlier allocations will be in the fast segment and only finally spill over to the slow segment once the fast segment is full.
> 
> PCPer's take:
> 
> 
> Quote:
> 
> 
> 
> Let's be blunt here: access to the 0.5GB of memory, on its own and in a vacuum, would occur at 1/7th of the speed of the 3.5GB pool of memory. If you look at the Nai benchmarks floating around, this is what you are seeing.
> 
> 
> So while it may be tricky to ensure your card has no pre-allocated VRAM, the simple CUDA check did indeed show when it started to run into the 512MB of slow memory.
> 
> Why is this still being disputed? People noticed that the 970 oddly hung around 3.5GB -> Others start testing game settings and running NAI's little app and the issue gains traction in tech forums -> Nvidia confesses.

so let me get this straight:

you have *the developer come straight out and tell you the benchmark doesn't properly test the VRAM* _and the thread it was developed in stops using it._
(btw, CUDA developers on Guru3D agreed it was flawed.)
but you will ignore that?

then you have reviewers, _who missed the issue entirely_, tell you it can work, so now you believe them?

please, those guys are just as much in the dark as anyone because Nvidia is not giving full disclosure. so far the only one who can really tell is Nvidia, unless someone gets a GM204-200 with all its ROPs and crossbars intact. (to paraphrase PCPer)


----------



## SuprUsrStan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Think of it this way. If Nvidia had a real GTX 970 at hand, it would have been close to $400-450, because it would be very close to the GTX 980 in performance. With custom models the GTX 970 would have walked with the GTX 980, but because it's 224-bit/56 ROP it can't. I don't know why people did not test this card before this to see the difference between the GTX 980 and GTX 970. For example, all I have to do for an R9 290 is clock it to 1100MHz to match the 290X in theory, and it does in practice within 1-2%. This was not the case with the GTX 970, especially considering these cards need all the memory bandwidth they can get.


Well look at the GTX 770 vs GTX 780. Just because the name is close doesn't mean performance HAS to be similar. The 770 was essentially an updated 680 while the 780 was a cut down Titan. Worlds of difference.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Syan48306*
> 
> Well look at the GTX 770 vs GTX 780. Just because the name is close doesn't mean performance HAS to be similar. The 770 was essentially an updated 680 while the 780 was a cut down Titan. Worlds of difference.


GTX770 was just a filler and a money grab.


----------



## skupples

Quote:


> Originally Posted by *ZealotKi11er*
> 
> GTX770 was just a filler and a money grab.


like any rebrand.


----------



## Vesku

Quote:


> Originally Posted by *looniam*
> 
> so let me get this straight:
> 
> you have *the developer come straight out and tell you the benchmark doesn't properly test the VRAM* _and the thread it was developed in stops using it._
> (btw, CUDA developers on Guru3D agreed it was flawed.)
> but you will ignore that?
> 
> then you have reviewers, _who missed the issue entirely_, tell you it can work, so now you believe them?
> 
> please, those guys are just as much in the dark as anyone because Nvidia is not giving full disclosure. so far the only one who can really tell is Nvidia, unless someone gets a GM204-200 with all its ROPs and crossbars intact. (to paraphrase PCPer)


The theoretical memory speed of a GDDR5 chip can be calculated easily. The various "weird" results were a sign that something was going on. NAI's test isn't a masterful memory examiner, since CUDA doesn't give that much memory control, but it did catch the slow memory section. Nvidia has admitted they have segmented the memory into 3.5 and 0.5 GB sections. It's not physically possible to access both pools at the same time; that is a limitation of the GDDR5 memory chip, not Nvidia's. Unless Nvidia is lying about having that 0.5GB section hanging off the same internal controller as one of the other memory chips, and I'm not sure why they'd want to lie about that when confessing. There would also be no point in the memory pools if that chip actually had its own unshared memory controller.


----------



## Seven7h

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Still not 4 times faster even if latency is involved.


Throughput bottlenecked by PCIe, plus the latency of going all the way from system memory, through the CPU memory controller, down over PCIe, to the GPU... That is potentially a lot of latency, though it would be interesting to have the 4x logic concretely explained.

Still not positive 28GB/s is the correct number here. But if so, and the slow segment achieves 22-27GB/s in practice while PCIe maxes out at 11-15GB/s, then the latency of system memory not being local would have to drag system-memory/PCIe texturing down to 5.5-6.5GB/s for "4x the speed of using system memory" to be accurate. Wondering if we'll ever find out where that 4x math comes from.

I also forgot to mention that if something is in system memory, not only are you bottlenecked by PCIe, but you are potentially having to pull it *from disk* if it got paged out to HDD/SSD! That lowers throughput for those accesses of system memory vs. local dedicated video memory. Though after the first slow access it is likely back in system memory.


----------



## Depauville Kid

Quote:


> Originally Posted by *Seven7h*
> 
> Yup. They track how resources are being used and know what is the most important and what should not go in there. There will never be a case where everything in memory is equally important, or has equal effect on frametime.
> 
> Ever wonder why your game doesn't instantly crash if you use more video memory than you have? The driver and OS are putting graphics resources in system memory. If it's important stuff, performance can tank. That's why there is specific logic to keep important stuff out of there, and performance remains fine. It's *exactly the same* here, except you now have a third tier in between the two.
> 
> Why would you trust the exact same logic that has kept important render targets out of system memory for years on every other GPU, but suddenly call its decision making or sensitivity into question over this? Lol it's exactly the same resource management that every GPU has and has used.


In the Anandtech article, I found this part interesting:

"To that end in the short amount of time we've had to work on this article we have also been working on cooking up potential corner cases for the GTX 970 and have so far come up empty, though we're by no means done. Coming up with real (non-synthetic) gaming workloads that can utilize between 3.5GB and 4GB of VRAM while not running into a rendering performance wall is already a challenge, and all the more so when trying to find such workloads that actually demonstrate performance problems. This at first glance does seem to validate NVIDIA's overall claims that performance is not significantly impacted by the memory segmentation, but we're going to continue looking to see if that holds up. In the meantime NVIDIA seems very eager to find such corner cases as well, and if there are any they'd like to be able to identify what's going on and tweak their heuristics to resolve them."

Apparently, finding a game where the card uses between 3.5GB and 4GB of VRAM while still falling within the performance envelope of the GPU is quite challenging. If professional benchmarkers are having trouble recreating these scenarios in real games, the chances the average consumer will stumble upon them are probably slim as well. If the scenarios do become more prevalent and recreatable over time, Anandtech seemed optimistic that Nvidia could compensate in their drivers, reducing the performance impact to a few percent.

Interesting to say the least.


----------



## Zboe

The bottom line on this is simple: all of us GTX 970 buyers still got the performance we paid for. The card works great at 1440 and sub-1440 resolutions, and that's what we bought into. It still has an excellent price-to-performance ratio and is overall the better product in its price range. That's just the way I see it.


----------



## provost

Quote:


> Originally Posted by *Noufel*
> 
> Nice, I want my Titan X with 10.5 + 1.5 GB.
> 
> *No, seriously, I don't think they'll do that; the 970's case is special.* With this price tag Nvidia aimed it at the 1080p crowd (the 970 still has a very good perf/price ratio at that resolution), not 4K, and for that they had to make compromises. But *the thing that bothers me is: "why did they lie about the specs of the 970?"*


I am not as certain as you are, since well-run companies rarely make a strategic decision such as this one for a single SKU...

As to why they didn't disclose the exact specs for the 970, that might also be tied to the first point above...

I give a lot of credit to some people here for having the fortitude to ask the right questions, and for sharing their analysis of the technical information while enduring plenty of adversity.


----------



## Vesku

Quote:


> Originally Posted by *Seven7h*
> 
> Bottlenecked throughput by PCIE, and the latency of going all the way from system memory, through the CPU memory controller, down through PCIE, to the GPU.... That is potentially a lot of latency, though it would be interesting to have the 4x logic concretely explained.
> 
> Still not positive 28GB/s is the correct number here.


That's the theoretical speed of 7GHz GDDR5 in standard 32-bit mode.
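That 28GB/s figure is easy to check (simple arithmetic, nothing card-specific beyond the 7GHz effective data rate):

```python
# Per-chip GDDR5 bandwidth at 7 Gbps effective data rate on a 32-bit interface.
effective_rate_gbps = 7   # GTX 970's 7 GHz effective GDDR5 clock
bus_width_bits = 32       # one chip's interface width
gb_per_s = effective_rate_gbps * bus_width_bits / 8
print(gb_per_s)  # 28.0 GB/s for a single chip, 1/8 of the card's 224 GB/s total
```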


----------



## Exilon

Quote:


> Originally Posted by *Vesku*
> 
> The theoretical memory speed of a GDDR5 chip can be calculated easily. The various "weird" results were a sign that something was going on. NAI's test isn't a masterful memory examiner, since CUDA doesn't give that much memory control, but it did catch the slow memory section. Nvidia has admitted they have segmented the memory into 3.5 and 0.5 GB sections. It's not physically possible to access both pools at the same time; that is a limitation of the GDDR5 memory chip, not Nvidia's.


There's actually two GDDR5 chips attached to the memory controllers. One of the memory controllers can't read from both at the same time due to the disabled L2 port to the crossbar... so it is Nvidia's limitation.


----------



## solid9

Quote:


> Originally Posted by *looniam*
> 
> so let me get this straight:
> 
> you have *the developer come straight out and tell you the benchmark doesn't properly test the VRAM* _and the thread it was developed in stops using it._
> (btw, CUDA developers on Guru3D agreed it was flawed.)
> but you will ignore that?
> 
> then you have reviewers, _who missed the issue entirely_, tell you it can work, so now you believe them?
> 
> please, those guys are just as much in the dark as anyone because Nvidia is not giving full disclosure. so far the only one who can really tell is Nvidia, unless someone gets a GM204-200 with all its ROPs and crossbars intact. (to paraphrase PCPer)


Maybe the test is flawed, but it made us notice something wasn't right. Nvidia has basically confessed; Guru3D has updated their article again and it explains the problem very well. We got a 3.5GB card; the other 0.5GB is basically useless since it's as fast as swapping to RAM.
Nvidia is in the wrong. What I demand is a refund (and not a game/bundle) or a new card, be it a 980 or a new model of 970 that works as intended.


----------



## ZealotKi11er

GTX 970 Ti incoming with full 4GB.


----------



## skupples

Quote:


> Originally Posted by *Depauville Kid*
> 
> In the Anandtech article, I found this part interesting:
> 
> "To that end in the short amount of time we've had to work on this article we have also been working on cooking up potential corner cases for the GTX 970 and have so far come up empty, though we're by no means done. Coming up with real (non-synthetic) gaming workloads that can utilize between 3.5GB and 4GB of VRAM while not running into a rendering performance wall is already a challenge, and all the more so when trying to find such workloads that actually demonstrate performance problems. This at first glance does seem to validate NVIDIA's overall claims that performance is not significantly impacted by the memory segmentation, but we're going to continue looking to see if that holds up. In the meantime NVIDIA seems very eager to find such corner cases as well, and if there are any they'd like to be able to identify what's going on and tweak their heuristics to resolve them."
> 
> Apparently, finding a game where the card uses between 3.5GB and 4GB of VRAM while still falling within the performance envelope of the GPU is quite challenging. If professional benchmarkers are having trouble recreating these scenarios in real games, the chances the average consumer will stumble upon them are probably slim as well. If the scenarios do become more prevalent and recreatable over time, Anandtech seemed optimistic that Nvidia could compensate in their drivers, reducing the performance impact to a few percent.
> 
> Interesting to say the least.


pfff, complaining about stutter while running SoM @ DSR 4K w/ max settings + texture pack @ 20-25 FPS w/ 1x 970 is totally realistic!

really though, why can't they test it in SLi?


----------



## sugarhell

Why didn't they just release the card as 3.5GB of VRAM and 56 ROPs, like the GTX 570? And don't tell me that Nvidia didn't know. It's not even possible for the marketing team not to have read the design/specification document of the GPU.

I would have preferred they just correct the 'mistake' on the first day of release.


----------



## Zboe

Quote:


> Originally Posted by *ZealotKi11er*
> 
> GTX 970 Ti incoming with full 4GB.


All 4GB is there though; at least my card was [using it all] during some play testing in FC4.


----------



## Vesku

Quote:


> Originally Posted by *Exilon*
> 
> There's actually two GDDR5 chips attached to the memory controllers. One of the memory controllers can't read from both at the same time due to the disabled L2 port to the crossbar... so it is Nvidia's limitation.


That's what I meant: if Nvidia is now telling the truth and one of the internal controllers is managing two memory chips, then because of the way memory chip access works there is no way to transfer data from both memory pools at the exact same time. The limitations of GDDR5 mean you can't read both chips simultaneously at half speed or some such; the controller has to alternate between the Big Pool and the Little Pool (single chip). Best case, the GTX 970 is moving 196GB/s of memory data at any single point in time.
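A rough sketch of that ceiling, assuming 28GB/s per 32-bit chip at 7 Gbps (toy arithmetic, not a measurement):

```python
# Best-case concurrent bandwidth when one controller must alternate
# between two chips (Big Pool vs. Little Pool), so only 7 of 8 chips
# can be read at any instant.
total_chips = 8
chip_bw = 28                        # GB/s per 32-bit GDDR5 chip at 7 Gbps
full_card = total_chips * chip_bw   # what a full GM204 (GTX 980) can do
concurrent_chips = total_chips - 1  # one chip is always waiting its turn
best_case = concurrent_chips * chip_bw
print(full_card, best_case)  # 224 196
```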


----------



## Depauville Kid

Quote:


> Originally Posted by *skupples*
> 
> pfff, complaining about stutter while running SoM @ DSR 4K w/ max settings + texture pack @ 20-25 FPS w/ 1x 970 is totally realistic!
> 
> really though, why can't they test it in SLi?


I'm trying to follow your statement. Are you saying it's running at 20-25 FPS due to the vram limitations? If so, how much vram is being consumed in that scenario?


----------



## skupples

Quote:


> Originally Posted by *Depauville Kid*
> 
> I'm trying to follow your statement. Are you saying it's running at 20-25 FPS due to the vram limitations? If so, how much vram is being consumed in that scenario?


no. What I'm saying is this:

We've seen multiple people use SoM as an example of where the VRAM limitation comes into play. They are doing this on a single 970, with max settings + texture pack + downsampling = an unrealistic measurement for a single GPU, as it's easily pushing past the plausible point of performance for the card, with or without full access to all 4 giggles.


----------



## Vesku

I thought SoM with Ultra textures could use 4GB at 1080P or 1440P? If you are getting to 3.5+ GB of VRAM through improved textures, shouldn't it be less taxing on the card than forcing higher res or AA? Apparently BF4 draw distance is chosen based on VRAM; it would be interesting if the 970 showed less of a map than a 980 at the exact same settings.


----------



## Gamer_Josh

Quote:


> Originally Posted by *Fateful_Ikkou*
> 
> I beg to differ with that statement. My brother, who has two GTX 970s in SLI, can't max out Battlefield 4 properly at 1080P because his cards hit the 3.5GB "limit" and he starts stuttering like hell. It's the same with my one 970 and COD: Ghosts; when I hit that VRAM "limit" I go from a steady 55~60 FPS down to 37~46 and start stuttering and lagging around to the point I can't play. The issue exists; claiming it doesn't for a majority doesn't mean that the issue doesn't exist or that everyone is happy. Don't get me wrong, I still love the card, and it works fine for 95% of the games I play, but it's the other 5% that's a big letdown. This card is more than capable of maxing out COD: Ghosts and Battlefield 4, but because of that VRAM issue I have to either lower my settings or watch my VRAM usage like a hawk and quit when it gets to that limit, and I shouldn't have to do that when the GPU itself is more than capable.


It leaves me to wonder, what variables decide whether or not a certain card has issues? I max all settings on Battlefield 4 at 1080P and experience no such stutters or hiccups.


----------



## Zboe

Quote:


> Originally Posted by *Depauville Kid*
> 
> I'm trying to follow your statement. Are you saying it's running at 20-25 FPS due to the vram limitations? If so, how much vram is being consumed in that scenario?


Not to hijack or anything but...

I tested tonight using FC4, and it was difficult to find settings that used less than 4GB but more than 3.5GB of VRAM. I ended up right around 3650MB used and had FPS in the mid-to-upper 40s. Anything over 4GB of VRAM tanked FPS to single digits; dropping settings to use around 3.8GB kept FPS at 24-26 (unplayable for most of us, but most console players probably would have been okay with it), and dropping the AA another notch put VRAM use between 3.6 and 3.7GB and the game was playable. The 970 runs out of horsepower before the VRAM issues really start to matter, IMHO.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Zboe*
> 
> All 4Gb is there though, at least my card was [using it all] during some play testing in FC4.


You are not using 4GB at the same time though? What resolution are you running in FC4? Because I have not hit more than 2.8GB @ 1440p with my R9 290X.


----------



## Depauville Kid

Quote:


> Originally Posted by *skupples*
> 
> no. What I'm saying is this:
> 
> We've seen multiple people use SoM as an example of where the VRAM limitation comes into play. They are doing this on a single 970, with max settings + texture pack + downsampling = an unrealistic measurement for a single GPU, as it's easily pushing past the plausible point of performance for the card, with or without full access to all 4 giggles.


Ahh... Now I gotcha. They would be hitting the performance limits of the card regardless of vram usage.


----------



## GorillaSceptre

Quote:


> Originally Posted by *skupples*
> 
> no. What I'm saying is this:
> 
> We've seen multiple people use SoM as an example of where the VRAM limitation comes into play. They are doing this on a single 970, with max settings + texture pack + downsampling = an unrealistic measurement for a single GPU, as it's easily pushing past the plausible point of performance for the card, with or without full access to all 4 giggles.


I completely agree with that. However, it was still false advertising, and we have no idea how VR or DX12 will use VRAM. Its specs simply don't match what was on the box; it's wrong and shouldn't just be accepted.


----------



## tpi2007

Quote:


> Originally Posted by *Seven7h*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tpi2007*
> 
> Anandtech says otherwise:
> 
> http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation/2
> 
> Are you 100% certain that there is _also_ no per application optimization going on ?
> 
> http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation/3
> 
> 
> 
> Yup. They track how resources are being used and know what is the most important and what should not go in there. There will never be a case where everything in memory is equally important, or has equal effect on frametime.
> 
> Ever wonder why your game doesn't instantly crash if you use more video memory than you have? The driver and OS are putting graphics resources in system memory. If it's important stuff, performance can tank. That's why there is specific logic to keep important stuff out of there, and performance remains fine. It's *exactly the same* here, except you now have a third tier in between the two.
> 
> Why would you trust the exact same logic that has kept important render targets out of system memory for years on every other GPU, but suddenly call its decision making or sensitivity into question over this? Lol it's exactly the same resource management that every GPU has and has used.

You forgot to mention the small detail that this third tier can't be accessed concurrently with the other two, so the juggling of game assets around the two segments is not the same as shifting things to RAM while not losing cycles with the VRAM.


----------



## Zboe

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You are not using 4GB at the same time though? What resolution are you running in FC4? Because I have not hit more than 2.8GB @ 1440p with my R9 290X.


I was messing around with 4K. I was able to hit over 3.6GB used at 1440, however.


----------



## Depauville Kid

Quote:


> Originally Posted by *Zboe*
> 
> Not to hijack or anything but...
> 
> I tested tonight using FC4, and it was difficult to find settings that used less than 4GB but more than 3.5GB of VRAM. I ended up right around 3650MB used and had FPS in the mid-to-upper 40s. Anything over 4GB of VRAM tanked FPS to single digits; dropping settings to use around 3.8GB kept FPS at 24-26 (unplayable for most of us, but most console players probably would have been okay with it), and dropping the AA another notch put VRAM use between 3.6 and 3.7GB and the game was playable. The 970 runs out of horsepower before the VRAM issues really start to matter, IMHO.


That is pretty much the conclusion I was coming to, reading what the various sites had put out today.


----------



## skupples

Quote:


> Originally Posted by *Gamer_Josh*
> 
> It leaves me to wonder, what variables decide whether or not a certain card has issues? I max all settings on Battlefield 4 at 1080P and experience no such stutters or hiccups.


See, I've never been able to get BF4 smooth with 2-3 Titans. VRAM is a non-issue; hitching is persistent no matter the settings. It runs almost as poorly as FC4, just with higher FPS.
Quote:


> Originally Posted by *Zboe*
> 
> Not to hijack or anything but...
> 
> I tested tonight using FC4, and it was difficult to find settings that used less than 4GB but more than 3.5GB of VRAM. I ended up right around 3650MB used and had FPS in the mid-to-upper 40s. Anything over 4GB of VRAM tanked FPS to single digits; dropping settings to use around 3.8GB kept FPS at 24-26 (unplayable for most of us, but most console players probably would have been okay with it), and dropping the AA another notch put VRAM use between 3.6 and 3.7GB and the game was playable. The 970 runs out of horsepower before the VRAM issues really start to matter, IMHO.


Far Cry 4 also runs worse when you have monitoring software open. ANY monitoring software, including their own overlay.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> You are not using 4GB at the same time though? What resolution are you running in FC4? Because I have not hit more than 2.8GB @ 1440p with my R9 290X.


same, I can't get much above 3.0GB in 1080p surround with mostly high/ultra settings.
Quote:


> Originally Posted by *Depauville Kid*
> 
> Ahh... Now I gotcha. They would be hitting the performance limits of the card regardless of vram usage.


Correct, this used to happen to me w/ my 670s at ~1.7GB VRAM usage, even with two of them. They went back as soon as Titan dropped.


----------



## Zboe

My monitoring is through MSI Afterburner, though not with an overlay but through the LCD display on my G510.


----------



## skupples

Quote:


> Originally Posted by *Zboe*
> 
> My monitoring is through MSI Afterburner, though not with an overlay but through the LCD display on my G510.


Having the damn software open at all affects the game. It's really dumb, and Ubisoft themselves have owned up to it. Any polling software makes the game run even worse. Why? Because Ubisoft saves money by skimping on QA & compatibility.


----------



## Seven7h

Quote:


> Originally Posted by *tpi2007*
> 
> You forgot to mention the small detail that this third tier can't be accessed concurrently to the other two, so the juggling of game assets around the two segments is not the same as shifting things to RAM while not losing cycles with the VRAM.


There is not much juggling going on. The segments will not thrash, as they are seen as almost equivalent by the OS. Therefore it won't use one unless it has to, but once it does, it's not going to be aggressive about taking stuff out of one and putting it into the other.

The performance of reading from the slow segment speaks for itself. Use more than 4GB in a game to simulate how a 3.5GB version of the 970 would perform above 3.5GB, sticking stuff in system memory. I assure you, it will not be pretty (single-digit FPS) and you will be begging for your 512MB back.
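A rough back-of-envelope of what spilling into the slow segment actually costs, assuming the ~196 GB/s and ~28 GB/s figures being thrown around for the two pools and that they can't be read concurrently (plain Python; illustrative numbers, not measurements):

```python
# Back-of-envelope: effective read bandwidth on a GTX 970 when a frame
# touches more data than the fast 3.5GB segment holds. Assumes the
# widely reported ~196 GB/s (fast) and ~28 GB/s (slow) figures and
# serial (non-concurrent) access to the two pools.

FAST_GBPS = 196.0   # 7 of 8 memory controllers
SLOW_GBPS = 28.0    # the lone 0.5GB segment

def effective_bandwidth(total_gb, fast_gb=3.5):
    """Weighted (harmonic-mean) bandwidth if total_gb is streamed once."""
    fast = min(total_gb, fast_gb)
    slow = max(total_gb - fast_gb, 0.0)
    seconds = fast / FAST_GBPS + slow / SLOW_GBPS
    return total_gb / seconds

print(round(effective_bandwidth(3.5)))  # 196 - everything in the fast pool
print(round(effective_bandwidth(3.8)))  # 133 - a 0.3GB spill already costs ~1/3
print(round(effective_bandwidth(4.0)))  # 112 - half the advertised 224 GB/s
```

The point being: even a small spill drags the average down hard, which is consistent with the "it only hurts once you cross 3.5GB" reports.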


----------



## Zboe

Quote:


> Originally Posted by *skupples*
> 
> Having the damn software open at all affects the game. It's really dumb, and Ubisoft themselves have owned up to it. Any polling software makes the game run even worse. Why? Because Ubisoft saves money by skimping on QA & compatibility.


So basically the card could perform better than it does in testing, which makes it even more of a moot point.


----------



## looniam

Quote:


> Originally Posted by *Vesku*
> 
> The theoretical memory speed of a GDDR5 chip can be calculated easily. The various "weird" results identified that something was going on. Nai's test isn't a masterful memory examiner (CUDA doesn't give that much memory control), but it did catch the slow memory section. Nvidia has admitted they have segmented the memory into 3.5 and 0.5 GB sections. It's not physically possible to access both pools at the same time; that is a limitation of the GDDR5 memory chips, not Nvidia's. Unless Nvidia is lying about having that 0.5GB section hanging off the same internal controller as one of the other memory chips, and I'm not sure why they'd want to lie about that when confessing. There would also be no point in the memory pools if that chip actually had its own unshared memory controller.


yes, the benchmark confirmed the 3.5GB segmentation that 970 users are experiencing. but it is not a proper tool to show the bandwidth difference _as it was originally believed_.

and i think we agree upon that.
Quote:


> Originally Posted by *solid9*
> 
> Maybe the test is flawed, but it made us notice something wasn't right. Nvidia has basically confessed; Guru3D has updated their article again and it explains the problem very well. We got a 3.5GB card; the other 0.5GB is basically useless since it's about as fast as swapping to RAM.
> Nvidia is in the wrong. What I demand is a refund (and not a game/bundle) or a new card, be it a 980 or a new model of 970 that works as intended.


you should be compensated for your inconvenience. but it may be reaching a bit far to expect a 980 or an "updated/revised" 970 w/o paying the cost difference. in reality nvidia will probably "low ball" any compensation offer, knowing most people don't have the time or resources for a prolonged negotiation process.

you might end up getting what you can from them and then selling the card to someone that doesn't care.


----------



## Zboe

Quote:


> Originally Posted by *Seven7h*
> 
> There is not much juggling going on. The segments will not thrash, as they are seen as almost equivalent by the OS. Therefore it won't use one unless it has to, but once it does, it's not going to be aggressive about taking stuff out of one and putting it into the other.
> 
> The performance of reading from the slow segment speaks for itself. Use more than 4GB in a game to simulate how a 3.5GB version of the 970 would perform above 3.5GB, sticking stuff in system memory. I assure you, it will not be pretty (single-digit FPS) and you will be begging for your 512MB back.


Please if you don't have a GTX 970 and haven't done any testing try to refrain from making assumptions...

I was able to test on my own tonight and your theory doesn't hold water.


----------



## Clocknut

Quote:


> Originally Posted by *Zboe*
> 
> Not to hijack or anything but...
> 
> I tested tonight using FC4 and it was difficult to find settings that used less than 4GB but more than 3.5GB of VRAM. I ended up right around 3650MB used and had FPS in the mid to upper 40s. Anything over 4GB of VRAM tanked FPS to single digits; dropping settings to use around 3.8GB kept FPS at 24-26 (unplayable for most of us, but most console players probably would have been okay with it), and dropping the AA another notch put VRAM use between 3.6 and 3.7GB and the game was playable. The 970 runs out of horsepower before the VRAM issues really start to matter, IMHO.


this will most likely change in the future. Future games are designed to be tailored for the consoles' 8GB of RAM (assuming the OS reserves 3GB), which leaves about 5GB just for games. That's right now; console developers may change this configuration in the future as they optimize their OS to a lighter RAM footprint, and as games start stretching out that 5GB it may also prompt the console makers to open up a 2GB/6GB split or higher.

And all that 5GB is designed to run at 720p @ 30fps. Think about how much VRAM is needed for 4K without AA, which is about four times the pixels of 1080p.

Besides this, all the previous-gen GTX x70 Nvidia cards ran out of VRAM before they ran out of speed, e.g. GTX 470/570/670/770. I still have my 1.25GB 570, and that card runs short of VRAM before running out of horsepower.
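For perspective, the raw resolution math (a quick sketch in Python; note a single color buffer is a tiny slice of real VRAM use, which is dominated by textures, shadow maps and other render targets):

```python
# Pixel counts and raw 32-bit framebuffer sizes at common resolutions.
# A lone color buffer is small; total VRAM use is dominated by textures
# and render targets, so these are lower bounds, not VRAM budgets.

def framebuffer_mib(w, h, bytes_per_pixel=4):
    return w * h * bytes_per_pixel / 2**20

for name, (w, h) in {"720p": (1280, 720),
                     "1080p": (1920, 1080),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: {w*h:,} px, {framebuffer_mib(w, h):.1f} MiB per buffer")

# 4K is 4x the pixels of 1080p, but 9x the pixels of 720p:
print(3840*2160 // (1920*1080), 3840*2160 // (1280*720))  # 4 9
```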


----------



## Seven7h

Quote:


> Originally Posted by *Zboe*
> 
> Please if you don't have a GTX 970 and haven't done any testing try to refrain from making assumptions...
> 
> I was able to test on my own tonight and your theory doesn't hold water.


I'm not making assumptions, I speak from facts and experience. But it's okay. Reviewers have already done digging and can't find any cases where what you're implying is happening, so I don't feel the need to justify my explanations. I'm in a position to know more than most, and I am lending what I know to the discussion here.


----------



## tpi2007

Quote:


> Originally Posted by *Seven7h*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tpi2007*
> 
> You forgot to mention the small detail that this third tier can't be accessed concurrently to the other two, so the juggling of game assets around the two segments is not the same as shifting things to RAM while not losing cycles with the VRAM.
> 
> 
> 
> There is not much juggling going on. The segments will not thrash, as they are seen as almost equivalent by the OS. Therefore it won't use one unless it has to, but once it does, it's not going to be aggressive about taking stuff out of one and putting it into the other.
> 
> The performance of reading from the slow segment speaks for itself. Use more than 4GB in a game to simulate how a 3.5GB version of the 970 would perform above 3.5GB, sticking stuff in system memory. I assure you, it will not be pretty (single-digit FPS) and you will be begging for your 512MB back.

If it's needed, I'm betting it will, as the penalty of reading from the 0.5 GB segment is just too great - low bandwidth and the inability to read from the 3.5 GB segment concurrently, so the option is to move it to the 3.5 GB segment and move something less important to the 0.5 GB one.

That's akin to saying that people should be happy that they got that extra 0.5 GB of 28 GB/s memory instead of nothing and game assets having to be shifted to the much slower system RAM (PCIe interface bandwidth, latency, etc).

How about having all the L2 cache - *as advertised* - and a single segment of 4 GB of VRAM with a maximum throughput of 224 GB/s, *as advertised*, instead of 196 GB/s or 28 GB/s?
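For reference, all three of those numbers fall straight out of the 256-bit bus and the 7 Gbps effective GDDR5 data rate (quick arithmetic sketch; the per-segment split assumes the 7-of-8 vs 1-of-8 controller arrangement as Nvidia described it):

```python
# Where the 224 / 196 / 28 GB/s figures come from: a 256-bit bus at
# 7 Gbps effective GDDR5 data rate, split across 8 x 32-bit controllers.

BUS_BITS = 256
DATA_RATE_GBPS = 7.0  # 7 GHz effective per pin

full = BUS_BITS / 8 * DATA_RATE_GBPS  # all 8 controllers together: 224 GB/s
fast = full * 7 / 8                   # 3.5GB segment (7 controllers): 196 GB/s
slow = full / 8                       # 0.5GB segment (1 controller):   28 GB/s
print(full, fast, slow)  # 224.0 196.0 28.0
```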


----------



## Zboe

Quote:


> Originally Posted by *Clocknut*
> 
> this will most likely change in the future. Future games are designed to be tailored for the consoles' 8GB of RAM (assuming the OS reserves 3GB), which leaves about 5GB just for games. That's right now; console developers may change this configuration in the future as they optimize their OS to a lighter RAM footprint, and as games start stretching out that 5GB it may also prompt the console makers to open up a 2GB/6GB split or higher.
> 
> And all that 5GB is designed to run at 720p @ 30fps. Think about how much VRAM is needed for 4K without AA, which is about four times the pixels of 1080p.
> 
> Besides this, all the previous-gen GTX x70 Nvidia cards ran out of VRAM before they ran out of speed, e.g. GTX 470/570/670/770. I still have my 1.25GB 570, and that card runs short of VRAM before running out of horsepower.


As someone that ran GTX 570 SLI, I know exactly what you mean, and I still have one in my spare rig. The x70 cards are always gimped and almost always have some kind of controversy around them. This is nothing new, and I'm not sure why people are always upset. All of the current 4GB-and-down cards are going to be obsolete within 2 years anyway, and by that time the VRAM "useability" of the GTX 970 will be long forgotten by all but the most outspoken AMD fanboys...


----------



## skupples

Quote:


> Originally Posted by *Zboe*
> 
> So basically the card could perform better than it does in testing, which makes it even more of a moot point.


basically...

i use NV Inspector to apply my sub 1.3V OC, then start the game via shortcut with page file check disabled. smooths things out a little bit.
Quote:


> Originally Posted by *Zboe*
> 
> As someone that ran GTX 570 SLI, I know exactly what you mean, and I still have one in my spare rig. The x70 cards are always gimped and almost always have some kind of controversy around them. This is nothing new, and I'm not sure why people are always upset. All of the current 4GB-and-down cards are going to be obsolete within 2 years anyway, and by that time the VRAM "useability" of the GTX 970 will be long forgotten by all but the most outspoken AMD fanboys...


part of it is the age / experience of the people throwing fits... GPU purchases have grown quite a bit since fermi days, so I would be willing to bet a majority of the people throwing fits went from a used 2xx series to maxwell, or something along those lines...

I don't remember a 670 controversy, though, and I had 670 FTWs.


----------



## Menta

lol, should i be happy i got less rops ...


----------



## Zboe

Quote:


> Originally Posted by *Seven7h*
> 
> I'm not making assumptions, I speak from facts and experience. But it's okay. Reviewers have already done digging and can't find any cases where what you're implying is happening, so I don't feel the need to justify my explanations. I'm in a position to know more than most, and I am lending what I know to the discussion here.


No offense but either we're reading from two different playbooks or you are out in left field...

So far, all the "official" tests seem to indicate that this whole thing has been blown way out of proportion.


----------



## skupples

those that are experiencing issues should probably document them and submit them...

Nvidia seems to want to see these instances, along with anandtech, you know... so they can tune their gimping skills.


----------



## MerkageTurk

How can people still defend nVidia?

I am worried about the precedent this will set for future cards.

Makes no viable sense, they lied, end of discussion.


----------



## looniam

Quote:


> Originally Posted by *skupples*
> 
> those that are experiencing issues should probably document them and submit them...
> 
> Nvidia seems to want to see these instances, along with anandtech, you know... so they can tune their gimping skills.


i see what you did there.


----------



## Zboe

Quote:


> Originally Posted by *MerkageTurk*
> 
> How can people still defend nVidia?
> 
> I am worried about the precedent this will set for future cards.
> 
> Makes no viable sense, they lied, end of discussion.


Remember when GTX 570s (and 590s..) were blowing up their VRMs from even (supposedly) light to moderate overclocking? Back then I was still a lurker, but I remember all the guys proclaiming how Nvidia wronged them, how they would never buy another one, etc.

It's all crap. The "precedent" was set a long time ago, and after the dust settles and Nvidia brings out the next "king of the Hill" GPU all will be forgotten and the cycle will continue.


----------



## spacin9

Quote:


> Originally Posted by *MerkageTurk*
> 
> How can people still defend nVidia?


Pretty much defending their purchases, at least for me it is. That and the fact they aren't going to do anything about it. When they wipe the forums clean @ Geforce, you know they don't care and would rather not see the noise.


----------



## Xoriam

I'm not defending Nvidia lying about the ROPs and L2, etc.
However, I do think this has been blown out of proportion, since I've seen no difference in my gaming performance between 3.5 and 4GB of usage.


----------



## Zboe

Quote:


> Originally Posted by *spacin9*
> 
> Pretty much defending their purchases, at least for me it is. That and the fact they aren't going to do anything about it. When they wipe the forums clean @ Geforce, you know they don't care and would rather not see the noise.


If my card performed worse today than it did yesterday I'd be right there at the top complaining.

But really it works fine, I read all the reviews, did my due diligence, and even waited 'til people in this very forum had them in hand and had done testing. Everything was fine and dandy...

And you know what, it still is(!) I know some people have had some issues, I get that but for me the card works like I want it to. It gives the performance I knew I was buying into and does everything I have asked it to do with aplomb. It really is that simple for me. That may not work for everybody but seriously the card is still a great performer and does exactly what all the reviews said it would.


----------



## Menta

i'd rather "fight" the purchase than have them pulling this stunt again.

it's about that and really nothing else


----------



## Xoriam

Slightly off topic here, but: previously, when I was playing ACU @ 4K with 970 SLI on driver 347.09 and patch 1.3.x, I was able to get a consistent 50-80fps with 0xAA.
Now that the game is on patch 1.4 and driver 347.25 I'm getting between 38-56 FPS, just slightly better than my single-card performance was.

Anyone know if it's the new Nvidia driver or the ACU patch that's to blame?


----------



## Vesku

If Nvidia can still get away with having "4GB 256-bit" on the box after this, then I doubt they are too worried about the hit to their reputation this internet kerfuffle has caused. It's hard to imagine moving the same number of boxes if they had to put "3.5GB 224-bit + 0.5GB *marketing name* cache" in new advertising material and on retail boxes. Which is why I really don't see why Anandtech would believe Nvidia's story of why they didn't come clean. The thinking basically came down to "those Nvidia guys are just too smart to mess this up", neglecting the $$$ involved. Better to just relay Nvidia's explanation and make no comment than to say "I believe them".


----------



## Zboe

Quote:


> Originally Posted by *Xoriam*
> 
> Slightly off topic here, but: previously, when I was playing ACU @ 4K with 970 SLI on driver 347.09 and patch 1.3.x, I was able to get a consistent 50-80fps with 0xAA.
> Now that the game is on patch 1.4 and driver 347.25 I'm getting between 38-56 FPS, just slightly better than my single-card performance was.
> 
> Anyone know if it's the new Nvidia driver or the ACU patch that's to blame?


Roll back the driver and check...but really it's Ubisoft so I wouldn't put it past them breaking their own game...

Never "double down" on a driver update _and_ game patch without checking it out first especially using SLI.


----------



## Xoriam

Quote:


> Originally Posted by *Zboe*
> 
> Roll back the driver and check...but really it's Ubisoft so I wouldn't put it past them breaking their own game...
> 
> Never "double down" on a driver update _and_ game patch without checking it out first especially using SLI.


I've read about some flickering problems with SLI in the latest patch, and they are really annoying ingame. I was able to resolve the flickering water, but sometimes the sky goes crazy, not to mention weird things in cutscenes.


----------



## Mopar63

Quote:


> Originally Posted by *Xoriam*
> 
> However, I do think this has been blown out of proportion, since I've seen no difference in my gaming performance between 3.5 and 4GB of usage.


I think the funny thing from this is the response. When the whole frame latency issue was brought to the public, it affected a small percentage of users, and even then only a fraction of the affected people ever noticed it. It took specially developed software to track down the issue. It was blown way out of proportion.

In this case you have an issue that affects ALL 970 cards and was basically a marketing lie by the company. In fairness, the issue does not affect a large percentage of users and is likely being blown out of proportion as well, since the card does deliver a solid gaming experience.

At the end of the day, however, the simple truth is that Nvidia, through a mistake or on purpose, misrepresented their product to consumers. What is even worse is that the so-called tech media did not catch this, even the top "enthusiast" sites. This shows a serious lack of real testing from these so-called top sites, since it seems, from what we are seeing now, that serious high-end testing would have revealed this. In fact, many of the sites are now claiming they knew something was off but did not report it. So much for journalistic integrity.

At the end of the day, these massive threads and deep discussions are a waste of time. If you feel that you were cheated or slighted by Nvidia, then ask for a refund. Simply explain that you feel the misrepresentation of the card's specs is a bait and switch and you demand a refund. I am willing to bet you get one. If you do not get one, then THAT is worthy of a thread. If you're happy with the card and it does what you want, then do not worry about defending it; go enjoy your card and game away.


----------



## GrimDoctor

Quote:


> Originally Posted by *Mopar63*
> 
> I think the funny thing from this is the response. When the whole frame latency issue was brought to the public, it affected a small percentage of users, and even then only a fraction of the affected people ever noticed it. It took specially developed software to track down the issue. It was blown way out of proportion.
> 
> In this case you have an issue that affects ALL 970 cards and was basically a marketing lie by the company. In fairness, the issue does not affect a large percentage of users and is likely being blown out of proportion as well, since the card does deliver a solid gaming experience.
> 
> At the end of the day, however, the simple truth is that Nvidia, through a mistake or on purpose, misrepresented their product to consumers. What is even worse is that the so-called tech media did not catch this, even the top "enthusiast" sites. This shows a serious lack of real testing from these so-called top sites, since it seems, from what we are seeing now, that serious high-end testing would have revealed this. In fact, many of the sites are now claiming they knew something was off but did not report it. So much for journalistic integrity.
> 
> At the end of the day, these massive threads and deep discussions are a waste of time. If you feel that you were cheated or slighted by Nvidia, then ask for a refund. Simply explain that you feel the misrepresentation of the card's specs is a bait and switch and you demand a refund. I am willing to bet you get one. If you do not get one, then THAT is worthy of a thread. If you're happy with the card and it does what you want, then do not worry about defending it; go enjoy your card and game away.


In the US you might; here in Australia I'm doubting it.


----------



## skupples

what?

the frame pacing issue was blatant.

AMD even called users asking about it liars/blind, which led quite a few people to ditch 7970s for 680s BEFORE there was definitive proof. That's part of the reason the reaction was so great: users were snubbed over the issue for so long.

look @ footage of 680 SLI
look @ footage of 7970 CrossFire

why don't these look the same?

boom, FCAT.

ohhh, that's why! Because multiple frames in a row are being missed and/or mashed together!


----------



## mouacyk

Quote:


> Originally Posted by *tpi2007*
> 
> If it's needed, I'm betting it will, as the penalty of reading from the 0.5 GB segment is just too great - low bandwidth and the inability to read from the 3.5 GB segment concurrently, so the option is to move it to the 3.5 GB segment and move something less important to the 0.5 GB one.
> 
> That's akin to saying that people should be happy that they got that extra 0.5 GB of 28 GB/s memory instead of nothing and game assets having to be shifted to the much slower system RAM (PCIe interface bandwidth, latency, etc).
> 
> How about having all the L2 cache - *as advertised* - and a single segment of 4 GB of VRAM with a maximum throughput of 224 GB/s, *as advertised*, instead of 196 GB/s or 28 GB/s?


How come none of you Nvidia defenders are countering this? All I read is "it's no big deal, such a small hit for the price and performance!" As an overclocking community, baselines (performance at spec) are the bread and butter on which we measure the accrued performance gained from the effort put in. If those baselines are already shaky, where do we go from there? What exactly are you overclocking and gaining? I'd like to think that here at OCN, we do a little more than play in a sandbox.


----------



## Xoriam

Quote:


> Originally Posted by *mouacyk*
> 
> How come none of you NVidia defenders are countering this? All I read is "it's not big deal, such a small hit for the price and performance!" As an overclocking community, baselines (performance at spec) are the bread and butter on which we measure the accrued performance gained from the effort put into it. If those baselines are already shaky, where do we go from there? What exactly are you overclocking and gaining? I'd like to think that here at OCN, we do a little more than play in a sandbox.


I don't think there is a single person in here who isn't upset about the falsely advertised L2 and ROPs.


----------



## nSone

Quote:


> Originally Posted by *GrimDoctor*
> 
> In the US you might, here in Australia I am doubting it


Don't know about Australia, but the EU has quite strict regulations on this. Can't say how concerned Nvidia (including the manufacturers) are about reputation after this fiasco, but I think everyone should try for a refund and apply some pressure.


----------



## skupples

Quote:


> Originally Posted by *mouacyk*
> 
> How come none of you NVidia defenders are countering this? All I read is "it's not big deal, such a small hit for the price and performance!" As an overclocking community, baselines (performance at spec) are the bread and butter on which we measure the accrued performance gained from the effort put into it. If those baselines are already shaky, where do we go from there? What exactly are you overclocking and gaining? I'd like to think that here at OCN, we do a little more than play in a sandbox.


how does overclocking affect memory usage? You still need two of these cards to maintain any kind of playable FPS in games that are actually USING that much memory, which normally requires max settings / texture mods & high AA settings; it all adds up to needing 2 of them to maintain 50-70FPS.

this is why I keep mocking people that are complaining of stuttering in SoM while running the game @ settings which far exceed the core's power.

I was following at first, but tying memory allocations to overclocking, wut?

hell, the memory probably OCs better


----------



## Zboe

Quote:


> Originally Posted by *mouacyk*
> 
> How come none of you NVidia defenders are countering this? All I read is "it's not big deal, such a small hit for the price and performance!" As an overclocking community, baselines (performance at spec) are the bread and butter on which we measure the accrued performance gained from the effort put into it. If those baselines are already shaky, where do we go from there? What exactly are you overclocking and gaining? I'd like to think that here at OCN, we do a little more than play in a sandbox.


I don't understand the question to be honest, the cards do what they say on the tin...

They *do* have a 256 bit bus and they *do* have 4GB of GDDR5 RAM installed on the card. Now whether you like it or not this is the truth.

The cards perform as reviewers said they would; we all knew these were cut-down GPUs and we bought based on those truths.

...And that's how the cookie crumbles...


----------



## Clocknut

Quote:


> Originally Posted by *Zboe*
> 
> As someone that used GTX 570 SLI, I know exactly and still have one in my spare rig. The x70 cards are always gimped, and almost always have some kind of controversy about them. This is nothing new and I'm not sure why people are always upset. All of the current 4GB and down cards are going to be obsolete within 2 years anyway and by that time the VRAM "useability" on the GTX 970 will be long forgotten by all but the most outspoken AMD fanboys...


well... I can't speak to the flagship AMD cards, since the only AMD card I have is a 7790, and it isn't a flagship (but it does also have VRAM issues). The rest of the GPUs I have are Nvidia based.

besides, it makes me wonder why they didn't just go the simple 3.5GB route. 7 memory chips is one less chip, which would also mean lower production cost and let Nvidia price the card even more competitively. the 8th GDDR5 chip doesn't seem to add much value.


----------



## skupples

Oh, I'm sure AMD is spinning up a new commercial as we speak.

Going to be hilarious.


----------



## MerkageTurk

Plus couple of benchmarks showcased the 7970 10% within ti fps.

At the beginning to whipped what and had on offer, 7970,290 and 290x, 970 and on/ off 980 however now the to is below the rest but just above the 7970

This really annoys me but someone from nvidia should respond


----------



## skupples

Quote:


> Originally Posted by *MerkageTurk*
> 
> Plus couple of benchmarks showcased the 7970 10% within ti fps.
> 
> At the beginning to whipped what and had on offer, 7970,290 and 290x, 970 and on/ off 980 however now the to is below the rest but just above the 7970
> 
> This really annoys me but someone from nvidia should respond


What? I think I lost you there.

A couple benchmarks showing the 7970 beating out the 780Ti? 660Ti? 560Ti? 960Ti?
Quote:


> Originally Posted by *Heavy MG*
> 
> Paid for 64 ROPs, when you really have 56 ROPs! YAY!
> LOL!
> It's also like paying for a car with a V8 where 2 of the cylinders really do nothing, or buying a truck that is advertised to pull 8,000lbs when it really can't even tow your car.
> It sure has me concerned about the possibility of buying their upper-mid-tier card in the future. I do sort of regret buying my 970 G1 so soon, when I bought it for the "next gen" games in the first place.
> Aren't the mid-range cards Nvidia's biggest profits?


but it's not, because a V8 is a premium product and the 970 is not a premium product.

the attempt @ creating a meme overshadowed the creator's ability to make something that actually makes sense.

If this was happening with a 780Ti type product, then yes, you can call it getting a V8 that can't fire two cylinders, or a 300HP 6 speed stalling in 6th gear, but since this isn't a luxury product, those analogies do not apply.

It needs to be turned into a joke about riced-out Hondas. I mean, I don't even pay much attention to cars and I can tell the analogy is poorly assembled, so I'm sure someone who ACTUALLY pays attention to cars is quite annoyed by it.


----------



## MerkageTurk

Quote:


> Originally Posted by *skupples*
> 
> What? I think I lost you there.
> 
> A couple benchmarks showing the 7970 beating out the 780Ti? 660Ti? 560Ti? 960Ti?


Read brother...

Seems your drunk


----------



## skupples

10% / beat out... either way, you didn't properly define the statement to a point where it makes sense... Which Ti product?

also, what's drunk? My what is drunk? Pretty sure my dog isn't drunk, and since you said "your" and not "you're", you can't be talking about me. Are you drunk? You can't even post past the

Code:


[quote]

can you provide these benchmarks?


----------



## MerkageTurk

Argh autocorrect on phone is so annoying

From Techspots results:
GTX 980 = 100%
AMD R9 290X = 94.48%
GTX 970 = 88.28%
AMD R9 290 = 86.87%
GTX 780TI = 80.69%
GTX 780 = 71.03%
R9 280X = 69.66%
GTX 770 = 54.48%


----------



## skupples

well, at least I have enough data to google what you're talking about.

http://www.overclock.net/t/1529108/are-nvidia-gimping-kepler-since-maxwell/0_50

decent summary...

typical "professional" review sites posting data that doesn't make sense, or match with other "professional" review sites...

Funny how that works, especially since no review site ever in the history of professional review sites has ever been caught tampering with & skewing results.


----------



## MerkageTurk

Ow and check this

7970 32fps
780 ti 38fps

http://www.overclock.net/t/1528827/techspot-the-crew-benchmarked


----------



## skupples

I don't know a lot about The Crew, but something tells me it's likely a CPU-limited game. Ubisoft has been putting out a ton of CPU-limited stuff recently. Pegging 1-2 cores @ 99% = the GPU is left waiting on the CPU.

but i'll read more, & run my own tests.

I expect NV to nerf Kepler, but not this quickly. It won't really happen until GM200 hits the market.

(see fermi nerf)


----------



## MerkageTurk

Hmm, maybe AMD is a far better investment for 4K and future games, as well as compute power:

1) console games
2) mining (yes, I know it's hard now)
3) plus they don't nerf their GPUs


----------



## XXnomadXX

went to Best Buy to -"talk"- about a refund over the false advertising on the reference box, and customer service told me Nvidia has to show proof of a defective product for me to receive a refund.

mind that i have a 4k samsung monitor.

so how do i get a refund so i can have a proper 4k monitor experience?

purchase date was jan 18


----------



## skupples

I wouldn't buy anything for 4K until 390x & GM200 are on the market.

but that's just me... how Nvidia convinced people that a 256-bit card would be sufficient for 4K blows my mind.

Even 980 requires twins for some decent 4K action.


----------



## provost

Quote:


> Originally Posted by *XXnomadXX*
> 
> went to best buy to -"talk"- about a refund of a false advertisement on a reference box and the customer service told me nvidia have to show proof of a product to receive a refund.
> 
> mind that i have a 4k samsung monitor.
> 
> so how do i get a refund to have a proper experience 4k monitor
> 
> purchase date was jan 18


Edit:

contact Nvidia first, hopefully Nvidia can address your issue.


----------



## skupples

The most you're going to see is a * next to the 4GB mark, with a definition on the back.


----------



## jprovido

Quote:


> Originally Posted by *provost*
> 
> If enough people file a complaint with their respective State's AG (Attorney General), you will have the appropriate attention to resolve you issue. And, all it takes is a phone call.
> 
> But, contact Nvidia first, hopefully Nvidia can address your issue before it gets to that stage.


In my case I contacted Galax first. I just emailed them a few minutes ago. I wanted a refund for my card as well, and here in my country there's no return policy. Let's see how they handle my problem.


----------



## provost

Quote:


> Originally Posted by *jprovido*
> 
> In my case I contacted Galax first. I just emailed them a few minutes ago. I wanted a refund for my card as well, and here in my country there's no return policy. Let's see how they handle my problem.


The 970 is still a pretty good card, so be absolutely sure before you return it.

Notwithstanding any option to return, if it were me, I would not return a 970. I would just be more careful about buying the next card. Just my two cents.


----------



## XXnomadXX

Quote:


> Originally Posted by *provost*
> 
> 970 is still a pretty good card, so be absolutely sure before you return.
> 
> notwithstanding any option to return, if it were me, I would not return a 970. I would just be more careful about buying the next card. Just my two cents.


Now my next purchase of any GPU or CPU will come only after a one-year wait. Four months of waiting, I got a GTX 970, and this happens. Yeah, a one-year wait for me. Like they say:

strike 1 - 8800 GTS

strike 2 - false information on the GTX 970

strike 3 - who knows


----------



## provost

Quote:


> Originally Posted by *XXnomadXX*
> 
> now my next purchase of any gpu or cpu will be a 1 year wait for anything. four months waiting and got a gtx 970 and this happen. ya 1 year wait for me. like they say
> 
> strike 1- gts 8800
> 
> strike 2- false information of gtx 970
> 
> strike 3- who knows


This wouldn't be the first or last time a company stretched the truth, but I do believe some people here make a good point about the performance impact not being that significant. It is indeed upsetting to be lied to, but if I kept the 970, it would be because I wanted to, not because it was an Nvidia card.

Anyway, everyone can make up their own mind....


----------



## solid9

Some of you are missing the point. It's not about being a premium product or not (a GTR is 100k and a Koenigsegg is 1 million, but the GTR is still a premium product, for example); it's false advertising nonetheless. If people are starting to defend Nvidia and other companies even after such actions, there's something wrong with you guys. Nvidia has to specify what we're getting. Or do you like being sold 4 kg of apples, only to discover later that it was 3 kg of apples and the other 1 kg was the plastic bag and some leaves? The apples are still delicious (the 970 still performs well), but you'll run out of apples earlier (less VRAM for games that require it, future games, professional applications).
I've explained it for those who can't grasp what Nvidia has done and are still playing the green-flag/red-flag game. We need to protect our rights and not let Nvidia get away with this, or else we'll lose our rights as customers again and again.


----------



## Vesku

Another reason they should have ditched that last GDDR5 chip: it's a waste of electricity. Anyone who has been gushing over power requirements should be in a rage over having to power a near-useless GDDR5 chip.

All so they could print that 4GB on the retail box.


----------



## looniam

Quote:


> Originally Posted by *Vesku*
> 
> Another reason they should have ditched that last GDDR5 chip, it's a waste of electricity. Anyone who has been gushing over power requirements should be in a rage over having to power a near useless GDDR5 chip.
> 
> All so they could print that 4GB on the retail box.


Really?

People should be in a rage over a minuscule amount of power?

C'mon, man.

I think there is enough about Nvidia screwing up without having to conjure anything up.


----------



## Seven7h

Quote:


> Originally Posted by *tpi2007*
> 
> If it's needed, I'm betting it will, as the penalty of reading from the 0.5 GB segment is just too great - low bandwidth and the inability to read from the 3.5 GB segment concurrently, so the option is to move it to the 3.5 GB segment and move something less important to the 0.5 GB one.
> 
> That's akin to saying that people should be happy that they got that extra 0.5 GB of 28 GB/s memory instead of nothing and game assets having to be shifted to the much slower system RAM (PCIe interface bandwidth, latency, etc).
> 
> How about having all the L2 cache - *as advertised* - and a single segment of 4 GB of VRAM that has a maximum throughput of 224 GB/s *like advertised*, instead of 196 GB/s or 28 GB/s ?


I'm telling you it won't. If it did, another allocation that must be in the high-priority 3.5GB section could be made and boot all those textures back out again. That's why it doesn't work that way. Only low-priority things are in there, so the tax of just leaving them there most of the time is not that great, given that you're approaching overall memory limitations anyway.

Games simply don't suddenly and repeatedly bounce between 4GB and 3GB of usage. If one did, that would be a broken design and could cause problems even on a standard memory configuration. In fact, this is exactly why many games allocate most of the video memory they will ever need upfront: to avoid having to allocate a bunch of memory during gameplay.

Additionally, shuffling data from or back to the high-priority memory wouldn't cause the half-second stutters everyone is coming out of the woodwork to complain about. You're only going from GPU memory to GPU memory; a copy like that is very fast relative to bringing something in from system memory or disk, and can easily be done piecemeal over time. You just won't notice it as long as it's a smart algorithm, which it seems to be, given the lack of complaints for four months and the same logic having worked for years when balancing between video memory and system memory.

There's a tremendous amount going on behind the scenes in your system when running a game and it is what guarantees you good performance.

To get the behavior you fear, there would need to be room, meaning you destroyed up to 1GB of resources in a single frame during gameplay, and the OS would need to decide to aggressively move a ton of memory in an instant. That's not how it works; a driver and the OS are not that dumb.
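That placement argument can be sketched as a toy model (purely illustrative; NVIDIA's actual driver heuristics are not public, and the class, names, and sizes below are assumptions for the example): high-priority data fills the fast 3.5 GB segment, low-priority data gets parked in the 0.5 GB segment and left there, and nothing is forced to migrate mid-frame.

```python
# Toy model of a two-segment VRAM pool (illustrative only; the real
# NVIDIA driver's placement heuristics are not public).
FAST_MB = 3584   # 3.5 GB segment, full-speed access
SLOW_MB = 512    # 0.5 GB segment, reduced-bandwidth access

class SegmentedPool:
    def __init__(self):
        self.fast_used = 0
        self.slow_used = 0

    def alloc(self, size_mb, low_priority=False):
        """Place high-priority data in the fast segment when possible;
        park low-priority or overflow data in the slow segment."""
        if not low_priority and self.fast_used + size_mb <= FAST_MB:
            self.fast_used += size_mb
            return "fast"
        if self.slow_used + size_mb <= SLOW_MB:
            self.slow_used += size_mb
            return "slow"
        if self.fast_used + size_mb <= FAST_MB:  # slow segment is full
            self.fast_used += size_mb
            return "fast"
        return "evict-to-system-RAM"

pool = SegmentedPool()
print(pool.alloc(3000))                    # fast
print(pool.alloc(400, low_priority=True))  # slow
print(pool.alloc(500))                     # fast (still fits under 3584)
```

The point of the sketch is that placement is decided at allocation time, so the feared wholesale reshuffle between segments never has to happen during gameplay.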


----------



## Vesku

Quote:


> Originally Posted by *looniam*
> 
> really?
> 
> people should be in a rage over a minuscule amount of voltage?
> 
> c'mon man.
> 
> i think there is enough about nvidia screwing up w/o having to conjure up anything.


It's probably something like ~10-15 W. Going by some people's posts about Nvidia vs AMD power consumption, that's actually really important to them. A bit tongue-in-cheek, but based on some people's comments it really should upset them. Personally I'm in the "if we let them get away with this with no pain, then they'll do it again" camp. I don't want AMD getting any too-clever marketing ideas, either.


----------



## looniam

Quote:


> Originally Posted by *Vesku*
> 
> It's probably something like ~10-15W. Going by some people's posts about Nvidia vs AMD power consumption that's actually really important to them.


At 1.5 V for GDDR5 you're looking at 2 watts or less.

Anyhow, the amount of power isn't in question, so I am sorry, but it makes the point moot.


----------



## Vesku

Now I wish I had bought a GTX 970 so I could run power tests, 3.5GB vs 4GB gaming. I'm still waiting for a node shrink before upgrading.


----------



## Arizonian

/Cleaned thread.

Remember no profanity tolerated.

If you reply to a post containing profanity, your reply gets removed along with the quoted profanity. Then someone replies to your post, but since yours was removed, replies further down the chain have to be removed too so the discussion isn't disjointed.

Don't respond to people who use profanity; it's a waste of your time and causes a chain of removed posts.

Thread re-opened.


----------



## Sisaroth

Was probably already posted here but is interesting http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Discloses-Full-Memory-Structure-and-Limitations-GTX-970


----------



## pony-tail

Quote:


> Originally Posted by *Sisaroth*
> 
> Was probably already posted here but is interesting http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Discloses-Full-Memory-Structure-and-Limitations-GTX-970


That was an interesting (and easy to follow) read.
The card is what it is, and its performance is decent. Hopefully prices drop a bit after this and I will buy one!
The memory layout will most likely have little effect on the machine I'd be using it in, since the resolution is 3440x1440 (21:9) and I don't think I would realistically hit the end of the 3.5 GB. As I only recently bought that monitor, I'll be stuck with it for about the next 3 years (no 4K for me, unfortunately); anyway, it is a very nice monitor. I just can't afford the 980, as I have 3 video cards to buy (1 for gaming, 2 for my SG05 shoeboxes) in the next few months.


----------



## Cyro999

Quote:


> Originally Posted by *Vesku*
> 
> It's probably something like ~10-15W. Going by some people's posts about Nvidia vs AMD power consumption that's actually really important to them. Bit tongue in cheek but based on some of the people's comments it really should be upsetting for them. Personally I'm in the "if we let them get away with this with no pain then they'll do it again." camp. Don't want AMD getting any too clever marketing ideas, either.


10-15w?

There are eight chips







80-120w power usage from VRAM, especially when it's not being used?
Quote:


> The memory layout most likely will have little effect on the machine I would be using it on as the res is 3440 x 1440 (21:9) and I don't think I would realistically hit the end of the 3.5g


I'm doing it in a few games, particularly console ports, although your resolution is ~2.39x higher than mine.


----------



## mtcn77

13 SMM, 64 ROP, 256-bit, 224 GB/s;
<3.5 GB: 13 SMM, 56 ROP, 224-bit, 196 GB/s;
3.5-4.0 GB: 12.5 SMM, 50 ROP, 200-bit, 175 GB/s.
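For what it's worth, the bandwidth figures in that list follow directly from the effective bus width times GDDR5's data rate (7 Gbps effective on the GTX 970); a quick sanity check of the arithmetic:

```python
# Peak memory bandwidth from bus width and effective data rate:
#   GB/s = (bus_width_bits / 8 bits-per-byte) * data_rate_Gbps
def peak_bandwidth(bus_bits, data_rate_gbps=7.0):
    return bus_bits / 8 * data_rate_gbps

for bits in (256, 224, 200):
    print(f"{bits}-bit -> {peak_bandwidth(bits):.0f} GB/s")
# 256-bit -> 224 GB/s, 224-bit -> 196 GB/s, 200-bit -> 175 GB/s
```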


----------



## The Robot

The emperor has no clothes!


----------



## ondoy

Masters of Deception..


----------



## Menta

Quote:


> Originally Posted by *ondoy*
> 
> Masters of Deception..


You can bet on that; they really were true masters of deception.


----------



## mcg75

Quote:


> Originally Posted by *spacin9*
> 
> Pretty much defending their purchases, at least for me it is. That and the fact they aren't going to do anything about it. When they wipe the forums clean @ Geforce, you know they don't care and would rather not see the noise.


What did they wipe clean at the Geforce forums?

The main thread for this there has 2100 posts and 284,000 views in the 900 series section.

It was there last night when people were claiming it was removed and it's still there now.


----------



## Wirerat

Quote:


> Originally Posted by *mcg75*
> 
> What did they wipe clean at the Geforce forums?
> 
> The main thread for this there has 2100 posts and 284,000 views in the 900 series section.
> 
> It was there last night when people were claiming it was removed and it's still there now.


The mod said the thread was getting downvoted by users, so it was hidden. It was never removed. They've locked it open now.


----------



## Nevk

https://www.change.org/p/nvidia-refund-for-gtx-970


----------



## HyperC

Quote:


> Originally Posted by *Nevk*
> 
> https://www.change.org/p/nvidia-refund-for-gtx-970


46 more needed DO IT NOW!


----------



## Lass3

This was the best thing that could happen, seen through AMD eyes.







I look forward to the 300 series. My 970s are being returned. Tired of them anyway. Full refund incoming.


----------



## Cpt.Jeff

I am still in the return window for my card for the next few days. My question, though: would this problem affect me if I am only at 1440p?


----------



## Arturo.Zise

Send me all your busted crappy no good terrible 970's please. I will gladly buy them for cheap


----------



## skupples

Quote:


> Originally Posted by *HyperC*
> 
> 46 more needed DO IT NOW!


64 more needed for a pat on the back email from the automated change.org system.


----------



## Sisaroth

Maybe it's time to buy a second-hand GTX 970. If I were going to buy a new GPU now, it would have been a GTX 970 anyway, even knowing this.


----------



## Wirerat

Quote:


> Originally Posted by *Sisaroth*
> 
> Hmm maybe time to buy a second hand GTX 970. If i was going to buy a new GPU now it would have been a GTX 970 anyway even knowing this.


I made my purchase 7 days ago. I knew of this already.

I only wanted an Nvidia card, and it was a 970 or 780. I'm over buying 250-watt cards, so that only leaves the 970.


----------



## Olivon

Quote:


> Originally Posted by *mtcn77*
> 
> 
> 13 SMM 64 ROP 256 bit 224 GB/s,
> <3.5 GB 13 SMM 56 ROP 224 bit 196 GB/s;
> 3.5-4.0 GB 12,5 SMM 50 ROP 200 bit 175 GB/s.


And still putting Hawaii to shame, lawl... ^^


----------



## Xuper

Quote:


> Originally Posted by *Olivon*
> 
> And still putting Hawaii ashamed lawl ... ^^


And still fighting one years old Hawaii ashamed lawl...^^


----------



## sugarhell

Quote:


> Originally Posted by *Olivon*
> 
> And still puttiig Hawaii ashamed lawl ... ^^


Doesn't it apply to GK110 too?


----------



## skupples

Those numbers are too damn close, with the 290X edging it out in too many situations.


----------



## rdr09

Quote:


> Originally Posted by *Olivon*
> 
> And still puttiig Hawaii ashamed lawl ... ^^


Still hyping it, even after they duped owners.


----------



## Arturo.Zise

Quote:


> Originally Posted by *skupples*
> 
> Those numbers are too damn close with 290x edging it out and too many situations.


Tell those New Yorkers to buy some 290x's and maybe they can help out with the snow problem by melting it all


----------



## Menta

I don't care for AMD that much, but I must admit some 290Xs are tempting in the winter.


----------



## Wirerat

250-watt GPUs are a PITA unless you're on water, IMO.


----------



## WeirdHarold

*The performance of the GTX 970 is what the performance is. This information is incredibly interesting and warrants some debate, but at the end of the day, my recommendations for the GTX 970 really won't change at all.*

Above Video and Statement from PcPerspective over the last couple of days.

Personally, I can see how in a larger company the communication and understanding between the tech and marketing sides could have fallen apart, even if they've done a great job in the past keeping that from happening. I'd still buy a 970, as it would be far more GPU than I'd ever need for what I do. Just my two cents.


----------



## Silent Scone

There is a lot that gets lost in translation from manufacturing to sales in all fields, this being no exception.

Although the performance knock-on may be slightly affected by the two-way L2, this is a fraction of the issue a lot of owners are making it out to be (and non-owners, as the case may be). It may be a bit different from previous offerings, but no matter how deliberate, you can't divulge every technical specification. We know we are buying a cheaper card, so we should know we are getting less performance. How that is done is up to Nvidia, so although there might be a small penalty, you're still in fact getting 4GB.

Those are the facts. The grey area people are trying to unearth boils down to one thing and one thing *only*: how much of a deficit the change in spec is producing, to which the answer is frankly *minimal*. By all means discuss the hows and whys, but what's the point in ifs and buts when one will never know whether this was deliberately kept from consumers? More to the point, does it honestly matter when the performance is there? The performance never came into question until very recently.


----------



## WeirdHarold

Quote:


> Originally Posted by *Silent Scone*
> 
> There is a lot that gets lost in translation from manufacture to sales in all fields, this being no exception.
> 
> Although the performance knock-on may be slightly effected by the two way L2, this is about a fraction of the issue a lot of owners are making it out to be, and non owners as the case may be...It may be a bit different from previous offerings, but no matter how deliberate you can't divulge every technical specification. We know we are buying a cheaper card, so we should know we are getting less performance. How this is done is up to Nvidia, so although there might be a small penalty you're still in fact getting 4GB.
> 
> Those are the facts, the grey area people are trying to unearth boils down to one thing and one thing *only*. How much of a deficit the change in spec is producing, to which the answer is frankly *minimal*. By all means discuss the hows and whys, but what's the point in ifs and buts, when one will never know if this was deliberately kept from consumers or not, and more to the point - does it honestly matter when the performance is there, nor has the performance ever come into question till very recently.


I totally agree with everything you just said


----------



## nSone

Quote:


> Originally Posted by *Silent Scone*
> 
> There is a lot that gets lost in translation from manufacture to sales in all fields, this being no exception.
> 
> Although the performance knock-on may be slightly effected by the two way L2, this is about a fraction of the issue a lot of owners are making it out to be, and non owners as the case may be...It may be a bit different from previous offerings, but no matter how deliberate you can't divulge every technical specification. We know we are buying a cheaper card, so we should know we are getting less performance. How this is done is up to Nvidia, so although there might be a small penalty you're still in fact getting 4GB.
> 
> Those are the facts, the grey area people are trying to unearth boils down to one thing and one thing *only*. How much of a deficit the change in spec is producing, to which the answer is frankly *minimal*. By all means discuss the hows and whys, but what's the point in ifs and buts, when one will never know if this was deliberately kept from consumers or not, and more to the point - does it honestly matter when the performance is there, nor has the performance ever come into question till very recently.


Please remind me to ask for your opinion when they release a 970 Ti with the alleged specs in place.








Or maybe you already know what the performance improvement would be? Minimal? I know it won't be advertised as such.
Please note I'll be the first to buy that card.


----------



## criminal

Quote:


> Originally Posted by *Silent Scone*
> 
> There is a lot that gets lost in translation from manufacture to sales in all fields, this being no exception.
> 
> Although the performance knock-on may be slightly effected by the two way L2, this is about a fraction of the issue a lot of owners are making it out to be, and non owners as the case may be...It may be a bit different from previous offerings, but no matter how deliberate you can't divulge every technical specification. We know we are buying a cheaper card, so we should know we are getting less performance. How this is done is up to Nvidia, so although there might be a small penalty you're still in fact getting 4GB.
> 
> Those are the facts, the grey area people are trying to unearth boils down to one thing and one thing *only*. How much of a deficit the change in spec is producing, to which the answer is frankly *minimal*. By all means discuss the hows and whys, but what's the point in ifs and buts, when one will never know if this was deliberately kept from consumers or not, and more to the point - does it honestly matter when the performance is there, nor has the performance ever come into question till very recently.


Well, people were having issues before any of this came to light. I don't know; I understand that performance has not changed now, but as I and others have said, what about the future? True, a majority of us here on OCN may upgrade often, but some users may not upgrade for 3-4 years. If I were one of those users, I would have major concerns about how well this card would do in a year or two.

Either way, if it were me and I had a return window, the card would be going back.
Quote:


> Originally Posted by *nSone*
> 
> please remind me to ask for your opinion when they release an 970ti with the alleged specs in place
> 
> 
> 
> 
> 
> 
> 
> 
> or maybe you already know what performance improvement that would be? minimal? I know it won't be advertised as such
> please note I'll be the first to buy that card


A 970 Ti with the originally advertised 970 specs would eat right into 980 sales. It would be within 5%, I am willing to bet.


----------



## Noufel

Quote:


> Originally Posted by *criminal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Silent Scone*
> 
> There is a lot that gets lost in translation from manufacture to sales in all fields, this being no exception.
> 
> Although the performance knock-on may be slightly effected by the two way L2, this is about a fraction of the issue a lot of owners are making it out to be, and non owners as the case may be...It may be a bit different from previous offerings, but no matter how deliberate you can't divulge every technical specification. We know we are buying a cheaper card, so we should know we are getting less performance. How this is done is up to Nvidia, so although there might be a small penalty you're still in fact getting 4GB.
> 
> Those are the facts, the grey area people are trying to unearth boils down to one thing and one thing *only*. How much of a deficit the change in spec is producing, to which the answer is frankly *minimal*. By all means discuss the hows and whys, but what's the point in ifs and buts, when one will never know if this was deliberately kept from consumers or not, and more to the point - does it honestly matter when the performance is there, nor has the performance ever come into question till very recently.
> 
> 
> 
> Well people were having issues before any of this came to light. I don't know, I understand that performance has not changed now, but like myself and others have said what about the future? True that a majority of us here on OCN may upgrade very often, but there are some users that may not upgrade for 3-4 years. If I was one of those users I would have major concerns about how good this card would do in a year or two.
> 
> Either way if it was me and I had a return window, the card would be going back.

Simple: those people will have to stay at 1080p for the next year or two.


----------



## GrimDoctor

Quote:


> Originally Posted by *Silent Scone*
> 
> There is a lot that gets lost in translation from manufacture to sales in all fields, this being no exception.
> 
> Although the performance knock-on may be slightly effected by the two way L2, this is about a fraction of the issue a lot of owners are making it out to be, and non owners as the case may be...It may be a bit different from previous offerings, but no matter how deliberate you can't divulge every technical specification. We know we are buying a cheaper card, so we should know we are getting less performance. How this is done is up to Nvidia, so although there might be a small penalty you're still in fact getting 4GB.
> 
> Those are the facts, the grey area people are trying to unearth boils down to one thing and one thing *only*. How much of a deficit the change in spec is producing, to which the answer is frankly *minimal*. By all means discuss the hows and whys, but what's the point in ifs and buts, when one will never know if this was deliberately kept from consumers or not, and more to the point - does it honestly matter when the performance is there, nor has the performance ever come into question till very recently.


I know most here won't care, but the problems in some VRAM-hungry applications are far from minimal. At this point I can only assume it is due to the way the memory allocation is configured on these cards.


----------



## Final8ty

Quote:


> Originally Posted by *Silent Scone*
> 
> There is a lot that gets lost in translation from manufacture to sales in all fields, this being no exception.
> 
> Although the performance knock-on may be slightly effected by the two way L2, this is about a fraction of the issue a lot of owners are making it out to be, and non owners as the case may be...It may be a bit different from previous offerings, but no matter how deliberate you can't divulge every technical specification. We know we are buying a cheaper card, so we should know we are getting less performance. How this is done is up to Nvidia, so although there might be a small penalty you're still in fact getting 4GB.
> 
> Those are the facts, the grey area people are trying to unearth boils down to one thing and one thing *only*. How much of a deficit the change in spec is producing, to which the answer is frankly *minimal*. By all means discuss the hows and whys, but what's the point in ifs and buts, when one will never know if this was deliberately kept from consumers or not, and more to the point - does it honestly matter when the performance is there, nor has the performance ever come into question till very recently.


No, it boils down to the specifications given, which are false. It doesn't matter that everything can't be listed, but what is listed must be accurate.

As for *specifications are subject to change without notice*:

The specifications have to be correct at the time of purchase, but they don't have to warn you about upcoming changes to the specific product. So if you wanted to buy one now and then get another one later for a matching pair, and the specs have changed, then tough luck.


----------



## Silent Scone

Quote:


> Originally Posted by *criminal*
> 
> Well people were having issues before any of this came to light. I don't know, I understand that performance has not changed now, but like myself and others have said what about the future? True that a majority of us here on OCN may upgrade very often, but there are some users that may not upgrade for 3-4 years. If I was one of those users I would have major concerns about how good this card would do in a year or two.
> 
> Either way if it was me and I had a return window, the card would be going back.


Well, you say people are having an issue, but people have lots of issues. Every instance I've read from owners I can replicate just as well (or not, as the case may be) with the 980 GTX. The real problem is separating genuine issues from people merely misunderstanding the problem. As it stands, third-party tools aren't able to correctly monitor buffer usage, and when using in excess of 3.8GB, for example, it's not uncommon to experience hitching even with a 4GB-equipped GPU. As is the nature of swap-out, if the game is requesting large chunks of memory, requests can easily stack up and cause large stutters. This is happening in games like SOM and Assassin's Creed Unity for 980 GTX users *as well as* 970 users, when in reality the two-way L2 cache situation is probably responsible for only a very small frame loss.

You have to ask yourself: would NVIDIA really release a product that caused such a blatant problem when using the partitioned VRAM? Probably not, is the answer.

Quote:


> Originally Posted by *Final8ty*
> 
> No, it boils down to the specifications given which is false, it does not matter that everything cant be listed but what is listed must be accurate.
> 
> As far as *specifications are subject to change without notice*
> 
> The specifications have to be correct at the time of purchase, but they dont have to warn you about upcoming changes to the specific product, so if you wanted to buy one now and then get another one later for a matching pair and the specs have changed then tough luck.


The specification is correct, though. How the specification is met is again up to Nvidia, unless you can collect data that points to crippled performance, which is difficult anyway, as you've bought a card that's marketed as slower than the 980 GTX.

Look at it this way: not everyone reads the technology tabloids. If you had bought a 970 GTX on the notion that it was this amazing product, only to later read that NVIDIA had intentionally burnt out the traces to market it cheaper as a lesser product, would you be as mad?

Horses for courses.


----------



## skupples

People experiencing issues should properly catalog and report said issues to NV. The chances of getting your money back are slim to none, so instead of endless crying, do something to help the situation. NV seems to want data on malfunctions, so provide them with the data they want... I mean, no one has posted any proof of issues in over 24 hours. Is it because you have to get your GPUs down to 20 FPS to even consume that much VRAM, and thus know you'll be dismissed with a statement about exceeding the power envelope of the core?

My 670s didn't have enough core power to be playable when a game got close to 2GB of usage. Normally the core would hit 99% at around 1.6-1.7GB of load.
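For anyone who wants to catalog issues as suggested above, per-second VRAM usage can be logged with `nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits -l 1` and post-processed afterwards. A minimal sketch of that post-processing (the sample readings below are made up for illustration, not captured from a real card):

```python
# Flag nvidia-smi CSV samples where VRAM usage crosses the GTX 970's
# 3.5 GB boundary (3584 MiB). Input lines look like "3721, 4096".
THRESHOLD_MIB = 3584

def over_threshold(csv_line, threshold=THRESHOLD_MIB):
    used, _total = (int(x) for x in csv_line.split(","))
    return used > threshold

samples = ["3200, 4096", "3721, 4096"]  # made-up sample readings
flagged = [s for s in samples if over_threshold(s)]
print(flagged)  # only the sample past 3584 MiB
```

Correlating flagged timestamps with observed stutter would be exactly the kind of evidence worth sending along with a report.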


----------



## Silent Scone

Yep.

Also, I think quite frankly that some are likely GK104 owners who were expecting 4GB to take them to the moon and back, in an "I can't possibly be using all of the memory" situation, which just isn't the case today.


----------



## Wirerat

Quote:


> Originally Posted by *skupples*
> 
> My 670s didn't have enough core power to be playable when a game got close to 2gb usage. Normally the core would hit 99% around 1.6-1.7 load.


This. Anandtech mentioned the GPU of the GTX 970 already exhibits bottlenecking before needing to access the last 0.5GB.

If that's correct, it may be the GPU running out of oomph that's causing the stuttering some are seeing.


----------



## Final8ty

Quote:


> Originally Posted by *Silent Scone*
> 
> The specification is correct, though. How the specification is met again is up to Nvidia unless you can collect data that points to crippled performance. Which is difficult as a whole anyway as you've bought a card that's marketed as being slower than the 980GTX.


Sorry, but you are forgetting the ROPs and cache, which are not as specified and are not met in any way, shape, or form.


----------



## Vesku

Quote:


> Originally Posted by *Silent Scone*
> 
> The specification is correct, though. How the specification is met again is up to Nvidia unless you can collect data that points to crippled performance. Which is difficult as a whole anyway as you've bought a card that's marketed as being slower than the 980GTX.


Based on the more detailed information from Nvidia about the separate memory pools, the GTX 970 won't ever be transferring more than a theoretical 196 GB/s at any single point in time. I don't see how claiming 224 GB/s can be "correct"; the 28 GB/s in the small memory pool cannot occur at the same time as the 196 GB/s in the larger pool.


----------



## Silent Scone

Quote:


> Originally Posted by *Final8ty*
> 
> Sorry but you are forgetting the ROP and CACHE which are not as specified and not meet in any way shape or form.


So are you suggesting the 970 GTX is in fact ROP bound now that you've discovered this for yourself?
Quote:


> Originally Posted by *Vesku*
> 
> Based on the more detailed information from Nvidia about the separate memory pools the GTX 970 won't ever be transferring more than a theoretical 196 GB/s at any single point in time. I don't see how claiming 224 GB/s can be "correct", the 28 GB/s in the small memory pool can not occur at the same time as the 196 GB/s in the larger pool.


Not to dismiss this but it's pointless talking about theoretical bandwidth, as Skupples has already pointed out - data needs to be shown by those 'affected'.


----------



## Final8ty

Quote:


> Originally Posted by *Silent Scone*
> 
> So are you suggesting the 970 GTX is in fact ROP bound now that you've discovered this for yourself?
> Not to dismiss this but it's pointless talking about theoretical bandwidth, as Skupples has already pointed out - data needs to be shown by those 'affected'.


It does not matter whether it's ROP bound or not. The FACT is it does not have the ROP count and cache that were specified, which means it does not meet the listed specifications, regardless of whether it's ROP or cache bound.


----------



## Vesku

Quote:


> Originally Posted by *Silent Scone*
> 
> So are you suggesting the 970 GTX is in fact ROP bound now that you've discovered this for yourself?
> Not to dismiss this but it's pointless talking about theoretical bandwidth, as Skupples has already pointed out - data needs to be shown by those 'affected'.


Theoretical max memory bandwidth is listed on Nvidia's web page for it. Based on Nvidia's description of the GTX 970's memory the max theoretical bandwidth that can occur on that card is 196 GB/s.


----------



## vloeibaarglas

http://www.reddit.com/r/buildapc/comments/2tu86z/discussion_i_benchmarked_gtx_970s_in_sli_at_1440p/

The world is going to fall apart for 970. I wonder if PCPer will even try to FCAT some games considering they are pretty much Nvidia's mouthpiece.


----------



## Menta

Quote:


> Originally Posted by *skupples*
> 
> People experiencing issues should really properly catalog and report said issues to NV. The chances of getting your money back is slim to none, so instead of endless crying, do something to attempt to help the situation. NV seems to want data on malfunctions, so provide them with the data they seem to want... I mean, no one has posted any proof of issues in over 24 hours. Is it because you have to get your GPUS down to 20 FPS to even consume that much VRAM? Thus know you will be dismissed by a statement of exceeding the power envelope of the core?
> 
> My 670s didn't have enough core power to be playable when a game got close to 2gb usage. Normally the core would hit 99% around 1.6-1.7 load.


There is proof all around:

http://www.reddit.com/r/buildapc/comments/2tu86z/discussion_i_benchmarked_gtx_970s_in_sli_at_1440p/


----------



## Vesku

That Shadow of Mordor test is the "proper" kind, no change except for texture settings:

Quote:


> SHADOW OF MORDOR
> 
> | VRAM usage | Min FPS | Avg FPS | Max FPS | Settings |
> | --- | --- | --- | --- | --- |
> | 3.1 GB | 46 | 71.627 | 88 | High textures |
> | 3.4-3.5 GB | 2 | 67.934 | 92 | Ultra textures |
> 
> This was tested using both High and Ultra textures.
> 
> At 3.1gb Vram usage, the game played smoothly. I expected higher FPS for the stock results but was very pleased with how much overclocking scaled in this game.
> 
> Above the 3.5gb threshold, the game was BARELY playable. I believe it was even playable due to the nature of the game rather than the GTX 970 handling its Vram better in this particular title. Only the minimum FPS was representative of the ****ty experience. What was 55 FPS felt like 15.


This is the scenario tech sites should be examining: games where texture quality can be adjusted from less than 3.5GB into the 3.5-4GB range at the same resolution without changing AA. Such as verifying this person's experience at 1440p, High vs. Ultra textures, in Shadow of Mordor.


----------



## Silent Scone

Quote:


> Originally Posted by *Vesku*
> 
> Theoretical max memory bandwidth is listed on Nvidia's web page for it. Based on Nvidia's description of the GTX 970's memory the max theoretical bandwidth that can occur on that card is 196 GB/s.


But again, this is why it's theoretical. You could run tests at any given memory clock and not reach the calculated theoretical limit; that's why they're called theoretical limits. Don't mistake me for saying the spec is somehow better. It's a fairly odd state of affairs, as it's new thanks to the architecture. The specs should have been correct in the first place, although stating 196 + 28GB/s seems somewhat ridiculous in itself.

Quote:


> Originally Posted by *Vesku*
> 
> See that Shadow of Mordor test is the "proper" kind, no change except for texture settings:
> 
> SHADOW OF MORDOR
> 
> | VRAM usage | Min FPS | Avg FPS | Max FPS | Settings |
> | --- | --- | --- | --- | --- |
> | 3.1 GB | 46 | 71.627 | 88 | High textures |
> | 3.4-3.5 GB | 2 | 67.934 | 92 | Ultra textures |
> 
> This was tested using both High and Ultra textures.
> 
> At 3.1gb Vram usage, the game played smoothly. I expected higher FPS for the stock results but was very pleased with how much overclocking scaled in this game.
> 
> Above the 3.5gb threshold, the game was BARELY playable. I believe it was even playable due to the nature of the game rather than the GTX 970 handling its Vram better in this particular title. Only the minimum FPS was representative of the ****ty experience. What was 55 FPS felt like 15.
> 
> This is the scenario tech sites should be examining. Games where texture quality can push into the 3.5-4GB range at the same resolution without changing AA.


*Ultra textures is an uncompressed texture pack put into the game for people who are able to use it. With my 980GTX I am unable to use this texture pack because it requires more than 4GB of VRAM, I experience the same problems as you*

Why don't people ever listen to anything. Apart from all the wrong things.


----------



## 2010rig

So, basically: Specs > Performance. ( for some people )

I don't know if I buy NVIDIA's whole story; they should've provided the correct specs from the get-go. At the end of the day, the 970 is performing exactly as it should be: *a $220 lesser card.*

NVIDIA will provide the correct specs from here on out: 56 ROPs, not sure how they can properly label the RAM, and they'll need to provide correct bandwidth numbers. Other than that, the performance of the 970 hasn't changed, and is exactly as it should be.

2 FPS apart from the 980 for $220 LESS, oh the insanity!


----------



## jaydude

NV should have made this known from the very beginning, not hide it and hope no one noticed then make claims after the fact.

Shame on you NV, shame.


----------



## vloeibaarglas

Quote:


> Originally Posted by *Silent Scone*
> 
> But again, this is why it's theoretical. You could run tests at any given memory clock and not reach the calculated theoretical limit, this is why they're called theoretical limits. Do not mistake me for saying the spec is somewhat better, it's a fairly odd state of affairs as it's new thanks to the architecture. As should the specs been correct in the first place, although saying 196 + 28GB/s seems somewhat ridiculous in itself.


196 GB/s or 28 GB/s, since both pools can't be accessed on the same cycle. lol


----------



## skupples

Quote:


> Originally Posted by *vloeibaarglas*
> 
> http://www.reddit.com/r/buildapc/comments/2tu86z/discussion_i_benchmarked_gtx_970s_in_sli_at_1440p/
> 
> The world is going to fall apart for 970. I wonder if PCPer will even try to FCAT some games considering they are pretty much Nvidia's mouthpiece.


That hasn't always been the case. They used to even be named something like phenom.com.


----------



## djriful

I read a comment: "I bought a GTX 970 for future-proofing 4K for the next 5 years... now I want a refund."

The next thing you know, someone will have bought a $100 card to future-proof for 10 years.

If you are able to afford 4K... I suggest waiting for big Maxwell, aka TITAN "???", etc.

No tech is going to last you 5 years (absolute max) if you want to keep up with all the latest game engines and increasing monitor resolutions. The average is now 2-3 years per upgrade.


----------



## Vesku

Quote:


> Originally Posted by *Silent Scone*
> 
> But again, this is why it's theoretical. You could run tests at any given memory clock and not reach the calculated theoretical limit, this is why they're called theoretical limits. Do not mistake me for saying the spec is somewhat better, it's a fairly odd state of affairs as it's new thanks to the architecture. As should the specs been correct in the first place, although saying 196 + 28GB/s seems somewhat ridiculous in itself.


It is not even theoretically possible to hit 224 GB/s with the GTX 970 memory configuration. The two pools cannot be accessed at the same time; with the GTX 970's configuration there is no way to ever move all 256 bits of the memory interface at the exact same time. Now, if Nvidia had not tricked people and had actually shipped a single 4GB memory pool, then that pool's theoretical max bandwidth would be 224 GB/s.


----------



## criminal

Quote:


> Originally Posted by *2010rig*
> 
> *So, basically: Specs > Performance.* ( for some people )
> 
> I don't know if I buy NVIDIA' whole story, but at the end of the day, the 970 is performing exactly as it should be: *A $220 lesser card.*
> 
> NVIDIA will provide the correct specs from here on out. 56 ROP's, not sure how they can properly label the RAM, and they'll need to provide correct bandwidth numbers. Other than that, the performance hasn't changed, and is as it should be.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 2 FPS apart from the 980, oh the insanity!
> 2 FPS apart from the 980, oh the insanity!


Yep, but I can't really blame anyone for feeling this way. It is not okay that Nvidia lied about the specs. (I don't buy for a minute that Nvidia made that big of a mistake.) We know that performance has not changed, but it does raise concerns for the future. For someone like you who does not appear to upgrade very often (GTX 470 in sig), the potential future impact can be a cause for concern now. Having fewer ROPs and less theoretical bandwidth may become an issue down the road.


----------



## Silent Scone

Quote:


> Originally Posted by *Vesku*
> 
> It is not even theoretically possible to hit 224 GB/s with the GTX 970 memory configuration. The two pools can not be accessed at the same time. With the GTX 970s configuration there is no way to ever move all 256 bits of the memory interface at the exact same time. Now theoretically if Nvidia did not trick people and actually had a single 4GB memory pool then that memory pools theoretical max bandwidth would be 224 GB/s.


Yes, I know. Nor did I say the spec is correct.


----------



## Menta

That's the whole thing: most of us like buying hardware. I have no problem buying a new card every two or three years; sometimes I buy two in the same year.

That's reason enough that NV should treat this matter with the right amount of willingness to keep people happy.


----------



## sugalumps

Quote:


> Originally Posted by *djriful*
> 
> I read some comment... "I bought GTX970 for future proofing 4k for the next 5 years... now I want refund".
> 
> The next thing I know is, I bought $100 card to future proof 10 years.
> 
> If you are able to afford 4k... I suggest wait for big maxwell aka TITAN "???" etc.
> 
> No tech is going to last you 5 years (absolute max) if you want to keep up all the latest games engines with increasing monitor resolutions. Average is now 2-3 years per upgrades.


Indeed, the 970 is a budget card no matter how you look at it. You can't expect a mid-range budget card to last you through a resolution that has just come out and that not even the top GPUs can handle comfortably.

For some reason the 970 is so hyped that people think it can do anything; people are buying the 970 specifically for 4K thinking they are set for years.


----------



## 2010rig

Quote:


> Originally Posted by *criminal*
> 
> Yep, but I can't really blame anyone for feeling this way. It is not okay that Nvidia lied about the specs. (I don't buy for a minute Nvidia made that big of a mistake.) We know that performance has not changed, but it does raise concerns for the future. You being someone that does not appear to upgrade very often (GTX470 in sig) the potential impact of this in the future can be a cause for concern now. Having less rops and theoretical bandwidth may play an issue down the road.


I agree that they should have posted the correct ROP count and bandwidth numbers, and the card would have sold just as well.

As you probably already know, my primary uses aren't gaming, but anyway: does the 970 even need 64 ROPs?

Not sure if I'm reading this wrong...
Quote:


> That means the GTX 970 has 56 ROPs and 1792 KB of L2 cache compared to 64 ROPs and 2048 KB of L2 cache for the GTX 980. Before people complain about the ROP count difference as a performance bottleneck, *keep in mind that the 13 SMMs in the GTX 970 can only output 52 pixels/clock and the seven segments of 8 ROPs each (56 total) can handle 56 pixels/clock. The SMMs are the bottleneck, not the ROPs*.
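The arithmetic in that quote is easy to check. A quick sketch; the 4 pixels/clock per SMM figure is implied by the quoted numbers (13 SMMs at 52 pixels/clock), not stated outright:

```python
# Checking the quoted ROP-vs-SMM bottleneck claim for the GTX 970.
# Assumption: each SMM outputs 4 pixels/clock, which is what the
# quote's figures (13 SMMs -> 52 pixels/clock) imply.

SMMS = 13
PIXELS_PER_SMM = 4
ROP_SEGMENTS = 7       # seven segments of 8 ROPs each
ROPS_PER_SEGMENT = 8   # each segment handles 8 pixels/clock

smm_throughput = SMMS * PIXELS_PER_SMM            # pixels/clock the SMMs can emit
rop_throughput = ROP_SEGMENTS * ROPS_PER_SEGMENT  # pixels/clock the ROPs can absorb

bottleneck = "SMMs" if smm_throughput < rop_throughput else "ROPs"
print(smm_throughput, rop_throughput, bottleneck)  # 52 56 SMMs
```

So the 56 remaining ROPs can already consume more pixels per clock than the 13 SMMs can produce, which is why the missing ROPs don't show up as a pixel-fill bottleneck.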


----------



## djriful

Quote:


> Originally Posted by *sugalumps*
> 
> Quote:
> 
> 
> 
> Originally Posted by *djriful*
> 
> I read some comment... "I bought GTX970 for future proofing 4k for the next 5 years... now I want refund".
> 
> The next thing I know is, I bought $100 card to future proof 10 years.
> 
> If you are able to afford 4k... I suggest wait for big maxwell aka TITAN "???" etc.
> 
> No tech is going to last you 5 years (absolute max) if you want to keep up all the latest games engines with increasing monitor resolutions. Average is now 2-3 years per upgrades.
> 
> 
> 
> Indeed, the 970 is a budget card no matter how you look at it. You cant expect a mid range budget card to last you through a resolution that is just out and not even the top gpu's can handle it comfortably.
> 
> For some reason the 970 is so hyped that people think it can do anything, people are buying the 970 specifically for 4k thinking they are set for years.
Click to expand...

Yep, but regardless of the comment I read: according to the tests, the way the VRAM was designed produces a large performance drop once usage goes over 3.5GB. Maybe Nvidia should stop using this design; it will hinder the card more as more games use over 3.5GB.

I fear Nvidia misunderstands what we are expecting. We expect a gradual performance drop as we increase resolution and VRAM usage, not a -50% drop. Obviously we are not saying the GTX 970 should outperform the GTX 980.


----------



## Silent Scone

Quote:


> Originally Posted by *2010rig*
> 
> I agree that they should have posted the correct ROP count and bandwidth numbers, and the card would have sold just as well.
> 
> As you probably already know my primary uses aren't gaming, but anyway. Does the 970 even need 64 ROP's?
> 
> Not sure if I'm reading this wrong...


No, it doesn't. The 970 is definitely not ROP bound. But it's still got people's backs up regardless. We're probably going to end up with some pathetic suffix for this in product specifications now (0.5GB Gate) or something.

Laaaaaame. It'll be all your fault as well (general masses). I'll remember your whining every time I go GPU shopping and see it. Speaking rather more forthrightly to everyone with the pitchforks, though: just get the flagship next time lol.


----------



## skupples

Quote:


> Originally Posted by *Silent Scone*
> 
> But again, this is why it's theoretical. You could run tests at any given memory clock and not reach the calculated theoretical limit, this is why they're called theoretical limits. Do not mistake me for saying the spec is somewhat better, it's a fairly odd state of affairs as it's new thanks to the architecture. As should the specs been correct in the first place, although saying 196 + 28GB/s seems somewhat ridiculous in itself.
> *Ultra textures is an uncompressed texture pack put into the game for people who are able to use it. With my 980GTX I am unable to use this texture pack because it requires more than 4GB of VRAM, I experience the same problems as you*
> 
> Why don't people ever listen to anything. Apart from all the wrong things.


Yeah idk SOM stutters for me as well but I haven't played since release.


----------



## Silent Scone

Quote:


> Originally Posted by *skupples*
> 
> Yeah idk SOM stutters for me as well but I haven't played since release.


Monolith was quoted by Eurogamer as saying yes, it *does* require 6GB VRAM, and I'm thinking that is with 1080p in mind. I've seen Titan owners have similar problems, albeit not as frequently, when running at the buffer limit. The texture pack is literally completely uncompressed renders thrown into the game off their workstations, which are probably sporting K-series GPUs with stacks of memory. And arguably, me being forthright again: if you're a 970 owner and you're chasing maximum settings, buy a better GPU. Don't hate the player, hate the game!


----------



## PontiacGTX

Quote:


> Originally Posted by *2010rig*
> 
> I agree that they should have posted the correct ROP count and bandwidth numbers, and the card would have sold just as well.
> 
> As you probably already know my primary uses aren't gaming, but anyway. Does the 970 even need 64 ROP's?
> 
> Not sure if I'm reading this wrong...


Quote:


> Unlike GM107, the GM204 GPU features four Graphics Processor Clusters (GPCs) instead of one. That means it benefits from four times the number of raster engines. Of course, high-end graphics cards require a beefier back-end to handle all of that data throughput, and the GeForce GTX 980 utilizes four render back-ends capable of handling 16 full-color ROP operations per clock, adding up to 64. Four 64-bit memory controllers create an aggregate 256-bit bus. By the way, you may have noticed that the GeForce GTX 970's 13 SMMs don't divide equally into four GPCs. Nvidia says that there is no predefined recipe of SMMs per GPC in the 970, and each GPU may be configured differently.


http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941.html

Why does Tom's say the 980 uses 16 ROPs per memory controller while your article says 8?


----------



## criminal

Quote:


> Originally Posted by *2010rig*
> 
> I agree that they should have posted the correct ROP count and bandwidth numbers, and the card would have sold just as well.
> 
> As you probably already know my primary uses aren't gaming, but anyway. Does the 970 even need 64 ROP's?
> 
> Not sure if I'm reading this wrong...


No, you are reading it right. ROP count has no effect now. But back to my earlier point: if users still have a return window open on their 970, then I say return it for a refund (if Nvidia lying bothers them that much; it would bother me!). Otherwise this issue will probably never matter and can be disregarded.


----------



## skupples

Quote:


> Originally Posted by *Silent Scone*
> 
> Monolith got quoted by Euro Gamer as saying yes it *does* require 6GB VRAM, and I'm thinking that is with 1080p in mind. I've seen Titan owners have similar problems albeit not as frequently with running on buffer limit. Literally the texture pack is renders that are completely uncompressed and thrown in the game off their workstations, which are probably sporting K series GPUs with stacks of memory. And arguably, me being forthright again - if you're a 970 owner and you're chasing maximum settings - buy a better GPU. Don't hate the player, hate the game!


right, and said texture pack isn't even THAT good looking compared to stock textures.


----------



## Sargas290X

Quote:


> Originally Posted by *Quasimojo*
> 
> I can only speak for myself, but the way I see it, there is nothing to defend. GPU manufacturers have been doing this kind of thing since discreet GPU's first hit the scene, and the fact of the matter is that if no one had brought this particular bit of data to light, people would have been just fine with the performance of their 970. nVidia releases not one but a pair of GPU's that provide better value and performance than any of AMD's offerings at their respective price points, *and people just can't stand it until they can dig up something like this to rail on about.* Never mind the fact that they were able to give us real, noticeable performance gains on the same die fabrication tech.
> 
> I've continued to contribute to this thread, because this Chicken Little kangaroo court of public opinion crap drives me bonkers. This is why we can't have nice things.


Umm, I had been following the other thread on this forum about this issue for a week before Nvidia responded. It was all by people who already own a 970. I was considering the 970 before I decided to pick up a 290X, because AMD's offering was $70 cheaper than the price-gouged 970. Yes, you have a 60-watt difference in max TDP, but at least I don't have to worry about performance drops when games use more than 3.5GB at 1440p.


----------



## Silent Scone

Quote:


> Originally Posted by *skupples*
> 
> right, and said texture pack isn't even THAT good looking compared to stock textures.


Agreed, but it seems to be the 970 GTX 'tell'...

It's really not! You just flat out don't have enough memory on the card, period lol


----------



## 2010rig

Quote:


> Originally Posted by *PontiacGTX*
> 
> http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941.html
> 
> why the 980 uses 16 ROP per memory controller and your article says 8?


My oversimplified answer is that Tom's is going by the original diagram.

But this seems to be further broken down like so:

http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Discloses-Full-Memory-Structure-and-Limitations-GTX-970


----------



## raghu78

Quote:


> Originally Posted by *2010rig*
> 
> So, basically: Specs > Performance. ( for some people )
> 
> I don't know if I buy NVIDIA's whole story, they should've provided the correct specs from the get go. *At the end of the day, the 970 is performing exactly as it should be*: *A $220 lesser card.*
> 
> NVIDIA will provide the correct specs from here on out. 56 ROP's, not sure how they can properly label the RAM, and they'll need to provide correct bandwidth numbers. Other than that, the performance of the 970 hasn't changed, and is exactly as it should be.


Not really. The bigger question is how consistent the performance is when the last 0.5GB is accessed. After correctly pointing out AMD's poor frametimes on the HD 7900 series (which AMD improved with XDMA and the R9 290/R9 290X), the tech press should now show the same diligence as they did then. This is very important given the recent trend of AAA titles easily using 4GB at 4K and 1440p (Middle-earth: Shadow of Mordor, Call of Duty: Advanced Warfare, AC Unity, Lords of the Fallen). That's what matters to the public. As always, users should do their due diligence and read as much as possible on frametime testing with the GTX 970 in cases where VRAM usage is >3.5GB and <=4GB.

By the way, the ExtremeTech article from which you posted images talked about frametimes being a problem in Middle-earth:

http://www.extremetech.com/extreme/198223-investigating-the-gtx-970-does-nvidias-penultimate-gpu-have-a-memory-problem/2

"*The 1% frame times in Shadows of Mordor are significantly worse on the GTX 970 than the GTX 980. This implies that yes, there are some scenarios in which stuttering can negatively impact frame rate and that the complaints of some users may not be without merit. However, the strength of this argument is partly attenuated by the frame rate itself - at an average of 33 FPS, the game doesn't play particularly smoothly or well even on the GTX 980.*"

The difference is that the GTX 970 will exhibit these frametime issues even when overclocked, and these issues will be even worse in SLI when >3.5GB of VRAM is used.


----------



## Kand

I'm going to say this: it has been my personal mantra that a chip with disabled shaders or cores is inherently defective.

Why did you buy a defective product?


----------



## XXnomadXX

I just talked to Best Buy, told them about the GTX 970's false info, and got my "full" refund back. Now I'm moving to AMD. Please work, 4K monitor, please.


----------



## 2010rig

Quote:


> Originally Posted by *Kand*
> 
> Im going to say this. It has been my personal mantra that a chip with disabled shaders or cores is inherently defective.
> 
> Why did you buy a defective product?


Because it was $220 less for ~10% less performance.


----------



## Silent Scone

Quote:


> Originally Posted by *Kand*
> 
> Im going to say this. It has been my personal mantra that a chip with disabled shaders or cores is inherently defective.
> 
> Why did you buy a defective product?


That's pretty dumb in its own right but then I'd imagine there are people who bought TITAN initially without knowing what it was birthed from.

And as above, in this instance because it is a lesser product. Just so happens now, it turns out it's slightly more lesser than initially thought.

Weep for humanity folks.


----------



## mercs213

Would it be possible to return a 970 to Amazon even if you're out of the return window and mention the false advertising from nvidia?


----------



## Kand

Quote:


> Originally Posted by *2010rig*
> 
> Because it was $220 less for ~10% less performance.


Early adopters get burnt.


----------



## iTurn

Quote:


> Originally Posted by *mercs213*
> 
> Would it be possible to return a 970 to Amazon even if you're out of the return window and mention the false advertising from nvidia?


Try it; I don't see why they wouldn't. Link them to the press release of Nvidia admitting the deception. Good luck!


----------



## looniam

Quote:


> Originally Posted by *2010rig*
> 
> Because it was $220 less for ~10% less performance.


But at what resolution?

3840 x 2160: 0.03%
2560 x 1440: 1.00%
1920 x 1080: 33.05%
http://store.steampowered.com/hwsurvey

Because it's the 1.03% of the public that matters!


----------



## Hattifnatten

Quote:


> Originally Posted by *Kand*
> 
> Im going to say this. It has been my personal mantra that a chip with disabled shaders or cores is inherently defective.
> 
> Why did you buy a defective product?


----------



## FlyingSolo

I'm just over 4 months in, and where I got it from, I highly doubt I will be able to get a refund.


----------



## FlyingSolo

Quote:


> Originally Posted by *mercs213*
> 
> Would it be possible to return a 970 to Amazon even if you're out of the return window and mention the false advertising from nvidia?


Amazon are great. Just tell them and they will give you a refund. I got a refund from them after 12 months on an item I bought, so you should have no problems.


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *Xoriam*
> 
> it's funny because most of the people who are mad in this thread don't even own a 970.


You don't have to own a product to be upset about a company's decisions and practices.


----------



## darealist

Stock down 3% doe.


----------



## Serandur

Quote:


> Originally Posted by *EchoOne*
> 
> Nope. Got it just over 2 months ago.


Restocking fee?


----------



## FlyingSolo

I'm seeing people selling their cards on eBay, but no one's bidding on them at all. You will be lucky to even get £200 for one now.


----------



## djsi38t

I can't wait to see the day when less GPU horsepower and VRAM are needed for modern games.

Tech will grow and requirements will drop.


----------



## darkwizard

Any FCAT testing past 3.5GB done yet? Min FPS and max FPS don't really tell the whole picture, same as what happened with the 7900 series and their CrossFire issue.


----------



## skupples

Quote:


> Originally Posted by *Silent Scone*
> 
> That's pretty dumb in its own right but then I'd imagine there are people who bought TITAN initially without knowing what it was birthed from.
> 
> And as above, in this instance because it is a lesser product. Just so happens now, it turns out it's slightly more lesser than initially thought.
> 
> Weep for humanity folks.


I'm sure there are, but that means they're deaf, dumb & blind, as Titan was advertised as a cut-down (not good enough to be a) Quadro.
Quote:


> Originally Posted by *FlyingSolo*
> 
> I'm seeing people selling there cards on ebay. But no ones biding on them at all. You will be lucky if you can even get £200 for it now.


Maybe over yonder; prices haven't budged over here whatsoever, yet.

MOST people that own these things are not aware, a large majority of those who are aware don't understand what it means, and another slice is experiencing zero issues whatsoever, so they don't care.

Not everyone wants to push their FPS down to 30 or less just to DSR 4K on a single card. Most people prefer to run 60+ at lower settings.


----------



## mtcn77

Quote:


> Originally Posted by *darkwizard*
> 
> any FCAT done testing past 3.5gb done yet? min fps and max fps doesn't really tell the whole picture, same as it happened to the 7900 series and their Crossfire issue.



Not much else except these.


----------



## FlyingSolo

Quote:


> Originally Posted by *skupples*
> 
> maybe over yonder, prices haven't budged over here what so ever, yet.
> 
> MOST people that own these things are not aware, and a large majority of those who are aware, don't understand what it means, and another slice is experiencing zero issues what so ever, so they don't care.
> 
> not everyone wants to push their FPS down to 30 or less, just to DSR 4K on a single card. Most people prefer to run 60+ @ lower settings.


You're right, most people don't know yet. But it's now on some gaming news sites, so people are becoming aware of it. I had a feeling I should have kept my 780 before I bought the 970. My plan was, and still is, to upgrade to GM200 when it comes out, or a 390X, and put the 970 in my HTPC so I can game on it as well.


----------



## iSlayer

Quote:


> Originally Posted by *Arturo.Zise*
> 
> Send me all your busted crappy no good terrible 970's please. I will gladly buy them for cheap


Quote:


> Originally Posted by *Sisaroth*
> 
> Maybe time to buy a second hand GTX 970. If i was going to buy a new GPU now it would have been a GTX 970 anyway even knowing this.


People aren't really selling, which lends credence to the theory that people just don't care.

See the OCN marketplace, for example.
Quote:


> Originally Posted by *criminal*
> 
> Yep, but I can't really blame anyone for feeling this way. It is not okay that Nvidia lied about the specs. (I don't buy for a minute Nvidia made that big of a mistake.) We know that performance has not changed, but it does raise concerns for the future. You being someone that does not appear to upgrade very often (GTX470 in sig) the potential impact of this in the future can be a cause for concern now. Having less rops and theoretical bandwidth may play an issue down the road.


That's why we need test results, hopefully sooner rather than later.
Quote:


> Originally Posted by *Kand*
> 
> Im going to say this. It has been my personal mantra that a chip with disabled shaders or cores is inherently defective.
> 
> Why did you buy a defective product?


Have you bought Intel CPUs before? If yes, you likely purchased at least one defective product.

Quote:


> Originally Posted by *Kand*
> 
> Early adopters get burnt.


How did we get burned? We had benchmarks showing great performance and bought accordingly.

Nvidia messed up, the 970 may or may not have a big problem, etc... the 970 is still one helluva card.


----------



## Kand

Quote:


> Originally Posted by *iSlayer*
> 
> Have you bought Intel CPUs before? If yes, you likely purchased at least one defective product.


The most I've had is a processor with Hyper-Threading disabled. No cores or transistors lasered off.

Which ones are you talking about?


----------



## mtcn77

Quote:


> Originally Posted by *PostalTwinkie*
> 
> The Benchmarks for the 970 and its overall performance changed?
> 
> I missed that event, have a link?


Drops 12% from <3.5GB use to 4GB use, since the slow 8th partition delays the rest. Notice this is the "best case scenario". Nvidia can pull a fast one at any time they wish by choking the driver at the slow partition. Planned obsolescence, ah!
Quote:


> But in the long term if the 7th port is fully busy, and is getting twice as many requests as the other port, then the other six must be only half busy, to match with the 2:1 ratio. So the overall bandwidth would be roughly half of peak. This would cause dramatic underutilization and would prevent optimal performance and efficiency for the GPU.


Pcper
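
PCPer's worst-case arithmetic is easy to reproduce. Here's a rough sketch (assuming 7 equal ports serving the fast 3.5GB segment and a strict 2:1 request ratio pinned to the 7th port; the per-port figure is illustrative, derived from the 970's 224 GB/s peak spread over 8 ports — the real memory controller interleaves more cleverly than this):

```python
# Rough model of PCPer's worst-case scenario for the GTX 970's 3.5GB segment:
# 7 of the 8 memory ports serve the fast segment. If traffic must hit one
# port twice as often as each of the other six, that hot port saturates
# first and caps the other six at half utilization.

PORTS_FAST_SEGMENT = 7
PER_PORT_BW = 28.0  # GB/s per port (224 GB/s peak / 8 ports), illustrative

# Hot port runs flat out; the 2:1 ratio forces the other six to half speed.
hot_port_bw = PER_PORT_BW
other_ports_bw = PER_PORT_BW * 0.5 * (PORTS_FAST_SEGMENT - 1)

effective = hot_port_bw + other_ports_bw
peak_fast_segment = PER_PORT_BW * PORTS_FAST_SEGMENT

print(f"effective: {effective:.0f} GB/s of {peak_fast_segment:.0f} GB/s peak")
print(f"utilization: {effective / peak_fast_segment:.0%}")  # 57%
```

This lands at 4/7 of the fast segment's peak, which matches PCPer's "roughly half of peak" wording for the pathological access pattern.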


----------



## PostalTwinkie

Quote:


> Originally Posted by *mtcn77*
> 
> Drops 12% from <3.5GB use to 4GB use, since the slow 8th partition delays the rest. Notice this is the "best case scenario". Nvidia can pull a fast one at any time they wish by choking the driver at the slow partition. Planned obsolescence, ah!
> Pcper


So the benchmarks and performance of the card have changed? OK, give me a link, please.

I was looking into this a lot last night, and I haven't seen a benchmark that shows a change from release to now, except for the improvements via drivers. I would be interested in seeing benchmarks that show the card performing differently than before this architecture knowledge came out.


----------



## iSlayer

Mtcn is up to 51 posts in this thread, more than anyone else.

He still doesn't own a 970.
He still has no reason to have posted so many times.
He is still spreading FUD.
Quote:


> Originally Posted by *Kand*
> 
> The most I've had is a processor with Hyper-Threading disabled. No cores or transistors lasered off.
> 
> Which ones are you talking about?


I'm making reference to different dies being re-used for different Intel products.
Quote:


> Originally Posted by *mtcn77*
> 
> Drops 12% from <3.5GB use to 4GB use, since the slow 8th partition delays the rest. Notice this is the "best case scenario". Nvidia can pull a fast one at any time they wish by choking the driver at the slow partition. Planned obsolescence, ah!
> Pcper


Gonna need some sauce on those claims. I don't trust Nvidia's claim very much, but in the absence of third-party testing I trust them a helluva lot more than a shill.


----------



## 2010rig

Quote:


> Originally Posted by *PostalTwinkie*
> 
> So the benchmarks and performance of the card have changed? OK, give me a link, please.
> 
> I was looking into this a lot last night, and I haven't seen a benchmark that shows a change from release to now, except for the improvements via drivers. I would be interested in seeing benchmarks that show the card performing differently than before this architecture knowledge came out.


That type of logic is uncalled for from his point of view. All that matters is that NVIDIA lied, and he's on a mission to let all know!



Edit: Got ninja'd.


----------



## looniam

"i have a keyboard and internet access so therefore i am entitled to express my opinion."


----------



## EchoOne

Quote:


> Originally Posted by *Serandur*
> 
> Restocking fee?


No restocking fee. I told them the GTX 970 wasn't sold as advertised and was defective from the get-go. Nvidia lied to us.


----------



## michaelius

Quote:


> Originally Posted by *darkwizard*
> 
> Any FCAT testing past 3.5GB done yet? Min FPS and max FPS don't really tell the whole picture, same as what happened with the 7900 series and their CrossFire issue.


Pclab tested Unity and Mordor

http://pclab.pl/art61614.html




All test runs can be seen here with RAM usage:
https://drive.google.com/folderview?id=0ByboKmIPQSAhUWdJUGhsQkdKbnM&usp=sharing


----------



## Kand

Quote:


> Originally Posted by *iSlayer*
> 
> I'm making reference to different dies being re-used for different Intel products..


I'm aware of this, and I steer clear of grossly disabled/defective products.


----------



## nleksan

I don't know if I could consider a game like SoM using the Ultra (uncompressed) textures a reliable, or perhaps more importantly a representative, test of "core power vs. memory ability"...

The most recent build I've done was for a friend who, having finished his neurosurgery residency program, decided (110 percent correctly) that he was due for a massive "guilt-free pleasure purchase". After I introduced him to Folding@home back when I had three highly clocked KPEs folding (our respective careers and education providing at least a modicum of better-than-average understanding of the implications of such an endeavor), he asked me to build him an "absolutely-no-budgetary-constraints, insanely over-the-top PC for gaming at the highest possible visuals, which just as importantly must not sacrifice GPGPU performance, as it will be running Folding@home about 18-20/7/365". CaseLabs case, a borderline-too-intricate loop, two different display setups, a real audio system, and so forth.

As you likely guessed, it consists of a 5960X (@ 4.7-4.8), Rampage V Extreme, 32GB G.Skill DDR4-3200, Xonar STX II, a TH10 decked out to the brim, and every drive that isn't "video playback storage" or "archival data" is SSD (DC S3700 800GB primary; 2x 850 Pro 1TB RAID0, despite my dislike of RAID0 SSDs; 2x 80GB SLC NAND enterprise drives in RAID0 as the temp/swap/page/cache directory; and 3x Samsung 840 Pro 1TB independent of one another; HDDs are 4x WD RE 4TB in RAID10 locally, plus a concurrently built NAS/media server with 8x HGST Ultrastar 4TB in RAID6 via an Areca 1883ix-24i-8G, for expansion and performance that has me reconsidering the value of RAID6 with multiple of the same RAID controller cards), and so on and so forth...

The important part is that, after being very impressed by my KPEs, he asked if there could possibly be anything better. While I would normally say the Titans are extremely niche, as much so as a highly specifically marketed card like the KPE, the fact that Folding@home would be using the significant majority of the computer's time made the enhanced compute capabilities stand out.
Because I am not 100 percent reckless with money (hovering at a mere 99 percent or so), I suggested we get a trio of them, used, from a reputable seller and go from there, given his intention of running anywhere up to 1-3x 4K (40-47") displays AND an infinitesimally more traditional 3x LG 34UM97 34" 3440x1440 Surround setup (more for productivity than anything else), not to mention the NEC 4096x2360(?) medical imaging display he was given so he could review records (scans, etc.) at home prior to surgery (the thing's well into the "areyoufreakingseriousgivemenow!" price range, something like $13k or more?).
So along came three LATE-model Titans, with fewer than 3 hours' use each and a PDF containing detailed rundowns of the 34 benchmarks used to find the clocking ability of each; the seller even offered to fully refund the cards and shipping should they be unable to attain the high but brilliantly documented clocks advertised. At $800 apiece, not cheap, but the deal included 3x truly BNIB Aqua Computer Kryographics full-coverage copper blocks with active backplates and 2x 250x250mm sheets of Fujipoly Extreme 17 W/mK thermal pads (the shipment came directly from Aquatuning to my address; the seller had sold the cards to us before he expected to, so BNIB means full factory seals) worth at least another $250+ per card...
As always, the one place on OCN where everyone forgets their petty bickering and hissy fits is the Classifieds, where, whether out of fear of repercussions, genuine care for the interests of the buyer, or a mix of both, I have come to find that anyone who doesn't go out of their way to ensure the buyer will be pleased is the exception, and even a red flag.

Well, a recurring physical ailment that presents itself as nearly suicide-contemplating levels of pain and necessitates absolutely insane amounts of opiates 10-200x more potent than morphine sulfate (8-18mg IV Dilaudid HP for BTP, 8x stronger; Oxymorphone HCL 10-12x more potent @ 80mg 4x/day oral plus 2-8x/day 10mg IV solution; Actiq 1800ug buccal lozenge Fentanyl Citrate 2-3x/day @ around 80x more potent; trial drug formulation of combination Transdermal Sufentanil 250ug/hr patches and 2500ug ampoules for injection of 200-400ug 2-6x/day, 70-90x potency; and I have been on over 2 dozen others over time, due to the condition being chronic but also cyclical albeit unpredictable in nature; for another point of reference, IV diacetylmorphine is 1.3-1.6x the potency of MSO4 but you likely know the original Bayer trademark name of "Heroin"...), the most important thing is keeping my mind occupied and myself busy, so it took only a week to make this beautiful system.

3x Titans @ 1380/7400, with double the VRAM of my KPEs (the latter at 1420/7800, with their small increase in shaders), and neither system would run SoM Ultra textures without issue. In fact, the performance was identical for all practical purposes...

The next day he ordered 3x Titan Blacks + AC Kryographics blocks/backplates, second-day air, to see what they would do despite the more stringent limitations on clocking via voltage, and once again the performance was unacceptable in this game.
The mutual friend in possession of my former 290X Lightnings (blocks/backplates as well) offered to let us try them; they struggled as much or more at my highest ambient-water clocks of 1340/6000, and even with all three Hailea chillers dropping the water temps to barely above zero (allowing something like 1400/6100) they showed zero improvement.

The Titan Black performance in Folding@home was too compelling to consider anything else, and they demolish every other game we've tried, so they've found a permanent home in this rig.
He of course (of course :S) kept the 3x Titans, running them in a lower-cost system 24/7/365 for Folding@home, and I gave him right of first refusal when my KPEs went to market, but he didn't have another system to use at the time.

My point is, there is ALWAYS going to be that one nitpicky scenario where, no matter what you use, the demands exceed the capabilities of the hardware, and if three of the most powerful cards of recent times struggle (in 3-way SLI/CFX), it is an absolute certainty that the POWERFUL but corner-cut GM204 cards would not succeed where those could not...

I rarely get to build a true no-holds-barred system like this, where the ONLY relevant factor is performance, and even being told "don't even look at the prices, it's irrelevant, I trust your obsessive nature and knowledge/experience to make the right call", I was not going to needlessly spend money that wouldn't serve to benefit his needs (which are more than just those stated prior). My only concern was that on first use, he'd be stuck with a Cheshire grin for months (I imagine patients might find such an expression disconcerting plastered on the face of the man who will be cutting into their brain, lol).

The idea that something able to bring a truly "ultra premium" setup to its knees would be the best thing to use to judge the performance of a significantly cheaper card with lower maximum performance capabilities seems backwards to me...

It's simply delusional to expect a 300 dollar card to do what a trio of cards totalling well over $3000 struggled significantly with...

Bottom line
TL;DR
Cutting to the Chase
Etc

"Proper management of expectations is the only true means of keeping disappointment from permeating its way into every weave of the fabric that makes up your life."


----------



## criminal

Quote:


> Originally Posted by *2010rig*
> 
> I'm kinda worried about him, he's been MIA since this whole issue came up. Hope he's ok, and this didn't drive him over the edge.


He is off somewhere eating crow.


----------



## Woundingchaney

I can confirm that Newegg is accepting returns on the device without restocking fees. I gave them details and showed email content where both Nvidia and Zotac pointed me to place of purchase to address my issue.


----------



## Menta

One million opinions. I think it's time to start using some grey cells, look at the matter more objectively, and ask what the law has to say.

The rest, at the end of the day, does not matter; there can be only one truth.

The right question: was there scamming involved, any sort of crime, or a legal issue not properly disclosed? If not, all is good.


----------



## Serandur

Quote:


> Originally Posted by *Woundingchaney*
> 
> I can confirm that Newegg is accepting returns on the device without restocking fees. I gave them details and showed email content where both Nvidia and Zotac pointed me to place of purchase to address my issue.


Was this also outside of the 30 day return window?


----------



## aDyerSituation

I was honestly thinking about switching back to the green team, but guess not now lollll.


----------



## skupples

Quote:


> Originally Posted by *aDyerSituation*
> 
> I was honestly thinking about switching back to the green team, but guess not now lollll.


right, because the action of nerfing a card that is meant to be nerfed is so bad.









don't get me wrong, they lied, but Nvidia's ability to hoodwink so many people so easily is actually becoming entertaining at this point in the timeline.

people really thought they were getting a 2FPS slower 980 for $220 less.









nopes, you got a 1080P 980.

this is why my slogan for almost 2 years now has been "pascal 390x or bust"; though 2 years ago it was "volta or 9xxx or bust". Same things, different names.


----------



## Xoriam

Man despite all of the lies these cards still rock.
It curbstomps my old 7870xt tahiti.
I'm using them in SLI for 4k gaming, and I've got nothing to complain about.
Apart from drivers.


----------



## aDyerSituation

Quote:


> Originally Posted by *skupples*
> 
> right, because the action of nerfing a card that is meant to be nerfed is so bad.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> don't get me wrong, they lied, but Nvidia's ability to hoodwink so many people so easily is actually becoming entertaining at this point in the timeline.
> 
> people really thought they were getting a 2FPS slower 980 for $220 less.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nopes, you got a 1080P 980.
> 
> this is why my slogan for almost 2 years now has been "pascal 390x or bust"; though 2 years ago it was "volta or 9xxx or bust". Same things, different names.


I just don't want to support a company that knowingly lies about the capabilities of their product. And it is lying, in my book.

I guess I will just pay $50 less for almost the same performance and the advertised 4 gigs of VRAM (cough, 290X).


----------



## iSlayer

Quote:


> Originally Posted by *Serandur*
> 
> Was this also outside of the 30 day return window?


Very interested, also what's the window on the return from Newegg if they are doing special returns?

Oh who am I kidding, the 380x/980 Ti won't be here any time soon...I should get a temporary 750 Ti if I do this.
Quote:


> Originally Posted by *aDyerSituation*
> 
> I just don't want to support a company that knowingly lies about the capabilities of their product. And it is lying, in my book.
> 
> I guess I will just pay $50 less for almost the same performance and the advertised 4 gigs of VRAM (cough, 290X).


More than fair, if you don't want to support the lies.


----------



## mtcn77

Quote:


> Originally Posted by *iSlayer*
> 
> Very interested, also what's the window on the return from Newegg if they are doing special returns?
> 
> Oh who am I kidding, the 380x/980 Ti won't be here any time soon...I should get a temporary 750 Ti if I do this.
> More than fair, if you don't want to support the lies.


Why would you want to support the lies?


----------



## Serandur

Quote:


> Originally Posted by *skupples*
> 
> right, because the action of nerfing a card that is meant to be nerfed is so bad.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> don't get me wrong, they lied, but Nvidia's ability to hoodwink so many people so easily is actually becoming entertaining at this point in the timeline.
> 
> people really thought they were getting a 2FPS slower 980 for $220 less.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nopes, you got a 1080P 980.
> 
> this is why my slogan for almost 2 years now has been "pascal 390x or bust"; though 2 years ago it was "volta or 9xxx or bust". Same things, different names.


What are you on about...

First off, microprocessors aren't comparable by fixed amounts like "2 FPS"; it's percentages, and the 980 as marketed was shown to have ~20% (edit: 23% to be exact) more shader cores and TMUs than the 970. And yes, frankly, it was perfectly logical to assume, in the absence of any indication from Nvidia, reviewers, or the presented specifications of the 970, that the VRAM amount/speed would not be an issue.
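
A quick sanity check on that percentage, using the commonly listed unit counts (2048 CUDA cores/128 TMUs on the 980 versus 1664/104 on the 970):

```python
# Relative shader/TMU advantage of the GTX 980 over the GTX 970,
# computed from the publicly listed unit counts.
cores_980, cores_970 = 2048, 1664
tmus_980, tmus_970 = 128, 104

core_gap = cores_980 / cores_970 - 1
tmu_gap = tmus_980 / tmus_970 - 1

print(f"CUDA core advantage: {core_gap:.1%}")  # 23.1%
print(f"TMU advantage: {tmu_gap:.1%}")         # 23.1%
```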

The GTX 570 had less VRAM than the 580, but that was apparently clear as per the official specifications. The 660Ti had an odd memory configuration, but this too was made apparent by the specifications and reviewers.

But the GTX 670 was literally a 680 with 1/8 of its shader clusters disabled, and that's it; it had the full memory bus and VRAM pool, completely uninhibited, for much cheaper than the 680. And the 780 was in a similar situation vs. the 780 Ti. So too were the 7950 and 7970... and the 290 and the 290X. Even the original Titan itself was a slightly cut-down version of GK110, but that had no bearing whatsoever on its memory performance. Yes, it was perfectly natural to assume that the differences between the 970 and the 980 came down only to shaders/TMUs and nothing else, given the way reviewers presented the cards, the way Nvidia presented its cards, and the precedent set by the immediate predecessors of the second-tier, slightly cut-down version of the current top chip that the 970 is.

You're simply applying a very faulty hindsight bias here, there was no reason to foresee the 970 having memory issues of any kind based on any disclosed information or past precedent. If there was, we wouldn't be having this issue. It's simple.


----------



## Silent Scone

Quote:


> Originally Posted by *aDyerSituation*
> 
> I just don't want to support a company that knowingly lies about the capabilities of their product. And it is lying, in my book.
> 
> I guess I will just pay $50 less for almost the same performance and the advertised 4 gigs of VRAM (cough, 290X).


Then may I suggest you don't buy anything ever on that basis


----------



## criminal

Quote:


> Originally Posted by *iSlayer*
> 
> Very interested, also what's the window on the return from Newegg if they are doing special returns?
> 
> Oh who am I kidding, the 380x/980 Ti won't be here any time soon...I should get a temporary 750 Ti if I do this.
> More than fair, if you don't want to support the lies.


Since it looks like we will not be getting any new cards for a bit, I am buying a 780 Ti off a friend for $200. If/when I decide to sell it for him, whatever I get over the $200 goes back to him. Seems like a sweet deal to hold me over until the 380X comes out. I won't be an early adopter of Nvidia products for a while after this little stunt.


----------



## 2010rig

Quote:


> Originally Posted by *looniam*
> 
> "i have a keyboard and internet access so therefore i am entitled to express my opinion."


Especially when the company I hate finally gives me some ammunition!
Quote:


> Originally Posted by *Woundingchaney*
> 
> I can confirm that Newegg is accepting returns on the device without restocking fees. I gave them details and showed email content where both Nvidia and Zotac pointed me to place of purchase to address my issue.


I'm just curious, what made you return the cards? And what are you going with next?

And off topic: which LG TV do you have?
Quote:


> Originally Posted by *Menta*
> 
> One million opinions. I think it's time to start using some grey cells, look at the matter more objectively, and ask what the law has to say.
> 
> The rest, at the end of the day, does not matter; there can be only one truth.
> 
> The right question: was there scamming involved, any sort of crime, or a legal issue not properly disclosed? If not, all is good.


I suggest everybody with doubts read and understand this:
http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Discloses-Full-Memory-Structure-and-Limitations-GTX-970

I didn't start commenting until I had a better understanding of what was going on.

*If* we are to believe NVIDIA (I don't), marketing did not get the correct specs from engineering. I find it hard to believe that no one noticed this for 4 months.

The only scam would be if they claimed *X performance in reviews*, while delivering something completely different IRL. But that's not the case here.

The performance of the card is as it should be; the 8 missing ROPs would be useless even if they were present.

There's no denying that the 2 separate partitions do affect performance to a degree, and perhaps they went this route in order to drive a bigger performance gap between the 970 & 980, and that *$220 difference*.

Either way, since the 970 delivers the performance that was promised in reviews, there's hardly anything to complain about.

Besides, it sounds like refunds are being honored, so those who are that bothered by it are able to return their cards.

I'm just curious to know what they're moving to next.








Quote:


> Originally Posted by *aDyerSituation*
> 
> I just don't want to support a company that knowingly lies about the capabilities of their product. And it is lying, in my book.
> 
> I guess I will just pay $50 less for almost the same performance and the advertised 4 gigs of VRAM (cough, 290X).


A card that consumes more power, puts out more heat, is much louder & delivers the same performance. But hey, it has 4 GB unlike the 970.


----------



## criminal

Quote:


> Originally Posted by *2010rig*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Especially when the company I hate finally gives me some ammunition!
> I'm just curious, what made you return the cards? And what are you going with next?
> 
> And off topic: which LG TV do you have?
> I suggest everybody with doubts to read and understand this:
> http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Discloses-Full-Memory-Structure-and-Limitations-GTX-970
> 
> I didn't start commenting until I had a better understanding of what was going on.
> 
> *If* we are to believe NVIDIA (I don't), marketing did not get the correct specs from engineering. I find it hard to believe that no one noticed this for 4 months.
> 
> The only scam would be if they claimed *X performance in reviews*, while delivering something completely different IRL. But that's not the case here.
> 
> The performance of the card is as it should be; the 8 missing ROPs would be useless even if they were present.
> 
> There's no denying that the 2 separate partitions do affect performance to a degree, and perhaps they went this route in order to drive a bigger performance gap between the 970 & 980, and that *$220 difference*.
> 
> Either way, since the 970 delivers the performance that was promised in reviews, there's hardly anything to complain about.
> 
> Besides, it sounds like refunds are being honored, so those who are that bothered by it are able to return their cards.
> 
> 
> *I'm just curious to know what they're moving to next.*


Well, being this far into the 970 life cycle (4 months) gets people that much closer to the 380/390X and GM200. I say buy a cheap second-hand card and wait it out: a 290 for under $300, or a 980 for $550 if they can't wait.


----------



## 2010rig

Quote:


> Originally Posted by *criminal*
> 
> Well, being this far into the 970 life cycle (4 months) gets people that much closer to the 380/390X and GM200. I say buy a cheap second-hand card and wait it out: a 290 for under $300, or a 980 for $550 if they can't wait.


Hey, maybe this will force NVIDIA's hand to release GM200 sooner.


----------



## Menta

970gtx 

pulled for review


----------



## Silent Scone

I doubt it; word on the street is GM200 is reference only, which means one thing: top dollar. Highly unlikely many people who bought 970s will look to be spending anywhere near that much.

Lol at the Amazon link. That's rich in itself, seeing as they take the part number from the first person to list something, so if that listing is at all wrong you've no choice but to use said template. There are hundreds of incorrectly listed items on there.


----------



## iSlayer

Quote:


> Originally Posted by *mtcn77*
> 
> Why would you want to support the lies?


Well, some people don't really care, they just want a good GPU.

It's a good reason not to buy a 970 because Nvidia lied or Nvidia messed up.

It's a good reason to buy a 970 because it meets your needs and price point.

Acting on principle isn't necessarily in one's best interest, but it's their decision, and these are good reasons to justify a choice.
Quote:


> Originally Posted by *criminal*
> 
> Since it looks like we will not be getting any new cards for a bit, I am buying a 780 Ti off a friend for $200. If/when I decide to sell it for him, whatever I get over the $200 goes back to him. Seems like a sweet deal to hold me over until the 380X comes out. I won't be an early adopter of Nvidia products for a while after this little stunt.


I'm jealous, personally. Wish I had an option like that, but no, my 970 will be staying with me till the cut-down or fully enabled Titan X/980 Ti/whatever and the 380X drop.

Flagships on my mindddddd
Quote:


> Originally Posted by *criminal*
> 
> Well this far into the 970 life cycle (4 months) gets people that much closer to 380/390x and GM200. I say buy a cheap second hand card and wait it out. 290 for under $300 and 980 for $550 if they can't wait.


I bet some like me bought the 970 as a stopgap for the next flagships.


----------



## Wirerat

Quote:


> Originally Posted by *Silent Scone*
> 
> I doubt it; word on the street is GM200 is reference only, which means one thing: top dollar. Highly unlikely many people who bought 970s will look to be spending anywhere near that much.


GM200 drops for $600, then a revamped 980 becomes the new xx70 and sells for $399.


----------



## Heavy MG

Quote:


> Originally Posted by *Woundingchaney*
> 
> I can confirm that Newegg is accepting returns on the device without restocking fees. I gave them details and showed email content where both Nvidia and Zotac pointed me to place of purchase to address my issue.


Was your 970 outside of Newegg's 30-day return policy? I'm wondering about this too; I bought my G1 back in mid-December.


----------



## mercs213

Quote:


> Originally Posted by *Menta*
> 
> 970gtx
> 
> pulled for review


Damn, can you provide a direct URL to the product?


----------



## Serandur

Quote:


> Originally Posted by *iSlayer*
> 
> I bet some like me bought the 970 as a stopgap for the next flagships.


Ditto; I thought of it as a more efficient 780 with an extra GB of VRAM to hold me over until I needed a performance boost, and sold my 780 to get it. Then I got suckered into a second one, which was also a stopgap until more powerful single GPUs arrive.

But SLI sucks and I miss my 780. I swear it even slightly outperformed my current 970 in a few mainly high-resolution/downsampling situations.


----------



## PureBlackFire

Quote:


> Originally Posted by *mercs213*
> 
> Damn, can you provide a direct URL to the product?


As far as I can tell, it's only the Asus model.


----------



## 2010rig

Quote:


> Originally Posted by *PureBlackFire*
> 
> As far as I can tell, it's only the Asus model.


Well that didn't last long, the listing is back.


----------



## sugarhell

Quote:


> Originally Posted by *2010rig*
> 
> Well that didn't last long, the listing is back.


Still under review for me


----------



## Dimaggio1103

Even more glad I went with the 290. Got it for around 60 bucks less, it actually has 4GB, and it performs roughly the same at stock. Thanks, AMD.


----------



## Xoriam

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Even more glad I went with the 290. Got it for around 60 bucks less, it actually has 4GB, and it performs roughly the same at stock. Thanks, AMD.


I probably would have gone with a 290x if it didn't cost 100-200€ more than a 970 where I live.


----------



## Serandur

Has this been posted yet?





Refusal to go over 3.5 GB on the 970 and major frametime spikes, check. The 980 using 4 GB at the same settings, not having spikes like that, and all while being downclocked to simulate ~970 performance, check.
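
For anyone wanting to quantify spikes like that in their own captures, a rough sketch of the usual approach (the `capture` list here is made-up data; real FCAT or FRAPS frametimes would be parsed from the tool's CSV output):

```python
# Sketch: how "frametime spikes" in FCAT/FRAPS-style captures are
# typically quantified - flag frames that take much longer than the
# median frame of the run.
from statistics import median

def spike_frames(frametimes_ms, factor=2.0):
    """Return indices of frames whose time exceeds factor x the median."""
    m = median(frametimes_ms)
    return [i for i, t in enumerate(frametimes_ms) if t > factor * m]

# A smooth ~60 FPS capture with two hitches, e.g. VRAM paging past 3.5GB.
capture = [16.7] * 20 + [70.0] + [16.7] * 20 + [95.0] + [16.7] * 20

spikes = spike_frames(capture)
print(f"{len(spikes)} spike(s) at frames {spikes}")  # 2 spike(s) at frames [20, 41]
```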


----------



## PureBlackFire

Quote:


> Originally Posted by *sugarhell*
> 
> Still under review for me


same.


----------



## Xoriam

Quote:


> Originally Posted by *Serandur*
> 
> Has this been posted yet?


You can expect roughly the same single-card performance at 4K without AA, btw.

And what the hell are those crappy clocks?


----------



## criminal

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Even more glad I went with the 290. got it for around 60 bucks less, *actually has 4GB* and performs roughly the same at stock. Thanks AMD


LOL... see what Nvidia has done to its own card! People read what they want to read. (Not a jab at you.) Nvidia has now created a PR issue they are going to have to find a way to rectify. As others have said, the 970's performance has not changed, but the perception has. Not only does Nvidia come off looking like a liar (they are), their product no longer really has 4GB of VRAM... lol.
Quote:


> Originally Posted by *Serandur*
> 
> Has this been posted yet?
> 
> 
> 
> 
> 
> Refusal to go over 3.5 GB on the 970 and major frametime spikes, check. The 980 using 4 GB at the same settings, not having spikes like that, and all while being downclocked to simulate ~970 performance, check.


Good find.


----------



## Serandur

Quote:


> Originally Posted by *Xoriam*
> 
> You can expect roughly the same single-card performance at 4K without AA, btw.
> 
> And what the hell are those crappy clocks?


Those clocks are the 980 being underclocked to simulate ~970 performance and eliminate any confounding variable of "because the 980 is faster". The memory is the issue.


----------



## skupples

Quote:


> Originally Posted by *Serandur*
> 
> What are you on about...
> 
> First off, microprocessors aren't comparable by fixed amounts like "2 FPS"; it's percentages, and the 980 as marketed was shown to have ~20% (edit: 23% to be exact) more shader cores and TMUs than the 970. And yes, frankly, it was perfectly logical to assume, in the absence of any indication from Nvidia, reviewers, or the presented specifications of the 970, that the VRAM amount/speed would not be an issue.
> 
> The GTX 570 had less VRAM than the 580, but that was apparently clear as per the official specifications. The 660Ti had an odd memory configuration, but this too was made apparent by the specifications and reviewers.
> 
> But the GTX 670 was literally a 680 with 1/8 of its shader clusters disabled, and that's it; it had the full memory bus and VRAM pool, completely uninhibited, for much cheaper than the 680. And the 780 was in a similar situation vs. the 780 Ti. So too were the 7950 and 7970... and the 290 and the 290X. Even the original Titan itself was a slightly cut-down version of GK110, but that had no bearing whatsoever on its memory performance. Yes, it was perfectly natural to assume that the differences between the 970 and the 980 came down only to shaders/TMUs and nothing else, given the way reviewers presented the cards, the way Nvidia presented its cards, and the precedent set by the immediate predecessors of the second-tier, slightly cut-down version of the current top chip that the 970 is.
> 
> You're simply applying a very faulty hindsight bias here, there was no reason to foresee the 970 having memory issues of any kind based on any disclosed information or past precedent. If there was, we wouldn't be having this issue. It's simple.


It might be faulty but it's not a bias. I just don't like 104/204 cards. That's why it seems I'm not really that surprised by NV's actions.

Wood screws. Bad VRMs. Lacking bandwidth in the form of memory. Oh, and who can forget the "well, we magically reduced bandwidth need by 20%, so 256-bit is a non-issue!" Then they turn around with the "oh, btw, this is best suited for GK104 upgrades, but also AMAZING for 4K."

See, I stopped trusting review sites long ago, as their results and real-world results continue to grow apart, especially now that it's blatantly obvious that they all/most get cherry-picked samples.


----------



## Serandur

Quote:


> Originally Posted by *skupples*
> 
> *It might be faulty but it's not a bias.* I just don't like 104/204 cards. That's why it seems I'm not really that surprised by NV's actions.
> 
> Wood screws. Bad VRMs. Lacking bandwidth in the form of memory. Oh, and who can forget the "well, we magically reduced bandwidth need by 20%, so 256-bit is a non-issue!" Then they turn around with the "oh, btw, this is best suited for GK104 upgrades, but also AMAZING for 4K."
>
> See, I stopped trusting review sites long ago, as their results and real-world results continue to grow apart, especially now that it's blatantly obvious that they all/most get cherry-picked samples.


Hindsight bias, mate, not a bias against the mid-range chips.

http://en.wikipedia.org/wiki/Hindsight_bias

Basically, there was no way for people to reasonably predict the 970s would have any such memory issues.

Now, a discussion on Nvidia selling x04 chips as high-end being sleazy is another discussion and one I would personally agree with you on.


----------



## Woundingchaney

Yes I purchased the cards in November.


----------



## Serandur

Quote:


> Originally Posted by *Woundingchaney*
> 
> Yes I purchased the cards in November.


Fantastic, thanks for sharing. One of my 970s was from November. Is it better to call or email?


----------



## criminal

Quote:


> Originally Posted by *skupples*
> 
> It might be faulty but it's not a bias. I just don't like 104/204 cards. That's why it seems I'm not really that surprised by NV's actions.
> 
> Wood screws. Bad VRMs. Lacking bandwidth in the form of memory. Oh, and who can forget the "well, we magically reduced bandwidth need by 20%, so 256-bit is a non-issue!" Then they turn around with the "oh, btw, this is best suited for GK104 upgrades, but also AMAZING for 4K."
>
> See, I stopped trusting review sites long ago, as their results and real-world results continue to grow apart, especially now that it's blatantly obvious that they all/most get cherry-picked samples.


But review sites didn't make all those claims; Nvidia did. Nvidia got caught red-handed this time, gimping a card more than they first showed. Still don't understand why they did this, other than trying to make the 970 look better. But performance would have spoken for itself.


----------



## Woundingchaney

Quote:


> Originally Posted by *Serandur*
> 
> Fantastic, thanks for sharing. One of my 970s was from November. Is it better to call or email?


I used the live chat online option.

Send thanks to EchoOne


----------



## PostalTwinkie

Quote:


> Originally Posted by *Serandur*
> 
> Hindsight bias, mate, not a bias against the mid-range chips.
> 
> http://en.wikipedia.org/wiki/Hindsight_bias
> 
> Basically, there was no way for people to reasonably predict the *970s would have any such memory issues.*
> 
> Now, a discussion on Nvidia selling x04 chips as high-end being sleazy is another discussion and one I would personally agree with you on.


What is the issue? Seriously, what is the issue?

There is a MARKETING issue, and Nvidia done goofed on what specs they printed on the box. That doesn't mean there is an issue with the card; it works fine, damn fine as a matter of fact. The performance of the card isn't changing, it is the same as it was the day you bought it - if not better via driver updates.

There is no technical/hardware/performance/physical issue here, it is purely an advertising/marketing issue with specs on a box. The 970 is still an amazing card.


----------



## Xoriam

Quote:


> Originally Posted by *criminal*
> 
> But review sites didn't make all those claims; Nvidia did. Nvidia got caught red-handed this time, gimping a card more than they first showed. Still don't understand why they did this, other than trying to make the 970 look better. But performance would have spoken for itself.


Exactly!








The card is still a monster, even with these lies.
If they had advertised it as it really is, reviewers would have seen that it's still within 10% of the 980 (which hasn't changed, because the card still performs the same as it did last week).
Same income, without the hassle.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Xoriam*
> 
> Exactly!
> 
> 
> 
> 
> 
> 
> 
> 
> The card is still a monster, even with these lies.
> *If they had advertised it as it really is,* reviewers would have seen that it's still within 10% of the 980 (which hasn't changed, because the card still performs the same as it did last week).
> Same income, without the hassle.


Bingo.

This is a poor decision on branding/marketing, nothing more. Nvidia would have been better off advertising this as a 3.5 GB card with an additional 500 MB of RAM, with a marketing spin on it, for the total 4 GB of VRAM. The numbers would have spoken for it just the same, still a great card.
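Taking NVIDIA's statement at face value, the segmentation behaves something like this toy model: allocations fill the fast 3.5 GB partition first and spill into the 0.5 GB partition only once the fast one is full. Purely illustrative; the real driver's placement logic is obviously far more involved.

```python
# Toy model of the 3.5 GB / 0.5 GB split as NVIDIA's statement describes it:
# allocations land in the fast partition until it's full, then spill into
# the slow one. Purely illustrative; the real driver is far more sophisticated.

FAST_MB, SLOW_MB = 3584, 512  # 3.5 GB fast segment, 0.5 GB slow segment

def place_allocations(sizes_mb):
    """Greedily place allocations, preferring the fast segment."""
    fast_used = slow_used = 0
    placements = []
    for size in sizes_mb:
        if fast_used + size <= FAST_MB:
            fast_used += size
            placements.append("fast")
        elif slow_used + size <= SLOW_MB:
            slow_used += size
            placements.append("slow")
        else:
            placements.append("fail")
    return placements, fast_used, slow_used

# A 3 GB working set never touches the slow segment...
print(place_allocations([1024, 1024, 1024]))
# ...but pushing past 3.5 GB spills into it.
print(place_allocations([1024, 1024, 1024, 512, 256]))
```

Which matches what monitoring tools report: games under 3.5 GB only ever see the fast partition, and only past that does the slow segment come into play.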


----------



## Serandur

Quote:


> Originally Posted by *PostalTwinkie*
> 
> What is the issue? Seriously, what is the issue?
> 
> There is a MARKETING issue, and Nvidia done goofed on what specs they printed on the box. That doesn't mean there is an issue with the card; it works fine, damn fine as a matter of fact. The performance of the card isn't changing, it is the same as it was the day you bought it - if not better via driver updates.
> 
> There is no technical/hardware/performance/physical issue here, it is purely an advertising/marketing issue with specs on a box. The 970 is still an amazing card.


The issue is exactly what people have been saying it is and what there is plenty of anecdotal and now some benchmarked data regarding. The performance isn't changing, no, but it is clearly worse than we were led to believe. The issue is real, the product was sold on misrepresentation of its specifications (a big no-no in the legal world; think fraud), and the 970s are performing worse than they should. The issue is even more exacerbated for SLI owners.

I'm not sure what your angle is, but you're simply obfuscating the issue. Marketing issue with regard to a hardware shortcoming; it doesn't matter what you call it. It is an issue. Look at the frametimes, there's the issue. The 970s do perform significantly worse in VRAM-demanding scenarios. It's called stuttering. The average FPS in those two runs is within 7%, and yet look at those massive and frequent frametime spikes on the 970, not on the 980.


----------



## adi518

I just got updated on this fiasco. I bought two 970s in November, nearly $800 including extra expenses. Made a meticulous effort to buy them from the non-international-friendly folks at Newegg, then have them imported here. This is really pissing me off, and not for the 'negligible' performance difference, but for the immediate drop in value because of this. NOT okay.


----------



## FlyingSolo

So what will Nvidia do now? Will EVGA, MSI, Asus, etc. give out full refunds? Or will they all just stay quiet and do nothing, like nothing has happened at all?


----------



## PostalTwinkie

Quote:


> Originally Posted by *Serandur*
> 
> The issue is exactly what people have been saying it is and what there is plenty of anecdotal and now some benchmarked data regarding. The performance isn't changing, no, *but it is clearly worse than we were led to believe.* The issue is real, the product was sold on misrepresentation of its specifications (a big no-no in the legal world; think fraud), and the 970s are performing worse than they should. The issue is even more exacerbated for SLI owners.
>
> I'm not sure what your angle is, but you're simply obfuscating the issue. Marketing issue with regard to a hardware shortcoming; it doesn't matter what you call it. It is an issue. Look at the frametimes, there's the issue. The 970s do perform significantly worse in VRAM-demanding scenarios. It's called stuttering. The average FPS in those two runs is within 7%, and yet look at those massive and frequent frametime spikes on the 970, not on the 980.


It is worse than you were led to believe? First you say performance hasn't changed, then you say it is worse than what you were led to believe... make up your mind.

You bought the card, and you (should have) read reviews and performance benchmarks of the card, before buying it. Those numbers haven't changed, those benchmarks are still valid, the information you used in your decision making process isn't different - with the exception of numbers on the side of the box. Which, if the performance isn't different, then why care what the numbers on the side of the box say?

I fully understand, get, and back a refund for anyone that purchased a 970 and wants to return it, just on the grounds that the information they put on the box was wrong. However, that doesn't change the fact that the card is still the card, and that it still performs the same, regardless of what is on the outside of the box. That this hasn't tainted performance in any way; just imagine.
Quote:


> Originally Posted by *adi518*
> 
> I just got updated on this fiasco. I bought two 970s in November. Made a meticulous effort to buy them from the non-international-friendly folks at Newegg, then have them imported here. This is really pissing me off, and not for the 'negligible' performance difference, but for the immediate drop in value because of this. NOT okay.


Don't even attempt to use the retained value/resell argument with this.

You do not, ever, buy hardware and consider resale value as a fixed criteria. If you manage to resell your old parts, cool, awesome, but don't bank on that happening.

I would also, before you get too overly worked up about this, really look into what is happening. You will find that there isn't much to get worked up over.
Quote:


> Originally Posted by *FlyingSolo*
> 
> So what will Nvidia do now? Will EVGA, MSI, Asus, etc. give out full refunds? Or will they all just stay quiet and do nothing, like nothing has happened at all?


I would imagine it will go like this...


Issue apology letter - see what happens.
Offer free game coupon if above didn't work.
Offer refund/exchange if above didn't work.


----------



## skupples

I have yet to see this mass sell-off of super cheap chips on eBay. Maybe it hasn't hit the USA yet, but I would love a link to the super-undercut-priced 970s people keep mentioning. Would go nicely in this little server I'm building.


----------



## cq3mrd

People worrying if GoldenTiger is ok can rest easy. He's shi... I mean chilling over at hardOCP's thread (http://hardforum.com/showthread.php?t=1849838) about this issue. I guess that's where the cool cats are hanging.


----------



## Xoriam

Quote:


> Originally Posted by *skupples*
> 
> I have yet to see this mass sell-off of super cheap chips on eBay. Maybe it hasn't hit the USA yet, but I would love a link to the super-undercut-priced 970s people keep mentioning. Would go nicely in this little server I'm building.


Lol, I haven't seen a single card being sold for more than €20 less than new.


----------



## Serandur

Quote:


> Originally Posted by *PostalTwinkie*
> 
> It is worse than you were lead to believe? First you say performance hasn't changed, then you say it is worse than what you were lead to believe....make up your mind.
> 
> You bought the card, and you (should have) read reviews and performance benchmarks of the card, before buying it. Those numbers haven't changed, those benchmarks are still valid, the information you used in your decision making process isn't different - with the exception of numbers on the side of the box. Which, if the performance isn't different, then why care what the numbers on the side of the box say?
> 
> I fully understand, get, and back a refund for anyone that purchased a 970 and wants to return it, just on the grounds that the information they put on the box was wrong. However, that doesn't change the fact that the card is still the card, and that it still performs the same, regardless of what is on the outside of the box. That this hasn't tainted performance in any way; just imagine.
> Don't even attempt to use the retained value/resell argument with this.
> 
> You do not, ever, buy hardware and consider resale value as a fixed criteria. If you manage to resell your old parts, cool, awesome, but don't bank on that happening.
> 
> I would also, before you get too overly worked up about this, really look into what is happening. You will find that there isn't much to get worked up over.


Yes, it is worse than what *limited* benchmarks testing and reporting on *limited* scenarios with a *limited* representation of how 970s perform in failing to use frametime graphs in VRAM-limited scenarios implied. The performance of the card hasn't changed because it's a physical product that's obviously the same as it always was, but representation of its performance was not adequate enough to highlight the issue at hand.

VRAM stuttering does not manifest itself in average FPS outside of extreme circumstances, and obviously it doesn't present itself in a situation not pushing on the 970s' 3.5 GB soft limit. What exactly are you trying to say? It is you who is not being clear. Are you saying VRAM amount says nothing about performance in any situation? Because that's not true: a GPU needs a certain amount of onboard, high-speed memory to store data, and lacking that, you will get worse performance, as people are, but not worse performance properly reflected in any reviews at the time of the 970s' launch. Just look at the frametime graphs, it's blatantly clear what's going on. The same type of tests were not present at the 970s' launch, and it was immediately clear to me upon getting my second 970 and trying to push settings that something was wrong.

My 970s are not performing as true 4 GB cards as we were led to believe and there are plenty of scenarios where I do get that stuttering. Performance is suffering.


----------



## skupples

Quote:


> Originally Posted by *Xoriam*
> 
> Lol, I haven't seen a single card being sold for more than €20 less than new.


Just seems like parroting. I've asked twice now, still no proof.

Also trying to figure out why people with the issues aren't reporting that they've forwarded data to NV / Anand. Both are requesting evidence where the memory becomes the issue before the core.

I'm now reading that SLI makes it even worse, which is hard for me to swallow as a long-time multi-GPU user. Stuttering has done nothing but get worse over the last two years.


----------



## Cakewalk_S

Hm, strange, my GTX 970 Strix easily gets over 3.5 GB on 4K DSR. I can get up to around 3.8 GB.


----------



## adi518

@PostalTwinkie I'm not sure I get what you mean. If the card had X resale value, it now has half of that. Simple as that. Banking? What does that have to do with it? They advertised X, reality shows Y. They made a mistake, they admitted it, they now need to pay up... it's not enough to say "Sorry, we screwed up". I don't want a freaking game coupon. Soon, perhaps in less than a year, more people will move to 4K and experience these issues. But again, it's more about hardware that just lost a big portion of its value. Reminds me of Apple's fiasco, when they knocked $100 off one of their iPhone iterations very soon after it was released. Not quite the same thing, but customers did receive compensation to make it just.


----------



## Xoriam

Quote:


> Originally Posted by *Cakewalk_S*
> 
> Hm, strange, my GTX 970 Strix easily gets over 3.5 GB on 4K DSR. I can get up to around 3.8 GB.


That's because the full 4 GB is usable. I go above 4000 MB daily in ACU with my 970s.


----------



## criminal

Quote:


> Originally Posted by *adi518*
> 
> @PostalTwinkie I'm not sure I get what you mean. If the card had X resale value, it now has half of that. Simple as that. Banking? What does that have to do with it? They advertised X, reality shows Y. Soon, perhaps in less than a year, more people will move to 4K and experience these issues. But again, it's more about hardware that just lost a big portion of its value. Reminds me of Apple's fiasco, when they knocked $100 off one of their iPhone iterations very soon after it was released. Customers received compensation to make it just.


You should never buy based on possible resale value, but I agree that this little tidbit of information we now know about the 970 could impact resale value a bit. I know that the "worth" of the card really hasn't changed, but the perception of the card being different from its listed specs will weigh on some people's purchasing decisions.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Serandur*
> 
> Yes, it is worse than what *limited* benchmarks testing and reporting on *limited* scenarios with a *limited* representation of how 970s perform in failing to use frametime graphs in VRAM-limited scenarios implied. The performance of the card hasn't changed because it's a physical product that's obviously the same as it always was, but representation of its performance was not adequate enough to highlight the issue at hand.
> 
> VRAM stuttering does not manifest itself in average FPS outside of extreme circumstances and obviously it doesn't present itself in a situation not pushing on the 970s' 3.5 GB soft limit. What exactly are you trying to say, it is you who is not being clear. Are you saying VRAM amount says nothing about performance in any situation? Because that's not true, you need onboard, high-speed memory on a GPU of a certain amount to store data and lacking that, you will get worse performance as people are, but not worse performance properly reflected in any reviews at the time of the 970s' launch. Just look at the frametime graphs, it's blatantly clear what's going on.


I am well aware of stutter, I survived the AMD 7000 series Crossfire Stutterpocalypse.

Also, frame times have been tested, frame times delivered by the card were tested at release. PCPer has them along with others.

_The performance of these cards has been fully vetted across a wide range of scenarios - even some obscure ones._

All you are doing is regurgitating what you heard online about how access to that last 500 MB is slower, and could impact frame times of a game. You are doing this without realizing that the game engine and OS aren't even going to address the last 500 MB by itself, due to how it is sectioned off. That it seems to act more as a page file than anything, and that it is still insanely fast compared to system RAM.

In other words: the performance and frame times the card had prior to this new knowledge are still the same, and haven't changed. The slower bit of VRAM on the ass end of the card has been averaged in and factored into the performance reviews and data we have available! The OS and game engine can see it is slower, and use it accordingly.
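On the "insanely fast compared to system RAM" point, some back-of-the-envelope numbers. The advertised 224 GB/s comes from 7 Gbps effective GDDR5 on a 256-bit bus; if the back 0.5 GB really were limited to a single 32-bit channel (an assumption for illustration, nothing official), it would still sit well ahead of PCIe 3.0 x16 to system RAM:

```python
# Back-of-the-envelope peak bandwidth figures in GB/s. The 970's advertised
# number comes from 7 Gbps effective GDDR5 on a 256-bit bus. The single
# 32-bit channel figure for the slow segment is an ASSUMPTION for
# illustration, not a confirmed spec.

def gddr5_bandwidth_gbs(effective_gbps, bus_width_bits):
    """Peak GB/s = per-pin data rate (Gb/s) * bus width (bits) / 8."""
    return effective_gbps * bus_width_bits / 8

print(gddr5_bandwidth_gbs(7.0, 256))  # 224.0 - full advertised bus
print(gddr5_bandwidth_gbs(7.0, 32))   # 28.0  - one hypothetical 32-bit channel
print(15.75)                          # approx. PCIe 3.0 x16 to system RAM, GB/s
```

So even a heavily cut-down segment would beat going out over the bus to system memory, which is presumably what the "page file" comparison is getting at.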


----------



## TopicClocker

Holy crap, this thread has exploded!

I've read all of the posts in this thread.

This is crazy!


----------



## adi518

@criminal Strange assumption. I do not purchase brand-new GPUs based on their resale value, especially because I intend to keep my upgrade for 2-3 years before letting it go for a new gen. However, it is quite obvious now that this will damage the market for the 970, both for retailers and classifieds. Doesn't matter if the performance difference is tiny or huge. It's a defect, and the news about this spread like wildfire.


----------



## moustang

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Bingo.
> 
> This is a poor decision on branding/marketing, nothing more. Nvidia would have been better off advertising this as a 3.5 GB card with an additional 500 MB of RAM, with a marketing spin on it, for the total 4 GB of VRAM. The numbers would have spoken for it just the same, still a great card.


If that's all it took to make a false advertising claim then just imagine how many Dell owners could sue because their PC with 8GB of RAM is really two 4GB sticks of RAM in different slots.


----------



## Xoriam

Quote:


> Originally Posted by *adi518*
> 
> @criminal Strange assumption. I do not purchase brand-new GPUs based on their resale value, especially because I intend to keep my upgrade for 2-3 years before letting it go for a new gen. However, it is quite obvious now that this will damage the market for the 970, both for retailers and classifieds. Doesn't matter if the performance difference is tiny or huge. It's a defect.


Why would you not take resale value into consideration??
That's just absurd.


----------



## skupples

An intentional programming and engineering tactic can't equal a defect.


----------



## iSlayer

Quote:


> Originally Posted by *cq3mrd*
> 
> People worrying if GoldenTiger is ok can rest easy. He's shi... I mean chilling over at hardOCP's thread (http://hardforum.com/showthread.php?t=1849838) about this issue. I guess that's where the cool cats are hanging.


I see what you did there.
Quote:


> Originally Posted by *skupples*
> 
> I have yet to see this mass sell-off of super cheap chips on eBay. Maybe it hasn't hit the USA yet, but I would love a link to the super-undercut-priced 970s people keep mentioning. Would go nicely in this little server I'm building.


I'll sell you mine. $350, you're paying for shipping.
Quote:


> Originally Posted by *TopicClocker*
> 
> Holy crap, this thread has exploded!
> 
> I've read all of the posts in this thread.
> 
> This is crazy!


Welcome to the club, any thoughts on the 50 and counting comments from mtcn?


----------



## criminal

Quote:


> Originally Posted by *moustang*
> 
> If that's all it took to make a false advertising claim then just imagine how many Dell owners could sue because their PC with 8GB of RAM is really two 4GB sticks of RAM in different slots.












On topic:

Check this out all: http://kb.newegg.com/FAQ/Article/1729


----------



## FlyingSolo

Quote:


> Originally Posted by *skupples*
> 
> I have yet to see this mass sell-off of super cheap chips on eBay. Maybe it hasn't hit the USA yet, but I would love a link to the super-undercut-priced 970s people keep mentioning. Would go nicely in this little server I'm building.


If I find a link I'll let you know. Hell, I'll buy another one myself if I can find it for £150 or less, and put it in my Plex server. So far I have found one on eBay UK, starting price £200, with no bids so far and ending in 12 hours.


----------



## moustang

Quote:


> Originally Posted by *Xoriam*
> 
> Why would you not take resale value into consideration??
> That's just absurd.


If you're going to talk about honesty and resale value then let me ask you this....

When you go to sell your used video card are you going to clearly state that you've overclocked the crap out of it and that you have definitely shortened the lifespan of the card with your overclocking?

Or should the buyer sue you for false advertising because you failed to divulge this important bit of information?


----------



## Heavy MG

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I am well aware of stutter, I survived the AMD 7000 series Crossfire Stutterpocalypse.
>
> Also, frame times have been tested, frame times delivered by the card were tested at release. PCPer has them along with others.
>
> _The performance of these cards has been fully vetted across a wide range of scenarios - even some obscure ones._
>
> All you are doing is regurgitating what you heard online about how access to that last 500 MB is slower, and could impact frame times of a game. You are doing this without realizing that the game engine and OS aren't even going to address the last 500 MB by itself, due to how it is sectioned off. That it seems to act more as a page file than anything, and that it is still insanely fast compared to system RAM.
>
> In other words: the performance and frame times the card had prior to this new knowledge are still the same, and haven't changed. The slower bit of VRAM on the ass end of the card has been averaged in and factored into the performance reviews and data we have available! The OS and game engine can see it is slower, and use it accordingly.


The statement that the last 500 MB is slower yet "faster than system RAM" is just regurgitated from the internet as well. It's just Nvidia PR-speak trying to do damage control and play it off as no big deal. Nvidia advertised the 970 as a 4 GB card, not 3.5 GB + 500 MB of "turbo page file" RAM or whatever you'd like to call it.


----------



## Orangey

Did anyone post this yet?

http://www.reddit.com/r/pcmasterrace/comments/2tuqd4/i_benchmarked_gtx_970s_in_sli_at_1440p_and_above/


----------



## skupples

Quote:


> Originally Posted by *iSlayer*
> 
> I see what you did there.
> I'll sell you mine. $350, you're paying for shipping.
> Welcome to the club, any thoughts on the 50 and counting comments from mtcn?


That's not a deal... or even close, but thanks. Would pick up 2x 290 over that.


----------



## Serandur

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I am well aware of stutter, I survived the AMD 7000 series Crossfire Stutterpocalypse.
>
> Also, frame times have been tested, frame times delivered by the card were tested at release. PCPer has them along with others.
>
> _*The performance of these cards has been fully vetted across a wide range of scenarios - even some obscure ones.*_
>
> All you are doing is regurgitating what you heard online about how access to that last 500 MB is slower, and could impact frame times of a game. You are doing this without realizing that the game engine and OS aren't even going to address the last 500 MB by itself, due to how it is sectioned off. That it seems to act more as a page file than anything, and that it is still insanely fast compared to system RAM.
>
> In other words: the performance and frame times the card had prior to this new knowledge are still the same, and haven't changed. The slower bit of VRAM on the ass end of the card has been averaged in and factored into the performance reviews and data we have available! The OS and game engine can see it is slower, and use it accordingly.


VRAM-limited scenarios - "VRAM stuttering does not manifest itself in average FPS outside of extreme circumstances and obviously it doesn't present itself in a situation not pushing on the 970s' 3.5 GB soft limit"

I'm regurgitating nothing I read online; I am experiencing these issues, and simply posted the long-awaited benchmarks by a professional site demonstrating the issue with a 970 and 980 back-to-back.

Let's make this real simple:

What marketing led people to believe - 970 to 980 is like 780 to 780 Ti or 670 to 680; some disabled shaders and TMUs, but perfectly intact memory system.

What further understanding has revealed to be true - 970 to 980 is more like 570 to 580 with the former having a crippled interface... only no one said anything.

There are plenty of benchmarks that don't accurately test pushing VRAM limits and what impact they have, it's how 770s are still holding up in charts in the same place relative to 780s or 7870s vs 280s/290s, etc. and it's simply because VRAM limitations are less widely understood, but they're real and they are very important to recognize for people demanding quality performance and especially multi-GPU setups.

Slower memory is slower memory; it can be adequately used for some things, sure, but not all things. Just look at the article and Watch Dogs benchmarks I posted earlier. It's a fact: the 970's weaker memory system is both not as advertised and subsequently is affecting performance. I know what VRAM hangups are like, I know what GPU usage, frametimes, and memory usage look like during them, I am having them, and so are other people. There is data to prove it. Your lack of understanding does not make it untrue, and no amount of belief in magic drivers is going to make slower memory never show any drawbacks of slower memory.

As I said, the knowledge we have of the 970s' performance in certain scenarios and measurements is true, yes, but there's the other side of the coin: the measurements and benchmarks that weren't done before. Accuracy of the marketed VRAM amount is important for making inferences about how the 970s will hold up in comparison to 4 GB contemporaries; our more developed understanding of the 970s' memory shortcoming, which Nvidia and reviewers did not disclose, paints a different picture of the 970s' capabilities in specific scenarios.

Benchmark - Very Bottom


----------



## GrimDoctor

Asus just told me they had no idea what I was talking about in regards to Nvidia's statement...


----------



## Xoriam

Quote:


> Originally Posted by *moustang*
> 
> If you're going to talk about honesty and resale value then let me ask you this....
> 
> When you go to sell your used video card are you going to clearly state that you've overclocked the crap out of it and that you have definitely shortened the lifespan of the card with your overclocking?
> 
> Or should the buyer sue you for false advertising because you failed to divulge this important bit of information?


How can I have overclocked the crap out of it, considering the voltage limits of the 9XX series, along with not being able to increase memory voltage?
When I resell used cards I always state what conditions they've been in. If you don't, well.....

You do realize the absolute maximum you can currently put into a 970 is 1.3125 V, right?
It's a hardware limitation.


----------



## mercs213

Quote:


> Originally Posted by *criminal*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On topic:
> 
> Check this out all: http://kb.newegg.com/FAQ/Article/1729


NEWEGG!! FPS =/= frame times or stuttering.


----------



## adi518

Quote:


> Originally Posted by *Xoriam*
> 
> Why would you not take into consideration resale value??
> Thats just absurd.


I explained it in my original post. I usually get rid of it after 2-3 years, when its value is beginning to hit bottom.


----------



## 2010rig

Quote:


> Originally Posted by *sugarhell*
> 
> Still under review for me


Yeah, I'm seeing that now, didn't realize it was below the details of the card.









Quote:


> Originally Posted by *criminal*
> 
> LOL... see what Nvidia has done to its own card! People read what they want to read. (Not a jab at you.) Nvidia has now created a PR issue they are going to have to find a way to rectify. As others have said, the 970's performance has not changed, but the perception has. Not only does Nvidia come off looking like a liar (they are), their product no longer really has 4GB of vram... lol.
> Good find.


It's funnier how some people have said they switched over to the 290X BECAUSE of the 4GB RAM which the 970 lacks.








Quote:


> Originally Posted by *PostalTwinkie*
> 
> What is the issue? Seriously, what is the issue?
> 
> *There is a MARKETING issue*, and Nvidia done goofed on what specs they printed on the box. That doesn't mean there is an issue with the card; it works fine, damn fine as a matter of fact. The performance of the card isn't changing, it is the same as it was the day you bought it - if not better via driver updates.
> 
> There is no technical/hardware/performance/physical issue here, *it is purely an advertising/marketing issue with specs on a box.* The 970 is still an amazing card.


Crazy how it's being blown out of proportion. How come some people can see this, while others flat out refuse to?
Quote:


> Originally Posted by *cq3mrd*
> 
> People worrying if GoldenTiger is ok can rest easy. He's shi... I mean chilling over at hardOCP's thread (http://hardforum.com/showthread.php?t=1849838) about this issue. I guess that's where the cool cats are hanging
> 
> 
> 
> 
> 
> 
> 
> .


I see what you did there.








Quote:


> Originally Posted by *adi518*
> 
> @PostalTwinkie I'm not sure I get what you mean. If the card had X resale value, it now has half of that. Simple as that. Banking? what does that have anything to do with it? they advertised X, reality shows Y. They made a mistake, they admitted it, they now need to pay up... it's not enough to say "Sorry, we screwed up". I don't want a freaking game coupon. Soon, perhaps in less than a year, more people will move to 4k, and experience these issues. But again, it's more about an hardware that just lost a big portion of it's value. Reminds me of Apple's fiasco, when they knocked off 100$ of one of their iphone iterations, very soon after it was released. Not quite the same thing, but customers did receive compensation to make it just.


Please explain how the value of the card is now HALF. Shouldn't it have gone down by 1/8th at most?








Quote:


> Originally Posted by *TopicClocker*
> 
> Holy crap, this thread has exploded!
> 
> I've read all of the posts in this thread.
> 
> This is crazy!


You read the whole thread, and still bought into the hype with your Avatar. Well done.








Quote:


> Originally Posted by *adi518*
> 
> I explained it in my original post. I usually get rid of it after 2-3 years, where it's value is beginning to hit the bottom.


Hate to break it to ya, but the best time to sell a card for optimum resale value is 1-3 months before new cards come out.


----------



## bambino167

Wow, this is getting big attention. The petition is live:
http://www.maximumpc.com/gamers_petition_geforce_gtx_970_refund_over_error_specs_2015


----------



## Xoriam

Quote:


> Originally Posted by *adi518*
> 
> I explained it in my original post. I usually get rid of it after 2-3 years, where it's value is beginning to hit the bottom.


Well, considering this product, after 2 or 3 years I would have expected to get 200€ for it. Not so sure anymore.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Xoriam*
> 
> Well considering this product after 2 or 3 years I would expect to get 200€ for it. Not so sure anymore.


Not sure how bad it is for you guys over there, but here in the states this card will be maybe $100 in 3 years.


----------



## FlyingSolo

Quote:


> Originally Posted by *cq3mrd*
> 
> People worrying if GoldenTiger is ok can rest easy. He's shi... I mean chilling over at hardOCP's thread (http://hardforum.com/showthread.php?t=1849838) about this issue. I guess that's where the cool cats are hanging
> 
> 
> 
> 
> 
> 
> 
> .


I remember him saying two 970s were fine for 4K, but not on ultra settings though.


----------



## Xoriam

Quote:


> Originally Posted by *FlyingSolo*
> 
> I remember him saying two 970 was fine for 4k but not on ultra settings tho.


Why are you laughing about that?

970 SLI with 0xAA and almost max settings works just fine.

Most non-Ubisoft games will run decently on ONE overclocked 970 @ 4K with decent settings.


----------



## mtcn77

SO, Nvidia have maximised their profits again?

They thereby prevented another gpu mining scenario (this news just stripped the resale value of 970 in comparison to the previous incident where AMD mining cards flooded the market),
They have already sold the card for more than its worth,
And you are as stated out of option to return the card.


----------



## Xoriam

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Not sure how bad it is for you guys over there, but here in the states this card will be maybe $100 in 3 years.


All GPUs are way higher priced here.

A good 970 is nearly 400€, which is almost $600. The latest 970s that just came out are over 400€, btw.

980s are in the 600-800€ range.


----------



## Kirmie

Quote:


> Originally Posted by *DuckieHo*
> 
> ...and yet, the masses probably still won't care.
> 
> It's not trickery but a design limitation. How do you explain memory allocation and tiering to the masses?


Like L1 and L2 cache on a CPU? Not really the same, but close enough to an existing concept that a decent number of people know, I think.
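For anyone who wants the tiering idea made concrete, here's a toy sketch of the allocation scheme NVIDIA's statement describes: requests are served from the fast 3.5GB segment first and only spill into the slow 0.5GB segment once the fast one is full. The class, names, and sizes are purely illustrative, not NVIDIA's actual allocator.

```python
# Toy model of segmented VRAM: a fast 3.5GB pool that gets priority,
# and a slow 0.5GB pool used only when the fast pool is exhausted.
# Sizes in MiB; entirely hypothetical, for illustration only.

class SegmentedVram:
    def __init__(self, fast_mib=3584, slow_mib=512):
        self.fast_free = fast_mib
        self.slow_free = slow_mib

    def alloc(self, mib):
        """Place an allocation, preferring the fast segment.

        Returns a list of (segment, MiB) pairs showing where it landed.
        """
        placement = []
        take_fast = min(mib, self.fast_free)
        if take_fast:
            self.fast_free -= take_fast
            placement.append(("fast", take_fast))
        remainder = mib - take_fast
        if remainder:
            if remainder > self.slow_free:
                raise MemoryError("out of VRAM")
            self.slow_free -= remainder
            placement.append(("slow", remainder))
        return placement

vram = SegmentedVram()
print(vram.alloc(3000))  # fits entirely in the fast segment
print(vram.alloc(800))   # 584 MiB fast, then 216 MiB spills into the slow segment
```

This also shows why monitoring tools would tend to report ~3.5GB in use: under this scheme a game only touches the slow segment once it actually needs more than the fast pool holds.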


----------



## Ganf

Look, there's a lot of drama around this issue and feelings are starting to get involved, but I just wanted to put my experience out there since I bought a 970 last week just as all of this controversy was kicking off. As seen:


I was pretty pissed at first after reading about this issue. Package was rolling towards me like a freight train of disappointment just looking to steamroll all over my happy new build, buyers remorse was setting in for a product I hadn't even touched yet, and the bad news just kept popping up everywhere I looked on the internet like the wrong kind of mushroom after a warm rain.

But I gotta say that now, after all of this, I'm fairly satisfied with my purchase. It's performed as well as expected, in my opinion, and I'm getting just as much out of it as I anticipated.



Which isn't saying much, considering I only laid eyes on the box long enough to sign my refusal and see it carried back to the big brown truck. Shipping fees well spent, in my opinion; after learning the card was hamstrung for the purpose I bought it for, the only thing I wanted was its swift eviction from my life, and that's a godsend.

I recommend the same to anyone who may similarly have one in the mail.


----------



## FlyingSolo

Quote:


> Originally Posted by *Xoriam*
> 
> Why are you laughing about that?
> 
> 970 SLI with 0xAA and almost max settings works just fine.


Well, on hardforum he's saying some games are giving him problems. But he didn't say that before. I was thinking of buying a 4K monitor myself and another 970 until the new cards come out. It's a good thing I didn't.


----------



## Xoriam

Quote:


> Originally Posted by *FlyingSolo*
> 
> Well on hardforum his saying some games are giving him problems. But he didn't say that before. I was thinking of buying a 4k monitor myself and another 970 until the new cards come out. It's a good thing i didn't.


It's still not a bad decision, I can confirm from experience.
Buying 1 more 970 costs less than going out and buying 2x 290X,
which you might need 3 of in some cases.

ACU's recent patch messed things up though.
On patch 1.3.X I was getting a consistent 55-80 fps.
Now I'm getting the same FPS as I was with a single 970, and cutscenes are buggy as hell.

Everything else is performing pretty well.


----------



## adi518

Quote:


> Originally Posted by *2010rig*
> 
> hate to break it to ya, best time to sell a card for optimum resell value is a 1 - 3 months before new cards come out.


I look at it a different way: I factor in 'value over time'. 2-3 years from now it will still be a powerful GPU; got SLI for future-proofing? Even better. The gaming industry has stopped progressing as fast as GPU technology, and that value balances against the resale value. It doesn't balance when you sell a GPU right after a new one is out; you strictly lose money and the potential longevity you could get out of the GPU. It's also well established that moving from the previous gen to the newest gen is a bad investment.


----------



## iSlayer

Quote:


> Originally Posted by *skupples*
> 
> That's not a deal.... Or even close but thanks. Would pickup 2x 290 over that.


That's my point. Why are you even still posting here? All you've done is whine for your last 10 posts and counting.
Quote:


> Originally Posted by *bambino167*
> 
> Wow this is getting big attention. Petition is live
> http://www.maximumpc.com/gamers_petition_geforce_gtx_970_refund_over_error_specs_2015


Wonder how new the petition is; only 1.4k signatures for the hundreds of thousands of 970s floating around.
Quote:


> Originally Posted by *mtcn77*
> 
> SO, Nvidia have maximised their profits again?
> 
> They thereby prevented another gpu mining scenario (this news just stripped the resale value of 970 in comparison to the previous incident where AMD mining cards flooded the market),
> They have already sold the card for more than its worth,
> And you are as stated out of option to return the card.


Clearly they didn't, given the way it sold. We were aware of what performance it gives and bought largely based on that...

Everyone whined about the 256-bit bus, and Nvidia still sold 1M+ 970/980s. Why? The performance!

People aren't leaping to sell their 970s, partially because what the hell would you buy? 290(X)s are a downgrade and the 980 doesn't offer the same price/perf sweet spot.

The ability to return may still be in reach; for example, someone in this thread returned their 970.


----------



## Xoriam

Yeah, I'm not signing that petition unless I'm promised a 980 in exchange, or a new-gen 8GB+ card.

The 970 is doing just fine despite the lies.
I'd be happy with a cash rebate, however.

(Not defending Nvidia, because the lies are total BS, but the card still performs.)


----------



## TopicClocker

Quote:


> Originally Posted by *2010rig*
> 
> Yeah, I'm seeing that now, didn't realize it was below the details of the card.
> 
> 
> 
> 
> 
> 
> 
> 
> It's funnier how some people have said they switched over to the 290X BECAUSE of the 4GB RAM which the 970 lacks.
> 
> 
> 
> 
> 
> 
> 
> 
> Crazy how it's being blown out of proportion. how come some people can see this, while others flat out refuse it?
> I see what you did there.
> 
> 
> 
> 
> 
> 
> 
> 
> Please explain how the value of the card is now HALF. Shouldn't it have gone down by 1/8th at best.
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, and you bought into the hype I see with your Avatar. Well done.
> 
> 
> 
> 
> 
> 
> 
> 
> hate to break it to ya, best time to sell a card for optimum resell value is a 1 - 3 months before new cards come out.


The Avatar's just to lighten up the mood









This is a tricky topic; for weeks, members in the GTX 970 Owner's Club have been talking about this, and the issue has now gotten full exposure.

What I find puzzling is that gamers had to find out for themselves, and it took the problem becoming big and well known for Nvidia to come clean. That isn't a good look.

As said over at Anandtech, ultimately the perception of the GTX 970 has changed: it delivers solid performance, but it took 4 months for its "true" specs to be revealed.

People are complaining left, right, and center on every major forum, and rightfully so; some are calling Nvidia liars, some are asking for refunds, some aren't bothered too much by it.

The main concern is the memory, with the 3.5GB segment being faster than the 0.5GB segment, leading people to believe it is not a "true" 4GB card.

Also, what hype is there to buy into? The truth is out. This stuff was kept somewhat quiet; people bought cards which they were told, and believed, were 4GB cards. Indeed they are, but no one knew how the memory worked, that 3.5GB of the memory is faster, or that it has fewer ROPs, etc.

I'm not one to raise pitchforks, but what was done was wrong, no doubt about it.
Quote:


> Originally Posted by *iSlayer*
> 
> I see what you did there.
> I'll sell you mine. $350, you're paying for shipping.
> Welcome to the club, any thoughts on the 50 and counting comments from mtcn?


I see where he is coming from in some of his posts, but generally I'm not sure.


----------



## Xoriam

Quote:


> Originally Posted by *TopicClocker*
> 
> The Avatars just to lighten up the mood
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is a tricky topic, for weeks members in the GTX 970 Owner's Club have been talking about this and the issue has now gotten full exposure.
> 
> What I find puzzling is that Gamers had to find out for themselves, and what it took was for the problem to become big and known for Nvidia to come clean, that isn't a good look.
> 
> As said over at Anandtech, ultimately the perception of the GTX 970 has changed, it delivers solid performance but it took 4 months for it's "true" specs to be revealed.
> 
> People are complaining left right and center, on every major forum, and rightfully so, some calling Nvidia liars, some asking for refunds, some not bothered too much by it.
> 
> The main concern is the memory, with the 3.5GB segment faster than the 0.5GB segment, leading people to believing it is not a "true" 4GB card.
> I see where he is coming from in some of his posts, but generally I'm not sure.


Actually, in the 970 club there were 1 or 2 guys saying their card was capping out at 3.5GB, and the rest of us were saying it can use 4GB because we'd pushed ours that far.


----------



## FlyingSolo

Quote:


> Originally Posted by *Xoriam*
> 
> it's still not a bad decsion, I can confirm from experience.
> Buying 1 more 970 is less than going out and buying 2x290x
> Which you might need 3 of in some cases.
> 
> ACU recent patch messed things up though.
> On patch 1.3.X I was getting consistant 55-80fps
> Now I'm getting the same FPS as I was getting with a single 970, and cutscenes are buggy as hell.
> 
> Everything else is performing pretty good.


Thanks for letting me know. If I find one for a good price I'll get another.


----------



## skupples

Quote:


> Originally Posted by *mtcn77*
> 
> SO, Nvidia have maximised their profits again?
> 
> They thereby prevented another gpu mining scenario (this news just stripped the resale value of 970 in comparison to the previous incident where AMD mining cards flooded the market),
> They have already sold the card for more than its worth,
> And you are as stated out of option to return the card.


Mining, at least for the mainstream, was a flash in the pan. ASICs dominate. Anyone seriously trying to make money is ASIC mining and doing a ton of day trading.

I would hope they turned a profit. I mean, idk how it works where you're from, but profit is how a private-sector business STAYS IN BUSINESS.

Uh oh, someone is going to call me a shill now.


----------



## mtcn77

Quote:


> Originally Posted by *iSlayer*
> 
> That's me point. Why are you even still posting here, all you've done is whine for your last 10 posts and counting.
> Wonder how new the petition is, only 1.4k signers for hundreds of thousands of 970s floating around.
> Clearly they didn't with the way it sold. We were aware of what performance it gives and *bought largely based on that...*
> 
> Everyone whined about the 256 bus, Nvidia still sold 1m+ 970/980s. Why? The performance!
> 
> People aren't leaping to sell their 970s. Partially because what the hell would you buy? 290(x)s are a downgrade and the 980 doesn't offer the same price/perf sweet spot.
> 
> The ability to return may still be in reach, for example someone in this thread returned their 970.


Except Nvidia is strong and weighs heavily on how performance is referenced and how innovation progresses (or is stifled). If it were AMD's world, your card would have a hell of a time.


----------



## Xoriam

Quote:


> Originally Posted by *FlyingSolo*
> 
> Thanks for letting me know. If i find it for a good price i'll get another one.


You should wait slightly longer and see how this plays out, though; prices might drop considering the lies, and AMD is about to drop a card with a 30-50% performance increase.


----------



## iSlayer

Quote:


> Originally Posted by *skupples*
> 
> Mining, at least for the mainstream was a flash in the pan. ASICs dominate. Anyone seriously trying to make money is ASIC mining and doing a ton of day trading.
> 
> I would hope they turned a profit. I mean, idk how it works where you are from, but profit is how a private sector business STAYS IN BUSINESS
> 
> UHOH someone I going to call me a shill now.


But AMD is a nonprofit


----------



## 2010rig

Quote:


> Originally Posted by *adi518*
> 
> I look at it in a different way. I factor in 'value over time'. 2-3 years from now, it is still a powerful gpu, got SLI for future proof? even better. Gaming industry halted from progressing as fast as gpu technology. That value balances with the resale value. It doesn't balance when you sell a gpu right after a new one is out. You strictly lose money and the potential longevity you could get out of the gpu. It's also proven that moving from previous gen to newest gen is a bad investment.


I'm not quite following what you're saying... because you obviously see it differently. Notice I said for "*optimum* resale value".

I took a quick glance in the marketplace and looked for 3-year-old cards. I came across a 7970 being sold for $140, which originally launched at $550 (a $400 loss).

I also found 680s going for around $150 (a $350 loss).

That seems *to me* like the absolute worst time to sell $500 cards.

I've seen plenty of people sell their cards for a $50-$100 loss *max* before new cards come out. This way they're always riding the latest and greatest.

Look at the math: if you lose $50-$100 every year by upgrading to the latest and greatest, it's about the same loss you would experience over the 3-year period while gaming on a 3-year-old card.

Hope that makes sense.
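The arithmetic above can be sketched in a couple of lines; the numbers here are the made-up ones from the post ($100/year upgrade loss vs. a $500 card resold at $140 after 3 years), not market data.

```python
# Back-of-the-envelope depreciation comparison, using the post's
# hypothetical figures: sell-and-rebuy yearly vs. hold for 3 years.

yearly_loss = 100        # assumed loss per yearly upgrade cycle
hold_years = 3

upgrade_total = yearly_loss * hold_years   # cumulative loss if upgrading yearly
hold_total = 500 - 140                     # e.g. a $500 card resold at $140

print(upgrade_total)  # 300
print(hold_total)     # 360
```

Under these assumptions the two strategies come out within $60 of each other, which is the post's point; real-world results shift with tax, mining crashes, and release cadence, as later replies note.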


----------



## FlyingSolo

Quote:


> Originally Posted by *Xoriam*
> 
> Yeah I'm not signing that petition unless I'm promised a 980 in exchange, or a new gen 8gb+ card.
> 
> 970 is doing just fine despite the lies.
> I'll be happy with a money rebate however.
> 
> (not defending Nvidia, because it's total BS the lies, however the card still performs)


I agree with you, the card still performs. I would be happy with a refund, but that will never happen. Plus, it's been a little over 4 months now since I got the card.


----------



## Xoriam

Quote:


> Originally Posted by *FlyingSolo*
> 
> I agree with you the card still performs. I would be happy with a refund but that will never happen. Plus its been little over 4 months now since i had the card.


Yep, I totally understand that. I mean, we were lied to, right?
But at this point, with a refund, what would you buy?

I think some sort of compensation is in order for the lies, but not a complete refund, because the cards do perform as advertised, apart from the specs.
Maybe a huge discount on next gen in exchange for our card, or a decent cash-back deal.
Or an exchange for a 980, since we were given nearly the specs of that, and then compensate the 980 owners.

Exchanging for 980s would probably be the least hassle for them, since a LOT fewer people have purchased 980s; this would result in an overall lower payout, since production of the item is really so low...
You compensate the 980 owners, which is a fraction of the 970 owners, and give 980s to 970 owners. Win-win in the long run.

I mean, 970s are just a cut-down, BIOS-modded 980:
same production price, and then you have to modify them...


----------



## skupples

Quote:


> Originally Posted by *iSlayer*
> 
> But AMD is a nonprofit


So true.

GPUs for the people!!


----------



## 2010rig

Quote:


> Originally Posted by *skupples*
> 
> So true.
> 
> GPUs for the people!!


{rumor}
AMD is now giving away 290X's to anyone who trades in their 970 to them.
{/rumor}

AMD by the people, for the people.


----------



## ZealotKi11er

Quote:


> Originally Posted by *2010rig*
> 
> I'm not quite following what you're saying.... Because you obviously see it differently. Notice I said for "*optimum* resell value"
> 
> I took a quick glance in the marketplace ( and looked for 3 year old cards ) Came across a 7970 being sold for $140, which originally launched for $550. ( $400 loss )
> 
> Also found 680's going for around $150. ( $350 loss )
> 
> That seems *to me* like the absolute worst time to sell $500 cards.
> 
> I've seen plenty of people sell their cards for a $50 - $100 loss *max* before new cards come out. This way they're always riding on the latest and greatest.
> 
> Look at the math, if you lose $50 - $100 every year by upgrading to the latest and greatest, it's about the same loss that you would experience over the 3 year period, while gaming on a 3 year old card.
> 
> Hope that makes sense.


Not that simple. You lose way more if you upgrade every year. What was a 290X worth 1 year ago? $550. What is it now? $300-350. Also include the tax you pay on the card you buy and the card you sold, and you end up losing ~$300-350 to upgrade to a $550 card today. If you only lost $50 to $100, everyone here would have the latest and greatest.


----------



## Xoriam

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Not that simple. You lose way more if you upgrade every year. What was 290X worth 1 year ago? $550. Way is it now? $300-350. Also include TAX you pay for the card you buy and the card you sold and you end up losing ~ 300-$350 to upgrade to $550 card today. If you lost $50 to $100 then everyone would have the greatest and the latest here.


It's still worth $550 here (converted from euros).


----------



## 2010rig

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Not that simple. You lose way more if you upgrade every year. What was 290X worth 1 year ago? $550. Way is it now? $300-350. Also include TAX you pay for the card you buy and the card you sold and you end up losing ~ 300-$350 to upgrade to $550 card today. If you lost $50 to $100 then everyone would have the greatest and the latest here.


Well this is why you don't buy AMD cards.







I kid I kid.

I know I oversimplified it, especially since we no longer get yearly releases.

The optimum time to sell that 290X would have been in the 1 - 3 month period before Maxwell came out.









Remember, that 290X was still selling for $549 the day Maxwell came out, and the 970 put a stop to that.

My point is, waiting 3 years to sell just isn't wise, as I showed with how much 7970s and 680s are selling for today (a $350-$400 loss).


----------



## Xoriam

Quote:


> Originally Posted by *2010rig*
> 
> Well this is why you don't buy AMD cards.
> 
> 
> 
> 
> 
> 
> 
> I kid I kid.
> 
> I know I over simplified it, especially since we no longer get yearly releases.
> 
> The optimum time to sell that 290X would have been in the 1 - 3 month period before Maxwell came out.


This is one thing AMD ACTUALLY DOES RIGHT, though.

You don't want to have to purchase a new card every year or 6 months, do you?
I mean, think about how long the 7XXX series has been around, and it's still kicking.

I know I don't want to.
And this generation was gimped by only putting 4GB of VRAM on the cards and going with such a small performance increase.
Buying a used 780 Ti would probably have been the better option for me.
Or a used Titan.


----------



## FlyingSolo

Quote:


> Originally Posted by *Xoriam*
> 
> yep I totally understand that, I mean we were lied to right?
> But at this point with a refund what would you buy?
> 
> I think some sort of compensation is in order for the lies, but not such of a complete refund. because they do perform as advertised. apart from the specs.
> maybe a huge discount on next gen with exchange of our card, or a decent cash back deal.
> or exchange for a 980 since we were given nearly the specs of that, then compenstate the 980 owners.


Yup, we were lied to. If I were to get a refund now and had to get a card to play games, it would have to be a 980. Or wait for the new cards to come out, but that's not happening anytime soon.


----------



## ZealotKi11er

Quote:


> Originally Posted by *2010rig*
> 
> Well this is why you don't buy AMD cards.
> 
> 
> 
> 
> 
> 
> 
> I kid I kid.
> 
> I know I over simplified it, especially since we no longer get yearly releases.
> 
> The optimum time to sell that 290X would have been in the 1 - 3 month period before Maxwell came out.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Remember, that 290X was still selling for $549 the day Maxwell came out, and the 970 put a stop to that.


The 290X might have been that much new, but on the used market it was not selling for more than $400. Used 290X resale was down because of mining. The GTX 780/Ti suffered too after Maxwell, but how can you sell a GPU 1-3 months before, then wait for the next one? What do you use in the meantime? Either way, upgrading always loses you money.


----------



## Xoriam

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 290X might have been that much but in used market it was not selling more then $400. Used 290X resale was down because of mining. Also GTX780/Ti Suffered too after Maxwell but how can you sell a GPU 1-3 months before to wait for the next one? What do you use then? Either way upgrading you always lose money.


The 780 and 780 Ti are still selling for roughly the same price here, at least.
Dat bus.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Xoriam*
> 
> The 780 and 780Ti are still selling for roughly the same price here atleast.
> Dat bus


Doesn't matter what they sell for new. What's the resale value?


----------



## Xoriam

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Does not matter what they sell for. What is the resale value.


GTX 780 resale value is roughly the same as a new 970 here.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Xoriam*
> 
> The 780 and 780Ti are still selling for roughly the same price here atleast.
> Dat bus


I have seen several 780 Ti used for $300 here in the States.

Man, you guys really do get dumped on with the price thing.


----------



## nleksan

I am genuinely curious; this is not sarcasm, at least not at a conscious level...

But can someone explain to me WHY a thread SPECIFIC to a single GPU SKU from Nvidia has turned into people deciding that what the thread is clearly lacking is a refreshing barrage of "AMD is better" claims? (Okay, I guess I lied a little about the sarcasm, but it is how my brain works.)


----------



## Xoriam

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I have seen several 780 Ti used for $300 here in the States.
> 
> Man, you guys really do get dumped on with the price thing.


Yeah, the imported-product thing gives a huge kick to the price.

The UK doesn't suffer this, but most of the rest does.

(If I were in Japan at the moment I could order a card from China for less than 100€, but yeah... I'm not there right now.)


----------



## Forceman

Quote:


> Originally Posted by *nleksan*
> 
> I am genuinely curious, this is not sarcasm at least not at a conscious level...
> 
> But can someone explain to me WHY a thread SPECIFIC to a single GPU SKU from Nvidia has turned into people deciding that what the thread is clearly lackingis a rrefreshing barrage of "AMD is better" claims? (okay, I guess I lied a little about the sarcasm, but it is how my brain works so
> 
> 
> 
> 
> 
> 
> 
> )


Because that's what all GPU threads end up as. It's a Godwin's law for GPU discussions.


----------



## Xoriam

One thing that gets me even more than the rest of this stuff is that we have to pay more for an item that costs less to make.
Marketing tricks.

Example for those who don't get it:
A 980 costs less to make than a 970.
Why?
A 970 is technically a 980 that has failed quality standards.
It then has to be cut down and have a BIOS created specifically for it.

(Sure, sure, I get why it's done; you don't have to waste materials on a lower-priced part. But it still bothers me that a 980 costs less to make and we have to pay more.) XD


----------



## Final8ty

Quote:


> Originally Posted by *Thesnipergecko;27542683*
> I made a short video using High and Medium textures to show the 970 3.5GB VRAM issues. As soon as it reached 3500mb the game starts to stutter really badly and craps out.
> 
> Lowering it to Medium textures and it runs smooth as silk. I'm only gaming at 1080p. Should I not be using High textures at 1080p?
> 
> https://www.youtube.com/watch?v=a8Q6jmg_qik&feature=youtu.be


http://forums.overclockers.co.uk/showthread.php?t=18651061&page=85


----------



## fleetfeather

Quote:


> Originally Posted by *Forceman*
> 
> Because that's what all GPU threads end up as. It's a Godwin's law for GPU discussions.


It's actually just a lame OCN thing


----------



## 2010rig

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 290X might have been that much but in used market it was not selling more then $400. Used 290X resale was down because of mining. Also GTX780/Ti Suffered too after Maxwell but how can you sell a GPU 1-3 months before to wait for the next one? What do you use then? Either way upgrading you always lose money.


I hear ya, and I was trying to demonstrate that if you play your cards right, that loss could *potentially* work out to about the same, over the same period of time. I understand it's not as cut and dried.

Lately, there have been different factors to mess that up like the mining crash, but those are out of the norm.
Quote:


> Originally Posted by *nleksan*
> 
> I am genuinely curious, this is not sarcasm at least not at a conscious level...
> 
> But can someone explain to me WHY a thread SPECIFIC to a single GPU SKU from Nvidia has turned into people deciding that what the thread is clearly lackingis a rrefreshing barrage of "AMD is better" claims? (okay, I guess I lied a little about the sarcasm, but it is how my brain works so
> 
> 
> 
> 
> 
> 
> 
> )


Are you new to OCN?









Every thread turns into that in one way or another. Doesn't matter if it's an Intel/AMD/NVIDIA thread, comparisons eventually get made.

Some (mtcn77) are having a field day with it, though; it's the best thing that has happened to him in years. It's hard for him to contain his excitement.


Spoiler: His last 4 posts just to prove my point



Quote:


> Originally Posted by *mtcn77*
> 
> Except, Nvidia is strong and weighing heavy on how performance will be referenced & the way innovation will be progressed(or stifled). If it were AMD's world, your card would have a hell of a time.


Quote:


> Originally Posted by *mtcn77*
> 
> SO, Nvidia have maximised their profits again?
> 
> They thereby prevented another gpu mining scenario (this news just stripped the resale value of 970 in comparison to the previous incident where AMD mining cards flooded the market),
> They have already sold the card for more than its worth,
> And you are as stated out of option to return the card.


Quote:


> Originally Posted by *mtcn77*
> 
> Why would you want to support the lies?


Quote:


> Originally Posted by *mtcn77*
> 
> Drops 12% from <3.5GB use to 4GB use, since the slow 8th partition delays the rest. Notice this is the "best case scenario". Nvidia can pull a fast one at any time they wish by choking the driver at the slow partition. Planned obsolescence, ah!
> Pcper


----------



## skupples

Quote:


> Originally Posted by *2010rig*
> 
> Well this is why you don't buy AMD cards.
> 
> 
> 
> 
> 
> 
> 
> I kid I kid.
> 
> I know I over simplified it, especially since we no longer get yearly releases.
> 
> The optimum time to sell that 290X would have been in the 1 - 3 month period before Maxwell came out.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Remember, that 290X was still selling for $549 the day Maxwell came out, and the 970 put a stop to that.
> 
> My point is, waiting 3 years to sell just isn't wise as I showed with how much 7970's and 680's are selling for today. ( $350 - $400 loss )


The best 290X resale was during the mining flash in the pan, when the 290/290X were sold out globally; people were popping 290Xs off for $800-$900 apiece.
Quote:


> Originally Posted by *Xoriam*
> 
> This is one thing that AMD ACTUALLY DOES RIGHT though.
> 
> You don't want to have to purchase a new card every year or 6 months do you?
> I mean think about how long the 7XXX series has been around, and it's still kicking.
> 
> I know I don't want to.
> And this generation was gimped by only putting 4gb of Vram on them and going with such a small perfomance increase.
> Buying a use 780ti would have probably been the better option for me.
> Or a used Titan.


No need for yearly upgrades, unless you buy x04 chips.









also, when did 4GB of GDDR5 become a low number? 4GB is the peak, besides ultra boutique products w/ 8GB or Titans with 6GB. Nvidia releasing a mid range swinger w/ "4GB" (OK 3.5 DON'T CALL ME A SHILL) is pretty unheard of for them. They tend to always skimp on memory, even in the high end range, as it saves them pennies.
Quote:


> Originally Posted by *fleetfeather*
> 
> It's actually just a lame OCN thing


Not even remotely true, and insinuating as such is insulting to OCN.

It's a common thing, which transcends all media.

Video games: go over to the BF4 forums and you will find QQ threads about CoD.

Go over to the CoD forums, find BF4 threads.

Go to Star Citizen, find ED threads.

Go to the Elite Dangerous forums, find tons of SC threads, so many that the mods have actually had to start deleting them on sight due to the slanderous and misinformed nature of most of them.


----------



## The Robot

Quote:


> Originally Posted by *Final8ty*
> 
> http://forums.overclockers.co.uk/showthread.php?t=18651061&page=85


I heard 780ti maxes out this game at 60 fps with zero issues...


----------



## Gamer_Josh

Quote:


> Originally Posted by *Gamer_Josh*
> 
> It leaves me to wonder, what variables decide whether or not a certain card has issues? I max all settings on Battlefield 4 at 1080P and experience no such stutters or hiccups.


Quote:


> Originally Posted by *skupples*
> 
> see, and I've never been able to get BF4 smooth w/ 2-3 Titans. VRAM is a non-issue, hitching is persistent no matter the settings. Runs almost as poorly as FC4, just with higher FPS.


That's what makes me curious about this. I have all settings maxed in 1080P for BF4, and can have FRAPS or ShadowPlay running, with no stuttering or hitching. It's smooth. Why does this happen for some, and not others?


----------



## wanako

Interestingly enough, Boris Vorontsov, the creator of ENB, claims it's a driver issue and not hardware-related: http://enbdev.com/index_en.html


----------



## skupples

Quote:


> Originally Posted by *Gamer_Josh*
> 
> That's what makes me curious about this. I have all settings maxed in 1080P for BF4, and can have FRAPS or ShadowPlay running, with no stuttering or hitching. It's smooth. Why does this happen for some, and not others?


Who knows. I've tried it on a 3570K, 3930K, 4930K, with 8GB, 16GB, 32GB (all @ 2400MHz), 1 GPU, 2 GPU, 3 GPU... everything but single GPU has performance issues, but like I just said in the other thread, I see stutter where other people see butter. It's extremely annoying, and I don't wish it upon anyone in this hobby, as it becomes quite expensive.


----------



## rdr09

Quote:


> Originally Posted by *iSlayer*
> 
> But AMD is a nonprofit


Your beloved nVidia DUPED people into buying a car with three tires and a spare, and you bash AMD.


----------



## Xoriam

Quote:


> Originally Posted by *skupples*
> 
> also, when did 4GB of GDDR5 become a low number? 4GB is the peak, besides ultra boutique products w/ 8GB or Titans with 6GB. Nvidia releasing a mid range swinger w/ "4GB" (OK 3.5 DON'T CALL ME A SHILL) is pretty unheard of for them. They tend to always skimp on memory, even in the high end range, as it saves them pennies.
> Not even remotely true, and insinuating as such is insulting to OCN.


4GB became a low number when ACU, Far Cry 4, and Ryse came onto the scene.
I wouldn't call you a shill for saying the 970 has 4GB of RAM, because I know from personal experience that it can use it.

However, it's not enough, and neither is the performance of the chip itself anymore.
4K will become the new norm, and 4GB is literally half of what's needed. This seems weird to me, because if I recall correctly, current-gen cards used to be enough to run current-gen games (the original Crysis being a rare exception).
The cards I bought always seemed to be a step ahead of the games I was playing.
Now, I don't know if I've become super picky or what, but in my opinion the core needs double the performance, and the RAM needs double the capacity and bus width.


----------



## Final8ty

Already posted


----------



## Forceman

Quote:


> Originally Posted by *Xoriam*
> 
> 4k will become the new norm.


I wouldn't hold your breath waiting for that to happen.


----------



## skupples

Quote:


> Originally Posted by *Xoriam*
> 
> 4Gb became a low number when ACU, FarCry4 and Ryse came onto the scene.
> I wouldn't call you a shill for saying the 970 has 4gb of ram because I know from personal experience that it can use it.
> 
> However it's not enough, nor is the performance of the chip itself anymore,
> 4k will become the new norm.
> it's literally half what is needed, this seems weird to me, but before If I don't recall incorrectly current gen cards were enough to run current gen games (crysis original to be a rare case.)
> The cards I use to buy used to always seem to be a step ahead of the games I was playing.
> Now in my opnion, I don't know if I've become super picky or what but.
> The core needs double performance, and ram needs double capacity and bus size.


All games known for being terribly optimized. I'm still surprised Ryse ever came out, given its drama-filled history and near-cancellation. It also seems developers have been allowed to get sloppy, possibly due to having more to work with in the next-gen consoles. I mean, visuals haven't gotten much better, yet system demand has increased? Like seriously, man, you just listed three of the most broken games released in 2014.
Quote:


> Originally Posted by *Final8ty*
> 
> Already posted


Not really trying to defend nvidia here, but both of those results are asinine, being run @ settings which the core in no way, shape, or form can handle, for BOTH cards. So yes, frametime variance sucks when you're running a game @ 15FPS. Who'd'a thunk...









I see what their point is, but anyone running their 970 @ 15FPS... um... wut? 24FPS is most cinematic, derp!


----------



## Ganf

4GB and 256-bit officially became low numbers when the consumer version of the Oculus was announced at a minimum of 1440p with a 90Hz refresh rate, and it suddenly became much more desirable to run 4K and downscale.

Even though I just rejected a card from Nvidia I'm going to admit right now that I don't care about brand loyalty. Whoever has the card that can pull this off for the least amount of money is getting my purchase(s) this spring.


----------



## Xoriam

Quote:


> Originally Posted by *skupples*
> 
> all games known for being terribly optimized. I'm still surprised Ryse ever came out, due to its drama filled history and near cancel. Also seems developers have been allowed to get sloppy, possibly due to having more to work with in the next gen consoles. I mean, visuals haven't got much better, yet system demand has increased? Like seriously man, you just listed three of the most broken games to release in 2014.


Ugh, broken forum.

Yes, I listed those games because they are from the "big names"; this is a hint of what is to come.
These companies are producing for consoles and then porting horribly over to PC; we need the extra performance and GBs of VRAM.
Quote:


> Originally Posted by *Forceman*
> 
> I wouldn't hold your breath waiting for that to happen.


Why would I not hold my breath on this?
I'm one of the guys who typically spends in the medium range when it comes to GPUs, and I've already adopted 4K.
Seeing as 4K performs the same as, if not slightly better than, 1080p-1440p with the ridiculous amounts of AA people throw at them, along with overall better image quality, I do not see why it will not become the norm.
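
For what it's worth, the raw numbers back that up. A quick back-of-envelope comparison of shaded-sample counts per frame (illustrative Python, and it assumes full supersampling, where every sub-sample is shaded; MSAA is cheaper than this):

```python
# Rough comparison of shaded-sample counts: native 4K vs. lower
# resolutions with supersampling piled on. Illustrative only; real
# cost also depends on AA type, shading rate, and memory traffic.

def samples(width, height, ssaa=1):
    """Total shaded samples per frame for a given resolution and
    supersampling factor (SSAA shades every sub-sample)."""
    return width * height * ssaa

native_4k  = samples(3840, 2160)          # 8,294,400
qhd_4x     = samples(2560, 1440, ssaa=4)  # 14,745,600
fhd_8x     = samples(1920, 1080, ssaa=8)  # 16,588,800

print(f"4K native:      {native_4k:>10,}")
print(f"1440p + 4xSSAA: {qhd_4x:>10,}")
print(f"1080p + 8xSSAA: {fhd_8x:>10,}")
```

So native 4K shades fewer samples than 1440p with 4x supersampling, which is roughly why it can perform comparably while looking sharper.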


----------



## Gamer_Josh

Quote:


> Originally Posted by *skupples*
> 
> who knows. I've tried it on a 3570k, 3930k, 4930k, 8gb, 16gb, 32gb (all @ 2400mhz) 1 GPU, 2 GPU, 3 GPU.... everything but single GPU has performance issues, but like I just said in the other thread. I See stutter where other people see butter. It's extremely annoying, and I don't wish it upon anyone in this hobby, as it becomes quite expensive.


Well, that's no good, sorry about that man. It always sucks when something like that happens, because it can be so hard to track down the cause. I hope the issue is remedied for you sooner rather than later.


----------



## skupples

Quote:


> Originally Posted by *Xoriam*
> 
> Ugh broken forum.
> 
> yes I listed those games because they are from the "big names" this is a hint of what is to come.
> these companies are producing for consoles and then porting horribly over to PC, we need the extra peformance and GB of vram
> why would i not hold my breath on this?
> I'm one of the guys who typically tends to spend in the medium range of things when it comes to GPUs and I've already adopted 4k.
> Seeing as 4k performs the same if not slightly better than 1080-1440 with those ridiculous amounts of AA peeople throw at it along with an overall better image quality, I do not see the reason it will not become the norm.


eh, Ubisoft games have been broken since FC2, and have only gotten worse, and Crytek rarely releases anything.

If anything, it should get better over time, as the developers have to rope in their sloppy programming due to console limitations.

just look @ previous console life cycles. Look @ the quality of games @ release, then what they're able to do by the end of the life cycle.
Quote:


> Originally Posted by *Gamer_Josh*
> 
> Well that's no good, sorry about that man. It always sucks when something like that happens, because it can be so hard to track down the cause. I hope that the issue is remedied for you sooner that later.


remedied by getting the game for free, and not being much of a modern BF fan.


----------



## jcde7ago

I was already on the fence about switching to a pair of Sapphire 290X 8GBs in CrossFire over my SLI'd G1 970s for my 7680x1440 setup... and this whole debacle pretty much sealed the deal for me last night (thank you, Amazon, for granting me a no-questions-asked refund on my 970s even though it had already been close to 70 days since I bought them <3).

At first I thought my decision to switch was knee-jerkish, but to be honest, 4K and triple monitor gamers are more affected by this issue than those on a single 1080p/1440p monitor, so the choice was pretty clear for me. I would still recommend that anyone gaming on a single 1080p/1440p monitor stick with their 970(s) if they already have them, as the 970 is still a helluva card and the performance we all bought them for in benchmarks hasn't necessarily disappeared, but for those looking at 4K or triple monitor setups...it'd probably be worth it to switch to some 290Xs. Just my 2 cents...really hard to go wrong either way (this is assuming that you're impatient like me and aren't necessarily waiting for the 300X series...personally, I don't plan on upgrading again until next Christmas/tax season).


----------



## Gamer_Josh

Quote:


> Originally Posted by *skupples*
> 
> remedied by getting the game for free, and not being much of a modern BF fan.


Ah, I didn't realize it was only with BF4 that you experience this issue. Well, considering that you got it for free and aren't much of a fan, it's not that bad I reckon.


----------



## Xoriam

Quote:


> Originally Posted by *skupples*
> 
> eh, Ubisoft games have been broken since FC2, and have only gotten worse, and Crytek rarely releases anything.
> 
> If anything, it should get better over time, as the developers have to rope in their sloppy programming due to console limitations.


We can only hope









In the meantime, however, I would like something that can compensate for their sloppiness.

DX12 should fix some of it.


----------



## skupples

Quote:


> Originally Posted by *Gamer_Josh*
> 
> Ah, I didn't realize it was only with BF4 that you experience this issue. Well, considering that you got it for free and not much of a fan, it's not that bad I reckon.


I think I forgot to mention running it in surround... yup, just looked... I forgot to mention that I play most titles in NV Surround, and more and more titles are beginning to run like ass in surround, no matter the hardware or game settings. This is why I'm switching over to that 144Hz 1440p IPS G-Sync monitor when it drops next month.

If a game doesn't work in surround, it's highly unlikely that I'll play it, let alone buy it. This rule of course, does not apply to really old titles.


----------



## 2010rig

Quote:


> Originally Posted by *wanako*
> 
> interestingly enough, Boris Vorontsov, creator of ENB, claims it to be a driver issue and not hardware related: http://enbdev.com/index_en.html


Interesting, but we need more to go on. Will these guys reveal their tests, and how they came to that conclusion?
Quote:


> 27 january 2015
> Another update regarding "GTX 970 memory bug". I am and Alexander Blade did own tests and prooved that this performance issue exist, but it is not hardware bug as i thought. Performance decreases even for memory allocation outside of 3.5-4gb memory range after that "bug" occur (in released previously allocated blocks), so it's driver bug (or may be driver configuration per videocard model). Myth busted.


Quote:


> 24 january 2015
> Just want to say few things about recently found "GTX 970 memory bug". Do not panic, this looks like driver management of resources and any video memory tests are useless, unless they are running from clear DOS mode. I have seen a lot of videocards per model which suffer from performance issues when vram is near to end, only developers of drivers do know how that works.


----------



## wanako

Quote:


> Originally Posted by *2010rig*
> 
> Interesting, but we need more to go on. Will these guys reveal their tests, and how they came to that conclusion?


I was wondering that as well; that information would be very relevant to this topic.


----------



## Triniboi82

Quote:


> Originally Posted by *2010rig*
> 
> I'm not quite following what you're saying.... Because you obviously see it differently. Notice I said for "*optimum* resell value"
> 
> I took a quick glance in the marketplace ( and looked for 3 year old cards ) Came across a 7970 being sold for $140, which originally launched for $550. ( $400 loss )
> 
> Also found 680's going for around $150. ( $350 loss )
> 
> That seems *to me* like the absolute worst time to sell $500 cards.
> 
> I've seen plenty of people sell their cards for a $50 - $100 loss *max* before new cards come out. This way they're always riding on the latest and greatest.
> 
> Look at the math, if you lose $50 - $100 every year by upgrading to the latest and greatest, it's about the same loss that you would experience over the 3 year period, while gaming on a 3 year old card.
> 
> Hope that makes sense.


That's similar to the upgrade strategy I use as well. When I sold my 680s, I was able to get $300 US per card right before the 970s hit the market, so my upgrade didn't cost me much beyond the waterblocks I also purchased. I'm pleased with the performance on my single 1440p monitor, and I have no intention of going 4K this year, but I do intend to upgrade to the Acer Predator 34" 144Hz 1440p, and judging from the numbers I'm getting, my 970s will make do until the next gen comes out. I don't, however, feel like I was robbed in any way purchasing these cards, as they perform really well for me at my resolution; I'll just be more cautious about purchasing any future lower-end cards from Nvidia.


----------



## aDyerSituation

So much bias in this thread towards Nvidia.

You can defend them all you want, but they did wrong.


----------



## ChrisB17

Quote:


> Originally Posted by *2010rig*
> 
> Interesting, but we need more to go on. Will these guys reveal their tests, and how they came to that conclusion?


Hmm this has me thinking.


----------



## GrimDoctor

The fact I couldn't run AutoCAD properly was the final straw. Thankfully, I got a refund from the retailer. They seemed tired, like they'd been processing a few. Relief is the word of the day, but now the hunt for another card begins... to spend more for the 980 or not...


----------



## Serandur

Quote:


> Originally Posted by *GrimDoctor*
> 
> The fact i couldn't run AutoCAD properly was the final straw. Thankfully, I got a refund from the retailer. They seemed tired, like they've been processing a few. Relief is the word of the day but now the hunt for another card begins...to spend more for the 980 or not...


I'm jealous of those lucky few that got 6 GB 780s. That Strix model was pretty sweet.


----------



## GrimDoctor

Quote:


> Originally Posted by *Serandur*
> 
> I'm jealous of those lucky few that got 6 GB 780s. That strix model was pretty sweet.


I'd love one of those, even two, but struggling to find one in Australia.


----------



## Serandur

Quote:


> Originally Posted by *GrimDoctor*
> 
> I'd love one of those, even two, but struggling to find one in Australia.


They're gone pretty much everywhere, really; such a shame. Nvidia saw to it that 6 GB 780s were extremely limited, and only at the end of the product's lifespan, while no doubt completely blocking 6 GB 780 Tis. I'm convinced this is their planned-obsolescence strategy: VRAM gimping/restricting.


----------



## mercs213

Playing Dying Light at max settings at 1080p on one monitor. Got the "Windows is low on memory" pop-up. VRAM usage was pegged at 3536MB and never went above it. Tabbed back in and the game crashed. Seriously, screw you nvidia.

I heard the game may have a memory leak, but still, these 970s refuse to use more than 3500MB of VRAM unless they are forced to.
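
For anyone who wants to watch that ~3.5GB ceiling themselves, polling the driver works. A minimal sketch using standard nvidia-smi query flags; the parse helper and the sample string are mine, just for illustration:

```python
import subprocess

def parse_mem_used(csv_output):
    """Parse `nvidia-smi --query-gpu=memory.used --format=csv,noheader`
    output like '3536 MiB' into integer MiB values (one line per GPU)."""
    return [int(line.split()[0]) for line in csv_output.strip().splitlines()]

def query_mem_used():
    """Ask the driver for current VRAM usage per GPU (needs an NV GPU)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader"],
        text=True)
    return parse_mem_used(out)

# Example output as seen on a 970 pinned at the ~3.5GB boundary:
sample = "3536 MiB\n"
print(parse_mem_used(sample))  # [3536]
```

Run `query_mem_used()` in a loop while the game runs and you can log whether the counter ever crosses 3584 MiB.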


----------



## Moparman

I think all of you that don't own the card(s) really need to stop posting crap. I'm running three, and I can tell you it was a very big upgrade over my 4-way 4GB 680s. As far as the VRAM issue goes, I have not seen any problems; even Crysis 3 at max settings and 2880x1620 plays just awesome. This is supposed to be a discussion, not a bashing thread, especially from you guys who have never even tested the card yourselves.


----------



## Serandur

Quote:


> Originally Posted by *Moparman*
> 
> I think all you that don't own the card(s) really need to stop the posting of crap. I'm running 3 and I can tell you it was a very big upgrade over my 4way 4Gb 680s. As far as the Vram issue I have not seen any problems even Crysis 3 at max settings and 2880x1620 res it plays just awesome. This is supposed to be a discussion not a bashing thread especially from you guys that have never even tested the card yourself.


Crysis 3... the game that barely uses any VRAM in comparison to the demanding stuff coming today. That's not a great example. I'm running two and I can tell you the issues are very much real, but don't take my word for it. Nvidia pretty much confirmed it themselves, benchmarks pretty much confirmed it, and the issue is ablaze on the internet. Of course the 970s are powerful and run great when they're not running into VRAM issues (Crysis 3 won't do that to your cards), but the issue is real. The problem and its impact are established facts at this point. It doesn't mean the 970s are worthless or weak cards, but they do have this issue and Nvidia failed to disclose it as they should have.


----------



## 2010rig

Quote:


> Originally Posted by *GrimDoctor*
> 
> I'd love one of those, even two, but struggling to find one in Australia.


Have you considered a used Titan?


----------



## mouacyk

In a nutshell, a card with a partially disabled memory subsystem got marketed and sold as one with the 980's full subsystem, due to the incorrect specification highlighted here

(http://www.hardocp.com/images/articles/1411976595nitFZ11Eg1_1_2_l.gif)

and the marketing here

(http://www.hardocp.com/images/articles/14110637240cPED1snfp_5_8_l.gif)

That *SINGLE* 256KB L2 block had to go and mess up the 970. I feel for the NV engineers and driver programmers who HAD to come up with this clever compromise. Kudos to them, but in the end it's still dirty, because marketing claimed that 256KB of L2 was still there, saying the card has the same *memory subsystem* as the 980. A single piece throwing the entire system off... we all got cheated by yields, including Nvidia.
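
To see why that one crippled partition matters so much, here's a toy model of effective bandwidth as traffic spills into the slow segment. The numbers and the no-parallel-access simplification are my own assumptions (~196 GB/s for the 3.5GB segment on 7 controllers, ~28 GB/s for the 0.5GB segment); the real driver steers hot allocations to the fast segment, so treat this as a worst-case illustration:

```python
# Toy model of the 970's split memory: a 3.5 GB segment at ~196 GB/s
# and a 0.5 GB segment at ~28 GB/s behind a shared crossbar port.
# Assumes the two segments cannot be read in parallel (simplification).

FAST_GBPS, SLOW_GBPS = 196.0, 28.0

def effective_bandwidth(frac_slow):
    """Harmonic-mean bandwidth when a fraction of memory traffic
    hits the slow segment and accesses to the segments serialize."""
    return 1.0 / ((1.0 - frac_slow) / FAST_GBPS + frac_slow / SLOW_GBPS)

for frac in (0.0, 0.05, 0.125):
    print(f"{frac:6.1%} slow traffic -> {effective_bandwidth(frac):6.1f} GB/s")
```

Even 5% of traffic landing in the last 512MB drops effective bandwidth noticeably, and 12.5% (traffic spread evenly across all 4GB) cuts it to 112 GB/s in this model, which is the intuition behind the frametime spikes people are measuring.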

It's official -- you can get a *REFUND* according to NVidia post on GeForce forums.


Spoiler: Warning: Spoiler!



Hey,

First, I want you to know that I'm not just a mod, I work for NVIDIA in Santa Clara.

I totally get why so many people are upset. We messed up some of the stats on the reviewer kit and we didn't properly explain the memory architecture. I realize a lot of you guys rely on product reviews to make purchase decisions and we let you down.

It sucks because we're really proud of this thing. The GTX970 is an amazing card and I genuinely believe it's the best card for the money that you can buy. We're working on a driver update that will tune what's allocated where in memory to further improve performance.

Having said that, I understand that this whole experience might have turned you off to the card. If you don't want the card anymore you should return it and get a refund or exchange. If you have any problems getting that done, let me know and I'll do my best to help.

--Peter


----------



## spacin9

Quote:


> Originally Posted by *mercs213*
> 
> Playing Dying Light at max settings on 1080p on one monitor. Got the windows is low on memory pop up. Looked at VRAM usage was pegged at 3536MB and never went above. Tabbed back in and game crashed. Seriously, screw you nvidia.
> 
> I heard the game may have a memory leak but still, these 970s refuse to use more than 3500MB of VRAM unless they are forced to.


I see almost the same thing with Kombustor 3. 1GB test: 800 MB of system RAM used. 2GB test: 900 MB of system RAM used. 3GB test: 2,700 MB of system RAM used and a 14,000 MB page file! If all that's being tested is VRAM, and all else is equal, why does the program need 3x the system RAM to run it 10-15 fps slower than at 1GB and 2GB?


----------



## PostalTwinkie

Quote:


> Originally Posted by *mercs213*
> 
> Playing Dying Light at max settings on 1080p on one monitor. Got the windows is low on memory pop up. Looked at VRAM usage was pegged at 3536MB and never went above. Tabbed back in and game crashed. Seriously, screw you nvidia.
> 
> I heard the game may have a memory leak but still, these 970s refuse to use more than 3500MB of VRAM unless they are forced to.


You are confusing two entirely different things that sound similar, have no idea what you are even trying to talk about, and are yelling at Nvidia for it. All the while, you even state that you have heard of a Dying Light memory leak, which would cause your very issue.

Who else sees the real problem here?


----------



## spacin9

Quote:


> Originally Posted by *PostalTwinkie*
> 
> You are confusing two entirely different things that sound familiar, and have no idea what you are even trying to talk about, and yelling at Nvidia for it. All the while you even state that you have heard of a Daylight memory leak, which would cause your very issue.
> 
> Who else sees the real problem here?


me... see two posts back


----------



## Serandur

Quote:


> Originally Posted by *mouacyk*
> 
> *we all got messed by yields, including NVidia.*


Doubtful, it's probably just us. 28nm is a very mature process at this point, GM204 isn't exactly a big die, and the 980 is way overpriced as are the 970s to be honest considering how cheaply and cleanly TSMC can probably crank these things out for Nvidia now.


----------



## Exilon

Bro, I think you got your images reversed, but yeah, the idea is right. I'm glad I didn't sell my GTX 780 for SLI 970s. I'm also glad I gave my friend an R9 290 for his birthday instead of a GTX 970. Yay frugality.


----------



## Moparman

Quote:


> Originally Posted by *Serandur*
> 
> Crysis 3... the game that barely uses any VRAM in comparison to the demanding stuff coming today. That's not a great example. I'm running two and I can tell you the issues are very much real, but don't take my word for it. Nvidia pretty much confirmed it themselves, benchmarks pretty much confirmed it, and the issue is ablaze on the internet. Of course the 970s are powerful and run great when they're not running into VRAM issues (Crysis 3 won't do that to your cards), but the issue is real. The problem and its impact are established facts at this point. It doesn't mean the 970s are worthless or weak cards, but they do have this issue and Nvidia failed to disclose it as they should have.


Just wondering, have you tried to play it with everything at the highest settings and above 2560x1440? Yeah, it uses up the VRAM. My 4-way 680 setup was using 3934MB at 2880x1620.


----------



## skupples

Quote:


> Originally Posted by *Serandur*
> 
> Crysis 3... the game that barely uses any VRAM in comparison to the demanding stuff coming today. That's not a great example. I'm running two and I can tell you the issues are very much real, but don't take my word for it. Nvidia pretty much confirmed it themselves, benchmarks pretty much confirmed it, and the issue is ablaze on the internet. Of course the 970s are powerful and run great when they're not running into VRAM issues (Crysis 3 won't do that to your cards), but the issue is real. The problem and its impact are established facts at this point. It doesn't mean the 970s are worthless or weak cards, but they do have this issue and Nvidia failed to disclose it as they should have.


And neither are FC4, ACU, & Watch_Dogs, which 99% of the benchmarkers are using.

Crysis 3 also starts to use a good amount of memory when you start cranking the res & settings.

and at least we know Crysis 3 is properly optimized & smooth running, unlike the examples being beat to death... Wait, isn't Crysis 3 an AMD evolved title? I think it is. Funny how NV branded games continue to release in hot_mess_state, while AMD branded titles tend to release in a less_hot_mess_state.

Welp, I better lube up for the shill accusations. I'm sure I'm the only person on the face of the earth that experiences sub-par performance in all these Ubisoft titles.
Quote:


> Originally Posted by *Serandur*
> 
> Doubtful, it's probably just us. 28nm is a very mature process at this point, GM204 isn't exactly a big die, and the 980 is way overpriced as are the 970s to be honest considering how cheaply and cleanly TSMC can probably crank these things out for Nvidia now.


wasn't there an article released recently that actually stated Maxwell is costing a good bit more to produce than GK104 products?

damn, now I need to go find it.


----------



## mouacyk

Quote:


> Originally Posted by *Exilon*
> 
> Bro, I think you got you images reversed, but yeah the idea is right. I'm glad I didn't sell my GTX 780 for SLI 970. I'm also glad I gave my friend a R9 290 for his birthday instead of a GTX 970. Yay frugality.


Thanks for pointing out the ambiguity with my ordering of the images. I've reversed them and rephrased.
Quote:


> Originally Posted by *Serandur*
> 
> Doubtful, it's probably just us. 28nm is a very mature process at this point, GM204 isn't exactly a big die, and the 980 is way overpriced as are the 970s to be honest considering how cheaply and cleanly TSMC can probably crank these things out for Nvidia now.


Well, just trying to be nice to the "other" guys that keep this thread bumped.


----------



## ChrisB17

https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/162/

Check out the mod "Peter" - he is saying a driver fix is on the way.


----------



## Pnanasnoic

I haven't read every post in this thread, but it seems to me there's only one question left: what is Nvidia going to do about this? Will 970 buyers be compensated somehow?


----------



## GrimDoctor

Quote:


> Originally Posted by *ChrisB17*
> 
> https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/162/
> 
> Check out the mod "Peter" - he is saying a driver fix is on the way.


He has a pretty good attitude toward it, which is nice to see. In my experience, though, drivers won't be enough to resolve the issue in enough cases.


----------



## mouacyk

Quote:


> Originally Posted by *Pnanasnoic*
> 
> I haven't read every post in this thread but it seems to me there's only one question left; What is Nvidia going to do about this? Will 970 buyers be compensated somehow?


In that same GeForce forum post, Peter offers to work with you to accomplish a refund.


----------



## Serandur

Quote:


> Originally Posted by *skupples*
> 
> and neither is FC4, ACU, & Watch_Dogs. Which 99% of the benchmarkers are using.
> 
> Crysis 3 also starts to use a good amount of memory when you start cranking the res & settings.
> 
> and at least we know Crysis 3 is properly optimized & smooth running, unlike the examples being beat to death... Wait, isn't Crysis 3 an AMD evolved title? I think it is. Funny how NV branded games continue to release in hot_mess_state, while AMD branded titles tend to release in a less_hot_mess_state.


Are you kidding? I own both FC4 and AC:U. The former isn't too bad on VRAM without a good amount of AA, but it does get there, and the latter (Unity) has some pretty demanding ultra textures and high general VRAM usage. It's a hog, to the point where 2xMSAA is unplayable on my SLI 970s at 1440p with ultra textures because of VRAM. Watch Dogs is also a VRAM hog; remember the whole stuttering (especially with ultra textures) thing? Clear VRAM hog. The benchmarks on Watch Dogs listed by pcgameshardware are pretty conclusive. Crysis 3 isn't remotely in the same ballpark in VRAM usage, and it doesn't need to be; it's not open world like any of those three titles you listed... or the popular point of contention that is SoM, while we're at it. "Optimized" is such a bogus word, used so casually and with little understanding of the reality. Textures eat memory, and textures in an open-world game with very far-ranging draw distances voraciously devour the stuff. It's not about optimization; it's just the demands of more technically ambitious (in certain regards) games.
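
To put rough numbers on that, here's a back-of-envelope estimate of per-texture VRAM cost. This assumes uncompressed RGBA8 with a full mip chain; real games use compressed formats, so treat it as an upper-bound illustration:

```python
# Back-of-envelope VRAM cost of an uncompressed texture:
# width * height * bytes-per-texel, times ~4/3 for a full mip chain.

def texture_mib(width, height, bytes_per_texel=4, mips=True):
    """Approximate size in MiB of one texture; the mip chain adds ~1/3."""
    base = width * height * bytes_per_texel
    total = base * 4 // 3 if mips else base
    return total / 2**20

print(f"2048x2048 RGBA8 + mips: {texture_mib(2048, 2048):6.1f} MiB")
print(f"4096x4096 RGBA8 + mips: {texture_mib(4096, 4096):6.1f} MiB")
```

At ~21 MiB per uncompressed 2K texture, a couple hundred resident at once already pushes past 4 GB before geometry, render targets, and shadow maps, which is why open-world titles with long draw distances chew through VRAM so fast.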
Quote:


> welp, I better lube up for the shill accusations. I'm sure I'm the only person on the face of the earth that experiences sub par experiences in all these ubisoft titles.
> wasn't there an article released recently that actually stated Maxwell is costing a good bit more to produce than GK104 products?
> 
> damn, now I need to go find it.


These days, yeah, no doubt given they're both on the same manufacturing process and GK104 is much smaller than GM204. But what about GM204's initial production costs vs GK104's initial ones (three years ago mind you, when 28nm was still fresh)?


----------



## skupples

How are drivers going to resolve a hardware and architecture DESIGN?

Tell the drivers to do everything within their power to NEVER EVER USE THAT 512MB of memory!?
Quote:


> Originally Posted by *Serandur*
> 
> Are you kidding? I own both FC4 and AC:U. The former isn't too bad without a good amount of AA on VRAM, but it does get there and the latter (Unity) has some pretty demanding ultra textures and general VRAM usage. It's a hog, to the point where 2xMSAA is unplayable on my SLI 970s at 1440p with ultra textures because of VRAM. Watch Dogs is also a VRAM hog, remember the whole stuttering (especially with ultra textures) thing? Clear VRAM hog. The benchmarks listed on Watch Dogs by pcgameshardware are pretty conclusive. Crysis 3 doesn't isn't remotely in the same ballpark on VRAM usage and it doesn't need to be, it's not open world like any of those three titles you listed... or the popular point of contention that is SoM while we're at it. "Optimized" is such a bogus word used so casually and with little understanding of the reality. Textures eat memory, textures on an open world game with very far-ranging draw distances voraciously devour the stuff. It's not about optimization, it's just the demands of more technically ambitious (in certain regards) games.
> These days, yeah, no doubt given they're both on the same manufacturing process and GK104 is much smaller than GM204. But what about GM204's initial production costs vs GK104's initial ones (three years ago mind you, when 28nm was still fresh)?


ubisoft has documented a known issue with texture streaming in the current revision of Dunia (cryengine 2.75)

my issue with FC4/AC:U is that settings don't matter for me; they still run poorly, and I'm nowhere near 6GB used when trying to find "smooth" settings. Or maybe I should just stop expecting games to properly utilize multiple GPUs. That's probably a safer bet.

4.5GHz 4930K, striped SSDs, 32GB of C10 2400MHz memory, everything is rather stable (sorry, won't claim rah rah super stable 4 lyfe! because that simply isn't ever true). These products should run smoothly, 24/7, especially when GPU cores are below 99%, CPU cores are below 99%, and system memory usage is below 8GB out of 32. That, for me, is a development failure. Something that should have been identified during the endless QA & tuning sweeps, which are commonly labeled as "optimizations".

let me see if I can find that as well.


----------



## PatrikStar24

Quote:


> Originally Posted by *GrimDoctor*
> 
> I'd love one of those, even two, but struggling to find one in Australia.


If you don't mind going AMD, I found an 8 GB R9 290X here: http://www.mwave.com.au/product/sapphire-amd-radeon-r9-290x-vaporx-8gb-video-card-ab58131

On topic: Until more information has come out, all that needs to be said has already been said. Fanboys, shills, and astroturfers notwithstanding.
As for my reaction to this whole debacle, as a 970 owner, I feel burned. I bought the card not only as an upgrade to my Radeon HD 6870 (RIP), but also under the assumption that I would be getting the whole 4 GB upfront at full speed. You know... the way graphics cards have always been built?* I mean FFS, even the R9 270X has a 4 GB variant! Yes, it has a snowball's chance in hell of utilizing it all because of the GPU being used, but you still have access to it all, albeit at a slower speed compared to the GTX 970. I had already felt uneasy because of dealing with wildly fluctuating GPU usage (to the point of dipping to 0% at times, and before you ask, no, it's not a power problem. Highest known VRAM usage: 2.5 GB) in most of my modern games (2008-2013ish), but the memory was the final straw for me. I want to get a refund on principle, but that would leave me without a card. Even if I did have a spare, I'm well past the 30 day return window. I know that I'm stuck with it in the long run, but when it comes time to upgrade again, I'm going back to AMD.

*with exceptions? I honestly don't know


----------



## GrimDoctor

Quote:


> Originally Posted by *PatrikStar24*
> 
> If you don't mind going AMD, I found an 8 GB R9 290X here: http://www.mwave.com.au/product/sapphire-amd-radeon-r9-290x-vaporx-8gb-video-card-ab58131
> 
> On topic: Until more information has come out, all that needs to be said has already been said. Fanboys, shills, and astroturfers notwithstanding.
> As for my reaction to this whole debacle, as a 970 owner, I feel burned. I bought the card not only as an upgrade to my Radeon HD 6870 (RIP), but also under the assumption that I would be getting the whole 4 GB upfront at full speed. You know... the way graphics cards have always been built?* I mean FFS, even the R9 270X has a 4 GB variant! Yes, it has a snowball's chance in hell of utilizing it all because of the GPU being used, but you still have access to it all, albeit at a slower speed compared to the GTX 970. I had already felt uneasy because of dealing with wildly fluctuating GPU usage (to the point of dipping to 0% at times, and before you ask, no, it's not a power problem. Highest known VRAM usage: 2.5 GB) in most of my modern games (2008-2013ish), but the memory was the final straw for me. I want to get a refund on principle, but that would leave me without a card. Even if I did have a spare, I'm well past the 30 day return window.
> 
> *with exceptions? I honestly don't know


Thanks for that link, I saw that one late last night. It read up pretty well; I'll look into it more. I'll leave it at that so others can get back to... well... I don't know how to describe it. lol


----------



## mouacyk

Quote:


> Originally Posted by *PatrikStar24*
> 
> If you don't mind going AMD, I found an 8 GB R9 290X here: http://www.mwave.com.au/product/sapphire-amd-radeon-r9-290x-vaporx-8gb-video-card-ab58131
> 
> On topic: Until more information has come out, all that needs to be said has already been said. Fanboys, shills, and astroturfers notwithstanding.
> As for my reaction to this whole debacle, as a 970 owner, I feel burned. I bought the card not only as an upgrade to my Radeon HD 6870 (RIP), but also under the assumption that I would be getting the whole 4 GB upfront at full speed. You know... the way graphics cards have always been built?* I mean FFS, even the R9 270X has a 4 GB variant! Yes, it has a snowball's chance in hell of utilizing it all because of the GPU being used, but you still have access to it all, albeit at a slower speed compared to the GTX 970. I had already felt uneasy because of dealing with wildly fluctuating GPU usage (to the point of dipping to 0% at times, and before you ask, no, it's not a power problem. Highest known VRAM usage: 2.5 GB) in most of my modern games (2008-2013ish), but the memory was the final straw for me. I want to get a refund on principle, but that would leave me without a card. Even if I did have a spare, I'm well past the 30 day return window. I know that I'm stuck with it in the long run, but when it comes time to upgrade again, I'm going back to AMD.
> 
> *with exceptions? I honestly don't know


I know you're unsure of a refund, but it looks like NVidia's stepping up to offer this for people who want it.
https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4438142/#4438142

They must have just hired this guy ... love his attitude:
Quote:


> If anyone's post get deleted, PM a mod and we'll restore it. Feel free to hate on NVIDIA all you want, but please don't attack other posters.


----------



## Forceman

Haven't seen these charts posted yet, from Hardware Canucks. They tested 4 cards with scenarios that used under 3.5GB and then again where the usage was between 3.5GB and 4GB (but never over), and then graphed the average frametimes and difference. They used FCAT for the testing so the average should include any stuttering, but they haven't posted the actual FCAT charts yet.

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/68595-gtx-970s-memory-explained-tested-2.html
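As an aside on why an FCAT-based average is a reasonable check here: averaging frametimes folds any stutter spikes into the number, unlike a plain fps counter. A quick illustrative sketch (the frametime values below are made up, not from the Hardware Canucks data):

```python
# Hypothetical FCAT-style frametime log (milliseconds): mostly smooth
# 60 fps frames, with a handful of 50 ms stutter spikes mixed in.
frametimes_ms = [16.7] * 95 + [50.0] * 5

# The average frametime rises above 16.7 ms because the spikes count too.
avg_ms = sum(frametimes_ms) / len(frametimes_ms)

# The 99th-percentile frametime exposes the stutter directly.
worst_1pct_ms = sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99)]

print(f"average frametime: {avg_ms:.2f} ms ({1000 / avg_ms:.1f} fps)")
print(f"99th percentile frametime: {worst_1pct_ms:.1f} ms")
```

So even without the raw FCAT charts, a frametime average should reflect stutter that an fps average over whole seconds can hide.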


----------



## James296

^ Holy, some of those really do take quite a hit... now where is my popcorn.


----------



## 2010rig

Quote:


> Originally Posted by *skupples*
> 
> how are drivers going to resolve a hardware and architecture DESIGN issue?
> 
> tell the drivers to do everything within their power to NEVER EVER USE THAT 512MB of memory!?


Miss this post?
http://www.overclock.net/t/1537725/pcper-nvidia-responds-to-gtx-970-3-5gb-memory-issue/1250_50#post_23467347


----------



## kingduqc

I have not followed this story much, so I'm not sure I understood it perfectly. Could anyone correct me if I'm wrong?

Nvidia chopped the memory into two pools: 3.5GB and 512MB. The 512MB pool is much, much slower. They still marketed this as a 4GB card and even lied about the specs of the 970. People using more than 3.5GB are having issues. Angry people.

I have not seen any FCAT graphs; has any site posted one yet? The site above has some nice charts but no frametimes.
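That two-pool summary can be pictured as a toy allocator: fill the fast 3.5GB segment first, spill into the slow 512MB segment only when the fast one is full. This is purely illustrative (NOT NVIDIA's actual driver logic); only the 3,584MB/512MB split comes from the public statements:

```python
# Toy model of the GTX 970's segmented VRAM. Sizes in MB.
FAST_MB, SLOW_MB = 3584, 512

def place_allocation(size_mb, fast_used, slow_used):
    """Place an allocation fast-segment-first; return updated usage."""
    into_fast = min(size_mb, FAST_MB - fast_used)
    into_slow = size_mb - into_fast
    if slow_used + into_slow > SLOW_MB:
        raise MemoryError("out of VRAM")
    return fast_used + into_fast, slow_used + into_slow

fast, slow = 0, 0
for alloc in [2048, 1024, 768]:  # 3.75 GB total, so the last one spills
    fast, slow = place_allocation(alloc, fast, slow)

print(fast, slow)  # -> 3584 256: the third allocation straddles both segments
```

In this model, anything under 3.5GB never touches the slow segment, which matches the behavior NVIDIA described.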


----------



## Forceman

Quote:


> Originally Posted by *James296*
> 
> ^ Holy, some of those really do take quite a hit... now where is my popcorn.


I think the take-away is supposed to be that both the 980 and 970 take similar hits above 3.5GB.

They all take a pretty significant hit, but it isn't clear if that is because of the increased settings they used to force higher VRAM or just because all the cards have trouble when nearly all the VRAM is allocated.


----------



## 2010rig

Quote:


> Originally Posted by *kingduqc*
> 
> I have not followed this story much, so I'm not sure I understood it perfectly. Could anyone correct me if I'm wrong?
> 
> Nvidia chopped the memory into two pools: 3.5GB and 512MB. The 512MB pool is much, much slower. They still marketed this as a 4GB card and even lied about the specs of the 970. People using more than 3.5GB are having issues. Angry people.
> 
> I have not seen any FCAT graphs; has any site posted one yet? The site above has some nice charts but no frametimes.


This'll sum it up for ya
http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Discloses-Full-Memory-Structure-and-Limitations-GTX-970


----------



## GrimDoctor

Quote:


> Originally Posted by *Forceman*
> 
> I think the take-away is supposed to be that both the 980 and 970 take similar hits above 3.5GB.
> 
> They all take a pretty significant hit, but it isn't clear if that is because of the increased settings they used to force higher VRAM or just because all the cards have trouble when nearly all the VRAM is allocated.


Hmm. Now my brain just exploded. Don't know what to do now.


----------



## mouacyk

Quote:


> Originally Posted by *Forceman*
> 
> I think the take-away is supposed to be that both the 980 and 970 take similar hits above 3.5GB.
> 
> They all take a pretty significant hit, but it isn't clear if that is because of the increased settings they used to force higher VRAM or just because all the cards have trouble when nearly all the VRAM is allocated.


The only thing made officially clear on the significant hits is that the 970 has an initially undocumented L2 block hampering memory. Are you suggesting that the 980 could have something similar?


----------



## gamervivek

Quote:


> If anyone's post get deleted, PM a mod and we'll restore it. Feel free to hate on NVIDIA all you want, but please don't attack other posters.


lol let the shills be in peace!


----------



## Forceman

Quote:


> Originally Posted by *mouacyk*
> 
> The only thing made officially clear on the significant hits is that the 970 has an initially undocumented L2 block hampering memory. Are you suggesting that the 980 could have something similar?


I don't know, the 290X also takes a significant hit. Does it have something similar?


----------



## raghu78

Quote:


> Originally Posted by *Forceman*
> 
> Haven't seen these charts posted yet, from Hardware Canucks. They tested 4 cards with scenarios that used under 3.5GB and then again where the usage was between 3.5GB and 4GB (but never over), and then graphed the average frametimes and difference. They used FCAT for the testing so the average should include any stuttering, but they haven't posted the actual FCAT charts yet.
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/68595-gtx-970s-memory-explained-tested-2.html


yeah, the FCAT charts and more games will follow. The GTX 970 shows a bigger hit than the GTX 980 in BF4 and Middle Earth. The R9 290X proves it's a beast and takes the least hit in all 4 games once VRAM usage goes above 3.5 GB (up to 4 GB). In Middle Earth, Hitman and BF4 the R9 290X literally crushes the GTX 970:

Middle Earth - the gap increases from just under 10% to 24% when VRAM usage is between 3.5 GB and 4 GB
Battlefield 4 - the gap increases from just under 3% to 17% when VRAM usage is between 3.5 GB and 4 GB
Hitman - the gap increases from just above 11% to above 30% when VRAM usage is between 3.5 GB and 4 GB
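For anyone checking the math, a gap figure like those is just the relative fps difference between the two cards (the fps values in this sketch are hypothetical, not taken from the Hardware Canucks tables):

```python
def gap_pct(faster_fps, slower_fps):
    """Percent advantage of the faster card over the slower one."""
    return (faster_fps - slower_fps) / slower_fps * 100

# e.g. if the R9 290X held 62 fps while the GTX 970 dropped to 50 fps
# once VRAM usage crossed 3.5 GB, the gap would be:
print(f"{gap_pct(62, 50):.0f}%")  # -> 24%
```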


----------



## Xoriam

Quote:


> Originally Posted by *Forceman*
> 
> I don't know, the 290X also takes a significant hit. Does it have something similar?


It's a typical scenario when you get close to full memory usage.

The same sort of thing happens with RAM in operating systems.

SSDs also show the same sort of behavior near capacity.


----------



## Forceman

Quote:


> Originally Posted by *raghu78*
> 
> yeah, the FCAT charts and more games will follow. The GTX 970 shows a bigger hit than the GTX 980 in BF4 and Middle Earth. The R9 290X proves it's a beast and takes the least hit in all 4 games once VRAM usage goes above 3.5 GB (up to 4 GB). In Middle Earth, Hitman and BF4 the R9 290X literally crushes the GTX 970:
> 
> Middle Earth - the gap increases from just under 10% to 24% when VRAM usage is between 3.5 GB and 4 GB
> Battlefield 4 - the gap increases from just under 3% to 17% when VRAM usage is between 3.5 GB and 4 GB
> Hitman - the gap increases from just above 11% to above 30% when VRAM usage is between 3.5 GB and 4 GB


Yeah, presumably that's the bandwidth advantage kicking in. Or else better VRAM management in the drivers.


----------



## skupples

Quote:


> Originally Posted by *2010rig*
> 
> Miss this post?
> http://www.overclock.net/t/1537725/pcper-nvidia-responds-to-gtx-970-3-5gb-memory-issue/1250_50#post_23467347


doesn't mean much until they explain their proof... unless I missed that as well.


----------



## 2010rig

Quote:


> Originally Posted by *skupples*
> 
> doesn't mean much until they explain their proof... unless I missed that as well.


nah, you didn't, I'd like to see how they came to that conclusion. Just sayin' it's *possible* this could be addressed via drivers.


----------



## Serandur

Yo guys, post #2412:

https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4438090/#4438090

"Hey,

First, I want you to know that I'm not just a mod, I work for NVIDIA in Santa Clara.

I totally get why so many people are upset. We messed up some of the stats on the reviewer kit and we didn't properly explain the memory architecture. I realize a lot of you guys rely on product reviews to make purchase decisions and we let you down.

It sucks because we're really proud of this thing. The GTX970 is an amazing card and I genuinely believe it's the best card for the money that you can buy. We're working on a driver update that will tune what's allocated where in memory to further improve performance.

Having said that, I understand that this whole experience might have turned you off to the card. If you don't want the card anymore you should return it and get a refund or exchange. If you have any problems getting that done, let me know and I'll do my best to help.

--Peter"


----------



## IMI4tth3w

I still think this is something that can be addressed, or at least minimized, with driver updates. It would have been nice to know from the get-go though. I think some blowing out of proportion is going on here too.

- a very satisfied EVGA GTX 970 SSC ACX 2.0 owner


----------



## 2010rig

Quote:


> Originally Posted by *mouacyk*
> 
> I know you're unsure of a refund, but looks like NVidia's stepping up to offer this for people who want it.
> https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4438142/#4438142
> 
> They must have just hired this guy ... love his attitude:
> Quote:
> 
> 
> 
> If anyone's post get deleted, PM a mod and we'll restore it. Feel free to hate on NVIDIA all you want, but please don't attack other posters.

How can anyone hate NVIDIA? That's awesome.

People are seriously asking for $100 vouchers to upgrade to a 980; maybe they should first explain how this issue is worth $100 towards a new upgrade, and/or how their 970 is suddenly worth $100 less.

They should also have to start their sentence with: I understand the missing ROPs don't mean anything, and this whole RAM "limitation" is affecting my gaming experience in the following ways:

Another guy asked if he returns the card, can he keep the FREE game? Lol


----------



## djriful

Yeah... when news explodes... people overreact... exaggerate to the max... typical American society.


----------



## damric

I wonder what reviewers will think of a GTX 960 Ti 4GB if it is indeed a 25% cut-down GTX 980. That's a whole GB of VRAM in the gutter, lol.


----------



## spacin9

Quote:


> Originally Posted by *Serandur*
> 
> Yo guys, post #2412:
> 
> https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4438090/#4438090
> 
> "Hey,
> 
> First, I want you to know that I'm not just a mod, I work for NVIDIA in Santa Clara.
> 
> I totally get why so many people are upset. We messed up some of the stats on the reviewer kit and we didn't properly explain the memory architecture. I realize a lot of you guys rely on product reviews to make purchase decisions and we let you down.
> 
> It sucks because we're really proud of this thing. The GTX970 is an amazing card and I genuinely believe it's the best card for the money that you can buy. We're working on a driver update that will tune what's allocated where in memory to further improve performance.
> 
> Having said that, I understand that this whole experience might have turned you off to the card. If you don't want the card anymore you should return it and get a refund or exchange. If you have any problems getting that done, let me know and I'll do my best to help.
> 
> --Peter"


That's what my retailer told me... they said the board partner should help, and that if they don't, they would. It's nice to see NV throw in on this also. I'll be getting 980s (if true).


----------



## wermad

Glad to see things being taken care of for those who are not happy w/ this.


----------



## spacin9

It's being reported that EVGA is giving step-ups to 980s past the window. I can't see the others not doing it.


----------



## wermad

Quote:


> Originally Posted by *spacin9*
> 
> It's being reported that EVGA is giving step-ups to 980s *passed the window*. I can't see the others not doing it.


By this you mean, passed the offer window, or passed approval (given the go ahead)???

edit: btw, I'm joining the "pissed off" bandwagon as I've read more in depth on the issue, and it does seem like Nvidia really dropped the ball. This quote really summed it up for me that Nvidia did not handle this properly. Intentional? Hmmm... don't know tbh, but I'm leaning toward yes (cover-up) to ensure the 970 could tangle w/ Hawaii. Sorry to all those who got burned by this. I have a friend who's fuming right now as he passed up the option to get the 980 in the first place. I told him about the EVGA step-up, though he's already deployed and won't know when he'll have time off again.
Quote:


> Spoiler: Warning: Spoiler!
> 
> 
> 
> "NVIDIA says this was an error in the reviewer's guide and a misunderstanding between the engineering team and the technical PR team on how the architecture itself functioned."
> 
> http://www.gamespot.com/articles/nvidia-admits-to-error-in-gtx-970-specs-and-memory/1100-6424915/


----------



## mtcn77

Some bad impressions cannot be refunded(along with priceless time & cargo fees). Been there, done that.
Edit: sorry, wrong hyperlink.


----------



## darealist

Lol. If Nvidia gets sued, the next x70 card will cost much more to cover the losses.


----------



## Swolern

Quote:


> Originally Posted by Serandur View Post
> 
> Yo guys, post #2412:
> 
> https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4438090/#4438090
> 
> "Hey,
> 
> First, I want you to know that I'm not just a mod, I work for NVIDIA in Santa Clara.
> 
> I totally get why so many people are upset. We messed up some of the stats on the reviewer kit and we didn't properly explain the memory architecture. I realize a lot of you guys rely on product reviews to make purchase decisions and we let you down.
> 
> It sucks because we're really proud of this thing. The GTX970 is an amazing card and I genuinely believe it's the best card for the money that you can buy. We're working on a driver update that will tune what's allocated where in memory to further improve performance.
> 
> Having said that, I understand that this whole experience might have turned you off to the card. If you don't want the card anymore you should return it and get a refund or exchange. If you have any problems getting that done, let me know and I'll do my best to help.
> 
> --Peter"


At least someone at Nvidia has the balls to say they messed up. The PR bullcrap Nvidia put out in their initial response just pisses people off more.


----------



## wermad

Quote:


> Originally Posted by *darealist*
> 
> Lol. If Nvidia gets sued, the next x70 card will cost much more to cover the losses.


As I mentioned 1000 pages ago, they took flak for GTX 570s blowing to death. Being runner-up to the GTX x80 does not seem like a happy place in Nvidia land.

EVGA step up info:


Spoiler: Warning: Spoiler!



Quote:


> EVGA Step-Up® - Upgrade program
> 
> The EVGA Step-Up program helps you have the peace of mind to purchase your favorite product now, knowing you can upgrade to the latest parts that come out within the first 90 days from your purchase. In the computer hardware world, 90 days is a lifetime! With a valid part, once per purchase, you can upgrade your EVGA product by paying the difference between what you paid and the price on EVGA's website. It's really that simple!
> 
> Current cutoff for products purchased is 10/29/2014 in order to submit a request today. Products purchased on or after this date may be eligible for the Step-Up program.
> 
> EVGA Step-Up is currently only available to residents of the United States (not including outlying territories), Canada and EU Countries.
> What products qualify for the Step-Up program?
> 
> You can unlock the Step-Up option for any Limited Lifetime Warranty or most 3 Year Warranty products as long as it meets the conditions listed below. Once the Step-Up option is unlocked, you will have 90 days from the original purchase date for your product to start the upgrade process through the EVGA website. Once the Step-up process is started you will receive a confirmation email.
> 
> Register your Limited Lifetime Warranty product purchased new from an authorized reseller within 90 days of your original purchase date.
> Register your qualifying 3 Year Warranty product purchased new from an authorized reseller within 14 days of your original purchase date.
> If you should happen to miss your 14 day registration window on your qualifying 3 Year Warranty product then you may also purchase an extended warranty within 90 days of your original purchase date to enable the Step-Up option.
> Don't worry, your Extended Warranty and Advanced RMA (EAR) purchases now transfers to your new Step-Up product!
> The EVGA Step-Up program is currently only available to residents of the United States, Canada, and EU Countries.
> 
> How does the EVGA Step-Up process work?
> 
> There are several steps to the Step-Up program. Because we require that you send your original product to EVGA first, we have instituted a process that makes sure you have the least amount of downtime possible. Once you have gotten through the queue, approval and payment phases, you can send your product in to EVGA and once we receive it we will turn it around as quickly as possible. Shipping both directions will be your responsibility.
> Stepping through the Step-Up
> 
> Register your valid product within 90 days from your original purchase
> If your qualifying product has a 3 year warranty purchased on or after 3/1/2013 and registered within 14 days of the invoice date by the original owner from an authorized reseller, then you will receive the option to step-up for the first 90 days of ownership.
> If your qualifying product has a 3 year warranty purchased on or after 1/1/11 and registered after 14 days of the original purchase date you must also purchase an extended warranty in order to be qualified for the step-up program.
> Upload a copy of your invoice
> Request to start your Step-Up within 90 days of your original purchase from your product detail page
> Navigate to My Products under Member Home
> Click on View Details of the product
> If your product qualifies, you will see the option to start your Step-Up
> Wait in queue until approval - this step can last from a few days to many weeks depending on demand
> EVGA will review your request and you will be approved if you have met the terms and conditions
> You will have 7 days to pay for the difference between the two products, plus return shipping and applicable taxes, after you have been approved
> You will be instructed to send your product in and you will have 14 business days to do so after your payment is verified
> When EVGA receives your product, we will return your new in box upgrade
> EVGA will automatically register your product so you will receive the full warranty on your Step-Up (as of 1/1/11)
> Any product auto-registered will have its Extended Warranty transferred automatically.
> You will receive your product shipped via the shipping option you chose and paid for - game on!
> 
> You will receive detailed emails at every step to instruct what must be done to continue. All Step-Up requests must go through this exact process, you may not purchase the new product and send your original product back for a refund afterward.
> Frequently Asked Questions
> Why do I have to wait in queue?
> 
> We have a queue for our Step-Up process so you can keep your current EVGA product until we have your new product ready and waiting for you in our warehouse. Sometimes, very high demand will necessitate a longer than usual queue, but once you have made your request and you are in queue the 90 day limit no longer applies - you are considered in the Step-Up process.
> 
> If you are still within the 90 days from your original purchase, you can cancel your Step-Up and start a new process yourself. If you are out of the 90 days from your original purchase and you leave the queue, you will not be allowed to start a new Step-Up.
> Can I get the new product first in a cross-shipment?
> 
> There are no options for cross-shipping a Step-Up product at this time. If you have purchased EVGA's EAR on your original product, this does not apply to Step-Up however the EAR will transfer to your new product.
> Can I send in more than one product for one Step-Up?
> 
> No - Step-Up is available only for trading one product for another, not many products for one.
> Can I Step-Up if I'm not the original owner?
> 
> No - Step-Up is available only to the original owner with a verified authorized invoice.
> Can I Step-Up a product more than once?
> 
> No - A product received from Step-Up cannot be used in the Step-Up program.
> Can I Step-Up my EVGA Power Supply or Mouse?
> 
> No - These products do not qualify for the Step-up Program.
> What happens when I Step-Up to a cheaper product?
> 
> You will only have to pay for the return shipping at the time of paying for your Step-Up. EVGA will not credit you the difference.
> What kind of product will I get?
> 
> All Step-Up replacements are new. There are limitations to what products are available to Step-Up TO, these are listed below.
> 
> Step-Up products are considered a special new product, warranty will start from the shipping date of your new product from EVGA Step-Up.
> What do I need to send back to EVGA?
> 
> The original product and all accessories that came with the original package. The original packaging material and box are recommended to be returned for the best protection for your product. Free promotional items such as games, posters, shirts, etc. are considered gifts and are not required to be returned.
> How long does "Verifying Transaction" take?
> 
> Once you have paid for your Step-Up, the next step will show "Verifying Transaction". Essentially EVGA customer service will do a final check on all the details and then send you the RMA number. This takes 1-2 business days.
> How long does it take EVGA to ship out my new product once they receive my old one?
> 
> All RMAs, even Step-Ups, will be processed within 1-3 business days. This means from the day we receive your old product, we will process and ship out your new Step-Up product within 3 business days. If there are any problems with your shipment (No RMA number, no serial number sticker, damage, etc.), this will delay the processing time.
> Can I use EVGA bucks to pay for my Step-Up?
> 
> EVGA bucks are good only for new purchases made at EVGA's online store. The EVGA Step-Up is considered a promotion and cannot be combined with EVGA bucks in any way.
> Can I change my shipping once my Step-Up has started?
> 
> No, you must cancel your Step-Up and resubmit with your new shipping selected. Note: If you are still within the 90 days from your original purchase, you can cancel your Step-Up and start a new process yourself. If you are out of the 90 days from your original purchase and you leave the queue, you will not be allowed to start a new Step-Up.
> Why did my price paid get changed during the approval process?
> 
> One of two reasons:
> 
> After reviewing the price paid on your invoice, the Step-Up department will adjust pricing based on your invoice.
> If there was a rebate for your product, the Step-Up department will adjust pricing to reflect rebates.
> 
> Will my extended warranty transfer to my new product?
> 
> Yes, any product that is auto-registered through EVGA's RMA system will have its extended warranty transferred to the new product.
> Will my unused EAR transfer to my new product?
> 
> Yes, any product that is auto-registered through EVGA's RMA system will have its EAR transferred to the new product.
> Step-Up Inventory - What can I Step-Up to?
> 
> EVGA maintains a list of current graphics cards as well as motherboards available to Step-Up to. Products listed are mostly stock (not factory overclocked, or water cooled) due to stock limitations.
> EVGA US
> 
> Motherboards
> EVGA X99 FTW (150-HE-E997-KR) (PDF)
> EVGA X99 Classified (151-HE-E999-KR) (PDF)
> EVGA X79 FTW (151-SE-E777-KR) (PDF)
> EVGA X79 Dark (150-SE-E789-K2) (PDF)
> 
> Graphics Card
> EVGA GeForce GTX 980 ACX 2.0 (04G-P4-2981-KR) (PDF)
> EVGA GeForce GTX 970 SSC ACX 2.0+ (04G-P4-3975-KR) (PDF)
> EVGA GeForce GTX 970 FTW+ ACX 2.0+ (04G-P4-3978-KR) (PDF)
> EVGA GeForce GTX 970 FTW ACX 2.0 (04G-P4-2978-KR) (PDF)
> EVGA GeForce GTX 960 SuperSC ACX 2.0+ (02G-P4-2966-KR) (PDF)
> EVGA GeForce GTX 750 Ti (02G-P4-3751-KR) (PDF)
> EVGA GeForce GTX 750 (01G-P4-2751-KR) (PDF)
> EVGA GeForce GTX 660 Superclocked (02G-P4-2662-KR) (PDF)
> EVGA GeForce GTX 650 (01G-P4-2650-KR) (PDF)
> EVGA GeForce GT 610 2GB (02G-P3-2619-KR) (PDF)
> 
> EVGA EU
> 
> EVGA GeForce GTX 980 ACX 2.0 (04G-P4-2981-KR) (PDF)
> EVGA GeForce GTX 970 SSC ACX 2.0+ (04G-P4-3975-KR) (PDF)
> EVGA GeForce GTX 970 FTW+ ACX 2.0+ (04G-P4-3978-KR) (PDF)
> EVGA GeForce GTX 970 FTW ACX 2.0 (04G-P4-2978-KR) (PDF)
> EVGA GeForce GTX 970 ACX 2.0 (04G-P4-2972-KR) (PDF)
> EVGA GeForce GTX 960 SuperSC ACX 2.0+ (02G-P4-2966-KR) (PDF)
> EVGA GeForce GTX 760 Dual w/ ACX Cooler (02G-P4-3763-KR) (PDF)
> EVGA GeForce GTX 760 4GB w/ ACX Cooler (04G-P4-2767-KR) (PDF)
> EVGA GeForce GTX 760 (02G-P4-2761-KR) (PDF)
> EVGA GeForce GTX 750 2GB Superclocked (02G-P4-2754-KR) (PDF)
> EVGA GeForce GTX 750 (01G-P4-2751-KR) (PDF)
> EVGA GeForce GTX 660 FTW w/ EVGA ACX Cooler (02G-P4-3063-KR) (PDF)
> EVGA GeForce GT 730 2GB (02G-P3-2738-KR) (PDF)
> 
> Products can be removed from or added to Step-Up availability at any time.
> Calculating Step-Up Costs
> 
> Simply put, the cost of EVGA Step-Up is the difference between what you paid for your original product and the new product as listed at the EVGA Store. To calculate what you paid, follow these simple steps:
> 
> Add line item price
> Do not include taxes
> Do not include shipping
> Subtract rebates from the line item price
> Subtract from MSRP as listed on EVGA.
> 
> Example
> 
> You have purchased a graphics card for $299.99, tax was $26.25 and $10.25 in shipping, you also took advantage of a $30.00 rebate from EVGA. The product you wish to Step-Up to is listed as MSRP of $399.99.
> 
> Add $299.99
> Ignore $26.25 tax and $10.25 shipping
> Subtract $30.00 for rebate to get $269.99
> Subtract $269.99 from $399.99
> $130 is the cost of this example, plus return shipping and applicable taxes
> 
> Return shipping (from EVGA back to you) is variable dependent upon your location, current rates and shipping speed you select.
> 
> Residents of California are required to pay sales tax for Step-Up transactions.
> Terms and Conditions
> 
> Only EVGA products are eligible for the Step-Up program.
> 
> EVGA Step-Up is currently only available to residents of the United States, Canada and EU Countries.
> 
> EVGA Step-Up is available only to the original end-user purchaser of the product from a valid reseller.
> 
> The EVGA Step-Up program is available to EVGA graphics cards and motherboards for customers within 90 days of their original purchase date based on invoice verification.
> Graphics Cards
> 
> EVGA will only release reference versions of its products, NVIDIA reference spec and clock, to the Step-Up program.
> Step-Up is limited to pre-approved graphics cards only and can only be used for exchange to a different and higher performing GPU.
> Products known to have a limited availability will not be made available to the Step-Up program. (Limited availability determined by EVGA.)
> Customers who received their EVGA graphics card as part of a complete computer system are not eligible - except for those listed on our approved system vendor list.
> 
> Graphics Cards Examples:
> 
> GTX 460 → GTX 580: YES (Upgraded GPU)
> GTX 460 768MB → GTX 460 1GB: YES (Upgraded Memory)
> GTX 970 FTW → GTX 970 FTW+ : YES (Upgraded Model)
> 
> Motherboards
> 
> EVGA motherboard products are limited to pre-approved models and same socket-type only.
> Products known to have limited availability will not be made available to the Step-Up program. (Limited availability determined by EVGA.)
> Customers who received their EVGA motherboard as part of a complete computer system are not eligible - except for those listed on our approved system vendor list.
> 
> Motherboard Examples:
> 
> P55 SLI (Socket 1156) → X58 SLI (Socket 1366): NO (Different CPU Socket)
> P55 SLI (Socket 1156) → P55 FTW (Socket 1156): YES (Same CPU Socket)
> 
> A qualifying invoice must be dated on 10/29/2014 or after in order to submit a request today. If your invoice is dated as purchased before this date, you do not qualify for Step-Up.
> 
> Only products that carry a valid limited lifetime warranty, or products that have purchased an extended warranty, qualify for the Step-Up program.
> 
> Only new purchases qualify for the Step-Up program. Products received from EVGA RMA may qualify for the Step-Up program if your original product's purchase date is still within 90 days from the time you start the request and qualified for Step-Up before your RMA.
> 
> Products received from EVGA Step-Up do not qualify for the Step-Up program.
> 
> Products received from EVGA Step-Up are not considered full retail purchases and therefore do not qualify for any rebates.
> 
> Products must be purchased from an authorized reseller, retailer, e-tailer or distributor of EVGA products. Products purchased through unauthorized channels, such as second hand or through eBay, do not qualify for the Step-Up program. Products received free through promotion or contest do not qualify for the Step-Up program.
> 
> Verification and approval requires a valid copy of your original receipt (proof of purchase).
> 
> Products must be returned in their original factory condition (back plates, brackets, etc. will be removed and discarded) and must be free from physical modification or damage per the terms of our warranty.
> 
> Products made available for EVGA Step-Up are of limited availability and will be distributed on a first-come first-serve basis.
> 
> Once EVGA has received your original product, the Step-Up process is non-reversible and non-refundable.
> 
> The EVGA Step-Up cannot be combined with any other promotions or coupons.
> 
> EVGA reserves the right to cancel the Step-Up program at any time.
> 
> EVGA reserves the right to change the terms of the Step-Up program at any time without notice.
> 
> If you have any questions about the Step-Up program, please contact customer service or by calling 888-881-EVGA (3842) M-F 9AM - 5:30PM Pacific.
> 
> http://www.evga.com/support/stepup/
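
The Step-Up arithmetic in the quote above is simple enough to sketch in a few lines. This is just an illustration of EVGA's worked example (the function name and structure are mine, not an official calculator), and it leaves out return shipping and applicable taxes, which are billed on top:

```python
def step_up_cost(paid, rebates, new_msrp):
    """Step-Up cost per the quoted rules: MSRP of the new product
    minus what you effectively paid (line-item price less rebates;
    your original taxes and shipping are excluded). Return shipping
    and any applicable taxes are added on top of this."""
    effective_paid = paid - rebates
    return new_msrp - effective_paid

# EVGA's worked example: $299.99 card, $30.00 rebate, $399.99 target MSRP.
print(round(step_up_cost(paid=299.99, rebates=30.00, new_msrp=399.99), 2))  # 130.0
```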


----------



## spacin9

Quote:


> Originally Posted by *wermad*
> 
> By this you mean, passed the offer window, or passed approval (given the go ahead)???


Trading-in I guess. Depending how it all goes down, I may or may not participate. If I get full value for my cards to trade-in, it's an option, where I had no option before.

My issue is not so much the performance; it's the loss of value. I bought premium cards that won't fetch a premium price on eBay.

Case in point: there are two golden MSI 970s up for $360 apiece on eBay right now. He might be waiting a while.


----------



## wermad

Quote:


> Originally Posted by *spacin9*
> 
> Trading-in I guess. Depending how it all goes down, I may or may not participate. If I get full value for my cards to trade-in, it's an option, where I had no option before.
> 
> My issue is not so much the performance; it's the loss of value. I bought premium cards that won't fetch a premium price on eBay.


Yeah, I hear ya. It totally sucks tbh. I guess it hit home for me once my friend found out. He's probably gonna ship me his card and I'll receive the replacement and test it for him before sending it to him overseas. Kind of feel bad since I did make a good pitch on the 970 (GSync got him hyped up, but I told him a GSync monitor was needed).


----------



## raghu78

Quote:


> Originally Posted by *damric*
> 
> I wonder what reviewers will think of a GTX 960 Ti 4GB if it is indeed a 25% cut-down GTX 980. That's a whole GB of VRAM in the gutter, lol.


Not really. Binning is done to salvage a lot of defective chips, and Nvidia would rather sell a GM204 with a complete GPC disabled and a 64-bit memory controller (with the associated L2/ROP) disabled: 1536 CC, 3 GPC, 12 SM, 192-bit memory controller, 48 ROP, 1.5 MB L2 cache and 3 GB VRAM. This would also be a well-balanced chip: it would beat the R9 280X easily by a good margin and trail the R9 290 closely at 1080p, while falling behind at 1440p and 4K. You can expect such a GPU by late Q1 or early Q2.


----------



## wermad

Looks like EVGA is helping out those outside the StepUp window:

Quote:


> EVGA is allowing me to Step Up outside of my promotional period. There is a reason I only buy EVGA. Customer service is top notch.
> 
> -the_real_maverick


http://forums.evga.com/FindPost/2285916

for EVGA support!


----------



## darealist

Unless they offer a discount on the stepup, I wouldn't even bother investing anymore money into this disappointing series.


----------



## Final8ty

Quote:


> Originally Posted by *Forceman*
> 
> Haven't seen these charts posted yet, from Hardware Canucks. They tested 4 cards with scenarios that used under 3.5GB and then again where the usage was between 3.5GB and 4GB (but never over), and then graphed the average frametimes and difference. They used FCAT for the testing so the average should include any stuttering, but they haven't posted the actual FCAT charts yet.
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/68595-gtx-970s-memory-explained-tested-2.html


The problem is that at that res, GPU grunt is a bigger factor than VRAM bandwidth, though bandwidth still affects it to some extent. What needs to be done is to fill up the VRAM with textures, not with GPU-intensive settings like resolution.


----------



## wermad

Quote:


> Originally Posted by *darealist*
> 
> Unless they offer a discount on the stepup, I wouldn't even bother investing anymore money into this disappointing series.


Not likely, though they will honor the MSRP (sans rebates), since this program involves the retail part of EVGA. Their manufacturing arm has nothing to do w/ the issue, and hence I don't see why they should reimburse customers.

Probably, Nvidia will give out those free game coupons. That's my guess.


----------



## spacin9

Quote:


> Originally Posted by *darealist*
> 
> Unless they offer a discount on the stepup, I wouldn't even bother investing anymore money into this disappointing series.


Quite. I'll have to see what they say over @ Zotac. They did me right twice so...


----------



## spacin9

Quote:


> Originally Posted by *wermad*
> 
> Not likely, though they will honor the msrp (sans rebates) since this program involves the retail part of EVGA. Their manufacturing part has nothing to do w/ the issue and hence I don't see why they should reimburse customers.
> 
> Probably, Nvidia will give out those free game coupons. That's my guess


So how are you recommending 970s when you have 3 290s... did I read that right lol? With a 4690k? Or are you just ball-breaking?


----------



## Cyro999

Quote:


> Originally Posted by *Wirerat*
> 
> 250 watt gpus are a pita unless you're on water imo.


Well, I keep my 970 below 60C at all times on air. Wouldn't mind an extra 23% more cores in there at 1.25v.


----------



## wermad

Quote:


> Originally Posted by *spacin9*
> 
> So how are you recommending 970s when you have 3 290s... did I read that right lol? With a 4690k? Or are you just ball-breaking?


For my friend? He wanted Nvidia (mainly due to the GSync hype). I tried to push him w/ Amd but he opted for the newer tech. Good thing is that he does qualify for the StepUp program (970 acx) and is probably gonna dish out for the 980.

Other than this, I don't have a firm-grip allegiance to either camp. I'm willing to help, no matter whose GPU you have. I started off with: 4870x2+4870 > 3x 470s > 4x 480s > 2x 590s > 3x 6950s 2x 6970 Ltg > 4x 580s 3gb > 670 4gb > 2x 690s > 2x Titan (yeah, yeah, yeah, I know....) > 3x 780s > 4x 7970 Ltg (5x1 Eyefinity!!!!!!!!!!) > 2x 280Xs > 3x 290s > ??? I've sampled quite a bit of both sides, so I can share some of that experience w/ others.

And, no, you don't need a bigger e-peen with the top-of-the-line CPU just to game. I've only seen a few games w/ minor differences (4 vs 8 or more threads). Actually, I'm running a G3258 right now.


----------



## spacin9

Quote:


> Originally Posted by *wermad*
> 
> For my friend? He wanted Nvidia (mainly due to the GSync hype). I tried to push him w/ Amd but he opted for the newer tech. Good thing is that he does qualify for the StepUp program (970 acx) and is probably gonna dish out for the 980.
> 
> Other than this, I don't have a firm-grip allegiance to either camp. I'm willing to help, no matter whose GPU you have. I started off with: 4870x2+4870 > 3x 470s > 4x 480s > 2x 590s > 3x 6950s 2x 6970 Ltg > 4x 580s 3gb > 670 4gb > 2x 690s > 2x Titan (yeah, yeah, yeah, I know....) > 3x 780s > 4x 7970 Ltg (5x1 Eyefinity!!!!!!!!!!) > 2x 280Xs > 3x 290s > ??? I've sampled quite a bit of both sides, so I can share some of that experience w/ others.
> 
> And, no, you don't need a bigger e-peen with the top-of-the-line CPU just to game. I've only seen a few games w/ minor differences (4 vs 8 or more threads). Actually, I'm running a G3258 right now.


And that cpu pushes 3 290s eh? I guess I like i7s for RTS.

And the G-sync hype is real.. like buttah.


----------



## wermad

Quote:


> Originally Posted by *spacin9*
> 
> And that cpu pushes 3 290s eh? I guess I like i7s for RTS.
> 
> And the G-sync hype is real.. like buttah.


Until GSync and FreeSync are put side-by-side, I'm calling hype on both. I need a clear winner; otherwise, it's just smoke to me. Meh, I'd rather do 5x1 on 60Hz until 32" 4K is cheaper. Maybe 5x1 WQHD.... (wallet killa).


----------



## 2010rig

Quote:


> Originally Posted by *mtcn77*
> 
> Some bad impressions cannot be refunded(along with priceless time & cargo fees). Been there, done that.
> Edit: sorry, wrong hyperlink.


How would you like NVIDIA to compensate you, for all the troubles you've experienced with your 970 and this particular issue?

What would make you happy with your 970 purchase?


----------



## spacin9

Quote:


> Originally Posted by *wermad*
> 
> Until GSync and FreeSync are put side-by-side, I'm calling hype on both. I need a clear winner; otherwise, it's just smoke to me. Meh, I'd rather do 5x1 on 60Hz until 32" 4K is cheaper. Maybe 5x1 WQHD.... (wallet killa).


I've been getting back into SupCom, which usually runs choppy... it's unavoidable. With G-Sync, I've never seen it so smooth. I've played about 500 hours of Far Cry 3 with mods and I've never seen that so smooth either. FreeSync will probably be the same.

I paid $500 for my Acer G-Sync. I wasn't going to buy into it either, but I gave it a shot... it's nice. Real nice.


----------



## LancerVI

Quote:


> Originally Posted by *Serandur*
> 
> Yo guys, post #2412:
> 
> https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4438090/#4438090
> 
> "Hey,
> 
> First, I want you to know that I'm not just a mod, I work for NVIDIA in Santa Clara.
> 
> I totally get why so many people are upset. We messed up some of the stats on the reviewer kit and we didn't properly explain the memory architecture. I realize a lot of you guys rely on product reviews to make purchase decisions and we let you down.
> 
> It sucks because we're really proud of this thing. The GTX970 is an amazing card and I genuinely believe it's the best card for the money that you can buy. We're working on a driver update that will tune what's allocated where in memory to further improve performance.
> 
> Having said that, I understand that this whole experience might have turned you off to the card. If you don't want the card anymore you should return it and get a refund or exchange. If you have any problems getting that done, let me know and I'll do my best to help.
> 
> --Peter"


IMHO, *if* this is what NVidia is going to do, this should be the end of it. There is nothing to see here. No problem. Yeah, it was a major screw up, perception wise, but it looks like they're making it right.

I get why people are irritated, but you have to set that aside and be reasonable.


----------



## Seven7h

Quote:


> Originally Posted by *James296*
> 
> ^ holy, some of those really do take quite a hit... now where is my popcorn.


No... It's within the 4-6% NVIDIA said to expect. Meaning 980, which doesn't have this special memory config, takes almost the same hit. So it's basically confirmed to be only a minor performance decrease beyond 980, as predicted and stated by NVIDIA.

Again, it was all still captured in initial reviews, though reviewers probably don't list the cases that went above 3.5GB because they didn't know that case was special at the time.


----------



## Seven7h

Quote:


> Originally Posted by *Swolern*
> 
> At least someone at Nvidia has the balls to say they messed up. It just pisses people off more with the PR bullcrap Nvidia put out with their initial response.


It's not balls, it's a matter of preventing government and legal liabilities. Politicians and lawyers do not possess the expertise to understand what happened, and will likely grandstand and make an issue out of nothing for personal gain.

Coming out and admitting fault when the performance implication is 5% and baked into all reviews could also be said to be going above and beyond. I'm still of the opinion that it's a non-issue triggered by entitled attitudes and the placebo effect.


----------



## raghu78

Quote:


> Originally Posted by *Seven7h*
> 
> No... It's within the 4-6% NVIDIA said to expect. Meaning 980, which doesn't have this special memory config, takes almost the same hit. So it's basically confirmed to be only a minor performance decrease beyond 980, as predicted and stated by NVIDIA.
> 
> Again, it was all still captured in initial reviews, though reviewers probably don't list the cases that went above 3.5GB because they didn't know that case was special at the time.


It's not just the raw fps but also the frametimes that we need to check for. ExtremeTech pointed out a more significant hit on frametimes in Middle Earth. HWC is also going to follow up with an article with more games and frametimes. We need other sites to do the same kind of frametime testing to get an idea of the full impact of the GTX 970 memory partition.


----------



## Silent Scone

Quote:


> Originally Posted by *Swolern*
> 
> At least someone at Nvidia has the balls to say they messed up. It just pisses people off more with the PR bullcrap Nvidia put out with their initial response.


Well, you've got to test the water first.

Quote:


> Originally Posted by *raghu78*
> 
> It's not just the raw fps but also the frametimes that we need to check for. ExtremeTech pointed out a more significant hit on frametimes in Middle Earth. HWC is also going to follow up with an article with more games and frametimes. We need other sites to do the same kind of frametime testing to get an idea of the full impact of the GTX 970 memory partition.


What about other titles, AC:U? Did this take as much of a hit? Or any other memory-intensive game? Need I point out that Joel Hruska is the same journo who was tipped off about potential GameWorks performance issues by AMD? So it was little surprise to me when I saw his name on the 970 article. The man is as biased as they come, with the exception of neurotic cases like SemiAccurate.

Do not believe everything you read.


----------



## GrimDoctor

Quote:


> Originally Posted by *Seven7h*
> 
> It's not balls, it's a matter of preventing government and legal liabilities. Politicians and lawyers do not possess the expertise to understand what happened, and will likely grandstand and make an issue out of nothing for personal gain, without understanding what happened.
> 
> Coming out and admitting fault when the performance implication is 5% and baked into all reviews could also be said to be going above and beyond. I'm still of the opinion that it's a non issue triggered by entitled attitudes and placebo effect.


The problems for rendering certainly aren't placebo.


----------



## Woundingchaney

I would like to thank everyone who helped in addressing this issue. I had noticed odd behavior with my SLI 970 setup but did not have the time to allocate to discovering the cause (which I initially thought was driver related).

It is very good to see that Nvidia is taking ownership of this issue and assisting in refunds, and that manufacturers are honoring Nvidia's mistake. I would also point this out to every nay-sayer and brand loyalist: if anything, the efforts of the people who discovered this issue help ensure that in the future your purchases are representative of what you were actually sold.

I have returned my purchases and purchased MSI 980 Frozr Vs. I have no problem with Nvidia as a company, but given the rude treatment I received from Zotac, I will never purchase another product from them again.


----------



## spacin9

Quote:


> Originally Posted by *LancerVI*
> 
> IMHO, *if* this is what NVidia is going to do, this should be the end of it. There is nothing to see here. No problem. Yeah, it was a major screw up, perception wise, but it looks like they're making it right.
> 
> I get why people are irritated, but you have to set that aside and be reasonable.


What is reasonable..? I think that's the issue when it comes to "what will make us happy with our purchase."

No one knows how to answer it. What we want is 4GB working. Anything else makes one feel foolish. If we go with 980s, then NV can just bork video cards on purpose, knowing the suckers, er umm, enthusiasts will upgrade and maybe still have the same issue.

If we turn in the cards for refunds, there's nothing left to buy. 970s are really the only option. I don't want AMD cards.

So is the fair answer a cheap upgrade path to 980s? Is NV willing to do that? I don't know.

I think it's more or less about not letting NV just crap on users as much as it is about compensation.


----------



## Seven7h

The update should get you a couple hundred MB of the fast memory back so the lower bandwidth will no longer start at 3.2GB-3.4GB in the headless CUDA test as some were seeing.

A side benefit to this is that there is also less of the slow memory available for textures, so the worst case is you can maybe have 300MB in the slow memory instead of 512MB.

This moves things awfully close to the point where all bets are off regardless of the memory config anyway, since getting close to the video memory limit (even 150MB before you hit it) usually means having crap shuffling between system memory and video memory. Getting the usable fast memory up closer to the system memory spill/involvement point is a good thing (as long as you don't give up total video memory capacity to do it, as some people demanded with the ability to disable the 512MB... Bad idea).

If 50-150MB away from video memory being full is the point at which any video card would start to get crappy swapping with system memory and causing unplayable hitches, then having 300MB or so of textures with slower access is only 150-250MB from that "it wouldn't matter anyway" point.
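The headroom arithmetic in the post above can be made concrete. The numbers here are the post's own assumptions (a 3584 MB fast segment, a 512 MB slow segment, and a hypothetical driver tweak reclaiming roughly 200 MB), not measured values:

```python
# Illustrative headroom arithmetic for a segmented 4GB card:
# 3584 MB fast + 512 MB slow. Figures are assumptions from the
# discussion above, not measurements.
SLOW_SEGMENT_MB = 512

def slow_mb_in_use(used_mb, fast_segment_mb):
    """MB of the slow segment holding data, for a given total VRAM
    usage and a given usable fast-segment size."""
    return max(0, min(used_mb - fast_segment_mb, SLOW_SEGMENT_MB))

# Fast segment ends at 3584 MB: a 3900 MB workload touches 316 MB of slow memory.
print(slow_mb_in_use(3900, 3584))  # 316
# If a driver tweak reclaimed ~200 MB of fast memory, the same
# workload would touch only ~104 MB of slow memory.
print(slow_mb_in_use(3900, 3796))  # 104
```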


----------



## Seven7h

Quote:


> Originally Posted by *spacin9*
> 
> What is reasonable..? I think that's the issue when it comes to "what will make us happy with our purchase."
> 
> No one knows how to answer it. We want 4GBs working is what we want. Anything else makes one feel foolish. If we go with 980s, then NV can just bork video cards on purpose knowing the suckers, er umm enthusiasts will upgrade and maybe still have the same issue.
> 
> If we turn-in the cards for refunds, there's nothing left to buy. 970s are really the only option. I don't want AMD cards.
> 
> So is the fair answer is, a cheap upgrade path to 980s? Is NV willing to do that? I don't know.
> 
> I think it more or less comes down to not letting NV just crap on users as much as it is compensation.


So you just admitted you'll never be happy. The card you want doesn't exist, and if it did it would be significantly more expensive because they couldn't use chips with any L2 defects. This means cost per working chip goes up, and retail pricing with it.

It sounds like you already have what you want, and are happy with performance, but just feel emotionally hurt from a perceived deception (which was a mistake).

If they had actually misled about performance of the GPU and performance was different than reviews showed somehow, I could see a cheap upgrade to 980. But the card is working fine.


----------



## Xuper

I don't know if Nai's benchmark is broken or not. Can we trust this benchmark to show the GTX 970's memory issue?


----------



## Seven7h

Quote:


> Originally Posted by *GrimDoctor*
> 
> The problems for rendering certainly aren't placebo.


I mean the stutter claims. They certainly are placebo.

The small average performance drop is obviously real, but given that it was baked into reviews, I wouldn't call it a "problem". It's part of the design of the product, not a defect. It just changes the performance characteristics above 3.5GB a bit. It's "different".

If it were the same price as the 980 then it would clearly be "worse". But it's so cheap relative to the 980 that you can't even say it's a "problem". It is what it is, and it is still entirely usable, with 95% of the GPU's original overall performance when you hit the slower memory.

Not to mention that, by most accounts, turning up settings enough to trigger the 970's memory performance hit leads to unplayable framerates just from running out of GPU processing horsepower (often even on a 980).

So keeping it all in perspective and looking at what you're really losing is important here.


----------



## Woundingchaney

Quote:


> Originally Posted by *Seven7h*
> 
> So you just admitted you'll never be happy. The card you want doesn't exist, and if it did it would be significantly more expensive because they couldn't use chips with any L2 defects. This means cost per working chip goes up, and retail pricing with it.
> 
> It sounds like you already have what you want, and are happy with performance, but just feel emotionally hurt from a perceived deception (which was a mistake).
> 
> If they had actually misled about performance of the GPU and performance was different than reviews showed somehow, I could see a cheap upgrade to 980. But the card is working fine.


I can't speak for the original poster, but it's not an emotional matter (at least for me). Quite frankly, I purchased something that was not representative of what I was sold. Given that we are all consumers, I believe this should be rather understandable. Performance issues are becoming more apparent across the board: some titles are only reading the 3.5GB pool, frame times are spiking at higher than 3.5GB VRAM usage, etc. It's also important to note that once the card is no longer a driver-focused card, performance issues may only increase, particularly given that their hardware solution is going to need software attention for tweaking titles specifically for the 970 GPU.


----------



## spacin9

Quote:


> Originally Posted by *Seven7h*
> 
> So you just admitted you'll never be happy. The card you want doesn't exist, and if it did it would be significantly more expensive because they couldn't use chips with any L2 defects. This means cost per working chip goes up, and retail pricing with it.
> 
> It sounds like you already have what you want, and are happy with performance, but just feel emotionally hurt from a perceived deception (which was a mistake).
> 
> If they had actually misled about performance of the GPU and performance was different than reviews showed somehow, I could see a cheap upgrade to 980. But the card is working fine.


You want to make me happy? Give me FREE upgrades to 980s. How about that? Put up or shut up right? And I want you to pay for my upgrade. Then I would be elated.


----------



## Silent Scone

Quote:


> Originally Posted by *spacin9*
> 
> You want to make me happy? Give me FREE upgrades to 980s. How about that? Put up or shut up right? And I want you to pay for my upgrade. Then I would be elated.


That's the consumer in a nutshell today, regardless of this fiasco. People feel they're owed something from these vendors 90% of the time, like some kind of entitlement. It was never like that 10 or so years ago.

Too many spoon-fed individuals wet behind the ears these days.


----------



## 364901

Quote:


> Originally Posted by *Xuper*
> 
> I don't know if Nai's benchmark is broken or not. Can we trust this benchmark to show the GTX 970's memory issue?


It's the exact same memory test as Nvidia's CUDA memory benchmark. So yes, you can trust it, as long as you do the test properly and run the GPU headless.


----------



## Seven7h

Quote:


> Originally Posted by *Woundingchaney*
> 
> I can't speak for the original poster, but it's not an emotional matter (at least for me). Quite frankly, I purchased something that was not representative of what I was sold. Given that we are all consumers, I believe this should be rather understandable. Performance issues are becoming more apparent across the board: some titles are only reading the 3.5GB pool, frame times are spiking at higher than 3.5GB VRAM usage, etc. It's also important to note that once the card is no longer a driver-focused card, performance issues may only increase, particularly given that their hardware solution is going to need software attention for tweaking titles specifically for the 970 GPU.


Titles don't need special attention or tweaking, that's the point. This is basic memory management that the OS has supported for years. On the driver side, it's the same exact driver logic which prioritizes resources for you on every other GPU, and has kept important stuff out of system memory and in main video memory for years.

The driver has *always* tracked resource priority/importance and relayed these to the OS... all GPU vendors do this. If the driver didn't, you'd have render targets in system memory and slow to 2fps when you're not even close to filling up the memory of a random GPU, and it would happen regularly but seemingly randomly. That simply doesn't happen on any GPU, even without segmented video memory.

The 970 just has an extra level of granularity for labeling resource priority, and this allows it to minimize the performance impact of the slower memory segment to 5% when over 3.5GB. That is IT.

If you trust the memory management and resource prioritization on a 980 or 780 or 760 or 8800 or 470, then there's zero reason to be afraid of the same logic now.

You got what you paid for. A 970 with a memory config identical to the 980's would have cost substantially more due to fewer GPU dies making the cut.

Sigh, people always fear what they don't understand. It's easier to run away than to understand it or trust thousands of brilliant engineers who work pretty hard to make **** work with a good experience.
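The priority-based placement described in the post above can be sketched as a toy model. This is purely illustrative (the segment sizes, priority scheme, and function are mine), not NVIDIA's actual driver logic:

```python
# Toy model of prioritized placement into a fast and a slow segment:
# high-priority resources (render targets, etc.) fill the fast
# 3584 MB segment first; lower-priority ones spill to the slow
# 512 MB segment, then to system memory. Not NVIDIA's real algorithm.
FAST_MB, SLOW_MB = 3584, 512

def place(resources):
    """resources: list of (name, size_mb, priority); higher priority
    means 'keep in fast memory'. Returns {name: segment}."""
    placement, fast_used, slow_used = {}, 0, 0
    for name, size, _prio in sorted(resources, key=lambda r: -r[2]):
        if fast_used + size <= FAST_MB:
            placement[name], fast_used = "fast", fast_used + size
        elif slow_used + size <= SLOW_MB:
            placement[name], slow_used = "slow", slow_used + size
        else:
            placement[name] = "system"  # spill to system RAM
    return placement

res = [("render_target", 64, 10), ("textures_a", 3400, 5), ("textures_b", 400, 1)]
print(place(res))  # {'render_target': 'fast', 'textures_a': 'fast', 'textures_b': 'slow'}
```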


----------



## looniam

ah, rather "enlightening."

Middle Earth: Shadow of Mordor GeForce GTX 970 VRAM stress test
Quote:


> Let me clearly state this, *the GTX 970 is not an Ultra HD card, it has never been marketed as such and we never recommended even a GTX 980 for Ultra HD gaming either.*



Quote:


> So the two titles that do pass (without any tricks) 3.5 GB are Call of Duty: Advanced Warfare and, of course, the one most reported to stutter, Middle Earth: Shadow of Mordor. We measured, played and fragged with COD, and there is just NOTHING to detect with the graphics memory fully loaded and in use. But we know that COD simply likes to cache a lot of stuff in VMEM, as opposed to using it for rendering. So our focus for this quick test will remain Middle Earth: Shadow of Mordor.




Quote:


> Above you will find an FCAT measurement; we'll run a similar event twice, one at roughly 3GB (lower IQ settings) and one where we are at 3.6 GB, the maximum this card and game seem to deal with. Obviously in the first condition the frame rates will be higher and thus the latency lower. That is not interesting; what is interesting is, if we pass 3.5 GB, will there be any stuttering effect once the last 512 MB starts to weigh into rendering?
> 
> Let's have a look with FCAT.
> 
> 2560x1440 - Very High Quality + DSR @ 3840x2160 = almost 3 GB usage
> 2560x1440 - Ultra Quality + DSR @ 3840x2160 = almost 3.6 GB usage
> Obviously the framerates differ. But the focus should be the spikes in the second chart, frames with a higher latency. I again stick to my initial findings here: there's no significant evidence that once your graphics memory runs out and starts using the 512MB (or not at all) you can see massive and weird behavior. There is, however, an increase (grey line = UHD DSR with Ultra Quality settings) of really tiny latency spikes. Since they are above 40ms, these are visible. But they last a split second and you only see four of them measured over 28 seconds. That's nowhere close to what you see in some of the posted videos on YouTube.
> 
> We can show you FCAT results like the above in endless fashion, and they will all look rather similar. And remember, we are rendering at Ultra HD quality with Ultra quality settings there.
> 
> Concluding
> 
> Our product reviews in the past few months and their conclusions are not any different from everything that has happened in the past few days; the product's performance is similar to what we have shown you, as hey .. it is in fact the same product. *The cluster**** that Nvidia dropped here is simple: they have not informed the media or their customers about the memory partitioning and the challenges they face. Overall you will have a hard time pushing any card over 3.5 GB of graphics memory usage with any game unless you do some freaky stuff. The ones that do pass 3.5 GB are mostly poor console ports or situations where you game in Ultra HD or DSR Ultra HD rendering.* In that situation I cannot guarantee that your overall experience will be trouble free; however, we have a hard time detecting and replicating the stuttering issues some people have mentioned. Right now Nvidia is in full damage control mode.


moral of the story?
pretty ridiculous to demand testing past its (or any card's) capabilities. makes ya wonder why some do.

this is . .
Quote:


> We do hope to never see a graphics card configured like this ever again, as it would get toasted by the media; what Nvidia did here is simply not the right thing to do. Last note: we submitted questions on this topic early in the week to Nvidia US, specifically Jonah Alben, SVP of GPU Engineering. On Monday Nvidia suggested a phone call with him; however, due to appointments we asked for a Q&A session over email. To date, neither he nor anyone from the US HQ has responded to these questions for Guru3D.com specifically. Really, to date we have yet to receive even a single word of information from Nvidia on this topic.
> 
> *We slowly wonder though why certain US press is always so prioritized and cherry-picked … Nvidia?*











hilbert rocks!


----------



## Mad Pistol

The more I read about this, the more I realize that this is a lot of fuss over nothing. Yes, nvidia should have disclosed that one partition of the L2 cache was disabled and that, therefore, the resources of two RAM modules are shared on one L2 cache module. However, lately, I've been trying to test my GTX 780 close to its 3GB VRAM limit, and what I found was this:

As you get close to the VRAM limit, performance seems to drop considerably, no matter what the card is. The reason for this is simple: video cards are designed to work up to a certain point. Once the threshold of resource saturation is reached, it is very likely that the GPU core will bog down. I have found that the GTX 780 is happiest at around 2GB to 2.2GB VRAM usage in a game such as BF4. Performance is stellar, graphics are gorgeous, and the card behaves perfectly. Once you get above that, the strain on the GPU cores becomes quite high, and you start to see a rapid degradation in performance.

What I find interesting is that while testing my wife's GTX 970, I found the exact same results, even though the GPU has more memory. The sweet spot on it is about 2.2-2.5GB VRAM usage. After that, performance degrades at a rapid pace in BF4. I don't have SoM or Watch Dogs to test other claims, but I can bet that NEITHER the GTX 780 nor the 970 performs well around 3GB VRAM usage. Which means that the proposed 3.5GB VRAM "wall" on the GTX 970 is gravy in my mind.

TL;DR I didn't get the GTX 970 for my wife because it has 4GB VRAM. I got it because of its streaming features: Twitch streaming and ShadowPlay. The extra VRAM is cool, but honestly, if you guys hadn't pointed it out, I would have never noticed... and thus, it's hard for me to really care. The card still performs amazingly, and for the $360 we paid for our GTX 970 G1 Gaming, it was worth every penny.









A few people here have explained legitimate concerns because of 4k usage and multiple cards. However, most people are simply complaining for the sake of complaining.


----------



## Silent Scone

I like how Hilbert is saying exactly what I was saying over the last couple of days.

Damn, I love being right about everything! I guess that's mainly through actual experience.









(Bite me)


----------



## aDyerSituation

Secretly this is just marketing because they will release the 970 ti with the full 4 gigs of ram and price it $100 more.


----------



## Moparman

I wish they would release a BIOS or driver that can make the card forget about that other pool of RAM. 3.5GB is more than fine on a mid-range card. These cards are game killers at 1080p or 1200p and even very strong at 1440. God, the best part of all this is maybe these will drop below $300; I could use 4 more of these for my other 2 rigs in the house.


----------



## Wirerat

Quote:


> Originally Posted by *Moparman*
> 
> I wish they would release a bios or driver that can make the card forget about that other pool of ram. 3.5Gb is more than fine on a mid range card. These cards are game killers at 1080P or 1200p and even very strong at 1440. God the best part of all this is maybe these will drop down below $300 I could use 4 more of these for my other 2 rigs in the house.


ikr, at a $279 price they would be amazing.


----------



## Woundingchaney

Quote:


> Originally Posted by *Seven7h*
> 
> Titles don't need special attention or tweaking, that's the point. This is basic memory management that the OS has supported for years. On the driver side, it's the same exact driver logic which prioritizes resources for you on every other GPU, and has kept important stuff out of system memory and in main video memory for years.
> 
> The driver has *always* tracked resource priority/importance and relayed these to the OS... all GPU vendors do this. If the driver didn't, you'd have render targets in system memory and slow to 2fps when you're not even close to filling up memory of a random GPU, and it would happen regularly, but seemingly randomly. That simply doesn't happen on any GPU, even without segmented video memory.
> 
> *The 970 just has an extra level of granularity for labeling resource priority, and this allows it to minimize the performance impact of the slower memory segment to 5% when over 3.5GB. That is IT.
> 
> *If you trust the memory management and resource prioritization on a 980 or 780 or 760 or 8800 or 470, then there's zero reason to be afraid of the same logic now.
> 
> You got what you paid for. A 970 with a memory config identical to 980 would've cost substantially more due to less GPU dies making the cut.
> 
> Sigh, people always fear what they don't understand. It's easier to run away than to understand it or trust thousands of brilliant engineers who work pretty hard to make **** work with a good experience.


Exactly. As of right now the 970 is a premier card. There is no way to confirm it, but it is very possible that additional coding was done at the driver level for this "unique" hardware configuration, allowing the bandwidth impact of the last half-gig of memory to be somewhat muted. If this is true, then it is doubtful 970 owners will see this level of attention in the future. I think it's a bit of an ivory-tower scenario to honestly believe that memory bandwidth will not become more of an issue for 970 owners in the future, to a greater extent than what other comparable cards suffer. Which is exactly what consumers were not aware of upon purchase.
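The two-tier prioritization described in the quote above (keep high-priority resources in the fast 3.5GB segment, spill into the slow 0.5GB segment only when needed) can be modeled with a toy allocator. This is purely an illustrative sketch; the class, pool sizes, and fit-first policy are my assumptions, not NVIDIA's actual driver logic:

```python
# Toy model of segmented VRAM allocation: allocations go to the fast
# pool first; only once it is full does allocation spill into the
# slower 0.5GB segment. Purely illustrative, not real driver code.

FAST_POOL_MB = 3584   # 3.5 GB segment, full-speed access
SLOW_POOL_MB = 512    # 0.5 GB segment, lower-bandwidth access

class SegmentedVram:
    def __init__(self):
        self.fast_used = 0
        self.slow_used = 0

    def allocate(self, size_mb):
        """Place an allocation in the fast pool if it fits, else spill."""
        if self.fast_used + size_mb <= FAST_POOL_MB:
            self.fast_used += size_mb
            return "fast"
        if self.slow_used + size_mb <= SLOW_POOL_MB:
            self.slow_used += size_mb
            return "slow"
        raise MemoryError("out of VRAM")

vram = SegmentedVram()
print(vram.allocate(3000))  # fast
print(vram.allocate(500))   # fast
print(vram.allocate(200))   # fast pool exhausted -> spills to slow
```

This also matches why monitoring tools report ~3.5GB "in use": under this policy the slow segment only sees traffic after the fast one fills.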


----------



## 2010rig

Quote:


> Originally Posted by *LancerVI*
> 
> IMHO, *if* this is what NVidia is going to do, this should be the end of it. There is nothing to see here. No problem. Yeah, it was a major screw up, perception wise, but it looks like they're making it right.
> 
> I get why people are irritated, but you have to set that aside and be reasonable.


Nah, burn NVIDIA BURN.








Quote:


> Originally Posted by *looniam*
> 
> ah, rather "enlightening."
> 
> Middle Earth: Shadow of Mordor GeForce GTX 970 VRAM stress test
> 
> 
> moral of the story?
> pretty ridiculous to demand testing past its (or any card's) capabilities. makes ya wonder why some do.
> 
> this is . .
> 
> 
> 
> 
> 
> 
> 
> 
> hilbert rocks!


Despite this, some people will say that he was paid by NVIDIA to write all that. Though he's talking about PCPER at the end there, isn't he? Lol

The answer is simple, they bought him a Porsche a couple years ago, and therefore he always has higher priority.


----------



## UZ7

Quote:


> Originally Posted by *Mad Pistol*
> 
> The more I read about this, the more I realize that this is a lot of fuss over nothing. Yes, nvidia should have disclosed that one partition of the L2 cache was disabled, and therefore, the resources of 2 ram modules were shared on one L2 cache module. However, lately, I've been trying to test my GTX 780 close to its 3GB Vram limit, and what I found was this:
> 
> As you get close to the VRAM limit, performance seems to drop considerably, no matter what the card is. The reason for this is simple; video cards are designed to work up to a certain point. Once the threshold of resource saturation is reached, it is very likely that the GPU core will bog down. I have found that the GTX 780 is happiest at around 2GB to 2.2GB Vram usage in a game such as BF4. Performance is stellar, graphics are gorgeous, and the card behaves perfectly. Once you get above that, the strain on the GPU cores becomes quite high, and you start to see a rapid degradation in performance.
> 
> What I find interesting is that while testing my wife's GTX 970, I found the exact same results, even though the GPU has more memory. The sweet spot on it is about 2.2-2.5GB VRAM usage. After that, performance degrades at a rapid pace in BF4. I don't have SoM or Watch Dogs to test other claims, but I can bet that NEITHER the GTX 780 or 970 perform well around 3GB VRAM usage. Which means that the propose 3.5GB Vram "wall" on the GTX 970, is gravy in my mind.
> 
> TL;DR I didn't get the GTX 970 for my wife because it has 4GB VRAM. I got it because of its features for streaming: Twitch streaming and shadowplay. The extra VRAM is cool, *but honestly, if you guys hadn't pointed it out, I would have never noticed...* and thus, it's hard for me to really care. The card still performs amazing, and for the $360 we paid for our GTX 970 G1 Gaming, it was worth every penny.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> A few people here have explained legitimate concerns because of 4k usage and multiple cards. However, most people are simply complaining for the sake of complaining.


I'm sure that's what they were hoping for, and this topic has been buried and then resurfaced. These cards came out in September; there were already talks/concerns about it weeks to even months back, and it wasn't in the spotlight till now. If nVidia had come clean in the first place it probably would not have mattered as much, and what surprised me is how long it took with no one from nVidia noticing that reviews and online stores were showing the wrong specs. Sure, the card is the best bang for your buck and at the right speeds rivals last year's flagships, but you can't let that blind you into thinking it would go unnoticed. You start seeing one or two posts about memory usage: wait, hold up, how come I only see ~3.5GB usage when I have 4.0GB? You see people reply: well, it could be a driver issue... it could be just the game engine... oh, you're just pushing the card too high and it's not supposed to run high res or increased AA. But people have compared it to the 980. Why? Because it had similar specs and had 4.0GB. So you start to see benchmarks between the 970 and 980 asking why the 970 uses 3.5 and the 980 uses 4.0 on the same game when they have such similar specs. People then assumed that maybe, because of how Maxwell works and how the disabled units were tied to the memory controller, that could explain it. Fast forward from a few forum topics and concerns to benchmark tools coming out and even broader testing. While not the best tool to use, it did shed some light on the topic; even people who have never used Afterburner or overclocked or tweaked a computer in their life used this tool and are concerned that they have the "memory bug," so yes, they're concerned that they purchased a faulty product.
Face it, not everyone buys the latest and greatest, and a ton of people are not tech-savvy; when they purchase a product, no matter how you try to explain it they won't understand how the product works, and therefore they're never going to be satisfied until they are compensated.

Then you see nVidia come up with a reply with a small graph comparing the 970 and 980, like: oh hai guys, there really isn't that much difference when using more RAM, and it's working as intended. You think people will buy that? More testing, more testing, more testing by users and reviewers... then nVidia comes out with: oh wait, sorry, you guys didn't know? Whoops, we forgot to tell you, here are the actual specs. We were just waiting for the holidays to blow over before we came back to the issue. But I guess we have to let you know that the specs you purchased the cards with aren't really the right specs... oops! But don't worry, it doesn't affect performance, it's just how we do our binning, and yeah, you guys are right, it is 3.5GB... buuut 3.5GB + 0.5, so it's still 4.0GB, right?







and not to worry because it can still be used!!

Of course there will be a community backlash; they paid for a product, and only when the issue arose did nVidia come clean. Better late than never? Or would they have taken it to the grave? Now, us enthusiasts or techies, the ones who "know more" than the average user, will be like: wow nVidia, gg, I'll just get a new card when they come out in a few months. But there are a lot of people who have upgraded from their old tech, some people who can't upgrade all the time, and regular consumers, and the way nVidia handled it wasn't the right way. Their reply was "segmented," just like their memory (lol). So yes, as a 970 owner, did this have me concerned? Sure, because I bought this card back in November and suggested it to everyone I knew. I game on it and didn't notice anything because I had a blind eye. I started using Afterburner monitoring, which I had been too lazy to do, and did notice the 3.5 max cap; it made me wonder what was going on. Then, only when the topic arose did I start to question it, and later came the explanation. To me it's like: ooh, that's why, it makes sense. To other people it's like: WHAT?? I've been lied to! The car I purchased with 4 performance tires only came with 3.5 + 0.5 regular tires! People will feel deceived, cheated, and lied to. Only now is damage control on the go, and they're doing the best they can to calm this down. Have you seen the nVidia/AMD stocks lately? Yeah, I'm pretty sure this is the cause of that; not a lot of people/shareholders are happy about it.

Right now I think they're working on some magic driver that will "help" but with no performance guarantee, and I think when that comes out they'll have a big press release and explanation of the entire thing for the public. Even now you wouldn't really know of the matter unless you looked in forums and review sites, but it went from like 2 - 5 - 10 - 50 - 100 etc. in a matter of days.

Overall though (IMO) I probably won't make a fuss over a return/step-up or whatever; makes me kinda wanna get the next set of cards, lol. But nah, this card is a great performer; it just irks me that my concern turned out to be something bigger, that "missing" feeling. And I'm pretty sure for people who bought this card with hard-earned lunch money, and in other countries where these $320-$350 USD cards are like $400-500, this won't die down so easily. nVidia is now stepping up to cater to the crowd, but the damage has been done, and some people want even more out of this.

This topic will also make you question future products you purchase. How would you feel if you bought a product with specific specs, only to find out they weren't the right ones?

What we're seeing now is more along the lines of:
1. Purchasing a product with specific specs and not getting the right specs (even if you have no clue how it works and it doesn't make a difference). We are not talking about us, we are talking about every consumer in every country. Fix? Refund/step-up and whatnot.
2. nVidia not properly addressing the issue; it comes clean but is still unclear about it. Fix? They'll probably come out with a big statement after a driver release to say hey, it's no biggie!
3. People who spend more time complaining than actually using the card, who want to be compensated more for what they purchased the card for, or just plain compensated. Fix? Uh, would you like extra fries with that? It's funny because we still see people coming in late, going gung-ho that they ran a benchmark and are concerned about their purchase (good luck, nVidia, explaining it to them).
4. User reviews of 1 or 2 games; they hit 4GB and automatically conclude that there is nothing wrong. Fix? Go play games.
5. User reviews of 1 or 2 games; they hit 4GB and automatically conclude that something is wrong. Fix? Uh, see 1 or 2, lol, or you're just paranoid, or wait, nVidia explained it (maybe I'm this guy, lol).

*TL;DR... it's too long, don't read it...* At the end of the day the ball is still in nVidia's court; the damage has already been done, and it's up to them how they recover or salvage what's left. Be on the lookout over the next few days.


----------



## lacrossewacker

So let me get this straight, a GTX 970 is still the best card for 99% of the people that want enthusiast level performance? Okay got it.


----------



## criminal

Nvidia already conceded the truth after members of the enthusiast community found something wrong. Nvidia is helping with refunds because they understand what they did is wrong. Those that want to take part in getting a refund should do it. Others that are happy should keep the card. But the people still defending Nvidia AFTER they have already admitted they screwed up.... give it a rest. Nvidia has done the right thing and is trying to rectify the issue. Those still saying it has been overblown can stop beating that horse already. Nvidia is working on a resolution.


----------



## SKYMTL

Quote:


> Originally Posted by *lacrossewacker*
> 
> So let me get this straight, a GTX 970 is still the best card for 99% of the people that want enthusiast level performance? Okay got it.


Nothing has changed since Day One. Correct.


----------



## Menta

Quote:


> Originally Posted by *criminal*
> 
> Nvidia already conceded the truth after members of the enthusiast community found something wrong. Nvidia is helping with refunds because they understand what they did is wrong. Those that want to take part in getting a refund should do it. Others that are happy should keep the card. But the people still defending Nvidia AFTER they have already admitted they screwed up.... give it a rest. Nvidia has done the right thing and is trying to rectify the issue. Those still saying it has been overblown can stop beating that horse already. Nvidia is working on a resolution.


They did, but not the right way; it's kind of strange to jump into the middle of a forum and say all that. Maybe they don't want to make it official.

But it's a start.


----------



## Menta

Quote:


> Originally Posted by *SKYMTL*
> 
> Nothing has changed since Day One. Correct.


The specs have changed, and maybe some performance scenarios on a minor level.

But this boils down to perception and damage control at this point in time.


----------



## looniam

Quote:


> Originally Posted by *2010rig*
> 
> 
> Despite this, some people will say that he was paid by NVIDIA to write all that. Though he's talking about PCPER at the end there isn't he? Lol
> 
> The answer is simple, they bought him a Porsche a couple years ago, and therefore he always has higher priority.


no, they bought ryan a *BENTLEY* that he admitted to in one of the old framepacing/FCAT threads.
(i am now, again, seriously thinking about searching for it







)


----------



## Redwoodz

I really don't think the issue here is for people running a single GTX 970. I think the issue is for people who bought 2 thinking they could game at 4K resolution. Has anyone done any SLi FCAT tests? Still waiting for that. And not just a frametime graph or two...I mean the whole review like they have done with every other card.


----------



## SKYMTL

Quote:


> Originally Posted by *looniam*
> 
> no, they bought ryan a *BENTLY* that he admitted to in one of the old framepacing/FCAT threads.
> (i am now, again, seriously thinking about searching for it
> 
> 
> 
> 
> 
> 
> 
> )


I know Ryan and he absolutely does not drive a Bentley, nor does he live in a mansion, nor do ANY of us make huge profits from this gig. He is actually one of the most straight-up guys in the ever-shrinking "true" (i.e., not a "blogger" or YouTube "personality") hardware press community. To think or even claim otherwise is absolutely preposterous.


----------



## skupples

Ryan was joking. My buddy was delivering food to his office back then (Jimmy John's); no Bentleys in the parking lot. AKA, he was joking about nvidia bribing him.

I guess no one was able to define how he's now a shill. FCAT makes him a shill?


----------



## SKYMTL

Quote:


> Originally Posted by *Redwoodz*
> 
> I really don't think the issue here is for people running a single GTX 970. I think the issue is for people who bought 2 thinking they could game at 4K resolution. Has anyone done any SLi FCAT tests? Still waiting for that. And not just a frametime graph or two...I mean the whole review like they have done with every other card.


Yes. We did but our host is having electricity issues due to a fire in the neighborhood so don't bother looking for it right now.









I know that PcPer and Anandtech did full-on FCAT testing with two GTX 970 cards as well.


----------



## RagingCain

Quote:


> Originally Posted by *SKYMTL*
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> no, they bought ryan a *BENTLY* that he admitted to in one of the old framepacing/FCAT threads.
> (i am now, again, seriously thinking about searching for it
> 
> 
> 
> 
> 
> 
> 
> )
> 
> 
> 
> I know Ryan and he absolutely does not drive a Bentley, nor does he live in a mansion, nor does ANY of us make huge profits from this gig. He is actually one of the most straight-up guys in the ever-shrinking "true" (ie: not a "blogger" or YouTube "personality") hardware press community. To think or even claim otherwise is absolutely preposterous.

Hey now Sky! Some of us bloggers are extra honest too ya know!

Seriously though, Ryan is a great guy. The only reason some people have a beef with him is that he proved AMD needed to do something about frame-time metering.

The fact that that's all he did just goes to show how stupid/fanatical AMD apologists can be.


----------



## juano

Quote:


> Originally Posted by *SKYMTL*
> 
> Yes. We did but our host is having electricity issues due to a fire in the neighborhood so don't bother looking for it right now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I know that PcPer and Anandtech did full-on FCAT testing with two GTX 970 cards as well.


Can I assume based on the avatar that "we" is Hardware Canucks?


----------



## RagingCain

Quote:


> Originally Posted by *juano*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SKYMTL*
> 
> Yes. We did but our host is having electricity issues due to a fire in the neighborhood so don't bother looking for it right now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I know that PcPer and Anandtech did full-on FCAT testing with two GTX 970 cards as well.
> 
> 
> 
> Can I assume based on the avatar that "we" is Hardware Canucks?

Are you serious? Don't you know who Sky is?


----------



## Forceman

Quote:


> Originally Posted by *RagingCain*
> 
> Hey now Sky! Some of us bloggers are extra honest too ya know!
> 
> Seriously though Ryan is a great guy. The only reason some people have a beef with them is he proved AMD needed to do something about FrameTime metering.
> 
> The fact that's all he did, just goes to show how stupid/fanatical AMD apologists can be.


And the irony is that the testing actually got AMD to fix their issues, and their frame times now kick butt.


----------



## dean_8486

so what is happening do we get a refund or not


----------



## juano

Quote:


> Originally Posted by *RagingCain*
> 
> Are you serious? Don't you know who Sky is?


No I'm obviously making a joke because you'd have to be an idiot to not know whoever that guy is. I was just testing the thread to make sure that anyone who doesn't know who he is is appropriately jumped on, and you passed.


----------



## looniam

Quote:


> Originally Posted by *SKYMTL*
> 
> I know Ryan and he absolutely does not drive a Bentley, nor does he live in a mansion, nor does ANY of us make huge profits from this gig. He is actually one of the most straight-up guys in the ever-shrinking "true" (ie: not a "blogger" or YouTube "personality") hardware press community. To think or even claim otherwise is absolutely preposterous.


you are correct and i didn't remember correctly.
since he posted leaked benchmarks of a 290X beating a titan, his response was:
http://www.overclock.net/t/1427828/bsn-state-of-4k/400_50#post_20843744
Quote:


> Damn, there goes the new Bentley I was looking to buy.


(yes, the whole reference is a joke







)

that whole thread is a good read though. .


----------



## Clocknut

Quote:


> Originally Posted by *Moparman*
> 
> I wish they would release a bios or driver that can make the card forget about that other pool of ram. 3.5Gb is more than fine on a mid range card. These cards are game killers at 1080P or 1200p and even very strong at 1440. God the best part of all this is maybe these will drop down below $300 I could use 4 more of these for my other 2 rigs in the house.


Maybe... in the future they're gonna add a "new feature" to cap VRAM usage... just like how we have fps caps... ???


----------



## SandGlass

Petition to Nvidia asking for refund


----------



## GrimDoctor

Quote:


> Originally Posted by *SandGlass*
> 
> Petition to Nvidia asking for refund


They are already supporting and assisting refunds...
Thread search FTW!


----------



## sok0

After reading this thread, I think I am now going to try to get a full refund on my 2013 Ford, since the estimated MPG they told me it would get isn't anywhere near the MPG I get.


----------



## SKYMTL

Quote:


> Originally Posted by *RagingCain*
> 
> Hey now Sky! Some of us bloggers are extra honest too ya know!
> 
> Seriously though Ryan is a great guy. The only reason some people have a beef with them is he proved AMD needed to do something about FrameTime metering.
> 
> The fact that's all he did, just goes to show how stupid/fanatical AMD apologists can be.


I agree but there are journalists and then there are bloggers. There is a difference.

Quote:


> Originally Posted by *juano*
> 
> Can I assume based on the avatar that "we" is Hardware Canucks?


Correct. However, I do not link to the site here since we are technically "competitors" and doing so wouldn't be right.


----------



## PureBlackFire

Quote:


> Originally Posted by *sok0*
> 
> After reading this thread, I think I am now going to try to get a full refund on my 2013 Ford since the estimated MPG that they told me it would have isnt anywhere near the MPG I get.


no luck. it's all in the fine print that estimated MPG will vary.


----------



## juano

Quote:


> Originally Posted by *SKYMTL*
> 
> I agree but there are journalists and then there are bloggers. There is a difference.
> Correct. However, I do not link to the site here since we are technically "competitors" and doing so wouldn't be right.


Ahh thank you for the clarification. I don't usually frequent your site (not that I dislike it, I just don't read every tech site out there) but I will look forward to this article once you're able to get it posted.


----------



## Menta

amen


----------



## SKYMTL

Can someone explain what justifies a refund?

I just looked over the pre-launch documentation and what was written on GeForce.com and I see the following:

4GB GDDR5 (true)

Peak Bandwidth: 224GB/s (true)

The "peak bandwidth" isn't a guaranteed sustainable number and never has been.

I'm not saying I completely disagree with people being frustrated over the situation....I'm just confused about how the GTX 970's performance metrics have changed now versus when it was initially launched.


----------



## criminal

Quote:


> Originally Posted by *looniam*
> 
> you are correct and i didn't remember correctly.
> since he posted leaked benchmarks of a 290X beating a titan his response was:
> http://www.overclock.net/t/1427828/bsn-state-of-4k/400_50#post_20843744
> (yes, the whole reference is a joke
> 
> 
> 
> 
> 
> 
> 
> )
> 
> that whole thread is a good read though. .


I remember that thread. It was great!








Quote:


> Originally Posted by *sok0*
> 
> After reading this thread, I think I am now going to try to get a full refund on my 2013 Ford since the estimated MPG that they told me it would have isnt anywhere near the MPG I get.


Not even the same thing. If you drive the car like they recommend, you would get that mpg.








Quote:


> Originally Posted by *SKYMTL*
> 
> Can someone explain what justifies a refund?
> 
> I just looked over the pre-launch documentation and what was written on GeForce.com and I see the following:
> 
> 4GB GDDR5 (true)
> 
> Peak Bandwidth: 224GB/s (true)
> 
> The "peak bandwidth" isn't a guaranteed sustainable number and never has been.
> 
> I'm not saying I completely disagree with people being frustrated over the situation....I'm just confused about how the GTX 970's performance metrics have changed now versus when it was initially launched.


The wrong ROP count and L2 cache amount.


----------



## Vesku

Quote:


> Originally Posted by *sok0*
> 
> After reading this thread, I think I am now going to try to get a full refund on my 2013 Ford since the estimated MPG that they told me it would have isnt anywhere near the MPG I get.


You joke but Hyundai and Kia had to either pay certain car buyers a lump sum or give them a gas card that has $X per year for several years because they fudged the US gas mileage tests.

Sounds like Nvidia is in proper 'mea culpa' mode, but we'll still have to wait and see if they actually update the GTX 970 memory pools properly in the advertising and on the retail boxes. It should say 196GB/s + 28GB/s.
Quote:


> Originally Posted by *SKYMTL*
> 
> Can someone explain what justifies a refund?
> 
> I just looked over the pre-launch documentation and what was written on GeForce.com and I see the following:
> 
> 4GB GDDR5 (true)
> 
> Peak Bandwidth: 224GB/s (true)
> 
> The "peak bandwidth" isn't a guaranteed sustainable number and never has been.
> 
> I'm not saying I completely disagree with people being frustrated over the situation....I'm just confused about how the GTX 970's performance metrics have changed now versus when it was initially launched.


Due to the way the separate 512MB GDDR5 chip is hooked to the GPU, the max theoretical peak bandwidth for a GTX 970 is actually 196GB/s; the main pool and the small pool cannot be accessed at the same time. Misrepresenting the ROPs and L2 is probably enough in some countries as well; it doesn't matter if there is little or no performance impact, the chip is not as described.
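Vesku's 196GB/s figure follows from the card's published memory spec: eight 32-bit GDDR5 channels at an effective 7Gbps per pin give 28GB/s per channel, and the 3.5GB segment spans seven of them. A quick back-of-the-envelope check in Python (the seven-vs-one channel split here follows Vesku's description):

```python
# Back-of-the-envelope check of the bandwidth figures in this thread.
# 7 Gbps GDDR5 on a 32-bit channel moves 7 * 32 / 8 GB per second.

GDDR5_GBPS = 7           # effective data rate per pin, Gbps
CHANNEL_WIDTH_BITS = 32  # one memory controller/channel

per_channel = GDDR5_GBPS * CHANNEL_WIDTH_BITS / 8   # GB/s per channel
fast_segment = 7 * per_channel   # 3.5GB pool spans 7 channels
slow_segment = 1 * per_channel   # 0.5GB pool sits on the last channel

print(per_channel, fast_segment, slow_segment, fast_segment + slow_segment)
# 28.0 GB/s per channel; 196.0 + 28.0 = 224.0 GB/s advertised peak
```

So the advertised 224GB/s only holds if both segments could be driven simultaneously, which is exactly the point of contention in the posts that follow.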


----------



## Imouto

Quote:


> Originally Posted by *SKYMTL*
> 
> I know Ryan and he absolutely does not drive a Bentley, nor does he live in a mansion, nor does ANY of us make huge profits from this gig. He is actually one of the most straight-up guys in the ever-shrinking "true" (ie: not a "blogger" or YouTube "personality") hardware press community. To think or even claim otherwise is absolutely preposterous.


That's kinda disappointing.

I don't know; if the whole hardware press community is going to be dragged down and accused of selling its integrity for a couple of banner deals, I'd guess you'd at least want to have made some money out of it.


----------



## PureBlackFire

Quote:


> Originally Posted by *SKYMTL*
> 
> Can someone explain what justifies a refund?
> 
> I just looked over the pre-launch documentation and what was written on GeForce.com and I see the following:
> 
> 4GB GDDR5 (true)
> 
> Peak Bandwidth: 224GB/s (true)
> 
> The "peak bandwidth" isn't a guaranteed sustainable number and never has been.
> 
> I'm not saying I completely disagree with people being frustrated over the situation....I'm just confused about how the GTX 970's performance metrics have changed now versus when it was initially launched.


nothing justifies a refund, but people gonna get it if they can.


----------



## SKYMTL

Quote:


> Originally Posted by *criminal*
> 
> I remember that thread. It was great!
> 
> Not even the same thing. If you drive the card like they recommend, you would get that mpg.
> 
> The wrong rop count and L2 cache amount.


I understand, but since when do folks buy their cards based on ROP and L2 cache amounts alone? Most don't even know what those functions do. Many buy according to reviews....and those reviews have the GTX 970 in a very good position.
Quote:


> Originally Posted by *Vesku*
> 
> You joke but Hyundai and Kia had to either pay certain car buyers a lump sum or give them a gas card that has $X per year for several years because they fudged the US gas mileage tests.
> 
> Sounds like Nvidia is in proper 'mea culpa' mode but will still have to wait and see if they actually update the GTX 970 memory pools in a proper manner on the advertising and retail boxes. Should say 196GBps + 28GBps.
> Due to the way the separate 512MB GDDR5 chip is hooked to the GPU the max theoretical peak bandwidth for a GTX 970 is actually 196GB/s. The main pool and the small pool can not be accessed at the same time. Misrepresenting the ROPs and L2 is probably enough in some countries as well, doesn't matter if there is no or little performance impact the chip is not as described.


The max theoretical bandwidth is the full 224GB/s since interleaving and load balancing allow for both partitions to be accessed. I'm sure that's been covered before. I think there's a disconnect in the information being presented which has caused some confusion; while the partitions are separate, they also behave in a dynamic manner so the allocation is a constantly moving target and is determined by what a game requires.


----------



## Forceman

Quote:


> Originally Posted by *SKYMTL*
> 
> Peak Bandwidth: 224GB/s (true)
> 
> The "peak bandwidth" isn't a guaranteed sustainable number and never has been.


According to the Anand article, that 224GB/s is only possible if both memory banks are accessed simultaneously, and they say that can't happen. Do you have different information on that?

Edit: ninja'd. But that's still contrary to what Anand says happens. Maybe something to clarify with Nvidia?


----------



## mouacyk

Quote:


> Originally Posted by *PureBlackFire*
> 
> nothing justifies a refund, but people gonna get it if they can.


Even NVidia would disagree with you (now). Nothing is theoretical anymore. Those that want NVidia to own up, you can do so. Those who are content, so be it.


----------



## TopicClocker

Quote:


> Originally Posted by *juano*
> 
> No I'm obviously making a joke because you'd have to be an idiot to not know whoever that guy is. I was just testing the thread to make sure that anyone who doesn't know who he is is appropriately jumped on, and you passed.


*Keeps quiet*

Quote:


> Originally Posted by *SKYMTL*
> 
> Can someone explain what justifies a refund?
> 
> I just looked over the pre-launch documentation and what was written on GeForce.com and I see the following:
> 
> 4GB GDDR5 (true)
> 
> Peak Bandwidth: 224GB/s (true)
> 
> The "peak bandwidth" isn't a guaranteed sustainable number and never has been.
> 
> I'm not saying I completely disagree with people being frustrated over the situation....I'm just confused about how the GTX 970's performance metrics have changed now versus when it was initially launched.


The performance metrics have not changed at all; it performs just as well as it did at launch, and in some cases even better due to driver updates.

I think the ones who are upset or disappointed are those who feel that NVIDIA lied to them or was being somewhat shady, as it did not reveal the information about the segmentation of the memory, or about the ROPs and cache, until people found out and it got really big.

I suppose it could be said that the ROPS and the Cache were falsely advertised.

Personally I don't really care about the ROPs or the cache, but I do care about the memory, which is quite disappointing since the 3.5GB segment is faster than the 0.5GB segment; having VRAM work like this is quite unusual.

Whether this will impact performance in the future is unknown currently.


----------



## SKYMTL

Quote:


> Originally Posted by *PureBlackFire*
> 
> nothing justifies a refund, but people gonna get it if they can.


I think what I'm having a hard time with is the justification behind the refund idea.

Had NVIDIA launched a driver which retroactively disabled a ROP partition and effectively lowered performance after launch or lied about performance outright, then there would be ample justification for a refund. However, the GTX 970 has been out since October, has been reviewed thousands of times by hundreds of publications and its day-one performance metrics haven't been changed in any way. It performs just as well now as it did then.

While I don't see any justification for NVIDIA's lack of transparency, I think this whole situation could lead us down a very worrying road.

In my experience, when compared to AMD, Intel, Qualcomm, ARM, etc, NVIDIA is the most open about their architectures and are willing to reveal the most (but not all) about what goes into their "secret sauce". The others justifiably withhold information that could give the competition a leg up simply because the buying public isn't positively served by immense technical details. They care about performance, perf per watt and positioning. If anything I can see this leading to a push towards releasing LESS information in an effort to emulate what other companies are already doing: focusing the press on raw performance regardless of what's going on under the hood.

Again, this is just my personal opinion....


----------



## Silent Scone

Quote:


> Originally Posted by *SKYMTL*
> 
> I think what I'm having a hard time with is the justification behind the refund idea.
> 
> Had NVIDIA launched a driver which retroactively disabled a ROP partition and effectively lowered performance after launch or lied about performance outright, then there would be ample justification for a refund. However, the GTX 970 has been out since October, has been reviewed thousands of times by hundreds of publications and its day-one performance metrics haven't been changed in any way. It performs just as well now as it did then.
> 
> While I don't see any justification for NVIDIA's lack of transparency, I think this whole situation could lead us down a very worrying road.
> 
> In my experience, when compared to AMD, Intel, Qualcomm, ARM, etc, NVIDIA is the most open about their architectures and are willing to reveal the most (but not all) about what goes into their "secret sauce". The others justifiably withhold information that could give the competition a leg up simply because the buying public isn't positively served by immense technical details. They care about performance, perf per watt and positioning. If anything I can see this leading to a push towards releasing LESS information in an effort to emulate what other companies are already doing: focusing the press on raw performance regardless of what's going on under the hood.
> 
> Again, this is just my personal opinion....


Careful, don't go into the long grass. You're talking sense.


----------



## criminal

Quote:


> Originally Posted by *SKYMTL*
> 
> I understand, but since when do folks buy their cards based on ROP and L2 cache amounts alone? Most don't even know what those functions do. Many buy according to reviews....and those reviews have the GTX 970 in a very good position.
> The max theoretical bandwidth is the full 224GB/s since interleaving and load balancing allow for both partitions to be accessed. I'm sure that's been covered before. I think there's a disconnect in the information being presented which has caused some confusion; while the partitions are separate, they also behave in a dynamic manner so the allocation is a constantly moving target and is determined by what a game requires.


The 970 sold so well because of the initial reviews. But specs for the card were still misrepresented. I know those specs don't change the performance, but some people feel betrayed. Misrepresented specs (no matter if they cause an issue or not) warrants a refund for some people and honestly I can't say I blame them.


----------



## TopicClocker

Quote:


> Originally Posted by *criminal*
> 
> The 970 sold so well because of the initial reviews. But specs for the card were still misrepresented. I know those specs don't change the performance, but some people feel betrayed. Misrepresented specs (no matter if they cause an issue or not) warrants a refund for some people and honestly I can't say I blame them.


The memory architecture is the biggest concern.
Quote:


> Originally Posted by *[email protected]*
> 
> Hey,
> 
> First, I want you to know that I'm not just a mod, I work for NVIDIA in Santa Clara.
> 
> I totally get why so many people are upset. *We messed up some of the stats on the reviewer kit and we didn't properly explain the memory architecture.* I realize a lot of you guys rely on product reviews to make purchase decisions and we let you down.
> 
> It sucks because we're really proud of this thing. The GTX970 is an amazing card and I genuinely believe it's the best card for the money that you can buy. *We're working on a driver update that will tune what's allocated where in memory to further improve performance.*
> 
> Having said that, I understand that this whole experience might have turned you off to the card. If you don't want the card anymore you should return it and get a refund or exchange. If you have any problems getting that done, let me know and I'll do my best to help.
> 
> --Peter


----------



## mouacyk

Quote:


> Originally Posted by *SKYMTL*
> 
> ...
> The max theoretical bandwidth is the full 224GB/s since interleaving and load balancing allow for both partitions to be accessed. I'm sure that's been covered before. I think there's a disconnect in the information being presented which has caused some confusion; while the partitions are separate, they also behave in a dynamic manner so the allocation is a constantly moving target and is determined by what a game requires.


If you look at the diagrams that I enhanced here, you will see that the last enabled L2 block is handling data transactions for two DRAM modules via the green cache interconnect. That L2 block's link to the crossbar cannot possibly address both of its connected DRAM modules at the same time. This technically drops the bandwidth down to an effective 7/8 of 224GB/s, which is 196GB/s. In this respect the card does not have a true 256-bit bus, because one of the 32-bit buses is dual-purposed (doubled) to handle 2 DRAM chips.
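That 7/8 figure is just arithmetic; here's a quick sketch, assuming the publicly listed GTX 970 specs (7Gbps effective GDDR5, eight 32-bit memory controllers):

```python
# Quick arithmetic behind the 7/8 figure (assumes the public GTX 970
# specs: 7Gbps effective GDDR5 and eight 32-bit memory controllers).
def controller_bw_gbs(bus_bits=32, effective_gbps=7.0):
    """Peak bandwidth of a single memory controller, in GB/s."""
    return bus_bits * effective_gbps / 8  # divide by 8: bits -> bytes

full_bus = 8 * controller_bw_gbs()   # all 8 MCs active
main_pool = 7 * controller_bw_gbs()  # only the 3.5GB partition's 7 MCs

print(full_bus, main_pool)  # 224.0 196.0
```

So seven controllers give exactly the 196GB/s I'm talking about.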


----------



## SKYMTL

Quote:


> Originally Posted by *Forceman*
> 
> According to the Anand article, that 224GB/s is only possible if both memory banks are accessed simultaneously, and they say that can't happen. Do you have different information on that?
> 
> Edit: ninja'd. But that's still contrary to what Anand says happens. Maybe something to clarify with Nvidia?


Sure. I asked Jonah this question the other day over email and here is a direct quote, without any wanna-be overly technical mumbo jumbo:

_First case (just using lower 3.5GB): in this case, only 7 of the 8 DRAMs are in use. So if it maxes out all 7 that would be 7*32*3.5*2/8 = 196GB/sec peak.

Second case (using lower 3.5GB and upper 0.5GB): in this case the memory bandwidth really depends on the load. *It could be as much as all the memory bandwidth or 224GB/sec if the workload is balanced well, but if the workload is imbalanced (100% reads as an example with no writes), then it could be half bandwidth*. We have extra read/write request bandwidth from the L2s to the MCs which is why the double load issue is harder to hit here.

Third case (going beyond 4GB): now PCIe is involved and bandwidth will go down fast if you start using it too much._

_One thing to point out is that some folks I think have looked at the CUDA memory test (Nai's benchmark) and gotten concerned about whether, when you use the 0.5GB, bandwidth would be 1/8th. That's not really right... that would be right if you were *only* using the 0.5GB (because now you're just using that one memory) and leaving the other memories idle. But if you assume the more likely case that you're evenly using the 4GB memory address range, then you're accessing all the memories._

And when I asked about how the load balancing was accomplished in this case (this is after a LONG technical bit):

_Actually the algorithm is oriented more towards trying to put data in the 0.5GB segment that is least likely to get accessed often. In general, that's the simplest approach to take, and simple is good for software algorithms, esp since we don't really know for sure what the access patterns might look like._

Essentially, NVIDIA is using software-based heuristics to ensure the 0.5GB segment is utilized when memory requirements surpass the 3.5GB mark. This is why games ARE able to access the full 4GB on the card, and when using a tool like AIDA64 which engages both partitions, read and write access is virtually identical between the GTX 970 and GTX 980.
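To make Jonah's three cases concrete, here's a rough toy model of the peak numbers involved. This is my own sketch, not NVIDIA's logic; the 28GB/s per controller comes from the public specs, and the PCIe figure is just an order-of-magnitude placeholder:

```python
PER_MC_GBS = 28.0  # one 32-bit controller at 7Gbps effective GDDR5

def peak_bw_gbs(case, balanced=True):
    """Rough peak-bandwidth model for the three cases described above."""
    if case == "within_3.5GB":
        return 7 * PER_MC_GBS                  # 196: only seven MCs in play
    if case == "spanning_both":
        # Balanced read/write traffic can reach the full bus; a fully
        # one-sided workload (e.g. 100% reads) can fall to roughly half.
        return 8 * PER_MC_GBS if balanced else 8 * PER_MC_GBS / 2
    if case == "beyond_4GB":
        return 16.0                            # ~PCIe 3.0 x16, placeholder
    raise ValueError(case)

print(peak_bw_gbs("within_3.5GB"))             # 196.0
print(peak_bw_gbs("spanning_both"))            # 224.0
print(peak_bw_gbs("spanning_both", balanced=False))  # 112.0
```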

Hope that helps!


----------



## PostalTwinkie

Quote:


> Originally Posted by *mouacyk*
> 
> Even NVidia would disagree with you (now). Nothing is theoretical anymore. Those that want NVidia to own up, you can do so. Those who are content, so be it.


I doubt that one person, who was clearly addressing a single individual who had just purchased his 970 and telling him to contact the retailer for a return, is speaking on behalf of all of Nvidia.


----------



## 2010rig

Quote:


> Originally Posted by *looniam*
> 
> no, they bought ryan a *BENTLY* that he admitted to in one of the old framepacing/FCAT threads.
> (i am now, again, seriously thinking about searching for it)


I could've sworn he said they bought him a Porsche in the video, lol.
Quote:


> Originally Posted by *SKYMTL*
> 
> I know Ryan and he absolutely does not drive a Bentley, nor does he live in a mansion, nor does ANY of us make huge profits from this gig. He is actually one of the most straight-up guys in the ever-shrinking "true" (ie: not a "blogger" or YouTube "personality") hardware press community. To think or even claim otherwise is absolutely preposterous.


It was a joke, when the FCAT methodology was being first introduced some people around here were claiming that he was being paid by NVIDIA to run those tests. We all know he wasn't, but the conspiracy theories were running wild at the time, because they didn't want to accept that maybe just maybe the 7970 stuttered.

In one of his videos he jokingly said NVIDIA bought them cars, etc. But he bought his own FCAT equipment. NVIDIA was too stingy to buy that, after buying him the car.


----------



## swiftypoison

How is Nvidia helping with refunds? I have a GTX 970 from Newegg and they said that until Nvidia makes an official statement, they aren't taking anything back.


----------



## PostalTwinkie

Quote:


> Originally Posted by *SKYMTL*
> 
> Essentially, NVIDIA is using software-based heuristics to ensure the 0.5GB segment is utilized when memory requirements surpass the 3.5GB mark. This is why games ARE able to access the full 4GB on the card, and when using a tool like AIDA64 which engages both partitions, read and write access is virtually identical between the GTX 970 and GTX 980.
> 
> Hope that helps!


Nvidia is addressing it via software, the OS addresses it properly, and game engines will even see it and use it that way - if done right. Yet most people don't want to recognize that and still rant about "What if we addressed just the last .5 by itself?!"

I am not saying that is what Forceman is doing, just generally speaking to the point that the whole ecosystem knows how to address the memory.
Quote:


> Originally Posted by *swiftypoison*
> 
> How is Nvidia helping with refunds? I have a GTX 970 from Newegg and they said that until Nvidia makes an official statement, they arent taking anything back.


If you are out of your return window with your respective retailer, there is nothing you can do without Nvidia issuing a refund. Which I don't see happening, given this new knowledge doesn't change the performance of the card.

Although, just by the fact of misprinted specs on the box, they need to offer something.


----------



## iSlayer

Quote:


> Originally Posted by *rdr09*
> 
> Your beloved nVidia DUPED people into buying a car with 3 tires and a spare and you bash AMD.


I was making a joke based upon someone else's post ;-;.

And no, I hardly love Nvidia. Unless hoping for a law suit or a refund for my 970 and repeatedly saying Nvidia made a mistake counts as love.


----------



## PostalTwinkie

Quote:


> Originally Posted by *iSlayer*
> 
> I was making a joke based upon someone else's post ;-;.
> 
> And no, I hardly love Nvidia. Unless hoping for a law suit or a refund for my 970 and repeatedly saying Nvidia made a mistake counts as love.


His version of love is a dark and twisted one......


----------



## mouacyk

Quote:


> Originally Posted by *swiftypoison*
> 
> How is Nvidia helping with refunds? I have a GTX 970 from Newegg and they said that until Nvidia makes an official statement, they arent taking anything back.


If you are serious about your refund, take up the offer NVidia has posted here. I think they understand how difficult it is for you to convince your sellers, so they are willing to do this on your behalf if you're unable to provide the necessary information yourself.


----------



## SKYMTL

Quote:


> Originally Posted by *criminal*
> 
> The 970 sold so well because of the initial reviews. But specs for the card were still misrepresented. I know those specs don't change the performance, but some people feel betrayed. Misrepresented specs (no matter if they cause an issue or not) warrants a refund for some people and honestly I can't say I blame them.


I think I can get behind that explanation. It goes back to a situation similar to what Mazda experienced, in which they didn't publish the correct horsepower figures for their Mazda3 from 2004 to 2007. Performance remained the same but the specs changed.
Quote:


> Originally Posted by *mouacyk*
> 
> If you look at the diagrams that I enhanced here, you will see that the last enabled L2 block is handling data transactions for two DRAM modules via the green cache interconect. That L2 block's link to the crossbar cannot possibly address both of its connected DRAM modules at the same time. This technically drops the bandwidth down to an effective 7/8 of 224GB/s, which is 196GB/s. In this respect the card does not have a true 256-bit bus, because one of the 32-bit bus is dual-purposed (doubled) to handle 2 DRAM chips.


It is a slippery slope though. While there is technically a 256-bit interface, it is separated into two functional blocks that communicate with each other via the so-called buddy interface. The main issue will always be that missing 256KB of L2. However, this doesn't preclude the possibility of utilizing load balancing to achieve peak results. Remember, diagrams can only show so much.....


----------



## intelfan

Quote:


> Originally Posted by *Vesku*
> 
> You joke but Hyundai and Kia had to either pay certain car buyers a lump sum or give them a gas card that has $X per year for several years because they fudged the US gas mileage tests.


Exactly. Ford had to refund several hundred dollars as a sign of goodwill. Google Ford C-Max.


----------



## mouacyk

Quote:


> Originally Posted by *SKYMTL*
> 
> I think I can get behind that explanation. It goes back to a situation similar to what Mazda experienced, in which they didn't publish the correct horsepower figures for their Mazda3 from 2004 to 2007. Performance remained the same but the specs changed.
> It is a slippery slope though. While there is technically a 256-bit interface, it is separated into two functional blocks that communicate with each other via the so-called buddy interface. The main issue will always be that missing 256KB of L2 though. However, this doesn't preclude the possibility of utilizing load balancing to achieve peak results. Remember, diagrams can only show so much.....


I agree that we don't know exactly how the diagrams translate to physical traces and components on the chip. However, you can't dismiss the fact that a single L2 connection makes all the difference between a peak bandwidth of 224GB/s and 196GB/s. This L2 connection was there when the product launched. 4+ months later, it's no longer there. I'm not going to argue with their engineer.


----------



## iSlayer

Quote:


> Originally Posted by *Menta*
> 
> amen


Is that Gaben with a beard 0_o
Quote:


> Originally Posted by *PostalTwinkie*
> 
> His version of love is a dark and twisted one......


500MBs of Green

...That was awful I'll try again.

TIL I'm kinky


----------



## Forceman

Quote:


> Originally Posted by *mouacyk*
> 
> I agree that we don't know exactly how the diagrams translate to physical traces and components on the chip. However, you can't dismiss the fact that a single L2 connection makes all the difference between a peak bandwidth of 224GB/s and 196GB/s. This L2 connection was there when the product launched. 4+ months later, it's no longer there. I'm not going to argue with their engineer.


The connection was never there, it's not like they took it away. I'm assuming you just mean on the diagram, but it's not clear from that sentence.

And if you read what SKYMTL posted, it is possible to exceed 196 GB/sec depending on the load. And even if it weren't, the performance you see is still the performance you are getting. It's not like you were getting 224GB/sec and now you are only getting 196GB/sec, you are still getting the exact same bandwidth (whatever that actually is) that you were on day one.


----------



## SKYMTL

Quote:


> Originally Posted by *mouacyk*
> 
> I agree that we don't know exactly how the diagrams translate to physical traces and components on the chip. However, you can't dismiss the fact that a single L2 connection makes all the difference between a peak bandwidth of 224GB/s and 196GB/s. This L2 connection was there when the product launched. 4+ months later, it's no longer there. I'm not going to argue with their engineer.


I agree that software cannot replace physical L2 cache. No way.

It does however beg the question: if NVIDIA can balance out most of the shortcomings of the missing cache partition via software, can they ENHANCE the caching performance of other Maxwell cards by using the same principle?


----------



## mouacyk

Quote:


> Originally Posted by *SKYMTL*
> 
> I agree that software cannot replace physical L2 cache. No way.
> 
> It does however beg the question: if NVIDIA can balance out most of the shortcomings of the missing cache partition via software, can they ENHANCE the caching performance of other Maxwell cards by using the same principle?


Did they overclock that last L2 connection to transmit twice as fast? Did they shrink the electrons to be half the size for that connection?

You start out with two bridges to cross a river. Each bridge has full duplex capacity at 1 in/1 out, for a total of 4 people per crossing. One of those bridges becomes unusable (according to the new diagram). Now you're telling me that through clever load balancing, we can still get 3-4 people across per crossing?


----------



## ZealotKi11er

If Nvidia did not do this, the GTX 970 would have been a 3GB 192-bit GPU.


----------



## PureBlackFire

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If Nvidia did not do this GTX970 would have been a 3GB 192-Bit GPU.


if it was I for one would not have bought it.


----------



## PostalTwinkie

Quote:


> Originally Posted by *mouacyk*
> 
> Did they overclock that last L2 connection to transmit twice as fast? Did they shrink the electrons to be half the size for that connection?
> 
> You start out with two bridges to cross a river. Each bridge has full duplex capacity at 1 in/1 out, for a total of 4 people per crossing. One of those bridges becomes unusable (according to the new diagram). Now you're telling me that through clever load balancing, we can still get 3-4 people across per crossing?


The Maxwell architecture has new data paths that weren't previously in GPUs. Instead of just SM > Cross > L2 > MC > RAM, there is an additional pipe between the different L2 and MC clusters. So, using your bridge out scenario......

Bridge A and Bridge B.
Half way across Bridge A there is an issue that prevents traffic from flowing across it.
Traffic is then diverted across Bridge *C*, directly over to Bridge B instead - allowing for "traffic" to flow.
Quote:


> Originally Posted by *PureBlackFire*
> 
> if it was I for one would not have bought it.


Yes you would have, as the performance is the performance of the card, and you bought the card for its performance. Not for the numbers tossed on the side of a box that 90%+ of people won't even keep.

Feigned outrage is worse than unjustified outrage.


----------



## criminal

Quote:


> Originally Posted by *PostalTwinkie*
> 
> The Maxwell architecture has new data paths that weren't previously in GPUs. Instead of just SM > Cross > L2 > MC > RAM, there is an additional pipe between the different L2 and MC clusters. So, using your bridge out scenario......
> 
> Bridge A and Bridge B.
> Half way across Bridge A there is an issue that prevents traffic from flowing across it.
> Traffic is then diverted across Bridge *C*, directly over to Bridge B instead - allowing for "traffic" to flow.
> Yes you would have, as the performance is the performance of the card, and you bought the card for its performance. Not for what numbers are tossed on the side of the box 90%+ people won't even keep.
> 
> Feigned outrage is worse than unjustified outrage.


So you are saying that the 970 would have sold just as well if it only had 3GB of vram? (The bus doesn't look like it matters much on Maxwell.) I don't agree.

Reason? Because MANY people sold their 3GB 780 to pick up this 4GB* 970.


----------



## raghu78

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If Nvidia did not do this GTX970 would have been a 3GB 192-Bit GPU.


Nvidia have the 960 Ti lined up with a 1536 CC / 192-bit / 3GB / 48 ROP spec. They are the masters at binning and spin a whole lot of SKUs out of a single chip. GM204 already yields 4 different SKUs - 2 desktop and 2 mobile, all with varying specs.


----------



## mouacyk

Quote:


> Originally Posted by *PostalTwinkie*
> 
> The Maxwell architecture has new data paths that weren't previously in GPUs. Instead of just SM > Cross > L2 > MC > RAM, there is an additional pipe between the different L2 and MC clusters. So, using your bridge out scenario......
> 
> Bridge A and Bridge B.
> Half way across Bridge A there is an issue that prevents traffic from flowing across it.
> Traffic is then diverted across Bridge *C*, directly over to Bridge B instead - allowing for "traffic" to flow.
> Yes you would have, as the performance is the performance of the card, and you bought the card for its performance. Not for what numbers are tossed on the side of the box 90%+ people won't even keep.
> 
> Feigned outrage is worse than unjustified outrage.


Your explanation describes the compromise that has been made quite clear for several pages now; I understand and accept that. But since both streams of traffic have to merge between the crossbar and the last L2 cache, without speeding up or shrinking the signals being carried, there has to be a slowdown, unless you know something that only NVidia knows. Bridge out! We've been walking on air.


----------



## Lass3

780 at 1150-1200 performs just like 970 at 1500-1550..

The only 900 series card worth anything (compared to last gen) is the 980. It's ~10% faster than 290X / 780 Ti...

Returned my 970s, got a full refund after 60 days and kept the games from the vouchers. Awaiting AMDs next line of GPUs.


----------



## ChrisB17

So is the driver fix going to do anything to stop the stuttering? And will it help when going over 3.5GB, for instance in FC4?


----------



## mouacyk

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If Nvidia did not do this GTX970 would have been a 3GB 192-Bit GPU.


I wouldn't cut it down that much. Based on the revealed diagram, 224-bit width with 3.5GB capacity would still work great, without needing any compromise. 224-bit looks odd, but we have had 320-bit on the GTX 570 before, so it's not impossible.
Quote:


> Originally Posted by *ChrisB17*
> 
> So is the driver fix going to do anything to stop the stuttering? And will it help when going over 3.5GB, for instance in FC4?


You'll have to wait and see. The defect is hardware-based, so all they can really do is expose (via API) the last 0.5GB DRAM chip as its own segment, to be used by games and applications as they see fit. Furthermore, NVidia's driver could manage swapping infrequently-used assets from the primary 3.5GB region to this smaller region of memory. No matter how well they tune this, though, if a game engine chooses to constantly refresh assets, you're still going to feel the frame rate dips. It's going to require engines and the driver to work closely together to manage that region.
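For the curious, the kind of placement heuristic I mean could look something like this toy sketch. The names, sizes and the greedy hot-first policy are all invented for illustration; NVidia's actual driver logic is not public:

```python
FAST_MB = 3584  # 3.5GB primary segment (the slow segment holds the spill)

def place(allocations):
    """allocations: list of (name, size_mb, access_count) tuples.
    Greedily fills the fast segment with the most-touched assets;
    whatever doesn't fit spills to the slow 0.5GB segment."""
    hot_first = sorted(allocations, key=lambda a: a[2], reverse=True)
    layout, used = {"fast": [], "slow": []}, 0
    for name, size_mb, _count in hot_first:
        if used + size_mb <= FAST_MB:
            layout["fast"].append(name)
            used += size_mb
        else:
            layout["slow"].append(name)  # infrequently-used data lands here
    return layout

demo = place([("framebuffer", 2000, 9000),
              ("textures", 1500, 4000),
              ("shadow_cache", 600, 50)])
print(demo)  # the rarely-touched shadow_cache ends up in the slow segment
```

Even a simple policy like this keeps the hot data in the fast pool; the hard part is that the driver can only guess at future access patterns.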


----------



## PureBlackFire

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Yes you would have, as the performance is the performance of the card, and you bought the card for its performance. Not for what numbers are tossed on the side of the box 90%+ people won't even keep.
> 
> Feigned outrage is worse than unjustified outrage.


no, I seriously wouldn't. I didn't come from a 680, 770 or some lower mid-range card. I sidegraded from a 290 _because_ a whole bunch of reviews showed I wouldn't lose on average performance or VRAM; otherwise I would have watercooled the 290 or changed it for a shorter one (I needed a max 10.5" card and I had a 290 Tri-X). also, as you can see from the 660 Ti vs 670, if this card had a 192-bit bus it would be around 10% slower than it is, and yes, that would be a deal breaker.


----------



## 5pellfire

Now that the cat is out of the bag regarding the specifications,
does the GTX970 have 256-bit Memory Bus Width or not?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Lass3*
> 
> 780 at 1150-1200 performs just like 970 at 1500-1550..
> 
> The only 900 series card worth anything (compared to last gen) is the 980. It's ~10% faster than 290X / 780 Ti...
> 
> Returned my 970s, got a full refund after 60 days and kept the games from the vouchers. Awaiting AMDs next line of GPUs.


An R9 290 @ 1200 ~ GTX 970 @ 1550. A GTX 780 would have to be in the 1300-1350 range.


----------



## tsm106

Quote:


> Originally Posted by *Lass3*
> 
> 780 at 1150-1200 performs just like 970 at 1500-1550..
> 
> The only 900 series card worth anything (compared to last gen) is the 980. It's ~10% faster than 290X / 780 Ti...
> 
> Returned my 970s, got a full refund after 60 days and kept the games from the vouchers. Awaiting AMDs next line of GPUs.


Waiting on Amazon, not sure how they are gonna handle it. How did you manage your return after so many days?


----------



## hawke3757

Quote:


> Originally Posted by *Lass3*
> 
> 780 at 1150-1200 performs just like 970 at 1500-1550..
> 
> The only 900 series card worth anything (compared to last gen) is the 980. It's ~10% faster than 290X / 780 Ti...
> 
> Returned my 970s, got a full refund after 60 days and kept the games from the vouchers. Awaiting AMDs next line of GPUs.


From Newegg? And you were able to keep the "chose your own path" vouchers?


----------



## mouacyk

Quote:


> Originally Posted by *5pellfire*
> 
> Now that the cat is out of the bag regarding the specifications,
> does the GTX970 have 256-bit Memory Bus Width or not?


Based on all the review sites' definition of Memory Bus Width: yes. If you count all the connections between the 8 MCs and the 8 DRAM chips, they are ALL there. It's unfortunate that current definitions of bus width only cover the connections between the MCs and the DRAM chips.
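For context, a rough back-of-the-envelope sketch of what that bus arrangement means for peak bandwidth (assuming 8x 32-bit MCs and 7 Gbps effective GDDR5, which are the commonly cited figures; the per-segment split is my reading of the published diagram, not an official number):

```python
# Rough peak-bandwidth math for the GTX 970's memory layout.
# Assumptions: 8x 32-bit memory controllers, 7 Gbps effective GDDR5.
GBPS_PER_PIN = 7          # effective transfer rate per pin (Gbps)
BITS_PER_MC = 32          # width of each memory controller

def bandwidth_gb_s(num_mcs):
    """Peak bandwidth in GB/s for a given number of active 32-bit MCs."""
    return num_mcs * BITS_PER_MC * GBPS_PER_PIN / 8  # bits -> bytes

print(bandwidth_gb_s(8))  # full 256-bit bus: 224.0 GB/s (the advertised figure)
print(bandwidth_gb_s(7))  # 3.5GB segment alone: 196.0 GB/s
print(bandwidth_gb_s(1))  # 0.5GB segment alone: 28.0 GB/s
```

So the advertised 224 GB/s only holds if both segments can be counted at once; the fast segment on its own tops out lower.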


----------



## SKYMTL

Quote:


> Originally Posted by *ChrisB17*
> 
> So is the driving fix going to do anything to stop the stuttering? And will it help when going over 3.5gb for instance on FC4?


AMD exhibits the same issues in FC4. It's the Dunia engine that's messing things up.


----------



## Forceman

Quote:


> Originally Posted by *mouacyk*
> 
> I wouldn't cut it down that much. Based on the revealed diagram, 224-bit width with 3.5GB capacity would still work great, without needing any compromise. 224-bit looks odd, but we have had 320-bit on the GTX 570 before, so it's not impossible.


If you ignore the "extra" VRAM, that's basically what you have right now.


----------



## tsm106

Nice, I just chatted with Amazon. They are issuing a return label as I type this for full refund.


----------



## tpi2007

Quote:


> Originally Posted by *2010rig*
> 
> How can anyone hate NVIDIA? that's awesome.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> People are seriously asking for $100 vouchers to upgrade to a 980, maybe they should first explain how this issue is worth $100 towards a new upgrade, and or how their 970 is suddenly worth $100 less.
> 
> They should also have to start their sentence with, *I understand the missing ROP's don't mean anything*, and this whole RAM "limitation" is affecting my gaming experience in the following ways:
> 
> Another guy asked if he returns the card, can he keep the FREE game? Lol


Are you sure about that? I'm not sure I am, because of this confusing correction:



http://techreport.com/discussion/27724/nvidia-the-geforce-gtx-970-works-exactly-as-intended?post=878889

I'm not sure if I understand this right, but then again I get the feeling that Damage (Scott Wasson from TR) didn't fully explain the scope of his correction to the article. Is he implying that shadow mapping on the GTX 970 isn't limited by the 56 ROPs?

Quote:


> Originally Posted by *SKYMTL*
> 
> It is a slippery slope though. While there is technically a 256-bit interface, it is separated into two functional blocks that communicate with each other via the so-called buddy interface. The main issue will always be that missing 256KB of L2 though. However, this doesn't preclude the possibility of utilizing load balancing to achieve peak results. Remember, diagrams can only show so much.....


I would think that if you could use load balancing to achieve 224 GB/s on a card with the GTX 970's configuration, then I don't see why Nvidia wouldn't boost the GTX 980's peak theoretical bandwidth upwards of 224 GB/s, since that card has more resources available to it.

You can't have your cake and eat it.

And on the 970 you can't access both segments at the same time. More: what happens if the drivers determine that the data on the second segment will be used more from a certain point on and really needs to be moved to the first segment? Since both can't be accessed at the same time, then, as far as I understand it, that makes it impossible to copy data directly from one to the other; it has to be moved to system RAM and then copied back to the other segment, with all the associated latency.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If Nvidia did not do this GTX970 would have been a 3GB 192-Bit GPU.


But technically speaking they didn't have to do this. They didn't have to disable the L2 cache, and thus the 8 ROPs could also have been used. They didn't do it for the GTX 980M (1536 CUDA cores). We're talking about market segmentation, and given that 28nm is a mature process, I'm not sure we can talk about yield issues either.


----------



## 2010rig

Quote:


> Originally Posted by *Lass3*
> 
> 780 at 1150-1200 performs just like 970 at 1500-1550..
> 
> The only 900 series card worth anything (compared to last gen) is the 980. It's ~10% faster than 290X / 780 Ti...
> 
> Returned my 970s, got a full refund after 60 days and kept the games from the vouchers. Awaiting AMDs next line of GPUs.


So let me get this straight, you returned the card because of mislabeled specs, and you kept the games anyway? Classy move there bud.

People have some twisted principles.
Quote:


> Originally Posted by *5pellfire*
> 
> Now that the cat is out of the bag regarding the specifications,
> does the GTX970 have 256-bit Memory Bus Width or not?


It's still 256 bit.
Quote:


> Originally Posted by *tpi2007*
> 
> Are you sure about that ? I'm not sure I am because of this confusing correction:
> 
> 
> 
> http://techreport.com/discussion/27724/nvidia-the-geforce-gtx-970-works-exactly-as-intended?post=878889
> 
> I'm not sure if I understand this right, but then again I get the feeling that Damage (Scott Wasson from TR) didn't fully explain the scope of his correction to the article. Is he implying that Shadowmapping on the GTX 970 isn't limited by the 56 ROPs?


This is what I gathered from PCPER's info... I'll go read up on what TR is saying...
Quote:


> You should take two things away from that simple description. First, despite initial reviews and information from NVIDIA, the GTX 970 actually has fewer ROPs and less L2 cache than the GTX 980. NVIDIA says this was an error in the reviewer's guide and a misunderstanding between the engineering team and the technical PR team on how the architecture itself functioned. That means the GTX 970 has 56 ROPs and 1792 KB of L2 cache compared to 64 ROPs and 2048 KB of L2 cache for the GTX 980. Before people complain about the ROP count difference as a performance bottleneck, *keep in mind that the 13 SMMs in the GTX 970 can only output 52 pixels/clock and the seven segments of 8 ROPs each (56 total) can handle 56 pixels/clock. The SMMs are the bottleneck, not the ROPs.*
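For what it's worth, the arithmetic in that bolded claim checks out; a tiny sketch (the 4 pixels/clock per SMM figure comes from the quoted explanation, not my own measurement):

```python
# Sanity check of the SMM-vs-ROP bottleneck claim for the GTX 970.
SMM_COUNT = 13
PIXELS_PER_SMM = 4   # pixels/clock per SMM, per the quoted explanation
ROP_COUNT = 56       # 7 segments x 8 ROPs

smm_output = SMM_COUNT * PIXELS_PER_SMM   # what the SMMs can feed per clock
rop_capacity = ROP_COUNT                  # 1 pixel/clock per ROP

# The SMMs can't saturate the ROPs, so the SMMs are the limiter.
print(smm_output, rop_capacity, smm_output < rop_capacity)  # 52 56 True
```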


----------



## Forceman

Quote:


> Originally Posted by *tpi2007*
> 
> And on the 970 you can't access both segments at the same time. More: what happens if the drivers determine that the data on the second segment will be used more from a certain point on and really needs to be moved to the first segment ? Since both can't be accessed at the same time, then, as far as I understand it, that makes it impossible to copy data from one to the other, it has to be moved to system RAM and then copied back to the other segment, with all the associated latency.


Wouldn't it just get read into L2 and then written back to the other block? You don't have to access both blocks simultaneously to read out of one and then write that data into another, I wouldn't think. Isn't that the point of having the L2 cache?


----------



## Orangey

Stop mindlessly defending this awful company. You are preventing the market from correcting itself.


----------



## mouacyk

Quote:


> Originally Posted by *Forceman*
> 
> Wouldn't it just get read into L2 and then written back to the other block? You don't have to access both blocks simultaneously to read out of one and then write that data into another, I wouldn't think. Isn't that the point of having the L2 cache?


He's probably talking about a swap significantly larger than 2MB. In your scenario, I think you're destroying the content in the 3.5GB region you're overwriting. Is it feasible to swap in 2MB chunks via the L2 cache only, without going to system RAM? Very possibly. There's going to be an impact regardless.
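Purely as an illustration of the cost being described here, a toy sketch of swapping two regions through a staging buffer much smaller than the regions themselves (sizes and names are stand-ins; this is not how the driver actually works):

```python
# Illustrative only: swapping two equal-sized memory regions using a
# staging buffer ("L2") far smaller than the regions themselves.
def swap_via_staging(seg_a, seg_b, chunk_size):
    """Swap the contents of two equal-length bytearrays chunk by chunk."""
    assert len(seg_a) == len(seg_b)
    for off in range(0, len(seg_a), chunk_size):
        staging = bytes(seg_a[off:off + chunk_size])       # read A into staging
        seg_a[off:off + chunk_size] = seg_b[off:off + chunk_size]
        seg_b[off:off + chunk_size] = staging              # write staging to B
    # Each chunk costs three transfers instead of one, plus per-trip latency.

a = bytearray(b"A" * 16)
b = bytearray(b"B" * 16)
swap_via_staging(a, b, 4)
print(a == b"B" * 16 and b == b"A" * 16)  # True
```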

This *slippery slope* started when this whole compromise got swept under the rug from the beginning, not just when we started trying to understand whether 224GB/s peak bandwidth is still technically correct.


----------



## skupples

Quote:


> Originally Posted by *SKYMTL*
> 
> AMD exhibits the same issues in FC4. It's the Dunia engine that's messing things up.


Right, I do not discount the issue this can cause, especially for SLI users @ 1440p+ resolutions, since they can actually push FPS high enough + settings high enough to consume lots of VRAM without running @ 11-15 FPS (like 99% of these tests keep showing) BUUUUUUUUUUUUUT...

Using FC4, AC:U, Watch_Dogs, & Ryse (all games that run like complete ass, even when hardware isn't being pushed to its limit, or even close to its limit) detracts from the case.

Pretending that these games only stutter when using 970s = no, sorry. Ryse & FC4 are the biggest offenders. ALSO, all of these games ALREADY have issues when running SLI/CrossFire, with what appears to be VRAM stutter, even when nowhere close to running out of VRAM, core power, or CPU power.

also, zomg! A game runs poorly at 11-15 FPS? who'd'a thunk it?! OH, it runs even worse than a 980 when both are running at 11-15 FPS?! Noooowaaaaaayyyyy... mind blowing.

Would LOOOOOOOOOOOOOOOOOVE to see benchmarks that are running @ 60FPS - 120FPS (you know, where most of us game @) while consuming all the memories, but that probably makes me a shill.


----------



## Forceman

Quote:


> Originally Posted by *mouacyk*
> 
> He's probably talking about a swap significantly larger than 2MB. In your case, I think you're destroying the content in the 3.5GB region where you're overwriting. Is it feasible to swap in 2MB chunks via the L2 cache only, without going to RAM? Very possibly. There's going to be an impact, regardless.


He he, yeah, forgot about the disparity in memory size there.


----------



## Heavy MG

Quote:


> Originally Posted by *2010rig*
> 
> So let me get this straight, you returned the card because of mislabeled specs, and you kept the games anyway? Classy move there bud.
> People have some twisted principles.
> It's still 256 bit.
> This is what I gathered from PCPER's info... I'll go read up on what TR is saying...


And you're still defending Nvidia? It's more like 224-bit or less; the rest of the bandwidth isn't technically there with the last 0.5GB.
I also think he's right to keep the game vouchers after such a mess. Nvidia of course hands out the free games; it's the least they could do for a 970 owner after all the lies.
Quote:


> Originally Posted by *PostalTwinkie*
> 
> Yes you would have, as the performance is the performance of the card, and you bought the card for its performance. Not for what numbers are tossed on the side of the box 90%+ people won't even keep.
> Feigned outrage is worse than unjustified outrage.


I for one would not have. If Nvidia had sold it as a 3GB 192-bit card, I would have picked up a 290X instead, because the lesser true specs of the 970 may now affect its short-term future performance vs the 290 & 290X.


----------



## benbenkr

Quote:


> Originally Posted by *criminal*
> 
> So you are saying that the 970 would have sold just as well if it only had 3GB of vram? (The bus doesn't look like it matters much on Maxwell.) I don't agree.
> 
> Reason? Because MANY people sold their 3GB 780 to pick up this 4GB* 970.


The "many" people you speak of, they're from OCN? Great stats.


----------



## ZealotKi11er

Quote:


> Originally Posted by *benbenkr*
> 
> The "many" people you speak of, they're from OCN? Great stats.


People who bought a GTX 780 in the first place will buy a GTX 970.


----------



## tpi2007

Quote:


> Originally Posted by *2010rig*
> 
> This is what I gathered from PCPER's info... I'll go read up on what TR is saying...
> Quote:
> 
> 
> 
> You should take two things away from that simple description. First, despite initial reviews and information from NVIDIA, the GTX 970 actually has fewer ROPs and less L2 cache than the GTX 980. NVIDIA says this was an error in the reviewer's guide and a misunderstanding between the engineering team and the technical PR team on how the architecture itself functioned. That means the GTX 970 has 56 ROPs and 1792 KB of L2 cache compared to 64 ROPs and 2048 KB of L2 cache for the GTX 980. Before people complain about the ROP count difference as a performance bottleneck, *keep in mind that the 13 SMMs in the GTX 970 can only output 52 pixels/clock and the seven segments of 8 ROPs each (56 total) can handle 56 pixels/clock. The SMMs are the bottleneck, not the ROPs.*

That was the same thing Scott was saying until someone corrected him. The thing I don't get is the scope of Scott's correction, because it doesn't explain how it relates to the comment that corrected him or how it applies to the GTX 970.

Quote:


> Originally Posted by *Forceman*
> 
> Wouldn't it just get read into L2 and then written back to the other block? You don't have to access both blocks simultaneously to read out of one and then write that data into another, I wouldn't think. Isn't that the point of having the L2 cache?


Looking at the diagram and how only a 256KB L2 portion has access to that second segment, it's possible it can be done, but it would probably take a long time since the 'window' is so small.

On another note, I also find it interesting that there is another team that probably would have noticed the specifications discrepancy: the driver team. They too have to be aware of what cards they are optimizing for, so I find it hard to believe that they didn't know: that they didn't read any reviews of the card, never used GPU-Z or saw screenshots of it on the Internet, and never read comments on the specs in forum discussions or heard about them from friends.


----------



## Silent Scone

I think they knew full well; I honestly don't believe the idea that nobody at any level knew was ever in question. Driver programmers don't really communicate at the consumer level.

Best case, they quite plainly didn't think it was important; worst case, the spec was intentionally never divulged to marketing in the first place.


----------



## criminal

Quote:


> Originally Posted by *2010rig*
> 
> So let me get this straight, you returned the card because of mislabeled specs, and you kept the games anyway? Classy move there bud.
> 
> People have some twisted principles.


Quote:


> Originally Posted by *JustCallMeVlad*
> 
> It does? Since when? 1414bc/1902m 780 scores pretty much the same graphics score as my old 1558bc/2002m 970 does in FS
> 
> BTW real classy move keeping the games


Not defending him, but how would one "give" the game back if it has already been redeemed? More than likely the retailer that issued the refund was well aware of the free game and didn't bother to deduct that from the refund amount.
Quote:


> Originally Posted by *benbenkr*
> 
> The "many" people you speak of, they're from OCN? Great stats.


Sorry, should have said _some_. But I can guess there were people who upgraded because the 970 had 4GB of ram. Had it been 3GB, I don't see some of those people upgrading.


----------



## wooshna

Quote:


> Originally Posted by *2010rig*
> 
> So let me get this straight, you returned the card because of mislabeled specs, and you kept the games anyway? Classy move there bud.
> 
> People have some twisted principles.
> It's still 256 bit.


Seriously defending what nvidia did?

Because the video card still works?

Would you buy a house with 2000 square feet of real estate, then find out 6 months later you only have 1700 square feet, and be ok with it? You can still live in the house....

Would you buy a 4-door sedan, then find out you can't access 1 of the doors, and be ok with it? The car still works.....

Bottom line is people paid for a 970 with what was advertised.

Finding out you bought a gimped video card $349

Nvidia saying they didn't know there was an issue!! 6 months of your time

Nvidia then saying it doesn't really cause any issues!! damage control incoming

People who are ok with this type of behavior from companies $Priceless$

I'm not a red, green or blue fan; I buy what is best at the time with the budget I have. I won't accept a company that has no integrity in its business practices. If AMD pulled something like this I wouldn't buy AMD next time around and would get something else.


----------



## criminal

Quote:


> Originally Posted by *wooshna*
> 
> Seriously defending what nvidia did?
> 
> Because the video card still works?
> 
> Would you buy a house with 2000 square feet or real estate to find out 6 months later you only have 1700 squre feet? and be ok with it. you can still live in the house....
> 
> Would you buy a 4 door sedan to find out you cant access 1 of the doors? and be ok with it? the car still works.....
> 
> Bottom line is people paid for a 970 with what was advertised.
> 
> Finding out you bought a gimped video card $349
> 
> Nvidia saying they didn't know there was an issue!! 6 months of your time
> 
> Nvidia then saying it doesn't really cause any issues!! damage control incoming
> 
> People who are ok with this type of behavior from companies $Priceless$
> 
> I'm not a red, green or blue fan i buy what is best at the time with what budget i have. i wont accept any company that has no integrity with their business practices. if AMD pulled something like this i wouldnt buy AMD next time around and get something else.


Kind of shows you someone's bias. Even those that claim they don't have any. *Cough* 2010rig *Cough*









Edit: Nvidia representative? http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/1050#post_23470519


----------



## SandGlass

Quote:


> Originally Posted by *SKYMTL*
> 
> In my experience, when compared to AMD, Intel, Qualcomm, ARM, etc, NVIDIA is the most open about their architectures and are willing to reveal the most (but not all) about what goes into their "secret sauce". The others justifiably withhold information that could give the competition a leg up simply because the buying public isn't positively served by immense technical details. They care about performance, perf per watt and positioning. If anything I can see this leading to a push towards releasing LESS information in an effort to emulate what other companies are already doing: focusing the press on raw performance regardless of what's going on under the hood.


This is utterly wrong. Nvidia is not even close to being the most open. Of the five companies you mentioned, Intel is by far the biggest contributor to the open source community: they have very good open source Linux drivers and also contribute to GCC. AMD has also been great about documenting their graphics architecture and providing support to Mesa, and they contribute to LLVM as well. Meanwhile, Nvidia's Nouveau open source driver still cannot do memory or core reclocking for cards released in the last 5 years, and their documentation comes out in tiny dribbles.


----------



## ZealotKi11er

From experience, a lot of people buy stuff in order to use it later or take advantage of it, and they usually love products that come very close to something else but much cheaper. The GTX 970 being so close to the GTX 980 on paper made people happy about their purchase.


----------



## iSlayer

Quote:


> Originally Posted by *criminal*
> 
> Not defending him, but how would one "give" the game back if it has already been redeemed? More than likely the retailer that issued the refund was well aware of the free game and didn't bother to deduct that from the refund amount.
> Sorry, should have said _some_. But I can guess there were people who upgraded because the 970 had 4GB of ram. Had it been 3GB, I don't see some of those people upgrading.


I think it was the phrasing that made people assume they were keeping the game regardless of whether they were able to return it.

IE they hadn't claimed the game yet.
Quote:


> Originally Posted by *criminal*
> 
> Kind of shows you someone's bias. Even those that claim they don't have any. *Cough 2010rig Cough*


Then how about me? I'm a 970 owner, and I'm ticked off, and I do agree: if you can return the game, it's definitely not a classy thing to keep it.

I'm in that boat though, already scratched. No shame if you have; I don't think any 970 owners anticipated or could've predicted this mess.
Quote:


> Originally Posted by *SandGlass*
> 
> This is utterly wrong, Nvidia is not even close to being the most open, Intel is by far the biggest contributor in the 5 companies you have mentioned to the open source community, they have very good linux open source drivers while also making contributions to GCC, AMD has also been great with documenting their graphics architecture and providing support to Mesa, they also contribute to LLVM, while Nvidia's Nouveau open source driver still cannot do memory or core reclocking for cards released in the last 5 years, and their release of documentation is in tiny dribbles.


Does Nvidia's open source support matter, given that their Linux drivers are far better than AMD's (open source and 1st party)?
Quote:


> Originally Posted by *criminal*
> 
> Edit: Nvidia representative? http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/1050#post_23470519


Would definitely love proof of that.
Quote:


> Originally Posted by *wooshna*
> 
> Seriously defending what nvidia did?
> 
> Because the video card still works?
> 
> Would you buy a house with 2000 square feet or real estate to find out 6 months later you only have 1700 squre feet? and be ok with it. you can still live in the house....
> 
> Would you buy a 4 door sedan to find out you cant access 1 of the doors? and be ok with it? the car still works.....
> 
> Bottom line is people paid for a 970 with what was advertised.
> 
> Finding out you bought a gimped video card $349
> 
> Nvidia saying they didn't know there was an issue!! 6 months of your time
> 
> Nvidia then saying it doesn't really cause any issues!! damage control incoming
> 
> People who are ok with this type of behavior from companies $Priceless$
> 
> I'm not a red, green or blue fan i buy what is best at the time with what budget i have. i wont accept any company that has no integrity with their business practices. if AMD pulled something like this i wouldnt buy AMD next time around and get something else.


I don't think that's defense or forgiveness but people are out for blood and will see it as such.

As I said in an earlier post, buying a 970 because its performance/features/other fit your budget is still a valid choice. Likewise is not buying one because of Nvidia's mess-up.

They're both good reasons to buy or not buy a 970 respectively.


----------



## mouacyk

Quote:


> Originally Posted by *SandGlass*
> 
> This is utterly wrong, Nvidia is not even close to being the most open, Intel is by far the biggest contributor in the 5 companies you have mentioned to the open source community, they have very good linux open source drivers while also making contributions to GCC, AMD has also been great with documenting their graphics architecture and providing support to Mesa, they also contribute to LLVM, while Nvidia's Nouveau open source driver still cannot do memory or core reclocking for cards released in the last 5 years, and their release of documentation is in tiny dribbles.


Which is precisely why the creator of the Linux kernel publicly flipped NVidia the bird. To support your sentiments: if anything, communities like ours aren't encouraging companies to release LESS (or even MORE) documentation. How about just ACCURATE documentation?


----------



## provost

Quote:


> Originally Posted by *Silent Scone*
> 
> I think they knew full well, honestly don't believe them not knowing at any level came into question. Driver programmers don't really communicate at consumer level.
> 
> Best case is that they quite plainly didn't think it was important, *worst case - the spec was never divulged to marketing* in the first place intentionally.


Probably true


----------



## SKYMTL

Quote:


> Originally Posted by *SandGlass*
> 
> This is utterly wrong, Nvidia is not even close to being the most open, Intel is by far the biggest contributor in the 5 companies you have mentioned to the open source community, they have very good linux open source drivers while also making contributions to GCC, AMD has also been great with documenting their graphics architecture and providing support to Mesa, they also contribute to LLVM, while Nvidia's Nouveau open source driver still cannot do memory or core reclocking for cards released in the last 5 years, and their release of documentation is in tiny dribbles.


Where in the world did I say anything about the open source community? I was discussing pre-launch architectural specifics rather than long-term driver support. If we were talking about open source support after an architecture has launched, the conversation would be completely different.


----------



## PostalTwinkie

This thread is just disgusting, and a grand display of how terrible people/consumers are, and how spoiled people are.

People are so stupid as to claim the card is now a 3 GB card, when it really is 4 GB and there are 4 GB on it that you can see with your own eyeballs. People are selfish and stupid enough to ask for a free upgrade to a 980, or a $100 voucher/refund because of "performance". Just stupid and selfish!

If you want to be "fair" and "get what you paid for" then take what you paid for your 970, and whatever 1/8th of that is, ask for that as a refund! Guess what? That number won't be $100.

The feigned hatred and rage around this makes me want to puke. This entire thread has essentially shamed OCN to the core. OCN should have been a community that we, as enthusiasts/experts, could sit down and say _"Yes, they screwed up on the spec sheet printed on the box. They need to make that better. However, performance is still the same regardless!"_

The notion that this has somehow caused a loss to the buyer, I just can't get behind. Sorry, that is a stupid and selfish thing to think. The card works as well as it did the day it left the store, if not better via drivers.

I fully back Nvidia taking action just on the grounds that they made a mistake in marketing and need to handle it. But the anger, outrage, and stupidity behind the argument that performance is lesser is vile. Anyone that truly wants a refund because of the misprint on the box deserves it; people trying to argue the same thing due to performance need to just shut up.

Either way, anyone that returns it for a refund "on principle" needs to also return any games and accessories it came with. None of this "I keep the games, you give me money back" crap people are pulling.

We knew about the performance of the card, we have previews/reviews on the card. Anyone that bought the card should have researched this performance; people bought this card for the performance. It still performs! That hasn't changed!

*Does it matter how it performs, as long as it performs?!*

It could have 5 SMX, as long as the numbers it put up were the same! 1,000 is 1,000, 500 is 500. A ton of rocks weighs as much as a ton of feathers!

People complaining about Farcry 4 stuttering at 4 GB and trying to use that hot steaming mess as a baseline, don't make me laugh. That game is a stuttering mess across any platform!

EDIT:

Do you know what this is?

People smell blood in the water and all those that want to score on it have started circling.

I have said from day one that I support the idea that Nvidia refund those that truly want it over the spec misprint, everyone else can go fly a kite.


----------



## 2010rig

Quote:


> Originally Posted by *criminal*
> 
> Not defending him, but how would one "give" the game back if it has already been redeemed? More than likely the retailer that issued the refund was well aware of the free game and didn't bother to deduct that from the refund amount.
> Sorry, should have said _some_. But I can guess there were people who upgraded because the 970 had 4GB of ram. Had it been 3GB, I don't see some of those people upgrading.


If people are going to return the cards after 60 days *based on a specs mistake*, then un-install the game, and return everything.

Since it's all about the principle, right?


----------



## provost

Quote:


> Originally Posted by *2010rig*
> 
> If people are going to return the cards after 60 days *based on a specs mistake*, *then un-install the game, and return everything.
> 
> Since it's all about the principle*, right?


Yep, that's the right thing to do.


----------



## iSlayer

Quote:


> Originally Posted by *2010rig*
> 
> If people are going to return the cards after 60 days *based on a specs mistake*, then un-install the game, and return everything.
> 
> Since it's all about the principle, right?


Criminal got you PJSalt up in this.

Nvidia can't recover the cost of the keys for the 970s, they're a sunk cost, and Ubisoft has already been paid.

So again, not much harm or foul if you did use the key.

Also that shade 2010.


----------



## Silent Scone

Slow day lol

https://twitter.com/Thracks/status/560511204951855104/photo/1



No shame.


----------



## tsm106

Hahaha!

Quote:


> Originally Posted by *Silent Scone*
> 
> No shame.


You guys are on some good freaking drugs or something. They lied and misled consumers for months until people found out. If no one had pushed the boundaries, would they have come out with the truth on their own? Shame is for some wrong you did not knowingly commit, or an unfortunate event that happens to you. Not for one that they invented and masterminded.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Silent Scone*
> 
> Slow day lol
> 
> https://twitter.com/Thracks/status/560511204951855104/photo/1
> 
> 
> 
> No shame.


Took AMD longer than expected to come up with that.

They must have used an AMD processor to render that in Photoshop; it takes a while.

EDIT: I kid, I kid, I couldn't resist!


----------



## The Robot

Quote:


> Originally Posted by *provost*
> 
> Yep, that's the right thing to do.


Why stop there? They should sell their entire rigs and switch to consoles.


----------



## MerkageTurk

Postaltwinkie

The problem is they lied.

What don't you defenders of Nvidia understand?

My Ti should destroy the 970, but Nvidia are unethically making it slower.

Nvidia's business practice is shady.

Now a 290X is on par with a 980, when its main rival was the 780 Ti.


----------



## SKYMTL

Oh, the marketing war is going to heat up now!


----------



## PostalTwinkie

Quote:


> Originally Posted by *MerkageTurk*
> 
> Postaltwinkie
> 
> Problem is they lied
> 
> What don't you defendants of nvidia understand?
> 
> My ti should destroy the 970 but nvidia are unethically making it slower.
> 
> Nvidias business practice is shady.
> 
> Now a 290x is on par with a 980, its main rival was a 780 ti


They lied? You have proof they lied? Do you even know what "Lying" means? What it means to "lie"?
Quote:


> A lie is an intentionally false statement to a person or group made by another person or group who knows it is not wholly the truth.


The explanation presented by Nvidia of how this happened, if you actually read/listened to it and understood it, makes complete sense. You have a marketing team that is used to things being done on the hardware side the way they have been for years. Then Maxwell comes out, built with the ability to utilize an alternate memory architecture, and that isn't translated to Marketing.

For anyone to say Nvidia lied is just putting their foot in their mouth, unless they have some proof that Nvidia did lie. Want to know if Nvidia lied? Ask yourself this.....

Has the performance of the GTX 970 changed from before this information came out to after it? Is the 970 slower? Are the 970 benchmarks that were done different?

No. No, they aren't.

Guess what folks, crap happens in business, especially when it comes to labeling things, and it isn't just one company involved. When it comes to packaging you usually have multiple companies involved: artists, designers, the packaging manufacturer/printer, etc. Screwing up on a spec sheet isn't that difficult!

Unfortunately it was lost in translation between Geek Speak in the lab and Marketing Speak upstairs, and/or somewhere between the various entities involved with designing the retail packaging. God, people are acting like a business isn't allowed to screw up, almost refusing to give them the chance to apologize and make it right.


----------



## mouacyk

"4GB means 4GB!" -- What a load of ambiguity!

AMD in Producer Mode: Y'all consumers need to accept that 4GB was sold. No lies there.
AMD Competitor Mode: Yo' NV -- sup with the play man? You dirty!


----------



## Noufel

Nvidia " the way 3.5gb card is meant to be a 4gb one "


----------



## Forceman

Quote:


> Originally Posted by *PostalTwinkie*
> 
> They lied? You have proof they lied? Do you even know what "Lying" means? What it means to "lie"?
> The explanation presented by Nvidia and how this happened, if you actually read/listened to it and understood it, makes complete sense. You have the marketing team which is used to things being done on the hardware side in a way that has been the same for years. Then Maxwell comes out, is built with the ability to utilize alternate architecture, and that isn't translated to Marketing.
> 
> For anyone to say Nvidia lied is just putting their foot in their mouth, unless they have some proof that Nvidia did lie. Want to know if Nvidia lied? Ask yourself this.....
> 
> Has the performance of the GTX 970 changed from before the release of the new architecture to after it? Is the 970 slower? Are the 970 benchmarks that were done different?
> 
> No. No, they aren't.
> 
> Guess what folks, crap happens in business. Especially when it comes to labeling things, it just isn't one company involved. When it comes to packaging you have multiple companies involved usually. Artists, designers, packaging manufacturer/printer, etc. Screwing up on a spec sheet isn't that difficult!
> 
> Unfortunately it was lost in translation between Geek Speak in the lab to Marketing Speak upstairs, and/or somewhere between the various entities involved with designing the retail packaging. God, people are acting like a business isn't allowed to screw up, almost refusing to give them the option to apologize and make it right.


I don't know that anyone really believes Nvidia knowingly lied to reviewers about the ROP/L2 issue (what would be the point?), but it's a bit of a stretch to believe that none of the people that did know the real numbers read a review anytime in the following 4 months.


----------



## wooshna

Really, a company as big as Nvidia, with as much money as Nvidia has, couldn't pay a few people to proofread the print for errors?

Saying this is a "spec misprint" is the same as saying Nvidia was too stupid to realize they put a 4 instead of a 3.

People are pissed off at the false advertisement. Period.

Even if performance is great for $349, why upgrade from a 780 or 780 Ti?

Nvidia didn't "misprint" anything; they knowingly/willingly lied to consumers about their product because it "might" not sell as much.

If Nvidia had any faith in their product they would have put the real specs out there and said, "Look, it looks weaker on paper, but in the real world the performance is almost like our 980, or better than the competition."

But they chose shady business practices (because that's what everyone does) instead of taking this opportunity to set a higher standard for the industry.

I was looking to upgrade to a 970 after seeing my brother-in-law's 970 performance, but now I'll just wait to see what the next-gen cards from AMD/Nvidia will be. Yeah, I will be more critical of what Nvidia says their products are, and I'll probably wait 6 months before I buy an Nvidia card if it's what I want.


----------



## UZ7

Quote:


> Originally Posted by *Forceman*
> 
> I don't know that anyone really believes Nvidia knowingly lied to reviewers about the ROP/L2 issue (what would be the point?), but it's a bit of a stretch to believe that none of the people that did know the real numbers read a review anytime in the following 4 months.


Well, there's also the fact that the card was not made in a week and rushed out... how long was this card in production and testing?


----------



## iSlayer

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Took AMD longer than expected to come up with that.
> 
> They must have used an AMD processor to render that in photoshop, takes awhile.
> 
> EDIT: I kid, I kid, I couldn't resist!


Battle of the low blows.
Quote:


> Originally Posted by *MerkageTurk*
> 
> Postaltwinkie
> 
> Problem is they lied
> 
> What don't you defendants of nvidia understand?
> 
> My ti should destroy the 970 but nvidia are unethically making it slower.
> 
> Nvidias business practice is shady.
> 
> Now a 290x is on par with a 980, its main rival was a 780 ti


The Kepler nerf was all rumor no proof. RIP reason to QQ, press F to pay respects.
Quote:


> Originally Posted by *Noufel*
> 
> Nvidia " the way 3.5gb card is meant to be a 4gb one "


That was awful.


----------



## sugalumps

Quote:


> Originally Posted by *MerkageTurk*
> 
> Postaltwinkie
> 
> Problem is they lied
> 
> What don't you defendants of nvidia understand?
> 
> My ti should destroy the 970 but nvidia are unethically making it slower.
> 
> Nvidias business practice is shady.
> 
> Now a 290x is on par with a 980, its main rival was a 780 ti


Once again, they never "unethically" made the 7 series slower or intentionally gimped it. Those cards were EOL and discontinued, and Nvidia focused on updating the drivers of the brand-new cards, meaning those pulled ahead. The new series pulling ahead does not mean the older ones were gimped; it only means the older cards were tapped out, and due to the immature drivers of the 9 series the two were equal at first.


----------



## SKYMTL

Good lord this thread is getting nuts. I'm out for now....


----------



## 2010rig

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Took AMD longer than expected to come up with that.
> 
> They must have used an AMD processor to render that in photoshop, takes awhile.
> 
> EDIT: I kid, I kid, I couldn't resist!


I'm disappointed, I was expecting a Fixer commercial.









AMD never passes up an opportunity to dog on their competitors.


----------



## dukeReinhardt

I'm surprised people are so willing to forgive this. After all, this was done on purpose, and the fact that an apology came only after detailed investigation by an observant consumer should make people mad. The correct thing to do would have been to issue a recall as soon as possible. Do people genuinely believe that Nvidia are raging morons incapable of spotting mistakes in their own product marketing? Moreover, they're completely getting away with everything here. How many extra sales did they make from false advertising? How many people are aware this is an issue? How many people are going to return the product, even knowing the facts?

I just don't understand why consumers are willing to overlook things in order to actively defend shady businesses and their shady practices. YOU are the one being screwed, and you don't even realize it. Some people seem to think that owning a product means they're part of some sort of social club. It's not Nvidia users vs. AMD users. It's _you_ the _consumer_, and your rights vs. the big corporations that want your money as easily as possible. Wake up.


----------



## Mand12

Quote:


> Originally Posted by *MerkageTurk*
> 
> Problem is they lied
> 
> What don't you defendants of nvidia understand?


What we do understand is that there is *no benefit whatsoever* of them telling us the wrong thing. It wouldn't have mattered to the reviewers. They would have seen it, gone "Oh, hey, that's kinda weird" and gone on to also say "but the benchmarks are awesome!" and given the same recommendations, because in the end the cards still perform as they always have.

The only thing giving the wrong information does is make them look dumb and piss buyers off. You seriously think they *lied* in order to make that happen?


----------



## provost

Quote:


> Originally Posted by *sugalumps*
> 
> Once again they never "unethically" made the 7 series slower or intentionaly gimped it. Those cards were EOL discontinued and they focused on updating the drivers of the brand new cards, meaning those pulled ahead. Because the new series pulled ahead it does not mean the others ones were gimped, it only meant that those were tapped out and due to immature drivers of the 9 series they were equal at first.


Well, it appears that AMD continues to support its installed base, and does not abandon it and run to the next "sale".
From what I have gathered in our GK110 Titan thread, it looks like even the 7970s continue to get better support than the GK110s!
If Nvidia's driver-support cycle is only a few months, may I please lease a card from Nvidia instead of buying it?

Or better yet: it looks like AMD does not take as mercenary a view of its existing customers as Nvidia does, so instead of trying to push water uphill, it may be best to look elsewhere for the next upgrade.


----------



## mouacyk

Quote:


> Originally Posted by *Mand12*
> 
> What we do understand is that there is *no benefit whatsoever* of them telling us the wrong thing. It wouldn't have mattered to the reviewers. They would have seen it, gone "Oh, hey, that's kinda weird" and gone on to also say "but the benchmarks are awesome!" and given the same recommendations, because in the end the cards still perform as they always have.
> 
> The only thing giving the wrong information does is make them look dumb and piss buyers off. You seriously think they *lied* in order to make that happen?


Let me help you out there: "What we do understand is that there is *no benefit whatsoever* of them telling us the *right* thing."


----------



## Mand12

Quote:


> Originally Posted by *mouacyk*
> 
> Let me help you out there: "What we do understand is that there is *no benefit whatsoever* of them telling us the *right* thing."


No, that's not it. There is no benefit to Nvidia to intentionally providing the wrong information. Had they provided the right information back in September, nobody would have cared and none of this would now be happening. There is NO benefit to Nvidia to lie to us about this.


----------



## skupples

Quote:


> Originally Posted by *MerkageTurk*
> 
> Postaltwinkie
> 
> Problem is they lied
> 
> What don't you defendants of nvidia understand?
> 
> My ti should destroy the 970 but nvidia are unethically making it slower.
> 
> Nvidias business practice is shady.
> 
> Now a 290x is on par with a 980, its main rival was a 780 ti


I've yet to see proof, outside of, once again, a bunch of REALLY poorly running Ubisoft games, that GK110 is magically getting slower.
Yes, I know, you provided links to The Grid, and it looks like The Grid is CPU-bound, which would mean the GPU doesn't make much of a difference at all.

I just find it really funny that all of the games being used for these current conspiracies and controversies are the worst-running games of 2014.


----------



## criminal

Quote:


> Originally Posted by *2010rig*
> 
> If people are going to return the cards after 60 days *based on a specs mistake*, then un-install the game, and return everything.
> 
> Since it's all about the principle, right?


Quote:


> Originally Posted by *provost*
> 
> Yep, that's the right thing to do.


How do they return a game that has already been redeemed to their account? That doesn't make any sense.

Anyway, that game probably cost Nvidia a nickel. If Nvidia hadn't made a mistake that warrants a consumer return, they wouldn't be out the game. The consumer should get a free game on hassle alone.
Quote:


> Originally Posted by *Silent Scone*
> 
> Slow day lol
> 
> https://twitter.com/Thracks/status/560511204951855104/photo/1
> 
> 
> 
> No shame.


Love it.









Quote:


> Originally Posted by *tsm106*
> 
> Hahaha!
> You guys are on some good freaking drugs or something. They lied and misled consumers for months until people found out. If no one had pushed the boundaries, would they have come out with the truth on their own? Shame is for some wrong you did not knowingly commit, or for an unfortunate event that happens to you, not for one you invented and masterminded.


This.^

I just don't get it. I love Nvidia cards more so than AMD cards, but the level of defense some of you have for Nvidia in this thread is laughable.


----------



## mouacyk

Quote:


> Originally Posted by *skupples*
> 
> I've yet to see proof, out of once again, a bunch of REALLY poorly running Ubisoft games, that GK110 is magically getting slower.
> Yes, I know, you provided links to The Grid, and it looks like The Grid is CPU bound, which would mean GPU doesn't make much of a difference at all.
> 
> I just find it really funny that all of the games being used for all of these current conspiracies & controversies are the worst running games of 2014.


Well, you know... where things are un-optimized, as newly released games so often are, consumers hope to make up the difference with extra hardware performance, which can only be bought with cash. It would be awesome if we were all programmers and hackers who could patch the problems ourselves. It's just that sometimes we are marketed such a good deal for exactly the performance we need.

You're basically saying that not only is Nvidia not selling what it is marketing, but all the games are un-optimized too? What else don't we know?


----------



## PostalTwinkie

Quote:


> Originally Posted by *Forceman*
> 
> I don't know that anyone really believes Nvidia knowingly lied to reviewers about the ROP/L2 issue (what would be the point?), but it's a bit of a stretch to believe that none of the people that did know the real numbers read a review anytime in the following 4 months.


I don't know about that. Here is why, and maybe this is why I have the perspective I do;

Family of mine works for Intel; I have a close family member on the team that designed Sandy and Ivy, etc. After working on a project for years, with your face in technical data the whole time, you get burnt out and exhausted. After the big launch party (at Intel) for Sandy and Ivy, people were just spent. They didn't want to look at reviews, because their view was fogged and fatigued from doing it already.

They were excited about the next thing, but no longer that thing. Excited they were done, that it was in market, that it was doing well. While I won't go so far as to say that they wouldn't have looked at the specs, I kind of am....

Really if you are looking at performance comparisons of the product you just created, you aren't going to look at the spec list; you made the thing. It is very plausible that the handful of people who knew about this architecture difference, something high level and well outside the scope of most, who MAY have looked at reviews just skipped the spec list. Again, because they made the card, they don't need to read the spec sheet, they just want to know how their creation stacks up against the competition.

What it boils down to, as it pertains to the 4 month window is...


How many people at Nvidia actually knew about this architectural difference?
How many of those people read post launch reviews of their creation?
Out of those people, how many of them stopped to read the spec sheet?
When was the last time you checked your system's Device Manager to know the specs of what you built? I know I don't, nor does any other builder I know. Why? Because we built it; we don't need to read the specs, we wrote that list. Now don't flame me if somehow that gets printed wrong.


----------



## Shatterist

Corporations are still collections of people, and people are fallible. Yes, they put out false marketing material, and the speed at which they caught the error (in fact, they didn't catch it until someone tripped over an issue) is disappointing to say the least, but let's not attribute to malice what is pretty clearly incompetence across this launch. The performance of the card has not changed in the months since release, they didn't give reviewers binned cards and then unload garbage on consumers, and even now, edge cases that isolate the impact of this segmentation are apparently proving difficult to find. If this realization is making you think about returning the card and you haven't run into any problem yet, the questions become: a) what are you replacing the 970 with, and b) if it stops providing acceptable performance in the future, wouldn't you just as likely be upgrading to a new card anyway?

Tl;dr: the performance didn't change, the market hasn't changed (much); all that has changed is some marketing numbers and your perception of future-proofing.
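For readers trying to picture the segmentation being argued about: NVIDIA's statement quoted in the opening post describes a 3.5GB segment the GPU accesses with higher priority plus a 0.5GB segment used only when needed. That behavior can be sketched as a toy first-fit allocator. This is purely illustrative; the actual driver's placement heuristics are not public, and the pool sizes are the only numbers taken from NVIDIA's statement.

```python
# Toy model of the segmented VRAM scheme NVIDIA describes for the GTX 970:
# a 3.5 GB high-priority pool plus a 0.5 GB pool. Allocations fill the fast
# pool first and only spill into the slow segment once the first 3.5 GB are
# exhausted. Illustrative only -- not the real driver logic.

FAST_POOL_MB = 3584   # 3.5 GB segment (higher-priority access)
SLOW_POOL_MB = 512    # 0.5 GB segment

def place_allocations(requests_mb):
    """Return (fast_used, slow_used, failed) after placing each request."""
    fast_used = slow_used = 0
    failed = []
    for req in requests_mb:
        if fast_used + req <= FAST_POOL_MB:
            fast_used += req           # fits entirely in the fast segment
        elif slow_used + req <= SLOW_POOL_MB:
            slow_used += req           # spills into the slow segment
        else:
            failed.append(req)         # would exceed 4 GB total
    return fast_used, slow_used, failed

# A game using under 3.5 GB never touches the slow segment...
print(place_allocations([1024, 1024, 1024]))        # (3072, 0, [])
# ...while a heavier workload spills into the 0.5 GB pool.
print(place_allocations([2048, 1024, 512, 256]))    # (3584, 256, [])
```

In this model a workload under 3.5GB behaves exactly like it would on a uniform 4GB card, which matches why the issue took months to surface.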


----------



## Silent Scone

Quote:


> Originally Posted by *tsm106*
> 
> Hahaha!
> You guys are on some good freaking drugs or something. They lied and misled consumers for months until people found out. If no one had pushed the boundaries, would they have come out with the truth on their own? Shame is for some wrong you did not knowingly commit, or for an unfortunate event that happens to you, not for one you invented and masterminded.


I didn't understand one word you just said. Can I take it that you endorse companies riding on the back of others because they have nothing else to talk about?
Quote:


> Originally Posted by *criminal*
> 
> How do they return a game that has already been redeemed to their account? That doesn't make any sense.
> 
> Anyway, that game probably cost Nvidia a nickel. Nvidia shouldn't have made a mistake that warrants a consumer return and they wouldn't be out the game. The consumer should get a free game on hassle alone.
> Love it.
> 
> 
> 
> 
> 
> 
> 
> 
> This.^
> 
> I just don't get it. I love Nvidia cards more so than AMD cards, but the level of defense some of you have for Nvidia in this thread is laughable.


EDIT: Actually, I apologise; TSM just gets my back up, as he's forever a raging source of negativity. There's irony in him defending a trend of distasteful marketing whilst insinuating I'm defending it simply by posting.


----------



## criminal

Quote:


> Originally Posted by *Mand12*
> 
> No, that's not it. There is no benefit to Nvidia to intentionally providing the wrong information. Had they provided the right information back in September, nobody would have cared and none of this would now be happening. There is NO benefit to Nvidia to lie to us about this.


I don't believe for a second that it was a mistake. 64 ROPs looks better on paper, and 4GB is easier to explain than the way memory is actually addressed on the 970. They got busted and had to come clean.
Quote:


> Originally Posted by *Silent Scone*
> 
> I didn't understand one word you just said. Can I take it that you endorse companies riding on the back of others because they have nothing else to talk about?
> lol don't kid yourself, I can still reach that perch you're standing on. I hope you're not talking about me. Don't seem to recall anyone defending anything. I wouldn't piss on the 970 GTX if it was on fire let alone buy it in the first place.


Nope, wasn't talking about you.


----------



## Slink3Slyde

Quote:


> Originally Posted by *skupples*
> 
> I've yet to see proof, out of once again, a bunch of REALLY poorly running Ubisoft games, that GK110 is magically getting slower.
> Yes, I know, you provided links to The Grid, and it looks like The Grid is CPU bound, which would mean GPU doesn't make much of a difference at all.
> 
> I just find it really funny that all of the games being used for all of these current conspiracies & controversies are the worst running games of 2014.


Hiya,

http://www.overclock.net/t/1529108/are-nvidia-gimping-kepler-since-maxwell/10

Don't know if you checked back in my thread, but I looked at more numbers, this time from TechPowerUp, not including any Ubisoft games. No conspiracy theory, just numbers that point to something going on. The GTX 960 performs about the same relative to the R9 290 in both pre- and post-Maxwell games, while everything Kepler is 10% worse since Maxwell was released, according to Techspot and TechPowerUp. The tests from Guru3D in DA:I show similar results.

Check my post on the 2nd page for TPU numbers from their 960 review. I didn't use the numbers from Far Cry 4, Assassin's Creed Unity, Watch Dogs, or Wolfenstein.

Sorry for the off-topic, but if someone can disprove this or provide more data I'm all ears; it's not like I want it to be happening.
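The comparison method described above, expressing each card's average FPS as a percentage of the R9 290's in games reviewed before versus after Maxwell's launch, can be sketched as follows. The FPS figures below are invented placeholders for illustration, not the actual TechPowerUp or Techspot numbers:

```python
# Sketch of the pre/post-Maxwell relative-performance comparison: average the
# per-game (card / baseline) FPS ratios for each era, then compare the two.
# All FPS values here are hypothetical, purely to show the arithmetic.

def relative_perf(card_fps, baseline_fps):
    """Average of per-game (card / baseline) ratios, as a percentage."""
    ratios = [c / b for c, b in zip(card_fps, baseline_fps)]
    return 100.0 * sum(ratios) / len(ratios)

# Hypothetical per-game FPS averages: a Kepler card vs the R9 290 baseline.
pre_maxwell  = relative_perf([60, 72, 55], [58, 70, 56])   # older titles
post_maxwell = relative_perf([48, 63, 50], [55, 70, 58])   # newer titles

print(f"pre-Maxwell:  {pre_maxwell:.1f}% of R9 290")
print(f"post-Maxwell: {post_maxwell:.1f}% of R9 290")
print(f"relative shift: {post_maxwell - pre_maxwell:+.1f} points")
```

A shift in this ratio between eras is what the thread is debating; on its own it cannot distinguish driver neglect of the older card from driver gains on the newer baseline.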


----------



## Silent Scone

Quote:


> Originally Posted by *criminal*
> 
> I don't believe for a second that it was a mistake. 64 rops looks better on paper and 4GB is easier to explain than the way it is being addressed on the 970. They got busted and had to come clean.
> Nope, wasn't talking about you.


x


----------



## Mand12

Quote:


> Originally Posted by *criminal*
> 
> I don't believe for a second that it was a mistake.


So you believe that they intentionally created a situation that would barely have budged reviews had they been correct, yet would blow up in their faces months later?


----------



## Silent Scone

Quote:


> Originally Posted by *Mand12*
> 
> So you believe that they intentionally created a situation that would barely have budged reviews had they been correct, yet would blow up in their faces months later?


Well, they didn't think it through, did they? I'm surprised nobody has posted the analogy of dual-GPU cards being marketed as double their VRAM capacity. Quite similar: just because you and I already know better, because we're super smart, doesn't make it an "8GB" card.

Do it on a single GPU that's supposed to hit a reasonable price point and be the best bang-for-buck option for buyers, and don't duly inform people. It's a funny world, that's all. I still think someone had to sit down and say: no, no details on this, list it as one would expect.


----------



## criminal

Quote:


> Originally Posted by *Mand12*
> 
> So you believe that they intentionally created a situation that would barely have budged reviews had they been correct, yet would blow up in their faces months later?


No, I believe they didn't think the information would ever come out, or that when it did, it wouldn't matter because the next big thing (GM200) would have overshadowed all this. Sorry if I don't buy everything Nvidia says or does.


----------



## 2010rig

Quote:


> Originally Posted by *criminal*
> 
> I don't believe for a second that it was a mistake. 64 rops looks better on paper and 4GB is easier to explain than the way it is being addressed on the 970. They got busted and had to come clean.
> Nope, wasn't talking about you.


Ok, so, since they lied about the specs, how were people affected by this? Are 970s suddenly slower now? Or are they still *performing* as presented in reviews?

What's more important: the specs you thought you bought, or the performance you're getting based on the reviews you read?

I'm trying to understand why it bothers so many.

The only way I'd be pissed like some people are is if they had sold the 980 with 970 specs, meaning the 3 SMs disabled, while sending reviewers fully enabled cards.

Or if people got the cards and weren't getting the same numbers as seen in reviews.

I'm of the opinion that if they had listed the correct specs from the get-go, the card would have sold just as well anyway. x70 cards almost always have fewer ROPs and a smaller bus; that's nothing new. We're talking about such a minuscule difference that it doesn't affect the end performance promised, which is what matters.

With a $220 difference between the two, the specs are understandable.


----------



## Bit_reaper

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I don't know about that. Here is why, and maybe this is why I have the perspective I do;
> 
> Family of mine works for Intel, I have a close family member that is on the team that designed Sandy and Ivy, etc. After working on a project for years, with your face in technical data for years, you get burnt and exhausted. After the big launch party (at Intel) for Sandy and Ivy - people were just spent. They didn't want to look at reviews, because their view was fogged and fatigued from doing it already.
> 
> They were excited about the next thing, but no longer that thing. Excited they were done, that it was in market, that it was doing well. While I won't go so far as to say that they wouldn't have looked at the specs, I kind of am....
> 
> Really if you are looking at performance comparisons of the product you just created, you aren't going to look at the spec list; you made the thing. It is very plausible that the handful of people who knew about this architecture difference, something high level and well outside the scope of most, who MAY have looked at reviews just skipped the spec list. Again, because they made the card, they don't need to read the spec sheet, they just want to know how their creation stacks up against the competition.
> 
> What it boils down to, as it pertains to the 4 month window is...
> 
> 
> How many people at Nvidia actually knew about this architectural difference?
> How many of those people read post launch reviews of their creation?
> Out of those people, how many of them stopped to read the spec sheet?
> When was the last time you referenced your systems Device Manager to know the specs of what you built? I know I don't, nor does any other builder I know. Why? Because we built it, we don't need to read the specs, we wrote that list. Now don't flame me if somehow that gets printed wrong.


I suspect the truth is somewhere in between. I find it highly doubtful that no one noticed the error; surely someone somewhere pointed it out. What's far more likely is that after the error was noticed, someone in Nvidia management opted not to make it public, hoping people would let it slide or that it would go unnoticed by the general public. After all, what did they have to gain by pointing out their own error unless someone actually made a fuss about it?

That said, I do think people who truly want a refund are entitled to one, as the product they bought does not match what was advertised, and the 3.5GB/0.5GB memory split can have tangible effects on performance.

I would also point out that both AMD and Nvidia use far more misleading labeling when they sell a dual-GPU card as having 8GB of memory when it actually only has 4GB of effective memory due to how CrossFire/SLI works. At least the GTX 970 can actually load up the full 4GB worth of texture data.


----------



## battleaxe

Quote:


> Originally Posted by *SKYMTL*
> 
> Good lord this thread is getting nuts. I'm out for now....


Amen... as several um.... trolls... have now joined in... not speaking about anyone in particular or anything of course.


----------



## mouacyk

Quote:


> Originally Posted by *Silent Scone*
> 
> Well they didn't think it through, did they. I'm surprised nobody has posted the analogy of dual GPU cards being marketed as double their VRAM capacity. Quite similar, just because you and I know already because we're super smart, that this doesn't make it an "8GB".
> 
> Do it on a single GPU that's supposed to hit a reasonable price point and be the best option for buyers looking to hit the bang for buck, and not inform people dually. It's a funny world, that's all. I still think someone had to sit down and say, no, no details on this - list it as one would expect.


The dual-GPU analogy is inaccurate. Depending on the implementation of shared rendering, the setup can use up to the full 8GB. The common AFR approach does have a hard limit set by the minimum memory per physical GPU, but SFR and other clever approaches, which are less common, can use the full capacity. The latest Civ game uses SFR. It doesn't boost framerate much, but it boosts fluidity greatly.
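The AFR/SFR distinction above can be captured in a toy model. This is idealized on purpose: real SFR implementations still duplicate shared resources (textures, geometry), so the full sum is never actually reached in practice:

```python
# Toy comparison of effective VRAM for a multi-GPU card under the two
# rendering modes discussed above. Under AFR each GPU renders whole alternate
# frames, so every GPU must hold a full copy of the working set: effective
# capacity is the minimum per-GPU amount. Under an idealized SFR split each
# GPU holds only its share, so capacity can approach the sum. Real SFR
# duplicates shared assets, so actual gains fall well short of this ideal.

def effective_vram(per_gpu_gb, mode):
    if mode == "AFR":
        return min(per_gpu_gb)      # working set duplicated on every GPU
    if mode == "SFR":
        return sum(per_gpu_gb)      # idealized: no duplication at all
    raise ValueError(mode)

dual_card = [4, 4]  # an "8GB" dual-GPU card with 4GB per GPU
print(effective_vram(dual_card, "AFR"))  # 4
print(effective_vram(dual_card, "SFR"))  # 8
```

The gap between those two numbers is exactly why marketing a dual-GPU card by its total VRAM is contentious.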


----------



## lacrossewacker

Lol, good one from AMD - but that reference cooler...

Here's the deal: if you're affected by this oversight (or cover-up, whichever you prefer), exchange your GPU. If not, continue on with your day using one of the best GPU deals that's hit our market in a LONG time.

Titan: "har har har, that's too expensive"
970 for 1/3 the price: "oh gee, Titan performance"

You can beat your chest about it because it's the principle of the matter, but let's not act like we're all rocking 1600p screens needing to push 8xMSAA.


----------



## ZealotKi11er

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I don't know about that. Here is why, and maybe this is why I have the perspective I do;
> 
> Family of mine works for Intel, I have a close family member that is on the team that designed Sandy and Ivy, etc. After working on a project for years, with your face in technical data for years, you get burnt and exhausted. After the big launch party (at Intel) for Sandy and Ivy - people were just spent. They didn't want to look at reviews, because their view was fogged and fatigued from doing it already.
> 
> They were excited about the next thing, but no longer that thing. Excited they were done, that it was in market, that it was doing well. While I won't go so far as to say that they wouldn't have looked at the specs, I kind of am....
> 
> Really if you are looking at performance comparisons of the product you just created, you aren't going to look at the spec list; you made the thing. It is very plausible that the handful of people who knew about this architecture difference, something high level and well outside the scope of most, who MAY have looked at reviews just skipped the spec list. Again, because they made the card, they don't need to read the spec sheet, they just want to know how their creation stacks up against the competition.
> 
> What it boils down to, as it pertains to the 4 month window is...
> 
> 
> How many people at Nvidia actually knew about this architectural difference?
> How many of those people read post launch reviews of their creation?
> Out of those people, how many of them stopped to read the spec sheet?
> When was the last time you referenced your systems Device Manager to know the specs of what you built? I know I don't, nor does any other builder I know. Why? Because we built it, we don't need to read the specs, we wrote that list. Now don't flame me if somehow that gets printed wrong.


If you work for Nvidia, you know this stuff. You would spot this right away, unless the people who made the GPU were told to lie to PR and keep the secret from lower-level Nvidia employees. I am going to say this again: to hit the $330 price, Nvidia had to slow the card down because of the GTX 980. Also, 4GB just feels, looks, and sounds more powerful than 3.5GB.


----------



## Silent Scone

Quote:


> Originally Posted by *mouacyk*
> 
> The dual GPU analogy is inaccurate. Depending on the implementation of shared rendering, the setup can use up to the full 8GB. The common AFR approach does have the hard limitation of set by the min per physical GPU. SFR and other clever approaches, which are not as common can use the full capacity. The latest Civ game uses SFR. It doesn't boost framerate too much, but it boosts fluidity greatly.


Nobody uses asynchronous CrossFire because the performance deficit is terrible; ask any developer and they'll tell you alternate frame rendering is the way forward, except in the one example you've given me of a top-down TBS. That's like saying, well, the 512MB hanging off the shared L2 can be used sometimes, but it's not even a factor if you stay below the initial frame buffer. You're splitting just as many hairs.

In fact it's arguably much worse if you're a consumer looking at a dual-GPU card, since, let us say for argument's sake, in asynchronous CrossFire cases 90% of the time you cannot even access half of the frame buffer.


----------



## Mand12

Quote:


> Originally Posted by *Silent Scone*
> 
> Well they didn't think it through, did they.


Certainly not, but when faced with a choice between incompetent malice and plain incompetence, plain incompetence is far more likely. Malice requires forethought, and you can't get into a mess this idiotic if you're exercising forethought.

The benefits of the alleged conspiracy are trivial to the point of being utterly meaningless, and the risks enormous.


----------



## Vesku

Quote:


> Originally Posted by *Mand12*
> 
> No, that's not it. There is no benefit to Nvidia to intentionally providing the wrong information. Had they provided the right information back in September, nobody would have cared and none of this would now be happening. There is NO benefit to Nvidia to lie to us about this.


If they had given out the right L2 and ROP information it would be harder to blame a "miscommunication" between engineering and marketing for not revealing the 3.5GB+0.5GB memory pool setup. Even giving Nvidia the benefit of the doubt regarding the reviewer's guide, is it plausible that no one with knowledge of the actual 970 internals noticed the incorrect spec information for more than 4 months? That's the bit most people are having trouble believing.


----------



## Mand12

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If you work for Nvidia, you know this stuff. You would spot this right away, unless the people who made the GPU were told to lie to PR and keep the secret from lower-level Nvidia employees. I am going to say this again: to hit the $330 price, Nvidia had to slow the card down to protect the GTX 980. Also, 4GB just feels, looks, and sounds more powerful than 3.5GB.


Except these are large companies, and things aren't always communicated properly. Happens all the time - pick just about any game developer, and you'll be able to find instances where some rep made some comment that had to get walked back because he didn't get the memo from the developer who was the actual source. Internal communication failure is not the same as lying. There is NO benefit to keeping this secret. The reviews would have posted the same benchmarks, made the same comparisons, and made the same recommendations. The only thing that lying could possibly have caused is this sort of fiasco. Who in their right mind would invent a PR crisis for nothing?

And no, they didn't slow down the card because of GTX980. People are having trouble finding the edge cases where this issue actually turns into real-world performance loss. Right now, it has all the impact on your gaming experience of a typo, but because people go nuts about GPUs in Red vs Green pissing wars, it's suddenly a huge deal.

It's fashionable to be mad, but the madness really is disproportionate to any actual harm done. Especially since the people who seem to be the most mad don't even own the card.
Quote:


> Originally Posted by *Vesku*
> 
> If they had given out the right L2 and ROP information it would be harder to blame a "miscommunication" between engineering and marketing for not revealing the 3.5GB+0.5GB memory pool setup. Even giving Nvidia the benefit of the doubt regarding the reviewer's guide, is it plausible that no one with knowledge of the actual 970 internals noticed the incorrect spec information for more than 4 months? That's the bit most people are having trouble believing.


It's plausible that nobody with knowledge of the actual 970 internals actually bothered to look at the specs being posted, yes. They're busy working on other things, it's not their job to police the internet for things that they have every reason to believe were done correctly but might be wrong.


----------



## criminal

Quote:


> Originally Posted by *2010rig*
> 
> Ok, so, since they lied about the specs, how were people affected by this? Are 970's suddenly slower now? Or are they still *performing* as presented in reviews?
> 
> What's more important the specs that you thought you bought? or the performance that you're getting based on the reviews that you read?
> 
> I'm trying to understand why it bothers so many.
> 
> The only way I'd be pissed like some people are is if they sold the 980 with 970 specs. Meaning the 3 SM's disabled, while sending reviewers fully enabled cards.
> 
> Or if people got the cards and they weren't getting the same numbers as seen in reviews.
> 
> I'm of the opinion that if they listed the correct specs from the get go, the card would have sold just as well anyway. x70 cards almost always have less ROP's, and a smaller bus, that's nothing new. But we're talking about such a miniscule difference which doesn't effect the end performance promised, which is what matters.
> 
> With a $220 difference between the 2, the specs are understandable.


Well I am of the opinion that if the true specs had been listed the card would not have sold so well. See how I can do that too. And I am willing to bet that If this had been AMD instead of Nvidia, some of you in here would have a different opinion on the matter.


----------



## Vesku

Quote:


> Originally Posted by *Mand12*
> 
> It's plausible that nobody with knowledge of the actual 970 internals actually bothered to look at the specs being posted, yes. They're busy working on other things, it's not their job to police the internet for things that they have every reason to believe were done correctly but might be wrong.


You don't think some of those people would want to read the reviews of the product they worked hard on?


----------



## Rickles

I'd be happy with a voucher for an indie game or something... heck maybe even a T shirt?


----------



## sugalumps

Quote:


> Originally Posted by *Rickles*
> 
> I'd be happy with a voucher for an indie game or something... heck maybe even a T shirt?


Why? They suck for lying, but why do you feel the need to be given additional items for free?

Everyone wants something for nothing these days.


----------



## Xoriam

Quote:


> Originally Posted by *Rickles*
> 
> I'd be happy with a voucher for an indie game or something... heck maybe even a T shirt?


I wouldn't mind a t-shirt.
1 for each card purchased though.


----------



## Mand12

Quote:


> Originally Posted by *criminal*
> 
> Well I am of the opinion that if the true specs had been listed the card would not have sold so well. See how I can do that too. And I am willing to bet that If this had been AMD instead of Nvidia, some of you in here would have a different opinion on the matter.


You're of the opinion that people would have nitpicked the spec sheet and completely ignored the benchmark comparisons that would have been exactly the same?

Quote:


> Originally Posted by *Vesku*
> 
> You don't think some of those people would want to read the reviews of the product they worked hard on?


They probably read the review. They probably didn't proofread the spec sheet, since they had no reason to believe it would be wrong. In the future, they very likely will be confirming and reconfirming it.


----------



## dukeReinhardt

Quote:


> Originally Posted by *Mand12*
> 
> So you believe that they intentionally created a situation that would barely have budged reviews had they been correct, yet would blow up in their faces months later?


I think it's more important that they didn't own up to their mistake and act appropriately in order to properly correct it. It's risible to suggest that not one person either directly involved in the development, or related in some way to a person involved in the project would have raised the flag at Nvidia between the launch of the product and now. Online social networks are a thing now. Is it really possible that nobody at Nvidia was ever asked something as simple as, "hey, you work at Nvidia. Would you recommend a 4gb GTX 970?" on Facebook, or in real life? Would the entire company really have missed tech review sites whose article titles were things like "GTX 970 4GB Review"?

In response to what you wrote, I think it's more than plausible that Nvidia lied from the outset.

Initial sales of a product matter far more than a small-scale "scandal" like this, long after the product has already become successful and propagated Nvidia's brand. A large part of the GTX 970's draw is that it's meant to be nearly the same as the GTX 980 but cheaper, and it helps this image to make the specs seem similar.

This "scandal", on the other hand, doesn't look like it's going to do anything significant. For one, look at how many people on this forum are resorting to insulting people for caring about the issue. People here are actually calling other people petty for wanting to return their product, and also, albeit correctly, saying that the problem makes no difference to performance, while incorrectly implying that there is no problem. It doesn't really look like Nvidia's image has been tarnished much at all.

In terms of direct consequences (profit), only a small handful of enthusiasts are going to return their product, so there's not much of a negative impact there. Nvidia's getting away with what they've done. Regardless of whether this was a mistake or a lie, they're losing nothing for it. If that's the case, what's stopping them from doing it all the time? It's not impossible that this was a marketing ploy from the start, though I'm not saying that it necessarily was.


----------



## bambino167

Ok, I am glad they are looking into the matter and are offering refunds or exchanges, but here is my question: exchange for what?


----------



## sugalumps

Quote:


> Originally Posted by *bambino167*
> 
> Ok, I am glad they are looking into the matter and are offering refunds or exchanges, but here is my question: exchange for what?


Evga are doing step ups, if you can step up to a 980.


----------



## Mand12

Quote:


> Originally Posted by *dukeReinhardt*
> 
> I think it's more important that they didn't own up to their mistake and act appropriately in order to properly correct it.


How can you own up to a mistake you didn't know you made? Once it came out that it was, actually, a mistake, they've done a rather remarkable job as companies go of owning up to it.
Quote:


> Originally Posted by *dukeReinhardt*
> 
> Is it really possible that nobody at Nvidia was ever asked something as simple as, "hey, you work at Nvidia. Would you recommend a 4gb GTX 970?" on Facebook, or in real life? Would the entire company really have missed tech review sites whose article titles were things like "GTX 970 4GB Review"?


Because it really does have 4GB of memory in it, that's exactly what they would expect a headline to say.
Quote:


> Originally Posted by *bambino167*
> 
> Ok, I am glad they are looking into the matter and are offering refunds or exchanges, but here is my question: exchange for what?


Saw a report that Nvidia has convinced Newegg to start taking refunds for store credit, even if it's past Newegg's standard return window. Though what you would buy with your store credit, I'm not sure; you're not going to get the 970's performance *(which HAS NOT CHANGED with this)* for your $329 in anything other than a 970.


----------



## bambino167

Quote:


> Originally Posted by *sugalumps*
> 
> Evga are doing step ups, if you can step up to a 980.


So what happens if you have a G1 card?


----------



## Rickles

If AMDs 390s were hitting in the next few weeks this would be perfect timing for them, too bad for them I guess. I'm considering going up to a GTX 980.


----------



## criminal

Quote:


> Originally Posted by *Mand12*
> 
> You're of the opinion that people would have nitpicked the spec sheet and completely ignored the benchmark comparisons that would have been exactly the same?
> They probably read the review. They probably didn't proofread the spec sheet, since they had no reason to believe it would be wrong. In the future, they very likely will be confirming and reconfirming it.


Some people have already admitted they upgraded (sidegraded?) from a 290 4GB card or from a 780 3GB card. If, and I mean a big IF, the 970 had been listed with the 3.5GB + 0.5GB scenario and it was explained how it worked, they more than likely would not have made the move. That is all I am saying.

As far as the refund is concerned, Nvidia's original specs were misrepresented. That is enough to warrant some type of refund. Sorry if you think otherwise.


----------



## spacin9

Quote:


> Originally Posted by *bambino167*
> 
> Ok, I am glad they are looking into the matter and are offering refunds or exchanges, but here is my question: exchange for what?


EVGA might be doing exchanges or step-ups. Zotac isn't.


----------



## dukeReinhardt

Quote:


> Originally Posted by *Mand12*
> 
> How can you own up to a mistake you didn't know you made? Once it came out that it was, actually, a mistake, they've done a rather remarkable job as companies go of owning up to it.


A remarkable job? What's happening now is what's called damage control. If they did ANY less than what they're doing, there really would be serious consequences for their brand. You're proof of what I'm saying though - you're more than willing to easily forgive Nvidia. In fact, your post reads like praise - 'look at how exceptionally Nvidia is responding to this. Other companies aren't so good'.


----------



## Xoriam

Is there a price discount on the 980 step-up with EVGA? I currently have a GTX 970 in the RMA process for black-screening.


----------



## criminal

Quote:


> Originally Posted by *dukeReinhardt*
> 
> A remarkable job? What's happening now is what's called damage control. If they did ANY less than what they're doing, there really would be serious consequences for their brand. You're proof of what I'm saying though - you're more than willing to easily forgive Nvidia. In fact, your post reads like praise - 'look at how exceptionally Nvidia is responding to this. Other companies aren't so good'.


That is the way I read his posts too.
Quote:


> Originally Posted by *Xoriam*
> 
> Is there a price discount on the 980 step-up with EVGA? I currently have a GTX 970 in the RMA process for black-screening.


Why would there be a price discount? The 980 is still the same price it has always been.


----------



## pony-tail

I have not bought mine yet (I was about to, then read about this issue). Are they going to fix this, or is it a non-issue?
Or should I wait a couple of months for the new cards?


----------



## mouacyk

Quote:


> Originally Posted by *Silent Scone*
> 
> Nobody uses asynchronous CrossFire because the performance deficit is terrible; ask any developer and they'll tell you alternate frame rendering is the way forward, the one top-down TBS example you've given me aside. That's like saying the 512MB behind the narrow L2 path can be used sometimes, but it's not even a factor if you stay below the initial frame buffer. You're picking just as many hairs.
> 
> In fact it's arguably much worse if you're a consumer looking at a dual GPU card: in, let us say for argument's sake, the asynchronous CrossFire case, 90% of the time you cannot even access half of the frame buffer.


No splitting or hair-picking here. With multi-GPU, you are delivered the hardware (n × VRAM) that was marketed; it's up to the multi-GPU software (be it driver, app, or game engine) to make use of that hardware. The issue we're facing here is that a piece of hardware is missing that was supposed to be there, and all sorts of software promises were (and are being) made to make up for it.
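For what it's worth, Nvidia's statement boils down to a priority scheme: allocations land in the 3.5GB segment first and only spill into the 0.5GB segment once the fast one is full. A toy sketch of that idea (the class, names, and split logic are mine for illustration, not Nvidia's actual driver logic):

```python
# Toy model of a two-segment VRAM pool: the driver prefers the fast 3.5GB
# segment and only spills into the slow 0.5GB segment once the fast one is
# full. Names and behavior here are illustrative, not Nvidia's code.

FAST_MB = 3584   # 3.5GB "higher priority" segment
SLOW_MB = 512    # 0.5GB segment

class SegmentedPool:
    def __init__(self):
        self.fast_used = 0
        self.slow_used = 0

    def alloc(self, mb):
        """Place an allocation, preferring the fast segment."""
        take_fast = min(mb, FAST_MB - self.fast_used)
        take_slow = mb - take_fast
        if take_slow > SLOW_MB - self.slow_used:
            raise MemoryError("out of VRAM")
        self.fast_used += take_fast
        self.slow_used += take_slow
        return take_fast, take_slow  # how the request was split

pool = SegmentedPool()
print(pool.alloc(3000))   # (3000, 0): fits entirely in the fast segment
print(pool.alloc(800))    # (584, 216): the spill case people are debating
```

This is also why tools report 3.5GB in use on games that never spill: under this scheme the slow segment simply never gets touched.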


----------



## Mand12

Quote:


> Originally Posted by *dukeReinhardt*
> 
> A remarkable job? What's happening now is what's called damage control. If they did ANY less than what they're doing, there really would be serious consequences for their brand. You're proof of what I'm saying though - you're more than willing to easily forgive Nvidia. In fact, your post reads like praise - 'look at how exceptionally Nvidia is responding to this. Other companies aren't so good'.


Of course it's damage control. There's a lot of damage, and they're doing their best to control it. This is really, really bad for them - but people are being overly vindictive and petty about it. What's most telling is that the people who are screaming the loudest don't own the card, and fairly often don't even own any Nvidia card.

You think they should have caught the mistake sooner. Fine. But not having caught the mistake is not the same as lying.


----------



## Mand12

Quote:


> Originally Posted by *pony-tail*
> 
> I have not bought mine yet (I was about to, then read about this issue). Are they going to fix this, or is it a non-issue?
> Or should I wait a couple of months for the new cards?


It's a non-issue. Performance benchmarks are as good as they always have been, and the price is fantastic. The error is the equivalent of a typo, not a problem with the card.


----------



## Rickles

Quote:


> Originally Posted by *criminal*
> 
> Some people have already admitted they upgraded (sidegraded?) from a 290 4GB card or from a 780 3GB card. If, and I mean a big IF, the 970 had been listed with the 3.5GB + 0.5GB scenario and it was explained how it worked, they more than likely would not have made the move. That is all I am saying.
> 
> As far as the refund is concerned, Nvidia's original specs were misrepresented. That is enough to warrant some type of refund. Sorry if you think otherwise.


I returned a 780 Ti to save some money and for the extra VRAM... turns out I did still save the money, but the VRAM will probably bite me, especially if I ever want to go above 1080p.


----------



## Silent Scone

Quote:


> Originally Posted by *mouacyk*
> 
> No splitting or hair-picking here. With multi-GPU, you are delivered the hardware (n × VRAM) that was marketed; it's up to the multi-GPU software (be it driver, app, or game engine) to make use of that hardware. The issue we're facing here is that a piece of hardware is missing that was supposed to be there, and all sorts of software promises were (and are being) made to make up for it.


Where did you pull n × VRAM from? You just said yourself it's up to the software; that's what Nvidia are doing by placing the least important data inside the restricted 512MB segment.

I'm not arguing with you, it's just horses for courses. You're justifying the former but disputing the latter, lol.

We're all in the same industry-driven boat and along for the ride; all I am saying is this isn't the first of its kind, nor will it be the last.


----------



## Vesku

Quote:


> Originally Posted by *pony-tail*
> 
> I have not bought mine yet (I was about to, then read about this issue). Are they going to fix this, or is it a non-issue?
> Or should I wait a couple of months for the new cards?


Imo, think of it as GTX 970 3.5GB. It's still a good card for the price.


----------



## dukeReinhardt

Quote:


> Originally Posted by *Mand12*
> 
> Of course it's damage control. There's a lot of damage, and they're doing their best to control it. This is really, really bad for them - but people are being overly vindictive and petty about it. What's most telling is that the people who are screaming the loudest don't own the card, and fairly often don't even own any Nvidia card.
> 
> You think they should have caught the mistake sooner. Fine. But not having caught the mistake is not the same as lying.


Like I mentioned previously, I care much more about how people are responding to this issue than about Nvidia. It's really weird how much people are willing to go out of their way to defend a company that has no interest in their well-being.

Like I also mentioned, I think squabbling about AMD and Nvidia is preposterous and idiotic. The fact that I don't own an Nvidia card right now has no bearing at all on this discussion, and I don't appreciate the implication that I have some sort of personal bias. Again, like I said, I don't necessarily think they lied from the outset (though I've explained why I think it's plausible), but I think it's delusional to believe that not one person in a giant company managed to figure this issue out.


----------



## NuclearPeace

It's so annoying that everything on OCN has to be a conspiracy. This witch hunt is extremely infantile.


----------



## Mand12

Quote:


> Originally Posted by *dukeReinhardt*
> 
> Like I mentioned previously, I care much more about how people are responding to this issue than about Nvidia. It's really weird how much people are willing to go out of their way to defend a company that has no interest in their well-being.


It's more weird how much people are willing to go out of their way to attack a company that only makes computer hardware.


----------



## mouacyk

Quote:


> Originally Posted by *Silent Scone*
> 
> Where did you pull n × VRAM from? You just said yourself it's up to the software; that's what Nvidia are doing by placing the least important data inside the restricted 512MB segment.
> 
> I'm not arguing with you, it's just horses for courses. You're justifying the former but disputing the latter, lol.
> 
> We're all in the same industry-driven boat and along for the ride; all I am saying is this isn't the first of its kind, nor will it be the last.


While I take the time to nitpick your post, I don't think you are returning the favor. What do you make of this?
Quote:


> The issue we're facing here is that a piece of hardware is missing that was supposed to be there, and all sorts of software promises were (are being) made to make up for it.


----------



## HyperC

Quote:


> Originally Posted by *Mand12*
> 
> You're of the opinion that people would have nitpicked the spec sheet and completely ignored the benchmark comparisons that would have been exactly the same?
> They probably read the review. They probably didn't proofread the spec sheet, since they had no reason to believe it would be wrong. In the future, they very likely will be confirming and reconfirming it.


If nobody proofread it, then we should be seeing new job postings at Nvidia. But I call BS: they probably thought they would get away with it. Every business rereads all of its marketing, that's their job. Granted, some sales errors happen, mispriced items and such, but to allow many months to go by without anyone noticing until the customers noticed is a straight-out lie. And people need to stop bashing others for feeling ripped off; it is their right to feel how they want, regardless of the performance difference. They bought something it wasn't, plain and simple. Have a good day!


----------



## dukeReinhardt

Quote:


> Originally Posted by *Kand*
> 
> Perspective.
> 
> Do you own a 970?
> 
> No?
> Get out.


Oh no, the thought police are here. I'm sorry I have an opinion officer, I'll just go and chew on grass like everyone else.


----------



## 2010rig

Quote:


> Originally Posted by *Final8ty*
> 
> 
> 
> Spoiler: Warning: Spoiler!


It's hilarious, I noticed that afterwards.
Quote:


> Originally Posted by *criminal*
> 
> Well I am of the opinion that if the true specs had been listed the card would not have sold so well. See how I can do that too. And I am willing to bet that If this had been AMD instead of Nvidia, some of you in here would have a different opinion on the matter.


One of the few times I was up in arms with AMD's lies....


Spoiler: Dont Click Here to See Which of AMDs Lies I'm Referring to.




During the Bulldozer *pre-launch* hype, everything was pointing in different directions than what we were being told, and when our pal JF-AMD got called out, these were his typical responses.
Quote:


> Originally Posted by *JF-AMD*
> *IPC & Single threaded performance will be higher. Anyone saying otherwise is uninformed, or has an agenda.*
> I am not saying that everything is fake, the Intel fanboys are touting the numbers that make it look bad.
> Why is everyone obsessing about benchmarks, I think people just like to argue.
> Is somebody being paid by intel to continually post these statements? I have never lied
> The crazy rumors end up with corporate customers. I have to debunk them. There is a selfish self-interest here.
> 
> Who can forget the Bulldozer Pre-Launch FAQ.
> http://www.overclock.net/t/1107646/bulldozer-pre-launch-faq/


He maintained that stance, and ridiculed people who disagreed with him, up until BD launched.



Anyway, If this specs "debacle" had happened to AMD, I guarantee you I'd have the same stance.

I really don't see the big deal, because *the performance in reviews, is the same performance people are getting IRL.*

I love how you ignored all my points though, did they make too much sense for ya?


----------



## sugarhell

So now we know why the 970 is so cheap compared to the 980: make the 970 cheap and powerful, and let people believe it only lacks some CUDA cores compared to the 980, bumping demand. It's a lie. And it's a shame, because it's a poor lie; the 970 is a good card, and it didn't need any lie to sell.

And don't tell me that Nvidia didn't know how many ROPs the card has. Anyone remember the 570? Cut some cores, lower the ROPs, bus, and memory.


----------



## Heavy MG

Quote:


> Originally Posted by *2010rig*
> 
> It's hilarious, I noticed that afterwards.
> 
> 
> 
> 
> 
> 
> 
> 
> One of the few times I was up in arms with AMD's lies....
> 
> 
> Spoiler: Dont Click Here to See Which of AMDs Lies I'm Referring to.
> 
> 
> 
> 
> During the Bulldozer *pre-launch* hype, everything was pointing in different directions than what we were being told, and when our pal JF-AMD got called out, these were his typical responses.
> He maintained that stance, and ridiculed people who disagreed with him, up until BD launched.
> 
> 
> Anyway, If this specs "debacle" had happened to AMD, I guarantee you I'd have the same stance.
> 
> 
> 
> 
> 
> 
> 
> I really don't see the big deal, because *the performance in reviews, is the same performance people are getting IRL.*
> I love how you ignored all my points though, did they make too much sense for ya?
> 
> 
> 
> 
> 
> 
> 


Bashing AMD to defend Nvidia?
At least AMD sells GPUs that don't have faked specs.
Quote:


> Originally Posted by *Rickles*
> 
> I'd be happy with a voucher for an indie game or something... heck maybe even a T shirt?


I couldn't say no to an Nvidia t-shirt especially, lol. Then again, I don't want stuff for free; I'd rather be offered a "step up" from a 970, even though I have a Gigabyte G1 card.
Quote:


> Originally Posted by *criminal*
> 
> Well I am of the opinion that if the true specs had been listed the card would not have sold so well. See how I can do that too. And I am willing to bet that If this had been AMD instead of Nvidia, some of you in here would have a different opinion on the matter.


I agree, there is a lot of bias in this thread; since it's Nvidia, most people are acting as if nothing is wrong, lol.
Quote:


> Originally Posted by *Xoriam*
> 
> Is there a Price discount on 980 stepup with EVGA? I currently have a card in the RMA process for black screening GTX 970.


I've never had to use Gigabyte's RMA service, but I had my screen go black once while on the desktop. It hasn't happened again since I installed the latest drivers and BIOS, but I have no idea if the card is defective.


----------



## Silent Scone

Quote:


> Originally Posted by *mouacyk*
> 
> While I take the time to nitpick your post, I don't think you are returning the favor. What do you make of this?


What I sense is entitlement over a few KB of cache and quoted performance, which is why I used the dual-GPU VRAM analogy: some may feel entitled to the full frame buffer when they cannot use half of it.

Like I said.


----------



## Mand12

Quote:


> Originally Posted by *HyperC*
> 
> If nobody proofread it, then we should be seeing new job postings at Nvidia. But I call BS: they probably thought they would get away with it. Every business rereads all of its marketing, that's their job. Granted, some sales errors happen, mispriced items and such, but to allow many months to go by without anyone noticing until the customers noticed is a straight-out lie. And people need to stop bashing others for feeling ripped off; it is their right to feel how they want, regardless of the performance difference. They bought something it wasn't, plain and simple. Have a good day!


*sigh*

Obviously, someone was supposed to proofread it when it was transitioned from the technical team to the marketing team. That process certainly failed. But it's not the technical guys' job to make sure that the review sites have the correct copy of the spec sheet, that's the marketing team. The tech team had every reason to believe that a site like PCPer would publish the specs as they were received from the marketing team, and they had every reason to believe that the marketing team had the correct specs to provide. Yes, a mistake happened, but the "OH MY GOD HOW COULD THEY POSSIBLY HAVE MISSED IT" hyperbole makes me wonder whether anyone saying it has ever been involved in a situation where someone made an error.

This is not a lie. And you don't even own the card, you're not affected. Why so mad?
Quote:


> Originally Posted by *Heavy MG*
> 
> I agree,there is a lot of bias in this thread,since it's Nvidia most people are acting as if nothing is wrong,lol.


No, it's not "nothing is wrong." And it's not because it's Nvidia. I for one am acting like "it's wrong, but it's also not a big deal." Because it's not, actually, a big deal. The performance is still the same as it was when they were reviewed. The benchmarks are still valid. The comparisons to other cards are still valid. The circumstances where this leads to a problem are still rare and difficult to produce.

If anything is biased, it's reality. But it's much more fun to raise the torches and pitchforks, especially if you didn't even buy the card in question, so that's where the agitation is coming from.


----------



## mouacyk

Quote:


> Originally Posted by *sugarhell*
> 
> And dont tell me that nvidia didint know a bout how many ROPs the card has. Anyone remembers the 570? Cut come cores lower rops,bus and memory


Perfectly well. I fried a reference one, running it at 950MHz for a half hour or so in the Witcher 2. I found out afterwards that it did not have sufficient VRM protection.


----------



## spacin9

Quote:


> Originally Posted by *2010rig*
> 
> One of the few times I was up in arms with AMD's lies....
> 
> 
> Spoiler: Dont Click Here to See Which of AMDs Lies I'm Referring to.
> 
> 
> 
> 
> During the Bulldozer *pre-launch* hype, everything was pointing in different directions than what we were being told, and when our pal JF-AMD got called out, these were his typical responses.
> He maintained that stance, and ridiculed people who disagreed with him, up until BD launched.
> 
> 
> 
> Anyway, If this specs "debacle" had happened to AMD, I guarantee you I'd have the same stance.
> 
> 
> 
> 
> 
> 
> 
> I really don't see the big deal, because *the performance in reviews, is the same performance people are getting IRL.*
> 
> I love how you ignored all my points though, did they make too much sense for ya?
> 
> 
> 
> 
> 
> 
> 


That's interesting... for a good month before release, the leaks pretty much said Bulldozer wasn't going to be good, whereas the leaks and reviews for the 970 were just about spot on. Except for the 3.5GB part, which no one reported, even after the fact.


----------



## dukeReinhardt

Quote:


> Originally Posted by *Mand12*
> 
> It's more weird how much people are willing to go out of their way to attack a company that only makes computer hardware.


How am I going out of my way to attack Nvidia? You're only seeing my posts in that light because you've got a stance that you're not willing to budge from. I've done nothing but state that companies can and will be nefarious, and it's plausible that Nvidia lied from the outset, but not necessary. Is that an attack? I've also outlined why it's implausible that Nvidia missed their mistake. What part of what I said is anything but reasonable?


----------



## criminal

Quote:


> Originally Posted by *2010rig*
> 
> It's hilarious, I noticed that afterwards.
> 
> 
> 
> 
> 
> 
> 
> 
> One of the few times I was up in arms with AMD's lies....
> 
> 
> Spoiler: Dont Click Here to See Which of AMDs Lies I'm Referring to.
> 
> 
> 
> 
> During the Bulldozer *pre-launch* hype, everything was pointing in different directions than what we were being told, and when our pal JF-AMD got called out, these were his typical responses.
> He maintained that stance, and ridiculed people who disagreed with him, up until BD launched.
> 
> 
> 
> Anyway, If this specs "debacle" had happened to AMD, I guarantee you I'd have the same stance.
> 
> 
> 
> 
> 
> 
> 
> I really don't see the big deal, because *the performance in reviews, is the same performance people are getting IRL.*
> 
> I love how you ignored all my points though, did they make too much sense for ya?
> 
> 
> 
> 
> 
> 
> 
> .


Didn't ignore them; I have already covered all of that. 970 performance = same. 970 specs = we know the truth now. That alone justifies something for users who bought the card, if only on the grounds that Nvidia hopefully won't fudge specs again.


----------



## mouacyk

Quote:


> Originally Posted by *Silent Scone*
> 
> I feel a sense of entitlement over a few KB of cache and quoted performance, which is why I used the VRAM dual GPU analogy, because some may feel they're entitled to the full frame buffer, when they cannot use half of it.
> 
> Like I said


No, I had to look that up: "a racehorse performs best on a racecourse to which it is specifically suited".

You do realize people use this card for non-gaming purposes? In any case, this isn't about entitlement over a few KB of cache, and there certainly was no quoted performance from NVidia or any review sites. It's 256KB of L2 cache and a whole bridge that went out, forcing data to be re-routed through another path, and none of this was made clear to inform purchase decisions.

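For what it's worth, the arithmetic behind that lost bridge is simple enough to sketch. This is my own back-of-the-envelope estimate built from the advertised 224 GB/sec figure and NVIDIA's 3.5GB/0.5GB statement; the per-segment bandwidth numbers are not official:

```python
# GM204 pairs eight 32-bit memory controllers with 512MB of GDDR5 each.
# On the 970 one L2/crossbar port is disabled, so seven controllers have
# first-class access and the eighth is reached through a neighbour's port.
GB_PER_CONTROLLER = 0.5
PEAK_GB_S = 7.0 * 256 / 8      # 7 Gbps pins * 256-bit bus = 224 GB/s

fast = 7 * GB_PER_CONTROLLER   # fast segment: 3.5 GB
slow = 1 * GB_PER_CONTROLLER   # slow segment: 0.5 GB
fast_bw = PEAK_GB_S * 7 / 8    # ~196 GB/s when only the fast pool is hit
slow_bw = PEAK_GB_S * 1 / 8    # ~28 GB/s ceiling for the upper 0.5 GB

print(fast, slow)              # 3.5 0.5
print(fast_bw, slow_bw)        # 196.0 28.0
```

Which is exactly why tools report 3.5GB in use: the driver fills the fast pool first.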

----------



## Mand12

Quote:


> Originally Posted by *criminal*
> 
> Didn't ignore them, I have already covered all of that. 970 Performance = same. 970 Specs = We know the truth now. That alone justifies something for users who bought a card. If only on the grounds that Nvidia hopefully won't fudge specs again.


Or, you're entitled to nothing because the money you paid for what you thought you were getting is money you paid for what you actually got. The performance didn't change from the reviews to retail. The benchmark results are valid. The comparisons are valid. They shouldn't have gotten the specs wrong, but had it been just a typo instead of an internal communication failure would you still be so mad?

Don't know why anyone should listen to either one of us as far as what people who bought it should get though, since neither one of us bought one.


----------



## Silent Scone

Quote:


> Originally Posted by *mouacyk*
> 
> No, I had to look that up: "racehorse performs best on a racecourse to which it is specifically suited".
> 
> You do realize people use this card for non-gaming purposes? In any case, this isn't about entitlement over a few KB of cache and there certainly was no quoted performance from NVidia or any review sites. It's 256KB of L2 cache, and a whole bridge that went out, having to re-route data through another path and non of this was made clear as a purchase decisions.


There are a few SMs missing from the full chip as well; is that on the box too?

/grey areas lol

Point is, this isn't really anything new, just in the manner that it's happened. If Nvidia are letting people DSR / RMA their cards for full refunds over it, then as a 970 owner all one need do is take them up on it and make Nvidia do what they should have done in the first place.

Which is do it properly and not skimp on crudely cut-down shader modules (and/or L2 cache and ROPs)









I kid. But seriously do it right or go home.


----------



## criminal

Quote:


> Originally Posted by *Mand12*
> 
> Or, you're entitled to nothing because the money you paid for what you thought you were getting is money you paid for what you actually got. The performance didn't change from the reviews to retail. The benchmark results are valid. The comparisons are valid. They shouldn't have gotten the specs wrong, but had it been just a typo instead of an internal communication failure would you still be so mad?
> 
> Don't know why anyone should listen to either one of us as far as what people who bought it should get, since neither one of us bought one.


I believe 100% it was not a mistake. Not at the level Nvidia is at. Maybe mom and pop shop down the street, but not Nvidia. But had it been an honest mistake, they would still have to come clean in the same manner. Nothing would be different I am afraid.


----------



## jcde7ago

It's been a while since I've seen OCN run so rampant on something...









That said, all this revelation does is lower the initially perceived 'futureproofing' of the 970. No doubt the slower ~500MB of VRAM has a performance impact, but it's not erasing the overall performance that people bought the cards for after seeing a ton of benchmarks. The 970 is still arguably the best bang-for-the-buck card when power consumption and heat output are taken into account.

Keep the 970 if you have one and are gaming on a single 1080p/1440p monitor; it's still going to destroy the vast majority of games out there that aren't named Shadow of Mordor, or Skyrim with 2GB's worth of VRAM in mods. If you have a 4K or triple-monitor setup, either step up to the 980s or grab some 290Xs, since you're likely to feel the impact a lot more (especially with 970s in SLI, which really just look like a poor choice now for 4K/triple-monitor). I'd have held on to my G1 970s if I weren't gaming at 7680x1440... but as it stands, I made the move to Sapphire 290X 8GB Vapor-Xs in CrossFire, and honestly I have zero regrets. It definitely helped that Amazon took the 970s back no-questions-asked even after 70 days of ownership.









Bottom line is, Nvidia should never have let it come to this before disclosing that the specs of the card weren't accurate as sold, but now that it's out in the open, it's not like everyone's 970s got magically gimped. I am on the side that those who want a refund should actively seek and receive one, but this isn't worth crucifying Nvidia over, either. Heck, if 8GB (or 7GB, I suppose, lol) 970s were available, I'd probably have just gotten those instead of the 8GB 290Xs... they'd likely have been cheaper and a lot more power efficient. The Sapphire 8GB 290X Vapor-Xs are no slouches, though... in fact, they're pretty darn impressive.


----------



## Final8ty

Quote:


> Originally Posted by *2010rig*
> 
> It's hilarious, I noticed that afterwards.
> 
> 
> 
> 
> 
> 
> 
> 
> One of the few times I was up in arms with AMD's lies....
> 
> 
> Spoiler: Dont Click Here to See Which of AMDs Lies I'm Referring to.
> 
> 
> 
> 
> During the Bulldozer *pre-launch* hype, everything was pointing in different directions than what we were being told, and when our pal JF-AMD got called out, these were his typical responses.
> He maintained that stance, and ridiculed people who disagreed with him, up until BD launched.
> 
> 
> 
> Anyway, If this specs "debacle" had happened to AMD, I guarantee you I'd have the same stance.
> 
> 
> 
> 
> 
> 
> 
> I really don't see the big deal, because *the performance in reviews, is the same performance people are getting IRL.*
> 
> I love how you ignored all my points though, did they make too much sense for ya?
> 
> 
> 
> 
> 
> 
> 
> .


Two wrongs don't make a right, and AMD got bashed for it from day one, so it's not like Bulldozer got by unbashed. So why shouldn't NV be bashed for this?
AMD got bashed, and rightly so; NV is getting bashed for this, and rightly so.


----------



## Mand12

I must admit I'm amazed that the response has been overwhelmingly "ignore the real-world performance benchmarks, make your buying decisions based on the spec sheet." Thought OCN knew better...


----------



## Mand12

Quote:


> Originally Posted by *criminal*
> 
> I believe 100% it was not a mistake. Not at the level Nvidia is at.


Then you're a fool. Nvidia is not that stupid. From AnandTech's article on it:
Quote:


> Now as NVIDIA is in full damage control mode at this point, consideration must be given as to whether NVIDIA's story is at all true; NVIDIA would hardly be the first company to lie when painted into a corner by controversy. With that in mind, given the story that NVIDIA has provided, do we believe them? In short, yes we do.
> 
> To be blunt, if this was intentional then this would be an incredibly stupid plan, and NVIDIA as a company has not shown themselves to be that dumb. NVIDIA gains nothing by publishing an initially incorrect ROP count for the GTX 970, and if this information had been properly presented in the first place it would have been a footnote in an article extoling the virtues of the GTX 970, rather than the centerpiece of a full-on front page exposé. Furthermore if not by this memory allocation issues then other factors would have ultimately brought these incorrect specifications to light, so NVIDIA would have never been able to keep it under wraps for long if it was part of an intentional deception. Ultimately only NVIDIA can know the complete truth, but given what we've been presented we have no reason to doubt NVIDIA's story.


----------



## mouacyk

Quote:


> Originally Posted by *Silent Scone*
> 
> There's a few SMs missing as well from the full chip is that on the box as well?
> 
> /grey areas lol


I don't want to clutter up this thread any more than you do. That part has been made perfectly clear by NVidia, as can be correlated with the diagrams and the marketing material posted here:
http://www.overclock.net/t/1537725/pcper-nvidia-responds-to-gtx-970-3-5gb-memory-issue/1280#post_23467743
Quote:


> Originally Posted by *Mand12*
> 
> I must admit I'm amazed that the response has been overwhelmingly "ignore the real-world performance benchmarks, make your buying decisions based on the spec sheet." Thought OCN knew better...


Is having both wrong?

EDIT: As an addendum... when people here ask for help, the first questions are usually: What specs are you running? Please fill out your rig specs. Hmm, wonder why that is.


----------



## mkclan

Quote:


> Originally Posted by *Mand12*
> 
> Then you're a fool. Nvidia is not that stupid.


So naive, almost like my six-year-old son.
Nvidia never lied, it's just a mistake.


----------



## Mand12

Quote:


> Originally Posted by *mkclan*
> 
> So naive, almost like my six-year old son.
> Nvidia never lied


Lying requires intent. This is a really, really stupid lie to make. If they were going to lie, it wouldn't have been this.


----------



## dukeReinhardt

Quote:


> Originally Posted by *Mand12*
> 
> I must admit I'm amazed that the response has been overwhelmingly "ignore the real-world performance benchmarks, make your buying decisions based on the spec sheet." Thought OCN knew better...


Non sequitur? Nobody is saying that at all. You should obviously make buying decisions based on performance. What is being said is that the difference between what was sold and what was marketed isn't meaningless. It's also completely justified to return the product should you wish to, because who knows how people want to use their card? Some people might want that extra 500MB of VRAM down the line.


----------



## sugarhell

Quote:


> Originally Posted by *Mand12*
> 
> Then you're a fool. Nvidia is not that stupid.


Of course they are not stupid. They are clever. This is a marketing trick. That has always been my opinion.

Imagine if it didn't have 4GB of VRAM and 64 ROPs to match the 290. There are people who buy based only on specs, like with the GT 630 2GB or 4GB. You get the same specs as a 290 for less money (at release, I know) and lower power consumption. It gives you the idea that you're getting more for the same money.


----------



## mouacyk

Quote:


> Originally Posted by *dukeReinhardt*
> 
> Non sequitur? Nobody is saying that at all? You should obviously make buying decisions based on performance. What is being said is that the difference between what was sold and what was marketed isn't meaningless. It's also completely justified to return the product should you wish to, because who knows how people want to use their card? Some people might want that extra 500mb vRAM down the line.


Or today in a compute or rendering workload.


----------



## Silent Scone

Quote:


> Originally Posted by *mouacyk*
> 
> I don't want to clutter up this thread anymore than you do. That part has been made perfectly clear by NVidia as can been correlated by the diagrams and the marketing material posted here:
> http://www.overclock.net/t/1537725/pcper-nvidia-responds-to-gtx-970-3-5gb-memory-issue/1280#post_23467743


I think you'll find they can legally do that, which is what's laughable; it's not clogging up the thread at all, it's a factual legality that is getting overlooked. We're buying a cheaper card, so we should know we're getting less performance. How that performance is cut down is up to NVIDIA, and although there is a penalty for that 0.5GB, it is still there and can be used. The card does have the same memory subsystem bar small changes, which in hindsight they've admitted should have been made clear, albeit mainly because of how people were seeing reported usage in third-party applications.

The point I'm trying to get across to you is that this is nothing new. These things chop and change, and not everything gets disclosed at all times. And yes, it is still wrong when it happens, this being no exception.


----------



## Bit_reaper

Quote:


> Originally Posted by *Silent Scone*
> 
> There's a few SMs missing as well from the full chip is that on the box as well?
> 
> /grey areas lol
> 
> Point is, this isn't really anything new, just in the manner that it's happened. If Nvidia are letting people DSR / RMA their cards for full refunds over it then in the situation of being a 970 owner - all one need do is comply with this and do what they should of done in the first place.
> 
> Which is do it properly and not skimp on crudely torn down shader models (and/or L2 cache ROP)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I kid. But seriously do it right or go home.


I don't know if it was on the actual box, as I haven't read one thoroughly enough, but the disabled SM does show up in the CUDA core count on the official GeForce spec sheet. (SMs house the CUDA cores; fewer SMs means fewer CUDA cores.)

GTX 980 Engine Specs:
*CUDA Cores: 2048*
Base Clock: 1126 MHz
Boost Clock: 1216 MHz
Texture Fill Rate: 144 GigaTexels/sec

GTX 980 Memory Specs:
Memory Clock: 7.0 Gbps
Standard Memory Config: 4 GB
Memory Interface: GDDR5
Memory Interface Width: 256-bit
Memory Bandwidth: 224 GB/sec

GTX 970 Engine Specs:
*CUDA Cores: 1664*
Base Clock: 1050 MHz
Boost Clock: 1178 MHz
Texture Fill Rate: 109 GigaTexels/sec

GTX 970 Memory Specs:
Memory Clock: 7.0 Gbps
Standard Memory Config: 4 GB
Memory Interface: GDDR5
Memory Interface Width: 256-bit
Memory Bandwidth: 224 GB/sec
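The cut SMs and the listed core counts tie together: Maxwell puts 128 CUDA cores in each SM, so the spec-sheet numbers fall straight out of the enabled-SM counts (16 for the 980, 13 for the 970). A quick sanity check:

```python
# Maxwell ships 128 CUDA cores per SM (SMM), so the advertised core
# counts follow directly from the number of enabled SMs.
CORES_PER_SM = 128

def cuda_cores(enabled_sms: int) -> int:
    return enabled_sms * CORES_PER_SM

print(cuda_cores(16))  # GTX 980: 2048
print(cuda_cores(13))  # GTX 970: 1664
```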


----------



## DzillaXx

Quote:


> Originally Posted by *2010rig*
> 
> It's hilarious, I noticed that afterwards.
> 
> 
> 
> 
> 
> 
> 
> 
> One of the few times I was up in arms with AMD's lies....
> 
> 
> Spoiler: Dont Click Here to See Which of AMDs Lies I'm Referring to.
> 
> 
> 
> 
> During the Bulldozer *pre-launch* hype, everything was pointing in different directions than what we were being told, and when our pal JF-AMD got called out, these were his typical responses.
> He maintained that stance, and ridiculed people who disagreed with him, up until BD launched.
> 
> 
> 
> Anyway, If this specs "debacle" had happened to AMD, I guarantee you I'd have the same stance.
> 
> 
> 
> 
> 
> 
> 
> I really don't see the big deal, because *the performance in reviews, is the same performance people are getting IRL.*
> 
> I love how you ignored all my points though, did they make too much sense for ya?
> 
> 
> 
> 
> 
> 
> 
> .


I know you're a huge advocate for the Green side, maybe a bit too much at times, but how can you not be mad at this?

Nvidia clearly lied about the number of ROPs the card has. This is not some simple, easily mistaken fact, and more than likely they knew about it.

I for one would feel cheated to find out the card I bought has fewer ROPs than I was originally sold on. Sure, the benchmarks for the card are solid, but facts are still facts. The card can't go past 3.5GB of VRAM if you care about smoothness.

If someone wants to toss their 970 back in the box and get something else, good for them. Honestly, more people should be doing it, just to prove a point. Maybe even pick up one of AMD's offerings; I have the 290 and couldn't be happier. Great card, and it keeps your room warmer in the cold winter months. Though IMO the hassle of sending back a video card can be a drag. Does the GTX 970 perform well? Yes. Would a 290/290X or GTX 980 make you happier? Perhaps. Should anyone tell you not to return your card just to make a point? No.


----------



## Mand12

Quote:


> Originally Posted by *sugarhell*
> 
> Ofc they are not stupid. They are clever. This is a marketing trick. This is my opinion always.
> 
> Imagine you dont have 4gb vram and 64 ROPs compared to the 290. There are people that buy only just with the specs. Like the gt 630 2gb or 4gb. You get the same spec as a 290 but with less money (release i know) and lower consumption. It gives you the idea that you get more with the same money.


A marketing trick that doesn't change the benchmarks, barely moves perceptions, and is GUARANTEED to blow up into a major PR nightmare when it inevitably comes to light? That's some trick. You should be wanting Nvidia to do this, given your past history. Makes it easier to root for Red.


----------



## RagingCain

Quote:


> Originally Posted by *criminal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mand12*
> 
> Or, you're entitled to nothing because the money you paid for what you thought you were getting is money you paid for what you actually got. The performance didn't change from the reviews to retail. The benchmark results are valid. The comparisons are valid. They shouldn't have gotten the specs wrong, but had it been just a typo instead of an internal communication failure would you still be so mad?
> 
> Don't know why anyone should listen to either one of us as far as what people who bought it should get, since neither one of us bought one.
> 
> 
> 
> I believe 100% it was not a mistake. Not at the level Nvidia is at. Maybe mom and pop shop down the street, but not Nvidia. But had it been an honest mistake, they would still have to come clean in the same manner. Nothing would be different I am afraid.

I believe the card would have sold just as well with the correct specs, no better, no worse (% wise). That leads me to think there was no real point in them lying.

It would have been another cut down version of their big boy card, business as usual.


----------



## Mand12

Quote:


> Originally Posted by *DzillaXx*
> 
> I know you huge advocate for the Green side, maybe a bit too much at time, but how can you not be mad at this?
> 
> Nvidia clearly lied about the amount of ROP's the card has. This is not some simple easily mistaken fact, and more than likely they knew about it.


No, they did not clearly lie about it.

They have no reason to lie about it. There is no benefit to lying about it. There's a massive risk to lying about it. If they were lying about it, that would require an intent to deceive in order to get higher sales. They were never going to increase sales by lying, and guaranteed to lose sales by lying.

But, we should totally believe you, because you don't like Nvidia, rather than the reality of what actually happened and the actual impact.

I still find it amazing that it's the hardcore AMD owners who are the most upset about it.


----------



## sugarhell

Quote:


> Originally Posted by *Mand12*
> 
> A marketing trick that doesn't change the benchmarks, barely moves perceptions, and is GUARANTEED to blow up into a major PR nightmare when it inevitably comes to light? That's some trick. You should be wanting Nvidia to do this, given your past history. Makes it easier to root for Red.


Eh. I never said anything about the performance. And you forget an important aspect: we found out about all this by accident. The performance and the benchmarks stay as they are, though. If I had a 970 I would probably keep it. Having the same specs as the product you target is an important aspect. The 570 was cut down like this too, but they reported the correct specs because they didn't have competition. It's my opinion, and I was hoping for an actual conversation, but instead you called me out.


----------



## criminal

Quote:


> Originally Posted by *Mand12*
> 
> Then you're a fool. Nvidia is not that stupid. From AnandTech's article on it:


I am no fool. You are just that gullible.








Quote:


> Originally Posted by *sugarhell*
> 
> Ofc they are not stupid. They are clever. This is a marketing trick. This is my opinion always.
> 
> Imagine you dont have 4gb vram and 64 ROPs compared to the 290. There are people that buy only just with the specs. Like the gt 630 2gb or 4gb. You get the same spec as a 290 but with less money (release i know) and lower consumption. It gives you the idea that you get more with the same money.


Thank you


----------



## Xoriam

Quote:


> Originally Posted by *Mand12*
> 
> No, they did not clearly lie about it.


Ok, defending the product etc. is one thing.

But why do you keep insisting that they ABSOLUTELY did not lie?
I really like Nvidia. A LOT.

But you have no way of knowing whether they lied on purpose or not, so please stop saying these sorts of things without proof.

You know, they could even have done this on purpose so they could fire some marketing people they intended to fire anyway.
(Just a theoretical situation.)


----------



## DzillaXx

Quote:


> Originally Posted by *mkclan*
> 
> So naive, almost like my six-year old son.
> Nvidia never lied, it's just a mistake.


You're kidding, right?

You really believe that in an entire company full of people, not one person came forward or had knowledge of it? Get real, they knew. They were able to put out a statement about it fast enough.

They sold the card with falsely inflated specs to make it look better on paper. Considering the 970 is only slightly faster than a 290, Nvidia needed an incentive to push the card on people.

If people had known about the cut-down cache, ROPs, and memory pool, they may not have even considered the 970.

The real lie is the crap excuse we are getting now.


----------



## Silent Scone

If we're being forthright actually, do any 970 owners feel slightly stupid for being so happy until this came to light? lol.

Not sure how many people would have had the initiative to look at the frame times at the exact moment they exceeded 3.5GB.

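For anyone who does want to check after the fact, the idea is simple: log VRAM usage alongside frame times and flag the frames captured above the 3.5GB mark. A toy sketch, with made-up sample numbers, just to show the filter:

```python
# Flag frames that were rendered while VRAM use was in the slow segment.
# The (vram_mb, frame_ms) samples below are invented for illustration.
SLOW_THRESHOLD_MB = 3584  # 3.5 GB

samples = [(3200, 16.7), (3400, 16.9), (3700, 16.8), (3900, 41.2)]
suspect = [(mb, ms) for mb, ms in samples if mb > SLOW_THRESHOLD_MB]
print(suspect)  # [(3700, 16.8), (3900, 41.2)]
```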

----------



## Mand12

Quote:


> Originally Posted by *Xoriam*
> 
> Ok defending the product etc is one thing,
> 
> But why do you keep going on saying that they ABSOLUTLY did not lie?
> I really like Nvidia, ALOT.
> 
> But you have no way of know if they lied on purpose or not, so please stop saying these sort of things without proof.


I'm not saying they ABSOLUTELY did not lie. I am saying that they did not ABSOLUTELY lie, which is what people are claiming.

You have no way of knowing whether they lied *at all,* let alone whether it was on purpose or not, so please stop saying these sorts of things without proof.

I have said, several times, that it makes no sense for them to lie about this. They didn't get any benefit. They did open themselves up to an enormous risk. Usually, people who lie tend to do it for personal gain - that's the accusation that's been made here. A marketing trick. Inflating the product. Making it look better, so it sells more.

Except that isn't the case. Had the spec sheet been correct, it would have been glossed over in the review as "eh, it's a not-fully-enabled GM204, exactly what we expect a 970 to be" and they'd have gone on to the benchmarks, which were at the time, still are to this day, and will be for the future valid and correct and legitimate comparisons.

There is no sales boost, there is no gain, there is just the risk of enormous loss. And Team Red comes out and INSISTS that it must have been deliberate and a conspiracy and all these horrible things.

None of it has any basis in reality.


----------



## mkclan

Quote:


> Originally Posted by *DzillaXx*
> 
> You kidding right?
> 
> You really believe in a entire company full of people, not one came forward and said something or had knowledge of it? Get real, they knew. They were able to put out a statement fast enough about it.
> 
> They sold the card with false higher specs to make it look better on paper. Considering 970 is slightly faster than a 290, Nvidia needed incentive to push the card on people.
> 
> If people knew about the cut down Cache, ROPs, Memory Pool. Then people may not have even considered the 970.
> 
> The real lie is the crap excuse we are getting now.


We need a sarcasm emote.


----------



## mouacyk

Quote:


> Originally Posted by *Silent Scone*
> 
> I think you'll find legally they can do that legitimately, this is what's laughable - it's not clogging up the thread at all, this is a factual legality that is getting overlooked. We're buying a cheaper card, therefore we should know we are getting less performance. How that performance is neutered is down to NVIDIA - and although there is a penalty for that 0.5GB, it is still there and can be used. It does have the same memory sub system bar small changes which in hindsight they've admitted should have been made clear albeit mainly only due to the way people were seeing reported usage in third party application.
> 
> My point that I'm trying to get across to you is that this is nothing new. These things chop and change hands and not everything gets disclosed at all times. And yes it is still wrong that it happens this being no exception


Since you brought up legality:
Quote:


> FTC Enforcement of Deceptive Advertising Laws
> Over the years, the Federal Trade Commission (FTC) has taken action against many businesses accused of engaging in false and deceptive advertising. If FTC investigators are convinced that an ad violates the law, they can do all of the following:
> - convince the violator to voluntarily comply with the law
> - issue a cease-and-desist order and bring a civil lawsuit on behalf of people who have been harmed
> - seek a court order (injunction) to stop a questionable ad while an investigation is in progress, and
> - require an advertiser to run corrective ads, admitting that an earlier ad was deceptive.


Source: http://www.nolo.com/legal-encyclopedia/consumer-protection-laws-business-29641.html


----------



## Xoriam

Quote:


> Originally Posted by *Silent Scone*
> 
> If we're being forthright actually, do any 970 owners feel slightly stupid for being so happy until this came to light? lol.
> 
> Not sure how many people would have had the initiative to look at the frame times at the exact moment they exceeded 3.5GB.


No, I do not feel stupid for being happy; my card performs as was shown in reviews.
I got what I bought, and it still works like it did before the information was discovered.

Nvidia, however, should feel stupid for letting incorrect information stand for so long.


----------



## Mand12

Quote:


> Originally Posted by *Xoriam*
> 
> No I do not feel stupid for being happy, my card performs like was shown in reviews.
> I got what I bought, it still works like it did before the information was discovered.
> 
> Nvidia should however feel stupid for letting incorrect information be used for so long.


Exactly. Props to you for being reasonable.


----------



## Seraphic

Nvidia was content keeping quiet knowing reviews/specifications were incorrect. And only when people started to wise up that something was odd with the memory did they come clean.


----------



## Silent Scone

That's what I was aiming for


----------



## Seven7h

Quote:


> Originally Posted by *criminal*
> 
> I believe 100% it was not a mistake. Not at the level Nvidia is at. Maybe mom and pop shop down the street, but not Nvidia. But had it been an honest mistake, they would still have to come clean in the same manner. Nothing would be different I am afraid.


You'd be surprised. The more people are involved, the more segmented and specialized tasks become, and the more there is an assumption that "someone, somewhere in the company must be taking care of it."

This actually *increases* the likelihood of such a communication failure. Not to mention that more people must then also be synced up on the same information to achieve the same goal.

If you work in a shop of 3 people, you can't afford to make such assumptions, and it's very easy to confirm or disprove them by asking one person. If you have 10,000 people, you assume everyone is doing their part correctly, until someone doesn't.

If you still don't understand or believe it, go work in a large corporation and come back to me in a year.


----------



## Seven7h

Quote:


> Originally Posted by *Xoriam*
> 
> No I do not feel stupid for being happy, my card performs like was shown in reviews.
> I got what I bought, it still works like it did before the information was discovered.
> 
> Nvidia should however feel stupid for letting incorrect information be used for so long.


They do.


----------



## Mand12

Quote:


> Originally Posted by *Seven7h*
> 
> You'd be surprised. The more people are involved, the more segmented and specialized tasks are, and the more there is an assumption thst "someone, somewhere in the company must be taking care of it."
> 
> This actually *increases* the likelihood of such a communication failure. Not to mention that then more people must also be synced up on information to achieve the same goal.
> 
> If you work in a shop of 3 people, you can't afford to make such assumptions, and it's very easy to confirm or disprove them by asking one person. If you have 10,000 people, you assume everyone is doing this part correctly, until someone doesn't.
> 
> If you still don't understand it, or believe it, go work in a large corporation and come back to me in a year


Exactly this. Everyone whose job wasn't to proofread the spec sheet for release assumed that guy did his job right. Everyone else just passes it along from step to step, not even knowing whether or not it's correct.

That guy is probably going to get a talking to, though. You don't want to be that guy.
Quote:


> Originally Posted by *Seraphic*
> 
> Nvidia was content keeping quiet knowing reviews/specifications were incorrect. And only when people started to wise up that something was odd with the memory did they come clean.


You assume Nvidia knew they were wrong. That's not a valid assumption. They can't "come clean" about a mistake they didn't know existed.


----------



## Xoriam

I'm sure most people know how bigger companies "somewhat work", in the sense that with such a big screw-up, everyone can take a bit of comfort, because you know someone is getting fired.


----------



## Seven7h

Quote:


> Originally Posted by *Mand12*
> 
> I must admit I'm amazed that the response has been overwhelmingly "ignore the real-world performance benchmarks, make your buying decisions based on the spec sheet." Thought OCN knew better...


Agreed. While this was a mistake, there are a million ways to juice, boost, and pump up specs on a spreadsheet or spec table. Human psychology is fragile in certain ways, and susceptible to marketing tricks in spec listings.

However, there is no way to cheat measured performance (other than with lossy optimizations, which are easily uncovered, of course). I'm shocked how many people seem to buy GPUs just to be able to say they have one.

If you want to chase specs, a GTX 285 has a full 512-bit bus!!!! That has to be faster, right?!?


----------



## Seven7h

Quote:


> Originally Posted by *dukeReinhardt*
> 
> A remarkable job? What's happening now is what's called damage control. If they did ANY less than what they're doing, there really would be serious consequences for their brand. You're proof of what I'm saying though - you're more than willing to easily forgive Nvidia. In fact, your post reads like praise - 'look at how exceptionally Nvidia is responding to this. Other companies aren't so good'.


According to people like you, the only way for NVIDIA to remain in their good graces is if the company turns into a full scale charity and gives GPUs away for free from now on, for the sake of the Common Good of course.

I'm sure this would make any comrade very pleased.


----------



## SandGlass

Quote:


> Originally Posted by *SKYMTL*
> 
> Where in the world did I say anything about the open source community? I was discussing pre-launch architectural specifics rather than longterm driver support. If we were talking about open source support after an architecture has launched, the conversation would be completely different.


You misunderstood: the documentation Intel and AMD each release about their GPUs is one to two orders of magnitude more than what Nvidia releases. That also applies pre-launch, as AMD and Intel both start releasing information and working on open source drivers around a year and a half before release. Nvidia does not do this, and its documentation pales in comparison.
Quote:


> Originally Posted by *iSlayer*
> 
> Does Nvidia open source support matter given their Linux drivers are far better than AMD's (open source and 1st party)?


Of course it does, for quite a few reasons, both philosophical and practical. Many people (mainly developers of open source libraries; as you can see with Travis & GitHub, most developers do not even do Windows builds in testing) object to having any closed-source software on their system, and binary drivers are incompatible with the GPL. In practical terms, with binary blobs you sacrifice security and risk losing support after a product's EOL. Look at the open source drivers for AMD: they still support cards released more than a decade ago, and if there's a bug, it can be fixed. Having access to driver source is also extremely useful; with the exception of games, open source drivers often have faster turnaround times fixing bugs (AMD has an IRC channel for discussing the open source drivers, and it's not unheard of for a bug to be discovered and fixed the same day). There's also cross-platform compatibility: look at the recent Freedreno drivers. Want to run Linux? You're screwed if the manufacturer only provides Android drivers.


----------



## IRO-Bot

So are they gonna drop the price back to its original $299?


----------



## criminal

Quote:


> Originally Posted by *Seven7h*
> 
> You'd be surprised. The more people are involved, the more segmented and specialized tasks are, and the more there is an assumption that "someone, somewhere in the company must be taking care of it."
> 
> This actually *increases* the likelihood of such a communication failure. Not to mention that then more people must also be synced up on information to achieve the same goal.
> 
> If you work in a shop of 3 people, you can't afford to make such assumptions, and it's very easy to confirm or disprove them by asking one person. If you have 10,000 people, you assume everyone is doing this part correctly, until someone doesn't.
> 
> If you still don't understand it, or believe it, go work in a large corporation and come back to me in a year


I have worked in a large corporation before, one with 100,000+ employees. I don't remember anything on this scale ever happening there. I believe it can happen; I just don't believe it is an accident when the likes of Intel, AMD, or Nvidia do it. Their products are sold on specs and performance. Like someone else said, a GT 730 with 4GB seems pretty pointless to us, but I am sure some idiot comes along and buys the card without checking benchmarks because it has 4GB of RAM and his 650 Ti only has 1GB.


----------



## dukeReinhardt

Quote:


> Originally Posted by *Seven7h*
> 
> According to people like this, the only way for NVIDIA to remain in their good graces is if they turn into a full scale charity and give GPUs away for free, for the sake of the common good of course.
> 
> I'm sure this would make any comrade very pleased.


I'm forcing myself to respond to you, because you're being so rude.

1. If you're going to quote me and respond directly to what I said, don't refer to me in the third person, you ungracious ape.
2. Did I imply that? At any point in what I said did I suggest that Nvidia needs to give things out for free?
3. If you object to any single point of mine, why don't you respond to it directly, rather than insulting me?
4. What's this got to do with Communism, and why is that supposed to be an appropriate insult?
5. Screw you too?


----------



## Xoriam

Quote:


> Originally Posted by *IRO-Bot*
> 
> So are they gonna drop the price back to its original $299?


When was a decent 970 ever that price????

(just so you know, in Europe a decent 970 converted from euros to dollars costs 410+ USD)

(and a decent 290X converted costs roughly 530+ USD.)


----------



## Seven7h

Quote:


> Originally Posted by *SandGlass*
> 
> You misunderstood: the documentation Intel and AMD each release about their GPUs is one to two orders of magnitude more than what Nvidia releases. That also applies pre-launch, as AMD and Intel both start releasing information and working on open source drivers around a year and a half before release. Nvidia does not do this, and its documentation pales in comparison.
> Of course it does, for quite a few reasons, both philosophical and practical. Many people (mainly developers of open source libraries; as you can see with Travis & GitHub, most developers do not even do Windows builds in testing) object to having any closed-source software on their system, and binary drivers are incompatible with the GPL. In practical terms, with binary blobs you sacrifice security and risk losing support after a product's EOL. Look at the open source drivers for AMD: they still support cards released more than a decade ago, and if there's a bug, it can be fixed. Having access to driver source is also extremely useful; with the exception of games, open source drivers often have faster turnaround times fixing bugs (AMD has an IRC channel for discussing the open source drivers, and it's not unheard of for a bug to be discovered and fixed the same day). There's also cross-platform compatibility: look at the recent Freedreno drivers. Want to run Linux? You're screwed if the manufacturer only provides Android drivers.


Investing in driver support for a platform that the market has not validated through demand is called "waste" and "inefficiency." The noble "everything free everywhere" model survives (well, more like limps along) on the backs of people willing to do thankless work for a small handful of uncompromising people, until they burn out or have to start paying rent.

Millions of dollars go into making these things... Why would you just start giving out all your hard work for free? There's a tremendous amount of intellectual property at stake.

There are also legal liabilities. If you open source, you *will* be sued by someone claiming that your code violates theirs... even if you had no idea it violated it when you wrote it and have never even heard of their software.


----------



## Final8ty

I'm more for making sure something like this does not happen again, and that AMD doesn't start getting any bright ideas about doing the same.

If we all kept quiet, you can be damn sure NV would do it again.


----------



## Seven7h

Quote:


> Originally Posted by *dukeReinhardt*
> 
> I'm forcing myself to respond to you, because you're being so rude.
> 
> 1. If you're going to quote me and respond directly to what I said, don't refer to me in the third person, you ungracious ape.
> 2. Did I imply that? At any point in what I said did I suggest that Nvidia needs to give things out for free?
> 3. If you object to any single point of mine, why don't you respond to it directly, rather than insulting me?
> 4. What's this got to do with Communism, and why is that supposed to be an appropriate insult?
> 5. Screw you too?


Sorry about that, I fixed it for you. Now it addresses you and your mindset directly.

Hopefully this qualifies me as a marginally more gracious ape.


----------



## Seven7h

Quote:


> Originally Posted by *Final8ty*
> 
> Im more for making sure something like this does not happen again and that AMD dont start getting any bright ideas for doing the same.
> 
> If we all kept quiet you can be damn sure NV would do it again.


There's nothing wrong with having a chip configured like this again in the future, just as long as it's explicitly documented (come up with a special marketing name, or sell it as 3.5GB+512MB texture memory).

In some cases it is more efficient and enables 95% of the performance for probably 80-90% of the price, so it could be considered a vector of efficiency.


----------



## dukeReinhardt

Quote:


> Originally Posted by *Seven7h*
> 
> Sorry about that, I fixed it for you. Now it addresses you and your mindset directly.
> 
> Hopefully this qualifies me as a marginally more gracious ape.


You haven't fixed it so that it's any less irrelevant.


----------



## Xoriam

Quote:


> Originally Posted by *Seven7h*
> 
> There's nothing wrong with having a chip configured like this again in the future, just as long as it's explicitly documented (come up with a special marketing name, or sell it as 3.5GB+512MB texture memory).


Speaking of this, you know what would be amazingly nice?

You know how the OS is always eating away at VRAM? If they programmed it specifically so that the OS VRAM requisitions never went outside the segmented RAM section, I'd love that sort of setup.


----------



## Final8ty

Quote:


> Originally Posted by *Seven7h*
> 
> There's nothing wrong with having a chip configured like this again in the future, just as long as it's explicitly documented (come up with a special marketing name, or sell it as 3.5GB+512MB texture memory).
> 
> In some cases it is more efficient and enables 95% of the performance for probably 80-90% of the price, so it could be considered a vector of efficiency.


If the card had been explicitly documented as 3.5GB+512MB, then I would have no issue; people would have known what they were buying and could not complain.


----------



## wooshna

I think people are getting out of hand with this "did nvidia lie or not know" argument.

Nvidia is a tech company.

Nvidia hires the best possible people to do the job.

Nvidia doesn't hire high school graduates into their marketing/PR/software/tech divisions.

People saying nvidia didn't lie about the specs obviously didn't read the specs from September vs. 4 days ago.

People saying nvidia had nothing to gain by lying (intentionally) or being incompetent are lying to themselves.

Businesses are there to make money.

Who knows how much money nvidia put into the Maxwell cards' research, development, and marketing; you would think they would throw in a few extra hundred dollars to proofread the spec sheet against the marketing labels.


----------



## Seven7h

Quote:


> Originally Posted by *Xoriam*
> 
> Speaking of this, you know what would be amazingly nice?
> 
> You know how the OS is always eating away at VRAM? If they programmed it specifically so that the OS VRAM requisitions never went outside the segmented RAM section, I'd love that sort of setup.


Yes, then you don't pay a premium on full speed memory that goes to uses that don't really benefit. That's why I'd argue this could've been turned into a feature rather than a disadvantage if they had caught it in time. It's all about human perception.

The thing is, the base video memory consumption you see before launching any game comes from a couple things:

1. Low-level data that the driver needs to keep resident on the GPU just for it to function at all... This is not game resource data, and it is effectively a carve-out, not unlike page tables taking space on an HDD.

2. Aero (the 3D/transparency UI that has been in Windows since Vista). Every window becomes a graphics object, so these eat into your available video memory whenever the desktop is in view. However, they don't consume anything when you run in full-screen mode, since they get evicted from video memory.

Unfortunately the bulk of the "extra" is #1, but that is what will be moved to the slow memory in a driver update. So you'll partially get your wish.
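A conceptual sketch of the placement policy being described (steer low-priority carve-outs into the slow segment first, so games keep the fast 3.5GB to themselves) might look like this. To be clear, the class, method names, and spill behavior here are illustrative assumptions, not NVIDIA's actual driver logic; only the two segment sizes come from the thread:

```python
# Conceptual sketch (NOT NVIDIA's driver code): a two-segment VRAM
# allocator. Low-priority allocations (driver-resident data, desktop
# compositing surfaces) prefer the slow 0.5GB segment; game resources
# prefer the fast 3.5GB segment. Either spills to the other when full.
FAST_CAPACITY_MB = 3584  # 3.5 GB high-priority segment
SLOW_CAPACITY_MB = 512   # 0.5 GB low-priority segment

class SegmentedVram:
    def __init__(self):
        self.used = {"fast": 0, "slow": 0}

    def alloc(self, size_mb, low_priority=False):
        """Return which segment the allocation landed in."""
        order = ["slow", "fast"] if low_priority else ["fast", "slow"]
        caps = {"fast": FAST_CAPACITY_MB, "slow": SLOW_CAPACITY_MB}
        for seg in order:
            if self.used[seg] + size_mb <= caps[seg]:
                self.used[seg] += size_mb
                return seg
        raise MemoryError("out of VRAM")

vram = SegmentedVram()
print(vram.alloc(300, low_priority=True))  # driver carve-out -> 'slow'
print(vram.alloc(3400))                    # game resources   -> 'fast'
```

With a policy like this, the base consumption described in points 1 and 2 would no longer eat into the fast segment unless the slow one overflowed.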


----------



## looniam

word on the street is that during the marketing power point presentation of the gtx 970; no rep cared to push themselves away from the pastry laden conference table that had only one cheese danish and hop on their segway to go across campus and interrupt the engineers who were watching kitty cat videos and eating cheetos to ask if the listed specifications were correct.

the engineers; whose whole meaning in life is graphic processor engineering, kitty cat videos and cheetos, do not read press releases or look at site reviews because that would be akin to a broadway actor reading critics' reviews of their performance on an opening night of a play; their ego and low self worth are just too fragile to handle it.

so we cannot justifiably blame the engineers nor the pr department for nvidia's _plausible reason_ for the miscommunication.

no, we must blame the intern who brought only one cheese danish in the box of pastry that they delivered to the marketing conference room.

on a side note:
i don't know about ya all but, my grandmother told me calling someone a liar makes you as bad as the liar. i guess things change after 50+ years; though it doesn't look like it's for the better.


----------



## MerkageTurk

nVidia misrepresented their own product, which warrants a refund or some sort of compensation.

From what I've learned, never buy a card ending in x70; heck, they don't even provide driver support for their cards after three months.

A GTX 960 is also slow, even slower than a 770.


----------



## mouacyk

Quote:


> Originally Posted by *looniam*
> 
> word on the street is that during the marketing power point presentation of the gtx 970; no rep cared to push themselves away from the pastry laden conference table that had only one cheese danish and hop on their segway to go across campus and interrupt the engineers who were watching kitty cat videos and eating cheetos to ask if the listed specifications were correct.
> 
> the engineers; whose whole meaning in life is graphic processor engineering, kitty cat videos and cheetos, do not read press releases or look at site reviews because that would be akin to a broadway actor reading critics' reviews of their performance on an opening night of a play; their ego and low self worth are just too fragile to handle it.
> 
> so we cannot justifiably blame the engineers nor the pr department for nvidia's _plausible reason_ for the miscommunication.
> 
> no, we must blame the intern who brought only one cheese danish in the box of pastry that they delivered to the marketing conference room.
> 
> on a side note:
> i don't know about ya all but, my grandmother told me calling someone a liar makes you as bad as the liar. i guess things change after 50+ years; though it doesn't look like it's for the better.


Some grandmas also don't endorse truth and honesty, as they see no value in it.


----------



## Seven7h

Quote:


> Originally Posted by *wooshna*
> 
> I think people are getting out of hand with this "did nvidia lie or not know" argument.
> 
> Nvidia is a tech company.
> 
> Nvidia hires the best possible people to do the job.
> 
> Nvidia doesn't hire high school graduates into their marketing/PR/software/tech divisions.
> 
> People saying nvidia didn't lie about the specs obviously didn't read the specs from September vs. 4 days ago.
> 
> People saying nvidia had nothing to gain by lying (intentionally) or being incompetent are lying to themselves.
> 
> Businesses are there to make money.
> 
> Who knows how much money nvidia put into the Maxwell cards' research, development, and marketing; you would think they would throw in a few extra hundred dollars to proofread the spec sheet against the marketing labels.


Businesses are made up of a collection of real people. People like you and me, who have consciences and don't want to do irreparable brand damage because they like their jobs and like their pay. They are not some corporate dictatorship that is constantly scheming on how to take advantage of you. When things get shady or out of hand you will have whistleblowers and public anonymous blogposts.

You can make money and keep people happy, and it's wayyyyy better for business. On the flip side, even if you were super evil, there are only so many people you could screw before it catches up with you and you face significantly increasing backpressure trying to make each sale. No corporation of thousands of people is unaware of that.

These companies have plenty of enthusiasts and gamers *working in them* who all read gaming tech news and start asking tough questions internally and getting just as mad at their colleagues as people on the forums do (well, within reason).

This whole "it had to have been intentional" is sounding extremely conspiratorial and it's starting to make people look like "truthers" or faked moon landing theorists. Yes people **** up, on all scales, big and small. One only needs to look to their respective governments to see daily examples of that


----------



## looniam

Quote:


> Originally Posted by *mouacyk*
> 
> Some grandmas also don't endorse truth and honesty, as they see no value in it.


only a sniveling little twit wouldn't appreciate their grandmother . . .

just saying.


----------



## sage101

This thread should be closed already; too much hostility going on in here. I don't condone nvidia's deceptive product specifications, but the GTX 970 is still a beast of a card. I would still pick a used R9 290X on eBay over it, though, unless an irate 970 owner is willing to sell me theirs for $250 or less. I'm here to save one lucky soul from his misery by taking his pain (970) away and making him/her $250 richer. Deal of the century.


----------



## mouacyk

Quote:


> Originally Posted by *looniam*
> 
> only a sniveling little twit wouldn't appreciate their grandmother . . .
> 
> just saying.


I'll leave this here.


----------



## Xoriam

So the thread has even gone beyond mom jokes....

Leave my grandmoma out of this!


----------



## BazG

Quote:


> Originally Posted by *velocd*
> 
> I feel the same way. I build systems to last 4-5 years, and I would have purchased a GTX 980 had I known the GTX 970 was effectively 3.5GB.
> 
> This news depreciates the value of the GTX 970 for resale.


Agreed


----------



## 2010rig

Quote:


> Originally Posted by *Final8ty*
> 
> Two wrongs do not make a right, and AMD got bashed for it from the get-go on day one, so it's not like Bulldozer got by unbashed, so why bash NV for this?
> AMD got bashed, and rightly so, and NV is getting bashed for this, and rightly so.


I was implying that it was one of the few times I was up in arms with AMD's lies, and that was pretty severe, since they did in fact lie and hyped BD's *performance* for MONTHS prior to release. Did you miss the part where they claimed *"Bulldozer was designed to be the HIGHEST performing SINGLE and MULTI-THREADED compute core in history"*? How did that claim work out for them? *Apples and Oranges.*
Quote:


> Originally Posted by *DzillaXx*
> 
> I know you huge advocate for the Green side, maybe a bit too much at time, but how can you not be mad at this?
> 
> Nvidia clearly lied about the number of ROPs the card has. This is not some simple, easily mistaken fact, and more than likely they knew about it.
> 
> *I also don't believe that it took them 4 months to realize the specs were wrong. From what we've learned so far, the card isn't ROP bottlenecked. I'm still trying to figure out what Scott Wasson's correction meant, but for now...*
> Quote:
> 
> 
> 
> the GTX 970 has 56 ROPs and 1792 KB of L2 cache compared to 64 ROPs and 2048 KB of L2 cache for the GTX 980. Before people complain about the ROP count difference as a performance bottleneck, keep in mind that the *13 SMMs in the GTX 970 can only output 52 pixels/clock and the seven segments of 8 ROPs each (56 total) can handle 56 pixels/clock.* *The SMMs are the bottleneck, not the ROPs.*
> 
> 
> 
> *Based on that, does it really matter if the card had 64 ROP's, where the last 8 ROP's would not be utilized anyway due to the SMM's being the bottleneck?
> *
> I for one would feel cheated to find out the card I bought has less ROPs than I originally bought the card for. Sure the Benchmarks are solid for the card, but facts are still facts. The card can't go past 3.5GB of RAM if you care about smoothness.
> 
> *Umm, are you sure about that? The card can in fact use all 4GB, a few users have already demonstrated that in this thread. Look at the benches posted earlier, and follow the discussions accordingly:
> http://www.overclock.net/t/1537725/pcper-nvidia-responds-to-gtx-970-3-5gb-memory-issue/1350_50#post_23468866
> 
> ALL cards take similar hits above 3.5 GB
> http://www.overclock.net/t/1537725/pcper-nvidia-responds-to-gtx-970-3-5gb-memory-issue/1300_50#post_23467862
> 
> You may also want to read this:
> http://www.overclock.net/t/1537725/pcper-nvidia-responds-to-gtx-970-3-5gb-memory-issue/1400_50#post_23469797
> *
> 
> If someone wants to toss their 970 back in the box, and get something else. Good for them. Honestly more people should be doing it, just to prove a point. Maybe even pickup one of AMD's offerings, I have the 290. Couldn't be happier. Great card, and keeps your room warmer in the cold winter months. Though IMO the hassle of sending back a video card can be a drag. Does the GTX970 perform well? Yes. Would a 290/290x or gtx980 make you happier? perhaps. Should anyone tell you, you shouldn't return you card just to make a point? No.
> 
> *If they want to return the card, by all means do so. If the retailer won't take it back, ask [email protected] for assistance.
> https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4438090/#4438090
> 
> I don't know why anyone would want to sidegrade to a card that uses more power, is louder, and puts out more heat, for the SAME performance. To each his own.*
> 
> *If NVIDIA had lied about the >>>performance<<< of the card during reviews, my stance would be completely different. Fact is, the card performs as advertised, and we now have the correct specs*
Click to expand...

Quote:


> Originally Posted by *RagingCain*
> 
> I believe the card would have sold just as well with the correct specs, no better, no worse (% wise). That leads me to think that there was no real point of them to lie.
> 
> It would have been another cut down version of their big boy card, business as usual.


That's exactly what I was saying earlier. We're used to x70 cards always being cut down in ROP's, SMM's, even lower bus width. That's nothing new.
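The pixel-throughput arithmetic quoted above (why the SMMs, not the ROPs, are the bottleneck) can be sketched in a few lines. The 4 pixels/clock per SMM figure is inferred from the quoted numbers (13 SMMs → 52 pixels/clock), so treat it as an assumption from that quote rather than an official spec:

```python
# Why the GTX 970's 56 ROPs aren't the bottleneck: its 13 active SMMs
# can only export 52 pixels per clock, below what the ROPs can absorb.
PIXELS_PER_SMM_PER_CLOCK = 4  # implied by the quoted 13 SMMs -> 52 px/clk

def pixel_bottleneck(smm_count, rop_count):
    """Return (effective pixels/clock, which unit limits it)."""
    smm_rate = smm_count * PIXELS_PER_SMM_PER_CLOCK  # shader export rate
    rop_rate = rop_count                             # 1 pixel/clock per ROP
    limit = "SMMs" if smm_rate < rop_rate else "ROPs"
    return min(smm_rate, rop_rate), limit

print(pixel_bottleneck(13, 56))  # GTX 970: (52, 'SMMs')
print(pixel_bottleneck(16, 64))  # GTX 980: (64, 'ROPs') - rates balanced
```

So even if the 970 had shipped with all 64 ROPs, the effective pixel rate would still be capped at 52 pixels/clock by the SMMs, which is the point the quoted explanation makes.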


----------



## Final8ty

Quote:


> Originally Posted by *2010rig*
> 
> I was implying that it was one of the few times I was up in arms with AMD's lies, and that was pretty severe, since they did in fact lie and hyped BD's *performance* for MONTHS prior to release. Did you miss the part where they claimed *"Bulldozer was designed to be the HIGHEST performing SINGLE and MULTI-THREADED compute core in history"*? How did that claim work out for them? *Apples and Oranges.*
> 
> That's exactly what I was saying earlier. We're used to x70 cards always being cut down in ROP's, SMM's, even lower bus width. That's nothing new.


I went on the biggest tangent I have ever gone on when it came to BD, but the fact is that's done and dusted, and this is about now, with the 970.


----------



## MerkageTurk

^Biased, mate.

Be open-minded; at least Bulldozer did not have parts cut out, etc.

You get the real deal, unlike nvidia's missing clusters etc.

Bulldozer needs software optimisation; however, most software uses Intel binaries and nvidia.


----------



## djsi38t

I think that if you want to keep your 970 and are happy with its performance, then you shouldn't ask for or expect any compensation from nvidia.

If you were happy with the card before, then there's no reason not to be now.

It's pretty ridiculous that some people in the nvidia forums want a free 980 or 100 dollars cash because of this.

Of course this is easy for me to say as I don't own a 970, but I do know that I wouldn't all of a sudden hate my card or nvidia because of this.

Does anyone think that nvidia will do this again in the future? I am sure they aren't happy about the consumer reaction and hope they will take steps to prevent this from happening again.

Most likely the card would have been substantially more money, and if so they should have had two variants of the 970, so customers could have a choice.


----------



## Seven7h

Quote:


> Originally Posted by *djsi38t*
> 
> I think that if you want to keep your 970 and are happy with its performance, then you shouldn't ask for or expect any compensation from nvidia.
> 
> If you were happy with the card before, then there's no reason not to be now.
> 
> It's pretty ridiculous that some people in the nvidia forums want a free 980 or 100 dollars cash because of this.
> 
> Of course this is easy for me to say as I don't own a 970, but I do know that I wouldn't all of a sudden hate my card or nvidia because of this.
> 
> Does anyone think that nvidia will do this again in the future? I am sure they aren't happy about the consumer reaction and hope they will take steps to prevent this from happening again.
> 
> Most likely the card would have been substantially more money, and if so they should have had two variants of the 970, so customers could have a choice.


Well said.


----------



## sugalumps

Quote:


> Originally Posted by *BazG*
> 
> Agreed


You can't expect to buy a cut-down mid-range card and have it last you 4-5 years with "future proof" in mind. If that is the case, you always buy the best that you can afford at the time; so if you had the money to get the 980 but bought the 970 because you thought you were getting a free lunch, you fudged up.


----------



## Seven7h

Quote:


> Originally Posted by *Final8ty*
> 
> If the card had explicitly documented 3.5GB+512MB then i would have no issue as people knew what they were buying so could not complain.


To be honest, I personally don't think it's entirely fair to relegate the 512MB to just being some afterthought... It's legitimate memory that is very useful.

But just to cover themselves from the most sensitive and stubborn people, it's probably best to understate slightly than overstate even the slightest bit.

It is a delicate balancing act, with all the entitled mentalities around these days. They don't want to bake cost into the product, then undersell its capabilities either.


----------



## 2010rig

Quote:


> Originally Posted by *djsi38t*
> 
> I think that if you want to keep your 970 and are happy with it's performance then you shouldn't ask or expect any compensation from nvidia.
> 
> If you were happy with the card before then no reason not to be now.
> 
> *It's pretty ridiculous that some people in the nvidia forums want a free 980 or 100 dollars cash because of this.*
> 
> Of course this is easy for me to say as I don't own a 970,but I do know that I wouldn't all of a sudden hate my card or nvidia because of this.
> 
> Does anyone think that nvidia will do this again in the future?I am sure they aren't happy about the consumer reactions and hope they will take steps to prevent this from happening again.
> 
> Most likely the card would have been substantially more money,and if so they should have had 2 variants of the 970,so customers could have a choice.


I commented on that yesterday and thought those guys were being ridiculous. He was almost implying that the $100 would be for pain and suffering. lmao

There are a lot of cry babies over there.


----------



## looniam

Quote:


> Originally Posted by *mouacyk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> only a sniveling little twit wouldn't appreciate their grandmother . . .
> 
> just saying.
> 
> 
> 
> I'll leave this here.
Click to expand...

please do since making negative comments seems to be all the rage these days.

see what i did there.


----------



## Final8ty

Quote:


> Originally Posted by *Seven7h*
> 
> To be honest, I personally don't think it's entirely fair to relegate the 512MB to just being some afterthought... It's legitimate memory that is very useful.
> 
> But just to cover themselves from the most sensitive and stubborn people, it's probably best to understate slightly than overstate even the slightest bit.
> 
> It is a delicate balancing act, with all the entitled mentalities around these days. They don't want to bake cost into the product, then undersell its capabilities either.


Well, I would not say stubborn, because people buy a card for different reasons and needs, and with this memory configuration it's clearly not suitable for some tasks; it has in fact genuinely caught some people out.

And most of us know this user.
Quote:


> Originally Posted by *GoldenTiger;1041389384*
> Well ****, I totally misread that graph set before. Guess my 4k monitor wasn't at fault after all... which sucks, because I already sent it back for a refund and had gotten a killer deal on it originally.
> 
> Maxwell 2.0 chips are basically the 2600K of GPU's. In general I agree though.


Quote:


> Originally Posted by *GoldenTiger;1041391210*
> Did a little testing of my own using afterburner's frametime readings and other monitoring tools... it's not FCAT but it's very accurate regardless. Here's what I got...
> 
> 
> Spoiler: [frametime graph]
> So yeah, using SLI GTX 970's to drive high-res high-settings will result in massive, massive frametime issues, even if the framerate over a given second remains reasonable. It is basically an unplayable mess at that point when using 3.7-4.0gb of VRAM. If you can stay around/below 3.5gb of actual usage, which it does its best to do, frametimes are consistent and tight as you would expect. The framerate averaged around 38, meaning in a perfect world the frametimes would be right around 26.3ms for each frame.
> 
> As an interesting aside, when finding my settings to test with I noticed it would literally, over the course of several seconds, try to work its way back down to below 3.5gb of usage if it went over, until I set things high enough that it couldn't and would just stick at 3.7-3.8gb+ the whole time. Otherwise it would fight and keep pingponging from ~3.4gb directly to ~3.7gb and back repeatedly before finally settling at ~3.4gb. That's probably the drivers at work, there.


Quote:


> Originally Posted by *GoldenTiger;1041391256*
> I couldn't say, honestly, but my cpu usage isn't more than ~70% average on the cores during gameplay typically so presumably there's plenty of headroom on that front.
> 
> Just to elaborate a little (copying another forum post I wrote) even with a similar framerate, frametimes get completely torpedo'd once you pass the 3.5gb threshold. For example that graph was ~38fps, but if you get below the 3.5gb mark outright with your settings a ~50fps gameplay has consistent frametimes with little variance, bouncing between ~15-25ms of render time as you'd expect, sometimes a little more or less.
> 
> The ~38fps though passing the 3.5gb vram mark, however, ends up having times constantly going between ~35ms to 150ms of time to render each frame, with many spikes over 200ms.


http://hardforum.com/showthread.php?t=1849838&page=18
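
GoldenTiger's Afterburner log analysis boils down to simple frametime statistics. Below is a minimal, hypothetical sketch (the trace values, threshold, and function name are illustrative assumptions, not anything from his post) of how one might summarize such a log: average FPS, the 99th-percentile frametime, and a count of stutter spikes:

```python
def frametime_stats(frametimes_ms, spike_threshold_ms=50.0):
    """Summarize a list of per-frame render times in milliseconds."""
    avg_ms = sum(frametimes_ms) / len(frametimes_ms)
    ordered = sorted(frametimes_ms)
    # 99th-percentile frametime: 99% of frames rendered at least this fast.
    p99 = ordered[min(len(ordered) - 1, int(len(ordered) * 0.99))]
    # Spikes: frames slow enough to be felt as a visible stutter.
    spikes = sum(1 for t in frametimes_ms if t > spike_threshold_ms)
    return {"avg_fps": 1000.0 / avg_ms, "p99_ms": p99, "spike_count": spikes}

# A mostly smooth run with a few >3.5GB-style stutter spikes mixed in.
trace = [26.3] * 95 + [150.0, 200.0, 35.0, 120.0, 26.3]
print(frametime_stats(trace))
```

This is why a "reasonable" average framerate can hide an unplayable experience: the average here is still in the low 30s FPS, but the worst 1% of frames take 200 ms each.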


----------



## Orangey

Quote:


> Originally Posted by *Final8ty*
> 
> And most of us know this user.
> 
> http://hardforum.com/showthread.php?t=1849838&page=18


Just goes to show how fast it can all fall apart.


----------



## PureBlackFire

this thread keeps blowing up wow.
Quote:


> Originally Posted by *RagingCain*
> 
> I believe the card would have sold just as well with the correct specs, no better, no worse (% wise). That leads me to think that there was no real point of them to lie.
> 
> It would have been another cut down version of their big boy card, business as usual.


I agree. as much as people talk about power consumption and specs, the thing that really sells GPUs is performance. the 970 has it. price/performance is also very important and again, the 970 kicks ass. this card ruined everything north of the R9 285's price as a viable option for almost everyone looking for a GPU. that being said, what's naive (putting it politely) is the belief that the card's specs and memory design were simply miscommunicated internally during months of development and testing, as well as miscommunicated to the press and public, and went unnoticed for four months. anyone who believes that is either hopelessly biased or some variety of fool. well, either way, people who are hopelessly biased when it comes to a corporation that sells products to them are fools.
Quote:


> Originally Posted by *Seraphic*
> 
> Nvidia was content keeping quiet knowing reviews/specifications were incorrect. And only when people started to wise up that something was odd with the memory did they come clean.


yep. typical behavior when you make an honest mistake








Quote:


> Originally Posted by *djsi38t*
> 
> I think that if you want to keep your 970 and are happy with it's performance then you shouldn't ask or expect any compensation from nvidia.
> 
> If you were happy with the card before then no reason not to be now.
> 
> It's pretty ridiculous that some people in the nvidia forums want a free 980 or 100 dollars cash because of this.
> 
> Of course this is easy for me to say as I don't own a 970,but I do know that I wouldn't all of a sudden hate my card or nvidia because of this.
> 
> Does anyone think that nvidia will do this again in the future?I am sure they aren't happy about the consumer reactions and hope they will take steps to prevent this from happening again.
> 
> Most likely the card would have been substantially more money,and if so they should have had 2 variants of the 970,so customers could have a choice.


people are clearly going overboard, and in this thread the lunacy goes in both directions as per usual. the 970 could have released at $400 like the 770 and 670. it would still be a good card, still better price/performance than the 980, and it still would have sunk the prices and appeal of the whole GK110 stack as well as AMD's 290/290X.


----------



## Menta

__ https://twitter.com/i/web/status/560462075193880576









why can't NV have a step-up program when AMD can?


----------



## Ganf

Quote:


> Originally Posted by *PureBlackFire*
> 
> this thread keeps blowing up wow.


I'm running out of popcorn...


----------



## Mad Pistol

I'm pretty much done with this thread. It's disgusting how many people feel entitled to reparations for this.

Nvidia has offered a full refund on the product if you're not happy with it. That's by far the best you're going to get. Nvidia isn't going to pay you money to keep your card, and it is highly unlikely they will give you a break on the cost if you want to upgrade to a 980. The 980 is still around $550, and that's not going to change until AMD releases a newer, better product.

To think, this mess started with a test/benchmark that was unreliable and finicky, and then it exploded into a fire of consumerism because the card has only 7/8 of the L2 cache that was advertised. Funny how the card still performs exactly the same as it did when the specs were first released.

To clarify, I am not defending Nvidia, but you guys need to grow up. Nvidia has offered a solution to your "grievances." Either get your money back, or be happy with the card as is. Your choice.

EDIT: Also, I am keeping my G1 Gaming GTX 970. The performance is really freakin good, and that hasn't changed. That $360 we paid for it was worth every penny.


----------



## rdr09

Quote:


> Originally Posted by *Mad Pistol*
> 
> I'm pretty much done with this thread. It's disgusting at how many people feel entitled to reparations for this.
> 
> Nvidia has offered a full refund on the product if you're not happy with it. That's by far the best you're going to get. Nvidia isn't going to pay you money to keep your card, and it is highly unlikely they will give you a break on the cost if you want to upgrade to a 980. The 980 is still around $550, and that's not going to change until AMD releases a newer, better product.
> 
> To think, this mess started with a test/benchmark that was unreliable and finicky, and then it exploded into a fire of consumerism because the card has only 7/8 L2 cache that was advertised. Funny how the card still performs exactly the same as it did when the specs were first released.
> 
> To clarify, I am not defending Nvidia, but you guys need to grow up. Nvidia has offered a solution to your "grievances." Either get your money back, or be *content* with the card as is. Your choice.
> 
> EDIT: Also, I am keeping my G1 Gaming GTX 970. The performance is really freakin good, and that hasn't changed.


Fixed.


----------



## sugalumps

Quote:


> Originally Posted by *Final8ty*
> 
> Well, I would not say stubborn, because people buy a card for different reasons and needs, and with this memory configuration it's clearly not suitable for some tasks; it has in fact genuinely caught some people out.
> 
> And most of us know this user.
> 
> http://hardforum.com/showthread.php?t=1849838&page=18


I am puzzled how people like Golden and enthusiast reviewers never caught onto this or ran into this issue without being told about it. Golden was always posting graphs about how 970 SLI easily drove his 4K monitor, etc. How did he and others not run into it until now?


----------



## Triniboi82

Quote:


> Originally Posted by *Mad Pistol*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I'm pretty much done with this thread. It's disgusting at how many people feel entitled to reparations for this.
> 
> Nvidia has offered a full refund on the product if you're not happy with it. That's by far the best you're going to get. Nvidia isn't going to pay you money to keep your card, and it is highly unlikely they will give you a break on the cost if you want to upgrade to a 980. The 980 is still around $550, and that's not going to change until AMD releases a newer, better product.
> 
> To think, this mess started with a test/benchmark that was unreliable and finicky, and then it exploded into a fire of consumerism because the card has only 7/8 L2 cache that was advertised. Funny how the card still performs exactly the same as it did when the specs were first released.
> 
> To clarify, I am not defending Nvidia, but you guys need to grow up. Nvidia has offered a solution to your "grievances." Either get your money back, or be happy with the card as is. Your choice.
> 
> EDIT: Also, I am keeping my G1 Gaming GTX 970. The performance is really freakin good, and that hasn't changed. That $360 we paid for it was worth every penny
> 
> 
> .


X2, totally agree. The fact that full refunds are even available now has changed my opinion of Nvidia for the positive. I'll be keeping my cards too and look forward to the new drivers.


----------



## Tsumi

Quote:


> Originally Posted by *sugalumps*
> 
> I am puzzled how people like Golden and enthusiast reviewers never caught onto this or ran into this issue without being told about it. Golden was always posting graphs about how 970 SLI easily drove his 4K monitor, etc. How did he and others not run into it until now?


Goes to show you how unnecessary 4gb of VRAM is in most games, even at 4K. Let alone those clamoring for 8 or 12gb.


----------



## USFORCES

I'd take any Nvidia with 3GB before I'd buy a Radeon with 4GB!


----------



## SandGlass

Quote:


> Originally Posted by *Seven7h*
> 
> Investing in driver support for a platform that the market has not validated through demand is called "waste" and "inefficiency". The noble "everything free everywhere" survives (well, more like limps along) on the backs of people willing to do thankless work for a small handful of uncompromising people, until they burn out or have to start paying rent.
> 
> Millions of dollars go into making these things... Why would you just start giving out all your hard work for free? There's a tremendous amount of intellectual property at stake.


There are tens of thousands of people writing and open sourcing code right now, apparently for little to no reward. But we now have a complete open source desktop suite, extremely usable compared to a decade back. We have LibreOffice (forked from OO), Firefox is back from the dead, Python (along w/ numpy & scipy) and OpenJDK are routinely used in commercial projects, GNU Octave can run a lot of MATLAB code, we have Blender for 3D modeling, we have open source drivers from AMD and Intel, Broadcom and Atheros. We have GIMP, Inkscape, kdenlive as Adobe substitutes. MME and WebGL are slowly killing Flash. We have Jmol for molecule visualization, matplotlib for plotting, GROMACS for protein simulation, the Bullet engine as an alternative to PhysX... the list goes on and on. The point is, the majority of developers have at one point or another used and actively contributed to OSS, literally giving away hard work for free.
Quote:


> There are also legal liabilities. If you open source, you *will* be sued by someone claiming that your code violates theirs... Even if you had no idea it violated it when you wrote it, and you have never even heard of their software.


That is the most ridiculous thing I have heard in a decade about open sourcing. If you open source, you will not get sued unless you use code under the GPL and refuse to give back. Are you implying Nvidia stole their code? Most large chip vendors have open source drivers, so Nvidia is really sticking out like a sore thumb here. They really have no excuse.


----------



## Menta

there are NO refunds, it's not TRUE.

there was an NV rep that spoke out in A THREAD and said he would try to help, but until now, nothing.

some stores are budging only to stay true to the client, nothing more.

there has to be a recall order to ASUS, MSI, etc.

Nvidia has done nothing.

i don't expect anything but will follow the circus


----------



## hyp36rmax

Quote:


> Originally Posted by *USFORCES*
> 
> I'd take any Nvidia with 3GB before I'd buy a Radeon with 4GB!


I buy both depending on my build


----------



## GrimDoctor

Quote:


> Originally Posted by *Menta*
> 
> there are NO refunds, it's not TRUE.
> 
> there was an NV rep that spoke out in A THREAD and said he would try to help, but until now, nothing.
> 
> some stores are budging only to stay true to the client, nothing more.
> 
> there has to be a recall order to ASUS, MSI, etc.
> 
> Nvidia has done nothing.
> 
> i don't expect anything but will follow the circus


And how exactly do you know all this?


----------



## Kuivamaa

Quote:


> Originally Posted by *Tsumi*
> 
> Goes to show you how unnecessary 4gb of VRAM is in most games, even at 4K. Let alone those clamoring for 8 or 12gb.


To be fair, it is easy to be tricked and think you simply hit the 4GB VRAM limit when stutters occur. And let's not forget this issue was revealed when users noticed that monitoring programs would report max VRAM usage at 3.5GB, and some even reported issues going past it. So a few did notice.
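
For reference, the kind of monitoring Kuivamaa describes can be approximated by polling nvidia-smi, which reports per-GPU `memory.used` in MiB. The sketch below is a minimal, illustrative example; the helper names and the 3584 MiB (3.5GB) threshold are assumptions for this thread's scenario, not an official tool:

```python
import subprocess

def gpu_memory_used_mib(raw=None):
    """Return per-GPU used VRAM in MiB, one entry per GPU.

    `raw` lets captured nvidia-smi output be passed in directly (useful
    for testing); otherwise the tool is invoked.
    """
    if raw is None:
        raw = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            text=True,
        )
    return [int(line.strip()) for line in raw.splitlines() if line.strip()]

def near_segment_limit(used_mib, limit_mib=3584):
    """Flag GPUs sitting at or above the 3.5GB (3584 MiB) fast segment."""
    return [u >= limit_mib for u in used_mib]

# Example with captured output from a hypothetical 970 SLI system:
sample = "3470\n3590\n"
print(near_segment_limit(gpu_memory_used_mib(sample)))  # → [False, True]
```

Watching these numbers hover just under 3.5GB, as several users reported, is exactly the pattern that prompted the investigation.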


----------



## spacin9

Quote:


> Originally Posted by *GrimDoctor*
> 
> And how exactly do you know all this?


I know it. NV isn't doing anything. Just lip service.

Maybe, if we're lucky, they might be able to get drivers to do some kind of "acceptable" workaround.


----------



## GrimDoctor

Quote:


> Originally Posted by *spacin9*
> 
> I know it. NV isn't doing anything. Just lip service.
> 
> Maybe, if we're lucky, they might be able to get drivers to do some kind of "acceptable" workaround.


So the fact that Nvidia and Asus assisted me to get a refund from my retailer didn't happen? Ok


----------



## 2010rig

Quote:


> Originally Posted by *Menta*
> 
> there are NO refunds, it's not TRUE.
> 
> there was an NV rep that spoke out in A THREAD and said he would try to help, but until now, nothing.
> 
> some stores are budging only to stay true to the client, nothing more.
> 
> there has to be a recall order to ASUS, MSI, etc.
> 
> Nvidia has done nothing.
> 
> i don't expect anything but will follow the circus


Preach
Quote:


> Originally Posted by *Lass3*
> 
> Returned my 970s, got a full refund after 60 days and kept the games from the vouchers. Awaiting AMDs next line of GPUs.


----------



## Mad Pistol

Quote:


> Originally Posted by *GrimDoctor*
> 
> So the fact that Nvidia and Asus assisted me to get a refund from my retailer didn't happen? Ok


This.

If an actual employee of Nvidia posted that they would help consumers get a refund if they wanted to return their GTX 970s, you'd better believe that Nvidia OK'd it. False information of that magnitude would lead to the loss of their job and a potential lawsuit over that book of an NDA they have to sign before going to work there.


----------



## sugalumps

Quote:


> Originally Posted by *Tsumi*
> 
> Goes to show you how unnecessary 4gb of VRAM is in most games, even at 4K. Let alone those clamoring for 8 or 12gb.


Pretty much. All AMD has to do now is put some spin on it and bring their next GPUs out with 6GB on the basic models, and everyone will eat it up, especially after this.


----------



## Menta

Quote:


> Originally Posted by *GrimDoctor*
> 
> And how exactly do you know all this?


i know because there is no official statement, just some customer care guy who happened to pop up in the middle of the thread.

so i went ahead and tested him; he knows nothing and has no vote on the matter


----------



## Seven7h

Quote:


> Originally Posted by *Final8ty*
> 
> Well i would not say stubborn, because the people buy a card for different reasons and needs and with the men configure its clearly not suitable for some tasks and has in fact genuinely caught some people out..
> 
> And most of us know this user.
> 
> http://hardforum.com/showthread.php?t=1849838&page=18


No one was caught out. Being 5% slower overall in certain cases doesn't make or break any use case on the planet. Basically that would mean someone is upset because they need to upgrade 5% sooner.


----------



## Mad Pistol

Quote:


> Originally Posted by *sugalumps*
> 
> Pretty much. All AMD has to do now is put some spin on it and bring their next GPUs out with 6GB on the basic models, and everyone will eat it up, especially after this.


Even if the card only performs about the same as a GTX 970 and uses double the power?

Seriously, most of you guys have forgotten how awesome the Maxwell architecture is. Nvidia single-handedly made their GK110 lineup of cards obsolete, both in performance and power consumption, without moving on to a better process node. AMD is going to have a very tough time matching the efficiency that Maxwell offers.


----------



## Menta

ok you got lucky but there is no global recall, i also have a strix and ASUS said NO


----------



## Seven7h

Quote:


> Originally Posted by *Menta*
> 
> i know because there is no official statement, just some customer care guy who happened to pop up in the middle of the thread.
> 
> so i went ahead and tested him; he knows nothing and has no vote on the matter


That's incorrect. He just wants the returns to first go through official channels instead of trying to work out some rinky-dink per-user follow-up on a forum.


----------



## GrimDoctor

Quote:


> Originally Posted by *Menta*
> 
> ok you got lucky but there is no global recall, i also have a strix and ASUS said NO


I'm in Australia. Had no issue at all; they even provided me with a case number! This was before the post from NV CS, which only strengthened it.

I did ask politely and didn't fly off the handle, even though it's causing work-related issues... maybe that helped...


----------



## Menta

maybe...lets wait and see


----------



## Art Vanelay

Quote:


> Originally Posted by *Mad Pistol*
> 
> I'm pretty much done with this thread. It's disgusting at how many people feel entitled to reparations for this.
> 
> Nvidia has offered a full refund on the product if you're not happy with it. That's by far the best you're going to get. Nvidia isn't going to pay you money to keep your card, and it is highly unlikely they will give you a break on the cost if you want to upgrade to a 980. The 980 is still around $550, and that's not going to change until AMD releases a newer, better product.
> 
> To think, this mess started with a test/benchmark that was unreliable and finicky, and then it exploded into a fire of consumerism because the card has only 7/8 L2 cache that was advertised. Funny how the card still performs exactly the same as it did when the specs were first released.
> 
> To clarify, I am not defending Nvidia, but you guys need to grow up. Nvidia has offered a solution to your "grievances." Either get your money back, or be happy with the card as is. Your choice.
> 
> EDIT: Also, I am keeping my G1 Gaming GTX 970. The performance is really freakin good, and that hasn't changed. That $360 we paid for it was worth every penny.


This is pretty much how I've felt, watching this whole issue.

Some false advertising about the specs of the card, a thing that no one actually buys a card based on. It's a bad thing for the company to do, but this has been blown way out of proportion; people are returning their 970s despite not having noticed any problem with the card until this news came out. The people asking if they can get a free game out of this are just baffling.


----------



## Fador

They totally lost me with this video.


----------



## sugalumps

Quote:


> Originally Posted by *Mad Pistol*
> 
> Even if the card only performs about the same as a GTX 970 and uses double the power?
> 
> Seriously, most of you guys have forgotten how awesome the Maxwell architecture is. Nvidia single-handedly made their GK110 lineup of cards obsolete, both in performance and power consumption, without moving on to a better process node. AMD is going to have a very tough time matching the efficiency that Maxwell offers.


That's what I meant. All they have to do is market the VRAM now after this debacle and people will eat it up, even though the 970 is still an incredible card for the price while being much more efficient than any other card.


----------



## darkwizard

Well this is out.

http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Looking-GTX-970-Memory-Performance


----------



## Tsumi

Quote:


> Originally Posted by *Kuivamaa*
> 
> To be fair, it is easy to be tricked and think you simply hit the 4GB VRAM limit when stutters occur. And let's not forget this issue was revealed when users noticed that monitoring programs would report max VRAM usage at 3.5GB, and some even reported issues going past it. So a few did notice.


True. But it still does show that the vast majority of games do not yet need 4gb of VRAM. And that GPU horsepower is as much of a bottleneck at high resolutions as VRAM. And that you can still get awesome gaming performance with less than 4gb of VRAM.


----------



## sugalumps

Quote:


> Originally Posted by *GrimDoctor*
> 
> I'm in Australia. Had no issue at all, they even provided me with a case number! This was before the post from NV CS. This only strengthened it.
> 
> I did ask politely and didn't fly off any handle even though it's causing work related issues...maybe that helped...


That's probably it, unlike the other people going into the support thread with "I DESERVE A FREE T-SHIRT NVIDIA AND $100 BACK! NOW TAKE MY GPU BACK".


----------



## Menta

yep that's true, people screaming for a free game is just pathetic.

like i said before, i will settle for a 980 and pay the full difference.

i have my reasons; i don't think it's a bad thing


----------



## 2010rig

Quote:


> Originally Posted by *Menta*
> 
> yep that's true, people screaming for a free game is just pathetic.
> 
> like i said before, i will settle for a 980 and pay the full difference.
> 
> i have my reasons; i don't think it's a bad thing


I'm sure if you PM that NVIDIA rep, and let him know your intentions, he'll work with you to get you upgraded to a 980.


----------



## Menta

Quote:


> Originally Posted by *2010rig*
> 
> I'm sure if you PM that NVIDIA rep, and let him know your intentions, he'll work with you to get you upgraded to a 980.


the first thing i told him


----------



## MerkageTurk

I REPEAT: SOME PEOPLE HAVE EXPERIENCED ISSUES. HENCE, THIS WHOLE THING STARTED. So your claims of people not having issues are just wrong.

Plus, the 700 series is much more powerful


----------



## Mad Pistol

Quote:


> Originally Posted by *MerkageTurk*
> 
> I REPEAT: SOME PEOPLE HAVE EXPERIENCED ISSUES. HENCE, THIS WHOLE THING STARTED. So your claims of people not having issues are just wrong.
> 
> *Plus, the 700 series is much more powerful*


So is that why the GTX 970 outperforms the 780 and the 980 matches/outperforms the 780 Ti?

GM204, a 5-billion-transistor chip, is matching/outperforming GK110, a 7-billion-transistor chip... on the same process... while using less power.

Maxwell's performance isn't a debate. It is the current GPU king. No one here is going to argue that.


----------



## spacin9

Quote:


> Originally Posted by *GrimDoctor*
> 
> So the fact that Nvidia and Asus assisted me to get a refund from my retailer didn't happen? Ok


Ok bud... I'm going to PM this guy. You talked to Peter on the NV GeForce forum?


----------



## 2010rig

Quote:


> Originally Posted by *rdr09*
> 
> there is a 3.5GB.


Ready to stop spreading misinformation?

oh look at the 970 NOT using 4GB RAM.

Specifically notice what Ryan had to do just to get the card to use that much VRAM.

For those who were DYING for FCAT results, here you go.

Now quit your false advertising, AMD.

Quote:


> I spent nearly the entirety of two days testing the GeForce GTX 970 and trying to replicate some of the consumer complaints centered around the memory issue we discussed all week. I would say my results are more open ended than I expected. In both BF4 and in CoD: Advanced Warfare I was able to find performance settings that indicated the GTX 970 was more apt to stutter than the GTX 980. *But in both cases, the in-game settings were exceptionally high, going in the sub-25 FPS range and those just aren't realistic. A PC gamer isn't going to run at those frame rates on purpose and thus I can't quite convince myself to get upset about it.*


----------



## sugalumps

Quote:


> Originally Posted by *MerkageTurk*
> 
> I REPEAT SOME PEOPLE HAVE EXPERIENCED ISSUES, HENCE, THIS WHOLE THING STARTED. So your claims of people not having issues is just wrong.
> 
> Plus 700series are much more powerful


They did not gimp or intentionally hold back your 780; let this pathetic crusade/vendetta go.


----------



## Mad Pistol

Quote:


> Originally Posted by *sugalumps*
> 
> They did not gimp or intentionally hold back your 780; let this pathetic crusade/vendetta go.


Exactly. I checked my benchmarks after Maxwell was released, and behold... my GTX 780 is performing about 200-300 points higher in 3DMark Firestrike and about 1 FPS higher on average in Heaven than when I got it. Imagine my surprise when I realized my card is actually performing better than when I bought it.

GK110 cards are still very much flagship hardware. For you or anyone else to believe that nvidia is gimping performance on their top stack of cards from last year is ludicrous.


----------



## gamervivek

Quote:


> some false advertising about the specs of the card, a thing that no one actually buys a card based on.


Amazing, some other comment said that OCNers were above that. Amazing.


----------



## GrimDoctor

Quote:


> Originally Posted by *spacin9*
> 
> Ok bud... I'm going to PM this guy. You talked to Peter on the NV GeForce forum?


No, I emailed Nvidia directly (I would have called but couldn't find a local listing) and then called Asus Australia when Nvidia advised me to get a case number from Asus. Then I forwarded it all to my retailer, went in, and got the refund done on my credit card.


----------



## Menta

Quote:


> Originally Posted by *spacin9*
> 
> Ok bud... I'm going to PM this guy. You talked to Peter on the NV GeForce forum?


i spoke to that guy, maybe i will try the doctor's way :thumb:


----------



## Forceman

So I'm guessing there are going to be some good open box deals on Newegg here pretty soon? Score.


----------



## Mad Pistol

Quote:


> Originally Posted by *Forceman*
> 
> So I'm guessing there are going to be some good open box deals on Newegg here pretty soon? Score.


Seriously. I'll take an open box 970 for a few bucks less than retail. The card becomes an even sweeter deal. You guys returning your cards are making a huge mistake.


----------



## tpi2007

Quote:


> Originally Posted by *darkwizard*
> 
> Well this is out.
> 
> http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Looking-GTX-970-Memory-Performance


I can't help but feel that this is a half-hearted attempt to be on good terms with everybody, Nvidia and consumers alike. They say that there are some problems, but hey, the settings are so high and the fps so low that it's not so important, because it's not a realistic scenario. SLI? Oh yes, that would make the fps playable, right, but they didn't even promise such a review, because they claim there is too much variance from test to test and SLI introduces stutters of its own.

Hmm. Right. How about doing several runs with two GTX 980s and then two GTX 970s and comparing?

I mean, 4K is SLI territory for AAA games, and one of the GTX 970's selling points is to get 4K framerates for much cheaper, so of course people will SLI.

I'm still astounded at their justification for not doing SLI tests.


----------



## rony07

Quote:


> Originally Posted by *PostalTwinkie*
> 
> This thread is just disgusting, and a grand display of how terrible people/consumers are, and how spoiled people are.
> 
> People are so stupid as to claim the card is now a 3 GB card, when it really is 4 GB and there is 4 GB on it, that you can see with your own eyeballs. People are selfish and stupid enough to ask for a free upgrade to a 980, or a $100 voucher/refund because of "performance", just stupid and selfish!
> 
> If you want to be "fair" and "get what you paid for" then take what you paid for your 970, and whatever 1/8th of that is, ask for that as a refund! Guess what? That number won't be $100.
> 
> The feigned hatred and rage around this makes me want to puke. This entire thread has essentially shamed OCN to the core. OCN should have been a community that we, as enthusiasts/experts, could sit down and say _"Yes, they screwed up on the spec sheet printed on the box. They need to make that better. However, performance is still the same regardless!"_
> 
> The stupidity in acting like this has somehow caused a loss to the buyer, I just can't get behind that notion. Sorry, that is stupid and selfish to think. The card works as well as it did, if not better via drivers, than the day it left the store.
> 
> I fully back Nvidia taking action just on the grounds of they made a mistake in marketing, and need to handle it. But the anger, outrage, and stupidity behind the argument that performance is lesser is vile. Anyone that truly wants a refund, because of the misprint on the box, deserves it; people trying to argue the same thing due to performance need to just shut up.
> 
> Either way, anyone that returns it for refund "on principle", need to also return any games and accessories it came with. None of this, I keep the games, you give me money back, crap people are pulling off.
> 
> We knew about the performance of the card, we have previews/reviews on the card. Anyone that bought the card should have researched this performance; people bought this card for the performance. It still performs! That hasn't changed!
> 
> *Does it matter how it performs, as long as it performs?!*
> 
> It could have 5 SMX, as long as the numbers it put up where the same! 1,000 is a 1,000, 500 is 500. A ton of rocks weighs as much as a ton of feathers!
> 
> People complaining about Farcry 4 stuttering at 4 GB and trying to use that hot steaming mess as a baseline, don't make me laugh. That game is a stuttering mess across any platform!
> 
> EDIT:
> 
> Do you know what this is?
> 
> People smell blood in the water and all those that want to score on it have started circling.
> 
> I have said from day one that I support the idea that Nvidia refund those that truly want it over the spec misprint, everyone else can go fly a kite.


Quote:


> Originally Posted by *Mand12*
> 
> What we do understand is that there is *no benefit whatsoever* of them telling us the wrong thing. It wouldn't have mattered to the reviewers. They would have seen it, gone "Oh, hey, that's kinda weird" and gone on to also say "but the benchmarks are awesome!" and given the same recommendations, because in the end the cards still perform as they always have.
> 
> The only thing giving the wrong information does is make them look dumb and piss buyers off. You seriously think they *lied* in order to make that happen?


Two best posts in this entire thread, +REP to you both. As an owner of MSI GTX 970 Gaming 4G cards in SLI, am I disappointed the specs aren't as high as initially advertised? You bet. Does it affect the performance of the cards? Has it affected any realistic way I would use the cards? Not in the least. Heck, even when I tried to fill the VRAM, I was more disappointed by the performance of the cores than by the VRAM configuration. By that I mean I noticed no more ill effects using >3.5GB of VRAM than using <3.5GB. Proof below.



Even with VRAM filled >3.5GB, my experience was a smooth 35-40 FPS. However, that is still less than I enjoy gaming at, so I would turn down the settings to get a more enjoyable >60 FPS.


----------



## Redwoodz

Quote:


> Originally Posted by *2010rig*
> 
> Ready to stop spreading misinformaion?
> 
> oh look at the 970 NOT using 4GB RAM.
> 
> 
> 
> 
> 
> 
> 
> Specifically notice what Ryan had to do just to get the card to use that much VRAM.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> For those who were DYING for FCAT results, here you go.
> 
> 
> 
> 
> 
> Now quit your false advertising AMD


STILL waiting for that SLi test.


----------



## error-id10t

Quote:


> Originally Posted by *Ganf*
> 
> I'm running out of popcorn...


So am I!!!

Imagine what this would be if AMD had their new cards out at the moment...
Quote:


> Originally Posted by *Menta*
> 
> I know because there is no official statement, just some customer care guy who happened to pop into the middle of the thread.
> 
> So I went ahead and tested him; he knows nothing and has no vote on the matter.


They don't want an "official" statement, as that would then spread like wildfire, well beyond the normal tech sites. I guarantee there are plenty of people who are not aware of this today.


----------



## Menta

Quote:


> Originally Posted by *error-id10t*
> 
> So am I!!!
> 
> Imagine what this would be if AMD had their new cards out at the moment...
> They don't want an "official" statement, as that would then spread like wildfire, well beyond the normal tech sites. I guarantee there are plenty of people who are not aware of this today.


Very true. They are still trying to keep it quiet; I really understand that, but their silence is no good either.

They have good options:

A trade-up program, and sell more 980s.

Some even want free games? OK, give them a free game.

Minimal impact there. I like NV and don't want them to go broke.


----------



## mouacyk

Quote:


> Originally Posted by *Menta*
> 
> Minimal impact there. I like NV and don't want them to go broke.


Aww... shoot! I hadn't thought of that. With my past choice of GPU's, who would I ever go to? And the friends and family who I touted the 970 so highly to as a great performance deal -- gosh I don't want them to not trust me anymore. We must save NVidia.


----------



## nyxagamemnon

Quote:


> Originally Posted by *criminal*
> 
> I have worked in a large corporation before, one with 100,000+ employees. I don't remember anything on this scale ever happening there. I believe it can happen; I just don't believe it is an accident when the likes of Intel, AMD, or Nvidia do it. Their products are sold on specs and performance. Like someone else said, a GT 730 with 4GB seems pretty pointless to us, but I am sure some idiot comes along and buys the card without checking benchmarks because it has 4GB of RAM and his 650 Ti only has 1GB.


A $79 card has 4GB; the entire 4GB probably cost a few bucks.







Haha, NO excuse for low-RAM cards.


----------



## iSlayer

So many people who don't own 970s (AMD card owners, and a certain 780 Ti owner, I'm looking at you) getting so upset on behalf of 970 owners.

Feel free to put your opinion on this topic on a piece of paper and then throw it in the trash, where it belongs.
Quote:


> Originally Posted by *criminal*
> 
> Well I am of the opinion that if the true specs had been listed the card would not have sold so well. See how I can do that too. And I am willing to bet that If this had been AMD instead of Nvidia, some of you in here would have a different opinion on the matter.


It would have impacted my decision somewhat but the 290(x) was a no go for me so I'd have still gone 970.

I've tried to remain relatively unbiased and sane in this thread, I can only hope if it happened to AMD that'd still be the case.
Quote:


> Originally Posted by *criminal*
> 
> I am no fool. You are just that gullible.
> 
> 
> 
> 
> 
> 
> 
> 
> Thank you


I think you're just looking for a conspiracy, not to be ironic.
Quote:


> Originally Posted by *SandGlass*
> 
> You misunderstood: the documentation Intel and AMD each release about their GPUs is around one or two orders of magnitude more than Nvidia releases about theirs. That also applies pre-launch, as AMD and Intel both start releasing information and working on open-source drivers around a year and a half before release. Nvidia does not do this; their documentation pales in comparison.
> Of course it does, and there are quite a few reasons, both philosophical and practical. There are many people (mainly developers of open-source libraries; as you can see with Travis and GitHub, most developers do not even do Windows builds in testing) who object to having any closed-source software on their system. Binary drivers are also incompatible with the GPL. In practical terms, you sacrifice security and risk losing support after a product's EOL if you use binary blobs. Look at the open-source drivers for AMD: they still support cards released more than a decade ago. If there's a bug, it can be fixed. Having access to driver source is also extremely useful; with the exception of games, open-source drivers often have faster turnaround times fixing bugs (AMD has an IRC channel for the open-source drivers, and it's not unheard of for a bug to be discovered and fixed the same day). There's also the advantage of cross-platform compatibility. Look at the recent Freedreno drivers: want to run Linux? You're screwed if the manufacturer only provides Android drivers.


You know, as much as I love open-source software and am indebted to GNU for the GNU Compiler Collection (GCC), I'm beginning to think Orangey is right about the open-source movement having a few too many hipster pissants who are too cool for anything proprietary.

Not everything need be open source. Sometimes the closed source alternative is just better.

*cough* Windows *cough*
Quote:


> Originally Posted by *wooshna*
> 
> I think people are getting out of hand with this "did Nvidia lie or not know" argument.
> 
> Nvidia is a tech company.
> 
> Nvidia hires the best possible people to do the job.
> 
> Nvidia doesn't hire high school graduates into their marketing/PR/software/tech divisions.
> 
> People saying Nvidia didn't lie about the specs obviously didn't read the specs from September vs. 4 days ago.
> 
> People saying Nvidia had nothing to gain by lying (intentionally) or by being incompetent are lying to themselves.
> 
> Businesses are there to make money.
> 
> Who knows how much money Nvidia put into Maxwell's research, development and marketing; you would think they would throw in a few extra hundred dollars to proofread the spec sheet against the marketing labels.


Oh boy the conspiracy. Hello sir how offended are you that Nvidia didn't report the correct specs on your 970?

Oh? You don't have one? Well, your opinion is like a man's opinion on abortion: not very important, all things considered.
Quote:


> Originally Posted by *MerkageTurk*
> 
> nVidia misrepresented their own product, which warrants a refund or some sort of compensation.
> 
> From what I learned, never buy a card ending in x70; heck, they don't even provide driver support for their cards after three months.
> 
> A GTX 960 is also slow, even slower than a 770.


That's nice, buy a 970 and use it so your whiney opinion is worth more than a used tampon.
Quote:


> Originally Posted by *dukeReinhardt*
> 
> Oh no, the thought police are here. I'm sorry I have an opinion officer, I'll just go and chew on grass like everyone else.


Leave being offended to those that bought a 970.

http://i1.kym-cdn.com/entries/icons/original/000/016/971/plebcoms.PNG
What I'm reminded of.
Quote:


> Originally Posted by *2010rig*
> 
> I was implying that it was one of the few times I was up in arms with AMD's lies, and that was pretty severe, since they did in fact lie and hyped BD's *performance* for MONTHS, prior to release. Did you miss the part where they claimed *"Bulldozer was designed to be the HIGHEST performing SINGLE and MULTI-THREADED compute core in history.* How did that claim work out for them? *Apples and Oranges.*
> 
> That's exactly what I was saying earlier. We're used to x70 cards always being cut down in ROP's, SMM's, even lower bus width. That's nothing new.


Let's try not to remember BD. I've been getting more hopeful for Zen.
Quote:


> Originally Posted by *Mad Pistol*
> 
> I'm pretty much done with this thread. It's disgusting at how many people feel entitled to reparations for this.
> 
> Nvidia has offered a full refund on the product if you're not happy with it. That's by far the best you're going to get. Nvidia isn't going to pay you money to keep your card, and it is highly unlikely they will give you a break on the cost if you want to upgrade to a 980. The 980 is still around $550, and that's not going to change until AMD releases a newer, better product.
> 
> To think, this mess started with a test/benchmark that was unreliable and finicky, and then it exploded into a fire of consumerism because the card has only 7/8 L2 cache that was advertised. Funny how the card still performs exactly the same as it did when the specs were first released.
> 
> To clarify, I am not defending Nvidia, but you guys need to grow up. Nvidia has offered a solution to your "grievances." Either get your money back, or be happy with the card as is. Your choice.
> 
> EDIT: Also, I am keeping my G1 Gaming GTX 970. The performance is really freakin good, and that hasn't changed. That $360 we paid for it was worth every penny.


Tbh, a full refund is what they should do. However you weigh the damage, Nvidia messed up. Offering a full refund is reassurance that we can trust the company.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Forceman*
> 
> So I'm guessing there are going to be some good open box deals on Newegg here pretty soon? Score.


Quote:


> Originally Posted by *Mad Pistol*
> 
> Seriously. I'll take an open box 970 for a few bucks less than retail. The card becomes an even sweeter deal. You guys returning your cards are making a huge mistake.


*Hides behind the boxes*

I wasn't already here watching for it...........you don't see me!!!


----------



## awdrifter

How long is the refund period? I bought my card during Black Friday, so it's outside the regular refund window. I still haven't redeemed the games, so if they ask for it I can give the game code card back. Is it worth the hassle? This is the last time I'll buy a cut-down core; I should've gone with the GTX 780 Ti.


----------



## mouacyk

Quote:


> Originally Posted by *iSlayer*
> 
> Leave being offended to those that bought a 970.
> 
> http://i1.kym-cdn.com/entries/icons/original/000/016/971/plebcoms.PNG
> What I'm reminded of.


You speak of trust. It's perfectly in order for those of us who have a long history with NVidia (basically since the original GeForce; the company was relatively obscure before that) to weigh the value of such a future with them.
Quote:


> Originally Posted by *awdrifter*
> 
> How long is the refund period? I bought my card during Black Friday, so it's outside the regular refund window. I still haven't redeemed the games, so if they ask for it I can give the game code card back. Is it worth the hassle? This is the last time I'll buy a cut-down core; I should've gone with the GTX 780 Ti.


The refund period depends on your specific retailer. If you are serious about a refund, try your retailer first, citing as much published information as you can. If push comes to shove, try the CSRs who have been posting on various forums, especially the one on the GeForce forums.

Wow, the rep took back his words on offering to help people secure a refund. See his original offer that I quoted here.

Poor Korean run through Google Translate on the GeForce forums, but quite a meme in itself:
Quote:


> I am now being used by sli the GTX970.


----------



## spacin9

Quote:


> Originally Posted by *GrimDoctor*
> 
> No I emailed Nvidia direct (I would have called but couldn't find a local listing) and then called Asus Australia when Nvidia advised me to get a case number from Asus. Then, I forwarded it all to my retailer, went in, got the refund done on my credit card.


This is dumb logic, but I'm almost thinking they might be more cooperative outside of the US because of the premium overseas peeps pay for this stuff. But I'm glad you got your refund. I guess for $350 US, I'm probably going to have to take it or leave it.

I'm pretty much stuck I think.


----------



## TopicClocker

Quote:


> Originally Posted by *2010rig*
> 
> Ready to stop spreading misinformation?
> 
> oh look at the 970 NOT using 4GB RAM.
> 
> 
> 
> 
> 
> 
> 
> Specifically notice what Ryan had to do just to get the card to use that much VRAM.
> 
> 
> 
> 
> 
> For those who were DYING for FCAT results, here you go.
> 
> 
> 
> 
> 
> Now quit your false advertising AMD


People were saying the same thing about the 670, 680, 760 and 770 with 2GB of VRAM, until they started choking on textures, which are one of the largest consumers of VRAM.

It took them close to 2 years to begin running into VRAM problems, mainly in next-gen games, which is quite some time for a GPU, and they still have quite some horsepower. However, the PS4 and Xbox One may change things when true next-gen games release this year.

Frame Rating: Looking at GTX 970 Memory Performance
Quote:


> Is it possible that the 3.5GB/0.5GB memory pools are causing issues with games today at very specific settings and resolutions? Yes. Is it possible that it might do so for more games in the future? Yes. Do I think it is likely that most gamers will come across those cases? I honestly do not.


----------



## Final8ty

Quote:


> Originally Posted by *Seven7h*
> 
> No one was caught out. Being 5% slower overall in certain cases doesn't make or break any use case on the planet. Basically that would mean someone is upset because they need to upgrade 5% sooner.


I'll leave it at agreeing to disagree, because the example I gave was not about the 5% overall slowdown but about the stuttering, which is what most of the complaining users are suffering from beyond any FPS loss. There is more to a gaming experience than raw FPS, which is why this whole thing came about in the first place: users have been complaining about stuttering for months already.

I remember one time when an EVE Online update had issues that made my 300+ FPS feel like 20 FPS because the stuttering was so bad.


----------



## tpi2007

Quote:


> Originally Posted by *TopicClocker*
> 
> Frame Rating: Looking at GTX 970 Memory Performance
> Quote:
> 
> 
> 
> Is it possible that the 3.5GB/0.5GB memory pools are causing issues with games today at very specific settings and resolutions? Yes. Is it possible that it might do so for more games in the future? Yes. Do I think it is likely that most gamers will come across those cases? I honestly do not.

He does not because he doesn't want to test SLI, which is what really matters with the current games. And who knows if it won't matter (i.e., getting playable framerates) for single cards in forthcoming games too.

Add to that, even SLI at 4K doesn't have to hit a solid 60 fps to play smoothly anymore; with 4K G-Sync monitors you'd be comfortable getting 40-45 fps with a pair of GTX 970s. It seems rather obvious that they will have to test SLI sooner or later.


----------



## Final8ty

Quote:


> Originally Posted by *tpi2007*
> 
> He does not because he doesn't want to test SLI, which is what really matters with the current games. And who knows if it won't matter (i.e., getting playable framerates) for single cards in forthcoming games too.
> 
> Add to that, even SLI at 4K doesn't have to hit a solid 60 fps to play smoothly anymore; with 4K G-Sync monitors you'd be comfortable getting 40-45 fps with a pair of GTX 970s. It seems rather obvious that they will have to test SLI sooner or later.


A poor excuse for not doing SLI, as it's very likely to exacerbate the issue; those same excuses were not a problem for him when it came to finding CrossFire stuttering.


----------



## Orthello

Well, I think this just increased the value of my 780 Ti Classified, which is on auction now. I was pricing it under a 970 here... no need now; up goes my reserve.

I wonder how many passed the 780 Ti up for the 970, feeling the 4GB was the way to go...


----------






## rdr09

Quote:


> Originally Posted by *2010rig*
> 
> Ready to stop spreading misinformation?


just going by the title.


----------



## Vesku

Some SLI tests from an enthusiast:

http://hardforum.com/showpost.php?p=1041392263&postcount=619

Someone also going by GoldenTiger linked this in the comments on PCPer article looking at 970 frametimes:



PCPer also ran into the issue of the GTX 970 avoiding the use of that 512MB segment wherever possible in COD:AW.
Quote:


> The "Very High" settings run set texture resolution, normal map resolution and specular map resolution to High; ambient occlusion is Off and post-process anti-aliasing is Off as well. Even with those settings, the GTX 970 is pushing 3.6GB of graphics memory consumption, crossing the 3.5GB barrier and toeing into the 500MB window. The problem is that the GTX 980 is using the full 3.97GB of its memory for the same combination of settings.
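
The behavior described above, where the driver fills the fast 3.5GB segment first and only spills into the slow 0.5GB segment when forced, can be modeled with a toy allocator (entirely hypothetical; NVIDIA's actual driver heuristics are not public):

```python
# Toy model of segment-priority VRAM allocation (hypothetical; the real
# NVIDIA driver heuristics are not documented). Requests land in the fast
# 3.5 GB segment until it is full, then spill into the slow 0.5 GB segment.

FAST_CAPACITY_MB = 3584   # 3.5 GB segment (196 GB/s)
SLOW_CAPACITY_MB = 512    # 0.5 GB segment (28 GB/s)

def place_allocations(sizes_mb):
    """Return (fast_used, slow_used, failed) after placing each request."""
    fast_used = slow_used = 0
    failed = []
    for size in sizes_mb:
        if fast_used + size <= FAST_CAPACITY_MB:
            fast_used += size
        elif slow_used + size <= SLOW_CAPACITY_MB:
            slow_used += size
        else:
            failed.append(size)   # would be evicted or paged in reality
    return fast_used, slow_used, failed

# A workload like PCPer's COD:AW test, pushing roughly 3.6 GB in total:
fast, slow, failed = place_allocations([1024, 1024, 1024, 512, 128])
print(fast, slow, failed)  # -> 3584 128 []
```

In this toy run the last 128 MB request spills into the slow segment, matching the "toeing into the 500MB window" behavior PCPer reports; anything the slow segment cannot hold would have to be evicted or paged.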


----------



## 2010rig

Quote:


> Originally Posted by *rdr09*
> 
> just going by the title.


Well played.


----------



## Final8ty

Quote:


> Originally Posted by *GoldenTiger;1041392256*
> Did some more testing using Shadow of Mordor, SLI enabled and disabled, sub-3500MB and over-3500MB VRAM consumption. Frametimes stay within normal variation/acceptable consistency in single-card mode regardless of VRAM, but going over 3500MB in SLI causes wild and rampant stutters/hitches, with vastly fluctuating frametimes to match.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Long story short, I have to agree that for single-card use, while it is a big false advertisement and spec change, it may not have a giant practical impact (at least from what I can see so far... unless you may want to go dual-card later, for example), but in SLI it is a very real and major issue.


http://hardforum.com/showthread.php?t=1849838&page=21


----------



## Swolern

Quote:


> Originally Posted by *2010rig*
> 
> Ready to stop spreading misinformation?
> 
> oh look at the 970 NOT using 4GB RAM.
> 
> 
> 
> 
> 
> 
> 
> Specifically notice what Ryan had to do just to get the card to use that much VRAM.


Um, am I missing something? Because the 970 VRAM use you quoted for Advanced Warfare is only 3.6GB, while the 980 is using all 4GB. Maybe it's just how some new games are coded, where the 970 will be unable to efficiently use the extra 512MB.

_Quoted from Ryan:_
Quote:


> You can see that indeed there are some additional spikes in the frame times of Advanced Warfare when running on the GTX 970 at these very intense image quality settings. Those spikes are nearly non-existent on the GTX 980, the card that is using 300-400MB more memory during our testing in that scenario.


The point is, if the 970 had the 4GB memory bandwidth it was supposed to have, then 970 users would not get these extra stutters in scenarios like these.
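
The stutter in question shows up as frametime spikes rather than a lower average FPS, which is why tools like FCAT matter here. A toy version of that kind of analysis (illustrative only; real FCAT extracts per-frame times from colored overlay bars in captured video, which is far more involved):

```python
# Toy frametime-spike detection in the spirit of FCAT / PCPer's Frame Rating
# (illustrative; spike_factor and the sample data are made-up assumptions).

from statistics import median

def stutter_frames(frametimes_ms, spike_factor=2.0):
    """Flag frames whose frametime exceeds spike_factor times the median."""
    baseline = median(frametimes_ms)
    return [i for i, ft in enumerate(frametimes_ms)
            if ft > spike_factor * baseline]

# A steady ~16.7 ms (60 FPS) stream with two spikes, like the ones
# PCPer observed on the 970 past 3.5 GB:
times = [16.7] * 5 + [48.0] + [16.7] * 5 + [40.1] + [16.7] * 3
print(stutter_frames(times))  # -> [5, 11]
```

An average-FPS counter over this stream would barely move, yet frames 5 and 11 are exactly the hitches players feel, which is the whole argument for frametime analysis over raw FPS.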


----------



## badrapper

It doesn't look good for people who have SLI or wanted to do SLI in the future. And it doesn't look good for people who thought they could keep this card for two or more years.









I predict most people keeping this card for a year max, and the card losing a lot of its value after a bit (from people selling on eBay), as you can't SLI these with any benefit.


----------



## raghu78

It's looking quite bad for GTX 970 SLI users who bought the cards for 1440p and 4K gaming and try to max settings; they are the most likely to hit that last 0.5GB often and get hurt the most. They trusted Nvidia, paid with their hard-earned money, and Nvidia conveniently failed to mention the actual specs.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Vesku*
> 
> *Someone also going by GoldenTiger* linked this in the comments on PCPer article looking at 970 frametimes:





















The Tiger pounces again!!!!










Nowhere on the interwebz is safe from his stalk!!!


----------



## Exilon

From what I'm reading at Anandtech, the only advantage of keeping the 8th memory controller active is that the GPU can issue a write to one partition and a read from the other partition simultaneously. This is in contrast to a 7-channel, 224-bit 4GB GPU, which would only be able to read or write from one partition at a time.

So the GPU is capable of doing 196 GB/s read + 28 GB/s write, or 196 GB/s write + 28 GB/s read, and would want to do that kind of operation as much as possible. What a driver optimization nightmare!
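
Those numbers can be sanity-checked with a quick back-of-the-envelope calculation (assuming 7 Gbps effective GDDR5 on eight 32-bit channels, i.e. 28 GB/s per channel; these figures come from the post above and published reviews, not from any official breakdown):

```python
# Back-of-the-envelope bandwidth for the GTX 970's segmented memory,
# based on the figures discussed above (illustrative assumptions):
# 7 Gbps effective GDDR5 on eight 32-bit channels.

CHANNEL_WIDTH_BITS = 32
EFFECTIVE_RATE_GBPS = 7                 # 7 Gbps per pin (effective)
per_channel = CHANNEL_WIDTH_BITS * EFFECTIVE_RATE_GBPS / 8   # GB/s

fast_channels = 7                       # back the 3.5 GB segment
slow_channels = 1                       # back the 0.5 GB segment

fast_bw = fast_channels * per_channel   # 196 GB/s
slow_bw = slow_channels * per_channel   # 28 GB/s

print(f"per channel:     {per_channel:.0f} GB/s")
print(f"3.5 GB segment:  {fast_bw:.0f} GB/s")
print(f"0.5 GB segment:  {slow_bw:.0f} GB/s")
# Best case, as noted above: a read stream on one segment concurrent with
# a write stream on the other, e.g. 196 GB/s read + 28 GB/s write.
print(f"concurrent best: {fast_bw + slow_bw:.0f} GB/s")
```

The 224 GB/s "concurrent best" figure is the number on the spec sheet; the catch is that it is only reachable when traffic happens to split across both segments in opposite directions.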


----------



## TopicClocker

Quote:


> Originally Posted by *Vesku*
> 
> Some SLI tests from an enthusiast:
> 
> http://hardforum.com/showpost.php?p=1041392263&postcount=619
> 
> Someone also going by GoldenTiger linked this in the comments on PCPer article looking at 970 frametimes:
> 
> 
> 
> PCPer also ran into that issue of the GTX 970 not wanting to use that 512MB of Cache if at all possible with COD:AW.


No, that IS GoldenTiger.

Quote:


> Originally Posted by *PostalTwinkie*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The Tiger pounces again!!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nowhere on the interwebz is safe from his stalk!!!


lol you big meanies making fun of him!

I've heard him mentioned on OCN, OCUK, TechPowerUp and Anandtech; it seems his tests are drawing quite some attention.
A lot of people are performing lots of tests on their GTX 970s.


----------



## hyp36rmax

GoldenTiger was the main guy praising Nvidia and his 970s in SLI at 4K.... Where's he been? haha


----------



## 2010rig

Quote:


> Originally Posted by *Swolern*
> 
> Um, am I missing something? Because the 970 VRAM use you quoted for Advanced Warfare is only 3.6GB, while the 980 is using all 4GB. Maybe it's just how some new games are coded, where the 970 will be unable to efficiently use the extra 512MB.
> 
> _Quoted from Ryan:_
> The point is, if the 970 had the 4GB memory bandwidth it was supposed to have, then 970 users would not get these extra stutters in scenarios like these.


Yeah, good points. I wonder if that's something new drivers will address. BF4 used up all 4GB almost linearly with the 980, so it's definitely capable of utilizing it.


----------



## ondoy

AMD exchange program in effect.... now.

Anyone returning their GTX970 and wanting a great deal on a Radeon with a full 4GB please let us know.


----------



## Baghi

lol Roy is ridiculous.


----------



## wermad

Quote:


> Originally Posted by *ondoy*
> 
> AMD exchange program in effect.... now.
> 
> Anyone returning their GTX970 and wanting a great deal on a Radeon with a full 4GB please let us know.










schweets


----------



## PostalTwinkie

Quote:


> Originally Posted by *ondoy*
> 
> AMD exchange program in effect.... now.
> 
> Anyone returning their GTX970 and wanting a great deal on a Radeon with a full 4GB please let us know.


I bet they are giving people a handful of game codes, maybe a cash-off coupon code? Man, I am pretty interested to know what they are doing.


----------



## Cyro999

I threw a few lines out to see what's up; a discount code on a £230 290 Tri-X is quite appealing against a £470 980.

Of course, if you Nvidia guys wanted to offer a cheaper 980 step-up...


----------



## Johnny Rook

Quote:


> Originally Posted by *Swolern*
> 
> Um, am I missing something? Because the 970 VRAM use you quoted for Advanced Warfare is only 3.6GB, while the 980 is using all 4GB. Maybe it's just how some new games are coded, where the 970 will be unable to efficiently use the extra 512MB.


Yes, you are missing something. Just because the GTX 980 is loading 4GB of assets into memory doesn't mean it's using them all at the same time; that is not how memory allocation works. Anyway, that's not the point. The fact the GTX 970 doesn't load all 4GB of assets has to do with the way Windows and the nVIDIA drivers are handling the memory.

Quote:


> Originally Posted by *Swolern*
> 
> The point is, if the 970 had the 4GB memory bandwidth it was supposed to have, then 970 users would not get these extra stutters in scenarios like these.


The GTX 970 HAS 4GB of VRAM and uses it all; management of those 4GB is done using heuristics. PCPer's results show the GTX 970 having the same performance degradation above 3.5GB as the GTX 980, AND they also show otherwise, depending on the game. Whether this comes down to heuristics, VRAM, or the lack of CUDA cores / SMMs is open to debate. That is what concerns me the most: so far, there's no absolute proof of anything.
It is true the results tend to show that in most gaming scenarios there's no problem with the GTX 970. However, the CoD results show something odd. Ryan leaves the door open to several causes, including heuristics, in which case a driver update will fix it. So, at this point in time, if a driver can fix the GTX 970's odd behaviour in CoD, I will give nVIDIA the same opportunity I gave AMD when Crossfire was broken, and that is to come up with a driver fix. That's all I give. If the driver doesn't fix it, I can't trust nVIDIA's driver team to fix other potential problems in future games.


----------



## error-id10t

Quote:


> Originally Posted by *Cyro999*
> 
> I threw a few lines out to see what's up; a discount code on a £230 290 Tri-X is quite appealing against a £470 980.
> 
> Of course, if you Nvidia guys wanted to offer a cheaper 980 step-up...


Man over here it'd have to be at least the 290X Matrix ROG. The 290X tri-x is $60 cheaper.

NOTE: I have no idea how these cards actually perform, just going by price.


----------



## Silent Scone

Yeah, no. Sorry, that's wrong. I can replicate VRAM issues in AW, SOM and the like just like 970 owners can. Read Guru3D's piece on it. As far as I'm concerned, the only real issue is the lying and the card not being sold as specified. The performance issues people are experiencing are a complete misinterpretation of what is actually happening when they're getting stutter.


----------



## Cybertox

It's really entertaining seeing all these Nvidia fanboys defending Nvidia so miserably. For me it's not a big deal; Nvidia screwed up, that is for sure, but that could happen to anyone, just in a slightly different way. The 970 is still a good GPU nonetheless, but some of the arguments brought up by fanboys are just ridiculous.


----------



## Cyro999

Quote:


> Originally Posted by *error-id10t*
> 
> Man over here it'd have to be at least the 290X Matrix ROG. The 290X tri-x is $60 cheaper.
> 
> NOTE: I have no idea how these cards actually perform, just going by price.


My friend with a 290 wins some benches and loses some; however, my card was about 1.3x more expensive.


----------



## clerick

I've been a long-time fan of Nvidia, but this just leaves a really bad taste in my mouth. If the card had been advertised with 3.5GB and X ROPs and I bought that, I'd have no problem. But advertising 4GB and then being unable to actually use it? Shameful.


----------



## Silent Scone

Quote:


> Originally Posted by *clerick*
> 
> I've been a long-time fan of Nvidia, but this just leaves a really bad taste in my mouth. If the card had been advertised with 3.5GB and X ROPs and I bought that, I'd have no problem. But advertising 4GB and then being unable to actually use it? Shameful.


Yep, it's a sad state of affairs. I'm starting to think this is all part of a massive effort to recoup the financial loss from the lack of progress on the process shrink. It was reported that NVIDIA had become irritated and 'deeply unhappy' with TSMC, as 20nm was essentially worthless; so much so that TSMC has billed both AMD and NVIDIA for risk production on multiple occasions. With this front of mind, it is not a stretch to suggest that with Maxwell there is some loss to be raked back by designing a memory subsystem that lets them do as they have done with GM204, allowing QC to reuse poor silicon off the line.

From this point on, I seriously doubt this went undisclosed by pure accident. They felt (and still do, I think) that it is entirely up to them how they neuter their slower products, and that you shouldn't be too bothered by how this is achieved. This could also be completely separate from the incorrectly reported ROPs; I believe that may have been the nail in the coffin, and not entirely on purpose, since sales and marketing aren't going to understand the impact without fully understanding what the GPU is doing.


----------



## fleetfeather

Quote:


> Originally Posted by *ondoy*
> 
> AMD exchange program in effect.... now.
> 
> Anyone returning their GTX970 and wanting a great deal on a Radeon with a full 4GB please let us know.


This is the sort of stuff I genuinely like from Roy. I love his super aggressive stance when it doesn't come with sleazy marketing alongside it.


----------



## provost

Quote:


> Originally Posted by *Silent Scone*
> 
> Yep, it's a sad state of affairs. I'm starting to think this is all part of a massive effort to recoup the financial loss from the lack of progress on the process shrink. It was reported that NVIDIA had become irritated and 'deeply unhappy' with TSMC, as 20nm was essentially worthless; so much so that TSMC has billed both AMD and NVIDIA for risk production on multiple occasions. With this front of mind, it is not a stretch to suggest that with Maxwell there is some loss to be raked back by designing a memory subsystem that lets them do as they have done with GM204, allowing QC to reuse poor silicon off the line.
> 
> From this point on, I seriously doubt this went undisclosed by pure accident. They felt (and still do, I think) that it is entirely up to them how they neuter their slower products, and that you shouldn't be too bothered by how this is achieved. This could also be completely separate from the incorrectly reported ROPs; I believe that may have been the nail in the coffin, and not entirely on purpose, since sales and marketing aren't going to understand the impact without fully understanding what the GPU is doing.


I hear you, but it's not just that, is it? What's their excuse for falling way behind AMD on providing appropriate driver support for legacy cards (and new cards), particularly for multi-GPU setups? AMD cards scale much better in CrossFire than Nvidia's do in SLI. The 970 fiasco is another example of Nvidia having taken its eye off the ball in the discrete consumer GPU segment, and being complacent due to the lack of real competition to Maxwell. But guess what? AMD will eventually respond, and I would be grateful for that as a consumer.
Nvidia's customers are unlikely to forget this sleight of hand with the 970, whether deliberate or not, and continued dissatisfaction among the most ardent of Nvidia's supporters (and I am not one of them) over the lack of support would most likely turn a good chunk of them away from NV to AMD. But this is how competition is supposed to work, and it's probably the wakeup call NV needs to jolt it out of the arrogant, complacent slumber it seems to have fallen into (or not, who knows... lol)


----------



## Silent Scone

Quote:


> Originally Posted by *provost*
> 
> I hear you, but it's not just that, is it? What's their excuse for falling way behind AMD on providing appropriate driver support for legacy cards (and new cards), particularly for multi-GPU setups? AMD cards scale much better in CrossFire than Nvidia's do in SLI. The 970 fiasco is another example of Nvidia having taken its eye off the ball in the discrete consumer GPU segment, and being complacent due to the lack of real competition to Maxwell. But guess what? AMD will eventually respond, and I would be grateful for that as a consumer.
> Nvidia's customers are unlikely to forget this sleight of hand with the 970, whether deliberate or not, and continued dissatisfaction among the most ardent of Nvidia's supporters (and I am not one of them) over the lack of support would most likely turn a good chunk of them away from NV to AMD. But this is how competition is supposed to work, and it's probably the wakeup call NV needs to jolt it out of the arrogant, complacent slumber it seems to have fallen into (or not, who knows... lol)


Because, quite simply, they're too focused on Tegra at the moment. That is the long and short of it. There has been no real push behind SLI for a while now, driver performance in this department has slipped, and issues have crept in revision after revision with no word on getting them fixed.

NVIDIA seem to operate on the principle that if something doesn't work, it is up to the developer to fix it. That might well be true if you're looking to point the legal finger, but it doesn't bode well for your own image or your own products. For example, if they had a representative who was regularly involved in the community, people would have gone to said person and asked why they were getting strange VRAM readouts from third-party applications - much sooner than now. Maybe this could have been cleared up a lot sooner, with less fallout.

From a performance perspective, given the power envelope, GM204 is an amazing product. I'm very happy with my GTX 980, but that only goes so far in an industry where continuing support is essential - which, evidently, over the last few months we've not really been getting. Nvidia don't really have a public face, and I think in times like this, when they find themselves grovelling, that really shows.


----------



## skupples

Only good thing about that is the eventual Denver core hitting GPUs.

Either way, if SLI continues to slip, AMD EYEFINITY here I come!!

Been a long time coming, but it's time to go back. Haven't run AMD in my main rig since the MechWarrior days.


----------



## greydor

Quote:


> Originally Posted by *skupples*
> 
> Been a long time coming, but it's time to go back. Haven't run AMD in my main rig since the MechWarrior days.


I just can't lose the [perceived] quality drivers, PhysX (no joke, as in Batman), G-Sync, and the random performance improvements in various games. It's not about being a fanboy; if features were at parity between the two competitors in the duopoly, I would likely go to AMD due to pricing.


----------



## provost

Quote:


> Originally Posted by *greydor*
> 
> I just can't lose the [perceived] quality drivers, PhysX (no joke, as in Batman), G-Sync, and the random performance improvements in various games. It's not about being a fanboy; if features were at parity between the two competitors in the duopoly, I would likely go to AMD due to pricing.


Perhaps Nvidia is putting too much stock in the feature set, but these features may not be as important to some as you may think...
Some would much prefer continued driver support to improve performance (and not just for Maxwell) over a "feature set" that is hardly of any use.
As for G-Sync, having the flexibility to switch GPU vendors is exactly why I would never lock myself into a single provider with a monitor purchase tied to any proprietary tech.
If there is monitor tech that is universal, such as Adaptive-Sync, great; if not, I am prepared to live without a tech that locks my display(s) to any single GPU provider. I refuse to further limit my choices and subject myself to what amounts to an effective monopoly, within what is already a limited-choice duopoly.


----------



## Menta

Who pissed off golden tiger so much that he won't come in?


----------



## Baghi

Except for PhysX and G-Sync maybe, none of the features perform as advertised.

Remember this?


----------



## mtcn77

Quote:


> Originally Posted by *Exilon*
> 
> From what I'm reading from Anandtech, the only advantage that keeping the 8th controller active is that the GPU can issue a write to one partition and read from the other partition simultaneously. This is contrast to a 7-channel 224-bit 4GB GPU which would only be able to read or write from one partition or another.
> 
> So the GPU is capable of doing 196 GB/s read + 28 GB/s write or 196 GB/s write + 28 GB/s read and would want to do that kind of operation as much as possible. What a driver optimization nightmare!


Finally we agree. The card is just in no way in the same tier as the 290, 290X, 295X2 & 980. That last bit of read asynchronism limits the card to 3.5GB reads, which, afaik, matters most under HEAVY antialiasing. Those textures aren't going to be sampled anywhere other than the ROPs & VRAM. >> Which is why those DSR benchmarks are still nothing more than a smoke screen pulled over the real issues.


----------



## skupples

TXAA, MFAA... They're all jokes. Same for AMD's proprietary AA techniques.

G-Sync however is glorious. I'm a believer after seeing it next to a 144Hz strobe, both systems running on the exact same hardware, 780s.

Adaptive-Sync (FreeSync?)? NV will never adopt it, or it will be YEARS.


----------



## Silent Scone

Quote:


> Originally Posted by *Menta*
> 
> 
> 
> Who pissed off golden tiger so much that he won't come in?


Running the Ultra texture preset on a 4GB card. People really, really need to stop using SOM as a valid example.


----------



## Silent Scone

Quote:


> Originally Posted by *skupples*
> 
> TXAA, MFAA... They're all jokes. Same for AMD's proprietary AA techniques.
> 
> G-Sync however is glorious. I'm a believer after seeing it next to a 144Hz strobe, both systems running on the exact same hardware, 780s.
> 
> Adaptive-Sync (FreeSync?)? NV will never adopt it, or it will be YEARS.


TXAA looked OK in AC: Black Flag.

...that's all. Just in case you were expecting me to defend it any further than that.


----------



## skupples

Didn't the developers of Shadow of Mordor straight up come out and say that 4GB simply isn't going to cut it for extremely smooth gameplay on ultra settings? I mean, I understand that people are showing an exaggerated situation, I get it. I get that people are trying to show that it runs even worse than something else, but it seems like people are just rating how badly everything runs. I wouldn't want to play that game at ultra settings with either a 980 or a 970, based on what the frame times show.

Did Shadow of Mordor ever get proper SLI or Surround support?


----------



## mtcn77

Quote:


> Originally Posted by *skupples*
> 
> TXAA, MFAA... They're all jokes. Same for AMD's proprietary AA techniques.
> 
> G-Sync however is glorious. I'm a believer after seeing it next to a 144Hz strobe, both systems running on the exact same hardware, 780s.
> 
> Adaptive-Sync (FreeSync?)? NV will never adopt it, or it will be YEARS.


Wait, MFAA has its advantages > it cleans up subpixel interpolation, which I cherish for primary colours.


----------



## skupples

Quote:


> Originally Posted by *mtcn77*
> 
> Wait, MFAA has its advantages > it cleans up subpixel interpolation, which I cherish for primary colours.


I wouldn't know since Nvidia made the asinine choice to lock Kepler owners out.


----------



## Baghi

Quote:


> Originally Posted by *skupples*
> 
> TXAA, MFAA... They're all jokes. Same for AMD's proprietary AA techniques.
> 
> G-Sync however is glorious. I'm a believer after seeing it next to a 144Hz strobe, both systems running on the exact same hardware, 780s.
> 
> Adaptive-Sync (FreeSync?)? NV will never adopt it, or it will be YEARS.


Not just G-Sync; PhysX is also a legitimate reason to prefer NVIDIA cards over their AMD counterparts.

EDIT:
I hope this doesn't turn out to be another "AMD vs. NVIDIA" thread.


----------



## criminal

Quote:


> Originally Posted by *greydor*
> 
> I just can't lose the [perceived] quality drivers, PhysX (no joke, as in Batman), G-Sync, and random performance improvements in various gains. It's not to be a fanboy; if features were on parity to each of the competitors in the duopoly, I would likely go to AMD due to pricing.


The drivers for AMD are more than fine now. I love PhysX myself, but not enough that it will lock me into an Nvidia card anymore, and I am not paying the money to get a G-Sync monitor right now. Kepler may not be getting gimped by Nvidia, but Nvidia doesn't seem to be showing it much love anymore either. Meanwhile the 7970, aka 280X, is still getting plenty of love from AMD. I think I have made up my mind and will give the 380X/390X a shot this go around.


----------



## skupples

Quote:


> Originally Posted by *Baghi*
> 
> Not just G-Sync; PhysX is also a legitimate reason to prefer NVIDIA cards over their AMD counterparts.
> 
> EDIT:
> I hope this doesn't turn out to be another "AMD vs. NVIDIA" thread.


I would have agreed with that, 5 years ago.
Quote:


> Originally Posted by *criminal*
> 
> The drivers for AMD are more than fine now. I love PhysX myself, but not enough that it will lock me into an Nvidia card anymore, and I am not paying the money to get a G-Sync monitor right now. Kepler may not be getting gimped by Nvidia, but Nvidia doesn't seem to be showing it much love anymore either. Meanwhile the 7970, aka 280X, is still getting plenty of love from AMD. I think I have made up my mind and will give the 380X/390X a shot this go around.


Same

Nvidia effectively sunsetting driver updates for GK110, beyond SLI bits, is the nail in the coffin for me.
Which is a damn shame, as I was really looking forward to the 1440p 144Hz G-Sync IPS screen.


----------



## Baghi

Quote:


> Originally Posted by *skupples*
> 
> I would have agreed with that, 5 years ago.


Not if you were still playing Mirror's Edge.


----------



## maarten12100

Quote:


> Originally Posted by *Silent Scone*
> 
> Running the Ultra texture preset on a 4GB card. People really, really need to stop using SOM as a valid example.


How so? It shows that it is not just a benchmark being affected, but actual games too. In other words, there is a noticeable problem with the limited bandwidth of that last ~500MB.
Running out of VRAM would yield the same spikes as shown in goldentiger's post, so this makes the card perform as if it had only 3.5GB, though it obviously has 4GB.

Marketing claims:
- 4GB VRAM: true
- 224GB/s for the whole 4GB: false


----------



## skupples

Quote:


> Originally Posted by *Baghi*
> 
> Not if you were still playing Mirror's Edge.


See, that's the issue. YES, it looks great in the 1 out of 10-20 games that actually use NV GPU-rendered PhysX.

I'm being very specific for a reason. I don't want someone copy-pasting the 2739338682 games with PhysX support, as most of those don't use NV GPU PhysX.

And really, the dumbest thing they ever did was locking out AMD + NV PhysX setups.


----------



## Silent Scone

Quote:


> Originally Posted by *maarten12100*
> 
> How so? It shows that it is not just a benchmark being affected, but actual games too. In other words, there is a noticeable problem with the limited bandwidth of that last ~500MB.
> Running out of VRAM would yield the same spikes as shown in goldentiger's post, so this makes the card perform as if it had only 3.5GB, though it obviously has 4GB.
> 
> Marketing claims:
> - 4GB VRAM: true
> - 224GB/s for the whole 4GB: false


I wouldn't worry yourself too much if you still think that after everything that has been said about SOM's different presets. You don't seem to understand.

From Euro Gamer:
Quote:


> So without further ado, we present a selection of comparisons of the game's opening scenes, captured at medium, high and ultra texture settings with all other settings ramped up as high as they go. Monolith recommends a 6GB GPU for the highest possible quality level - and we found that at both 1080p and 2560x1440 resolutions, the game's art ate up between 5.4 to 5.6GB of onboard GDDR5


Bob Roberts, Monolith technical director.
Quote:


> "To make a world as rich and detailed, especially the characters - we put so much into the enemies with the nemesis system and everything - our artists are making things at an outrageously high fidelity," he said.
> 
> "They are on monster PCs making the highest possible quality stuff and then we find ways to optimise it, to fit onto next-gen, to fit onto PCs at high-end specs. Then obviously there's going to be that boundary where our monster development PCs are running it OK - but why not give people the option to crank it up? It makes sense to get it out into the world there - we have it, we built it that way to look as good as possible. You might as well, right?"


----------



## maarten12100

https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4438090/#4438090

They say things like the reviewers didn't report it correctly, while they themselves never said a word about this memory configuration. Also, their site still lists the card as 224GB/s and makes no mention of a fast/slow config (the fast/slow memory config claims are obviously lame, made-up things conjured into existence to defend their actions).

Look at how they got Tom's to buy into their scam - to have Tom's admit they listed it wrong, while Nvidia's own freaking site still lists the old numbers.

Nvidia lists it as:
Quote:


> GTX 970 Memory Specs:
> Memory Clock: 7.0 Gbps
> Standard Memory Config: 4 GB
> Memory Interface: GDDR5
> Memory Interface Width: 256-bit
> Memory Bandwidth (GB/sec): 224


http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970/specifications
Tom's, after being bribed, lists it as:
Quote:


> 224 GB/s aggregate
> 196 GB/s (3.5 GB)
> 28 GB/s (512MB)


http://www.tomshardware.com/news/nvidia-geforce-gtx-970-specifications,28464.html

Right, Nvidia, if those are the actual numbers, then why doesn't it say so on your own specification site? Lawsuit time?


----------



## Baghi

Quote:


> Originally Posted by *Silent Scone*
> 
> I wouldn't worry yourself too much if you still think that after everything that has been said about SOM's different presets. You don't seem to understand.


Do you have an explanation for retailers to give their customers, to convince them not to return their fully functional graphics cards?


----------



## skupples

Please stop threatening lawsuits unless you intend to be the ringleader and get the class action started.

Too much flab, not enough jab.

Not trying to be a dick, I'm being completely serious. Every other post is about lawsuit this, lawsuit that. If you're so hell-bent on suing AND think you have a case, GO GET A CONSULTATION FROM CLASS ACTION SPECIALISTS.

Would be hilarious to see NV pay through the nose; only issue is that the lawyers would take 60-65% and the rest would be split between thousands and thousands of people, with a payment plan of $5 an installment for life.

One thing is for sure: NV's legal team has been plugging away at this from probably before the news went out around the world. They probably even evaluated it BEFORE printing the specs.

It's blatantly obvious word gymnastics, which means legal was consulted before printing.


----------



## Silent Scone

Sorry, you're both going off on a tangent that has nothing to do with what I was talking about, or with what you were initially talking about either. This has already been discussed at length.

Angry, mistreated, outraged! All of these words make my feelings toward this test scenario that requires 6GB of memory valid on my 4GB*3.5GB card.


----------



## SKYMTL

Quote:


> Originally Posted by *maarten12100*
> 
> https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4438090/#4438090
> 
> They say things like the reviewers didn't act correct. While they didn't speak about this config of memory. Also their site still lists the card as 224GB/s and doesn't make mention of a fast slow config. (the fast slow memory config claims are obviously lame made up things called into life to defend their actions)
> 
> Look at how they got Tom's to buy into their scam. To have Tom admit they listed it wrong while Nvidia's own freaking site lists it as:
> 4GB 224GB/s not as the actually spec that Tom posted after Nvidia bribed them into this scheme:
> "224 GB/s aggregate
> 196 GB/s (3.5 GB)
> 28 GB/s (512MB)"
> 
> Right Nvidia if those are the actual numbers then why doesn't it say so on your own specification site. Lawsuit time?


You're accusing a company with an ~$11 billion market cap, and a large media organization, of collusion and bribery. Those are some serious, and, until proven otherwise, quite baseless accusations.

Tom's analogy is absolutely spot-on and needs repeating here:

_You're a muscle-car buff and you decide to test drive the new 2015 Dodge Charger Hellcat. The car is advertised as a supercharged 8-cylinder, 6.2 Liter Hemi engine with 24 valves that produces 707 horsepower at 6,000 RPM. It's one of the most powerful cars you can buy for the dollar, achieving 0-60 MPH in under three seconds and a quarter-mile in under 12 seconds. You take it for a test drive, you fall in love with the car, and you buy it. In the months to follow, you remain quite pleased with your purchase and the performance the car provides.

It later comes out that Dodge made a mistake on its marketing materials: the engine has 16 valves, not 24. It still produces 707 horsepower at 6,000 RPM though, and *it still offers the same amazing road performance that it did the day you bought it*. It's still one of the fastest cars you could purchase for the dollar. But you can no longer say you own a 24-valve V8._


----------



## MerkageTurk

^well you return it under misrepresentation


----------



## skupples

Yet again, another terribly designed car example. If the 970 is 700 HP, then I guess the 980 is 1,000 HP?

I would rate the 970 as around a 300HP 4-banger with a turbo.
Quote:


> Originally Posted by *Silent Scone*
> 
> Sorry, you're both going off on a tangent that has nothing to do with what I was talking about, or with what you were initially talking about either. This has already been discussed at length.
> 
> Angry, mistreated, outraged! All of these words make my feelings toward this test scenario that requires 6GB of memory valid on my 4GB*3.5GB card.


Whoa bro, you're obviously missing the point here. The 970 runs worse than any other 4GB card when running settings the devs state require 6GB for smooth gameplay. It's the king of the turds; all the other turds just don't smell as bad as the king of the stinky turds.


----------



## Silent Scone

Cue circus music, lol.

lol, 4 banger. You be so American


----------



## skupples

Yeah, occupational hazard of living in America.

Sorry, will come back when I'm more chav.
Now excuse me while I go find a beat-up Audi and a flat-bill Burberry hat.


----------



## Baghi

Quote:


> Originally Posted by *maarten12100*
> 
> They say things like the reviewers didn't report it correctly.


At least those reviewers, after this issue came to light, now show the accurate specs in their charts.


----------



## skupples

I guess my question just comes back to how reviewers could gloss over these issues.

I don't really look at reviews these days, as real-world cards vs. review samples continue to grow further and further apart, BUT do reviewers not do everything possible to exploit the frame buffer to its full potential? They sure tried to do so with the Titan.


----------



## looniam

http://www.tweaktown.com/tweakipedia/68/amd-radeon-r9-290x-4gb-vs-8gb-4k-maxed-settings/index.html


----------



## solid9

I tried emailing them. First they said they never listed the number of ROPs or the L2 cache on the 970 specifications page; then I told them about the bandwidth being at most 196GB/s, and the response I got is:

"The memory bandwidth is calculated as below:

Memory Clock * Memory Interface Width / 8 (in Byte)

7000000000 * 256 / 8 = 224000000000 = 224 GB"

Is this true?


----------



## Silent Scone

Quote:


> Originally Posted by *skupples*
> 
> Yeah, occupational hazard of living in America.
> 
> Sorry, will come back when I'm more chav.
> Now excuse me while I go find a beat-up Audi and a flat-bill Burberry hat.


now you're just stereotyping

lol. You actually said 4 banger

Quote:


> Originally Posted by *solid9*
> 
> I tried emailing them. First they said they never listed the number of ROPs or the L2 cache on the 970 specifications page; then I told them about the bandwidth being at most 196GB/s, and the response I got is:
> 
> "The memory bandwidth is calculated as below:
> 
> Memory Clock * Memory Interface Width / 8 (in Byte)
> 
> 7000000000 * 256 / 8 = 224000000000 = 224 GB"
> 
> Is this true?


lol, yes & no - this is half the argument. In layman's terms: I can't see it being plausible that it can access both pools on the same cycle, so it's unlikely you'll ever see 224; but in the same breath, you probably wouldn't see that anyway, as it's a theoretical limit at a given clock.
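For anyone who wants to sanity-check the arithmetic themselves, here's a quick sketch. It only uses the figures quoted in this thread (7.0 Gbps effective memory clock, 32-bit GDDR5 chips, 8 chips on the full bus vs. 7 + 1 on the 970's split):

```python
# Peak-bandwidth arithmetic for a 256-bit GDDR5 bus, as quoted in this
# thread. Each chip contributes (32 bits * 7.0 Gbps) / 8 bits-per-byte.

GBPS_PER_PIN = 7.0      # effective data rate per pin, in Gbps
CHIP_BUS_BITS = 32      # each GDDR5 chip has a 32-bit interface

def bandwidth_gb_s(chips):
    """Peak bandwidth of `chips` GDDR5 chips at 7 Gbps, in GB/s."""
    return chips * CHIP_BUS_BITS * GBPS_PER_PIN / 8  # bits -> bytes

print(bandwidth_gb_s(8))  # 224.0 -> the advertised figure (full bus)
print(bandwidth_gb_s(7))  # 196.0 -> the fast 3.5GB segment
print(bandwidth_gb_s(1))  #  28.0 -> the slow 0.5GB segment
```

So 224 GB/s is correct as a theoretical peak for the full 256-bit bus, and 196 / 28 GB/s fall out of the same formula for the two segments.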


----------



## Menta

I am taking this to the ASUS forum. IF ANYONE IS WILLING TO SUPPORT me, and possibly other users, I would appreciate it very much.

https://rog.asus.com/forum/showthread.php?57022-asus-970-strix-false-specs


----------



## mtcn77

Quote:


> Originally Posted by *solid9*
> 
> I tried emailing them. First they said they never listed the number of ROPs or the L2 cache on the 970 specifications page; then I told them about the bandwidth being at most 196GB/s, and the response I got is:
> 
> "The memory bandwidth is calculated as below:
> 
> Memory Clock * Memory Interface Width / 8 (in Byte)
> 
> 7000000000 * 256 / 8 = 224000000000 = 224 GB"
> 
> Is this true?


It is true, but you cannot read from all 8 of them at the same time, so in essence the antialiasing tier of the card is lower. Luckily, reviewers have no difficulty dismissing that via DSR.


----------



## maarten12100

Quote:


> Originally Posted by *SKYMTL*
> 
> You're accusing a company with a ~11 billion market cap and a large media organization of collusion and bribery. Those are some serious, and, until proven otherwise, quite baseless accusations.
> 
> Toms' analogy is absolutely spot on as well and needs repeating here as well:
> 
> _You're a muscle-car buff and you decide to test drive the new 2015 Dodge Charger Hellcat. The car is advertised as a supercharged 8-cylinder, 6.2 Liter Hemi engine with 24 valves that produces 707 horsepower at 6,000 RPM. It's one of the most powerful cars you can buy for the dollar, achieving 0-60 MPH in under three seconds and a quarter-mile in under 12 seconds. You take it for a test drive, you fall in love with the car, and you buy it. In the months to follow, you remain quite pleased with your purchase and the performance the car provides.
> 
> It later comes out that Dodge made a mistake on its marketing materials: the engine has 16 valves, not 24. It still produces 707 horsepower at 6,000 RPM though, and *it still offers the same amazing road performance that it did the day you bought it*. It's still one of the fastest cars you could purchase for the dollar. *But you can no longer say you own a 24-valve V8.*_


You basically just described false advertising. You bought it because it was claimed to be a certain thing; the claim is false, thus the advertising is false. I already proved my point by posting Nvidia's spec sheet, which doesn't say a thing about the 3.5+0.5 setup, nor about aggregate bandwidth or any split between segments; it just lists *4GB at 224GB/s*.

To this day they are still falsely advertising their product on their own website, even though they have admitted that Tom's now has the correct specs (yet they don't list the correct specs themselves, and that is false advertising on their part).

Can we safely assume that Hardware Canucks was in on this, since you're their representative here on OCN? Is that what it means?
Quote:


> Originally Posted by *Baghi*
> 
> At least those reviewers, after this issue came to light, now show the accurate specs in their charts.


Yeah, that is good of them - much better than Nvidia themselves, who still list the specs as 224GB/s for 4GB of RAM. It would go in the "less bad" category for the reviewers.


----------



## Silent Scone

Quote:


> False advertising or deceptive advertising is the use of false or misleading statements in advertising, and misrepresentation of the product at hand, which may negatively affect many stakeholders, especially consumers


You need to prove this has affected your decision or the product's ability to perform, which at this point nobody really has, to any meaningful degree. So no, he didn't just describe false advertising at all.

We all agree Nvidia made a big mistake here, for whatever reason. That doesn't give free passes to false legal claims.


----------



## tpi2007

Quote:


> Originally Posted by *SKYMTL*
> 
> Toms' analogy is absolutely spot on as well and needs repeating here as well:
> 
> _You're a muscle-car buff and you decide to test drive the new 2015 Dodge Charger Hellcat. The car is advertised as a supercharged 8-cylinder, 6.2 Liter Hemi engine with 24 valves that produces 707 horsepower at 6,000 RPM. It's one of the most powerful cars you can buy for the dollar, achieving 0-60 MPH in under three seconds and a quarter-mile in under 12 seconds. You take it for a test drive, you fall in love with the car, and you buy it. In the months to follow, you remain quite pleased with your purchase and the performance the car provides.
> 
> It later comes out that Dodge made a mistake on its marketing materials: the engine has 16 valves, not 24. It still produces 707 horsepower at 6,000 RPM though, and *it still offers the same amazing road performance that it did the day you bought it*. It's still one of the fastest cars you could purchase for the dollar. But you can no longer say you own a 24-valve V8._


This isn't about how the car(d) performs today; it's about assessing the value of a card over its useful lifetime when making a buying decision. People also made their buying decision based on the wrong premise that, yes, the GTX 970 has fewer CUDA cores, but the rest is just like the GTX 980 and is there to give it more headroom in the future.

It's not a direct comparison, but at least with the GTX 570's lack of VRAM, people knew from the start that that card wasn't going to age gracefully, and that influenced their buying decision.

I still haven't seen anybody disprove my car analogy either:

Your car does

- 196 km/h in 5th gear;
- 28 km/h in 1st gear;

-> Therefore your car's top speed is 224 km/h.

This math is not valid, yet reviewers are letting Nvidia get away with it.
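To put the gear analogy in concrete terms, here is a toy model (my own sketch, using the 196/28 GB/s segment figures quoted earlier in this thread) of why the two segment bandwidths don't simply add when the segments can only be accessed one at a time:

```python
# Toy model: effective bandwidth when two memory segments must be
# accessed serially (never simultaneously). The aggregate is the total
# data moved divided by the total time, not the sum of the two rates.

def serial_effective_gb_s(segments):
    """segments: list of (size_gb, bandwidth_gb_s) pairs accessed in turn."""
    total_gb = sum(size for size, _ in segments)
    total_seconds = sum(size / bw for size, bw in segments)
    return total_gb / total_seconds

# Streaming the full 4GB across the 970's two segments, one at a time:
effective = serial_effective_gb_s([(3.5, 196.0), (0.5, 28.0)])
print(round(effective))  # 112 -> half of the advertised 224 GB/s
```

Interestingly, this strictly-serial worst case lands on half the advertised figure, which matches the "could be half bandwidth" caveat NVIDIA's own engineers give elsewhere in this thread for badly imbalanced workloads.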

Quote:


> Originally Posted by *Baghi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *maarten12100*
> 
> They say things like the reviewers didn't act correct.
> 
> 
> 
> At least those reviewers, after this issue came to light, now show the accurate specs in their charts.

Tom's only corrected the table (and added text to explain it) for the memory bandwidth after I mentioned it, and they still didn't get it quite right. You can't call it "aggregate" if you can't access both segments at the same time. They also didn't follow the good practice of telling readers that they had changed the article a day after publishing it. Yet when you go read the article about Samsung's Gear VR, they clearly added a note when they added Samsung's statement.

http://www.tomsguide.com/us/samsung-gear-vr-best-buy,news-20367.html

These sites are full of double standards. PCPer was quick to test CrossFire stuttering with FCAT, yet they are making excuses not to test SLI with two GTX 970s.


----------



## SKYMTL

Quote:


> Originally Posted by *maarten12100*
> 
> You basically just described false advertising. You bought it because it was claimed to be a certain thing; the claim is false, thus the advertising is false. I already proved my point by posting Nvidia's spec sheet, which doesn't say a thing about the 3.5+0.5 setup, nor about aggregate bandwidth or any split between segments; it just lists *4GB at 224GB/s*.
> 
> To this day they are still falsely advertising their product on their own website, even though they have admitted that Tom's now has the correct specs (yet they don't list the correct specs themselves, and that is false advertising on their part).
> 
> Can we safely assume that Hardware Canucks was in on this, since you're their representative here on OCN? Is that what it means?
> Yeah, that is good of them - much better than Nvidia themselves, who still list the specs as 224GB/s for 4GB of RAM. It would go in the "less bad" category for the reviewers.


In on what exactly? Some sort of grand conspiracy some folks have cooked up in their heads?

Another question: since when do bandwidth numbers refer to anything other than the *peak* bandwidth? Peak = "up to". Thus far there has been no proof that the GTX 970 CANNOT achieve its peak bandwidth.


----------



## Forceman

Quote:


> Originally Posted by *SKYMTL*
> 
> In on what exactly? Some sort of grand conspiracy some folks have cooked up in their heads?


That seems to be the prevailing opinion in some circles, yes.


----------



## SKYMTL

Quote:


> Originally Posted by *Forceman*
> 
> That seems to be the prevailing opinion in some circles, yes.


Gotcha.

I'll go back to my plans for world domination now. Conference call with the Illuminati at 2PM and then the Roswell aliens visit at 4PM. Big day today!


----------



## mtcn77

Quote:


> Originally Posted by *SKYMTL*
> 
> Sure. I asked Jonah this question the other day over email and here is a direct quote, without any wanna-be overly technical mumbo jumbo:
> 
> _First case (just using lower 3.5GB) : in this case, only of 7 of the 8 DRAMs are in use. So if it maxes out all 7 that would be 7*32*3.5*2/8 = 196GB/sec peak.
> 
> Second case (using lower 3.5GB and upper 0.5GB) : in this case the memory bandwidth really depends on the load. *It could be as much as all the memory bandwidth or 224GB/sec if the workload is balanced well, but if the workload is imbalanced (100% reads as an example with no writes), then it could be half bandwidth*. We have extra read/write request bandwidth from the L2s to the MCs which is why the double load issue is harder to hit here.
> 
> Third case (going beyond 4GB) : now PCIe is involved and bandwidth will go down fast if you start using it too much._
> 
> _One thing to point out is that some folks I think have looked at the CUDA memory test (Nai's benchmark) and gotten concerned about whether when you use the 0.5GB, bandwidth would be 1/8th . That's not really right &#8230; that would be right if you were *only* using the 0.5GB (because now you're just using that one memory) and leaving the other memories idle. But if you assume the more likely case that you're evenly using the 4GB memory address range, then you're accessing all the memories._
> 
> And when I asked about how the load balancing was accomplished in this case (this is after a LONG technical bit):
> 
> _Actually the algorithm is oriented more towards trying to put data in the 0.5GB segment that is least likely to get accessed often. In general, that's the simplest approach to take, and simple is good for software algorithms, esp since we don't really know for sure what the access patterns might look like._
> 
> Essentially, NVIDIA is using software-based heuristics to ensure the 0.5GB segment is utilized when memory requirements surpass the 3.5GB mark. This is why games ARE able to access the full 4GB on the card, and why, when using a tool like AIDA64 which engages both partitions, read and write access is virtually identical between the GTX 970 and GTX 980.
> 
> Hope that helps!


Quote:


> Originally Posted by *SKYMTL*
> 
> In on what exactly? Some sort of grand conspiracy some folks have cooked up in their heads?
> 
> Another question: since when do bandwidth numbers refer to anything other than the *peak* bandwidth? Peak = "up to". Thus far there has been no proof that the GTX 970 CANNOT achieve its peak bandwidth.


Isn't this a contradiction?


----------



## maarten12100

Quote:


> Originally Posted by *Silent Scone*
> 
> You need to prove this has affected your decision or the products ability to perform. To which at this point nobody really has to varying degree. So no he didn't just describe false advertising at all.
> 
> We all agree Nvidia made a big mistake for whatever reason here. Doesn't give free passes to false legal claims.


If you bought it because it has 24 valves or whatever, then you are affected, because in reality it doesn't. -> False advertising.

Now, I don't really care, since I don't have a GTX 970, but this would very much qualify as false advertising.


----------



## Rahldrac

http://www.techpowerup.com/209412/amd-cashes-in-on-gtx-970-drama-cuts-r9-290x-price.html

If I did not already have two waterblocks for the 970, I would change to a 290X just to show Nvidia that this is not okay! (And then buy a 3xx when they are released.)

Edit:
And it's very easy for me to prove false marketing, since the reseller I bought it from listed: 4GB with 256-bit, not "up to" anything.


----------



## MadRabbit

Quote:


> Originally Posted by *maarten12100*
> 
> If you bought it because it has 24 valves or whatever, then you are affected, because in reality it doesn't. -> false advertising.
> 
> Now I don't really care since I don't have a GTX970 but this would very much qualify as false advertising.


Why do people think Actievil took back MW2 copies tied to a Steam account? Simple: false advertising. It said it had console while it didn't, and they had no choice in that. Nvidia saying (showing) the card has something it doesn't amounts to the same thing: false advertising, therefore subject to a legal case.


----------



## Cyro999

Quote:


> Originally Posted by *SKYMTL*
> 
> In on what exactly? Some sort of grand conspiracy some folks have cooked up in their heads?
> 
> Another question: since when do bandwidth numbers refer to anything other than the *peak* bandwidth? Peak = "up to". Thus far there has been no proof that the GTX 970 CANNOT achieve its peak bandwidth.


The 970 can only access one pool of memory at a time. The first pool uses seven VRAM chips, each with a 32-bit interface, while the GTX 980 uses eight.

Peak bandwidth (and actual bandwidth) is 7/8ths (87.5%) of the GTX 980's, as has been shown both by Nvidia detailing the actual specs and by actual benchmarks.
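As a back-of-envelope check on the 7/8ths figure (a sketch; the chip count and the 7 Gbps effective data rate per pin are the cards' advertised reference figures, not measurements):

```python
def peak_bandwidth_gbs(chips, bits_per_chip=32, gbps_per_pin=7.0):
    """Theoretical peak bandwidth in GB/s: chips * bus width * data rate / 8 bits."""
    return chips * bits_per_chip * gbps_per_pin / 8

gtx980 = peak_bandwidth_gbs(8)       # all eight 32-bit channels
gtx970_fast = peak_bandwidth_gbs(7)  # the seven channels behind the 3.5GB pool

print(gtx980)                # 224.0
print(gtx970_fast)           # 196.0
print(gtx970_fast / gtx980)  # 0.875 -> the 7/8ths (87.5%) figure
```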


----------



## criminal

Quote:


> Originally Posted by *SKYMTL*
> 
> Gotcha.
> 
> I'll go back to my plans for world domination now. Conference call with the Illuminati at 2PM and then the Roswell aliens visit at 4PM. Big day today!


I wrote a couple of articles for a technical review site. I would receive the product for free (as my payment) after I wrote a review of it. One product I received was really bad. I told the lead editor/owner of the site about it before getting too far into writing the article. He told me it was highly advisable for me to figure out a way to like the product and write a good review. Now what would have been his reason for that? What I took from it was that if I wrote a bad review, it would have an effect on him getting future products from that company. Not saying it is happening here, but I could see other companies doing something similar.


----------



## Woundingchaney

Quote:


> Originally Posted by *Silent Scone*
> 
> You need to prove this has affected your decision or the product's ability to perform, which at this point nobody really has to any meaningful degree. So no, he didn't just describe false advertising at all.
> 
> We all agree Nvidia made a big mistake for whatever reason here. Doesn't give free passes to false legal claims.


ROP count is wrong in the official marketing

The L2 cache figure is also wrong

At the very least, the memory configuration and bandwidth claims are very questionable.

I can't speak for everyone, but with my configuration these issues were rearing their heads. They were very difficult to find or pinpoint before the information was made available, but once I did I could detect them in various titles (in some titles I noticed no difference). Honestly (again, I can't speak for everyone), if I had been aware of these hardware issues I wouldn't have purchased the 970s.


----------



## DisgruntldTek37

PC Perspective conducted some testing on the 980 vs. the 970 on frame variance. The results are interesting.

http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Looking-GTX-970-Memory-Performance/Battlefield-4-Results


----------



## maarten12100

Quote:


> Originally Posted by *tpi2007*
> 
> This math is not valid, yet reviewers are letting Nvidia get away with it.
> Tom's only corrected the table (and added text explaining it) for the memory bandwidth after I called them out, and they still didn't get it quite right: you can't call the bandwidth aggregate if you can't access both segments at the same time. They also didn't follow the good practice of telling readers that they had changed the article a day after publishing it. Yet when you go read the article about Samsung's Gear VR, they clearly added a note when they added Samsung's statement.
> 
> http://www.tomsguide.com/us/samsung-gear-vr-best-buy,news-20367.html
> 
> These sites are full of these double standards. PCPer was prompt to test Crossfire stuttering with FCAT, yet they are making excuses to not test SLI with two GTX 970's.


There is a very simple explanation for this:

The hand on the left is that of a generic reviewer; on the right we have Nvidia. (Just my theory of why they seem so biased.)

I have a great dislike for PCPer. I can't stand that guy in the webcast; he is so stupid he annoys me as much as AMD's Roy Taylor.
Other sites have shown that the 99th-percentile results for the R9 290(X) versus the GTX 970 paint a different picture than the framerate alone would suggest.
Quote:


> Originally Posted by *SKYMTL*
> 
> In on what exactly? Some sort of grand conspiracy some folks have cooked up in their heads?
> 
> Another question: since when do bandwidth numbers refer to anything other than the *peak* bandwidth? Peak = "up to". Thus far there has been no proof that the GTX 970 CANNOT achieve its peak bandwidth.


I don't think a judge would accept such a lousy statement. It's like getting a fiber connection advertised as "up to 500Mbit/s" and yet receiving ADSL speeds. Strictly speaking they didn't do any false advertising there, but the results greatly differ from what a common consumer would expect.

Nevertheless, you have to admit Nvidia's numbers on their own site are false and don't depict the truth of the matter. They made no mention of this "up to" figure on their site, and reviewers don't depict it correctly most of the time.

I mean, really, it sounds to me like your avatar depicts it perfectly: you have been inhaling too much of that green smoke from the Erlenmeyer flask your beaver is holding.


----------



## Menta

Quote:


> Originally Posted by *criminal*
> 
> I wrote a couple of articles for a technical review site. I would receive the product for free (as my payment) after I wrote a review of it. One product I received was really bad. I told the lead editor/owner of the site about it before getting too far into writing the article. He told me it was highly advisable for me to figure out a way to like the product and write a good review. Now what would have been his reason for that? What I took from it was that if I wrote a bad review, it would have an effect on him getting future products from that company. Not saying it is happening here, but I could see other companies doing something similar.


yep cant really trust reviews at this point


----------



## Forceman

Quote:


> Originally Posted by *Rahldrac*
> 
> http://www.techpowerup.com/209412/amd-cashes-in-on-gtx-970-drama-cuts-r9-290x-price.html


Forcing board partners to lower prices is cashing in? Wouldn't that be cashing under, or cashing less?


----------



## Silent Scone

Really wish I'd gone with the GTX 970 ITX for my HTPC. Would love to do some of my own testing. You know, like when celebrities do real poverty to experience how the other half live.

The second part is me being a wind-up merchant

Quote:


> Originally Posted by *Woundingchaney*
> 
> ROP count is wrong in the official marketing
> 
> The L2 cache figure is also wrong
> 
> At the very least, the memory configuration and bandwidth claims are very questionable.
> 
> *I can't speak for everyone, but with my configuration these issues were rearing their heads. They were very difficult to find or pinpoint before the information was made available, but once I did I could detect them in various titles* (in some titles I noticed no difference). Honestly (again, I can't speak for everyone), if I had been aware of these hardware issues I wouldn't have purchased the 970s.


Ok, nobody can say fairer than that, but then you need to make these examples clear, as that is what is still somewhat lacking.

It's also what NVIDIA are asking for, or at least they're giving the impression they are.


----------



## Rahldrac

Quote:


> Originally Posted by *Forceman*
> 
> Forcing board partners to lower prices is cashing in? Wouldn't that be cashing under, or cashing less?


I guess they are trying to get more people over to "team red".
As you can see from this debate, some people get really fanatical about their GPU vendor. Even when Nvidia outright lies to us, they still defend Nvidia.
So if AMD can get some of those kind of people over to their side, I guess they have GPU buyers for years to come. Long term strategy I guess.


----------



## MaCk-AtTaCk

If my 970 didn't have that sweet, sexy 980 reference cooler on it (you can only get it at Best Buy stores), I would be more angered. But what can I say, I got a sweet-looking card. LOL, I'm a sucker. On a more serious note, what troubles me is that because I got such a beauty of a card, and it overclocks like crazy, I was seriously considering running a pair in SLI. But I'm not sure now. I've been hearing a lot of bad results in SLI because of the weird memory setup NVIDIA "forgot" to tell us about.


----------



## SKYMTL

Quote:


> Originally Posted by *mtcn77*
> 
> Isn't this a contradiction?


Not unless I am completely missing something.

Balanced workload through both partitions = up to 224GB/s

Unbalanced workload = less than that

Granted, the GTX 970 has a higher chance of getting bogged down in certain scenarios than other 4GB cards due to its unique layout, but in theory the full 224GB/s is achievable. Hence why I believe NVIDIA's legal team hasn't required them to issue a revision to the specifications.

Quote:


> Originally Posted by *Cyro999*
> 
> The 970 can only access one pool of memory at a time. The first pool uses seven VRAM chips each with 32 bit interface, while the gtx980 uses eight.
> 
> peak bandwidth (and actual bandwidth) is 7/8'ths (87.5%) of gtx980 as has been shown by both nvidia detailing actual specs as well as actual benchmarks.


You're not taking interleaving and load balancing into account. There's nothing to stop NVIDIA from using the partitions for different functions (i.e. the primary partition handles time-sensitive workloads while the lower-level partition is used for tertiary workloads). Again, this hinges on the drivers functioning properly and allocating resources accordingly, but it doesn't necessarily mean the 500MB partition and its 28GB/s of bandwidth will go unused.
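The disagreement here comes down to whether the two segments can transfer at the same time. A hedged sketch of both readings (using the 28GB/s figure above for the small segment and 196GB/s, i.e. 7 x 28, for the large one; which model the crossbar actually implements is exactly what is in dispute):

```python
FAST_GBS, SLOW_GBS = 196.0, 28.0  # 3.5GB and 0.5GB segments

def concurrent_peak_gbs():
    """If both segments can transfer simultaneously, the peaks simply add."""
    return FAST_GBS + SLOW_GBS

def time_sliced_gbs(frac_on_fast):
    """If the crossbar serves one segment at a time, total throughput is a
    weighted average of the two rates and can never exceed the fast segment alone."""
    return frac_on_fast * FAST_GBS + (1 - frac_on_fast) * SLOW_GBS

print(concurrent_peak_gbs())   # 224.0 -> the "peak" reading of the spec sheet
print(time_sliced_gbs(1.0))    # 196.0 -> never touching the slow segment
print(time_sliced_gbs(0.875))  # 175.0 -> mixing the two only hurts in this model
```

The "up to 224GB/s" claim holds only under the first model; under the second, the card tops out at 196GB/s no matter how the driver balances the load.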


----------



## Baghi

Quote:


> Originally Posted by *SKYMTL*
> 
> In on what exactly? Some sort of grand conspiracy some folks have cooked up in their heads?
> 
> Another question: since when do bandwidth numbers refer to anything other than the *peak* bandwidth? Peak = "up to". Thus far there has been no proof that the GTX 970 CANNOT achieve its peak bandwidth.


Wasn't NVIDIA supposed to be so honest about stating specs? They state "minimum" boost clocks whereas AMD states "peak" clocks; what happened to them now? The GTX 970 was a cut-down GTX 980, with some shader units disabled and slightly lower core clocks out of the box, at a significantly lower price. That is what the consumer knew; it was the same with the GTX 670, but the price difference wasn't the same.


----------



## MaCk-AtTaCk

I wonder if Nvidia will give back part of the L2 cache and ROPs in a driver update? I know it sounds crazy, but I read somewhere on a reputable site that they might do that. That those functions weren't physically cut, but just "disabled" to make the 970 slow enough to warrant its price?


----------



## Silent Scone

Quote:


> Originally Posted by *MaCk-AtTaCk*
> 
> I wonder if Nvidia will give back part of the L2 cache and ROPs in a driver update? I know it sounds crazy, but I read somewhere on a reputable site that they might do that. That those functions weren't physically cut, but just "disabled" to make the 970 slow enough to warrant its price?


Just before someone jumps on you for that: no bud, that's not physically possible. The L2 cache is defective and thus disabled indefinitely.

What warrants the price is the neutering that we _do_ know about: the removal of SMs.


----------



## raghu78

Quote:


> Originally Posted by *SKYMTL*
> 
> Not unless I am completely missing something.
> *
> Balanced workload through both partitions = up to 224GB/s
> 
> Unbalanced workload = less than that*


See what you did there? Basically you are creating a specific set of conditions under which 224 GB/s could be achieved. That's unlike the R9 290 / R9 290X and GTX 980, which can read at their full maximum bandwidth. This is something so simple: if you say the 224 GB/s advertising is correct, then it should be available when doing 100% reads alone, not only under the specific set of circumstances which Nvidia, or you, describe.








Quote:


> Granted, the GTX 970 has a higher chance of getting bogged down in certain scenarios than other 4GB cards due to its unique layout but in theory, the full 224GB/s is achievable. Hence why I believe NVIDIA's legal team hasn't required them to issue a revision to specifications. You're not taking interleaving and load balancing into account. There's nothing to stop NVIDIA from utilizing the partitions for different functions (IE: the primary partition works on time-sensitive workloads while the lower level partition is utilized for tertiary workloads). Again, this hinges on the drivers functioning properly and allocating resources accordingly but it doesn't necessarily mean the 500MB partition and its 28GB/s of bandwidth will go unused.


Dude, there is a difference between *224 GB/s achievable in certain scenarios* and achievable in all cases: 100% reads, 100% writes, or an arbitrary mix of both.


----------



## Menta

Quote:


> Originally Posted by *MaCk-AtTaCk*
> 
> I wonder if Nvidia will give back part of the L2 cache and ROPs in a driver update? I know it sounds crazy, but I read somewhere on a reputable site that they might do that. That those functions weren't physically cut, but just "disabled" to make the 970 slow enough to warrant its price?


Big scandal, but I guess nothing is off the table at this point... though from all I have been reading, they can only help via software.


----------



## SKYMTL

Quote:


> Originally Posted by *criminal*
> 
> I wrote a couple of articles for a technical review site. I would receive the product for free (as my payment) after I wrote a review of it. One product I received was really bad. I told the lead editor/owner of the site about it before getting too far into writing the article. He told me it was highly advisable for me to figure out a way to like the product and write a good review. Now what would have been his reason for that? What I took from it was that if I wrote a bad review, it would have an effect on him getting future products from that company. Not saying it is happening here, but I could see other companies doing something similar.


We've written pretty scathing reviews of products in the past and have reaped the rewards. To this day getting MSI and ASRock samples is harder than pulling teeth from a pissed-off shark.

It all comes down to how a site functions and whether or not they subscribe to the principles of journalism. Admittedly it is easier for me to turn a blind eye to a product receiving a negative review since I don't use the site to put food on the table. On other sites that may not be the case but what I do know is the vast majority of sites cited in this thread (PCPer, Anandtech, Tom's, Tech Report, etc.) remove the financial component of ad sales from the editorial process. These *are not* fly-by-night operations from the backwoods of Texas (no offense to the Texans around here!) who are looking for payola above all else.


----------



## SKYMTL

Quote:


> Originally Posted by *raghu78*
> 
> See what you did there? Basically you are creating a specific set of conditions under which 224 GB/s could be achieved. That's unlike the R9 290 / R9 290X and GTX 980, which can read at their full maximum bandwidth. This is something so simple: if you say the 224 GB/s advertising is correct, then it should be available when doing 100% reads alone, not only under the specific set of circumstances which Nvidia, or you, describe.
> 
> 
> 
> 
> 
> 
> 
> 
> Dude, there is a difference between *224 GB/s achievable in certain scenarios* and achievable in all cases: 100% reads, 100% writes, or an arbitrary mix of both.


Hence why companies use "peak" figures. Regardless of the memory interface on any device, it will never operate at 100% bandwidth utilization all the time. At least it better not....


----------



## Vesku

Quote:


> Originally Posted by *MaCk-AtTaCk*
> 
> If my 970 didn't have that sweet, sexy 980 reference cooler on it (you can only get it at Best Buy stores), I would be more angered. But what can I say, I got a sweet-looking card. LOL, I'm a sucker. On a more serious note, what troubles me is that because I got such a beauty of a card, and it overclocks like crazy, I was seriously considering running a pair in SLI. But I'm not sure now. I've been hearing a lot of bad results in SLI because of the weird memory setup NVIDIA "forgot" to tell us about.


As long as you treat it as a 3.5GB VRAM setup you should be OK. Keep in mind SLI isn't as good as Crossfire right now for "smoothness". I expect the next generation of Nvidia cards will also adopt PCIe based multi-GPU or improve the SLI interface.


----------



## sugalumps

Quote:


> Originally Posted by *maarten12100*
> 
> There is a very simple explanation for this:
> 
> The hand on the left is that of a generic reviewer; on the right we have Nvidia. (Just my theory of why they seem so biased.)
> 
> I have a great dislike for PCPer. I can't stand that guy in the webcast; he is so stupid he annoys me as much as AMD's Roy Taylor.
> Other sites have shown that the 99th-percentile results for the R9 290(X) versus the GTX 970 paint a different picture than the framerate alone would suggest.
> I don't think a judge would accept such a lousy statement. It's like getting a fiber connection advertised as "up to 500Mbit/s" and yet receiving ADSL speeds. Strictly speaking they didn't do any false advertising there, but the results greatly differ from what a common consumer would expect.
> 
> Nevertheless, you have to admit Nvidia's numbers on their own site are false and don't depict the truth of the matter. They made no mention of this "up to" figure on their site, and reviewers don't depict it correctly most of the time.
> 
> I mean, really, it sounds to me like your avatar depicts it perfectly: you have been inhaling too much of that green smoke from the Erlenmeyer flask your beaver is holding.


Maarten, have you ever been contacted by The X-Files, m8? You would have a promising career. You could fill in for Mulder since he went missing.


----------



## MaCk-AtTaCk

Quote:


> Originally Posted by *Vesku*
> 
> As long as you treat it as a 3.5GB VRAM setup you should be OK. Keep in mind SLI isn't as good as Crossfire right now for "smoothness". I expect the next generation of Nvidia cards will also adopt PCIe based multi-GPU or improve the SLI interface.


Right, but from what I've gathered so far, that additional 512MB on top of the 3.5GB is actually making things worse for SLI than if the card were just a 3.5GB card? That's my issue. I can live with it being a 3.5GB card, but if that extra 512MB is going to make it worse than what a 3.5GB card would have been in SLI, this sucks.


----------



## Vesku

Quote:


> Originally Posted by *MaCk-AtTaCk*
> 
> Right, but from what I've gathered so far, that additional 512MB on top of the 3.5GB is actually making things worse for SLI than if the card were just a 3.5GB card? That's my issue. I can live with it being a 3.5GB card, but if that extra 512MB is going to make it worse than what a 3.5GB card would have been in SLI, this sucks.


It would be nice if Nvidia gave a driver option to ignore that 512MB, so you don't have to monitor your memory while tweaking game settings. Not sure that will ever happen though because that would be direct acknowledgement that the 512MB is not always beneficial.


----------



## Cyro999

Quote:


> Originally Posted by *MaCk-AtTaCk*
> 
> Right, but from what I've gathered so far, that additional 512MB on top of the 3.5GB is actually making things worse for SLI than if the card were just a 3.5GB card? That's my issue. I can live with it being a 3.5GB card, but if that extra 512MB is going to make it worse than what a 3.5GB card would have been in SLI, this sucks.


I don't think that's the case.

However, it performs as if it has 7 memory chips and not 8, so it has ~12.5% less bandwidth on the 3.5GB portion than the 980, assuming both are running a game using 3.5GB or less of VRAM.


----------



## Silent Scone

I thought the idea that NVIDIA pays everyone off till they're bankrupt was beneath this place.

Evidently not.


----------



## fleetfeather

Quote:


> Originally Posted by *SKYMTL*
> 
> You're accusing a company with a ~11 billion market cap and a large media organization of collusion and bribery. Those are some serious, and, until proven otherwise, quite baseless accusations.
> 
> Toms' analogy is absolutely spot on and needs repeating here:
> 
> _You're a muscle-car buff and you decide to test drive the new 2015 Dodge Charger Hellcat. The car is advertised as a supercharged 8-cylinder, 6.2 Liter Hemi engine with 24 valves that produces 707 horsepower at 6,000 RPM. It's one of the most powerful cars you can buy for the dollar, achieving 0-60 MPH in under three seconds and a quarter-mile in under 12 seconds. You take it for a test drive, you fall in love with the car, and you buy it. In the months to follow, you remain quite pleased with your purchase and the performance the car provides.
> 
> It later comes out that Dodge made a mistake on its marketing materials: the engine has 16 valves, not 24. It still produces 707 horsepower at 6,000 RPM though, and *it still offers the same amazing road performance that it did the day you bought it*. It's still one of the fastest cars you could purchase for the dollar. But you can no longer say you own a 24-valve V8._


Misrepresentation of product specs is misrepresentation of product specs. Regardless of the lack of an end-result performance drop (which is still up for discussion, depending on what you bought the card for), the card does NOT feature the specifications consumers were told it had when they bought it. I'm honestly perplexed as to why everyone with a large voice who HASN'T bought the card themselves seems to be defending a situation that doesn't centrally apply to them.

Who are the news outlets, and Nvidia themselves, to tell customers which factors they should base a purchase on or care about? If you ducked up, you ducked up. Less mitigation of consequence, more willingness to listen to genuine concerns from those who are impacted.

I have plenty of respect for the various tech reviewers and communities around the world (including HC, who I feel do the best case reviews on the web), but I've found the response and rationalisation by all of these communities quite disgraceful.

YOU don't get to decide what does or does not matter to the consumers. Since the reviewers are not centrally involved, they should stick to providing the simple facts of the situation, rather than expressing an opinion which can alter outcomes for those who indeed spent their money on the product. The consumers themselves have the reasonable right to complain about misrepresentation of published product information and performance metrics, and this applies to any product.


----------



## Menta




----------



## Silent Scone

Quote:


> Originally Posted by *Menta*


Told you so.

Soz.

On the grounds of the legality, I mean.


----------



## fleetfeather

Quote:


> Originally Posted by *Menta*


Well my goodness, what a surprise to hear that a company who sold a product has stated there is nothing wrong with it. I suppose we should just let all those folks in prison who said "they didn't do it" back onto the streets. Stuff all that evidence-based, impartial nonsense.


----------



## Vesku

For anyone noticing the lack of SLI frame metering being done by reviewers, check out what TechReport's Scott Wasson says in their podcast:

Summary: Nvidia has conflicting frame-metering "suggestions" for single-GPU and multi-GPU.

https://www.youtube.com/watch?feature=player_embedded&v=re9c2sO2jTQ&x-yt-cl=85027636&x-yt-ts=1422503916#t=899

Interesting that reviewers are being so cautious about it. I guess they want to hear Nvidia's reasoning first?


----------



## Menta

NV is not above the law

When buying goods and services anywhere in the EU (or in Iceland, Liechtenstein or Norway) - whether from a website, local shop or seller in another country - EU law protects consumers. The Directive on Unfair Commercial Practices aims to ensure that they are not misled or exposed to aggressive marketing and that any claim made by sellers is clear, accurate and substantiated. Please note that these rules apply to business-to-consumer transactions.

The website "Is it fair?" contains practical information for consumers on how to check if they have fallen victim to an unfair commercial practice and how to get help:

http://ec.europa.eu/justice/consumer-marketing/unfair-trade/unfair-practices/is-it-fair/index_en.htm

Further information on unfair commercial practices is available on the "Your Europe" portal:

http://europa.eu/youreurope/citizens/shopping/unfair-treatment/unfair-commercial-practices/index_en.htm

Consumers buying goods and services in the EU can seek further advice and assistance with complaints from the European Consumer Centre in their country of residence if it was a cross-border purchase. A national consumer organisation may be able to help if the product was purchased in the country of residence of the customer.

If instead you were referring to a business-to-business transaction, please come back to us; we will attend to your enquiry as quickly as possible.

We hope you find this information useful. Please contact us again if you have other questions.

With kind regards,
EUROPE DIRECT Contact Centre

http://europa.eu - your shortcut to the EU!


----------



## MaCk-AtTaCk

Quote:


> Originally Posted by *Cyro999*
> 
> I don't think that's the case.
> 
> However, it performs as if it has 7 memory chips and not 8, so it has ~12.5% less bandwidth on the 3.5GB portion than the 980, assuming both are running a game using 3.5GB or less of VRAM.


I see. So do you think overclocking the memory would help offset these issues, or is this independent of memory speed?


----------



## skupples

The continued PCPer bashing is quite entertaining.

What have they done besides FCAT to piss people off?

I've asked a few times now, and no one has any answers outside of "RAHHHH, FCATing the 7970 into proving it was broken", which is quite entertaining; you would think the then-Tahiti-using AMD fans would be HAPPY about PCPer breaking the news on that, as it IMPROVED their gaming experience, especially those running Eyefinity/CrossFire.


----------



## Vesku

Quote:


> Originally Posted by *skupples*
> 
> The continued PCPer bashing is quite entertaining.
> 
> What have they done besides FCAT to piss people off?
> 
> I've asked a few times now, and no one has any answers outside of "RAHHHH, FCATing the 7970 into proving it was broken", which is quite entertaining; you would think the then-Tahiti-using AMD fans would be HAPPY about PCPer breaking the news on that, as it IMPROVED their gaming experience, especially those running Eyefinity/CrossFire.


See my above post; it's about how cautious reviewers have been about frame metering GTX 970 SLI. The TechReport GPU reviewer, who has been a champion of 'smoothness', wants to "find out more from Nvidia" before really exploring it. Other review sites seem to be equally cautious, which annoys some people: why should Nvidia's 'suggestions' matter so much when sites had no problem doing multi-GPU frame metering before GM204 showed up?


----------



## MaCk-AtTaCk

Also, I think I, along with others, am getting tired of the fanboyism on either side. It's really annoying to see a lot of AMD fans swarming in here and taking the opportunity to slam Nvidia owners. It's not like 970 owners are happy with what has transpired, and coming in here just throwing fuel on the fire is not helpful! But on the flip side, it's annoying to see some Nvidia owners blindly defend Nvidia like they can do no wrong, when clearly they did... It's time to move on; Nvidia f-ed up, we all know this now. So now it comes down to what we can do to help each other out, as far as whether a particular member should return their card(s) or keep them depending on their configuration.
OK, off my soap box.


----------



## skupples

I think you're reading too much into it.


----------



## Silent Scone

That TechReport podcast is more of an overview of the situation, and doesn't in any way reflect the hatred people have been displaying in this thread, lol.


----------



## Cyro999

Quote:


> Originally Posted by *MaCk-AtTaCk*
> 
> I see. So do you think overclocking the memory would help offset these issues, or is this independent of memory speed?


If you run both cards at 7000MHz or at 8300MHz, the 970 still has 12.5% less bandwidth on the fast VRAM section.
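Since the deficit is a ratio of channel counts (7 vs. 8), a memory overclock scales both cards' numbers equally and leaves the gap untouched; a quick sketch with illustrative clocks:

```python
def bandwidth_gbs(channels, effective_mhz):
    """Peak bandwidth for N 32-bit channels at a given effective memory clock."""
    return channels * 32 * (effective_mhz / 1000.0) / 8

for mhz in (7000, 8300):
    deficit = 1 - bandwidth_gbs(7, mhz) / bandwidth_gbs(8, mhz)
    print(mhz, deficit)  # the deficit stays at 0.125 (12.5%) at any clock
```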


----------



## Vesku

Quote:


> Originally Posted by *Silent Scone*
> 
> That TechReport podcast is more of an overview of the situation, and doesn't in any way reflect the hatred people have been displaying in this thread, lol.


It's the first time I've heard a reviewer acknowledge that Nvidia has issued "conflicting suggestions" regarding single versus multi-gpu frame metering. Not sure what else he can be talking about except 970 SLI and the lack of frame metering from review sites.


----------



## SKYMTL

Quote:


> Originally Posted by *Vesku*
> 
> It's the first time I've heard a reviewer acknowledge that Nvidia has issued "conflicting suggestions" regarding single versus multi-gpu frame metering. Not sure what else he can be talking about except 970 SLI and the lack of frame metering from review sites.


Pretty much every site with an FCAT setup tested the GTX 970 SLI with it.


----------



## iSlayer

Quote:


> Originally Posted by *mouacyk*
> 
> You speak of trust. It's perfectly in order for those of us who have staked a long past with NVidia (basically since its inception with the GeForce, because it was relatively oblivious prior to that) to have a valuation of such a future with them.
> The refund period is only applicable to your specific retailer. If you are serious about your refund, try your retailer first citing as much published information as you can. If the worst comes to shove, try the CSR's who have been posting on various forums -- especially the one at the GeForce forums.
> 
> Wow, the rep took back his words on offering to help people to secure a refund. See his original offer that I quoted here.
> 
> Poor Korean and google translator on Geforce forums, but quite a meme in itself:


I forget where I spoke of trust, that quote is not relevant, but leave the being offended to 970 owners.

As for trust, Nvidia is helping people get their dosh back, what more could you ask for?
Quote:


> Originally Posted by *raghu78*
> 
> Looking quite bad for GTX 970 SLI users who bought them for 1440p and 4K gaming and are trying to max settings. They are the most likely to hit that last 0.5GB more often and get hurt the most. They trusted Nvidia and paid with their hard-earned money, and Nvidia conveniently failed to mention the actual specs.


Shill.txt

Leave being offended to people who own Nvidia cards, more specifically, the 970.
Quote:


> Originally Posted by *ondoy*
> 
> AMD exchange program in effect.... now.
> 
> Anyone returning their GTX970 and wanting a great deal on a Radeon with a full 4GB please let us know.


Link is dead.
Quote:


> Originally Posted by *clerick*
> 
> I've been a long time fan of nvidia but this just leaves a really bad taste in my mouth. If the card was advertised at 3.5gb and x rops and I bought that i'd have no problem. But advertising the same 4gb and then being unable to actually use it, shameful.


You can still use all 4GBs. Title is misleading.
Quote:


> Originally Posted by *fleetfeather*
> 
> This is the sort of stuff I genuinely like from Roy. I love his super agressive stance when it doesn't involve a sort of sleazy marketing alongside it.


Not sure if sarcasm, because this is sleazy as crap.
Quote:


> Originally Posted by *skupples*
> 
> I wouldn't know since Nvidia made the asinine choice to lock Kepler owners out.


MFAA is supposed to be hardware-based; that's why Kepler didn't get it.

Otherwise, it's merely artificial segmentation. Kepler is EOL. I can understand why they did it, but whether that's upsetting or not is up to the individual Kepler owner.
Quote:


> Originally Posted by *Silent Scone*
> 
> I thought the idea that NVIDIA pays everyone off till they're bankrupt was beneath this place.
> 
> Evidently not.


Oh, just ignore them; they're either whining about a perceived (read: nonexistent) nerf of their Kepler cards, or they own an AMD CPU/GPU and are obviously shilling.

Chances are good that if someone is posting in this thread and they don't own a 970, their comments are as valuable as their diarrhea after a few too many Flamin' Hot Cheetos.
Quote:


> Originally Posted by *skupples*
> 
> the continued PCper bashing is quite entertaining.
> 
> What have they done besides FCAT to piss people off?
> 
> I've asked a few times now, and no one has any answers outside of "RAHHHH FCATing 7970 into proving it was broken" which is quite entertaining, you would think the then tahiti using AMD fans would be HAPPY about PCper busting the news on this as it IMPROVED their gaming experience, especially those in eyefinity/xfire.


Shills, what you are referring to are shills. Criticism and third-party testing are how we improve. If no one had told us how crap BD or Ubicrap titles are, imagine where we would be now.
Quote:


> Originally Posted by *MaCk-AtTaCk*
> 
> Also, I think I, along with others, am getting tired of the fanboyism on either side. It's really annoying to see a lot of AMD fans swarming in here and taking the opportunity to slam Nvidia owners. It's not like 970 owners are happy with what has transpired, and coming in here and just throwing fuel on the fire is really not helpful! But on the flip side, it's also annoying to see some Nvidia owners blindly defend Nvidia like they can do no wrong when clearly they did... It's time to move on; Nvidia f-ed up, we all know this now. So now it comes down to what we can do to help each other out, as far as whether a particular member should return their card(s) or keep them depending on their configuration.
> OK, off my soap box.


Quote:


> Originally Posted by *skupples*
> 
> I think you're reading too much into it.


No, he isn't. That's exactly what I have been complaining about.


----------



## Silent Scone

Quote:


> Originally Posted by *Vesku*
> 
> It's the first time I've heard a reviewer acknowledge that Nvidia has issued "conflicting suggestions" regarding single versus multi-gpu frame metering. Not sure what else he can be talking about except 970 SLI and the lack of frame metering from review sites.


He also acknowledged personally that if he were to own the card and find this out, he couldn't imagine seeking restitution or 'rage quitting' over it. In point of fact, I think everyone agrees this whole thing is a complete screw-up, but people need to be more realistic.


----------



## skupples

so this perceived "caution" of properly using/reporting FCAT for frame times on 970 makes them shills?

I'm trying to follow, I really am.

or was it the NV tipping them off that AMD had frame time issues, & then blowing it out of the water, & proving that they did?


----------



## Vesku

Quote:


> Originally Posted by *Silent Scone*
> 
> He also acknowledged personally that if he were to own the card and find this out, he couldn't imagine restitution or 'rage quiting on it'. Point of fact, I think everyone agrees this whole thing is a complete screw up, but people need to be more realistic.


If he doesn't mind being fooled that's fine for him personally. I think at the very least the EU owners who are actually mad can probably make some headway via their consumer protection bureaus by showing Nvidia provided incorrect specifications to reviewers.


----------



## provost

Quote:


> Originally Posted by *iSlayer*
> 
> I forget where I spoke of trust, that quote is not relevant, but leave the being offended to 970 owners.
> 
> As for trust, Nvidia is helping people get their dosh back, what more could you ask for?
> 
> MFAA is supposed to be hardware based, that's why Kepler didn't get it.
> 
> Otherwise, its merely artificial segmentation. *Kepler is EOL*. I can understand why they did it, but whether that's upsetting or not is up to the specific Kepler owner.


I think we all know that it's EOL, as in no longer in production. But what does this mean for ongoing driver improvement and support? Is the support EOL too?


----------



## Silent Scone

Quote:


> Originally Posted by *Vesku*
> 
> If he doesn't mind being fooled that's fine for him personally. I think at the very least the EU owners who are actually mad can probably make some headway via their consumer protection bureaus by showing Nvidia provided incorrect specifications to reviewers.


Well then, said people have lesser priorities than others.

I'll say no more on that.


----------



## Jesse36m3

DAMN, this puts me in a tough spot. Just a week ago I added two EVGA reference 970s while reusing my Heatkiller water blocks from my 660 Tis. Made a custom hardline loop from copper. Took me weeks to do, and then I hear about this just _after_ finishing it, of course. lol. I've been playing at 1440p. Absolutely no problems till I crank the AA, but even then there was not much of a noticeable visual difference. I think I'll keep the settings down to make the VRAM happy.

I don't think it would be worth the hassle of taking everything apart, spending more money on 980s, or waiting till AMD's 3xx launch; then I'd need all new blocks and possibly have to reconfigure my loop. Ehh.


----------



## skupples

Quote:


> Originally Posted by *provost*
> 
> I think we all know that its EOL, as in no more in production. But, what does this mean for ongoing driver improvement and support? Is the Support EOL too?



if it is, then they need to specify as such.

stating that PRODUCTION is EOL doesn't automatically translate into "all support is done"


----------



## Mad Pistol

Quote:


> Originally Posted by *provost*
> 
> I think we all know that its EOL, as in no more in production. But, what does this mean for ongoing driver improvement and support? Is the Support EOL too?


No, support for all DX11 parts in Nvidia's arsenal is ongoing. I'm sure that Maxwell is top priority simply because it is a new architecture, and optimization is key to remaining competitive.

If Nvidia cut off support for every product after they were EOL, no one would still be buying their products.


----------



## provost

Quote:


> Originally Posted by *skupples*
> 
> 
> if it is, then they need to specify as such.
> 
> stating that PRODUCTION is EOL doesn't automatically translate into "all support is done"


Well, unless more people speak up, Nvidia will consider this a non-issue and assume the silent majority is looking to upgrade anyway, so why bother providing continued driver-improvement support? Let's be clear: we are talking about Nvidia's flagship "halo" and non-halo Kepler cards here, some released only last year, not 4-5 years ago.
Remind me again, please: when did AMD release the 7970s?
If Nvidia has decided to EOL continued driver-improvement support for Kepler, it would appear that anyone who bought a 7970, 290, or 290X over a Kepler made a very smart choice, despite all the initial drama over frame-timing issues.
But you are right, Nvidia should afford its loyal customer base the courtesy of telling them whether ongoing driver-improvement support will be provided for Kepler or not. That's the least Nvidia can do. We are all big boys; we can make up our own minds.


----------



## nyxagamemnon

Quote:


> Originally Posted by *SKYMTL*
> 
> You're accusing a company with a ~11 billion market cap and a large media organization of collusion and bribery. Those are some serious, and, until proven otherwise, quite baseless accusations.
> 
> Toms' analogy is absolutely spot on as well and needs repeating here as well:
> 
> _You're a muscle-car buff and you decide to test drive the new 2015 Dodge Charger Hellcat. The car is advertised as a supercharged 8-cylinder, 6.2 Liter Hemi engine with 24 valves that produces 707 horsepower at 6,000 RPM. It's one of the most powerful cars you can buy for the dollar, achieving 0-60 MPH in under three seconds and a quarter-mile in under 12 seconds. You take it for a test drive, you fall in love with the car, and you buy it. In the months to follow, you remain quite pleased with your purchase and the performance the car provides.
> 
> It later comes out that Dodge made a mistake on its marketing materials: the engine has 16 valves, not 24. It still produces 707 horsepower at 6,000 RPM though, and *it still offers the same amazing road performance that it did the day you bought it*. It's still one of the fastest cars you could purchase for the dollar. But you can no longer say you own a 24-valve V8._


No, it's not.

So you decide one day to take the car from 0-150 and find that in your last 30 mph your horsepower suddenly drops to 200. You ask yourself: *** I have 700 hp for everything else, what happened, why am I going so slow?

Or as you approach the quarter mile you notice your speed suddenly decreasing, and your "effective hp" has now become 200.


----------



## sugalumps

Quote:


> Originally Posted by *nyxagamemnon*
> 
> No its not.
> 
> So you decide one day to take the car from 0-150 and find that in your last 30 mph your hp suddenly drops to 200. You ask yourself: *** I have 700 hp for everything else, what happened, why am I going so slow?


He is right, though; it still has the exact same performance it did a few days ago, before all this drama. It still had the flaw and no one noticed it until now. You know why? Because 99% of 970 owners will never go above 3GB of VRAM. Not that it makes what Nvidia has done OK, but technically the card still performs as well as it did last week, when everyone LOVED it.


----------



## nyxagamemnon

Quote:


> Originally Posted by *sugalumps*
> 
> He is right though, it still has the exact same performance it did a few days ago before all this drama. It still had the flaw and no one noticed it until now, you know why? Because 99% of the 970 owners will never go above 3gb vram. Not that it makes what nvidia has done ok, but technically the card still performs as well as it did last week when everyone LOVED it.


The card performs as it does because of driver trickery.

The driver fights tooth and nail to keep everything inside the 3.5GB pool, and so the majority of people don't notice it. Of course you don't notice it; you are inside that zone/bubble. Once you step outside, things get interesting. Ignorance of the facts doesn't mean nothing is wrong.

The issue here is one of misstated specifications and memory addressing. That is fact, not something to cover up with cheesy lines or puff pieces saying "it still performs the same."

Just think about this: the last 0.5GB memory issue still happened before you knew what was up; after all, it was users who found it. Now that you have the facts, there is no getting around this.

Nvidia messed up:
Incorrect specs.
3.5GB segmentation.

You are now outside the bubble. The performance hasn't changed; what has changed is the awareness of performance above 3.5GB. The issue existed when everyone was ignorant of the facts, too. So saying "performance hasn't changed" means nothing; it is word trickery that lets non-enthusiasts glide over the real issues.

Performance above 3.5GB was just as bad months ago as it is today, so saying it hasn't changed makes no sense, because a bad thing is being cast in a good light.

If you hear "the performance hasn't changed," you will automatically think of it positively, while ignoring the fact that these issues were present back then as well. It is not a neutral statement but one of cleverly constructed words.

The GTX 970's performance has not changed; our knowledge of the card and its actual specifications has. The GTX 970 performs the same as it did months ago: above 3.5GB of memory use, performance is poor; below 3.5GB of use, performance is not hindered by the 32-bit bus of the 512MB portion of the 970's memory.

But we don't say it like that, do we? Because that would inform people about an actual bottleneck and not look good for the future.
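The bottleneck described above follows from the bus widths Nvidia has since disclosed: the 3.5GB segment sits behind seven 32-bit memory controllers and the 0.5GB segment behind one, and the card reads from only one segment at a time. A back-of-the-envelope sketch of the peak numbers, assuming the stock 7 Gbps effective GDDR5 data rate (illustrative arithmetic, not a benchmark):

```python
# Rough peak-bandwidth arithmetic for the GTX 970's two memory segments.
# Assumes the stock 7 Gbps effective GDDR5 data rate; purely illustrative.

def segment_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float = 7.0) -> float:
    """Peak bandwidth in GB/s for a memory segment with the given bus width."""
    return bus_width_bits / 8 * data_rate_gbps

fast = segment_bandwidth_gbps(7 * 32)  # 3.5GB segment: seven 32-bit controllers
slow = segment_bandwidth_gbps(1 * 32)  # 0.5GB segment: one 32-bit controller
full = segment_bandwidth_gbps(8 * 32)  # unified 256-bit bus (GTX 980 style)

print(f"3.5GB segment:   {fast:.0f} GB/s")  # 196 GB/s
print(f"0.5GB segment:   {slow:.0f} GB/s")  # 28 GB/s
print(f"full 256-bit bus: {full:.0f} GB/s")  # 224 GB/s
```

At roughly 28 GB/s, the slow segment has DDR3-class bandwidth, which is why spilling past 3.5GB shows up as stutter rather than a graceful slowdown.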


----------



## Jesse36m3

Quote:


> Originally Posted by *sugalumps*
> 
> He is right though, it still has the exact same performance it did a few days ago before all this drama. It still had the flaw and no one noticed it until now, you know why? Because 99% of the 970 owners will never go above 3gb vram. Not that it makes what nvidia has done ok, but technically the card still performs as well as it did last week when everyone LOVED it.


Serious question: what is the most taxing setting that would make VRAM usage increase? Anti-aliasing? Textures? I just want to know what I should monitor so that I can try to remedy this if it happens.


----------



## skupples

you don't notice it because most people choose to run @ 60FPS @ high settings over 15FPS @ super ultra mega high DSR settings.


----------



## Gamer_Josh

I think people are making too big a deal out of this. There is still 4GB of VRAM, even if 500MB of it isn't used the way you thought it would/should be. Of course, optimizing the use through a driver update is welcome.

My 970 still performs as awesome now as it did when I first got it, so I'm not bothered a bit by it. If you're dissatisfied with it, my suggestion would be to return it if possible, and go another route. Otherwise, be happy and carry on.


----------



## Tsumi

Quote:


> Originally Posted by *Jesse36m3*
> 
> Serious question. What is the most taxing "setting" that would make vram increase? Anti-Aliasing? Textures? I just want to know what I should try and monitor so that I can try and remedy this if I happens.


True AA (not FXAA and other AAs like that) and textures. Also screen resolution, but you can't go above your resolution.
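For the monitoring half of the question, `nvidia-smi` can report per-GPU memory use. A small sketch that parses its CSV output; the tool and the `memory.used` query field are real, but treat the exact invocation as something to verify against your own driver version:

```python
import subprocess

def gpu_memory_used_mib(sample=None):
    """Return memory.used (in MiB) per GPU, parsed from nvidia-smi CSV output.

    Pass `sample` to parse a captured string instead of invoking nvidia-smi
    (handy for testing on a machine without an NVIDIA GPU)."""
    if sample is None:
        sample = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            text=True,
        )
    return [int(line.strip()) for line in sample.splitlines() if line.strip()]

# Example with captured output from a 970 nearing the boundary:
captured = "3478\n"
used = gpu_memory_used_mib(captured)
print(used)  # [3478]
if any(m > 3584 for m in used):  # 3.5 GiB = 3584 MiB (the fast segment)
    print("spilled into the slow 0.5GB segment")
```

Logging that number while raising AA or texture quality one notch at a time shows which setting pushes a given game past the 3584 MiB mark.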


----------



## Johnny Rook

Quote:


> Originally Posted by *Cyro999*
> 
> The 970 can only access one pool of memory at a time. The first pool uses seven VRAM chips each with 32 bit interface, while the gtx980 uses eight.
> 
> peak bandwidth (and actual bandwidth) is 7/8'ths (87.5%) of gtx980 as has been shown by both nvidia detailing actual specs as well as actual benchmarks.


WRONG!

Stop spreading lies throughout the Internet; we already have nVIDIA doing that for you with their false advertising.

Lower-spec'ed GPUs, with less "core" power to render pixels, will drop performance and stutter sooner. Take the R9 290X for instance:



Do you think this is OK? Do you think it's better than GTX 970 in the same game?



I don't think so!

As far as I'm concerned, in most gaming scenarios, a lower-spec'ed card will see its performance degrade sooner than the mighty GTX 980, not because of VRAM but because of less "core" power.

The games that show odd behaviour - afaik, CoD: AW - are most certainly being poorly managed by Windows and the driver's heuristics. A driver update will fix this, and nVIDIA already confirmed they will do it. So the only question now should be: "Will nVIDIA's driver team deliver?" If it does, cool. If it doesn't, GTX 970 customers AND GTX 970 CUSTOMERS ALONE (we are upset enough already; we are not AMD fanboys' shields) should round on nVIDIA and demand compensation, since we can already refund the cards (at least in the US).


----------



## Vesku

Quote:


> Originally Posted by *sugalumps*
> 
> He is right though, it still has the exact same performance it did a few days ago before all this drama. It still had the flaw and no one noticed it until now, you know why? Because 99% of the 970 owners will never go above 3gb vram. Not that it makes what nvidia has done ok, but technically the card still performs as well as it did last week when everyone LOVED it.


Actually it was noticed when a few people started figuring out the stuttering they were experiencing wasn't the game engine but the card.


----------



## Cyclonic

Love all the 970 owners trying to defend their card.

I'll check back here over the next few months, when more console ports start to use around 4 gigs.


----------



## mtcn77

Quote:


> Originally Posted by *Johnny Rook*
> 
> WRONG!
> 
> Stop spreading lies throughout the Internet; we already have nVIDIA doing it for you with their false-advertisements.
> 
> Lower spec'ed GPUs, with less "core" power to render pixel will drop performance and stutter sooner. Take the R9 290X for instance:
> 
> 
> 
> Do you think this is OK? Do you think it's better than GTX 970 in the same game?
> 
> 
> 
> I don't think so!
> 
> As far as I'm concerned, in most gaming scenarios, a lower spec'ed card will degrade its performance sooner than the mighty GTX 980, not because of VRAM but, because of less "Core" power.
> 
> The games that show odd behaviour - afaik, CoD: AW - are most certainly being poorly managed by Windows and the driver's heuristics. A driver update will fix this, and nVIDIA already confirmed they will do it. So the only question now should be: "Will nVIDIA's driver team deliver?" If it does, cool. If it doesn't, GTX 970 customers AND GTX 970 CUSTOMERS ALONE (we are upset enough already; we are not AMD fanboys' shields) should round on nVIDIA and demand compensation, since we can already refund the cards (at least in the US).


You should verify your sources. There won't be a "magic" driver. [Source]


----------



## maarten12100

Quote:


> Originally Posted by *Johnny Rook*
> 
> WRONG!
> -snip


But why did you choose stutter dogs? It's a Nvidia backed title after all.


----------



## adi518

I hope Nvidia is coming up with a solution soon. It's making my blood boil when I think of all the hassle this is going to cause me.


----------



## criminal

Quote:


> Originally Posted by *adi518*
> 
> I hope Nvidia is coming up with a solution soon. It's making my blood boil when I think of all the hassle this is going to cause me.


What kind of solution are you looking for? Either return the card or don't. Those are your choices.


----------



## maarten12100

Quote:


> Originally Posted by *adi518*
> 
> I hope Nvidia is coming up with a solution soon. It's making my blood boil when I think of all the hassle this is going to cause me.


They will release a driver that forever caps the card at 3.5GB while showing it is using 4GB, at least that is what I expect. Can't end well...


----------



## xx9e02

Quote:


> Originally Posted by *adi518*
> 
> I hope Nvidia is coming up with a solution soon. It's making my blood boil when I think of all the hassle this is going to cause me.


https://twitter.com/NVIDIAGeForce/status/560881138869497856

There won't be a solution lol


----------



## adi518

Lol. They can go to hell then. Not everyone has access to a friendly retailer like Newegg. I bought it in November. Way beyond the return time frame now.


----------



## iSlayer

Let this sink in for a moment.

PCPer picked up and reported on the 970s not working as expected... and people think PCPer has been paid by Nvidia. Brb, getting so high that this makes sense.

Haha, it's a catch-22 though; I'll be dead.
Quote:


> Originally Posted by *skupples*
> 
> so this perceived "caution" of properly using/reporting FCAT for frame times on 970 makes them shills?
> 
> I'm trying to follow, I really am.
> 
> or was it the NV tipping them off that AMD had frame time issues, & then blowing it out of the water, & proving that they did?


No, the people who were upset about PCPer reporting it were.
Quote:


> Originally Posted by *provost*
> 
> I think we all know that its EOL, as in no more in production. But, what does this mean for ongoing driver improvement and support? Is the Support EOL too?


Quote:


> Originally Posted by *Mad Pistol*
> 
> No, support for all DX11 parts in Nvidia's arsenal is ongoing. I'm sure that Maxwell is top priority simply because it is a new architecture, and optimization is key to remaining competitive.
> 
> If Nvidia cut off support for every product after they were EOL, no one would still be buying their products.


Yarp


----------



## Silent Scone

Quote:


> Originally Posted by *xx9e02*
> 
> https://twitter.com/NVIDIAGeForce/status/560881138869497856
> 
> There won't be a solution lol


That's bad, but I would take anything said from behind a Twitter account with a huge pinch of salt lol.


----------



## Menta

Quote:


> Originally Posted by *adi518*
> 
> I hope Nvidia is coming up with a solution soon. It's making my blood boil when I think of all the hassle this is going to cause me.


At this point I would not expect much from them; Nvidia is focused now on insisting nothing is wrong. Some 970 owners will in fact get lucky, like the EVGA owners, but the large majority of people will stay stuck.

I will probably try to sell my card and cut my losses... I will lose about 50 euros, but I'm fine with that; I just want this to end.


----------



## iSlayer

Quote:


> Originally Posted by *Menta*
> 
> At this point I would not expect much from them; Nvidia is focused now on insisting nothing is wrong. Some 970 owners will in fact get lucky, like the EVGA owners, but the large majority of people will stay stuck.
> 
> I will probably try to sell my card and cut my losses... I will lose about 50 euros, but I'm fine with that; I just want this to end.


Perhaps you should read this thread. There are Nvidia reps on their forums you can talk to if you need help getting a return.


----------



## grunion

I've just seen the "official" Asus response.

LOL


----------



## iSlayer

Quote:


> Originally Posted by *grunion*
> 
> I've just seen the "official" Asus response.
> 
> LOL


Care to share? Knowing ASUS it's probably "get bent".

ASUS is probably the worst company to buy from if you expect your warranty or purchase to mean anything to them. The horror stories...

It's why I have started trying to avoid more ASUS purchases, however unsuccessfully.


----------



## MaCk-AtTaCk

Quote:


> Originally Posted by *Cyclonic*
> 
> Love all the 970 owners trying to defend there card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ill stay here the next few months when more console ports start to use arround 4 gig


Actually, I think it would be better if you just left. You're definitely not being constructive or helping at all. Just a shill.


----------



## Menta

Quote:


> Originally Posted by *iSlayer*
> 
> Perhaps you should read this thread. There are Nvidia reps on their forums you can talk to if you need help getting a return.


I have been following many threads; there was one at GeForce, but that guy was like a bot and nothing more...

Neither is ASUS willing; I received a mail from them saying that everything is fine, but if I want to RMA...


----------



## Johnny Rook

Quote:


> Originally Posted by *mtcn77*
> 
> You should verify your sources. There won't be a "magic" driver. [Source]


NO, you should verify yours. First, in that whole thread you sourced, which I am monitoring and contributing to, there wasn't a single statement from nVIDIA saying they will not release a driver. Plus, it's not "magical," it's just "heuristics"; you know what heuristics are, don't you? It might sound magical, but it is only AI.

Second, the card has no design flaw; the hardware is working as intended - and works very well if you look at the BF4 or SoM FCAT benchmarks, where it behaves like the GTX 980. When it doesn't work - CoD: AW - it is not by hardware design fault but rather poor heuristic management of the VRAM pools.
Quote:


> Originally Posted by *xx9e02*
> 
> https://twitter.com/NVIDIAGeForce/status/560881138869497856
> 
> There won't be a solution lol


If the damn card was designed that way, of course they can't change it! That would take an entirely new card, with probably lower yields and thus more expense.

But saying that the card is not going to change is not the same thing as saying there will be no driver update to improve the heuristics.
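Nobody outside Nvidia knows what the driver's placement heuristic actually looks like, but the idea being argued over here (prefer the fast 3.5GB segment, spill to the slow 0.5GB only once the fast one is full) can be sketched in a few lines. Everything in this toy model, names and sizes included, is illustrative and not the driver's real logic:

```python
# Toy model of a two-segment placement heuristic: allocations land in the
# fast pool first and spill to the slow pool only when the fast one is full.
# Purely illustrative; not Nvidia's actual driver behavior.

FAST_CAPACITY_MIB = 3584  # 3.5 GiB segment, full-speed access
SLOW_CAPACITY_MIB = 512   # 0.5 GiB segment, roughly 1/7th the bandwidth

class TwoPoolAllocator:
    def __init__(self):
        self.fast_used = 0
        self.slow_used = 0

    def alloc(self, size_mib):
        """Place an allocation; return which segment it landed in."""
        if self.fast_used + size_mib <= FAST_CAPACITY_MIB:
            self.fast_used += size_mib
            return "fast"
        if self.slow_used + size_mib <= SLOW_CAPACITY_MIB:
            self.slow_used += size_mib
            return "slow"  # this is where the stutter complaints would begin
        raise MemoryError("out of VRAM")

vram = TwoPoolAllocator()
print(vram.alloc(3000))  # fast
print(vram.alloc(500))   # fast (3500 MiB still fits in the fast segment)
print(vram.alloc(200))   # slow: the fast segment can no longer fit it
```

The debate in the thread is essentially about how cleverly the real driver picks *which* resources get demoted to the slow pool, not whether a second pool exists.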


----------



## juano

Quote:


> Originally Posted by *Johnny Rook*
> 
> ...
> But saying that the card is not going to change is not the same thing as saying there will be no driver update to improve the heuristics.


Yea can you imagine if they said that? LOL crazy!

Wait... what's this here? https://twitter.com/NVIDIAGeForce/status/560878957554569216


----------



## Asus11

Seems like Nvidia in the UK are putting their hands up and accepting refunds via the UK retailer Overclockers for mis-selling their product.

I guess that's good news for the people who want to return their GPUs, but I can see a lot of people jumping to the red team. Nvidia has made a big mistake in that regard: first the 970, then the 960.

Who can you trust anymore? I bet they wish they hadn't started this cash cow, as it may have led to big profits in the present but may hinder profits in the future.


----------



## mtcn77

Quote:


> Originally Posted by *iSlayer*
> 
> Care to share? Knowing ASUS it's probably "get bent".
> 
> ASUS is probably the worst company to buy from if you expect your warranty or purchase to mean anything to them. The horror stories...
> 
> Its why I have started trying to avoid more ASUS purchases, however unsuccessfully.


Except, they don't break for no reason.
"Asustek ships 5 million, Gigabyte 3.6 million, MSI 2.8-3.0 graphics cards in 2014"


----------



## criminal

Quote:


> Originally Posted by *juano*
> 
> Yea can you imagine if they said that? LOL crazy!
> 
> Wait... what's this here? https://twitter.com/NVIDIAGeForce/status/560878957554569216


Oops... wow Nvidia...wow

Peter might be looking for a new job after this...


----------



## Menta

Quote:


> Originally Posted by *criminal*
> 
> Oops... wow Nvidia...wow
> 
> Peter might be looking for a new job after this...


Damn!!!!!!!!!! LOL

Should I laugh? Is NV waging war?


----------



## Xoriam

That guy who responded on Nvidia twitter probably has no idea whats going on.


----------



## juano

Quote:


> Originally Posted by *Xoriam*
> 
> That guy who responded on Nvidia twitter probably has no idea whats going on.


That's an accurate description for too large a percentage of Nvidia employees as of late.


----------



## mtcn77

Quote:


> Originally Posted by *Johnny Rook*
> 
> NO, you should verify yours. First, in that whole thread you sourced, which I am monitoring and contributing to, there wasn't a single statement from nVIDIA saying they will not release a driver. Plus, it's not "magical," it's just "heuristics"; you know what heuristics are, don't you? It might sound magical, but it is only AI.
> 
> Second, the card has no design flaw; the hardware is working as intended - and works very well if you look at the BF4 or SoM FCAT benchmarks, where it behaves like the GTX 980. When it doesn't work - CoD: AW - it is not by hardware design fault but rather poor heuristic management of the VRAM pools.
> If the damn card was designed that way, of course they can't change it! That would take an entirely new card, with probably lower yields and thus more expense.
> 
> But saying that the card is not going to change is not the same thing as saying there will be no driver update to improve the heuristics.


You should quit asking multiple consecutive questions as well. It doesn't serve you well when you pose them as an open-ended suggestion without proof. I could list a few fallacies with that; presupposition is one.
The 290X doesn't have spikes for the same reason Nvidia does.
INB4


----------



## maarten12100

Quote:


> Originally Posted by *grunion*
> 
> I've just seen the "official" Asus response.
> 
> LOL


Judging by your As(u)shole line above your avatar, I'm quite positive they side with Nvidia on "there is nothing funky going on; everything functions as advertised."


----------



## Xoriam

I've gotten horrible support from every company that deals with PC parts apart from Corsair and EVGA.
From now on I think every single part that goes in my PC will be from one of those companies.

When I bought my Gigabyte G1s, they were advertised in all of the reviews as having Samsung memory modules. When I received them, however, they had Hynix, and the memory obviously overclocked like crap. I contacted them about the chips being switched without telling us first, and their response was robotic. I got this as a response: "If the card seems to function, please feel safe to use it."

What?????


----------



## grunion

Quote:


> Originally Posted by *iSlayer*
> 
> Care to share? Knowing ASUS it's probably "get bent".
> 
> ASUS is probably the worst company to buy from if you expect your warranty or purchase to mean anything to them. The horror stories...
> 
> Its why I have started trying to avoid more ASUS purchases, however unsuccessfully.


Quote:


> Originally Posted by *maarten12100*
> 
> judging by your As(u)shole line above your avatar I'm quite positive they side with Nvidia on "there is nothing funky going on everything functions as advertised"












Hasn't gone public yet.


----------



## criminal

Quote:


> Originally Posted by *Xoriam*
> 
> That guy who responded on Nvidia twitter probably has no idea whats going on.


Well Peter changed his original post on Nvidia forums: https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4438090/#4438090

You can see his original, which someone quoted right after his initial post. There never was going to be a driver to try to address the 970 issue.


----------



## Anarion

Quote:


> Originally Posted by *Xoriam*
> 
> That guy who responded on Nvidia twitter probably has no idea whats going on.


Judging by Nvidia's communication as a whole over the last few days, I have to say that no one at Nvidia has any idea what's going on. They keep sending individuals onto forums to calm people down, but instead they make people even angrier. On their forums they said they would try to compensate with a driver fix (which means the card did not perform as it should have from the beginning and they tried to cover it up), and then they said a driver "fix" is not coming, thus proving their previous guy wrong. Something really fishy is going on: either they are working hard to cover it up, or they really have no idea what is happening in their own company. AMD is releasing new cards soon, and they keep telling lies and pretending nothing ever happened. They really must want to see their sales fall.


----------



## glr123

Quote:


> Originally Posted by *Xoriam*
> 
> That guy who responded on Nvidia twitter probably has no idea whats going on.


Actually, looks like [email protected] has redacted all of his statements and edited his previous messages:

https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/219/

There appears to be no specific driver incoming for the 970.


----------



## Xoriam

omg... -_- so lame.


----------



## criminal

Quote:


> Originally Posted by *glr123*
> 
> Actually, looks like [email protected] has redacted all of his statements and edited his previous messages:
> 
> https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/219/
> 
> There appears to be no specific driver incoming for the 970.


Yep... digging that hole even deeper...


----------



## maarten12100

Quote:


> Originally Posted by *grunion*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hasn't gone public yet.


I'm looking forward to laughing about it then. Though for the people who own an Asus GTX 970 card, it may not be a laughing matter.
Quote:


> Originally Posted by *criminal*
> 
> Well Peter changed his original post on Nvidia forums: https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4438090/#4438090
> 
> You can see his original someone quoted right after his initial post. Never was going to be a driver to try and address the 970 issue.


I feel a bit sad for Peter; he was trying to help people but was probably scolded by his superiors and had to edit his post. I fear he might get demoted for trying to help people...

Nvidia high command, what are you doing...


----------



## dejo1

At this point, I don't think I would mind if Nvidia had to fold up shop and stop making goods.


----------



## Johnny Rook

Quote:


> Originally Posted by *mtcn77*
> 
> You should quit multiple consecutive questions as well. It doesn't serve you well when you ask them like an open ended suggestion without proof. I could list a few fallacies with that. Presupposition is one.
> 290x doesn't have spikes because of the same reason Nvidia does.
> INB4


Let's be serious and intellectually honest, OK? Both of us.

PCPer and Guru3D FCAT benchmarks of the GTX 970 against the GTX 980 show both cards have exactly the expected performance degradation; no problems with those games. If that's the case, doesn't this prove the card is working as intended? I think it does.

Now, let's go to the situation where the GTX 970 has a performance degradation greater than expected, relative to the GTX 980. What you're saying is that the card suddenly has a design flaw, ignoring the previous results in BF4. What I am ASKING is: CONSIDERING the BF4 results that showed the card is OK, shouldn't the CoD: AW results be down to heuristics, or to fewer CUDA cores, less core power? That's all I'm doing.

So, I throw a "control sample" onto the table: the R9 290X. And the R9 290X shows the same degradation as the GTX 970. The R9 290X doesn't rely on heuristics. So, the performance drop and stutter have to be because of less core power. What else? If it can be less core power for the R9 290X, it can also be for the GTX 970.

By all indications, nVIDIA will update the drivers. If the drivers fix the CoD: AW results, then I am correct. If they don't, then YOU are correct.

Does that seem fair to you?


----------



## digiadventures

185 pages???
I can't believe this. I don't understand what the fuss is about.
The 970 is an amazing card, and very cheap for the performance you get. I am buying it as soon as I can afford it; it's certainly much better than the 960, which only has 2 GB of RAM, and, at least where I live, much cheaper than the 980.

So what if only 3.5 GB of the VRAM works at full speed? Who cares? Even if it actually only had 3.5 GB of VRAM instead of 4, it would still be an amazing card and the best bang for your buck.

As if 500 MB of RAM is going to make a huge difference; if you hit the VRAM wall in some game with 3.5 GB, then you are not going to have a fluid experience with 4 GB either... that kind of game requires 6 GB of VRAM.


----------



## Mad Pistol

Quote:


> Originally Posted by *dejo1*
> 
> at this point, I dont think I would mind if nvidia had to fold up shop and stop making goods


First, that's not going to happen. Nvidia has some very deep pockets right now. Geforce is only one part of their income. They will bounce back from this.

Second, you would mind because without Nvidia, there would be no GPU war, and the prices would be sky high. The only reason Nvidia launched the Titan Z @ $3000 is because they believed that AMD could not make a dual-GPU version of their R9 290X... and AMD proved them wrong.


----------



## GrimDoctor

Actually it's more that the card had a flaw at the beginning and as games with higher needs got released it became more apparent and will get worse in the immediate future. The said gimping of the card means that the life span of the card is terrible, even at that price.

If this card is 'working as intended' then the intentions are poor and nothing can excuse the fact that it wasn't revealed...no amount of driver support will right this before the card is outdated and with said issues it's already on that precipice, hence, it appears Nvidia are now back flipping to cut their losses and move on now.

It's certainly a sad day.


----------



## Mad Pistol

Quote:


> Originally Posted by *GrimDoctor*
> 
> Actually it's more that the card had a flaw at the beginning and as games with higher needs got released it became more apparent and will get worse in the immediate future. The said gimping of the card means that *the life span of the card is terrible, even at that price.*
> 
> If this card is 'working as intended' then the intentions are poor and nothing can excuse the fact that it wasn't revealed...no amount of driver support will right this before the card is outdated and with said issues it's already on that precipice, hence, it appears Nvidia are now back flipping to cut their losses and move on now.
> 
> It's certainly a sad day.


The lifespan of the card is TERRIBLE, you say?

Nvidia will constantly be releasing new drivers as newer games are released, and they will have performance tweaks and improvements. Do you think that's because Nvidia suddenly pours some magical sauce on the game code and causes it to run better?

No.

Nvidia optimizes how the game code is interpreted by the video card, meaning that the GTX 970 will continue to be optimized as long as there is driver support. You guys act like the world is going to end because of this card, but that's just not the case. It will get the same treatment that every other "weird" card that nvidia has ever released has gotten... that is to say lots of optimizations.


----------



## digiadventures

Well, since the card is so terrible, and I am a noble person, I will take it for half the price, so you don't have to throw it away.


----------



## Silent Scone

Quote:


> Originally Posted by *Asus11*
> 
> seems like Nvidia in the UK are putting their hands up and accepting refunds via the UK retailer Overclockers for mis-selling their product.
> 
> I guess that's good news for the people who want to return their GPUs... but I can see a lot of people jumping to the red team. Nvidia has made a big mistake in that regard... first the 970, then the 960...
> 
> who can you trust anymore? I bet they wish they didn't start this cash cow, as it may have led to big profits in the present but may hinder profits in the future


No, NVIDIA aren't; the e-tailers are, and certain AIB partners. I'm not entirely sure what NVIDIA themselves are telling consumers, probably because they don't want you to be adamant that they're accepting returns.


----------



## Vesku

Nvidia looks to be in damage-control mode. They made the Nvidia forum guy edit his post so he no longer provides confirmation that the memory configuration of the 970 is not optimal. They don't want to provide people with any quotables. That doesn't mean they won't be tweaking how games behave with the 970's fast+slow memory in future driver releases. Just don't expect it to be called the "970 fix" or anything, though that should be obvious.


----------



## dejo1

msi told me the only way to return was if it was faulty


----------



## GrimDoctor

Quote:


> Originally Posted by *Mad Pistol*
> 
> The lifespan of the card is TERRIBLE, you say?
> 
> Nvidia will constantly be releasing new drivers as newer games are released, and they will have performance tweaks and improvements. Do you think that's because Nvidia suddenly pours some magical sauce on the game code and causes it to run better?
> 
> No.
> 
> Nvidia optimizes how the game code is interpreted by the video card, meaning that the GTX 970 will continue to be optimized as long as there is driver support. You guys act like the world is going to end because of this card, but that's just not the case. It will get the same treatment that every other "weird" card that nvidia has ever released has gotten... that is to say lots of optimizations.


So Nvidia apparently reneging on a driver fix is not indication that they are cutting their losses?

Current games on the market right now, because of their VRAM needs, are already making the 970 redundant, let alone high-end applications. This is from first-hand experience, not conjecture.


----------



## amd955be5670

Nvidia retracted their statement of an incoming driver fix. The forum post has been updated.


----------



## Silent Scone

Quote:


> Originally Posted by *GrimDoctor*
> 
> So Nvidia apparently reneging on a driver fix is not indication that they are cutting their losses?
> 
> Current games on the market right now, because of their vram needs are already making the 970 redundant, let alone high end applications. This is from first hand experience not conjecture.


This is the same for 980GTX owners though.

People hear what they want to hear. Especially rage filled younglings.


----------



## GrimDoctor

Quote:


> Originally Posted by *Silent Scone*
> 
> This is the same for 980GTX owners though.
> 
> People hear what they want to hear. Especially rage filled younglings.


I just got a 980; so far the memory management/allocation is far better IMO, and I think that's the key to all of this.


----------



## Silent Scone

Sounds like placebo to me but ok.


----------



## maarten12100

Quote:


> Originally Posted by *digiadventures*
> 
> 185 pages???
> I can't believe this. I don't understand what the fuss is about.
> The 970 is an amazing card, and very cheap for the performance you get. I am buying it as soon as I can afford it; it's certainly much better than the 960, which only has 2 GB of RAM, and, at least where I live, much cheaper than the 980.
> 
> So what if only 3.5 GB of the VRAM works at full speed? Who cares? Even if it actually only had 3.5 GB of VRAM instead of 4, it would still be an amazing card and the best bang for your buck.
> 
> As if 500 MB of RAM is going to make a huge difference; if you hit the VRAM wall in some game with 3.5 GB, then you are not going to have a fluid experience with 4 GB either... that kind of game requires 6 GB of VRAM.


It's about getting what you were told you were getting. Nvidia said 4GB at 224GB/s. What they delivered is 3.5GB + 0.5GB, with the first part running at good speeds and the latter part performing so horribly that once you cross into that territory, frame rates plummet. Luckily for Nvidia, the card and drivers behave in such a way that usage mostly stays below 3.5GB, meaning this wasn't noticed until some people crossed it.

As for who cares: everybody who was scammed into buying the card under the assumption that the specs on the Nvidia site were correct. As for best bang for the buck, that is already debunked, though if you factor in power efficiency it would still somewhat apply.
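The effect of that split can be shown with a toy model. This is my own back-of-the-envelope sketch, not driver code; the ~196GB/s (7 memory controllers) and ~28GB/s (1 controller) per-segment figures are the numbers circulating in the reviews, used only to illustrate why things fall off a cliff past 3.5GB:

```python
# Illustrative model of the GTX 970's segmented VRAM (assumed figures, not
# measured): a 3.5 GB fast segment at ~196 GB/s and a 0.5 GB slow segment
# at ~28 GB/s. Shows how the blended bandwidth drops once an allocation
# spills into the slow segment.

FAST_SEG_GB = 3.5
FAST_BW = 196.0   # GB/s, fast 3.5 GB segment
SLOW_BW = 28.0    # GB/s, slow 0.5 GB segment

def effective_bandwidth(vram_in_use_gb: float) -> float:
    """Naive footprint-weighted average of the two segments' bandwidths."""
    if vram_in_use_gb <= FAST_SEG_GB:
        return FAST_BW
    slow_part = vram_in_use_gb - FAST_SEG_GB
    return (FAST_SEG_GB * FAST_BW + slow_part * SLOW_BW) / vram_in_use_gb

print(effective_bandwidth(3.0))  # 196.0 - fast segment only
print(effective_bandwidth(4.0))  # 175.0 - blended once both segments are in use
```

Of course the real penalty is worse than a weighted average suggests, because the frames that actually touch the slow segment stall, which is why it shows up as stutter rather than a uniform slowdown.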

You're probably one of those consumers who lack moral backbone and will let large corporations get away with fraudulent acts, but many people won't stand for that. A solution to the problem has to be delivered, along with an apology, rather than hiding and cover-ups.

Even though I have no reason to feel cheated, as I didn't buy a GTX 970 yet, I will still defend the right cause. For example, my energy provider made an error due to which all my electronics were fried, my washing machine caught fire and 40 solar panels were destroyed. Their own electrical engineers even stated this "MOF" should have been replaced by them years ago. Yet the company says it will not be covering the damages; they will only cover up to 3000 euro, while the total damage is in the 20k euro range. Therefore I will take legal action until justice is served. This isn't even a matter of money; it is about them getting what they deserve. After all, their failure to do their job could have gotten 4 people killed and destroyed about 20k worth of stuff.

The name of this horrible company, by the way, is Liander, and I'm not the only one with a problem; various other people have complained that Liander doesn't pay for the damage it has done. The average claim takes 5 years, which is way too long. Getting what you deserve is something you have to enforce in a society where big companies think they can get away with outrageous things.


----------



## GrimDoctor

Quote:


> Originally Posted by *Silent Scone*
> 
> Sounds like placebo to me but ok.


A VRAM-hungry application (not a game) not crashing on a 980 vs a 970 is not a placebo, my friend. It will either crash or it won't.


----------



## tpi2007

Quote:


> Originally Posted by *Menta*


I don't think that Nvidia is fully understanding the consequences of their behaviour, and neither is the tech press.

What this essentially means is that Nvidia doesn't consider as official advertising anything other than what they post on their website.

Handing over technical press kits to tech journalists so that they can present the architecture to readers doesn't count. Nvidia is throwing the tech press under the bus. Nothing we read on tech sites, not even the supposedly factual things can be trusted anymore, it doesn't count.

And the tech media is still defending Nvidia.

I honestly don't understand why the tech media doesn't even consider the possibility that games coming out this year and the next might stress the GTX 970 in a way that will reveal its shortcomings.

Instead, they concentrate their discourse on _stating the obvious_, that the games that they reviewed still perform the same and that's great, and all there is to it, as if they felt an irrational need to justify their works' worth.

GTA V, The Witcher 3, the next Batman game, among others, and whatever comes out next year, can't and won't stress the card ? It's as if they all have this notion that nothing in the gaming world will change at all in the next year, where the GTX 970's performance will still pretty much be relevant. As if the future in the next year is to remain in the _status quo_ of games using the exact same amount of VRAM, not a megabyte more, as the ones they reviewed the cards with.

People don't buy cards to only play the current games and the ones that came before, they also normally intend to keep the cards for at least a year or two and have certain headroom expectations based on the specs that they read, and that forethought goes into the value they attribute to the card when buying it.

Discarding this entirely doesn't make any sense.


----------



## Johnny Rook

Quote:


> Originally Posted by *GrimDoctor*
> 
> Vram hungry application (not game) not crashing on a 980 vs 970 is not a placebo my friend. It will either crash or it won't.


Taking an "app" instead of a "game" as an example tells me a lot.

It tells me that in games, where you should be using the GTX 980, you are either not seeing the difference or, if you are, you are not confident enough that it isn't a "placebo effect".


----------



## maarten12100

Quote:


> Originally Posted by *Johnny Rook*
> 
> Taking an "app" instead of a "game" as an example tells me a lot.
> 
> It tells me that in games, where you should be using the GTX 980, you are either not seeing the difference or if you are, you are not confident enought it isn't a "placebo effect".


You can no longer use applications on your Nvidia card because they are not games? Please tell me more about the fantasy world you live in. Some people use their cards for more than just gaming.

However, there are games that show it very well, judging by the reddit page, so simply loading one of those with the same settings should reveal a huge difference, meaning it isn't a placebo effect.


----------



## TopicClocker

Quote:


> Originally Posted by *SKYMTL*
> 
> Gotcha.
> 
> I'll go back to my plans for world domination now. Conference call with the Illuminati at 2PM and then the Roswell aliens visit at 4PM. Big day today!


Hilarious post!









On a more serious note, do you mind getting the Greys to contact me, I'm in need of a Gundam!








Quote:


> Originally Posted by *digiadventures*
> 
> 185 pages???
> I can't believe this. I don't understand what the fuss is about.
> The 970 is an amazing card, and very cheap for the performance you get. I am buying it as soon as I can afford it; it's certainly much better than the 960, which only has 2 GB of RAM, and, at least where I live, much cheaper than the 980.
> 
> So what if only 3.5 GB of the VRAM works at full speed? Who cares? Even if it actually only had 3.5 GB of VRAM instead of 4, it would still be an amazing card and the best bang for your buck.
> 
> As if 500 MB of RAM is going to make a huge difference; if you hit the VRAM wall in some game with 3.5 GB, then you are not going to have a fluid experience with 4 GB either... that kind of game requires 6 GB of VRAM.


Noooo, that's not the point at all; nobody is doubting the performance, which hasn't changed.

The fuss is about the specs being kept quiet or not revealed, so those who bought the card before this came out may feel a combination of things: betrayed, lied to, ripped off and so on. On top of that, it could be said that it was falsely advertised to some extent; however, some people are not much bothered by it.

Not only that but it took Gamers and Enthusiasts to find this out before NVIDIA came clean about it.

It's kind of a tricky topic, but some people who bought a GTX 970 expecting it to be a true 4GB card as advertised feel as if they were deceived, this is pretty much a big thing to be honest.

VRAM also does matter. Having a 3.5GB full-speed segment plus a slower 0.5GB segment instead of a full-speed 4GB may not be catastrophic, but no one really knows whether it may cause problems later, and the fact that everyone is only finding out about the memory architecture now is not a good thing.

I suppose you could say that if people buy a GTX 970 with 4GB, they expect it to truly be a 4GB card as other 4GB cards are such as the GTX 980, the R9 290 and the R9 290X.

A lot of people say VRAM doesn't matter, just like with the GTX 670, 680, 760 and 770 2GB cards; however, with the next-gen games of today, having a 4GB variant is worthwhile for running higher-quality textures.
Quote:


> Originally Posted by *Mad Pistol*
> 
> The lifespan of the card is TERRIBLE, you say?
> 
> Nvidia will constantly be releasing new drivers as newer games are released, and they will have performance tweaks and improvements. Do you think that's because Nvidia suddenly pours some magical sauce on the game code and causes it to run better?
> 
> No.
> 
> Nvidia optimizes how the game code is interpreted by the video card, meaning that the GTX 970 will continue to be optimized as long as there is driver support. You guys act like the world is going to end because of this card, but that's just not the case. It will get the same treatment that every other "weird" card that nvidia has ever released has gotten... that is to say lots of optimizations.


This is likely true, but many gamers are really upset about the VRAM, and I can't say I blame them.

Quote:


> Originally Posted by *Xoriam*
> 
> That guy who responded on Nvidia twitter probably has no idea whats going on.


----------



## GrimDoctor

Quote:


> Originally Posted by *Johnny Rook*
> 
> Taking an "app" instead of a "game" as an example tells me a lot.
> 
> It tells me that in games, where you should be using the GTX 980, you are either not seeing the difference or if you are, you are not confident enought it isn't a "placebo effect".


The app situation was a particular problem for me, so I stated it. I haven't had a chance to test enough games yet, as I just got the card, but I certainly will, and I'm confident.

I prefer my approach of basing things on experience rather than just spouting anything and everything I can.


----------



## Heavy MG

Quote:


> Originally Posted by *dejo1*
> 
> at this point, I dont think I would mind if nvidia had to fold up shop and stop making goods


Really? I don't want Nvidia to stop making GPUs; I just wish they would stop doing damage control and modifying their employees' Nvidia forum posts to try to cover things up. Nvidia is just hurting themselves by censoring everything and not speaking about the situation. Typical "corporations are people too" nonsense in the U.S., so Nvidia will totally get away with it, and most likely lie about specs in the future. At least the EU will probably see a lawsuit out of this, since they have better laws against false advertising.
Quote:


> Originally Posted by *Xoriam*
> 
> I've gotten horrible support from every company that deals with PC parts apart from Corsair and EVGA.
> From now on I think every single part that goes in my PC will be from one of those companies.
> When I bought my Gigabyte G1s they were advertised to have samsung memory modules from all of the reviews.
> When I recieved them however they had Hynix and overclocked like crap obviously (the memory)
> I contacted them about the chips being switched without telling us about it first and their response was robotic.
> I got this as a response. "if the card seems to function, please feel safe to use it."
> What?????


Same here; all of the reviewers were given "cherry picked" cards with Samsung memory, claiming they were getting great memory OCs.
I'd have to guess Gigabyte hurried up and quietly switched the memory to save themselves a few cents.
I have a revision 1.1 card, which is rumored to have fixed the coil whine, but the Hynix RAM is pretty crappy and I can't get more than +100MHz on it.
Wasn't EVGA also the first to help and offer returns or step-ups to GTX 980s? I love Corsair; I just wish they made GPUs.


----------



## mtcn77

Quote:


> Originally Posted by *Johnny Rook*
> 
> Let's be serious and intellectually honest, OK? Both of us.
> 
> PCPer and Guru3D FCAT benchmarks of the GTX 970 against the GTX 980 show both cards have exactly the expected performance degradation; no problems with those games. If that's the case, doesn't this prove the card is working as intended? I think it does.
> 
> Now, let's go to the situation where the GTX 970 has a performance degradation greater than expected, relative to the GTX 980. What you're saying is that the card suddenly has a design flaw, ignoring the previous results in BF4. What I am ASKING is: CONSIDERING the BF4 results that showed the card is OK, shouldn't the CoD: AW results be down to heuristics, or to fewer CUDA cores, less core power? That's all I'm doing.
> 
> So, I throw a "control sample" onto the table: the R9 290X. And the R9 290X shows the same degradation as the GTX 970. The R9 290X doesn't rely on heuristics. So, the performance drop and stutter have to be because of less core power. What else? If it can be less core power for the R9 290X, it can also be for the GTX 970.
> 
> By all indications, nVIDIA will update the drivers. If the drivers fix the CoD: AW results, then I am correct. If they don't, then YOU are correct.
> 
> Does that seem fair to you?


I don't negotiate the truth. For you, I'm going to make the effort, and we will both learn how many fallacies you made.
Red herring - "PCPer and Guru3D FCAT benchmarks of the GTX 970 against the GTX 980 show both cards have exactly the expected performance degradation; no problems with those games. If that's the case, doesn't this prove the card is working as intended? I think it does."
Nobody questions whether the ALUs work correctly. The tests conducted with DSR don't strain the backend enough: textures are supersized, the resolution corresponds, and the textures are compressed by CUDA (efficient - not ROP bound). What needs to be done is the opposite; textures need to stay the same and the pixel pipelines need to be overloaded... through antialiasing. Good thing Maxwell doesn't get caught out with antialiasing.









Appeal to probability - "Now, let's go to the situation where the GTX 970 has a performance degradation greater than expected, relative to the GTX 980."
When in fact, it is 4% worse; note the 290X is 8% less.

"So, I throw a "control sample" onto the table: the R9 290X. And the R9 290X shows the same degradation as the GTX 970. The R9 290X doesn't rely on heuristics. So, the performance drop and stutter have to be because of less core power. What else? If it can be less core power for the R9 290X, it can also be for the GTX 970."
Conjunction fallacy / argument from ignorance/repetition/silence/moderation - I couldn't pick which one. 1. The 290X and 970 aren't the same. You can't meld them in the same pot and expect your favoured outcome. I gave the answer in the previous post > INB4.


----------



## iSlayer

As this thread goes on, my apathy grows and my anger diminishes. AMD owners calling for Nvidia to be sued, and shilling in the hopes of a sweet new 285 if they play Roy's concubine for long enough; Nvidia owners with 7xxs crying because they think Kepler is being nerfed; and 980 owners thinking it's at all relevant that they bought a 980, and how awesome they must be for it (see: insecure, or compensating for something).

OCN news section: as classy as a McDonald's bathroom after three people get explosive diarrhea from poisoned laxatives.
Quote:


> Originally Posted by *Menta*
> 
> i have been following many threads there was one at geforce but that guy was like a bot and nothing more....
> 
> neither ASUS is willing i received a mail form them saying that every thing is fine but if i want to rma...


http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/1050#post_23470519
https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4438090/#4438090
Quote:


> Originally Posted by *mtcn77*
> 
> Except, they don't break out of no reason.
> "Asustek ships 5 million, Gigabyte 3.6 million, MSI 2.8-3.0 graphics cards in 2014"


Did you even read my post? How is any of that relevant?
Quote:


> Originally Posted by *GrimDoctor*
> 
> So Nvidia apparently reneging on a driver fix is not indication that they are cutting their losses?
> 
> Current games on the market right now, because of their vram needs are already making the 970 redundant, let alone high end applications. This is from first hand experience not conjecture.


Someone in marketing did get to the dude's post.

http://imgur.com/jxA5WuD,P3AaJqJ,s63rirM


----------



## Forceman

Quote:


> Originally Posted by *tpi2007*
> 
> I honestly don't understand why the tech media doesn't even consider the possibility that games coming out this year and the next might stress the GTX 970 in a way that will reveal its shortcomings.
> 
> Instead, they concentrate their discourse on _stating the obvious_, that the games that they reviewed still perform the same and that's great, and all there is to it, as if they felt an irrational need to justify their works' worth.
> 
> GTA V, The Witcher 3, the next Batman game, among others, and whatever comes out next year, can't and won't stress the card ? It's as if they all have this notion that nothing in the gaming world will change at all in the next year, where the GTX 970's performance will still pretty much be relevant. As if the future in the next year is to remain in the _status quo_ of games using the exact same amount of VRAM, not a megabyte more, as the ones they reviewed the cards with.
> 
> People don't buy cards to only play the current games and the ones that came before, they also normally intend to keep the cards for at least a year or two and have certain headroom expectations based on the specs that they read, and that forethought goes into the value they attribute to the card when buying it.
> 
> Discarding this entirely doesn't make any sense.


What are they supposed to do? They are journalists, not speculators or psychics. They don't have any idea what future game requirements are going to be, so how can they make a legitimate claim that the card will or won't have enough VRAM for games that come out next year? Everyone assumes that VRAM requirements are going to go up because "consoles", but it's not like games are suddenly going to stop running on cards that have less than the (now apparently magical) 4GB. If the games are really that demanding, then today's cards may not have the horsepower to run them at settings that require more than 4 (or 3.5) GB of VRAM anyway. So you have to not run ultra-super-duper textures and instead just run very-high-super-duper textures; is that so terrible? People make those same trade-offs right now in the interest of higher framerates or more AA or whatever, and it doesn't seem like that big a deal. People are acting like that 0.5GB of VRAM is the end of the world as we know it.

On a related note, I gotta say I can't wait until the 380X comes out with 4GB and then the GM200 card comes out with 6GB, and see how fast the rabidly pro-AMD crowd walks back the "more VRAM is needed" claims.


----------



## PostalTwinkie

It is a physical limitation of the hardware; how was Nvidia hoping to address it with a driver?


----------



## tpi2007

Quote:


> Originally Posted by *Forceman*
> 
> What are they supposed to do? They are journalists, not speculators or psychics. They don't have any idea what future game requirements are going to be, so how can they make a legitimate claim that the card will or won't have enough VRAM for games that come out next year? Everyone assumes that VRAM requirements are going to go up because "consoles", but it's not like games are suddenly going to stop running on cards that have less than the (now apparently magical) 4GB. If the games are really that demanding, then today's cards may not have the horsepower to run them at settings that require more than 4 (or 3.5) GB of VRAM anyway. So you have to not run ultra-super-duper textures and instead just run very-high-super-duper textures; is that the end of the world? People are acting like that 0.5GB of VRAM is the end of the world as we know it.
> 
> On a related note, I gotta say I can't wait until the 380X comes out with 4GB and then the GM200 card comes out with 6GB, and see how fast the rabidly pro-AMD crowd walks back the "more VRAM is needed" claims.


You're distorting what I said. I don't want the tech media to speculate on anything. I just want them to consider that the future, in this case, the next year, might be different from the present and thus not shut the door on the possibility.


----------



## maarten12100

Quote:


> Originally Posted by *PostalTwinkie*
> 
> It is a physical change on the hardware, how was Nvidia hoping to address it with a driver?


A driver that keeps RAM usage below 3.5GB in all cases would cover up the problem. If that driver also spread the displayed 4GB over the available fast 3.5GB, it would cover it up even better, since once the 3.5GB filled it would show 4GB in use.

If Nvidia wants to further scam customers and "fix" this with a driver, that is the "best" course of action. I doubt such an attempt would hold, however, since it is already out and people know!
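Just to make concrete what that kind of cover-up would look like, here is a deliberately hypothetical sketch (my own illustration of the idea above, not anything NVIDIA ships): an allocator that refuses to hand out the slow segment while rescaling reported usage so a full fast segment reads as the advertised 4GB.

```python
# Hypothetical "cover-up" allocator, purely to illustrate the speculation
# above - NOT real driver behavior. It only ever allocates from the fast
# 3.5 GB segment, and rescales reported usage so 3.5 GB in use displays
# as the advertised 4 GB.

class SegmentedVram:
    REPORTED_GB = 4.0  # what the spec sheet says
    FAST_GB = 3.5      # what actually runs at full speed

    def __init__(self) -> None:
        self.used_gb = 0.0

    def alloc(self, size_gb: float) -> bool:
        # Refuse anything past the fast segment, despite advertising 4 GB.
        if self.used_gb + size_gb > self.FAST_GB:
            return False
        self.used_gb += size_gb
        return True

    def reported_usage_gb(self) -> float:
        # Scale usage so a full fast segment displays as "4 GB in use".
        return self.used_gb * (self.REPORTED_GB / self.FAST_GB)

vram = SegmentedVram()
print(vram.alloc(3.5))                    # True  - fast segment fills up
print(vram.alloc(0.1))                    # False - slow segment never handed out
print(round(vram.reported_usage_gb(), 6)) # 4.0   - monitoring tools see "4 GB"
```

Which is exactly why it wouldn't hold: tools measuring actual allocatable memory, rather than the rescaled counter, would immediately expose the missing half gigabyte.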


----------



## Forceman

Quote:


> Originally Posted by *tpi2007*
> 
> You're distorting what I said. I don't want the tech media to speculate on anything. I just want them to consider that the future, in this case, the next year, might be different from the present and thus not shut the door on the possibility.


I know what you are saying, but what can they realistically do, except put a blurb in their conclusions that says "this card has less VRAM (which it actually doesn't) than this other card and so it may run into trouble earlier"? According to all their testing the 970 continues to work fine even when you exceed 3.5GB (and I know that conflicts with user testing, but again, they can only write about what they test) so what caveat are they supposed to add?


----------



## PostalTwinkie

Quote:


> Originally Posted by *maarten12100*
> 
> A driver that keeps VRAM usage below 3.5GB in all cases would cover up the problem. If that driver also spread the reported 4GB across the available fast 3.5GB, it would cover it up even better: once the 3.5GB filled, it would show 4GB in use.
> 
> If Nvidia wants to further scam customers and "fix" this with a driver, that is the "best" course of action. I doubt such an attempt would hold, however, since the issue is already out and people know!


My thought was that they could use a new driver to force games like FPSOF2014 to address the full 4GB, and not the 3.6(9?) it was addressing, or otherwise effectively communicate to the game engine that there is indeed a full 4GB of VRAM and let it handle things naturally from there. That almost seems like a game-specific thing, though; not sure if they could do it globally.


----------



## Silent Scone

Quote:


> Originally Posted by *Forceman*
> 
> I know what you are saying, but what can they realistically do, except put a blurb in their conclusions that says "this card has less VRAM (which it actually doesn't) than this other card and so it may run into trouble earlier"? According to all their testing the 970 continues to work fine even when you exceed 3.5GB (and I know that conflicts with user testing, but again, they can only write about what they test) so what caveat are they supposed to add?


lol, user testing.


----------



## FlyingSolo

Quote:


> Originally Posted by *Asus11*
> 
> It seems like Nvidia in the UK are putting their hands up and accepting refunds via the UK retailer Overclockers for mis-selling their product.
> 
> I guess that's good news for the people who want to return their GPUs, but I can see a lot of people jumping to the red team. Nvidia has made a big mistake in that regard: first the 970, then the 960.
> 
> Who can you trust anymore? I bet they wish they hadn't started this cash cow; it may have led to big profits in the present but may hinder profits in the future.


Damn, I bought mine from Scan UK, so I'm sure they will not accept returns or let me pay extra on top for another card. But I'll give it a try anyway. If that doesn't work, I have to RMA my card regardless, since it has bad coil whine that still hasn't gone. I thought the coil whine would go after a month or two.


----------



## Mad Pistol

Quote:


> Originally Posted by *GrimDoctor*
> 
> So Nvidia apparently reneging on a driver fix is not an indication that they are cutting their losses?
> 
> Current games on the market right now, because of their VRAM needs, are already making the 970 redundant, let alone high-end applications. This is from first-hand experience, not conjecture.


You missed what I said, completely. Nvidia makes driver optimizations based on games for the hardware, not hardware for the games. This means that future titles will get tweaks that net extra performance on different cards. They've been doing this for years on each of their card generations.


----------



## dejo1

they will release a driver that just fills up the 0.5GB partition with whatever it can, then finishes filling the rest, and say the problem is solved, effectively walling that segment off so it can't cause any issues.


----------



## ZealotKi11er

What happens to a GPU when a game uses more memory than it has? I ask because I have not had an Nvidia GPU since the GTX 580. From my understanding, by the time you reach 3.5GB+ of vRAM the GPU horsepower is not there, so you are already getting low fps and don't really see or benefit much from that missing 0.5GB of vRAM. What happens to GTX 780s when they are at their limit? My only experience is with a 290X @ 5K with Mantle, where 30-40 fps would feel very choppy once the vRAM was all used up.


----------



## skupples

Quote:


> Originally Posted by *dejo1*
> 
> at this point, I don't think I would mind if nvidia had to fold up shop and stop making goods


wow...

You really have no idea what you're wishing for.

AMD as a monopoly, over a mostly broken duopoly?

Yeah, no.

You have no clue how bad that would be.

Competition drives innovation. Lack of competition means stagnation of innovation.

A very EASY example of this is cable internet in America. Comcast & TWC have cooperatively agreed not to compete. Then you have situations like this:

I pay $35 a month for 150/0.

The people across the street pay $99 a month for the exact same package.

The difference? We have 4 (soon to be 5) service providers in our neighborhood.

Across the street? Just Comcast. The apartment complex has literally made a deal with Comcast so that it is the only provider. Situations like this are extremely common across the entire country.

So no, Nvidia going out of business would likely be one of the worst things that could happen to the GPU industry, but feel free to continue your ignorant hate train. It's quite entertaining.


----------



## GrimDoctor

Quote:


> Originally Posted by *Mad Pistol*
> 
> You missed what I said, completely. Nvidia makes driver optimizations based on games for the hardware, not hardware for the games. This means that future titles will get tweaks that net extra performance on different cards. They've been doing this for years on each of their card generations.


I saw what you said; I don't agree, and I think Nvidia's backflip on the driver statement is evidence of that. I believe they will just move on and leave the 970 as is. They have new products to launch and backlash to deal with now; they would be silly to focus on 970 driver updates that may not help. Cards are already coming back to them. Why keep investing in the old? They need to get the new products on the straight and narrow to reassure the public. It's simple business in the end.


----------



## Cyro999

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What happens to a GPU when a game uses more memory than it has? I ask because I have not had an Nvidia GPU since the GTX 580. From my understanding, by the time you reach 3.5GB+ of vRAM the GPU horsepower is not there, so you are already getting low fps and don't really see or benefit much from that missing 0.5GB of vRAM. What happens to GTX 780s when they are at their limit? My only experience is with a 290X @ 5K with Mantle, where 30-40 fps would feel very choppy once the vRAM was all used up.


This.

http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/1200_30#post_23476625

It's a very nasty situation and best to avoid it, especially since the driver seems bugged for 970 and refuses to allocate the extra VRAM when necessary, also refusing to allocate system RAM - so if you overflow, it'll just hard crash after a while of stuttering.

500MB extra VRAM helps to push that further away from happening.
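The "keep usage away from the cliff" workaround people keep describing can be sketched as a toy heuristic. The setting names, VRAM figures, and headroom value here are made up for illustration; only the 3.5GB fast-segment figure comes from the thread:

```python
# Toy heuristic: pick the highest texture setting whose projected VRAM
# use stays inside the fast segment, with a safety margin. All numbers
# except the 3.5 GB segment size are invented for the example.
FAST_SEGMENT_MB = 3584  # fast VRAM segment on a GTX 970
HEADROOM_MB = 256       # margin before stutter/crash territory

def pick_texture_level(levels):
    """levels maps setting name -> projected VRAM use in MB.
    Returns the largest setting that fits, else the smallest one."""
    for name, use in sorted(levels.items(), key=lambda kv: -kv[1]):
        if use + HEADROOM_MB <= FAST_SEGMENT_MB:
            return name
    return min(levels, key=levels.get)  # nothing fits: fall back

settings = {"ultra": 3900, "high": 3300, "medium": 2600}
print(pick_texture_level(settings))  # -> high (ultra would overflow)
```

That is essentially what owners are doing by hand: dropping one texture notch so allocation never crosses into the slow segment.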


----------



## Mad Pistol

Quote:


> Originally Posted by *GrimDoctor*
> 
> I saw what you said; I don't agree, and I think Nvidia's backflip on the driver statement is evidence of that. I believe they will just move on and leave the 970 as is. They have new products to launch and backlash to deal with now; they would be silly to focus on 970 driver updates that may not help. Cards are already coming back to them. Why keep investing in the old? They need to get the new products on the straight and narrow to reassure the public. It's simple business in the end.


No, it's actually the opposite.

Because of the GTX 970 debacle, they have a lot more reason to ensure that it is always working at 100% capabilities to show to their customers that they are committed to their product. This is reassurance to the public that they will support their tech and not abandon it because of public backlash. You clean up a problem with a product by showing your customers that you are committed to them.

What you are suggesting would drive customers away. What I am suggesting would bring customers back.

Like you said, it's simple business.


----------



## GrimDoctor

Quote:


> Originally Posted by *Mad Pistol*
> 
> No, it's actually the opposite.
> 
> Because of the GTX 970 debacle, they have a lot more reason to ensure that it is always working at 100% capabilities to show to their customers that they are committed to their product. This is reassurance to the public that they will support their tech and not abandon it because of public backlash. You clean up a problem with a product by showing your customers that you are committed to them.
> 
> What you are suggesting would drive customers away. What I am suggesting would bring customers back.
> 
> Like you said, it's simple business.


I don't agree so we'll just agree to disagree. Only time will tell.

Ah, it's good to have civilized conversation in this thread, at least (not being sarcastic).

Edit: I am not saying Nvidia can't, just saying I don't think they will. Opinion.


----------



## skupples

-.- Nvidia is a large company.

The people writing drivers for GeForce are probably separate from the people writing drivers/firmware for Tegra. There might be some overlap, but these would definitely be two departments working in parallel, given how often NV drops new drivers.

Sorry, just reading "Nvidia is probably going to leave it as is because of new products" sounds off, and makes them sound like a small entity without the ability to multitask.


----------



## Mad Pistol

Quote:


> Originally Posted by *skupples*
> 
> -.- nvidia is a large company.
> 
> The people writing drivers for Geforce are probably separate from the people writing drivers / firmware for TEGRa.. might be some overlap, but these would definitely need to be two departments working in parallel, with how often NV drops new drivers.
> 
> sorry, just reading "Nvidia is probably going to leave it as is because of new products" just sounds off, and makes them sound like a small entity w/o the ability to multitask.


They are constantly refining their drivers. That's the reason Fermi got a massive boost in performance over its lifetime. Kepler also got a decent boost in performance from start to finish. Who's to say Maxwell won't be the same?


----------



## skupples

Quote:


> Originally Posted by *Mad Pistol*
> 
> They are constantly refining their drivers. That's the reason Fermi got a massive boost in performance over its lifetime. Kepler also got a decent boost in performance from start to finish. Who's to say Maxwell won't be the same?


that's my point.

BUT, it does look like we should start expecting driver support/tuning to end when a card's manufacturing goes EOL, based on what we're seeing with GK110. AMD somehow continues to squeeze more and more performance out of the 280X/7970, while Kepler goes stagnant.

Which is another reason why going with the 390X is on my mind, though the #1 thing that STILL bothers me about AMD is time to update. They release Crossfire profiles for games literally a year+ after the game has released. Like, they JUST NOW released a Far Cry 3 Crossfire profile, last month or so.


----------



## mouacyk

PCPer was trying to give their disgruntled readers an option with this post by Ryan. Now that NVidia has retracted and reworded that quotation, what does that mean for the future of PCPer and FCAT? Hopefully they still get to do priority reviews and receive the priority tools to back them up. They really did do a good job of pointing out frame-variance issues with AMD cards; gotta give them that.


----------



## nSone

Quote:


> Originally Posted by *mouacyk*
> 
> PCPer was trying to give their disgruntled readers an option with this post by Ryan. Now that NVidia has retracted and reworded that quotation, what does that mean for the future of PCPer and FCAT? Hopefully they still get to do priority reviews and receive the priority tools to back them up. They really did do a good job of pointing out frame-variance issues with AMD cards; gotta give them that.


so Nvidia is trolling their own customers? This needs to be written into the history books of trolling.

Too bad I must buy a CUDA GPU, since I find this insulting. I really can't figure out what's going on with their PR, but this is a mess.


----------



## Forceman

Quote:


> Originally Posted by *skupples*
> 
> that's my point.
> 
> BUT, it does look like we should start expecting driver support/tuning to end when a card's manufacturing goes EOL, based on what we're seeing with GK110. AMD somehow continues to squeeze more and more performance out of the 280X/7970, while Kepler goes stagnant.
> 
> Which is another reason why going with the 390X is on my mind, though the #1 thing that STILL bothers me about AMD is time to update. They release Crossfire profiles for games literally a year+ after the game has released. Like, they JUST NOW released a Far Cry 3 Crossfire profile, last month or so.


It may just be that 7970/280X are similar enough to 290/290X that improvements made for 290 cards also help the older cards, while Maxwell is different enough (we know the memory management is different at least) that gains made there aren't applicable to Kepler.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> It may just be that 7970/280X are similar enough to 290/290X that improvements made for 290 cards also help the older cards, while Maxwell is different enough (we know the memory management is different at least) that gains made there aren't applicable to Kepler.


That has always been the case with AMD. They keep their architecture for much longer than Nvidia does.


----------



## sugarhell

Quote:


> Originally Posted by *skupples*
> 
> that's my point.
> 
> BUT, it does look like we should start expecting driver support/tuning to end when a card's manufacturing goes EOL, based on what we're seeing with GK110. AMD somehow continues to squeeze more and more performance out of the 280X/7970, while Kepler goes stagnant.
> 
> Which is another reason why going with the 390X is on my mind, though the #1 thing that STILL bothers me about AMD is time to update. They release Crossfire profiles for games literally a year+ after the game has released. Like, they JUST NOW released a Far Cry 3 Crossfire profile, last month or so.


You mean Far Cry 4?


----------



## LancerVI

Why is this still an issue? I thought nVidia made it right? People who want refunds are getting them, are they not?

I do have to say though, if AMD had made this 'mistake', they'd be CUT TO PIECES by OCN, not defended.


----------



## sugalumps

Quote:


> Originally Posted by *LancerVI*
> 
> Why is this an issue still? I thought nVidia made it right? If you want a refund, people are getting them. Is that not so?
> 
> I do have to say though, if AMD had made this 'mistake', they'd be CUT TO PIECES by OCN, not defended.


They have been cut to pieces; only a few defended them. It would be the exact same reaction the other way around: a few names we already know would defend, and the rest would cut.


----------



## skupples

Quote:


> Originally Posted by *sugarhell*
> 
> You mean far cry 4 ?


Nope, the game is still missing an FC4 profile, or it existed and was then revoked... hard to keep up.

I'm talking about the Omega driver or whatever it was called. The patch notes said something about adding Crossfire support for a bunch of now-old AAA titles.


----------



## awdrifter

Quote:


> Originally Posted by *skupples*
> 
> Nope, the game is still missing an FC4 profile, or it existed and was then revoked... hard to keep up.
> 
> I'm talking about the Omega driver or whatever it was called. The patch notes said something about adding Crossfire support for a bunch of now-old AAA titles.


AMD's Crossfire support is bad; my previous setup was Crossfire. But the lack of Crossfire support here is because FC4 is an Nvidia GameWorks title: Crossfire is blocked at the engine level, and no amount of driver optimization will enable it.


----------



## mcg75

Quote:


> Originally Posted by *awdrifter*
> 
> AMD's Crossfire support is bad; my previous setup was Crossfire. But the lack of Crossfire support here is because FC4 is an Nvidia GameWorks title: Crossfire is blocked at the engine level, and no amount of driver optimization will enable it.


AMD said they needed a patch, 1.6 was supposed to be that patch as it had a crossfire performance drop fix.

Whether that addressed the whole issue or not, I'm not sure.

Considering they can't seem to fix the stutter for any cards, I would not be surprised if it didn't address all of it.


----------



## Silent Scone

Quote:


> Originally Posted by *mcg75*
> 
> AMD said they needed a patch, 1.6 was supposed to be that patch as it had a crossfire performance drop fix.
> 
> Whether that addressed the whole issue or not, I'm not sure.
> 
> Considering they can't seem to fix the stutter for any cards, I would not be surprised if it didn't address all of it.


I tried 1.07 recently, and considering the last version I tried was 04, it seemed much improved, though I'm not sure how much of that is NVIDIA's most recent driver. Honestly, with that and everything else recently in mind, the whole thing really stinks. It might just be my state of mind, but I feel we are getting taken for a ride at the moment, and the industry is firmly in the driving seat.

Not entirely sure what to do to change that; stop buying, I guess. The change.org petition ideally should be aimed more at wanting reassurance that we will get better support, covering multi-GPU and all available technologies. Because I'm fairly confident it goes without saying that NVIDIA won't be pulling a stunt like the one in this thread any time soon.


----------



## Menta

https://rog.asus.com/forum/showthread.php?57022-asus-970-strix-false-specs/page3

hope?


----------



## Wezzor

Swedish websites, but they're very trustworthy.

http://www.nordichardware.se/Grafik/amd-hanar-nvidia-och-lovar-rabatt-foer-de-som-oeverger-geforce-gtx-970.html - AMD promises discount if you abandon the Geforce GTX 970 for Radeon R9 Series

http://www.sweclockers.com/nyhet/19983-evga-accepterar-returer-av-geforce-gtx-970 - EVGA accept returns of GeForce GTX 970


----------



## iSlayer

Quote:


> Originally Posted by *sugalumps*
> 
> They have been cut to pieces; only a few defended them. It would be the exact same reaction the other way around: a few names we already know would defend, and the rest would cut.


A few people I could mtcn I mean mention...

This gave me the thought. If I had a second flame and could change my user title I'd put "Nvidia did nothing wrong" Kappa.
Quote:


> Originally Posted by *awdrifter*
> 
> AMD's Crossfire support is bad; my previous setup was Crossfire. But the lack of Crossfire support here is because FC4 is an Nvidia GameWorks title: Crossfire is blocked at the engine level, and no amount of driver optimization will enable it.


Quote:


> Originally Posted by *mcg75*
> 
> AMD said they needed a patch, 1.6 was supposed to be that patch as it had a crossfire performance drop fix.
> 
> Whether that addressed the whole issue or not, I'm not sure.
> 
> Considering they can't seem to fix the stutter for any cards, I would not be surprised if it didn't address all of it.


Stutter Cry 4 looks like it needs another ~~month~~ year of updating before it doesn't run with a limp.

Does SLI work?


----------



## Silent Scone

Yes. Scaling is fine, the stuttering is a mixture of an in-engine issue and Nvidia not really looking into it.


----------



## provost

Well, I don't know about anyone else, but I am LMAO even though I am, ahem, one of the owners of some of the GPUs mentioned in this thread. Just too hilarious not to be liked.

We can all use a moment of levity.


----------



## Menta

Quote:


> Originally Posted by *provost*
> 
> Well, I don't know about anyone else, but I am LMAO even though I am, ahem, one of the owners of some of the GPUs mentioned in this thread. Just too hilarious not to be liked.
> 
> We can all use a moment of levity.


LOL, exactly my feeling


----------



## Rahldrac

Just mailed my reseller (Agito.pl; I'm Norwegian but study in Poland), and they told me that they are talking with Gigabyte about compensation. Fingers crossed.


----------



## Nevk

Nvidia clarifies: No specific GTX 970 driver to improve memory allocation performance planned
http://www.pcworld.com/article/2876802/nvidia-plans-geforce-gtx-970-driver-update-for-memory-performance-concerns.html?null


----------



## nSone

LooL almost died here
what a relief


----------



## Mand12

The amount of misinformation in this thread is staggering.

3.5GB vs 4GB is not a big deal. The specific VRAM count has NEVER been a big deal; it's the most misunderstood number in the entire GPU market.

The benchmarks are still the same. The performance is still the same. This card is just as good now as it was when the reviewers gave it stellar performance numbers.


----------



## skupples

Quote:


> Originally Posted by *awdrifter*
> 
> AMD's Crossfire support is bad; my previous setup was Crossfire. But the lack of Crossfire support here is because FC4 is an Nvidia GameWorks title: Crossfire is blocked at the engine level, and no amount of driver optimization will enable it.


That's not true AT ALL, sorry. Conspiracies can go out the door.

The game had Crossfire support, then they disabled it, then AMD said they need to re-patch it.

There are two workarounds: renaming the .exe or forcing AFR. Both seem to work well.


----------



## mtcn77

Quote:


> Originally Posted by *Mand12*
> 
> The amount of misinformation in this thread is staggering.
> 
> 3.5 vs 4 is not a big deal. The specific vram count has NEVER been a big deal, and the most misunderstood number in the entire GPU market.
> 
> The benchmarks are still the same. The performance is still the same. This card is just as good now as it was when the reviewers gave it stellar performance numbers.


Not the count, but the *accessibility*. The benchmarks that are broken are still standing, too. What about those? Oh, "we never thought it was possible!"


----------



## skupples

I mean, who wants to game at 11-15 FPS?


----------



## Espair

I don't see what the big fuss is concerning the VRAM. Really, the 970 still performs amazingly; the benchmarks didn't suddenly drop, and the cards are performing the same as when they came out...

The bigger deal is the marketing ploy Nvidia used. It was not right, but is it worth returning your card over?

Also, I just bought my 970 today and am loving it :3 The power consumption and heat do matter to me.

Edit: I think I'm late to the party; this thread is probably a merry-go-round by now.


----------



## mouacyk

Quote:


> Originally Posted by *Mand12*
> 
> The amount of misinformation in this thread is staggering.
> 
> 3.5 vs 4 is not a big deal. The specific vram count has NEVER been a big deal, and the most misunderstood number in the entire GPU market.
> 
> The benchmarks are still the same. The performance is still the same. This card is just as good now as it was when the reviewers gave it stellar performance numbers.


There is a lot of misinformation from posters, for sure, and along with it, misunderstanding. However, the information NVidia has decided to disclose is not misinformation. If anything, what they recently released is meant to inform and help people understand the corner-case issues they are experiencing.

NVidia wants customers to know that the corner-case issues with VRAM consumption over 3.5GB are caused by this:

----------



## Kand

Quote:


> Originally Posted by *Mand12*
> 
> The amount of misinformation in this thread is staggering.
> 
> 3.5 vs 4 is not a big deal. The specific vram count has NEVER been a big deal, and the most misunderstood number in the entire GPU market.
> 
> The benchmarks are still the same. The performance is still the same. This card is just as good now as it was when the reviewers gave it stellar performance numbers.


That time when people picked 9300gts with 2 gigs of ram over 9500gts with 1 gig.


----------



## dejo1

Nvidia wants to move the blame to our misunderstanding! The truth is that they misled us and don't want to take any financial responsibility. They will get away with it just as they have before.


----------



## Jaren1

Guys my 4k experience on a single 970 is unacceptable! 25fps and occasional stuttering all because of this 3.5gb card with 500mb in the backseat!

Lolz


----------



## jprovido

Quote:


> Originally Posted by *Kand*
> 
> That time when peiple picked 9300gts with 2 gigs of ram over 9500gts with 1gig.


that's not the point. we paid for 4GB, period.

more than a year ago I had a GTX 680 and wanted to upgrade. I had a choice of an R9 290, an R9 290X, or a GTX 780. I knew the GTX 780 only had 3GB of VRAM. I didn't really care, tbh, and bought the GTX 780; I knew what I was getting, and at that time 3GB was more than enough for my needs.

now they've released the GTX 970, and it had 4GB of VRAM. I thought it would be better to "upgrade" to two 970s instead of getting another used GTX 780 because of the VRAM. lo and behold, the thing only has 3.5GB. If I had known, I wouldn't even have considered two GTX 970s; I would've just bought another used GTX 780. It would've been A LOT cheaper, and I wouldn't have had the trouble of selling my GTX 780. nvidia tricked us, period. I PAID FOR 4GB OF VRAM, NOT 3.5GB + 500MB OF (CRAP) VRAM.


----------



## FlyingSolo

I just finished talking to the company I bought the card from, and they told me they will let me know on Monday whether they will give me a full refund. But they told me they will not do any kind of upgrade, and also told me not to buy another card from them just yet. Hopefully I get the refund, or else I will still have to RMA the card anyway, since it has bad coil whine.


----------



## Exilon

Quote:


> Originally Posted by *Mand12*
> 
> The amount of misinformation in this thread is staggering.
> 
> 3.5 vs 4 is not a big deal. The specific vram count has NEVER been a big deal, and the most misunderstood number in the entire GPU market.
> 
> The benchmarks are still the same. The performance is still the same. This card is just as good now as it was when the reviewers gave it stellar performance numbers.


The amount of high speed VRAM is still the same and the fact that Nvidia lied is also the same. This card is just as much a lie as when the reviewers failed to find the problem.

Look, I get it that you like Nvidia and all, but this kind of corporate behavior is unacceptable and Nvidia deserves every bit of flak coming to them.
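For context on the "high speed" part, here is a quick back-of-envelope calculation. It assumes the memory configuration reported in press coverage of the issue (7.0 Gbps effective GDDR5 across 8 x 32-bit controllers, with 7 controllers behind the fast 3.5GB segment and 1 behind the slow 0.5GB segment); treat the numbers as illustrative:

```python
# Back-of-envelope bandwidth for the two segments of a GTX 970,
# assuming 7.0 Gbps effective GDDR5 on 8 x 32-bit memory controllers
# (figures as reported in press coverage; illustrative only).
GBPS_PER_PIN = 7.0        # effective data rate per pin, in Gbps
PINS_PER_CONTROLLER = 32  # each controller is 32 bits wide

def bandwidth_gbs(n_controllers):
    """Aggregate bandwidth in GB/s for n 32-bit controllers
    (divide by 8 to convert gigabits to gigabytes)."""
    return GBPS_PER_PIN * PINS_PER_CONTROLLER * n_controllers / 8

print(bandwidth_gbs(8))  # 224.0 - the advertised total
print(bandwidth_gbs(7))  # 196.0 - the fast 3.5 GB segment
print(bandwidth_gbs(1))  # 28.0  - the slow 0.5 GB segment
```

Under those assumptions, the slow segment has roughly a seventh of the fast segment's bandwidth, which is why "4GB" and "3.5GB + 0.5GB" are not interchangeable claims.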


----------



## skupples

Quote:


> Originally Posted by *jprovido*
> 
> that's not the point. we paid for 4GB, period.
> 
> more than a year ago I had a GTX 680 and wanted to upgrade. I had a choice of an R9 290, an R9 290X, or a GTX 780. I knew the GTX 780 only had 3GB of VRAM. I didn't really care, tbh, and bought the GTX 780; I knew what I was getting, and at that time 3GB was more than enough for my needs.
> 
> now they've released the GTX 970, and it had 4GB of VRAM. I thought it would be better to "upgrade" to two 970s instead of getting another used GTX 780 because of the VRAM. lo and behold, the thing only has 3.5GB. If I had known, I wouldn't even have considered two GTX 970s; I would've just bought another used GTX 780. It would've been A LOT cheaper, and I wouldn't have had the trouble of selling my GTX 780. nvidia tricked us, period. I PAID FOR 4GB OF VRAM, NOT 3.5GB + 500MB OF (CRAP) VRAM.


that's your own fault.

780 SLI > 970 SLI


----------



## FlyingSolo

If you guys had the chance to get a refund, which would be the better buy?

1x GTX 980

2x AMD R9 290

1x AMD R9 295X2

The price of these cards would be the same for me.


----------



## 2010rig

Quote:


> Originally Posted by *Jaren1*
> 
> Guys my 4k experience on a single 970 is unacceptable! 25fps and occasional stuttering all because of this 3.5gb card with 500mb in the backseat!
> 
> Lolz


You've come to the right place!
Quote:


> Originally Posted by *Exilon*
> 
> The amount of high speed VRAM is still the same and the fact that Nvidia lied is also the same. This card is just as much a lie as when the reviewers failed to find the problem.
> 
> Look, I get it that you like Nvidia and all, but this kind of corporate behavior is unacceptable and Nvidia deserves every bit of flak coming to them.


Explain this then, Einstein.


----------



## iSlayer

Quote:


> Originally Posted by *FlyingSolo*
> 
> If you guys had the chance to get a refund, which would be the better buy?
> 
> 1x GTX 980
> 
> 2x AMD R9 290
> 
> 1x AMD R9 295X2
> 
> The price of these cards would be the same for me.


Depends what you want.

And if the cards are aftermarket.

I'd go aftermarket 290s I think? Not sure how the 295x2 compares.


----------



## rdr09

Quote:


> Originally Posted by *FlyingSolo*
> 
> If you guys had the chance to get a refund, which would be the better buy?
> 
> 1x GTX 980
> 
> 2x AMD R9 290
> 
> 1x AMD R9 295X2
> 
> The price of these cards would be the same for me.


get something cheap to get you by and wait for the next line of GPUs.


----------



## skupples

1x 295x2


----------



## FlyingSolo

Quote:


> Originally Posted by *iSlayer*
> 
> Depends what you want.
> 
> And if the cards are aftermarket.
> 
> I'd go aftermarket 290s I think? Not sure how the 295x2 compares.


Quote:


> Originally Posted by *rdr09*
> 
> get something to get you by for cheap and wait for the next line of gpus.


Quote:


> Originally Posted by *skupples*
> 
> 1x 295x2


Thanks guys. Hopefully I get my refund. If I do get a card now, I'll go with the one skupples recommended.


----------



## Woundingchaney

Quote:


> If you guys had the chance to get a refund, which would be the better buy?
> 
> 1x GTX 980
> 
> 2x AMD R9 290
> 
> 1x AMD R9 295X2
> 
> The price of these cards would be the same for me.


The 295X2 is the best card listed, as long as you don't mind running a dual-GPU card. In fact, it is considerably better than the others you have listed as a single-card solution.

The dual 290s are probably the best performers, but you will have a higher level of heat and noise.

I would go for the 295X2 personally.


----------



## Exilon

Quote:


> Originally Posted by *2010rig*
> 
> Explain this then, Einstein.


Einstein here. That's dumb and you should feel dumb.

We've already been over this whole issue more than enough times. Nvidia lied about its offering and misled customers. The number of posts by the same people defending Nvidia with the same tired excuses is just sad.

I guess the upshot of this whole debacle is that we can easily identify who's a hopeless Nvidia fanboy. I'm glad GoldenTiger came around, though. I liked that guy when he posted here.


----------



## TopicClocker

Quote:


> Originally Posted by *jprovido*
> 
> that's not the point. we paid for 4GB, period.
> 
> more than a year ago I had a GTX 680 and wanted to upgrade. I had a choice of an R9 290, an R9 290X, or a GTX 780. I knew the GTX 780 only had 3GB of VRAM. I didn't really care, tbh, and bought the GTX 780; I knew what I was getting, and at that time 3GB was more than enough for my needs.
> 
> now they've released the GTX 970, and it had 4GB of VRAM. I thought it would be better to "upgrade" to two 970s instead of getting another used GTX 780 because of the VRAM. lo and behold, the thing only has 3.5GB. If I had known, I wouldn't even have considered two GTX 970s; I would've just bought another used GTX 780. It would've been A LOT cheaper, and I wouldn't have had the trouble of selling my GTX 780. nvidia tricked us, period. I PAID FOR 4GB OF VRAM, NOT 3.5GB + 500MB OF (CRAP) VRAM.


I totally agree!
Quote:


> Originally Posted by *Exilon*
> 
> The amount of high speed VRAM is still the same and the fact that Nvidia lied is also the same. This card is just as much a lie as when the reviewers failed to find the problem.
> 
> Look, I get it that you like Nvidia and all, but this kind of corporate behavior is unacceptable and Nvidia deserves every bit of flak coming to them.


Definitely! The defending is ridiculous.

Quote:


> Originally Posted by *FlyingSolo*
> 
> If you guy's had a chance to get a refund what will be a better buy.
> 
> 1 GTX 980
> 
> 2 AMD R9 290
> 
> 1 AMD R9 295 X2
> 
> the price of these cards will cost the same for me.


I would be tempted to get the R9 295 X2.

Although CrossFire and SLI problems are still worth considering with any multi-GPU setup.

Quote:


> Originally Posted by *rdr09*
> 
> get something to get you by for cheap and wait for the next line of gpus.


This is also a good idea, although no one knows when the next line of GPUs is coming; it could be months from now.


----------



## Silent Scone

Guess what kids, I get stuttering at 1440P with my 980GTX in Dying Light with High textures.

You'll have to excuse me but, ZOMG this is terrible must be memory poolz.


----------



## Heavy MG

Quote:


> Originally Posted by *2010rig*
> 
> You've come to the right place!
> Explain this then, Einstein.


But you can't prove your point by showing the frame timing graph. There you go still trying to defend Nvidia with a cherry-picked, useless RAM usage graph. Isn't BF4 like the only game that can use more than 3.5GB? Nvidia put out false specs and now doesn't wish to do anything about it, and it's also wrong that the card vendors are having to take the blame for the situation. You'd have to be a fanboy to keep defending Nvidia at this point.
Quote:


> Originally Posted by *Nevk*
> 
> Nvidia clarifies: No specific GTX 970 driver to improve memory allocation performance planned
> http://www.pcworld.com/article/2876802/nvidia-plans-geforce-gtx-970-driver-update-for-memory-performance-concerns.html?null


So Nvidia doesn't even care enough to give 970 users a driver to help alleviate the issue? Thanks, Nvidia. My next card will definitely be from AMD.


----------



## FlyingSolo

Quote:


> Originally Posted by *Woundingchaney*
> 
> The 295x2 is the best card listed as long as you don't mind running a dual gpu card. In fact it is considerably better than the others you have listed as far as a single card solution.
> 
> The dual 290s is probably the best performer but you will have a higher level of heat and noise.
> 
> I would go for the 295x2 personally.


Thanks. I'll go with the 295 x2.


----------



## mouacyk

Quote:


> Originally Posted by *Silent Scone*
> 
> Guess what kids, I get stuttering at 1440P with my 980GTX in Dying Light with High textures.
> 
> You'll have to excuse me but, ZOMG this is terrible must be memory poolz.


Please let me know the specific issues you are having with the GTX 980. I may be able to help you, but I cannot guarantee anything.


----------



## criminal

Quote:


> Originally Posted by *Heavy MG*
> 
> But you can't prove your point by showing the frame timing graph. Though there you go still trying to defend Nvidia with a cherry picked useless ram use graph. Isn't BF4 like the only game that can use more than 3.5GB? Nvidia put out false specs, and now they don't wish do anything about it, it's also wrong that the card vendors are having to take the blame for the situation. You'd have to be a fanboy to keep defending Nvidia at this point.
> 
> So Nvidia doesn't even care enough to give 970 users a driver to help alleviate the issue? Thanks,Nvidia. My next card will definitely be from AMD.


Call'em like you see'em.









I don't think they can do anything different with a driver.


----------



## FlyingSolo

Quote:


> Originally Posted by *TopicClocker*
> 
> I totally agree!
> Definitely! The defending is ridiculous.
> I would be tempted to get the R9 295 X2.
> 
> Although CF and SLI problems are still things to be considered when considering any multi-GPU setup,
> This is also a good idea, although no one knows when the next line of GPUs are coming, it could be months from now.


Thanks, I didn't think about that. Will look into this before buying the R9 295 X2.


----------



## PureBlackFire

Quote:


> Originally Posted by *FlyingSolo*
> 
> If you guy's had a chance to get a refund what will be a better buy.
> 
> 1 GTX 980
> 
> 2 AMD R9 290
> 
> 1 AMD R9 295 X2
> 
> the price of these cards will cost the same for me.


Being that you feel the need to get a refund, on principle alone you can't even consider the GTX 980.


----------



## criminal

Quote:


> Originally Posted by *PureBlackFire*
> 
> *being that you feel the need to get a refund, on principal alone you can't even consider GTX980.*


Yep. Which I wouldn't want to do anyway if I were them.


----------



## sugalumps

Quote:


> Originally Posted by *Silent Scone*
> 
> Guess what kids, I get stuttering at 1440P with my 980GTX in Dying Light with High textures.
> 
> You'll have to excuse me but, ZOMG this is terrible must be memory poolz.


It's obviously a greedy nvidia vram ploy, please ask for a free t-shirt and $100 back. You need at least 6GB for 1080p.


----------



## 2010rig

Quote:


> Originally Posted by *Exilon*
> 
> Einstein here. That's dumb and you should feel dumb.
> 
> We've already been over this whole issue more than enough times. Nvidia lied about its offering and mislead customers. The number of posts by the same people defending Nvidia with the same tired excuses is just sad.
> 
> I guess the upshot of this whole debacle is that we can easily identify who's an hopeless Nvidia fanboy. I'm glad GoldenTiger came around though. I liked that guy when he posted here.


Weren't you implying they were lying about the VRAM, when the card can in fact use 4GB?

We've also been over, enough times, the fact that it can use up to 224 GB/s.

So.... your point?

Have the day 1 benchmarks changed as of today? Or were they lying about those too?


----------



## Woundingchaney

Quote:


> Originally Posted by *PureBlackFire*
> 
> yes and I've seen it posted in at least two other threads.
> being that you feel the need to get a refund, on principal alone you can't even consider GTX980.


Honestly I upgraded from sli 970s to sli 980s. I got a refund for both principle and performance reasons. My primary reason for staying with the Nvidia 900 line is because it is the only card available that supports HDMI 2.0.

There are reasons as to why 970 users would upgrade to 980s. Nvidia honored my request for a refund and Zotac supported it (though not initially). I harbor no grudges against Nvidia, particularly given that they are making at least some effort to remedy the issue.


----------



## Xoriam

Quote:


> Originally Posted by *Woundingchaney*
> 
> Honestly I upgraded from sli 970s to sli 980s. I got a refund for both principle and performance reasons. My primary reason for staying with the Nvidia 900 line is because it is the only card available that supports HDMI 2.0


That was one of the big selling factors for me as well.


----------



## FlyingSolo

Quote:


> Originally Posted by *PureBlackFire*
> 
> yes and I've seen it posted in at least two other threads.
> being that you feel the need to get a refund, on principal alone you can't even consider GTX980.


You're right about that.


----------



## Silent Scone

Quote:


> Originally Posted by *mouacyk*
> 
> Please let me know the specific issues you are having with the GTX 980. I may be able to help you, but I cannot guarantee anything.


lol, sorry? I already know the issue. The frame buffer is saturated and swap out is occurring.
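That diagnosis (a saturated frame buffer spilling into system RAM) is something VRAM monitoring can confirm. As an illustrative sketch only, not anyone's posted tool: `nvidia-smi`'s CSV query mode reports used/total memory in MiB, and parsing it in Python is a few lines (the helper names here are made up for the example):

```python
import subprocess

# Real nvidia-smi query flags; with these options values come back in MiB.
QUERY = ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

def parse_vram(csv_line):
    """Parse one 'used, total' CSV line (MiB) into (used, total, percent)."""
    used, total = (int(field.strip()) for field in csv_line.split(","))
    return used, total, 100.0 * used / total

def vram_usage():
    """Query the first GPU; requires nvidia-smi on PATH (NVIDIA driver installed)."""
    out = subprocess.check_output(QUERY, text=True)
    return parse_vram(out.splitlines()[0])
```

On a 970 this counter reportedly tends to top out around 3.5GB until a game actually pushes into the second segment, which matches the behavior described in NVIDIA's statement.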


----------



## Woundingchaney

Quote:


> Originally Posted by *2010rig*
> 
> Weren't you implying they were lying about the VRAM? When the card can in fact use 4GB.
> 
> Have also been over enough times, that it can use up to 224 GB/s.
> 
> So.... your point?
> 
> Have the day 1 benchmarks changed as of today? Or were they lying about those too?


So what about that ROP count or the L2 amounts?

Regardless, the VRAM and bandwidth claims, while perhaps technically true, are very misleading.


----------



## notarat

Quote:


> Originally Posted by *Woundingchaney*
> 
> So what about that ROP count or the L2 amounts?
> 
> Regardless the Vram and bandwidth issues while perhaps technically true are very misleading.


Lemme answer as 2010rig would

ROP Count and L2 Amount?

LOL! If you were stupid enough to believe what they printed on the box, mentioned in each of the reviews, stated on their site, etc. then it's your fault.


----------



## 2010rig

Quote:


> Originally Posted by *Woundingchaney*
> 
> So what about that ROP count or the L2 amounts?
> 
> Regardless the Vram and bandwidth issues while perhaps technically true are very misleading.


You mean the 8 ROPs that are unusable anyway due to the SMMs being the bottleneck?

I know everybody goes out and buys the card based on ROP count and L2 cache, right? No one cares about the performance.









Look, I don't buy that it took them 4 months to notice their mistake, but that's all it was, *a mistake*, a *mislabeled spec sheet*. I'm not making excuses for them.

The *performance* of the card is STILL the same as seen in reviews, shouldn't that be what matters? After almost 2000 posts in this thread, I don't know why I bother anymore. Keep complaining.


----------



## Woundingchaney

Quote:


> Originally Posted by *notarat*
> 
> Lemme answer as 2010rig would
> 
> ROP Count and L2 Amount?
> 
> LOL! If you were stupid enough to believe what they printed on the box, mentioned in each of the reviews, stated on their site, etc. then it's your fault.


Ok that was actually hilarious!!!


----------



## awdrifter

Quote:


> Originally Posted by *skupples*
> 
> That's not true AT ALL. sorry. Conspiracies can go out the door.
> 
> Game had xfire support, then they disabled it, then AMD said they need to rematch it.
> 
> There are two workarounds. Renaming the .exe or forcing AFR. both seem to work well.


You can call this a conspiracy because no one will come out and admit it, but it's the same deal as this 3.5GB RAM issue: the test results speak for themselves. Far Cry 3 was an AMD game, so Crossfire works. This game is on the same engine, but it has Nvidia GameWorks code in it and now Crossfire doesn't work. It's very likely that the GameWorks code is blocking Crossfire. AMD has come out and said that GameWorks prevents them from optimizing their drivers for these games. I have a GTX 970 this time around, so GameWorks blocking Crossfire doesn't affect me, but Nvidia paying off devs to cripple AMD's performance is the reason I got the card in the first place.

http://www.forbes.com/sites/jasonevangelho/2014/05/26/why-watch-dogs-is-bad-news-for-amd-users-and-potentially-the-entire-pc-gaming-ecosystem/


----------



## Woundingchaney

Quote:


> Originally Posted by *2010rig*
> 
> You mean the 8 ROP's that are un-usable anyway due to the SMM's being the bottleneck?
> 
> I know everybody goes out and buys the card based on ROP count and L2 cache, right? No one cares about the performance.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Look, I don't buy that it took them 4 months to notice their mistake, but that's all it was, *a mistake*, a *mislabeled spec sheet*. I'm not making excuses for them.
> 
> The *performance* of the card is STILL the same as seen in reviews, shouldn't that be what matters? After almost 2000 posts in this thread, I don't know why I bother anymore. Keep complaining.


I don't know why you bother either. Yes, I do pay attention to hardware specs when I purchase hardware. I assumed that most people here at OCN do.

Though it's important to note that I am the exact type of user that would be most impacted by these issues, as I game at 4K resolution and run SLI configurations.

Oh, and just in case you ask: yes, the difference between the 3.5 + 0.5 split pool and a true 4GB pool would have mattered to me too.

I pay attention to current performance benchmarks, but I use hardware specs to help gauge future performance.


----------



## 2010rig

Quote:


> Originally Posted by *Woundingchaney*
> 
> I don't know why you bother either. Yes, I do pay attention to hardware spec's when I purchase hardware. I assumed that most people here at OCN do.
> 
> Though its important to note that I am the exact type of user that would be most impacted by these issues. As I game at 4k resolution and run SLI configurations.
> 
> Oh and just in case you ask. Yes the difference between the 3.5 and .5 split pool and a true 4 gig pool would of mattered to me too.
> 
> I pay attention to current performance benchmarks but I use hardware specs to help gauge future performance.


And hence you went with 980's.







I'm going 4K soon too, but will wait out for next gen cards. I hope either camp can deliver 4K 60 FPS *single card*.

I'm not excusing NVIDIA for what they did; they had to differentiate the 970 & 980 somehow. Outside the 5% of us who understand memory pools and partitions, the general public wouldn't have had a clue what that means. Just trying to remind people of the bigger picture: the performance is what was promised.


----------



## Woundingchaney

Quote:


> Originally Posted by *2010rig*
> 
> And hence you went with 980's.
> 
> 
> 
> 
> 
> 
> 
> I'm going 4K soon too, but will wait out for next gen cards. I hope either camp can deliver 4K 60 FPS *single card*.


I seriously doubt we'll see any single-card solution that can provide a steady 60 fps at 4K in modern titles. Realistically, 4K is getting a lot of attention because of its entrance into consumer televisions. Monitors have been running 1440p and 1600p for years, and even now a single-GPU solution for those resolutions is not realistic with the hardware that is available.


----------



## 2010rig

Quote:


> Originally Posted by *criminal*
> 
> 780ti only has 3GB of vram. How did that solve the problem? Seems like it would have been worse.


Less is more.


----------



## mouacyk

Quote:


> Originally Posted by *Silent Scone*
> 
> lol, sorry? I already know the issue. The frame buffer is saturated and swap out is occurring.


I am happy that you are not having any unexpected issues with the GTX 980. Yes, swapping out to system RAM on VRAM saturation is expected and is deliberate by design. Please let us know if you have any other issues with the card.


----------



## Silent Scone

Shall I record frame times with the 980 GTX in Dying Light? I'm only going to bother doing it later if someone with half a brain cell, the game, and a 970 GTX does the same, at 1440p max settings.

Like I say, I've actually had to turn shadow mapping down to High from Very High because of the memory usage.
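For anyone taking him up on this, a frame-time log (FRAPS/FCAT-style, one value in milliseconds per frame) is easy to summarize. A rough Python sketch; the stutter threshold of 2x the median is my own illustrative choice, not an established standard:

```python
from statistics import median

def frame_stats(frame_times_ms):
    """Summarize a frame-time log (milliseconds per rendered frame)."""
    times = sorted(frame_times_ms)
    n = len(times)
    avg_fps = 1000.0 * n / sum(times)
    # 99th-percentile frame time: the slow frames averages hide.
    p99 = times[min(n - 1, (99 * n) // 100)]
    med = median(times)
    # Frames taking over twice the median tend to read as visible hitches.
    stutters = sum(1 for t in frame_times_ms if t > 2 * med)
    return {"avg_fps": avg_fps, "p99_ms": p99, "stutter_frames": stutters}
```

Two runs with the same average FPS can have wildly different p99 and stutter counts, which is exactly why frame times rather than framerates matter for this issue.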


----------



## rickcooperjr

Quote:


> Originally Posted by *criminal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> No the performance isn't the same when you break the 3.5gb barrier FPS takes a huge dip and things become choppy and laggy along with frame times go sky high due to when used past 3.5gb the memory bandwidth capability tanks. My cousin bought 2 of these and let me tell you even at 1080p running alot of mods ( like 375 mods ) on skyrim the crap hit the fan we are talking skyrim here *** and when he put the 780 TI back in well problem solved the 970's were sold as full speed 4gb ram not 3.5gb of fast ram 512mb of slow ram so yes Nvidia messed up now they are trying to avoid the subject and crying about the flak they are getting if AMD did this they would be thrown under the bus and Nvidia would take every opportunity to capitalize on it.
> 
> My cousin also has a 120hz 4k screen and the 970's perform horribly on shadow of moridor and farcry 4 and he played some BF4 it did same so your saying ti is OK that they did something like this I don't see in any of your posts you actually criticizing them for it nor do I see you actually bashing them at all for it the only way to get a company to learn is to hit them where it hurts the pocket and theyre reputation otherwise they will just do it again. I feel companies get away with crap like this way to often and that erks me.
> 
> 
> 
> 780ti only has 3GB of vram. How did that solve the problem? Seems like it would have been worse.
Click to expand...

Because the 780 Ti never hits a point where VRAM bandwidth throttles down to something like 1/4 to 1/10th of what the rest of the RAM gets, which is what happens on the 970 once it reaches the last 512MB of VRAM.

That last bit of VRAM cutting the bandwidth down makes a huge difference. Ask Woundingchaney; I believe he hit that barrier, had the stutter-fest issue, and went so far as to almost change his CPU/mobo for 4K gaming, thinking that was the issue when it was the 970s all along. http://www.overclock.net/t/1511270/advice-on-build
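To put rough numbers on that claim: follow-up coverage reported the 3.5GB segment peaking around 196 GB/s and the 0.5GB segment (one of the eight memory controllers) around 28 GB/s. Here's a back-of-envelope Python model, my own simplification rather than NVIDIA's math, of the time-weighted effective bandwidth as a share of traffic spills into the slow segment:

```python
FAST_BW_GBS = 196.0  # reported peak bandwidth of the 3.5GB segment
SLOW_BW_GBS = 28.0   # reported peak bandwidth of the 0.5GB segment

def effective_bandwidth(slow_fraction):
    """Harmonic (time-weighted) mean bandwidth in GB/s, given the
    fraction of memory traffic that lands in the slow segment."""
    fast_fraction = 1.0 - slow_fraction
    # Time to move one unit of data = fraction / bandwidth for each segment.
    time_per_unit = fast_fraction / FAST_BW_GBS + slow_fraction / SLOW_BW_GBS
    return 1.0 / time_per_unit
```

Under this toy model, sending even the proportional 1/8th of traffic to the slow segment roughly halves effective bandwidth, which is consistent with the stutter people report once allocations cross 3.5GB.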


----------



## notarat

Quote:


> Originally Posted by *2010rig*
> 
> You mean the 8 ROP's that are un-usable anyway due to the SMM's being the bottleneck?
> 
> I know everybody goes out and buys the card based on ROP count and L2 cache, right? No one cares about the performance.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Look, I don't buy that it took them 4 months to notice their mistake, but that's all it was, *a mistake*, a *mislabeled spec sheet*. I'm not making excuses for them.
> 
> The *performance* of the card is STILL the same as seen in reviews, shouldn't that be what matters? After almost 2000 posts in this thread, I don't know why I bother anymore. Keep complaining.


Looks as if my post rang true. LOL.

There are laws in place in the US and Europe that specifically deal with false advertising. It doesn't matter if it was done maliciously or accidentally. It is still false advertising.


----------



## criminal

Quote:


> Originally Posted by *rickcooperjr*
> 
> Because it didn't hit the point where it throttled the Vram bandwidth down to like 1/4 - 1/10th of what the rest of the ram got before it hit 512mb from the limit of the Vram.


Sure. See 2010rig's avatar.


----------



## 2010rig

Quote:


> Originally Posted by *notarat*
> 
> Looks as if my post rang true. LOL.
> 
> There are laws in place in the US and Europe that specifically deal with false advertising. It doesn't matter if it was done maliciously or accidentally. It is still false advertising.


We'll see if the US or the EU end up suing NVIDIA. This is a serious offense that needs to be looked into immediately.









Did they acknowledge their mistake? Yep.

Are they offering refunds to those who want one? Yep.

Problem?
Quote:


> False advertising or deceptive advertising is the use of false or misleading statements in advertising, and misrepresentation of the product at hand, which may negatively affect many stakeholders, especially consumers.


According to your logic, NVIDIA deliberately lied about the ROP count and L2 cache, in order to persuade people to buy the 970. They did this because the Maxwell architecture is weak, the lower power consumption & less noise weren't good enough selling points. Oh, and never mind about the benchmarks and that Price / Performance ratio, those had nothing to do with why people bought the 970.

Read Day 1 reviews, and see below, you will clearly see the deception.


----------



## cowie

Quote:


> Originally Posted by *GorillaSceptre*
> 
> What will the 970 perform like when the "next gen" games arrive this year?
> 
> Not a 970 owner and have no dog in this fight... But, if i buy something i expect it to have what was advertised on the box. It may be perfectly fine for future games but if i was a 970 owner i'd be a bit uncomfortable.


It has all the right things on the box; yes, it does have 4GB.
Quote:


> Originally Posted by *sugalumps*
> 
> It's obviously a *greedy nvidia vram ploy*, please ask for free t-shirt and $100 back. You need atleast 6gb for 1080p.


You guys know it's a greedy RAM ploy on both sides?
Go look at the 1440p (single monitor) results for the 290X and the 290X 8GB: it's a few frames of difference for almost $200 of difference in price.
In reality, the 3.5GB 970 stays with both of those cards in all practical manners... but this is OCN, the home of "it does not overclock, I am sending it back." What's even more at home around here is bios-flashing their cards, messing them up, then sending them back.
Why don't you guys talk about suing and getting free games from those that are stupid enough to say those things on a public forum?

As for the 970, it is what it is... I could not get refunds for the months I waited on drivers to get black screens fixed on my 5970s, 7980s, or the darling 4GB 290Xs.

*Yes, I hope you guys go get refunds, wait for the unfinished AMD cards, then cry about grey, black, or maybe this time yellow screen crashes that they don't tell you about.

Both sides play you guys like fiddles... take my advice: hate the man. NV, AMD, Intel, Apple, and Samsung look out for themselves; they don't love you, they just want your money.*

Before they take this post down (I am sure), read and remember.


----------



## Silent Scone

Quote:


> Originally Posted by *Silent Scone*
> 
> Shall I record frame times with 980 GTX in Dying Light. I'm only going to bother doing it later if someone with half a brain cell, the game, and a 970 GTX does the same. At 1440p max settings.
> 
> Like I say, I've actually had to turn shadow mapping to High from Very High because of the memory usage.


Guess not, thread go die die now. I'll record an hour of the first section after the tutorial (where you unlock co-op) later, if anyone with a 970 GTX wants to try.

Single card, max settings 1440p.

Play nice everyone.


----------



## notarat

Quote:


> Originally Posted by *Woundingchaney*
> 
> I don't know why you bother either. Yes, I do pay attention to hardware spec's when I purchase hardware. I assumed that most people here at OCN do.
> 
> Though its important to note that I am the exact type of user that would be most impacted by these issues. As I game at 4k resolution and run SLI configurations.
> 
> Oh and just in case you ask. Yes the difference between the 3.5 and .5 split pool and a true 4 gig pool would of mattered to me too.
> 
> I pay attention to current performance benchmarks but I use hardware specs to help gauge future performance.


You shouldn't expect much from someone who still plays games at "Keepin' Up With the Kardashians" resolution. For those of us using higher than "Year of our Lord, 2007" graphics that extra .5 running at full speed actually helps (Unless you're Michael J Fox and can synchronize your Parkinson's to the frame times of the 970)


----------



## Woundingchaney

Quote:


> Originally Posted by *2010rig*
> 
> We'll see if the US or the EU end up suing NVIDIA. This is a serious offense that needs to be looked into immediately.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Did they acknowledge their mistake? Yep.
> 
> Are they offering refunds to those who want one? Yep.
> 
> Problem?


I think this is what it boils down to, and for whatever reason people are throwing their hands up in the air. Nvidia either knowingly misstated its specs at initial launch or simply made a mistake. I would imagine it most likely was a mistake initially, but I really don't believe that this issue went unnoticed by every engineer and tech for 4 months.

Nvidia has contacted manufacturers and retailers instructing them to allow refunds (as long as consumers have the packaging and contents). I know this for a fact, having confirmed it through multiple emails with Newegg, Nvidia, and Zotac. Both Nvidia and Zotac offered to assist with my refund. Now, for those that purchased from smaller retailers I have no idea what their level of involvement is, but I do know that major retailers have been given the OK to refund the cards.

No, I don't feel as if Nvidia owes me anything for my inconvenience, but yes, I do very much feel that as a consumer Nvidia should offer or support my refund.

I think those that are subject to the issue should at the very least try to remain reasonable, so that both parties can come to a solution.


----------



## BinaryDemon

I keep asking myself if I would have bought the GTX 970s if they had originally been advertised with 3.5GB instead of 4GB. The GTX 970 is still a decent deal, but the answer is probably NO, because I was looking for a certain amount of future-proofing.


----------



## sugalumps

Quote:


> Originally Posted by *cowie*
> 
> it has all the rigyt things on the box yes it does have 4g
> you guys know its a greedy ram ploy on both sides?
> go look at 1440p(single monitor) results of the 290x and 290x 8g It's a few frames difference almost 200$ difference in price.
> In reality the 3.5g 970 stays with both of those cards in all practical manners....but this is ocn the home of the "it does not overclock I am sending it back" whats even more at home around here bios flashing there cards messing them up then sending them back.
> why don't you guys talk about suing and getting free games from those that are stupid enough to say those things on a public forum.
> 
> as for the 970 it is what it is....I could not get refunds for the months I waited on drivers to get black screens fixed on my 5970's 7980's or the darling 4g 290x's
> 
> *yes I hope you guys go get refunds wait for the unfinished amd cards then cry about grey black or maybe this time yellow screen crashes that they don't tell you about.
> 
> both sides play you guys like fiddles....take my advice hate the man nv amd intel apple and Samsung they look out for them they don't love you they just want your money./
> *
> before they take this post down (I am sure) read and remember


I was being sarcastic, as people think they need 6GB of VRAM at 1080p and all sorts. With these cards you are going to run out of raw GPU power before you run out of VRAM, unless you are running SLI at 4K.


----------



## MR-e

and here i am, waiting for a possible 970 price cut? i can't be ass'd about the .5gb difference as my games won't be demanding anywhere near 3.5+ at 1080p


----------



## 2010rig

Quote:


> Originally Posted by *sexpot*
> 
> and here i am, waiting for a possible 970 price cut? i can't be ass'd about the .5gb difference as my games won't be demanding anywhere near 3.5+ at 1080p


Give it a few days, Open Box deals incoming on NewEgg.


----------



## mouacyk

Quote:


> Originally Posted by *2010rig*


Powerful stuff... exactly what any company would do to try to sell a product. NVidia's sure got flavor here though.


----------



## tpi2007

Quote:


> Originally Posted by *Woundingchaney*
> 
> I don't know why you bother either. Yes, I do pay attention to hardware spec's when I purchase hardware. I assumed that most people here at OCN do.
> 
> Though its important to note that I am the exact type of user that would be most impacted by these issues. As I game at 4k resolution and run SLI configurations.
> 
> Oh and just in case you ask. Yes the difference between the 3.5 and .5 split pool and a true 4 gig pool would of mattered to me too.
> 
> I pay attention to current performance benchmarks but I use hardware specs *to help gauge future performance.*


I've been saying this, and some people conveniently forget to address that exact point. Tech sites are doing the same. They limit themselves to stating the obvious, self-justifying that the games they tested still perform the same.

This isn't about that.

And also (see below)...
Quote:


> Originally Posted by *2010rig*
> 
> Just trying to remind people of the bigger picture that the performance is what was promised.


... because this is wrong. You weren't promised ANY performance. Go read Nvidia's (or AMD's or Intel's) driver EULA. You rely on their goodwill and/or willingness to develop new drivers. They make absolutely no promise; they give you no guarantee that you will get a certain amount of performance in this or that game. Truth be told, with the complexities and interactions between all sorts of hardware and software, they really couldn't.

But if you want to be more practical, they don't even have a commitment to develop new drivers. They can give you new drivers for a year, two, five, ten, but they aren't obliged to. If they really wanted, you could get the driver that came with the card and that's all (plus bug patches). Game developers would then have to develop around the API and the existing driver.

This is all to say that at the end of the day, the one thing that you can hold on to is the hardware you bought. Those specs are independent of goodwill, or willingness to optimize for this or that card, for this or that architecture, for single or SLI configurations.

To be more specific to this case: the PR message that got out (independently of Nvidia's intent or not) was that the only difference between the GTX 980 and 970 was:

1. Lower amount of CUDA cores;

2. Lower clockspeeds for the core;

The rest of it being the same: same amount of ROPs, same amount of L2 cache, same memory bandwidth in the same way the GTX 980 works.

So, these being different - had they been announced at release - may very well have made some people ponder its purchase, considering how future proof it would be and especially for 4K SLI.

I for one would have never bought a GTX 570 with 1.25 GB of VRAM. And had I seen the GTX 970's true specs from the beginning, I most probably would have demoted it from "awesome" to "good" in my mind.

Consider this additional pricing creativity that enhances the GTX 970's appeal compared to the 980: the 980 was priced $50 higher than the GTX 680 (the card the 980 is meant to replace, according to Nvidia), and while the GTX 970's price was announced at $329, that was for the practically non-existent reference model, so the real price was more like $350-$360. Add the real specs into the mix and I don't see why this card is in essence any better a deal than the $399 GTX 670 was relative to the $499 680. But hey, it helped create the idea that the 970 was one of the best deals ever.


----------



## AngryGoldfish

Quote:


> Originally Posted by *BinaryDemon*
> 
> I keep asking myself if I would have bought the GTX970's if they had originally advertised them with 3.5gb instead of 4gb. The GTX970 is still a decent deal but the answer is probably NO, because I was looking for a certain amount of future-proofing.


Same. To be honest, there isn't really a card that truly impresses me and that should last a long time. The 980 should have come with 6GB, in my opinion.


----------



## 2010rig

Quote:


> Originally Posted by *AngryGoldfish*
> 
> Same. To be honest, there isn't really a card that truly impresses me and that should last a long time. The 980 should have come with 6GB, in my opinion.


Because of the 256-bit bus, it would have to be 8GB. Though it's pointless, since you're going to run out of GPU power way before that 8GB is truly utilized.


----------



## CaptainZombie

I contacted Newegg to see if I can get a refund on my MSI 970, as I'm now considering just forking over the extra $200 to jump on the 980. They took my info; I just need to email the rep my serial, UPC, etc. when I get home, and she will have the help desk contact me in 3-7 days.

I'm not sure that even a 290X would be worth it at this point, even though I have a Kraken G10 on the MSI which would work on the 290X. I started noticing tons of stuttering last week when I bought a 4K TV and tried to run Mordor, even with my settings dialed down a bit @ 4K.

970s are great cards; I just think the 3.5GB is gonna be an issue in a year's time. And what about those with Skyrim that probably have a ton of mods loaded? They have to be hitting that ceiling. We should have had cards out by now in 6GB and 8GB flavors, considering the higher texture counts, etc. in upcoming games.


----------



## skupples

Quote:


> Originally Posted by *mouacyk*
> 
> Please let me know the specific issues you are having with the GTX 980. I may be able to help you, but I cannot guarantee anything.


Dying Light has a CPU utilization bug where it pegs CPU 0 @ 99% = BOOM, STUTTERING! VRAM doesn't matter at all in this situation.

Go turn off core zero and it gets MUCH MUCH smoother, and GPU usage actually goes above 30-40%.

A patch is incoming for this issue.
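For anyone wanting to pin the game off core zero without clicking through Task Manager: the affinity value Windows' `start /affinity` takes is just a bitmask with every core bit set except bit 0. A minimal sketch of the arithmetic (the 8-core count and the executable name are my assumptions, swap in your own):

```python
# Compute a CPU affinity bitmask that excludes core 0.
# Windows' `start /affinity <hex>` accepts this value; the core count
# below is an assumption -- use your own logical core count.
def affinity_mask_without_core0(num_cores: int) -> int:
    all_cores = (1 << num_cores) - 1  # one bit per logical core
    return all_cores & ~1             # clear bit 0, i.e. core zero

mask = affinity_mask_without_core0(8)
print(hex(mask))  # pass the hex digits, e.g. `start /affinity fe DyingLightGame.exe`
```

Same idea as disabling the core, just scoped to the one process.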


----------



## notarat

AMD is having too much fun with this

https://twitter.com/Justin_Hilburn/status/559923551956766720


----------



## Ganf

Quote:


> Originally Posted by *notarat*
> 
> AMD is having too much fun with this
> 
> https://twitter.com/Justin_Hilburn/status/559923551956766720


Just AMD? I thought we were all having fun with this.


----------



## notarat

Quote:


> Originally Posted by *CaptainZombie*
> 
> I contacted Newegg to see if I can get a refund on my MSI 970 as I'm considering now just forking over the extra $200 to jump on the 980. They took my info, I just need to email the rep my serial, UPC, etc. when I get home and she will have the help desk contact me in 3-7 days.
> 
> I'm not sure that even a 290X would be worth it at this point, even though I have a Kraken G10 on the MSI which would work on the 290X. I started noticing tons of stuttering last week when I bought a 4K TV and I tried to run Mordor even with my settings a bit dialed down @ 4K.
> 
> 970's are great cards, I just think the 3.5GB is gonna be an issue in a year's time. And what about those with Skyrim that probably have a ton of mods loaded; they have to be hitting that ceiling. We should have had cards out by now in 6GB and 8GB flavors, considering higher texture counts, etc. in future upcoming games.


My Skyrim is already nearing the limits of my Titans from all the mods I have and I have 2.5GB more ram to play with


----------



## SDhydro

Quote:


> Originally Posted by *Silent Scone*
> 
> https://www.youtube.com/watch?v=spZJrsssPA0&feature=youtu.be Has this been posted yet lol?


ROFL that was too funny.


----------



## mouacyk

Quote:


> Originally Posted by *skupples*
> 
> Dying Light has a CPU utilization bug where it pegs CPU 0 @ 99% = BOOM, STUTTERING! VRAM doesn't matter at all in this situation.
> 
> Go turn off core zero and it gets MUCH MUCH smoother, and GPU usage actually goes above 30-40%.
> 
> A patch is incoming for this issue.


Well, that's a relief, that it's not an issue with the GTX 980.


----------



## IRO-Bot

Quote:


> Originally Posted by *2010rig*
> 
> Did they acknowledge their mistake? Yep.


Yes, they acknowledged their mistake and then said that's how they designed the 970 to be. They just didn't tell anyone it was designed that way, all because they didn't want the boxes to show smaller numbers.


----------



## Silent Scone

And that's why the dual GPU cards are advertised as their total capacity and not their usable capacity.

It makes good marketing.
Quote:


> Originally Posted by *mouacyk*
> 
> Well, that's a relief, that it's not an issue with the GTX 980.


SLI pushes the memory over. One card is manageable.


----------



## MerkageTurk

Quote:


> Originally Posted by *sexpot*
> 
> and here i am, waiting for a possible 970 price cut? i can't be ass'd about the .5gb difference as my games won't be demanding anywhere near 3.5+ at 1080p


Well, if that is the case, then why not a 780 Ti?

Only .5GB less, plus much higher bandwidth. These new Maxwell parts are not the WOW factor.


----------



## mouacyk

Quote:


> Originally Posted by *Silent Scone*
> 
> And that's why the dual GPU cards are advertised as their total capacity and not their usable capacity.
> 
> It makes good marketing.
> SLI pushes the memory over. One card is manageable.


Are you having specific issues with the GTX 980 in SLI that I can help you with? I cannot guarantee that I can get you a refund, but I will try my very best.


----------



## LancerVI

Quote:


> Originally Posted by *2010rig*
> 
> And hence you went with 980's.
> 
> 
> 
> 
> 
> 
> 
> I'm going 4K soon too, but will wait out for next gen cards. I hope either camp can deliver 4K 60 FPS *single card*.
> 
> I'm not excusing NVIDIA for what they did, they had to differentiate the 970 & 980 somehow. Outside the 5% of us who understand memory pools and partitions, the general public wouldn't have had a clue of what that means. Just trying to remind people of the bigger picture that the performance is what was promised.


I think what people are arguing with you about, and what you seem to refuse to acknowledge, is that when they purchase these cards, it's with an eye towards the future, not just how they perform on release and today. While I'm not a big fan of the term 'future proofing', I certainly make my purchases to last, for my purposes, at least 2 generations, give or take. Now I go back and forth between AMD and nVidia. The reason I run AMD now is because of the 4GB of vRAM. It was said to be overkill at the time, but now we have Shadow of Mordor and games like it that can really eat all that up, even at 1080p, if you're supersampling.

So, in the end, I agree; the performance is what they said it was when they launched it, but the fact they didn't disclose the true 'layout' of their design fundamentally alters people's perception of how it'll perform down the line. That's wrong. Mistake or purposeful, there's a price to pay for that and no amount of defending them is ever going to change that. Having said that, it seems to me that nVidia is trying to do the right thing.


----------



## iSlayer

Quote:


> Originally Posted by *2010rig*
> 
> because of the 256 bit bus, it would be 8GB. Though it's pointless since you're going to run out of GPU power, way before that 8GB is truly utilized.


I wouldn't touch 8GB 290X's/970s unless three-way. Maybe 8GB 980s in two-way.

The VRAM vs. the performance...

4GB is definitely much better; I'm glad the mainstream amount has gone up. I wouldn't say it's future proof, but it's much better for those edge scenarios that can become common, i.e. SoM ultra.


----------



## 2010rig

@LancerVI I'm not refusing to accept that, I'm being realistic. I get that people want to use that as an argument, but nothing is ever "future proof", just look at my card, it didn't stand the test of time. I can't game 4K with it. Should I be angry about that, or should I be upgrading?

Do you remember when the Athlon 64 first came out? The term future proofing was tossed around left, right, and center. It took 3 years for a true 64 bit OS, and by that time, that Athlon was obsolete. ( Yes, I did have one of those processors )

Should people be buying the *290X 8GB* since it's more "future proof" due to TWICE the amount of memory? If memory was so important in this case, why are the 8GB and 4GB cards putting out the *EXACT* same frame rates?



Point is, these cards will be running out of GPU power before RAM becomes the issue. In PCPER's example, they had to scale 4K up to 150% resolution ( I'm sure I worded that wrong ) just to use ALL 4GB. And these cards are meant to excel at 1080p ( 33% of the market ), not 4K ( 0.03% of the market ).

By the time 4K becomes truly mainstream, there will be much more powerful cards to run that resolution.
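For anyone puzzling over that 150% figure, here's the raw pixel math, reading "150%" as per-axis DSR-style scaling (my assumption of what PCPer's setting does):

```python
# Framebuffer math for the resolutions discussed in the thread.
# "150%" is read as per-axis scaling (an assumption): 1.5x width and 1.5x height.
def fb_bytes(w, h, bytes_per_pixel=4):
    """Bytes for a single 32-bit color buffer at w x h."""
    return w * h * bytes_per_pixel

resolutions = {
    "1080p":     (1920, 1080),
    "4K":        (3840, 2160),
    "4K @ 150%": (int(3840 * 1.5), int(2160 * 1.5)),  # 5760 x 3240
}

for name, (w, h) in resolutions.items():
    mib = fb_bytes(w, h) / 2**20
    print(f"{name:>10}: {w}x{h} = {w*h:,} px, ~{mib:.1f} MiB per 32-bit buffer")
```

Even the supersampled buffer is only ~71 MiB per 32-bit surface; what actually fills the 4GB is textures plus the many intermediate render targets, which is roughly why GPU grunt tends to run out before memory at sane settings.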


----------



## LancerVI

I hear what you're saying @2010rig, but I do think we are already seeing some titles break the 4GB barrier, and a card that just came out a few months ago, the 970, is definitely going to have even more problems with that than, say, a straight-up 4GB card. That's just my guess, based on how it's allocated, though; I admit I have nothing to back that up. It's just a guess.

But that's the problem. This is a perception issue. I know tech folks here on OCN don't like to hear that. They want facts, specs and numbers. Who doesn't? But perception is a hard beast to tame. I have to admit, given what we know now, it would give me pause before buying a 970.


----------



## criminal

Quote:


> Originally Posted by *2010rig*
> 
> @LancerVI I'm not refusing to accept that, I'm being realistic. I get that people want to use that as an argument, but nothing is ever "future proof", *just look at my card, it didn't stand the test of time. I can't game 4K with it.* Should I be angry about that, or should I be upgrading?
> 
> Do you remember when the Athlon 64 first came out? The term future proofing was tossed around left, right, and center. It took 3 years for a true 64 bit OS, and by that time, that Athlon was obsolete. ( Yes, I did have one of those processors )
> 
> Should people be buying the *290X 8GB* since it's more "future proof" due to TWICE the amount of memory? If memory was so important in this case, why are the 8GB and 4GB cards putting out the *EXACT* same frame rates?
> 
> 
> 
> Point is, these cards will be running out of GPU power before RAM becomes the issue. In PCPER's example, they had to scale 4K up to 150% resolution ( I'm sure I worded that wrong ) just to use ALL 4GB. Considering these cards are meant to excel at 1080p ( 33% of the market ) not 4K ( 0.03% of the market )
> 
> By the time 4K becomes truly mainstream, there will be much more powerful cards to run that resolution.


LOL... you have a 470. 4K wasn't even a thought at the time.


----------



## iSlayer

You're both right, damn it.

Even for 4K, 4GB isn't always necessary. But there are those games that just eat VRAM like it's got a disease, and for those situations it's not bad at all. It's not stupidly common, and unless you're going dual GPU, VRAM should be more of an afterthought unless you have specific needs.


----------



## looniam

Quote:


> Originally Posted by *2010rig*
> 
> Should people be buying the *290X 8GB* since it's more "future proof" due to TWICE the amount of memory? If memory was so important in this case, why are the 8GB and 4GB cards putting out the *EXACT* same frame rates?


maybe to raise the min frame rate in _some games_ while in Xfire? (*warning: cherry-picked benchmarks incoming*)



like 50% more . .

as much as i abhor the "VRAM hype", i do wonder: since 2GB is the min recommendation for 1080p, why wouldn't 8GB be recommended for 4x the pixels, given enough gpu grunt?
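One answer to that question: only the render-target chain scales with pixel count, while the texture pool (usually the bulk of the budget) doesn't. A toy model of that split, where the six-target chain and the 2.5GB texture pool are entirely made-up illustrative numbers:

```python
# Toy VRAM model: render targets scale with resolution, textures do not.
# The 6-target chain and 2560 MiB texture pool are illustrative assumptions,
# not measurements from any real game.
def vram_estimate_mib(w, h, num_targets=6, texture_pool_mib=2560):
    targets_mib = w * h * 4 * num_targets / 2**20  # 32-bit render targets
    return targets_mib + texture_pool_mib

print(f"1080p: ~{vram_estimate_mib(1920, 1080):.0f} MiB total")
print(f"4K:    ~{vram_estimate_mib(3840, 2160):.0f} MiB total")
# 4x the pixels raises only the render-target term, so the total
# grows by a few percent rather than 4x.
```

Under these made-up numbers, going from 1080p to 4K adds only ~140 MiB, which is roughly why VRAM recommendations don't scale linearly with resolution; supersampling and higher-res texture packs are what really move the needle.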


----------



## mtcn77

Quote:


> Originally Posted by *2010rig*
> 
> @LancerVI I'm not refusing to accept that, I'm being realistic. I get that people want to use that as an argument, but nothing is ever "future proof", just look at my card, it didn't stand the test of time. I can't game 4K with it. Should I be angry about that, or should I be upgrading?
> 
> Do you remember when the Athlon 64 first came out? The term future proofing was tossed around left, right, and center. It took 3 years for a true 64 bit OS, and by that time, that Athlon was obsolete. ( Yes, I did have one of those processors )
> 
> Should people be buying the *290X 8GB* since it's more "future proof" due to TWICE the amount of memory? If memory was so important in this case, why are the 8GB and 4GB cards putting out the *EXACT* same frame rates?
> 
> 
> 
> Point is, these cards will be running out of GPU power before RAM becomes the issue. In PCPER's example, they had to scale 4K up to 150% resolution ( I'm sure I worded that wrong ) just to use ALL 4GB. Considering these cards are meant to excel at 1080p ( 33% of the market ) not 4K ( 0.03% of the market )
> 
> By the time 4K becomes truly mainstream, there will be much more powerful cards to run that resolution.


Then again, why would you buy an 8GB card without the intention of 2-3-4 Crossfire, may I ask? The only benefit of the extra VRAM buffer is whether or not it avoids a potential VRAM bottleneck that a higher tier of visual quality may evoke.


----------



## 2010rig

Quote:


> Originally Posted by *LancerVI*
> 
> I hear what you're saying @2010rig but I do think we are already seeing some titles break the 4GB barrier and a card that just came out a few months ago, the 970, is definitely going to have even more problems with that than say a straight up 4 GB card. That's just my guess, based on how it's allocated, though, I admit I have nothing to back that up. It's just a guess.
> 
> But that's the problem. This is a perception issue. I know tech folks here on OCN don't like to hear that. They want facts, specs and numbers. Who doesn't? But perception is a hard beast to tame. I have to admit, given what we know now, it would give me pause before buying a 970.


I'm not denying that some games are starting to push the boundaries, which is great. In the grand scheme of things, people should be realistic with their expectations from a $330 card for now, and in the future.


Spoiler: Read this Short Story



Quote:


> Originally Posted by *nleksan*
> 
> I don't know if I could consider a game like SoM using the Ultra (uncompressed) textures as a reliable, or perhaps more importantly a representative, test of "core power V memory ability"...
> 
> The most recent build I've done was for a friend, who having finished his Neurosurgery Residency program decided (110 percent correctly) that he was due for a massive "guilt free, pleasure purchase", and after having introduced him to F@H when I had three highly clocked KPE's folding (and our respective careers and education providing at least a modicum of better than average understanding as to the implications of such an endeavor), he asked me to build him an "Absolutely No Budgetary Constraints, Insanely Over-the-top PC for gaming at the highest possible visuals and just as important is that it not sacrifice GPGPU performance as it will be running F@H about 18-20/7/365". CaseLabs, borderline too intricate loop, 2x different display setups, a real audio system, and so forth.
> 
> As you likely guessed, it consists of a 5960X (@4.7-4.8), Rampage V Extreme, 32GB GSkill DDR4-3200, Xonar STX II, TH10 decked out to the brim, every drive that isn't "video playback storage" or "archival data" is SSD (DC S3700 800GB primary, 2x 850 Pro 1TB RAID0 despite my dislike of RAID0 SSD, 2x SLC NAND enterprise drives 80GBx2 RAID0 as the Temp/Swap/Page/Cache directory, and 3x Samsung 840 Pro 1TB independent of one another; HDD's are 4x WD RE 4TB RAID10 local and a concurrently built NAS/media server w 8x HGST Ultrastar 4TB in RAID6 via Areca 1883ix-24i-8G for expansion and performance that has me reconsidering the value of R6 having multiple of the same RAID Controller Cards), and so on and so forth...
> 
> The important part is that, after being very impressed by my KPE's, he asked if there could possibly be anything better, and while I would normally say that the Titans are extremely niche, as much as a highly specifically marketed/intended user card like the KPE, the fact that F@H would be using the significant majority of the computer's time, and the enhanced compute capabilities, stand out.
> Because I am not 100 percent reckless with money (hovering at a mere 99 percent so
> 
> 
> 
> 
> 
> 
> 
> ), I suggested we get a trio of them from a reputable seller, used, and go from there, as with his intention of running either up to 1-3x 4K (40-47") AND an infinitesimally more traditional 3x LG34UM97 34" 3440x1440p Surround setup (more for productivity than anything else, not to mention the NEC 4096x2360(?) Medical Imaging Display he was given to be able to review records (scans/etc) at home prior to surgery (the thing's well into the "areyoufreakingseriousgivemenow!" price range, something like $13k or more?
> So, along came three LATE model Titans, fewer than 3hrs use each and with a PDF including detailed rundowns of the 34 benchmarks used to find the clocking ability of each, and even offered to fully refund cards and shipping should they be unable to attain the high but brilliantly documented clocks advertised. @ $800 apiece, not cheap, but the inclusion of 3x truly BNIB Aquacomputer Kryographics Full Coverage Copper Blocks with Active Backplates and 2x 250x250mm^2 sheets of Fujipoly Extreme 17w/mK thermal pads (the shipment was directly from Aquatuning to my address, the seller had sold the cards before he expected to, to us, so BNIB means full factory seals) worth at least another $250+ per card...
> As always, the one place on OCN where everyone forgets their petty bickering and hissy fits is the Classifieds, where either out of fear of repercussions, genuinely caring about the interests of the buyer, or a mix, I have come to find that anyone who doesn't go out of their way to ensure that the buyer will be pleased is the exception, and a red flag even.
> 
> Well, a recurring physical ailment that presents itself as nearly suicide-contemplating levels of pain and necessitates absolutely insane amounts of opiates 10-200x more potent than morphine sulfate (8-18mg IV Dilaudid HP for BTP, 8x stronger; Oxymorphone HCL 10-12x more potent @ 80mg 4x/day oral plus 2-8x/day 10mg IV solution; Actiq 1800ug buccal lozenge Fentanyl Citrate 2-3x/day @ around 80x more potent; trial drug formulation of combination Transdermal Sufentanil 250ug/hr patches and 2500ug ampoules for injection of 200-400ug 2-6x/day, 70-90x potency; and I have been on over 2 dozen others over time, due to the condition being chronic but also cyclical albeit unpredictable in nature; for another point of reference, IV diacetylmorphine is 1.3-1.6x the potency of MSO4 but you likely know the original Bayer trademark name of "Heroin"...), the most important thing is keeping my mind occupied and myself busy, so it took only a week to make this beautiful system.
> 
> 3x Titans @ 1380/7400, and the doubled VRAM over my KPE's but the latter at 1420/7800 with the small increase in shaders, and neither system would run SoM Ultra Textures without issue. In fact, the performance was really identical for all practical purposes...
> 
> The next day he ordered 3x Titan Blacks + AC Kryographics blocks/backplates 2nd day air to see what they would do despite the more stringent limitations on clocking via voltage, and once again the performance was unacceptable in this game.
> The mutual friend in possession of my former 290X Lightnings (blocks/backplates as well) offered to let us try them, and they struggled as much or more at my highest ambient-water clocks of 1340/6000, and even with all three Hailea chillers dropping the water temps to barely above zero (allowing something like 1400/6100) showed zero improvement.
> 
> The Titan Black performance in F@H was too compelling to consider anything else, and they demolish every other game we've tried, so they've found a permanent home in this rig.
> He of course (of course :S) kept the 3x Titans, running them in a lower cost system 24/7/365 for F@H, and I gave him right of first refusal when my KPE's went to market, but he didn't have another system to use at the time.
> 
> My point is, there is ALWAYS going to be that one nitpicky scenario where no matter what you use, the demands exceed the capabilities of the hardware, and if the three most powerful cards of recent times struggle (in 3-way SLI/CFX), it makes it an absolute certainty that the POWERFUL but corner-cut GM204 cards would not succeed where these could not...
> 
> I rarely get to build a true no-holds-barred system like this, where the ONLY relevant factor is performance, and even being told "don't even look at the prices, it's irrelevant, I trust your obsessive nature and knowledge/experience to make the right call", I was not going to needlessly spend money that wouldn't serve to benefit his needs (which are more than just those stated prior). My only concern was that on first use, he'd be stuck with a Cheshire grin for months (I imagine patients might find such an expression disconcerting plastered on the face of the man who will be cutting into their brain, lol).
> 
> The idea that something that was able to bring a truly "ultra premium" setup to its knees, would be the best thing to use to judge the performance of a significantly cheaper card with lower maximum performance capabilities seems backwards to me...
> 
> It's simply delusional to expect a 300 dollar card to do what a trio of cards totalling well over $3000 struggled significantly with...
> 
> Bottom line
> TL;DR
> Cutting to the Chase
> Etc
> 
> "Proper management of expectations is the only true means of keeping disappointment from permeating its way into every weave of the fabric that makes up your life"





Quote:


> Originally Posted by *criminal*
> 
> LOL... you have a 470. 4K wasn't even a thought at the time.


No it wasn't, I looked in my crystal ball, and was actually thinking of 5K & 8K at the time. I'm soooo disappointed in my future proofing attempts.









Wish I had gone with a 580 instead, that extra 256MB of RAM would've made a world of difference.








Quote:


> Originally Posted by *looniam*
> 
> maybe to raise the min frame rate in _some games_ while Xfire? (*warning cherry picked benchmarks incoming*)
> 
> like 50% more . .
> 
> as much as i abhor the "VRAM hype" i do wonder since 2Gbs is the min recommendation for 1080 why wouldn't 8 gb be recommended for 4x the pixels when having enough gpu grunt?


Quote:


> Originally Posted by *mtcn77*
> 
> Then again, why would you buy an 8GB without the intention of 2-3-4 Crossfire, may I ask? The extra VRAM buffer only offers benefit when you can convert more FPS into higher tier quality.


Sorry, I'm invalidating both of these points, as they don't support my original point.









Repped you both.


----------



## rickcooperjr

Quote:


> Originally Posted by *2010rig*
> 
> @LancerVI I'm not refusing to accept that, I'm being realistic. I get that people want to use that as an argument, but nothing is ever "future proof", just look at my card, it didn't stand the test of time. I can't game 4K with it. Should I be angry about that, or should I be upgrading?
> 
> Do you remember when the Athlon 64 first came out? The term future proofing was tossed around left, right, and center. It took 3 years for a true 64 bit OS, and by that time, that Athlon was obsolete. ( Yes, I did have one of those processors )
> 
> Should people be buying the *290X 8GB* since it's more "future proof" due to TWICE the amount of memory? If memory was so important in this case, why are the 8GB and 4GB cards putting out the *EXACT* same frame rates?
> 
> 
> 
> Point is, these cards will be running out of GPU power before RAM becomes the issue. In PCPER's example, they had to scale 4K up to 150% resolution ( I'm sure I worded that wrong ) just to use ALL 4GB. Considering these cards are meant to excel at 1080p ( 33% of the market ) not 4K ( 0.03% of the market )
> 
> By the time 4K becomes truly mainstream, there will be much more powerful cards to run that resolution.


Tell the gamers running a lot of mods on Skyrim / Fallout New Vegas that 4GB or 3.5GB of VRAM is plenty; they go well over the 4GB VRAM barrier, and do so on older games, many at 1080p, long before they run out of GPU HP, so PLZ rethink your way of thinking.

I know nearly everyone that has played Skyrim has run mods and such to make it prettier or change the game up to their wants / likes. I run around 250 mods, and my 4GB of VRAM is nearly maxed 90% of the time at 1080p, and that is saying something; imagine 4K, you would definitely need 8GB of VRAM or so, so PLZ rethink your mode of thinking.

PLZ keep in mind DX12 will likely follow suit with Mantle; considering that DX12 is a Mantle clone, it will have greatly increased VRAM usage ( just the nature of the beast: to get better / increased draw calls, more pre-rendered objects and such are loaded into VRAM ), so we need to keep that in mind. PLZ also understand why this is an issue with a card that says 4GB but is really 3.5GB of fast RAM and 512MB of slow RAM, where the second you touch the slow RAM things get crazy and performance swirls the drain.


----------



## criminal

Quote:


> Originally Posted by *2010rig*


Could have gotten a 3GB 580.


----------



## skupples

Oh god. Ricky is here. I'm out.


----------



## rickcooperjr

Quote:


> Originally Posted by *skupples*
> 
> Oh god. Ricky is here. I'm out.


don't worry, I plan to keep it minimal and only put a few posts here. By the way, you are just now finding out I am posting here; I have posted a few times over the past few days.


----------



## 2010rig

Quote:


> Originally Posted by *criminal*
> 
> Could have gotten a 3GB 580.


I knew that was coming, and I was prepared... They weren't available at the time of purchase & the Price / Performance ratio sucked on those things.









btw - I don't play Skyrim, so Rick's point doesn't apply to me.

Playing 1 game, with 250 mods would certainly fall into the *"outside of the norm"* category. Is that why 3 290X's are needed?









( I'm having too much fun with this, at this point. )


----------



## Asus11

it seemed the 970 was too good to be true and finally we see the truth.

for some reason while I had the 970 in my possession my gut feeling told me to send it back and I knew there had to be something up!

I bet every r9 290 owner is so proud right now.. and rightly so









AMD might sell heaters but at least you get the raw truth & performance listed


----------



## rdr09

Quote:


> Originally Posted by *skupples*
> 
> Oh god. Ricky is here. I'm out.


LOL


----------



## spacin9

Quote:


> Originally Posted by *skupples*
> 
> that's your own fault.
> 
> 780 sli >970 sli


Exactly.









http://www.overclock.net/t/1518806/fire-strike-ultra-top-30/800#post_23474078


----------



## rickcooperjr

Quote:


> Originally Posted by *2010rig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *criminal*
> 
> Could have gotten a 3GB 580.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I knew that was coming, and I was prepared... They weren't available at the time of purchase & the Price / Performance ratio sucked on those things.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> btw - I don't play Skyrim, so Rick's point doesn't apply to me.
> 
> Playing 1 game, with 250 mods would certainly fall into the *"outside of the norm"* category. Is that why 3 290X's are needed?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ( I'm having too much fun with this, at this point. )

No, that is for Eyefinity at 7680x1600 ( 3x 2560x1600 screens ), doing Crysis 3 / Far Cry 4 / Skyrim / BF4 or any other games I feel the urge to.

I have like 7 290X's floating around here. I used to be a crypto miner, and when I took my mining rigs down I decided to put them to use in awesome gaming rigs: one is 5760x1080 ( 3x 1920x1080 ) with 2x R9 290X Sapphire Tri-X OC's; the other is 3x R9 290X Matrix's at 7680x1600 ( 3x 2560x1600 ). I got my money out of them mining, and now I torture the crap out of them gaming.


----------



## skupples

Quote:


> Originally Posted by *rickcooperjr*
> 
> don't worry I plan to keep it minimal and only put a few posts here by the way you are just now finding out I am posting here I have posted a few posts in here over past few days.


yea, I seem to keep getting lost.

I've asked for people to define PCPer's shillhood multiple times now, and I either keep missing the responses, or no one has a response.

apparently current/ex 7970 owners are still butthurt over PCPER forcing AMD to fix something that was long broken.

FC4 doesn't really consume that much VRAM... at least, it has lower usage than other recent ubi titles.

but hey! The game runs like ass, so maxing settings in 1440P surround is obscenely unlikely, unless you enjoy playing between 25-35 FPS.


----------



## rickcooperjr

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> don't worry I plan to keep it minimal and only put a few posts here by the way you are just now finding out I am posting here I have posted a few posts in here over past few days.
> 
> 
> 
> yea, I seem to keep getting lost.
> 
> I've asked for people to define PCPer's shillhood multiple times now, and I either keep missing the responses, or no one has a response.
> 
> apparently current/ex 7970 owners are still butthurt over PCPER forcing AMD to fix something that was long broken.
> 
> FC4 doesn't really consume that much VRAM... at least, it has lower usage than other recent ubi titles.
> 
> but hey! The game runs like ass, so maxing settings in 1440P surround is obscenely unlikely, unless you enjoy playing between 25-35 FPS.

I max Far Cry 4 at around 45-60 fps at 7680x1600; every once in a blue moon it drops to about 35 fps, but in general stays well above that. The issue is that when you come out of the cutscenes you get a lag / stutter ( GPU clocking up and loading textures and such; horrible coding game-side, it should be pre-rendered before the end of cutscenes ), but outside of that it is all good for me.

I want to say I usually play on a single screen but occasionally decide to do Eyefinity. You know, the urge or itch, and you do it, and well, it is good till your neck starts hurting and your eyes burn from eye strain, but let me tell you, it adds a lot more immersion to the game.


----------



## skupples

Quote:


> Originally Posted by *rickcooperjr*
> 
> I max Far Cry 4 at around 45-60 fps, every once in a blue moon drop to about 35, but in general stay well above. It is when you come out of the cutscenes that you get a lag / stutter ( GPU clocking up and loading up textures and such; horrible coding game-side, it should be pre-rendered before the end of cutscenes ), outside of that it is all good.


in 1440p eyefinity in tri-fire? So you're forcing AFR then?

You are either lucky, or lying, based on the hundreds of results I've seen from both NV & AMD...

Tri-fire/sli w/ GK110 can barely do that @ 1080P surround at "max"


----------



## dejo1

nvidia tells most sites that review their products what games to use, how to use them, and what to compare with. Makes me think "yessa massa". It looks like PCPer got part of their tests sent from nvidia themselves. How is that not a shill?


----------



## skupples

Quote:


> Originally Posted by *dejo1*
> 
> nvidia tells the most sites that review their products- what games to use, how to use games and what to compare with. Makes me think "yessa massa" it looks like pcper got part of his tests sent from nvidia themselves. How is that not a shill


That's classic technique for both companies.

Do I need to go dig up the pre-release 290/290x slides? The ones that didn't even come close to mirroring reality?

Or how about the pre-release record breaking benchmarks in 3Dmark, which people couldn't replicate w/o water & massive Overclocks?


----------



## rickcooperjr

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> I max farcry 4 at around 45-60 fps every once in a blue moon drop to about 35 but in general stay well above it is when you come out of the cutscenes and such you get a lag / stutter 9 GPU clocking up and loading up textures and such horrible coding game side it should be pre rendered before end of cutscenes ) outside of that it is all good.
> 
> 
> 
> in 1440p eyefinity in tri-fire? So you're forcing AFR then?
> 
> You are either lucky, or lying, based on the hundreds of results I've seen from both NV & AMD...
> 
> Tri-fire/sli w/ GK110 can barely do that @ 1080P surround at "max"

I guess I am lucky. I do have ULPS disabled, the CPU running locked clocks of 5GHz x8, core parking disabled for gaming, and I'm running the AMD Omega drivers, and it seems all good for me. I do have occasional dips, but those are just random here and there, usually only coming out of cutscenes.

I do only average 80%-90% GPU usage though, but that could be because of game-side issues. Don't tell me I am CPU limited, because I don't even want to get into that bag of hornets.


----------



## looniam

Quote:


> Originally Posted by *2010rig*
> 
> Sorry, I'm invalidating both of these points, as they don't support my original point.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Repped you both.


for the record:

i wasn't confirming or denying your point but making my own. because *IT'S ALL ABOUT ME!*









(though thanks for the +1)


----------



## PureBlackFire

Quote:


> Originally Posted by *criminal*
> 
> Could have gotten a 3GB 580.


it's been proven a hundred times that those were literally a waste of money.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *raghu78*
> 
> I cannot believe people are defending Nvidia on this. AMD got hammered on the reference Hawaii cooler, and the result was AMD delivering a much better solution with the R9 295X2. Similarly, the public and press should hold Nvidia accountable. Either change the marketing material and info on the product box, or recall the product. The easier option is the first, and it's not going to cost a lot: clearly mention the actual bandwidth for the last 0.5 GB on the box. What we also need is the press informing users with extensive testing, so as to enable an informed decision. *I hope pcper and other sites like techreport show the same diligence that they showed with the HD 7900 series frametime issues.*


I wouldn't hold my breath on that one. Especially not if Ryan Shrout has anything to say about it...


----------



## skupples

What more are people looking for from review sites on this, short of admitting that they didn't test the cards hard enough? PCPER has been updating and testing regularly. Just like everyone else.


----------



## Forceman

Quote:


> Originally Posted by *skupples*
> 
> What more are people looking for from review sites on this, short of admitting that they didn't test the cards hard enough? PCPER has been updating and testing regularly. Just like everyone else.


Admit that it's all a giant Nvidia conspiracy? Show a video of the money truck driving up to their office paying them off? Hail our AMD overlords?

Frankly, at this point I have no idea.


----------



## provost

That Shrout fella always seemed too cozy with Peterson for my liking... I mean, where is the journalistic tension between the two? They practically look like they are in love.









Ok..I kid..I kid....don't kill me please ..lol


----------



## LancerVI

....as for those gtx 580's. I have a pair of 1.5GB that still do damage at 1080p in SLI. My son's build is doing nicely with those.

I'm actually quite happy with how long I've been able to use them, having added the second one recently.


----------



## tpi2007

Quote:


> Originally Posted by *2010rig*
> 
> I'm not denying that some games are starting to push the boundaries, which is great. In the grand scheme of things, people should be realistic with their expectations from a $330 card for now, and in the future.


Being realistic is one thing; not expecting, or at least not admitting the possibility, that games coming out within the next year might very well start to push the 970 beyond what it can handle, given its memory segmentation and lack of ROPs and L2 cache, is another.

Tech sites and some of you seem to have frozen the games' _status quo_ at today, refusing to admit the possibility that games' VRAM usage will go up over the next year, and thus 'everything's fine, nothing to see here'. As has been said, when the 290X was released, 4GB was also considered unnecessary. Yet here we are, and games are starting to use it fully.

You also touch on another important point, though: that people shouldn't expect much from a $330 card. But the truth is, that's exactly what the PR (intentionally or not) got across. The GTX 970 was seen as an awesome deal, too good to be true, yet real. Do you remember how enthusiastic some sites were? TechPowerUp even used profanity in their conclusion to describe what they thought about AMD's competitive positioning after the card's release. That's how insane this was.

But you also have to look at the context that made this a more believable situation. The specs were all there: 64 ROPs (same as AMD's cards), 2 MB of L2 cache to help ease the bandwidth concerns, 4GB of VRAM (same as AMD again); and AMD's cards had been on the market for a year, so Nvidia benefited from making a splash. Add to that the complaints about previous Nvidia flagship cards being too expensive, and you could reasonably believe that the 970's price was Nvidia's way not only to compete with AMD, but also to appease its customers and sort of make up for the past expensive cards.

So add all this together, and it's a big stretch to even try to put the blame on customers who believed the specs that were published.


----------



## provost

Another very thoughtful and eloquent post.


----------



## skupples

Come on TPI, obviously the card will start getting its ass kicked. That's how this technology stuff works.

There's this awesome thing called turning your settings down. I do it all the time with my Titans, and I've been doing it from day one.

This is why I find it hard to believe that anyone is running FC4 @ 60 FPS at 7680x1440 on max settings. Because I know how surround/eyefinity works.

Yes NV lied. I'm not disputing that. I'm saying that the 980 will age as quickly as the 970 or any other card.


----------



## provost

Yeah, but to be fair, skupps, he did not get the specs that he paid for. Period.
I would be mighty upset too, if I were him.
Is NV still honoring returns?


----------



## skupples

I get that. Totally. I'm just challenging the card-aging part. They do that, specs be damned.

Especially now that NV has established a pattern with driver support. This is somewhere AMD wins; they seem to support their cards for much longer.


----------



## PureBlackFire

Quote:


> Originally Posted by *skupples*
> 
> Come on TPI, obviously the card will start getting its ass kicked. That's how this technology stuff works.
> 
> There's this awesome thing called turning your settings down. I do it all the time with my Titans, and I've been doing it from day one.
> 
> This is why I find it hard to believe that anyone is running FC4 @ 60 FPS at 7680x1440 on max settings. Because I know how surround/eyefinity works.
> 
> Yes NV lied. I'm not disputing that. I'm saying that the 980 will age as quickly as the 970 or any other card.


Aside from the admission of reducing filters on a $1K card (hehe), I'll agree with this. They lied, of course. And anyone who thinks paying more for a 980 is going to benefit them is nuts; its legs are no longer than the 970's in the long run.


----------



## sugalumps

Quote:


> Originally Posted by *tpi2007*
> 
> Being realistic is one thing, not expecting, or at least not admitting the possibility that games coming out within the next year might very well start to push the 970 beyond what it can, given its memory segmentation and lack of ROPs and L2 cache, is another.
> 
> Tech sites and some of you seem to have frozen the games _status quo_ to today and not admit the possibility that games' VRAM usage won't go up for the next year and thus 'everything's fine, nothing to see here'. As has been said, when the 290X was released, 4GB was also considered unnecessary. Yet here we are, games are starting to use it fully.
> 
> You also touch another important point though, that people shouldn't expect much from a $330 card, but the truth is, that's exactly what the PR (intentionally or not) got across. The GTX 970 was seen as an awesome deal, too good to be true, yet real. Do you remember how enthusiastic some sites were ? *TechPowerUp even used profanity in their conclusion to describe what they thought about AMD's competitive positioning after the card's release. That's how insane this was.*
> 
> But you also have to look at the context that made this a more believable situation: not only were the specs there: 64 ROPs (same as AMD's cards), 2 MB of L2 cache to help ease the bandwidth concerns, 4GB of VRAM (same as AMD again), and AMDs cards had been on the market for a year, so Nvidia benefited from making a splash. Add to that the complaints about previous Nvidia flagship cards being too expensive, and you could reasonably believe that the 970's price was Nvidia's way to not only compete with AMD, but also to appease its customers and sort of make up for the past expensive cards.
> 
> So add all this together, and it's a big stretch to even try to put the blame on customers who believed the specs that were published.


That was true, though, until this all went down; the 970 has been selling out the door for months now. I doubt AMD has seen much action with the 290s/290Xs since the 970 launch. Except for the 0.5GB VRAM loss, the card is still the immense deal/seller that it was in those reviews. I wonder how well it would have sold, though, if Nvidia had officially launched it as a 3.5GB VRAM card; probably a lot more 980 sales.


----------



## tpi2007

Quote:


> Originally Posted by *provost*
> 
> Another very thoughtful and eloquent post.


Thanks!









Quote:


> Originally Posted by *skupples*
> 
> Come on TPI, obviously the card will start getting its ass kicked. That's how this technology stuff works.
> 
> There's this awesome thing called turning your settings down. I do it all the time with my Titans, and I've been doing it from day one.
> 
> This is why I find it hard to believe that anyone is running FC4 @ 60 FPS at 7680x1440 on max settings. Because I know how surround/eyefinity works.
> 
> Yes NV lied. I'm not disputing that. I'm saying that the 980 will age as quickly as the 970 or any other card.


That's a given, but it's also a given that we are talking about a year-long time frame in which consumers had certain expectations about the 970's relevance, which might not hold true now; people's valuation of the card might have been different if they had known the correct specs.

As I said earlier, the card might have been demoted from 'awesome' to 'good', by the press and consumers alike. And many enthusiastic purchases might not have happened. And it's not just a matter of "what would you buy instead", but also of whether people wouldn't instead have chosen to wait a few more months for GM200, AMD R9 300 series, etc.

Consider this: many people choosing 2x 970 for 4K SLI because of the awesome price/performance might have thought "hmmm, for this price I might as well buy the GTX 1080 (non Ti), get perhaps a little less fps, but pretty much playable 4K performance on a single card with 6 GB VRAM, with the possibility to add another one down the road." The 970's incorrectly reported specs may very well have made the difference between buying and waiting.


----------



## skupples

Quote:


> Originally Posted by *PureBlackFire*
> 
> *aside from the admission of reducing filters on a $1K card* (hehe), I'll agree with this. They lied, of course. And anyone who thinks paying more for a 980 is going to benefit them is nuts; its legs are no longer than the 970's in the long run.


It's the truth.

Surround is a beast, and Nvidia is letting it slowly rot away.
Yes, they recently added 5-monitor support, but with ZERO FANFARE. Hell, they didn't even mention it in the driver update beyond one tiny little line! I mean, when was the last time you read an Nvidia driver update that actually mentioned working on the quality of Surround? I scan their PDFs for every driver release, and they NEVER say anything about it. Surround worked much better one year ago than it does today. Yes, it's a per-application situation in some regards, but in others it just seems NV really doesn't care about it anymore. This is yet another reason why I'm eyeing the 390X, unless we get amazing news about Pascal, HBM, and Project Denver. Really though, it's looking like there will be a good 18-24 months between Big Boy Pascal and the 390X.


----------



## criminal

Quote:


> Originally Posted by *PureBlackFire*
> 
> it's been proven a hundred tiomes those were literally a waste of money.


I know. Just picking at 2010rig and his 4K comment.


----------



## Vesku

Quote:


> Originally Posted by *2010rig*
> 
> I knew that was coming, and I was prepared... They weren't available at the time of purchase & the Price / Performance ratio sucked on those things.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> btw - I don't play Skyrim, so Rick's point doesn't apply to me.
> 
> Playing 1 game, with 250 mods would certainly fall into the *"outside of the norm"* category. Is that why 3 290X's are needed?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ( I'm having too much fun with this, at this point. )


I'm wondering if this 970 3.5GB stuff is going to be rehashed with the next Elder Scrolls or Fallout game. Now, with Steam Workshop available at game launch (a feature that debuted a bit after Skyrim's launch), even more people are going to try out mods.


----------



## rickcooperjr

Quote:


> Originally Posted by *Vesku*
> 
> Quote:
> 
> 
> 
> Originally Posted by *2010rig*
> 
> I knew that was coming, and I was prepared... They weren't available at the time of purchase & the Price / Performance ratio sucked on those things.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> btw - I don't play Skyrim, so Rick's point doesn't apply to me.
> 
> Playing 1 game, with 250 mods would certainly fall into the *"outside of the norm"* category. Is that why 3 290X's are needed?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ( I'm having too much fun with this, at this point. )
> 
> 
> 
> I'm wondering if this 970 3.5 GB stuff is going to rehash with the next Elder Scrolls or Fallout game. Now with Steam Workshop available at game launch, feature debuted a bit after Skyrim launch, even more people are going to try out mods.
Click to expand...

Exactly, I could not have said it better. With Steam being the #1 game vendor online, just as many retail copies requiring Steam, and the Steam Workshop on top, more and more people are running mods. The Workshop mods get people into it, and then they go for bigger and better mods, which eat even more VRAM; Nexus seems to offer the most mods of anywhere.

I will tell you, some mods eat 750MB-1GB of VRAM alone, like the ENB mods for Skyrim. Take a look for yourself: these are the recommended ones for Skyrim, with the install order and everything. Scroll down and there are two lists of suggested mods that all work well together, and it gives the order to install them in. I have about half of them installed and am pushing the 4GB barrier fairly easily.

http://www.nexusmods.com/skyrim/mods/30936/? Watch some of the vids and drool; trust me, some of the mods are downright amazing and mind-boggling graphics-wise. Please also remember there are mods like this for a bunch of games, so more and more people are running them, and that adds drastically more strain on the VRAM, doubling or tripling (or more) the amount used.

So if the GTX 970 is ever pushed to run mods like this, it takes a crap the minute you go past 3.5GB. That will leave a lot of people upset and confused, and guess whose fault it is: Nvidia's, for misleading their customers.

Oh, and it isn't just one game that has mods; check into it. Nexus has these types of mods for a bunch of games, and so does the Steam Workshop. Once you subscribe to mods on Steam, they are linked to your account, so they will be automatically downloaded and installed anytime you install a game you subscribed to mods for.

Take a look and scroll down; it shows a few of the games they have mods for: http://www.nexusmods.com/skyrim/mods/modmanager/? And there are many more games not listed there. Steam Workshop has mods for so many games I dare not attempt to count them.


----------



## Majin SSJ Eric

What I don't get is the guys who are falling over themselves to claim that "this is no big deal" and "nothing to see here" when there are two immutable facts present:

1. If this was AMD you KNOW these same people would be shouting from the rooftops about how shady AMD is and how they lie to customers to sell product (the "massive" frame time debacle bears this out) and...

2. ...Whether or not this design choice has any significant bearing on performance kind of misses the point. People bought these cards with certain expectations, and those expectations are not being met (even if it's just a matter of what NV promises on the box). The consequence of this is likely a very real drop in resale value of these cards, and that is something nobody who bought a 970 will have counted on.

To be fair, I don't think this is a huge deal, and I certainly don't see it rising to any kind of class-action status, but it is a big deal in terms of perception and in terms of how consumers will use this news going forward. If a 970 owner's card was worth $250 before the story hit and now nobody will give them more than $200, then you could almost say that NV just stole $50 from their pockets. That's not right, and they deserve to be called out for it...


----------



## Darius510

Quote:


> Originally Posted by *sugalumps*
> 
> That was true, though, until this all went down; the 970 has been selling out the door for months now. I doubt AMD has seen much action with the 290s/290Xs since the 970 launch. Except for the 0.5GB VRAM loss, the card is still the immense deal/seller that it was in those reviews. I wonder how well it would have sold, though, if Nvidia had officially launched it as a 3.5GB VRAM card; probably a lot more 980 sales.


I think it would have had a pretty significant effect. I know lots of non-techie PC gamers who think VRAM is one of the most important specs. I'm sure NVIDIA knows this, which is why they went through the trouble of this crazy RAM segmentation to keep the spec at 4GB.

If they had been up front about how that 4GB is segmented, they probably would have sold a few fewer cards to enthusiasts, but casual sales would likely have been unchanged, and they could have avoided this whole mess.


----------



## gamervivek

Indeed, it's comical to hear that nobody cares about specs, least of all the noobies who look at the VRAM size and crack open their wallets.


----------



## Silent Scone

Still no testing here? Any more opinions to be thrown into the ever-growing pot of absolutely nothing conclusive? Lol

Maybe there aren't any actual 970 owners in here.
Quote:


> Originally Posted by *tpi2007*
> 
> Thanks!
> 
> 
> 
> 
> 
> 
> 
> 
> That's a given, but it's also a given that we are talking about a year time frame where consumers had certain expectations about the 970's relevance, which might not hold true now and thus people's valuation of the card might have been different if they had known the correct specs.
> 
> As I said earlier, the card might have been demoted from 'awesome' to 'good', by the press and consumers alike. And many enthusiastic purchases might not have happened. And it's not just a matter of "what would you buy instead", but also of whether people wouldn't instead have chosen to wait a few more months for GM200, AMD R9 300 series, etc.
> 
> Consider this: many people choosing 2x 970 for 4K SLI because of the awesome price/performance might have thought "hmmm, for this price I might as well buy the GTX 1080 (non Ti), get perhaps a little less fps, but pretty much playable 4K performance on a single card with 6 GB VRAM, with the possibility to add another one down the road." The 970's incorrectly reported specs may very well have made the difference between buying and waiting.


I wouldn't recommend, nor has any review recommended, that people buy into 970s for UHD resolutions; do that and you enter the 'more fool you' group. If they've gone against that, then they either didn't read up in the first place, or they went against the recommendations.


----------



## Olivon

Quote:


> Originally Posted by *cowie*
> 
> It has all the right things on the box; yes, it does have 4GB.
> You guys know it's a greedy RAM ploy on both sides?
> Go look at 1440p (single monitor) results for the 290X and the 290X 8GB: it's a few frames' difference for almost $200 difference in price.
> In reality the 3.5GB 970 stays with both of those cards in all practical manners... but this is OCN, the home of "it does not overclock, I am sending it back". What's even more at home around here: bios-flashing their cards, messing them up, then sending them back.
> Why don't you guys talk about suing and getting free games from those who are stupid enough to say those things on a public forum?
> 
> As for the 970, it is what it is... I could not get refunds for the months I waited on drivers to get black screens fixed on my 5970s, 7970s, or the darling 4GB 290Xs.
> 
> *Yes, I hope you guys go get refunds, wait for the unfinished AMD cards, then cry about grey, black, or maybe this time yellow screen crashes that they don't tell you about.
> 
> Both sides play you guys like fiddles... take my advice: hate the man. NV, AMD, Intel, Apple, and Samsung look out for themselves; they don't love you, they just want your money.
> *
> Before they take this post down (I am sure), read and remember.


+Rep to this

All is said.


----------



## Majin SSJ Eric

I love that he put "GTX 970 3.5GB" in his sig!!


----------



## 5pellfire

nVidia should change its logo to

nVidia - The Way It's Meant to Be Lied


----------



## gamervivek

Oh nvidia played them alright. The way they(consumers) are meant to be played.


----------



## mouacyk

Quote:


> Originally Posted by *tpi2007*
> 
> Thanks!
> 
> 
> 
> 
> 
> 
> 
> 
> That's a given, but it's also a given that we are talking about a year time frame where consumers had certain expectations about the 970's relevance, which might not hold true now and thus people's valuation of the card might have been different if they had known the correct specs.
> 
> As I said earlier, the card might have been demoted from 'awesome' to 'good', by the press and consumers alike. And many enthusiastic purchases might not have happened. And it's not just a matter of "what would you buy instead", but also of whether people wouldn't instead have chosen to wait a few more months for GM200, AMD R9 300 series, etc.
> 
> Consider this: many people choosing 2x 970 for 4K SLI because of the awesome price/performance might have thought "hmmm, for this price I might as well buy the GTX 1080 (non Ti), get perhaps a little less fps, but pretty much playable 4K performance on a single card with 6 GB VRAM, with the possibility to add another one down the road." The 970's incorrectly reported specs may very well have made the difference between buying and waiting.


The quality and cohesiveness of your posts is awesome. The gnats are biting here and there and absolutely cannot gobble up anything whole, yet they are quite persistent in policing thoughts, downplaying choice, and maintaining the status quo. I applaud your efforts, because it makes them bump this thread so others can see it. I get the big picture on OCN now.

Has anyone given thought to the possibility that perhaps not all 970s have the eighth L2 block physically disabled? This gets into how NVidia bins their chips a little bit, but perhaps the disabling is only done on the worst 970s in order to maintain the quota of 4GB VRAM on the 970. That would lend some credibility to why they kept it secret, and to why some 970 users say they don't have the issue beyond 3.5GB. It is similar to some of us receiving a dog of a 4770K that takes 1.4V to get 4.5GHz, while Alatar can get 5.1GHz at 1.2V.


----------



## Silent Scone

As do others, from your avatar. He is expressing his take on the situation, as is every other person; coherency through proper English is always a welcome bonus. What this thread lacks is collective evidence, something people seem unwilling to contribute. It's all 'what what' and 'what for', the 'what for' being, evidently, absolutely nothing.

After what is 2,043 posts at this point, we've established that everyone agrees in principle that NVIDIA's own lack of coherency is bad. What people should be doing now is trying to establish exactly how bad, by testing these upper limits; yet 970 users are doing nothing of the sort. They're just complaining. This excludes people like yourself, who do not own the product and are also complaining, and in your case, mouacyk, instigating just because you're bored.

I agree with a lot of what TPI is saying; it would just be nice if people put up some findings at this stage. As a 980 owner I am willing to compare.


----------



## givmedew

I do not own a 980, and BOY DO I WISH I HAD ONE TO COMPARE WITH...

I own two 290Xs and a Gigabyte GTX 970 G1 Gaming...

In Dying Light, somewhere between 3450MB and 3550MB of memory usage, the game is destroyed!!! At 1920x1200 it happens quite often.

NVIDIA claims that there is NO improvement in graphics quality going from medium to high textures in Dying Light.

On medium, the game uses less than 2GB of VRAM, and with ALL other settings maxed out, EVEN view distance, the game is completely playable.

Comparison:

At 2560x1440 (and I tried 1920x1080 as well), my R9 290X uses more than 3500MB of VRAM and DOES NOT stutter, ever, at all.

My theory...

The game does not know to stay below 3.5GB, but it does know to stay below 4GB and sees the GTX 970 as a 4GB device. OR the developer designed HIGH to utilize up to 4GB of VRAM but no more, and therefore it won't work well on any card below 4GB.

A way to decide which of those is more likely: have people try 2GB and 3GB video cards with textures set to HIGH. Go ahead and set view distance to minimum, because even at minimum it did not change the amount of VRAM used on either of the cards...

If you can play the game on HIGH with a 2GB or 3GB card, well... then the game KNOWS how much VRAM is present, and on the GTX 970 it thinks there is 4GB, tries to use it, and is backhanded for doing so!
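For what it's worth, here's a toy back-of-envelope model of why crossing 3.5GB produces such a cliff. The bandwidth figures are the rough numbers floating around in reviews (~196 GB/s for the fast partition, ~28 GB/s for the slow one); the uniform-access assumption is mine, not anything NVIDIA has confirmed:

```python
# Toy model of the GTX 970's segmented VRAM: a fast 3.5 GB partition
# and a slow 0.5 GB partition. Bandwidth figures are the rough numbers
# circulating in reviews (~196 GB/s vs ~28 GB/s); the uniform-access
# assumption below is mine, not NVIDIA's.

FAST_GB, FAST_BW = 3.5, 196.0  # partition size (GB), bandwidth (GB/s)
SLOW_GB, SLOW_BW = 0.5, 28.0

def effective_bandwidth(alloc_gb):
    """Average bandwidth if accesses are spread evenly over the allocation."""
    if alloc_gb <= FAST_GB:
        return FAST_BW
    slow_used = min(alloc_gb - FAST_GB, SLOW_GB)
    # time to touch every byte once = size / bandwidth, per segment
    total_time = FAST_GB / FAST_BW + slow_used / SLOW_BW
    return alloc_gb / total_time

for gb in (3.0, 3.5, 3.6, 4.0):
    print(f"{gb:.1f} GB allocated -> ~{effective_bandwidth(gb):.0f} GB/s effective")
```

Under that (crude) assumption, a full 4GB allocation averages out to roughly 112 GB/s, a big step down from 196 GB/s, which would at least be consistent with the stutter people report once they spill past 3.5GB.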


----------



## jprovido

Quote:


> Originally Posted by *Noufel*
> 
> man...... imagine for a second that a new problem occurs on the 980 side
> 
> 
> 
> 
> 
> 
> 
> we'll be all doomed


I swear to god I'll go bonkers if that happens lol


----------



## Unknownm

Quote:


> Originally Posted by *Mand12*
> 
> The amount of misinformation in this thread is staggering.
> 
> 3.5 vs 4 is not a big deal. The specific vram count has NEVER been a big deal, and the most misunderstood number in the entire GPU market.
> 
> The benchmarks are still the same. The performance is still the same. This card is just as good now as it was when the reviewers gave it stellar performance numbers.


I remember one of my first posts here was asking about picking up an Nvidia 6800 GT, 128MB or 512MB version,

and people said 512MB was so much it wouldn't get used for years.

Now here we are, about 10 years later, fighting over 3-4GB of VRAM.

Time flies.....


----------



## Clocknut

Quote:


> Originally Posted by *givmedew*
> 
> I do not own a 980 and BOY DO I WISH I HAD ONE TO COMPARE WITH...
> 
> I own (2) 290x and a Gigabyte GTX 970 G1 Gaming...
> My theory...
> 
> The game does not know to stay below 3.5GB but it does know to stay below 4GB and sees the GTX 970 as a 4GB device OR the developer designed HIGH to utilize up to 4GB of RAM but not more and therefore it won't work well on any card below 4GB.
> 
> Way to decide which of those is more likely the case: have people try 2GB and 3GB video cards with the textures set to HIGH. Go ahead and set view distance to minimum because even with it at minimum it didn't not change the amount of ram was used on either of the cards...
> 
> If you can play the game on HIGH with a 2GB or 3GB card well... then the game KNOWS how much ram is present and with the GTX 970 it thinks their is 4GB tries to use and is back handed for doing so!


Now imagine a game developer who decides to have the game fill up all the VRAM first when it detects a 4GB GPU; that could be a problem on the 970.
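As a sketch of that scenario (the heuristic below is purely hypothetical, my own illustration of how an engine might size its budget from the driver-reported VRAM; the partition sizes are the ones from NVIDIA's statement):

```python
# Purely hypothetical engine heuristic (my own illustration, not from
# any real engine): size the texture budget from the VRAM the driver
# reports. A GTX 970 reports 4096 MB, so a naive engine budgets past
# the 3584 MB fast partition without knowing it.

REPORTED_VRAM_MB = 4096   # what the 970 advertises
FAST_PARTITION_MB = 3584  # what it can serve at full speed

def texture_budget(reported_mb, headroom_mb=256):
    """Naive budget: everything the card reports, minus some headroom."""
    return reported_mb - headroom_mb

budget = texture_budget(REPORTED_VRAM_MB)
print(f"texture budget: {budget} MB")
print(f"spills into slow segment: {budget > FAST_PARTITION_MB}")
```

On a true 4GB card that budget is harmless; on a 970 it lands the last few hundred MB in the slow segment, which would explain givmedew's 2GB/3GB-card test idea.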


----------



## N0ID

I myself own a GTX 970 G1 Gaming from Gigabyte (I have not updated my system specs yet).

Yesterday I figured I'd play a bit of Shadow of Mordor.

Running everything on ultra at 1080p, performance was fine up until the 3.5GB mark. As soon as it hit 3.6GB, everything started lagging REAL bad, and I mean REAL REAL REAL bad. It almost became a slideshow.

I also don't understand people saying it's impossible to hit more than 3.5GB in Shadow of Mordor @ 1080p, when it's pretty obvious that for some of us that is the actual case.

Not sure if they're trolls, or perhaps it has to do with different system setups and some people truly don't go over 3.5GB with everything maxed at 1080p.

I opened a ticket with my retailer and am waiting for a response, but I have already made up my mind that I am not keeping this card.

I also have to say that all the games that do not exceed 3.5GB run flawlessly. But unfortunately that is not what I paid for, so that's that.


----------



## rdr09

Quote:


> Originally Posted by *N0ID*
> 
> I myself own a GTX 970 G1 Gaming from Gigabyte (I have not updated my system specs yet).
> 
> Yesterday I figured I'd play a bit of Shadow of Mordor.
> 
> Running everything on ultra at 1080p, performance was fine up until the 3.5GB mark. As soon as it hit 3.6GB, everything started lagging REAL bad, and I mean REAL REAL REAL bad. It almost became a slideshow.
> 
> I also don't understand people saying it's impossible to hit more than 3.5GB in Shadow of Mordor @ 1080p, when it's pretty obvious that for some of us that is the actual case.
> 
> Not sure if they're trolls, or perhaps it has to do with different system setups and some people truly don't go over 3.5GB with everything maxed at 1080p.
> 
> I opened a ticket with my retailer and am waiting for a response, but I have already made up my mind that I am not keeping this card.
> 
> I also have to say that all the games that do not exceed 3.5GB run flawlessly. But unfortunately that is not what I paid for, so that's that.


Does adjusting the pagefile help? Like putting it on auto or something?

If single cards are having issues... how much more in multi?


----------



## N0ID

I am not sure, since I have not tried that yet. Care to give me a step-by-step of what I could/should try?

I'll be glad to test it and post results.


----------



## rdr09

Quote:


> Originally Posted by *N0ID*
> 
> I am not sure since I did not try that yet. Care to give me a step by step of what I could/should try?
> 
> I'll be glad to test it and post results.


Go to Control Panel > System > Advanced system settings > Performance Settings > Advanced > Virtual memory, and set the pagefile to be automatically managed.
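If you prefer the command line, the same setting can be flipped from an elevated command prompt (assuming a Windows 7/8-era `wmic`; treat this as a sketch, and reboot afterwards for it to take effect):

```shell
:: Let Windows manage the pagefile automatically (run as Administrator)
wmic computersystem where name="%computername%" set AutomaticManagedPagefile=True

:: Or, to pin a fixed-size pagefile instead (e.g. 8 GB), first disable
:: automatic management, then set the sizes in MB:
:: wmic computersystem where name="%computername%" set AutomaticManagedPagefile=False
:: wmic pagefileset where name="C:\\pagefile.sys" set InitialSize=8192,MaximumSize=8192
```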


----------



## michaelius

Quote:


> Originally Posted by *Unknownm*
> 
> I remember one of my first post here was asking about picking up a Nvidia 6800 GT with 128MB or 512MB version
> 
> and people said 512MB is so much it won't get used in years.
> 
> Now we are here about 10 years later fighting over 3-4GB vram
> 
> Time flies.....


Well, I never regretted buying a GPU with less RAM. By the time that RAM was needed, the cards were already outdated and not fast enough.

Anyway, the 970/980/290/290X are just temporary cards until we get the real deal in the form of 16nm GPUs.


----------



## N0ID

Edit: for some reason my quote didn't work out; this message is in reply to this:
Quote:


> Quote:
> Originally Posted by N0ID View Post
> 
> I am not sure since I did not try that yet. Care to give me a step by step of what I could/should try?
> 
> I'll be glad to test it and post results.
> 
> go to control panel>systems>advance system setting>


Alright, I just did that and tested it. Shadow of Mordor does not seem to exceed 3.56GB/3.6GB of VRAM usage now (the highest I've seen in 10 minutes of running around), but I can still feel some "hiccups" now and then. Not as bad as before, though, so it seems to have helped a bit. Before changing the pagefile settings, my VRAM usage would go up to 3.8/3.9GB.


----------



## rdr09

Quote:


> Originally Posted by *N0ID*
> 
> Alright, I just did that and tested it. Shadow of Mordor does not seem to exceed 3.56GB/3.6GB of VRAM usage now (the highest I've seen in 10 minutes of running around), but I can still feel some "hiccups" now and then. Not as bad as before, though, so it seems to have helped a bit. Before changing the pagefile settings, my VRAM usage would go up to 3.8/3.9GB.


If you have any unnecessary apps running in the background while gaming, turning them off might make it even better.


----------



## N0ID

I appreciate your advice, but I think the main point is that as soon as this card hits 3.5GB+ VRAM usage, it starts lagging. I shouldn't have to turn off apps just to be able to run my games properly after spending €360 on a graphics card, don't you think?

EDIT: Spelling.


----------



## rdr09

Quote:


> Originally Posted by *N0ID*
> 
> I appreciate your advice but I think the main point is that as soon as this card hits 3.5gb+ vram usage, it starts lagging. I shouldn't have to turn off apps just to be able to run my games properly after spending 360 eu on a graphics card don't you think?
> 
> EDIT: Spelling.


True, but it is still significantly cheaper than the 980.


----------



## Blameless

Quote:


> Originally Posted by *michaelius*
> 
> Well I never regreted buying gpu with less ram. By the time that ram was needed cards were already outdated and not fast enough.


My 320 MiB GeForce 8800 GTS frequently ran into VRAM limitations with games that were contemporaneous with it. I distinctly remember having to reduce texture quality in LotRO and Doom 3 for a playable experience.


----------



## N0ID

Quote:


> Quote:
> Originally Posted by N0ID View Post
> 
> I appreciate your advice but I think the main point is that as soon as this card hits 3.5gb+ vram usage, it starts lagging. I shouldn't have to turn off apps just to be able to run my games properly after spending 360 eu on a graphics card don't you think?
> 
> EDIT: Spelling.
> 
> true. but, it is still significantly cheaper than the 980.


I agree with you. The fact is that this card performs like a beast in any game that requires less than 3.5GB, but anything above that and it starts to become unplayable. For now this is fine, but in a few months, when even more demanding games are released, this will become an even bigger issue. Let's see if I can get a refund and I'll go with a 980, I guess, even though in my opinion it's way overpriced. What other options do I have? I've already burned my hands way too many times with ATI and I'm not going back to that.


----------



## clerick

I contacted NCIX, hopefully I can return it without issues.


----------



## provost

Quote:


> Originally Posted by *Silent Scone*
> 
> As do others from your avatar. He is expressing his take on the situation, as is every other person. Coherency through proper English is always a welcome bonus. What this thread lacks is collective evidence, something which people seem to be unwilling to participate in. It's all 'what what' and 'what for'. What for being evidently absolutely nothing.
> 
> After what is 2043 posts at this point, we've established everyone is under agreement on principle that NVIDIA's own lack of coherency is bad, what people should be doing now is trying to establish exactly how bad by testing these upper limits. From which 970 users are doing nothing. They're just complaining. This excluding people like yourself, who do not own the product - also complaining, and in your case, mouacyk, instigating just because you're bored.
> 
> I agree with a lot of things TPI is saying, it would just-be-nice if people put up some findings at this stage. As a 980 owner I am willing to compare.


Look, in my opinion proof will surface in due course (give it another few months), and I have no doubt about it. By that time all the Nvidia marketing will have moved on to the next SKU, and the argument will pivot from "show me the proof" to "we have new tech now; don't like it, upgrade your 970 and shut up." Mark this post, as I can guarantee it will happen.

Aside from my god-like predictive skills..lol. Although I don't have a dog in this fight as a non-owner of this card, I do have a great deal of discomfort with how NV is trying to decouple performance from hardware, or claiming performance in spite of it. If hardware specs aren't as important, which seems to be the latest talking-point spin, then where exactly is the performance coming from: drivers, boost, software efficiency? And what is it worth to me as a consumer, knowing that I am buying into the company's goodwill to keep the performance at par, until they decide not to in favor of the new SKU?
If the industry as a whole is headed this way, as was pointed out to me yesterday in another thread, perhaps it's time to step back and revisit why one wants to keep gaming on a PC rather than on a console.


----------



## N0ID

Couldn't agree more with your post. It seems like nowadays paying big bucks for subpar products is not only the norm, but lowering your pants and biting on a piece of wood is becoming part of being a "consumer". Hopefully the mods do not deem my reply offensive; I am not totally sure what is or is not allowed on these boards since I am new. My apologies if my post comes across as offensive.


----------



## skupples

Quote:


> Originally Posted by *givmedew*
> 
> I do not own a 980 and BOY DO I WISH I HAD ONE TO COMPARE WITH...
> 
> I own (2) 290x and a Gigabyte GTX 970 G1 Gaming...
> 
> In Dying Light somewhere between 3450MB and 3550MB of memory usage the game is destroyed!!! At 1900x1200 it happens quite often.
> 
> NVIDIA claims that there is NO improvement in graphics quality going from medium to high for textures in Dying Light.
> 
> Going to medium the game uses less than 2GB of VRAM and w/ ALL other settings maxed out EVEN view distance the game is completely playable.
> 
> Comparison:
> 
> At 2560x1440P (and I tried 1920x1080P as well) my R9 290X uses more than 3500MB of RAM and DOES NOT stutter ever at all.
> 
> My theory...
> 
> The game does not know to stay below 3.5GB but it does know to stay below 4GB and sees the GTX 970 as a 4GB device OR the developer designed HIGH to utilize up to 4GB of RAM but not more and therefore it won't work well on any card below 4GB.
> 
> Way to decide which of those is more likely the case: have people try 2GB and 3GB video cards with the textures set to HIGH. Go ahead and set view distance to minimum because even with it at minimum it didn't not change the amount of ram was used on either of the cards...
> 
> If you can play the game on HIGH with a 2GB or 3GB card well... then the game KNOWS how much ram is present and with the GTX 970 it thinks their is 4GB tries to use and is back handed for doing so!


The game has a significant issue with bottlenecking the CPU, which results in low GPU usage; this has been shown time and time again now. GPUs sit at 30-40% usage with core 0 at 99%. You can tab out and nuke core 0, but it's a meh-tier workaround until the developers issue a patch.

I'll try to run some GK110 tests later today when I'm back at my main rig; until then, just go look at the screenshots in the Dying Light thread.


----------



## Silent Scone

My GPU usage is absolutely fine, 90% or so on average. Then again I'm certainly not going to be CPU limited.


----------



## CaptainZombie

Linus talking about this issue; he even agrees that NVIDIA should have been upfront with all this and that they intentionally deceived the public.

I'm hoping that Newegg and other retailers are instructed for returns/exchanges. If they cared about their customers they would just bite the bullet and allow returns/exchanges.




Moving forward, will the boxes have 3.5GB of GDDR5 on them? LOL!


----------



## N0ID

I just received a reply from my retailer (a Spanish one; not sure if I'm allowed to state which) telling me that I'm out of luck. They say they are not responsible for the fact that, four months after the graphics card's release, the news comes out that the memory is split into two parts. They also state that since they didn't have the technical specs on the website (I doubt it, but too lazy to check), they are not responsible for any of this. Could anyone tell me what my options are? I feel shafted.


----------



## Chargeit

Quote:


> Originally Posted by *michaelius*
> 
> Well I never regreted buying gpu with less ram. By the time that ram was needed cards were already outdated and not fast enough.
> 
> Anyway 970/980/290/290X are just temporary cards until we get real deal in form of 16 nm GPUs.


I don't know, man.

I think if I had the 6GB 780 I wouldn't be sitting here waiting for the next cards to come around. Not having enough VRAM is a quick way to make a card that is otherwise more than powerful enough dated.

Though I haven't had issues with that Dying Light game, the few times I do get a small skip or stutter I can't help but wonder if it would still be there on a 6GB 780.

I'm also starting to wonder if getting faster RAM is now worth it. Correct me if you know I'm wrong, but if a game needs more VRAM than you have, doesn't it offload the extra onto your system RAM? I'm working with 16GB right now and have seen as high as 10GB usage in that SoM game, and over 8GB in Dying Light. I bought my current RAM while on AM3+ and got 1866; I have since OC'ed it to 2133, but I'm thinking of picking up a 16GB kit of 2800 or something and using my current RAM in one of my other systems. I can't help but wonder: with the newest games, does RAM speed finally matter a lot more?
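A back-of-the-envelope comparison of the peak bandwidths involved (theoretical maxima, not measurements, and the exact figures vary by kit and platform):

```python
# Rough peak-bandwidth comparison for VRAM spill-over to system RAM.
# All numbers are theoretical maxima, computed from bus width x rate.

GB = 1e9

# GTX 970 advertised memory bandwidth: 256-bit bus at 7 GT/s effective
gddr5 = 256 / 8 * 7e9 / GB           # 32 bytes/transfer * 7e9 = 224 GB/s

# PCIe 3.0 x16: roughly 1 GB/s per lane after 128b/130b encoding
pcie3_x16 = 16 * 1.0                 # ~16 GB/s

# Dual-channel DDR3: 2 channels * 8 bytes/transfer * effective rate
ddr3_1866 = 2 * 8 * 1.866e9 / GB     # ~29.9 GB/s
ddr3_2800 = 2 * 8 * 2.8e9 / GB       # ~44.8 GB/s

print(f"GDDR5 on card : {gddr5:6.1f} GB/s")
print(f"PCIe 3.0 x16  : {pcie3_x16:6.1f} GB/s")
print(f"DDR3-1866 dual: {ddr3_1866:6.1f} GB/s")
print(f"DDR3-2800 dual: {ddr3_2800:6.1f} GB/s")
```

The takeaway: whichever DDR3 speed you buy, any textures that spill out of VRAM still have to funnel through PCIe at roughly 16 GB/s, an order of magnitude below on-card GDDR5, so faster system RAM is unlikely to cure overflow stutter.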


----------



## jprovido

Quote:


> Originally Posted by *CaptainZombie*
> 
> Linus talking about this issue, he even agrees that NVIDIA should of been upfront with all this and that they intentionally deceived the public.
> 
> I'm hoping that Newegg and other retailers are instructed for returns/exchanges. If they cared about their customers they would just bite the bullet and allow returns/exchanges.
> 
> 
> 
> 
> Moving forward, will the boxes have 3.5GB of GDDR5 on them? LOL!


Tbh that's why I like Linus: he wasn't 100% anti-NVIDIA, and he's defending them to a certain degree. Sites like PCPer feel oddly defensive. Are they on NVIDIA's payroll?


----------



## skupples

Same thing I said in the other thread, since people love copypasta.

Linus IS NOT INNOCENT IN ALL OF THIS, just like any other reviewer that gets free samples. He had the chance to bust out his 7680x1440 setup to truly push the GPU to its max, but he didn't. So him playing coy is a farce and a joke. Anyone who buys into it needs their brain checked.
Quote:


> Originally Posted by *Chargeit*
> 
> I don't know man.
> 
> I think if I had the 6gb 780 I wouldn't be sitting here waiting for the next cards to come around. Not having enough Vram is a quick way to make a card that is otherwise more then powerful enough, dated.
> 
> Though I haven't had issues with that dying light game, the few times I do have a small skip or stutter I can't help but wonder if it would still be there on a 6gb 780.
> 
> *I'm also starting to wonder if getting faster ram is now worth it? Correct me if you know I'm wrong, but, if a game now needs more Vram then you have doesn't it offload the extra onto your system ram? I'm working with 16gb right now, and have seen as high as 10gb usage in that SoM game. Over 8gb in Dying light. I bought my current ram while on AM3+, and got 1866. I have since oc'ed it to 2133, but, I'm thinking of picking up a 16gb pack of 2800 or something and using my current ram in one my other systems... I can't help but wonder with the newest of games, does ram speed finally matter a lot more?


Can confirm that it would still be there, via how it plays on highly strung Titans.

Dying Light has serious CPU utilization issues, which they claim to be patching up soon.

No, faster RAM won't make a lick of difference unless you're running multiple GPUs, and even then it's minimal.

Also, you would have to make sure your CPU can even handle those speeds.


----------



## Menta

I still kind of like Linus and all, but he is no authority on the matter; no one is...

The real "pro" is ultimately the end user.

NV will not have such a good launch from now on... fans and people will be more careful.

That will be their sentence.


----------



## skupples

VRAM stressing should have been a part of reviews all along, and by stressing I mean memory tests like Kombustor, and the new tools people will design after this cluster. I hear the CUDA toolkit has a memory tool which is much better than the one that Nai guy released.

Tests which tax VRAM first and foremost so that you can isolate core and memory limitations.
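The idea behind those tools, sketched below with plain NumPy host buffers as a stand-in (a real VRAM test would do the same thing with CUDA allocations and device-side copies; `probe_chunks` and all its numbers are illustrative, not any published tool's API):

```python
import time
import numpy as np

def probe_chunks(total_mb=512, chunk_mb=128):
    """Allocate memory in fixed-size chunks and time a write + strided
    read pass over each one, keeping every chunk alive.  In a GPU
    version of this test, chunks landing in a GTX 970's upper 0.5GB
    segment would show up as a sharp bandwidth drop at high offsets."""
    chunks, results = [], []
    for i in range(total_mb // chunk_mb):
        buf = np.empty(chunk_mb * 1024 * 1024 // 4, dtype=np.float32)
        chunks.append(buf)  # hold the reference so memory stays allocated
        t0 = time.perf_counter()
        buf.fill(1.0)                 # write pass over the whole chunk
        _ = float(buf[::4096].sum())  # strided read pass
        dt = time.perf_counter() - t0
        results.append((i * chunk_mb, (chunk_mb / 1024) / dt))  # GiB/s
    return results

for offset_mb, gibps in probe_chunks():
    print(f"chunk at {offset_mb:4d} MB: {gibps:7.2f} GiB/s")
```

Because each chunk is timed in isolation while all earlier chunks stay resident, a bandwidth cliff at a particular offset isolates a memory limitation from a core limitation, which is exactly the distinction ordinary FPS benchmarks miss.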


----------



## benbenkr

You guys actually take Linus seriously? Wow, OCN.


----------



## skupples

Quote:


> Originally Posted by *benbenkr*
> 
> You guys actually take Linus seriously? Wow, OCN.


Socks and sandals = nope.

I don't take anyone seriously when they wear Jesus sandals with socks.


----------



## Menta

He is not bad... come on.

He's just become a little commercial and less of an enthusiast, but that's bound to happen when you're getting sponsored.

He planned a whole-room watercooling build; pretty cool.


----------



## skupples

He's been what he's always been. An unboxer and stock reviewer.

Plug it in, turn it on, run heaven, give editorial review.


----------



## Chargeit

Quote:


> Originally Posted by *skupples*
> 
> *Can confirm that it would still be there, via how it plays on high strung Titans.*
> 
> Dying light has serious CPU utilization issues, which they claim to be patching up, soon.
> 
> No faster ram won't make a lick of difference unless running multiple GOUs and even then it's minimal.
> 
> Also you would have to make sure your CPU can even handle those speeds.


Isn't the game currently known to have issues with SLI? I'm not getting tons of stutter and skip. That's why I was wondering if the few cases where I do get it would be improved by more VRAM or faster system RAM. If what you're dealing with is because of SLI, then that doesn't really reflect how it performs with more VRAM or faster system RAM, right?

All the RAM speed gaming tests I've seen were done back when games weren't really taking advantage of higher VRAM. The games being tested were not using all of a card's VRAM and then dumping textures off onto system RAM. Seems like dated results that need to be revisited.

I'm not really thinking it would make a huge FPS difference, but it might show up in general smoothness and reduce or prevent stutter caused by the GPU calling on the slower system RAM to pull textures, however that works.

I'm running a Z87/4790K. I think I should be safe with 2600... which is cool, since I didn't realize 2800 was something like $350+. Way too much for RAM. However, 2600 was something like $160-$180ish. I can deal with that.

Just something I've been questioning. It would be nice to have fresh tests representing the current state of PC gaming and not the state of two years ago.


----------



## skupples

Issues with SLI = no profile bits from either company, so no dual GPU.

The game still skips on a single GPU with my Ivy-E at 4.6GHz and 32GB of 2400MHz C10, installed on an SSD stripe.


----------



## Silent Scone

Seemed to be running stutter free for me last night playing co-op with everything wacked up with a single 980.


----------



## skupples

Idk I'm super sensitive to stutter/variance/hitching. Makes me nauseous and causes headaches, which probably makes me the last person you want to ask when it comes to stutter.


----------



## dean_8486

I have just requested a refund from Ebuyer in the UK, wish me luck!
This really sucks as I just built my system and changing cards means draining down (PITA with acrylic) and getting a new water-block... does the 970/980 G1 use the same block?


----------



## Chargeit

Quote:


> Originally Posted by *skupples*
> 
> Idk I'm super sensitive to stutter/variance/hitching. Makes me nauseous and causes headaches, which probably makes me the last person you want to ask when it comes to stutter.


Yeah, a lot of it is perceived.

Personally, the game runs generally smooth for me with little to no stutter or skip. The times I do get FPS drops since the patch are where there's lots of grass; my GPU hits 99% and pulls full boost. Since the patch, I basically have to look for those spots. Other than that, there are small drops to the low-to-mid 50s that I have to watch my FPS counter to notice.

The times I do get skips are when I'm grabbed by a zombie or spat on by a puker... Luckily, that doesn't happen very often. I hope they fix that sooner or later.

Compared to the first two Dead Island games on PC, this game runs like butter. Not that I'm saying it's buttery, but compared.









I do, however, have generally smooth gameplay. I would happily show it off to my console friends as an example of a superior PC gaming experience. I'd rate my current performance as good, but not perfect.

Worth mentioning: I'm playing this with a controller. That always helps smooth things out.

I turned off in-game Vsync and instead limited my FPS to 61 with RivaTuner.


----------



## skupples

Ugh. I could never get into DI 1 & 2. They felt so stale and stagnant.

This one, for whatever reason, feels to me like it has a touch of Metro in it. Maybe it's the accents, or maybe they have old 4A staff working on it, I don't know, but so far I'm enjoying it, though I've only sunk in one hour.

It REALLY needs SLI bits so I can run Surround properly.


----------



## Silent Scone

DI promised a lot but definitely didn't deliver, and it was also a hell of a lot more buggy than Dying Light.









----------



## Chargeit

I enjoyed the first two DI games.

Both Dead Island games mixed some of my favorite things (zombies, RPG progression, loot, exploration) into one game. Sure, they were rough around the edges. I played the first one on my PS3, which suffered from bad texture pop-in among other things, but I enjoyed the general atmosphere and gameplay.

Hell, the first DI was one of the few games that I beat and then started over right after. Add to that the melee focus and I was hooked. Nothing beats bashing a zombie's head in with a mace. If I feel the need to shoot zombies, I usually load Left 4 Dead 2.

I don't think they were for everyone, but if you like zombies, FPS, and RPGs, the games were for you.

I didn't really mind the simplified combat, since even when a game offers more options, it's usually just easier to rely on one or two basic attacks to get you through. It's nice to have a game you can play in a relaxed state after a long day of exploring, killing, and gaining some XP.

Not everything has to be balls-to-the-wall, nail-biting involvement to be enjoyable.


----------



## Vesku

Quote:


> Originally Posted by *N0ID*
> 
> I just received a reply from my retailer (spanish one, not sure if i'm allowed to state which one it is) telling me that im out of luck. They say they are not responsible for the fact that 4 months after release of the graphics card, the news comes that memory is split in 2 parts. They also state that since they didnt have the technical specs on the website (I doubt it but too lazy to check) they are not responsible for any of this. Could anyone tell me what my options are? I feel shafted.


List of many consumer protection departments including Spain: http://www.econsumer.gov/english/members/overview.shtm


----------



## kpo6969

Quote:


> Originally Posted by *mouacyk*
> 
> The quality and cohesiveness of your posts are awesome. The gnats are biting here and there, and absolutely cannot gobble up anything whole, yet they are quite persistent in policing thoughts, down playing choice, and maintaining the status quo. I applaud your efforts, because it makes them bump this thread so others can see it. I get the big picture now on OCN.
> 
> Has anyone given thought to the possibility that perhaps *not all* 970's have the physical disabling of the 8th L2 chip? This may lead into how NVidia bins their chips a little bit, but perhaps the disabling is only on the worst 970's in order to maintain the quota of 4GB VRAM on the 970's? This would lend some *credibility* to why they kept it *secret*, and some 970 users are saying they don't have the issue beyond 3.5GB. This is similar to some of us receiving dog 4770K, taking 1.4v to get 4.5GHz but Alatar can get 5.1GHz at 1.2v.


Just leaving this here.


----------



## provost

Quote:


> Originally Posted by *Vesku*
> 
> List of many consumer protection departments including Spain: http://www.econsumer.gov/english/members/overview.shtm


Let's just drop this fruitless exercise. If Nvidia is honoring returns, you are wasting your time; that's all you can expect the company to do. No one is going to have sympathy for someone's video gaming issues.

Quote:


> Originally Posted by *kpo6969*
> 
> Just leaving this here.


There are more fundamental issues at play in this industry that will start impacting consumers a lot more than missing VRAM, and these issues revolve around who owns the performance: the consumer who is purchasing the hardware, or the company that is managing arbitrary performance tiers via software controls. This also begs the question of the longevity of hardware performance, sans the promise of software optimization support, as more and more of the performance gets pushed into the ether and away from the hardware specs. More software-based performance in the hands of the GPU company, without explicit guarantees of associated support for a specified period of time, can only have one outcome: frequent turnover of the hardware. I think we can forget the old assumption that if we buy expensive hardware now, it may last a few years. You are more likely to be "renting" hardware at very high prices and frequently having to upgrade, as more and more performance moves away from the hardware nuts and bolts to a proprietary software "feature set". Wasn't it not too long ago that Nvidia declared publicly that it is now more of a software company than a hardware company? That statement has much broader implications for Nvidia's business model than appears on the surface.

This fundamental change is going to be a much bigger issue for PC gamers as a whole going forward, as both NV and AMD seem to be headed in that direction.


----------



## rickcooperjr

Quote:


> Originally Posted by *N0ID*
> 
> Quote:
> 
> 
> 
> Quote:
> Originally Posted by N0ID View Post
> 
> I appreciate your advice but I think the main point is that as soon as this card hits 3.5gb+ vram usage, it starts lagging. I shouldn't have to turn off apps just to be able to run my games properly after spending 360 eu on a graphics card don't you think?
> 
> EDIT: Spelling.
> 
> true. but, it is still significantly cheaper than the 980.
> 
> 
> 
> I agree with you. Fact is that this card performs like a beast in any game that requires less than 3.5gb, but anything above that and it starts to become unplayable. For now this is fine but in a few months when even more demanding games are released, this will start to become an even bigger issue...Let's see if I can get a refund and i'll go with a 980 I guess, even though in my opinion it's way overpriced. What other options do I have? I already burned my hands way too many times with ATI and not going back to that.

Please don't forget DX12: it will use more VRAM, and the GTX 970 is on the DX12 supported list. DX12 is the next API standard and should be out later this year; like Mantle, it will use around 30% or more VRAM than current APIs. Imagine, when DX12 comes in, how much harder the GTX 970 will push that 3.5GB barrier at 1080p only six months or so down the road. That is a very short life expectancy, if you ask me, for a $300+ graphics card at its supposed specs and with Nvidia's hype behind it.

I simply feel that the 970 is already hurting in a few games running 1080p maxed out. Imagine when more games like the recently released current-gen titles come out; add DX12 and things get even dicier for the GTX 970 for the current year's worth of gameplay, let alone next year's, if you get my point.

I feel Nvidia screwed the pooch, so to speak, in the way they did this, knowing it would come back to bite them. Had they properly advertised and hyped things with proper specs and expectations, things might have been different, and many more 980s would have been sold, which would have benefited them much more than what they did.

I remember when an upper-mid to top-tier GPU would easily last you two years or so before you had to drop the settings down. Not anymore: with Nvidia you get six months to a year and it becomes unplayable unless you run minimum settings, and with this 3.5GB RAM issue even that is questionable. That is just sad in my eyes.


----------



## Menta

I don't think pushing the timeline forward makes sense; future-proofing is relative and depends on one's standards, and some people don't need very high or ultra, etc.

The card is gimped now.


----------



## skupples

The new API is the main reason why I'm still on GK110. Maxwell is effectively first-gen DX12 support, which means its level of support will be inferior to Pascal or the 390/490X.

Either way, with HBM coming around, 4GB will probably become the lowest option for enthusiast-tier GPUs.
Quote:


> Originally Posted by *Menta*
> 
> i don't thinking pushing forward time makes sense, future proof is relative and depends on ones standard, some people don't need very high or ultra etc etc...
> 
> the card is gimped now.


970 was always gimped.


----------



## Xuper

My god, the NVIDIA forum thread is now 282 pages. When will they put out this fire?

https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/


----------



## AngryGoldfish

Quote:


> Originally Posted by *rickcooperjr*
> 
> PLZ don't forget DX12 it will use more Vram and well the GTX 970 is on the DX12 supported list and DX12 is next API standard that should be out later this year DX12 will use Vram alot like Mantle does AKA around 30% or more Vram usage than currently imagine when DX12 comes in how much the GTX 970 will push that 3.5gb barrier on 1080p only 6 months or so down the road that is a very short life expectancy if you ask me for a $300+ graphics card at it's supposed specs and with Nvidias hype on it.
> 
> I simply feel that the 970 is already hurting in a few games running 1080p maxxed out imagine when more games come out like current gen games recently released add DX12 things get even dicier for the GTX 970 for current little alone next years worth of game play if you get my point.
> 
> I feel Nvidia screwed the pooch so to speak in the way they did this knowing it would come back to bite them had they have properly advertised / hyped things with proper specs and expectations things might have been different and much more 980's would have been sold which would have benefited them much more than what they did.
> 
> I remember when a GPU would easily last you 2yrs or so before you had to drop the settings down when you bought a upper mid to top tier GPU but not anymore with Nvidia you get 6 months to 1yr and it becomes unplayable unless you run minimum settings and with this 3.5gb ram issue that is even questionable that is just sad in my eyes.


Minimum settings? As in, the lowest? There is nothing to suggest that simply turning down AA or selecting one notch below the top texture quality will not reduce VRAM usage by an amount that is enough for the 970 to remain relevant. I think you're taking this too far. People are still getting stellar performance out of the card and will probably continue to for at least a year, but it won't be top-tier as it was promoted. It had the potential to be much more, and I would have paid a little more to get that. But then the 970 and the 980 would have been even closer in performance, and fewer people would have spent the extra money on the 980.

I own a 970 and have been impressed with its performance. I wish I had not bought the 970, but it is what it is. I may not be able to max out anti-aliasing or use the Ultra texture pack in SoM, but it'll still perform well with FXAA and high textures. Is that enough? It is to someone who has no choice. As for the future, we can only guess what's around the corner. If we get more games like AC: Unity and Dying Light, and that trend continues upwards, then we may be in trouble. But if we get more games like MGS: Ground Zeroes, Wolfenstein: The New Order, and GTA V (if it's as capable as its requirements suggest), then we should be fine. The Witcher 3's spec sheet even looks manageable.


----------



## RagingCain

Quote:


> Originally Posted by *rickcooperjr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *N0ID*
> 
> Quote:
> 
> 
> 
> Quote:
> Originally Posted by N0ID View Post
> 
> I appreciate your advice but I think the main point is that as soon as this card hits 3.5gb+ vram usage, it starts lagging. I shouldn't have to turn off apps just to be able to run my games properly after spending 360 eu on a graphics card don't you think?
> 
> EDIT: Spelling.
> 
> true. but, it is still significantly cheaper than the 980.
> 
> 
> 
> I agree with you. Fact is that this card performs like a beast in any game that requires less than 3.5gb, but anything above that and it starts to become unplayable. For now this is fine but in a few months when even more demanding games are released, this will start to become an even bigger issue...Let's see if I can get a refund and i'll go with a 980 I guess, even though in my opinion it's way overpriced. What other options do I have? I already burned my hands way too many times with ATI and not going back to that.
> 
> 
> *PLZ don't forget DX12 it will use more Vram and well the GTX 970 is on the DX12 supported list and DX12 is next API standard that should be out later this year DX12 will use Vram alot like Mantle does AKA around 30% or more Vram usage than currently imagine when DX12* comes in how much the GTX 970 will push that 3.5gb barrier on 1080p only 6 months or so down the road that is a very short life expectancy if you ask me for a $300+ graphics card at it's supposed specs and with Nvidias hype on it.
> 
> I simply feel that the 970 is already hurting in a few games running 1080p maxxed out imagine when more games come out like current gen games recently released add DX12 things get even dicier for the GTX 970 for current little alone next years worth of game play if you get my point.
> 
> I feel Nvidia screwed the pooch so to speak in the way they did this knowing it would come back to bite them had they have properly advertised / hyped things with proper specs and expectations things might have been different and much more 980's would have been sold which would have benefited them much more than what they did.
> 
> I remember when a GPU would easily last you 2yrs or so before you had to drop the settings down when you bought a upper mid to top tier GPU but not anymore with Nvidia you get 6 months to 1yr and it becomes unplayable unless you run minimum settings and with this 3.5gb ram issue that is even questionable that is just sad in my eyes.

Any proof?


----------



## Hattifnatten

Quote:


> Originally Posted by *provost*
> 
> No one is gonna have sympathy for someone's video gaming issues.


Apple got smacked down HARD in Norway for advertising the iPad 3 as 4G even though it was only HSDPA+ capable (it didn't have the required LTE bands for use outside the US). It's not like anyone needs 100+ Mbit for email and YouTube, but that's not the point...
The 970 was marketed and sold as a 64-ROP, 2MB-L2, 4GB card. What consumers actually got was a 56-ROP, 1.75MB-L2, 3.5+0.5GB card, with severe performance issues once reading or writing was necessary on the last 512MB of VRAM (note: allocating memory and actually using it are not the same).
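For what it's worth, here is the per-segment arithmetic implied by that split, as covered in the technical write-ups (a back-of-the-envelope sketch; NVIDIA's spec sheet only lists the aggregate figure, and the two segments cannot be accessed simultaneously):

```python
# GTX 970 memory layout after the disclosure: 8 x 32-bit controllers
# at 7 GT/s effective, but one L2/crossbar port is disabled, so the
# 3.5GB segment is striped over 7 controllers and the last 0.5GB
# hangs off the eighth controller alone.

per_controller = 32 / 8 * 7e9 / 1e9   # 4 bytes * 7 GT/s = 28 GB/s

advertised   = 8 * per_controller     # 224 GB/s, the number on the box
fast_segment = 7 * per_controller     # 196 GB/s for the first 3.5GB
slow_segment = 1 * per_controller     #  28 GB/s for the last 0.5GB

print(f"advertised : {advertised:5.0f} GB/s")
print(f"3.5GB part : {fast_segment:5.0f} GB/s")
print(f"0.5GB part : {slow_segment:5.0f} GB/s")
```

That eighth of the capacity running at an eighth of the bandwidth is why stutter appears almost exactly at the 3.5GB mark rather than at 4GB.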


----------



## skupples

What, I can't use a cut-down midrange GPU at max settings forever?

Raaaage.

You would think this was most of these people's first GPU.


----------



## JackMex

I still prefer a GTX 970 with 3.5GB+0.5GB that is significantly more power- and thermally-efficient clock-for-clock than anything AMD has to offer right now. The performance is still there, too. However, I do give mad props to AMD for staying in there and playing hardball with Nvidia! Competition is good for the advancement and streamlining of technology.


----------



## darealist

Gimp my ride.


----------



## The Robot

Quote:


> Originally Posted by *darealist*
> 
> Gimp my ride.


The Way it's Meant to be Gimped


----------



## Blameless

Quote:


> Originally Posted by *N0ID*
> 
> I just received a reply from my retailer (a Spanish one; not sure if I'm allowed to state which) telling me that I'm out of luck. They say they are not responsible for the fact that, 4 months after the release of the graphics card, news comes out that the memory is split in 2 parts. They also state that since they didn't have the technical specs on the website (I doubt it, but too lazy to check), they are not responsible for any of this. Could anyone tell me what my options are? I feel shafted.


The only place the incorrect specifications cropped up was NVIDIA's Reviewer's Guide. Most partners never listed specs like ROP or TMU count anywhere, and NVIDIA never listed them anywhere on their site either.

The only specs NVIDIA publicly lists that are in contention are related to memory, and those are all technically correct, with the possible exception of maximum theoretical memory bandwidth.

Most people have no recourse, because no one in the chain of sale has lied to them. NVIDIA gave reviewers the wrong info and reviewers passed it on.


----------



## BinaryDemon

Quote:


> Originally Posted by *Blameless*
> 
> The only place the incorrect specifications cropped up was NVIDIA's Reviewer's Guide. Most partners never listed specs like ROP or TMU count anywhere, and NVIDIA never listed them anywhere on their site either.
> 
> The only specs NVIDIA publicly lists that are in contention are related to memory, and those are all technically correct, with the possible exception of maximum theoretical memory bandwidth.
> 
> Most people have no recourse, because no one in the chain of sale has lied to them. NVIDIA gave reviewers the wrong info and reviewers passed it on.


While I don't feel cheated and have no interest in trying to recoup money from Nvidia, I do hope someone files a class-action suit, simply because this lack of disclosure shouldn't be allowed.

Nvidia should have disclosed the ROPs, the TMUs, and the two separate memory pools with drastically different bandwidth at launch.


----------



## rickcooperjr

Well, I thought I would throw this out there, since it also falls in line with the issues with Nvidia and their 900 series: their G-Sync has been debunked. Nvidia has forced monitor vendors to put G-Sync modules in their monitors to support G-Sync, yet people have enabled G-Sync on laptops without G-Sync modules in them. This irks me; check the price of a G-Sync monitor vs. a non-G-Sync monitor, and the difference is big. My proof: http://www.pcper.com/reviews/Graphics-Cards/Mobile-G-Sync-Confirmed-and-Tested-Leaked-Alpha-Driver


----------



## spacin9

Quote:


> Originally Posted by *rickcooperjr*
> 
> Well, I thought I would throw this out there, since it also falls in line with the issues with Nvidia and their 900 series: their G-Sync has been debunked. Nvidia has forced monitor vendors to put G-Sync modules in their monitors to support G-Sync, yet people have enabled G-Sync on laptops without G-Sync modules in them. This irks me; check the price of a G-Sync monitor vs. a non-G-Sync monitor, and the difference is big. My proof: http://www.pcper.com/reviews/Graphics-Cards/Mobile-G-Sync-Confirmed-and-Tested-Leaked-Alpha-Driver


Supposedly the module NV uses has 800 MB for framebuffering, and for those who've enabled it on those laptops, it doesn't work well. I love G-Sync; it's making my 144 Hz monitor start to collect dust. And why shouldn't it be proprietary?

If it's so doable, and has been for years, how come it hasn't been done? NV monetized it... as long as it works, so what? The 970 thing is far more egregious, and superb popcorn material. Almost better than gaming.


----------



## rickcooperjr

Quote:


> Originally Posted by *spacin9*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> Well, I thought I would throw this out there, since it also falls in line with the issues with Nvidia and their 900 series: their G-Sync has been debunked. Nvidia has forced monitor vendors to put G-Sync modules in their monitors to support G-Sync, yet people have enabled G-Sync on laptops without G-Sync modules in them. This irks me; check the price of a G-Sync monitor vs. a non-G-Sync monitor, and the difference is big. My proof: http://www.pcper.com/reviews/Graphics-Cards/Mobile-G-Sync-Confirmed-and-Tested-Leaked-Alpha-Driver
> 
> 
> 
> Supposedly the module NV uses has 800 MB for framebuffering, and for those who've enabled it on those laptops, it doesn't work well. I love G-Sync; it's making my 144 Hz monitor start to collect dust. And why shouldn't it be proprietary?
> 
> If it's so doable, and has been for years, how come it hasn't been done? NV monetized it... as long as it works, so what? The 970 thing is far more egregious, and superb popcorn material. Almost better than gaming.

How long has adaptive Vsync been out? AMD FreeSync and such are almost identical to adaptive Vsync. All of this calls Nvidia G-Sync into question, is all I am saying; it could very well be misleading or false information / marketing, as with the GTX 970.

Adaptive Vsync was out many years ago, long before Nvidia G-Sync. So someone enabling Nvidia G-Sync on a non-G-Sync setup, without the proprietary G-Sync module, is very troubling and questionable, is it not? That is why I brought it up.


----------



## Hattifnatten

Quote:


> Originally Posted by *rickcooperjr*
> 
> How long has adaptive Vsync been out? AMD FreeSync and such are almost identical to adaptive Vsync. All of this calls Nvidia G-Sync into question, is all I am saying; it could very well be misleading lies, as with the GTX 970.


Adaptive vsync is not the same as the optional DP 1.2a standard "Adaptive-Sync". Adaptive-Sync makes sure each and every frame is synced to the monitor, as long as it falls within the monitor's variable refresh rate (VRR) window. Adaptive vsync uses vertical sync when the framerate is above the monitor's (static) refresh rate, and turns it off once it drops below.
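The distinction can be sketched in a few lines of Python (a simplification; the VRR window numbers are made up for illustration):

```python
REFRESH_HZ = 60                    # static refresh rate of a normal panel
VRR_MIN_HZ, VRR_MAX_HZ = 40, 144   # example variable-refresh window

def adaptive_vsync(fps):
    """NVIDIA adaptive vsync: vsync above the static refresh, tearing below."""
    return "no tearing" if fps >= REFRESH_HZ else "tearing"

def adaptive_sync(fps):
    """DP 1.2a Adaptive-Sync / VRR: every frame synced inside the window."""
    return "no tearing" if VRR_MIN_HZ <= fps <= VRR_MAX_HZ else "fallback"

# At 45fps the two behave completely differently:
print(adaptive_vsync(45))  # tearing
print(adaptive_sync(45))   # no tearing
```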

That being said, if FreeSync gives the same performance as G-Sync, boy oh boy is Nvidia in for a 970 round two.


----------



## spacin9

Quote:


> Originally Posted by *rickcooperjr*
> 
> How long has adaptive Vsync been out? AMD FreeSync and such are almost identical to adaptive Vsync. All of this calls Nvidia G-Sync into question, is all I am saying; it could very well be misleading lies, as with the GTX 970.
> 
> Adaptive Vsync was out many years ago, long before G-Sync. So someone enabling G-Sync on a non-G-Sync setup, without the proprietary G-Sync module, is very troubling and questionable, is it not?


I've used adaptive v-sync a lot in the past, mostly in Skyrim I think. There's still some tearing. I'm not saying G-Sync is 100% perfect, but it's damn good. And I think you'll see the prices level off if FreeSync is any good. That's what competition is for.

There have always been driver hacks and BIOS hacks to enable some hidden function we weren't supposed to have full access to. I think NV Inspector is built on that idea.

Unlocking extra pixel pipes, unlocking overclocking options on cheap motherboards... if there's a hack to exploit, someone's going to find it.


----------



## Blameless

Quote:


> Originally Posted by *spacin9*
> 
> Supposedly the module NV uses has 800 MB for framebuffering


The G-Sync module has significant local memory (mostly for bandwidth purposes) because the module's ASIC is doing the bulk of the processing.

Module-less solutions will do the processing on the GPU, and thus won't need another copy of the data beyond what already exists in VRAM.
Quote:


> Originally Posted by *rickcooperjr*
> 
> How long has adaptive Vsync been out? AMD FreeSync and such are almost identical to adaptive Vsync. All of this calls Nvidia G-Sync into question, is all I am saying; it could very well be misleading / false information / marketing, as with the GTX 970.


VESA Adaptive Sync and adaptive vsync are _totally_ different things.


----------



## Hattifnatten

Quote:


> Originally Posted by *spacin9*
> 
> I've used adaptive v-sync a lot in the past.. mostly in Skyrim I think. There's still some tearing. I'm not saying g-sync is 100% perfect, but it's damn good. And I think you'll see the prices level off if FreeSync is any good. That's what competition is for.


Adaptive v-sync is not the same as G-Sync. Adaptive v-sync is basically vertical sync that gets turned off when the framerate drops below the monitor's refresh rate. Hence the tearing you've been experiencing.


----------



## spacin9

Quote:


> Originally Posted by *Hattifnatten*
> 
> Adaptive v-sync is not the same as G-Sync. Adaptive v-sync is basically vertical sync that gets turned off when the framerate drops below the monitor's refresh rate. Hence the tearing you've been experiencing.


I know this. I was responding to someone responding to me. I personally don't care about any company's attempt to monetize a feature... as long as it works.
Quote:


> Originally Posted by *Blameless*
> 
> The G-Sync module has significant local memory (mostly for bandwidth purposes) because the module's ASIC is doing the bulk of the processing.
> 
> Module-less solutions will do the processing on the GPU, and thus won't need another copy of the data beyond what already exists in VRAM.
> VESA Adaptive-Sync and adaptive vsync are _totally_ different things.


You seem to know a lot about this. I don't. Like I said, if it were that easy, someone would have done it for free already, right? FreeSync? Why didn't they do it before G-Sync and blow NV's nefarious money grab out of the water?

As far as I can tell from what you said, as a layman, the tech is getting better and cheaper? That's a good thing, right?


----------



## skupples

lol at anyone that thinks nvidia invented the G-Sync module just for the lulz.


----------



## ZealotKi11er

Quote:


> Originally Posted by *skupples*
> 
> lol at anyone that thinks nvidia invented the G-Sync module just for the lulz.


They did it for $.


----------



## skupples

Quote:


> Originally Posted by *ZealotKi11er*
> 
> They did it for $.


I highly doubt G-Sync will ever sell enough units to turn a healthy profit on the R&D cycle, but I guess we won't know for sure until someone gets ahold of an earnings report.

I mean, just do some math. They aren't exactly selling like hotcakes, or available enough to sell like hotcakes.

That might change in 2015, but as of right now I doubt they've recouped their costs to bring it to market.


----------



## JackMex

Quote:


> Originally Posted by *skupples*
> 
> I highly doubt G-Sync will ever sell enough units to turn a healthy profit on the R&D cycle, but I guess we won't know for sure until someone gets ahold of an earnings report.
> 
> I mean, just do some math. They aren't exactly selling like hot cakes or available enough to sell like hot cakes.
> 
> That might change in 2015 but as of right now I doubt they've recouped their costs to bring to market.


G-G-G-G-G-SYYYYYYNC!!


----------



## Mad Pistol

The idea behind G-Sync is to have a framebuffer built into the monitor, so that it can display frames at exactly the right time. My guess is that it produces and updates frames constantly; at the point at which a generated frame lines up with the display timing, it displays that frame.

That's probably why it has so much buffer memory, and why it requires DisplayPort in order to work. It's a high-bandwidth communication between the GPU and monitor.


----------



## Johnny Rook

Quote:


> Originally Posted by *skupples*
> 
> I highly doubt G-Sync will ever sell enough units to turn a healthy profit on the R&D cycle, but I guess we won't know for sure until someone gets ahold of an earnings report.
> 
> I mean, just do some math. They aren't exactly selling like hot cakes or available enough to sell like hot cakes.
> 
> That might change in 2015 but as of right now I doubt they've recouped their costs to bring to market.


Yeah man, we'll just have to wait and see.

Brand-new tech is always pricey at launch and drops in price as more people buy it and as production increases and stocks fill. It happened with DDR3 and IPS, and it's happening with 4K, etc.


----------



## skupples

I mean, how much profit can they really be turning on those units? The partners probably get them for ~$75, at most.


----------



## JackMex

Quote:


> Originally Posted by *Johnny Rook*
> 
> Yeah man, we'll just have to wait and see.
> 
> Brand new tech is always pricey at launch and starts to drop down in prices as more people buy it and as the production increases and stocks fill. Happened with DDR3, IPS, is happening with 4K, etc.


Also, prices drop as the technology of the product itself matures and more efficient, streamlined versions are produced.


----------



## skupples

And now look at DDR3. 2x the price it was in 2012.


----------



## Vesku

Quote:


> Originally Posted by *provost*
> 
> Let's just drop this fruitless exercise. If Nvidia is honoring returns, you are wasting your time. That's all you can expect the company to do. No one is going to have sympathy for someone's video gaming issues.


The poster was saying his Spanish retailer refused to take it back. He can try to get Nvidia involved but if that doesn't work he'd need to try his country's consumer protection department.


----------



## Blameless

Quote:


> Originally Posted by *spacin9*
> 
> Like I said, if it we're that easy, someone would have done this for free already right? Free-Sync? Why didn't they do it before g-sync and blow NVs nefarious money grab out of the water?


Two reasons:

1. The demand for the feature wasn't there. Some people, myself included, have been wishing for dynamic refresh rates and were aware they were possible, but until very recently there was no real consumer demand for the feature. Until NVIDIA produced the current incarnation of G-Sync, the focus was simply higher refresh rates and then throwing enough GPU horsepower at it to be able to use vsync. NVIDIA saw an opportunity to distinguish their product and they took it.

2. The display hardware to do a true dynamic refresh rate hasn't existed, outside of some mobile or certain specialized professional setups, until NVIDIA implemented it.
Quote:


> Originally Posted by *spacin9*
> 
> the tech is getting better and cheaper? That's a good thing right?


Yes.
Quote:


> Originally Posted by *skupples*
> 
> I highly doubt G-Sync will ever sell enough units to turn a healthy profit on the R&D cycle, but I guess we won't know for sure until someone gets ahold of an earnings report.
> 
> I mean, just do some math. They aren't exactly selling like hot cakes or available enough to sell like hot cakes.
> 
> That might change in 2015 but as of right now I doubt they've recouped their costs to bring to market.


G-Sync is an interesting and compelling feature that is still, as of this moment, something only NVIDIA can do in practice. Even if displays with G-Sync modules don't move enough volume to be a large profit, it's still a good marketing point.
Quote:


> Originally Posted by *Vesku*
> 
> The poster was saying his Spanish retailer refused to take it back. He can try to get Nvidia involved but if that doesn't work he'd need to try his country's consumer protection department.


I think it's a lost cause.


----------



## mouacyk

This thread is getting derailed. The G-Sync issue has its own thread here:

http://www.overclock.net/t/1538208/nvidia-g-sync-free-on-mobile-edp-monitors-with-980m/170#post_23484986


----------



## N0ID

Quote:


> Originally Posted by *Blameless*
> 
> Two reasons:
> 
> 1. The demand for the feature wasn't there. Some people, myself included, have been wishing for dynamic refresh rates and were aware they were possible, but until very recently there was no real consumer demand for the feature. Until NVIDIA produced the current incarnation of G-Sync, the focus was simply higher refresh rates and then throwing enough GPU horsepower at it to be able to use vsync. NVIDIA saw an opportunity to distinguish their product and they took it.
> 
> 2. The display hardware to do a true dynamic refresh rate hasn't existed, outside of some mobile or certain specialized professional setups, until NVIDIA implemented it.
> Yes.
> G-Sync is an interesting and compelling feature that is still, as of this moment, something only NVIDIA can do in practice. Even if displays with G-Sync modules don't move enough volume to be a large profit, it's still a good marketing point.
> I think it's a lost cause.


I fear it might actually be a lost cause. Nonetheless, I am going to give it a try and will report back to you guys as soon as I've spoken to someone from the consumer protection association. That will probably be around the beginning of next week.

I am not getting my hopes up, though...

You have all been very helpful with your replies/suggestions, thanks for that!


----------



## provost

Quote:


> Originally Posted by *Vesku*
> 
> The poster was saying his Spanish retailer refused to take it back. He can try to get Nvidia involved but if that doesn't work he'd need to try his country's consumer protection department.


Look, I didn't buy the 970, so I am in no position to tell anyone what to do or not to do. They should do what they feel is appropriate.

Nvidia is a creature that I admire as a business, but loathe as a consumer... lol
I don't know enough about AMD, except that some defend it to the death, and others drop by with random, ignorant hate comments from other forums.
I will do more research on AMD to see if they are less of a clown than Nvidia, and make my next purchase decision accordingly...


----------



## mcg75

Quote:


> Originally Posted by *ZealotKi11er*
> 
> They did it for $.


G-Sync brings the smoothness of 60fps down to as low as 40fps.

Anybody who enjoys that smoothness can now spend less $$$ on GPU power to get it.

I don't have one, but if it works the way they say it does, I'm going to save hundreds of dollars more by not needing high-end GPUs than what I spent on G-Sync.

The same applies to FreeSync.


----------



## Blameless

Quote:


> Originally Posted by *provost*
> 
> Will do more research on AMD to see if they are less of a clown than Nvidia, and make my next purchase decision accordingly ....


What does the clownishness of the manufacturer have to do with the merits of their product?


----------



## provost

Quote:


> Originally Posted by *Blameless*
> 
> What does the clownishness of the manufacturer have to do with the merits of their product?


Because the joke is always on the buyer... Come on, you're asking hard questions first thing in the morning... wearing me out already... lol

Edit: and my comment has nothing to do with their product quality, which I do believe is quite good.


----------



## somethingname

If it can't fully utilize 3GB of RAM in games, then they are just sidestepping in order to avoid backlash and refunds.


----------



## nleksan

It uses 3GB admirably; in fact, it goes to about 3.5GB before the slower memory is even brought into play, and even then only if absolutely necessary.

On another note, I have been a very active, long time member of OCN, and consider it my Internet home so to speak...

Sadly, there has been a tremendous drop in the quality of posts, civility, respect, common courtesy, and even basic human decency. The willingness of so many to even CONSIDER that maybe someone else has a point equally, perhaps more, valid has become a nonoccurrence.

The levels of fanboy absurdity; the cognitive biases that are not corrected but instead embraced; the pandering to people on the same "side" to boost rep; the incessant, unchecked spreading of FUD and misinformation; the people who, in the same breath as giving "advice", prove that they have lied about what they own (I have seen it many times: someone says in one post their PC is xyz, then turns around and says it's actually jkl; a recent example being someone who claimed AMD is infallible and that they're getting minimum fps of 60+ in FC4 across 3x1440p Eyefinity at 100% max settings with a 4M/8C FX and 3x 290X, when anyone who has tried CrossFire in that game knows that isn't possible...); the way people with 100 posts start in on people who have been here for years, or even yell at mods; the cherry-picked benches to prove a point (it's a rule in the scientific world that fabrication, or failure to disclose evidence, negates the validity of anything associated with it...). Everyone here has their own respective strengths and weaknesses regarding specific areas of computers, but it irks me to no end to have people who know nothing about the topic constantly tell me I'm wrong when it comes to audio (I have done audio production, from recording all the way through mastering, for a long time and own my own studio; I take an extremely scientific view of audio, as my main career focus is in psychopharmacology, and I find the nonsense that is the audiophile community an embarrassment; the ONLY subjective opinion that should matter is that of the person asking the questions, instead of having 20 people shaping their expectations beforehand... Ugh).

I have been lied to by AMD, personally and as a member of the community, but it's the former that twists the knife. I have been willing again and again to give them the benefit of the doubt, and I have never felt it was earned; in fact, I have had a few more negative experiences...
Nvidia may not be some moral guidepost, but their only "scandal" (rolls eyes) is this VRAM thing, which is significantly better than AMD's recent history.

Then there is the ********* they have running their marketing, and I can say that he is hurting them, because I will not buy from a company that not only employs such idiots, but makes them senior-level staff... let alone the public face of the company.
The fact is, AMD has developed a reputation as an immature group, which is going to kill them, as it is a major issue with enterprise hardware market clients.

Mark my words: the SECOND they fire the pet monkey running their marketing, I will buy at least two of their highest-end single-GPU cards and post a picture of them with a sheet of paper with the names of Raghu/rdr09/mtcn/pontiacgtx/orangey printed in big font with little hearts drawn all around their names. I will then have it screened onto a t-shirt, make a dunce cap out of it, and attend an OCN meetup wearing them.

Until then, I will go with the company that inspires enough trust (not just in me, but obviously also in their investors) that the occasional slip-up is, while never "okay", at least more easily forgiven, because they have not been slipping up at so frequent a pace...

But to my earlier point...

The bottom line is that, frankly, the quality of the membership here has declined almost in concert with AMD's profit margins, and the forum is worse off for it because this is not a corporate entity who cares about your money, it's (supposed to be) a community.

Grow up people, and frankly, the OVER THE TOP SHILLING needs to stop now! It makes everyone else look bad by association, and it's time to start respecting those who you disagree with, stop watching Mean Girls on repeat, and learn that you are not better than anyone else and talking in any way that implies otherwise proves that, if not showing you are at a lower level...

Let's try to make this place decent again!


----------



## sugarhell

Why does AMD have anything to do with this? Both companies do this; both companies' quality is subpar for what we pay. OK, you justify your choice, but why should I care? Anyone believing either marketing department is an idiot.

Also, I hate the wall of text.


----------



## Cyro999

Quote:


> Gsync brings the smoothness of 60fps down to as low as 40fps.


That's a kind-of-correct but technically completely wrong statement.

Perfect 60fps is perfect 60fps; there's no way around that. The problem is that for anything with variable frametimes, smoothness is worse than you'd expect for the framerate. Some games/applications might have 3% frametime variance, while others have 10%. Some really problematic games can have 50+% frametime variance, so the FPS needed to look smoother than a static, even-frametime 60fps video might be anywhere from ~65 to 80+, depending on the game.

Adaptive refresh timing on the screen does a lot to mitigate that flaw, and makes 60fps gameplay look the way it's supposed to, instead of far worse than a pre-recorded 60fps video, which has exactly one frame every 16.67ms instead of 60 frames made up of random frametimes varying from, say, 14 to 19ms.

However, 60fps smoothness at 40fps won't happen unless the 60fps you are comparing to is very flawed and unsmooth. Which, with many games, it actually is. A 1.5x difference in FPS is still quite an extreme perceptual change, though; I can't rate it myself without having seen it on a range of games.
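The frametime-variance point is easy to see with made-up numbers. Both streams below average 60fps; only the pacing differs (illustrative figures, not measurements):

```python
# Two frame streams with the same ~16.67 ms average frametime (60 fps),
# one evenly paced, one not.
even   = [16.67] * 6
uneven = [14.0, 19.0, 15.0, 18.5, 14.5, 19.0]

def spike_pct(frametimes_ms):
    """Worst-frame deviation above the mean frametime, as a percentage."""
    mean = sum(frametimes_ms) / len(frametimes_ms)
    return max(0.0, (max(frametimes_ms) - mean) / mean * 100)

print(f"even:   +{spike_pct(even):.0f}% worst-case spike")    # +0%
print(f"uneven: +{spike_pct(uneven):.0f}% worst-case spike")  # +14%
```

Both would be reported as "60fps", but the uneven stream is the one that reads as stutter on a fixed-refresh display; VRR hides the mismatch between those frametimes and a fixed scanout.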


----------



## Blameless

Products, good and bad, market themselves.

I don't trust either AMD or NVIDIA half as far as I can throw their collective physical assets, but I don't have to. I don't pre-order hardware, and neither AMD nor NVIDIA is directly responsible for any RMAs their products may need. If a product does what I need it to do, within the budget I have set, and there isn't something that does equally well for less, then I buy it. After the moment of purchase, the only thing I need to rely on AMD or NVIDIA for is driver updates... and the track record of both companies is less than stellar there.


----------



## rdr09

Quote:


> Originally Posted by *sugarhell*
> 
> Why does AMD have anything to do with this? Both companies do this; both companies' quality is subpar for what we pay. OK, you justify your choice, but why should I care? Anyone believing either marketing department is an idiot.
> 
> Also i hate the wall of text


Prolly if he stops posting nonsense, OCN will be decent, lol.


----------



## MerkageTurk

Nleksan

You are somewhat contradicting yourself, but I do agree with you.

However, I am neutral when it comes to my PC hardware, and whoever misrepresents their products, or even stops driver support for their graphics card after a couple of months, deserves the negative press they brought upon themselves (i.e. nVIDIA stopping support for GK110).

Let's be serious: it does not matter if you are Nvidia or AMD. What matters is that a product was misrepresented, and nobody knew about it until consumers tested the items themselves.

This action itself makes me want to sway to AMD.


----------



## nleksan

My experiences are nonsense?

Well, thanks for at least demonstrating some of my points regarding respect...

And indeed, I would like to know what AMD has to do with this, since half the thread has been AMD diehards bashing Nvidia despite having ZERO reason to; if anything, you should be grateful for AMD's price-drop response...


----------



## Usario

I've just been laughing at NVIDIA this whole time. This is fairly blatant false advertising... they advertised 64 ROPs and 224GB/s of memory bandwidth, though they could get away with the latter by claiming that 224GB/s to them means 196GB/s on the 3.5GB block + 28GB/s on the 512MB block (absolute nonsense in practice, but who knows)... Wonder if the feds are going to get involved.
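The bandwidth arithmetic does fall out of the public memory spec (7 Gbps effective GDDR5 on a 256-bit bus, i.e. eight 32-bit controllers); what's contested is lumping the two segments together:

```python
# GTX 970 memory bandwidth from its published memory configuration.
GDDR5_GBPS_PER_PIN = 7.0    # 7 Gbps effective data rate
CONTROLLER_BITS = 32        # eight 32-bit controllers = 256-bit bus

per_controller = GDDR5_GBPS_PER_PIN * CONTROLLER_BITS / 8   # GB/s each
fast = 7 * per_controller   # seven controllers behind the 3.5GB segment
slow = 1 * per_controller   # one controller behind the 0.5GB segment

print(per_controller, fast, slow, fast + slow)  # 28.0 196.0 28.0 224.0
```

The catch, as reported, is that the two segments can't be read in parallel, so the headline 224GB/s figure isn't achievable in practice, which is the point being made above.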
Quote:


> Originally Posted by *sugalumps*
> 
> Ragu/mtcn and a few others(not as bad so I wont name them) are well known amd shills, and have been for many dedicated years. Just leave them to it and ignore them, treat them like you would treat roy - with no respect or credibility.
> 
> Their dedication is admirable though, possibly the best shills ever?


Yes, everybody who disagrees with you and likes the company you hate is a shill, and surely NVIDIA has none... (In fact both sides are primarily comprised of aneurotypical fanboys with too much time on their hands and nothing going on in life)


----------



## sugarhell

Quote:


> Originally Posted by *Usario*
> 
> I've just been laughing at NVIDIA this whole time. This is fairly blatant false advertisement... they advertised 64 ROPs and 224GB/s memory bandwidth, though they could get away with it on the latter statement by claiming that 224GB/s to them means 196GB/s on the 3.5GB block + 28GB/s on the 512MB block (absolute nonsense in practicality, but who knows)... Wonder if the feds are going to get involved.
> Yes, everybody who disagrees with you and likes the company you hate is a shill, ...


Shill is the new fanboy word


----------



## Usario

Quote:


> Originally Posted by *sugarhell*
> 
> Shill is the new fanboy word


Used to have meaning when Intel had an actual massive shill program


----------



## Forceman

Quote:


> Originally Posted by *Usario*
> 
> I've just been laughing at NVIDIA this whole time. This is fairly blatant false advertisement... they advertised 64 ROPs and 224GB/s memory bandwidth, though they could get away with it on the latter statement by claiming that 224GB/s to them means 196GB/s on the 3.5GB block + 28GB/s on the 512MB block (absolute nonsense in practicality, but who knows)...


Did they actually advertise 64 ROPs though? That was in the review guide and the reviews but, legally speaking, reviews aren't advertising. Their website, and the partner websites, don't actually list the ROP count (or the L2 cache) anywhere.


----------



## skupples

Quote:


> Originally Posted by *Usario*
> 
> Used to have meaning when Intel had an actual massive shill program


Shill is not the new fanboy word.

AMD and NV both employ shills; the only difference is that the AMD insider program is much larger and much more deceptive in its practices.

Most legit forums require sanctioned shills to state as much in their signatures. Feel free to PM me; I have the list of AMD's top 50 shills (and the rest). You might be surprised by the names you find.


----------



## Usario

Quote:


> Originally Posted by *Forceman*
> 
> Did they actually advertise 64 ROPs though? That was in the review guide and the reviews but, legally speaking, reviews aren't advertising. Their website, and the partner websites, don't actually list the ROP count (or the L2 cache) anywhere.


Ah, but they did in the Reviewer's Guide, NVIDIA shill. (dunno if this link works: http://www.hardocp.com/image.html?image=MTQxMTA2MzcyNDBjUEVEMXNuZnBfNV8xMF9sLmdpZg== )
Quote:


> Originally Posted by *skupples*
> 
> Shill is not the new fanboy word.
> 
> AMD and NV both employ shills; the only difference is that the AMD insider program is much larger and much more deceptive in its practices.
> 
> Most legit forums require sanctioned shills to state as much in their signatures. Feel free to PM me; I have the list of AMD's top 50 shills (and the rest). You might be surprised by the names you find.


Then I would like to see your source, friend


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> Did they actually advertise 64 ROPs though? That was in the review guide and the reviews but, legally speaking, reviews aren't advertising. Their website, and the partner websites, don't actually list the ROP count (or the L2 cache) anywhere.


I can tell you that people who buy $300 GPUs look at reviews, and Nvidia is responsible for review accuracy. Stuff like ROP and L2 counts may be pointless for most people to know, but the fact that the memory is split into two parts was a truth Nvidia kept hidden. The 3.5GB + 0.5GB arrangement on the GTX 970 is not the same as the 4GB on the GTX 980, and Nvidia should have told everyone about this. This is not a sub-$100 GPU. People who spend this much on GPUs want to know what they are paying for. It's like selling a car as 200 HP and later finding out it has a 150 HP engine plus a 50 HP second engine that can't run at the same time.


----------



## Forceman

Quote:


> Originally Posted by *Usario*
> 
> Ah, but they did in the Reviewer's Guide, NVIDIA shill. (dunno if this link works: http://www.hardocp.com/image.html?image=MTQxMTA2MzcyNDBjUEVEMXNuZnBfNV8xMF9sLmdpZg== )


Quote:


> Originally Posted by *ZealotKi11er*
> 
> I can tell you that people who buy $300 GPUs look at reviews, and Nvidia is responsible for review accuracy. Stuff like ROP and L2 counts may be pointless for most people to know, but the fact that the memory is split into two parts was a truth Nvidia kept hidden. The 3.5GB + 0.5GB arrangement on the GTX 970 is not the same as the 4GB on the GTX 980, and Nvidia should have told everyone about this. This is not a sub-$100 GPU. People who spend this much on GPUs want to know what they are paying for. It's like selling a car as 200 HP and later finding out it has a 150 HP engine plus a 50 HP second engine that can't run at the same time.


I'm not saying NVidia isn't wrong in this, but the review guide and the actual reviews are not advertising in the "false advertising" sense.

And I'd say that Nvidia isn't really responsible for review accuracy. They should make sure the data they provide is accurate, certainly, but if a review posts wrong data on their own, is it Nvidia's responsibility to correct it? So Nvidia has to fact-check every review in existence?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> I'm not saying NVidia isn't wrong in this, but the review guide and the actual reviews are not advertising in the "false advertising" sense.
> 
> And I'd say that Nvidia isn't really responsible for review accuracy. They should make sure the data they provide is accurate, certainly, but if a review posts wrong data on their own, is it Nvidia's responsibility to correct it? So Nvidia has to fact-check every review in existence?


Yes. Nvidia reads all the reviews before they are posted, for every review where they provided the review kit.


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yes. Nvidia reads all the reviews before they are posted, for every review where they provided the review kit.


You have a source on that?


----------



## Usario

Quote:


> Originally Posted by *Forceman*
> 
> I'm not saying NVidia isn't wrong in this, but the review guide and the actual reviews are not advertising in the "false advertising" sense.
> 
> And I'd say that Nvidia isn't really responsible for review accuracy. They should make sure the data they provide is accurate, certainly, but if a review posts wrong data on their own, is it Nvidia's responsibility to correct it? So Nvidia has to fact-check every review in existence?


The review guide had a spec sheet and the spec sheet was full of blatant lies as far as ROP count and L2 amount go, not to mention the memory complications. Not only was this false advertising directed towards the reviewers, the reviewers naturally passed on this false information provided by NVIDIA to all the consumers reading their reviews. And NVIDIA didn't ever acknowledge these lies until long after they were discovered by the enthusiast community, and they STILL refuse to acknowledge the spec sheet in the reviewer's guide, because they know that's what's going to get them into hot water with the feds.

When NVIDIA is knowingly sending out FALSE spec sheets and *refuses to acknowledge or correct them*, I'm fairly sure that falls under deceptive commerce/false advertising as defined by the FTC. The reviewers were posting false data *that was provided to them by NVIDIA*.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> You have a source on that?


https://forums.geforce.com/default/topic/807528/geforce-900-series/useful-information-about-false-advertising-must-read-/


----------



## 2010rig

@[email protected] Please give AMD's marketing department a break, they are paid with food stamps.








Quote:


> Originally Posted by *skupples*
> 
> Shill is not the new fan boy word
> 
> AMD and NV both Employe shills, only difference is that the AMD insider program is much larger and much more deceptive in their practices.
> 
> Most legit forums require sanctioned shills to state as such in their signatures. Feel free to PM me. I have the list of AMD's top 50 shills (and the rest) you might be surprised by the names you find.


PM me the list.









I've had my suspicions about many people on this forum whose opinions are swayed by the hardware they receive from AMD.


----------



## Forceman

Quote:


> Originally Posted by *Usario*
> 
> The review guide had a spec sheet and the spec sheet was full of blatant lies as far as ROP count and L2 amount go, not to mention the memory complications. Not only was this false advertising directed towards the reviewers, the reviewers naturally passed on this false information provided by NVIDIA to all the consumers reading their reviews. And NVIDIA didn't ever acknowledge these lies until long after they were discovered by the enthusiast community, and they STILL refuse to acknowledge the spec sheet in the reviewer's guide, because they know that's what's going to get them into hot water with the feds.
> 
> When NVIDIA is knowingly sending out FALSE spec sheets and *refuses to acknowledge or correct them*, I'm fairly sure that falls under deceptive commerce/false advertising as defined by the FTC. The reviewers were posting false data *that was provided to them by NVIDIA*.


Again, review guides sent to reviewers are not the same as paid-for advertising, and the spec sheets aren't sent to the consumer. If the case is so open and shut, why has no one approached the FTC about it yet? I'm pretty sure anyone can make a complaint to them.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> https://forums.geforce.com/default/topic/807528/geforce-900-series/useful-information-about-false-advertising-must-read-/


So the source is a forum post, and a list of reviews on Nvidia's site? Not exactly airtight. But I'll agree that Nvidia probably reads the reviews, although that isn't the same as approving them, which is what I initially assumed you meant (even though you said read).


----------



## Usario

Quote:


> Originally Posted by *Forceman*
> 
> Again, review guides sent to reviewers are not the same as paid-for advertising. If the case is so open and shut, why has no one approached the FTC about it yet? I'm pretty sure anyone can make a complaint to them.


Doesn't matter, FTC statutes cover deceptive practices in commerce in general, not just paid-for advertisements in the traditional sense. Though I'm no lawyer so I'm not going to say the case is 'open and shut'


----------



## mtcn77

I find this outcome hilarious. May be a personal trait, lol. [What did I say?]


----------



## Forceman

Quote:


> Originally Posted by *Usario*
> 
> Doesn't matter, FTC statutes cover deceptive practices in commerce in general, not just paid-for advertisements in the traditional sense. Though I'm no lawyer so I'm not going to say the case is 'open and shut'


I don't know either. I went to their website, but I couldn't find an easy way to file a complaint. The "file a consumer complaint" link seemed to be more focused on fraud than deceptive advertising, and following it didn't seem very relevant. I guess you might have to call. I wonder if there is any way to see what complaints they've received - I know the BBB does that.

I can't decide if them not changing the specs on the website (with regard to the VRAM and bandwidth) means their lawyers don't think they need to change them because they are technically correct, or the lawyers don't want them changed because it would be tacitly admitting they were wrong before. I'm guessing the former, but I'd also believe the latter, since you never want to admit guilt in a case like this.


----------



## sugalumps

Quote:


> Originally Posted by *ZealotKi11er*
> 
> https://forums.geforce.com/default/topic/807528/geforce-900-series/useful-information-about-false-advertising-must-read-/


Not even golden is defending them! Nvidia really has fallen from grace


----------



## rdr09

Quote:


> Originally Posted by *sugalumps*
> 
> Not even golden is defending them! Nvidia really has fallen from grace


how? he was one of them who got duped. he came across as someone who worked for a company at one time but . . . i was wrong. i admit. he is pushing the issue . . . just not here in ocn. i feel sorry for the guy.


----------



## Noufel

Quote:


> Originally Posted by *sugalumps*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> https://forums.geforce.com/default/topic/807528/geforce-900-series/useful-information-about-false-advertising-must-read-/
> 
> 
> 
> Not even golden is defending them! Nvidia really has fallen from grace
Click to expand...

man .... golden is asking nvidia for a free upgrade to a 980


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> how? he was one of them who got duped. he came across as someone who worked for a company at one time but . . . i was wrong. i admit. he is pushing the issue . . . just not here in ocn. i feel sorry for the guy.


Because Nvidia does not care about OCN or this forum thread.


----------



## Xoriam

Quote:


> Originally Posted by *Noufel*
> 
> man .... golden is asking nvidia for a free upgrade to a 980


Nothing wrong with that, since a 980 actually costs less to manufacture









(no need to spend on cutting the chip and modding the bios.
I know i know, 970 cost less because it failed 980 standards.)


----------



## skupples

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yes. Nvidia reads all the reviews before they are posted for all those reviews that they provided the review kit.


this is how the game works for ALL review samples from EVERY company, from alphabet soup to zebra cakes.

They give free samples with a list of talking points (see Linus complaining about the 290x talking points slide in his first review); the reviewer then writes and submits the article for approval.

THIS ONLY applies to sanctioned reviewers.
Quote:


> Originally Posted by *2010rig*
> 
> @[email protected] Please give AMD's marketing department a break, they are paid with food stamps.
> 
> 
> 
> 
> 
> 
> 
> 
> PM me the list.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've had my suspicions about many people on this forum whose opinions are swayed by the hardware they receive from AMD.


But wait... I got the list from you!! The old domain for the insider program changed, and it looks like the new one is walled off, unlike the old one, so you can't freely view their list of sanctioned shills.


----------



## Noufel

Any official word from Nvidia? What's next, are they getting away with this without taking care of their clients???


----------



## skupples

Most likely. Most people who don't live in dictatorship countries seem to have had good enough luck with returns, though that seems to be changing now that these companies have put together proper statements with benchmarks and whatnot.

Amazon is still honoring returns on units outside the 30-day window.


----------



## Rahldrac

But even if we are offered 980 replacements, what about those of us who already have waterblocks on them? Should we ask for store credit instead? Or just apply some new thermal paste, attach the stock cooler, and hope for the best?


----------



## rdr09

Quote:


> Originally Posted by *Rahldrac*
> 
> But even if we are offered 980 replacements, what about those of us who already have waterblocks on them? Should we ask for store credit instead? Or just apply some new thermal paste, attach the stock cooler, and hope for the best?


that or sell the blocks. not all owners are giving up their 970s. they are still a beast.


----------



## skupples

Quote:


> Originally Posted by *Rahldrac*
> 
> But even if we are offered 980 replacements, what about those of us who already have waterblocks on them? Should we ask for store credit instead? Or just apply some new thermal paste, attach the stock cooler, and hope for the best?


you would be lucky if they honor the warranty after you ripped off the stock block, since most GPU manufacturers put those damned nazi stickers on the screws.


----------



## Rahldrac

Quote:


> Originally Posted by *rdr09*
> 
> that or sell the blocks. not all owners are giving up their 970s. they are still a beast.


Nah, I am going waterblocks no matter what,
So either:
Store credit

-Or-

Put back on the air cooler that came with the card and hope that they do not notice that the thermal pads are different. And then trade it in for a 980. And buy new waterblocks for the 980, selling the 970 ones.

Edit:
There were no stickers on any of my screws, as far as I saw.
Problem is that the 970 SLI is powerful enough for me, but I can't stand stuttering.


----------



## skupples

Quote:


> Originally Posted by *Rahldrac*
> 
> Nah, I am going waterblocks no matter what,
> So either:
> Store credit
> 
> -Or-
> 
> Put back on the air cooler that came with the card and hope that they do not notice that the thermal pads are different. And then trade it in for a 980. And buy new waterblocks for the 980, selling the 970 ones.


It's pretty rare that a company tears down the card, unless it's being sent in for repair service.

idk, i'm spoiled. I only buy NV cards from EVGA because they really don't give two damns what you do with their card, as long as it comes back looking like it did when they sent it to you. (with stock cooler on it)


----------



## Vesku

Quote:


> Originally Posted by *Forceman*
> 
> I'm not saying NVidia isn't wrong in this, but the review guide and the actual reviews are not advertising in the "false advertising" sense.
> 
> And I'd say that Nvidia isn't really responsible for review accuracy. They should make sure the data they provide is accurate, certainly, but if a review posts wrong data on their own, is it Nvidia's responsibility to correct it? So Nvidia has to fact-check every review in existence?


It depends on how strict the country's consumer protection is but AFAIK in the EU a company disseminating bad information to reviewers can have consequences. It's fairly sensible, some buyers base at least some of their decision on reviews.


----------



## DividebyZERO

Quote:


> Originally Posted by *2010rig*
> 
> @[email protected] Please give AMD's marketing department a break, they are paid with food stamps.
> 
> 
> 
> 
> 
> 
> 
> 
> PM me the list.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've had my suspicions about many people on this forum whose opinions are swayed by the hardware they receive from AMD.


How you feel the marketing is done is one thing. However your insults just make you look insecure and defensive for Nvidia. I currently have AMD gpu's that i BOUGHT and have a decent experience with them. I was Nvidia before that and had a decent experience with them at that time. If you want to be brand loyal even when that brand makes a pretty big mistake then good for you. I choose to pick my brand based on many things and may even choose Nvidia in the near future again.

As for a list of swayed opinions: it's obvious that you are hardcore for your chosen brand. They can do no wrong, and if they did, it's not really a big deal, right?


----------



## Rahldrac

Quote:


> Originally Posted by *Vesku*
> 
> It depends on how strict the country's consumer protection is but AFAIK in the EU a company disseminating bad information to reviewers can have consequences. It's fairly sensible, some buyers base at least some of their decision on reviews.


I live in the EU, so I am hoping that Nvidia has to pay for their lies. Too bad that I wanted to watercool.


----------



## ZealotKi11er

Funny how some want to switch to a GTX 980, playing right into Nvidia's little game. Unless money is not a problem, and it clearly is if you got a GTX 970 over a GTX 980, the GTX 980 is overpriced. I can get 2 x 290X for cheaper than 1 x GTX 980.


----------



## Rahldrac

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Funny how some want to switch to a GTX 980, playing right into Nvidia's little game. Unless money is not a problem, and it clearly is if you got a GTX 970 over a GTX 980, the GTX 980 is overpriced. I can get 2 x 290X for cheaper than 1 x GTX 980.


Why? If I can get the performance of a 980 for the price of a 970 that is a good deal for me. And Nvidia is hopefully feeling some pain from cheating us. If I only get store credit I will of course not buy Nvidia. And I probably will not do that again for quite some time.


----------



## 2010rig

Quote:


> Originally Posted by *DividebyZERO*
> 
> How you feel the marketing is done is one thing. However your insults just make you look insecure and defensive for Nvidia. I currently have AMD gpu's that i BOUGHT and have a decent experience with them. I was Nvidia before that and had a decent experience with them at that time. If you want to be brand loyal even when that brand makes a pretty big mistake then good for you. I choose to pick my brand based on many things and may even choose Nvidia in the near future again.
> 
> For a list for opinions swayed it's obvious that you are hardcore for your chosen brand. They can do no wrong, and if they did it's not really a big deal right?


Keep dividing by zero, and you'll realize how valuable your opinion is to me. It's called sarcasm, ever heard of it?

I've called out NVIDIA plenty of times in the past, especially when they started selling their mid-range dies for high end prices.

I'm a marketer, and I find AMD's marketing to be childish and amateurish, their latest Fixer video proves my point. I hardly if ever see NVIDIA or Intel taking cheap shots at AMD. Let's face it, they've had plenty of chances and ammunition.

Can you imagine what types of ads we'd be seeing if AMD had been ahead of Intel the past 8 years? Actually, it's not hard to imagine.


----------



## PureBlackFire

it's just crazy how the people going the hardest in attack/defense of nvidia in this case are not 970 owners.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Funny how some want to switch to a GTX 980, playing right into Nvidia's little game. Unless money is not a problem, and it clearly is if you got a GTX 970 over a GTX 980, the GTX 980 is overpriced. I can get 2 x 290X for cheaper than 1 x GTX 980.


my thoughts exactly.


----------



## hurleyef

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Funny how some want to switch to a GTX 980, playing right into Nvidia's little game. Unless money is not a problem, and it clearly is if you got a GTX 970 over a GTX 980, the GTX 980 is overpriced. I can get 2 x 290X for cheaper than 1 x GTX 980.


Unless you need HDMI 2.0 (like my buddy) or not-garbage linux drivers, in which case AMD's offerings simply won't cut it. Fortunately that seems to be changing, as I'm sure HDMI 2.0 support is an inevitability and I've heard some rumblings that AMD are finally trying to get their stuff together linux-side. I certainly hope so...


----------



## ZealotKi11er

Quote:


> Originally Posted by *Rahldrac*
> 
> Why? If I can get the performance of a 980 for the price of a 970 that is a good deal for me. And Nvidia is hopefully feeling some pain from cheating us. If I only get store credit I will of course not buy Nvidia. And I probably will not do that again for quite some time.


If Nvidia gives you a GTX 980 for your GTX 970 at no extra cost, then why not, but most likely you'd have to add the extra $.
Quote:


> Originally Posted by *hurleyef*
> 
> Unless you need HDMI 2.0 (like my buddy) or not-garbage linux drivers, in which case AMD's offerings simply won't cut it. Fortunately that seems to be changing, as I'm sure HDMI 2.0 support is an inevitability and I've heard some rumblings that AMD are finally trying to get their stuff together linux-side. I certainly hope so...


Stick with the GTX 970 and get an R9 380X. HDMI 2.0 is the port for 4K TVs, and 4K is the resolution AMD does best. Nothing wrong with the GTX 970; it's still a better buy than the GTX 980. Nobody cut the GTX 980 any slack, because the GTX 970 existed.


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If Nvidia gives you a GTX 980 for your GTX 970 at no extra cost, then why not, but most likely you'd have to add the extra $.
> Stick with the GTX 970 and get an R9 380X. HDMI 2.0 is the port for 4K TVs, and 4K is the resolution AMD does best. Nothing wrong with the GTX 970; it's still a better buy than the GTX 980. Nobody cut the GTX 980 any slack, because the GTX 970 existed.


Has anyone gotten a 980 in exchange for a 970? I know people were hoping they could, but as far as I've heard the only thing like that is EVGA offering out-of-cycle step-ups. But you are still out the cost difference there.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> Has anyone gotten a 980 in exchange for a 970? I know people were hoping they could, but as far as I've heard the only thing like that is EVGA offering out-of-cycle step-ups. But you are still out the cost difference there.


There is no way they are getting a free GTX 980. Some can only dream.


----------



## mrawesome421

Nvidia giving free 980's to peeps? Suuuuuure.. lol

My face will fall off my head the day that happens.


----------



## skupples

Quote:


> Originally Posted by *DividebyZERO*
> 
> How you feel the marketing is done is one thing. However your insults just make you look insecure and defensive for Nvidia. I currently have AMD gpu's that i BOUGHT and have a decent experience with them. I was Nvidia before that and had a decent experience with them at that time. If you want to be brand loyal even when that brand makes a pretty big mistake then good for you. I choose to pick my brand based on many things and may even choose Nvidia in the near future again.
> 
> For a list for opinions swayed it's obvious that you are hardcore for your chosen brand. They can do no wrong, and if they did it's not really a big deal right?


you haven't read many of his posts, have you?


----------



## mcg75

Quote:


> Originally Posted by *mrawesome421*
> 
> Nvidia giving free 980's to peeps? Suuuuuure.. lol
> 
> My face will fall off my head the day that happens.


I doubt it very much as well, but if they do.......

You'll probably have a hard time gaming with no face.


----------



## Noufel

Quote:


> Originally Posted by *mcg75*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mrawesome421*
> 
> Nvidia giving free 980's to peeps? Suuuuuure.. lol
> 
> My face will fall off my head the day that happens.
> 
> 
> 
> I doubt it very much as well, but if they do.......
> 
> You'll probably have a hard time gaming with no face.
Click to expand...

it's better than gaming without 0.5 gb


----------



## DividebyZERO

Quote:


> Originally Posted by *skupples*
> 
> you haven't read many of his posts, have you?


Not all of them; I have seen some that are pretty arrogant or insulting. Instead of sticking to the points, it's more about insulting people because you can. I've lingered in this thread too long, so good luck with your defenses. I am sure we will meet again when AMD makes a mistake like this. You will be yelling at the top of your lungs; I won't be there to defend them. Good luck!


----------



## battleaxe

Someone please lock this ridiculous thread. All points have been made. What else could there possibly be to say on the subject?


----------



## Blameless

Quote:


> Originally Posted by *Forceman*
> 
> Did they actually advertise 64 ROPs though? That was in the review guide and the reviews but, legally speaking, reviews aren't advertising. Their website, and the partner websites, don't actually list the ROP count (or the L2 cache) anywhere.


The only thing that NVIDIA has publicly advertised that is factually incorrect is the theoretical memory bandwidth figure.


----------



## skupples

Quote:


> Originally Posted by *DividebyZERO*
> 
> Not all of them, i have seen some are pretty arrogant or insulting. Instead of sticking to points its more about lets insult people because we can. I've lingered in this thread too long so good luck with your defenses. I am sure we will meet again when AMD makes a mistake like this. You will be yelling at the top of your lungs, i wont be there to defend them. Good luck!


Yeah... You really haven't been paying attention. I bash AMD and Nvidia equally. So... um...

More importantly though, I like making fun of people who expect a gimped version of a mid-range card to be "future proof" and let them play games on max settings for the rest of their lives. They're either new to technology or just blinded by buyer's remorse. Maxwell has seemed like a hold-me-over GPU for quite some time now; hopefully the 390X (or whatever the next AMD flagship is called) isn't the same way. Either way, I don't really like the idea of acquiring any new GPUs when DX12 is on the horizon. I find it hard to believe that these companies can promise proper DX12 support in good faith when we don't even have any proper tech demos yet. *Some of us have been around long enough to know what a standardized API evolution looks like.*


----------



## PhotonFanatic

So when they come out with the GTX 970 8GB model, as they are sure to do, will it be a complete gimmick?


----------



## skupples

Quote:


> Originally Posted by *PhotonFanatic*
> 
> So when they come out with the GTX 970 8GB model, as they are sure to do, will it be a complete gimmick?


I don't think they will. I think only the 980 will get an 8GB model, if that.

970 8GB? would it be a 7gb/1gb card? or a 7.5giggle card?


----------



## battleaxe

Quote:


> Originally Posted by *skupples*
> 
> I don't think they will. I think only the 980 will get an 8GB model, if that.
> 
> 970 8GB? would it be a 7gb/1gb card? or a 7.5giggle card?


Either way, it wouldn't be gimped anymore then, would it? It could effectively be SLI-ready for high res, right? As it wouldn't slam into the slower memory until 7GB, correct?

I just think it's so stupid they purposely gimped this card this way.


----------



## Xoriam

Yeah I don't see the 970 or 980 getting an 8gb version.

It will probably be the next gen of cards.
If they do release 8GB versions, however, I think I might be a little annoyed, as I'll have to sell the ones I have now to get them -_-


----------



## battleaxe

Quote:


> Originally Posted by *Xoriam*
> 
> Yeah I don't see the 970 or 980 getting an 8gb version.
> 
> It will probably be the next gen of cards.
> If they do however release 8gb versions I think I might be alittle annoyed as I'll have to sell the ones I have now to get them -_-


I'm getting more annoyed with Nvidia by the minute tbh...

I just don't get why they intentionally gimp a card like this. I could be wrong, but I don't think AMD did this with the 290. Some of us were able to flash 290s into 290X cards, so it was not a hard mod, as I understand it. Silicon that didn't bin as well was just flashed as a 290 instead of a 290X. That seems a better way to do things than this memory hodgepodge that ends in our 970s essentially being obsolete for high res compared to AMD's offerings. I mean, why? Why did they do this to themselves? Now the 290X is the card to beat for multi-card and high-res 4K monitors instead of the 970. It's just a bit of a letdown for me, to be honest. Single card, yeah, the 970 is still fine. No issues there. But what if I want to SLI? Then the AMD is a better solution. The pathetic part is that I don't think it had to be this way. Nvidia is so worried about us getting high performance (read: 980 performance) from the 970 that they essentially cut off their own foot and screwed the customers. Nice job.

I still like my 970. But this was just silly.


----------



## Blameless

Quote:


> Originally Posted by *PhotonFanatic*
> 
> So when they come out with the GTX 970 8GB model, as they are sure to do, will it be a complete gimmick?


Not necessarily, but if they don't change the ROP setup, an 8GiB GTX 970 would have a fast 7GiB partition and a slow 1GiB partition.
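That 7GiB/1GiB guess follows from the same layout: keep the partially disabled ROP/L2 slice, double the memory per controller, and one eighth of the VRAM stays behind the slow path. A toy sketch of that arithmetic (the 8GiB card itself is hypothetical, not an announced product):

```python
# Hypothetical 8 GiB GTX 970: assumes the same ROP/L2 layout, so one of
# the eight 32-bit controllers still sits behind the shared (slow) L2
# path. Purely illustrative arithmetic; no such card was announced.
TOTAL_GIB = 8
N_CONTROLLERS = 8
SLOW_CONTROLLERS = 1  # controller reachable only via the shared L2 slice

per_controller_gib = TOTAL_GIB / N_CONTROLLERS
fast_gib = per_controller_gib * (N_CONTROLLERS - SLOW_CONTROLLERS)
slow_gib = per_controller_gib * SLOW_CONTROLLERS
print(fast_gib, slow_gib)  # 7.0 1.0
```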
Quote:


> Originally Posted by *battleaxe*
> 
> I just think its so stupid they purposely gimped this card in this way.


Need to create market segmentation so the GTX 980 has at least some vague reason to exist and cost considerably more.

However, they probably would have been better off disabling more SMs and lowering clock speeds further, rather than partially disabling an ROP cluster.


----------



## PhotonFanatic

Quote:


> Originally Posted by *skupples*
> 
> I don't think they will. I think only the 980 will get an 8GB model, if that.
> 
> 970 8GB? would it be a 7gb/1gb card? or a 7.5giggle card?


If we follow their history up to this point, they're almost sure to increase the VRAM at some point; they almost always have. Well, I don't know if that's actually Nvidia; EVGA and all the card companies always seem to release these better models later on. Take my old GTX 570, for example. I bought it when it first came out. It was a plain-jane EVGA reference spec, and it had 1.25GB of VRAM. If I had waited 6 months, I could have gotten a card that was only 2/3 that size, also a vanilla EVGA, but with 2.5GB of VRAM. So doubling it. Whoever is responsible, they do it every time.

Figuring out whether or not its a gimmick has eluded me thus far. After all, can the cards really make use of the extra vram?


----------



## 2010rig

Quote:


> Originally Posted by *skupples*
> 
> I don't think they will. I think only the 980 will get an 8GB model, if that.
> 
> 970 8GB? would it be a 7gb/1gb card? or a 7.5giggle card?





Spoiler: Stop it you







An 8GB 980 is pretty much guaranteed. A 7GB+1GB "8GB" 970 would be silly for them to do after all this.


----------



## Blameless

Quote:


> Originally Posted by *battleaxe*
> 
> I'm getting more annoyed with Nvidia by the minute tbh...
> 
> I just don't get why they intentionally gimp a card like this. I could be wrong, but I don't think AMD did this with the 290. Some of us were able to flash 290s into 290X cards, so it was not a hard mod, as I understand it. Silicon that didn't bin as well was just flashed as a 290 instead of a 290X.


NVIDIA and ATI/AMD have been disabling functional units for the purposes of market segmentation for a very long time.

It's just that Maxwell has a more complex ROP/memory controller setup, with a partial-disable feature whose implications the policy and marketing side of NVIDIA did not fully understand. They wanted a part to slot in below the 980, the engineers likely told them a few ways it could be done, and they seemingly chose the one that looked best on paper, without understanding how its eccentricities could come back to haunt them.

Most 290s will _not_ flash to 290X parts. Most have shaders and TMUs physically disabled.


----------



## skupples

Quote:


> Originally Posted by *PhotonFanatic*
> 
> If we follow their history up to this point, they're almost sure to increase the VRAM at some point; they almost always have. Well, I don't know if that's actually Nvidia; EVGA and all the card companies always seem to release these better models later on. Take my old GTX 570, for example. I bought it when it first came out. It was a plain-jane EVGA reference spec, and it had 1.25GB of VRAM. If I had waited 6 months, I could have gotten a card that was only 2/3 that size, also a vanilla EVGA, but with 2.5GB of VRAM. So doubling it. Whoever is responsible, they do it every time.
> 
> Figuring out whether or not its a gimmick has eluded me thus far. After all, can the cards really make use of the extra vram?


And those companies must get approval from Nvidia.

Also, I would buy a 290X over Maxwell ANY DAY if it were for 4K screens or Eyefinity.

Period.

Seems I'm the only one who remembers these cards being marketed as GK104 replacements; you can't be a high-res king at this point in time and also be a mid-range card meant to replace mid-range cards.


----------



## Forceman

Quote:


> Originally Posted by *Blameless*
> 
> The only thing that NVIDIA has publicly advertised that is factually incorrect is the theoretical memory bandwidth figure.


And according to them, there are still scenarios where that bandwidth can be achieved (however unlikely those are). I think SKYMTL posted their quote in one of these threads.


----------



## mtcn77

Quote:


> Originally Posted by *battleaxe*
> 
> Either way, seems it wouldn't be gimped anymore then would it? It could effectively be SLI ready for high res right? As it wouldn't slam into the slower memory until 7Gb correct?
> 
> I just think its so stupid they purposely gimped this card in this way.


I think the card would have given in much sooner than it could have utilized most of that 7 GB capacity. The bandwidth is really slow, and the two segments can't be freely combined. The potential use case for extra VRAM is keeping constantly accessed data next to the GPU, afaik. If you bet on that feature and went all in on post-processing & resolution & AA, somewhere between 3.5GB and 7GB you would have stalled the GPU.
Nvidia GPUs are actually better at smaller resolutions, where their primitive performance outweighs their blend performance deficit, but the marketplace has to shift before prices are adjusted as such - not until hell freezes over, by the looks of it. Even so, they deliver outstanding value in the mid to low range. I mean, just look at the benchmarks: the GTX 960 is an absolute monster in that field.

Demand only primitive performance, don't pick any blending- & postprocessing-heavy settings, and you have a winner.
AMD has a much harder sell as a self-sustaining high-end vendor, and people aren't finicky enough to look beyond simple fps charts & marketing material.


----------



## Blameless

Quote:


> Originally Posted by *Forceman*
> 
> And according to them, there are still scenarios where that bandwidth can be achieved (however unlikely those are). I think SKYMTL posted their quote in one of these threads.


Apparently, the total bandwidth of the partitions does add up to 224GB/s, but you cannot read from (or write to) both segments simultaneously, thus the theoretical figure only applies (even in the theoretical sense) if you carefully interleave reads and writes to the different segments.

Any time you try to read from both, or write to both, simultaneously, you totally cripple the effective bandwidth of the VRAM, because the other segment needs to sit idle.

Source: http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation/2


----------



## Forceman

Quote:


> Originally Posted by *Blameless*
> 
> Apparently, the total bandwidth of the partitions does add up to 224GB/s, but you cannot read or write to both segments simultaneously, thus the theoretical figure only apply (even in the theoretical sense) if you carefully interleave reads and writes to the different segments.
> 
> Any time you try to read from both, or write to both, simultaneously, you totally cripple the effective bandwidth of the VRAM, because the other segment needs to sit idle.
> 
> Source: http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation/2


I'm not disputing that, just repeating what Nvidia said. I know what Anand wrote so I asked SKYMTL and he said the engineers told them it is possible to get the full 224 GB/s in certain scenarios, and so they didn't need to change the numbers. Whether that can actually ever happen in real life is a different story.


----------



## spacin9

An update from the league office in New York: The Superbowl will only be 3.5 quarters this year. "And here's the final field goal kick of the game... with 3.5 secs left on the clock... it's up... it has the distance...OH MY GOD IT HIT THE CROSSBAR..."


----------



## PhotonFanatic

Can anyone say for sure that a 970 (or even a 980) could ever make adequate use of 8GB of VRAM? What is the determining factor for how much VRAM a given card can really use?


----------



## skupples

Quote:


> Originally Posted by *PhotonFanatic*
> 
> Can anyone say for sure that a 970 (or even a 980) card could ever even make adequate use of 8Gb of vram? What is the determining factor for how much vram a given card can really use?


That's a hard question to answer.

The old "680 can't even use the 2 extra GB on the 4GB model because of bandwidth" saying that we used to always hear was kinda true, and kinda false, at the same time.

Either way, something tells me the 980/970 would run into core power issues long before fully saturating the memory, given the way modern games utilize high-resolution texture streaming.


----------



## Blameless

Quote:


> Originally Posted by *Forceman*
> 
> I'm not disputing that, just repeating what Nvidia said. I know what Anand wrote so I asked SKYMTL and he said the engineers told them it is possible to get the full 224 GB/s in certain scenarios, and so they didn't need to change the numbers. Whether that can actually ever happen in real life is a different story.


Yeah, I was agreeing with you.

The theoretical 224GB/s is possible if the 3.5GiB segment is being read while the 512MiB segment is being written, or vice versa. So, strictly speaking NVIDIA is not wrong, just uselessly pedantic.


----------



## Forceman

Quote:


> Originally Posted by *Blameless*
> 
> Yeah, I was agreeing with you.
> 
> The theoretical 224GB/s is possible if the 3.5GiB segment is being read while the 512MiB segment is being written, or vice versa. So, strictly speaking NVIDIA is not wrong, *just uselessly pedantic.*


In the finest marketing tradition.


----------



## mtcn77

Quote:


> Originally Posted by *Blameless*
> 
> Yeah, I was agreeing with you.
> 
> The theoretical *224GB/s* is possible if the 3.5GiB segment is being read while the 512MiB segment is being written, or vice versa. So, strictly speaking NVIDIA is not wrong, just uselessly pedantic.


If we are focusing on the 7 segments remaining, the bandwidth is 196 GB/s, afaik.


----------



## Blameless

Quote:


> Originally Posted by *mtcn77*
> 
> If we are focusing on the 7 segments remaining, the bandwidth is 196 GB/s, afaik.


We are focusing on the maximum theoretical bandwidth achievable.

All segments can be accessed simultaneously, as long as the 512MiB segment connected to the partially enabled ROP/memory controller isn't doing the same sort of operation (read or write) as the rest of the memory.

So, if the 3.5GiB segment is being read (196GB/s peak, as you say), the 512MiB remainder can actually be written to at the same time (28GB/s here), for a total peak theoretical I/O of 224GB/s as NVIDIA has listed.
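That split can be sanity-checked with back-of-envelope arithmetic; a minimal sketch, assuming the stock 970/980 memory configuration (a 256-bit bus, i.e. eight 32-bit GDDR5 channels at 7 Gbps effective per pin):

```python
# Back-of-envelope check of the GTX 970 segment bandwidth figures.
# Assumes the stock memory spec: 256-bit bus = eight 32-bit channels
# of GDDR5 at 7 Gbps effective per pin.

channels = 8
bits_per_channel = 32
gbps_per_pin = 7  # effective GDDR5 data rate per pin

total_gbs = channels * bits_per_channel * gbps_per_pin / 8  # bits -> bytes
per_channel_gbs = total_gbs / channels

fast_segment_gbs = 7 * per_channel_gbs  # 3.5GiB segment: 7 channels
slow_segment_gbs = 1 * per_channel_gbs  # 512MiB segment: 1 channel

print(total_gbs)         # 224.0
print(fast_segment_gbs)  # 196.0
print(slow_segment_gbs)  # 28.0
```

So reading the 3.5GiB segment (196 GB/s) while simultaneously writing the 512MiB segment (28 GB/s) is the only way the advertised 224 GB/s aggregate is even theoretically reachable; same-direction access tops out at 196 GB/s.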


----------



## mtcn77

Quote:


> Originally Posted by *PhotonFanatic*
> 
> Can anyone say for sure that a 970 (or even a 980) card could ever even make adequate use of 8Gb of vram? What is the determining factor for how much vram a given card can really use?


Bandwidth & latency. I ran the numbers the other day: at 224GB/s it takes roughly 18 ms to read all 4 GB on a GTX 980 once. Taking that as a reference, reading 8 GB once would take about 36 ms - latency would be higher still. As such, the 8GB Sapphire 290X Vapor-X, with its 320 GB/s memory interface, should take about 30% less time to read all 8 GB (36 ms → 25 ms). So potentially the 8GB 290X's shader pipelines will have a much cozier time with fewer memory stalls. Having said that, the Nvidia Maxwell architecture keeps its shaders efficiently utilized via a 2MB L2 cache with ~558GB/s of bandwidth, even though the main memory interface is considerably slower. Due to the allocation size advantage, Maxwell's L2 cache should miss roughly 20% less often than Kepler's, and it takes Maxwell on the order of 10^4 fills to sample 8GB of VRAM, while the AMD GCN architecture with 1MB of L2 would need twice as many fills to sample all 8GB, with a potentially twice-higher control penalty.
We could also suppose Maxwell's cache would miss about 28% less often than the R9 290X's.
In summary, I think Maxwell has a 39.75% better cache feeding a 28% better shader processor while sampling from considerably slower memory. All things accounted for, I'd put the GTX 980's chances in an all-out 8GB anti-aliasing contest at a 6.3% deficit.
Quote:


> When reading, the cost of a cache hit is roughly the time to access an entry in the cache. The miss penalty is the additional cost of replacing a cache line with one containing the desired data.
> 
> (Access time) = (hit cost) + (miss rate)*(miss penalty)
> =(Fast memory access time) + (miss rate)*(slow memory access time)
> Note that the approximation is an underestimate - control costs have been left out. Also note that only one word is being loaded from the faster memory while a whole cache block's worth of data is being loaded from the slower memory.
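As a back-of-envelope sketch of both calculations in this post (the full-VRAM read time at peak bandwidth, and the quoted access-time formula), assuming ideal sequential streaming and using made-up hit/miss numbers purely for illustration:

```python
# Time to stream the entire VRAM once at peak bandwidth
# (idealized sequential read, no overhead).
def full_read_ms(vram_gb, bandwidth_gbs):
    return vram_gb / bandwidth_gbs * 1000  # seconds -> milliseconds

gtx980_ms = full_read_ms(4, 224)  # ~17.9 ms for 4 GB at 224 GB/s
r290x_ms = full_read_ms(8, 320)   # 25 ms for 8 GB at 320 GB/s

# Average memory access time (AMAT), per the quoted formula:
#   access time = (hit cost) + (miss rate) * (miss penalty)
def amat(hit_cost_ns, miss_rate, miss_penalty_ns):
    return hit_cost_ns + miss_rate * miss_penalty_ns

# Placeholder numbers, purely for illustration:
example_ns = amat(hit_cost_ns=10, miss_rate=0.05, miss_penalty_ns=300)  # ≈ 25 ns
```

A lower miss rate buys more than it looks: halving the illustrative miss rate here (0.05 → 0.025) cuts AMAT from ≈25 ns to ≈17.5 ns, which is the sense in which a larger L2 makes the "slow" main memory matter less.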


----------



## Vesku

Quote:


> Originally Posted by *battleaxe*
> 
> Someone please lock this ridiculous thread. All points have been made. What else could there possibly be to say on the subject?


Well there are people like the ENB dev investigating further.

https://www.facebook.com/enbfx?_fb_noscript=1

But I think we can just make a new thread if they find anything interesting.


----------



## skupples

the ENB dev seems to be on crack...


----------



## The Robot

Quote:


> Originally Posted by *Vesku*
> 
> Well there are people like the ENB dev investigating further.
> 
> https://www.facebook.com/enbfx?_fb_noscript=1
> 
> But I think we can just make a new thread if they find anything interesting.


So basically he says that the last 512MB is completely inaccessible. Which is not true; if it were, games would just drop to single-digit fps when hitting the 3.5GB mark.


----------



## Vesku

Quote:


> Originally Posted by *The Robot*
> 
> So basically he says that the last 512mb is completely inaccessible. Which is not true, if it was games would just drop to single-digit fps when hitting 3.5gb mark.


Not necessarily, he even points out that it's the very rare game that needs to access the entire VRAM simultaneously. It's up to him to prove it though. Right now it's just something he suspects.

If he's correct about the behavior maybe Nvidia mirrors that slow 512MB in RAM so that it can choose between the RAM version and the VRAM version depending on which one will be faster. In other words if accessing the slow VRAM would interrupt stuff in the fast 3.5GB get it from RAM instead since that won't interfere with the 3.5GB operations.


----------



## Majin SSJ Eric

I don't personally understand ANYONE defending Nvidia on this. I mean, why? They knowingly deceived reviewers and consumers, and they ought to be held accountable, even if it ends up just being with bad PR on forums like OCN. Defending them by saying that the performance hit is minimal, or that they are "technically" correct in their bogus marketing claims, seems nothing short of blatant fanboyism to me. They did what they did, and they knew they'd be called out on it if they were up front, so they only released deceptive half-truths and withheld critical details hoping nobody would ever find out - but they got caught and it is blowing up in their faces. There is no reason to defend this kind of thing other than pure brand loyalty. To me it doesn't make any difference whether it's AMD guys calling them out or people who don't own 970's at all; the fact is that Nvidia lied (or at the very least were extremely deceptive with the "truth") and they are the ones to blame here, not those who are calling them out on it (regardless of whatever their personal motivations may be).

Another popular tactic when attempting to defend Nvidia in this mess is to deflect blame to AMD and start listing off the various deceptive things they've done in the past, but anything AMD may have done doesn't change the facts of this story, so it's really irrelevant to me. Suffice it to say that both companies have played fast and loose with the truth when it suited them, and the only real victims every time are us, the consumers. The only hope we ever have of stopping this kind of flagrant fraud is by exposing it whenever either side is guilty of it and spreading awareness to the public as long and loud as we possibly can...


----------



## JackMex

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I don't personally understand ANYONE defending Nvidia on this. I mean, why? They knowingly deceived the reviewers and the consumers and they ought to be held accountable, even if it ends up just being with bad PR on forums like OCN. Defending them by saying that the performance hit is minimal or that they are "technically" correct in their bogus marketing claims seems nothing short of blatant fanboyism, to me anyway. They did what they did and they knew they'd be called out on it if they were up front so they only released deceptive half truths and withheld critical details hoping nobody would ever find out but they got caught and it is blowing up in their faces. There is no reason to defend this kind of thing other than pure brand loyalty. To me it doesn't make any difference whether its AMD guys calling them out or people who don't own 970's at all; the fact is that Nvidia lied (or at the very least were extremely deceptive with the "truth") and they are the ones to blame here, not those who are calling them out on it (regardless of whatever their personal motivations may be).
> 
> Another popular tactic when attempting to defend Nvidia in this mess is to deflect blame to AMD and start listing off the various deceptive things they've done in the past, but anything AMD may have done doesn't change the facts of this story so its really irrelevant to me. Suffice it to say that both companies have played fast and loose with the truth when it suited them and the only real victims every time are us, the consumers. The only hope we ever have of stopping this kind of flagrant fraud is by exposing it whenever either side is guilty of it and spreading awareness to the public about it as long and loud as we possibly can...


No arguments with you here, boss.


----------



## gamervivek

Oh Boris, you too. I can see why some people think Nvidia aren't that bad, considering how dual-GPU cards are advertised as having twice the VRAM, though one could make the case that since both companies do it, they are equal there.

It's only surprising that it took so long for it to blow up; the 52-effective-ROPs explanation from techreport still didn't explain why the GTX 970 was that far behind the 980. AMD could've uncovered it if they went digging, or maybe they did.


----------



## Art Vanelay

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I don't personally understand ANYONE defending Nvidia on this. I mean, why? They knowingly deceived the reviewers and the consumers and they ought to be held accountable, even if it ends up just being with bad PR on forums like OCN. Defending them by saying that the performance hit is minimal or that they are "technically" correct in their bogus marketing claims seems nothing short of blatant fanboyism, to me anyway. They did what they did and they knew they'd be called out on it if they were up front so they only released deceptive half truths and withheld critical details hoping nobody would ever find out but they got caught and it is blowing up in their faces. There is no reason to defend this kind of thing other than pure brand loyalty. To me it doesn't make any difference whether its AMD guys calling them out or people who don't own 970's at all; the fact is that Nvidia lied (or at the very least were extremely deceptive with the "truth") and they are the ones to blame here, not those who are calling them out on it (regardless of whatever their personal motivations may be).
> 
> Another popular tactic when attempting to defend Nvidia in this mess is to deflect blame to AMD and start listing off the various deceptive things they've done in the past, but anything AMD may have done doesn't change the facts of this story so its really irrelevant to me. Suffice it to say that both companies have played fast and loose with the truth when it suited them and the only real victims every time are us, the consumers. The only hope we ever have of stopping this kind of flagrant fraud is by exposing it whenever either side is guilty of it and spreading awareness to the public about it as long and loud as we possibly can...


A lot of people aren't defending Nvidia, but just think the outcry over this has been significantly blown out of proportion. What Nvidia did was bad, but it's not really at the point where everyone should be returning their cards out of spite. I really doubt this many people were running close to 4GB of VRAM.


----------



## Noufel

Quote:


> Originally Posted by *Art Vanelay*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Majin SSJ Eric*
> 
> I don't personally understand ANYONE defending Nvidia on this. I mean, why? They knowingly deceived the reviewers and the consumers and they ought to be held accountable, even if it ends up just being with bad PR on forums like OCN. Defending them by saying that the performance hit is minimal or that they are "technically" correct in their bogus marketing claims seems nothing short of blatant fanboyism, to me anyway. They did what they did and they knew they'd be called out on it if they were up front so they only released deceptive half truths and withheld critical details hoping nobody would ever find out but they got caught and it is blowing up in their faces. There is no reason to defend this kind of thing other than pure brand loyalty. To me it doesn't make any difference whether its AMD guys calling them out or people who don't own 970's at all; the fact is that Nvidia lied (or at the very least were extremely deceptive with the "truth") and they are the ones to blame here, not those who are calling them out on it (regardless of whatever their personal motivations may be).
> 
> Another popular tactic when attempting to defend Nvidia in this mess is to deflect blame to AMD and start listing off the various deceptive things they've done in the past, but anything AMD may have done doesn't change the facts of this story so its really irrelevant to me. Suffice it to say that both companies have played fast and loose with the truth when it suited them and the only real victims every time are us, the consumers. The only hope we ever have of stopping this kind of flagrant fraud is by exposing it whenever either side is guilty of it and spreading awareness to the public about it as long and loud as we possibly can...
> 
> 
> 
> A lot of people aren't defending Nvidia, but just think the outcry over this has been significantly blown out of proportion. What Nvidia did was bad, but it's not really at the point where everyone should be returning their cards out of spite. I really doubt this many people were running close to 4GB of RAM.
Click to expand...

The 970 is still a beast at 1080p and 1440p resolutions; even at 4K without AA and with a little tweaking they are good performers for the price, and the power efficiency (I hate to say that) is an important factor for the majority of people outside OCN, I think.


----------



## Clocknut

Quote:


> Originally Posted by *Art Vanelay*
> 
> A lot of people aren't defending Nvidia, but just think the outcry over this has been significantly blown out of proportion. What Nvidia did was bad, but it's not really at the point where everyone should be returning their cards out of spite. I really doubt this many people were running close to 4GB of RAM.


The owner of a 970 does reserve the right to return the falsely advertised GPU (if he wishes to), whether it affects him or not. A misleading spec is a misleading spec.


----------



## Unknownm

Would it be a smoother gameplay experience if we flashed the 970 to only use 3.5GB, allowing us to use the extra 500MB as p0rn storage or temp storage?

Kinda like the PS3: Linux didn't have official GPU drivers, but it was able to read/write to GPU RAM and use it as storage.


----------



## DIYDeath

Quote:


> Originally Posted by *Clocknut*
> 
> the owner of 970 does reserve the right to return(if he wish to) the false advertisement GPU whether it affect them or not. A miss leading spec is a miss leading spec.


Not only that, but this issue has a very high chance of becoming a problem for 970 users at one point or another.

For example, what if in 1-2 years the minimum spec for the average game is 4GB of VRAM? The 970 was supposed to handle that, but it can't, so users will have two choices: suck it up and play with sub-1080p resolutions and low shadow quality to save on VRAM, or upgrade 1-2 years early.

That's why I think it's a big deal - not because 0.5GB is a lot or even matters for most people, but because it will eventually matter to every 970 owner; it's just a matter of time.


----------



## Sisaroth

Quote:


> Originally Posted by *DIYDeath*
> 
> Not only that but this issue has a very high chance of becoming a problem for 970 users at one point or another.
> 
> For example, what if in 1-2 years the min specs for the avg game is 4gb of vram? The 970 was supposed to handle that but it can't so the users will have 2 choices: suck it up and play with sub 1080p resolutions and low shadow quality to save on vram or upgrade 1-2 years early.
> 
> Thats why I think its a big deal, not because 0.5gb is a lot or even matters for most people but because it eventually will matter to every 970 owner, its just a matter of time.


Well, think of it the other way. If the GTX 970 needed to be able to access the full 4 GB, it would have a higher testing requirement, meaning much lower yields, meaning much higher prices. Just to make my example clear, let's assume the GTX 980 is $500 and the GTX 970 is $300, and that a GTX 970 with a "true" 4 GB would have yields so low it would need to be priced at $400 to be profitable.

What would be the better deal? A GTX 970 with 3.5 GB at $300, or a GTX 970 with 4GB and higher bandwidth (that it can't actually use because of the disabled SMMs) at $400, all other specs the same?

I said it before, but if I were upgrading my GPU now I would still buy the GTX 970. It's still a great card with great price/performance and great performance-per-watt ratios.


----------



## Adglu

Quote:


> Originally Posted by *The Robot*
> 
> So basically he says that the last 512mb is completely inaccessible. Which is not true, if it was games would just drop to single-digit fps when hitting 3.5gb mark.


But it does explain why (according to reddit) games tend to crash or hard-cap VRAM at 3.5GB when the pagefile is disabled.


----------



## Cyro999

Quote:


> Originally Posted by *Forceman*
> 
> And according to them, there are still scenarios where that bandwidth can be achieved (however unlikely those are). I think SKYMTL posted their quote in one of these threads.


It has been said that you can't read from the 7th and 8th chips at the same time, therefore that bandwidth is impossible for reads alone; the max you can get is 7/8ths of it.
Quote:


> Originally Posted by *Adglu*
> 
> But it does explain why (acording to reddit) games tend to crash or hard cap vram at 3.5 when pagefile is disabled


I can confirm 100% that this happens. With several games, at least, and current drivers.


----------



## N0ID

Quote:


> Originally Posted by *Cyro999*
> 
> It has been said that you can't read from the 7'th and 8'th chip at the same time, therefore that bandwidth is impossible, the max that you can get is 7/8'ths of it
> I can confirm 100% that this happens. With several games, at least, and current drivers.


I too can confirm that this happens. I experience most crashes running SoM though.


----------



## Wasupwitdat1

Quote:


> Originally Posted by *Noufel*
> 
> the 970 is still a beast at 1080 and 1440 resolutions, even at 4k without AA and little tweaking they are good performers for the price and the power efficiency (i hate to say that) is an important factor for the majority of people out of OCN i think


No, it's not. I bought an NVIDIA reference card from Best Buy last week, and I noticed problems with my own eyes immediately when I tried the DSR functionality. That card has been returned. I can't believe that a card that runs like this was even brought to market.


----------



## Yungbenny911

When I tested my 970's vs my 980's @ 4K res, the 970 had this VRAM issue everyone is blowing up, and still performed impressively against the 980's. This is just a stupid argument, and anyone returning their 970's because of 0.5GB of VRAM has definitely lost their mind - but please do return them, so I can buy more for cheap lol. I'm not saying what Nvidia knowingly or unknowingly did is right in any way, but the gaming performance of the 970 is outstanding for its price. If you don't agree, then you're either just a hater, or a cry baby...

GTX 980 SLI - 1466Mhz (core), 2004Mhz (mem) - Air cooling
GTX 970 SLI - 1469Mhz (core), 2004Mhz (mem) - Air cooling

*Benchmarks:*

Spoiler: More 4K Benchmarks


----------



## gamervivek

It can get that bandwidth if it is reading from one segment and writing to the other at the same time. Very unlikely, but Nvidia could perhaps conjure up a program that does it. The real-world bandwidth is lower, of course; if you look at techreport's 52-effective-ROPs article, the 970 performs worse than you'd expect from the lack of ROPs alone on that bandwidth-constrained test - despite the fact that they used the Strix model of the 970, clocked a bit higher than the 980 and presumably boosting even higher.

http://techreport.com/blog/27143/here-another-reason-the-geforce-gtx-970-is-slower-than-the-gtx-980


----------



## Noufel

Quote:


> Originally Posted by *Wasupwitdat1*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Noufel*
> 
> the 970 is still a beast at 1080 and 1440 resolutions, even at 4k without AA and little tweaking they are good performers for the price and the power efficiency (i hate to say that) is an important factor for the majority of people out of OCN i think
> 
> 
> 
> No, it's not. I bought a NVidia Reference card from best Buy last week and I noticed problems with my own eyes immediately when I tried the DSR functionality. That card has been returned. I can't believe that a card that runs like this was even brought to market.
Click to expand...

That confirms what I've said: the 970 is a 1080p card.


----------



## skupples

What part of GK104 replacement do people not get? A few % faster than 780 with skimped bandwidth.


----------



## Usario

Quote:


> Originally Posted by *Sisaroth*
> 
> Well, think of it the other way. If the GTX 970 needed to be able to access the full 4 GB it would have higher testing requirement, meaning much lower yields meaning much higher prices. Just to make my example clear lets assume GTX 980 is 500$ and GTX 970 is 300$. Let's say the GTX 970 with "true" 4 GB would have yields so low that it would needed to be priced at 400$ to be profitable.
> 
> What would be the best deal? GTX 970 with 3.5 GB at 300$ or GTX 970 with 4GB and higher bandwidth (that it can't actually use because of disabled SMMs) at 400$ all other specs the same.
> 
> I said it before, but if i would upgrade my GPU now i would still buy the GTX 970. It's still a great card with great price/performance and great wattage/performance ratios.


I don't think the bin would be any different, because the blocks in question are only "partially disabled"... though I could be wrong there. This does seem purely for market segmentation, to ensure that NVIDIA would still have enough buyers at the $500 price point... nothing wrong with that, until you advertise falsely.

Now you're just saying that a proper 4GB 970 would cost too much to produce, which even if true has absolutely nothing to do with the fact that NVIDIA misrepresented their product and lied to us.

Your claim that a 970 couldn't "actually use" 4GB makes no sense. 970s fitted with denser VRAM chips could easily address well over 4GB.


----------



## SKYMTL

Quote:


> Originally Posted by *Unknownm*
> 
> Would it be a smoother gameplay experience if we flashed the 970gtx to only use 3.5GB.


No, and this is a massively misunderstood subject that keeps coming up.


----------



## revro

The first thing I noticed on the GTX 970 was that it took some time to load textures in Star Citizen, unlike my former 780. Oh well, I am throwing in the towel and waiting it out. Even if I could get a refund, what would I buy? A 290X, mere months before the 390X is released? Hardly...

Well, since I am stuck on 1440p till my monitor goes out, and that's probably 10 years from now, I am keeping the 970 for the next 2 years.


----------



## Nestala

The retailer where I bought my Gigabyte GTX 970 Gaming G1 agreed to a full money-back return.
I'm going to send my GTX 970 in and put my old HD7950 back into my rig, and just get the 380X or the 390X when it releases. Better performance, better for 4K gaming (seeing as I got a new 4K TV that I sometimes want to play single-player games on).
Big hit on performance or not, I don't want a butchered product.


----------



## Rahldrac

Quote:


> Originally Posted by *Nestala*
> 
> My merchant where I bought my Gigabyte GTX 970 Gaming G1 agreed to a full money back exchange.
> I'm going to send my GTX 970 in and put my old HD7950 back into my rig.
> I'm just going to get the 380X or the 390X when it releases. Better performance, better for 4k gaming (seeing as I got a new 4k TV where I sometimes want to play single player games on).
> Big hit on performance or not, I don't want a butchered product.


I wish I still had an old GPU to put in, and wish I hadn't bought water blocks for my 970s. I would have done exactly the same.
I guess very few stores will agree to some kind of in-store credit instead.


----------



## skupples

Quote:


> Originally Posted by *Usario*
> 
> I don't think the bin would be any different because the blocks in question are only "partially disabled"... though I could be wrong there, but this does seem purely for market segmentation and to ensure that NVIDIA would still have enough buyers at the $500 price point... nothing wrong with that, until you advertise falsely...
> 
> Now you're just saying that a proper 4GB 970 would cost too much to produce, which even if perhaps true has absolutely nothing to do with the fact that NVIDIA was misrepresenting their product and lied to us.
> 
> Your claim that a 970 couldn't "actually use" 4GB makes no sense. 970s fitted with denser VRAM chips could easily address well over 4GB.


Nah, just SKU cannibalization.


----------



## Clocknut

Quote:


> Originally Posted by *Usario*
> 
> I don't think the bin would be any different because the blocks in question are only "partially disabled"... though I could be wrong there, but this does seem purely for market segmentation and to ensure that NVIDIA would still have enough buyers at the $500 price point... nothing wrong with that, until you advertise falsely...
> 
> Now you're just saying that a proper 4GB 970 would cost too much to produce, which even if perhaps true has absolutely nothing to do with the fact that NVIDIA was misrepresenting their product and lied to us.
> 
> Your claim that a 970 couldn't "actually use" 4GB makes no sense. 970s fitted with denser VRAM chips could easily address well over 4GB.


Well, they could have just disabled the entire 8th segment and sold the 970 with 7 chips/3.5GB; they could also have priced it even lower, since they'd save one GDDR5 chip plus a simpler 224-bit PCB.

But no, the marketing team decided it would suck to sell a 3.5GB card, so they opted for a partial 4GB that looks like a full-speed 4GB card.

This is the same as all the 192-bit Nvidia cards before this. The marketing team just took over the whole VRAM thing.
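For reference, the bandwidth arithmetic behind that 7-chip idea is simple. A quick sketch, assuming stock 7 Gbps effective GDDR5 on 32-bit chips (the commonly quoted 970 figures; treat the numbers as assumptions):

```python
# Peak GDDR5 bandwidth = (bus width in bits / 8) bytes * data rate.
# Assumed: 7 Gbps effective data rate, 32 bits per memory chip.
def gddr5_bandwidth_gbs(bus_width_bits, data_rate_gbps=7.0):
    """Peak bandwidth in GB/s for a given memory bus width."""
    return bus_width_bits / 8 * data_rate_gbps

print(gddr5_bandwidth_gbs(256))  # 224.0 GB/s -- full 8-chip card as advertised
print(gddr5_bandwidth_gbs(224))  # 196.0 GB/s -- 7 chips / 3.5GB segment
print(gddr5_bandwidth_gbs(32))   #  28.0 GB/s -- the lone 0.5GB segment
```

So a straight 7-chip, 224-bit card would give up only about 12% of peak bandwidth, while the stranded eighth chip reportedly runs at a small fraction of that.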


----------



## provost

Quote:


> Originally Posted by *Clocknut*
> 
> Well, they could have just disabled the entire 8th segment and sold the 970 with 7 chips/3.5GB; they could also have priced it even lower, since they'd save one GDDR5 chip plus a simpler 224-bit PCB.
> 
> But no, the marketing team decided it would suck to sell a 3.5GB card, so they opted for a partial 4GB that looks like a full-speed 4GB card.
> 
> This is the same as all the 192-bit Nvidia cards before this. The marketing team just took over the whole VRAM thing.


I doubt that marketing had anything to do with it, at least not the front-line folks. But they are certainly bearing the wrath of it across various channels.

edit: gosh, need to brush up on my iphone typing skills, not sure how that earlier quote even got in there....grrrrrrrrrrr..grrrrr


----------



## criminal

Quote:


> Originally Posted by *Yungbenny911*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> When I tested my 970s vs my 980s @ 4K res, the 970 had this VRAM issue everyone is blowing up and still performed impressively against the 980s. This is just a stupid argument, and anyone returning their 970s because of 0.5GB of VRAM has definitely lost their mind, but please do return them so I can buy more for cheap lol. I'm not saying what Nvidia knowingly or unknowingly did is right in any way, but the gaming performance of the 970 is outstanding for its price. If you don't agree, then you're either just a hater or a crybaby...
> 
> GTX 980 SLI - 1466Mhz (core), 2004Mhz (mem) - Air cooling
> GTX 970 SLI - 1469Mhz (core), 2004Mhz (mem) - Air cooling
> 
> *Benchmarks:*
> 
> Spoiler: More 4K Benchmarks


That is funny; some of your tests show a discrepancy between what you are saying and what you can actually see in those graphs. A 970 is "typically" only 10-15% slower than a 980, but in some of those tests you can see over a 20% difference as you turn up the settings. That tells me something is going on with the 970 that doesn't happen on the 980.

Oops... never mind. Don't know what I was looking at.

Still, people who bought a 970 shouldn't be made to feel stupid for returning their cards. Nvidia lied about the specs, and user pushback by returning 970s is the only way for them to learn. Unless those users turn around and buy a 980... lol. Like Field of Dreams... gimp them and they will come.


----------



## sugarhell

Quote:


> Originally Posted by *criminal*
> 
> That is funny, some of your test show a discrepancy between what you are saying and what you can actually see from those graphs. A 970 is "typically" only 10-15% slower than a 980, but in some of those tests you can see over 20% difference as you turn up the settings. That tells me that something is going on with the 970 that doesn't happen on the 980.


Has no one checked the minimums when you increase the AA?


----------



## criminal

Quote:


> Originally Posted by *sugarhell*
> 
> None checks the minimums when you increase the AA?


Yeah, the minimums on Hitman at 4x MSAA and on Watch Dogs are pretty bad on the 970. I guess there are +20% differences there, but I was talking more about averages.


----------



## CaptainZombie

I just got off the phone with a manager at Newegg, and he said they should have a resolution by mid-week, if not sooner, on how they are going to handle this situation for customers who bought a 970. He said they have had this happen with other products in the past and are going to handle it very similarly. He gave me his email and took my cell number to contact me directly; I guess they are getting flooded with these support calls. He told me one customer said they are going to contact the FTC and BBB over this. I told him I'm open to a refund or an exchange, since I would be getting another card, and that I'd even jump to the 980, which he said he might be able to do something with on pricing and/or shipping.


----------



## SKYMTL

Quote:


> That is funny, some of your test show a discrepancy between what you are saying and what you can actually see from those graphs. A 970 is "typically" only 10-15% slower than a 980, but in some of those tests you can see over 20% difference as you turn up the settings. That tells me that something is going on with the 970 that doesn't happen on the 980.


In the titles you are picking out, it could very well be SMM or TMU limitations rather than memory.


----------



## notarat

Quote:


> Originally Posted by *skupples*
> 
> I don't think they will. I think only the 980 will get an 8GB model, if that.
> 
> 970 8GB? would it be a 7gb/1gb card? or a 7.5giggle card?


Nah...It'll be 3.5GB at 192GB/Sec and 4.5GB at 20GB/sec lol


----------



## djsi38t

Amazing how much of a difference 512MB of VRAM makes.

Equally amazing how much cheaper the 970 is compared to the 980 (in some cases half the price).

Boy, did Nvidia screw this one up. Especially as far as public relations go.


----------



## provost

Quote:


> Originally Posted by *djsi38t*
> 
> Amazing how much of a difference 512mb of v ram makes.
> 
> Equally amazing how much cheaper the 970 is compared to the 980(in some cases half the price).
> 
> Boy did nvidia screw this one up.Especially as far as public relations go.


Nvidia miscalculated, and started treating its customers as a business instead of as hobbyists... lol

A business lesson they have been reminded of, indeed, depending on how the next 2-3 quarters turn out.


----------



## Dry Bonez

The sad part about all of this is...
Imagine if no one had discovered this! Do you guys think Nvidia would have come forward about it? I doubt it. Good thing I cut off my hands (not literally) before buying. I will stick with my 580 until next gen.


----------



## sok0

I'm just hoping I can grab another 970 on the cheap when everyone decides to liquidate over 0.5GB of VRAM. I'll take SLI 970s for $500 over a 980 any day of the week.


----------



## thebski

Quote:


> Originally Posted by *Dry Bonez*
> 
> Do you guys think Nvidia would have came forward about this?


Zero chance.

They made a comment when they launched the 960 that they had sold a million 970s and 980s; I'm guessing 800-900K of those were 970s. They weren't going to jeopardize that over something people didn't realize was happening.


----------



## iSlayer

Quote:


> Originally Posted by *skupples*
> 
> I highly doubt gsync will ever sell enough units to turn a healthy profit off of the R&D cycle, but I guess we won't know for sure until someone gets ahold of an earnings report.
> 
> I mean, just do some math. They aren't exactly selling like hot cakes or available enough to sell like hot cakes.
> 
> That might change in 2015 but as of right now I doubt they've recouped their costs to bring to market.


G-sync is partially subsidized by the brand loyalty it purchases. It need not ever turn a profit on modules sold so long as it returns a profit on GPUs sold.
Quote:


> Originally Posted by *PureBlackFire*
> 
> it's just crazy how the people going the hardest in attack/defense of nvidia in this case are not 970 owners.
> my thoughts exactly.


Everyone else is busy doing our jobs.


----------



## nyxagamemnon

Incoming: a 970 B1 revision with the issue fixed, 4GB fully addressable, no more segmentation.


----------



## Yungbenny911

Anyone stuck with full waterblocks and backplates for the Gigabyte G1 gaming 970 should send it to my Inbox lol. I'll only take 20% off


----------



## SKYMTL

Quote:


> Originally Posted by *nyxagamemnon*
> 
> Inc 970 b1 revision with the issue fixed 4gb fully addressable no more segmentation.


From my understanding it isn't that easy. Had the ROP and L2 partition on the 8th stride been enabled, yields would likely be in the toilet.


----------



## tpi2007

Quote:


> Originally Posted by *nyxagamemnon*
> 
> Inc 970 b1 revision with the issue fixed 4gb fully addressable no more segmentation.


Given their current 'full damage control mode' stance, I'd say not a chance; that would imply admitting that there is something wrong with the 970.

What I can see happening is the following:

- 970 Ti with 1792 CUDA cores, full L2 cache and 64 ROPs, for $10 MSRP more than the 970's launch price (the price has to be different so they can deflect criticism more easily), for practically the same performance as the 980. It will make the 980 even more niche than it already is, but that's the price to pay to win back customers' goodwill.

- drop the 970's price to $290.


----------



## AngryGoldfish

Quote:


> Originally Posted by *tpi2007*
> 
> From their current 'full damage control mode' stance, I'd say not a chance, that would imply admitting that there is something wrong with the 970.
> 
> What I can see happening is the following:
> 
> - 970 Ti with 1792 CUDA cores, full L2 cache and 64 ROPs, for $10 MSRP more than the 970's launch price (price has to be different so they can deflect criticism easier), for practically the same performance of the 980. Will make the 980 even more niche than it already is, but it's the price to pay to get the customer's goodwill back.
> 
> - drop the 970's price to $290.


But anyone who knows about this issue (potentially upwards of 200,000 people, based on the number of hits these articles and videos have been getting) and cares about it will know that the 970 Ti is an un-gimped 970. Nvidia would have to rely on those who are unaware of the issue to boost sales, which means they'd still have to edit the 970's description to make it worth consumers' while to bump up to the 970 Ti over the original 970, in turn admitting defeat.

In my opinion, no matter how you spin it, nVidia will have to admit that the 970 was not what it was professed to be, whether through an updated 'Ti' version or an admission of fault and a partial refund. I'd say that if nVidia does anything to fix this issue, it will be to release a 970 Ti. I doubt it'll happen, as the 980 is and always has been their main consumer-grade gaming GPU, but if nVidia does anything to correct the issue it will be to try and make more money. lol

edit: Or, more likely, they'll just pretend like **** didn't hit the fan and walk away whistling with their hands in their pockets.


----------



## Serandur

Quote:


> Originally Posted by *Yungbenny911*
> 
> When I tested my 970s vs my 980s @ 4K res, the 970 had this VRAM issue everyone is blowing up and still performed impressively against the 980s. This is just a stupid argument, and anyone returning their 970s because of 0.5GB of VRAM has definitely lost their mind, but please do return them so I can buy more for cheap lol. I'm not saying what Nvidia knowingly or unknowingly did is right in any way, but the gaming performance of the 970 is outstanding for its price. If you don't agree, then you're either just a hater or a crybaby...
> 
> GTX 980 SLI - 1466Mhz (core), 2004Mhz (mem) - Air cooling
> GTX 970 SLI - 1469Mhz (core), 2004Mhz (mem) - Air cooling
> 
> *Benchmarks:*
> 
> Spoiler: More 4K Benchmarks


I suppose childish insults to supplement ignorance of what your own data presents is one way to go about discussions. It's been said a million times already that average FPS numbers do not represent inconsistency in performance caused by brief stuttering as cards overfill their VRAM pool. Furthermore, games like Sleeping Dogs, Crysis 3, and Metro are old and noted not to be very VRAM-heavy. In contrast, your Hitman and Watch Dogs results exhibit clear and drastically low drops in minimum FPS well beyond any 20% difference in 970 and 980 GPU power. Toss in Shadow of Mordor, texture-modded Skyrim, Unity, FC 4, Space Engine, Dying Light, etc. and you're likely to see similar disparities. Not even including stuff beyond games.

This is a direct result of frametime spikes and what little is being captured of them by FPS measures which are direct results of a VRAM shortage issue. Nothing else on a video card would explain such a massive minimum FPS gulf and that has very worrying implications for smoothness. If being fully cognizant of my legal rights, the importance of VRAM, and demanding my money back from this scam that has personally affected me in the exact way I've described is tantamount to "losing my mind" and simply being a "hater" (On my own graphics cards I paid nearly $800 for?), then what's your excuse?

Less VRAM is less VRAM; you will and do run into issues when you reach the 970s' 3.5 GB limit, unless actual smoothness and consistency mean absolutely nothing to you.

More details on what's going on:

51 seconds into this video, the cards are hooked up to a second monitor to shoot up VRAM usage; note how horribly the 970 stutters while the 980 simply does not.

https://www.youtube.com/watch?v=MTYd9_fe4iI&feature=youtu.be&t=57s - Note how the game completely _locks up_ on the 970 for several seconds.

Frametimes with SLI above 3.5 GB go insanely high (stutterfest).

I'm sure people will continue to fail to internalize any of this and keep claiming what they want to be true. Denial is a powerful process.

Here are the facts:

Nvidia misrepresented their specifications and violated federal law. Everyone who purchased a 970 with those false specifications in mind is entitled to a refund.

They partitioned the memory, with the last 512 MB being measurably far slower.

970s will try to avoid going past 3.5 GB in many scenarios, whereas 980s will immediately allocate all 4 GB. When a 970 does use the remaining 512 MB, it is in fact using far slower memory.

We have plenty of evidence, including your own, demonstrating what this disparity does to performance (frametime inconsistency, minimum FPS tanking).

Nvidia are in denial of the issue and doing absolutely nothing, so far, for their customers.

Different scenarios will place different demands on VRAM. You *cannot* accurately assume, from a program that doesn't push the 970s' 3.5 GB cap, that all programs will be similarly unaffected.

Average FPS figures fold the brief spikes/stutters into the mean, thereby masking the issue.
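To make that last point concrete, here is a toy calculation with made-up frametimes (none of these numbers come from a real benchmark) showing how a handful of spikes barely dents the average:

```python
# 95 smooth frames at ~60 fps (16.7 ms) plus five 100 ms stutter spikes,
# the kind reported when the 970 spills past its fast 3.5GB segment.
frametimes_ms = [16.7] * 95 + [100.0] * 5

avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
worst_fps = 1000 / max(frametimes_ms)

print(round(avg_fps, 1))    # 47.9 -- the average just looks "a bit slower"
print(round(worst_fps, 1))  # 10.0 -- the spikes themselves feel like lockups
```

An average-FPS bar chart would show ~48 fps and hide the ten-frames-per-second hitches entirely; that is exactly why frametime plots matter here.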


----------



## Xoriam

that video doesn't exist.


----------



## Serandur

Quote:


> Originally Posted by *Xoriam*
> 
> that video doesn't exist.


Sorry about that, changed it to a url link and it works:

https://www.youtube.com/watch?v=MTYd9_fe4iI&feature=youtu.be&t=57s


----------



## Xoriam

Quote:


> Originally Posted by *Serandur*
> 
> Sorry about that, changed it to a url link and it works:
> 
> https://www.youtube.com/watch?v=MTYd9_fe4iI&feature=youtu.be&t=57s


OK, now that is really weird. I've only seen that happen two times since I've had the cards, and that was when I was completely maxed out at the full 4GB and it needed to swap a lot of things in and out.

Two times.

I have, however, seen that sort of thing happen a lot when running on a crappy HDD.
It could be his recording software causing the issue.


----------



## DIYDeath

Quote:


> Originally Posted by *AngryGoldfish*
> 
> edit: Or, more likely, they'll just pretend like **** didn't hit the fan and walk away whistling with their hands in their pockets.


If they do I'll switch to red, even though this issue doesn't affect me. I can't stand by and just let Nvidia screw people over but I also wont be jumping the gun. I'll wait and see what they do to remedy the issue.


----------



## AngryGoldfish

Quote:


> Originally Posted by *Xoriam*
> 
> Ok now that is really weird, I've only see that happen 2 times since I've had the cards and that was when I was completely maxed out at the full 4GB and it needed to swap alot of things in and out.
> 
> 2 times.
> 
> I have however seen that sort of thing happen ALOT when running on a crappy HDD.
> It could be his recording software causing the issue.


I've seen your posts on this issue a lot, and I think you are one of the few not experiencing any issues, which brings into play a whole other set of questions that should have been answered by nVidia a long time ago. It's quite clear that some people are not experiencing many issues or are simply unable to notice them. That is not in any way a dig at your ability to pick up on gaming nuances; what is far more likely is that your card is functioning better for a reason I have yet to see explained.


----------



## Xoriam

Quote:


> Originally Posted by *AngryGoldfish*
> 
> I've seen your posts on this issue a lot and I think you are one of the few that has not experiencing any issues, which also brings into play a whole other set of queries that should have been answered by nVidia a long time ago. It's quite clear that some are not experiencing many issues or are simply unable to notice it. That is not in any a dig at your ability to pick up on gaming nuances. What is far more likely is that your card is functioning better for a reason I have to see explained.


Yeah, I hope my posts don't come off as fanboyish or anything; I'm just posting based on my personal experience so far.
Apart from driver issues and crap game coding, the cards have performed flawlessly for me.
I haven't seen any stuttering caused by the actual hardware apart from, like I said before, when capped straight up at the full 4GB.
Obviously I've seen stuttering in certain situations due to SLI profile issues or the game hating SLI.

I won't deny there is a frame timing issue from what people have posted, even though I haven't been able to pick it out visually, but you know my standpoint on usage above 3.5GB.

Now, on the subject of the spec lies: I have requested a partial refund from my supplier for all three GTX 970s I've purchased, due to them not being described accurately at the time of purchase. Hope they come through for me.


----------



## notarat

Quote:


> Originally Posted by *Xoriam*
> 
> Ok now that is really weird, I've only see that happen 2 times since I've had the cards and that was when I was completely maxed out at the full 4GB and it needed to swap alot of things in and out.
> 
> 2 times.
> 
> *I have however seen that sort of thing happen ALOT when running on a crappy HDD.
> It could be his recording software causing the issue*.


It isn't due to a crappy hard drive, nor is it due to the recording software.


----------



## AngryGoldfish

Quote:


> Originally Posted by *Xoriam*
> 
> Yeah I hope my posts don't come off seeming like Fanboy or anything, I'm just leaving messages based on my personal experience so far.
> Apart from driver issues/ crap game coding, the cards have performed flawless for me.
> I haven't seen any stuttering caused by the actual hardware apart from like I said before Capped staight up the full 4gb.
> Obviously I've seen stuttering in certain situations though due to SLI profile issues/the game hating SLI.
> 
> I won't deny there is a frame timing issue from what people have posted even though I haven't been able to pick it out visually, but you know my standpoint when it comes to the whole usage above 3,5gb.
> 
> Now on the subject of the specs lies, I have requested a partial refund from my supplier for all 3 GTX 970s I've purchased due to them not being described acurrately at the time of purchase. Hope they come through for me.


Oh, no, that's not what I meant, mate. Without sounding patronizing, your comments are really helpful for me personally in drawing a proper conclusion about this whole fiasco. I didn't mean you were fanboying or anything like that, and I believe your findings are relevant. I'm just pointing out the contradictions among the findings so people can see there are those who are not experiencing issues, and that this may be an issue in and of itself.


----------



## Xoriam

Quote:


> Originally Posted by *notarat*
> 
> It isn't due to a crappy hard drive, nor is it due to the recording software.


I wasn't referring to the stuttering in the video of FC4 inside the car.

I was referring to the few-second lockup in this video:
https://www.youtube.com/watch?v=MTYd9_fe4iI&feature=youtu.be&t=57s


----------



## AngryGoldfish

Quote:


> Originally Posted by *DIYDeath*
> 
> If they do I'll switch to red, even though this issue doesn't affect me. I can't stand by and just let Nvidia screw people over but I also wont be jumping the gun. I'll wait and see what they do to remedy the issue.


I won't, not exclusively. I'll probably be replacing my 970 with a 390/390X if the card performs better than the 980, is a little more future-proof (HBM?), and doesn't require water cooling to maintain. But if nothing from AMD interests me, I'll stick with the 970 until nVidia's next high-end GPUs come out. I won't be buying a second 970 or a 980, and I won't be recommending either card to others, as I don't believe they are worth the money. The 980 is overpriced and the 970 is inadequate.


----------



## Xoriam

Quote:


> Originally Posted by *AngryGoldfish*
> 
> I won't, not exclusively. I'll probably be replacing my 970 with a 390/390X if the card performs better than the 980, is a little more future proof (HBM?), and doesn't require water cooling to maintain. But if nothing from AMD interests me, I'll stick with the 970 until nVidia's next high-end GPU's come out. I won't be buying a second 970 or a 980 and I won't be recommending either card to others as I don't believe they are worth their money. The 980 is overpriced and the 970 is inadequate.


Speaking of red team, green team:

This is the first Nvidia card I've installed in my PERSONAL system since the 9800 GTX+ OC.
Just my luck


----------



## iSlayer

Quote:


> Originally Posted by *AngryGoldfish*
> 
> I won't, not exclusively. I'll probably be replacing my 970 with a 390/390X if the card performs better than the 980, is a little more future proof (HBM?), and doesn't require water cooling to maintain. But if nothing from AMD interests me, I'll stick with the 970 until nVidia's next high-end GPU's come out. I won't be buying a second 970 or a 980 and I won't be recommending either card to others as I don't believe they are worth their money. The 980 is overpriced and the 970 is inadequate.


The 380X will most likely be watercooled and have HBM, but only 4GB of VRAM.

Something to consider, though I see the 380X as a strong indicator of future performance (lots of bandwidth potential) as well as a great contender when it drops. I think with the 480X and efficiency improvements on a smaller process, they'll be able to really push performance to the limits, with a good chunk of VRAM (8GB) to make it a de facto 4K champion.

Likewise with Nvidia: the perf and efficiency seem to be there, but without HBM it'll be a little lackluster, especially given the likely prices and the stinginess on bandwidth.

In short, 2016 will be really exciting; the playing field will be more level and prices a lot more competitive. I may just hold onto this 970 till then...


----------



## AngryGoldfish

Quote:


> Originally Posted by *iSlayer*
> 
> 380x will be watercooled most likely and have HBM but only 4GBs of VRAM.
> 
> Something to consider, though I strongly see the 380x being a strong representative of future performance (lots of bandwidth overhead) as well as a great contender when it drops. I think with the 480x and efficiency improvements on a smaller process they'll be able to really push the performance to the limits with a good chunk of VRAM to make it a de facto 4k champion.
> 
> Likewise with Nvidia, the perf and efficiency seems to be there but without HBM it'll be a little lackluster, especially given the likely prices.
> 
> In short, 2016 will be really exciting, the playing field will be more level and prices a lot more competitive.


Well, maybe I could live with water cooling if I do a custom job. That was my initial plan, but I've somewhat gone off it.

4GB of HBM should, theoretically, be plenty for 1080p or 1440p.

I heard nVidia will adopt HBM for themselves since it will become such an integrated design with so much new hardware and software. Or at least they'll create an alternative (but really it's the same thing) to keep competitive whilst not falling behind architecturally.

Quote:


> Originally Posted by *Xoriam*
> 
> Speaking of red team green team.
> 
> This is my frist Nvidia card i've installed into my PERSONAL system since the GTX+ 9800 OC.
> Just my luck


I've never used AMD's cards. I've only ever used nVidia. Driver support was that little bit better back then. It's different nowadays.


----------



## Xoriam

Quote:


> Originally Posted by *AngryGoldfish*
> 
> Well, maybe I could live with water cooling if I do a custom job. That was my initial plan, but I've somewhat gone off it.
> 
> 4GB of HBM should, theoretically, be plenty for 1080p or 1440p.
> 
> I heard nVidia will adopt HBM for themselves since it will become such an integrated design with so much new hardware and software. Or at least they'll create an alternative (but really it's the same thing) to keep competitive whilst not falling behind architecturally.
> I've never used AMD's cards. I've only ever used nVidia. Driver support was that little bit better back then. It's different nowadays.


My initial plan was also to do a watercooling setup for my Gigabyte 970 G1 Gaming SLI.
But then I hit my OC cap without my temps rising that much, and then this story popped up.
That killed that plan.
An AMD purchase was always about the price point for me previously; however, this go-around, where I can get them, the prices are just ridiculous... no thanks.
Quote:


> Originally Posted by *iSlayer*
> 
> 380x will be watercooled most likely and have HBM but only 4GBs of VRAM.
> 
> Something to consider, though I strongly see the 380x being a strong representative of future performance (lots of bandwidth potential) as well as a great contender when it drops. I think with the 480x and efficiency improvements on a smaller process they'll be able to really push the performance to the limits with a good chunk of VRAM (8GB) to make it a de facto 4k champion.
> 
> Likewise with Nvidia, the perf and efficiency seems to be there but without HBM it'll be a little lackluster, especially given the likely prices and the stinginess on bandwidth.
> 
> In short, 2016 will be really exciting, the playing field will be more level and prices a lot more competitive. I may just hold onto this 970 till then...


2015 should be pretty interesting as well.
But yeah, 2016 should be pretty exciting from both standpoints; I can see a lot of stuff I'm interested in happening.


----------



## ZeusHavok

I have recorded massive load lag on my cards when they are using 3.5GB of VRAM. I'll upload it and link it when it's done.

Now the people saying this issue doesn't exist can be quiet. I run the game off an SSD, the cards never peak over 70% usage, and my CPU usage stays within 70%.

As soon as VRAM hits 3.5GB, it lags whenever it has to load in new assets.

The video shows my game performance is fine when VRAM is under 3.5GB, but whenever it goes over, it just freaks out and my frametimes are all over the place.

It starts around 30 seconds in.

http://youtu.be/_ThK5kORb9M
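That pattern (fine until 3.5GB, then spikes) is exactly what percentile-style metrics are built to catch. A minimal sketch of a "1% low" calculation; the function name and the sample log are illustrative, not from any real capture:

```python
def one_percent_low_fps(frametimes_ms):
    """FPS equivalent of the mean of the worst 1% of frametimes."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)   # at least one frame
    return 1000 / (sum(worst[:n]) / n)

# 99 smooth ~60 fps frames plus one 100 ms asset-load hitch
log = [16.7] * 99 + [100.0]
print(round(one_percent_low_fps(log), 1))  # 10.0
```

Run that over captures below and above the 3.5GB mark and the difference shows up immediately, even when average FPS barely moves.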


----------



## paulerxx

Quote:


> Originally Posted by *sugalumps*
> 
> Indeed, the 970 is a budget card no matter how you look at it. You cant expect a mid range budget card to last you through a resolution that is just out and not even the top gpu's can handle it comfortably.
> 
> For some reason the 970 is so hyped that people think it can do anything, people are buying the 970 specifically for 4k thinking they are set for years.


$300+ isn't a budget video card... that's almost the same price as an Xbox One. A 750 Ti for $130 is a budget card; an HD 7870 for $140 is a budget card.


----------



## Xoriam

Quote:


> Originally Posted by *ZeusHavok*
> 
> I have recorded massive load lag on my cards when they are using 3.5gb VRAM. I'll upload it and link when it's done.
> 
> Now the people saying this issue doesn't exist can be quiet. I run the game on an SSD. The cards never peak over 70% usage and my CPU usage is within 70%
> 
> As soon as the VRAM hits 3.5gb it lags when it has to load in new assets.
> 
> The video shows my game performance is fine when the VRAM is under 3.5 but at the times it goes over it just freaks out and my frame times are all over the place.
> 
> around 30 seconds it starts.
> 
> http://youtu.be/_ThK5kORb9M


Dying Light?

That game isn't exactly working optimally at the moment...


----------



## skupples

Quote:


> Originally Posted by *provost*
> 
> Nvidia miscalculated, and started treating its customers as a business, instead of as hobbyist...lol
> 
> A business lesson that has been reminded to them, indeed, depending on how the next 2-3 quarters turn out.


their stock prices are saying

at least they were the other day, haven't checked in today.


----------



## criminal

Quote:


> Originally Posted by *paulerxx*
> 
> $300+ isn't getting a video card on a budget...That's almost the same price as an Xbox One. Now getting a 750 Ti for $130 is a budget card...HD7870 for $140 is a budget card.


You see how Nvidia has now changed what some people consider a budget card? Nvidia's marketing and new product structure at work!

Honestly, though, before the introduction of the Titan, the 970 would have been the $200 card of this generation. But I don't want to get into all that living in the past.


----------



## skupples

full die GM200 = $1,500!

yeah... no thanks.


----------



## ZeusHavok

Quote:


> Originally Posted by *Xoriam*
> 
> Dying light?
> 
> That game isn't exactly working optimally at the moment...


I just posted that as an example. It's easy to laugh it off like that, but my performance is solid until over 3.5GB of VRAM is used. Please explain that, or is laughing it off and avoiding the facts easier?

It happens in any game; only the severity differs. In some games the effect is subtle, while in others it is unplayable.


----------



## criminal

Quote:


> Originally Posted by *skupples*
> 
> full die GM200 = $1,500!
> 
> yeah... no thanks.


Let's hope not.


----------



## skupples

Quote:


> Originally Posted by *criminal*
> 
> Let's hope not.


Eh, couldn't care less what they do.

I'm holding onto the GK110s until 2nd-gen DX12 cards hit the market.

Need to keep putting money away and into the market.


----------



## Attomsk

Why post a video without Afterburner showing VRAM usage?


----------



## ZeusHavok

Quote:


> Originally Posted by *Attomsk*
> 
> Why post a video without Afterburner showing VRAM usage?


Like I posted in my video's comment section: does it look like the game is stuttering at other points? No, it's when the game is using over 3.5GB. I run Afterburner on my second monitor, and until you can record two monitors at a time with ShadowPlay, that's not going to happen.

Do you honestly think, watching that video, that it's anything else? I would gain absolutely nothing from lying.


----------



## skupples

Right, it just seems weird that the VRAM spikes would happen when staring into corners and not while on top of a building. I mean, for me, system usage is at its max when standing on top of a building, since draw distance is the biggest taxer on the graphics list.

Either way, it looks like your classic VRAM hitching.


----------



## Attomsk

Quote:


> Originally Posted by *ZeusHavok*
> 
> Do you honestly think, watching that video that it's anything else? I would gain absolutely nothing from lying.


I'm not saying you are lying. I am saying that the video doesn't show much. It lags at one point. Without some sort of monitoring information on screen, how are viewers supposed to come to their own conclusions?


----------



## Xoriam

Quote:


> Originally Posted by *Attomsk*
> 
> Why post a video without Afterburner showing VRAM usage?


Was about to say this since I forgot to in my previous post.

@Zeushavok, you posted the video without the OSD, so there was absolutely no way to tell you were using 3.5GB+ of VRAM at that moment.
You also didn't say anything about your resolution and settings.
I wasn't laughing off the issue, because it does exist; I'm laughing at how badly Dying Light can act up sometimes, from what my friends have shown me.


----------



## skupples

Quote:


> Originally Posted by *Xoriam*
> 
> Was about to say this since I forgot to in my previous post.
> 
> @Zeushavok you posted the video without the OSD, there was absolutely no way to say you were using 3.5gb+ of vram in that moment.
> you also didn't say anything about your resolution and settings.
> I wasn't laughing off the issue because it does exist, I'm laughing at how bad dying light can act up sometimes from what my friends have shown me.


your friends need to "download" the new update

most issues were attributed to CPU bottlenecking; a CPU core pegged @ 99% can stutter as badly as, or worse than, running out of VRAM.


----------



## Xoriam

Quote:


> Originally Posted by *skupples*
> 
> your friends need to "download" the new update
> 
> most issues were attributed to CPU necking, a CPU core pegging @ 99% can stutter as bad /worse than running out of VRAM.


Ty for the info.

Didn't realize it was even out yet.


----------



## Menta

It's also nice to see all the manufacturers hiding.


----------



## ZeusHavok

Quote:


> Originally Posted by *Attomsk*
> 
> I'm not saying you are lying. I am saying that video doesn't show much. It lags at one point. Without some sort of monitoring information on screen how are viewers supposed to come to their own conclusion?


No, I would say look at my rig and come to an educated conclusion.
Quote:


> Originally Posted by *Xoriam*
> 
> Was about to say this since I forgot to in my previous post.
> 
> @Zeushavok you posted the video without the OSD, there was absolutely no way to say you were using 3.5gb+ of vram in that moment.
> you also didn't say anything about your resolution and settings.
> I wasn't laughing off the issue because it does exist, I'm laughing at how bad dying light can act up sometimes from what my friends have shown me.


The game runs perfectly fine for me with Medium textures (to keep VRAM usage below 3.5GB). As soon as the game uses 3.5GB, the FPS drops to around 40; when it uses any more than that, it just freaks out and the stuttering is astronomical.

Have a look, this time with OSD:

https://www.youtube.com/watch?v=v3vd3PtRvTI

Video might still be rendering.


----------



## Yungbenny911

Quote:


> Originally Posted by *Serandur*
> 
> I suppose childish insults to supplement ignorance of what your own data presents is one way to go about discussions. It's been said a million times already that average FPS numbers do not represent inconsistency in performance caused by brief stuttering as cards overfill their VRAM pool. Furthermore, games like Sleeping Dogs, Crysis 3, and Metro are old and noted not to be very VRAM-heavy. In contrast, your Hitman and Watch Dogs results exhibit clear and drastically low drops in minimum FPS well beyond any 20% difference in 970 and 980 GPU power. Toss in Shadow of Mordor, texture-modded Skyrim, Unity, FC 4, Space Engine, Dying Light, etc. and you're likely to see similar disparities. Not even including stuff beyond games.
> 
> This is a direct result of frametime spikes and what little is being captured of them by FPS measures which are direct results of a VRAM shortage issue. Nothing else on a video card would explain such a massive minimum FPS gulf and that has very worrying implications for smoothness. If being fully cognizant of my legal rights, the importance of VRAM, and demanding my money back from this scam that has personally affected me in the exact way I've described is tantamount to "losing my mind" and simply being a "hater" (On my own graphics cards I paid nearly $800 for?), then what's your excuse?
> 
> Less VRAM is less VRAM; you will and do run into issues when you reach the 970s' 3.5 GB limit, unless actual smoothness and consistency mean absolutely nothing to you.
> 
> More details on what's going on:
> 
> 51 seconds into this video, the cards are hooked up to a second monitor to drive up VRAM usage; note how horribly the 970 stutters while the 980 simply does not.
> 
> https://www.youtube.com/watch?v=MTYd9_fe4iI&feature=youtu.be&t=57s - Note how the game completely _locks up_ on the 970 for several seconds.
> 
> -
> 
> Frametimes with SLI above 3.5 GBs are going insanely high (stutterfest)
> 
> I'm sure people will continue to fail to internalize any of this and keep claiming what they want to be true. Denial is a powerful process.
> 
> Here are the facts:
> 
> Nvidia misrepresented their specifications and violated federal laws. All who purchased 970s with those false specifications in mind are entitled to a refund.
> 
> They partitioned the memory with the last 512 MB being, measurably, far slower.
> 
> 970s will try to avoid using the 3.5 GBs in many scenarios whereas 980s will immediately allocate all of it. When it does use the remaining 512 MBs, it is in fact using far slower memory.
> 
> We have plenty of evidence, including your own, demonstrating what this disparity does to performance (frametime inconsistency, minimum FPS tanks)
> 
> Nvidia are in denial of the issue and doing absolutely nothing, so far, for their customers.
> 
> Different scenarios will place different demands on VRAM. You *cannot* accurately assume, from a program that doesn't push the 970s' 3.5 GB cap, that all programs will be similarly unaffected.
> 
> Average FPS figures average in the brief spikes/stutters, thereby mitigating representation of the issue.


Oh please take several seats... (-_-)"

I'm not going to indulge in a silly argument with you about MY EXPERIENCE. Hitman Absolution's min FPS is always recorded at the start of the bench, and because I was running multiple games, I didn't bother to run those benches two or three times. If I had, I would have gotten better minimum FPS, but I was being fair.

Did you forget to talk about the two games where the 980's min FPS was lower than the 970's? Cherry pick much? (-_-)"

*Bottom line:* Where the 970s had slowdowns, the 980s also had slowdowns. When the 970s stuttered in stutterdogs, the 980s also stuttered. If Shadow of Mordor will give the 970s problems at 4K, it will also give the 980s problems at 4K. Don't believe me? Go buy yourself two 980s, two 970s, and a 4K monitor, do your own testing, then come talk to me.

The only reason why I kept the 980s was because I would lose more money trying to get rid of them than trying to get rid of the more popular 970s. I'm glad I got rid of them now without losing too much. Can't wait for my 970s that I got for cheap due to the VRAM wars


----------



## Xoriam

Quote:


> Originally Posted by *ZeusHavok*
> 
> No, I would say look at my rig and come to an educated conclusion.
> The game runs perfectly fine for me with Medium textures (to keep VRAM usage below 3.5GB). As soon as the game uses 3.5GB, the FPS drops to around 40; when it uses any more than that, it just freaks out and the stuttering is astronomical.
> 
> Have a look, this time with OSD:
> 
> https://www.youtube.com/watch?v=v3vd3PtRvTI
> 
> Video might still be rendering.


I'm sorry for your situation; I've luckily yet to encounter any issues like this in my games, apart from those cherry-picked few times I seriously pegged the card's absolute maximum of 4GB.
I seriously hope I never see it.
Those momentary lockups... that's got to suck, man.

Have you tried disabling SLI to see if the problem is still there?
Also, are you monitoring your GPU and CPU usage?


----------



## skupples

Quote:


> Originally Posted by *Xoriam*
> 
> I'm sorry for your situation, I've luckily yet to encounter any issues like this in my games. apart from those cherry picked few times I seriously pegged the absolute maximum 4gb of the card.
> I seriously hope I never see it.
> Those momentary lockups  thats got to suck man.
> 
> Have you tried disabiling SLI and see if the problem is still there?
> Also are you monitoring your GPU and CPU usage?


you can only get the proper SLI bits by downloading GeForce Experience, or going somewhere that's uploaded them and then injecting them with that ancient tool.

either way, the game is silky smooth for me with under 3.5GB of VRAM used @ max settings, so idk what his issue is, unless he's somehow missed out on the CPU utilization patch.


----------



## MerkageTurk

^ Xoriam

No point disagreeing, as NVIDIA said it themselves; it's straight from the horse's mouth


----------



## Xoriam

LOL, taken directly from the Gigabyte GTX 970 G1 Gaming page.
That was not there when I checked it out before I bought it


----------



## skupples

legal duct tape.


----------



## Xoriam

Quote:


> Originally Posted by *skupples*
> 
> you can only get the proper SLi bits from downloading experience, or going somewhere that's uploaded them, then injecting them with that ancient tool.
> 
> either way, the game is silky smooth for me, with under 3.5GB vram used @ max settings, so idk what his issue is, unless he's somehow missed out on the CPU utilization patch.


That's a possibility. The strange things I've been shown the game doing (i.e. extremely low frame rates) did not include the sort of lockups he's been showing.
I mean, I've seen the same setup with an AMD processor vs. an Intel processor where the AMD gets like 20 FPS and the Intel pushes 60+.
But not this; what he's encountering is extremely unacceptable, and hopefully there is just something he is overlooking.
Quote:


> Originally Posted by *MerkageTurk*
> 
> ^ Xoriam
> 
> No point disagreeing as nVIDIA said it, which is from the horses mouth


What am I disagreeing with? The card is not what it was originally advertised as; we all know that.


----------



## MerkageTurk

Quote:


> I'm sorry for your situation, I've luckily yet to encounter any issues like this in my games. apart from those cherry picked few times I seriously pegged the absolute maximum 4gb of the card.
> I seriously hope I never see it.


Hopefully you do not encounter issues.

As you said, you pegged 4GB but did not encounter them.


----------



## Xoriam

Quote:


> Originally Posted by *wanako*
> 
> Since this thread has turned into a big cat fight and no one has posted anything of worth for the last 200 or so pages, here is an interview with an Nvidia engineer about the whole situation. This will put this whole thread to rest and all of you can just quiet down and leave.


----------



## bwsteg

Quote:


> Originally Posted by *sok0*
> 
> I'm just hoping I can grab another 970 on the cheap when everyone decides to liquidate over 0.5GB of VRAM. I'll take SLI 970s for $500 over a 980 any day of the week.


I was actually able to grab an NVIDIA 970 for about $280 from Best Buy today. I had a GTX 770 previously.


----------



## PhotonFanatic

So basically, there isn't any real way to tell if you're being ripped off when you buy a card. I personally was waiting on the 8GB model of the 980, because I want to run the ultra-high-res textures when I finally play Shadow of Mordor. They tell you in the game menu that the setting will require 6GB of VRAM. It's like you need a Titan Z these days if you want any kind of future-proofing at all. Nvidia is putting out new 4GB cards when games that have already been released need 6GB to play on max. They're getting on my nerves.


----------



## awdrifter

Quote:


> Originally Posted by *criminal*
> 
> You see how Nvidia has now changed what some people see as a budget card? Nvidia's marketing and new product structure at work!
> 
> Honestly though before the introduction of the Titan, the 970 would have been the $200 card of this generation. But I don't want to get into all that about living in the past.


Even by Nvidia's own marketing slides, the GTX 970 is a high-end card (the GTX 980 is enthusiast). So to lie about the specs on their own high-end card is pretty damn shady.


----------



## skupples

Quote:


> Originally Posted by *PhotonFanatic*
> 
> So basically, there isn't any real way to tell if you're being ripped off when you buy a card. I personally was waiting on the 980 8Gb model, because I want to run the ultra high res textures when I finally play Shadow of Mordor. They tell you in the game menu that the setting will require 6Gb of vram. Its like you need a titan Z these days if you want any kind of future proofing at all. Nvidia is putting out new 4Gb cards when games that have already been released are needing 6Gb to play on max. They're getting on my nerves.


Good news: the high-texture DLC is a joke for the amount of VRAM it requires.

Also, at this point, with Maxwell basically being a filler GPU with zero competition, I think an 8GB model of the 780 is unlikely, though plausible I guess.


----------



## Serandur

Quote:


> Originally Posted by *Yungbenny911*
> 
> Oh please take several seats... (-_-)"
> 
> I'm not going to indulge in a silly argument with you on MY EXPERIENCE. Hitman Absolution's min FPS is always recorded at the start of the bench, and because i was running multiple games, i didn't bother to run those benches 2 or 3 times. If i did so, i would have gotten better minimum fps, but i was being fair.
> 
> Did you forget to talk about the two games where the 980's min FPS was lower than the 970's? Cherry pick much? (-_-)"
> 
> *Bottom-line:* Where the 970's had slowdowns, the 980's also had slowdowns. When the 970's stuttered in stutterdogs, the 980's also stuttered. If shadow of mordor will give the 970's problems at 4k, it would also give the 980's problems at 4k. Don't believe me? Go buy yourself two 980's, 970's, and a 4k monitor and do your own testing, then come talk to me.
> 
> The only reason why I kept the 980s was because I would lose more money trying to get rid of them than trying to get rid of the more popular 970s. I'm glad I got rid of them now without losing too much. Can't wait for my 970s that I got for cheap due to the VRAM wars


You're pulling the "don't tell me about my experience" line when you're the one who called people upset over the problem or getting rid of their 970s having "lost their mind" or being "haters" over the factual data? Your experience is largely with games that don't present the issue, is shown only in a faulty average FPS format, and clearly shows the problem in a couple of games that actually might push a 970's VRAM cap.

The two games where the 980 had a lower minimum FPS differed by minuscule amounts: 92 vs 96 is barely any difference, and 18 vs 23 not much either; that could easily be attributed to margin of error and similar CPU ceilings. There was no cherry-picking: the examples that showed an abnormally low minimum FPS on the 970s were off by 46% (35 vs 24) in the case of Watch Dogs, which is clearly in line with the disparate frametimes captured in Watch Dogs elsewhere, and by 88% in Hitman. The 980s are displaying nothing like that.

If you don't want your experience criticized as such, maybe you shouldn't be the one on some high horse trying to make 970 owners look bad for objectively having issues while failing to provide any data that actually conflicts with the numerous tests we have elsewhere on the net and just giving us your word there are no problems when the minimum FPS drop is clearly reflective of a disparity between the 980 and 970 attributable to loading (VRAM) issues. It is you making some silly put-down claim about _other_ people's experiences. I own two 970s, I've tested them at 4K, I've tested them at 5K, and there are issues with less than 99% GPU load and either bordering the 3.5 GB mark or surpassing it whereas there are none beforehand. Of course it doesn't happen in Crysis 3, Sleeping Dogs, Bioshock Infinite, etc. because as I said, the games barely touch VRAM. They're older, mostly more linear, and came before the current VRAM boom brought about by the new consoles.

Your 970s aren't special; you're simply not testing games heavy enough with VRAM issues, producing problems in a couple (at least Watch Dogs) that can, and are just asking us to believe the data you presented supporting the issue are themselves a fluke without any additional data to support your claim that they're not accurate. Then you're insulting people while doing so. I'm not going to sit back and not correct your interpretation of the data you've presented and especially not when you refuse to treat others factually reporting issues (both moral and technical) with respect as your generalizations of your limited experiences, data, and data interpretation across to all people illustrate.

I posted Watch Dogs data already proving you wrong, there are clear frametime issues caused by the 970's VRAM that the 980 does not exhibit in the same testing and GoldenTiger's own testing of SoM showed a clear problem. This is how the issue was even noticed in the first place... 970s refusing to go above 3.5 GBs until forced and people reporting performance issues as a result. Eventually, we got proof that the 512 MB portion has an almost useless amount of bandwidth. Even excluding all the benchmarks proving there to be an issue, ~30 GB/s of bandwidth is a problem just from common sense. Games don't just stutter for no reason, and not coincidentally, it's in all the VRAM-heavy stuff that people are reporting and recording issues (ie. a lot of newer stuff).
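
To put that ~30 GB/s figure in perspective, here's a rough back-of-the-envelope Python sketch. The segment bandwidths and the `blended_bandwidth` helper are assumptions for illustration (roughly 196 GB/s for the 3.5 GB segment, 28 GB/s for the 0.5 GB segment, numbers floating around in coverage of this issue), not official specs:

```python
# Back-of-the-envelope sketch, NOT official specs: assume ~196 GB/s for the
# fast 3.5 GB segment and ~28 GB/s for the slow 0.5 GB segment.
def blended_bandwidth(fast_gbps: float, slow_gbps: float, f: float) -> float:
    """Effective bandwidth when a fraction f of memory traffic hits the
    slow segment: the weighted harmonic mean of the two rates."""
    return 1.0 / (f / slow_gbps + (1.0 - f) / fast_gbps)

for f in (0.0, 0.05, 0.125):
    print(f"{f:.0%} slow-segment traffic -> "
          f"{blended_bandwidth(196.0, 28.0, f):.1f} GB/s")
```

Even 5% of traffic landing in the slow segment pulls the blended figure from 196 down to about 151 GB/s, roughly a quarter lost, which lines up with why spilling past 3.5 GB shows up as stutter rather than a gentle slowdown.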

But sure, ignore all the benchmarked data that proves there is in fact an issue exclusive to the 970s as in the graphs and videos I posted, hand-wave away the issues presented in your own limited data, and continue thinking people being upset about this are dolts and losing their minds or haters or whatever word it is you need to label people proving any claim that there are never issues wrong.
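
The point about averages hiding stutter is easy to demonstrate with a toy calculation. A minimal Python sketch with made-up frametimes (not capture data from any real card) shows how one severe hitch barely dents the average FPS:

```python
# Illustrative only: synthetic frametimes, not a real FCAT/FRAPS capture.
# 59 smooth frames at ~16.7 ms plus a single 120 ms hitch.
frametimes_ms = [16.7] * 59 + [120.0]

total_s = sum(frametimes_ms) / 1000.0
avg_fps = len(frametimes_ms) / total_s  # ~54 FPS, looks "fine" on a chart

# The worst frametime is what the average buries.
worst = max(frametimes_ms)

print(f"average FPS: {avg_fps:.1f}")
print(f"worst frametime: {worst:.0f} ms")  # a very visible stutter
```

An average-FPS bar would report a perfectly respectable number for that run, while the 120 ms frame is exactly the kind of spike the frametime graphs in this thread are showing.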


----------



## GrimDoctor

Quote:


> Originally Posted by *Serandur*
> 
> You're pulling the "don't tell me about my experience" line when you're the one who called people upset over the problem or getting rid of their 970s having "lost their mind" or being "haters" over the factual data? Your experience is largely with games that don't present the issue, is shown only in a faulty average FPS format, and clearly shows the problem in a couple of games that actually might push a 970's VRAM cap.
> 
> The two games where the 980 had a lower minimum FPS were by completely minuscule amounts; 92 vs 96 is barely any difference and 18 vs 23 also not much, that could be easily attributed to margins of error and similar CPU ceilings or whatnot. There was no cherry picking, the examples that showed an abnormally low minimum FPS on the 970s were by a factor of 46% (35 vs 24) in the case of Watch Dogs, which is clearly in line with that factually disparate frametimes captured in Watch Dogs elsewhere, and by a factor of 88% in Hitman. The 980s are displaying nothing like that.
> 
> If you don't want your experience criticized as such, maybe you shouldn't be the one on some high horse trying to make 970 owners look bad for objectively having issues while failing to provide any data that actually conflicts with the numerous tests we have elsewhere on the net and just giving us your word there are no problems when the minimum FPS drop is clearly reflective of a disparity between the 980 and 970 attributable to loading (VRAM) issues. It is you making some silly put-down claim about _other_ people's experiences. I own two 970s, I've tested them at 4K, I've tested them at 5K, and there are issues with less than 99% GPU load and either bordering the 3.5 GB mark or surpassing it whereas there are none beforehand. Of course it doesn't happen in Crysis 3, Sleeping Dogs, Bioshock Infinite, etc. because as I said, the games barely touch VRAM. They're older, mostly more linear, and came before the current VRAM boom brought about by the new consoles.
> 
> Your 970s aren't special; you're simply not testing games heavy enough with VRAM issues, producing problems in a couple (at least Watch Dogs) that can, and are just asking us to believe the data you presented supporting the issue are themselves a fluke without any additional data to support your claim that they're not accurate. Then you're insulting people while doing so. I'm not going to sit back and not correct your interpretation of the data you've presented and especially not when you refuse to treat others factually reporting issues (both moral and technical) with respect as your generalizations of your limited experiences, data, and data interpretation across to all people illustrate.
> 
> I posted Watch Dogs data already proving you wrong, there are clear frametime issues caused by the 970's VRAM that the 980 does not exhibit in the same testing and GoldenTiger's own testing of SoM showed a clear problem. This is how the issue was even noticed in the first place... 970s refusing to go above 3.5 GBs until forced and people reporting performance issues as a result. Eventually, we got proof that the 512 MB portion has an almost useless amount of bandwidth. Even excluding all the benchmarks proving there to be an issue, ~30 GB/s of bandwidth is a problem just from common sense. Games don't just stutter for no reason, and not coincidentally, it's in all the VRAM-heavy stuff that people are reporting and recording issues (ie. a lot of newer stuff).
> 
> But sure, ignore all the benchmarked data that proves there is in fact an issue exclusive to the 970s as in the graphs and videos I posted, hand-wave away the issues presented in your own limited data, and continue thinking people being upset about this are dolts and losing their minds or haters or whatever word it is you need to label people proving any claim that there are never issues wrong.


I agree with you, bud; that fella has a very special version of tunnel vision going on


----------



## somethingname

I just stumbled upon the 970 thread; it's obvious the RAM swapping is causing stutter and poor performance. I highly doubt a driver is going to fix that, otherwise it would have been addressed since launch.

Not sure why my "900 series card blows" reply was deleted; I was just speaking the truth that people seem to want to deny or sweep under the rug, lulz


----------



## Clocknut

Quote:


> Originally Posted by *tpi2007*
> 
> From their current 'full damage control mode' stance, I'd say not a chance, that would imply admitting that there is something wrong with the 970.
> 
> What I can see happening is the following:
> 
> - 970 Ti with 1792 CUDA cores, full L2 cache and 64 ROPs, for $10 MSRP more than the 970's launch price (price has to be different so they can deflect criticism easier), for practically the same performance of the 980. Will make the 980 even more niche than it already is, but it's the price to pay to get the customer's goodwill back.
> 
> - drop the 970's price to $290.


This; I've been saying it for a while now. A 970 Ti is the only solution that isn't financially damaging for them.

1. For an extra $20 top-up, allow existing 970 owners to return their card for a 970 Ti.
2. Launch the 970 Ti at $330-340.
3. 1-3 months later, drop the 970 to $290 (a $50 difference should be enough to justify the 970's existence).
4. Take the returned 970s and sell them on the used market as refurbished units, or reserve them as RMA cards.


----------



## notarat

Quote:


> Originally Posted by *Xoriam*
> 
> I wasn't referring to the stuttering in the video of FC4 inside of the car.
> 
> I was referring to the few-second lockup in this video
> https://www.youtube.com/watch?v=MTYd9_fe4iI&feature=youtu.be&t=57s


I'm aware of what you were referring to


----------



## notarat

Quote:


> Originally Posted by *Xoriam*
> 
> LOL taken direct from the Gigabyte GTX 970 G1 Gaming page.
> That was not there when I was checking it out when i bought it


Of course it wasn't.

Why preserve evidence?


----------



## Xoriam

Well, I received a response from the place I got my EVGA card from (a place I recently tried out, not the one where I typically get my stuff from).
They will not be giving a partial refund. Oh well, I still have two more chances: two more cards from another place to hope on.
They've ignored me more than once before when it comes to customer service, and they didn't even fully read the most recent message I sent them.
It took three emails just to get my free game code..
The card they sold me was actually defective, too; as in, RMA-the-card-because-it-doesn't-work defective.

In my message I explained that when I purchased the card I expected X/X/X/X specs, because of how they were advertised, and instead they were Y/Y/Y/Y.5 + 0.5 and so on.
I also told him that the card I purchased from them was defective and I had to RMA it through EVGA.

His response was something like: "Obviously we can not do this because the *SPEED OF THE CARD IS NOT IN DISCUSSION* and the card is *NOT MALFUNCTIONING*."
In my head I'm like... seriously? So if the card had actually been made with the X/X/X/X specs as advertised, it totally wouldn't have been faster than Y/Y/Y/Y.5 + 0.5?
And he obviously didn't read the message fully either, because I specifically told him that I had to RMA the card because it wasn't working >-<

Whatever, I'm obviously not buying anything there anymore.
(Not because of the refused partial refund, but because of the horrible customer service. This is an independent supplier as well, not some big company that can lose track of its customers.)

now HERE COMES THE BEST PART!

wait for it......

THEY'VE RAISED THE PRICE ON THE GTX 970 BY ABOUT 60€ SINCE THIS WHOLE ISSUE BEGAN.

Cue the laughing Spanish man.


----------



## DIYDeath

Quote:


> Originally Posted by *Xoriam*
> 
> Well I recieved a response from the place I got my EVGA card from. ( A place I recently tried out, not the one where I typically get my stuff from)
> They will not be giving a partial refund, heh oh well I still have 2 more chances. 2 more cards from another place to hope on.
> They've ignored me more than once on other occasions for customer service, they didn't even fully read the most recent message I sent them.
> And it took 3 emails to get my free game code..
> The card they sold me was actually defective, like RMA the card because it doesn't work defective.
> 
> In my message I went on about when I purchased the card I expected X/X/X/X specs because of how they were advertised etc and instead they were like Y/Y/Y/Y.5+ 0.5 and so on.
> I also told him that the card I purchased from them was defective and I had to RMA through EVGA.
> 
> His response was similar to this "Obviously we can not do this because the *SPEED OF THE CARD IS NOT IN DISCUSSION* and the card is *NOT MALFUNCTIONING*."
> In my head, I'm like... Seriously? So if the card was actually made with X/X/X/X specs as advertised, it totally wouldn't have been faster than Y/Y/Y/Y.5 + 0.5
> And he obviously didnt read the message fully either, because I specificly told him that I had to RMA the card because it wasn't working >-<
> 
> Whatever, I'm obviously not buying anything there anymore.
> (not because of no partial refund, but because of the horrible customer service. this is an indepenant supplier company as well. not some big company that can lose track of their customers.)
> 
> now HERE COMES THE BEST PART!
> 
> wait for it......
> 
> THEY'VE RAISED THE PRICE ON THE GTX 970 BY ABOUT 60€ SINCE THIS WHOLE ISSUE BEGAN.
> 
> Queue the laughing spanish man.


At least you didn't buy the original Titan.


----------



## gamervivek

Quote:


> Originally Posted by *Xoriam*
> 
> LOL taken direct from the Gigabyte GTX 970 G1 Gaming page.
> That was not there when I was checking it out when i bought it


Still readable, 2/10.


----------



## Nevk

https://forums.geforce.com/default/topic/808557/geforce-900-series/big-name-law-firm-currently-investigating-nvidia-over-gtx-970-/
Big name law firm currently investigating Nvidia over GTX 970.


----------



## Swolern

Quote:


> Originally Posted by *DIYDeath*
> 
> At least you didn't buy the original Titan.


I bought the original Titan used, with a waterblock, almost two years ago for $800. Just sold it last week, with the air cooler only, for $750.

How ya like them apples.......


----------



## Silent Scone

Quote:


> Originally Posted by *skupples*
> 
> good news. The high texture DLC is a joke for the amount of VRAM it requires.
> 
> Also, at this point, with maxwell basically being a filler GPU with zero competition, I think an 8GB model of 780 is unlikely, though plausible I guess.


According to certain AIB reps, they're still a possibility; I think it's more a question of when.
Quote:


> Originally Posted by *Swolern*
> 
> I bought the original Titan almost 2 years ago used with waterblock for $800. Just sold it last week with air cooler only for $750 .
> 
> How ya like them apples.......


Laughable, isn't it? People still pulling the "LOL TITAN OWNERS! ;D" card when they're still the card to own for UHD resolutions, let alone people such as yourself who've kept them since Feb 2013.

You have to be an A* derp not to see the value there. The same can't be said for GTX 780 owners.


----------



## Luck100

Quote:


> Originally Posted by *Swolern*
> 
> I bought the original Titan almost 2 years ago used with waterblock for $800. Just sold it last week with air cooler only for $750 .
> 
> How ya like them apples.......


Now that's some expert trading!


----------



## Yungbenny911

Quote:


> Originally Posted by *Serandur*
> 
> 
> You're pulling the "don't tell me about my experience" line when you're the one who called people upset over the problem or getting rid of their 970s having "lost their mind" or being "haters" over the factual data? Your experience is largely with games that don't present the issue, is shown only in a faulty average FPS format, and clearly shows the problem in a couple of games that actually might push a 970's VRAM cap.
> 
> The two games where the 980 had a lower minimum FPS were by completely minuscule amounts; 92 vs 96 is barely any difference and 18 vs 23 also not much, that could be easily attributed to margins of error and similar CPU ceilings or whatnot. There was no cherry picking, the examples that showed an abnormally low minimum FPS on the 970s were by a factor of 46% (35 vs 24) in the case of Watch Dogs, which is clearly in line with that factually disparate frametimes captured in Watch Dogs elsewhere, and by a factor of 88% in Hitman. The 980s are displaying nothing like that.
> 
> 
> 
> If you don't want your experience criticized as such, maybe you shouldn't be the one on some high horse trying to make 970 owners look bad for objectively having issues while failing to provide any data that actually conflicts with the numerous tests we have elsewhere on the net and just giving us your word there are no problems when the minimum FPS drop is clearly reflective of a disparity between the 980 and 970 attributable to loading (VRAM) issues. It is you making some silly put-down claim about _other_ people's experiences. *I own two 970s, I've tested them at 4K, I've tested them at 5K, and there are issues with less than 99% GPU load and either bordering the 3.5 GB mark or surpassing it whereas there are none beforehand.* Of course it doesn't happen in Crysis 3, Sleeping Dogs, Bioshock Infinite, etc. because as I said, the games barely touch VRAM. They're older, mostly more linear, and came before the current VRAM boom brought about by the new consoles.
> 
> 
> Your 970s aren't special; you're simply not testing games heavy enough with VRAM issues, producing problems in a couple (at least Watch Dogs) that can, and are just asking us to believe the data you presented supporting the issue are themselves a fluke without any additional data to support your claim that they're not accurate. Then you're insulting people while doing so. I'm not going to sit back and not correct your interpretation of the data you've presented and especially not when you refuse to treat others factually reporting issues (both moral and technical) with respect as your generalizations of your limited experiences, data, and data interpretation across to all people illustrate.
> 
> I posted Watch Dogs data already proving you wrong, there are clear frametime issues caused by the 970's VRAM that the 980 does not exhibit in the same testing and GoldenTiger's own testing of SoM showed a clear problem. This is how the issue was even noticed in the first place... 970s refusing to go above 3.5 GBs until forced and people reporting performance issues as a result. Eventually, we got proof that the 512 MB portion has an almost useless amount of bandwidth. Even excluding all the benchmarks proving there to be an issue, ~30 GB/s of bandwidth is a problem just from common sense. Games don't just stutter for no reason, and not coincidentally, it's in all the VRAM-heavy stuff that people are reporting and recording issues (ie. a lot of newer stuff).
> 
> But sure, ignore all the benchmarked data that proves there is in fact an issue exclusive to the 970s as in the graphs and videos I posted, hand-wave away the issues presented in your own limited data, and continue thinking people being upset about this are dolts and losing their minds or haters or whatever word it is you need to label people proving any claim that there are never issues wrong.
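For what it's worth, the ~30 GB/s figure quoted above is consistent with simple arithmetic from NVIDIA's published GTX 970 specs: the 0.5GB segment hangs off a single 32-bit GDDR5 controller at 7 Gbps effective, versus seven controllers for the 3.5GB segment. A rough sketch of that arithmetic:

```python
# Back-of-the-envelope check of the GTX 970's segmented memory bandwidth.
# Figures are NVIDIA's published specs: 7 Gbps effective GDDR5 on a 256-bit
# bus split across eight 32-bit memory controllers.
GDDR5_GBPS = 7.0          # effective data rate per pin, Gbit/s
CONTROLLER_WIDTH = 32     # bits per memory controller

def segment_bandwidth(controllers: int) -> float:
    """Peak bandwidth in GB/s for a segment served by `controllers` controllers."""
    return controllers * CONTROLLER_WIDTH * GDDR5_GBPS / 8  # bits -> bytes

fast = segment_bandwidth(7)  # 3.5 GB segment: seven controllers
slow = segment_bandwidth(1)  # 0.5 GB segment: one controller

print(f"3.5 GB segment: {fast:.0f} GB/s")  # 196 GB/s
print(f"0.5 GB segment: {slow:.0f} GB/s")  # 28 GB/s
```

Note the two segments share the crossbar, so they can't be read in parallel at full speed; the ~30 GB/s people quote is roughly this 28 GB/s single-controller peak.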


Oh, how wonderful! You tested your SLI 970s at 5K res and expected them to run at 60+ FPS with AA cranked up? smh...

People with 6GB 4-way SLI Titans have a hard time maintaining 60+ FPS at that resolution, so I wonder what magic performance you were expecting your 970s to exhibit. You're just being delusional with your method of testing; why not play the game like you normally would? Are you buying that 60/120Hz high-res monitor to play your games at 30 FPS?

If you were wondering whether the 970 would run at 120 FPS if it had 30GB of V-RAM, it wouldn't. V-RAM is not all that matters; applying AA and increasing resolution don't only affect V-RAM, they also increase the processing load on the GPU, and no matter how much V-RAM a GPU has, if it runs out of processing capability your games will run like crap, and that's a FACT.

As I said, this is a silly argument, and this will be my last reply to you on this subject.
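To put some numbers on the resolution point above, raw pixel counts alone (before AA multiplies the work further) scale the GPU's shading load dramatically; a quick sketch:

```python
# Raw pixel counts at common resolutions, to show how quickly shading work
# scales with resolution alone (before AA multiplies it further).
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K UHD": (3840, 2160),
    "5K": (5120, 2880),
}

def megapixels(name: str) -> float:
    """Pixels per frame, in millions, for a named resolution."""
    w, h = RESOLUTIONS[name]
    return w * h / 1e6

base = megapixels("1080p")
for name in RESOLUTIONS:
    mp = megapixels(name)
    print(f"{name:>7}: {mp:5.1f} MP ({mp / base:.1f}x 1080p)")
```

5K pushes roughly seven times the pixels of 1080p per frame, which is why no amount of extra VRAM alone buys high framerates there.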


----------



## Orangey

UHD = 8GB 290X Sconey

Actually it's not even as simple as that. 980 wins quite a few too.


----------



## revro

Funny, for some reason here in Slovakia Titans are now sold for 1050+ EUR, 780s are no longer on sale, and only the 780 Ti is available for 470-650 EUR. Gigabyte is the one with a reasonable price... oh wait, the Gigabyte 780 Ti sucked hard. Yeah, now I just remembered.

My GTX 970, well, it's OK for EVE Online, but it took some time to load up the textures for Star Citizen... well, we shall see.


----------



## Silent Scone

Quote:


> Originally Posted by *Orangey*
> 
> UHD = 8GB 290X Sconey
> 
> Actually it's not even as simple as that. 980 wins quite a few too.


Feb 2013. The 8GB 290X is a great UHD card, but I doubt many will buy into them now. Well, I had two four or five months ago. They didn't work with my 4K panel, as they couldn't display 60Hz.


----------



## rdr09

Quote:


> Originally Posted by *Silent Scone*
> 
> Feb 2013. The 8GB 290X is a great UHD card, but I doubt many will buy into them now. Well, I had two four or five months ago. They didn't work with my 4K panel, as they couldn't display 60Hz.


must be a 4K TV with no DP.


----------



## Silent Scone

Quote:


> Originally Posted by *rdr09*
> 
> must be a 4K TV with no DP.


Nope. I have a trail of emails from Shane Parfitt going back to June last year regarding it. Required an EDID override for the AOC U2868, which initially didn't work. Was only properly fixed a month or two ago.

They blamed the monitor firmware, although it worked OK on my 780 Ti.


----------



## Swolern

Quote:


> Originally Posted by *rdr09*
> 
> must be a 4K TV with no DP.


Isn't there only one 4K TV with DP in existence? The one by Panasonic. Or did they finally release more?


----------



## CaptainZombie

Quote:


> Originally Posted by *Swolern*
> 
> Isn't there only one 4k TV with DP in existence? By Panasonic. Or did they finally release more?


Panasonic AX800U/900U. I am not sure if the Vizio P is also equipped with a DP, can't recall.


----------



## criminal

Quote:


> Originally Posted by *Serandur*
> 
> 
> 
> You're pulling the "don't tell me about my experience" line when you're the one who called people upset over the problem or getting rid of their 970s having "lost their mind" or being "haters" over the factual data? Your experience is largely with games that don't present the issue, is shown only in a faulty average FPS format, and clearly shows the problem in a couple of games that actually might push a 970's VRAM cap.
> 
> The two games where the 980 had a lower minimum FPS differed by completely minuscule amounts; 92 vs 96 is barely any difference, and 18 vs 23 not much either; that could easily be attributed to margins of error, similar CPU ceilings, or the like. There was no cherry-picking; the examples showing an abnormally low minimum FPS on the 970s did so by a factor of 46% (35 vs 24) in the case of Watch Dogs, which is clearly in line with the factually disparate frametimes captured in Watch Dogs elsewhere, and by a factor of 88% in Hitman. The 980s display nothing like that.
> 
> If you don't want your experience criticized as such, maybe you shouldn't be the one on some high horse trying to make 970 owners look bad for objectively having issues while failing to provide any data that actually conflicts with the numerous tests we have elsewhere on the net and just giving us your word there are no problems when the minimum FPS drop is clearly reflective of a disparity between the 980 and 970 attributable to loading (VRAM) issues. It is you making some silly put-down claim about _other_ people's experiences. I own two 970s, I've tested them at 4K, I've tested them at 5K, and there are issues with less than 99% GPU load and either bordering the 3.5 GB mark or surpassing it whereas there are none beforehand. Of course it doesn't happen in Crysis 3, Sleeping Dogs, Bioshock Infinite, etc. because as I said, the games barely touch VRAM. They're older, mostly more linear, and came before the current VRAM boom brought about by the new consoles.
> 
> Your 970s aren't special; you're simply not testing games heavy enough with VRAM issues, producing problems in a couple (at least Watch Dogs) that can, and are just asking us to believe the data you presented supporting the issue are themselves a fluke without any additional data to support your claim that they're not accurate. Then you're insulting people while doing so. I'm not going to sit back and not correct your interpretation of the data you've presented and especially not when you refuse to treat others factually reporting issues (both moral and technical) with respect as your generalizations of your limited experiences, data, and data interpretation across to all people illustrate.
> 
> I posted Watch Dogs data already proving you wrong, there are clear frametime issues caused by the 970's VRAM that the 980 does not exhibit in the same testing and GoldenTiger's own testing of SoM showed a clear problem. This is how the issue was even noticed in the first place... 970s refusing to go above 3.5 GBs until forced and people reporting performance issues as a result. Eventually, we got proof that the 512 MB portion has an almost useless amount of bandwidth. Even excluding all the benchmarks proving there to be an issue, ~30 GB/s of bandwidth is a problem just from common sense. Games don't just stutter for no reason, and not coincidentally, it's in all the VRAM-heavy stuff that people are reporting and recording issues (ie. a lot of newer stuff).
> 
> But sure, ignore all the benchmarked data that proves there is in fact an issue exclusive to the 970s as in the graphs and videos I posted, hand-wave away the issues presented in your own limited data, and continue thinking people being upset about this are dolts and losing their minds or haters or whatever word it is you need to label people proving any claim that there are never issues wrong.


Great post. You will just have to ignore some people. I will never understand the allegiance some people have for a company that cares nothing about them. This is a real issue and though it has been exaggerated by some, Nvidia is totally the one in the wrong in this situation. This thread has thoroughly covered the issue, so anyone defending Nvidia at this point is just trying to get a rise out of people.
Quote:


> Originally Posted by *Nevk*
> 
> https://forums.geforce.com/default/topic/808557/geforce-900-series/big-name-law-firm-currently-investigating-nvidia-over-gtx-970-/
> Big name law firm currently investigating Nvidia over GTX 970.


Good deal.
Quote:


> Originally Posted by *Swolern*
> 
> I bought the original Titan almost 2 years ago, used, with a waterblock, for $800. Just sold it last week, with the air cooler only, for $750.
> 
> How ya like them apples.......


The "let's make fun of Titan owners" thing is so played out by now. The Titan is still a very decent card and has held its value longer than any card before it. Can't beat that value.


----------



## clerick

Anyone know if the offer from the Nvidia rep to help with returns is still active? I want to return mine to NCIX, but they haven't replied yet (and the one person who got a reply from them a few days ago said they were not accepting returns).


----------



## Jesse36m3

Quote:


> Originally Posted by *clerick*
> 
> return mine to NCIX.


Please let me know how that goes.


----------



## ZeusHavok

Quote:


> Originally Posted by *Xoriam*
> 
> I'm sorry for your situation; I've luckily yet to encounter any issues like this in my games, apart from those cherry-picked few times I seriously pegged the absolute maximum 4GB of the card.
> I seriously hope I never see it.
> Those momentary lockups... that's got to suck, man.
> 
> Have you tried disabling SLI to see if the problem is still there?
> Also, are you monitoring your GPU and CPU usage?


CPU usage is never over 70% across all cores, and GPU usage is around 70% on my cards. The reason I know it to be a VRAM issue is that the game plays absolutely flawlessly when the textures are set to medium and the cards are rarely pushing over 3GB of VRAM usage.

I've requested a refund for my cards with my hardware supplier so let's see what they come back with.

Also, I wish it was just Dying Light that did this, but it's the same across pretty much any game that goes over that 3.5GB limit.

Shame, really, because I really like these cards, but it's been a problem for me since I got them.

PS: Issue still exists when I turn SLI off (although to a lesser extent)

*UPDATE*

Overclockers.co.uk are accepting both my cards back for a full refund.


----------



## doritos93

Quote:


> Originally Posted by *criminal*
> 
> The "let's make fun of Titan owners" thing is so played out by now. The Titan is still a very decent card and has held its value longer than any card before it. *Can't beat that value*.


I'm new to this thread. Got any numbers to back that claim up?


----------



## PhotonFanatic

Quote:


> Originally Posted by *skupples*
> 
> good news. The high texture DLC is a joke for the amount of VRAM it requires.


I understand the point you're trying to make, but I have to disagree about it being good news. Whatever the cause, whatever the reason that people or the devs may cite, it's ever increasing. Every year we see the need for more and more VRAM, and I just don't feel like the card companies are keeping up at all. It's more like they're benefiting from it, because now your card is 2 years old! You need more VRAM! You need to buy this new awesome card we've just put out that has exactly the amount of VRAM (or slightly less) that the games need these days!

People made the same kinds of comments back when Grand Theft Auto IV came out, saying "oh, it's just poor optimization or something." But it's always some excuse. When the real point is, you just need a card with extra VRAM. One that can actually make use of that amount of VRAM, unlike what we see here today.


----------



## thegreatsquare

Uh, Oh! The lawyers are chasing this ambulance.

http://bursor.com/investigations/nvidia/


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *doritos93*
> 
> Of course


Pretty simple really. I bought my two Titans for $2k back in Feb 2013 when they were the absolute fastest cards you could get. Fast forward 2 years later and they are basically still among the fastest cards on the market (with voltage control and OC'd to 1300+MHz). That means that I have enjoyed 980-like levels of performance for the past two years, not to mention I still have 6GB VRAM which has its uses as well.

Now, the kicker is that guys with original Titans are still selling their cards on the market for $650 up to $750 in some cases meaning that they enjoyed all that performance for the last two years for basically $250-$350. Seems like a pretty awesome deal to me honestly (though I have no intention of selling my Titans as they still crush every game I play at 1440P)...
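Purely as a sanity check of the numbers in the post above (the prices are the poster's own, not official figures), the resale arithmetic works out like this:

```python
# Quick check of the "Titan held its value" arithmetic: two original Titans
# bought for $2k total in Feb 2013, resold used for $650-$750 each two years on.
def net_cost_per_card(paid_total: float, cards: int, resale_each: float) -> float:
    """Out-of-pocket cost per card after resale."""
    return paid_total / cards - resale_each

for resale in (650, 750):
    net = net_cost_per_card(2000, 2, resale)
    print(f"resale ${resale}: ${net:.0f} net per card for two years of use")
```

That lands on the $250-$350 per card figure claimed above.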


----------



## doritos93

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Pretty simple really. I bought my two Titans for $2k back in Feb 2013 when they were the absolute fastest cards you could get. Fast forward 2 years later and they are basically still among the fastest cards on the market (with voltage control and OC'd to 1300+MHz). That means that I have enjoyed 980-like levels of performance for the past two years, not to mention I still have 6GB VRAM which has its uses as well.
> 
> Now, the kicker is that guys with original Titans are still selling their cards on the market for $650 up to $750 in some cases meaning that they enjoyed all that performance for the last two years for basically $250-$350. Seems like a pretty awesome deal to me honestly (though I have no intention of selling my Titans as they still crush every game I play at 1440P)...


Thanks! That was the first time I had ever read the words "Titan" and "Value" in the same sentence. When you look at it from your perspective, I get it!


----------



## criminal

Quote:


> Originally Posted by *doritos93*
> 
> Thanks! That was the first time I had ever read the words "Titan" and "Value" in the same sentence. When you look at it from your perspective, I get it!


The people that have held onto their Titans have gotten the true value. Though I don't regret selling my Titan, a part of me wishes I had hung on to it, knowing what I know now.


----------



## skupples

Quote:


> Originally Posted by *DIYDeath*
> 
> At least you didn't buy the original Titan.


I did, and love them to this day.

in fact, my level of appreciation for original titan goes up each time a new GPU is released.

Every time a new title comes out, and I turn it on & crank settings to max w/ easily 120FPS locked, I just giggle, as it reminds me of the many many people with my same level of obsession that went from Titans, then bought into the 780 hype, so got 780s, then found out 780 was only faster due to clock speeds, so went back to titans, then to 780 TI, 780Ti Class/KPE, then over to AMD for a spin, then over to 980/970, all the while I'm over here just increasing my clock speeds by a few hundred MHz and giggling.

Unlike some people, I like having my PC up and functioning, as opposed to a sloppy office covered in parts & only running on a weak ass backup rig.


----------



## criminal

Quote:


> Originally Posted by *skupples*
> 
> I did, and love them to this day.
> 
> in fact, my level of appreciation for original titan goes up each time a new GPU is released.
> 
> Every time a new title comes out, and I turn it on & crank settings to max w/ easily 120FPS locked, I just giggle, as it reminds me of the many many people with my same level of obsession that went from Titans, then bought into the 780 hype, so got 780s, then found out 780 was only faster due to clock speeds, so went back to titans, then to 780 TI, 780Ti Class/KPE, then over to AMD for a spin, then over to 980/970, all the while I'm over here just increasing my clock speeds by a few hundred MHz and giggling.
> 
> Unlike some people, I like having my PC up and functioning, as opposed to a sloppy office covered in parts & only running on a weak ass backup rig.


LOL... good stuff.


----------



## Xoriam

Quote:


> Originally Posted by *ZeusHavok*
> 
> CPU usage is never over 70% across all cores, and GPU usage is around 70% on my cards. The reason I know it to be a VRAM issue is that the game plays absolutely flawlessly when the textures are set to medium and the cards are rarely pushing over 3GB of VRAM usage.
> 
> I've requested a refund for my cards with my hardware supplier so let's see what they come back with.
> 
> Also I wish it was just dying light that did this but it's the same across pretty much any game that goes over that 3.5 limit.
> 
> shame really because I really like these cards but it's been a problem for me since I got them.
> 
> PS: Issue still exists when I turn SLI off (although to a lesser extent)
> 
> *UPDATE*
> 
> Overclockers.co.uk are accepting both my cards back for a full refund.


I'm really sorry to hear about that.
However I'm happy you're getting a refund.
What do you plan on getting?


----------



## ZeusHavok

Quote:


> Originally Posted by *Xoriam*
> 
> I'm really sorry to hear about that.
> However I'm happy you're getting a refund.
> What do you plan on getting?


Not too sure yet. I have an old GTX 570 sitting here, so I might just use that, stick to some light gaming, and see what AMD brings to the table with their new 300 series cards.


----------



## IRO-Bot

Oh, I get it. Nvidia built this card from the ground up to include DLC. They'll release the other 512MB as DLC for $100.


----------






## iSlayer

Quote:


> Originally Posted by *IRO-Bot*
> 
> Oh, I get it. Nvidia built this card from the ground up to include DLC. They'll release the other 512MB as DLC for $100.


They can't. Its a hardware problem.


----------



## mtcn77

Quote:


> Originally Posted by *iSlayer*
> 
> They can't. It*'*s a hardware problem.


Fixed it for you.


----------



## skupples

nvidia doesn't view it as a problem, they view it as genius!


----------



## sugarhell

Oh god, Nvidia is the Ubisoft of GPUs.


----------



## clerick

NCIX said in their forum that this whole thing isn't their problem, and that if we want a refund we should go talk to the manufacturer directly. How they expect the manufacturer to refund a customer who didn't buy directly from them, but from a distributor, is beyond me.


----------



## Master__Shake

Quote:


> Originally Posted by *clerick*
> 
> NCIX said in their forum that this whole thing isn't their problem, and that if we want a refund we should go talk to the manufacturer directly. How they expect the manufacturer to refund a customer who didn't buy directly from them, but from a distributor, is beyond me.


just saw that.

maybe the supplier is the one who decides whether NCIX can allow refunds?


----------



## skupples

Quote:


> Originally Posted by *clerick*
> 
> NCIX said in their forum that this whole thing isn't their problem, and that if we want a refund we should go talk to the manufacturer directly. How they expect the manufacturer to refund a customer who didn't buy directly from them, but from a distributor, is beyond me.


did you expect any less from NCIX?

They essentially survive off of Newegg overstock. Literally. Newegg & NCIX share an industrial complex, yet it takes NCIX 5-10 days longer to bring products to market.

They will then put items up for pre-order long before any known release date has been pushed to suppliers. Oh, and they take your money while they're at it, which is normally illegal; you're only supposed to collect funds once shipment has been confirmed. SHIPMENT to the end user.


----------



## clerick

Quote:


> Originally Posted by *skupples*
> 
> did you expect any less from NCIX?
> 
> They essentially survive off of Newegg overstock. Literally. Newegg & NCIX share an industrial complex, yet it takes NCIX 5-10 days longer to bring products to market.
> 
> They will then put items up for pre-order long before any known release date has been pushed to suppliers. Oh, and they take your money while they're at it, which is normally illegal; you're only supposed to collect funds once shipment has been confirmed. SHIPMENT to the end user.


Guess that will be my last purchase from them. Live and learn.


----------



## Imouto

Quote:


> Originally Posted by *iSlayer*
> 
> They can't. Its a hardware problem.


Some retailers are offering a GTX 980 in exchange for the GTX 970 + cash.


----------



## Master__Shake

can't wait for that gtx 970 fire sale.


----------



## rc dude

Quote:


> Originally Posted by *Imouto*
> 
> Some retailers are offering a GTX 980 in exchange for the GTX 970 + cash.


Which retailers?


----------



## Archngamin

Has a single review addressed this issue? Has anyone been able to reproduce the issues illustrated in the benchmark in a real-life scenario? I still haven't seen anything outside of one benchmark. Let's see what happens in games.


----------



## battleaxe

Buying a 4K monitor tomorrow... hopefully it's in stock. Gonna find out how this thing does.

So mad at Nvidia right now for this whole fiasco. I would have bought an AMD 290X (for less) if I had known I was going to be treated like this by Nvidia.

This was the biggest turd move I have seen in quite some time by a manufacturer. Yes, I want 4K, and yes, I wanted SLI. Now I have to come up with a solution for Nvidia's greed/gimping/fill in the blank______.

Thanks, Nviditurd.


----------



## rickcooperjr

Quote:


> Originally Posted by *Archngamin*
> 
> Has a single review addressed this issue? Has anyone been able to reproduce the issues illustrated in the benchmark in a real life scenario? I still haven't seen anything outside of one benchmark. Lets see what happens in games.


Yes, look through the thread; there have been many sets of proof in real-world gaming, even at 1080p. Once it hits the 3.5GB barrier the crap hits the fan, but up until 3.5GB it is all good. So yes, it has been proven.


----------



## Boomstick727

Quote:


> Originally Posted by *battleaxe*
> 
> Buying a 4K monitor tomorrow... hopefully it's in stock. Gonna find out how this thing does.
> 
> So mad at Nvidia right now for this whole fiasco. I would have bought an AMD 290X (for less) if I had known I was going to be treated like this by Nvidia.
> 
> This was the biggest turd move I have seen in quite some time by a manufacturer. Yes, I want 4K, and yes, I wanted SLI. Now I have to come up with a solution for Nvidia's greed/gimping/fill in the blank______.
> 
> Thanks, Nviditurd.


+1

I've been buying Nvidia cards since way back in the 8800GTX days and have liked both Nvidia and AMD, but hats off to Nvidia for the crappy way they have treated their customers over the 970 issues. Great false advertising, and then they just wash their hands of it..

They have managed to completely put me off buying a card from them in the future. Nice move, Nvidia.

I will literally only buy AMD GPUs from now on. I really hope they keep bringing the performance!


----------



## skupples

wuuuut, has to turn down settings on a gimped version of a mid range GPU?!

blasphemy!

yes yes I know, that 512mb of memory would make a WORLD of difference, until that runs out as well...

pro-tip: no matter how much memory a card has, it will act up in the last few hundred MB if it's truly swapping tons of data.


----------



## Cyro999

Quote:


> Originally Posted by *skupples*
> 
> wuuuut, has to turn down settings on a gimped version of a mid range GPU?!
> 
> blasphemy!
> 
> yes yes I know, that 512mb of memory would make a WORLD of difference, until that runs out as well...
> 
> pro-tip, no matter how much memory it has, it will act up in the last few hundred MB, if it's truly swapping tons of data.


Better for that to happen at 3.8GB than at 3.3GB


----------



## Archngamin

Quote:


> Originally Posted by *rickcooperjr*
> 
> Yes, look through the thread; there have been many sets of proof in real-world gaming, even at 1080p. Once it hits the 3.5GB barrier the crap hits the fan, but up until 3.5GB it is all good. So yes, it has been proven.


Mind pointing it out?


----------



## battleaxe

I think the point is... we bought 4GB cards.

What's next? Well... the card has 4GB, but only 2GB is truly useful, as the other 2GB is only glued to the card and not truly wired in. As others have said, it would have been better to just leave it at 3.5GB and not pull the gimped memory-sloshing trick. It seems to work on a few owners' cards and hits a brick wall on others. I personally bought my card knowing I would go SLI eventually. This is not what I thought I had bought. I had the option to purchase a 290X at the same time, but I wanted to own both Nvidia and AMD in two different rigs like I usually do. Now I'm kinda sorry I did. It just doesn't feel like I was given a fair representation of the product I was told I was getting (you know, by looking at the box, for example).


----------



## DIYDeath

Quote:


> Originally Posted by *skupples*
> 
> I did, and love them to this day.
> 
> in fact, my level of appreciation for original titan goes up each time a new GPU is released.
> 
> Every time a new title comes out, and I turn it on & crank settings to max w/ easily 120FPS locked, I just giggle, as it reminds me of the many many people with my same level of obsession that went from Titans, then bought into the 780 hype, so got 780s, then found out 780 was only faster due to clock speeds, so went back to titans, then to 780 TI, 780Ti Class/KPE, then over to AMD for a spin, then over to 980/970, all the while I'm over here just increasing my clock speeds by a few hundred MHz and giggling.
> 
> Unlike some people, I like having my PC & and functioning, as opposed to a sloppy office covered in parts & only running on a weak ass backup rig.


I meant in comparison to the Titan Black. Honestly, both the Titan and the Titan Black have a decent amount of future-proofing. I look at these 9XX series cards and lol all the time; for a supposedly improved architecture, Maxwell sure is sucking so far, compared to cream-of-the-crop Kepler at least.

I expect that to change within the year, though, which means before I'll consider upgrading I'll have spent $500 a year on a GPU, plus the resale on these cards is still $1000 due to double precision.


----------



## skupples

Quote:


> Originally Posted by *DIYDeath*
> 
> I meant in comparison to the Titan Black, honestly both the Titan and the Titan Black have a decent amount of future proofing, I look at these 9XX series cards and lol all the time, for supposedly improved architecture maxwell sure is sucking so far, compared to cream of the crop kepler at least.
> 
> I expect that to change within the year, though, which nets me approx. $500/year for my investment before I'll even consider upgrading.


There's only one way to buy Nvidia products in this modern era of AMD starting to once again kick ass (now if they could just release XFire updates within a week of a new game releasing, not a year), and that's by going flagship or bust, and I don't mean the 980. The 980 is not a flagship. OK, well it is, it's just a mid-range flagship.

The Titan Black is definitely a dumb acquisition, but hey! It was the only way to get a 6GB 780Ti!

Buying a Titan AFTER the 780 released was a bad move.

I only ever recommended Titans to people that were looking to run Surround, or 1600p+.


----------



## DIYDeath

Quote:


> Originally Posted by *skupples*
> 
> There's only one way to buy Nvidia products in this modern era of AMD starting to once again kick ass (now if they could just release XFire updates within a week of a new game releasing, not a year), and that's by going flagship or bust, and I don't mean the 980. The 980 is not a flagship. OK, well it is, it's just a mid-range flagship.
> 
> The Titan Black is definitely a dumb acquisition, but hey! It was the only way to get a 6GB 780Ti!
> 
> Buying a Titan AFTER the 780 released was a bad move.
> 
> I only ever recommended Titans to people that were looking to run Surround, or 1600p+.


Agreed, Nvidia at this point is flagship or bust, for the reasons you and I both know and one other: the resale on double-precision cards is fantastic. Titan Blacks still sell on eBay for $1000, which means I can essentially get a free upgrade somewhere down the road.

At first I had a 780 Ti. I hated the damn thing because I was constantly pushing 3GB of VRAM, so I bit the bullet and got the TB. The only reason I'm not sore over that is its high resale value, and I knew VRAM was going to become more and more of an issue over the next year or so - which it did.


----------



## skupples

Quote:


> Originally Posted by *DIYDeath*
> 
> Agreed, Nvidia at this point is flagship or bust, for the reasons you and I both know and one other: the resale on double-precision cards is fantastic. Titan Blacks still sell on eBay for $1000, which means I can essentially get a free upgrade somewhere down the road.


Yeah... I'm getting ready to drop down to dual-SLI, just so I can stick $600-$700 into the reserve account for future upgrades / female costs. Also selling my Russian-built VKB Big Fat Black Mamba joystick, because I finally found someone to fabricate a left-hand mirror for my second Warthog, but that's completely off topic.


----------



## rickcooperjr

Quote:


> Originally Posted by *Archngamin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> Yes, look through the thread; there have been many sets of proof in real-world gaming, even at 1080p. Once it hits the 3.5GB barrier the crap hits the fan, but up until 3.5GB it is all good. So yes, it has been proven.
> 
> 
> 
> Mind pointing it out?

No, I am not searching through all the pages to point out specific posts; that is your job, not mine. I have been watching this thread for the past 75-100 pages, so you are simply asking me to pick through all those pages for you, and I'm sorry, I am not doing your job for you.

The simple way to put it is that searching through the thread and reading is what will give you an idea of what's going on; that is the research you should be doing, not me, especially since I have already done so and don't wish to spend the next 3 hours going through the past 100 pages or so for you.

Simply put, you are asking me to do a lot for you and do all the research for you. That is like having a quiz and having your brother, who took the course the year before, take the test for you; doing things in this fashion you do no learning of your own, and in the end you will not actually learn anything. I call it being lazy when you ask someone else to do it for you.


----------



## Archngamin

Quote:


> Originally Posted by *rickcooperjr*
> 
> No, I am not searching through all the pages to point out specific posts; that is your job, not mine. I have been watching this thread for the past 75-100 pages, so you are simply asking me to pick through all those pages for you, and I'm sorry, I am not doing your job for you.
> 
> The simple way to put it is that searching through the thread and reading is what will give you an idea of what's going on; that is the research you should be doing, not me, especially since I have already done so and don't wish to spend the next 3 hours going through the past 100 pages or so for you.
> 
> Simply put, you are asking me to do a lot for you and do all the research for you. That is like having a quiz and having your brother, who took the course the year before, take the test for you; doing things in this fashion you do no learning of your own, and in the end you will not actually learn anything. I call it being lazy when you ask someone else to do it for you.


So... you couldn't find anything? I'm just surprised this wasn't caught by a single review out there.


----------



## rickcooperjr

Quote:


> Originally Posted by *Archngamin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> No, I am not searching through all the pages to point out specific posts. That is your job, not mine. I have been watching this thread for the past 75-100 pages, so you are asking me to pick through all those pages for you, and I am sorry, but I am not doing your job for you.
> 
> The simple way to put it is that searching through the thread and reading is what will give you an idea of what's going on. That is the research you should be doing, not me, especially since I have already done so and don't wish to spend the next 3 hours going through the past 100 pages or so for you.
> 
> Simply put, you are asking me to do all the research for you. That is like having your brother, who took the course the year before, take a quiz for you: you do no learning of your own and in the end will not actually learn anything. I call it being lazy when you ask someone else to do it for you.
> 
> 
> 
> So... you couldn't find anything? I'm just surprised this wasn't caught by a single review out there.
Click to expand...

That is the issue: ALL of the reviewers were given false info from Nvidia (i.e., lied to, and therefore spreading more lies and false info to us, the customers and viewers), and Nvidia often told reviewers what to use for testing, as part of working with the company to get test samples. So only a few reviewers are truly unbiased, as has also been pointed out in this thread. It is all in here and has been gone over for almost 250 pages, so apparently it is pretty well founded, since the subject is only about a week old.

Nvidia is well known for "you scratch my back, I scratch yours," but if a reviewer ever shows them in a negative light, that reviewer will almost certainly not get much hardware from Nvidia to review again. Ask some of the reviewers; they will tell you that when they don't show Nvidia in the limelight, they have to buy Nvidia products out of pocket for months or years before they get back into Nvidia's good graces and receive test samples again.

As I see it, what Nvidia has done is wrong, very shady, and illegal. They had to have known what they were doing before they allowed all this false information to get out, on top of releasing marketing material full of blatant lies. Then they hoped and prayed nobody caught it. It took over 4 months before it was found, but once it was, the crap hit the fan, and Nvidia is likely in hardcore damage-control mode, scared to eat the bullet for legal reasons. I can plainly see at least 4 major laws they have broken in the US alone, and I bet the same holds in other countries.

What this tells me is that Nvidia is now using their wording and responses as sparingly as possible, because every action they take at this point can be turned against them. In short, they are in the middle of a public legal nightmare that could turn very nasty very fast.

I also want to say that the laws they have broken step into the federal level: fraud and misrepresentation of a product, plus foreign trade agreement violations, plus negligence in the way they have handled the issue. The last one hurts their company most in my eyes; for a company as large as Nvidia, image is everything.


----------



## PhotonFanatic

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Now, the kicker is that guys with original Titans are still selling their cards on the market for $650 up to $750 in some cases meaning that they enjoyed all that performance for the last two years for basically $250-$350. Seems like a pretty awesome deal to me honestly (though I have no intention of selling my Titans as they still crush every game I play at 1440P)...


Those people are getting ripped off. I just watched a Titan Z sell on eBay for $768.


----------



## Tsumi

Quote:


> Originally Posted by *Archngamin*
> 
> So... you couldn't find anything? I'm just surprised this wasn't caught by a single review out there.


It would be a very hard issue to catch. Pretty much the only thing it causes is stuttering. Stuttering can be attributed to other things, like the GPU simply not being powerful enough.


----------



## Cyro999

Quote:


> Originally Posted by *Tsumi*
> 
> It would be a very hard issue to catch. Pretty much the only thing it causes is stuttering. Stuttering can be attributed to other things, like the GPU simply not being powerful enough.


It causes more than that.

In Shadow of Mordor, switching textures from high to ultra makes the game take 3-5x longer to load, with freezing, stuttering, and hard crashes.

With textures back on high, it loads fast again (instead of loading to 3.48GB in 5 seconds and then sitting there bouncing up and down on VRAM for the next 20 seconds, on an SSD), doesn't stutter, and doesn't crash.

It's pretty obvious, if you have a VRAM-demanding game or play at higher resolutions, that the card doesn't even try to use the last 0.5GB, and fails when it's needed, as expected when you go over VRAM capacity on any card.
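A rough way to picture the behavior described above is a toy model of a segmented pool: a fast 3.5GB section that gets filled first and a slow 0.5GB section that only comes into play past the barrier. The per-segment bandwidth numbers below are the commonly cited figures for the 970, and the averaging is a deliberate simplification, not a claim about how the real memory controller behaves:

```python
# Toy model of a segmented VRAM pool: a fast 3.5 GB segment (filled
# first) and a slow 0.5 GB segment. Bandwidth figures are the commonly
# cited per-segment peaks; the averaging is a simplification.
FAST_GB, SLOW_GB = 3.5, 0.5
FAST_BW, SLOW_BW = 196.0, 28.0  # GB/s (assumed per-segment peaks)

def effective_bandwidth(used_gb):
    """Average bandwidth if `used_gb` of VRAM is streamed once, fast segment first."""
    fast = min(used_gb, FAST_GB)
    slow = max(0.0, min(used_gb - FAST_GB, SLOW_GB))
    total_time = fast / FAST_BW + slow / SLOW_BW  # GB / (GB/s) = seconds
    return used_gb / total_time

for gb in (2.0, 3.5, 3.6, 4.0):
    print(f"{gb:.1f} GB in use -> {effective_bandwidth(gb):6.1f} GB/s effective")
```

In this model, touching even 100MB past the 3.5GB boundary drags the average down sharply, which lines up with the "fine until the barrier, then the crap hits the fan" pattern people are reporting.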


----------



## xenophobe

Quote:


> Originally Posted by *Boomstick727*
> 
> They have managed to completely put me off buying a card from them in the future. Nice move, Nvidia.
> 
> I literally will only buy AMD GPUs from now on. I really hope they keep bringing the performance!


lol. Up until the upgrade cycle where nVidia totally smashes ATI.

I guess you weren't a gamer who purchased ATI because of Doom 3 benchmarks. There was almost as much rage over that as there is over this. lol


----------



## Archngamin

Quote:


> Originally Posted by *rickcooperjr*
> 
> snip.


Obviously you don't understand that I have no desire to read every page of the multiple threads this topic takes up. I would say sorry to have bothered you, but you did this to yourself, as I asked an open question. Don't respond next time if you don't want to.
Quote:


> Originally Posted by *Tsumi*
> 
> It would be a very hard issue to catch. Pretty much the only thing it causes is stuttering. Stuttering can be attributed to other things, like the GPU simply not being powerful enough.


I believe this much more than the "every review was paid" theory.


----------



## xenophobe

Quote:


> Originally Posted by *Archngamin*
> 
> I believe this much more than the "every review was paid" theory.


Engineering Sample reviews often come with an NDA that only allows you to discuss some things. If it's a 3rd party site reviewing cards from specific manufacturers providing the cards or the brand is advertising on their site, you don't know what's going on. Most reviews will state where the cards come from. If they went out and bought them from regular distribution channels, they usually mention that too. Those are usually the most credible, IMO.


----------



## Archngamin

Quote:


> Originally Posted by *xenophobe*
> 
> Engineering Sample reviews often come with an NDA that only allows you to discuss some things. If it's a 3rd party site reviewing cards from specific manufacturers providing the cards or the brand is advertising on their site, you don't know what's going on. Most reviews will state where the cards come from. If they went out and bought them from regular distribution channels, they usually mention that too. Those are usually the most credible, IMO.


I still would have thought something like this would be spotted in a review, like over at HardForum.


----------



## xenophobe

Quote:


> Originally Posted by *Archngamin*
> 
> Still would have thought something like this would be spotted in a review like over at Hardforum.


A review that isn't specifically looking for this issue might not have even noticed. And from what I understand, this issue isn't an issue with most games. And if you've noticed, a lot of people are trying to reproduce this and the results are mixed.


----------



## skupples

plenty of reviewers were like "hey, herp derp, these act funny in SLi" but that's as far as their laziness went.


----------



## DIYDeath

Quote:


> Originally Posted by *skupples*
> 
> plenty of reviewers were like "hey, herp derp, these act funny in SLi" but that's as far as their laziness went.


I remember reading that in a few reviews too. Didn't a few of them try to chalk it up to SLI being generally broken or not having the best drivers?


----------



## skupples

Quote:


> Originally Posted by *DIYDeath*
> 
> I remember reading that in a few reviews too. Didn't a few of them try to chalk it up to SLI being generally broken or not having the best drivers?


Probably. That's the normal cop-out: "new tech, needs drivers, will get better, NBD." I guess no one bothered to open up a hardware monitor.


----------



## rickcooperjr

I have also heard rumors that the GTX 980 is somehow gimped as well; I'm not sure, but there are rumors about it not quite being everything that was advertised. Don't quote me on it, as I can't remember where I saw the info, but apparently it has to do with the GTX 780 / GTX 780 Ti being nearly identical in performance, even though the GTX 980 is supposed to have a superior architecture.

Basically, Nvidia advertised the GTX 980 to outdo the 780 Ti; if I remember correctly it was supposed to be a 15%+ increase, when in fact it averages around 2-5% in the few cases where the GTX 980 wins, and often the GTX 980 loses to the GTX 780 Ti. That has me a bit puzzled as to why they hyped the GTX 980 as the new king enthusiast card. Again, false advertising.


----------



## DIYDeath

Quote:


> Originally Posted by *skupples*
> 
> Probably. That's the normal cop-out: "new tech, needs drivers, will get better, NBD." I guess no one bothered to open up a hardware monitor.


I guess that just boils down to good judgement. If you're looking to SLI the 970 and reviewers note something wonky, you wait. If there are no SLI reviews, you wait, unless you want to do the SLI review yourself and don't mind being burned if you're not satisfied with the results.

At least people in Europe and Australia have their consumer rights protected; here in NA we usually get the finger.


----------



## Cyro999

Quote:


> Originally Posted by *rickcooperjr*
> 
> I have also heard rumors that the GTX 980 is somehow gimped as well; I'm not sure, but there are rumors about it not quite being everything that was advertised. Don't quote me on it, as I can't remember where I saw the info, but apparently it has to do with the GTX 780 / GTX 780 Ti being nearly identical in performance, even though the GTX 980 is supposed to have a superior architecture.


Probably just that GM204 wasn't the real GK110 replacement (780, Titan, 780 Ti).

Nvidia has recently been releasing its upper-midrange chip as an x80, like the GTX 680, while the 780/780 Ti were of the same generation (same architecture, same manufacturing process).

We will very likely see the same thing with the 980 and the "1080". There's a GM200 chip rumored to have 1.5x the SMM count and bus width of the 980.


----------



## skupples

Quote:


> Originally Posted by *rickcooperjr*
> 
> I have also heard rumors that the GTX 980 is somehow gimped as well; I'm not sure, but there are rumors about it not quite being everything that was advertised. Don't quote me on it, as I can't remember where I saw the info, but apparently it has to do with the GTX 780 / GTX 780 Ti being nearly identical in performance, even though the GTX 980 is supposed to have a superior architecture.


you're just off in lala land now, aren't you?

what part of aimed to replace GK104 do people still not understand? NV's goal was to put out as little as possible, as they knew they could run a "cheap" to produce product w/ little to no performance gains over top tier GK110 (though the lines separate when OC comes into play) because AMD would still be 6+ months away from a new product.


----------



## DIYDeath

Quote:


> Originally Posted by *skupples*
> 
> you're just off in lala land now, aren't you?
> 
> what part of aimed to replace GK104 do people still not understand? NV's goal was to put out as little as possible, as they knew they could run a "cheap" to produce product w/ little to no performance gains over top tier GK110 (though the lines separate when OC comes into play) because AMD would still be 6+ months away from a new product.


1000x this. Nvidia's R&D blows through cash; it's in their interest to make every generation of GPU last as long as possible. This is why competition in this market is necessary: without it, shady things like this, and far worse, happen.


----------



## rickcooperjr

Quote:


> Originally Posted by *DIYDeath*
> 
> Quote:
> 
> 
> 
> Originally Posted by *skupples*
> 
> you're just off in lala land now, aren't you?
> 
> what part of aimed to replace GK104 do people still not understand? NV's goal was to put out as little as possible, as they knew they could run a "cheap" to produce product w/ little to no performance gains over top tier GK110 (though the lines separate when OC comes into play) because AMD would still be 6+ months away from a new product.
> 
> 
> 
> 1000x this. Nvidia's R&D blows through cash; it's in their interest to make every generation of GPU last as long as possible. This is why competition in this market is necessary: without it, shady things like this, and far worse, happen.
Click to expand...

I also have to say Nvidia is dropping driver optimizations for their cards way too early. Most cards stop getting optimizations, outside of SLI profiles, within 1-2 years. This is especially troubling on the enthusiast-grade cards from one or two generations ago, like the 780 Ti and the Titans. Nvidia is really giving those owners the shaft after they dropped so much money on high-end cards that are still among Nvidia's top performers, and that is the main thing that irks me.

I also hear Nvidia is not pushing game optimization for previous generations of cards, only for the current generation, leaving people with 780s / 780 Tis / Titans in the dark. Supposedly they are not pushing game devs to add optimizations for older cards; even the previous-gen enthusiast lineup is being left out, even though those are still among Nvidia's top-performing cards.

So how exactly is that trying to make every generation of GPU last as long as possible?


----------



## Hattifnatten

Quote:


> Originally Posted by *Archngamin*
> 
> So... you couldn't find anything? I'm just surprised this wasn't caught by a single review out there.








Quote:


> Originally Posted by *Serandur*
> 
> Less VRAM is less VRAM. You will and do run into issues when you reach the 970s' 3.5 GB limit, unless actual smoothness and consistency mean absolutely nothing to you.
> 
> More details on what's going on:
> 
> 
> 
> 
> 
> 51 Seconds into this video, the cards are hooked up to a second monitor to shoot up VRAM usage, note how horribly the 970 stutters while the 980 simply does not.
> 
> https://www.youtube.com/watch?v=MTYd9_fe4iI&feature=youtu.be&t=57s - Note how the game completely _locks up_ on the 970 for several seconds.
> 
> -
> 
> Frametimes with SLI above 3.5 GBs are going insanely high (stutterfest)
> 
> I'm sure people will continue to fail to internalize any of this and keep claiming what they want to be true. Denial is a powerful process.
> 
> Here are the facts:
> 
> Nvidia misrepresented their specifications and violated federal laws. All who purchased 970s with those false specifications in mind are entitled to a refund.
> 
> They partitioned the memory with the last 512 MB being, measurably, far slower.
> 
> 970s will try to avoid using more than 3.5 GB in many scenarios, whereas 980s will immediately allocate all of their memory. When a 970 does use the remaining 512 MB, it is in fact using far slower memory.
> 
> We have plenty of evidence, including your own, demonstrating what this disparity does to performance (frametime inconsistency, minimum FPS tanks)
> 
> Nvidia are in denial of the issue and doing absolutely nothing, so far, for their customers.
> 
> Different scenarios will place different demands on VRAM. You *cannot* accurately assume, from a program that doesn't push the 970s' 3.5 GB cap, that all programs will be similarly unaffected.
> 
> Average FPS figures average in the brief spikes/stutters, thereby mitigating representation of the issue.





GoldenTiger was doing a lot of benches over at the Nvidia forums as well. PCPer just did an SLI review. Some games use more VRAM than others, and thus give different results. The FC4 video from ComputerBase (above) shows just how bad it CAN get.
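The point in the quote above, that average FPS figures smooth over brief spikes, is easy to demonstrate with synthetic frame times. Everything below is made-up illustrative data, not a measurement from a 970:

```python
# Two synthetic frame-time traces with near-identical average FPS: one
# smooth, one with 2% severe spikes. The average hides the stutter; the
# 99th-percentile frame time exposes it. All numbers are invented.
def percentile(samples_ms, p):
    s = sorted(samples_ms)
    return s[min(len(s) - 1, round(p / 100 * (len(s) - 1)))]

smooth = [16.7] * 1000                 # steady ~60 FPS
stutter = [15.0] * 980 + [99.0] * 20   # fast frames plus 2% long hitches

for name, frames in (("smooth", smooth), ("stutter", stutter)):
    avg_fps = 1000.0 / (sum(frames) / len(frames))
    p99 = percentile(frames, 99)
    print(f"{name:7s}: avg {avg_fps:5.1f} FPS, 99th pct frame time {p99:5.1f} ms")
```

Both traces average out to roughly 60 FPS, yet one of them hitches hard every couple of seconds, which is why a bar-chart review that only reports average FPS could miss this entirely.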




----------



## Ziglez

How many of you guys are going to switch to ATI after this?


----------



## Doktorbombay

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So they can sell more. Most people are not going to use more then 3.5GB but many people got the GTX 970 to upgrade because of vRAM over GTX780/Ti.


Well, do you really think people like myself bought the 970 over the 780 Ti only because of the VRAM? I can say for myself that I bought it because of that, because it was much cheaper, and because it was light years better than my GTX 570; otherwise I could have just bought the 980 instead.


----------



## error-id10t

For me it all depends on how the 300 series performs. I don't care whose card I use; that's the good thing about not being a fanboy. On a separate note, both of my cards are gone now and I can finally stop reading these threads lol


----------



## Doktorbombay

According to an interview I saw last month, they had to segment it because 3 SMM cores and 0.25MB of L2 cache were deactivated; otherwise the bandwidth of the entire 4GB of VRAM would have been slow, instead of just the last 0.5GB.


----------



## xenophobe

Quote:


> Originally Posted by *Ziglez*
> 
> How many of you guys are going to switch to ATI after this?


----------



## GrimDoctor

There's a good chance I will be


----------



## Noufel

Good news for 970 SLI owners: DX12 can use the VRAM of both GPUs at the same time.
PS: Golden, you can come back here.

(just kidding)


----------



## DIYDeath

Quote:


> Originally Posted by *Noufel*
> 
> Good news for 970 SLI owners: DX12 can use the VRAM of both GPUs at the same time.
> PS: Golden, you can come back here.
> 
> (just kidding)


Huh, well that pretty much nullifies the biggest issue. I say they should give all 970 owners a $75 coupon towards any purchase, per 970 purchased. Those two things combined would, imo, be sufficient to resolve the issue, and it works in their favor by enticing people affected by this to stay with Nvidia.


----------



## Khaled G

If I understand this correctly, the best thing to do in this situation would be for Nvidia to offer to flash all the 970s into 980s.


----------



## Ziglez

Quote:


> Originally Posted by *Khaled G*
> 
> If I understand this correctly, the best thing to do in this situation would be for Nvidia to offer to flash all the 970s into 980s.


You really think Nvidia would do this?


----------



## Silent Scone

1. No vendor would ever roll out a BIOS update beyond what is absolutely necessary.

2. If they did, you'd be left with a brick, because NVIDIA has almost always disabled SMs via laser cutting.


----------



## Rahldrac

I see many articles on different sites claiming "Nvidia corrects 970 specs" (like AnandTech, PCPer, Guru3D), but I cannot find an official source for this.

Edit:
I am asking because my reseller (Agito.pl) states that Nvidia has never changed the specs on this card.


----------



## skupples

Quote:


> Originally Posted by *Noufel*
> 
> Good news for 970s sli owners with the dx12 using both vram of the 2gpus at the same time.
> Ps: golden you can come back here
> 
> 
> 
> 
> 
> 
> 
> ( just kidding )


When was this announced?


----------



## poii

He probably talks about split frame rendering (SFR) in SLI/crossfire where memory doesn't need to be identical in both cards. Each GPU only needs memory to be able to render its part of the frame.

see
http://www.overclock.net/t/1539526/guru3d-both-mantle-and-dx12-can-combine-video-memory


----------



## Archngamin

Quote:


> Originally Posted by *Hattifnatten*
> 
> GoldenTiger was doing a lot of benches over at the Nvidia forums as well. PCPer just did an SLI review. Some games use more VRAM than others, and thus give different results. The FC4 video from ComputerBase (above) shows just how bad it CAN get.


Thanks. I saw the images but couldn't find the post.


----------



## nSone

Quote:


> Originally Posted by *skupples*
> 
> When was this announced.


http://www.guru3d.com/news-story/both-mantle-and-dx12-can-combine-video-memory.html
_Keep in mind though, this is a marketing rep talking ..._


----------



## Silent Scone

SFR uses the secondary GPUs as slaves, so performance scaling isn't great, but frame times are noticeably improved, along with the potential to stack available video memory. There is a reason games other than Civ: BE do not use this method, though.

AFR is still the preferred method, and what Thracks is saying is nothing new. He's just rocking the boat.


----------



## Blameless

Quote:


> Originally Posted by *Silent Scone*
> 
> There is a reason games do not use this method though besides CIV BE.
> 
> AFR is still the most preferable method


AFR is only the most preferable because it gives the largest scaling and best frame rate numbers. This frequently does not translate into the best experience.

I'd gladly sacrifice a fair amount of scaling for improvements to latency and frame time consistency...which is essentially what I've been doing since I dumped my SLI and CFX configs for single GPU setups.


----------



## Silent Scone

Quote:


> Originally Posted by *Blameless*
> 
> AFR is only the most preferable because it gives the largest scaling and best frame rate numbers. This frequently does not translate into the best experience.
> 
> I'd gladly sacrifice a fair amount of scaling for improvements to latency and frame time consistency...which is essentially what I've been doing since I dumped my SLI and CFX configs for single GPU setups.


Tell that to AMD, who are pro-actively promoting UHD resolutions and SFR simultaneously, lol. Johan Andersson, who was lead programmer on Frostbite 3.0 / Battlefield 4, has said AFR is still the method of choice.

There is a future for SFR, but not much of one within the next two years, IMO.


----------



## skupples

Quote:


> Originally Posted by *nSone*
> 
> http://www.guru3d.com/news-story/both-mantle-and-dx12-can-combine-video-memory.html
> _Keep in mind though, this is a marketing rep talking ..._


I mean, some of us have been cheering on the death of AFR and Mirrored memory for many many many years, but I'm not holding my breath.


----------



## Orangey

Something's gotta give, with unified memory, mezzanine connectors, HBM, etc. all in the cards over the next 2 years.


----------



## Imouto

Quote:


> Originally Posted by *rc dude*
> 
> Which retailers?


Coolmod in Spain for example.


----------



## Tsumi

Quote:


> Originally Posted by *Silent Scone*
> 
> Tell that to AMD who are pro-actively promoting UHD resolutions and SFR simultaneously lol. Johan Andersson who was lead programmer on Frostbite 3.0 / Battlefield 4 has said AFR is still the method of choice.
> 
> There is a future for SFR, but it's not got much of one within the next two years IMO


SFR will require mirroring most of the VRAM. If you're rapidly turning, you can't wait for the GPUs to load the textures for their part of the screen. It will reduce the frame size needed, but that only takes up a small portion of the VRAM. Different techniques will be needed for VRAM pooling.
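To put rough numbers on that: under a naive SFR split, only the render targets can be halved, while textures stay mirrored on both cards. All sizes below (bytes per pixel, buffer count, texture working set) are assumptions for illustration, not figures from any real game:

```python
# Back-of-envelope VRAM for two GPUs at 4K under AFR vs a naive SFR
# split. Assumption: textures/geometry must be mirrored on both cards,
# and only render targets can be halved. All sizes are illustrative.
W, H = 3840, 2160
BYTES_PER_PIXEL = 16   # assumed: deep G-buffer formats averaged out
NUM_TARGETS = 5        # assumed number of full-screen buffers
textures_gb = 3.0      # assumed mirrored texture/geometry working set

render_targets_gb = W * H * BYTES_PER_PIXEL * NUM_TARGETS / 1024**3

afr_per_gpu = textures_gb + render_targets_gb        # everything mirrored
sfr_per_gpu = textures_gb + render_targets_gb / 2    # only targets split

print(f"render targets: {render_targets_gb:.2f} GB")
print(f"AFR per GPU:    {afr_per_gpu:.2f} GB")
print(f"SFR per GPU:    {sfr_per_gpu:.2f} GB (saves {render_targets_gb / 2:.2f} GB)")
```

Even with generous buffer sizes, the splittable portion is a few hundred MB against gigabytes of mirrored texture data, which is why "combining" VRAM via SFR doesn't come close to doubling the usable pool.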


----------



## Blameless

Quote:


> Originally Posted by *Tsumi*
> 
> SFR will require mirroring most of the VRAM.


Yes.
Quote:


> Originally Posted by *Silent Scone*
> 
> There is a future for SFR, but it's not got much of one within the next two years IMO


Everything but AFR has largely been in limbo, with a handful of exceptions cropping up from time to time, since 2006.

I don't expect the transition from AFR to be anything resembling quick. It's going to take forever and it will be like pulling teeth.


----------



## clerick

So the two Nvidia reps who replied about taking care of everyone and helping with refunds have disappeared from the forums and haven't been active whatsoever (and everyone who sent them PMs didn't get a reply back; it's been 5 days). This thread should be re-titled "Nvidia stays silent", because they are clearly not replying at the present time.


----------



## sugalumps

Quote:


> Originally Posted by *clerick*
> 
> So the two Nvidia reps who replied about taking care of everyone and helping with refunds have disappeared from the forums and haven't been active whatsoever (and everyone who sent them PMs didn't get a reply back; it's been 5 days). This thread should be re-titled "Nvidia stays silent", because they are clearly not replying at the present time.


For them to do that would be to admit there is something wrong with the card, and they are sticking to "it's fine, that's the way we wanted to make it."


----------



## boot318

Quote:


> Originally Posted by *clerick*
> 
> So the two Nvidia reps who replied about taking care of everyone and helping with refunds have disappeared from the forums and haven't been active whatsoever (and everyone who sent them PMs didn't get a reply back; it's been 5 days). This thread should be re-titled "Nvidia stays silent", because they are clearly not replying at the present time.


I feel better about making my purchases from Amazon, then. I just told them the problem and they approved the RMA. They advanced me my money back before I could even print the UPS shipping document. #datcustomersupport


----------



## doritos93

Quote:


> Originally Posted by *clerick*
> 
> So the two Nvidia reps who replied about taking care of everyone and helping with refunds have disappeared from the forums and haven't been active whatsoever (and everyone who sent them PMs didn't get a reply back; it's been 5 days). This thread should be re-titled "Nvidia stays silent", because they are clearly not replying at the present time.


"Nvidia REP stays silent": a classic case of speaking too soon. You should never confirm anything (especially refunds, lol) when the company you work for hasn't even addressed the issue.


----------



## tpi2007

Quote:


> Originally Posted by *clerick*
> 
> So the two Nvidia reps who replied about taking care of everyone and helping with refunds have disappeared from the forums and haven't been active whatsoever (and everyone who sent them PMs didn't get a reply back; it's been 5 days). This thread should be re-titled "Nvidia stays silent", because they are clearly not replying at the present time.


Quote:


> Originally Posted by *sugalumps*
> 
> For them to do that would be to admit there is something wrong with the card, and they are sticking to "it's fine, that's the way we wanted to make it."


Exactly. They are in full damage-control mode, which means saying as little as possible, to avoid admitting there is a problem and to avoid additional exposure, hoping the matter goes away. Let's be honest: with a viral video already at 500k+ views in less than a week, that isn't going to happen.

That is why the comment about helping people with returns was removed. It's all happening at the retailer level in a non-standard manner, meaning you will get different treatment depending on where you bought the card. A big mess, to say the least.

It's also why they only made official comments to three top-tier sites.

And finally, it's why they never made an official statement on their site about the matter, leaving it to the forums, and why they didn't correct the 970's memory bandwidth on the official spec sheet.

To those who say you can achieve that bandwidth through careful balancing of reads on one segment and writes on the other, I ask: the 980's advertised bandwidth is the same, and yet it has a unified pool with the full complement of ROPs and L2 cache to access it, instead of having to use one ROP/L2 unit of the first segment to access the second, which overburdens that unit. How can they be the same? Something doesn't add up.
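For concreteness, here is the arithmetic behind that objection, using the figures from the public spec discussion: a 256-bit bus of 7 Gbps GDDR5 gives 224 GB/s, but on the 970 the two segments peak at 196 GB/s (seven 32-bit controllers) and 28 GB/s (one controller). Quoting 224 GB/s therefore assumes both segments streaming flat-out simultaneously, and the shared ROP/L2 path is exactly what makes that doubtful. The combination logic below is a simplification, not a measurement:

```python
# The 970's advertised 224 GB/s only holds if both segments are
# saturated at once. Controller figures follow the public spec
# discussion (256-bit bus, 7 Gbps GDDR5); summing them is best-case.
GBPS_PER_CONTROLLER = 7 * 32 / 8     # 7 Gbps on a 32-bit controller = 28 GB/s

fast_seg = 7 * GBPS_PER_CONTROLLER   # 3.5 GB segment: seven controllers
slow_seg = 1 * GBPS_PER_CONTROLLER   # 0.5 GB segment: one controller

print(f"fast segment peak: {fast_seg:.0f} GB/s")
print(f"slow segment peak: {slow_seg:.0f} GB/s")
print(f"both at once (best case): {fast_seg + slow_seg:.0f} GB/s")
print(f"slow segment alone is {slow_seg / (fast_seg + slow_seg):.0%} of the advertised figure")
```

So the spec-sheet number is only reachable in a contrived best case; any workload sitting mostly in one segment sees 196 GB/s at most, which is the heart of tpi2007's "something doesn't add up."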


----------



## AngryGoldfish

I wonder whether any engineers are reading these forums privately?

Hi.


----------



## rickcooperjr

Quote:


> Originally Posted by *AngryGoldfish*
> 
> I wonder whether any engineers are reading these forums privately?
> 
> Hi.


I bet the Nvidia execs are reading them, probably sweating bullets and hoping and praying it just blows over. Nvidia has already gone the tight-lipped lockdown route and is no longer working on refunds; basically they are in a tight spot, know they are in trouble either way, and don't know how to remedy it.

The short outcome is that you see the true side of Nvidia: when the crap hits the fan they disappear and leave their customers and fanboys hanging, and that is the main thing I believe everyone can agree on.

Had this been AMD, they would have been vocal and explained what they were doing to remedy the issue, or promptly worked with vendors to hand out refunds or upgrades. I have been upgraded to the next tier up, free of charge, by AMD before; Nvidia will never do this.

I sent in an HD 6970 to AMD for RMA directly and received an HD 7950 in return. I also sent in a bad XFX HD 4890 years ago and received an HD 6870 as a replacement. Both times I received a more powerful and newer GPU than I sent in, so AMD tends to make things right much better than Nvidia.


----------



## Tsumi

Quote:


> Originally Posted by *rickcooperjr*
> 
> I bet the Nvidia execs are reading them, probably sweating bullets and hoping and praying it just blows over. Nvidia has already gone the tight-lipped lockdown route and is no longer working on refunds; basically they are in a tight spot, know they are in trouble either way, and don't know how to remedy it.
> 
> The short outcome is that you see the true side of Nvidia: when the crap hits the fan they disappear and leave their customers and fanboys hanging, and that is the main thing I believe everyone can agree on.


I highly doubt nVidia cares much. At the end of the day, OEM systems are still shipping with nVidia cards, and they still have their workstation/enterprise sales. A few low-level managers might be sweating bullets, but the top level is going to be like, "Meh, whatever, this will disappear in a year. Let's get back to those multi-million dollar Quadro and Tesla contracts." Consumers have short memories.


----------



## xenophobe

Quote:


> Originally Posted by *rickcooperjr*
> 
> The short version is that you see the true side of Nvidia when the crap hits the fan: they disappear and leave their customers / fanboys hanging. I think that's the one thing everyone here can agree on.


ATI is no different. They've done equally disturbing things to their customers in the past.

The only thing I am a fanboi of is Coca-Cola. I will never, ever have a Pepsi. Ever.

As for computer hardware, whatever is best at the time that I want to purchase, I buy. Antics like what nVidia has pulled are to be expected. This year it's nVidia, next year it will be ATI. All I care about is performance. And yes, if I bought a 970 I would be mad as hell. I didn't, so I'm not.


----------



## AngryGoldfish

Quote:


> Originally Posted by *rickcooperjr*
> 
> I bet the Nvidia execs are reading these threads, sweating bullets, and hoping it just blows over. Nvidia has already gone the tight-lipped lockdown route and is no longer working on refunds, so they're in a tight spot: they know they're in trouble either way and don't know how to remedy it.
> 
> The short version is that you see the true side of Nvidia when the crap hits the fan: they disappear and leave their customers / fanboys hanging. I think that's the one thing everyone here can agree on.
> 
> Had this been AMD, I believe they would have been vocal, explained what they were doing to remedy the issue, or promptly worked with vendors to hand out refunds or upgrades. I've been upgraded to the next tier up by AMD free of charge before; Nvidia will never do this.
> 
> I once sent a HD 6970 directly to AMD for RMA and received a HD 7950 in return, and years ago I sent in a bad XFX HD 4890 and received a HD 6870 as a replacement. Both times I got a more powerful, newer GPU than the one I sent in, so in my experience AMD tends to make things right much better than Nvidia does.


I get what you mean, but I can't help but think, if executives are reading this (which I personally doubt), they will not be worried. You don't get to be an executive by worrying about small things. To us, this is big, but to them, they're still selling units and no doubt have a solution to make sure they keep making money. But ultimately we're only guessing because no-one from nVidia is really discussing the issue and trying to rally support.


----------



## clerick

So I had a thought:

Since the real specs of the card are this:

Memory Bandwidth (original) - 224 GB/s for 4GB VRAM
Memory Bandwidth (actual) - 196 GB/s for 7 memory chips totalling 3.5 GB, and 28 GB/s for the remaining 512 MB.

Since nvidia did mental gymnastics and added the 512 MB partition's bandwidth on top to arrive at the 224 GB/s figure, what is to stop them from adding 1 MB of 600 GB/s RAM and then claiming the card does 800 GB/s overall, since the 4 GB of RAM runs at 200 GB/s and the last 1 MB runs at 600 GB/s?

This is literally what they did here.
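For what it's worth, the arithmetic behind that complaint is easy to sketch. A capacity-weighted average is the closest thing to an honest single number for segmented memory, and it comes out well below the advertised sum of segment peaks. This is just a toy model using the segment figures from the post, not a claim about how the card actually schedules memory traffic:

```python
# Toy model of the GTX 970 memory split using the figures from the post:
# 7 chips x 28 GB/s = 196 GB/s for the 3.5 GB partition,
# 1 chip  x 28 GB/s =  28 GB/s for the 0.5 GB partition.
segments = [
    {"size_gb": 3.5, "bw_gbs": 196.0},  # fast partition
    {"size_gb": 0.5, "bw_gbs": 28.0},   # slow partition
]

# Marketing-style figure: sum the segment peaks as if both were always busy.
summed_peak = sum(s["bw_gbs"] for s in segments)

# Capacity-weighted average: what a workload touching all 4 GB uniformly
# would see under this (simplified) model.
total_gb = sum(s["size_gb"] for s in segments)
weighted = sum(s["bw_gbs"] * s["size_gb"] for s in segments) / total_gb

print(summed_peak)  # 224.0 -- the advertised number
print(weighted)     # 175.0 -- the capacity-weighted effective figure
```

The same weighting shows why the 1 MB / 600 GB/s hypothetical is absurd: weighted by capacity, that 1 MB would add essentially nothing, yet a peak-summing marketer could still quote 800 GB/s. Whether the real card can even access both partitions simultaneously is a separate question this sketch doesn't model.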


----------



## mtcn77

Quote:


> Originally Posted by *clerick*
> 
> So I had a thought:
> 
> Since the real specs of the card are this:
> 
> Memory Bandwidth (original) - 224 GB/s for 4GB VRAM
> Memory Bandwidth (actual) - 196 GB/s for 7 memory chips totalling 3.5 GB, and 28 GB/s for the remaining 512 MB.
> 
> Since nvidia did mental gymnastics and added the 512 MB partition's bandwidth on top to arrive at the 224 GB/s figure, what is to stop them from adding 1 MB of 600 GB/s RAM and then claiming the card does 800 GB/s overall, since the 4 GB of RAM runs at 200 GB/s and the last 1 MB runs at 600 GB/s?
> 
> This is literally what they did here.


I'm expecting the next card to launch on the strength of its 2 MB L2 cache's 560 GB/s highlight figure.


----------



## doritos93

Quote:


> Originally Posted by *AngryGoldfish*
> 
> I wonder whether any engineers are reading these forums privately?
> 
> Hi.


If you got paid a nice six figures to engineer a card that makes no difference in your life, would you read this thread?

If the card had half the memory they advertised, the engineers still get paid.


----------



## iSlayer

Quote:


> Originally Posted by *clerick*
> 
> So I had a thought:
> 
> Since the real specs of the card are this:
> 
> Memory Bandwidth (original) - 224 GB/s for 4GB VRAM
> Memory Bandwidth (actual) - 196 GB/s for 7 memory chips totalling 3.5 GB, and 28 GB/s for the remaining 512 MB.
> 
> Since nvidia did mental gymnastics and added the 512 MB partition's bandwidth on top to arrive at the 224 GB/s figure, what is to stop them from adding 1 MB of 600 GB/s RAM and then claiming the card does 800 GB/s overall, since the 4 GB of RAM runs at 200 GB/s and the last 1 MB runs at 600 GB/s?
> 
> This is literally what they did here.


Don't give them any ideas.


----------



## lombardsoup

Enjoying my 290X after a full refund of my 970; I plan on upgrading to the 300 series when it's released.


----------



## Serandur

Quote:


> Originally Posted by *clerick*
> 
> So I had a thought:
> 
> Since the real specs of the card are this:
> 
> Memory Bandwidth (original) - 224 GB/s for 4GB VRAM
> Memory Bandwidth (actual) - 196 GB/s for 7 memory chips totalling 3.5 GB, and 28 GB/s for the remaining 512 MB.
> 
> Since nvidia did mental gymnastics and added the 512 MB partition's bandwidth on top to arrive at the 224 GB/s figure, what is to stop them from adding 1 MB of 600 GB/s RAM and then claiming the card does 800 GB/s overall, since the 4 GB of RAM runs at 200 GB/s and the last 1 MB runs at 600 GB/s?
> 
> This is literally what they did here.


No need to stop there.

Nvidia could count system memory amount and bandwidth and just add it to their marketing totals.

Throw a couple of 20nm Denver cores on these cards, clock them to above 2 GHz, and watch Nvidia claim them to be running at 2000+ MHz on air, standard, while being manufactured on a 20nm process.

Market a card's TDP based on its idle power draw.

...all while neglecting to mention or even hint towards the true values.

It's not like accurate representation of its products is important to Nvidia; the marketing team can have a blast, claim miscommunication, and excuse it all without legal or financial obligations.


----------



## sugalumps

Quote:


> Originally Posted by *lombardsoup*
> 
> Enjoying my 290X after full refund of my 970, and plan on upgrading to the 300 series when its released.


Be honest: do you ever go above 3.5 GB of VRAM? If not, then you literally see no performance difference from the switch you made, and since you're upgrading again soon anyway, what was the point? Let me guess: "principle"? Unless you got the 290X cheaper, in which case it was a smart move.


----------



## Serandur

Quote:


> Originally Posted by *Yungbenny911*
> 
> Oh, how wonderful! You tested your SLI 970's at 5K RES, and you expected them to run at 60 FPS+ with AA cranked up? smh...
> 
> People with 6GB 4-way SLI Titans have a hard time maintaining 60 fps+ at that resolution, so i wonder what magic performance you were expecting your 970's to exhibit. You're just being delusional with your method of testing, why not play the game like you normally would? Are you buying that 60/120Hz High-RES Monitor to play your games at 30 FPS?
> 
> If you were wondering if the 970 would run at 120FPS if it had 30GB V-RAM, it won't. V-RAM is not all that matters; applying AA, and increasing resolution does not only affect V-RAM, it also affects the processing power of the GPU, and no matter the amount of V-RAM a GPU has, if it runs out of processing capabilities, your games would run like crap, and that's a FACT.
> 
> As i said, this is a silly argument, and this would be my last reply to you on this subject.


You sure you're replying to me? From the very portion you bolded:

"and there are issues with less than 99% GPU load and either bordering the 3.5 GB mark or surpassing it whereas there are none beforehand"

That couldn't be more clear, unless the concepts of GPU load and VRAM walls causing a very characteristic stuttering/hitching (clearly not caused by insufficient GPU rendering power) elude you. My testing was very thorough specifically to pinpoint whether the 970s were exhibiting a VRAM-related issue or otherwise, as is many other people's testing.

The results are conclusive: there are VRAM-related issues at the 3.5 GB border caused by that "soft" boundary. It's not debatable. Nvidia lied, Nvidia's lie has very real implications, and they are legally bound to refund purchasers based on their misrepresentation: basic elements of fraud and a proven issue with the product.

The only thing silly about the argument is that you don't seem willing to read what I'm saying and instead have to conjure up some phantom argument I did not even remotely make to shoot down instead.


----------



## FlyingSolo

The company I bought the card from said today that they will not give a full refund, but they will exchange it for any other card. Obviously I have to pay more if the card I choose costs more. The other day they said they would give a full refund. Now I have to decide what card to go for, since the new cards will be coming out soon.


----------



## lombardsoup

Quote:


> Originally Posted by *sugalumps*
> 
> Be honest: do you ever go above 3.5 GB of VRAM? If not, then you literally see no performance difference from the switch you made, and since you're upgrading again soon anyway, what was the point? Let me guess: "principle"? Unless you got the 290X cheaper, in which case it was a smart move.


Noticed the performance issue and saved an extra $50. Props to Amazon for the painless full refund done in a matter of minutes.

False advertising is enough to make me avoid a company entirely, at least until those responsible are released from their positions. I'm willing to buy from NVIDIA again if they make things right.


----------



## Serandur

Quote:


> Originally Posted by *FlyingSolo*
> 
> The company I bought the card from said today that they will not give a full refund, but they will exchange it for any other card. Obviously I have to pay more if the card I choose costs more. The other day they said they would give a full refund. Now I have to decide what card to go for, since the new cards will be coming out soon.


If you choose a cheaper card, are they refunding you the difference?

R9 290s are a pretty good deal right now for a temporary but powerful card, should be able to sell it for a good amount later on too for the new stuff.


----------



## mtcn77

Quote:


> Originally Posted by *sugalumps*
> 
> Be honest: do you ever go above 3.5 GB of VRAM? If not, then you literally see no performance difference from the switch you made, and since you're upgrading again soon anyway, what was the point? Let me guess: "principle"? Unless you got the 290X cheaper, in which case it was a smart move.


Lol at the inductive reasoning.
Nvidia GPUs and AMD GPUs have very different strong points: Nvidia struggles at the back end, AMD at the front end. So the usual complaint with AMD GPUs is low fps due to excess geometry, while Nvidia GPUs can't sustain fluid gameplay at super-resolutions, as shown all too frequently in recent game reviews.


----------



## amtbr

Quote:


> Originally Posted by *sugalumps*
> 
> Be honest: do you ever go above 3.5 GB of VRAM? If not, then you literally see no performance difference from the switch you made, and since you're upgrading again soon anyway, what was the point? Let me guess: "principle"? Unless you got the 290X cheaper, in which case it was a smart move.


Is it so hard to believe someone would return something out of principle? Do you enjoy being lied to? In the end, it's no sweat off your back what video card he uses, is it?


----------



## Ghoxt

Quote:


> Originally Posted by *sugalumps*
> 
> Be honest: do you ever go above 3.5 GB of VRAM? If not, then you literally see no performance difference from the switch you made, and since you're upgrading again soon anyway, what was the point? Let me guess: "principle"? Unless you got the 290X cheaper, in which case it was a smart move.


I understand the gamer's point of view, but I do Octane GPU rendering in parallel and purchased seven 970s for the CUDA cores, efficiency, and 4 GB of memory. All but one are going back to Amazon. Having 4 GB of scene files in memory completely negates the point of GPU rendering, since the slow memory segment kills it. Sure, I could keep my scenes under 3.5 GB, but that's not what I paid for. I was wondering why larger scenes were slow, and that was before I had purchased all the cards. If I had known, I would have made a different decision and honestly bought multiple 980s or Titan Blacks instead...


----------



## clerick

Quote:


> Originally Posted by *sugalumps*
> 
> Be honest: do you ever go above 3.5 GB of VRAM? If not, then you literally see no performance difference from the switch you made, and since you're upgrading again soon anyway, what was the point? Let me guess: "principle"? Unless you got the 290X cheaper, in which case it was a smart move.


That's a very silly way of looking at it. You're basically saying you're fine being gouged for things that aren't there as long as it works "good enough".


----------



## DIYDeath

Quote:


> Originally Posted by *clerick*
> 
> That's a very silly way of looking at it. You're basically saying you're fine being gouged for things that aren't there as long as it works "good enough".


Not to mention that every 970 owner will eventually be affected by this issue; games' minimum specs don't typically go down, they go up.

But hey, if he's fine with being ripped off, then more power to him, but I reserve a big, fat "told ya so" for a year or two down the road, when more and more games will highlight the 970's issue.


----------



## FlyingSolo

Quote:


> Originally Posted by *Serandur*
> 
> If you choose a cheaper card, are they refunding you the difference?
> 
> R9 290s are a pretty good deal right now for a temporary but powerful card, should be able to sell it for a good amount later on too for the new stuff.


Not sure about refunding the difference. Forgot to ask about that.


----------



## jdstock76

Anyone know what the next gen Nvidia cards number designations will be? I haven't seen anything yet.


----------



## skupples

Quote:


> Originally Posted by *Serandur*
> 
> No need to stop there.
> 
> Nvidia could count system memory amount and bandwidth and just add it to their marketing totals.
> 
> Throw a couple of 20nm Denver cores on these cards, clock them to above 2 GHz, and watch Nvidia claim them to be running at 2000+ MHz on air, standard, while being manufactured on a 20nm process.
> 
> *Market a card's TDP based on its idle power draw.*
> 
> ...all while neglecting to mention or even hint towards the true values.
> 
> It's not like accurate representation of its products is important to Nvidia; the marketing team can have a blast, claim miscommunication, and excuse it all without legal or financial obligations.


that's intel's game.


----------



## iSlayer

Quote:


> Originally Posted by *Serandur*
> 
> No need to stop there.
> 
> Nvidia could count system memory amount and bandwidth and just add it to their marketing totals.
> 
> Throw a couple of 20nm Denver cores on these cards, clock them to above 2 GHz, and watch Nvidia claim them to be running at 2000+ MHz on air, standard, while being manufactured on a 20nm process.
> 
> Market a card's TDP based on its idle power draw.
> 
> ...all while neglecting to mention or even hint towards the true values.
> 
> It's not like accurate representation of its products is important to Nvidia; the marketing team can have a blast, claim miscommunication, and excuse it all without legal or financial obligations.


TDP != power usage.
Quote:


> Originally Posted by *skupples*
> 
> that's intel's game.


Facepalm


----------



## kckyle

Although it's technically not false advertising, it is quite deceptive, and this could and should hurt Nvidia's customer loyalty.

It will most likely hurt the resale value of the 970 on the second-hand market. Although 3.5 GB is enough for today's games, that last "slow" 500 MB is definitely going to hurt any sort of future-proofing, especially if you plan on going SLI.

I'm just glad most people aren't fully aware of this problem, or 290 prices would probably jump up a bit.

Let's just say going back to team red looks mighty tempting for the upcoming cards.


----------



## Yungbenny911

Nvidia deserves every bit of backlash they're getting for misleading people, and they should definitely find a way to compensate for their actions. I hate misinformation of any kind; that's also why I don't support people trying to make the 970 look like a bad product. Nvidia lied about its specs, that's all there is to it. The 970 is still a great GPU and will perform ridiculously well in normal gaming scenarios.

If your aim is to run 8x MSAA on a 5K monitor, no current single or dual GPU setup will give you playable FPS in demanding games; even quad-GPU setups will sweat at settings like that. Yes, bash Nvidia for lying, but don't bash the 970. It performs just as well as any other GPU out there in real-world gaming.

The best thing to do would be to force Nvidia to reduce the price, or find a way to make them pay for whatever damages they may have caused you.


----------



## skupples

Quote:


> Originally Posted by *iSlayer*
> 
> TDP != power usage.
> Facepalm


That was a comment about when Intel was debating reclassifying how they rate TDP, based off of non-TDP numbers.

Did they ever end up doing that? I rarely pay attention to TDP, seeing as I have close to 2.5 kW of power available in my tower.


----------



## rickcooperjr

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iSlayer*
> 
> TDP != power usage.
> Facepalm
> 
> 
> 
> was a comment towards when intel was debating reclassifying how they rate TDP based off of non TDP numbers.
> 
> did they ever end up doing that? I rarely pay attention to TDP, seeing as I have close to 2.5KW of power available in my tower.

Same here. I have two PC Power & Cooling Turbo-Cool 1200W power supplies that can each do 1400W. They're server-grade units that are hard to get ahold of anymore, but still some of the best power supplies around: http://www.pcper.com/reviews/Cases-and-Cooling/PC-Power-amp-Cooling-Turbo-Cool-1200W-PSU-Review/Specifications (OCZ bought out PC Power & Cooling, if I remember correctly; the older units are the server-grade parts known to be almost indestructible, and mine are the older ones). I have around five of the 1 kW models and two or three of the 1200W models here. I love these power supplies: super reliable, with very clean output. The power supply is the lifeblood of a machine, so I run the good stuff to feed mine.

That gives me 2.4-2.8 kW worth of power in my tower. I have a dedicated 220V line back here rated for 80 or 90 amps (can't remember which). I ran the line myself for my own use; it was a pain to run it through the attic, but it's all up to spec and code and has its own breaker box.


----------



## skupples

Wouldn't being up to spec & code require someone with the proper licensing to do it?

idk about where you live, but that's how it works down here. Hurricane code & all.


----------



## rickcooperjr

Quote:


> Originally Posted by *skupples*
> 
> Wouldn't it being up to spec & code require someone of proper licensing doing it?
> 
> idk about where you live, but that's how it works down here. Hurricane code & all.


Yes. My uncle is a certified electrician and runs an electrical business; he was here the whole time, and it was inspected by the local inspector a few days later. By the way, I'm from a family full of construction workers, so I was raised around this stuff, and my family owns several construction companies.

I also had the electrical inspected later on for my homeowner's insurance and for house-value adjustments, due to add-ons and changes I made to the house and shops. Everything is fully up to code.

I also have 4 kW worth of solar with a large battery bank, plus a backup tri-fuel generator (propane / gasoline / natural gas) that I installed, which is part of why I had to have the insurance inspector check things and the home value re-evaluated. It raised my home's value well over $20k, almost $30k; I can't remember the exact number without getting the paperwork out.

I want to point out that I live in tornado alley, and the past few years the grid around here has gotten very unreliable; a few years back we had power outages nearly every day, for 4-6 hours at a time. I had a daughter on the way and had pretty much all the equipment acquired, so I decided to go for it. Trust me, best decision I have ever made. I can go off-grid in an outage without any issues, run on solar for several days at a time if I'm careful about power usage, and if needed fire the generator up to charge the bank back up. I have other backup alternatives as well.

I am from a military family that has always made sure to be prepared for anything at any given time.


----------



## skupples

ah, of course


----------



## Blameless

TDP is thermal design power: the amount of heat a part's cooling solution needs to dissipate for the part to remain within spec.

Both AMD and NVIDIA GPU TDPs are controlled by dynamic clocks and limiters. Peak power, when the parts are forced to their maximum clocks and not allowed to throttle under the heaviest of loads, will almost always exceed TDP... which is why power limits often need to be raised for meaningful overclocking.

Regarding Intel's processor TDPs, I have never seen any Intel CPU reach its TDP at stock clocks and volts, even when running the most stressful tests I could find. It's not rare for peak power to be much lower than the rated TDP of a series; I have, for example, a few 130W TDP parts that struggle to consume/dissipate more than 70-80W while running LINPACK.
Quote:


> Originally Posted by *skupples*
> 
> Wouldn't it being up to spec & code require someone of proper licensing doing it?


These things are almost impossible to enforce at a private residence unless you need a new building permit for some reason, or are selling the property, and someone bothers to get it inspected.

Personally, I live in a death trap that would fail all sorts of inspection check boxes.


----------



## skupples

houses are normally inspected when sold...

hell, lots of insurance companies require it before insuring a newly acquired property flagged as a single-family home.


----------



## rickcooperjr

Quote:


> Originally Posted by *rickcooperjr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *skupples*
> 
> Wouldn't it being up to spec & code require someone of proper licensing doing it?
> 
> idk about where you live, but that's how it works down here. Hurricane code & all.
> 
> 
> 
> Yes. My uncle is a certified electrician and runs an electrical business; he was here the whole time, and it was inspected by the local inspector a few days later. By the way, I'm from a family full of construction workers, so I was raised around this stuff, and my family owns several construction companies.
> 
> I also had the electrical inspected later on for my homeowner's insurance and for house-value adjustments, due to add-ons and changes I made to the house and shops. Everything is fully up to code.
> 
> I also have 4 kW worth of solar with a large battery bank, plus a backup tri-fuel generator (propane / gasoline / natural gas) that I installed, which is part of why I had to have the insurance inspector check things and the home value re-evaluated. It raised my home's value well over $20k, almost $30k; I can't remember the exact number without getting the paperwork out.

Quote:


> Originally Posted by *skupples*
> 
> houses are normally inspected when sold...
> 
> hell, lots of insurance companies require it to insure a newly acquired property which is flagged for single family home.


That all depends on the state and city. Some cities literally have zero ordinances or regulations, and some states have some of the crappiest housing requirements; in a lot of those cities and states, 80% of the houses are borderline death traps that look fine until you look closer. My state, Illinois, is one of them. I still go through the proper routes and channels, though; I have too much to lose if the insurance company says "we can't / won't cover it again." It just goes to show how different the rules and regulations are from state to state and city to city.


----------



## skupples

rules are strict down here for a reason.

we call them storms, you call them hurricanes.

we will get pillaged the next time one comes through, been too long. Things come loose, owners & renters become careless, maintenance declines... blah blah.

we test run our shutters every season, just in case, to make sure they haven't warped out of shape, but we also have the old school panel & slide shutters... so bad.


----------



## Blameless

Even a proper inspection isn't always invasive enough to find things like wiring alterations, which could require opening up walls.


----------



## rickcooperjr

Quote:


> Originally Posted by *skupples*
> 
> rules are strict down here for a reason.
> 
> we call them storms, you call them hurricanes.
> 
> we will get pillaged the next time one comes through, been too long. Things come loose, owners & renters become careless, maintenance declines... blah blah.
> 
> we test run our shutters every season, just in case, to make sure they haven't warped out of shape, but we also have the old school panel & slide shutters... so bad.


If things get too bad, adding plywood and screws can help a lot for protecting windows. My uncle is in the military in Florida, has been there for years, and lives off base with his family; he has told me about it. I also came down after Katrina to help with that hurricane: my family sent a bunch of their crews down there, and I went along.

I want to point out that my family never got paid for it and has never complained; in short, that Katrina run was more charity than anything.


----------



## skupples

Katrina cleanup was brutal man, but damn, getting paid $25 an hour (bring your own hazmat!) was damn good for how young I was.

News didn't even come close to properly covering it.

it was 100% a war zone, for almost a month.

the hood came out to play, and they meant business. Gun fire all day & night, for the first few weeks.

I think we're on track for getting this locked


----------



## Xoriam

Ughhh, I'm too lazy to check.

Does a 295x2 8gb outperform SLI 970?

I could get one for cheaper than my current SLI 970.
It also comes with an AIO.


----------



## skupples

Quote:


> Originally Posted by *Xoriam*
> 
> Ughhh, I'm too lazy to check.
> 
> Does a 295x2 8gb outperform SLI 970?
> 
> I could get one for cheaper than my current SLI 970.
> It also comes with an AIO.


do they make a 295x2 w/ 8GB effective?

not being obtuse, I'm truly curious.

is there a 295x2 w/ dual 8GB memory sections?


----------



## Xoriam

Quote:


> Originally Posted by *skupples*
> 
> do they make a 295x2 w/ 8GB effective?
> 
> not being obtuse, I'm truly curious.
> 
> is there a 295x2 w/ dual 8GB memory sections?


I dunno, but the one I'm looking at from Sapphire is watercooled and advertised as 8 GB.


----------



## Anth0789

Quote:


> Originally Posted by *Xoriam*
> 
> Ughhh, I'm too lazy to check.
> 
> Does a 295x2 8gb outperform SLI 970?
> 
> I could get one for cheaper than my current SLI 970.
> It also comes with an AIO.




Pretty much sums it up here:
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_970_SLI/20.html


----------



## Forceman

Quote:


> Originally Posted by *skupples*
> 
> do they make a 295x2 w/ 8GB effective?
> 
> not being obtuse, I'm truly curious.
> 
> is there a 295x2 w/ dual 8GB memory sections?


No, it's 4 GB per core, just like always. Sounds like false advertising, now that I think about it.


----------



## Xoriam

Awwww sh***. Yeah, I looked at benchmarks; probably better if I keep my 970 SLI.

However, with that watercooling, I bet the 295X2 would OC like crazy and kill my 970s.


----------



## Shaded War

Just spoke with Newegg about getting a refund, but it seems they are talking directly with Gigabyte about it. I cannot accept this crap VRAM design; I need all of it for 5760x1080. Hopefully I can get a refund and buy a 390X when they come out. I'm not supporting Nvidia with these lies.


----------



## Xoriam

Something I just remembered.

For those thinking about dropping Nvidia for AMD, myself somewhat included since I was a big AMD owner before: you WILL lose ShadowPlay. Kiss the zero-fps-loss recording goodbye.


----------



## N0rm

Quote:


> Originally Posted by *Xoriam*
> 
> Something I just remembered.
> 
> For those thinking about dropping Nvidia for AMD, myself somewhat included since I was a big AMD owner before: you WILL lose ShadowPlay. Kiss the zero-fps-loss recording goodbye.


Introducing the Video Coding Engine (VCE)


----------



## Xoriam

Quote:


> Originally Posted by *N0rm*
> 
> Introducing the Video Coding Engine (VCE)


Hmmm... I never saw this before, and I was searching for something like it. What's the reason?


----------



## rickcooperjr

Quote:


> Originally Posted by *skupples*
> 
> Katrina cleanup was brutal man, but damn, getting paid $25 an hour (bring your own hazmat!) was damn good for how young I was.
> 
> News didn't even come close to properly covering it.
> 
> it was 100% a war zone, for almost a month.
> 
> the hood came out to play, and they meant business. Gun fire all day & night, for the first few weeks.
> 
> I think we're on track for getting this locked


You aren't kidding. While helping with cleanup, I had slugs fly over my head a few times, and one of our trucks got its radiator shot out while sitting at the place we were staying. On top of that, we literally had to post guards on our locked utility trailers: we had tools, generators, and all kinds of stuff stolen within a five-minute window. It was real bad. We took shifts protecting the vehicles and trailers, 4-6 of us at a time, on top of doing cleanup.


----------



## rickcooperjr

Quote:


> Originally Posted by *Xoriam*
> 
> Something I just remembered.
> 
> For those thinking about dropping Nvidia for AMD, myself somewhat included since I was a big AMD owner before: you WILL lose ShadowPlay. Kiss the zero-fps-loss recording goodbye.


AMD basically has ShadowPlay: it's called AMD GVR. http://raptr.com/TinyDino/news/53d6730c16b46f0af3/introducing-gvr-and-instant-replays-capture-and-share-gameplay-video-with-nearly-no-performance-impact-


----------



## FlyingSolo

Quote:


> Originally Posted by *Shaded War*
> 
> Just spoke with newegg about getting a refund but seems they are talking direct with Gigabyte about it. I cannot accept this crap vram design I need it all for 5760x1080. Hopefully I can get a refund and buy a 390x when they come out, I'm not supporting Nvidia with these lies.


Reading other forums, all the other board partners are accepting returns for a full refund, apart from Gigabyte in the UK. But for some odd reason, the place I got my card from is not giving full refunds; instead they will let you upgrade to any card, with money on top if the card costs more. I will call them again tomorrow and see if I can get a full refund instead. Hope it works out for you.


----------



## Xoriam

Quote:


> Originally Posted by *rickcooperjr*
> 
> AMD basically has shadow play it is AMD GVR http://raptr.com/TinyDino/news/53d6730c16b46f0af3/introducing-gvr-and-instant-replays-capture-and-share-gameplay-video-with-nearly-no-performance-impact-


Oh yeah, the one in Raptr. It was completely broken for Tahiti cards, and only somewhat working for 2XX cards when I switched over.

It definitely didn't make me lose 0 fps like ShadowPlay does.


----------



## skupples

Quote:


> Originally Posted by *Forceman*
> 
> No, it's 4 GB per core, just like always. Sounds like false advertising, now that I think about it.


----------



## Xoriam

So the consensus on 970 SLI vs. the 295X2 is 970 SLI, right?


----------



## rickcooperjr

Quote:


> Originally Posted by *Xoriam*
> 
> So consensus on the 970 SLI vs 295x2 is 970 sli right?


Well, that depends on the game and how high you want to crank it. Again, if you get into the 3.5GB area on the 970s, that decision will come back to bite you. The R9 295X2, on the other hand, has a full 4GB, so you won't hit the issue of the last 0.5GB being slow RAM. From my research the R9 295X2 is nearly identical performance-wise to SLI 970s, and it beats the 970s in some games; if those games use more than 3.5GB, you already know the answer.

So the decision is yours. Keep in mind games will only use more and more VRAM, pushing on that 3.5GB barrier; it might be 6 months, it might be 2 years, we don't know. And if DX12 is anything like Mantle (Microsoft themselves say it is a Mantle clone), it will use more VRAM the same way Mantle does, often around 20%-30% more than DX11. If that's the case, DX12 will push up against the 3.5GB issue very fast.

Also, 970s apparently hit the 3.5GB issue much quicker in SLI than in single-card setups, so you might regret SLI 970s if that issue occurs, as it has for many other people running 970s in SLI in many games.
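The spill-over behaviour being argued about can be sketched as a toy allocator. The segment sizes (3,584MB fast, 512MB slow) come from NVIDIA's own statement quoted earlier in the thread; everything else here (the class, names, and fill-fast-first policy) is purely illustrative, not the driver's actual logic.

```python
FAST_MB = 3584   # full-speed segment, per NVIDIA's statement
SLOW_MB = 512    # slow segment (reportedly far lower bandwidth)

class SegmentedVram:
    """Toy model of a 4GB pool that prefers the fast 3.5GB segment."""

    def __init__(self):
        self.fast_used = 0
        self.slow_used = 0

    def alloc(self, mb):
        # Fill the fast segment first; only spill into the slow one
        # once the fast segment is exhausted.
        in_fast = min(mb, FAST_MB - self.fast_used)
        in_slow = mb - in_fast
        if self.slow_used + in_slow > SLOW_MB:
            raise MemoryError("out of VRAM")
        self.fast_used += in_fast
        self.slow_used += in_slow
        return in_fast, in_slow
```

Under this model a game using under 3.5GB never touches the slow segment, which is exactly why monitoring tools report a 3.5GB ceiling until a game pushes past it.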


----------



## Xoriam

Quote:


> Originally Posted by *rickcooperjr*
> 
> well that depends on the game and how high you want to crank them again if you get in the 3.5gb area on the 970's that decision will come back to bite you the R9 295 x2 on other hand has a full 4gb that you wont hit that issue of the last 0.5gb being slow ram and from my research the R9 295x2 is nearly identical performance wise to SLI 970's and the 295 x2 beats the 970's in some games and if those games use more than 3.5gb you already know the answer to that one.
> 
> So simply the decision is left to you keep in mind games will only use more and more Vram in future pushing the 3.5gb barrier more and more might be 6 months might be 2 years we don't know and DX12 if is anything like Mantle which miocrosoft themself say it is a Mantle clone it will use more Vram same as Mantle does often around 20%-30% more vram that will push up on the 3.5gb issue very fast if that is the case as Mantle is.
> 
> If you plan to utilize the SLI 970's apparently when ran in SLI they hit the 3.5gb issue much quicker than when ran in single card setups so you might regret the 970 sli's if that issue occurs as has with many other people attempting SLI with 970's in many games.


Something that seriously worries me though is the whole gameworks thing, if I switch back over to AMD I'll be having that problem again.


----------



## rickcooperjr

Quote:


> Originally Posted by *Xoriam*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> well that depends on the game and how high you want to crank them again if you get in the 3.5gb area on the 970's that decision will come back to bite you the R9 295 x2 on other hand has a full 4gb that you wont hit that issue of the last 0.5gb being slow ram and from my research the R9 295x2 is nearly identical performance wise to SLI 970's and the 295 x2 beats the 970's in some games and if those games use more than 3.5gb you already know the answer to that one.
> 
> So simply the decision is left to you keep in mind games will only use more and more Vram in future pushing the 3.5gb barrier more and more might be 6 months might be 2 years we don't know and DX12 if is anything like Mantle which miocrosoft themself say it is a Mantle clone it will use more Vram same as Mantle does often around 20%-30% more vram that will push up on the 3.5gb issue very fast if that is the case as Mantle is.
> 
> If you plan to utilize the SLI 970's apparently when ran in SLI they hit the 3.5gb issue much quicker than when ran in single card setups so you might regret the 970 sli's if that issue occurs as has with many other people attempting SLI with 970's in many games.
> 
> 
> 
> Something that seriously worries me though is the whole gameworks thing, if I switch back over to AMD I'll be having that problem again.
Click to expand...

Keep in mind almost all games are console ports, and whose hardware is in the consoles? Of course AMD will get first priority on optimizations; it's only natural. An x86 8-core APU with GCN graphics architecture, and GCN and x86 are exactly what we PC AMD users are running; the consoles are basically PCs now. So this time around I don't see AMD getting the crappy end of the stick, otherwise the consoles wouldn't work either, and we know that would be fixed quickly.


----------



## Luciferxy

Quote:


> Originally Posted by *Xoriam*
> 
> So consensus on the 970 SLI vs 295x2 is 970 sli right?


I'd keep the 970s if I were you, wait 'till 3XX or GM200 to unfold then decide.


----------



## rickcooperjr

Quote:


> Originally Posted by *Luciferxy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Xoriam*
> 
> So consensus on the 970 SLI vs 295x2 is 970 sli right?
> 
> 
> 
> I'd keep the 970s if I were you, wait 'till 3XX or GM200 to unfold then decide.
Click to expand...

Actually, if he can get the refund now and get the 295X2 for cheaper than his 970s currently in SLI, it's a win-win: he gets fully functional hardware that won't have a conniption fit running his games, especially if he plans to push the envelope. The 970s won't hold up as long as the 295X2, and the 970s' value will likely crash drastically very soon, so if he can trade for cheaper and get cash back now, he takes less of a hit in the wallet when he sells the old hardware and upgrades later.

I also want to point out that if he does so and gets the 295X2 for cheaper than his two 970s, he has money in his pocket and a card that won't have the issues the 970 has. At this point the 295X2 is also a bit more future-proof: as games advance it won't have the 970's limitations, and it will likely hold its value longer, making resale much better down the road.

With all that is going on, the 970's value will drop fast, and likely very soon. Don't wait too long, or your cards' value could drop 10% or more, making them much harder to get rid of or trade, and again you would take a loss on top of the headache of a gimped, not-fully-functional card.

Most vendors only honor the current retail/selling price at the time of exchange or refund, not the price you paid, and sometimes charge restocking fees. So if the cards' value drops, you get shafted; time is of the essence to get this taken care of, or you might be left holding the bag (in this case, holding your wallet out and letting them take money out of it) on two 970s worth far less than you paid, cards that weren't fully functional from the get-go. That is the embodiment of getting shafted, if you ask me.

From my understanding, almost all vendors are currently offering the full price the customer paid and not charging restocking fees, so this is the time to do it. If you wait much longer, expect to get shafted.


----------



## Noufel

Quote:


> Originally Posted by *Xoriam*
> 
> So consensus on the 970 SLI vs 295x2 is 970 sli right?


Keep your 970s in SLI and wait for the 3xx or GM200, unless you are switching to the 295X2 on principle, in which case I'm 100% with you.


----------



## hollowtek

970 SLI is going to be as good as it's gonna get for some time, lol. I've always wondered why my VRAM usage in Shadow of Mordor never exceeded 3.4x GB.


----------



## sugalumps

Quote:


> Originally Posted by *clerick*
> 
> That's a very silly way of looking at it. You're basically saying you're fine being gouged for things that aren't there as long as it works "good enough".


Quote:


> Originally Posted by *amtbr*
> 
> Is it so hard to believe someone would return something out of principal? Do you enjoy being lied to? In the end, its no sweat off your back what video card he uses, is it?


Quote:


> Originally Posted by *mtcn77*
> 
> Lol, at inductive reasoning.
> Nvidia gpus and AMD gpus have all too different strongpoints. Nvidia fails at backend, AMD the frontend. So the usual complaint for AMD gpus are likely low fps due to excess geometry while Nvidia gpu's can't take fluid game play at superresolutions as displayed in recent game reviews much too frequently.


The attacks.......... You do realise I was just asking whether it was out of principle, as that is the only reason that would make sense, since the two are so close and he already had the 970. I put quotes around "principle" to quote all the people annoyed about the principle and saying they are sending their cards back out of principle, not because I was trying to mock the principle.

It's great, though, that the drama has gotten to the stage where if you are not attacking Nvidia, you are automatically siding with them and their lie, as well as promoting it.


----------



## Luck100

It's one thing to be pissed/annoyed with NVidia pulling a sneaky with their specs. More power to any legal action that encourages NVidia (and others) to avoid tricky/misleading spec games.

But no need to lose your marbles and suddenly think the 970 is broken or bad value. It is a cut-down 980 missing 3 out of 16 SMM's and 1 out of 8 memory interfaces. For the sake of argument, let's just say that the last 0.5 GB doesn't exist. Then, what we have is this:

A. 18.75% less compute units than the 980
B. 12.5% less memory bandwidth than the 980
C. 12.5% less VRAM.
D. 40% cheaper list price than the 980

Still seems like a value winner to me. Yes, it will stutter when you go above 3.5 GB VRAM (but not quite as badly as a true 3.5 GB VRAM card - that extra 0.5 GB is still better than system RAM via PCIE). Just like the 980 will stutter when you go over 4 GB. An extra 12.5% VRAM is not worth $220 to me.

As for performance within the 3.5 GB VRAM envelope: the 970 actually has MORE memory bandwidth per compute unit than the 980. This is the right way to look at it, because memory bandwidth use is driven by compute. If you double your compute speed, you double your memory bandwidth need (all else being equal).
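Those percentages check out if you model the 970 as a 980 with 3 of its 16 SMMs and 1 of its 8 memory controllers disabled, and (per the post's own assumption) ignore the slow 0.5 GB entirely. A quick back-of-the-envelope in Python:

```python
# 980: 16 SMMs, 256-bit bus, 224 GB/s advertised bandwidth.
# 970 modeled as 13 SMMs with one memory controller's worth removed.
SMM_980, SMM_970 = 16, 13
MEM_980_GBPS = 224
MEM_970_GBPS = 224 * 7 / 8  # 196 GB/s with 1 of 8 controllers gone

fewer_compute = 1 - SMM_970 / SMM_980              # 0.1875 -> 18.75% fewer SMMs
fewer_bandwidth = 1 - MEM_970_GBPS / MEM_980_GBPS  # 0.125  -> 12.5% less bandwidth

# Bandwidth available per compute unit:
bw_per_smm_980 = MEM_980_GBPS / SMM_980  # 14.0 GB/s per SMM
bw_per_smm_970 = MEM_970_GBPS / SMM_970  # ~15.1 GB/s per SMM (higher)
```

So under these assumptions the 970 does end up with slightly more bandwidth per SMM than the 980, as the post claims.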


----------



## Nickyvida

Quote:


> Originally Posted by *Luck100*
> 
> It's one thing to be pissed/annoyed with NVidia pulling a sneaky with their specs. More power to any legal action that encourages NVidia (and others) to avoid tricky/misleading spec games.
> 
> But no need to lose your marbles and suddenly think the 970 is broken or bad value. It is a cut-down 980 missing 3 out of 16 SMM's and 1 out of 8 memory interfaces. For the sake of argument, let's just say that the last 0.5 GB doesn't exist. Then, what we have is this:
> 
> A. 18.75% less compute units than the 980
> B. 12.5% less memory bandwidth than the 980
> C. 12.5% less VRAM.
> D. 40% cheaper list price than the 980
> 
> Still seems like a value winner to me. Yes, it will stutter when you go above 3.5 GB VRAM (but not quite as badly as a true 3.5 GB VRAM card - that extra 0.5 GB is still better than system RAM via PCIE). Just like the 980 will stutter when you go over 4 GB. An extra 12.5% VRAM is not worth $220 to me.
> 
> As for performance within the 3.5 GB VRAM envelope - the 970 actually has MORE memory bandwidth per compute unit than the 980. This is the right way to look at it because memory bandwidth use is driven by compute. If you double your compute speed, you will double your memory bandwidth need (all else being equal).


Still doesn't change the fact that Nvidia lied to customers with incorrect specs. People bought the graphics card based on its listed specs at the time, and they have a right to feel aggrieved, especially since they purchased what they were led to believe was full-speed 4GB, not a crippled 3.5GB + 0.5GB split, which is not how any normal graphics card works.

This has put me off Nvidia. I'll be taking my business to AMD once the 20nm GPUs drop.


----------



## Luck100

Quote:


> Originally Posted by *Nickyvida*
> 
> Still doesn't change the fact that Nvidia lied to customers with the incorrect specs. People bought the graphics card based on the card's specs at the time and they have a right to feel aggrieved especially if they have purchased what they were led to believe. They wanted full speed 4gb, not a cripped .5gb + 3.5 gb, which is what any normal graphics card output.
> 
> This has put me off Nvidia. I'll be taking my business to AMD once the 20nm gpus drop.


Yes, did you read the first line of my post? I don't dispute that NVidia did wrong. As owner of SLI 970's I too feel aggrieved. But I'm also interested in what the value of the 970 is now that we know its true specs.


----------



## Orangey

Quote:


> Originally Posted by *Nickyvida*
> 
> This has put me off Nvidia. I'll be taking my business to AMD once the 20nm gpus drop.


There are and will be no 20nm GPUs.


----------



## carlhil2

There's a reason why Nvidia priced the 970 as they did compared to the 980: it was gimped in more ways than one. Still a beast of a card, though...


----------



## Wasupwitdat1

It's a piece of garbage, take my word for it. I took my 970 back, got a full refund, and bought the 980. It runs a lot better. Best Buy selling the GTX 970 reference card for $379 plus tax makes it a $400 card. Definitely not a good value. I'm not even sure the 980 was a good value, because after tax I paid $640. My 3x 680s in my other machine run better than the 980.


----------



## skupples

Why would you buy a GPU from best buy. Lol


----------



## iRUSH

Quote:


> Originally Posted by *skupples*
> 
> Why would you buy a GPU from best buy. Lol


They are the only retailer in the U.S. that sells the 970 with the Titan reference cooler. That's the only reason I can think of aside from gift cards on hand.


----------



## skupples

I'll allow it.


----------



## Wasupwitdat1

Best Buy also doesn't give you any crap, like Newegg, about returns. And I like the Titan style cooler.


----------



## Woundingchaney

Quote:


> Originally Posted by *Wasupwitdat1*
> 
> It's a piece of garbage take my word for it. I took my 970 back, got a full refund and bought the 980. It runs a lot better. Best Buy selling the GTX970 ref card for $379 plus tax makes it a $400 card. Definitely not a good value. I'm not even sure the 980 was a good value because after tax I paid $640. My 3 x 680's in my other machine run better than the 980.


I'm not sure where you are located, but I got my 970s for $330 from Newegg when they released, and when I purchased my 980s I paid $550 at Fry's for the higher-end MSI Twin Frozr cards. If you are paying those prices, you aren't looking very hard. I think Best Buy also price matches, IIRC.


----------



## MaCk-AtTaCk

I had an EVGA 970 SC and I wasn't impressed with its build quality at all... I got the 970 from Best Buy for $350 after I price matched Newegg. I love the card; in my opinion it's the best 970 out there (the 980 reference cooler/PCB is sick). I'm not too happy about what Nvidia did, but I won't go to AMD... I had an R9 290 for about a week and that was a terrible experience. I had the black screen bug (had to actually downclock the memory to play games...) and it overclocked like crap. I really wanted to like it, but it was just not usable. Anyway, I hope Nvidia will drop the price over this mess so I can grab another 970.


----------



## rdr09

Quote:


> Originally Posted by *MaCk-AtTaCk*
> 
> I had a evga 970sc and I wasn't impressed with its build quality at all... I got the 970 from bestbuy for $350 after I priced matched newegg. I love the card, in my opinion its the best 970 out there ( 980 reference cooler is sick). Im not to happy about what Nvidia did but I wont go to AMD... I had a R9 290 for about a week and that was a terrible experience. I had the black screen bug and it O/C liked crap. I really wanted to like it but it was just not usable. Anyway, I hope that nvidia will drop the price over this mess so I can grab another 970


I still recommend the 970 even after this finding, especially if the user has a low-wattage PSU. I just make sure it is paired with a 1080p display. I might get cussed out.

Edit: are you sure you want to SLI?


----------



## MaCk-AtTaCk

Quote:


> Originally Posted by *rdr09*
> 
> i still recommend the 970 even after this finding, especially if the user has a low wattage psu. i just make sure it is paired with a 1080. i might get cussed out.
> 
> edit: you sure you want to sli?


Yes I would.
I don't plan to run multiple 4K monitors with insane amounts of AA. I usually upgrade my GPUs every other year, so I think I will be fine for another year. Like I said, I would only go SLI if Nvidia decided to drop the price of the 970 because of this mess or otherwise (new cards coming out). I agree with you; I for one think the 970 is an amazing card. My card overclocks to over 1500, never goes above 75C, and is quiet. I just have my expectations in check for a card that is $350. I hope Nvidia gets sued, or something bad enough happens to keep them from doing this again. BUT I still think the 970 is an awesome card. Maybe not the perfect, flawless gem it was once cherished as, but still a damn good card.


----------



## skupples

Quote:


> Originally Posted by *Wasupwitdat1*
> 
> Best Buy also doesn't give you any crap, like Newegg, about returns. And I like the Titan style cooler.


Yeah, Best Buy is fine for returns as long as you're within their 15-day grace period. After that things get a little iffy. Newegg is hot garbage and has been for a number of years now. I try to use Amazon as much as possible, because they have some of the most liberal return policies in the United States. We also have seven people using the same Prime account, so that's always a plus.


----------



## amtbr

Quote:


> Originally Posted by *skupples*
> 
> Yeah, Best Buy is fine for returns as long as you're within their 15-day grace period. After that things get a little iffy. Newegg is hot garbage and has been for a number of years now. I try to use Amazon as much as possible, because they have some of the most liberal return policies in the United States. We also have seven people using the same Prime account, so that's always a plus.


Yeah, I rarely shop at Newegg now. I've bought from them for years, but their policies and customer service are terrible compared to Amazon. Unfortunately I bought my 970 from Newegg...

I bought my 970 with the intention of SLI down the road and a 4K monitor, due to the advertised 4GB of RAM; now I am pretty wary of the future viability of my plan. Hopefully Newegg announces something soon.


----------



## skupples

What was advertised is fine, but almost every review showed SLI was wonky, though most of it was overlooked and blamed on drivers.


----------



## Woundingchaney

Quote:


> Originally Posted by *amtbr*
> 
> Yeah I rarely shop at Newegg now, I've bought from them for years, but their policies and customer service are terrible compared to Amazon. Unfortunately I bought my 970 from Newegg...
> 
> I bought my 970 with the intention of SLI down the road and a 4K monitor due to the advertised 4GB ram, now I am pretty weary of the future viability of my plan. Hopefully Newegg announces something soon.


I am in the process of RMAing my 970s right now. Newegg didn't give me any hassle whatsoever.


----------



## amtbr

Quote:


> Originally Posted by *Woundingchaney*
> 
> I am in the process of RMAing my 970s right now. Newegg didn't give me any hassle what-so-ever.


How long ago did you purchase? I've had mine about 3 months.


----------



## Wasupwitdat1

Now I find out Best Buy price matches. It figures; I'm never in the loop. It's too late to go back and complain to them, so I'll just live with my purchase and know better next time. The 980 works fine in the PC I installed it in, so I'm happy about that.


----------



## CaptainZombie

Quote:


> Originally Posted by *Wasupwitdat1*
> 
> Now I find out Best Buy price matches. It figures, I'm never in the loop. It's to late to go back and complain to them so I'll just live with my purchase and I'll know better next time. The 980 works fine in the PC I installed it in so I'm happy about that.


If you just bought it take your receipt up to customer service and get your money back. The difference from the price match could possibly buy you some lunch or dinner, etc.


----------



## RagingCain

Why don't you guys use FTA in BF4, Dragon Age, or Civilization: Beyond Earth, check out the frame time variance when sitting at 3.5GB of VRAM vs. lower, and graph how badly it affects performance?

http://www.overclock.net/t/1530583/fta-frame-time-analyzer-v1-0-1-supports-bf4-civ-be-da-i/0_50
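For anyone without FCAT or FTA on hand, the kind of summary being asked for can be roughed out from any per-frame timing log. This hypothetical helper (the function name and dictionary keys are mine, not FTA's; FTA itself parses game-specific logs) reports average, 99th-percentile, and standard deviation of frame times, which is enough to show whether delivery gets spiky near the 3.5GB boundary:

```python
from statistics import mean, pstdev

def frame_stats(frame_times_ms):
    """Summarize a list of per-frame render times in milliseconds."""
    ordered = sorted(frame_times_ms)
    # 99th-percentile frame time: the threshold the worst 1% of frames exceed.
    p99 = ordered[min(len(ordered) - 1, int(len(ordered) * 0.99))]
    return {
        "avg_ms": mean(frame_times_ms),
        "p99_ms": p99,
        "stdev_ms": pstdev(frame_times_ms),  # frame-to-frame variance proxy
    }
```

A run with mostly 60 fps frames (16.7 ms) plus occasional 50 ms hitches would show a near-unchanged average but a badly inflated 99th percentile, which is exactly why average framerate charts missed this issue.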


----------



## Woundingchaney

Quote:


> Originally Posted by *amtbr*
> 
> How long ago did you purchase? I've had mine about 3 months.


I purchased mine in Nov of last year.


----------



## notarat

Quote:


> Originally Posted by *jdstock76*
> 
> Anyone know what the next gen Nvidia cards number designations will be? I haven't seen anything yet.


I don't know what they're going to call it, but I bet they add "+512MB of slow-ass RAM" just to cover their asses.


----------



## AngryGoldfish

Quote:


> Originally Posted by *Wasupwitdat1*
> 
> It's a piece of garbage take my word for it. I took my 970 back, got a full refund and bought the 980. It runs a lot better. Best Buy selling the GTX970 ref card for $379 plus tax makes it a $400 card. Definitely not a good value. I'm not even sure the 980 was a good value because after tax I paid $640. My 3 x 680's in my other machine run better than the 980.


It's not a piece of garbage. Where are people getting this information from? What tests show it? The card was improperly advertised and is not as high-end as the specs suggested. This results in poor performance for a number of users (note, not all) at extremely high resolutions in games that demand a great deal of VRAM. Many have struggled to recreate the problem, while others seem to find it everywhere they go. That does not make the 970 a piece of garbage. It is still the same card as when it was released, which everyone was raving about. Nvidia pulled a fast one. They're dicks. The card is still the same, though, despite its actual specifications. If you think 500MB less full-speed memory turns a beastly card into a piece of garbage, you have priority issues I believe you need to attend to. Are you seriously going to spend another $250 for 500MB of faster RAM and a few other performance benefits and call the 970 garbage? Really?

Would I recommend SLI 970s for a 1440p or triple-1080p surround setup with high amounts of anti-aliasing and texture mods? No. Now that the card has been fully tested, that is just not appropriate any more. Would I still recommend the 970 for 1080p if you don't care about crazy amounts of AA or DSR? Yes. On principle, I would suggest waiting for the new AMD cards, but if you can't wait and cannot deal with the power consumption or heat output of a 290X, it's still the best card on the market. Where before the 970 was a great card for anyone, now it's a niche card that I would not recommend to everyone.


----------



## iSlayer

Quote:


> Originally Posted by *rickcooperjr*
> 
> keep in mind almost all games are a console port and who's hardware is used in consoles so of course they will get first priority on the optimizations it is just natural x86 8core apu and GCN graphics architecture to boot and both are what us PC AMD users are using GCN and X86 the consoles now are a PC basically. So this time around I don't see AMD getting the crappy end of stick otherwise the consoles won't work either and we know that would be fixed quickly.


Multithreading is still tough, and games are only going to be using 6 cores (potentially a 7th, handicapped one).

I wouldn't have too much hope for console ports being kinder to AMD CPUs. DX12, though, that looks like it'll do it.
Quote:


> Originally Posted by *Orangey*
> 
> There are and will be no 20nm GPUs.


How annoyed do you get every time you see that? Because it's starting to really irk me.


----------



## Orangey

Quote:


> Originally Posted by *iSlayer*
> 
> How annoyed do you get every time you see that? Because it's starting to really irk me.


I have resigned myself to the fact that this info will never proliferate.

It doesn't help that wccf and others have been hyping it long after we knew it wasn't coming. Had to get those clicks.


----------



## skupples

Quote:


> Originally Posted by *Orangey*
> 
> I have resigned myself to the fact that this info will never proliferate.
> 
> It doesn't help that wccf and others have been hyping it long after we knew it wasn't coming. Had to get those clicks.


We knew there would be no 20nm GPUs over a year ago, but that didn't keep the click fiends from publishing articles about them.

They're going straight to 16/14nm.

Non-FinFET 20nm simply can't be produced economically enough to allow for thousand-core GPUs, let alone 3K-core flagships.


----------



## benbenkr

Quote:


> Originally Posted by *Nickyvida*
> 
> Still doesn't change the fact that Nvidia lied to customers with the incorrect specs. *People bought the graphics card based on the card's specs at the time* and they have a right to feel aggrieved especially if they have purchased what they were led to believe. They wanted full speed 4gb, not a cripped .5gb + 3.5 gb, which is what any normal graphics card output.
> 
> This has put me off Nvidia. I'll be taking my business to AMD once the 20nm gpus drop.


While I agree with what you're saying, the bolded part is just your assumption.

The majority of people bought the 970, and in fact buy graphics cards in general, based on framerates in charts. Yeah, that's how naive people are. Most people don't care what's going on inside the card; they just care whether they can play their games at the framerate and resolution they want.


----------



## clerick




----------



## Thready

I know nothing about how graphics cards work on a deep level, but I know enough and I know how businesses work. This is such a fanboy bandwagon-y issue. Do you guys honestly think Nvidia is going to cheap out and put a half gig less RAM in their cards, knowing full well that there exist a lot of PC experts who use their cards and will tear down the internals of their card and post everything online?

Do you really think they're going to do this? Their cards cost hundreds of dollars. VRAM is not that expensive compared to the GPU on the card (maybe I'm wrong on this but I think I'm right). If they wanted to cut corners, then the VRAM is not the way to go. And why would they cut corners anyways? Nvidia seems like the type of company that has no problem passing the cost onto their consumer instead of cutting corners.

Their explanation makes perfect sense to me. It makes more sense to me that they set up their VRAM in the card in such a way that a program would think it's 3.5 GB when in reality it's just split up. Nvidia is not stupid.

I'm not even an Nvidia guy. But I know that a multi billion dollar company like Nvidia, a company that makes parts that go into space, is not going to pull some shady cheap crap.


----------



## <({D34TH})>

Quote:


> Originally Posted by *Thready*
> 
> *I'm not even an Nvidia guy. But I know that a multi billion dollar company like Nvidia, a company that makes parts that go into space, is not going to pull some shady cheap crap.*


That's what corporations want you to think. They'll do anything, shady or not, to make a profit.


----------



## mtcn77

Quote:


> Originally Posted by *Thready*
> 
> *I know nothing about how graphics cards work on a deep level, but* I know enough and I know how businesses work. This is such a fanboy bandwagon-y issue. *Do you guys honestly think Nvidia is going to cheap out* and put a half gig less RAM in their cards, knowing full well that there exist a lot of PC experts who use their cards and will tear down the internals of their card and post everything online?
> 
> *Do you really think they're going to do this?* Their cards cost hundreds of dollars. VRAM is not that expensive compared to the GPU on the card (maybe I'm wrong on this but I think I'm right). *If they wanted to cut corners*, then the VRAM is not the way to go. *And why would they cut corners anyways?* Nvidia seems like the type of company that has no problem passing the cost onto their consumer *instead of cutting corners.*
> 
> Their explanation makes perfect sense to me. It makes more sense to me that they set up their VRAM in the card in such a way that a program would think it's 3.5 GB when in reality it's just split up. Nvidia is not stupid.
> 
> I'm not even an Nvidia guy. But I know that a multi billion dollar company like Nvidia, a company that makes parts that go into space, *is not going to pull some shady cheap crap.*


So, for a debater of assumptive arguments, you are cutting corners when it comes to arguing by repetition. You are also assuming Nvidia wouldn't do it because people rip their cards to shreds and post them online (a negative conclusion from a positive premise).


----------



## Thready

Quote:


> Originally Posted by *mtcn77*
> 
> Soo, for a debater of assumptive argument, you are cutting corners when it comes to argument out of repetition. You are also assuming Nvidia wouldn't because people rip their cards to shreds and post them online(negative assumption from positive premise).


What the hell are you talking about? I'm saying what I think is the most logical answer. Stop trying to turn things into an argument. You OCNers love to pick apart things because it makes you feel smarter than another person.


----------



## Art Vanelay

Quote:


> Originally Posted by *The Robot*
> 
> I'm not even an Nvidia guy. But I know that a multi billion dollar company like Nvidia, a company that makes parts that go into space, is not going to pull some shady cheap crap.


It's not like any company ever pulled some shady crap that caused a manned rocket to explode.

Especially not Morton Thiokol...


----------



## Luck100

Quote:


> Originally Posted by *Thready*
> 
> I know nothing about how graphics cards work on a deep level, but I know enough and I know how businesses work. This is such a fanboy bandwagon-y issue. Do you guys honestly think Nvidia is going to cheap out and put a half gig less RAM in their cards, knowing full well that there exist a lot of PC experts who use their cards and will tear down the internals of their card and post everything online?
> 
> Do you really think they're going to do this? Their cards cost hundreds of dollars. VRAM is not that expensive compared to the GPU on the card (maybe I'm wrong on this but I think I'm right). If they wanted to cut corners, then the VRAM is not the way to go. And why would they cut corners anyways? Nvidia seems like the type of company that has no problem passing the cost onto their consumer instead of cutting corners.
> 
> Their explanation makes perfect sense to me. It makes more sense to me that they set up their VRAM in the card in such a way that a program would think it's 3.5 GB when in reality it's just split up. Nvidia is not stupid.
> 
> I'm not even an Nvidia guy. But I know that a multi billion dollar company like Nvidia, a company that makes parts that go into space, is not going to pull some shady cheap crap.


It's nothing to do with "cutting corners". When they fabricate the chips for the GTX 980, a large fraction (possibly the majority) have defects. Instead of throwing them away, they make 970s out of them. If they didn't do that, they would have to charge even more for the 980 to make up the cost of throwing away dies that are not fully functional.

The same goes for the AMD R9 290X and 290, and the Nvidia GTX 780 Ti and 780. The top card (980, 780 Ti, 290X) commands a premium because it requires a fully functional chip, while the lesser card (970, 780, 290) is made out of the rejects.


----------



## 2010rig

Have we ever had a bigger issue, where so many with AMD hardware have been affected by NVIDIA's actions?


----------



## skupples

Quote:


> Originally Posted by *benbenkr*
> 
> While I agree on what you're saying, but the bolded part is just your assumption.
> 
> Majority of people bought the 970 and in fact, majority of people buy graphics cards based on framerates on charts. Yeah, that's how naive people are. Majority of people don't care what's going on under the card, they just care if they can play their games at the frames and resolution they want it at.


They just GPUBoss it.

At least, that's what I've observed when dealing with people at work. Start talking about GPUs, and they bust out the GPUBoss / GPU-compare nonsense.

Nevertheless, people still bought it assuming the box branding was true.


----------



## renji1337

Is Newegg taking returns on 970s? I bought mine at Newegg Business in November.


----------



## skupples

Quote:


> Originally Posted by *renji1337*
> 
> Is Newegg taking returns on 970s? I bought mine at Newegg Business in November.


Nope.

Newegg is being Newegg: lying and blowing people off, telling them they're looking into it and that "If it were up to me I would totally do it! In fact, I'll override that department and do it for you if they don't get back to me soon."

Still not as bad as NCIX.


----------



## xenophobe

Quote:


> Originally Posted by *benbenkr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nickyvida*
> 
> . *People bought the graphics card based on the card's specs at the time*
> 
> 
> 
> While I agree on what you're saying, but the bolded part is just your assumption.
Click to expand...

Graphics card memory has always been of major importance to me, going back to the days of the 386 before there even was 3D acceleration. It has always mattered to me. Can't speak for anyone else though.


----------



## Blameless

Quote:


> Originally Posted by *Forceman*
> 
> No, it's 4 GB per core, just like always. Sounds like false advertising, now that I think about it.


8GiB physically soldered to the card. 4 per GPU.

Advertised pretty much the same way as any dual GPU part ever, because it's technically correct and is a bigger number that looks more impressive to the uninformed.
Quote:


> Originally Posted by *Xoriam*
> 
> hmmm... I never saw this before. and i was searching for something like it
> Whats the reason?


I don't know.

VCE has been around since the Radeon 7900 series, or over three years at this point, and I've been using it for about two years with MSI AB.

Still, it's limited to 1080p recording and, like NVENC (which Shadowplay uses), does not do well with low bitrates. Software recording with OBS looks much better and also has a very low performance impact, especially on my hex-core parts.
Quote:


> Originally Posted by *Xoriam*
> 
> Oh yeah the one in raptr, it was completely broken for tahiti cards, and only somewhat working for 2XX cards when I switched over.
> 
> It definatly didn't make me lose 0fps like shadowplay does.


It's what Raptr uses, but Raptr is hardly the only, or best, way to use it.
Quote:


> Originally Posted by *skupples*
> 
> Why would you buy a GPU from best buy. Lol


They do occasionally have competitive pricing, and being able to return it locally if something goes wrong is nice.


----------



## CaptainZombie

Quote:


> Originally Posted by *renji1337*
> 
> Is Newegg taking returns on 970s? I bought mine at Newegg Business in November.


Quote:


> Originally Posted by *skupples*
> 
> nope.
> 
> NewEgg is being NewEgg. Lying & blowing people off. Telling them that they're looking into it and that "If it were up to me I would totally do it! In fact, I'll override that department and do it for you if they don't get back to me, soon"
> 
> still not as bad as NCIX.


Yeah, I'm having a hell of a time trying to get a return completed through Newegg. I posted earlier about the BS I've been through with them this past week. If I don't get a clear-cut answer from them on Monday, I'll threaten a chargeback and see what they do, since that's worse for them due to the fees they accrue. If they don't do anything, I'll just move on and sell my 970.


----------



## Forceman

Quote:


> Originally Posted by *Blameless*
> 
> 8GiB physically soldered to the card. 4 per GPU.
> 
> Advertised pretty much the same way as any dual GPU part ever, because it's technically correct and is a bigger number that looks more impressive to the uninformed.


I know, it was a joke. You know, making fun of the "GTX 970 doesn't have 4 GB" crowd. I thought the wink would have made that clear.


----------



## error-id10t

Quote:


> Originally Posted by *Forceman*
> 
> No, it's 4 GB per core, just like always. Sounds like false advertising, now that I think about it.


Isn't the reason they could advertise it as 8GB that there are programs out there that do see the complete 8GB of vRAM? I'm not talking about games; that's just my understanding. And yes, I see your post is poking fun at the 970 crowd, but I'm curious if anyone knows.


----------



## rickcooperjr

Quote:


> Originally Posted by *Luck100*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Thready*
> 
> I know nothing about how graphics cards work on a deep level, but I know enough and I know how businesses work. This is such a fanboy bandwagon-y issue. Do you guys honestly think Nvidia is going to cheap out and put a half gig less RAM in their cards, knowing full well that there exist a lot of PC experts who use their cards and will tear down the internals of their card and post everything online?
> 
> Do you really think they're going to do this? Their cards cost hundreds of dollars. VRAM is not that expensive compared to the GPU on the card (maybe I'm wrong on this but I think I'm right). If they wanted to cut corners, then the VRAM is not the way to go. And why would they cut corners anyways? Nvidia seems like the type of company that has no problem passing the cost onto their consumer instead of cutting corners.
> 
> Their explanation makes perfect sense to me. It makes more sense to me that they set up their VRAM in the card in such a way that a program would think it's 3.5 GB when in reality it's just split up. Nvidia is not stupid.
> 
> I'm not even an Nvidia guy. But I know that a multi billion dollar company like Nvidia, a company that makes parts that go into space, is not going to pull some shady cheap crap.
> 
> 
> 
> It's nothing to do with "cutting corners". When they fabricate the chips for the GTX 980, a large fraction (possibly the majority) have defects. Instead of throwing them away, they make 970's out of them. If they didn't do that, they would have to charge even more for the 980 to make up the costs of throwing away dies which are not fully functional.
> 
> Same goes for AMD R9 290x and the 290, and the NVidia GTX 780ti and 780. The top card (980, 780ti, 290x) commands a premium because it requires a fully functional chip, while the lesser card (970, 780, 290) is made out of the rejects.
Click to expand...

You are wrong; the R9 290 uses different RAM than the 290X. They found that the RAM used in the 290 would not reach their goals, so they went back to the drawing board and used better RAM instead. Most of the 290Xs use Hynix or Samsung RAM, I believe, while the 290 used Elpida or something of the sort.

A very good read on the subject of 290 vs. 290X RAM: http://www.overclock.net/t/1457836/determine-hynix-elpida-memory-on-290x-cards-based-on-product-number-before-buying-them-confirmation-needed-please-help-if-you-have-a-290x

Here is a quote from it to clarify things a bit: "Note: This only works for 290x models! All 290 non x models always have a -00 or -50 ending, even for Elpida." In short, pretty much all 290s used Elpida RAM while the 290Xs did not, and a 290 running Elpida RAM would often fail to reach 290X speeds, or only hold them for a short time before failing.


----------



## drufause

Quote:


> Originally Posted by *rickcooperjr*
> 
> you are wrong the R9 290 uses a different ram than the 290x they found that the ram used in the 290 would not reach theyre goals so they went back to drawing board and used better ram instead most of the 290X's use hynix ram or samsung I believe and the 290 used elpida or something of the sort.
> 
> A very good read on the subject of 290 vs 290x ram http://www.overclock.net/t/1457836/determine-hynix-elpida-memory-on-290x-cards-based-on-product-number-before-buying-them-confirmation-needed-please-help-if-you-have-a-290x
> 
> here is a quote from it to specify things a bit :::::::: Note: This only works for 290x models! All 290 non x models always have a -00 or -50 ending, even for Elpida. ::::: in short all 290's were pretty much elpida ram while the 290X's were not and the 290 running the elpida ram would not often reach 290x speeds and did so for short time before failure.


My MSI card reports as having Hynix, and it's an R9 290.


----------



## Orangey

The Tri-X is (was) guaranteed Hynix. MSI also had guaranteed on the Lightning and one other model I think. Not the Gaming. But it was used randomly on launch cards.


----------



## GrimDoctor

Quote:


> Originally Posted by *error-id10t*
> 
> Isn't the reason they could advertise it as 8GB because there are programs out there that do see it as complete 8GB of vRAM? I'm not talking games etc, that's my understanding.. and yes I see your post is about poking fun at the 970 crowd but just curious if anyone knows.


Yes. AutoCAD Inventor would crash on me, and I had occasional issues in Vegas when rendering, either a crash or a bad render. Now that I have a 980, no crashes at all. I use these two every day, so I can vouch for them.


----------



## AngryGoldfish

Quote:


> Originally Posted by *Thready*
> 
> I know nothing about how graphics cards work on a deep level, but I know enough and I know how businesses work. This is such a fanboy bandwagon-y issue. Do you guys honestly think Nvidia is going to cheap out and put a half gig less RAM in their cards, knowing full well that there exist a lot of PC experts who use their cards and will tear down the internals of their card and post everything online?
> 
> Do you really think they're going to do this? Their cards cost hundreds of dollars. VRAM is not that expensive compared to the GPU on the card (maybe I'm wrong on this but I think I'm right). If they wanted to cut corners, then the VRAM is not the way to go. And why would they cut corners anyways? Nvidia seems like the type of company that has no problem passing the cost onto their consumer instead of cutting corners.
> 
> Their explanation makes perfect sense to me. It makes more sense to me that they set up their VRAM in the card in such a way that a program would think it's 3.5 GB when in reality it's just split up. Nvidia is not stupid.
> 
> I'm not even an Nvidia guy. But I know that a multi billion dollar company like Nvidia, a company that makes parts that go into space, is not going to pull some shady cheap crap.


Yet they made such a massive mistake with the specifications. They attribute this screw up to poor communication between marketing and engineering. No matter how you spin that statement, nVidia made a huge mistake. If it's the truth, they screwed up big time and something should be done about it. Although little is being done for the public, maybe heads have rolled within the company. If it's a lie, they made a card that was marketed inaccurately in order to sell more. If this is proven true, which I doubt it will, nVidia are dirtbags and have potentially lost a few thousand customers.


----------



## rickcooperjr

Quote:


> Originally Posted by *AngryGoldfish*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Thready*
> 
> I know nothing about how graphics cards work on a deep level, but I know enough and I know how businesses work. This is such a fanboy bandwagon-y issue. Do you guys honestly think Nvidia is going to cheap out and put a half gig less RAM in their cards, knowing full well that there exist a lot of PC experts who use their cards and will tear down the internals of their card and post everything online?
> 
> Do you really think they're going to do this? Their cards cost hundreds of dollars. VRAM is not that expensive compared to the GPU on the card (maybe I'm wrong on this but I think I'm right). If they wanted to cut corners, then the VRAM is not the way to go. And why would they cut corners anyways? Nvidia seems like the type of company that has no problem passing the cost onto their consumer instead of cutting corners.
> 
> Their explanation makes perfect sense to me. It makes more sense to me that they set up their VRAM in the card in such a way that a program would think it's 3.5 GB when in reality it's just split up. Nvidia is not stupid.
> 
> I'm not even an Nvidia guy. But I know that a multi billion dollar company like Nvidia, a company that makes parts that go into space, is not going to pull some shady cheap crap.
> 
> 
> 
> Yet they made such a massive mistake with the specifications. They attribute this screw up to poor communication between marketing and engineering. No matter how you spin that statement, nVidia made a huge mistake. If it's the truth, they screwed up big time and something should be done about it. Although little is being done for the public, maybe heads have rolled within the company. If it's a lie, they made a card that was marketed inaccurately in order to sell more. If this is proven true, which I doubt it will, nVidia are dirtbags and have potentially lost a few thousand customers.
Click to expand...

I could not agree more with you; you hit the nail on the head. +rep for you.


----------



## Gabe63

I have not followed this closely and have a serious question, please. I do not own this card but am in the market; the 980 costs too much for me.

As I understand it, the card was sold as 4GB but can only use 3.5GB.

1. Is the card trying to use 4GB and causing issues another 3.5GB card would not have?
2. If not, is the card performing as any other 3.5GB card would?
3. If I bought this card understanding it would only use 3.5GB and set my game settings accordingly, would I be happy, or is there more to this issue?

I am trying to understand whether the card still performs as well as it did before we found out about this issue. Does it still perform as well as when the reviews came out and everyone was happy, before they knew?

Thanks


----------



## rickcooperjr

Quote:


> Originally Posted by *Gabe63*
> 
> I have not followed this closely and have a serious question please. I do not own this card but am in the market. The 980 cost too much for me.
> 
> As I understand the card was sold as 4gb but can only use 3.5.
> 
> 1. Is the card trying to use 4gb and causing issues another 3.5gb card would not have?
> 2. If not, is the card performing as any other 3.5gb card would.
> 3. If I bought this card understanding it would only use 3.5gb and put my game setting accordingly would I be happy or is there more to this issue?
> 
> I am trying to understand if the card still performs as well as it did before we found out about this issue? Does it still perform as well as when the reviews came out and everyone was happy before they knew this?
> 
> Thanks


The minute you go past its 3.5GB of VRAM, in a lot of games the crap hits the fan: it becomes a stutter fest or locks up outright, and a lot of software that needs the VRAM likewise locks up, throws errors, or corrupts whatever you were working on once it hits the 3.5GB threshold. In short, the published specs of the GTX 970 were lies, down to the number of ROPs, TMUs, VRAM bandwidth and so on. The units may physically be there, but they were cut in a way that makes them unusable; a GTX 970 is basically a heavily gimped GTX 980, and if you push it, things get dicey. It is not fixable via drivers; it's a hardware/architecture issue in the GTX 970 itself, so in short it can't be fixed (you get what you get), and Nvidia seems to have dropped the ball on remedying the issue or even addressing it properly.

Keep in mind that future games will only use more and more VRAM, putting the 970 at the 3.5GB barrier much sooner and making things even dicier. DX12, which the GTX 970 supports, will also use more VRAM. Recently leaked testing shows DX12 more than doubling FPS in Star Swarm for Nvidia cards, with DX12 and Mantle performance nearly side by side; keep in mind that Microsoft brought in the Mantle team to help develop DX12. http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/3

Nvidia is now avoiding any conversation about it like the plague and leaving its customers out to dry. Sadly, I did not expect them to do this; I expected them to at least address it and keep giving refunds, but apparently they have stopped giving refunds and no longer respond to their customers on the subject. Many other vendors that sell Nvidia GPUs have done the same, likely because Nvidia told them it would not give the vendors their money back for refunding their customers.


----------



## AngryGoldfish

Quote:


> Originally Posted by *Gabe63*
> 
> I have not followed this closely and have a serious question please. I do not own this card but am in the market. The 980 cost too much for me.
> 
> As I understand the card was sold as 4gb but can only use 3.5.
> 
> 1. Is the card trying to use 4gb and causing issues another 3.5gb card would not have?
> 2. If not, is the card performing as any other 3.5gb card would.
> 3. If I bought this card understanding it would only use 3.5gb and put my game setting accordingly would I be happy or is there more to this issue?
> 
> I am trying to understand if the card still performs as well as it did before we found out about this issue? Does it still perform as well as when the reviews came out and everyone was happy before they knew this?
> 
> Thanks


As far as I understand it:

1. The card _is_ using 4GB of VRAM, but that last 0.5GB runs extremely slowly (about 1/7th the speed) and can defer to system memory for help.

2. The card performs poorly because the game's code sees 4GB of VRAM, yet it effectively has only 3.5GB of full-speed memory to use for texture mapping. This is why the performance drops are so noticeable; it causes micro-stuttering and frametime spikes. Other cards, such as the 980 with 4GB of full-speed memory, will drop in performance in games that draw more than 4GB of VRAM (Shadow of Mordor, for example) but will still be playable. The Titan will draw 5GB in the same scene while the 980 draws 4GB; the 980 still performs perfectly fine in these situations, as the game will recognise the cap and possibly 'throttle' down. The 970 occasionally does not.

3. Your happiness is your own to decide, but I have not personally experienced any severe issues yet. With DSR on in the games I'm currently playing (ME3, Far Cry 3, Tomb Raider, Borderlands 2, CS:GO, Max Payne 3), I am not noticing any issues. However, I have yet to play Far Cry 4, Shadow of Mordor, Unity, Advanced Warfare, Dying Light or The Evil Within, which are among the few games that benefit from high-speed, high-capacity VRAM. BF4 also benefits when playing at 4K, but the 970 can't keep up with that anyway. Skyrim with heavy amounts of texture mods is known to demand 4GB of VRAM or more; that will probably be the test I use, since I'd like to play Skyrim with those mods, while Unity and a bunch of the others were so badly designed that I lost interest.
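The figures being thrown around in this thread can be sanity-checked with some quick arithmetic. This is only a back-of-the-envelope sketch: the 224 GB/s total and the eight 32-bit memory controllers come from NVIDIA's published 970 spec, the 7-vs-1 controller split from their statement, and everything else is derived.

```python
# Back-of-the-envelope check of the GTX 970 memory-split figures
# discussed in this thread (a sketch, not an official breakdown).

ADVERTISED_BANDWIDTH_GBPS = 224.0  # NVIDIA's spec-sheet number
CONTROLLERS = 8                    # eight 32-bit memory controllers

per_controller = ADVERTISED_BANDWIDTH_GBPS / CONTROLLERS  # 28.0 GB/s each

fast_segment = per_controller * 7  # 3.5 GB behind 7 controllers -> 196.0 GB/s
slow_segment = per_controller * 1  # 0.5 GB behind 1 controller  ->  28.0 GB/s

# The "1/7th the speed" figure quoted above:
ratio = slow_segment / fast_segment
print(f"fast: {fast_segment:.0f} GB/s, slow: {slow_segment:.0f} GB/s, "
      f"slow/fast = 1/{fast_segment / slow_segment:.0f}")
```

So the often-quoted 196 GB/s and 1/7th-speed numbers are just the advertised bandwidth split 7:1 across the controllers.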


----------



## Tsumi

Quote:


> Originally Posted by *Gabe63*
> 
> I have not followed this closely and have a serious question please. I do not own this card but am in the market. The 980 cost too much for me.
> 
> As I understand the card was sold as 4gb but can only use 3.5.
> 
> 1. Is the card trying to use 4gb and causing issues another 3.5gb card would not have?
> 2. If not, is the card performing as any other 3.5gb card would.
> 3. If I bought this card understanding it would only use 3.5gb and put my game setting accordingly would I be happy or is there more to this issue?
> 
> I am trying to understand if the card still performs as well as it did before we found out about this issue? Does it still perform as well as when the reviews came out and everyone was happy before they knew this?
> 
> Thanks


If you're not running SLI, you are probably not going to run into the slow memory problems. If you are running SLI, the chances are much higher that you run into the slow memory.

Yes, it performs the same for the most part in single card mode. But some SLI issues that were thought to be SLI problems may in fact actually be problems related to the slow VRAM.


----------



## Blameless

Quote:


> Originally Posted by *Forceman*
> 
> I know, it was a joke. You know, making fun of the "GTX 970 doesn't have 4 GB" crowd. I thought the wink would have made that clear.


I'm almost completely immune to sarcasm and emoticons.
Quote:


> Originally Posted by *error-id10t*
> 
> Isn't the reason they could advertise it as 8GB because there are programs out there that do see it as complete 8GB of vRAM?


If they can use the GPUs completely independently, sure.


----------



## PhotonFanatic

So right now, there is little that will be done. But what about when companies like EVGA inevitably come out with a 970 with more VRAM? They always do. Always have, always will, so please don't say they won't; it would contradict all of video card history.

Should those who are considering the 970 wait for that card? I'm assuming the issue would be fixed.


----------



## rickcooperjr

Quote:


> Originally Posted by *PhotonFanatic*
> 
> So right now, there is little that will be done. What about when they (the companies like evga) inevitably come out with a 970 with more Vram? They always do it. Always have, always will, so please don't say they won't. It would be contradicting all of video card history.
> 
> Should those who are considering the 970, wait on that card? I'm assuming the issue would be fixed.


The core issue cannot be fixed; it is broken on the hardware/architecture side, not the software side. Anything short of a full GTX 980 is going to have the issue, because of the way they took rejected 980s, cut them down internally, and rebadged them as GTX 970s.

They literally went in and physically severed connections during manufacturing on silicon that couldn't pass the GTX 980 requirements, then called the result a GTX 970, and they did so in such a way that all 970s are the same.

That is one heck of a way to cut the cost of making GTX 980s, if you ask me: take the failures and hard-cut them down into 970s, rather than using a firmware/driver method, even though doing so broke the way the cards were advertised to function spec-wise, and just run with it.


----------



## Cyro999

Quote:


> Originally Posted by *PhotonFanatic*
> 
> So right now, there is little that will be done. What about when they (the companies like evga) inevitably come out with a 970 with more Vram? They always do it. Always have, always will, so please don't say they won't. It would be contradicting all of video card history.
> 
> Should those who are considering the 970, wait on that card? I'm assuming the issue would be fixed.


Instead of a 256-bit bus with 4GB (or nominally 8GB) of VRAM, an extended-VRAM version built with double-density memory chips would still be 224-bit effective; it would just have 7GB of fast memory usable instead of 3.5GB.
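Put another way, doubling the chip density only scales both segments; it doesn't change the split. A quick sketch (this is a hypothetical double-density card, and it assumes the 7-of-8 controller topology stays exactly as NVIDIA described it):

```python
# Sketch: how the fast/slow VRAM split scales on a hypothetical
# double-density GTX 970 variant. Assumes the 7-of-8 memory-controller
# topology is unchanged and only the chip density doubles.

def vram_segments(total_gb):
    """Return (fast_gb, slow_gb): 7/8 of capacity is full speed, 1/8 is slow."""
    return total_gb * 7 / 8, total_gb / 8

print(vram_segments(4))  # shipping 970: (3.5, 0.5)
print(vram_segments(8))  # double-density variant: (7.0, 1.0)
```

So an "8GB 970" would still have a slow tail segment, just a 1GB one.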


----------



## Anth0789

Here is the response I got from Asus for asking for a refund:
Quote:


> Dear Anth,
> 
> Thank you for contacting ASUS Customer Service.
> My name is Sean and it is my pleasure to help you with your problem.
> 
> I understand that you are requesting for refund for your product.
> 
> I apologize for the inconvenience this has caused and it would be my pleasure to assist you in resolving this issue.
> 
> Could you please describe the issue in detail to assist you better.
> 
> As for us now we never heard about the memory issue in this product.
> 
> Could you please forward the reference link where you would have read about memory issue on this product.
> 
> If the product is repairable then it will be repaired else you will receive a notification mail from us regarding replacement.
> 
> I am sorry to say that refund is not possible.
> 
> If you need any further assistance,Please feel free to contact us.
> 
> Thank you very much.
> Sean D


----------



## Cyro999

It's not even plausible that Asus hasn't heard of the false advertising issue yet.


----------



## djsi38t

I have a feeling not many of the people who are upset about this are going to switch to AMD for the rest of their gaming days.

In my opinion, expecting a cash refund on a product past the return window is expecting too much.

Also, for those battling with Newegg: don't burn your bridge with them over this after a long, happy relationship.

If you do a chargeback, they may never sell to you again. They have been good to you; now it's your turn to be good to them.


----------



## Cyro999

Quote:


> In my opinion expecting a cash refund on a product past the return window is expecting to much.


People owned them for ~1-4 months out of an expected life cycle of probably 3 years, depending on the person. If the product is not as advertised in a significant way, then it matters, and EU consumer protection laws probably come in somewhere, whether the retailer likes it or not.


----------



## DrFPS

Quote:


> Originally Posted by *AngryGoldfish*
> 
> Yet they made such a massive mistake with the specifications. They attribute this screw up to poor communication between marketing and engineering. No matter how you spin that statement, nVidia made a huge mistake. If it's the truth, they screwed up big time and something should be done about it. Although little is being done for the public, maybe heads have rolled within the company. If it's a lie, they made a card that was marketed inaccurately in order to sell more. If this is proven true, which I doubt it will, nVidia are dirtbags and have potentially lost a few thousand customers.


Quote:


> Originally Posted by *rickcooperjr*
> 
> I could not agree more with you you hit the nail on the head. + rep for you


I have a hard time believing the marketing-vs-engineering story. If you go to Nvidia's website, the 4GB of VRAM and the bandwidth figure are still listed.

I'm not sure, but a missing 0.5GB would definitely cut into the advertised 224 GB/s of bandwidth.

http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970/specifications

I have no idea how GPU-Z works, maybe via the video BIOS, but it's as wrong as wrong can be. IMHO, Nvidia needs to buy out all the original 970 owners and drastically reduce the price, or they will end up in trouble.


----------



## Forceman

Quote:


> Originally Posted by *DrFPS*
> 
> I have a hard time believing the marketing vs engineering story. If you go to nvidia's website the only thing is there actually the 4gb vram and bandwidth.
> 
> I'm not sure but missing .5Gb would defiantly cut into the bandwidth advertised at 224 Gb/s.
> 
> http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970/specifications
> 
> I have no idea how GPUz works, maybe the video bios. That's as wrong as wrong can be. IMHO nvidia need to buy off all the original 970 owners, and drastically reduce the price.
> Or they will end up in trouble.


This has already been covered numerous times. They aren't changing the specs because the card does have 4GB addressable and there are (at least theoretically) scenarios where you can get the full 224GB/sec. So as far as Nvidia is concerned, the specs are correct.

And GPU-Z just gets information from a lookup table/database, it isn't querying the card for things like ROPs or bus width.


----------



## EQvet80

Quote:


> Originally Posted by *2010rig*
> 
> You mean the 8 ROP's that are un-usable anyway due to the SMM's being the bottleneck?
> 
> I know everybody goes out and buys the card based on ROP count and L2 cache, right? No one cares about the performance.
> 
> Look, I don't buy that it took them 4 months to notice their mistake, but that's all it was, *a mistake*, a *mislabeled spec sheet*. I'm not making excuses for them.
> 
> The *performance* of the card is STILL the same as seen in reviews, shouldn't that be what matters? After almost 2000 posts in this thread, I don't know why I bother anymore. Keep complaining.


It matters to those who are concerned with performance. But those concerned with performance are only a percentage, just as only a percentage of those are more concerned with honesty. To me, if it performs, and performs beastly, then that is all that matters. Then again, that's just me.


----------



## iSlayer

I wonder when the mods will make spreading FUD a bannable offence...
Quote:


> Originally Posted by *mtcn77*
> 
> lol! So early to retire some pretty glamorous cards, me thinks.
> 
> 
> Spoiler: curse of the 2 gigabytes


2.5k posts over more than 2 weeks. SMH. How does this post add anything to this thread?


----------



## mtcn77

Quote:


> Originally Posted by *iSlayer*
> 
> I wonder when the mods will make spreading FUD for personal gain a bannable offence...
> 2.5k posts over more than 2 weeks. SMH.


Stop flaming please.


----------



## Kinaesthetic

Quote:


> Originally Posted by *mtcn77*
> 
> Stop flaming please.


He ain't flaming, just stating the obvious. Over the course of these past two weeks, you've done nothing but troll this thread. Even the usual suspects have vanished from it. It's impressively sad that you've trolled it this long.


----------



## mtcn77

Quote:


> Originally Posted by *Kinaesthetic*
> 
> He ain't flaming, just posting the obvious. Over the course of these past two weeks in this thread, you've done nothing but troll it. Even the usual suspects have vanished from this thread. Which is impressively sad that you've trolled it this long.


So you sentence me to stop? I thought _I was posting the obvious_. You can check the source, it is from a review.


----------



## Yungbenny911

Quote:


> Originally Posted by *mtcn77*
> 
> lol! So early to retire some pretty glamorous cards, me thinks.
> 
> 
> Spoiler: curse of the 2 gigabytes


http://www.overclock.net/t/1474177/2gb-v-ram/0_100


----------



## 2010rig

Quote:


> Originally Posted by *Yungbenny911*
> 
> http://www.overclock.net/t/1474177/2gb-v-ram/0_100


Does this mean 3.5 GB isn't enough?

mtcn77 said it's cursed.

PCPer had to scale 4K up to 150% just to go past 3.5 GB, and considering the 970 excels at 1080p, the 970 is definitely doomed.


----------



## iSlayer

Quote:


> Originally Posted by *2010rig*
> 
> Does this mean 3.5 GB isn't enough?
> 
> mtcn77 said it's cursed.
> 
> Considering the 970 excels at 1080p....


You know, I can see mtcn already.

>970 tapers off badly after 3.5GBs
"lol nvidia lied and should be sued, GG at 4k!"
>380x struggles at 4k with 4GBs
"OMG guys it wins in Firestrike"


----------



## 2010rig

@iSlayer you've been reading too many of his posts, haven't you? Here are other likely responses to the 3.5 GB question...
Quote:


> Originally Posted by *mtcn77*
> 
> Lol, at inductive reasoning.
> Nvidia gpus and AMD gpus have all too different strongpoints. Nvidia fails at backend, AMD the frontend. So the usual complaint for AMD gpus are likely low fps due to excess geometry while Nvidia gpu's can't take fluid game play at *superresolutions* as displayed in recent game reviews much too frequently.


Stop responding, because...
Quote:


> Originally Posted by *mtcn77*
> 
> Soo, for a debater of assumptive argument, you are cutting corners when it comes to argument out of repetition. You are also assuming Nvidia wouldn't because people rip their cards to shreds and post them online(negative assumption from positive premise).


----------



## Clairvoyant129

Quote:


> Originally Posted by *mtcn77*
> 
> So you sentence me to stop? I thought _I was posting the obvious_. You can check the source, it is from a review.


Impressive number of posts for a card you don't even own. Hope AMD is paying you well.


----------



## Silent Scone

Seeing as there's no useful information in this thread regarding people getting refunds or performance-impact data...

It's time for the lockodile crocodile.


----------



## iRUSH

Quote:


> Originally Posted by *Silent Scone*
> 
> Seeing as there's no useful information in this thread, regarding people getting refunded or performance impact data...
> 
> It's time for the lockodile crocodile.


lock-odile Done-dee


----------



## mcg75

I agree.

Locked.

It will be interesting to see how much market share this major screw-up costs Nvidia.


----------

