# [PCGH] Maxwell GTX 880 specifications leaked



## Ha-Nocri

lol, 256-bit. Hoping NV aren't playing this game again.


----------



## Master__Shake

looks like about $650-700 for it

GPU Prices amirite.


----------



## TK421

Quote:


> Originally Posted by *Master__Shake*
> 
> looks like about $650-700 for it
> 
> GPU Prices amirite.


how well does it mine though?


----------



## subyman

Oh boy, 680 all over again?


----------



## King Lycan

Still 384 bit ?


----------



## Scorpion667

It's okay, the higher GPU prices just made me realize there's very little need for me to upgrade every GPU every year. It's a win-win situation that involves Nvidia getting less of my money, long-term =)


----------



## Master__Shake

Quote:


> Originally Posted by *King Lycan*
> 
> Still 384 bit ?


nope 256bit looks like gm104


----------



## Dangur

pcgameshardware?


----------



## cravinmild

Quote:


> Originally Posted by *Ha-Nocri*
> 
> lol, 256-bit. Hoping NV aren't playing this game again.


Nvidia needs to be careful: if they throw everything they've got into the first edition, then how the heck will they be able to offer a dozen slightly more powerful cards each week for the next twelve months? I love marketing


----------



## Germanian

Quote:


> Originally Posted by *Dangur*
> 
> pcgameshardware?


It's a very reputable German computer hardware magazine. I have been reading it for over 5 years.


----------



## ThePath

Looks decent. Remember that Maxwell shaders are better than Kepler's: 3200 Maxwell cores beat 3200 Kepler cores.

I don't think 256-bit will be a problem. Just look at GTX 480 vs GTX 660 or GTX 560 vs GTX 285. The newer GPU wins despite the lower memory bandwidth.


----------



## Ha-Nocri

Quote:


> Originally Posted by *cravinmild*
> 
> Nvidia needs to be careful: if they throw everything they've got into the first edition, then how the heck will they be able to offer a dozen slightly more powerful cards each week for the next twelve months? I love marketing


Exactly. 880ti, 8-series titan, 8-series titan black...
Milking, milking is my way


----------



## Valor958

Hmm... still feeling that just going with a 2nd card is more beneficial than a new series at this point. Money better spent so far as I can tell.


----------



## yunshin

Looks like I can skip the 880 after all if that leak is real.


----------



## DADDYDC650

256-bit? Hope not otherwise I'll wait at least 8 months after release to buy a GTX 880. By that time there will be a GTX 880 Super TI Mega Edition with 12GB.


----------



## pterois

So it will be well into 2015 that we get the stronger cards.


----------



## gamerguuy

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Exactly. 880ti, 8-series titan, 8-series titan black...
> Milking, milking is my way


The milking is strong with this one!


----------



## zalbard

OBR = most likely made-up stuff...


----------



## AnnoyinDemon

If this is true about the 880 then NV are just milking the customers UNLESS its the same price as a few bucks from a 780ti.


----------



## Dangur

Whats the point of 4gigs with 256bit?


----------



## Mr.Eiht

Quote:


> GTX 880 with 3200 shaders, 200 Texture units, 25SMM (Maxwell-shader-processors). I believe Raster-endstufen stands for ROP's at 40


Small correction here. The original text says that "3200 shaders, 200 Texture units" would be equal/comparable to "25SMM".
And yup, "Rasterendstufen" means ROPs.
(I had to look it up. No German-speaking person uses words like that - not since we have PROPER English terms for it.)


----------



## SuprUsrStan

Quote:


> Originally Posted by *cravinmild*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ha-Nocri*
> 
> lol, 256-bit. Hoping NV aren't playing this game again.
> 
> 
> 
> Nvidia needs to be careful: if they throw everything they've got into the first edition, then how the heck will they be able to offer a dozen slightly more powerful cards each week for the next twelve months? I love marketing

Nvidia doesn't need to be careful at all. All they need to do is be 5% faster than AMD's fastest offerings. It didn't matter that the 680 was a mid-range card; as long as it was slightly faster than AMD's offerings at release, it sold loads.

I'm looking at you AMD. You going to push Nvidia to bring the big guns? Ball's in your court.


----------



## Germanian

Quote:


> Originally Posted by *Mr.Eiht*
> 
> Small correction here. The original text says that "3200 shaders, 200 Texture units" would be equal/comparable to "25SMM".
> And yup, "Rasterendstufen" means ROPs.
> (I had to look it up. No German-speaking person uses words like that - not since we have PROPER English terms for it.)


thx, I fixed the description. I read it too fast.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Syan48306*
> 
> Nvidia doesn't need to be careful at all. All they need to do is be 5% faster than AMD's fastest offerings. It didn't matter that the 680 was a mid-range card; as long as it was slightly faster than AMD's offerings at release, it sold loads.
> 
> I'm looking at you AMD. You going to push Nvidia to bring the big guns? Ball's in your court.


They will sell this mid-range card for $750 until AMD releases something better, then they will cut the price to $400. Of course, many will sadly still buy it for $750.


----------



## Germanian

The GTX 880 will basically be a GTX 780 Ti replacement with more VRAM at 4GB and lower power use at a 230W TDP. Expect a nice, quiet, high-performance card until the new, more powerful big daddy comes out on 20nm. I expect this card to overclock like AMD's 7970 series did - excellent overclocks out of the new 20nm node - but one thing I am scared of is NVIDIA locking the voltage down.

We are all waiting for the card that's even better than GTX 880. Probably Titan2 or something


----------



## coachmark2

Quote:


> Originally Posted by *Syan48306*
> 
> Nvidia doesn't need to be careful at all. All they need to do is be 5% faster than AMD's fastest offerings. It didn't matter that the 680 was a mid-range card; as long as it was slightly faster than AMD's offerings at release, it sold loads.
> 
> I'm looking at you AMD. You going to push Nvidia to bring the big guns? Ball's in your court.


This is exactly right. Nvidia would release the GTX 880 with a 32-bit bus if they could do so and still defeat their competition. They will only innovate to the extent that they are pushed to. They're not "milking" anyone.

No one is putting a gun to your head and forcing you to buy their products; therefore, you don't get the right to point one at Nvidia and tell them what they should and shouldn't make.

That being said, you CAN choose to condemn Nvidia for their choice of memory bus (though the fact that it's so important to you leaves me wondering) and not buy their products. If their component selection _really_ is as bad as some people say, AMD will eventually sweep them and put them in an unenviable market position.


----------



## Rookie1337

Quote:


> Originally Posted by *Dangur*
> 
> Whats the point of 4gigs with 256bit?


I thought the whole point of GDDR5 was that you could have a smaller bus width and still fully utilize a larger VRAM (to a degree) while keeping as-good-or-better bandwidth?

I find it highly suspect that it has the second-highest TFLOPS of the listed cards, the second-highest number of SPs, the lowest number of ROPs, middle-of-the-pack clocks, and the lowest bandwidth. There's no way this is the 880. Either it's a fake, a very early sample, or more likely the midrange 860/870. There's just too much inconsistency. Maxwell may be different, but this seems bizarre.


----------



## Capt

If those are the true specs then this is just disappointing.


----------



## Germanian

Quote:


> Originally Posted by *Capt*
> 
> If those are the true specs then this is just disappointing.


the faster version of 20nm will be coming later on. They did the same thing with the 680 series followed by Titan and 780 series.


----------



## geoxile

"midrange". They said the same thing about the 680 and we didn't get anything better for nearly another generation


----------



## MoBeeJ

Gtx 750 ti handled itself pretty well for a 128 bus.


----------



## L36

They're not going to release a 20nm equivalent of GK110, as the price of such a die at 20nm would be absurd. They will apply the 680 strategy while 20nm wafer prices are high; plebs will buy it because it will be 20% faster than the Titan, and a year later they will release Titan 2: Electric Boogaloo once 20nm wafers are cheaper. Makes sense.


----------



## zealord

I would love to replace my GTX 680, but if I assume this leak to be true, I do not see myself paying $650/€(?) for this "midrange" card even if the performance is well above the 780 Ti.
Well, of course, if it is like 60-70% better than the 780 Ti I might give in, but I don't see that remotely happening.

Let's just hope the leak is wrong, at least about the price, because a few months after the 880 there will be much better cards, making the 880 a midrange card again.


----------



## Exilon

If Nvidia is using a 256-bit bus, it's because they think the huge L2 cache in Maxwell will counteract the reduced memory bandwidth. Looking at how the 750 Ti can keep up with the 650 Ti Boost with half the memory bandwidth and 2/3 of the ROPs, I think Nvidia knows what they're doing. In any case, the area for the L2 cache has to come from somewhere.
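Exilon's point can be put into a toy model: requests that hit in a bigger L2 never touch DRAM, so a narrower bus only has to carry the miss traffic. A minimal sketch (the hit rates and the 224 GB/s figure are made-up illustration numbers, not Maxwell measurements):

```python
def effective_bandwidth(raw_gbps: float, l2_hit_rate: float) -> float:
    """Toy model: requests that hit in L2 never reach DRAM, so the
    memory bus only carries the miss traffic. The bandwidth the
    shaders effectively see is raw bandwidth divided by the miss rate."""
    miss_rate = 1.0 - l2_hit_rate
    return raw_gbps / miss_rate

# Hypothetical: a 256-bit card with 224 GB/s raw bandwidth and a 50%
# L2 hit rate behaves like a cacheless card with double the bus.
print(effective_bandwidth(224.0, 0.5))  # 448.0
print(effective_bandwidth(224.0, 0.0))  # 224.0 (no cache, no help)
```

Obviously real hit rates vary per workload, which is why the 750 Ti comparison below is the more interesting data point.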


----------



## Popple

What are the main factors of a card that determine its antialiasing performance?


----------



## vlps5122

skipping maxwell for volta


----------



## Ha-Nocri

Quote:


> Originally Posted by *Popple*
> 
> What are the main factors of a card that determine its antialiasing performance?


memory bus plays a huge part here


----------



## serothis

If that chart is to be believed, the 880 will be almost identical in price to the 780 Ti... With those specs, no sale for me.


----------



## JoHnYBLaZe

Waiting for double the 780 Ti's performance (or thereabouts) and _*USED*_.

20-40% better than last gen at double the price won't cut it, and neither will a shiny cooler...


----------



## iamhollywood5

Pretty skeptical of this, but if it's true, I'm certainly not interested in dropping $700 for a card with a 256-bit bus, 200 TMUs, and 40 ROPs. No thank you.

Looks like I won't be regretting my recent 780 Ti purchase, at least!!


----------



## Germanian

Quote:


> Originally Posted by *iamhollywood5*
> 
> Pretty skeptical of this, but if it's true, I'm certainly not interested in dropping $700 for a card with a 256-bit bus, 200 TMUs, and 40 ROPs. No thank you.
> 
> Looks like I won't be regretting my recent 780 Ti purchase, at least!!


Ya, the 780 Ti is a beast (especially overclocked) and will hold up well for at least a year or two, until we see Titan 2.


----------



## MunneY

So much meh... I'm glad that I grabbed a couple tis


----------



## Imglidinhere

I can't wait to see what Nvidia does for the flagship laptop GPU here!

Should be pretty epic.


----------



## wholeeo

I'm glad I grew out of my upgrade itch. I'll just wait until new cards are released whose stock performance my 780s can't be overclocked to match. But even then I'll wait for the 6GB Ti Black versions!


----------



## ohhgourami

Still something to replace my gtx 670...


----------



## lacrossewacker

If it's faster than the Titan, I don't care if it has 512-bit bus or a 64-bit bus.

We have yet to see this in action (if it's even real) and not all of our previous experiences with certain specs are applicable to future technologies.


----------



## Baghi

This has to be a damn good deal if priced $700-800, COMPARED TO ZE TITANS!


----------



## Alatar

The specs do sound somewhat reasonable for a GM204 part. The CUDA core count is maybe a bit high imo, but we'll see.

That said, there are two things worth pointing out here:

1) There seems to be a price listed in that table, which is quite silly. Prices of GPUs are up in the air until the last couple of weeks before launch.

2) This comes from OBR. OBR might have a good record with Bulldozer, but he has an absolutely terrible record with anything related to Nvidia GPUs. He's the guy who was passing around photoshopped versions of random chips calling them GK114, or some random card with an IHS calling it GK100...

Grain of salt...


----------



## MattGordon

Maxwell
I
L
K

So the 800 will be the most milked series ever. I can already see it: 850, 850 Ti, 860, 860 Ti, 870, 870 Ti, 880, 880 Ti, 885, 885 Ti, 890, and multiple versions of the Titan series. Oh, and each card will of course come in a 2, 4, or 6GB version.

With that said... can't wait for the big daddy of maxwell.


----------



## AnnoyinDemon

Quote:


> Originally Posted by *Alatar*
> 
> The specs do sound somewhat reasonable for a GM204 part. The CUDA core count is maybe a bit high imo, but we'll see.
> 
> That said, there are two things worth pointing out here:
> 
> 1) There seems to be a price listed in that table, which is quite silly. Prices of GPUs are up in the air until the last couple of weeks before launch.
> 
> 2) This comes from OBR. OBR might have a good record with Bulldozer, but he has an absolutely terrible record with anything related to Nvidia GPUs. He's the guy who was passing around photoshopped versions of random chips calling them GK114, or some random card with an IHS calling it GK100...
> 
> Grain of salt...


OBR?? I thought he quit... He was also right about most things on the AM3+ side, even about Piledriver being the last.


----------



## Xyxox

So if this is true, I'll probably be better off upgrading my sig rig to an Intel 3770K and getting a couple of the 6GB 780s due to be released by EVGA soon, then sitting it out until the 9xx series drops. Seems like every even-numbered series starts the ball rolling, and the odd-numbered series is where they get things locked down before the next latest-and-greatest chip is released.


----------



## fleetfeather

I feel a GTX 880 Ultra coming

#throwback


----------



## Alatar

Quote:


> Originally Posted by *AnnoyinDemon*
> 
> OBR?? I thought he quit... He was also right about most things on the AM3+ side, even about Piledriver being the last.


OP says the leak comes from OBR.

And yeah, he was right about Bulldozer and Piledriver. But that was because he actually had access to engineering samples. No one believed him at the time because everyone thought BD was going to be the greatest thing since sliced bread.

If he leaked more benches from some upcoming AMD CPUs I might believe him, but his Nvidia stuff has been very meh. Lots of photoshopped pictures of old GPUs passed off as next-gen Kepler parts. And even the codenames he photoshopped onto his images were wrong - we never actually saw GK114, for example.

Take this gem of an article for example: http://videocardz.com/35227/nvidia-geforce-gtx-780-will-not-be-based-on-gk110-gpu


----------



## Artikbot

Quote:


> Originally Posted by *subyman*
> 
> Oh boy, 680 all over again?


Including the guys who will endlessly remind us how it's a superior alternative to AMD's card because it's supposedly midrange, while there is absolutely no part above it until some crazy $1000 card comes a year later?

Looks like it.


----------



## psyside

256-bit is not really a problem because of the increased cache on the newer arch - increased L2 cache, I guess.


----------



## AnnoyinDemon

Quote:


> Originally Posted by *Alatar*
> 
> OP says the leak comes from OBR.
> 
> And yeah, he was right about Bulldozer and Piledriver. But that was because he actually had access to engineering samples. No one believed him at the time because everyone thought BD was going to be the greatest thing since sliced bread.
> 
> If he leaked more benches from some upcoming AMD CPUs I might believe him, but his Nvidia stuff has been very meh. Lots of photoshopped pictures of old GPUs passed off as next-gen Kepler parts. And even the codenames he photoshopped onto his images were wrong - we never actually saw GK114, for example.
> 
> Take this gem of an article for example: http://videocardz.com/35227/nvidia-geforce-gtx-780-will-not-be-based-on-gk110-gpu


Nice find, Alatar, and it's true about the engineering samples.

I call this fake...


----------



## Alatar

Quote:


> Originally Posted by *psyside*
> 
> 256-bit is not really a problem because of the increased cache on the newer arch - increased L2 cache, I guess.


This is at least partly true.

However, we don't know how much L2 cache larger Maxwell designs will get. The 750 Ti does perform extremely well for a part with so little memory bandwidth, and at least part of that is because the card can do more with 8x the cache of the equivalent Kepler part.


----------



## szeged

Super fake, not even worth being posted.


----------



## psyside

Quote:


> Originally Posted by *Alatar*
> 
> This is at least partly true.
> 
> However, we don't know how much L2 cache larger Maxwell designs will get. The 750 Ti does perform extremely well for a part with so little memory bandwidth, and at least part of that is because the card can do more with 8x the cache of the equivalent Kepler part.


IMHO they would not release these cards with a 256-bit bus if there weren't some different method of gaining performance despite the narrower bus. Knowing 4K gaming is the thing now and in the future, there is no chance you can feed those crazy pixel counts with low memory bandwidth unless something is quite different in the architecture (L2 cache coming first to mind).


----------



## revro

Quote:


> Originally Posted by *MattGordon*
> 
> Maxwell
> I
> L
> K
> 
> So the 800 will be the most milked series ever. I can already see it: 850, 850 Ti, 860, 860 Ti, 870, 870 Ti, 880, 880 Ti, 885, 885 Ti, 890, and multiple versions of the Titan series. Oh, and each card will of course come in a 2, 4, or 6GB version.
> 
> With that said... can't wait for the big daddy of maxwell.


you forgot the TiTi version of each card

880TiTi xD

Quote:


> Originally Posted by *Xyxox*
> 
> So if this is true, I'll probably be better off upgrading my sig rig to an Intel 3770K and getting a couple of the 6GB 780s due to be released by EVGA soon, then sitting it out until the 9xx series drops. Seems like every even-numbered series starts the ball rolling, and the odd-numbered series is where they get things locked down before the next latest-and-greatest chip is released.


yep, that's the way people should go. The 6GB 780 will be like the ultimate long-term card

Quote:


> Originally Posted by *szeged*
> 
> Super fake, not even worth being posted.


yeah, I kinda believe it too


----------



## szeged

With all the hype nvidia put into 4k I bet they'll do a 512 bit bus just to appeal to the "bigger specs number means bigger performance" crowd.


----------



## serothis

Quote:


> Originally Posted by *szeged*
> 
> With all the hype nvidia put into 4k I bet they'll do a 512 bit bus just to appeal to the "bigger specs number means bigger performance" crowd.


I hope you're right, because I'm a card-carrying member of that crowd, but I can't help feeling that cutting features for the sake of "good enough" performance is going to win out in the NV board rooms because it's more profitable.


----------



## AnnoyinDemon

Quote:


> Originally Posted by *psyside*
> 
> IMHO they would not release these cards with a 256-bit bus if there weren't some different method of gaining performance despite the narrower bus. Knowing 4K gaming is the thing now and in the future, there is no chance you can feed those crazy pixel counts with low memory bandwidth unless something is quite different in the architecture (L2 cache coming first to mind).


I thought the bus speed affects the memory. I still have a lot to learn...

How big is the cache on a GPU?


----------



## Blameless

Quote:


> Originally Posted by *ThePath*
> 
> Looks decent. Remember that Maxwell shaders are better than Kepler's: 3200 Maxwell cores beat 3200 Kepler cores.


Shader power isn't everything.

40 ROPs is pretty low, but more than I would expect for a mid-range part. We'll probably see 48-64 ROPs on big Maxwell.
Quote:


> Originally Posted by *Dangur*
> 
> Whats the point of 4gigs with 256bit?


Bus width isn't the same as bandwidth, and nothing about a 4GiB buffer demands extreme bandwidth to be useful.

It's true that, in current and past titles, utilizing a 4GiB texture/frame buffer requires settings that most cards equipped with 256-bit memory buses could not handle well. However, the habit people have of linking the two, absent reason, needs to stop. It just leads to foolishly incorrect assumptions.

If a game has a lot of high res textures, or if multi-GPU solutions are used, that large buffer will come in very handy, even on mid-range parts.
Quote:


> Originally Posted by *Rookie1337*
> 
> I thought the whole point of GDDR5 was that you could have a smaller bus width and still fully utilize a larger VRAM (to a degree) while keeping as-good-or-better bandwidth?


GDDR5 does provide more bandwidth at the same bus width than previous standards; its effective clock speed is higher.
Quote:


> Originally Posted by *Exilon*
> 
> If Nvidia is using a 256-bit bus, it's because they think the huge L2 cache in Maxwell will counteract the reduced memory bandwidth.


It's cheaper. Smaller memory controller = smaller die. Less memory channels = less PCB traces.

256-bit is perfectly sufficient for a mid-range to upper-mid range part.
Quote:


> Originally Posted by *Popple*
> 
> What are the main factors of a card that determine its antialiasing performance?


For conventional AA types (MSAA, SSAA, and their derivatives)? Fill rate (ROP count * clock speed) and memory bandwidth (memory bus width * clock speed), are the largest factors.

For post-processing AA (MLAA, FXAA, SMAA, etc)? Shader performance is the prime factor.
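Those two formulas are easy to evaluate against the rumored numbers floating around this thread (40 ROPs, a ~900MHz core, a 256-bit bus at 7400MT/s - all unconfirmed leak figures, used here purely for illustration):

```python
def pixel_fill_rate_gpix(rops: int, core_mhz: float) -> float:
    """Fill rate = ROP count * core clock, in gigapixels/s."""
    return rops * core_mhz / 1000.0

def mem_bandwidth_gbs(bus_bits: int, eff_clock_mhz: float) -> float:
    """Bandwidth = bus width (in bytes) * effective memory clock, in GB/s."""
    return bus_bits / 8 * eff_clock_mhz / 1000.0

# Rumored GTX 880: 40 ROPs at ~900 MHz, 256-bit at 7400 MT/s effective
print(pixel_fill_rate_gpix(40, 900))  # 36.0 GPix/s
print(mem_bandwidth_gbs(256, 7400))   # 236.8 GB/s
```

For comparison, plugging in the 780 Ti's 48 ROPs and 384-bit/7000MT/s bus gives noticeably higher numbers on both axes, which is why the MSAA-heavy crowd in this thread is skeptical.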
Quote:


> Originally Posted by *Alatar*
> 
> The specs do sound somewhat reasonable for a GM204 part. Cuda core count is maybe a bit high imo but we'll see.
> 
> That said there's two things that aren't being said here:
> 
> 1) There seems to be a price listed in that table which is quite silly. Prices of GPUs are up in the air until the last couple of weeks before launch.
> 
> 2) This comes from OBR. OBR might have a good record with Bulldozer but he has an absolutely terrible record with anything related to Nvidia GPUs. He's the guy who was passing around photoshopped versions of random chips calling them GK114, or some random ccard with IHS calling it GK100...
> 
> Grain of salt...


Agreed. That price, most especially, does not seem reasonable for a non-flagship part.
Quote:


> Originally Posted by *AnnoyinDemon*
> 
> I thought the bus speed affects the memory. I still have a lot to learn...
> 
> How big is the cache on a GPU?


Bus width does affect memory bandwidth, but it's only one component of it, and even theoretical memory bandwidth figures say little about the whole memory sub-system.

GPUs typically have very small caches. I think the 750 Ti has 2MiB of L2; the equivalent Kepler part only had 256KiB.


----------



## DStanding

Is it just me, or does this chart have the specs on the Asus 780TI completely wrong?


----------



## Olivon

I call fake.


----------



## Just a nickname

Quote:


> Originally Posted by *szeged*
> 
> Super fake, not even worth being posted.


Yes... Super fake. If this rumor had any weight, they would have hinted at least at the die size. Like, they know how many transistors there are inside, even the VRAM capacity, but not the die size? Why?


----------



## xentrox

So now we gotta wait for the 880 Super Kingpin SC OC Ti Black Edition Z?

But we have to wait for the 880 SCC Mini-Ti X edition to come out first, before the next big boy is released...


----------



## serothis

Quote:


> Originally Posted by *DStanding*
> 
> Is it just me, or does this chart have the specs on the Asus 780TI completely wrong?


Good catch. It looks like they took some specs from the 780 Ti Matrix, but the ones they didn't have they copy/pasted from the 780 (non-Ti).


----------



## DStanding

Quote:


> Originally Posted by *serothis*
> 
> Good catch. It looks like they took some specs from the 780 Ti Matrix, but the ones they didn't have they copy/pasted from the 780 (non-Ti).


If they can't even get that right, I seriously doubt the veracity of the rest of the chart. I'll wait for a more reputable source.

And yes, this is partly fueled by denial and not wanting Nvidia to release a midrange x80 card again.


----------



## MeanBruce

Could this be my next video card? My Asus DC1 AMD 6870 needs replacing after 3.5 years.

He's actually barely breathing at 2560x1440. Help!


----------



## brkbeatjunkie

Quote:


> Originally Posted by *Ha-Nocri*
> 
> lol, 256-bit. Hoping NV aren't playing this game again.


I know, right? That's what I thought when they released the GTX 750 Ti with a 128-bit bus.


----------



## MeanBruce

Quote:


> Originally Posted by *szeged*
> 
> Super fake, not even worth being posted.


Really? Aw come on man, got my hopes up and all, my current 6870 needs intensive care at this point.

My poor 6870 is surrounded by performance and needs to say farewell and find a better place.


----------



## serothis

Quote:


> Originally Posted by *MeanBruce*
> 
> Quote:
> 
> 
> 
> Originally Posted by *szeged*
> 
> Super fake, not even worth being posted.
> 
> 
> 
> Really? Aw come on man, got my hopes up and all, my current 6870 needs intensive care at this point.

I think it could do with some electroshock therapy. MOAR VOLTS!

:thumb:


----------



## omari79

don't go underestimating the card, guys... the 750 Ti is a very good indication of what Maxwell can do


----------



## bencher

Quote:


> Originally Posted by *omari79*
> 
> don't go underestimating the card guys..the 750Ti is a very good indication of what Maxwell can do


True.


----------



## mikeaj

Compared to GTX 750 Ti:

5x shaders (CUDA cores)
2.5x ROPs
2x memory bus
0.88x core clock
1.37x memory clock (for 2.74x mem bandwidth)
3.83x TDP (on 20 nm as opposed to 28 nm)
4.22x transistors

I would have imagined fewer shaders and a lower TDP (i.e. a smaller chip), really, unless this thing is a long time coming and 20nm yields are already fine for something of that size. In fact, the shader count is pretty crazy given what the GTX 750 Ti can do. Is the 750 Ti really that shader-starved? Does it never run into memory or ROP or any other bottlenecks? Otherwise, this "GTX 880" doesn't make much sense in context.
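For what it's worth, those ratios can be reproduced from the 750 Ti's published specs (640 shaders, 16 ROPs, 128-bit bus, 1020MHz core, 5400MT/s memory, 60W, 1.87bn transistors); the 880 column is just the rumor, so treat it accordingly:

```python
# Published GTX 750 Ti specs vs. the rumored (unconfirmed) GTX 880 figures.
gtx_750_ti = {"shaders": 640, "rops": 16, "bus": 128, "core_mhz": 1020,
              "mem_mts": 5400, "tdp_w": 60, "transistors_bn": 1.87}
rumored_880 = {"shaders": 3200, "rops": 40, "bus": 256, "core_mhz": 900,
               "mem_mts": 7400, "tdp_w": 230, "transistors_bn": 7.9}

# Ratio of each rumored spec to the 750 Ti's, rounded to two places.
ratios = {k: round(rumored_880[k] / gtx_750_ti[k], 2) for k in gtx_750_ti}
print(ratios)
# {'shaders': 5.0, 'rops': 2.5, 'bus': 2.0, 'core_mhz': 0.88,
#  'mem_mts': 1.37, 'tdp_w': 3.83, 'transistors_bn': 4.22}
```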

In any case, it looks fishy.


----------



## MeanBruce

Quote:


> Originally Posted by *omari79*
> 
> don't go underestimating the card guys..the 750Ti is a very good indication of what Maxwell can do


I'm certainly not underestimating it; I'm hoping for the absolute best from Nvidia Maxwell. As someone with only a working content-creation rig who purchases a new GPU every 3 to 4 years, I'm looking for extreme 2K (and later 4K) performance coupled with very low power draw, so the new Asus Direct CU3 coolers will run in perfect silence.

Asus GTX 880 DC3. Woooot.


----------



## i7monkey

Looks like it's mid-range trash again. Anything more than $399 would be a blatant ripoff.

Many will be cheering if it's $499. Others will be happy at $599. This will probably go for $649 and still sell like hotcakes. NO THANKS!

Can't wait for a $1099 GM110 Titan 2, lmao.


----------



## szeged

If the REAL 880 launches at $550 I'll be happy, but if this fake crap garbage that's on the front page were $550, I'd lol my way to the next AMD card.


----------



## DStanding

Also, 7.9bn transistors on 20nm = about 80% the die size of GK110? GK104 was ~52% the size of GK110, so if we go by that, this may be slightly less "midrange" than GK104...

Although I may just be talking out my ass.
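The estimate swings a lot with the density assumption, which is the real unknown. A quick sensitivity check, using GK110's ~7.1bn transistors as the baseline (the density multipliers are guesses, not TSMC figures):

```python
def relative_die_size(new_transistors_bn: float,
                      old_transistors_bn: float,
                      density_gain: float) -> float:
    """Die size of the new chip relative to the old one, assuming the
    new process packs `density_gain` times more transistors per mm^2."""
    return (new_transistors_bn / old_transistors_bn) / density_gain

# 7.9bn on 20nm vs. GK110's ~7.1bn on 28nm:
print(round(relative_die_size(7.9, 7.1, 1.4), 2))  # 0.79 -> the ~80% estimate
print(round(relative_die_size(7.9, 7.1, 1.9), 2))  # 0.59 if density scales harder
```

So the ~80% figure holds only if 20nm delivers a fairly modest density gain; with more aggressive scaling, the chip would land much closer to GK104 territory.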


----------



## bencher

Quote:


> Originally Posted by *szeged*
> 
> If the REAL 880 launches at $550 I'll be happy, but if this fake crap garbage that's on the front page were $550, I'd lol my way to the next AMD card.


Ha I want to see that.


----------



## szeged

Quote:


> Originally Posted by *bencher*
> 
> Ha I want to see that.


thinking about grabbing a 290X Lightning for funsies; maybe it'll hold me over till the real 880 Kingpins or the next AMD GPU


----------



## Zero4549

Damnit Nvidia, quit releasing these mid range cards with high end price tags. You aren't fooling anyone (well ok, you actually are fooling just about everyone, but _not me!_).


----------



## CynicalUnicorn

Quote:


> Originally Posted by *MoBeeJ*
> 
> Gtx 750 ti handled itself pretty well for a 128 bus.


Yup. Bus doesn't mean anything by itself though. Bandwidth = bus * memory speed. A 1024-bit bus running at 500MHz has the same bandwidth as a 64-bit bus at 8GHz. Nvidia has higher memory clockspeeds than AMD in general, so they aren't nearly as limited by the smaller bus in terms of bandwidth.
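The equality in that example is easy to verify with the formula as stated:

```python
def bandwidth_gbs(bus_bits: int, eff_clock_mhz: float) -> float:
    """Bandwidth = bus width (converted to bytes) * effective memory clock,
    expressed in GB/s."""
    return bus_bits / 8 * eff_clock_mhz / 1000.0

# A 1024-bit bus at 500 MHz vs. a 64-bit bus at 8 GHz effective:
print(bandwidth_gbs(1024, 500))  # 64.0 GB/s
print(bandwidth_gbs(64, 8000))   # 64.0 GB/s -- identical, as claimed
```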

Quote:


> Originally Posted by *Rookie1337*
> 
> I find it highly suspect that it has the second-highest TFLOPS of the listed cards, the second-highest number of SPs, the lowest number of ROPs, middle-of-the-pack clocks, and the lowest bandwidth. There's no way this is the 880. Either it's a fake, a very early sample, or more likely the midrange 860/870. There's just too much inconsistency. Maxwell may be different, but this seems bizarre.


It has the second highest TFLOPS of all the _cards_, but not of all the GPUs. Remember, the Titan Z is a dual-GPU card. Also remember that Maxwell is, core for core and clock for clock, something like 30-40% faster than Kepler. ROPs? Those usually depend on the backend, but I'm not entirely sure how they're determined. I think bus width has something to do with them, which implies this should have 32 ROPs instead, or a 320-bit bus. In any case, I am definitely skeptical about this, but this is Nvidia. The 700 series has had like five different flagships. Maybe we'll see a 384-bit 880 Ti with 3600 shaders and 60 ROPs just three weeks after the 880's release!

So at this point, I don't even think I'm waiting for Maxwell. I'll probably just grab a 770 if Nvidia keeps up their bullcrap. I am not happy about how GK110 has turned out, and if this is real, then this is just the 600 series all over again: mid-range crap for the first run, and then the real stuff the second time. However, I will give them the benefit of the doubt and wait, since it is so early in the year.


----------



## MxPhenom 216

Sweet. Now once it releases, watch people buy it; then I'll take a 780 off their hands and SLI till second-generation Maxwell is out, or Pascal.


----------



## ZealotKi11er

Will probably be ~$500. Should be faster than the GTX 780 since it has more cores.
Memory should be fine, really. Yes, it's true we need a better card for $500, but this is the best we will get.


----------



## Usario

256-bit? A bit surprising, but I guess they can pull it off, especially considering the GTX 750 Ti's performance with 128-bit being damn near the 256-bit 7850's, and the fact that they're using 7400MHz memory.

If these are the real specs, though, and AMD decides to produce another big-die Hawaii-type chip, I just can't see NVIDIA winning this round. If AMD reverts to a Tahiti- or Cypress-type die size, though, there might be some heated competition.


----------



## szeged

Quote:


> Originally Posted by *MattGordon*
> 
> Did someone say _blacker_ edition?
> 
> Spoiler: Warning: Spoiler!
> 
> I suck at editing pictures


excellent


----------



## CynicalUnicorn

Quote:


> Originally Posted by *MxPhenom 216*
> 
> Sweet, now once it releases, watch people buy it, then ill take a 780 off their hands, SLI till 2nd generation Maxwell is out, or Pascal.


This is also what I'm considering.









Quote:


> Originally Posted by *ZealotKi11er*
> 
> Will probably be ~ $500. Should be faster then GTX780 since it more cores.
> Memory should be fine really. Yes its true we need better card for $500 but this is the best we will get.


More cores != more performance. Haswell i5 vs Vishera octacore comes to mind, but these are GPUs. I suppose the 580 to 680 is a good comparison. The core count tripled, but performance did not.

Quote:


> Originally Posted by *Usario*
> 
> 256-bit? A bit surprising, but I guess they can pull it off, especially considering the GTX 750 Ti's performance with 128-bit being damn near the 256-bit 7850 and the fact that they're using 7400MHz memory.
> 
> If these are the real specs though, if AMD decides to produce another big-die Hawaii-type chip, I just can't see NVIDIA winning this round. If AMD reverts back to a Tahiti or Cypress-type die size though there might be some heated competition.


Bandwidth isn't the whole story of performance though. 750Ti and 7850/R7 265 are low-end enough that that doesn't matter too much.

Definitely, but heat will be a huge issue unless they release Pirate Islands more reasonably clocked or the node shrink helps significantly.


----------



## skupples

and people mocked us when we said they would follow the 6xx/7xx series release to a T.

OK. For serious this time... We are getting closer, so things like this will become more & more common to see. Seeing as this is one of the first "leaks" on the 880, it is likely 100% based in lalaland.

I do believe that Nvidia will follow the same format as 6xx/7xx, but I also believe the 880 will be ~20% faster than the 780Ti @ ~500-600$.


----------



## tpi2007

Sounds fishy in quite a few respects.

First, can somebody find me a source that claims faster than 7 Ghz GDDR5 is in production ? A GPU can't just ship with overclocked VRAM, GPU vendors are highly averse to doing that, most that ship overclocked GPUs do the GPU core part and leave the VRAM untouched.

Second, 20nm ? When ? All the rumours point to the fact 20nm isn't ready for high powered GPUs yet.

Third, 3200 CUDA cores ? Why ? According to Nvidia, 128 Maxwell CUDA cores have 90% of the performance of 192 Kepler CUDA cores, so a GPU with 3200 cores would have the equivalent of 50% more CUDA cores compared to the GTX 780 Ti, and be quite a bit faster than it, possibly 40%. Considering that the GTX 680 was only 19% faster (26% now) than the GTX 580 at launch, I find this hard to believe.

It would be the same as having a Kepler card with 4320 Kepler cores. Honestly, this sounds too good to be true. Either that or the scaling at the high-end isn't as good as the current Maxwell parts suggest.


----------



## SuprUsrStan

Quote:


> Originally Posted by *fleetfeather*
> 
> I feel a GTX 880 Ultra coming
> 
> #throwback


I wouldn't mind a GTX 880 Ultra if only it could live up to the hype.









The GTX 8800 was a beast.


----------



## ZealotKi11er

Quote:


> Originally Posted by *tpi2007*
> 
> Sounds fishy in quite a few respects.
> 
> First, can somebody find me a source that claims faster than 7 Ghz GDDR5 is in production ? A GPU can't just ship with overclocked VRAM, GPU vendors are highly averse to doing that, most that ship overclocked GPUs do the GPU core part and leave the VRAM untouched.
> 
> Second, 20nm ? When ? All the rumours point to the fact 20nm isn't ready for high powered GPUs yet.
> 
> Third, 3200 CUDA cores ? Why ? According to Nvidia, 128 Maxwell CUDA cores have 90% of the performance of 192 Kepler CUDA cores, so a GPU with 3200 cores would have the equivalent of 50% more CUDA cores compared to the GTX 780 Ti, and be much faster than it, more than 40%. Considering that the GTX 680 was only 19% faster (26% now) than the GTX 580 at launch, I find this hard to believe.
> 
> It would be the same as having a Kepler card with 4320 Kepler cores. Honestly, this sounds too good to be true. Either that or the scaling at the high-end isn't as good as the current Maxwell parts suggest.


I would not expect core to space like that. GTX750 Ti power improvement and small die comes partly because of 128-Bit. If they use 384-Bit on GTX880 it will be a big die.


----------



## tpi2007

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I would not expect core to space like that. GTX750 Ti power improvement and small die comes partly because of 128-Bit. If they use 384-Bit on GTX880 it will be a big die.


I doubt they will use a 384-bit memory bus, 320-bit at most. L2 cache does the rest.

Still, I don't know where they got 7.4 Ghz VRAM. Nothing higher than 7 Ghz GDDR5 exists. And I highly doubt GPU vendors and Nvidia will take the risk upon themselves to ship 'stock' GPUs with VRAM running above spec.


----------



## Baghi

Quote:


> Originally Posted by *MeanBruce*
> 
> You Rock at editing pictures.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll take that with a new Rampage 5 Extreme Black On Black Edition.
> 
> with inconspicuous unfocused undetermined unmarked helicopter behind the mobo red background LED's. yeah baby
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ...


...but, but you'll have to be a scientist to do that.


----------



## azanimefan

256bit

this thing will suck at high resolutions.


----------



## pcfoo

I find the op specs over-optimistic, and ppl who complain wanting more from a "mid range" or more correctly 1st gen maxwell product, a bit silly.

25x SMM is a "big maxwell" possibility, doubt will be on their mainstream card.


----------



## maneil99

Quote:


> Originally Posted by *azanimefan*
> 
> 256bit
> 
> this thing will suck at high resolutions.


Maxwell's caches make it more effecient in terms of memory bandwith


----------



## Arni90

I find it interesting that no one has pointed out that a 256-bit memory bus wouldn't feature 40 ROPs unless Nvidia decided to decouple the ROPs from the memory bus, which doesn't seem to be a cost-effective solution. Personally, I'm expecting closer to 20 SMMs and a 256-bit memory bus, which should outperform the 780 Ti by about 20% (20 SMMs * 90% performance / 15 SMXes = 120%)


----------



## kingduqc

if it's only 15-20% gain for the same amount of money I hope none of you buy em and they drop prices real fast.

miss the days dual gpu cards where 500$ now a single high end gpu is 750$


----------



## CynicalUnicorn

Quote:


> Originally Posted by *Arni90*
> 
> I find it interesting that no one has pointed out that a 256-bit memory bus wouldn't feature 40 ROPs unless Nvidia decided to decouple the ROPs from the memory bus, which doesn't seem to be a cost-effective solution. Personally, I'm expecting closer to 20 SMMs and a 256-bit memory bus, which should outperform the 780 Ti by about 20% (20 SMMs * 90% performance / 15 SMXes = 120%)


*cough*

Quote:


> Originally Posted by *CynicalUnicorn*
> 
> ROPs? Those are usually reliant on the backend, but I'm not entirely sure how they're determined. I think bus width has some to do with them though, and that implies that this should have 32 ROPs instead, or should have a 320-bit bus.


But yes, this is very odd. If it isn't a hoax, then







. If it is a hoax, then at least these people could be nice enough to make the specs accurate.


----------



## funkmetal

Hmm, I dont know the ins and outs of GPU tech specs but it seems that this is just a slightly faster 780Ti at stock. So by that logic a 780Ti Classy should be able to compete with it as a un overclocked card?


----------



## Blackops_2

If this is remotely true it further justifies a 780 classy for me







and means i don't have to turn around and sell it for an 880 around Q4 14 or Q1 15. Hmmm


----------



## Usario

Quote:


> Originally Posted by *Arni90*
> 
> I find it interesting that no one has pointed out that a 256-bit memory bus wouldn't feature 40 ROPs unless Nvidia decided to decouple the ROPs from the memory bus, which doesn't seem to be a cost-effective solution. Personally, I'm expecting closer to 20 SMMs and a 256-bit memory bus, which should outperform the 780 Ti by about 20% (20 SMMs * 90% performance / 15 SMXes = 120%)


I did not notice this.
I don't know what to believe now.


----------



## skupples

the specs are just schizophrenic. They wouldn't put 3200 cores on a 256 bus card.


----------



## Kinaesthetic

Quote:


> Originally Posted by *Usario*
> 
> I did not notice this.
> I don't know what to believe now.


People still to this day actually believe rumors posted here? 99% of them have been false, yet for every rumor, people take it like it is the Bible.


----------



## skupples

Quote:


> Originally Posted by *Kinaesthetic*
> 
> People still to this day actually believe rumors posted here? 99% of them have been false, yet for every rumor, people take it like it is the Bible.


such is the course of GPU release since the beginning of time.

As time moves forward the "leaks" (speculations) will become more accurate, and actually eventually turn into legit leaks.


----------



## Arni90

Quote:


> Originally Posted by *CynicalUnicorn*
> 
> *cough*
> But yes, this is very odd. If it isn't a hoax, then
> 
> 
> 
> 
> 
> 
> 
> . If it is a hoax, then at least these people could be nice enough to make the specs accurate.


Must have missed it








Most of the time, the ROPs are connected directly to the memory controllers to reduce internal bandwidth consumption in the chip and ensure maximum bandwith efficiency, I think Nvidia has been doing this for over 10 years (don't quote me on this)

I remember seeing more specs like these with non-matching ROP count to memory bus width from OBR earlier, like 24 ROPs and 256-bit memory bus or 64 ROPs with 384-bit memory bus.


----------



## Usario

Quote:


> Originally Posted by *Kinaesthetic*
> 
> People still to this day actually believe rumors posted here? 99% of them have been false, yet for every rumor, people take it like it is the Bible.


OBR is cited as the source. That man is terrifyingly accurate. I say terrifyingly because he almost always has bad news for AMD ;-;

The only time he was wrong IIRC was when he said that the GTX 680 would be a "new G80", and apparently that was only because he was under the impression that GK100 was not cancelled and that GK104 would be the 660 Ti.


----------



## i7monkey

Anyone get the feeling this is coming out in the next 2-4 months?


----------



## PostalTwinkie

Not sure I am buying this...

With Nvidia and game developers pushing for ultra high resolution gaming, the decrease in bandwidth on the memory would punch that right in the kisser.


----------



## szeged

Quote:


> Originally Posted by *i7monkey*
> 
> Anyone get the feeling this is coming out in the next 2-4 months?


no.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Redeemer*
> 
> Why not the 680 was awesome


It's performance went downhill (compared to 7970) once you started using AA even @1080p


----------



## WhyCry

I have some bad news, this leak is from September 2013.

The original thread at Baidu was removed, but I got this.. (it was a long thread with specs changing frequently).

*The original source:*


----------



## skupples

Quote:


> Originally Posted by *i7monkey*
> 
> Anyone get the feeling this is coming out in the next 2-4 months?


I think it will be revealed @ computex then released @ the end of the year.


----------



## i7monkey

Anyone also get the feeling that Nvidia releases fake specs to disappoint people then announces a monster at ridiculous prices?

All the leaks lead to a $500 mid ranger then all of a sudden BOOM! 384-bit GM100 at $799!

They must do this right?


----------



## Blackops_2

Quote:


> Originally Posted by *i7monkey*
> 
> Anyone also get the feeling that Nvidia releases fake specs to disappoint people then announces a monster at ridiculous prices?
> 
> All the leaks lead to a $500 mid ranger then all of a sudden BOOM! 384-bit GM100 at $799!
> 
> They must do this right?


Well.. maybe it's just me but a monster at ridiculous price wouldn't make me any more satisfied than a small jump mid range at 500$.


----------



## i7monkey

Quote:


> Originally Posted by *Blackops_2*
> 
> Well.. maybe it's just me but a monster at ridiculous price wouldn't make me any more satisfied than a small jump mid range at 500$.


I wouldn't be happy either. A monster should be $500 max but that's Nvidia's way of disappointing people then releasing a monster at ******ed prices which people will gobble up.


----------



## CynicalUnicorn

Quote:


> Originally Posted by *Arni90*
> 
> I remember seeing more specs like these with non-matching ROP count to memory bus width from OBR earlier, like 24 ROPs and 256-bit memory bus or 64 ROPs with 384-bit memory bus.


It is uncommon, but there are some GPUs with weird ROP/bus and VRAM arrangements. 650Tis have a 128-bit bus and 1GB or 2GB of VRAM, which is to be expected given the 16 ROPs. All VRAM can be used quite easily. Same goes for Pitcairn/Curaco when looking at AMD. 256-bit bus, 32 ROPs, and either 1GB, 2GB, or 4GB, and all of it can be used with no issue. Of course, Tahiti has the same number of ROPs and a 50% wider bus, so AMD probably deals with the back-end a bit differently.

The 650Ti BOOST edition adds 8 more ROPs, which gives it 24 ROPs and a 192-bit bus. It has 1GB and 2GB variants. The 660s, both vanilla and Ti, also have 24 ROPs and 192-bit buses, and they only have 2GB versions available. I believe some benchmarks have been done, and after the first 1.5GB, memory speed plummets. The remaining VRAM can be used, just not very well. I don't think, however, that anything strays from the "64-bits per 8 ROPs" guideline.


----------



## TheBlindDeafMute

I'll wait for the big boy Maxwell. Even if the cores are better on Maxwell, I doubt it will make a huge difference. I'm waiting for the Maxwell titan. 2 in sli and call it a day. Hopefully the cooler looks pretty boss.


----------



## Germanian

Quote:


> Originally Posted by *TheBlindDeafMute*
> 
> I'll wait for the big boy Maxwell. Even if the cores are better on Maxwell, I doubt it will make a huge difference. I'm waiting for the Maxwell titan. 2 in sli and call it a day. Hopefully the cooler looks pretty boss.


ya, this. Bigboy will be game changer 2015


----------



## i7monkey

Quote:


> Originally Posted by *TheBlindDeafMute*
> 
> I'll wait for the big boy Maxwell. Even if the cores are better on Maxwell, I doubt it will make a huge difference. *I'm waiting for the Maxwell titan. 2* in sli and call it a day. Hopefully the cooler looks pretty boss.


The Titan line is a rip off. If you don't compute just wait for the 980 3 months after for much cheaper.


----------



## Capt

Quote:


> Originally Posted by *vlps5122*
> 
> skipping maxwell for volta


skipping volta for...


----------



## maarten12100

Quote:


> Originally Posted by *Germanian*
> 
> A leak just got in regarding maxwell GTX 880 specs.
> GTX 880 is supposedly a MID RANGE CARD. We will see stronger versions later on.


Why is the price stated the almost the same as the price of the 780 Ti what have we done!


----------



## MapRef41N93W

Quote:


> Originally Posted by *Capt*
> 
> skipping volta for...


Pascal is 2016 so Volta is now what 2018?


----------



## zooterboy

Quote:


> Originally Posted by *maarten12100*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Germanian*
> 
> A leak just got in regarding maxwell GTX 880 specs.
> GTX 880 is supposedly a MID RANGE CARD. We will see stronger versions later on.
> 
> 
> 
> Why is the price stated the almost the same as the price of the 780 Ti what have we done!
Click to expand...

Same reason as the 680.


----------



## maarten12100

Quote:


> Originally Posted by *zooterboy*
> 
> Same reason as the 680.


But the 680 was 500 for a midrange chip while the 780 Ti is a 700+ card








Either way judging by it listing the memory as 12 rather than 2x6 is a bit strange but not uncommon even with us educated users.


----------



## Hms1193

Quote:


> Originally Posted by *WhyCry*
> 
> I have some bad news, this leak is from September 2013.
> 
> The original thread at Baidu was removed, but I got this.. (it was a long thread with specs changing frequently).
> 
> *The original source:*


Rightly put. The NVIDIA Maxwell specs are old speculation off from Chinese forums. The source (PCTuning OBR) goes on to state that the card will feature a single 6-Pin connector while they list 230W under the specification chart.
Quote:


> GTX 880 is said to have only a single 6-pin power connector, TDP has to be somewhere on the level GTX 760 performance over the GTX 780 Ti. PCB cards to be small, cheap, simple (remember the GTX 680).


This just like the R300 Series post is speculation and information derived from various sites/forums. Nothing concrete is known regarding the upcoming chips. In short, the term leaked to be used for these posts is plain wrong, its all speculation from a bunch of people on Chinese sites.


----------



## MattGordon

Quote:


> Originally Posted by *maarten12100*
> 
> Why is the price stated the almost the same as the price of the 780 Ti what have we done!


At this point we allowed it to happen. If this rumor is true then we have dug our own grave.

There's little justification on releasing at such a steep price point for a "midrange" card.


----------



## Usario

Quote:


> Originally Posted by *CynicalUnicorn*
> 
> It is uncommon, but there are some GPUs with weird ROP/bus and VRAM arrangements. 650Tis have a 128-bit bus and 1GB or 2GB of VRAM, which is to be expected given the 16 ROPs. All VRAM can be used quite easily. Same goes for Pitcairn/Curaco when looking at AMD. 256-bit bus, 32 ROPs, and either 1GB, 2GB, or 4GB, and all of it can be used with no issue. Of course, Tahiti has the same number of ROPs and a 50% wider bus, so AMD probably deals with the back-end a bit differently.
> 
> The 650Ti BOOST edition adds 8 more ROPs, which gives it 24 ROPs and a 192-bit bus. It has 1GB and 2GB variants. The 660s, both vanilla and Ti, also have 24 ROPs and 192-bit buses, and they only have 2GB versions available. I believe some benchmarks have been done, and after the first 1.5GB, memory speed plummets. The remaining VRAM can be used, just not very well. I don't think, however, that anything strays from the "64-bits per 8 ROPs" guideline.


Difference with those cards is that the ROP and bus still work together.

40 is just not divisible into 256 at all.


----------



## CynicalUnicorn

Right, which I said earlier: either it really is 256-bit and has 8 fewer (32) ROPs than listed, or it does have all 40 ROPs and the bus is 64-bit too narrow (320-bit). The "leak" just doesn't add up.


----------



## Jimhans1

Quote:


> Originally Posted by *Usario*
> 
> OBR is cited as the source. That man is terrifyingly accurate. I say terrifyingly because he almost always has bad news for AMD ;-;
> 
> The only time he was wrong IIRC was when he said that the GTX 680 would be a "new G80", and apparently that was only because *he was under the impression that GK100 was not cancelled and that GK104 would be the 660 Ti.*


GK104 was supposed to BE the GTX660(non-Ti) it was only the dismal initial performance of the 7970 that allowed nVidia to make it the GTX680. GK110 SHOULD have been a GTX680, and it should have been an MSRP of $499-549.


----------



## skupples

Quote:


> Originally Posted by *TheBlindDeafMute*
> 
> I'll wait for the big boy Maxwell. Even if the cores are better on Maxwell, I doubt it will make a huge difference. I'm waiting for the Maxwell titan. 2 in sli and call it a day. Hopefully the cooler looks pretty boss.


I'm TRYING to skip maxwell, assuming Denver truly is delayed until Volta/Pascal. I figure tri-sli-titans @ 1300mhz should power through pretty much what ever I need it to, @ near ultra settings well into 2016. If things get hairy in SC i'll just buy a fourth, used, EVGA titan ~500$.
Quote:


> Originally Posted by *Jimhans1*
> 
> GK104 was supposed to BE the GTX660(non-Ti) it was only the dismal initial performance of the 7970 that allowed nVidia to make it the GTX680. GK110 SHOULD have been a GTX680, and it should have been an MSRP of $499-549.


Lets hope AMD isn't dragging their heals this time around. I'm tired of Nvidia having no competition for 6 months @ a time.


----------



## TFchris

Seeing how Nvidia cards follow the names of past famous scientists... and GTX TITAN is named after a moon of Saturn, I'm guessing the next "Titan" will be of the cosmic naming scheme.

GTX EUROPA? lol

Going by the next baddest moon names in the solar system, I can only assume it will be named either one of these four.

GTX Hyperion
GTX Calypso
GTX Atlas
GTX Callisto


----------



## Robertdt

When game companies and / or GPU manufacturers actually release something good enough to pique my interest in a faster card...


----------



## Kinaesthetic

Quote:


> Originally Posted by *Robertdt*
> 
> When game companies and / or GPU manufacturers actually release something good enough to pique my interest in a faster card...


I hear ya. My machine is essentially a League/Hearthstone machine, because there just aren't many good games out there on the market right now.


----------



## Thetbrett

if it brings the price down for ti's, then great. I can't stomach the $200AU markup we have to deal with here.


----------



## skupples

Quote:


> Originally Posted by *TFchris*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Seeing how Nvidia cards follow the names of past famous scientists... and GTX TITAN is named after a moon of Saturn, I'm guessing the next "Titan" will be of the cosmic naming scheme.
> 
> GTX EUROPA? lol
> 
> Going by the next baddest moon names in the solar system, I can only assume it will be named either one of these four.
> 
> GTX Hyperion
> GTX Calypso
> GTX Atlas
> GTX Callisto


Atlas is already taken. I believe it is the pet name for the full 15SMX core the 780ti/titan black uses.


----------



## Nexo

I want to this come out.


----------



## nepas

Quote:


> Originally Posted by *vlps5122*
> 
> skipping maxwell for volta


Good luck with that seeing as "Volta" no longer exists


----------



## DizZz

My biggest hope for the GTX 8xx series is reduced power consumption - the future is looking bright!


----------



## maneil99

Quote:


> Originally Posted by *nepas*
> 
> Good luck with that seeing as "Volta" no longer exists


Got swtiched with Pascal on the roadmap, it will be out 2018 or whenver pascal was gonna come out actually.


----------



## Bloodbath

MEH! I've got three good Titans I'm good till Volta. Such incremental gains wouldn't even be worth draining my loop.


----------



## zealord

Quote:


> Originally Posted by *Bloodbath*
> 
> MEH! I've got three good Titans I'm good till Volta. Such incremental gains wouldn't even be worth draining my loop.


let's see what you think when Maxwell refresh launches


----------



## LaBestiaHumana

Quote:


> Originally Posted by *Bloodbath*
> 
> MEH! I've got three good Titans I'm good till Volta. Such incremental gains wouldn't even be worth draining my loop.


I'm with you on that one.


----------



## Bloodbath

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> I'm with you on that one.


I really cant see any games coming out at least within the next 1-2 years that will even make these things break a sweat.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *Bloodbath*
> 
> I really cant see any games coming out at least within the next 1-2 years that will even make these things break a sweat.


Yeah, I'm at 1080p, and the "benchmark" games are usually a one time play through, the "addictive" games I play can be run at 600mhz with my titans at 1080p 120hz.
I'm getting a 1440p Asus Swift Gsync Monitor as soon as it is released. Even that, Titans in SLI should be fine for another year.


----------



## zyezye

256 bit







wat.


----------



## Blackops_2

Quote:


> Originally Posted by *Bloodbath*
> 
> I really cant see any games coming out at least within the next 1-2 years that will even make these things break a sweat.


Well it depends..if your a single monitor user running 120hz-144hz, at the same time you like to run full eye candy you'll likely need two of the current generation cards to do so. Though i just like having the ability to go above 60fps, so i compromise with my 7970 and a couple of games to reach that 100fps mark.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *Blackops_2*
> 
> Well it depends..if your a single monitor user running 120hz-144hz, at the same time you like to run full eye candy you'll likely need two of the current generation cards to do so. Though i just like having the ability to go above 60fps, so i compromise with my 7970 and a couple of games to reach that 100fps mark.


Some games look fantastic even at med settings. For fast paced games who has time to admire the details on the flowers by the pond? lol

Two 7970s or 680s can still power up a 1440p nicely.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Blackops_2*
> 
> Well it depends..if your a single monitor user running 120hz-144hz, at the same time you like to run full eye candy you'll likely need two of the current generation cards to do so. Though i just like having the ability to go above 60fps, so i compromise with my 7970 and a couple of games to reach that 100fps mark.


Seriously there is no point on dropping 1K in GPUs for 8 Hour game that might need the GPU power.

Also going from High graphics to Ultra is like 10-15% change in QI for 50%-100% performance hit.

Compromise is a smart thing to do and wait for prices to come down.

If priced keep going up i will just wait and get a 290X for $300 a year from now.


----------



## skupples

I really think they are going to reuse the 6xx/7xx release
Quote:


> Originally Posted by *zyezye*
> 
> 256 bit
> 
> 
> 
> 
> 
> 
> 
> wat.


Hello! Welcome to the early GPU speculation thread: Type B. Take what you read with a massive grain of salt. Hell, just stick the whole salt lick in your mouth then start chewing.


----------



## SuprUsrStan

Quote:


> Originally Posted by *zealord*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bloodbath*
> 
> MEH! I've got three good Titans I'm good till Volta. Such incremental gains wouldn't even be worth draining my loop.
> 
> 
> 
> let's see what you think when Maxwell refresh launches
Click to expand...

It's ironic but the first adopters of the original Titan actually got quite a bit of use out of their video cards. Looking back, the GTX 780's and Titans were amazing investments considering how they were the king of the hill for much longer than they should have been and will be still revenant for quite a bit of time.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *Syan48306*
> 
> It's ironic but the first adopters of the original Titan actually got quite a bit of use out of their video cards. Looking back, the GTX 780's and Titans were amazing investments considering how they were the king of the hill for much longer than they should have been and will be still revenant for quite a bit of time.


Yep, when they were released, they blew everything out of the water. It came with a nice price tag, but if you ask me, it was totally worth it. When Team Skyn3t released bioses and afterburner tweaks, I loved my cards even more.


----------



## Swolern

256 bit bus is a freaking joke. Sure hope thats not true!
Quote:


> Originally Posted by *Bloodbath*
> 
> MEH! I've got three good Titans I'm good till Volta. Such incremental gains wouldn't even be worth draining my loop.


Haha, hilarious!!
Quote:


> Originally Posted by *Bloodbath*
> 
> I really cant see any games coming out at least within the next 1-2 years that will even make these things break a sweat.


Star Citizen is going to be a GPU killer! Nothing 3 Titans cant run though.


----------



## Sunreeper

Quote:


> Originally Posted by *cravinmild*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ha-Nocri*
> 
> lol, 256-bit. Hoping NV aren't playing this game again.
> 
> 
> 
> Nvidia needs to be careful, if they throw in everything they got into the first edition then how the heck will they be able to offer a dozen slightly more powerful card each week for the next twelve months. I love marketing
Click to expand...

When this comes out

Nvidia: Introducing the gtx 880 the fastest graphics card on the planet! For the small price of $699

One week later

Nvidia: Introducing the titan x the fastest graphics card on the planet! For only $1000

Another week later

Nvidia: Introducing the gtx 880 black edition the fastest graphics card on the planet. Now only for $799!!!!


----------



## xlink

and my 660Ti is still doing just fine and it's about half as powerful as the 780Ti at a third the price...

Reminds me of the GeForce 6600 vs. 6800 performance difference... except I got my card earlier at less than half the cost and if I sacrifice on AA or res the actual real world impact isn't as significant.


----------



## SuprUsrStan

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Syan48306*
> 
> It's ironic but the first adopters of the original Titan actually got quite a bit of use out of their video cards. Looking back, the GTX 780's and Titans were amazing investments considering how they were the king of the hill for much longer than they should have been and will be still revenant for quite a bit of time.
> 
> 
> 
> Yep, when they were released, they blew everything out of the water. It came with a nice price tag, but if you ask me, it was totally worth it. When Team Skyn3t released bioses and afterburner tweaks, I loved my cards even more.
Click to expand...

I spent just as much money on my three GTX 780's at release as two Titans when you count the waterblocks and everything. I don't regret that purchase for a moment even after they dropped the price on those cards.


----------



## skupples

Quote:


> Originally Posted by *Syan48306*
> 
> It's ironic but the first adopters of the original Titan actually got quite a bit of use out of their video cards. Looking back, the GTX 780's and Titans were amazing investments considering how they were the king of the hill for much longer than they should have been and will be still revenant for quite a bit of time.


My Titans are still flying. Your average stock bios, air cooled titan is quite the slug now, but they have so much potential when you slap a block on them, change the bios, & hack the voltage. I'm running 3x @ 1300/7ghz 24/7 (can go higher for benching) which swings with 780ti all day long. Remember, the difference between the two is minimal. 128 cores and a few TMU. That KPE though, wooooo those things fly.

We got flamed to high heaven back when our cards started showing up ~14 months ago. People flamed the Titan club for weeks & weeks due to the price point. Some of the major fud flingers have spent well over $1000 on GPUs since we purchased our titans @ release. At this point in the game I feel like it was a really good purchase. They will provide me with max settings for @least another year.

Star Citizen is going to be the only title that makes my system cry, but that's only because i'm running in surround. There will also be server side performance issues that we will have no control over.


----------



## Popple

Quote:


> Originally Posted by *Ha-Nocri*
> 
> memory bus plays a huge part here


I see.


----------



## KenjiS

Hopefully this will drive the price of the 770 down, so I can get a second one and try SLI for the first time.


----------



## Luciferxy

Quote:


> Originally Posted by *KenjiS*
> 
> Hopefully will drive the price on the 770 down, so i can get a second and try SLI for the first time


with the 337.50 driver, you will be flying with SLI 770


----------



## KenjiS

Quote:


> Originally Posted by *Luciferxy*
> 
> with the 337.50 driver, you will be flying with SLI 770


Heck I noticed a good improvement in a few titles on a single 770.

Here's hoping they get Rome II's SLI texture glitches sorted out... that's kinda my problem ATM, I like games that don't like SLI D:


----------



## subyman

I'll take a few moments to slide the Core Clock bar over a few notches on my 780 and save $650.


----------



## kx11

mid-range GPU ?!!


----------



## Chrono Detector

Why would you wanna take the x80 name and call it a mid-range card? This kind of reminds me of the 9800 GTX.


----------



## szeged

680 is midrange when compared to 780s, but when it's the best thing out at the time, it's not midrange.


----------



## GorbazTheDragon

Quote:


> Originally Posted by *Ha-Nocri*
> 
> lol, 256-bit. Hoping NV aren't playing this game again.


This is only the card that will take the place of the GK104 in the Maxwell lineup; a GM104 with a 384-bit bus would be overkill, and it is only logical to release the larger chips later on... They are much harder to produce, and knowing how the architecture behaves from the other products is a huge advantage when scaling it up.
Quote:


> Originally Posted by *subyman*
> 
> Oh boy, 680 all over again?


Yes it is, and I'm glad they are not going straight to the limit... 7.9 billion transistors is a huge count for a mid-range chip, and making the larger ones is only going to be harder for TSMC... Just look at the fact that it has double the shaders of the GK104. It should be priced at around double the 770, if not a bit more, to make up for the fact that it should be significantly more power efficient.

Overall, I think the architecture release cycle is slowing due to the difficulties of moving to new processes and the simple fact that, for a large part of the market, there is not much demand for the high-end cards...


----------



## zealord

Quote:


> Originally Posted by *szeged*
> 
> 680 is midrange when compared to 780s, but when it's the best thing out at the time, it's not midrange.


That is true. I am a bit anxious, though, about how Nvidia is going to play Maxwell.

GTX 880, then GTX Titanwell, then GTX 980, then GTX 980 Ti, then GTX Titanwell Black?

I guess the most important question (at least for me) will be: how big is the gap between the release of the GM204 (GTX 880) and the big boys?


----------



## sherlock

Quote:


> Originally Posted by *zealord*
> 
> That is true. I am a bit anxious though how Nvidia is going to play Maxwell
> 
> GTX 880 then GTX Titanwell then GTX 980 then GTX 980 Ti then GTX Titanwell Black ?
> 
> I guess the most important question (atleast for me) then will be : how big is the gap between the release of the GM204(GTX880) and the big boys?


Judging by precedent (GTX 680 -> Titan), you are looking at an 11-month wait.


----------



## GorbazTheDragon

Quote:


> Originally Posted by *sherlock*
> 
> Judging by precedent (GTX 680 -> Titan), you are looking at an 11-month wait.


Sounds about right, but it would really depend on how much bigger Nvidia wants to make the GM110... If they are looking to do something almost 2x as big as the GM104, it could be significantly longer, especially considering the die sizes we are looking at with this iteration of the 880.

I'd actually be surprised if they go for anything more than 5k CUDA cores with the GM110 as it seems like that would require a stupidly large die (I'm talking over 600mm^2)

Edit: Not to mention the power draw of one of those things... who's up for a 400w chip


----------



## Callist0

I'm not falling for this again...*has a 680*


----------



## GorbazTheDragon

Quote:


> Originally Posted by *Callist0*
> 
> I'm not falling for this again...*has a 680*


Don't. If you are the type to wait for more mature products, do so... Just keep in mind that they evolve at a relatively constant rate and that the price for the same thing drops at a relatively constant rate... If you wait until the GTX 980 Ti (full GM110) you might find yourself wanting whatever architecture is next...


----------



## Callist0

Quote:


> Originally Posted by *GorbazTheDragon*
> 
> Don't if you are one to wait for more mature products do so... Just keep in mind that they evolve at a relatively constant rate and that the price for the same thing goes down at a relatively constant rate... If you wait until the GTX 980Ti (full GM110) you might find yourself wanting whatever architecture is next...


I typically don't wait for the next best thing; as you said, these things are always improving. However, the 680 is a pretty mediocre card compared to the 590 (even though it's dual-GPU) and especially the 780 and 780 Ti.

It works great and looks great in my rig, but Kepler 2.0 was definitely worth the wait.


----------



## abirli

I was waiting for the next big thing from the 8800 GTX, then upgraded to Titans. The waiting game is almost never-ending; there's always better stuff right around the corner.


----------



## skupples

Another day... another attempt at believing it has 3200 cores; another attempt failed.


----------



## fateswarm

I give it a 90% probability that this is made up. Someone trying to see how manufacturing fake news is done. It's easy: be predictable and people will believe it's possible.

But I think it's TOO early to know. Yes, TSMC already makes 20nm processors for Apple, and yes, they can probably already make a small GPU, but I doubt NVIDIA has leaked anything at that level of detail yet. At most, they are probably at the stage where they have handed the blueprints to TSMC and are waiting their turn in the queue.

In all likelihood the GPUs will be released in October at the earliest, and the announcements will come mid-summer or later.

Speculating, it could well be like that: an underwhelming model at first. It's hard to shrink the transistor cheaply nowadays. But it's also possible the liar went too far with predictability: this is too underwhelming to even care about. It's possible the first model will be more exciting. 20nm gives *30% more transistors* per die size while performance per transistor stays similar, but GPUs easily absorb more cores without a performance penalty, so I would expect at least 30% more transistors and roughly at least 30% more performance in the first models.

But wait, the first would likely be a smaller die, so that even sounds optimistic. We would expect only around 5 billion transistors on the first die: 30% more than the last small one.

So, to get 7.9 billion you'll need a die that is, according to my Excel math, *exactly the same size as the 290: 438mm²*.

i.e. someone is playing a nerdy joke on us. Or NVIDIA copied the 290's size. Or it's a devilish coincidence.
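For anyone who wants to check the napkin math, here's a quick sketch of the same estimate. The 28nm baseline (Hawaii/290: ~6.2 billion transistors on a 438mm² die) and the 30% density gain for 20nm are assumptions carried over from this post, not official figures:

```python
# Back-of-envelope die-size estimate for a 7.9B-transistor 20nm chip.
# Assumptions (from the post above, not official): Hawaii as the 28nm
# density baseline, and 20nm packing ~30% more transistors per mm^2.
HAWAII_TRANSISTORS = 6.2e9  # R9 290/290X, ~6.2 billion transistors
HAWAII_DIE_MM2 = 438        # Hawaii die size in mm^2

density_28nm = HAWAII_TRANSISTORS / HAWAII_DIE_MM2  # transistors per mm^2
density_20nm = density_28nm * 1.30                  # claimed 20nm gain

rumored_transistors = 7.9e9
estimated_die = rumored_transistors / density_20nm
print(f"Estimated die size: {estimated_die:.0f} mm^2")
```

Run it and the estimate lands within a couple of percent of Hawaii's 438mm², which is the "devilish coincidence" above.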


----------



## StreekG

This will be interesting. I fell for the 680; while it was a nice card, I had 2 Lightnings in SLI. When the 780 Ti results came out, I upgraded to one and got the same performance with less power consumption and heat... I just wonder whether Nvidia will actually pull the same trick and this really turns out to be a mid-range card, while AMD brings out its best with the R390X series competing against a mid-range GTX 880.
I remember right before Titan came out, sources at AMD didn't actually believe it and thought Big Kepler was all rumours. AMD, however, had price on their side.

My hope is that Nvidia keeps the price a bit more fair this time. I paid $950 AUD for my overclocked 780Ti.

Over the years I've always switched between Nvidia and ATi/AMD, but Nvidia has been the better experience for me in the last few years. The only thing they mess up is always short-changing us on VRAM compared to AMD.


----------



## HeadlessKnight

A 256-bit bus for $750? Really? I hope AMD spanks it if that is anywhere near true...


----------



## Cakewalk_S

This will be interesting to see the temperature difference with the transistor density...


----------



## skupples

We already know that easily cracking voltage on ref models will never happen again.


----------



## HeadlessKnight

Quote:


> Originally Posted by *skupples*
> 
> We already know that easily cracking voltage on ref models will never happen again.


Yep, since what happened with the Titan Black and 780 Ti. Looks like custom BIOSes with 1.5V or 1.3V, like we had on the Titan 1 / 780 non-Ti, will sadly be beyond reach with these GPUs...


----------



## Ghoxt

With the entire landscape included and blinders off, with DX12 and GMxxx chips, we should see incredible performance jumps "IF" the planets align in mid to late 2014.

Is this the 1000 year Storm, the Magical Equinox when all parallel worlds are in alignment?

DX12, Maxwell, and finally a worthy enthusiast new chip from Intel?


----------



## fateswarm

Quote:


> Originally Posted by *StreekG*
> 
> This will be interesting. I fell for the 680; while it was a nice card, I had 2 Lightnings in SLI. When the 780 Ti results came out, I upgraded to one and got the same performance with less power consumption and heat... I just wonder whether Nvidia will actually pull the same trick and this really turns out to be a mid-range card, while AMD brings out its best with the R390X series competing against a mid-range GTX 880.
> I remember right before Titan came out, sources at AMD didn't actually believe it and thought Big Kepler was all rumours. AMD, however, had price on their side.
> 
> My hope is that Nvidia keeps the price a bit more fair this time. I paid $950 AUD for my overclocked 780 Ti.
> 
> Over the years I've always switched between Nvidia and ATi/AMD, but Nvidia has been the better experience for me in the last few years. The only thing they mess up is always short-changing us on VRAM compared to AMD.


It's not trolling. To make the 780 you need better 28nm yields, and at first those were too bad. Not only are the dies bigger, they fail more often in manufacturing, so NVIDIA's cost must be roughly double at least. The yields never became perfect, I suppose, and I doubt TSMC designed their fabs with that die size in mind. NVIDIA is probably the customer with the biggest die. You need a humongous die, you're gonna pay. It's not their fault; people demand larger and faster, and that's the only cost-effective-ish way. Unless you buy Titan.


----------



## GorbazTheDragon

Quote:


> Originally Posted by *Callist0*
> 
> I typically don't wait for the next best thing, as you said these things are always improving. However, the 680 is a pretty mediocre card compared to the 590 (even though its dual GPU) and especially the 780 and 780Ti.


Well, comparing the 680 and 590 would be more or less like comparing the 560 Ti and 295. But IMO you can't compare it like that, because the price does not scale linearly with die size (or chip power draw), and there are many other factors.


----------



## skupples

I just want stacked RAM and Denver. Maxwell has neither, thus it is going to take huuuuuge performance gains to get me off tri-Titan any time soon.


----------



## rationalthinking

Quote:


> A leak just got in regarding maxwell GTX 880 specs.
> GTX 880 is supposedly a MID RANGE CARD. We will see stronger versions later on.


Are ppl not seeing this note?


----------



## skupples

Quote:


> Originally Posted by *rationalthinking*
> 
> Are ppl not seeing this note?


That is the main topic of discussion...


----------



## GorbazTheDragon

Quote:


> Originally Posted by *rationalthinking*
> 
> Are ppl not seeing this note?


The problem is that people don't realize that, to get the performance they expect from the high-end cards, the mid-range cards have to be around $500+. Knowing NVidia, they have probably set the price somewhat higher and have a decent margin to pull it down later, but until production costs for 20nm GPU chips come down, we will have trouble getting GM104 at 2012 680 prices.

The price brackets will keep shifting like they are, not because NVidia wants your money, but because of the cost of production.


----------



## subyman

Quote:


> Originally Posted by *GorbazTheDragon*
> 
> Just look at the fact that it has double the shaders of the GK104. It should be priced at around double the 770 if not a bit more to make up for the fact that it should be significantly more power efficient.


No, it shouldn't. If we had followed the price/performance chart linearly upward with each new card released, we'd have million-dollar graphics cards. The whole point of a new generation is to get the cost per transistor down with a new node or architecture enhancements, and hence increase performance per dollar.


----------



## subyman

Quote:


> Originally Posted by *rationalthinking*
> 
> Are ppl not seeing this note?


When did $650 become a "mid-range" GPU?


----------



## Capt

Quote:


> Originally Posted by *subyman*
> 
> When did $650 become a "mid-range" GPU?


When NV announced the Titan Z


----------



## Cakewalk_S

Delaying having my first child so I can get this 880...


----------



## Caples

As expected, AMD's lack of a competitive product is allowing NV to push a neutered GPU onto the market. I think I'll be holding on to my 290s until we get a full Maxwell.


----------



## sugarhell

Quote:


> Originally Posted by *Caples*
> 
> As expected, AMD's lack of a competitive product is allowing NV to push a neutered GPU onto the market. I think I'll be holding on to my 290s until we get a full Maxwell.


After the GTX 480 I don't think Nvidia is gonna release a big die first again. They will go the same route as the GTX 680, then they'll release the big guns.


----------



## skupples

Quote:


> Originally Posted by *Capt*
> 
> When NV announced the Titan Z


You know they have been making GPUs that cost more than 3k forever... right?


----------



## sugarhell

Quote:


> Originally Posted by *skupples*
> 
> You know they have been making GPUs that cost more than 3k for ever... Right?


Tesla/Quadros?


----------



## Caples

Quote:


> Originally Posted by *sugarhell*
> 
> After the GTX 480 I don't think Nvidia is gonna release a big die first again. They will go the same route as the GTX 680, then they'll release the big guns.


Indeed. I really don't know that I ever expected them to release the full chip. I was just really, really hoping we could avoid that this time.


----------



## rationalthinking

Quote:


> Originally Posted by *subyman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rationalthinking*
> 
> Are ppl not seeing this note?
> 
> 
> 
> When did $650 become a "mid-range" GPU?
Click to expand...

Since Titans launched nearly 1.5yrs ago.


----------



## szeged

1.2 years sir.


----------



## Capt

Quote:


> Originally Posted by *skupples*
> 
> You know they have been making GPUs that cost more than 3k for ever... Right?


I'm not talking about professional workstation graphics cards such as Quadros. You clearly missed the joke.


----------



## rationalthinking

Quote:


> Originally Posted by *szeged*
> 
> 1.2 years sir.


It feels like forever nonetheless.


----------



## skupples

Quote:


> Originally Posted by *sugarhell*
> 
> Tesla/Quadros?


never heard of em.

The Titan Z boggles my mind... advertised toward commercial use at the reveal, but running on the GeForce stack.


----------



## sugarhell

Quote:


> Originally Posted by *skupples*
> 
> never heard of em.
> 
> Titan z Boggles my mind... Advertising towards commercial use per the reveal but running on geforce stack.


It's a GeForce part with DP, but with no ECC and no pro drivers, priced as a pro card.


----------



## skupples

Quote:


> Originally Posted by *sugarhell*
> 
> Its a geforce part with dp with no ecc and pro drivers priced as a pro card


That is why it blows my mind.


----------



## Exilon

Quote:


> Originally Posted by *subyman*
> 
> No, it shouldn't. If we had followed the price/performance chart linearly upward with each new card released, we'd have million-dollar graphics cards. The whole point of a new generation is to get the cost per transistor down with a new node or architecture enhancements, and hence increase performance per dollar.


28nm is looking to be the last node where per-transistor costs rapidly fall below the breakeven point. Sure they can pack more performance into a GPU, but costs are going up too.


----------



## subyman

Quote:


> Originally Posted by *rationalthinking*
> 
> Since Titans launched nearly 1.5yrs ago.


No, no, no, that's ultra-high-end stupidity. High end is still the 780. Mid-range has always been the 770-like cards, while mid-low = 760, low = 750. Nvidia added a new category of "dummy money-grab GPUs." On a bell curve of sales, Titan would be like 4 standard deviations out.


----------



## subyman

Quote:


> Originally Posted by *Exilon*
> 
> 28nm is looking to be the last node where per-transistor costs rapidly fall below the breakeven point. Sure they can pack more performance into a GPU, but costs are going up too.


Then we'd better be ready to never upgrade. The average consumer can't follow Nvidia into the $800+ range, and they won't upgrade without a performance increase.


----------



## strong island 1

Quote:


> Originally Posted by *Caples*
> 
> Indeed. I really don't know that I ever expected them to release the full chip. I was just really, really hoping we could avoid that this time.


I feel like we are on to them now, though. I definitely will not be buying the 680-equivalent part this time around and will wait it out for the bigger guys. I regretted buying 680s once the 780s and Titans came out.


----------



## subyman

Quote:


> Originally Posted by *strong island 1*
> 
> I feel like we are on to them now though. I definitely will not be buying the 680 equal part this time around and will wait it out for the bigger guys. I regretted buying 680's once the 780's and titans came out.


Same here. I bought the 680 on launch day (had a 470) but regretted it after a few weeks. Ended up selling it for a 7970 due to the 7970's overclocking potential on water.


----------



## erocker

Quote:


> Originally Posted by *subyman*
> 
> No no no, thats ultra high end stupidity. High end is still 780. Mid-range has always been the 770-like cards while mid-low = 760, low = 750. Nvidia added a new category of "dummy, money grab GPUs." If looking at a bell-curve of sales, Titan would be like 4 standard deviations out.


Oh so true.


----------



## skupples

Quote:


> Originally Posted by *strong island 1*
> 
> I feel like we are on to them now though. I definitely will not be buying the 680 equal part this time around and will wait it out for the bigger guys. I regretted buying 680's once the 780's and titans came out.


awww yeah!! Strong get dat editor status!

I will be skipping first run maxwell, and likely even skipping second run maxwell unless it comes with Denver.


----------



## cravinmild

Our power company raised rates 28% last month. Using their monitoring to track usage, my PC is the highest power consumer in the house now... by far. I can see $1/h when gaming. These new cards have to cut power by half or we can't afford to run them. Good thing GPU prices are at an all-time low.


----------



## Imouto

Quote:


> Originally Posted by *cravinmild*
> 
> Our power company raised rates 28% last month. Using their monitoring to track usage, my PC is the highest power consumer in the house now... by far. I can see $1/h when gaming. These new cards have to cut power by half or we can't afford to run them. Good thing GPU prices are at an all-time low.


With a crazy 1000W consumption you would need $1/kWh rates, and I don't think there's a single place like that, or even close.
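Just to put numbers on that (a hypothetical 1000W full-system draw and a roughly typical $0.15/kWh residential rate, both assumed for illustration):

```python
# Hourly electricity cost = power draw (kW) x rate ($/kWh).
power_kw = 1.0       # hypothetical 1000W full-system gaming load
rate_per_kwh = 0.15  # hypothetical residential rate in $/kWh

cost_per_hour = power_kw * rate_per_kwh
print(f"${cost_per_hour:.2f} per hour of gaming")
```

Even a full kilowatt of draw works out to around $0.15/h at that rate; $1/h would indeed require a $1/kWh tariff.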


----------



## MaxFTW

Lolno

I find the 230W hard to believe, seeing as they want Maxwell to be more power efficient.


----------



## Alatar

Quote:


> Originally Posted by *sugarhell*
> 
> Its a geforce part with dp with no ecc and pro drivers priced as a pro card


Titans are a mix of consumer and pro cards: they get some pro features, lack others, and are priced lower than pro GPUs.


----------



## fateswarm

> Evil NVIDIA, AMD good

The /facepalm is intensifying. Don't you see AMD also had to make a smaller die for the 290s compared to NVIDIA's, and still charged almost exactly what NVIDIA was asking for the 780s per die size? Leave Titan out of it; that's obviously a luxury product line that is indeed overpriced.

Why would AMD do that? Why wouldn't they make a bigger die and beat the GK110s? They obviously did it because manufacturing is expensive. The TSMC price is high.

Get over it and wake up. It's harder to shrink the transistor nowadays. The only alternative is that AMD and NVIDIA are a cartel, or TSMC is ripping them off, or similar, but I doubt it.


----------



## Mand12

How dare Lamborghini charge so much when I can get a used civic for peanuts! They're such jerks.


----------



## rationalthinking

Quote:


> Originally Posted by *Mand12*
> 
> How dare Lamborghini charge so much when I can get a used civic for peanuts! They're such jerks.


Pay for what you get.


----------



## Caples

Quote:


> Originally Posted by *strong island 1*
> 
> I feel like we are on to them now though. I definitely will not be buying the 680 equal part this time around and will wait it out for the bigger guys. I regretted buying 680's once the 780's and titans came out.


Don't get me wrong, I really liked my 670s and used them over my 7970s for a long time. However, once is enough and the practice is just annoying after being teased with such a potentially amazing architecture.


----------



## rationalthinking

Quote:


> Originally Posted by *Caples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *strong island 1*
> 
> I feel like we are on to them now though. I definitely will not be buying the 680 equal part this time around and will wait it out for the bigger guys. I regretted buying 680's once the 780's and titans came out.
> 
> 
> 
> Don't get me wrong, I really liked my 670s and used them over my 7970s for a long time. However, once is enough and the practice is just annoying after being teased with such a potentially amazing architecture.
Click to expand...

Yeah, but why wait almost a year for the complete architecture? If the cut-down version being released is better than the last architecture, why wait? My 670s, 680 and 690 were amazing compared to the 560 Ti and 570s I used from the previous generation.

The great thing about CPUs and GPUs is that they hold relatively good resale value. Usually you can recoup 2/3 of the cost.


----------



## Caples

Quote:


> Originally Posted by *rationalthinking*
> 
> Yeah, but why wait almost 1yr for the complete architecture. If the cut down version being released is better than the last architecture, why wait? My 670s, 680 and 690 had been amazing compared to the 560 Ti and 570s I had used from the previous generation.
> 
> The great thing about CPUs and GPUs is that they relatively hold good resale value. Usually you can recoup 2/3 of the cost.


I'm already running two 290s. This is the first year in a long time I do not plan on making any changes to my system outside of adding more storage. There's just no need to.


----------



## Redeemer

So Nvidia is pulling a GK104 all over again with Maxwell


----------



## GHADthc

Hmmm, so the GTX 680 strategy all over again, until AMD compels Nvidia and the price of 20nm wafers comes down... Good thing I just sold my GTX 670s and bought a 290X with the money I made off them (scored it for $465 AUD... not bad). I think this stand-in card will do me well enough until we see proper 20nm high-end cards release.


----------



## Mr357

Quote:


> Originally Posted by *Ha-Nocri*
> 
> lol, 256-bit. Hoping NV aren't playing this game again.


My thoughts exactly


----------



## i7monkey

Quote:


> The price brackets will keep shifting like they are not because NVidia wants your money, but because of the cost of production.


Am I delusional, or is tech supposed to get cheaper, not more expensive?

Seems like every other tech sector is getting cheaper, so why is the GPU segment so screwed?


----------



## Jimhans1

Quote:


> Originally Posted by *i7monkey*
> 
> Am I delusional or is tech supposed to get cheaper not more expensive?
> 
> Seems like every other tech sector is getting cheaper why is the GPU segment so screwed?


Because we, the people buying them, allowed it. It started with the folks buying a GK104 GTX680 and went downhill from there.


----------



## Oubadah

Quote:


> Originally Posted by *Jimhans1*
> 
> Because we, the people buying them, allowed it. It started with the folks buying a GK104 GTX680 and went downhill from there.


I often wondered if I was mad, being probably one of the only people who refused to buy the 680 'on principle'.


----------



## Jimhans1

Quote:


> Originally Posted by *Oubadah*
> 
> I often wondered if I was mad, being probably one of the only people who refused to buy the 680 'on principle'.


Same here. I held out till the 780 Ti, and only then because I'm a retailer and only pay wholesale for it; they are still overpriced.


----------



## Xyxox

Like I said earlier in the thread, the best option for me is to drop a 3770K into my sig rig and replace the GTX 560 Tis with a pair of 6GB GTX 780s or even GTX 780 Tis, then hold out for the GM110 in whatever they end up calling the high end.


----------



## KenjiS

Quote:


> Originally Posted by *i7monkey*
> 
> Am I delusional or is tech supposed to get cheaper not more expensive?
> 
> Seems like every other tech sector is getting cheaper why is the GPU segment so screwed?


It does, in a way...

Last year's x80 is an x70 today; an x60 gives you the performance of an x80 from two generations ago.

Flagship smartphones/tablets/etc. are exactly the same; it's not just GPUs...

You're still going to pay a lot for the best performance. When has that not been true?


----------



## GorbazTheDragon

Quote:


> Originally Posted by *KenjiS*
> 
> It does in a way...
> 
> Last years x80 is a x70 today, a x60 gives you the performance of an x80 of 2 generations ago
> 
> Flagship smartphones/tablets/etc are exactly the same, its not just GPUs....
> 
> You're still going to pay a lot for the best performance, When has that not been true?


People have trouble accepting that


----------



## SuprUsrStan

Quote:


> Originally Posted by *maarten12100*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Baghi*
> 
> Are you sure? Because the MRSP of the GTX 680 was 50 bucks less than the AMD counterpart.
> 
> 
> 
> By September 2012 the 680 cost 500 euro while the 7970 was 400 euro.
> Add to that that the 680 has a smaller bus, less memory, a 1080p-focused core and worse scaling with multiple cards.
> My judgement would say the 680 is the inferior product.
> 
> (however we must take into consideration the driver state at that time before Raja started fixing it which was quite bad)
Click to expand...

Yes but at launch the 680 was faster than the lowly clocked 7970 and cheaper too.


----------



## CynicalUnicorn

To paraphrase @nvidiaftw12:

"No flagship should ever have a TDP below 200W."

This was about the 680 being awful. Even @Alatar doesn't like GK104!


----------



## jdstock76

Quote:


> Originally Posted by *Exilon*
> 
> If Nvidia is using a 256-bit bus, it's because they think the huge L2 cache in Maxwell will counteract the reduced memory bandwidth. Looking at how the 750 Ti can keep up with the 650 Ti Boost with half the memory bandwidth and 2/3 of the ROPs, I think Nvidia knows what they're doing. In any case, the area for the L2 cache has to come from somewhere.


^ This, but people will always find a reason to complain. My 660 Tis smoke 780s and Titans. But whatever. I'll probably buy an 880, but not at $750. It's a $550 card at most.


----------



## skupples

The 7970 was terrible in CrossFire + Eyefinity.


----------



## rationalthinking

Quote:


> Originally Posted by *i7monkey*
> 
> Quote:
> 
> 
> 
> The price brackets will keep shifting like they are not because NVidia wants your money, but because of the cost of production.
> 
> 
> 
> Am I delusional or is tech supposed to get cheaper not more expensive?
> 
> Seems like every other tech sector is getting cheaper why is the GPU segment so screwed?
Click to expand...

Rapid change, combined with an increasingly difficult manufacturing process, in a small market.


----------



## grunion

Will this address NV's 4K shortcomings?


----------



## fateswarm

Quote:


> Originally Posted by *Alatar*
> 
> Does the person I'm recommending to OC? If not - custom cooler hawaii, if yes - GK110.


Unless they want G-Sync.

Though OK, the environment will be foggy about it for at least 6 months. And AMD's sync on top.

Then again, just in time for the next line of 14nm/20nm things.


----------



## NABBO

http://translate.google.it/translate?hl=it&sl=de&tl=en&u=http%3A%2F%2Fwww.forum-3dcenter.org%2Fvbulletin%2Fshowthread.php%3Ft%3D535975%26page%3D119


----------



## skupples

Seems people are still hopeful that Denver will show up in Maxwell... It would be nice, and it would seriously raise the bar from here on out.


----------



## candy_van

I seriously hope that 256-bit business is wrong....


----------



## skupples

Quote:


> Originally Posted by *candy_van*
> 
> I seriously hope that 256-bit business is wrong....


What is this about bus width not mattering as much due to cache size?

320-bit or AMD!!!


----------



## NABBO

Quote:


> Originally Posted by *candy_van*
> 
> I seriously hope that 256-bit business is wrong....


It is very likely that it will be 256-bit, max 320-bit. What matters is the performance in games, not the size of the bus.

However, a 512-bit card will also be released (GTX 900), but much later. The end of 2015, IMHO.


----------



## i7monkey

Quote:


> Originally Posted by *KenjiS*
> 
> It does in a way...
> 
> Last years x80 is a x70 today, a x60 gives you the performance of an x80 of 2 generations ago
> 
> Flagship smartphones/tablets/etc are exactly the same, its not just GPUs....
> 
> You're still going to pay a lot for the best performance, When has that not been true?


If the mainstream car industry were priced like this, we'd be buying Toyota Corollas for $100,000, but it's not; Corollas get better every several years and the price point stays the same.

Prices in the graphics industry are getting considerably higher, and it's gotten to the point where Nvidia can stuff a "mid-range" engine into a card and charge high-end sedan prices.

The price point is rising in a sector where tech is getting cheaper to produce. This doesn't make sense, and considering that there are only 2 companies, we can only come to the conclusion that this isn't a competitive market and that price fixing and conspiracy are evident.


----------



## HitMe

Fake. Check the GTX 780 Ti and the ROG 780 Ti:

288GB/s vs 336GB/s, 192 SP vs 240 SP, lool.
This guy is totally wrong.
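For what it's worth, bandwidth figures like those fall straight out of the standard peak-bandwidth formula: bus width in bits / 8 × effective data rate in Gbps. A quick sketch, with the 6 and 7 Gbps effective GDDR5 clocks assumed for illustration:

```python
def mem_bandwidth_gbs(bus_bits: int, effective_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times data rate."""
    return bus_bits / 8 * effective_gbps

print(mem_bandwidth_gbs(384, 7.0))  # 336.0 (384-bit at 7 Gbps, e.g. 780 Ti)
print(mem_bandwidth_gbs(384, 6.0))  # 288.0 (384-bit at 6 Gbps)
print(mem_bandwidth_gbs(256, 7.0))  # 224.0 (the rumored 256-bit card)
```

So a 256-bit bus would need roughly 10.5 Gbps effective memory to match a 384-bit card at 7 Gbps, which is why the bus width is getting so much attention in this thread.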


----------



## jdstock76

Who cares about the 7970. This is about the 880. Let's keep things focused. LoL


----------



## Blackops_2

The 7970 is about like Hawaii vs the 780: faster clock for clock, less OCing headroom, but a bigger bus width, which helps at higher res.

I thought this was funny. These specs are turning up everywhere lol


----------



## discoprince

Quote:


> Originally Posted by *Ha-Nocri*
> 
> lol, 256-bit. Hoping NV aren't playing this game again.


my first thought


----------



## maarten12100

Quote:


> Originally Posted by *jdstock76*
> 
> ^ this but people will always find a reason to complain. My 660tis smoke 780s and Titans. But whatever. I'll probably buy an 880 but not at $750. It's a $550 card at most.


Comparing SLI to a single card, when everybody knows you pay a premium for the latter. Your cards barely have enough memory bandwidth to function alone, and they will slip once the resolution is increased ever so slightly and memory capacity or bandwidth limitations starve the core.

GK110-based cards are simply the choice for multi-card setups at Nvidia. The rule of thumb for multi-card setups has always been: get the highest possible card before going multi-card. (With the recent gen that's no longer entirely the case, because Nvidia pulled a Nvidia and the choice would fall on a 780, which is close enough to the highest card.)


----------



## fateswarm

They might go NVLink early, which would practically mean faster SLI for gaming. They announced it together with Tesla, but I doubt it requires it.


----------



## skupples

Quote:


> Originally Posted by *Blackops_2*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 7970 is about like Hawaii Vs the 780. Faster clock for clock, less OCing headroom but a bigger bus width which helps at higher res.
> 
> I though this was funny. These specs are turning up everywhere lol


people are grasping @ straws.


----------



## Clocknut

Gonna love GTX 860 SLI on my i5 2400 + 650W PSU. Hopefully they fall within a single 6-pin PCIe connector.


----------



## banging34hzs

Someone pass the salt... nvm, I found some...


----------



## LaBestiaHumana

Quote:


> Originally Posted by *Clocknut*
> 
> Gonna love the GTX860 SLI on my i5 2400+650w PSU. Hopefully they falls within single 6 pin PCIE.


An 860 will most likely be GTX 680/770


----------



## Clocknut

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> An 860 will most likely be GTX 680/770


I was looking for 860/860 Ti = 780/780 Ti performance. Anything less and I'd hold off till the 960/960 Ti.


----------



## Forceman

Quote:


> Originally Posted by *Jimhans1*
> 
> Same here, I held out till the 780Ti, and only then because of fact that I'm a retailer and only pay wholesale for it, and they are still overpriced.


So your principled stand against high prices was to boycott the $499 top-end card and wait until they released a $649 top-end card?

I'm guessing Nvidia may have gotten a different message than you intended.


----------



## maarten12100

Quote:


> Originally Posted by *Dragonsyph*
> 
> If his sli 660s get him 60+ fps in every game why would he need to over pay for titans?


The comparison is invalid because it will lose when memory runs out or when there is no SLI support. Just saying, comparing single cards to an SLI setup is silly.


----------



## kx11

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> An 860 will most likely be GTX 680/770


but the 860 will be faster since it can be OC'd higher on air (based on what the 750 Ti can do on air)


----------



## fateswarm

Quote:


> Originally Posted by *kx11*
> 
> but 860 will be faster since it can be OC higher on air ( based on what 750ti can do on air )


Forget about it. Not just forget about it, reverse it. 20nm will be HARDER to overclock, because the lower the process size, the harder it is to deviate from factory voltage. This is from NVIDIA's own mouth, in a science presentation a few months back you can find on YouTube. The 750 Ti is a tiny chip and on 28nm, not comparable by any reasonable means.

Same goes with 14nm processors by the way.

Overclocking will be hard mkay.

PS. All this popular culture in overclocking forums about "power efficiency goals hurt our overclocks" is nonsense, because it fails to understand that simple scientific fact: the smaller the transistor process, the harder it is to deviate from factory voltage. There's no energy-efficiency goal competing with it; chips were always "energy efficient" in the past if you ran them slower.


----------



## Astral Fly

I'm hoping that with Maxwell I can get a card, perhaps a GTX 870, that performs like a 780 Ti at <200W. I was thinking 150W, but that's likely too low. With the Maxwell architecture and 20nm happening at the same time, how big an improvement in efficiency do you guys think we'll see?


----------



## skupples

Quote:


> Originally Posted by *fateswarm*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Forget about it. Not just forget about it, reverse it. 20nm will be HARDER to overclock because the lower the process size, the harder to deviate from factory voltage. This is from NVIDIA's own mouth on a science presentation a few months back you can find on youtube. The 750ti is a tiny chip and on 28nm, not comparable by any reasonable means.
> 
> Same goes with 14nm processors by the way.
> 
> Overclocking will be hard mkay.
> 
> PS. All this popular culture in overclocking forums about "power efficiency goals hurt our overclocks" is nonsense because they fail to undestand that simple scientific fact. The smaller the transistor process, the harder to deviate from factory voltage. There's no energy efficiency competing with it, they always were "energy efficient" in the past, if they run slower.


And then there's the whole flashing a BIOS with a 600W TDP, which completely removes any TDP limitations, except maybe for those on LN2.


----------



## skupples

Quote:


> Originally Posted by *subyman*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Yup, got a 680 at launch. Ended up with a 7970 after the driver dropped. The 7970 was hated on when it first came out due to the pricing. The 680 was actually considered the underdog that beat the 7970 and forced AMD to drop the price. The 7950 launched at $449 and the 7970 at $549. After the 680, AMD dropped the prices nearly $100 on both by that summer. The major reason people were so disappointed with AMD at first was that the 7970 was priced much higher than the previous 6970 (about $200 more.) The 680 came out at the same price as the 580, so it was given a pass.
> 
> 
> The back and forth with nvidia and AMD is fun, but one thing that *everyone should learn from the past is to not buy at launch*.


in all aspects of life.


----------



## cravinmild

Ah yes, but the newest shiny is so hard to resist. Thankfully I'm broke most days, so I'm immune to impulse buys/product launches.


----------



## GorbazTheDragon

lol...

Buying at launch is more expensive... because you are buying it earlier...

If you want it early, pay for it. If you don't want to pay, don't get it early...


----------



## geoxile

Quote:


> Originally Posted by *i7monkey*
> 
> Am I delusional or is tech supposed to get cheaper not more expensive?
> 
> Seems like every other tech sector is getting cheaper why is the GPU segment so screwed?


Ideally yes, but that's often not the case, especially when the direction of the manufacturing industry (low power) doesn't coincide with the designs (high performance).


----------



## JambonJovi

Quote:


> Originally Posted by *Germanian*
> 
> it's a very reputable german computer hardware magazin. I have been reading this magazine for over 5 years


Yes, however the original "leaked specs" came from a different website >>> tyden.cz
and its author Z. Obermaier seems far from trustworthy.
Seems more like a quiet week for them, and judging by the comments... well...


----------



## freestyla85

oh man.. this is making me want to just stick with my 670 for another year.


----------



## skupples

It loosely relates to why people are pissed about Nvidia's new selling structure of releasing mid-grade first.


----------



## Jimhans1

Quote:


> Originally Posted by *skupples*
> 
> It loosely relates to why people are pissed about Nvidia's new selling structure of releasing mid-grade first.


Releasing Mid-grade as top-grade is what I'm ticked at.


----------



## Dragonsyph

Quote:


> Originally Posted by *maarten12100*
> 
> E comparison is invalid because it will lose when mem runs out or when there is no sli support. Just saying comparing single cards to a sli setup is silly.


Ya, I think the truth is if we were all rich, we'd all have Titans lol.


----------



## VinhDiezel

Quote:


> Originally Posted by *skupples*
> 
> It loosely relates to why people are pissed about Nvidia's new selling structure of releasing mid-grade first.


Agreed.
Quote:


> Originally Posted by *Jimhans1*
> 
> Releasing Mid-grade as top-grade is what I'm ticked at.


Agreed.

Quote:


> Originally Posted by *Dragonsyph*
> 
> Ya i think the truth is if we all were rich, we would all have titans lol.


To a certain extent! Me personally, I would never drop $1k on a GPU. The last couple of years I've only spent about $330 after tax on a graphics card, which is the 670 I have right now. I would love to grab another one and run SLI, but knowing that this GK104 chip has only half the cores of a full-fledged GK110, I'd rather go that route instead, or a 290/290X!

I'd hate to drop $500 bucks on this mid-high range card just for a better, full-fledged card of the same generation to be released 6-12 months later. I think I may as well hold out for that instead and just upgrade my CPU/SSD/case first. Good job Nvidia! You got us real good back in the 680 days!!!


----------



## Baghi

To be honest, it's us, the people, who are at fault for buying whatever crap they throw at us. First Intel started selling locked chips and got away with it, and now NVIDIA. An overclocker on a budget can't do much but go the AMD route.


----------



## GorbazTheDragon

There is no way they would do a GM110 release so early... The 20nm process is simply not ready for 500+mm^2 chips.

The other option I guess is that they don't even talk about maxwell, and then throw everything at you in 18 months time, when the GM110 can be viably produced.

The only thing I have an issue with is the bloated prices, which, although higher than they should be, are still not that far out of line, especially considering that producing at 20nm is almost certainly somewhat more expensive than 28nm at this point.


----------



## fateswarm

Nah, they'll just drop a small 20nm chip before the end of the year (to be ready for the holiday sales as well) and then wait on the big ones next year, when manufacturing yields are more stable.

One thing, though: it will probably not be much faster than a 780 Ti, max 10% faster I suspect. But it might be cheaper.

I think it's deeply obvious 28nm is burned out; they're just waiting in the queue at TSMC at this point without saying it.


----------



## fateswarm

Something of note is that NVIDIA probably overshot with the GK110 in relation to 20nm. That is, they may have gone so far with its size that 20nm will have a big problem competing with it before it also goes big. According to the estimate of ~30% more transistors per unit die area on 20nm compared to 28nm, the 7.9 billion transistor guess would need a die almost exactly the size of the 290's.
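That die-size guess can be sketched with back-of-the-envelope arithmetic; the ~30% density gain and the ~7.9B transistor count are the rumor's assumptions, and the Hawaii figures are the commonly cited ones:

```python
# Back-of-the-envelope: die area needed for a ~7.9B-transistor 20nm chip,
# assuming ~30% more transistors per mm^2 than 28nm (the estimate above).
hawaii_transistors_m = 6200          # ~6.2B transistors (commonly cited)
hawaii_die_mm2 = 438                 # ~438 mm^2 (commonly cited)

density_28nm = hawaii_transistors_m / hawaii_die_mm2   # ~14.2 M/mm^2
density_20nm = density_28nm * 1.3                      # assumed +30%

die_20nm_mm2 = 7900 / density_20nm
print(round(die_20nm_mm2))           # ~429 mm^2 -- roughly Hawaii/290-sized
```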

So that would make it harder for the next gen of big chips that might come in mid-to-late 2015, after the first batch at the end of this year. It would make it very hard for big Maxwell to exist at all!

Well, there is a clear alternative that's most likely to be followed: big-ish chips now, slightly bigger ones later that aren't that impressive.

But you know what that means right? When Volta comes about what are they gonna do? Oh.. they bet on stacked vram.. I get it.


----------



## fateswarm

A lot of the confusion in this forum stems from people thinking NVIDIA controls the manufacturing process. They don't. They wait like puppies for TSMC to give them the "OK". You think they can't handle orders on chips they already have? No way. You think they sell so many Pro cards they can't sell gaming cards? No way. They wait like puppies for TSMC to tell them "ok, little nvidia, you now print your chips, and wait, you know, they cost a cazillion for the first few months". So they print mainly for the pro cards at first, and the cost generally comes down to what the 780 was at first for the GK110.

And people whine that nvidia is "overpriced". A joke. Titan, sure, but this is 28 and 20nm and 2014, people. Forget the 90s and the early 00s. We've got to pay in blood to shrink the transistor. And it's so monumentally expensive to do it that nvidia cannot afford it, amd could not afford it anymore; only the giants can. So let amd and nvidia wait like puppies for tsmc to give their OK for gpus, because really, they control nothing; they just wait for the Apple APUs to complete and then it's their turn, with possibly even more horrible yields at first.

Which is a bit worrisome, by the way, since if they overshot with the gk110, they've got to make a big-ish chip from the get-go on 20nm, and that'll be bad for the consumer. It's likely to be the size of the 290, as the rumour specifically implies, and that means a possibly underwhelming gm110.


----------



## fateswarm

You know, I've been thinking about Pascal in relation to this, and it's very likely 20nm will be a one-chip cycle. GK110 is gigantic, and it's likely, to the point of near certainty, that the first 20nm chip will be around the size of the 290, as this rumour implies. So they release this and then go straight to 16nm with stacked VRAM on top. It's said 16nm is just a minor improvement on 20nm anyway in terms of foundry alterations.

So in practice, we get this first chip somewhere around October and wait on it until at least around next summer. It's gonna be ~10% faster than GK110, so it's marginally satisfactory.

Then 16nm sometime in late 2015, with VRAM stacked on top of the GPU, and bam, huge performance increase. The only problem is, they might be stuck with nothing after it.

But they probably assume Pascal will be so amazing that it will only need a small chip at first, so it's gonna have another iteration in 2016.

And then it goes 2017 for anything that is next. But that's probably beyond the retirement of some people that control this thing.


----------



## skupples

Quote:


> Originally Posted by *fateswarm*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> You know, I've been thinking Volta in relation to this and it's very likely 20nm will be an one-chip cycle. GK110 is gigantic and it's likely, to the point of almost certainty the first 20nm chip to be around the size of the 290, as this rumour implies. So, they release this and then they go straight to 16nm with stacked vram on top. It's said that 16nm is just a minor improvement on 20nm anyway in terms of foundry alterations.
> 
> So in practice, we get this first somewhere around October, we wait on it until at least somewhere around next Summer. It's gonna be ~10% faster than GK110 so it's marginally satisfactory.
> 
> Then 16nm sometime late 2015 with vram stacked on top of the GPU and Bam, huge performance increase. Only problem is, they might be stuck with nothing after it.
> 
> But they probably assume stacked vram will be so amazing that it will only need a small chip at first, so it's gonna have another iteration in 2016.
> 
> And then it goes 2017 for anything that is next. But that's probably beyond the retirement of some people that control this thing.


Stacked & Denver cores.

I think Maxwell being a one-trick pony is likely, mostly because both of the things rumored to come with it have been pushed to Volta/Pascal.


----------



## mcg75

Thread cleaned and unlocked.

Please keep on topic from here on out.

Thread does not need to be a warzone talking about old product superiority.

Thanks.


----------



## RoboChimp

I think my original Titan won't need to be replaced for a while.


----------



## Doomtomb

I wouldn't consider something better than a GTX 780 Ti to be midrange. It's hard to tell from specs alone how it will perform, because it's gimped by the memory bus width and ROP count, but it will probably still beat a vanilla GTX 780.

I don't consider the Titan or any of its $1000+ companions to be "high-end", because they're playing in a totally different ballgame: "ultra-high-end" for professionals with deeper pockets.

I really hope next generation Nvidia comes back to reason: the sweet-spot price/performance card will always outsell and make the big bucks for them, not this runaway price-point train. It just seems like every new graphics card announced is more expensive than the last. The ceiling will be found soon.


----------



## y2kcamaross

Quote:


> Originally Posted by *Doomtomb*
> 
> I wouldn't consider better than GTX 780 Ti and the like to be midrange. It's hard to tell from specs just how it will perform because it is gimped by the memory bit depth and ROP count but will probably still beat a vanilla GTX 780.
> 
> I don't consider the Titan or any of its above $1000 companions to be "high-end" because it is playing it a totally different ballgame, "ultra-high-end" for professionals with deeper pockets.
> 
> I really hope next generation Nvidia returns to reason that the sweet spot price/performance will always outsell and make the big bucks for them. Not this runaway price point train. It just seems like every new graphics card announced is more expensive than the last. The ceiling will be found soon.


So you think the 880 will probably beat a vanilla 780... has there ever been a GPU release where the previous high-end model (780) BEATS the new high-end model (880)? Because that would equate to about zero cards sold.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *fateswarm*
> 
> I lot of the confusion in this forum stems from the fact people think NVIDIA controls the manufacturing process. They don't. They wait like puppies for TSMC to give them the "OK". You think they can't handle orders on chips they already have? No way. You think they sell so many Pro cards they can't sell Gaming cards? No way. They wait like puppies for TSMC to tell them "ok, little nvidia, you now print your chips, and wait, you know, they cost a cazillion for the first few months". So they print mainly for the pro cards at first and the cost generally goes down to what 780 was at first for the GK110. *And people whine that nvidia is "overpriced". A joke*. Titan sure, but this is 28 and 20nm and 2014 people. Forget the 90s and the early 00s. We got to pay in blood to shrink the transistor. And it's so monumentally expensive to do it, nvidia can not afford it, amd could not afford it anymore, only the giants can. So let amd and nvidia wait like puppies for tsmc to give their ok for gpus because really, they control nothing, they just wait for the apple apus now to complete and it's their turn, with possibly even more horrible yields at first. Which is a bit worrisome by the way since if they overshot with the gk110, they got to make a big-ish chip on the get-go on 20nm, and that'll be bad for the consumer. It's likely to be of the size of the 290 as the rumour specifically implies and that means possibly an underwhelming gm110.


O.K.....sounds informed ***but....

What would you wager the price a gtx 780 actually cost Nvidia to make?...cold hard math....

Because no matter how well informed you sound, I just can't, for the life of me, swallow that Nvidia has to work so hard and is actually doing us a favor with these prices

I personally would wager that the actual manufacturing price is somewhere between ridiculous and laughable


----------



## PostalTwinkie

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> O.K.....sounds informed ***but....
> 
> What would you wager the price a gtx 780 actually cost Nvidia to make?...cold hard math....
> 
> Because no matter how well informed you sound, I just can't, for the life of me, swallow that Nvidia has to work so hard and is actually doing us a favor with these prices
> 
> I personally would wager that the actual manufacturing price is somewhere between ridiculous and laughable


I would wager the markup on pretty much any computer component is nothing near what the markup is on many other retail items you see on a store shelf.


----------



## skupples

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> O.K.....sounds informed
> 
> 
> 
> 
> 
> 
> 
> 
> What would you wager the price a gtx 780 actually cost Nvidia to make?...cold hard math....
> 
> Because no matter how well informed you sound, I just can't, for the life of me, swallow that Nvidia has to work so hard and is actually doing us a favor with these prices
> 
> I personally would wager that the actual manufacturing price is somewhere between ridiculous and laughable


I would wager using manufacturing price alone is 100% inaccurate in figuring the price of what it costs to produce a GPU. Just JUST the price of stamping out the card is likely REALLY low. Paying TSMC? Obscene. Engineering time? Obscene. Payroll? Obscene. Paying for CEO's Bentley, Yacht & 5 houses? Obscene.


----------



## MapRef41N93W

Yeah, the price of actually making a chip (the parts that go into one) is obviously dirt cheap. That doesn't take into account the price of R&D and the cost of fabs/using other companies' fabs. After you account for that, the markup on components is not that high.


----------



## skupples

You will find people who complain about any and everything if in the right venue.

GPU prices are obscene from both parties. Nvidia takes the cake with Titan & Titan Z. Resources are also up... Seen the price of gold lately? Granted, very little goes into each GPU, but it is an example of what can/does drive up the physical production price of the PCB. I would love to know what the physical production prices are, and what they value the R&D time @.


----------



## Chargeit

There is no way GPUs cost that much to make. I'd be willing to bet the highest-end single cards could go for $400-500 and still bring in a really great profit.

They charge what they think they can get away with, not what the item is worth. I mean, how much is maxing out a game at a good res really worth? Thousands of dollars, when you can get a console for a fraction of the price? Hell no.

Doesn't bother me, I was sitting this gen of cards out anyway.


----------



## GorbazTheDragon

Quote:


> Originally Posted by *skupples*
> 
> You will find people who complain about any and everything if in the right venue.
> 
> GPU prices are obscene from both parties. Nvidia takes the cake with Titan & Titan Z. Resources are also up... Seen the price of gold lately? Granted, very little goes into each GPU, but it is an example of what can/does drive up the physical production price of the PCB. I would love to know what the physical production prices are, and what they value the R&D time @.


I'd like to know too.

It's going to reach a point pretty quickly where they can't keep pushing the prices up.

I would actually also be interested in seeing how much the prices have actually gone up, taking into account inflation and the change in prices of the components themselves.
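Since nobody in the thread has run the inflation numbers, here's a minimal sketch of that adjustment; the 2%/yr rate is an assumed round figure, not actual CPI data:

```python
def inflation_adjusted(price: float, years: int, annual_rate: float = 0.02) -> float:
    """Compound a launch price forward by an assumed annual inflation rate."""
    return price * (1 + annual_rate) ** years

# e.g. the GTX 580's $499 launch price (2010), carried forward 3 years at an
# assumed 2%/yr, lands around $530 -- still well short of the 780 Ti's $699.
print(round(inflation_adjusted(499, 3)))   # 530
```

So at least on this rough estimate, the recent price hikes are well beyond inflation alone.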


----------



## Clocknut

Quote:


> Originally Posted by *Chargeit*
> 
> There is no way GPU's cost that much to make. I'd be willing to bet the highest end single cards could go for $400 - $500 and they'd still bring in a really great profit.
> 
> They charge what they think they can get away with, not what the item is worth. I mean, how much is maxing out a game at a good res really worth? Thousands of dollars when you can get a console for fractions of the price? Hell no.
> 
> Doesn't bother me, I was sitting this gen of cards out anyway.


Well, just stick to your own budget; there's no reason to keep following the tech if it gets more expensive.

If the GTX 860 somehow ends up at $400-500, I will not buy it. It's very unlikely I'll ever buy a GPU over $250.


----------



## Cakewalk_S

I'll put $30 away a month for a gtx780 probably. The 880 or 860 whatever will come out and likely drop the price of the gtx780 by $100... Hopefully...


----------



## Radeon915

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Exactly. 880ti, 8-series titan, 8-series titan black...
> Milking, milking is my way


Don't forget the GTX 880 Ultra, and the MX 880 that's actually based on G92b..


----------



## fateswarm

Again, if you think NVIDIA (significantly) overprices their non-Titan chips, you must answer the following question, otherwise your argument hangs on nothing.

Why did AMD make smaller chips this time around when they had the chance, and almost certainly the time, to make a bigger chip than the GK110 and they still had to keep the price elevated compared to their own previous models?

Because otherwise, the only alternative is that NVIDIA and AMD are a cartel, and it would be unlikely for deception to be happening at that scale. In that case your argument doesn't stand either, since you're supposed to be proving NVIDIA are teh evil and AMD teh saints.

Again, nobody are saints; there might be various marketing tricks, and obviously there are. I won't even touch the "silicon cost" argument. It's childish. Of course we're talking about the manufacturer's R&D and everything else, not just the cost of the silicon; we're talking about what TSMC is charging them. And all signs indicate very blatantly that it's quite expensive to be early on TSMC's latest node.


----------



## skupples

Not to mention AMD is going to recycle Hawaii three times total, so that completely offsets those trolling NV about rehashing content.

Moral of the story: NV is evil, AMD is the good guy looking out for you! Those $900 290X units never happened, because AMD couldn't keep up with demand due to low yields.

It's all the same. Like someone else said, we're paying in blood for the die shrinks.


----------



## Mand12

Quote:


> Originally Posted by *fateswarm*
> 
> Why did AMD make smaller chips this time around when they had the chance, and almost certainly the time, to make a bigger chip than the GK110 and they still had to keep the price elevated compared to their own previous models?


Because if they had they would likely have caught fire given their cooling system.


----------



## Usario

Quote:


> Originally Posted by *skupples*
> 
> Not to mention AMD is going to recycle Hawaii 3 times total. So that completely offsets those trolling NV about rehashing content.
> 
> Moral of the story NV is evil AMD is good guy looking out for you! Those $900 290x units never happened because AMD couldn't keep up with demand due to low yield.


I don't understand how AMD can be blamed for retailer markup.
Quote:


> Originally Posted by *Mand12*
> 
> Because if they had they would likely have caught fire given their cooling system.


Hawaii runs hot because the die is very dense: 14.1M transistors per mm^2, vs 12.6M transistors per mm^2 on GK110, or 11.78M transistors per mm^2 on Tahiti. Everything is tightly packed, so the heat is more concentrated and difficult to dissipate. (This is the same reason Ivy Bridge was known for running hotter than SB -- pretty much the same chip, but in a 160mm^2 die rather than ~240mm^2 or whatever SB was.)

If the exact same Hawaii chip were made to be, say, 500mm^2, it would run a lot cooler, but it would also cost AMD more money due to the larger die and might not yield as well (though I'm not sure whether Hawaii's density affects its yields either). If AMD had made a ~520mm^2 chip with 48 CUs (vs 44 CUs on the 290X), it would probably run a bit cooler than Hawaii and would probably have had the edge over the 780 Ti.

Why didn't they do that? Only AMD knows. I do believe their engineers feel restricted by die size and are uncomfortable going any higher than they already have, but who knows.
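Those density figures can be reproduced from the commonly cited transistor counts and die areas; all values are approximate, and small differences from the numbers above come from rounding of the underlying counts:

```python
def density_m_per_mm2(transistors_millions: float, die_area_mm2: float) -> float:
    """Transistor density in millions of transistors per mm^2."""
    return transistors_millions / die_area_mm2

# Commonly cited figures: (transistors in millions, die area in mm^2)
chips = {
    "Hawaii (R9 290X)": (6200, 438),
    "GK110 (780 Ti)":   (7100, 561),
    "Tahiti (HD 7970)": (4310, 365),
}
for name, (t, area) in chips.items():
    print(f"{name}: {density_m_per_mm2(t, area):.1f} M transistors/mm^2")
```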


----------



## Mand12

Quote:


> Originally Posted by *Usario*
> 
> Hawaii runs hot because the die is very dense.


What makes you think that density would have decreased had they tried to make a 500 mm2 die?


----------



## skupples

Can't keep up with demand = prices go through the roof. Is that really hard to understand?

Whose fault is it, AMD's or TSMC's? No one knows, so blame both.


----------



## Mr.Eiht

Quote:


> Originally Posted by *zealord*
> 
> I would love to replace my GTX 680, but If I assume this leak to be true, then I do not see my self paying the 650$/€(?) for this "midrange" card even if the performance is well above the 780 Ti.
> Well of course if it is like 60-70% better than the 780 Ti I might give in, but I don't see that remotely happening.


^This. Would love to replace my boring 680 with something funky, two cards so the case does not look that empty.
But why would I replace a boring card with two boring cards?


----------



## 47 Knucklehead

This is why, like Windows operating systems, you buy EVERY OTHER generation of GPU, and in the "off" generation you pick up a 2nd card and go SLI/Crossfire... and why good drivers for multiple cards are so important.

I'm glad I went from my GTX 580s to my GTX 780s, totally skipping the GTX 680, and now I'm skipping the GTX 880. I won't even bother looking at video cards until the GTX 980 or the AMD R9 5xx (or whatever they plan to call the card two generations from the R9 290(X)).


----------



## skupples

Pascal or bust!!! 2016!!!

Wish we knew more about that PCIE-X... Nvidia about to start making boards again?


----------



## Usario

Quote:


> Originally Posted by *Mand12*
> 
> What makes you think that density would have decreased had they tried to make a 500 mm2 die?


I meant, if they make the exact same chip, just less dense with a 500mm^2 die.
Quote:


> Originally Posted by *skupples*
> 
> Can't keep up with demand = prices go through the roof. Is that really hard for you to understand?
> 
> Who's fault is it between AMD and TSMC? No one knows so blame both.


I'm not aware of any evidence to suggest it wasn't just retailer markup. The MSRP remained the same and prices outside of North America were not affected.


----------



## PostalTwinkie

Quote:


> Originally Posted by *47 Knucklehead*
> 
> This is why, like Windows Operating Systems, you buy EVERY OTHER generation of GPU and in the "off generation", you pick up a 2nd card and go SLi/Crossfire ... and why good drivers for multiple cards is so important.
> 
> I'm glad I went from my GTX 580's to my GTX 780's ... totally skipping the GTX 680 and now skipping the GTX 880. I won't even bother looking at video cards until the GTX 980 or the AMD R9 5xx (or what ever they plan to call their card 2 generations from the R9 290(x) ).


Yea, I kind of broke that rule this cycle, but with my 780 Ti showing up today, I will add another later and completely skip the next two series from both camps. I'll start looking again when Nvidia actually dumps Pascal in our laps, assuming it brings the major shifts to the GPU paradigm it looks to.


----------



## skupples

Prices in other nations (excluding Canada, sometimes) have no effect on supply lines to the States. The cards were constantly sold out and/or at $900; that is decent evidence of supply shortages in the States. Said patterns went all the way down to the 280, unless that's a conspiracy as well.

The USA definitely has the most miners and the cheapest power. Even my local brick-and-mortar stores were sold out of everything down to the 270X.


----------



## szeged

I blame newegg still 100%.


----------



## sugarhell

Quote:


> Originally Posted by *skupples*
> 
> Prices in other nations(excluding Canada some times) have no affect on supply lines to the states. The cards were constantly sold out and or @ $900 that is decent evidence of supply shortages in the states. Said patterns went all the way down to the 280. Unless that is a conspiracy as well.
> 
> USA definitely has the most miners and the cheapest power. Even my local brick and mortar stores were sold out of everything down to 270x


Compared to china? I doubt it


----------



## skupples

Quote:


> Originally Posted by *sugarhell*
> 
> Compared to china? I doubt it


I always forget about them being a mining superpower.


----------



## PostalTwinkie

Quote:


> Originally Posted by *skupples*
> 
> I always forget about them being a mining superpower.


In the digital world and the real world!


----------



## Chargeit

Quote:


> Originally Posted by *Clocknut*
> 
> Well just stick to ur own budget. there is no reason to keep following the tech if it get more expensive.
> 
> If GTX860 is somehow becoming $400-500, I will not buy it. It is very unlikely I will buy a GPU over $250.


Yea, I feel most comfortable in the $500-$700 MAX range of GPU ($700 if I feel crazy and want to lie to my ol'lady about the price I paid =). This is assuming it's considered a high-end card. I'm not paying $700+ for a GPU that is considered "mid" range. Lol, does that mean we're only to expect medium settings in games? (I know this isn't true, but it's still funny when you think of the price range.)

My plan was to just stick with my current 780 through this next gen and even go for the same range of GPU, or go with SLI or Crossfire $300-range GPUs next. As much as the upgrade junkie in me wants to go up at any chance, I just can't justify the constant shelling out of hundreds of dollars.

I really do hope what was suggested is just BS brought on by people trying to make sense of the crazy price they put those "Z" cards out at. Pricing like this will only further push along the eventual takeover of pure on-chip graphics solutions. It doesn't take a tech guru to realize what the future holds for dedicated GPUs, especially if the card makers want to attempt to push prices into the range they're talking about.

And about R&D prices, I'm sure a large portion of those costs goes towards developing future mobile/integrated solutions.


----------



## skupples

All estimates on price are even further into the realm of speculation than the physical specs themselves. I bet 550-600.


----------



## PostalTwinkie

Quote:


> Originally Posted by *skupples*
> 
> All estimates on price are even further into the realm of speculation than the physical specs themselves. I bet 550-600.


I think we will continue to see the pricing scheme we have seen the last two generations for several more.

Which isn't really terrible (don't get me wrong, I love cheaper as well), considering you have cards like the GTX 750 Ti for $150 that can game at 1080p just fine. There isn't a _need_ in most situations to buy a 780, 780 Ti, or the other flagships.


----------



## Dudewitbow

Quote:


> Originally Posted by *EliteReplay*
> 
> i dont think you can play ULTRA 1080p any game with a 750ti


It would be a case-by-case situation and preference, because "Ultra" technically doesn't always define how much AA is applied. If it's something light on AA, like FXAA or MLAA, 1080p Ultra on nearly all titles wouldn't be impossible. The other subjective factor is minimum-FPS preference. There's no standard "min FPS must be this amount," so one person wouldn't mind dips to 30 where another will hate dips below 50, per se. It goes further: genres like FPS generally demand better minimums than an RPG, where low FPS doesn't entirely kill immersion because reaction time isn't as critical.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Dudewitbow*
> 
> It would be a case-by-case situation and preference, because "Ultra" technically doesn't always define how much AA is applied. If it's something light on AA, like FXAA or MLAA, 1080p Ultra on nearly all titles wouldn't be impossible. The other subjective factor is minimum-FPS preference. There's no standard "min FPS must be this amount," so one person wouldn't mind dips to 30 where another will hate dips below 50, per se. It goes further: genres like FPS generally demand better minimums than an RPG, where low FPS doesn't entirely kill immersion because reaction time isn't as critical.


Pretty much this!

There is also very little (in some situations zero) visual difference between 8x and 16x. In a lot of situations it can be hard to tell the difference between 4x and 8x if you aren't TRYING to find jaggies.

Most people at 1080p these days buy ultra-high-end GPUs for e-peen and/or because they're in the mindset of _"I have to max everything or it looks bad!"_, even though in most situations the only thing you net with Ultra over something like High is a tanked frame rate.

CoD: Ghosts was a prime example of this; you could tank your frame rate by close to 50% by maxing everything out, yet it provided no visual benefit over one step below.


----------



## Chargeit

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I think we will continue to see the pricing scheme we have seen the last two generations for several more.
> 
> Which isn't really terrible (don't get me wrong, I love cheaper as well) considering you have cards like the GTX 750 Ti for $150 that can game at 1080P just fine. There isn't _need_ in most situations to buy 780, 780 Tis, or the other flagships.


That's the reason I don't feel the need to upgrade this 780. I know people talk a lot about games being GPU-bound, but I think once you get to a higher level of card you're more likely to be limited by the CPU/engine unless you go for very high res or multi-monitor. For instance, I play at 1080p, which this GPU is more than capable of. I just got GW2 while it was on sale. Running a 4770K/780 maxed out, I still drop to 40 FPS (saw a 30 somewhere) in the towns. That's CPU/engine without a doubt.

Heck, I did think about adding a 2nd 780, but if my previous experiences have taught me anything, it's that there comes a point where you just can't really improve some things, such as my GW2 example. Messing around with dropping settings which you'd think had a large effect on FPS doesn't seem to do anything (demanding graphical settings have little performance effect). I've noticed that a lot with this GPU, since my limits are most likely due to CPU/engine.

I know that for now, at the 1080p I play at, there is only so much that upgrading the GPU can do with 95% of the games I play. (My most demanding game is still only Far Cry 3. I don't even have it installed.)

I'm considering getting a 3rd monitor; if that happens then I'd benefit from a better/2nd GPU without a doubt. I'd most likely just get a 2nd 780, since multiple cards are the way to go playing on 3 monitors.

*I got the game for my ol'lady also, who is on an FX-6300/HD 7850 (I like the 6300 much more than my old 8320). I left the settings at default, but it looks about as good as it does on mine and runs equally well from what I saw.


----------



## GreenStone

I am still using my 4 (4.5?) years old HD 5850 and even though it doesn't push crazy framerates I am able to play games at medium settings and 1080p. I will most probably upgrade to the next generation of GPUs so I can again dabble with high to ultra settings. But considering that I am still at 1080p there will be no need for a behemoth. Actually, the feature that interests me the most is the supposed power efficiency of the new cards. This is the reason why I didn't jump on the R9 280x (yet?) and I am holding out to see what wattage NVidia brings to the table.
Personally, I feel like there has been no need to upgrade the GPU in the last couple of years. All of the games looked relatively fine without taxing the hardware too much.


----------



## PostalTwinkie

Quote:


> Originally Posted by *GreenStone*
> 
> I am still using my 4 (4.5?) years old HD 5850 and even though it doesn't push crazy framerates I am able to play games at medium settings and 1080p. I will most probably upgrade to the next generation of GPUs so I can again dabble with high to ultra settings. But considering that I am still at 1080p there will be no need for a behemoth. Actually, the feature that interests me the most is the supposed power efficiency of the new cards. This is the reason why I didn't jump on the R9 280x (yet?) and I am holding out to see what wattage NVidia brings to the table.
> Personally, I feel like there has been no need to upgrade the GPU in the last couple of years. All of the games looked relatively fine without taxing the hardware too much.


If the GTX 750 Ti is any indication of the coming power efficiency from Nvidia, it is going to be impressive. The 750 Ti can pretty much hit 60 FPS at 1080P on medium settings for most any game, it is a beast of a little card.


----------



## GreenStone

Quote:


> Originally Posted by *PostalTwinkie*
> 
> If the GTX 750 Ti is any indication of the coming power efficiency from Nvidia, it is going to be impressive. The 750 Ti can pretty much hit 60 FPS at 1080P on medium settings for most any game, it is a beast of a little card.


I hope that as well. The reviews were indeed impressive and encouraging. But I will first wait for the "main course" to come out before going bonkers


----------



## NuclearPeace

It's annoying when people pretend mid-spec cards don't exist and go on to yell about how the world is about to end because graphics cards all of a sudden aren't affordable. All you need to max out is a GTX 760, which is a whopping $250. If you're good with just High settings, which is barely a compromise, you can use an R9 270 or a GTX 660, which is even cheaper. If you want a mix of Medium/High then you can get an R7 265 or a GTX 750 Ti. I haven't seen GPU deals that good since the 650 Ti Boost was selling for $130, a card with SLI support and slightly higher FPS than the 750 Ti.

Buy according to your needs. I always feel my stomach drop when a new PC gamer buys a GTX 780 Ti with a 60Hz 1080p monitor.


----------



## skupples

Quote:


> Originally Posted by *NuclearPeace*
> 
> *I always feel my stomach drop when a new PC gamer buys a GTX 780 Ti with a 60Hz 1080p monitor.*


You and me both. Now then, if it were 3x 1080p I would recommend 2x 780 Ti and a 3770K/4770K.


----------



## DizZz

Quote:


> Originally Posted by *NuclearPeace*
> 
> I always feel my stomach drop when a new PC gamer buys a GTX 780 Ti with a 60Hz 1080p monitor.


I don't necessarily agree. Most games look smoother the higher the FPS they are able to run at, even if the monitor cannot display all those frames, so for people who are serious gamers and have the money to spend, a 780 Ti is not definitively a bad choice for 60Hz 1080p. But then again, why would a serious gamer be running 60Hz 1080p...

It does happen though


----------



## Blackops_2

I just picked up a 780 Classy, but I'm running a 144Hz VG248QE.

I'm with Dizz though: the more future-proof the better; at least it's not gonna hurt.


----------



## Hasty

Even stable 60fps is hard to achieve in some games with a 780ti.
I will never understand this concept of high end cards being supposedly "overkill/unnecessary/useless"


----------



## Brodda-Syd

Quote:


> Originally Posted by *Hasty*
> 
> Even stable 60fps is hard to achieve in some games with a 780ti.
> I will never understand this concept of high end cards being supposedly "overkill/unnecessary/useless"


I think it all depends on what games we are all playing!

I felt my GTX 680s were overkill on Crysis 2 (DX11 & hi-res textures) and Battlefield 3 at 1080p with Ultra/High settings.
But Crysis 3 at Ultra settings HEAVILY taxed them!
Can anyone achieve 144 FPS on their 144Hz display in Crysis 3 with Ultra settings?

Now that I'm running at 4K, I'm thinking 2 GTX 880s may not be enough and I may have to play at medium settings.
If it is using a GM204 (presumably a cut-down version), hopefully AMD will push them to release a GM210 chip!
But then I may waste a year waiting for that to happen, and then it doesn't.
I can't wait 2 years for Pascal.


----------



## Cakewalk_S

I'm excited for what the mobile versions of these cards will bring. With the big drop in power consumption on the 750ti and great performance I would expect to see a mobile chip to really shine with Maxwell...


----------



## GorbazTheDragon

Yeah the 20nm mobile parts should be pretty good.

As far as running 780 Tis at 1080p IMO it's a waste of money if you don't have 120/144Hz capability... Depending on what your definition of a serious gamer is, they will often use high FPS monitors anyway, and will be looking for very consistent/smooth framerates.

I play quite a few games relatively competitively, so having the extra overhead to provide smooth performance is very welcome. Then again, who plays Crysis 3 competitively... Sounds a bit far-fetched, even after you have thrown out Ultra settings.


----------



## raisethe3

I know I am late, but does this mean the GTX880= the new 8800GT?


----------



## fateswarm

Quote:


> Originally Posted by *DizZz*
> 
> I don't necessarily agree. Most games look smoother the higher the fps they are able to run at, even if they cannot display all those frames on a monitor so for people who are serious gamers and have the money to spend, a 780 TI is not definitively a bad choice for 60hz 1080p. But then again, why would a serious gamer be running 60hz 1080p...
> 
> It does happen though


Yeah, basically 1080p is pretty much the standard when competition must be priority number one. Hell, they even turn unnecessary graphics down to hell, so it's not even supposed to look good. The alternative is to get high FPS at at least 1440p, but good luck finding that cost-effective for an environment that must have optimal FPS at all times.


----------



## GorbazTheDragon

Quote:


> Originally Posted by *raisethe3*
> 
> I know I am late, but does this mean the GTX880= the new 8800GT?


No, maybe a closer analogy would be the 8800GTS 512

Otherwise think of it as the GTX 480 if that had the specs of the GTX 560 ti.


----------



## DizZz

Quote:


> Originally Posted by *fateswarm*
> 
> Yeah, basically 1080p is pretty much a standard when competition must be priority number one. Hell, they even turn unnecessary graphics down to hell so it's not even supposed to look good. The alternative is to get high fps on at least 1440k but good luck finding it that cost effective for an environment that Must have, at all times, optimal FPS.


Yes, but most competitive gamers who run 1080p are running at least 120Hz. But yeah, you're right.


----------



## GorbazTheDragon

Quote:


> Originally Posted by *DizZz*
> 
> Yes but most competitive gamers who run 1080p are running at least 120hz but yeah you are right.


There's not much point in running at 1440p for a pro gamer... I still know of a lot of pros who play at 5:4 or 4:3 at 1280x1024 or some small res like that... They make a little 19-inch window on their 24-inch monitors.

I wonder what Riot Games uses for their LCS computers... Wouldn't be surprised if it was 660s or something pretty mid-range like that.

I get 150-200 FPS on my 670M, which is basically a slow 560... I can't really say the same for some other games, but AFAIK the more demanding games are never popular in competitive play because they are less responsive.


----------



## DizZz

Quote:


> Originally Posted by *GorbazTheDragon*
> 
> There's not much point in running at 1440p for a pro gamer... I still know of a lot of pros who play on 5:4 or 4:3 at 1280x1024 or some small res like that... They make a little 19 inch window on the 24 inch monitors.
> 
> I wonder what Riot Games uses for their computers for the LCS... Wouldn't be surprised if it was 660s or something pretty mid range like that.
> 
> I get 150-200 FPS on my 670m which is basically a slow 560... I can't really say the same for some other games, but AFAIK the *more demanding games are never popular in competitive play because they are less responsive*.


Exactly, which is part of the reason why CS has been so popular over the years, and even CoD4 ProMod to a certain extent.


----------



## fateswarm

Quote:


> Originally Posted by *GorbazTheDragon*
> 
> There's not much point in running at 1440p for a pro gamer... I still know of a lot of pros who play on 5:4 or 4:3 at 1280x1024 or some small res like that... They make a little 19 inch window on the 24 inch monitors.
> 
> I wonder what Riot Games uses for their computers for the LCS... Wouldn't be surprised if it was 660s or something pretty mid range like that.
> 
> I get 150-200 FPS on my 670m which is basically a slow 560... I can't really say the same for some other games, but AFAIK the more demanding games are never popular in competitive play because they are less responsive.


It depends on the game. That window you mentioned applies to WoW too, for example. But in that game it's probably best to make sure the CPU is a beast and then worry about the graphics. It probably runs most optimally on a high-end 4-core/8-thread Intel or better, but on any GPU better than a 680. They probably don't even need more than 60Hz vsync.

But in a competitive FPS game you have to go full screen, because it's just more viewing area. Well, unless your monitor is around 30'' or more and you can afford a window, but it's probably best for performance to full-screen a 24'' 1080p display. At 100 FPS at least.


----------



## raisethe3

Quote:


> Originally Posted by *GorbazTheDragon*
> 
> No, maybe a closer analogy would be the 8800GTS 512
> 
> Otherwise think of it as the GTX 480 if that had the specs of the GTX 560 ti.


The only reason I brought this up is that the numbering is so familiar. Which card today gives the same performance/price as the 8800GT did? Just curious.


----------



## i7monkey

I click on this page every day and I've played less than 2 hours worth of games on my 780Ti since November


----------



## fateswarm

Quote:


> Originally Posted by *i7monkey*
> 
> I click on this page every day and I've played less than 2 hours worth of games on my 780Ti since November


Yeah, one of the reasons these cards are an investment worth investigating. E.g. I currently have a list of adventure games to play that don't need anything better than a 4- or 5-year-old computer. But I also want to wait for the fall, since we're likely to get 14 and/or 20nm chips by then, so it's worth the wait.


----------



## Doomtomb

Quote:


> Originally Posted by *y2kcamaross*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Doomtomb*
> 
> I wouldn't consider better than GTX 780 Ti and the like to be midrange. It's hard to tell from specs just how it will perform because it is gimped by the memory bit depth and ROP count but will probably still beat a vanilla GTX 780.
> 
> I don't consider the Titan or any of its above $1000 companions to be "high-end" because it is playing it a totally different ballgame, "ultra-high-end" for professionals with deeper pockets.
> 
> I really hope next generation Nvidia returns to reason that the sweet spot price/performance will always outsell and make the big bucks for them. Not this runaway price point train. It just seems like every new graphics card announced is more expensive than the last. The ceiling will be found soon.
> 
> 
> 
> So you think the 880 will probably beat a vanilla 780... has there ever been a GPU release where the previous high-end model (780) BEATS the new high-end model (880)? Because that would equate to about 0 cards sold.

What's the point of your comment? Ok Thanks for agreeing with me.


----------



## BusterOddo

Quote:


> Originally Posted by *DizZz*
> 
> Most games look smoother the higher the fps they are able to run at, even if they cannot display all those frames on a monitor so for people who are serious gamers and have the money to spend, a 780 TI is not definitively a bad choice for 60hz 1080p. But then again, why would a serious gamer be running 60hz 1080p...


I definitely agree with the first part of your post, because lots of people use the same screen as both their TV and computer monitor. And in no way is a 780 Ti/290X overkill for 1080p, even at 60Hz. Plenty of games with high settings and AA applied will need that power, and I'm going to leave the serious gamer comment alone, lol.


----------



## AlphaC

Since everyone's speculating, this is my speculation dartboard based on 128 CUDA cores per SMM (the smallest unit of Maxwell):



One has to match.

See also http://www.overclock.net/a/what-i-expect-from-nvidias-maxwell-speculation-draft

edit: performance prediction


Where'd I get the clocks from?
*HWBOT average overclock for cards on air , water* for cards with PCI-e power


Spoiler: Warning: Spoiler!



GTX 780 Ti = 1186 , 1346
GTX 780 = 1178 , 1202
GTX 770 = 1246 , 1278
GTX 760 = 1239 , 1238
*GTX 750 TI = 1324 , 1423*
*GTX 750 = 1314*

GTX 690 = 1072 , 1237
GTX 680 = 1220, 1311
GTX 670 = 1134 , 1215
GTX 660 TI = 1130 , 1257
GTX 660 = 1314
GTX 650 TI Boost = 1207
GTX 650 = 1281, 1451

R9 290X = 1160 , 1239
R9 290 = 1131 , 1158
R9 280X = 1190 , 1274
R9 270X = 1203 , 1255
R9 270 = 1058 , 1183
R7 260X = 1243
R7 260 = 1241

HD 7990 = 1101 , 1143
HD 7970 x2 = 1180 , 1180
HD 7970 = 1205 , 1286
HD 7950 = 1134 , 1216
HD 7870XT = 1184 , 1220
HD 7870 = 1226 , 1290
HD 7850 = 1126 , 1267
HD 7790 = 1175 , 1453
HD 7770 = 1150 , 1328



As you can see, 1100MHz is a highly probable stock clock and 1300MHz is a highly reasonable overclock expectation.


----------



## L36

Quote:


> Originally Posted by *AlphaC*
> 
> Since everyone's speculating, this is my speculation dartboard based off 128 CUDA cores per SMM (the smallest unit of Maxwell):
> 
> 
> 
> One has to match.
> 
> See also http://www.overclock.net/a/what-i-expect-from-nvidias-maxwell-speculation-draft
> 
> edit: performance prediction
> 
> 
> Where'd I get the clocks from?
> *HWBOT average overclock for cards on air , water* for cards with PCI-e power
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> GTX 780 Ti = 1186 , 1346
> GTX 780 = 1178 , 1202
> GTX 770 = 1246 , 1278
> GTX 760 = 1239 , 1238
> *GTX 750 TI = 1324 , 1423*
> *GTX 750 = 1314*
> 
> GTX 690 = 1072 , 1237
> GTX 680 = 1220, 1311
> GTX 670 = 1134 , 1215
> GTX 660 TI = 1130 , 1257
> GTX 660 = 1314
> GTX 650 TI Boost = 1207
> GTX 650 = 1281, 1451
> 
> R9 290X = 1160 , 1239
> R9 290 = 1131 , 1158
> R9 280X = 1190 , 1274
> R9 270X = 1203 , 1255
> R9 270 = 1058 , 1183
> R7 260X = 1243
> R7 260 = 1241
> 
> HD 7990 = 1101 , 1143
> HD 7970 x2 = 1180 , 1180
> HD 7970 = 1205 , 1286
> HD 7950 = 1134 , 1216
> HD 7870XT = 1184 , 1220
> HD 7870 = 1226 , 1290
> HD 7850 = 1126 , 1267
> HD 7790 = 1175 , 1453
> HD 7770 = 1150 , 1328
> 
> 
> 
> As you can see 1100Mhz is a highly probable stock clock and 1300Mhz is a highly reasonable overclock expectation.


I doubt they're going to stick with a 384-bit bus. It's either a 512-bit bus with GDDR5 or 384-bit with GDDR6.


----------



## skupples

GDDR6 would be the sex, though I kinda doubt we are going to see it on V1 Maxwell. Maybe the second run.


----------



## GorbazTheDragon

Your speculation brings me back to the original leaks... I agree that 3200 cores seems a tad far-fetched for the GM104/204, especially since the GM107 is only 640 cores. I'd expect 2560 for the GM104, and for that to be used for the GTX 880. This would fall in line with my expectation for the GM110/210 to be delayed until around a year after the 104/204.

But here's the big but: this ALL assumes that Nvidia still uses the same numbering scheme as with Kepler... That means

110 > 104 > 106 > 107 > 108

There is always the possibility that they change this, but if it does carry over I'd expect the following (I named these 2xx chips because they would be 20nm rather than the 28nm 1xx chips):

107/208 - 640 Cudas, 128-bit
207 - 1280 Cudas, 192-bit
206 - 1920 Cudas, 192-bit (256-bit?)
204 - 2560 Cudas, 256-bit (320-bit?)
210 - 3200 Cudas (3840?), 384-bit

Now that I think about it, a 3200-shader GM204 seems quite plausible, especially considering the performance gap between the GK104 and GK106. The problem is that 3200 cores is a LOT, and it will almost certainly make the GM204 larger than the GK104, which is a possibility, but it starts invading the space of the GM210, unless somehow they can produce 600+ mm² chips.

Either way, seeing the performance out of the GM107, I'd be perfectly happy with a 2.5k-shader Maxwell card; it would definitely be up there with the 780 Ti in terms of performance. Also, I wonder how memory-starved the GM107 is, and whether the next 640-shader card will have a 192-bit bus. If so, I'd be pretty certain that a 2560-shader GM204 would have at least a 320-bit bus.
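For reference, the bus widths being speculated about map to peak bandwidth as a simple product of width and per-pin data rate. A minimal sketch, assuming 7 Gbps GDDR5 (the 780 Ti's memory speed); the function name is made up for illustration:

```python
def peak_bandwidth_gbytes(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bits / 8 bits per byte) * per-pin Gbit/s."""
    return bus_width_bits / 8 * data_rate_gbps

# Assuming 7 Gbps GDDR5, as on the GTX 780 Ti:
for bits in (192, 256, 320, 384):
    print(f"{bits}-bit -> {peak_bandwidth_gbytes(bits, 7.0):.0f} GB/s")
# 256-bit -> 224 GB/s, 384-bit -> 336 GB/s
```

By that arithmetic, a 256-bit GM204 would need roughly 10.5 Gbps memory to match a 384-bit card at 7 Gbps, which is the crux of the bandwidth worry in this thread.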


----------



## kx11

The most interesting thing I found about GDDR6:
Quote:


> Chipmaker AMD, that contributed a lot to the GDDR memory standard, has made the first steps towards GDDR6 memory by contributing to the JEDEC group on the new memory standard, which manages memory chip standards on GDDR. It is expected that the faster and more efficient GDDR6 chips will be appearing from 2014 onward in Nvidia and AMD video cards. The new memory chip standard is expected to be used at least until the year 2020, the same time period as GDDR5. I'm very curious of what we can expect from the upcoming GDDR6 video cards.
> 
> What's certain is that GDDR6 video cards will be a lot faster and more energy-efficient, which alone will let us do things on our computers that we cannot do as consumers today. As video cards go far beyond the capabilities of GDDR5 memory, clock speeds of 12GHz and beyond do not seem that far-fetched for future GDDR6-based video cards.


----------



## megahmad

Quote:


> Originally Posted by *DADDYDC650*
> 
> 256-bit? Hope not otherwise I'll wait at least 8 months after release to buy a GTX 880. By that time *there will be a GTX 880 Super TI Mega Edition with 12GB*.


----------



## GorbazTheDragon

Quote:


> Originally Posted by *kx11*
> 
> most interesting thing i found about GDDR6


12GHz seems pretty far fetched based on my knowledge of physics (not much)


----------



## Usario

Quote:


> Originally Posted by *GorbazTheDragon*
> 
> 12GHz seems pretty far fetched based on my knowledge of physics (not much)


Remember that 12GHz would be the effective data rate; the actual underlying clock would be much lower.
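The effective-versus-actual distinction is just a pump factor: GDDR5 moves four bits per pin per command clock (double data rate on a doubled I/O clock). A rough sketch, with the helper name being illustrative:

```python
def command_clock_ghz(effective_rate_gbps: float, pump_factor: int = 4) -> float:
    """Underlying command clock for a given per-pin data rate; GDDR5 is effectively quad-pumped."""
    return effective_rate_gbps / pump_factor

print(command_clock_ghz(12.0))  # a "12GHz effective" part would run a 3.0 GHz command clock
print(command_clock_ghz(7.0))   # 7 Gbps GDDR5 -> 1.75 GHz command clock
```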


----------



## cstkl1

Was browsing around doing my thing and suddenly, holy crap Batman..

GTX 860M Maxwell compute capability 5.0??


Spoiler: Warning: Spoiler!


Death of Titans.


----------



## skupples

Doesn't it seem a bit strange that the 860M has compute 5.0 but the 880M does not?

The 8xx series better have a "Titan killer" or else Nvidia is doing it wrong.


----------



## ZealotKi11er

Quote:


> Originally Posted by *skupples*
> 
> Doesn't it seem a bit strange that the 860M has compute 5.0 but the 880M does not?
> 
> The 8xx series better have a "Titan killer" or else Nvidia is doing it wrong.


880M is Kepler Rebrand. 860M is Maxwell.


----------



## skupples

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 880M is Kepler Rebrand. 860M is Maxwell.












Yeah! Totally not confusing or unintuitive at all!


----------



## cstkl1

My bad. Always thought compute numbers showed performance capability..
Read more:
they don't.. it's just a feature-support spec thingy.

Titan safe for now.


----------



## skupples

Quote:


> Originally Posted by *cstkl1*
> 
> my bad. always thought compute numbers shows capability in performance..
> red more
> its not.. just some spec thingy
> 
> titan safe for now.


We are all hoping for a 10-20% increase on Titan performance w/ a $500 price tag, but i'm not sure if 880 will bring that level of performance for those of us in Surround if it comes w/ dinky memory bandwidth.


----------



## MxPhenom 216

Quote:


> Originally Posted by *skupples*
> 
> We are all hoping for a *10-20%* increase on Titan performance w/ a $500 price tag, but i'm not sure if 880 will bring that level of performance for those of us in Surround if it comes w/ dinky memory bandwidth.


Jesus. I'd hope for a bit more than that.

Though it doesn't matter much, I'm skipping out on first-gen Maxwell.


----------



## Germanian

Quote:


> Originally Posted by *MxPhenom 216*
> 
> Jesus. I'd hope for a bit more than that.
> 
> Though it doesnt matter much, im skipping out on first gen Maxwell.


The way it looks, the first Maxwell 880 will still be on 28nm, followed some 3-6 months later by the real BIG DIE on 20nm, which will eat the Titan up and spit it out.


----------



## Cyro999

Quote:


> I wonder what Riot Games uses for their computers for the LCS... Wouldn't be surprised if it was 660s or something pretty mid range like that.


League is mostly CPU-bound.. I have a 3770K + GTX 260 system in the house and it runs at 250 FPS in the tutorial, so I'm sure they'd do completely fine with 660s.


----------



## fateswarm

The 880 will be a 512-bit GPU-to-VRAM bus because they can't market 3GB anymore, if only for the marketing reason, and 256-bit is too slow to even be considered. They can comfortably go 20nm after the end of the summer, since TSMC is confirmed to be mass-producing Apple's A8, and that usually ships in September.

20nm will probably only see one big or medium chip, because the GK110 is a monster to compete with and 16nm is a small upgrade at the foundry, unlike other node jumps we're used to.

NVIDIA is due for a major update in Q4. AMD is due in late Q1 2015.


----------



## Clocknut

Quote:


> Originally Posted by *skupples*
> 
> We are all hoping for a 10-20% increase on Titan performance w/ a $500 price tag, but i'm not sure if 880 will bring that level of performance for those of us in Surround if it comes w/ dinky memory bandwidth.


Typically the next-gen mainstream part equals or slightly beats the old flagship GPU, especially on an architecture change.

I think we may get a GTX 860/860 Ti that is as quick as the 780 Ti/Titan.


----------



## Arturo.Zise

So which of the new Maxwells would give me 780 performance at a cheaper-than-780 price?


----------



## Steffek

Quote:


> Originally Posted by *Arturo.Zise*
> 
> So which of the new Maxwell's would give me 780 performance at a non 780 price = cheaper?


Tradition says the 870.


----------



## fateswarm

Three possibilities: a small Maxwell, a regular Maxwell nerfed, or a rehash of Kepler. I suspect there's no small Maxwell, since 16nm may come soon after 20nm, but it's possible.

Though I'm almost sure about the performance market.

It's very doubtful a small Maxwell would beat the GK110.


----------



## NABBO

interesting speculation

http://forum.beyond3d.com/showthread.php?t=59531&page=68

GM200/Big Maxwell 28nm?
possible?


----------



## SuprUsrStan

Quote:


> Originally Posted by *NABBO*
> 
> interesting speculation
> 
> http://forum.beyond3d.com/showthread.php?t=59531&page=68
> 
> GM200/Big Maxwell 28nm?
> possible?


Nope. That just means we'll have to wait half a year longer than originally expected for the prices to equalize.


----------



## akromatic

Quote:


> Originally Posted by *TK421*
> 
> how well does it mine though?


........ What's wrong with this world? Out of so many questions, like how well it would perform or whether it will run 4K gaming, you ask about mining......


----------



## NABBO

Quote:


> Originally Posted by *Syan48306*
> 
> Nope.


Sure? 100%?
I'm not.


----------



## fateswarm

Quote:


> Cost of a 28nm wafer - $4,500-$5,000
> Cost of a 20nm wafer - $6,000
> Cost of a 16/14nm Finfet wafer - $7,270


Finally, a more realistic view. I was trying to tell people it's nonsense that GPUs "cost ~$40". NVIDIA publicly complained about 20nm prices back in 2011-12; it's a real issue, and they would not risk their relationship by exposing their partner if it weren't true.

So, for a GK110, with the low yield rate expected during the first months for humongous chips:



And that's the minimum cost for NV to make a chip; add NV's profit, the whole card and all its components, the card manufacturers' margins, retail, ...

PS. I know yields are usually reported higher, but I believe those numbers mainly reflect small/regular chip sizes after months of mature production.

PPS. And I didn't even subtract the obvious loss of chips near the wafer edges and the spacing between dies, e.g. with 90% of the wafer usable.



waow, so profit.
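To put rough numbers on this, here's a quick sketch. The $5,000/wafer figure is from the quote above; the ~561 mm² GK110 die size, the 300 mm wafer, and the yield values are my assumptions, and the dies-per-wafer formula is the common edge-loss approximation, not anyone's official math:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Approximate whole dies per wafer, correcting for partial dies at the edge."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_cost, wafer_diameter_mm, die_area_mm2, yield_rate):
    """Wafer cost spread over the dies that actually work."""
    return wafer_cost / (dies_per_wafer(wafer_diameter_mm, die_area_mm2) * yield_rate)

# GK110 assumed at ~561 mm^2 on a 300 mm wafer, $5,000 per 28nm wafer (see quote).
for y in (0.3, 0.5, 0.7):
    print(f"yield {y:.0%}: ~${cost_per_good_die(5000, 300, 561, y):.0f} per good die")
```

Even at a healthy 70% yield that's a long way from "~$40", and at early-ramp yields it gets ugly fast.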


----------



## Mand12

Quote:


> Originally Posted by *fateswarm*
> 
> Finally, a more realistic view. I was trying to tell people it's nonsense that GPUs "cost ~$40". NVIDIA publicly complained about 20nm prices back in 2011-12; it's a real issue, and they would not risk their relationship by exposing their partner if it weren't true.


Yeah, $40 seems a bit low.

Although, if you assume 100% yield, then it'd end up being pretty close to $40. That's probably the error people make.


----------



## fateswarm

Yeah, but it's not 100% yield. The yields they advertise and are proud of are reported to be 70% and up. For GK110, a humongous chip they started making in 2012, it might be a very low yield.


----------



## Mand12

Quote:


> Originally Posted by *fateswarm*
> 
> Yeah, but it's not 100% yield.


Which is why I called it an error


----------



## Tjj226 Angel

Why are people still commenting on this thread? Obviously the source is a VERY bad one.


----------



## fateswarm

Quote:


> Originally Posted by *Mand12*
> 
> Which is why I called it an error


Yeah. Sorry. I was too fast.


----------



## GoldenTiger

Quote:


> Originally Posted by *NABBO*
> 
> interesting speculation
> 
> http://forum.beyond3d.com/showthread.php?t=59531&page=68
> 
> GM200/Big Maxwell 28nm?
> possible?


Um, yeah, I speculated that over *three and a half months ago* along with probable specs:

http://hardforum.com/showthread.php?t=1806694


----------



## skupples

Quote:


> Originally Posted by *akromatic*
> 
> ........ What's wrong with this world? Out of so many questions, like how well it would perform and whether it will run 4K gaming, you ask about mining......


some people haven't realized that multi-script ASICs are going to destroy GPU mining. It's already in the process of doing so. Why the hell would you GPU mine when you can get said ASIC AND have 10x the yield @ 1/2 the power draw?


----------



## i7Stealth1366

Honestly it does not matter at that price per card. GTX Titan and beyond are enough for now; they really are not even fully optimized yet.


----------



## Ghoxt

What a difference a month can make. Does anyone question the positioning of the GTX 880 on this chart and specs? I personally question the memory and dearly hope NV releases the card with 6GB right from the start. I'm hoping AMD dropping the 295X2 allows NV to release the hounds after getting quite simply embarrassed by their own dual-GPU offering.

So do we think Bessy, I mean the cow that is Kepler is done?

Titan owner still looking to buy an upgrade card. Or will I have to just buy another? Sigh...

Quote:



> Originally Posted by *Germanian*
> 
> A leak just got in regarding maxwell GTX 880 specs.
> GTX 880 is supposedly a MID RANGE CARD. We will see stronger versions later on.


----------



## azanimefan

who knows?

personally i think those numbers are bunk... but we have this silly stupid thread dedicated to speculation so speculate away. i expect the 880 will be on the same node as kepler... just like the 750/750ti... they'll save the process shrink for the refresh in a year or two.


----------



## skupples

880 being mid range isn't debatable.

What has changed in a month? I must have missed a major announcement from Nvidia with rock-solid specs (or I just ignored all of the VideoCardz & WCCFtech clickbait, who knows).


----------



## fateswarm

Will it be worth waiting for this or getting a cheapo 780 now and turning it SLI later? If this would be 20% faster or more I'd care. If it's like 10% faster I wouldn't care at all.


----------



## krel

Think we'll get more information next week?


----------



## zealord

Quote:


> Originally Posted by *krel*
> 
> Think we'll get more information next week?


nah. official information is never released so far in advance. maybe more rumours and unconfirmed sources and stuff like this. Companies want to leave you as much in the dark as they can so they still sell enough of the "old" products.


----------



## skupples

Quote:


> Originally Posted by *krel*
> 
> Think we'll get more information next week?


Quote:


> Originally Posted by *zealord*
> 
> nah. official information is never released so far in advance. maybe more rumours and unconfirmed sources and stuff like this. Companies want to leave you as much in the dark as they can so they still sell enough of the "old" products.


i'm hoping we hear something about Maxwell @ COMPUTEX.

"so far in advance"

what? we have no real timeline unless we follow that supposed shareholder's letter from last year stating Q3/Q4.


----------



## fateswarm

I fear they'll keep the strategy of not talking about it. I suspect they will have 20nm GPUs by the end of Q4, but it's still very early. Talking about it would hurt their sales.


----------



## zealord

Quote:


> Originally Posted by *skupples*
> 
> i'm hoping we hear something about Maxwell @ COMPUTEX.
> 
> "so far in advance"
> 
> what? we have no real timeline unless we follow that supposed shareholder's letter from last year stating Q3/Q4.


yeah, I meant more by that than what I actually wrote. I don't think we will have a release date or the official specification for a GTX 880 in the next few weeks. Maybe we will get some loose information, but I didn't want to get his hopes up


----------



## SandGlass

Charlie is saying 20nm Maxwell for Q4 2014, IDK how much we can trust him. But he's been spot on in the past.


----------



## skupples

Quote:


> Originally Posted by *SandGlass*
> 
> Charlie is saying 20nm Maxwell for Q4 2014, IDK how much we can trust him. But he's been spot on in the past.


idk who that is. I must be a noob.


----------



## VSG

The Nvidia hating guy from Semi Accurate.


----------



## SandGlass

Quote:


> Originally Posted by *geggeg*
> 
> The Nvidia hating guy from Semi Accurate.


Yup, he's pretty negative about Nvidia, but his claims are not without merit.
Edit: his Tegra K1 power consumption claims were FUD, but he was bang on with the Tegra division news.


----------



## skupples

Quote:


> Originally Posted by *geggeg*
> 
> The Nvidia hating guy from Semi Accurate.


the guy behind a pay wall?


----------



## benbenkr

Will wait for Maxwell refresh... oh wai-


----------



## skupples

Quote:


> Originally Posted by *benbenkr*
> 
> Will wait for Maxwell refresh... oh wai-


i'm going to try to hold out for stacked dram, which ever chip that ends up being on. This likely means i'll be returning to AMD for awhile since it *seems* they could be pushing out ultra high bandwidth products way before Nvidia.

also, I need to REALLY evaluate my monitor setup as 5760x1080p is just getting old. This likely means i'll wait for SOLID 4K panels to come around before getting new GPUs. I would love to go Portrait/landscape/portrait with 4K, but i don't see Nvidia ever really supporting it from an official capacity.

I return to patiently waiting, 3 1300mhz titans should allow me to ride things out just fine.


----------



## Germanian

Quote:


> Originally Posted by *skupples*
> 
> i'm going to try to hold out for stacked dram, which ever chip that ends up being on. This likely means i'll be returning to AMD for awhile since it *seems* they could be pushing out ultra high bandwidth products way before Nvidia.
> 
> also, I need to REALLY evaluate my monitor setup as 5760x1080p is just getting old. This likely means i'll wait for SOLID 4K panels to come around before getting new GPUs. I would love to go Portrait/landscape/portrait with 4K, but i don't see Nvidia ever really supporting it from an official capacity.
> 
> I return to patiently waiting, 3 1300mhz titans should allow me to ride things out just fine.


u should be more than fine for the next 1.5 years

those stacked DRAM might show up first on AMD side in spring 2015, that's what I am thinking.
Maxwell will hit around December for holiday season.


----------



## skupples

Quote:


> Originally Posted by *Germanian*
> 
> u should be more than fine for the next 1.5 years
> 
> those stacked DRAM might show up first on AMD side in spring 2015, that's what I am thinking.
> Maxwell will hit around December for holiday season.


Worst comes to worst: I'll have to drop my settings from maxed out to ultra.


----------



## GoldenTiger

I'm on a single 780 running at 4K 60Hz. It's doable at high settings in most games, but I definitely could use more horsepower for high/ultra on some, like Elder Scrolls Online, where I'm forced to resort to a small amount of subsampling to keep a 60+ framerate in large fights, and some of the more intense BF4 maps that (at full native res of course; ESO is the only one I subsample in) dip to 45-50 at times... looks amazing in most games though still. So, definitely anxious to see what Maxwell brings. I could go for a 2nd 780 now, but honestly I want to go dual 20nm Maxwell if they're "only" 1-4 months away, since I think they'll be a pretty large boost.

EDIT: By running fullscreen exclusive mode in ESO the performance jumps enough that I'm now running no subsampling and it performs plenty well. Looks amazing indeed.


----------



## ladcrooks

Quote:


> Originally Posted by *cravinmild*
> 
> Nvidia needs to be careful, if they throw in everything they got into the first edition then how the heck will they be able to offer a dozen slightly more powerful card each week for the next twelve months. I love marketing


And if they market this properly they can start with the ' A ' model and lastly the ' Z ' - this can create 26 versions.

Boy oh boy, Nvidia should pay me for this idea


----------



## fateswarm

20nm in Q4 is a very good prediction. If you plot past major releases on a graph, you see that NVIDIA is due in Q4 and AMD in Q1 or later. 20nm NVIDIA will likely be a medium-to-big die because they overshot with GK110, then probably straight to 16nm without much delay (it's a fairly small foundry upgrade).


----------



## Olivon

Q4 at best, then.
Big SoC companies like Qualcomm or MediaTek indicate first half of 2015 for 20nm SoC availability (Q1'15 for QCOM and 1H'15 for MDT).


----------



## fateswarm

They might also be restricted by stock of their own past releases. AMD has also confirmed 20nm for 2015 onwards, which is natural since the R9 200 series was released so recently. NVIDIA has a much older release and will start itching for a new product after September.


----------



## Olivon

I really hope so but have some doubts about it.


----------



## fateswarm

Of course. Can never be certain.


----------



## JackNaylorPE

Quote:


> Originally Posted by *ladcrooks*
> 
> And if they market this properly they can start with the ' A ' model and lastly the ' Z ' - this can create 26 versions.
> 
> Boy oh boy, Nvidia should pay me for this idea


looks like another bait and switch .... last time they looked at AMDs offerings and said .... let's do this, that 680 we were going to release....let's put that on the shelf and take our proposed 670 and call that the 680 and so on down the line..... then if anything pops up, we have an ace in the hole. I think these leaks are half leaks / half gamesmanship.


----------



## Dyaems

If this is true, I guess I was right that NVIDIA is pulling another GTX 680, since AMD most likely won't have anything to compete with it xD

more monies


----------



## Mako0312

Has anything on the 860/870 leaked out??

I'm debating on a 4gb 760/770 right now.


----------



## szeged

still hoping for around late october early november release, hopefully the classified model is out before 2015. If not, another kingpin here we go.


----------



## mcg75

Quote:


> Originally Posted by *JackNaylorPE*
> 
> looks like another bait and switch .... last time they looked at AMDs offerings and said .... let's do this, that 680 we were going to release....let's put that on the shelf and take our proposed 670 and call that the 680 and so on down the line..... then if anything pops up, we have an ace in the hole. I think these leaks are half leaks / half gamesmanship.


Except they didn't have a GK110 680 ready to put on the shelf. If they did, they would have released it as a professional/workstation card, where it really brings in the profits.


----------



## skupples

Quote:


> Originally Posted by *Mako0312*
> 
> Has anything on the 860/870 leaked out??
> 
> I'm debating on a 4gb 760/770 right now.


I would 100% get a 290 over a 4GB 770 right now. Only 20-30$ more if you are in the states, way way more powerful, & 2x the memory bandwidth.

Yes. I am still in the derpaderpatitans owners club.

you can find 290s for $350 now(when on sale), & they even have non-ref coolers. Newegg had powercolor 290s down to $339 last week.


----------



## fateswarm

The 770 is a useless choice at the medium-to-high end. The 780, on the other hand, is a very different story. The vanilla versions are not priced painfully above a 290, they require smaller PSUs, and if you're interested in G-SYNC too, they are the only choice.

I would go with a Gigabyte 780 rev. 2.0 right now (much better VRM than 1.0) if I had to choose between it and a 290.


----------



## JackNaylorPE

Quote:


> Originally Posted by *mcg75*
> 
> Except they didn't have a GK110 680 ready to put on the shelf. If they did, they would have released it as a professional/workstation card, where it really brings in the profits.


It was all over the trade press at the time, surprised ya didn't see it.

Quote:


> Originally Posted by *fateswarm*
> 
> The 770 is a useless choice at the medium-to-high end. The 780, on the other hand, is a very different story. The vanilla versions are not priced painfully above a 290, they require smaller PSUs, and if you're interested in G-SYNC too, they are the only choice.
> 
> I would go with a Gigabyte 780 rev. 2.0 right now (much better VRM than 1.0) if I had to choose between it and a 290.


Since the 5xx series, everybody (Asus, MSI, Gigabyte) was using custom PCBs and VRMs (well, except for EVGA's SC series), but now I notice some are going back to reference boards since nVidia came out with a revised version. MSI's newer 780 Tis are on the new reference boards and they have managed to top the competition in most reviews, garnering a 9.9 rating on TechPowerUp for example. I have been told that the newer 780s are also using the same (ver. 2?) reference PCB now; pictures look almost identical, with just a few solder points differing from the original reference boards.


----------



## jdstock76

Quote:


> Originally Posted by *Mako0312*
> 
> Has anything on the 860/870 leaked out??
> 
> I'm debating on a 4gb 760/770 right now.


I would save the money on the 4gb 770. No need for it. You can find 770s a dime a dozen on eBay for $275 or less. I just picked up 2 770s for $500. Easy peasy.


----------



## Cakewalk_S

I think I'm due for a new GPU and probably CPU depending what Intel does around Q3-Q4 2015...Lol that is, as long as I don't have a kid before then...wedding is less than a month away...


----------



## mcg75

Quote:


> Originally Posted by *JackNaylorPE*
> 
> It was all over the trade press at the time, surprised ya didn't see it.


You must have links to the articles then, because neither I nor plenty of others here have ever seen such a thing. I'd be very curious to finally set the record straight.

What everybody else heard in a nutshell was that GK100 was not ready for release. Nvidia saw GK104 able to hang with 7970 so it became 680.

GK100 was either dropped entirely or refined further and released as GK110.


----------



## JackNaylorPE

I'd google it but can't think of an appropriate search term. How would you describe the reshuffle?


----------



## Chomuco

maxwell-

http://videocardz.com/51009/nvidia-preparing-four-maxwell-gm204-skus

http://wccftech.com/nvidia-geforce-maxwell-lineup-feature-gm204-gpu-skus-rumors-point-16nm-revision-late-2015/

https://fbcdn-sphotos-g-a.akamaihd.net/hphotos-ak-xaf1/t31.0-8/1899610_10152276080643253_3645500732288917078_o.jpg


----------



## skupples

I would loooooooove to see them revise down to 16nm by late 2015, as it would help solidify the speculation that Maxwell turned into a filler generation after Denver and stacked DRAM were dropped from the first revision.


----------



## mtcn77

I think it will deliver maximum pixel-pipeline throughput at 1.4GHz with those 40 ROPs & 7.4GHz memory. Though, I still cannot correlate that with texels.
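For reference, those numbers work out like this (a quick sketch; the 40 ROPs, 1.4GHz clock, 7.4 Gbps memory, and the 256-bit bus from earlier in the thread are all rumors, nothing official):

```python
# Back-of-envelope fill rate and bandwidth from the rumored figures above.
rops = 40
core_clock_ghz = 1.4
bus_width_bits = 256
mem_rate_gbps = 7.4  # effective per-pin GDDR5 data rate

pixel_fill_gpixels = rops * core_clock_ghz          # peak pixel fill, Gpixels/s
bandwidth_gbs = bus_width_bits * mem_rate_gbps / 8  # memory bandwidth, GB/s

print(f"{pixel_fill_gpixels:.1f} Gpixel/s fill, {bandwidth_gbs:.1f} GB/s bandwidth")
```

That's 56 Gpixel/s of fill against 236.8 GB/s of bandwidth, which is why the 256-bit bus keeps coming up as the bottleneck in this thread.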


----------



## skupples

I'll still be holding out for 9xx... I need to evaluate my monitor situation before stepping up to another three flags.


----------



## bhav

Why no 8 GHz GDDR5???? Hynix makes plenty.


----------



## NuclearPeace

Quote:


> Originally Posted by *bhav*
> 
> Why no 8 GHz GDDR5???? Hynix makes plenty.


$$$


----------



## Germanian

Quote:


> Originally Posted by *Chomuco*
> 
> maxwell-
> 
> http://videocardz.com/51009/nvidia-preparing-four-maxwell-gm204-skus
> 
> http://wccftech.com/nvidia-geforce-maxwell-lineup-feature-gm204-gpu-skus-rumors-point-16nm-revision-late-2015/
> 
> https://fbcdn-sphotos-g-a.akamaihd.net/hphotos-ak-xaf1/t31.0-8/1899610_10152276080643253_3645500732288917078_o.jpg


thx i added 1 source to 1st page


----------



## Chomuco

wow 880 ....


----------



## VSG

If that is the 880 at stock, then not bad at all.. About 1k higher than the reference 780Ti at stock.


----------



## vlps5122

about equal to a 780 ti at 1250-1300 mhz core clock, im assuming that 880 is somewhere around 1000 core clock


----------



## GoldenTiger

More evidence. Hearing the leaked pic score is lower than launch intention too. That already is 28 percent over a stock ti though. Rumored msrp is lower too.


----------



## Diablosbud

Not impressed with the memory bandwidth, that will likely limit gaming performance. 3200 cores makes me happy. That 5.7 TFLOPS calculation speed though. Looks like the 880 will be a great card for compute performance.
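As a sanity check on that figure: FP32 throughput is conventionally counted as 2 FLOPs (one fused multiply-add) per core per clock, so you can back out the clock the rumor implies. Both the 3200-core count and 5.7 TFLOPS are the thread's rumored numbers, not confirmed specs:

```python
# Back out the core clock implied by the rumored 3200 cores / 5.7 TFLOPS,
# assuming the usual 2 FLOPs (one FMA) per CUDA core per clock for FP32.
cores = 3200
rumored_tflops = 5.7

implied_clock_ghz = rumored_tflops * 1e12 / (cores * 2) / 1e9
print(f"implied core clock: {implied_clock_ghz:.2f} GHz")
```

That lands around 0.89 GHz, i.e. a fairly conservative base clock for the rumored core count.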


----------



## skupples

Quote:


> Originally Posted by *GoldenTiger*
> 
> More evidence. Hearing the leaked pic score is lower than launch intention too. That already is 28 percent over a stock ti though. Rumored msrp is lower too.


you really think NV is going to unveil 880 @ Gamescom? I guess it is possible.

i'm going to expect a VideoCardz article stating "unconfirmed sources state GTX 880 reveal at Gamescom" in the next 4 hours, or i'll be extremely disappointed in his OCN lurking skills.


----------



## Germanian

Quote:


> Originally Posted by *Chomuco*
> 
> wow 880 ....


where is this from?


----------



## Alatar

Quote:


> Originally Posted by *skupples*
> 
> you really think NV is going to unveil 880 @ Gamescom? I guess it is possible.
> 
> i'm going to expect a VideoCardz article stating "unconfirmed sources state GTX 880 reveal at Gamescom" in the next 4 hours, or i'll be extremely disappointed in his OCN lurking skills.


It was already made an hour ago:

http://videocardz.com/51078/nvidia-geforce-gtx-880-arrives-gamescom

Someone feel free to post in the rumors section. I'd rather not because imo the screen is really fishy.


----------



## Olivon

Quote:


> Originally Posted by *Chomuco*
> 
> wow 880 ....


780 Ti score ?


----------



## VSG

Quote:


> Originally Posted by *Alatar*
> 
> It was already made an hour ago:
> 
> http://videocardz.com/51078/nvidia-geforce-gtx-880-arrives-gamescom
> 
> Someone feel free to post in the rumors section. I'd rather not because imo the screen is really fishy.


Ya, I can make the same screenshot using a 780Ti as well. I wonder if the rumour mills will bite if I send them a similar screenie using my KPE


----------



## szeged

im gonna put my kingpin on ln2 tomorrow and run FSE at 1800 core and blur out the gpu name and send it to VC and tell them its a 880, cant wait to see how they spin that into a headline.


----------



## skupples

seems like it would be REAAAAAAAAAAAAAALLY easy to replicate that screenshot with any number of highly overclocked top of the line GPUs right now.

Remember back in the day when these types of leaks came with hacked GPU-Z screenshots instead of CPU-Z screenshots? Ahhh, the good old days.


----------



## szeged

that score can be done on a reference 780ti pretty easily lol


----------



## ref

Yeah, that screenshot could easily be faked/replicated with an overclocked 780 Ti.

Gamescom seems a bit early for a reveal, and then a mid-August release date? Nah, I don't buy it.

I'm still betting on a late October/early November release date.

I'd love to be proven wrong though, as I'm waiting to get a pair of 880s to replace my 670.


----------



## szeged

my guess is late october, as it has been since january. August is just too early, maybe we will get a reveal or teaser in august, but a full on launch where we can get the product in our hand? October 17thish at the earliest imo.


----------



## Chomuco

Quote:


> Originally Posted by *geggeg*
> 
> If that is the 880 at stock, then not bad at all.. About 1k higher than the reference 780Ti at stock.


Quote:


> Originally Posted by *vlps5122*
> 
> about equal to a 780 ti at 1250-1300 mhz core clock, im assuming that 880 is somewhere around 1000 core clock


Sorry for my English, I'm from Argentina. I'll take the KPE! Give or take a thousand points.
http://www.legitreviews.com/wp-content/uploads/2013/11/firestrike-x-645x1038.jpg

Ever since we heard this rumor, we've been trying to verify this information, and so far no one was able to confirm it.

So what exactly is happening in August? According to NVIDIA's calendar, there are two events where something could be shown to the public. First is SIGGRAPH, which is more software-oriented, and then there's Gamescom. That event takes place in Germany between 13th and 17th August. Of course NVIDIA will be present; in fact, they even have their own booth there.


----------



## ohhgourami

More rumors. We are getting very close. Looking to replace my 670.


----------



## Chomuco

Quote:


> Originally Posted by *szeged*
> 
> my guess is late october, as it has been since january. August is just too early, maybe we will get a reveal or teaser in august, but a full on launch where we can get the product in our hand? October 17thish at the earliest imo.


mmm haha nice


----------



## Germanian

Quote:


> Originally Posted by *Chomuco*
> 
> mmm haha nice


----------



## TTheuns

1. NVidia releases 880 with GM204
2. AMD releases R9 390X with ...
3. NVidia releases 880Ti with GM210
4. ???
5. Profit!


----------



## fateswarm

Quote:


> Originally Posted by *TTheuns*
> 
> 1. NVidia releases 880 with GM204
> 2. AMD releases R9 390X with ...
> 3. NVidia releases 880Ti with GM210
> 4. ???
> 5. Profit!


Yeah, profit when NVIDIA is forced to use good prices.


----------



## maarten12100

Quote:


> Originally Posted by *Chomuco*
> 
> mmm haha nice


If you isolate a space very well even a GT 610 or a HD7730 can bake an egg.


----------



## Ghoxt

Quote:


> Originally Posted by *TTheuns*
> 
> 1. NVidia releases 880 with GM204
> 2. AMD releases R9 390X with ...
> 3. NVidia releases 880Ti with GM210
> 4. ???
> 5. Profit!


I'm trying to think of what I'll do with GM210 SLI. Definitely hyped for it, however other than benching what's on the horizon?

I skipped Bioshock Infinite and several other games... maybe I'll go that route.


----------



## CalinTM

Never mind, didn't look at the thread date.


----------



## zealord

new GPU-Z version : http://www.techpowerup.com/downloads/2398/techpowerup-gpu-z-v0-7-9/mirrors

Added preliminary support for NVIDIA GM204
Added preliminary support for AMD Tonga
Added support for AMD Radeon R9 M275X, FirePro W5100, W9100
Added support for NVIDIA GeForce GTX 780 6 GB, GTX 860M, GTX 780M, GT 830M, GT 740, GT 730, GT 720, Quadro NVS 510, FX 380M, GRID K520, Tesla K40c
Added release date for R9 290
More robust PhysX detection
Fixed fan speed monitoring on some recent AMD cards
Fix for sensor graph over/underflow
Performance improvements to sensor graph drawing
*Fix for French translation*

PogChamp


----------



## TTheuns

Quote:


> Originally Posted by *Ghoxt*
> 
> I'm trying to think of what I'll do with GM210 SLI. Definitely hyped for it, however other than benching what's on the horizon?
> 
> I skipped Bioshock Infinite and several other games... maybe I'll go that route.


I hope SLI GM210 will really be enough for Surround 4K


----------



## TTheuns

Quote:


> Originally Posted by *fateswarm*
> 
> Yeah, profit when NVIDIA is forced to use good prices.


NVidia is just not a company for a good Price to Performance ratio


----------



## fateswarm

Quote:


> Originally Posted by *TTheuns*
> 
> NVidia is just not a company for a good Price to Performance ratio


Yeah, the 780 is not that bad right now after it was forced down to a lower price, but it's still beaten by the 290 on pretty much anything price-over-performance.


----------



## Luck100

Quote:


> Originally Posted by *fateswarm*
> 
> Yeah, the 780 is not that bad right now after it was forced down to a lower price, but it's still beaten by the 290 on pretty much anything price-over-performance.


I'm expecting 870 / 880 will tilt price/performance vs 290/290x back to NVidia. No new AMD gpus in that range until 2015, so NVidia will dominate sales during the big Q4 shopping season.


----------



## fateswarm

Quote:


> Originally Posted by *Luck100*
> 
> I'm expecting 870 / 880 will tilt price/performance vs 290/290x back to NVidia. No new AMD gpus in that range until 2015, so NVidia will dominate sales during the big Q4 shopping season.


Erm. If they go 20nm it will beat anything, but it might be overpriced again. If they go 28nm for a few months, it might do that, but I expect them to stay low-end-ish, something like just beating the 780.


----------



## CalinTM

I'm expecting the 870 between the 780 and 780 Ti, and the 880 a little faster than the 780 Ti.

Don't believe the leaks; there are tons of them. Wait for something official.


----------



## skupples

same STUFF different GPU release.

I'm always entertained by the sensational reporting, but i'm even more entertained by the minions who buy it hook line & sinker without any actual proooooof.

By the way... Isn't Nvidia *supposedly* doing something big at Gamescom?


----------



## VSG

Wasn't it Denver?


----------



## skupples

Quote:


> Originally Posted by *geggeg*
> 
> Wasn't it Denver?


Who knows...

Denver, that thing that was supposedly going to be part of Maxwell, until the roadmap showed Maxwell as just another GPU. I still hope that 9xx will be 20nm + Denver + stacked DRAM, but ehhhhhhhh... I'm guessing that NV showing it as part of Pascal/Volta means that it IS in fact for Pascal/Volta.

either way, games would need to be programmed to support the arm core, correct? Which means the benefit would be minimal until said support started floating around.


----------



## VSG

If 8xx is 28nm Maxwell then either 880Ti or 9xx will be 20 nm big daddy Maxwell. Pascal is 2016 at the minimum.


----------



## CalinTM

I don't think a Ti model will come unless AMD releases something big. The Ti will probably come after big Maxwell, exactly like in the 680/780/Titan days. The same lineup strategy.


----------



## skupples

The Kepler strategy seems to have been pretty profitable for Nvidia, and like I have been saying all along, Nvidia could possibly get away with another 28nm run at massive profit... I would be surprised if the entire 8xx run is all 20nm. It would be much easier for me to believe one or two 20nm flagships, with everything else being 28nm.


----------



## CalinTM

Quote:


> Originally Posted by *skupples*
> 
> Kepler strategy seems to have been pretty profitable for Nvidia, and like I have been saying all along. Nvidia could possibly get away with another 28nm run @ massive profit... I would be surprised if the entire 8xx run is all 20nm. It would be much easier for me to believe a 2x 20nm flagship, with everything else being 28.


Yes, it is very good for Nvidia.

In a real world, the 700 series should have already been on Maxwell chips. Like: 780 on GM110, 770 on GM104, and the rest of them cut down from the 770. In a real-world scenario.

And by now, 2015 was supposed to already have the Volta chips.

All this in a real world.

But Nvidia is all about marketing, releasing Titan cards, which are very expensive, and fooling consumers that they are ONLY for gaming at $1000. Actually a Titan card is a hybrid, for BOTH gaming and workstation use. Basically, when you buy a TITAN card, you buy a GeForce card and a half-Quadro card; that's why it was so expensive.

Nvidia released the Titan card name ONLY because it's a cheaper way to buy a Quadro GPU.

So Titan = Quadro at a lower price, FOR people who WANT to buy a Quadro card and DON'T have the money for it. TITAN is their solution; a Titan card is just not only for gaming, and people don't understand this.

...and now they have a new toy for their marketing: 28nm vs. 20nm.

Don't get me wrong, I will buy a GTX 880 no matter what; it's time for an upgrade for me.


----------



## Raghar

I actually use integers in CUDA, so a normal card works for me flawlessly. I have more of a problem with slow bit shifts on the GPU.

(BTW, you need to activate compute mode for high DP throughput on Titan; gaming mode gives you more FPS in games but lowers DP to 1/3(?). At least I heard some drivers required the user to open the control center and set the mode they wanted.)


----------



## Raghar

Quote:


> Originally Posted by *skupples*
> 
> either way, games would need to be programmed to support the arm core, correct? Which means the benefit would be minimal until said support started floating around.


I hope not. The main advantage of ARM was the ability to run drivers on the graphics card, thus completely freeing the CPU. How well they managed to do that is still unknown, because it's not out yet.


----------



## Olivon

Quote:


> According to information provided to NordicHardware, Nvidia has now started inviting selected media to a press event in the United States on 10 or 11 September, featuring details and information about their new graphics card family.


http://www.nordichardware.se/Grafik/geforce-gtx-880-lurar-runt-hoernet-naer-nvidia-samlar-pressen-i-september.html

*Translated*


----------



## GoldenTiger

Quote:


> Originally Posted by *Olivon*
> 
> http://www.nordichardware.se/Grafik/geforce-gtx-880-lurar-runt-hoernet-naer-nvidia-samlar-pressen-i-september.html
> 
> *Translated*


Nice find.


----------



## fateswarm

*What's in the box?* The mystery of whether it's 20nm or 28nm continues, with absolutely no official word on it. It's hilarious how long it has dragged on with only speculation.


----------



## duhasttas

Quote:


> Originally Posted by *fateswarm*
> 
> *What's in the box?* The mystery of whether it's 20nm or 28nm continues, with absolutely no official word on it. It's hilarious how long it has dragged on with only speculation.


I've been here since 2009 and lurked for at least 2 years prior to that, and I can firmly say speculation and 200+ page threads are all too common when it comes to upcoming GPUs. It is especially awesome when people get into some heated discussions


----------



## DoktorCreepy

Yeah, but we knew months ahead of time that both the HD 7000 series and the GTX 600 series would be on the 28nm process, and for some previous generations we also knew the process months in advance.

This upcoming generation is just people shooting in the dark as far as process node is concerned.

TSMC has some of the blame for that.


----------



## GoldenTiger

Quote:


> Originally Posted by *duhasttas*
> 
> I've been here since 2009 and lurked for at least 2 years prior to that, and I can firmly say speculation and 200+ page threads are all too common when it comes to upcoming GPUs. It's especially awesome when people get into heated discussions.


Yep, I've posted since before HardOCP existed, when FiringSquad was a big deal and Shack News was still shugashack; it was the same thing back then. My oldest account on a current tech site is from 1999.

Good times, and I know plenty have me beaten!


----------



## GoldenTiger

Quote:


> Originally Posted by *DoktorCreepy*
> 
> Yeah, but we knew months ahead of time that both the HD 7000 series and the GTX 600 series would be on the 28nm process, and for some previous generations we also knew the process months in advance.
> 
> This upcoming generation is just people shooting in the dark as far as process node is concerned.
> 
> TSMC has some of the blame for that.


We don't always know much prior to some launches.


----------



## Astral Fly

Quote:


> Originally Posted by *fateswarm*
> 
> *What's in the box?* The mystery of whether it's 20nm or 28nm continues, with absolutely no official word on it. It's hilarious how long it has dragged on with only speculation.


There was that leak showing a chip which is supposedly GM204. The chip in the pic is rather big, which speaks for 28nm, but of course nothing is certain and nothing is official. I'll be surprised to see 20nm GPUs in 2014.


----------



## fateswarm

Quote:


> Originally Posted by *DoktorCreepy*
> 
> TSMC has some of the blame for that.


TSMC is pretty open that they handed 20nm production to Apple and that they are fully booked on 20nm contracts until the end of 2014 (by Qualcomm, AMD, and others; NVIDIA is rarely mentioned). It's mainly the tech "journalists" (sorry, real journalists) who parroted 28nm without much to go on beyond the 750 Ti (and its "amazing scaling"). In fact, it's almost confirmed AMD is preparing 20nm GPUs for Q1 or Q2, since they hinted almost explicitly that they're going 20nm next year, and they're due for a major release around then.


----------



## Mand12

Quote:


> Originally Posted by *GoldenTiger*
> 
> We don't always know much prior to some launches.


And Nvidia has recently demonstrated some surprising capability when it comes to keeping a lid on things. I saw _nothing_ about G-Sync until their big event on it, not even the barest hint of a rumor.


----------



## DoktorCreepy

Quote:


> Originally Posted by *GoldenTiger*
> 
> We don't always know much prior to some launches.


Hence why I worded the post the way I did.

Do people actually read entire posts on this forum anymore, or is it all synopsizing and paraphrasing?


----------



## Threx

Is Nvidia already done with their part at Gamescom?


----------



## GoldenTiger

Quote:


> Originally Posted by *DoktorCreepy*
> 
> Hence why I worded the post the way I did?
> 
> Do people actually read the entire post anymore on this forum or is it synopsizing and paraphrasing all the time?


Of course I did. However, my response was to your whole post... there's nothing unusual about people shooting in the dark in these kinds of threads; it happens for the bulk of product launches. Try reading the whole thing yourself next time, eh?


----------



## DoktorCreepy

Quote:


> Originally Posted by *GoldenTiger*
> 
> Of course I did. However my response was to your whole post... nothing is unusual about people shooting in the dark most of the time in these kinds of threads for the bulk of product launches that have occurred. Try reading the whole thing yourself next time, eh?


OK, I have to quote my own post again, apparently.
Quote:


> This upcoming generation is just people shooting in the dark as far as process node is concerned


How that can be read as anything other than "no one except Nvidia or people under NDA has any idea whether the upcoming GTX 800 series is 28nm or 20nm" is beyond me.

I made no mention of CUDA core count, TMUs, ROPs, core or memory frequencies, or anything else in that sentence, because for some of those things we already have something to go on: the GTX 750 Ti, the leaks showing an upcoming 256-bit part, and the known number of CUDA cores per SMM cluster. For those things, sure, people will speculate, because it's more of an educated guess; we have SOMETHING to work with, just as we did with some aspects of other launches.

No one who can actually talk about it knows anything about the process it's on, hence the shooting in the dark. And if we are shooting in the dark, then WCCFtech and Videocardz are trying to pin the tail on the donkey blindfolded, drunk, and spun around 50 times.
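The "CUDA cores per SMM" figure really is one of the few knowns: Maxwell's first chip, GM107 in the GTX 750 Ti, packs 128 CUDA cores per SMM (5 SMMs × 128 = 640 cores). A quick sanity check of what the leaked 3200-core count would imply, assuming GM204 keeps the same layout (which is speculation, not fact):

```python
# GM107 (GTX 750 Ti): 5 SMMs * 128 CUDA cores per SMM = 640 cores.
CORES_PER_SMM = 128
assert 5 * CORES_PER_SMM == 640  # matches the GTX 750 Ti

# If the leaked 3200-core figure is right and GM204 keeps the
# same SMM layout, that implies 25 SMMs.
leaked_cores = 3200
print(leaked_cores // CORES_PER_SMM)  # -> 25
```

So the leak is at least internally consistent with Maxwell's known building block.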


----------



## Darkpriest667

Quote:


> Originally Posted by *Threx*
> 
> Is Nvidia already done with their part at Gamescom?


Yeah, they want you to buy a Shield; they promise it's really neato.


----------



## Cyro999

Quote:


> Originally Posted by *Diablosbud*
> 
> Not impressed with the memory bandwidth, that will likely limit gaming performance. 3200 cores makes me happy. That 5.7 TFLOPS calculation speed though
> 
> 
> 
> 
> 
> 
> 
> . Looks like the 880 will be a great card for compute performance.


We already have GPUs capable of a theoretical >6.5 TFLOPS that are safe for 24/7 usage.
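For anyone wondering where the 5.7 TFLOPS figure comes from: it follows from the standard cores × clock × 2 FLOPs/clock estimate. A quick sketch using the rumored 3200-core count; the ~0.89 GHz clock below is just the value that makes the arithmetic come out, not a known spec:

```python
# Rough check of the rumored 5.7 TFLOPS figure: the standard
# estimate is cores * clock * 2 FLOPs/clock (one multiply-add).
# 3200 cores is from the leak; ~0.89 GHz is an assumed clock.

def theoretical_tflops(cuda_cores, clock_ghz):
    return cuda_cores * clock_ghz * 2 / 1000.0

print(theoretical_tflops(3200, 0.89))  # ≈ 5.7 TFLOPS
```

Theoretical peak only, of course; real workloads land well below this, especially if the 256-bit bus turns out to be a bottleneck.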


----------



## GoldenTiger

Quote:


> Originally Posted by *DoktorCreepy*
> 
> Ok I have to quote my own post again apparently
> How that can be comprehended as anything other than no one except Nvidia or people under NDA have any idea if the upcoming GTX 800 series is 28nm or 20nm is beyond me.
> 
> I made no mention of Cuda core count, TMU's, ROP's, core or memory frequencies or anything else in that sentence because for some of those things we already have somethings to go on like the GTX 750ti and the leaks showing an upcoming 256bit part and that we know the amount of Cuda cores per SMM cluster; so for those things yeah people will start to speculate because its more an educated guess since we have SOMETHING to go on just like we did with some of the aspects other launches.
> 
> No one knows anything about the process its on that can actually talk about it hence the shooting in the dark, and if we are shooting in the dark then WFCCtech and Videocardz are trying to pin the tail on the donkey blind folded in the dark and drunk after being spun around 50 times.


Uh, yeah... you're acting like all of this is new, but it's not. That was my point: this happens every single launch.


----------



## skupples

Quote:


> Originally Posted by *CalinTM*
> 
> Yes, it is very good for Nvidia.
> 
> In a real world, the 700 series should already have been on Maxwell chips: the 780 on GM110, the 770 on GM104, and the rest cut down from the 770. And by now, 2015 was supposed to already have Volta chips. All this in a real world.
> 
> But Nvidia is all about marketing: releasing Titan cards, which are very expensive, and fooling consumers into thinking they are ONLY for gaming at $1000. A Titan is actually a hybrid card, for BOTH gaming and workstation use. Basically, when you buy a Titan, you buy a GeForce card and half a Quadro; that's why it was so expensive.
> 
> Nvidia released the Titan name ONLY because it's a cheaper way to buy a Quadro-class GPU. So Titan = Quadro at a lower price, for people who WANT a Quadro card and DON'T have the money for it. The Titan is not just for gaming, and people don't understand this.
> 
> ...and now they have a new toy for their marketing: 28nm vs. 20nm.
> 
> Don't get me wrong, I will buy a GTX 880 no matter what; it's time for an upgrade for me.


Most of the people I know got Titans because they wanted to do things like 7680x1440, and that simply wasn't going to happen on 7970s or 680s.

I don't know a single person who thought the Titan was ONLY a gaming card... in fact, the first half and the second half of your post contradict each other.
Quote:


> Originally Posted by *Darkpriest667*
> 
> Yeah, they want you to buy a Shield; they promise it's really neato.


Really? NV went through all that hype just to say absolutely nothing about the 800 series at Gamescom?

Ha, and people are buying a WCCFtech story about a March 2015 release date...

Did NV even present yet?


----------



## renji1337

I wonder when I should sell my 780 classy's


----------



## The Source

Quote:


> Originally Posted by *renji1337*
> 
> I wonder when I should sell my 780 classy's


When you find out some real information about this series and how it compares. It might not be worth it and end up more of a sideways "upgrade". If you bought your cards new, you'll be taking a $150-200 loss on each, and I don't see this being a $400 improvement.


----------



## skupples

Does anyone know if NV was even really supposed to be at Gamescom? I never actually saw a "See you at Gamescom" ticker on their website.


----------



## skupples

Thought so... no one actually knows a damn thing about NV at Gamescom.


----------



## VSG

http://blogs.nvidia.com/blog/2014/08/14/how-nvidia-will-be-going-big-at-gamescom-the-worlds-biggest-gaming-show/

http://www.nextpowerup.com/news/11371/nvidia-unveils-64-bit-tegra-k1-denver.html


----------



## skupples

Denver cores.

+1


----------

